---
title: Edge LLM Platform
emoji: π
colorFrom: blue
colorTo: purple
sdk: docker
sdk_version: 4.36.0
app_file: app.py
pinned: false
license: mit
short_description: Local LLM platform with modern web interface
---
# Edge LLM Platform
A lightweight, local LLM inference platform with a modern web interface.
> **Note:** All development now happens directly in this repository (EdgeLLM_HF). This is both the development environment and the production Hugging Face Space.
## Features

### Hybrid Model Support
- **Local Models**: Run Qwen models locally for privacy
- **API Models**: Access powerful cloud models via the AiHubMix API
- **Seamless Switching**: Switch between local and API models effortlessly
- **Thinking Models**: Support for models with a visible reasoning process
## Available Models

### Local Models (Privacy-First)

- `Qwen/Qwen3-4B-Thinking-2507` - Local model with thinking process (~8 GB)
- `Qwen/Qwen3-4B-Instruct-2507` - Local direct instruction model (~8 GB)

### API Models (Cloud-Powered)

- `Qwen/Qwen3-30B-A3B` - Advanced Qwen3 with dynamic thinking modes
- `qwen2.5-vl-72b-instruct` - Multimodal model with vision capabilities
- `Qwen/QVQ-72B-Preview` - Visual reasoning with thinking process
### Modern UI/UX

- **Responsive Design**: Works on desktop and mobile
- **Chat Interface**: Beautiful conversation bubbles with session management
- **Model Management**: Easy switching between local and API models
- **Parameter Controls**: Temperature, max tokens, and system prompts
- **Session History**: Persistent conversations with localStorage
## Project Structure

```
EdgeLLM/
├── frontend/          # React frontend with ShadCN UI
├── backend/           # FastAPI backend
├── static/            # Built frontend assets
├── app.py             # Production entry point
├── requirements.txt   # Python dependencies
└── README.md          # Documentation
```
## Quick Start

1. **Clone the repository**

   ```bash
   git clone https://huggingface.co/spaces/wu981526092/EdgeLLM
   cd EdgeLLM
   ```

2. **Set up environment variables**

   ```bash
   # Create a .env file with your API credentials
   echo 'api_key="your-aihubmix-api-key"' > .env
   echo 'base_url="https://aihubmix.com/v1"' >> .env
   ```

3. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   cd frontend && npm install && cd ..
   ```

4. **Run locally**

   ```bash
   python app.py
   ```

5. **Deploy changes**

   ```bash
   # Build the frontend if needed
   cd frontend && npm run build && cd ..
   # Push to Hugging Face
   git add .
   git commit -m "Update: your changes"
   git push
   ```
## Live Demo
Visit the live demo at: https://huggingface.co/spaces/wu981526092/EdgeLLM
## Configuration

### Environment Variables

For local development, create a `.env` file:

```
api_key="your-aihubmix-api-key"
base_url="https://aihubmix.com/v1"
```
For production (Hugging Face Spaces), set these as secrets:

- `api_key`: Your AiHubMix API key
- `base_url`: API endpoint (`https://aihubmix.com/v1`)
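The resulting precedence can be sketched in a few lines of Python: real environment variables (the Spaces secrets) override values read from a local `.env` file. This `load_config` helper is illustrative only, not the platform's actual configuration code.

```python
import os

def load_config(env_path=".env", keys=("api_key", "base_url")):
    """Load credentials: .env file first, environment variables win."""
    cfg = {}
    # 1) Fall back to the local .env file, if present.
    if os.path.exists(env_path):
        with open(env_path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    cfg[key.strip()] = value.strip().strip('"')
    # 2) Let real environment variables (Spaces secrets) override.
    for key in keys:
        if key in os.environ:
            cfg[key] = os.environ[key]
    return cfg
```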
### API Integration
This platform integrates with the AiHubMix API for cloud-based model access. Features include:
- OpenAI-compatible API interface
- Support for Qwen 3 series models
- Multimodal capabilities (text + vision)
- Streaming and non-streaming responses
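Because the endpoint is OpenAI-compatible, a chat request is an ordinary JSON POST to `/chat/completions`. The sketch below builds (but does not send) such a request with the standard library; `build_chat_request` is a hypothetical helper, not part of this codebase.

```python
import json
import os
import urllib.request

def build_chat_request(model, messages, stream=False,
                       base_url="https://aihubmix.com/v1"):
    """Construct an OpenAI-compatible chat completion request."""
    payload = {"model": model, "messages": messages, "stream": stream}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('api_key', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "Qwen/Qwen3-30B-A3B",
    [{"role": "user", "content": "Hello!"}],
    stream=True,
)
# urllib.request.urlopen(req) would send it; with stream=True the body
# arrives incrementally as server-sent-event chunks.
```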
## Development Workflow

- **Frontend development**: Work in `frontend/`
- **Backend development**: Work in `backend/`
- **Build frontend**: `cd frontend && npm run build`
- **Deploy**: Standard git workflow

  ```bash
  git add .
  git commit -m "Your changes"
  git push
  ```
## Architecture

### Backend (FastAPI)
- **Models Service**: Handles both local model loading and API client management
- **Chat Service**: Routes requests to the appropriate generation method (local/API)
- **API Routes**: RESTful endpoints for model management and text generation
- **Configuration**: Environment-based settings for API credentials
### Frontend (React + TypeScript)
- **Modern UI**: Built with ShadCN components and Tailwind CSS
- **Chat Interface**: Real-time conversation with message bubbles
- **Model Management**: Easy switching between available models
- **Session Management**: Persistent chat history and settings
## License
MIT License - see the `LICENSE` file for details.