Remove HF LLMs (#113)
Files changed:
- README.md (+3, −7)
- global_config.py (+13, −13)
README.md CHANGED

@@ -42,7 +42,7 @@ In addition, SlideDeck AI can also create a presentation based on PDF files.
 
 # Summary of the LLMs
 
-SlideDeck AI allows the use of different LLMs from
+SlideDeck AI allows the use of different LLMs from several online providers—Azure OpenAI, Google, Cohere, Together AI, and OpenRouter. Most of these service providers offer generous free usage of relevant LLMs without requiring any billing information.
 
 Based on several experiments, SlideDeck AI generally recommends the use of Mistral NeMo, Gemini Flash, and GPT-4o to generate the slide decks.
 
@@ -50,8 +50,6 @@ The supported LLMs offer different styles of content generation. Use one of the
 
 | LLM | Provider (code) | Requires API key | Characteristics |
 |:---------------------------------| :------- |:-------------------------------------------------------------------------------------------------------------------------|:-------------------------|
-| Mistral 7B Instruct v0.2 | Hugging Face (`hf`) | Mandatory; [get here](https://huggingface.co/settings/tokens) | Faster, shorter content |
-| Mistral NeMo Instruct 2407 | Hugging Face (`hf`) | Mandatory; [get here](https://huggingface.co/settings/tokens) | Slower, longer content |
 | Gemini 2.0 Flash | Google Gemini API (`gg`) | Mandatory; [get here](https://aistudio.google.com/apikey) | Faster, longer content |
 | Gemini 2.0 Flash Lite | Google Gemini API (`gg`) | Mandatory; [get here](https://aistudio.google.com/apikey) | Fastest, longer content |
 | Gemini 2.5 Flash | Google Gemini API (`gg`) | Mandatory; [get here](https://aistudio.google.com/apikey) | Faster, longer content |
@@ -78,10 +76,8 @@ SlideDeck AI uses a subset of icons from [bootstrap-icons-1.11.3](https://github
 
 # Local Development
 
-SlideDeck AI uses LLMs via different providers,
-
-for example, in a `.env` file. Alternatively, you can provide the access token in the app's user interface itself (UI). For other LLM providers, the API key can only be specified in the UI. For image search, the `PEXEL_API_KEY` should be made available as an environment variable.
-Visit the respective websites to obtain the API keys.
+SlideDeck AI uses LLMs via different providers. To run this project by yourself, you need to use an appropriate API key, for example, in a `.env` file.
+Alternatively, you can provide the access token in the app's user interface itself (UI).
 
 ## Offline LLMs Using Ollama
 
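To supplement the updated Local Development text above, here is a minimal, illustrative sketch of loading API keys from a `.env` file for a local run. It assumes the `python-dotenv` package and uses placeholder variable names: `GOOGLE_API_KEY` is hypothetical, while `PEXEL_API_KEY` appears in the README's earlier wording for image search. The names SlideDeck AI actually expects may differ.

```python
# Illustrative sketch only: reading API keys from a .env file for local runs,
# assuming the python-dotenv package. The variable names are placeholders and
# may differ from what SlideDeck AI actually reads.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # loads KEY=value pairs from ./.env into the process environment

llm_api_key = os.getenv('GOOGLE_API_KEY')   # hypothetical name for an LLM provider key
pexel_api_key = os.getenv('PEXEL_API_KEY')  # image-search key mentioned in the README

if not llm_api_key:
    raise RuntimeError('Set the LLM API key in .env or paste it in the app UI.')
```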
global_config.py CHANGED

@@ -28,7 +28,7 @@ class GlobalConfig:
     VALID_PROVIDERS = {
         PROVIDER_COHERE,
         PROVIDER_GOOGLE_GEMINI,
-        PROVIDER_HUGGING_FACE,
+        # PROVIDER_HUGGING_FACE,
         PROVIDER_OLLAMA,
         PROVIDER_TOGETHER_AI,
         PROVIDER_AZURE_OPENAI,
@@ -74,16 +74,16 @@ class GlobalConfig:
             'max_new_tokens': 8192,
             'paid': True,
         },
-        '[hf]mistralai/Mistral-7B-Instruct-v0.2': {
-            'description': 'faster, shorter',
-            'max_new_tokens': 8192,
-            'paid': False,
-        },
-        '[hf]mistralai/Mistral-Nemo-Instruct-2407': {
-            'description': 'longer response',
-            'max_new_tokens': 8192,
-            'paid': False,
-        },
+        # '[hf]mistralai/Mistral-7B-Instruct-v0.2': {
+        #     'description': 'faster, shorter',
+        #     'max_new_tokens': 8192,
+        #     'paid': False,
+        # },
+        # '[hf]mistralai/Mistral-Nemo-Instruct-2407': {
+        #     'description': 'longer response',
+        #     'max_new_tokens': 8192,
+        #     'paid': False,
+        # },
         '[or]google/gemini-2.0-flash-001': {
             'description': 'Google Gemini-2.0-flash-001 (via OpenRouter)',
             'max_new_tokens': 8192,
@@ -110,9 +110,9 @@ class GlobalConfig:
         '- **[az]**: Azure OpenAI\n'
         '- **[co]**: Cohere\n'
         '- **[gg]**: Google Gemini API\n'
-        '- **[hf]**: Hugging Face Inference API\n'
+        # '- **[hf]**: Hugging Face Inference API\n'
         '- **[or]**: OpenRouter\n\n'
-        '- **[to]**: Together AI\n'
+        '- **[to]**: Together AI\n\n'
         '[Find out more](https://github.com/barun-saha/slide-deck-ai?tab=readme-ov-file#summary-of-the-llms)'
     )
     DEFAULT_MODEL_INDEX = int(os.environ.get('DEFAULT_MODEL_INDEX', '4'))
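For context on the config keys touched above: each model identifier embeds its provider code in square brackets (for example `[or]`, `[gg]`). Below is a small illustrative sketch, not the project's actual code, of how such keys could be parsed and how an environment-driven `DEFAULT_MODEL_INDEX` could pick a default. The `VALID_MODELS` dict and the `[gg]gemini-2.0-flash` key are assumed stand-ins for the real configuration; if the default really is an index into the ordered model mapping, commenting out the two `[hf]` entries shifts which model a given index refers to, which is one reason to keep it configurable via the environment.

```python
# Illustrative sketch only (not SlideDeck AI's actual code): the config keys in
# global_config.py embed a provider code in square brackets, e.g.
# '[or]google/gemini-2.0-flash-001'. The dict below is a hypothetical stand-in.
import os
import re

VALID_MODELS = {
    '[gg]gemini-2.0-flash': {'max_new_tokens': 8192, 'paid': False},  # assumed key name
    '[or]google/gemini-2.0-flash-001': {'max_new_tokens': 8192, 'paid': True},
}


def parse_provider(model_key: str) -> tuple[str, str]:
    """Split a '[provider]model-name' key into (provider_code, model_name)."""
    match = re.match(r'\[(.+?)\](.+)', model_key)
    if not match:
        raise ValueError(f'Unexpected model key format: {model_key!r}')
    return match.group(1), match.group(2)


# DEFAULT_MODEL_INDEX is read from the environment; if it indexes into the
# ordered model mapping, removing entries changes what a given index points to.
DEFAULT_MODEL_INDEX = int(os.environ.get('DEFAULT_MODEL_INDEX', '4'))
default_key = list(VALID_MODELS)[min(DEFAULT_MODEL_INDEX, len(VALID_MODELS) - 1)]
print(parse_provider(default_key))  # e.g. ('or', 'google/gemini-2.0-flash-001')
```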