AI & ML interests
Tokenization • Base model training • SFT
Gemma models (1B, 2B, SFT variants, etc.) trained from scratch:

- SRP-base-model-training/gemma_3_800M_sft_v2_translation-kazparc • Text Generation • 0.9B
- SRP-base-model-training/gemma_3_800M_base_v2_multilingual_10B_data • Text Generation • 0.9B
- SRP-base-model-training/gemma_3_800M_sft_v2_translation-kazparc_latest • Text Generation • 0.9B
- SRP-base-model-training/gemma_3_2B_base_v1_kk_only_5B-data • Text Generation • 2B