gemma3 is absolutely trash.
#9 · opened by Maani
While this model isn't Gemma 3 itself, it still uses the same attention implementation. That implementation is terrible for quantization, terrible for inference, and terrible for basically any kind of computation other than taking it as-is and running inference on TPUs.
I might be alone in asking for this, but I'd very much appreciate it if you left a dumpster fire like Google behind and used a standard attention implementation that actually works with the rest of the ecosystem, instead of fighting with developers just so Google's shares don't drop. What has Google done for us lately, other than bleeding us dry like the parasite it is?
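
For anyone stuck with it in the meantime, here's a minimal sketch of the kind of workaround I'd try, assuming you're loading through 🤗 transformers: force a plain SDPA attention backend at load time so you at least stay off the custom eager path. The model id below is just a placeholder, swap in whichever checkpoint this thread is actually about.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id -- replace with the checkpoint you're actually using.
model_id = "google/gemma-3-1b-it"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="sdpa",  # plain PyTorch scaled-dot-product attention
    device_map="auto",           # requires accelerate
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

No promises this fixes the quantization side of things, but it's the first knob I'd turn before giving up on the model entirely.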