@levelsio says Gemma 4 running locally on an iPhone wasn’t useful enough even for survival tips, implying current on-device small models still fall short.
“Google's Gemma 4 on a 128 GB Macbook Pro is near AGI on the go, no internet needed”
Gemma 4, bringing our most intelligent open models and breakthrough reasoning to your personal hardware and devices while outcompeting models 20x its size
Tried Gemma 4 running locally on my iPhone today
I thought it'd be useful in case the apocalypse happens
I guess I'll freeze to death instead
Gemma 4 is now available on Locally AI for iOS where you can easily use it offline.
Airplane mode by default.