
Small open models on consumer hardware

April 5, 2026 · David Ondrej, Ahmad, Simon Willison

David Ondrej and Ahmad argue that Gemma 4 is small enough to run on phones and home GPUs, while Simon Willison describes running multiple Gemma 4 variants both locally and via API.

- "It's so small that it can run on your phone"
- "Gemma 4 (31B, Dense)... Would run on any hardware at home"
- "both can run easily on consumer hardware at home"
- "the first three generated on my laptop via LM Studio"
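Running a Gemma variant locally with LM Studio, as Willison describes, also exposes it over an OpenAI-compatible HTTP API (by default at http://localhost:1234/v1), so you can query it with a few lines of standard-library Python. A minimal sketch, assuming LM Studio's local server is running; the model identifier "gemma-4-31b" is a placeholder assumption, so substitute whatever name LM Studio shows for the variant you have loaded:

```python
import json
import urllib.request


def build_chat_request(prompt, model="gemma-4-31b"):
    """Build the JSON body for a /v1/chat/completions call.

    The model name is an assumed placeholder; use the identifier
    LM Studio displays for your loaded model.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local_model(prompt, base_url="http://localhost:1234/v1"):
    """Send a chat completion request to a locally served model."""
    body = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    # OpenAI-compatible responses put the text under choices[0].message.content
    return reply["choices"][0]["message"]["content"]


# Example usage (requires LM Studio's server to be running):
# print(ask_local_model("Why can a 31B dense model run on a home GPU?"))
```

Because the endpoint follows the OpenAI wire format, the same request shape works against any OpenAI-compatible local server, not just LM Studio.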
Tags: open-models, local-ai, gemma

