Trending Topic

Local and edge AI momentum

April 3, 2026 · Matthew Berman, Georgi Gerganov, Awni Hannun

Matthew Berman says Gemma 4 "fits on most consumer hardware" and predicts a hybrid of hosted and edge models; Georgi Gerganov touts day-0 llama.cpp support; and Awni Hannun celebrates running LMs with Ollama + MLX.

Matthew Berman: "Gemma 4 is a bigger deal than more people realize... It's an incredible model that fits on most consumer hardware. The future is hybrid hosted/edge models."

Georgi Gerganov: "Gemma 4 is here! The best open-source model you can run on your machine."

Awni Hannun: "You can now run LMs with Ollama + MLX!"
Tags: open-models, edge, tooling


This finding is one of many signals tracked across Artificial Intelligence. The live feed updates every few hours with new expert voices, debates, and emerging ideas.
