In r/LocalLLaMA, the Claude Code leak triggered a wave of attempts to treat it as open source and to run Claude Code-like tooling against local and OpenAI-compatible endpoints, including patches to remove hardcoded Anthropic assumptions.
Claude Code's source code has been leaked via a source map file in the npm registry
All hail Claude Code, because it is now "Open Source"?
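The "leaked via a map file" angle comes down to how source maps work: a `.map` file is JSON, and its optional `sourcesContent` array can embed the original, unminified source files verbatim. A minimal sketch of recovering embedded sources from such a map (the map object and file names here are illustrative, not the actual leaked artifact):

```typescript
// A toy source map; real ones are emitted by bundlers alongside minified JS.
// "sourcesContent" is the field that can accidentally ship original code.
const sourceMap = {
  version: 3,
  file: "cli.js",
  sources: ["src/cli.ts"], // hypothetical path, for illustration only
  sourcesContent: [
    "// original TypeScript source embedded in the map\nexport const main = () => {};\n",
  ],
  mappings: "AAAA",
};

// Pair each entry in "sources" with its embedded content, if present.
function extractSources(map: {
  sources: string[];
  sourcesContent?: string[];
}): Record<string, string> {
  const out: Record<string, string> = {};
  (map.sourcesContent ?? []).forEach((content, i) => {
    out[map.sources[i]] = content;
  });
  return out;
}

const recovered = extractSources(sourceMap);
```

Publishing the map without stripping `sourcesContent` is enough to expose the original sources, which is consistent with how the leak was described.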
Forked Claude Code but couldn't get it running with my local models because of a hardcoded Anthropic client, so now the CLI auto-detects the provider from the model name and environment variables.
Ollama, LM Studio, OpenAI, xAI, or any OpenAI-compatible endpoint works
Tested on Windows 11 with Ollama