#localai

5 posts · Last used 2d

@newsgroup@social.vir.group · 2d ago
Google has announced plans to invest up to $40 billion in Anthropic, the developer of Claude. The previous record was Microsoft's $13 billion in OpenAI; Amazon had already given Anthropic $4 billion. Three tech giants are now financially tied to the three leaders of the frontier AI race.

Independent players at this level have effectively disappeared. Computing power is controlled by the same cloud providers, training data flows from platforms owned by the same corporations, and talent goes to whoever has the deepest pockets. This is not competition. This is structural monopoly.

Anthropic markets itself as a safety-first AI lab built around Constitutional AI. The question remains: who writes the constitution? A private company now dependent on Google and Amazon. Corporate oversight of "safe AI" is not a neutral academic process. It is the definition of allowable knowledge through the lens of business interests.

When you talk to Claude, ChatGPT, or Gemini, your conversations are stored on corporate servers. The official reason: improving models. The default setting: data collection. You can opt out if you find the buried setting. Most users never will.

The alternative is not utopian. Open models such as Meta's LLaMA 4 and Mistral's releases can run locally on your own hardware, and Ollama and LM Studio make installation a single command. Local AI means no data collection. No corporate control. No surveillance. Frontier models are more capable, but for most daily tasks open models are sufficient.

The question is not technical. It is structural. And the window for keeping AI a public good rather than a corporate asset is closing.

https://newsgroup.site/google-40-billion-anthropic-ai-monopoly-open-source/ #AI #Monopoly #Google #Anthropic #OpenAI #Microsoft #Amazon #OpenSource #Privacy #LocalAI
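The "run locally with Ollama" workflow the post points to can be sketched in a few lines. This is a minimal sketch, assuming Ollama is installed, serving its default local API on port 11434, and that the named model has already been pulled (`llama3.3` here is illustrative); the point is that the request never leaves localhost.

```python
import json
import urllib.request

# Ollama's default local endpoint; no data leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the locally served model and return its reply."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("llama3.3", "Summarize the trade-offs of local vs. cloud LLMs."))
```

LM Studio exposes a similar local HTTP server, so the same pattern applies with a different port and payload shape.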
@13@2137.social · Apr 15, 2026
@newsgroup@social.vir.group · Apr 13, 2026
AMD GAIA just became a true desktop app — now you can build custom AI agents via chat, load your own documents, and run everything locally without sending a single byte to the cloud. Llama 3.3, Mistral, Phi-4 — all on your hardware. Full setup guide: https://newsgroup.site/amd-gaia-local-ai-agent-desktop-guide-2026/ #AI #LocalAI #Privacy #AMD #Linux
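The "load your own documents" pattern behind an agent like GAIA is, at its core, local retrieval: read files from disk, pick the passages most relevant to the question, and hand only those to a locally served model. A dependency-free sketch of that retrieval step (the chunk size and keyword-overlap scoring are illustrative simplifications, not GAIA's actual pipeline):

```python
from pathlib import Path

def chunk(text: str, size: int = 400) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(query: str, passage: str) -> int:
    """Crude relevance: count query words that appear in the passage."""
    words = {w.lower() for w in query.split()}
    text = passage.lower()
    return sum(1 for w in words if w in text)

def load_folder(folder: str) -> dict[str, str]:
    """Read every .txt/.md file in a local folder; nothing is uploaded."""
    return {p.name: p.read_text(encoding="utf-8")
            for p in Path(folder).iterdir()
            if p.suffix in {".txt", ".md"}}

def top_chunks(query: str, docs: dict[str, str], k: int = 3) -> list[str]:
    """Return the k highest-scoring chunks across all loaded documents."""
    chunks = [c for text in docs.values() for c in chunk(text)]
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]
```

The selected chunks would then be prepended to the prompt sent to the local model (e.g. Llama 3.3 or Phi-4 via a local API), keeping the whole loop on your own hardware.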
@thbley@phpc.social · Apr 03, 2026
New update to the slides for my talk "Run LLMs Locally": WebGPU. Models can now run completely inside the browser using Transformers.js, Vulkan, and WebGPU (slower than llama.cpp, but already usable). https://codeberg.org/thbley/talks/raw/branch/main/Run_LLMs_Locally_2026_ThomasBley.pdf #ai #llm #llamacpp #stablediffusion #gptoss #qwen3 #glm #localai #webgpu
@gadgetchecks@burningboard.net · Feb 12, 2026