SOFT CAT.ai
news
horizon
thoughts
radar
tools
pipeline
play
#local-llm
1 post tagged #local-llm.
Tools & Experiments
Ollama
Run open-source LLMs locally with one command. No GPU required.
active
local-llm
open-source
self-hosted
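A minimal sketch of the "one command" workflow the card describes, using Ollama's standard CLI and local REST API (the model name `llama3.2` is just an example; any model from the Ollama library works):

```shell
# Install Ollama (macOS/Linux install script; Windows has a separate installer)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model in one command (downloads weights on first run)
ollama run llama3.2

# Alternatively, run the local server and query its REST API
ollama serve &
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why run models locally?",
  "stream": false
}'
```

On machines without a GPU, Ollama falls back to CPU inference, which is slower but works for the smaller quantized models.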