
Ollama
The easiest way to run large language models locally
16 reviews • 3 shoutouts • 380 followers
What is Ollama?
Run Llama 2 and other models on macOS, with Windows and Linux coming soon. Customize and create your own.
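Customization works through a Modelfile. The sketch below is illustrative, assuming Ollama's Modelfile format; the base model name, parameter value, and system prompt are example choices, not recommendations:

```
# Modelfile: derive a customized model from the base Llama 2 weights
FROM llama2

# Sampling temperature (higher values give more varied output); illustrative value
PARAMETER temperature 0.8

# System prompt baked into the customized model
SYSTEM "You are a concise assistant that answers in plain English."
```

You would then build and run the customized model from a terminal with `ollama create my-llama -f Modelfile` followed by `ollama run my-llama`.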

5/5 based on 16 reviews
Maker reviews of Ollama


Ollama is the choice for our users who want local graphs; it works best with 32B-parameter models.


It's one of the best apps for running models locally.
Reviews
Ollama is the best way to run LLMs locally and the easiest way to test, build, and deploy new models. It has opened my eyes to the world of LLMs, a fantastic product.
Very easy and powerful for running and customizing local LLMs, and for integrating with frameworks such as LangChain or LlamaIndex.
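Those framework integrations ultimately talk to Ollama's local HTTP API. A minimal sketch of calling it directly, assuming the default endpoint at `http://localhost:11434` and using `llama2` as an illustrative model name:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an HTTP POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt to a locally running Ollama server and return its reply.

    Requires `ollama serve` to be running; not executed here.
    """
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Because the API is plain JSON over HTTP, the same request shape works from any language or framework, which is what makes the LangChain and LlamaIndex integrations straightforward.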