Ollama

The easiest way to run large language models locally
16 reviews • 3 shoutouts • 380 followers

What is Ollama?

Run Llama 2 and other models on macOS, with Windows and Linux coming soon. Customize and create your own.
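The "customize and create your own" part works through a Modelfile, a short config that layers parameters and a system prompt on top of a base model. A minimal sketch (the temperature value and system prompt here are illustrative, not defaults):

```
FROM llama2
PARAMETER temperature 0.8
SYSTEM "You are a concise technical assistant."
```

Build and run the customized model with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`; a plain `ollama run llama2` is enough to try the base model.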

Recent Ollama Launches

Forum Threads

Chris Messina • 1yr ago

Ollama - The easiest way to run large language models locally

5/5 based on 16 reviews

Maker reviews of Ollama

Boris Arzentar used this to build cognee (360 points)
Ollama is the choice for our users wanting local graphs; it works best with 32B-parameter models.
Ozgur Ozer used this to build AI Renamer (134 points)
It's one of the best apps to run models locally.
Aaron Ng used this to build Apollo AI (234 points)
A great way to try out LLMs on your desktop. Also great for hosting your own AI servers for apps like Apollo.

Reviews

Kim Hallberg • 4 reviews

Ollama is the best way to run LLMs locally and the easiest way to test, build, and deploy new models. It has opened my eyes to the world of LLMs, a fantastic product.

Xingcan HU • 6 reviews

Easy to use, like Docker for AI.

Mackers • 2 reviews

This looks really interesting. We'll be looking further into it in the upcoming week, as it could be a lifesaver for us.

Sanjay S • 10 reviews

Easy to run multiple AI models locally without worrying about privacy issues.

Adrien SALES • 2 reviews

Very easy and powerful to run and customize local LLMs, and to integrate (LangChain or LlamaIndex).
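Integrations like the one this review mentions typically go through Ollama's local REST API on port 11434, which LangChain and LlamaIndex wrap. A minimal sketch that only builds the request body for the `/api/generate` endpoint (the model name and prompt are illustrative; actually sending the request requires a running Ollama server):

```python
import json

# Ollama's default local endpoint (assumes a server started with `ollama serve`)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request("llama2", "Why is the sky blue?")

# Sending it (commented out: needs the Ollama server running locally):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body.encode(), headers={"Content-Type": "application/json"}
# )
# print(urllib.request.urlopen(req).read().decode())
```

Higher-level wrappers in LangChain and LlamaIndex hide this plumbing, but they talk to the same local endpoint.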

© 2025 Product Hunt