marcusmartins left a comment
Recently I was on a long flight, and having Ollama (with llama2) running locally really helped me prototype some quick changes to our product without having to rely on spotty plane wifi.
Ollama
The easiest way to run large language models locally