Would you prefer local LLM software?
Roche
There are many online LLM UIs, like Monica and MaxAI.
Would you prefer a local LLM to chat with PDFs, or to chat with something else you like?
If yes, which model or capability would you want?
Replies
Ruben Boonzaaijer@ruben_boonz
Ringly.io
I use Ollama to host my local models, like Mistral and Llama 2.
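For anyone curious what that looks like in practice: Ollama exposes a local REST API (by default on port 11434), and a `/api/generate` request takes a JSON body naming the model and prompt. A minimal sketch, assuming a default Ollama install; the actual HTTP call is commented out since it needs the Ollama daemon running locally:

```python
import json

def build_generate_request(model: str, prompt: str) -> str:
    # Shape of the JSON body Ollama's /api/generate endpoint expects.
    # "stream": False asks for a single response instead of chunks.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload)

body = build_generate_request("mistral", "Summarize local LLM options in one sentence.")

# With the Ollama daemon running, you would POST this to the local server,
# e.g. with curl:
#   curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "...", "stream": false}'
print(body)
```

The same pattern works for any model you have pulled (`ollama pull mistral`, `ollama pull llama2`); you just swap the model name in the payload.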