Chathouse: Chat with AI

AI chat for iPhone/iPad, with ollama and reasoning support

Chathouse is your all-in-one gateway to a universe of AI-powered conversations. It supports lots of state-of-the-art models, including the best reasoning LLMs, and even lets you connect to self-hosted ollama instances. Of course, there's a free tier as well!

Marco Quinten
Hello, one of the developers here! Chathouse is designed to be as private as possible. We're an EU company, and we do not log any of your chats. Speech recognition is done on-device, fully local. If you connect to your own ollama/exo/etc. instance, your chats never leave your local network. If you use the built-in or provider-specific models, the data protection policies and terms of use of the specific providers apply. We do not log any prompts, but we can't guarantee that the inference provider doesn't either. We hope you enjoy Chathouse, and if you have any questions, feel free to ask!
Logan King

@fivesheep I love the multi-model support.

Logan King

Does Chathouse provide performance insights to help users choose the best model for their needs?

Marco Quinten

@logan_king Thanks for the question! Chathouse doesn't provide performance insights, but I think we do something that not many people are doing right now.


The Chathouse models are basically "proxy" models with a dynamic underlying model. The user doesn't have to care what the model is, and I think most users have no idea which model to use anyway. You gotta be very deep into the LLM rabbit hole to have any idea about the specific differences between different state-of-the-art models.


So in the Free and Pro plans, we automatically use one of the latest and greatest models available. Currently, that's Gemini 2.0 Flash Lite for Free and Gemini 2.0 Flash for Pro, which performs incredibly well in benchmarks and offers great latency and throughput, paired with good instruction following and problem-solving capabilities.


The Chathouse Pro+ model goes one step further and uses Not Diamond's prompt routing, so each prompt is actually sent to the model that's most likely to give the best answer. So with Pro+, not only can you freely choose between all available models; you're also very likely to get the best possible answer across all providers by just staying on the "default" Chathouse Pro+ model.


So TL;DR: you'll get very good results without having to know anything about LLMs, and on Pro+ you'll even get better results on average than if you were to choose the model manually.
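To make the "proxy model" idea above concrete, here's a minimal, hypothetical sketch: the client always talks to a stable user-facing model name, which is resolved at request time to whichever backend model the plan (or a per-prompt router) currently prefers. All names and the routing heuristic here are illustrative stand-ins, not Chathouse's actual code or Not Diamond's API.

```python
# Plan-level proxy names resolved to a concrete backend model.
# (Model identifiers are placeholders for this sketch.)
ROUTES = {
    "chathouse-free": "gemini-2.0-flash-lite",
    "chathouse-pro": "gemini-2.0-flash",
}

def resolve_model(proxy_name: str, prompt: str) -> str:
    """Map a user-facing proxy model to a concrete backend model."""
    if proxy_name == "chathouse-pro-plus":
        # Pro+ routes per prompt; a real router (e.g. Not Diamond)
        # would score candidate models on expected answer quality.
        # This stand-in just prefers a reasoning model for code-like
        # prompts, purely to illustrate the shape of the decision.
        if "def " in prompt or "```" in prompt:
            return "reasoning-model"
        return "gemini-2.0-flash"
    return ROUTES[proxy_name]

print(resolve_model("chathouse-free", "Hi!"))  # gemini-2.0-flash-lite
```

The benefit of this indirection is that the backend model can be swapped out (e.g. when a better model ships) without the user changing anything in the app.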

Nikitaben Bhikhabhai

It's fantastic to see you offering access to multiple LLMs and self-hosted ollama instances.

Marco Quinten

@nikitaben_bhikhabhai Thank you very much!


We noticed that there are a lot of AI cash-grabs on the App Store. Most of them are solely focused on selling their own subscription and ignore users who already pay for an AI provider such as OpenAI or Perplexity and simply want a more powerful UI. The same goes for ollama users.


There are very few options for simply connecting to your own ollama instance from your phone, and we wanted to make that as easy as possible, without locking it behind a paywall.
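For anyone curious what "connecting to your own ollama instance" looks like under the hood: ollama serves an HTTP API on port 11434 by default, and its `/api/chat` endpoint accepts a JSON chat request. The sketch below just builds such a request body; the LAN address and model name are placeholders, and actually sending it of course requires a running ollama instance on your network.

```python
import json

# Placeholder LAN address of a machine running `ollama serve`.
OLLAMA_URL = "http://192.168.1.50:11434/api/chat"

def build_chat_request(model: str, user_message: str) -> bytes:
    """Serialize a non-streaming chat request body for ollama's /api/chat."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # ask for one JSON response instead of chunks
    }
    return json.dumps(body).encode("utf-8")

payload = build_chat_request("llama3.2", "Why is the sky blue?")
# Send with e.g. urllib.request.urlopen(OLLAMA_URL, data=payload)
# when an instance is reachable; the chat never leaves your network.
```

Since the whole exchange happens over your local network, this is also why an app talking to ollama this way never needs to route your chats through a third-party server.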