Easily invoke and manage local AI models in your browser.
LocalAPI.AI is a local AI management tool designed for Ollama. It runs entirely in the browser from a single HTML file, with no complex setup required, and is also compatible with vLLM, LM Studio, and llama.cpp.