🎉 New: Introducing the lmstudio-python and lmstudio-js SDK libraries!


Your local AI toolkit.
Download and run Llama, DeepSeek, Mistral, and Phi on your computer.
By using LM Studio, you agree to its terms of use.


Beginner Friendly, with Expert Features
Easy to start, much to explore
Discover and download open-source models, use them in chats, or run a local server (see the example below).
Chat
Easily run LLMs like Llama and DeepSeek on your computer. No expertise required.
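
Once the local server is started from the app, it exposes an OpenAI-compatible API, by default at http://localhost:1234/v1. Here is a minimal sketch using the official openai Python client; the model identifier below is a placeholder for whichever model you have loaded:

from openai import OpenAI

# Point the standard OpenAI client at LM Studio's local server.
# No real API key is needed, but the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="your-loaded-model",  # placeholder: identifier of a model loaded in LM Studio
    messages=[{"role": "user", "content": "What is a Capybara?"}],
)
print(response.choices[0].message.content)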

With LM Studio, you can ...

Libraries: a cross-platform local AI SDK
LM Studio SDK: Build local AI apps without dealing with dependencies.
Available for Python (lmstudio-python) and TypeScript (lmstudio-js).
1. Install the SDK using pip
pip install lmstudio
2. Use the SDK in your code: LLM Chat, Agentic Tools, Structured Output, Manage Models. A streaming chat example follows, with sketches of the other three after it.
import lmstudio as lms

llm = lms.llm()  # Get any loaded LLM
prediction = llm.respond_stream("What is a Capybara?")

# Print the reply as it streams in
for fragment in prediction:
    print(fragment.content, end="", flush=True)
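
Agentic Tools: a minimal sketch of tool calling with the SDK's act() API, where the model can invoke a plain Python function you hand it. The multiply tool and the prompt are illustrative, and a tool-capable model should be loaded.

import lmstudio as lms

def multiply(a: float, b: float) -> float:
    """Multiply two numbers and return the product."""
    return a * b

model = lms.llm()  # any loaded, tool-capable LLM

# The model decides when to call multiply(); on_message prints each message in the exchange.
model.act(
    "What is 12345 multiplied by 54321?",
    [multiply],
    on_message=print,
)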
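
Structured Output: a minimal sketch that constrains the model's reply to a schema, here given as a Pydantic model. The BookSchema fields and the prompt are illustrative.

import lmstudio as lms
from pydantic import BaseModel

class BookSchema(BaseModel):
    title: str
    author: str
    year: int

model = lms.llm()
result = model.respond("Tell me about The Hobbit", response_format=BookSchema)
print(result.parsed)  # parsed data matching BookSchema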
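
Manage Models: a sketch of loading, listing, and unloading models from code, assuming the SDK's convenience helpers lms.llm(), lms.list_loaded_models(), and unload(). The model key below is a placeholder; substitute one you have downloaded.

import lmstudio as lms

# Get a handle to a specific model, loading it if it is not already in memory
model = lms.llm("your-model-key")

# See everything currently loaded
for loaded in lms.list_loaded_models():
    print(loaded)

# Free the memory when you are done
model.unload()
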
Frequently Asked Questions
Does LM Studio collect any data?
TL;DR: The app does not collect data or monitor your actions. Your data stays local on your machine.