Alex Pinel left a comment:
Here is a little pet project of mine: a standalone app for running a local LLM, with a focus on loading and interacting with documents in a fully local environment.
It runs well on Apple Silicon and on Windows with GPU acceleration. Currently supported file types are PDF, DOCX, XLSX, PPTX, and Markdown. 16 GB of RAM is recommended and may actually be necessary.
Hope you find it interesting! Please...
Dot
Local AI keeping your data safe
Dot is an AI tool designed for privacy-conscious users who want to harness the power of language models without compromising their data. Engineered to run locally, Dot enables users to train an LLM with their documents and files.
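For anyone curious what interacting with documents in a fully local environment can look like in code, here is a rough, hypothetical sketch of the general pattern rather than Dot's actual implementation: extract text from a PDF with pypdf and feed it as context to a local GGUF model through llama-cpp-python. The file names and model path are placeholders.

```python
# Minimal local document Q&A sketch (illustrative only, not Dot's code).
from pypdf import PdfReader
from llama_cpp import Llama

def ask_about_pdf(pdf_path: str, question: str, model_path: str) -> str:
    # Extract plain text from the document, entirely on the local machine.
    reader = PdfReader(pdf_path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)

    # Load a quantized local model (a GGUF file) via the llama.cpp bindings.
    llm = Llama(model_path=model_path, n_ctx=4096, verbose=False)

    # Keep the prompt small; a real app would chunk and retrieve passages.
    prompt = (
        "Answer the question using only the document below.\n\n"
        f"Document:\n{text[:3000]}\n\n"
        f"Question: {question}\nAnswer:"
    )
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"].strip()

if __name__ == "__main__":
    # Placeholder paths: any local PDF and GGUF model will do.
    print(ask_about_pdf("report.pdf",
                        "What is the main conclusion?",
                        "models/llama-3-8b-instruct.Q4_K_M.gguf"))
```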