  • I’m Gal, maker of Clerkly, and I have hacked together AI and government data. Ask me anything!

    gal
    I built an AI system that uses only official U.S. government sources with GPT-4 to help us navigate the bureaucracy. The result is pretty cool; you can play around with the demo at https://app.clerkly.co/ . Feel free to ask anything!

    How Did I Achieve This?

    Data Location
    First, I had to locate all the relevant government data. I spent a considerable amount of time browsing federal and local .gov sites to find all the domains we needed to crawl.

    Data Scraping
    Data was scraped from publicly available sources using the Apify (https://apify.com/) platform. Setting up the crawlers and excluding undesired pages (such as random address books, archives, etc.) was quite challenging, as no single format fits all. For quick processing, I used Llama 2.

    Data Processing
    The data had to be processed into chunks for vector-store retrieval. I drew inspiration from LlamaIndex but ultimately had to develop my own solution, since the library did not meet all my requirements.

    Data Storing and Links
    For data storage, I am using GraphDB. Entities extracted with Llama 2 are used to create linkages.

    Retrieval
    This is the most crucial part: since GPT-4 generates the answers, providing high-quality context is essential. Retrieval is done in two stages. This phase involves a lot of trial and error, and it is important to keep the target user in mind.

    Answer Generation
    After the query is processed by the retriever and the desired context is obtained, I simply call the GPT-4 API with a RAG prompt to get the desired result.

    P.S.: We are launching in 30 days on Product Hunt - https://www.producthunt.com/posts/clerkly
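The chunking step described under "Data Processing" can be sketched roughly like this. This is a minimal illustration, not Clerkly's actual code; the chunk size and overlap values are assumptions, and a real pipeline would more likely count tokens rather than characters:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split a document into overlapping chunks for vector-store retrieval.

    Overlap helps a sentence that straddles a chunk boundary remain
    retrievable from at least one chunk.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Stand-in for a scraped .gov page:
page = "How to renew a passport by mail. " * 100
chunks = chunk_text(page)
```

Each chunk would then be embedded and indexed in the vector store.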
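The "Data Storing and Links" idea — entities creating linkages between documents — can be pictured as a small graph of entity-to-document edges. This is a toy in-memory sketch; the entity lists, document IDs, and schema are my assumptions, and the real system would persist these as RDF statements in GraphDB with entities coming from Llama 2 extraction:

```python
from collections import defaultdict

def link_entities(doc_id: str, entities: list[str], graph: dict) -> None:
    """Record which documents mention which entities, so documents that
    share an entity (e.g. an agency name) become linked."""
    for entity in entities:
        graph[entity].add(doc_id)

def related_docs(doc_id: str, graph: dict) -> set[str]:
    """Documents connected to doc_id through at least one shared entity."""
    related = set()
    for docs in graph.values():
        if doc_id in docs:
            related |= docs
    related.discard(doc_id)
    return related

graph = defaultdict(set)
# Entity lists below are illustrative; in the pipeline they'd come from Llama 2.
link_entities("irs.gov/form-1040", ["IRS", "income tax"], graph)
link_entities("usa.gov/file-taxes", ["IRS"], graph)
link_entities("state.gov/passports", ["Department of State"], graph)
```

At retrieval time, such linkages let you pull in neighboring documents that a pure vector search might miss.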
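The post doesn't spell out what the two retrieval stages are. A common pattern — and only an assumption about Clerkly's internals — is a cheap, broad first-pass recall followed by a more careful rerank of the shortlist. A toy version with word overlap standing in for vector similarity:

```python
def recall(query: str, docs: dict[str, str], k: int = 10) -> list[str]:
    """Stage 1: cheap candidate recall by word overlap.
    (A production system would use vector similarity here.)"""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: -len(q & set(docs[d].lower().split())))
    return ranked[:k]

def rerank(query: str, candidates: list[str], docs: dict[str, str],
           k: int = 3) -> list[str]:
    """Stage 2: finer scoring of the shortlist — here, fraction of query
    words covered, with a small penalty for very long pages.
    (A production system might use a cross-encoder or an LLM judge.)"""
    q = set(query.lower().split())
    def score(d: str) -> float:
        words = set(docs[d].lower().split())
        return len(q & words) / len(q) - 0.001 * len(docs[d].split())
    return sorted(candidates, key=score, reverse=True)[:k]

docs = {
    "passport": "how to renew a passport with the department of state",
    "taxes": "how to file federal income taxes with the irs",
    "parks": "find a national park near you",
}
query = "how do i renew my passport"
top = rerank(query, recall(query, docs), docs)
```

The trial and error the post mentions mostly lives in stage 2: tuning what "high-quality context" means for the target user.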
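The answer-generation step reduces to stitching the retrieved chunks into a prompt and calling GPT-4. A sketch of the prompt assembly — the wording and message layout are mine, not Clerkly's; the actual API call is shown only as a comment:

```python
def build_rag_prompt(question: str, contexts: list[str]) -> list[dict]:
    """Assemble chat messages that restrict the model to retrieved context."""
    context_block = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(contexts))
    system = (
        "Answer using ONLY the official government sources below. "
        "If the answer is not in the sources, say you don't know.\n\n"
        + context_block
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

messages = build_rag_prompt(
    "How do I renew my passport?",
    ["Renew by mail with Form DS-82 if eligible."],  # example retrieved chunk
)
# The call itself would then look something like:
#   from openai import OpenAI
#   reply = OpenAI().chat.completions.create(model="gpt-4", messages=messages)
```

Grounding the system message in retrieved chunks is what keeps the answers tied to official sources rather than the model's general knowledge.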

    Replies

    gal
    You can ask technical and non-technical questions! :D