What are your biggest learnings when building with GPT?
Christophe Pasquier
5 replies
I was thinking about this when reflecting on our latest launch (basically ChatGPT for your knowledge base, check it out): all the builders out there will face exactly the same issues.
So I listed my top 3 learnings/challenges I had to overcome working with it. Sharing them here, I'd love to hear lessons from others!
Replies
Christophe Pasquier@christophepas
Slite
Here are mine!
#1 You can't just feed everything to GPT and expect an answer.
Whatever your use case, these models are craaaazy expensive, and they all have an input limit. An average team on our platform has 5,000 docs, 1.5M words, 3M tokens. The limit is 4,000 tokens...
-> You need a way to cut your input
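A minimal sketch of what "cutting your input" can look like: split docs into chunks small enough that a retrieval step can pick the few that fit the context window. The chunk size, the words-per-token heuristic, and the budget numbers here are assumptions for illustration, not Slite's actual pipeline:

```python
# Stage one of trimming the input: split a large knowledge base into
# chunks, so a later retrieval step can pick only what fits alongside
# the question in GPT's context window.

def chunk_document(text: str, max_tokens: int = 300) -> list[str]:
    """Split a document into chunks of roughly `max_tokens` tokens.

    Uses the common ~0.75 words-per-token rule of thumb instead of a
    real tokenizer, to keep the example dependency-free.
    """
    max_words = int(max_tokens * 0.75)
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]

# With a 4,000-token window you might budget ~3,000 tokens for retrieved
# chunks and keep the rest for the question and the generated answer.
CONTEXT_BUDGET = 3000
CHUNKS_PER_QUESTION = CONTEXT_BUDGET // 300  # ~10 chunks make the cut
```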
#2 Keep cost low with other models
Even if you restrict your GPT usage - and you should - it is crazy costly. GPT is not great for everything: there are much better models for classification, embeddings, and so on.
Be smart, and as much as possible keep GPT for the last mile of your interface.
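To make the "cheap models first" idea concrete, here is a rough sketch of ranking chunks by similarity to the question using embeddings from a much cheaper model, so only the top few ever reach GPT. The embeddings are assumed to come from whatever inexpensive model you prefer; nothing in this snippet is Slite-specific:

```python
# Rank chunks with embeddings (orders of magnitude cheaper per token
# than GPT completions) and only send the best matches to GPT.

import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_chunks(question_vec: list[float],
               chunk_vecs: list[list[float]],
               chunks: list[str],
               k: int = 10) -> list[str]:
    """Return the k chunks whose embeddings are closest to the question."""
    scored = sorted(
        zip(chunks, chunk_vecs),
        key=lambda pair: cosine_similarity(question_vec, pair[1]),
        reverse=True,
    )
    return [chunk for chunk, _ in scored[:k]]
```

Only those k chunks get passed to the expensive completion call; everything before that runs on the cheap model.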
#3 Generate *accurate* results.
In our case the result is an answer. Now, let's say you write in a doc "Van Gogh painted Mona Lisa".
If you ask Slite who did, it will answer Van Gogh
Your docs, your answers
It's our secret sauce, but we manage to show the sources the answer came from, and while it's just a UX tweak - we'll still say Van Gogh ^^ - it changes everything for your user.
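One generic way to wire up that "answer with sources" pattern: only the retrieved chunks go into the prompt, and the same chunks come back alongside the answer so the UI can link to the original docs. The prompt wording and the `call_gpt` callback are illustrative assumptions, not Slite's implementation:

```python
# Grounding sketch: ask GPT to answer only from the retrieved chunks,
# and keep track of which chunks were shown so the UI can surface them
# as sources next to the answer.

def build_prompt(question: str, chunks: list[str]) -> str:
    """Number the retrieved chunks and ask the model to cite them."""
    numbered = "\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    return (
        "Answer the question using ONLY the sources below. "
        "Cite the source numbers you used.\n\n"
        f"Sources:\n{numbered}\n\n"
        f"Question: {question}\nAnswer:"
    )

def answer_with_sources(question: str, chunks: list[str], call_gpt):
    """Return the completion plus the chunks it was grounded on."""
    prompt = build_prompt(question, chunks)
    completion = call_gpt(prompt)   # your GPT completion call of choice
    return {
        "answer": completion,
        "sources": chunks,          # shown to the user as links
    }
```

If the source doc says "Van Gogh painted Mona Lisa", the answer will still say Van Gogh, but the user sees exactly which doc it came from.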
--
That's it, those are my learnings! Share yours, and if you're curious about what we built, support and check it out: https://www.producthunt.com/post...
TweetBoostr
Really true and we have faced similar issues.
TweetBoostr
What did you do to cut the cost? Specifically in fine-tuning the model?
@uday_patel4 aha, that's the secret sauce ;) But as said, the big part is about not expecting GPT to do everything. While it can feel quite smart, it's just a text generation model, and other models are much more suited and cheaper for certain tasks.