Happy Wednesday! Don't forget to subscribe to Maker Stacks. This week, we're interviewing serial founder and Product Hunt community veteran Mubs about his tech stack.
Here's some news:
- Meta just showed off Threads fediverse integration for the first time.
- Segway just announced a sub-$1,000 smart lawnmower.
- After raising $1.3 billion, the Inflection founders are joining Microsoft to lead its AI efforts.
Come hang with the Product Hunt team, Mercury, and Jam.dev at our next event on April 9th. Meet the team, chat with makers, and discover some of the coolest AI products being built today, all while enjoying good food and good drinks.
Spots filled up fast for our last event, so be quick!
When you think of AI, your mind might jump straight to consumer-facing apps like ChatGPT and Sora. But like everything else in tech, there's a lot going on under the hood (namely chips), and a few companies have stood out from the crowd.
Nvidia is one of those standouts, riding the AI wave to untold levels of success and, at one point, surpassing both Alphabet (Google) and Amazon in market cap, thanks in large part to its AI chips, particularly the H100 GPU.
Now, the company plans to cement its lead even further with its new B200 and GB200 "superchips."
Announced yesterday at an event dubbed the "AI Woodstock," the new B200 packs 208 billion transistors and can offer up to 20 petaflops of FP4 horsepower. Combined with a Grace CPU, it could potentially deliver up to 30 times the performance for LLM inference while being a lot more efficient.
To put that into perspective, the H100 GPU, which was considered the industry standard for AI prior to this, has around 80 billion transistors.
What does all that mean? Well, it means that for the first time in my life, I can confidently say a number with a billion after it is small relative to another.
Second, it's a lot of tech speak to say these chips are really powerful. How powerful? According to Nvidia, training a 1.8-trillion-parameter model on the company's previous line of chips would have taken 10,000 GPUs and 15 megawatts of power. Swap the old for the new, and that drops to 2,000 GPUs and just 4 megawatts.
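Taking Nvidia's quoted figures at face value, the back-of-the-envelope math works out to roughly a 5x reduction in GPU count and a 3.75x reduction in power draw for the same training run:

```python
# Sanity-checking the efficiency claims using only the numbers
# quoted in Nvidia's own comparison above.

# Previous-generation figures for training a 1.8T-parameter model:
old_gpus, old_megawatts = 10_000, 15

# New-generation (Blackwell) figures for the same workload:
new_gpus, new_megawatts = 2_000, 4

gpu_reduction = old_gpus / new_gpus              # 5.0x fewer GPUs
power_reduction = old_megawatts / new_megawatts  # 3.75x less power

print(f"GPU count reduced {gpu_reduction:.2f}x")
print(f"Power draw reduced {power_reduction:.2f}x")
```

So the chips aren't just faster per unit; by Nvidia's numbers, the whole cluster gets smaller and cheaper to run at the same time.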
According to Nvidia CEO Jensen Huang, the chips will cost in the ballpark of $30,000 to $40,000 each. So yeah, it's safe to say that you and I won't be buying one to power our ChatGPT-wrapper side projects anytime soon.
Tired of explaining the same thing over and over to your colleagues? It's time to delegate that work to AI. guidde is a GPT-powered tool that helps you explain the most complex tasks in seconds with AI-generated documentation.
Simply click capture in our browser extension, and the app will automatically generate step-by-step video guides complete with visuals, voiceover, and calls to action.
The best part? Our extension is 100% free.
- Stability AI just released a new text-to-3D video AI model.
- ThinkAny is a search engine that uses AI to curate higher-quality content.
- Inner Vision Pro is a Vision Pro app for your mental health.