Qwen 1.5 MoE
Highly efficient mixture-of-experts (MoE) model from Alibaba
0 reviews • 3 shoutouts • 47 followers
What is Qwen 1.5 MoE?
Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
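For anyone who wants to try the model, here is a minimal sketch of chatting with it via Hugging Face transformers. It assumes the checkpoint is published on the Hub as Qwen/Qwen1.5-MoE-A2.7B-Chat and that your installed transformers release supports the Qwen2 MoE architecture; the model ID, prompt, and generation settings are illustrative, not an official recipe.

```python
# Minimal sketch: querying Qwen1.5-MoE-A2.7B through Hugging Face transformers.
# Assumes the checkpoint "Qwen/Qwen1.5-MoE-A2.7B-Chat" exists on the Hub and
# that the installed transformers version supports the Qwen2 MoE architecture.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-MoE-A2.7B-Chat"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype stored in the checkpoint
    device_map="auto",   # spread layers across available GPU(s)/CPU
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Only ~2.7B parameters are activated per token, so inference cost is closer
# to a 3B dense model even though output quality is in the 7B class.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```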
Recent Qwen 1.5 MoE Launches
Qwen 1.5 MoE • Highly efficient mixture-of-experts (MoE) model from Alibaba
Launched on April 3rd, 2024
Qwen 1.5 MoE Alternatives
Airtrain.ai LLM Playground • 3 reviews • Data analysis tools
LLM Explorer • 0 reviews • AI Coding Assistants
Maker reviews of Qwen 1.5 MoE
Salman Paracha used this to build Arch (289 points):
"Highly performant base models that can be used for task-specific training, such as the function-calling experience built into Arch."
4mo ago