Open-source data pipeline tool for transforming and integrating data. The modern replacement for Airflow.
- Integrate and synchronize data from 3rd party sources
- Build real-time and batch pipelines to transform data using Python, SQL, and R (see the sketch after this list)
- Run, monitor, and orchestrate thousands of pipelines without losing sleep
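
To make the Python transform claim concrete, here is a minimal sketch of the kind of batch step such a pipeline orchestrates. The function names, column names, and CSV layout are illustrative assumptions, not the tool's actual block API.

```python
# Illustrative only: a minimal batch transform of the kind such a pipeline
# would run as one step. Loader/transformer names and columns are hypothetical.
import pandas as pd


def load_orders(path: str) -> pd.DataFrame:
    """Load raw order data from a 3rd-party export (CSV assumed)."""
    return pd.read_csv(path, parse_dates=["created_at"])


def transform_orders(orders: pd.DataFrame) -> pd.DataFrame:
    """Aggregate daily revenue per customer."""
    orders["day"] = orders["created_at"].dt.date
    return (
        orders.groupby(["customer_id", "day"], as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "daily_revenue"})
    )


if __name__ == "__main__":
    df = transform_orders(load_orders("orders.csv"))
    print(df.head())
```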
A voice AI platform providing APIs for speech-to-text, text-to-speech, and language understanding. From medical transcription to autonomous agents, Deepgram is the go-to choice for developers building voice AI experiences.
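
As a rough illustration of using the speech-to-text API from code, the sketch below posts an audio file to Deepgram's pre-recorded transcription endpoint with `requests`. The endpoint, parameters, and response layout reflect Deepgram's documented API as understood here; verify against the current docs, and note the API key is a placeholder.

```python
# Minimal sketch of a hosted speech-to-text call over HTTPS. Endpoint, auth
# header, query params, and response schema are taken from Deepgram's docs as
# understood here; check current documentation before relying on them.
import requests

DEEPGRAM_API_KEY = "YOUR_API_KEY"  # placeholder


def transcribe(audio_path: str) -> str:
    with open(audio_path, "rb") as audio:
        response = requests.post(
            "https://api.deepgram.com/v1/listen",
            headers={
                "Authorization": f"Token {DEEPGRAM_API_KEY}",
                "Content-Type": "audio/wav",
            },
            params={"punctuate": "true"},
            data=audio,
        )
    response.raise_for_status()
    body = response.json()
    # Response layout assumed from the documented schema:
    # results -> channels[0] -> alternatives[0] -> transcript
    return body["results"]["channels"][0]["alternatives"][0]["transcript"]


if __name__ == "__main__":
    print(transcribe("sample.wav"))
```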
The Labelf AI Platform aims to let anyone, regardless of prior knowledge, create and use AI text classification models. It uses an advanced approach to active learning to accelerate training on both labeled and unlabeled data, and can also be used for data analysis.
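
Labelf's exact method is not described here, so the following is only a generic sketch of active learning via uncertainty (margin) sampling: train on the labeled texts, then surface the unlabeled texts the classifier is least sure about for human labeling. All names and data are illustrative.

```python
# Generic active-learning sketch (not Labelf's implementation): pick the
# unlabeled texts the current classifier is least certain about.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression


def most_uncertain(labeled_texts, labels, unlabeled_texts, n=5):
    """Return indices of the unlabeled texts to send for labeling next."""
    vec = TfidfVectorizer().fit(labeled_texts + unlabeled_texts)
    clf = LogisticRegression(max_iter=1000).fit(vec.transform(labeled_texts), labels)
    proba = clf.predict_proba(vec.transform(unlabeled_texts))
    sorted_proba = np.sort(proba, axis=1)
    margin = sorted_proba[:, -1] - sorted_proba[:, -2]  # top-two class gap
    return np.argsort(margin)[:n]                       # smallest gap = most uncertain


if __name__ == "__main__":
    labeled = ["great product", "terrible support", "love it", "awful quality"]
    y = [1, 0, 1, 0]
    unlabeled = ["not sure how I feel", "fantastic", "could be better", "ok I guess"]
    print(most_uncertain(labeled, y, unlabeled, n=2))
```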
Machine learning models are only as good as the datasets they’re trained on, yet it’s extremely difficult to improve dataset quality. Aquarium uses deep learning to find problems in your model’s performance and to edit your dataset to fix these problems.
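
Aquarium's internals aren't spelled out in this blurb; the sketch below shows one generic check in the same spirit, flagging examples whose nearest neighbors in embedding space mostly carry a different label, a common symptom of labeling errors or ambiguous data. The function, threshold, and synthetic data are illustrative assumptions.

```python
# Generic embedding-based dataset check (not Aquarium's actual method): flag
# examples that disagree with most of their nearest neighbors in embedding space.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def flag_suspect_labels(embeddings: np.ndarray, labels: np.ndarray, k: int = 5):
    """Return indices of examples whose neighbors mostly have a different label."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(embeddings)
    _, idx = nn.kneighbors(embeddings)           # idx[:, 0] is the point itself
    neighbor_labels = labels[idx[:, 1:]]         # drop the self-match
    disagreement = (neighbor_labels != labels[:, None]).mean(axis=1)
    return np.where(disagreement > 0.5)[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(200, 16))
    lab = (emb[:, 0] > 0).astype(int)
    lab[:5] = 1 - lab[:5]                        # inject a few label flips
    print(flag_suspect_labels(emb, lab))
```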
Version data, track models, and train on custom clusters without worrying about setting up infrastructure or cloud compute. Save on cloud costs by paying only for what you need. Run hyperparameter sweeps on multiple nodes with the click of a button.
Deploifai is a software platform that manages all your machine learning infrastructure and deployment so that you can focus on building your AI models. Train and deploy machine learning models with ease, all the way to production.
The platform enables you to train and run AI models on GPU/CPU Spot instances in the AWS cloud, cutting training costs by up to 70% and reducing setup and training time. It supports job scheduling, visualization and history, and TensorBoard integration.
A SaaS platform for monitoring machine learning model performance that proactively alerts you to biases, concept drift, and data integrity issues early so you can resolve them and improve model accuracy and reliability. With real-time insights, Mona provides an ongoing, granular understanding of the data, helping you address fairness concerns and other anomalies before they negatively impact the business.
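
Mona's implementation isn't public in this description; as a hedged illustration of what a basic drift check can look like, the sketch below compares a production feature sample against a training-time baseline with a two-sample Kolmogorov-Smirnov test from scipy. The threshold and the synthetic feature values are assumptions.

```python
# Generic drift-check sketch (not Mona's internals): alert when a live feature
# distribution differs significantly from its training-time baseline.
import numpy as np
from scipy.stats import ks_2samp


def drift_alert(baseline: np.ndarray, live: np.ndarray, p_threshold: float = 0.01) -> bool:
    """Return True when the live distribution differs significantly from baseline."""
    statistic, p_value = ks_2samp(baseline, live)
    return p_value < p_threshold


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    baseline = rng.normal(0.0, 1.0, size=5_000)   # feature values at training time
    live = rng.normal(0.4, 1.0, size=5_000)       # shifted values in production
    print("drift detected:", drift_alert(baseline, live))
```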