Get Started with HumeAI's EVI Next.js Starter Kit
Education
Introduction
Hey everyone, welcome to Nerding IO! I'm JD, and today we're going to be talking about Hume AI. Hume AI is interesting because it has three distinct parts: an Empathic Voice Interface (EVI), an Expression Measurement API, and the ability to create custom models. We'll use this article as an introduction and then dive into more in-depth features in future articles.
Exploring the Dashboard
Once you log into Hume, the dashboard offers various features and demos, including the iOS app, the Empathic Voice Interface demo, and an interactive podcast experience known as Chatter. The cornerstones of Hume's offering are:
- Empathic Voice Interface (EVI): Lets you run voice-based interaction demos.
- Expression Measurement: Identifies expressions from your webcam or voice.
- Custom Models: Lets you create models tailored to your specific needs.
Each of these features has its own playground, and each can be accessed programmatically through Hume's API.
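To make the API side concrete, here is a minimal sketch of kicking off a batch Expression Measurement job over Hume's REST API from TypeScript. The endpoint path, header name, and request fields reflect my reading of Hume's public docs and should be treated as assumptions; check the current API reference before relying on them. The HUME_API_KEY environment variable and the sample media URL are placeholders.

```typescript
// Minimal sketch: submit a batch Expression Measurement job.
// The endpoint, header, and payload shape are assumptions based on Hume's
// public docs; verify them against the current API reference.
const HUME_API_KEY = process.env.HUME_API_KEY ?? ""; // key from the Hume dashboard

async function startExpressionJob(mediaUrl: string): Promise<string> {
  const response = await fetch("https://api.hume.ai/v0/batch/jobs", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Hume-Api-Key": HUME_API_KEY, // API key header used by Hume's REST API
    },
    body: JSON.stringify({
      models: { face: {}, prosody: {} }, // which expression models to run
      urls: [mediaUrl],                  // publicly reachable media to analyze
    }),
  });

  if (!response.ok) {
    throw new Error(`Job submission failed with status ${response.status}`);
  }

  const { job_id } = (await response.json()) as { job_id: string };
  return job_id; // poll the job's predictions endpoint for results
}

// Hypothetical usage with a placeholder media URL (Node 18+ for global fetch).
startExpressionJob("https://example.com/sample-video.mp4")
  .then((id) => console.log("Started job", id))
  .catch(console.error);
```

The playgrounds in the dashboard let you try the same models interactively before wiring up any code like this.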
Getting Started with the Next.js Starter Kit
Hume AI offers a Next.js starter kit to get you up and running quickly. It's well documented and includes all the necessary components and demos. Here's how to get started:
- Clone the Next.js Starter Repository: