The World's First AI With Emotional Intelligence 🧠 + Amazon's Largest Venture Investment Yet
Welcome to another edition of Horizon AI!
AI startup Hume AI has announced the launch of EVI, which they claim is the world's first emotionally intelligent AI. Join us to see what this new ‘empathetic’ AI is all about.
Let's get going!
Read Time: 3.5 min
Here's what's new today in Horizon AI:
A Chart to Kickstart Your Day 📈: The rise of the .ai domain
Hume Introduces The First Conversational AI With Emotional Intelligence 🧠
Amazon Makes Its Largest Venture Investment Yet for AI Start-Up Anthropic
AI Tutorial: How to place and style a product against a background in Midjourney
AI Tools to check out
The Latest in AI and Tech 💡
A chart to kickstart your day
The rise of the .ai domain
Source: Google Trends
The AI fever has turned Anguilla's “.ai” domain into a digital gold mine for the tiny British island territory in the Caribbean.
Google searches for “ai domain” overtook “.com domain” for the first time ever in June last year.
AI News
HUME AI
Hume Introduces The First Conversational AI With Emotional Intelligence 🧠
Source: Hume AI
AI startup Hume has unveiled the Empathic Voice Interface (EVI) - a conversational AI assistant that can understand and respond to the emotional tone and context of voice interactions in remarkably human-like ways.
Details:
EVI was developed by combining standard LLM capabilities with expression-measurement techniques. The company calls this architecture an 'empathic large language model', or eLLM.
It was trained on millions of human interactions. It can understand when the user has finished speaking and generate an appropriate vocal response almost instantaneously.
It can also detect the emotional tone of a user's voice to add depth to each interaction and tailor its responses accordingly.
EVI includes fast, reliable transcription and text-to-speech and can hook into any LLM, which opens up a lot of possibilities.
Chatting with Hume AI's EVI feels remarkably close to talking with a real human being. Test it out yourself here: demo.hume.ai.
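To make the eLLM idea concrete, here is a minimal sketch of such a voice turn: transcription, emotion detection, and an emotion-conditioned reply. Every function below is a hypothetical stand-in of my own, not Hume's actual API, and the toy keyword classifier only stands in for real expression measurement.

```python
# Hypothetical sketch of an eLLM-style voice turn. All functions are
# illustrative stand-ins, not Hume's actual API.

def transcribe(audio: bytes) -> str:
    # Stand-in for a speech-to-text call; here the "audio" is just
    # UTF-8 text pretending to be a recording.
    return audio.decode("utf-8")

def detect_emotion(text: str) -> str:
    # Toy keyword classifier standing in for expression measurement.
    lowered = text.lower()
    if any(w in lowered for w in ("great", "love", "thanks")):
        return "joy"
    if any(w in lowered for w in ("broken", "angry", "hate")):
        return "frustration"
    return "neutral"

def generate_reply(text: str, emotion: str) -> str:
    # Stand-in for an LLM call whose prompt is conditioned on the
    # detected emotional tone of the speaker.
    openers = {
        "joy": "Glad to hear it!",
        "frustration": "Sorry about that, let's fix it.",
        "neutral": "Sure.",
    }
    return f"{openers[emotion]} You said: {text}"

def voice_turn(audio: bytes) -> str:
    # One full turn: hear the user, read the tone, tailor the response.
    text = transcribe(audio)
    emotion = detect_emotion(text)
    return generate_reply(text, emotion)

print(voice_turn(b"My headset is broken"))
```

The point of the sketch is the middle step: conditioning the language model's response on a separate emotion signal is what distinguishes this design from a plain transcription-to-LLM loop.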
TOGETHER WITH BRIGHT DATA
Data Power-Up with Bright Data
Bright Data elevates businesses by collecting web data and turning it into actionable insights. Our global proxy network and web-unblocking tools enable businesses to build datasets in real time and at scale, providing a competitive edge in e-commerce, travel, finance, and beyond.
Tap into the value of clean and structured data for market research, ML/AI development, and strategic decision-making. With our scalable solutions, you can efficiently enhance your data strategy, ensuring you're always one step ahead. Start with our free trial and see the difference yourself.
AMAZON
Amazon Makes Its Largest Venture Investment Yet for AI Start-Up Anthropic
Source: Amazon
Amazon has announced an additional $2.75 billion investment in AI startup Anthropic - bringing its total planned stake in the company to $4 billion.
Details:
This marks Amazon's largest venture investment in its three-decade history.
Amazon will maintain a minority stake and no board seat in the company, which is now valued at $18.4 billion after the investment.
As part of the deal, Anthropic will use AWS as its primary cloud provider going forward and will develop and deploy its future models on AWS Trainium and Inferentia chips.
News of the Amazon investment comes weeks after Anthropic debuted Claude 3, its newest suite of AI models, which have impressed with their performance and firmly positioned the company as one of ChatGPT’s main rivals.
AI Tutorial
How to place and style a product against a background in Midjourney
Go to Midjourney.
Use the "/describe" feature and upload your product image, then do the same for your desired background image (in this case, a bubble background).
Copy the image description prompt for your product, then repeat the process for your background image.
Copy the image URL of your product and the URL of your background, and append them after "--cref" at the end of your prompt.
Use the following prompt formula: (product image description prompt) + (background image description prompt) + --cref (product image link) (background image link)
Example:
“A closeup of delicate soap bubbles floating in the air, with a soft pastel background that adds to their ethereal beauty. The focus is on capturing the intricate details and reflections within each bubble, creating an enchanting atmosphere in the style of the artist. A red blank can of soft drink isolated on a white background, this rendered illustration with copy space for your design. A red mock up can of soda or energy drink. No text. A mockup template for product presentation. A vector stock photo in the style of no particular artist. --cref s.mj.run/oSPhuO8g0iQ s.mj.run/N4XwfMEdM0o”
Source: @Salmaaboukarr on X
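If you reuse this formula often, it can be assembled programmatically. Below is a tiny helper of my own devising (not part of the tutorial) that joins the description prompts and appends the image links after the --cref flag, mirroring the example above:

```python
# Assemble a Midjourney prompt from the tutorial's formula:
# description prompts first, then --cref followed by the image links.
# The helper name and structure are illustrative only.

def build_prompt(descriptions: list[str], image_urls: list[str]) -> str:
    # Join all description prompts, then append --cref with the links.
    return " ".join(descriptions) + " --cref " + " ".join(image_urls)

print(build_prompt(
    ["A closeup of delicate soap bubbles floating in the air.",
     "A red blank can of soft drink isolated on a white background."],
    ["s.mj.run/oSPhuO8g0iQ", "s.mj.run/N4XwfMEdM0o"],
))
```

The URLs shown are the ones from the tutorial's example; substitute your own uploaded-image links.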
AI Tools to check out
✨ Arcads: Create winning ads with AI Actors.
🚀 Mentor: AI-powered goal management coach.
🤖 DryMerge: Automate your work with plain English.
✅ Hoory: Customer support automation platform driven by conversational AI that helps businesses streamline their support operations.
🎆 Creatie: Turn ideas into stunning designs in seconds.
The latest in AI and Tech
Source: @beckylitv on X
A deepfake demo created using "arcads.ai" has gone viral on X, featuring a woman apparently speaking and gesturing to the camera. In reality, her lips and voice were artificially synchronized with a pre-written script using AI. This lets users make the woman in the video appear to say whatever they want, with incredibly realistic results. The outcome was so convincing that it sparked debate over whether it was truly AI-generated.
OpenAI is testing a new program that will allow developers in the U.S. to earn money based on the usage of their GPT models. For this initial test phase, OpenAI has partnered with a small group of developers to gather feedback and insights on the most effective approach for implementing this usage-based compensation model.
The United States government is now mandating that all federal agencies name a Chief AI Officer. This senior leadership role will be responsible for overseeing the agency's use of AI systems to ensure safe implementation within the public service.
According to a report from The New York Times, Meta is set to infuse its Ray-Ban smart glasses with AI capabilities starting next month. The AI integration will enable multimodal features, including translation, object recognition, animal identification, and monument detection.
HeyGen, a leading provider of "digital avatars", has launched its groundbreaking "Avatar in Motion 1.0" feature. This feature enables seamless lip-syncing of people speaking many languages while they move and gesticulate with their hands.
That’s a wrap!
We'd love to hear your thoughts on today's email! Your feedback helps us improve our content.
Not subscribed yet? Sign up here and send it to a colleague or friend!
See you in our next edition!
Gina 👩🏻‍💻