Amazon adds emotions to Alexa; now it can express excitement or disappointment like a human!

Amazon has added emotions to Alexa so that it can sound happy and excited, or sad and disappointed, when the situation calls for it. In other words, it can sound less robotic and more like a natural human voice. For now, Amazon has given Alexa two emotions: disappointed and excited. For example, Alexa can announce that your favorite football or cricket team has lost the match in a sad, disappointed tone, and it can show excitement when delivering a congratulatory message.

For now, Amazon has released this change only in the United States. The new feature is powered by Amazon's Neural Text-to-Speech technology. According to Amazon, "overall satisfaction with the voice experience increased by 30 percent when Alexa responded with emotions." Alexa offers three intensity levels for the happy and disappointed tones, and the developers who tested the capability clearly preferred the more human-like voice over the old monotonous one.
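The article stays at the feature level, but for skill developers these emotions surface through SSML markup, with an intensity attribute that maps to the three levels mentioned above. The following is a minimal, hypothetical sketch of an Alexa skill response: the helper name, teams, and phrasing are invented for illustration, and it assumes the documented `amazon:emotion` SSML tag with `excited`/`disappointed` names and `low`/`medium`/`high` intensities.

```python
# Minimal sketch of an Alexa skill response whose speech carries an emotion.
# The wording is illustrative; the tag follows Alexa's SSML emotion markup.

def build_emotional_response(team: str, won: bool) -> dict:
    """Return an Alexa response object that announces a result with emotion."""
    if won:
        ssml = (
            "<speak>"
            '<amazon:emotion name="excited" intensity="high">'
            f"Great news! {team} won the match!"
            "</amazon:emotion>"
            "</speak>"
        )
    else:
        ssml = (
            "<speak>"
            '<amazon:emotion name="disappointed" intensity="medium">'
            f"Unfortunately, {team} lost the match."
            "</amazon:emotion>"
            "</speak>"
        )
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "SSML", "ssml": ssml},
            "shouldEndSession": True,
        },
    }

if __name__ == "__main__":
    import json
    print(json.dumps(build_emotional_response("your cricket team", won=False), indent=2))
```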

Simply put, Alexa's voice used to be a flat, robotic monotone, but researchers say it now responds with convincingly human-like emotion. Alexa's emotions are driven by artificial intelligence (AI). Amazon has also given Alexa different speaking styles.

With these styles, Alexa can deliver the news like a real anchor and present music like a real DJ or VJ, adapting its delivery to the content. The newsreader style is also available for news delivered in Australia.
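Like the emotions, these speaking styles are exposed to skill developers through SSML, this time via a domain tag. The snippet below is a small illustrative sketch assuming the documented `amazon:domain` tag with the `news` and `music` style names; the helper name and sample headline are invented.

```python
# Illustrative sketch: wrapping text in Alexa's SSML domain tag so it is
# read in the newscaster ("news") or music-presenter ("music") style.
# The helper name and sample text are hypothetical.

def styled_speech(text: str, style: str = "news") -> str:
    """Wrap text so Alexa reads it like a news anchor or a DJ."""
    return (
        "<speak>"
        f'<amazon:domain name="{style}">{text}</amazon:domain>'
        "</speak>"
    )

# Example: a headline read in the newscaster style.
print(styled_speech("Amazon has added emotions and new speaking styles to Alexa.", "news"))
```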

Users who admire every development and innovation in artificial intelligence will find this attractive. But those who believe AI is taking over from humans, or who are frustrated by everything becoming machine-driven, will likely find it unsettling and irritating.

Amazon released the new voice and speaking-style features on Tuesday, and the update to its market-leading voice assistant is currently available only in the United States. Amazon has also been upgrading Alexa's listening ability with a form of emotional intelligence that identifies the user's tone.

That capability, added to Alexa last year, detects when a user sounds frustrated by a botched response to a music-related command such as "Alexa, stop the music." It relies on the speaker's tone and word choice to work out what the user actually wants Alexa to do.

According to Amazon, Alexa's news style sounds 31% more natural than its standard voice, and the music style sounds 84% more natural. Satisfaction with Alexa's emotional voice responses was also about 30% higher than with the standard tone.

Survey data on voice bots also suggests that human voices are preferred over synthetic ones: only 13.4% of people remembered a synthetic voice from a website, while 32.5% remembered a human voice from a lengthy audio clip. Emotional delivery is likely to matter most for games and sports, where Alexa can announce a winner excitedly or a loser with disappointment.

Mickey Sampson

Mickey Sampson is a PHP developer skilled in coding, analytical problem-solving, and database design, with a deep understanding of Core PHP, Magento, WordPress, and other current technologies. He keeps himself busy either building new, one-of-a-kind apps or playing with his pets: an animal lover, he has seven different animals living with him.
