Voice assistants from Google and Apple respond to users' spoken commands, but they cannot tell how you feel because they have no ability to understand emotions. That may soon change: Apple's assistant, Siri, may gain the ability to read your facial expressions and infer your feelings.
Apple is working on making Siri even better. Artificial intelligence already makes the assistant easy for anyone to use; now the company wants to add users' emotions to the mix so that users get a better experience. To that end, Apple has filed a new patent. The new version of the AI voice assistant Siri would be able to recognize users' emotions by reading their facial expressions.
As soon as a user gives Siri a command, the assistant would try to interpret their facial gestures and assist them more completely. The company believes that facial expression analysis will help it understand the voice commands a user gives. At present, Siri cannot grasp the emotions hidden behind a user's commands.
The company has said that intelligent software often fails to understand what the user intends. The new Siri would add another verification layer to make the user's request easier to interpret. For example, if Siri would struggle to respond based on a voice command alone, it would also read the user's face and try to understand their expression. According to the patent filed by the company, Siri would interpret the user's state through the microphone and audio input as well as the camera and captured images. For this, the company would feed sample data for certain situations into the Facial Action Coding System.
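The patent's idea of facial expression as a secondary verification layer can be sketched in code. The following is a minimal, purely illustrative sketch, not Apple's implementation: the action-unit patterns, emotion labels, confidence threshold, and the `resolve_intent` logic are all assumptions. It uses the Facial Action Coding System's convention of numbered action units (AUs) and only consults the face when voice-only confidence is low.

```python
from dataclasses import dataclass

# FACS-style action-unit patterns mapped to coarse emotions.
# The AU numbers follow the standard FACS scheme (e.g., 6 = cheek raiser,
# 12 = lip corner puller), but this tiny mapping is an illustrative assumption.
AU_TO_EMOTION = {
    frozenset({6, 12}): "happy",
    frozenset({1, 4, 15}): "sad",
    frozenset({4, 5, 7, 23}): "angry",
}

def classify_emotion(active_aus: set) -> str:
    """Pick the emotion whose AU pattern best overlaps the detected action units."""
    best, best_overlap = "neutral", 0
    for pattern, emotion in AU_TO_EMOTION.items():
        overlap = len(pattern & active_aus)
        # Require nearly the full pattern to be present before accepting it.
        if overlap > best_overlap and overlap >= len(pattern) - 1:
            best, best_overlap = emotion, overlap
    return best

@dataclass
class VoiceResult:
    intent: str
    confidence: float  # 0.0-1.0 score from the speech recognizer

def resolve_intent(voice: VoiceResult, active_aus: set,
                   threshold: float = 0.8) -> str:
    """Use the facial channel as a second verification layer only when the
    voice-only confidence falls below the threshold."""
    if voice.confidence >= threshold:
        return voice.intent
    emotion = classify_emotion(active_aus)
    # Hypothetical disambiguation: an angry face alongside an ambiguous
    # "play that again" might be treated as sarcasm and rerouted.
    if emotion == "angry" and voice.intent == "repeat_song":
        return "skip_song"
    return voice.intent
```

The key design point the patent describes is the gating: the camera-based signal is consulted only when voice alone is ambiguous, rather than on every command.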