
INTERIM REPORT #1


As we are seeing, emotional AI in healthcare has the potential to help doctors diagnose illnesses and interact with patients, from taking a patient's temperature to recognizing symptoms. One of the areas emotional AI is most likely to disrupt is mental health care. Emotional AI can not only recognize a patient's emotions through facial expressions; developments in voice analytics also aim to infer a patient's emotional state from the way they speak. Current technologies already track our physical health, including blood pressure, heart rate, and workouts. If physical health is already tracked, mental health tracking will likely follow. Physical health data already benefits health professionals, and mental health tracking could do the same once it reaches a comparable level of accuracy.
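To make the voice-analytics idea more concrete, below is a minimal, hypothetical Python sketch of the general approach: simple acoustic features are extracted from a speech signal and passed to an ordinary classifier that outputs an emotion label. Everything here is an assumption for illustration, including the synthetic "recordings" and the made-up "calm"/"stressed" label set; real systems use far richer features and models.

```python
# Hypothetical sketch of voice-based emotion analysis: extract simple
# acoustic features (frame energy and zero-crossing rate) and feed them
# to a standard classifier. All data and labels here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["calm", "stressed"]  # illustrative label set, not a clinical taxonomy

def acoustic_features(signal: np.ndarray, frame_len: int = 400) -> np.ndarray:
    """Return mean frame energy and zero-crossing rate for one utterance."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = np.mean(frames ** 2)
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0)
    return np.array([energy, zcr])

# Synthetic "recordings": louder, noisier signals stand in for stressed speech.
rng = np.random.default_rng(0)
X, y = [], []
for label, (amp, noise) in enumerate([(0.3, 0.05), (0.9, 0.3)]):
    for _ in range(50):
        t = np.linspace(0, 1, 16000)
        signal = amp * np.sin(2 * np.pi * rng.uniform(100, 300) * t)
        signal += noise * rng.standard_normal(t.size)
        X.append(acoustic_features(signal))
        y.append(label)

clf = LogisticRegression().fit(np.array(X), np.array(y))

# Classify a new synthetic utterance.
test = 0.8 * np.sin(2 * np.pi * 200 * np.linspace(0, 1, 16000))
test += 0.25 * rng.standard_normal(16000)
print("Predicted emotion:", EMOTIONS[clf.predict([acoustic_features(test)])[0]])
```

The point of the sketch is only the pipeline shape (speech signal, then features, then predicted emotion); accuracy in practice depends on the quality of the features, the training data, and how well the label set matches real emotional states.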


Emotional AI is also beginning to affect education. Current technology can track a student's emotions during tests and homework assignments. Rather than waiting until test time, a teacher can recognize early on whether students are struggling with the material. This could lead to the emergence of personalized educational programs: based on a student's reaction, the program can adjust the questions it presents to match the student's struggles or understanding. Changes in education could in turn affect society more broadly, since primary school is where children learn the basics of social behavior; they interact with different classmates each year, learn to recognize basic emotions, and take on responsibilities.
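As a rough illustration of that adaptive idea, here is a short, hypothetical Python sketch in which a lesson program raises or lowers question difficulty based on a detected emotion label. The emotion labels, difficulty levels, and adjustment rules are assumptions made for illustration; in a real system the labels would come from a facial-expression or voice model rather than a hard-coded list.

```python
# Hypothetical sketch of a personalized learning loop that adapts question
# difficulty to a student's detected emotional state. Labels and rules are
# illustrative assumptions, not a description of any real product.
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    difficulty: int  # 1 = easiest, 3 = hardest

QUESTION_BANK = {
    1: Question("What is 2 + 3?", 1),
    2: Question("What is 12 x 4?", 2),
    3: Question("Solve for x: 3x + 5 = 20", 3),
}

def next_difficulty(current: int, detected_emotion: str) -> int:
    """Step difficulty down when the student appears frustrated,
    up when they appear confident, otherwise keep it the same."""
    if detected_emotion == "frustrated":
        return max(1, current - 1)
    if detected_emotion == "confident":
        return min(3, current + 1)
    return current

# Example session: the emotion labels would normally come from an
# emotion-recognition model watching or listening to the student.
difficulty = 2
for detected in ["neutral", "frustrated", "confident", "confident"]:
    question = QUESTION_BANK[difficulty]
    print(f"[{detected:>10}] difficulty {difficulty}: {question.text}")
    difficulty = next_difficulty(difficulty, detected)
```

Even this toy loop shows why accuracy matters: if the emotion model mislabels a bored student as frustrated, the program would make the work easier at exactly the wrong moment.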


The emergence of emotional AI also creates legal risks in medicine. Diagnosing mental health conditions with AI has not been accurate, as current programs struggle to assess different emotions and have shown racial bias, and facial abnormalities can make it harder for these programs to recognize emotions at all. If a program's diagnosis is wrong, malpractice suits would increase, and it is unclear who would be held liable. Privacy concerns will grow as the data collected could be sold to third-party vendors by large corporations, and the invasiveness of tracking emotions will invite legal action against these companies. To read facial emotions, a camera must be able to record facial imagery at all times, and how that data is collected will worry the public.

Apple's increasing investment in AR/VR technologies and companies shows that it is planning ahead. The military already uses AR headsets for training purposes, but this technology has not yet been marketed to consumers. By combining emotional AI software with AR/VR technology, the rumors about Apple's smart glasses may prove true. A pair of Apple smart glasses could use Siri, and with emotional AI, Siri could recognize our emotions; the two technologies working together could give consumers a personalized experience. The obvious issue is privacy, but data collection is another. When we are online, our data is tracked as we move from page to page, and with devices like our phones and smartwatches we understand how our physical data is collected. But how would data from glasses be collected? What kind of information would they gather? Could they track what we see? That would depend on what smart glasses are used for; if they can track our emotions, they can presumably see our face.
