

Interim Report #2

 

Contradictions: When it comes to emotional artificial intelligence, there are a few opposing forces at play. One of the main contradictions we have found is the lack of human connection in emotional AI: are people going to be comfortable sharing their issues with artificial intelligence? While these systems may be able to diagnose and provide emotional support much as a real human can, there is still an emptiness to working with machines rather than people, and many may simply be uncomfortable turning to a machine for empathy. Another large contradiction we have come across is that, at the current moment, these systems do not always interpret human emotion correctly. In a psychiatric evaluation you want to be sure your emotions are not simply being forced into a checkbox to meet a machine’s expectations: your facial expressions may signal one emotion while your words say something different, and this can confuse these systems. In such instances it is not certain that the machine will diagnose you correctly based on your emotions, and a misdiagnosis can cause real damage.

Inflections: At the current moment we think we are seeing a major turning point stemming from Zuckerberg’s announcement of the Metaverse. In his talks following the announcement, he has discussed where he thinks the metaverse will head. As envisioned today, this virtual-reality world would give us avatars capable of displaying the emotions we are expressing in real time. The technology implemented in such a system would be facial tracking and emotion recognition, applying the tools of emotional AI to represent our emotions digitally through our avatars more easily than sending a simple emoji. With emotion-recognition software built into something as large as the Metaverse, we believe the software and data it generates will bring emotional AI into other realms of our lives sooner than we expected.
Currently, friends can already connect in video games like Minecraft, where they can bring their avatars into shared worlds and create anything they want, and television shows such as Alter Ego use motion-capture technology that lets contestants perform as an avatar. Meanwhile, with climate change accelerating and ice caps melting, there is a risk of new, unknown diseases being released. A platform like the metaverse, where we can be in a classroom, an office, or virtually any space, would allow people to meet as if they were in normal settings, so future pandemics would look different than they do today.
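To make the facial-tracking-to-avatar pipeline described above concrete, here is a minimal, purely illustrative Python sketch of how per-frame emotion-recognition scores could drive an avatar's displayed expression. The emotion labels, score values, smoothing window, and the `AvatarEmotion` class are our own assumptions for illustration; they are not Meta's actual system or API.

```python
# Hypothetical sketch: turning per-frame facial-expression scores into an
# avatar emotion state. Labels, scores, and smoothing window are illustrative
# assumptions, not any real Metaverse interface.
from collections import deque
from statistics import mean

EMOTIONS = ["neutral", "happy", "sad", "surprised", "angry"]

class AvatarEmotion:
    """Smooths noisy per-frame emotion scores before updating the avatar."""

    def __init__(self, window: int = 10):
        # Keep a short history per emotion so one odd frame doesn't flip the avatar.
        self.history = {e: deque(maxlen=window) for e in EMOTIONS}

    def update(self, frame_scores: dict[str, float]) -> str:
        # frame_scores would come from a face-tracking / emotion-recognition model.
        for emotion in EMOTIONS:
            self.history[emotion].append(frame_scores.get(emotion, 0.0))
        # The avatar displays whichever emotion has the highest smoothed score.
        return max(EMOTIONS, key=lambda e: mean(self.history[e]))

# Example: a few noisy frames where "happy" dominates overall.
avatar = AvatarEmotion(window=3)
for scores in [{"happy": 0.7, "neutral": 0.2},
               {"happy": 0.6, "surprised": 0.3},
               {"neutral": 0.8}]:
    print(avatar.update(scores))
```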


Practices:  While emotional artificial intelligence is not completely developed yet, we have certainly begun seeing it used in different aspects of life. One of the largest implementations is in the medical field, where emotional AI is used to analyze patient data and charts to help doctors make diagnoses. By helping with diagnoses and chart analysis, this technology frees up doctors’ time so they can spend more of it with their patients. Aside from patient analysis, emotional AI is being used in China to assist with hospice care: systems talk with older patients to keep them company, remind them to take their medication, and take important vitals. We also see emotional AI beginning to be used in education; as students we have used RPNow, which monitors students’ facial expressions to determine whether they are cheating. In the marketing and retail sectors, Walmart has filed a patent to analyze people’s biometric data as they wait in checkout lines. The biometrics collected would include heart rate and blood pressure, used to tell whether a customer is upset so that managers can direct team members to assist them.


Hacks: With any new technology there are always risks in how people will use it. With emotional AI we believe we are already seeing a hack emerge in China’s monitoring of its citizens through emotional artificial intelligence, something out of a horror film. The government has begun implementing plans to install surveillance cameras to monitor civilians, and through this installation it will use AI to calculate a person’s “social score,” which helps determine a citizen’s benefits or punishments. Another unintended use we may see is the manipulation of these technologies in the medical field. When diagnosis is broken down into boxes to be checked off, people may begin to put on a performance in order to be diagnosed with certain medical conditions and obtain drugs. This could feed an opioid crisis as well as many misdiagnoses.

Extremes: Facial recognition in surveillance systems already exists, and government surveillance systems are expanding to include emotional AI. China is testing a surveillance system that analyzes the facial emotions and even the skin pores of Uyghur Muslims, with cameras placed 3 meters (almost 10 feet) away from an individual. This gives the Chinese government the ability to track whether detained individuals are lying; it even goes as far as recognizing an individual’s ethnicity. According to the BBC, China is home to nearly 800 million surveillance cameras. If emotional AI is put into practice and nearly perfected, the government’s ability to watch its citizens becomes far more efficient, similar to the way North Korea’s regime shapes human behavior. As mentioned before, if people already know they are being watched, and a camera placed far from the face can accurately decipher their emotions, they will only change how they act.

Companies are also moving toward more surveillance, especially as they shift to self-checkout systems. In 2018, Walmart filed a patent for shopping carts with sensors that track consumer emotions while shopping, including heart rate, cart speed, and temperature. The stated purpose is to improve the customer’s shopping experience: if a customer is struggling, an employee can be sent to their location to help. This method of emotion tracking could also expand into tracking whether a consumer is stealing.
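To illustrate the kind of rule such a cart might apply when deciding to summon help, here is a small hypothetical Python sketch. The `CartReading` fields, thresholds, and `needs_assistance` logic are assumptions we made up for illustration; the patent itself does not specify this decision rule.

```python
# Hypothetical sketch: flag a shopper for assistance when cart biometric
# readings stay elevated. Field names and thresholds are illustrative
# assumptions, not taken from Walmart's patent.
from dataclasses import dataclass

@dataclass
class CartReading:
    heart_rate_bpm: float   # from a handle-mounted pulse sensor
    cart_speed_mps: float   # how fast the cart is moving
    temperature_c: float    # shopper's skin/ambient temperature

def needs_assistance(readings: list[CartReading]) -> bool:
    """Return True if recent readings suggest a frustrated or struggling shopper."""
    if not readings:
        return False
    recent = readings[-5:]  # look only at the last few samples
    avg_hr = sum(r.heart_rate_bpm for r in recent) / len(recent)
    avg_speed = sum(r.cart_speed_mps for r in recent) / len(recent)
    # Elevated heart rate while the cart is barely moving -> possibly stuck or upset.
    return avg_hr > 100 and avg_speed < 0.2

# Example: a shopper standing still in the checkout line with a rising pulse.
samples = [CartReading(105, 0.0, 36.9), CartReading(110, 0.1, 37.0)]
print(needs_assistance(samples))  # True -> notify a nearby employee
```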

Rarities: What are the chances of robots taking over? This is one of the most popular fears when it comes to robotics. Could a robot that functions and acts like a human wipe out the human race? One of the most impressive robots made, Sophia, showed what a human-like robot acting as we would can look like. She has been granted citizenship in Saudi Arabia, and now Sophia wants a family. Of course, she is not capable of reproduction, but the ability to form goals and wishes shows even more human characteristics. Could this lead to problems?

George Orwell published one of his most popular novels, “1984”, in 1949. Its main theme is a dystopian society where the government is always watching: those who speak against the government, write against it, or even think poorly of it are punished. Yes, there are instances today of governments hacking and spying on their citizens through multiple platforms, such as the NSA looking through your camera. But what are the chances of a society similar to 1984, where our thoughts are monitored?
