Artificial intelligence will interpret human emotions: researchers call for the regulation of these tools in schools and workplaces


According to a report in Nature, although the pandemic has forced people and authorities to change how they fight the coronavirus, some technology companies have tried to use it as an excuse to introduce “unverified” artificial intelligence (AI) tools into workplaces and schools.

Comment: We can no longer allow emotion-recognition technologies to go unregulated, argues @katecrawford. https://t.co/DuN7iTPE8R

— Nature Portfolio (@NaturePortfolio) April 6, 2021

Amid heated debate over the potential for abuse of these technologies, various emotion-reading tools are being sold for the remote monitoring of children and workers, with the aim of predicting their emotions and performance. Vendors say these tools can capture emotions in real time and help organizations and schools better understand their employees and students.

For example, one such tool, “Four Little Trees”, developed in Hong Kong, decodes facial expressions and classifies them as happiness, sadness, anger, disgust, surprise, or fear. Its developers claim it can assess children’s emotions during class.
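
To make concrete what such a tool claims to do, here is a minimal, purely illustrative sketch of this kind of pipeline: a cropped face image goes into a neural network that outputs a confidence score for each of the six categories, and the highest-scoring label is reported as the person’s “emotion”. This is not the Four Little Trees system, whose internals are not public; the architecture, names, and weights below are placeholder assumptions.

```python
# Illustrative sketch only: a toy emotion classifier over Ekman's six categories.
# The model is untrained (random weights), so its output is meaningless; it exists
# solely to show the general shape of such a pipeline, not any real product's design.
import torch
import torch.nn as nn

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "surprise", "fear"]

class ToyEmotionClassifier(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        # A tiny convolutional backbone stands in for whatever face-analysis
        # network a commercial product might actually use.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        x = self.features(face).flatten(1)
        return self.head(x)  # raw scores (logits) for the six categories

model = ToyEmotionClassifier().eval()
face_crop = torch.rand(1, 3, 64, 64)          # placeholder for a detected face crop
with torch.no_grad():
    probs = model(face_crop).softmax(dim=1)   # confidence over the six labels
print(EMOTIONS[int(probs.argmax(dim=1))], probs.squeeze().tolist())
```

The critique described below applies regardless of the model’s sophistication: a confident-looking score over six labels says nothing, by itself, about whether those labels reflect what a person actually feels.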

Kate Crawford, academic researcher and author of “The Atlas of AI”, wrote in Nature that such technologies must be regulated in order to develop better policy and win public trust. She draws an analogy with the lie detector, commonly known as the polygraph, which was invented in the 1920s: the FBI and the US military used the method for decades until it was finally banned.

Any method that uses AI for indiscriminate surveillance of the public must first be subject to reliable oversight. “It can also help set standards to counter excessive intervention by businesses and governments,” Crawford wrote. She also cited the work of psychologist Paul Ekman, whose claim that humans share six universal emotions was standardized into categories treated as suitable material for computer vision.

After the September 11, 2001 attacks, Ekman sold his system to US authorities to identify airline passengers showing signs of fear or stress, in order to investigate whether they were involved in terrorist acts. The system was severely criticized for racial bias and a lack of credibility.

Allowing these technologies without an independent review of their effectiveness is also unfair to job applicants, who could be judged unfavorably simply because their facial expressions do not match those of existing employees.

Likewise, students could be reported to their school because a machine judged them to be angry. Crawford called for legislation to protect people from the untested use of these tools.
