Microsoft to Stop Selling Emotion-Reading Technology and Restrict Access to Facial Recognition Tools

Microsoft has confirmed that it will withdraw software that evaluates a person’s emotional state from their image. The company will also restrict access to its facial recognition technology.

Following Google’s lead, Microsoft is ending the sale of emotion-reading technology and will no longer offer unrestricted access to its facial recognition technology. Existing customers have one year before they lose access to these capabilities in Azure Face, a suite of artificial intelligence (AI) tools that attempt to determine emotion, gender, age, smile, facial hair, hair, and makeup. Speaking about the development, Sarah Bird, Principal Product Manager, Microsoft Azure AI Group, said:

These efforts have raised important questions about privacy, the lack of consensus on the definition of “emotion”, and the inability to generalize the relationship between facial expression and emotional state across use cases, regions, and demographics.
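For context, the attributes in question are the ones returned by the Face API’s detection endpoint. What follows is a minimal sketch of the kind of call being retired, assuming the legacy azure-cognitiveservices-vision-face Python SDK; the endpoint, key, and image URL are hypothetical placeholders, not values from this article.

# A minimal sketch of an Azure Face detection call requesting the
# attributes Microsoft is retiring. Assumes the legacy
# azure-cognitiveservices-vision-face SDK; ENDPOINT, KEY, and
# IMAGE_URL are hypothetical placeholders.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-face-api-key>"
IMAGE_URL = "https://example.com/photo.jpg"

client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Request the attribute set named in the article: emotion, gender,
# age, smile, facial hair, hair, and makeup.
faces = client.face.detect_with_url(
    url=IMAGE_URL,
    return_face_attributes=[
        "emotion", "gender", "age", "smile",
        "facialHair", "hair", "makeup",
    ],
)

for face in faces:
    attrs = face.face_attributes
    # 'emotion' comes back as per-emotion confidence scores
    # (anger, happiness, neutral, and so on), not a single label.
    print(attrs.age, attrs.gender, attrs.emotion.as_dict())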

Microsoft is reportedly examining whether emotion recognition systems rest on sound science. It is not immediately clear what prompted the review, but it is possible that the company was unable to perfect algorithms that infer a person’s emotional state from an image. The move may also shore up the company’s position ahead of new rules and regulations on the use of such tools.

In addition to ending the sale of emotion-reading technology, Microsoft is ending unrestricted access to its facial recognition technology. Customers who use its facial recognition services must now obtain prior authorization, which presumably entails contractual obligations. However, it is not clear whether Microsoft is imposing additional restrictions or simply asking companies to sign a disclaimer relieving Microsoft of legal liability for any misuse.

For now, Microsoft has simply asked its customers to “avoid situations that violate privacy or where the technology could fail.” One obvious, legally dubious use case is identifying minors; notably, Microsoft does not specifically prohibit it.

Microsoft also places some restrictions on its Custom Neural Voice feature, which allows customers to create AI voices based on recordings of real people.
