
Are Siri and Alexa racist when it comes to identifying words?

A major study from Stanford University found that several of the world's leading voice recognition systems, those from Amazon, Apple, Google, IBM, and Microsoft, make far more errors with Black users than with white users.

According to the research, the racial bias appears to stem from disparities in the spoken-word data used to train the systems' artificial intelligence.

Averaged across the five companies tested, the systems misidentified words from white speakers about 19 percent of the time; for Black speakers, that figure jumped to 35 percent. Roughly 2 percent of audio from white speakers was deemed unreadable, compared with 20 percent for Black speakers.

As the study states, "there is concern that speech recognition systems suffer from racial bias, a problem that has recently come to light in several other advanced applications of machine learning, such as facial recognition, natural language processing, online advertising, and risk prediction in criminal justice, health care, and child services. Here, we evaluated racial disparities in five commercial speech-to-text tools, developed by Amazon, Apple, Google, IBM, and Microsoft, that power some of the most popular applications of voice recognition technology."

The only company to respond to the Stanford research was Google, which said that it has "been working on the challenge of accurately recognizing variations of speech for several years, and will continue to do so."

The research's conclusion is damning when it comes to discrimination against the Black population.

"The performance gaps we have documented suggest that it is considerably more difficult for African Americans to benefit from the growing use of voice recognition technology, from virtual assistants on mobile phones to hands-free computing for people with physical disabilities. These disparities can also actively harm African-American communities when, for example, employers use voice recognition software to automatically evaluate candidate interviews, or criminal justice agencies use it to automatically transcribe court proceedings," the study reports.
