
Society

AI systems claiming to ‘read’ emotions pose discrimination risks | Technology | The Guardian

analysis - 19 February 2020

read on the original site >>> (AI systems claiming to ‘read’ emotions pose discrimination risks | Technology | The Guardian)

Expert says the technology being deployed is based on outdated science and is therefore unreliable.

Artificial intelligence (AI) systems that companies claim can “read” facial expressions are based on outdated science and risk being unreliable and discriminatory, one of the world’s leading experts on the psychology of emotion has warned.
Lisa Feldman Barrett, professor of psychology at Northeastern University, said that such technologies appear to disregard a growing body of evidence undermining the (...)
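
For context on what such systems look like in practice, here is a minimal sketch of querying an emotion-labelling API of the kind the article criticises, using Amazon Rekognition (one of the services named in the article's keywords). The image file name, AWS region, and printed output format are illustrative assumptions, not details from the article.

# Minimal sketch (illustrative only): asking Amazon Rekognition's DetectFaces
# API for its emotion labels. The article's point is that the mapping from
# facial movements to categories such as HAPPY or ANGRY rests on contested
# science, whatever confidence score the model reports.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "face.jpg" is a hypothetical local image, not a file from the article.
with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests the full set of face attributes, including
# the Emotions list (each entry is a label plus a confidence percentage).
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        # The API returns e.g. {"Type": "CALM", "Confidence": 97.3}; a high
        # confidence measures model certainty, not scientific validity.
        print(f'{emotion["Type"]}: {emotion["Confidence"]:.1f}%')

The salient design point, and the one Barrett disputes, is that the API's output schema presupposes a fixed taxonomy of emotions that can be inferred from a face alone.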



Article keywords

algorithm - biometrics - emotions - Europe - facial - Malaysia - recognition - recruitment - United Kingdom - work - Amazon - GigEconomy - HireVue - Rekognition - theguardian.com - Unilever
