Society

Facial recognition software easily IDs white men, but error rates soar for black women

analysis - 13 February 2018

read on the original site >>> (theregister.co.uk)

Even with a representative data set to learn from, software gets worse the darker your skin
Commercial AI is great at recognising the gender of white men, but not so good at doing the same job for black women.
That's the conclusion of a new study, "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification", which compared gender classifiers developed by Microsoft, IBM and Chinese startup Face++ (also known as Megvii).
The study found that all three services (...)
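The study's core method is straightforward to reproduce in outline: instead of reporting a single aggregate accuracy figure, error rates are disaggregated by intersectional subgroup (skin type crossed with gender), which is what exposes the gap between white men and black women. The following Python sketch is illustrative only; the field names and toy data are hypothetical, and this is not the Gender Shades benchmark code.

from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of dicts with ground-truth keys 'gender' and
    'skin_type', plus the classifier's output 'predicted_gender'."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for r in records:
        # Group by the intersection of skin type and gender, not by
        # either attribute alone.
        key = (r["skin_type"], r["gender"])
        totals[key] += 1
        if r["predicted_gender"] != r["gender"]:
            errors[key] += 1
    # Per-subgroup error rate rather than one aggregate number.
    return {k: errors[k] / totals[k] for k in totals}

# Hypothetical audit data: a classifier that performs well on
# lighter-skinned male faces but poorly on darker-skinned female faces.
sample = [
    {"skin_type": "lighter", "gender": "male", "predicted_gender": "male"},
    {"skin_type": "lighter", "gender": "male", "predicted_gender": "male"},
    {"skin_type": "darker", "gender": "female", "predicted_gender": "male"},
    {"skin_type": "darker", "gender": "female", "predicted_gender": "female"},
]

for (skin, gender), rate in sorted(subgroup_error_rates(sample).items()):
    print(f"{skin} skin, {gender}: error rate {rate:.0%}")

Run on real benchmark data, a table of such per-subgroup rates is exactly the kind of evidence the study used to show that aggregate accuracy can hide large disparities.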



Article keywords

algorithm - bias - biometrics - facial - USA - Apple - discrimination - Face++ - Google - IBM - Microsoft - theregister.co.uk
