
Society

Study finds gender and skin-type bias in commercial artificial-intelligence systems

analysis - 12 February 2018

read on the original site >>> (mit.edu)

Examination of facial-analysis software shows an error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.
Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper that researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency.
In the researchers’ experiments, the three programs’ error rates in (...)
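The disparity the study reports comes from disaggregating a classifier's error rate by demographic subgroup rather than averaging over the whole test set. A minimal sketch of that evaluation, using an invented grouping and made-up sample data (this is not the study's code or dataset):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the classification error rate separately for each group.

    `records` is a list of (group, predicted, actual) tuples. The group
    labels below are illustrative placeholders, not the study's categories.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    # Per-group error rate: misclassified / total, one entry per group.
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical sample: one error in four for group A, none in two for group B.
sample = [
    ("group A", "male", "female"),   # misclassified
    ("group A", "female", "female"),
    ("group A", "female", "female"),
    ("group A", "female", "female"),
    ("group B", "male", "male"),
    ("group B", "male", "male"),
]
print(error_rates_by_group(sample))
```

An aggregate accuracy figure would hide exactly the gap this per-group breakdown exposes, which is why the paper reports error rates per skin-type and gender subgroup.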



Article keywords

algorithm - bias - biometrics - facial - USA - discrimination - mit.edu