If she's in the kitchen, she's a woman: how algorithms reinforce prejudice
A bald man of about sixty moves a few pieces of meat around a frying pan with a wooden spatula. He wears thick-rimmed glasses and jeans and stands in front of the stove in his small kitchen, decorated in light tones. Shown this image, the artificial intelligence has no doubts: thanks to its sophisticated training, it labels what it sees: kitchen, spatula, stove, woman. If you are in a kitchen, at the stove, you must be a woman. A team from the University of Virginia has just published a study that points once again to what many specialists have been denouncing: artificial intelligence not only fails to avoid the human error that stems from our prejudices, it can make discrimination worse, and it is reinforcing many stereotypes.
In their work, these scientists put two giant image banks under the magnifying glass, collections commonly used to train machines, and, above all, examined what the machines learned from them. To begin with, men appeared in only 33% of the photos of people cooking. After being trained on these data, the model revealed its weakness: it inferred that 84% of the people cooking were women. "It is known that technologies based on big data sometimes inadvertently worsen discrimination because of biases implicit in the data," the authors warn. "We show that starting from a gender-biased corpus," they add, predictive models "amplify the bias."
Machines become more sexist, racist or classist than we are because they identify the underlying trend and bet on it in order to get the answer right. The case of Tay is already well known: the intelligent bot that Microsoft designed to join the conversation on Twitter by learning from other users had to be withdrawn in less than 24 hours because it began to praise Nazism, harass other users and defend Trump's wall. By now the examples of algorithms that exacerbate prejudice or discrimination are innumerable, and they call into question the great promise of these systems: removing human error from the equation. Algorithms condemn us to repeat the past we wanted to escape, replicating the very prejudices that defined us.
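To see how that bet on the trend plays out, here is a minimal sketch in Python. It is not the Virginia team's experiment: the single "kitchen_score" feature, the 67/33 split and the noise scale are illustrative assumptions that merely echo the proportions the article reports. It only shows that an ordinary classifier trained on skewed data ends up predicting the majority label even more often than the data would justify.

```python
# A minimal sketch of bias amplification, NOT the Virginia team's experiment.
# The "kitchen_score" feature, the 67/33 split and the noise scale are
# illustrative assumptions echoing the proportions reported in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical training labels: 67% of cooking images show women (1), 33% men (0).
y = (rng.random(n) < 0.67).astype(int)

# One context feature: images of women cooking score slightly higher on
# "kitchen-ness", but the two groups overlap heavily.
kitchen_score = rng.normal(loc=y * 1.0, scale=1.0).reshape(-1, 1)

model = LogisticRegression().fit(kitchen_score, y)
predicted = model.predict(kitchen_score)

print(f"women in the training labels: {y.mean():.0%}")         # ~67%
print(f"women in the model's output:  {predicted.mean():.0%}")  # ~79%
```

Because the classifier leans on the context correlation plus the class prior, its output is more skewed than the data it learned from. The study measured the same effect at scale: the 33% of men present in the data shrank to 16% in the model's predictions.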
" Big data- based technologies sometimes worsen discrimination because of implicit biases in data," warn the authors
Google began tagging black people as gorillas, and Google Maps located the "black man's house" at the Obama-era White House. Photos of black Flickr users were classified as "chimpanzees." Apple's clever Siri, who has an answer for everything, does not know what to say when the owner of the phone tells her she has been raped. Nikon's software warns the photographer that someone has blinked when the subject has Asian features. HP's webcams were unable to identify and track darker-skinned faces, only white ones. The first beauty contest judged by a computer placed a single dark-skinned person among the 44 winners. In the US, Amazon leaves majority African-American (poorer) neighborhoods out of its best promotions. Facebook allows advertisers to exclude ethnic minorities from their target market, and, conversely, to include people who explicitly identify as anti-Semites, or young people its algorithms have flagged as vulnerable and depressed.
"Promising efficiency and impartiality, they distort higher education, increase debt, encourage massive incarceration, beat the poor at almost every juncture and undermine democracy," said Cathy O'Neil, data specialist and author of the revealing book Weapons of mathematical destruction ( Weapons of Math Destruction , Crown ), in which it crumbles all the algorithmic disasters from its formation like doctor in Mathematics in Harvard and its work experience as data scientistin the financial world. "Going to college, borrowing money, being sentenced to prison or finding and keeping a job. All these fields of life are increasingly controlled by secret models that provide arbitrary punishment," he warned.
Google began tagging black people as gorillas and Flickr classified them as chimpanzees
As O'Neil says, the biases of algorithms can be far more dangerous and far-reaching than human ones. ProPublica showed as much a few months ago when it discovered that a program used in the US justice system to forecast the recidivism of prisoners was notably racist. Black defendants were twice as likely to be mislabeled as probable repeat offenders (and treated more harshly by the penal system), while white defendants who did reoffend were labeled low-risk twice as often as black ones. Citizens, and of course convicts, are unaware that their future is being decided by a flawed computer program that is as racist as the most racist judge. Coldly, calmly and conscientiously racist.
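What ProPublica measured is, in auditing terms, a gap in error rates between groups. Here is a minimal sketch of that kind of tally; the ten records below are invented for illustration and are not COMPAS data:

```python
# A minimal sketch of the kind of audit ProPublica ran: compare error rates
# across groups. These records are invented for illustration, not COMPAS data.
from collections import defaultdict

# (group, predicted_high_risk, actually_reoffended)
records = [
    ("black", True,  False), ("black", True,  True),  ("black", True,  False),
    ("black", False, True),  ("white", False, True),  ("white", True,  True),
    ("white", False, False), ("white", False, True),  ("white", True,  False),
    ("black", True,  True),
]

stats = defaultdict(lambda: {"fp": 0, "neg": 0, "fn": 0, "pos": 0})
for group, predicted, reoffended in records:
    s = stats[group]
    if reoffended:
        s["pos"] += 1
        s["fn"] += not predicted   # labeled low risk, but did reoffend
    else:
        s["neg"] += 1
        s["fp"] += predicted       # labeled high risk, but did not reoffend

for group, s in sorted(stats.items()):
    print(f"{group}: false positive rate {s['fp'] / s['neg']:.0%}, "
          f"false negative rate {s['fn'] / s['pos']:.0%}")
# ProPublica's finding, stated in these terms: the false positive rate for
# black defendants was roughly double the white rate, while white defendants
# who reoffended were labeled low risk roughly twice as often.
```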
Research from Carnegie Mellon University found that women are less likely than men to be shown ads for highly paid jobs on Google. Programs used by some companies' hiring departments have shown a preference for names typically used by whites and reject those typical of blacks. Police forces in several cities use programs that predict where crime is most likely to occur; as a result, they patrol those areas more, stop more people there, and reinforce a negative cycle. And insurance is more expensive and more punitive for residents of poor black neighborhoods. "The result is that we criminalize poverty, believing that our tools are not only scientific but fair," this specialist summarizes.
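That patrol cycle can be reduced to a toy simulation; the two neighborhoods, the patrol rule and every number below are assumptions for illustration, not any vendor's actual algorithm:

```python
# A toy simulation of the predictive-policing feedback loop. Everything here
# is an illustrative assumption, not any vendor's actual algorithm.
import random

random.seed(1)

true_crime_rate = {"A": 0.10, "B": 0.10}  # both neighborhoods are identical
recorded = {"A": 12, "B": 10}             # A starts with slightly more records

for week in range(20):
    # Send all patrols wherever recorded crime is highest ...
    patrolled = max(recorded, key=recorded.get)
    # ... but only patrolled areas generate new records (100 encounters/week).
    recorded[patrolled] += sum(
        random.random() < true_crime_rate[patrolled] for _ in range(100)
    )

print(recorded)  # e.g. {'A': ~210, 'B': 10}: identical areas, divergent records
```

The deployments the model recommends manufacture the very statistics that later justify them, which is the negative cycle the paragraph above describes.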
"Promising efficiency and impartiality, they beat the poor at almost every juncture and undermine democracy," says Cathy O'Neil
As O'Neil points out in her book, in some cases the algorithm's problems stem from how the data were selected. In others, they stem from the underlying prejudice in society, which the software simply adopts in order to get its predictions right. But the biggest problem is the economic model: "When they are building statistical systems to find customers or manipulate desperate debtors, growing revenues seem to prove they are on the right track. The software is doing its job. The problem is that profits end up serving as a substitute for truth." O'Neil denounces this as a "dangerous confusion" that arises "again and again." Facebook lets its algorithm select and sell ads aimed at "people who hate Jews" because it gets rich doing so; if the ads are paid for, they cannot be wrong.
A regulator against opacity
These are problems discovered by journalists, researchers and institutions, who make them public and force the companies to correct them. But what about all the processes that are already automated and whose effects on us we cannot see? How will a woman know she was never shown a job ad? How can a poor community know that software has decided it should be more heavily policed? How does a defendant from an ethnic minority contest a risk score when he does not know that an algorithm has singled him out? Facebook and Google, for example, are perfectly aware of this problem and even explain how it happens, but they are absolutely opaque and do not allow anyone to monitor these biases effectively, O'Neil criticizes. Many more programs of this kind are being used in the American judicial system, but their biases are unknown, because each company guards its algorithms as a trade secret, like the Coca-Cola formula.
"The software is doing its job. The problem is that profits end up serving as a substitute for truth," criticizes the specialist
If the algorithm has become law, it must be transparent, accessible, debatable and amendable, like the law itself. That is what more and more specialists and organizations, such as the Algorithmic Justice League or AI Now, are demanding, insisting that the real problem with intelligent machines is their rampant social prejudice, not that they will bring about a Terminator-style apocalypse. And that, therefore, public regulators must be created to review these systems. It is a crisis that will only grow: a few days ago, a controversial algorithm that claimed to identify gay people by their faces caused a scandal; in the US, half the population already has its face registered in police facial-recognition databases. And the giants of the network know our sexual orientation even if we are not users of their services. "We cannot count on the free market to correct these mistakes," O'Neil concludes.