Machines Taught by Photos Learn a Sexist View of Women
|| Algorithmic Bias || Media Article || Short (5 min or less)
A discussion of how gender bias can be propagated through AI systems.
Please note that content may be behind a paywall, or have a limited number of free articles.
|| Diversity Equity and Inclusion || Media Article || Short (5 min or less)
The science in Damore’s memo is still very much in play, and his analysis of its implications is at best politically naive and at worst dangerous.
|| Health and Medical Applications, Diversity Equity and Inclusion || Media Article || Short (5 min or less)
An argument against human gene editing technology being used to edit out disabilities.
|| Algorithmic Bias || Media Article || Short (5 min or less)
When we feed machines data that reflects our prejudices, they mimic them – from antisemitic chatbots to racially biased software. Does a horrifying future await people forced to live at the mercy of algorithms?
|| Robotics Applications || Media Article || Short (5 min or less)
|| Diversity Equity and Inclusion || Report || Long (10+ min)
James Damore's Google Memo. The core arguments: Men and women have psychological differences that are a result of their underlying biology. Those differences make them differently suited to and interested in the work that is core to Google.
|| Diversity Equity and Inclusion || Media Article || Short (5 min or less)
|| Diversity Equity and Inclusion || Media Article || Medium (5-10 min)
The ride-sharing giant was the subject of a withering report, but its values and its hard-driving C.E.O. remain in place.
|| Internet/Privacy || Media Article || Medium (5-10 min)
Private groups - like the one used by students whose Harvard acceptances were recently rescinded - may offer a false sense of confidence.
|| Algorithmic Bias || Module || Short (5 min or less)
Teaching modules from Evan Peck, Associate Professor at Bucknell University. His lesson plan connects ethical values to code for CS 1 students and is centered around MIT's Moral Machine.
|| Misinformation || Report || Long (10+ min)
A report about media manipulation, including case studies and a syllabus.
|| Diversity Equity and Inclusion || Media Article || Short (5 min or less)
An analysis of premiums and payouts in California, Illinois, Texas and Missouri shows that some major insurers charge minority neighborhoods as much as 30 percent more than other areas with similar accident costs.
|| Diversity Equity and Inclusion || Media Article || Short (5 min or less)
Tech companies are spending hundreds of millions of dollars to improve conditions for female employees. Here’s why not much has changed—and what might actually work.
|| Misinformation || Blog Post || Short (5 min or less)
About a University of Washington research study looking at how people spread rumors online during crisis events, from natural disasters like earthquakes and hurricanes to man-made events such as mass shootings and terrorist attacks. The study primarily focused on Twitter.
|| Diversity Equity and Inclusion || Blog Post || Medium (5-10 min)
Susan Fowler's blog post about her time at Uber, outlining the sexism she experienced while working there.
|| Misinformation || Media Article || Medium (5-10 min)
At the Democratic National Committee's headquarters in Washington, a filing cabinet broken into during the 1972 Watergate burglary sits beside a computer server that Russian hackers breached during the 2016 presidential campaign.
|| Algorithmic Bias || Media Article || Short (5 min or less)
Algorithms are, in part, our opinions embedded in code. They reflect human biases and prejudices that lead to machine learning mistakes and misinterpretations.
|| Algorithmic Bias || Media Article || Short (5 min or less)
Computers are inheriting gender bias implanted in language data sets—and not everyone thinks we should correct it.
|| Algorithmic Bias || Video || Medium (5-10 min)
MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures.