Why did the A-level algorithm say no?
Accusations of unfairness over this year's A-level results in England have focused on an "algorithm" for deciding results of exams cancelled by the pandemic.
Pressure is mounting on ministers to let teacher-assessed grades stand in England to avoid a second wave of exams chaos hitting GCSE results this week.
As officials mull steps to tackle police brutality and racism, California’s Santa Cruz has become the first U.S. city to ban predictive policing, which digital rights experts said could spark similar moves across the country.
David Heinemeier Hansson, a Danish entrepreneur and developer, said in tweets last week that his wife, Jamie Hansson, was denied a credit line increase for the Apple Card, despite having a higher credit score than he does.
When the researchers then tested Google’s Perspective, an AI tool that the company lets anyone use to moderate online discussions, they found racial biases.
COMPAS is used nationwide to decide whether defendants awaiting trial are too dangerous to be released on bail. In May, the investigative news organization ProPublica claimed that COMPAS is biased against black defendants.
Computers are inheriting gender bias implanted in language data sets—and not everyone thinks we should correct it.
When we feed machines data that reflects our prejudices, they mimic them – from antisemitic chatbots to racially biased software. Does a horrifying future await people forced to live at the mercy of algorithms?
Studies have shown that people are being treated differently online based on their race, actual or perceived. Websites have been found to use demographic data to raise or lower prices, show different advertisements, or steer people to different content.
A discussion of how AI systems can propagate gender bias.