AI bias

Apple AI Transcribes “Racist” as “Trump”: Bug or Bias?

Apple has acknowledged and is addressing a flaw in the iPhone's Dictation feature in which the word "racist" was transcribed as "Trump." The company attributes the issue to difficulty distinguishing words containing the letter "r," a claim disputed by speech recognition expert Peter Bell, who suggests intentional software manipulation is a more likely cause. A fix is being deployed.


UK Benefits Fraud AI System Found to Be Biased

A UK government AI system used to detect welfare fraud exhibits bias based on age, disability, marital status, and nationality, according to an internal assessment. This "statistically significant outcome disparity" was revealed in documents obtained via the Freedom of Information Act, despite earlier government assurances of no discriminatory impact. While human oversight is in place, concerns persist over a "hurt first, fix later" approach and the absence of fairness analysis for other protected characteristics. The revelation fuels calls for greater transparency in government AI use, particularly given the numerous undisclosed applications across UK public authorities.
