Object detection systems (commercial and academic) are trained on biased data. This disproportionately affects accuracy in lower-income households and in continents like Africa and Asia. Work by my colleagues at FAIR using the Dollar Street dataset from GapMinder. https://t.co/gdm7Ar9f4l — Soumith Chintala (@soumithchintala) June 7, 2019
Was just rereading the @aylin_cim @j2bryson @random_walker paper on bias in word embeddings. They use "small baskets" of words (drawn from heavily cited psychology papers) to represent a concept, and compare the distance/similarity between different concepts. — Rachel Thomas (@math_rachel) May 31, 2019
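The basket-of-words comparison Thomas describes can be sketched in a few lines: represent each concept as the mean vector of its word basket, then compare baskets with cosine similarity. The vectors below are invented toy values for illustration only; the actual study worked with pretrained embeddings such as GloVe.

```python
import numpy as np

# Toy embedding table standing in for pretrained word vectors (e.g. GloVe).
# These 3-d vectors are made up purely for illustration.
vectors = {
    "flower":     np.array([0.9, 0.1, 0.0]),
    "rose":       np.array([0.8, 0.2, 0.1]),
    "insect":     np.array([0.1, 0.9, 0.0]),
    "ant":        np.array([0.2, 0.8, 0.1]),
    "pleasant":   np.array([0.85, 0.15, 0.05]),
}

def basket_vector(words):
    """Represent a concept as the mean vector of its word basket."""
    return np.mean([vectors[w] for w in words], axis=0)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

flowers = basket_vector(["flower", "rose"])
insects = basket_vector(["insect", "ant"])

# Compare each concept basket to an attribute word to probe for association.
sim_flowers = cosine(flowers, vectors["pleasant"])
sim_insects = cosine(insects, vectors["pleasant"])
print(sim_flowers > sim_insects)  # with these toy vectors, flowers sit closer to "pleasant"
```

Averaging a small basket rather than using a single word smooths out the idiosyncrasies of any one vector, which is why the paper's approach compares groups of words rather than individual pairs.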