by JuliaAngwin on 2019-03-19 (UTC).

It started with curiosity + expertise. In 2016 @suryamattu and I were curious about what Facebook knew about us. So he built a tool that let people see how FB had categorized them & noticed a racial category called “Ethnic Affinity.” /2 https://t.co/fbKrDw9xnN

— Julia Angwin (@JuliaAngwin) March 19, 2019
misc ethics
by JuliaAngwin on 2019-03-19 (UTC).

Next up: a tip. @rachelegoodman1 called me to tell me how FB’s racial categories could be used in illegal ways. So @terryparrisjr & I bought a housing ad & targeted it to be only shown to whites - and oops! - it was approved. /3 https://t.co/25gUKd7Bc6

— Julia Angwin (@JuliaAngwin) March 19, 2019
misc
by JuliaAngwin on 2019-03-19 (UTC).

Flash forward three years, and now Facebook is eliminating all sensitive ad targeting categories for housing, employment and credit. This is a huge change, and it will hopefully open up more opportunities for people who would otherwise have been denied them. /9

— Julia Angwin (@JuliaAngwin) March 19, 2019
ethics bias
