by Max_Fisher on 2019-06-03 (UTC).

YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.

YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation https://t.co/zNwsd9UsgN

— Max Fisher (@Max_Fisher) June 3, 2019
ethics
by Max_Fisher on 2019-06-03 (UTC).

Any user who watched one kiddie video would be directed by YouTube's algorithm to dozens more — each selected out of millions of otherwise-obscure home movies by an incredibly sophisticated piece of software that YouTube calls artificial intelligence. The families had no idea.

— Max Fisher (@Max_Fisher) June 3, 2019
ethics
