The hack of Equifax by the Chinese government wasn’t a malfunction of the system; it was a direct result of how the system was designed. https://t.co/5JL68Ch6zG pic.twitter.com/K2tno0s1N0
— Rachel Thomas (@math_rachel) February 11, 2020
When AI > AI+human, we face important ethical questions.
"In the reader study, the performance level of AI was 0.940, significantly higher than that of the radiologists without AI assistance (0.810). With the assistance of AI, radiologists' performance was improved to 0.881." https://t.co/q16jpyJizU
— Alexandre Cadrin-Chênevert (@alexandrecadrin) February 7, 2020
A great interview with Galit Shmueli @gshmueli on fairness and ethics in data science. https://t.co/yAIXVAPsy6
— Rob J Hyndman (@robjhyndman) January 29, 2020
My view: An ‘Epic’ pushback as U.S. prepares for new era of empowering patient health data. https://t.co/jftyuR3hfT @ONC_HealthIT @SecAzar #onc2020 @statnews #digitalhealth @hugohealth @YNHH @ePatientDave @aneeshchopra @BraveBosom @mandl @zakkohane @chrissyfarr @matthewherper pic.twitter.com/xZSGOmHStX
— Harlan Krumholz (@hmkyale) January 27, 2020
The surveillance state beyond just face recognition: “We need to have a serious conversation about all the technologies of identification, correlation and discrimination” by Bruce Schneier (@schneierblog) in @nytimes https://t.co/PMHxlORwfG
— Stanford NLP Group (@stanfordnlp) January 27, 2020
Reasons Why Online Advertising is Broken:
- illusion of consent
- intrusive tracking & profiling
- massive breach of security
- no transparency
- potential for discrimination
- broken by design & by default
- responsible for rise of clickbait
@ka_iwanska https://t.co/Q5Jm1Se58f
— Rachel Thomas (@math_rachel) January 25, 2020
4 Principles for Responsible Government Use of Technology
- Listen to local communities
- Beware how NDAs obscure public process & law
- Security is not the same as safety
- Policy decisions should not be outsourced as design decisions
https://t.co/lU6JsFxtAb #TechPolicyCADE
— Rachel Thomas (@math_rachel) January 21, 2020
“The Secretive Company That Might End Privacy as We Know It” https://t.co/poCCdNnft9
a) deliberately mass scraping, hoarding & using — here: even selling!! — someone’s data w/o consent should be illegal. Like stalking; meaning, just because you can do something doesn’t make it ok
1/2
— Sebastian Raschka (@rasbt) January 19, 2020
In the mid-1880s, a factory owner purchased pneumatic molding machines (for $500,000) that produced inferior castings at a higher cost than the earlier process, in order to displace the skilled workers (who were also union organizers). The machines were abandoned after 3 years. pic.twitter.com/T1xOaPNfZm
— Rachel Thomas (@math_rachel) January 19, 2020
"Sure, that might lead to a dystopian future or something, but you can’t ban it.” -- David Scalzo, head of private equity firm that invested in Clearview
— Rachel Thomas (@math_rachel) January 19, 2020
I don't follow this logic.https://t.co/FIxuVEgRry pic.twitter.com/yWaIzNxBVk
A Peter Thiel-funded startup is scraping every photo it can find online (3 billion so far) to build a powerful facial recognition app that it’s selling to law enforcement. https://t.co/tLs0eFisq5
— Andy Baio (@waxpancake) January 18, 2020
Clearview facial recognition app should be the target of a class-action copyright lawsuit https://t.co/SMk4SLzhYj
— Tim Wu (@superwuster) January 18, 2020