This sounds interesting: a Stanford course on "How to Learn Math", including debunking myths about math & learning https://t.co/Ec71r2PYm1
— Rachel Thomas (@math_rachel) October 8, 2018
As someone who hasn't played with lazy graph frameworks, I learned a lot from this discussion of the pros and cons https://t.co/f5DA1F4xiK
— Jeremy Howard (@jeremyphoward) October 6, 2018
Basically lazy pays graph construction + GPU cost (because you only start computing once you see all of it), while eager pays only the GPU cost (because GPU runs async while you queue the kernels).
— Adam Paszke (@apaszke) October 6, 2018
No, it does matter. CNNs have ops so costly that the eager overhead on CPU is entirely hidden behind GPU execution, so you don’t pay anything for it. But eager never postpones running the kernels, so it will have lower latency than lazy (and it’s still somewhat cheaper).
— Adam Paszke (@apaszke) October 6, 2018
Lazy is reasonable in C++, where application logic gets compiled and runs on the order of nanoseconds; it quickly breaks down in Python, where application logic is interpreted. A for-loop of N iterations queuing compute for a batch of size N will effectively introduce N * 1us of overhead per op.
— Soumith Chintala (@soumithchintala) October 6, 2018
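As a rough illustration of the thread above (this is not from the tweets themselves): a minimal sketch, assuming PyTorch and a CUDA GPU are available, that separates the time Python spends queuing kernels from the time the GPU spends running them.

```python
# Sketch: eager mode launches kernels asynchronously, so for a big op the
# Python-side cost hides behind GPU execution, while a loop of N tiny ops
# still pays roughly N times the per-op launch overhead in the interpreter.
import time
import torch

device = torch.device("cuda")
x = torch.randn(64, 1024, device=device)
w = torch.randn(1024, 1024, device=device)

def timed(fn):
    torch.cuda.synchronize()          # make sure the GPU queue is empty
    start = time.perf_counter()
    fn()
    queued = time.perf_counter()      # Python has finished queuing kernels
    torch.cuda.synchronize()          # wait for the GPU to actually finish
    done = time.perf_counter()
    return queued - start, done - start

# One large matmul: queuing returns almost immediately; the GPU does the work.
py_cost, total = timed(lambda: x @ w)
print(f"large op: queued in {py_cost * 1e6:.0f} us, finished in {total * 1e6:.0f} us")

# 1000 tiny ops in a Python loop: the interpreter pays the launch cost each time.
def many_small_ops():
    y = x
    for _ in range(1000):
        y = y + 1.0

py_cost, total = timed(many_small_ops)
print(f"1000 small ops: queued in {py_cost * 1e6:.0f} us, finished in {total * 1e6:.0f} us")
```

On typical hardware the large matmul returns to Python almost immediately while the GPU is still busy, whereas the loop of tiny ops spends most of its time in the interpreter and launch overhead, which is the per-op overhead Soumith describes.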
Although there's also just something extreme about Google's marketing in general https://t.co/b5kxp3lmEw pic.twitter.com/H99jPIHY85
— Rachel Thomas (@math_rachel) October 5, 2018
Every visualisation that requires navigation makes an implicit bet that the user will navigate to the view that gives optimal insight. As a dataviz creator you must be conscious of this bet.
— Thomas Lin Pedersen (@thomasp85) October 5, 2018
You may not be able to be a data scientist right away, but as an analyst you'll learn all the basic skills you need and be exposed to tech. And, re: dev skills. They're in demand always and forever, even if you don't end up doing data work.
— Vicki Boykis (@vboykis) October 5, 2018
To clarify: I'm excited about the direction tf2 is taking. They've made some tough but important decisions and I think tf2 is likely to be great. And I like Geron's book, and his summary here of tf2.
— Jeremy Howard (@jeremyphoward) October 5, 2018
I just think the "better than pytorch" claim is poorly made here. https://t.co/HDvcrUzg7q
It can be fun to compare the pros/cons of different programming languages, but ultimately you should use whatever tools you're comfortable with and whatever helps you get the job done. This will change over time.
— Rachel Thomas (@math_rachel) October 5, 2018
I agree with what @jeremyphoward wrote a year ago: pic.twitter.com/PsTduGqSYO
I think I would have liked TensorFlow better if Google had been more accurate in its marketing. The dissonance between how it was marketed and what it actually was contributed to my early frustrations.
— Rachel Thomas (@math_rachel) October 5, 2018
From blog posts I wrote in early 2017: pic.twitter.com/o47Hl8iG6q
As a data scientist, one of the worst things you can do is be technically correct while being simultaneously unhelpful. E.g. answering “Why do these numbers differ?” with “Cause they’re from different tables.”
— Angela Bassa (@AngeBassa) October 4, 2018
Like, you’re not wrong… but what can we do with this information?! pic.twitter.com/4xPh0bfzmT