Highly recommend this talk by @marksaroufim, "Dealing with Career Stagnation: my Machine Learning Story"
— Hamel Husain (@HamelHusain) November 14, 2022
This talk is amazingly relevant again today: https://t.co/H0VqhgZfim
What an excellent video on the Fast Fourier Transform by @veritasium! https://t.co/Gzhu8iOJ85
— Aurélien Geron (@aureliengeron) November 3, 2022
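The FFT's central trick is splitting an n-point DFT into two n/2-point DFTs. As a minimal sketch (my own, not from the video), a recursive radix-2 Cooley-Tukey FFT cross-checked against the naive O(n²) DFT:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # DFT of even-indexed samples
    odd = fft(x[1::2])    # DFT of odd-indexed samples
    out = [0] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def dft(x):
    """Naive O(n^2) DFT, used only to cross-check the FFT."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]
```

The recursion turns the O(n²) sum into O(n log n) work, which is the speedup the video is about.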
George Hotz live-coding stable diffusion in tinygrad https://t.co/RR8Ewrdo5f via @YouTube
— Eric Jang (@ericjang11) September 5, 2022
What is the one resource I would recommend for anyone getting into RecSys?
— Radek Osmulski 🇺🇦 (@radekosmulski) September 2, 2022
This lecture by @xamat.
• it covers several foundational methods
• more importantly, it will teach you how to think about RecSys problems
Here are a couple of highlights: https://t.co/hCUWhQvWPs
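One of the foundational methods RecSys lectures typically cover is matrix factorization for collaborative filtering. A minimal SGD sketch (illustrative only; the hyperparameters below are arbitrary choices, not from the lecture):

```python
import random

def train_mf(ratings, n_users, n_items, k=4, lr=0.1, reg=0.01,
             epochs=200, seed=0):
    """Toy matrix-factorization collaborative filtering via SGD.

    ratings: list of (user, item, rating) triples.
    Learns factors P, Q so that rating(u, i) is approximated by P[u] . Q[i].
    """
    rng = random.Random(seed)
    P = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # Gradient step on squared error with L2 regularization.
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q
```

Real systems add biases, implicit feedback, and far better optimization, but the dot-product-of-latent-factors idea is the same.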
The Man behind #StableDiffusion — An Interview with Emad Mostaque (@EMostaque), founder of Stability AI.
— hardmaru (@hardmaru) August 13, 2022
Really inspiring interview. https://t.co/Ham198RSm8
Can one deliver a RecSys masterclass with a focus on
— Radek Osmulski 🇺🇦 (@radekosmulski) August 12, 2022
• online vs batch predictions
• monitoring: data distribution shifts
• model deployment
in 25 minutes?
Apparently, @chipro can! A must-watch
Link to video: pic.twitter.com/gWWUI7WGbc
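On the monitoring point, one common heuristic for detecting data distribution shift is the Population Stability Index (PSI). A rough sketch of my own (the 0.1/0.25 thresholds are conventional rules of thumb, not from the talk):

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline sample and a live sample.

    Common rule of thumb: PSI < 0.1 ~ stable, PSI > 0.25 ~ significant shift.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0

    def frac(xs):
        counts = [0] * bins
        for x in xs:
            counts[min(int((x - lo) / width), bins - 1)] += 1
        # Floor at a tiny value to avoid log(0) for empty bins.
        return [max(c / len(xs), 1e-6) for c in counts]

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

In a monitoring job you would compute this per feature between the training baseline and a recent window of production inputs, and alert when it crosses a threshold.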
When it comes to tabular data
— Radek Osmulski 🇺🇦 (@radekosmulski) August 5, 2022
— it is equal parts art and science at the highest of levels
— who can run better experiments and better capture their outcomes, wins
A stellar talk by @Giba1!
Come for high-level insights, and stay for a great explanation of unique techniques! pic.twitter.com/YXC7zhtC1S
How to avoid leakage in preprocessing or splitting your data?
— Radek Osmulski 🇺🇦 (@radekosmulski) August 4, 2022
This talk is a superb resource on this topic
By the way, this highlights the value of @kaggle -- it is only in a competitive setting that such nuanced but important DS concepts come to life!https://t.co/ocvPV24LhJ
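A minimal illustration of the splitting pitfall (my own toy example, not from the talk): fitting a scaler on all the data before splitting lets the held-out rows leak into the training features, whereas the correct version fits on the training split only and applies the frozen statistics to the test split.

```python
def mean_std(xs):
    """Mean and (population) standard deviation of a list of numbers."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, var ** 0.5

def standardize(xs, m, s):
    return [(x - m) / s for x in xs]

data = list(range(100))  # toy 1-D feature, first 80 rows = train

# Leaky: statistics computed on ALL data before splitting,
# so the test rows influence the training features.
m_all, s_all = mean_std(data)
leaky_train = standardize(data[:80], m_all, s_all)

# Correct: fit the scaler on the training split only, then
# apply the frozen statistics to the held-out split.
m_tr, s_tr = mean_std(data[:80])
clean_train = standardize(data[:80], m_tr, s_tr)
clean_test = standardize(data[80:], m_tr, s_tr)
```

The leaky training set is no longer mean-zero, a small but telltale sign that information flowed across the split.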
Stanford CS25 - Transformers United
— Jean de Nyandwi (@Jeande_d) July 29, 2022
A new class on Transformers and their applications in NLP, vision, RL, biology, audio & speech.
9 lecture videos are available already!!
Youtube: https://t.co/UIpWSYjOox
Website: https://t.co/8DUtInnBaO pic.twitter.com/CE9ZARxcrX
A beautiful music transformer visualization of the final attention heads from @ashVaswani's talk on "Attention Is All You Need" at RAAIS 2019 https://t.co/XwSIKr4nm3
— Eric Jang 🇺🇸🇹🇼 (@ericjang11) December 26, 2021
The model learns to attend to periodic tokens when doing things like tremolos pic.twitter.com/znWCv5WWsU
Differential Inference: A Criminally Underused Tool (https://t.co/zSwhvk806r)
— Sasha Rush (@srush_nlp) November 30, 2021
An annotated talk about elementary probability (coins & dice) in PyTorch. Nothing new, I just think we should mostly do discrete inference with auto-diff.
Slides: https://t.co/tJHQlLwNrp pic.twitter.com/rq37IuEuh6
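The core identity behind doing discrete inference with auto-diff is that for Z(t) = Σ_x p(x)·exp(t·f(x)), the derivative d/dt log Z at t = 0 equals E[f(x)]. The talk uses PyTorch; here is a dependency-free sketch of the same idea with a hand-rolled forward-mode dual number (my own construction, not the talk's code):

```python
import math

class Dual:
    """Minimal forward-mode autodiff number: a value plus a derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule.
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def d_exp(d):
    e = math.exp(d.val)
    return Dual(e, e * d.dot)

def d_log(d):
    return Dual(math.log(d.val), d.dot / d.val)

def expected_sum_two_dice():
    """E[f] = d/dt log sum_x p(x) exp(t f(x)) at t = 0, with f = sum of two dice."""
    t = Dual(0.0, 1.0)  # seed the derivative with respect to t
    z = Dual(0.0)
    for a in range(1, 7):
        for b in range(1, 7):
            z = z + (1 / 36) * d_exp(t * (a + b))
    return d_log(z).dot
```

Differentiating the log-partition function recovers the expectation without writing any expectation-specific code, which is the "underused tool" in the title.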
I am excited to share my latest work: 8-bit optimizers — a replacement for regular optimizers. Faster, 75% less memory, same performance, no hyperparam tuning needed. 🧵/n
— Tim Dettmers (@Tim_Dettmers) October 8, 2021
Paper: https://t.co/V5tjOmaWvD
Library: https://t.co/JAvUk9hrmM
Video: https://t.co/TWCNpCtCap pic.twitter.com/qyItEHeB04
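The memory savings come from storing optimizer state in 8 bits with per-block scale factors instead of 32-bit floats. A much-simplified sketch of block-wise absmax quantization (illustrative only; the actual bitsandbytes scheme uses a more sophisticated quantization map than linear rounding):

```python
def quantize_blockwise(xs, block=64):
    """Quantize floats to signed 8-bit codes with one absmax scale per block."""
    codes, scales = [], []
    for start in range(0, len(xs), block):
        chunk = xs[start:start + block]
        scale = max(abs(x) for x in chunk) or 1.0  # guard all-zero blocks
        scales.append(scale)
        # Map [-scale, scale] linearly onto [-127, 127].
        codes.append([round(x / scale * 127) for x in chunk])
    return codes, scales

def dequantize_blockwise(codes, scales):
    """Invert the quantization: int8 codes + per-block scale -> floats."""
    out = []
    for chunk, scale in zip(codes, scales):
        out.extend(c / 127 * scale for c in chunk)
    return out
```

Per-block scales keep the quantization error proportional to the largest value in each small block, which is why outliers in one region of the tensor do not destroy precision everywhere else.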