Unsupervised Deep Video Denoising
— AK (@ak92501) December 1, 2020
pdf: https://t.co/9CIQ5khGW8
abs: https://t.co/j9jo2FRtGB
project page: https://t.co/AFDLROnnxz pic.twitter.com/MlPWWKJiSQ
Image Generators with Conditionally-Independent Pixel Synthesis
— AK (@ak92501) November 30, 2020
pdf: https://t.co/kh1z5nQgnH
abs: https://t.co/VGsKP6Dpwc pic.twitter.com/EItaH9oIK3
First walk-thru I've seen showing how to compile @PyTorch and XLA for GPU - very handy for local development and testing https://t.co/JmPuQgsYMJ
— Jeremy Howard (@jeremyphoward) November 29, 2020
I was made aware of two papers that are similar and preceded both OpenAI papers. I think these add more data points to scaling behavior for language (and also vision). These should be shared more widely! https://t.co/ZCXYCt3DgN https://t.co/QNn8KznjXe
— Tim Dettmers (@Tim_Dettmers) November 29, 2020
I succumbed to threats and wrote my 2019–20 faculty report. Hot papers last year: Electra: Pre-training text encoders as discriminators https://t.co/UXrVN5dDEx, Stanza: A Python toolkit for many languages https://t.co/ZEHuBQPQcp & Universal Dependencies v2 https://t.co/xqZSwCcpJZ pic.twitter.com/Y3K36qVaiv
— Christopher Manning (@chrmanning) November 29, 2020
This will go down in history as one of science and medical research's greatest achievements. Perhaps the most impressive.
— Eric Topol (@EricTopol) November 28, 2020
I put together a preliminary timeline of some key milestones to show how several years of work were compressed into months. pic.twitter.com/BPcaZwDFkl
I am curious why people are not talking more about the OpenAI scaling law papers. For me, they seem very significant. What I heard so far: "Too complicated. I don't understand and I don't care", "NLP is not physics". Other criticism? Any insights why people ignore it?
— Tim Dettmers (@Tim_Dettmers) November 28, 2020
Our final blog post for the new version of {{tune}} is at: https://t.co/7JmXZl9CCT
— Max Kuhn (@topepos) November 27, 2020
This one focuses on expanded options for parallel processing!
When should you use all available cores? What kind of speed-ups should you expect? #rstats pic.twitter.com/leGc4xJTNa
Custom Google Analytics Dashboards with R: Downloading Data
— RStudio (@rstudio) November 27, 2020
A step by step guide for connecting your RStudio project to Google Analytics' API.
This is part 1 of 3 articles on working with Google Analytics in RStudio. https://t.co/3T3q8WvjIf pic.twitter.com/2VZ7TJdx44
In my experience, you don't lose time doing reproducible science—you just *relocate* how you're spending it pic.twitter.com/RD2FZu5Bnw
— Dan Quintana (@dsquintana) November 26, 2020
Wow! An awesome Streamlit visualization of transformers (training and inference) benchmarked on V100 vs. A100
— Julien Chaumond (@julien_c) November 26, 2020
By @timothy_lkh_ cc @streamlit @PyTorchLightnin
This is really cool. https://t.co/Hpzhws4yZx pic.twitter.com/RlkfuAifI3
Not sure how, but I got all my urgent todos checked off early today, so I had some room for fun stuff this evening! Long story short: MLxtend 0.18.0 just went live!! The things we do for love!!!
— Sebastian Raschka (@rasbt) November 26, 2020
Happy Thanksgiving everyone!!!! https://t.co/aI3i8GRlG2 pic.twitter.com/G6IxENJsxE