ETSformer: Exponential Smoothing Transformers for Time-series Forecasting
— AK (@ak92501) February 4, 2022
abs: https://t.co/ZtpXPqhlhF pic.twitter.com/dSPGXgcAid
🎯 What bothers me so much about this rhetoric is that the purported benefits to society are amorphous yet assumed, whereas the tangible losses to actual people are clear and concrete but collateral.
— Ryan Calo (@rcalo) February 4, 2022
Pre-Trained Language Models for Interactive Decision-Making
— AK (@ak92501) February 4, 2022
abs: https://t.co/uECv8kutrE
project page: https://t.co/Bf3iqgfcA9 pic.twitter.com/OLSIiOxX2S
Improved Scalene Python profiler GUI now integrated into Jupyter Notebooks (`pip install scalene`; see also https://t.co/yx1cYSOPY6) pic.twitter.com/xdcI0AbmAr
— Emery Berger (@emeryberger) February 3, 2022
Unified Scaling Laws for Routed Language Models
— AK (@ak92501) February 3, 2022
abs: https://t.co/C4zMJcB2wg pic.twitter.com/LoKuIVW617
This! A common question people ask is whether they should work with .py vs .ipynb files. It doesn't have to be exclusive. E.g. want a notebook with plots but have a loss function that you keep reusing? Put it into a .py file (it doesn't have to be a package) and import it into your notebooks. https://t.co/Eib4iJUyFs
— Sebastian Raschka (@rasbt) February 2, 2022
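The workflow above can be sketched as follows — a minimal example assuming a hypothetical `losses.py` module (the module and function names are illustrative, not from the tweet):

```python
# losses.py — a reusable function kept outside the notebook (hypothetical module)
def mse(y_true, y_pred):
    """Mean squared error over two equal-length sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# In any notebook cell you would then simply:
#   from losses import mse
#   mse([1.0, 2.0], [1.5, 2.5])
# so the same tested function is shared across notebooks instead of
# being copy-pasted into each .ipynb file.
```

The point of the pattern is that the .py file can be imported, versioned, and unit-tested like ordinary library code, while the notebook stays focused on plots and narrative.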
Competition-Level Code Generation with AlphaCode
— AK (@ak92501) February 2, 2022
paper: https://t.co/Np8uy6UE3R
blog: https://t.co/ATpcgHNeGB pic.twitter.com/x3iGv5UjBM
WebFormer: The Web-page Transformer for Structure Information Extraction
— AK (@ak92501) February 2, 2022
abs: https://t.co/d6y4TEFw2h pic.twitter.com/CgMiVVAtyS
Every week I hear about some folks that look at GradCAM (apparently most of medical imaging research) or SHAP as though anyone knows what, if anything, these “explanations” mean and it’s terrifying. Overall, I believe it’s already in “actively harmful” territory.
— Zachary Lipton (@zacharylipton) February 2, 2022
COIN++: Data Agnostic Neural Compression
— AK (@ak92501) February 1, 2022
abs: https://t.co/BvWvL962Vg pic.twitter.com/bIEMWjciJb
VRT: A Video Restoration Transformer
— AK (@ak92501) January 31, 2022
abs: https://t.co/Fzxk3gdL8K
github: https://t.co/ILBcaKPogC pic.twitter.com/ONK2GBENck
Over the last couple of weeks, I took a deep dive into @PyTorchLightning and am positively surprised by how flexible it is for research. Just created a tutorial implementing our recent CORN method for ordinal regression: https://t.co/SMNGlY3FJa
— Sebastian Raschka (@rasbt) January 28, 2022
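As background for the tutorial linked above: CORN-style ordinal regression builds on the standard trick of recasting one ordinal label as a set of binary "is the label greater than rank k?" tasks. A minimal sketch of that label extension, in plain Python (the function name is illustrative; this shows only the shared label-extension idea, not CORN's conditional training scheme):

```python
def extended_binary_labels(y, num_classes):
    """Convert an ordinal label y in {0, ..., num_classes - 1} into
    num_classes - 1 binary targets, where target k answers 'is y > k?'."""
    return [1 if y > k else 0 for k in range(num_classes - 1)]

# With 5 ordinal levels, label 2 becomes [1, 1, 0, 0]:
# it exceeds ranks 0 and 1, but not ranks 2 and 3.
```

A network then predicts these K-1 binary outputs, and the ordinal prediction is recovered by counting how many of them exceed 0.5.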