by ak92501 on 2022-02-04 (UTC).

ETSformer: Exponential Smoothing Transformers for Time-series Forecasting
abs: https://t.co/ZtpXPqhlhF pic.twitter.com/dSPGXgcAid

— AK (@ak92501) February 4, 2022
Tags: forecast, research
by rcalo on 2022-02-04 (UTC).

🎯 What bothers me so much about this rhetoric is that the purported benefits to society are amorphous yet assumed, whereas the tangible losses to actual people are clear and concrete but collateral.

— Ryan Calo (@rcalo) February 4, 2022
Tags: misc, ethics
by ak92501 on 2022-02-04 (UTC).

Pre-Trained Language Models for Interactive Decision-Making
abs: https://t.co/uECv8kutrE
project page: https://t.co/Bf3iqgfcA9 pic.twitter.com/OLSIiOxX2S

— AK (@ak92501) February 4, 2022
Tags: research, nlp, cv
by emeryberger on 2022-02-03 (UTC).

Improved Scalene Python profiler GUI now integrated into Jupyter Notebooks (`pip install scalene`; see also https://t.co/yx1cYSOPY6) pic.twitter.com/xdcI0AbmAr

— Emery Berger (@emeryberger) February 3, 2022
Tags: tool, python
by ak92501 on 2022-02-03 (UTC).

Unified Scaling Laws for Routed Language Models
abs: https://t.co/C4zMJcB2wg pic.twitter.com/LoKuIVW617

— AK (@ak92501) February 3, 2022
Tags: research, nlp
by rasbt on 2022-02-02 (UTC).

This! A common question people ask is whether they should work with .py vs .ipynb files. It doesn't have to be exclusive. E.g. want a nb with plots but have a loss function that you keep reusing? Put it into a .py file (doesn't have to be a pkg) and import into your notebooks. https://t.co/Eib4iJUyFs

— Sebastian Raschka (@rasbt) February 2, 2022
Tags: tip, misc, python
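A minimal sketch of the workflow in the tweet above: keep a reusable function in a plain `.py` file and import it from any notebook. The file and function names here (`losses.py`, `mse`) are illustrative, not from the tweet.

```python
# Sketch: a reusable function lives in a plain .py module, not a package,
# and notebooks just import it. Here we write the module to disk first so
# the example is self-contained; in practice you would simply save the file.
import pathlib

pathlib.Path("losses.py").write_text(
    "def mse(y_true, y_pred):\n"
    "    '''Mean squared error over two equal-length sequences.'''\n"
    "    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)\n"
)

# In any notebook cell, this is all you need:
from losses import mse

print(mse([1.0, 2.0], [1.0, 4.0]))  # 2.0
```

The same module can then be imported from every notebook in the project, so the function has a single definition instead of being copy-pasted between .ipynb files.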
by ak92501 on 2022-02-02 (UTC).

Competition-Level Code Generation with AlphaCode
paper: https://t.co/Np8uy6UE3R
blog: https://t.co/ATpcgHNeGB pic.twitter.com/x3iGv5UjBM

— AK (@ak92501) February 2, 2022
Tags: research, nlp
by ak92501 on 2022-02-02 (UTC).

WebFormer: The Web-page Transformer for Structure Information Extraction
abs: https://t.co/d6y4TEFw2h pic.twitter.com/CgMiVVAtyS

— AK (@ak92501) February 2, 2022
Tags: research, nlp
by zacharylipton on 2022-02-02 (UTC).

Every week I hear about some folks that look at GradCAM (apparently most of medical imaging research) or SHAP as though anyone knows what, if anything, these “explanations” mean and it’s terrifying. Overall, I believe it’s already in “actively harmful” territory.

— Zachary Lipton (@zacharylipton) February 2, 2022
Tags: thought, misc
by ak92501 on 2022-02-01 (UTC).

COIN++: Data Agnostic Neural Compression
abs: https://t.co/BvWvL962Vg pic.twitter.com/bIEMWjciJb

— AK (@ak92501) February 1, 2022
Tags: research
by ak92501 on 2022-01-31 (UTC).

VRT: A Video Restoration Transformer
abs: https://t.co/Fzxk3gdL8K
github: https://t.co/ILBcaKPogC pic.twitter.com/ONK2GBENck

— AK (@ak92501) January 31, 2022
Tags: research, w_code
by rasbt on 2022-01-28 (UTC).

The last couple of weeks, I took a deep dive into @PyTorchLightnin and am positively surprised how flexible it is for research. Just created a tutorial implementing our recent CORN method for ordinal regression: https://t.co/SMNGlY3FJa

— Sebastian Raschka (@rasbt) January 28, 2022
Tags: learning, tool, pytorch
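For context on the tutorial above: ordinal-regression methods in this family (CORAL, CORN) recast a K-class ordinal label as K-1 binary "is the rank greater than k?" tasks. The sketch below shows only that generic label encoding/decoding idea, under the author's framing; it is not the CORN conditional training scheme or code from the linked tutorial.

```python
# Hedged sketch of the cumulative binary encoding used by CORAL/CORN-style
# ordinal regression. Function names are illustrative.

def ordinal_encode(label, num_classes):
    """Rank r in {0..K-1} -> K-1 binary targets [r > 0, r > 1, ...]."""
    return [1 if label > k else 0 for k in range(num_classes - 1)]

def ordinal_decode(binary_preds):
    """Predicted rank = number of 'greater than' tasks answered 1."""
    return sum(binary_preds)

print(ordinal_encode(2, 4))       # [1, 1, 0]
print(ordinal_decode([1, 1, 0]))  # 2
```

Because the targets are cumulative, a model's K-1 sigmoid outputs can be thresholded and summed to recover a rank, which is what makes the binary decomposition ordinal rather than plain multi-label classification.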
