by Thom_Wolf on 2019-08-30 (UTC).

I'm so excited about our current projects on energy-efficient NLP!

Distilled models are very complementary to larger models. Training costs make headlines, but as large-scale models reach production, inference time will likely account for most of a model's total environmental cost pic.twitter.com/K81ZeGg2Yy

— Thomas Wolf (@Thom_Wolf) August 30, 2019
nlp research
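
As a back-of-envelope illustration of that point (with purely hypothetical numbers, not figures from the tweet or the linked post): training energy is paid once, while inference energy scales with production traffic, so a heavily used model can quickly accumulate more energy in serving than it did in training.

    # Sketch with purely illustrative numbers: training energy is a
    # one-off cost, inference energy grows with request volume.
    train_energy_kwh = 1_000          # hypothetical one-time training cost
    energy_per_request_kwh = 0.0005   # hypothetical per-inference cost
    requests_per_day = 5_000_000      # hypothetical production traffic

    days_to_match_training = train_energy_kwh / (
        energy_per_request_kwh * requests_per_day
    )
    print(f"Inference energy matches training after {days_to_match_training:.1f} days")
    # 0.4 days with these made-up numbers; beyond that, inference dominates.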
by Thom_Wolf on 2019-08-30 (UTC).

Our DistilBert post links to papers to get you started: https://t.co/5ANj4Oh7Rb

It can be surprisingly difficult to get papers on energy efficiency accepted, in part because of the current unreasonable focus on SOTA only.

I think it's time for conferences & workshops to tackle this

— Thomas Wolf (@Thom_Wolf) August 30, 2019
nlp research
by Thom_Wolf on 2019-09-04 (UTC).

🌟Pytorch-Transformers 1.2.0 is out!🌟

A new architecture:
- DistilBERT from @huggingface 👉 https://t.co/5ANj4OyJfL

Five new pretrained checkpoints:
- DistilBERT base & SQuAD
- GPT2 large 774M
- XLM multilingual 17 & 100 languages

So many fixes/improvements from the community! pic.twitter.com/1hCDKfJpwC

— Thomas Wolf (@Thom_Wolf) September 4, 2019
nlp tool
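
For reference, a minimal sketch of loading one of the new DistilBERT checkpoints with pytorch-transformers (the class and checkpoint names are assumptions based on the 1.2.0 release; the library has since been renamed transformers):

    import torch
    from pytorch_transformers import DistilBertModel, DistilBertTokenizer

    # Load the distilled BERT checkpoint announced in the release.
    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertModel.from_pretrained('distilbert-base-uncased')
    model.eval()

    input_ids = torch.tensor([tokenizer.encode("DistilBERT is small and fast.")])
    with torch.no_grad():
        last_hidden_state = model(input_ids)[0]  # (batch, seq_len, hidden_size)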
by peteskomoroch on 2019-12-11 (UTC).

Pretrained language models like quantized DistilBERT from @huggingface will launch a wave of creative AI applications running on mobile and embedded devices: https://t.co/ogYlzxYIIH

— Peter Skomoroch (@peteskomoroch) December 11, 2019
nlp w_code
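
A minimal sketch of how such a quantized model can be produced with PyTorch's post-training dynamic quantization (one common approach; the model and checkpoint names are assumptions, not details from the linked article):

    import torch
    from pytorch_transformers import DistilBertModel

    model = DistilBertModel.from_pretrained('distilbert-base-uncased')
    model.eval()

    # Convert Linear layer weights to int8; activations stay in float.
    quantized_model = torch.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8
    )

    # Smaller weights and faster int8 matmuls help on mobile/embedded targets.
    torch.save(quantized_model.state_dict(), 'distilbert_quantized.pt')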
