by ericjang11 on 2021-12-17 (UTC).

Here is the sequel to "Just ask for Generalization" - in this blog post I argue that Generalization *is* Language, and suggest how we might be able to re-use Language Models as "generalization modules" for non-NLP domains. Check it out! https://t.co/TI5FrbAU2R

— Eric Jang 🇺🇸🇹🇼 (@ericjang11) December 17, 2021
learning
by OpenAI on 2021-12-16 (UTC).

We trained a research version of GPT-3 that can search the web, synthesize information, and cite its sources to provide more accurate answers to questions. https://t.co/YEEazt0oLZ

— OpenAI (@OpenAI) December 16, 2021
research nlp
by jaseweston on 2021-12-14 (UTC).

🚨New paper🚨 SOTA dialogue models are not winning Oscars anytime soon, as they cannot effectively stay in character.

We analyze and propose methods to measure & mitigate -- but it's still an open problem. https://t.co/C9SjBPcT4S @shtruk @JackUrbs Arthur Szlam @jaseweston pic.twitter.com/i4V5WkNePT

— Jason Weston (@jaseweston) December 14, 2021
research nlp
by hardmaru on 2021-12-13 (UTC).

Self-attention Does Not Need O(n²) Memory

“We provide a practical implementation for accelerators that requires O(√n) memory, is numerically stable, and is within a few percent of the runtime of the standard implementation of attention.” https://t.co/iMQvCMnWgi

— hardmaru (@hardmaru) December 13, 2021
research
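For readers curious what the trick looks like, here is a minimal NumPy sketch of the chunked, numerically stable softmax the paper builds on. The function and variable names, chunk size, and self-check are mine, not taken from the paper's code; the paper additionally splits the queries into roughly √n-sized blocks to reach the stated O(√n) bound, which this sketch omits for brevity.

```python
# Minimal sketch: keys/values are processed in chunks with a running
# (online) softmax, so the full (n x n) score matrix is never materialized.
import numpy as np

def chunked_attention(q, k, v, chunk=256):
    """q: (n, d), k: (n, d), v: (n, d_v). Returns the (n, d_v) attention output."""
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    m = np.full((n, 1), -np.inf)      # running per-query max score
    num = np.zeros((n, v.shape[1]))   # running weighted sum of values
    den = np.zeros((n, 1))            # running softmax normalizer
    for start in range(0, n, chunk):
        k_c = k[start:start + chunk]
        v_c = v[start:start + chunk]
        s = (q @ k_c.T) * scale                       # scores for this chunk only
        m_new = np.maximum(m, s.max(axis=1, keepdims=True))
        corr = np.exp(m - m_new)                      # rescale old stats to the new max
        p = np.exp(s - m_new)
        num = num * corr + p @ v_c
        den = den * corr + p.sum(axis=1, keepdims=True)
        m = m_new
    return num / den

# Quick check against the naive implementation that materializes all scores.
rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(512, 64)) for _ in range(3))
s = (q @ k.T) / np.sqrt(64)
w = np.exp(s - s.max(axis=1, keepdims=True))
ref = (w / w.sum(axis=1, keepdims=True)) @ v
assert np.allclose(chunked_attention(q, k, v), ref)
```

The key design point is that the softmax max, numerator, and denominator can all be updated incrementally as new key/value chunks arrive, so accuracy matches the standard implementation while only one score block is ever held in memory.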
by simongerman600 on 2021-12-11 (UTC).

Tax 101. This was sent to me and I think it’s worth sharing around. Basic knowledge but worth remembering. pic.twitter.com/17aAD8SR6h

— Simon Kuestenmacher (@simongerman600) December 11, 2021
dataviz
by jburnmurdoch on 2021-12-10 (UTC).

I think we may need to recalibrate our idea of typical case numbers as Omicron takes off.

Here’s what UK cases could look like *in the next week or two alone* if Omicron continues to double every 3 days (some actually estimate faster growth)

Story: https://t.co/mdY7GFiNsD pic.twitter.com/GbXpEFkR6n

— John Burn-Murdoch (@jburnmurdoch) December 10, 2021
dataviz
In a group with 90 other tweets.
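As a back-of-the-envelope check on the 3-day doubling claim, the compounding is easy to work out; the starting case count below is purely hypothetical and not a figure from the article.

```python
# Exponential growth under a constant 3-day doubling time.
# The starting case count is hypothetical, chosen only for illustration.
doubling_days = 3
start_cases = 50_000  # hypothetical reported cases per day today

for days in (7, 14):
    factor = 2 ** (days / doubling_days)
    print(f"after {days:2d} days: ~{factor:.1f}x today, "
          f"~{start_cases * factor:,.0f} cases/day")
# Roughly 5x today's level after one week, ~25x after two weeks.
```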
by jeremyphoward on 2021-12-09 (UTC).

This is a terrific comparison of the main free GPU @ProjectJupyter providers available. I agree with the conclusion - the new @awscloud SageMaker Studio Lab is a fantastic option.

It's my 1st choice now for training models where I don't need much disk space. https://t.co/M1XsEoMBH6

— Jeremy Howard (@jeremyphoward) December 9, 2021
tool misc tip
by DeepMind on 2021-12-08 (UTC).

The three studies explore: Gopher - a SOTA 280B parameter transformer, ethical and social risks, & a new retrieval architecture with better training efficiency.

1: https://t.co/WDUeFd5DiF
2: https://t.co/cZcWHCg128
3: https://t.co/h9fdMP6C5W (more https://t.co/4QiVDqntTS) 2/

— DeepMind (@DeepMind) December 8, 2021
nlp research
by karpathy on 2021-12-08 (UTC).

The ongoing consolidation in AI is incredible. Thread: ➡️ When I started ~a decade ago, vision, speech, natural language, reinforcement learning, etc. were completely separate; you couldn't read papers across areas - the approaches were completely different, often not even ML-based.

— Andrej Karpathy (@karpathy) December 8, 2021
misc thought
by tunguz on 2021-12-07 (UTC).

Just came across this eminently readable and beginner friendly introduction to Transformers. Probably the best such text I've seen.

"Transformers from Scratch"https://t.co/qwI75Ezu9O#AI #ML #DS #NLP #artificialintelligence #machinelearning #datascience 1/2 pic.twitter.com/vvhFuzRdC9

— Bojan Tunguz (@tunguz) December 7, 2021
learning tutorial
by ai_fast_track on 2021-12-06 (UTC).

🎉Part 2- Summary of 10 summaries on:

Tips & Trick & Best Practices in training (not only) object detection models.

Don't miss any of those posts, follow @ai_fast_track to catch them in your feed.

🎁 Summary of summaries: ... pic.twitter.com/VLcWNkMaph

— AI Fast Track (60/60) (@ai_fast_track) December 6, 2021
cv learning tip tutorial
by srush_nlp on 2021-12-06 (UTC).

Minitorch🔥(https://t.co/41DIF6vAUl, v2021) Build-it-yourself deep learning.

Learn about auto-diff, tensors, GPUs, and advanced nn models. Build everything from scratch in pure python with full unit test coverage and interactive visualizations. pic.twitter.com/DmSLwuzoZQ

— Sasha Rush (@srush_nlp) December 6, 2021
pytorch tool learning
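To give a flavour of the "build everything from scratch" exercises, here is a toy scalar reverse-mode auto-diff in pure Python. It is illustrative only and not Minitorch's actual API; the course builds full tensors, broadcasting, and GPU support on top of the same basic idea.

```python
# Toy reverse-mode autodiff, in the spirit of build-it-yourself deep learning.
# Class and method names are illustrative, not Minitorch's real interface.
class Scalar:
    def __init__(self, value, parents=(), backward_fn=None):
        self.value = value
        self.grad = 0.0
        self._parents = parents          # upstream Scalars
        self._backward_fn = backward_fn  # maps output grad -> grads for parents

    def __add__(self, other):
        return Scalar(self.value + other.value, (self, other),
                      lambda g: (g, g))

    def __mul__(self, other):
        return Scalar(self.value * other.value, (self, other),
                      lambda g: (g * other.value, g * self.value))

    def backward(self):
        # Topologically order the graph, then push gradients backwards.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node._backward_fn is not None:
                for parent, g in zip(node._parents, node._backward_fn(node.grad)):
                    parent.grad += g

# d(x*y + x)/dx = y + 1 = 4, d(x*y + x)/dy = x = 2
x, y = Scalar(2.0), Scalar(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```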
