by maosbot on 2021-08-14 (UTC).

Science is all about getting good at "I don't know"—getting good at recognising "I don't know", getting good at saying "I don't know", getting good at knowing what to do next.

— Michael A Osborne (@maosbot) August 14, 2021
misc thought
by simongerman600 on 2021-08-14 (UTC).

Men's Tennis Grand Slam Wins: since 2003 Federer, Nadal, and Djokovic won 20 Grand Slams each (60 in total) and everyone else won 14. I guess they really know their way around a tennis racket... Source: https://t.co/wHAJqpzh9K pic.twitter.com/CsDZqEuidB

— Simon Kuestenmacher (@simongerman600) August 14, 2021
dataviz
by borisdayma on 2021-08-13 (UTC).

There are now many ways to learn about DALL·E mini, the text to image generator 🎉

📹 for viewers, see the presentation: https://t.co/eGV22uhBn0
📕 for readers, see the report: https://t.co/liX9qN79hj
🥑 play with the demo: https://t.co/OiBcNrqoBv

— Boris Dayma 🥑 (@borisdayma) August 13, 2021
learning video research
In a group with 7 other tweets.
by simongerman600 on 2021-08-13 (UTC).

Vaccines are a gift to humanity. Soon after introduction of vaccination measles were eradicated. How good is it to not have the measles around anymore! Go to the link to see similar charts for other vaccines. Source: https://t.co/kVvTOL3cKj pic.twitter.com/c9Z4cwsftn

— Simon Kuestenmacher (@simongerman600) August 13, 2021
dataviz
by rasbt on 2021-08-13 (UTC).

This is a really great NLP Transformer survey, indeed! Also, I like that they included a section focusing on the three main ways to utilize a pre-trained transformer (assuming most of us don't have the infrastructure to train them from scratch): https://t.co/lGULN6vCwV https://t.co/HGRJoGmISM pic.twitter.com/UEHL0MSPwo

— Sebastian Raschka (@rasbt) August 13, 2021
research survey learning nlp
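As a rough illustration of the point about reusing a pre-trained transformer instead of training one from scratch, here is a minimal sketch with the Hugging Face transformers library showing two common reuse patterns: frozen feature extraction and fine-tuning with a task head. The model name, task, and input text are placeholders, not taken from the survey in the tweet.

```python
# Minimal sketch (illustrative, not the survey's taxonomy): two common ways
# to reuse a pre-trained transformer without training it from scratch.
# Assumes `transformers` and `torch` are installed; model/task are placeholders.
import torch
from transformers import AutoModel, AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
inputs = tokenizer("Pre-trained transformers are reusable.", return_tensors="pt")

# 1) Feature extraction: keep the encoder frozen and use its hidden states as features.
encoder = AutoModel.from_pretrained("distilbert-base-uncased")
with torch.no_grad():
    features = encoder(**inputs).last_hidden_state.mean(dim=1)  # crude sentence embedding

# 2) Fine-tuning: attach a task head and update the weights on your own labelled data.
classifier = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
logits = classifier(**inputs).logits  # would be trained against your labels
```

Which pattern makes sense depends on data and compute: feature extraction keeps the encoder frozen and is cheap, while fine-tuning updates the weights and usually gives better task accuracy.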
by AngeBassa on 2021-08-13 (UTC).

Data practitioners, I'm BEGGING you: stop working on (and enabling) projects like this https://t.co/n872NFd2JM

— Angela Bassa (@AngeBassa) August 13, 2021
ethics misc
by ak92501 on 2021-08-13 (UTC).

Genji-python 6B is now on @huggingface Spaces using @Gradio
link: https://t.co/4vQm6oVkdq https://t.co/UG2xH9cCxK pic.twitter.com/pVegXpPLhu

— AK (@ak92501) August 13, 2021
research nlp
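For context on how such Spaces demos are usually wired up, here is a minimal, hypothetical Gradio sketch of a text-completion demo. It is not the Genji-python 6B app itself; the model id below is a small placeholder so the example runs on modest hardware.

```python
# Hypothetical sketch of a Gradio text-completion demo of the kind hosted on
# Hugging Face Spaces. Not the actual Genji-python 6B app; the model id is a
# small placeholder so it runs without a large GPU.
import gradio as gr
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")  # placeholder model

def complete(prompt: str) -> str:
    # Generate a single continuation of the user's prompt.
    result = generator(prompt, max_length=64, num_return_sequences=1)
    return result[0]["generated_text"]

# "text" in/out gives a simple textbox-to-textbox UI, as in many Spaces demos.
gr.Interface(fn=complete, inputs="text", outputs="text").launch()
```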
by ak92501 on 2021-08-13 (UTC).

Mobile-Former: Bridging MobileNet and Transformer
pdf: https://t.co/Ssr6oFOjy7
abs: https://t.co/lctrhRG2Oq

achieves 77.9% top-1 accuracy at 294M FLOPs, gaining 1.3% over MobileNetV3 but saving 17% of computations pic.twitter.com/ChNT9kJtSy

— AK (@ak92501) August 13, 2021
research cv
by jbhuang0604 on 2021-08-13 (UTC).

*Avoid reading the paper*

Instead of spending time reading the actual paper, find resources that are much easier to digest, e.g., a talk, a youtube video, teaser results, introductory video, or an overview figure.

Very often understanding the gist of the paper is all you need. pic.twitter.com/DVWhjEAd28

— Jia-Bin Huang (@jbhuang0604) August 13, 2021
tip misc
by ak92501 on 2021-08-13 (UTC).

Billion-Scale Pretraining with Vision Transformers for Multi-Task Visual Representations
pdf: https://t.co/ZPTagL3LzO
abs: https://t.co/TfhdXimw4s

a scalable approach for pretraining with over a billion images in order to improve a production Unified Visual Embedding model pic.twitter.com/bFmlbpD01e

— AK (@ak92501) August 13, 2021
research cv
by ak92501 on 2021-08-12 (UTC).

jurassic-1: technical details and evaluation
pdf: https://t.co/FzG56j1kHw
github: https://t.co/i2RQjyLVU9
Jurassic-1 is a pair of auto-regressive language models recently released by AI21 Labs, consisting of J1-Jumbo, a 178B-parameter model, and J1-Large, a 7B-parameter model pic.twitter.com/MS0DGlypTm

— AK (@ak92501) August 12, 2021
research nlp w_code
by _brohrer_ on 2021-08-12 (UTC).

ML strategy tip

When you have a problem, build two solutions - a deep Bayesian transformer running on multicloud Kubernetes and a SQL query built on a stack of egregiously oversimplifying assumptions. Put one on your resume, the other in production. Everyone goes home happy.

— Brandon Rohrer (@_brohrer_) August 12, 2021
misc humour

Tags

learning tutorial misc nlp rstats gan ethics research dataviz survey python tool security kaggle video thought bayesian humour tensorflow w_code bias dataset pytorch cv tip application javascript forecast swift golang rl jax julia gnn causal surey diffusion