by fchollet on 2019-05-16 (UTC).

Based on the observation that the GPT-2 medium-size model has memorized (and can spit back word-for-word) very long extracts from the web, such as the Gorilla Warfare meme, I had an idea for a very simple ML-less text generation algorithm. I spent the past 20 min implementing it. pic.twitter.com/9MiVnw4TqC

— François Chollet (@fchollet) May 16, 2019
nlp
by fchollet on 2019-05-16 (UTC).

My algo is to make search queries for the keywords in a prompt, plus the exact sequence of the last words in the prompt (trying different number of words to get at least one match), then stitch together result snippets by using last words as a continuity pivot. It works decently! pic.twitter.com/WZNSllGVRR

— François Chollet (@fchollet) May 16, 2019
nlp
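The two tweets above describe the whole method, but the code itself only appears in the pic.twitter.com screenshots, which are not reproduced here. Below is a minimal sketch of the idea as described: shrink the prompt's trailing words until a web search returns a snippet containing that exact phrase, then splice in whatever follows the phrase in the snippet, using it as the continuity pivot. The `search_snippets` helper is a hypothetical placeholder for whichever search API you plug in, and only the exact-phrase part of the query strategy is sketched (the additional keyword queries mentioned in the tweet are omitted); none of these names come from the original tweets.

```python
import re


def search_snippets(query):
    """Placeholder: return a list of result-snippet strings for `query`.

    A real implementation would call some web search API; which one is
    left open here (this helper is an assumption, not from the tweets).
    """
    raise NotImplementedError


def continue_prompt(prompt, max_pivot_words=8, min_pivot_words=2):
    """Extend `prompt` once using search snippets, no ML involved.

    Query for the exact last words of the prompt, shrinking the pivot
    until at least one snippet contains it, then return the prompt plus
    the text that follows the pivot in that snippet.
    """
    words = prompt.split()
    # Try progressively shorter word suffixes as the "continuity pivot".
    for n in range(min(max_pivot_words, len(words)), min_pivot_words - 1, -1):
        pivot = " ".join(words[-n:])
        for snippet in search_snippets(f'"{pivot}"'):
            # Look for the exact pivot inside the snippet, ignoring case.
            match = re.search(re.escape(pivot), snippet, flags=re.IGNORECASE)
            if match:
                continuation = snippet[match.end():].strip()
                if continuation:
                    return prompt + " " + continuation
    return prompt  # no snippet matched any pivot; give up


def generate(prompt, steps=3):
    """Stitch several continuations together by repeatedly pivoting
    on the last words of the growing text."""
    text = prompt
    for _ in range(steps):
        new_text = continue_prompt(text)
        if new_text == text:
            break
        text = new_text
    return text
```

Repeating the splice step lets the text grow a sentence or two at a time, at the cost of drifting toward whatever the search index happens to contain near each pivot.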
