SummerTime: Text Summarization Toolkit for Non-experts
AK (@ak92501) August 31, 2021
pdf: https://t.co/RCgAVCFPLx
abs: https://t.co/xuGph3jsdB
github: https://t.co/OYTgNX6u0I pic.twitter.com/txTS9Lg0Cq
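The tweet only names the toolkit, so here is a minimal stand-in for the task it targets, single-document abstractive summarization, using the Hugging Face pipeline API rather than SummerTime's own interface (which bundles several such models, datasets, and metrics behind one entry point).

```python
# Not SummerTime's own API: a minimal stand-in using the Hugging Face pipeline
# to show the single-document summarization task the toolkit wraps.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

document = (
    "SummerTime is a toolkit that bundles summarization models, datasets, "
    "and evaluation metrics behind a single interface aimed at non-experts."
)

result = summarizer(document, max_length=30, min_length=5, do_sample=False)
print(result[0]["summary_text"])
```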
#PyTorch re-implementation of DeepMind's Perceiver IO: A General Architecture for Structured Inputs & Outputs https://t.co/c16ftYKgzJ pic.twitter.com/srwT1TiOaU
Alexandr Kalinin (@alxndrkalinin) August 30, 2021
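The repo above is a re-implementation of the paper; below is only a rough PyTorch sketch of the Perceiver IO pattern itself (a fixed-size latent array cross-attends to arbitrary-length inputs, is refined by self-attention, and task-specific output queries read predictions back out), not the repo's code.

```python
# Minimal sketch of the Perceiver IO pattern (not the linked repo's code):
# a small latent array cross-attends to arbitrary-length inputs, is refined
# by self-attention, and task-specific output queries read the result out.
import torch
import torch.nn as nn

class TinyPerceiverIO(nn.Module):
    def __init__(self, dim=128, num_latents=64, heads=4):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        self.encode = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.process = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.decode = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, inputs, output_queries):
        # inputs: (B, M, dim), output_queries: (B, O, dim)
        lat = self.latents.expand(inputs.size(0), -1, -1)
        lat, _ = self.encode(lat, inputs, inputs)       # latents attend to inputs
        lat = self.process(lat)                         # latent self-attention
        out, _ = self.decode(output_queries, lat, lat)  # queries read from latents
        return out

x = torch.randn(2, 500, 128)        # e.g. 500 input tokens/pixels
queries = torch.randn(2, 10, 128)   # 10 structured outputs
print(TinyPerceiverIO()(x, queries).shape)  # torch.Size([2, 10, 128])
```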
What's more, if your training is in @PyTorch, you can rather easily add this behaviour with minimal changes to your codebase, using @higherpytorch. https://t.co/U5dFLBXTHZ
Edward Grefenstette (@egrefen) August 28, 2021
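The tweet links higher without showing code, so here is a minimal sketch of the pattern it provides, assuming a MAML-style setup: higher.innerloop_ctx wraps a model and optimizer so inner-loop updates stay differentiable and an outer loss can backpropagate through them.

```python
# Minimal sketch of the `higher` pattern the tweet alludes to: wrap a model
# and optimizer in `higher.innerloop_ctx` so inner-loop SGD steps stay on the
# autograd graph and an outer loss can backprop through them (MAML-style).
import torch
import torch.nn as nn
import higher

model = nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x_inner, y_inner = torch.randn(8, 4), torch.randn(8, 1)
x_outer, y_outer = torch.randn(8, 4), torch.randn(8, 1)

meta_opt.zero_grad()
with higher.innerloop_ctx(model, opt, copy_initial_weights=False) as (fmodel, diffopt):
    for _ in range(3):                                    # differentiable inner loop
        inner_loss = nn.functional.mse_loss(fmodel(x_inner), y_inner)
        diffopt.step(inner_loss)
    outer_loss = nn.functional.mse_loss(fmodel(x_outer), y_outer)
    outer_loss.backward()                                 # grads flow to original params
meta_opt.step()
```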
Our RemBERT model (ICLR 2021) is finally open-source and available in 🤗 Transformers.
Sebastian Ruder (@seb_ruder) August 23, 2021
RemBERT is a large multilingual Transformer that outperforms XLM-R (and mT5 with similar # of params) in zero-shot transfer.
Docs: https://t.co/AKwV0UF6cT
Paper: https://t.co/TXF7qlJtUY pic.twitter.com/ytIiMOqVks
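Loading RemBERT through Transformers should follow the usual Auto* flow; "google/rembert" is the checkpoint name I would expect here, but check the linked docs if it differs.

```python
# Loading RemBERT via Hugging Face Transformers; "google/rembert" is the
# checkpoint name assumed here - verify against the linked documentation.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/rembert")
model = AutoModel.from_pretrained("google/rembert")

inputs = tokenizer("RemBERT is a multilingual encoder.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```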
Real-ESRGAN: Training Real-World Blind Super-Resolution with Pure Synthetic Data now on @huggingface Spaces using @Gradio
AK (@ak92501) August 17, 2021
demo: https://t.co/3KXzlj5M7Z
paper: https://t.co/1j8gR7uDIC
github: https://t.co/IE0O5SHDDF pic.twitter.com/4LH9oIWO15
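The Space wraps the model in a Gradio image-to-image interface; this is a generic sketch of that pattern with a placeholder upscaler, not the demo's actual code.

```python
# Generic sketch of the Spaces pattern: wrap an image-to-image model in a
# Gradio interface. `upscale` is a placeholder; the real demo runs Real-ESRGAN.
import gradio as gr
from PIL import Image

def upscale(img: Image.Image) -> Image.Image:
    # Placeholder: the actual Space runs the Real-ESRGAN generator here.
    return img.resize((img.width * 4, img.height * 4))

demo = gr.Interface(
    fn=upscale,
    inputs=gr.Image(type="pil"),
    outputs=gr.Image(type="pil"),
    title="Real-ESRGAN (placeholder upscaler)",
)

if __name__ == "__main__":
    demo.launch()
```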
SOTR: Segmenting Objects with Transformers
AK (@ak92501) August 17, 2021
pdf: https://t.co/eplIKD4mgZ
abs: https://t.co/ARAaQ7VJAe
github: https://t.co/XlVZrJh25P
performs well on the MS COCO dataset and surpasses state-of-the-art instance segmentation approaches pic.twitter.com/06tH3XPtKQ
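For context on the MS COCO claim, mask AP numbers like these are typically computed with pycocotools; a sketch with placeholder file paths, not SOTR's own evaluation script:

```python
# How COCO instance-segmentation numbers like SOTR's are typically computed:
# score a JSON of predicted masks against the ground truth with pycocotools.
# File paths below are placeholders.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("annotations/instances_val2017.json")    # ground-truth annotations
coco_dt = coco_gt.loadRes("sotr_predictions.json")      # model's segmentation results

evaluator = COCOeval(coco_gt, coco_dt, iouType="segm")  # mask AP, not box AP
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()                                   # prints AP / AP50 / AP75 ...
```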
Jurassic-1: Technical Details and Evaluation
AK (@ak92501) August 12, 2021
pdf: https://t.co/FzG56j1kHw
github: https://t.co/i2RQjyLVU9
Jurassic-1 is a pair of auto-regressive language models recently released by AI21 Labs, consisting of J1-Jumbo, a 178B-parameter model, and J1-Large, a 7B-parameter model pic.twitter.com/MS0DGlypTm
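Jurassic-1 weights were not released; the models are served through AI21 Studio's HTTP API. The endpoint and field names below are from memory and may have changed, so treat them as assumptions to check against AI21's documentation.

```python
# Jurassic-1 is served through AI21 Studio's HTTP API rather than as released
# weights. The endpoint and payload fields here are assumptions from memory -
# confirm them against AI21's current documentation before use.
import os
import requests

resp = requests.post(
    "https://api.ai21.com/studio/v1/j1-large/complete",   # or j1-jumbo for 178B
    headers={"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"},
    json={"prompt": "Jurassic-1 is", "maxTokens": 32, "temperature": 0.7},
)
print(resp.json()["completions"][0]["data"]["text"])
```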
AdaAttN: Revisit Attention Mechanism in Arbitrary Neural Style Transfer
AK (@ak92501) August 10, 2021
pdf: https://t.co/VaWfcR2kKh
abs: https://t.co/FtEfKs8Ekk
github: https://t.co/Y3IFKiKZok pic.twitter.com/xO9YwDZ3cK
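A simplified sketch of the paper's core idea as I read it: attention between content and style features gives every content position an attention-weighted mean and standard deviation over the style values, which then re-normalize the content features (an attention-weighted AdaIN). This is an illustration, not the authors' implementation.

```python
# Simplified sketch of AdaAttN's core operation as I read the paper (not the
# authors' code): attention between content and style features yields a
# per-position weighted mean/std over style values that re-normalize content.
import torch
import torch.nn.functional as F

def ada_attn(content, style, eps=1e-5):
    # content, style: (B, N, C) feature maps flattened over spatial positions
    q = F.normalize(content, dim=-1)                     # stand-in for learned Q/K projections
    k = F.normalize(style, dim=-1)
    attn = torch.softmax(q @ k.transpose(1, 2), dim=-1)  # (B, N_content, N_style)

    mean = attn @ style                                  # per-position weighted mean
    var = attn @ (style * style) - mean * mean           # E[x^2] - E[x]^2
    std = torch.sqrt(var.clamp_min(0) + eps)

    content_norm = (content - content.mean(1, keepdim=True)) / (
        content.std(1, keepdim=True) + eps
    )
    return std * content_norm + mean                     # AdaIN-style, attention-weighted

out = ada_attn(torch.randn(1, 64 * 64, 256), torch.randn(1, 48 * 48, 256))
print(out.shape)  # torch.Size([1, 4096, 256])
```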
Styleformer helps convert casual to formal sentences, formal to casual sentences, active to passive, and passive to active sentences. https://t.co/lzdW2XjR4c
Daily Python Tip (@python_tip) August 7, 2021
Demo: https://t.co/EAhTKDFPfG pic.twitter.com/jjsG9Zcug2
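Usage is roughly as below; the style codes are as I recall the README, so verify them against the linked repo.

```python
# Sketch of Styleformer usage; the style codes (0 = casual to formal,
# 1 = formal to casual, 2 = active to passive, 3 = passive to active) are
# as I recall the README - verify against the linked repository.
from styleformer import Styleformer

sf_formal = Styleformer(style=0)                 # casual to formal
print(sf_formal.transfer("gotta send that report before the meeting"))

sf_passive = Styleformer(style=2)                # active to passive
print(sf_passive.transfer("The committee approved the proposal"))
```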
Video Contrastive Learning with Global Context
AK (@ak92501) August 6, 2021
pdf: https://t.co/0kkXi2hu3X
abs: https://t.co/se2YGoaoo6
github: https://t.co/Rhn4WJjquM pic.twitter.com/HQlA0zw2O2
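The tweet gives only the title, so here is the generic building block behind clip-level contrastive learning, an InfoNCE loss over paired clip embeddings, rather than this paper's specific global-context formulation.

```python
# Generic InfoNCE contrastive loss over paired clip embeddings - the common
# building block of video contrastive learning, not this paper's exact method.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    # z1, z2: (B, D) embeddings of two augmented views of the same clips
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature           # (B, B) cosine similarities
    targets = torch.arange(z1.size(0))           # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

loss = info_nce(torch.randn(32, 128), torch.randn(32, 128))
print(loss.item())
```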
Token Shift Transformer for Video Classification
AK (@ak92501) August 6, 2021
pdf: https://t.co/sdbS5P5RpD
abs: https://t.co/w5UpOnjHjl
github: https://t.co/4KQ0rdfCHN pic.twitter.com/A2RA717L84
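As I understand it, the paper's key operation is a zero-parameter temporal shift of a fraction of token channels between neighbouring frames inside a transformer block; a rough sketch, not the authors' code:

```python
# Rough sketch of a zero-parameter temporal token shift as I understand the
# idea: a small fraction of channels is shifted forward/backward along the
# time axis so each frame's tokens mix with their temporal neighbours.
import torch

def token_shift(x, shift_div=4):
    # x: (B, T, N, C) - batch, frames, tokens per frame, channels
    c = x.size(-1) // shift_div
    out = torch.zeros_like(x)
    out[:, 1:, :, :c] = x[:, :-1, :, :c]             # shift first chunk forward in time
    out[:, :-1, :, c:2 * c] = x[:, 1:, :, c:2 * c]   # shift second chunk backward
    out[:, :, :, 2 * c:] = x[:, :, :, 2 * c:]        # leave remaining channels in place
    return out

x = torch.randn(2, 8, 197, 768)                      # 8 frames of ViT tokens
print(token_shift(x).shape)                          # torch.Size([2, 8, 197, 768])
```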
YOLOX: Exceeding YOLO Series in 2021
AK (@ak92501) July 20, 2021
pdf: https://t.co/xC1ZEPOLRW
abs: https://t.co/BNkflEgqaC
github: https://t.co/rym6pRl10e pic.twitter.com/7Gg3ov9SUN