by GaelVaroquaux on 2022-07-19 (UTC).

We investigate which features of tabular data explain this difference. To do so, we modify the tabular data to narrow the gap.

Smoothing the outcome in feature space narrows the gap: deep architectures struggle with irregular patterns, whereas tree models do not care about smoothness.

6/9 pic.twitter.com/NlyvuzwlgD

— Gael Varoquaux (@GaelVaroquaux) July 19, 2022
research
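The smoothing experiment described in the tweet above is easy to mimic. Below is a minimal, hypothetical sketch, not the paper's actual protocol or data: it applies Gaussian-kernel smoothing to the regression target in feature space on a synthetic stand-in dataset, then fits a gradient-boosted tree ensemble and an MLP on the original versus the smoothed target. The dataset, kernel bandwidth, and model settings are illustrative choices, not values taken from the paper.

```python
# Sketch: Gaussian-kernel smoothing of the target in feature space,
# then compare a tree ensemble and an MLP on original vs. smoothed targets.
# Synthetic data and all hyperparameters are placeholder choices.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.datasets import make_regression
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X = StandardScaler().fit_transform(X)

def smooth_target(X, y, lengthscale=2.0):
    """Replace each target by a Gaussian-kernel weighted average of all targets."""
    d2 = cdist(X, X, "sqeuclidean")
    w = np.exp(-d2 / (2 * lengthscale**2))       # kernel weights in feature space
    return (w @ y) / w.sum(axis=1)               # normalised weighted average

for label, target in [("original", y), ("smoothed", smooth_target(X, y))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, target, random_state=0)
    trees = HistGradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    mlp = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=1000,
                       random_state=0).fit(X_tr, y_tr)
    print(f"{label:>9}: trees R2={r2_score(y_te, trees.predict(X_te)):.3f}  "
          f"MLP R2={r2_score(y_te, mlp.predict(X_te)):.3f}")
```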
by rasbt on 2022-07-23 (UTC).

[5/6] For large(r) datasets (~50k samples), however, the gap between tree-based ML and deep learning is much smaller.
That said, the authors argue that large tabular datasets are rare. Do you agree? (I think this is somewhat true in academic collaborations, but in industry?) pic.twitter.com/zNqdPkfzHq

— Sebastian Raschka (@rasbt) July 23, 2022
research
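The size-dependence claim can be probed in the same spirit. The sketch below is again hypothetical and not the authors' benchmark: it trains a gradient-boosted tree ensemble and a small MLP on growing subsamples of the covertype dataset, used here only as a convenient large tabular stand-in, and prints the accuracy gap at each training-set size.

```python
# Sketch: does the tree-vs-MLP gap shrink as the training set grows?
# Covertype and all model settings are stand-in choices, not the paper's setup.
from sklearn.datasets import fetch_covtype
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X, y = fetch_covtype(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=60_000,
                                           test_size=20_000, random_state=0)

for n in (1_000, 10_000, 50_000):                # small vs. "large(r)" regimes
    trees = HistGradientBoostingClassifier(random_state=0)
    mlp = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(256, 256),
                                      max_iter=50, random_state=0))
    trees.fit(X_tr[:n], y_tr[:n])
    mlp.fit(X_tr[:n], y_tr[:n])
    print(f"n={n:>6}: trees acc={trees.score(X_te, y_te):.3f}  "
          f"MLP acc={mlp.score(X_te, y_te):.3f}")
```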
