Tweeted by @dask_dev (September 19, 2019):

"Did you know that Dask can hand off parallel datasets to XGBoost for distributed training? Today there are two implementations, one from Dask, and one from XGBoost. https://t.co/OJuS1f8yjs https://t.co/BtEyY5NQq1"
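Below is a minimal sketch of what that hand-off can look like using XGBoost's own Dask integration (the `xgboost.dask` module). The local cluster, synthetic data, and parameter choices are illustrative assumptions, not taken from the tweet or its links.

```python
# Sketch: distributed XGBoost training on Dask collections, assuming a
# LocalCluster and random synthetic data (both hypothetical choices).
from dask.distributed import Client, LocalCluster
import dask.array as da
import xgboost as xgb

if __name__ == "__main__":
    # Any Dask scheduler works; a LocalCluster keeps the example self-contained.
    cluster = LocalCluster(n_workers=2, threads_per_worker=1)
    client = Client(cluster)

    # Chunked arrays that stay distributed across the workers.
    X = da.random.random((100_000, 20), chunks=(10_000, 20))
    y = da.random.randint(0, 2, size=100_000, chunks=10_000)

    # Hand the Dask collections to XGBoost without gathering them
    # onto a single machine.
    dtrain = xgb.dask.DaskDMatrix(client, X, y)

    # Train across the workers; the result holds the booster and eval history.
    output = xgb.dask.train(
        client,
        {"objective": "binary:logistic", "tree_method": "hist"},
        dtrain,
        num_boost_round=50,
    )
    booster = output["booster"]

    # Prediction is also distributed and stays lazy until computed.
    preds = xgb.dask.predict(client, booster, dtrain)
    print(preds.compute()[:5])
```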