Boosted trees are a fast, robust way to model many kinds of data, especially structured, tabular data. XGBoost is an incredibly popular library for building ML models, especially gradient boosted decision trees.
W&B users have reached out to ask if we could make our visualizations work easily with XGBoost. Our tool is framework agnostic, so you can build your own custom logging using wandb.log(dict), but I wanted to make it as easy to visualize XGBoost as it is for Keras, TensorFlow or PyTorch.
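That custom route amounts to wiring wandb.log into your own training loop. Here is a framework-agnostic sketch of the idea, with a plain list standing in for wandb.log — toy_train and its loss curve are illustrative stand-ins, not the wandb or XGBoost API:

```python
# A minimal sketch of the callback pattern: the trainer calls each
# callback with a dict of metrics once per boosting round, which is
# exactly the shape wandb.log(dict) expects.
def toy_train(num_rounds, callbacks=()):
    for round_idx in range(num_rounds):
        # illustrative metrics; a real trainer would compute these
        metrics = {"round": round_idx, "train-rmse": 1.0 / (round_idx + 1)}
        for cb in callbacks:
            cb(metrics)

history = []
toy_train(3, callbacks=[history.append])  # swap history.append for wandb.log
print(history[-1])  # → {'round': 2, 'train-rmse': 0.3333333333333333}
```

The integration below packages this same pattern up so you don't have to write the loop plumbing yourself.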
I built a callback called wandb_callback that you can pass into your XGBoost train function. For example:
import xgboost as xgb
from wandb.xgboost import wandb_callback

# save the param_list as hyperparameter inputs
# log metrics as the model trains
bst = xgb.train(param_list, d_train, callbacks=[wandb_callback()])
The W&B integration made it quick and easy to monitor training metrics in a live dashboard.
Want to see the XGBoost integration in action? Try a quick Google Colab example →
We're building lightweight, flexible experiment tracking tools for deep learning. Add a couple of lines to your Python script, and we'll keep track of your hyperparameters and output metrics, making it easy to compare runs and see the whole history of your progress. Think of us like GitHub for deep learning.