Nick Bardy

Visualizing 3D Bounding Boxes

When I started working with self-driving datasets, I grew frustrated with how challenging it was to understand the results of my work. A single 3D visualization can reveal insights that you'll never find in a terminal full of metrics. I wanted to make it easy to render these visuals with any project and dataset.

Now, with just a few lines of code, you can log your data, view the 3D visualization, and share it with a link.

 "point_clouds_with_bb": wandb.Object3D({
     "type": "lidar/beta",
     "points": points_rgb,
     "boxes": boxes

See the live W&B report →

I tried this W&B logging on the new Lyft self-driving dataset. Using their baseline clustering algorithm, I compared its results with the ground-truth labels and immediately found some interesting results.

(Red: Prediction, Green: Ground Truth)

The clustering algorithm struggled to infer car orientation. A lidar sensor only captures points from the side of an object facing it, so point density is high near the sensor and drops off on the far side. As a result, the bounding boxes end up skewed toward the surface facing the sensor.

This reveals issues not just with the model, but with the dataset itself. Above is one of many examples of possibly mislabeled data.

Try it yourself! I'm excited to see what you come up with.

Play with the results →

See the docs →



Weights & Biases

We're building lightweight, flexible experiment tracking tools for deep learning. Add a couple of lines to your Python script, and we'll keep track of your hyperparameters and output metrics, making it easy to compare runs and see the whole history of your progress. Think of us like GitHub for deep learning.

Partner Program

We are building our library of deep learning articles, and we're delighted to feature the work of community members. Contact Lavanya to learn about opportunities to share your research and insights.

Try our free tools for experiment tracking →