Boris Dayma from Houston, TX was one of the champions in our summer colorizer competition. He developed a neural network to take black and white images and turn them into beautiful, full-color renderings. Take a moment to compare the black and white and color images below.
How could you predict what colors each flower would be? To do this by hand you would need to research each flower and make educated guesses on the palette and arrangement of the bouquet. When black and white films are colorized, artists painstakingly imagine the colors for each frame and paint the color individually, by hand. We challenged researchers to colorize black and white photos of flowers with neural networks, and our own results weren't great.
Defining a good loss function for a colorizer is hard: when the true color is ambiguous, the easiest way for a model to minimize the distance between its prediction and the correct color is to guess the average of all the plausible colors, which ends up being a muddy brown.
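To see why a distance-based loss pushes predictions toward the middle, here's a minimal sketch. The palette below is a made-up set of equally plausible flower colors; the single RGB guess that minimizes mean squared error against all of them is simply their average, a desaturated, muddy color rather than any of the real ones.

```python
import numpy as np

# Hypothetical palette of equally plausible flower colors (RGB, 0-255).
colors = np.array([
    [200, 30, 40],    # red
    [230, 200, 50],   # yellow
    [60, 80, 200],    # blue
    [150, 60, 180],   # violet
], dtype=float)

# The single prediction minimizing mean squared error over these
# targets is their mean -- a muddy, desaturated color.
best_l2_guess = colors.mean(axis=0)
print(best_l2_guess)  # [160.   92.5 117.5]
```

This is the core failure mode of a naive colorizer: it isn't wrong about any one flower so much as it hedges across all of them.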
Before he left for a two-week vacation in Brazil, Boris printed out a stack of published papers on colorizers. He leafed through them on the plane, read through implementations on the beach, and formulated a concept for how to approach the problem, so that when he got back to the US he could hit the ground running.
He carefully tracked his model training process, using real-time loss curves from Weights & Biases to identify outliers and cut off training runs early when they weren't performing well.
The black and white images are in the RGB color space by default, so Boris converted his images to the YCrCb space. This makes one of the dimensions just the brightness of the image, simplifying the problem to predicting only the Cr and Cb channels. He built his own architecture, inspired by U-Net, MobileNets, and ResNet for image segmentation. Boris also cleaned the training data, found more images of flowers to fill out the training set, and did some data augmentation: random cropping and a vertical flip.
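The point of the color-space change is that the Y (luminance) plane is exactly the black-and-white input, so the network only has to predict the two chroma planes. Here's a minimal NumPy sketch of the conversion using the ITU-R BT.601 coefficients (the convention OpenCV's `cv2.COLOR_RGB2YCrCb` uses); the function name is ours, not from Boris's code.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """Convert an HxWx3 RGB image (0-255 floats) to YCrCb
    using ITU-R BT.601 coefficients."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: the B&W input itself
    cr = (r - y) * 0.713 + 128              # red-difference chroma
    cb = (b - y) * 0.564 + 128              # blue-difference chroma
    return np.stack([y, cr, cb], axis=-1)

# A pure white pixel: full brightness, neutral chroma (128, 128).
white = np.full((1, 1, 3), 255.0)
print(rgb_to_ycrcb(white))  # [[[255. 128. 128.]]]
```

Since neutral grays map to chroma values of exactly 128, the model starts from "no color" and only has to learn the offsets, a much smaller target than three full RGB channels.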
His results were far better than the sepia outputs we were getting with our naive model. You can learn more about his process and results in his Weights & Biases project. Check out a sample of his results: he was able to train the model to accurately colorize this thistle flower purple and the background grass green, no easy feat!
We were delighted by his results and flew Boris out to meet the team and take a ride with Shivon Zilis. He spent the afternoon eating ice cream and experiencing the newest version of Tesla's autopilot features!
Do you want the glory, the prestige, and the free ice cream? Email us at firstname.lastname@example.org to hear about our next epic challenge!
We're building lightweight, flexible experiment tracking tools for deep learning. Add a couple of lines to your Python script, and we'll keep track of your hyperparameters and output metrics, making it easy to compare runs and see the whole history of your progress. Think of us as GitHub for deep learning.
We are building our library of deep learning articles, and we're delighted to feature the work of community members. Contact Carey to learn about opportunities to share your research and insights.