Aug 7, 2022
After tinkering with Stable Diffusion for a bit, I recalled seeing a couple of Imagen and Midjourney prompts for The Great Wave off Kanagawa by Vincent van Gogh, and wondered how Stable Diffusion would do at generating famous paintings by alternate artists. So I decided to give it a try and post some of the best results.
Jul 14, 2022
In this post, I explain what Attention Pooling is and how it works. I experiment with Touvron et al.'s Learned Aggregation on several small datasets and modestly improve upon Learned Aggregation's results with a few tweaks. I also experiment with hybrid pooling layers that combine Average and Attention Pooling and increase performance in the small dataset regime. However, all of these results still lag behind the performance of Average Pooling.
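For readers unfamiliar with the idea, here is a minimal PyTorch sketch of an attention pooling head in the spirit of Learned Aggregation: a single learned query attends over all tokens to produce one pooled vector. This is illustrative only, not the post's exact implementation; the class name and hyperparameters are my own.

```python
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    "Pool a token sequence into one vector via a single learned query (a sketch, not the post's code)."
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        # Learned query that attends over all tokens, playing the role of a CLS token
        self.cls_query = nn.Parameter(torch.randn(1, 1, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                               # x: (batch, tokens, dim)
        q = self.cls_query.expand(x.shape[0], -1, -1)   # broadcast query per batch item
        x = self.norm(x)
        pooled, _ = self.attn(q, x, x)                  # query attends over all tokens
        return pooled.squeeze(1)                        # (batch, dim)
        # A hybrid variant, in the spirit of the post, might combine this
        # output with plain average pooling, e.g. x.mean(dim=1).
```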
Jun 14, 2022
Over the past week, Thomas Capelle and I discovered, debugged, and created a workaround for a performance bug in PyTorch that reduced GPU throughput in image training by up to forty percent when using fastai. The culprit? Subclassed tensors.
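To make the culprit concrete, here is a minimal sketch of a subclassed tensor (fastai's TensorBase is one such subclass); the name SubTensor is my own. Operations on a subclass dispatch through the `__torch_function__` protocol, which is where per-op Python overhead can accumulate.

```python
import torch

class SubTensor(torch.Tensor):
    # A bare torch.Tensor subclass, similar in spirit to fastai's TensorBase.
    # Every operation on it is routed through the __torch_function__ protocol,
    # adding Python-side dispatch work on top of the underlying kernel call.
    pass

x = torch.randn(8, 3, 224, 224).as_subclass(SubTensor)
y = x * 2          # dispatched via __torch_function__, not plain torch.Tensor ops
print(type(y))     # the subclass is preserved through operations, at a dispatch cost
```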
Jun 6, 2022
Fastxtend is a collection of tools, extensions, and add-ons for fastai. In this post, I highlight some of fastxtend's current best features.
Mar 11, 2022
In this post, I give an overview of my solution, explore some alternate solutions that didn't perform as well, and briefly show how to customize fastai to work on a new dataset.
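As a taste of that customization, here is a minimal fastai sketch for pointing the library at a new image classification dataset. The folder layout (`train/` with one subfolder per class), batch size, and architecture are hypothetical placeholders, not the post's actual setup.

```python
from pathlib import Path
from fastai.vision.all import *

# Hypothetical layout: images live under train/<class_name>/<image>.jpg
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),      # inputs are images, targets are class labels
    get_items=get_image_files,               # collect all image paths under the source folder
    get_y=parent_label,                      # label each image by its parent folder name
    splitter=RandomSplitter(valid_pct=0.2),  # hold out 20% of items for validation
    item_tfms=Resize(224),                   # resize each image before batching
)
dls = dblock.dataloaders(Path('train'), bs=64)
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(5)                           # fine-tune a pretrained model for 5 epochs
```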
Dec 8, 2021
SageMaker is a strong contender for those starting out in deep learning and almost a straight upgrade from the free version of Colab. Compared to the P100 offered by Kaggle and Colab Pro, SageMaker's T4 can be faster in mixed precision training, but significantly slower in single precision training.