We investigate how fundamental properties of the underlying dataset affect the convergence of GANs. We show that convergence can be described in terms of the dataset's connectivity properties and use this insight to explain various practical heuristics.
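As a rough illustration, one simple proxy for dataset connectivity (an assumption for this sketch, not necessarily the definition used in the work) is the number of connected components of a k-nearest-neighbor graph built over the data:

```python
# Hypothetical sketch: probe dataset "connectivity" via a k-NN graph.
# The connectivity measure here is assumed for illustration only.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import connected_components

def knn_connectivity(data: np.ndarray, k: int = 10):
    """Number of connected components of the symmetrized k-NN graph."""
    graph = kneighbors_graph(data, n_neighbors=k, mode="connectivity")
    graph = graph.maximum(graph.T)  # make the graph undirected
    n_components, labels = connected_components(graph, directed=False)
    return n_components, labels

# Example: two well-separated Gaussian blobs form two components for small k.
rng = np.random.default_rng(0)
blobs = np.vstack([rng.normal(0.0, 1.0, (500, 2)),
                   rng.normal(20.0, 1.0, (500, 2))])
print(knn_connectivity(blobs, k=10)[0])  # likely prints 2
```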
A series of works investigating the potential of state-of-the-art GANs to manipulate natural images. We have developed several unsupervised methods that exploit pretrained GANs for advanced semantic editing and object segmentation.
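A minimal sketch of the common latent-editing recipe behind such methods, assuming a placeholder pretrained generator `G: z -> image` and an already-discovered semantic `direction` (both hypothetical here, not the works' actual code):

```python
# Sketch: edit an image by shifting its latent code along a semantic direction.
import torch

@torch.no_grad()
def edit_image(generator, z: torch.Tensor, direction: torch.Tensor,
               strength: float = 3.0) -> torch.Tensor:
    """Shift a latent code along a semantic direction and re-generate."""
    direction = direction / direction.norm()  # unit-norm edit direction
    z_edit = z + strength * direction         # move in latent space
    return generator(z_edit)                  # edited image

# Usage (assuming a StyleGAN-like generator with 512-dim latents):
# z = torch.randn(1, 512)
# edited = edit_image(generator, z, direction, strength=3.0)
```

Varying `strength` typically controls how pronounced the edit is, while the choice of `direction` determines which semantic attribute changes.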
To encourage future development of scalable similarity search algorithms, we release two billion-scale datasets that can serve as representative benchmarks for machine learning and algorithms researchers working on efficient similarity search.
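For reference, a minimal sketch of the Recall@k metric such benchmarks typically report, assuming the approximate results and exact ground-truth neighbors are given as integer ID arrays (the array names are placeholders):

```python
# Sketch of the Recall@k metric commonly used for similarity search benchmarks.
# `approx_ids` and `true_ids` are assumed (n_queries, k) arrays of neighbor IDs
# returned by the tested algorithm and by exact search, respectively.
import numpy as np

def recall_at_k(approx_ids: np.ndarray, true_ids: np.ndarray, k: int) -> float:
    """Fraction of the true top-k neighbors recovered by the approximate search."""
    hits = 0
    for approx, true in zip(approx_ids[:, :k], true_ids[:, :k]):
        hits += len(np.intersect1d(approx, true))
    return hits / (k * len(true_ids))

# Example: perfect results give recall 1.0.
# gt = np.arange(10 * 5).reshape(10, 5)
# print(recall_at_k(gt, gt, k=5))  # -> 1.0
```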
A library to train large neural networks across the internet. Imagine training one huge transformer on thousands of computers from universities, companies, and volunteers.
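A conceptual sketch of the core idea (not the library's actual API): each peer trains locally on its own data and periodically averages its parameters with other peers, so no single server has to host the whole job.

```python
# Toy decentralized averaging round among peer replicas of the same model.
import copy
import torch

def average_parameters(peer_models):
    """In-place average of parameters across a group of peer models."""
    with torch.no_grad():
        for params in zip(*(m.parameters() for m in peer_models)):
            mean = torch.stack([p.data for p in params]).mean(dim=0)
            for p in params:
                p.data.copy_(mean)

# Usage sketch: peers share one architecture and sync after local steps.
# peers = [copy.deepcopy(base_model) for _ in range(4)]
# ... each peer runs a few local optimizer steps on its own shard of data ...
# average_parameters(peers)  # one decentralized averaging round
```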