Papers accepted to ICLR 2022

February 2, 2022

We are happy to announce that three papers have been accepted for publication at the Tenth International Conference on Learning Representations (ICLR 2022).

Label-Efficient Semantic Segmentation with Diffusion Models by Dmitry Baranchuk, Andrey Voynov, Ivan Rubachev, Valentin Khrulkov, Artem Babenko

The paper investigates the representations learned by state-of-the-art denoising diffusion probabilistic models (DDPMs) and shows that they capture high-level semantic information valuable for downstream vision tasks. We design a simple semantic segmentation approach that exploits these representations and outperforms the alternatives in the few-shot regime.
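The idea of reusing per-pixel diffusion features for segmentation can be illustrated with a minimal sketch. This is not the authors' pipeline (which extracts intermediate UNet activations at chosen diffusion timesteps and trains an ensemble of MLP classifiers); here `pixel_features` is a hypothetical stand-in feature extractor, and a nearest-centroid classifier plays the role of the pixel-wise classifier:

```python
import numpy as np

def pixel_features(image):
    # Hypothetical stand-in: in the paper's setting these would be DDPM UNet
    # activations for each pixel; here we just pass the raw channels through.
    return image  # shape (H, W, C)

def fit_pixel_classifier(features, labels, num_classes):
    """Fit one centroid per class over per-pixel feature vectors."""
    flat_f = features.reshape(-1, features.shape[-1])
    flat_y = labels.reshape(-1)
    return np.stack([flat_f[flat_y == c].mean(axis=0)
                     for c in range(num_classes)])

def predict(features, centroids):
    """Assign each pixel to the class with the nearest centroid."""
    flat = features.reshape(-1, features.shape[-1])
    dists = np.linalg.norm(flat[:, None, :] - centroids[None, :, :], axis=-1)
    return dists.argmin(axis=1).reshape(features.shape[:-1])
```

With a few labeled images, the same recipe applies: stack per-pixel features and labels across the labeled set, fit the classifier once, then predict dense masks for unlabeled images.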

When, Why, and Which Pretrained GANs Are Useful? by Timofey Grigoryev, Andrey Voynov, Artem Babenko 

This paper investigates knowledge transferability between GANs trained on different domains. We show that, in general, the recall of the initializing model is what matters most when fine-tuning on a new domain. The work also builds intuition for how GANs transfer from one domain to another, supported by a wide range of experiments with StyleGAN2.

Graph-based Nearest Neighbor Search in Hyperbolic Spaces by Liudmila Prokhorenkova, Dmitry Baranchuk, Nikolay Bogachev, Yury Demidovich, Alexander Kolpakov

Methods based on best-first routing over similarity graphs are known to achieve state-of-the-art performance for nearest neighbor search (NNS). We analyze the applicability of these methods to data lying in hyperbolic spaces. From a theoretical perspective, we rigorously analyze the time and space complexity of graph-based NNS, assuming that the dataset is uniformly distributed within a hyperbolic ball. Interestingly, under some assumptions on dimension and curvature, graph-based NNS has lower time complexity in the hyperbolic space than in the Euclidean space. From a practical perspective, we illustrate this result on word embedding data: it turns out that for the same corpus, graph-based NNS is more efficient in the hyperbolic space.
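The best-first routing underlying these methods can be sketched in a few lines. This is a minimal Euclidean toy (naive k-NN graph, greedy descent toward the query), not the paper's hyperbolic construction; to move to hyperbolic space one would swap `euclidean` for the hyperbolic distance on the ball model:

```python
import math

def build_knn_graph(points, k, dist):
    """Naive similarity graph: each node links to its k nearest neighbors."""
    graph = {}
    for i, p in enumerate(points):
        neighbors = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: dist(p, points[j]),
        )[:k]
        graph[i] = neighbors
    return graph

def best_first_search(points, graph, query, dist, start=0):
    """Greedy best-first routing: hop to the neighbor closest to the query
    until no neighbor improves on the current node (a local minimum)."""
    current = start
    current_d = dist(query, points[current])
    while True:
        improved = False
        for nb in graph[current]:
            d = dist(query, points[nb])
            if d < current_d:
                current, current_d = nb, d
                improved = True
        if not improved:
            return current, current_d

def euclidean(a, b):
    return math.dist(a, b)
```

Greedy routing may stop at a local minimum rather than the true nearest neighbor; production systems mitigate this with beam search over a priority queue and multi-layer graphs, but the single-hop greedy step above is the core primitive the paper's complexity analysis targets.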