Out-of-Distribution Detection with Memory-Augmented Variational Autoencoder

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

This paper proposes a novel method capable of both detecting out-of-distribution (OOD) data and generating in-distribution data samples. To achieve this, a variational autoencoder (VAE) is augmented with a memory module, providing the capacity both to identify OOD data and to synthesise new in-distribution samples. The VAE is trained on normal data, and the memory stores prototypical patterns of the normal data distribution. At test time, the input is encoded by the VAE encoder, and this encoding is used as a query to retrieve related memory items; the retrieved items are then integrated with the input encoding and passed to the decoder for reconstruction. Normal samples reconstruct well and yield low reconstruction errors, whereas OOD inputs produce high reconstruction errors because their encodings are replaced by retrieved normal patterns. Whereas prior work pairs memory modules with plain autoencoders for OOD detection, this method adopts a VAE architecture, which additionally enables sample generation. Experiments on the CIFAR-10 and MNIST datasets show that the memory-augmented VAE consistently outperforms the baseline, particularly when the OOD data resembles normal patterns; this improvement is attributable to the richer latent-space representation provided by the VAE. Overall, the memory-equipped VAE framework is effective at both identifying OOD inputs and generating new in-distribution examples.
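The retrieval-and-reconstruction pipeline described above can be sketched in PyTorch. This is a minimal illustrative sketch, not the paper's implementation: the layer sizes, the number of memory slots, the cosine-similarity attention used for retrieval, and the names `MemoryAugmentedVAE` and `ood_score` are all assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryAugmentedVAE(nn.Module):
    """Sketch of a VAE whose latent code is replaced by an attention-weighted
    combination of learned memory items storing prototypical normal patterns.
    All dimensions and module choices here are illustrative assumptions."""

    def __init__(self, input_dim=784, latent_dim=32, num_memory_items=100):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU())
        self.fc_mu = nn.Linear(256, latent_dim)
        self.fc_logvar = nn.Linear(256, latent_dim)
        # Memory bank of prototypical normal patterns, learned during training.
        self.memory = nn.Parameter(torch.randn(num_memory_items, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, input_dim), nn.Sigmoid(),
        )

    def retrieve(self, z):
        # Use the encoding as a query: cosine similarity against each memory
        # item, softmax into attention weights, weighted sum of memory items.
        sim = F.normalize(z, dim=1) @ F.normalize(self.memory, dim=1).t()
        attn = F.softmax(sim, dim=1)
        return attn @ self.memory

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterisation trick: sample z from N(mu, sigma^2).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Replace the encoding with its memory readout before decoding, so
        # OOD inputs are reconstructed from normal prototypes only.
        z_hat = self.retrieve(z)
        return self.decoder(z_hat), mu, logvar


def ood_score(model, x):
    """Per-sample mean squared reconstruction error; higher means more OOD."""
    x_rec, _, _ = model(x)
    return ((x - x_rec) ** 2).mean(dim=1)
```

In this sketch an OOD input still receives a latent code, but the decoder only ever sees convex combinations of memory items, so the reconstruction is pulled toward normal patterns and the error grows — which is the detection signal the abstract describes.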

Original language: English
Article number: 3153
Journal: Mathematics
Volume: 12
Issue number: 19
DOIs
State: Published - Oct 2024

Keywords

  • PyTorch
  • deep learning
  • integrated external memory
  • machine learning
  • memory-augmentation
  • out-of-distribution detection
  • variational autoencoder

