Integrating Predictive and Generative Capabilities by Latent Space Design via the DKL-VAE Model

Published: March 4, 2025 | arXiv ID: 2503.02978v1

By: Boris N. Slautin, Utkarsh Pratiush, Doru C. Lupascu, and more

Potential Business Impact:

Enables high-throughput discovery of new materials and molecules by generating novel structures with targeted properties.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

We introduce a Deep Kernel Learning Variational Autoencoder (VAE-DKL) framework that integrates the generative power of a Variational Autoencoder (VAE) with the predictive nature of Deep Kernel Learning (DKL). The VAE learns a latent representation of high-dimensional data, enabling the generation of novel structures, while DKL refines this latent space by structuring it in alignment with target properties through Gaussian Process (GP) regression. This approach preserves the generative capabilities of the VAE while enhancing its latent space for GP-based property prediction. We evaluate the framework on two datasets: a structured card dataset with predefined variational factors and the QM9 molecular dataset, where enthalpy serves as the target function for optimization. The model demonstrates high-precision property prediction and enables the generation of novel structures beyond the training subset with desired characteristics. The VAE-DKL framework offers a promising approach for high-throughput material discovery and molecular design, balancing structured latent space organization with generative flexibility.
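
The abstract describes a two-part architecture: a VAE that compresses structures into a latent code, and a DKL Gaussian-process head that regresses the target property (e.g., enthalpy) on that code so the GP's training signal also shapes the latent space. The sketch below, in PyTorch/GPyTorch, shows one way such a joint model could be wired up; the layer sizes, module names (`VAE`, `DKLGP`), and equal loss weighting are illustrative assumptions, not the authors' implementation.

```python
# Minimal VAE-DKL-style sketch: a VAE encodes inputs into a latent code z, and a
# GP (Deep Kernel Learning head) regresses the target property on z. Because the
# GP marginal likelihood is backpropagated through the encoder, the latent space
# is organized with respect to the target property. Dimensions, architecture,
# and loss weights are assumptions for illustration only.
import torch
import torch.nn as nn
import gpytorch


class VAE(nn.Module):
    def __init__(self, input_dim=128, latent_dim=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU())
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                 nn.Linear(64, input_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        return self.dec(z), mu, logvar, z


class DKLGP(gpytorch.models.ExactGP):
    """Exact GP over the VAE latent space (the encoder acts as the deep kernel)."""
    def __init__(self, train_z, train_y, likelihood):
        super().__init__(train_z, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, z):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z))


# --- one joint training step on a toy batch ---
x = torch.randn(32, 128)   # toy "structures"
y = torch.randn(32)        # toy target property (e.g., enthalpy)

vae = VAE()
recon, mu, logvar, z = vae(x)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
gp = DKLGP(z, y, likelihood)   # GP conditioned on the current latent codes
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, gp)

recon_loss = nn.functional.mse_loss(recon, x)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
gp_loss = -mll(gp(z), y)       # gradients flow back through the encoder via z
loss = recon_loss + kl + gp_loss   # relative weights are a free hyperparameter choice
loss.backward()
```

In a full training loop this step would be repeated over minibatches with the loss weights tuned; generation of candidates with desired properties would then amount to decoding latent points selected via the GP's predictions, consistent with the workflow the abstract describes.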

Country of Origin
🇺🇸 United States

Page Count
25 pages

Category
Computer Science:
Machine Learning (CS)