TerraFusion: Joint Generation of Terrain Geometry and Texture Using Latent Diffusion Models
By: Kazuki Higo, Toshiki Kanai, Yuki Endo, et al.
Potential Business Impact:
Generates realistic 3D terrain, both geometry and texture, from hand-drawn sketches.
3D terrain models are essential in fields such as video game development and film production. Since surface color often correlates with terrain geometry, capturing this relationship is crucial to achieving realism. However, most existing methods generate either a heightmap or a texture, without sufficiently accounting for the inherent correlation. In this paper, we propose a method that jointly generates terrain heightmaps and textures using a latent diffusion model. First, we train the model in an unsupervised manner to randomly generate paired heightmaps and textures. Then, we perform supervised learning of an external adapter to enable user control via hand-drawn sketches. Experiments show that our approach allows intuitive terrain generation while preserving the correlation between heightmaps and textures.
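The key idea above is that the heightmap and the texture live in a single joint latent, so one reverse-diffusion pass denoises both modalities together and any learned correlation between them is preserved. A minimal sketch of that sampling loop, assuming a toy 4-channel latent (1 heightmap channel + 3 RGB texture channels), a standard DDPM linear noise schedule, and a stub in place of the paper's trained denoiser (all names and shapes here are illustrative, not the authors' implementation):

```python
import numpy as np

H = W = 16          # toy latent resolution
C = 4               # 1 heightmap channel + 3 texture (RGB) channels
T = 50              # number of diffusion steps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule (assumption)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def toy_denoiser(z, t):
    """Stand-in for the trained network: predicts the noise in z at step t.
    Here it returns zeros, so sampling simply contracts toward the mean."""
    return np.zeros_like(z)

def sample():
    # Start both modalities from the same Gaussian latent and
    # denoise them jointly, step by step.
    z = rng.standard_normal((C, H, W))
    for t in reversed(range(T)):
        eps = toy_denoiser(z, t)
        # DDPM posterior mean under the epsilon-parameterisation
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        z = (z - coef * eps) / np.sqrt(alphas[t])
        if t > 0:
            z = z + np.sqrt(betas[t]) * rng.standard_normal(z.shape)
    heightmap, texture = z[0], z[1:]   # split the joint latent at the end
    return heightmap, texture

heightmap, texture = sample()
print(heightmap.shape, texture.shape)   # (16, 16) (3, 16, 16)
```

The sketch-based control described in the paper would sit outside this loop: an external adapter (trained with supervision) injects features derived from the user's sketch into the denoiser at each step, while the joint latent itself stays unchanged.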
Similar Papers
MESA: Text-Driven Terrain Generation Using Latent Diffusion and Global Copernicus Data
Graphics
Creates realistic landscapes from text descriptions.
3D-LATTE: Latent Space 3D Editing from Textual Instructions
Graphics
Changes 3D shapes with text instructions.