3D Gaussian Splatting Texture Editing via Single Modified Image

Citation
Hanul Baek, Dohae Lee, Kyumin Kim, and In-Kwon Lee, “3D Gaussian Splatting Texture Editing via Single Modified Image,” IEEE Transactions on Visualization and Computer Graphics (to appear), 2026.
Abstract
Recently, 3D Gaussian Splatting (3DGS) has emerged as a powerful technique for reconstructing high-quality, photorealistic 3D representations of real-world scenes. However, editing 3DGS remains more challenging than editing mesh-based representations, since it lacks explicit surface geometry and requires view-consistent updates across Gaussians. Previous approaches have relied primarily on text-driven generative models for 3D Gaussian editing, limiting direct control over the specific visual appearance of an edit. In this study, we propose a texture editing framework for 3DGS that leverages only a single user-modified image. Our approach maintains coherence across multiple views while accounting for the corresponding lighting adjustments in the edited region. We introduce three key techniques to achieve this: (1) aligned edit propagation, which transfers local edits from the reference view to other viewpoints; (2) mask-based filtering, which restricts modifications to relevant Gaussians and prevents unintended changes; and (3) opacity-based selection, which identifies the Gaussians with the most significant visual impact on the edited texture. Through both qualitative and quantitative evaluations on synthetic and real-world datasets, we demonstrate that our method achieves more precise and spatially controllable 3DGS editing than existing techniques. We expect these techniques to pave the way for more intuitive 3D Gaussian Splatting editing pipelines and to inspire future research.
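To illustrate the flavor of techniques (2) and (3) above, here is a minimal NumPy sketch of restricting an edit to relevant Gaussians. It is not the paper's implementation: the function name, the projection shortcut (testing each Gaussian's projected center against the 2D edit mask rather than its full image-space footprint), and the opacity threshold are all illustrative assumptions.

```python
import numpy as np

def select_editable_gaussians(centers_2d, opacities, edit_mask,
                              opacity_thresh=0.1):
    """Hypothetical sketch of mask-based filtering + opacity-based selection.

    centers_2d: (N, 2) projected Gaussian centers in (x, y) pixel coords
    opacities:  (N,)   per-Gaussian opacity in [0, 1]
    edit_mask:  (H, W) boolean mask of the user-edited image region
    Returns indices of Gaussians that both land inside the edited region
    and are opaque enough to visibly affect the edited texture.
    """
    h, w = edit_mask.shape
    xs = np.clip(np.round(centers_2d[:, 0]).astype(int), 0, w - 1)
    ys = np.clip(np.round(centers_2d[:, 1]).astype(int), 0, h - 1)
    in_mask = edit_mask[ys, xs]            # mask-based filtering
    visible = opacities > opacity_thresh   # opacity-based selection
    return np.nonzero(in_mask & visible)[0]
```

In a full pipeline, the returned index set would gate which Gaussians receive appearance updates during optimization, leaving all other Gaussians untouched; a real implementation would also account for each Gaussian's projected extent and its accumulated alpha contribution along each ray rather than a single center-pixel test.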