Modeling Indirect Illumination for Inverse Rendering

CVPR 2022


Yuanqing Zhang1,2, Jiaming Sun1, Xingyi He1, Huan Fu2, Rongfei Jia2, Xiaowei Zhou1

1State Key Lab of CAD & CG, Zhejiang University    2Tao Technology Department, Alibaba Group

Abstract


Recent advances in implicit neural representations and differentiable rendering make it possible to simultaneously recover the geometry and materials of an object from multi-view RGB images captured under unknown static illumination. Despite the promising results achieved, indirect illumination is rarely modeled in previous methods, as it requires expensive recursive path tracing that makes inverse rendering computationally intractable. In this paper, we propose a novel approach to efficiently recover spatially-varying indirect illumination. The key insight is that the indirect illumination can be conveniently derived from the neural radiance field learned from the input images, instead of being estimated jointly with the direct illumination and materials. By properly modeling the indirect illumination and the visibility of direct illumination, an interreflection- and shadow-free albedo can be recovered. Experiments on both synthetic and real data demonstrate the superior performance of our approach compared to previous work, as well as its capability to synthesize realistic renderings under novel viewpoints and illumination.
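
The key idea lends itself to a simple sketch: once a radiance field has been fit to the input images, the indirect incoming radiance at a surface point can be obtained by volume-rendering secondary rays through that field, rather than by recursive path tracing. The PyTorch-style sketch below illustrates this; the radiance_field callable, the uniform hemisphere sampling, and all sample counts are hypothetical placeholders for illustration, not the released implementation.

# Conceptual sketch (not the authors' code): approximate the indirect incoming
# radiance at a surface point x by querying a pre-trained radiance field along
# secondary rays, instead of recursive path tracing.
# `radiance_field(points, view_dirs) -> (rgb, sigma)` is a hypothetical stand-in
# for a trained NeRF-style model.

import torch

def sample_hemisphere(normal, n_dirs):
    # Uniformly sample directions on the hemisphere around `normal` (shape (3,)).
    dirs = torch.randn(n_dirs, 3)
    dirs = dirs / dirs.norm(dim=-1, keepdim=True)
    below = (dirs @ normal) < 0          # flip directions below the surface
    dirs[below] = -dirs[below]
    return dirs

def indirect_radiance(radiance_field, x, normal, n_dirs=64, n_samples=64,
                      near=0.05, far=2.0):
    # Incoming radiance L_i(x, w_i) along each secondary direction w_i,
    # obtained by standard volume rendering through the learned field.
    dirs = sample_hemisphere(normal, n_dirs)                     # (D, 3)
    t = torch.linspace(near, far, n_samples)                     # (S,)
    pts = x + dirs[:, None, :] * t[None, :, None]                # (D, S, 3)
    view = dirs[:, None, :].expand(-1, n_samples, -1)            # same convention as primary rays
    rgb, sigma = radiance_field(pts.reshape(-1, 3), view.reshape(-1, 3))
    rgb = rgb.reshape(n_dirs, n_samples, 3)
    sigma = sigma.reshape(n_dirs, n_samples)
    delta = t[1] - t[0]
    alpha = 1.0 - torch.exp(-sigma * delta)                      # (D, S)
    trans = torch.cumprod(torch.cat(
        [torch.ones(n_dirs, 1), 1.0 - alpha + 1e-10], dim=-1), dim=-1)[:, :-1]
    weights = alpha * trans                                      # (D, S)
    return (weights[..., None] * rgb).sum(dim=1), dirs           # (D, 3), (D, 3)

In such a pipeline, the per-direction radiances and directions returned above would then be integrated against the surface BRDF and the cosine foreshortening term to form the indirect part of the rendering equation at x, avoiding any recursive bounce evaluation.
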


Overview video



Citation



@inproceedings{zhang2022invrender,
  title={Modeling Indirect Illumination for Inverse Rendering},
  author={Zhang, Yuanqing and Sun, Jiaming and He, Xingyi and Fu, Huan and Jia, Rongfei and Zhou, Xiaowei},
  booktitle={CVPR},
  year={2022}
}