SurfaceNet+ (2020 TPAMI) is a volumetric learning framework for very sparse MVS. The Sparse-MVS benchmark is maintained in this repository. Authors: Mengqi Ji#, Jinzhi Zhang#, Qionghai Dai, Lu Fang.

SurfaceNet+

Sparse-MVS Benchmark

(1) Sparse-MVS of the DTU dataset

Fig.1: Illustration of a very sparse MVS setting using only 1/7 of the camera views, i.e., {v_i}_{i=1,8,15,22,...}, to recover model 23 in the DTU dataset [10]. Compared with the state-of-the-art methods, the proposed SurfaceNet+ provides a much more complete reconstruction, especially around the border regions captured by very sparse views.

Fig.2: Comparison with existing methods on the DTU dataset [10] under different sparse sampling strategies. When Sparsity = 3 and Batchsize = 2, the chosen camera indices are 1,2 / 4,5 / 7,8 / 10,11 / .... SurfaceNet+ consistently outperforms the state-of-the-art methods in all settings, especially in the very sparse scenarios.
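The Sparsity/Batchsize sampling above can be sketched as a small helper. This is an illustrative reconstruction from the example indices in the caption, not code from this repository; the function name `sample_views` and the 1-indexed convention are assumptions.

```python
def sample_views(num_views, sparsity, batchsize):
    """Sketch of the sparse view-sampling pattern (an assumption, not repo code).

    Starting a new batch every `sparsity` views (1-indexed), keep `batchsize`
    consecutive views per batch. Sparsity=3, Batchsize=2 yields 1,2 / 4,5 / 7,8 / ...;
    Sparsity=7, Batchsize=1 yields the 1/7 setting of Fig.1: 1, 8, 15, 22, ...
    """
    chosen = []
    for start in range(1, num_views + 1, sparsity):
        # keep `batchsize` consecutive views, clipped to the valid range
        chosen.extend(v for v in range(start, start + batchsize) if v <= num_views)
    return chosen
```

For example, `sample_views(12, 3, 2)` reproduces the caption's indices 1,2 / 4,5 / 7,8 / 10,11, and `sample_views(49, 7, 1)` gives the every-seventh-view pattern of Fig.1.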

(2) Sparse-MVS of the T&T dataset

Fig.3: Results on a tank model from the Tanks and Temples 'intermediate' set [23], compared with R-MVSNet [7] and COLMAP [9], demonstrating SurfaceNet+'s high-recall prediction in the sparse-MVS setting.

Citing

If you find SurfaceNet+, the Sparse-MVS benchmark, or SurfaceNet useful in your research, please consider citing:

@article{ji2020surfacenet_plus,
    title={SurfaceNet+: An End-to-end 3D Neural Network for Very Sparse Multi-view Stereopsis},
    author={Ji, Mengqi and Zhang, Jinzhi and Dai, Qionghai and Fang, Lu},
    journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
    year={2020},
    publisher={IEEE}
}

@inproceedings{ji2017surfacenet,
    title={SurfaceNet: An End-To-End 3D Neural Network for Multiview Stereopsis},
    author={Ji, Mengqi and Gall, Juergen and Zheng, Haitian and Liu, Yebin and Fang, Lu},
    booktitle={Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
    pages={2307--2315},
    year={2017}
}