
TactileDreamFusion

Project | Paper

teaser gif

3D content creation with touch: TactileDreamFusion integrates high-resolution tactile sensing with diffusion-based image priors to enhance fine geometric details for text- or image-to-3D generation. The following results are rendered using Blender, with full-color rendering on the top and normal rendering at the bottom.

Tactile DreamFusion: Exploiting Tactile Sensing for 3D Generation
Ruihan Gao, Kangle Deng, Gengshan Yang, Wenzhen Yuan, Jun-Yan Zhu
Carnegie Mellon University
NeurIPS 2024

Results

The following results are rendered using blender-render-toolkit.

Same Object with Diverse Textures

We show diverse textures synthesized on the same object, which facilitates the custom design of 3D assets.

application gif

Single Texture Generation

We show 3D generation with a single texture. Our method generates realistic and coherent visual textures and geometric details.

singleTexture gif

Multi-Part Texture Generation

This grid demonstrates different render types for each object: predicted label map, albedo, normal map, zoomed-in normal patch, and full-color rendering.

multiPart gif

Getting Started

Environment setup

Our environment has been tested on Linux with Python 3.10.13, PyTorch 2.2.1, and CUDA 12.1.

git clone https://github.com/RuihanGao/TactileDreamFusion.git
cd TactileDreamFusion
conda create -n TDF python=3.10
conda activate TDF
pip install torch==2.2.1+cu121 torchvision==0.17.1 torchaudio==2.2.1 --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements.txt
# Set up the rendering toolkit on its dedicated branch, then return to the repo root.
git clone https://github.com/dunbar12138/blender-render-toolkit.git
cd blender-render-toolkit
git checkout tactileDreamfusion
cd ..
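
Before moving on, it can help to verify that the PyTorch build sees your GPU. This is an optional sanity check, not part of the official setup:

# Optional: confirm the CUDA-enabled PyTorch install works and a GPU is visible.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"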

Download dataset and TextureDreamBooth weights

bash scripts/download_hf_dataset.sh   # tactile texture dataset
bash scripts/download_hf_model.sh     # TextureDreamBooth weights
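
To confirm the downloads completed, you can list the fetched files. The directories below are assumptions for illustration; check the two download scripts for the actual paths they write to:

# Hypothetical paths; see scripts/download_hf_dataset.sh and
# scripts/download_hf_model.sh for the real download locations.
ls -lh data checkpoints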

Single texture generation

bash scripts/single_texture_generation.sh -train

Single texture generation takes about 10 minutes per mesh on a single A6000 GPU. After training, run the script again without the -train flag, then render the results with blender-render-toolkit:

bash scripts/single_texture_generation.sh
cd blender-render-toolkit
bash scripts/batch_blender_albedo.sh    # albedo renders
bash scripts/batch_blender_normal.sh    # normal-map renders
bash scripts/batch_blender.sh           # full renders
cd ..
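
If you are rendering many meshes, the three passes above can be chained in a single loop. This is just a convenience sketch around the same scripts:

# Run all three render passes back to back from the repo root.
cd blender-render-toolkit
for script in batch_blender_albedo.sh batch_blender_normal.sh batch_blender.sh; do
    bash "scripts/$script"
done
cd ..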

Multi-part texture generation

bash scripts/multi_part_texture_generation.sh -train

Multi-part texture generation takes about 15 minutes per mesh on a single A6000 GPU. After training, run the script again without the -train flag:

bash scripts/multi_part_texture_generation.sh
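
Rendering commands are not listed for the multi-part results; presumably the same blender-render-toolkit scripts from the single texture section apply. A sketch of the full multi-part flow under that assumption:

bash scripts/multi_part_texture_generation.sh -train   # train
bash scripts/multi_part_texture_generation.sh          # generate
cd blender-render-toolkit   # assumed: same rendering workflow as for single textures
bash scripts/batch_blender_albedo.sh
bash scripts/batch_blender_normal.sh
bash scripts/batch_blender.sh
cd ..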

Citation

If you find this repository useful for your research, please cite the following work.

@inproceedings{gao2024exploiting,
      title     = {Tactile DreamFusion: Exploiting Tactile Sensing for 3D Generation},
      author    = {Gao, Ruihan and Deng, Kangle and Yang, Gengshan and Yuan, Wenzhen and Zhu, Jun-Yan},
      booktitle = {Conference on Neural Information Processing Systems (NeurIPS)},
      year      = {2024},
}

Acknowledgements

We thank Sheng-Yu Wang, Nupur Kumari, Gaurav Parmar, Hung-Jui Huang, and Maxwell Jones for their helpful comments and discussions. We are also grateful to Arpit Agrawal and Sean Liu for proofreading the draft. Kangle Deng is supported by the Microsoft Research Ph.D. Fellowship. Ruihan Gao is supported by the A*STAR National Science Scholarship (Ph.D.).

Part of this codebase borrows from DreamGaussian and DiffSeg.