
Mask2Hand

PyTorch implementation of "Mask2Hand: Learning to Predict the 3D Hand Pose and Shape from Shadow"
by Li-Jen Chang, Yu-Cheng Liao, Chia-Hui Lin, Shih-Fang Yang-Mao, and Hwann-Tzong Chen
APSIPA ASC 2023
[arXiv] [Paper]

Environment Setup

# Clone the repository and enter it
git clone https://github.com/lijenchang/Mask2Hand.git
cd Mask2Hand

# Create and activate the conda environment (named "pytorch3d" in environment.yml)
conda env create -f environment.yml
conda activate pytorch3d

# Download the pretrained checkpoint
mkdir -p ./checkpoint
wget -O ./checkpoint/model_pretrained.pth https://www.dropbox.com/s/mujjj8ov5e8r9ok/model_pretrained.pth?dl=1

# Download and extract the FreiHAND dataset
mkdir -p ./dataset/freihand
cd dataset
wget https://lmb.informatik.uni-freiburg.de/data/freihand/FreiHAND_pub_v2.zip
unzip -q ./FreiHAND_pub_v2.zip -d ./freihand
cd ..
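
Before running anything heavier, a quick sanity check from a Python shell confirms that PyTorch, PyTorch3D, and CUDA are all visible in the activated environment (a minimal check, not part of the repository):

import torch
import pytorch3d

print("PyTorch:", torch.__version__)
print("PyTorch3D:", pytorch3d.__version__)
print("CUDA available:", torch.cuda.is_available())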

Run a Demo

CUDA_VISIBLE_DEVICES=0 python demo.py  
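
If the demo fails to load the checkpoint, a quick way to inspect the downloaded file is to open it directly with torch (a minimal sketch; it assumes the file is a torch-serialized object, possibly a plain or nested state dict, which may differ from how demo.py actually consumes it):

import torch

ckpt = torch.load('./checkpoint/model_pretrained.pth', map_location='cpu')
print(type(ckpt))
# If it is a (possibly nested) dict, list the top-level keys to see what it stores.
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])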

Evaluation

CUDA_VISIBLE_DEVICES=0 python test.py  
CUDA_VISIBLE_DEVICES=0 python test_iou.py  
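
As its name suggests, test_iou.py evaluates a silhouette IoU metric. For reference, the standard binary-mask IoU that such a metric is typically built on can be computed as below (an illustrative helper, not the repository's evaluation code):

import torch

def mask_iou(pred, gt, eps=1e-6):
    # Intersection-over-union between binary masks of shape (H, W) or (N, H, W).
    pred, gt = pred.bool(), gt.bool()
    inter = (pred & gt).float().sum(dim=(-2, -1))
    union = (pred | gt).float().sum(dim=(-2, -1))
    return inter / (union + eps)

# Example with two random 224 x 224 silhouettes
print(float(mask_iou(torch.rand(224, 224) > 0.5, torch.rand(224, 224) > 0.5)))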

Training

Run the following script to train a model from scratch.

CUDA_VISIBLE_DEVICES=0 python train.py
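
Training fits the predicted MANO hand to the input silhouette, which in a pipeline of this kind typically involves rendering a differentiable silhouette of the predicted mesh and comparing it with the mask. The sketch below shows how such a soft silhouette can be rendered with PyTorch3D; it is illustrative only, under assumed settings: the mesh here is random rather than a real MANO output, and the camera, image size, and loss are placeholders, not the ones used in train.py.

import math
import torch
from pytorch3d.structures import Meshes
from pytorch3d.renderer import (
    FoVPerspectiveCameras, RasterizationSettings, MeshRenderer,
    MeshRasterizer, SoftSilhouetteShader, BlendParams,
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder hand mesh: in the real pipeline verts/faces would come from the MANO layer
# (MANO has 778 vertices and 1538 faces). Here they are random, just to make the sketch run.
verts = torch.rand(778, 3, device=device)
verts[:, 2] += 2.0                                   # push the mesh in front of the camera
faces = torch.randint(0, 778, (1538, 3), device=device)
meshes = Meshes(verts=[verts], faces=[faces])

blend = BlendParams(sigma=1e-4, gamma=1e-4)
raster_settings = RasterizationSettings(
    image_size=224,
    blur_radius=math.log(1.0 / 1e-4 - 1.0) * blend.sigma,  # standard soft-rasterizer setting
    faces_per_pixel=50,
)
renderer = MeshRenderer(
    rasterizer=MeshRasterizer(
        cameras=FoVPerspectiveCameras(device=device),
        raster_settings=raster_settings,
    ),
    shader=SoftSilhouetteShader(blend_params=blend),
)

silhouette = renderer(meshes)[..., 3]                # alpha channel = soft silhouette, shape (1, 224, 224)
# A loss such as L1 between this silhouette and the input mask can then back-propagate to the hand parameters.
print(silhouette.shape)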

Citation

@article{chang2022mask2hand,
  author={Li-Jen Chang and Yu-Cheng Liao and Chia-Hui Lin and Hwann-Tzong Chen},
  title={Mask2Hand: Learning to Predict the 3D Hand Pose and Shape from Shadow},
  journal={CoRR},
  volume={abs/2205.15553},
  year={2022}
}

Acknowledgement

The PyTorch implementation of MANO comes from GrabNet, and some of the visualization utilities are modified from CMR.