Commit f53c8bd (0 parents): 77 changed files with 52,492 additions and 0 deletions.
@@ -0,0 +1,175 @@
.vscode

light-field-distance

data/abc
data/shapenet
data/custom
experiments/
eval_emd.py
eval_metrics_others.py
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
src_convonet/utils/libmcubes/mcubes.cpython-39-x86_64-linux-gnu.so
src_convonet/utils/libmesh/triangle_hash.cpython-39-x86_64-linux-gnu.so
src_convonet/utils/libmise/mise.cpython-39-x86_64-linux-gnu.so
src_convonet/utils/libsimplify/simplify_mesh.cpython-39-x86_64-linux-gnu.so
src_convonet/utils/libvoxelize/voxelize.cpython-39-x86_64-linux-gnu.so
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2023 Yizhi Wang

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -0,0 +1,145 @@
# ARO-Net

This is the official PyTorch implementation of the paper:

ARO-Net: Learning Implicit Fields from Anchored Radial Observations, CVPR 2023.

Yizhi Wang*, Zeyu Huang*, Ariel Shamir, Hui Huang, Hao Zhang, Ruizhen Hu.

Paper: [arXiv](https://arxiv.org/abs/2212.10275)
Homepage: [ARO-Net]()

<img src='imgs/aro-net.jpg'/>

<img src='imgs/aro_net_demo.jpg'/>

## Setup

To set up a conda environment and build the dependencies:
```
# create conda environment for ARO-Net
conda create -n aro-net python=3.9 pytorch-gpu -c pytorch
conda activate aro-net
pip install trimesh open3d tensorboard Cython
# install tools from ConvONet
python setup.py build_ext --inplace
# install LFD metrics
git clone https://github.com/kacperkan/light-field-distance
cd light-field-distance
python setup.py install
```

## Dataset

The dataset used in our experiments can be found on [OneDrive](https://1drv.ms/u/s!AkDQSKsmQQCghcNbst1PuHeb-obv7w?e=wlGLCK) and [Baidu Disk](https://pan.baidu.com/s/1M1UQHV2Wv1g3lemqErUFlg) (password: 2vde).

It contains the ABC dataset (the first chunk) and ShapeNet (chairs and airplanes).

The layout of the `data` dir is:

```
ARO-Net
├── data
│   ├── abc
│   │   ├── 00_meshes
│   │   ├── 01_pcds
│   │   ├── 02_qry_pts
│   │   ├── 03_qry_dists
│   │   ├── 04_splits
│   ├── shapenet
│   │   ├── 00_meshes
│   │   │   ├── 02691156
│   │   │   ├── 03001627
│   │   ├── 01_pcds
│   │   ├── 02_qry_pts
│   │   ├── 03_qry_dists
│   │   ├── 04_splits
│   ├── anchors
```
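
A quick way to sanity-check a downloaded copy against this layout is a small script like the one below. This is a minimal sketch, not part of the repository; the folder names are taken from the tree above and the `data` root path is an assumption.

```python
from pathlib import Path

# Expected sub-folders for each dataset, per the layout above.
EXPECTED = {
    "abc": ["00_meshes", "01_pcds", "02_qry_pts", "03_qry_dists", "04_splits"],
    "shapenet": ["00_meshes", "01_pcds", "02_qry_pts", "03_qry_dists", "04_splits"],
}

def check_layout(data_root: str = "data") -> None:
    """Print which expected dataset folders are present under data_root."""
    root = Path(data_root)
    for dataset, subdirs in EXPECTED.items():
        for sub in subdirs:
            path = root / dataset / sub
            print(f"{path}: {'ok' if path.is_dir() else 'MISSING'}")

if __name__ == "__main__":
    check_layout()
```
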
## Quick Testing

We provide pre-trained models on the ABC dataset (first chunk) and ShapeNet (chairs):

- ABC: [OneDrive](https://1drv.ms/f/s!AkDQSKsmQQCghcNfVnUQWiaw5mY59Q?e=xa5cZ9) or [Baidu Disk](https://pan.baidu.com/s/1qiVKt7SvXIoKBfQJEv2Z1g) (password: hcwk).
- ShapeNet: [OneDrive](https://1drv.ms/f/s!AkDQSKsmQQCghcNggvV_2b0kuexCaw?e=gjAOHj) or [Baidu Disk](https://pan.baidu.com/s/14CzMY_Q8DF8xXZbfOVw9JA) (password: x6cj).

Put them under the `experiments` folder. For the ShapeNet dataset, we trained ARO-Net on two kinds of query-occupancy ground truth, provided by [IM-Net](https://github.com/czq142857/IM-NET-pytorch) and [OCC-Net](https://github.com/autonomousvision/occupancy_networks), respectively.

To test our pretrained models:
```
# ABC
python reconstruct.py --name_exp pretrained_abc --name_ckpt aronet_shapenet_chairs_gt_imnet --name_dataset abc --use_dist_hit True --norm_coord False
# ShapeNet Chair
python reconstruct.py --name_exp pretrained_shapenet_chairs --name_ckpt aronet_shapenet_chairs_gt_occnet.ckpt --name_dataset shapenet --categories_test 03001627, --use_dist_hit False --mc_threshold 0.2
# ShapeNet Airplane
python reconstruct.py --name_exp pretrained_shapenet_chairs --name_ckpt aronet_shapenet_chairs_gt_occnet.ckpt --name_dataset shapenet --categories_test 02691156, --use_dist_hit False
```
You can also modify `--n_pts_test` to set the number of input points per shape; we pre-sampled `1024` and `2048` points from each mesh for testing.
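
If you want to prepare input point clouds of a given size for your own meshes, uniform surface sampling with `trimesh` is one option. This is a minimal sketch under that assumption, not the repository's preprocessing script; the mesh path below is a placeholder.

```python
import numpy as np
import trimesh

def sample_point_cloud(mesh_path: str, n_pts: int = 2048) -> np.ndarray:
    """Uniformly sample n_pts points from the surface of a mesh."""
    mesh = trimesh.load(mesh_path, force="mesh")
    points, _ = trimesh.sample.sample_surface(mesh, n_pts)
    return np.asarray(points, dtype=np.float32)

if __name__ == "__main__":
    pcd = sample_point_cloud("data/abc/00_meshes/example.obj", n_pts=1024)
    print(pcd.shape)  # (1024, 3)
```
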

## Training

We use Fibonacci sampling to generate the 48 anchors used by ARO-Net. Other anchor settings can be generated with `gen_anc.py`.
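
For reference, Fibonacci (golden-spiral) sampling places points quasi-uniformly on a sphere. The sketch below illustrates the general technique, not the exact code in `gen_anc.py`; the unit radius is an assumption.

```python
import numpy as np

def fibonacci_sphere(n_anchors: int = 48, radius: float = 1.0) -> np.ndarray:
    """Place n_anchors points quasi-uniformly on a sphere via the golden spiral."""
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))      # ~2.39996 rad between successive points
    i = np.arange(n_anchors)
    z = 1.0 - 2.0 * (i + 0.5) / n_anchors            # evenly spaced heights in (-1, 1)
    r = np.sqrt(1.0 - z ** 2)                        # ring radius at each height
    theta = golden_angle * i
    pts = np.stack([r * np.cos(theta), r * np.sin(theta), z], axis=1)
    return radius * pts

anchors = fibonacci_sphere(48)
print(anchors.shape)  # (48, 3)
```
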
To train ARO-Net on the ABC dataset or ShapeNet:
```
python train.py --name_exp base_model_chairs --name_dataset shapenet --categories_train 03001627,
python train.py --name_exp base_model_abc --name_dataset abc --use_dist_hit True
```
It is recommended to set `use_dist_hit` to `True` when training on the ABC dataset; this enables an auxiliary loss that predicts the anchor-query-to-surface distance and brings a marginal performance gain. To use this auxiliary loss, first run `cal_hit_dist.py`.
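
Conceptually, the auxiliary term is a second regression target trained alongside the occupancy prediction. A minimal sketch of such a combined objective is shown below; the tensor names and the 0.1 weight are illustrative assumptions, not ARO-Net's actual code.

```python
import torch
import torch.nn.functional as F

def combined_loss(pred_occ: torch.Tensor, gt_occ: torch.Tensor,
                  pred_dist: torch.Tensor, gt_dist: torch.Tensor,
                  w_dist: float = 0.1) -> torch.Tensor:
    """Occupancy classification loss plus an auxiliary distance regression term."""
    loss_occ = F.binary_cross_entropy_with_logits(pred_occ, gt_occ)
    loss_dist = F.l1_loss(pred_dist, gt_dist)
    return loss_occ + w_dist * loss_dist
```
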
To train ARO-Net on a single shape with data augmentation:
```
python train.py --name_exp base_model --name_dataset single --name_single fertility
```

Check all training options in `options.py`. You need one NVIDIA A100 (80GB) to train ARO-Net under the default configuration in `options.py`. You can reduce `n_bs` and `n_qry` to fit your GPU capacity; setting `n_bs` to `4` and `n_qry` to `256` costs roughly 20GB of GPU memory.
## Evaluation

To reconstruct meshes on the test sets:
```
# ABC
python reconstruct.py --name_exp base_model_abc --name_ckpt 600_301101_xxx_xxx.ckpt --name_dataset abc --use_dist_hit True
# ShapeNet Chair
python reconstruct.py --name_exp base_model_chairs --name_ckpt 600_301101_xxx_xxx.ckpt --name_dataset shapenet --categories_test 03001627,
# ShapeNet Airplane
python reconstruct.py --name_exp base_model_chairs --name_ckpt 600_301101_xxx_xxx.ckpt --name_dataset shapenet --categories_test 02691156,
```

To evaluate HD, CD, and IoU:
```
# ABC
python eval_metrics.py --name_exp base_model_abc --name_dataset abc
# ShapeNet
python eval_metrics.py --name_exp base_model_chairs --name_dataset shapenet --categories_test 03001627,
python eval_metrics.py --name_exp base_model_chairs --name_dataset shapenet --categories_test 02691156,
```
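
For reference, the symmetric Chamfer distance (CD) and Hausdorff distance (HD) between two sampled point sets can be written compactly in NumPy. This is a generic sketch of the metrics, not the exact implementation in `eval_metrics.py`; conventions (squared vs. unsquared distances, mean vs. sum) vary between papers.

```python
import numpy as np

def pairwise_sqdist(pts_a: np.ndarray, pts_b: np.ndarray) -> np.ndarray:
    """Pairwise squared distances between (N, 3) and (M, 3) point sets, shape (N, M)."""
    return np.sum((pts_a[:, None, :] - pts_b[None, :, :]) ** 2, axis=-1)

def chamfer_distance(pts_a: np.ndarray, pts_b: np.ndarray) -> float:
    """Symmetric Chamfer distance using mean squared nearest-neighbor distances."""
    d2 = pairwise_sqdist(pts_a, pts_b)
    return float(d2.min(axis=1).mean() + d2.min(axis=0).mean())

def hausdorff_distance(pts_a: np.ndarray, pts_b: np.ndarray) -> float:
    """Symmetric Hausdorff distance (worst-case nearest-neighbor distance)."""
    d2 = pairwise_sqdist(pts_a, pts_b)
    return float(np.sqrt(max(d2.min(axis=1).max(), d2.min(axis=0).max())))
```
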
To evaluate LFD:
```
# ABC
python eval_lfd.py --name_exp 202203014_pretrained --name_dataset abc
```
We use [light-field-distance](https://github.com/kacperkan/light-field-distance) to compute the LFD. The implementation of this library prevents us from computing this metric in parallel. It also requires an OpenGL context, so a physical display is recommended.
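
For a single pair of meshes, the library's Python interface can be driven roughly as follows. This is a sketch based on the light-field-distance project's README; treat the import path and `get_distance` signature as assumptions and check the library's documentation, and note that the mesh paths are placeholders.

```python
import trimesh
from lfd import LightFieldDistance  # assumed API of the light-field-distance package

# Load a reconstructed mesh and its ground-truth counterpart (placeholder paths).
mesh_pred = trimesh.load("experiments/pretrained_abc/results/00000000.obj", force="mesh")
mesh_gt = trimesh.load("data/abc/00_meshes/00000000.obj", force="mesh")

lfd_value = LightFieldDistance(verbose=False).get_distance(
    mesh_pred.vertices, mesh_pred.faces,
    mesh_gt.vertices, mesh_gt.faces,
)
print("LFD:", lfd_value)
```
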
## Acknowledgement

- [IM-Net](https://github.com/czq142857/IM-NET-pytorch)
- [Occ-Net](https://github.com/autonomousvision/occupancy_networks)
- [ConvONet](https://pengsongyou.github.io/conv_onet)
- [Points2Surf](https://github.com/ErlerPhilipp/points2surf)
- [light-field-distance](https://github.com/kacperkan/light-field-distance)