Commit e7b3943

Update README
1 parent e1d28bb commit e7b3943

File tree

3 files changed: +54 −21 lines changed

README.md

+44 −11

@@ -8,6 +8,13 @@ This is the code to run the NOCS model trained on the CHOC mixed-reality dataset
 [[webpage](https://corsmal.eecs.qmul.ac.uk/pose.html)]
 [[arxiv pre-print](https://arxiv.org/abs/2211.10470)]
 
+#### Changes
+
+Here, we detail the changes we made with respect to the original NOCS repository.
+
+- Change the dataloader and the number of categories, to properly load images from the CHOC dataset.
+- Add [EPnP](https://www.tugraz.at/fileadmin/user_upload/Institute/ICG/Images/team_lepetit/publications/lepetit_ijcv08.pdf) as an alternative to Umeyama in the post-processing pose estimation step.
+
 ## Table of Contents
 
 1. [Installation](#installation)
@@ -23,41 +30,49 @@ This is the code to run the NOCS model trained on the CHOC mixed-reality dataset
 
 ### Requirements <a name="requirements"></a>
 
-This code has been tested on an Ubuntu 18.04 machine, CUDA 11.6 and CUDNN XXX, with the following libraries.
+This code has been tested on an Ubuntu 18.04 machine with CUDA 11.6 and cuDNN 7.5.0, and the following libraries.
 
 * Software/libraries:
   - Python 3.5
   - Tensorflow 1.14.0
   - Keras 2.3.0
-  - Anaconda/Miniconda
-  - Open3D
+  - Anaconda/Miniconda 22.9.0
+  - Open3D 0.16.0
+  - SciPy 1.2.2
+  - OpenCV 4.4.0
+  - Sci-Kit 0.15.0
 
 ### Instructions <a name="instructions"></a>
 
-1. Install the following essentials:
+1. Install the essentials
 ```
 sudo apt-get update
 sudo apt-get install build-essential libssl-dev libffi-dev python-dev
 ```
 
-2. Setup the conda environment (optional but strongly recommended):
+2. Set up the conda environment (optional but strongly recommended)
 
 Install Anaconda or Miniconda (please follow: https://docs.conda.io/en/latest/miniconda.html#linux-installers).
 ```
 conda create --name choc-nocs-env python=3.5
 conda activate choc-nocs-env
+```
 
-pip install --upgrade pip
+3. Install the dependencies
 
-pip install tensorflow-gpu==1.14.0 keras==2.3.0
+Make sure to upgrade pip first:
+```
+pip install --upgrade pip
 ```
 
-3. Install the following dependencies:
+Install the libraries as follows (if there are errors, try installing the libraries one at a time):
 ```
+pip install tensorflow-gpu==1.14.0 keras==2.3.0
 python3.5 -m pip install opencv-python moviepy open3d scipy scikit-image cython "git+https://github.com/philferriere/cocoapi.git#egg=pycocotools&subdirectory=PythonAPI"
 ```
 
-4. Verify install with CPU
+4. Verify the installation with CPU
 ```
 python3 -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
 ```
@@ -78,7 +93,7 @@ Arguments:
 - _pp_: post-processing technique to compute the 6D pose, _umeyama_ or _epnp_ (default: _umeyama_)
 - _draw_: boolean flag to visualise the results
 
-The input folder should be structured as follows (note that depth is optional):
+The input folder should be structured as follows (note that depth is optional; it is only needed for the Umeyama post-processing):
 
 ```
 input_folder
@@ -95,8 +110,26 @@ input_folder
 We provide a sample in [_sample\_folder_](sample_folder).
 
 ## Training <a name="training"></a>
+
+You can re-train the NOCS model on the CHOC dataset or another dataset.
+
+The general command to run the training is:
+```
+python train.py --dataset <dataset_type> --datapath <path_to_dataset> --modeldir <path_to_models> --weight_init_mode <weight_initialization> --gpu --calcmean
+```
+
+Arguments:
+
+- _dataset_: type of dataset; _CHOC_ or _NOCS_
+- _datapath_: local path to the input folder
+- _modeldir_: local path to the location of the stored models (usually /logs)
+- _weight\_init\_mode_: weight initialisation technique; _imagenet_, _coco_ or _last_ (default: _last_)
+- _gpu_: boolean flag to use the Graphics Processing Unit
+- _calcmean_: boolean flag to calculate the RGB mean of the entire training dataset
+
+For simplicity, we also add a bash script that runs this command. You can change the variables for the arguments in the _run\_training.sh_ script and then run:
 ```
-python3 train.py
+bash run_training.sh
 ```
 
 ## Known issues <a name="issues"></a>
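The diff above adds EPnP as an alternative to Umeyama and notes that depth is only needed for the Umeyama step. The reason is that Umeyama alignment estimates a scaled rigid transform from 3D-to-3D correspondences (predicted NOCS coordinates versus points back-projected from the depth map), while EPnP needs only 2D-to-3D correspondences (for example, OpenCV's `cv2.solvePnP` with the `SOLVEPNP_EPNP` flag). Below is a minimal NumPy sketch of the Umeyama estimator; it is an illustrative implementation, not the repository's code, and the function name and test data are ours:

```python
import numpy as np

def umeyama(src, dst):
    """Estimate (scale s, rotation R, translation t) with dst ~= s * R @ src + t.

    Least-squares similarity transform (Umeyama, 1991) from paired 3D
    points; this is the kind of transform the depth-based post-processing
    step recovers.
    """
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    cs, cd = src - mu_src, dst - mu_dst
    cov = cd.T @ cs / src.shape[0]      # cross-covariance (dst vs. src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                  # avoid reflections
    R = U @ S @ Vt
    var_src = (cs ** 2).sum() / src.shape[0]
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t

# Sanity check on synthetic data: recover a known transform.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                       # make Q a proper rotation
s_true, t_true = 2.0, np.array([0.1, -0.3, 0.5])
dst = s_true * src @ Q.T + t_true
s, R, t = umeyama(src, dst)
```

EPnP, by contrast, solves the perspective-n-point problem directly from 2D pixel detections and the corresponding 3D NOCS points, so no depth map is required.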

run_training.sh

+1 −1

@@ -1,7 +1,7 @@
 #!/bin/bash
 
 
-DATASET="SOM" # 'NOCS' or 'SOM'
+DATASET="CHOC" # 'NOCS' or 'CHOC'
 
 
 DATAPATH="/media/DATA/SOM_NOCS_DATA"
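The diff shows only the `DATASET` and `DATAPATH` variables of _run\_training.sh_. A plausible sketch of the full script, assembled from the training arguments documented in the README, might look like the following; `MODELDIR` and `WEIGHT_INIT_MODE` are our assumptions, not confirmed by the diff:

```shell
#!/bin/bash
# Hypothetical sketch of run_training.sh; only DATASET and DATAPATH are
# confirmed by the diff above, the remaining variables are assumptions
# based on the arguments documented in the README.
DATASET="CHOC"                 # 'NOCS' or 'CHOC'
DATAPATH="/media/DATA/SOM_NOCS_DATA"
MODELDIR="./logs"              # where trained models are stored
WEIGHT_INIT_MODE="last"        # 'imagenet', 'coco' or 'last'

CMD="python train.py --dataset ${DATASET} --datapath ${DATAPATH} --modeldir ${MODELDIR} --weight_init_mode ${WEIGHT_INIT_MODE} --gpu --calcmean"
echo "Launching: ${CMD}"
# ${CMD}    # uncomment to actually start training
```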

train.py

+9 −9

@@ -181,12 +181,12 @@ def LoadModelWeights(self, model, weight_init_mode):
 
 
     # SOM
-    def PrepareTrainingDataSOM(self):
+    def PrepareTrainingDataCHOC(self):
         print('Preparing training data...')
         # Create the TRAIN set
-        dataset_train = SOMDataset(self.dataset_classes.synset_names, 'train', self.config)
+        dataset_train = CHOCDataset(self.dataset_classes.synset_names, 'train', self.config)
 
-        dataset_train.load_SOM_scenes(self.som_dir, ["all"], 'train', args.calcmean)
+        dataset_train.load_CHOC_scenes(self.choc_dir, ["all"], 'train', args.calcmean)
 
         # NOTE: check sample number
         dataset_train.load_coco(self.coco_dir, 'train', class_names=self.dataset_classes.class_map.keys(),
@@ -198,13 +198,13 @@ def PrepareTrainingDataSOM(self):
 
         return dataset_train
 
-    def PrepareValidationDataSOM(self):
+    def PrepareValidationDataCHOC(self):
 
         print('Preparing validation data...')
 
-        dataset_val = SOMDataset(self.dataset_classes.synset_names, 'val', self.config)
+        dataset_val = CHOCDataset(self.dataset_classes.synset_names, 'val', self.config)
 
-        dataset_val.load_SOM_scenes(self.som_dir, ["all"], 'val', args.calcmean)
+        dataset_val.load_choc_scenes(self.choc_dir, ["all"], 'val', args.calcmean)
 
         dataset_val.load_coco(self.coco_dir,
                               'val',
@@ -253,9 +253,9 @@ def PrepareData(self):
         if self.dataset == 'NOCS':
             dataset_train = self.PrepareNOCSTrainingData()
             dataset_val = self.PrepareNOCSValidationData()
-        elif self.dataset == 'SOM':
-            dataset_train = self.PrepareTrainingDataSOM()
-            dataset_val = self.PrepareValidationDataSOM()
+        elif self.dataset == 'CHOC':
+            dataset_train = self.PrepareTrainingDataCHOC()
+            dataset_val = self.PrepareValidationDataCHOC()
 
         return dataset_train, dataset_val

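The `PrepareData` hunk above dispatches on the dataset name with an `if`/`elif` chain, which is why the SOM-to-CHOC rename touches three call sites at once. A table-driven alternative keeps the dataset name in one place; the sketch below is hypothetical and not the repository's code (it uses stand-in callables instead of the real `Prepare*` methods, and `str.format` rather than f-strings, since the project targets Python 3.5):

```python
# Hypothetical simplification of train.py's PrepareData(); the loader
# callables here are stand-ins for the real Prepare*TrainingData methods.
def prepare_data(dataset, loaders):
    """Look up and call the (train, val) preparation callables for a dataset name."""
    if dataset not in loaders:
        raise ValueError("Unknown dataset '{}'; expected one of {}".format(
            dataset, sorted(loaders)))
    prepare_train, prepare_val = loaders[dataset]
    return prepare_train(), prepare_val()

LOADERS = {
    "NOCS": (lambda: "nocs-train", lambda: "nocs-val"),
    "CHOC": (lambda: "choc-train", lambda: "choc-val"),
}

print(prepare_data("CHOC", LOADERS))  # prints ('choc-train', 'choc-val')
```

With this layout, renaming a dataset only changes its dictionary key, and unknown names fail with an explicit error instead of falling through the chain.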