
How to run code on other dataset #27

Open
vedantpatel41 opened this issue Nov 25, 2024 · 4 comments

Comments

@vedantpatel41

Hello there, I am interested in applying this model to my dataset. I have point cloud data of a building floor. How can I run the model on it? Could you elaborate on how to do this?

@vedantpatel41
Author

Well, I have experimented with my point cloud data. I first created the density maps and then ran inference on them. Here are the density images and the results.

[density image 03256]

Result:
[result image]

[density image 03251]

Result:
[result image]

I created the density images using the method in the code itself:

[screenshots of the density map generation code]

The results I am getting are not good. Can you tell me what I can do to improve them? Also, if I want to train this model on my dataset, how should I create the training data? I mean, how can I annotate it? It would be a great help if you could answer this.

@ywyue
Owner

ywyue commented Nov 28, 2024

Hi @vedantpatel41,

RoomFormer was trained on Structured3D, a synthetic dataset of residential apartment scenes. We also have a variant of the model trained on SceneCAD, a dataset of single-room scenes. Due to the domain gap with your dataset, they may not work well, as your results show. Regarding the density map generation, did you follow the code for Structured3D or SceneCAD? Please check which dataset is most similar to yours and use the corresponding model and density map preprocessing procedure.

To train the model on your dataset, you need to prepare training pairs: density map + annotation. The density map can be obtained by projecting the 3D point cloud to 2D using our code. The annotation is a list of polygons (each in turn a list of ordered vertices), which needs to be collected by hand. I have no experience annotating this, but one potential way to do it is to load the point cloud into CloudCompare (or another point cloud viewer), check the coordinates of the room corners, and read them out sequentially.
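To illustrate, here is a minimal sketch of collecting corner coordinates read out of CloudCompare into a per-scene polygon list. The field names (`scene_id`, `polygons`, `vertices`) and the coordinate values are purely illustrative — check the exact annotation schema against the files produced by the Structured3D preprocessing in the repository before training:

```python
import json

# Corner coordinates (in density-map pixel space) read out of CloudCompare,
# listed in order around each room. Values here are made up for illustration.
rooms = [
    [(30, 40), (120, 40), (120, 110), (30, 110)],    # room 1
    [(120, 40), (200, 40), (200, 110), (120, 110)],  # room 2
]

# One polygon per room: a list of ordered (x, y) vertices.
# Field names are illustrative -- match them to the format produced by
# RoomFormer's Structured3D preprocessing before using them for training.
annotation = {
    "scene_id": "my_floor_0001",
    "polygons": [{"vertices": [list(v) for v in poly]} for poly in rooms],
}

print(json.dumps(annotation, indent=2))
```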

@zxbzhineng

@ywyue @vedantpatel41 Sorry to bother you. I want to run inference on my own data; how should I proceed? Thank you!

@ywyue
Owner

ywyue commented Dec 9, 2024

Hi @zxbzhineng, to run inference on your own data using the pretrained network, you first need to convert your 3D scan to a density map. Here is the preprocessing code: https://github.com/ywyue/RoomFormer/tree/main/data_preprocess

If your scan looks more similar to Structured3D, follow the Structured3D preprocessing steps; otherwise, follow SceneCAD's.
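At its core, the density map conversion projects the point cloud onto the ground plane and counts points per pixel. Here is a minimal NumPy sketch of that idea — the resolution and normalization are assumptions for illustration; the repository's `data_preprocess` scripts handle the dataset-specific details (alignment, cropping, scaling):

```python
import numpy as np

def point_cloud_to_density_map(points: np.ndarray, resolution: int = 256) -> np.ndarray:
    """Project an (N, 3) point cloud onto the XY plane and count points per pixel.

    Returns a (resolution, resolution) uint8 density image scaled to [0, 255].
    Resolution and normalization are illustrative; follow the dataset-specific
    scripts in data_preprocess/ for real use.
    """
    xy = points[:, :2]
    mins, maxs = xy.min(axis=0), xy.max(axis=0)
    extent = (maxs - mins).max() + 1e-6                       # square extent keeps aspect ratio
    idx = ((xy - mins) / extent * (resolution - 1)).astype(int)
    density = np.zeros((resolution, resolution), dtype=np.float64)
    np.add.at(density, (idx[:, 1], idx[:, 0]), 1.0)           # y -> row, x -> col
    return (255 * density / density.max()).astype(np.uint8)

# Example: a random synthetic "floor" point cloud.
pts = np.random.rand(10000, 3)
density = point_cloud_to_density_map(pts)
print(density.shape, density.dtype)
```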

The inference code is here:

```python
def evaluate_floor(model, dataset_name, data_loader, device, output_dir,
                   plot_pred=True, plot_density=True, plot_gt=True, semantic_rich=False):
```

You need to modify the code a bit to adapt it to your dataset.
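Most of the adaptation is feeding `evaluate_floor` density maps in the shape the model expects. Below is a hedged, NumPy-only sketch of the input preparation step (the 256×256 target size and [0, 1] scaling are assumptions — verify them against the dataset/transform classes the repository's inference code actually uses):

```python
import numpy as np

def prepare_density_input(density: np.ndarray, size: int = 256) -> np.ndarray:
    """Resize a 2D density image to (size, size) via nearest-neighbor sampling,
    scale to [0, 1], and add batch/channel dims -> (1, 1, size, size).

    The target size and scaling are assumptions; check them against the
    dataset code feeding evaluate_floor before relying on this.
    """
    h, w = density.shape
    rows = np.arange(size) * h // size                 # nearest-neighbor row indices
    cols = np.arange(size) * w // size                 # nearest-neighbor col indices
    resized = density[rows[:, None], cols]
    scaled = resized.astype(np.float32) / max(density.max(), 1)
    return scaled[None, None, :, :]                    # (batch, channel, H, W)

batch = prepare_density_input(np.random.randint(0, 256, (300, 400), dtype=np.uint8))
print(batch.shape, batch.dtype)
```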
