Hello, first of all thank you for your contribution. I'm reading this paper. In the additional experiments you tested the effect of dataset size, from 2k to 1M. What I want to know is: did you retrain the network each time you changed the size of the dataset? Because that would be a lot of training. Any help would be greatly appreciated.
Hey, thank you for the question. Yes, I trained all the bars you see in the paper for the single object. I think I set it up so that for the 1M dataset the training algorithm would have seen every image exactly once. Then I went backwards and counted the number of updates to make sure all the networks were trained equally. The data is shared across the experiments, e.g., the 500k run uses half the data of the 1M rendered set. I think I trained all of this on a single DGX workstation with 4 P100s; each training run took around 6h to 8h, but I am not 100% sure. So this was a couple of days of compute. If you are reproducing this work on a single GPU, well, good luck, sorry about that.
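For anyone reproducing this, here is a minimal sketch of the update-matching logic described above: fix the total number of gradient updates to one pass over the 1M set, then run more epochs on the smaller sets until every network has received the same number of updates. The batch size and the intermediate dataset sizes below are assumptions for illustration, not values taken from the paper.

```python
import math

# Hypothetical batch size; the actual training configuration may differ.
BATCH_SIZE = 32
REFERENCE_SIZE = 1_000_000  # the 1M set is seen exactly once

# Total gradient updates implied by a single pass over the 1M set.
total_updates = math.ceil(REFERENCE_SIZE / BATCH_SIZE)

# For each smaller (shared) subset, compute how many epochs are needed
# so that every network gets the same number of updates.
for size in [2_000, 10_000, 100_000, 500_000, 1_000_000]:
    updates_per_epoch = math.ceil(size / BATCH_SIZE)
    epochs = total_updates / updates_per_epoch
    print(f"{size:>9} images -> {updates_per_epoch:>6} updates/epoch, "
          f"~{epochs:.1f} epochs for {total_updates} total updates")
```

The point is just that the 2k run cycles over its data many more times than the 1M run, so differences across the bars reflect dataset size, not training length.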
Also, something I wanted to add: the DR provides so much diversity that the benefit plateaus at a certain amount of data. More data does not help here because the distribution of the DR was already covered. That is why the experiment is inconclusive; if you had more dimensions to sample from, you would eventually see greater sim-to-real transfer with more data. Something interesting to look into.