Issue for talking about the bee data #133
I downloaded the bee data from https://www.cc.gatech.edu/~borg/ijcv_psslds/ and tried training a very simple UNet on it. I manually annotated all the bees in a single frame from a single video and used that as the training data. Here are predictions on videos that it never saw during training (colab: https://colab.research.google.com/drive/1IqOFgbyLAUHc-ILiiMT2eMqAibe0N4a4#scrollTo=ecuegjygo7Lh).

This seems like a very promising starting point, especially since I used very little training data and I didn't use any of the more advanced UNet tricks that I have heard about. Everything that you need to try it out yourself is linked from the colab. I plan to clean up the data loading code and the training code tomorrow so that it's easier to use and extend.
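For context on how a single annotated frame can become training data for a segmentation net like this: the manual annotation gets rasterized into a per-pixel target mask. Here is a minimal plain-Swift sketch; the `Box` type, `makeMask` function, and axis-aligned-box annotation format are illustrative assumptions, not the colab's actual code.

```swift
// Rasterize axis-aligned bounding boxes into a binary mask with the same
// dimensions as the frame. Pixels inside any box are labeled 1 ("bee"),
// all other pixels 0. This is a sketch, not the colab's actual pipeline.
struct Box {
    var minX: Int, minY: Int, maxX: Int, maxY: Int
}

func makeMask(width: Int, height: Int, boxes: [Box]) -> [[Float]] {
    var mask = Array(repeating: Array(repeating: Float(0), count: width), count: height)
    for box in boxes {
        // Clamp each box to the image bounds before filling it in.
        let yLo = max(0, box.minY), yHi = min(height - 1, box.maxY)
        let xLo = max(0, box.minX), xHi = min(width - 1, box.maxX)
        guard yLo <= yHi, xLo <= xHi else { continue }
        for y in yLo...yHi {
            for x in xLo...xHi {
                mask[y][x] = 1
            }
        }
    }
    return mask
}
```

The UNet then trains with a per-pixel loss against this mask, which is why one densely annotated frame can go surprisingly far.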
Very cool! I'm reading a bunch of tracking papers now; I will add the papers to the shared folder and track reading in a Google doc.
OK, I took an hour to seed the document and folder. I think the Li19cvpr paper is very interesting; I encourage others to read it and its antecedents. @ProfFan @dabrahams @marcrasi
Thanks! I'll be reading the papers. I'm going to start a "glossary" doc in the folder, because the papers use a lot of standard datasets, evaluation metrics, NN architectures, etc., that I have never heard of, and it would be nice to have the definitions collected in one place.
Great. I read and added a bunch more papers last night and today. This might be overkill: most SOTA work is focused on *generic* object tracking, whereas we are tracking only one animal, more or less just rotating and translating. But it's good to get a sense of the literature. Just added 2 bee tracking papers from the same group that apparently never read our papers :-(
But we'll show them, by reading *theirs*!
-Dave
I have cleaned up the code so that it's more understandable and usable. Repo: https://github.com/marcrasi/DeepBeeTracking (but maybe we should make it a repo in borglab, or just move the code into SwiftFusion?). The repo has a README explaining how to run training and inference on the UNet. Here are the model and the training loop. But the UNet is just an example, and the dataset loading is the more useful part:

```swift
// Load the https://www.cc.gatech.edu/~borg/ijcv_psslds/ sequences.
import DeepBeeTrackingDatasets
let seq1 = BeeFrames(sequenceNames: "seq1")! // or seq2, ..., seq6
// seq1[0], ... are tensors containing each of the frames.
```

We'll probably also want to load/save bounding box poses, so I'll add some code for that soon. I could try to use the format of the data at https://www.cc.gatech.edu/~borg/ijcv_psslds/, but that looks a bit complicated, so I'll probably just make it a plain text file with coordinates and angles.
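For the plain-text pose file, a minimal sketch of what the loader could look like. The `BeePose` name and the one-pose-per-line "x y theta" layout are assumptions for illustration, not a decided format.

```swift
// Hypothetical pose record for one frame: center coordinates plus an angle.
// The field names and the "x y theta" line layout are assumptions, not the
// final DeepBeeTracking format.
struct BeePose {
    var x: Double
    var y: Double
    var theta: Double // radians
}

// Parse one pose per line, with fields separated by single spaces.
// Malformed lines are skipped rather than crashing the loader.
func parsePoses(_ text: String) -> [BeePose] {
    return text.split(separator: "\n").compactMap { line in
        let fields = line.split(separator: " ").compactMap { Double(String($0)) }
        guard fields.count == 3 else { return nil }
        return BeePose(x: fields[0], y: fields[1], theta: fields[2])
    }
}
```

Writing the file back out is just the inverse (`"\(p.x) \(p.y) \(p.theta)"` per line), which is the appeal of a plain-text format over the original dataset's layout.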