Run existing GMM pipeline on cell data #8

Open
magsol opened this issue Jul 5, 2018 · 3 comments

magsol commented Jul 5, 2018

Run @Adurden's GMM pipeline, as designed for SciPy, on the cell data. Generate as many trajectories across control/LLO/mdivi as possible.

Store the intermediate values (the parameters of the GMM components for each frame of each cell) for downstream analysis. We could probably run it on every 100th frame.
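
For reference, a minimal sketch of what fitting per-frame GMM parameters on a subsampled video might look like. The use of sklearn.mixture.GaussianMixture and the treatment of nonzero pixels as 2-D samples are assumptions here, not necessarily what the existing pipeline does:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_frames(video, k=5, step=100):
    """Fit a k-component GMM to every `step`-th frame of a (f, x, y) video.

    Returns means of shape (n_frames, k, 2) and covariances of shape
    (n_frames, k, 2, 2).
    """
    means, covars = [], []
    for frame in video[::step]:
        # Treat nonzero pixel coordinates as samples in 2-D space
        # (an assumption about how the segmented frames are encoded).
        coords = np.column_stack(np.nonzero(frame))
        gmm = GaussianMixture(n_components=k, covariance_type="full").fit(coords)
        means.append(gmm.means_)
        covars.append(gmm.covariances_)
    return np.stack(means), np.stack(covars)
```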

Adurden commented Jul 7, 2018

This process should be fairly simple for all of the segmented .mov videos. The skl_gmm method in sklearngmm.py is currently set up to return the means and covariances as numpy arrays, shaped (f, k, 2) for the means and (f, k, 2, 2) for the covariances, where f is the number of frames and k is the number of nodes. These could be saved as .npy files, which can then be passed into get_all_aff_tables from affinityfunc.py.

Would these .npy files be fine for storing the intermediate values?
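
For concreteness, the save/load round trip for those arrays could look roughly like this. The exact argument lists of skl_gmm and get_all_aff_tables are assumptions based on the description above and may not match the code:

```python
import numpy as np
from sklearngmm import skl_gmm              # assumed to return (f, k, 2) means, (f, k, 2, 2) covars
from affinityfunc import get_all_aff_tables

# Fit the per-frame GMMs and persist the intermediates.
means, covars = skl_gmm(video)              # `video` assumed to be a (f, x, y) array
np.save("cell01_means.npy", means)
np.save("cell01_covars.npy", covars)

# Later, reload the intermediates for the downstream affinity step.
means = np.load("cell01_means.npy")
covars = np.load("cell01_covars.npy")
aff_tables = get_all_aff_tables(means, covars)
```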

Adurden commented Jul 8, 2018

I added a script to the ornet-trajectories repository (I know it's a different repo, but it depends on scripts in that repo) that can be pointed at a directory of .npy files of shape (f, x, y), where f is the number of frames, and will generate and save .npy files of the Gaussians' means and covariances for each video; a rough sketch of that driver is below. This will let us make and save the intermediates with little manual effort, just computation time. However, with the artifact I'm getting from the read_video.py script (described here: https://github.com/quinngroup/ornet-trajectories/issues/12), I will have to doctor all of the .npy files I generate before being able to run it. Is anyone else seeing that artifact when read_video.py is run on the .mov files in the data share?
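
A rough sketch of what that driver does, assuming a per-video helper like the skl_gmm call above; the filenames and argument lists are illustrative, not the actual script:

```python
import os
import numpy as np
from sklearngmm import skl_gmm  # assumed to accept a (f, x, y) array

def process_directory(in_dir, out_dir):
    """Fit per-frame GMMs for every segmented video in `in_dir` and save
    the means/covariances as .npy files in `out_dir`."""
    os.makedirs(out_dir, exist_ok=True)
    for fname in sorted(os.listdir(in_dir)):
        if not fname.endswith(".npy"):
            continue
        video = np.load(os.path.join(in_dir, fname))   # shape (f, x, y)
        means, covars = skl_gmm(video)                 # (f, k, 2), (f, k, 2, 2)
        stem = os.path.splitext(fname)[0]
        np.save(os.path.join(out_dir, stem + "_means.npy"), means)
        np.save(os.path.join(out_dir, stem + "_covars.npy"), covars)
```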

magsol commented Jul 10, 2018

Let's try to make this repo as self-contained as possible; it has a read_video.py implementation, so let's go with that and see if the problem still crops up. If so, we'll make a ticket for it here.
