Sync: Point Grey-OEphys-pyControl #139
-
Hi @FlaviaRicciardi,

The basic approach you are using, sending Rsync pulses from pyControl to the other devices you want to synchronise, is sensible, and is what we do in my lab for all synchronisation.

For synchronising pyControl with Point Grey/FLIR cameras, we do not trigger each camera frame from pyControl. Instead, we configure the camera to acquire frames at a set FPS (e.g. 60 Hz) and record the state of the GPIO pins on each frame, so we can tell the frame numbers on which the Rsync pulses occurred. We can then use the Rsync_aligner to convert pyControl timestamps into camera frame numbers or vice versa, as detailed in the synchronisation docs (see the first sketch below). For recording the video and GPIO pin states we have recently developed a new open-source software called pyMultivideo. It is still in development, and currently only works with FLIR Chameleon 3 and Blackfly cameras, but it is sufficiently functional that we are now using it for all our experiments.

For synchronising pyControl with Open Ephys, we run the Rsync pulses into the BNC inputs as you have done, such that they show up as a shaded overlay in the LFP viewer. The ephys sample numbers at which the sync pulses occurred get saved and can then be used to set up an Rsync aligner to convert between pyControl timestamps and ephys sample numbers.

For triggering light sources for optogenetic stimulation, we usually connect the trigger input of the LED driver or laser to a BNC output on the pyControl breakout board, and then use a pyControl Digital_output to trigger the stimulation directly from the pyControl task (see the second sketch below). As the digital output supports pulse train generation, it is easy to use it to trigger pulsed stimulation. Alternatively, if you want to trigger the opto stimulation from another device, you can run the trigger signal into a pyControl BNC connector configured as a digital input, to record in pyControl the times when the stimulation occurred.

For synchronising the microphones, the best approach will depend on what capability the sound recording system has to either record sync pulses or trigger the start and stop of recording using an external input. If the audio recording system can record the Rsync pulses output by pyControl, then you can use the same method as for the video and ephys data. Alternatively, if you can trigger the audio recording to start and stop when you start and stop the pyControl task, using a pyControl digital output, then you can linearly interpolate between the start and stop times on each system to do the alignment (see the third sketch below).

In terms of checking everything is working, I suggest you first get sync pulses recorded on two different systems (e.g. pyControl and camera) and set up an Rsync aligner to convert between them, as detailed in this ipython notebook and the docs.

Thomas
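To make the alignment step concrete, here is a minimal sketch of using the Rsync_aligner from pyControl's tools/rsync.py to convert pyControl timestamps into camera frame numbers. The pulse times are synthetic stand-ins for illustration, and the constructor arguments (units_A/units_B as the duration of one time unit in milliseconds) should be checked against the synchronisation docs for your pyControl version:

```python
import numpy as np
from rsync import Rsync_aligner  # pyControl's tools/rsync.py

# Synthetic stand-in data: rsync pulses at jittered ~5 s intervals,
# as seen by pyControl (in ms) and by the camera (as 60 fps frame numbers).
pulse_times_ms = np.cumsum(np.random.randint(4000, 6000, size=100))
pulse_frames = pulse_times_ms * 60 / 1000

# units_A / units_B give the duration of one time unit in ms, so one
# camera frame at 60 fps is 1000/60 ms.
aligner = Rsync_aligner(pulse_times_A=pulse_times_ms, pulse_times_B=pulse_frames,
                        units_A=1, units_B=1000 / 60)

# Convert pyControl event times (ms) into camera frame numbers.
event_times_ms = np.array([12345.0, 67890.0])
event_frames = aligner.A_to_B(event_times_ms)
```

aligner.B_to_A() does the conversion in the other direction; event times that fall outside the period when both systems were recording come back as NaN.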
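Here is also a minimal pyControl task sketch of triggering pulsed opto stimulation with a Digital_output, assuming the pulse() method described in the hardware docs; the breakout board version, BNC pin, and all timing values are placeholders for whatever your hardware definition and protocol use:

```python
from pyControl.utility import *
from devices import *

# Hardware (normally lives in a separate hardware_definition.py file).
board = Breakout_1_2()              # assumes the 1.2 breakout board
opto = Digital_output(board.BNC_1)  # hypothetical: laser/LED driver trigger input

# States and events.
states = ['inter_stim', 'stim']
events = []
initial_state = 'inter_stim'

# Variables (timing values are arbitrary examples).
v.inter_stim_dur = 10 * second
v.stim_dur = 2 * second

def inter_stim(event):
    if event == 'entry':
        timed_goto_state('stim', v.inter_stim_dur)

def stim(event):
    if event == 'entry':
        # 20 Hz pulse train, 50% duty cycle, for the duration of the state.
        opto.pulse(freq=20, duty_cycle=50)
        timed_goto_state('inter_stim', v.stim_dur)
    elif event == 'exit':
        opto.off()
```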
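Finally, a sketch of the start/stop interpolation approach for the audio, assuming you have the pyControl times at which the recording was triggered to start and stop and the corresponding first and last sample numbers in the audio file (all values here are made up for illustration):

```python
import numpy as np

# pyControl times (ms) at which the audio recording started and stopped,
# and the matching audio sample numbers (here: a 10 min recording at 250 kHz).
rec_start_ms, rec_stop_ms = 1_000, 601_000
first_sample, last_sample = 0, 150_000_000

def pycontrol_ms_to_audio_sample(t_ms):
    """Linearly interpolate a pyControl time (ms) to an audio sample number."""
    return np.interp(t_ms, [rec_start_ms, rec_stop_ms], [first_sample, last_sample])

print(pycontrol_ms_to_audio_sample(301_000))  # midpoint -> sample 75,000,000
```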
-
Yes, that is correct: we split the rsync output signal to the camera and Open Ephys (e.g. using a BNC T-junction connector) and configure the camera to record at a specified frame rate rather than using an external trigger. We are now using the pyMultivideo software we have recently developed, which I linked in my post above, to record video and GPIO pin states; previously we used a workflow combining Python, Bonsai and FFMPEG for this, which you can find here. If you need to trigger the opto stim using DLC, then you could use Bonsai to acquire the video and GPIO states, run the DLC, and trigger the opto stim through an Arduino. For synchronisation you would send rsync pulses from pyControl to the camera GPIO input, and you could also split the trigger signal from the Arduino so it goes both to the opto system trigger input and to a pyControl BNC connector configured as a digital input, so that you get pyControl events when the opto is triggered.
-
Dear Community,
I am still learning about state machines for behavioral setups and about device synchronization. I have read extensively on this topic, and the resources from the open groups have been particularly helpful. Now I would appreciate it if someone with expertise could review what I have accomplished so far.
What I need: To synchronize neural recording (Open Ephys data acquisition board) with a video camera (Point Grey: FlyCapture2 version 2.11.3.425), pyControl for the behavior (version 2.0.2), microphones (AVISOFT data acquisition board), and a laser source for optotagging in a closed-loop online experiment.
What I did: To synchronize behavioral data (collected by pyControl) with video (from the Point Grey), I used the rsync tool of pyControl. Physically, I followed what was suggested here: the sync pulses are sent to the Point Grey and to the OE board (through the I/O board connected to the digital input of the Open Ephys board). I set the Point Grey to trigger mode and start the Bonsai workflow (collecting metadata, frame numbers and timestamps). I upload the task in pyControl, and when I start the task the camera turns on (because of the sync pulses from the rsync tool?) and I am able to see the rsync signal in the OE GUI (a shaded overlay on the LFP viewer, but no TTL signal in the ADC channels). In order to start the ephys recording and the mic recording at the same time when I press Start in pyControl, I ran another BNC cable from pyControl to a second BNC of the OE I/O board and added the Record Control plugin to the OE signal chain.
Is this configuration correct?
How can I be sure the mics are aligned with the other devices?
What should be the next step to quantify that all is well synchronized?
The laser source is still missing; do you have any suggestions for including it in the synchronization configuration I have so far?
Many thanks in advance,