<!doctype html>
<html lang="en">
<head>
<!-- Required meta tags -->
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<!-- Primary Meta Tags -->
<title>CEBRA</title>
<meta name="title" content="CEBRA: a self-supervised learning algorithm for obtaining interpretable, Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables">
<meta name="description" content="Mapping behavioural actions to neural activity is a fundamental goal of neuroscience. As our ability to record large neural and behavioural data increases, there is growing interest in modeling neural dynamics during adaptive behaviors to probe neural representations. In particular, neural latent embeddings can reveal underlying correlates of behavior, yet, we lack non-linear techniques that can explicitly and flexibly leverage joint behavior and neural data. Here, we fill this gap with a novel method, CEBRA, that jointly uses behavioural and neural data in a hypothesis- or discovery-driven manner to produce consistent, high-performance latent spaces. We validate its accuracy and demonstrate our tool's utility for both calcium and electrophysiology datasets, across sensory and motor tasks, and in simple or complex behaviors across species. It allows for single and multi-session datasets to be leveraged for hypothesis testing or can be used label-free. Lastly, we show that CEBRA can be used for the mapping of space, uncovering complex kinematic features, and rapid, high-accuracy decoding of natural movies from visual cortex.">
<!-- Open Graph / Facebook -->
<meta property="og:type" content="website">
<meta property="og:url" content="https://cebra.ai/">
<meta property="og:title" content="CEBRA: a self-supervised learning algorithm for obtaining interpretable, Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables">
<meta property="og:description" content="Mapping behavioural actions to neural activity is a fundamental goal of neuroscience. As our ability to record large neural and behavioural data increases, there is growing interest in modeling neural dynamics during adaptive behaviors to probe neural representations. In particular, neural latent embeddings can reveal underlying correlates of behavior, yet, we lack non-linear techniques that can explicitly and flexibly leverage joint behavior and neural data. Here, we fill this gap with a novel method, CEBRA, that jointly uses behavioural and neural data in a hypothesis- or discovery-driven manner to produce consistent, high-performance latent spaces. We validate its accuracy and demonstrate our tool's utility for both calcium and electrophysiology datasets, across sensory and motor tasks, and in simple or complex behaviors across species. It allows for single and multi-session datasets to be leveraged for hypothesis testing or can be used label-free. Lastly, we show that CEBRA can be used for the mapping of space, uncovering complex kinematic features, and rapid, high-accuracy decoding of natural movies from visual cortex.">
<meta property="og:image" content="">
<!-- Twitter -->
<meta property="twitter:card" content="summary_large_image">
<meta property="twitter:url" content="https://cebra.ai/">
<meta property="twitter:title" content="CEBRA: a self-supervised learning algorithm for obtaining interpretable, Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables">
<meta property="twitter:description" content="Mapping behavioural actions to neural activity is a fundamental goal of neuroscience. As our ability to record large neural and behavioural data increases, there is growing interest in modeling neural dynamics during adaptive behaviors to probe neural representations. In particular, neural latent embeddings can reveal underlying correlates of behavior, yet, we lack non-linear techniques that can explicitly and flexibly leverage joint behavior and neural data. Here, we fill this gap with a novel method, CEBRA, that jointly uses behavioural and neural data in a hypothesis- or discovery-driven manner to produce consistent, high-performance latent spaces. We validate its accuracy and demonstrate our tool's utility for both calcium and electrophysiology datasets, across sensory and motor tasks, and in simple or complex behaviors across species. It allows for single and multi-session datasets to be leveraged for hypothesis testing or can be used label-free. Lastly, we show that CEBRA can be used for the mapping of space, uncovering complex kinematic features, and rapid, high-accuracy decoding of natural movies from visual cortex.">
<meta property="twitter:image" content="">
<!-- Bootstrap CSS -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-1BmE4kWBq78iYhFldvKuhfTAU6auU8tT94WrHftjDbrCEXSU1oBoqyl2QvZ6jIW3" crossorigin="anonymous">
<script id="MathJax-script" async src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"></script>
<link href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans+Condensed&display=swap" rel="stylesheet">
<link href="https://fonts.googleapis.com/css2?family=IBM+Plex+Mono&display=swap" rel="stylesheet">
<link href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/5.13.1/css/all.min.css" rel="stylesheet">
<style>
:root {
--cebra-c: #1D29B8;
--cebra-e: #6235E0;
--cebra-b: #A045E8;
--cebra-r: #BF1BB9;
--cebra-a: #D4164F;
}
.main {
font-family: Helvetica, Arial, sans-serif;
color: gainsboro;
}
.container-fluid .col {
width: 100%;
padding-left: 0;
padding-right: 0;
}
.code {
font-family: 'IBM Plex Mono', monospace;
}
h3 {
color: var(--cebra-r);
font-family: Helvetica, Arial, sans-serif;
margin-top: 2rem;
margin-bottom: 1.5rem;
}
a {
color: var(--cebra-r);
font-family: Helvetica, Arial, sans-serif;
text-decoration: none;
}
a:hover {
color: var(--cebra-b);
text-decoration: underline;
}
.muted-link {
color: #BF1BB9;
}
.paper-thumbnail {
background-color: white;
border-radius: 5%;
}
.paper-card {
background: rgba(255, 255, 255, 0.05);
border-radius: 8px;
padding: 20px;
margin-bottom: 20px;
transition: all 0.3s ease;
}
.paper-card:hover {
background: rgba(255, 255, 255, 0.1);
transform: translateY(-2px);
}
.paper-title {
color: var(--cebra-e);
font-weight: bold;
margin-bottom: 10px;
}
</style>
</head>
<body style="background-color: rgb(0, 0, 0);">
<div class="container-fluid d-flex flex-column main">
<div class="row">
<div class="col-md-2">
</div>
<div class="col-md-8" id="main-content">
<div class="row text-center my-5" id="#">
<h1><span style="color: var(--cebra-c)">C</span><span style="color: var(--cebra-e)">E</span><span style="color: var(--cebra-b)">B</span><span style="color: var(--cebra-r)">R</span><span style="color: var(--cebra-a)">A</span>: a self-supervised learning algorithm for obtaining interpretable, <span style="color: var(--cebra-c)">C</span>onsistent <span style="color: var(--cebra-e)">Em</span><span style="color: var(--cebra-b)">B</span>eddings of high-dimensional <span style="color: var(--cebra-r)">R</span>ecordings using <span style="color: var(--cebra-a)">A</span>uxiliary variables</h1>
</div>
<div class="row text-center">
<div class="col-sm-4 mb-2">
<h4>
<a href="https://doi.org/10.1038/s41586-023-06031-6" target="_blank">
<i class="fas fa-file-alt"></i>
Nature 2023 Paper
</a>
</h4>
</div>
<div class="col-sm-4 mb-2">
<h4>
<a href="https://cebra.ai/docs/" target="_blank"> <i class="fas fa-book"></i>
Documentation & Demos
</a>
</h4>
</div>
<div class="col-sm-4 mb-2">
<h4>
<a href="https://github.com/AdaptiveMotorControlLab/cebra" target="_blank"> <i class="fab fa-github"></i>
Code
</a>
</h4>
</div>
</div>
<div class="row mb-5 mt-4">
<p>CEBRA is a machine-learning method that compresses time series in a way that reveals otherwise hidden
structure in the variability of the data. It excels on
simultaneously recorded behavioural and neural data.
We have shown it can be used to decode activity from the
visual cortex of the mouse brain to reconstruct a viewed video,
to decode trajectories from the sensorimotor cortex of primates,
and to decode position during navigation. For these use cases
and other demos see our <a href="https://cebra.ai/docs/" style="color: #6235E0;">Documentation</a>.</p>
</div>
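<div class="row mb-4">
<p>As a minimal sketch, CEBRA exposes a scikit-learn-style API: construct a model, fit it on neural data (with behavioural labels for CEBRA-Behavior, or without labels for CEBRA-Time), and transform to obtain the embedding. The parameter values below are illustrative, not recommendations; see the <a href="https://cebra.ai/docs/" style="color: #6235E0;">Documentation</a> for tested examples.</p>
<div class="col-sm-12 rounded p-3 mb-2" style="background-color: rgb(20,20,20);">
<small class="code">
import cebra<br/>
<br/>
# neural: array of shape (n_timesteps, n_neurons); behavior: (n_timesteps, n_labels)<br/>
model = cebra.CEBRA(model_architecture="offset10-model", batch_size=512,<br/>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;output_dimension=3, max_iterations=5000)<br/>
model.fit(neural, behavior)&nbsp;&nbsp;# omit the labels for discovery-driven (CEBRA-Time) mode<br/>
embedding = model.transform(neural)&nbsp;&nbsp;# (n_timesteps, 3) latent embedding<br/>
</small>
</div>
</div>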
<div class="row">
<h3><i class="fas fa-play-circle"></i> Demo Applications</h3>
<div class="col-md-6 mb-2">
<video width="100%" autoplay loop muted preload="auto">
<source src="static/videos/rat.mp4" type="video/mp4">
Video file not supported in this web browser.
</video>
<p>Application of CEBRA-Behavior to rat hippocampus data (Grosmark and Buzsáki, 2016), showing position and neural activity (left), overlaid with the decoding obtained by CEBRA. The current point in embedding space is highlighted (right). CEBRA obtains a median absolute error of 5 cm (total track length: 160 cm; see Schneider et al. 2023 for details). Video is played at 2x real-time speed.</p>
</div>
<div class="col-md-6 mb-2">
<!-- Embedding the Plotly figure using iframe -->
<div style="position: relative; height: 315px; overflow: hidden; margin-bottom: 1rem;">
<iframe src="static/img/hippocampus_posdir3_full.html"
style="position: absolute; top: -140px; left: -5%; width: 110%; height: 150%; border: none; transform: scale(0.85); transform-origin: top center;"
scrolling="no">
</iframe>
</div>
<p style="margin-top: -70px;">Interactive visualization of the CEBRA embedding for the rat hippocampus data. This 3D plot shows how neural activity is mapped to a lower-dimensional space that correlates with the animal's position and movement direction. <a href="https://colab.research.google.com/github/AdaptiveMotorControlLab/CEBRA-demos/blob/main/Demo_hippocampus.ipynb" target="_blank" style="color: #6235E0;"><i class="fas fa-external-link-alt"></i> Open In Colaboratory</a></p>
</div>
</div>
<div class="row">
<div class="col-md-6 mb-2">
<video width="100%" autoplay loop muted preload="auto">
<source src="static/videos/allen.mp4" type="video/mp4">
Video file not supported in this web browser.
</video>
<p>CEBRA applied to mouse primary visual cortex, collected at the Allen Institute (de Vries et al. 2020, Siegle et al. 2021). 2-photon and Neuropixels recordings are embedded with CEBRA using DINO frame features as labels.
The embedding is used to decode the video frames using a kNN decoder on the CEBRA-Behavior embedding from the test set.</p>
</div>
<div class="col-md-6 mb-2">
<!-- YouTube embed for CEBRA on M1 and S1 neural data with cleaner styling -->
<video width="100%" autoplay loop muted preload="auto">
<source src="static/videos/cebra_s1m1.mp4" type="video/mp4">
Video file not supported in this web browser.
</video>
<p>CEBRA applied to M1 and S1 neural data, demonstrating how neural activity from primary motor and somatosensory cortices can be effectively embedded and analyzed. See <a href="https://www.biorxiv.org/content/10.1101/2024.09.11.612513v2" target="_blank" style="color: #6235E0;">DeWolf et al. 2024</a> for details.</p>
</div>
</div>
<div class="row mt-4">
<h3><i class="fas fa-newspaper"></i> Publications</h3>
<div class="col-12">
<div class="paper-card">
<div class="paper-title">Learnable latent embeddings for joint behavioural and neural analysis</div>
<p>Steffen Schneider*, Jin Hwa Lee*, Mackenzie Weygandt Mathis. Nature 2023</p>
<p>A comprehensive introduction to CEBRA, demonstrating its capabilities in joint behavioral and neural analysis across various datasets and species.</p>
<a href="https://doi.org/10.1038/s41586-023-06031-6" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-external-link-alt"></i> Read Paper</a>
<a href="https://arxiv.org/abs/2204.00673" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-file-alt"></i> Preprint</a>
</div>
</div>
<div class="col-12">
<div class="paper-card">
<div class="paper-title">Time-series attribution maps with regularized contrastive learning</div>
<p>Steffen Schneider, Rodrigo González Laiz, Anastasiia Filippova, Markus Frey, Mackenzie Weygandt Mathis. AISTATS 2025</p>
<p>An extension of CEBRA that provides attribution maps for time-series data using regularized contrastive learning.</p>
<a href="https://openreview.net/forum?id=aGrCXoTB4P" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-external-link-alt"></i> Read Paper</a>
<a href="https://arxiv.org/abs/2502.12977" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-file-alt"></i> Preprint</a>
<a href="https://sslneurips23.github.io/paper_pdfs/paper_80.pdf" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-file-pdf"></i> NeurIPS-W 2023 Version</a>
</div>
</div>
</div>
<div class="row mt-4">
<h3><i class="fas fa-certificate"></i> Patent Information</h3>
<div class="col-12">
<div class="paper-card">
<div class="paper-title">Patent Pending</div>
<p>Please note that EPFL has filed a patent titled <a href="https://patents.google.com/patent/WO2023143843A1" target="_blank" style="color: #6235E0;">"Dimensionality reduction of time-series data, and systems and devices that use the resultant embeddings"</a>. If this affects your non-academic use case, please contact the Technology Transfer Office at EPFL.</p>
</div>
</div>
</div>
<div class="row pt-4">
<h3>
<i class="fas fa-file"></i>
Overview
</h3>
</div>
<div class="row">
<div class="col-sm-12 mb-5">
<img src="static/img/overview.png" width="100%" />
</div>
</div>
<div class="row">
<p>
Mapping behavioural actions to neural activity is a fundamental goal of neuroscience. As our ability to record large neural and behavioural data increases, there is growing interest in modeling neural dynamics during adaptive behaviors to probe neural representations. In particular, neural latent embeddings can reveal underlying correlates of behavior, yet, we lack non-linear techniques that can explicitly and flexibly leverage joint behavior and neural data to uncover neural dynamics. Here, we fill this gap with a novel encoding method, CEBRA, that jointly uses behavioural and neural data in a (supervised) hypothesis- or (self-supervised) discovery-driven manner to produce both consistent and high-performance latent spaces. We show that consistency can be used as a metric for uncovering meaningful differences, and the inferred latents can be used for decoding. We validate its accuracy and demonstrate our tool's utility for both calcium and electrophysiology datasets, across sensory and motor tasks, and in simple or complex behaviors across species. It allows for single and multi-session datasets to be leveraged for hypothesis testing or can be used label-free. Lastly, we show that CEBRA can be used for the mapping of space, uncovering complex kinematic features, produces consistent latent spaces across 2-photon and Neuropixels data, and can provide rapid, high-accuracy decoding of natural movies from visual cortex.
</p>
</div>
<div class="row pt-4">
<h3>
<i class="fab fa-github"></i>
Software</h3>
</div>
<p>
You can find the official implementation of the CEBRA algorithm on GitHub:
<a href="https://github.com/AdaptiveMotorControlLab/CEBRA" target="_blank">watch and star the repository</a> to
be notified of future updates and releases.
You can also <a href="https://twitter.com/cebraAI" target="_blank">follow us on Twitter</a> for updates on the project.
</p>
<p>If you are interested in collaborations, please contact us via
<a href="mailto:mackenzie.mathis@epfl.ch"><i class="far fa-envelope"></i> email</a>.
</p>
<div class="row pt-4">
<h3>
<i class="fas fa-graduation-cap"></i>
BibTeX</h3>
</div>
<div class="row">
<p>Please cite our papers as follows:</p>
</div>
<div class="row justify-content-md-center">
<div class="col-sm-10 rounded p-3 m-2" style="background-color: rgb(20,20,20);">
<small class="code">
@article{schneider2023cebra,<br/>
author={Steffen Schneider and Jin Hwa Lee and Mackenzie Weygandt Mathis},<br/>
title={Learnable latent embeddings for joint behavioural and neural analysis},<br/>
journal={Nature},<br/>
year={2023},<br/>
month={May},<br/>
day={03},<br/>
issn={1476-4687},<br/>
doi={10.1038/s41586-023-06031-6},<br/>
url={https://doi.org/10.1038/s41586-023-06031-6}<br/>
}
</small>
</div>
</div>
<div class="row justify-content-md-center">
<div class="col-sm-10 rounded p-3 m-2" style="background-color: rgb(20,20,20);">
<small class="code">
@inproceedings{schneider2025timeseries,<br/>
title={Time-series attribution maps with regularized contrastive learning},<br/>
author={Steffen Schneider and Rodrigo Gonz{\'a}lez Laiz and Anastasiia Filippova and Markus Frey and Mackenzie Weygandt Mathis},<br/>
booktitle={The 28th International Conference on Artificial Intelligence and Statistics},<br/>
year={2025},<br/>
url={https://proceedings.mlr.press/v258/schneider25a.html}<br/>
}
</small>
</div>
</div>
<div class="row pt-4">
<h3>
<i class="fas fa-quote-right"></i>
Impact & Citations
</h3>
</div>
<div class="row">
<p>
CEBRA has been cited in numerous high-impact publications across neuroscience, machine learning, and related fields. Our work has influenced research in neural decoding, brain-computer interfaces, computational neuroscience, and machine learning methods for time-series analysis.
</p>
<div class="col-12 text-center mb-4">
<a href="https://scholar.google.com/scholar?oi=bibs&hl=en&cites=5385393104765622341&as_sdt=5" target="_blank" class="btn btn-outline-light btn-lg">
<i class="fas fa-graduation-cap"></i> View All Citations on Google Scholar
</a>
</div>
<div class="col-12">
<div class="paper-card">
<p class="mb-0">Our research has been cited in proceedings and journals including <span class="badge bg-light text-dark">Nature</span> <span class="badge bg-light text-dark">Science</span> <span class="badge bg-light text-dark">Nature Neuroscience</span> <span class="badge bg-light text-dark">Neuron</span> <span class="badge bg-light text-dark">ICML</span> <span class="badge bg-light text-dark">NeurIPS</span> <span class="badge bg-light text-dark">ICLR</span> and others.</p>
</div>
</div>
</div>
<div class="row justify-content-center mt-5 mb-3">
<div class="col-md-12 text-center">
<a href="https://www.epfl.ch/" target="_blank">
<img src="https://images.squarespace-cdn.com/content/v1/57f6d51c9f74566f55ecf271/00b1fa45-9246-4914-86ee-4a01bb3fb60b/logo.png?format=2500w"
alt="MLAI Logo"
style="max-width: 600px;">
</a>
<div class="mt-3">
<small class="text-muted">© 2021 - present | EPFL Mathis Laboratory</small>
</div>
</div>
</div>
<div class="row">
<small class="text-muted">Webpage designed using Bootstrap 5 and Font Awesome 5.</small>
<a href="#" class="ms-auto"><i class="fas fa-sort-up"></i></a>
</div>
</div>
</div>
</div>
</body>
</html>