_includes/citation/cite.qmd (37 additions, 27 deletions)
@@ -2,35 +2,43 @@
 <div class="citation-container">
 <h3 id="cite" class="pb-1 text-center">Turing.jl is an <a href="https://github.com/TuringLang/Turing.jl/blob/main/LICENCE" class="turing-license-link"><code>MIT</code></a> Licensed Open Source Project</h3>
 <p class="text-center">If you use Turing.jl in your research, please consider citing our papers.</p>
-
-<ul id="citation-list" class="citation-list">
+
+<ul id="citation-list" class="citation-list">
 <!-- Citations will be dynamically injected here -->
 </ul>
 </div>
 
 <script>
 // DATA: Add new BibTeX entries here
 const bibtexData = [
-`@article{Fjelde2025Turing,
-author = {Fjelde, Tor Erlend and Xu, Kai and Widmann, David and Tarek, Mohamed and Pfiffer, Cameron and Trapp, Martin and Axen, Seth D. and Sun, Xianda and Hauru, Markus and Yong, Penelope and Tebbutt, Will and Ghahramani, Zoubin and Ge, Hong},
-title = {Turing.jl: a general-purpose probabilistic programming language},
-journal = {ACM Transactions on Probabilistic Machine Learning},
-year = {2025},
+`@article{10.1145/3711897,
+author = {Fjelde, Tor Erlend and Xu, Kai and Widmann, David and Tarek, Mohamed and Pfiffer, Cameron and Trapp, Martin and Axen, Seth D. and Sun, Xianda and Hauru, Markus and Yong, Penelope and Tebbutt, Will and Ghahramani, Zoubin and Ge, Hong},
+title = {Turing.jl: a general-purpose probabilistic programming language},
+year = {2025},
 publisher = {Association for Computing Machinery},
-doi = {10.1145/3711897},
-note = {Just Accepted},
-url = {https://doi.org/10.1145/3711897}
+address = {New York, NY, USA},
+url = {https://doi.org/10.1145/3711897},
+doi = {10.1145/3711897},
+abstract = {Probabilistic programming languages (PPLs) are becoming increasingly important in many scientific disciplines, such as economics, epidemiology, and biology, to extract meaning from sources of data while accounting for one's uncertainty. The key idea of probabilistic programming is to decouple inference and model specification, thus allowing the practitioner to approach their task at hand using Bayesian inference, without requiring extensive knowledge in programming or computational statistics. At the same time, the complexity of problem settings in which PPLs are employed is steadily increasing, both in terms of project size and model complexity, calling for more flexible and efficient systems. In this work, we describe Turing.jl, a general-purpose PPL, which is designed to be flexible, efficient, and easy to use. Turing.jl is built on top of the Julia programming language, which is known for its high performance and ease-of-use. We describe the design of Turing.jl, contextualizing it within different types of users and use cases, its key features, and how it can be used to solve a wide range of problems. We also provide a brief overview of the ecosystem around Turing.jl, including the different libraries and tools that can be used in conjunction with it. Finally, we provide a few examples of how Turing.jl can be used in practice.},
+note = {Just Accepted},
+journal = {ACM Trans. Probab. Mach. Learn.},
+month = feb,
+keywords = {Probabilistic Programming, Probabilistic Programming Languages, Probabilistic Inference, Bayesian Inference, Markov Chain Monte Carlo, Variational Inference, Sequential Monte Carlo, Uncertainty Quantification, Modeling Methodologies, Latent Variable Models, Maximum a Posteriori Modeling, Software Libraries and Repositories, Bayesian Computation, Variational Methods, Sequential Monte Carlo Methods}
 }`,
-`@inproceedings{Ge2018Turing,
-author = {Ge, Hong and Xu, Kai and Ghahramani, Zoubin},
-title = {Turing: a language for flexible probabilistic inference},
-booktitle = {Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS)},
-series = {Proceedings of Machine Learning Research},
-volume = {84},
-pages = {1682--1690},
-year = {2018},
+`@InProceedings{pmlr-v84-ge18b,
+title = {Turing: A Language for Flexible Probabilistic Inference},
+author = {Ge, Hong and Xu, Kai and Ghahramani, Zoubin},
+booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
+pages = {1682--1690},
+year = {2018},
+editor = {Storkey, Amos and Perez-Cruz, Fernando},
+volume = {84},
+series = {Proceedings of Machine Learning Research},
+abstract = {Probabilistic programming promises to simplify and democratize probabilistic machine learning, but successful probabilistic programming systems require flexible, generic and efficient inference engines. In this work, we present a system called Turing for building MCMC algorithms for probabilistic programming inference. Turing has a very simple syntax and makes full use of the numerical capabilities in the Julia programming language, including all implemented probability distributions, and automatic differentiation. Turing supports a wide range of popular Monte Carlo algorithms, including Hamiltonian Monte Carlo (HMC), HMC with No-U-Turns (NUTS), Gibbs sampling, sequential Monte Carlo (SMC), and several particle MCMC (PMCMC) samplers. Most importantly, Turing inference is composable: it combines MCMC operations on subsets of variables, for example using a combination of an HMC engine and a particle Gibbs (PG) engine. We explore several combinations of inference methods with the aim of finding approaches that are both efficient and universal, i.e. applicable to arbitrary probabilistic models. NUTS—a popular variant of HMC that adapts Hamiltonian simulation path length automatically, although quite powerful for exploring differentiable target distributions, is however not universal. We identify some failure modes for the NUTS engine, and demonstrate that composition of PG (for discrete variables) and NUTS (for continuous variables) can be useful when the NUTS engine is either not applicable, or simply does not work well. Our aim is to present Turing and its composable inference engines to the world and encourage other researchers to build on this system to help advance the field of probabilistic machine learning.}
 }`
 ];
 
@@ -67,9 +75,9 @@ function generateCitationHTML(bibData) {
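The second hunk touches `generateCitationHTML(bibData)`, whose body is not shown here. As a rough illustration only, a function like it could extract the citation key and title from each raw BibTeX string in `bibtexData` and render one `<li>` per entry for the `#citation-list` element; the regexes and markup below are assumptions, not the actual implementation in cite.qmd:

```javascript
// Hypothetical sketch: the real generateCitationHTML in cite.qmd may differ.
const bibtexData = [
  `@article{10.1145/3711897,
    title = {Turing.jl: a general-purpose probabilistic programming language},
    year = {2025}
  }`,
];

function generateCitationHTML(bibData) {
  return bibData.map((entry) => {
    // Pull the citation key (after "@type{") and the title field
    // out of the raw BibTeX string.
    const key = (entry.match(/@\w+\{([^,]+),/) || [])[1] || "unknown";
    const title = (entry.match(/title\s*=\s*\{([^}]*)\}/i) || [])[1] || "";
    // Each entry becomes one list item with a copyable BibTeX block.
    return `<li class="citation-item" data-key="${key}">` +
           `<strong>${title}</strong>` +
           `<pre class="bibtex">${entry}</pre></li>`;
  }).join("\n");
}
```

The returned HTML string would then be assigned to the `innerHTML` of the `<ul id="citation-list">` placeholder above, which is what the "Citations will be dynamically injected here" comment refers to.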