Turing.jl
Get Started
Documentation
Tutorials
News
Team

 * USING TURING
    * Getting Started
    * Quick Start
    * Guide
    * Advanced Usage
    * Automatic Differentiation
    * Performance Tips
    * Using DynamicHMC
    * Sampler Visualization

 * FOR DEVELOPERS
    * Turing Compiler Design
    * Interface Guide
    * How Turing implements AbstractMCMC
    * Variational Inference

 * TUTORIALS
    * Home
    * Introduction to Turing
    * Gaussian Mixture Models
    * Bayesian Logistic Regression
    * Bayesian Neural Networks
    * Hidden Markov Models
    * Linear Regression
    * Infinite Mixture Models
    * Bayesian Poisson Regression
    * Multinomial Logistic Regression
    * Variational Inference
    * Bayesian Differential Equations
    * Probabilistic PCA
    * Gaussian Processes
    * Bayesian Time Series Analysis
    * A Mini Turing Compiler

 * API
    * Turing
    * AdvancedHMC
    * Bijectors

 * CONTRIBUTING
    * How to Contribute
    * Style Guide



TURING.JL

Bayesian inference with probabilistic programming.




INTUITIVE

Turing models are easy to read and write — models work the way you write them.


GENERAL-PURPOSE

Turing supports models with discrete parameters and stochastic control flow.
Specify complex models quickly and easily.
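For example, here is a minimal sketch (illustrative only, not taken from the page) of a model whose control flow branches on a discrete latent variable:

using Turing

@model function two_component(x)
  k ~ Categorical([0.5, 0.5])   # discrete parameter: which component generated x
  if k == 1                     # stochastic control flow: branch on a random variable
    x ~ Normal(-5, 1)
  else
    x ~ Normal(5, 1)
  end
end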


MODULAR

Turing is modular, written fully in Julia, and can be modified to suit your
needs.


HIGH-PERFORMANCE

Turing is fast.


HELLO WORLD IN TURING — LINEAR GAUSSIAN MODEL

Turing's modelling syntax allows you to specify a model quickly and easily.
Straightforward models can be expressed in the same way as complex, hierarchical
models with stochastic control flow.

Quick Start

using Turing

@model function demo(x, y)
  # Assumptions (priors)
  σ2 ~ InverseGamma(2, 3)
  σ = sqrt(σ2)
  μ ~ Normal(0, σ)

  # Observations (likelihood)
  x ~ Normal(μ, σ)
  y ~ Normal(μ, σ)
end
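Once the model is defined, posterior inference is a single call to sample. The following sketch (the sampler choice and data values are illustrative, not from the page) conditions demo on two observations and draws samples with NUTS:

chain = sample(demo(1.5, 2.0), NUTS(), 1_000)  # condition on x = 1.5, y = 2.0
describe(chain)                                # posterior summaries for σ2 and μ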


NEWS FEED

 * Google Summer of Code 2022 — February 17, 2022
 * Google Summer of Code 2021 — February 10, 2021
 * Google Summer of Code 2020 — September 11, 2020
 * Replication study: Estimating number of infections and impact of NPIs on COVID-19 in European countries (Imperial Report 13) — May 14, 2020
 * Turing's Blog — December 14, 2019

News


ADVANCED MARKOV CHAIN MONTE CARLO SAMPLERS

Turing provides a wide range of cutting-edge MCMC algorithms. Hamiltonian Monte
Carlo (HMC) sampling targets differentiable posterior distributions, while
particle MCMC sampling handles complex posteriors involving discrete variables
and stochastic control flow. Gibbs sampling combines particle MCMC, HMC, and
many other MCMC algorithms, so each block of parameters can be updated with the
sampler best suited to it.

Samplers
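As a sketch of the idea (the toy model and sampler settings are illustrative, not from the page), a Gibbs sampler can pair an HMC move for a continuous parameter with a particle Gibbs (PG) move for a discrete one:

using Turing

@model function mixed(x)
  k ~ Categorical([0.3, 0.7])   # discrete parameter
  m ~ Normal(0, 1)              # continuous parameter
  x ~ Normal(m + k, 1)
end

# Update m with HMC and k with particle Gibbs inside each Gibbs sweep.
chain = sample(mixed(2.0), Gibbs(HMC(0.05, 10, :m), PG(20, :k)), 1_000)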




INTEROPERABLE WITH DEEP LEARNING LIBRARIES

Turing supports Julia's Flux package for automatic differentiation. Combine
Turing and Flux to construct probabilistic variants of traditional machine
learning models.
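A rough sketch of the combination (the network, priors, and data layout here are illustrative assumptions, not from the page): flatten a small Flux network with Flux.destructure, place a prior over the flattened parameters, and rebuild the network inside a Turing model.

using Turing, Flux, LinearAlgebra

# A tiny classifier; destructure returns the flat parameter vector and a
# function that rebuilds the network from such a vector.
nn = Chain(Dense(2 => 3, tanh), Dense(3 => 1, sigmoid))
θ, rebuild = Flux.destructure(nn)

@model function bayes_nn(xs, ys)
  # Standard normal prior over every weight and bias
  w ~ MvNormal(zeros(length(θ)), I)
  net = rebuild(w)
  for i in eachindex(ys)
    ys[i] ~ Bernoulli(net(xs[:, i])[1])   # Bernoulli likelihood for binary labels
  end
end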




ECOSYSTEM

Explore a rich ecosystem of libraries, tools, and more to support development.

ADVANCEDHMC

Robust, modular and efficient implementation of advanced Hamiltonian Monte Carlo
algorithms.

MCMCCHAINS

Chain types and utility functions for MCMC simulations.
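For instance (a sketch with made-up draws, not from the page), raw samples can be wrapped in a Chains object and summarized:

using MCMCChains

vals = randn(500, 2, 3)          # iterations × parameters × chains
chn = Chains(vals, [:μ, :σ])     # name the two parameters
summarystats(chn)                # means, ESS, R-hat, and other diagnostics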

BIJECTORS

Automatic transformations for constrained random variables.
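For example (an illustrative sketch using names from recent Bijectors releases), bijector returns the map from a distribution's constrained support to the unconstrained real line:

using Bijectors, Distributions

d = Beta(2, 2)            # support on (0, 1)
b = bijector(d)           # map from (0, 1) to ℝ
y = b(0.3)                # unconstrained value
x = inverse(b)(y)         # ≈ 0.3, back on the original support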

Turing was created by Hong Ge and is lovingly maintained by a core team of
volunteers.

The contents of this website are © 2023 under the terms of the MIT License.