Concepts


On leapfrogs, crashing satellites, and going nuts: A very first conceptual introduction to Hamiltonian Monte Carlo

TensorFlow Probability, and its R wrapper tfprobability, provide Markov Chain Monte Carlo (MCMC) methods that were used in a number of recent posts on this blog. Those posts assumed familiarity with the method and its terminology, which readers mainly interested in deep learning won't necessarily have. Here we try to …

On leapfrogs, crashing satellites, and going nuts: A very first conceptual introduction to Hamiltonian Monte Carlo Read More »
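The "leapfrog" in the title refers to the leapfrog integrator that drives Hamiltonian Monte Carlo. As a small taste of what the post covers, here is a minimal NumPy sketch of one leapfrog trajectory (an illustration only, not the post's code; the potential gradient `grad_U`, step size, and step count are hypothetical choices):

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    """Simulate Hamiltonian dynamics: alternate half/full steps for momentum and position."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)       # initial half step for momentum
    for _ in range(n_steps - 1):
        q += eps * p                 # full step for position
        p -= eps * grad_U(q)         # full step for momentum
    q += eps * p                     # final full step for position
    p -= 0.5 * eps * grad_U(q)       # final half step for momentum
    return q, -p                     # negate momentum to make the proposal reversible

# Standard normal target: U(q) = q**2 / 2, so grad_U(q) = q
q, p = leapfrog(np.array([1.0]), np.array([0.5]), lambda q: q, eps=0.1, n_steps=20)
```

A useful sanity check is that the total energy U(q) + p²/2 stays nearly constant along the trajectory; this near-conservation is what gives HMC its high acceptance rates.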

NumPy-style broadcasting for R TensorFlow users

Broadcasting, as done by Python’s scientific computing library NumPy, dynamically extends shapes so that arrays of different sizes can be passed to operations that expect conforming inputs, such as elementwise addition or multiplication. In NumPy, broadcasting follows exactly specified rules; the same rules apply to TensorFlow operations. For anyone who finds herself …

NumPy-style broadcasting for R TensorFlow users Read More »
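The shape-extension rules mentioned above can be illustrated in a few lines of NumPy (a minimal sketch; the same shapes behave identically for TensorFlow operations):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)  # shape (2, 3)
b = np.array([10, 20, 30])      # shape (3,): treated as (1, 3), then stretched to (2, 3)
row_sum = a + b                 # elementwise addition after shape extension

col = np.array([[1], [2]])      # shape (2, 1): stretched along the second axis
scaled = a * col                # each row of a is multiplied by one scalar
```

The rule of thumb: shapes are compared from the trailing axis backwards, and two axes are compatible when they are equal or one of them is 1.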

Infinite surprise – the iridescent personality of Kullback-Leibler divergence

Kullback-Leibler divergence is not just used to train variational autoencoders or Bayesian networks (and it is not just a hard-to-pronounce thing). It is a fundamental concept in information theory, put to use in a vast range of applications. Most interestingly, it’s not always about constraint, regularization, or compression. Quite on the contrary, sometimes it is about novelty …

Infinite surprise – the iridescent personality of Kullback-Leibler divergence Read More »
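One facet of that "iridescent personality" is easy to see numerically: KL divergence is not symmetric. A minimal sketch for discrete distributions (the helper `kl_div` and the example distributions are illustrative, not from the post):

```python
import numpy as np

def kl_div(p, q):
    """D_KL(p || q) for discrete distributions given as probability vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.5, 0.5])   # a fair coin
q = np.array([0.9, 0.1])   # a heavily biased coin

forward  = kl_div(p, q)    # surprise when q is assumed but p is true
backward = kl_div(q, p)    # surprise in the opposite direction
```

The two directions measure different things, which is exactly why the choice between them matters in applications like variational inference.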

Audio classification with Keras: Looking closer at the non-deep learning parts

Sometimes, deep learning is seen – and welcomed – as a way to avoid laborious data preprocessing. However, there are cases where preprocessing not only helps improve prediction but also constitutes a fascinating topic in itself. One such case is audio classification. In this post, we build on a previous post on …

Audio classification with Keras: Looking closer at the non-deep learning parts Read More »

Getting into the flow: Bijectors in TensorFlow Probability

Normalizing flows are among the lesser-known, yet fascinating and successful architectures in unsupervised deep learning. In this post we provide a basic introduction to flows using tfprobability, an R wrapper to TensorFlow Probability. Upcoming posts will build on this, applying more complex flows to more complex data.

Getting into the flow: Bijectors in TensorFlow Probability Read More »

Adding uncertainty estimates to Keras models with tfprobability

As of today, there is no mainstream road to obtaining uncertainty estimates from neural networks. All that can be said is that approaches normally tend to be Bayesian in spirit, involving some way of putting a prior over the model weights. This holds true for the method presented in this post as well: we show how …

Adding uncertainty estimates to Keras models with tfprobability Read More »