TensorFlow/Keras

Time series prediction with FNN-LSTM

In a recent post, we showed how an LSTM autoencoder, regularized by a false nearest neighbors (FNN) loss, can be used to reconstruct the attractor of a nonlinear, chaotic dynamical system. Here, we explore how that same technique assists in prediction. Matched against a “vanilla LSTM” of comparable capacity, FNN-LSTM improves performance on a set of […]
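The false nearest neighbors statistic behind that loss can be sketched in a few lines. The NumPy version below is our own illustrative sketch, not the post's implementation; the function names, the `rtol` threshold, and the delay parameter `tau` are assumptions. For a delay embedding of dimension `m`, it counts how many nearest neighbors fly apart once a further embedding dimension is added:

```python
import numpy as np

def delay_embed(x, m, tau=1):
    """Delay-embed a 1-D series into m-dimensional vectors."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def fnn_fraction(x, m, tau=1, rtol=10.0):
    """Fraction of nearest neighbors in dimension m that are 'false':
    the extra coordinate in dimension m+1 separates them by more than
    rtol times their distance in dimension m (Kennel-style criterion)."""
    emb_m1 = delay_embed(x, m + 1, tau)
    n = len(emb_m1)                      # m+1 embedding is shorter;
    emb_m = delay_embed(x, m, tau)[:n]   # align both to it
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb_m - emb_m[i], axis=1)
        d[i] = np.inf                    # exclude the point itself
        j = int(np.argmin(d))
        extra = abs(emb_m1[i, -1] - emb_m1[j, -1])
        if d[j] > 0 and extra / d[j] > rtol:
            false += 1
    return false / n
```

On a clean periodic signal, the false-neighbor fraction drops once the embedding dimension is large enough to unfold the attractor — the same signal the FNN loss exploits during training.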

tfprobability 0.8 on CRAN: Now how can you use it?

Part of the r-tensorflow ecosystem, tfprobability is an R wrapper to TensorFlow Probability, the Python probabilistic programming framework developed by Google. We take the occasion of tfprobability’s acceptance on CRAN to give a high-level introduction, highlighting interesting use cases and applications.

Variational convnets with tfprobability

In a Bayesian neural network, layer weights are distributions, not tensors. Using tfprobability, the R wrapper to TensorFlow Probability, we can build regular Keras models that have probabilistic layers, and thus get uncertainty estimates “for free”. In this post, we show how to define, train and obtain predictions from a probabilistic convolutional neural network.
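The weights-as-distributions idea can be illustrated without any framework at all. In this minimal NumPy sketch (the variational parameters are made-up numbers for illustration; this is not the tfprobability API), each forward pass samples the weights, so repeating the pass yields a predictive distribution rather than a point estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "Bayesian" linear layer: each weight is a Normal distribution,
# not a point value. These stand in for learned variational parameters.
w_mean, w_std = 2.0, 0.3
b_mean, b_std = 0.5, 0.1

def predict_once(x):
    """One stochastic forward pass: sample the weights, apply the layer."""
    w = rng.normal(w_mean, w_std)
    b = rng.normal(b_mean, b_std)
    return w * x + b

# Many stochastic passes give predictive mean and uncertainty "for free".
x = 1.5
samples = np.array([predict_once(x) for _ in range(5000)])
pred_mean = samples.mean()   # close to w_mean * x + b_mean
pred_std = samples.std()     # spread induced by the weight distributions
```

In a real probabilistic Keras model the principle is the same: calling the model repeatedly on the same input draws fresh weights each time, and the spread of the outputs is the uncertainty estimate.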

Getting started with Keras from R – the 2020 edition

Looking for materials to get started with deep learning from R? This post presents useful tutorials, guides, and background documentation on the new TensorFlow for R website. Advanced users will find pointers to applications of features from the new 2.0 release (or the upcoming 2.1!) alluded to in the recent TensorFlow 2.0 post.

Gaussian Process Regression with tfprobability

Continuing our tour of applications of TensorFlow Probability (TFP), after Bayesian Neural Networks, Hamiltonian Monte Carlo and State Space Models, here we show an example of Gaussian Process Regression. In fact, what we see is a rather “normal” Keras network, defined and trained in pretty much the usual way, with TFP’s Variational Gaussian Process layer […]
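To convey the flavor, here is a minimal NumPy sketch of exact GP regression with a squared-exponential kernel. The post itself uses TFP's variational layer inside a Keras model, so treat this as the conceptual, exact-GP counterpart; the function names and hyperparameter values are our own illustrative choices:

```python
import numpy as np

def rbf(a, b, length=1.0, amp=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return amp**2 * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of a GP conditioned on training data."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_train, x_test)
    K_ss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    # alpha = K^{-1} y via two triangular solves
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, np.diag(cov)
```

Near the training data the posterior mean tracks the observations and the variance shrinks; far away, the mean reverts toward the prior and the variance grows back to the prior amplitude. The Cholesky step costs O(n³) in the number of training points, which is exactly what sparse, variational approximations like TFP's VGP layer are designed to avoid.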