Optimizers in torch
Today, we wrap up our mini-series on torch basics, adding two abstractions to our toolset: loss functions and optimizers.
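The posts in this series use torch for R, whose API closely mirrors PyTorch's. As an illustrative sketch (in Python/PyTorch, not the post's own R code), here is how the two abstractions fit into a training step: a loss function measures the error, and an optimizer updates the parameters from the gradients.

```python
import torch
import torch.nn as nn

# Toy regression data: target is the sum of the three inputs
x = torch.randn(64, 3)
y = x.sum(dim=1, keepdim=True)

model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()                                     # loss function abstraction
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)   # optimizer abstraction

for _ in range(200):
    optimizer.zero_grad()          # clear gradients accumulated in the last step
    loss = loss_fn(model(x), y)    # forward pass + loss
    loss.backward()                # backpropagate
    optimizer.step()               # update parameters
```

The same zero-grad / backward / step rhythm appears in the R package, with `optimizer$zero_grad()`, `loss$backward()`, and `optimizer$step()`.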
Classifying images with torch
We learn about transfer learning, input pipelines, and learning rate schedulers, all while using torch to tell apart species of beautiful birds.
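A learning rate scheduler, one of the tools this post covers, adjusts the optimizer's learning rate over the course of training. A minimal sketch in Python/PyTorch (the post itself uses the analogous R torch API):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate every 10 epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

lrs = []
for epoch in range(30):
    # ... the actual training loop (forward, loss, backward) would go here ...
    optimizer.step()     # normally called after loss.backward()
    scheduler.step()     # advance the schedule once per epoch
    lrs.append(optimizer.param_groups[0]["lr"])
```

After 30 epochs the learning rate has been halved three times, ending at 0.0125.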
torch for tabular data
How not to die from poisonous mushrooms. Also: how to use torch for deep learning on tabular data, including a mix of categorical and numerical features.
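The usual trick for mixing categorical and numerical features is to map each categorical column through an embedding and concatenate the result with the numeric columns. A hypothetical sketch in Python/PyTorch (the post works in R torch; module and parameter names here are illustrative):

```python
import torch
import torch.nn as nn

class TabularNet(nn.Module):
    def __init__(self, n_categories, embed_dim, n_numeric, n_classes):
        super().__init__()
        # Embedding table for the categorical feature
        self.embed = nn.Embedding(n_categories, embed_dim)
        self.head = nn.Sequential(
            nn.Linear(embed_dim + n_numeric, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x_cat, x_num):
        # Look up embeddings for the categorical column, then
        # concatenate with the numeric features
        e = self.embed(x_cat)
        return self.head(torch.cat([e, x_num], dim=1))

net = TabularNet(n_categories=10, embed_dim=4, n_numeric=3, n_classes=2)
out = net(torch.randint(0, 10, (8,)), torch.randn(8, 3))  # logits, shape (8, 2)
```

A real tabular model would hold one embedding table per categorical column; a single column keeps the sketch short.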
Brain image segmentation with torch
The need to segment images arises in various sciences and their applications, many of which are vital to human (and animal) life. In this introductory post, we train a U-Net to mark lesioned regions on MRI brain scans.
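The defining feature of a U-Net is its encoder-decoder shape with skip connections: feature maps from the downsampling path are concatenated into the upsampling path so fine spatial detail survives. A deliberately tiny, one-level sketch in Python/PyTorch (the post trains a full-depth U-Net in R torch):

```python
import torch
import torch.nn as nn

def block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(),
    )

class TinyUNet(nn.Module):
    """One-level U-Net: encoder, bottleneck, decoder with a skip connection."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.enc = block(in_ch, 16)
        self.down = nn.MaxPool2d(2)
        self.mid = block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = block(32, 16)             # 32 = 16 (skip) + 16 (upsampled)
        self.out = nn.Conv2d(16, out_ch, 1)  # per-pixel logits

    def forward(self, x):
        e = self.enc(x)
        m = self.mid(self.down(e))
        u = self.up(m)
        d = self.dec(torch.cat([e, u], dim=1))  # skip connection
        return self.out(d)

mask_logits = TinyUNet()(torch.randn(1, 1, 64, 64))  # same spatial size as input
```

The output has one logit per pixel, which a sigmoid turns into a lesion-probability mask.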
torch 0.2.0 – Initial JIT support and many bug fixes
The torch 0.2.0 release includes many bug fixes and some nice new features, such as initial JIT support, multi-worker dataloaders, new optimizers, and a new print method for nn_modules.
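Multi-worker dataloaders move batch preparation into background processes so the training loop is not starved for data. Sketched here in Python/PyTorch, where the same `num_workers` idea exists (the release note refers to the R package's dataloaders):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

ds = TensorDataset(torch.arange(100).float().unsqueeze(1))
# num_workers > 0 prepares batches in parallel worker processes
loader = DataLoader(ds, batch_size=10, num_workers=2)

n = sum(batch[0].shape[0] for batch in loader)  # all 100 samples still arrive
```

With expensive per-item transforms (image decoding, augmentation), the workers overlap data preparation with GPU compute.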
Convolutional LSTM for spatial forecasting
In forecasting spatially-determined phenomena (the weather, say, or the next frame in a movie), we want to model temporal evolution, ideally using recurrence relations. At the same time, we’d like to efficiently extract spatial features, something that is normally done with convolutional filters. Ideally, then, we’d have at our disposal an architecture that is both …
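A convolutional LSTM gets both properties by replacing the LSTM's matrix multiplications with convolutions, so the hidden and cell states keep their spatial layout. A minimal single-cell sketch in Python/PyTorch (the post builds the equivalent in R torch; names here are illustrative):

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """LSTM cell whose gates are computed by a convolution, so hidden
    state and cell state retain a spatial (H, W) layout."""
    def __init__(self, in_ch, hid_ch, kernel=3):
        super().__init__()
        # One convolution produces all four gates at once
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, kernel, padding=kernel // 2)

    def forward(self, x, state):
        h, c = state
        i, f, g, o = self.gates(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

cell = ConvLSTMCell(in_ch=1, hid_ch=8)
h = c = torch.zeros(2, 8, 16, 16)           # batch of 2, 16x16 grids
for t in range(5):                          # unroll over a short sequence
    h, c = cell(torch.randn(2, 1, 16, 16), (h, c))
```

The recurrence handles temporal evolution; the convolution inside the gates handles spatial feature extraction.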
Forecasting El Niño-Southern Oscillation (ENSO)
El Niño-Southern Oscillation (ENSO) is an atmospheric phenomenon, located in the tropical Pacific, that greatly affects ecosystems as well as human well-being on a large portion of the globe. We use the convLSTM introduced in a prior post to predict the Niño 3.4 Index from spatially-ordered sequences of sea surface temperatures.
Simple audio classification with torch
This article translates Daniel Falbel’s post on “Simple Audio Classification” from TensorFlow/Keras to torch/torchaudio.
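Audio classifiers of this kind typically turn the raw waveform into a spectrogram first, then classify that image-like representation. A minimal sketch of the spectrogram step using Python/PyTorch's built-in `torch.stft` (the post itself goes through torchaudio's transforms in R):

```python
import torch

waveform = torch.randn(16000)   # stand-in for 1 second of 16 kHz audio
n_fft, hop = 400, 160           # 25 ms windows, 10 ms hop at 16 kHz
spec = torch.stft(waveform, n_fft=n_fft, hop_length=hop,
                  window=torch.hann_window(n_fft), return_complex=True)
magnitude = spec.abs()          # (n_fft // 2 + 1 frequency bins, n_frames)
```

The magnitude spectrogram is then fed to an ordinary convolutional classifier, exactly as one would classify an image.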
torch, tidymodels, and high-energy physics
Today we introduce tabnet, a torch implementation of “TabNet: Attentive Interpretable Tabular Learning” that is fully integrated with the tidymodels framework. By design, tabnet already requires very little data pre-processing; thanks to tidymodels, hyperparameter tuning (so often cumbersome in deep learning) becomes convenient and even fun!
Introductory time-series forecasting with torch
This post is an introduction to time-series forecasting with torch. Central topics are data input and practical usage of RNNs (GRUs/LSTMs). Upcoming posts will build on this and introduce increasingly involved architectures.
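The basic pattern in RNN-based forecasting is to encode a window of past values with a GRU or LSTM and predict the next step from the final hidden state. A hedged sketch in Python/PyTorch (the post develops the same idea in R torch; the class name here is hypothetical):

```python
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    """Encode a window of past values with a GRU, predict the next step."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):            # x: (batch, timesteps, 1)
        _, h = self.rnn(x)           # h: (1, batch, hidden) — final hidden state
        return self.head(h[-1])      # one-step-ahead prediction: (batch, 1)

model = GRUForecaster()
pred = model(torch.randn(4, 20, 1))  # batch of 4 windows, 20 time steps each
```

Multi-step forecasts follow by feeding predictions back in autoregressively, which is where the more involved architectures of the later posts come in.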