Time Series

Lag Features and Rolling Features in Feature Engineering

Feature engineering is an essential foundation of any successful machine learning pipeline. For time series data, two of the most powerful techniques are lag features and rolling features. Mastering them can noticeably improve model performance on tasks such as sales forecasting, stock price prediction, and demand planning. […]
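As a sketch of what these two kinds of features look like in practice, here is a minimal pandas example; the toy data, column names, and window sizes are illustrative choices, not taken from the article:

```python
import pandas as pd

# Toy daily sales series; in practice this would be your target column.
sales = pd.Series([10, 12, 13, 15, 14, 18, 20, 19],
                  index=pd.date_range("2024-01-01", periods=8, freq="D"),
                  name="sales")
df = sales.to_frame()

# Lag features: yesterday's value and the value one week ago.
df["lag_1"] = df["sales"].shift(1)
df["lag_7"] = df["sales"].shift(7)

# Rolling features: 3-day moving average and standard deviation,
# shifted by one step so each row only uses information available
# before that day (avoiding target leakage).
df["roll_mean_3"] = df["sales"].shift(1).rolling(window=3).mean()
df["roll_std_3"] = df["sales"].shift(1).rolling(window=3).std()
```

Rows early in the series get `NaN` for features whose window reaches before the start of the data; those rows are typically dropped before model fitting.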

Time Series vs Standard Machine Learning: Key Differences, Use Cases, and Examples 

Machine learning is widely used for prediction, but not all data behaves the same. A common mistake is applying standard ML to time-dependent data without considering temporal order and dependencies, which these models don’t naturally capture. Time series data reflects evolving patterns over time, unlike static snapshots. For example, sales forecasting differs from default risk
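One concrete consequence of temporal dependence is how train/test splits must be made. A minimal numpy sketch (the data and split sizes are illustrative) contrasting the standard shuffled split, which lets training examples come from the test period's future, with a time-based split:

```python
import numpy as np

# 24 hypothetical observations, indexed in time order.
time_index = np.arange(24)

# Standard ML habit: shuffle, then split. Some training points will
# almost surely fall AFTER some test points, so the model is fit on
# information from the future of the test period -- temporal leakage.
rng = np.random.default_rng(0)
shuffled = rng.permutation(time_index)
train_rand, test_rand = shuffled[:18], shuffled[18:]

# Time series practice: split on time itself, so the model trains
# only on the past and is evaluated only on the future.
train_ts, test_ts = time_index[:18], time_index[18:]
assert test_ts.min() > train_ts.max()  # no leakage, by construction
```

Cross-validation follows the same logic: instead of random folds, time series models use expanding or sliding windows that always validate on data later than the training window.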

30 Best Data Science Books to Read in 2026

Data science powers decision-making across modern businesses, from data preparation and automation to advanced analytics and machine learning. Learning it requires a strong foundation in mathematics, statistics, programming, and practical problem-solving. The good news is that data science can be self-learned with the right resources and consistent practice. Books remain one of the most effective

AO, NAO, ENSO: A wavelet analysis example

El Niño-Southern Oscillation (ENSO), North Atlantic Oscillation (NAO), and Arctic Oscillation (AO) are atmospheric phenomena of global impact that strongly affect people’s lives. ENSO, first and foremost, brings floods, droughts, and ensuing poverty to developing countries in the Southern Hemisphere. Here, we use the new torchwavelets package to comparatively inspect patterns in the
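The torchwavelets API itself is not shown here. As a rough illustration of the underlying idea, here is a from-scratch Morlet continuous wavelet transform in numpy; the wavelet parameters, scale grid, and toy signal are my own choices, not the package's:

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet mother wavelet.

    Returns a complex array of shape (len(scales), len(signal));
    its squared magnitude is the wavelet power at each scale/time.
    """
    n = len(signal)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Sample the wavelet out to ~4 envelope standard deviations,
        # capped so it never exceeds the signal length.
        m = int(min(4 * s, n // 2 - 1))
        t = np.arange(-m, m + 1) / s
        wavelet = np.exp(1j * w0 * t - t**2 / 2) / np.sqrt(s)
        # For the conjugate-symmetric Morlet, convolution with the
        # wavelet equals correlation with its complex conjugate.
        out[i] = np.convolve(signal, wavelet, mode="same")
    return out

# Toy signal: a pure oscillation with period 50. Its wavelet power
# should peak near scale ~ w0 * period / (2 * pi), i.e. around 48.
x = np.sin(2 * np.pi * np.arange(512) / 50)
scales = np.arange(10, 100, 5)
power = np.abs(morlet_cwt(x, scales)) ** 2
peak_scale = scales[power[:, 256].argmax()]
```

Plotting `power` over time and scale gives the familiar wavelet spectrogram used to compare how oscillations such as ENSO, NAO, and AO wax and wane across periods.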

torch time series, take three: Sequence-to-sequence prediction

In our overview of techniques for time-series forecasting, we move on to sequence-to-sequence models. Architectures in this family are commonly used in natural language processing (NLP) tasks, such as machine translation. With NLP, however, significant pre-processing is required before proceeding to model definition and training. In staying with our familiar numerical series, we can fully

FNN-VAE for noisy time series forecasting

In the last part of this mini-series on forecasting with false nearest neighbors (FNN) loss, we replace the LSTM autoencoder from the previous post with a convolutional VAE, resulting in equivalent prediction performance but significantly lower training time. In addition, we find that FNN regularization is of great help when an underlying deterministic process is

An introduction to weather forecasting with deep learning

A few weeks ago, we showed how to forecast chaotic dynamical systems with deep learning, augmented by a custom constraint derived from domain-specific insight. Global weather is a chaotic system, but of much higher complexity than many tasks commonly addressed with machine and/or deep learning. In this post, we provide a practical introduction featuring a
