
NeurIPS Paper Reviews 2023 #6

23 January 2024
  • Quantitative Research

Our team of quantitative researchers has shared the most interesting research presented during workshops and seminars at NeurIPS 2023.

Discover the perspectives of quantitative analyst Rui, as she discusses her most compelling findings from the conference.


Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting

Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper Zschiegner, Hao Wang, Yuyang Wang

The authors demonstrate that an unconditionally trained generative model can be just as useful as task-specific conditional models across a variety of tasks, without any changes to its training process.

For this purpose, they introduce TSDiff, an unconditionally trained diffusion model for time series, and show how it can be used to:

  1. Sample from a conditional distribution despite being trained only unconditionally;
  2. Enhance predictions from other forecasting models, by formulating forecast refinement as a regularized optimization problem that uses the model's learned likelihood;
  3. Provide better synthetic data for downstream forecasting than other time-series generative models.

The authors conduct experiments for each of these properties, comparing TSDiff’s performance against statistical and probabilistic models tailored to specific tasks. The results consistently demonstrate that TSDiff performs at least as well as, if not better than, the selected alternative models for each respective task.
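The first property above can be illustrated with a toy sketch of observation self-guidance: at each reverse diffusion step, the unconditional model's clean-series estimate is nudged toward the observed context window by the gradient of a squared error. Everything below is a hypothetical simplification for illustration only — the `denoise` stand-in, the noise schedule, and the guidance `scale` are assumptions, not TSDiff's actual architecture or sampler.

```python
import numpy as np

T = 24                                 # toy series length
obs_mask = np.zeros(T)
obs_mask[:16] = 1.0                    # first 16 points are the observed context
y_obs = np.sin(np.linspace(0.0, 3.0, T))   # toy ground-truth series

def denoise(x, t):
    # Stand-in for a trained noise-conditioned denoiser: at noise level t,
    # it returns a (crude) estimate of the clean series.
    return x * (1.0 - t)

def guided_sample(steps=50, scale=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T)         # start from pure noise
    for i in range(steps, 0, -1):
        t = i / steps
        x0_hat = denoise(x, t)                   # model's clean-series estimate
        grad = obs_mask * (x0_hat - y_obs)       # gradient of squared error on the observed window
        # Guidance pulls the sample toward the observations; the forecast
        # region (mask = 0) is filled in only by the unconditional model.
        x = x0_hat - scale * t * grad + 0.1 * t * rng.standard_normal(T)
    return x

sample = guided_sample()
```

Setting `scale=0` recovers plain unconditional sampling, so the same loop shows how one trained model serves both the conditional and the unconditional task — the point of the paper's first claim.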


Conformal Prediction for Time Series with Modern Hopfield Networks

Andreas Auer, Martin Gauch, Daniel Klotz, Sepp Hochreiter

Conformal prediction provides distribution- and model-agnostic ways to construct prediction intervals, using quantiles of a weighted error distribution built from past predictions. However, it assumes the data are exchangeable (the order of observations does not matter), an assumption violated by time series with temporal autocorrelation. This paper presents a framework called HopCPT that provides valid prediction intervals with conformal prediction in this setting.

Given a forecasting model for time series, the idea is that samples with similar temporal dependencies exhibit similar prediction errors (regimes). The authors therefore train a modern Hopfield network on the prediction errors, using its associative memory to retrieve past errors from a similar regime and weight them with learned network weights to construct the prediction interval for a new sample.
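The retrieve-and-weight step can be sketched as a similarity-weighted conformal interval. This is a deliberate simplification: a softmax over feature distances stands in for the trained Hopfield retrieval, and the function names, the feature vectors, and the temperature `beta` are all assumptions for illustration, not HopCPT's actual components.

```python
import numpy as np

def weighted_quantile(values, weights, q):
    # Quantile of a discrete distribution where each value carries a weight.
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cw = np.cumsum(w) / np.sum(w)              # normalized cumulative weights
    return v[np.searchsorted(cw, q)]

def regime_weighted_interval(point_forecast, past_errors, past_feats,
                             new_feat, alpha=0.1, beta=4.0):
    # Softmax similarity: past errors from regimes resembling the current
    # one (in feature space) dominate the interval construction.
    sims = -beta * np.linalg.norm(past_feats - new_feat, axis=1)
    weights = np.exp(sims - sims.max())
    lo = weighted_quantile(past_errors, weights, alpha / 2)
    hi = weighted_quantile(past_errors, weights, 1 - alpha / 2)
    return point_forecast + lo, point_forecast + hi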

The authors demonstrate the effectiveness of HopCPT's prediction intervals against other conformal prediction approaches, across different forecasting models and time series.



Read more of our quantitative researchers' thoughts

NeurIPS Paper Reviews 2023 #1

Discover the perspectives of Danny, one of our machine learning engineers, on the following papers:

  • A U-turn on Double Descent: Rethinking Parameter Counting in Statistical Learning
  • Normalization Layers Are All That Sharpness-Aware Minimization Needs
Paper Review #1
NeurIPS Paper Reviews 2023 #2

Discover the perspectives of Paul, one of our quantitative researchers, on the following papers:

  • Sharpness-Aware Minimization Leads to Low-Rank Features
  • When Do Neural Nets Outperform Boosted Trees on Tabular Data?
Paper Review #2
NeurIPS Paper Reviews 2023 #3

Discover the perspectives of Szymon, one of our quantitative researchers, on the following papers:

  • Convolutional State Space Models for Long-Range Spatiotemporal Modeling
  • How to Scale Your EMA
Paper Review #3
NeurIPS Paper Reviews 2023 #4

Discover the perspectives of Dustin, our scientific director, on the following papers:

  • Abide by the law and follow the flow: conservation laws for gradient flows
  • The Tunnel Effect: Building Data Representations in Deep Neural Networks
Paper Review #4
NeurIPS Paper Reviews 2023 #5

Discover the perspectives of Laurynas, one of our machine learning engineers, on the following papers:

  • Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture
  • QLoRA: Efficient Finetuning of Quantized LLMs
Paper Review #5
