Time Series Prediction with Transformers | by Oliver S | Jan, 2024

January 16, 2024

A Complete Guide to Transformers in Pytorch

Towards Data Science

Since the advent of ChatGPT at the latest, large language models (LLMs) have generated enormous hype and are known even to those outside the AI community. Even though LLMs are inherently “just” sequence prediction models without any form of intelligence or reasoning, the results they achieve are certainly extremely impressive, with some even calling them another step in the “AI revolution”.

Essential to the success of LLMs are their core building blocks, transformers. In this post, we will give a complete guide to using them in PyTorch, with a particular focus on time series prediction. Thanks for stopping by, and I hope you enjoy the ride!

Photo by Tim Meyer on Unsplash

One could argue that all problems solved via transformers are essentially time series problems. While that is true, here we will put special focus on continuous series and data, such as predicting the spread of diseases or forecasting the weather. The difference from the prominent application of natural language processing (NLP) is simply the continuous input space (if “simply” is allowed in this context: developing a model like ChatGPT and making it work well naturally requires a multitude of further optimization steps and tricks), whereas NLP works with discrete tokens. Apart from this, however, the basic building blocks are identical.
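
To make this difference concrete, here is a minimal sketch (my own illustration, not code from the original article) of how the input side changes: an NLP model maps discrete token IDs through an nn.Embedding lookup, while for continuous data we can project the raw values into the model dimension with a plain nn.Linear layer.

    import torch
    import torch.nn as nn

    d_model = 512  # embedding dimension used inside the transformer

    # NLP: discrete token IDs are looked up in a learned embedding table.
    vocab_size = 30000
    token_embedding = nn.Embedding(vocab_size, d_model)
    tokens = torch.randint(0, vocab_size, (8, 128))       # (batch, seq_len)
    nlp_input = token_embedding(tokens)                   # (batch, seq_len, d_model)

    # Continuous series: real-valued measurements are projected linearly.
    n_features = 1                                        # e.g. one measurement per time step
    value_projection = nn.Linear(n_features, d_model)
    series = torch.randn(8, 128, n_features)              # (batch, seq_len, n_features)
    ts_input = value_projection(series)                   # (batch, seq_len, d_model)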

In this post, we will start with a (short) theoretical introduction to transformers, and then move on to applying them in PyTorch. For this, we will discuss a selected example: predicting the sine function. We will show how to generate data for this and pre-process it correctly, and then use transformers to learn how to predict this function. Later, we will discuss how to do inference when future tokens are not available, and conclude the post by extending the example to multi-dimensional data.
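
As a small preview of that pipeline, the following sketch generates sine-wave training windows and pushes them through PyTorch’s built-in nn.Transformer with teacher forcing; the windowing scheme and hyperparameters here are illustrative assumptions on my part, not necessarily what we will settle on later.

    import torch
    import torch.nn as nn

    def make_sine_batch(n_samples=256, src_len=32, tgt_len=8, step=0.1):
        # Sample random phase offsets and build overlapping sine windows:
        # the encoder sees src_len steps, the decoder predicts the next tgt_len.
        starts = torch.rand(n_samples, 1) * 2 * torch.pi
        t = torch.arange(src_len + tgt_len) * step
        series = torch.sin(starts + t)                    # (n_samples, src_len + tgt_len)
        src = series[:, :src_len].unsqueeze(-1)           # (n_samples, src_len, 1)
        tgt = series[:, src_len:].unsqueeze(-1)           # (n_samples, tgt_len, 1)
        return src, tgt

    d_model = 64
    proj_in = nn.Linear(1, d_model)                       # continuous values -> model dimension
    proj_out = nn.Linear(d_model, 1)                      # model dimension -> predicted value
    model = nn.Transformer(d_model=d_model, nhead=4,
                           num_encoder_layers=2, num_decoder_layers=2,
                           batch_first=True)

    src, tgt = make_sine_batch()
    # Teacher forcing: the decoder input is the target shifted right by one
    # step, seeded with the last observed source value; a causal mask keeps
    # each position from attending to future steps.
    dec_in = torch.cat([src[:, -1:], tgt[:, :-1]], dim=1)
    causal_mask = model.generate_square_subsequent_mask(dec_in.size(1))
    pred = proj_out(model(proj_in(src), proj_in(dec_in), tgt_mask=causal_mask))
    loss = nn.functional.mse_loss(pred, tgt)
    loss.backward()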

The goal of this post is to provide a complete hands-on tutorial on how to use transformers for real-world use cases, not to theoretically introduce and explain these interesting models. For the theory, I’d like to refer to this amazing article and the original paper [1] (whose architecture we will follow throughout this…
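
To already sketch the inference side mentioned above: because the true future values the decoder was trained on are unavailable at prediction time, the forecast is generated autoregressively, feeding each prediction back in as the next decoder input. Again a hedged illustration, reusing the model and projection layers from the previous snippet.

    @torch.no_grad()
    def forecast(model, proj_in, proj_out, src, steps=8):
        # Start decoding from the last observed value and append each new
        # prediction to the decoder input for the following step.
        dec_in = src[:, -1:]                              # (batch, 1, 1)
        for _ in range(steps):
            mask = model.generate_square_subsequent_mask(dec_in.size(1))
            out = model(proj_in(src), proj_in(dec_in), tgt_mask=mask)
            next_val = proj_out(out[:, -1:])              # keep only the newest step
            dec_in = torch.cat([dec_in, next_val], dim=1)
        return dec_in[:, 1:]                              # (batch, steps, 1)

    predictions = forecast(model, proj_in, proj_out, src)

Re-running the full model (including the encoder) at every step is wasteful but keeps the sketch short; in practice one would compute the encoder memory once and reuse it across decoding steps.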


