We propose a self-supervised method for pretraining universal time series representations, in which contrastive representations are learned via similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of both dimensions and evaluate our pretrained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.
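For intuition, here is a minimal PyTorch-style sketch of what similarity distillation along the temporal and instance dimensions could look like. The function names, temperatures, and the KL-based matching of student and teacher similarity distributions are illustrative assumptions for this sketch, not the paper's exact loss.

import torch
import torch.nn.functional as F

def _distill(sim_student, sim_teacher, tau_s=0.1, tau_t=0.05):
    # Match the student's row-wise similarity distribution to the teacher's.
    # Temperatures are hypothetical values, not taken from the paper.
    n = sim_student.shape[-1]
    log_p_s = F.log_softmax(sim_student.reshape(-1, n) / tau_s, dim=-1)
    p_t = F.softmax(sim_teacher.reshape(-1, n) / tau_t, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean")

def temporal_instance_distillation(z_student, z_teacher):
    # z_student, z_teacher: (B, T, D) embeddings for B series of length T.
    z_s = F.normalize(z_student, dim=-1)
    z_t = F.normalize(z_teacher, dim=-1)
    # Temporal dimension: (B, T, T) cosine similarities between the
    # timestamps within each individual series.
    loss_temporal = _distill(z_s @ z_s.transpose(1, 2),
                             z_t @ z_t.transpose(1, 2))
    # Instance dimension: (T, B, B) cosine similarities between different
    # series at each timestamp.
    z_s_i, z_t_i = z_s.transpose(0, 1), z_t.transpose(0, 1)
    loss_instance = _distill(z_s_i @ z_s_i.transpose(1, 2),
                             z_t_i @ z_t_i.transpose(1, 2))
    return loss_temporal + loss_instance

# Example: 8 series, 64 timestamps, 128-dim embeddings. In a full pipeline,
# the teacher embeddings would typically come from a momentum/EMA copy of
# the encoder, with gradients detached as here.
loss = temporal_instance_distillation(torch.randn(8, 64, 128),
                                      torch.randn(8, 64, 128).detach())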
BibTeX
@inproceedings{hajimoradlou2022selfsupervised,
  title={Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation},
  author={Ainaz Hajimoradlou and Leila Pishdad and Frederick Tung and Maryna Karpusha},
  booktitle={First Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward at ICML 2022},
  year={2022},
  url={https://openreview.net/forum?id=nhtkdCvVLIh}
}
Related Research
- What Constitutes Good Contrastive Learning in Time-Series Forecasting? C. Zhang, Q. Yan, L. Meng, and T. Sylvain.
- Borealis AI at the International Conference on Learning Representations (ICLR): Machine Learning for a better financial future. Learning and Generalization; Natural Language Processing; Time Series Modelling.
- Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting. T. Sylvain, L. Meng, and M. Amin Shabani.