We propose a self-supervised method for pretraining universal time series representations, in which contrastive representations are learned via similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of both dimensions and evaluate our pretrained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.
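To make the two dimensions concrete, the following is a minimal, hypothetical sketch of similarity distillation over a batch of time series embeddings: a teacher's pairwise-similarity distributions are matched by a student, once across timesteps within each series (temporal) and once across series at each timestep (instance). The loss form, temperature, and normalization here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def _softmax(x):
    # Numerically stable softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def _log_softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def similarity_distillation_loss(student, teacher, tau=0.1):
    """Illustrative temporal + instance similarity distillation.

    student, teacher: (batch, time, dim) embedding arrays.
    tau: softmax temperature (an assumed hyperparameter).
    Returns a scalar: cross-entropy between teacher and student
    similarity distributions, summed over the two dimensions.
    """
    def l2norm(z):
        return z / np.linalg.norm(z, axis=-1, keepdims=True)

    s, t = l2norm(student), l2norm(teacher)

    # Temporal dimension: per series, (time x time) similarities.
    s_tmp = np.einsum('bid,bjd->bij', s, s) / tau
    t_tmp = np.einsum('bid,bjd->bij', t, t) / tau

    # Instance dimension: per timestep, (batch x batch) similarities.
    s_ins = np.einsum('atd,btd->tab', s, s) / tau
    t_ins = np.einsum('atd,btd->tab', t, t) / tau

    def xent(t_logits, s_logits):
        # Cross-entropy of student rows against teacher rows.
        return -(_softmax(t_logits) * _log_softmax(s_logits)).sum(-1).mean()

    return xent(t_tmp, s_tmp) + xent(t_ins, s_ins)

rng = np.random.default_rng(0)
student = rng.standard_normal((4, 16, 8))
teacher = rng.standard_normal((4, 16, 8))
loss = similarity_distillation_loss(student, teacher)
```

Distilling row-wise similarity distributions, rather than matching embeddings directly, lets the student preserve the teacher's relational structure without being tied to its embedding space.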


@inproceedings{
hajimoradlou2022selfsupervised,
title={Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation},
author={Ainaz Hajimoradlou and Leila Pishdad and Frederick Tung and Maryna Karpusha},
booktitle={First Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward at ICML 2022},
year={2022}
}
