We propose a self-supervised method for pretraining universal time series representations, in which contrastive representations are learned using similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of each dimension and evaluate our pretrained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.
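The abstract does not fix an implementation, but the following minimal sketch illustrates one plausible form of temporal-instance similarity distillation: a student encoder is trained to match a teacher's softened pairwise similarity distributions, once across instances in a batch and once across timesteps within each series. The function name, the KL-based loss form, and the temperature tau are illustrative assumptions, not the authors' method.

import torch
import torch.nn.functional as F

def similarity_distillation_loss(student_emb, teacher_emb, tau=0.1):
    """Illustrative sketch (not the authors' implementation).

    student_emb, teacher_emb: (B, T, D) per-timestep embeddings for a
    batch of B series of length T. Returns the sum of an instance-wise
    and a temporal similarity-distillation term.
    """
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)

    # Instance dimension: pairwise similarities between series in the
    # batch, computed on time-pooled embeddings of shape (B, D).
    s_inst = s.mean(dim=1)
    t_inst = t.mean(dim=1)
    s_sim = s_inst @ s_inst.T / tau  # (B, B)
    t_sim = t_inst @ t_inst.T / tau
    inst_loss = F.kl_div(
        F.log_softmax(s_sim, dim=-1),
        F.softmax(t_sim, dim=-1),
        reduction="batchmean",
    )

    # Temporal dimension: pairwise similarities between timesteps
    # within each series, giving a (B, T, T) similarity tensor.
    s_tmp = torch.bmm(s, s.transpose(1, 2)) / tau
    t_tmp = torch.bmm(t, t.transpose(1, 2)) / tau
    tmp_loss = F.kl_div(
        F.log_softmax(s_tmp, dim=-1),
        F.softmax(t_tmp, dim=-1),
        reduction="batchmean",
    )
    return inst_loss + tmp_loss

In a typical self-distillation setup the teacher would be, for example, an exponential moving average of the student; that choice is also an assumption here rather than a detail stated in the abstract.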

BibTeX

@inproceedings{hajimoradlou2022selfsupervised,
title={Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation},
author={Ainaz Hajimoradlou and Leila Pishdad and Frederick Tung and Maryna Karpusha},
booktitle={First Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward at ICML 2022},
year={2022},
url={https://openreview.net/forum?id=nhtkdCvVLIh}
}
