This is an unofficial PyTorch implementation of PatchTST created by Ignacio Oguiza (oguiza@timeseriesAI.co), based on the paper referenced below.

In this notebook, we use PatchTST (Nie et al., 2022), a state-of-the-art model, to create a long-term time series forecast.

Here are the paper details:
Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. (2022). A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. arXiv preprint arXiv:2211.14730.
Official implementation: https://github.com/yuqinie98/PatchTST
@article{Yuqietal-2022-PatchTST,
  title   = {A Time Series is Worth 64 Words: Long-term Forecasting with Transformers},
  author  = {Yuqi Nie and Nam H. Nguyen and Phanwadee Sinthong and Jayant Kalagnanam},
  journal = {arXiv preprint arXiv:2211.14730},
  year    = {2022}
}
PatchTST has shown impressive results on some of the most widely used long-term forecasting benchmark datasets.
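The paper's title refers to the model's core idea: each (channel-independent) input series is split into subseries patches that become the input tokens of a Transformer. The snippet below is a minimal, illustrative sketch of that patching step using `Tensor.unfold`; the context length, patch length, and stride are assumed values chosen to mirror the paper's default setup (512 / 16 / 8, which yields the "64 words"), not values taken from this implementation.

```python
import torch

# Assumed illustrative sizes: batch of 32 multivariate series, 7 channels, context length 512
bs, n_vars, seq_len = 32, 7, 512
patch_len, stride = 16, 8

x = torch.randn(bs, n_vars, seq_len)

# Pad by repeating the last time step so the final patch is complete
x = torch.cat([x, x[..., -1:].repeat(1, 1, stride)], dim=-1)      # (bs, n_vars, seq_len + stride)

# Split each channel's series into overlapping patches: unfold(dimension, size, step)
patches = x.unfold(-1, patch_len, stride)                          # (bs, n_vars, n_patches, patch_len)
print(patches.shape)                                               # torch.Size([32, 7, 64, 16]) -> 64 "words"
```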
# ONNX export test. Assumes `model`, `inp`, `output`, `new_inp`, and `new_output`
# were created in earlier cells, and that `test_close` (fastcore.test) is available.
import gc, os
import torch

try:
    import onnx
    import onnxruntime as ort
    try:
        file_path = "_model_cpu.onnx"
        torch.onnx.export(model.cpu(),  # model being run
                          inp,          # model input (or a tuple for multiple inputs)
                          file_path,    # where to save the model (can be a file or file-like object)
                          input_names=['input'],    # the model's input names
                          output_names=['output'],  # the model's output names
                          dynamic_axes={'input': {0: 'batch_size'},    # variable length axes
                                        'output': {0: 'batch_size'}})

        # Load the model and check it's ok
        onnx_model = onnx.load(file_path)
        onnx.checker.check_model(onnx_model)
        del onnx_model
        gc.collect()

        # New session
        ort_sess = ort.InferenceSession(file_path)
        output_onnx = ort_sess.run(None, {'input': inp.numpy()})[0]
        test_close(output.detach().numpy(), output_onnx)
        new_output_onnx = ort_sess.run(None, {'input': new_inp.numpy()})[0]
        test_close(new_output.detach().numpy(), new_output_onnx)
        os.remove(file_path)
        print(f'{"onnx":10}: ok')
    except Exception:
        print(f'{"onnx":10}: failed')
except ImportError:
    print('onnx and onnxruntime are not installed. Please install them to run this test')
onnx and onnxruntime are not installed. Please install them to run this test
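The output above comes from an environment where onnx and onnxruntime were not available, so the cell took the fallback branch. Installing them (for example with `pip install onnx onnxruntime`) lets the cell export the model, validate it with `onnx.checker`, and compare ONNX Runtime predictions against the PyTorch outputs.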