This repo holds the code for: ExeChecker: Where Did I Go Wrong? (link, pdf)
We experimented on two datasets: ExeCheck and UI-PRMD. Both contain correct and incorrect movements of common exercises performed by patients in physical therapy and rehabilitation programs.
To reproduce our experiments, download the processed datasets from here, unzip them, place them in the `processed_execheck` and `processed_uiprmd` folders, respectively, and proceed with training.
The processed datasets have been segmented by repetition and augmented with mirroring.
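For reference, mirroring a skeleton sequence typically amounts to flipping one spatial axis and swapping left/right joints. The sketch below is a minimal illustration, assuming sequences are stored as `(frames, joints, 3)` NumPy arrays; the joint index pairs are hypothetical placeholders, not the mapping actually used in this repo.

```python
import numpy as np

# Hypothetical left/right joint index pairs (NOT the repo's actual mapping).
LEFT_RIGHT_PAIRS = [(4, 11), (5, 12), (6, 13), (7, 14)]

def mirror_sequence(seq: np.ndarray) -> np.ndarray:
    """Mirror a (frames, joints, 3) skeleton sequence across the x-axis."""
    mirrored = seq.copy()
    mirrored[..., 0] *= -1.0  # flip the x coordinate
    for left, right in LEFT_RIGHT_PAIRS:
        # swap the left and right joints so the mirrored pose stays anatomically consistent
        mirrored[:, [left, right]] = mirrored[:, [right, left]]
    return mirrored
```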
The ExeCheck dataset consists of RGB-D videos of 10 rehabilitation exercises performed by 7 healthy subjects, collected with an Azure Kinect sensor. Each exercise is performed by the same subject in both correct and incorrect forms, with 5 repetitions per form. You can download the original dataset from Dropbox.
The UI-PRMD dataset consists of 10 rehabilitation movements. Ten healthy individuals repeated each movement 10 times in front of two motion-capture systems: a Vicon optical tracker and a Kinect camera. The data are presented as positions and angles of the body joints in the skeletal models provided by the Vicon and Kinect systems.
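Once unzipped, a processed sample can be inspected like any NumPy array. The snippet below is only a sketch: the file name and the shape comment are assumptions for illustration, not the repo's actual storage format.

```python
import numpy as np

# Hypothetical file name; check the unzipped folders for the actual layout.
sample = np.load("processed_execheck/some_sample.npy")
print(sample.shape)  # e.g. (frames, joints, channels)
```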
You can also create your own dataset using the scripts in the `prepare` folder, modified as needed.
Adjust the config file to your needs, then run:

```
python trainMulti_perExe.py --config ./config/execheck_Multi_perExe.yaml
```
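If you want to inspect or programmatically tweak a config before training, a minimal sketch using PyYAML (assuming PyYAML is installed; no specific config keys are assumed here):

```python
import yaml

# Load and print the training config to verify paths and hyperparameters
# before launching a run.
with open("./config/execheck_Multi_perExe.yaml") as f:
    cfg = yaml.safe_load(f)
print(cfg)
```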
If you find our research helpful, please cite this work:
```bibtex
@inproceedings{gu2025exechecker,
  title={ExeChecker: Where Did I Go Wrong?},
  author={Gu, Yiwen and Patel, Mahir and Betke, Margrit},
  booktitle={European Conference on Computer Vision},
  pages={340--355},
  year={2025},
  organization={Springer}
}
```
- We use STGAT as our feature extractor. Thanks to the original authors for their work!