"Transformer-based deep reverse attention network for multi-sensory human activity recognition" - published in Engineering Applications of Artificial Intelligence, Elsevier.
Access the journal article: https://www.sciencedirect.com/science/article/pii/S0952197623003342
@article{pramanik2023transformer,
  title = {Transformer-based deep reverse attention network for multi-sensory human activity recognition},
  author = {Pramanik, Rishav and Sikdar, Ritodeep and Sarkar, Ram},
  journal = {Engineering Applications of Artificial Intelligence},
  volume = {122},
  pages = {106150},
  year = {2023},
  issn = {0952-1976},
  doi = {10.1016/j.engappai.2023.106150},
  url = {https://www.sciencedirect.com/science/article/pii/S0952197623003342}
}
The original credit for the dataset goes to the authors of the following repository: https://github.com/RanaMostafaAbdElMohsen/Human_Activity_Recognition_using_Wearable_Sensors_Review_Challenges_Evaluation_Benchmark
Datasets can be found here: https://drive.google.com/drive/folders/13j488oaUwk_lufg9w9dvtExxw4wmOGVx
pip3 install -r requirements.txt
python3 main.py --data_directory "data"
Available arguments:
--epochs : Number of epochs of training. Default = 150
--folds : Number of folds of training. Default = 10
--batch_size : Batch size for training. Default = 192
--learning_rate : Initial learning rate. Default = 0.001
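For example, to override the defaults listed above, pass the flags alongside the data directory (the values below are purely illustrative, not recommended settings):
python3 main.py --data_directory "data" --epochs 200 --folds 5 --batch_size 128 --learning_rate 0.0005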