abdulrahmanalaa123 opened 2 years ago
The details are available in the paper: https://mental.jmir.org/2018/3/e10153/ Data was collected from a Samsung R720 Gear S2 Sport BT.
The raw data is under the raw_data folder. The coding of participants used to label the data is available in user_study_encoding.csv: https://github.com/juancq/emotion-recognition-smartwatch/blob/d1374d06f126c5a6e0d68cb63c21f5af53d5bcf7/user_study_encoding.csv
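If it helps, here is a minimal sketch of loading the encoding file and iterating over the raw data with pandas. The column names and file layout below are assumptions for illustration, so check the files themselves:

```python
import pandas as pd
from pathlib import Path

# Load the participant encoding used to label the data.
# Column names are assumptions; check user_study_encoding.csv for the real headers.
encoding = pd.read_csv("user_study_encoding.csv")
print(encoding.head())

# Walk the raw sensor files under raw_data/.
# The file naming and CSV format are also assumptions.
for raw_file in Path("raw_data").glob("*.csv"):
    raw = pd.read_csv(raw_file)
    print(raw_file.name, raw.shape)
```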
Hi! Our group is currently conducting research on the effects of music on people's emotions, and we would like to use your dataset to perform Multidimensional Scaling. Is there any way we could identify the features in each column of the "features" dataset? Thank you!
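For reference, the MDS step we have in mind looks roughly like this with scikit-learn; the file name, label handling, and scaling are assumptions on our side, not taken from this repo:

```python
import pandas as pd
from sklearn.manifold import MDS
from sklearn.preprocessing import StandardScaler

# Hypothetical features file; the actual output comes from extract_windows.py.
features = pd.read_csv("features.csv")

# Assume the numeric columns are the features; drop or select label columns as needed.
X = StandardScaler().fit_transform(features.select_dtypes("number"))

# Project to 2 dimensions with metric MDS for visualization.
mds = MDS(n_components=2, random_state=0)
embedding = mds.fit_transform(X)
print(embedding.shape)
```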
@Lnzry good point!
You can identify the features by tracing this function: https://github.com/juancq/emotion-recognition-smartwatch/blob/0c9c7c644f2e59da3908867fde1df2785123eead/extract_windows.py#L73-L110
The above also calls this function: https://github.com/juancq/emotion-recognition-smartwatch/blob/0c9c7c644f2e59da3908867fde1df2785123eead/extract_windows.py#L30-L70
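To give a rough sense of what that kind of function produces, here is a simplified sketch of computing windowed statistics over a 1-D sensor signal. The window size, step, and feature set are assumptions for illustration, not a copy of extract_windows.py:

```python
import numpy as np
import pandas as pd

def window_features(signal: np.ndarray, window: int = 128, step: int = 64) -> pd.DataFrame:
    """Compute simple per-window statistics over a 1-D sensor signal.

    The window/step sizes and feature set are illustrative assumptions;
    see extract_windows.py for the actual feature definitions.
    """
    rows = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        rows.append({
            "mean": w.mean(),
            "std": w.std(),
            "min": w.min(),
            "max": w.max(),
            "median": np.median(w),
        })
    return pd.DataFrame(rows)

# Example with synthetic accelerometer-like data.
acc_x = np.random.randn(1000)
print(window_features(acc_x).head())
```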
I would love it if you could post a description of the data and where you got it from. If it came from an API, can you clarify which code was used to access that API and collect the data? Alternatively, a link to download the whole dataset would be great.