Stevewsw opened 6 years ago
Hello
I am reimplementing your code. Can you guide me on exactly why you are using mat.getTrainingData(file)? I can't find anything on the internet about what mat. is used for. Moreover, are you using the .edf files as your dataset? Kindly guide me on how you have used your dataset.
Thank you
@GulrukhTurabee My apologies, I had not pushed one of the source files, which contains the readMat module.
My data is stored in .mat MATLAB files. I haven't uploaded the raw data because the files are very large. The readMat module is used to read .mat files using the h5py module and to perform data-related operations, such as splitting into training and testing data, standardizing the data, one-hot-encoding the labels, etc.
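For readers running into the same questions: the sketch below is not the actual readMat module (whose internals may differ), just an illustration of the kinds of operations described above, with made-up function names:

```python
import numpy as np

def standardize(X_train, X_test):
    # Z-score both splits using statistics from the training split only,
    # so no information leaks from the test set
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0) + 1e-8
    return (X_train - mu) / sigma, (X_test - mu) / sigma

def one_hot(y, n_classes):
    # Map integer class labels 0..n_classes-1 to one-hot rows
    out = np.zeros((len(y), n_classes), dtype=np.float32)
    out[np.arange(len(y)), y.astype(int)] = 1.0
    return out
```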
Hello
Okay, no problem. Actually, it is giving me a 'no module named readMat' error even after installing all the required dependencies and modules. Can you guide me on this? Secondly, can you please provide me a screenshot of your data directory? Do the .mat files that you are using as your data files comprise all the .edf files from the Sleep-EDF database?
Thank you
@GulrukhTurabee Pull the recent version of the commit; I pushed the readMat module this morning.
I have already converted the data from EDF into native MATLAB structs using edfread, which can be found on MATLAB Central. Here is a screenshot of the structure:
Since there are 36 subjects, using leave-one-subject-out cross-validation there are 36 .mat files, each with one subject held out as testing data, with file names of the form DataTestSub_{sub_id}.mat. The training data is stored as cell arrays, X_train and Y_train, where each cell contains a matrix. The testing data, X_test and Y_test, is stored as matrices.
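A side note for anyone loading these files from Python: .mat files saved with -v7.3 are HDF5 files, and MATLAB cell arrays appear in h5py as arrays of object references that must be dereferenced. Here is a rough sketch, assuming a struct named data with exactly the fields described above (the helper name is hypothetical):

```python
import h5py
import numpy as np

def load_subject(path):
    # Hypothetical loader for one DataTestSub_{sub_id}.mat file
    with h5py.File(path, 'r') as f:
        data = f['data']
        # Cell arrays: dereference each cell into its own matrix
        X_train = [np.array(f[ref]) for ref in data['X_train'][:].ravel()]
        Y_train = [np.array(f[ref]) for ref in data['Y_train'][:].ravel()]
        # Plain matrices (note that MATLAB stores arrays transposed in HDF5)
        X_test, Y_test = np.array(data['X_test']), np.array(data['Y_test'])
    return X_train, Y_train, X_test, Y_test
```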
Thank you so much for providing the readMat module. Are you sure there are 36 subjects? I have 39 subjects from the same dataset. Moreover, can you also (if possible) provide me the code for storing the training data as cell arrays?
Hello. Any idea about this error?
Thanks in advance.
@GulrukhTurabee The code expects a struct named data and it's not able to find it. When you create the data struct, make sure you are using something like this:
data = struct('X_train', {}, 'Y_train', {}, 'X_test', [], 'Y_test', [], 'TestSubIdx', []);
The above code initializes an empty struct. For saving the struct use:
save('yourFileName.mat', 'data', '-v7.3');
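Since -v7.3 writes an HDF5 file, this is also why h5py can read it; a quick sanity check from Python might look like the following (assuming the struct has been populated before saving):

```python
import h5py

# List the fields of the saved struct to confirm the file is readable
with h5py.File('yourFileName.mat', 'r') as f:
    print(list(f['data'].keys()))  # expect X_train, Y_train, X_test, Y_test, TestSubIdx
```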
P.S.: I am most likely going to revise the readMat module to handle data importing, because currently it is not the best way to do it. I will keep you posted.
@Stevewsw I can certainly help. Can you be more specific about what kind of illustrations you need?
Hey, thanks for this brilliant repo!
I'm analyzing EEG data for sleep stage scoring.
Here I have EEG data (.edf):
.edf: each file has Fs = 200 and contains 8 channels of data.
I have converted it into .npy files with a shape of (nChannels * timelength * SampleFrequency), for example (8*10h*200), where 10h == 10*60*60;
.csv: containing the label info.
No. | StartTime | Label |
---|---|---|
1 | 21:34:57 | WK |
2 | 21:35:27 | WK |
3 | 21:35:57 | WK |
4 | 21:36:27 | WK |
5 | 21:36:57 | WK |
6 | 21:37:27 | WK |
7 | 21:37:57 | WK |
... | ... | ... |
Please do not hesitate to let me know if you have any questions. I would be very grateful for a response.
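To make the setup concrete, here is a rough sketch of slicing such a recording into 30-second epochs (the spacing between the label timestamps above) and pairing them with the CSV labels; the file names are placeholders:

```python
import numpy as np
import pandas as pd

FS = 200         # sampling frequency (Hz)
EPOCH_SEC = 30   # the label rows above are 30 s apart

signals = np.load('recording.npy')   # shape: (nChannels, nSamples), e.g. (8, 10*60*60*200)
labels = pd.read_csv('labels.csv')   # columns: No., StartTime, Label

# Cut the continuous recording into consecutive 30-s epochs, one per label row
spe = FS * EPOCH_SEC
n_epochs = min(len(labels), signals.shape[1] // spe)
epochs = (signals[:, :n_epochs * spe]
          .reshape(signals.shape[0], n_epochs, spe)
          .transpose(1, 0, 2))       # -> (n_epochs, nChannels, spe)
y = labels['Label'].values[:n_epochs]
```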
@kedarps Is there any possibility that you could upload your data files to Google Drive and share them with me, or maybe email me? That would be a great help.
Thank you
@Stevewsw and @GulrukhTurabee, give me a few more days to respond to your messages. I can certainly help you.
Alright, waiting for your reply.
@kedarps Hi, I'd also like to give your implementation a try. Would it be possible for you to share one of your .mat files? This would certainly help me a lot. Best, dom
I want to reimplement your code; could you please add some detailed, step-by-step illustrations to the readme.md? I would be grateful for a response.