trentool / TRENTOOL3

Open-Source MATLAB toolbox for transfer entropy estimation
http://trentool.github.io/TRENTOOL3/
GNU General Public License v3.0

negative transfer entropy? #15

Closed brekels closed 6 years ago

brekels commented 9 years ago

Hello, I am doing a project analysing pairs of EEG and EMG channels to measure flow from brain activity to facial movements. However, when I run the TRENTOOL pipeline (either in ensemble or regular mode), I get many negative values for transfer entropy, which translate to NaNs for mutual information. Do you know off-hand how this could be?

Can post more details as necessary, but I have 384 data points per trial (1.5s @ 256 Hz), EEG filtered 1-30 Hz, EMG 20-40 Hz, so I've set ACT threshold to 8 or 10 with maxlag of 128. Using VW_ds, Ragwitz criterion, and Faes method (mostly just working off the example script).

Any help would be much appreciated. Thanks!

trentool commented 9 years ago

@brekels Negative TE values are due to the estimator used (see the paper and dissertation by Kraskov). You can't interpret the TE values obtained from this estimator at face value. TRENTOOL therefore implements permutation testing to check whether estimated TE values are statistically significant. So the important information returned by the toolbox is whether there is significant information transfer between two sources.
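To illustrate why this estimator family can return negative values, here is a minimal, generic Python sketch of the Kraskov-Stoegbauer-Grassberger (KSG) estimator (algorithm 1) for mutual information — not TRENTOOL code, just an illustration of the principle. The digamma-based bias correction is centred on zero for independent data, so estimates fluctuate around zero and routinely dip below it:

```python
# Minimal KSG (algorithm 1) MI estimator sketch; illustrative only.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=4):
    """KSG mutual information estimate in nats."""
    n = len(x)
    x = x.reshape(n, 1)
    y = y.reshape(n, 1)
    joint = np.hstack([x, y])
    # distance to the k-th nearest neighbour in the joint space (max-norm);
    # column 0 of the query result is the point itself at distance 0
    d, _ = cKDTree(joint).query(joint, k=k + 1, p=np.inf)
    eps = d[:, -1]
    # count neighbours strictly within eps in each marginal space
    # (tiny offset approximates the strict inequality; -1 removes the point itself)
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y_indep = rng.standard_normal(1000)                        # independent of x
y_dep = 0.9 * x + np.sqrt(0.19) * rng.standard_normal(1000)  # correlated with x

print(ksg_mi(x, y_indep))  # fluctuates around 0, often negative
print(ksg_mi(x, y_dep))    # clearly positive (true MI is about 0.83 nats)
```

The same bias behaviour carries over to TE, which is built from conditional mutual information terms estimated the same way.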

I hope this helps, just let me know if anything is unclear,

Best, Patricia

brekels commented 9 years ago

Thanks, Patricia @trentool! The dissertation was very helpful.

It seems that the important point there is that "H can be negative if close pairs in Y corresponded mainly to distant pairs in X," which they say is unlikely (they haven't seen data as noisy as this!). Still, it seems strange that running trials from one condition in the ensemble method was able to yield significant results (before correction). Unless I'm missing something?

trentool commented 9 years ago

I am sorry, I don't get your question, what do you mean by "running trials from one condition in the ensemble method"? Were you testing data from one trial against surrogate data? Why would it be strange to get significant results?

brekels commented 9 years ago

Sorry, no, I was running ~2500 trials together, all from one group of experimental conditions. I was just surprised that the majority of transfer entropy values were < 0, MI gave NaNs, and it still gave significant results for several channel pairs (possibly the surrogate data set was not big enough).
It sounded to me like negative values should be relatively rare. Would that mean the embedding/delay search parameters are way off, or is it just noisy data? Thanks!

trentool commented 9 years ago

We do find negative TE values in some of our data sets, so in general it is nothing to worry about. Also, it does not mean you can't have significant results. If TE values for surrogate data are also negative, you can get a significant result for your TE estimate on the original data (if the TE value is larger than the (1 - alpha) quantile of the surrogate TE values). So, at first glance, your results seem fine.
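The test logic described above can be sketched in a few lines of generic Python (the numbers are made up for illustration; TRENTOOL performs this test internally). The point is that significance is relative to the surrogate distribution, which may itself be centred on negative values:

```python
# Permutation-test logic: a negative TE estimate can still be significant
# if it exceeds the (1 - alpha) quantile of the surrogate distribution.
import numpy as np

rng = np.random.default_rng(1)
te_observed = -0.02                            # negative, but that's fine
te_surrogates = rng.normal(-0.05, 0.01, 1000)  # surrogate TE values, also negative

# one-sided p-value: fraction of surrogates at least as large as the estimate
p_value = np.mean(te_surrogates >= te_observed)
significant = p_value < 0.05
print(p_value, significant)
```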

Also, negative TE values do not mean that you have the wrong embedding dimension. You can check the distribution of optimal embedding dimensions over trials. If a lot/the majority of trials have the maximum embedding dimension you fed to TEprepare in cfgTEP.ragdim, then it would make sense to run the analysis again to check if a larger embedding gets chosen by the Ragwitz criterion.
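The check suggested above amounts to looking at how often the Ragwitz search selects the top of the candidate range. A hypothetical sketch in Python (the per-trial optima here are invented; in practice you would pull them from the TEprepare output):

```python
# Hypothetical check of per-trial optimal embedding dimensions against the
# candidate range handed to TEprepare via cfgTEP.ragdim. If most trials sit
# at the maximum, the range was probably too small.
import numpy as np

ragdim = np.arange(2, 9)                              # candidate dimensions 2..8
opt_dims = np.array([8, 8, 7, 8, 6, 8, 8, 8, 7, 8])   # hypothetical per-trial optima

frac_at_max = np.mean(opt_dims == ragdim.max())
if frac_at_max > 0.5:
    print("majority of trials at the maximum - rerun with a larger ragdim range")
```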

dotsoneuroscience commented 9 years ago

Hello,

I am finding that negative TE values are the norm in the data I am working with. I am analyzing LFP signals (sampled at 1 kHz) that are 400 ms long, with ~500 trials.

Is there a different way to estimate the bias to avoid this? My primary concern is that this makes the actual TE values difficult to interpret within and between pairs of signals.

Any help is appreciated.

mwibral commented 9 years ago

Hi,

the important figure is the difference to surrogate data, not the raw value. This difference is what the permutation tests evaluate.

Best, Michael


trentool commented 8 years ago

@dotsoneuroscience regarding your comment "My primary concern is that this makes the actual TE values difficult to interpret within and between pairs of signals": note that the actual TE values estimated by the Kraskov estimator can't be interpreted at face value because of the bias (see the first answer in this thread). To compare TE values between pairs of signals, you have to make sure that the number of points that enter the estimation is the same for all pairs and that all data are embedded using the same embedding dimension. Otherwise, differences in TE values may be due to differences in bias rather than actual differences in information transfer.
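A generic sketch of the first part of that advice (the pair names and sample counts are hypothetical): equalize the number of points per pair before estimation, so that sample-size-dependent bias cannot masquerade as a difference in information transfer.

```python
# Equalize sample counts across channel pairs before TE estimation,
# so estimator bias is comparable between pairs. Data are placeholders.
import numpy as np

rng = np.random.default_rng(2)
# hypothetical per-pair embedded data with unequal numbers of points
pairs = {
    "EEG1->EMG1": rng.standard_normal((2400, 2)),
    "EEG2->EMG1": rng.standard_normal((2150, 2)),
    "EEG1->EMG2": rng.standard_normal((2600, 2)),
}

n_min = min(len(d) for d in pairs.values())
# subsample every pair to the common minimum without replacement
pairs_equal = {name: d[rng.choice(len(d), n_min, replace=False)]
               for name, d in pairs.items()}
print({name: len(d) for name, d in pairs_equal.items()})
```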

MinPanNWU commented 1 year ago

Hello,

In my experiment, most parameters are at their default values, but all of the transfer entropy values were < 0. It's difficult for me to understand.

Any help would be much appreciated!

mwibral commented 1 year ago

Hi, I assume you are using the Kraskov estimator on continuous-valued data. If so, the bias properties of the Kraskov estimator will sometimes result in negative values. This is described in detail in the original publication and in Kraskov's PhD thesis.

What may matter more for you is whether your TE values are actually significant.

Best, Michael


MinPanNWU commented 1 year ago

Thanks a lot, Michael!

Could you give me some advice and guidance on how to transform negative TE values into positive ones?

Any help would be much appreciated!

Sincerely,

Min

pwollstadt commented 1 year ago

Dear Min, please have a look at this very nice post by Joe Lizier on negative values from the Kraskov estimator:

https://github.com/jlizier/jidt/wiki/FAQs#what-does-it-mean-if-i-get-negative-results-from-a-kraskov-stoegbauer-grassberger-estimator

The negative values are a result of the estimator's built-in bias correction. There is no transform that removes this effect. However, you may consider performing a statistical test to assess the estimate's statistical significance and report that instead.

Best, Patricia

gnlxk commented 1 year ago

Hi,

When running transfer entropy with the KSG estimator on my data, I have consistently observed negative TE values, and the observed TE was always more negative than every TE value calculated from surrogate data (1000 surrogates). At this point, I wonder whether I should interpret this as no relation between the two signals, or as a systematically consistent relationship between them.

I will appreciate any help.