zcakhaa / DeepLOB-Deep-Convolutional-Neural-Networks-for-Limit-Order-Books

This Jupyter notebook demonstrates our recent work, "DeepLOB: Deep Convolutional Neural Networks for Limit Order Books", published in IEEE Transactions on Signal Processing. We use the FI-2010 dataset and show how the model architecture is constructed. FI-2010 is publicly available; interested readers can refer to the original paper.

CNN Kernel Size #19

Closed deeepwin closed 2 years ago

deeepwin commented 2 years ago

Thank you very much for the Jupyter example. I studied the paper, especially the proposed architecture, but I cannot reconcile the code implementation with what is described in your paper.

On page 5 it is explained that the first CNN convolves over p(t) and v(t) using a (1, 2) kernel. The input features have shape (Batch, 100, 40, 1).


But in your example code the kernel is defined along the width, Conv2D(32, (1, 2), ...), not the height. So you already convolve over the levels (horizontally). But your p(t) and v(t) features are arranged vertically, as part of the 40 data features per time stamp. Shouldn't you define Conv2D(32, (2, 1), ...)?

    conv_first1 = Conv2D(32, (1, 2), strides=(1, 2))(input_lmd) 
    conv_first1 = keras.layers.LeakyReLU(alpha=0.01)(conv_first1) 
    conv_first1 = Conv2D(32, (4, 1), padding='valid')(conv_first1)
    conv_first1 = keras.layers.LeakyReLU(alpha=0.01)(conv_first1)
    conv_first1 = Conv2D(32, (4, 1), padding='same')(conv_first1)
    conv_first1 = keras.layers.LeakyReLU(alpha=0.01)(conv_first1)

    ...

Also, the paper says nothing about the other two layers with kernel_size = (4, 1). There the kernel is vertical, with height = 4. Could you briefly explain the reasoning behind this? Thank you.
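For context on the axis question above, here is a minimal sketch (not the repo's code) that emulates a single-filter Conv2D(32, (1, 2), strides=(1, 2)) in plain numpy. In Keras, kernel_size is (rows, cols); for an input of shape (batch, 100, 40, 1) the rows are the 100 time steps and the cols are the 40 features, so a (1, 2) kernel slides along the feature axis, pairing adjacent columns. The kernel weights here are a hypothetical example:

```python
import numpy as np

# Emulate one (1, 2) filter with stride (1, 2) on a (time, feature) array.
# Keras kernel_size = (rows, cols) = (time, feature) for channels-last input,
# so (1, 2) pairs adjacent *feature* columns, not adjacent time steps.

x = np.arange(100 * 40, dtype=float).reshape(100, 40)  # (time, features)
w = np.array([0.5, 0.5])                               # hypothetical (1, 2) kernel

# Slide the kernel over the feature (width) axis with stride 2:
out = np.stack(
    [x[:, 2 * j] * w[0] + x[:, 2 * j + 1] * w[1] for j in range(x.shape[1] // 2)],
    axis=1,
)

print(out.shape)  # (100, 20): 40 features collapse into 20 price-volume pairs
```

The time axis is untouched; only the 40 features are reduced to 20, one value per (price, volume) pair at each limit-order-book level.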

deeepwin commented 2 years ago

The kernel size is correct. With channels-last input of shape (Batch, 100, 40, 1), the 40 features lie along the width dimension, which corresponds to the second entry of kernel_size, so a (1, 2) kernel convolves each adjacent price-volume pair as described in the paper.
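The answer above can be double-checked by tracing the output shapes through the three conv layers in the question, using the standard Conv2D shape rules (for 'valid' padding, out = floor((in - k) / s) + 1; for 'same', out = ceil(in / s)). This is a sketch of the shape arithmetic only, not the repo's code:

```python
import math

def conv_out(size: int, kernel: int, stride: int, padding: str) -> int:
    """Output length along one axis of a Conv2D layer."""
    if padding == 'same':
        return math.ceil(size / stride)
    return (size - kernel) // stride + 1  # 'valid' padding

# Input (time, feature) = (100, 40); apply the three convs from the question:
t, f = 100, 40
t, f = conv_out(t, 1, 1, 'valid'), conv_out(f, 2, 2, 'valid')  # (1,2), stride (1,2)
print(t, f)  # 100 20 -> one value per price-volume pair, time untouched
t, f = conv_out(t, 4, 1, 'valid'), conv_out(f, 1, 1, 'valid')  # (4,1), 'valid'
print(t, f)  # 97 20  -> aggregates over 4 time steps, features untouched
t, f = conv_out(t, 4, 1, 'same'), conv_out(f, 1, 1, 'same')    # (4,1), 'same'
print(t, f)  # 97 20  -> 'same' padding preserves the shape
```

Only the first layer touches the feature axis (40 to 20); the two (4, 1) layers then aggregate over time at each feature position, which is consistent with the answer that the feature pairing happens along the width.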