Closed solyron closed 1 month ago
Or, in ludwig/models/modules, what is the relevant model?
@solyron Your question is not really clear to me. Could you please provide a detailed example, and we will try to help.
Thank you. Could you provide more details regarding the neural network implemented by Ludwig in the numerical case?
Got it. As the documentation says, after the normalization, the numerical feature is passed to a single neuron for scaling purposes. There's no activation at this moment.
Thank you. What exactly is the scaling? What feature scaling have you used? Is it just a sum of x_i * w_i? Is it polynomial (for example 1 + w_0*x_0 + (w_1*x_1)^2 + ...), or something else? I guess that by "scaling purposes" you mean finding the weights, but I would like to know what you do with the inputs I insert.
P.S. I searched in ludwig/models/modules for the code where you implemented the scaling for the numerical case, but couldn't find it.
P.S. 2 - Feature request: add an activation function to the numerical case.
The code is in ludwig/features/numerical_feature.py
Here scaling is achieved by the single neuron. A single neuron without an activation is just a linear transformation.
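To make this concrete, here is a minimal sketch of what a single neuron with no activation computes: a plain linear transformation y = w * x + b with one learnable weight and bias per feature. The function and variable names below are illustrative, not Ludwig's actual internals.

```python
import numpy as np

def single_neuron(x, w=2.0, b=0.5):
    """Scale a numerical feature with one weight and one bias (no activation).

    This mirrors the idea of passing a raw float through a single neuron:
    the output is just a linear rescaling of the input.
    """
    return w * x + b

x = np.array([0.0, 1.0, 2.0])
print(single_neuron(x))  # -> [0.5 2.5 4.5], linear in x
```

During training, w and b would be learned by backpropagation like any other weights; the point is that without an activation the mapping stays linear.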
I'll add a note for adding activations for numerical feature. It should be a simple task, if you want to take a crack at it.
Just to clarify: the single neuron is a simple linear transformation. The idea is that it should learn to scale different numerical input features differently, if needed. But I agree with the idea of being able to provide all the other parameters of a fully connected layer; there's no reason not to.
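As a hedged sketch of what that extension could look like, the single-neuron encoder could accept an optional activation on top of the existing linear scaling. The function name and signature here are hypothetical, not Ludwig's actual API:

```python
import numpy as np

def numerical_encoder(x, w=1.0, b=0.0, activation=None):
    """Hypothetical numerical encoder: linear scaling plus optional activation."""
    y = w * x + b  # current behavior: pure linear scaling
    if activation == "relu":
        y = np.maximum(y, 0.0)
    elif activation == "tanh":
        y = np.tanh(y)
    return y

x = np.array([-2.0, 0.0, 3.0])
print(numerical_encoder(x))                     # -> [-2.  0.  3.] (linear, default)
print(numerical_encoder(x, activation="relu"))  # -> [0. 0. 3.]
```

With an activation in place, the encoder's output is no longer a purely linear function of the input, which is exactly the flexibility the feature request asks for.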
Hello,
In Numerical Input Features and Encoders, it says "the raw float values coming from the input placeholders are passed through a single neuron for scaling purposes". Is this model capable of dealing with a training set that comes from a set of non-linear equations, or from a single non-linear equation?
Would love to see more information regarding the sigmoid function used in that single neuron, or other relevant information if it exists. Thanks!