ludwig-ai / ludwig

Low-code framework for building custom LLMs, neural networks, and other AI models
http://ludwig.ai
Apache License 2.0

Numerical input encoder #588

Closed. solyron closed this issue 1 month ago.

solyron commented 4 years ago

Hello,

In Numerical Input Features and Encoders, "the raw float values coming from the input placeholders are passed through a single neuron for scaling purposes". Is this model capable of dealing with a training set that comes from a set of non-linear equations, or from a single non-linear equation?

Would love to see more information regarding the sigmoid function used in that single neuron, or any other relevant information if it exists. Thanks!

BY-hash commented 4 years ago

Or, in ludwig/models/modules, what is the relevant model?

msaisumanth commented 4 years ago

@solyron Your question is not really clear to me. Could you please provide a detailed example, and we will try to help?

solyron commented 4 years ago

Thank you. Could you provide more details regarding the neural network implemented by Ludwig in the numerical case:

  1. What is the training method? (Back-propagation / stochastic gradient descent / learning rate decay / other?)
  2. What is the activation function? (Sigmoid / hyperbolic tangent / other?)
  3. Is it true that there is only one neuron and only one hidden layer?

Any other information regarding the characteristics of the neural network in the numerical case would be of great help. Thanks!

msaisumanth commented 4 years ago

Got it. As the documentation says, after normalization, the numerical feature is passed to a single neuron for scaling purposes. There's no activation at the moment.
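
For intuition, here is a minimal sketch of what that description amounts to (plain NumPy, illustrative values and names; not Ludwig's actual implementation):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # raw numerical feature column

# 1) normalization (e.g. z-score, one of Ludwig's preprocessing options)
z = (x - x.mean()) / x.std()

# 2) the "single neuron": one learned weight and bias, no activation
w, b = 0.5, 0.1                       # learned during training
encoded = w * z + b                   # a purely linear transformation
```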

solyron commented 4 years ago

Thank you. What exactly is the scaling? What feature scaling have you used? Is it just a sum of xi*wi? Is it polynomial (for example, 1 + w0*x0 + (w1*x1)^2 + ...), or something else? I guess that by "scaling purposes" you mean finding the weights, but I would like to know what you do with the inputs I provide.

P.S. I searched ludwig/models/modules for the code where you implemented the scaling for the numerical case, but couldn't find it.

P.S. 2 - Feature request: add an activation function to the numerical case.

msaisumanth commented 4 years ago

The code is in ludwig/features/numerical_feature.py

Here, scaling is basically achieved by the single neuron. A single neuron without an activation is basically just a linear transformation.
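
To make that concrete, and to answer the polynomial question above, a sketch of what a single neuron with no activation computes (names here are illustrative):

```python
def single_neuron(x, w, b):
    # y = w * x + b: an affine map, i.e. a weighted sum plus a bias.
    # There are no polynomial terms like (w1 * x1)**2; any
    # non-linearity would have to come from later layers.
    return w * x + b
```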

I'll add a note about adding activations for numerical features. It should be a simple task, if you want to take a crack at it.

w4nderlust commented 4 years ago

Just to clarify, the single neuron is a simple linear transformation. The idea is that it should learn to scale different numerical input features differently, if needed. But I agree with the idea of being able to provide all the other parameters of a fully connected layer; there's no reason not to.
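
As a sketch of what that extension could look like (hypothetical, not Ludwig's current API), the encoder would simply apply an optional activation after the existing linear scaling:

```python
import numpy as np

def encode_numerical(x, w, b, activation=None):
    y = w * x + b                       # the existing linear scaling
    if activation == "tanh":
        y = np.tanh(y)                  # optional non-linearity
    elif activation == "sigmoid":
        y = 1.0 / (1.0 + np.exp(-y))
    return y
```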