Closed — lbellego closed this issue 2 years ago
@lbellego, thank you — it's a good idea.
Do you have an estimate of how many neurons/layers you need in your network? Most small functions require 2 or 3 layers at most. Are you planning to use fully connected (dense) layers?
I only have one layer of 32 neurons... I predicted the price of my house from the sale history around it. I have the sold price, area, number of rooms, floors, bathrooms, etc. (a list of 50 houses sold nearby). I did it with neural-api and, for my house, got the same estimate as the regression written in Go. The Go library gives a formula with terms like value*price, so I can extract the price of a garage, for example :-)
@lbellego, that's a cool project. Thank you for sharing.
Before I suggest a solution, may I ask what layer type you used?
Yes, this is an extract of the code (copied from the AND/OR/XOR example):
NN := TNNet.Create();
NN.HideMessages();
NFit := TNeuralFit.Create();
TrainingPairs := TNNetVolumePairList.Create();
NFit.OnStart := @ProcOnStart;
NFit.OnAfterEpoch := @ProcOnAfterEpoch;
...........
NN.AddLayer(TNNetInput.Create(Length(vInput)));
for Cnt := 1 to Self.nCouches do // nCouches = number of hidden layers
  NN.AddLayer(TNNetFullConnectReLU.Create(Self.nNeurons));
NN.AddLayer(TNNetFullConnectLinear.Create(Length(vOutput)));
// construct vInput and vOutput arrays from the CSV ...
for Cnt := 1 to Self.Data.csvDoc.RowCount - 1 do
begin
  ...
  TrainingPairs.Add(
    TNNetVolumePair.Create(TNNetVolume.Create(vInput), TNNetVolume.Create(vOutput))
  );
end;
NFit.InitialLearningRate := 0.01;
NFit.LearningRateDecay := 0;
NFit.L2Decay := 0;
NFit.Verbose := False;
NFit.HideMessages();
NFit.InferHitFn := @MonopolarCompare;
NFit.Fit(NN, TrainingPairs, nil, nil, {batchsize=}4, {epochs=}Self.nEpochs);
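After NFit.Fit returns, a single prediction can presumably be made with TNNet.Compute and TNNet.GetOutput (both exist in neural-api). This is only a sketch: vQuery is a hypothetical array that must hold the six features in the same order and scaling as the training columns.

```pascal
// Sketch: one inference after training.
// vQuery is a hypothetical array of TNeuralFloat with the 6 input features.
pInput := TNNetVolume.Create(vQuery);           // same constructor used for TrainingPairs above
pOutput := TNNetVolume.Create(Length(vOutput));
NN.Compute(pInput);
NN.GetOutput(pOutput);
WriteLn('Predicted price: ', pOutput.FData[0]:0:2);
pInput.Free;
pOutput.Free;
```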
with some entries like this (headers translated from French: prix = price, aire maison = house area, aire terrain = lot area, étages = floors, sdb = bathrooms, chambres = bedrooms):

price, house area, lot area, garages, floors, bathrooms, bedrooms
352000, 1224, 6710, 1, 1, 2, 3
306000, 980, 5413, 0, 1, 1.5, 3
340000, 1299, 6000, 0, 1, 1, 3
310000, 893, 5616, 0, 1, 1.5, 3
320000, 904, 5704, 0, 1, 1.5, 3
342000, 1066, 9735, 0, 1, 2, 4
385000, 1156, 8000, 0, 1, 2, 4
362500, 1700, 6000, 0, 2, 2, 4
320000, 1062, 6760, 0, 1, 2, 4
355000, 1200, 6000, 0, 1, 2, 4
345000, 960, 5909, 0, 1, 1, 4
340000, 1382, 9000, 0, 1, 2, 3
305000, 1008, 8111, 0, 1, 2, 4
294000, 1053, 6000, 0, 1, 2, 3
306000, 923, 6000, 1, 1, 1, 3
312014, 1063, 6112, 0, 1, 2, 3
352000, 1591, 5500, 1, 1, 2, 4
295100, 1054, 5560, 0, 1, 1.5, 3
305000, 838, 6160, 0, 1, 2, 3
I searched for the price of 1130, 5500, 0, 1, 2, 3.
Result:
Number of values: #19
Searching for 'PRIX'
Result: 334393,75
;-) those were the prices before Covid here...
With the formula, it would be cool to get the price of a garage, a room, etc. (in this area, of course). I think a garage is about $20,000... but I'm not sure.
Awesome experiment!
Via TNNet.Layers[].Neurons[].Weights, you can find the weights for each layer. In your case, the first neuron of the first layer after the input is NN.Layers[1].Neurons[0].Weights. What each neuron does in your experiment is a dot product of the input (or the previous layer's output) with its weights. From the weights, you could create a static function in your code.
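If the network has no hidden layers (just TNNetInput followed by TNNetFullConnectLinear), the learned model is exactly a linear regression, and the formula can be read straight off the weights. The sketch below makes that assumption; Weights is confirmed above, but Size, FData, and the bias field name are my reading of the neural-api source, so check them against neuralnetwork.pas and neuralvolume.pas. With hidden ReLU layers, the model is only piecewise linear and there is no single closed formula.

```pascal
// Sketch: print the learned formula price = w0*x0 + ... + wN*xN + b
// for a network whose only trainable layer is TNNetFullConnectLinear.
procedure PrintLinearFormula(NN: TNNet);
var
  Neuron: TNNetNeuron;
  I: integer;
begin
  Neuron := NN.Layers[1].Neurons[0];  // the single output neuron
  Write('price = ');
  for I := 0 to Neuron.Weights.Size - 1 do
    Write(Neuron.Weights.FData[I]:0:4, ' * x', I, ' + ');
  WriteLn(Neuron.BiasWeight:0:4);     // bias field name is an assumption
end;
```

Under that assumption, the coefficient on the 'garages' input is the model's marginal garage price — the same kind of number the Go regression library reports.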
Anyway, the easiest solution would just be loading/saving the trained network with LoadFromString/LoadFromFile and SaveToString/SaveToFile and let TNNet do the job.
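The save/load round trip mentioned above is just the following (the file name is illustrative):

```pascal
NN.SaveToFile('house-price.nn');    // persists architecture and weights

// later, possibly in another program:
NN2 := TNNet.Create();
NN2.LoadFromFile('house-price.nn'); // ready to call Compute/GetOutput
```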
@lbellego , is the question solved?
yes, thanks
@lbellego, glad to help.
Hi,
I have a network with 5 inputs and 1 output. Is it possible to obtain the regression formula, like this library written in Go? https://github.com/sajari/regression/blob/master/regression.go