jostmey / NakedTensor

Bare bone examples of machine learning in TensorFlow
Apache License 2.0

What if you have more than one feature? #13

Closed jaketeater closed 7 years ago

jaketeater commented 7 years ago

This isn't an issue, but I did see that you responded to some other questions.

What if you had four features (four xs, as in the petal dataset)? How would your code change?

Sorry if this is an off-topic question; I am new to this.

jaketeater commented 7 years ago

I just discovered the simple classification example from this commit

jostmey commented 7 years ago

cool beans

jaketeater commented 7 years ago

What if instead of classification, I wanted a number? For example, in the iris flower data set, given 3 features (Sepal length, Sepal width, Petal length), I wanted to predict the 4th (Petal width) and ignore the species altogether.

Sorry for the random question! Even just a name of the method for doing this, or some starting point would be very helpful!

jostmey commented 7 years ago

In each example, the outcome is a floating-point number, so the examples already show you how to predict a number. This is usually called regression.
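
For reference, here is a minimal sketch of that kind of regression setup in the TensorFlow 1.x style these examples use. It is just an illustration, not code taken from the repo; the shapes and learning rate are arbitrary.

```python
import tensorflow as tf  # TensorFlow 1.x style, as in this repo's examples

# One input feature and one real-valued target per sample
x = tf.placeholder(tf.float32, [None])
y = tf.placeholder(tf.float32, [None])

# Model parameters: slope and intercept
w = tf.Variable(0.0)
b = tf.Variable(0.0)

y_model = w * x + b                            # the predicted number
loss = tf.reduce_mean(tf.square(y_model - y))  # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
```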

Now if you want a binary outcome, that is a different matter. You can create models with a binary outcome, but you have to define the error differently. Read about cross-entropy error and the sigmoid squashing function for more on this topic. Once you know how to handle binary outcomes, it is not hard to generalize to outcomes with more than two choices.
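
A sketch of the binary case, again in TensorFlow 1.x and purely illustrative: the raw score is squashed with a sigmoid, and the error is the cross-entropy between the label and that score.

```python
import tensorflow as tf  # TensorFlow 1.x style

# One input feature and a binary label (0 or 1) per sample
x = tf.placeholder(tf.float32, [None])
y = tf.placeholder(tf.float32, [None])

w = tf.Variable(0.0)
b = tf.Variable(0.0)

logit = w * x + b        # raw score
p = tf.sigmoid(logit)    # squashed into a probability between 0 and 1

# Cross-entropy error, computed from the raw score for numerical stability
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logit))
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
```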

Hope that helps

jaketeater commented 7 years ago

Thanks for the reply!

Regression is what I am looking for. Thanks for the word!

I was then able to find this, which is exactly what I am trying to do: "Multiple Regression with Two Predictor Variables". That will work as a good starting point.
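
Here is a rough sketch of what that could look like with the three iris features predicting petal width, in the same TensorFlow 1.x style (just my own illustration of the idea, not code from the repo or the page I found):

```python
import tensorflow as tf  # TensorFlow 1.x style

num_features = 3  # e.g. sepal length, sepal width, petal length

# Each row of X is one sample; y is the number to predict (petal width)
X = tf.placeholder(tf.float32, [None, num_features])
y = tf.placeholder(tf.float32, [None, 1])

w = tf.Variable(tf.zeros([num_features, 1]))  # one weight per feature
b = tf.Variable(0.0)

y_model = tf.matmul(X, w) + b                  # weighted sum of the features
loss = tf.reduce_mean(tf.square(y_model - y))  # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
```

Handling a fourth feature would just mean changing num_features; the rest stays the same.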

Thanks for your help!