msyim / TensorFlowFun

Me playing with TF

Note on Linear Regression (correctly using ndarray) #1

Open msyim opened 7 years ago


The model didn't seem to work when the input data had dimension >= 2:

import tensorflow as tf

x_train = [[1.,2.],[3.,4.]]
y_train = [4.,8.]

# Build a graph
X = tf.placeholder(tf.float32, shape=[None,2])
Y = tf.placeholder(tf.float32, shape=[None,1])

W = tf.Variable(tf.random_normal([2,1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

hypothesis = tf.matmul(X,W) + b
cost = tf.square(hypothesis - Y)

Clearly, the desired weight matrix is [[1],[1]] and the bias [1], but the model always converged to a point where the output was ~[6,6] (and consequently, the loss ~4).

I inspected the cost matrix: its shape should be (2,1), but with the code above it came out as (2,2). The subtraction hypothesis - Y was broadcasting a (2,1) tensor against a (2,)-shaped target, which expands the difference to (2,2).
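The (2,2) shape can be reproduced with a small NumPy check (NumPy and TensorFlow elementwise ops follow the same broadcasting rules); the hypothesis values below are made up for illustration:

```python
import numpy as np

hyp = np.array([[3.5], [7.5]])    # stand-in model output, shape (2,1)
y_flat = np.array([4., 8.])       # y_train as first written, shape (2,)
y_col = np.array([[4.], [8.]])    # corrected y_train, shape (2,1)

# (2,1) minus (2,) broadcasts to (2,2): every output paired with every target
print((hyp - y_flat).shape)  # (2, 2)
# (2,1) minus (2,1) stays (2,1): one error per example, as intended
print((hyp - y_col).shape)   # (2, 1)
```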

There was a problem in defining the y_train array. I had to change y_train to:

y_train = [[4.],[8.]]

TensorFlow does not seem to implicitly reshape the inputs to the dimensions the graph expects (which is good).
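For reference, the same regression with the corrected (2,1)-shaped targets can be sketched in plain NumPy gradient descent (hypothetical learning rate and step count; with two examples and three parameters the system is underdetermined, so the weights land on *a* zero-loss solution rather than necessarily [[1],[1]] and [1]):

```python
import numpy as np

# Same data as above, with y as an explicit (2,1) column vector
X = np.array([[1., 2.], [3., 4.]])   # shape (2,2)
y = np.array([[4.], [8.]])           # shape (2,1)

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 1))          # weight, shape (2,1)
b = rng.normal(size=(1,))            # bias, shape (1,)

lr = 0.01
for _ in range(10000):
    hyp = X @ W + b                  # shape (2,1)
    err = hyp - y                    # shape (2,1): no broadcasting surprise
    W -= lr * 2 * X.T @ err / len(X) # gradient of mean squared error w.r.t. W
    b -= lr * 2 * err.mean()         # gradient w.r.t. b

print(X @ W + b)                     # close to [[4.], [8.]]
```

With the column-vector targets the error stays (2,1) throughout, so the loss actually measures per-example residuals and the predictions converge to [4, 8].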