python - Tensorflow multi-variable logistic regression not working -


I am trying to create a program that classifies a point as either a 1 or a 0 using TensorFlow. I am trying to create an oval shape around the center of this plot, where the blue dots are:

Everything in the oval should be classified as a 1, and everything else should be a 0. In the graph above, the blue dots are the 1s and the red x's are the 0s.

However, every time I try to classify a point, it always chooses 1, even if it's a point it was trained with and was told is a 0.

My question is simple: why does it always guess 1, and what am I doing wrong (or what should I do differently) to fix the problem? This is the first machine learning problem I've tried without following a tutorial, so I really don't know much about this stuff.

I'd appreciate any help you can give, thanks!

Here's the code:

#!/usr/bin/env python3

import tensorflow as tf
import numpy
import matplotlib.pyplot as plt

training_in = numpy.array([[0, 0], [1, 1], [2, 0], [-2, 0], [-1, -1], [-1, 1], [-1.5, 1],
                           [3, 3], [3, 0], [-3, 0], [0, -3], [-1, 3], [1, -2], [-2, -1.5]])
training_out = numpy.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0])

def transform_data(x):
    return [x[0], x[1], x[0]**2, x[1]**2, x[0]*x[1]]

new_training_in = numpy.apply_along_axis(transform_data, 1, training_in)

feature_count = new_training_in.shape[1]

x = tf.placeholder(tf.float32, [None, feature_count])
y = tf.placeholder(tf.float32, [None, 1])

w = tf.Variable(tf.zeros([feature_count, 1]))
b = tf.Variable(tf.zeros([1]))

guess = tf.nn.softmax(tf.matmul(x, w) + b)

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(tf.matmul(x, w) + b, y))

opti = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)

for i in range(1000):
    for (item_x, item_y) in zip(new_training_in, training_out):
        sess.run(opti, feed_dict={x: [item_x], y: [[item_y]]})

print(sess.run(w))
print(sess.run(b))

plt.plot(training_in[:6, 0], training_in[:6, 1], 'bo')
plt.plot(training_in[6:, 0], training_in[6:, 1], 'rx')

results = sess.run(guess, feed_dict={x: new_training_in})

for i in range(training_in.shape[0]):
    xx = [training_in[i:, 0]]
    yy = [training_in[i:, 1]]
    res = results[i]

    # prints `[ 1.]`
    print(res)

    # uncomment these lines to see the guesses
    # if res[0] == 0:
    #     plt.plot(xx, yy, 'c+')
    # else:
    #     plt.plot(xx, yy, 'g+')

plt.show()
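(For context on the feature engineering above: transform_data maps each 2-D point into a quadratic feature space so that a linear model can fit an elliptical decision boundary. A quick standalone check of what it produces for one point, using plain Python floats:)

```python
def transform_data(x):
    # quadratic feature expansion: [x, y, x^2, y^2, x*y]
    return [x[0], x[1], x[0] ** 2, x[1] ** 2, x[0] * x[1]]

print(transform_data([2.0, 3.0]))  # [2.0, 3.0, 4.0, 9.0, 6.0]
```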

The problem occurs when you use softmax_cross_entropy_with_logits. In your concrete case, both the logits and the labels should have shape [batch_size, number_of_labels=2].

Note that your tensors logits=tf.matmul(x, w) + b and labels=y have shape [batch_size, 1], so TensorFlow assumes number_of_labels=1. That's why your guess is always the same.
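To see concretely why the prediction is always 1: softmax normalizes over the label axis, and with a single logit per example that axis has length 1, so the result is exactly 1.0 no matter what the logit is. A minimal numpy sketch of the same arithmetic (not TensorFlow, just an illustration):

```python
import numpy

def softmax(z):
    # numerically stable softmax over the last axis
    e = numpy.exp(z - numpy.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

logits = numpy.array([[-5.0], [0.0], [42.0]])  # shape [batch_size, 1]
print(softmax(logits))  # every row is [ 1.], whatever the logit value was
```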

a) You can solve the problem by encoding training_out as one-hot vectors. I recommend using np.eye() to achieve that:

training_out = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
training_out = numpy.eye(2)[training_out]
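(In case the np.eye() trick is unfamiliar: numpy.eye(2) is the 2x2 identity matrix, and indexing its rows with the label array turns each 0/1 label into a one-hot row. A quick check with a small label list:)

```python
import numpy

labels = [1, 1, 0, 0]
one_hot = numpy.eye(2)[labels]
# each row is [0., 1.] for label 1 and [1., 0.] for label 0
print(one_hot.shape)  # (4, 2)
```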

Then, you need to make the following changes:

y = tf.placeholder(tf.float32, [None, 2])
w = tf.Variable(tf.zeros([feature_count, 2]))
b = tf.Variable(tf.zeros([2]))
...
for i in range(1000):
    for (item_x, item_y) in zip(new_training_in, training_out):
        sess.run(opti, feed_dict={x: [item_x], y: [item_y]})
...
results = sess.run(guess, feed_dict={x: new_training_in})[:, 1]

b) Alternatively, you can use sparse_softmax_cross_entropy_with_logits, which allows the labels to have shape [batch_size]. I've tweaked your code to make it work this way:

import tensorflow as tf
import numpy
import matplotlib.pyplot as plt

training_in = numpy.array(
    [[0, 0], [1, 1], [2, 0], [-2, 0], [-1, -1], [-1, 1], [-1.5, 1], [3, 3], [3, 0], [-3, 0], [0, -3], [-1, 3], [1, -2],
     [-2, -1.5]])
training_out = numpy.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0])

def transform_data(x):
    return [x[0], x[1], x[0] ** 2, x[1] ** 2, x[0] * x[1]]

new_training_in = numpy.apply_along_axis(transform_data, 1, training_in)

feature_count = new_training_in.shape[1]

x = tf.placeholder(tf.float32, [None, feature_count])
y = tf.placeholder(tf.int32, [None])

w = tf.Variable(tf.zeros([feature_count, 2]))
b = tf.Variable(tf.zeros([2]))

guess = tf.nn.softmax(tf.matmul(x, w) + b)

cost = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(tf.matmul(x, w) + b, y))

opti = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)

for i in range(1000):
    for (item_x, item_y) in zip(new_training_in, training_out):
        sess.run(opti, feed_dict={x: [item_x], y: [item_y]})

print(sess.run(w))
print(sess.run(b))

plt.plot(training_in[:6, 0], training_in[:6, 1], 'bo')
plt.plot(training_in[6:, 0], training_in[6:, 1], 'rx')

results = sess.run(guess, feed_dict={x: new_training_in})

for i in range(training_in.shape[0]):
    xx = [training_in[i:, 0]]
    yy = [training_in[i:, 1]]
    res = results[i]

    print(res)

    if res[0] == 0:
        plt.plot(xx, yy, 'c+')
    else:
        plt.plot(xx, yy, 'g+')
plt.show()
