python - train a logistic regression model with different feature dimensions in scikit-learn -
I am using Python 2.7 on Windows. I want to fit a logistic regression model for a classification problem, using features t1 and t2 and target t3.
I show the values of t1 and t2, together with my code, below. My question is: since t1 has feature dimension 4 and t2 has feature dimension 1, how should I pre-process them so that scikit-learn's logistic regression is trained on both correctly?
By the way, I mean that for training sample 1 the t1 feature is [ 0 -1 -2 -3] and the t2 feature is [0]; for training sample 2 the t1 feature is [ 1 0 -1 -2] and the t2 feature is [1]; and so on.
import numpy as np
from sklearn import linear_model, datasets

arc = lambda r, c: r - c

t1 = np.array([[arc(r, c) for c in xrange(4)] for r in xrange(5)])
print t1
print type(t1)

t2 = np.array([[arc(r, c) for c in xrange(1)] for r in xrange(5)])
print t2
print type(t2)

t3 = np.array([0, 0, 1, 1, 1])

logreg = linear_model.LogisticRegression(C=1e5)

# create an instance of the classifier and fit the data,
# using t1 and t2 as features and t3 as the target
logreg.fit(t1 + t2, t3)
t1 is:
[[ 0 -1 -2 -3]
 [ 1  0 -1 -2]
 [ 2  1  0 -1]
 [ 3  2  1  0]
 [ 4  3  2  1]]

t2 is:
[[0]
 [1]
 [2]
 [3]
 [4]]
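Note that t1 + t2 in the code above does not combine the two feature sets: t1 has shape (5, 4) and t2 has shape (5, 1), so NumPy broadcasting simply adds t2's single column to every column of t1 and the model still only sees 4 features per sample. A minimal shape check on the same toy data (this check is an aside, not part of the original post):

import numpy as np

arc = lambda r, c: r - c
t1 = np.array([[arc(r, c) for c in xrange(4)] for r in xrange(5)])
t2 = np.array([[arc(r, c) for c in xrange(1)] for r in xrange(5)])

print t1.shape          # (5, 4)
print t2.shape          # (5, 1)
print (t1 + t2).shape   # (5, 4) -- broadcasting, not concatenation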
You need to concatenate the feature matrices using numpy.concatenate.
import numpy as np
from sklearn import linear_model, datasets

arc = lambda r, c: r - c

t1 = np.array([[arc(r, c) for c in xrange(4)] for r in xrange(5)])
t2 = np.array([[arc(r, c) for c in xrange(1)] for r in xrange(5)])
t3 = np.array([0, 0, 1, 1, 1])

# stack t1 (5x4) and t2 (5x1) side by side into a single 5x5 feature matrix
x = np.concatenate((t1, t2), axis=1)
y = t3

logreg = linear_model.LogisticRegression(C=1e5)

# create an instance of the classifier and fit the data,
# using the concatenated t1 and t2 as features and t3 as the target
logreg.fit(x, y)

x_test = np.array([[1, 0, -1, -1, 1], [0, 1, 2, 3, 4]])
print logreg.predict(x_test)
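For what it's worth, the same approach carries over to a newer stack; the sketch below assumes Python 3 and a recent scikit-learn (so range instead of xrange and print() as a function), and uses np.hstack, which is equivalent to np.concatenate(..., axis=1) for these 2-D arrays:

import numpy as np
from sklearn.linear_model import LogisticRegression

arc = lambda r, c: r - c
t1 = np.array([[arc(r, c) for c in range(4)] for r in range(5)])
t2 = np.array([[arc(r, c) for c in range(1)] for r in range(5)])
t3 = np.array([0, 0, 1, 1, 1])

# hstack the (5, 4) and (5, 1) blocks into a single (5, 5) feature matrix
x = np.hstack((t1, t2))

logreg = LogisticRegression(C=1e5)
logreg.fit(x, t3)

x_test = np.array([[1, 0, -1, -1, 1], [0, 1, 2, 3, 4]])
print(logreg.predict(x_test))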