python - Neural network for more than one class not working
I am trying to use a neural network for a classification problem. I have 6 possible classes, and the same input may belong to more than one class.
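For concreteness, a multi-label target matrix for a problem like this (hypothetical values, 6 columns, one per class) can have several 1s in a single row:

    import numpy as np

    # Hypothetical multi-label targets: one row per sample, one column per class.
    y = np.array([[1, 0, 0, 1, 0, 0],   # belongs to classes 0 and 3
                  [0, 0, 1, 0, 0, 0],   # belongs to class 2 only
                  [0, 1, 0, 0, 1, 1]],  # belongs to classes 1, 4 and 5
                 dtype=np.int32)
    print y.shape   # (3, 6)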
The problem is that when I try to train one NN per class, I set output_num_units = 1 and, on train, I pass the first column of y, y[:,0]. I get the following output and error:
    ## Layer information
    #  name    size
    ---  ------  ------
    0  input   32
    1  dense0  32
    2  output  1

    IndexError: index 1 is out of bounds for axis 1 with size 1
    Apply node that caused the error: CrossentropyCategorical1Hot(Elemwise{Composite{scalar_sigmoid((i0 + i1))}}[(0, 0)].0, y_batch)
    Inputs types: [TensorType(float32, matrix), TensorType(int32, vector)]
    Inputs shapes: [(128, 1), (128,)]
    Inputs strides: [(4, 4), (4,)]
    Inputs values: ['not shown', 'not shown']
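For reference, the per-class configuration that matches what the default categorical cross-entropy objective expects (at least two output units, a softmax over them, and a 1-D int32 label vector) would look roughly like the sketch below. It is not verified against this data, and it reuses X_train, y and num_features from the code extract further down:

    from lasagne.layers import InputLayer, DenseLayer
    from lasagne.nonlinearities import softmax
    from lasagne.updates import nesterov_momentum
    from nolearn.lasagne import NeuralNet
    import numpy as np

    # One net per class, framed as a 2-class problem ("not in class" / "in class")
    # so the default categorical objective gets the shapes it expects.
    net_class0 = NeuralNet(
        layers=[('input', InputLayer),
                ('dense0', DenseLayer),
                ('output', DenseLayer)],
        input_shape=(None, num_features),
        dense0_num_units=32,
        output_num_units=2,            # two units instead of one
        output_nonlinearity=softmax,   # softmax over the two units
        update=nesterov_momentum,
        update_learning_rate=0.01,
        update_momentum=0.9,
        max_epochs=50,
        verbose=1,
        eval_size=0.2,
    )

    # 1-D int32 vector of 0/1 labels for the first class:
    net_class0.fit(X_train, y[:, 0].astype(np.int32))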
If I try to use output_num_units = num_classes (6) and the full y (all 6 columns), I first get an error from KStratifiedFold, because it does not seem to expect y to have multiple columns. If I then set eval_size=None, I get the following error:
    TypeError: ('Bad input argument to theano function with name "/usr/local/lib/python2.7/site-packages/nolearn-0.6a0.dev0-py2.7.egg/nolearn/lasagne/base.py:311" at index 1 (0-based)', 'Wrong number of dimensions: expected 1, got 2 with shape (128, 6).')
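That message is the shape check failing: the compiled Theano function was built to receive a 1-D integer class vector per batch, but the full y supplies a 2-D (128, 6) matrix. A minimal illustration of the mismatch (not a fix):

    import numpy as np

    y_full = np.zeros((128, 6), dtype=np.int32)   # what a batch of the full y looks like
    y_col = y_full[:, 0]                          # what the default setup expects

    print y_full.ndim, y_full.shape               # 2 (128, 6) -> rejected
    print y_col.ndim, y_col.shape                 # 1 (128,)   -> accepted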
The only configuration that runs is setting more than one output unit and passing just one column of y. It trains the NN, but it does not seem right: it gives me two output units while I have only one y to compare against.
What am I doing wrong? Why can't I use just one output? Should I convert my y from a matrix of 6 class columns to a vector with a single column holding the class number?
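A conversion to a single column of class numbers, e.g. with np.argmax, only makes sense when every sample belongs to exactly one class; with multi-label rows it would throw information away. A minimal illustration of the single-label case:

    import numpy as np

    # Only valid when every row contains exactly one 1 (single-label data):
    y_onehot = np.array([[0, 0, 1, 0, 0, 0],
                         [1, 0, 0, 0, 0, 0]], dtype=np.int32)
    y_classes = np.argmax(y_onehot, axis=1).astype(np.int32)
    print y_classes   # [2 0] -- one class number per sample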
I use the following code (extract):
    # Load the data
    data, labels = prepare_data_train('../input/train/subj1_series1_data.csv')

    # X_train (119496, 32) <type 'numpy.ndarray'>
    X_train = data_preprocess_train(data)
    #print X_train.shape, type(X_train)

    # y (119496, 6) <type 'numpy.ndarray'>
    y = labels.values.astype(np.int32)
    print y.shape, type(y)

    # net config
    num_features = X_train.shape[1]
    num_classes = labels.shape[1]

    # train the neural net
    layers0 = [('input', InputLayer),
               ('dense0', DenseLayer),
               ('output', DenseLayer)]

    net1 = NeuralNet(
        layers=layers0,
        # layer parameters:
        input_shape=(None, num_features),  # 32 inputs
        dense0_num_units=32,               # number of units in the hidden layer
        output_nonlinearity=sigmoid,       # sigmoid function, since we have 1 class
        output_num_units=2,                # if I try 1, it does not work

        # optimization method:
        update=nesterov_momentum,
        update_learning_rate=0.01,
        update_momentum=0.9,
        max_epochs=50,                     # we want to train this many epochs
        verbose=1,
        eval_size=0.2
    )

    net1.fit(X_train, y[:, 0])
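If the goal is instead a single network that predicts all 6 labels at once, one possible variation of this configuration (a sketch, assuming the installed nolearn version supports regression=True and objective_loss_function; not verified against this data) is 6 sigmoid outputs with an element-wise binary cross-entropy loss and the full float32 target matrix:

    from lasagne.layers import InputLayer, DenseLayer
    from lasagne.nonlinearities import sigmoid
    from lasagne.objectives import binary_crossentropy
    from lasagne.updates import nesterov_momentum
    from nolearn.lasagne import NeuralNet
    import numpy as np

    net_multi = NeuralNet(
        layers=[('input', InputLayer),
                ('dense0', DenseLayer),
                ('output', DenseLayer)],
        input_shape=(None, num_features),
        dense0_num_units=32,
        output_num_units=num_classes,                 # 6 independent sigmoid outputs
        output_nonlinearity=sigmoid,
        objective_loss_function=binary_crossentropy,  # per-label binary loss
        regression=True,                              # accept a 2-D float target, no stratified split
        update=nesterov_momentum,
        update_learning_rate=0.01,
        update_momentum=0.9,
        max_epochs=50,
        verbose=1,
        eval_size=0.2,
    )

    net_multi.fit(X_train, y.astype(np.float32))      # full (n_samples, 6) target matrix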
I wanted to use CNNs in Lasagne, and it didn't work the same way for me either; the predictions were all 0... I recommend looking at the MNIST example. I find it a better one to use and extend, since old code snippets didn't fully work due to API changes over time. I've amended the MNIST example: my target vector has labels 0 or 1, and I create the output layer of the NN this way:
    # Finally, we'll add the fully-connected output layer, of 2 softmax units:
    l_out = lasagne.layers.DenseLayer(
            l_hid2_drop, num_units=2,
            nonlinearity=lasagne.nonlinearities.softmax)
And for the CNN:
    layer = lasagne.layers.DenseLayer(
            lasagne.layers.dropout(layer, p=.5),
            num_units=2,
            nonlinearity=lasagne.nonlinearities.softmax)
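For completeness, a rough sketch of how such a 2-unit softmax output is usually trained in the plain Lasagne MNIST-style setup (standard Lasagne/Theano calls; the small network here is a placeholder, not the exact MNIST architecture):

    import theano
    import theano.tensor as T
    import lasagne

    # Symbolic inputs: a float32 matrix of features and an int vector of 0/1 labels.
    input_var = T.fmatrix('inputs')
    target_var = T.ivector('targets')

    # Minimal stand-in for the MNIST-style network ending in the 2-unit softmax.
    l_in = lasagne.layers.InputLayer(shape=(None, 32), input_var=input_var)
    l_hid = lasagne.layers.DenseLayer(l_in, num_units=32)
    l_out = lasagne.layers.DenseLayer(
            lasagne.layers.dropout(l_hid, p=.5), num_units=2,
            nonlinearity=lasagne.nonlinearities.softmax)

    # Categorical cross-entropy against the 0/1 target vector, averaged over the batch.
    prediction = lasagne.layers.get_output(l_out)
    loss = lasagne.objectives.categorical_crossentropy(prediction, target_var).mean()

    params = lasagne.layers.get_all_params(l_out, trainable=True)
    updates = lasagne.updates.nesterov_momentum(
            loss, params, learning_rate=0.01, momentum=0.9)

    train_fn = theano.function([input_var, target_var], loss, updates=updates)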