Common Keras neural network questions - MSE, NMSE

1. The History callback only gives the loss and accuracy per epoch. How can I get the loss for each batch?

One option is to run the trained model on a batch and compute the loss yourself (a sketch using NumPy; x_batch and y_batch stand for one batch of inputs and their targets):

import numpy as np

predict = model.predict(x_batch)
loss = np.mean(np.square(y_batch - predict))  # per-batch MSE
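
If you drive the parameter updates yourself, `train_on_batch` also returns the loss for the batch it just processed, so you can collect per-batch losses without any extra machinery (a minimal sketch; `batches` is a hypothetical iterable of `(x, y)` pairs):

losses = []
for x_batch, y_batch in batches:
    # train_on_batch runs one gradient update and returns the scalar training loss
    losses.append(model.train_on_batch(x_batch, y_batch))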

The cleanest option is a callback. Here's a simple example saving a list of losses over each batch during training:

import keras

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        # Start a fresh list at the beginning of training
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        # logs['loss'] holds the training loss of the batch that just finished
        self.losses.append(logs.get('loss'))

Putting it together:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
# Keras 1.x API: in Keras 2, 'init' is 'kernel_initializer' and 'nb_epoch' is 'epochs'
model.add(Dense(10, input_dim=784, init='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

history = LossHistory()
model.fit(X_train, Y_train, batch_size=128, nb_epoch=20, verbose=0, callbacks=[history])

print(history.losses)
# outputs
'''
[0.66047596406559383, 0.3547245744908703, ..., 0.25953155204159617, 0.25901699725311789]
'''
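
Since `history.losses` is a plain Python list, you can inspect or plot it directly, e.g. with matplotlib (a usage sketch):

import matplotlib.pyplot as plt

plt.plot(history.losses)  # one point per batch
plt.xlabel('batch')
plt.ylabel('training loss')
plt.show()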

Reference: http://keras.io/callbacks/#create-a-callback

Computing NMSE

NMSE (in dB) is the mean squared error normalized by the mean signal power: NMSE = 10 * log10( mean((a - b)^2) / mean(a^2) ).

# Compute NMSE (normalized mean squared error, in dB)
import math

from numpy import arange, mean, square

a = arange(10.0)                 # reference signal (example values)
b = a + 0.5                      # estimate; note a == b would give log10(0) = -inf
e1 = mean(square(a - b))         # mean squared error
e0 = mean(square(a))             # mean signal power
nmse = 10 * math.log10(e1 / e0)  # NMSE in dB
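
If you want NMSE reported during training rather than computed afterwards, you can also define it as a custom Keras metric using backend ops (a sketch, assuming the old `keras.backend` API; note this returns the linear ratio, not dB):

from keras import backend as K

def nmse(y_true, y_pred):
    # Normalized MSE: mean squared error divided by mean signal power
    return K.mean(K.square(y_true - y_pred)) / K.mean(K.square(y_true))

model.compile(loss='mse', optimizer='rmsprop', metrics=[nmse])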



