TensorFlow 2.0 Notes 13: Perceptron Gradients in Detail, the Keras High-Level API, and Custom Networks with Keras!


Contents

    • I. Gradient of a Single-Layer, Single-Output Perceptron (Binary Classification)
      • 1.1 Detailed Derivation
      • 1.2 Implementing a Single-Output Perceptron in TensorFlow
    • II. Gradient of a Single-Layer, Multi-Output Perceptron (Multi-Class Classification)
      • 2.1 Detailed Derivation
      • 2.2 Implementing a Multi-Output Perceptron in TensorFlow
      • 2.3 The Chain Rule (Extending to Multiple Layers)
    • III. Multi-Layer Perceptron Gradients
      • 3.1 Recap: Single- and Multi-Output Perceptrons
      • 3.2 Detailed Derivation
    • IV. The Keras High-Level API
      • 4.1 Five Main Features
      • 4.2 Focus: Metrics
      • 4.3 Hands-On with the Metrics from 4.2
    • V. The Keras High-Level API, Part 2
      • 5.1 Compile and Fit
      • 5.2 The Standard Training Loop Before Keras
      • 5.3 The Concise Standard Workflow with Keras
        • 5.3.1 Training Code
        • 5.3.2 Test Code
        • 5.3.3 After Testing, One More Step: Evaluate
        • 5.3.4 Summary
    • VI. Custom Networks with Keras (Very Important!)
      • 6.1 The keras.Sequential Container
      • 6.2 Layer/Model
      • 6.3 Implementing Your Own Dense Layer, MyDense (Important)
      • 6.4 Building a 5-Layer Network from the Custom Layer (Full Code in 6.5)
      • 6.5 Hands-On: the Custom Layer (6.3) and the Custom Network (6.4)
    • VII. Message me privately if you need the full set of course videos + PPT + code!

I. Gradient of a Single-Layer, Single-Output Perceptron (Binary Classification)


1.1 Detailed Derivation

  • For a single-layer perceptron we can summarize neatly: the derivative of the final loss with respect to the weight on any input node ($x_0, x_1, \ldots, x_n$) is simply a product of a term involving the final output and the value of the corresponding input node. The derivation is a little tedious, but the result depends only on the output of the final neuron $O_0$ and the corresponding input value $x_j$ (reconstructed below).
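  • For reference, here is the derivation the slides walk through, reconstructed under the standard setup (sigmoid activation $\sigma$, MSE loss; this restatement is my own, not copied from the slides): with $O_0 = \sigma\big(\sum_j w_{j0} x_j + b\big)$ and $E = \frac{1}{2}(O_0 - t)^2$,

$$\frac{\partial E}{\partial w_{j0}} = (O_0 - t)\,\frac{\partial O_0}{\partial w_{j0}} = (O_0 - t)\,O_0\,(1 - O_0)\,x_j$$

using $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$: only the final output $O_0$, the target $t$, and the input $x_j$ appear.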

1.2 Implementing a Single-Output Perceptron in TensorFlow

import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

# a single sample x
x = tf.random.normal([1, 3])
w = tf.ones([3, 1])
b = tf.ones([1])

# the true label y for this sample
y = tf.constant([1])

with tf.GradientTape() as tape:
    tape.watch([w, b])
    prob = tf.sigmoid(x@w+b)
    loss = tf.reduce_mean(tf.losses.MSE(y, prob))

grads = tape.gradient(loss, [w, b])
print('w grad: \n', grads[0])
print('b grad: \n', grads[1])
  • Output:
w grad: 
 tf.Tensor(
[[-0.0009601 ]
 [-0.00522288]
 [-0.00349461]], shape=(3, 1), dtype=float32)
b grad: 
 tf.Tensor([-0.00506869], shape=(1,), dtype=float32)

Process finished with exit code 0

II. Gradient of a Single-Layer, Multi-Output Perceptron (Multi-Class Classification)


2.1 Detailed Derivation

  • Summary:
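  • Reconstructed in the setup this derivation usually assumes (one sigmoid per output node, MSE summed over the $K$ outputs; note the code below uses softmax instead): with $O_k = \sigma\big(\sum_j w_{jk} x_j + b_k\big)$ and $E = \frac{1}{2}\sum_{k}(O_k - t_k)^2$,

$$\frac{\partial E}{\partial w_{jk}} = (O_k - t_k)\,O_k\,(1 - O_k)\,x_j$$

because the weight $w_{jk}$ only reaches the loss through output $O_k$, so only one term of the sum survives.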

2.2 Implementing a Multi-Output Perceptron in TensorFlow

  • Code:
import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

x = tf.random.normal([2, 4])
w = tf.ones([4, 3])
b = tf.ones([3])

y = tf.constant([2, 0])

with tf.GradientTape() as tape:
    tape.watch([w, b])
    prob = tf.nn.softmax(x@w+b, axis=1)
    loss = tf.reduce_mean(tf.losses.MSE(tf.one_hot(y, depth=3), prob))

grads = tape.gradient(loss, [w, b])
print('w grad: ', grads[0])
print('b grad: ',grads[1])
  • Output:
w grad:  tf.Tensor(
[[ 0.05927337 -0.06327497  0.0040016 ]
 [ 0.0433852   0.02499989 -0.06838509]
 [ 0.00613058  0.04706442 -0.05319501]
 [ 0.15224242 -0.02957086 -0.12267157]], shape=(4, 3), dtype=float32)
b grad:  tf.Tensor([-0.03703704  0.07407407 -0.03703704], shape=(3,), dtype=float32)

Process finished with exit code 0

2.3 The Chain Rule (Extending to Multiple Layers)

  • Using the chain rule, we can propagate the error at the last layer backwards, layer by layer, onto the weights of the intermediate layers. This yields gradient information for those layers, which we then use to update their weights and optimize the whole network. The worked numbers below show what the demo should print.
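  • Working the chain rule by hand for the two-layer composition the demo builds ($y_1 = x w_1 + b_1$, $y_2 = y_1 w_2 + b_2$, with $x = 1$, $w_2 = 3$):

$$\frac{\partial y_2}{\partial y_1} = w_2 = 3, \qquad \frac{\partial y_1}{\partial w_1} = x = 1, \qquad \frac{\partial y_2}{\partial w_1} = \frac{\partial y_2}{\partial y_1}\cdot\frac{\partial y_1}{\partial w_1} = 3$$

which matches the three tensors printed below.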
  • Code demo:
import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

x = tf.constant(1.)
w1 = tf.constant(2.)
b1 = tf.constant(1.)
w2 = tf.constant(3.)
b2 = tf.constant(1.)

with tf.GradientTape(persistent=True) as tape:
    tape.watch([w1, b1, w2, b2])

    y1 = x * w1 + b1
    y2 = y1 * w2 + b2

dy2_dy1 = tape.gradient(y2, [y1])[0]
dy1_dw1 = tape.gradient(y1, [w1])[0]
dy2_dw1 = tape.gradient(y2, [w1])[0]
print(dy2_dy1)
print(dy1_dw1)
print(dy2_dw1)
  • Output:
tf.Tensor(3.0, shape=(), dtype=float32)
tf.Tensor(1.0, shape=(), dtype=float32)
tf.Tensor(3.0, shape=(), dtype=float32)

Process finished with exit code 0

III. Multi-Layer Perceptron Gradients

3.1 Recap: Single- and Multi-Output Perceptrons


3.2 Detailed Derivation

  • Therefore:

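Restating the slides' conclusion in standard back-propagation notation (sigmoid activations and MSE loss assumed; the $\delta$ notation is this note's reconstruction): for the output layer $K$,

$$\frac{\partial E}{\partial w_{jk}} = \delta_k\, O_j, \qquad \delta_k = (O_k - t_k)\,O_k\,(1 - O_k)$$

and for a hidden layer $J$, each node's $\delta$ aggregates the $\delta$'s of the nodes it feeds into:

$$\frac{\partial E}{\partial w_{ij}} = \delta_j\, O_i, \qquad \delta_j = O_j\,(1 - O_j)\sum_{k}\delta_k\, w_{jk}$$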

  • Summary: the gradient of any weight factors into a local $\delta$ term for the node it feeds and the output of the node it comes from, and the $\delta$'s are computed backwards from the output layer. That is exactly what back-propagation does.

IV. The Keras High-Level API

4.1 Five Main Features


4.2 Focus: Metrics

  • Evaluation metrics
  • For accuracy there is a ready-made meter: metrics.Accuracy().
  • If you just want a running average of some value, there is a more general meter: metrics.Mean(). Both follow the same update_state / result / reset_states workflow, sketched below.
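  • A minimal sketch of the meter workflow (the fed-in numbers are made-up illustrative values):

import tensorflow as tf
from tensorflow.keras import metrics

# Mean: a running average of whatever you feed it.
loss_meter = metrics.Mean()
loss_meter.update_state(0.5)            # accepts scalars...
loss_meter.update_state([1.0, 1.5])     # ...or lists/tensors
print(loss_meter.result().numpy())      # 1.0, the average of 0.5, 1.0, 1.5
loss_meter.reset_states()               # clear the buffer for the next window

# Accuracy: feed (y_true, y_pred) pairs, read off the running accuracy.
acc_meter = metrics.Accuracy()
acc_meter.update_state([0, 1, 2], [0, 1, 1])
print(acc_meter.result().numpy())       # 0.6666667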

4.3 Hands-On with the Metrics from 4.2

  • Every array created by NumPy has a shape attribute: a tuple giving the size of each dimension. Sometimes we need the size of one particular dimension.
  1. The 2-D case
>>> import numpy as np
>>> y = np.array([[1,2,3],[4,5,6]])
>>> print(y)
[[1 2 3]
 [4 5 6]]
>>> print(y.shape)
(2, 3)
>>> print(y.shape[0])
2
>>> print(y.shape[1])
3

  • As you can see, y above is a 2-row, 3-column 2-D array: y.shape[0] is the number of rows and y.shape[1] the number of columns.
  • In general, shape[0] is the size of the outermost dimension, shape[1] the next one in, and so on: as the index grows you move from the outermost dimension inwards.
import tensorflow as tf
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics

def preprocess(x, y):
    x = tf.cast(x, dtype=tf.float32) / 255.
    y = tf.cast(y, dtype=tf.int32)
    return x, y

batchsz = 128
(x, y), (x_val, y_val) = datasets.mnist.load_data()
print('datasets:', x.shape, y.shape, x.min(), x.max())

db = tf.data.Dataset.from_tensor_slices((x, y))
db = db.map(preprocess).shuffle(60000).batch(batchsz).repeat(10)

ds_val = tf.data.Dataset.from_tensor_slices((x_val, y_val))
ds_val = ds_val.map(preprocess).batch(batchsz)

network = Sequential([layers.Dense(256, activation='relu'),
                      layers.Dense(128, activation='relu'),
                      layers.Dense(64, activation='relu'),
                      layers.Dense(32, activation='relu'),
                      layers.Dense(10)])
network.build(input_shape=(None, 28 * 28))
network.summary()

optimizer = optimizers.Adam(lr=0.01)


# Step 1: track both the loss and the accuracy, so create two metrics:
# an Accuracy meter and a Mean meter for averaging the loss.
acc_meter = metrics.Accuracy()
loss_meter = metrics.Mean()

for step, (x, y) in enumerate(db):

    with tf.GradientTape() as tape:
        # [b, 28, 28] => [b, 784]
        x = tf.reshape(x, (-1, 28 * 28))
        # [b, 784] => [b, 10]
        out = network(x)
        # [b] => [b, 10]
        y_onehot = tf.one_hot(y, depth=10)
        # [b]
        loss = tf.reduce_mean(tf.losses.categorical_crossentropy(y_onehot, out, from_logits=True))


        # Step 2: update the loss meter after every loss computation, so the
        # reported loss is an accurate running average.
        loss_meter.update_state(loss)

    grads = tape.gradient(loss, network.trainable_variables)
    optimizer.apply_gradients(zip(grads, network.trainable_variables))

    if step % 100 == 0:

        # Step 3: print the meter's result at logging time.
        print(step, 'loss:', loss_meter.result().numpy())

        # Step 4: clear the loss buffer, so each printed value is the average loss
        # over the previous 100 steps, not just the loss at that single step.
        # This makes the printed numbers much more stable.
        loss_meter.reset_states()

    # At evaluation time, turn to the accuracy meter.
    if step % 500 == 0:
        total, total_correct = 0., 0

        # First: reset the acc_meter buffer to zero.
        acc_meter.reset_states()

        # Note: this inner loop reuses the name `step`, which is why the log prints 78.
        for step, (x, y) in enumerate(ds_val):
            # [b, 28, 28] => [b, 784]
            x = tf.reshape(x, (-1, 28 * 28))
            # [b, 784] => [b, 10]
            out = network(x)

            # [b, 10] => [b]
            pred = tf.argmax(out, axis=1)
            pred = tf.cast(pred, dtype=tf.int32)
            # bool type
            correct = tf.equal(pred, y)
            # bool tensor => int tensor => numpy
            total_correct += tf.reduce_sum(tf.cast(correct, dtype=tf.int32)).numpy()
            total += x.shape[0]

            # Then: update the acc_meter with this batch's predictions.
            acc_meter.update_state(y, pred)

        print(step, 'Evaluate Acc:', total_correct / total, acc_meter.result().numpy())

  • Output:
  • Note that besides the acc_meter we also computed the accuracy by hand, using the two variables total and total_correct: the total number of samples and the number of correct predictions. The two numbers printed per evaluation agree.
C:\Anaconda3\envs\tf2\python.exe E:/Codes/Demo/TF2/metrics.py
datasets: (60000, 28, 28) (60000,) 0 255
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                multiple                  200960    
_________________________________________________________________
dense_1 (Dense)              multiple                  32896     
_________________________________________________________________
dense_2 (Dense)              multiple                  8256      
_________________________________________________________________
dense_3 (Dense)              multiple                  2080      
_________________________________________________________________
dense_4 (Dense)              multiple                  330       
=================================================================
Total params: 244,522
Trainable params: 244,522
Non-trainable params: 0
_________________________________________________________________
0 loss: 2.351126
78 Evaluate Acc: 0.2671 0.2671
100 loss: 0.50758445
200 loss: 0.25146392
300 loss: 0.19939858
400 loss: 0.19180286
500 loss: 0.15045771
78 Evaluate Acc: 0.9591 0.9591
600 loss: 0.1392191
700 loss: 0.13043576
800 loss: 0.13935085
900 loss: 0.12730792
1000 loss: 0.119043715
78 Evaluate Acc: 0.9707 0.9707
1100 loss: 0.10553091
1200 loss: 0.10021621
1300 loss: 0.111887835
1400 loss: 0.10525742
1500 loss: 0.10338638
78 Evaluate Acc: 0.9668 0.9668
1600 loss: 0.09393982
1700 loss: 0.10706411
1800 loss: 0.0876565
1900 loss: 0.09356122
2000 loss: 0.07625327
78 Evaluate Acc: 0.969 0.969
2100 loss: 0.08937727
2200 loss: 0.08263406
2300 loss: 0.104584485
2400 loss: 0.10313261
2500 loss: 0.094911754
78 Evaluate Acc: 0.9671 0.9671
2600 loss: 0.07035615
2700 loss: 0.08280234
2800 loss: 0.0859525
2900 loss: 0.065915905
3000 loss: 0.06708269
78 Evaluate Acc: 0.9739 0.9739
3100 loss: 0.06600948
3200 loss: 0.084229834
3300 loss: 0.0853124
3400 loss: 0.064022705
3500 loss: 0.0710441
78 Evaluate Acc: 0.9659 0.9659
3600 loss: 0.07671407
3700 loss: 0.08920249
3800 loss: 0.05802461
3900 loss: 0.061849356
4000 loss: 0.071581885
78 Evaluate Acc: 0.9711 0.9711
4100 loss: 0.071715534
4200 loss: 0.06235297
4300 loss: 0.06333204
4400 loss: 0.07377879
4500 loss: 0.06499765
78 Evaluate Acc: 0.9749 0.9749
4600 loss: 0.067099705

Process finished with exit code 0

V. The Keras High-Level API, Part 2

5.1 Compile and Fit


5.2 The Standard Training Loop Before Keras

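  • The slide shows the manual loop we have been writing so far. Its skeleton, sketched with the names from the full example in 4.3 (network, optimizer and db assumed to be built as there):

for step, (x, y) in enumerate(db):
    # [b, 28, 28] => [b, 784]
    x = tf.reshape(x, (-1, 28 * 28))
    with tf.GradientTape() as tape:
        out = network(x)
        y_onehot = tf.one_hot(y, depth=10)
        loss = tf.reduce_mean(
            tf.losses.categorical_crossentropy(y_onehot, out, from_logits=True))
    # manual backward pass and parameter update
    grads = tape.gradient(loss, network.trainable_variables)
    optimizer.apply_gradients(zip(grads, network.trainable_variables))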

5.3 The Concise Standard Workflow with Keras

5.3.1 Training Code

  • With the new style, we first call compile as shown below and then simply call fit. The epochs argument corresponds to the 10 in the slide above: 10 epochs of training. For each batch the training loss is computed with the specified loss function, a gradient is obtained, and the Adam optimizer updates the corresponding parameters; the metrics are then evaluated at the specified interval. Note that we have not actually said when to test (previously we tested whenever step % 100 == 0), nor which dataset to test on, so the testing functionality is not used here and metrics=['accuracy'] could even be removed. With this, the whole training logic is fully specified in one place: how long to train, the optimizer, the loss, and the training dataset db. Very convenient.
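  • A minimal sketch of what the slide's training code amounts to (network and db as in 4.3; the same calls appear in the full program in 6.5):

# Compile wires up the optimizer, the loss and the metrics in one place...
network.compile(optimizer=optimizers.Adam(lr=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])
# ...and fit then runs the entire training loop for us.
network.fit(db, epochs=10)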
  • Running this prints Keras's standard progress bar, showing the step count, ETA, loss and accuracy (the same format seen in the output log under 6.5 below).

5.3.2 Test Code

  • Note that at test time the number of epochs is always 1. Why?
  • Because for testing we only need to run over all the samples once; running over them repeatedly would give the same result. So for testing the epoch count is fixed and the step count is fixed; all a test needs is the test dataset ds_val.
  • Compared with the earlier call there are two new arguments (the two highlighted boxes in the slide): validation_data, the dataset to validate on, and validation_freq (misspelled in the slide), how often, in epochs, to run validation. Training 2 epochs per test means we loop over db twice and then run one test, as the figure shows, printing one metric, accuracy, computed in the standard way.
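  • The corresponding call, sketched with the names from 4.3 (and used verbatim in 6.5):

# Validate on ds_val after every 2 training epochs.
network.fit(db, epochs=10, validation_data=ds_val, validation_freq=2)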

5.3.3 After Testing, One More Step: Evaluate

  • How is evaluate different from validation_freq above? validation_freq runs in the middle of training. Why in the middle? Because during training you don't know in advance when to stop: it might take 10 days, or 3 months, or longer. So every so often we run a validation or a test, and we can add code such as: if test_accuracy > 0.99, save the current state and break out of the loop. That is the benefit of validating mid-training: we can stop at any time, so training may terminate before finishing all 10 epochs; the 10 epochs are just the specified maximum, and we can stop early once the requirement is met. After the loop exits, evaluate runs one more test to verify the model; it only needs a dataset, and is effectively one more round of validation. Of course, this final test need not reuse ds_val: using a different dataset is fairer and more impartial.
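  • Sketched with the same assumed names (the call appears verbatim in 6.5):

# One final pass over a held-out dataset once training ends.
network.evaluate(ds_val)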
  • The output of evaluate: the final loss and accuracy.

5.3.4 Summary

  • Finally, once training is done, we save the model's parameters. The next time we hand the model over to a production environment, or to a customer, they load those parameters and then need to run prediction. How is prediction done? Following the earlier logic:
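  • A sketch of the prediction step (this is exactly the tail of the hands-on code in 6.5):

# Take one batch from the validation set and predict on it.
sample = next(iter(ds_val))
x, y = sample[0], sample[1]        # y is one-hot encoded here
pred = network.predict(x)          # [b, 10] logits
pred = tf.argmax(pred, axis=1)     # convert back to class indices
y = tf.argmax(y, axis=1)           # same for the labels, to compare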

VI. Custom Networks with Keras (Very Important!)

6.1 The keras.Sequential Container

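  • A minimal sketch of the Sequential container (mirroring the network built in 4.3):

import tensorflow as tf
from tensorflow.keras import layers, Sequential

# Sequential chains layers together and manages all of their parameters.
network = Sequential([layers.Dense(256, activation='relu'),
                      layers.Dense(10)])
network.build(input_shape=(None, 28 * 28))  # create the weights
network.summary()                           # print the parameter table
print(len(network.trainable_variables))     # 4: two kernels plus two biases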

6.2 Layer/Model

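  • The slide contrasts the two base classes: keras.layers.Layer is the base class for a computation step with its own variables, while keras.Model inherits from Layer and adds compile/fit/evaluate/predict plus weight saving and loading. Both are customized by overriding __init__ and call. A toy sketch of the Layer interface (my own illustration, not the slide's code):

import tensorflow as tf
from tensorflow import keras

class Scale(keras.layers.Layer):
    """A toy layer that multiplies its input by one learnable scalar."""
    def __init__(self):
        super(Scale, self).__init__()
        self.s = self.add_weight('s', [1])  # register a trainable variable

    def call(self, inputs, training=None):
        return inputs * self.s              # the forward pass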

6.3 Implementing Your Own Dense Layer, MyDense (Important)

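  • The slide's implementation appears in full inside the hands-on program in 6.5; its core is the following (shown with add_weight, since add_variable is a deprecated alias for it; layers as imported from tensorflow.keras):

class MyDense(layers.Layer):
    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()
        # add_weight registers the variables with the layer, so they show up
        # in trainable_variables and get updated by the optimizer.
        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        self.bias = self.add_weight('b', [outp_dim])

    def call(self, input, training=None):
        # the same x @ w + b computation that layers.Dense performs
        return input @ self.kernel + self.bias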

6.4 Building a 5-Layer Network from the Custom Layer (Full Code in 6.5)


6.5 Hands-On: the Custom Layer (6.3) and the Custom Network (6.4)

  • Code:
import tensorflow as tf
# use the public tf.keras path rather than the private tensorflow.python.keras
from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics
from tensorflow import keras
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

def preprocess(x, y):
    """
    x is a simple image, not a batch
    :param x:
    :param y:
    :return:
    """
    x = tf.cast(x, dtype=tf.float32) / 255.
    x = tf.reshape(x, [28*28])
    y = tf.cast(y, dtype=tf.int32)
    y = tf.one_hot(y, depth=10)
    return x, y

batchsz = 128
(x, y), (x_val, y_val) = datasets.mnist.load_data()
print("datasets: ", x.shape, y.shape, x.min(), x.max())


db = tf.data.Dataset.from_tensor_slices((x, y))
db = db.map(preprocess).shuffle(60000).batch(batchsz)
ds_val = tf.data.Dataset.from_tensor_slices((x_val, y_val))
ds_val = ds_val.map(preprocess).batch(batchsz)

iteration = iter(db)
sample = next(iteration)
print("迭代器获得为:", sample[0].shape, sample[1].shape)


class MyDense(layers.Layer):
    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()

        # add_weight (add_variable is its deprecated alias) registers the
        # variables with the layer
        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        self.bias = self.add_weight('b', [outp_dim])

    def call(self, input, training=None):
        out = input @ self.kernel + self.bias
        return out

class MyModel(keras.Model):

    def __init__(self):
        super(MyModel, self).__init__()

        self.fc1 = MyDense(28*28, 256)
        self.fc2 = MyDense(256, 128)
        self.fc3 = MyDense(128, 64)
        self.fc4 = MyDense(64, 32)
        self.fc5 = MyDense(32, 10)

    def call(self, inputs, training=None):

        x = self.fc1(inputs)  # fc1 is a layer instance; calling it runs __call__(), which dispatches to call()
        x = tf.nn.relu(x)
        x = self.fc2(x)
        x = tf.nn.relu(x)
        x = self.fc3(x)
        x = tf.nn.relu(x)
        x = self.fc4(x)
        x = tf.nn.relu(x)
        x = self.fc5(x)

        return x

network = MyModel()

network.compile(optimizer=optimizers.Adam(lr=0.01),
                loss=tf.losses.CategoricalCrossentropy(from_logits=True),
                metrics=['accuracy']
)

network.fit(db, epochs=5, validation_data=ds_val,
            validation_freq=2)


network.evaluate(ds_val)

sample = next(iter(ds_val))
x = sample[0]
y = sample[1] # one-hot
pred = network.predict(x) # [b, 10]
# convert back to number
y = tf.argmax(y, axis=1)  # [b, 1]
pred = tf.argmax(pred, axis=1)

print(pred)
print(y)

  • Output:
C:\Anaconda3\envs\tf2\python.exe E:/Codes/Demo/TF2/layers_model.py
datasets:  (60000, 28, 28) (60000,) 0 255
batch from the iterator: (128, 784) (128, 10)
Epoch 1/5

  1/469 [..............................] - ETA: 30:44 - loss: 2.3296 - accuracy: 0.1406
  8/469 [..............................] - ETA: 3:50 - loss: 2.0159 - accuracy: 0.2441 
 17/469 [>.............................] - ETA: 1:47 - loss: 1.4509 - accuracy: 0.3627
 26/469 [>.............................] - ETA: 1:09 - loss: 1.1746 - accuracy: 0.4412
 34/469 [=>............................] - ETA: 53s - loss: 1.0164 - accuracy: 0.4921 
 43/469 [=>............................] - ETA: 41s - loss: 0.8971 - accuracy: 0.5363
 52/469 [==>...........................] - ETA: 34s - loss: 0.8000 - accuracy: 0.5712
 61/469 [==>...........................] - ETA: 28s - loss: 0.7249 - accuracy: 0.5997
 70/469 [===>..........................] - ETA: 24s - loss: 0.6783 - accuracy: 0.6235
 79/469 [====>.........................] - ETA: 21s - loss: 0.6375 - accuracy: 0.6435
 88/469 [====>.........................] - ETA: 19s - loss: 0.6002 - accuracy: 0.6605
 97/469 [=====>........................] - ETA: 17s - loss: 0.5694 - accuracy: 0.6753
105/469 [=====>........................] - ETA: 15s - loss: 0.5466 - accuracy: 0.6870
114/469 [======>.......................] - ETA: 14s - loss: 0.5273 - accuracy: 0.6988
123/469 [======>.......................] - ETA: 13s - loss: 0.5061 - accuracy: 0.7093
132/469 [=======>......................] - ETA: 12s - loss: 0.4883 - accuracy: 0.7188
141/469 [========>.....................] - ETA: 11s - loss: 0.4724 - accuracy: 0.7273
149/469 [========>.....................] - ETA: 10s - loss: 0.4596 - accuracy: 0.7343
158/469 [=========>....................] - ETA: 9s - loss: 0.4463 - accuracy: 0.7416 
166/469 [=========>....................] - ETA: 9s - loss: 0.4361 - accuracy: 0.7475
174/469 [==========>...................] - ETA: 8s - loss: 0.4260 - accuracy: 0.7531
182/469 [==========>...................] - ETA: 8s - loss: 0.4160 - accuracy: 0.7583
190/469 [===========>..................] - ETA: 7s - loss: 0.4083 - accuracy: 0.7632
198/469 [===========>..................] - ETA: 7s - loss: 0.4013 - accuracy: 0.7677
206/469 [============>.................] - ETA: 6s - loss: 0.3923 - accuracy: 0.7720
214/469 [============>.................] - ETA: 6s - loss: 0.3869 - accuracy: 0.7761
223/469 [=============>................] - ETA: 5s - loss: 0.3794 - accuracy: 0.7804
232/469 [=============>................] - ETA: 5s - loss: 0.3731 - accuracy: 0.7845
241/469 [==============>...............] - ETA: 5s - loss: 0.3671 - accuracy: 0.7883
250/469 [==============>...............] - ETA: 4s - loss: 0.3616 - accuracy: 0.7920
259/469 [===============>..............] - ETA: 4s - loss: 0.3556 - accuracy: 0.7954
268/469 [================>.............] - ETA: 4s - loss: 0.3490 - accuracy: 0.7987
277/469 [================>.............] - ETA: 3s - loss: 0.3440 - accuracy: 0.8018
286/469 [=================>............] - ETA: 3s - loss: 0.3391 - accuracy: 0.8048
295/469 [=================>............] - ETA: 3s - loss: 0.3355 - accuracy: 0.8076
304/469 [==================>...........] - ETA: 3s - loss: 0.3308 - accuracy: 0.8103
312/469 [==================>...........] - ETA: 2s - loss: 0.3266 - accuracy: 0.8126
320/469 [===================>..........] - ETA: 2s - loss: 0.3220 - accuracy: 0.8149
328/469 [===================>..........] - ETA: 2s - loss: 0.3192 - accuracy: 0.8170
337/469 [====================>.........] - ETA: 2s - loss: 0.3155 - accuracy: 0.8194
346/469 [=====================>........] - ETA: 2s - loss: 0.3116 - accuracy: 0.8216
355/469 [=====================>........] - ETA: 1s - loss: 0.3085 - accuracy: 0.8237
364/469 [======================>.......] - ETA: 1s - loss: 0.3047 - accuracy: 0.8258
373/469 [======================>.......] - ETA: 1s - loss: 0.3010 - accuracy: 0.8278
382/469 [=======================>......] - ETA: 1s - loss: 0.2990 - accuracy: 0.8298
391/469 [========================>.....] - ETA: 1s - loss: 0.2966 - accuracy: 0.8316
400/469 [========================>.....] - ETA: 1s - loss: 0.2931 - accuracy: 0.8334
409/469 [=========================>....] - ETA: 0s - loss: 0.2908 - accuracy: 0.8352
418/469 [=========================>....] - ETA: 0s - loss: 0.2886 - accuracy: 0.8368
427/469 [==========================>...] - ETA: 0s - loss: 0.2869 - accuracy: 0.8385
436/469 [==========================>...] - ETA: 0s - loss: 0.2837 - accuracy: 0.8400
445/469 [===========================>..] - ETA: 0s - loss: 0.2810 - accuracy: 0.8416
454/469 [============================>.] - ETA: 0s - loss: 0.2783 - accuracy: 0.8431
463/469 [============================>.] - ETA: 0s - loss: 0.2765 - accuracy: 0.8445
469/469 [==============================] - 7s 15ms/step - loss: 0.2751 - accuracy: 0.8456
Epoch 2/5

  1/469 [..............................] - ETA: 6:22 - loss: 0.1546 - accuracy: 0.9531
 11/469 [..............................] - ETA: 36s - loss: 0.1503 - accuracy: 0.9531 
 21/469 [>.............................] - ETA: 19s - loss: 0.1609 - accuracy: 0.9518
 31/469 [>.............................] - ETA: 13s - loss: 0.1630 - accuracy: 0.9516
 41/469 [=>............................] - ETA: 10s - loss: 0.1648 - accuracy: 0.9515
 50/469 [==>...........................] - ETA: 9s - loss: 0.1633 - accuracy: 0.9516 
 59/469 [==>...........................] - ETA: 7s - loss: 0.1633 - accuracy: 0.9517
 68/469 [===>..........................] - ETA: 7s - loss: 0.1564 - accuracy: 0.9519
 77/469 [===>..........................] - ETA: 6s - loss: 0.1558 - accuracy: 0.9522
 86/469 [====>.........................] - ETA: 5s - loss: 0.1581 - accuracy: 0.9525
 95/469 [=====>........................] - ETA: 5s - loss: 0.1602 - accuracy: 0.9527
104/469 [=====>........................] - ETA: 4s - loss: 0.1588 - accuracy: 0.9528
113/469 [======>.......................] - ETA: 4s - loss: 0.1577 - accuracy: 0.9530
122/469 [======>.......................] - ETA: 4s - loss: 0.1582 - accuracy: 0.9532
131/469 [=======>......................] - ETA: 4s - loss: 0.1571 - accuracy: 0.9533
140/469 [=======>......................] - ETA: 3s - loss: 0.1550 - accuracy: 0.9535
149/469 [========>.....................] - ETA: 3s - loss: 0.1530 - accuracy: 0.9537
158/469 [=========>....................] - ETA: 3s - loss: 0.1516 - accuracy: 0.9539
167/469 [=========>....................] - ETA: 3s - loss: 0.1506 - accuracy: 0.9541
176/469 [==========>...................] - ETA: 3s - loss: 0.1513 - accuracy: 0.9543
185/469 [==========>...................] - ETA: 2s - loss: 0.1513 - accuracy: 0.9544
194/469 [===========>..................] - ETA: 2s - loss: 0.1498 - accuracy: 0.9546
203/469 [===========>..................] - ETA: 2s - loss: 0.1480 - accuracy: 0.9548
212/469 [============>.................] - ETA: 2s - loss: 0.1470 - accuracy: 0.9549
221/469 [=============>................] - ETA: 2s - loss: 0.1455 - accuracy: 0.9551
230/469 [=============>................] - ETA: 2s - loss: 0.1439 - accuracy: 0.9553
239/469 [==============>...............] - ETA: 2s - loss: 0.1428 - accuracy: 0.9555
248/469 [==============>...............] - ETA: 2s - loss: 0.1411 - accuracy: 0.9556
257/469 [===============>..............] - ETA: 1s - loss: 0.1402 - accuracy: 0.9558
266/469 [================>.............] - ETA: 1s - loss: 0.1392 - accuracy: 0.9560
275/469 [================>.............] - ETA: 1s - loss: 0.1392 - accuracy: 0.9562
284/469 [=================>............] - ETA: 1s - loss: 0.1384 - accuracy: 0.9563
293/469 [=================>............] - ETA: 1s - loss: 0.1374 - accuracy: 0.9565
302/469 [==================>...........] - ETA: 1s - loss: 0.1367 - accuracy: 0.9566
311/469 [==================>...........] - ETA: 1s - loss: 0.1364 - accuracy: 0.9568
320/469 [===================>..........] - ETA: 1s - loss: 0.1354 - accuracy: 0.9569
329/469 [====================>.........] - ETA: 1s - loss: 0.1358 - accuracy: 0.9571
338/469 [====================>.........] - ETA: 1s - loss: 0.1357 - accuracy: 0.9572
347/469 [=====================>........] - ETA: 1s - loss: 0.1353 - accuracy: 0.9573
356/469 [=====================>........] - ETA: 0s - loss: 0.1343 - accuracy: 0.9574
365/469 [======================>.......] - ETA: 0s - loss: 0.1335 - accuracy: 0.9576
374/469 [======================>.......] - ETA: 0s - loss: 0.1327 - accuracy: 0.9577
383/469 [=======================>......] - ETA: 0s - loss: 0.1331 - accuracy: 0.9578
392/469 [========================>.....] - ETA: 0s - loss: 0.1329 - accuracy: 0.9579
401/469 [========================>.....] - ETA: 0s - loss: 0.1321 - accuracy: 0.9580
410/469 [=========================>....] - ETA: 0s - loss: 0.1318 - accuracy: 0.9582
419/469 [=========================>....] - ETA: 0s - loss: 0.1320 - accuracy: 0.9583
428/469 [==========================>...] - ETA: 0s - loss: 0.1321 - accuracy: 0.9584
437/469 [==========================>...] - ETA: 0s - loss: 0.1320 - accuracy: 0.9585
446/469 [===========================>..] - ETA: 0s - loss: 0.1321 - accuracy: 0.9586
455/469 [============================>.] - ETA: 0s - loss: 0.1319 - accuracy: 0.9586
463/469 [============================>.] - ETA: 0s - loss: 0.1317 - accuracy: 0.9587
469/469 [==============================] - 4s 9ms/step - loss: 0.1313 - accuracy: 0.9588 - val_loss: 0.1248 - val_accuracy: 0.9650
Epoch 3/5

  1/469 [..............................] - ETA: 6:49 - loss: 0.0761 - accuracy: 0.9766
 10/469 [..............................] - ETA: 42s - loss: 0.1143 - accuracy: 0.9681 
 19/469 [>.............................] - ETA: 23s - loss: 0.1244 - accuracy: 0.9657
 28/469 [>.............................] - ETA: 16s - loss: 0.1271 - accuracy: 0.9652
 37/469 [=>............................] - ETA: 12s - loss: 0.1280 - accuracy: 0.9652
 46/469 [=>............................] - ETA: 10s - loss: 0.1247 - accuracy: 0.9654
 55/469 [==>...........................] - ETA: 9s - loss: 0.1182 - accuracy: 0.9656 
 64/469 [===>..........................] - ETA: 7s - loss: 0.1176 - accuracy: 0.9659
 73/469 [===>..........................] - ETA: 7s - loss: 0.1163 - accuracy: 0.9662
 82/469 [====>.........................] - ETA: 6s - loss: 0.1187 - accuracy: 0.9664
 91/469 [====>.........................] - ETA: 5s - loss: 0.1188 - accuracy: 0.9665
100/469 [=====>........................] - ETA: 5s - loss: 0.1207 - accuracy: 0.9665
109/469 [=====>........................] - ETA: 5s - loss: 0.1184 - accuracy: 0.9666
118/469 [======>.......................] - ETA: 4s - loss: 0.1212 - accuracy: 0.9666
127/469 [=======>......................] - ETA: 4s - loss: 0.1212 - accuracy: 0.9667
136/469 [=======>......................] - ETA: 4s - loss: 0.1183 - accuracy: 0.9668
145/469 [========>.....................] - ETA: 3s - loss: 0.1179 - accuracy: 0.9668
153/469 [========>.....................] - ETA: 3s - loss: 0.1170 - accuracy: 0.9669
162/469 [=========>....................] - ETA: 3s - loss: 0.1160 - accuracy: 0.9670
170/469 [=========>....................] - ETA: 3s - loss: 0.1160 - accuracy: 0.9671
177/469 [==========>...................] - ETA: 3s - loss: 0.1158 - accuracy: 0.9672
184/469 [==========>...................] - ETA: 3s - loss: 0.1150 - accuracy: 0.9672
192/469 [===========>..................] - ETA: 2s - loss: 0.1157 - accuracy: 0.9673
199/469 [===========>..................] - ETA: 2s - loss: 0.1152 - accuracy: 0.9673
207/469 [============>.................] - ETA: 2s - loss: 0.1135 - accuracy: 0.9674
215/469 [============>.................] - ETA: 2s - loss: 0.1130 - accuracy: 0.9675
223/469 [=============>................] - ETA: 2s - loss: 0.1129 - accuracy: 0.9675
231/469 [=============>................] - ETA: 2s - loss: 0.1131 - accuracy: 0.9676
239/469 [==============>...............] - ETA: 2s - loss: 0.1141 - accuracy: 0.9676
247/469 [==============>...............] - ETA: 2s - loss: 0.1135 - accuracy: 0.9677
254/469 [===============>..............] - ETA: 2s - loss: 0.1138 - accuracy: 0.9677
262/469 [===============>..............] - ETA: 2s - loss: 0.1125 - accuracy: 0.9677
270/469 [================>.............] - ETA: 1s - loss: 0.1119 - accuracy: 0.9678
277/469 [================>.............] - ETA: 1s - loss: 0.1119 - accuracy: 0.9678
285/469 [=================>............] - ETA: 1s - loss: 0.1107 - accuracy: 0.9679
294/469 [=================>............] - ETA: 1s - loss: 0.1098 - accuracy: 0.9679
303/469 [==================>...........] - ETA: 1s - loss: 0.1093 - accuracy: 0.9680
311/469 [==================>...........] - ETA: 1s - loss: 0.1087 - accuracy: 0.9680
320/469 [===================>..........] - ETA: 1s - loss: 0.1079 - accuracy: 0.9681
329/469 [====================>.........] - ETA: 1s - loss: 0.1075 - accuracy: 0.9681
338/469 [====================>.........] - ETA: 1s - loss: 0.1080 - accuracy: 0.9682
347/469 [=====================>........] - ETA: 1s - loss: 0.1087 - accuracy: 0.9682
356/469 [=====================>........] - ETA: 0s - loss: 0.1092 - accuracy: 0.9683
365/469 [======================>.......] - ETA: 0s - loss: 0.1090 - accuracy: 0.9683
374/469 [======================>.......] - ETA: 0s - loss: 0.1087 - accuracy: 0.9684
383/469 [=======================>......] - ETA: 0s - loss: 0.1086 - accuracy: 0.9684
392/469 [========================>.....] - ETA: 0s - loss: 0.1086 - accuracy: 0.9684
401/469 [========================>.....] - ETA: 0s - loss: 0.1077 - accuracy: 0.9685
410/469 [=========================>....] - ETA: 0s - loss: 0.1078 - accuracy: 0.9685
419/469 [=========================>....] - ETA: 0s - loss: 0.1078 - accuracy: 0.9686
428/469 [==========================>...] - ETA: 0s - loss: 0.1075 - accuracy: 0.9686
437/469 [==========================>...] - ETA: 0s - loss: 0.1070 - accuracy: 0.9686
446/469 [===========================>..] - ETA: 0s - loss: 0.1074 - accuracy: 0.9687
455/469 [============================>.] - ETA: 0s - loss: 0.1078 - accuracy: 0.9687
463/469 [============================>.] - ETA: 0s - loss: 0.1077 - accuracy: 0.9687
469/469 [==============================] - 4s 8ms/step - loss: 0.1075 - accuracy: 0.9688
Epoch 4/5

  1/469 [..............................] - ETA: 7:01 - loss: 0.0781 - accuracy: 0.9766
  8/469 [..............................] - ETA: 54s - loss: 0.0913 - accuracy: 0.9714 
 14/469 [..............................] - ETA: 32s - loss: 0.1014 - accuracy: 0.9698
 23/469 [>.............................] - ETA: 20s - loss: 0.0984 - accuracy: 0.9689
 31/469 [>.............................] - ETA: 15s - loss: 0.1058 - accuracy: 0.9688
 40/469 [=>............................] - ETA: 12s - loss: 0.1017 - accuracy: 0.9688
 49/469 [==>...........................] - ETA: 10s - loss: 0.1062 - accuracy: 0.9686
 58/469 [==>...........................] - ETA: 8s - loss: 0.1098 - accuracy: 0.9686 
 68/469 [===>..........................] - ETA: 7s - loss: 0.1083 - accuracy: 0.9685
 77/469 [===>..........................] - ETA: 7s - loss: 0.1084 - accuracy: 0.9683
 86/469 [====>.........................] - ETA: 6s - loss: 0.1111 - accuracy: 0.9682
 95/469 [=====>........................] - ETA: 5s - loss: 0.1120 - accuracy: 0.9682
104/469 [=====>........................] - ETA: 5s - loss: 0.1117 - accuracy: 0.9680
113/469 [======>.......................] - ETA: 5s - loss: 0.1098 - accuracy: 0.9680
122/469 [======>.......................] - ETA: 4s - loss: 0.1101 - accuracy: 0.9680
131/469 [=======>......................] - ETA: 4s - loss: 0.1094 - accuracy: 0.9680
140/469 [=======>......................] - ETA: 4s - loss: 0.1067 - accuracy: 0.9681
149/469 [========>.....................] - ETA: 3s - loss: 0.1069 - accuracy: 0.9682
158/469 [=========>....................] - ETA: 3s - loss: 0.1059 - accuracy: 0.9683
167/469 [=========>....................] - ETA: 3s - loss: 0.1044 - accuracy: 0.9684
176/469 [==========>...................] - ETA: 3s - loss: 0.1030 - accuracy: 0.9685
185/469 [==========>...................] - ETA: 3s - loss: 0.1024 - accuracy: 0.9687
194/469 [===========>..................] - ETA: 2s - loss: 0.1016 - accuracy: 0.9688
203/469 [===========>..................] - ETA: 2s - loss: 0.1004 - accuracy: 0.9689
212/469 [============>.................] - ETA: 2s - loss: 0.0985 - accuracy: 0.9691
220/469 [=============>................] - ETA: 2s - loss: 0.0980 - accuracy: 0.9692
229/469 [=============>................] - ETA: 2s - loss: 0.0971 - accuracy: 0.9693
238/469 [==============>...............] - ETA: 2s - loss: 0.0966 - accuracy: 0.9694
247/469 [==============>...............] - ETA: 2s - loss: 0.0958 - accuracy: 0.9696
256/469 [===============>..............] - ETA: 2s - loss: 0.0962 - accuracy: 0.9697
265/469 [===============>..............] - ETA: 1s - loss: 0.0957 - accuracy: 0.9698
274/469 [================>.............] - ETA: 1s - loss: 0.0967 - accuracy: 0.9699
281/469 [================>.............] - ETA: 1s - loss: 0.0968 - accuracy: 0.9700
290/469 [=================>............] - ETA: 1s - loss: 0.0960 - accuracy: 0.9701
299/469 [==================>...........] - ETA: 1s - loss: 0.0956 - accuracy: 0.9702
308/469 [==================>...........] - ETA: 1s - loss: 0.0953 - accuracy: 0.9702
317/469 [===================>..........] - ETA: 1s - loss: 0.0943 - accuracy: 0.9703
326/469 [===================>..........] - ETA: 1s - loss: 0.0942 - accuracy: 0.9704
335/469 [====================>.........] - ETA: 1s - loss: 0.0939 - accuracy: 0.9705
344/469 [=====================>........] - ETA: 1s - loss: 0.0942 - accuracy: 0.9706
353/469 [=====================>........] - ETA: 1s - loss: 0.0936 - accuracy: 0.9707
362/469 [======================>.......] - ETA: 0s - loss: 0.0936 - accuracy: 0.9707
371/469 [======================>.......] - ETA: 0s - loss: 0.0933 - accuracy: 0.9708
380/469 [=======================>......] - ETA: 0s - loss: 0.0933 - accuracy: 0.9709
389/469 [=======================>......] - ETA: 0s - loss: 0.0941 - accuracy: 0.9709
398/469 [========================>.....] - ETA: 0s - loss: 0.0942 - accuracy: 0.9710
407/469 [=========================>....] - ETA: 0s - loss: 0.0939 - accuracy: 0.9711
416/469 [=========================>....] - ETA: 0s - loss: 0.0934 - accuracy: 0.9711
425/469 [==========================>...] - ETA: 0s - loss: 0.0930 - accuracy: 0.9712
434/469 [==========================>...] - ETA: 0s - loss: 0.0928 - accuracy: 0.9713
443/469 [===========================>..] - ETA: 0s - loss: 0.0931 - accuracy: 0.9713
450/469 [===========================>..] - ETA: 0s - loss: 0.0926 - accuracy: 0.9714
458/469 [============================>.] - ETA: 0s - loss: 0.0927 - accuracy: 0.9714
465/469 [============================>.] - ETA: 0s - loss: 0.0927 - accuracy: 0.9715
469/469 [==============================] - 4s 9ms/step - loss: 0.0931 - accuracy: 0.9715 - val_loss: 0.1442 - val_accuracy: 0.9645
Epoch 5/5

  1/469 [..............................] - ETA: 6:39 - loss: 0.1922 - accuracy: 0.9375
 10/469 [..............................] - ETA: 41s - loss: 0.1019 - accuracy: 0.9584 
 19/469 [>.............................] - ETA: 22s - loss: 0.0993 - accuracy: 0.9621
 28/469 [>.............................] - ETA: 15s - loss: 0.0959 - accuracy: 0.9641
 37/469 [=>............................] - ETA: 12s - loss: 0.1014 - accuracy: 0.9652
 46/469 [=>............................] - ETA: 10s - loss: 0.1008 - accuracy: 0.9661
 55/469 [==>...........................] - ETA: 8s - loss: 0.0937 - accuracy: 0.9668 
 64/469 [===>..........................] - ETA: 7s - loss: 0.0934 - accuracy: 0.9674
 73/469 [===>..........................] - ETA: 6s - loss: 0.0945 - accuracy: 0.9681
 82/469 [====>.........................] - ETA: 6s - loss: 0.0948 - accuracy: 0.9685
 91/469 [====>.........................] - ETA: 5s - loss: 0.0919 - accuracy: 0.9690
100/469 [=====>........................] - ETA: 5s - loss: 0.0939 - accuracy: 0.9693
109/469 [=====>........................] - ETA: 4s - loss: 0.0926 - accuracy: 0.9697
118/469 [======>.......................] - ETA: 4s - loss: 0.0929 - accuracy: 0.9700
127/469 [=======>......................] - ETA: 4s - loss: 0.0929 - accuracy: 0.9703
136/469 [=======>......................] - ETA: 4s - loss: 0.0914 - accuracy: 0.9705
145/469 [========>.....................] - ETA: 3s - loss: 0.0896 - accuracy: 0.9707
154/469 [========>.....................] - ETA: 3s - loss: 0.0875 - accuracy: 0.9710
163/469 [=========>....................] - ETA: 3s - loss: 0.0858 - accuracy: 0.9712
172/469 [==========>...................] - ETA: 3s - loss: 0.0861 - accuracy: 0.9714
181/469 [==========>...................] - ETA: 3s - loss: 0.0854 - accuracy: 0.9716
190/469 [===========>..................] - ETA: 2s - loss: 0.0843 - accuracy: 0.9718
199/469 [===========>..................] - ETA: 2s - loss: 0.0847 - accuracy: 0.9720
208/469 [============>.................] - ETA: 2s - loss: 0.0838 - accuracy: 0.9722
217/469 [============>.................] - ETA: 2s - loss: 0.0853 - accuracy: 0.9723
226/469 [=============>................] - ETA: 2s - loss: 0.0877 - accuracy: 0.9724
235/469 [==============>...............] - ETA: 2s - loss: 0.0887 - accuracy: 0.9726
244/469 [==============>...............] - ETA: 2s - loss: 0.0888 - accuracy: 0.9727
253/469 [===============>..............] - ETA: 2s - loss: 0.0892 - accuracy: 0.9727
262/469 [===============>..............] - ETA: 1s - loss: 0.0880 - accuracy: 0.9728
271/469 [================>.............] - ETA: 1s - loss: 0.0877 - accuracy: 0.9729
280/469 [================>.............] - ETA: 1s - loss: 0.0873 - accuracy: 0.9730
289/469 [=================>............] - ETA: 1s - loss: 0.0873 - accuracy: 0.9731
298/469 [==================>...........] - ETA: 1s - loss: 0.0869 - accuracy: 0.9732
307/469 [==================>...........] - ETA: 1s - loss: 0.0875 - accuracy: 0.9733
316/469 [===================>..........] - ETA: 1s - loss: 0.0868 - accuracy: 0.9733
325/469 [===================>..........] - ETA: 1s - loss: 0.0871 - accuracy: 0.9734
334/469 [====================>.........] - ETA: 1s - loss: 0.0874 - accuracy: 0.9735
343/469 [====================>.........] - ETA: 1s - loss: 0.0877 - accuracy: 0.9735
352/469 [=====================>........] - ETA: 0s - loss: 0.0884 - accuracy: 0.9736
361/469 [======================>.......] - ETA: 0s - loss: 0.0884 - accuracy: 0.9736
370/469 [======================>.......] - ETA: 0s - loss: 0.0878 - accuracy: 0.9737
379/469 [=======================>......] - ETA: 0s - loss: 0.0876 - accuracy: 0.9737
388/469 [=======================>......] - ETA: 0s - loss: 0.0877 - accuracy: 0.9738
397/469 [========================>.....] - ETA: 0s - loss: 0.0875 - accuracy: 0.9738
406/469 [========================>.....] - ETA: 0s - loss: 0.0874 - accuracy: 0.9739
415/469 [=========================>....] - ETA: 0s - loss: 0.0870 - accuracy: 0.9739
424/469 [==========================>...] - ETA: 0s - loss: 0.0870 - accuracy: 0.9740
433/469 [==========================>...] - ETA: 0s - loss: 0.0870 - accuracy: 0.9740
442/469 [===========================>..] - ETA: 0s - loss: 0.0868 - accuracy: 0.9741
451/469 [===========================>..] - ETA: 0s - loss: 0.0868 - accuracy: 0.9741
460/469 [============================>.] - ETA: 0s - loss: 0.0869 - accuracy: 0.9741
467/469 [============================>.] - ETA: 0s - loss: 0.0870 - accuracy: 0.9742
469/469 [==============================] - 4s 8ms/step - loss: 0.0869 - accuracy: 0.9742

 1/79 [..............................] - ETA: 0s - loss: 0.0397 - accuracy: 0.9922
17/79 [=====>........................] - ETA: 0s - loss: 0.1593 - accuracy: 0.9573
33/79 [===========>..................] - ETA: 0s - loss: 0.1624 - accuracy: 0.9590
49/79 [=================>............] - ETA: 0s - loss: 0.1476 - accuracy: 0.9624
65/79 [=======================>......] - ETA: 0s - loss: 0.1252 - accuracy: 0.9685
79/79 [==============================] - 0s 3ms/step - loss: 0.1161 - accuracy: 0.9697
tf.Tensor(
[7 2 1 0 4 1 4 9 5 9 0 6 9 0 1 5 9 7 3 4 9 6 6 5 4 0 7 4 0 1 3 1 3 4 7 2 7
 1 2 1 1 7 4 2 3 5 1 2 4 4 6 3 5 5 6 0 4 1 9 5 7 8 9 3 7 4 6 4 3 0 7 0 2 9
 1 7 3 2 9 7 7 6 2 7 8 4 7 3 6 1 3 6 9 3 1 4 1 7 6 9 6 0 5 4 9 9 2 1 9 4 8
 7 3 9 7 9 4 4 9 2 5 4 7 6 7 9 0 5], shape=(128,), dtype=int64)
tf.Tensor(
[7 2 1 0 4 1 4 9 5 9 0 6 9 0 1 5 9 7 3 4 9 6 6 5 4 0 7 4 0 1 3 1 3 4 7 2 7
 1 2 1 1 7 4 2 3 5 1 2 4 4 6 3 5 5 6 0 4 1 9 5 7 8 9 3 7 4 6 4 3 0 7 0 2 9
 1 7 3 2 9 7 7 6 2 7 8 4 7 3 6 1 3 6 9 3 1 4 1 7 6 9 6 0 5 4 9 9 2 1 9 4 8
 7 3 9 7 4 4 4 9 2 5 4 7 6 7 9 0 5], shape=(128,), dtype=int64)

Process finished with exit code 0

VII. Message me privately if you need the full set of course videos + PPT + code!
