Keras Neural Network Primer: 1. A Simple Neural Network for MNIST Recognition with ~89% Accuracy

2023-10-30

Step 1: Import the required modules

import tensorflow as tf
from tensorflow.keras import layers, models          # layer and model building blocks
from tensorflow.keras.datasets import mnist          # built-in MNIST dataset loader
import numpy as np

Step 2: Load the dataset

# Keras ships MNIST as a 60,000-image training split and a 10,000-image test split;
# the test split is used here as the validation set.
(train_data, train_label), (val_data, val_label) = mnist.load_data()
# Scale pixel values from [0, 255] down to [0, 1]
train_data = train_data / 255.0
val_data = val_data / 255.0
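
As an optional sanity check (not part of the original steps), you can confirm the array shapes and the value range before training:

print(train_data.shape)                     # (60000, 28, 28)
print(val_data.shape)                       # (10000, 28, 28)
print(train_data.min(), train_data.max())   # 0.0 1.0 after normalization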

Step 3: Build the neural network

model = models.Sequential([
    layers.Flatten(input_shape=(28, 28)),        # flatten each 28x28 image into a 784-vector
    layers.Dense(32, activation='relu'),
    layers.Dense(16, activation='relu'),
    layers.Dense(10, activation='softmax')       # class probabilities for digits 0-9
])
model.compile(optimizer=tf.optimizers.SGD(learning_rate=0.001),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
model.summary()
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_1 (Flatten)          (None, 784)               0         
_________________________________________________________________
dense_9 (Dense)              (None, 32)                25120     
_________________________________________________________________
dense_10 (Dense)             (None, 16)                528       
_________________________________________________________________
dense_11 (Dense)             (None, 10)                170       
=================================================================
Total params: 25,818
Trainable params: 25,818
Non-trainable params: 0
_________________________________________________________________
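
The parameter counts follow directly from each Dense layer's weight matrix plus its bias vector: 784×32+32 = 25,120, 32×16+16 = 528, and 16×10+10 = 170, for a total of 25,818. A quick check of this arithmetic:

# Sanity check of the parameter counts reported by model.summary()
print(784 * 32 + 32)         # 25120
print(32 * 16 + 16)          # 528
print(16 * 10 + 10)          # 170
print(25120 + 528 + 170)     # 25818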

Step 4: Train the neural network

model.fit(train_data, train_label,
          epochs=160,
          batch_size=512,
          validation_data=(val_data, val_label))
Epoch 1/160
118/118 [==============================] - 1s 5ms/step - loss: 2.3386 - sparse_categorical_accuracy: 0.1019 - val_loss: 2.3109 - val_sparse_categorical_accuracy: 0.1097
Epoch 2/160
118/118 [==============================] - 0s 2ms/step - loss: 2.2886 - sparse_categorical_accuracy: 0.1236 - val_loss: 2.2737 - val_sparse_categorical_accuracy: 0.1301
Epoch 3/160
118/118 [==============================] - 0s 3ms/step - loss: 2.2576 - sparse_categorical_accuracy: 0.1471 - val_loss: 2.2461 - val_sparse_categorical_accuracy: 0.1534
Epoch 4/160
118/118 [==============================] - 0s 2ms/step - loss: 2.2314 - sparse_categorical_accuracy: 0.1738 - val_loss: 2.2202 - val_sparse_categorical_accuracy: 0.1830
Epoch 5/160
118/118 [==============================] - 0s 2ms/step - loss: 2.2053 - sparse_categorical_accuracy: 0.2046 - val_loss: 2.1932 - val_sparse_categorical_accuracy: 0.2137
Epoch 6/160
118/118 [==============================] - 0s 2ms/step - loss: 2.1778 - sparse_categorical_accuracy: 0.2369 - val_loss: 2.1644 - val_sparse_categorical_accuracy: 0.2512
Epoch 7/160
118/118 [==============================] - 0s 3ms/step - loss: 2.1485 - sparse_categorical_accuracy: 0.2714 - val_loss: 2.1337 - val_sparse_categorical_accuracy: 0.2874
Epoch 8/160
118/118 [==============================] - 0s 3ms/step - loss: 2.1173 - sparse_categorical_accuracy: 0.3032 - val_loss: 2.1013 - val_sparse_categorical_accuracy: 0.3162
Epoch 9/160
118/118 [==============================] - 0s 3ms/step - loss: 2.0847 - sparse_categorical_accuracy: 0.3284 - val_loss: 2.0678 - val_sparse_categorical_accuracy: 0.3446
Epoch 10/160
118/118 [==============================] - 0s 3ms/step - loss: 2.0511 - sparse_categorical_accuracy: 0.3503 - val_loss: 2.0332 - val_sparse_categorical_accuracy: 0.3654
Epoch 11/160
118/118 [==============================] - 0s 3ms/step - loss: 2.0166 - sparse_categorical_accuracy: 0.3687 - val_loss: 1.9976 - val_sparse_categorical_accuracy: 0.3843
Epoch 12/160
118/118 [==============================] - 0s 3ms/step - loss: 1.9812 - sparse_categorical_accuracy: 0.3861 - val_loss: 1.9612 - val_sparse_categorical_accuracy: 0.4039
Epoch 13/160
118/118 [==============================] - 0s 3ms/step - loss: 1.9451 - sparse_categorical_accuracy: 0.4027 - val_loss: 1.9239 - val_sparse_categorical_accuracy: 0.4177
Epoch 14/160
118/118 [==============================] - 0s 3ms/step - loss: 1.9081 - sparse_categorical_accuracy: 0.4186 - val_loss: 1.8858 - val_sparse_categorical_accuracy: 0.4357
Epoch 15/160
118/118 [==============================] - 0s 3ms/step - loss: 1.8704 - sparse_categorical_accuracy: 0.4367 - val_loss: 1.8471 - val_sparse_categorical_accuracy: 0.4526
Epoch 16/160
118/118 [==============================] - 0s 3ms/step - loss: 1.8318 - sparse_categorical_accuracy: 0.4556 - val_loss: 1.8075 - val_sparse_categorical_accuracy: 0.4681
Epoch 17/160
118/118 [==============================] - 0s 3ms/step - loss: 1.7924 - sparse_categorical_accuracy: 0.4744 - val_loss: 1.7672 - val_sparse_categorical_accuracy: 0.4880
Epoch 18/160
118/118 [==============================] - 0s 3ms/step - loss: 1.7522 - sparse_categorical_accuracy: 0.4949 - val_loss: 1.7259 - val_sparse_categorical_accuracy: 0.5057
Epoch 19/160
118/118 [==============================] - 0s 3ms/step - loss: 1.7110 - sparse_categorical_accuracy: 0.5152 - val_loss: 1.6837 - val_sparse_categorical_accuracy: 0.5252
Epoch 20/160
118/118 [==============================] - 0s 3ms/step - loss: 1.6691 - sparse_categorical_accuracy: 0.5355 - val_loss: 1.6409 - val_sparse_categorical_accuracy: 0.5473
Epoch 21/160
118/118 [==============================] - 0s 3ms/step - loss: 1.6267 - sparse_categorical_accuracy: 0.5579 - val_loss: 1.5978 - val_sparse_categorical_accuracy: 0.5675
Epoch 22/160
118/118 [==============================] - 0s 2ms/step - loss: 1.5840 - sparse_categorical_accuracy: 0.5785 - val_loss: 1.5546 - val_sparse_categorical_accuracy: 0.5889
Epoch 23/160
118/118 [==============================] - 0s 3ms/step - loss: 1.5413 - sparse_categorical_accuracy: 0.5994 - val_loss: 1.5115 - val_sparse_categorical_accuracy: 0.6073
Epoch 24/160
118/118 [==============================] - 0s 3ms/step - loss: 1.4988 - sparse_categorical_accuracy: 0.6158 - val_loss: 1.4687 - val_sparse_categorical_accuracy: 0.6238
Epoch 25/160
118/118 [==============================] - 0s 3ms/step - loss: 1.4566 - sparse_categorical_accuracy: 0.6315 - val_loss: 1.4262 - val_sparse_categorical_accuracy: 0.6374
Epoch 26/160
118/118 [==============================] - 0s 3ms/step - loss: 1.4146 - sparse_categorical_accuracy: 0.6450 - val_loss: 1.3842 - val_sparse_categorical_accuracy: 0.6512
Epoch 27/160
118/118 [==============================] - 0s 2ms/step - loss: 1.3732 - sparse_categorical_accuracy: 0.6564 - val_loss: 1.3426 - val_sparse_categorical_accuracy: 0.6629
Epoch 28/160
118/118 [==============================] - 0s 2ms/step - loss: 1.3323 - sparse_categorical_accuracy: 0.6666 - val_loss: 1.3017 - val_sparse_categorical_accuracy: 0.6720
Epoch 29/160
118/118 [==============================] - 0s 3ms/step - loss: 1.2922 - sparse_categorical_accuracy: 0.6763 - val_loss: 1.2615 - val_sparse_categorical_accuracy: 0.6826
Epoch 30/160
118/118 [==============================] - 0s 2ms/step - loss: 1.2528 - sparse_categorical_accuracy: 0.6847 - val_loss: 1.2224 - val_sparse_categorical_accuracy: 0.6909
Epoch 31/160
118/118 [==============================] - 0s 2ms/step - loss: 1.2146 - sparse_categorical_accuracy: 0.6924 - val_loss: 1.1842 - val_sparse_categorical_accuracy: 0.6995
Epoch 32/160
118/118 [==============================] - 0s 2ms/step - loss: 1.1775 - sparse_categorical_accuracy: 0.6997 - val_loss: 1.1472 - val_sparse_categorical_accuracy: 0.7074
Epoch 33/160
118/118 [==============================] - 0s 3ms/step - loss: 1.1416 - sparse_categorical_accuracy: 0.7074 - val_loss: 1.1116 - val_sparse_categorical_accuracy: 0.7140
Epoch 34/160
118/118 [==============================] - 0s 3ms/step - loss: 1.1070 - sparse_categorical_accuracy: 0.7154 - val_loss: 1.0773 - val_sparse_categorical_accuracy: 0.7221
Epoch 35/160
118/118 [==============================] - 0s 3ms/step - loss: 1.0739 - sparse_categorical_accuracy: 0.7236 - val_loss: 1.0445 - val_sparse_categorical_accuracy: 0.7305
Epoch 36/160
118/118 [==============================] - 0s 3ms/step - loss: 1.0422 - sparse_categorical_accuracy: 0.7308 - val_loss: 1.0131 - val_sparse_categorical_accuracy: 0.7377
Epoch 37/160
118/118 [==============================] - 0s 3ms/step - loss: 1.0120 - sparse_categorical_accuracy: 0.7387 - val_loss: 0.9833 - val_sparse_categorical_accuracy: 0.7476
Epoch 38/160
118/118 [==============================] - 0s 3ms/step - loss: 0.9833 - sparse_categorical_accuracy: 0.7468 - val_loss: 0.9549 - val_sparse_categorical_accuracy: 0.7545
Epoch 39/160
118/118 [==============================] - 0s 3ms/step - loss: 0.9560 - sparse_categorical_accuracy: 0.7544 - val_loss: 0.9280 - val_sparse_categorical_accuracy: 0.7615
Epoch 40/160
118/118 [==============================] - 0s 3ms/step - loss: 0.9302 - sparse_categorical_accuracy: 0.7609 - val_loss: 0.9027 - val_sparse_categorical_accuracy: 0.7673
Epoch 41/160
118/118 [==============================] - 0s 3ms/step - loss: 0.9059 - sparse_categorical_accuracy: 0.7675 - val_loss: 0.8788 - val_sparse_categorical_accuracy: 0.7747
Epoch 42/160
118/118 [==============================] - 0s 3ms/step - loss: 0.8830 - sparse_categorical_accuracy: 0.7745 - val_loss: 0.8564 - val_sparse_categorical_accuracy: 0.7804
Epoch 43/160
118/118 [==============================] - 0s 3ms/step - loss: 0.8614 - sparse_categorical_accuracy: 0.7794 - val_loss: 0.8354 - val_sparse_categorical_accuracy: 0.7858
Epoch 44/160
118/118 [==============================] - 0s 3ms/step - loss: 0.8412 - sparse_categorical_accuracy: 0.7837 - val_loss: 0.8157 - val_sparse_categorical_accuracy: 0.7903
Epoch 45/160
118/118 [==============================] - 0s 3ms/step - loss: 0.8222 - sparse_categorical_accuracy: 0.7888 - val_loss: 0.7972 - val_sparse_categorical_accuracy: 0.7953
Epoch 46/160
118/118 [==============================] - 0s 3ms/step - loss: 0.8043 - sparse_categorical_accuracy: 0.7930 - val_loss: 0.7798 - val_sparse_categorical_accuracy: 0.7986
Epoch 47/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7874 - sparse_categorical_accuracy: 0.7962 - val_loss: 0.7635 - val_sparse_categorical_accuracy: 0.8011
Epoch 48/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7716 - sparse_categorical_accuracy: 0.7994 - val_loss: 0.7482 - val_sparse_categorical_accuracy: 0.8043
Epoch 49/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7566 - sparse_categorical_accuracy: 0.8025 - val_loss: 0.7337 - val_sparse_categorical_accuracy: 0.8078
Epoch 50/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7425 - sparse_categorical_accuracy: 0.8054 - val_loss: 0.7200 - val_sparse_categorical_accuracy: 0.8110
Epoch 51/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7291 - sparse_categorical_accuracy: 0.8077 - val_loss: 0.7071 - val_sparse_categorical_accuracy: 0.8130
Epoch 52/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7164 - sparse_categorical_accuracy: 0.8104 - val_loss: 0.6948 - val_sparse_categorical_accuracy: 0.8161
Epoch 53/160
118/118 [==============================] - 0s 3ms/step - loss: 0.7044 - sparse_categorical_accuracy: 0.8127 - val_loss: 0.6832 - val_sparse_categorical_accuracy: 0.8194
Epoch 54/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6929 - sparse_categorical_accuracy: 0.8158 - val_loss: 0.6723 - val_sparse_categorical_accuracy: 0.8212
Epoch 55/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6821 - sparse_categorical_accuracy: 0.8175 - val_loss: 0.6617 - val_sparse_categorical_accuracy: 0.8242
Epoch 56/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6717 - sparse_categorical_accuracy: 0.8199 - val_loss: 0.6518 - val_sparse_categorical_accuracy: 0.8263
Epoch 57/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6619 - sparse_categorical_accuracy: 0.8217 - val_loss: 0.6423 - val_sparse_categorical_accuracy: 0.8287
Epoch 58/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6525 - sparse_categorical_accuracy: 0.8236 - val_loss: 0.6332 - val_sparse_categorical_accuracy: 0.8304
Epoch 59/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6435 - sparse_categorical_accuracy: 0.8259 - val_loss: 0.6246 - val_sparse_categorical_accuracy: 0.8321
Epoch 60/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6349 - sparse_categorical_accuracy: 0.8275 - val_loss: 0.6164 - val_sparse_categorical_accuracy: 0.8345
Epoch 61/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6267 - sparse_categorical_accuracy: 0.8295 - val_loss: 0.6085 - val_sparse_categorical_accuracy: 0.8362
Epoch 62/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6188 - sparse_categorical_accuracy: 0.8313 - val_loss: 0.6009 - val_sparse_categorical_accuracy: 0.8378
Epoch 63/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6113 - sparse_categorical_accuracy: 0.8334 - val_loss: 0.5936 - val_sparse_categorical_accuracy: 0.8396
Epoch 64/160
118/118 [==============================] - 0s 3ms/step - loss: 0.6040 - sparse_categorical_accuracy: 0.8350 - val_loss: 0.5867 - val_sparse_categorical_accuracy: 0.8415
Epoch 65/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5970 - sparse_categorical_accuracy: 0.8367 - val_loss: 0.5800 - val_sparse_categorical_accuracy: 0.8427
Epoch 66/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5904 - sparse_categorical_accuracy: 0.8382 - val_loss: 0.5736 - val_sparse_categorical_accuracy: 0.8446
Epoch 67/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5839 - sparse_categorical_accuracy: 0.8398 - val_loss: 0.5674 - val_sparse_categorical_accuracy: 0.8463
Epoch 68/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5777 - sparse_categorical_accuracy: 0.8413 - val_loss: 0.5615 - val_sparse_categorical_accuracy: 0.8476
Epoch 69/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5718 - sparse_categorical_accuracy: 0.8424 - val_loss: 0.5558 - val_sparse_categorical_accuracy: 0.8490
Epoch 70/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5660 - sparse_categorical_accuracy: 0.8438 - val_loss: 0.5503 - val_sparse_categorical_accuracy: 0.8500
Epoch 71/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5605 - sparse_categorical_accuracy: 0.8450 - val_loss: 0.5449 - val_sparse_categorical_accuracy: 0.8512
Epoch 72/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5551 - sparse_categorical_accuracy: 0.8466 - val_loss: 0.5398 - val_sparse_categorical_accuracy: 0.8523
Epoch 73/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5499 - sparse_categorical_accuracy: 0.8480 - val_loss: 0.5348 - val_sparse_categorical_accuracy: 0.8545
Epoch 74/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5449 - sparse_categorical_accuracy: 0.8493 - val_loss: 0.5300 - val_sparse_categorical_accuracy: 0.8550
Epoch 75/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5400 - sparse_categorical_accuracy: 0.8504 - val_loss: 0.5254 - val_sparse_categorical_accuracy: 0.8560
Epoch 76/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5353 - sparse_categorical_accuracy: 0.8515 - val_loss: 0.5210 - val_sparse_categorical_accuracy: 0.8578
Epoch 77/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5308 - sparse_categorical_accuracy: 0.8523 - val_loss: 0.5166 - val_sparse_categorical_accuracy: 0.8584
Epoch 78/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5264 - sparse_categorical_accuracy: 0.8536 - val_loss: 0.5124 - val_sparse_categorical_accuracy: 0.8598
Epoch 79/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5221 - sparse_categorical_accuracy: 0.8548 - val_loss: 0.5083 - val_sparse_categorical_accuracy: 0.8614
Epoch 80/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5179 - sparse_categorical_accuracy: 0.8561 - val_loss: 0.5044 - val_sparse_categorical_accuracy: 0.8612
Epoch 81/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5139 - sparse_categorical_accuracy: 0.8572 - val_loss: 0.5005 - val_sparse_categorical_accuracy: 0.8625
Epoch 82/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5100 - sparse_categorical_accuracy: 0.8585 - val_loss: 0.4968 - val_sparse_categorical_accuracy: 0.8635
Epoch 83/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5062 - sparse_categorical_accuracy: 0.8595 - val_loss: 0.4932 - val_sparse_categorical_accuracy: 0.8640
Epoch 84/160
118/118 [==============================] - 0s 3ms/step - loss: 0.5025 - sparse_categorical_accuracy: 0.8602 - val_loss: 0.4897 - val_sparse_categorical_accuracy: 0.8650
Epoch 85/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4989 - sparse_categorical_accuracy: 0.8614 - val_loss: 0.4862 - val_sparse_categorical_accuracy: 0.8655
Epoch 86/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4954 - sparse_categorical_accuracy: 0.8620 - val_loss: 0.4829 - val_sparse_categorical_accuracy: 0.8661
Epoch 87/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4920 - sparse_categorical_accuracy: 0.8632 - val_loss: 0.4797 - val_sparse_categorical_accuracy: 0.8670
Epoch 88/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4887 - sparse_categorical_accuracy: 0.8638 - val_loss: 0.4766 - val_sparse_categorical_accuracy: 0.8680
Epoch 89/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4855 - sparse_categorical_accuracy: 0.8648 - val_loss: 0.4735 - val_sparse_categorical_accuracy: 0.8683
Epoch 90/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4823 - sparse_categorical_accuracy: 0.8656 - val_loss: 0.4705 - val_sparse_categorical_accuracy: 0.8692
Epoch 91/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4793 - sparse_categorical_accuracy: 0.8665 - val_loss: 0.4676 - val_sparse_categorical_accuracy: 0.8697
Epoch 92/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4763 - sparse_categorical_accuracy: 0.8673 - val_loss: 0.4648 - val_sparse_categorical_accuracy: 0.8706
Epoch 93/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4734 - sparse_categorical_accuracy: 0.8680 - val_loss: 0.4621 - val_sparse_categorical_accuracy: 0.8709
Epoch 94/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4705 - sparse_categorical_accuracy: 0.8689 - val_loss: 0.4594 - val_sparse_categorical_accuracy: 0.8714
Epoch 95/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4678 - sparse_categorical_accuracy: 0.8695 - val_loss: 0.4568 - val_sparse_categorical_accuracy: 0.8723
Epoch 96/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4651 - sparse_categorical_accuracy: 0.8702 - val_loss: 0.4542 - val_sparse_categorical_accuracy: 0.8733
Epoch 97/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4624 - sparse_categorical_accuracy: 0.8710 - val_loss: 0.4517 - val_sparse_categorical_accuracy: 0.8741
Epoch 98/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4598 - sparse_categorical_accuracy: 0.8713 - val_loss: 0.4492 - val_sparse_categorical_accuracy: 0.8743
Epoch 99/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4573 - sparse_categorical_accuracy: 0.8720 - val_loss: 0.4469 - val_sparse_categorical_accuracy: 0.8751
Epoch 100/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4548 - sparse_categorical_accuracy: 0.8730 - val_loss: 0.4445 - val_sparse_categorical_accuracy: 0.8756
Epoch 101/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4524 - sparse_categorical_accuracy: 0.8735 - val_loss: 0.4422 - val_sparse_categorical_accuracy: 0.8763
Epoch 102/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4501 - sparse_categorical_accuracy: 0.8741 - val_loss: 0.4400 - val_sparse_categorical_accuracy: 0.8768
Epoch 103/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4478 - sparse_categorical_accuracy: 0.8748 - val_loss: 0.4378 - val_sparse_categorical_accuracy: 0.8781
Epoch 104/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4455 - sparse_categorical_accuracy: 0.8755 - val_loss: 0.4358 - val_sparse_categorical_accuracy: 0.8787
Epoch 105/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4433 - sparse_categorical_accuracy: 0.8758 - val_loss: 0.4336 - val_sparse_categorical_accuracy: 0.8791
Epoch 106/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4411 - sparse_categorical_accuracy: 0.8768 - val_loss: 0.4316 - val_sparse_categorical_accuracy: 0.8793
Epoch 107/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4390 - sparse_categorical_accuracy: 0.8770 - val_loss: 0.4297 - val_sparse_categorical_accuracy: 0.8796
Epoch 108/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4369 - sparse_categorical_accuracy: 0.8777 - val_loss: 0.4277 - val_sparse_categorical_accuracy: 0.8804
Epoch 109/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4349 - sparse_categorical_accuracy: 0.8780 - val_loss: 0.4258 - val_sparse_categorical_accuracy: 0.8806
Epoch 110/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4329 - sparse_categorical_accuracy: 0.8787 - val_loss: 0.4239 - val_sparse_categorical_accuracy: 0.8811
Epoch 111/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4310 - sparse_categorical_accuracy: 0.8795 - val_loss: 0.4221 - val_sparse_categorical_accuracy: 0.8818
Epoch 112/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4290 - sparse_categorical_accuracy: 0.8799 - val_loss: 0.4203 - val_sparse_categorical_accuracy: 0.8826
Epoch 113/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4272 - sparse_categorical_accuracy: 0.8805 - val_loss: 0.4185 - val_sparse_categorical_accuracy: 0.8829
Epoch 114/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4253 - sparse_categorical_accuracy: 0.8808 - val_loss: 0.4167 - val_sparse_categorical_accuracy: 0.8834
Epoch 115/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4235 - sparse_categorical_accuracy: 0.8812 - val_loss: 0.4151 - val_sparse_categorical_accuracy: 0.8835
Epoch 116/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4217 - sparse_categorical_accuracy: 0.8818 - val_loss: 0.4134 - val_sparse_categorical_accuracy: 0.8843
Epoch 117/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4200 - sparse_categorical_accuracy: 0.8822 - val_loss: 0.4117 - val_sparse_categorical_accuracy: 0.8847
Epoch 118/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4183 - sparse_categorical_accuracy: 0.8828 - val_loss: 0.4102 - val_sparse_categorical_accuracy: 0.8852
Epoch 119/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4166 - sparse_categorical_accuracy: 0.8831 - val_loss: 0.4087 - val_sparse_categorical_accuracy: 0.8854
Epoch 120/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4150 - sparse_categorical_accuracy: 0.8838 - val_loss: 0.4071 - val_sparse_categorical_accuracy: 0.8858
Epoch 121/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4133 - sparse_categorical_accuracy: 0.8839 - val_loss: 0.4056 - val_sparse_categorical_accuracy: 0.8861
Epoch 122/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4117 - sparse_categorical_accuracy: 0.8844 - val_loss: 0.4040 - val_sparse_categorical_accuracy: 0.8864
Epoch 123/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4102 - sparse_categorical_accuracy: 0.8849 - val_loss: 0.4026 - val_sparse_categorical_accuracy: 0.8867
Epoch 124/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4086 - sparse_categorical_accuracy: 0.8852 - val_loss: 0.4012 - val_sparse_categorical_accuracy: 0.8867
Epoch 125/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4071 - sparse_categorical_accuracy: 0.8853 - val_loss: 0.3997 - val_sparse_categorical_accuracy: 0.8869
Epoch 126/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4056 - sparse_categorical_accuracy: 0.8859 - val_loss: 0.3984 - val_sparse_categorical_accuracy: 0.8869
Epoch 127/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4042 - sparse_categorical_accuracy: 0.8861 - val_loss: 0.3970 - val_sparse_categorical_accuracy: 0.8870
Epoch 128/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4027 - sparse_categorical_accuracy: 0.8862 - val_loss: 0.3956 - val_sparse_categorical_accuracy: 0.8875
Epoch 129/160
118/118 [==============================] - 0s 3ms/step - loss: 0.4013 - sparse_categorical_accuracy: 0.8870 - val_loss: 0.3943 - val_sparse_categorical_accuracy: 0.8879
Epoch 130/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3999 - sparse_categorical_accuracy: 0.8873 - val_loss: 0.3930 - val_sparse_categorical_accuracy: 0.8883
Epoch 131/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3985 - sparse_categorical_accuracy: 0.8877 - val_loss: 0.3917 - val_sparse_categorical_accuracy: 0.8882
Epoch 132/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3972 - sparse_categorical_accuracy: 0.8880 - val_loss: 0.3904 - val_sparse_categorical_accuracy: 0.8891
Epoch 133/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3958 - sparse_categorical_accuracy: 0.8884 - val_loss: 0.3891 - val_sparse_categorical_accuracy: 0.8890
Epoch 134/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3945 - sparse_categorical_accuracy: 0.8886 - val_loss: 0.3880 - val_sparse_categorical_accuracy: 0.8895
Epoch 135/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3932 - sparse_categorical_accuracy: 0.8889 - val_loss: 0.3868 - val_sparse_categorical_accuracy: 0.8898
Epoch 136/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3919 - sparse_categorical_accuracy: 0.8892 - val_loss: 0.3856 - val_sparse_categorical_accuracy: 0.8899
Epoch 137/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3907 - sparse_categorical_accuracy: 0.8896 - val_loss: 0.3844 - val_sparse_categorical_accuracy: 0.8900
Epoch 138/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3894 - sparse_categorical_accuracy: 0.8900 - val_loss: 0.3832 - val_sparse_categorical_accuracy: 0.8902
Epoch 139/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3882 - sparse_categorical_accuracy: 0.8902 - val_loss: 0.3821 - val_sparse_categorical_accuracy: 0.8906
Epoch 140/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3870 - sparse_categorical_accuracy: 0.8906 - val_loss: 0.3810 - val_sparse_categorical_accuracy: 0.8905
Epoch 141/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3858 - sparse_categorical_accuracy: 0.8907 - val_loss: 0.3799 - val_sparse_categorical_accuracy: 0.8907
Epoch 142/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3847 - sparse_categorical_accuracy: 0.8912 - val_loss: 0.3788 - val_sparse_categorical_accuracy: 0.8909
Epoch 143/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3835 - sparse_categorical_accuracy: 0.8917 - val_loss: 0.3777 - val_sparse_categorical_accuracy: 0.8918
Epoch 144/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3824 - sparse_categorical_accuracy: 0.8921 - val_loss: 0.3766 - val_sparse_categorical_accuracy: 0.8924
Epoch 145/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3812 - sparse_categorical_accuracy: 0.8923 - val_loss: 0.3757 - val_sparse_categorical_accuracy: 0.8924
Epoch 146/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3801 - sparse_categorical_accuracy: 0.8929 - val_loss: 0.3745 - val_sparse_categorical_accuracy: 0.8931
Epoch 147/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3790 - sparse_categorical_accuracy: 0.8930 - val_loss: 0.3736 - val_sparse_categorical_accuracy: 0.8937
Epoch 148/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3780 - sparse_categorical_accuracy: 0.8934 - val_loss: 0.3726 - val_sparse_categorical_accuracy: 0.8939
Epoch 149/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3769 - sparse_categorical_accuracy: 0.8934 - val_loss: 0.3716 - val_sparse_categorical_accuracy: 0.8939
Epoch 150/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3759 - sparse_categorical_accuracy: 0.8938 - val_loss: 0.3706 - val_sparse_categorical_accuracy: 0.8941
Epoch 151/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3748 - sparse_categorical_accuracy: 0.8939 - val_loss: 0.3696 - val_sparse_categorical_accuracy: 0.8945
Epoch 152/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3738 - sparse_categorical_accuracy: 0.8942 - val_loss: 0.3686 - val_sparse_categorical_accuracy: 0.8943
Epoch 153/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3728 - sparse_categorical_accuracy: 0.8947 - val_loss: 0.3676 - val_sparse_categorical_accuracy: 0.8947
Epoch 154/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3718 - sparse_categorical_accuracy: 0.8948 - val_loss: 0.3668 - val_sparse_categorical_accuracy: 0.8952
Epoch 155/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3708 - sparse_categorical_accuracy: 0.8951 - val_loss: 0.3658 - val_sparse_categorical_accuracy: 0.8953
Epoch 156/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3698 - sparse_categorical_accuracy: 0.8953 - val_loss: 0.3650 - val_sparse_categorical_accuracy: 0.8960
Epoch 157/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3689 - sparse_categorical_accuracy: 0.8956 - val_loss: 0.3641 - val_sparse_categorical_accuracy: 0.8962
Epoch 158/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3679 - sparse_categorical_accuracy: 0.8959 - val_loss: 0.3632 - val_sparse_categorical_accuracy: 0.8967
Epoch 159/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3670 - sparse_categorical_accuracy: 0.8962 - val_loss: 0.3623 - val_sparse_categorical_accuracy: 0.8966
Epoch 160/160
118/118 [==============================] - 0s 3ms/step - loss: 0.3660 - sparse_categorical_accuracy: 0.8963 - val_loss: 0.3614 - val_sparse_categorical_accuracy: 0.8971





<tensorflow.python.keras.callbacks.History at 0x21359af4160>
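
As an optional final step (a minimal sketch, not part of the original tutorial output), you can evaluate the trained model on the validation set and run a single prediction:

# Evaluate on the validation split; with the settings above, accuracy is roughly 0.89
val_loss, val_acc = model.evaluate(val_data, val_label, verbose=0)
print('validation accuracy:', val_acc)

# Predict the digit for the first validation image
probs = model.predict(val_data[:1])              # softmax probabilities, shape (1, 10)
print('predicted digit:', np.argmax(probs, axis=-1)[0])
print('true label:', val_label[0])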