Overview
While working through section 10.2 (p. 372) of Hands-On Machine Learning, 2nd edition, I started wondering why feature scaling is necessary at all. To see the effect for myself, I train the same network once on scaled Fashion MNIST data and once on the raw, unscaled data, and compare the results.
Data preparation
In [1]:
%tensorflow_version 2.x
from tensorflow import keras
In [2]:
fashion_mnist = keras.datasets.fashion_mnist
(X_train_full, y_train_full), (X_test, y_test) = fashion_mnist.load_data()
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
32768/29515 [=================================] - 0s 0us/step
40960/29515 [=========================================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26427392/26421880 [==============================] - 0s 0us/step
26435584/26421880 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
16384/5148 [===============================================================================================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4423680/4422102 [==============================] - 0s 0us/step
4431872/4422102 [==============================] - 0s 0us/step
In [3]:
X_train_full.shape
Out[3]:
(60000, 28, 28)
In [4]:
X_train_full.dtype
Out[4]:
dtype('uint8')
In [5]:
# Scale the pixel values from [0, 255] down to [0, 1]
X_valid, X_train = X_train_full[:5000] / 255.0, X_train_full[5000:] / 255.0
y_valid, y_train = y_train_full[:5000], y_train_full[5000:]
# Same split, but left unscaled (raw uint8 values)
X_Not_Scale_valid, X_Not_Scale_train = X_train_full[:5000], X_train_full[5000:]
This prepares the normalized X_valid and X_train (pixel values divided by 255.0, so they lie in [0, 1]) together with the unnormalized X_Not_Scale_valid and X_Not_Scale_train (raw values in [0, 255]).
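A quick check (a minimal sketch added here for illustration, not part of the original notebook) confirms the value ranges of the two versions:

# The scaled arrays lie in [0.0, 1.0]; the unscaled arrays keep the raw uint8 range [0, 255]
print(X_train.min(), X_train.max())                      # 0.0 1.0
print(X_Not_Scale_train.min(), X_Not_Scale_train.max())  # 0 255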
In [6]:
# Model trained on the scaled data
model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[28,28]))
model.add(keras.layers.Dense(300, activation="relu"))
model.add(keras.layers.Dense(100, activation="relu"))
model.add(keras.layers.Dense(10, activation="softmax"))

# Identical model, trained on the unscaled data
model2 = keras.models.Sequential()
model2.add(keras.layers.Flatten(input_shape=[28,28]))
model2.add(keras.layers.Dense(300, activation="relu"))
model2.add(keras.layers.Dense(100, activation="relu"))
model2.add(keras.layers.Dense(10, activation="softmax"))
In [7]:
model.compile(loss="sparse_categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])
model2.compile(loss="sparse_categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])
In [8]:
history = model.fit(X_train, y_train, epochs=30, validation_data=(X_valid, y_valid))
Epoch 1/30
1719/1719 [==============================] - 8s 3ms/step - loss: 0.7251 - accuracy: 0.7637 - val_loss: 0.5026 - val_accuracy: 0.8320
Epoch 2/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.4891 - accuracy: 0.8317 - val_loss: 0.4612 - val_accuracy: 0.8374
Epoch 3/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.4429 - accuracy: 0.8456 - val_loss: 0.4152 - val_accuracy: 0.8602
Epoch 4/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.4172 - accuracy: 0.8550 - val_loss: 0.3986 - val_accuracy: 0.8590
Epoch 5/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3956 - accuracy: 0.8608 - val_loss: 0.3828 - val_accuracy: 0.8672
Epoch 6/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3798 - accuracy: 0.8668 - val_loss: 0.3828 - val_accuracy: 0.8660
Epoch 7/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3667 - accuracy: 0.8711 - val_loss: 0.3762 - val_accuracy: 0.8634
Epoch 8/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3548 - accuracy: 0.8742 - val_loss: 0.3629 - val_accuracy: 0.8726
Epoch 9/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3447 - accuracy: 0.8767 - val_loss: 0.3554 - val_accuracy: 0.8800
Epoch 10/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3359 - accuracy: 0.8805 - val_loss: 0.3400 - val_accuracy: 0.8768
Epoch 11/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3269 - accuracy: 0.8830 - val_loss: 0.3394 - val_accuracy: 0.8794
Epoch 12/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3183 - accuracy: 0.8858 - val_loss: 0.3288 - val_accuracy: 0.8848
Epoch 13/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3112 - accuracy: 0.8897 - val_loss: 0.3304 - val_accuracy: 0.8806
Epoch 14/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3048 - accuracy: 0.8919 - val_loss: 0.3237 - val_accuracy: 0.8852
Epoch 15/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2985 - accuracy: 0.8931 - val_loss: 0.3259 - val_accuracy: 0.8818
Epoch 16/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2918 - accuracy: 0.8952 - val_loss: 0.3321 - val_accuracy: 0.8790
Epoch 17/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2872 - accuracy: 0.8973 - val_loss: 0.3167 - val_accuracy: 0.8864
Epoch 18/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2808 - accuracy: 0.9000 - val_loss: 0.3161 - val_accuracy: 0.8852
Epoch 19/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2759 - accuracy: 0.9006 - val_loss: 0.3114 - val_accuracy: 0.8888
Epoch 20/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2713 - accuracy: 0.9019 - val_loss: 0.3131 - val_accuracy: 0.8874
Epoch 21/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2659 - accuracy: 0.9037 - val_loss: 0.3032 - val_accuracy: 0.8906
Epoch 22/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2612 - accuracy: 0.9067 - val_loss: 0.3097 - val_accuracy: 0.8912
Epoch 23/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2562 - accuracy: 0.9077 - val_loss: 0.3178 - val_accuracy: 0.8848
Epoch 24/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2520 - accuracy: 0.9091 - val_loss: 0.3140 - val_accuracy: 0.8888
Epoch 25/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2483 - accuracy: 0.9113 - val_loss: 0.2995 - val_accuracy: 0.8910
Epoch 26/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2433 - accuracy: 0.9118 - val_loss: 0.3022 - val_accuracy: 0.8900
Epoch 27/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2402 - accuracy: 0.9132 - val_loss: 0.2895 - val_accuracy: 0.8946
Epoch 28/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2361 - accuracy: 0.9143 - val_loss: 0.2997 - val_accuracy: 0.8908
Epoch 29/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2315 - accuracy: 0.9173 - val_loss: 0.3082 - val_accuracy: 0.8878
Epoch 30/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.2288 - accuracy: 0.9172 - val_loss: 0.3112 - val_accuracy: 0.8878
Now train the second model on the unscaled data for 100 epochs.
In [9]:
history2 = model2.fit(X_Not_Scale_train, y_train, epochs=100, validation_data=(X_Not_Scale_valid, y_valid))
Epoch 1/100
1719/1719 [==============================] - 6s 3ms/step - loss: 18340856725504.0000 - accuracy: 0.0996 - val_loss: 2.3143 - val_accuracy: 0.1002
Epoch 2/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0977 - val_loss: 2.3142 - val_accuracy: 0.1002
Epoch 3/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 4/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0983 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 5/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0999 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 6/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0999 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 7/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0981 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 8/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0966 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 9/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0994 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 10/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0977 - val_loss: 2.3145 - val_accuracy: 0.0976
Epoch 11/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 12/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1000 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 13/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0983 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 14/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0995 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 15/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 16/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0979 - val_loss: 2.3144 - val_accuracy: 0.1008
Epoch 17/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3142 - val_accuracy: 0.1008
Epoch 18/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1005 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 19/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0995 - val_loss: 2.3142 - val_accuracy: 0.0986
Epoch 20/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0989 - val_loss: 2.3142 - val_accuracy: 0.0986
Epoch 21/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 22/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0980 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 23/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3142 - val_accuracy: 0.0914
Epoch 24/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1002 - val_loss: 2.3143 - val_accuracy: 0.1012
Epoch 25/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0966 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 26/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1002 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 27/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0980 - val_loss: 2.3145 - val_accuracy: 0.0976
Epoch 28/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1000 - val_loss: 2.3143 - val_accuracy: 0.1002
Epoch 29/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 30/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0981 - val_loss: 2.3145 - val_accuracy: 0.0976
Epoch 31/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0996 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 32/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1003 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 33/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1006 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 34/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 35/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1003 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 36/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 37/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 38/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0988 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 39/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1005 - val_loss: 2.3143 - val_accuracy: 0.1002
Epoch 40/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0996 - val_loss: 2.3144 - val_accuracy: 0.0976
Epoch 41/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0991 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 42/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 43/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0993 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 44/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0981 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 45/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 46/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0989 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 47/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1010 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 48/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0981 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 49/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1003 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 50/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3146 - val_accuracy: 0.0914
Epoch 51/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0985 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 52/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0975 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 53/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1003 - val_loss: 2.3142 - val_accuracy: 0.1002
Epoch 54/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1001 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 55/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0994 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 56/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0980 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 57/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3144 - val_accuracy: 0.0976
Epoch 58/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0990 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 59/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0965 - val_loss: 2.3144 - val_accuracy: 0.0976
Epoch 60/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 61/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0999 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 62/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0974 - val_loss: 2.3144 - val_accuracy: 0.1008
Epoch 63/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 64/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0994 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 65/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1002 - val_loss: 2.3142 - val_accuracy: 0.0986
Epoch 66/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0993 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 67/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0981 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 68/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0977 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 69/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0980 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 70/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1003 - val_loss: 2.3143 - val_accuracy: 0.0978
Epoch 71/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0995 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 72/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0985 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 73/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0986 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 74/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0983 - val_loss: 2.3143 - val_accuracy: 0.1002
Epoch 75/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0998 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 76/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0992 - val_loss: 2.3143 - val_accuracy: 0.0978
Epoch 77/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0982 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 78/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0969 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 79/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0997 - val_loss: 2.3142 - val_accuracy: 0.0978
Epoch 80/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0973 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 81/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0975 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 82/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1006 - val_loss: 2.3142 - val_accuracy: 0.0986
Epoch 83/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0993 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 84/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0973 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 85/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0989 - val_loss: 2.3143 - val_accuracy: 0.1012
Epoch 86/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0973 - val_loss: 2.3144 - val_accuracy: 0.0978
Epoch 87/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0982 - val_loss: 2.3145 - val_accuracy: 0.0914
Epoch 88/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0967 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 89/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0974 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 90/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.1001 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 91/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0963 - val_loss: 2.3144 - val_accuracy: 0.0986
Epoch 92/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0988 - val_loss: 2.3143 - val_accuracy: 0.0986
Epoch 93/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0997 - val_loss: 2.3143 - val_accuracy: 0.0976
Epoch 94/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0967 - val_loss: 2.3142 - val_accuracy: 0.0986
Epoch 95/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0976 - val_loss: 2.3144 - val_accuracy: 0.0976
Epoch 96/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0966 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 97/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0984 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 98/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0978 - val_loss: 2.3144 - val_accuracy: 0.0914
Epoch 99/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0989 - val_loss: 2.3143 - val_accuracy: 0.0914
Epoch 100/100
1719/1719 [==============================] - 5s 3ms/step - loss: 2.3027 - accuracy: 0.0977 - val_loss: 2.3144 - val_accuracy: 0.0914
The loss flattens out at about 2.30 and the accuracy stays around 0.10, no better than random guessing among the 10 classes.
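As a quick sanity check (a small sketch added here, not part of the original notebook), the plateau value matches the cross-entropy of a classifier that spreads its probability uniformly over the 10 classes:

import math

# Cross-entropy of a uniform prediction over 10 classes: -ln(1/10)
print(-math.log(1 / 10))  # ≈ 2.3026, the value the training loss is stuck at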
Visualization
In [10]:
import pandas as pd
import matplotlib.pyplot as plt
In [11]:
# Learning curves of the model trained on the scaled data
pd.DataFrame(history.history).plot(figsize=(8, 5))
plt.grid(True)
plt.gca().set_ylim(0, 1)
plt.show()
In [12]:
# Learning curves of the model trained on the unscaled data.
# The y-axis is zoomed to [0, 0.2] so the flat accuracy curves are visible;
# the loss (about 2.3) lies outside this range.
pd.DataFrame(history2.history).plot(figsize=(8, 5))
plt.grid(True)
plt.gca().set_ylim(0, 0.2)
plt.show()
The accuracy merely oscillates between about 0.08 and 0.1, and the loss does not change at all during training.
Conclusion
- Normalizing the data contributes greatly to reducing the time needed to train a model.
- Without normalization, however, the problem is not just training time: no matter how many epochs are added, the model may never learn properly, as the accuracy results above show.
- Since the loss settles at a constant value, the weights appear to no longer be updated in any meaningful way.
- Always keep the importance of normalizing the dataset in mind. (One convenient option, sketched below, is to build the scaling into the model itself.)
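As a hedged alternative (a sketch added here, not from the original notebook), the scaling can also be built into the model itself with a Rescaling layer, so the raw uint8 images can be fed in directly. This assumes a TensorFlow version in which keras.layers.Rescaling is available (TF 2.6+; older releases expose it as keras.layers.experimental.preprocessing.Rescaling):

# Sketch: scale the inputs inside the model instead of preprocessing the arrays.
model3 = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),
    keras.layers.Rescaling(1.0 / 255),          # maps [0, 255] -> [0, 1]
    keras.layers.Dense(300, activation="relu"),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model3.compile(loss="sparse_categorical_crossentropy",
               optimizer="sgd", metrics=["accuracy"])
# The raw, unscaled data can now be passed in directly:
# model3.fit(X_Not_Scale_train, y_train, epochs=30,
#            validation_data=(X_Not_Scale_valid, y_valid))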
In [13]:
from IPython.core.display import display, HTML
display(HTML("<style>.container {width:90% !important;}</style>"))