Quick Start#

Introduction#

In this lesson we build our first neural network with TensorFlow. We use the well-known CIFAR-10 example, taking the data directly from the tensorflow.keras.datasets library.

Installing TensorFlow#

Run the following command in a console.

# conda install -c conda-forge tensorflow

To install scikit-learn, use the Intel channel, which keeps the library up to date.

# conda install -c intel scikit-learn
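After installing, a quick sanity check (a minimal sketch, not part of the original notebook) is to import both packages and print their versions:

import tensorflow as tf
import sklearn

# if both imports succeed and print a version, the installation worked
print(tf.__version__)
print(sklearn.__version__)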

Loading the required libraries#

import tensorflow as tf
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Flatten, Dense, InputLayer

from tensorflow.keras.optimizers import SGD

from tensorflow.keras.datasets import cifar10

import matplotlib.pyplot as plt
import numpy as np

from sklearn.preprocessing import LabelBinarizer

Working with the data#

# Load the CIFAR-10 dataset
print("Loading the CIFAR-10 dataset...")
((x_train, y_train), (x_test, y_test)) = cifar10.load_data()
print('Done!')

assert x_train.shape == (50000, 32, 32, 3)
assert x_test.shape == (10000, 32, 32, 3)
assert y_train.shape == (50000, 1)
assert y_test.shape == (10000, 1)

# Scale the data to the range [0, 1]
trainX = x_train.astype("float32") / 255.0
testX = x_test.astype("float32") / 255.0

# convert the integer labels to one-hot vectors
# (required here by categorical_crossentropy; with sparse_categorical_crossentropy this step could be skipped)
lb = LabelBinarizer()
trainY = lb.fit_transform(y_train)
testY = lb.transform(y_test)

rows = x_train.shape[1]
cols = x_train.shape[2]
channels = x_train.shape[3]
Loading the CIFAR-10 dataset...
Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz
170498071/170498071 [==============================] - 4s 0us/step
Done!
print(len(trainY))
print(testY[0])
50000
[0 0 0 1 0 0 0 0 0 0]
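As the output shows, each label is now an indicator vector. A minimal standalone sketch of what LabelBinarizer does, using three classes:

from sklearn.preprocessing import LabelBinarizer

# integer labels become rows with a single 1 in the class position
demo = LabelBinarizer().fit_transform([0, 2, 1])
print(demo)
# [[1 0 0]
#  [0 0 1]
#  [0 1 0]]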

A first look at the data#

# Display the first 100 images in color

img_rows = rows
img_cols = cols

imgs_t = x_test[:100]
imgs_t = imgs_t.reshape((10, 10, img_rows, img_cols, channels))
imgs_t = np.vstack([np.hstack(i) for i in imgs_t])
plt.figure(figsize=(8,8))
plt.axis('off')
plt.title('First 100 images of the CIFAR-10 dataset')
plt.imshow(imgs_t, interpolation='none')
plt.savefig('./test_color.png')
plt.show()
../../_images/Tensorflow-01_14_0.png

Creating models#

# define model
class NeuralNetwork(Model):
    def __init__(self):
        super(NeuralNetwork, self).__init__()

        self.linear_relu_stack = Sequential([
            InputLayer(input_shape=(32,32,3)),
            Flatten(),
            Dense(512, activation='relu'),
            Dense(512, activation='relu'),
            Dense(10, activation='softmax'),            
        ])
                  
    # the call method is what defines the forward pass of the network;
    # in this example we accept a single input but, if you wish,
    # feel free to use more
    def call(self, x):
        probs = self.linear_relu_stack(x)
        return probs

model = NeuralNetwork()
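As a quick sanity check (a sketch, not part of the original notebook), you can run a dummy batch through the freshly created model and confirm that it returns one probability vector of length 10 per image:

dummy = np.zeros((4, 32, 32, 3), dtype="float32")
probs = model(dummy)    # the first call builds the model's weights
print(probs.shape)      # (4, 10)
model.summary()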

Optimizer, metrics, and model compilation#

init_lr = 1e-2
batch_size = 64
epochs = 10

optimizer = SGD(learning_rate=init_lr, momentum=0.9, decay=init_lr / epochs)

model.compile(loss="categorical_crossentropy", optimizer=optimizer,
              metrics=["accuracy"])

Training#

hist =  model.fit(trainX, trainY, batch_size=batch_size, 
                  validation_data=(testX, testY),
                  steps_per_epoch=trainX.shape[0] // batch_size,
                  epochs=epochs, verbose=1)
Epoch 1/10
781/781 [==============================] - 8s 10ms/step - loss: 1.8069 - accuracy: 0.3515 - val_loss: 1.6747 - val_accuracy: 0.4021
Epoch 2/10
781/781 [==============================] - 7s 9ms/step - loss: 1.5996 - accuracy: 0.4299 - val_loss: 1.6148 - val_accuracy: 0.4231
Epoch 3/10
781/781 [==============================] - 7s 9ms/step - loss: 1.5088 - accuracy: 0.4632 - val_loss: 1.4748 - val_accuracy: 0.4753
Epoch 4/10
702/781 [=========================>....] - ETA: 0s - loss: 1.4578 - accuracy: 0.4831
KeyboardInterrupt: the run was stopped during epoch 4 when this page was built (per-batch progress lines trimmed).
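The object returned by fit stores the per-epoch metrics in hist.history; a minimal sketch for visualizing the learning curves (assuming the run completes):

# plot training vs. validation loss per epoch
plt.plot(hist.history['loss'], label='train loss')
plt.plot(hist.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.legend()
plt.show()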

Saving and restoring the model#

We need to install

#!conda install -c anaconda pyyaml h5py

Saving the model#

When running the notebook locally, use the following command to save the weights

mis_pesos = '../Checkpoints/Model_0_tf'
model.save_weights(mis_pesos)
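Note that save_weights stores only the parameters. If you also want to persist the architecture and optimizer state, the whole model can be saved instead; a sketch, assuming a hypothetical ../Checkpoints/Model_0_full path:

# save the full model in TensorFlow's SavedModel format
# (for subclassed models this relies on tracing the call method)
model.save('../Checkpoints/Model_0_full')
restored = tf.keras.models.load_model('../Checkpoints/Model_0_full')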

Creating a new model with the stored weights#

model_1 = NeuralNetwork()
model_1.compile(loss="categorical_crossentropy", optimizer=optimizer,
              metrics=["accuracy"])
model_1.load_weights(mis_pesos)
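As a quick check (a sketch), the restored model should produce exactly the same outputs as the original one:

# identical weights imply identical predictions
sample = testX[:5]
assert np.allclose(model.predict(sample), model_1.predict(sample))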

Model evaluation#

# Evaluate the original model
loss, acc = model.evaluate(testX, testY, verbose=2)

print("Original model accuracy: {:5.2f}%".format(100 * acc))
313/313 - 3s - loss: 1.3708 - accuracy: 0.5143 - 3s/epoch - 9ms/step
Original model accuracy: 51.43%
# Evaluate the restored model
loss, acc = model_1.evaluate(testX, testY, verbose=2)

print("Restored model accuracy: {:5.2f}%".format(100 * acc))
313/313 - 3s - loss: 1.3708 - accuracy: 0.5143 - 3s/epoch - 8ms/step
Restored model accuracy: 51.43%
predictions = model.predict(testX)
313/313 [==============================] - 3s 10ms/step

Prediction#

print(predictions[0])
print(np.argmax(predictions[0]))
print(testY[0])
[0.04558412 0.06122134 0.03857893 0.49717635 0.10958219 0.09021669
 0.01830608 0.00147913 0.13623646 0.00161872]
3
[0 0 0 1 0 0 0 0 0 0]
assert np.argmax(predictions[0]) == np.argmax(testY[0])
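For readability, the predicted index can be mapped to the standard CIFAR-10 class names (listed in their conventional order):

class_names = ['airplane', 'automobile', 'bird', 'cat', 'deer',
               'dog', 'frog', 'horse', 'ship', 'truck']
print(class_names[np.argmax(predictions[0])])   # -> cat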