Tensorflow/Keras#
Note
This example requires the tensorflow package to be installed.
Theoretically, tpcp is framework agnostic and can be used with any framework. However, due to the way some frameworks handle their objects, some special handling is required internally. Hence, this example not only serves as an example of how to use tensorflow with tpcp, but also as a test case for these special cases.
When using tpcp with any machine learning framework, you either want to use a pretrained model with a normal pipeline, or train your own model as part of an Optimizable Pipeline. Here we show the second case, as it is more complex, and you are likely able to figure out the first case yourself.
This means, we are planning to perform the following steps:
Create a pipeline that creates and trains a model.
Allow the modification of model hyperparameters.
Run a simple cross-validation to demonstrate the functionality.
This example reimplements the basic MNIST example from the [tensorflow documentation](https://www.tensorflow.org/tutorials/keras/classification).
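For completeness, the first case mentioned above (a pretrained model inside a normal pipeline) can be sketched roughly as follows. This is a minimal illustration: a plain class and a stub object with a predict method stand in for a tpcp.Pipeline subclass and a loaded Keras model, so all names here are illustrative and not part of tpcp's API.

```python
import numpy as np


class StubModel:
    # Stand-in for a pretrained Keras model; a real version would be
    # loaded with something like tf.keras.models.load_model(...).
    def predict(self, data):
        # Fake logits: one row of 10 "scores" per input image.
        return np.tile(np.arange(10.0), (len(data), 1))


class PretrainedPipeline:  # stand-in for a tpcp.Pipeline subclass
    def __init__(self, model):
        # The pretrained model is just a normal parameter; no
        # self_optimize method is needed in this case.
        self.model = model

    def run(self, data):
        self.predictions_ = self.model.predict(data)
        return self


pipe = PretrainedPipeline(StubModel())
preds = pipe.run(np.zeros((60, 28, 28))).predictions_
print(preds.shape)  # (60, 10)
```

The only structural difference to the optimizable version below is that the model arrives fully trained through the constructor instead of being created in self_optimize.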
Some Notes#
In this example we show how to implement a Pipeline that uses tensorflow. You could implement an Algorithm in a similar way. This would actually be easier, as no specific handling of the input data would be required. For a pipeline, we need to create a custom Dataset class, as this is the expected input for a pipeline.
The Dataset#
We are using the normal fashion MNIST dataset for this example. It consists of 60,000 images of 28x28 pixels, each with a label. We will ignore the typical train-test split, as we want to do our own cross-validation.
In addition, we will simulate an additional “index level”. In this (and most typical deep learning datasets), each datapoint is one vector for which we can make one prediction. In tpcp, we usually deal with datasets, where you might have multiple pieces of information for each datapoint. For example, one datapoint could be a patient, for which we have an entire time series of measurements. We will simulate this here, by creating the index of our dataset as 1000 groups each containing 60 images.
Other than that, the dataset is pretty standard.
Besides the create_index method, we only need to implement the input_as_array and labels_as_array methods, which allow us to easily access the data once we have selected a single group.
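The slicing these methods use is plain index arithmetic: group g covers images g * 60 to (g + 1) * 60 - 1. A quick self-contained check of that bookkeeping (pure numpy, no MNIST download needed; fake_images is just a stand-in for the real data):

```python
import numpy as np

n_groups, group_size = 1000, 60
fake_images = np.arange(n_groups * group_size)  # stand-in for the 60,000 images


def group_slice(group_id, size=group_size):
    # Same arithmetic as in input_as_array/labels_as_array below.
    return slice(group_id * size, (group_id + 1) * size)


# Group 0 covers images 0..59, group 999 covers 59940..59999.
print(fake_images[group_slice(0)][0], fake_images[group_slice(0)][-1])
print(fake_images[group_slice(999)][0], fake_images[group_slice(999)][-1])

# Every image lands in exactly one group.
covered = np.concatenate([fake_images[group_slice(g)] for g in range(n_groups)])
print(np.array_equal(covered, fake_images))  # True
```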
from functools import lru_cache
import numpy as np
import pandas as pd
import tensorflow as tf
from tpcp import Dataset
tf.keras.utils.set_random_seed(812)
tf.config.experimental.enable_op_determinism()
@lru_cache(maxsize=1)
def get_fashion_mnist_data():
    # Note: We throw train and test sets together, as we don't care about the official split here.
    # We will create our own split later.
    (train_images, train_labels), (test_images, test_labels) = (
        tf.keras.datasets.fashion_mnist.load_data()
    )
    return np.array(list(train_images) + list(test_images)), list(
        train_labels
    ) + list(test_labels)
class FashionMNIST(Dataset):
    def input_as_array(self) -> np.ndarray:
        self.assert_is_single(None, "input_as_array")
        group_id = int(self.group_label.group_id)
        images, _ = get_fashion_mnist_data()
        return (
            images[group_id * 60 : (group_id + 1) * 60].reshape((60, 28, 28))
            / 255
        )

    def labels_as_array(self) -> np.ndarray:
        self.assert_is_single(None, "labels_as_array")
        group_id = int(self.group_label.group_id)
        _, labels = get_fashion_mnist_data()
        return np.array(labels[group_id * 60 : (group_id + 1) * 60])

    def create_index(self) -> pd.DataFrame:
        # There are 60,000 images in total.
        # We simulate 1000 groups of 60 images each.
        return pd.DataFrame({"group_id": list(range(1000))})
We can see our Dataset works as expected:
dataset = FashionMNIST()
dataset[0].input_as_array().shape
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
29515/29515 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26421880/26421880 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
5148/5148 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4422102/4422102 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
(60, 28, 28)
dataset[0].labels_as_array().shape
(60,)
The Pipeline#
We will create a pipeline that uses a simple neural network to classify the images.
In tpcp, all “things” that should be optimized need to be parameters.
This means our model itself needs to be a parameter of the pipeline.
However, as its creation depends on other hyperparameters, we don't have the model yet, so we add it as an optional parameter initialized with None.
Further, we prefix the parameter name with an underscore to signify that this is not a parameter that should be modified manually by the user.
This is just a convention, and it is up to you to decide how you want to name your parameters.
We further introduce a hyperparameter n_dense_layer_nodes to show how we can influence the model creation.
The optimize method#
To make our pipeline optimizable, it needs to inherit from OptimizablePipeline.
Further, we need to mark at least one of the parameters as OptiPara using the type annotation.
We do this for our _model parameter.
Finally, we need to implement the self_optimize method.
This method receives the entire training dataset as input and should update the _model parameter with the trained model.
Hence, we first extract the relevant data (remember, each datapoint is 60 images) by concatenating all images over all groups in the dataset.
Then we create the Keras model based on the hyperparameters.
Finally, we train the model and update the _model parameter.
Here we chose to wrap the method with make_optimize_safe.
This decorator performs some runtime checks to ensure that the method is implemented correctly.
The run method#
The run method expects that the _model parameter is already set (i.e., the pipeline was already optimized).
It gets a single datapoint as input (remember, a datapoint is a single group of 60 images).
We then extract the data from the datapoint and let the model make a prediction.
We store the prediction in our output attribute predictions_.
The trailing underscore is a convention to signify that this is a "result" attribute.
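The model outputs one row of 10 raw logits per image, so turning predictions into class labels (as the predicted_labels_ property below does) is just an argmax over the class axis. A tiny self-contained illustration with made-up logits:

```python
import numpy as np

# Two fake "images": raw logits for 10 classes each (values are made up).
predictions = np.array(
    [
        [0.1, 2.5, -1.0, 0.0, 0.3, 0.2, -0.5, 1.1, 0.0, 0.4],  # class 1 wins
        [-0.2, 0.0, 0.1, 0.9, 0.2, 0.0, 0.0, 0.1, 3.0, 0.1],  # class 8 wins
    ]
)
# axis=1 takes the argmax per row, i.e. per image.
predicted_labels = np.argmax(predictions, axis=1)
print(predicted_labels)  # [1 8]
```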
import warnings
from typing import Optional
from tpcp import (
    OptimizablePipeline,
    OptiPara,
    make_action_safe,
    make_optimize_safe,
)
from typing_extensions import Self
class KerasPipeline(OptimizablePipeline):
    n_dense_layer_nodes: int
    n_train_epochs: int
    _model: OptiPara[Optional[tf.keras.Sequential]]

    predictions_: np.ndarray

    def __init__(
        self,
        n_dense_layer_nodes=128,
        n_train_epochs=5,
        _model: Optional[tf.keras.Sequential] = None,
    ):
        self.n_dense_layer_nodes = n_dense_layer_nodes
        self.n_train_epochs = n_train_epochs
        self._model = _model

    @property
    def predicted_labels_(self):
        return np.argmax(self.predictions_, axis=1)

    @make_optimize_safe
    def self_optimize(self, dataset, **_) -> Self:
        data = tf.convert_to_tensor(
            np.vstack([d.input_as_array() for d in dataset])
        )
        labels = tf.convert_to_tensor(
            np.hstack([d.labels_as_array() for d in dataset])
        )
        print(data.shape)
        if self._model is not None:
            warnings.warn("Overwriting existing model!")

        self._model = tf.keras.Sequential(
            [
                tf.keras.layers.Input((28, 28)),
                tf.keras.layers.Flatten(),
                tf.keras.layers.Dense(
                    self.n_dense_layer_nodes, activation="relu"
                ),
                tf.keras.layers.Dense(10),
            ]
        )
        self._model.compile(
            optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(
                from_logits=True
            ),
            metrics=["accuracy"],
        )
        self._model.fit(data, labels, epochs=self.n_train_epochs)
        return self

    @make_action_safe
    def run(self, datapoint) -> Self:
        if self._model is None:
            raise RuntimeError("Model not trained yet!")
        data = tf.convert_to_tensor(datapoint.input_as_array())
        self.predictions_ = self._model.predict(data)
        return self
Testing the pipeline#
We can now test our pipeline.
We will run the optimization using a couple of datapoints (to keep everything fast) and then use run to get the predictions for a single unseen datapoint.
pipeline = KerasPipeline().self_optimize(FashionMNIST()[:10])
p1 = pipeline.run(FashionMNIST()[11])
print(p1.predicted_labels_)
print(FashionMNIST()[11].labels_as_array())
(600, 28, 28)
Epoch 1/5
19/19 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.3623 - loss: 1.8187
Epoch 2/5
19/19 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.6958 - loss: 0.9131
Epoch 3/5
19/19 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7586 - loss: 0.7140
Epoch 4/5
19/19 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7986 - loss: 0.5833
Epoch 5/5
19/19 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8290 - loss: 0.5055
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step
[8 8 6 9 4 0 7 3 7 9 4 8 4 3 7 8 1 4 0 7 9 8 5 5 2 1 3 4 1 9 7 5 9 9 7 8 2
7 4 7 2 4 7 1 1 7 5 4 8 3 5 9 0 7 4 0 0 9 1 9]
[8 8 0 9 2 0 7 3 7 9 3 8 4 3 7 8 1 4 0 7 9 8 5 5 2 1 3 4 6 7 7 5 9 9 7 8 2
7 4 7 0 3 5 1 1 5 5 2 8 3 5 9 0 7 3 0 0 7 1 9]
We can see that even with just 5 epochs, the model already performs quite well. To quantify this, we can calculate the accuracy for this datapoint:
from sklearn.metrics import accuracy_score
accuracy_score(p1.predicted_labels_, FashionMNIST()[11].labels_as_array())
0.8
Cross Validation#
If we want to run a cross validation, we need to formalize the scoring into a function. We will calculate two types of accuracy: first, the accuracy per group, and second, the accuracy over all images across all groups. For more information about how this works, check the Custom Scorer example.
from collections.abc import Sequence
from tpcp.validate import Aggregator
class SingleValueAccuracy(Aggregator[tuple[np.ndarray, np.ndarray]]):
    def aggregate(
        self, /, values: Sequence[tuple[np.ndarray, np.ndarray]], **_
    ) -> dict[str, float]:
        return {
            "accuracy": accuracy_score(
                np.hstack([v[0] for v in values]),
                np.hstack([v[1] for v in values]),
            )
        }
single_value_accuracy = SingleValueAccuracy()
def scoring(pipeline, datapoint):
    result: np.ndarray = pipeline.safe_run(datapoint).predicted_labels_
    reference = datapoint.labels_as_array()

    return {
        "accuracy": accuracy_score(result, reference),
        "per_sample": single_value_accuracy((result, reference)),
    }
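The reason the aggregator pools the raw values instead of averaging per-group accuracies becomes visible with groups of unequal size (in our dataset all groups have 60 images, so both numbers coincide there, but in general they differ). A toy sketch, using a plain mean of matches in place of sklearn's accuracy_score:

```python
import numpy as np

# Two toy groups of unequal size: (predicted, reference) pairs.
groups = [
    (np.array([1, 1, 2, 3]), np.array([1, 1, 2, 0])),  # 3/4 correct
    (np.array([5, 5]), np.array([5, 4])),  # 1/2 correct
]


def acc(pred, ref):
    return float(np.mean(pred == ref))


# Averaging the per-group accuracies weights both groups equally ...
mean_of_group_accs = np.mean([acc(p, r) for p, r in groups])  # (0.75 + 0.5) / 2

# ... while pooling all samples first weights each image equally.
pooled = acc(
    np.hstack([p for p, _ in groups]),
    np.hstack([r for _, r in groups]),
)  # 4 correct out of 6
print(mean_of_group_accs, pooled)
```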
Now we can run a cross validation. We will only run it on a subset of the data, to keep the runtime manageable.
Note
You might see warnings about retracing of the model. This is because we clone the pipeline before each call to the run method. Cloning is a good idea to ensure that all pipelines are independent of each other; however, it might result in some performance overhead.
from tpcp.optimize import Optimize
from tpcp.validate import cross_validate
pipeline = KerasPipeline(n_train_epochs=10)
cv_results = cross_validate(
    Optimize(pipeline), FashionMNIST()[:100], scoring=scoring, cv=3
)
CV Folds: 0%| | 0/3 [00:00<?, ?it/s](3960, 28, 28)
Epoch 1/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.5611 - loss: 1.2206
Epoch 2/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7990 - loss: 0.5862
Epoch 3/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8293 - loss: 0.5000
Epoch 4/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8421 - loss: 0.4488
Epoch 5/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8557 - loss: 0.4119
Epoch 6/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8638 - loss: 0.3807
Epoch 7/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8692 - loss: 0.3543
Epoch 8/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8780 - loss: 0.3317
Epoch 9/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8881 - loss: 0.3095
Epoch 10/10
124/124 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8950 - loss: 0.2917
WARNING:tensorflow:5 out of the last 5 calls to <function TensorFlowTrainer.make_predict_function.<locals>.one_step_on_data_distributed at 0x7f97acd03490> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.
Datapoints: 100%|██████████| 34/34 [00:04<00:00, 7.60it/s]
CV Folds: 33%|███▎ | 1/3 [00:08<00:17, 8.93s/it](4020, 28, 28)
Epoch 1/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.5820 - loss: 1.2309
Epoch 2/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7789 - loss: 0.6047
Epoch 3/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8177 - loss: 0.5147
Epoch 4/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8366 - loss: 0.4753
Epoch 5/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8446 - loss: 0.4315
Epoch 6/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8547 - loss: 0.4001
Epoch 7/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8561 - loss: 0.3735
Epoch 8/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8655 - loss: 0.3509
Epoch 9/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8734 - loss: 0.3304
Epoch 10/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8808 - loss: 0.3112
Datapoints: 100%|██████████| 33/33 [00:04<00:00, 7.03it/s]
CV Folds:  67%|██████▋   | 2/3 [00:18<00:09,  9.02s/it]
(4020, 28, 28)
Epoch 1/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.5833 - loss: 1.2218
Epoch 2/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7896 - loss: 0.6034
Epoch 3/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8226 - loss: 0.5135
Epoch 4/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8441 - loss: 0.4580
Epoch 5/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8581 - loss: 0.4210
Epoch 6/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8713 - loss: 0.3865
Epoch 7/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8813 - loss: 0.3593
Epoch 8/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8875 - loss: 0.3369
Epoch 9/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8941 - loss: 0.3157
Epoch 10/10
126/126 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8987 - loss: 0.2994
Datapoints: 100%|██████████| 33/33 [00:04<00:00, 7.50it/s]
CV Folds: 100%|██████████| 3/3 [00:26<00:00, 8.93s/it]
We can now look at the results per group:
cv_results["test__single__accuracy"]
[[0.8333333333333334, 0.8333333333333334, 0.9166666666666666, 0.7666666666666667, 0.8166666666666667, 0.8166666666666667, 0.8, 0.8333333333333334, 0.8333333333333334, 0.85, 0.7666666666666667, 0.8333333333333334, 0.8833333333333333, 0.8666666666666667, 0.8833333333333333, 0.8666666666666667, 0.8333333333333334, 0.8333333333333334, 0.8, 0.8333333333333334, 0.7833333333333333, 0.7333333333333333, 0.8, 0.8333333333333334, 0.8333333333333334, 0.8833333333333333, 0.8666666666666667, 0.85, 0.85, 0.75, 0.75, 0.85, 0.8166666666666667, 0.8833333333333333], [0.8, 0.9, 0.75, 0.8666666666666667, 0.8833333333333333, 0.8166666666666667, 0.8833333333333333, 0.8833333333333333, 0.8333333333333334, 0.8, 0.85, 0.8, 0.8166666666666667, 0.8, 0.7833333333333333, 0.8, 0.8166666666666667, 0.7666666666666667, 0.85, 0.75, 0.8, 0.8333333333333334, 0.7666666666666667, 0.9, 0.8, 0.7833333333333333, 0.8166666666666667, 0.8666666666666667, 0.8833333333333333, 0.9166666666666666, 0.7333333333333333, 0.7833333333333333, 0.9], [0.7833333333333333, 0.9333333333333333, 0.75, 0.8333333333333334, 0.8833333333333333, 0.8666666666666667, 0.8833333333333333, 0.9, 0.9, 0.9, 0.9166666666666666, 0.85, 0.8666666666666667, 0.8833333333333333, 0.85, 0.8, 0.9166666666666666, 0.75, 0.7166666666666667, 0.8666666666666667, 0.8166666666666667, 0.75, 0.7833333333333333, 0.7333333333333333, 0.8666666666666667, 0.8833333333333333, 0.9166666666666666, 0.8, 0.8666666666666667, 0.9, 0.85, 0.85, 0.9]]
We can also average first within each group and then over all groups of a fold:
cv_results["test__agg__accuracy"]
array([0.82892157, 0.82525253, 0.84747475])
And the overall accuracy, computed as the average over all samples of all groups within a fold:
cv_results["test__agg__per_sample__accuracy"]
array([0.82892157, 0.82525253, 0.84747475])
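To summarize these per-fold scores into a single number, you would typically report the mean and spread across folds. As a minimal sketch (hard-coding the per-fold aggregate accuracies printed above, rather than reading them from the `cv_results` dict):

```python
import numpy as np

# Per-fold aggregated accuracies, as printed above.
per_fold_accuracy = np.array([0.82892157, 0.82525253, 0.84747475])

# Mean and standard deviation across the three CV folds.
mean_accuracy = per_fold_accuracy.mean()
std_accuracy = per_fold_accuracy.std()
print(f"accuracy: {mean_accuracy:.4f} +/- {std_accuracy:.4f}")
```

In a real analysis you would pull these values directly out of the `cv_results` object instead of copying them by hand.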
Total running time of the script: (0 minutes 41.903 seconds)
Estimated memory usage: 713 MB