Tensorflow/Keras#

Note

This example requires the tensorflow package to be installed.

Theoretically, tpcp is framework agnostic and can be used with any framework. However, due to the way some frameworks handle their objects, some special handling is required internally. Hence, this example not only shows how to use tensorflow with tpcp, but also serves as a test case for these special cases.

When using tpcp with any machine learning framework, you either want to use a pretrained model with a normal pipeline, or train your own model as part of an Optimizable Pipeline. Here we show the second case, as it is more complex; you should be able to figure out the first case yourself.

This means, we are planning to perform the following steps:

  1. Create a pipeline that creates and trains a model.

  2. Allow the modification of model hyperparameters.

  3. Run a simple cross-validation to demonstrate the functionality.

This example reimplements the basic MNIST example from the [tensorflow documentation](https://www.tensorflow.org/tutorials/keras/classification).

Some Notes#

In this example we show how to implement a Pipeline that uses tensorflow. You could implement an Algorithm in a similar way. This would actually be easier, as no specific handling of the input data would be required. For a pipeline, we need to create a custom Dataset class, as this is the expected input for a pipeline.

The Dataset#

We are using the normal Fashion-MNIST dataset for this example. It consists of 60,000 images of 28x28 pixels, each with a label. We will ignore the typical train-test split, as we want to do our own cross-validation.

In addition, we will simulate an additional “index level”. In this (and most typical deep-learning) datasets, each datapoint is one vector for which we can make one prediction. In tpcp, we usually deal with datasets where you might have multiple pieces of information per datapoint. For example, one datapoint could be a patient for whom we have an entire time series of measurements. We simulate this here by creating the index of our dataset as 1000 groups, each containing 60 images.
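The grouping arithmetic is straightforward; a quick sketch (the group id used here is an arbitrary example):

```python
# 60,000 images are split into 1,000 simulated groups of 60 images each.
n_images, n_groups = 60_000, 1_000
group_size = n_images // n_groups  # 60

# A hypothetical group id maps to a contiguous slice of the flat image array:
group_id = 42
start, stop = group_id * group_size, (group_id + 1) * group_size
# -> images[2520:2580]
```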

Other than that, the dataset is pretty standard. Besides the create_index method, we only need to implement the input_as_array and labels_as_array methods, which allow us to easily access the data once we have selected a single group.

from functools import lru_cache

import numpy as np
import pandas as pd
import tensorflow as tf

from tpcp import Dataset

tf.keras.utils.set_random_seed(812)
tf.config.experimental.enable_op_determinism()


@lru_cache(maxsize=1)
def get_fashion_mnist_data():
    # Note: We throw train and test sets together, as we don't care about the official split here.
    #       We will create our own split later.
    (train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.fashion_mnist.load_data()
    return np.array(list(train_images) + list(test_images)), list(train_labels) + list(test_labels)


class FashionMNIST(Dataset):
    def input_as_array(self) -> np.ndarray:
        self.assert_is_single(None, "input_as_array")
        group_id = int(self.group_label.group_id)
        images, _ = get_fashion_mnist_data()
        return images[group_id * 60 : (group_id + 1) * 60].reshape((60, 28, 28)) / 255

    def labels_as_array(self) -> np.ndarray:
        self.assert_is_single(None, "labels_as_array")
        group_id = int(self.group_label.group_id)
        _, labels = get_fashion_mnist_data()
        return np.array(labels[group_id * 60 : (group_id + 1) * 60])

    def create_index(self) -> pd.DataFrame:
        # There are 60,000 images in total.
        # We simulate 1000 groups of 60 images each.
        return pd.DataFrame({"group_id": list(range(1000))})

We can see our Dataset works as expected:

dataset = FashionMNIST()
dataset[0].input_as_array().shape
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
29515/29515 [==============================] - 0s 1us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
26421880/26421880 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
5148/5148 [==============================] - 0s 0us/step
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
4422102/4422102 [==============================] - 0s 0us/step

(60, 28, 28)
dataset[0].labels_as_array().shape
(60,)

The Pipeline#

We will create a pipeline that uses a simple neural network to classify the images. In tpcp, all “things” that should be optimized need to be parameters. This means our model itself needs to be a parameter of the pipeline. However, the model does not exist when the pipeline is created, as its creation depends on other hyperparameters. We therefore add it as an optional parameter initialized with None. Further, we prefix the parameter name with an underscore to signify that this is not a parameter the user should modify manually. This is just a convention, and it is up to you how you name your parameters.

We further introduce a hyperparameter n_dense_layer_nodes to show how we can influence the model creation.
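The parameter convention described above can be sketched in isolation (the class and attribute names here are illustrative, not the actual pipeline):

```python
from typing import Optional


class ModelHolder:
    # Sketch of the convention: the trained model is an underscore-prefixed
    # parameter that defaults to None and is only filled in during optimization.
    def __init__(self, n_dense_layer_nodes: int = 128, _model: Optional[object] = None):
        self.n_dense_layer_nodes = n_dense_layer_nodes
        self._model = _model


holder = ModelHolder()
# The model stays None until an optimization step assigns it.
```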

The optimize method#

To make our pipeline optimizable, it needs to inherit from OptimizablePipeline. Further, we need to mark at least one of the parameters as OptiPara using a type annotation. We do this for our _model parameter.

Finally, we need to implement the self_optimize method. This method receives the entire training dataset as input and should update the _model parameter with the trained model. Hence, we first extract the relevant data (remember, each datapoint is 60 images) by concatenating all images over all groups in the dataset. Then we create the Keras model based on the hyperparameters. Finally, we train the model and update the _model parameter.

Here we chose to wrap the method with make_optimize_safe. This decorator performs some runtime checks to ensure that the method is implemented correctly.
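The concatenation step can be illustrated with dummy data (the arrays below are placeholders; only the shapes matter):

```python
import numpy as np

# Two dummy "datapoints", each contributing 60 images of 28x28 pixels.
images_per_group = [np.zeros((60, 28, 28)), np.ones((60, 28, 28))]
labels_per_group = [np.zeros(60, dtype=int), np.ones(60, dtype=int)]

data = np.vstack(images_per_group)    # stacks along the first axis
labels = np.hstack(labels_per_group)  # flat label vector

print(data.shape)    # (120, 28, 28)
print(labels.shape)  # (120,)
```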

The run method#

The run method expects that the _model parameter is already set (i.e. the pipeline was already optimized). It gets a single datapoint as input (remember, a datapoint is a single group of 60 images). We then extract the data from the datapoint and let the model make a prediction. We store the prediction in our output attribute predictions_. The trailing underscore is a convention to signify that this is a “result” attribute.
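Converting the stored logits to class labels is a simple argmax over the 10 class scores per image; a minimal sketch with hypothetical logits for two images:

```python
import numpy as np

# Hypothetical logit vectors (one row per image, 10 classes each).
predictions = np.array([
    [0.1, 2.5, -1.0, 0.0, 0.3, -0.2, 0.4, 0.1, 0.0, -0.5],
    [0.2, 0.0, 0.1, 0.0, 0.3, -0.2, 0.4, 0.1, 3.1, -0.5],
])
# Index of the largest logit per row is the predicted class.
predicted_labels = np.argmax(predictions, axis=1)
print(predicted_labels)  # [1 8]
```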

import warnings
from typing import Optional

from typing_extensions import Self

from tpcp import OptimizablePipeline, OptiPara, make_action_safe, make_optimize_safe


class KerasPipeline(OptimizablePipeline):
    n_dense_layer_nodes: int
    n_train_epochs: int
    _model: OptiPara[Optional[tf.keras.Sequential]]

    predictions_: np.ndarray

    def __init__(self, n_dense_layer_nodes=128, n_train_epochs=5, _model: Optional[tf.keras.Sequential] = None):
        self.n_dense_layer_nodes = n_dense_layer_nodes
        self.n_train_epochs = n_train_epochs
        self._model = _model

    @property
    def predicted_labels_(self):
        return np.argmax(self.predictions_, axis=1)

    @make_optimize_safe
    def self_optimize(self, dataset, **_) -> Self:
        data = np.vstack([d.input_as_array() for d in dataset])
        labels = np.hstack([d.labels_as_array() for d in dataset])

        print(data.shape)
        if self._model is not None:
            warnings.warn("Overwriting existing model!")

        self._model = tf.keras.Sequential(
            [
                tf.keras.layers.Flatten(input_shape=(28, 28)),
                tf.keras.layers.Dense(self.n_dense_layer_nodes, activation="relu"),
                tf.keras.layers.Dense(10),
            ]
        )

        self._model.compile(
            optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
            metrics=["accuracy"],
        )

        self._model.fit(data, labels, epochs=self.n_train_epochs)

        return self

    @make_action_safe
    def run(self, datapoint) -> Self:
        if self._model is None:
            raise RuntimeError("Model not trained yet!")
        data = datapoint.input_as_array()

        self.predictions_ = self._model.predict(data)
        return self

Testing the pipeline#

We can now test our pipeline. We will run the optimization using a couple of datapoints (to keep everything fast) and then use run to get the predictions for a single unseen datapoint.

pipeline = KerasPipeline().self_optimize(FashionMNIST()[:10])
p1 = pipeline.run(FashionMNIST()[11])
print(p1.predicted_labels_)
print(FashionMNIST()[11].labels_as_array())
(600, 28, 28)
Epoch 1/5
19/19 [==============================] - 1s 3ms/step - loss: 1.5112 - accuracy: 0.4933
Epoch 2/5
19/19 [==============================] - 0s 3ms/step - loss: 0.8705 - accuracy: 0.7083
Epoch 3/5
19/19 [==============================] - 0s 2ms/step - loss: 0.6799 - accuracy: 0.7850
Epoch 4/5
19/19 [==============================] - 0s 2ms/step - loss: 0.5962 - accuracy: 0.8133
Epoch 5/5
19/19 [==============================] - 0s 2ms/step - loss: 0.5158 - accuracy: 0.8400
2/2 [==============================] - 0s 2ms/step
[8 8 0 9 6 0 7 3 7 9 3 8 6 3 7 8 1 4 0 7 9 8 5 5 2 1 3 3 1 9 7 5 9 9 7 8 2
 7 2 7 2 6 7 1 1 7 5 4 8 3 5 9 0 7 3 0 0 9 1 9]
[8 8 0 9 2 0 7 3 7 9 3 8 4 3 7 8 1 4 0 7 9 8 5 5 2 1 3 4 6 7 7 5 9 9 7 8 2
 7 4 7 0 3 5 1 1 5 5 2 8 3 5 9 0 7 3 0 0 7 1 9]

We can see that even with just 5 epochs, the model already performs quite well. To quantify this, we can calculate the accuracy for this datapoint:

from sklearn.metrics import accuracy_score

accuracy_score(p1.predicted_labels_, FashionMNIST()[11].labels_as_array())
0.8
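accuracy_score here is just the fraction of matching labels; an equivalent computation with plain numpy, using a shortened hypothetical pair of label vectors:

```python
import numpy as np

# Hypothetical predicted and reference labels (4 of 5 match).
predicted = np.array([8, 8, 0, 9, 6])
reference = np.array([8, 8, 0, 9, 2])

# Elementwise comparison gives booleans; their mean is the accuracy.
accuracy = np.mean(predicted == reference)
print(accuracy)  # 0.8
```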

Cross Validation#

If we want to run a cross validation, we need to formalize the scoring into a function. We will calculate two types of accuracy: First, the accuracy per group and second, the accuracy over all images across all groups. For more information about how this works, check the Custom Scorer example.

from collections.abc import Sequence

from tpcp.validate import Aggregator


class SingleValueAccuracy(Aggregator[np.ndarray]):
    RETURN_RAW_SCORES = False

    @classmethod
    def aggregate(cls, /, values: Sequence[tuple[np.ndarray, np.ndarray]], **_) -> dict[str, float]:
        return {"accuracy": accuracy_score(np.hstack([v[0] for v in values]), np.hstack([v[1] for v in values]))}


def scoring(pipeline, datapoint):
    result: np.ndarray = pipeline.safe_run(datapoint).predicted_labels_
    reference = datapoint.labels_as_array()

    return {
        "accuracy": accuracy_score(result, reference),
        "per_sample": SingleValueAccuracy((result, reference)),
    }
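In this example all groups have the same size, so the two accuracy variants coincide; with unequal group sizes they differ, because averaging per-group accuracies weights each group equally while pooling weights each image equally. A small hypothetical illustration:

```python
import numpy as np

# Two hypothetical groups of unequal size: 4 images (3 correct), 2 images (1 correct).
results = [
    (np.array([1, 1, 1, 1]), np.array([1, 1, 1, 0])),
    (np.array([0, 0]), np.array([0, 1])),
]

# Mean of per-group accuracies: (0.75 + 0.5) / 2
mean_of_groups = np.mean([np.mean(p == r) for p, r in results])

# Pooled accuracy over all images: 4 correct out of 6
pooled = np.mean(
    np.hstack([p for p, _ in results]) == np.hstack([r for _, r in results])
)

print(mean_of_groups, pooled)
```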

Now we can run a cross validation. We will only run it on a subset of the data, to keep the runtime manageable.

Note

You might see warnings about retracing of the model. This is because we clone the pipeline before each call to the run method. Cloning is a good idea to ensure that all pipelines are independent of each other, but it might result in some performance overhead.

from tpcp.optimize import Optimize
from tpcp.validate import cross_validate

pipeline = KerasPipeline(n_train_epochs=10)
cv_results = cross_validate(Optimize(pipeline), FashionMNIST()[:100], scoring=scoring, cv=3)
CV Folds:   0%|          | 0/3 [00:00<?, ?it/s]
(3960, 28, 28)
Epoch 1/10
124/124 [==============================] - 1s 2ms/step - loss: 0.8968 - accuracy: 0.6919
Epoch 2/10
124/124 [==============================] - 0s 2ms/step - loss: 0.5606 - accuracy: 0.8104
Epoch 3/10
124/124 [==============================] - 0s 2ms/step - loss: 0.5009 - accuracy: 0.8232
Epoch 4/10
124/124 [==============================] - 0s 2ms/step - loss: 0.4448 - accuracy: 0.8437
Epoch 5/10
124/124 [==============================] - 0s 2ms/step - loss: 0.4119 - accuracy: 0.8540
Epoch 6/10
124/124 [==============================] - 0s 2ms/step - loss: 0.3792 - accuracy: 0.8687
Epoch 7/10
124/124 [==============================] - 0s 2ms/step - loss: 0.3555 - accuracy: 0.8768
Epoch 8/10
124/124 [==============================] - 0s 2ms/step - loss: 0.3464 - accuracy: 0.8826
Epoch 9/10
124/124 [==============================] - 0s 2ms/step - loss: 0.3196 - accuracy: 0.8886
Epoch 10/10
124/124 [==============================] - 0s 2ms/step - loss: 0.2921 - accuracy: 0.8997


Datapoints:   0%|          | 0/34 [00:00<?, ?it/s]
Datapoints: 100%|██████████| 34/34 [00:06<00:00,  4.86it/s]

CV Folds:  33%|███▎      | 1/3 [00:13<00:27, 13.76s/it]
(4020, 28, 28)
Epoch 1/10
126/126 [==============================] - 1s 2ms/step - loss: 0.8913 - accuracy: 0.7000
Epoch 2/10
126/126 [==============================] - 0s 2ms/step - loss: 0.5662 - accuracy: 0.8002
Epoch 3/10
126/126 [==============================] - 0s 2ms/step - loss: 0.4893 - accuracy: 0.8279
Epoch 4/10
126/126 [==============================] - 0s 2ms/step - loss: 0.4431 - accuracy: 0.8453
Epoch 5/10
126/126 [==============================] - 0s 2ms/step - loss: 0.4046 - accuracy: 0.8580
Epoch 6/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3772 - accuracy: 0.8667
Epoch 7/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3479 - accuracy: 0.8786
Epoch 8/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3285 - accuracy: 0.8863
Epoch 9/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3120 - accuracy: 0.8900
Epoch 10/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3065 - accuracy: 0.8871


Datapoints: 100%|██████████| 33/33 [00:06<00:00,  4.84it/s]

CV Folds:  67%|██████▋   | 2/3 [00:25<00:12, 12.36s/it]
(4020, 28, 28)
Epoch 1/10
126/126 [==============================] - 1s 2ms/step - loss: 0.8965 - accuracy: 0.6983
Epoch 2/10
126/126 [==============================] - 0s 2ms/step - loss: 0.5750 - accuracy: 0.8040
Epoch 3/10
126/126 [==============================] - 0s 2ms/step - loss: 0.5140 - accuracy: 0.8224
Epoch 4/10
126/126 [==============================] - 0s 2ms/step - loss: 0.4667 - accuracy: 0.8358
Epoch 5/10
126/126 [==============================] - 0s 2ms/step - loss: 0.4173 - accuracy: 0.8600
Epoch 6/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3949 - accuracy: 0.8624
Epoch 7/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3635 - accuracy: 0.8774
Epoch 8/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3457 - accuracy: 0.8826
Epoch 9/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3285 - accuracy: 0.8878
Epoch 10/10
126/126 [==============================] - 0s 2ms/step - loss: 0.3008 - accuracy: 0.8993


Datapoints: 100%|██████████| 33/33 [00:06<00:00,  4.78it/s]

CV Folds: 100%|██████████| 3/3 [00:36<00:00, 12.21s/it]

We can now look at the results per group:

cv_results["test_single_accuracy"]
[[0.85, 0.8666666666666667, 0.9333333333333333, 0.8, 0.85, 0.85, 0.85, 0.8833333333333333, 0.85, 0.85, 0.8166666666666667, 0.8666666666666667, 0.9333333333333333, 0.8, 0.8833333333333333, 0.8, 0.8, 0.8833333333333333, 0.7833333333333333, 0.8166666666666667, 0.8, 0.85, 0.8, 0.95, 0.8, 0.8833333333333333, 0.8833333333333333, 0.85, 0.85, 0.7833333333333333, 0.75, 0.8666666666666667, 0.8333333333333334, 0.9333333333333333],
 [0.7833333333333333, 0.8166666666666667, 0.7666666666666667, 0.7833333333333333, 0.85, 0.8, 0.8, 0.8166666666666667, 0.8, 0.8333333333333334, 0.8333333333333334, 0.8, 0.8166666666666667, 0.85, 0.75, 0.7, 0.8333333333333334, 0.8166666666666667, 0.8666666666666667, 0.7666666666666667, 0.7833333333333333, 0.7833333333333333, 0.7666666666666667, 0.8833333333333333, 0.8333333333333334, 0.7833333333333333, 0.8, 0.85, 0.85, 0.8166666666666667, 0.7666666666666667, 0.8, 0.8833333333333333],
 [0.7666666666666667, 0.9166666666666666, 0.8, 0.8, 0.7833333333333333, 0.85, 0.8833333333333333, 0.85, 0.9, 0.8833333333333333, 0.9, 0.85, 0.8833333333333333, 0.8166666666666667, 0.8333333333333334, 0.8333333333333334, 0.85, 0.7833333333333333, 0.7166666666666667, 0.7333333333333333, 0.75, 0.7833333333333333, 0.85, 0.7666666666666667, 0.7833333333333333, 0.9, 0.8666666666666667, 0.8333333333333334, 0.8333333333333334, 0.8166666666666667, 0.85, 0.85, 0.9]]

And the overall accuracy, computed as the average over all samples of all groups within each fold:

cv_results["test_per_sample__accuracy"]
array([0.84705882, 0.80858586, 0.83080808])
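Since every group in this example contains the same number of images, the per-fold per-sample accuracy is simply the unweighted mean of that fold's per-group accuracies. A minimal sketch of this relationship (the `fold_group_accuracies` values below are made up for illustration, not taken from the run above):

```python
import numpy as np

# Hypothetical per-group accuracies for a single fold. Because all
# groups contain the same number of samples, the unweighted mean of
# the group accuracies equals the per-sample accuracy of the fold.
fold_group_accuracies = [0.85, 0.90, 0.80, 0.95]

fold_accuracy = float(np.mean(fold_group_accuracies))
print(fold_accuracy)  # 0.875
```

If groups had different sizes, you would instead need a sample-weighted average, which is what the `per_sample` aggregator computes for you.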

Total running time of the script: (0 minutes 41.631 seconds)

Estimated memory usage: 262 MB

Gallery generated by Sphinx-Gallery