.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/integrations/_01_tensorflow.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_integrations__01_tensorflow.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_integrations__01_tensorflow.py:

.. _tensorflow:

TensorFlow/Keras
================

.. note:: This example requires the `tensorflow` package to be installed.

Theoretically, tpcp is framework agnostic and can be used with any framework.
However, due to the way some frameworks handle their objects, some special handling is required internally.
Hence, this example serves not only as an example of how to use tensorflow with tpcp, but also as a test case for
these special cases.

When using tpcp with a machine learning framework, you either want to use a pretrained model within a normal
pipeline, or train your own model as part of an Optimizable Pipeline.
Here we show the second case, as it is more complex, and you are likely able to figure out the first case yourself.

This means we are going to perform the following steps:

1. Create a pipeline that creates and trains a model.
2. Allow the modification of model hyperparameters.
3. Run a simple cross-validation to demonstrate the functionality.

This example reimplements the basic MNIST example from the
`tensorflow documentation <https://www.tensorflow.org/tutorials/keras/classification>`_.

Some Notes
----------

In this example we show how to implement a Pipeline that uses tensorflow.
You could implement an Algorithm in a similar way.
This would actually be easier, as no specific handling of the input data would be required.
For a pipeline, we need to create a custom Dataset class, as this is the expected input for a pipeline.

.. GENERATED FROM PYTHON SOURCE LINES 39-54

The Dataset
-----------

We are using the normal fashion MNIST dataset for this example.
It consists of 60,000 images of 28x28 pixels, each with a label.
We will ignore the typical train-test split, as we want to do our own cross-validation.

In addition, we will simulate an additional "index level".
In this (and most typical deep learning) datasets, each datapoint is one vector for which we can make one
prediction.
In tpcp, we usually deal with datasets where you might have multiple pieces of information for each datapoint.
For example, one datapoint could be a patient, for which we have an entire time series of measurements.
We simulate this here by creating the index of our dataset as 1000 groups, each containing 60 images.

Other than that, the dataset is pretty standard.
Besides the `create_index` method, we only need to implement the `input_as_array` and `labels_as_array` methods
that allow us to easily access the data once we have selected a single group.

.. GENERATED FROM PYTHON SOURCE LINES 54-93

.. code-block:: default

    from functools import lru_cache

    import numpy as np
    import pandas as pd
    import tensorflow as tf

    from tpcp import Dataset

    tf.keras.utils.set_random_seed(812)
    tf.config.experimental.enable_op_determinism()


    @lru_cache(maxsize=1)
    def get_fashion_mnist_data():
        # Note: We throw train and test sets together, as we don't care about the official split here.
        # We will create our own split later.
        (train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.fashion_mnist.load_data()
        return np.array(list(train_images) + list(test_images)), list(train_labels) + list(test_labels)


    class FashionMNIST(Dataset):
        def input_as_array(self) -> np.ndarray:
            self.assert_is_single(None, "input_as_array")
            group_id = int(self.group_label.group_id)
            images, _ = get_fashion_mnist_data()
            return images[group_id * 60 : (group_id + 1) * 60].reshape((60, 28, 28)) / 255

        def labels_as_array(self) -> np.ndarray:
            self.assert_is_single(None, "labels_as_array")
            group_id = int(self.group_label.group_id)
            _, labels = get_fashion_mnist_data()
            return np.array(labels[group_id * 60 : (group_id + 1) * 60])

        def create_index(self) -> pd.DataFrame:
            # There are 60,000 images in total.
            # We simulate 1000 groups of 60 images each.
            return pd.DataFrame({"group_id": list(range(1000))})

.. GENERATED FROM PYTHON SOURCE LINES 94-95

We can see that our Dataset works as expected:

.. GENERATED FROM PYTHON SOURCE LINES 95-98

.. code-block:: default

    dataset = FashionMNIST()
    dataset[0].input_as_array().shape

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz
    29515/29515 [==============================] - 0s 1us/step
    Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
    26421880/26421880 [==============================] - 0s 0us/step
    Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz
    5148/5148 [==============================] - 0s 0us/step
    Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz
    4422102/4422102 [==============================] - 0s 0us/step

    (60, 28, 28)

.. GENERATED FROM PYTHON SOURCE LINES 99-101

.. code-block:: default

    dataset[0].labels_as_array().shape

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (60,)

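As a quick aside (an addition to the generated example): tpcp datasets support slicing and iteration, so we can
sanity-check a few groups directly, using only the methods defined above.

.. code-block:: python

    # Iterate over the first three single-group datapoints and check their shapes.
    for datapoint in FashionMNIST()[:3]:
        print(datapoint.group_label, datapoint.input_as_array().shape)
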
.. GENERATED FROM PYTHON SOURCE LINES 102-139

The Pipeline
------------

We will create a pipeline that uses a simple neural network to classify the images.
In tpcp, all "things" that should be optimized need to be parameters.
This means our model itself needs to be a parameter of the pipeline.
However, as we don't have the model yet (its creation depends on other hyperparameters), we add it as an optional
parameter initialized with `None`.
Further, we prefix the parameter name with an underscore to signify that this is not a parameter that should be
modified manually by the user.
This is just a convention, and it is up to you to decide how you want to name your parameters.

We further introduce a hyperparameter `n_dense_layer_nodes` to show how we can influence the model creation.

The optimize method
+++++++++++++++++++

To make our pipeline optimizable, it needs to inherit from `OptimizablePipeline`.
Further, we need to mark at least one of the parameters as `OptiPara` using the type annotation.
We do this for our `_model` parameter.

Finally, we need to implement the `self_optimize` method.
This method gets the entire training dataset as input and should update the `_model` parameter with the trained
model.
Hence, we first extract the relevant data (remember, each datapoint is 60 images) by concatenating all images over
all groups in the dataset.
Then we create the Keras model based on the hyperparameters.
Finally, we train the model and update the `_model` parameter.

Here we chose to wrap the method with `make_optimize_safe`.
This decorator performs some runtime checks to ensure that the method is implemented correctly.

The run method
++++++++++++++

The run method expects that the `_model` parameter is already set (i.e. the pipeline was already optimized).
It gets a single datapoint as input (remember, a datapoint is a single group of 60 images).
We then extract the data from the datapoint and let the model make a prediction.
We store the prediction in the output attribute `predictions_`.
The trailing underscore is a convention to signify that this is a result attribute.

.. GENERATED FROM PYTHON SOURCE LINES 139-198

.. code-block:: default

    import warnings
    from typing import Optional, Tuple

    from typing_extensions import Self

    from tpcp import OptimizablePipeline, OptiPara, make_action_safe, make_optimize_safe


    class KerasPipeline(OptimizablePipeline):
        n_dense_layer_nodes: int
        n_train_epochs: int
        _model: OptiPara[Optional[tf.keras.Sequential]]

        predictions_: np.ndarray

        def __init__(self, n_dense_layer_nodes=128, n_train_epochs=5, _model: Optional[tf.keras.Sequential] = None):
            self.n_dense_layer_nodes = n_dense_layer_nodes
            self.n_train_epochs = n_train_epochs
            self._model = _model

        @property
        def predicted_labels_(self):
            return np.argmax(self.predictions_, axis=1)

        @make_optimize_safe
        def self_optimize(self, dataset, **_) -> Self:
            data = np.vstack([d.input_as_array() for d in dataset])
            labels = np.hstack([d.labels_as_array() for d in dataset])
            print(data.shape)

            if self._model is not None:
                warnings.warn("Overwriting existing model!")

            self._model = tf.keras.Sequential(
                [
                    tf.keras.layers.Flatten(input_shape=(28, 28)),
                    tf.keras.layers.Dense(self.n_dense_layer_nodes, activation="relu"),
                    tf.keras.layers.Dense(10),
                ]
            )

            self._model.compile(
                optimizer="adam",
                loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                metrics=["accuracy"],
            )

            self._model.fit(data, labels, epochs=self.n_train_epochs)
            return self

        @make_action_safe
        def run(self, datapoint) -> Self:
            if self._model is None:
                raise RuntimeError("Model not trained yet!")
            data = datapoint.input_as_array()
            self.predictions_ = self._model.predict(data)
            return self

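Before we optimize anything, we can sanity-check the parameter handling.
The snippet below is an addition to this example; it assumes the sklearn-style `get_params` interface that tpcp
objects expose.

.. code-block:: python

    # Sketch: all constructor arguments, including the `_model` placeholder,
    # are regular tpcp parameters and can be inspected (and later tuned).
    pipeline = KerasPipeline(n_dense_layer_nodes=64)
    print(pipeline.get_params())
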
.. GENERATED FROM PYTHON SOURCE LINES 199-204

Testing the pipeline
--------------------

We can now test our pipeline.
We run the optimization using a couple of datapoints (to keep everything fast) and then use `run` to get the
predictions for a single unseen datapoint.

.. GENERATED FROM PYTHON SOURCE LINES 204-209

.. code-block:: default

    pipeline = KerasPipeline().self_optimize(FashionMNIST()[:10])
    p1 = pipeline.run(FashionMNIST()[11])
    print(p1.predicted_labels_)
    print(FashionMNIST()[11].labels_as_array())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (600, 28, 28)
    Epoch 1/5
    19/19 [==============================] - 1s 2ms/step - loss: 1.5112 - accuracy: 0.4933
    Epoch 2/5
    19/19 [==============================] - 0s 2ms/step - loss: 0.8705 - accuracy: 0.7083
    Epoch 3/5
    19/19 [==============================] - 0s 2ms/step - loss: 0.6799 - accuracy: 0.7850
    Epoch 4/5
    19/19 [==============================] - 0s 2ms/step - loss: 0.5962 - accuracy: 0.8133
    Epoch 5/5
    19/19 [==============================] - 0s 2ms/step - loss: 0.5158 - accuracy: 0.8400
    2/2 [==============================] - 0s 1ms/step
    [8 8 0 9 6 0 7 3 7 9 3 8 6 3 7 8 1 4 0 7 9 8 5 5 2 1 3 3 1 9 7 5 9 9 7 8 2 7 2 7 2 6 7 1 1 7 5 4 8 3 5 9 0 7 3 0 0 9 1 9]
    [8 8 0 9 2 0 7 3 7 9 3 8 4 3 7 8 1 4 0 7 9 8 5 5 2 1 3 4 6 7 7 5 9 9 7 8 2 7 4 7 0 3 5 1 1 5 5 2 8 3 5 9 0 7 3 0 0 7 1 9]

.. GENERATED FROM PYTHON SOURCE LINES 210-212

We can see that even with just 5 epochs, the model already performs quite well.
To quantify this, we can calculate the accuracy for this datapoint:

.. GENERATED FROM PYTHON SOURCE LINES 212-216

.. code-block:: default

    from sklearn.metrics import accuracy_score

    accuracy_score(p1.predicted_labels_, FashionMNIST()[11].labels_as_array())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    0.8

.. GENERATED FROM PYTHON SOURCE LINES 217-223

Cross Validation
----------------

If we want to run a cross-validation, we need to formalize the scoring into a function.
We will calculate two types of accuracy: first, the accuracy per group, and second, the accuracy over all images
across all groups.
For more information about how this works, check the :ref:`custom_scorer` example.

.. GENERATED FROM PYTHON SOURCE LINES 223-245

.. code-block:: default

    from typing import Dict, Sequence

    from tpcp.validate import Aggregator


    class SingleValueAccuracy(Aggregator[np.ndarray]):
        RETURN_RAW_SCORES = False

        @classmethod
        def aggregate(cls, /, values: Sequence[Tuple[np.ndarray, np.ndarray]], **_) -> Dict[str, float]:
            return {"accuracy": accuracy_score(np.hstack([v[0] for v in values]), np.hstack([v[1] for v in values]))}


    def scoring(pipeline, datapoint):
        result: np.ndarray = pipeline.safe_run(datapoint).predicted_labels_
        reference = datapoint.labels_as_array()

        return {
            "accuracy": accuracy_score(result, reference),
            "per_sample": SingleValueAccuracy((result, reference)),
        }

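As a standalone sanity check (an addition to this example, using made-up arrays), we can call the aggregation logic
directly.
Note that the pooled accuracy differs from the mean of the per-datapoint accuracies when groups contribute
different numbers of samples:

.. code-block:: python

    # Hypothetical (prediction, reference) pairs for two datapoints of
    # different sizes: 2/3 correct and 1/1 correct.
    toy_values = [
        (np.array([1, 2, 3]), np.array([1, 2, 0])),
        (np.array([5]), np.array([5])),
    ]
    # Pooled over all 4 samples: 3/4 = 0.75, while the mean of the
    # per-datapoint accuracies would be ~0.83.
    print(SingleValueAccuracy.aggregate(values=toy_values))
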
.. GENERATED FROM PYTHON SOURCE LINES 246-253

Now we can run a cross-validation.
We will only run it on a subset of the data to keep the runtime manageable.

.. note::
    You might see warnings about retracing of the model.
    This happens because we clone the pipeline before each call to the run method.
    This is a good idea to ensure that all pipelines are independent of each other;
    however, it might result in some performance overhead.

.. GENERATED FROM PYTHON SOURCE LINES 253-259

.. code-block:: default

    from tpcp.validate import cross_validate
    from tpcp.optimize import Optimize

    pipeline = KerasPipeline(n_train_epochs=10)
    cv_results = cross_validate(Optimize(pipeline), FashionMNIST()[:100], scoring=scoring, cv=3)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (Keras per-batch progress omitted; only the final line of the first and last epoch of each fold is shown.)

    CV Folds:   0%|          | 0/3 [00:00<?, ?it/s]
    (3960, 28, 28)
    Epoch 1/10
    124/124 [==============================] - 1s 2ms/step - loss: 0.8968 - accuracy: 0.6919
    ...
    Epoch 10/10
    124/124 [==============================] - 0s 2ms/step - loss: 0.2921 - accuracy: 0.8997
    CV Folds:  33%|###3      | 1/3 [00:11<00:22, 11.48s/it]
    (4020, 28, 28)
    Epoch 1/10
    126/126 [==============================] - 1s 2ms/step - loss: 0.8913 - accuracy: 0.7000
    ...
    Epoch 10/10
    126/126 [==============================] - 0s 2ms/step - loss: 0.3065 - accuracy: 0.8871
    CV Folds:  67%|######6   | 2/3 [00:22<00:11, 11.40s/it]
    (4020, 28, 28)
    Epoch 1/10
    126/126 [==============================] - 1s 2ms/step - loss: 0.8965 - accuracy: 0.6983
    ...
    Epoch 10/10
    126/126 [==============================] - 0s 2ms/step - loss: 0.3008 - accuracy: 0.8993
    CV Folds: 100%|##########| 3/3 [00:32<00:00, 10.69s/it]

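A small aside before digging into the individual scores (this is an addition to the generated example and assumes
that every entry of `cv_results` holds one value per fold): the result is a plain dictionary, so it can be loaded
into a DataFrame for easier inspection.

.. code-block:: python

    # Sketch: tabulate the per-fold results; each row corresponds to one fold.
    cv_df = pd.DataFrame(cv_results)
    print(cv_df["test_per_sample__accuracy"])
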
.. GENERATED FROM PYTHON SOURCE LINES 260-261

We can now look at the results per group:

.. GENERATED FROM PYTHON SOURCE LINES 261-263

.. code-block:: default

    cv_results["test_single_accuracy"]

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [[0.85, 0.8666666666666667, 0.9333333333333333, 0.8, 0.85, 0.85, 0.85, 0.8833333333333333, 0.85, 0.85,
      0.8166666666666667, 0.8666666666666667, 0.9333333333333333, 0.8, 0.8833333333333333, 0.8, 0.8,
      0.8833333333333333, 0.7833333333333333, 0.8166666666666667, 0.8, 0.85, 0.8, 0.95, 0.8, 0.8833333333333333,
      0.8833333333333333, 0.85, 0.85, 0.7833333333333333, 0.75, 0.8666666666666667, 0.8333333333333334,
      0.9333333333333333],
     [0.7833333333333333, 0.8166666666666667, 0.7666666666666667, 0.7833333333333333, 0.85, 0.8, 0.8,
      0.8166666666666667, 0.8, 0.8333333333333334, 0.8333333333333334, 0.8, 0.8166666666666667, 0.85, 0.75, 0.7,
      0.8333333333333334, 0.8166666666666667, 0.8666666666666667, 0.7666666666666667, 0.7833333333333333,
      0.7833333333333333, 0.7666666666666667, 0.8833333333333333, 0.8333333333333334, 0.7833333333333333, 0.8,
      0.85, 0.85, 0.8166666666666667, 0.7666666666666667, 0.8, 0.8833333333333333],
     [0.7666666666666667, 0.9166666666666666, 0.8, 0.8, 0.7833333333333333, 0.85, 0.8833333333333333, 0.85, 0.9,
      0.8833333333333333, 0.9, 0.85, 0.8833333333333333, 0.8166666666666667, 0.8333333333333334,
      0.8333333333333334, 0.85, 0.7833333333333333, 0.7166666666666667, 0.7333333333333333, 0.75,
      0.7833333333333333, 0.85, 0.7666666666666667, 0.7833333333333333, 0.9, 0.8666666666666667,
      0.8333333333333334, 0.8333333333333334, 0.8166666666666667, 0.85, 0.85, 0.9]]

.. GENERATED FROM PYTHON SOURCE LINES 264-265

And the overall accuracy, pooled over all samples of all groups within a fold:

.. GENERATED FROM PYTHON SOURCE LINES 265-266

.. code-block:: default

    cv_results["test_per_sample__accuracy"]

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    array([0.84705882, 0.80858586, 0.83080808])

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 36.573 seconds)

**Estimated memory usage:** 253 MB


.. _sphx_glr_download_auto_examples_integrations__01_tensorflow.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: _01_tensorflow.py <_01_tensorflow.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: _01_tensorflow.ipynb <_01_tensorflow.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_