tpcp.optimize.Optimize

class tpcp.optimize.Optimize(pipeline: OptimizablePipelineT, *, safe_optimize: bool = True, optimize_with_info: bool = True)

Run a generic self-optimization on the pipeline.

This is a simple wrapper for pipelines that already implement a self_optimize method. It ensures that such pipelines can be optimized through the same interface as other optimization methods and can therefore be used in functions like tpcp.validate.cross_validate.

Optimize will never modify the original pipeline, but will store a copy of the optimized pipeline as optimized_pipeline_.

If safe_optimize is True, the wrapper applies the same runtime checks as provided by make_optimize_safe.
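A minimal sketch of the typical usage, assuming a hypothetical MyPipeline and MyDataset that are not part of tpcp:

import pandas as pd

from tpcp import Dataset, OptimizableParameter, OptimizablePipeline
from tpcp.optimize import Optimize


class MyDataset(Dataset):
    """Toy dataset with three data points."""

    def create_index(self) -> pd.DataFrame:
        return pd.DataFrame({"record": ["rec_1", "rec_2", "rec_3"]})


class MyPipeline(OptimizablePipeline[MyDataset]):
    """Toy pipeline whose only parameter is tuned by self_optimize."""

    threshold: OptimizableParameter[float]

    def __init__(self, threshold: float = 0.0):
        self.threshold = threshold

    def self_optimize(self, dataset: MyDataset, **_) -> "MyPipeline":
        # "Train" the parameter; here it is simply derived from the dataset size.
        self.threshold = float(len(dataset))
        return self

    def run(self, datapoint: MyDataset) -> "MyPipeline":
        # Result attributes end with a trailing underscore by tpcp convention.
        self.result_ = self.threshold
        return self


optimizer = Optimize(MyPipeline()).optimize(MyDataset())
print(optimizer.optimized_pipeline_.threshold)  # 3.0 - the "trained" value
print(optimizer.run(MyDataset()[0]).result_)    # the optimized pipeline applied to one datapoint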

Parameters:
pipeline

The pipeline to optimize. The pipeline must implement self_optimize to optimize its own input parameters.

safe_optimize

If True, we add additional checks to make sure the self_optimize method of the pipeline is correctly implemented. See make_optimize_safe for more info.

optimize_with_info

If True, Optimize will try to call self_optimize_with_info by default and will fall back to self_optimize. If you want to force Optimize to use self_optimize, even if an implementation of self_optimize_with_info exists, set this parameter to False.

Other Parameters:
dataset

The dataset used for optimization.

Attributes:
optimized_pipeline_

The optimized version of the pipeline. This is a copy of the input pipeline with modified parameters.

optimization_info_

If the optimized pipeline implements a self_optimize_with_info method, this attribute contains the additional information provided as the second return value of that method.
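Building on the hypothetical sketch above, a pipeline variant that implements self_optimize_with_info could expose such information:

from typing import Any, Tuple


class MyPipelineWithInfo(MyPipeline):
    """Variant of the toy pipeline that also reports optimization metadata."""

    def self_optimize_with_info(self, dataset: MyDataset, **_) -> Tuple["MyPipelineWithInfo", Any]:
        self.self_optimize(dataset)
        # The second return value is what Optimize stores as optimization_info_.
        return self, {"n_datapoints": len(dataset)}


optimizer = Optimize(MyPipelineWithInfo()).optimize(MyDataset())
print(optimizer.optimization_info_)  # {'n_datapoints': 3}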

Methods

clone()

Create a new instance of the class with all parameters copied over.

get_params([deep])

Get parameters for this algorithm.

optimize(dataset, **optimize_params)

Run the self-optimization defined by the pipeline.

run(datapoint)

Run the optimized pipeline.

safe_run(datapoint)

Run the optimized pipeline.

score(datapoint)

Run score of the optimized pipeline.

set_params(**params)

Set the parameters of this Algorithm.

__init__(pipeline: OptimizablePipelineT, *, safe_optimize: bool = True, optimize_with_info: bool = True) → None
clone() → Self

Create a new instance of the class with all parameters copied over.

This will create a new instance of the class itself and of all nested objects.
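For illustration, using the hypothetical MyPipeline and MyDataset from the sketch above: the clone carries over all parameters, but none of the result attributes.

opt = Optimize(MyPipeline()).optimize(MyDataset())
fresh = opt.clone()
print(hasattr(opt, "optimized_pipeline_"))    # True
print(hasattr(fresh, "optimized_pipeline_"))  # False - results are not copied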

get_params(deep: bool = True) → Dict[str, Any]

Get parameters for this algorithm.

Parameters:
deep

Only relevant if the object contains nested algorithm objects. If this is the case and deep is True, the params of these nested objects are included in the output using a prefix like nested_object_name__ (note the two “_” at the end); see the sketch below.

Returns:
params

Parameter names mapped to their values.
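With the hypothetical MyPipeline from the sketch above, the parameters of the wrapped pipeline show up under the pipeline__ prefix when deep is True:

opt = Optimize(MyPipeline(threshold=0.5))
print(sorted(opt.get_params(deep=False)))                # ['optimize_with_info', 'pipeline', 'safe_optimize']
print(opt.get_params(deep=True)["pipeline__threshold"])  # 0.5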

optimize(dataset: DatasetT, **optimize_params: Any) → Self

Run the self-optimization defined by the pipeline.

The optimized version of the pipeline is stored as self.optimized_pipeline_.

Parameters:
dataset

An instance of a Dataset containing one or multiple data points that can be used for optimization. The structure of the data and the available reference information will depend on the dataset.

optimize_params

Additional parameters for the optimization process. They are forwarded to pipeline.self_optimize; see the sketch below.

Returns:
self

The class instance with all result attributes populated.
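Any additional keyword arguments are passed through unchanged to the pipeline's self_optimize; the keyword below is purely hypothetical and only works because the toy MyPipeline above accepts arbitrary keyword arguments:

optimizer = Optimize(MyPipeline()).optimize(MyDataset(), some_training_option=True)
print(optimizer.optimized_pipeline_.threshold)  # 3.0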

run(datapoint: DatasetT) → PipelineT

Run the optimized pipeline.

This is a wrapper to maintain API compatibility with Pipeline.

safe_run(datapoint: DatasetT) → PipelineT

Run the optimized pipeline.

This is a wrapper to maintain API compatibility with Pipeline.

score(datapoint: DatasetT) → float | Dict[str, float]

Run score of the optimized pipeline.

This is a wrapper to maintain API compatibility with Pipeline.

set_params(**params: Any) → Self

Set the parameters of this Algorithm.

To set parameters of nested objects, use the nested_object_name__para_name= syntax.
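For example, with the hypothetical MyPipeline from above, the parameter of the wrapped pipeline can be changed through the wrapper:

opt = Optimize(MyPipeline()).set_params(pipeline__threshold=2.0, safe_optimize=False)
print(opt.get_params()["pipeline__threshold"])  # 2.0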

Examples using tpcp.optimize.Optimize

Optimizable Pipelines

Custom Optuna Optimizer

Cross Validation

Tensorflow/Keras

Optimization Info