adaptive.DataSaver#

The DataSaver class#

class adaptive.DataSaver(*args, **kwargs)[source]#

Bases: adaptive.learner.base_learner.BaseLearner

Save extra data associated with the values that need to be learned.

Parameters
  • learner (BaseLearner instance) – The learner that needs to be wrapped.

  • arg_picker (function) – Function that returns the argument that needs to be learned.

Example

Imagine we have a function that returns a dictionary of the form: {'y': y, 'err_est': err_est}.

>>> from operator import itemgetter
>>> _learner = Learner1D(f, bounds=(-1.0, 1.0))
>>> learner = DataSaver(_learner, arg_picker=itemgetter('y'))
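
A minimal sketch of what happens when such a dictionary is fed in (f is the hypothetical function above; the stored dictionaries are assumed to be reachable through the learner's extra_data attribute, which is not part of the listing below):

>>> def f(x):
...     y = x ** 2
...     return {'y': y, 'err_est': abs(x) * 1e-3}
>>> learner.tell(0.5, f(0.5))  # the full dict is stored; only 'y' is learned
>>> learner.extra_data[0.5]    # assumed attribute holding the stored dicts
{'y': 0.25, 'err_est': 0.0005}
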
ask(*args, **kwargs)[source]#

Choose the next ‘n’ points to evaluate.

Parameters
  • n (int) – The number of points to choose.

  • tell_pending (bool, default: True) – If True, add the chosen points to this learner’s pending_points. Set this to False if you do not want to modify the state of the learner.
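
For example (a sketch, reusing the hypothetical f from the class example; for the standard learners ask returns a tuple of points and loss improvements):

>>> points, loss_improvements = learner.ask(3)
>>> for x in points:
...     learner.tell(x, f(x))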

data: dict#
load(fname, compress=True) None[source]#

Load the data of a learner from a pickle file.

Parameters
  • fname (str) – The filename from which to load the learner’s data.

  • compress (bool, default True) – If the data is compressed when saved, one must load it with compression too.
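
A sketch of restoring previously saved data into a fresh copy of the learner (the filename is hypothetical and must have been written by save with the same compression setting):

>>> fresh = learner.new()            # same arg_picker, fresh child learner
>>> fresh.load('data_saver.pickle')  # compress=True must match how it was saved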

load_dataframe(df: pandas.core.frame.DataFrame, extra_data_name: str = 'extra_data', input_names: tuple[str] = (), **kwargs) None[source]#

Load the data from a pandas.DataFrame into the learner.

Parameters
  • df (pandas.DataFrame) – DataFrame with the data to load.

  • extra_data_name (str, optional) – The extra_data_name used in to_dataframe, by default “extra_data”.

  • input_names (tuple[str], optional) – The input names of the child learner. By default the input names are taken from df.attrs["inputs"], however, metadata is not preserved when saving/loading a DataFrame to/from a file. In that case, the input names can be passed explicitly. For example, for a 2D learner, this would be input_names=('x', 'y').

  • **kwargs (dict) – Keyword arguments passed to the child_learner.load_dataframe(**kwargs).
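
For example, when a DataFrame has been round-tripped through a file and its attrs metadata is gone, the input names can be supplied by hand (a sketch, assuming a 1D child learner whose input column is 'x' and a hypothetical results.csv):

>>> import pandas as pd
>>> df = pd.read_csv('results.csv')  # df.attrs["inputs"] is not preserved on disk
>>> learner.load_dataframe(df, input_names=('x',))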

loss(*args, **kwargs)[source]#

Return the loss for the current state of the learner.

Parameters

real (bool, default: True) – If False, return the “expected” loss, i.e. the loss including the as-yet unevaluated points (possibly by interpolation).
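
For example (a sketch):

>>> current_loss = learner.loss()             # based on evaluated points only
>>> expected_loss = learner.loss(real=False)  # also counts pending points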

new() adaptive.learner.data_saver.DataSaver[source]#

Return a new DataSaver with the same arg_picker and learner.

npoints: int#
pending_points: set#
remove_unfinished(*args, **kwargs)[source]#

Remove uncomputed data from the learner.

save(fname, compress=True) None[source]#

Save the data of the learner into a pickle file.

Parameters
  • fname (str) – The filename into which to save the learner’s data.

  • compress (bool, default True) – Compress the data upon saving using ‘gzip’. When saving using compression, one must load it with compression too.
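
A sketch with hypothetical filenames; whatever compression is chosen here must also be used when loading:

>>> learner.save('data_saver.pickle')                      # gzip-compressed
>>> learner.save('data_saver_raw.pickle', compress=False)  # load with compress=False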

tell(x: Any, result: Any) None[source]#

Tell the learner about a single value.

Parameters
  • x (Any) – A value from the function domain.

  • result (Any) – A value from the function image; for a DataSaver this is the full result from which arg_picker extracts the value passed on to the wrapped learner.
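
For instance (a sketch; the result here is the full dictionary, from which arg_picker picks the value to learn):

>>> learner.tell(0.25, {'y': 0.0625, 'err_est': 2.5e-4})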

tell_pending(x: Any) None[source]#

Tell the learner that ‘x’ has been requested such that it’s not suggested again.
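
For example (a sketch):

>>> learner.tell_pending(0.75)  # 0.75 joins pending_points and won't be suggested again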

to_dataframe(extra_data_name: str = 'extra_data', **kwargs: Any) pandas.core.frame.DataFrame[source]#

Return the data as a concatenated pandas.DataFrame from child learners.

Parameters
  • extra_data_name (str, optional) – The name of the column containing the extra data, by default “extra_data”.

  • **kwargs (dict) – Keyword arguments passed to the child_learner.to_dataframe(**kwargs).

Returns

The data of the wrapped learner, with the extra data in an additional column.

Return type

pandas.DataFrame

Raises

ImportError – If pandas is not installed.
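
A sketch of exporting the data (assuming pandas is installed):

>>> df = learner.to_dataframe(extra_data_name='extra')
>>> extra = df['extra']  # the stored dictionaries, one row per evaluated point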

The make_datasaver function#

adaptive.make_datasaver(learner_type, arg_picker)[source]#

Create a DataSaver of a learner_type that can be instantiated with the learner_type’s keyword arguments.

Parameters
  • learner_type (BaseLearner type) – The learner type that needs to be wrapped.

  • arg_picker (function) – Function that returns the argument that needs to be learned.

Example

Imagine we have a function that returns a dictionary of the form: {'y': y, 'err_est': err_est}.

>>> from operator import itemgetter
>>> DataSaver = make_datasaver(Learner1D, arg_picker=itemgetter('y'))
>>> learner = DataSaver(function=f, bounds=(-1.0, 1.0))
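
The resulting class behaves like a regular learner, so it can be run as usual (a sketch, assuming f returns the dictionary shown above and using adaptive's blocking runner.simple):

>>> import adaptive
>>> adaptive.runner.simple(learner, goal=lambda lrn: lrn.npoints >= 50)
>>> df = learner.to_dataframe()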

Or when using adaptive.BalancingLearner.from_product:

>>> learner_type = make_datasaver(adaptive.Learner1D,
...     arg_picker=itemgetter('y'))
>>> learner = adaptive.BalancingLearner.from_product(
...     jacobi, learner_type, dict(bounds=(0, 1)), combos)