adaptive.LearnerND

class adaptive.LearnerND(*args, **kwargs)[source]

Bases: adaptive.learner.base_learner.BaseLearner

Learns and predicts a function ‘f: ℝ^N → ℝ^M’.

Parameters
  • func (callable) – The function to learn. Must take a tuple of N real parameters and return a real number or an array-like of length M.

  • bounds (list of 2-tuples or scipy.spatial.ConvexHull) – A list [(a_1, b_1), (a_2, b_2), ..., (a_n, b_n)] containing bounds, one pair per dimension. Or a ConvexHull that defines the boundary of the domain.

  • loss_per_simplex (callable, optional) – A function that returns the loss for a simplex. If not provided, then a default is used, which uses the deviation from a linear estimate, as well as triangle area, to determine the loss.
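
A minimal usage sketch (the example function ring_of_fire, the bounds, and the stopping goal are illustrative, not part of the API):

    import numpy as np
    import adaptive

    def ring_of_fire(xy):
        # An example function of two real parameters returning a real number.
        x, y = xy
        return x + np.exp(-(x**2 + y**2 - 0.75**2)**2 / 0.2**2)

    learner = adaptive.LearnerND(ring_of_fire, bounds=[(-1, 1), (-1, 1)])

    # Run with the simple blocking runner until enough points have been sampled.
    adaptive.runner.simple(learner, goal=lambda l: l.npoints > 200)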

data

Sampled points and values.

Type

dict

points

Coordinates of the currently known points.

Type

numpy array

values

The values of each of the known points.

Type

numpy array

pending_points

Points that still have to be evaluated.

Type

set

Notes

The sample points are chosen by estimating the point where the gradient is maximal. This is based on the currently known points.

In practice, this sampling protocol results in sparser sampling of flat regions and denser sampling of regions where the function has a large gradient, which is useful if the function is expensive to compute.

This sampling procedure is not fast, so to benefit from it your function needs to be expensive enough to evaluate.

This class keeps track of all known points. It triangulates these points and associates a loss with every simplex. When you request points that will be computed in the future, it subtriangulates the real simplex that contains those pending points and distributes the simplex’s loss among its children in proportion to their volume.
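
For illustration, a hand-written loop that mimics what a Runner does (continuing the sketch above; any other way of evaluating the requested points works as well):

    # Request 4 new points; with tell_pending=True (the default) they are
    # registered as pending so they will not be suggested again.
    points, loss_improvements = learner.ask(4)

    # Evaluate the function and report the results back to the learner.
    for point in points:
        learner.tell(point, ring_of_fire(point))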

ask(n, tell_pending=True)[source]

Choose points for the learner.

property bounds_are_done

data: dict

inside_bounds(point)[source]

Check whether a point is inside the bounds.

loss(real=True)[source]

Return the loss for the current state of the learner.

Parameters

real (bool, default: True) – If False, return the “expected” loss, i.e. the loss including the as-yet unevaluated points (possibly by interpolation).
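
For example (continuing the sketch above; the numerical values depend on the sampled data):

    pending, _ = learner.ask(2)   # request points without evaluating them yet

    learner.loss()                # loss based on the evaluated points only
    learner.loss(real=False)      # expected loss, also accounting for pending points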

property npoints

Number of evaluated points.

pending_points: set

plot(n=None, tri_alpha=0)[source]

Plot the function we want to learn; this only works in 2D.

Parameters
  • n (int) – the number of boxes in the interpolation grid along each axis

  • tri_alpha (float (0 to 1)) – Opacity of triangulation lines
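
A sketch of typical use in a Jupyter notebook (requires holoviews; only meaningful for a 2D domain, such as the example learner above):

    import holoviews as hv
    hv.extension('bokeh')

    # Interpolate on a 200x200 grid and overlay the triangulation lines.
    learner.plot(n=200, tri_alpha=0.4)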

plot_3D(with_triangulation=False)[source]

Plot the learner’s data in 3D using plotly.

Does not work with the adaptive.notebook_integration.live_plot functionality.

Parameters

with_triangulation (bool, default: False) – Add the vertices to the plot.

Returns

plot – The 3D plot of learner.data.

Return type

plotly.offline.iplot object

plot_isoline(level=0.0, n=None, tri_alpha=0)[source]

Plot the isoline at a specific level; this only works in 2D.

Parameters
  • level (float, default: 0) – The value of the function at which you would like to see the isoline.

  • n (int) – The number of boxes in the interpolation grid along each axis. This is passed to plot.

  • tri_alpha (float) – The opacity of the overlaying triangulation. This is passed to plot.

Returns

The plot of the isoline(s). This overlays a plot with a holoviews.element.Path.

Return type

holoviews.core.Overlay
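
For example (continuing the 2D sketch above):

    # Show where the interpolated function equals 0.5, with a faint triangulation overlay.
    learner.plot_isoline(level=0.5, tri_alpha=0.3)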

plot_isosurface(level=0.0, hull_opacity=0.2)[source]

Plots a linearly interpolated isosurface.

This is the 3D analog of an isoline. Does not work with the adaptive.notebook_integration.live_plot functionality.

Parameters
  • level (float, default: 0.0) – The function value at which you want the isosurface.

  • hull_opacity (float, default: 0.2) – The opacity of the hull of the domain.

Returns

plot – The plot object of the isosurface.

Return type

plotly.offline.iplot object

plot_slice(cut_mapping, n=None)[source]

Plot a 1D or 2D interpolated slice of an N-dimensional function.

Parameters
  • cut_mapping (dict (int → float)) – The value at which to fix each cut dimension; the remaining dimensions are interpolated. E.g. cut_mapping = {0: 1} fixes dimension 0 (‘x’) at the value 1.

  • n (int) – the number of boxes in the interpolation grid along each axis
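
For instance, assuming a learner_3d with a three-dimensional domain (a hypothetical learner, not the 2D example above):

    # Fix dimension 2 ('z') at 0.5 and interpolate the remaining (x, y) plane.
    learner_3d.plot_slice(cut_mapping={2: 0.5}, n=100)

    # Fixing two of the three dimensions gives a 1D slice along the free axis.
    learner_3d.plot_slice(cut_mapping={1: 0.0, 2: 0.5})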

property points

Get the points from data as a numpy array.

remove_unfinished()[source]

Remove uncomputed data from the learner.

tell(point, value)[source]

Tell the learner about a single value.

Parameters
  • point (A value from the function domain) –

  • value (A value from the function image) –

tell_pending(point, *, simplex=None)[source]

Tell the learner that the point has been requested, so that it is not suggested again.
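
A sketch of feeding in data by hand, e.g. when reusing previously computed results (the points and values below are made up and refer to the 2D example learner above):

    # Report an already-computed value.
    learner.tell((0.0, 0.0), 1.0)

    # Mark a point as scheduled for evaluation elsewhere so that it is not
    # suggested again; its value can be reported later with tell().
    learner.tell_pending((0.5, 0.5))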

property tri

An adaptive.learner.triangulation.Triangulation instance with all the points of the learner.
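
For example (continuing the sketch above; vertices and simplices are attributes of the Triangulation class):

    tri = learner.tri
    len(tri.vertices)    # number of triangulated points
    len(tri.simplices)   # number of simplices in the current triangulation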

property values

Get the values from data as a numpy array.

property vdim

Length of the output of learner.function. If the output is unsized (i.e. the function returns a scalar), then vdim = 1.

As long as no data is known, vdim = 1.
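
For example (vector_valued and learner_vec are illustrative names; vdim only becomes known once the first value has been received):

    import numpy as np
    import adaptive

    def vector_valued(xy):
        x, y = xy
        return np.array([x + y, x - y, x * y])   # maps R^2 -> R^3

    learner_vec = adaptive.LearnerND(vector_valued, bounds=[(-1, 1), (-1, 1)])
    learner_vec.vdim   # 1, since no data is known yet

    learner_vec.tell((0.5, 0.5), vector_valued((0.5, 0.5)))
    learner_vec.vdim   # 3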

Custom loss functions

adaptive.learner.learnerND.default_loss(simplex, values, value_scale)[source]

Computes the loss of a simplex as its volume in the space spanned by the domain coordinates and the scaled function values, so that both the size of the simplex and the deviation of the function from a linear estimate contribute to the loss.

Parameters
  • simplex (list of tuples) – Each entry is one point of the simplex.

  • values (list of values) – The scaled function values of each of the simplex points.

  • value_scale (float) – The scale of values, where values = function_values * value_scale.

Returns

loss

Return type

float

adaptive.learner.learnerND.uniform_loss(simplex, values, value_scale)[source]

Uniform loss: the loss of a simplex is its volume in the domain, so the domain is sampled uniformly, independently of the function values.

Parameters
  • simplex (list of tuples) – Each entry is one point of the simplex.

  • values (list of values) – The scaled function values of each of the simplex points.

  • value_scale (float) – The scale of values, where values = function_values * value_scale.

Returns

loss

Return type

float

adaptive.learner.learnerND.std_loss(simplex, values, value_scale)[source]

Computes the loss of the simplex based on the standard deviation of the function values at its vertices.

Parameters
  • simplex (list of tuples) – Each entry is one point of the simplex.

  • values (list of values) – The scaled function values of each of the simplex points.

  • value_scale (float) – The scale of values, where values = function_values * value_scale.

Returns

loss

Return type

float
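
A custom loss must have the same (simplex, values, value_scale) signature as the functions above and is passed to LearnerND via loss_per_simplex. A minimal sketch (the value-spread heuristic below is purely illustrative, not a built-in loss):

    import numpy as np
    import adaptive

    def value_spread_loss(simplex, values, value_scale):
        # Illustrative loss: the largest spread of the scaled function values
        # over the vertices of the simplex; flat simplices get a low loss.
        values = np.asarray(values)   # shape (n_vertices,) or (n_vertices, vdim)
        return float(np.max(np.ptp(values, axis=0)))

    learner = adaptive.LearnerND(ring_of_fire, bounds=[(-1, 1), (-1, 1)],
                                 loss_per_simplex=value_spread_loss)

In practice one would usually also factor in the volume of the simplex, as the built-in losses do, so that large simplices keep being refined even where the function happens to be flat.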