adaptive.Learner2D

class adaptive.Learner2D(function, bounds, loss_per_triangle=None)[source]

Bases: adaptive.learner.base_learner.BaseLearner

Learns and predicts a function ‘f: ℝ^2 → ℝ^N’.

Parameters
  • function (callable) – The function to learn. Must take a tuple of two real parameters and return a real number (or, for vector output, an array of real numbers; see vdim).

  • bounds (list of 2-tuples) – A list [(a1, b1), (a2, b2)] containing bounds, one per dimension.

  • loss_per_triangle (callable, optional) – A function that returns the loss for every triangle. If not provided, then a default is used, which uses the deviation from a linear estimate, as well as triangle area, to determine the loss. See the notes for more details.
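For reference, a minimal sketch of constructing and running the learner might look as follows (the quadratic test function and the loss goal of 0.01 are illustrative choices; the simple runner is used for brevity):

>>> import adaptive
>>>
>>> def ring(xy):
...     x, y = xy
...     return x**2 + y**2
>>>
>>> learner = adaptive.Learner2D(ring, bounds=[(-1, 1), (-1, 1)])
>>> adaptive.runner.simple(learner, goal=lambda l: l.loss() < 0.01)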

data

Sampled points and values.

Type

dict

pending_points

Points that still have to be evaluated and are currently interpolated, see data_combined.

Type

set

stack_size

The size of the new candidate points stack. Set it to 1 to recalculate the best points at each call to ask.

Type

int, default: 10

aspect_ratio

Average ratio of x span over y span of a triangle. If there is more detail in either x or y, adjust the aspect_ratio accordingly. When aspect_ratio > 1 the triangles are stretched along x, otherwise along y.

Type

float, int, default: 1
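Both attributes can be overridden on an instance after construction. A small sketch, with illustrative values:

>>> import adaptive
>>>
>>> learner = adaptive.Learner2D(lambda xy: xy[0]**2 + xy[1]**2,
...                              bounds=[(-1, 1), (-1, 1)])
>>> learner.stack_size = 1    # recompute the best candidate points on every call to ask
>>> learner.aspect_ratio = 2  # triangles stretched along x (x span twice the y span on average)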

data_combined

Sampled points and values so far, including the unknown interpolated points in pending_points.

Type

dict

Notes

Adapted from an initial implementation by Pauli Virtanen.

The sample points are chosen by estimating the point where the linear and cubic interpolants based on the existing points have maximal disagreement. This point is then taken as the next point to be sampled.

In practice, this sampling protocol results in sparser sampling of smooth regions and denser sampling of regions where the function changes rapidly, which is useful if the function is expensive to compute.

This sampling procedure is not extremely fast, so to benefit from it, your function needs to be slow enough to compute.

loss_per_triangle takes a single parameter, ip, which is a scipy.interpolate.LinearNDInterpolator. You can use the undocumented attributes tri and values of ip to get a scipy.spatial.Delaunay and a vector of function values. These can be used to compute the loss. The functions areas and deviations can be used to calculate the areas of the triangles and the deviations from a linear interpolation over each triangle.
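As a sketch of what such a custom loss might look like, using only the ip.tri and ip.values attributes mentioned above (the name value_spread_loss is hypothetical):

>>> import numpy as np
>>> import adaptive
>>>
>>> def value_spread_loss(ip):
...     """Hypothetical loss: spread of the function values over each triangle."""
...     values = ip.values[ip.tri.simplices]    # shape (n_triangles, 3, n_outputs)
...     return np.ptp(values, axis=1).max(axis=-1)
>>>
>>> learner = adaptive.Learner2D(lambda xy: xy[0] * xy[1],
...                              bounds=[(-1, 1), (-1, 1)],
...                              loss_per_triangle=value_spread_loss)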

ask(n, tell_pending=True)[source]

Choose the next ‘n’ points to evaluate.

Parameters
  • n (int) – The number of points to choose.

  • tell_pending (bool, default: True) – If True, add the chosen points to this learner’s pending_points. Set this to False if you do not want to modify the state of the learner.
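A minimal sketch of driving the learner by hand with ask and tell (the function values are computed inline here):

>>> import adaptive
>>>
>>> learner = adaptive.Learner2D(lambda xy: xy[0]**2 + xy[1]**2,
...                              bounds=[(-1, 1), (-1, 1)])
>>> points, _ = learner.ask(4)           # the chosen points are also added to pending_points
>>> for x, y in points:
...     learner.tell((x, y), x**2 + y**2)
>>> learner.loss()                       # loss of the evaluated data
>>> learner.loss(real=False)             # "expected" loss, including pending points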

bounds_are_done
data_combined()[source]

Like data, but including the points in pending_points, for which the values are interpolated.

inside_bounds(xy)[source]
ip()[source]

A scipy.interpolate.LinearNDInterpolator instance containing the learner’s data.

ip_combined()[source]

A scipy.interpolate.LinearNDInterpolator instance containing the learner’s data and interpolated data of the pending_points.

loss(real=True)[source]

Return the loss for the current state of the learner.

Parameters

real (bool, default: True) – If False, return the “expected” loss, i.e. the loss including the as-yet unevaluated points (possibly by interpolation).

npoints

Number of evaluated points.

plot(n=None, tri_alpha=0)[source]

Plot the Learner2D’s current state.

This plot function interpolates the data on a regular grid. The grid spacing is determined by checking the size of the smallest triangle.

Parameters
  • n (int) – Number of points in x and y. If None (default) this number is evaluated by looking at the size of the smallest triangle.

  • tri_alpha (float) – The opacity (0 <= tri_alpha <= 1) of the triangles overlaid on top of the image. By default the triangulation is not visible.

Returns

plot – A holoviews.core.Overlay of holoviews.Image * holoviews.EdgePaths. If the learner.function returns a vector output, a holoviews.core.HoloMap of holoviews.core.Overlays will be returned.

Return type

holoviews.core.Overlay or holoviews.core.HoloMap

Notes

The plot object that is returned if learner.function returns a vector cannot be used with the live_plotting functionality.
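Assuming holoviews is installed with a plotting backend, usage on a learner constructed as above might look like:

>>> import holoviews as hv
>>> hv.extension('matplotlib')          # or 'bokeh'
>>>
>>> plot = learner.plot(tri_alpha=0.4)  # overlay the triangulation at 40% opacity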

remove_unfinished()[source]

Remove uncomputed data from the learner.

tell(point, value)[source]

Tell the learner about a single value.

Parameters
  • point (A value from the function domain) –

  • value (A value from the function image) –

tell_pending(point)[source]

Tell the learner that ‘point’ has been requested, such that it is not suggested again.

vdim

Length of the output of learner.function. If the output is unsized (when it’s a scalar) then vdim = 1.

As long as no data is known vdim = 1.
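A sketch of a vector-valued function, for which vdim becomes 2 once a value is known (the function itself is illustrative):

>>> import numpy as np
>>> import adaptive
>>>
>>> def f_vec(xy):
...     x, y = xy
...     return np.array([x + y, x * y])   # two outputs
>>>
>>> learner = adaptive.Learner2D(f_vec, bounds=[(-1, 1), (-1, 1)])
>>> learner.vdim                          # 1, since no data is known yet
>>> learner.tell((0.5, 0.5), f_vec((0.5, 0.5)))
>>> learner.vdim                          # 2, inferred from the stored value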

xy_scale

Custom loss functions

adaptive.learner.learner2D.default_loss(ip)[source]

The default loss function, which uses the deviation from a linear estimate as well as the triangle area to compute the loss (see the notes above).

adaptive.learner.learner2D.minimize_triangle_surface_loss(ip)[source]

Loss function that is similar to the default loss function in the Learner1D. The loss is the area spanned by the 3D vectors of the vertices.

Works with Learner2D only.

Examples

>>> import adaptive
>>> from adaptive.learner.learner2D import minimize_triangle_surface_loss
>>> def f(xy):
...     x, y = xy
...     return x**2 + y**2
>>>
>>> learner = adaptive.Learner2D(f, bounds=[(-1, 1), (-1, 1)],
...     loss_per_triangle=minimize_triangle_surface_loss)
>>>
adaptive.learner.learner2D.uniform_loss(ip)[source]

Loss function that samples the domain uniformly.

Works with Learner2D only.

Examples

>>> import adaptive
>>> from adaptive.learner.learner2D import uniform_loss
>>> def f(xy):
...     x, y = xy
...     return x**2 + y**2
>>>
>>> learner = adaptive.Learner2D(f,
...                              bounds=[(-1, 1), (-1, 1)],
...                              loss_per_triangle=uniform_loss)
>>>
adaptive.learner.learner2D.resolution_loss_function(min_distance=0, max_distance=1)[source]

Loss function that is similar to the default_loss function, but allows setting the minimum and maximum size of a triangle.

Works with Learner2D only.

The arguments min_distance and max_distance should be between 0 and 1, because the total area is normalized to 1.

Examples

>>> import adaptive
>>> from adaptive.learner.learner2D import resolution_loss_function
>>>
>>> def f(xy):
...     x, y = xy
...     return x**2 + y**2
>>>
>>> loss = resolution_loss_function(min_distance=0.01, max_distance=1)
>>> learner = adaptive.Learner2D(f,
...                              bounds=[(-1, 1), (-1, 1)],
...                              loss_per_triangle=loss)
>>>

Helper functions

adaptive.learner.learner2D.areas(ip)[source]

Returns the area per triangle of the triangulation inside a LinearNDInterpolator instance.

Is useful when defining custom loss functions.

Parameters

ip (scipy.interpolate.LinearNDInterpolator instance) –

Returns

The area per triangle in ip.tri.

Return type

numpy array

adaptive.learner.learner2D.deviations(ip)[source]

Returns the deviation of the linear estimate.

Is useful when defining custom loss functions.

Parameters

ip (scipy.interpolate.LinearNDInterpolator instance) –

Returns

The deviation per triangle.

Return type

numpy array
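
As a sketch, the two helpers could be combined into a custom loss that resembles the default one (the name and the exact weighting are assumptions, not the library's implementation):

>>> import numpy as np
>>> from adaptive.learner.learner2D import areas, deviations
>>>
>>> def deviation_area_loss(ip):
...     """Hypothetical loss: deviation from the linear estimate, weighted by triangle area."""
...     dev = np.sum(deviations(ip), axis=0)   # sum over the output dimensions
...     return dev * np.sqrt(areas(ip))        # one loss value per triangle
>>>
>>> # pass as loss_per_triangle=deviation_area_loss when constructing a Learner2D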