adaptive.Learner1D

class adaptive.Learner1D(*args, **kwargs)[source]

Bases: adaptive.learner.base_learner.BaseLearner

Learns and predicts a function ‘f:ℝ → ℝ^N’.

Parameters
  • function (callable) – The function to learn. Must take a single real parameter and return a real number, or a vector of reals (ℝ^N) in the case of vector output.

  • bounds (pair of reals) – The bounds of the interval on which to learn ‘function’.

  • loss_per_interval (callable, optional) – A function that returns the loss for a single interval of the domain. If not provided, a default is used that measures the scaled distance in the x-y plane. See the notes for more details.

data

Sampled points and values.

Type

dict

pending_points

Points that still have to be evaluated.

Type

set

Notes

loss_per_interval takes two parameters, xs and ys, and returns a scalar: the loss over the interval.

xs : tuple of floats

The x values of the interval; if nth_neighbors is greater than zero, it also contains the x-values of the neighbors of the interval, in ascending order. The interval we want to know the loss of is then the middle interval. If no neighbor is available (at the edges of the domain), then None takes the place of the x-value of the neighbor.

ys : tuple of function values

The output values of the function when evaluated at the xs. Each value is either a float, or a tuple of floats in the case of vector output.

The loss_per_interval function may also have an attribute nth_neighbors that indicates how many of the intervals neighboring the interval are used. If loss_per_interval doesn’t have such an attribute, it is assumed that it uses no neighboring intervals. Also see the uses_nth_neighbors decorator for more information.
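For illustration, a minimal custom loss with this signature (distance_loss is a hypothetical example, not part of the library) could score an interval by the Euclidean distance between its endpoints in the x-y plane, assuming scalar output:

>>> import adaptive
>>> import numpy as np
>>>
>>> def distance_loss(xs, ys):  # hypothetical custom loss
...     if any(y is None for y in ys):  # interval with missing data
...         return 0
...     dx = xs[1] - xs[0]  # width in x
...     dy = ys[1] - ys[0]  # change in y (assumes scalar output)
...     return np.hypot(dx, dy)
>>>
>>> learner = adaptive.Learner1D(np.sin, bounds=(0, 10),
...                              loss_per_interval=distance_loss)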

ask(n, tell_pending=True)[source]

Return ‘n’ points that are expected to maximally reduce the loss.
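A sketch of a manual ask/tell loop built on this method (f is a stand-in for the function being learned):

>>> import adaptive
>>>
>>> def f(x):
...     return x**2
>>>
>>> learner = adaptive.Learner1D(f, bounds=(-1, 1))
>>> points, loss_improvements = learner.ask(4)
>>> for x in points:
...     learner.tell(x, f(x))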

loss(real=True)[source]

Return the loss for the current state of the learner.

Parameters

real (bool, default: True) – If False, return the “expected” loss, i.e. the loss including the as-yet unevaluated points (possibly by interpolation).
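Continuing the ask/tell sketch above, both modes can be queried at any time:

>>> learner.loss()            # loss over the evaluated data only
>>> learner.loss(real=False)  # "expected" loss, including pending points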

property npoints

Number of evaluated points.

plot(*, scatter_or_line='scatter')[source]

Returns a plot of the evaluated data.

Parameters

scatter_or_line (str, default: "scatter") – Plot as a scatter plot (“scatter”) or a line plot (“line”).

Returns

plot – Plot of the evaluated data.

Return type

holoviews.Overlay
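For example, in a notebook (holoviews must be installed; adaptive.notebook_extension() sets up the plotting backend):

>>> adaptive.notebook_extension()
>>> learner.plot(scatter_or_line="line")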

remove_unfinished()[source]

Remove uncomputed data from the learner.

tell(x, y)[source]

Tell the learner about a single value.

Parameters
  • x (A value from the function domain) –

  • y (A value from the function image) –

tell_many(xs, ys, *, force=False)[source]

Tell the learner about some values.

Parameters
  • xs (Iterable of values from the function domain) –

  • ys (Iterable of values from the function image) –
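For instance, a learner can be seeded with precomputed values on a uniform grid (a sketch, reusing f and the learner from the ask example above):

>>> import numpy as np
>>>
>>> xs = np.linspace(-1, 1, 11)
>>> learner.tell_many(xs, [f(x) for x in xs])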

tell_pending(x)[source]

Tell the learner that ‘x’ has been requested such that it’s not suggested again.

property vdim

Length of the output of learner.function. If the output is unsized (i.e., when it is a scalar), then vdim = 1.

As long as no data is known, vdim = 1.
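A sketch with vector-valued output; vdim becomes 2 once the first value is known:

>>> learner = adaptive.Learner1D(lambda x: (x, x**2), bounds=(0, 1))
>>> learner.vdim  # no data known yet
1
>>> learner.tell(0.5, (0.5, 0.25))
>>> learner.vdim
2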

Custom loss functions

adaptive.learner.learner1D.default_loss(xs, ys)[source]

Calculate loss on a single interval.

Currently returns the rescaled length of the interval. If one of the y-values is missing, returns 0 (so intervals with missing data are never touched; this behavior should be improved later).

adaptive.learner.learner1D.uniform_loss(xs, ys)[source]

Loss function that samples the domain uniformly.

Works with Learner1D only.

Examples

>>> from adaptive.learner.learner1D import uniform_loss
>>>
>>> def f(x):
...     return x**2
>>>
>>> learner = adaptive.Learner1D(f,
...                              bounds=(-1, 1),
...                              loss_per_interval=uniform_loss)
adaptive.learner.learner1D.uses_nth_neighbors(n)[source]

Decorator to specify how many neighboring intervals the loss function uses.

Wraps loss functions to indicate that they expect intervals together with their n nearest neighbors.

The loss function will then receive the data of the n nearest neighbors (nth_neighbors) along with the data of the interval itself in a dict. The Learner1D will also make sure that the loss is updated whenever one of the nth_neighbors changes.

Examples

The following function is part of the curvature_loss_function.

>>> from collections.abc import Iterable
>>> from adaptive.learner.learnerND import volume
>>> from adaptive.learner.triangulation import simplex_volume_in_embedding
>>>
>>> @uses_nth_neighbors(1)
... def triangle_loss(xs, ys):
...     xs = [x for x in xs if x is not None]
...     ys = [y for y in ys if y is not None]
...
...     if len(xs) == 2:  # we do not have enough points for a triangle
...         return xs[1] - xs[0]
...
...     N = len(xs) - 2  # number of constructed triangles
...     if isinstance(ys[0], Iterable):
...         pts = [(x, *y) for x, y in zip(xs, ys)]
...         vol = simplex_volume_in_embedding
...     else:
...         pts = [(x, y) for x, y in zip(xs, ys)]
...         vol = volume
...     return sum(vol(pts[i:i + 3]) for i in range(N)) / N

Or you may define a loss that favours the (local) minima of a function, assuming that you know your function will have a single float as output.

>>> @uses_nth_neighbors(1)
... def local_minima_resolving_loss(xs, ys):
...     dx = xs[2] - xs[1]  # the width of the interval of interest
...
...     # a higher neighbor on either side suggests a local minimum nearby
...     if ((ys[0] is not None and ys[0] > ys[1])
...             or (ys[3] is not None and ys[3] > ys[2])):
...         return dx * 100
...
...     return dx

adaptive.learner.learner1D.triangle_loss(xs, ys)[source]

adaptive.learner.learner1D.curvature_loss_function(area_factor=1, euclid_factor=0.02, horizontal_factor=0.02)[source]
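As in the adaptive tutorial, curvature_loss_function (which builds on triangle_loss) returns a loss function that can be passed straight to Learner1D; f is again a stand-in for the function being learned:

>>> from adaptive.learner.learner1D import curvature_loss_function
>>>
>>> curvature_loss = curvature_loss_function()
>>> learner = adaptive.Learner1D(f, bounds=(-1, 1),
...                              loss_per_interval=curvature_loss)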