BayesianMaximization

BayesianMaximization[f,{conf1,conf2,…}]

gives an object representing the result of Bayesian maximization of the function f over the configurations confi.

BayesianMaximization[f,reg]

maximizes over the region represented by the region specification reg.

BayesianMaximization[f,sampler]

maximizes over configurations obtained by applying the function sampler.

BayesianMaximization[f,{conf1,conf2,…}->nsampler]

applies the function nsampler to successively generate configurations starting from the confi.

Details and Options

  • BayesianMaximization[…] returns a BayesianMaximizationObject[…] whose properties can be obtained using BayesianMaximizationObject[…]["prop"].
  • Possible properties include:
  • "EvaluationHistory"configurations and values explored during maximization
    "MaximumConfiguration"configuration found that maximizes the result from f
    "MaximumValue"estimated maximum value obtained from f
    "Method"method used for Bayesian maximization
    "NextConfiguration"configuration to sample next if maximization were continued
    "PredictorFunction"best prediction model found for the function f
    "Properties"list of all available properties
  • Configurations can be of any form accepted by Predict (single data element, list of data elements, association of data elements, etc.) and of any type accepted by Predict (numerical, textual, sounds, images, etc.).
  • The function f must output a real-number value when applied to a configuration conf.
  • BayesianMaximization[f,…] attempts to find a good maximum using the smallest number of evaluations of f.
  • In BayesianMaximization[f,spec], spec defines the domain of the function f. A domain can be defined by a list of configurations, a geometric region, or a configuration generator function.
  • In BayesianMaximization[f,sampler], sampler[] must output a configuration suitable for f to be applied to it.
  • In BayesianMaximization[f,{conf1,conf2,…}->nsampler], nsampler[conf] must output a configuration suitable for f to be applied to it.
  • BayesianMaximization takes the following options:
  • AssumeDeterministic	False	whether to assume that f is deterministic
    InitialEvaluationHistory	None	initial set of configurations and values
    MaxIterations	100	maximum number of iterations
    Method	Automatic	method used to determine configurations to evaluate
    RandomSeeding	1234	what seeding of pseudorandom generators should be done internally
  • Possible settings for Method include:
  • Automatic	automatically choose the method
    "MaxExpectedImprovement"	maximize expected improvement over current best value
    "MaxImprovementProbability"	maximize improvement probability over current best value
  • Possible settings for RandomSeeding include:
  • Automatic	automatically reseed every time the function is called
    Inherited	use externally seeded random numbers
    seed	use an explicit integer or string as a seed

Examples


Basic Examples  (3)

Maximize a function over an interval:

Use the resulting BayesianMaximizationObject[] to get the estimated maximum configuration:

Get the estimated maximum function value:
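
The original input and output cells are not reproduced in this extraction; the following is a minimal sketch of the three steps above, assuming a simple one-dimensional objective and a Line region as the interval specification:

    (* maximize a smooth 1D objective over the interval [0, 10], specified as a Line region *)
    obj = BayesianMaximization[-Norm[# - {7}]^2 &, Line[{{0}, {10}}]]

    (* estimated maximizing configuration *)
    obj["MaximumConfiguration"]

    (* estimated maximum value of the objective *)
    obj["MaximumValue"]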

Maximize a function over a set of configurations:

Get the maximum configuration over the set:
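
A minimal sketch of the two steps above, assuming a simple numeric objective over a finite list of candidate configurations:

    (* maximize over an explicit set of candidate configurations *)
    obj = BayesianMaximization[-(# - 3)^2 &, Range[-10, 10]]

    (* configuration in the set with the largest estimated value *)
    obj["MaximumConfiguration"]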

Maximize a function over a domain defined by a random generator:

Get the estimated maximum value:
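
A sketch of the steps above, assuming a two-dimensional objective and a sampler that returns random points in a square:

    (* the sampler is called with no arguments and returns a random 2D configuration *)
    obj = BayesianMaximization[-Norm[# - {1, 2}]^2 &, RandomReal[{-5, 5}, 2] &]

    (* estimated maximum value *)
    obj["MaximumValue"]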

Scope  (3)

Maximize a function over a region:

Get the list of available properties to query:

Get the history of evaluations:

Get information about the method used to determine the configurations to explore:

Get the current probabilistic model of the function (this is a PredictorFunction):

Find the best configuration to explore if the maximization were continued:

Find a list of properties simultaneously:

Visualize how well the function is modeled, particularly near the maximum:
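
The inputs for this group of steps are not reproduced; a minimal sketch assuming a smooth objective on a disk region, with an illustrative cross-section plot through the estimated maximum at the end:

    f = Exp[-Norm[# - {1, 1}]^2] &;
    obj = BayesianMaximization[f, Disk[{0, 0}, 3]];

    obj["Properties"]              (* all available properties *)
    obj["EvaluationHistory"]       (* configurations and values explored *)
    obj["Method"]                  (* method used to pick configurations *)
    pf = obj["PredictorFunction"]  (* probabilistic model of f *)
    obj["NextConfiguration"]       (* where sampling would continue *)
    obj[{"MaximumConfiguration", "MaximumValue"}]

    (* compare the model with the true function along a line through the estimated maximum *)
    With[{xmax = obj["MaximumConfiguration"]},
     Plot[{f[xmax + {t, 0}], pf[xmax + {t, 0}]}, {t, -1, 1},
      PlotLegends -> {"f", "model"}]]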

Maximize a function with initial configurations over a domain defined by a random neighborhood configuration generator:

Get the model of the function:

Visualize the model's performance near the maximum:
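
A sketch of the steps above, assuming a one-dimensional objective, a few numeric starting configurations, and a random-walk neighborhood generator:

    f = -(# - 5)^2 &;

    (* start from configurations near 0; nsampler perturbs a configuration to get a neighbor *)
    obj = BayesianMaximization[f, {0., 0.5, 1.} -> (# + RandomReal[{-1, 1}] &)];

    pf = obj["PredictorFunction"];

    (* compare model and objective near the maximum at x = 5 *)
    Plot[{f[x], pf[x]}, {x, 3, 7}, PlotLegends -> {"f", "model"}]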

Define a function that takes an image and computes the probability, returned by ImageIdentify, that the image is identified as a particular entity, with the domain defined by a random generator over a corpus of images:

Maximize the function above:

Get the maximum configuration:

Get the evaluation history:

Options  (4)

AssumeDeterministic  (1)

Maximize a function over a domain defined by a random generator:

By default, the function is assumed to be stochastic; the value from the probabilistic model will in general differ from the function value for the same configuration:

Include information that the function is deterministic, i.e. noise free:

For a deterministic function, the model value and the function value for evaluated configurations agree to good precision:
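
A sketch of this comparison, assuming a simple deterministic 1D objective; the exact values shown on the original page are not reproduced:

    f = Sin[3 #] + #/2 &;

    (* default: f is treated as possibly noisy, so the model value at an evaluated
       configuration need not match f exactly *)
    obj = BayesianMaximization[f, RandomReal[{-3, 3}] &];
    conf = obj["MaximumConfiguration"];
    {obj["PredictorFunction"][conf], f[conf]}

    (* declare f deterministic (noise free); model and function values now agree closely *)
    objDet = BayesianMaximization[f, RandomReal[{-3, 3}] &, AssumeDeterministic -> True];
    confDet = objDet["MaximumConfiguration"];
    {objDet["PredictorFunction"][confDet], f[confDet]}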

InitialEvaluationHistory  (1)

Maximize a function over a disk region with a small number of iterations:

Use the information from this evaluation in the next:

Get the estimated maximum configuration now:
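
A sketch of reusing earlier evaluations; it assumes that the value of the "EvaluationHistory" property can be passed directly as the setting for InitialEvaluationHistory, which is not confirmed by the text above:

    f = -Norm[# - {0.3, 0.4}]^2 &;

    (* a short first run over the unit disk *)
    obj1 = BayesianMaximization[f, Disk[], MaxIterations -> 10];

    (* continue, seeding the model with the configurations and values already evaluated *)
    obj2 = BayesianMaximization[f, Disk[],
      InitialEvaluationHistory -> obj1["EvaluationHistory"]];

    obj2["MaximumConfiguration"]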

MaxIterations  (1)

Maximize a function with a domain defined by a random generator:

Get the number of function evaluations:

Specify the maximum number of iterations:
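
A sketch assuming a 1D objective and a random-number sampler; the "EvaluationHistory" property reflects how many evaluations were made:

    f = -(# - 2)^2 &;

    (* default: at most 100 iterations *)
    obj = BayesianMaximization[f, RandomReal[{-5, 5}] &];
    obj["EvaluationHistory"]

    (* restrict the search to 20 iterations *)
    obj20 = BayesianMaximization[f, RandomReal[{-5, 5}] &, MaxIterations -> 20]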

Method  (1)

Define a function over an interval region:

Specify the method for exploring configurations:

Specify a different method:
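
A sketch assuming a 1D objective over a Line region:

    f = -Norm[# - {3}]^2 &;
    reg = Line[{{0}, {10}}];

    (* maximize expected improvement over the current best value *)
    BayesianMaximization[f, reg, Method -> "MaxExpectedImprovement"]

    (* maximize the probability of improving on the current best value *)
    BayesianMaximization[f, reg, Method -> "MaxImprovementProbability"]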

Applications  (2)

Define a training set to train predictor functions using the Predict function and a test set to measure their performance:

Create a "log-likelihood" function to measure the log-likelihood of test data for different methods in Predict:

Maximize the log-likelihood function over a domain defined by a list of different methods for Predict:

Examine the evaluation history:

Find the best configuration to explore next:
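
A sketch of this application using the "BostonHomes" machine-learning example dataset and a handful of Predict methods; the dataset and method list used on the original page are not reproduced, and the "LogLikelihood" measurement name is an assumption:

    data = ExampleData[{"MachineLearning", "BostonHomes"}, "Data"];
    {train, test} = TakeDrop[RandomSample[data], 400];

    (* log-likelihood of the test set for a predictor trained with a given method *)
    loglikelihood[method_] :=
     PredictorMeasurements[Predict[train, Method -> method], test, "LogLikelihood"]

    (* maximize over a discrete domain of Predict methods *)
    obj = BayesianMaximization[loglikelihood,
      {"LinearRegression", "NearestNeighbors", "RandomForest", "GaussianProcess"}];

    obj["EvaluationHistory"]
    obj["NextConfiguration"]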

Load Fisher's Iris dataset and divide it into a training set and a validation set:

Create "blackbox" functions. Here the functions are the log-likelihood functions for two different methods used in the Classify function. The arguments of the functions are known as hyperparameters.

Train a logistic regression classifier with two hyperparameters, the L1 and L2 regularization coefficients:

Maximize the log-likelihood function for the logistic regression classifier over a domain defined by a rectangular region in the logarithm of the hyperparameters:

Get the model of the function:

Visualize the model of the log-likelihood function of the classifier together with the estimated maximum:
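
A sketch of the logistic-regression part, assuming the "L1Regularization" and "L2Regularization" sub-option names and the "LogLikelihood" measurement, with the hyperparameters searched over a rectangle in log space:

    iris = ExampleData[{"MachineLearning", "FisherIris"}, "Data"];
    {train, valid} = TakeDrop[RandomSample[iris], 100];

    (* validation log-likelihood as a function of the logs of the L1 and L2 coefficients *)
    logisticLogLikelihood[{logl1_, logl2_}] := ClassifierMeasurements[
      Classify[train, Method -> {"LogisticRegression",
         "L1Regularization" -> Exp[logl1], "L2Regularization" -> Exp[logl2]}],
      valid, "LogLikelihood"]

    obj = BayesianMaximization[logisticLogLikelihood, Rectangle[{-5, -5}, {5, 5}]];
    pf = obj["PredictorFunction"];

    (* model of the log-likelihood surface, with the estimated maximum marked *)
    ContourPlot[pf[{x, y}], {x, -5, 5}, {y, -5, 5},
     Epilog -> {Red, PointSize[Large], Point[obj["MaximumConfiguration"]]}]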

Now train a support vector machine (SVM) classifier with two hyperparameters, the soft margin parameter and the gamma scaling parameter:

Maximize the log-likelihood function for the SVM classifier over a domain defined by a rectangular region in the logarithm of the hyperparameters:

Get the model of the function:

Visualize the model of the log-likelihood function of the classifier together with the estimated maximum:
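
A corresponding sketch for the SVM part, assuming the "SoftMarginParameter" and "GammaScalingParameter" sub-option names; the data split mirrors the previous sketch:

    iris = ExampleData[{"MachineLearning", "FisherIris"}, "Data"];
    {train, valid} = TakeDrop[RandomSample[iris], 100];

    (* validation log-likelihood as a function of the logs of the SVM hyperparameters *)
    svmLogLikelihood[{logc_, loggamma_}] := ClassifierMeasurements[
      Classify[train, Method -> {"SupportVectorMachine",
         "SoftMarginParameter" -> Exp[logc], "GammaScalingParameter" -> Exp[loggamma]}],
      valid, "LogLikelihood"]

    objSVM = BayesianMaximization[svmLogLikelihood, Rectangle[{-5, -5}, {5, 5}]];
    pfSVM = objSVM["PredictorFunction"];

    (* model of the log-likelihood surface, with the estimated maximum marked *)
    ContourPlot[pfSVM[{x, y}], {x, -5, 5}, {y, -5, 5},
     Epilog -> {Red, PointSize[Large], Point[objSVM["MaximumConfiguration"]]}]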

Possible Issues  (2)

When the domain of the objective function is defined by an initial configuration set and a neighborhood configuration generator, the results depend on the quality of the generator provided.

Maximize a function where the domain is defined as above:

Get the model of the function:

Since the initial configurations are "far" from the global maximum and the generator takes relatively small steps, the algorithm could converge to a local maximum:
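
A sketch illustrating the issue, assuming a bimodal 1D objective with its global maximum near x = 8 and a local maximum near x = -4:

    f[x_] := Exp[-(x - 8)^2] + 0.5 Exp[-(x + 4)^2/4];

    (* initial configurations sit near the local maximum and the generator takes small steps *)
    obj = BayesianMaximization[f, {-5., -4.5, -4.} -> (# + RandomReal[{-0.3, 0.3}] &)];

    (* the estimate may stay near x = -4 instead of reaching the global maximum near x = 8 *)
    obj["MaximumConfiguration"]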

If the neighborhood configuration generator takes steps that are "too small" or "too large", this could lead to issues:

This takes a rather long time to evaluate. Since the generator takes very small steps each time, the algorithm keeps running until it reaches the default MaxIterations limit of 100 iterations:

Text

Wolfram Research (2016), BayesianMaximization, Wolfram Language function, https://reference.wolfram.com/language/ref/BayesianMaximization.html (updated 2017).

CMS

Wolfram Language. 2016. "BayesianMaximization." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2017. https://reference.wolfram.com/language/ref/BayesianMaximization.html.

APA

Wolfram Language. (2016). BayesianMaximization. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/BayesianMaximization.html

BibTeX

@misc{reference.wolfram_2023_bayesianmaximization, author="Wolfram Research", title="{BayesianMaximization}", year="2017", howpublished="\url{https://reference.wolfram.com/language/ref/BayesianMaximization.html}", note="[Accessed: 20-April-2024]"}

BibLaTeX

@online{reference.wolfram_2023_bayesianmaximization, organization={Wolfram Research}, title={BayesianMaximization}, year={2017}, url={https://reference.wolfram.com/language/ref/BayesianMaximization.html}, note={[Accessed: 20-April-2024]}}