The ForwardOptimizerModel class#
Aliases#
halerium.core.model.ForwardOptimizerModel
- class ForwardOptimizerModel(graph, adjustables, cost_function, reduce_dynamic_cost=<class 'halerium.core.operator.operators.Sum'>, data=None, compiler=None, solver='L-BFGS', batch_size=None, copy_graph=True, model_graph_options=None)#
Forward optimizer model class.
The model uses direct sampling to generate samples along the causal directions of the graph. Information therefore only travels forward along the causal connections defined by the variables' dependencies. Using this sampling method, the model performs gradient descent to find values for the adjustable parameters that minimize the provided cost function.
- Parameters:
graph (halerium.core.Graph) – The graph of the model.
adjustables – The variables to adjust in order to optimize the model. Starting values and boundaries for the variables can also be provided (see notes).
cost_function (callable) – The function that takes the graph as argument and returns an operator whose value is the objective function to minimize (see notes).
reduce_dynamic_cost (callable, optional) – The function to apply to dynamic costs to reduce to a scalar (see notes).
data (halerium.core.DataLinker, dict, None, optional) – The data linker or dict containing data constraining the model. The default is None.
compiler (optional) – The compiler instance or class for compiling the model. The default is None, in which case a Tensorflow compiler is used.
solver (str, None, optional) – The algorithm used to solve the model. Current choices are ‘L-BFGS’ (default) and ‘Adam’.
batch_size (int, None, optional) – The number of examples to average over in each step during optimization. The default is None, in which case the batch size depends on whether the cost is deterministic.
copy_graph (bool, optional) – Whether the model should make a copy of the graph for its own use, or keep the graph itself as an attribute. Users should leave this set to the default True unless they are certain that the graph won't be altered by the user or other code. Changes to a graph that a model holds directly (i.e. not as a copy) make that model inconsistent and are likely to cause errors.
model_graph_options (dict, optional) – The options for creating the model graph. The default is None.
Notes
Recognized ways to specify variables a, b, … as adjustables and to optionally provide starting values a_start, b_start, …, lower bounds a_lower_bound, b_lower_bound, …, and upper bounds a_upper_bound, b_upper_bound, … include:
- adjustables=[a, b, …]
- adjustables=[[a, a_start], [b, b_start], …]
- adjustables={a: a_start, b: b_start, …}
- adjustables=[[a, a_start, a_lower_bound, a_upper_bound], [b, b_start, b_lower_bound, b_upper_bound], …]
- adjustables={a: [a_start, a_lower_bound, a_upper_bound], b: [b_start, b_lower_bound, b_upper_bound], …}
Any of the provided starting values or bounds can be None, a number, or an array with a compatible shape.
The cost function takes the graph as its argument and is evaluated on the graph provided to the model. If the resulting cost is static, it must be a scalar, i.e. have shape=(). If the cost is dynamic, the provided reduce_dynamic_cost function must yield a scalar when applied to the dynamic cost.
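For illustration, a minimal construction sketch. The graph-building calls (Graph, Variable), the variable names x and y, and the quadratic cost are assumptions for this example, not part of the documented signature; only the ForwardOptimizerModel arguments mirror the parameter list above.

```python
from halerium.core import Graph, Variable
from halerium.core.model import ForwardOptimizerModel

# Hypothetical graph: y depends causally on the adjustable variable x.
with Graph("g") as g:
    Variable("x", mean=0, variance=1)
    Variable("y", mean=g.x + 1, variance=0.1)

def cost_function(graph):
    # Must return a scalar operator (shape=()) for a static cost.
    return (graph.y - 5) ** 2

model = ForwardOptimizerModel(
    graph=g,
    adjustables={g.x: [0.0, -10.0, 10.0]},  # {variable: [start, lower, upper]}
    cost_function=cost_function,
    solver="L-BFGS",
)
model.solve()
```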
- apply_to_samples(fetches, function, n_samples)#
Draw samples and apply a function to them.
- Parameters:
fetches – The variables to generate sample data for.
function (callable) – The function to apply to the sample data.
n_samples (int) – The number of samples to draw from the model.
- Returns:
The result of applying the function to the sampled data.
- Return type:
result
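A hypothetical usage sketch, continuing the construction example above (the variable g.y and the choice of numpy.mean as the reducing function are assumptions):

```python
import numpy as np

# Draw 500 samples of g.y and reduce them to their mean in a single call.
mean_y = model.apply_to_samples(fetches=g.y, function=np.mean, n_samples=500)
```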
- assert_is_trained()#
Check whether the model is trained.
- Return type:
None.
- Raises:
RuntimeWarning – If the model is not trained.
- property batch_size#
- property cost#
- property cost_is_deterministic#
- get_adjustables(return_bounds=False)#
- Parameters:
return_bounds (bool) – Whether to return the bounds, too.
- Returns:
adjustables – The adjustables. If return_bounds is False, in the form {variable: value}; if return_bounds is True, in the form {variable: [value, lower_bound, upper_bound]}.
- Return type:
dict
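A hypothetical usage sketch (the model instance is assumed from the construction example above):

```python
# Current values only: {variable: value}
values = model.get_adjustables()

# Values together with bounds: {variable: [value, lower_bound, upper_bound]}
values_and_bounds = model.get_adjustables(return_bounds=True)
```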
- get_cost_gradients(fetches, n_samples=1)#
Get samples of the gradient of the cost function with respect to the variables specified in fetches.
- Parameters:
fetches – The variables to generate sample data for.
n_samples (int) – The number of examples to draw from the model.
- Returns:
The gradient samples.
- Return type:
samples
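A hypothetical usage sketch (g.x as in the construction example):

```python
# Draw 10 samples of the cost gradient with respect to g.x.
grad_samples = model.get_cost_gradients(fetches=g.x, n_samples=10)
```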
- get_example(fetches)#
Draw an example from the model.
- Parameters:
fetches – The variables to generate example values for.
- Returns:
The example data.
- Return type:
example
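A hypothetical usage sketch (g.y as in the construction example):

```python
# Draw a single example value for g.y from the model.
example_y = model.get_example(fetches=g.y)
```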
- get_means(fetches, n_samples=100)#
Estimate mean values.
- Parameters:
fetches – The variables to estimate mean values for.
n_samples (int) – The number of samples to estimate the means from.
- Returns:
The estimated means of the variables.
- Return type:
means
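A hypothetical usage sketch (g.x, g.y, and passing fetches as a list are assumptions):

```python
# Estimate the means of g.x and g.y from 1000 samples.
means = model.get_means(fetches=[g.x, g.y], n_samples=1000)
```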
- get_posterior_graph(name=None, n_samples=100)#
Create posterior graph from trained model.
- Parameters:
name (str) – The name to give to the posterior graph.
n_samples (int) – The number of samples to estimate the posterior distributions from.
- Returns:
post_graph – The posterior graph.
- Return type:
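A hypothetical usage sketch:

```python
# Derive a posterior graph from the optimized model.
post_graph = model.get_posterior_graph(name="posterior", n_samples=200)
```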
- get_samples(fetches, n_samples=1)#
Draw samples from the model.
- Parameters:
fetches – The variables to generate sample data for.
n_samples (int) – The number of examples to draw from the model.
- Returns:
The sampled data.
- Return type:
samples
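A hypothetical usage sketch (g.y as in the construction example):

```python
# Draw 100 samples of g.y from the optimized model.
samples_y = model.get_samples(fetches=g.y, n_samples=100)
```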
- get_standard_deviations(fetches, n_samples=100)#
Estimate standard deviations.
- Parameters:
fetches – The variables to estimate standard deviations for.
n_samples (int) – The number of samples to estimate the standard deviations from.
- Returns:
The estimated standard deviations of the variables.
- Return type:
standard_deviations
- get_variances(fetches, n_samples=100)#
Estimate variances.
- Parameters:
fetches – The variables to estimate variances for.
n_samples (int) – The number of samples to estimate the variances from.
- Returns:
The estimated variances of the variables.
- Return type:
variances
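A hypothetical usage sketch covering both get_standard_deviations and get_variances (g.y as in the construction example):

```python
# Estimate spread statistics of g.y from 1000 samples.
std_y = model.get_standard_deviations(fetches=g.y, n_samples=1000)
var_y = model.get_variances(fetches=g.y, n_samples=1000)
```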
- property is_trained#
Whether the model has been trained.
- property model_graph#
The model graph (don’t modify it yourself).
- solve(solver=None, batch_size=None, **kwargs)#
Solve the model.
- Parameters:
solver (str, None) – The solver to use. If None, use the model’s solver.
batch_size (int, None) – The batch size. If None, use the model’s batch size.
kwargs – Any keyword arguments to pass to the minimizer.
- Return type:
None.
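A hypothetical usage sketch (the Adam solver and the batch size of 32 are arbitrary example choices; extra keyword arguments are passed on to the minimizer):

```python
# Re-solve with a different solver and batch size.
model.solve(solver="Adam", batch_size=32)
assert model.is_trained
```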
- property solver#