# enopt
Ensemble optimisation algorithm.
## EnOpt

Bases: Optimize
This is an implementation of the ensemble steepest-descent optimization algorithm EnOpt. The control variable is updated with a simple steepest- (or gradient-) descent step:

\[ x_{l+1} = x_l - \alpha C G, \]

where \(x\) is the control variable, \(l\) is the iteration index, \(\alpha\) is the step size, \(C\) is a smoothing matrix (e.g., the covariance matrix of \(x\)), and \(G\) is the ensemble gradient.
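As a minimal illustration of this update step (not the package's implementation), the sketch below applies the formula with placeholder values for \(x\), \(C\), \(G\), and \(\alpha\):

```python
import numpy as np

# Minimal sketch of one EnOpt control update, x_{l+1} = x_l - alpha * C @ G.
# All values below are illustrative placeholders, not library defaults.
rng = np.random.default_rng(0)

x = rng.uniform(size=5)      # current control vector x_l
C = 0.01 * np.eye(5)         # smoothing matrix (e.g., covariance of x)
G = rng.normal(size=5)       # ensemble gradient (however it was estimated)
alpha = 0.1                  # step size

x_next = x - alpha * C @ G   # steepest-descent step on the smoothed gradient
```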
Methods:
| Name | Description | 
|---|---|
| calc_update | Update using steepest descent method with ensemble gradient | 
References
Chen et al., 2009, 'Efficient Ensemble-Based Closed-Loop Production Optimization', SPE Journal, 14 (4): 634-645.
TODO: Implement getter for optimize_result
### `__init__(fun, x, args, jac, hess, bounds=None, **options)`
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| fun | callable | Objective function | required |
| x | ndarray | Initial state | required |
| args | tuple | Initial covariance | required |
| jac | callable | Gradient function | required |
| hess | callable | Hessian function | required |
| bounds | list | (min, max) pairs for each element in x. None is used to specify no bound. | None |
| options | dict | Optimization options | {} |
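A hedged usage sketch based only on the signature above. The import path, the callback signatures for fun/jac/hess, and whether calc_update must be called explicitly are assumptions and may differ in the actual package:

```python
import numpy as np

# The import path below is an assumption; adjust it to wherever EnOpt is
# defined in your installation.
from popt.update_schemes.enopt import EnOpt

def fun(x, *args):
    # Objective function (callback signature assumed): simple quadratic.
    return float(np.sum(x ** 2))

def jac(x, *args):
    # Gradient function (callback signature assumed): analytic gradient of fun.
    return 2.0 * x

def hess(x, *args):
    # Hessian function (callback signature assumed).
    return 2.0 * np.eye(x.size)

x0 = np.full(3, 0.5)       # initial state
cov = 0.01 * np.eye(3)     # initial covariance, passed through args
bounds = [(0.0, 1.0)] * 3  # (min, max) pairs for each element of x

opt = EnOpt(fun, x0, (cov,), jac, hess, bounds=bounds)
opt.calc_update()          # one steepest-descent update with the ensemble gradient
```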
### `calc_update()`

Update using the steepest-descent method with ensemble gradients.
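For background on where the ensemble gradient \(G\) comes from, the sketch below estimates a smoothed gradient in the spirit of Chen et al. (2009): perturb the controls with samples from \(N(x, C)\), evaluate the objective, and take the cross-covariance between the perturbations and the objective values. This illustrates the general technique only; it is not the internals of `calc_update`, and all names below are hypothetical.

```python
import numpy as np

def smoothed_ensemble_gradient(fun, x, C, n_ens=50, rng=None):
    """Estimate the smoothed gradient at x from an ensemble of perturbations.

    Illustrative only: the cross-covariance between perturbations drawn from
    N(x, C) and the corresponding objective values approximates C @ grad(fun)(x),
    i.e. the product C*G appearing in the EnOpt update.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = rng.multivariate_normal(mean=x, cov=C, size=n_ens)  # (n_ens, n_x) ensemble
    J = np.array([fun(xi) for xi in X])                     # objective per member
    dX = X - X.mean(axis=0)
    dJ = J - J.mean()
    return dX.T @ dJ / (n_ens - 1)

# Example with a quadratic objective; the estimate points roughly along 2 * C @ x.
x = np.array([0.5, -0.3])
C = 0.01 * np.eye(2)
cg = smoothed_ensemble_gradient(lambda v: float(np.sum(v ** 2)), x, C,
                                rng=np.random.default_rng(1))
x_next = x - 1.0 * cg  # steepest-descent step using the smoothed gradient
```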