Usage Examples#
This page provides some annotated examples showing how Espresso can be used.
Gradient descent#
from espresso import <testproblem> as testproblem

niterations = 100
epsilon = 0.01

tp = testproblem(example_number = 1)

model = tp.starting_model
for k in range(niterations):
    predictions, G = tp.forward(model, with_jacobian = True)
    residuals = tp.data - predictions
    model += epsilon * G.T.dot(residuals)
print(model)
The algorithm here is straightforward:
\[
\mathbf{m}_{k+1} = \mathbf{m}_k + \epsilon\,\mathbf{G}^T\left[\mathbf{d} - \mathbf{g}(\mathbf{m}_k)\right],
\]
with \(\mathbf{g}(\mathbf{m})\) representing simulated data for model \(\mathbf{m}\), \(\mathbf{d}\) the observed data, and \(\mathbf{G}\) representing the corresponding Jacobian, which has elements given by
\[
G_{ij} = \frac{\partial g_i}{\partial m_j}.
\]
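If it helps to see where this update comes from, it is simply steepest descent on the usual least-squares misfit (a standard derivation, not anything specific to Espresso):
\[
\phi(\mathbf{m}) = \tfrac{1}{2}\left\|\mathbf{d} - \mathbf{g}(\mathbf{m})\right\|^2,
\qquad
\nabla_{\mathbf{m}}\phi = -\mathbf{G}^T\left[\mathbf{d} - \mathbf{g}(\mathbf{m})\right],
\]
so taking a step of size \(\epsilon\) against the gradient, \(\mathbf{m}_{k+1} = \mathbf{m}_k - \epsilon\,\nabla_{\mathbf{m}}\phi\), gives exactly the update above.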
Let’s go through the code in detail and explain the Espresso-specific parts. First, we select and import one test problem from Espresso:
from espresso import <testproblem> as testproblem
Here, <testproblem> should be replaced by any valid problem name.
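As a concrete, hedged illustration, the sketch below uses SimpleRegression, which we assume is one of the problems bundled with the installed Espresso package; any other valid problem name is used in exactly the same way.

# A concrete version of the import line above. "SimpleRegression" is an
# assumed example of a bundled problem name; substitute any valid name.
from espresso import SimpleRegression as testproblem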
We then assign some values to variables representing the number of iterations of gradient descent and the learning rate \(\epsilon\). Next, we instantiate (initialise) the test problem we imported:
tp = testproblem(example_number = 1)
Individual test problems may contain multiple examples, to provide access to multiple datasets or showcase particular problem characteristics. These can be selected by setting example_number to the relevant number; consult the documentation for details of what each test problem provides.
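As a hedged sketch of what that selection looks like in practice (whether a second example exists is problem-specific, and model_size / data_size are taken here to be attributes of the Espresso problem interface):

tp_example_1 = testproblem(example_number = 1)
tp_example_2 = testproblem(example_number = 2)  # assumes this problem defines a second example

# Quick look at how the two examples differ in scale; model_size and
# data_size are assumed attributes of the problem interface.
print(tp_example_1.model_size, tp_example_1.data_size)
print(tp_example_2.model_size, tp_example_2.data_size)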
Once the instance of EspressoProblem has been created, it can be used to access various functions and attributes. Each example defines a sensible ‘null’ or ‘initial’ model to use for inversion, which we use to initialise our gradient descent algorithm:
model = tp.starting_model
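One optional precaution, sketched below: the loop updates model with an in-place NumPy operation, and we make no assumption about whether starting_model hands back a fresh array or a reference to the problem's internal one, so taking an explicit copy is a cheap safeguard.

import numpy as np

# Defensive variant of the initialisation above: work on a copy so that
# in-place updates to `model` cannot modify any array the problem object
# may hold internally.
model = np.array(tp.starting_model, dtype=float)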
We compute simulated data and the Jacobian for our current model estimate, and compare this to the ‘data’ embedded within our EspressoProblem:
predictions, G = tp.forward(model, with_jacobian = True)
residuals = tp.data - predictions
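As a small aside that uses only NumPy and the quantities already computed, the residual vector gives a convenient scalar misfit to print once per iteration if you want to watch the descent converge:

import numpy as np

# Scalar summary of how well the current model fits the data; this should
# (usually) decrease from one iteration to the next.
misfit = 0.5 * np.linalg.norm(residuals) ** 2
print(misfit)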
Finally, we update the model accordingly, and iterate until (hopefully!) a good model is found.
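To judge the result, the hedged sketch below compares the recovered model against the problem's reference model; good_model is taken here to be provided by the Espresso problem interface, and plot_model is an optional helper that not every problem implements.

import numpy as np

# Compare the recovered model with the problem's reference ("good") model.
difference = np.asarray(model) - np.asarray(tp.good_model)  # good_model: assumed attribute
print("RMS difference from reference model:", np.sqrt(np.mean(difference ** 2)))

# tp.plot_model(model)  # optional visualisation, if the problem provides it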