Estimate influences with the Influence Estimator#
The InfluenceEstimator is an objective class in Halerium for estimating the influence of a variable, entity, or (sub)graph on the variance of a specified target.
For a given graph, target, and element of the graph, the influence estimator calculates the relative reduction of the target’s variance when the given graph element is held fixed compared to when all variables are unconstrained.
Ingredients#
To estimate the influences, we need the following ingredients:

- a graph, and
- a target.
The target can be just a variable in the graph, or a function taking the graph as input.
Imports#
We import the packages, classes, and functions required for the code examples below.
[1]:
# for handling data:
import numpy as np
# for building graphs:
from halerium.core import Graph, Entity, Variable, StaticVariable, show
# for building models:
from halerium.core import get_generative_model, get_posterior_model
# for estimating influences:
from halerium import InfluenceEstimator
Basic example#
The graph#
Let’s create a graph first.
[2]:
g = Graph("g")
with g:
    e = Entity("e")
    with e:
        Variable("a", mean=0, variance=2)
        Variable("b", mean=0, variance=3)
        Variable("c", mean=0, variance=4)
    Variable("d", mean=e.a + e.b + e.c, variance=1)

show(g)
In this graph, the mean of variable d is given by the sum of the variables e.a, e.b, and e.c. The variances of e.a, e.b, and e.c are two, three, and four times as large as the variance of d for any fixed set of values for e.a, e.b, and e.c. The total variance of d, when e.a, e.b, and e.c are not fixed, is simply the sum of all these.
[3]:
generative_model = get_generative_model(g)
d_total_variance = generative_model.get_variances(g.d, n_samples=1000)
print("total variance of d =", d_total_variance)
total variance of d = [10.10254441]
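As a quick sanity check (plain arithmetic, not part of the Halerium API): since the parent variables and the noise term of d are independent here, their variances simply add, so the expected total is 10.

```python
# Expected total variance of d, assuming independent contributions
# (variances of independent terms add):
component_variances = {"e.a": 2, "e.b": 3, "e.c": 4, "d noise": 1}
expected_total = sum(component_variances.values())
print(expected_total)  # 10, close to the sampled estimate above
```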
The influence estimator#
Now let’s create an influence estimator for the graph g and the target g.d.
[4]:
ie = InfluenceEstimator(graph=g,
                        target=g.d)
We can now query the influence estimator by calling it with some graph element(s) as argument. For example, we can ask it to estimate the influence of the variable g.e.a by:
[5]:
ie(g.e.a)
[5]:
0.22220185608099374
The number returned is the relative reduction of the variance of the target when g.e.a is held fixed compared to the full variance of the target when no variable is held fixed. In the graph considered here, g.e.a contributes 20% to the variance of the target g.d. Thus the estimated relative variance reduction should be near 0.2.
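A hand check of that number (plain arithmetic, independent of Halerium): fixing e.a removes its variance contribution from the total.

```python
# Expected relative variance reduction when g.e.a is held fixed:
# Var(d) drops from 2 + 3 + 4 + 1 = 10 to 3 + 4 + 1 = 8.
total_variance = 2 + 3 + 4 + 1
variance_a_fixed = 3 + 4 + 1
expected_influence = (total_variance - variance_a_fixed) / total_variance
print(expected_influence)  # 0.2
```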
We can ask for the influences of all graph elements by omitting the call argument:
[6]:
ie()
[6]:
{'g': 1.0,
'g/e': 0.9159407995866957,
'g/e/a': 0.22220185608099374,
'g/e/b': 0.302906770122022,
'g/e/c': 0.39083217338368,
'g/d': 1.0}
The graph g itself as a whole accounts for all the variance in its member g.d, and thus its influence on g.d expressed as relative variance reduction is unity.
The same holds for the target itself: fixing its value removes any variance in it, so the relative variance reduction is 100%.
90% of the variance in g.d stems from variables in the entity g.e. Thus, the estimated influence of g.e on g.d is close to 0.9. Resolving the constituents of g.e, one finds that g.e.a, g.e.b, and g.e.c influence the target g.d by 20%, 30%, and 40%, respectively. The remaining 10% of the variance of g.d is irreducible noise.
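The whole breakdown can be reproduced by hand (a plain-arithmetic sketch, assuming the independence of the variables stated above):

```python
# Expected influences as fractions of the total variance of g.d:
parent_variances = {"g/e/a": 2, "g/e/b": 3, "g/e/c": 4}
total = sum(parent_variances.values()) + 1  # + 1 for the noise term of g.d
expected = {name: v / total for name, v in parent_variances.items()}
expected["g/e"] = sum(parent_variances.values()) / total  # fixing e fixes a, b, c at once
print(expected)  # {'g/e/a': 0.2, 'g/e/b': 0.3, 'g/e/c': 0.4, 'g/e': 0.9}
```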
Visualizing the influence estimator#
You can add the information of the objective to the graph visualization by using show and then activating the objective’s button in the bottom right of the canvas.
[7]:
show(ie)
Options#
Target#
As in the above example, the target can be a variable in the graph.
[8]:
ie = InfluenceEstimator(graph=g,
                        target=g.d)
ie()
[8]:
{'g': 1.0,
'g/e': 0.8942984478404212,
'g/e/a': 0.199091176983294,
'g/e/b': 0.29995665619246964,
'g/e/c': 0.3952506146646576,
'g/d': 1.0}
The target can also be an entity in the graph. Let’s make a graph with two entities.
[9]:
g2 = Graph("g2")
with g2:
    e1 = Entity("e1")
    with e1:
        Variable("a", mean=0, variance=2)
        Variable("b", mean=0, variance=3)
        Variable("c", mean=0, variance=4)
    e2 = Entity("e2")
    with e2:
        Variable("d1", variance=1)
        Variable("d2", variance=1)
    e2.d1.mean = e1.a + e1.b
    e2.d2.mean = e1.a + e1.c
Now take the second entity as target and estimate the influence of all the graph components on its variance.
[10]:
ie = InfluenceEstimator(graph=g2,
                        target=g2.e2)
ie()
[10]:
{'g2': 0.7793190412540283,
'g2/e1': 0.8799107327311396,
'g2/e1/a': 0.4582248784911632,
'g2/e1/b': 0.17902348299773718,
'g2/e1/c': 0.2426623712422392,
'g2/e2': 0.7793190412540283,
'g2/e2/d1': 0.36816158978351676,
'g2/e2/d2': 0.41115745147051147}
One can also take functions as targets. Such a function must take a graph as argument and return an operation. As an example, we define the function:
[11]:
def f(graph):
    return graph.e2.d1 + graph.e2.d2
Now we specify the function as target:
[12]:
ie = InfluenceEstimator(graph=g2,
                        target=f)
ie()
[12]:
{'g2': 0.7506897555952952,
'g2/e1': 0.8536942019034909,
'g2/e1/a': 0.44848251047837473,
'g2/e1/b': 0.16228885840705484,
'g2/e1/c': 0.24292283301806125,
'g2/e2': 0.7506897555952952,
'g2/e2/d1': 0.32748271279832525,
'g2/e2/d2': 0.42320704279696997}
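These estimates can be roughly checked by hand (a sketch assuming all source variables and noise terms are independent): since e1.a enters both d1 and d2, the target f = d1 + d2 equals 2·a + b + c plus the two noise terms.

```python
# Var(f) for f = d1 + d2 = 2*a + b + c + noise_d1 + noise_d2,
# assuming independence of a, b, c, and the noise terms:
var_a, var_b, var_c, var_n1, var_n2 = 2, 3, 4, 1, 1
var_f = 4 * var_a + var_b + var_c + var_n1 + var_n2  # Var(2*a) = 4 * Var(a)
print(var_f)              # 17
print(4 * var_a / var_f)  # ~0.47, expected influence of g2.e1.a
print(var_b / var_f)      # ~0.18, expected influence of g2.e1.b
print(var_c / var_f)      # ~0.24, expected influence of g2.e1.c
```

These fractions agree with the sampled estimates above to within the Monte-Carlo error.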
The visualization still works in the same way.
[13]:
show(ie)
Speed vs. Accuracy#
The influence estimator class takes an argument n_samples, which regulates how many samples are used when computing the variances. Fewer samples make the estimator faster but less accurate; more samples require more computing time but yield more accurate results.
Let’s try fast but inaccurate:
[14]:
ie = InfluenceEstimator(graph=g,
                        target=g.d,
                        n_samples=30)
ie()
[14]:
{'g': 1.0,
'g/e': 0.8174850455328879,
'g/e/a': 0.2531955170898377,
'g/e/b': 0.301689813971006,
'g/e/c': 0.26259971447204417,
'g/d': 1.0}
Now try a slower but more accurate estimator:
[15]:
ie = InfluenceEstimator(graph=g,
                        target=g.d,
                        n_samples=10000)
ie()
[15]:
{'g': 1.0,
'g/e': 0.9032381973643302,
'g/e/a': 0.20475754552666683,
'g/e/b': 0.2999833333235057,
'g/e/c': 0.3984973185141577,
'g/d': 1.0}
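The size of this trade-off can be illustrated with plain NumPy (a sketch independent of Halerium): the scatter of a Monte-Carlo variance estimate shrinks roughly like 1/sqrt(n_samples).

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_estimate_scatter(n_samples, n_repeats=200):
    # Standard deviation of the sample-variance estimate of a
    # unit-variance Gaussian, over repeated independent trials.
    estimates = [rng.standard_normal(n_samples).var() for _ in range(n_repeats)]
    return float(np.std(estimates))

print(variance_estimate_scatter(30))     # large scatter
print(variance_estimate_scatter(10000))  # much smaller scatter
```

This is why the n_samples=30 estimates above deviate visibly from the exact fractions 0.2, 0.3, and 0.4, while the n_samples=10000 estimates are close to them.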