

Probabilistic Events, Observations, and Bayesian Filters

Now consider the case of probabilistic uncertainty. We use $s_i$ to denote both the shadow with label $i$ and the corresponding random variable in the joint (multivariate) distribution. For shadows $s_1, \ldots, s_n$, the joint distribution is $P(s_1, \ldots, s_n)$, in which a specific entry is $P(s_1 = x_1, \ldots, s_n = x_n) \in [0, 1]$. In writing formulas and outlining algorithms, we usually abbreviate repeated variables to ``$\ldots$'' on both the left-hand side (LHS) and the right-hand side (RHS) of an expression; in such cases, the ``$\ldots$'' on the LHS and RHS denote the same set of random variables. For example, $P(s_1,s_2,s_3,s_4) = P(s_1,s_2,s_k,s_3,s_4)$ is shortened to $P(\ldots) = P(\ldots, s_k, \ldots)$.
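To make the notation concrete, the following is a minimal sketch (not from the original text) of representing a joint distribution $P(s_1, \ldots, s_n)$ over shadow random variables with small finite value sets as a table from value tuples to probabilities. The value set, the uniform initialization, and the helper names are assumptions for illustration; the `marginalize_out` function shows the kind of sum over one variable that the ``$\ldots$'' shorthand lets us write compactly.

```python
from itertools import product

def uniform_joint(n, values):
    """A uniform joint distribution P(s_1, ..., s_n), stored as a dict
    mapping each value tuple (x_1, ..., x_n) to P(s_1=x_1, ..., s_n=x_n).
    Uniformity is just a placeholder; any normalized table works."""
    entries = list(product(values, repeat=n))
    p = 1.0 / len(entries)
    return {e: p for e in entries}

def marginalize_out(joint, k):
    """Sum out the k-th variable (0-indexed): given a table for
    P(..., s_k, ...), return the table for P(...), where the
    '...' on both sides denote the same remaining variables."""
    result = {}
    for assignment, prob in joint.items():
        reduced = assignment[:k] + assignment[k + 1:]
        result[reduced] = result.get(reduced, 0.0) + prob
    return result

joint = uniform_joint(3, [0, 1])   # P(s_1, s_2, s_3), 8 entries of 1/8
marg = marginalize_out(joint, 1)   # P(s_1, s_3), 4 entries of 1/4
print(sum(marg.values()))          # still sums to 1
```

Note that an explicit table has size exponential in $n$; it is useful only for checking small examples of the notation, not as an implementation strategy.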




Jingjin Yu 2011-01-18