
Processing component events

To understand how observations affect agent distributions in a probabilistic setting, let us first look at the component events (we do not distinguish between events and observations here, since the two coincide by assumption). Among the four types of component events, split and disappear events require more care than appear and merge events: a split introduces uncertainty and a disappearance removes it, while appear and merge events are purely mechanical updates.

1) Split. A split event introduces more uncertainty. As a shadow splits into two disjoint shadows, the probability masses in the newly spawned shadows cannot be predicted without additional information, because the sensors cannot see what happens inside the shadow region during a split event. The issue is resolved by introducing a split rule, obtained from supporting data or an oracle, which dictates how the originating shadow's probability mass should be redistributed. For example, statistical data may support a rule in which the expected numbers of targets in the child shadows are proportional to their respective areas.
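As a concrete illustration, the area-proportional split rule mentioned above might be realized as follows. The binomial form (each target falls into a child shadow independently, with probability proportional to that child's area) is one plausible reading of such a rule, not a construction given in the text:

```python
from math import comb

def area_proportional_split(x, area_j, area_k):
    """One possible split rule (an illustrative assumption): each of the
    x targets in the parent shadow falls into child s_j independently
    with probability proportional to s_j's area, so the count in s_j is
    binomially distributed.  Returns P(s_j = m) for m = 0, ..., x; the
    count in s_k is then x - m."""
    p = area_j / (area_j + area_k)
    return [comb(x, m) * p ** m * (1 - p) ** (x - m) for m in range(x + 1)]

# A parent shadow holding 3 targets splits into two equal-area children.
dist = area_proportional_split(3, area_j=2.0, area_k=2.0)
# dist == [1/8, 3/8, 3/8, 1/8]: each child is expected to hold 1.5 targets.
```

Any other split rule supported by data (e.g., one favoring the child shadow adjacent to a known target path) would slot into the same place; only the returned distribution changes.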
2) Disappear. When a shadow disappears, the targets hiding behind it are revealed. This information can be used to update our belief about the target distribution by eliminating improbable arrangements of targets; in particular, it can reduce the uncertainty created by split events. For example, suppose that a shadow $ s_i$ , containing $ a_i$ targets (with $ 100\%$ probability), splits into shadows $ s_j$ and $ s_k$ . It is possible that $ s_j$ holds anywhere from 0 to $ a_i$ targets, as does $ s_k$ . However, if $ s_k$ later disappears to reveal $ a_k$ targets and no other events happen to $ s_j$ and $ s_k$ , then $ s_j$ must hold exactly $ a_i - a_k$ targets. In general, assuming that shadow $ s_k$ disappears with a target distribution $ P(s_k)$ , the update rule is given by

\begin{displaymath}
\begin{array}{l}
P'(s_1 = x_1, \ldots, s_{k-1} = x_{k-1}, s_{k+1} = x_{k+1}, \ldots, s_n = x_n) \\
\quad = \displaystyle\sum_{x_k} P(s_1 = x_1, \ldots, s_{k} = x_k, \ldots, s_n = x_n)\,P(s_k = x_k),
\end{array}\end{displaymath}

in which the summation is over all joint probability entries of $ P(s_1, \ldots, s_n)$ such that $ s_k = x_k$ . Normalization is required.
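The disappear update can be sketched on a small joint distribution. The dict-of-tuples representation and the shadow names below are illustrative choices, not part of the formulation:

```python
def disappear_update(joint, names, k_name, obs):
    """Shadow k_name disappears, revealing its targets with distribution
    obs (a map from count x_k to P(s_k = x_k)).  Each joint entry is
    reweighted by obs[x_k], s_k is summed out, and the result is
    normalized, mirroring the update rule in the text."""
    ik = names.index(k_name)
    out = {}
    for counts, p in joint.items():
        w = p * obs.get(counts[ik], 0.0)
        if w > 0.0:
            rest = tuple(c for i, c in enumerate(counts) if i != ik)
            out[rest] = out.get(rest, 0.0) + w
    z = sum(out.values())
    return {rest: w / z for rest, w in out.items()}, \
           [n for n in names if n != k_name]

# The example above with a_i = 3: after the split, s_j and s_k together
# hold 3 targets, with a uniform belief over the four ways to divide them.
joint = {(m, 3 - m): 0.25 for m in range(4)}
post, names = disappear_update(joint, ["s_j", "s_k"], "s_k", {1: 1.0})
# Revealing a_k = 1 target pins s_j to exactly 3 - 1 = 2 targets:
# post == {(2,): 1.0}
```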
3) Appear. An appearing shadow $ s_k$ , with distribution $ P(s_k)$ , can be incorporated by combining its independent distribution $ P(s_k)$ with $ P(s_1, \ldots, s_n)$ :

\begin{displaymath}
\begin{array}{l}
P'(s_1 = x_1, \ldots, s_n = x_n, s_{k} = x_k) \\
\quad = P(s_1 = x_1, \ldots, s_n = x_n)\,P(s_k = x_k).
\end{array}\end{displaymath}
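The appear update is a plain product of independent distributions; a minimal sketch, again using an illustrative dict-of-tuples representation:

```python
def appear_update(joint, names, new_name, dist):
    """An appearing shadow is independent of the existing ones, so every
    entry of the joint distribution is multiplied by each probability
    P(s_k = x_k) of the new shadow's distribution."""
    out = {}
    for counts, p in joint.items():
        for x_k, q in dist.items():
            out[counts + (x_k,)] = p * q
    return out, names + [new_name]

joint = {(1,): 0.4, (2,): 0.6}                  # belief over s_1 alone
post, names = appear_update(joint, ["s_1"], "s_2", {0: 0.5, 1: 0.5})
# post has four entries, e.g. post[(1, 0)] = 0.4 * 0.5 = 0.2, and the
# entries still sum to 1, so no normalization is needed here.
```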


4) Merge. In this case, two probability masses are collapsed into one: entries of the joint distribution whose counts for the merging shadows sum to the same total are collected into a single entry,

\begin{displaymath}
\begin{array}{l}
P'(\ldots, s_{k} = x_k) \\
\quad = \displaystyle\sum_{x_i + x_j = x_k} P(\ldots, s_i = x_i, \ldots, s_j = x_j, \ldots),
\end{array}\end{displaymath}

in which $ s_{k}$ is the merged shadow of shadows $ s_i$ and $ s_j$ . A detailed example is given in Table I in which the original shadows are $ s_1, s_2, s_3$ and $ s_2, s_3$ merge to form shadow $ s_4$ .
Table I: merging $ s_2$ and $ s_3$ into $ s_4$ .

Before merge:
$ P(s_1 = 1, s_2 = 1, s_3 = 4) = 0.2$
$ P(s_1 = 1, s_2 = 2, s_3 = 3) = 0.2$
$ P(s_1 = 1, s_2 = 3, s_3 = 2) = 0.2$
$ P(s_1 = 2, s_2 = 1, s_3 = 3) = 0.2$
$ P(s_1 = 2, s_2 = 2, s_3 = 2) = 0.2$

After merge:
$ P(s_1 = 1, s_4 = 5) = 0.2 + 0.2 + 0.2 = 0.6$
$ P(s_1 = 2, s_4 = 4) = 0.2 + 0.2 = 0.4$
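The merge update of Table I can be reproduced with a short sketch; the dict-of-tuples representation is an illustrative choice:

```python
def merge_shadows(joint, names, a, b, merged_name):
    """Merge shadows a and b into merged_name.  The merged shadow holds
    the sum of the two original counts, so joint entries that agree on
    everything else and on that sum are accumulated into one entry."""
    ia, ib = names.index(a), names.index(b)
    out = {}
    for counts, p in joint.items():
        rest = tuple(c for i, c in enumerate(counts) if i not in (ia, ib))
        key = rest + (counts[ia] + counts[ib],)
        out[key] = out.get(key, 0.0) + p
    return out, [n for n in names if n not in (a, b)] + [merged_name]

# The joint distribution of Table I, with s_2 and s_3 merging into s_4.
joint = {
    (1, 1, 4): 0.2,
    (1, 2, 3): 0.2,
    (1, 3, 2): 0.2,
    (2, 1, 3): 0.2,
    (2, 2, 2): 0.2,
}
post, names = merge_shadows(joint, ["s_1", "s_2", "s_3"], "s_2", "s_3", "s_4")
# post[(1, 5)] == 0.6 and post[(2, 4)] == 0.4 up to float rounding,
# matching the "after merge" column of Table I.
```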


Jingjin Yu 2011-01-18