next up previous
Next: Efficiently propagating probability masses Up: Probabilistic Events, Observations, and Previous: Balancing between estimation accuracy

Accurately propagating probability masses

When there are few targets and events/observations, we want to keep assumptions to a minimum so as to maintain an accurate target distribution over the shadows as critical events happen. In this section we assume only that: 1) an accurate split rule is provided that governs how targets redistribute when a split event happens, and 2) the sensor statistics are known (i.e., $ P(\mathbf{e} \mid \mathbf{y})$ is given). As events happen, the probability mass $ P(s_1, \ldots, s_n)$ is updated by a straightforward algorithm (Tables III and IV), based on the analysis in Section VI-D; the observation data structure used by the algorithm is defined in Table II.





Table II: the observation data structure used in Table III

$ t$ the event type: one of the component events appear, disappear, split, merge, or the field-of-view events enter, exit, null
$ s_s$ the originating shadow in a split event
$ s_{s1}$ the first new shadow after a split event
$ s_{s2}$ the second new shadow in a split event
$ s_{m1}$ the first shadow in a merge event
$ s_{m2}$ the second shadow in a merge event
$ s_{m}$ the newly merged shadow
$ s_{e}$ the newly appeared shadow from an appear event
$ n_e$ the number of targets in an appear event
$ s_{v}$ the disappearing shadow in a disappear event
$ n_v$ the number of targets revealed in a disappear event
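For concreteness, the observation record of Table II could be mirrored by a small Python container (our illustration, not the authors' code); the field names follow the table's symbols, with event types carried as strings.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    """One event observation, mirroring the fields of Table II."""
    t: str                      # event type: 'appear', 'disappear', 'split',
                                # 'merge', 'enter', 'exit', or 'null'
    s_s: Optional[str] = None   # originating shadow of a split event
    s_s1: Optional[str] = None  # first new shadow after a split event
    s_s2: Optional[str] = None  # second new shadow after a split event
    s_m1: Optional[str] = None  # first shadow in a merge event
    s_m2: Optional[str] = None  # second shadow in a merge event
    s_m: Optional[str] = None   # newly merged shadow
    s_e: Optional[str] = None   # newly appeared shadow from an appear event
    n_e: Optional[int] = None   # number of targets in an appear event
    s_v: Optional[str] = None   # disappearing shadow in a disappear event
    n_v: Optional[int] = None   # number of targets revealed on disappearance
```

Only the fields relevant to a given event type are populated; the rest stay `None`.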








Table III: PROCESS_PROBABILITY_MASS

Input:
    initial target distribution $ P(s_1, \ldots, s_n)$
    queue $ Q$ of the observation sequence
    the split rule
    sensor statistics $ P(\mathbf{e} \mid \mathbf{y})$
Output:
    the target distribution after all observations are processed
foreach event observation $ o$ in $ Q$
    switch($ o.t$ )
    case appear:
        update every entry $ P(s_1 = x_1, \ldots, s_n = x_n) = p_j$ to
            $ P(s_1 = x_1, \ldots, s_n = x_n, o.s_e = o.n_e) = p_j$
    case disappear:
        remove all joint distribution entries in which $ o.s_v \ne o.n_v$
        renormalize the probability masses
    case split:
        add two new shadows $ o.s_{s1}$ , $ o.s_{s2}$
        split the prob. mass in $ o.s_s$ between $ o.s_{s1}$ , $ o.s_{s2}$ by the split rule
    case merge:
        add a new shadow $ o.s_m$
        collapse the probability mass from $ o.s_{m1}, o.s_{m2}$ into $ o.s_m$
    case enter, exit, null:
        call PROCESS_FOV_EVENT
return the updated target distribution
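A minimal Python sketch of this loop is given below (our own code, not the authors'). The joint distribution is represented as a dictionary mapping tuples of per-shadow counts to probability masses, with a parallel list of shadow names; `split_rule(n)` is assumed to return a dictionary giving, for each $k$, the probability that $k$ of the $n$ targets enter the first child shadow, and field-of-view events are delegated to a caller-supplied `fov` handler.

```python
def process_probability_mass(shadows, dist, events, split_rule=None, fov=None):
    """Propagate a joint target distribution through an event sequence.

    shadows: list of shadow names; dist: dict mapping count tuples
    (aligned with `shadows`) to probability masses; events: list of
    dicts with the fields of Table II.
    """
    for o in events:
        if o['t'] == 'appear':
            # Extend every entry with the known count of the new shadow.
            shadows = shadows + [o['s_e']]
            dist = {key + (o['n_e'],): p for key, p in dist.items()}
        elif o['t'] == 'disappear':
            # Keep only entries consistent with the revealed count,
            # drop the shadow, then renormalize.
            i = shadows.index(o['s_v'])
            kept = {key[:i] + key[i + 1:]: p for key, p in dist.items()
                    if key[i] == o['n_v']}
            z = sum(kept.values())
            dist = {k: p / z for k, p in kept.items()}
            shadows = shadows[:i] + shadows[i + 1:]
        elif o['t'] == 'split':
            # Redistribute the mass in o.s_s over the two children.
            i = shadows.index(o['s_s'])
            new = {}
            for key, p in dist.items():
                n = key[i]
                for k, q in split_rule(n).items():
                    nk = key[:i] + key[i + 1:] + (k, n - k)
                    new[nk] = new.get(nk, 0.0) + p * q
            shadows = shadows[:i] + shadows[i + 1:] + [o['s_s1'], o['s_s2']]
            dist = new
        elif o['t'] == 'merge':
            # Occupancies add; entries mapping to the same key collapse.
            i, j = shadows.index(o['s_m1']), shadows.index(o['s_m2'])
            new = {}
            for key, p in dist.items():
                rest = tuple(v for idx, v in enumerate(key) if idx not in (i, j))
                nk = rest + (key[i] + key[j],)
                new[nk] = new.get(nk, 0.0) + p
            shadows = [s for s in shadows
                       if s not in (o['s_m1'], o['s_m2'])] + [o['s_m']]
            dist = new
        else:  # enter / exit / null field-of-view observation
            dist = fov(dist, shadows, o['s'], o['t'])
    return shadows, dist
```

With an even binomial split rule, splitting $s_2$ and then merging $s_1, s_3$ reproduces the corresponding steps of the worked example below.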








Table IV: PROCESS_FOV_EVENT

Input:
    target distribution $ P(s_1, \ldots, s_n)$
    sensor statistics $ P(\mathbf{e} \mid \mathbf{y})$
    field-of-view observation $ y \in \{y_e, y_x, y_n \}$
    the affected shadow $ s_i$
Output:
    the target distribution after the observation
foreach entry $ P(\ldots, s_i = x_i, \ldots) = p_j$ in the distribution
    let $ P'(\ldots, s_i = x_i + 1, \ldots) = p_j * P(\mathbf{e} = e_e \mid \mathbf{y} = y)$
    let $ P''(\ldots, s_i = x_i, \ldots) = p_j * P(\mathbf{e} = e_n \mid \mathbf{y} = y)$
    if $ x_i > 0$
        let $ P'''(\ldots, s_i = x_i - 1, \ldots) = p_j * P(\mathbf{e} = e_x \mid \mathbf{y} = y)$
    else
        normalize $ P', P''$ such that $ P' + P'' = p_j$
    remove the entry $ P(\ldots, s_i = x_i, \ldots) = p_j$
    store the entries $ P', P''$ , and $ P'''$ if applicable
return the updated target distribution
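This update can be sketched in Python as follows (again our own code). The sensor statistics are assumed to be supplied as a dictionary `sensor[(e, y)]` $ = P(\mathbf{e} = e \mid \mathbf{y} = y)$ for $e, y$ ranging over enter, exit, null.

```python
def process_fov_event(dist, shadows, s_i, y, sensor):
    """Update a joint distribution for one field-of-view observation.

    dist: dict mapping count tuples (aligned with `shadows`) to masses;
    sensor[(e, y)] = P(event e occurred | observation y).
    """
    i = shadows.index(s_i)
    new = {}

    def put(key, x, p):
        nk = key[:i] + (x,) + key[i + 1:]
        new[nk] = new.get(nk, 0.0) + p   # entries may collapse; accumulate

    for key, p in dist.items():
        x = key[i]
        branches = [(x + 1, sensor[('enter', y)]),   # P'
                    (x,     sensor[('null', y)])]    # P''
        if x > 0:                                    # P''': an exit is only
            branches.append((x - 1, sensor[('exit', y)]))  # feasible if x > 0
        # z equals 1 when all three branches are feasible (the conditionals
        # sum to one), so this matches Table IV's normalization of P', P''
        # in the x = 0 case and is a no-op otherwise.
        z = sum(w for _, w in branches)
        for x_new, w in branches:
            put(key, x_new, p * w / z)
    return new
```

With the sensor model of the worked example (no null events, true positive rate 0.9), an exit observation on a shadow holding 2 targets splits its mass 0.9/0.1 between counts 1 and 3.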




Figure 12: A simple event observation sequence (on the left, with its two field-of-view observations marked in bold) is generated when a robot carrying an omnidirectional, infinite-range sensor follows the dotted path in a polygonal environment with a hole (the four figures on the right). The last event, the disappearance of shadow $ s_5$ , is not shown on the right; we note that an additional resource (say, a sub search team) is needed to make $ s_5$ disappear. A slightly more complicated sequence, with six additional field-of-view observations, is also possible (on the left, marked in a lighter font).
\begin{figure}\begin{center}
\epsfig{figure=figures/example-few-agents-simple-combo.eps,width=0.8\textwidth}
\end{center}
\end{figure}


Table V: observation-by-observation processing of the simple event sequence in Fig. 12
observation         probability masses
$ initial$
$ P(s_1 = 2, s_2 = 2) = 1$
$ y_x$ , $ s_1$
$ P(s_1 = 1, s_2 = 2) = 0.9,$
$ P(s_1 = 3, s_2 = 2) = 0.1$
$ split$ , $ s_2 \to s_3, s_4$
$ P(s_1=1, s_3=0, s_4=2) = 0.9 * 0.25 = 0.225$
$ P(s_1=1, s_3=1, s_4=1) = 0.9 * 0.5 = 0.45$
$ P(s_1=1, s_3=2, s_4=0) = 0.9 * 0.25 = 0.225$
$ P(s_1=3, s_3=0, s_4=2) = 0.1 * 0.25 = 0.025 $
$ P(s_1=3, s_3=1, s_4=1) = 0.1 * 0.5 = 0.05 $
$ P(s_1=3, s_3=2, s_4=0) = 0.1 * 0.25 = 0.025 $
$ y_e$ , $ s_3$
$ P(s_1=1, s_3=1, s_4=2) = 0.225$
$ P(s_1=1, s_3=0, s_4=1) = 0.45 * 0.1 = 0.045$
$ P(s_1=1, s_3=2, s_4=1) = 0.45 * 0.9 = 0.405$
$ P(s_1=1, s_3=1, s_4=0) = 0.225 * 0.1 = 0.0225$
$ P(s_1=1, s_3=3, s_4=0) = 0.225 * 0.9 = 0.2025$
$ P(s_1=3, s_3=1, s_4=2) = 0.025$
$ P(s_1=3, s_3=0, s_4=1) = 0.05 * 0.1 = 0.005$
$ P(s_1=3, s_3=2, s_4=1) = 0.05 * 0.9 = 0.045$
$ P(s_1=3, s_3=1, s_4=0) = 0.025 * 0.1 = 0.0025$
$ P(s_1=3, s_3=3, s_4=0) = 0.025 * 0.9 = 0.0225$
$ merge$ , $ s_1, s_3 \to s_5$
$ P(s_4=2, s_5=2) = 0.225$
$ P(s_4=1, s_5=1) = 0.045$
$ P(s_4=1, s_5=3) = 0.405 + 0.005 = 0.41$
$ P(s_4=0, s_5=2) = 0.0225$
$ P(s_4=0, s_5=4) = 0.2025 + 0.0025 = 0.205$
$ P(s_4=2, s_5=4) = 0.025$
$ P(s_4=1, s_5=5) = 0.045$
$ P(s_4=0, s_5=6) = 0.0225$
$ disappear$ , $ s_5$
$ P(s_4=0) = 0.0769$
$ (= 0.0225 * 0.5 / ((0.0225 + 0.045 + 0.225) * 0.5))$ ,
$ P(s_4=1) = 0.1538$ ,
$ P(s_4=2) = 0.7692$

Figure 13: Graphical display of the target probability masses as the algorithm PROCESS_PROBABILITY_MASS is run over the simple event observation sequence in Fig. 12. Each figure corresponds to one step in Table V. Lighter balls (if any) and darker balls represent the probability masses before and after an event, respectively. The volumes are proportional to the magnitudes of the probability mass entries.
\begin{figure}\begin{center}
\begin{tabular}{cc}
\epsfig{figure=figures/prob-m...
...,width=0.4\textwidth} \\
(e) & (f) \\
\end{tabular}\end{center}
\end{figure}

As a demonstration, we work through the observation sequence given in Fig. 12 under the following assumptions: 1) initially there are 2 targets in each of the shadows $ s_1, s_2$ , 2) the split rule is that each target independently goes into either of the two new shadows with probability $ 0.5$ , 3) there are no null events or observations, and the true positive rate of every observation is $ p = 0.9$ , and 4) $ a_5 = 1$ with probability $ 0.5$ and $ a_5 = 2$ with probability $ 0.5$ , where $ a_5$ is the number of targets revealed when $ s_5$ disappears. These extra assumptions are made so that the number of probability mass entries stays small enough to list in a table. The observation-by-observation processing is shown in Table V. The distribution is represented as a table of joint probabilities, which is practical whenever there are not too many targets and events. Renormalization is performed in the third step for the first and sixth entries, as well as in the last step. In the merge step, the third and seventh entries from the previous step are combined, as are the fifth and ninth entries. A graphical illustration of the probability masses during each step of the run is given in Fig. 13. Note that the dimensionality changes as component events happen.
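The full run of Table V can be reproduced by a short self-contained script (a sketch under the assumptions just listed; the dictionary-of-tuples representation of the joint distribution is our own choice).

```python
from math import comb

def norm(d):
    z = sum(d.values())
    return {k: p / z for k, p in d.items()}

# Joint distribution over shadow occupancies: tuple of counts -> mass.
# The shadow order of each tuple is tracked by hand to keep the sketch short.
dist = {(2, 2): 1.0}                        # (s1, s2): two targets in each

# y_x on s1: with no null events, the exit observation reflects a true
# exit with probability 0.9 and a mistaken enter with probability 0.1.
new = {}
for (a, b), p in dist.items():
    for a2, w in [(a - 1, 0.9), (a + 1, 0.1)]:
        new[(a2, b)] = new.get((a2, b), 0.0) + p * w
dist = new                                  # {(1, 2): 0.9, (3, 2): 0.1}

# split s2 -> s3, s4: each target picks either child shadow with prob 0.5.
new = {}
for (a, b), p in dist.items():
    for k in range(b + 1):
        key = (a, k, b - k)                 # keys become (s1, s3, s4)
        new[key] = new.get(key, 0.0) + p * comb(b, k) * 0.5 ** b
dist = new

# y_e on s3: s3 grows by 1 (prob 0.9) or shrinks by 1 (prob 0.1); when s3
# is empty only the growth branch is feasible, so that entry is renormalized.
new = {}
for (a, c, d), p in dist.items():
    branches = [(c + 1, 0.9)] + ([(c - 1, 0.1)] if c > 0 else [])
    z = sum(w for _, w in branches)
    for c2, w in branches:
        new[(a, c2, d)] = new.get((a, c2, d), 0.0) + p * w / z
dist = new

# merge s1, s3 -> s5: occupancies add; keys become (s4, s5).
new = {}
for (a, c, d), p in dist.items():
    new[(d, a + c)] = new.get((d, a + c), 0.0) + p
dist = new

# disappear s5, revealing a5 targets (1 or 2, equally likely): keep only
# consistent entries, weight by 0.5, and renormalize.
new = {}
for (d, m), p in dist.items():
    if m in (1, 2):
        new[(d,)] = new.get((d,), 0.0) + p * 0.5
dist = norm(new)   # posterior over s4; matches the last row of Table V
```

The final `dist` is $\{(0): 0.0769, (1): 0.1538, (2): 0.7692\}$ (i.e., $1/13, 2/13, 10/13$), matching Table V.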

Figure 14: Probability masses for the slightly more complicated observation sequence in Fig. 12 before shadows $ s_1, s_3$ merge to form $ s_5$ . Note: origin is shifted for better display.
\begin{figure}\begin{center}
\epsfig{figure=figures/prob-mass-simple-2.eps,width=0.5\textwidth}
\end{center}
\end{figure}

To verify the correctness of the outcome, Monte Carlo trials are also run, in which individual targets are propagated through the observations one by one. Since this is not an exact method, we leave its details to the next section. After $ 1000$ successful random trials (the number of trials used for all Monte Carlo simulations in this paper), we obtained $ P(s_4=0) = 0.079, P(s_4 = 1) = 0.154, P(s_4=2) = 0.767$ , which closely matches the results of the exact algorithm.
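Although the exact scheme is deferred to the next section, its flavor can be sketched as follows (our illustration): each trial samples one concrete world, resolves each observation by sampling among its feasible interpretations, and is accepted only if it is consistent with the final disappear count.

```python
import random

def one_trial(rng):
    """Simulate one random world for the sequence of Fig. 12; returns the
    resulting count in s4, or None when the trial is inconsistent with the
    final disappear observation (rejected)."""
    s1, s2 = 2, 2
    # y_x on s1: true exit with prob 0.9, mistaken enter otherwise.
    s1 += -1 if rng.random() < 0.9 else 1
    # split s2 -> s3, s4: each target picks a child shadow independently.
    s3 = sum(rng.random() < 0.5 for _ in range(s2))
    s4 = s2 - s3
    # y_e on s3: true enter with prob 0.9, mistaken exit otherwise; when
    # s3 is empty only the enter interpretation is feasible.
    if s3 == 0 or rng.random() < 0.9:
        s3 += 1
    else:
        s3 -= 1
    s5 = s1 + s3                       # merge s1, s3 -> s5
    a5 = rng.choice([1, 2])            # targets revealed as s5 disappears
    return s4 if s5 == a5 else None    # reject inconsistent trials

rng = random.Random(0)
counts, accepted = {}, 0
while accepted < 1000:                 # 1000 successful trials, as in the text
    r = one_trial(rng)
    if r is not None:
        counts[r] = counts.get(r, 0) + 1
        accepted += 1
estimate = {k: v / 1000 for k, v in sorted(counts.items())}
```

With 1000 accepted trials the estimates land within sampling error of the exact posterior $ (0.0769, 0.1538, 0.7692)$ .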


Jingjin Yu 2011-01-18