In subsection VI-A, assumptions 1) and 2) are made to simplify the presentation of the probabilistic algorithms. The first assumption is that component events are observed without error. Although component events can be observed with high accuracy in many environments, this is not always the case. A sensor network is one such example: the sensing range can be hard to know precisely. The extension to handle such uncertainties is relatively straightforward, at least in theory: we need only maintain a probability distribution over all possible sequences of shadows consistent with the robot's observations. The expected number of targets in any shadow can then be obtained by also taking the expectation over all possible shadow sequences that contain that shadow. The computational effort will certainly increase; resampling can alleviate the burden somewhat.
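The bookkeeping described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: each "particle" stands for one hypothesized shadow sequence with a weight reflecting its consistency with the noisy observations, and the particle structure, labels, and resampling scheme are all assumptions made for the example.

```python
import random

random.seed(0)

# Hypothetical representation: each particle is (weight, targets_per_shadow),
# where targets_per_shadow maps a shadow label to the target count implied by
# that hypothesized shadow sequence.
particles = [
    (0.5, {"s1": 2, "s2": 0}),
    (0.3, {"s1": 1, "s2": 1}),
    (0.2, {"s1": 0, "s2": 2}),
]

def expected_targets(particles, shadow):
    """Expected target count in `shadow`, marginalized over all
    hypothesized shadow sequences (particles)."""
    total_w = sum(w for w, _ in particles)
    return sum(w * counts[shadow] for w, counts in particles) / total_w

def resample(particles, n):
    """Draw n particles proportional to weight, to curb the growth in the
    number of sequence hypotheses that must be carried forward."""
    weights = [w for w, _ in particles]
    chosen = random.choices(particles, weights=weights, k=n)
    return [(1.0 / n, counts) for _, counts in chosen]

print(expected_targets(particles, "s1"))  # 0.5*2 + 0.3*1 + 0.2*0 = 1.3
```

In a full implementation, each observation step would reweight the particles by an observation likelihood before the expectation is taken; the example keeps the weights fixed for brevity.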
Various distinguishability assumptions can also be handled. Recall that when agents move nondeterministically, two distinguishability cases are investigated. When there are only teams with a single attribute, the same approach from the nondeterministic case applies by simply carrying out one computation per team. If the teams have multiple attributes (for example, the initial condition may be given as a joint probability distribution over red and blue teams), a direct extension is to perform one computation for each joint probability entry in the initial condition. This is clearly more work, and resampling may be necessary depending on the granularity of the initial target distribution. On the plus side, although resampling sacrifices some accuracy to save computation time, a richer class of problems can now be handled, because any initial condition can be described as a joint probability distribution.
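The joint-attribute extension can be illustrated with a small sketch. This is only an assumed setup: the joint initial distribution below is invented, and `propagate` is a placeholder standing in for one per-entry computation of the kind the nondeterministic case uses; here it returns the counts unchanged so that only the combining step is shown.

```python
# Hypothetical joint initial condition P(red = r, blue = b): one computation
# is carried out per entry, and the results are combined weighted by the
# entry's probability.
joint_init = {
    (2, 0): 0.25,
    (1, 1): 0.50,
    (0, 2): 0.25,
}

def propagate(red, blue):
    """Placeholder for one fixed-assignment computation on (red, blue);
    a real implementation would track the teams through the shadows."""
    return red, blue

def expected_counts(joint):
    """Combine the per-entry results into expected red/blue counts."""
    exp_red = exp_blue = 0.0
    for (r, b), prob in joint.items():
        pr, pb = propagate(r, b)
        exp_red += prob * pr
        exp_blue += prob * pb
    return exp_red, exp_blue

print(expected_counts(joint_init))  # (1.0, 1.0)
```

The cost grows with the number of joint entries, which is where the resampling mentioned above would be applied: entries with negligible probability can be dropped or merged at the price of some accuracy.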