MultiAgentDecisionProcess  Release 0.2.1
ProblemDecTiger Class Reference

ProblemDecTiger implements the DecTiger problem. More...

#include <ProblemDecTiger.h>


Public Member Functions

 ProblemDecTiger ()
 Default constructor.
 ~ProblemDecTiger ()
 Destructor.
Public Member Functions inherited from DecPOMDPDiscrete
virtual DecPOMDPDiscrete * Clone () const
 Returns a pointer to a copy of this class.
void CreateNewRewardModel ()
 Creates a new reward model.
void CreateNewRewardModelForAgent (Index agentI)
 implementation of POSGDiscreteInterface
 DecPOMDPDiscrete (std::string name="received unspec. by DecPOMDPDiscrete", std::string descr="received unspec. by DecPOMDPDiscrete", std::string pf="received unspec. by DecPOMDPDiscrete")
 Default constructor.
void ExtractMADPDiscrete (MultiAgentDecisionProcessDiscrete *madp)
 Get the MADPDiscrete components from this DecPOMDPDiscrete.
double GetReward (Index sI, Index jaI) const
 Return the reward for state, joint action indices.
double GetReward (State *s, JointAction *ja) const
 implements the DecPOMDPInterface
double GetRewardForAgent (Index agentI, State *s, JointAction *ja) const
 Function that returns the reward for a state and joint action.
double GetRewardForAgent (Index agentI, Index sI, Index jaI) const
 Return the reward for state, joint action indices.
RewardModel * GetRewardModelPtr () const
 Get a pointer to the reward model.
RGet * GetRGet () const
bool SetInitialized (bool b)
 Sets _m_initialized to b.
void SetReward (Index sI, Index jaI, double r)
 Set the reward for state, joint action indices.
void SetReward (Index sI, Index jaI, Index sucSI, double r)
 Set the reward for state, joint action, successor state indices.
void SetReward (Index sI, Index jaI, Index sucSI, Index joI, double r)
 Set the reward for state, joint action, successor state, joint observation indices.
void SetReward (State *s, JointAction *ja, double r)
 implements the DecPOMDPInterface
void SetRewardForAgent (Index agentI, State *s, JointAction *ja, double r)
 Function that sets the reward for an agent, state and joint action.
void SetRewardForAgent (Index agentI, Index sI, Index jaI, double r)
 Set the reward for state, joint action indices.
void SetRewardForAgent (Index agentI, Index sI, Index jaI, Index sucSI, double r)
 Set the reward for state, joint action, successor state indices.
void SetRewardForAgent (Index agentI, Index sI, Index jaI, Index sucSI, Index joI, double r)
 Set the reward for state, joint action, successor state, joint observation indices.
std::string SoftPrint () const
 Prints some information on the DecPOMDPDiscrete.
 ~DecPOMDPDiscrete ()
 Destructor.
Public Member Functions inherited from DecPOMDPDiscreteInterface
virtual ~DecPOMDPDiscreteInterface ()
 Imports the GetReward function from the base class into the current scope.
Public Member Functions inherited from POSGDiscreteInterface
virtual ~POSGDiscreteInterface ()
 Destructor. (A virtual destructor can't be made pure abstract.)
Public Member Functions inherited from MultiAgentDecisionProcessDiscreteInterface
virtual ~MultiAgentDecisionProcessDiscreteInterface ()
 Destructor. (A virtual destructor can't be made pure abstract.)
Public Member Functions inherited from MultiAgentDecisionProcessInterface
virtual ~MultiAgentDecisionProcessInterface ()
 Destructor.
Public Member Functions inherited from POSGInterface
virtual ~POSGInterface ()
 Virtual destructor.
Public Member Functions inherited from DecPOMDPInterface
virtual ~DecPOMDPInterface ()
 Virtual destructor.
Public Member Functions inherited from MultiAgentDecisionProcessDiscrete
void CreateNewObservationModel ()
 Creates a new observation model mapping.
void CreateNewTransitionModel ()
 Creates a new transition model mapping.
const ObservationModelDiscrete * GetObservationModelDiscretePtr () const
 Returns a pointer to the underlying observation model.
double GetObservationProbability (Index jaI, Index sucSI, Index joI) const
 Return the probability of joint observation joI: P(joI|jaI,sucSI).
OGet * GetOGet () const
bool GetSparse () const
 Are we using sparse transition and observation models?
TGet * GetTGet () const
const TransitionModelDiscrete * GetTransitionModelDiscretePtr () const
 Returns a pointer to the underlying transition model.
double GetTransitionProbability (Index sI, Index jaI, Index sucSI) const
 Return the probability of successor state sucSI: P(sucSI|sI,jaI).
bool Initialize ()
 A function that can be called by other classes in order to request a MultiAgentDecisionProcessDiscrete to (try to) initialize.
 MultiAgentDecisionProcessDiscrete ()
 Default constructor.
 MultiAgentDecisionProcessDiscrete (std::string name="received unspec. by MultiAgentDecisionProcessDiscrete", std::string descr="received unspec.by MultiAgentDecisionProcessDiscrete", std::string pf="received unspec. by MultiAgentDecisionProcessDiscrete")
 Constructor that sets the name, description, and problem file.
 MultiAgentDecisionProcessDiscrete (int nrAgents, int nrS, std::string name="received unspec. by MultiAgentDecisionProcessDiscrete", std::string descr="received unspec.by MultiAgentDecisionProcessDiscrete", std::string pf="received unspec. by MultiAgentDecisionProcessDiscrete")
 Constructor that sets the name, description, and problem file.
void Print () const
 Prints some information on the MultiAgentDecisionProcessDiscrete.
Index SampleJointObservation (Index jaI, Index sucI) const
 Sample an observation.
Index SampleSuccessorState (Index sI, Index jaI) const
 Sample a successor state.
void SetObservationModelPtr (ObservationModelDiscrete *ptr)
 Set the observation model.
void SetObservationProbability (Index jaI, Index sucSI, Index joI, double p)
 Set the probability of joint observation joI: P(joI|jaI,sucSI).
void SetSparse (bool sparse)
 Indicate whether sparse transition and observation models should be used.
void SetTransitionModelPtr (TransitionModelDiscrete *ptr)
 Set the transition model.
void SetTransitionProbability (Index sI, Index jaI, Index sucSI, double p)
 Set the probability of successor state sucSI: P(sucSI|sI,jaI).
 ~MultiAgentDecisionProcessDiscrete ()
 Destructor.
Public Member Functions inherited from DecPOMDP
 DecPOMDP ()
 Default constructor. Sets RewardType to REWARD and discount to 1.0.
double GetDiscount () const
 Returns the discount parameter.
double GetDiscountForAgent (Index agentI) const
 Returns the discount parameter.
reward_t GetRewardType () const
 Returns the reward type.
reward_t GetRewardTypeForAgent (Index agentI) const
 Returns the reward type.
void SetDiscount (double d)
 Sets the discount parameter to d.
void SetDiscountForAgent (Index agentI, double d)
 Sets the discount parameter for an agent (needed for POSGInterface).
void SetRewardType (reward_t r)
 Sets the reward type to reward_t r.
void SetRewardTypeForAgent (Index agentI, reward_t r)
 Sets the reward type to reward_t r.

Private Types

enum  action_enum { LISTEN, OPENLEFT, OPENRIGHT }
typedef std::vector< ActionDiscrete > ActionIVec
enum  jointAction_enum {
  LISTEN_LISTEN, LISTEN_OPENLEFT, LISTEN_OPENRIGHT, OPENLEFT_LISTEN,
  OPENLEFT_OPENLEFT, OPENLEFT_OPENRIGHT, OPENRIGHT_LISTEN, OPENRIGHT_OPENLEFT,
  OPENRIGHT_OPENRIGHT
}
enum  jointObservation_enum { HEARLEFT_HEARLEFT, HEARLEFT_HEARRIGHT, HEARRIGHT_HEARLEFT, HEARRIGHT_HEARRIGHT }
enum  observation_enum { HEARLEFT, HEARRIGHT }
typedef std::vector< ObservationDiscrete > ObservationIVec
enum  state_enum { SLEFT, SRIGHT }

Private Member Functions

void ConstructActions ()
 Construct all the Actions and actionSets (the vector _m_actionVecs).
void ConstructObservations ()
 Construct all the observations and observation sets.
void FillObservationModel ()
 Fills the observation model with the tiger problem obs. probs.
void FillRewardModel ()
 Fills the reward model with the tiger problem rewards.
void FillTransitionModel ()
 Fills the transition model with the tiger problem transitions.

Private Attributes

const size_t NUMBER_OF_ACTIONS
const size_t NUMBER_OF_AGENTS
const size_t NUMBER_OF_OBSERVATIONS
const size_t NUMBER_OF_STATES

Additional Inherited Members

Protected Attributes inherited from DecPOMDPDiscrete
RewardModel * _m_p_rModel
 The reward model used by DecPOMDPDiscrete.

Detailed Description

ProblemDecTiger implements the DecTiger problem.

This class can be used as an alternative to parsing dectiger.dpomdp.

Definition at line 45 of file ProblemDecTiger.h.

Member Typedef Documentation

typedef std::vector<ActionDiscrete> ProblemDecTiger::ActionIVec
private

Definition at line 50 of file ProblemDecTiger.h.

typedef std::vector<ObservationDiscrete> ProblemDecTiger::ObservationIVec
private

Definition at line 51 of file ProblemDecTiger.h.

Member Enumeration Documentation

enum ProblemDecTiger::action_enum
private

Enumerator:
LISTEN 
OPENLEFT 
OPENRIGHT 

Definition at line 64 of file ProblemDecTiger.h.

enum ProblemDecTiger::jointAction_enum
private

Enumerator:
LISTEN_LISTEN 
LISTEN_OPENLEFT 
LISTEN_OPENRIGHT 
OPENLEFT_LISTEN 
OPENLEFT_OPENLEFT 
OPENLEFT_OPENRIGHT 
OPENRIGHT_LISTEN 
OPENRIGHT_OPENLEFT 
OPENRIGHT_OPENRIGHT 

Definition at line 70 of file ProblemDecTiger.h.

enum ProblemDecTiger::jointObservation_enum
private

Enumerator:
HEARLEFT_HEARLEFT 
HEARLEFT_HEARRIGHT 
HEARRIGHT_HEARLEFT 
HEARRIGHT_HEARRIGHT 

Definition at line 87 of file ProblemDecTiger.h.

enum ProblemDecTiger::observation_enum
private

Enumerator:
HEARLEFT 
HEARRIGHT 

Definition at line 82 of file ProblemDecTiger.h.

enum ProblemDecTiger::state_enum
private

Enumerator:
SLEFT 
SRIGHT 

Definition at line 59 of file ProblemDecTiger.h.

Constructor & Destructor Documentation

ProblemDecTiger::~ProblemDecTiger ( )

Destructor.

Definition at line 90 of file ProblemDecTiger.cpp.

Member Function Documentation

void ProblemDecTiger::ConstructActions ( )
private

Construct all the Actions and actionSets (the vector _m_actionVecs).

Definition at line 94 of file ProblemDecTiger.cpp.

References MADPComponentDiscreteActions::_m_actionVecs, MADPComponentDiscreteActions::_m_nrActions, MultiAgentDecisionProcess::_m_nrAgents, DEBUG_CA, LISTEN, NUMBER_OF_ACTIONS, OPENLEFT, and OPENRIGHT.

Referenced by ProblemDecTiger().

void ProblemDecTiger::ConstructObservations ( )
private
void ProblemDecTiger::FillRewardModel ( )
private
void ProblemDecTiger::FillTransitionModel ( )
private

Member Data Documentation

const size_t ProblemDecTiger::NUMBER_OF_ACTIONS
private

Definition at line 57 of file ProblemDecTiger.h.

Referenced by ConstructActions().

const size_t ProblemDecTiger::NUMBER_OF_AGENTS
private

Definition at line 55 of file ProblemDecTiger.h.

Referenced by ProblemDecTiger().

const size_t ProblemDecTiger::NUMBER_OF_OBSERVATIONS
private

Definition at line 56 of file ProblemDecTiger.h.

Referenced by ConstructObservations().

const size_t ProblemDecTiger::NUMBER_OF_STATES
private

Definition at line 54 of file ProblemDecTiger.h.

Referenced by FillObservationModel(), FillTransitionModel(), and ProblemDecTiger().


The documentation for this class was generated from the following files:

ProblemDecTiger.h
ProblemDecTiger.cpp