As complexity in our everyday environment increases (e.g., mobile applications for monitoring energy consumption), how do we adapt and react to the changing demands placed on us? In dynamic decision making (DDM) problems, the environment changes over time as a result of previous decisions and/or factors outside the decision-maker's control. To maximize reward, an agent must effectively control a complex dynamic system, which often requires planning in the face of uncertainty about how decisions change the state of the system and the rewards that can be obtained. DDM thus refers to the process by which an agent selects a course of action to achieve or maintain a desired state in a dynamic environment. This includes balancing exploration and exploitation, distinguishing between different sources of variability within the environment, and tracking the current state of the environment (i.e., filtering).
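As a minimal sketch of these ingredients (not drawn from any specific task in this work), consider an agent facing a hypothetical environment whose payoffs drift over time: the agent must trade off exploration against exploitation and filter noisy observations to track the latent state. All class names and parameter values below are illustrative assumptions.

```python
import random


class DriftingBandit:
    """Hypothetical dynamic environment: arm payoffs drift over time."""

    def __init__(self, n_arms=3, drift=0.05, noise=0.5):
        self.means = [random.uniform(0, 1) for _ in range(n_arms)]
        self.drift, self.noise = drift, noise

    def step(self, action):
        # The environment changes partly on its own (random drift) ...
        self.means = [m + random.gauss(0, self.drift) for m in self.means]
        # ... and returns a noisy reward for the chosen action.
        return self.means[action] + random.gauss(0, self.noise)


class FilteringAgent:
    """Tracks each arm's value with an exponential filter; explores with probability epsilon."""

    def __init__(self, n_arms=3, epsilon=0.1, learning_rate=0.2):
        self.estimates = [0.0] * n_arms
        self.epsilon, self.lr = epsilon, learning_rate

    def choose(self):
        if random.random() < self.epsilon:           # exploration
            return random.randrange(len(self.estimates))
        return max(range(len(self.estimates)),       # exploitation
                   key=lambda a: self.estimates[a])

    def update(self, action, reward):
        # Filtering: move the estimate toward the noisy observation.
        self.estimates[action] += self.lr * (reward - self.estimates[action])


if __name__ == "__main__":
    env, agent = DriftingBandit(), FilteringAgent()
    total = 0.0
    for t in range(1000):
        a = agent.choose()
        r = env.step(a)
        agent.update(a, r)
        total += r
    print(f"average reward: {total / 1000:.3f}")
```

This drifting-bandit setting is only one stylized instance of a DDM problem; the key point is that the agent's estimates must keep pace with an environment that changes both in response to its actions and on its own.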