Minimizing Age of Detection for a Markov Source over a Lossy Channel
By: Shivang Garde, Jaya Prakash Champati, Arpan Chattopadhyay
Potential Business Impact:
Helps machines decide when to check on important processes.
Monitoring a process or phenomenon of specific interest is prevalent in Cyber-Physical Systems (CPS), remote healthcare, smart buildings, intelligent transport, Industry 4.0, etc. A key building block of the monitoring system is a sensor that samples the process and communicates status updates to a monitor for detecting events of interest. Measuring the freshness of the status updates is essential for the timely detection of events, and it has received significant research interest recently. In this paper, we propose a new freshness metric, Age of Detection (AoD), for monitoring the state transitions of a Discrete Time Markov Chain (DTMC) source over a lossy wireless channel. We consider the pull model, where the sensor samples the DTMC state whenever the monitor requests a status update. We formulate a Constrained Markov Decision Process (CMDP) problem for minimizing the AoD subject to a constraint on the average sampling frequency, and solve it via a Lagrangian MDP formulation and the Relative Value Iteration (RVI) algorithm. Our numerical results show interesting trade-offs between AoD, sampling frequency, and transmission success probability. Further, the AoD-minimizing policy provides a lower estimation error than the Age of Information (AoI)-minimizing policy, thus demonstrating the utility of AoD for monitoring DTMC sources.
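To make the solution approach concrete, the following is a minimal, illustrative sketch of Relative Value Iteration applied to a Lagrangian MDP of the kind the abstract describes. It is not the paper's exact AoD model: the state here is simply the age since the last successful update (truncated at `A_MAX`), the action is whether to request a sample over a channel that succeeds with probability `p`, and the Lagrange multiplier `lam` prices the sampling-frequency constraint. All parameter values are assumptions chosen for the toy example.

```python
# Illustrative sketch (assumptions, not the paper's exact model): RVI for
# the Lagrangian MDP of an age-minimization problem with sampling cost.
# State: age a in {1, ..., A_MAX} since the last successful update.
# Actions: 0 = idle (age grows), 1 = request a sample (succeeds w.p. p,
# resetting the age to 1). Per-slot cost: a + lam * (sample requested).
A_MAX = 50    # age truncation level (assumption)
p = 0.8       # transmission success probability (assumption)
lam = 2.0     # Lagrange multiplier on average sampling frequency (assumption)

def rvi(p, lam, a_max=A_MAX, tol=1e-9, iters=100_000):
    h = [0.0] * (a_max + 1)  # relative value function; index 0 unused
    for _ in range(iters):
        q_idle = [0.0] * (a_max + 1)
        q_samp = [0.0] * (a_max + 1)
        for a in range(1, a_max + 1):
            nxt = min(a + 1, a_max)
            q_idle[a] = a + h[nxt]                          # pay age, age grows
            q_samp[a] = a + lam + p * h[1] + (1 - p) * h[nxt]
        ref = min(q_idle[1], q_samp[1])   # subtract value at reference state 1
        h_new = [0.0] * (a_max + 1)
        for a in range(1, a_max + 1):
            h_new[a] = min(q_idle[a], q_samp[a]) - ref
        if max(abs(h_new[a] - h[a]) for a in range(1, a_max + 1)) < tol:
            h = h_new
            break
        h = h_new
    # Greedy policy w.r.t. the converged relative values: sample iff cheaper.
    policy = [None] + [int(q_samp[a] <= q_idle[a]) for a in range(1, a_max + 1)]
    return h, policy

h, policy = rvi(p, lam)
threshold = policy.index(1)   # smallest age at which sampling becomes optimal
```

Because the relative value function is increasing in the age, the resulting policy has a threshold structure: stay idle while the age is small and request samples once it exceeds `threshold`. Sweeping `lam` traces out the trade-off between average age and average sampling frequency that the CMDP constraint encodes.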
Similar Papers
Semi-Markov Decision Process Framework for Age of Incorrect Information Minimization
Information Theory
Helps computers know when information is too old.
Minimizing Functions of Age of Incorrect Information for Remote Estimation
Information Theory
Helps systems send data only when needed.
Online Learning for Optimizing AoI-Energy Tradeoff under Unknown Channel Statistics
Networking and Internet Architecture
Keeps information fresh with less energy.