Optimizing UAV Task Allocation with Enhanced Battery Efficiency Using Semi-Markov Decision Processes
Academic Article in Scopus
Overview
abstract
This study presents an innovative approach using Semi-Markov Decision Processes (SMDPs) for task allocation in Unmanned Aerial Vehicles (UAVs), specifically addressing the stochastic nature of battery levels. By dynamically managing task assignments based on real-time battery status and the time spent in different states, the SMDP framework significantly enhances operational efficiency, achieving an average 54.1% decrease in total flight time compared with manual assignment strategies. This study further investigates the impact of different reward matrix configurations. The increasing deployment of UAVs in surveillance, disaster response, and delivery services necessitates efficient task allocation strategies, particularly under energy constraints. Previous work has explored heuristic and reinforcement-learning-based approaches, yet few studies have incorporated the semi-Markov decision framework for battery-aware task scheduling. Our research extends prior efforts, such as resource-constrained task allocation models and networked evolutionary game-theoretic approaches, by explicitly modeling state durations and decision-making uncertainty. This study expands on previous findings by employing two distinct sets of reward matrices to validate the improvements achieved with SMDPs in UAV task allocation. By comparing these configurations, we aim to further optimize decision-making strategies, ensuring more efficient and adaptive UAV operations in dynamic environments. © The Author(s) 2025.
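To make the abstract's framework concrete, the following is a minimal, hypothetical sketch of battery-aware SMDP value iteration — not the paper's actual model. All states, actions, rewards, transition probabilities, and sojourn times below are illustrative assumptions; the defining SMDP feature shown is that the discount factor depends on the time spent in each state-action pair (gamma ** tau), so slow actions are discounted more heavily than fast ones.

```python
# Hypothetical SMDP sketch: states are coarse battery levels, actions are
# task assignments or recharging. All numbers are assumed for illustration.
import numpy as np

states = ["low", "medium", "high"]            # battery levels (assumed)
actions = ["short_task", "long_task", "recharge"]
gamma = 0.95                                  # per-time-unit discount

# tau[s, a]: expected sojourn time for action a in state s (assumed)
tau = np.array([[2.0, 5.0, 4.0],
                [2.0, 5.0, 3.0],
                [2.0, 5.0, 1.0]])

# R[s, a]: immediate reward (assumed; long tasks on low battery penalized,
# recharging at low battery rewarded)
R = np.array([[-1.0, -10.0,  2.0],
              [ 1.0,   0.5,  0.0],
              [ 2.0,   3.0, -1.0]])

# P[a, s, s']: battery-level transition probabilities per action (assumed)
P = np.array([
    [[0.9, 0.1, 0.0], [0.5, 0.5, 0.0], [0.1, 0.6, 0.3]],  # short_task
    [[1.0, 0.0, 0.0], [0.8, 0.2, 0.0], [0.3, 0.6, 0.1]],  # long_task
    [[0.1, 0.5, 0.4], [0.0, 0.3, 0.7], [0.0, 0.0, 1.0]],  # recharge
])

def smdp_value_iteration(R, P, tau, gamma, iters=500):
    """Bellman backups with sojourn-time-dependent discounting gamma**tau."""
    n_s, n_a = R.shape
    V = np.zeros(n_s)
    for _ in range(iters):
        Q = np.empty((n_s, n_a))
        for a in range(n_a):
            # Slow actions (large tau) discount the future more strongly.
            Q[:, a] = R[:, a] + (gamma ** tau[:, a]) * (P[a] @ V)
        V = Q.max(axis=1)
    return V, Q.argmax(axis=1)

V, policy = smdp_value_iteration(R, P, tau, gamma)
for s, a in zip(states, policy):
    print(f"battery {s}: {actions[a]}")
```

Under these assumed rewards, the resulting policy sends a low-battery UAV to recharge rather than accept tasks, which is the kind of battery-aware behavior the study's framework is designed to produce; the paper's actual reward matrices and state model differ.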