Reinforcement Learning Approaches for Energy-Efficient Embedded Systems: A Survey


Dr. Amit Jain

Abstract

The growing use of embedded systems in mobile computing, biomedical applications, industrial automation, and the Internet of Things (IoT) has intensified the need to operate these systems intelligently and energy-efficiently under tight computational and power constraints. Reinforcement Learning (RL) offers a promising means of optimizing power consumption and system performance through adaptive, data-driven decision-making in real time. This article presents an in-depth survey of RL-based approaches to embedded system design, with particular focus on model-free and model-based learning, energy-aware learning, and lightweight algorithms suited to resource-constrained platforms. Key applications include Dynamic Voltage and Frequency Scaling (DVFS), CPU scheduling, real-time object detection, and autonomous control of embedded robots. The survey also reviews simulation environments such as MATLAB/Simulink, OpenAI Gym, and Network Simulator 3 (NS-3), as well as common hardware platforms such as ARM Cortex-M, NVIDIA Jetson, and Texas Instruments MSP430. The literature reports significant achievements, including power savings of up to 47% and latency reductions obtained with deep RL and adaptive Convolutional Neural Networks (CNNs). Nonetheless, barriers remain in safe policy learning, real-time deployment, and reliability in dynamic environments. The paper concludes with key research directions, including scalable RL frameworks, energy-aware reward functions, and advanced simulation techniques for next-generation intelligent embedded systems.
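
To make the energy-aware reward idea mentioned above concrete, the following is a minimal sketch of tabular Q-learning applied to DVFS level selection. The frequency levels, toy workload model, and reward weights are illustrative assumptions for this sketch only; they are not values or methods taken from any of the surveyed works.

```python
# Minimal sketch: tabular Q-learning for DVFS with an energy-aware reward.
# All environment parameters below are hypothetical.
import random

FREQ_LEVELS = [0.6, 0.8, 1.0, 1.2]      # candidate CPU frequencies (GHz), illustrative
LOAD_BINS = 4                            # discretized workload states
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1    # learning rate, discount, exploration rate
W_ENERGY, W_LATENCY = 1.0, 2.0           # weights in the energy-aware reward

# Q-table: one row per workload bin, one column per frequency level
Q = [[0.0] * len(FREQ_LEVELS) for _ in range(LOAD_BINS)]

def step(load_bin, action):
    """Toy plant model: energy grows roughly with f^2, latency shrinks with f."""
    f = FREQ_LEVELS[action]
    load = (load_bin + 1) / LOAD_BINS
    energy = f ** 2                          # dynamic-power proxy
    latency = load / f                       # work divided by speed
    deadline_miss = 1.0 if latency > 0.9 else 0.0
    reward = -(W_ENERGY * energy + W_LATENCY * deadline_miss)
    next_load = random.randrange(LOAD_BINS)  # workload evolves randomly in this toy model
    return reward, next_load

def choose(load_bin):
    """Epsilon-greedy action selection over the Q-table row."""
    if random.random() < EPSILON:
        return random.randrange(len(FREQ_LEVELS))
    row = Q[load_bin]
    return row.index(max(row))

load_bin = 0
for _ in range(20000):
    a = choose(load_bin)
    r, nxt = step(load_bin, a)
    # Standard Q-learning update
    Q[load_bin][a] += ALPHA * (r + GAMMA * max(Q[nxt]) - Q[load_bin][a])
    load_bin = nxt

for s, row in enumerate(Q):
    print(f"load bin {s}: learned frequency {FREQ_LEVELS[row.index(max(row))]} GHz")
```

A lookup-table agent of this size fits comfortably within the memory budget of microcontroller-class devices such as the ARM Cortex-M parts noted above, which is one reason tabular and other lightweight RL methods recur throughout the surveyed literature.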


Article Details

Section

Research Paper

How to Cite

Reinforcement Learning Approaches for Energy-Efficient Embedded Systems: A Survey. (2025). Journal of Global Research in Electronics and Communications (JGREC), 1(10), 23-29. https://doi.org/10.5281/zenodo.17422713
