Abstract
Autonomous systems (AS), such as aerial drones, autonomous vehicles, and ground/aquatic robots, adapt their behavior in response to unanticipated events. AS require a wide array of sensors, deep learning models, and powerful hardware platforms to perceive the environment and operate safely in real time. However, in many contexts, some sensing modalities degrade perception while increasing the system's overall energy consumption. Since AS are often energy-constrained edge devices, energy-efficient sensor fusion methods have been proposed. However, existing methods either fail to adapt to changing scenario conditions or fail to optimize system-wide energy efficiency. We propose CARMA, a context-aware sensor fusion approach that uses context to dynamically reconfigure the computation flow on a field-programmable gate array (FPGA) at runtime. By clock gating unused sensors and model sub-components, CARMA significantly reduces the energy used by a multi-sensory object detector without compromising performance. We use a deep learning processor unit (DPU)-based reconfiguration approach to minimize model reconfiguration latency. We evaluate multiple context identification strategies, propose a novel joint system-wide energy-performance optimization, and evaluate scenario-specific perception performance. Across challenging real-world sensing contexts, CARMA outperforms state-of-the-art methods, achieving up to a 1.3× speedup and 73% lower energy consumption.