Resource-constrained AI

Resource-constrained AI is of inherent interest when AI is deployed on self-powered devices with a limited form factor, such as wearables and stand-alone edge devices. On such hardware, computational power must be spent diligently to conserve battery life. Fortunately, training can often still be performed centrally, where compute power is readily available, while only inference needs to happen at the edge. Nevertheless, the trained models typically still need to be pruned, or run at reduced precision, without overly sacrificing inference accuracy; this involves a trade-off between accuracy and power consumption. Custom model architectures for specialised hardware are another related research topic.
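As a rough illustration of the pruning and reduced-precision step described above, the sketch below uses standard PyTorch utilities: magnitude-based weight pruning followed by post-training dynamic quantization. The model, layer sizes, and pruning ratio are purely hypothetical examples and do not reflect any specific IDLab toolchain.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical small model standing in for a centrally trained network.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# Prune 30% of the smallest-magnitude weights in each linear layer,
# then make the pruning permanent by removing the re-parametrisation.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")

# Post-training dynamic quantization: weights are stored as int8 and
# activations are quantized on the fly, reducing memory footprint and
# the cost of multiply-accumulate operations at inference time.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# On the edge device, only this compressed model is run for inference.
example_input = torch.randn(1, 64)
with torch.no_grad():
    output = quantized_model(example_input)
print(output.shape)  # torch.Size([1, 10])
```

The pruning ratio and quantization precision are exactly the knobs that trade inference accuracy against power consumption; in practice they would be tuned per model and per target device.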

Multiple teams within IDLab are active in this field of research, leading to a diverse range of applications:

  • Human and animal health monitoring
  • Autonomous navigation
  • Remote monitoring and event detection
  • Distributed agent orchestration