Trackez: An IoT-Based 3D-Object Tracking From 2D Pixel Matrix Using Mez and FSL Algorithm

Nuruzzaman Faruqui, Md Alamgir Kabir, Mohammad Abu Yousuf, Md Whaiduzzaman, Alistair Barros, Imran Mahmud

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Imaging devices sense light reflected from objects and reconstruct images on a 2D sensor matrix. The result is a 2D Cartesian coordinate system in which the depth dimension is absent. The absence of a depth axis in 2D images makes it challenging to locate and track objects in a 3D environment. Real-time object tracking faces a further challenge imposed by network latency. This paper presents the development and analysis of a real-time, real-world object tracker called Trackez, which is capable of tracking within the top hemisphere. It uses Machine Vision at the IoT Edge (Mez) technology to mitigate latency sensitivity. A novel algorithm, Follow-Satisfy-Loop (FSL), has been developed and implemented that tracks the target optimally without requiring the depth axis. The simple, innovative design and the incorporation of Mez technology make the proposed object tracker latency-insensitive, Z-axis-independent, and effective. Trackez reduces average latency by 85.08% and improves average accuracy by 81.71%. The tracker accurately follows objects moving in regular and irregular patterns at speeds of up to 5.4 ft/s. This accurate, latency-tolerant, and Z-axis-independent tracking system contributes to building better robotics systems that require object tracking.
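The abstract does not spell out the internal steps of the Follow-Satisfy-Loop algorithm, so the sketch below is only an illustration of the kind of Z-axis-independent loop it describes: read the target's position from the 2D pixel matrix, compute its pixel-space offset from the frame centre, and steer pan/tilt actuators until the offset falls within a tolerance (the "satisfy" condition). The helper names (`camera.read`, `detect_target`, `pan_tilt.step`) and all constants are hypothetical, not the authors' implementation.

```python
# Illustrative sketch only -- not the published FSL implementation.
# Assumed interfaces: camera.read() returns the latest 2D frame,
# detect_target(frame) returns the target's (x, y) pixel coordinates or None,
# and pan_tilt.step(dx, dy) nudges the camera mount.

import time

FRAME_W, FRAME_H = 640, 480   # assumed size of the 2D pixel matrix
TOLERANCE_PX = 20             # "satisfied" when the target is this close to centre
GAIN = 0.05                   # proportional gain for pan/tilt corrections


def follow_satisfy_loop(camera, pan_tilt, detect_target):
    """Keep the target near the frame centre using only 2D pixel offsets.

    No depth (Z-axis) estimate is used: the loop reacts purely to the
    target's displacement within the 2D sensor matrix.
    """
    cx, cy = FRAME_W / 2, FRAME_H / 2
    while True:
        frame = camera.read()                    # latest 2D pixel matrix
        target = detect_target(frame)            # (x, y) or None
        if target is None:
            time.sleep(0.01)                     # wait for the target to reappear
            continue

        dx, dy = target[0] - cx, target[1] - cy  # pixel-space error
        if abs(dx) <= TOLERANCE_PX and abs(dy) <= TOLERANCE_PX:
            continue                             # "satisfy" condition met; keep looping

        # "Follow": proportional pan/tilt correction toward the target.
        pan_tilt.step(GAIN * dx, GAIN * dy)
```

Because the correction depends only on the pixel offset, the same loop behaves identically whether the object approaches or recedes, which is one way to read the abstract's claim of Z-axis independence.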

Original language: English
Pages (from-to): 61453-61467
Number of pages: 15
Journal: IEEE Access
Volume: 11
Publication status: Published - 2023

Keywords

  • 2D coordinate
  • 3D coordinate
  • latency sensitivity
  • Machine vision
  • Mez
  • object tracking
  • the IoT edge
