Advancing mobile sensing with resilient distributed computing
As armed conflicts continue to unfold on the world stage, unmanned aerial vehicles (UAVs) are playing pivotal roles. Technological innovation on the battlefield calls for strategic use of UAVs beyond tactical reconnaissance and surveillance. One approach is to integrate resilient distributed computing to elevate real-time intelligence, streamline decision-making, and fortify the resilience of mobile sensing systems. The intersection of advancing mobile sensing and resilient distributed computing creates the opportunity not only to adapt to the challenges of modern conflict but also to redefine the future of military and civilian applications.
To understand the architecture and integration of distributed computing, it is important to first understand sensor fusion. Sensor fusion combines the strengths of a detective, a photographer, and a surveyor into a super-sleuth that sees everything. It takes data from multiple sensors, such as cameras, radar, and lidar, and combines them into a single, more detailed picture of the world around us. This fused data is more accurate and reliable than any one sensor on its own, revealing details and insights each sensor alone might miss. Sensor fusion allows us to see through walls, track movement in the dark, and add texture to our maps of the world. It is a vital tool in modern conflict, and one that requires constant modernization.
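To make the idea concrete, here is a minimal sketch of one classic fusion rule, inverse-variance weighting, written in Java. Each sensor's estimate of the same quantity is weighted by how trustworthy that sensor is; the readings and noise figures below are illustrative, not drawn from any real system.

```java
/**
 * Minimal sketch of sensor fusion via inverse-variance weighting.
 * Each sensor reports an estimate of the same quantity (here, range to a
 * target in meters) plus its noise variance; the fused estimate weights
 * each reading by 1/variance, so trustworthy sensors count for more.
 * All numbers are illustrative.
 */
public class FusionSketch {
    static double fuse(double[] estimates, double[] variances) {
        double weightedSum = 0.0, weightTotal = 0.0;
        for (int i = 0; i < estimates.length; i++) {
            double w = 1.0 / variances[i];
            weightedSum += w * estimates[i];
            weightTotal += w;
        }
        return weightedSum / weightTotal;
    }

    public static void main(String[] args) {
        double[] range = {105.2, 98.7, 101.5};  // camera, radar, lidar (m)
        double[] noise = {25.0, 4.0, 1.0};      // per-sensor variance (m^2)
        System.out.printf("fused range: %.2f m%n", fuse(range, noise));
        // The fused variance, 1 / (sum of 1/variance), is lower than any
        // single sensor's variance: the "more accurate than any one sensor"
        // property described above.
    }
}
```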
The increasing demands on mobile sensing make processing every sensing task on a single mobile device a daunting proposition. In cooperative multi-UAV sensing, cutting-edge technology meets the formidable challenges of data heterogeneity and communication bandwidth constraints. The first obstacle lies in the diversity of sensors carried by different UAVs, which produces a mosaic of data types, from images and lidar scans to chemical readings, each with its own resolution, format, and accuracy. Overcoming this requires advanced algorithms capable of seamlessly fusing heterogeneous data while compensating for its inherent discrepancies.
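One small but representative piece of that work can be sketched in code. The example below uses hypothetical field names, units, and accuracy figures, not Intelligent Fusion Technology's schema; it shows the usual first step, adapter functions that normalize each UAV's native payload into one common, timestamped measurement record so that downstream fusion algorithms see a single format.

```java
import java.time.Instant;

/**
 * Minimal sketch of normalizing heterogeneous UAV sensor payloads into a
 * single schema before fusion. Field names, units, and accuracy figures
 * are hypothetical.
 */
public class NormalizeSketch {
    /** Common record that every downstream fusion algorithm consumes. */
    record Measurement(String uavId, String modality, Instant time,
                       double value, double accuracy) {}

    /** Lidar reports range in meters with millisecond timestamps. */
    static Measurement fromLidar(String uavId, long epochMillis, float rangeM) {
        return new Measurement(uavId, "lidar", Instant.ofEpochMilli(epochMillis),
                rangeM, 0.05);
    }

    /** A chemical sensor reports ppb at 1 Hz; convert to ppm for consistency. */
    static Measurement fromChem(String uavId, long epochSeconds, double ppb) {
        return new Measurement(uavId, "chemical", Instant.ofEpochSecond(epochSeconds),
                ppb / 1000.0, 2.0);
    }

    public static void main(String[] args) {
        System.out.println(fromLidar("uav-1", 1700000000000L, 42.5f));
        System.out.println(fromChem("uav-2", 1700000000L, 350.0));
    }
}
```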
The second challenge surfaces in communication and bandwidth limitations. Real-time data exchange between the UAVs and a central processing node is often constrained by limited bandwidth and unreliable wireless channels, impeding smooth collaboration and delaying analysis.
The third challenge is the unpredictable nature of data-collection conditions, encompassing data loss, connection disruptions, and sensor malfunctions. Successfully fusing whatever sensing data remains under these conditions is itself a pivotal problem, demanding innovative strategies to maintain high-quality results in the face of uncertainty. Navigating this trifecta of challenges is central to unlocking the true potential of cooperative UAV sensing in dynamic environments.
Enter Intelligent Fusion Technology's solution: a resilient mobile distributed computing framework that seamlessly integrates coded computing (CC) and named data networking (NDN). The framework dynamically optimizes network traffic and information sharing within the edge network, adapting to ever-changing conditions. Crucially, the CC technique allows missing computation results to be recovered, mitigating the impact of edge-node failures or disconnections.
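The article does not detail the specific code construction used, but the core idea behind CC can be shown with a toy example: a small amount of redundant, coded computation lets the system reconstruct a lost result instead of re-running the task. The Java sketch below uses a single parity task over a linear computation, the simplest possible erasure code; the matrices and inputs are illustrative.

```java
import java.util.Arrays;

/**
 * Toy illustration of coded computing: two workers each compute a block of
 * a linear task (A1*x and A2*x), and a third "parity" worker computes
 * (A1+A2)*x. Because the task is linear, any one lost result can be
 * rebuilt from the other two, so a straggling or disconnected edge node
 * does not stall the job. Matrices and inputs are illustrative.
 */
public class CodedComputingSketch {
    static double[] multiply(double[][] a, double[] x) {
        double[] y = new double[a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < x.length; j++)
                y[i] += a[i][j] * x[j];
        return y;
    }

    static double[][] addMatrices(double[][] a, double[][] b) {
        double[][] c = new double[a.length][a[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                c[i][j] = a[i][j] + b[i][j];
        return c;
    }

    static double[] subtract(double[] u, double[] v) {
        double[] w = new double[u.length];
        for (int i = 0; i < u.length; i++) w[i] = u[i] - v[i];
        return w;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0};
        double[][] a1 = {{1, 0}, {0, 1}};        // worker 1's block
        double[][] a2 = {{2, 1}, {1, 3}};        // worker 2's block
        double[][] parity = addMatrices(a1, a2); // coded block for worker 3

        double[] r1 = multiply(a1, x);           // worker 1 returns
        double[] rp = multiply(parity, x);       // parity worker returns
        // Worker 2 dropped off the network; recover its result anyway:
        double[] r2 = subtract(rp, r1);          // (A1+A2)x - A1x = A2x
        System.out.println(Arrays.toString(r2)); // prints [4.0, 7.0]
    }
}
```

Production systems generalize this one-parity scheme to (n, k) codes that tolerate several simultaneous failures, but the recovery principle is the same.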
Concept of intelligent sensor fusion. Photo credit: BorderUAS, A. Kyriakopoulos
At the core of this innovation lies the integration of CC atop a target recognition model, complemented by incorporating NDN into the robust Apache Storm platform. Apache Storm is a real-time data processing platform that enables seamless analysis of continuous streams of data as they flow into a system. Think of it as a dynamic and responsive “brain” that can instantly process and analyze data as it arrives, making it particularly valuable for applications requiring timely insights and rapid decision-making. Storm allows developers to create distributed, fault-tolerant applications that process and react to data in real time, making it a powerful tool for tasks ranging from live analytics and event monitoring to processing data from sensors or IoT devices. Extensive simulations and quantitative results have underscored the framework’s efficacy: it delivers resilient, robust computing while reducing network traffic. This breakthrough redefines mobile distributed computing, particularly in challenging network scenarios; the framework maintains remarkably accurate, high-level target recognition even when the original sensing data is incomplete.
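For readers unfamiliar with Storm, the sketch below shows the basic shape of a topology in Storm's native Java API (Storm 2.x): a spout emits a stream of tuples, and bolts wired to it by groupings process them in parallel. The synthetic sensor spout and field names are hypothetical stand-ins for the kind of streaming sensor pipeline described above, not the company's actual components.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Random;

import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class SensorTopologySketch {

    /** Emits synthetic (sensor_id, reading) tuples; a stand-in for real UAV feeds. */
    public static class SensorSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;
        private final Random rng = new Random();

        @Override
        public void open(Map<String, Object> conf, TopologyContext context,
                         SpoutOutputCollector collector) {
            this.collector = collector;
        }

        @Override
        public void nextTuple() {
            collector.emit(new Values("sensor-" + rng.nextInt(3),
                                      100.0 + rng.nextGaussian()));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("sensor_id", "reading"));
        }
    }

    /** Keeps a running average per sensor and emits it downstream. */
    public static class AveragingBolt extends BaseBasicBolt {
        private final Map<String, double[]> sumAndCount = new HashMap<>();

        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            String id = tuple.getStringByField("sensor_id");
            double reading = tuple.getDoubleByField("reading");
            double[] sc = sumAndCount.computeIfAbsent(id, k -> new double[2]);
            sc[0] += reading;
            sc[1] += 1.0;
            collector.emit(new Values(id, sc[0] / sc[1]));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("sensor_id", "running_avg"));
        }
    }

    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("sensor-spout", new SensorSpout(), 2);
        // fieldsGrouping routes each sensor's tuples to a consistent bolt task,
        // so per-sensor state stays coherent across the parallel instances.
        builder.setBolt("avg-bolt", new AveragingBolt(), 4)
               .fieldsGrouping("sensor-spout", new Fields("sensor_id"));
        // builder.createTopology() would then be submitted via LocalCluster
        // (for testing) or StormSubmitter (for a real cluster).
    }
}
```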
The technology extends into the realm of cooperative augmented reality (AR), offering real-time, immersive, and context-aware situational awareness while enhancing mobile sensing capabilities. By leveraging distributed edge computing, the system gives rise to a cooperative AR ecosystem that integrates advanced sensing, communication, and processing techniques. Successfully deployed with HoloLens headsets and edge servers over a wireless network, the system captures data from multiple HoloLens sensors, fuses it, recognizes objects precisely, and seamlessly projects 3D models into the wearer’s field of view in real time.
A detailed exploration of the intricate architecture of this cooperative AR system sheds light on its distributed sensing and edge computing components, all of which are integrated into the Apache Storm platform. From data collection and aggregation to analysis, object recognition, and real-time 3D model rendering on HoloLens, the system showcases the feasibility and advantages of integrating distributed cooperative sensing and edge computing. This not only offers dynamic, immersive AR experiences but also paves the way for novel applications that transcend traditional boundaries.
Intelligent Fusion Technology has successfully translated these groundbreaking concepts into a tangible prototype featuring a heterogeneous distributed computing platform. This platform, driven by Apache Storm and an in-house developed machine learning-based scheduler for task assignments, signals a promising future for mobile sensing and augmented reality.
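The article does not describe the scheduler's internals, so the sketch below should be read as a generic illustration of the idea, not Intelligent Fusion Technology's design: a learned model predicts each task's runtime on each node, and a greedy policy assigns every task to the node with the earliest predicted finish time. The RuntimePredictor interface is a hypothetical stand-in for the trained model.

```java
import java.util.List;

/**
 * Generic sketch of prediction-driven task assignment. A trained model
 * (stubbed here with a lambda) estimates how long each task would run on
 * each node; a greedy policy then places each task on the node that would
 * finish it earliest. An illustration of the concept only.
 */
public class SchedulerSketch {
    /** Hypothetical stand-in for a learned runtime model. */
    interface RuntimePredictor {
        double predictSeconds(String task, String node);
    }

    static void assign(List<String> tasks, List<String> nodes, RuntimePredictor model) {
        double[] busyUntil = new double[nodes.size()]; // predicted load per node
        for (String task : tasks) {
            int best = 0;
            double bestFinish = Double.MAX_VALUE;
            for (int n = 0; n < nodes.size(); n++) {
                double finish = busyUntil[n] + model.predictSeconds(task, nodes.get(n));
                if (finish < bestFinish) { bestFinish = finish; best = n; }
            }
            busyUntil[best] = bestFinish;
            System.out.printf("%s -> %s (finish ~%.1fs)%n",
                              task, nodes.get(best), bestFinish);
        }
    }

    public static void main(String[] args) {
        // Toy model: the GPU node runs vision tasks 4x faster than CPU nodes.
        RuntimePredictor model = (task, node) ->
                task.startsWith("vision") && node.equals("gpu-node") ? 1.0 : 4.0;
        assign(List.of("vision-1", "vision-2", "fusion-1"),
               List.of("gpu-node", "cpu-node"), model);
    }
}
```

Note how the toy policy sends the third task to the slower node once the fast one is loaded: balancing predicted finish times across heterogeneous hardware is exactly the kind of decision a learned scheduler automates.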
Intelligent Fusion Technology remains committed to innovation, pushing the boundaries of mobile sensing and distributed computing and shaping a future where technological resiliency, efficiency, and limitless possibilities converge.
Genshe Chen is chief technology officer for Intelligent Fusion Technology.