Situational Awareness Platform – Part 2: Data Fusion

Victoria Heusinger-Heß, Jakob Stigler, Katharina Ross – Fraunhofer Institute for High-Speed Dynamics, Ernst-Mach-Institut, EMI, Michael Hubner – AIT Austrian Institute of Technology – Center for Digital Safety & Security
June 23, 2022 | Blogs

The overall TeamAware system is a complex aggregation of a wide array of different sensor systems coupled with individual software components that filter, interpret and digest the raw sensor data in different forms. These individual components are being actively developed over the course of the project to achieve the final project result: a comprehensive situational awareness picture.

The TeamAware data fusion work package addresses the main project goal of providing practitioners with a large amount of leverageable information to enhance their in-situ capabilities without hindering their performance through information overload.

In particular, the data fusion capabilities involve the alignment of the different data delivered by the individual sensor systems, like optical or acoustic sensors, to gain additional information which cannot be obtained by individual systems alone.

The goal is to decrease false alarms by combining different data sources and to increase measurement precision through metadata interpretation.

To achieve this goal, different individual data fusion modules are being developed or further improved within the project. Some of them already have shown great promise in previous projects, such as the MuFASA (Multimodal Fusion Architecture for Sensor Applications) framework from AIT.

MuFASA addresses various disciplines of data fusion, such as data imperfection, data alignment/registration and data heterogeneity, and provides data fusion modules that handle said tasks. In TeamAware, the GeoFusion module, which is part of MuFASA, will in particular be further improved and developed. The core concept of GeoFusion lies in probabilistic reasoning based on Bayesian inference over different sensor modalities.

This approach is used to provide multi-sensor fusion capabilities including data alignment and registration, i.e. coincidence in time and space. The main tasks of MuFASA are summarized as:

  • Spatial-temporal coincidence of different sensor modalities
  • Updating evidence of objects of interest in time
  • Evidence-based fusion of various sensor modalities
  • Confidence-based triggering of fused alarms
  • Localization of fused alarms
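Two of the tasks above, evidence-based fusion and confidence-based alarm triggering, can be sketched with a simple Bayesian log-odds update. This is a minimal illustration of the general principle only, assuming conditionally independent sensors that each report a detection confidence; the function names and threshold are illustrative and not part of the actual MuFASA API:

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def fuse_detections(prior: float, sensor_probs: list[float]) -> float:
    """Fuse independent per-sensor detection confidences for one object
    of interest: posterior odds = prior odds x product of per-sensor
    likelihood ratios, computed additively in log-odds space."""
    l = logit(prior)
    for p in sensor_probs:
        l += logit(p)  # each modality adds its evidence
    return 1.0 / (1.0 + math.exp(-l))

# Confidence-based triggering: raise a fused alarm only when the combined
# evidence exceeds a threshold that no single sensor reached on its own.
ALARM_THRESHOLD = 0.9
fused = fuse_detections(0.5, [0.8, 0.7])  # e.g. optical 0.8, acoustic 0.7
alarm = fused > ALARM_THRESHOLD
```

In this toy case neither sensor alone clears the 0.9 threshold, but their combined evidence does, which is exactly the effect that reduces false alarms: weak, uncorroborated detections stay below the alarm level.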

One of the main obstacles to further increasing the usefulness of sensor data for practitioners is the inherent unreliability of the reported data themselves. Each sensor measurement taken in a real environment is unreliable to some degree: varying conditions, which cannot all be accounted for, may alter the sensor's function or the reported measurement. For the practitioners, on the other hand, it is of utmost importance that the reported data are as reliable as possible, because they may guide the operator's decision-making in a major way.

To address this problem for some systems, the Multi-Agent Navigation Frame Reinforcement Determination (MANFReD) module is currently under development at Fraunhofer EMI. MANFReD aims to reinforce the localization data provided by TeamAware subsystems by correlating the localization data from various on-person systems, based on metadata information as well as reports from external sensor systems available to the TeamAware system (e.g. camera information). This boosts the reliability of the provided information and corrects, in particular, for systematic errors such as location drift.
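The drift-correction idea can be sketched as follows: if an external reference (e.g. a camera fix) occasionally observes the same person as an on-person localization system, the systematic offset between the two can be estimated and subtracted. This is a hypothetical simplification, not MANFReD's actual algorithm; the function names, the 2D coordinates and the constant-offset drift model are all illustrative assumptions:

```python
def estimate_drift(track: dict, camera_fixes: dict) -> tuple:
    """Estimate a systematic (x, y) offset of an on-person localization
    track by averaging its difference to time-matched external camera
    fixes. Both inputs map timestamp -> (x, y)."""
    common = sorted(set(track) & set(camera_fixes))
    if not common:
        return (0.0, 0.0)  # no external reference available, assume no drift
    dx = sum(track[t][0] - camera_fixes[t][0] for t in common) / len(common)
    dy = sum(track[t][1] - camera_fixes[t][1] for t in common) / len(common)
    return (dx, dy)

def correct_track(track: dict, drift: tuple) -> dict:
    """Subtract the estimated systematic offset from every track point."""
    dx, dy = drift
    return {t: (x - dx, y - dy) for t, (x, y) in track.items()}

# On-person track that has drifted by a constant (2, -1) relative to
# two sporadic camera fixes of the same person:
track = {0: (10.0, 5.0), 1: (11.0, 5.5), 2: (12.0, 6.0)}
fixes = {0: (8.0, 6.0), 2: (10.0, 7.0)}
corrected = correct_track(track, estimate_drift(track, fixes))
```

Note that the camera fixes cover only some timestamps, yet the estimated offset corrects the whole track, including points with no external reference.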

The Sensor Information Density (SID) module is another example of data fusion software currently under development for the TeamAware project. SID is a metadata fusion module that aims to construct information at an operational level by observing the location and time of collected sensor information from any kind of sensor subsystem and correlating them with a model specific to the individual sensor type. This makes it possible to build an overview of sensor coverage, information density and information decay over time for the in-situ operator, based on live data. Areas that are currently not being observed can thus be identified easily, and the operator gains an impression of information quality, which aids the decision-making process by providing an interpretable view of the available information itself.
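The notions of information density and decay can be sketched with an exponential-decay model: each observation of an area contributes a weight that halves after a fixed time, so areas without recent observations fall below a coverage threshold. This is a minimal sketch of the concept only; the half-life value, cell labels and threshold are illustrative assumptions, not SID's actual sensor-type models:

```python
import math

def information_density(observations: list, cell: str, now: float,
                        half_life: float = 60.0) -> float:
    """Current information density of one area cell: each past observation
    (cell, timestamp) contributes a weight that decays exponentially with
    age, halving every half_life seconds."""
    lam = math.log(2.0) / half_life
    return sum(math.exp(-lam * (now - t))
               for c, t in observations if c == cell and t <= now)

# Observations collected by any sensor subsystem: (area cell, timestamp).
obs = [("A", 0.0), ("A", 60.0), ("B", 100.0)]

# Cells whose information has decayed below a coverage threshold are
# flagged as currently unobserved.
uncovered = [cell for cell in ("A", "B", "C")
             if information_density(obs, cell, now=120.0) < 0.1]
```

At `now=120.0` with a 60 s half-life, cell "A" retains 0.25 + 0.5 = 0.75 from its two aging observations, cell "B" still has a recent one, and cell "C" has never been observed, so only "C" is flagged as uncovered.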

As additional systems are developed and integrated, further opportunities for data combination can be identified, increasing the information flow to the operator and enhancing both the decision-making process and the overall usefulness of the system.

Figure 1: Mock-up of the basic principles used in the MANFReD module.

Contact

Monica Florea
Administrative Coordinator

European Projects Department
SIMAVI
Soseaua Bucuresti-Ploiesti 73-81 COM
Bucuresti/ROMANIA

Email:

Çağlar Akman
Technical Coordinator

Command and Control Systems
HAVELSAN
Eskişehir Yolu 7 km
Ankara/TURKEY

Email:

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101019808.

© 2021 | TeamAware All Rights Reserved