The increasing amount and variety of data related to emergency cases pose continuous challenges to first responders, who mainly consist of firefighters, emergency medical services, and law enforcement agencies. Equipped with specialized skills and qualifications, they are the people, services, and organizations whose duty is to arrive first at the emergency zone, carry out rescue operations, and perform crisis management in natural or human-made disasters. The characteristics and nature of such emergency cases are usually quite complicated. At the same time, because the existing information and communication technology suffers from fragility and unstable connectivity, the adoption of the latest computer technologies in emergency-related scenarios is still far from satisfactory.
Recently, several research efforts have focused on adopting the latest augmented reality (AR) and virtual reality (VR) technologies to serve first responders. Wang et al. proposed an AR system designed to facilitate the control of a rescue robot and the exploration of unknown environments for urban search and rescue operations1. Based on simultaneous localization and mapping (SLAM) with an RGB-D camera, their system first estimates the position and posture of the rescue robot. They then use a deep learning-based algorithm to obtain the target location, place an AR marker in the global coordinate frame, and display it on the operator's screen to indicate the target even when it is out of the camera's view. Park et al. presented an AR-based emergency management system for fire safety route guidance2. In case of fire in a building, their system can acquire visibility conditions and locate occupants, and it provides visual information and optimal guidance for a quick initial response through an AR-based disaster management service that links the physical and virtual domains of the building. Hu et al. pursued a high scene-rendering frame rate to achieve better immersion and prevent users from feeling dizzy in disaster scene simulation3. They designed a plugin-free browser/server (B/S) architecture for 3D disaster scene construction and visualization in mobile VR, focusing on the construction and optimization of 3D disaster scenes to satisfy the high frame-rate requirements of mobile VR rendering.
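To make the mechanism described for Wang et al.'s system concrete, the following is a minimal sketch (not taken from the cited paper) of how a target stored in a global coordinate frame can be re-projected onto the operator's screen every frame from the SLAM-estimated camera pose, with an indicator clamped to the screen edge when the target is out of view. The pinhole camera model and all function and parameter names are assumptions made here for illustration.

import numpy as np

def marker_screen_position(target_w, R, t, fx, fy, cx, cy, width, height):
    """Return (u, v, on_screen) for a world-space target point."""
    p_cam = R @ np.asarray(target_w) + t           # world -> camera coordinates (pose from SLAM)
    if p_cam[2] <= 0:                              # target behind the camera: mirror to keep a direction cue
        p_cam = -p_cam
    u = fx * p_cam[0] / p_cam[2] + cx              # pinhole projection to pixel coordinates
    v = fy * p_cam[1] / p_cam[2] + cy
    on_screen = (0 <= u < width) and (0 <= v < height)
    if not on_screen:                              # clamp an off-screen indicator to the screen border
        du, dv = u - width / 2, v - height / 2
        scale = min((width / 2) / abs(du) if du else float("inf"),
                    (height / 2) / abs(dv) if dv else float("inf"))
        u, v = width / 2 + du * scale, height / 2 + dv * scale
    return u, v, on_screen

In such a scheme the marker remains anchored to the world-space target, so the operator sees either the marker itself or a border indicator pointing toward it when the robot's camera looks elsewhere.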
Meanwhile, AR/VR serves as an effective approach to reproducing a disaster scene so that information can be extracted from the virtual environment for safety assessment. In this regard, AR/VR-based safety training is another significant application in this field. Virtual simulations of complex situations can enable trainees to gain a comprehensive understanding of safety issues4. Gong et al. designed an earthquake evacuation education system and conducted a virtual earthquake drill in a dormitory5. Boulos et al. studied the possibility of applying a VR geographic information system (GIS) in emergency training6. Pham et al. used VR-based interactive safety education and building anatomy modeling to teach students safety operation knowledge7. Lovreglio et al. compared the effectiveness of fire extinguisher training delivered with VR and with video, and concluded that people trained with VR obtained better scores in knowledge acquisition and self-efficacy8.
AR/VR-based applications have also gained considerable attention in emergency response research during the last several years. Meng et al. discovered that a virtual environment with smoke and virtual fire can induce participants to experience higher physiological and psychological stress. Their study focused on how to improve the quality of emergency simulation in a VR experiment and better reproduce a real fire situation9. Bourhim et al. proposed a holistic evaluation system to measure the efficacy of VR fire simulations, and concluded that VR simulation can be both realistic and engaging10. Goldiez et al. evaluated rescue navigation training using an AR map; their results indicated that, compared with a traditional paper map or a compass, an AR map can improve wayfinding performance in search and rescue11. Xu et al. developed a VR-based fire training simulator with smoke hazard assessment capacity12. One of their training scenarios was a fire rescue in a primary school, in which VR was used to train firefighters to choose a safer path to rescue a trapped pupil.
Some studies also focus on employing AR/VR technology for post-emergency recovery, which mainly includes damage detection and building reconstruction, in order to assess disaster damage and rebuild buildings in the affected areas. In such cases, AR/VR can help users better understand the condition of a building and plan its reconstruction. Dai et al.13 investigated the application of AR visualization to inter-story drift measurement through photogrammetry, extracting information from images and measuring the inter-story drift from the generated data. Dong et al.14 carried out inter-story drift measurement based on AR technology; they used an AR algorithm to superimpose a baseline on a structure and detect building edges, avoiding the pre-installation of infrastructure. Kim et al.15 employed mobile AR for rapid status assessment of buildings after earthquakes, owing to its advantages when measuring structural integrity and specific damage status.
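For clarity, the quantity measured in these studies, the inter-story drift ratio, is the relative lateral displacement between two consecutive floors divided by the story height. The short sketch below is an illustrative assumption, not the cited authors' algorithm; the displacements would come from the photogrammetric or AR-based measurements described above.

def inter_story_drift_ratios(lateral_displacements, story_heights):
    """lateral_displacements[i] is the lateral displacement of floor i (floor 0 = ground)."""
    return [(lateral_displacements[i] - lateral_displacements[i - 1]) / story_heights[i - 1]
            for i in range(1, len(lateral_displacements))]

# Example: a 3-story building, displacements and story heights in metres.
print(inter_story_drift_ratios([0.00, 0.02, 0.05, 0.07], [3.0, 3.0, 3.0]))
# -> approximately [0.0067, 0.01, 0.0067]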
At the same time, other studies have focused on utilizing AR/VR technology in the emergency management field. Visualization enhancement can improve users' learning ability, convey complex knowledge conveniently, and demonstrate abstract concepts16. Such advantages benefit emergency management applications, such as safety training and hazard identification, because they help participants better understand what may happen and what should be done in emergencies. Together with visualization, AR/VR can also be used as a bidirectional interactive tool between humans and computers. Arias et al.17 used VR to simulate a fire scene in a hotel and provided several items the participants could interact with, such as chairs that can be picked up and moved, pillows and towels that can be wetted and used to block smoke vents, telephones that produce dial tones, and windows that open and close. Li et al.18 designed a virtual escape experiment involving people and obstacles to investigate escape route-finding behavior. Another important aspect of AR/VR-based bidirectional interaction is observing how the virtual environment influences participants' behavioral modes and mental states in emergencies19. Emergencies simulated in a virtual environment have been shown to influence participants' behavioral modes. The basic data collection method for behavioral modes includes recording the time, distance, and route taken in the virtual environment, as sketched below. In addition, the mental state in a virtual emergency scene is also significant when studying interaction effects.
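As a simple illustration of such behavioral data collection, the following sketch (not tied to any of the cited studies; the sampling source, e.g. a per-frame callback of the VR engine, is an assumption) records timestamped positions in the virtual environment and derives the elapsed time, travelled distance, and route.

import math
from dataclasses import dataclass, field

@dataclass
class TrajectoryLogger:
    samples: list = field(default_factory=list)    # (time_s, x, y, z) tuples

    def record(self, time_s, x, y, z):
        self.samples.append((time_s, x, y, z))

    def elapsed_time(self):
        return self.samples[-1][0] - self.samples[0][0] if len(self.samples) > 1 else 0.0

    def distance(self):
        # Sum of straight-line segments between consecutive position samples.
        return sum(math.dist(a[1:], b[1:]) for a, b in zip(self.samples, self.samples[1:]))

    def route(self):
        return [(x, y, z) for _, x, y, z in self.samples]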
In addition, there are state-sponsored projects focusing on employing AR/VR technologies in emergency cases. In April 2021, the Research Triangle Institute (RTI) was awarded $750K to develop a user-centered persistent first responder AR (FRAR) test bed20; the aim is to create and support a market for public safety user interfaces (UIs) by improving existing AR technologies for firefighter, emergency medical services (EMS), and law enforcement operations and tasks. There are also commercial companies that focus on using AR/VR for first responder training21 or for search and rescue operations22.
At the national level, Sabancı University (SU) has received the highest number of EU-funded projects per faculty member, taking part in 20 FP6 projects and 53 FP7 projects, including 37 Marie Curie Grants, Cooperation Projects, and 4 Capacities Projects. In H2020, SU has been involved in 13 funded projects. SU also has a commendable record in EU-funded education programs, with 5 Jean Monnet European Modules, 3 Jean Monnet Chairs Ad Personam, and 1 Jean Monnet Centre of Excellence. These European grants constitute about 18% of SU's total research budget.
Sabancı University's Behavioral Analytics & Visualization Lab (BAVLAB) will represent Sabancı University in this call. The main goal of BAVLAB is to conduct research on Big Data analytics for understanding human behavior, as well as on visualizing Big Data in many diverse settings. Since 2015, the lab has been funded through research grants, corporate funding, and sponsorship. The lab's researchers pursue collaborations and partnerships with companies in different sectors, such as telecom, finance, retailing, energy, and healthcare, where large datasets or Big Data are involved. The lab currently hosts 2 faculty members, 2 visiting researchers, 2 PhD students, and 6 Master's students at Sabancı University. In 2020, BAVLAB presented research results at more than 15 international conferences and in journals. In this project, BAVLAB, led by Prof. Balcısoy, will contribute to designing and developing visual analytics systems for large spatio-temporal datasets and trajectories, an area in which Prof. Balcısoy and his team have an extensive track record of publications and enterprise-level projects.
Based on an HMI displaying a refined, filtered, and manageable common situational awareness picture, Sabancı University will take part in the development of AR-enhanced user interfaces (UIs) to present this refined information to first responders during the TeamAware project.
In addition, together with other partners, SU will contribute to the validation of the TeamAware platform in terms of the defined performance metrics (correct display of the data received from sensors in the field, successful fusion of the collected data, and a resulting Common Situational Awareness Picture approved by first responders) with respect to the related requirements.
Prof. Dr. Selim Balcısoy graduated from ETH Zurich in Electronics Engineering in 1996. In 2001, he completed his PhD on "Analysis and Development of Interaction Techniques between Real and Virtual Worlds" in the field of Computer Science at EPFL Lausanne. He worked on mobile graphics as a senior research engineer at Nokia Research Center (USA) between 2001 and 2004, and has been a faculty member at Sabancı University since 2004. Dr. Balcısoy established the Behavioral Analytics and Visualization Laboratory (BAVLAB) jointly with the MIT Media Lab in 2015. Two technopark companies were established within BAVLAB, and more than ten industry-supported research projects were successfully completed. Dr. Balcısoy's research areas are data analytics, visual analytics, machine learning, augmented reality, virtual reality, cultural heritage, and mobile graphics. He has more than 80 publications in international refereed journals and conferences, two IBM Research Awards, one TÜBİTAK Career Award, and one US patent.
Ekberjan is currently pursuing his PhD in computer science at Sabancı University. After obtaining his M.Sc. degree in Electrical & Electronics Engineering from Boğaziçi University, he gained more than seven years of industrial experience in AI and ML. In addition to his publications, he has participated in two EU projects, proposed and conducted two national projects, and led the development of two commercial products. His current research interests include machine learning, deep learning, data visualization, crowd capturing, behavioral analysis, computer vision, and data mining.
1. R. Wang, H. Lu, J. Xiao, Y. Li, and Q. Qiu, "The design of an augmented reality system for urban search and rescue", IEEE International Conference on Intelligence and Safety for Robotics (ISR), 24-27 Aug. 2018. https://doi.org/10.1109/IISR.2018.8535823
2. S. Park, S.H. Park, L.W. Park, S. Park, S. Lee, T. Lee, S.H. Lee, H. Jang, S.M. Kim, H. Chang, and S. Park, "Design and implementation of a smart IoT based building and town disaster management system in smart city infrastructure", Applied Sciences, 8(11), 2018. https://doi.org/10.3390/app8112239
3. Y. Hu, J. Zhu, W. Li, Y. Zhang, Q. Zhu, H. Qi, H. Zhang, Z. Cao, W. Yang, and P. Zhang, "Construction and optimization of three-dimensional disaster scenes within mobile virtual reality", ISPRS International Journal of Geo-Information, 7, 2018. https://doi.org/10.3390/ijgi7060215
4. D. Lorenz, W. Armbruster, C. Vogelgesang, H. Hoffmann, A. Pattar, D. Schmidt, T. Volk, and D. Kubulus, "A new age of mass casualty education? The InSitu project: realistic training in virtual reality environments", Anaesthesist, 65, pages 703-709, 2016. https://doi.org/10.1007/s00101-016-0196-x
5. X. Gong, Y. Liu, Y. Jiao, B. Wang, J. Zhou, and H. Yu, "A novel earthquake education system based on virtual reality", IEICE Transactions on Information and Systems, E98-D, pages 2242-2249, 2015. https://doi.org/10.1587/transinf.2015EDP7165
6. M.N.K. Boulos, Z. Lu, P. Guerrero, C. Jennett, and A. Steed, "From urban planning and emergency training to Pokemon Go: applications of virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) in personal, public, and environmental health", International Journal of Health Geographics, 16, 2017. https://doi.org/10.1186/s12942-017-0081-0
7. H.C. Pham, A. Pedro, Q.T. Le, D.Y. Lee, and C.S. Park, "Interactive safety education using building anatomy modelling", Universal Access in the Information Society, 18, pages 269-285, 2019. https://doi.org/10.1007/s10209-017-0596-y
8. R. Lovreglio, X. Duan, A. Rahouti, R. Phipps, and D. Nilsson, "Comparing the effectiveness of fire extinguisher virtual reality and video training", Virtual Reality, 2020. https://doi.org/10.1007/s10055-020-00447-5
9. F. Meng and W. Zhang, "Way-finding during a fire emergency: an experimental study in a virtual environment", Ergonomics, 57, pages 816-827, 2014. https://doi.org/10.1080/00140139.2014.904006
10. E.M. Bourhim and A. Chekaoui, "Efficacy of virtual reality for studying people's pre-evacuation behavior under fire", International Journal of Human-Computer Studies, vol. 142, 2020. https://doi.org/10.1016/j.ijhcs.2020.102484
11. B.F. Goldiez, A.M. Ahmad, and P.A. Hancock, "Effects of augmented reality display settings on human wayfinding performance", IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 37, issue 5, September 2007. https://doi.org/10.1109/TSMCC.2007.900665
12. Z. Xu, X.Z. Lu, H. Guan, C. Chen, and A.Z. Ren, "A virtual reality-based fire training simulator with smoke hazard assessment capacity", Advances in Engineering Software, vol. 68, pages 1-8, February 2014. https://doi.org/10.1016/j.advengsoft.2013.10.004
13. F. Dai, S. Dong, V.R. Kamat, and M. Lu, "Photogrammetry assisted measurement of inter-story drift for rapid post-disaster building damage reconnaissance", Journal of Nondestructive Evaluation, 30, pages 201-212, 2011. https://doi.org/10.1007/s10921-011-0108-6
14. S. Dong, C. Feng, and V.R. Kamat, "Sensitivity analysis of augmented reality-assisted building damage reconnaissance using virtual prototyping", Automation in Construction, vol. 33, pages 24-36, August 2013. https://doi.org/10.1016/j.autcon.2012.09.005
15. W. Kim, N. Kerle, and M. Gerke, "Mobile augmented reality in support of building damage and safety assessment", Natural Hazards and Earth System Sciences, vol. 16, pages 287-298, February 2016. https://doi.org/10.5194/nhess-16-287-2016
16. A.H. Behzadan, S. Dong, and V.R. Kamat, "Augmented reality visualization: a review of civil infrastructure system applications", Advanced Engineering Informatics, vol. 29, issue 2, pages 252-267, April 2015. https://doi.org/10.1016/j.aei.2015.03.005
17. S. Arias, R. Fahy, E. Ronchi, D. Nilsson, H. Frantzich, and J. Wahlqvist, "Forensic virtual reality: investigating individual behavior in the MGM Grand fire", Fire Safety Journal, vol. 109, October 2019. https://doi.org/10.1016/j.firesaf.2019.102861
18. H. Li, J. Zhang, L. Xia, W. Song, and N.W.F. Bode, "Comparing the route-choice behavior of pedestrians around obstacles in a virtual experiment and a field study", Transportation Research Part C: Emerging Technologies, vol. 107, pages 120-136, October 2019. https://doi.org/10.1016/j.trc.2019.08.012
19. L. Chittaro and R. Sioni, "Serious games for emergency preparedness: evaluation of an interactive vs. a non-interactive simulation of a terror attack", Computers in Human Behavior, vol. 50, pages 508-519, September 2015. https://doi.org/10.1016/j.chb.2015.03.074
20. First responder augmented reality test bed, project description available at: https://www.nist.gov/ctl/pscr/funding-opportunities/past-funding-opportunities/psiap-augmented-reality/first-responder
21. Augmented Training Systems, https://www.augmentedtrainingsystems.com
22. Edgybees, https://www.edgybees.com
Contact
Monica Florea
Administrative Coordinator
European Projects Department
SIMAVI
Soseaua Bucuresti-Ploiesti 73-81 COM
Bucuresti/ROMANIA
Email:
Çağlar Akman
Technical Coordinator
Command and Control Systems
HAVELSAN
Eskişehir Yolu 7 km
Ankara/TURKEY
Email:
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101019808.