Ethics and data privacy in emergency response

May 16, 2022 | Blogs

To manage a crisis, emergency first responders must have effective and efficient communication strategies in place, as the speed of information exchange is one of the main determinants of incident recovery. Failures in communication can have severe repercussions.

At the same time, the use of big data1 and machine learning2 raises the possibility of misuse and unintended consequences. It therefore becomes critical to address the societal, legal, ethical, and data protection challenges that may arise from their implementation. A few crucial considerations must be taken into account, in particular:

  • How will data be collected and used?
  • What measures will be taken to mitigate risks and stay compliant?
  • What are the implications of using machine learning algorithms?

That is where Eticas comes in as the project’s ethics partner. Eticas Research and Innovation is a non-profit dedicated to research, education, and dissemination at the nexus of technology, data, society, and responsibility. Our goal is to raise public awareness of the dangers of digital technology, while also empowering individuals and communities to protect their personal information.

Eticas assists policymakers in developing policies that maximize technological advantages while avoiding potential negative consequences. Moreover, it helps technology developers detect ethical, legal, and social concerns early in the development process and incorporate alternatives into the technology design. The Eticas Foundation is leading cutting-edge, highly competitive projects in a variety of fields, such as building responsible data exchange systems, developing ethical solutions, and analyzing GDPR compliance for sensitive projects.

Key ethical issues for TeamAware

During crisis and emergency management, the TeamAware toolkit and processes will be largely aimed at building and improving first responders’ awareness, communication, and knowledge management systems. These toolkits are specifically created to meet the interests and needs of first responders such as law enforcement officers, medical workers, and other emergency personnel. The data transfers that communication between these groups entails, however, must be especially respectful of the fundamental rights to privacy and personal data protection, as stipulated in frameworks such as the European Convention on Human Rights and the Charter of Fundamental Rights of the European Union (2016/C 202/02). Within this framework, TeamAware toolkit users must be able to safeguard these rights with regard to the data they share with the system.

Furthermore, the Charter of Fundamental Rights (CFR), the European Convention on Human Rights, and the Universal Declaration of Human Rights all underline the importance of the right to non-discrimination. Many of the collectives that may use or interact with TeamAware toolkits, including people with disabilities, are at particular risk of discrimination. The CFR sets out the basic rights and freedoms of everyone in the European Union, including the right to integrity, the right to privacy, and the right to protection from discrimination on grounds such as race, sex, religion, or national origin. The right to integrity is among the most important of these: it protects people from violence and intimidation, and ensures that their personal data is protected.

The TeamAware project has thoroughly examined the potential effects of sensitive data processing on members of the aforementioned collectives. With regard to the personal data of study participants who belong to vulnerable groups of the population, the consortium has identified non-discrimination as a major objective of TeamAware. One of the project's key goals is to provide practitioners with relevant information on how to deal with vulnerable individuals during an emergency. These recommendations will help avoid any potential discriminatory treatment during the research.

Due to the nature of TeamAware’s live-action field exercises, the right to integrity is important: research participants will be exposed to conditions that may jeopardize their physical and mental well-being. Further, the right to privacy is a qualified right, meaning it can be limited for legitimate reasons and in a proportionate way, as set out in the second paragraph of Article 8 of the European Convention on Human Rights. As a result, TeamAware will handle research participants' personal data proportionately and in compliance with data protection rules.

Key data protection issues for TeamAware

The General Data Protection Regulation (GDPR) is the European Union’s regulation in the area of data protection. It replaces the Data Protection Directive 95/46/EC, which was introduced in 1995. The GDPR was adopted in April 2016 and became applicable on May 25, 2018. It regulates the handling of personal data by controllers and processors within the European Union, and establishes the right of data subjects to access their personal data and to exercise certain other rights, such as the rights to rectification and erasure.

All TeamAware partners must comply with the GDPR because the personal data they may process as part of TeamAware belongs to data subjects whose data is being monitored for research purposes within the Union (Article 3.2). TeamAware is made up of organizations mainly headquartered in countries that are members of the European Union (EU) or the European Economic Area (EEA). Activities undertaken in non-EU countries are required by the joint controllers' agreement to meet the minimum requirements set out in it, ensuring a baseline level of compliance.

The TeamAware consortium will be processing mainly data coming from:

  • Representatives and contact points from members of the Advisory Boards.
  • Research participants involved in the field exercises carried out within the project, some of them belonging to vulnerable categories of the population.
  • Publicly available databases.
  • Data generated during testing, such as environmental and structural shapes, colors, decorations, etc.

In order to remain compliant, the project will follow the seven key principles of the GDPR, outlined below:

  • Lawfulness, fairness and transparency
  • Purpose limitation
  • Data minimisation
  • Accuracy
  • Storage limitation
  • Integrity and confidentiality (security)
  • Accountability
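
As a purely illustrative sketch of how some of these principles can translate into code, the snippet below applies data minimisation, pseudonymisation, and storage limitation to a hypothetical participant record before it is stored. The field names, key handling, and retention period are assumptions made for the example, not the project's actual implementation.

```python
# Illustrative sketch only: the TeamAware codebase is not public, so the record
# fields, key handling, and retention window below are assumptions.
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

# Fields assumed necessary for the research purpose (purpose limitation /
# data minimisation); everything else is dropped before storage.
ALLOWED_FIELDS = {"role", "exercise_id", "consent_given", "timestamp"}

# Hypothetical secret used to pseudonymise direct identifiers; in practice it
# would live in a secrets manager, not in source code.
PSEUDONYM_KEY = b"replace-with-a-managed-secret"

# Assumed retention window (storage limitation); the real value would come
# from the project's data management plan.
RETENTION = timedelta(days=365)


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]


def minimise(record: dict) -> dict:
    """Keep only the fields needed for the stated research purpose."""
    cleaned = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    cleaned["participant_token"] = pseudonymise(record["participant_name"])
    cleaned["delete_after"] = (datetime.now(timezone.utc) + RETENTION).isoformat()
    return cleaned


if __name__ == "__main__":
    raw = {
        "participant_name": "Jane Doe",   # direct identifier, never stored
        "role": "firefighter",
        "exercise_id": "field-exercise-03",
        "consent_given": True,
        "timestamp": "2022-05-16T10:00:00Z",
        "home_address": "...",            # irrelevant to the purpose, dropped
    }
    print(minimise(raw))
```

Principles such as lawfulness, accuracy, and accountability are addressed at the process level (consent procedures, data management plans, audits) rather than in any single code path.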

Key societal considerations for TeamAware and their consequences

First responders and crisis management personnel are frequently required to work in chaotic environments. This entails making judgment calls on the optimal course of action based on little information and under significant emotional and time pressure. When a first responder arrives at the scene of a car accident, for example, they must make split-second decisions about whether to provide medical help or use force if necessary. If they withhold medical assistance, or if they provide it but inadvertently harm the victim in the process, they may face personal consequences: administrative reviews with career repercussions, disciplinary hearings with potential financial implications, or even prosecution that puts their freedom at risk. All of this comes on top of the danger to their own lives that is already inherent to the profession.

In response to these unfortunately common challenges, the TeamAware solution aims to use augmented reality and mobile human-machine interfaces to give users access to real-time, accurate, and manageable information, allowing them to make better decisions and avoid costly mistakes. These technologies also have the potential to improve safety, efficiency, and communication across the sector.

AI raises three major ethical concerns for society: privacy and surveillance, bias and discrimination, and, one of the most important implications of new technologies, the role of human judgment. Since the TeamAware systems will aid first responders in decision making, these concerns will be closely monitored by Eticas throughout the development of each system that forms part of the overall solution.

An important part of the monitoring process is the development of a Privacy Impact Assessment, together with active contributions to desirability and acceptability assessments. Combined with the sensible application of ethics requirements, this provides an appropriate legal and ethical foundation that helps consortium partners develop knowledge and TeamAware systems not only in compliance with the GDPR but also, more importantly, in accordance with the ethical principles that reinforce the systems' design.

Gemma Galdon Clavell

Gemma is a policy analyst working on surveillance, social, legal and ethical impacts of technology, smart cities, privacy, security policy, resilience and policing. She is the President of the Board at the Eticas Foundation and a founding partner at Eticas Research & Consulting. She completed her PhD on surveillance, security and urban policy in early 2012 at the Universitat Autònoma de Barcelona, where she also received an MSc in Policy Management, and was later appointed Director of the Security Policy Programme at the Universitat Oberta de Catalunya (UOC).

Previously, she worked at the Transnational Institute, the United Nations’ Institute for Training and Research (UNITAR) and the Catalan Institute for Public Security. She teaches topics related to her research at several foreign universities, mainly in Latin America, and is a member of the IDRC-funded Latin American Surveillance Studies Network. Additionally, she is a member of the international advisory board of Privacy International and a regular analyst on TV, radio and print media. Her recent academic publications tackle issues related to the proliferation of surveillance in urban settings, urban security policy and community safety, security and mega events, the relationship between privacy and technology and smart cities.

Nour Salih

With experience in population data analysis and applied statistics in the social sciences, Nour currently works at Eticas on various projects around data protection and privacy, ranging from the use of algorithms by governments to data commons and bad data. As an operations and compliance specialist, she is dedicated to and passionate about business ethics. Her work on EU-funded projects has seen her manage a number of successful projects and oversee the legal and ethical requirements necessary to comply with EU law.

Marta Burgos

Marta Burgos is an experienced project manager with a background in Sociology and Political Science, focusing on citizen participation, migration, and intercultural cities from a gender perspective. At Eticas, her research and work center on analyzing data and technology aimed at vulnerable groups, such as seniors and people with neurodegenerative diseases. Her work with us as a project manager for EU projects also focuses on e-health and medtech initiatives and their impact on privacy and data protection.

1Extremely large or complex datasets that cannot be effectively collected, stored, or analyzed with traditional data-processing tools.
2Artificial intelligence that enables software applications to improve their accuracy in predicting outcomes without being expressly programmed to do so.



This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101019808.
