MSF Spotlight Series
The MSF Spotlight Series aims to bring experts and stakeholders together to exchange knowledge on a variety of socio-political, humanitarian, and public health issues relevant locally, regionally, and globally. On April 8, MSF hosted its first Spotlight session, on the importance of data protection in a humanitarian context. Speakers included Lama Khatib, Human Rights Watch Middle East and North Africa Director and Director of the Beirut office; Mohamad Najem, Executive Director at SMEX; and, via Zoom from Geneva, Ciaran O Hultachain, Privacy Coordinator at MSF International.
Data Protection in the Region: An Overview
While communication technologies have helped human rights organizations improve their data collection and communication practices, they have also increased the risk of compromising personal data and putting lives at risk. Cyber-attacks have threatened both the individuals working in these organizations—as in the case of Lama Khatib outlined below—and the vulnerable communities they are trying to protect, as in Afghanistan and among the Rohingya.
In Arabic-speaking countries, Mohamad Najem points out that “there is a clear absence of data protection regimes and regulations of surveillance technology,” especially in the Gulf. The GCC countries have been investing heavily in technology in an attempt to shift their petrochemical-dependent economies toward more diverse industries, such as high-end manufacturing and industrial development—backed by billions of dollars spent on scientific research.
At the same time, we’re witnessing the Fourth Industrial Revolution as it rapidly unfolds in the Gulf. Dubai, for example, has become the go-to hub for many international tech companies in the region, such as Facebook, Google, Twitter, and Telegram, according to Najem. The UAE and Saudi Arabia have also invested in surveillance technology as a tool of mass control over their populations.
In 2020, numerous Gulf states, including Bahrain, Oman, Saudi Arabia, and the UAE, signed contracts with Israeli surveillance companies, such as NSO, and used their nefarious services against their own citizens. Digitizing the economy in countries adopting draconian laws and governing by authoritarian rule poses serious challenges for data protection, especially in the humanitarian context.
Lama Khatib’s Testimony: Hacked by Pegasus
“I was shocked. At first I didn’t believe the message was authentic.” Between April and August 2021, the personal phone of Lama Khatib, Human Rights Watch MENA Director and Director of the Beirut office, was hacked using Pegasus on at least five different occasions. Khatib received a message from Apple informing her that her phone had been targeted by a state-sponsored hacking attack. As for why she was targeted, Khatib suspects that the attacks could be tied to her investigative work on the Beirut Port explosion.
After forensic evidence confirmed the attack, Khatib contacted NSO to pursue an investigation into her case, but the company maintained that its technology is only deployed to identify and locate terrorists. According to the HRW Director, however, “the evidence speaks to another reality,” especially when governments in the region have launched espionage campaigns against human rights defenders and journalists. “We know that targets have been detained, subjected to torture, and have even been killed,” added Khatib.
“In an era where more and more of our lives are accessed through our devices, how do we keep ourselves secure?” Khatib reassured attendees that Human Rights Watch and other organizations do take “a number of steps to maintain the security of staff devices and to ensure that the communications with contacts are secure.” Nevertheless, risks associated with data collection persist, especially in the absence of regulation of surveillance technologies and amidst inadequate data protection laws in the region.
Data Protection in the Humanitarian Context
Data collection systems that hold biometric data have severe implications for human rights, especially in conflict regions. “We are concerned about the growing trend towards collecting more and more biometric data in the humanitarian context,” warned Khatib.
She gives the example of Afghanistan, where the European Union, the United States, and international financial institutions such as the World Bank supported the Afghan government in developing a whole series of biometric databases. After the US exited Afghanistan, these databases ended up in the hands of the Taliban.
“What does it mean that the Taliban now has a biometric database from the Ministry of Interior and can use it to identify all the security personnel that work there?” In the absence of data protection assessments, there is a real risk of retaliation and other consequences that go beyond privacy rights. “Individuals have not been informed about how the data might be used and who has access to it, there is no opportunity to rescind any decisions related to the collected data, and adequate procedures around data destruction were not taken,” Khatib explained.
In this sense, data collection practices have failed to respect and protect human rights in the humanitarian context, where, according to Khatib, “the stakes are much higher.” Individuals fleeing conflict zones face the highest risk of data abuse, especially considering that their primary concern is survival, rather than keeping their sensitive information private. “If displaced people are being asked to provide their data in exchange for food rations, their ability to question how this data is being used is compromised.”
In the case of UNHCR data collection among the Rohingya, communities that were forcibly displaced on ethnic grounds did not know that the data collected from them would be shared with the government of Bangladesh and the government of Myanmar, the very country they were trying to flee.
“When a refugee provides their data in order to receive services, the knowledge that they can still receive assistance without providing biometric data becomes incredibly important,” added Khatib.
Data Protection Fundamentals
So how do we ensure data collection does not become a threat to human rights defenders and the communities they are trying to protect? First, it’s important to identify what data protection actually means beyond the notion of privacy alone.
Data protection encompasses organizational and technical measures — including policies, procedures, contracts, systems and guidelines for staff behavior — to protect personal data, i.e., any information that identifies a human being. In this sense, data protection corresponds to a set of legal and technical procedures while at the same time remaining immensely personal. As stated by MSF Privacy Coordinator, Ciaran O Hultachain: “Data protection is what we do habitually in our personal lives.”
What does data protection mean in practice as applied to organizations, governments, and private sector companies, as well as to individuals? Hultachain shares his insights on the fundamentals of data protection, as adopted by MSF International.
- Keep a data register of what data you hold, where it is stored, and why.
- Undertake data protection impact assessments for any new project.
Any time an organization starts collecting new data about individuals, it must run an impact assessment to evaluate the risk to privacy, especially when the organization is trying to provide assistance to people and improve their lives.
- Evaluate your vendors for data protection.
Ensure that vendor organizations (e.g., Zoom as a virtual meeting platform) are securing the data they handle.
- Secure your data as much as possible to maintain confidentiality.
Ensure that the technical and IT systems in use, along with the supporting procedures and processes, respect confidentiality.
- Be careful with your data and where you store it, on both an individual and organizational level.
- Protect sensitive data: medical, religion, race, sexuality, politics, etc.
- Have an effective process to manage personal data breaches when they occur.
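The first fundamental above, a data register, is essentially a structured inventory of what data an organization holds, where, and why. As a purely illustrative sketch (this is a hypothetical example, not an MSF tool; the field names and sample entries are assumptions), a minimal register could look like this:

```python
from dataclasses import dataclass, field

# Categories treated as sensitive in the fundamentals above:
# medical, religion, race, sexuality, politics.
SENSITIVE_CATEGORIES = {"medical", "religion", "race", "sexuality", "politics"}

@dataclass
class RegisterEntry:
    dataset: str                 # what data is held
    storage: str                 # where it is stored
    purpose: str                 # why it is held
    categories: set = field(default_factory=set)

    def is_sensitive(self) -> bool:
        # Flag entries that touch any sensitive category.
        return bool(self.categories & SENSITIVE_CATEGORIES)

# Hypothetical register entries for illustration only.
register = [
    RegisterEntry("patient intake forms", "encrypted on-prem server",
                  "treatment follow-up", {"medical"}),
    RegisterEntry("volunteer contact list", "cloud drive",
                  "event coordination"),
]

# Sensitive entries warrant extra safeguards and earlier review.
flagged = [entry.dataset for entry in register if entry.is_sensitive()]
print(flagged)  # ['patient intake forms']
```

Even a simple register like this makes it possible to answer the questions the fundamentals raise: what is held, where, why, and which holdings need the strongest protection.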
For more insights, you can watch the full session here: