Panel 3: Humanitarian Innovation in Data and Information Management
Mvemba Phezo Dizolele: Good afternoon, everyone, and welcome to this session. It’s a pleasure and an honor for me to join you this afternoon to discuss the very important and consequential topic of humanitarian innovation in data and information management. I’m Mvemba Phezo Dizolele, senior fellow and director of the Africa Program here at the Center for Strategic and International Studies.
We have heard the staggering number of 274 million people in need of humanitarian assistance and protection this year, 2022. We live in a world that is obsessed and saturated with data, and the humanitarian sector is no exception. NGOs, local and international, collect data – names, contacts, stories, cases, et cetera. This is what technology enables us to do. These organizations collect this data to be more efficient, to perform better. In other words, they do this to better support the vulnerable populations they are called to help. We know, of course, that good intentions do not always lead to good outcomes. We live in a world of hackers, crashes, and other risks. So data collection is also about managing and securing the information, and often very sensitive information. In some cases, it’s literally a question of life and death. In some parts of the world, though, organizations still collect data the old-fashioned way, meaning with pen and paper, sometimes pencil. That is no more reassuring than the new, technology-enabled way of collecting data.
So either way, we are confronted with several questions and challenges, which we hope our panel will address today. They will discuss challenges and promising solutions in the collection, management, and use of data by humanitarian organizations. Our panelists will discuss the need for “humanitarian data intervention,” in quotes, allowing data and technology to be made available to local populations in the interest of better humanitarian responses. Our experts will also look at the challenges of accountability, transparency, and the ethical basis of power relations.
Now I’d like to introduce our panelists. In the interest of time, I will use the shorter versions of their bios – they have impressive backgrounds – and once we finish the introduction, they will speak in the order in which they are introduced.
Joining us in the room today is Ziad Al Achkar. He’s a Ph.D. candidate and researcher at the Carter School for Peace and Conflict Resolution at George Washington University.
Is that correct, George Washington?
Ziad Al Achkar: George Mason.
Mr. Dizolele: George Mason – always get those two confused. (Laughs.) Pardon me. George Mason University.
I didn’t mean to start any war here. (Laughs.) I know we’re in D.C.
His research focuses on the use of digital technologies and remote sensing by humanitarian and peace-building organizations. Ziad looks at the role that data collection and surveillance play in the sector and how to develop and design responsible and ethical practices.
Then, in the room, we are also joined by Laura Walker McDonald, who is the senior adviser for digital technology and data protection at the International Committee of the Red Cross, where she’s working to increase the delegation’s awareness, understanding, and analysis of evolving new technologies and their implications for humanitarian aid and the conduct of hostilities. In particular, she focuses on their potential impact on the protection of affected populations and humanitarian organizations, and on new types of digital risks such as cybersecurity, mis- and disinformation, and hate speech.
Joining us online is, first, Maja Lazić, who is the deputy head of the Joint Data Center on Forced Displacement, based in Copenhagen, Denmark. She previously was the deputy representative for UNHCR in Malaysia, where she worked towards the objectives of the Global Compact on Refugees and the U.N. Sustainable Development Goals agenda in the national context. She brings over two decades of experience working in conflict and post-conflict areas and emergency settings in the Balkans, Southeast Asia, and Central Asia.
Then we have Dr. Patrick Vinck, who is joining us also online. He’s the co-founder of KoBoToolbox and an assistant professor in the Department of Global Health and Population at the Harvard School of Public Health and in the Department of Emergency Medicine. He is also the research director of the Harvard Humanitarian Initiative.
So without further ado, I’d like to turn the mic to our friend Ziad, who will start, and we’ll go from there. Thank you very much.
Mr. Al Achkar: Thank you so much. And let me start by thanking CSIS and BHA for getting us together today to talk about humanitarian innovation and where we are today as a sector.
I think it’s important for us to look at how far the sector’s come in the past 10 to 15 years on these issues. Ten to 15 years ago there was a lot of enthusiasm and excitement about the power and potential of technology to revolutionize humanitarian response, the field, and the work that we do as a sector and as a community. In my opinion, at that time there were a lot of techno-utopian, silver-bullet ideas floating around, and we got caught up in that a little bit. The sector has since matured somewhat, which is good. There’s a bit more realism about, you know, what technology can and cannot do, and most importantly, I think there’s a realization of the very real harms that can occur from the use of technology and data collection. In part this comes from the improved expertise we have in the sector now that, maybe 10, 15 years ago, wasn’t as prevalent – we’ve built better capacities to do this work – and, on the other hand, the reality is that we’ve had a few incidents within the sector that forced us to rethink our approach. The role that civil society has played in pushing back and pushing us to think and be more principled in our work has been huge. But of course, I think we still have a lot of work to do and a lot of ways to improve as a whole.
So if you look back at the late 2000s and early 2010s, at where the sector was, there was very little guidance about what ethical and responsible data collection looks like. If you look at the Sphere Standards, for example – the 2007 version or the 2012 version – there’s very little mention of data responsibility and data collection. You can do a control-F search and get maybe seven hits in a 500-page document. But now, in part thanks to a lot of the panelists here today, there’s a lot more guidance; we have handbooks; we have more conversations about these issues and agreed-upon principles about how to build partnerships and how to do data collection and engage with technology in a much more responsible way.
And I’m going to make a quick note. Right now the Inter-Agency Standing Committee has an ongoing survey looking into changes that need to be made to its operational guidance on data responsibility, so I really encourage those of you who work in this field to go online and engage with the survey, because I think this is one way we can make things better.
But I still want to note that the sector has a long way to go to improve accountability, to begin with, and a long way to improve transparency in a lot of the ways we do data collection and in a lot of the ways we do private partnerships with the tech sector. And I believe, as we talk about localization as a sector and a community, we really cannot continue doing business the way we have. We really must change the way we approach partnerships, we need meaningful engagement, and frankly, I think we need to cede space and power in a lot of different places.
And so just briefly, to wrap up – I know we’ve got two more minutes. When I think about innovation and the research and work that I’ve done in this space, I think there are three main issues at play. The first is a technical issue. Do we as a community have the necessary technical capacity to do this kind of work right, to be able to build the kind of partnerships that will be sustainable, equitable, and principle-driven? So that’s one. The second is an organizational issue, organizational dynamics. Do we have the organizational structures and the willingness and desire within those organizations to truly change the way that we work as a sector and community? And then, third and lastly, I think there’s a political component to this, a political-economic component. Is there really political will to make sure that these efforts have long-term backing and support? And I think this is where it’s key. The first two cannot happen without this political will and support, and if we don’t have the buy-in from the top, quote/unquote, “humanitarian leadership” and the buy-in from donor governments and institutions, I think we will hit a wall. We will continue to see the cycle of three- to five-year projects and innovation cycles that really don’t do much.
And so I’ll wrap it up here and I’m very much looking forward for the discussion from my panelists and for Q&A.
Mr. Dizolele: Thank you very much, Ziad. A couple of takeaways: the first, primarily, is the organizational and structural challenges – just capacity – and then, two, the lack of political will, which is key for any change that we can bring about in this space.
All right, thank you very much. We’ll go to Maja in Copenhagen, if you can intervene now – five minutes, and then we go to Laura and then we’ll conclude with Patrick. Thanks.
Maja Lazić: Thank you very much and thank you very much to CSIS for inviting me to this interesting discussion. I really feel privileged to join this panel with Ziad, with Laura, and with Patrick and with you.
When I read the introduction to this conference, it sets out an objective to explore the ways in which innovations in people, processes, and products can improve responses to humanitarian needs such as forced displacement. And I thought to myself that, in this sense, the World Bank-UNHCR Joint Data Center on Forced Displacement, where I come from, is itself set up as a new approach to addressing conflicts and protracted crises. When the center began its journey two and a half years ago, we identified some of the main data gaps in the forced displacement context. It really became clear to us that we needed to focus on our contribution to dramatically improving the quality, quantity, and accessibility of microdata on refugees, internally displaced and stateless people, and their host communities. So the center’s goal is to enhance the ability of stakeholders to make timely and evidence-informed decisions that can improve the protection and well-being of affected people, as you also emphasized, Mvemba. Innovation is a critical element in this effort, and so we’re pursuing four strategic priorities – please allow me to elaborate a little bit here.
The four strategic priorities are, one, strengthening data systems and standards; two, increasing high-quality data collection and analysis at the country level; three, supporting responsible access to data; and, fourth and finally, building evidence and sharing knowledge across the community. The Joint Data Center essentially combines the comparative advantages of two major development and humanitarian institutions: the analytical capacity of the World Bank and its ability to link displacement data to other key socioeconomic data, and, on the other hand, the access of UNHCR to information about refugees, stateless people, and IDPs, as well as its overview of the legal protection and policy frameworks in host countries. So we sit squarely on the nexus of development and humanitarian efforts, which allows us to act as a catalyst, working with our parent institutions and many other partners in new ways and supporting transformative data and evidence opportunities to enable sustainable change for affected populations. Ultimately, we’re a public good addressing the needs of both affected governments and populations, while also ensuring that all the results we produce are widely available.
And I’m going to dive very quickly into an example from our work so that it becomes a little more concrete for you. Quality, comparable data on the forcibly displaced and host populations is much needed for development programs and policies that can complement humanitarian interventions. As recently as May, when I was in the Democratic Republic of Congo, I spoke with several development partners and it was so evident that they were thirsting for socioeconomic data on IDPs and refugees to feed into their programs. Yet refugees, IDPs, and stateless people are largely invisible in national statistics and in the recurring international socioeconomic surveys, such as the demographic health surveys and the World Bank poverty surveys, that can enable such interventions. And during the pandemic, we faced an extraordinary challenge in this regard. The pandemic essentially hindered the traditional ways of collecting data in developing countries. It was simply not possible to rely on face-to-face surveys. Other methods had to be identified, and it had to happen quite fast. Mobile phone survey techniques have gained sophistication and have largely been able to complement traditional survey techniques. The World Bank led the way by introducing data collection through high-frequency phone surveys, and the Joint Data Center was well positioned to facilitate and support the extension of these phone surveys to include refugees and IDPs. As a result, more than 100,000 interviews have been conducted with forcibly displaced households and host populations in at least 10 major host countries, including Burkina Faso, Chad, Ethiopia, Yemen, Iraq, Jordan, and a few others.
So this really helped obtain more timely and comparable data on how those displaced were faring during the pandemic in terms of food security, participation in education, and access to health, among other things. And I’ll give you just two quick examples of such data. In Chad, it emerged that over the course of the pandemic only one-third of refugee households had pre-pandemic-level access to food. Eighty-two percent of refugee households have experienced severe food insecurity since the beginning of the pandemic, compared to 54 percent of Chadian households. And in Djibouti, host households that did not have access to health services where needed cited crowded health centers or hospitals (48 percent) and the inability to pay out-of-pocket fees (24 percent) as the main reasons for not receiving care, whereas refugees instead reported the inability to pay fees (38 percent) or to afford the trip (31 percent) – so a purely cash issue.
And to conclude my opening remarks: this data has really allowed operational colleagues to better prioritize, design, and target interventions, considering not only basic needs but also more nuanced socioeconomic information. These are just two examples out of a portfolio of 55 activities, and we recently came out with our second annual report, which I would encourage you to read. It gives you more information about the efforts we have put in here and the thinking behind our work.
So with this, I hope that I was building on Ziad’s presentation and your introduction, Mvemba, and I really look forward to engaging in the discussion of this panel. Thank you.
Mr. Dizolele: Thank you very much, Maja. I think the takeaway here is just the importance of data. If I heard you right, data really stands between the needs of a community and the quality of support it gets. If data is not recorded properly, that can determine how much access a community really gets to support. And when we go to Q&A, I’d like you to think about what the vulnerabilities are in that data collection: Who uses it, what for, and how is it protected?
With that, we will go to Laura. Your five minutes, please.
Laura Walker McDonald: Thank you. And thank you so much for the invitation to join the panel. I wish I could cede my minutes to my colleagues on the panel to hear more.
What you had to share was fascinating, Maja. Thank you. And Ziad also. And I’m looking forward to the conversation.
So just first of all, a word about the International Committee of the Red Cross, in case anyone’s coming across the ICRC for the first time. We’re part of the International Red Cross and Red Crescent Movement, which is made up of over 190 national societies, like the American Red Cross here in the U.S., and the International Federation that brings them together to provide support in the event of natural disasters and a range of other health and other interventions.
The International Committee of the Red Cross, the ICRC, which I work for, is a neutral, independent Swiss organization, over 150 years old now, and we support communities affected by conflict and other situations of violence all around the world. We’re also the guardians of international humanitarian law and help to think through how that law can be applied and can evolve in this changing world that we live in. My role is specifically around digital technologies and data protection, although I increasingly think of it as digital and other technologies, because there’s more to life than just digital and ones and zeroes, even these days. In the role that I play here in Washington within the delegation, there are really just a few things I wanted to talk about today, just as a jumping-off point for the conversation.

One is how technology is impacting the people that we serve. Communities are themselves digital actors now and have digital lives and footprints, which can be very helpful to them in the event of an emergency – to get lifesaving information, to keep in touch with loved ones – but that digital footprint can also become a risk and can lead to them being targeted or falling victim to fraud and other things. So we now operate in an environment where we must assume the communities we support have this digital life, and the role that we play in that – to recognize it as a potential source of risk, or to support that digital life, to support access to connectivity – is something that, as a humanitarian organization, we must consider. And of course, we also use technology in providing assistance, and there have been some really good examples so far. I think there the issue is doing it right and doing it responsibly: gathering the right data in a responsible manner for evidence-based decision making, with the plan to actually use it made beforehand, is critical.
Equally, we have placed a strong emphasis on data protection at the ICRC, and I can talk about that a little bit in a minute. But I’ve worked now for over 12 years in digital technology in development and aid, and I think the critical thing is understanding what some of those impacts are. We know really very little about things like the specific impacts and harms: how it can help to have connectivity in an emergency, what happens when that goes away, and what happens when you provide it for people.
Equally, you mentioned in the introduction that misinformation, disinformation, and hate speech are playing huge roles in conflicts all over the world. It’s been very visible in Ukraine, but I think we must recognize that that’s because there’s a lot of English-language content available. There are many conflicts going on around the world where mis- and disinformation is extremely impactful but perhaps less visible to us as an international community.
And then the final thing that I will bring up – I can talk more about our approach to data protection, if anyone’s interested – is that we also operate as a humanitarian entity with a particular role and mandate in the digital world, and that’s complicated for us too. We also have to send email and gather data and store it somewhere, while thinking about how we can do that in a way that maintains the confidentiality of the data people entrust us with. That confidentiality is a huge part of our access to communities; the reason they can ask us for help or tell us things they need us to know is that they know we can keep it confidential. And navigating that in a world where all this digital infrastructure is provided by the private sector, much of it based in countries that are not neutral in our international order, is an extremely tricky proposition and something that we’re thinking through in some depth.
We are bringing a few solutions to that party. As I mentioned, we have our Data Protection Office, and they are always thinking about this issue. We have a new delegation for cyberspace that just got started in Luxembourg, which has a strong agreement with the state of Luxembourg around how we store and manage our data. And in September we’re also going to launch some new research on the potential for a digital emblem that would help protect entities and objects that would be protected under international law if they flew a Red Cross, Red Crescent, or Red Crystal emblem, but that are online – so how do we say, actually, this is a protected server, or this data or this transmission is protected?
So I’ll stop there but looking forward to the conversation, and thanks so much.
Mr. Dizolele: Thank you very much, Laura. I think this was very insightful as well, like the previous presentations.
The challenges here, from what I understand, are responsibility in the collection of data and responsibility in managing that data, which means safeguarding it. And I think this touches on some of the issues that we started with: Is there an international protocol on how to handle this data that organizations get? But we can delve into those when we open up for Q&A.
We’ll turn now back to our friend Patrick Vinck, who’s joining us online. Patrick, the mic is yours; five minutes, please.
Patrick Vinck: Thank you. Thank you to everyone for inviting me, and it was great to hear from my colleagues there. In fact, it’s nice to be last because it allows me not to have to repeat a lot of the very important and good points that have been made, and instead I will focus not on technology but on people and on policies and processes.
Before I do that, though, let me restate what I think has been apparent in what has been said, which is that data and technology have brought important gains to the humanitarian sector – there have been major successes and fundamental changes enabled by new technologies and new ways of collecting data, since that’s the topic today.
But, you know, at the same time there are challenges. Let me point out again that the needs in the humanitarian sector are growing. Every year we see appeals growing and financial appeals not being met, so that gap has only been widening, and again, the importance of having data to guide investment when resources are limited is clear. So is the need for evidence about what works and what does not work, and that can really only be achieved by improving how we collect, process, analyze, and gain insights from our work so that we can be more effective.
Now, I’ve mentioned effectiveness, and one of the key challenges is how we balance effectiveness with protection and consideration for ethics. What we really need to emphasize is that gains are important and are needed, but they cannot come at all costs.
Importantly, too, data and technology are a means toward a more fundamental transformation of how we are operating and what we are doing. And here I cannot overstate how fundamentally separated the digital transformation of humanitarian action has been from the localization agenda and from accountability. It’s almost as if those things were completely separate, not talking to each other and, frankly, at times contradicting each other. That is not helpful, because the fundamental change that is needed is around organization and accountability, so we cannot continue advancing a digital transformation that is – again, it’s a big-picture thing – largely ignoring those important changes.
So let me go back to what I said I would talk about: people, policies, and process. On the people side, I just mentioned how little inclusivity there is – how little systematic consultation and effective consideration for local partners and communities there is in technological choices. We advance proprietary solutions that local humanitarian actors, who cannot afford the licenses, will not be able to use in the long term. These are considerations that we need to have. How do we build more durable and sustainable response operations where, when crises are likely to repeat themselves, we do not treat partners as in: this is what you need to do, this is how you’re going to do it, and this is the tool you will use? That’s not an effective approach.
There’s also – again, I’m sorry, I will make a big generalization – but we’ve seen in this field a very insufficient concern for the unintended consequences of the digital transformation, for the shift of burden that technology brings. The burden of having a cell phone, being able to connect, having cards that function is now on the community itself. The burden of dealing with access points to get cash is now on the community itself. It used to be different. And so we really need to think about those burdens and what they mean for how we are operating. And again, the efficiency gains that may be perceived on the humanitarian side – and I’m sorry, I’m under attack by mosquitoes, so you’ll see my hands move a little bit – those efficiency gains really need to be balanced against those protection concerns, for example.
And finally, on the people side, I will just note a major gap: we are increasingly asking for difficult technological skills from people, from partners, from local actors, and yet we don’t have the resources to train, to build those skills, to enhance digital literacy, for example. So there are major issues around people.
Now, let me turn to the policies and process. I hope I have time. As Ziad mentioned, and I think Laura mentioned as well, we’ve seen major gains in the framing of the use of technology and in the guidance around ethical data collection. There’s no question that we know a lot better what to do and how to do it, but there are still major gaps when it comes to enforcement, accountability, and oversight around some of the data practices, especially when it comes to oversight by and for communities, which brings us back to the localization aspect.
And separately, a lot of these policies and processes are not necessarily addressing the big fundamental questions about the transformation of the role of humanitarian actors. We see actors providing services to governments, on social safety nets, on technology, with humanitarian organizations almost becoming digital service providers. I’m not going to judge whether that’s good or bad, but with it comes a large number of questions about responsibilities, about disengagement, about the ethics of data sharing. These are really fundamental questions that need to be addressed, and for which we need to think through the policies and processes to – (audio break) – maybe approach or use. We’ve seen humanitarians becoming a source of data. We know that everyone collects more and more data, and we’ve seen what challenges and risks can come with that. Just as a small example, we’ve only begun – and there’s been very limited discussion – to see how the digital transformation of humanitarian action is changing humanitarian access negotiations. There’s of course a very famous example from Yemen, and a few other places, where data and access to technology became a big part of the negotiation. But how do we handle that when it becomes a systematic question about who has the data, who collects the data, who will have it? And remember that local actors do not have the protections or the means afforded to international actors. It is very easy for the United Nations to say, of course we don’t give data to the government; but the local actor that collects the data for them doesn’t have the same level of protection, doesn’t have the same ability to say, of course we cannot give you the information we have. And so we really, again, need to fundamentally rethink what we need to do here.
So I’ll stop here. I’m probably out of time. I have not talked about technology; I’ve talked about people, policies, and process, and we can talk about more things. But the consequences of technological choices are obvious and important, and we still see ineffective solutions, solutions that are not durable, and bad data practices in the field. So this is a very current and important problem. Thank you for convening us to discuss it, and if we can even just inch toward solutions, that would be amazing.
Thank you.
Mr. Dizolele: Thank you very much, Patrick. A few challenges that you laid out, and takeaways: first of all, data is important and needed, but it should not be collected at all costs. In other words, there are limits – the moral and ethical limits of that process. You also talked about digitalization and the role it plays in the transformation of the humanitarian space, and how that is key to localization. I think this raises some questions in terms of the friction that you presented between people and the processes. How do we use the digitalization of data in settings where society at large is not actually digitized? Maja was talking about the DRC. We heard about Chad. I’m sure if we think of Afghanistan – a lot of these places are not particularly up to date when it comes to technology. So where is that balance, really? In a way it’s almost as if you have on one side the guardians of the galaxy with their data, and on the other the vulnerable populations they are supposed to serve, who almost don’t have access to it.
So on that note, I’ll have a question for each of our panelists and then we’ll open it to you, the audience, because we really want this to be a conversation. We have a lot of knowledge in the room; there’s no point in my monopolizing this conversation when in fact the knowledge may be with you.
So, Ziad, we’ll start with you. In all that you’ve said, I want to know: How could an established international charter help facilitate a convention for data collection in the humanitarian space? Is that possible?
Mr. Al Achkar: That is an excellent question. Is that possible? Looking at the way the international system is right now, I’m a little bit skeptical that we can come to an international agreement on humanitarian data collection – on principles, regulatory frameworks, and an international governance mechanism. So I don’t think an international agreement is possible in the short term; there would be a lot of politics involved. But I say this: It does not mean that we should not be working within the U.N. system, within the international community, and with partners who are interested in building governance structures – building in, to the points raised today, accountability mechanisms, standards, and ways to have feedback loops with the communities that are most impacted by this. I think there’s a lot that we can do as a humanitarian sector, as humanitarian agencies. Some of the work that’s being done, for example, by the ICRC on its data responsibility handbooks, or by OCHA and the Centre for Humanitarian Data on capacity-building processes, could be applied to the broader humanitarian community.
I unfortunately don't see it. I know there's a lot of conversation happening about whether we need a digital Geneva Convention, for example. I think the conversation about digital emblems is critical and so important, but unfortunately, I can't see an international data responsibility mechanism governing 196 countries. A lot of countries around the world still, to this day, don't have data responsibility or data privacy laws on the books – roughly 67 to 68 percent of countries have one. In the United States, for example, there is no real national law; it's down to the states, and there's work being done at that level. So I think we're far away, unfortunately, from a global, international governance of data for humanitarian response, but there is a lot of work that can be done and that ought to be done.
Mr. Dizolele: But you mentioned earlier political will – or the lack thereof. It's hard to herd all the cats, all the various countries of the world, to come together. But can Western organizations be subject to a code of conduct, a code of ethics, that may be voluntary but can start the process?
Mr. Al Achkar: I would say a lot of them already do have codes of conduct, already have procedures, and follow specific principles. One of the big things we come back to, as humanitarians, is that our fundamental guiding moral compass is the humanitarian principles. We can debate the role of neutrality, but today a do-no-harm approach fundamentally guides a lot of humanitarian work, and the humanitarian principles guide a lot of the work of humanitarian agencies. We've seen a lot of codes of conduct come up; the problem becomes that if there are too many of them, it becomes an ad hoc approach. That was the issue in the early-to-mid 2010s: we had a lot of ad hoc codes of conduct that were not representative of the whole community. So any effort to bring the community together to build one system, a shared vision of what responsible data use is, will go a long way to improving accountability and, I think, transparency, which is key. This is one of the issues: there's very little transparency about how we collect data, what we do with it, and who gets access to it, but also about the contracts that the humanitarian sector enters into with private technology companies. And so, to your point, there's a lot that we can be doing, that we ought to be doing, and we're still not doing it.
Mr. Dizolele: OK, so it’s not exactly the Wild West, but close.
Mr. Al Achkar: It’s better than it used to be. I’ll say it that way. (Laughs.)
Mr. Dizolele: All right. Thank you. (Laughs.)
So Maja, I would like to come back to you about that disparity you described earlier – about the data collection itself, but also how it stands in the way of access to services for a lot of these families and communities. Can you talk a little bit about the vulnerabilities there?
Thank you.
Ms. Lazić: That's a bit of a difficult question for somebody like me because, even though in my past life I was a protection officer in the field, to discuss this I would have to pull on my experience from there, whereas what I can talk about in the context of data for us is more data protection.
Mr. Dizolele: OK.
Ms. Lazić: So would it be good for me to maybe address some of the work that has been happening in the U.N. Refugee Agency in terms of developing a microdata library – working with data protection concretely and trying to unlock some of those assessments and surveys that we unfortunately too often have sitting in the offices precisely because there are some deep data protection challenges?
Mr. Dizolele: OK. Please go ahead.
Ms. Lazić: OK. So essentially the point of departure is the fact that we need the data, as we've established, and that it's really complex to manage. And within that complexity, we're also faced with the fact that the cost of not using the data that is stuck in many offices because of data protection concerns is simply too high. We're faced with growing needs and growing populations being displaced.
So what we did, essentially, with all this in mind, we mobilized our expertise in the Joint Data Center to support UNHCR to deliver on its data transformation strategy, and I recognize this word from the conversation already. So that’s something that the U.N. Refugee Agency is also going through. And the idea was we need to unlock the value of the existing data from surveys and assessments so that this data can be responsibly shared within and beyond UNHCR.
We've facilitated the collaborative transfer of expertise, and this is where it gets really interesting, because the World Bank has knowledge, tools, and great experience in managing data, being a leading institution in household survey data, and it was able to transfer this wealth of expertise to UNHCR to enable the setup of a microdata library there. Part of the project was a team of curators who would compile data inventories across UNHCR and put in place the processes to take the identified data sets to publication. This required a pipeline through which data is checked, cleaned, anonymized, and documented. It was developed through really close collaboration with the experts in the World Bank's data group, and it's being institutionalized rigorously in UNHCR with guidance, training, and governance. So data is now stored on UNHCR servers and covered consistently by its data protection policy. Just to give you an example of what a stark contrast that is: I've sat in a field operation where I would get a request for data from another organization or from the government, and I would need to sit and look at our data protection policy – which at that time, I have to say, wasn't as good – (laughs) – as it is becoming now – and basically make an on-the-spot decision: can we share this data or not? And I have to say that, as a protection officer, I'm good at international human rights law and humanitarian law, but to make a decision like that in real time was quite an onus.
So this process behind the microdata library is really something that I think is revolutionizing UNHCR. The library runs on an application that was developed and is maintained by the World Bank data group and that has been adapted to meet UNHCR's needs through regular information exchange, and these consistent setups and approaches mean that the UNHCR microdata library is fully interoperable with the World Bank microdata library.
So building on this expertise, and learning also from the private and public sectors outside of forced displacement and outside of the humanitarian space, the microdata library was launched in 2020 and now holds more than 470 data sets. We can talk about access and use later on, but maybe I can just say now, to round off this part about making data available: what's really important is that when you put together two institutions like UNHCR and the World Bank, and when you're able to transfer this kind of expertise from the World Bank into UNHCR, you're not just giving UNHCR a robust and rigorous way of sharing data and unlocking this wealth for programming – not just its own but also other organizations' – you're also transferring, for lack of a better word, the learnings about the sensitivities of the forcibly displaced into the World Bank, which does a lot of surveys and which is increasingly paying attention to vulnerable populations out there. So what I'm trying to add is essentially that the knowledge transfers two ways: you're releasing data, but you're also sensitizing people to the protection needs of populations that are particularly vulnerable.
Maybe I'll stop here and we can elaborate further on use and the rest.
Mr. Dizolele: Oh, this was very useful, Maja. Thank you very much.
So we'll go back to Laura. (Laughs.) You talked about the emblem that says please don't touch this. So the question, really, is: what principles and guidelines shape the ethics of data collection and security in international settings? And then, how is this idea of "don't touch this" taking hold, if it is, and what's its future?
Ms. McDonald: Gosh, I hope everyone else heard MC Hammer just then. (Laughs.) OK, I just dated some of us, but there’s some blank faces in this room and that’s frightening.
Ms. McDonald: OK, so I mentioned that the ICRC has a data protection office. That's partly because we're old enough, and partly because we have a particular mandate under the Geneva Conventions, which still apply online and which we think are fit for purpose there. When it comes to the laws of war, the Geneva Conventions can be updated when new weapons and new approaches come along – updated in the way that they are applied and the way they are understood. They don't necessarily need an update, we believe, to handle things like AI, machine learning, and automated targeting systems. That's an aside. Now, the national or regional data protection legislation that you've heard about – perhaps you've heard of the General Data Protection Regulation, or GDPR, in the EU; we have similar legislation coming up here – doesn't actually apply to the ICRC. But we do need to be accountable and we do need rules that govern the data we collect, so we have a data protection framework, which you can download and read online. It's governed by a data protection office that guides our offices and delegations around the world, and our HQ, on what to do – making the tricky decisions that Maja described about how to handle data – and we have an independent commission that oversees that as well. So you can go and have a look at that.
But to your point, there are maybe two things I might just say quickly. One is that where this really lives, where it sings, where it really happens, is not necessarily in the drafting or the application of these regulations. It's in their being understood by the people who have to implement them every day. I used to think of this as needing a carrot and a stick: there need to be incentives within organizations, and it needs to be in the job description and the performance evaluation that you implement things like data protection rules. And I always look at the example – it's not the same, but – of sexual exploitation and abuse within the humanitarian aid system.
There was a big outcry about that a few years ago. Some agencies had offices dedicated to the topic at HQ; they had trainings and a system; there were certifications; there was review; there was an ombudsperson. Still, these problems were widespread, and it was instructive to see the scale of renewed investment in those areas after the outcry happened, because that tells you the scale of institutional investment that needs to happen to make something like protecting folks from sexual exploitation and abuse happen. Same thing with data protection. You can have all the data protection offices you want, but multiple things have to happen within an organization. It's not just incentivizing and telling someone to do it; it's giving them the tools to do it. So that's about capacity, about understanding technology, about having systems in place – and it's not just about digital data either.
There's a wonderful scholar called Elizabeth Eagen, who works on data and society at the Ford Foundation, and she told me once about interviewing someone at a civil society organization in Eastern Europe about data protection, sitting in a room with boxes and boxes of physical paper, wondering: what would happen if she walked out with one of those boxes? What is in there? The same conversation comes up when we're worried about digital data and we're not paying attention to what happens – to what the basic systems at work in an organization actually are.
And I think we also have to disaggregate. It's not just about data protection. A very important subset of protecting data and being responsible with data is cybersecurity, and here again is a whole wing of critical infrastructure upgrades that the humanitarian sector needs and can't really afford. You might know that the ICRC itself was hacked – actually last year, discovered in January. If you google it, you can read all about it on our website, because we were as transparent as possible. It's important for people to know that if the Red Cross can be targeted in an attempt to access our very sensitive data, so can all the other organizations – and we've been hearing about all this sensitive data that is collected. It's a fact that most organizations can't resource the type of cybersecurity operation that would be required to really keep that data safe. Which brings me back to values, which I wanted to talk about in terms of ethics and principles.
I think your personal values around this can grow and change. Certainly, there is always tension between "we must do something" and "we must do the right things and only the right things," and that's something we all have to weigh and all organizations have to weigh. With data, the reason to be minimalist about collection is that you cannot always secure it. If you know that you cannot take the risk of that data being used or uncovered or taken away or changed, then you shouldn't necessarily collect it, and I think that's a conversation we have to have.
And just to wrap up, I would say that one of the other challenges is something Maja made me think about, or touched on just now. When we're doing these analyses – and this is part of what makes the capacity building so tough – it's not just political science or international relations majors that we have to bring together; it's also the computer science people and the lawyers, and these people don't like each other very much and do not understand each other when they speak. But you have to get them in a room and on a working group for a month to figure this stuff out, and then they need to be involved in every operational decision you make about these databases or these programs. This is analysis and risk mitigation across multiple specialisms, and it's really hard for us to do as humanitarians – or as anyone, arguably.
Anyway, I think I’ve gone on for long enough, so I’ll stop there.
Mr. Dizolele: Thank you very much, Laura. Very insightful there.
Patrick, we'll close with you before we open it to the audience. You sit at the intersection between the humanitarian and the private sectors, so to speak, with KoBoToolbox. You straddle both worlds. So the question for you would be: what avenues exist to provide data collection transparency to local organizations in crisis-affected regions, if such a thing exists? And do Western organizations partner with local digital data collection companies in this humanitarian space, particularly in terms of interventions? About three to five minutes for you and then we can open it to the audience. Thanks.
Dr. Vinck: Sure. Thank you. And I kind of wish I could revisit some of the past questions, but maybe we’ll have time after.
I have to say this: I'm kind of an accidental entrepreneur. KoBoToolbox was really created because, on a very selfish level, more than 10 years ago we needed a tool that would meet the requirements for data collection and help us move faster from the moment we collect data to the moment we're able to get insight from it. So it could not be something that relies on the internet; it could not be based on complicated tools and analysis. We needed something very simple to use in the field, and very robust. We managed to literally take the money from photocopies and data entry, invest it, and hire a programmer, and then we found an open-source tool, ODK, that we could build around, and so we created something that turned out to work well. It turns out that once it existed, people wanted to use it; once they wanted to use it, they wanted to improve it. And so we built this platform, but the model has remained free and our goal remains the same: we need good data, so we make it simple and free for any humanitarian actor to collect data with a tool that is quite powerful but easy to use. So we're not really in the private sector in the sense that this is not a private company; it's a nonprofit, and we're just trying to make sure there's good data out there.
Now, the challenge with that is that as you make data collection simple, you make collecting bad data simple. You make it easy, frankly, for organizations to collect data and create risks for communities. So along with innovation, along with these tools, there is a need to improve practices, to improve basic digital literacy. I know other people on the call have heard of these examples from the early days of crowdmapping: digital maps that would show victims of sexual and gender-based violence with their locations, or show potential targets on maps for armed groups. There are things that are so out there that it sounds ridiculous that someone would do them, but the reality is, we all make mistakes – let's just be clear on that – and we need to learn from them. And so one of the key questions, then, is: who's responsible for that? Who is responsible for creating the learning? Because the reality is, when we go and ask local organizations to start data collection for us, who spends the time training them on what it means to have really meaningful consent? How should you train your interviewers? How should you protect the data? There are so many basic digital skills – and, hear me out, there are fantastic partners and really knowledgeable people in every country, but for every organization that knows what it is doing, there are others that are not as good.
Let me give the example of helplines. You ask, do we partner? Well, of course we partner with local actors who collect data all the time, and running a helpline is a good example – it's done by local organizations. But how are they trained? If an organization wants to use a specific software, they will certainly be trained on the software, but what about data protection? What about safeguarding information? What about how to respond to requests from the government?
I gave the example earlier. It's fine for an agency to say, well, the government can ask us for data; we won't give it to them, sure. But what about your partners who are actually on the front line, who are actually collecting the data? I have had that experience in the field: we ask a sub-office, has the local government ever asked you to share information, and they say no, never, they know we don't. And then you go to their partners who are actually collecting the data and you ask, has the local government asked you to share information? Oh, yes, all the time. So there are so many disconnects, so many aspects on which we need to do a much, much better job. I talked about this dichotomy between digital transformation and localization, and that's really where it is. We need to set up that infrastructure, those support networks, peer-to-peer networks; make sure that organizations talk to each other; and build knowledge and experience.
And I do want – I'm sorry; I don't know if I'm out of time – I do want to go back to one thing, which is that data security has become such a cybersecurity and technical discussion – firewalls and this and that. All of these are really important, and the fact that the ICRC was hacked, as was mentioned, should not undermine that; in any organization, even one with the best protection, like the ICRC, there is a risk. But it shows the importance of having everything else that goes with the policies: enforcement, accountability. Sorry, I'm rambling, but I do want to give one example of that, which is data retention. You know your data may be hacked; you know it may happen soon. Have a clear policy on how long you're going to keep the data exposed to that risk. What's the process? Is it after one year? After two years? After three years? Have a clear decision. I'm not going to tell you what's right or wrong; I'm not going to tell you you need to delete all data after six months. But be very clear and purposeful. Think about this question. Maja shared her example about having to decide how to navigate a data request. Well, you have to have very clear guidance on how you deal with requests. What is reasonable? What is not? What's the risk? How are you going to assess it? Those things are available to the ICRC and IFRC, to the U.N. agencies of the world. They're not available to my friends at ISHEL in the Central African Republic. They don't know how to do this, to be blunt – and they are fantastic people, they are smart people, but there are things that come with experience, and that's where we need to work.
Mr. Dizolele: OK.
Dr. Vinck: Sorry, I’ll stop.
Mr. Dizolele: Thank you very much, Patrick.
We’ll open it to the audience. So if you want to ask your question, raise your hand, identify yourself, and ask the question, no political speech, and if you want to direct it to a specific panelist, please do so.
Sir.
William Keady: Dr. Vinck, my question is directed right to you.
Mr. Dizolele: What affiliation?
Mr. Keady: My name is William Keady. I’m an intern at the Osgood Center for International Studies.
My question relates right to what you just started talking about. Where should proper data management compliance training come from? Should that be a governmental response or come through NGOs? We see these examples of hacking across different organizations – how do we build trust with refugees so that they know their data is going to be safe and they don't have to worry about giving away their data and losing it to bad compliance?
Mr. Dizolele: Do you want to direct that to a specific panelist?
Mr. Keady: Dr. Vinck, I guess you could answer that.
Mr. Dizolele: OK.
Dr. Vinck: Sure. And I'm sure my colleagues, Maja and Laura, may have things to add. But let's be very clear: I'm lucky, I guess, to have an institutional affiliation – you do too – with a university. We know exactly how it works. We have an IRB; we have questions; we can go to them. That IRB itself is accredited, there's certification, and we take classes every year on those practices. Largely, none of this exists elsewhere. Within organizations you may have specific courses that people have to take, but generally that is not really tied to specific data, access, or privileges. It needs to be. You need to work on accreditation. You need to work on courses and certificates. And those need to be kept up to date. But we talked about resources – who's going to pay for that? All I know is it needs to be done, and it needs to be done in a way that is not internal; it needs to be publicly available to all organizations working on humanitarian data.
So in short, where should it come from? Governments of course have an important role to play, and the international community – and frankly, here I would look at the U.N. I would look at UNHCR, at the UNHCRs of the world. They have a responsibility to make sure that those trainings, those resources, are available, even if it just means contracting them out. Look, I would love to do this; I would love for us to build a course. The Federation of the Red Cross – not the ICRC; the IFRC – has built fantastic resources for its affiliates. Let's do more of that.
Mr. Dizolele: Thank you very much.
Maja, Laura, or Ziad? Anybody want to quickly add to this?
Ms. McDonald: I can quickly say – yes, a good shout-out to the IFRC's data playbook, which just had its public launch; it is indeed fabulous. And the ICRC data protection office also runs trainings periodically in different places. We are planning one in March for those in D.C., and we have run them in Dakar, in Latin America, I forget where else. But it's hard for us to fund that too, and a one-week course on data protection is not enough; as you say, these things need to be refreshed and updated, and there need to be mechanisms. So I would just say that.
And then I would also add that ultimately this is about power – not the training. I mean, sometimes, but then that's not good training. (Laughs.) The issue of how much access we give folks without really testing or ensuring that those people are adequate safeguarders of that data – this is about us not prioritizing it because we don't have to, because we have power. And a little bit of loving feedback to the international system that I'm a part of: for the big organizations who are grantors, or who subcontract to smaller partners, it is on us to ensure that those partners have the support they need to do this well. The same goes for donors: the drivers of good and bad practice in this sector are the institutional policies that we – big organizations and donors – have around data retention, data capture, and data sharing, including with us. So there is something of having to get our own house in order and look to ourselves there.
Mr. Al Achkar: I'm just going to make a quick note. Unfortunately, one of the things we always forget is language. If we're going to do all these trainings, they have to be in as many languages as possible to reach as many people as possible. We really do not do a good job even within the six basic languages of the U.N. Good luck finding data responsibility resources in Arabic; as an Arabic speaker, I can tell you they don't really exist out there – they don't come out as often; they're not published as often. So one of the challenges is not just developing these tools but making sure to translate them into dozens of languages, so that the people on the front lines – the people on the ground doing this work, responding, operationalizing all these guidelines – can really work within the language that they speak and not rely on principles that are written in English or French.
Mr. Dizolele: Thank you.
Maja, any comment or –
Ms. Lazić: Yeah, just briefly, to really echo everything that's been said, and particularly the point about roles and responsibilities, even within an organization. I think that's what we've been experiencing through the development of responsible data access in UNHCR – how we're getting down to the nitty-gritty, to a more granular approach to managing data and sharing it. The example of UNHCR's microdata library is also what has empowered the Joint Data Center to go external and basically say: if UNHCR, with such a sensitive mandate for such vulnerable populations, can go out and institute all of these things and share data, surely we can work with other stakeholders. And the way to do that – we talked about a convention for data protection, we talked about different things – seems to be to really open up a pool of the various tools and guidance being produced, because many organizations sit with this between their four walls and are figuring it out alone, whereas there's actually a lot of intel that we can collectively put together. That also goes to trainings – developing trainings that put all of us on the same standards in terms of the different components of data protection. That's a field that is moving, not just with the Joint Data Center but also with other partners in the Inter-Agency Standing Committee and in the development partner world. So just to echo and say that that is something we feel is moving.
Mr. Dizolele: Thank you very much to all the panelists for your comments. We have about four minutes. Any burning question from our friends in the audience? Question to the points? You’re going to address it to a specific panelist?
Q: I can shoot it out to everyone. Hi. I’m Kiara. I’m from the Department of State.
So the United States obviously prides itself on being one of the biggest humanitarian donors, and I know you just talked about larger donors working with smaller countries or organizations. What would data support look like for the Department of State or the United States as a whole?
Mr. Dizolele: OK, thank you. So we have three minutes to answer this question.
I think there’s a question over there.
There’s one more question?
Q: (Off mic) – one?
Q: I’ll do it later, thank you.
Mr. Dizolele: Later? You’re sure?
Q: I can give it a shot, but it’s probably –
Mr. Dizolele: Mic.
Q: There’s a – yeah. (Comes on mic.) Thank you. Daniel Cragger, University of Washington.
A lot of our conversation today has focused on data security, but I think this also relates to effectiveness and usefulness, and to how these are operationalized in context. For example, there's a lot of demand for age and gender disaggregation, but in contexts where I've worked, that's not necessarily feasible – where populations are worried about conscription, for instance. I've also seen questions about gender-based violence and the prevalence of underage child marriage operationalized, in U.N. products, as surveys conducted in public with heads of household. So my question to you all is: how can we better align current data methodologies in the humanitarian sector given the differing contexts and data security needs? Thank you.
Mr. Dizolele: Did you say your affiliation?
Q: University of Washington.
Mr. Dizolele: All right. Thank you.
So because we have about three minutes – (laughs) – we're going to give two of our panelists a crack at your question and then two at the fellow's question. And please, no more than 45 seconds each in your responses. (Laughter.) Thank you. (Laughs.)
Mr. Al Achkar: I'll go first and answer the first question. When asking what the State Department can do to help support communities, I think the first thing is to ask them what they need – to work with them and engage with them in meaningful conversation, so that we don't sit here in Washington making assumptions about what somebody in Beirut, Lebanon, needs, or what organizations need. So it's about opening communication and asking organizations what they need.
Mr. Dizolele: Thank you. Anybody else among the panelists want to take 45 minutes (sic) for that question?
Mr. Al Achkar: Seconds.
Mr. Dizolele: I mean seconds, sorry. (Laughter.) Pardon – 45 seconds. (Laughs.)
Ms. Lazić: Do you want me to come in?
Mr. Dizolele: Do you want to answer that?
Ms. Lazić: Yeah, I can quickly just say – well, first of all, I think the U.S. State Department is doing a lot in terms of supporting data and the work around data, as we're doing in the Joint Data Center. They are one of our main contributors and partners in this space, and they've got a lot of bandwidth for the kind of work that we're doing, including some of our explorations. So just to say, from that perspective, there is action from their side. Perhaps I'll stop there, but that's something we appreciate highly.
Mr. Dizolele: Thank you very much.
Laura or Patrick, do you want to address the second question?
Ms. McDonald: I have a suggestion for both questions, which is very ambitious, and it is this: shorten the feedback loop. One thing I notice about digital and data work, full stop, across both development and humanitarian aid is that we don't have enough evidence of what happens down the road after our intervention. In any other field, that would not be OK. If we acknowledge the power imbalance that the United States has, or that we have as actors, we need to be looking at the outcomes – including looking at the data: is it problematic, and what is the outcome of collecting it? And we need to be actively soliciting the views of the people who are the data subjects and making sure that they can quickly change what is happening. They also have rights under most data protection regulation, which we should respect. So yeah, that's my suggestion: shorten the feedback loop, make sure we're actively soliciting feedback from communities, and hear what happens down the road from our intervention.
Mr. Dizolele: Thank you, Laura.
Patrick, we’ll give you the last 30 seconds.
Dr. Vinck: I have a ton to say on both questions but I will do exactly what Laura just suggested, which is to shorten my feedback loop and I will leave it there with the fantastic answers from my colleagues, but I’m always happy to discuss. Don’t hesitate to reach out. Those are really important questions, but thank you very much.
Mr. Dizolele: Thank you for the short feedback. (Laughter.)
We would like to take this opportunity to thank our distinguished panelists for their expertise, for joining us today, and thank the audience – (applause) – a round of applause for our friends.
Thank you. This concludes our session.