Ethics and Technology in Humanitarian Settings
1. Ethics and Technology in Humanitarian Settings: an iTRACK case study
Katrina Petersen, Olivia Iannelli, Trilateral Research
27/08/2018, IHSA 2018, iTRACK Project
2. Examining how the socio-cultural intersects with the ethical in humanitarian technology design and use
For example:
• What cultural considerations need to be taken into account to really understand how privacy is a public good?
• How do cultural considerations change what is ethically acceptable or desirable?
3. The iTRACK system
• Keeps track of humanitarian workers in the field
• Supports planners' and workers' situational awareness and decisions
4. Approach
• Literature review
• Simulation exercise testing iTRACK's various components
• Interviews with key stakeholders
• Synthesis of findings
7. StaffSense
Mobile application for encrypted asynchronous communication in the field
Users can:
• Communicate with headquarters and other users
• Receive notifications and alerts
• Send location-based threat messages
• Turn on tracking
• Use a panic button
Enhanced situational awareness; proactive warning and alerting
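To make the message flow concrete, here is a minimal sketch in Python of how a location-based threat message might be encrypted and held in a local outbox for asynchronous delivery. This is not the actual StaffSense implementation: the field names, the use of Fernet symmetric encryption, and the outbox queue are all illustrative assumptions.

```python
# Minimal illustrative sketch of an encrypted, asynchronous threat message.
# Field names, queueing, and the choice of Fernet are assumptions, not the
# actual StaffSense design.
import json
import time
from cryptography.fernet import Fernet

def build_threat_message(user_id: str, lat: float, lon: float, text: str) -> dict:
    """Assemble a location-based threat message."""
    return {
        "type": "threat_report",
        "user_id": user_id,
        "location": {"lat": lat, "lon": lon},
        "text": text,
        "timestamp": time.time(),
    }

def encrypt_for_queue(message: dict, key: bytes) -> bytes:
    """Encrypt the message so it can wait safely in an offline queue
    until connectivity to headquarters is available (asynchronous send)."""
    return Fernet(key).encrypt(json.dumps(message).encode("utf-8"))

# Usage: in practice the key would be provisioned per device.
key = Fernet.generate_key()
msg = build_threat_message("worker-17", 33.31, 44.36, "Roadblock reported ahead")
outbox = [encrypt_for_queue(msg, key)]  # held locally until HQ is reachable
```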
8. Mobile DSE application
• Dashboard for convoy navigation and routing
• Sends real-time convoy location to the centralized system
Improved navigation and mission re-routing (in case of threats)
Improved mission planning and organisation
9. On-board threat detection
Real-time detection of threats and critical events
• Records 3D 360° HDR panoramic images and video
• Uses machine learning to detect landmarks and objects
• Uses landmarks to estimate vehicle location and trajectory
• Sends real-time alerts to users
Automatically detects potential threats around the vehicle and notifies users and headquarters
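As a rough sketch of the alerting step described above (and of the "alerts and identifications with percentages" design choice discussed later in the notes), here is how raw detections might be reduced to privacy-preserving alert payloads, where only labels and confidence scores leave the vehicle, never the imagery. The class names and threshold are hypothetical, not iTRACK's actual configuration.

```python
# Illustrative sketch: turn on-board detections into alert payloads.
# Only labels and confidence percentages are sent; the panoramic frames
# stay on the vehicle. Class names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # object class from the on-board ML model
    confidence: float  # model confidence in [0, 1]

THREAT_CLASSES = {"weapon", "checkpoint", "crater"}  # hypothetical
ALERT_THRESHOLD = 0.6                                # hypothetical

def alerts_from_detections(detections: list[Detection]) -> list[dict]:
    """Keep only confident threat-class detections, as label + score."""
    return [
        {"alert": d.label, "confidence": round(d.confidence, 2)}
        for d in detections
        if d.label in THREAT_CLASSES and d.confidence >= ALERT_THRESHOLD
    ]

# Usage: only this summary is transmitted to users and headquarters.
frame = [Detection("weapon", 0.82), Detection("tree", 0.97)]
print(alerts_from_detections(frame))
# [{'alert': 'weapon', 'confidence': 0.82}]
```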
10. Movement in Space
• Ethical responsibility towards personal privacy: when a worker is off duty, they are not monitored
• Social responsibility towards worker safety: "off-duty" is not a clear category when deployed in a crisis zone
• Cultural responsibility towards local context: movement patterns can help understanding of the communities being served
11. Sorting and classification
"(1) pre-existing social values found in the 'social institutions, practices and attitudes' from which the technology emerges, (2) technical constraints and (3) emergent aspects of a context of use" (Friedman & Nissenbaum, 1996)
"the values of the author [of an algorithm], wittingly or not, are frozen into the code, effectively institutionalising those values" (Macnish, 2012)
12. Qualifying Risk
"When the location where the humanitarian workers reside when being off-duty does not pose any safety risks, the devices can be turned off" (iTRACK designer)
"When complex algorithms and big data sets are used to make decisions, it can be difficult to work out exactly what factors influenced that decision, and to what extent" (Murray & Fussey, 2018)
Safety risk depends on a range of factors: Context, Experience, Familiarity, Timing, Frame of Reference
13. Because blame can potentially be assigned to several moral agents simultaneously (Mittelstadt et al., 2016), and to avoid creating an "accountability gap" (Cardona, 2008), iTRACK has to be transparent about how it approaches security, threats, and ethical principles.
It has to be clear about:
• How to define responsibility?
• How to define justice and fair treatment?
• What is privacy, as a right, intended to offer?
• What elements of a situation need to be considered when determining if acts are ethical?
• What is a user required to know about a system?
This is in part driven by recent events that showed issues like privacy, autonomy, and justice are culturally bound:
--the Germanwings crash
--an increased realisation within technology design that it is not just the organisational or individual practices that matter, but the contextual, situational, environmental
--the increased use of tech designed by EU projects outside of the EU
One place where we've been exploring this work is in the project iTRACK.
EU funded: Horizon 2020, 3 years
12 partners from 8 EU states and the UN World Food Programme
Aim: a next-generation intelligent tracking platform, through things like threat detection and tracking of workers' paths and other assets.
Tested in simulations with humanitarian practitioners.
Pilot applications with the World Food Programme and iMMAP in the ongoing conflict disasters in the Middle East.
To be clear, iTRACK has not been tried in the active field yet. It is still being developed, so many of these conversations are about potentials, in order to get designers to consider the implications of their choices.
I'm going to go into three different areas where ethics are frequently the conversation and raise some socio-cultural considerations for further thinking.
Then, I'll sum up some initial thoughts on what this could mean for those of us considering ethics and technology in humanitarian settings.
Let me tell you a bit more about iTRACK. It brings together expertise in: sensor development, GIS, security & privacy, artificial intelligence, information management, risk analysis, and humanitarian logistics.
The location tracker can be turned off.
Because of privacy issues, no images of people are shared: images are not sent, just alerts and identifications (with percentages).
A few systems create traces in real time of the movements of people and assets.
These components can note if you enter a building or if you pause roadside for a while.
The designers are very aware of the privacy issues that come with this. Learning from recent troubles with Fitbits leaving traces of soldiers' movements at military bases, they specifically designed in the option to turn the tracking aspects off.
(Along with an elaborate scheme of access control, data aging, logging, and auditing)
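As a rough illustration of what the "data aging" part of such a scheme could involve (the retention windows, the record shape, and the coarsening rule are assumptions, not iTRACK's actual policy), old location fixes might first be coarsened and eventually deleted:

```python
# Rough illustration of "data aging" for location traces: old fixes are
# coarsened, then dropped. Retention windows are assumptions, not
# iTRACK's actual policy.
import time

COARSEN_AFTER_S = 7 * 24 * 3600   # after a week, keep only coarse location
DELETE_AFTER_S = 30 * 24 * 3600   # after a month, delete entirely

def age_trace(points: list[dict], now: float) -> list[dict]:
    """Apply the aging policy to a list of {lat, lon, timestamp} fixes."""
    aged = []
    for p in points:
        age = now - p["timestamp"]
        if age > DELETE_AFTER_S:
            continue  # aged out: the fix is deleted
        if age > COARSEN_AFTER_S:
            # Coarsen to roughly 1 km resolution by rounding coordinates
            p = {**p, "lat": round(p["lat"], 2), "lon": round(p["lon"], 2)}
        aged.append(p)
    return aged

# Usage: run periodically; every access would also be logged for auditing.
now = time.time()
trace = [{"lat": 33.312345, "lon": 44.361234, "timestamp": now - 10 * 24 * 3600}]
print(age_trace(trace, now))  # fix older than a week is coarsened
```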
However, while this manages one aspect of privacy, it raises others, ones that can be seen when looking at the socio-cultural aspects.
This is because turning off a location tracker might disable the humanitarian organisation's ability to warn, find, and evacuate humanitarian workers when an incident occurs in the location where they are 'off duty'.
We've had conversations with humanitarian workers who want to be protected and tracked to reduce safety risk. For them, privacy isn't even an issue at hand. In their socio-cultural context, responsibility and safety are what matter. Privacy does not provide safety, nor does it lead towards organisational responsibility.
By tracking humanitarian workers it could also be possible to enhance the safety of those they serve.
An advisor at the World Food Programme we spoke to mentioned that this additional layer of surveillance may also make beneficiaries, particularly women, feel more comfortable and safe when dealing with humanitarian workers. These women would know these individuals are being tracked and therefore more accountable for their actions.
So, what protection does privacy offer here?
Humanitarian safety and security and responsibility might be in conflict with specific engagements towards privacy.
Even more, privacy does not mean the same thing from one place to the next… (Germanwings)
And even when on duty, interpreting the tracking data is not straightforward.
Interviewee: "We were driving in Sri Lanka and our driver starts plucking leaves from a bush. He was collecting neem plants, which, if you give them to the local people, is considered an act of hospitality. The driver said that to enter into the community we should bring the women these neem leaves and they will take this as a sign that we are being culturally appropriate. When we arrived they were totally at ease and happy to see us and we had a successful meeting. Now you run that through iTRACK: you stopped by the side of the road to go play in the bushes for 25 minutes. Is that an inefficiency or is that a mission success factor?"
Reverse question: what protection does tracking offer?
“The fact is your system is never going to be able to detect that. There is no algorithm to detect whether you are hanging out with someone or doing something because you want to or because it is helping you get to your objective.”
No right answer
And these are harder to design for because many issues like this have no "right" answer to compare against.
If you are working on facial recognition, eventually it does or does not correctly identify faces.
But if you are working on objects, threats, and privacy, we are talking apples and oranges, where both have the potential to be correct… depending.
For example, a child soldier carrying a gun could be considered a threat, due to the identification of a gun, but to label him or her as such would be far too simplistic.
In a conflict setting, many things should not be interpreted as black and white but rather have a variety of different social and cultural interpretations, which vary from country to country and often from town to town.
Going back to the tracking: This makes it a challenge when the designers say:
“Of course, when the location where the humanitarian workers reside when being off-duty does not pose any safety risks, the devices can be turned off.”
Or when they say they can quantify threat.
How do you define the risks and threats, then, that these questions of privacy and security are being framed around?
What factors do you consider? What definition of objects?
With the implementation of iTRACK, the responsibility for threat identification, and therefore security, could be shifted from the individual to somebody else within their organisation, or at a distance in time and space (e.g. a programmer), affecting a humanitarian worker's dynamic with their organisation and peers, as well as their freedom of choice.
Some of these conversations are old, like the way values are carried in algorithms and classification systems.
Some are still ongoing: think of the challenges in facial and voice recognition, or police profiling.
Or those around justice and fair treatment (like those around environmental justice).
Because some groups are more vulnerable than others and subject to disproportionate burdens, fairness and justice mean something different for them than for those in positions of power.
So when defining the objects around us, these objects have different meanings and implications depending on:
Context, Experience, Familiarity, Timing, Frame of Reference