This research project is funded by the basic research fund for the International Postdoctoral Fellowships of the University of St.Gallen. It will be undertaken at the Institute for Business Ethics with Prof. Dr. Florian Wettstein’s research team.
The research will explore responsibility for structural and intersectional injustices that are channeled through digital technologies. It will take an interdisciplinary approach and proceed in two phases:
Phase 1 will be an in-depth examination of different forms of direct, indirect and structural digital violence, their relationships to each other, and how these relationships affect regulatory settings. Direct violence puts agents, who intentionally cause harm, in direct relation to victims who suffer the violation. Violence is indirect when agents do not intentionally or directly harm a victim. Structural violence results from the interplay and layering of direct intentional violence and indirect unintentional violence, which together replicate intersectional inequality. That is, when the use and design of digital technology intentionally and/or unintentionally produces harm, intersecting inequalities (e.g., of gender, race, ethnicity, economic status) result from the interplay between direct and indirect violence.

This interplay and the resultant intersectional inequalities pose problems for existing regulatory regimes. Existing regulation is not set up to deal with this kind of complexity: it conventionally focuses on remedying direct intentional violence, with minimal attention to indirect and structural violence. The proposed research aims to show how direct and indirect violence intersect to produce structural violence that affects regulatory choices and effectiveness. Direct intentional harms can be regulated through conventional liability, such as tort law, but this does not address the unjust structure that prompted the violence. It is unclear what regulatory alternatives might be available when direct intentional (or indirect unintended) violence through digital technology is activated by unjust structures (or socio-economic and political systems). To an extent, structural violence is agent-invariant: it persists regardless of the punishment of individual wrongdoers. Hence, an optimal regulatory framework would target the underlying unjust structure as well as the wrongdoer. Currently, no studies unpack the effects of different forms of violence on regulatory efficiency in the digital space. The proposed project seeks to fill this gap using empirical research methods.
Phase 2 will build on Phase 1 to introduce a comprehensive responsibility framework, based on principles of political responsibility and the capability approach, for structural violence that is replicated and amplified through digital technologies. This phase will pose three questions: 1) What are the bases or parameters of responsibility? 2) How can those who bear responsibility be identified? 3) How should an alternative regulatory framework therefore be framed? It will address these questions by investigating the extent to which those who control and benefit from unjust structures seek change. Using Iris Marion Young’s social connection model, the research will examine the extent to which agents have control and power over digital infrastructure and hence bear responsibility. Drawing on the interest parameter of the social connection model, it will also explore the extent to which actors other than those controlling the structure contribute to its operation and thereby bear responsibility to challenge it. Going beyond Young’s model, the research will use a capability-based interpretation of power to identify agents of justice and their expected moral responsibility, focusing on the responsibility of tech companies, states and users (including potential victims) to undermine unjust digital infrastructure.