Journal of Cross-disciplinary Research in Computational Law (2021)
Delay is a central element of law-as-we-know-it: the ability to interpret legal norms and contest their requirements is contingent on the temporal spaces that text affords citizens. As computational systems are further introduced into legal practice and application, these spaces are threatened with collapse, as the immediacy of ‘computational legalism’ dispenses with the natural ‘slowness’ of text.
In a constitutional democracy, the operation of law relies on the multi-interpretability of language and the possibility of contesting meaning. These capabilities are assumed in the contemporary structures of rule creation, interpretation, adjudication, and enforcement. When new technologies are introduced into those structures, such as AI/machine learning or self-executing rules (e.g. smart contracts), new mediations necessarily enter the frame.
Computational legalism refers to the ontological features of digital systems that make it impossible for humans to see, interpret, and contest the rules that they contain and impose. When software code structures our behavioural possibilities, it forces us to act ‘legalistically’, that is, like automatons: we don’t think, but simply act in accordance with the structures laid down by the code, because we have no other choice (unless the system’s designer has elected to give us one). This paper seeks to view and challenge computational legalism through the lens of critical code studies: the explication of a digital system’s meaning through interpretive, or hermeneutic, analysis of its text, i.e. its source code. These texts play a fundamental role in the constitution of all digital systems, and so any analysis cannot be complete without at least some engagement at that level. Looking at digital systems from this angle is relevant to real-world processes of designing and producing code, and might help us to identify normative possibilities for combating the vices of computational legalism.
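A minimal sketch may make the point concrete. The function below is hypothetical (it does not appear in the article): it encodes an access rule as a hard threshold, illustrating how a coded norm executes immediately and admits no interpretation, exception, or appeal unless the designer builds one in.

```python
# Hypothetical illustration of computational legalism: a rule fixed in
# code applies automatically, with no space for interpretation or
# contestation by the person subject to it.

def can_access(age: int) -> bool:
    # The threshold is set by the system's designer. The user cannot
    # argue for an exception, cite mitigating circumstances, or appeal;
    # the rule simply executes.
    return age >= 18

print(can_access(17))  # the rule denies access: False
print(can_access(18))  # the rule grants access: True
```

Where a textual norm ("adults may enter") leaves room for argument about borderline cases, the coded rule collapses that interpretive space into a single binary evaluation.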
This post summarises computational legalism, a concept I developed in my doctoral thesis that is born of the parallel between code’s ruleishness – its reliance on strict, binary logic instead of interpretable standards – and its conceptual equivalent in the legal realm, known as legalism (more specifically, the strong variant of the latter).
Technology law scholars have recently begun to consider the design studies concept of affordance, bringing it into the legal fold both as a means to explain how the law has developed the way it has, and more recently in attempts to cast the law per se as an affordance.
The literature concerning the regulation of technology is rightly beginning to focus more on ex ante, or ‘by design’, enforcement. Despite more than two decades having passed since Lawrence Lessig’s Code was first published, the acceptance that ‘code’ (as opposed to law) has the power to regulate, and an appreciation of that power, is still developing, particularly since it requires a level of interdisciplinary openness to which the conservative legal world is occasionally hostile.