Yesterday’s session of the International Arbitration Forum brought together moderators Velimir Živković, Niamh Leinwather, and Miljana Bigović, alongside Professor Crenguta Leaua, an international arbitrator and advisor on complex disputes at the intersection of law, technology, and business, to explore the evolving relationship between arbitration and emerging technologies.
The discussion opened with an update on Platforum9’s new features launching on 1 May 2026, introducing a stronger focus on community engagement within structured forums.
Framing Technology Through Metaphor
Leaua introduced a distinctive framework, using metaphors from the Harry Potter universe to examine the growing influence of technology in dispute resolution. She emphasised that while arbitration has always positioned itself as adaptable and forward-looking, there is a risk of becoming overly captivated by technological progress, equating efficiency with genuine advancement without sufficient ethical scrutiny.
The Illusion of Autonomy and AI Influence
Leaua likened artificial intelligence tools to the “Imperius Curse”, warning of the illusion of autonomy they can create. While AI enhances efficiency in legal research, drafting, and analysis, it may also subtly guide reasoning. This raises concerns around the cognitive independence of arbitrators, who may unknowingly adopt algorithmic logic embedded within these systems.
Predictive Analytics and the Nature of Justice
Turning to predictive tools, Leaua drew parallels with “Liquid Luck”, noting that while such tools can forecast likely outcomes based on past data, they do not determine what is just. The increasing reliance on probability risks shifting arbitration away from principled reasoning towards strategic prediction, challenging its foundational purpose.
Speed, Efficiency, and the Loss of Reflection
Through the metaphor of the “Time-Turner”, Leaua addressed the growing emphasis on speed. While technology enables practitioners to process vast amounts of information quickly, she cautioned that legal reasoning requires reflection, an iterative and time-dependent process. Excessive acceleration risks diminishing the depth and quality of decision-making.
Digital Memory, Authenticity, and Evidence
Leaua compared digital evidence systems to the “Pensieve”, highlighting both their advantages and vulnerabilities. While digital storage allows for extensive documentation and analysis, it lacks the inherent responsibility and contextual awareness of human memory. This raises concerns around completeness, manipulation, and interpretation.
She further explored issues of identity and authenticity in virtual hearings, drawing on the “Polyjuice Potion” analogy. The rise of AI-generated content and deepfakes challenges traditional assumptions about presence and credibility, requiring new safeguards within arbitration processes.
Automation and the Limits of Rigidity
The discussion also touched on smart contracts, likened to the “Unbreakable Vow”. While these systems ensure automatic execution, they risk eliminating flexibility and the ability to account for unforeseen circumstances, such as force majeure or equitable considerations. This raises important questions about the balance between certainty and fairness.
Regulation and Systemic Response
Živković steered the discussion towards the broader regulatory landscape, questioning how arbitration can effectively respond to these technological challenges. Drawing on the Harry Potter analogy, he highlighted that attempts to control powerful tools often struggle without coordinated effort. He raised the need for a combination of global approaches and institutional initiatives, prompting reflection on whether harmonised regulation or decentralised solutions are more effective.
Institutional Tools and Ethical Guidance
Leinwather contributed from an institutional perspective, outlining efforts within the Vienna International Arbitral Centre (VIAC) to address these challenges. She referenced the development of AI-focused guidance and practical tools designed to help arbitrators identify risks such as deepfakes and misuse of technology. Leinwather emphasised the importance of equipping practitioners with both awareness and practical resources, while also questioning whether institutions should go further in embedding ethical considerations into their frameworks.
Practical Risks and Generational Challenges
Bigović provided insight from a counsel perspective, focusing on the practical implications of AI use in legal practice. She highlighted recent examples where misuse of AI tools led to significant professional errors, underlining the importance of oversight and verification. Bigović also pointed to a generational challenge: junior practitioners may rely heavily on AI without fully understanding expected outputs, increasing the risk of inaccuracies. She stressed the need for training, particularly in effective prompting and critical evaluation, alongside robust internal controls within law firms.
Confidentiality, Trust, and Human-Machine Interaction
The discussion further explored concerns around confidentiality and data privacy, particularly when using externally hosted AI tools. Participants noted the limitations this imposes on adoption, especially for arbitrators handling sensitive information.
Another key theme was the evolving relationship between humans and AI. As practitioners become more accustomed to these tools, there is a risk of developing a level of trust that erodes critical scrutiny. This raises questions about when reliance becomes over-reliance, and where responsibility ultimately lies.
Education, Awareness, and Responsibility
Leaua returned to the importance of education and awareness, arguing that technological competence must be paired with ethical understanding. She emphasised that responsibility lies not only in using these tools, but in understanding their origins, limitations, and impact.
A central concern was the growing divide between those with access to advanced technologies and those without. Ensuring fairness in arbitration will require transparency around the use of such tools and consideration of their impact on equality between parties.
Amplified Power, Amplified Responsibility
The session concluded with a clear message: technology amplifies human capability, but this must be matched by amplified responsibility. Arbitration must remain focused on achieving justice, rather than simply embracing speed or innovation.
Implications for Arbitration
- AI tools may subtly influence reasoning, challenging cognitive independence.
- Predictive analytics risks prioritising probability over justice.
- Speed must be balanced with reflection in legal decision-making.
- Digital evidence introduces new risks around reliability and authenticity.
- Institutions play a critical role in providing guidance and tools.
- Training and oversight are essential, particularly for junior practitioners.
- Transparency and responsibility must underpin all use of technology in arbitration.