Published By: University College London/UCL News, 10/24/2016
Artificial intelligence has recently been used to predict (past) judicial decisions in the European Court of Human Rights with surprising accuracy. This method could potentially be used to automatically identify cases that are likely to involve human rights violations, and it is also an interesting example of how artificial intelligence can quantify, and even predict, human behavior through pattern recognition.
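The study behind the article trained a classifier on the text of past case documents to predict "violation" or "no violation" outcomes. As a minimal illustrative sketch of that kind of text-based pattern recognition (not the researchers' actual pipeline, which used richer features and a proper machine-learning model), the toy code below scores a new case description against word counts from labeled examples; all training snippets and names here are hypothetical:

```python
from collections import Counter

def tokenize(text):
    # Extremely simple tokenizer: lowercase, split on whitespace
    return text.lower().split()

def train(labeled_docs):
    # Build a bag-of-words frequency table per outcome label
    counts = {}
    for text, label in labeled_docs:
        counts.setdefault(label, Counter()).update(tokenize(text))
    return counts

def predict(model, text):
    # Score each label by the summed training frequencies of the
    # document's words, and return the highest-scoring label
    words = tokenize(text)
    scores = {label: sum(c[w] for w in words) for label, c in model.items()}
    return max(scores, key=scores.get)

# Hypothetical toy training data, NOT drawn from the actual ECtHR study
train_docs = [
    ("detention conditions degrading treatment", "violation"),
    ("prolonged detention without judicial review", "violation"),
    ("complaint dismissed correct procedure followed", "no-violation"),
    ("fair hearing procedural safeguards respected", "no-violation"),
]
model = train(train_docs)
print(predict(model, "degrading detention treatment"))  # prints "violation"
```

A real system would use many thousands of documents, richer features (e.g., word sequences rather than single words), and a statistical classifier, but the core idea is the same: outcomes are predicted from textual patterns in how cases are described, not from legal reasoning.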
Extended Discussion Questions
- The article mentions that, based on their results, the team found judgments to be based more on non-legal facts than on direct legal arguments. What are some benefits of being able to categorize human behavior with computing? What are some potential drawbacks?
- What kinds of assumptions were the researchers making in categorizing the bases for the court’s decisions? Are there other ways they could have broken down the data?
- Dr. Nikolaos Aletras, the leader of the team, is quoted as saying, “We don’t see AI replacing judges or lawyers.” Do you agree or disagree with this proposition? Why or why not?
- If court decisions will not be made directly by AI algorithms, how else could they be integrated into the existing legal system? What impacts might they have on the system?
Relating This Story to the CSP Curriculum Framework
Global Impact Learning Objectives:
- LO 7.2.1 Explain how computing has impacted innovations in other fields.
- LO 7.4.1 Explain the connections between computing and real-world contexts, including economic, social, and cultural contexts.
Global Impact Essential Knowledge:
- EK 7.2.1A Machine learning and data mining have enabled innovation in medicine, business, and science.
- EK 7.3.1A Innovations enabled by computing raise legal and ethical concerns.
Other CSP Big Ideas:
- Idea 4 Algorithms
Banner Image: “Network Visualization – Violet – Crop 9”, derivative work by ICSI. New license: CC BY-SA 4.0. Based on “Social Network Analysis Visualization” by Martin Grandjean. Original license: CC BY-SA 3.0