The Ethics Engine: A Modular Pipeline for Accessible Psychometric Assessment of Large Language Models
By: Jake Van Clief, Constantine Kyritsopoulos
Potential Business Impact:
Helps study AI's values and guide its use.
As Large Language Models increasingly mediate human communication and decision-making, understanding their value expression becomes critical for research across disciplines. This work presents the Ethics Engine, a modular Python pipeline that transforms psychometric assessment of LLMs from a technically complex endeavor into an accessible research tool. The pipeline demonstrates how thoughtful infrastructure design can expand participation in AI research, enabling investigators across cognitive science, political psychology, education, and other fields to study value expression in language models. Recent adoption by University of Edinburgh researchers studying authoritarianism validates its research utility; the pipeline has processed over 10,000 AI responses across multiple models and contexts. We argue that such tools fundamentally change the landscape of AI research by lowering technical barriers while maintaining scientific rigor. As LLMs increasingly serve as cognitive infrastructure, their embedded values shape millions of daily interactions. Without systematic measurement of these value expressions, we deploy systems whose moral influence remains uncharted. The Ethics Engine enables the rigorous assessment necessary for informed governance of these influential technologies.
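The abstract describes a pipeline that administers psychometric instruments to language models and scores their responses. A minimal sketch of that idea in Python is shown below; all names (`administer`, `scale_mean`, the Likert mapping, and the stubbed model function) are hypothetical illustrations for this sketch, not the actual Ethics Engine API.

```python
# Hypothetical sketch of a psychometric-assessment loop for LLMs.
# The real Ethics Engine pipeline's interfaces are not shown here.
from typing import Callable

# Standard 5-point Likert scoring (assumed instrument format).
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def administer(items: list[str], model: Callable[[str], str]) -> list[int]:
    """Present each questionnaire item to the model and score its reply."""
    scores = []
    for item in items:
        prompt = (f'Statement: "{item}"\n'
                  "Respond with exactly one of: strongly disagree, "
                  "disagree, neutral, agree, strongly agree.")
        reply = model(prompt).strip().lower()
        scores.append(LIKERT.get(reply, 3))  # fall back to neutral if unparseable
    return scores

def scale_mean(scores: list[int]) -> float:
    """Aggregate item scores into a single scale score."""
    return sum(scores) / len(scores)

# Stub standing in for an LLM API call, so the sketch runs offline.
def stub_model(prompt: str) -> str:
    return "agree"

items = ["People should follow rules without question.",
         "Authority figures deserve unconditional respect."]
scores = administer(items, stub_model)
print(scale_mean(scores))  # 4.0
```

In a real study the stub would be replaced by calls to one or more model APIs, and the per-item scores retained for reliability analysis rather than collapsed immediately into a mean.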