Structure Transfer: an Inference-Based Calculus for the Transformation of Representations
By: Daniel Raggi, Gem Stapleton, Mateja Jamnik, and more
Potential Business Impact:
Lets computers automatically switch between different ways of representing the same information.
Representation choice is of fundamental importance to our ability to communicate and reason effectively. A major unsolved problem, addressed in this paper, is how to devise representational-system (RS) agnostic techniques that drive representation transformation and choice. We present a novel calculus, called structure transfer, that enables representation transformation across diverse RSs. Specifically, given a source representation drawn from a source RS, the rules of structure transfer allow us to generate a target representation for a target RS. The generality of structure transfer comes in part from its ability to ensure that the source representation and the generated target representation satisfy any specified relation (such as semantic equivalence). This is done by exploiting schemas, which encode knowledge about RSs. Specifically, schemas can express preservation of information across relations between any pair of RSs, and this knowledge is used by structure transfer to derive a structure for the target representation which ensures that the desired relation holds. We formalise this using Representational Systems Theory (Raggi et al., 2022), building on the key concept of a construction space. The abstract nature of construction spaces grants them the generality to model RSs of diverse kinds, including formal languages, geometric figures and diagrams, as well as informal notations. Consequently, structure transfer is a system-agnostic calculus that can be used to identify alternative representations in a wide range of practical settings.
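To make the idea concrete, below is a minimal, hypothetical Python sketch of schema-driven transfer between two toy representational systems. The names (Representation, Schema, transfer) and the toy RSs are illustrative assumptions, not the paper's formal machinery: the actual calculus is defined over construction spaces in Representational Systems Theory, not these simplified classes.

```python
# A minimal, hypothetical sketch of schema-driven structure transfer.
# All names here are illustrative, not the paper's API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Representation:
    system: str        # which representational system (RS) it belongs to
    structure: object  # the RS-specific structure (string, figure, ...)

@dataclass
class Schema:
    """Encodes how a pattern in the source RS maps to a structure in the
    target RS so that a chosen relation (here, semantic equivalence)
    is preserved."""
    source_rs: str
    target_rs: str
    applies: Callable[[object], bool]         # pattern match on source structure
    build_target: Callable[[object], object]  # derive the target structure

def transfer(rep: Representation, schemas: list[Schema], target_rs: str) -> Representation:
    """Apply the first schema whose pattern matches, yielding a target
    representation for which that schema guarantees the desired relation."""
    for s in schemas:
        if s.source_rs == rep.system and s.target_rs == target_rs and s.applies(rep.structure):
            return Representation(target_rs, s.build_target(rep.structure))
    raise ValueError("no applicable schema")

# Toy example: transfer 'a + b' from an arithmetic-string RS to a dot-diagram
# RS, preserving semantic equivalence (the same quantity is denoted).
sum_schema = Schema(
    source_rs="arithmetic",
    target_rs="dot-diagram",
    applies=lambda expr: isinstance(expr, str) and "+" in expr,
    build_target=lambda expr: "• " * sum(int(t) for t in expr.split("+")),
)

source = Representation("arithmetic", "3 + 2")
target = transfer(source, [sum_schema], "dot-diagram")
print(target)  # Representation(system='dot-diagram', structure='• • • • • ')
```

In this toy setting the schema plays the role the abstract describes: it carries the knowledge of how information is preserved between the two systems, so the generated target representation is guaranteed to stand in the specified relation (same denoted quantity) to the source.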
Similar Papers
Cross-Model Semantics in Representation Learning
Machine Learning (CS)
Makes AI models share knowledge better.
Understanding Learning Dynamics Through Structured Representations
Machine Learning (CS)
Makes AI learn faster and smarter with fewer mistakes.