MoMoE: A Mixture of Expert Agent Model for Financial Sentiment Analysis

Published: November 17, 2025 | arXiv ID: 2511.13983v1

By: Peng Shu, Junhao Chen, Zhengliang Liu, and more

Potential Business Impact:

Makes AI systems more capable by having specialized expert models collaborate and iteratively refine each other's outputs.

Business Areas:
MOOC Education, Software

We present a novel approach called Mixture of Mixture of Expert (MoMoE) that combines the strengths of Mixture-of-Experts (MoE) architectures with collaborative multi-agent frameworks. By modifying the LLaMA 3.1 8B architecture to incorporate MoE layers in each agent of a layered collaborative structure, we create an ensemble of specialized expert agents that iteratively refine their outputs. Each agent leverages an MoE layer in its final attention block, enabling efficient task decomposition while maintaining computational feasibility. This hybrid approach creates specialized pathways through both the model architecture and the agent collaboration layers. Experimental results demonstrate significant improvements across multiple language understanding and generation benchmarks, highlighting the synergistic benefits of combining expert routing at both the neural and agent levels.
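The abstract combines two ideas: an MoE layer inside each agent's final transformer block, and a layered loop in which agents iteratively refine a shared answer. Below is a minimal sketch of both ideas, not the authors' code: the class and function names (MoELayer, layered_refinement), the expert/top-k settings, and the prompt-concatenation scheme are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation) of:
# (1) a token-level mixture-of-experts feed-forward layer of the kind that could
#     replace the MLP in a transformer's final block, and
# (2) a layered collaboration loop in which agents refine each other's drafts.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Route each token to its top-k expert MLPs and mix their outputs."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, d_model)
        gate_logits = self.router(x)                      # (batch, seq, n_experts)
        weights, idx = torch.topk(gate_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                   # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


def layered_refinement(agents, prompt: str, n_rounds: int = 2) -> str:
    """Each round, every agent sees the prompt plus the previous round's drafts
    and produces a revised answer; the final agent's output is returned.
    `agents` is any list of callables mapping a string to a string."""
    drafts = [agent(prompt) for agent in agents]
    for _ in range(n_rounds):
        context = prompt + "\n\nPrevious drafts:\n" + "\n---\n".join(drafts)
        drafts = [agent(context) for agent in agents]
    return drafts[-1]
```

In this reading, each agent is a LLaMA-style model whose last block uses an MoE feed-forward layer like the one above, so routing happens both inside the network (token-to-expert) and across the collaboration layers (task-to-agent); the exact routing and aggregation rules used in the paper may differ.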

Page Count
12 pages

Category
Computer Science:
Computational Engineering, Finance, and Science