TAMEing Long Contexts in Personalization: Towards Training-Free and State-Aware MLLM Personalized Assistant
By: Rongpei Hong, Jian Lang, Ting Zhong, and more
Potential Business Impact:
Helps AI remember and talk about your specific things.
Multimodal Large Language Model (MLLM) personalization is a critical research problem: it enables personalized dialogues with MLLMs about specific entities known as personalized concepts. However, existing methods and benchmarks focus on simple, context-agnostic visual identification and textual substitution of a personalized concept (e.g., "A yellow puppy" -> "Your puppy Mochi"), overlooking the ability to support long-context conversations. An ideal personalized MLLM assistant should engage in long-context dialogues with humans and continually improve the interaction experience by learning from past dialogue history. To bridge this gap, we propose LCMP, the first Long-Context MLLM Personalization evaluation benchmark. LCMP assesses how well MLLMs perceive variations of personalized concepts and generate contextually appropriate personalized responses that reflect those variations. As a strong baseline for LCMP, we introduce TAME, a novel training-free and state-aware framework. TAME endows MLLMs with dual memories that manage the temporal and persistent variations of each personalized concept in a differentiated manner. In addition, TAME incorporates a new training-free Retrieve-then-Align Augmented Generation (RA2G) paradigm. RA2G adds an alignment step that extracts, from the knowledge retrieved across the memories, the information that fits the context of the current question, enabling better handling of complex real-world user queries. Experiments on LCMP demonstrate that TAME achieves the best performance, showing a remarkable and continually improving interaction experience in long-context scenarios.
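The abstract describes TAME only at a high level, so the following is a minimal, hypothetical Python sketch of how a dual-memory, retrieve-then-align loop could be wired together. All names here (ConceptMemory, retrieve_then_align, the toy bag-of-words embed, the similarity threshold) are illustrative assumptions, not the paper's actual implementation or API.

```python
# Hypothetical sketch of a training-free dual-memory + retrieve-then-align loop.
# Names and thresholds are assumptions for illustration, not TAME's real code.
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy bag-of-words embedding standing in for a real multimodal encoder."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class ConceptMemory:
    """Per-concept store split into temporal (short-lived) and persistent entries."""

    def __init__(self):
        self.temporal = []    # e.g. "Mochi is wearing a red raincoat today"
        self.persistent = []  # e.g. "Mochi is a yellow puppy"

    def write(self, note: str, persistent: bool = False):
        (self.persistent if persistent else self.temporal).append(note)

    def retrieve(self, question: str, k: int = 2):
        """Retrieve top-k entries from each memory, ranked by similarity to the question."""
        q = embed(question)
        rank = lambda notes: sorted(notes, key=lambda n: cosine(embed(n), q), reverse=True)[:k]
        return rank(self.temporal) + rank(self.persistent)


def retrieve_then_align(memory: ConceptMemory, question: str, threshold: float = 0.1):
    """Alignment step: keep only retrieved notes that actually fit the current question."""
    q = embed(question)
    return [n for n in memory.retrieve(question) if cosine(embed(n), q) >= threshold]


def answer(memory: ConceptMemory, question: str) -> str:
    """Assemble the prompt a frozen MLLM would receive (the model call is stubbed out)."""
    context = retrieve_then_align(memory, question)
    return f"Context: {context}\nQuestion: {question}"


if __name__ == "__main__":
    mochi = ConceptMemory()
    mochi.write("Mochi is a yellow puppy", persistent=True)
    mochi.write("Mochi is wearing a red raincoat today")
    print(answer(mochi, "What is Mochi wearing today?"))
```

The point of the sketch is the separation of concerns suggested by the abstract: temporal and persistent states are stored and retrieved separately, and an explicit alignment filter sits between retrieval and generation so that only question-relevant memory entries reach the frozen MLLM.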
Similar Papers
Enabling Personalized Long-term Interactions in LLM-based Agents through Persistent Memory and User Profiles
Artificial Intelligence
AI remembers you for better conversations.
A Survey of Personalized Large Language Models: Progress and Future Directions
Artificial Intelligence
Makes AI understand and talk like you.
M-CALLM: Multi-level Context Aware LLM Framework for Group Interaction Prediction
Human-Computer Interaction
Helps computers guess what groups will do together.