Reconsidering Conversational Norms in LLM Chatbots for Sustainable AI
By: Ronnie de Souza Santos, Cleyton Magalhães, Italo Santos
LLM-based chatbots have become central interfaces in technical, educational, and analytical domains, supporting tasks such as code reasoning, problem solving, and information exploration. As these systems scale, sustainability concerns have intensified, with most assessments focusing on model architecture, hardware efficiency, and deployment infrastructure. However, existing mitigation efforts largely overlook how user interaction practices themselves shape the energy profile of LLM-based systems. In this vision paper, we argue that interaction-level behavior is an underexamined factor shaping the environmental impact of LLM-based systems, and we present this issue across four dimensions. First, extended conversational patterns increase token production and raise the computational cost of inference. Second, expectations of instant responses limit opportunities for energy-aware scheduling and workload consolidation. Third, everyday user habits contribute to cumulative operational demand in ways that are rarely quantified. Fourth, the accumulation of context increases memory requirements and reduces the efficiency of long-running dialogues. Addressing these challenges requires rethinking how chatbot interactions are designed and conceptualized, and adopting perspectives that recognize sustainability as partly dependent on the conversational norms through which users engage with LLM-based systems.
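The first and fourth dimensions share a mechanism worth making concrete: with a stateless chat API, each turn's prompt typically re-sends the full conversation history, so total processed tokens grow superlinearly in the number of turns. The sketch below is illustrative only and is not from the paper; the token counts per message are hypothetical assumptions.

```python
# Illustrative sketch (not from the paper): how re-sending conversation
# history each turn inflates the total input tokens an LLM must process.
# The per-message token counts below are hypothetical assumptions.

def total_prompt_tokens(turns: int, user_tokens: int, reply_tokens: int) -> int:
    """Total input tokens processed when each turn's prompt includes the
    full prior history (a common stateless chat-API pattern)."""
    history = 0  # tokens accumulated in the conversation so far
    total = 0    # tokens the model has processed across all turns
    for _ in range(turns):
        history += user_tokens   # user message joins the context
        total += history         # model re-processes the entire context
        history += reply_tokens  # model reply also joins the context
    return total

short = total_prompt_tokens(turns=3, user_tokens=50, reply_tokens=200)
long = total_prompt_tokens(turns=30, user_tokens=50, reply_tokens=200)
# A 10x longer conversation costs far more than 10x the processed tokens,
# because context accumulation makes per-turn cost grow with dialogue length.
print(short, long, long / short)
```

Under these assumed numbers, the thirty-turn dialogue processes over a hundred times the tokens of the three-turn one, which is the kind of interaction-level multiplier the abstract points to.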