MetaLLMiX: An XAI Aided LLM-Meta-learning Based Approach for Hyper-parameter Optimization
By: Mohammed Tiouti, Mohamed Bal-Ghaoui
Potential Business Impact:
Finds the best settings for AI models faster.
Effective model and hyperparameter selection remains a major challenge in deep learning, often requiring extensive expertise and computation. While AutoML and large language models (LLMs) promise automation, current LLM-based approaches rely on trial and error and expensive APIs, offering limited interpretability and generalizability. We propose MetaLLMiX, a zero-shot hyperparameter optimization framework that combines meta-learning, explainable AI, and efficient LLM reasoning. By leveraging historical experiment outcomes enriched with SHAP explanations, MetaLLMiX recommends optimal hyperparameters and pretrained models without additional trials. We further employ an LLM-as-judge evaluation to control output format, accuracy, and completeness. Experiments on eight medical imaging datasets using nine open-source lightweight LLMs show that MetaLLMiX achieves performance competitive with or superior to traditional HPO methods while drastically reducing computational cost. Our local deployment outperforms prior API-based approaches, achieving optimal results on 5 of 8 tasks, reducing response time by 99.6-99.9%, and delivering the fastest training times on 6 of 8 datasets (2.4-15.7x faster), while maintaining accuracy within 1-5% of the best-performing baselines.
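The core idea in the abstract, formatting historical experiment outcomes and their SHAP attributions into a zero-shot prompt for a local LLM, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the meta-dataset fields, the hard-coded SHAP-style importance scores, and the prompt wording are all assumptions, and in MetaLLMiX the attributions would come from a real SHAP explainer fitted on the meta-dataset.

```python
# Hypothetical sketch: turn past HPO runs plus per-hyperparameter
# SHAP-style attributions into a zero-shot recommendation prompt.
# All names and values below are illustrative, not from the paper.

past_runs = [
    {"model": "ResNet50", "lr": 1e-3, "batch_size": 32, "accuracy": 0.91},
    {"model": "DenseNet121", "lr": 1e-4, "batch_size": 16, "accuracy": 0.94},
]

# Placeholder importance scores standing in for SHAP values computed
# over the meta-dataset of historical experiments.
shap_summary = {"lr": 0.42, "batch_size": 0.13, "model": 0.31}

def build_prompt(dataset_desc, runs, shap_vals):
    """Assemble a zero-shot prompt from meta-learning history and attributions."""
    lines = [f"Task: recommend hyperparameters for: {dataset_desc}",
             "Past experiments:"]
    for r in runs:
        lines.append(f"  {r}")
    lines.append("Hyperparameter importance (SHAP):")
    # Present the most influential hyperparameters first.
    for name, score in sorted(shap_vals.items(), key=lambda kv: -kv[1]):
        lines.append(f"  {name}: {score:.2f}")
    lines.append("Answer with JSON containing: model, lr, batch_size.")
    return "\n".join(lines)

prompt = build_prompt("chest X-ray binary classification", past_runs, shap_summary)
print(prompt)
```

The resulting prompt would then be sent to a locally hosted lightweight LLM, and a second LLM-as-judge pass would check the response's format, accuracy, and completeness before the recommendation is accepted.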