Efficient and Green Code LLMs: Happier Software Engineers, Happier Planet
Speaker: David Lo – Singapore, Singapore
Topic(s): Artificial Intelligence, Machine Learning, Computer Vision, Natural Language Processing, Software Engineering and Programming
Abstract
Many have been excited about the potential of code Large Language Models (code LLMs). However, code LLMs are large, slow, and energy-hungry compared to traditional automated software engineering (ASE) solutions, which raises usability and sustainability concerns. This is especially true when we want to deploy them in IDEs on local devices, which is often the preferred setting.
This lecture will highlight three strategies (the "3S": Stop, Simplify, and Shrink) to improve the efficiency and energy consumption of code LLMs. For the "Stop" strategy, the lecture will present FrugalCoder, the first solution that efficiently "appraises" the potential outcome before running a code LLM, preventing costly but unfruitful code LLM executions. For the "Simplify" strategy, the lecture will present SimPy, the first code-LLM-oriented programming language grammar; its simple structure captures pertinent semantics succinctly, allowing code LLMs to be more efficient while retaining similar efficacy. For the "Shrink" strategy, the lecture will discuss Avatar, which combines constraint solving, metaheuristic search, and knowledge distillation to create a much smaller, more efficient, and energy-saving model.
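As an illustration of the "Stop" idea, a completion request can be gated by a cheap estimator that predicts whether the expensive code LLM is likely to produce a useful result. The sketch below is hypothetical and not FrugalCoder's actual method: the `appraise` heuristic (token count plus code-keyword hits) and all names are invented for illustration.

```python
# Hypothetical sketch of a "Stop"-style gate: a lightweight estimator
# decides whether invoking the costly code LLM is likely to pay off.
# The scoring heuristic is illustrative only, not FrugalCoder's.

def appraise(prompt: str) -> float:
    """Return a crude score in [0, 1] estimating completion quality."""
    tokens = prompt.split()
    if not tokens:
        return 0.0
    # Assumption: longer prompts containing code-like keywords are
    # more likely to yield an acceptable completion.
    keywords = {"def", "class", "return", "import", "for", "while"}
    hits = sum(1 for t in tokens if t in keywords)
    return min(1.0, 0.1 * len(tokens) + 0.2 * hits)

def maybe_complete(prompt: str, llm, threshold: float = 0.5):
    """Run the costly LLM only when the appraisal clears the threshold."""
    if appraise(prompt) < threshold:
        return None  # "Stop": skip an execution unlikely to be fruitful
    return llm(prompt)

# Usage: a stub stands in for the real model.
stub_llm = lambda p: p + " ...completed"
print(maybe_complete("x", stub_llm))                      # gated out: None
print(maybe_complete("def add(a, b): return", stub_llm))  # passes the gate
```

The design point is that the estimator must be orders of magnitude cheaper than the model it guards, so even a modest rejection rate saves net time and energy.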
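For the "Shrink" strategy, one of the ingredients named above, knowledge distillation, trains a small student model to match a large teacher's output distribution. The minimal sketch below shows the standard temperature-softened distillation loss in pure Python; it is a generic textbook formulation, not Avatar's specific training objective.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution, softened by T."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The usual T^2 factor keeps gradient magnitudes comparable
    across temperatures (Hinton et al.'s formulation).
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature * temperature

# A student that already matches the teacher incurs (near-)zero loss.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # ~0.0
print(kd_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))  # positive
```

Minimizing this loss (typically mixed with the ordinary cross-entropy on ground-truth labels) is what lets a much smaller, cheaper student approximate the teacher's behavior.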
The lecture will conclude with a vision of what the future can be with efficient and green code LLMs, and a call to action for more research in this direction to make both software engineers and our planet happier.
About this Lecture
Number of Slides: 60 - 80
Duration: 60 - 75 minutes
Languages Available: English