      Is Open Access

      Energy-Aware LLMs: A step towards sustainable AI for downstream applications

      Preprint


          Abstract

          Advanced Large Language Models (LLMs) have revolutionized various fields, including communication networks, sparking a wave of innovation that has led to new applications and services and significantly enhanced solution schemes. Despite these impressive developments, most LLMs require huge computational resources, resulting in very high energy consumption. This study therefore proposes an end-to-end pipeline that investigates the trade-off between energy efficiency and model performance for an LLM applied to fault ticket analysis in communication networks. It further evaluates the pipeline's performance on two real-world datasets for the tasks of root cause analysis and response feedback in a communication network. Our results show that an appropriate combination of quantization and pruning techniques can reduce energy consumption while significantly improving model performance.
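          The abstract's key levers, quantization and pruning, can be illustrated with a minimal sketch. The snippet below (an assumption for illustration, not the paper's actual pipeline or models) applies magnitude pruning to a toy weight matrix and then symmetric per-tensor int8 quantization; the matrix size, sparsity, and bit width are illustrative choices only.

```python
import numpy as np

def prune_by_magnitude(w: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning)."""
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) <= threshold, 0.0, w)

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor int8 quantization; returns integer codes and a scale."""
    max_abs = np.abs(w).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

# Toy stand-in for one LLM weight matrix (hypothetical, for illustration only).
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)

pruned = prune_by_magnitude(w, sparsity=0.5)   # half the weights set to zero
q, scale = quantize_int8(pruned)               # 8-bit codes + dequantization scale
dequant = q.astype(np.float32) * scale         # reconstruction used at inference

print("nonzero weights after pruning:", np.count_nonzero(pruned))
print("max dequantization error:", float(np.abs(dequant - pruned).max()))
```

In a real pipeline both steps would be applied to the model's layers via a framework's compression utilities; the point here is only that sparsity reduces the work per forward pass while low-bit weights reduce memory traffic, which together drive the energy savings the paper measures.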


          Author and article information

          Journal
          22 March 2025
          Article
          2503.17783

          http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          History
          Custom metadata
          This work has been submitted to V. International Conference on Electrical, Computer and Energy Technologies (ICECET 2025) for possible publication
          cs.PF cs.AI cs.CL cs.LG

          Theoretical computer science, Performance, Systems & Control, Artificial intelligence
