Contact person
Joakim Nivre
Researcher
Project to explore resource-efficient methods in artificial intelligence (AI), specifically natural language processing (NLP).
Objective
The project aims to address the increasing energy consumption and carbon footprint of large-scale AI/NLP models by exploring resource-efficient methods. The goal is to reduce the computational resources required without compromising performance.
Content
The project focuses on developing and testing methods for resource-efficient training and deployment of large language models. By combining more data-efficient methods for training with model compression for deployment, we aim to reduce resource requirements over the entire life-cycle of an AI/NLP model. In addition, we will create a novel conceptual framework for life-cycle analysis of resource efficiency.
Methodology
For data-efficient training we will explore a generative-discriminative approach that can be made efficient thanks to the technique of non-residual attention developed by researchers at RISE. For resource-efficient deployment we will investigate task-agnostic and task-specific distillation methods. For the evaluation of resource efficiency, we will measure running time, power consumption, and number of floating-point operations, and correlate these measurements with different product measures, including product-as-performance and product-as-output.
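To make the distillation part of the methodology concrete, below is a minimal sketch of task-specific knowledge distillation. It is an illustration only, not the project's implementation: PyTorch is assumed, and `student`, `teacher`, `inputs`, `labels`, and `optimizer` are hypothetical placeholders supplied by the caller.

```python
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, inputs, labels, optimizer,
                      temperature=2.0, alpha=0.5):
    """One training step that blends the hard-label task loss with a
    soft-label term pushing the student toward the teacher's outputs."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(inputs)   # soft targets, no gradients needed
    student_logits = student(inputs)

    # Cross-entropy on the gold labels: the task-specific signal.
    task_loss = F.cross_entropy(student_logits, labels)

    # KL divergence between temperature-softened distributions:
    # the distillation signal from the larger teacher model.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    loss = alpha * task_loss + (1.0 - alpha) * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Task-agnostic distillation follows the same pattern but replaces the labelled task loss with a self-supervised objective (for example masked language modelling) computed over unlabelled text.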
Impact
The project is expected to take concrete steps towards reducing energy consumption and optimizing resource utilization in the AI field. We want to pave the way for a more sustainable and resource-efficient future for AI and support companies and organizations in their transition to AI-driven solutions.
Project name: Resource-Efficient AI
Status: Active
Role in project: Project management and implementation
Duration: 1 year
Budget: 1 MSEK
Project members: Joakim Nivre, Fredrik Carlsson, Luise Dürlich, Evangelia Gogoulou