
Master's thesis: Decentralized deep learning and continual learning

Introduction
The study of deep learning on decentralized data has long been of interest. Modern mobile devices are equipped with sensors that can collect a multitude of data streams related to people's everyday lives, providing a promising source of training data. With the global spread of smartphones and other mobile devices, the immense increase in the amount of data collected has highlighted the need for responsible data mining. Such data requires special care, not only due to increasing privacy concerns among users but also due to privacy regulations.

Distributed data collection unlocks the potential of data that would otherwise be difficult or impossible to obtain. Existing solutions (e.g. federated learning) generally assume that the data is drawn from a stationary underlying distribution, and that the data of the past is similar to the data of the future. However, in practice, different clients often have different types of biases and their data may not always be independently and identically distributed (iid). This results in differing client data distributions (the non-iid data paradigm), which hinders efficient training of deep learning models.
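As a concrete reference point, the sketch below shows a minimal federated averaging (FedAvg) loop in the spirit of the "Communication-Efficient Learning of Deep Networks from Decentralized Data" paper listed under the reading material. The model, the synthetic non-iid client data, and all hyperparameters are illustrative assumptions, not the setup that will be used in the thesis.

```python
# Minimal FedAvg sketch on synthetic, deliberately non-iid client data.
# Everything here (model, data, hyperparameters) is a toy assumption.
import copy
import torch
import torch.nn as nn

def make_client_data(n_clients=4, n_samples=64):
    """Each client draws data from its own shifted distribution (non-iid)."""
    clients = []
    for c in range(n_clients):
        x = torch.randn(n_samples, 10) + c                     # client-specific input shift
        y = (x.sum(dim=1, keepdim=True) > 5 * c).float()       # client-specific labels
        clients.append((x, y))
    return clients

def local_update(global_model, data, epochs=1, lr=0.1):
    """Copy the global model, train on one client's data, return the new weights."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    x, y = data
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model.state_dict()

def fed_avg(n_rounds=10):
    global_model = nn.Linear(10, 1)
    clients = make_client_data()
    for _ in range(n_rounds):
        local_states = [local_update(global_model, data) for data in clients]
        # Server step: average the client weights parameter by parameter
        # (equal weighting, since the toy clients hold equally many samples).
        avg_state = {
            key: torch.stack([s[key] for s in local_states]).mean(dim=0)
            for key in local_states[0]
        }
        global_model.load_state_dict(avg_state)
    return global_model

if __name__ == "__main__":
    fed_avg()
```

When client distributions differ strongly, this plain averaging step is exactly what tends to break down, which is one motivation for the clustering and drift-aware methods in the reading list.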

An underexplored research problem is continual learning in decentralized settings, where the goal is to study deep learning on non-stationary distributions. It is still unclear how to mitigate catastrophic forgetting and learn useful representations as distributions shift in a decentralized setting.
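For orientation, the sketch below shows one common continual learning baseline, experience replay with a small memory buffer, applied to a synthetic stream of shifted tasks. It is only one of many approaches (the reading list includes "Learning without Forgetting", a distillation-based alternative); the task stream, buffer size, and model are illustrative assumptions, not the thesis method.

```python
# Minimal experience-replay sketch for mitigating catastrophic forgetting.
# The task stream, model and buffer size are toy assumptions.
import random
import torch
import torch.nn as nn

def train_on_task_stream(tasks, buffer_size=200, replay_batch=32, lr=0.05):
    """Train sequentially on tasks, mixing in replayed samples from earlier tasks."""
    model = nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    memory = []  # stores (x, y) pairs seen in previous tasks

    for x_task, y_task in tasks:
        for x, y in zip(x_task, y_task):
            batch_x, batch_y = [x], [y]
            # Replay a few stored samples so earlier tasks are not forgotten.
            if memory:
                for mx, my in random.sample(memory, min(replay_batch, len(memory))):
                    batch_x.append(mx)
                    batch_y.append(my)
            bx, by = torch.stack(batch_x), torch.stack(batch_y)
            opt.zero_grad()
            loss_fn(model(bx), by).backward()
            opt.step()
            # Keep the buffer bounded by overwriting a random old sample once full.
            if len(memory) < buffer_size:
                memory.append((x, y))
            else:
                memory[random.randrange(buffer_size)] = (x, y)
    return model

if __name__ == "__main__":
    # Two synthetic tasks with shifted input distributions (a simple distribution shift).
    tasks = [
        (torch.randn(100, 10), (torch.rand(100, 1) > 0.5).float()),
        (torch.randn(100, 10) + 3.0, (torch.rand(100, 1) > 0.5).float()),
    ]
    train_on_task_stream(tasks)
```

In a decentralized setting, open questions include where such a memory should live, what it may contain given privacy constraints, and how it interacts with aggregation across clients.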

Thesis Description
In this master's thesis, you will work on decentralized learning and continual learning, investigating methods to learn useful deep learning representations when data distributions shift in time and space. You will work in close collaboration with our deep learning research group in Gothenburg. The work requires students skilled in machine learning and statistical inference. You will be expected to carry out a literature study to become familiar with the current state of the research field, then start with simpler models and eventually extend or build upon more advanced solutions. This will be a research-oriented master's thesis.

Supervisor: Edvin Listo Zec

Start date: Early 2023

Location: Gothenburg

Credits: 30 ECTS (högskolepoäng, Swedish higher education credits).

Who are you?
We expect you to have the following skills:

  • Experience implementing machine learning models.
  • Courses in mathematical statistics, probability theory or similar.
  • Programming skills, preferably with some experience of relevant frameworks such as PyTorch or TensorFlow.

Useful reading material
Communication-Efficient Learning of Deep Networks from Decentralized Data

Decentralized adaptive clustering of deep nets is beneficial for client collaboration

Federated Learning under Distributed Concept Drift

Learning without Forgetting

Welcome with your application!
If you want to know more, please contact edvin.listo.zec@ri.se. Candidates are encouraged to send in their application as soon as possible. The last day of application is 30 November. Suitable applicants will be interviewed as applications are received. Note that all applications for this position must go through our recruitment system Varbi. We do not accept applications by email.

Our union representatives are Lazaros Tsantaridis, SACO, 010-516 62 21, and Bertil Svensson, Unionen, 010-516 53 56.

About the job

Location

Gothenburg

Type of employment

Fixed-term employment, 3-6 months

Job type

Student - thesis project/internship

Contact person

Edvin Listo Zec
edvin.listo.zec@ri.se

Reference number

2022/551

Last application date

2022-11-30
