
Tactic: Consider Knowledge Distillation

Tactic sort: Awesome Tactic
Type: Architectural Tactic
Category: green-ml-enabled-systems
Tags: machine-learning, model-optimization

Title

Consider Knowledge Distillation

Description

Knowledge distillation is a technique in which a large, complex model (the teacher) is used to train a smaller, simpler model (the student). The goal is to transfer the knowledge learned by the teacher to the student, allowing the student to achieve comparable performance while requiring fewer computational resources. Evaluations covering both accuracy and energy consumption show that knowledge distillation can preserve accuracy while reducing energy use.
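As a minimal sketch (not part of the original catalog entry), the core of knowledge distillation is the student's training loss: a temperature-softened KL-divergence term toward the teacher's outputs, combined with ordinary cross-entropy on the hard labels. The function names and hyperparameter values below are illustrative assumptions, written in plain NumPy:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T produces softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Combined distillation loss (illustrative):
    alpha * KL(teacher || student) on T-softened distributions
    + (1 - alpha) * cross-entropy against the ground-truth labels."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence between softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al.).
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    soft_loss = (T ** 2) * kl.mean()
    # Standard cross-entropy on the hard labels (T = 1).
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_loss + (1 - alpha) * ce
```

In practice the student is trained by minimizing this loss over batches, with the teacher's logits precomputed or produced by a frozen forward pass; when the student matches the teacher exactly, the KL term vanishes and only the hard-label term remains.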

Participant

Data Scientist

Related software artifact

Machine Learning Algorithm

Context

Machine Learning

Software feature

Knowledge Distillation

Tactic intent

Improve energy efficiency by applying knowledge distillation to pre-trained models that are too big for a given task

Target quality attribute

Performance

Other related quality attributes

Accuracy, Energy Efficiency

Measured impact

< unknown >

Source

Shanbhag, S., Chimalakonda, S., Sharma, V. S., & Kaulgud, V. (2022). Towards a Catalog of Energy Patterns in Deep Learning Development. In Proceedings of the International Conference on Evaluation and Assessment in Software Engineering (EASE 2022), pp. 150-159. [DOI](https://doi.org/10.1145/3530019.3530035); Yang, H., Zhu, Y., & Liu, J. (2019). Energy-Constrained Compression for Deep Neural Networks via Weighted Sparse Projection and Layer Input Masking. International Conference on Learning Representations (ICLR 2019). [DOI](https://doi.org/10.48550/arXiv.1806.04321)


Graphical representation

  • Contact person: Patricia Lago (VU Amsterdam)
  • Email: disc at vu.nl
  • Website: patricialago.nl

The Archive of Awesome and Dark Tactics (AADT) is an initiative of the Digital Sustainability Center (DiSC). It received funding from the VU Amsterdam Sustainability Institute, and is maintained by the S2 Group of the Vrije Universiteit Amsterdam.

Initial development of the Archive of Awesome and Dark Tactics by Robin van der Wiel