Tactic: Consider Knowledge Distillation
Tactic sort: Awesome Tactic
Type: Architectural Tactic
Category: green-ml-enabled-systems
Title
Consider Knowledge Distillation
Description
Knowledge distillation is a technique in which a large, complex model (the teacher) is used to train a smaller, simpler model (the student). The goal is to transfer the knowledge learned by the teacher to the student, allowing the student to achieve comparable performance while requiring fewer computational resources. When both accuracy and energy consumption are taken into account, knowledge distillation improves overall performance.
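As an illustration, the sketch below shows the classic soft-target distillation loss, assuming PyTorch; the function name distillation_loss and the temperature and alpha defaults are illustrative choices, not part of this catalog entry.

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend the teacher's softened predictions with the hard labels."""
    # Soften both output distributions with the same temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence pulls the student toward the teacher's distribution;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * (temperature ** 2)
    # Ordinary cross-entropy against the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term


# Toy check with random logits: a batch of 4 examples, 10 classes.
student_logits = torch.randn(4, 10, requires_grad=True)
teacher_logits = torch.randn(4, 10)   # produced by the frozen teacher
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()                       # gradients flow only to the student
```

During training only the student's parameters are updated; the teacher runs in inference mode purely to supply soft targets, so its computational cost is confined to the one-off distillation phase while the cheaper student serves all subsequent inference requests.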
Participant
Data Scientist
Related software artifact
Machine Learning Algorithm
Context
Machine Learning
Software feature
Knowledge Distillation
Tactic intent
Improve energy efficiency by applying knowledge distillation to pre-trained models when they are too large for the given task
Target quality attribute
Performance
Other related quality attributes
Accuracy, Energy Efficiency
Measured impact
< unknown >