Tactic: Consider Energy-Aware Pruning
Tactic sort: Awesome Tactic
Type: Architectural Tactic
Category: green-ml-enabled-systems
Title
Consider Energy-Aware Pruning
Description
In machine learning, pruning refers to the process of reducing the size and complexity of an ML model by removing unnecessary or less important components, such as weights. In energy-aware pruning, the energy consumption of a neural network is used to guide the pruning process so that it optimizes for energy efficiency. Using the estimated energy of each layer in a CNN model, the algorithm prunes layer by layer, starting with the layers with the highest energy consumption and proceeding to the layers with the lowest. Within each layer, it removes the weights that have the smallest joint impact on the output feature maps.
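The layer ordering and weight-removal steps described above can be sketched in a few lines. This is a minimal illustration, not the published algorithm: the energy estimate is a hypothetical MAC-count proxy, and weight magnitude stands in for the "smallest joint impact on the output feature maps" criterion, which in practice requires analyzing the output feature maps themselves.

```python
import numpy as np

def estimate_layer_energy(weights, n_inputs):
    # Hypothetical energy proxy: multiply-accumulate count plus the data
    # movement implied by the nonzero weights. A real estimator would model
    # the target hardware's computation and memory-access energy.
    return weights.size * n_inputs + np.count_nonzero(weights)

def prune_layer(weights, sparsity):
    # Zero out the fraction `sparsity` of weights with the smallest
    # magnitude -- a simple stand-in for "smallest joint impact".
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def energy_aware_prune(layers, input_sizes, sparsity=0.5):
    # Prune layers in order of estimated energy, highest first.
    energies = [estimate_layer_energy(w, n) for w, n in zip(layers, input_sizes)]
    order = np.argsort(energies)[::-1]
    pruned = list(layers)
    for i in order:
        pruned[i] = prune_layer(layers[i], sparsity)
    return pruned
```

With a fixed per-layer sparsity the visiting order does not change the result; in the actual tactic the order matters because pruning decisions (and retraining) for the most energy-hungry layers are made first, so the largest energy savings are locked in early.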
Participant
Data Scientist
Related software artifact
Machine Learning Algorithm
Context
Machine Learning
Software feature
< unknown >
Tactic intent
Improve energy efficiency by pruning the weights with the smallest joint impact on the output
Target quality attribute
Energy Efficiency
Other related quality attributes
Accuracy
Measured impact
The energy-aware pruning method reduces the energy consumption of the pruned model, at the potential cost of a small loss in accuracy.