Tactic: Limit Ensemble Size
Tactic sort: Awesome Tactic
Type: Architectural Tactic
Category: green-ml-enabled-systems
Title
Limit Ensemble Size
Description
Limiting the number of base learners in an ensemble significantly reduces energy consumption, often with negligible loss of accuracy. If accuracy does drop, experiment incrementally to find the smallest ensemble size that still yields acceptable accuracy. In the underlying study, an ensemble size of 2-3 was the appropriate limit; a sketch of the incremental search follows.
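A minimal sketch of that incremental search, assuming a scikit-learn bagging ensemble over decision trees. The synthetic dataset, the baseline ensemble size of 10, and the 1% F1 tolerance are illustrative assumptions, not taken from the study.

```python
# Hypothetical sketch: grow a bagging ensemble one base learner at a time and
# stop at the smallest size whose F1-score is acceptable.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: a deliberately large ensemble defines "acceptable" accuracy.
big = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10,
                        random_state=0).fit(X_train, y_train)
target_f1 = f1_score(y_test, big.predict(X_test))

# Keep the smallest size whose F1 is within a small tolerance of the baseline.
for size in range(2, 11):
    clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=size,
                            random_state=0).fit(X_train, y_train)
    f1 = f1_score(y_test, clf.predict(X_test))
    print(f"size={size}: F1={f1:.3f}")
    if f1 >= target_f1 - 0.01:  # assumed 1% accuracy tolerance
        print(f"Smallest acceptable ensemble size: {size}")
        break
```

Energy consumption grows roughly linearly with the number of base learners trained, so stopping the search at the first acceptable size directly bounds the training cost.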
Participant
Machine Learning Practitioners and Researchers.
Related software artifact
Machine Learning ensemble.
Context
Ensemble Learning.
Software feature
Ensemble Size.
Tactic intent
Reduce energy used for training and inference by reducing the number of base models while maintaining accuracy.
Target quality attribute
Energy Efficiency.
Other related quality attributes
Accuracy.
Measured impact
An ensemble of size 2 consumes 37.49% less energy than an ensemble of size 3, and an ensemble of size 3 consumes 26.96% less energy than an ensemble of size 4. The average F1-scores for ensemble sizes 2, 3, and 4 were 0.782, 0.774, and 0.780, respectively, indicating that varying the number of models in this range does not meaningfully affect accuracy. Energy was measured in joules (J) for training all base models, the optional meta-model, and fusion-time inference; accuracy was measured as F1-score.
