Tactic: Reduce Number of Data Features
Tactic sort: Awesome Tactic
Type: Architectural Tactic
Category: green-ml-enabled-systems
Title
Reduce Number of Data Features
Description
A large number of data features can lead to high computing power requirements for training and inference. Typically, machine learning scenarios involve a huge number of features or variables that describe the input data. However, not all of these features are necessary for the model to make accurate predictions. Therefore, reducing the number of data features can improve energy efficiency while still maintaining accuracy. This can be achieved by selecting only a subset of the available data features, as sketched below.
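The following is a minimal sketch of this tactic, assuming a scikit-learn workflow; the synthetic dataset, the choice of model, and the value of k are illustrative assumptions, not part of the tactic itself.

```python
# Sketch: select a subset of features before training, assuming scikit-learn.
# Dataset, model, and k are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic data: 100 features, only a fraction of which are informative.
X, y = make_classification(n_samples=5000, n_features=100,
                           n_informative=15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Keep only the 15 highest-scoring features (k is a tunable assumption).
selector = SelectKBest(score_func=f_classif, k=15)
X_train_small = selector.fit_transform(X_train, y_train)
X_test_small = selector.transform(X_test)

# Train on the reduced feature set; fewer input features mean less compute
# per training step and per inference.
model = LogisticRegression(max_iter=1000).fit(X_train_small, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test_small)))
```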
Participant
Data Scientist
Related software artifact
Data
Context
Machine Learning
Software feature
< unknown >
Tactic intent
Enhance energy efficiency by reducing the number of data features, i.e., by choosing only a subset of all the available features
Target quality attribute
Energy Efficiency
Other related quality attributes
Accuracy, Data Representativeness
Measured impact
Reducing the number of input features can reduce energy consumption while still maintaining accuracy.