All Tags
- AWS
- algorithm-design
- architecture
- cloud-principles
- cost-reduction
- data-centric
- data-compression
- data-processing
- deployment
- design
- edge-computing
- energy-footprint
- hardware
- libraries
- locality
- machine-learning
- management
- measured
- migration
- model-optimization
- model-training
- performance
- queries
- rebuilding
- scaling
- services
- strategies
- template
- workloads
Tactics tagged with "machine-learning"
- Apply Cloud Fog Network Architecture (AT)
- Apply Sampling Techniques (AT)
- Choose a Lightweight Algorithm Alternative (AT)
- Choose an Energy Efficient Algorithm (AT)
- Consider Energy-Aware Pruning (AT)
- Consider Federated Learning (AT)
- Consider Graph Substitution (AT)
- Consider Knowledge Distillation (AT) (see the sketch after this list)
- Consider Reinforcement Learning for Energy Efficiency (AT)
- Consider Transfer Learning (AT)
- Decrease Model Complexity (AT)
- Design for Memory Constraints (AT)
- Enhance Model Sparsity (AT)
- Minimize Referencing to Data (AT)
- Monitor Computing Power (AT)
- Reduce Number of Data Features (AT)
- Remove Redundant Data (AT)
- Retrain the Model If Needed (AT)
- Set Energy Consumption as a Model Constraint (AT)
- Use Built-In Library Functions (AT)
- Use Checkpoints During Training (AT) (see the sketch after this list)
- Use Computation Partitioning (AT)
- Use Data Projection (AT)
- Use Dynamic Parameter Adaptation (AT)
- Use Energy-Aware Scheduling (AT)
- Use Energy-Efficient Hardware (AT)
- Use Informed Adaptation (AT)
- Use Input Quantization (AT)
- Use Power Capping (AT)
- Use Quantization-Aware Training (AT)
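
Each entry above links to its full catalogue description. As a quick illustration of the "Consider Knowledge Distillation" tactic, the minimal sketch below blends a softened teacher signal with the usual hard-label loss so that a smaller, cheaper student model can be trained. A PyTorch-style setup is assumed; the function name, temperature `T`, and weighting `alpha` are illustrative choices, not part of the catalogue entry.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft-target loss (teacher) with the hard-target cross-entropy."""
    # Soften both distributions with temperature T before comparing them,
    # and rescale by T*T as is conventional for distillation.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

The energy benefit comes from deploying (and repeatedly running inference with) the smaller student rather than the larger teacher.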
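
Similarly, for the "Use Checkpoints During Training" tactic, the sketch below saves and restores training state so an interrupted run can resume instead of repeating earlier epochs, which is where the energy saving comes from. A PyTorch model and optimizer are assumed; the helper names and file path are hypothetical.

```python
import torch

def save_checkpoint(model, optimizer, epoch, path="checkpoint.pt"):
    # Persist enough state to resume training without redoing earlier epochs.
    torch.save(
        {
            "epoch": epoch,
            "model_state": model.state_dict(),
            "optimizer_state": optimizer.state_dict(),
        },
        path,
    )

def load_checkpoint(model, optimizer, path="checkpoint.pt"):
    # Restore model and optimizer state; return the epoch to resume from.
    state = torch.load(path)
    model.load_state_dict(state["model_state"])
    optimizer.load_state_dict(state["optimizer_state"])
    return state["epoch"] + 1
```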