Abstract
Neural Architecture Search (NAS) methods seek optimal networks by exploring thousands of variants of a reference architecture. Yet optimality is typically defined in terms of prediction performance, overlooking the environmental impact of training. As a result, NAS search spaces are ill-suited to exploring trade-offs between performance and energy consumption. We contribute to energy-aware NAS with (i) CNNGen, a grammar-based Convolutional Neural Network generator that produces diverse architectures without relying on a reference one; (ii) a dataset of 1,300 architectures generated by CNNGen, released together with their implementations, energy consumption, and performance measurements; and (iii) three state-of-the-art predictors that remove the need for trained models when estimating performance and energy.
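To illustrate the grammar-based generation idea, below is a minimal sketch of sampling CNN architectures from a context-free grammar. The grammar symbols and layer names (`<model>`, `<block>`, `conv3x3`, etc.) are hypothetical placeholders, not CNNGen's actual grammar, which is richer.

```python
import random

# Toy context-free grammar over layer sequences (hypothetical; CNNGen's
# actual grammar covers many more layer types and hyperparameters).
GRAMMAR = {
    "<model>": [["<block>", "<model>"], ["<block>", "classifier"]],
    "<block>": [["conv3x3", "<activation>", "<pool>"],
                ["conv5x5", "<activation>"]],
    "<activation>": [["relu"], ["elu"]],
    "<pool>": [["maxpool"], ["avgpool"], []],
}

def expand(symbol, rng, depth=0, max_depth=6):
    """Recursively expand a grammar symbol into a flat list of layer names."""
    if symbol not in GRAMMAR:  # terminal symbol: a concrete layer
        return [symbol]
    rules = GRAMMAR[symbol]
    # Bound recursion so every sampled architecture is finite.
    if depth >= max_depth:
        rules = [r for r in rules if "<model>" not in r] or rules
    layers = []
    for sym in rng.choice(rules):
        layers.extend(expand(sym, rng, depth + 1, max_depth))
    return layers

rng = random.Random(42)
for _ in range(3):
    print(expand("<model>", rng))
```

Because architectures are derived from grammar rules rather than mutations of a fixed reference network, the sampled designs can differ structurally (depth, layer mix) from one another, which is what enables the diversity claimed above.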
Original language | Undefined/unknown
---|---
Title | 32nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Publication status | Published - 2024