Chapter 2
Automated Neural Architecture Search (NAS) in Deci AI
Imagine discovering neural network architectures as powerful and tailored as the best human designs, yet evolved automatically, at scale, and in tune with diverse operational needs. This chapter peels back the layers of neural architecture search (NAS) in Deci AI, revealing how automation, advanced optimization algorithms, and custom objectives converge to reshape what is possible for AI deployment. Venture beyond manual engineering into the frontier of machine-designed networks, where breakthrough performance is discovered in silico.
2.1 NAS Foundations and Algorithms
Neural Architecture Search (NAS) fundamentally addresses the problem of automating the design of neural network architectures to optimize performance for specific tasks. At its core, NAS formulates this as a search problem over a complex and often discrete space of potential architectures, balancing expressiveness, efficiency, and scalability. The search space definition directly influences both the quality of architectures discovered and the computational resources required. Consequently, understanding the construction of these spaces alongside the algorithms that traverse them is pivotal for advancing NAS methodologies.
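To make the search-problem framing concrete, the following minimal sketch treats NAS as sampling from a small discrete space of configurations and keeping the best-scoring candidate. All names here are illustrative (a toy search space and a placeholder proxy objective), not Deci's implementation; a real NAS system would replace evaluate with training or a performance estimator.

import random

# Toy search space: each architecture is a dict of per-stage design choices.
SEARCH_SPACE = {
    "num_blocks":   [2, 3, 4],                          # depth of each stage
    "block_type":   ["conv3x3", "conv5x5", "depthwise"], # building-block operator
    "width_factor": [0.5, 1.0, 1.5],                    # channel multiplier
}

def sample_architecture(rng):
    """Draw one candidate configuration uniformly at random."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Placeholder proxy objective (accuracy estimate minus a latency penalty).
    In a real NAS pipeline this would train or estimate the candidate network."""
    accuracy_proxy = {"conv3x3": 0.90, "conv5x5": 0.92, "depthwise": 0.88}[arch["block_type"]]
    accuracy_proxy += 0.01 * arch["num_blocks"] * arch["width_factor"]
    latency_proxy = arch["num_blocks"] * arch["width_factor"]
    return accuracy_proxy - 0.02 * latency_proxy

def random_search(num_trials=50, seed=0):
    """Baseline NAS loop: sample candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(f"Best architecture: {arch} (score={score:.3f})")

Even this naive random-search baseline exposes the two levers the rest of the chapter develops: how the search space is parameterized, and how candidates are scored against the chosen objectives.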
Search Space Formulation
The search space in NAS constitutes all possible candidate architectures that an algorithm may consider. It is typically represented as a parameterized configuration set, where elements define architectural motifs such as layer types, connections, and hyperparameters. Formally, if