Neural Architecture Search (NAS) automates the design of neural network architectures (e.g., number of layers, type of activation, and connections). Its goal is to find an architecture that performs well on the given task and data.
Models designed with NAS are on par with or outperform those created by hand.
Components of NAS:
Search Space: The search space defines the set of components that can be combined to build different architectures. Two common designs are the macro search space, in which the structure of the entire network is searched, and the micro (cell-based) search space, in which small repeated cells are searched and then stacked to form the full network. The micro approach has been shown to have significant performance advantages over the macro approach.
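As a minimal sketch, a macro-style search space can be represented as a mapping from each architectural choice to its allowed options; the specific choices and values below are hypothetical, not from any particular NAS library:

```python
import itertools

# Hypothetical macro-style search space: each choice applies to the
# whole network rather than to a repeated cell.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "activation": ["relu", "tanh", "gelu"],
    "hidden_units": [64, 128, 256],
}

def enumerate_architectures(space):
    """Yield every candidate architecture in the search space as a dict."""
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

architectures = list(enumerate_architectures(SEARCH_SPACE))
print(len(architectures))  # 3 * 3 * 3 = 27 candidate architectures
```

Even this tiny space contains 27 candidates; realistic search spaces grow combinatorially, which is why exhaustive enumeration is rarely practical.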
Search Strategy: NAS explores the search space according to a specific strategy, which determines which candidate architectures to evaluate in pursuit of the most performant one.
Five common strategies are:
- Grid search
- Random search
- Bayesian optimization
- Evolutionary algorithms
- Reinforcement learning
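Random search is the simplest of these strategies to sketch: sample architectures uniformly from the search space and keep the best one found. The search space and the scoring function below are toy stand-ins (a real NAS run would train each candidate and measure validation accuracy), not part of any real library:

```python
import random

# Hypothetical search space; any dict of name -> list of options works.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "activation": ["relu", "tanh", "gelu"],
    "hidden_units": [64, 128, 256],
}

def estimate_accuracy(arch):
    # Toy stand-in for validation accuracy: a deterministic pseudo-random
    # score per architecture. A real NAS run would train the candidate
    # and evaluate it on held-out data.
    return random.Random(str(sorted(arch.items()))).random()

def random_search(space, num_trials=10, seed=0):
    """Sample architectures uniformly at random and keep the best one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = {name: rng.choice(options) for name, options in space.items()}
        score = estimate_accuracy(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best_arch, best_score = random_search(SEARCH_SPACE, num_trials=10, seed=0)
```

The other strategies differ mainly in how the next candidate is chosen: grid search enumerates the space systematically, while Bayesian optimization, evolutionary algorithms, and reinforcement learning use results from earlier trials to guide later ones.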
Performance Estimation: NAS depends on measuring the performance of the candidate architectures it tries. The most straightforward approach is to evaluate each architecture's accuracy on a validation set. However, computing validation accuracy for every candidate can be computationally expensive given the large search spaces and complex networks involved.
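The idea of ranking candidates by validation accuracy can be sketched with toy stand-in models (the "architectures" below are a majority-class baseline and a simple threshold classifier, purely for illustration):

```python
import random

rng = random.Random(0)
# Synthetic binary-classification data: the label is 1 when x > 0.5.
points = [(rng.random(),) for _ in range(200)]
labels = [1 if x[0] > 0.5 else 0 for x in points]

# Hold out a validation split for performance estimation.
train_x, val_x = points[:150], points[150:]
train_y, val_y = labels[:150], labels[150:]

def accuracy(predict, xs, ys):
    return sum(predict(x) == y for x, y in zip(xs, ys)) / len(ys)

# Two hypothetical "architectures": a majority-class baseline and a
# threshold model fit on the training split.
def majority_model(train_y):
    majority = int(sum(train_y) >= len(train_y) / 2)
    return lambda x: majority

def threshold_model(train_x, train_y):
    # Pick the threshold that maximizes training accuracy.
    best_t, best_acc = 0.0, -1.0
    for t in [i / 20 for i in range(21)]:
        acc = accuracy(lambda x, t=t: int(x[0] > t), train_x, train_y)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x: int(x[0] > best_t)

candidates = {
    "majority": majority_model(train_y),
    "threshold": threshold_model(train_x, train_y),
}
# Rank candidates by validation accuracy, as NAS would.
scores = {name: accuracy(m, val_x, val_y) for name, m in candidates.items()}
best = max(scores, key=scores.get)
```

In a real NAS run, each "fit" step would be a full training run of a neural network, which is exactly why this naive estimation becomes the bottleneck.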
Strategies to reduce this computation cost include: