Next up for developing artificial intelligence systems is automated neural-net architecture search.

James Kobielus, Tech Analyst, Consultant and Author

March 30, 2018


Evolution is the historical process that created the intelligence behind these words. It’s also responsible for spawning the neural connections that readers are using to grasp what’s being expressed.

Any serious effort to develop “artificial general intelligence” must at some point recapitulate the evolutionary process within which neural networks took shape and became attuned to the world around them. Artificial intelligence researchers have been developing more sophisticated “neuroevolution” approaches for many years. Now it would seem that the time is right for these approaches to enter the mainstream of commercialized AI in a big way.

As AI becomes the driving force behind robotics, more developers are exploring alternative approaches for training robots to master the near-endless range of environmental tasks for which they’re being designed. There is fresh interest in approaches that can train robots to walk as well as humans, swim like dolphins, swing from trees like gibbons, and maneuver with the aerial agility of bats. As I noted here, the robotics revolution has spurred AI researchers to broaden the scope of intelligence to encompass any innate faculty that enables an entity to explore, exploit, adapt, and survive in its environment.

In this new era, we’re seeing more research focused on evolutionary algorithms, which are designed to help neural nets automatically evolve their internal structures and connections through trial-and-error training scenarios. More broadly, there is an intensifying commercial and research focus on “neurorobotics,” as well as on such overlapping fields as reinforcement learning, embodied cognition, swarm intelligence, and multi-objective decision making.

As Kenneth O. Stanley notes in this fascinating article, developers’ growing need for sophisticated techniques to accelerate neural-net architecture optimization has spurred convergence between the fields of neuroevolution and deep reinforcement learning. Researchers at OpenAI, he reports, have developed a neuroevolution approach that boosts the performance of conventional deep reinforcement learning techniques on a variety of training tasks. In this way, the researchers can go well beyond the traditional focus of AI training, which takes a neural-net architecture as given and simply adjusts the weights among artificial neurons, and instead use a simulated variant of “natural selection” to evolve the architecture itself over successive iterations.
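
To make the idea concrete, here is a minimal sketch, in Python, of what architecture-level evolution can look like. It is not OpenAI’s or Stanley’s actual method; the genome is simply a list of hidden-layer widths, the mutate and evolve helpers are illustrative names, and the fitness function is a placeholder for the step where a candidate network would really be built, trained, and scored.

import random

def fitness(genome):
    # Placeholder: in a real system you would build a network with these
    # hidden-layer widths, train it briefly, and return validation accuracy.
    # Here we fake a score that favors roughly three layers of width ~64.
    return -abs(len(genome) - 3) - sum(abs(w - 64) for w in genome) / 100.0

def mutate(genome):
    # Variation step: randomly add, remove, or resize one hidden layer.
    g = list(genome)
    op = random.choice(["add", "remove", "resize"])
    if op == "add":
        g.insert(random.randrange(len(g) + 1), random.choice([16, 32, 64, 128]))
    elif op == "remove" and len(g) > 1:
        g.pop(random.randrange(len(g)))
    else:
        i = random.randrange(len(g))
        g[i] = max(8, g[i] + random.choice([-16, 16]))
    return g

def evolve(pop_size=20, generations=30):
    # Start from a population of tiny one-layer architectures.
    population = [[random.choice([16, 32, 64, 128])] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 4]  # selection: keep the fittest quarter
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print("best architecture (hidden-layer widths):", evolve())

The loop itself is generic: with more compute, the population grows larger and the placeholder fitness call becomes a real training-and-evaluation run.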

In his article, Stanley suggests how neuroevolution might soon become a standard capability in the DevOps toolkit of every practicing data scientist. He discusses a hypothetical scenario in which alternative neural-net architectures are iteratively generated, tested, and selected in a robotics simulation lab.

This is an increasingly feasible scenario for mainstream developers, according to Stanley, thanks to the steadily improving availability and price-performance of GPUs and other AI-optimized hardware in the cloud. “Neuroevolution,” he states, “is just as eligible to benefit from massive hardware investment as conventional deep learning, if not more. The advantage for neuroevolution, as with all evolutionary algorithms, is that a population of artificial neural networks is intrinsically and easily processed in parallel. If you have 100 artificial neural networks in the population and 100 processors, you can evaluate all of those networks at the same time, in the time it takes to evaluate a single network. That kind of speed-up can radically expand the potential applications of the method.”
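
Stanley’s point about parallelism is easy to see in code. The sketch below is again hypothetical: a dummy evaluate function stands in for training and scoring one candidate network, and Python’s standard multiprocessing pool spreads a population of 100 candidates across however many processors are available.

from multiprocessing import Pool, cpu_count

def evaluate(candidate_id):
    # Placeholder: build or load candidate network `candidate_id`, run it on
    # the task, and return its score. Each call is fully independent.
    return candidate_id * 0.01  # dummy score

if __name__ == "__main__":
    population = list(range(100))                # 100 candidate networks
    with Pool(processes=cpu_count()) as pool:    # one worker per processor
        scores = pool.map(evaluate, population)  # evaluate candidates in parallel
    best = max(population, key=lambda i: scores[i])
    print("best candidate:", best, "score:", scores[best])

Because each evaluation is independent, a generation’s wall-clock time shrinks toward the time needed to evaluate a single network as more processors are added, which is exactly the speed-up Stanley describes.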

Of course, no one is claiming that neuroevolution is a mature field or that this AI training approach is widely deployed in production environments. However, it is clear that these evolutionary neural-net architecture optimization techniques will begin to enter the mainstream of “automated machine learning” approaches within the coming three to five years. As I noted in this recent Wikibon report, there is a growing range of automation tools for the new generation of developers who deploy machine learning, deep learning, and other AI capabilities into production applications.

It’s only a matter of time before automated neural-net architecture search comes into AI developer toolchains. As it does, it will supplement the automated feature engineering, algorithm selection, and model training capabilities that are already there.

About the Author

James Kobielus

Tech Analyst, Consultant and Author

James Kobielus is an independent tech industry analyst, consultant, and author. He lives in Alexandria, Virginia.
