
DelphiNEAT

NEAT implementation in Delphi.

What Is It?

NEAT (NeuroEvolution of Augmenting Topologies) is a method for evolving neural networks, developed by Kenneth O. Stanley. DelphiNEAT is an implementation of NEAT written in Borland Delphi by Mattias Fagerlund. To find out more about NEAT, follow this link.

DelphiNEAT comes with a number of demo applications: Obstacle Navigator, Pole Balancing, Symbolic Regression and XOR Regression. If you download the source and binary distro, all the demos are included in pre-compiled format, so you can run them even if you don’t have access to Delphi.

Genetic Art


My most interesting DelphiNEAT demo is the GeneticArt program. You can read about it here.

Download the demo / source

I’ve put up two releases of DelphiNEAT: one with source only and one with compiled demos. The source-only version is tiny, but if you want precompiled demos, go for the bigger file. Note that DelphiNEAT comes without any kind of documentation, so you’re pretty much on your own.

My coolest NEAT demos

Hopper

I used NEAT to evolve the controllers for a couple of simulated physical creatures. You can read more about it here. The code for the hopper and the snake isn’t included in the base distro, because it requires my DelphiODE package and the latest CVS version of GLScene.

The Demos

These are the demos that are included with the DelphiNEAT package.

Symbolic Regression

y=abs(x)

The symbolic regression demo presents the NEAT system with a target function (in the above case, y=abs(x)). Fitness is defined by how closely the evolved network approximates the sought function.
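As a rough sketch of this kind of fitness test (in Python for illustration, since the Delphi source isn’t shown here; the sampling range, sample count and the 1/(1+error) scaling are my assumptions, not DelphiNEAT’s actual scheme):

```python
# Hypothetical symbolic-regression fitness sketch; not DelphiNEAT's actual code.
def fitness(network, target=abs, samples=21):
    """Score a candidate network against the target function y = abs(x)."""
    error = 0.0
    for i in range(samples):
        x = -1.0 + 2.0 * i / (samples - 1)    # sample x evenly in [-1, 1]
        error += (network(x) - target(x)) ** 2
    return 1.0 / (1.0 + error)                # closer fit -> higher fitness
```

A network that reproduces the target exactly scores 1.0; anything worse scores lower.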

Obstacle Navigator

A fairly simple corridor

A more complex corridor

The aim of the Obstacle Navigator is to follow a corridor without hitting any walls. The NEAT system evolves a steering behaviour that is tested on several corridor configurations. The network’s inputs are five sensors (drawn as lines) that indicate how far the navigator is from the wall at each sensor angle. Hitting a wall means the navigator dies. Fitness is defined as the sum of the distances the navigator was able to travel before its demise. If a navigator can travel all corridors without crashing, it’s considered a perfect solution.
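The fitness scheme described above can be sketched like so (Python for illustration; `run_corridor` is a hypothetical stand-in for the actual simulation, which the Delphi source implements):

```python
# Illustrative sketch of the navigator's fitness scheme; names are assumptions.
def navigator_fitness(run_corridor, corridors):
    """Sum the distance travelled in each corridor before crashing.

    run_corridor(corridor) is assumed to simulate one trial for the evolved
    steering network and return (distance_travelled, finished_without_crash).
    """
    total = 0.0
    perfect = True
    for corridor in corridors:
        distance, finished = run_corridor(corridor)
        total += distance                     # fitness = summed travel distance
        perfect = perfect and finished        # perfect only if no corridor crashed
    return total, perfect
```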

Pole Balancing

A cart balancing two poles

Pole Balancing is a classic AI benchmark. It simulates a cart running on a track, balancing one or two poles. NEAT is extremely successful on this benchmark, but DelphiNEAT has failed to reach the same level; I haven’t been able to pinpoint exactly why.
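For context, the single-pole version of the benchmark uses well-known equations of motion. Here is a minimal Euler-integrated step in Python (the constants and integration scheme are the textbook defaults for this benchmark, not necessarily what DelphiNEAT uses):

```python
import math

# Standard single-pole cart-pole dynamics, shown for illustration only.
GRAVITY, M_CART, M_POLE, HALF_LEN, DT = 9.8, 1.0, 0.1, 0.5, 0.02

def step(x, x_dot, theta, theta_dot, force):
    """Advance the cart-pole state by one Euler step under the given force."""
    total_mass = M_CART + M_POLE
    sin_t, cos_t = math.sin(theta), math.cos(theta)
    temp = (force + M_POLE * HALF_LEN * theta_dot ** 2 * sin_t) / total_mass
    theta_acc = (GRAVITY * sin_t - cos_t * temp) / (
        HALF_LEN * (4.0 / 3.0 - M_POLE * cos_t ** 2 / total_mass))
    x_acc = temp - M_POLE * HALF_LEN * theta_acc * cos_t / total_mass
    return (x + DT * x_dot, x_dot + DT * x_acc,
            theta + DT * theta_dot, theta_dot + DT * theta_acc)
```

The evolved network reads the state, outputs a force, and is scored by how many steps it keeps the pole(s) upright and the cart on the track.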

The Artificial Ant

An early ant

The artificial ant is a classic benchmark in GA / GP. The goal is to steer an artificial ant (red and black) over a mostly empty (white) toroidal world containing a trail of food (green). If the ant finds a piece of food (a red square above), it’s given a point. If it steps without finding any food (black), it simply wastes one of its turns. After it has been allowed to run for a set number of steps, it’s killed off. The fitness is the number of food pieces it has found. I haven’t optimized the settings for this demo yet, and I expect the performance to improve.
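A toy version of the ant mechanics might look like this in Python (the grid encoding, the move alphabet and the start state are all my illustrative choices, not DelphiNEAT’s):

```python
# Illustrative ant-on-a-torus sketch; grid encoding and moves are assumptions.
def ant_run(grid, moves, start=(0, 0), heading=(0, 1)):
    """Walk the ant over a toroidal grid, eating food ('F') and counting it."""
    rows, cols = len(grid), len(grid[0])
    grid = [row[:] for row in grid]           # don't mutate the caller's trail
    (r, c), (dr, dc) = start, heading
    eaten = 0
    for move in moves:
        if move == 'L':                       # turn left
            dr, dc = -dc, dr
        elif move == 'R':                     # turn right
            dr, dc = dc, -dr
        else:                                 # step forward, wrapping toroidally
            r, c = (r + dr) % rows, (c + dc) % cols
            if grid[r][c] == 'F':
                eaten += 1
                grid[r][c] = '.'
    return eaten                              # fitness = food pieces found
```

The modulo arithmetic is what makes the world toroidal: walking off one edge wraps the ant around to the opposite side.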

 

XOR Regression

A successful 2-bit XOR network

A successful 4-bit XOR network

The goal of this demo, another classic benchmark, is to find a neural network that behaves like a simple XOR gate. NEAT is wildly successful at this benchmark, but it’s considered too simple by today’s standards. Higher-bit XOR problems make it tricky again.
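A sketch of how an n-bit XOR fitness test can work, in Python (n-bit XOR is the parity of the inputs; the squared-error scoring and the normalisation are my assumptions, not necessarily DelphiNEAT’s):

```python
from itertools import product

# Hedged sketch of an n-bit XOR fitness test; not the DelphiNEAT evaluation code.
def xor_fitness(network, bits=2):
    """Score a network on every row of the n-bit XOR (parity) truth table."""
    error = 0.0
    for inputs in product((0, 1), repeat=bits):
        target = sum(inputs) % 2              # XOR of all inputs = their parity
        error += (network(inputs) - target) ** 2
    return (2 ** bits - error) / 2 ** bits    # 1.0 means a perfect XOR network
```

The truth table doubles with each extra bit, which is part of why higher-bit XOR gets harder: a 4-bit network must get 16 cases right instead of 4.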
