November 4, 2019
Evolution has been the primary mechanism by which brains, and organisms in general, have mastered their environments, and these same concepts can now be applied to computer programs to reach clever solutions that would be difficult, if not impossible, for humans to arrive at on their own. These techniques are called genetic algorithms, and they are used to solve highly non-linear problems where traditional optimization techniques fall flat.
An example of this concept can be seen in the following simulation where agents are tasked with navigating a randomly generated obstacle course.
This Python code uses concepts like fitness, mutation, and gene inheritance to arrive at a solution to the course. Each agent has a fitness score and DNA that, at the end of its life, can be passed down to the next generation with varying degrees of effectiveness. Fitness is a function of distance to the target if it hasn’t reached it, and time until arrival if it has.
# Calculate fitness based on time alive and distance from target
def calculate_fitness(self):
    distance_from_target = math.hypot(self.position.x - target_location[0],
                                      self.position.y - target_location[1])
    # Set new closest distance
    if distance_from_target < self.nearest_distance:
        self.nearest_distance = distance_from_target
    # If the bug hit the target
    if self.nearest_distance == 0 or self.target_collision is True:
        self.lifetime = self.death_time - self.birth_time
        self.fitness_score = 1 + (1 / self.lifetime)**2
    # If the bug hit a wall
    elif self.wall_collision is True:
        self.fitness_score = ((1 / self.nearest_distance)**2) * 0.1
    # Bug died of natural causes
    else:
        self.fitness_score = (1 / self.nearest_distance)**2
    if self.fitness_score < 0:
        self.fitness_score = 0
If an agent's fitness score is high, it has a higher chance of being selected for the mating pool and passing its “genes” to the next generation. Genes in this case are a list of vectors that the agent follows until it either dies, hits a wall, or reaches the goal.
for i in range(population):
    parent_a, parent_b = select_parents(mating_pool)
    child = parent_a.crossover(parent_b.genes)
    bug.append(Bug(child))
    sprite_list.add(bug)
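The `select_parents` call above isn't shown in the post, but a common way to implement fitness-proportional selection is roulette-wheel sampling. The sketch below assumes `mating_pool` is a list of agents that each carry a `fitness_score`:

```python
import random

def select_parents(mating_pool):
    # Weight each agent by its fitness score so fitter agents
    # are proportionally more likely to be chosen as parents.
    weights = [agent.fitness_score for agent in mating_pool]
    parent_a, parent_b = random.choices(mating_pool, weights=weights, k=2)
    return parent_a, parent_b
```

Note that `random.choices` samples with replacement, so an exceptionally fit agent can occasionally be picked as both parents.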
The main driver of innovation in this simulation is the mutation rate of the agents' genes. It is based on a variable that can change as the simulation progresses, and mutation occurs during crossover, after selection from the mating pool.
def mutation(self, genes):
    for i in range(lifespan):
        if random.random() < mutation_rate:
            genes[i] = Vector(random.randrange(-speed, speed),
                              random.randrange(-speed, speed))
    return genes
I found that agents can get stuck in a local maximum while trying to improve, so the mutation rate slowly increases if no progress is made for a few generations. This parameter is interesting: if it is too small, agents may never find a solution, and if it is too large, successful agents' DNA can be completely mutated out of existence and the population will regress! The window for the best mutation rate is small and can vary, but it is usually around 1 to 2 percent. The mutation rate is capped at 5 percent, since anything higher causes regression.
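One way that adaptive scheme could be sketched is below. The function and variable names here are illustrative, not the exact ones from the source: the rate is nudged upward after a few stagnant generations and capped at 5 percent.

```python
def adapt_mutation_rate(mutation_rate, best_fitness, prev_best,
                        stale_generations, patience=3,
                        step=0.005, max_rate=0.05):
    # Track how many generations have passed without improvement.
    if best_fitness > prev_best:
        stale_generations = 0
    else:
        stale_generations += 1
    # After a few stagnant generations, raise the rate slightly,
    # but cap it at 5% so good DNA isn't mutated out of existence.
    if stale_generations >= patience:
        mutation_rate = min(mutation_rate + step, max_rate)
        stale_generations = 0
    return mutation_rate, stale_generations
```

The caller would feed back the returned rate and stagnation counter each generation, resetting them whenever a new best fitness appears.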
As always, the source code can be found on my GitHub, linked below. Variables like the base mutation rate, total obstacles, and population size can be altered in the first few lines of the code before runtime. The program uses the Pygame package for visualization and was built with Python 3.6.2.
GitHub link: https://github.com/danbar0/Smart_Rockets