I have implemented the first, pre-alpha version of my little genetic algorithm framework and it is working very well thus far. Now, I am in the process of writing documentation and finishing up some details. I just wanted to clarify something.
The term "mutation rate". Does it mean:
The likelihood of a given chromosome being mutated at all?
The likelihood of a given gene in a chromosome being modified?
Or the likelihood of a single allele in a gene being modified?
Depending on which of the above is the correct answer (or something entirely different for that matter), please also clarify whether I need to scale the mutation rate by some other value (the number of genes in a chromosome, for instance).
I'm not sure how you implement a single allele, but I would say the mutation rate is the chance of a single bit mutating. For example, if you have the DNA 0000 and a mutation rate of 25%, then each binary digit (the zeros in this case) independently has a 25% chance to "mutate" to a 1.
In the projects I have done myself I have not scaled the mutation rate.
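To make the per-bit interpretation concrete, here is a minimal sketch (the function name `mutate` and the list-of-bits representation are my own assumptions, not part of any particular framework):

```python
import random

def mutate(chromosome, mutation_rate):
    """Flip each bit independently with probability mutation_rate.

    chromosome: a list of 0/1 ints; mutation_rate: a float in [0, 1].
    """
    return [1 - gene if random.random() < mutation_rate else gene
            for gene in chromosome]

# Each of the four bits has a 25% chance of flipping on any given call.
print(mutate([0, 0, 0, 0], 0.25))
```

Note that under this interpretation no extra scaling by chromosome length is needed: the expected number of mutated bits already grows with the chromosome length (it equals `len(chromosome) * mutation_rate`).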