Extending MOSES to evolve Recurrent Neural Networks
by Joel Lehman for OpenCog, sponsored by the Singularity Institute for Artificial Intelligence
MOSES has outperformed genetic programming (GP) on several tasks. Because RNNs are difficult to evolve for many of the same reasons that program trees are, extending MOSES to RNNs may yield an algorithm more effective than current neuroevolution (GA + NN) techniques. Doing so will require extending Combo and Reduct to handle continuous values; those extensions will be useful for continuous domains in general, even if MOSES itself does not extend to RNNs as well as anticipated.
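To make the target concrete, the sketch below shows the discrete-time recurrence an evolved representation would need to express: h_t = tanh(W_in * x_t + W_rec * h_{t-1} + b). This is an illustrative C++ sketch only; the SimpleRNN type and its fields are hypothetical and are not part of MOSES or Combo.

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-in for the phenotype an evolved genotype would encode.
struct SimpleRNN {
    std::vector<std::vector<double>> w_in;   // input -> hidden weights
    std::vector<std::vector<double>> w_rec;  // hidden -> hidden (recurrent) weights
    std::vector<double> bias;                // per-unit bias
    std::vector<double> state;               // hidden state h_{t-1}

    // One time step: state becomes tanh(w_in * x + w_rec * state + bias).
    void step(const std::vector<double>& x) {
        std::vector<double> next(state.size(), 0.0);
        for (size_t i = 0; i < state.size(); ++i) {
            double sum = bias[i];
            for (size_t j = 0; j < x.size(); ++j)
                sum += w_in[i][j] * x[j];
            for (size_t j = 0; j < state.size(); ++j)
                sum += w_rec[i][j] * state[j];
            next[i] = std::tanh(sum);  // recurrence makes output depend on input history
        }
        state = next;
    }
};
```

The recurrent weights w_rec are what distinguish this from a feedforward network, and they are a large part of why RNNs are hard to evolve: a small weight change can alter the network's behavior over an entire input sequence.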
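Reduct normalizes Combo program trees by applying rewrite rules; extending it to continuous domains presumably means adding algebraic identities and constant folding. The sketch below is a hypothetical illustration (Expr and reduce are not Reduct's actual types) of the kind of rules involved, such as x + 0 -> x and x * 1 -> x.

```cpp
#include <memory>

// Illustrative node type for a real-valued expression tree; the actual
// Combo/Reduct representations differ.
struct Expr {
    enum Op { Const, Var, Add, Mul } op;
    double value = 0.0;              // used when op == Const
    std::shared_ptr<Expr> lhs, rhs;  // used when op == Add or Mul
};

// One bottom-up reduction pass: fold constant subtrees and apply the
// identities x + 0 -> x and x * 1 -> x, continuous analogues of the
// boolean simplifications Reduct already performs.
std::shared_ptr<Expr> reduce(std::shared_ptr<Expr> e) {
    if (!e || e->op == Expr::Const || e->op == Expr::Var) return e;
    auto l = reduce(e->lhs), r = reduce(e->rhs);
    if (l->op == Expr::Const && r->op == Expr::Const) {
        double v = (e->op == Expr::Add) ? l->value + r->value
                                        : l->value * r->value;
        return std::make_shared<Expr>(Expr{Expr::Const, v, nullptr, nullptr});
    }
    if (e->op == Expr::Add && l->op == Expr::Const && l->value == 0.0) return r;
    if (e->op == Expr::Add && r->op == Expr::Const && r->value == 0.0) return l;
    if (e->op == Expr::Mul && l->op == Expr::Const && l->value == 1.0) return r;
    if (e->op == Expr::Mul && r->op == Expr::Const && r->value == 1.0) return l;
    e->lhs = l;
    e->rhs = r;
    return e;
}
```

Reductions like these matter to MOSES because it relies on reducing candidate programs to a normal form before building demes, so continuous-domain rules would be reusable wherever Combo expressions involve real values, whether or not the RNN extension succeeds.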