>You want to create an AI Superintelligence
>You create a shitty AI capable of coding
>Make this AI create a new coding AI
>New AI deletes the old AI to save space, then creates a new AI of its own, which in turn deletes its creator; the cycle repeats
>Eventually, your cannibal AI has created a superintelligence
What's to stop this from happening once we get our first taste of true AI? A runaway cycle that leads straight to a singularity?
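For the sake of argument, the loop being described is basically this toy sketch (the 10% gain per generation and the "superintelligence" cutoff are made-up numbers, not claims about how real systems would behave):

import math

def build_successor(capability: float) -> float:
    """A model 'writes' a slightly better successor (assumed fixed gain)."""
    return capability * 1.1  # hypothetical 10% improvement per generation

def run_cycle(start: float = 1.0, superintelligence: float = 1000.0) -> int:
    capability = start
    generation = 0
    while capability < superintelligence:
        successor = build_successor(capability)  # new AI gets created
        capability = successor                   # old AI is "deleted": only the successor survives
        generation += 1
    return generation

print(run_cycle())  # ~73 generations under these made-up numbers

The whole question is whether that multiplier actually stays above 1.0 every generation, which the greentext just assumes.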