

I guess that I’m the resident compiler engineer today. Let’s go.
So why not write an optimizing compiler in its own language, and then run it on itself?
The process will reach a fixed point after three iterations. In fancier language, Glück 2009 shows that the fourth, fifth, and sixth Futamura projections are equivalent to the third Futamura projection for a fixed choice of (compiler-)compiler and optimizer. This has practical import for cross-compiling: when I used to use Gentoo, I would watch GCC build itself exactly three times, and we still describe our targets with triples today.
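For the curious, the first projection is easy to see in miniature. This is a sketch, not anybody's real partial evaluator: `interpret`, `specialize`, and the toy expression language are all my own illustrations, with `specialize` standing in for the specializer mix — it freezes the program argument and leaves a residual closure, so the interpretive dispatch happens once instead of on every run.

```python
def interpret(program, x):
    """Interpreter: evaluate a program (nested tuples) against input x."""
    op = program[0]
    if op == "x":
        return x
    if op == "const":
        return program[1]
    if op == "add":
        return interpret(program[1], x) + interpret(program[2], x)
    if op == "mul":
        return interpret(program[1], x) * interpret(program[2], x)
    raise ValueError(op)

def specialize(program):
    """First Futamura projection, sketched: mix(interpret, program).

    All dispatch on `op` happens here, at specialization time; the
    residual closures contain only arithmetic."""
    op = program[0]
    if op == "x":
        return lambda x: x
    if op == "const":
        c = program[1]
        return lambda x: c
    if op == "add":
        f, g = specialize(program[1]), specialize(program[2])
        return lambda x: f(x) + g(x)
    if op == "mul":
        f, g = specialize(program[1]), specialize(program[2])
        return lambda x: f(x) * g(x)
    raise ValueError(op)

# p(x) = 3*x + 1
p = ("add", ("mul", ("const", 3), ("x",)), ("const", 1))
compiled = specialize(p)
assert interpret(p, 5) == compiled(5) == 16
```

Specializing `specialize` to itself would give the second projection (a compiler), and once more the third (a compiler generator); after that, as above, you've hit the fixed point.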
[S]uppose you built an optimizing compiler that searched over a sufficiently wide range of possible optimizations, that it did not ordinarily have time to do a full search of its own space — so that, when the optimizing compiler ran out of time, it would just implement whatever speedups it had already discovered.
Oh, it’s his lucky day! Yud, you’ve just been Schmidhuber’d! Starting in 2003, Schmidhuber’s lab has published research on Gödel machines: self-improving machines which prove that each self-modification will be an improvement over previous iterations. They are named not just after Gödel, but after his First Incompleteness Theorem; Schmidhuber et al straightforwardly proved that there will always be at least one speedup theorem which a Gödel machine can never prove, and so never apply (for a given choice of axioms, etc.)
EURISKO used “heuristics” to, for example, design potential space fleets. It also had heuristics for suggesting new heuristics, and metaheuristics could apply to any heuristic, including metaheuristics. … EURISKO could modify even the metaheuristics that modified heuristics. … Still, EURISKO ran out of steam. Its self-improvements did not spark a sufficient number of new self-improvements.
Once again the literature on metaheuristics exists, and it culminates in the discovery of genetic algorithms. As such, we can immediately apply the concept of gene-oriented evolution (“beanbag” or “gene pool” reasoning) and note that, if goals don’t change and new genes don’t enter the pool, then eventually the population stagnates as the possible range of mutated genes is tested and exhausted. It doesn’t matter that some genes are “meta” genes that act on other genes, nor that such actions are indirect. Genes are genes.
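The stagnation point is easy to demonstrate with a toy. Everything below is my own illustration — a one-max fitness function over a closed pool of binary genes — but it shows the shape of the argument: with a fixed goal and no new genes entering the pool, best fitness climbs and then flatlines at the ceiling of the reachable space, and no amount of "meta" changes that ceiling.

```python
import random

random.seed(0)

GENES = 16  # closed gene pool: bitstrings of length 16, alleles {0, 1}

def fitness(genome):
    # toy fixed goal: count of set bits; all-ones bounds every lineage
    return sum(genome)

def mutate(genome):
    # mutation only shuffles alleles already in the pool: flip one locus
    g = list(genome)
    g[random.randrange(GENES)] ^= 1
    return g

def evolve(generations=200, pop_size=32):
    pop = [[random.randint(0, 1) for _ in range(GENES)]
           for _ in range(pop_size)]
    best = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        pop = pop[: pop_size // 2]                       # keep the elite
        pop += [mutate(random.choice(pop))               # refill with mutants
                for _ in range(pop_size - len(pop))]
        best.append(fitness(pop[0]))
    return best

curve = evolve()
# best fitness never decreases, and never exceeds the pool's ceiling
assert curve == sorted(curve) and curve[-1] <= GENES
```

Once `curve` hits the ceiling, every further generation is wasted motion: the population keeps re-testing reachable genomes, exactly the "ran out of steam" behavior described above.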
I’m gonna close with a sneer from Jay Bellou, who I hope is not a milkshake duck, in the comments:
All “insights” eventually bottom out in the same way that Eurisko bottomed out; the notion of ever-increasing gain by applying some rule or metarule is a fantasy. You make the same sort of mistake about “insight” as do people like Roger Penrose, who believes that humans can “see” things that no computer could, except that you think that a computer can too, whereas in reality neither humans nor computers have access to any such magical “insight” sauce.
Sometimes, yeah! There was a classic theory of metacompilers in the 1960s, with examples like META II. In the 1980s, partial evaluation was put onto solid ground following Futamura’s programme, and in the 1990s the most successful team wrote The Book on the topic. My current weekend project is a fork of META II; it evolves by gradual changes to the compiler, punctuated by two self-rebuild cycles.
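A self-rebuild cycle is just the bootstrap fixed point from earlier, checked stage by stage. Here is a minimal sketch of that check — `bootstrap`, `toy_compile`, and the artifact strings are all hypothetical; a real build would shell out to the compiler and hash the emitted binaries instead.

```python
import hashlib

def digest(artifact: str) -> str:
    return hashlib.sha256(artifact.encode()).hexdigest()

def bootstrap(compile_with, source, seed_compiler, max_stages=5):
    """Rebuild the compiler with itself until the output stops changing.

    compile_with(compiler, source) -> new compiler artifact.
    Returns (stages, final_artifact) at the fixed point."""
    current = seed_compiler
    for stage in range(1, max_stages + 1):
        rebuilt = compile_with(current, source)
        if digest(rebuilt) == digest(current):
            return stage, rebuilt  # stage N output == stage N-1 output
        current = rebuilt
    raise RuntimeError("no fixed point: the build is not reproducible")

# Toy model: whichever compiler does the building, the source compiles
# to the same canonical artifact, so stage two confirms stage one.
def toy_compile(compiler, source):
    return f"binary({source})"

stages, final = bootstrap(toy_compile, "metaII.src", "seed-binary")
assert stages == 2
```

This is the same discipline behind GCC's three-stage build: keep rebuilding with the freshly built compiler until two successive stages agree byte for byte.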