I don't think there is a single standard word or phrase that covers just this concept. Perhaps recursive self-improvement matches the idea concisely - but that is not jargon specific to AI.
Very little is understood about how strong this effect can be, or what its limits are. Will 10 generations of self-improvement lead to a machine that is 10% better, 10 times better, or $2^{10}$ times better? And by what measure?
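To make those three outcomes concrete, here is a small worked comparison. It assumes (purely for illustration - the growth model itself is the unknown) that each generation improves on the previous one by a constant multiplicative factor $r$:

$$C_n = r^n \quad \text{(capability after } n \text{ generations, relative to the first system)}$$

For $n = 10$ generations: $r \approx 1.01$ gives $C_{10} \approx 1.1$ (about 10% better), $r \approx 1.26$ gives $C_{10} \approx 10$ (10 times better), and $r = 2$ gives $C_{10} = 2^{10} = 1024$. The same recursion can therefore produce anything from a marginal gain to a thousandfold one, depending entirely on the per-generation factor - which we currently have no way to estimate.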
Some futurologists suggest this might be a very strong effect, and use the term Singularity to capture the idea that recursive self-improvement will drive rapid intelligence growth, exceed human intelligence, and produce some form of super-intelligent machine - the point at which this is reached is called The Singularity. Ray Kurzweil is a well-known proponent of this idea.
Specifically, use of the term Singularity implies more than just the basic recursion that you suggest: it includes the assumption that the effect will be very large. Also, technically, it refers to a stage that results from the recursion, not to the recursion itself.
However, despite its popularity as a concept, whether or not such a self-improving system would have a large impact on the generation of intelligent machines is completely unknown at this stage. Related research into general intelligence is still in its infancy, so it is not even clear what would count as the first example of such a system x.