Most games code replays as a series of commands.
How it works is that when you start a game, the client spins up a server on localhost, and during playback the replay is buffered into a stream. As time passes, the commands are pulled out of the stream and executed, re-running the original simulation. That's the standard way to implement it.
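A minimal sketch of the idea (all names here are made up for illustration, not any engine's actual API). The key property is determinism: feeding the same command stream into the same simulation always reproduces the same state, which is why .rep files can be tiny.

```python
from dataclasses import dataclass

@dataclass
class Command:
    tick: int      # simulation tick the command was issued on
    player: int
    action: str    # e.g. "move_unit_7_north"

class Game:
    """Toy deterministic simulation: same commands in, same state out."""
    def __init__(self):
        self.state = 0  # stand-in for the full game state
    def execute(self, cmd: Command) -> None:
        # Any deterministic update works; a real engine mutates units, maps, etc.
        self.state = (self.state * 31 + len(cmd.action) + cmd.player) % (2**32)

def replay(commands: list, total_ticks: int) -> int:
    """Pull commands out of the stream as ticks pass and execute them."""
    game = Game()
    stream = sorted(commands, key=lambda c: c.tick)
    i = 0
    for tick in range(total_ticks):
        while i < len(stream) and stream[i].tick == tick:
            game.execute(stream[i])
            i += 1
    return game.state
```

Since only (tick, player, action) tuples are stored, a whole match compresses to a few hundred KB.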
In order to support playback across different versions, you have a few options:
Option A) store all the gamedata in the .rep file. This bloats replays considerably, and it can still have problems if the engine itself was changed, since certain balance changes can put the replay out of sync with the actual game. Elemental does this with its .sav files (old saves use the unit strengths and values from when the game was created, not the most recent patch), so its .sav files run over a megabyte.
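A sketch of what Option A looks like on disk (the format is invented for illustration): the replay carries a full snapshot of the balance data alongside the command stream, so the engine replays with the values that were live when the game was recorded.

```python
import json, zlib

def write_rep(balance: dict, commands: list) -> bytes:
    """Option A: embed the whole balance snapshot next to the command stream."""
    payload = json.dumps({"balance": balance, "commands": commands}).encode()
    return zlib.compress(payload)

def read_rep(blob: bytes):
    """The replay uses its own snapshot, not the currently installed patch."""
    data = json.loads(zlib.decompress(blob))
    return data["balance"], data["commands"]
```

Even compressed, a snapshot of every unit, upgrade, and map value dwarfs the command stream itself, which is where the bloat comes from.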
A bloated replay is bad. Don't give me crap about 'oh but internet speeds are hella fast! What's the difference between a 2 MB file and a 300 KB file?' Well, for me, it's a ton. 5 GB monthly limit and 6 Mbps. Or unlimited and 300 Kbps. There is a difference, and most countries outside of the US do bandwidth throttling once you go over a certain limit. JS files are getting more and more bloated as shitty webdev programmers are graduating without knowing basic JS-compression techniques.
Option B) keep a backup of all the binaries. In the header of your .rep, you indicate the version number, and the game chooses which binary to load and run. This bloats your Program Files, which isn't a huge deal since hard disk space is "cheap", and it doesn't make game patches any larger or smaller. However, it does create a problem: if someone buys the game, say, 6 months after release, and it has been patched twice, the company has to burn CDs with the old binaries on them. Which is odd.
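Option B's dispatch can be sketched like this, assuming a made-up header layout (a 4-byte magic string followed by a little-endian uint32 version) and a hypothetical one-binary-per-patch archive:

```python
import struct

MAGIC = b"REP1"  # invented magic bytes for this sketch

def write_header(version: int) -> bytes:
    """Stamp the patch version into the replay header."""
    return MAGIC + struct.pack("<I", version)

def pick_binary(rep_bytes: bytes) -> str:
    """Read the version from the .rep header and choose the matching binary."""
    magic, version = rep_bytes[:4], struct.unpack("<I", rep_bytes[4:8])[0]
    if magic != MAGIC:
        raise ValueError("not a replay file")
    # Hypothetical layout: one archived binary kept per shipped patch.
    return f"binaries/game-1.{version}.exe"
```

The launcher then hands playback off to whichever archived binary matches, which is exactly why every old binary has to stay on disk.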
In the case of downloaded games (we can dismiss bandwidth concerns here, since if you're the type of person to download a 4 GB game, you probably don't have much of an issue with it), you still have to pad the game download in the same way, which does cost the company itself a little bit of money.
Option B also becomes more and more absurd the more the game is patched. If a game has 20 patches, you have 20 binaries floating around. It's sorta nasty and it will bloat the size of a new purchase.
Even if you keep a backup, how exactly to load it while parsing the replay becomes a problem. You can use xdelta and reconstruct the old binary on the fly, which greatly reduces the size of the backups but increases the load time of the game, or you can keep a full game folder per version that you mount, which has the best performance but causes MASSIVE bloat. (I'm pretty sure that SC2 uses the xdelta method.)
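The xdelta idea can be illustrated with a toy byte-level delta (real xdelta uses the VCDIFF format and handles insertions and deletions; this sketch assumes equal-length binaries just to show the shape of it): ship only the newest binary plus small reverse patches, and rebuild an old binary at load time when an old replay comes in.

```python
def make_delta(new: bytes, old: bytes) -> list:
    """Record (offset, byte) pairs where old differs from new.
    Toy assumption: both binaries are the same length."""
    return [(i, old[i]) for i in range(len(new)) if old[i] != new[i]]

def apply_delta(new: bytes, delta: list) -> bytes:
    """Rebuild the old binary from the shipped one plus a small patch."""
    buf = bytearray(new)
    for offset, byte in delta:
        buf[offset] = byte
    return bytes(buf)
```

Since consecutive patches of the same game share most of their bytes, the deltas stay tiny compared to full binary copies; the cost is paying for reconstruction every time an old replay loads.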
Option C) make replays actual "movies". This bloats replays to ~25 MB, but it no longer requires keeping backups of anything. However, it makes sharing replays extremely difficult, so this option is infeasible.
Rewinding inside a replay is another thing that would take an enormous number of engineering hours to figure out; I imagine Blizzard has actually taken out a software patent on it, since I wouldn't even know where to start with it myself.
tl;dr Creating a replay system that supports prior versions is unwieldy and costs a large number of engineering hours, on top of incurring other costs or drawbacks. That's why very few companies do it, especially since a strong replay system very, very rarely drives new sales of games. There isn't any real money in it. SC is one of the true e-sports, so having a strong replay system does drive sales, which is why Blizzard invested in it. Money.
/thread.