> You want to have a model that basically reads old data, but that very aggressively approaches "new data only" in order to avoid the situation where you have basically the exact same tree state, just _represented_ differently.
> That way everything "converges" towards the new format: the only way you can stay on the old format is if you only have old-format objects, and once you have a new-format object all your objects are going to be new format - except for the history.
As soon as there is one new-hash commit in a repo, all users of it will have to upgrade their git client - and that git client will (probably?) default to writing new-hash commits.
    $ sudo $PKG_MGR upgrade git

First, I suspect a lot of the tools you mentioned already treat hashes as strings, not as 160-bit numeric types. The entire front-end JS for GitHub, for example, just uses strings. That's what I'd do if I were writing IDE integrations and such, too.
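For illustration, here's a small TypeScript sketch of what "hashes as opaque strings" looks like in tooling code. The `Commit` interface and `commitLink` helper are hypothetical, not anything from GitHub's or any IDE's actual code:

```typescript
// Hypothetical tooling code: the commit id is just an opaque string.
// Nothing here assumes 40 hex characters, so a longer new-format id
// would flow through unchanged.
interface Commit {
  id: string;       // e.g. "e83c5163..." today, or a longer id later
  message: string;
}

// Display helper: abbreviate for the UI, link with the full string.
function commitLink(commit: Commit): string {
  const short = commit.id.slice(0, 7);
  return `<a href="/commit/${commit.id}">${short}</a>`;
}

// Example usage with a 40-character SHA-1-era id.
console.log(commitLink({
  id: "e83c5163316f89bfbde7d9ab23ca2e25604af290",
  message: "example commit",
}));
```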
Secondly, the new format will likely still be a 160-bit numeric type, just calculated using a different hash algorithm (e.g. it might be the first 160 bits of the SHA256 result). The tools you mentioned likely don't have to calculate said hashes, they just display them. The entire GitHub front-end, for instance, just displays whatever is given to it; commit hashes are input data to it, not output data.
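To make the "first 160 bits of the SHA-256 result" idea concrete, here's a hedged Node/TypeScript sketch. `truncatedObjectId` is a hypothetical helper and truncation is only one possible scheme, not a settled design:

```typescript
import { createHash } from "node:crypto";

// Assumed scheme, purely for illustration: hash the object data with
// SHA-256, then keep only the first 160 bits (20 bytes / 40 hex chars)
// so the id has the same width as a SHA-1 id.
function truncatedObjectId(data: Buffer): string {
  const sha256 = createHash("sha256").update(data).digest(); // 32 bytes
  return sha256.subarray(0, 20).toString("hex");             // first 160 bits
}

// Produces an id with the same 40-hex-character shape as today's ids.
console.log(truncatedObjectId(Buffer.from("blob 5\0hello")));
```

Under that kind of scheme, a tool that only displays ids it receives as input never needs to change at all; only the software that actually computes object ids (git itself, or libraries like libgit2) has to know which algorithm produced them.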