Ah, this kind of makes sense, but then why wouldn't they map the old 100% to the new maximum?
I can understand the increased granularity: if you're making monitors that go much brighter, you get more light levels. But why define the old maximum to be less than the new maximum?
Yes, that's essentially how I understand it: hardware-wise, HDR is basically "a really bright screen" coupled with good black levels. To make sure the HDR effect (bright highlights above normal white) is still available at any brightness setting, you need to reserve some headroom above SDR white. It should also be noted that you probably couldn't sustain that peak brightness for long anyway due to thermal limits.
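For concreteness, here's a minimal Python sketch of that headroom idea. The 1000-nit panel is a hypothetical example; the 203-nit figure is the ITU-R BT.2408 convention for SDR reference white in HDR systems, and the gamma-2.2 transfer is a simplification:

```python
# Sketch: why SDR "100%" is mapped below the HDR peak.
# Assumes a 1000-nit HDR display (hypothetical) and BT.2408's
# placement of SDR reference white at 203 cd/m^2 (nits).

SDR_REFERENCE_WHITE = 203.0   # nits; BT.2408 reference/graphics white
HDR_PEAK = 1000.0             # nits; assumed panel peak for this example

def sdr_to_nits(signal: float, gamma: float = 2.2) -> float:
    """Map an SDR signal in [0, 1] to light output, topping out at reference white."""
    return SDR_REFERENCE_WHITE * (signal ** gamma)

headroom = HDR_PEAK / SDR_REFERENCE_WHITE
print(f"SDR 100% -> {sdr_to_nits(1.0):.0f} nits")
print(f"Headroom above SDR white: {headroom:.1f}x (up to {HDR_PEAK:.0f} nits)")
# The ~5x range above SDR white is what highlights (sun glints, fire,
# specular reflections) use. Mapping the old 100% to the panel's peak
# would leave no room for them, and everything would just look brighter.
```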