That is because that package internally steps down that voltage. Any LED that produces light in the visible range has a bandgap somewhere in the roughly 1.6 to 3.3 volt range. Feeding it a voltage much higher than that would lead to disastrous current runaway. The package that contains the LED, the current limiter, and whatever other electronics are in there might of course accept 24V DC, or 110V AC, or 230V AC, or 17.4V at 4kHz. But the question your parent comment was asking is: for which of those input choices is the conversion to the voltage and current the LED actually needs most efficient?
LEDs are usually current-controlled, not voltage-controlled.
You might have several LEDs in series, and together they can very well drop 24V or even much more while maintaining an operational amount of current. Eight white LEDs at roughly 3V each already add up to 24V.
I know. But at the maximum rated current the voltage drop across a single LED is only marginally higher than the bandgap, definitely not 24 volts. So the question becomes what you do with the excess voltage. If you are smart you don't just burn it in a resistor. And at that point the question is exactly what the OP asked: for which input voltage do you minimize the total conversion losses?
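To put rough numbers on the "don't burn it in a resistor" point, here is a back-of-the-envelope sketch. The 24V supply, 3V forward voltage, 20mA current, and 90% buck-converter efficiency are all illustrative assumptions, not values from any datasheet:

```python
# Compare how much input power actually reaches a single LED when the
# excess voltage is dropped in a resistor vs. a switching converter.

V_IN = 24.0    # supply voltage (V), assumed
V_LED = 3.0    # LED forward voltage (V), assumed
I_LED = 0.020  # target LED current (A), assumed

# Linear (resistor) limiting: the full LED current flows through the
# resistor, which burns the entire voltage difference off as heat.
p_led = V_LED * I_LED
p_resistor = (V_IN - V_LED) * I_LED
eff_linear = p_led / (p_led + p_resistor)   # = V_LED / V_IN = 12.5%

# Switching (buck) conversion: losses are set by converter efficiency,
# not by the size of the voltage step.
EFF_BUCK = 0.90  # assumed converter efficiency

print(f"resistor drop: {eff_linear:.1%} of input power reaches the LED")
print(f"buck converter: {EFF_BUCK:.1%} of input power reaches the LED")
```

Under these assumptions the resistor approach wastes nearly 90% of the input power, which is why the input voltage (and how far it has to be stepped down) matters for total efficiency.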