It's repeating patterns the trained model picked up from training data where similar instructions appeared, instructions that were, broadly, about reversing strings.
If the author had adjusted the temperature and retried the failing prompt enough times, or simply reworded it slightly, they might well have gotten the correct answer.
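To sketch why temperature matters here: sampling temperature rescales the model's next-token logits before they become probabilities, so a higher temperature gives lower-ranked (possibly correct) completions a real chance of being sampled on a retry. The logit values and candidate strings below are invented purely for illustration:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw logits to probabilities, scaled by temperature.

    Lower temperature sharpens the distribution (near-deterministic);
    higher temperature flattens it (more varied samples)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy scenario: the model slightly prefers a wrong reversal of "string".
candidates = ["strnig (wrong)", "gnirts (correct)"]
logits = [2.0, 1.5]  # hypothetical values, chosen for illustration

for t in (0.2, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    pairs = ", ".join(f"{c}={p:.2f}" for c, p in zip(candidates, probs))
    print(f"T={t}: {pairs}")
```

At a low temperature the wrong answer dominates almost every sample, while at a higher temperature the correct answer is drawn a substantial fraction of the time, so enough retries would eventually surface it.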