Perhaps surprisingly, most people don't have strong feelings either way, and a good number of researchers suspect the task is actually efficiently computable (i.e., solvable in polynomial time). At least from the standpoint of complexity theory there are no really compelling reasons to think factoring is hard, beyond the fact that we haven't discovered a fast algorithm for it yet.
Discussion of why some researchers believe factoring could be easy can be found at [2].
[1] https://en.wikipedia.org/wiki/AKS_primality_test
[2] https://mathoverflow.net/questions/79366/evidence-for-intege...
The convention is to let N be the number to be factored, in which case log_2(N) is the input length. Hence, an algorithm that runs in time "polynomial (in the input length)" would have to run in time O(log(N)^k) for some constant k.
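To make the distinction concrete, here is a minimal sketch (not from the paper) contrasting the input length with the value of N. Trial division takes roughly sqrt(N) = 2^(log_2(N)/2) steps, which is exponential in the input length even though it looks tame as a function of N:

```python
def trial_division(N: int) -> int:
    """Return the smallest nontrivial factor of N, or N itself if N is prime."""
    d = 2
    while d * d <= N:      # ~sqrt(N) iterations: exponential in N.bit_length()
        if N % d == 0:
            return d
        d += 1
    return N

N = 2**31 - 1
print(N.bit_length())             # 31: the input length is log_2(N), not N
print(trial_division(3 * 5 * 7))  # 3
```

A "polynomial-time" factoring algorithm would need a running time bounded by a power of `N.bit_length()`, not a power of N.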
However, those algorithms are _probabilistic_, not _deterministic_, which is what sets this linked method apart.
This algorithm, much like Pollard-Strassen before it, is pretty much irrelevant to integer factorization in practice. But it is nevertheless remarkable to see some progress on deterministic factorization after such a long time.
This is an extremely well written paper. A motivated high-school student could understand all of it in about a week of work.
The key idea here is in Lemma 3.3, which is a simplified presentation of Lehman's original idea. All of the "good" algorithms in this class exploit this observation.
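For readers who want a starting point: Lehman's method builds on Fermat's classical difference-of-squares trick (this sketch is my own illustration of that ancestor idea, not the refined statement in Lemma 3.3). If N = a^2 - b^2 then N = (a - b)(a + b), so one searches a upward from ceil(sqrt(N)) until a^2 - N is a perfect square:

```python
import math

def fermat_factor(N: int) -> tuple[int, int]:
    """Fermat's difference-of-squares factorization for odd composite N."""
    a = math.isqrt(N)
    if a * a < N:
        a += 1                    # start at ceil(sqrt(N))
    while True:
        b2 = a * a - N
        b = math.isqrt(b2)
        if b * b == b2:           # a^2 - N is a perfect square
            return (a - b, a + b)
        a += 1

print(fermat_factor(5959))  # (59, 101)
```

This is fast only when the two factors are close to sqrt(N); Lehman's contribution was a way to engineer that closeness by multiplying N by small cofactors, and the lemma distills that.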
Hittmeir adds a new ingredient, namely a "baby-step giant-step" search, and Harvey makes slightly better parameter choices and strategy than Hittmeir to get his result.
For me, the new idea and the main value here lie in the concise explanation in Lemma 3.3.
A lot of the feedback is of the form, "who cares, since it doesn't push the state of the art in practical factoring", but for me, there is art in ideas like this. Lehman, Hittmeir and Harvey have put forth a really elegant and beautiful line of thought here---that is accessible!---and I hope you can enjoy it without too much worry about how much it is worth.
So most likely, no. This is primarily of theoretical importance (it is now the fastest deterministic factoring algorithm with a proven complexity bound), but it is not an immediate performance gain over existing algorithms.
[1] https://en.wikipedia.org/wiki/General_number_field_sieve
[2] https://en.wikipedia.org/wiki/Lenstra_elliptic-curve_factori...