Luddite, but mixed with "Eliezer Yudkowsky", a researcher working on the problem of friendly AI (or whatever they're calling it these days). Basically trying to prevent Skynet.
The GP is saying that once we have AGI, "AGI is going to make the human race irrelevant" outweighs "AGI makes software devs irrelevant".