Reiterating other comments, terrorists can't make bioweapons because they lack the facilities and prerequisites, not because they're incompetent.
Either the LLM is useful, in which case it could be useful to a terrorist, or it's useless, in which case you won't mind if access is restricted.
Note: I'm not saying it will definitely be useful to a terrorist. I'm saying that companies have an obligation to show in advance that their open source LLM can't help a terrorist, before releasing it.
No it isn't. That's like saying "you can walk" is an argument against cars.
If LLMs are set to revolutionize industry after industry, why not the terrorism industry? Someone should be thinking about this beyond just "I don't see how LLMs would help a terrorist after 60 seconds of thought". Perhaps the overall cost/benefit is such that LLMs should still be open-source, similar to how we don't restrict cars -- my point is that it should be an informed decision.
And we should also recognize that it's really hard to have this discussion in public. The best way to argue that LLMs could be used by terrorists is for me to give details of particular schemes for doing terrorism with LLMs, and I don't care to publish such schemes.
[BTW, my basic mental model here is that terrorists are often not all that educated and we are terrifically lucky for that. I'm in favor of breakthrough tutoring technology in general, just not for bioweapons-adjacent knowledge. And I think bioweapons have much stronger potential for an outlier terrorist attack compared with cars.]
Facilities are a major hurdle for nuclear weapons. For bioweapons they are much less of a problem. The main constraint is competency.
> The main constraint is competency.
Oh right, anyone can be a chemist, it requires no skill; that's why labs aren't a core part of the coursework.
AI researchers are really good at telling people in other fields, fields they have no experience in, that their work is easy.