Scanners cannot look for every possible bug. They can, however, look for known bugs, and that is what most of them attempt to do.
In a nutshell: the scanner looks for known files (which is why you see a lot of 404s). If it finds one of these files, it may send a follow-up check (some query string or POST) to determine whether the vulnerable version is present. The scanner then tries to inject some JavaScript to test for XSS. If it sees the injected string in the body of the response, it may keep modifying the request until the JavaScript is actually valid, resulting in an alert window. If it doesn't see a response with the injected code, that test is complete and it moves on to the next one. Finally, the scanner may look for backup files by taking the list of pages from the crawled site and appending extensions like .bak, .orig, etc.
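Two of those checks are simple enough to sketch. Below is a hypothetical illustration (the function names and the exact suffix list are my own assumptions, not taken from any particular scanner) of generating backup-file candidates from crawled URLs and of the naive "is the injected marker reflected?" test:

```python
# Illustrative sketch only; real scanners use larger wordlists and
# smarter reflection analysis (context, encoding, etc.).

# ".bak" and ".orig" come from the description above; the rest are
# assumed common additions.
BACKUP_SUFFIXES = [".bak", ".orig", ".old", "~"]

def backup_candidates(url):
    """Return likely backup-file URLs for a crawled page."""
    return [url + suffix for suffix in BACKUP_SUFFIXES]

def is_reflected(marker, body):
    """True if the injected marker appears verbatim in the response body."""
    return marker in body

print(backup_candidates("http://example.com/index.php"))
marker = '<script>alert("probe-1234")</script>'
print(is_reflected(marker, "<html>" + marker + "</html>"))  # reflected
print(is_reflected(marker, "<html>filtered</html>"))        # not reflected
```

A verbatim match like this is only the first pass; as noted above, a scanner that sees the string reflected will then start mutating the request to see whether the script actually executes.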
If you were to run this scanner against a site that returns status code 200 for every request and echoes back the query string, I suspect you would see a far more extensive list of the checks it performs.
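Such a catch-all target is easy to stand up. Here is a minimal sketch using Python's standard-library `http.server`: every path returns 200 and the raw query string is echoed in the body. The ephemeral port, probe URL, and self-test at the bottom are all illustrative choices of mine:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Echo the raw query string back, regardless of the path requested.
        query = urlparse(self.path).query.encode()
        self.send_response(200)  # always 200, never 404
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(query)))
        self.end_headers()
        self.wfile.write(query)

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 lets the OS pick a free port; a daemon thread keeps it in background.
server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any path "exists", and whatever is injected comes straight back:
url = "http://127.0.0.1:%d/no/such/page.php?q=probe123" % server.server_address[1]
resp = urllib.request.urlopen(url)
echoed = resp.read().decode()
server.shutdown()
```

Pointed at a server like this, every known-file probe "succeeds" and every injection appears reflected, so the scanner keeps going and reveals much more of its test suite.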