> The C language is too complicated and too flexible to allow that.
I disagree. In fact, I would expect the following could be a pretty reasonable exercise in a book like "Software Tools"[1]: "Write a program to extract all the function declarations from a C header file that does not contain any macro-preprocessor directives." This requires writing a full C lexer; a parser for function declarations (but for function and struct bodies you can do simple brace-matching); and nothing else. To make this tool useful in production, you must either write a full C preprocessor, or else use a pipeline to compose your tool with `cpp` or `gcc -E`. Which is the better choice?
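To make the claim concrete, here's a minimal sketch of such a tool in Python (not Ratfor or C, for brevity). It assumes its input has already been run through `cpp`, so no preprocessor directives remain; it strips comments and literals, brace-matches past function and struct bodies rather than parsing them, and treats any top-level statement containing parentheses and ending in `;` as a function declaration. That last heuristic is a deliberate simplification: a real version of the exercise needs a proper C lexer, and this sketch will, for example, misclassify a `typedef` of a function-pointer type as a declaration.

```python
import re

def strip_comments_and_strings(src):
    # Blank out /*...*/ and // comments, and replace string/char literals
    # with empty strings, so braces and semicolons inside them can't
    # confuse the top-level scanner below.
    pattern = re.compile(
        r'/\*.*?\*/|//[^\n]*|"(?:\\.|[^"\\])*"|\'(?:\\.|[^\'\\])*\'',
        re.DOTALL)
    return pattern.sub(
        lambda m: ' ' if m.group(0)[0] == '/' else '""', src)

def extract_declarations(header_text):
    """Extract top-level function declarations from preprocessed C.

    Bodies of functions and structs are skipped by simple brace-matching,
    exactly as the exercise allows; they are never parsed.
    """
    text = strip_comments_and_strings(header_text)
    decls, chunk, depth = [], [], 0
    for ch in text:
        if ch == '{':
            depth += 1
        elif ch == '}':
            depth -= 1
            if depth == 0:
                # A top-level body just closed (a definition or a
                # struct/union/enum body); discard the accumulated text.
                chunk = []
        elif depth == 0:
            if ch == ';':
                candidate = ' '.join(''.join(chunk).split())
                chunk = []
                # Heuristic: a top-level statement with a parameter list
                # is taken to be a function declaration.
                if '(' in candidate and ')' in candidate:
                    decls.append(candidate + ';')
            else:
                chunk.append(ch)
    return decls
```

Fewer than fifty lines, and nothing in it is specific to any one header: piping `gcc -E` output into it is exactly the composition-of-tools approach the book teaches. For instance, `extract_declarations("int add(int a, int b);\nstruct p { int x; };\nextern int max(int, int);\n")` returns the two function declarations and skips the struct.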
However, I do see that the actual "Software Tools" book doesn't get as advanced as lexing/parsing; it goes only as far as the tools we today call `grep` and `sed`.
I certainly agree that doing the same for C++ would require a full-blown compiler, because of context-dependent constructs like `decltype(arbitrary-expression)::x < y > (z)`; but there's nothing like that in K&R-era C, or even in C89.
No, I think the only reason such a declaration-extracting tool wasn't disseminated widely at the time (say, the mid-to-late 1970s) is that the cost-benefit tradeoff wouldn't have been seen as favorable. It would automate only half the task of writing a header file: the other, and more difficult, half is writing the accompanying code comments, which cannot be automated. Also, programmers of that era may have been more likely to start with the header file (the interface and documentation), and proceed to the implementation only afterward.
[1] - K&P's "Software Tools" was originally published in 1976, with exercises in Ratfor. "Software Tools in Pascal" (1981) is here: https://archive.org/details/softwaretoolsinp00kern/