Yes, as pointed out many times, it's not really "computer science", and I somewhat regret the name. However, it came out of my being a teaching assistant for people doing computer science degrees. A surprising number of people got to third-year operating systems courses without realising things like 2^10 bytes being a kilobyte, 2^20 a megabyte, etc., let alone how a program is linked and loaded. I hope this is helpful; there are plenty of similar resources, but sometimes the way one person says something resonates more than another.
I deliberately wanted to avoid being x86-only, for similar illustrative reasons. Unfortunately Itanium proved a poor choice; ARM would have been better, but it gives me something to update if I get time! However, much of the basic content remains relevant many years on.
Regardless, this seems like a nice little document for those 20% of students who are still willing to read supplementary materials. Then again, this is short and clear enough that if it were properly chunked within a course and had graded assignments attached, many students would probably actually skim it.
If you ever want to develop this some more, I hope you'll consider something like Runestone[0], in order to give it some auto-graded questions along the way!
[0] https://runestone.academy/runestone/default/user/login?_next...
Starting with bits and bytes and hardware would be pretty goddamn boring, I think. You'd lose a lot of students that way.
That said, in the third year we had a course in analog electronics, which was basically transistors and logic gates.
Following that course was one in digital electronics, where we all built our own little toy 8-bit computer, wiring the CPU ourselves, writing the microcode ourselves. I'll never forget the a-ha moment when you realize that your instruction set is just binary patterns representing which wires to put a current on, which units to toggle on and off. The instruction to move a value from a register to an address in RAM has to look like this, because you need to toggle the read input on the correct register, and the write input on the RAM unit, and everything else has to be off. Blew my mind at the time.
The clock was manual if you wanted to, so you could step through and watch your little CPU run a program, or you could set it to like 1Hz and watch the thing go. And from there, you can sort of get how a modern computer works, it's just a matter of going from 1Hz to 1GHz, wider buses, wider instruction sets, but it's no longer "magic" how the CPU works, it's all ones and zeroes, for a reason, and you now know that reason.
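In the same spirit, here's a tiny Python sketch of that a-ha moment; all the line names and bit assignments are invented for illustration, not from any real machine, but it shows how an opcode can literally be "which wires are on":

```python
# Toy control word: each bit drives one control line (names and bits invented).
REG_A_READ = 0b0001  # put register A's value on the bus
RAM_WRITE  = 0b0010  # latch the bus into RAM at the address register
PC_INCR    = 0b0100  # advance the program counter
HALT       = 0b1000  # stop the clock

# "Move A to RAM" is nothing more than the right wires on, everything else off.
MOV_A_MEM = REG_A_READ | RAM_WRITE

def lines_on(control_word):
    """Return which control lines a given bit pattern turns on."""
    names = {REG_A_READ: "REG_A_READ", RAM_WRITE: "RAM_WRITE",
             PC_INCR: "PC_INCR", HALT: "HALT"}
    return [name for bit, name in names.items() if control_word & bit]

print(lines_on(MOV_A_MEM))  # ['REG_A_READ', 'RAM_WRITE']
```

Real microcoded machines do exactly this, just with far more control lines and a decoder ROM in between.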
A smaller proportion of students approach courses from the bottom-up (10-20%?) - they tend to start with the basic pieces and glue them together over time to understand larger concepts.
I’ve read about this phenomenon (it is extensively documented in pedagogic literature) and seen it in action (I taught CS for about ten years) but still could not explain to you why that split occurs.
I would speculate that it is a difference in type-1 vs type-2 reasoning based on the level of familiarity and comfort with the prerequisites to each course, but even that guess is heavily biased by studying constructivism in CS pedagogy.
I think that having students take 5-6 classes together in 16 weeks doesn't promote mastery in any of those classes. Tying performance in those classes to scholarship eligibility and job placement incentivises grades, not necessarily understanding. Grades and mastery can be separated because the 2-3 exams that determine the majority of a grade reward students the most, on a time-investment vs. performance basis, for understanding technicalities in the grading system and for hyper-focusing on the types of problems that can appear on an exam. That doesn't promote mastery; it's a game academia and students play for the satisfaction of government, finance, and corporations. It's definitely a problem. Solutions could include more frequent sampling of understanding, more diverse ways of measuring knowledge, decoupling performance from financing, and longer periods to learn topics.
People remember what they are required to recall (citation needed). Classes can only test for so many things, so people are going to remember what they needed for assignments or tests. That feels like one of the reasons apprenticeships are hailed by some as useful: they "test" what is required in real-world application. I've never needed to know that 2^10 bytes is a kilobyte as a developer building websites, but is that really surprising? There are many things I've needed outside of school that were never taught.
As long as CS is used as the path to software development, it will be a balance between theory and application.
I believe he didn't understand what computer science actually is, despite having a degree in it. I'd say that's because of the educational system, where one can attain a degree without actually qualifying in it.
In underdeveloped and developing economies, if only qualified people got their degrees, then only a fraction would get jobs, which would be very bad for those economies; so bad educational systems are, in a sense, by design.
[1] https://twitter.com/heavyinfo/status/1209330850363404288
If they understood a little more about how their program was built and run ... from the bottom up, as it were :) ... they would have had less pain.
Of course, it just shows, as usual, that the hardest problem in computer science is indeed naming!
Edit: BTW you're right, "Code" by Petzold should be required reading http://www.charlespetzold.com/code/index.html
I have been considering some parts on containers and virtualisation, which aren't covered at all.
The Itanium and PowerPC examples probably haven't aged well. There is no question that Itanium is a very interesting architecture with many interesting features, but now that it is dead, reading about it is like deciphering hieroglyphs. I think I have to update these to ARM, or maybe even RISC-V, to be more relevant going forward.
So that would be my plans, as they were :)
https://www.bottomupcs.com/chapter01.xhtml
Section on "Masking"
How do you get 0x09 or 0x90?
I get how using <memory> & <mask> = <extracted data>...
so:
10100101
&
11110000
=
10100000
But I have yet to see how this is 0x90 or 0x09; perhaps I'm misunderstanding. I'm trying to understand the 'shift' piece of it.
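Working it through in Python: with the byte above (1010 0101 = 0xA5), the mask gives 0xA0, not 0x90. You'd get 0x90 from a byte whose high nibble is 9 (0x95 below is just an assumed example value, not necessarily the book's); 0x09 is then what the shift produces:

```python
# The byte from the example above: 1010 0101 = 0xA5
assert 0xA5 & 0xF0 == 0xA0       # mask alone: 1010 0000, i.e. 0xA0 (not 0x90)

# To see 0x90 and 0x09, start from a byte whose high nibble is 9,
# e.g. the assumed value 0x95 = 1001 0101:
value   = 0x95
masked  = value & 0xF0           # 1001 0000 -> 0x90: high nibble, still "in place"
shifted = masked >> 4            # 0000 1001 -> 0x09: moved down to the low bits

print(hex(masked), hex(shifted))
```

So masking picks the bits out, and shifting moves them down so the nibble reads as its own number.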
> Reordering
> This bit is crap
The CS curriculum probably made more sense back in the day, when everyone was essentially an embedded developer. But nowadays, the most useful knowledge I have is the low-level mechanics of how things like the OS and networking protocols work. S/W eng. classes are a bit useful, but mostly knowing how to write C++, Java, and now Python has gotten me most of the way. As it is, I have almost never run into a situation where most of my CS classes have been relevant, and where they are relevant, the material could be covered by a week-long course in the basics.
I feel the CS curriculum would serve students much better if it covered more of the knowledge of how to get things done. And not in a faddish, framework-du-jour manner; there are constant elements underlying all the fads that a good developer should learn cold, and they are not covered very well, at least in my 8 years in CS academia.
IMHO the real problem with CS is that it's driven by AI envy, and much of what is considered important only makes sense in light of the assumption the human mind is basically a computer, and CS is all about how to recreate a human mind. However, almost none of that line of thought matters in the real world, and is most likely false.
There is no 'the' CS curriculum. You must be thinking of one particular school's CS curriculum, like maybe your own? Another school's CS curriculum is going to be wildly different.
A curriculum, OTOH, that prepares a student to be a good developer should also have a heavy emphasis on:
- writing a lot of code with a focus on good coding practice, preferably in a combination of Python and Java or C++
- understanding Linux and being proficient with command line tools
Personally, going through OP's site, I was nodding my head and comparing it to what I learned in my undergraduate CS degree. Some of it is dated, but most of it connects to the Architecture and Operating System classes I took.
[0] https://www.acm.org/binaries/content/assets/education/cs2013...
It is much easier to teach an engineer to make good software than to teach a CSist to do engineering. I have seen the latter happen, but the usual results are... well, we see that every day.
A complement, not really an alternative, to this pdf.
(I fumble fingered this the first time and left out the link! Unfortunately I can't delete the confusing comment but fortunately it is being downvoted away)
"This site supports a course and a textbook that guide students and self-learners through the construction of a modern, full-scale computer system - hardware and software - from the ground up."
Might want to proofread some of this and remove the ... proof-reading notes? Not sure what you thought was crap about this section?
Over time I have become a bit more realistic :) I've stopped worrying about whether it's better or worse than anything else. As long as it's factually correct, I think a rising tide floats all boats, so the more we all write and read others' work, the better off we'll be.
I stopped reading at some point because there were so many errors, unfinished bits, and just flat-out garbage.
Sorry to the author, but this needs to improve. It's a good start, but you need to invite fixes and implement them: put your email address on every single page.
This could be an awesome resource, but right now it's too full of errors to be useful.
Nobody should be recommending this. Nobody who had actually reviewed it all would recommend it.
Thanks for sharing