So, the excerpt here is chapter 3, section 1. The actual definition happens in chapter 3, section 5 ("What Is a Random Sequence?"). I have the book at home, though, so I can't quote it here. Sorry. But the intuition is: if you have an infinite sequence of random numbers, then the numbers in all infinite subsequences should be equidistributed. So, like, if you have a stream of random 0's and 1's, then if you pick only every other number, the 0's and 1's still have to be equiprobable, and if you pick every third number, they still have to be equiprobable, and so on. This is slightly wrong, but it's on the right track to the actual definition.
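To make that intuition concrete, here's a small Python sketch (mine, not Knuth's) that draws a long stream of random bits and checks that the fraction of 1's stays near 1/2 on a few arithmetic subsequences. It only illustrates the intuition, not the actual definition:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
stream = [random.randint(0, 1) for _ in range(100_000)]

# Take every 1st, 2nd, and 3rd bit; each subsequence should still
# have roughly equal numbers of 0's and 1's.
for step in (1, 2, 3):
    sub = stream[::step]
    frac_ones = sum(sub) / len(sub)
    print(f"every {step}th bit: fraction of 1's = {frac_ones:.3f}")
```

Of course, passing a handful of checks like this is far weaker than the real definition, which quantifies over all "reasonable" subsequences, not just a few.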
Re: pseudorandomness, the point of it is the following.
1. A lot of algorithms use randomness to make pathologically bad cases extremely unlikely. For example, choosing a random pivot in quicksort makes the worst case very unlikely.
2. But in a lot of cases, this leads to huge space consumption. For example, most frequency-moment estimation algorithms involve a matrix of random numbers. So if you're getting those numbers from a "truly random" source, you have to store the entire matrix, which can be huge.
3. So, a better solution is to use a pseudorandom number generator! That way you can store a seed of s bits and do something clever, like deterministically re-generating the matrix entries as you need them, rather than storing the matrix outright.
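Point 3 can be sketched in a few lines of Python. This is a toy illustration, not any particular algorithm from the literature: instead of storing an n-by-m matrix of random numbers, store only a small integer seed and re-derive any entry on demand by seeding a PRNG with a value that mixes the seed with the entry's coordinates (the mixing function here is an arbitrary choice for the sketch):

```python
import random

def matrix_entry(seed: int, i: int, j: int) -> float:
    """Deterministically re-generate entry (i, j) of a 'virtual' random
    matrix from a single stored seed. Same inputs -> same output."""
    # Combine seed and coordinates into one integer; any deterministic
    # mixing works for a sketch like this.
    per_entry_seed = (seed << 32) ^ (i << 16) ^ j
    rng = random.Random(per_entry_seed)
    return rng.random()

SEED = 12345  # the only thing we actually have to store

# The "same" matrix entry comes back every time we ask for it,
# and different entries come from independently-seeded draws.
a = matrix_entry(SEED, 7, 3)
b = matrix_entry(SEED, 7, 3)
assert a == b
```

The trade-off is classic time vs. space: you pay a PRNG call per lookup, but the storage drops from the whole matrix to just the seed.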
Notice, though, that this is not independent of the notion of randomness! In fact, the two are quite intimately tied together.