But the length of time it takes a modern computer to count to 10 or 1,000 is perhaps inconceivably
small by your metric, no? Your criterion arbitrarily selects numbers around 2 billion as conceivable, at least for a single core on my MacBook.
But my question isn't what makes 10^4000 inconceivable -- my question is what makes 10^4000 any less conceivable than 1000. To me, they're both firmly in the realm of abstractions we can reason about using the same kinds of mathematical methods. They're both qualitatively different from numbers like 5 or 10, which are undoubtedly "conceivable".
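For what it's worth, the arithmetic behind that cutoff is easy to sketch. Assuming (generously) a single core does about 10^9 increments per second -- an assumption of mine, not a measurement -- counting to 2 billion takes a couple of seconds, while counting to 10^4000 dwarfs the age of the universe by thousands of orders of magnitude:

```python
from fractions import Fraction

RATE = 10**9  # assumed increments per second for one core (not measured)

def seconds_to_count(n):
    """Idealized time to count from 1 to n at RATE increments/sec."""
    return Fraction(n, RATE)

print(float(seconds_to_count(1_000)))      # 1e-06 -- a microsecond
print(float(seconds_to_count(2 * 10**9)))  # 2.0   -- about two seconds

# 10**4000 seconds overflows a float, so compare in exact integers instead.
AGE_OF_UNIVERSE_S = 4 * 10**17  # ~13.8 billion years, rounded up
ratio = 10**4000 // RATE // AGE_OF_UNIVERSE_S
print(len(str(ratio)))  # 3974 -- the *ratio* itself has ~3974 digits
```

So by the counting metric, 1000 and 10^4000 land on opposite sides of any practical line -- which is exactly why I think counting time is doing the work here, not some intrinsic notion of conceivability.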