Time and solution are usually abstract concepts here.
The time a Turing machine takes to solve a problem can be considered local, and in the absence of general relativity it doesn't matter much.
When you talk about time in P/NP, you usually talk about big-O notation (Big O and friends), which gives you the dominant factor of the time needed to solve the problem.
I.e., a running time of f(x) = 2 + 3x + 9x^2 + 9^x is O(9^x), because the exponential term grows much faster than the others. In reality, of course, it can take a while for the biggest term to overtake the rest, which is why linearly searching a small array that fits in CPU cache can be faster than a hashmap lookup.
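A quick sketch of that dominance: for small inputs the lower-order terms still carry real weight, but the exponential term soon accounts for essentially all of the total. (`dominant_share` is a made-up helper for illustration, not standard terminology.)

```python
# How much of f(x) = 2 + 3x + 9x^2 + 9^x comes from the
# exponential term alone? That fraction tends to 1 as x grows,
# which is why we summarize f as O(9^x).

def dominant_share(x):
    total = 2 + 3 * x + 9 * x**2 + 9**x
    return 9**x / total

print(dominant_share(1))   # lower-order terms still matter here
print(dominant_share(10))  # the exponential term is essentially everything
```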
The solution in P/NP is usually a decision: a predicate proven true, a boolean answer, or something that can be reduced to one. (E.g., solving a jigsaw puzzle can be reduced to "is this jigsaw puzzle solvable?", which can be proven by presenting a solution.)
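That "present a solution" idea is exactly what makes a problem NP: even if finding an answer is hard, checking a proposed answer (a certificate) is fast. A hedged sketch using SUBSET-SUM as the decision problem (the function and its name are mine, not from any library):

```python
# NP-style verifier for SUBSET-SUM: "does some subset of nums sum
# to target?" Finding a subset may take exponential time, but
# verifying a proposed certificate takes only polynomial time.

def verify_subset_sum(nums, target, certificate):
    """Return True iff `certificate` is a sub-multiset of `nums`
    whose elements sum to `target`."""
    pool = list(nums)
    for value in certificate:
        if value not in pool:
            return False          # certificate uses a number we don't have
        pool.remove(value)        # consume it so duplicates are counted
    return sum(certificate) == target

print(verify_subset_sum([3, 9, 8, 4], 12, [8, 4]))  # valid witness
print(verify_subset_sum([3, 9, 8, 4], 12, [5, 7]))  # bogus witness
```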
The problem you present is not in the scope of P/NP: you're already involving multiple computers, and here too people generally assume general relativity doesn't exist. Besides, the NP classification still holds, because one computer simply accessed a black-box function (an oracle) while the other actually solved the problem in NP.
Time complexity isn't the only complexity either; there is also space complexity.
Space complexity tells you how much memory you need to solve a problem, or rather, how quickly that memory need grows with the input. Time and space can often be traded against each other: every problem in NP can be solved with polynomial space (NP is contained in PSPACE), and spending more memory (e.g. on tables of intermediate results) can buy a dramatically faster running time for many problems.
Space complexity won't spit out a byte-accurate estimate of memory, but it can tell you that f(x) = 100 + 100x grows linearly in memory usage (O(x)), while f(x) = 100 + x^2 grows quadratically (O(x^2)).
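A small sketch of trading memory for time, again with SUBSET-SUM (both functions are illustrative names of mine): brute force enumerates all 2^n subsets using only O(n) extra space, while a dynamic-programming table spends O(target) memory to bring the running time down to O(n * target).

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    # O(2^n) time, O(n) extra space: try every subset.
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

def subset_sum_dp(nums, target):
    # O(n * target) time, O(target) space:
    # reachable[s] is True iff some subset sums to s.
    reachable = [True] + [False] * target
    for x in nums:
        for s in range(target, x - 1, -1):  # backwards so each x is used once
            reachable[s] = reachable[s] or reachable[s - x]
    return reachable[target]

print(subset_sum_bruteforce([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)
print(subset_sum_dp([3, 34, 4, 12, 5, 2], 9))          # True (4 + 5)
```

Note the DP version is only pseudo-polynomial (its cost depends on the numeric value of `target`), so it doesn't put SUBSET-SUM in P; it just shows how extra memory can cut time.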