With the two's complement convention, the concept of 'signedness' only matters when a narrow integer value needs to be extended to a wider one (e.g. 8-bit to 16-bit): specifically, whether the new bits need to be replicated from the narrow value's topmost bit (sign extension) or set to zero (zero extension).
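Widening the same 8-bit pattern both ways makes the difference concrete. A minimal C sketch (variable names are illustrative):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t bits = 0xF0;  /* bit pattern 1111 0000 */

    /* Zero extension: the eight new high bits are set to 0. */
    uint16_t zext = (uint16_t)bits;        /* 0x00F0 = 240 */

    /* Sign extension: the eight new high bits replicate bit 7. */
    int16_t sext = (int16_t)(int8_t)bits;  /* 0xFFF0 = -16 */

    printf("zero-extended: 0x%04X (%u)\n", (unsigned)zext, (unsigned)zext);
    printf("sign-extended: 0x%04X (%d)\n", (unsigned)(uint16_t)sext, (int)sext);
    return 0;
}
```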
It would be interesting to speculate what a high-level language would look like with such sign-agnostic "Schroedinger's integer" types.
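One could imagine something along these lines (a hypothetical sketch in C rather than an actual new language, with made-up names): the value itself carries no signedness, and the sign question is only "asked" at the moment of widening.

```c
#include <stdint.h>

/* Hypothetical sign-agnostic 16-bit type: just bits, no signedness. */
typedef struct { uint16_t bits; } word16;

/* Signedness is decided only when widening to 32 bits: the caller
   chooses top-bit replication or zero fill (two's complement assumed). */
static inline int32_t  widen_signed(word16 w)   { return (int16_t)w.bits; }
static inline uint32_t widen_unsigned(word16 w) { return w.bits; }
```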