Parallelism giveth, and concurrency taketh away.
Concurrency, as the term is used by people who think hard about these things, is about what you do to keep a system coherent and balanced while parallel activities are going on: the overhead of coordinating parallel activity that keeps you from getting an ideal Nx speedup. That covers work you wouldn't need to do at all with one thread, and also the stalls you eat while coordinating access to shared resources.
It doesn't include useful work that isn't done in parallel because it's inherently serial, or work that gets thrown away because it was needlessly done more than once. All of that is the domain of Amdahl's law.