The overlap between the set of "contacts in your phone" and the set of "people with Wikipedia bios" is likely rather small. That's why I think it's a faulty premise to complain about Siri defaulting to the contact card when those two sets do intersect.
If I'm asking the question - maybe a friend is nearby who doesn't know them and I'm too lazy to explain - then I want to know who they are, not their contact details.
Even simple questions like "Who is..." have many different interpretations. A human will understand the context; an AI won't, because the context can't be derived from the words themselves. It's a function of social setting, physical setting, relationship, previous conversations, and so on.
At the moment conversational interfaces are more like a Bash shell with a speech recogniser on the front. The shell needs a precisely formed command and has almost no concept of state or context at all. (I think Siri actually has some, but not much.)
So it's completely unrealistic to expect CIs to be able to do this today. It will only become possible when NLP gets a whole lot more sophisticated and starts tracking context and state - and even then it will be a hard problem, because social state is defined as much by location, physical surroundings, time of day, and custom as by the words being used.
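To make the contrast concrete, here's a toy sketch of the two approaches. Everything in it is invented for illustration (the handlers, the context fields, the names); it's nowhere near how Siri actually works, just the shape of the difference:

```python
# Shell-style: stateless, one fixed interpretation per command pattern.
def shell_style(utterance: str) -> str:
    if utterance.startswith("who is "):
        return f"contact_card({utterance[len('who is '):]})"
    return "error: unrecognised command"

# Context-aware: the same words resolve differently depending on state.
# A dict lookup is a massive simplification of real context tracking.
def context_aware(utterance: str, context: dict) -> str:
    name = utterance.removeprefix("who is ").rstrip("?")
    in_contacts = name in context.get("contacts", [])
    if in_contacts and context.get("companion_present"):
        # A friend standing next to you wants background, not a phone number.
        return f"bio({name})"
    if in_contacts:
        return f"contact_card({name})"
    return f"bio({name})"

print(shell_style("who is alice"))  # contact_card(alice)
print(context_aware("who is alice",
                    {"contacts": ["alice"], "companion_present": True}))  # bio(alice)
```

The point of the sketch is that the hard part isn't the branching - it's filling in that `context` dict reliably from the real world.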
A contact card could definitely serve that purpose. If one exists, Siri should present its info and then wait to see whether the user also wants external information (from Wikipedia or elsewhere).
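That fallback order is simple enough to sketch. The lookup data and the `ask_followup` hook below are hypothetical stand-ins, not any real assistant API:

```python
# Answer "who is X": contact card first if one exists, then offer
# external info; otherwise go straight to the external source.
def answer_who_is(name, contacts, ask_followup):
    if name in contacts:
        yield contacts[name]                       # contact card first
        if ask_followup("Want more about them?"):  # then offer external info
            yield f"wikipedia_summary({name})"
    else:
        yield f"wikipedia_summary({name})"

replies = list(answer_who_is("Ada", {"Ada": "Ada: mobile 555-0100"},
                             ask_followup=lambda q: True))
print(replies)  # ['Ada: mobile 555-0100', 'wikipedia_summary(Ada)']
```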