You know, I might be misreading this, but I think you mean that Western society is, by contrast, Christian.
The West has made a big to-do about being secular at the national level, regardless of what the population believes or doesn't believe. In some countries, the people are also largely non-religious. There's a culture war, check, but to say that a nation's religion plays any role in whether it belongs to the West is, I dunno, dangerous?
I don't disagree that in the past cultural lines were drawn along religion, but surely we've come some way since then? Maybe not if you look at current wars, but still: is that really, really what we think the West means? An alliance of Christian nations?
'Cause that, like, sucks.