AHC: Have Religion Remain a Large Part of Western Society

Whilst religion in the West today remains a sizeable force for influencing the world at large, it can be argued that it has very much taken a back seat as a key driving force for governance and policy in most Western societies, where it was once so overwhelmingly dominant; unlike parts of the Middle East, rural Africa, etc., where this is still the case. As such, is there any way religion (well, primarily Christianity) could have remained the dominant force in government (and in people's personal lives) that it had been throughout most of its history, or was it inevitable that its influence would slowly erode over time?