Perhaps if the various Christian churches intervene politically in the American War of Independence in a ham-fisted way? For example, fearing the precedent of a European empire losing territory to 'natives', the churches preach that independence is fundamentally immoral and try to persuade Americans to remain loyal to Britain.
While I think it is more likely the Americans would simply establish their own churches, if the loyalist arguments are consistently grounded in theology and the Bible, you might shift enough minds toward the conclusion that 'religion' itself is the problem. Americans might still call themselves Christian and believe in the basic articles of faith, but they no longer congregate or build churches. 'God, without religion' becomes the rallying cry - in other words, their distrust of big government is twinned with a distrust of big religion. Slowly, over several generations, a majority of Americans simply stop caring enough about religion to be called Christian. They don't necessarily describe themselves as Deists, but in effect their mindset reflects that school of thought.