Whilst the political centre of gravity in America is far to the right of anything I would personally consider sensible, the Republicans generally sit on the left of it, historically being the party of civil rights, tax-and-spend interventionism, and the unions. The Democrats sit to the right of it, being the party of the bankers and big business, and, whilst the Republicans aren't beyond stooping to such levels, the Democrats are considerably more likely to resort to race-baiting, red-baiting, and pandering to the religious right.
Yet this wasn't always the case: historically both parties had right-wing conservative and left-wing progressive/populist wings, especially in the late 19th and early 20th centuries (it's hard to believe now that the Democrats were once the party of the New Deal). How would this divergence affect each party's approach to issues like states' rights, abortion, gay marriage, and so on?
When did the parties' respective drifts towards the left and the right become inevitable, and how could they have been reversed?