When did the British abandon the desire for the 13 Colonies?

When exactly did the British realize that the colonies were truly gone for good? Did any harsh feelings last long after the War of 1812, or did that war put the mutual nastiness to rest?
 
I think they had already accepted they were gone by 1812.

But US-British feelings were rather chilly for other reasons.
 
Britain recognized that the Treaty of Paris had signed away any claim it might have had to the 13 Colonies in perpetuity, and by 1812 it had firmly come to terms with the loss of the colonies and was actually far happier with the arrangement it had. The United States of America had proven far more profitable to Britain as an independent nation and trading partner than it had ever been as a colony, and there was never any serious suggestion of re-establishing British rule over it.
 
Some of us still want it back as a colony or a Dominion.
:cool::D
God Help Us Noooooo........... !

If you look at the pamphlets and Parliamentary speeches from well before the insurrection, there was a noticeable argument that the American colonies gave Britain no return and cost a fortune, and that the best thing to do was cut them adrift.

IIRC, one (presumably tongue-in-cheek?) pamphlet suggested all the North American colonies should be sold to France in exchange for Calais, Ponthieu and Gascony. I'd sign up for that.
 