Good post. WWI was the final nail in the positivist coffin, but the ideas that permeated it (eugenics, which I'm not necessarily knocking at that time; the separation of the sexes and fear of sex; the classification of homosexuality as a medical illness; racial superiority; etc.) all went further than they should have because of the honestly held belief that the British Empire was proof that logic, reason and commerce would solve all social ills.

Positivism as social experimentation was creaky even before WWI. Gladstone's resignation and the contradictions over Ireland and Home Rule, the humiliation of the Boer War and the concentration-camp solution to it, wider public disquiet over colonialism (as seen in the likes of Heart of Darkness), and the internal disputes around social welfare and the rights of the working population that led to the birth of Labour all really torpedoed positivism. The simple fact is that the British Empire was not sustainable as anything other than an upper-class vehicle; as soon as it was made accountable to its British subjects, it fell to pieces. It is one of the few empires which nonetheless permitted that to occur.
I also think that, as with many empires and great nations, a final victory over an ancient and classically less democratic enemy (France) led to a general malaise and complacency. There was a lot of stupid (Crimea, the Middle East) and despicable (the Opium Wars) imperial misadventure once France was out of the picture, which basically came from being big enough to be in everyone's business without any justifiable reason to be involved.
Our foreign policy was to make the whole world England.