What sickens me is that despite all that has happened in the last hundred years, humanity (as it has done for thousands of years) has learnt nothing.
There are still those who want war (Europe knows all too well the horror of that). The empires of the past and the plentiful negative aspects of what they represented, whether the Soviets, the British or earlier imperial powers, do nothing to curb the imperialist ambitions of the US (and let's face it, at the moment they are pursuing what I like to call 'economic imperialism'). Poverty and disease are far from being addressed - look at sub-Saharan Africa, and while we're on the subject, where are the Western governments' soldiers when ethnic cleansing is going on? Oh, they have no oil, so they don't matter - these people's deaths mean nothing.
I just don't understand it. We should be building bridges - we should be reaching out to other nations (which is why I am a firm believer in the spirit of the EU, if perhaps not its machinations), not causing resentment. As I've no doubt said before, I feel genuinely ashamed to be British sometimes - I feel guilty by association for what has happened in Iraq. I love my country as well; I'm patriotic, I'm no leftist apologist, but I honestly thought that in this day and age we would be looking forward to an era of increased cooperation, understanding and internationalism. Instead it seems as if we are about to re-enter the dark ages...
It's sad, really sad. Whatever happened to diplomacy?
One might think of the British diplomatic service as being full of well-connected people, the sort one might expect to lean right-wing, yet even they are in total disbelief at the foreign policy pursued by our government.
What must they think of the foreign policy of the US, then? Surely we all know by now that to get what you want in life - and let's face it, that's what diplomacy IS - the best way is to earn people's trust and their respect. Have the US or the UK done either of those things recently?