As I was trying to fall asleep last night I realized that this forum would be a great way to reach out and talk to some fine folks that I would otherwise never have the chance to converse with...
So, this question is mainly directed at those living outside the United States:
Quite simply, does the rest of the world realize what Bush and his neo-conservative cronies have been trying to do? By that I mean, many Americans still believe the war in Iraq was/is about fighting terrorism. It's not. Many others believe it was simply about oil, and while it's quite clear they (I'll avoid saying "we" when referring to the Bush Administration, since I don't agree with anything they've done in the last eight years) are going after Persian Gulf oil, the question remains: why, and what's the bigger picture? Documents like the Wolfowitz Doctrine,
http://en.wikipedia.org/wiki/Wolfowitz_Doctrine
and its implementation after September 11th point directly to the fact that this administration is hell-bent on establishing a new imperialist American empire. That's hard for many Americans to believe, but it's virtually undeniable when you look at the facts. Is this the majority view outside of the States? Have Americans been kept in the dark so long by our own administration that they would rather jump on the bandwagon of rampant nationalism than accept that our current leadership is better described by fascism than democracy?