You know, knowing about all the shit the U.S. did in the last few decades, and the shit they are still doing to this day, makes it laughable to hear them lecturing other countries about what they should or shouldn't do to make this world a better place.
Europe has the unique diplomatic advantage of being our ally. They were able to sit back and build up their economy and diplomatic strength while the U.S. did the vast majority of the West's dirty work: the West's, not just the United States'.
Everyone thinks they're so innocent and can point fingers at the U.S. all the time. I'm sick of the political and historical ignorance on this forum. I bash things America does as much as anyone, but where are the threads about Putin essentially taking sole control of his government? Or about how Europe continues to harass countries like Romania over their civil rights records? Or any of the other wrongs in this world that the U.S. doesn't have a hand in?