Why do Americans think they have the best country in the world? From what I've seen, the majority don't view the rest of the world as equal, and when it comes to history they act as if America has never had any failures. They love to bring up the War of Independence and claim they won it by themselves, when the French nearly bankrupted themselves fighting in it. To me it's bizarre that one of the richest and most influential countries in the world has such a brainwashed society and is, in places, incredibly backwards or poor.
But having traveled around, I don't think it's the greatest, even if I love the place dearly.
Every time someone asks me this question, my mind instantly goes back to this opening from way back. I personally don't think any one country can be the "greatest." Every country has its flaws somewhere or other. Do I think the U.S. is great? Of course I do - but it's not the "greatest."
You can only make a country better by pointing out the flaws - even if it's not considered "patriotic". Simple as that.