It all depends on when and how you view history. If you only allow yourself to look at the positive parts of American history and forbid yourself from looking at the negative parts, it's easy to conclude that America was always great until whichever politician you don't like took office.
A lot of American history gets sanitized, with the ugly parts quietly left out.
I've even heard of some US textbooks removing the Trail of Tears, which is a crucial part of not just US but world history: it directly influenced Hitler's ideas, and that is an important connection for understanding how events relate to one another.