Why has it become so bad to feel like America first is the way to be? I guess it's been that way since Ronald Reagan. Everybody seems to have the attitude that putting America first is somehow a bad thing. Are we really a dead country? When I was growing up, a company moving production to China would have meant prison for the executives of that company. They would have been deemed anti-American and traitors. Many people were blackballed for even thinking of putting a communist dictatorship above America.