Jul 30, 2021
I think most Americans have realized that the government has been captured by corporations, the military-industrial complex, and special interests, and no longer represents the will of the people. If it did, we would have free healthcare, free college, and social programs like other developed countries. So when the government institutes mandates such as vaccine or mask requirements, it becomes a political act of power over the people. If we had a real democracy, I don't think people would be protesting.