Fellow American,
There’s a growing debate in today’s schools.
On one side, some people take pride in our nation and believe that our history should be taught just as it happened. Children should learn about our national leaders and the events that led us to where we are, and in so doing, learn lessons about government and how best to manage our country.
The other side disagrees.
Rather than tell history as it took place, this side would rather cut out a few people, events, and places – you know, anything that might have promoted racism, slavery, or injustice. It would also seek to blame those things on current members of our communities, even children.
So which one is more important to you? Which one should our children understand better?
Actual American history, or the idea that whites are inherently evil and should be blamed for all the atrocities in our past?
Take our poll here to answer. Make your voice heard today.