There once was a time when schools, teachers, and district staff could be trusted to teach our kids precisely what they needed to become productive members of society. However, those days may be over.
For example, in some schools, teachers and staff have recently been allowed to alter lessons on our national history, choosing to focus on the dangers of racism and how the white man created so much destruction instead of on what really happened.
Some have even taught that the entire white race is inherently racist and entitled, an approach known as critical race theory (CRT).
We’d like to know what you think about that. Should American history be taught as it once was, based on the facts? Or should it now contain lessons on white privilege and white racism?
Tell us what you think here.