I was reading an article about our local community college when I came across a quote that is a clear indication of what is wrong with higher education in America. I'm not going to say which school official said it, and it really isn't important which issue prompted it; the words speak for themselves: "I don't think it's the business of higher education to tell people which values are best and which values they should all live by. I'd be concerned...if specific values or morals would become part of our culture to promote." In other words, the last thing a college or university should be doing is promoting values.

If all value systems are equal (and presumably they are, if you won't take a stand one way or another), then even clearly immoral value systems deserve the right to be heard and considered. College campuses in America are rife with the idea that there are NO moral absolutes in our world (except the absolute that there are no absolutes; a bit of irony). Despicable acts like pedophilia and morally bankrupt systems like Neo-Nazism have gained traction in the public arena because nobody in authority at public universities is willing to say, "This is clearly a moral evil." In the name of acceptance and diversity we've lost the ability to condemn evil and promote good. In the words of Edmund Burke, "The only thing that is necessary for evil to triumph is for good men to do nothing."
{Note: for the record, I went to a Christian university, but my wife went to a public school and now works for a public college. I'm not saying Christians shouldn't attend public schools (I'm a public school teacher, as are my brother, sister, brother-in-law, and sister-in-law), but Christian parents need to be aware of what's being taught (or not taught, in this case) to their teens; moral relativism is NOT Christian.}