I have been busy working on my book proposal, for the book formerly known as "The Fish Wars: How Evolution and Christianity Can Make Peace," which I'm renaming "Losing My Religion: A Christian Gets Fed Up..." I still need to come up with the second half of that title, or maybe just leave it as is. I am taking a much more first-person approach and will talk about how the anti-science fervor, the literalism, the fundamentalism, and the Christian right's mixing of politics with religion are not just about as far as you can get from what Jesus was all about; they're also causing a lot of people to laugh at Christianity and walk away from it.
My book will also talk about how many people in the church were not there for me during and after my divorce, while all my non-Christian friends were. What does this say about the faith? Or about theirs, anyway? I have met person after person who has said the same thing, so this is not just a local phenomenon affecting me. I am not embarrassed in any way to be a Christian. I love the Bible, I love Jesus, and I think it's a beautiful, empowering faith. But I am increasingly embarrassed by the Christians... the judgmentalism and the narrow-minded pursuit of a political agenda, making creationism, abortion, and gay marriage the main topics in their repertoire. What about poverty? What about being there for the people in your life, and not running away from or judging people who are not perfect? What about forgiveness?
There are certainly many wonderful things Christians have done in the world and continue to do. But in America, where I'm from and what I know best, it's a mixed bag. I do know that many intelligent and compassionate people would not think of becoming Christians because of the faith's rejection of science. It's actually quite harmful to our society, and quite scary how sheep-like people can be. People often follow blindly and don't think critically about their beliefs. I like to say, Jesus didn't call people sheep for no reason! :)