The title of this post is something that has slowly come into view for me as I’ve attempted to understand God apart from the Christian religion. After wrestling through many faith issues, which I’ve addressed in previous blog posts, I’ve come to conclude that Christianity doesn’t lead people to God. Its very nature actually steers people away from God.
So here I am. It feels a bit like going to the doctor for most of your life and then realizing the doctor is what has been making you sick. I think I would be much healthier (that is, have a better understanding of God) had I never seen the doctor at all.
This is a pretty unconventional viewpoint. I mean, just as everybody “knows” you go to the doctor to stay alive, everybody “knows” you go to church to grow in eternal life. After three decades of living that kind of eternal life, however, I have to say that this supposed new life is intended to kill.
I was instantly fooled by institutional Christianity’s claim to Christ. Jesus seemed like a wonderful man, whoever he was, so how could a church go wrong there? Yet it surprises me just how thoroughly the love of such a man could be twisted to portray not love but hate. Allow me to explain.