Since when isn't truth out-datable?
[May. 5th, 2005|03:43 pm]
[Misfits - Green Hell]
By this I mean that our culture has come to the point where quite a few of the beliefs of the (Christian) church as a whole have little or no relevance. Think for example about birth control (a Catholic issue) or the rights of women and other minorities.
I forget where, but the Bible makes mention of slavery as a positive thing, the selling of virgin daughters, and quite a few other practices that, while not as barbaric, are wrong. For example, the idea that a woman should always defer to a male in her life: her father in youth and her husband from then on. Would you say that these are positive models for a modern society?
Sure, there are a few universal truths in the Bible. But those ideas are found in every single other religion, even Satanism. When I say that Christianity is on its way out, I mean as a whole. It is in decline. A good way to see this is to take a look worldwide and see how most American Christian views are regarded. ONLY in the US is the idea that a creation story should be taught in schools as science even plausible.
Reality is that things change. If you don't change with Reality, you are left behind. 2,000-odd years is a good run for any religion; I give them another 300 before relevance is lost completely.
I am not Christian, so any conflict with tenets about the church and the world being separate doesn't bother me. Facts are facts.
This was from an earlier post. I had just finished reading Feuerbach's "The Essence of Christianity" (very interesting book, y'all should check it out if you haven't already) for a philosophy of religion class and then writing a paper based on that and LaVeyan Satanism. (If you don't know what this is, please don't tell me the devil is going to take my soul!) If anyone is interested in the paper I will throw it online.
Eh, but my point is... agree with me or not? And why? What do you think about outdated ideas still being taught as truth? If it is easier, what would you think if public schools taught that black people were inherently inferior and must be taken care of by whites? (A belief from not too long ago that, aside from a few weirdos in the hills, is thought to be bunk these days but was once accepted as truth by a majority, AKA "white man's burden.")