Preclaimer: I’m neither from the US nor religious, but...
I remember learning in school what the core values of Christianity (and most religions, I guess) are: loving thy neighbor, helping others, being a nice and good person, that kinda stuff.
So I’m honestly wondering how Christianity in the US, at least the part you see online and in the media, seemingly has nothing to do with those core values anymore, yet its followers all claim to be Christians (and decry other Christians who do follow those core values as not real Christians)? Please help me understand.
submitted by /u/Vaiara to r/NoStupidQuestions