Do y’all feel that religion is on the decline? Are people just as religious as ever, or do people really believe less? Maybe people go to church less or are less inclined to be part of an established religion, but maybe their belief in a god is as strong as ever.
Any thoughts on this, y’all? Maybe it’s a generational thing, or maybe it’s a fad. Or, and this should also be considered, maybe religion is finally breathing its last breath.
Would love to hear your own thoughts and opinions about religion, both for yourself and in what you see around you!