Lately, the surge is toward “justice Christianity”—intervening to prevent human trafficking or slavery, caring for indigenous cultures or for the planet itself. And it is right and it is wrong. My goodness, yes, of course God cares about justice. But to be frank, it is actually not the central theme of the Bible. Christianity isn’t simply a religious version of the Peace Corps.
All of these “camps” are Christianity—sort of. Like elevator music is music—sort of. Like veggie burgers are hamburgers—sort of. Think gas fireplaces, wax fruit, frozen burritos. They look like the real thing, but…
It all comes down to this: What is Christianity supposed to do to a person?
"Long before he laid down earth’s foundations God had us in mind, had settled on us as the focus of his love to be made whole and holy with his love." (Ephesians 1:4 TM)
God is restoring the creation he made. What you see in Jesus is what he is after in you. This is a foundational assumption; what you believe about it will shape the rest of your life.