Evangelical Christianity has traditionally been defined by its belief in the authority of Scripture and its mission to make disciples of Christ from all nations, but it is increasingly being defined by its role in the culture wars: we are the ones who fight evolutionists, fight feminists, fight homosexuals, fight liberals. As the political world polarizes further into right and left, the church is tempted to polarize with it. But as we focus on repelling those influences that threaten to change the...