It doesn't make sense to me, but there seems to be an increase in hatred toward people of the Christian faith. Christianity gave the world some marvelous advances (and continues to do so!). Among them are hospitals, penal reform, a judicial system that strives (albeit imperfectly) for justice, and the Red Cross. Christians worked to heal the sick during Europe's plagues, and they started the Salvation Army and the YMCA. It was Christian leadership that led to the abolition of slavery, the women's vote, the civil rights movement, and the end of child labor. Today, Christians take the lead in mission work among the nations' poor, in shelters, and in homeless ministry.
These days the leading edge of Christian faith seeks to bring health to marriages, to protect the unborn and children more broadly, and to work in nations marked by poverty and loss.
Amazingly, this is the faith that has engendered the hatred of so many. Anti-Christian activity is on the rise in Europe, in India, and in the US. In Persecution: How Liberals Are Waging War Against Christianity, author David Limbaugh uncovers a mountain of evidence of large-scale, across-the-board discrimination against Christians in the public and academic spheres, accompanied by an alarming determination on the part of the political left to eradicate Christian influences from our culture and even our history lessons. I frequently hear derogatory comments made on television toward the church or people of faith. It does seem that this is on the rise.
So this morning in my Bible reading I came across the verse in Mark 13:13: "All men will hate you because of me, but he who stands firm to the end will be saved." I guess we shouldn't be surprised, eh?