There is a growing crisis in America that is beginning to become “normalized.” It manifests itself in the dehumanizing political rhetoric we hear every day and in the gun violence that now plays out on a near-weekly basis in some city, school, church, or store around the United States. What in the world is happening in this culture that has brought us to this point? How do we see real change?