A popular phrase these days among Christians is that we must "engage the culture." I'm not really sure why this phrase has become so popular. Of course we must engage the culture, if we believe in the Great Commission. It is just another way of saying that we are to give out the gospel.
Unfortunately, this is not what many people mean when they use this terminology. I actually get a little nervous when I hear the phrase "engage the culture," because in my experience it often amounts to code for "copy our degraded and ungodly culture." That is antithetical to the gospel. If we are to advance the gospel, we must live lives that are worthy of it, which always means that we will not be conformed to this world, but transformed by the renewing of our minds.
Let's engage the culture with holiness, truth, and love. Then the world will see Jesus, not a bad imitation of themselves.
Here is a good article on this.