What does the Bible say?
The word evangelicalism comes from the Greek euangelion, meaning “good news,” and euangelizomai, meaning “to proclaim good news” (e.g., Romans 1:16). The term emerged during the Protestant Reformation to describe a return to the gospel of salvation by grace through faith, and it later became associated with the revival movements of the eighteenth and nineteenth centuries that stressed personal conversion and the authority of Scripture. As those revivals spread through Europe and North America, evangelicalism increasingly emphasized evangelistic zeal, calling people to faith in Christ’s atoning death and resurrection (1 Corinthians 15:3–5), while sometimes giving less attention to the steady growth and maturity of those already in the faith.
Over time, “evangelicalism” broadened to encompass diverse denominations and cultures, and its meaning became increasingly blurred by social and political associations. Yet true evangelicalism has never been about earthly movements or causes. It centers on knowing God through His Word, living under His authority, and proclaiming Christ so that others might come to faith. The heart of evangelicalism reflects the same heart seen in the first century: the desire of all true believers to embrace the good news and to make disciples of all nations (Matthew 28:18–20).