Hollywood has long been perceived by conservatives as something of a liberal cesspool. However, that description isn't entirely fair.
Some of the best films of the last few decades explicitly promote conservative ideals and the notion of American exceptionalism.
Here are our favorites.
Do you agree with our list, or think we left out any worthy additions? Click through to the end and let us know in the comments below!