Projecting the End of the American Dream: Hollywood's Visions of U.S. Decline

This provocative book reveals how Hollywood films reflect the nation's deepest fears and anxieties, often recording its political beliefs and cultural conditions while underscoring the darker side of the American way of life.