This article delineates the history of cinema in the United States.
The American Film Institute (AFI) is a nonprofit film organization that educates filmmakers and honors the heritage of the motion picture arts in the United States. AFI is supported by private funding and public membership fees.