Throughout the history of the film industry, Hollywood's Western movies depicted the American frontier in stereotypical black-and-white terms: the white settlers were almost always the good guys, while Indians (along with Mexicans) were cast as vicious, bloodthirsty villains or, alternatively, as 'noble savages' like the Lone Ranger's Tonto. This shaped most people's perceptions of what happened in the West and what Native Americans were like. It wasn't until the early 1990s, with DANCES WITH WOLVES and LAST OF THE MOHICANS, that Hollywood's depiction of the frontier and American Indians began to change, with Native Americans no longer typecast exclusively as savages, and with their characters and perspectives treated in depth and sympathetically instead of being dismissed as in the past.
WI Hollywood had somehow attempted to make less stereotypical Westerns earlier? How would Americans' perception of history have been affected for the better? I just read an account of how director Thomas Ince and a Native American actor whose name I can't recall right now, but whom I believe was a Sioux (and who had toured with Buffalo Bill Cody), were intending to make 'real' Indian movies during the 1920s, but this never eventuated. WI Ince and his Native American friend had actually managed to make such 'authentic' Western movies conveying the Indian POV, at that point in time?