It’s finally happened; Hollywood has learned something!
For years, the only films with a primarily Black cast have been made from the same cookie-cutter plot of the struggle to get out of a bad neighborhood. Someone's mom is on crack, nobody has a dad and everybody gets shot. What the blockbuster hit Black Panther has done, by simply existing, is change the narrative of Black films forever. Could this be the end of watching Tyler Perry in a wig play into Black stereotypes? Will we see an end to pathetic attempts at half-decent movies thrown together that go straight to VH1? Ladies and gentlemen, could this mean that no more Rickys will be shot?
Black Panther has shown the world a movie that not only empowers Black people, but sends a powerful and universal message of brotherhood. For the first time, an African country is shown as the advanced society, the place above all others, instead of a poor, war-torn, third-world country that just needs a white savior. And never before have there been so many empowered Black women in a movie.
The hype of Black Panther might die down at some point, but its effects will be everlasting.
By Brianna Thomas, Senior, Gwendolyn Brooks