Black Women In Hollywood To Know: Actresses

Just a few short years ago, the entertainment industry was stifling creatives of color. Now, Black women in powerful positions—from directors’ chairs to C-suites—are presenting fresh narratives and taking their rightful place in the field.

As Hollywood catches up to our expertise, we're showing the world who we really are. Meet the actresses who are setting the standard today.