As in the ‘Green Book’ movie, Hollywood tends to water racism down – award-winning author
The Oscar-winning drama ‘Green Book’ has brought racism in America back into the spotlight