DEI is collapsing across corporate America

Are American businesses finally realizing just how destructive DEI programs are? It wasn't long ago that Diversity, Equity and Inclusion (DEI) was a corporate buzzword, with woke executives across the nation clamoring to build the most diverse workforces possible – even if it meant a far les…