I read this article on PilotOnline today, and I found the idea fascinating. I'd love to hear some other opinions.
In the article, the author Philip Walzer
suggests that many companies may be doing more harm than good by requiring mandatory diversity training for their employees.
The theory is that forcing employees to attend training or seminars, which often focus mainly on the legal ramifications of cultural insensitivity, can actually foster hostility and exclusion in a workforce rather than promote acceptance and inclusion.
Researchers suggest making these training opportunities voluntary instead.
What do you think? Could voluntary diversity and inclusion sessions within a company be more beneficial than mandatory ones? Check out the article and sound off!