The right wing in the U.S. feels it is being drowned out by the dominant culture, and it has a term for that culture: 'woke culture.'