What does woke culture want?

Woke culture is a social movement that seeks to raise awareness of social and political issues, particularly those related to race, gender, and other forms of inequality. It calls on people to challenge oppressive systems and to fight for justice and equality. Its stated goal is a more equitable society in which everyone is respected and treated fairly.