How many societies today do we see women dominating men?

There is no definitive answer to this question, as it depends on how "dominating" is defined and on the society in question. A small number of societies are matrilineal or matrilocal, where women hold significant authority over property, inheritance, or family decisions, but anthropologists generally do not describe any present-day society as one in which women dominate men overall. In most societies, gender roles are becoming more equal, with women increasingly taking on leadership roles in areas that were traditionally male-dominated.