A male-dominated society can be defined as "a society that benefits men in the laws, education, government, and day-to-day life." I believe that the United States was very male dominated in the earlier stages of our country. Women were expected to clean, cook, and raise the children, while the men worked the farm, went into politics, served in the military, and ran businesses. Today women.