Thursday, January 05, 2017

Should women be told what and what not to wear?

Women who are oppressed should be forced to break the rules of their religion, and therefore women who refuse to show bare parts of themselves at the beach should be forced to act as if they are free from religious oppression. Right?

This is in France.

But all over the world women are told (by men!) what or what not to wear. Yes, even in the good ol' US of A!

From the website:

 There’s a distinct irony in the suggestion that women who are allegedly forced to wear a face covering should be forced not to wear it.

In a lot of churches women MUST cover their heads, according to Christian beliefs. If you don't believe me, here are some pictures:

An American Christian tells women not to show cleavage. By the logic of the French law, he is oppressing women by telling them his religion requires them to cover themselves, so he should be arrested. Unless people are okay with double standards.

In Israel religion tells women they have to cover themselves. But they are Jewish so that's okay. Apparently.

So, what are your thoughts on the subject? Should men go on telling women what or what not to wear, or is it time EVERY nation in the world enters the age of enlightenment and allows women to finally decide for themselves?
