Forums:
Watching TV is a constant source of rage. From commercials to the news to series, sexism is everywhere, and it seems to me that no one bats an eyelid anymore when so-called advertising creatives try to sell us toilet paper by asking yet another model to strip naked. Gender stereotypes are everywhere: in the products themselves and in the way they are sold to us. What you see on TV is the triumph of the ever-successful marriage of patriarchy and capitalism. Capitalism creates stuff no one needs (who the fuck needs special 'men' shower gel? Can I use it if I like the smell, even though I'm not a man?) and patriarchy makes sure it's gendered and that women's bodies are objectified in order to sell it. Products marketed as 'for men' often get creative ads, while virtually everything else gets one thing: a naked woman. You want a car? Here's a naked woman. You want shower gel? Here's a naked woman.

TV shows reiterate the tired clichés patriarchy created about women: in 90% of these shows, women are portrayed as having one single goal, the pursuit of love and ultimately marriage, and very seldom is the focus put on their careers, their hopes, their dreams, what makes them individual beings with a functioning brain. Women's bodies are used and abused, belittling women is a daily occurrence, and I'm getting tired of it. Just plain tired that in a medium as mainstream as TV, women are still portrayed as weak, pretty and nothing else, inferior to men, their characters less developed, and so on and so forth.

I simply can't stand it anymore, and it's not just a case of a bad day: it's a constant rage. I mean it: stop, just stop fucking trying to sell a car by showing a woman running in the rain, her white dress clinging to her body. Just stop.