Concordeia wrote:I can't wrap my head around the fact that there is still this common social perception that sex is somehow dirty, taboo, or just plain bad. Sexuality is one of humanity's core instincts and a basic element of our society. For one, it's necessary: without sex, as with many other species, the human race could not continue its existence. Two, sex is pleasurable. People like doing it because it feels good; otherwise they wouldn't do it. So if sex is both a necessary and desirable aspect of the human condition, why do people still hold this perception of sex as something perverted or disgusting? It just doesn't make sense to me.
man i don't know what to tell ya, all i know is, it's probably seen as dirty, perverted or shameful because it's all about what the women want, and it just depends on a lot of things with the woman, ya know, like if it's her time of the month, if she's just not in the mood, or if she's pissed off cause some other girl was lookin at her or you and she totally caught it. and let's not forget my favorite turn-off from a chick, you know, the one where she'll make you feel like dog crap on the bottom of some guy's shoe while he keeps on walking through mud, just by saying "it's not you, it's me" or even worse, "you got a tiny winkie" or whatever she'll call it. because all in all, at the end of the day, women are always the real shot-callers in the bedroom! idk why, but they will always hold the fact that they have the power to give birth to a child over our heads. in my case though, i don't wanna give birth to a child, because that @!%* looks like it hurts more than superman punching me in the balls.