Hello! I am a straight woman, but for some reason I see being penetrated as inherently degrading/humiliating/submissive, since it is technically an invasion, and it is a mentality that I want to change!

Furthermore, because of where women's genitals are positioned, they have to be placed in really embarrassing positions.

I am from a Western country, I have not been raped or abused, and I am not religious or anything like that.

It may seem dumb, but this topic has me on the verge of tears, so please help!

PS: English is not my mother tongue so please forgive me if I made any mistakes 🙏
