Men should be masters of romance and know what a woman needs. Unfortunately, few men know how to treat a woman. Don't get me wrong: some women tolerate mistreatment from men to the point that it's normalized on television, in music, in movies, and so on. Morals and values are fading fast. Sex has become just something to satisfy desire, not sacred anymore; people just do it and forget each other's names. Very sad.