Here is a potentially sensitive topic. Please be honest. Do you trust women with important services and roles?
For example, would you visit a female physician or a female lawyer, and place in her the same confidence and trust as you would a male in the same position?
Being completely honest, I have to say no. It's not that I think women are incapable of doing these jobs as well as men. It's that many American women seem to have a high sense of entitlement, and many men seem very willing to give gifts or favors to women in order to improve their standing with them.
My fear is that a woman will not be as qualified as a man because she didn't have to work as hard to achieve her position.
I originally wrote a much longer post that went into great detail, but I don't think that was necessary. I can post it later if the discussion gets deep enough.
Your thoughts?


Digitech: "Do you trust women with important services and roles?"
Absolutely. I've gone to school with, worked with, and otherwise known enough talented women to say I would.