In my experience, doctors are arrogant, overconfident, controlling, and unhelpful. They'll lie to you if they don't know what's wrong with you. As a matter of fact, I think ER doctors are the only doctors who have ever helped me with anything.
Not only that, the whole field is set up to suppress symptoms. If you actually do some of your own research and think you've found what's causing your health problems, doctors will most likely chew you out, basically call you stupid, and tell you that's dangerous thinking. Excuse me for wanting to be free of this illness for life instead of being prescribed a drug that covers up 20% of the problem. I also have pretty strong proof that they were wrong about several major issues I have.
If you think this thread isn't being fair, I really don't know what to say. This is my honest-to-god, lifetime of experience with doctors.
I do think experimenting excessively with your own health can be dangerous, but that's not what I'm complaining about. Doctors have little, if any, open-mindedness. The whole field is just dogmatic.