So, can we trust doctors?
The question came to me when I learned that doctors are no longer the most trusted profession in America, and I believe that, as more news comes out, they will become even less trusted.
The fact is, like any profession, there are always going to be bad apples... but how bad is the question...
First, let’s play Devil’s advocate for a minute here:
How many doctors, surgeons or others in the medical field chose their profession because they genuinely care about people’s health?
And how many of them went into the profession because they knew it paid well?
It’s a fair question, because we assume that in order to be in the profession of healing, doctors should care about people.
Here is an experience my mother had recently that may cause you to question that assumption:
A couple of weeks ago my mother was on a Florida trip and was experiencing some swelling in a certain area of her body.
She went to a doctor in Florida and got an ultrasound.
They told her they found something and needed to biopsy.
When she came in later the doctor showed her the ultrasound and what they biopsied.
He tried to schedule her for removal surgery immediately, even though he said the growth was smooth and most likely benign.
My mom’s insurance didn’t cover that doctor, so she asked for her results to be sent to her primary doctor in Texas and she would take care of it when she got home.
As it turns out, when she got back to Texas and went in to see her doctor, he said there was nothing on the ultrasound.
The doctor in Florida had shown my mother somebody else's ultrasound.
He had done a biopsy on normal tissue cells and tried to schedule my mom for an unnecessary procedure to make money.
Doctors like this are everywhere and you should always get a second opinion.
Here are a couple of other examples, and then we'll move on to the positive examples and solutions...
The mother-in-law of a dear friend of mine was told by a doctor that she had cancer in her jaw.
They did surgery and removed 80% of her lower jaw...
...Only to tell her after the fact that there was no cancer after all.
In another case, a doctor gave even the few patients who actually did have cancer hundreds of extra treatments just to make money.
Doctors make money from you being sick, prescribing drugs, and doing surgeries--all of which may be totally unnecessary.
Doctors practice medicine; they do not study nutrition.
Americans make up only 5% of the world’s population, but we consume 40% of its pharmaceuticals.
These drugs don’t make us healthier: we spend 10X more on healthcare than most other countries, yet we rank 37th in the world according to the World Health Organization.
But here's the good news (finally, good news!):
Most commercially prescribed drugs are totally unnecessary, and the symptoms they treat can instead be resolved with better dietary habits...
(and guess what, natural foods are a lot cheaper than drugs!)
So back to the original question, can you trust doctors?
My opinion is that yes, you can trust some of them...
...But please, always get a 2nd opinion.
And before you go to the doctor, maybe you should try natural first.
Some doctors in China use a preventative system where you pay them a retainer only while you’re healthy, which covers your regular check-ups.
If you get sick, all of their services are free.
It’s kind of like what I do...
...My goal is to not sell my clients anything except for the information that will get them healthy.
Have a blessed day.
Think Great Lose Weight