Now, I don't know if I'm the only one who thinks this way, but I'm pretty sure I'm the only one here who does.
I personally believe that medicine for mental problems isn't necessary. Now, I know that's gonna get a bunch of people popping up and saying: "I couldn't live without my medicine!" Please, say what you will, but this is where I stand.
I don't think being so dependent on a drug is right. Has anyone ever watched that old Hulk movie where the doctor keeps insisting to the girl that she can't walk, and that her medicine will help her get better and be able to walk again? If you have, then you saw that once the girl realized it was all just in her head, she could walk. I know it's fiction, but there are plenty of real-life stories like that if you Google them.
I personally think that psychologists need to concentrate more on doing what they're supposed to be doing, counseling and giving advice to the people who need it, rather than saying: "Oh, you have such-and-such, let's just give you some such-and-such and send you on your way."
Did you know that antidepressants mess with the serotonin in your brain, and can actually cause some of the very things they're supposed to fix, make them ten times worse, or even create an entirely new problem that you then have to fix with even more medicine? Did you know that many serial killers were on antidepressants when they began killing? Did you know that after you've been on your medicine for a long time, your body can build up a tolerance to it, so it no longer works and you're back in the same boat you were in before?
Besides this, just because a doctor goes to school and gets a degree doesn't mean they know everything. Think back to your least favorite teacher: the one who was boring and couldn't correctly answer a question about their own subject without a book or a computer in front of them. Now ask yourself: is your doctor any different?
Just because the FDA approves a medicine doesn't mean it'll work, or that it won't leave you even worse off. Look at how many 'medicines' have been recalled in the last five years, and you'll catch my drift.
My theory is that these illnesses can be overcome with great effort on the patient's part: some deep soul-searching to figure out what's causing the depression, and then working to fix it, no matter what it takes. The short-term solution is just that, a short-term fix; the long-term solution sticks forever.
This is coming from a formerly depressed, suicidal person who talked to the people in her head and thought something was coming to get her every time it got dark, or that the sun was going to fall on her head during the day. I went to my doctor, yes, but I never touched a pill bottle. I got through it on my own and came out a stronger, more outspoken person because of what I'd been through. I'm thoroughly convinced that if I'd taken the medicine my doctor had thought about prescribing me, I would have been worse off.
Thoughts? This is just a debate, by the way, so please, no furious replies, or even annoyed ones. It's just something that's been on my mind for a while, and I'd like to get some different opinions on it.