Psychology
Psychology is defined as "The study of human behavior," and more specifically "the science of behavior and mental processes." Basically, psychology seeks to understand both how and why humans do what they do. Since that is a very broad topic, expect a long article.
For tropes based on psychology, go here.
History
Psychology largely branched off from philosophy, which is where most vague ruminations get their start; as far back as Plato and Aristotle, people were making suppositions about human behavior. Major advances in physiology during the 1800s led people to believe (not incorrectly) that fundamental aspects of consciousness--sensation, motor control, personality, memory, etc.--could be detected as physical phenomena in the brain. The first "true" psychologist was Wilhelm Wundt, who opened a laboratory for the purpose in Leipzig in 1879.
Almost immediately, arguments developed over how psychology should be approached.
- Wundt's student Edward Titchener argued for a school called Structuralism, which employed trained Navel Gazing in order to analyze the structure of one's own consciousness.
- Functionalism, primarily advocated by the American William James, cared more about how the mind functioned--what it did rather than how it was built. Since functionalist experiments could look for quantifiable behavior ("I wonder how many times I need to punch someone before they get angry enough to punch me back? Let's grab a hundred strangers and test!"), this school won, and Structuralism is largely a footnote today.
- Finally, a fellow we've all heard of--Sigmund Freud--came up with an approach called Psychoanalysis, which in some ways combined the two: while introspection and self-observation were a major part of the process, the client looked for actual dysfunctional behaviors they were displaying, and then asked the psychoanalyst for help in puzzling out the motivations behind those behaviors. While many of Freud's theories--particularly his obsession with sex--are largely discredited today, it's worth noting that the things he got right--particularly the idea of the subconscious mind and all tropes rooted therein--remain largely unmodified today.
Functionalism evolved further into Behaviorism as time went on. The first step in this direction came from another name you're likely to know--Ivan Pavlov--who demonstrated the link between experience and learning. Pavlov's classic "Classical Conditioning" experiment was to ring a bell every time he fed his dog, which had been outfitted with an implant that collected some of its saliva. After a while of this, Pavlov demonstrated that, when he rang the bell, the dog would start to drool; it had been "conditioned" to associate the bell with food. Another researcher, B. F. Skinner, expanded this into "operant conditioning," which is basically how consequences, such as rewards and punishments, determine the frequency of behavior. He rigged up a contraption where lab rats would receive food every time they hit a lever in their cages; the rats kept hitting the lever for quite a while even after the food stopped coming. He was also able to train rats not to do things--even natural, logical things--by immediately administering punishments every time they did. In doing so, Skinner gave us the most radical definition of Behaviorism: all things we do and value are trained into us by stimulus-response conditioning, the hard way, and thus do not require consciousness. We are all easily manipulated robots.
Well, obviously, that was an unpopular and dystopian philosophy, even if there is some truth to it, and the response to it is called the "Cognitive Revolution." It originated around the beginning of The Sixties and has completely replaced Functionalism as the guiding principle of psychological research, especially in America, and much reduced Behaviorism's traction as well. One of the main tenets of the Cognitive Revolution is that there is such a thing as a mind, damn it, but is more about (to quote Jerome Bruner) "an all-out effort to establish meaning as the central concept in psychology [...] Its aim was to discover and to describe formally the meanings that human beings created out of their encounters with the world, and then to propose hypotheses about what meaning-making processes were implicated." The seminal article in cognitive psychology was George Miller's "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," which basically identified how much Random Access Memory the human brain has. The exact amount is currently under dispute--it's not seven, but then Miller never claimed it was seven facts, it was seven "chunks", a "chunk" being the largest meaningful unit of data a person can process. (What a chunk consists of varies by training and content. For instance, you reading this article are probably fairly literate in English and can store an entire sentence in a chunk... but if you were a foreigner just starting an English-Second-Language course, your chunk capacity might be overwhelmed by an eight-letter word like "tangible"--especially since it's not a compound word like "doorknob".) Even then, the exact nature and storage capacity of a "chunk" is still being debated. But the point was that Miller discovered something specific about a non-corporeal cognitive process--and a fairly unintuitive thing too. It was a big step forward, and Miller's paper is one of the most-referred-to papers in the field of psychology.
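If it helps to see the arithmetic, here's a toy sketch (in Python, with a completely made-up phone number and groupings) of why chunking matters: the same string of digits blows past a 7-chunk budget when taken one digit at a time, but fits easily once grouped into meaningful units.

```python
# Toy illustration of Miller's "chunking" idea. The 7-chunk capacity and
# the groupings below are illustrative assumptions, not a memory model.

CAPACITY = 7  # Miller's rough "seven, plus or minus two"

def chunk_cost(items):
    """Each entry counts as one chunk, no matter how long it is."""
    return len(items)

phone_number = "8675309555"

as_digits = list(phone_number)       # one chunk per digit: 10 chunks
as_groups = ["867", "5309", "555"]   # grouped into meaningful units: 3 chunks

for label, chunks in [("ungrouped digits", as_digits), ("grouped", as_groups)]:
    cost = chunk_cost(chunks)
    verdict = "fits" if cost <= CAPACITY else "overflows"
    print(f"{label}: {cost} chunks -> {verdict} a {CAPACITY}-chunk capacity")
```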
Much has been done since then, particularly in light of emergent technologies that let us see more deeply into the brain, but that's basically where psychology stands today as a historical topic, so let's move on.
Subfields of Psychology
There are two basic branches of psychology: "Basic" and "Applied". The former is more about making discoveries and figuring out fundamental things about human braining; the latter is about using them in other areas. Examples of these "other areas" on The Other Wiki include education, medicine and health care, product design and law; but since psychology is "the study of human behavior" and those are all places where humans behave, you could make the argument that those fields are all just either extensions of psychology or hybrids of it with other disciplines. That's kind of the problem with psychology: aside from the hard sciences, there's very little it doesn't have its fingers in.
This is an overview of Basic psychology, but you'll probably see how the Applied stuff branches off as we go through.
- Abnormal Psychology is the study of "unusual patterns of behavior, emotions and thought." These may or may not relate to a specific mental disorder. Psychologists are the first to admit that there is no such thing as "normal" to begin with, so this field can be on unstable ground; even worse, there's always the question of whether a particular "abnormal" behavior has psychological or biological roots. (The fact that one can affect the other just complicates things.) The Universe Bible for this field is the DSM, the Diagnostic and Statistical Manual of Mental Disorders, which lists every recognized mental disorder and its symptoms. To say that Flame Wars can erupt over what's a disorder, and what its symptoms are, is an understatement.
- Behavioral Neuroscience focuses on biology, particularly neurobiology, and its impact on behavior and behavioral development. In particular, the experiments that led to the "90% of Your Brain" urban legend came out of this field. Much of our knowledge of the anatomy of the brain, what happens where, also comes from here--not just people inflicting deliberate lesions on rats, but also doing field research. Phineas Gage, for instance, has proved a useful case study to psychologists and neurologists for some 150 years.
- Cognitive Psychology explores internal mental processes--how people think, speak, receive stimuli, remember things, solve problems, etc. Obviously, this also covers a great deal of material, from Failed Spot Checks to Feigning Intelligence to Taking A Third Option to Weirdness Censoring, and we're just going to leave it at that for now instead of doubling the size of this article.
- Cultural Psychology examines the effects of culture--upbringing, laws, customs, taboos--on the mind. It embraces the Blank Slate idea, feeling that mind and culture are largely inseparable. If you want ideas on where Crazy Asian Drivers, Widget Series or Scary Black Men come from, for instance, this is the place to look. This field is unfashionable, since it can so easily drift into corroboration of Acceptable Ethnic Targets. But it isn't going away any time soon either; culture obviously influences thought patterns--and, more importantly, the lens of culture is built into everything we do, so cultural psychologists will of necessity be involved in any attempts to develop truly universal models of behavior.
- Developmental Psychology is the study of "systematic psychological changes, emotional changes, and perception changes that occur in human beings over the course of their life span." (We keep quoting The Other Wiki because they keep putting things well.) Another good name for it might be "The Psychology of Aging." Once called "Child Psychology," the field has expanded to cover all age ranges; indeed, there are specialties in Elderly Psychology now (which could prove really useful to those of us who intend to have jobs during the Baby Boom Retirement Wave). The general focus of developmental psychology is on acquisition and/or evolution of skills, moral & conceptual understanding, and self-concept.
- Evolutionary Psychology starts from the assumption that behaviors, like organs, are the product of natural selection, and still exist because they provide some benefit to the organism that bears them. This field asks why Buxom Is Better, looks for the Truth in Television behind Double Standards, wonders why Sacred Hospitality developed, tries to find where Stage Moms came from, even ponders why we are conscious at all. It's simultaneously the oldest branch of psychology (having roots in Charles Darwin, 20 years before Wundt) and one of the youngest (its modern era having started in 1972 at the earliest), and is gaining serious traction in the psychological community.
- Personality Psychology deals with, pretty obviously, personality. It concerns not only how people differ from each other but how people are alike as well. There are many, many different ways psychologists approach personality, from overall "type" theories to elaborate "Two Kinds Of People" jokes (Type A vs. Type B personalities, anyone?), so we're just going to direct you to our articles on the two most popular models, the Myers-Briggs and the Big Five Personality Traits, and leave it at that.
- Positive Psychology tries to live on the happy side of the Sliding Scale of Idealism vs. Cynicism, preferring to promote well-being and nurture talent and genius. It feels that people spend too much time trying to fix what's wrong and not enough time encouraging and enjoying what we already have. Though people have been concerned with these topics since Adam and Eve, the field's first real glimmer of existence was in 1998, making it by far the youngest branch of psychology, and the concomitant immaturity of its approach has been a point of contention for some critics.
- Finally, Social Psychology deals with how people's thoughts, feelings and behaviors are influenced by the presence (perceived, actual or otherwise) of other people. It straddles the line between psychology and sociology and is also an extremely wide field, covering things like the Confirmation Bias, Moral Dissonance, what goes on Beneath the Mask, Tsundere divisions, Stanley Milgram's obedience experiments, every Social Engineering trope, and more.
Useful Principles from Psychology
One of the nice things about studying psychology is that you now know some of the flawed shortcuts, assumptions and heuristics that people use to speed up their daily decision-making. There are, pretty disturbingly, a lot of them, and a great deal of scam artistry and advertising revolves around their use. In the interests of general education, we'd like to share a few of them now.
Correlation Does Not Imply Causation
"Just because A and B happen at the same time, that doesn't mean A causes B, or vice versa." This can be pretty obvious at times ("I was driving my car and my grandma died. I shall never drive again!"), but is less obvious when the two phenomena seem causally related. This principle reminds you to think twice about what seems obvious. Here are two examples:
- This editor's mother was once heard to quote a study which demonstrated that most people who win the lottery have poor money management skills. In her mind, this suggested that gaining large amounts of money made you careless with your checkbook. Is this the truth? (Answer in spoilers: Probably not--especially since most people who make large fortunes are careful with money; this is how they make large fortunes. But the kind of person who plays the lottery is sloppy with money to begin with; it's a tax on people who can't calculate percentages. Therefore, monetary carelessness is an explanatory variable that is often mistaken for a response variable. Winning the lottery does not cause monetary carelessness; monetary carelessness causes lottery playing, and the occasional win comes along for the ride.)
- A study showed that children who have nightlights are more likely to need vision-correcting lenses as adults. What's going on here? (The obvious conclusion is that nightlights cause vision damage... but that's pretty obviously not true; our eyes are exposed to much higher amounts of light during the day and do not suffer permanent damage. The other conclusion is that... adult glasses-wearing causes nightlights? How is that possible?--how could my wearing glasses at 28 cause a Stable Time Loop when I'm 6? The answer is that adults wearing glasses does cause nightlights... But the adult is not you. It's your parents. They couldn't see well when you were a child [the nightlight was for them, not you], and bad vision is hereditary [which is why you need glasses today]. A does not cause B, and B does not cause A; C causes both. Fun Fact: In statistics, this is known as "common response to a lurking variable." A short simulation of this pattern follows right below.)
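For the statistically curious, here's a minimal simulation (in Python, with probabilities invented purely for illustration) of that "common response to a lurking variable" pattern: the parents' bad eyesight drives both the nightlight and the child's future glasses, and the nightlight never touches anyone's eyes.

```python
# Lurking-variable demo for the nightlight example. The numbers are made
# up; only the causal structure matters: C -> A and C -> B, no arrow A -> B.
import random

random.seed(42)

def simulate_child():
    parents_nearsighted = random.random() < 0.3            # lurking variable C
    p_nightlight = 0.8 if parents_nearsighted else 0.2     # C -> A
    p_child_glasses = 0.7 if parents_nearsighted else 0.2  # C -> B (heredity)
    return random.random() < p_nightlight, random.random() < p_child_glasses

kids = [simulate_child() for _ in range(100_000)]

with_light = [glasses for light, glasses in kids if light]
without_light = [glasses for light, glasses in kids if not light]

print(f"glasses rate with a nightlight:    {sum(with_light) / len(with_light):.2f}")
print(f"glasses rate without a nightlight: {sum(without_light) / len(without_light):.2f}")
# The first rate comes out clearly higher, even though the nightlight never
# influenced anyone's eyes in this model. Correlation, no causation.
```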
Tools are not always trustworthy
Let's say you're required to take a personality test two days in a row. You showed up yesterday and got a certain result from the test. You came back today... but you were pissed off. You were woken up at asscrack o'clock by screaming from the apartment upstairs, the line at the coffee shop was five miles long, you were cut off in traffic and scalded yourself, there were no parking spaces... It's been a crappy day. So you take the personality test... and you get the exact opposite result you got yesterday.
What's going on here? Did your personality Reverse its Polarity overnight? ...Or is the test just bad at measuring personalities?
This is less of a big deal in the world of the physical sciences, where they have (for instance) a world-standard kilogram in a vault somewhere in Europe that you can compare your classroom plastic weight to. Psychology doesn't have that luxury; there isn't a world-standard personality tucked into a vault somewhere to test our tools against. We design tools to measure a certain thing about personality, behavior, cognition, etc., but that doesn't mean the tool succeeds. And even if it does, most psychology results are expressed statistically, and a statistic will say anything if you torture it long enough. One basic sanity check--test-retest reliability--is sketched below. Simply put: don't believe everything people tell you.
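Here's a sketch of that sanity check, using invented scores: give the same people the same test twice and see whether the results even correlate. A trustworthy instrument should rank people about the same way both times, bad mornings or not.

```python
# Test-retest reliability sketch. Scores are hypothetical; the point is the
# method: correlate two administrations of the same test.
from statistics import correlation  # Python 3.10+

day_one = [12, 18, 9, 22, 15, 30, 7, 25]   # hypothetical scores, day 1
day_two = [14, 17, 10, 20, 16, 28, 9, 24]  # same people, day 2

r = correlation(day_one, day_two)
print(f"test-retest correlation: {r:.2f}")
# Near 1.0: the tool at least measures *something* stable.
# Near 0 (or negative): results like "opposite personality overnight"
# say more about the test than about you.
```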
Cognitive Biases On Trope
It should hardly surprise you that humans don't always think well. What may surprise you is how frequently, and how subtly, we fail to. One of the biggest categories of flawed thought processes is the "cognitive biases": judgments that certifiably depart from reality, or from the judgments of more-impartial outsiders. Typically these are not made on purpose--they're learned behaviors which we acquire because they make it easier to think faster, or which work in certain circumstances but are currently being misapplied. Here are some that correspond to known tropes.
- Feigning Intelligence: this is one of many "positive illusions," in which people overvalue or over-expect positive outcomes and downplay negative ones. Specifically, it's related to illusory superiority, in which people place more emphasis on their virtues and laugh off their flaws.
- Know-Nothing Know-It-All / Heroic Self-Deprecation: these are both the result of the Dunning-Kruger effect, which states that it's possible to be so flamingly incompetent that you can't even recognize your own incompetence. Conversely, people with actual skill often underestimate themselves, assuming that everyone knows about the Fatal Flaw or Achilles' Heel in their competence. See also the Overconfidence effect, in which one believes one is always right.
- Sunk Cost Fallacy: the more time, money and effort you've put into a thing, the more you value it--regardless of your likelihood of return or the objective value of the thing you're building. See also post-purchase rationalization. This is a way for the brain to avoid realizing, "Holy shit, I've been wasting my time on this." Seriously, Ignorance Is Bliss. The implications of how sales people (or MMORPGs) can exploit this to gain money are things we are going to gloss over here, but I'm sure you've got some ideas.
- Confirmation Bias: people are more likely to remember things that support what they already believe, and to interpret ambiguous data to support their own conclusions. If your favorite MMO got an Editor's Choice award from GameSpot, you'll remember that. The fact that it got 8.8 reviews from every other publication out there may mysteriously slip your mind.
- Out-of-Character Moment, Protagonist-Centered Morality: these are informed by the fundamental attribution error. Basically, people have trouble assuming that anyone except themselves can have an Out-of-Character Moment. If I'm driving to work and I cut someone off, I'm excused because I had a bad morning. If, however, he cuts me off, it's because he's a Jerkass; every action he ever takes is a direct reflection of his personality. This is, of course, a good example of how cognitive biases can develop in the first place: maybe he does have an excuse, but how the heck am I supposed to know that? Having said that, it's still unfair for me to jump to conclusions. See also the actor-observer bias and trait-ascription bias.
- Gambler's Fallacy: the tendency to believe in a Random Number God who keeps track of past events and alters future probabilities accordingly. (A quick simulation after this list shows why the dice don't care.)
- Selective Obliviousness: Denial, pure and simple. Obviously, if you don't want to think or know about something, being able to shut it away can be very useful in the short term; but huge swaths of drama and horror stories have been written about someone who just can't face the truth. Apply with caution.
- Beauty Equals Goodness: the halo effect, which is the tendency for traits in one area of personality to spill over into other areas; and, more directly, the physical attractiveness stereotype, which is the trope played dead straight.
- Nostalgia Filter: rosy retrospection, which is almost the Laconic version of the trope word for word: we tend to forget how much we thought something sucked at the time. (The related "reminiscence bump" helps explain why memories from our teens and twenties loom so large in the first place.)
- Failed a Spot Check: Selective attention and inattentional blindness. We don't think about it much, but every single moment of every day we are bombarded with huge swaths of information: blue skies, trees waving in the wind, the smell of grass, the sound of people's voices, the feeling of wearing clothes. At some point, we must either learn to ignore this stuff or be overwhelmed by it. So we filter out any information that isn't pertinent to the task at hand. ...And run the risk of not noticing that the guy we're talking to isn't the guy we started talking to.
- Bavarian Fire Drill: ohh yes. This is one of Robert Cialdini's "Six Weapons of Influence," word for word. People tend to respond to authority, so act like you're an authority and they'll obey you.
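About that Gambler's Fallacy: if you'd rather see it than take our word for it, here's a quick coin-flip simulation (fair coin assumed) showing that a streak of heads does absolutely nothing to the next flip.

```python
# Gambler's Fallacy demo: after five heads in a row, the next flip of a
# fair coin is still 50/50. The coin has no memory and no Random Number God.
import random

random.seed(0)

STREAK = 5
next_after_streak = []

for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(STREAK + 1)]
    if all(flips[:STREAK]):                 # first five flips were all heads
        next_after_streak.append(flips[STREAK])

rate = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads | {STREAK} heads in a row) ~= {rate:.3f}")  # ~0.5, not "due" for tails
```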
Experimental Psychology
Technically, this is a field of basic psychology, but we decided to section it out because it helps answer a question almost everyone has if they've put any thought into it: "How do you scientifically measure mental events? You can't claim to know what a person is thinking or feeling, but that's what psychology studies. How the heck do you do this?"
Well, first off, we have MRI machines now. But the real and ultimate answer is, we don't. We quantify it using behaviors. If the experiment involves someone getting angry, we don't ask them, "Do You Feel Angry?"; we identify a number of behaviors which could express anger (punching a confederate in the face; yelling; pulling a weapon) and see how many test participants engage in those behaviors. We'll have two groups of participants: a "control group" who go through things normally, and then an "experimental group" where a new stimulus is introduced; sometimes there'll be more than one experimental group if we've identified more than one potential stimulus (and can test it without things getting too complicated). If significantly more members of the experimental group(s) engage in those behaviors after exposure to the new stimulus, we'll think we've got something. A bare-bones sketch of that comparison follows.
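To make that concrete, here's a sketch of the group comparison with invented counts; the two-proportion z-test used here is one common choice for this kind of yes/no behavior count, not necessarily what any particular study would use.

```python
# Compare how many people in each group showed the anger behaviors, and ask
# whether the difference is bigger than chance would comfortably explain.
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts: 19 of 40 in the experimental group lashed out,
# versus 8 of 40 in the control group.
z, p = two_proportion_z(19, 40, 8, 40)
print(f"z = {z:.2f}, p = {p:.3f}")  # small p -> "we'll think we've got something"
```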
Let's use the Milgram experiment as an example of how these things are quantified. If you're participating in the Milgram experiment, it's June 1961 and you arrive at the lab at Yale University. There are two other people there, one of whom is administering the experiment and the other of whom is a fellow participant. The Administrator asks you to draw slips of paper which will determine which of you is the "Learner" and which is the "Teacher"; you pull the "Teacher" slip. Well, the Learner goes into another room, connected to you by an intercom, and you sit down at a control panel, where the Administrator explains that it's your job to get the Learner to remember a list of word-pairs. You teach him, and then give him a four-choice multiple-choice test. If he gets an item wrong, the control panel comes into play: you're required to administer electric shocks to the Learner as punishment, with each button representing a higher voltage. As you continue, the Learner starts screaming in pain, pounding on the wall, and claiming that he has a heart condition, and that if you go further the electric shocks might do something terrible to him...
Well, here's the good news: the "Learner" is not a fellow participant at all. He's what we call a "confederate"--a member of the research team who is pretending to be a participant because that's what's necessary for the test to proceed. (Both slips of paper said "Teacher"; he just lied about his.) You, the actual participant, have been led to believe that the experiment has something to do with the effects of punishment on memory... but psychological experiments are routinely conducted blind, so that the participant doesn't get the chance to guess what hypothesis is being tested and tailor his behavior to support (or bust) it. The test is actually about how far people will go when Just Following Orders. There are no actual electric shocks, but you think there are, and if you complain, the Administrator says things like, "It is absolutely essential that you continue" and "But Thou Must!," and sees what you'll do.
Past a certain point, the confederate is instructed to stop yelling, stop banging on the wall... stop making any responses at all. The participant is told to construe these silences as failed responses and continue administering the shocks, ignoring that something is quite obviously wrong with the Learner. This is why the Milgram experiments are probably too unethical to reproduce today: the participant is made to believe that s/he may well have killed a fellow human being for the sake of an experiment. It was staged just after the trial of Nazi war criminal Adolf Eichmann, who used the "Just Following Orders" excuse, implicitly allowing almost every German citizen alive to use it as well. Milgram wanted to find out just how far people would go if given that excuse. This is why the control panel has the series of buttons: you can now put a number, in volts, on the perils of blind obedience.
Before conducting the experiment, Milgram polled 14 psych majors on how many participants they expected to be willing to go all the way down the series of buttons, to a maximum of 450 volts. These psych majors predicted that at most one person in one hundred would do so. In reality, 26 of the 40 participants--sixty-five percent--pushed all the buttons on the control panel, even after the "Learner" "died" of his "heart condition." All of them expressed disquiet with the idea, but all of them bowed in the face of authority. The participants were normal people like you and me.
After that, it was lots of statistics. (Trade secret: a lot of psychology is just statistics. Someone who hasn't completely forgotten how they work should feel free to add some details about them to this page.) But Milgram determined that there was "statistical significance" in the situation, that more people acted unusually than they would under completely normal circumstances. Like all psychological papers, his included a detailed description of the experiment's methodology, so that other professionals could try to poke holes in what he'd done. No one's really been able to, though (as mentioned) the ethics committees went wild. Most people--including the participants--feel that the tests were worth doing, but most people aren't willing to try and replicate them.
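Since the page just asked for some statistical detail, here's one back-of-the-envelope illustration of what "statistical significance" means here: if the psych majors' prediction (about one fully obedient participant per hundred) were right, the odds of seeing 26 or more out of 40 go all the way would be microscopic. This is purely an illustration; Milgram's own analysis was more involved than a single binomial tail.

```python
# If full obedience really occurred at a 1% rate, how likely is it that 26
# or more of 40 participants would push every button? Exact binomial tail.
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

predicted_rate = 0.01   # "at most one person in one hundred"
observed, total = 26, 40

print(f"observed obedience rate: {observed / total:.0%}")
print(f"P(>= {observed} of {total} obey | rate = {predicted_rate}) = "
      f"{binomial_tail(observed, total, predicted_rate):.2e}")
# Vanishingly small: the prediction and the data don't live in the same universe.
```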
Therapists
There's a stigma in American society (and thus in American media) about psychotherapy, with most people feeling that if you need it, something is inherently wrong with you. The reality is really the opposite: psychology is "the study of human behavior," and like most humans you are behaving most of the time; so what harm could result from examining your own behaviors and trying to improve them? ...Besides the fact that most of us are probably in denial about some of our behaviors and motivations; self-awareness isn't always a value Americans embrace. And that probably feeds into the stigma against therapy, which is ironic because American researchers have been on the bleeding edge of psychology since about The Fifties.
In any case: let's get this out of the way. If you are going to or have been recommended to go to therapy, there is nothing wrong with you. You would simply like some advice on how to be happier with your life. (And who the hell doesn't want that?)
A large part of talking therapy is just that--talking. Armchair Psychology may be a Dead Horse Trope in the media, but sometimes we need someone to tell us the truth. This is why Structuralism, with its reliance on introspection, was deposed to begin with: we human beings are not always good at self-analysis. Think of it like standing inside a building and then trying to see the outside surfaces of said building. It's impossible; there are walls in the way; you can't do it unless you're standing outside the building. The same thing is true of being human: we can't stand outside ourselves enough to see ourselves clearly. It's just a rule of nature. So we need a mirror, and that's where the psychologist comes in.
And, let's face it: it can also be really reassuring to blurt out a hard truth ("I hate my life!") and have someone say, "If I were you, I would too." It can be really reassuring to say, "I Just Want to Be Normal!" and have someone say, "You are." Dispelling the Angst Dissonance of a person in pain is the first step to therapy. This is yet another irony: if you are in therapy, everyone assumes you are abnormal, but the whole point is to acknowledge that it's completely reasonable to be unhappy with the problems you have, and to want help in solving them. That belief is really the fundamental disconnect which causes so many Americans to eschew therapy: needing help is contrary to The American Spirit of Do-It-Yourself Ingenuity. (We are not always a wise culture.)
Being in therapy does not necessarily involve medication. Usually you will see a psychologist first, and s/he will then refer you to a psychiatrist for some or all of the therapy process if they feel medication is necessary. A psychologist is Not That Kind of Doctor, even if they happen to have a Ph.D or even a Psy.D. A psychiatrist, on the other hand, is a medical doctor, one who happened to pick psychiatry as their specialty instead of pediatrics, open-heart surgery, etc.; as such, they can prescribe medication. Psychiatry is specifically about the medical treatment of mental disorders, so you'll see a lot of psychiatrists multi-classing into talking therapy or CAT scans, but fewer into (say) behavioral research. Some jurisdictions are dabbling with letting psychologists prescribe, but by and large, even if you want Prozac, a psychologist cannot give it to you. (So don't rob one.) (You shouldn't have to anyway; mood-stabilization drugs are not psychologically addictive.)
Some Random Facts
- Contrary to popular belief, it seems most "bullies" have high self-esteem, not low self-esteem. No, they don't just need a hug.
- Or more specifically, as one book on self-esteem put it, they are conceited (they have self-esteem without self-worth, leading to a delusion of self-esteem which is dependent largely on bullying others). As the Cracked.com article linked below explores:
Research shows kids who have an inflated sense of self-worth become aggressive when their sense of superiority is called into question, leading to a more damaging fall for little Billy when he realizes what a loser he is (whereas fat Ralph already knew himself to be a loser and is therefore immune to disappointment).
- Schizophrenia does NOT involve multiple personalities. This is a common misconception stemming from the fact that the word literally means "split mind". What it actually means is that a schizophrenic's mind is in conflict with itself; schizophrenia is most associated with delusions and hallucinations (especially auditory hallucinations, i.e., hearing voices). The Split Personality thing is associated with Dissociative Identity Disorder, formerly named Multiple Personality Disorder.
- According to research, opposites do not attract. Instead, similarity and proximity play the biggest roles in attraction.
- More on the difference between psychiatrists and psychologists: In general, a psychologist gets a bachelor's in psychology, then goes to grad school and gets a doctorate in psychology (you can find lower-paying work with just a bachelor's or master's, but to be called a psychologist you need a doctorate). A psychiatrist gets some random undergrad degree, then goes to med school and becomes a doctor, then does a several-year-long psychiatric residency. Although there is a lot of overlap in what they do, in general a psychologist is much more likely to do therapy and talk to you about your problems, while a psychiatrist is more likely to give you pills or deal with severe mental disorders. Some states in the US (not sure about the rest of the world) have given psychologists limited ability to prescribe medications. Usually this has to be approved by a physician anyway, and even in states where psychologists cannot prescribe anything, they can often make recommendations to doctors who can.
- Cracked.com to the rescue: Six bullshit facts about psychology that everyone believes.
- People talk all the time about your five senses--sight, sound, smell, taste and touch. In fact, there are at least seven senses. The sixth is your sense of balance, which works via fluid in your inner ear and is what makes you nauseated when it gets confused (hello, motion sickness). The seventh is known as "proprioception" or the "kinesthetic sense," and has to do with how you keep track of where your limbs are in relation to your body. It's what lets you clap your hands when you can't see, or causes "phantom limb" sensations after an amputation.
- Nerves. Ohh, nerves. Obviously, they're a little bit from biology, but neurons have a great deal to do with psychology, so let's talk about them a little.
- Neurons are one-way connections. The two basic flavors are "sensory" neurons, which go from body to brain, and "motor" neurons, which go from brain to body (specifically muscles).
- The gap between neurons is called a "synapse," a word you've probably heard a number of times. A typical neuron is shaped vaguely like a tree: there are the branches, called "dendrites", and the trunk, called an "axon". (For those wondering, the cell body and its nucleus sit up at the dendrite end rather than down at the roots, so to a certain extent this analogy is ass-backwards.) Synapses are where the business end of an axon dumps "neurotransmitters" into the gap. These neurotransmitters bind with receptors on the dendrite of the neighboring neuron, and this signals that neuron to fire.
- Nerves are not only one-way, they are one-message. Almost every inch of your skin has receptors for pain, pleasure, temperature and pressure sensation. Each of them requires a dedicated neuron. (Yeah, yeah, inefficient. If you're a genetics researcher who's into Bio Augmentation, take it as a challenge.)
- Neurons do not actually use electrical current or lightning to transmit information. However, they do use ions, which carry a minute electrochemical charge. We're not going to get into the science of it, but suffice it to say that without sodium, calcium and potassium ions, your nerves don't work. One catch: ions moving through watery solution are very, very slow compared to electrons in a wire, which puts a hard cap on how fast any single neural signal can travel. The brain compensates through massive parallel processing: instead of one fast but energy-hungry processor, it employs billions of slow, inefficient, but cheap and energy-economical neurons, each chewing on a tiny piece of the problem at the same time--less "one genius at a typewriter," more "an enormous roomful of monkeys who have somehow been organized into a publishing house."
- Here's the really fun part: psychoactive drugs overwhelmingly have their effect in synapses. Nearly every single bloody one of them. Some of them bind to receptors in the synapse (most recreational drugs); some, like the "Selective Serotonin Reuptake Inhibitors" used to treat depression (Paxil, Prozac, Celexa), stop the original axon from collecting its spent neurotransmitters; and a few actually bind to the receptor but have an opposite effect. See why you need a medical degree to prescribe this stuff?
- As a corollary, this means that no drug can make your nervous system do something it doesn't already have the machinery for. That's how "endorphins" got their name--the (possibly apocryphal) story goes that a surgery patient being put under with morphine realized, in her last moments of consciousness: "Wait, my body already has receptor sites for morphine! There must be 'endogenous morphine' already somewhere in my sysgrlpblll agaaagabababa zzzzzzz." (Fortunately, she remembered enough of this after waking up to get the research done.) It also does mean that, yes, Mind Screw drugs like PCP or whatever have a place and a function in your neurochemistry, though what that place is, we can't be buggered to look up at the moment.