Inhibition and Excitation: the Yin and Yang of the Brain

Do excitatory and inhibitory neurons implement binary logic in the brain?

Not really. But it’s a good question because we learn a lot when we try to answer it.

First, we have to clarify what the words ‘excitatory’ and ‘inhibitory’ mean.

  • Excitation is the process by which a neuron’s membrane potential (or voltage) increases. If excitation is sufficient, a neuron will produce an action potential.
  • Inhibition is the process by which a neuron’s membrane potential (or voltage) decreases. If a neuron is already firing and it receives enough inhibition, it will stop firing.

So the statement “If some neurons are excitatory meaning they will fire and some inhibitory meaning they won’t” is not quite right. All neurons, whether excitatory or inhibitory, can fire, but only if they receive adequate excitation. If an inhibitory neuron fires, it can reduce the voltage of other neurons, whether they are excitatory or inhibitory. Excitation is the accelerator for all neurons. Inhibition is the brake for all neurons.

If you find this confusing, you might benefit from anthropomorphizing the neurons. Imagine the neurons are people trying to work together. The team is divided into two groups: the optimists and the pessimists. When the optimists talk, other people — both optimists and pessimists — get excited, and talk louder. When the pessimists talk, other people quieten down. Note that the pessimists themselves still get excited when spoken to; they just use their excitement to shut down conversation, whereas the optimists use theirs to promote it.

If the optimists dominate, the team gets stuck in a noisy and chaotic shouting match, and accomplishes nothing. If the pessimists dominate, the team eventually becomes completely silent, and again accomplishes nothing. The key to a productive team is balance between the optimists and the pessimists.

This is the kind of balance that the brain seems to find when it is functioning normally. In neuroscience this is called the balance of excitation and inhibition. Essentially, excitation and inhibition are the yin and yang of the brain, each pulling a neuron’s voltage in the opposite direction. These forces might be thought of as binary, but their effects are not.
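To make the accelerator-and-brake picture concrete, here is a minimal sketch in Python (my own illustration, not something from the original answer) of a leaky integrate-and-fire style neuron. Excitatory input pushes the membrane potential up toward a firing threshold, inhibitory input pulls it back down, and the neuron only produces a spike if the threshold is actually crossed. All of the numbers are made up for illustration.

```python
# Minimal leaky integrate-and-fire sketch. The rest/threshold voltages, leak
# rate and input sizes are illustrative, not fitted to any real neuron.

def simulate(excitation, inhibition, v_rest=-70.0, v_thresh=-55.0,
             v_reset=-70.0, leak=0.1, dt=1.0):
    """Return the voltage trace (mV) and spike times (ms) for two input streams."""
    v = v_rest
    trace, spike_times = [], []
    for t, (exc, inh) in enumerate(zip(excitation, inhibition)):
        # Leak pulls v back toward rest; excitation raises v, inhibition lowers it.
        v += dt * (-leak * (v - v_rest) + exc - inh)
        if v >= v_thresh:            # threshold crossed: the neuron fires and resets
            spike_times.append(t)
            v = v_reset
        trace.append(v)
    return trace, spike_times

_, spikes_exc = simulate([2.0] * 100, [0.0] * 100)   # excitation alone
_, spikes_bal = simulate([2.0] * 100, [1.8] * 100)   # nearly matched inhibition
print("spikes with excitation only:   ", len(spikes_exc))   # fires repeatedly
print("spikes with matched inhibition:", len(spikes_bal))   # stays below threshold
```

With excitation alone the model neuron fires repeatedly; once the inhibitory stream nearly matches the excitatory one, the voltage hovers below threshold and the neuron stays silent. That is the simplest version of the balance described above.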


Is the brain a digital processor?

I’d now like to explore the digital-versus-analog concept in the context of spiking neurons. What follows may be too detailed for a casual reader, but those interested in really learning about the brain should read on.

It is very popular, even among neuroscientists, to think of the firing of a neuron as a binary event. Early models of neurons assumed this was the case. To this day, many artificial neuron models assume that the neuron is an “all-or-none” device that is either firing or not firing. If this were true, then biological neural networks could in principle implement a binary code: neurons that receive more excitation than inhibition might be the ones, and neurons that receive more inhibition than excitation might be the zeros.
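For contrast, here is a hedged sketch of that “all-or-none” assumption: a threshold unit in the spirit of the earliest artificial neuron models, whose output is strictly 1 or 0. The weights and threshold below are arbitrary choices for illustration, not values taken from any real model.

```python
# An "all-or-none" artificial neuron: output is 1 if net excitation beats the
# threshold after inhibition is subtracted, otherwise 0. Weights are arbitrary.

def binary_neuron(inputs, weights, threshold=0.5):
    """Positive weights act like excitation, negative weights like inhibition."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

weights = [0.8, 0.6, -1.2]                # two excitatory inputs, one inhibitory input
print(binary_neuron([1, 1, 0], weights))  # enough excitation -> 1
print(binary_neuron([1, 1, 1], weights))  # inhibition tips the balance -> 0
```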

The typical neural phenomenon that is described as binary is the action potential or spike: if you look at the voltage of a neuron on a 2–4 millisecond timescale, it is usually possible to approximate the voltage pattern in time as a stream of ones and zeros. But if you look at the diagram above, it should be clear why this is an approximation. On any timescale smaller than around 1 millisecond, the spike looks like an analog signal: it is a continuous fluctuation in voltage. And the shape of this fluctuation depends on a variety of factors including neuron type.
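One way to see why the ones-and-zeros description is an approximation is to take a finely sampled voltage trace and collapse it into millisecond-wide bins. The sketch below uses a synthetic trace with crude, made-up spike shapes, not recorded data: the binary stream only appears after binning, and the analog shape of each spike is discarded in the process.

```python
import numpy as np

# Synthetic membrane-voltage trace sampled every 0.05 ms: rest at -70 mV with
# two brief spike-like excursions. Purely illustrative, not real data.
dt = 0.05                                   # ms per sample
t = np.arange(0, 20, dt)                    # 20 ms of "recording"
v = np.full_like(t, -70.0)
for spike_time in (5.0, 12.0):
    # a crude spike shape: a sharp depolarization followed by a small dip
    v += 100.0 * np.exp(-((t - spike_time) ** 2) / 0.1)
    v -= 10.0 * np.exp(-((t - spike_time - 1.0) ** 2) / 0.5)

# Collapse into 1-ms bins: a bin becomes 1 if the voltage crossed a threshold.
bin_width = int(1.0 / dt)
binary = [int(v[i:i + bin_width].max() > -20.0)
          for i in range(0, len(v), bin_width)]
print(binary)   # e.g. [0, 0, 0, 0, 0, 1, 0, ...]: the analog shape is gone
```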

It is important to realize that the spikes themselves very rarely get transmitted to the next neuron — known as the post-synaptic neuron — in unaltered form. At the synapse, the discrete-looking spikes cause the axon terminal of the pre-synaptic neuron to release neurotransmitters. The neurotransmitters bind to receptors on the dendrite or cell body of the post-synaptic neuron, which leads to the emergence of post-synaptic potentials. These potentials are much fuzzier and more analog than spikes, because their rise and fall times are longer than those of a spike. How fuzzy they are depends on what kinds of receptors are present on the post-synaptic neuron. In other words, the receiver of a neural signal is an active participant in its ‘interpretation’.

Post-synaptic potentials typically add up in an analog manner. If they are excitatory, they cause the voltage of the post-synaptic neuron to go up, and if they are inhibitory, they cause the voltage of the post-synaptic neuron to go down.

In the figure above [1], the spikes or action potentials are the sharp lines in the bottom half of the diagram. Above them, you see the voltage of the post-synaptic neuron. It gradually increases in a more-or-less analog manner. That means that the pre-synaptic spikes are triggering excitatory post-synaptic potentials. Now recall that each neuron can receive input from perhaps thousands of neurons. The probability of a binary code being transmitted from one neuron to another becomes quite slim. Neurons integrate their inputs in a manner that looks more analog than digital.
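Here is a hedged sketch of that integration step. Each pre-synaptic spike train is smeared into post-synaptic potentials by convolving it with a slow kernel (a simple decaying exponential here; real PSP shapes depend on the receptors involved), and the contributions of many excitatory and inhibitory inputs are summed. The spike trains, the kernel time constant and the excitatory/inhibitory split are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                       # ms
T = 200                        # simulate 200 ms
n_inputs = 50                  # pretend the neuron listens to 50 pre-synaptic cells

# Random binary spike trains: 1 = a spike in that millisecond bin.
spikes = (rng.random((n_inputs, T)) < 0.02).astype(float)

# Each spike is smeared into a post-synaptic potential by a decaying-exponential
# kernel. The 10 ms decay constant is a stand-in; real PSPs vary with receptor type.
tau = 10.0
kernel = np.exp(-np.arange(0, 5 * tau, dt) / tau)

# Excitatory inputs push the summed potential up, inhibitory inputs pull it down.
signs = np.where(np.arange(n_inputs) < 40, +1.0, -1.0)   # 40 excitatory, 10 inhibitory
psp_sum = np.zeros(T)
for train, sign in zip(spikes, signs):
    psp_sum += sign * np.convolve(train, kernel)[:T]

print("input spikes are 0/1; the summed potential is continuous:")
print(np.round(psp_sum[:20], 2))
```

The inputs are strictly 0 or 1, but the summed potential is a smooth, continuous curve, which is the sense in which the integration looks analog rather than digital.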

Even this story is a simplification. There are complex currents in neurons that can lead to phenomena like habituation, facilitation and rebound excitation. Ultimately the effect of one neuron on another depends on the precise voltage pattern of each neuron and the distribution of neurotransmitters and receptors.
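Facilitation, for example, can be sketched in a few lines. The toy model below is a simplified, made-up variant of standard short-term plasticity models, not anything specific to this post: each spike boosts a facilitation variable that decays between spikes, so closely spaced spikes produce progressively larger post-synaptic responses.

```python
import math

# Toy short-term facilitation: each pre-synaptic spike boosts a facilitation
# variable `f`, which decays back toward 1.0 between spikes. The effective PSP
# amplitude is baseline * f, so rapid bursts yield growing responses.
# Constants are illustrative only.

def psp_amplitudes(spike_times_ms, baseline=1.0, boost=0.4, tau=50.0):
    amplitudes = []
    f, last_t = 1.0, None
    for t in spike_times_ms:
        if last_t is not None:
            f = 1.0 + (f - 1.0) * math.exp(-(t - last_t) / tau)  # decay toward 1
        amplitudes.append(baseline * f)
        f += boost          # the spike itself facilitates the next response
        last_t = t
    return amplitudes

print(psp_amplitudes([0, 10, 20, 30]))     # rapid burst: amplitudes grow
print(psp_amplitudes([0, 300, 600, 900]))  # slow train: facilitation decays away
```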

Returning to the digital-analog question, it makes sense to think of action potentials as a discrete medium of transmission, but only for internal transmission: discrete signals seem to help the neuron send information along its axon in an energy-efficient manner. But the binary nature of the signal usually gets ‘blurred’ as soon as the signal reaches the axon terminal. The blurring occurs for two reasons:

  1. the post-synaptic potentials are more extended in time than spikes
  2. potentials from multiple pre-synaptic neurons arrive at the post-synaptic neuron

So what does inhibition do?

If inhibitory neurons are not creating zeros in a binary code, what might they be doing? We still don’t know for sure, but we have a lot of experimental data, and also some theoretical ideas. I’ve discussed the possible roles of inhibition here:

What do inhibitory neurons do in the brain? by Yohan John on Neurologism

In short, inhibition seems essential for mediating the following computational phenomena:

  • Resetting other neurons
  • Decorrelating patterns to highlight differences
  • Contrast-enhancement and edge-detection
  • Competition between neurons or groups of neurons
  • Brain oscillations or rhythms such as gamma, beta, alpha and theta oscillations (which I’ve written about in this post)

These topics are more complex than excitation, so if you’re curious, check out the blog post.
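To give a flavor of one item on that list, here is a small, generic sketch of contrast enhancement via lateral inhibition: each unit is excited by its own input and inhibited by its neighbours, which exaggerates the response difference at an edge in the input. It is a textbook-style toy with arbitrary parameters, not a model from the linked post.

```python
import numpy as np

# Lateral inhibition as a contrast enhancer: each unit receives its own input
# (excitation) minus a fraction of its neighbours' inputs (inhibition).
# Parameter values are arbitrary, chosen only to make the effect visible.

def lateral_inhibition(stimulus, inhibition_strength=0.4):
    stimulus = np.asarray(stimulus, dtype=float)
    neighbours = np.zeros_like(stimulus)
    neighbours[1:] += stimulus[:-1]      # left neighbour
    neighbours[:-1] += stimulus[1:]      # right neighbour
    response = stimulus - inhibition_strength * neighbours
    return np.clip(response, 0.0, None)  # firing rates can't go below zero

# A dim-to-bright "edge" across a row of 10 units:
edge = [1, 1, 1, 1, 1, 5, 5, 5, 5, 5]
print(lateral_inhibition(edge))
```

The response dips just before the edge and peaks just after it, the same kind of effect that produces Mach bands in visual perception.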


[1] Figure source: Postsynaptic Potentials

This post was an answer to a Quora question, and was written last year.