Synaptic weight

In neuroscience and computer science, synaptic weight refers to the strength or amplitude of a connection between two nodes, corresponding in biology to the amount of influence the firing of one neuron has on another. The term is typically used in artificial neural network and biological neural network research.

Computation

In a computational neural network, a vector or set of inputs \(\mathbf{x}\) and outputs \(\mathbf{y}\), corresponding to pre- and post-synaptic neurons respectively, are interconnected with synaptic weights represented by the matrix \(w\), where for a linear neuron

:\( y_j = \sum_i w_{ji} x_i \quad \textrm{or} \quad \mathbf{y} = w\mathbf{x}. \)
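
As a minimal sketch of this computation (with made-up numbers, not from the article), the weight matrix below connects three presynaptic inputs to two postsynaptic outputs, where w[j, i] is the weight from input i to output j:

 import numpy as np
 
 # Three presynaptic inputs x and two postsynaptic outputs y.
 # w[j, i] is the synaptic weight from input i to output j.
 x = np.array([0.5, 1.0, 0.2])        # presynaptic activity
 w = np.array([[0.1, 0.8, 0.2],
               [0.4, 0.3, 0.9]])      # 2 x 3 synaptic weight matrix
 
 y = w @ x                            # y_j = sum_i w_ji * x_i
 print(y)                             # ≈ [0.89 0.68]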

The synaptic weight is changed according to a learning rule, the most basic of which is Hebb's rule, usually stated in biological terms as

"Neurons that fire together, wire together."

Computationally, this means that if a large signal from one of the input neurons results in a large signal from one of the output neurons, then the synaptic weight between those two neurons will increase. The rule is unstable, however, because correlated activity only ever strengthens the connection, so the weights grow without bound; in practice it is typically modified using variants such as Oja's rule, radial basis functions, or the backpropagation algorithm.
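
As a rough illustration of that instability (a sketch with an arbitrary learning rate and input distribution, not taken from the article), the plain Hebbian update lets the weight norm grow without bound, while Oja's rule adds a forgetting term that keeps it bounded:

 import numpy as np
 
 rng = np.random.default_rng(0)
 eta = 0.01                                   # learning rate (illustrative)
 w_hebb = rng.normal(size=2)
 w_oja = w_hebb.copy()
 
 for _ in range(5000):
     x = rng.multivariate_normal([0, 0], [[3.0, 1.0], [1.0, 1.0]])
     w_hebb += eta * (w_hebb @ x) * x         # plain Hebb: positive feedback only
     y = w_oja @ x
     w_oja += eta * y * (x - y * w_oja)       # Oja: decay term bounds ||w||
 
 print(np.linalg.norm(w_hebb))                # astronomically large
 print(np.linalg.norm(w_oja))                 # settles near 1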

Biology

For biological networks, the effect of synaptic weights is not as simple as for linear neurons or Hebbian learning. However, biophysical models such as BCM theory have seen some success in mathematically describing these networks.
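
As a rough sketch of the BCM idea (a simplification with illustrative constants, not the full theory), the sign of the weight change depends on whether postsynaptic activity lies above or below a threshold that itself slides with the recent average of squared activity:

 import numpy as np
 
 rng = np.random.default_rng(1)
 eta, tau = 0.005, 0.05               # learning rate and threshold time constant (illustrative)
 w = rng.uniform(0.1, 0.5, size=3)    # small positive initial weights (assumption)
 theta = 0.1                          # sliding modification threshold
 
 for _ in range(10000):
     x = np.abs(rng.normal(size=3))   # non-negative presynaptic rates (made up)
     y = max(w @ x, 0.0)              # rectified postsynaptic rate
     w += eta * x * y * (y - theta)   # depression below theta, potentiation above
     theta += tau * (y ** 2 - theta)  # theta tracks a running average of y^2
 
 print(w, theta)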

In the mammalian central nervous system, signal transmission is carried out by interconnected networks of nerve cells, or neurons. For the basic pyramidal neuron, the input signal is carried by the axon, which releases neurotransmitter chemicals into the synapse; these are picked up by the dendrites of the next neuron, which can then generate an action potential, the analogue of the output signal in the computational case.

The synaptic weight in this process is determined by several variable factors:
* How well the input signal propagates through the axon (see myelination),
* The amount of neurotransmitter released into the synapse and the amount that can be absorbed by the following cell (determined by the number of AMPA and NMDA receptors on the cell membrane and the amount of intracellular calcium and other ions),
* The number of such connections made by the axon to the dendrites,
* How well the signal propagates and integrates in the postsynaptic cell.

The changes in synaptic weight that occur are known as synaptic plasticity, and the process behind long-term changes (long-term potentiation and depression) is still poorly understood. Hebb's learning rule was originally applied to biological systems, but it has had to undergo many modifications as a number of theoretical and experimental problems came to light.

See also

* Neural network
* Synaptic plasticity
* Hebbian learning

