The speed of neural information processing is subject to a variety of constraints, including the time for electrochemical signals to traverse axons and dendrites, axonal myelination, the diffusion time of neurotransmitters across the synaptic cleft, differences in synaptic efficacy, the coherence of neural firing, the current availability of neurotransmitters, and the prior history of neuronal firing. Although there are individual differences in something psychometricians call “processing speed,” this does not reflect a monolithic or unitary construct, and certainly nothing as concrete as the speed of a microprocessor.
The signals propagated along axons are actually electrochemical in nature, meaning that they travel much more slowly than electrical signals in a computer, and that they can be modulated in myriad ways.
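To get a feel for the size of that gap, here is a rough back-of-the-envelope sketch comparing how long a signal takes to cross one meter of axon versus one meter of copper. The velocities are order-of-magnitude textbook figures chosen for illustration, not measurements:

```python
# Rough, illustrative comparison: time for a signal to travel one meter.
# Velocities are order-of-magnitude textbook figures, not measurements.
DISTANCE_M = 1.0

velocities_m_per_s = {
    "unmyelinated axon (~1 m/s)": 1.0,
    "myelinated axon (~100 m/s)": 100.0,
    "copper trace (~2/3 c)": 2e8,
}

for label, v in velocities_m_per_s.items():
    print(f"{label:28s} -> {DISTANCE_M / v:.1e} s")
```

Even the fastest myelinated fibers come out millions of times slower than a wire, and that is before any of the modulatory factors above come into play.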
Similarly, there does not appear to be any central clock in the brain, and there is debate as to how clock-like the brain’s time-keeping devices actually are. To use just one example, the cerebellum is often thought to calculate information involving precise timing, as required for delicate motor movements; however, recent evidence suggests that time-keeping in the brain bears more similarity to ripples on a pond than to a standard digital clock.
Although the apparent similarities between RAM and short-term or “working” memory emboldened many early cognitive psychologists, a closer examination reveals strikingly important differences.
Although RAM and short-term memory both seem to require power (sustained neuronal firing in the case of short-term memory, and electricity in the case of RAM), short-term memory seems to hold only “pointers” to long-term memory, whereas RAM holds data that is isomorphic to that being held on the hard disk. (See here for more about “attentional pointers” in short-term memory.)
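One way to make the “pointers” idea concrete is the toy sketch below (purely illustrative, not a model of any particular theory): the RAM-like buffer stores its own copy of the data, while the working-memory-like buffer stores only a reference into a long-term store, so its contents track whatever the long-term representation currently says:

```python
# Toy illustration (not a cognitive model): copies vs. references.
long_term_store = {
    "grandmother": {"concept": "grandmother", "details": "face, voice, ..."},
}

# RAM-style buffer: holds an independent copy of the data.
ram_like = dict(long_term_store["grandmother"])

# Working-memory-style buffer: holds only a pointer (key) into the long-term store.
wm_like = "grandmother"

# Now the long-term representation changes...
long_term_store["grandmother"]["details"] = "face, voice, favorite song"

print(ram_like["details"])                  # unchanged copy: 'face, voice, ...'
print(long_term_store[wm_like]["details"])  # the pointer reflects the update
```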
Unlike RAM, short-term memory has no fixed capacity limit; its capacity seems to fluctuate with differences in “processing speed” (see Difference #4) as well as with expertise and familiarity.
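One standard way to think about the expertise effect is chunking. The toy sketch below (with a hypothetical four-slot buffer, purely for illustration) shows how familiar patterns let the same limited buffer cover far more raw material:

```python
# Toy illustration of chunking: a fixed number of "slots" holds more raw
# digits when familiarity lets them be grouped into larger chunks.
digits = list("149217761969")   # 12 unrelated digits
SLOTS = 4                       # hypothetical buffer size, for illustration only

# Novice: each digit occupies one slot, so only 4 digits are retained.
novice_retained = digits[:SLOTS]

# Expert: familiar patterns ("1492", "1776", "1969") each count as one chunk.
chunks = ["1492", "1776", "1969"]
expert_retained = chunks[:SLOTS]   # 3 chunks = 12 digits, still within 4 slots

print(novice_retained)   # ['1', '4', '9', '2']
print(expert_retained)   # ['1492', '1776', '1969']
```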
For years it was tempting to imagine that the brain was the hardware on which a “mind program” or “mind software” was executing. This gave rise to a variety of abstract, program-like models of cognition in which the details of how the brain actually executed those programs were considered irrelevant, in the same way that a Java program can accomplish the same function as a C++ program.
Unfortunately, this appealing hardware/software distinction obscures an important fact: the mind emerges directly from the brain, and changes in the mind are always accompanied by changes in the brain. Any abstract information processing account of cognition will always need to specify how neuronal architecture can implement those processes – otherwise, cognitive modeling is grossly underconstrained. Some blame this misunderstanding for the infamous failure of “symbolic AI.”
Another pernicious feature of the brain-computer metaphor is that it seems to suggest that brains might also operate on the basis of electrical signals (action potentials) traveling along individual logical gates. Unfortunately, this is only half true. For example, signal transmission depends not only on the putative “logical gates” of synaptic architecture but also on the presence of a variety of chemicals in the synaptic cleft, the relative distance between synapse and dendrites, and many other factors. This adds to the complexity of the processing taking place at each synapse, and it is therefore profoundly wrong to think that neurons function merely as transistors.
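As a caricature of the contrast, compare a logic gate, whose output is a fixed function of its inputs, with a cartoon synapse whose effect on the postsynaptic cell is scaled by transmitter availability, modulatory state, and noise. The function below is illustrative only, not a biophysical model:

```python
import random

# A transistor-like AND gate: output is a fixed function of its inputs.
def and_gate(a: bool, b: bool) -> bool:
    return a and b

# A cartoon synapse: the same presynaptic spike can have very different
# postsynaptic effects depending on the current state of the synapse.
def synaptic_effect(spike: bool,
                    base_weight: float,
                    transmitter_available: float,  # 0..1, depleted by recent firing
                    modulation: float,             # e.g. neuromodulator tone
                    noise_sd: float = 0.05) -> float:
    if not spike:
        return 0.0
    return base_weight * transmitter_available * modulation + random.gauss(0, noise_sd)

print(and_gate(True, True))                   # always True
print(synaptic_effect(True, 0.8, 1.0, 1.0))   # strong postsynaptic effect
print(synaptic_effect(True, 0.8, 0.3, 0.7))   # same spike, much weaker effect
```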
Computers process information from memory using CPUs, and then write the results of that processing back to memory. No such distinction exists in the brain. As neurons process information they are also modifying their synapses – which are themselves the substrate of memory. As a result, retrieval from memory always slightly alters those memories (usually making them stronger, but sometimes making them less accurate – see here for more on this).
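A cartoon of this “reading is writing” property might look like the sketch below (illustrative only, not a model of reconsolidation): a computer read leaves the stored value untouched, while each retrieval of a memory trace strengthens it a little and can subtly distort it:

```python
import random

# Illustrative sketch: retrieval modifies the memory trace it reads.
memory_trace = {"strength": 0.5, "value": 10.0}

def computer_read(cell):
    return cell["value"]          # reading leaves the stored data untouched

def brain_retrieve(trace):
    trace["strength"] = min(1.0, trace["strength"] + 0.1)   # retrieval strengthens the trace
    trace["value"] += random.gauss(0, 0.2)                  # ...and may subtly distort it
    return trace["value"]

for _ in range(3):
    brain_retrieve(memory_trace)

print(memory_trace)   # strength has grown; the stored value has drifted slightly
```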