SENSATION AND PERCEPTION
Sensing the World: Basic Principles.
With these simple observations you have exercised four of your senses: vision, hearing, touch, and smell. The primary function of the nervous system is communication—the transmission of information from one part of the body to another. Where does that information come from? Put simply, your senses are the gateway through which your brain receives all its information about the environment. It’s a process that is so natural and automatic that we typically take it for granted until it is disrupted by illness or injury. Nevertheless, as the stories of Paul and Warren demonstrate, people with one nonfunctional sense are amazingly adaptive. Often, they learn to compensate for the missing environmental information by relying on their other senses.

In this chapter, we will explore the overlapping processes of sensation and perception. Sensation refers to the detection and basic sensory experience of environmental stimuli, such as sounds, objects, and odors. Perception occurs when we integrate, organize, and interpret sensory information in a way that is meaningful. Here’s a simple example to contrast the two terms. Your eyes’ physical response to light, splotches of color, and lines reflects sensation. Integrating and organizing those sensations so that you interpret the light, splotches of color, and lines as a painting, a flag, or some other object reflects perception.

Where does the process of sensation leave off and the process of perception begin? There is no clear boundary line between the two processes as we actually experience them. In fact, many researchers in this area of psychology regard sensation and perception as a single process. Although the two processes overlap, in this chapter we will present sensation and perception as separate discussions. In the first half of the chapter, we’ll discuss the basics of sensation—how our sensory receptors respond to stimulation and transmit that information in usable form to the brain.
In the second half of the chapter, we’ll explore perception—how the brain actively organizes and interprets the signals sent from our sensory receptors. We’re accustomed to thinking of the senses as being quite different from one another.
However, all our senses involve some common processes. All sensation is a result of the stimulation of specialized cells, called sensory receptors, by some form of energy. Imagine biting into a crisp, red apple. Your experience of hearing the apple crunch is a response to the physical energy of vibrations in the air, or sound waves. The sweet taste of the apple is a response to the physical energy of chemicals dissolved in your mouth, just as the distinctive sharp aroma of the apple is a response to airborne chemical molecules that you inhale through your nose. The smooth feel of the apple’s skin is a response to the pressure of the apple against your hand. And the mellow red color of the apple is a response to the physical energy of light waves reflecting from the irregularly shaped object you’ve just bitten into.

Sensory receptors convert these different forms of physical energy into electrical impulses that are transmitted via neurons to the brain. The process by which a form of physical energy is converted into a coded neural signal that can be processed by the nervous system is called transduction. These neural signals are sent to the brain, where the perceptual processes of organizing and interpreting the coded messages occur. These are the basic steps involved in sensation and perception.

We are constantly being bombarded by many different forms of energy. For instance, at this very moment radio and television waves are bouncing around the atmosphere and passing through your body. However, sensory receptors are so highly specialized that they are sensitive only to very specific types of energy.
Thresholds. Sensory Adaptation.
To study sensation is to study an ageless question: How does the world out there get represented in here, inside our heads? Put another way, how are the external stimuli that strike our bodies transformed into messages that our brains comprehend?
Each species comes equipped with sensitivities that enable it to survive and thrive. We sense only a portion of the sea of energy that surrounds us, but to this portion we are exquisitely sensitive. Our absolute threshold for any stimulus is the minimum stimulation necessary for us to detect it 50 percent of the time. Signal detection researchers report that our individual absolute thresholds vary with our psychological state.
Experiments reveal that we can process some information from stimuli too weak to recognize. But the restricted conditions under which this occurs would not enable unscrupulous opportunists to exploit us with subliminal messages.
To survive and thrive, an organism must have difference thresholds low enough to detect minute changes in important stimuli. In humans, a difference threshold (also called a just noticeable difference, or jnd) increases in proportion to the size of the stimulus—a principle known as Weber’s law.
Along with being specialized as to the types of energy that can be detected, our senses are specialized in other ways as well. We do not have an infinite capacity to detect all levels of energy. To be sensed, a stimulus must first be strong enough to be detected—loud enough to be heard, concentrated enough to be smelled, bright enough to be seen. The point at which a stimulus is strong enough to be detected because it activates a sensory receptor cell is called a threshold. There are two general kinds of sensory thresholds for each sense—the absolute threshold and the difference threshold.
The absolute threshold refers to the smallest possible strength of a stimulus that can be detected half the time. Why just half the time? Because the minimum level of stimulation that can be detected varies from person to person and from trial to trial. Because of this human variability, researchers have arbitrarily set the limit as the minimum level of stimulation that can be detected half the time. Under ideal conditions (which rarely occur in normal daily life), our sensory abilities are far more sensitive than you might think. Can stimuli that are below the absolute threshold affect us?
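To make the 50 percent criterion concrete, here is a minimal simulation sketch. The logistic detection curve and its parameter values (a threshold of 10 arbitrary units, a noise parameter of 2) are illustrative assumptions, not data from the text; the point is only that the absolute threshold is defined as the intensity detected on half of the trials.

```python
import random
from math import exp

random.seed(0)  # reproducible illustration

def detection_probability(intensity, threshold=10.0, noise=2.0):
    """Illustrative logistic model: the chance of detecting a stimulus
    rises smoothly with intensity and is exactly 50 percent when the
    intensity equals the absolute threshold."""
    return 1.0 / (1.0 + exp(-(intensity - threshold) / noise))

def detection_rate(intensity, trials=10_000):
    """Fraction of simulated trials on which the stimulus is detected."""
    hits = sum(random.random() < detection_probability(intensity)
               for _ in range(trials))
    return hits / trials

# At the threshold itself (10 units here), detection hovers near 50 percent;
# well above it, detection is nearly certain.
at_threshold = detection_rate(10.0)
well_above = detection_rate(20.0)
```

Because detection is probabilistic near the threshold, repeated measurement over many trials is exactly how researchers pin down the 50 percent point in practice.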
The other important threshold involves detecting the difference between two stimuli. The difference threshold is the smallest possible difference between two stimuli that can be detected half the time. Another term for the difference threshold is just noticeable difference, which is abbreviated jnd. The just noticeable difference will vary depending on its relation to the original stimulus. This principle of sensation is called Weber’s law, after the German physiologist Ernst Weber (1795–1878).
Weber’s law holds that for each sense, the size of a just noticeable difference is a constant proportion of the size of the initial stimulus. So, whether we can detect a change in the strength of a stimulus depends on the intensity of the original stimulus. For example, if you are holding a pebble (the original stimulus), you will notice an increase in weight if a second pebble is placed in your hand. But if you start off holding a very heavy rock (the original stimulus), you probably won’t detect an increase in weight when the same pebble is balanced on it. What Weber’s law underscores is that our psychological experience of sensation is relative. There is no simple, one-to-one correspondence between the objective characteristics of a physical stimulus, such as the weight of a pebble, and our psychological experience of it.
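Weber’s law can be written as a one-line formula: the just noticeable difference equals a constant fraction k (the Weber fraction) of the original stimulus intensity. The sketch below assumes an illustrative Weber fraction of 2 percent; that specific value is an assumption for the example, not a figure from the text.

```python
def jnd(intensity, k=0.02):
    """Weber's law: the just noticeable difference (jnd) is a constant
    proportion k of the original stimulus intensity.
    k = 0.02 is an illustrative Weber fraction, not a value from the text."""
    return k * intensity

# A 100 g pebble needs only about 2 g more weight to feel heavier,
# but a 10 kg rock needs about 200 g more: the same 2 percent.
light = jnd(100)      # grams of added weight needed for a pebble
heavy = jnd(10_000)   # grams of added weight needed for a heavy rock
```

This is why the pebble placed on the heavy rock goes unnoticed: the added weight is far below the jnd for that much larger original stimulus.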
Suppose your best friend has invited you over for a spaghetti dinner. As you walk in the front door, you’re almost overwhelmed by the odor of onions and garlic cooking on the stove. However, after just a few moments, you no longer notice the smell. Why? Because your sensory receptor cells become less responsive to a constant stimulus. This gradual decline in sensitivity to a constant stimulus is called sensory adaptation. Once again, we see that our experience of sensation is relative—in this case, relative to the duration of exposure.
Sensory adaptation refers to our ability to adapt to unchanging stimuli. For example, when we smell an odor in a room we’ve just entered and remain in that room for a period of time, the odor will no longer be easily detected. The phenomenon of sensory adaptation focuses our attention on informative changes in stimulation by diminishing our sensitivity to constant or routine odors, sounds, and touches.
Because of sensory adaptation, we become accustomed to constant stimuli, which allows us to quickly notice new or changing stimuli. This makes sense. If we were continually aware of all incoming stimuli, we’d be so overwhelmed with sensory information that we wouldn’t be able to focus our attention. So, for example, once you manage to land your posterior on the sofa, you don’t need to be constantly reminded that the sofa is beneath you.
Vision. The Stimulus Input: Light Energy.
A lone caterpillar on the screen door, the pile of dirty laundry in the closet corner, a spectacular autumn sunset, the intricate play of color, light, and texture in a painting by Monet: all of these experiences depend on vision. The sense organ for vision is the eye, which contains receptor cells that are sensitive to the physical energy of light. Before we can talk about how the eye functions, we need to briefly discuss some characteristics of light as the visual stimulus.
Each sense receives stimulation, transduces it into neural signals, and sends these neural messages to the brain. We have glimpsed how this happens with vision.
The energies we experience as visible light are a thin slice from the broad spectrum of electromagnetic radiation. The hue and brightness we perceive in a light depend on its wavelength and intensity. The electromagnetic spectrum is the entire range of wavelengths or frequencies of electromagnetic radiation, from very short gamma waves at one end of the spectrum to the longest radio waves at the other end. Visible light is the portion of the electromagnetic spectrum that can be detected by the human eye. Light has characteristics of both particles (photons) and waves; visible light spans wavelengths between about 400 and 700 nm. This range is sometimes called the range of visible light or, more correctly, the visible spectrum. Within the visible spectrum, each color has a different wavelength.
After entering the eye and being focused by a camera-like lens, light waves strike the retina. The retina’s light-sensitive rods and color-sensitive cones convert the light energy into neural impulses, which are coded by the retina before traveling along the optic nerve to the brain.
The visual system includes the eyes, the accessory structures, and the optic nerves (II), tracts, and pathways. The eyes respond to light and initiate afferent action potentials, which are transmitted from the eyes to the brain by the optic nerves and tracts. The accessory structures, such as eyebrows, eyelids, eyelashes, and tear glands, help protect the eyes from direct sunlight and damaging particles.
The retina of each eye, which gives us the potential to see the whole world, is about the size and thickness of a postage stamp. The retina consists of a pigmented retina and a sensory retina. The sensory retina contains three layers of neurons: photoreceptor, bipolar, and ganglionic. The cell bodies of these neurons form nuclear layers separated by plexiform layers, where the neurons of adjacent layers synapse with each other.
Visual Information Processing.
In the cortex, individual neurons called feature detectors respond to specific features of a visual stimulus, and their information is pooled for interpretation by higher-level brain cells. Sub-dimensions of vision (color, movement, depth, and form) are processed separately and simultaneously, illustrating the brain’s capacity for parallel processing. The visual pathway faithfully represents retinal stimulation, but the brain’s representation incorporates our assumptions, interests, and expectations.
The optic nerve (II) leaves the eye and exits the orbit through the optic foramen to enter the cranial cavity. Just inside the vault and just anterior to the pituitary, the optic nerves are connected to each other at the optic chiasm. Ganglion cell axons from the nasal retina (the medial portion of the retina) cross through the optic chiasm and project to the opposite side of the brain. Ganglion cell axons from the temporal retina (the lateral portion of the retina) pass through the optic nerves and project to the brain on the same side of the body without crossing.
Beyond the optic chiasm, the route of the ganglionic axons is called the optic tract. Most of the optic tract axons terminate in the lateral geniculate nucleus of the thalamus. Some axons do not terminate in the thalamus but separate from the optic tract to terminate in the superior colliculi, the center for visual reflexes. Neurons of the lateral geniculate nucleus form the fibers of the optic radiations, which project to the visual cortex in the occipital lobe. Neurons of the visual cortex integrate the messages coming from the retina into a single message, translate that message into a mental image, and then transfer the image to other parts of the brain, where it is evaluated and either ignored or acted on.
The visual fields of the eyes partially overlap. The region of overlap is the area of binocular vision, seen with two eyes at the same time, and it is responsible for depth perception, the ability to distinguish between near and far objects and to judge their distance. Because humans see the same object with both eyes, the image of the object reaches the retina of one eye at a slightly different angle from that of the other. With experience, the brain can interpret these differences in angle so that distance can be judged quite accurately.
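The link between those angular differences and judged distance can be sketched with the standard stereo-geometry relation, depth = (eye separation × focal length) / disparity. All numeric values below are illustrative assumptions, not measurements from the text.

```python
def depth_from_disparity(baseline_cm, focal_cm, disparity_cm):
    """Standard pinhole stereo relation: nearer objects produce a larger
    disparity (positional difference) between the two eyes' images, so
    estimated depth falls as disparity grows."""
    return baseline_cm * focal_cm / disparity_cm

# Illustrative values: roughly 6.5 cm between the eyes and a 1.7 cm
# eye focal length (assumed for the example).
near = depth_from_disparity(6.5, 1.7, 0.1)    # large disparity: close object
far = depth_from_disparity(6.5, 1.7, 0.01)   # small disparity: distant object
```

The inverse relationship is the key point: as disparity shrinks toward zero, computed depth grows without bound, which is why binocular depth cues work best for relatively nearby objects.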
Research on how we see color supports two nineteenth-century theories. First, as the Young-Helmholtz trichromatic (three-color) theory suggests, the retina contains three types of cones. Each is most sensitive to the wavelengths of one of the three primary colors of light (red, green, or blue). Second, as opponent-process theory maintains, the nervous system codes the color-related information from the cones into pairs of opponent colors, as demonstrated by the phenomenon of afterimages and as confirmed by measuring opponent processes within visual neurons of the thalamus. The phenomenon of color constancy under varying illumination shows that our brains construct our experience of color.
Researchers now believe that an additional level of color processing takes place in the ganglion cells. As described by the opponent-process theory, the ganglion cells respond to and encode color in terms of opposing pairs. In the brain, the thalamus and visual cortex also encode color in terms of opponent pairs. Consequently, both theories contribute to our understanding of the process of color vision. Each theory simply describes color vision at a different stage of visual processing.
Hearing. The Stimulus Input: Sound Waves.
The pressure waves we experience as sound vary in frequency and amplitude, and correspondingly in perceived pitch and loudness.
Through a mechanical chain of events, sound waves traveling through the auditory canal cause minuscule vibrations in the eardrum. Transmitted via the bones of the middle ear to the fluid-filled cochlea, these vibrations create movement in tiny hair cells, triggering neural messages to the brain.
Research on how we hear pitch supports both the place theory, which best explains the sensation of high-pitched sounds, and frequency theory, which best explains the sensation of low-pitched sounds. We localize sound by detecting minute differences in the intensity and timing of the sounds received by each ear.
Hearing Loss and Deaf Culture. Living in a Silent World.
Hearing losses linked to conduction and nerve disorders can be caused by prolonged exposure to loud noise and by diseases and age-related disorders. Those who live with hearing loss face social challenges. Cochlear implants can enable some hearing for deaf children and most adults. But Deaf Culture advocates, noting that Sign is a complete language, question the need for such enhancement. Additionally, deafness can lead to sensory compensation, in which other senses are enhanced. Advocates feel that this furthers their view that deafness is not a disability.
Other Important Senses: Touch, Pain, Taste, Smell.
Our sense of touch is actually four senses—pressure, warmth, cold, and pain—that combine to produce other sensations, such as "hot." One theory of pain is that a "gate" in the spinal cord either opens to permit pain signals traveling up small nerve fibers to reach the brain, or closes to prevent their passage. Because pain is both a physiological and a psychological phenomenon, it often can be controlled through a combination of physical and psychological treatments.
While vision, hearing, smell, and taste provide you with important information about your environment, another group of senses provides you with information that comes from a source much closer to home: your own body. In this section, we’ll first consider the skin senses, which provide essential information about your physical status and your physical interaction with objects in your environment. We’ll next consider the body senses, which keep you informed as to your position and orientation in space.

We usually don’t think of our skin as a sense organ. But the skin is in fact the largest and heaviest sense organ. The skin of an average adult covers about 20 square feet of surface area and weighs about six pounds. There are many different kinds of sensory receptors in the skin. Some of these sensory receptors are specialized to respond to just one kind of stimulus, such as pressure, warmth, or cold. Other skin receptors respond to more than one type of stimulus.
One important receptor involved with the sense of touch, called the Pacinian corpuscle, is located beneath the skin. When stimulated by pressure, the Pacinian corpuscle converts the stimulation into a neural message that is relayed to the brain. If the pressure is constant, sensory adaptation takes place. The Pacinian corpuscle either reduces the number of signals sent or quits responding altogether (which is fortunate, or you’d be unable to forget the fact that you’re wearing underwear). Sensory receptors are distributed unevenly among different areas of the body, which is why sensitivity to touch and temperature varies from one area of the body to another. Your hands, face, and lips, for example, are much more sensitive to touch than are your back, arms, and legs. That’s because your hands, face, and lips are much more densely packed with sensory receptors.
Pain is important to our survival. It provides us with important information about our body, telling us to pay attention, to stop what we are doing, or to pull away from some object or stimulus that is injuring us. A wide variety of stimuli can produce pain—the sensation of discomfort or suffering. Virtually any external stimulus that can produce tissue damage can cause pain, including certain chemicals, electric shock, and extreme heat, cold, pressure, or noise. Pain can also be caused by internal stimuli, such as disease, infection, or deterioration of bodily functions. Some areas of the body are more sensitive to pain than are other areas.
The most influential theory of pain is the gate-control theory, developed by psychologist Ronald Melzack and anatomist Patrick Wall (1965, 1996). The gate-control theory suggests that the sensation of pain is controlled by a series of “gates” that open and close in the spinal cord. If the spinal gates are open, pain is experienced. If the spinal gates are closed, no pain is experienced.
Taste, a chemical sense, is likewise a composite of five basic sensations—sweet, sour, salty, bitter, and umami—and of the aromas that interact with information from the taste buds. The influence of smell on our sense of taste is an example of sensory interaction.
Our sense of taste, or gustation, results from the stimulation of special receptors in the mouth. The stimuli that produce the sensation of taste are chemical substances in whatever you eat or drink. These substances are dissolved by saliva, allowing the chemicals to activate the taste buds. Each taste bud contains about 50 receptor cells that are specialized for taste.
The surface of the tongue is covered with thousands of little bumps with grooves in between. These grooves are lined with the taste buds. Taste buds are also located on the insides of your cheeks, on the roof of your mouth, and in your throat (Oakley, 1986). When activated, special receptor cells in the taste buds send neural messages along pathways to the thalamus in the brain. In turn, the thalamus directs the information to several regions in the cortex (O’Doherty & others, 2001b). There were long thought to be four basic taste categories: sweet, salty, sour, and bitter. Recently, the receptor cells for a fifth basic taste, umami, were identified (Chaudhari & others, 2000). Loosely translated, umami means “yummy” or “delicious” in Japanese. Umami is the distinctive taste of monosodium glutamate and is associated with protein-rich foods and the savory flavor of Parmesan and other aged cheeses, mushrooms, seaweed, and meat. Each taste bud shows maximum sensitivity to one particular taste, and lesser sensitivity to other tastes. Most tastes are complex and result from the activation of different combinations of basic taste receptors. Taste is just one aspect of flavor, which involves several sensations, including the aroma, temperature, texture, and appearance of food.
Like taste, smell is a chemical sense, but there are no basic sensations for smell, as there are for touch and taste. Unlike the retina’s receptor cells that sense color by breaking it into component parts, the 5 million olfactory receptor cells with their 1000 different receptor proteins recognize individual odor molecules. Some odors trigger a combination of receptors. Like other stimuli, odors can spontaneously evoke memories and feelings.
The sensory stimuli that produce our sensation of an odor are molecules in the air. These airborne molecules are emitted by the substance we are smelling. We inhale them through the nose and through the opening in the palate at the back of the throat. In the nose, the molecules encounter millions of olfactory receptor cells located high in the nasal cavity. Unlike the sensory receptors for hearing and vision, the olfactory receptors are constantly being replaced. Each cell lasts for only about 30 to 60 days. In 1991, neuroscientists Linda Buck and Richard Axel identified the odor receptors that are present on the hairlike fibers of the olfactory neurons. Like synaptic receptors, each odor receptor seems to be specialized to respond to molecules of a different chemical structure. When these olfactory receptor cells are stimulated by the airborne molecules, the stimulation is converted into neural messages that pass along their axons, bundles of which make up the olfactory nerves.
So far, hundreds of different odor receptors have been identified (Mombaerts, 1999). We probably don’t have a separate receptor for each of the estimated 10,000 different odors that we can identify, however. Rather, each receptor is like a letter in an olfactory alphabet. Just as different combinations of letters in the alphabet are used to produce recognizable words, different combinations of olfactory receptors produce the sensation of distinct odors. Thus, the airborne molecules activate specific combinations of receptors, and the brain identifies an odor by interpreting the pattern of olfactory receptors that are stimulated (Buck, 2000).
As shown in Figure 3.10, the olfactory nerves directly connect to the olfactory bulb in the brain, which is actually the enlarged ending of the olfactory cortex at the front of the brain. Warren lost his sense of smell because the surgeon cut through the nerve fibers leading to his olfactory bulb. Axons from the olfactory bulb form the olfactory tract. These neural pathways project to different brain areas, including the temporal lobe and structures in the limbic system (Angier, 1995). The projections to the temporal lobe are thought to be part of the neural pathway involved in our conscious recognition of smells. The projections to the limbic system are thought to regulate our emotional response to odors. The direct connection of olfactory receptor cells to areas of the cortex and
limbic system is unique to our sense of smell. All other bodily sensations are first processed in the thalamus before being relayed to the higher brain centers in the cortex. Olfactory neurons are unique in another way, too. They are the only neurons that directly link the brain and the outside world (Axel, 1995). The axons of the sensory neurons that are located in your nose extend directly into your brain! As with the other senses, we experience sensory adaptation to odors when exposed to them for a period of time. In general, we reach maximum adaptation to an odor in less than a minute. We continue to smell the odor, but we have become about 70 percent less sensitive to it.
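The time course of olfactory adaptation described here (roughly 70 percent less sensitive within a minute) can be sketched with a simple exponential-decay model. The exponential form, the 0.3 sensitivity floor, and the 20-second time constant are illustrative assumptions chosen to match the figures in the text, not an empirical model.

```python
from math import exp

def relative_sensitivity(t_seconds, floor=0.3, tau=20.0):
    """Illustrative exponential model of olfactory adaptation:
    sensitivity starts at 1.0 and decays toward a floor of 0.3
    (about 70 percent less sensitive) with time constant tau."""
    return floor + (1.0 - floor) * exp(-t_seconds / tau)

# On first exposure, sensitivity is at its maximum (1.0);
# after a minute, it has decayed close to the adapted floor.
s0 = relative_sensitivity(0)
s60 = relative_sensitivity(60)
```

Note that the modeled sensitivity never reaches zero: as the text says, we continue to smell the odor, just far more weakly.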
At any moment we are conscious of a very limited amount of all that we are capable of experiencing. One example of this selective attention is the cocktail party effect—attending to only one voice among many. Another example is inattentional blindness, our failure to notice clearly visible objects when our attention is directed elsewhere.
Visual and auditory illusions were fascinating scientists even as psychology emerged. Explaining illusions required an understanding of how we transform sensations into meaningful perceptions, so the study of perception became one of psychology’s first concerns. Conflict between visual and other sensory information is usually resolved with the mind’s accepting the visual data, a tendency known as visual capture.
From a top-down perspective, we see how we transform sensory information into meaningful perceptions when we are aided by knowledge and expectations.
The early Gestalt psychologists were impressed with the seemingly innate way we organize fragmentary sensory data into whole perceptions. Our minds structure the information that comes to us in several demonstrable ways:
Our senses are constantly registering a diverse range of stimuli from the environment and transmitting that information to the brain. But to make use of this raw sensory data, we must organize, interpret, and relate the data to existing knowledge.
Psychologists sometimes refer to this flow of sensory data from the sensory receptors to the brain as bottom-up processing. Also called data-driven processing, bottom-up processing is often at work when we’re confronted with an ambiguous stimulus. For example, imagine trying to assemble a jigsaw puzzle one piece at a time, without knowing what the final picture will be. To accomplish this task, you would work with the individual puzzle pieces to build the image from the “bottom up,” that is, from its constituent parts. But as we interact with our environment, many of our perceptions are shaped by top-down processing, which is also referred to as conceptually driven processing. Top-down processing occurs when we draw on our knowledge, experiences, expectations, and other cognitive processes to arrive at meaningful perceptions, such as people or objects in a particular context.
Both top-down and bottom-up processing are involved in our everyday perceptions. Our perceptual processes must help us organize our sensations to answer three basic, important questions: (1) What is it? (2) How far away is it? and (3) Where is it going? In the next few sections, we will look at what psychologists have learned about the principles we use to answer these perceptual questions. Much of our discussion reflects the work of an early school of psychology called Gestalt psychology, which was founded by German psychologist Max Wertheimer. The Gestalt psychologists emphasized that we perceive whole objects or figures (gestalts) rather than isolated bits and pieces of sensory information. Roughly translated, the German word Gestalt means a unified whole, form, or shape. Although the Gestalt school of psychology no longer formally exists, the pioneering work of the Gestalt psychologists established many basic perceptual principles.
To recognize an object, we must first perceive it (see it as a figure) as distinct from its surroundings (the ground). We must also organize the figure into a meaningful form. Several Gestalt principles—proximity, similarity, continuity, connectedness, and closure—describe this process.
When you look around your world, you don’t see random edges, curves, colors, or splotches of light and dark. Rather, you see countless distinct objects against a variety of backgrounds. Although to some degree we rely on size, color, and texture to determine what an object might be, we rely primarily on an object’s shape to identify it.
How do we organize our perceptions so that we see an object as separate from other objects? The early Gestalt psychologists identified an important perceptual principle called the figure–ground relationship, which describes how this works. When we view a scene, we automatically separate the elements of that scene into the figure, which is the main element of the scene, and the ground, which is its background. You can experience the figure–ground relationship by looking at a coffee cup on a table. The coffee cup is the figure, and the table is the ground. Notice that usually the figure has a definite shape, tends to stand out clearly, and is perceptually meaningful in some way. In contrast, the ground tends to be less clearly defined, even fuzzy, and usually appears to be behind and farther away than the figure. The early Gestalt psychologists noted that figure and ground have vastly different perceptual qualities. As Gestalt psychologist Edgar Rubin observed, “In a certain sense, the ground has no shape.” We notice the shape of the figure but not the shape of the background, even when that ground is used as a well-defined frame. It turns out that brain neurons also respond differently to a stimulus that is perceived as a figure versus a stimulus that is part of the ground.
Particular neurons in the cortex that responded to a specific shape when it was the shape of the figure did not respond when the same shape was presented as part of the background. The separation of a scene into figure and ground is not a property of the actual elements of the scene you’re looking at. Rather, your ability to separate a scene into figure and ground is a psychological accomplishment.
Research on the visual cliff revealed that many species perceive the world in three dimensions at, or very soon after, birth. We transform two-dimensional retinal images into three-dimensional perceptions by using binocular cues, such as retinal disparity, and monocular cues, such as the relative sizes of objects.
The ability to perceive the distance of an object as well as the three-dimensional characteristics of an object is called depth perception.
We use a variety of cues to judge the distance of objects. The following cues require the use of only one eye. Hence, they are called monocular cues (mono means “one”). After familiarizing yourself with these cues, look at the photographs on the next page. Try to identify the monocular cues you used to determine the distance of the objects in each photograph.
1. Relative size. If two or more objects are assumed to be similar in size, the object that appears larger is perceived as being closer.
2. Overlap. When one object partially blocks or obscures the view of another object, the partially blocked object is perceived as being farther away.
3. Aerial perspective. Faraway objects often appear hazy or slightly blurred by the atmosphere.
4. Texture gradient. As a surface with a distinct texture extends into the distance, the details of the surface texture gradually become less clearly defined. The texture of the surface seems to undergo a gradient, or continuous pattern of change, from crisp and distinct when close to fuzzy and blended when farther away.
5. Linear perspective. Parallel lines seem to meet in the distance. For example, if you stand in the middle of a railroad track and look down the rails, you’ll notice that the parallel rails seem to meet in the distance. The closer together the lines appear to be, the greater the perceived distance.
6. Motion parallax. When you are moving, you use the speed of passing objects to estimate the distance of the objects. Nearby objects seem to zip by faster than do distant objects. When riding on a commuter train, for example, houses and parked cars along the tracks seem to whiz by, while the distant downtown skyline seems to move very slowly.
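The commuter-train example reflects a simple geometric relationship: an object passing abeam of a moving observer sweeps across the visual field at an angular speed inversely proportional to its distance. The numbers and function name below are illustrative assumptions, not figures from the text; this is only a rough sketch of the geometry:

```python
import math

def angular_speed_deg(observer_speed_mps, distance_m):
    """Apparent angular speed (degrees per second) of an object passing
    directly abeam of a moving observer: d(theta)/dt = v / d, where v is
    the observer's speed and d is the object's perpendicular distance."""
    return math.degrees(observer_speed_mps / distance_m)

# A commuter train at roughly 30 m/s; distances are made up for illustration.
near_house = angular_speed_deg(30, 20)      # house 20 m from the tracks
far_skyline = angular_speed_deg(30, 5000)   # skyline about 5 km away
```

With these assumed values, the nearby house sweeps past hundreds of times faster than the distant skyline, which is why the house seems to whiz by while the skyline barely moves.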
When monocular cues are used by artists to create the perception of distance or depth in paintings, they are referred to as pictorial cues. If you look at the cover of this book, you can see how artist Phoebe Beasley used pictorial cues, including overlap and relative size, to create the perception of depth in her artwork. Another monocular cue is accommodation. Unlike pictorial cues, accommodation utilizes information about changes in the shape of the lens of the eye to help us estimate distance. When you focus on a distant object, the lens is flat, but focusing on a nearby object causes the lens to thicken. Thus, to some degree, we use information provided by the muscles controlling the shape of the lens to judge depth. In general, however, we rely more on pictorial cues than on accommodation for depth perception.
Binocular cues for distance or depth perception require information from both eyes. One binocular cue is convergence—the degree to which muscles rotate your eyes to focus on an object. The more the eyes converge, or rotate inward, to focus on an object, the greater the strength of the muscle signals and the closer the object is perceived to be. For example, if you hold a dime about six inches in front of your nose, you’ll notice the slight strain on your eye muscles as your eyes converge to focus on the coin. If you hold the dime at arm’s length, less convergence is needed. Perceptually, the information provided by these signals from your eye muscles is used to judge the distance of an object.
Another binocular distance cue is binocular disparity. Because our eyes are set a couple of inches apart, a slightly different image of an object is cast on the retina of each eye. When the two retinal images are very different, we interpret the object as being close by. When the two retinal images are more nearly identical, the object is perceived as being farther away. Here’s a simple example that illustrates how you use binocular disparity to perceive distance. Hold a pencil just in front of your nose. Close your left eye, then your right, and compare what each eye sees. The two images are quite different—that is, there is a great deal of binocular disparity between them. Thus you perceive the pencil as being very close. Now focus on another object across the room and look at it first with one eye closed, then the other. These images are much more similar. Because there is less binocular disparity between the two images, the object is perceived as being farther away. Finally, notice that with both eyes open, the two images are fused into one. A stereogram is a picture that uses the principle of binocular disparity to create the perception of a three-dimensional image (Kunoh & Takaoki, 1994). Look at the stereogram shown here. When you first look at it, you perceive a two-dimensional picture of leaves. Although the pictorial cues of overlap and texture gradient provide some sense of depth to the image, the elements in the picture appear to be roughly the same distance from you. However, a stereogram is actually composed of repeating columns of carefully arranged visual information. If you focus as if you are looking at some object that is farther away than the stereogram, the repeating columns of information will present a slightly different image to each eye. This disparate visual information then fuses into a single image, enabling you to perceive a three-dimensional image—three rabbits! To see the rabbits, follow the directions in the caption.
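The geometry behind binocular disparity can be sketched numerically: because the eyes are a fixed distance apart, the angular difference between their two lines of sight shrinks as an object moves farther away. The 6.3 cm eye separation, the function names, and the small-angle approximation below are illustrative assumptions, not values from the text:

```python
EYE_SEPARATION_M = 0.063  # assumed average distance between the two eyes

def disparity_from_distance(distance_m):
    """Angular disparity (radians) between the two eyes' lines of sight
    for a point straight ahead, using the small-angle approximation
    delta ~= B / d. Nearer objects produce larger disparities."""
    return EYE_SEPARATION_M / distance_m

def distance_from_disparity(disparity_rad):
    """Invert the same relation to estimate distance (meters) from a
    measured angular disparity: d ~= B / delta."""
    return EYE_SEPARATION_M / disparity_rad
```

Under these assumptions, an object half a meter away yields ten times the disparity of one five meters away, consistent with the pencil demonstration: the closer the object, the more the two retinal images differ.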
Finally, our effective functioning requires a kinesthetic sense, which notifies the brain of the position and movement of body parts, and a sense of equilibrium, which monitors the position and movement of the whole body.
Pain begins when an intense stimulus activates small-diameter sensory fibers, called free nerve endings, in the skin, muscles, or internal organs. The free nerve endings carry their messages to the spinal cord, releasing a neurotransmitter called substance P. In the spinal cord, substance P causes other neurons to become activated, sending their messages through open spinal gates to the thalamus in the brain. Other areas of the brain involved in the experience of pain are the somatosensory cortex and areas in the frontal lobes and limbic system that are involved in emotion. When the sensory pain signals reach the brain, the sensory information is integrated with psychological information. Depending on how the brain interprets the pain experience, it regulates pain by sending signals down the spinal cord that either open or close the gates. If, because of psychological factors, the brain signals the gates to open, pain is experienced or intensified. If the brain signals the gates to close, pain is reduced.
Anxiety, fear, and a sense of helplessness are some of the psychological factors that can intensify the experience of pain. Positive emotions, laughter, distraction, and a sense of control can reduce the perception of pain. The experience of pain is also influenced by social and cultural learning experiences about the meaning of pain and how people should react to pain. Psychological factors also influence the release of endorphins, the body’s natural painkillers that are produced in many parts of the brain and the body. Endorphins are released as part of the brain’s overall response to physical pain or stress. In the brain, endorphins can inhibit the transmission of pain signals. In the spinal cord, endorphins inhibit the release of substance P. Finally, a person’s mental or emotional state can influence other bodily processes that affect the experience of pain. Muscle tension, psychological arousal, and rapid heart rate can all produce or intensify pain. Today, a variety of techniques and procedures can effectively eliminate or reduce pain.
Movement, Position, and Balance
The phone rings. Without looking up from your textbook, you reach for the receiver, pick it up, and guide it to the side of your head. You have just demonstrated your kinesthetic sense—the sense that involves the location and position of body parts in relation to one another. (The word kinesthetics literally means “feelings of motion.”) The kinesthetic sense involves specialized sensory neurons, called proprioceptors, which are located in the muscles and joints. The proprioceptors constantly communicate information to the brain about changes in body position and muscle tension. Closely related to the kinesthetic sense is the vestibular sense, which provides a sense of balance, or equilibrium, by responding to changes in gravity, motion, and body position. The two sources of vestibular sensory information, the semicircular canals and the vestibular sacs, are both located in the ear. These structures are filled with fluid and lined with hairlike receptor cells that shift in response to motion, changes in body position, or changes in gravity. When you experience environmental motion, like the rocking of a boat in choppy water, the fluids in the semicircular canals and the vestibular sacs are affected. Changes in your body’s position, such as falling backward in a heroic attempt to return a volleyball serve, also affect the fluids. Your vestibular sense supplies the critical information that allows you to compensate for such changes and quickly reestablish your sense of balance.
Maintaining equilibrium also involves information from other senses, particularly vision. Under normal circumstances, this works to our advantage. However, when information from the eyes conflicts with information from the vestibular system, the result can be dizziness, disorientation, and nausea. These are the symptoms commonly experienced in motion sickness, the bane of many travelers in cars, on planes, on boats, and even in space. One strategy that can be used to combat motion sickness is to minimize sensory conflicts by focusing on a distant point or an object that is fixed, such as the horizon.
In the first part of this chapter, we’ve described how the body’s senses respond to stimuli in the environment. To make use of this raw sensory data, the brain must organize and interpret the data and relate it to existing knowledge. Next, we’ll look at the process of perception—how we make sense out of the information that we receive from our environment.
Having perceived an object as a coherent figure and having located it in space, how then do we recognize it—despite the varying images that it may cast on our retinas? Size, shape, and lightness constancies describe how objects appear to have unchanging characteristics despite changes in viewing distance, angle, and illumination. These constancies explain several well-known visual illusions. For example, familiarity with the size–distance relationships of a carpentered world of rectangular shapes makes people more susceptible to the Müller-Lyer illusion.
Philosophers have debated whether our perceptual abilities should be credited to our nature or our nurture. To what extent do we learn to perceive? German philosopher Immanuel Kant (1724–1804) maintained that knowledge comes from our inborn ways of organizing sensory experiences. Indeed, we come equipped to process sensory information. But British philosopher John Locke (1632–1704) argued that through our experiences we also learn to perceive the world. Indeed, we learn to link an object’s distance with its size. So, just how important is experience? How radically does it shape our perceptual interpretations?
Sensory Deprivation and Restored Vision.
For many species, infancy is a critical period during which experience must activate the brain’s innate visual mechanisms. When cataract removal restores eyesight to adults who were blind from birth, they remain unable to perceive the world normally. Generally, they can distinguish figure from ground and can perceive colors, but they are unable to recognize shapes and forms. In controlled experiments, animals have been reared with severely restricted visual input. When their visual input is returned to normal, they, too, suffer enduring visual handicaps.
Human vision is remarkably adaptable. Given glasses that shift the world slightly to the left or right, or even turn it upside down, people manage to adapt their movements and, with practice, to move about with ease. Our perceptual adaptation to changed visual input makes the world seem normal again. But imagine a far more dramatic new pair of glasses—one that shifts the apparent location of objects 40 degrees to the left. When you first put them on and toss a ball to a friend, it sails off to the left. Walking forward to shake hands with the person, you veer to the left. Could you adapt to this distorted world? Chicks cannot. When fitted with such lenses, they continue to peck where food grains seem to be.
Clear evidence that perception is influenced by our experience—our learned assumptions and beliefs—as well as by sensory input comes from the many demonstrations of perceptual set and context effects. The schemas we have learned help us to interpret otherwise ambiguous stimuli, a fact that helps explain why some of us “see” monsters, faces, and UFOs that others do not.
Perception and the Human Factor.
Perceptions vary, and may not be what a designer assumes. Human factors psychologists therefore study how people perceive and use machines, and how machines and physical environments can be better suited to that use. Such studies have improved aircraft safety and spawned user-friendly technology.