Everybody loves a good visual illusion, and I'm happy to say that Harvard psychologists have recently discovered yet another mind-bender to add to the wonderfully confusing world that is our visual system.
It's called "silencing", and describes an effect by which we fail to recognize changes in visual objects that are moving with respect to our eyes.
The researchers showed subjects a visual scene composed of a circle of dots that constantly changed color. The change in color is quite obvious, but an amazing thing happens when you set the wheel in motion. Once the circle of dots starts rotating, the dots appear to stop shifting color. Stop the wheel, and they immediately revert to their dynamic former selves.
Such a finding points to the importance of motion in our visual system, a component of vision that seems to have held particular importance in our evolutionary past.
For many decades now, scientists have suggested that, broadly speaking, there are two different pathways for interpreting the visual information entering your eyes. One is considered the "what" pathway, and deals mostly with specific object features and color. The other has been coined the "where" pathway, and tends to respond to location, motion, and the coarser features of objects.
With the discovery of "silencing", we've got yet another example of how one of these pathways might effect the other. Set the wheel in motion, and you start losing the ability to distinguish the finer features of the dots. Why this happens is anybody's guess, but it's an important step in understanding the intricacies of our visual system.
Check out Harvard's VisionLab for more videos and demonstrations.
It is tempting to think that brains are incredibly precise machines. As we move about the world, it seems like what we see is certain, unambiguous, and unchanging. So it's only a natural extension of this to assume that what goes on between our ears is just as precise. In reality, this may be completely incorrect.
While a lot of past brain research has treated the human brain as a computational monster, crunching the numbers and using the powers of logic to represent the world around it, such an approach has proven to be difficult to connect with reality. While brains do carry out a lot of computation, the fact of the matter is that trying to process every aspect of the world around you would simply be too much to handle. What the brain needs is a way of making things more efficient, more manageable. What the brain needs is statistics.
A growing body of scientific literature has emerged in the past decade that takes a slightly different approach to understanding what it is our brains are actually doing. Rather than treating the world as a black-and-white environment where certainty is the end goal, perhaps what we need is probability, likelihood.
Here's an example of such an approach. It details a recent project of Ruth Rosenholtz, a vision scientist in the Department of Brain and Cognitive Sciences at MIT. She's got a new model of vision that uses the statistics of the visual field as a key component in the visual computations that the brain carries out.
In the model, the brain breaks the visual field down into small areas of focus. In each area, information is gathered about the basic shapes and visual components that lie within its boundaries, and a kind of assumption is made about what the area as a whole contains. In the center of your vision, these areas are relatively small, allowing for more fine-grained discrimination of your visual field. Toward the periphery, however, the areas grow larger, so cluttered images collapse into a noisy summary.
The model makes some interesting predictions about visual discrimination that seem to match well with behavioral data. For example, an "A" that is seen in your periphery will be relatively easy to spot if it is alone; however, if the "A" is surrounded by other letters (such as in "TOAST"), then the brain will have a hard time picking it out. This is because all of these letters fall within one of the larger pooling areas of the periphery, and any individual member of the group is lost in the noise.
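To make the idea concrete, here's a toy sketch (not Rosenholtz's actual model) of how eccentricity-dependent pooling could predict this "lost in the noise" effect. The linear scaling of pooling-region size with eccentricity, and the 0.5 factor, follow a common rule of thumb from the crowding literature; the function names and numbers are illustrative assumptions.

```python
# Toy sketch of eccentricity-dependent pooling. Assumption: pooling-region
# size grows roughly linearly with eccentricity (the ~0.5x rule of thumb).
# This is an illustration, not the published model.

def pooling_radius(eccentricity_deg, scale=0.5, foveal_radius=0.1):
    """Approximate radius (degrees) of the pooling region at a given eccentricity."""
    return max(foveal_radius, scale * eccentricity_deg)

def is_crowded(target_ecc_deg, flanker_spacing_deg):
    """A target is 'crowded' when its flankers fall inside its pooling region."""
    return flanker_spacing_deg < pooling_radius(target_ecc_deg)

# An "A" with flanking letters about 1 degree away:
print(is_crowded(target_ecc_deg=10, flanker_spacing_deg=1.0))  # True: deep periphery
print(is_crowded(target_ecc_deg=1, flanker_spacing_deg=1.0))   # False: near the fovea
```

The same letter spacing that is trivially resolvable near the fovea falls inside a single large pooling region at 10 degrees out, which is the flavor of prediction the model makes.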
Such an approach to vision seems to be quite fruitful, and the underlying assumptions of statistics have a lot of interesting implications for other aspects of cognition as well. This is but one example of research going on in this field...I urge you to check others out as well!
via Science Daily
Here's a really interesting TEDx talk given by Dan Simons at the University of Illinois. Dr. Simons studies visual attention and perception, among other things, but the topic of this talk touches on a subject that is a bit more abstract. He discusses the types of behavior we see when people do things, say things, or perceive things that logic tells us they shouldn't. Put simply, he is interested in understanding the ways in which humans act unintuitively.
He gives a number of examples that by now are very well known in the psychological literature (the gorilla basketball video is one of my favorites), but I think that his talk has something very important to say about our attempts to understand humans.
In attempting to investigate the ways in which our actions don't make sense from a rational or intuitive standpoint, we can say something very important about the underlying mechanisms (be they at the neural level or the psychological level) that cause people to do the things that they do.
I can't help but think of economics when I see a lecture like this one, since it seems that our most popular modern theories in this field have assumed that humans are rational and relatively all-knowing creatures that can act in an efficient and sensible manner. Now, I don't think it should take a well-established career and tenure to understand that humans are far from the rational creatures that many economists would like them to be; perhaps we should just show them some of the examples in this talk...
While I'm on the topic of getting old and changing your perspective on life and all those heavy topics, I thought I'd drop a link that actually shows some of the science behind this stuff.
Here is an article by the one and only Christof Koch, describing some research that he has done regarding perception and circumstance in human beings.
As much as we'd like to think that our view of the world is based on hard evidence and objective reasoning, the truth of the matter is that what we perceive is significantly influenced by some decidedly subjective factors. Expectation, bias, personal opinion, and even your level of tiredness can change how you perceive the world around you, as witnessed in Koch's interesting experiments.
For example, in one study subjects stood at the top of a hill and were asked to judge how steep it was. One group of people stood on top of a sturdy box while doing this, and the other stood on top of a skateboard (don't worry, nobody ended up taking a plunge). Both of these objects were the same height, and yet those on top of the skateboard rated the hill as significantly steeper than it actually was, while the subjects on the box did not.
The experience of standing on a wobbly skateboard changed their perception of the world around them, distorting it in a way that made them nervous and nudged them toward more self-preserving behavior.
What this suggests is that our impressions of the world and the feelings we have at any given moment are not based entirely on the world around us, but also on the state of mind we bring to it.
Given this implication, one can come up with all kinds of interesting ways we might test this theory out. At the very least, it might make us pause the next time we swear that all the red lights are conspiring against us...