This is Your Brain On Awesome
Thoughts on the world from a student of the mind

23Feb/11

Birds are Quantum Physicists!

Remember a few months back when an article came out describing the ability of birds to see magnetic fields?  Well, here's another chapter in that interesting avian saga.

As I mentioned in a previous post, scientists have been trying to figure out just how it is that birds are able to accomplish this amazing feat.  Many hypotheses involve the protein cryptochrome, a molecule that seems to be nearly one-of-a-kind as far as biological structures go.

Now, scientists have taken the awesome factor for this mechanism one step higher...they're suggesting that these birds may actually be using quantum entanglement in their navigational systems.

For those uninitiated into the world of really really tiny physics, entanglement basically describes two electrons that are inextricably linked.  Any time you subject an electron to a magnetic field, you affect its "spin",  a quantum property that is too complex to be explained in this short post.  However, if that electron is entangled with another, then any time electron "A" changes its spin, electron "B" will react as well, even though it was never subjected to the magnetic field.
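To make that link a bit more concrete, here's a tiny numerical sketch (my own illustration, not anything from the study) using the textbook example of an entangled pair, the two-spin "singlet" state.  Measure both spins along the same axis and the results always come out opposite:

```python
import numpy as np

# Pauli-Z (spin along the z-axis) and the two-spin singlet state
# (|01> - |10>) / sqrt(2), the standard example of a maximally entangled pair.
sz = np.array([[1, 0], [0, -1]])
up = np.array([1, 0])
down = np.array([0, 1])
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Correlation of the two spins measured along the same axis:
# <singlet| sz (x) sz |singlet> = -1, i.e. perfectly anti-correlated.
zz = np.kron(sz, sz)
correlation = singlet @ zz @ singlet
print(correlation)  # ~ -1.0 (up to floating-point rounding)
```

The key point is that this anti-correlation holds no matter how far apart the two spins are, which is exactly what made Einstein so uncomfortable.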

Sounds creepy huh?  Apparently this is a concept that dates back to the good old days of Einstein, who famously described it as "spooky action at a distance."

So how might birds use this?  Well, one theory is that a bird's eye contains pairs of these "quantum entangled" electrons.  Occasionally, one of these electrons will move away from the other, causing it to experience a slightly different magnetic field than its partner.  Through some still-unknown mechanism, the bird detects this difference by reading out the quantum state of the two electrons.

If you think this sounds hard to believe, you wouldn't be alone, and scientists are still trying to figure out just what is going on.  There have been many experiments performed on quantum entanglement, but nearly all of them require very specific environmental conditions that are never seen in nature (such as temperatures close to absolute zero).  To see such an effect in a warm-blooded living organism is fascinating.

It's discoveries like this that make me love the world of science.  Quantum physics is a field that has been around for less than a century.  Go back a hundred years, and you would have found a number of physicists who theorized that we were just at the cusp of "figuring out" the entire universe.  Now we've got an entirely new field of physics that almost nobody understands, and yet we're finding creatures that seem to exploit its properties at a fundamental level.  The universe is a strange place, indeed.  Who knows what other mysterious discoveries are out there, waiting to be uncovered?

via The Wired Blog

21Feb/11

Motion, identity, and optical illusions: the case of silencing

Everybody loves a good visual illusion, and I'm happy to say that Harvard psychologists have recently discovered yet another mind-bender that deepens my wonder at the confusing world that is our visual system.

It's called "silencing", and it describes an effect by which we fail to notice changes in visual objects that are moving with respect to our eyes.

The researchers showed subjects a visual scene composed of a circle of dots that constantly changed color.  The change in color is quite obvious, but an amazing thing happens when you set the wheel in motion.  Once the circle of dots starts rotating, they appear to stop shifting color.  Stop the wheel, and they immediately revert to their dynamic former selves.
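If you'd like to play with the effect yourself, here's a rough matplotlib sketch of that kind of display (my own approximation, not the actual stimulus from the study): a ring of dots that continuously cycle through hues, with a flag to set the whole ring spinning.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
from matplotlib.colors import hsv_to_rgb

ROTATE = True     # set to False to see how obvious the color changes are at rest
N_DOTS = 60
RADIUS = 1.0

rng = np.random.default_rng(0)
base_angles = np.linspace(0, 2 * np.pi, N_DOTS, endpoint=False)
hues = rng.random(N_DOTS)                      # each dot starts at a random hue
hue_speeds = 0.02 + 0.02 * rng.random(N_DOTS)  # and keeps drifting through hues

fig, ax = plt.subplots(figsize=(5, 5))
ax.set_aspect("equal")
ax.set_xlim(-1.3, 1.3)
ax.set_ylim(-1.3, 1.3)
ax.axis("off")
scat = ax.scatter(np.cos(base_angles), np.sin(base_angles), s=80)

def update(frame):
    # Every dot changes hue on every frame...
    colors = hsv_to_rgb(np.stack(
        [(hues + hue_speeds * frame) % 1.0,
         np.ones(N_DOTS), np.ones(N_DOTS)], axis=1))
    # ...but when the whole ring rotates, the changes are much harder to notice.
    angles = base_angles + (0.05 * frame if ROTATE else 0.0)
    scat.set_offsets(np.column_stack([RADIUS * np.cos(angles),
                                      RADIUS * np.sin(angles)]))
    scat.set_color(colors)
    return scat,

anim = FuncAnimation(fig, update, frames=600, interval=30, blit=True)
plt.show()
```

It's a crude imitation of the real demo, but flipping ROTATE back and forth gives a feel for how motion can "silence" changes you would otherwise see immediately.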

Such a finding points to the importance of motion in our visual system, a component of vision that seems to have held particular importance in our evolutionary past.

For many decades now, scientists have suggested that, broadly speaking, there are two different pathways for interpreting the visual information entering your eyes.  One is considered the "what" pathway, and deals mostly with specific object features and color.  The other has been coined the "where" pathway, and tends to respond to location, motion, and the coarser features of objects.

With the discovery of "silencing", we've got yet another example of how one of these pathways might affect the other.  Set the wheel in motion, and you start losing the ability to distinguish the finer features of the dots.  Why this happens is anybody's guess, but it's an important step in understanding the intricacies of our visual system.

Check out Harvard's VisionLab for more videos and demonstrations

9Feb/11

What does it take to see the entire Sun?

Well, after thousands of years of speculating about, dreaming of, and fearing that giant yellow blob in the sky, we are finally able to visualize the sun in its entirety.

Certainly, our knowledge of the sun has grown exponentially over the past century or so, moving from a celestial object of the gods to the giant burning atom smasher that we know and love today.  However, this marks a new step towards being able to use the activity of the sun to make all kinds of predictions about our corner of the solar system.

Of particular importance for this video is the ability to predict the aberrant electromagnetic activity that occurs as a result of the sun's shifting surface.  Generally called "solar flares", these giant leaping arcs of energy have been known to disrupt GPS signals, communications, and other kinds of electronics that rely on wireless signals.

These flares don't come completely out of the blue; we can often anticipate one by looking at activity on the sun's surface.  However, until now, we'd only been able to look at a fraction of the total surface of the sun, meaning that activity on the "dark side of the sun" (kind of a misnomer, I know) was unknown.

Now, with two satellites circling on opposite sides of our friendly fireball, we can see what's going on all the time, allowing us to more accurately predict solar activity.  Check out the video for more details and pretty pictures!

via NASA

P.S. For those who might have noticed fewer posts lately, I've been running all over the place getting interviews finished...I promise to pick up the slack once things settle down!

4Feb/11

Adding fuzziness to neural computation

It is tempting to think that brains are incredibly precise machines.  As we move about the world, it seems like what we see is certain, unambiguous, and unchanging.  So it's only a natural extension of this to assume that what goes on between our ears is just as precise.  In reality, this may be completely incorrect.

While a lot of past brain research has treated the human brain as a computational monster, crunching the numbers and using the powers of logic to represent the world around it, such an approach has proven to be difficult to connect with reality.  While brains do carry out a lot of computation, the fact of the matter is that trying to process every aspect of the world around you would simply be too much to handle.  What the brain needs is a way of making things more efficient, more manageable.  What the brain needs is statistics.

A growing body of scientific literature has emerged in the past decade that takes a slightly different approach to understanding what it is our brains are actually doing.  Rather than treating the world as a black-and-white environment where certainty is the end goal, perhaps what we need is probability, likelihood.

Here's an example of such an approach.  It details a recent project of Ruth Rosenholtz, a vision scientist in the Department of Brain and Cognitive Sciences at MIT.  She's got a new model of vision that uses the statistics of the visual field as a key component in the visual computations that the brain carries out.

In the model, the brain breaks the visual field down into small areas of focus.  In each area, information is gathered about the basic shapes and visual components that lie within its boundaries, and a kind of summary is made of what the area as a whole contains.  In the center of your vision, these areas are relatively small, allowing for fine-grained discrimination of your visual field.  However, towards the periphery, the areas grow larger, so cluttered images get collapsed into noisier summaries.
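Here's a toy sketch of that core idea (my own simplification, not Rosenholtz's actual model): carve an image into pooling regions that grow with distance from fixation, and keep only a couple of summary statistics for each region.

```python
import numpy as np

def pooled_summary(image, fixation, base_size=4, growth=0.5):
    """Summarize an image as per-region statistics, with pooling regions
    that grow with eccentricity (distance from the fixation point).

    A toy illustration only: real models pool far richer statistics
    (orientations, pairwise correlations, etc.), not just mean and variance.
    """
    h, w = image.shape
    fy, fx = fixation
    summaries = []
    y = 0
    while y < h:
        # Rows get taller the farther they are from fixation...
        row_h = max(1, int(base_size + growth * abs(y - fy)))
        x = 0
        while x < w:
            # ...and regions get wider with overall eccentricity.
            ecc = np.hypot(y - fy, x - fx)
            region_w = max(1, int(base_size + growth * ecc))
            patch = image[y:y + row_h, x:x + region_w]
            summaries.append({
                "top_left": (y, x),
                "shape": patch.shape,
                "mean": float(patch.mean()),
                "var": float(patch.var()),
            })
            x += region_w
        y += row_h
    return summaries

# A random "image": detail near fixation is kept in many small regions,
# while the periphery is collapsed into a few large, noisy summaries.
rng = np.random.default_rng(1)
img = rng.random((128, 128))
stats = pooled_summary(img, fixation=(64, 64))
print(len(stats), "pooling regions")
```

Because a big peripheral region gets boiled down to a handful of numbers, anything cluttered that falls inside it tends to get averaged together, which is exactly the kind of effect the next example describes.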

The model makes some interesting predictions about visual discrimination that seem to match well with behavioral data.  For example, an "A" seen in your periphery will be relatively easy to spot if it is alone; however, if the "A" is surrounded by other letters (such as in "TOAST"), then it becomes much harder to pick out.  This is because all of these letters fall within the larger pooling regions of the periphery, and any individual member of the group is lost in the noise.

Such an approach to vision seems to be quite fruitful, and the underlying assumptions of statistics have a lot of interesting implications for other aspects of cognition as well.  This is but one example of research going on in this field...I urge you to check others out as well!

via Science Daily