Thursday, December 31, 2009

Body Worlds "Too"

As a holiday present, my lovely wife bought us tickets to the new Body Worlds exhibit at the Franklin Institute in Philadelphia.  This is the second showing of a Body Worlds exhibit at the Franklin, and this "BodyWorlds2" is supposed to have a special emphasis on the brain (thus, despite our having seen the first exhibit several years ago, the wife and I thought we would enjoy going to see this exhibit as well).  I have to say that we did enjoy both trips to the "Body Worlds", and if you haven't yet seen one of these exhibits, I highly recommend it.  (If you have no idea what BodyWorlds is, you can check out their website here.)  The preservation techniques and the human specimens (as well as those of sheep, horses, camels and other animals) really are stunning, and give you a first-hand look at several different facets of anatomy that you could only get in a college (or really graduate level) anatomy lab. In many cases, you wouldn't get such a clean or clear cut look from a med school cadaver as you do in many of these "plastinates" (like in the images below which show a specimen where only the blood vessels are preserved, and another where all of the nerve tracts have been preserved). 
My only gripes about the exhibit were (a) if you've seen it before, there really aren't that many new specimens compared to the first exhibit (thus my desire to change the title to BodyWorlds "Too"), and (b) for an exhibit that was supposed to be focused on the brain and the nervous system, I was disappointed at the lack of emphasis on neuroanatomy.  It seemed to me like we were basically seeing the normal BodyWorlds exhibit, but with a few posters here and there to tell us about some of the things going on in the brain (like how emotions seem to be predominantly processed in the limbic system, or how music can stimulate many areas of the brain, etc.).  These posters, however, were obviously not the highlight of the show, and in many cases seemed to be only an afterthought, as evidenced by the fact that there did not seem to be any unifying theme or reasonable order in which they were presented (many of the ones that actually pointed out brain structures came well after, and in completely different rooms from, the actual brain specimens, and most of them were poorly showcased and off to the side).  At the risk of using (or misusing) one of the buzzwords of the past decade, I think the Franklin missed out on a great "teachable moment", and I don't think anyone leaving that exhibit will have any real grasp of even the most fundamental and important concepts in neuroscience (or in neuroanatomy).  That being said, the exhibit is still worth seeing, and you will definitely learn some interesting bits and facts here and there, but I think the main purpose is to inspire awe at the human form and to excite viewers to want to learn more about biology, medicine, and anatomy.  In this respect, I think the show truly succeeds, but then, we already know that I love biology, so maybe I am biased.


Wednesday, December 30, 2009

Okay, so it's not neuroscience,

But I have had several people asking me to share my view on the whole "Climategate" SNAFU.  I'm sure you can probably guess, but to be blunt, I think it is a complete load of crap.  If I had to sum up my feelings as succinctly and eloquently as possible, I would simply quote the great Harvard naturalist Stephen Jay Gould who said:
"When people learn no tools of judgment and merely follow their hopes, the seeds of political manipulation are sown."
Since I am nowhere near as succinct or eloquent as Dr. Gould, here is my response to anyone who would claim that global warming is not real, but rather the product of some vast conspiracy within the science community:
If you don’t accept the mountain of evidence that conclusively shows that the earth is warming AND that this climate change is impacted by human activities, then you are either (1) IGNORANT (that is, unaware of the aforementioned mountain of evidence) or (2) PURPOSEFULLY OBTUSE AND DECEITFUL (likely for some political or monetary gain), or (3) both (1) and (2). It is an unfortunate fact that the majority of Americans have a poor understanding of science and the nature of scientific evidence, which leaves them susceptible to being deceived by those who have financial or political interests in discrediting the evidence. It seems that history does repeat itself, and the current campaign of the “climate deniers” is remarkably reminiscent of how the tobacco industry successfully parried claims and undermined research for decades, somehow casting doubt on the overwhelming evidence that showed that smoking causes cancer (which was first shown as early as the 1700s, unequivocally confirmed in the 1950s, and yet not really accepted until the late 1980s or 90s). All of the political pundits making so much hoopla out of these “Climategate” emails seem only to be able to point to THREE lines taken from a COUPLE of emails that have been cherry-picked and taken out of context from over 10 years of emails, and then they claim that these THREE LINES somehow topple the DECADES of research carried out by HUNDREDS of scientists and published in THOUSANDS of scientific journal articles. And what is this “trick” everyone is so upset about? Well, rather than use tree ring data (which becomes unreliable as a measure for seasonal temperature when CO2 in the atmosphere increases beyond a certain point), the scientist in question used the ACTUAL TEMPERATURES recorded over the past 30 years. So he was “hiding” what looked like “the decline” in tree ring data by SHOWING THE ACTUAL INCREASE IN AVERAGE TEMPERATURES! This is not some massive conspiracy/cover-up; it is actually STRONGER evidence that the earth is warming. I’m not going to spend the time to fully debunk this nonsense here, but if you want to see where else this “ClimateGate” BS falls flat on its face you can go here:

http://www.yaleclimatemediaforum.org/2009/12/cru-emails-whats-really-there/
http://www.informationisbeautiful.net/visualizations/climate-change-deniers-vs-the-consensus/

or here if you actually want to see photographs of glaciers and ice caps melting away…

http://www.worldviewofglobalwarming.org/pages/glaciers.html

If you really want to know what’s going on, look up the evidence for yourself; don’t just believe fat mouths like Glenn Beck and Rush Limbaugh who are, by their own self-proclamation, “entertainers” rather than experts. And don’t take me at my word either if you don’t want to, but don’t claim “flawed science” if you haven’t even looked at the science, or at the reports of the hundreds of hard-working people whose decades’ worth of work has been casually and carelessly cast aside because one or two comments in some emails have been horribly misconstrued, taken out of context, and then plastered on your TV.


Sunday, December 27, 2009

Aluminium and Alzheimer's disease

I've been planning on writing about this one for a while now because I feel like it's almost as prevalent a myth as the vaccines-and-autism notion (which may just get its comeuppance again).  To dispense with both notions up front: there is no convincing evidence to suggest that exposure to aluminium (in the form of deodorants, antacids, from cookware, etc.) causes Alzheimer's disease, nor is there any evidence to suggest that vaccines cause autism.  I would love to know how these ideas gain such a strong foothold in the public consciousness when you consider the lack of data to support them.  But then, it's easy to find websites propounding this nonsense (like this one, which conspicuously lacks any references for the claims being made); then there are movies (like when Ryan Reynolds' character in Smokin' Aces says he doesn't use deodorant because it causes Alzheimer's); and of course, you have celebrities (like Jim Carrey, Jenny McCarthy, and, sadly, Bill Maher) touting these myths as if they should be believed, because celebrity somehow qualifies in place of medical expertise or advanced degrees.  Anyway, as for the aluminium and Alzheimer's myth, this one is pretty easy since I will just refer you to the Alzheimer's Society webpage on the subject here, and then, as I usually like to do, I will try to boil it down for those of you with short attention spans...
This particular myth seems to have started in the 1960s and 70s with a handful of studies where researchers gave animals large doses of aluminium phosphate and then, when they looked at their brains, found lots of proteins tangled up like last year's Xmas lights.  These knotted and tangled protein masses looked kinda like the protein tangles that are seen in the brains of Alzheimer's patients (called neurofibrillary tangles), and thus the idea that aluminium may be linked in some way to Alzheimer's gained some momentum.  However, in the following decades, scientists conducted numerous studies to see if there really was a link.  The vast majority of the studies failed to detect any link between aluminium and Alzheimer's. Those few that have shown causality are almost all in non-human animals and have used doses that are much higher than anything you would come into contact with during your daily routine.  So, unless you plan on drinking solutions or medicines that are loaded with aluminium, or perhaps eating your soda cans after you drink out of them, you should have nothing to worry about.

Saturday, December 26, 2009

A nice resource...

In the rare event that you are as big a nerd as I am, you may find the following site interesting (or, at the very least, very informative): http://www.ninds.nih.gov/disorders/disorder_index.htm#Z
It is an index of neurological disorders and diseases, from Alzheimer's to Zellweger's... just click on one (or as many as you want to know about) and you will find a great little summary of the disease and even a summary of how the National Institutes of Health, or more specifically the National Institute of Neurological Disorders and Stroke, is funding research aimed at finding treatments or cures for the various diseases.  Now if only they made a page-a-day calendar, I could have had the perfect Xmas present.

Tuesday, December 22, 2009

A brief introduction to neuroanatomy....


I would like to think that I am equally entertaining when I am giving a lecture... my students would probably disagree. If only I could sing... or dance... or play the tambourine...

Sunday, December 20, 2009

Why do candy canes taste cool?


Candy canes and other mints taste cool because they contain menthol, which can bind to, and activate, receptors that normally sense cold.  The fibers at the end of temperature-sensitive nerve cells extend to just beneath the surface of your skin, and are also found in your tongue and throughout the inside of your mouth.  At the ends of these cells, there are ion channels, which are basically hollow tubes made up of proteins.  At normal temperatures, these tubes are sort of scrunched up so that some of the proteins in the tube block the opening.  At colder temperatures, the tube straightens out and allows ions outside of the nerve cell to pass into the inside of the cell.


These positively charged ions (sodium and calcium) flood into the cell when the receptors are open, and they change the electric potential across the cell membrane, which in turn starts an action potential: a wave of electric current that moves along the length of the nerve cell (I've talked about action potentials and how nerve cells send information before here).  The action potential from a temperature-sensitive neuron ultimately sends a signal all the way up to the brain, which interprets this signal as cold.  What is interesting about these cold-sensing ion channels is that they can also be opened up when molecules bind to them at sites on the outside of the cell called receptor sites. In the case of our cold receptors, molecules like menthol (which is found in mint leaves, and thus in mint candies like candy canes) bind to the receptor site and cause the protein tube to open up.  This causes the action potentials and signaling we just talked about, which tell your brain that your tongue is cool when really it is just enjoying a candy cane at your normal body temperature.  Interestingly, this is very similar to what happens when you eat spicy foods that taste "hot", except in that case the receptors are different ones that open when heated (and close when cooled).  These "hot" receptors also open when bound by molecules; in this case, the most prominent one is called capsaicin (which is found in most hot peppers and hot sauces).  So, now you know why candy canes taste cool, and hot sauce tastes hot. (If your holidays are anything like mine, you are probably going to be eating candy canes and Buffalo wings at some point, though hopefully not at the same time).  Happy Holidays!
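P.S. For the programmers in the audience, here's a toy sketch in Python (with completely made-up numbers, not a physiological model) of the logic above: the channel doesn't care whether it was opened by actual cold or by a ligand like menthol, so the signal that reaches the brain is the same either way.

# A toy sketch, NOT a physiological model: a "cold" receptor channel opens
# either when the temperature drops below its threshold OR when menthol is
# bound. All numbers are invented for illustration.

def cold_channel_open(temp_c: float, menthol_bound: bool,
                      threshold_c: float = 26.0) -> bool:
    """Return True if the channel opens (letting Na+ and Ca2+ flow in)."""
    return temp_c < threshold_c or menthol_bound

def what_the_brain_reads(temp_c: float, menthol_bound: bool) -> str:
    # Open channel -> ions rush in -> action potential -> brain says "cold".
    return "cold!" if cold_channel_open(temp_c, menthol_bound) else "nothing unusual"

print(what_the_brain_reads(temp_c=10.0, menthol_bound=False))  # ice water  -> cold!
print(what_the_brain_reads(temp_c=37.0, menthol_bound=False))  # body temp  -> nothing unusual
print(what_the_brain_reads(temp_c=37.0, menthol_bound=True))   # candy cane -> cold!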

Wednesday, December 16, 2009

Can sitting too close to the tv make you go blind?

I don't know if I've posted about this one before or not.  If I haven't, I've been meaning to, since this one is an oldie but goodie.  Anyway, the short answer is one we've all suspected since we were little kids, being chastised with this warning by our overprotective parents... sitting too close to the TV will NOT cause you to go blind.  Keeping your eyes too close to your TV screen, however (or your computer screen, or portable media players, video gaming devices, or even books, whatever those are), can cause you to become nearsighted (as the muscles that help your eyes to focus gradually lose their ability to adjust for things that are far away, see here if you don't believe me).  As a corollary to this, a new study shows that Americans are much more likely to be nearsighted today than 40 years ago (see here for the journal's webpage).  I knew I should have gone to optometry school!  Maybe I will just buy some stock in Bausch and Lomb.

Are cats smarter than dogs?

So dogs tend to get a bum rap when it comes to intelligence. Though many dog owners come to appreciate how smart their pets can actually be (our pups know several of their toys by name, and can retrieve them if you ask), most people would probably pick cats as the smarter household pet because they excel at being aloof and even manipulative.  The truth is that there probably is no reliable way to determine which is smarter (especially since there are so many different breeds of dog and cat), and both are smart in different ways, as they have evolved to be.  Despite the difficulty of the comparison, New Scientist has an article out now where they have attempted to compare cats and dogs on several different criteria (including brain size, "understanding", "problem solving", and "supersenses").  I don't know how valid the final outcome is (dogs win 6 to 5), but the article is certainly interesting and filled with some good information (like: dogs tend to have larger brains, but cats tend to have more cortical neurons, and cats may actually have a better sense of smell than dogs).  The fun facts aside, I have to disagree with some of the categories used (like popularity or eco-friendliness) if the question really is just one of intelligence (rather than the more general question of which one makes the better pet).  Also, I am disappointed to see that birds have been excluded from the comparison: despite my dislike for having them as pets, they are certainly smart (like gray parrots, for example).  Anyway, the rundown of winners for each category looks like this:
Brain: cats
Shared history with humans: dogs
Bonding: dogs
Popularity: cats
Understanding: dogs
Problem solving: dogs
Vocalization: cats
Tractability (or "trainability"): dogs
Supersenses: cats
Eco-friendliness: cats
Utility: dogs
Totals: DOGS 6, CATS 5

Saturday, December 12, 2009

Does giving your brain a "mental workout" really help ward off dementia?

Well, it is certainly possible, even likely, but not conclusive.  I will refer you to a couple of blogs that have done a much better job of explaining this than I plan to do here...
http://www.nytimes.com/2007/11/08/opinion/08aamodt.html?_r=1
and/or
http://www.spring.org.uk/2008/06/which-cognitive-enhancers-really-work.php

If you were always the kind of person to read the Cliffs Notes, or rent the movie instead of reading the book, the bottom line is that exercise, the real/physical kind, not the mental kind you can do sitting in front of a computer, appears to be the most effective way to improve your overall cognitive function and ward off dementia as you age.  Playing Sudoku or doing crosswords (or other brain games) may have some beneficial effects, particularly if you don't do anything at all to stimulate your brain, but it is more likely that these types of activities will mainly just help you get better at similar types of puzzles and problems.  So, if you want to do what's best for your brain, you should try to get at least twenty minutes of exercise, several times a week...
you heard me, stop reading this blog and go outside (or, if you live in a frozen part of the country like I do, hit the gym).

Tuesday, December 8, 2009

2009 in Review...weirdness review

Discover magazine has a post about the 15 weirdest science stories of 2009, here
Many of them I don't think I would characterize as science, but they are still worth checking out.
My favorites are the sea monster and the grad student who became an escort to pay her way through school (and now has TV and book deals). I knew I should have looked at the fine print on those student loans!

Monday, December 7, 2009

It's not exactly brain surgery...

Why should I do all the heavy lifting?

So, I just finished reading the book "Welcome to Your Brain" by Sandra Aamodt and Sam Wang.  It is a very valiant attempt at covering a lot of different topics in neuroscience and presenting them in a way that is accessible to the general public.  Not only is the book a valiant attempt, but a largely successful one, though there were a couple of places in the book where an advanced degree in neuroscience or psychology seemed to be necessary.  BUT, on the whole, most of the chapters are very accessible and easy to understand, and all the while the authors do a good job of keeping things interesting, particularly by pointing out popular neuroscience myths.  Anyway, check out the book if you get a chance, and, if not, add their blog (AND THIS ONE!) to your list of blogs you follow, or re-tweet, or whatever the kids are doing these days.  Also, since it will save me from having to write up several posts of my own, here is a link to their article "Six myths about the brain".
If you read this blog, you may find some of it repetitive, but then some of it is stuff I haven't covered (...yet).

Sunday, December 6, 2009

Autism and vaccines...

I am sure you have heard at some point, probably from the mouth of some celebrity (Jim Carrey, Jenny McCarthy, Bill Maher, etc.), that they believe vaccines can cause autism.  I am not going to go on at length about this (I will provide links you can go to for that below).  I will just say a few things.

First, there is no evidence to suggest that vaccines cause autism.  The only reason anyone ever connected the two is because infants usually get vaccines around 10-12 months of age, whereas autism usually gets diagnosed between 12 and 24 months, because this is when the child should start developing more social and communicative behaviors (like learning to talk).  Because the one so closely follows the other, people have assumed a connection that simply isn't there; it is just a coincidence of physical development (an immune system developed enough to get vaccines) and brain development (one developed well enough to start talking).

Second, the scapegoat here, or what tends to get blamed most often, is the mercury found in a preservative called thimerosal, which used to be used in vaccines.  Almost all vaccines are now thimerosal-free, or as close to it as possible (except for some of the influenza vaccines), yet people still claim that vaccines are causing autism (despite all the evidence that shows that vaccines with thimerosal never caused autism, see below).

Third, some people cite the fact that the incidence of autism has increased over the past several decades as evidence of a link, since this coincides with increases in both the number of children getting vaccines and the use of thimerosal in those vaccines (since about the 1970s).  Without getting into the whole correlation-is-not-cause argument again (especially since autism can even be correlated to how much it rains), studies have since come out that directly compared the incidence of autism before and after thimerosal was removed from vaccines and showed no differences in the number of kids being diagnosed with autism (some of the reports are summarized here).  It is also important to note another correlation that coincides with the increased incidence, and that is increased awareness and education about autism spectrum disorders (which as a group have grown to include many different disorders in addition to classically defined autism).  This increased awareness has led to many doctors being able to catch or diagnose the disorder in more children early on, which in turn would lead to what looks like an increased incidence, but may just be the result of better diagnosticians.

Finally, it is understandable that parents of autistic children would want to blame someone or something for their child's disorder.  The reasons for this are many (some even financial, like in this case), but mostly because, as a parent of an autistic child, you can often feel helpless to do anything, and frustrated on many levels.  Latching on to a cause (political, if not medical) gives some sense of control and a feeling of being able to work to fix the problem, or prevent it in the future.  This is only natural, and understandable, and these parents do have my sympathy.  HOWEVER, vaccines are CRITICAL to public health, and to individual health.  Very few drugs, if any, have ever had the successes that we see with vaccines (like the eradication of smallpox, and the near-eradication of polio, which persists in part because there are still populations of people who do not get vaccinated).
Childhood vaccines are unbelievably important, and will, in many cases, prevent not only several diseases, but DEATH. For example, measles kills hundreds of thousands of children in the rest of the world, but not in the US, because most kids here are vaccinated; and, thanks to a widespread vaccination program in Asia, the number of deaths resulting from measles outside of the US has dropped nearly 80% since 2000.  If you are a parent, don't let unreasoned and unsubstantiated fears prevent you from keeping your child ALIVE and healthy.
That's pretty much all I have to say on the subject, but, like I said, here are some links for you to check out on your own from some pretty reputable sources, like the FDA, and the American Medical Association (with others in there for good measure).
http://www.fda.gov/BiologicsBloodVaccines/vaccines/QuestionsaboutVaccines/ucm070430.htm
http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on
http://jama.ama-assn.org/cgi/content/abstract/290/13/1763
http://www.iom.edu/en/Reports/2003/Immunization-Safety-Review-Thimerosal---Containing-Vaccines-and-Neurodevelopmental-Disorders.aspx
http://www.aap.org/advocacy/releases/autismparentfacts.htm
http://www.pn.psychiatryonline.org/content/44/14/4.1.full

Monday, November 30, 2009

Okay, I'm immature.

Science Daily has a story titled:
Small Hairy Balls Hide Foul-Tasting Healthful Enzymes
It has nothing to do with the brain or with neuroscience, but how could I ignore reading an article with that title? And now sharing it with you... Basically, it looks like a Dutch group has figured out a way to package enzymes so that you don't have to taste them, and so they don't break down or lose potency before they get to your stomach or small intestine (or, as flight attendants like to say, wherever their final destination may be).  Could be a cool new tool for drug delivery, but then, you'd have to be okay with putting small hairy balls in your mouth.

Saturday, November 28, 2009

Facilitated Communication

It's like a Ouija board only much, much worse.  Here's a link to give you some idea of what I'm talking about.  Basically someone is claiming this guy is not brain dead because they can pick up his hand and use it to type words on a keyboard.  I, like PZ Myers, would like to see how well he answers the questions being asked when the "facilitator" is blindfolded.  Anyway, hopefully now the cartoon makes sense.

Wednesday, November 25, 2009

Tryptophan and Turkey (Again)

So, I likely won't get to post anything tomorrow what with traveling and then eating and drinking followed by more eating and drinking and then some more eating for good measure.  Since I won't be posting anything, enjoy this lovely e-card from someecards.com, and the blog version of a rebroadcast in the form of a link to my post from a few days ago explaining how tryptophan in turkey is not the culprit for your thanksgiving day drowsiness.  If anything is to blame, it's the booze and all the "carbage".  Here's the link to the turkey/tryptophan post, in the meanwhile...

Brain size: is bigger better? or, Of mice and men (and elephants)

What would you say if I told you there may be a way to increase your brain size?  Would you be interested?

If you are, you're probably not alone, but would having a bigger brain make you smarter? Or would you just be throwing your money away?

Sadly, most of the evidence suggests that overall brain size is not a critical determinant of intelligence or cognitive function, and in some cases, having a bigger brain can actually be a bad thing.

Now, I was planning on writing a nice long post about how bigger is not always better when it comes to the brain, particularly when we look at different animals, but, I just read this article on ScienceDaily (http://www.sciencedaily.com/releases/2009/11/091117124009.htm) that sums things up pretty well.  I'll just give a quick example that I think is quite convincing:
the brain of an Asian elephant is obviously larger than that of your average human (about 7.5 kilograms, where the human brain is, on average, a little less than 1.5 kilograms).  Still, I think we will all agree that humans are smarter than elephants (if only slightly).  To get around this blatant difference between humans and elephants, some people have argued that proportional brain size is what matters. That is, if you take the size (or mass) of the brain and divide it by the size (or mass) of the whole animal, you get a ratio that describes the size of the brain in proportion to the rest of the body.  This works well when you look at humans and elephants, where, in humans, the mass of the brain over that of the body gives a ratio of about 2.1%, and in Asian elephants, about 0.15%. And to some extent, this approach works for other comparisons, where there are several examples that make sense (the ratio is larger in "smarter" animals).  HOWEVER... there are definitely many examples where this doesn't work; for instance, when you look at mice, the ratio of brain weight over body weight is 3.2%, about one and a half times the ratio in humans.  So, if bigger (whether in absolute or in proportional terms) is always better, then either elephants or mice are smarter than people... which, despite the popularity of Sarah Palin's new book, is still highly doubtful.
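Just to make the arithmetic concrete, here's a quick sketch of those ratios in Python. The brain masses are the ones quoted above; the body masses are rough, textbook-style averages I'm assuming for illustration, so treat the exact percentages loosely.

# Back-of-the-envelope brain-to-body ratios (masses are rough averages).
animals = {
    "human":          {"brain_kg": 1.4,    "body_kg": 65.0},     # body mass assumed
    "asian elephant": {"brain_kg": 7.5,    "body_kg": 5000.0},   # body mass assumed
    "mouse":          {"brain_kg": 0.0004, "body_kg": 0.0125},   # both assumed
}

for name, m in animals.items():
    ratio = m["brain_kg"] / m["body_kg"]   # brain mass divided BY body mass
    print(f"{name}: brain is {ratio:.2%} of body mass")

# human: ~2.15%, asian elephant: ~0.15%, mouse: ~3.20%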

So, comparisons across different species suggest that bigger brains don't equal smarter animals (even when you take overall body size into account).  But what if you compare humans to other humans?  Here, there is some evidence that bigger is better (that's what she said), though again there are many exceptions and reasons for questioning that evidence (he retorted defensively).  For example, when we look at the fossil record, we see that, as humans evolved, our skull cavities got bigger, suggesting that our brains got bigger as we got smarter (though, again, there are exceptions, like Neanderthals, whose brains may have been bigger than our own).  When we look at more recent evidence in humans, we see some interesting things.  For example, boys have larger brains (on average) than girls (which remains the case in adulthood as well).  But does that mean boys are smarter than girls?  Well, if we look at average SAT scores, boys do tend to score higher than girls, BUT, if we look at grades (in co-ed institutions), girls get better grades than boys do.  So where does that leave us?  It leaves us with no clear evidence that one sex is any smarter than the other, despite the difference in brain size.  (The funny thing is, when we look at brain size as a proportion of total body size, girls actually have bigger brains than boys do; of course, that still doesn't help us.)
If we try to ignore the confusion of comparing men and women (or boys and girls, as the case may be), there are some reports (based on data from MRI scans) that suggest bigger brains do correlate with higher IQ scores (which are certainly a limited, and perhaps biased, measure of intelligence, but still interesting, and not completely irrelevant).  However, these are weak correlations, and they do NOT demonstrate a hard and fast rule (see the very end of this post for a more detailed explanation of what I mean).  For example, though he wasn't included in the MRI studies, one famous exception to this idea is Albert Einstein.  When pathologists examined Einstein's brain after his death, they found that it was not any larger than your average human brain, though I'd be willing to bet he would score pretty high on an IQ test.
In addition to the fact that these correlations were weak, there are a couple other points that make this evidence questionable (as to what it really means, if anything)...
1. Several variables can change brain size over short periods of time.  For example, drinking alcohol temporarily shrinks the brain, which means if anyone in these studies was hungover, they would likely appear to have a smaller brain and also probably wouldn't do so well on the IQ test.  Other factors that are known to affect brain size include diet, exercise, marijuana, medication, and even meditation.  Given the relative ease with which these things can alter brain size, it seems more difficult to tie any measure to absolute brain size since brain size is obviously not absolute.
2. IQ tests may not indicate a person's overall intelligence.  That is, someone with a smaller brain may not have done as well on an IQ test, but they may be much more skilled in some other form of intelligence, like art (painting, sculpture, or music, etc.).  We all know that Mozart was a genius (whose music can actually make you perform better on IQ type tests) and yet, he would probably not score so high on an IQ test.
3. Correlation is not cause.  I have talked about this before.  Even if we assume the data is unequivocal (that is, an almost perfect correlation, where 100 percent of the time a bigger brain is associated with greater intelligence) we still can't say that having a bigger brain causes people to be "smarter".  Perhaps, brains get bigger with use, and therefore, people who study more or who have more years of education (and can therefore score higher on an IQ test) have slightly larger brains. For example, a study conducted about a decade ago with London cab drivers suggested that, at least a part of the brain might be able to get bigger with use.  The study showed that the hippocampus (which is important for remembering where things are located) was bigger in cab drivers than in regular commuters, and cabbies who had been on the job for many years had larger hippocampi than cabbies who had only been working for a shorter time.  While this isn't definitive, and doesn't say anything about other parts of the brain, it does suggest that brain growth in response to use or practice is a reasonable hypothesis.


So, what's the answer?  Are bigger brains better? I think, despite the correlations between brain size and IQ, we still have to say NO.
One final piece of evidence I would like to offer is the presence of a condition (in humans) called megalencephaly, which literally means "large brain".  Megalencephaly is characterized by a brain that is unusually large or heavy when compared to average brain size, and while it is suspected that the causes are genetic, they are not fully known.  Interestingly, megalencephaly often results in decreased intelligence and mental retardation, which seems pretty convincing evidence that simply having a bigger brain does not confer any improvements in cognitive abilities. Of course, another condition known as microencephaly, or "small brain", also results in mental retardation and, often, early lethality.  Together, these extreme conditions suggest that our brains work best within a range of sizes that are reasonably close to average.  So if you have an average sized brain, be grateful, and know that, you're in good company (with Einstein), and while there may be other factors determining how smart you are (environmental factors like education level, or diet and exercise, or genetic and biological factors, like how many receptors for neurotransmitters your nerve cells make), brain size is likely not one of them.

___________________________________________________________________________________

The statistical measure for a correlation is the r value.  In the MRI studies the r values ranged from 0.35 to 0.51, which may be statistically significant, but suggest a fair amount of deviation (what we would call a "weak" correlation).  "r" values range from -1 to +1, with -1 being a perfect negative correlation (one thing gets smaller while the other gets bigger), +1 being a perfect positive correlation (two values get bigger together), and 0 being no different from random chance (the two variables don't seem to be related to each other at all). While getting an r value of 0.4 or 0.5 is definitely meaningful (what we call "statistically significant"), it suggests that there are a fair number of people who break from the trend. "Significance" in terms of statistics, signifies that the relationship being studied (in this case between brain size and IQ score) is not random, or likely doesn't reflect measures you would get by chance.  To give you a simple example of what it means to be non-random, or not by chance, think of the simple comparison between two individuals where chance for a given variable is 50 percent (like flipping a coin, 50 percent chance of heads, 50 percent chance of tails).  If you come across two people with different brain sizes, the MRI data suggests it is likely that more than fifty percent of the time, the person with the bigger brain will score better on an IQ test than the person with the smaller brain.  However, a greater than 50 percent chance is NOT a hundred percent certainty.  So, some of the time (less than 50 percent, but still greater than 0 percent), the person with the smaller brain will be "smarter" than the person with the larger brain (at least when measured by IQ).
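If you're curious what a correlation of roughly 0.4 looks like in practice, here's a small simulation sketch in Python. The data are entirely made up and generated just to have that correlation; the point is simply to show how often the "bigger brain" in a random pair also has the higher score.

# Simulate pairs of made-up "brain size" and "IQ" values with correlation r,
# then count how often the bigger brain goes with the higher score.
import random

def correlated_pair(r):
    x = random.gauss(0, 1)                                # fake "brain size"
    y = r * x + (1 - r ** 2) ** 0.5 * random.gauss(0, 1)  # fake "IQ" with correlation r
    return x, y

random.seed(1)
r, trials, concordant = 0.4, 100_000, 0
for _ in range(trials):
    (size_a, iq_a), (size_b, iq_b) = correlated_pair(r), correlated_pair(r)
    concordant += (size_a > size_b) == (iq_a > iq_b)

print(f"bigger brain also had the higher score in {concordant / trials:.0%} of pairs")
# prints roughly 63% -- better than a coin flip, but nowhere near a guarantee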

Sunday, November 22, 2009

Mad cows and cannibals...

A new article in the New England Journal of Medicine (1) shows the recent evolution of a mutation in a population of people in Papua New Guinea that is protective against the degenerative brain disorder known as kuru.  Kuru is a prion disease, a disease (specifically, a transmissible spongiform encephalopathy) caused by proteins that act somewhat like viruses.  That is, the proteins are transmissible (i.e. contagious), and though they don't replicate themselves the way viruses or bacteria would, they somehow cause the other, similar proteins in their host/victim to misfold in a way that then causes the build-up of neurodegenerative plaques, which are often fatal.  Mostly, these diseases are transmitted through the eating of meat, as in mad cow disease (known as bovine spongiform encephalopathy, or BSE, in animals; the related human disease is called Creutzfeldt-Jakob disease, or CJD).  Kuru is a lot like CJD or BSE, and was passed on in the tribes of Papua New Guinea through the practice of eating dead relatives in memorial services.  Though this practice ended in the 1950s, the populations of people in this area experienced kuru epidemics that shaped the evolution of their tribes.  One tribe in particular seems to have a high prevalence of a mutation (G127V) that protects them against kuru.  Since kuru is fatal (killing many women and children in these populations), it makes sense that this mutation would be selected for, and that its prevalence would increase in the population.  This is particularly exciting because not only is this a great example of evolution in action (in humans no less!), but, as of yet, there are no treatments for CJD or kuru, and these diseases are almost always fatal.  The discovery of this mutation in a population that has evolved a resistance to a prion disease will offer insights into how we might go about protecting the brains of those who haven't evolved such a resistance, and hopefully save some lives.
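If you want a feel for how quickly a protective variant can spread under selection this strong, here's a toy sketch in Python. The survival numbers and starting frequency are invented purely for illustration (and it ignores diploid genetics entirely), but it shows the basic logic of selection raising the variant's frequency generation after generation.

# Toy model: carriers of a protective variant survive a fatal disease more
# often than non-carriers, so the variant's frequency climbs over generations.
# All parameters below are made up for illustration.
def next_generation_freq(p, carrier_survival, noncarrier_survival):
    carriers = p * carrier_survival
    noncarriers = (1 - p) * noncarrier_survival
    return carriers / (carriers + noncarriers)

p = 0.01                                   # variant starts out rare
for generation in range(10):
    p = next_generation_freq(p, carrier_survival=0.95, noncarrier_survival=0.70)

print(f"frequency after 10 generations: {p:.0%}")   # roughly 18% in this toy scenario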

1. Mead S, Whitfield J, Poulter M, Shah P, Uphill J, Campbell T, Al-Dujaily H, Hummerich H, Beck J, Mein CA, Verzilli C, Whittaker J, Alpers MP, Collinge J. A novel protective prion protein variant that colocalizes with kuru exposure. New England Journal of Medicine, 2009; 361(21): 2056. DOI: 10.1056/NEJMoa0809716

Monday, November 16, 2009

You may feel sleepy on Thanksgiving... but it's not the turkey.

Since Turkey-day is around the corner, I thought I would bring up the very popular myth that tryptophan in turkey is what makes us all feel groggy on Thanksgiving.  In an earlier post, I talked about how the amino acid tryptophan gets converted into serotonin, and then melatonin.  Melatonin, as you may or may not know, is the "sleep hormone". It is secreted by the pineal gland to help regulate our sleep/wake cycles, which follow a circadian rhythm of about 24-25 hours.  During the day, when it is bright and sunny, we feel awake; then, as the day turns into night, we start producing more melatonin, and we get sleepy.  Considering this, it's not too hard to see why tryptophan became the scapegoat for our Thanksgiving day sleepiness, but the truth is that tryptophan, and really turkey in general, has gotten a bad rap.  First, tryptophan is a fairly prevalent amino acid, and there is actually plenty of it in most of the protein-containing foods that we eat.  Furthermore, turkey does NOT contain a higher level of tryptophan than most other common meats, fish, and poultry.  For example, per 200-calorie serving, duck, pork, chicken, soy, sunflower seeds, several types of fish, and turkey all have about 440-450 mg of tryptophan, with turkey being the lowest in the group.  Of course, that being said, even if turkey did have significantly more tryptophan than other meats, it is still questionable whether normally consumed levels of tryptophan can make you sleepy.  While, at first glance, the research seems to back the idea that tryptophan has sedative effects, these studies have used very large quantities to test for effects. For example, one study from 1975 suggested that consuming 5 grams of tryptophan (so, about 11 servings of turkey) did increase self-reported drowsiness, and a study conducted in 1989 found that a dose of 1.2 grams of tryptophan did not increase measures for drowsiness, but a dose of 2.4 grams did.  These studies suggest that you would have to eat a lot of turkey (like, over a pound and a half) to get an effective dose (I've sketched out the serving arithmetic at the end of this post).  So, while it is possible that you may eat that much turkey on our most hallowed of gluttonous holidays, it is more likely that Thanksgiving day drowsiness is the result of a coming together of many factors, a perfect storm, if you will, of:
1. lots of food (which diverts bloodflow to the digestive tract),
2. much of the food is carbohydrate-heavy stuffing and sweet foods like cranberry sauce, sweet potatoes, and desserts (which can cause an overproduction of insulin, resulting in low blood sugar, and thus sleepiness, later on),
3. and then of course there are usually a couple of alcoholic beverages involved (with obvious sleep inducing effects). 
Add all of that up with being  in a nice, warm home, on a comfy couch, with football or parades or a fire flickering in the background, and what you have is a recipe for a nap.  I'm kinda sleepy just thinking about it.
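As promised, here's the serving arithmetic, sketched in Python using the ~440 mg of tryptophan per 200-calorie serving figure from above (rough, illustrative numbers only):

# How many 200-calorie servings of turkey would it take to reach the doses
# used in the tryptophan studies mentioned above?
trp_per_serving_g = 0.44            # ~440 mg of tryptophan per serving

for dose_g in (1.2, 2.4, 5.0):      # doses from the studies cited above
    servings = dose_g / trp_per_serving_g
    print(f"{dose_g} g of tryptophan ~ {servings:.1f} servings of turkey")

# 1.2 g ~ 2.7 servings, 2.4 g ~ 5.5 servings, 5.0 g ~ 11.4 servings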
Have a Happy Thanksgiving!

Wednesday, November 11, 2009

I'm a "No" man... oh, and alcohol doesn't kill brain cells.

So, I've been reading Randy Olson's new book Don't Be Such a Scientist: Talking Substance in an Age of Style, and in it, he talks about the disconnect between the general public (in our modern, overly stimulated society) and scientists (who cloister themselves in their labs and ivory towers).  A large part of the communication breakdown, he posits, comes from the nature of science itself, which is a bare-bones, take-no-prisoners, purely data- and fact-driven culture that breeds overt skepticism at all costs.  This is important for how science works and for maintaining integrity in research, and it can actually be quite helpful in making your experiments better, but when you are trying to convey your findings to a broader audience, they tend to find all the negativity and information overload to be, well, boring.  Partly this is because the attention of the average person is now a hot commodity, and the marketplace for the average Joe's attention is filled with advertisers, marketers, politicians, television, music, movies, YouTube, and on, and on.  Spend a few minutes going on about the role of bone morphogenetic proteins, Wnt/beta-catenin signaling, hedgehog signaling, and several other factors in the differentiation of stem cells in the development of different aspects of the central and peripheral nervous system, and zzzzzzz.....  Of course, I don't know if it's the training, or if people of a certain mindset just gravitate to science, but a lot of us are like that.  We go off on our research as if it's the most important thing in the world, we are obsessed with facts and with being accurate, and we are very negative, always questioning the validity of what we're seeing or being told (it's what we do; we get paid to be skeptical). And so, as I've been reading this book, I've realized that I am no different, and that even this whole blog is devoted to negativity.  I have set out to debunk, demystify, and disprove many common misperceptions about neuroscience.  I am a "no" man.  That's not how this works... That idea is wrong... I have become the person who constantly annoys everyone by correcting their grammar, except I do it with neuroscience.  Well, I can't help it.  It's who I am, and part of my nature as a scientist. BUT... despite my obsession with factual accuracy and my desire to negate the myths that are out there, it doesn't mean I always have to be the bearer of bad news.

For example, today, the myth I want to debunk is the myth that drinking (alcohol) kills brain cells.  As it turns out, there is very little (if any) conclusive evidence to suggest that drinking alcohol kills brain cells.  So where did this myth come from?  Well, the obvious memory loss and headaches that come from binge drinking suggested that some sort of brain damage was occurring.  And more recently, MRIs have shown that the brain shrinks after drinking.  Don't worry, it's only temporary, but apparently it shrivels like a prune.  Also, alcohol can damage parts of cells known as neurites, which form synapses, the connections between cells that allow them to communicate.  Synapses are critical for forming new memories, and it appears to be this aspect of alcohol's effects that results in the short-term memory loss we've all experienced at some point or another.  Of course, all that being said, overdosing on alcohol (or alcohol poisoning) can most definitely cause cells to die (and could cause you to die).
Also, drinking and driving could kill brain cells by smashing them into the windshield at 60 miles per hour.  But if you're just going to have a few drinks with some friends, live it up and relax, confident in the knowledge that your brain cells aren't lightweights; they're actually tough enough to handle a couple of beers, and maybe even a shot or two.  Hey, some studies have even suggested that moderate drinking can improve mental abilities.  So maybe a glass or two of wine a night is the way to go (since it's good for your heart too).
Normally I link to a bunch of stuff to back up what I'm saying, but I am off to Wisconsin, so these links will have to suffice....
http://www.wonderquest.com/BrainCells.htm
http://www2.potsdam.edu/hansondj/HealthIssues/1103162109.html

Sunday, November 8, 2009

More BS from people trotting out PhDs

The proponents for creationism (or intelligent design, as the names are really interchangeable) at the Discovery Institute (DI) have a new campaign going on where they aim to expound upon 95 theses that, they claim, will make you reconsider evolutionary theory. At least, I think it's the work of the DI, since the only thing on the page right now is a link to a DI document that lists the names of scientists and other academics who have assented to the statement:

“We are skeptical of claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for Darwinian theory should be encouraged.”

First, I love that the 95 theses are a tribute to Martin Luther's seminal work, which he famously nailed to the church door in Wittenberg to ignite a reformation of the Catholic church. And they say intelligent design isn't religious in nature?
Second, I love that there isn't anything on the page. I am sure this is the calm before the storm, but for now it just points out that there is no evidence to disprove any facet of evolutionary theory as it stands today. Most likely, these 95 theses will be things that argue against Darwin's ideas of 150 years ago, or will just point to gaps in our understanding and say, "since we can't explain this (yet), God must have done it". Same old crap, new marketing and PR.  Or, they will just be some crazy ideas that are unfounded and unsubstantiated by any factual evidence (like this one).
Next, I would like to point out that the statement above is pretty reasonable. First, it's not asking anyone if they denounce evolutionary theory, or even whether or not they think it fails to explain the origins of species; it just asks if they are skeptical about random mutations and natural selection being the SOLE cause of the complexity of life, and if they would encourage people to pursue further inquiry into evolutionary theory. It is cleverly worded so that a scientist would read it and think, "of course, why wouldn't we want to further investigate the theory that underlies ALL of the major findings in biology for the last 150 years? Surely there are nuances to be expounded upon, and our understanding of the natural world will likely never be complete, as new species arise every day." Of course, a non-scientist would read this, and, if they were already skewed to favor creationism, would accept this as evidence that there is a major conflict amongst academics over whether or not evolution is true. Let me be plain: THERE IS NOT. The OVERWHELMING majority of scientists recognize that evolution is a fact.  Those who don't, I suspect, are lying out of some misplaced sense of duty to their religious beliefs, or for some monetary or political gain.  Asking whether or not you agree with such a statement is a trick question... ask any evolutionary biologist, and my bet is that all of them would agree. Why? Because if they didn't think that evolutionary theory should be "carefully examined", they wouldn't be able to justify their own jobs! I mean, why would you spend your whole life and career studying something you didn't think needed a more "careful examination"?  And as for being skeptical, that is what scientists do. I am skeptical too! Particularly when you ascribe something like ALL of life's complexity to just two processes: random mutation of DNA, and natural selection. Especially when we already know that artificial selection, sexual selection, and genetic drift can drive evolution as effectively (if not more so) than natural selection.


And as for random mutation, it is a wonderfully exciting hypothesis to suggest that there might be changes to DNA, that could be selected for or against, that are not random, and actually, I think you could already make a case for it. More and more evidence suggests that our environment interacts with our DNA at the molecular level. As a common example, UV radiation from the sun preferentially targets thymidine molecules in our DNA, more specifically, pairs of thymidines that happen to be next to each other. When UV light mutates our DNA, more often than not, it causes two of these molecules to fuse together, forming a thymidine dimer (see figure above, copyright 2004, Steven M. Carr). These are the mutations we worry about with sun exposure because they can lead to skin cancer, and these mutations are obviously not random: the chemical structure of thymidines in DNA lends them more to an energy-induced change in bonding than other molecules in the DNA (the C, G, and A nucleotides). Of course, this example is more of an analogy because, in order for a mutation to be passed on to one's offspring, it has to occur in cells that will ultimately give rise to either a sperm or an egg cell, and, hopefully, your spermatogonial cells and ovarian follicles aren't getting sunburned. However, this analogy suggests that it is possible for mutations to be non-random, which means it may be possible for mutations that get passed on to be non-random as well... for example, molecules from the food we eat and the air we breathe enter the bloodstream and are thus trafficked to all of the cells in our bodies (including sperm and egg cells). These molecules may introduce mutations in a manner similar to UV radiation from the sun, that is, in some preferential, non-random way. This is a fascinating idea! And it can be tested experimentally and thus scientifically, and thus, should be "more closely examined". And then we will obtain evidence either for or against non-random mutations being an agent of evolutionary change. And, if we find it's possible, then we can further "examine" different species for more examples of non-random genetic changes.
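To make the "non-random" point a little more concrete, here's a toy sketch in Python: UV-induced dimers need two thymidines sitting side by side, so the sequence itself constrains where those particular mutations can occur. The sequence below is invented for illustration, and positions are zero-based.

# Find the candidate sites for UV-induced dimers: adjacent T-T pairs.
def tt_dimer_sites(seq):
    """Return the (zero-based) positions where two T's sit next to each other."""
    seq = seq.upper()
    return [i for i in range(len(seq) - 1) if seq[i] == "T" and seq[i + 1] == "T"]

dna = "ATTGCCTTAATTCGA"            # made-up sequence
print(tt_dimer_sites(dna))         # [1, 6, 10] -- only these spots are "at risk"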

But there are even more hypotheses and predictions we can generate based on the evidence we already have, which lead us to "examine more closely" these aspects of evolutionary theory. Previous mutations could make it more or less likely for specific types of mutations to occur in the future. And DNA, by its very chemical nature and structure, may lend itself to certain types of changes over others (like the sunburn and thymidine example, except possibly without the need for a mutagen). Also, we know that mutations at the sequence level aren't the only things that can bring about change at the phenotypic level. Whole genes sometimes get duplicated when the processes of meiosis and mitosis go awry.  We can further "examine" these duplication events to see how often they occur, or how important they are for different types of traits. Sections of chromosomes cross over during meiosis to introduce genetic variability, and mechanisms have arisen to vary phenotypes at the RNA and protein levels as well.  These areas all need further "closer examination" to see what role they play in evolution, if any.

The point is, this statement that seems to indict so many scientists as "dissenting from Darwinism" is actually quite weak and not up to date with much of what has been discovered about evolution and genetics in the past 50 or 60 years. Many biologists would admit not only to being skeptical, but to claiming that the statement "random mutation and natural selection account for ALL the complexity of life" is actually false (since we already know that random mutation and natural selection are not the WHOLE story of evolution). Basically, the poorly named Discovery Institute has trotted out some old ideas about evolution and asked, "who thinks we should investigate further?", NOT, "who disagrees?", but "who thinks we should investigate further?" Well, of course scientists are going to want to investigate more closely, that's what we do! But the DI just wants you to read that statement and think that there is massive "dissent from Darwinian theory", and that we should all pack up our books and go home. Agreeing to the statement they have written only means you dissent from an antiquated, out-of-date account of evolution. You may as well ask physicists if they are skeptical of the ability of Isaac Newton's ideas to explain ALL of the things we see in the universe. I imagine they would all agree that Newton's ideas alone don't explain everything: his ideas have been updated and expounded upon by others like Einstein to fill in the gaps where Newtonian physics fails and to explain what's going on at the sub-atomic level, or in black holes, or at the beginning of the universe. Does this mean that ALL of Newtonian physics is wrong? No. In fact, it is still the basis for most of our knowledge of physics in the world we live in (where everything is at least as big as an atom and things move slower than the speed of light). The same is true for Darwin's ideas.  Are they perfect and all-inclusive? NO. Do they still provide the framework that explains almost everything we observe in biology? YES.


The main difference here is, ID proponents use this statement to claim that scientists support so-called "intelligent" design because ID, so they claim, further investigates evolutionary theory.  But that is a lie.  Scientific investigation allows us to more closely examine evolutionary theory.  ID, while helpful in its ability to point to some of the unanswered questions and interesting examples, does nothing to find out anything new about those instances.  The whole basis of the idea is this: some divine being whom we cannot see or know in any objective or natural way put everything here, and that was that.  This idea is the opposite of further examination; it is fatalism, pure and simple.  It is saying that things like the origin of species are too complex to figure out, so we should just say God did it, give up, and go home.  In fact, I should think that while most biologists would affirm the statement put forth by the DI, most creationists would not (at least if they were to read it the way scientists do).  In fact, I suspect that most of the fellows at the DI don't really want to examine evolutionary theory any further, because every time we do, their claims are disproved and evolutionary theory is further verified (like the evidence for the evolution of the bacterial flagellum, which I talked about here).

This is just another example of how the DI and other proponents of creationism manipulate people and information to cloak themselves in a false air of credibility when really they are being flat-out deceitful. They don't really want to "examine" evolutionary theory "more closely"; they simply want to tear it down in the eyes of the public. ID is a smear campaign, plain and simple.

I don't yet have a PhD, but when I do, maybe they'll ask me to sign their petition as well?  I think I would have to refuse on moral grounds. I agree with the statement, but not the spirit of it, and not the dishonesty that went into crafting and disseminating it. Questioning evolutionary theory is great! It's what good biologists do every day. And I encourage everyone to question it, to find out more (from reputable sources), and to test the theory over and over again, scientifically.  The difference is, the DI doesn't want you to carefully examine evolutionary theory, at least not in any productive way. They want you to learn FROM THEM the specific examples of things that haven't been explained yet, and the old ideas that they say have been proved wrong when really they've just been updated. And they want you to think that numerous scientists disagree with evolutionary theory, when really the true scientists just want you to understand it better.

Wednesday, November 4, 2009

A little late for Halloween, but...

you want to see something that will freak you out?  at least as freaked out as you can be by something that is uber nerdy and neurosciencey?  Check out this video of the McGurk effect.  Watch the video, and try to identify the one syllable sound that is being repeated.  Then, watch it again, but close your eyes, or look away from the screen.



Did the sound change?
What happened?
The McGurk effect is a perceptual illusion that demonstrates how our brains combine the information we get from our senses. Generally, these pathways have some overlap, which we often take for granted. For example, the same coordination of auditory and visual information being exploited in the illusion here is actually useful when you are in a bar or crowded party and you can't really hear the person you're talking to. Luckily, in real life (unlike in the video) the auditory and visual cues match up, and your brain can use the visual information to help you "hear" the person next to you at that loud party. Though not exactly the same, a similar overlap occurs between the olfactory (smell) and taste pathways, which is why food tastes different when you have a cold or pinch your nose.

Here's another video that includes an explanation of the McGurk effect, and after that, a video showing the reactions of some viewers on a Japanese television show.



Sunday, November 1, 2009

What do Autism and Milli-Vanilli have in common?

Very little, as you might have guessed, but given recent popular media articles with clever titles like "Autism: Blame it on the Rain," I get the feeling I'm going to be just as disappointed as when I learned those guys were lip-syncing the whole time.  Anyway, the articles go on to talk about an epidemiological study that shows a correlation between counties in west coast states that get a lot of rainfall and increased incidence of autism.  And as I suspected, I am disappointed, not in the research that inspired the popular press articles, but in how most of the popular press coverage glosses over the fact that this study is VERY PRELIMINARY.  The authors of the study are not saying there is a definite link between living in a rainy area and the chances that your kids have autism.  Here's what I mean.  Before anyone gets too agitated and starts thinking of moving to the Arizona desert, let's consider one of the most famous adages in science: "correlation is not causation".  What that means is: just because two things are correlated doesn't mean they have anything to do with one another.  A great example of this comes from the venganza.org website, home of the Church of the Flying Spaghetti Monster, which has a figure correlating the decline in the number of pirates with the global increase in average temperature. 


Does this mean that we should assume global warming is responsible for the decrease in piracy we've seen since the 1800s?  No, it is simply a correlation, and there is no good evidence to suggest that increases in the average yearly temperature have any effect on how many pirates there are, or that pirates somehow cool the globe and their absence is what has caused the world to get warmer. Given what we know of history and climate science, it is much more likely that the increases in technology and industry since the 1800s led to increased air pollution (and increased levels of greenhouse gases), which has led to steadily increasing average temperatures. At the same time, technology also allowed for improved national navies and merchant marines that have had more success in capturing, imprisoning, or killing pirates, or evading them.  Additionally, the development of air travel meant fewer and fewer people and valuables were transported by sea.  As the cost-to-benefit ratio of being a pirate became more costly than beneficial, fewer and fewer people opted for it as a career.  (And of course, if we look at the recent increase in pirates coming out of Somalia, we see no decrease in temperature to account for this absurd relationship, but the dire circumstances in Somalia and the millions of dollars in ransom these pirates have been able to amass suggest there is something to the cost/benefit idea.)  Anyway, the fact that this study shows only a correlation, with no evidence of causality, is not the only strike against an argument relating rainfall to autism.  Apparently, there were several issues with the methodology of this study that make its results sketchy at best.  An article at scienceline.org actually does a great job of explaining everything, but just to give the highlights here:
Despite the statistically significant findings, a careful reading of Waldman’s article published in the November issue of Archives of Pediatrics and Adolescent Medicine results in more questions than answers.
Could the rain-autism correlation be an artifact of another relationship? Traveling through Oregon or Washington, for example, a distinct trend emerges: the further east, the more rural and dry. It may be that urbanized counties do a better job of reporting autism cases, and that these simply happen to be the same counties with more rainfall.
Teasing out other causes, like ruralness, is difficult in an epidemiological study. This is one reason why Waldman’s research design is the “weakest type of epidemiological study,” according to Irva Hertz-Picciotto, an environmental epidemiologist at the University of California, Davis, also uninvolved with the study. Rather than examining exposures of individuals, ecological studies compare whole groups, which means information that could help decipher the true cause-effect relationship is lost.


Further, the grouped data in Waldman’s study came from a combination of sources — from state agencies to regional health centers — leading Hertz-Picciotto to suspect the numbers are not comparable within or between states. “Often the [agency or center] will put the children into the categories that they know they can provide services for,” Hertz-Picciotto says, “even though the categories may not be the best description of the child’s diagnosis.”
All of that aside, does this study tell us anything important?  Despite my somewhat rough treatment of it here, the answer is actually yes.  The finding that the incidence of autism may be correlated with the amount of rainfall an area receives could lead to the discovery of some other factor that is related to both autism and rainy days, and thus provide an explanation for the correlation.  For example, there is a psychological disorder called Seasonal Affective Disorder (SAD), which is more commonly known as the "winter blues".  When the days are shorter and we are exposed to less natural light, it is hypothesized that people with SAD produce less melatonin and serotonin (a hormone and a neurotransmitter, respectively; the former is most notably characterized as helping to regulate sleep/wake cycles, while the latter is the neurotransmitter most often enhanced by antidepressant medications).  It may be possible that children who have a genetic predisposition to autism spectrum disorders are sensitive to getting less light in a similar fashion to people who have SAD; these kids may have brains that produce less serotonin or melatonin, and this may, in turn, affect the development of their brains.  Alternately, it may be that kids who fall on the spectrum become more antisocial or have a more difficult time with vocal development because they are forced to stay indoors more, watching TV and playing video games rather than interacting with other kids.  Both of these ideas are completely speculative and are not substantiated by any hard evidence.  But it is important in science not to discount any ideas until they have been tested and disproven.  And the benefit of the study mentioned in these articles is that it gives us a reason to hypothesize about the role of light exposure or time spent isolated indoors as factors that could affect autism, and better still, reason to test these ideas so that we will one day know whether, and to what extent, where you live or how you live affects the chances your kids will be autistic.
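(If you want to see just how easily a "correlation" can show up with no causation behind it, here's a quick sketch in Python. Every number in it is invented for illustration; I simply made up a warming trend and a shrinking pirate population and let the computer confirm that two unrelated trends can look tightly linked.)

import numpy as np
from scipy.stats import pearsonr

years = np.arange(1820, 2020, 20)

# Invented "global average temperature" that drifts upward over the years.
rng = np.random.default_rng(0)
temperature = 13.5 + 0.006 * (years - years[0]) + rng.normal(0, 0.05, len(years))

# Invented "number of pirates" that drifts downward over the same years.
pirates = 45000 * np.exp(-0.015 * (years - years[0]))

r, p = pearsonr(temperature, pirates)
print(f"correlation between temperature and pirate count: r = {r:.2f} (p = {p:.3g})")
# Prints a correlation close to -1 even though neither trend causes the other;
# both just change steadily with time, which is the real lurking variable.

Run it and you get a near-perfect negative correlation, which is exactly the trap: the two series share nothing but the passage of time.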




Thursday, October 29, 2009

The Great Pumpkin is out to get you...


So, I have a calendar from the Nature Conservancy in the lab, and on it, for the month of October, there is a little blurb about pumpkins being treated with pesticides that are toxic to the human nervous system.  The chemicals in question are malathion and diazinon, and they are neurotoxins... at least to bugs, which is why they make such good pesticides.  When it comes to humans, their toxicity is debatable.  Malathion, at the low doses used in agriculture, is essentially harmless to humans (though I don't suggest drinking or eating it at higher doses).  Diazinon, on the other hand, can be more toxic, though, again, you should be okay unless you are eating or drinking it directly.  That being said, you should always wash any pumpkins you plan on carving or cooking with, AND you should always wash your hands thoroughly any time you've been handling pumpkins.  Also, I agree with the Nature Conservancy in their recommendation to find a local, organic grower to avoid the chemicals altogether (you can find one at http://www.localharvest.com/).  The reason being that, while these chemicals are not particularly harmful to humans, they can certainly be harmful to other, smaller animals that might raid a farm and not have the benefit of being able to wash with soap or cut through the tough skin of the pumpkin with a knife. Plus, we don't want to have these chemicals build up in the soil and groundwater to the point where they reach concentrations that are toxic.  You can even look at your local grocery store to see if they sell certified organic pumpkins; if they are USDA certified organic, that means no chemical pesticides were used, and you and the environment can breathe a sigh of relief. 
Of course, I can't stop there, because there's a great opportunity here to talk a little neuroscience.  I said that malathion and diazinon are neurotoxins, but what are they, and how are they toxic?  Both are chemicals of a class known as organophosphates, and both are cholinesterase inhibitors, which means that they block the action of an enzyme in the nervous system called acetylcholinesterase.  In a previous post, I described how the transmission of nerve impulses occurs, describing how those impulses travel across synapses in the form of chemicals called neurotransmitters (here).  What I neglected to mention was that after the signal has been conducted, the neurotransmitters need to be removed from the synapse, or else the post-synaptic cell will be fooled into thinking that another impulse has been transmitted, and another, and another, indefinitely, until the neurotransmitter molecules are gotten rid of.  Some neurotransmitters, like serotonin, are reabsorbed by the presynaptic cell.  Antidepressants like Zoloft and Prozac work by preventing this reuptake, thus increasing the activation of cells responsive to serotonin. In the case of another neurotransmitter called acetylcholine, the molecules get broken down by the enzyme acetylcholinesterase.  Acetylcholine is primarily used by the neurons that allow your brain to control muscle movements.  Blocking acetylcholinesterase from breaking down acetylcholine in synapses can lead to continued muscle contraction.  Which may not sound so bad until you realize that this is actually the mechanism of action of another organophosphate/cholinesterase inhibitor: Sarin "nerve" gas.  Sarin is a very potent acetylcholinesterase inhibitor that causes its victims to convulse wildly, often dying from suffocation as the diaphragm and accessory muscles involved in breathing contract uncontrollably, preventing the victim from taking in sufficient air.  Now, I know that's scary, but, despite being organophosphates and cholinesterase inhibitors, malathion and diazinon are not nearly as potent as Sarin gas, and like I said, in low doses, malathion is actually quite harmless to humans (unless you eat or drink it at high concentrations).  Diazinon, while still far from lethal, is a little more toxic, and has been banned by the EPA for residential use (though it can still be used in agriculture).  Now, if you still want to buy some pumpkins from the store (or already have), the good news is that they were likely mainly treated with malathion, BUT, if you have some really GIANT pumpkins, they were probably treated with diazinon (or both diazinon and malathion).  In order to get giant pumpkins, you have to leave them in the field longer... the longer they're in the field, the more likely bugs are to camp out and have a good meal; thus, in order to grow large pumpkins, insecticides like diazinon are typically used.  Of course, by buying these pumpkins you are supporting the wide-scale use of these chemicals, which can cause damage to the nervous systems of lots of smaller animals like birds and rodents who might raid the pumpkin patch for a meal, or fish and amphibians who may be getting higher doses of these chemicals in the water runoff to streams and ponds.  So while the "Great Pumpkin" may not be real, the real pumpkins at your local grocer can be just as scary.  By the way, the cartoon comes from http://shinjiku.deviantart.com/ ; I hope the artist doesn't mind my swiping it for this post.
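For the curious, here is a toy simulation in Python of the basic idea: acetylcholine gets released in pulses, and the esterase clears it out between impulses; block the esterase and the transmitter piles up. The rate constants, pulse sizes, and time steps are completely made up and are only meant to show the qualitative picture, not real synaptic kinetics.

def simulate_ach(esterase_activity, steps=200, dt=0.1):
    """Return a list of (made-up) ACh levels over time for a given esterase activity."""
    ach = 0.0
    trace = []
    for step in range(steps):
        if step % 50 == 0:                     # a nerve impulse releases a pulse of ACh
            ach += 1.0
        ach -= esterase_activity * ach * dt    # enzymatic breakdown each time step
        trace.append(ach)
    return trace

normal = simulate_ach(esterase_activity=2.0)     # enzyme working normally
inhibited = simulate_ach(esterase_activity=0.1)  # enzyme mostly blocked

print(f"ACh left just before the next impulse, normal:    {normal[49]:.3f}")
print(f"ACh left just before the next impulse, inhibited: {inhibited[49]:.3f}")
# With the esterase inhibited, ACh piles up in the synapse, so the muscle keeps
# getting "contract!" signals -- the basic idea behind organophosphate toxicity.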

Monday, October 26, 2009

We live in exciting times! Blindness cured!

Well, not exactly, but some sensationalism is warranted.  This is mostly a follow-up study that has been ongoing for the past 2 years or so, but researchers from UPenn and the Children's Hospital of Philadelphia have successfully used gene therapy to reverse (for the most part) the visual impairments associated with a disease known as Leber's congenital amaurosis, or LCA.  To be clear, this is not a cure for all types of blindness (that is, blindness due to other causes), nor is it even a full cure for LCA, as none of the patients have regained completely normal vision (though they have made tremendous improvements), and the mutated gene targeted in this study only accounts for 8-16% of all LCA cases.  Still, this is incredibly promising research, for several reasons.  First, and most obvious, despite not being a complete cure, a single injection of a gene therapy mostly reversed LCA-associated blindness, to the point where half of the patients are no longer legally blind and can even navigate obstacle courses in low-light conditions.  The second major aspect of importance is the gene therapy itself.  In this case, an adeno-associated viral (AAV) vector was designed to deliver a functioning copy of a gene (called RPE65, or retinal pigment epithelium-specific protein of 65 kilodaltons) to make up for the mutated version that underlies LCA in a small proportion of cases.  What's amazing is not the virus-mediated delivery of the gene, as this is a technique that has been around for a while and has been used successfully many, many times in mice and other experimental animal models.  What's exciting about the use of this technique in humans is that the FDA is very, very cautious when it comes to allowing it to be used on humans.  Mainly, they are concerned because a virus is used to carry the gene and insert it into human cells, and they are also concerned that, in the long term, messing with the DNA in the cells could increase the risk that the infected cells will become cancerous.  Of course, the viruses used in gene therapy have been engineered so that they cannot replicate, and therefore cannot cause disease (much like the attenuated viruses that are used in vaccines).  But the worries about cancer can only be alleviated when we have a large enough group of patients whom we can follow over time to see whether or not they develop any tumors.  As mentioned, this particular study is already 2 years in the making, and, so far, there does not appear to be any increased incidence of cancer, which is very exciting and promising (though, of course, these patients will still need to be monitored as the years go by).  Finally, the fact that these virus-mediated gene therapies have been used and validated so extensively in lab animals is what made this therapy possible (and successful) in humans.  Thus, another major underpinning of this study is how it demonstrates the importance of lab animal use in biomedical research. 
I also liked how the researchers used each patient as their own control by injecting the therapy into only one eye.  In this way, they could compare the results from the treated eye to the untreated eye, and since both eyes are in the same patient, they should be as close to identical as possible, including the amount of blood flow they receive, etc.  That way, the researchers could be sure that it was really their treatment that made the patient's eyesight better, and not a spontaneous improvement of the disease (or a miracle).  They can tell this because if one eye gets better and the other doesn't (assuming it's the treated eye that gets better), then the gene therapy appears to work.  If both eyes stay the same, or get worse at the same rate, then the therapy didn't work.  And if the treated eye actually gets worse faster than the normal rate of degeneration due to the disease (which should be evident in the untreated eye), they will know that there is something wrong with the treatment.  Of course, this type of design makes it difficult to rule out any placebo effect (since, I'm assuming, the patients all know that at least one eye is going to be receiving the treatment), but given the remarkable results, I think we can rule out the placebo effect, or, if not, apparently we can use it to reverse certain types of blindness.  Either way, I think we should be happy.
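To illustrate why this paired, each-patient-as-their-own-control design is so powerful, here's a small sketch in Python using scipy's paired t-test on some purely hypothetical visual-function scores. These numbers are invented for illustration and have nothing to do with the actual study's data or statistics.

from scipy.stats import ttest_rel

treated_eye   = [62, 55, 71, 48, 66, 59]   # made-up post-treatment scores, one per patient
untreated_eye = [34, 30, 40, 29, 38, 33]   # made-up scores for the same patients' other eye

t_stat, p_value = ttest_rel(treated_eye, untreated_eye)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value here would suggest the treated eye consistently outperforms the
# untreated eye within the same patients -- evidence that the improvement comes from
# the treatment rather than the disease simply changing on its own.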

Friday, October 23, 2009

The cerebral cortex is not in the neck.

Here's one that simply amazes me.  Someone brought this to my attention a little while ago, but it took me some time to track down.  In the tv show "The Unit", one of the members of this secret counterterrorism group gets killed.  His name was Hector, and I guess it was a big ratings ploy in season 3 when they leaked that someone from the unit was going to be killed. Hector was shot through the neck by a sniper, and the bullet ultimately winds up being lodged in his chest cavity somehow.  Ignoring the fact that the bullet made a magical U-turn to end up where it did, I am more perplexed by the scene in the show where the medical examiner tells one of the other members of the unit that "the bullet entered the neck here, snapping the cerebral cortex.  He felt no pain."  In the scene, the guy even points to the cervical portion of the spine (the neck) as he is saying the phrase "snapping the cerebral cortex".  I hate to break it to the writers of "The Unit", but the cerebral cortex is in the brain, not in the spinal cord.  Sort of like an onion, the brain has several layers, except, unlike an onion, the brain tends to be bigger and more squishy and wrinkly.  Just like the tough outer skin on the onion, the brain has a tough protective skin called the dura mater (which basically means "one tough mother"; dura is the Latin word from which we derive the word durable, and mater means mother, as in alma mater, which means "nourishing mother", a term we honor our colleges and universities with since they nourish us with scholarship).  Under the dura mater is the arachnoid layer, which doesn't really resemble anything you'd see in an onion, but looks more like a collection of spiderweb-like structures (thus arachnoid) which help to cushion your brain against collisions with the inside of your skull.  Beneath the arachnoid lies the pia mater ("soft mother"), and, together, these three layers make up the meninges (which may sound familiar if you've heard of meningitis, an inflammation of the meninges usually caused by infection.  Meningitis can be bacterial or viral, and in some cases, usually when it is bacterial, it can be fatal, which is generally when it shows up in the news).  Directly under the meninges lies the cerebral cortex.  It is the outermost layer of what we typically think of as the brain: the gray, wrinkly ball that sits in the skull.  The cerebral cortex is actually the part of the brain that gives it its gray and wrinkled appearance, and it is also an area that is very important in most of our thinking, feeling, and doing.  The image below gives you some idea of the cerebral cortex, though it might be a little tough to read; I recommend going to the site I took it from, http://www.coheadquarters.com/coOuterBrain1.htm , to get a better look.  Once you get below the cerebral cortex you hit subcortical structures like the basal ganglia (of which the striatum is the largest portion), and even the cortex itself is subdivided into many layers (6 in the human brain).  Though, unlike an onion (or the picture), most of these layers are not so easy to pull apart; they are distinctions made by looking at the tissue under a microscope rather than an onion-like layering.  
All of that being said, I'm not saying that getting shot in the neck wouldn't kill you, especially if it severed the spinal cord, and, if that were the case, then it is possible that you wouldn't be in much pain (at least you probably wouldn't feel much below the point at which the spinal cord was severed), but I think your cerebral cortex would remain intact.

Friday, October 16, 2009

Society for Neuroscience Annual Meeting


I'm off to Chicago for the next 5-6 days for the annual meeting of the Society for Neuroscience (along with some 30,000 other neuroscientists).  I realize that I have not been posting a lot lately (I am working on a couple of papers that have been demanding most of my time) and now with the conference I will likely continue to be absent.  But, hopefully, when I come back, I will have lots of cool new stuff to talk about.

Thursday, October 15, 2009

Booth loves Bones.

So I was watching an episode of the Fox television show "Bones" a while back (I have yet to check out any of Kathy Reichs' books; she is the forensic anthropologist/author who is the inspiration for and producer of the show, but I do enjoy the show, and the overly rational, though exaggerated to the point of caricature, Temperance Brennan, a.k.a. "Bones").  Anyway, the other main character in the show is Booth (played by David Boreanaz), an ex-army sniper turned FBI agent who investigates murder cases with "Bones" and the rest of her forensics team.  Recently, Booth was diagnosed with a brain tumor and underwent surgery to have it removed; during the recovery, he was in a coma and dreamt that he was married to, and madly in love with, Bones.  Upon fully recovering, his amorous feelings followed him out of the dreamworld and into reality.  As evidence that Booth really did feel like he was in love with Bones, the FBI psychologist Dr. Sweets (played by John Francis Daley, whom you may remember from the excellent but short-lived series Freaks and Geeks) shows Booth PET scans of his brain showing "activity" in the VTA, or Ventral Tegmental Area.  PET scans are somewhat similar to MRI scans, and if you follow the posts, I've described how fMRIs work in the past, but again, briefly, fMRIs measure blood flow in the brain. When a particular area is being used, it needs more oxygen and glucose, thus, more blood.  But here's the catch: everywhere in your brain needs oxygen and glucose, so a single MRI image wouldn't really tell you very much.  What you need to do is get a baseline, and then, while changing one thing (asking the person to think about something specific or to perform a specific task), you take another image.  When you subtract everything that is the same out of the two images, and look only at what is different, you get a functional MRI (fMRI), that is, an image showing exactly which areas of the brain seemed to be working harder while you were trying to do that specific task.  PET scans (or positron emission tomography) work in a similar fashion, except instead of measuring the oxygen (carried by the iron) in the blood, PET scans measure glucose use (the glucose has been tagged with a radioactive fluorine atom so that it shows up in the scan).  But the same principle holds: if the PET scan is to be specific to a function, you have to have one image as a baseline and another for the test condition.  In the case of Booth, I don't dispute that thinking about the one you love leads to increased blood flow in the VTA and the caudate; there are several studies suggesting these two areas are involved in romantic love.  (One such study can be found here, or, for the popular press version, here.)  No, what I dispute is that a single PET image of Booth's brain would show the VTA and caudate working overtime, unless Booth was specifically thinking about Bones while the scan was being done.  And even then, you would really only get an image like the one they showed if a baseline image had been taken.  A minor point, I know, but when neuroscience comes up on a popular tv show, I can't help but comment.  Actually, despite the minor error, I have to commend the producers and writers of Bones for introducing some pretty cool and pretty recent neuroscience info into the show.
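If the "subtract the baseline" idea sounds abstract, here's a cartoon version in Python. The tiny 4x4 "images" are made-up numbers, not real scan data, but they show why the subtraction is the whole trick.

import numpy as np

# Pretend baseline scan: every "voxel" shows some activity, because the whole
# brain is always using oxygen and glucose.
baseline = np.full((4, 4), 5.0)

# Pretend task scan: the same background activity, plus a couple of regions
# working a bit harder during the task (imagine these two pixels are the VTA
# and the caudate).
task = baseline.copy()
task[1, 2] += 3.0
task[2, 2] += 2.0

difference = task - baseline
print(difference)
# Only the regions that changed between conditions survive the subtraction --
# which is why a single scan, with no baseline, can't tell you much by itself.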
And if you find it interesting too, here's a little more fodder for your brain: in many of these studies, for both the VTA and the caudate, it tends to be the right side of the brain that lights up, leading us to wonder what the VTA and the caudate on the left side of the brain are doing.  (And if you think that makes sense because the right side of the brain is supposedly the creative, artsy, and romantic side and the left brain is the rational side, you may have a point, or you may find a post about whether or not all that right brain/left brain stuff is really true sometime in the near future.)  The really interesting conclusion of these studies, however, is that they suggest romantic love (as opposed to familial or brotherly love, or platonic love, i.e. friendship) derives from motivational areas of the brain, not the ones involved in emotions like joy or sadness.  So, while relationships involve their fair share of joy and sadness, apparently falling in love is more like becoming addicted to cocaine...

By the way, if you're wondering why I didn't talk at all about Booth having visual hallucinations from a cerebellar tumor, it's because it is within the realm of possibility.  Though visual hallucinations are more commonly associated with tumors or lesions in the occipital lobe (in the back of the brain, where the visual cortex is located), they can occur in patients who have cerebellar tumors.  Though rare, cerebellar tumors can lead to visual hallucinations due to the fact that the cerebellum is also in the back of the brain, and a large growth can put pressure on the occipital lobe, which is right next door.  The more interesting thing is that Booth's hallucinations were actually visual and auditory, as he could hear and converse with his hallucinations; this is even more rare, as auditory processing occurs in the temporal lobe, which is a little further forward in the brain, and auditory hallucinations are most commonly associated with tumors located there.  Just like the occipital lobe is heavily involved in visual processing, and the temporal lobe in auditory processing (hearing), the cerebellum is involved in motor movements, particularly those that require coordination (like running or playing sports).  As such, most cerebellar tumors are associated with "ataxia", which is the loss of coordinated movement in the limbs; often this is revealed by an unsteady gait, by falling forward or to the side when walking, or by clumsiness when performing tasks that require coordinated movements of the hands or feet (see http://www.ncbi.nlm.nih.gov/books/bv.fcgi?rid=cmed6.section.19621 for more symptoms of brain tumors).  It is odd that Booth wouldn't show any of these symptoms and would only have the much rarer hallucinations, but it is not impossible.  Everyone is different (and every tumor is different), which is part of what makes medicine (and finding a cure for cancer) so difficult, and while it is unlikely that a patient with a cerebellar astrocytoma would present with only visual/auditory hallucinations, it can (and sometimes does) happen.