Sunday, January 31, 2010

Sunday Comics: Poor Kermit

So, today's comic is not from a comic strip, but rather a t-shirt design from Tshirt Hell: a website that takes OFFENSIVE to a whole new level.  If you are easily offended, I recommend NOT going there (of course, now you're definitely going to, aren't you?).  Anyway, I have to give credit to my esteemed colleague Dr. Bunsen Honeydew for not passing up the "teachable moment" offered by the untimely demise of poor Kermit.  I am sure that, while he may be wracked with grief, Beaker will someday appreciate the lengths to which Bunsen has gone to ensure he receives the best possible biology education...

Saturday, January 30, 2010

Random facts...

I imagine you are all familiar with the Isaac Newton quotation: "If I have seen farther, it is by standing on the shoulders of Giants."  Two interesting things about this statement: 1. Newton was paraphrasing an earlier sentiment by Bernard of Chartres:
“We are like dwarfs on the shoulders of giants, so that we can see more than they, and things at a greater distance, not by virtue of any sharpness of sight on our part, or any physical distinction, but because we are carried high and raised up by their giant size.” quoted in John of Salisbury, The Metalogicon (1159).
and 2. Some science historians (such as John Gribbin, whose book The Scientists is an absolutely fantastic read) have suggested that the comment was not so much an expression of humility as a jibe at Robert Hooke (whose Micrographia firmly established the field of microscopy).  Hooke liked to look at everything he could under the microscope, including oily films, like the kind you sometimes see in puddles that reflect rainbow-colored rings.  Several years later, Isaac Newton would describe these rainbow-colored rings, with only passing mention of Hooke's work, and the rings would come to be known as "Newton's rings".  Resentment ensued, with Hooke writing to Newton, praising him, but subtly hinting that he wanted proper recognition.  Newton replied in a letter containing the famous quote, which also claimed that Descartes, not Hooke, deserved most of the credit.  Since Hooke was a stooped man with a bit of a hunchback, Gribbin and others have proposed that Newton's "shoulders of Giants" comment was meant to be sarcastic, with Newton implying that Hooke was not only a physically diminutive man, but a "mental pygmy" (The Scientists, p. 164) as well.

Without receiving much of the credit, Hooke seems to have contributed to some of Newton's other achievements as well.  For example, Hooke proposed: "That all bodies whatsoever that are put into a direct and simple motion, will so continue to move forward in a streight line, till they are by some other effectual powers deflected..." before Newton claimed it as his 1st law of motion.

Friday, January 29, 2010

A busy few days for neuroscience

Jeez!  Skip checking ScienceDaily for a few days, and the stories really start to pile up.  Anyway, here are my picks for the neuroscience articles I found most interesting from the past few days:

This one is another retelling of the paper I linked to in a recent post, the one about how female math teachers' anxiety seems to be contagious, but only the girls seem to catch it.

This one bolsters the claim that corticotropin-releasing hormone (CRH) is involved in alcohol dependence (i.e. alcoholism, or addiction).  CRH is a hormone that is released by the hypothalamus in the brain in response to stress.  CRH then acts in the pituitary gland causing it to release, among other things, another hormone (called adrenocorticotropic hormone, or ACTH) which stimulates the production of "stress" hormones (corticosteroids).  Anyway, the study shows that drugs that block CRH activity lessen the symptoms of alcohol dependence in rats, which means they may hold some promise for treating alcoholism in humans.

This one is about Death-Associated Protein Kinase (with a name like that, how could I not link to this article?).  The "death" in the name refers to cell death, not the death of a whole person, but cell death is an important thing to understand, particularly in the brain, where stroke and other injuries can activate cell death pathways, as can neurodegenerative disorders like Parkinson's and Alzheimer's.  Figuring out a way of preventing the cells from committing mass suicide could prove beneficial for these conditions.

This one suggests that men may, on average, feel less guilt than women.  At least in Western countries (well, in Spain, where the study was conducted).  The study did look at several different age groups and found that the difference held up in all of them (with some minor variations).  I'm not really aware of any other studies that have looked at guilt in general across the sexes, so, of course, we'll need to wait and see if more research shores this up.  But in the meantime, it is still very interesting data.

And finally, this one, where researchers have shown that skin fibroblasts can be converted into neurons by activating a few genes known as transcription factors.  In 2007, Shinya Yamanaka and colleagues figured out that you could take fibroblasts (cells that can be obtained from a skin biopsy) and turn them into pluripotent stem cells, that is, stem cells that can become pretty much any type of cell in the body.  Since this remarkable discovery (which was named one of Science magazine's breakthroughs of the year, and will most likely earn Yamanaka a Nobel), numerous other researchers have tried (and several have succeeded) in using these induced pluripotent stem (iPS) cells to model neurodegenerative diseases for which there are no good animal models.  For example, researchers can study many neurodegenerative conditions in mice or rats, but some diseases, like spinal muscular atrophy (SMA), are only found in humans, and so lab animals may not provide the best insights.  Using Yamanaka's technique, researchers have been able to take skin biopsies from patients who have SMA, turn them into stem cells, and then turn those into neurons, so they can better study the disease.  This new paper shows that you don't have to revert the fibroblasts to pluripotent stem cells first; you can get them to turn directly into neurons, eliminating the middleman, so to speak, and saving a hell of a lot of time, hard work, and money in the process.  Of course, not only does this discovery make the process described above easier for studying diseases like SMA, it also suggests that different cells in the body may not be so different after all.

Thursday, January 28, 2010

Power Corrupts

"All power tends to corrupt; absolute power corrupts absolutely"
                                                                 - Lord Acton
We all tend to recognize the above statement as true (at least to some degree), but clichés and old wives' tales (so-called conventional wisdom) are rarely backed by real science or statistical methodology.  Even if we could show that the majority of people who obtain power exhibit some form of corruption (meaning we would have to catch everyone who's doing something wrong), there would still be no way (out in the real world) to determine whether power causes people to go bad, or whether corrupt people tend to be drawn to positions of power.  A recent set of experiments by Drs. Lammers and Galinsky (of Tilburg and Northwestern Universities, respectively) set out to determine what role power plays in our sense of morality.  Here's a great summary of the experiments at the Economist, and while I usually post about science that overturns "conventional wisdom", in this case it seems that Lord Acton was right, at least mostly... I think the most interesting finding in the study was not that power leads to moral hypocrisy, but that a sense of entitlement appears to be a necessary condition for the abuse of power.  So, not all power corrupts, only power combined with a sense of entitlement.

More on "how we know what isn't so"

Yesterday, I posted about the book How We Know What Isn't So by Thomas Gilovich.  As it turns out, the day before yesterday, Jonah Lehrer (author of Proust Was a Neuroscientist and How We Decide) was blogging about similar material, namely, how we selectively seek out information that conforms to what we already believe, or, if exposed to contradictory information, selectively ignore anything we don't like.  (I guess the old adage is true: we hear what we want to hear, and see what we want to see.)  Anyway, Lehrer's post is pretty good.  (Perhaps fittingly, given last night's State of the Union address,) it focuses primarily on political ideology and cable news networks (so you are probably immediately thinking Republicans and Fox News, or Democrats and MSNBC, likely depending on your own political leanings, and you'd be right, but the most interesting finding in the paper that inspired the post was that longer viewing of the Al-Jazeera news network actually correlated with LESS dogmatic beliefs, suggesting that Al-Jazeera may not be the extremist's favorite news source).

Wednesday, January 27, 2010

Are boys better at math than girls...Revisited

A little while back, I posted about the myth that boys are better at math than girls (here).  The only data to support such a claim come from standardized test scores.  HOWEVER, it has been shown that girls who are aware of the stereotype succumb to a self-fulfilling prophecy: they score worse on math tests than boys, and also worse than girls who are unaware of the negative stereotype (who, notably, don't score worse than boys).  A new study out in the Proceedings of the National Academy of Sciences has replicated these findings, and added an interesting note: it seems some of girls' anxiety about math (and math tests) is coming from their female math teachers.  This appears to add to the idea that girls need confident female mathematician role models (and takes it one step further by suggesting that having a female role model who lacks confidence in her math skills may be just as bad as not having a role model at all).  (Of course, if this myth weren't so pervasive, there would probably be fewer anxious female math teachers... just a thought.)

Book Review: How We Know What Isn't So...

I recently finished reading How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life by Cornell University psychologist Thomas Gilovich.  This is a FANTASTIC BOOK!  It is just full of information (that is, scientific data) and yet it is easy to read and loaded with entertaining stories and anecdotes.  Some of the examples that betray our "common sense" wisdom include statistics that show why we tend to believe athletes perform in hot and cold streaks, and one really great study showing how the spaghetti-western idea of the man in the black hat being the bad guy may have real-life underpinnings and consequences (it was shown that NFL teams with black uniforms are more likely to accrue penalties).  I've blogged about a lot of topics here that all seem to revolve around a central theme, and while I had originally intended for that theme to be neuroscience, it has occurred to me that this blog is more about countering the anti-science myths that seem to abound in our society (rather than just the neuroscience myths).  If, like me, you can't understand how so many people can believe things that are directly contradicted by mountains of evidence (e.g. believing that the earth is only 6000 years old, or that global warming is a hoax, etc.), this book provides some answers.  It turns out that there are some advantages to being able to make decisions on the fly rather than trying to collect all of the necessary data (so our evolutionary history has molded us into "following our gut" in many cases).  Also, we tend to filter all of the information we get to suit our preconceived beliefs (easily accepting ideas that conform to what we believe and discarding contradictory evidence).
And then there's the fact that scientific findings are so often not intuitive, or, as Robert Pirsig more eloquently put it: "The real purpose of [the] scientific method is to make sure that nature hasn't misled you into thinking you know something you actually don't know."
Other factors are obviously also at play: preconceived notions are difficult to overturn (even if we don't filter out the contradictory information); randomness is tricky, and sometimes things that can be explained by chance seem like more than just a coincidence.  Also, our memories are imperfect: we tend to remember events rather than non-events.  For example, I have heard many people claim, "How could there be global warming?  It snowed a bunch of times last winter."  But the truth is, you remember the days it snowed because they were different and stood out: maybe you couldn't get to work, or almost got in an accident, or were delayed, or sat in traffic; the list goes on.  For the past week or so here, it has been unseasonably warm (many days getting into the 50s and even 60s), but these past few weeks will likely go unremembered because, while the weather may be enjoyable, we are all still going to work and coming home at regular hours, and nothing seems out of the ordinary.  Anyway, I don't wish to get too bogged down with examples, or with explaining all of the ideas presented in the book.  If you get the chance, check it out from the library or buy a copy at the local bookstore (or local online mega-retailer).  I promise you it will be an enjoyable and enlightening read, and a pretty short one too (less than 200 pages).

Tuesday, January 26, 2010

Daydreaming may not be such a bad thing

All of us have, at some point in our lives, been scolded by a teacher to "stop daydreaming!"  But despite the conventional wisdom that a wandering mind is a bad thing, it turns out it may be helpful, particularly when it comes to creative thinking and making associations between ideas.  So perhaps Gertrude Stein was more right than she knew when she quipped:
"It takes a lot of time to be a genius, you have to sit around so much doing nothing, really doing nothing"

Monday, January 25, 2010

Lucy in the Sea with Diamonds

If you think the Beatles' song "Lucy in the Sky with Diamonds" is trippy, it's nothing compared to reality (well, reality on Neptune and Uranus, that is).  So again, this story has nothing to do with neuroscience or even biology, but it's just too friggin cool not to post.
So we all know that diamond is the hardest natural substance on earth, and it's difficult to imagine such a thing as liquid diamond, but apparently the pressure on Uranus and Neptune is so great that there may actually be whole oceans of diamond with diamond icebergs floating in them.  Diamonds are made from carbon, one of the most abundant elements around.  Usually, carbon in the form of coal (or otherwise trapped in rock) sits under the earth's surface, where it gets exposed to heat and pressure that transform some of it into the crystal lattice of diamond.  Researchers have shown in a lab (here on earth) that it is possible, under the right conditions of pressure and temperature, to liquefy diamond without causing it to lose its characteristic chemical structure (which would cause it to become graphite).  Since these conditions are achieved naturally on Uranus and Neptune, and since carbon is so abundant, it is possible that these planets have whole seas of liquid diamond with diamond "icebergs".

Sunday, January 24, 2010

Sunday Comics

Last day of the weekend... time to head back to the "rat" race...

Saturday, January 23, 2010

Songbirds Rock!

Literally.  I knew they could sing, but who knew they could play the guitar too?

(For those who don't know, the birds in the video are zebra finches, the same species that I use for most of my research because they have such truly extraordinary brains.)

Friday, January 22, 2010

A Sleep Study

So the way in which the suprachiasmatic nucleus (SCN, an area within the brain) has been thought to regulate our circadian rhythms may have just been overturned.  Circadian rhythms are the daily rhythms of our bodies (from the Latin circa = "about" or "around" and dies = "day"), most commonly associated with sleep and waking cycles, though other things fall into daily rhythms as well (levels of circulating testosterone, for example, appear to be highest in the early morning).  For a while now, it has been suspected that the SCN regulates our daily rhythms by having its neurons fire really fast during the day, and then fire more slowly at night.  A new study in the journal Science suggests that, if you look just at the neurons that are most likely the "clock" cells, they aren't firing all day, just in the morning and in the evening.  Of course, this isn't really overturning any widely held myth about neuroscience, just adding to the story.  The SCN is still likely the master clock within our brain, and many of the downstream effects of its circadian regulation (like the secretion of melatonin from the pineal gland) are still valid effectors of sleep-wake patterns and the other physiological correlates of circadian rhythms.  Here's the press release at ScienceDaily.

Wednesday, January 20, 2010

Neural Depolarization?

So, when I hear the term "neural depolarization", I immediately think of a neuron firing an action potential, which is actually called neuronal depolarization.  Apparently Google agrees with me, as it suggested "neuronal depolarization" when I started typing "neural depolarization" into the toolbar, but, if you commit, you will find out that "neural" depolarization is some sort of new age quackery.  I will refer you to an article by Stephen Barrett M.D. over at quackwatch, which is where I found out about this nonsense, and where, if you skip ahead to the end, you will see the following:
"NDP (neural depolarization) is a minor variety of "energy medicine," whose practitioners claim to heal by modifying the alleged flow of nonmaterial (non-measurable) "energy" throughout the body. NDP's practitioners also offer what they describe as nutritional and spiritual advice. They claim to be effective against a wide range of diseases, including cancer, even though no scientific studies have substantiated or even investigated such claims. The attention they give to clients may produce symptomatic relief, but I see no reason to believe that NDP itself can influence the course of any disease or that its advocates are qualified to give health advice."
 As for my google search, I could only bring myself to look at the first page that comes up, where I found several instances of bogus information:
"Every cell of our body has at it’s core a nerve, that is why we are able to feel."
While I believe the author did not mean the literal core of our cells (where we would most likely, though not always, find a nucleus), this statement is still simply false.  First of all, "nerve" can refer either to a "nerve fiber", which is a bundle of axons (the long cable part of a neuron), or to a "nerve cell" (a neuron).  Many cells in the body are not affiliated with nerve cells (or fibers); for example, blood and lymph cells in the circulation most certainly do not have a "nerve" at their core.  Red blood cells don't even have a nucleus, and they only come into contact with a "nerve" when they are dropping off oxygen and picking up carbon dioxide.  Additionally, how would neurons, or "nerve" cells, themselves have nerves at their core?  Even if we assume that this statement is meant to apply only to the skin, one neuron typically serves many, many skin cells (and most of the cells in the upper layers of the epidermis don't even come into contact with nerve endings).  But let's move on...

"...we have all cut a finger, at one time or the other, which has healed.   The cells regenerate, the skin closes up, and within a few days the finger looks and functions like nothing is amiss.    This is a simplistic and very basic example, but nevertheless an example, of self healing. This self-healing ability is true for every cell of every organ of our body." 
Again, this is simply not true... certain cells and organs, like the brain and the heart, have a very hard time regenerating.  This is why strokes and heart attacks are so debilitating and, in many cases, deadly: these organs do a poor job of repairing themselves, and most certainly do not heal like a cut finger or a broken bone.
The site goes on to claim that we need to treat our cells with "positive, loving thoughts" and that negative thoughts are toxic to our cells.  It then claims that these toxic events cause our "nerves" to become polarized, and that this polarization is bad, which is why "neural depolarization" is supposedly good.  It even goes so far as to claim that these polarized cells are the "root cause of all pain and disease that we experience".  That's ALL pain and disease (from Alzheimer's to Zellweger's)!  Okay, whenever someone tells you there is ONLY ONE underlying cause of ALL disease, you get to loudly claim "Bullshit!"  And as for that cause being the fact that your nerve cells are polarized, well, that's just ridiculous.  Almost all of your nerve cells are in a constant state of polarization, waiting to be depolarized when they are used to send a signal.  In fact, your brain devotes lots of energy to keeping your neurons in an almost permanent state of polarization (and your brain uses about 20 percent of the total energy you consume).  I don't see how you could think, move, or do anything at all if it weren't for the fact that your neurons remain polarized at rest.  Anyway, I could go on about how ridiculous this is, but I guess the point is, if someone tries to sell you on "neural depolarization", hold on to your wallet as tight as you can and run!
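For anyone curious just how un-mystical that resting polarization is: it falls out of basic electrochemistry.  Here's a minimal sketch (the function name and the ion concentrations are my own illustrative choices, using typical textbook values rather than figures from any particular study) of the Nernst equation, which gives the voltage a single ion species would pull the membrane toward:

```python
import math

def nernst_potential_mv(conc_out_mm, conc_in_mm, temp_k=310.0, valence=1):
    """Nernst equilibrium potential in millivolts: E = (R*T / z*F) * ln([out]/[in])."""
    R = 8.314      # gas constant, J/(mol*K)
    F = 96485.0    # Faraday constant, C/mol
    return 1000.0 * (R * temp_k) / (valence * F) * math.log(conc_out_mm / conc_in_mm)

# Typical mammalian K+ concentrations: ~5 mM outside the cell, ~140 mM inside.
e_k = nernst_potential_mv(5.0, 140.0)
print(f"K+ equilibrium potential: {e_k:.1f} mV")  # roughly -89 mV
```

Real neurons rest a bit above that (around -70 mV) because the membrane is also slightly permeable to sodium and other ions, and it's the Na+/K+ pump maintaining those concentration gradients that accounts for much of the brain's energy bill.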

Tuesday, January 19, 2010

Objective Science Reporting

Reporters often try to give their stories "balance" by presenting both sides of the story.  My personal opinion is that remaining objective is the highest ideal for reporters to aspire to, and, in many political and other he-said/she-said type stories, being "balanced" is a great way to remain objective.  However, many stories don't have two sides, and in these cases attempts to be "balanced" can actually undermine good fact-based reporting.  For example, I have yet to see coverage of the recent earthquake in Haiti "balanced" by an interview with some guy who claims the earthquake didn't happen.  Giving airtime to someone so obviously peddling lies would most definitely be seen, at the least, as bad reporting.  Yet whenever there is a story about science, it seems there is always some quack who can be tracked down and cited as having the "other side of the story", which would be okay, except for the fact that these contrarians almost always lack any evidence or any sort of valid credentials.  Science says the earth is round; let's find Quacky McQuackery, who has a diploma he bought on the internet from Timbuktu U, who says he's "studied" the earth by living on it his whole life, and who claims to have overwhelming evidence that the earth is flat.  Quacky also says the earth is only 6000 years old, vaccines cause autism, and global warming is false.  Now, obviously I am not the only one who feels that we shouldn't be putting Quacky up on a soapbox, and yet it seems to happen far too often, and good science gets unfairly maligned in the interest of "balance".  Here's an article by science journalist Chris Mooney, where he talks about just this sort of thing.

And of course, in the interest of balance, and in light of all of the recent ClimateGate media buzz, it amazes me that items like this don't garner the same amount of media attention as the hacked emails:
"In 1998, for instance, John H. Cushman, Jr., of The New York Times exposed an internal American Petroleum Institute memo outlining a strategy to invest millions to “maximize the impact of scientific views consistent with ours with Congress, the media and other key audiences.” Perhaps most startling, the memo cited a need to “recruit and train” scientists “who do not have a long history of visibility and/or participation in the climate change debate” to participate in media outreach and counter the mainstream scientific view."  

Sunday, January 17, 2010

One step closer to repairing the brain

It appears that a group at UC Irvine has been having some remarkable success treating animals with stroke-like brain injuries using a protein called TGF-alpha (transforming growth factor alpha).  You can read a summary of two of their studies here.  (My own research is somewhat close to this: I have been examining the potential for proteins in the TGF-beta family to minimize damage or promote repair after traumatic brain injuries.)  Now, obviously a lot of work still needs to be done to see if others can replicate this work, and to see if it will have the same effects in humans.  But the research certainly looks promising.  And any improvement would be welcome, since stroke is the number one cause of long-term disability in the U.S., and we currently lack any truly effective therapies.  I am excited to see where this research will lead, and to see whether we can have an effective treatment for stroke victims (and perhaps traumatic brain injury patients) sometime in the next 5 to 10 years.

Sunday Comics: the limit of human intelligence

This week: a classic Far Side cartoon....

Friday, January 15, 2010

Star Shaped Cells Strengthen Synapses

A new article out this week in Nature adds to the growing evidence for the importance of glial cells in the brain.  (Here's the coverage at The Scientist.)  I've posted before about the myth that we only use 10 percent of our brains, where I cited, as one of the possible origins of the myth, the fact that only about 10 percent of the cells in our brain are neurons, while the other 90 percent are these other cell types, collectively called glia.  Another idea that I have posted about is the Neuron Doctrine, the long-held belief in neurobiology that most, if not all, of the functions of the brain are the result of neuronal signaling (neurons firing action potentials and transferring those impulses across synapses using chemical neurotransmitters).  When both of these ideas are taken together, one can mistakenly get the idea that we only use 10 percent of our brains.  While I don't mean to beat a dead horse here, I do like to point out whenever new evidence shows that glial cells are critical to processes previously thought to be exclusively neuronal.  Along those lines, this new article describes how astrocytes (a subtype of glial cells that happen to be somewhat star-shaped) play an important (and possibly even necessary) role in strengthening the synaptic connections between cells in a process called long-term potentiation (LTP).  LTP is thought to be the main mechanism by which new neuronal circuits are established, and by which things like learning and memory are accomplished at the cellular level.  Thus, glial cells continue to demonstrate how important they are, even in higher cognitive functions.

Thursday, January 14, 2010

Has the internet changed the way you think?

I posted a little while ago about how the internet hasn't really changed our social networks or made us more isolated, as many people have thought.  Now, an annual question has been put to some well-known thinkers, and this year it's: "How has the internet changed the way you think?"  You can check out the responses here.


So I've been looking over the Pew Research Center's study (out earlier this year) concerning the public perception of science, and one of the interesting things I noticed right away is the change in the percentage of people surveyed who accept evolution.  In a similar survey conducted in 2006, when asked whether "Humans and other living things have evolved over time", 51% of respondents agreed with the statement.  Meanwhile, 42% preferred the statement that humans and other living things have existed in their present form only, and 7% fell under the heading "Don't know".  In the 2009 study, 61% agreed that "Humans and other living things have evolved over time", while 31% chose the "existed in present form only" option, and 8% responded "don't know" or failed to respond.  That's a 10-point increase in people accepting evolution and an 11-point decrease in those denying it!  Hooray for progress!
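Just to make the arithmetic explicit, the shifts between the two surveys (measured in percentage points, 2009 minus 2006) can be tallied in a couple of lines, using the figures quoted above:

```python
# Response shares from the two Pew surveys described above (percent of respondents).
survey_2006 = {"evolved": 51, "present form only": 42, "don't know": 7}
survey_2009 = {"evolved": 61, "present form only": 31, "don't know": 8}

# Percentage-point change for each response option.
changes = {k: survey_2009[k] - survey_2006[k] for k in survey_2006}
print(changes)  # {'evolved': 10, 'present form only': -11, "don't know": 1}
```

Note these are percentage-point differences, not relative percent changes (relatively speaking, the jump in acceptance is 10/51, or about a 20% increase).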

Tuesday, January 12, 2010

Pop Quiz!

The Pew Research Center has a very brief online quiz to test your science knowledge (it's only 12 questions, all multiple choice or true/false)...  My only gripe is that the quiz really only tests whether or not you know some basic science facts and recent science headlines.  I would like to see a survey on how much people actually understand about the scientific method and the process and practice of science in our society.  (I hypothesize that a better understanding of what science is, how it works, and the rigorous process that scientists go through when it comes to publishing and obtaining funding, etc., would lead to less misunderstanding between scientists and the public, à la evolution, global warming, and others.  Of course, this is just wild speculation on my part.)  Anyway, the quiz is part of a poll that the center conducted to determine the public's attitude toward science and scientists, and, in turn, scientists' opinions of the public and their science knowledge and understanding.  I will probably write a separate post about the results from that poll, but in the meantime, enjoy the quiz.

Is the internet causing us to withdraw from the real world?

For the past several decades, technologies such as television, video games, and, most recently, the internet have been blamed for our increasing social isolation.  A recent survey by the Pew Research Center suggests that, counter to popular belief, neither the internet, online social networking, nor mobile phone usage (i.e. texting) is to blame.  The study does seem to support the idea that Americans have become (slightly) more socially isolated: since 1985, the average network of close confidants has shrunk by about one third (which apparently equals one person).  However, the study goes on to show that, if anything, the internet provides us with larger social networks of people with whom we can discuss issues that are important to us, and, counter to our assumptions, we are still likely to see our close friends and confidants in person on a regular basis (on average, 210 days out of the year).  Also, text messaging, despite popular belief, has not replaced talking on the phone, at least if you're talking on a cell phone (texting is tied with landline phone calls as the third most popular way to contact friends, with meeting in person being number one, and talking on mobile phones number two).  Even writing cards and letters (you know, using an actual mailbox, envelopes, and stamps) is a more popular way of staying in touch than emailing, IM-ing, or online social networking.  So why do the internet and its social networking sites get such a bad rap?  Well, people who use social networking sites do seem to use them to maintain their existing real-world groups of friends and confidants (and, perhaps sadly, to avoid having to build new networks wherever they move).  For example, social networking website users are 30% less likely to know at least some of their neighbors, and 26% less likely to use their neighbors as a source of companionship, than those who do not use networking sites.
The internet, on the other hand, is guilty by association only, since internet users as a whole are not less likely to know or befriend their neighbors than those who do not use the internet. Of course, before we jump to the conclusion that social networking sites are "bad", it is important to note that: when the online social networking sites are neighborhood sites "participants tend to have very high levels of local engagement", forming tight relationships with their neighbors.  Finally, the question that started this post: Does the internet lead to users withdrawing from the outside world?  Again, it seems that our assumptions betray us as internet users are 42% more likely to visit a public park or plaza than non-internet users, suggesting that internet users are not more likely to withdraw into their homes and spend all of their time in front of their computers.  (And yes, that was after controlling for population density, in case you were thinking that people in rural communities might be less likely to have internet and less likely to go to a public park rather than their front yards)

Here's a link to the study again (and here for the full report); it also has some interesting things to say about the makeup of our social networks (in terms of diversity and other factors), our usage of mobile phones, as well as some analyses of bloggers versus internet users who do not blog, and much more.

Sunday, January 10, 2010

Sunday Morning Comics

Today's comic comes from Saturday Morning Breakfast Cereal.  Somehow it just seems to fit with the other posts from this week.

Saturday, January 9, 2010

In light of the recent post on Texas and Textbooks....

where one of the proponents for changing the curriculum actually said something to the effect of "that critical thinking stuff is gobbledygook!"  I humbly submit this excellent video.  It's so good, it makes me want to trash talk anyone who would call critical thinking "gobbledygook"...

Suck it, non-critical thinkers! (sorry, I couldn't help myself)

Second Saturday is Celebrity Saturday

In another attempt to enhance the news magazine experience here at CH&H, I'm adding another recurring piece: "Celebrity Saturday", where, for as long as I can keep doing so, I will post about (mostly neuro) scientists who are related to celebrities (or are themselves celebrities) on the second Saturday of each month...

For the first installment of Celebrity Saturday, we have Simon Baron-Cohen, Ph.D., whose cousin Sacha Baron Cohen is the comic actor known for his roles as Borat, Bruno, and Ali G.  And if you think Sacha's resume is impressive, you should check out Simon's: he is a Professor of Developmental Psychopathology at Trinity College, Cambridge in the U.K. and Director of the Autism Research Centre at Cambridge University.  He has published numerous papers in high-impact journals, and has been honored with several awards and academic distinctions.  In case you were confused, that's Simon on the left, and Sacha on the right.

Thursday, January 7, 2010

I stand corrected....

In my recent post on the Texas board of education's attempts to re-write the history books, I attributed the quotation "History is a set of lies agreed upon" to Napoleon Bonaparte, but according to the Yale Book of Quotations, the origin of that sentiment can actually be traced to Voltaire in 1764.  Just goes to show, you can't believe everything you read on the internets.

Do you ever wish you could turn your brain off?

Well now, you may just get your chance:

(Some mornings, I wake up and wonder if I've forgotten to turn mine on.)

Texas is re-writing history

And biology, and English language arts... So, it's nothing new to hear that some "conservative" members of the state board of education in Texas want to get biblical creationism written into the biology textbooks and get evolution written out of them.  What is new (and I feel like this should be something in one of the Freakonomics books) is how the economic woes of California might leave the Texas state board of education holding the power to determine what textbooks look like ALL ACROSS THE COUNTRY.  California, being the most heavily populated state in the Union, typically dominates in terms of the number of textbooks purchased.  Texas, being the second most heavily populated state, buys the next largest number of textbooks.  California typically buys so many textbooks that publishers must either give Texas whatever version California wants, or publish two versions of the book.  However, publishing two versions of a book is obviously more expensive than publishing one, and in most cases this extra cost is so prohibitive that Texas has to take the books in whatever form California wants them (especially if New York, number three on the list of heavily populated states, agrees).  Now, we all know that California, as a result of its huge deficits and the economic downturn, is basically bankrupt and in desperate need of cutting costs to stay afloat.  One easy way to cut costs is to tell all the public schools to just use the older textbooks and hold off on ordering new ones.  This means that Texas will likely become the number one purchaser of textbooks in the U.S., and, as a result, the rest of us may have to just accept the Texas versions of our textbooks, or, like California, keep using older and possibly out-of-date texts.
Now, why should you care?  Or, why do I care enough to post about it here (when I usually try to stick to neuroscience stuff)?  Well, let's look at the desire to remove evolution from biology textbooks first... Neuroscience is a branch of biology, and as the famous geneticist Theodosius Dobzhansky once commented: "Nothing in biology makes sense, except in the light of evolution."  Evolution provides the underlying framework for ALL of biology, including neuroscience, and it is critical for understanding human psychology and animal behavior, as well as several types of nervous system diseases and disorders.  But beyond this, teaching kids not to seek out natural explanations for the unknown stifles curiosity and stymies scientific inquiry and progress.  In an age where we need solutions to global warming, the energy crisis, and cures for numerous diseases and injuries, I would prefer to be training new, inquisitive scientists rather than a crop of fatalists who say "why bother?"  Ultimately, however, I suspect that the next major attack on science (following the recent salvos against evolution and global warming) will be against the field of neuroscience, where materialist explanations for things like "free will" and love are replacing medieval ideas about them originating from an immaterial "soul".
But wait, there's more... if you read this article about Don McLeroy and the rest of the conservative members of the Texas board of education, you will see that they don't just want to change the biology textbooks, they want to change the history books as well (and the English books, and really the aims of education itself).  Apparently, McLeroy and his ilk believe as French dictator Napoleon Bonaparte did when he said "History is a set of lies agreed upon."  Their proposed amendments include adding more religious documents to history texts to try to paint America's founding as that of a Christian nation (and thus tear down the very clearly established wall in the Constitution that separates church and state).  And they want to "play up clashes with Islamic cultures" so as to institutionalize prejudice and hate (and bolster support for more oily holy wars).  (And these are the people who call Obama a Nazi?)  As for the health science books, members of the board have sought in the past to change a photo of a woman with a briefcase to one of a woman baking... I imagine the next step is to remove any references to the feminist movement or women's rights from the history books (women's suffrage?  what's that?).  My favorite quotation by far, though, was this:
The ultraconservatives argued that they were too light on basics like grammar and too heavy on reading comprehension and critical thinking. “This critical-thinking stuff is gobbledygook,” grumbled David Bradley, an insurance salesman with no college degree, who often acts as the faction’s enforcer.
Yeah, we don't want to teach kids to think for themselves, they're too hard to brainwash when they do that.
SO... what is there to do?  Well, I suppose we will have to just sit back and wait to see whether or not California decides to buy any textbooks, or if Texas succeeds in getting publishers to leave some things out of their books... Of course, if you wanted to take action, the best course would be to join the National Center for Science Education, the group most active in combating the weakening of science education standards.  As for any changes to the history books, perhaps the ACLU might be the next best place to go.

Wednesday, January 6, 2010

This is a great blog...

Not this one, well, I mean, I'd like to think that this blog is a great blog, but I'm usually not so comfortable with shameless self-promotion (except when I tell people to buy my book... I haven't written a book yet, but when I do, you should buy it).  Anyway, I'm actually referring to the blog "Bad Astronomy" over at DiscoverBlogs, which I have referenced here before, and am now directing you to again.  Go, now, as I let Phil Plait do more of my heavy lifting and deliver more comeuppance to the anti-vaccination nuts (who apparently are also concerned about diet and GI diseases causing autism... claims for which there is also no supporting evidence).
My favorite lines from the post:
 McCarthy confuses anecdotes with data. As I have said before, anecdotes are where you start an investigation, not where you finish one. That’s the difference between science (aka reality) and nonsense. 

Are men better at math than women?

This is a myth that has been around for a while; likely it is the result of societies where women are seen as the "lesser sex".  Like the U.S. circa, well, almost all of our history as a country, throughout most of which there were a lot fewer women than men in math-oriented careers (like engineering, physics, or mathematics).  The shame is that this myth seems to have become self-perpetuating.  That is, despite the fact that no real evidence emerged to show that men are any better at math than women*, some evidence suggested that girls who were aware of the negative stereotype (that they are supposedly worse at math than boys) tended to do worse on standardized math tests than girls who were unaware of the stereotype.  A new study (really a compilation of other studies allowing for a meta-analysis) further bears this out.  It shows that differences between the sexes in math ability are more likely a reflection of the country (or local environment) in which the subjects live, rather than a reflection of some hidden math prowess conferred by having a Y chromosome.  The study found that, in countries where women were treated as lesser than men, and where there were few positive female role models (in terms of math and science), girls performed worse than boys on the math tests, BUT, in countries where men and women are on more equal footing, and there are more prominent female role models, the girls actually did as well as or BETTER than the boys.  You go, girls!
*sure, there were a couple of studies showing that boys did better on standardized math tests than girls, but then there's a ton of data showing that girls tend to get better grades than boys, including grades in math and science classes.

Monday, January 4, 2010

Pessimism, Ageism, and Death

How do you feel about the elderly?  Do you think they are less intelligent?  Or absent-minded?  Or bad drivers?  Well, if you do, you may be more likely to have a heart attack or stroke than someone who doesn't hold such views...

A study out last year (Levy et al. 2009) showed that people under the age of 50 who had negative thoughts about old age (or, more specifically, the elderly) were more likely to have cardiac events later in life.  Sampling from the Baltimore Longitudinal Study of Aging, researchers found that people who regarded the elderly as "less intelligent" or "absent-minded" were almost twice as likely to have heart attacks or strokes as those who held more positive attitudes.  So what do your opinions about old people have to do with your health as you age?  Well, it is likely that your beliefs in and of themselves are not necessarily bringing about ill health; rather, people who hold more negative stereotypes may be more likely to suffer from greater stress or poor coping skills, or it may be that their poor attitudes reflect their already not feeling as healthy, energetic, or happy as those who have more positive opinions of the elderly.  It may be that people who have better opinions of the elderly are also people who tend to exercise more, or eat healthier foods... there is no real way of knowing whether healthy people are more optimistic about growing old because they are healthy, or whether their positive outlook on life actually enhances their health and longevity.
If I had to guess, I would say your best bet is to focus on the things that we know positively impact your health (get more exercise, eat right, get better sleep, keep up positive relationships with friends, etc.) and if you do that, you will likely improve the way you feel, and your outlook on things, probably even on getting old and the elderly (and other people) in general... Or, you could just try telling yourself over and over again that old people are just as capable as young people and hope for the best.  Let me know which one works out better.

Sunday, January 3, 2010

Sunday Comics: why I don't use a monocular microscope

In a new feature, I've decided to share some of my favorite nerdy and/or sciencey comics.  Hopefully I will be able to post something every Sunday morning... although, while there is certainly no shortage of science cartoons, the supply of funny science cartoons may be more limited...

Saturday, January 2, 2010

Does Ginkgo biloba improve your memory or prevent cognitive decline?

Apparently not.  For years, anecdotal claims that supplementing with Ginkgo biloba could enhance your memory or prevent dementia in old age have fed an almost 1-billion-dollar-a-year industry (that's for Ginkgo alone, not the herbal supplement industry as a whole, which takes in 34 billion dollars a year... that is, in the U.S.A. alone; who knows how much gets raked in worldwide, but with demand throughout Asia for things like rhino horn and seal penis driving their respective source species to extinction, one can only imagine it must be a whole helluva lot).  Anyway, I wish I could say I came up with exposing this neuromyth all by myself, but I was first tipped off by Phil Plait at Discover blogs.  Here's the link to a recent article in USA Today relating the latest research.  I doubt that this report will have much of an impact, because there will still be some people who swear by the supplement (just like there are plenty of people who swear by homeopathy and psychic healers and other sorts of quackery).

Friday, January 1, 2010

Merry New Year!

And what a great start we're off to!  I just finished watching Penn State (my alma mater) beat LSU in the Capital One Bowl (aka the Citrus Bowl); it was definitely a close one, but a win is a win!  It also reminded me of a study from a while back that showed how watching our favorite sports team can do more than just cause us to drink beer, eat bad food, and jump up and down on the couch.  A study by Bernhardt et al. (1998) showed that when men watch their favorite teams win, their testosterone levels go up.  If their team loses, their levels go down.  The study tested men after watching basketball and soccer (football) matches, but I would imagine the effect extends to (American) football games as well (though that remains to be tested).  As for all the women sports fans out there, I think it would be interesting to see what's going on with your testosterone levels too (no reason to be sexist), as well as what the levels of other steroid hormones (particularly stress hormones) look like in both men and women.  Some studies have shown that watching sports increases the risk for and incidence of heart attacks, so it would be interesting to see what the hormone levels look like, which might provide some insight as to whether stress is more or less of a heart risk factor than the beer, nachos, wings, or chili cheese dogs we tend to consume while watching our favorite sports.  Also, I don't know what the increased testosterone may be doing in terms of affecting behavior or perhaps the way that we think (affecting things like confidence or aggression), but these would also be interesting things to look into, particularly if you were interested in studying sports hooligans and riots.  Anyway, now I plan on watching the rest of the Rose Bowl and eating some more crappy food, so I hope you have a "Happy New Year!" whomever you're rooting for.