

Now, if you look at the right side of the picture above, you see that water enters the atmosphere by evaporating from large bodies of water like oceans... WHEN THE WATER IS WARMER, MORE OF IT EVAPORATES and enters the atmosphere. As you follow the arrows at the top of the figure, moving to the left, you see that all of that moisture in the atmosphere condenses when it hits cold air. So you see, as global warming continues, we should see more precipitation (including snow), because the oceans will continue to get warmer, more water will evaporate, and the air will fill with moisture that, when it condenses, will fall to the ground as snow, sleet, or rain. In fact, it is exactly because surface temperatures in the Pacific Ocean are warmer that we got so much snow (see figure to the right). Of course, that warming is because of El Niño, not so much because of global warming, but it illustrates the point perfectly... WARMER WATER = MORE SNOW.
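To put a rough number on "warmer water = more moisture": the Clausius-Clapeyron relation implies the atmosphere can hold roughly 6-7% more water vapor per degree Celsius of warming. Here's a minimal Python sketch using Bolton's empirical approximation for saturation vapor pressure; the specific temperatures are just for illustration, not taken from the post:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Bolton's (1980) approximation for saturation vapor pressure
    over water, in hPa, for a temperature in degrees Celsius."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

# Illustrative sea-surface temperatures: a 14 C baseline versus a
# one-degree-warmer 15 C (hypothetical values, for demonstration only).
e_cool = saturation_vapor_pressure(14.0)
e_warm = saturation_vapor_pressure(15.0)

print(f"14 C: {e_cool:.1f} hPa at saturation")
print(f"15 C: {e_warm:.1f} hPa at saturation")
print(f"One degree of warming = {100 * (e_warm / e_cool - 1):.1f}% more moisture available")
```

That extra several percent of available moisture per degree is the link between a warmer ocean surface and heavier precipitation, whether it ends up falling as rain or as snow.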
Certainly not if you use it to read sites like this one! But seriously, when I was a kid it was TV, then it was video games, and now the internet seems to be the target of everyone's brain-deteriorating fears. The truth is, people have been afraid of the intellect-destroying potential of new technologies for a long time... for example, according to an article at Slate.com:

A respected Swiss scientist, Conrad Gessner, might have been the first to raise the alarm about the effects of information overload. In a landmark book, he described how the modern world overwhelmed people with data and that this overabundance was both "confusing and harmful" to the mind. The media now echo his concerns with reports on the unprecedented risks of living in an "always on" digital environment. It's worth noting that Gessner, for his part, never once used e-mail and was completely ignorant about computers. That's not because he was a technophobe but because he died in 1565. His warnings referred to the seemingly unmanageable flood of information unleashed by the printing press.

But the fear goes back even further than that:
Socrates famously warned against writing because it would "create forgetfulness in the learners' souls, because they will not use their memories." He also advised that children can't distinguish fantasy from reality, so parents should only allow them to hear wholesome allegories and not "improper" tales, lest their development go astray.

As time marched on, asking kids to leave their homes and actually go to school was seen as a threat to their developing minds, then it was the radio, and then, of course, television, and now, computers and them darned interwebs...
By the end of the 20th century, personal computers had entered our homes, the Internet was a global phenomenon, and almost identical worries were widely broadcast through chilling headlines: CNN reported that "Email 'hurts IQ more than pot'," the Telegraph that "Twitter and Facebook could harm moral values" and that the "Facebook and MySpace generation 'cannot form relationships'," and the Daily Mail ran a piece on "How using Facebook could raise your risk of cancer." Not a single shred of evidence underlies these stories, but they make headlines across the world because they echo our recurrent fears about new technology.

But, not to worry, there is good news (and it is not just that cannabis use won't permanently lessen your IQ... unless, of course, you are a chronic chronic-user). As far as the internet is concerned...
There is, in fact, a host of research that directly tackles these issues. To date, studies suggest there is no consistent evidence that the Internet causes mental problems. If anything, the data show that people who use social networking sites actually tend to have better offline social lives, while those who play computer games are better than nongamers at absorbing and reacting to information with no loss of accuracy or increased impulsiveness.

To add to this, I posted about the social networking bit a little while ago, and I also seem to remember seeing an article not too long ago suggesting that surfing the web could actually improve cognition. However, despite all of this seeming positivity, there is some bad news... it looks like TV doesn't fare as well in the research...
In contrast, the accumulation of many years of evidence suggests that heavy television viewing does appear to have a negative effect on our health and our ability to concentrate. We almost never hear about these sorts of studies anymore because television is old hat, technology scares need to be novel, and evidence that something is safe just doesn't make the grade in the shock-horror media agenda.

Of course, I am half-inclined to chalk these results up to the ills of excess rather than TV itself (as there are mixed results on the supposed ill effects of TV viewing, even during critical periods of learning and development: TV is bad. TV is not bad.). As another example of this, the internet gets mostly positive reviews, but there is some evidence to suggest that overuse (aka internet addiction) may be linked to depression, or that the internet itself is more addictive than gambling.
Apparently, in Britain, cat owners are more likely to have a college degree than dog owners. Why might this be? Well, it is hypothesized that people with college educations have jobs or lifestyles that leave them less time for more needy pets... like dogs. (I wonder what having a bird or a fish says about your education?) Anyway, apparently there is a cultural element to this as well, seeing as how, in the United States, there does not appear to be a similar trend, as reported by Justin Wolfers at the Freakonomics Blog:

"unlike Britain, there's no educational gradient here. In the U.S., 31.5 percent of cat owners have college degrees, which is statistically, insignificantly larger (i.e. no different) than the 30.1 percent of dog owners who hold diplomas. (These numbers are lower than the British numbers, partly because I'm referring to the qualifications of the respondent, not the maximum qualification in the household.) There are no real income differences to speak of, as both cat and dog owners are each as likely as the other to be in either the top or bottom income quartile."

And then, there's also this fun link to a press release for an article claiming that people who define themselves as "dog people" are more extraverted, more agreeable, and more conscientious than self-described "cat people."
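For what it's worth, you can see why a gap like 31.5 percent versus 30.1 percent washes out statistically. Here's a quick Python sketch of a two-proportion z-test; the sample sizes are entirely hypothetical, since the post doesn't report them:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical sample sizes (not reported in the post):
# 1,000 cat owners and 1,000 dog owners.
z = two_proportion_z(0.315, 1000, 0.301, 1000)
print(f"z = {z:.2f}")  # ~0.68, well below the ~1.96 needed for p < 0.05
```

Even with a thousand owners in each group, a 1.4-point gap doesn't come close to significance, which is exactly what "statistically, insignificantly larger (i.e. no different)" means.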
A new study out suggests that (medical) marijuana may not help to prevent cognitive decline or to lessen the accumulation of amyloid protein deposits in a mouse model of Alzheimer's disease. Now, before you go crazy laughing and wondering why scientists do studies to find answers that anyone with a little common sense could give you, there is actually a precedent. Marijuana acts in the brain on cells that express receptors usually attuned to molecules that the brain makes, called endocannabinoids (endogenous cannabis-like molecules). Endocannabinoids have been shown to help keep brain cells alive under "stress" (stress being a catchall term for conditions known to kill nerve cells), and a compound called HU210, which activates cannabinoid receptors much more potently than marijuana, had been shown in previous studies using rats to have some efficacy in keeping neurons alive under the types of "stress" the brains of Alzheimer's patients experience. What I like best about this study is that it shows how science gives you answers even when they're not what you want to hear:

"As scientists, we begin every study hoping to be able to confirm beneficial effects of potential therapies, and we hoped to confirm this for the use of medical marijuana in treating Alzheimer's disease," says Song, a member of the Brain Research Centre at UBC and VCH Research Institute and Director of Townsend Family Laboratories at UBC. "But we didn't see any benefit at all. Instead, our study pointed to some detrimental effects."

So, if previous studies showed some benefit, but this one showed no benefit, which ones should we believe? Well, obviously, more studies should be conducted, preferably on humans... I might just know some people who would volunteer to be studied.
10. Science Comics #1 (Feb 1940). Okay, so the "Science" in Science Comics is likely referring to science fiction rather than real science, but since this is a list of the top "science comic book" covers, it seems like I would be a little remiss not to include any covers from a series actually called Science Comics. Plus, with "Electro" breaking through a steel wall to save the damsel in Bond-villainesque distress, this cover seemed much more exciting than some of the others that were up for consideration.
4. Dignifying Science (2003). Another graphic novel from GT Labs, and this one is obviously well needed and well deserved. I would probably put it up at number 1, except the one thing I don't like is how the cover model looks like just that (a model). Why does she have to be all dressed up, in front of the dressing room mirror, to do science? And while her test tubes and other equipment suggest heavy-duty science, I can't help but notice that the set she is holding makes it look like she is spritzing herself with perfume. It's as if the artist is saying, "sure, women can be scientists, but their first priority is looking pretty," which, to me, seems a tad self-defeating. Other than that, it's a great book, profiling the likes of Marie Curie and Rosalind Franklin, and I believe it was even nominated for an Eisner Award. You can check out some of the contents here.
So, I've posted before about how there is absolutely no substantive evidence to back the claim that vaccines cause autism. The one and only study to make such a claim was published in 1998 by Dr. Andrew Wakefield, who has been all over the news this week in the wake of investigations into unethical research practices. Now, even before the revelations of the past week, many people had reason to doubt the study that kicked off the anti-vaccination movement. For example, Wakefield had a huge financial conflict of interest: he was developing an alternative to traditional vaccines at the time he published the study. Also, the original study only included 12 children, which is too small a sample to infer anything meaningful about the population at large. And that would be true even if the data could be trusted, but since it seems that much of the data was faked, I guess the point is moot. It's no wonder that the rest of Wakefield's co-authors on the study demurred from the conclusions he had drawn, and most of them withdrew their names from the paper. In addition to all of this, however, the past week has been even more enlightening... an investigation into Wakefield's practices by Britain's General Medical Council (GMC) revealed unethical practices by the good doctor, including giving painful spinal taps to children in the study "without clinical reason". Additionally, in the past week, the medical journal The Lancet formally retracted the paper from its publication record. I wish that all of this would finally convince people that there is no truth to Wakefield's ridiculous claims, but sadly, as this article in the London Times points out, many of the antivaxers will only see this as "a setup" or a witch hunt, and the "persecution" of Andrew Wakefield will only strengthen their belief that he is right.
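To make the sample-size point concrete: with only 12 children, even an exact confidence interval on any observed proportion is enormous. Here's a rough Python sketch; the 8-of-12 count is purely hypothetical and not taken from the study:

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a binomial
    proportion with k successes out of n trials (default 95%)."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

# Hypothetical count: suppose 8 of the 12 children had shown some symptom.
lo, hi = clopper_pearson(8, 12)
print(f"Observed 8/12 = 0.67; 95% CI runs from {lo:.2f} to {hi:.2f}")
```

The resulting interval spans roughly 0.35 to 0.90, i.e., almost any underlying rate is consistent with the data, which is why 12 children can't support a population-level claim even before you get to the faked data.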