Thursday, November 7, 2013

Pop and Rock Music Renew Energy

According to Dr. Steve Dunn and Dr. James Durrant, scientists researching solar energy at Queen Mary University of London and Imperial College London, playing pop or rock music increases the efficiency of solar panels. They have been studying the effects of sound waves on solar panels for quite some time, focusing on classical music when they worked with music at all. The sound waves of pop and rock music set off vibrations that increase energy production in solar cells built from clusters of “nanorods.” These nanorods are made from zinc oxide and coated with an active polymer so that they can convert sunlight into energy; billions of the tiny rods are required for the process. By playing the right kind of tunes, production efficiency appears to increase by roughly 40%.
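To put that figure in perspective, the 40% gain is relative, not absolute. A quick back-of-the-envelope sketch in Python (the 1.2% baseline efficiency is my own placeholder for illustration, not a figure from the article):

```python
# Illustration of what a ~40% *relative* efficiency gain means.
# The baseline value is a made-up placeholder, not from the article.
baseline_efficiency = 0.012   # hypothetical 1.2% light-to-power conversion
boost = 0.40                  # ~40% relative improvement reported

with_music = baseline_efficiency * (1 + boost)
print(f"{baseline_efficiency:.1%} -> {with_music:.2%}")  # prints "1.2% -> 1.68%"
```

So even a large relative jump still leaves these polymer cells well below the efficiency of conventional silicon panels; the interesting part is the mechanism, not the raw output.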

There are similar experiments that have converted sound vibrations into electricity, but the method here is different. First, pop music is an unorthodox choice of sound wave. The higher-pitched tones associated with it proved far more effective than the lower frequencies of classical music, which produced little effect. This is also the first experiment to use sound energy to boost the conversion of light into electricity.

While this may not seem extremely important at the moment, the implications are significant. This discovery opens many more doors into the world of pop/rock outside of your favorite radio station. On a more general level, sound wave technology as a way of increasing energy output or efficiency is key here—if it can be useful at this low level, who says it can’t make a big difference further up the chain?


I had no idea sound waves could be harnessed in this way. It is plausible to me—there are already instances where similar reactions have been observed, and this is only building on previous knowledge.

Thursday, October 17, 2013

Brain Meds


Remembering to take medicine is a constant struggle. People who have to take a pill (or more!) every day have more than likely forgotten on one occasion or another. What if this common problem could be solved completely—no more pills, no more forgetting, no more mixing up which pill you are actually supposed to take? Scientists are developing this very thing so we don’t have to do the remembering anymore.

They are trying to create a cellular response that will release the medication on its own. Basically, a cell that is typically triggered by dopamine would release the medication into the bloodstream. Neurotransmitters like dopamine carry signals from one neuron across a synapse to the next, relaying messages to the appropriate receiving cells so that signals like “I’m hungry” don’t go to the wrong organs. Dopamine in particular is released in response to “rewarding experiences” such as food or sex.

So what could this mean for the future of drugs? Well, a couple of things. No, pharmaceutical companies would not go away with the abolition of pills (trust me, they will find a way to make money whether it is through pills or neurotransmitters). It would, however, change the way medicine is given. Do we simply have all of our medication implanted at the doctor’s office? Would there be any need for Walgreens or CVS without pharmacists to relay medicine between you and your doctor?

It might even have an interesting impact on drug addiction. Many addictive drugs enhance the effects of dopamine, which is part of the reason they are sought after. Perhaps a drug that is released without your physical control would curb addictions or take the work out of breaking a habit—there would be no choice but to take the appropriate amount. Of course, for this to work scientists will have to find a way to keep medicine from being released every time dopamine is; it would not be good to have medicine released with every meal, sexual encounter, or even stressful situation (where dopamine is also released).


The future of medicine is extremely promising. Believe me, I am all for taking the thought out of prescription medicine, and I’m sure many other people are too. This could mean no more accidental overdoses, no more forgotten doses, and no more mixed-up medications.

Monday, September 23, 2013

Scientific Method: Not for Scientists?

The scientific method is so ingrained in school children that they could practically recite it in their sleep—question, hypothesis, test, observation, results, conclusion—these are skills understood by fourth graders and repeated until high school graduation. This is why I found it so surprising when I surveyed four professors of science and each had a different opinion of its effectiveness. Effectiveness? How can this even be up for discussion? I spent ten years in school learning how effective the scientific method is, and now suddenly that means nothing?

Well, yes and no. Only one professor responded that they do not use it at all. Dr. Kryzsiak believes that the traditional scientific method is not very useful in her biochemistry research. She says that while she uses “components” of the scientific method, she doesn’t follow it to a tee. For example, she poses a scientific question but offers no hypothesis. She is not alone, either; Dr. Burns does not question or hypothesize but instead gets “scientific ideas” from field work. Dr. Bulinski and Dr. Sinski have something in common with Dr. Burns too—all three include a step where they read scholarly papers on the subject they want to study to make sure they are not repeating research. This step also refines exactly what they want to study, and sometimes changes their direction entirely.

Dr. Bulinski cited using the scientific method the most out of the four scientists surveyed, yet she still puts a twist on it, because her research differs from the traditional lab work of chemists and biologists. As a paleontologist, she has to plan trips to sites where she can collect the specimens she needs to study. For her research, she follows every step of the method closely—right down to her hypothesis.

After reviewing each response, it seems that the use of the scientific method may just be personal preference. What works for some sciences and scientists may not work for others. Dr. Sinski made the clearest statement when he wrote that “it is not so much a method as it is common sense.” So while these scientists may not be recording every hypothesis and step in their process, they are making predictions and running experiments constantly, without even realizing it.

Wednesday, September 4, 2013

The Age of Denial?

In the New York Times article “Welcome to the Age of Denial,” Adam Frank argues that Americans are no longer grasping the importance of science.  He is not very effective in making this argument, however, because at most it seems he is grasping at straws—trying to make a bigger deal out of this situation than it deserves. Instead of “sending his students into a world that celebrates what science has to offer” he is tossing them into a world that apparently equates scientific knowledge with astrology, if I am to use the same dramatics as Frank.

His use of statistics to “shock” works against him. First, the two percent increase in creationists does not feel dramatic, much less relevant to his cause—but more on that in a minute. The five percent difference in knowledge of climate change is greater, but I would question the validity of the numbers, given the fad that “going green” has become in the last five years. From these figures, Frank argues that people trust scientific research less (by doubting vaccinations, for example) and that their lack of respect for “scientific fact” has led creationism to develop into “creation science.”


While I agree that some “anti-science” causes (by Frank’s definition), such as the decreased use of vaccinations in some areas of the country, are cause for alarm, I do not believe them to be part of some bigger, more generalized issue. While religion does go head to head with science, it is not the heart of the problem either. Frank declares that people understand and care less—but fails to pin down a defined reason why. His grand belief that American society is entering a “dark moment” in history is over-dramatized and desperate.