November 2012 Archives

Full Moon Myth Debunked



"It must be a full moon."

This is a common phrase in professions that deal with emergencies, from police officers to emergency room nurses and doctors. Many claim that during a full moon, people tend to act strangely and more aggressively. The legend is that the full moon brings out the worst in people: more accidents, more violence, and more reckless behavior. Is there any scientific evidence behind this popular myth?

Studies have analyzed hospital records to look for a correlation between periods of high occupancy and the lunar cycle. Researchers examined over 160,000 records of emergency room visits and found no difference between a full moon and any other night. The same result was found for surgery: doctors are no more likely to make a mistake at any point in the lunar cycle. A study conducted in 2005 by Mayo Clinic researchers reported no major jump in admissions during a full moon compared to any other night. The only positive correlation found with the full moon was pet injuries: according to the Colorado State University Veterinary Medical Center, researchers found, based on around 12,000 animal emergency visits, a 23% higher chance of a visit for cats and a 28% higher chance for dogs during a full moon.
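To make "no difference" concrete, here is a minimal sketch (in Python, with entirely made-up numbers, not figures from the studies cited) of how one might test whether the visit rate on full-moon nights differs from other nights:

```python
import math

# Hypothetical counts (illustrative only): ~160,000 ER visits spread over
# 1,000 nights, of which 34 happened to be full moons.
full_moon_nights, other_nights = 34, 966
full_moon_visits, other_visits = 5_490, 154_510

rate_full = full_moon_visits / full_moon_nights   # visits per full-moon night
rate_other = other_visits / other_nights          # visits per ordinary night

# Normal-approximation z-test comparing the two Poisson rates:
z = (rate_full - rate_other) / math.sqrt(
    rate_full / full_moon_nights + rate_other / other_nights
)
print(f"full moon: {rate_full:.1f}/night, other: {rate_other:.1f}/night, z = {z:.2f}")
# |z| is well below 1.96, so no statistically significant difference at the 5% level
```

With rates this similar, the test cannot reject the null hypothesis, which is the pattern the studies above reported.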

Though there is little scientific evidence for the full moon myth, it persists. Some who favor the myth argue that because the body is about 75% water, the moon's gravitational pull may affect us more when the moon is full. However, scientists counter that the moon's pull on our bodies is minimal and not strong enough to measurably change anything. Another possible explanation is confirmation bias: when people know there is a full moon, they are more likely to notice and remember strange encounters. Have you heard of the full moon myth? Do you find that you encounter more unusual behavior during a full moon? Do you believe there is an explanation even if science hasn't found it yet?


Unbreak My Heart

Often associated with a break-up or another comparably tragic event, "broken heart" is a phrase with a negative connotation, but there has always been some criticism of it. Since a heart does not actually break, it seems strange to say that you are literally brokenhearted. Inspired by another post, I was looking for a connection between emotional distress and physical pain and discovered that broken heart syndrome actually exists. I decided to investigate...

Broken heart syndrome refers to a sensation similar to a heart attack. When a person is overcome by overwhelming stress, they may experience chest pain much like someone having a heart attack. However, the pain is not as serious and occurs only in a specific part of the heart, which enlarges. The rest of the heart works normally, which makes broken heart syndrome less dangerous than heart conditions with lasting consequences. Broken heart syndrome can resolve in about a week, depending on its severity.

The main difference between broken heart syndrome and a heart attack is the damage each causes. Whereas a severe heart attack can permanently damage a person's heart, broken heart syndrome does not. There is pain, certainly, but your heart is normally unscathed after the syndrome passes, while a heart attack can leave scarring for a lifetime. It should also be noted that Johns Hopkins Medicine research has found that broken heart syndrome mostly occurs in older women, although it is not clear whether this means the risk of heart attack is correspondingly higher for men. Perhaps women are more susceptible to broken heart syndrome but less endangered by heart attacks? There is a silver lining to broken heart syndrome (also called stress cardiomyopathy): once you experience it, you will most likely never have it again. Unlike a heart attack, which can strike a person more than once, broken heart syndrome (a relatively recent discovery) seems to be a one-time event.

It is unclear what kind of stress actually causes a person to fall victim to broken heart syndrome. While some people handle stress well, others succumb to anxiety very easily. Certainly a tragic event like a death, break-up, or accident can produce a stress overload, but it would be interesting to know how much stress a person can handle before incurring broken heart syndrome. What do you guys think?

Can animals talk?

Earlier this month, the New York Times published an article about an elephant in South Korea who can "talk" - that is, he can mimic a handful of words that he has picked up from his human caretakers. 

The NYT gave examples of some of the Korean words that Koshik, the 22-year-old male elephant, can speak:

"His vocabulary includes "annyong" (hello), "anja" (sit down), "aniya" (no), "nuo" (lie down) and "choah" (good)."

Researchers at the University of Vienna conducted a study in which native Korean speakers wrote down the words they thought they heard the elephant say, and their answers agreed with what everyone else thought the elephant was saying.

This led me to wonder whether there have been other cases in which animals have picked up words from humans, and whether they actually seem to understand what they're saying.

There are a lot of reports from pet owners who claim their dog or cat has picked up a few words. A simple YouTube search will give you a ton of clips of dogs saying "hello", "I love you", and even "I want my mommy."

It seems as though it's not all that rare for animals who spend their lives around talking humans to start picking up a few words and mimicking them - similar to how babies start speaking.

While researchers have been able to teach animals to communicate in a human language using computer programs, different species have unique cognitive abilities and vocal systems, so teaching animals to audibly speak and visibly comprehend more than a handful of words has proved much more difficult.

Back to Koshik: while I have no doubt that this elephant can mimic the words he has probably heard all his life, because he doesn't comprehend their meaning, I don't find his ability to speak any more interesting than any other cool trick. Furthermore, I'm really not buying this wildly speculative conclusion from the researchers:

"The researchers think that Koshik started imitating human speech out of a need to socialize. For seven years when he was a juvenile and at a critical stage in his development, he was the only elephant at the Everland Zoo."

Because it links the elephant's ability to his past living situation, this conclusion could be a product of the Texas sharpshooter problem. Attributing it to "a need to socialize" assumes that elephants and humans have similar socialization and communication needs. I'm not saying it's a completely off-base speculation, and there's definitely a chance that Koshik started speaking because he was lonely. But it would be difficult to thoroughly analyze an animal's past experiences and their effect on his current behavior without more similar cases to study.

Oh My Oxytocin


In class on Tuesday, we discussed the altering of data and fraud in science. I found this very interesting because, as I have mentioned in many past blog posts, the public puts a heap of trust in science. Society leaves the questioning of data to other scientists and politicians rather than questioning results itself; people simply accept the information as fact because it is labeled as "science." So my question is: what makes us trust?

According to Reader's Digest, there are two extremes when it comes to trusting others. At one end are those who are incapable of trusting anyone; they are labeled with paranoid personality disorder. At the other are those with a genetic disorder that causes them to trust just about anyone and anything. Most people fall somewhere in the middle.


The article claims trust develops at an early age, as young as 14 months. We long to trust people because it releases a gratification hormone called oxytocin, but our brains are not so blind as to have faith in someone who has let us down in the past. The article mentions a study in which men inhaled either oxytocin or oxygen, and the ones who inhaled oxytocin showed increased levels of trust.

Let's break down the study...

1. It is an experimental study. The scientists know who is in the control group and who is in the experimental group.
2. Hypothesis: inhaling oxytocin will increase trust.
3. Putative causal variable: oxytocin.
4. Putative response variable: trust.
5. The study rules out reverse causation because of the element of time: increased trust cannot cause the inhalation of oxytocin, since the oxytocin is inhaled first.

The results of the study could be a false positive, due to chance or to a third variable. It was a relatively small experiment and was done only on men, which leaves a lot of questions unanswered. How does oxytocin affect women? Is it easier for men to trust than it is for women?
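To illustrate how a small, single-sex sample can throw up a difference by chance alone, here is a toy simulation (Python; the sample size of 29 per arm and the trust-score scale are my own illustrative assumptions, not details of the actual study):

```python
import random

random.seed(0)

# Under the null hypothesis the spray does nothing: both arms draw trust
# scores from the same distribution (mean 5, sd 2 on a 0-10 scale).
def trial(n=29):
    oxytocin = [random.gauss(5.0, 2.0) for _ in range(n)]
    placebo = [random.gauss(5.0, 2.0) for _ in range(n)]
    return sum(oxytocin) / n - sum(placebo) / n

gaps = [trial() for _ in range(10_000)]
big = sum(abs(g) > 1.0 for g in gaps) / len(gaps)
print(f"{big:.1%} of null trials show a gap of a full point or more")
```

Even with no real effect, roughly one small trial in twenty shows a sizeable gap purely by chance, which is why replication (and testing women too) matters.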


After this study was released, a product called Liquid Trust (containing oxytocin) appeared, claiming to make others more likely to trust you when you spray it on. The only drawback: oxytocin does not erase memory, so if you have not been a trustworthy person, Liquid Trust will not make people forget it. Would you buy Liquid Trust based on the results of this study?



The holidays are upon us, and while this may be the season of giving, the food we decide to eat can be unforgiving on our waistlines. These next few weeks are calorie bombs, and our targets are plates of cookies, mashed potatoes, and one too many Peppermint Mochas. But this year, instead of worrying that you'll pack on the pounds, let a new study from the University of Rhode Island assure you that perhaps the key to staying slim isn't what you eat but the speed at which you eat it.
In this study, conducted by professor Kathleen Melanson, a correlation was found between the subjects' BMIs (Body Mass Index) and the speeds at which they ate. Those who ate slowly (2 ounces per minute) had lower BMIs than subjects who ate more quickly (3.1 ounces per minute). Professor Melanson explains her findings in this article, posted on a science news site: "It takes time for your body to process fullness signals, so slower eating may allow time for fullness to register in the brain before you've eaten too much." These findings should remind us that finishing our plate isn't always a race.
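As a sketch of what "a correlation was found" means in practice, here is how a Pearson correlation could be computed on a hypothetical mini-dataset (Python; the numbers below are invented for illustration, not Melanson's data):

```python
# Invented data: eating speed (ounces/minute) paired with BMI.
speed = [2.0, 2.1, 2.3, 2.5, 2.8, 3.0, 3.1, 3.3]
bmi = [21.5, 22.0, 23.1, 24.0, 25.2, 26.0, 27.3, 27.9]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(speed, bmi)
print(f"r = {r:.2f}")  # strongly positive here: faster eaters have higher BMIs
```

A positive r like this only shows association; on its own it cannot rule out confounding variables.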

Melanson's observational study is valid, but there are definitely some possible confounding variables I wonder about. For example, I believe that people who are slim are generally more health-conscious and may be better educated on appropriate portion sizes than those who are overweight. What I mean is that it shouldn't matter how fast you're eating the meal, only how much of it you're eating. I wonder if Melanson has taken this into account in her study.

It's common to celebrate the holiday season every day from Thanksgiving until New Year's. However, this may be the downfall of anyone trying to maintain their weight. While occasionally indulging in your roommate's famous holiday cookies isn't a problem, everyday snacking can become an issue. Remember that it's called a holiDAY for a reason. In my opinion, focusing on healthy eating throughout the season is the best way to survive this time of year without packing on the pounds.

Music, Sweet Music



As a musician, I constantly encourage people to take up a musical instrument and learn about its essence. Not only do musical instruments provide relief from stress, but they also make an individual look more "cool."


Someone might assume that the focus a musical instrument demands would distract from more significant things like schoolwork. But this is not entirely true.


According to this website, the discipline of music helps the brain increase its cognitive ability. As a result, test scores go up across the board for individuals with a music background: "...Johnson, professor of music education and music therapy and associate dean of the School of Fine Arts at KU, found jumps of 22 percent in English test scores and 20 percent in math scores at elementary schools with superior music education." Professor Johnson studied schools across the nation, and the results suggest that a background in music could raise standardized test scores for high school students and college applicants.


The famous Mozart effect has a similar claimed benefit. As a reminder, the Mozart effect is the idea that someone who regularly listens to classical music will have better test scores; the idea also extends to pregnant women who listen to Mozart to make their offspring "smarter." This effect has been seen in particular situations, but scientists have repeatedly failed to reproduce it in research. Because of this, the theory may come down to chance, since there were no consistent trials. Perhaps the individuals who showed no cognitive gains simply have no taste for classical music, so it couldn't provide any relief. There are other third variables that could be at play.


But in general terms, music is an art form that allows people to relax. A musical instrument, on the other hand, requires more than just pleasure: dedication is essential. Determined musicians who study their instrument for months and years have trained their brains to function in a musical fashion. So when they sit down to take a long test, they can attack problems with that musical foundation to guide them. On a personal note, when I am taking a midterm or a cumulative assessment, I will hum and tap the rhythm of a song to help my brain make sense of the information on the assignment. So research a musical instrument and get involved, with lessons and dedication.





In the morning paper, one might flip through news stories and crossword puzzles, and perhaps check their horoscope. While reading horoscopes, many people get caught up in the eerie similarities between what is written and their daily lives. My roommate will even dismiss the unusual behavior of others with, "He must be a Taurus." There is no doubt that astrology holds an important and even spiritual place in many lives, but is there any proven basis for any of it?

Most scientists had left astrology untouched, dismissing the whole concept as a sham, until a French psychologist and statistician named Michel Gauquelin came along. Gauquelin had been highly skeptical of the validity of astrology. He conducted research to find a correlation between a person's chosen profession and their natal chart, and his research showed a tendency for a person's profession to correlate with the positioning of the stars. Critics of Gauquelin's work pointed out that the correlation appeared only for members of eminent professions, not for unskilled workers.


(Gauquelin's Mars Effect Chart)

Michel Gauquelin's most controversial work concerns a correlation he calls the 'Mars effect.' He split the day into segments as planets passed through the sky and noted a spike of athletes born just after the rise and culmination of Mars, when it occupied two particular zones. He states that Mars is in these two zones more often for generals, physicians, and sports champions, and less often for painters and musicians. Critics argue that he did not adjust the statistical significance of the Mars effect for multiple comparisons and never addressed this issue in any of his publications. More recent researchers have failed to reproduce his work.
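The multiple-comparisons criticism is easy to demonstrate with a short simulation (Python; the choice of 20 tests is an arbitrary illustration, not Gauquelin's actual number of planet/profession pairings):

```python
import random

random.seed(1)

# Run many "studies", each checking 20 independent planet/profession pairings
# at the 5% significance level, when no real effect exists anywhere.
def finds_something(tests=20, alpha=0.05):
    return any(random.random() < alpha for _ in range(tests))

n_studies = 10_000
hit_rate = sum(finds_something() for _ in range(n_studies)) / n_studies
print(f"{hit_rate:.0%} of effect-free studies still report a 'significant' link")
# theory predicts 1 - 0.95**20, about 64%
```

Without a correction for the number of comparisons, a "significant" Mars zone is almost guaranteed to turn up somewhere.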

There is no strong scientific evidence for astrology, though that doesn't stop people from believing in it wholeheartedly. Do you think the idea of astrology is too far-fetched for scientists to attempt to explain rationally? Do you see any correlations between your personality and that of your zodiac sign? Do you put much faith in horoscopes? To check whether your personality matches your sign, follow this link.





My best friend has always been extremely afraid of spiders. Once, she almost caused an accident because she saw a spider in the car and literally jumped into my lap while I was driving. I had to pull over and kill the spider. I always ask: is it possible for someone to be THAT afraid of a little harmless spider?! I guess I can't understand because I am not afraid of spiders myself, but I decided to research whether arachnophobia is really a real thing. As it turns out, arachnophobia is very real.

A phobia is an irrational, persistent fear of things or situations (Mental Healthy). More specifically, arachnophobia is an extreme or irrational fear of spiders. In the US, 50% of women and 25% of men say they have some degree of arachnophobia. Someone is arachnophobic when their fear of spiders reaches a level that is irrational, illogical, and unhelpful (Mental Healthy). It can trigger a very intense panic reaction, with symptoms including sweating, increased heart rate, breathing difficulties, light-headedness, dry mouth, extreme fear, nausea, and more. This was a shock to me; I couldn't believe that a little spider could cause such strong reactions in people.

Phobias are different from rational fears; they can have a negative effect on people's lives and might require treatment. Arachnophobia has been around for a while: studies have shown that it can be traced in Europe back to the Middle Ages. Different people develop arachnophobia for different reasons. Some have a bad experience that evokes the phobia. Others have had no awful experiences with spiders but have simply always been afraid of them, usually since childhood. When people with arachnophobia see a spider, a fight-or-flight response is provoked. For people who have the phobia badly, it can affect their lives a great deal: they might avoid going anywhere they could even see a spider, like on a camping trip.

This phobia can be overcome with counseling for many. I always thought my friend was overreacting and being a drama queen... but I guess she wasn't joking around! Are any of you arachnophobic? Would you ever consider counseling to overcome it? Where do you draw the line between being very afraid of spiders and having an actual phobia of them?


I have been on ice skates since I was 5 or 6 years old; I had a stick in my hand before I can even remember. Every year I would play in summer and winter leagues. I have been hit hard and knocked out for a while, but I always recovered. I've had my finger sliced by a skate, but my finger is still there. Hockey is definitely a dangerous sport, but what if one of the most dangerous parts of being a hockey player (or figure skater, in this case) wasn't in the sport itself? What if the most dangerous part lay in the air you were breathing?

In this last year, a hockey player I briefly knew has been battling serious cancer. He lost his eyesight, and many thought he would not make it, but recently his tumors have been shrinking and he has been recovering. And now, just this week, a great kid who I have played with my entire life in travel hockey, and played against in high school, was diagnosed with a brain tumor. It has hit my town very hard. Please pray for them.

I bring this up because something clicked in my head. I have always heard that the exhaust from Zambonis (the machines that clean the ice) is dangerous, and that exhaust sits in the rink while all of these players skate around breathing the air at a quick pace. I'm not necessarily saying that this exhaust causes cancer, because that would be a serious scientific breakthrough. But I have a gut feeling that this exhaust is seriously dangerous, and I would like to explore the matter further.

It turns out that when I searched 'zamboni smoke', 'zamboni exhaust', and 'zamboni dangerous', there weren't many articles. Most were stories about players hospitalized and ill because of Zamboni exhaust. I did learn that Zambonis are powered by natural gas or gasoline, which can release carbon monoxide and ultrafine particles. According to Wikipedia, carbon monoxide is toxic to humans and animals in high concentrations. In further research, I found that when breathed in, carbon monoxide displaces oxygen in the blood, killing off cells and starving vital organs of oxygen. A large dose of carbon monoxide can kill you within ten minutes, and long-term exposure can result in brain damage, heart problems, major organ dysfunction, and memory or cognitive problems. The insidious thing about carbon monoxide is that it's tasteless, odorless, and colorless, so you cannot detect it with human senses. Ultrafine particles, meanwhile, are deposited in the lungs; they can penetrate tissue and be absorbed directly into the bloodstream, promoting lung disease and other systemic effects.
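As a rough arithmetic sketch of why that exposure lingers: carboxyhemoglobin (the CO-bound form of hemoglobin) clears from the blood roughly exponentially, with a textbook half-life of about 4-6 hours on room air. The 5-hour half-life and 10% starting level below are illustrative assumptions, not measured rink values:

```python
# Simple exponential-decay model of carboxyhemoglobin (COHb) clearance.
HALF_LIFE_HOURS = 5.0  # assumed; the real half-life varies with oxygen supply

def cohb_remaining(initial_pct, hours):
    """COHb percentage left after `hours` of breathing room air."""
    return initial_pct * 0.5 ** (hours / HALF_LIFE_HOURS)

# A skater leaving the rink at 10% COHb would still be near 5% five hours later.
print(f"{cohb_remaining(10.0, 5.0):.1f}% COHb after 5 hours")
```

Under this model, a long practice in a poorly ventilated rink keeps CO in the blood well after the player has gone home.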

Clearly, carbon monoxide and ultrafine particles are two things you don't want to be inhaling. I couldn't find much more research on what other chemicals or particles are in Zamboni smoke, though I'm sure there are more. And when I researched whether cancer is related to Zamboni exhaust, I could not find any results. This is a serious matter, and I still have a feeling that there is a connection. I also worry about my own fate, given how many hours of my young life I have spent in an ice rink with a gas-powered Zamboni. There is nothing better than freshly cut ice, but I think Zambonis need to be studied a little further.

If anyone has ever known a hockey or figure skater with cancer, please post a comment. And please pray for Mike Weltner and Corey Dineen.
Today's lesson about the feasibility of a zombie virus began with a series of gruesome videos in which organisms invaded other organisms. While watching the especially awful video of wasp larvae in a caterpillar, I couldn't help but wonder if the caterpillar knew what was going on. Granted, the larvae eventually took over its brain, but did it ever realize that another organism was inhabiting its body? And if it did, wouldn't it try to kill itself before the invader did? That question led me to the biggest question of all: do animals have suicidal tendencies?

This subject is a very interesting one with many societal and religious implications. For example, St. Thomas Aquinas and St. Augustine both determined that suicide "was an unrepentable sin" on the grounds that it was not natural. This kind of logic is similar to the argument that homosexuality is not natural. However, based on our discussion in class, it appears that argument is not valid, as might also be the case with suicide.

It is important to note that there is very little research and data on animal suicide; in fact, nearly all evidence that it exists is anecdotal. Though observational studies are feasible, a randomized control trial on organisms other than bacteria, and possibly rodents, would be unethical. Discovery News published an article on the topic in 2010. As the article explains, "Animal suicides were often seen as acts of abuse, madness, love or loyalty - the same causes then given for human suicides". This certainly gives a new perspective on the matter, one explored in research published in 2010 in the British journal Endeavour. Sadly, the article is only available in print in the university library, but luckily one of the authors, Edmund Ramsden, offered commentary in the Discovery News article. He explained the significance of research on animal suicide, stating that "You begin to challenge the definition of [suicide]...it's not necessarily even a choice".
One example of suicidal behavior in animals is the case of the pea aphid, an insect that, as an article in Nature explains, is known to explode itself, sacrificing its life to protect its surrounding relatives from predators such as the ladybug. Worker ants are also known to die to protect the colony. There are also folklore and anecdotes that reference animal suicides.
While suicide in the animal kingdom is a fascinating issue, it's difficult to draw any clear conclusions on the topic. Most of the occurrences are anecdotal, which of course does not follow the scientific method whatsoever. However, more research on the topic could have interesting implications for human suicide and how to prevent it. Is it really possible for an animal to consciously end its life for the same reasons as people? Do you think more research should be done on the topic? Or is this something that will simply remain part of folklore for a long time to come?
Do you think today's television shows have caused more violence? On average, American children watch four hours of television a day, and unfortunately, many shows on TV are now violent. Children may become immune to the horror of violence, gradually accept violence as a way to solve problems, imitate the violence seen on TV, and begin to relate to the victims or perpetrators of the crime. The biggest problem is shows that are violent yet seem realistic; these are the shows most likely to make a child repeat their actions.
Television shows contain about three to five violent acts per hour, and children's Saturday morning shows portray about 20 to 25 violent acts per hour. Research done by the American Psychiatric Association in 1996 reported that by age 18, adolescents will have witnessed 16,000 simulated murders and 200,000 violent acts. Statistics show that 73% of the time, the good guy is the one who is most violent, yet he always goes unpunished and is usually portrayed as justified. Think about shows such as 24 and Blue Bloods: the good guys kill people, and it looks really cool when they do it. This just shows kids that it is okay to take out a gun and kill someone. It also shows people jumping 10 stories out of a building and surviving. We all know these things are not okay.
The National Institute of Mental Health has concluded that violence on television does lead to aggressive behavior by children and teenagers. Obviously this does not happen to all children, but it does happen to many. A main problem with television violence is that children become numb to the violence and aggression: they are desensitized to the violent acts on TV, and witnessing the violence does not affect them anymore. Overexposure to aggressive television shows can also make children think the world is an unsafe place to live, leading them to overestimate how many violent situations they will encounter. This can cause a lot of unnecessary stress and anxiety.
The Experiment 
A well-known experiment was done by Bushman in 1998. He discovered that when we watch violent shows, we store in our memory a perceptual and cognitive representation of the scene. The experiment followed 8-year-old boys, and researchers found that, as adults, the violent scenarios the boys had stored in their brains were pulled up and activated, influencing their behavior: they became more aggressive men. Chronic viewing of TV violence leads to constantly accessing the parts of the brain where these memories are stored, an effect called primed aggressive constructs. The younger the child, the more harmful the violent television shows are.

Parents should determine at what age their child is ready to witness violent acts. Although not all children will react to violent TV shows in a negative way, some will, and we want to avoid at all costs any more crime coming into this country. I wonder if most criminals today get their ideas from movies or television shows. Sometimes when I watch certain shows I joke, "That's a good idea for covering up a crime." Of course I would never do such a thing, but I know some people watching might seriously think that way.
Do you think the crime we see today comes from the media and Hollywood? Or do you think it comes down to how a parent raises a child with their morals? If a good child knows right from wrong, TV shouldn't affect them at all, right? Well, for a lot of kids it seems to. So what do you think?


Sensitive Crocodiles


Believe it or not, crocodile jaws are more sensitive than our fingertips!

In fact, scientists at Vanderbilt University have discovered that crocodiles have bumps called integumentary sensory organs (ISOs). These bumps are densely distributed on the jaws and body scales. Each one is a small dome, barely a millimeter wide, surrounded by a groove, and there are around 4,000 of them on an alligator's jaws and inside its mouth. Source

They were first identified in 1895, but no one knew what the bumps' purpose was. These were some of the hypotheses:

  • Secrete waterproofing oils?

  • Help the croc to detect changes in saltiness, electricity, magnetic fields, or water pressure?

  • Water ripple sensors?

In 2002, Daphne Soares of the University of Maryland showed that American alligators respond to the ripples created by a single drop of water, even in complete darkness, and predicted that this was due to the bumps on their faces. It was a brief study, so Duncan Leitch of Vanderbilt University decided to uncover the purpose of these creepy and mysterious bumps. Leitch studied the ISOs on 18 young American alligators and 4 Nile crocodiles. When he cut open the domes of the bumps, he uncovered a vast network of nerve endings, all branches of the big trigeminal nerve that carries sensations from the animal's face to its brain. This nerve splits into three main branches: one going over the croc's eye and one running through each jaw.

He and Ken Catania concluded that the ISOs are touch sensors. When ripples occurred, the nerves in the bumps on the crocodilian's jaws fired, prompting the animal to investigate what caused the ripples. When the scientists covered the bumps with an insulating material and rippled the water, the crocodiles didn't react. Leitch also tested how they sense food: he dropped bits of food into the tanks, and the crocodiles would swiftly turn toward the ripples and sweep their heads from side to side. Once their skin touched the food, they snapped it up within 50-70 milliseconds. They would also wait until a fish swam past their open jaws and shut them only when they knew it was prey. Source

It was concluded that these ISOs help crocodilians capture prey, either by detecting ripples in the water that alert them to possible danger or by feeling the direct touch of food. It is also likely that they use the sensors to distinguish between different objects in their jaws, which is why the bumps are most concentrated around their teeth. For example, mother crocodiles often carry their eggs and babies in their jaws, so a finely tuned sense of touch may help them distinguish prey (bite) from their own young (don't bite). Humans have skin that tells us when something touches us and what kind of material is touching us. Crocodilian skin is very different from ours, so crocodilians need a whole different sensing system to tell them what is around them and help them find their food. Who would have thought that these ugly bumps could serve a survival purpose?

Study: Leitch & Catania (2012). Structure, innervation and response properties of integumentary sensory organs in crocodilians. Journal of Experimental Biology.

Are You SAD For The Winter?


      As Monday kicked in, our honey-sweet Thanksgiving break came to an end. Depressed by the abrupt termination of my leisure, I also took great displeasure in the weather. As a continuation of my last blog, I decided to bring some more interesting observations and resources to bear on the seasonal swing of personal mood.

      My personal anecdote from the break was powerful, at least at my own level, in showing that the weather can seriously hold sway over our dispositions. With the blessing of God, our grand tour of Ohio was accompanied by fine weather for the most part. On the second-to-last day, we walked out of our hotel into sunshine galore, enjoying the most sensational weather the winter could afford to give us. A high of 55, no winds. My thermometer was happy to show me a promising forecast. Later that day, we would be in Columbus, some 100 miles from Cincinnati. The proximity of the two cities almost persuaded me that the weather should not vary significantly. But when we hopped off our bus at around 2 o'clock, the time of day when we usually observe the highest temperature: Holy cow! We were literally blown away by the weather. With relentless gales buffeting our faces, we arrived there unprepared. We were like a group of penguins basking beneath the summer sunshine of Antarctica who suddenly encounter a paranormal reversal of the season. What do we do? Cuddle together and kindle a fire! We were so depressed and deflated by such a downturn of nature that our visit to Ohio State University that night was more like a ritual. No more than five minutes passed before we got so indignant at the mischief of the weather that it drove us back to the hostel. No "We are Penn State" chanting, no showing off of Penn State gear, nothing. In Columbus, we became stray dogs longing for a warm kennel, not because of Ohio State diehards but because of the surreal change of the weather. Shazam! Welcome to the Ohio-ish winter, says the land.


      Photo courtesy of CNN Health Dept. 

      In my last blog, I tackled my recently begotten problem with oversleeping. In the last section of that article, I introduced the concept of SAD. In case you did not read it, which could be a permanent loss for you, please let me remind you that SAD is the acronym for seasonal affective disorder. Drawing from the description at PubMed Health, the government's portal to a medical database, "seasonal affective disorder (SAD) is a kind of depression that occurs at a certain time of the year, usually in the winter." Also as a reminder, I briefly talked about two chemicals that control our mood, serotonin and melatonin. In an excerpt of my last blog, I wrote,

"The science behind all of this [seasonal change of sleep pattern] involves serotonin and melatonin, two neurotransmitters in the brain. Melatonin is produced when we sleep. Sleeping too much produces abnormal levels of melatonin. The more we sleep, the more we want to sleep because of the increase in this neurotransmitter." Alright, this makes a great sense to me to explain my behavior of skipping class for more sleep after an 8-hour one, but it didn't specify the variable of season. It doesn't take me long to reach the following lines: "During the summer, we experience higher serotonin levels. Serotonin is responsible for our mood. When sunlight is in short supply, our serotonin levels fall and we don't have so much energy or so many good feelings."

      The story I told you at the start of this blog is not merely a bland narrative offered for its own sake; it should serve as an example of how personal anecdotes give birth to strong opinions. In a past class, Andrew talked about how flu vaccination has been lambasted over sporadic reports of childhood autism. The account of Jenny McCarthy, an eloquent and contentious mother, pushed audiences to seriously consider the supposed drawbacks of the vaccine. Though what made her famous was not her diatribe against the vaccine but her book about treating her son's autism, it is easy to see that most scientific studies start with observation and anecdotal documents. If you are a lucky dog, then out of the strength of an epiphany, voilà, you are at the very beginning of some groundbreaking discovery. For me, that episode on the trip reminded me of one thing: why not look this stuff up further?

      So here we go: I found a passage from CNN that came right in season. In an article titled "When it's more than 'winter blues'", Dr. Charles Raison, an associate professor of psychiatry at the University of Arizona in Tucson, set forth his own reminiscence of the weather: "I grew up in a place where the sun shone every day from May through October. These sun-drenched days were the happiest times of my life. But in winter a dense fog would often blanket my hometown for weeks at a time, leaving the world gray and featureless and leaving me down and dreary." Oh, poor Dr. Raison; he seemed to suffer from the switch of seasons much more than most of us.

Winter Depression Global Interval.jpg

Picture courtesy of an online community-based database.

      I suspected that SAD could be the source of my depression, and it truly was. Contrary to the image of an evil demon wielding the weather to debilitate us, there is nothing metaphysical about winter: the whole enchilada of clinical symptoms that arrives with the Santa Claus season can be well explained at the physico-chemical level of the human body.

      This semester is drawing to an end, and as time drags on, I have begun to collect more general observations with scientific value. Take the downswing of mood with the coming of winter: I first attributed it to depression. Being a victim of intermittent depression myself, I sometimes dismissed it as a lasting failure to keep a wholesome lifestyle; even now I whip myself to be more active at those times when I feel overcast. The general public, in contrast to erudite scientists, tends to overlook the bodily mechanisms behind our daily behavior. When we find ourselves unmotivated or disheartened, we are more inclined to look inward at hidden aspects of our characters and blame them than to look outward at how our bodies function. In the case of SAD, if we blindly discredit ourselves as slothful, we become embroiled in a turmoil of unnecessary self-criticism. So my suggestion for leading a rational life, inspired by SC200, is to think proactively and scientifically. Searching online when some discomfort works against our general well-being is not a bad way to promote health, both physical and psychological, because scientific evidence is much more likely to be truthful than a cursory indictment of one's own soul, if you will.

      Having found that SAD brought me a grand realization of life philosophy, I continued to read the CNN editorial, and something struck me around the middle: "people in Iceland have remarkably low rates of SAD, despite living in one of the darkest winter environments on earth...even more remarkably, people of Icelandic descent living in the prairie provinces of Canada have far lower rates of SAD than their fellow non-Icelandic Canadians." Eureka! I had actually guessed the content before reading it: I have long had the peculiar notion that Scandinavians live in one of the most inclement climates in the world, yet they also appear to be genuinely happier than people in much milder climatic zones, a happiness often credited to their top-notch social welfare systems and stable governments. Why is that the case? Then I met these lines in the article. Though Iceland is not a Scandinavian country, its climate much resembles theirs. Thus, if my correlation stands true, the low SAD rate among Icelanders ought to be seen among Scandinavian residents as well.

      In an intriguing study called "Winter Seasonal Affective Disorder: A Global, Biocultural Perspective," Barry Whitehead gives us an informative table matching specific geographical locations with their corresponding SAD rates.

SAD Table.jpg

Table courtesy of Barry S. Whitehead.

      Quite interestingly, according to the table, I was half right. Compared to a rate of 3.6% in Iceland, Finland and Norway did not earn an impressive grade, finishing at 7.1% and 9.65%, respectively, whereas Sweden appears to be Iceland's sister in this regard, with SAD rates of 3.9% and 3.5% depending on the study. Why, being neighbors, do these Scandinavian countries vary so dramatically in SAD prevalence? The CNN article addresses this doubt as well: "Icelandic people carry an as-yet-undiscovered genetic factor that protects against SAD." You might call this remark inconclusive, but I think it at least raises the possibility of an anomaly in the general trend of SAD rates rising with latitude. If you are interested in the discussion of SAD rates across countries, I strongly recommend checking the study by Whitehead at the link here:

       Apart from the unsettled genetic theory of Icelanders' winter well-being, I deduced that daily activities, such as the sauna, could also be a contributing factor to their happiness. In a CNN travel log, the author Larry Bleiberg revealed that "Researchers cite the cardiovascular benefits. A sauna removes toxins, leaving skin soft and supple, they [sauna practitioners] say." That said, I think one of the confounding variables behind the low SAD rates in Iceland and the Canadian prairies could be the sauna and other resident routines. If such habits help the body adapt to the environment, residents may be gratified with a better mood without owing too much to genetic inheritance. Who's to blame that we don't have that many sauna units in America? Just kidding.

t1port_thunder_bay_sauna_cl.jpg
Photo courtesy of CNN Travel Dept.

      With so much information that SAD has given me, I couldn't help but wonder how a seemingly mundane phenomenon that most of us take for granted could mean an awful lot in the scientific realm and research arenas.

      As a closing remark, can you think of other causes for the low rate of SAD in Iceland? I know this sounds as weird as the finding that countries with higher chocolate consumption produce more Nobel laureates, but science is sometimes hidden in an uncharted niche, waiting for exchanges of innovative ideas. After all, isn't the process of sorting out "unexpected unknowns" what makes science more appealing than ever?

      You may still hate the winter; so do I. But if you find this blog interesting, please leave your feedback below.



Pumpkin and You

With the season drawing closer to winter, I've been reflecting on my favorite season of fall. Something about the earth-tone colors and the falling leaves makes me infinitely happy. Did I mention all of the pumpkin? Pumpkin muffins, pumpkin soup, pumpkin pie, pumpkin everything! With all this pumpkin most of us are consuming, I started to wonder what kind of health benefits this vegetable provides. 

Some of the health benefits include:
  • Sharper eyesight: one cup of cooked, mashed pumpkin contains more than 200% of your daily value of Vitamin A, as well as beta-carotene, which the body also converts into Vitamin A. 
  • Pumpkin seeds have been known to reduce LDL (low-density lipoprotein) or "bad" cholesterol. The seeds can also increase the production of serotonin, which can put you in a better mood. 
  • Recovery food: cooked pumpkin as a post-workout food has the ability to help you restore your electrolytes and muscle functions, similar to a banana. 
  • Cold protector: the vitamin C in a pumpkin can possibly help to ward off a cold, with 11 milligrams of the vitamin in one cup of cooked pumpkin. 
  • Weight loss: with all the fiber in a pumpkin, it can help you feel fuller for a longer amount of time with fewer calories consumed. 


The nutritional composition of a pumpkin includes carotenoids, protein, essential fatty acids, vitamins A and C, magnesium, potassium, zinc, and a large amount of fiber. 

While pumpkins can be baked, fried, or prepared several other ways, it is best to steam a pumpkin so the vegetable retains its nutrients and gives you the health benefits you need. 

Can Humans Hibernate?

I was reading a post about bears hibernating and was curious whether humans had ever lived under circumstances where hibernation might be beneficial to survival. The depths of winter weren't always about Christmas and snowball fights; they used to be a gruesome struggle for survival. Food was scarce, shelter was essential, and sickness claimed many. So why didn't humans just get fat and take a nap in a cave or hole somewhere?

After some research I came across a 70-year-old Indian yogi, or practitioner of yoga, named Satyamurti who meditated in a small underground pit for 8 days. During these 8 days his heart rate became so low that it barely registered on recording instruments, and his body temperature dropped from 37C to 34.8C, matching the temperature of his surroundings. A drop in metabolism and a decrease in body temperature are two very important parts of hibernation.

Another case of human "hibernation" occurred during winters of chronic famine among Russian peasants. It is recorded that at the first snowfall, peasants would lie down by their stoves and go to sleep, waking only once a day to nibble on hard bread baked in the fall and have a sip of water. This routine would last the entire winter until spring came and they could resume their labor. This hibernation was not only economical but peaceful, removing the stress and struggle of winter and everyday life. It also shows that, if necessary, humans can train themselves to sleep for extended periods of time in order to survive.

 There have also been many cases of people lost or stranded in frozen environments who survived much longer than they would have in warmer climates. This was due to hypothermia, which protected them from severe brain damage and nearly stopped their metabolisms. Drifting in and out of consciousness, many have survived despite the odds stacked against them.

Why is this? Are these changes in lifestyle an adaptation for survival? If these circumstances continued for generations could humans develop a mechanism for hibernation similar to that of a bear, mole, or ground squirrel?

As of now it is assumed that humans cannot hibernate, at least not for extended periods like other mammals, because the mechanisms for hibernation that we do have are not evolved to the point of making hibernation a plausible long-term option.

 Is there a future in human hibernation? Under the world's current conditions, full of heated shelters and winter coats, it is highly unlikely that we will ever find hibernation truly essential for survival. 

But could hibernation be something not completely driven by evolution? Could it be more closely related to Lamarck's idea of an acquired trait passed on from a previous generation? In the case of the Russian peasants, they became accustomed to a hibernation-like lifestyle because it was considered normal social behavior, something they didn't inherit but learned to accept from birth. 

What do you think: Darwin, Lamarck, social interactionism, or a strange mix of more than one ideology?   

The Antioxidant Delusion

| 1 Comment
I was recently going through my previous blog entries trying to get some inspiration for new topics. I noticed one word that came up a few times in my entries and comments: antioxidant. The first and second time I read it, I glazed over the word. After all, it's thrown around in general health and diet news all the time. But by the third and fourth time I spotted the word, I realized I knew very little about antioxidants besides the fact that they're in blueberries. What are they? What do they do? And, most importantly, do they live up to the hype?


The first definition of antioxidants I read was from the National Institutes of Health, which describes them as "substances that may protect your cells against the effects of free radicals". This only raised more questions. What does "MAY protect cells" mean, and what are free radicals? Luckily, the latter question was subsequently answered: free radicals are "molecules produced when your body breaks down food, or by environmental exposures like tobacco smoke and radiation". The article went on to list the possible effects of free radicals, which include heart disease and cancer.
With a definition in hand, I went looking for more information about how antioxidants affect our bodies. The Harvard School of Public Health published an article all about the antioxidant issue, aptly titled Antioxidants: Beyond the Hype. One interesting point the article cleared up is that referring to antioxidants as a substance is deceiving; "antioxidant" is in fact a chemical property. The article also discussed the rise in popularity of antioxidants in the 1990s, before research had even proven their benefits; instead, the media and health industries jumped on the trend and stocked shelves with antioxidant-packed supplements and foods. Of course I was immediately reminded of the Trofim Lysenko fraud case we discussed in class, in which he told the media his "results" before the study was complete. The only difference between these cases is that there is probably no fraud in antioxidant research, considering most antioxidant-based studies have not yielded the anticipated or desired results showing they are beneficial.
A study is currently being performed by researchers at the University of California-San Diego on the effect of antioxidant pills on Alzheimer's sufferers. A randomized control trial was performed on "78 patients with mild to moderate Alzheimer's disease who were divided into three groups and given supplements for 16 weeks". The independent variable was the type and amount of antioxidant pills given to participants, and the dependent variable was the protein levels of cerebrospinal fluid (CSF), which serve as markers for the onset of Alzheimer's. At the completion of the 16 weeks, none of the groups showed improvements in the Alzheimer's-related CSF markers. The article's author is critical of the study for many reasons: not only was the sample size small, but the duration was very short and only one dependent variable was measured (or published). The more visible progression of the disease (changes in memory and thinking skills) was seemingly disregarded.
Though the research on antioxidants and disease prevention has been inconclusive thus far, does this mean we as consumers should focus less on incorporating them into our diet? Not quite yet, the Harvard article claims. It cites the short duration of most studies and the fact that most of them have been done on people with existing diseases (cancer, Alzheimer's, etc.) as reasons for the inconclusive data. However, this article and another from the Mayo Clinic stress the importance of eating a balanced diet rich in fruits and vegetables. This way, you will satisfy your body's need for a range of nutrients and antioxidants.
Have you, like me, been led by the media to believe that antioxidants are the ultimate cure-all? Did you do any other research before grabbing that high-antioxidant food or supplement? If so, will you continue to incorporate them into your diet until long-term research publishes more conclusive data?

Can Cats Make You More Outgoing?

A study in the European Journal of Personality suggests that owning a cat can make a person more outgoing. The argument is that humans infected with Toxoplasma gondii are more outgoing and extraverted than humans who are not infected. The study shows that about 23 percent of Americans over the age of 12 are infected. However, according to the article, it's not the cats themselves that expose you to the parasite and make you more outgoing; it is their litter that can become infected and pass it on to a human. The parasite supposedly increases the concentration of dopamine in the brain. It can infect humans, but it has to live out its life cycle first in rats, then in cats, and then in their feces, which is where humans pick it up. It can also be ingested through the consumption of raw meat.

            While this sounds good, it raises a lot of questions. Scientific American reports one study on this topic in which infected and non-infected people took a personality test, and those infected were shown to be more outgoing. However, the European Journal of Personality only suggests that this could make you more outgoing, and that is all. In my opinion it would be fairly difficult to perform this study, because measuring how outgoing a person is would be difficult. I suppose some sort of personality test like the Myers-Briggs (<---take the test here!!) could be used, but I don't think that is an effective or accurate test. The study Scientific American reported does not state whether it was a double-blind placebo trial, and that certainly takes away from its validity. Another issue this raises is what harmful effects the parasite might have on your body. If you get it from raw meat and cat feces, and it is indeed a parasite, it can't be that great for you, even if it makes you feel more extraverted. What most surprised me about this topic, though, is that a stereotypical "cat lady or cat man" (I'm not sexist) is normally viewed as sheltered and slightly withdrawn from society. If these people own a lot of cats, their chances of ingesting the parasite are much higher, and probably in much higher quantities, so by this journal's hypothesis they should be exceptionally outgoing... but I don't believe that is the case. It would be interesting, though, if a study found that cats can make you more outgoing; I bet more people would want a cat then.



To Eat or Not to Eat?



Fast food chains.  Whether healthy or unhealthy, I believe almost everyone at Penn State, and largely in this country, has experienced one at one point or another.  With this country's growing obesity problem, it is no secret that millions of people choose to eat at unhealthy fast food chains on a daily basis.  The fact that McDonald's proudly parades the sign "Over 99 Billion Served" is disturbing to me and others, but that is a blog for another time.  For this blog, I'm here to tell you which fast food chains are unhealthier than others, and why people who frequently eat at unhealthy fast food chains, and I include myself in that category, should switch to the healthier ones. 

According to Urbanspoon, there are in excess of 1,073 fast food chain locations in densely populated states, on average.  Over 170 McDonald's restaurants lie on the island of Manhattan alone.  So which chains are worse than others?  After doing extensive research on several major websites, it depends on how you look at it and what you order at each place.  For those who would dub the chain with the worst individual menu item the worst chain of restaurants, the numbers will tell you that Pizza Hut's Triple Meat Italiano Large Pizza takes the cake at 1280 calories and 3070 mg of sodium.  For those disciplined enough to go to McDonald's and get a salad with grilled chicken, congratulations, as you are not eating that unhealthily. 

Every place has its good and bad.  Even a place like Subway has items such as its Italian B.M.T., which contains 3000 mg of sodium and 16 g of saturated fat.  So just because you are eating at a place like Subway does not by any means mean you are eating healthy.  That goes both ways.  Just because you go to a place like Burger King or Wendy's does not mean you can't get a tangerine chicken salad, which has about one tenth the calories, fat, and sodium of a Triple Baconator.  However, there are a few places that are bad just about all around.  Places like KFC, Dairy Queen, and Sonic make it incredibly hard to go there and not walk out unhealthier than you went in.  Long John Silver's is the worst on that list, as they deep-fry just about everything.  Those are the places you want to stay away from altogether.  Then there are good fast food chains: places like Panera Bread, Chipotle, and Noodles and Co., which, while they still offer some unhealthy options, are predominantly nutritional.  Those are the places I hope this blog helps convince people to go to instead of the unhealthy chains I mentioned.  Even a Chipotle burrito packed with just about everything you can find at Chipotle is still better than a SuperSonic Double Bacon Cheeseburger, with 1370 calories and 27 percent of a person's daily carbohydrate intake.  

To conclude, I guess I'm saying to just be mindful of what you eat and where you eat it.  Just because you are eating at Subway does not mean what you are eating is better for you than a Big Mac, and just because you are eating at McDonald's does not mean that a salad there is worse than one of Panera Bread's Sticky Cheese Honey Buns. 




Can You Punch a Bear While It's Hibernating?

This is one of the many questions my roommate asked me about a bear's hibernation period. She also wanted to know why bears hibernate, and whether, if she went to the zoo in January, she would see a bear. 
I'm no expert on bears so I told her I'd look it up and get back to her and write a blog about it while I'm at it. 

Let's start with why bears hibernate. Bears are typically very large animals that need a large amount of food to survive. During the winter when food is scarce, other animals compete against each other to survive; a bear during this period would have a very hard time finding enough food for itself, and survival rates would probably be much lower. In the months before winter, the bear stocks up on food and can gain up to 40 pounds of fat in a single week to use during the winter while it sleeps. Bears usually sleep from October until April or May. This is one adaptation that I am quite jealous of. 

To address the question of punching a bear during hibernation, the answer is yes, of course you can, but the bear will wake up and eat you. In fact, the bear will most likely be awake to eat you before you can make a fist. This is because of a unique adaptation that allows bears to sense a predator walking toward them while they sleep. As a possible predator approaches, the animal's heartbeat quickens, and from that point the bear can awaken from hibernation rather quickly. This website says the ability is necessary for the species since bears cannot burrow underground to protect themselves. 

Even though it is a natural response for a bear to hibernate during the winter, bears are less likely to hibernate in a zoo setting. This is because there is no competition for food, and it is usually warmer. So, if you plan on going to your local zoo in January, there is a good chance you will see a fully awake bear.

Now, for my own question: What if the bears have to use the bathroom?
This page from the National Park Service says bears do not urinate or defecate during hibernation; instead they recycle their waste, using its nitrogen to build up protein. Bears seem like very well adapted animals, and it is definitely because of this blog that, if I had to be any animal in the world, I would be a bear.
Does Eating Breakfast Make You Smarter?

All over the cafeteria, signs are posted reading "Students who eat breakfast get better grades." However, I was curious as to how these two events were correlated. Does eating breakfast actually make people smarter? It's possible that smart students are more likely to eat breakfast, that eating breakfast causes students to perform better, that there is a third variable involved somewhere, or that this correlation is due simply to chance. 
student eating breakfast.jpg
A study conducted at the University of Kentucky was designed to test whether grades and breakfast were actually positively correlated. According to the University of Kentucky website, the purpose of the experiment was to test whether breakfast consumption affected a college student's GPA. The student who conducted it, Ashley Smith, collected data from 251 undergraduate students at the university, 106 males and 145 females. Though she did not report whether the study was observational or experimental, Smith reported that "those who ate breakfast had a higher GPA than those who did not." This suggests that eating breakfast is associated with better grades, but it does not clarify the direction of the correlation.
I was unable to find any other information on the subject from legitimate sources, and thought this would definitely be an interesting study for someone to conduct. If the researchers used a simple random sample to recruit participants, they could randomly divide both the men and the women into a control group and an experimental group, where they would either eat breakfast or not. Then, after a certain period of time, they could compare the GPAs of the two groups. Since the assignment would be randomized, this would help rule out confounding variables.
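Just for illustration, here is a minimal Python sketch (my own, not from Smith's study) of the stratified random assignment described above, reusing the reported counts of 106 men and 145 women; the participant labels and function name are hypothetical:

```python
import random

def assign_groups(men, women, seed=0):
    """Randomly split each sex into control and breakfast groups,
    so both groups end up with a similar gender balance
    (stratified randomization)."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    groups = {"control": [], "breakfast": []}
    for stratum in (men, women):
        shuffled = stratum[:]          # copy so the input list is untouched
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        groups["control"].extend(shuffled[:half])
        groups["breakfast"].extend(shuffled[half:])
    return groups

# Hypothetical participant labels matching the study's reported counts.
men = [f"M{i}" for i in range(106)]
women = [f"W{i}" for i in range(145)]
groups = assign_groups(men, women)
print(len(groups["control"]), len(groups["breakfast"]))  # 125 126
```

After the assignment, each group's mean GPA could be compared at the end of the term; because group membership is decided by the shuffle rather than by the students themselves, pre-existing differences tend to even out between the groups.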
So what do you think? Does eating breakfast actually make you smarter, do smart people eat breakfast, or is there a confounding variable?

Are Scary Movies Bad for Your Health?

People love to be scared. They go to haunted houses and on ghost hunting trips, and watch scary movies just to get a thrill. I can remember being scared for life by the movie The Sixth Sense after sneaking in on my parents watching it. I have hated scary movies ever since. But can scary movies actually have effects on your health beyond just making you scared for a little while?


According to Lifestyle Lounge, the long-lasting effects of watching scary movies include anxiety, sleeplessness, fear, and phobias. Anxiety is a visible effect of watching horror movies and can even linger into adulthood. Regarding sleep, Lifestyle Lounge says, "An immediate psychological effect of scary movies on the minds of people is lack of sleep. The person may find an inability to sleep through the night for few days, even months after watching the movie. He/she may need to use a nightlight while sleeping. The thoughts of the horrifying characters appeared in the horror movie and the situations may haunt him/her, leading to sleepless nights. This condition last for a week to even a year." The lack of sleep can affect your overall health and quality of life. The fear and phobias come from replaying the scary movie in your head, which makes you fearful of things related to it and can eventually turn into a full phobia if it is not caught in time.


In an article by Suite 101, it says that, "According to Glenn Sparks, a professor of communication at Purdue University, the physical reactions to horrifying images include sweaty palms, tense muscles, a drop of several degrees in skin temperature, a spike in blood pressure and an increase in heart rate of up to 15 beats per minute. Those are some pretty incredible changes considering you're simply sitting still viewing images."


All in all, if you cannot handle watching scary movies without any of these side effects, you should not watch these types of movies. They can be bad for your physical health in the short term and bad for your psychological health in the long term. 


Recently, my roommate started taking cinnamon tablets to help her keep off the freshman 15. I always thought this was a weight loss myth. I've heard people say to sprinkle some cinnamon on your food or put a cinnamon stick in your water, but I never thought it would make any significant difference. I decided to do some research to find out whether cinnamon tablets can really help and whether I should be taking them too.

Livestrong explains: "According to a study published by "The American Journal of Clinical Nutrition," cinnamon has been linked to weight loss because the spice stimulates, or increases, the metabolism of glucose. Glucose, or blood sugar, is a main source of energy and affects how hungry or energetic you feel." The Livestrong article also lists some of the benefits cinnamon can have. It says, "A benefit of cinnamon is its ability to delay food from progressing through the digestive system. Food is delayed in the stomach and, as a result, leaves you feeling full for a longer period of time. This results in a reduction of hunger and causes you to eat less. In addition to satiation, the active compound in cinnamon is methyl hydroxy chalcone polymer (MHCP) and works to increase glucose metabolism, according to the USDA. Your blood sugar level determines whether you burn fat or store it. Increasing your glucose metabolism through the use of cinnamon tablets burns excess glucose, which is stored as fat in your body. In short, feeling less hungry and burning excess glucose more effectively equates to eating less and losing weight."

Taking cinnamon tablets seems so appealing after reading about these amazing benefits! So how many milligrams should we be taking per day? Buzzle says, "For better results, it is best to consume cinnamon in the form of capsules or pills. ½ - ¾ teaspoon of powdered cinnamon is equivalent to 2,000 mg. Taking 4 capsules of 500 mg daily, that is, one with breakfast, two along with lunch, and last one with dinner should be fine."
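The dosage arithmetic in that quote is easy to double-check. This is just a quick sanity check on the numbers Buzzle gives (capsule size and count come straight from the quote; none of this is medical advice):

```python
# Capsule size and daily count as quoted from Buzzle.
capsule_mg = 500
capsules_per_day = 4  # one at breakfast, two with lunch, one with dinner

total_mg = capsule_mg * capsules_per_day
print(total_mg)  # prints 2000, matching the ~1/2-3/4 tsp of powder cited
```

So the capsule schedule and the "2,000 mg" powder equivalence in the quote are at least internally consistent.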

What do you guys think? Does this sound legit, and would you personally try it to lose weight? I think this may be the most natural weight loss pill you could ever find. We should still exercise regularly and eat well, but taking a cinnamon tablet could help move the weight loss process along more quickly.





Anecdote or science?


Winter season is cold season, and for a while, it seemed as if the cold was an inevitable sickness. But over the past few years, a new product claiming to shorten the length of a cold--Zicam--hit the market. My family swore by this product, and every time they thought they were coming down with a cold, they ran out to the store to buy it. The original Zicam was a cotton swab with medicine on the end that you used in your nose.


However, Zicam is now a tablet that you dissolve in your mouth--so why the sudden discontinuation of the cotton swabs? According to the FDA, the cotton swab is associated with anosmia, or the loss of smell. However, the FDA also noted that only 130 such cases were reported, so is this actually legitimate?

As Professor Read said in class, the power of the anecdote is much greater than the power of science. Only 130 people out of the huge population using this product seems like a very small statistic. Conversely, 130 cases can also seem like a lot of people experiencing this side effect.

According to an article in the New York Times, the company that produces Zicam, Matrixx Initiatives, has had reports on this dating back to 1999. Then, in 2006, the company, according to the article, "paid $12 million to settle 340 lawsuits from Zicam users who claimed that the product destroyed their sense of smell...hundreds more such suits have been filed." I found this very interesting because, according to the FDA, there have been only 130 cases--so why so many lawsuits? Were there more cases that were deemed not legitimate?

I was unable to find any more information on that question, but I think it would be interesting to see the science behind why the drug was eventually deemed not marketable and to know more about the 130 cases. Since the FDA is the one that banned the drug, I find it plausible that this actually was a common, negative side effect. However, perhaps it was just the power of the anecdote. What do you think?

Can sheep be gay?


Sexual orientation always seems like it's applicable only to humans--but according to an article in Time magazine, rams can be gay too. According to the article, zoologists have known for years that homosexuality is not limited to humans, and they estimate that about eight percent of rams are gay.

However, while the article did not explain why some animals display signs of homosexuality and others do not, it stated that the animal activist group People for the Ethical Treatment of Animals (PETA) started a campaign to raise awareness about scientists in Oregon who were cutting open gay rams to try to learn how to turn them straight.

While this claim is utterly ridiculous, I did some research and found that scientists in Oregon were in fact studying heterosexual and homosexual rams to examine the "neurological basis of sexual attraction."


I found the press release of the study performed by Oregon Health and Science University, which states that the scientists confirmed that male sheep's sexual preferences are derived from biological differences. The lead scientist in this study said this could also have implications for human sexuality--however, there is little to no evidence for this theory.

While he said they performed actual experiments on the sheep, the study only examined 27 adult, 4-year-old sheep. According to the press release, the study included eight male sheep who displayed preferences for other males, nine male sheep who displayed preferences for females, and ten female sheep.

However, when I saw they only studied 27 sheep--only eight of whom were considered "homosexual"--I found the study unconvincing. With such small numbers, it's very possible that the differences discovered were due purely to chance.
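The small-sample worry can be put in rough numbers: the uncertainty in a group average shrinks only with the square root of the sample size, so a group of eight leaves a wide margin for chance differences. A minimal, illustrative sketch (the standard deviation of 1.0 is an arbitrary placeholder, not a figure from the study):

```python
import math

def margin_of_error(std_dev, n, z=1.96):
    """Approximate 95% margin of error for a sample mean."""
    return z * std_dev / math.sqrt(n)

# The margin only halves when the sample size quadruples,
# so n = 8 is far noisier than a properly sized study.
for n in (8, 27, 800):
    print(n, round(margin_of_error(1.0, n), 3))
```

With n = 8 the margin of error is roughly ten times wider than with n = 800, which is why differences measured in groups this small are so easy to attribute to chance.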

The study does not say whether or not this experiment was single-blind for the scientists, but if it was not, that also undermines the credibility of the experiment.

Additionally, the biology of sheep is extremely different from that of humans, so the scientist's hypothesis that humans' sexual preference may also be determined biologically seems like a very big stretch.

Perhaps if the scientists repeated this experiment on a larger scale it would have more validity, especially if it were performed single-blind. Until then, I would rule this one out!

My roommate has been a vegetarian since last year--well, on and off, that is. Every week she changes her mind depending on whether she caves to her cravings for certain foods. My friends and I find ourselves cracking jokes about it constantly: "So what are you this week?" I admire people who are truly 100% vegetarian; they must have serious willpower to never eat meat. I could never become vegetarian because meat has been in my diet since a very young age.

Many Americans live an unhealthy lifestyle (shown clearly by the obesity epidemic), and many people assume that vegetarians live a healthier one. But is this necessarily true? Sure, vegetarians do not eat McDonald's Big Macs or 7-Eleven hot dogs, but that does not necessarily mean the foods they do eat are healthy. Vegetarians could be eating candy all day long, which in the end can be just as unhealthy as meat.

The Washington Post offered me an insight I had never thought about before. Vegetarian diets are often said to be more "natural." But the article offers support for why this is simply not true, and it all goes back to the time when people's only ways to obtain food were to grow their own and hunt animals. Evolution, including how our diets evolved, allowed society to grow and develop, and it started off very basic. These two options were the "natural" way of eating; therefore, the article argues, eating meat is what allowed humans to grow so incredibly.

Archeologists and other scientists argue that our brains would never have been able to develop as they have if everyone had been vegetarian starting millions of years ago. "At the core of this research is the understanding that the modern human brain consumes 20 percent of the body's energy at rest, twice that of other primates." Therefore, meat and cooked foods were necessary to give our ancestors' bodies the nutrients they needed to grow. We need a certain amount of calories to have energy to last us throughout the day. Nowadays, it is "a piece of cake" to obtain those calories! You can even eat all the calories you need in a day in one sitting, say with a McDonald's value meal. But back then, it was not as easy to obtain the calories needed because of the scarcity of resources and food. In that time, a vegetarian would have had a much harder time surviving because there were just so few options of what else to eat. Without meat, they would not have obtained enough calories from vegetables, fruits, roots, etc. to fulfill their nutritional needs.

This is not to say that being a vegetarian is unhealthy. It could very well mean you live a healthier lifestyle, but that is not solely because you do not eat meat. It is how your diet as a whole compares to that of someone who does eat meat. According to the Mayo Clinic, anyone can live healthily and still grow as a vegetarian: children, the elderly, men, pregnant women, etc. The important thing is to know what foods to eat to receive the nutritional intake that is needed.

I knew there were variations of vegetarians, but I never knew to what extent. Apparently, there are 4 kinds. 1) Lacto-vegetarian diets exclude meat, fish, poultry and eggs, as well as foods that contain them. Dairy products, such as milk, cheese, yogurt and butter, are included. 2) Lacto-ovo vegetarian diets exclude meat, fish and poultry, but allow dairy products and eggs. 3) Ovo-vegetarian diets exclude meat, poultry, seafood and dairy products, but allow eggs. 4) Vegan diets exclude meat, poultry, fish, eggs and dairy products -- and foods that contain these products. Are any of you guys vegetarian? If so, which kind would you consider yourself?

On the other hand, there are possible insights as to why it might be healthier to be vegetarian. According to Brown University's Health Education Department, vegetarians have lower heart rates and a lower chance of getting cancer. This could be because meat often has many additives that are not healthy for our bodies. Many times, the meat we eat from restaurants is packaged in factories with contamination or where the meat is not handled properly. All these possibilities suggest that vegetarians may indeed lead a healthier lifestyle; however, this still does not mean their eating habits are "more natural." The question we will never know the answer to is: what would society be like now if people had never eaten meat? Would the world population be much smaller? Would people be much less intelligent because their brains never evolved as far? I suppose the intelligence factor could be tested by comparing someone who has been vegetarian their whole life with someone who has eaten meat their whole life. But there would be too many confounding third variables, such as genetics, education, study habits, lifestyle, and diet.

What diet do you consider "more natural," and why? Both sides can rightfully be argued; there is no right or wrong answer, only opinions.

The Ballot Order Effect

Chances are you have voted for something in your lifetime. Have you ever wondered why we vote the way we do? Why you wrote one name over another on a ballot? Having recently run in an election myself for my sorority, I am now more curious than ever about these questions.

Turns out I'm not alone; there have been many studies done to explain the psychology behind voting. The explanation I will analyze is the ballot order effect. The ballot order effect is supported by the primacy effect phenomenon: "our tendency to remember items at the top of a list better than those in the middle or bottom." In voting, this happens when we irrationally pick the first candidates. This may be because we're bored and don't care that much, or it could be that we are still "processing the information at the top of the list," so other options don't get our full attention. It seems crazy, but a "classically demonstrated" study done by psychologist Solomon Asch in 1946 actually supports this theory. Basically, he gave one group of participants a list of adjectives describing an anonymous person with the positive adjectives first and the negative ones second. The other group was given the negative adjectives first and the positive ones second. When asked to "rate the person," the group given the positive adjectives first "consistently rated the person higher than those who were given negative ones [adjectives] first."

This effect has also been theorized to be a significant factor when choosing a candidate in elections. "Statistical analysis of over 20 years of elections in California show that the so-called 'ballot order effect' may have changed the winner in up to 12% of the primaries." Other advocates of the ballot order effect, such as Jon Krosnick, think it was also responsible for the outcome of the 2008 New Hampshire Democratic primary. "Pre-election surveys in that state showed candidate Barack Obama leading candidate Hillary Clinton by as much as 7%. However, Clinton won the crucial early state." "I'll bet that Clinton got at least 3% more votes than Obama simply because she was listed closer to the top." (Jon A. Krosnick) With elections there is always going to be bias because of spectators' feelings and opinions. Perhaps this is true, but it could also be attributed to Krosnick's possible favoritism toward Obama.

I believe the study done by Asch with the positive and negative ordering of adjectives is very simple for a layman to understand and really makes sense. The fact that the study was done over 50 years ago and is still considered plausible and respected shows that studies trying to support the null hypothesis--that order has no effect--have not been substantial enough. The two questions I have are: how many people were in each group, and would it be beneficial to have a control group given the adjectives in no apparent order?

A marvel of mine has always been how life was created. Was it science and evolution, or the story of Adam and Eve? I tend to believe the more realistic approach is science and evolution, but how have we come so far from a "molecular level"? (ScienceDaily) Although I am still unsure, this scientific article "sheds light" on this "longstanding problem" using "mathematical research." (ScienceDaily)

Scientists have very different hypotheses about how life began, but they agree on two "necessary ingredients": "a network of molecules that have the ability to work together to jumpstart and speed up their own replication." How molecules achieved this is where everyone gets stumped. Two studies have been published by Wim Hordijk and Mike Steel of the University of Canterbury in New Zealand. In the first, from 2005, "mathematical models of simple chemical reactions" were used to show that such "networks" of molecules might form more easily than previously thought. Their new study, which included Stuart Kauffman, a colleague from the University of Vermont, analyzed the "structure of the networks" in the models and found "a plausible mechanism by which they [the networks] could have evolved to produce cell membranes and nucleic acids," among other things necessary for life. (ScienceDaily)

I found this study interesting but vague. The article doesn't go into the details of how the study was conducted, so I tried Googling it and only found another article with the exact same information. I enjoyed reading about it because I am so intrigued by the beginnings of life, but it didn't do much to cement my theories. I still believe science and evolution are responsible, but I will need to read more articles, in-depth studies, and opposing viewpoints to really form an educated hypothesis. How do you think life began? Have you read any compelling articles that go more in-depth with their findings?

National Evolutionary Synthesis Center (NESCent). "Model sheds light on chemistry that sparked origin of life." ScienceDaily, 26 Nov. 2012. Web. 27 Nov. 2012.

How far would you go, and how much money would you spend, to help your dog walk again? A study done at Cambridge University has found that injecting paralyzed dogs with what are basically stem cells from the nose--"olfactory ensheathing cells," as Outside Online puts it--can actually help them regain the ability to walk. Thirty-three dogs were in this study; most of them were injected with these cells, while the others were given a placebo. All of the dogs injected with these cells were able to walk again and had full use of their hind legs. However, as the video points out, this may not work as well in humans. It works in dogs because it allows the hind legs to coordinate with the front legs, and humans are not that simple. So while paralyzed people might gain some movement back, it is not certain that they would regain all of their movement.

This study is, in my opinion, pretty amazing, but I see a few flaws with it. One problem I have is that 33 dogs is a relatively small sample size to draw from, making the conclusions less reliable. Also, the study was done on dogs, not monkeys or rats or pigs, so it's hard to say how relevant it is to humans. Another problem I found is: what happens to the placebo dogs? Is it ethical to give most of the dogs an injection that the researchers hypothesize will give them the ability to walk, and not the others? I'd also like to see a follow-up on the treated dogs to check for long-term effects and whether they maintained the ability to walk. I also believe this study needs to be done on another species, something closer to humans such as monkeys, to see the effects before it is tried on humans. While this is a great advancement, a lot more science needs to be done before any big conclusions are drawn.
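One rough way to quantify the small-sample worry: statisticians use the "rule of three" to bound what a run of all-successes can actually rule out. Since the write-up only says "most" of the 33 dogs were treated, the group size of 23 below is an assumed, illustrative number, not a figure from the study:

```python
def rule_of_three(n):
    """Approximate 95% upper bound on an event rate after n trials with 0 events."""
    return 3 / n

# Even if every one of ~23 treated dogs walked again, a true failure
# rate of up to roughly 13% would still be consistent with the data.
print(round(rule_of_three(23), 2))
```

In other words, a perfect result in a group this small still leaves plenty of room for the treatment to fail in a meaningful fraction of future patients.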

Earlier in this blog period, I wrote a post about flu during pregnancy being possibly linked to the development of autism in the mother's child. It turns out autism isn't the only disorder linked to flu during pregnancy. Studies have also shown that flu during pregnancy puts the fetus at an increased chance of developing schizophrenia later in life.

Much like the study linking the flu and autism, the study linking flu and schizophrenia found an increased risk of being diagnosed later in life. However, the link between maternal flu and schizophrenia seems to be backed up by more evidence than the link with autism. Although correlation does not equal causation and these are not double-blind randomized control trials, this link brings up many interesting questions, not only about the role the flu plays but also about a possible connection in the origins of autism and schizophrenia. As it turns out, there has been some research on this possible link between autism and schizophrenia--both disorders that we still have very much to learn about. As the mentioned article points out, while they are distinctly different disorders, they share many of the same symptoms, such as being socially withdrawn and having difficulties in communication. Also, both disorders seem to have a genetic component, though with much more evidence relating schizophrenia to genes than autism.

So, with all of these factors, that leaves the question: what is the relationship (if there is one) between autism, schizophrenia, and prenatal exposure to the flu virus? It seems that if there is a relationship, it has to do with some effect the virus has on the brain in the very early stages of development, before one is even born. Perhaps, then, autism and schizophrenia are related to the same part of the brain, and the same "abnormalities" cause both disorders, linking them closer than we might know. Maybe the flu virus somehow alters or inhibits the growth of a certain part of the brain responsible for the functions in which autism and schizophrenia cause abnormalities. The other question is: why the flu virus and not other illnesses? We know that both disorders have been linked to a genetic basis. Maybe mothers who are at greater risk of passing on genes associated with autism or schizophrenia are also somehow more susceptible to the flu virus? Or maybe there is an immune system factor here, and disorders like autism and schizophrenia can be "caught," but the vast majority of people's immune systems fight them off? What do you guys think? What are some possible links between mothers experiencing the flu during pregnancy and the development of disorders such as autism and schizophrenia in the child?



Orange juice is one of the main juices everyone has to have in their refrigerator. You can cook with it, and it is really refreshing. It's the perfect mixer for alcohol because it helps dilute the bad taste. There are all different brands of orange juice, and then there is Sunny D.

Sunny D is said to be a "fake" orange juice. Sunny D comes off like it's real juice, and it does this better than most brands. In supermarkets it is often placed next to the real orange juices. Parents and many consumers can easily be deceived.


Sunny D is also priced around the same as regular orange juice, which is a complete rip-off because the ingredients used to make Sunny D are much cheaper. Sunny D is just like any other sugary drink, but because of the way the company markets it, it is perceived as real, which makes it easier for them to sell Sunny D for more.

On the bright side, Sunny D does contain 100% of your daily vitamin C intake, but because of the high sugar, sodium, and carbohydrates, it's still not the healthiest choice.

Some people claim they like Sunny D better because it tastes better, and that's probably because of the sugar. I like both. I would prefer to drink orange juice in the morning and with alcohol, and I can just drink Sunny D during the day.

             What brand of orange juice do you like and why?  Do you drink Sunny D and if so do you like it better? Is it ethical for Sunny D to be marketed the way it is?



Adderall: A PED?

The term PED, short for performance-enhancing drug, has become as popular a term in sports as home run or touchdown. With recent advances in testing for performance-enhancing drugs, the number of positive tests continues to rise, but more recently, the drug many athletes seem to be testing positive for is the ADHD medicine Adderall.

Adderall is mostly used to improve focus, and thus is widely popular among college students cramming for exams. While Adderall is helpful when trying to learn and retain a large amount of information, it can also help you make fast, split-second decisions, which is why it has become popular with athletes.

An article written by A.J. Perez for Fox Sports sheds some light on this new phenomenon sweeping the sports world. In the article, the founder of the Bay Area Lab Co-Operative (BALCO), Victor Conte, talks about the effects of Adderall use by athletes. Conte has a unique perspective on the situation, as he was largely responsible for supplying steroids to athletes during the '90s and 2000s, and the bust of his lab is what blew the cover off the steroid scandal that rocked Major League Baseball. In the article, he states that taking Adderall or similar amphetamines gives athletes an advantage in their ability to read a situation and make the right decision quickly. He also mentions that before his lab was shut down, he recommended drugs like Adderall to players for that very reason. But even now that the BALCO laboratory has been shut down, the use of Adderall among athletes seems to be rising dramatically. Is this because of increased use of the drug? Or does it have something to do with better testing that is able to detect Adderall?

Since the start of training camp this August, 7 NFL players have tested positive for amphetamines, and their explanation has been the use of Adderall. Just last week, Philadelphia Phillies catcher Carlos Ruiz tested positive for amphetamines and admitted it was from using Adderall. Is it possible that all of these athletes are telling the truth and really were using Adderall? Sure. But another article, written by Doug Kyed, offers a different possibility. When the NFL or MLB finds out that a player has failed a drug test, they release a statement that says something like "this player has tested positive for the use of amphetamines." What they don't do is specifically name the drug the athlete tested positive for. In the article, Kyed contacts a former NFL player named Ryan Riddle via Twitter, who stated that he thinks many players are just using Adderall as a public relations cover for what they really tested positive for.

This is an interesting possibility. As I previously mentioned, we as fans have no way of knowing for sure what drug they really tested positive for; all we have is what we are told. One of the players who tested positive earlier this year was New York Giants running back Andre Brown, and he was suspended for 4 games because of it. However, he was able to appeal the decision and show that he had been prescribed the Adderall, and his suspension was revoked. Now, if the other players claimed all they did was take prescribed Adderall, why weren't their suspensions revoked? Is it possible that they really used Adderall as a cover for steroids, because a known positive test for steroids would be extremely damaging to their reputation?

Adderall has become a problem for professional sports. Is it being abused to this extent, or is it being used as a cover for the use of other PEDs? Either way, this is a problem that isn't going away anytime soon.


I'll admit - I'm one of those people who, at the first sign of feeling unwell, will go onto WebMD and check my symptoms. Looking at their homepage right now gives you a glimpse into the type of things people use their website to find answers to, like...

"Is this spot cancerous?"
"Do I have a cold or flu?"
"Why do I have a headache?"

People have probably been using the internet to try to figure out their problems ever since Google first came out, even those with serious medical inquiries. The WebMD site features a Symptom Checker, which people can use in an attempt to self-diagnose. WebMD encourages those who use its Symptom Checker to print out a report if they feel their situation is problematic enough to warrant a visit to the doctor. The site says it shouldn't take the place of real medical care, but some people choose to ignore that warning.


Let's say, for example, that I'm having chest pains and decide to take it to WebMD. The list of possible conditions is very extensive, with problems ranging from asthma to breast cancer, not to mention a ton of conditions with terrifying names that I've never even heard of--costochondritis, atrial fibrillation, bronchial adenoma, cryptococcus... the list goes on. Not only do I still have no idea what's wrong with me, but now I have an even longer list of things I could possibly have. Sure, I could schedule a doctor's appointment, but what's going to stop me from thinking I have some horrible disease? What's going to stop me from convincing myself that I'm probably going to die from these chest pains because WebMD told me I have cancer?

According to this article from the Columbia News Source, some doctors agree with me, saying that too often, patients will use medical websites to convince themselves that their symptoms are the result of the worst possible problem.

However, sites like WebMD could be incredibly beneficial. Let's go back to the chest pain example, and say that I was experiencing these pains a lot. I might not think much of a few pains here and there, and decide that experiencing some discomfort was less of a hassle than scheduling a doctor's appointment. After a few days, I decide to check out WebMD and see the huge list of conditions that could be associated with chest pains. This website just might convince me to seek out medical care, and there's a chance that the chest pains were, in fact, a symptom of a serious problem. 

Using this information, the question everyone must answer for themselves is this: Is self-diagnosing (using, for example, a website such as WebMD) more harmful or helpful in the end for one's wellbeing?

Personally, I use WebMD if I feel like there's something wrong that I can't explain, but I never jump to any conclusions or convince myself that it's anything too serious. The Symptom Checker can be a helpful tool, but only if you take it with a grain of salt and follow up with an actual doctor's diagnosis.

Surprises about CHEWING GUM



Do you chew gum a lot?  I know that I do.  I always chew gum... it's sort of a habit, especially when I'm bored sitting in class or doing work.  I wondered if chewing too much gum is bad for you.  I didn't think I would find many negative things about it, but surprisingly, I found an article that says it can be very bad for you.  Chewing gum has tons of chemicals and artificial sweeteners, including GMO corn syrup, artificial colors and flavors, and chemical sweeteners that most people would not be able to pronounce if they looked at the wrapper.  Sugarless gum has the same "neurotoxic sweeteners" that diet sodas have.

I was also shocked to see that chewing gum can lead to digestive disorders over time.  When people chew gum, their body's digestive enzymes get ready for food that never comes; the chewing tricks the body into thinking it will be getting food.  The digestive enzymes that are supposed to be used for meals get used up, and it is possible there won't be enough enzymes left for when you actually do eat and need to digest.  This is where people can develop digestive problems.  There were no studies cited to back this up, but it seems believable to me.

I also found a lot on how chewing gum can be good for you.  Research funded by the Wrigley Science Institute showed that chewing gum can improve your memory and curb your appetite.  There was also a study done by researchers at the Baylor College of Medicine.  108 eighth-grade math students were put into 2 groups and followed for 14 weeks.  One group chewed gum while doing homework and taking tests, while the other group did not chew any gum.  The students who chewed gum had a 3% increase in their standardized math test scores.  This was a big surprise to me!  So ... is chewing gum good or bad?  Does it do more harm or good for people?  What do you guys think?
I think that a study should be done over many years, with a control group of people who don't chew gum and a group that must chew gum every single day.  Digestive systems would be examined before and after the years of chewing, and weight and appetite should also be measured before and after.


Everybody hears the cliche when someone gets dumped: "My heart hurts." But does it actually hurt? And if it does, is it because they were in love? The point of this blog is to put an end to the mystery and bring out the facts.
     According to an analysis by a Syracuse professor, it depends. That is a very broad answer, but in her analysis she explains how, when you fall in love, you experience a sensation similar to that of using cocaine. You get a euphoric feeling, but it also affects certain areas of the brain.
     There is also another very common cliche: "love at first sight." Researchers found that yes, there is such a thing as love at first sight. So according to this study, do you fall in love with your brain or your heart? According to them, the answer is the brain, but the heart is also involved.
     According to another study, when you see the person you love, your heart starts to beat fast because of adrenaline. According to Dr. Reginald Ho, the brain sends signals to the adrenal gland, which then secretes hormones that travel to the heart and cause it to beat faster.

The final answer to this question is that there is no direct answer. There has not been a lot of research done on this topic, so there is no one final conclusion. Some people think one way and some people think another. What do you think?

Heart attacks

| 1 Comment
They are becoming more common at an earlier age now. Many people think they are immune to heart attacks, so they don't exercise and don't eat properly. However, in the past couple of months I have heard of cases involving people I know whose fathers had sudden heart attacks and died. The misconception is that mostly older men have heart attacks. However, it seems that as time goes on, heart attacks are becoming more and more common at an earlier age.
     First off, we should determine exactly what a heart attack is. So what exactly is it? A heart attack occurs when an artery supplying the heart becomes blocked and blood cannot flow through. When the blood is blocked, heart muscle is either damaged or dies, and the extent of that damage determines whether the person having the heart attack will survive.
     Although heart attacks are common, you should not worry 24/7 that you will fall victim to one. There are many causes of heart attacks:
1. Coronary heart disease
2. Coronary artery spasm
3. Genetics

Although genetics cannot cause a heart attack directly, if your family has a history of coronary heart disease or other conditions such as high blood pressure, you have a higher risk of having a heart attack. So how do you prevent yourself from having a heart attack?
If you have high blood pressure or other heart problems, you can take medicine, eat healthy, and exercise your heart.
There are also many other prevention techniques such as quitting smoking, eating less sodium, and managing stress.
     According to The Heart Foundation:
- Heart disease is the leading cause of death for men and women in the United States.
- This year more than 920,000 people will have a heart attack.
- Half of heart attack victims are under the age of 65.
- By 2020 it is predicted that heart attacks will be the greatest cause of death throughout the world.

Heart attacks are not something you should worry about all day, however it is important to take preventive steps in order to make sure that you do not fall victim to one. Eat healthy, exercise, and go to the doctor regularly so you do not become a heart attack statistic.
   Have you ever heard the saying that farting on a pillow can cause pink eye?  I was never sure if this was true.  My friend's brother farted on his pillow one night and a few days later he got pink eye.  We were not sure if this was just a coincidence or if farting on his pillow actually caused it.
Pink Eye
   Pink eye is medically referred to as conjunctivitis: inflammation of the outermost layer of the eyeball (the conjunctiva).  Usually this is a result of a virus, bacteria, or an allergic reaction.  The difference is that viral or bacterial pink eye is contagious, while pink eye from an allergic reaction is not.  In children, it is most often passed around through hand-eye contact.  For example, a child who has pink eye touches a toy and passes it on to another child; the children may not have touched each other, but they did touch common objects.  Also, if the birth mother of an infant has gonorrhea or chlamydia, the child is likely to get pink eye as well.  The mother can prevent this from happening by being treated before the birth.

Pink Eye Is Not From Farting on Pillows
    The myth that farting on pillows can cause pink eye is NOT TRUE.  Most pink eye is a result of poor hygiene.  Not washing your hands or face daily can allow random forms of bacteria or viruses to enter your body.  We touch our faces a lot with our hands, even when we don't realize it, and unwashed hands transfer harmful substances to the face. If we don't wash our faces often, the bacteria they pick up over the course of a day can build up and turn into something like pink eye.  Also, farts are made up mostly of gases such as methane, and these will not cause pink eye.

The Slim Chances of Pink Eye Occurring From Farting on a Pillow
    There is a very slim chance of pink eye occurring from farting on a pillow.  If a person is not wearing any undergarments or pants and farts on your pillow, certain bacteria may transfer onto your face or into your eye.  For this to happen, the person would have to fart on your pillow with nothing covering them, and you would have to lay your head on the pillow almost immediately.  I say immediately because bacteria die quickly when they do not have a host, and your pillow is not a host for bacteria or viruses.  Yes, farts can carry bacteria onto a pillow, but it will not survive there for long.  About 80 percent of our bacteria live in the large intestine, and if that bacteria comes out in a fart, it could reach the pillow, as I said earlier.
    If the person is clothed, for that matter, you have nothing to worry about, and there is no way you will get pink eye from someone farting on your pillow.  The gases and bacteria cannot transfer through the clothing.

  There haven't been reliable experiments done on this particular question.  However, scientists are confident about pink eye's causes and treatments.
  So, after all of this, if someone farts on your pillow, especially with clothes on, don't be scared; you're going to be okay and pink-eye free.  If someone does think they're funny and farts on your pillow without anything covering their butt, then consider changing your pillowcase just in case.  Also, always wash your hands and face; you never know what bacteria could be on your hands or the things you touch.
  Do you still think that farting on your pillow can cause pink eye?  Has this ever happened to you? Share your stories below! 

Is Diet Soda Really Healthier?

| 1 Comment
I've heard people say over the years that diet soda can actually be more harmful to the human body than regular soda. It seems outrageous to think that a drink designed to be healthier could lead to risks of serious illnesses and diseases. Upon further research I discovered a lot of information on the risks of diet soda, including many articles that directly linked diet soda to diseases like diabetes and an increased risk of stroke. However, I started to think about the article on one of our tests about the life choices we make based on our exercise. That experiment found that those who exercised more rigorously for longer actually lost less weight than those who exercised at only a moderate level, simply because the rigorous exercisers assumed they had more freedom to sit around and eat less healthy foods. These two aspects of diet soda, one being the increased risk of diabetes, strokes, and so on, and the other being the change in lifestyle, were pretty shocking to me.

These days, any type of food or drink mixed with artificial ingredients has its risks. But I never realized the extent of these risks, especially for a soda that is designed to be 'healthier'. I came across a website that outlined 8 major dangers of consuming diet sodas. I won't list them all, as you can read them for yourself, but one stuck out to me more than the others. A commonly used artificial sweetener in diet sodas, known as NutraSweet, can seriously harm the body. The chemical in NutraSweet, aspartame, breaks down into different chemicals once ingested. Most shockingly, aspartame breaks down into methanol inside the body, and that methanol can in turn be converted into yet another dangerous chemical, formaldehyde. With all of these chemical reactions occurring inside the body, neurons in the brain can be shocked to the point of cell death. Never in a million years would I have suspected that drinking a diet soda could kill my brain cells. Livestrong, the website founded by Lance Armstrong, goes so far as to say the long-term risks can include impaired vision. There are plenty of chemicals in these drinks that the average person doesn't know and/or understand. Fortunately, the dangers are now being exposed. But as we've learned already, significance in a data set takes time to appear. Since diet soda has only been around for a limited amount of time, the strong correlations may take more time to surface.

Clearly, there are some direct links between diet soda and health risks, but part of me believes there is more to it than just unhealthy ingredients in these drinks. As previously stated, the effect from the exercise article we read, where more rigorous exercise led to other unhealthy behavior, could apply to diet soda too. It's natural to give oneself a break every now and then. By selecting a 'diet' soda rather than a sugar-filled caffeinated drink, it's reasonable to assume people feel more freedom in the rest of their food choices. A University of Pennsylvania professor at least thinks so in this article on Fox News. He states that "Soda may not be the villain. It may be the other things that people consume in association with diet soda." It certainly makes sense, considering everyone these days is looking for ways to take a shortcut. This is not to say that drinking a diet soda with a salad will magically make you skinnier. Diet soda still presents many risks to the human body, as evidenced by the "8 Dangers of Diet Soda" article. Either way, whether diet soda is directly linked to these dangerous health risks or simply leads to unhealthy decisions, it may not necessarily be the healthier choice.

What does it take to be called a genius? Is it your IQ? Is it your life accomplishments? Or is it when your brain's composition is not that of an everyday human being?  This was the case for the well-known genius Albert Einstein.

While reading the science section of The Washington Post, my hometown newspaper, I stumbled upon an article discussing the differences between Albert Einstein's brain and that of a "non-genius" human. Einstein's is a name we have all heard since grade-school science; a man whose discoveries will forever affect mankind. But I never guessed before reading this article that Einstein's brain was actually biologically different. I just assumed he began to be educated from a very young age, allowing him to develop faster than a normal person.

It seems that intelligence has been highly debated in the science community. The question remains: is intelligence genetic? The answer: in part. In all the years we have gone to school, whether 3rd grade or sophomore year in college, we have all observed at some point which kids in a class are smart and which are not as smart. I am not saying this is always true. Obviously, in big schools like Penn State, classes sometimes have hundreds of students, which does not allow for this kind of observation. But in high school, for example, there were always kids who tried to work with the super smart kids in the class because they knew they would get a good grade. Or you watched and took note of who always handed their homework in on time, who got the best test grades, and who raised their hand a lot with good answers. People just kind of assumed those were the "smart" kids, and the "not-so-intelligent" kids were the opposite.  The environment we are raised in is what determines our futures; our society lets our "intelligence" define us. It is a domino effect: the kids with the best grades and highest SAT scores go to the best schools, which often plays a large role in determining the degree one receives, later determining a career.

Intelligence is in part due to nurture because you learn what you give yourself the time and interest to learn. It is your choice whether you want to excel by learning. You can do the minimum, or you can exceed it and make yourself a rising genius. It is possible to increase your intelligence at any point in life; the question is whether you give yourself the time to do so. Einstein used that time wisely, but he was also born with an advantaged brain, which is the nature side of why his intelligence was greater.

Don't we all wish we were born with genes inclined to having a greater intelligence? With the scientific advances of being able to genetically "create" your baby, you can now make your child the next Albert Einstein!

An international team of researchers at Union College did a study examining genes using genetic and intelligence testing. According to Science Daily, "In nearly every case, the researchers found that intelligence could not be linked to the specific genes that were tested." The researchers found only one gene, very little evidence, connecting intelligence to genetics. However, as someone once said, "Lack of evidence does not equal lack of existence."  According to the scientists, it is possible that this just means the intelligence genes are much harder to detect.

However, for Albert Einstein, there is scientific evidence that his brain was built differently. He is often regarded as the father of modern physics, perhaps most famous for his discovery of the mass-energy equivalence formula E = mc². He agreed that after he died, scientists could preserve his brain to study it. According to The Washington Post, a pathologist named Thomas Harvey did just that when he sliced Einstein's brain into 240 blocks and later into 2,000 thinner slices.

The specimens were given to chosen scientists, although few of them published any observations on them. Unfortunately, many of these specimens have been lost over time. How could they lose Einstein's brain?! Still, the scientists who did publish produced good-quality work. The most easily noted difference about Einstein's brain is that it had different convolutions and folds than that of a normal person.

It was also found that some parts of Einstein's brain had a larger mass of neurons and a higher-than-usual ratio of glia (cells that help neurons transmit nerve impulses) to neurons. Another study, published by anthropologist Dean Falk of Florida State University, concluded that Einstein's parietal lobes had very unusual patterns of folds. The parietal lobes are involved in integrating sensory information, which probably relates to why he was so good at physics.

Fourteen photographs were used to make observations on Einstein's brain. Another team of scientists compared his brain with those of 85 other people and found that Einstein's brain was special in that the folds in its gray matter were noticeably different, according to NBC News. It was also discovered that his frontal lobes, used for planning and abstract thinking, were different.  The theory that a bigger brain means a smarter person does not hold up here, because Einstein's brain was only average in size.  Scientists were able to connect Einstein's brilliance in certain subjects with how those parts of his brain looked.

However, Sandra Witelson of the Michael G. DeGroote School of Medicine at McMaster University, who also studied Einstein's brain, had a completely different observation. She stated that Einstein had an extra fold in his parietal lobe, which is different than the fold simply being shaped differently. Having an extra fold is like having a brain and a piece of another one!  She claims this could have happened while he was in the womb, or it could be genetic. This is an interesting thought.

If intelligence were due entirely to nature, it could be argued that the SAT should be abolished, since the test would be unfair when some people are simply genetically smarter than others. I sure would have loved to sleep in all those Sunday mornings instead of going to 3-hour-long SAT preparation classes.  Fortunately, nurture matters too, so that was not the case!

Do you guys think that your personal intelligence is due mostly to nature or nurture? Why? Do you think you could become a genius if you put your all into it, or do you blame nature for blocking your ability?

Assisted Suicide?


Today, in my sociology class we discussed the topic of euthanasia. Euthanasia, defined as "the painless killing of a patient suffering from an incurable and painful disease or in an irreversible coma", is a very controversial issue in the world today, with many people unsure where they stand.


There are several types of euthanasia: passive, in which medication that is keeping the patient alive is purposely withheld, and active, which is doing something, like giving a lethal injection, that kills the patient. Active euthanasia is the type that receives the most attention and the most controversy.


An article in the Washington Post tells the story of Mars Cramer, and his wife Mathilde, and their decision to use euthanasia to end her life. In the Netherlands, where the Cramers lived at the time, euthanasia is legal as long as the patient is terminally ill with no chance of recovery, and is suffering serious pain. After suffering from cancer for years, Mathilde was injected with a muscle relaxant to stop her heart,  ending her life the way she wanted to, on her terms.



Another article from 2011 tells of two more people using euthanasia to end their lives. The first is an elderly woman in pain similar to Mathilde's, but it also talks about Dan James, a 23-year-old who traveled from Britain to Switzerland to be euthanized after he became paralyzed from the chest down following a rugby accident. I thought this was an interesting tidbit because it opens up a dangerous possibility: even though some countries may continue to ban the use of euthanasia, people can travel to a country where it is legal. It also raises the question of what qualifications are needed to be euthanized.

You could say that Dan James may not have been suffering physical pain from his paralysis, but psychological suffering is a different element. That made me wonder: is psychological pain considered when deciding if someone is suffering? How much pain counts as "suffering"? Is there a certain level of pain needed to be eligible for euthanasia? The fact that there are people like Dan James who feel the need to turn to euthanasia at such an early age is sad, but should it be his right to make that decision if he wants to?


Ladies (or gentlemen): Do you use mascara? Me too, which is why I think this topic is important: it will affect your opinion on a product you use in your everyday life.

Mascara is one of the most commonly used pieces of makeup out there. I don't know about you, but it can really brighten my eyes on a day when I look dead tired. I am sure many of you can agree with that! If you don't have time to do makeup, you can throw a little mascara on, and you are good to go! For the few of you out there who don't know, mascara is a black liquid that comes in a tube, and you use the brush to give your eyelashes volume and length. But after using mascara for years, my eyelashes have seemingly become thinner and shorter. I am curious whether this is because I clean my eye makeup off too roughly or because the mascara is doing harm to my eyelashes. Or is it a combination of both?

After doing some research, I kept stumbling across the term eyelash mites; the phrase makes me cringe as I type! Apparently your eyelashes are a popular place for bacteria to grow, and mascara only worsens the situation. According to Healthy Body Daily, older people are more prone to having eyelash mites because their weaker immune systems allow bacteria an easier entry. You might have eyelash mites if your eyes are itchy or your eyelashes fall out often. This really makes me worry that I might be carrying these tiny parasites!

Apparently the adult mites are 0.4 mm long and have a semi-transparent body that looks somewhat like an ant's: two fused body parts, with about 8 hairs on the first segment. The mites are covered with scales that help them attach to the eyelash, and they have a sharp, pointed mouth that eats away the dead skin cells and hair follicles on the eyelash.

These mites, which grow on the sebaceous glands of the eye, also attach themselves to the nose, forehead, cheeks, and chin, according to Steady Health! I wonder why that is. Our faces are honestly a party for bacteria; we put all sorts of products on our faces, we touch our faces, other people touch our faces, and so on... Maybe these parasites attach themselves to the peach-fuzz hairs on our faces. But do these mites grow on other parts of our body with hair? They are like a mixture of lice and termites! They grow on or in hair and eat away whatever they want.

On certain occasions, I don't wash my eye makeup off because I am too tired. But after even one night of not washing the mascara off, these mites have already started eating at it.  You do not need to wear mascara to get eyelash mites, but it certainly increases the chance that you might have them. Does your mascara start to become dry and clumped up the day after you don't wash it off? It is best just to clean your makeup off the night you wear it, because after that it starts to become crusty and even harder to wipe off.

The female mites are shorter and rounder than the males. They are fertilized and lay eggs on your eyelashes, which eventually hatch and produce mites that lay eggs of their own. Your eyelashes are a breeding ground for eyelash mite eggs! A female can lay up to 20 eggs on one eyelash! I read earlier that each eye can have about 200 eyelashes, but this seems somewhat exaggerated. I think I could look at my eye and count my eyelashes, and there certainly are not 200, let alone 100! Maybe I have fewer eyelashes than the everyday person because I wear mascara, but 200 seems like a lot more than we can see on each eye. Anyway, if we have 200 lashes on each eye and 20 eggs on each lash, that means we could have up to 4,000 eggs on each eye. The eggs can survive up to a few weeks. Hope your eyes are not hurting reading this!
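That 4,000 figure is just simple multiplication of the post's own (unverified) numbers; a quick back-of-envelope check, assuming 200 lashes per eye and 20 eggs per lash as worst-case inputs:

```python
# Worst-case eyelash mite egg estimate, using the post's own
# (unverified) figures: ~200 lashes per eye, up to 20 eggs per lash.
lashes_per_eye = 200
max_eggs_per_lash = 20

max_eggs_per_eye = lashes_per_eye * max_eggs_per_lash
print(max_eggs_per_eye)  # 4000
```

Of course, this assumes every single lash carries a full clutch of eggs, which is why it should be read as an upper bound rather than a realistic count.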

Who knew there were so many small components to things that are seemingly harmless? Remember, you do not have to wear mascara to have eyelash mites. But if you do wear mascara and are losing your eyelashes like me, there are probably reasons other than the mites. Apparently, the amount of usage, the type of mascara, and the application and removal process are all big factors in how mascara affects your eyelashes, according to Consumer Health Digest. It is all pretty simple. The more you use mascara, the more likely it is to negatively affect your eyelashes. The more layers you apply, the more likely your eyelashes are to dry out or clump up. I am about to tell you something that I bet you did not know! Pumping the tube twice is worse for you because it pushes air into the tube, which creates more possibility for contamination. So, only pump the tube once, or your eyelashes become more prone to infection! In addition, many mascaras contain petroleum, which slows down eyelash growth. Your eyelashes could also be falling out because you are allergic to ingredients in the mascara; there are hypoallergenic ones to buy.

But there are so many mascaras for sale; I can imagine it is hard to choose one. Do you use waterproof mascara? I have that now, but the next time I buy mascara, I am not going to buy that kind. Apparently, waterproof mascaras contain a chemical called dimethicone copolyol that sticks to lashes and makes the mascara even harder to remove. Next time I have both waterproof and non-waterproof, I'll have to test which one is harder to take off! Have any of you noticed differences in how mascara comes off depending on brand or style?  The best type of mascara to buy is all natural.

The makeup you use is very important; your face is the only one you have, and it will stay with you through time and reflect how much you have been through! It is worth spending a bit more money now rather than ending up with wrinkles or no eyelashes because you did not wipe your makeup off and gave in to these mites! I am disgusted now, knowing what is really growing on my eyelashes. Truthfully, am I going to stop using mascara? No, but I am going to keep cleaning my makeup off every night and encourage my friends to do so.  I hope this article convinced you of how important it really is to take care of your skin, and of the consequences of neglecting it.

My best friend from high school, Becky, played so many sports and always got injured. She tore her ACL, she broke her wrist, and most of all she has gotten four concussions. A concussion "occurs when a sudden movement or direct force to the head sets brain tissue in motion within the skull." Studies done at Dartmouth and in Philadelphia and Pittsburgh show that each patient experiences concussions differently; some people may show no signs of pain at all while others can be in agony. My dad, for example, plays soccer a few times a week and always gets knocked around and hit in the head, yet he is probably more resilient to the injury than my friend Becky is. Becky had to miss two months of our sophomore year because she had to sit in a dark room to ease the pain of her concussion. The studies in the article say that the best way to help a teenager overcome a concussion is to ease them back into school and play gradually. They say that letting teens rest and do their normal activities at a more relaxed pace will help them adapt better. Playing video games and watching TV will help these teens get back to normal quicker. This makes me wonder why my friend had to sit in a dark room for weeks and couldn't have visitors or go on her computer. Different doctors believe different things, but a study done at Dartmouth suggests that letting teens do what they want will help them get back on track quicker.
Another Dartmouth study, a survey of 145 emergency room workers published in the journal Pediatrics, showed that many doctors didn't use the guidelines or give patients the proper instructions for how to treat a concussion. It worries me that people like my friend have wasted so much time thinking they are recovering when they really are not doing anything to better their health. Perhaps this is why Becky keeps getting concussions: she hasn't healed from the first one, her head is still sensitive, and every time she gets hit her brain gets more and more damaged. Studies are still trying to figure out why some people don't get concussions and others get them very frequently, but I believe it has something to do with the fact that they aren't treated correctly the first time, and also with genetics. Until then, though, people should be careful about all of the impacts they take, especially to their heads, because they can lead to swelling and bleeding in the brain that can cause more than just physical damage.

                Climate change has become one of the most prominent topics in science in recent years.  Scientists have conducted various studies to determine whether it's actually occurring, what causes it, and how to prevent it.  Now that it is known that climate change is in fact happening, more of the focus is on how to stop and prevent it.  There are tons of theories and methods, but one that caught my eye is called cloud brightening.  Cloud brightening involves shooting tiny particles of salt water into the air in order to brighten clouds and make them more reflective of sunlight (Poppick).  This method seems a little strange, and it raises some questions: Will it really work?  Is it affordable and practical?  Are there any risks?  These questions are crucial when it comes to climate change prevention.

                Cloud brightening is a weird term that may throw people off.  It doesn't sound like a very scientific term, and some people probably think it sounds a little childish.  It is a very legit technique, though, and it is a form of geoengineering (Poppick).  Geoengineering is defined as "the deliberate large-scale manipulation of an environmental process that affects the earth's climate, in an attempt to counteract the effects of global warming." (Oxford).  Cloud brightening seems to be one of the cheapest and easiest methods of geoengineering, which is why it is so appealing to scientists (Geoengineering).  The actual machines that spray the particles are a little complex to design, but other than that, cloud brightening is relatively simple.  These particle spraying machines can be mounted on unmanned ships and patrol the oceans on their own.  Here is a picture of what a cloud brightening ship would look like:


There could be some concern about over 1,000 unmanned ships floating around.  They have the potential to cause some trouble with things such as water traffic (Geoengineering).  Other than that, though, cloud brightening seems to be a pretty safe method for reducing climate change.  It is also reversible, meaning that the machines could be shut off and everything would go back to normal in a short time (Poppick).  Cloud brightening will continue to be studied and tested, and maybe one day it will become a major player in the game of preventing and reducing climate change.  The issue is that our world is changing and we have to start taking measures to stop it.  Could cloud brightening become the solution of the future?  We do not know yet, but scientists will continue to conduct research to come up with a solid method for putting an end to climate change.








Works Cited

"Geoengineering." Oxford Dictionaries. N.p., n.d. Web. 26 Nov. 2012.

"Geoengineering." Geoengineering. N.p., n.d. Web. 26 Nov. 2012.

Poppick, Laura. "Could Marine Cloud Machines Cool the Planet?" N.p., n.d. Web. 26 Nov. 2012.


Is the world running out of helium? How did this happen? Why does it matter?

Admit it, you've all done it at least once in your life...sucked in a helium balloon to make a hilarious, squeaky voice. At the time, it seemed harmless, but the truth is that helium is a natural resource, and the world is running out.

I work in a flower shop which also offers balloons as an "add-on" to orders. About a year ago my boss informed me that our balloon prices were basically doubling. When I asked why (because I predicted a lot of questions from customers regarding this change), he replied, "Because of the helium shortage." I had no idea what he was talking about, and that is usually the reaction I still get when I tell customers that a Mylar balloon is now 6 dollars instead of 3. Or they laugh... So not only are we running out of helium, but people seem to be uninformed about the issue, which goes a lot farther than just a rise in prices.


Helium is actually an important resource, not only used to blow up party balloons or help lift blimps. Helium is the most inert of all the elements; it won't burn or react with other elements. Helium comes from underground pockets in the earth that get dislodged by oil and gas drilling (shortage-10031229). To put it into perspective, we'll run out of helium in about 30-50 years.

How did this happen?

In the early 20th century, air travel and airship-based warfare were expected to be heavily reliant on helium. Obviously, that didn't happen. I think the quote below gives a clear insight into what happened.

"As you can tell from the distinct lack of majestic blimps in the sky above you," says Eric Limer at Geekosystem, the future many envisioned in the 1920s "never really panned out." So in the 1990s, the U.S. began selling off its helium surplus, "at a relatively low cost, no less, for things like party balloons."

I don't really see the logic in selling off the helium, because any natural resource that serves a purpose in various aspects of life should not be bargained off. And although it is the second most common element in the universe, it is extremely rare on Earth.

It may be rare on Earth, but there are gas giants in our solar system that are abundant in helium, so why aren't we? According to this article, our helium is literally floating away. Apparently, it is believed that as a planet we may have begun with a lot of helium, similar to Jupiter and Saturn. But our planet's elemental makeup is quite different from theirs; because we are composed mainly of oxygen, iron, and silicon, we have significantly less mass than Jupiter and Saturn. Because our mass is so much smaller, we do not have enough gravitational hold on helium in the atmosphere, which is why helium is not abundant on our planet and why the small amount we do have comes from underground.

Why does it matter?

Below, I've listed a few uses of helium that I believe explain why it matters:


Helium does not become radioactive, which is why it is used for cooling nuclear reactors

Helium is inert and won't react, so it is used as a pressurizing agent for liquid fuel


Medical uses - it is used in heart surgery, it is also needed to conduct an MRI.

A leak detection agent for extremely small leaks

Helium cools to -454 degrees F, which allows scientists to study atoms because they freeze and their vibrations slow down.

And finally, balloons and blimps are the last two reasons, although they are somewhat superficial if you ask me. And scientists are starting to agree: in a BBC news article, many scientists are speaking out about balloons being a "hugely frustrating....(and) absolutely the wrong use of helium" compared to medical procedures like MRI scans that should take precedence.
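That -454 degrees Fahrenheit figure is the most concrete number in the list, so here is a quick sanity-check sketch (plain Python of my own, not from any of the articles) converting it to kelvin, the scale physicists use for near-absolute-zero temperatures:

```python
def f_to_k(deg_f):
    """Convert a temperature from degrees Fahrenheit to kelvin."""
    return (deg_f - 32) * 5 / 9 + 273.15

# The -454 F quoted above works out to about 3 K, just shy of
# absolute zero (-459.67 F = 0 K), which is why atomic vibrations
# nearly stop at that temperature.
print(round(f_to_k(-454), 2))  # 3.15
```

Three kelvin is colder than the boiling point of liquid helium itself (about 4.2 K), which is what makes helium irreplaceable for this kind of cryogenic work.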

How can we fix it?

We can obviously find ways to cut back, and give up our precious balloons.

In the future, we could have the technology to get helium from off of this planet. Helium is a rare gas on Earth, but it is actually abundant in space. The moon, as well as solar winds, holds large supplies of helium. I think there are more solutions to this issue, and we perhaps need to dig deeper into the root of the problem itself: is it over-consumption? Mindless consumption? Or something else entirely? Is there a way out, or a way to acquire helium from the gas giants? There is no doubt in my mind this is an issue, and an issue that concerns more than just balloons, because helium is used in so many ways we do not even realize and perhaps did not even take seriously until now. In class we recently discussed government intervention and the mistrust many people have pertaining to cover-ups; our discussion was mostly centered on extraterrestrials, but it could be applied here as well. Is it the government's fault for thinking we could use the "surplus" of helium that existed in the 1990s for recreation and profit? And if so, is it their responsibility to fix it? What are we going to do in 30-50 years when we actually do run out? I honestly don't know, but I think we need to pay closer attention to the issue and find out before it's too late.

Chapstick or Crackstick?


A phrase my roommate can tell you is constantly spewing out of my mouth as I rush to get ready to go somewhere.  I'm a huge fan of the EOS egg shaped chapsticks.  I have them in every color, and I won't leave the building without one.  If my lips are dry I get irritated and mean until they're perfectly moisturized.  But is there really such a thing as a chapstick addiction?  I looked into a few articles and it's somewhat debated among people.
In the Washington Post's article about it, they claim that there really is no such thing as a chapstick addiction.  People simply like the idea of having hydrated lips, but it's better to hydrate them by drinking water rather than lathering them in wax.  If anything, it's suggested that people addicted to chapstick switch over from the wax kind to the gel kind.  Vaseline is an especially good replacement.


In another article a woman, Amanda Elser, discusses her friend's insane attachment to her chapstick.  After witnessing it for too long, she went to Dr. Zeichner of Mount Sinai Medical Center to get some answers.  One thing that Dr. Zeichner informed her of is that there is such a thing as over-moisturizing the skin (which includes the lips), because it can cause your body to lose the ability to moisturize itself naturally.  This doctor, just like the Washington Post, explains that people need to make sure they're choosing a good product when using lip balm, because that could be the key to not overdoing it.

What do you guys think?  Do any of you find yourselves addicted to chapstick, and if so do you think you'll change the brand you purchase or quit using it altogether?

Late Night Eating

Since returning from Thanksgiving break I've been curious whether or not my eating patterns should be changed.  I tend to eat breakfast and lunch at normal times of the day, but I always find myself eating dinner after 8 o'clock at night.  For years I've been told in health class and by family members that eating after 8 o'clock is bad for your health and can cause you to gain weight.  I never really believed in the statement because I've always been practically nocturnal in my sleep schedule, but I'm starting to think about it now.  After looking through a few online articles, a particular one caught my interest.


On his website, John Alvino wrote an article about his interest in the same topic.  For years he'd listened to other people and kept himself from eating after 7 o'clock at night in fear of gaining weight.  He eventually came to notice that his hunger was more intense later in the evening, so he decided to try eating his dinner at a later time.  After trying this for 12 weeks he saw unbelievable results.  He'd lost body fat!  He's not a doctor, so he can't declare that this will actually work for everyone, but I'd like to take his word for it.  He does close with suggesting that people stop eating at least 90 minutes before bedtime.


What do you guys think?  Is late night eating or snacking a good or bad idea to you?

Bedridden with Mono


I am the unfortunate victim of a college student's worst nightmare: Mono. Mononucleosis, as it is formally called, is not very fun. I am fatigued, my throat hurts, and all around I just do not feel well. I have always heard that it lasts for a while and that it is very difficult to get rid of. This got me thinking. How long does Mono actually last? What can I do about it? What the hell is it?

So I did my research, and now I can definitely give my audience a better explanation than: "it makes you feel like you got hit by a bus... and you can't swallow." Mono is known as glandular fever and, more commonly, the "kissing disease." Transmitted through saliva, it is most commonly caused by the Epstein-Barr virus, which infects B-lymphocytes. These are white blood cells that aid the immune system in combating foreign bodies.

Symptoms include drowsiness, fever, loss of appetite, sore throat, swollen lymph nodes, and a swollen spleen. You cannot cure Mono, only relieve the symptoms. I was prescribed Prednisone to help relieve the swelling and pain in my lymph nodes. On top of that, the doctor told me to get as much rest as possible and drink a lot of water. It is also advised to do as little physical activity as possible, because strenuous activity increases your chance of having the spleen rupture, which is never fun.

Mono can last anywhere from 3 weeks to 3 months, but after you have it, you usually will not get it again. There is another virus that causes Mono, though, which means it is possible to get it twice. Mono is a very unfortunate thing to have, as it keeps you from accomplishing some of the things that you need to do, especially in college. Will we one day have a way of preventing or better treating Mononucleosis?

It sounds silly, but getting your heart broken affects your health in many different ways. It starts as emotional pain and can even lead to physical suffering. This journey begins the second you say your goodbyes. Your adrenal glands start producing adrenaline and cortisol, and your blood pressure starts to skyrocket. The amount of stress hormones released depends on how bad the breakup was: the worse the breakup, the more stress hormones at work.


Another reaction to a breakup is that your brain's activity starts to increase greatly. According to ABC News, "It's like clockwork: Your eyes hit that photo of the two of you and--boom!--awful stomach pit. You feel sick, yet you can't look away. That's because the moment you saw his face, blood started rushing to your brain's pleasure center, the ventral tegmental area. These are all the good times talking. The command center for craving and longing also lights up. It demands attention--one reason you're obsessed with driving by your ex's house, stalking his Facebook page, or trying to replace him with some other satisfier." It's scary how the brain makes us feel such strong, crazy emotions over a person.

According to Dr. Sinatra, heartbreak from a breakup can actually do damage to your heart. He states, "The reason is that heartbreak is a stress that sets into motion your body's natural 'fight or flight' response. When your body enters into this response, the released adrenaline raises your blood pressure and increases your heart rate and breathing. If your body remains in this alarm mode for any length of time, you become prone to stress-producing conditions, ranging from aching neck muscles and headaches to ulcers, allergies, diminished sexual desire, and heart disease."

Heartbreak can be tough. But we need to understand that no relationship is worth physically harming your body through a terrible heartbreak. We should remember to remain strong so little damage is done to our fragile hearts.


Just a warning: this is a very sad story. One of my best friends had a brother who committed suicide many years ago. For a while before that, he had been feeling somewhat down, so the doctor prescribed him anti-depressants. After taking them for a while, he began to not feel like himself anymore. I wish I could describe for you how he felt when he took them, but I shall never know. Anyway, he went back to the doctor and told him he felt strange after taking these pills. The doctor told him to ignore it and continue taking them. Shortly after, one day he was standing on his balcony. His mom and his aunt were talking in the living room. He jumped. They were a few feet away from catching his feet. My friend now hates all doctors, and I don't blame him. He blames doctors for why he no longer has an older brother, and he is right to do so in every sense.

This specific story represents a greater problem that I think needs much more attention in this country. People are deciding that there is something wrong with them way too easily, and doctors are diagnosing people way too easily. This has resulted in millions of Americans taking prescription pills for problems that many other countries do not see as real medical problems. They see malaria and AIDS as real sicknesses, not that someone is hyper or sad. I have so many friends on Adderall and ADD medications, but don't we all have problems concentrating at times? I am sure many of you are prescribed ADD medications. Would not taking these pills drastically change your attention habits? What triggered you to think you had ADD in the first place? A lot of people have a hard time concentrating, but is popping a pill supposed to be a permanent solution, as opposed to teaching you how to concentrate better? In the same sense, what qualifies someone to be on anti-depressants? Is it when someone has been down for a period of time and has found no solution? Are these pills going to fix you?

As with every medication, anti-depressants have many frightening side effects. And as we have seen with my friend, these side effects can at times overpower the original purpose of the medication. Some of these side effects only occur when someone first starts taking the pills, and the body eventually adapts to them. However, many others experience these side effects without realizing that the pills are having a negative effect on them. According to the Mayo Clinic, basic side effects can include nausea, headache, fatigue, tremor, nervousness, insomnia, high blood pressure, dry mouth, sexual problems, and weight gain. I know committing suicide does not fall into any of these categories, but I trust my friend when he says he knows the pill did this to his brother. He said his brother changed when he began taking those pills; he became a different person. In general I do not promote taking medications like these, so I have no trouble believing that these pills could have counteracted their original purpose. As I once heard someone on the street say, and thought it was brilliant: "Lack of evidence does not mean lack of existence." Just because we do not know the full extent of what this medication can do does not mean that it is harmless.

The reason anti-depressants supposedly work is that they affect your brain, which controls your moods and emotions. You might have learned about anti-depressants in Psychology class, because depression is a topic that psychologists deal with. WebMD explains how anti-depressants affect your brain's chemical messengers, which are called neurotransmitters. People who are depressed do not have enough active neurotransmitters in their brain, so these pills boost that activity. While this probably reads well to you, it sounds unnatural to me. I am all for natural remedies and healing. If you are naturally in a state of depression, is it not wrong to unnaturally take you out of that state? It seems as if my reasoning counteracts itself, just as the medication counteracted its purpose for my friend's brother.

Apparently, it can take up to three months for the full benefits of anti-depressants to kick in, but you normally should start feeling "better" after about three weeks. "Also, antidepressants may stop working in a small number of people who have been taking them for a while." Could this be what happened to my friend's brother? "Also, some people with depression don't improve with antidepressants and must explore other treatment options." Or could this possibly be what happened? The pill did not work, so his depression got worse, driving him to suicide. The only thing that conflicts with this is that my friend told me his brother's personality began to change when he started taking the pills.

I have found a statistic that really expresses how I feel about anti-depressants with a number. According to Time Magazine, the Journal of Affective Disorders published an article attributing 68% of the benefit of anti-depressants to the placebo effect. We also learned about this in psychology, if you remember. The placebo effect is when a patient is treated with a fake medication but believes the fake medication is helping them. I remember learning about a placebo experiment in psychology. Two groups were told their energy level would be monitored after being given energy pills. One group was really given the pills; the other group was given fake sugar pills and told they were real. The group given the fake pills acted just as energetic as if they had taken the real pills. They were deceived, but because they were told the pills would have an effect on them, their brains tricked them into believing the pills really were doing something. Could it be the same with anti-depressant pills? Could people become happier because they think that's how they should be feeling since they are on the pill?

However, as Scientific American put it, the debate "revolves around the distinction between clinical significance and statistical significance." They claim that anti-depressants can be good, and can be bad; that they work for some people, while not for others. While this may be true, then those "some people" should not be on the pill because it could be having a negative effect on them.

Either way, if you are on any prescription pills, please be aware of the side effects and try to really think about what effect (positive or negative) the pills have had on you. I know it is a personal topic, but would any of you be willing to share your stories? Have you been on any of these meds? How do you feel compared to before? I don't know about you, but the idea of falsely tricking my brain into thinking something that it naturally is not is worrisome to me.

Over break, I was hanging out at my cousin's house and observing her goldfish. She told me it had been alive for 3 years. The fish barely even moved, and it was in a small, boring fish bowl. I felt bad for the little fella. What a terrible life! Being trapped with nowhere to swim and no companions must be terrible. I said that to my cousin, and she replied that "fish have a memory span of 3 seconds." I found that insane and hard to believe. (I also made a note in my phone to write a blog about it, and here I am.) She explained that goldfish can't remember anything after 3 seconds, so the fish doesn't really know how boring its life is because it can't remember anything! The fish basically never knows where it is. Let's observe the memory, and the general life, of a goldfish.

                 Dory is a very forgetful fish in the movie 'Finding Nemo'
When I searched Google for 'the memory of a goldfish', the second link had the title "Three-second memory myth". The article says that fish can recall information for up to five months. Scientists observed that goldfish trained to respond to a certain sound still reacted to that sound months later in the wild. Fish were also able to respond to certain movements, lights, and sounds. For example, a goldfish would know it was getting fed if it saw a bright red light, and months later, the fish still responded to the light. So, now that I know fish CAN remember past 3 seconds, I feel even worse for these captive goldfish!

On top of having an extremely boring life, that life is short-lived. The longest my goldfish has ever lasted was about a year, which I thought was very long for a goldfish. It turns out that improper care is very common: without proper filtration of the tank, a fish will not survive as long. It also depends on tank size and whether the fish is being over- or underfed. With that kind of care, a goldfish will live 10-15 months. Overall, a goldfish can live anywhere from 5 days to 25 years! The oldest recorded goldfish lived to be 44 years old!!

The life of a goldfish depends on the owner and environment. A goldfish living in the wild will have a natural and satisfying life. But the lone goldfish left in a small fish bowl will most likely die young and unhappy. And if you think they don't know any better, they do, because they can remember things from up to 5 months in the past. Laws have recently been put in place to fight improper care of and cruelty toward goldfish. San Francisco has considered banning the sale of goldfish, and a man was recently convicted of felony animal cruelty for stomping on a goldfish. I am happy that actions are being taken to fight for goldfish, because even though they're such small creatures, you still have to feel for them! 

I am interested to hear how long some of your goldfish have lived. Comment please.

The Twinkie: 1933-2012


On November 21, 2012, the famous Hostess Company officially declared bankruptcy and began to liquidate its assets due to a Bakers Union strike.  A sad day for Ho Ho lovers, Ding Dong lovers, Wonder Bread lovers, Donette lovers, Devil Dog lovers, Yodel lovers, and of course: Twinkie lovers.  I am proud to say that I haven't had more than five Hostess products in my entire life.  However, what is the appeal? Why did people rush to the stores to buy their last boxes of Twinkies? What makes them so popular?

According to a study published in the journal Nature Neuroscience, researchers at the Scripps Research Institute in Florida tested whether rats would prefer to eat high-calorie junk food or healthy food.  The researchers split the rats into three groups: the first group was fed only healthy food, the second group was fed healthy food but had access to junk food for one hour during the day, and the third group was given healthy food but had access to junk food around the clock.  The rats, like many humans, more often than not chose to eat the junk food in excess amounts, and the rats in the third group demonstrated addictive behavior when choosing which food they were going to consume.  The researchers found that the more junk food the rats ate, the more they wanted, because their stomachs and brains weren't satisfied.  In the brains of the rats in the third group, levels of the dopamine receptor that allows for reward significantly decreased as the study progressed, much like the chemicals in a human brain do with continued addiction to drugs.  Another interesting fact about this study is that the rats in the third group were trained ahead of time to be afraid of bright light exposure.  However, when the rats were eating the junk food and a flash of bright light occurred, the rats did not care.  They just kept eating the junk food and showed no reaction to the flash of light.

Dr. Eric Stice, a neuroscientist from the Oregon Research Institute, was featured on CBS's 60 Minutes and reported similar findings.  After reviewing MRI brain scans, he found that sugar activates the same region of the brain that is activated when addicts are using their drug of choice.  He also reported that "heavy users of sugar develop a tolerance to sugar." (Stice)  The fact that people can build up a tolerance to sugar, that sugar activates the same region of the brain as drugs, and that the levels of the dopamine receptors greatly decreased as the study went on are very convincing findings that sugar can be addictive.  Due to the high sugar content in Twinkies, people who are addicted to sugar may use Twinkies as their 'source.'

The nutritional value of Twinkies is pathetic and would certainly fall under the researchers' classification of 'junk food.'  One Twinkie has 2.5 grams of saturated fat, 20 milligrams of cholesterol, 220 milligrams of sodium, 27 grams of carbohydrates, and 19 grams of sugar!! The nutritional facts are just the beginning of it; the shelf life is legendarily long.  According to popular legend, a Twinkie will never go bad, never get moldy, and never taste different.  This is because of all of the chemically made preservatives injected into the batter and the cream filling.
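To put those label numbers in context, here is a rough back-of-envelope calorie estimate using the standard Atwater factors (9 kcal per gram of fat, 4 kcal per gram of carbohydrate or protein). The total-fat and protein figures below are my own assumptions based on roughly the published label values, since the paragraph above only quotes saturated fat:

```python
# kcal per gram: the standard Atwater conversion factors
ATWATER = {"fat": 9, "carbs": 4, "protein": 4}

# Grams per Twinkie. Carbs come from the figures quoted above; fat and
# protein are assumed label values, not stated in this post.
TWINKIE = {"fat": 4.5, "carbs": 27, "protein": 1}

calories = sum(grams * ATWATER[nutrient] for nutrient, grams in TWINKIE.items())
print(calories)  # 152.5
```

That lands close to the roughly 150 calories printed on the label, and it shows most of a Twinkie's energy comes from the carbohydrates, about half of which are straight sugar.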

Even though rats were used in the first study, they were very well controlled.  By splitting the rats into three different groups and setting the first group as a very clear control group, the researchers were able to rule out extraneous variables.  However, they used rats, not humans.  Even though rats are commonly used in studies to prove or disprove effects that certain foods or drugs have on humans, they are still rats.  I don't think that either of the findings presented was due to chance, because they support each other and there are countless other studies out there showing sugar is addictive and can be harmful to our systems in excess.   So, from the research presented about the addictive qualities of junk food and sugar, I am inclined to conclude that people who are (or were) addicted to Twinkies were addicted to their high sugar content, and Twinkies were their outlet.  People are paying over $50 for a box of Twinkies on eBay!  In the long run, maybe the bankruptcy of the Hostess Company will be good for America and our problem with obesity, by eliminating some of the worst snack foods on the shelves.  What are your opinions? Do you think that people can actually be addicted to sugar (i.e., Twinkies)?  Or is it more of just a correlation between brain activity and sugar consumption?  Why does the same part of the brain light up when sugar is consumed as when drugs are consumed by someone who is addicted to them?

**All sources used are in the links within the blog.



I've mentioned it before, but I'll say it again: I love my dog. It's as simple as that. I have a closer and more special relationship with him than I do with most people. I know that sounds weird and sad, and it made me wonder why this is. I know I'm not the only one in this world with such a special and important relationship with a pet. What is it about people that makes them so close with their pets? Is there even a reason? Or is it just an unexplainable connection?

            According to Sarah Wilson, the relationship you have with your pet is determined by your own personality. In her article "Why We Love Cats and Dogs," she gives multiple reasons and examples for why certain people treat their animals a certain way.  Weird as it may sound, I found that I fell into the "Soul Mates" section. She writes, "The Soul Mate [is] Loving, attentive, and empathetic towards their dogs, true Soul Mates value the deep-feeling connections they develop with their pets. Soul Mates are always thinking of their pets' safety and wellbeing, and a Soul Mate's dog is always very well taken care of. Soul Mates are also known to take their dogs on many trips, adventures, and outings. As a result, their dogs are usually very well socialized." My dog, Niko, is a labradoodle (a pretty big breed). Because of this, the rest of my family and I are not able to take him everywhere we go, but I still consider us "soul mates" because I know we would all love to. My dog is 10 years old and still acts like a puppy; he's always one of the most sociable dogs at the dog park. So is this the explanation I've been searching for? Is it all based on who you are specifically as a person? Or is it something innate in the human brain that makes us want and cherish our relationships with our pets?

            Another article explains that, in general, we see our pets as an extension of our family, as our "furry children." The Iams Company spent millions on experiments and tests to figure that out, but it seems pretty obvious to me now. Of course I see my dog as my family, and that is probably why I love him and our relationship so much, just as I do with the rest of my actual, human family. This article explains the progression of the owner-pet relationship and how it eventually evolves into family.

            This article explains that these pet-owner relationships are the ones that we should strive for. It says, "Our relationships with our furry friends have great influence upon our well being and happiness." So, I guess in the end the idea is pretty simple: we need our pets. Whether it be because of our individual personality, or the sense of family, or just because they make us better people. We need them to feel a sense of purpose, a sense of love, and a sense of family. I know that Niko gives me all of these things, and I can only hope that he's feeling the same things in return!


Are my eyes to blame?


Well, the Well Blog from the New York Times has really been striking out for me.  In my last entry, Is "when to send kids to school" a scientific decision?, I examined one of the studies presented in this blog.  After evaluating it, I realized that there was little value to the study even though the title of the article was compelling.  This time, I found an article that was even more compelling for me personally because it examines something that I experienced as a child.

The article, Really? Eye Problems Can Cause Headaches in Children, was interesting because as a child, I suffered from frequent headaches that did decrease after I began wearing glasses regularly.  That being said, I disagree with the conclusion that "vision problems are often blamed for childhood headaches, but in reality, the two are rarely related."  In my personal experience, I have found that the two do have some sort of correlation.  

This brings up the classic problem of causation versus correlation.  The title is very misleading in using the word "cause" because it suggests a causal relationship when the "bottom line" of the article suggests that there isn't even a correlation between the two.  

After reading this article, I felt the need to explore other research on the topic because I was unsatisfied that my personal experience contradicted the results of this study.  My first search led me to the Headaches and Eye Problems page on the Better Health Channel website.  Although this information simply provides conditions and treatments for medical disorders and does not feature specific studies, it did support my personal finding that vision problems and headaches are related.  

The way that the Better Health Channel explains the correlation between vision problems and headaches made more sense to me than the results of the first study.  The site suggested that "the headache is caused by the person squinting and overworking the eye muscles in an attempt to better focus their vision" and that "problems of internal pressure and swelling within the structures of the eye can 'refer' pain into other areas of the head."  The bottom line from this source is that "eyestrain can cause or contribute to recurring headaches."

It seems as though there will always be contradictions in information related to the causes of headaches.  At the very least, it is helpful for headache sufferers (such as myself) to examine potential causes of frequent headaches and make adjustments to see if these problems are at all related to their suffering.  For sufferers of frequent headaches, even a small lifestyle change could be enough to slightly lessen the intensity of their pain, which could greatly improve their quality of life. Therefore, if you do suffer from headaches, it might be worth it to get your eyes checked.

I have heard the phrase spontaneous human combustion (SHC) before, but I never really thought about it deeply until I saw an advertisement featuring it today. Of course, broken down, this phrase literally means for someone or something to randomly burst into flame. This sort of makes me laugh, because it sounds so impossible, but in fact there have been several hundred reported cases. I had never even heard about this phenomenon, but apparently the topic has been actively debated among scientists and other communities. 

According to How Stuff Works, there have been hundreds of cases of people being found burned, yet with no apparent source of flame (i.e., lighters, candles, etc.). Yes, I first assumed that all these people could have been intentionally burned by others. But there were no indications in these instances that someone broke into the houses and other locations where these people died.  This just baffles me! How can someone be burned naturally? Are there components in the body that are fire hazards?

"Spontaneous combustion occurs when an object -- in the case of spontaneous human combustion, a person -- bursts into flame from a chemical reaction within, apparently without being ignited by an external heat source." Many speculate that this phenomenon is real; however, many scientists argue that it is not. What do you guys think?

What is even stranger is that random body parts and clothes were found untouched and unharmed right next to the ashes. The most crucial body parts, such as the head, were destroyed, but others, such as the feet or hands, were left completely intact next to the ashes. Furthermore, in several instances, some of the victims' clothes were also left untouched next to the ashes. These two mind-blowing observations are the biggest indications to me that SHC could in fact be very real.

After reading some more, I found out that not all of the people who are said to have experienced SHC died from it. A few survived, and some of these survivors found strange burns on their bodies with no idea of their source. This leads me to ask why these people are even said to have experienced SHC if they are in a completely different boat than the other group. Although somewhat similar, the outcomes and symptoms are completely different. Unless... on a very far-fetched idea, the people who died from SHC suffered from these burns beforehand, and the burns are what ignited them into ashes!

I think scientists and doctors should be more focused on observing those people who did survive. It could lead to clues that could help to find further information on this wonder. Have these survivors experienced these burns more than once? What places on their body did these burns occur? How did they first appear? How long did it take for them to disappear? Did they hurt? Could anything in their lifestyle be relevant to their burns?

In the 1800s, Charles Dickens published a book called "Bleak House," in which a character named Krook dies from spontaneous human combustion. This is when the phenomenon first started receiving mass attention. In the book, Krook was an alcoholic, and alcohol is flammable. It was speculated that Krook died because he drank so much alcohol that it caused him to go up in flames. This idea was believed for many years.

Scientists and fire experts came together to test the reasoning behind the wick effect, in which a body burns slowly over a long period, fueled by its own fat. According to BBC News, "Using a dead pig wrapped in cloth, they simulated a human body being burned over a long period and the charred effect was the same as in so-called spontaneous human combustion." This wording is a bit confusing, and I question this experiment because the circumstances differ between the two situations. In SHC, the trigger of the flame supposedly comes from within, while in this experiment the trigger was man-made. If the supposed causes are not the same in both situations, it is hard to believe that the outcome of this experiment is relevant to the real cause of SHC.

Cambridge News informed me that Professor Brian Ford conducted the experiment using a pig because it resembles a human body in its level of fat content. After a little petroleum and five whole hours of constant burning, its bones began to disintegrate.  "The scientists believe they demonstrated how a case of spontaneous human combustion can occur through normal processes on a person who has been knocked unconscious." This still does not explain to me why a person's body would start burning in the first place...

Human bones don't disintegrate even at temperatures up to 1,000 degrees Celsius. Bone remains have been found thousands of years later; that is how we have learned about the past. So you are telling me that bones have survived since the beginning of time, but these bones just vanished?!

Other "explanations include balls of lightning or a buildup of methane inside the gut." But forensic scientists have referred back to the wick effect when explaining spontaneous human combustion. They explain that a simple source such as a cigarette can start the burning, with the outside cause having a delayed effect on the body. This would mean that the body can act as its own fuel!

There are so many conflicting explanations of spontaneous human combustion that I do not know what to believe. What do you guys think? Is SHC caused by an internal source? I honestly think it is very possible; there are so many things in this world whose explanations we have yet to discover, and this could be one of them!

Turkey feast = food coma?


My mother stays awake late every night on the computer checking her e-mails and such. My brother stays awake late every night watching movies. My grandma even usually stays up late most nights lounging around the house, peeking into everyone else's room to see what they are doing. But last night, on Thanksgiving, everyone was sound asleep shortly after our feasting was done. It was pretty early for our house; it must have been about 10 PM or so. One would think that everyone goes to sleep early because people eat so much that they go into a "food coma" (passing out into a deep sleep because you are so full). It makes sense. I am sure plenty of you have experienced these! But what about the plenty of overweight Americans who eat in this quantity on a daily basis? Do they go into food comas every day? Or is there something in particular about a Thanksgiving feast that puts one to sleep?

Apparently, there is... and it is the main dish of Thanksgiving. The turkey! Turkey contains tryptophan, one of the twenty naturally occurring amino acids that our bodies use to build proteins. This amino acid is essential, but our bodies cannot produce it on their own, so we must obtain it from outside sources, such as the food we eat. According to Scientific American, "Tryptophan is used by the human body to make serotonin, a neurotransmitter." If you remember from a psychology class you may have taken in high school or at Penn State, neurotransmitters are chemicals that carry signals between nerve cells, and serotonin in particular helps regulate sleep. This is why it is a widely held belief that turkey makes you tired.

However, turkey is not the only food that contains this amino acid. According to Live Strong, meat, fish, nuts, vegetables, and beans all contain tryptophan. Each of these foods contains different levels, though; turkey happens to be the meat with the highest. Finding this out confused me: wouldn't that technically mean we should become tired every time we eat foods containing tryptophan?

Reading on confused me even more. Apparently, of the several amino acids that enter your bloodstream when you eat, tryptophan is among the least abundant. These amino acids all compete to reach the brain first, and tryptophan usually has the least success.

 Therefore, the body rarely delivers this amino acid to the brain on its own, so it could not have a direct effect on the serotonin in your body. A lot of this information is conflicting, which leads me to question whether it really is the turkey that makes people tired. Or could it just be the food coma that I was discussing earlier?

After doing some further research, WebMD gave me the same facts that made me doubt this myth. It would not make sense for turkey to be the sole cause of our tiredness if all these other foods also contain tryptophan. It is the carbs that make you tired! Thanksgiving is full of carbs: bread, cornbread, stuffing, potatoes, pies, etc. So it makes sense that after eating all this, people are bound to feel tired. Thanksgiving is the one holiday a year that is based on food.

I imagine an experiment could be conducted to find this out: isolate the tryptophan from everything around it and give it to someone to test whether it makes them tired. Apparently, this was tested in the 1980s, when tryptophan became the new sleeping pill, according to Live Science.  The supplements were sold in large quantities, but the FDA soon discontinued their sale "after a massive outbreak of an autoimmune disease called eosinophilia-myalgia syndrome." People died, and this was linked back to the manufacturer that made the supplements. However, some scientists still believe otherwise: that the pill itself was having an effect on these people. This is another instance, beyond what we learned in class, of doctors and scientists being wrong, resulting in the loss of people's lives.

Well, I guess that was not the best experiment. Another approach would be for scientists to compare vegetarians and non-vegetarians on Thanksgiving. The two groups could be asked to go to bed and wake up at set times every day for one week, so they start with similar energy levels. Then, on Thanksgiving, researchers would record when each participant started feeling tired and when they actually went to bed. This data would reveal whether the turkey had an effect on the tiredness of the non-vegetarians. There are confounding third variables, but then again there are in many published studies. Some third variables include what else the participants ate that day, their caffeine intake, their normal energy levels, etc. Participants would also need to be of similar ages, because younger people have more energy than older people. Are any of you vegetarian? Did you feel tired after you ate? Were there others around you who ate turkey and wanted to sleep right after? Did they have a lot of other food on their plate? 
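A back-of-the-envelope version of that comparison could look like the sketch below. Every number in it (the group sizes, the 30-minute gap, the spread) is invented purely for illustration; the point is just how the two groups' average bedtimes would be compared:

```python
import random
import statistics

random.seed(0)  # reproducible made-up data

def simulated_bedtimes(n, mean_minutes, spread):
    """Self-reported bedtimes in minutes after 8 PM (hypothetical)."""
    return [random.gauss(mean_minutes, spread) for _ in range(n)]

# Invented assumption: turkey-eaters feel sleepy ~30 minutes earlier on average.
turkey_eaters = simulated_bedtimes(50, mean_minutes=120, spread=40)
vegetarians = simulated_bedtimes(50, mean_minutes=150, spread=40)

# The study's outcome measure: the gap between the groups' average bedtimes.
gap = statistics.mean(vegetarians) - statistics.mean(turkey_eaters)
print(f"Vegetarians stayed up about {gap:.0f} minutes later on average")
```

A real analysis would also have to ask whether a gap this size could arise by chance, which is exactly where the confounding variables listed above come in.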

So, ladies and gentlemen, the myth that you have been told for years has finally been debunked.


Growing up with the same class for 12 years, everyone knew those students who were especially younger or older than all the rest.  I never thought that a student's age relative to their classmates could impact their social and behavioral health.  In a recent article from the New York Times Well Blog, titled "Younger Students More Likely to Get A.D.H.D. Drugs" the idea has been proposed that "in a given grade, students born at the end of the calendar year might be at a distinct disadvantage" and that "the lower the grade, the greater the disparity."  

The study, published in the journal Pediatrics, looked at students in Iceland.  The researchers, led by Dr. Zoega, followed "10,000 students born in Iceland in the mid-1990s, following them from fourth grade through seventh grade, or ages 9 to 12."  The students were divided and evaluated based on the month of the year in which they were born.  

There were several major takeaways from the study:

1.) For children in the fourth grade, "those in the youngest third of their class had an 80 to 90 percent increased risk of scoring in the lowest decile on standardized tests" and "were also 50 percent more likely than the oldest third of their classmates to be prescribed stimulants for A.D.H.D."
2.) Although the disadvantage was largest for the youngest children in the earlier grades, there is still traceable evidence that the youngest seventh graders experienced some disadvantage compared to their classmates.
3.) Gender played a small role: "girls scored higher than boys on tests, and had lower rates of stimulant prescriptions" although there was "still an age effect among girls for both academic performance and the use of A.D.H.D. medication."
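To make a figure like "50 percent more likely" concrete, here is a tiny sketch of the relative-risk arithmetic. The counts are entirely made up; the article reports only the percentages:

```python
# Hypothetical prescription counts (NOT from the study) to illustrate
# what "50 percent more likely" means as a relative comparison.
youngest_prescribed, youngest_total = 60, 1000   # made-up numbers
oldest_prescribed, oldest_total = 40, 1000       # made-up numbers

youngest_rate = youngest_prescribed / youngest_total   # 0.06
oldest_rate = oldest_prescribed / oldest_total         # 0.04

# Relative increase of the youngest third over the oldest third
relative_increase = (youngest_rate - oldest_rate) / oldest_rate
print(f"Youngest third is {relative_increase:.0%} more likely")  # → 50% more likely
```

Note that a large relative increase can still describe a small absolute difference, which is one reason headlines built on such figures can mislead.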

Overall, this article had a very compelling title with little evidence to back its claim.  The Mayo Clinic lists several potential causes of A.D.H.D., including altered brain function and anatomy, heredity, maternal smoking, drug use, and exposure to toxins during pregnancy, childhood exposure to environmental toxins, and food additives.  None of these potential causes relates in the least to a student's age relative to their classmates.  As for the test scores, I have yet to find any other compelling studies that suggest the same.  So what could explain this correlation?  Well, it could be chance, or some other element could be causing a correlation between the two.  Perhaps students who are the youngest in their class are treated differently and come to feel that they are less likely to succeed.  That type of mindset could be affecting their performance as they move through these middle-school years.  

The leader of the research, Dr. Zoega, advises against jumping to conclusions based on the data reported in the study.  Instead, she suggests that "parents and educators should consider a child's age relative to his or her classmates when looking at poor grades and at any behavioral problems."  That being said, the title of the article is extremely misleading.  In my opinion, if the researchers do not see age as a cause for A.D.H.D., but as merely a potential correlation, they should not have specifically mentioned this disorder, especially considering how highly political this issue has become.

Doorways lead to memory loss?


Image credit: Fotovika/Shutterstock

            Recently, I was frustrated by my bad memory. I went to the bathroom with a towel in my hand to take a shower, and then forgot why I was there. 

            Do you have the same problem? Have you ever gone to the kitchen because you were thirsty and then forgotten why you were there? And when you eventually wander back to your room with a glass of water in your hand, do you forget what you were working on? Have you ever walked out your front door and suddenly realized that you can't remember where you parked your bicycle or car? Such events happen too frequently to be due to chance. That made me think, and wonder whether I had early-onset dementia.

            When I searched my worry on the internet, I found a possible answer: the culprit behind our bad memory is the doorway. A team of researchers in the state of Indiana say there's a bona fide, scientific reason for it. A study recently conducted by Dr. Radvansky and colleagues at the University of Notre Dame in Indiana shows that walking through a doorway can make us forget, mostly recent memories.

            In Dr. Radvansky's experiment, the objects that people had to remember were combinations of different items in ten different colours and six different shapes.

            The first experiment was a virtual one on a smallish computer screen. Volunteers used computer keys to navigate their way through 55 virtual rooms, large and small. Each room contained one or two tables, with objects that the volunteers had to pick up, carry to the next room, and set down on a table again. As soon as they picked them up, the objects disappeared. Throughout the test, they were presented with the name of an object and asked if it was the one they were currently carrying or the one they had already put down. The results showed that memory performance dipped markedly once they had passed through a doorway, compared with covering the same distance while remaining in the same room.

            The second experiment, designed to confirm the first, was almost identical except that everything was real: the tables, the items, and the rooms. The result was the same: walking through a doorway into a different room gave the volunteers memory lapses.

            But what was the cause? The doorway itself, or the fact that the room they ended up in was different? The third experiment had the volunteers go through a few doorways and then back into the same room they started from. Returning to the same room did not improve their memory, which leads to the conclusion that walking through a doorway itself triggers the memory loss. Radvansky said, "Recalling the decision or activity that was made in a different room is difficult because it has been compartmentalized."

            The psychologists say that our minds treat the doorway as an "event boundary," signaling the end of one memory episode and the beginning of another.

            In conclusion, walking through a doorway can cause memory loss. But would it be better for us to live in rooms without doorways, so that we never forget anything?

            Would the result be affected by the length of time? In other words, suppose the volunteers entered the second room several minutes later. Would they forget the objects during those minutes between leaving the first room and entering the second? 

            This study is certainly intriguing; however, I think we still need further research to refine this theory.


"It smells like old people."


Let's face it; we all think old people smell weird, we've all said, "It smells like old people in here," and we all know exactly what smell is associated with 'old people.'  Well, on May 30 of this year, Johan Lundström showed that this isn't just a coincidence and that there actually is a specific odor given off by old people.  Lundström is an assistant professor at the Monell Chemical Senses Center in Philadelphia, Pennsylvania, and conducted his research with his international team.  Lundström put pads under the armpits of 44 participants, consisting of young people, middle-aged people, and people over 75 years old, for five nights in a row.  41 volunteers were then asked to distinguish between the smells.  The volunteers were all young people, and they rated the odors they classified as 'old people' as "less intense and less unpleasant" than the odors produced by the pads they categorized as coming from middle-aged and young people. Middle-aged men smelled the worst to the people smelling the pads.  The volunteers were highly successful in identifying the 'old person' odors, but it was harder for them to distinguish the young from the middle-aged odors.  One reason Lundström gave as to why old people smell different is that as humans age, the glands on the skin act differently and secrete different types and amounts of substances, which change the bacteria living on the skin.

Another explanation that Lundström and his researchers offered is that humans can distinguish between the smells that different age groups give off, just as other animals do.  Wild animals can sniff out mates that are young and able to reproduce, and they discriminate against older animals that cannot reproduce like younger ones.  Therefore, maybe as humans, we subconsciously associate old people with a specific smell because our animal brains register that they can't reproduce and further our society.  Lundström controlled as much as he could in the study.  For example, he instructed the participants to use unscented soap and shampoo throughout the five-day period and to refrain from eating potent, spicy foods that lead to different body odors.  He also made sure that the participants did not drink alcohol or smoke during the study, in case that led to a different odor.  

The specific 'old person' smell could come from many different things.  Older people often don't bathe as often as younger people do, and they often don't wash their clothes as much either.  The amounts and kinds of medication they take usually differ from what is typically taken by middle-aged and younger people.  They also have older things in their homes that have sat around for decades longer than those of younger and middle-aged people, with that extra time to collect odors and dust. 

Of course, Lundström's study has some flaws, especially with the controls.  Even though the participants who wore the pads were instructed not to do certain things, there is no way to make sure that they didn't.  Without keeping the participants in a controlled laboratory setting, it is impossible to control every factor that could alter the specific odors they produced.  Also, the people who smelled and categorized the pads were only young people.  Lundström didn't show that old people smell different to all age groups; he just showed that young people think they do.  It would be interesting to see whether a different age group, or groups with people of all ages, felt the same as the participants in Lundström's study.  Lundström confirmed what everyone has been saying all along, but this could be problematic too.  I was unable to find other scientific studies that confirmed what Lundström found, so there is a possibility his results were due to chance.  I am quick to accept his findings because I think old people smell weird.  Is there anyone out there who thinks old people smell normal? Do you think Lundström's results are legitimate? What are your opinions about the smell old people give off? Are there other studies that you are aware of?

**All sources used are in the links within the blog.


In high school, I had a friend who developed a strange rash on his stomach and back. The rash consisted of little red bumps that stayed on his trunk and spread slightly to his arms, but nowhere else. He wasn't allergic to any foods and hadn't changed soaps or detergents before developing the rash. Baffled, he went to the doctor, who looked at the rash and immediately diagnosed him with what the medical community calls pityriasis rosea.

As the article says, this rash is not terribly uncommon in young people and poses no threat. When my friend found out what he had, those of us who were in contact with him naturally wondered how he got it and whether we could also get it. His answer, that no one knows for sure what causes this rash, raised the question I am now posing: what really causes pityriasis rosea?

There is much disagreement in the medical community when it comes to pinpointing what causes this rash. As this particular article states, some believe that it is caused by a virus, while others are convinced that a virus is not the cause. Still others do not rule the possibility out, linking the rash to the HHV6 virus. Given all of this evidence, I believe that a virus is the most probable cause, though there is room for skepticism. Before developing the rash, many patients show symptoms of a common cold, which we know is caused by a virus. Here, there are two possibilities: 1) the HHV6 virus causes the cold symptoms, or 2) the cold somehow triggers a latent HHV6 infection. However, there could also be a third variable present. Perhaps a measure taken while one has a cold (such as a medicine) can somehow spark an outbreak of the virus in some people? Chance could also be at work: it may simply be a coincidence that someone has a cold and about a week later develops pityriasis rosea. No single candidate cause appears consistently among all patients, which makes tests and studies difficult. What do you think? Is it a virus, a third variable, chance, or something entirely different that causes the pityriasis rosea rash?

Cells Made of Metal

Upon first reading this piece on Cells Made from Metal, I was very skeptical. I thought I might have stumbled upon a sci-fi site or a strange conspiracy theory, but this is actually going on!

In class we discussed the chance of life being on other planets and the concept that it may be very different from us. If we were to find extraterrestrial life it may not be the kind of life we would be expecting.

It is traditionally thought that life itself must be carbon based, but a British scientist believes that he has created life from another element. Lee Cronin of the University of Glasgow claims to have done the impossible. He has created what appears to be a non-carbon life form made from metal... 

"I am 100 percent positive that we can get evolution to work outside organic biology," he said.

The "bubbles" (as seen above) that he created have properties of biological cells and are composed of molecules made from metal. Different compartments and layers of bubbles perform different functions, coming together to form something like an organelle.

What is even more fascinating is that the cells are in what appear to be early stages of evolving, opening up the idea for creating even more realistic cells capable of naturally reproducing with a DNA code. 

 The implications of this would be huge, requiring a lot of science to be rewritten. Practical uses could be endless: humans could create organs that would make us half robotic, or even make a completely new organism altogether.

 For now these ideas are merely fantasies, but as Cronin continues his work they could come closer to reality every day. If he accomplishes nothing else, he has opened scientists' eyes to possibilities other than carbon-based life, changing a norm in the scientific community.

If these cells were controllable and functional, what other practical uses could they be put toward? Are aliens more likely to be like the Terminator than the slimy, squid-looking things we like to imagine? 


Poor Construction: The Human Knee

566172_1.jpg

As Penn State students, I assume that most of you were watching Saturday's football game against Indiana. And if you're a true Penn State fan, you were probably just as heartbroken as I was when Michael Mauti went down with an ultimately season-ending injury to his knee. Mauti's injury brought a question to my mind: why is the human knee so poorly constructed?

Everybody knows someone who has some sort of problem with his or her knees. Such widespread problems can be attributed to the anatomy of the human knee: basically, our knees are two bones held together by three bands of ligaments (the ACL, PCL, and MCL). If you're thinking this does not sound like the most stable design for the joints that support the majority of the body's weight, you'd be correct. We know it's a poor design, but the question is why our knees are built like this and why evolution hasn't taken care of it. Not much can be found to answer this question, at least not written in terms that we regular people can understand.

This Wikipedia article I found, discussing how the human skeleton evolved to accommodate bipedalism, states that knee joints became larger as humans began to walk upright, in order to support more weight. However, why the ligaments have not become better protected is not so clear. One reason I believe knee ligaments have not become better protected through evolution is medical technology. In today's advanced world, the need for humans to adapt to their environments in order to survive is not nearly as urgent as it was hundreds of thousands or millions of years ago. Back then, those with stronger knee ligaments would be the ones to survive and pass those genes on, because survival depended on mobility. Today, it seems we have slowed that process, since evolution takes millions of years and the pressure to adapt for survival no longer exists. It is also possible that the knee joint is not better protected because better protection would cost mobility, forcing evolution to pick whichever trade-off is most beneficial. This is a complicated issue that seems like it would take a lot of research to figure out.

Has mobility increased the vulnerability of our knees, or, in reverse causation, has a more vulnerable design increased mobility? And is it possible that with today's technology we are stopping the survival-of-the-fittest process and the evolution of the human species?


Last week I woke up and saw that my roommate had left me a note on my whiteboard: "Last night around 1 you sat up in bed, proceeded to tell me a 'funny' story, laughed, and laid back down." Of course I did not remember doing this, and it is not the first time I have talked in my sleep without remembering it the next day. So I was wondering: what is sleep talking, and why does it happen?

Sleep talking is formally known as somniloquy. It occurs when a person talks in their sleep without being aware they are doing it. It belongs to a group of sleep disorders that occur during non-rapid eye movement (NREM) sleep; other NREM disorders include sleepwalking, sleep eating, and night terrors.

According to an article on Psychology Today online, sleep talking usually occurs during a period of transition from light sleep to deep sleep or vice versa. It is possible to wake up briefly during transition before returning to sleep. Sometimes however the brain is being pulled in two different directions, one half wanting to sleep and the other trying to stay awake. This is when one of the NREM disorders is most likely to occur.

There are several different forms of sleep talking as the talker can say plain gibberish, full sentences or have full blown conversations while being unconscious.

According to the National Sleep Foundation, the type of speech one is able to produce is dependent on what stage of sleep they are in, the lighter the sleep the more complex the speech. When in a deep sleep, the talker is more likely to produce moans and gibberish.
While sleep talking is common in children, it is less frequent in adults and can be brought on by a variety of factors such as "stress, depression, fever, sleep deprivation, day-time drowsiness and alcohol."

According to both sources there are no real cures for sleep talking, only preventative measures. These include, but are not limited to, following a regular sleep schedule, getting enough sleep, and reducing heavy meals and alcohol consumption before bed.

Global warming is arguably the most controversial science topic of our time. People can't agree on whether it is a man-made change or simply the Earth going through its natural cycle. What's not debatable is whether the Earth is warming; it is.


The World Bank recently warned that if the rise in the Earth's temperature is not slowed or stopped, our economy might be at risk. That's right: not the polar ice caps or sea-level tropical paradises, our economy. The report warns that a 4 degree Celsius change could have catastrophic effects. A global temperature increase like the one estimated by the World Bank would lead to "the inundation of coastal cities; increasing risks for food production, potentially leading to higher malnutrition rates; many dry regions becoming dryer, wet regions wetter; unprecedented heat waves in many regions," water scarcity, and more natural disasters (Koebler, 11/18/12). The 4 degree Celsius increase is the worst-case scenario; however, they believe it could happen as early as 2060.

So what does this mean for us? I think it means we need to sit down and seriously discuss the reality we are in. At this point, whether climate change is man-made or natural does not matter; what matters is whether we can find a way to slow and/or stop it. Our current economy still has a long way to go before we fully recover, and Europe is a complete disaster. If the World Bank is correct in predicting food shortages and water scarcity, the world could be in for a real shock. While the developed world would struggle to cope with these problems, developing nations such as China and India, where poverty is already rampant, could be hit even harder. The effect on Africa would be devastating: "It would push up food prices, make 35 percent of farmland in sub-Saharan Africa unsuitable for agriculture, and would extend the ranges of certain diseases" (Koebler, 11/18/12).

Climate change is no longer just an issue of environmental protection; it is now a danger to our economic and financial security. If this report becomes a reality, are we prepared to cope with the problems that will come with it? I don't think we are at this time, but luckily it is not too late. Discussions need to begin at the government level, and eventually at the global level, on ways to combat the rising temperature. Future generations will suffer if we continue to ignore the science before us.


Gigantic muscles or children?


biggest body builder.JPG

It's a known fact: anabolic steroids cause infertility in males.  I don't know about everyone else, but it's just something I've accepted without ever really wondering why, or what the steroids do to cause infertility.  According to Dr. Stanton C. Honig, a professor of surgery/urology at the University of Connecticut School of Medicine and a practicing urologist at Yale Hospital, when a male takes anabolic steroids his testicles are tricked into thinking they don't need to produce testosterone, which in turn leads to an underproduction of sperm.  This happens because when the steroid is injected into the bloodstream, the hormones in the brain that help drive sperm production are not activated and do not send signals to the Leydig cells in the testicles, which produce testosterone.  Therefore, the rest of the body has extremely high levels of testosterone, but within the testicles there is very little.

Dr. Mashe Wald, from the University of Iowa Hospitals and Clinics, also states that hormones within the brain and the follicle-stimulating hormone are affected by anabolic steroid use.  Luteinizing hormone (LH) is the brain hormone that normally activates the Leydig cells in the testicles, and follicle-stimulating hormone (FSH) is the other hormone that contributes to sperm production in combination with testosterone.  When anabolic steroids are taken, all three of these hormones fall out of their proper levels, and infertility is the result.

The explanation of why anabolic steroids cause male infertility seems pretty simple.  To go more in depth, I wanted to know what other side effects can occur for males who take steroids.  Since steroids boost testosterone levels, the most common side effects occur within the reproductive system.  In addition to infertility, many men experience a dramatic decrease in testicle size.  The actual 'breasts' of men also increase in size, not to be confused with the increase in size of the pectoral muscles.  One of the most common non-reproductive side effects is the formation of acne and other skin diseases.  The side effects can even be as severe as heart attacks or strokes, since anabolic steroids also raise blood pressure and cholesterol levels.  Emotional side effects are extremely common as well, such as severe anger issues; men typically become very aggressive due to the increased testosterone in their bodies.

Overall, it is very apparent that anabolic steroid use is extremely dangerous for men.  Not only will users be unable to have children, they are causing long term, irreversible effects.  It seems to me that the evidence is pretty indisputable and harmful side effects are inevitable.  With this in mind why do body builders continue to take steroids anyway? Is it possible for body builders to even be competitive without these drugs anymore?  I only looked into side effects for men, but I'd be interested to know what the side effects in women are as well, considering the increase of women body builders in the last decade or so.


**All sources used are in the links within the blog.

Aliens and the Arecibo Message


message 1.gif

Aliens have always been a topic of interest for me, especially the possibility of having contact with them and of them coming to Earth.  The Arecibo Message that we talked about in class (which contained our DNA, the type of radio waves we use, the elements that make up Earth, a pictorial representation of us, and our solar system) has apparently been replied to in the form of a crop circle.  In August 2001, near a radio telescope in the United Kingdom, a picture that seemed to be a direct reply appeared in a field overnight, with no footprints and no evidence that humans had made it.  The reply added silicon (element 14) to the list of elements that make up their planet, along with an additional strand of DNA, a pictorial representation of them, a description of their solar system, and the kind of radio signal they use.  This is extremely interesting, but of course highly controversial.

The Arecibo Message was sent from the Arecibo radio telescope in 1974 toward the star cluster M13, which is about 21,000 light years away from us at the edge of the Milky Way galaxy.  This message is associated with the SETI effort, which we also learned about in class.  M13 is so far away that the message will take roughly 21,000 years to arrive, since radio waves travel at the speed of light, and by the time it does, the cluster will have moved from its current position as it orbits the galaxy.  According to Cornell University and Dr. Frank Drake, who was part of the team who wrote the message, this wasn't even an attempt to reach aliens; it was just a demonstration of what humans could do with technology.  Given these statements, and the folklore nature of crop circles being created by aliens, it is extremely difficult to believe that aliens actually sent a reply.
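Since radio waves travel at the speed of light, the one-way travel time follows directly from the distance. A trivial back-of-the-envelope sketch using the 21,000-light-year figure cited above:

```python
# A light-year is the distance light covers in one year, so a radio
# message crossing 21,000 light-years takes 21,000 years to arrive.
DISTANCE_LY = 21_000       # distance to M13 cited in the post
SPEED_LY_PER_YEAR = 1      # radio travels at light speed: 1 ly per year

one_way_years = DISTANCE_LY / SPEED_LY_PER_YEAR
print(one_way_years)       # 21000.0 years one way; a reply doubles it
```

So even a civilization in M13 that answered instantly could not have replied before roughly the year 44,000, which is why a 2001 "reply" would have to come from somewhere much closer.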

message 2.gif

However, it is very possible that this reply could be from extraterrestrial life much closer than the M13 cluster.  It is also possible that if extraterrestrial life did intercept the message, their technology is much, much more advanced than ours.  With better technology, they could perhaps have traveled faster than the speed of light, intercepted the message, and sent us the reply that was found in the United Kingdom.  There is also the theory that extraterrestrial life is already on our planet, living amongst us, which could explain both the unexplained crop circles and the reply to the Arecibo message.  It is highly unlikely that we are the only form of life in the entire universe.  There are so many different forms of life on our own planet alone; how can we be the only ones out there?  Do you think the reply to the Arecibo message was from aliens? If so, from where?  If you don't think aliens sent this reply, who did? How were they able to make such an extensive formation in the field without leaving a trace of evidence?

**All sources used are in the links within the blog.

message 4.jpg

message 3.jpg

Why Do We Watch Scary Movies?

Personally I've never grown out of my fear of horror movies and I'm sure many of you haven't either. With every scary movie I've watched, I have found it increasingly difficult to find any sort of benefit in voluntarily frightening myself. However for many the opposite is true. There are plenty of people who enjoy watching scary movies and have even developed a passion for them. But from a logical standpoint, there really should be no reason for people to watch them in the first place. Are there actual benefits in fear? Science has even had trouble answering this question.

When I asked a friend of mine why he enjoyed watching scary movies, he simply replied, "I don't know, they're entertaining." Seems reasonable enough, but watching a car crash is also entertaining; is it rational for people to develop a passion for watching car crashes? Probably not. Struggling to understand why people genuinely enjoy scaring themselves, I went to science to answer my question. It turns out there is no single answer, but here are some theories I found interesting:

Theory #1: It is a way of experiencing fear in a controlled setting
This theory is similar to the benefits of riding a roller coaster. The thrill of riding a roller coaster causes an adrenaline rush, increases heart rate, and causes fear without any real consequences. Being on a roller coaster will make you think you're in danger when you really aren't. Watching a scary movie produces the same effects. But the flaw that I found in this theory is that it fails to answer the question of why exactly we chase fear. 

Theory #2: Conflicting emotions
I found this theory to be very interesting. Researchers Andrade and Cohen state in their theory that people enjoy experiencing positive and negative feelings simultaneously. They argue that some people can be, "happy to be unhappy". It gets more in depth when they say that when people experience a certain amount of "psychological disengagement" or detachment from safety, positive feelings can be the product. Though I personally don't believe this pertains to every scary movie lover. 

Theory #3: For males, "the entrance to manhood is associated with enduring hardship"
This theory uses a very male-oriented ideology that "there's a motivation males have in our culture to master threatening situations." After watching or in this case enduring a scary movie, this theory states that people (typically males) find satisfaction in overcoming fear. Though this could very well be an explanation for some, it doesn't seem very likely that this theory would explain why all people, especially females, like watching scary movies.

Theory #4: It's in your brain
Research shows that fear is not just a biological reaction but an emotion as well. Unlike many animals, which may experience fear and forget about it the next second (squirrels, birds, chipmunks), humans are able to understand and reflect on fear. Interactions between the brain's amygdala and cortex create the emotion of fear, and "once an emotion is aroused, it is difficult for us to turn it off". So maybe our longing for fear is less voluntary and more biological. 

There are many theories out there that attempt to explain this ongoing phenomenon. But I feel as if each theory either has a flaw or it just doesn't apply to everyone. I understand that people may find pleasure in fear but this blog questions why that is so. I would like to hear feedback, criticism, and especially theories of your own. Why do people watch scary movies?

Why is my gas mileage so bad??

cars1.jpg

My car is about ten years old, and I have noticed recently that it does not get nearly the same gas mileage as it did when I first got it three years ago.  The first time I drove it to school, which is about a five-hour trip, I was able to just make it on one tank of gas.  However, when I went home for a weekend and came back, I had to fill up about halfway through.  Even with my lack of knowledge about cars, I knew this was a problem.  What caused this sudden decrease?  I was due for an oil change, so I went and got that done.  The mechanic told me that my tire pressure was at 28 psi when it was supposed to be at 35 psi.  With these two things fixed, I'm hoping I'll be able to make it home without having to stop halfway for gas! Still, are there other things that could cause a sudden decrease in gas mileage?

I learned that basically anything involved in the timing of engine combustion has a lot to do with gas mileage, and one of the main things controlling this is the oxygen sensor.  A bad oxygen sensor can also lead to a "rough idle," meaning the engine struggles when it is just sitting at a stop light, because a bad sensor throws off the proper mixture of fuel and air, the timing of the engine, and the combustion itself. As far as I can tell, my car doesn't exactly sound peaceful when it is sitting at a stop light, to say the least.

How exactly does an oxygen sensor work?  The engine can't burn gasoline without oxygen, and the oxygen sensor measures how much oxygen is in the exhaust so the engine can burn fuel with just the right amount of air.  The ideal (stoichiometric) ratio of air to gasoline is about 14.7:1 by mass.  If the oxygen sensor is bad, two things can happen: too little air (too much fuel) creates a 'rich' mixture, leaving excess fuel after combustion; that unburned fuel is emitted into the environment, can push the car over the allowed emissions limit, and wastes gas.  Too much air creates a 'lean' mixture, which can lead to rough running and bad engine performance in general.  A working oxygen sensor judges when more or less fuel is needed and sends that information to the car's computer, which then adjusts the mixture.
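The rich/lean judgment the sensor supports boils down to comparing the measured air-fuel ratio against the stoichiometric 14.7:1. Here is a minimal Python sketch of that comparison; the `mixture_state` helper and its thresholds are my own illustration, not anything a real engine computer literally runs:

```python
# Rough sketch of rich/lean classification via lambda, the ratio of the
# measured air-fuel ratio to the stoichiometric one. Illustrative only.

STOICH_AFR = 14.7  # stoichiometric air-to-fuel mass ratio for gasoline

def mixture_state(air_mass_g: float, fuel_mass_g: float) -> str:
    """Classify a mixture by lambda = actual AFR / stoichiometric AFR."""
    afr = air_mass_g / fuel_mass_g
    lam = afr / STOICH_AFR
    if lam < 0.97:
        return "rich"   # excess fuel: wasted gas, worse mileage
    if lam > 1.03:
        return "lean"   # excess air: rough running, poor performance
    return "stoichiometric"

print(mixture_state(13.0, 1.0))  # rich
print(mixture_state(14.7, 1.0))  # stoichiometric
print(mixture_state(16.5, 1.0))  # lean
```

The 3% tolerance band is an assumption for the sketch; the point is only that the computer's decision is a comparison against 14.7:1, nudging fuel delivery up or down to stay near it.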

A bad oxygen sensor can decrease gas mileage by up to 40%!  "It is recommended by oxygen sensor manufacturers to get it replaced every 15,000 to 30,000 miles on older cars and at 60,000 miles for newer cars."  I realize I have little to no knowledge about cars, but it's very alarming that a bad oxygen sensor can hurt gas mileage as much as it does.  My car has about 175,000 miles on it, which means the oxygen sensor should have been replaced about six times by now.  I can guarantee you that this hasn't happened!  Over break I am planning on having it done.
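As a sanity check on those numbers, a quick back-of-the-envelope calculation (the 30 mpg baseline and 300-mile trip are hypothetical figures I've assumed for illustration; the 30,000-mile interval is the upper end of the quoted range):

```python
# Back-of-the-envelope check of two figures from the post: how many
# O2-sensor replacements a 175,000-mile car "owes", and what a 40%
# mileage drop means in gallons over a long drive.
ODOMETER = 175_000          # miles on the car
INTERVAL = 30_000           # upper end of the 15,000-30,000 mile interval

replacements_due = round(ODOMETER / INTERVAL)
print(replacements_due)     # 6 -- matching the "about six times" above

BASELINE_MPG = 30           # assumed healthy highway mileage (hypothetical)
degraded_mpg = BASELINE_MPG * (1 - 0.40)       # 40% worse -> 18 mpg
trip_miles = 300            # roughly a five-hour highway trip
print(trip_miles / BASELINE_MPG)               # 10.0 gallons when healthy
print(round(trip_miles / degraded_mpg, 1))     # 16.7 gallons when degraded
```

Under these assumed numbers, a worst-case sensor failure would mean burning roughly two-thirds more fuel on the same trip, which lines up with suddenly needing to refuel halfway home.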

Is anyone really good with cars? What other things could be affecting my gas mileage this much?  Could it have been my tire pressure more so than a faulty oxygen sensor?

**All sources used are in the links within the blog.

     Today truly marks a sad day in the global health community, as the results of a two-decade-long study of the most promising malaria vaccine to date have come back far less than impressive. An article in today's issue of Science outlines its failures (the actual results can be found in the New England Journal of Medicine). But first, let's take a larger look at the vaccine.
     One thing needs to be cleared up before we go on: this vaccine is not intended to be like a smallpox or polio vaccine; it's more like the measles, mumps, and rubella one. There was never going to be a widespread mobilization to inject as many people as possible to eradicate the disease; rather, it was to be given to infants in multiple doses. That being said, the results were still disappointing. The risk of episodes of malaria is lowered by only 31% for babies between 6 and 12 weeks old when given the vaccine. The age noted here is important, and it is the main reason the results are viewed as so disappointing: vaccinating between 6 and 12 weeks is ideal health-wise (the earlier a child is vaccinated, the lower the chance of getting the disease) and the most cost-efficient. As stated in the article, the risk is lowered by considerably more (56%) when the vaccine is given to babies between 5 and 17 months of age, which suggests that starting the vaccination cycle later lowers the risk further. That approach, however, is much more expensive than administering it early, not to mention that 56% isn't exactly an overwhelming number, and while it hints the risk would drop if the cycle started even later, this is of course not certain.
     So what does this mean for the world as a whole? Well, for one, malaria is obviously going to continue to be a very large problem. While there is some promise that a working vaccine of some kind will be created, there remains the question of how long it will take to produce. Even after a working vaccine is produced, the "cost-effective" question remains: if it turns out to be quite expensive, how will it be put to widespread use? Will there be enough money left to develop new vaccines if the disease mutates? While all of these questions seem quite bleak now that the results of this study have come out, there is still some hope.
     This study could still produce positive results. The full results, which won't be ready until 2014, could show progress in lowering the risk of malaria, as well as how long protection lasts, if a booster is given to children 18 months after their third dose of the vaccine. While probably the most expensive option, some positive results would be far better than none. Another bright spot is that severe side effects of this vaccine appear to be relatively rare. I'm sure the global health community, and anyone who wants to see progress against this disease, will be holding their breath until 2014. Do you think we'll ever see an effective malaria vaccine in our lifetimes?

High Time For Hibernation


      Thanksgiving break is around the corner! One of the sweetest things about a weeklong break is sleeping. Yeah, I know it's old-chestnut stuff. Andrew talked about blogging topics today and specifically mentioned that sleep has outshined sex as the sexier topic. I don't intend to bring more headaches to our TAs, who have likely seen a whole lot of blogs with similar discussions of sleep, so today I want to cover something rarely mentioned.

      Winter chimed in at State College before I was prepared for it, and the temperature has grown more and more intolerable these days. What's worse, the heater in our apartment hasn't been working properly: it deserts us at midnight when we most need it and embraces us in the daytime when we can spare it. Compared to just a month ago, falling asleep has become a much harder task. Most alarmingly, almost like a lasting strike inside my body, I find it quite demanding to nudge myself out of that cozy environment in the morning. I even skipped a morning class this week, which I had never before done intentionally. It is a sort of transformation that has dawned on me and demoted me from a morning bird to a tardy snail that craves the snugness of its shell.

Hibernation comic.jpg

Photo Courtesy of The Muslim Observer (TMO).

      I remember vividly how motivated I was in the late summer and early fall when the alarm went off in the morning. Though I did snooze on occasion, it didn't hurt my daily schedule in the slightest. What has happened to my body that so radically altered my sleeping pattern and, even beyond that, shut off my energy outlet in the morning these days?

      First of all, I was interested in why our bodies are so finicky about temperature when it comes to sound sleep. We have all heard tales of poor wretches jerked out of their dreams by a chill (a displaced fleece throw, say) or, more dramatically, of soldiers who died of hypothermia deep in the trenches in the inclement weather of WWII. On the contrary, we are rarely disrupted by warmth during our sleep, which an enclosed space with little air exchange makes possible.

      Thinking about comfortable sleeping temperatures, I recalled some debilitating episodes from when I caught colds at a young age. My mom would cover me with layers of blankets in an attempt to drive my sweat out. This post isn't meant to explain whether toxins are really discharged with perspiration, but the memory pointed me toward the temperature problem: even if we need not sweat hellishly as a cold remedy, do we actually sweat a lot during sleep in a way that helps our body? Were that the case, the requirement of minimal warmth for an undisturbed sleep would be easily justified.

      Do we really sweat during sleep? When I looked online to see what our community has to offer, I was shocked to find sweat treated much more as a bane than as a normal physical activity during sleep. I was flooded with results about hyperhidrosis. According to CNN Health, "hyperhidrosis is excessive sweating that occurs even when the temperature isn't hot and you're not exercising." When I first came across this fresh notion, I thought the scenario described was most likely to happen during sleep. If hyperhidrosis stands out as an undesirable symptom that bothers sleepers, does that imply normal people should not sweat during sleep at all? Is that hypothesis tenable?

Night Sweat.jpg

      In an article by Mark Boyer, he tackles hyperhidrosis again. But instead of coming to the rescue with solutions, he offers some conjectures that dovetail well with my experience: "Nobody likes to wake up in a pool of sweat in the middle of the night...and it's important to look at all of the possible variables. The first and most obvious thing to consider is temperature and overall comfort of the sleeping environment." Since this echoes exactly what happened when my mom tried to cure my cold at night, the observation immediately grabbed my attention. If sweating during sleep, let alone heavy sweating, is essentially bad for our health, a rational person would be strongly inclined to deduce that a normal person does not sweat in bed.

      But without much hesitation, I doubted this idea at once. It is generally accepted as common sense that even though the mind is in a relaxed state at night, our metabolic system keeps running. A wide array of bodily wastes is secreted during this time, keeping our metabolism in equilibrium. If sweating is one of the most apparent manifestations of metabolic activity, why should sweating at night be abnormal? It seems a farcical paradox. Then I asked myself this question: if a warm temperature, which I so hope for from the heater, is not what guarantees an effective sleep and a well-attuned metabolism, why do we look forward to sleeping as warm as possible at night?

      I guess this conundrum can be disentangled rather simply: we do sweat on the surface of our skin during the night, but the amount is negligible on the scale of our perception. It evaporates gradually during sleep and goes totally unnoticed in the morning. This hypothesis clears the mist of the sweating mystery: when the amount is paltry, nobody notices, and the misguided idea arises that we don't really sweat; when the amount is massive, everybody panics, for it defies the dryness we suppose to be normal during sleep.

      Drawing on this discussion, I think one of the most crucial mentalities in scientific study is to handle extremes cautiously. That does not mean science cannot be proved at the extremes (H. pylori causing ulcers, which we studied recently, is one example), but we should think twice before an extremity misleads us into a conclusion. In this case, neither anhidrosis (a lack of sweating) nor hyperhidrosis during sleep is wholesome for our health. A moderate amount of sweat, though not easy to quantify, is what marks you as normal, if not hearty.

      I have just talked about the role temperature plays in our sleeping patterns, but I have missed part of the story. I don't hesitate to tell you that I have been sleeping much more recently as the prelude to a harsh winter reaches State College. There must be some reason behind this. Not surprisingly, the temperature theory won my favor again, because the steep drop on the thermometer could plausibly bring stagnancy to the body, the state we frequently call lethargy.

      The research tells me I was just off the bull's-eye on this one. The independent variable turns out to be not the drop in temperature but the intensity of sunlight. An article I found helped me considerably in understanding this. "The science behind all of this [seasonal change of sleep pattern] involves serotonin and melatonin, two neurotransmitters in the brain. Melatonin is produced when we sleep. Sleeping too much produces abnormal levels of melatonin. The more we sleep, the more we want to sleep because of the increase in this neurotransmitter." This makes great sense as an explanation of my skipping class for more sleep even after a full eight hours, but it doesn't address the seasonal variable. It didn't take me long to reach the following lines: "During the summer, we experience higher serotonin levels. Serotonin is responsible for our mood. When sunlight is in short supply, our serotonin levels fall and we don't have so much energy or so many good feelings." This leaves me in a good position to elaborate further.

      The primary driver of the seasonal change in sleep is as simple as two conflicting variables vying for prevalence. When summer kicks in, massive sunlight exposure spurs the production of serotonin and keeps us highly motivated throughout the day; when winter comes, the shortage of sunlight suppresses serotonin, draining our energy in the chilly weather. In addition, as previously indicated, melatonin accumulates during our sleep. On a dank winter morning, when the assumption that beyond the bed lies a territory of hostility creeps in, we are not in the least encouraged to get up. This reluctance gives melatonin an unrivaled opportunity to overwhelm the serotonin, and we become victims of a chemical overkill in our body's regulatory mechanism.


Photo Courtesy of CNN Health Channel.

      Nevertheless, the inability to get up in the morning is far from the worst situation that ensues after the change of season. Seasonal Affective Disorder, coincidentally carrying the acronym SAD, is what really drags us down. A CNN editorial reveals the underlying factor in its development: "Seasonal affective disorder can be expected in regions of the world that are farther away from the equator and thus experience seasonal changes in daylight hours more dramatically." State College's latitude is well above 40° north of the equator, making it a representative venue for this peculiar climate. That article also suggests that SAD occurs most frequently in women in their 20s, 30s, and 40s. Being a male, I am fortunate, at least for the moment, to be without a trace of SAD symptoms save some difficulty getting up. However, to the best of my knowledge, SAD can lead to depression. Once a victim of the latter, I absolutely do not want it again.

      I have to confess that instead of an open talk about SAD, this blog is more of a gossip about sweating during our sleep. My fellows, have you ever noticed evident sweat on your body when you get up? Are you confused when that happens to you? If not, please tell me your theory of that phenomenon.

      Above all, HAPPY THANKSGIVING!



Criminal Minds


What could possibly drive someone to murder another human being... what if I told you it was out of his or her control? Would you believe me?


In recent years, more and more judges are referring to neurological evidence in court. Scientists have discovered through brain scans that some people's brains are more prone to thinking in a criminal manner than others. In most situations this is just a "sliver" of evidence used in the big picture of the case, and it has been used to argue for lighter sentencing in criminal cases. The explanation and science behind this type of evidence are still being studied, but it became known after a compelling case in 2002.


      In 2002, a 40-year-old Virginia teacher was caught viewing child pornography and making advances on his stepdaughter. He was convicted of child molestation, but the night before he went to jail, he went to the doctor with a crippling headache and confessed he might commit rape. Doctors found something they didn't expect: A brain tumor.

The cancerous tumor was putting pressure on his orbitofrontal cortex, which controls impulse and judgment. The tumor was removed, and the man no longer exhibited pedophilic tendencies. The case was described at the annual meeting of the American Neurological Association. Said one of his doctors: "He wasn't faking."


Ever since Andrew showed us the clip of Jenny McCarthy talking about vaccines and autism, I have started to realize how the media twists and turns our views on scientific evidence. If the article I found had left out this story of the man with the brain tumor, we would simply believe brain scans provide a little bit of evidence in criminal cases, but not enough to deem someone guilty or innocent. If you continue reading the article, you will find that there are "thousands of people with broken looking brains, who act rational." So why are these brain scans becoming so popular when their reliability is still unknown? Because of stories like the one above.

We all know from this class that the 2002 child pornography case could simply be due to chance; not everyone with a brain tumor thinks like a pedophile. Even if these brain scans do provide sufficient evidence, would it be ethical to scan brains before crimes are committed? Or maybe it's reverse causation: only after one commits a horrific crime does one's brain change or a tumor form.

Do you think this could lead to legitimate evidence? Or is this just another powerful anecdote winning over the readers?

Color Blindness

Many people find that color blindness is weird and hard to believe.  People with perfect vision, when it comes to color anyway, cannot really imagine what it is like to either mix-up colors or not see some colors completely.  As a victim of color blindness, I thought it would be good to blog about this.  Color blindness is caused by a problem with the color-sensing granules (pigments) in certain nerve cells in the eye.  These cells are found in the retina near the back of the eye.
If just one pigment is missing, then people usually just have trouble telling red and green apart.  This type of color blindness is the most common.  If a different pigment is missing, then people usually have trouble telling apart blue and yellow, as well as having trouble with red and green.  The most severe type of color blindness is called achromatopsia and it is very rare.  Achromatopsia causes a person to see colors only as shades of grey.  This case of color blindness usually means that a person has very poor vision altogether.

Color blindness is a genetic problem.  My dad is very badly color blind and that is why I am too.  One in ten men is diagnosed with color blindness and it is very rare for women to be color blind; the genes for the red and green pigments sit on the X chromosome, so a woman needs a defective copy on both of her X chromosomes to be affected, while a man needs only one.  My dad and I both have an unusual case of color blindness though.  We find it hard to tell dark colors like purple, blue and dark green apart.  We see them as black.  We also find it hard to tell light colors like yellow, orange and pink apart.  We see those colors as white.  One symptom of color blindness is not being able to tell the difference between shades of the same or similar colors.

People can find out if they are color blind by going to the doctor and getting an eye exam.  There is no known treatment for color blindness; however, there are special contact lenses and glasses that help people distinguish between similar colors.  Color blindness is a lifelong condition, but people who are color blind usually adjust to it without difficulty.  The only real complication with color blindness is that you cannot qualify for certain jobs.  For example, electricians, painters, costume designers, pilots and cooks need to see all colors accurately.

Ishihara color test plates.
As a kid with a severe case of color blindness, I found myself getting made fun of sometimes, and other times people did not believe me.  Some people compare my vision to a dog's.  Lastly, I get asked a lot of questions about telling the difference between certain colors, which I have trouble with.  Color blindness does not make me weird though; it makes me unique and I am proud of it.  Has anyone faced the problems that I have?

We're all familiar with Chicken Little, the children's tale that tells the story of a chicken afraid that the sky is falling. How far off is that little chicken though? Could our sky really be falling, or just deteriorating before our eyes?


The Ozone Layer, a belt of natural ozone gas (a form of oxygen), sits close to 30 miles above the planet Earth in a region called the stratosphere. The job of the ozone layer is to protect the Earth from the sun's harmful UVB radiation. In simpler terms, the ozone layer acts like a layer of sunscreen for the Earth. Unfortunately, this means that without it we get burned.

Scientists and environmentalists argue that the excessive amounts of pollution on Earth are what's causing ozone depletion. An example of this pollution is chlorofluorocarbons (CFCs), chemicals once found in everyday household items like cans of hairspray and aerosol whipped cream.


These CFCs are broken down into chlorine atoms when they reach the upper atmosphere. This chlorine is what's responsible for destroying the ozone layer's ozone molecules. When ozone molecules get destroyed, more UVB rays get let into the Earth's atmosphere, raising the risk of sunburn and skin cancer (a problem often lumped together with "global warming," though the two are distinct). According to the United States Environmental Protection Agency, one atom of chlorine can destroy up to one hundred thousand ozone molecules!
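That striking one-to-one-hundred-thousand ratio comes from the fact that the chlorine atom acts as a catalyst: it is regenerated at the end of each pass and freed to attack another ozone molecule. A standard sketch of the catalytic cycle looks like this:

```latex
\begin{align*}
\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\rightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net:}\quad \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}
```

Because the chlorine atom consumed in the first step reappears at the end of the second, the cycle can repeat over and over until the chlorine is finally locked away in a more stable compound.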

After reading this blog you may be thinking, what does this mean for us? While environmental problems like global warming and ozone depletion are very serious, there is no need to worry that the sky is falling just yet. National Geographic Magazine reports that since these findings in 1996, several countries have entirely banned CFCs. Scientists believe that the amount of chlorine found in the atmosphere is actually declining! It is estimated that chlorine levels will return to normal within 50 years if recent trends in environmentalism continue. What trends have you seen that you think contribute to the environmental protection movement? Do you believe in global warming? I would love to hear how some of you feel about the environmental threats being made to our planet!

Have you thought about why we get bored? Maybe you tried, but ended up bored in the process...

Actually, I think it's possible that there has been a decrease in boredom since social networking came around. I actually find myself mindlessly going on FB or Twitter whenever I feel like I'm about to get bored. But have you ever wondered if captive animals feel boredom?

A recent study from the University of Guelph is the first to empirically demonstrate boredom in confined animals. The researchers presented captive mink with stimuli ranging from appealing treats to neutral objects to undesirable things, such as the leather gloves used to catch the animals.

Half of the animals in the study lived in small, bare cages. The other half lived in large "enriched" cages that were enhanced with water for wading, passageways for running, objects to chew and towers to climb. The researchers found that animals in confined, empty spaces avidly seek stimulation, which is consistent with boredom: they approached stimuli -- even frightening ones (I couldn't find exactly what those stimuli were) -- three times more quickly and investigated them for longer. They also ate more treats, even when given as much food as the mink that were having fun. When they were not being tested, mink in empty cages spent much of their waking time lying down and idle. Science Source and Picture Source

Granted, science can't conclude whether animals feel boredom in the same way as humans because it's impossible to measure those kinds of subjective experiences. The observations of the mink did not surprise me. It's logical that bored people and animals will be more prone to eat more and to find interest in the most insignificant stimuli. I was surprised to learn that there is not much research on boredom even though it is associated with many negative things in society, such as crime, teen pregnancy, and poor health and well-being. Mark Fenske (Guelph neuroscientist and psychology professor) said: "Being able to now study boredom in non-human animals is an important step in our efforts to understand its causes and effects and find ways to alleviate boredom-related problems across species." When I first read this quote I thought, "DUH! Just give animals and people things to do so they won't get bored. That will alleviate boredom." But then I began to hear Andrew's voice and started to ask questions. Why does under-stimulation cause problems? Does extreme boredom in animals cause them other psychological problems like depression? Does this depression cause suicide? Hmmm, maybe my next blog will be on animals and suicide. Isn't that interesting?

As we discussed the possibility of alien life in class, I recalled a newspaper article that discussed sending a response to aliens... A response, however, would require previous contact, which is why I was stunned when I read about The WOW Signal.
In 1977 Jerry Ehman was volunteering for a SETI (Search for Extraterrestrial Intelligence) project at Ohio State's Big Ear radio telescope when something very strange happened. While scanning for deep space radio waves near a star called Tau Sagittarii, over 120 light-years away, he came across a loud narrowband signal that lasted for the full 72-second interval during which it was recorded. On the report Ehman wrote "WOW!" next to the recorded data, giving the signal its name.

What could this be? Nothing manmade had ever been sent toward the remote area of that star, so what was out there sending this signal?

The media and SETI immediately looked toward extraterrestrial life. However, others couldn't resist bringing up other possible explanations. Some speculated that it could be a manmade signal that happened to reflect off space debris, or that interstellar scintillation and atmospheric twinkling caused the phenomenon.

SETI researchers countered these claims with the nature of the signal itself. It was a 1420 MHz signal, which is almost dead on the hydrogen line. Hydrogen is the most abundant element in space and could therefore be the best frequency band for extraterrestrials to use for a strong signal. Also, on Earth the 1420 MHz band is a "protected spectrum," meaning it is only used for astronomical purposes. This made the possibility of a reflected human signal extremely unlikely.
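As a quick sanity check on why radio astronomers prize 1420 MHz, the corresponding wavelength works out to the famous "21 centimeter line" of neutral hydrogen:

```latex
\lambda = \frac{c}{f} = \frac{2.998 \times 10^{8}\ \mathrm{m/s}}{1.420 \times 10^{9}\ \mathrm{Hz}} \approx 0.211\ \mathrm{m} \approx 21\ \mathrm{cm}
```

This is the wavelength emitted by the spin-flip transition of neutral hydrogen atoms throughout the galaxy, which is exactly why the band is kept clear of manmade transmissions.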

Since the discovery in 1977, many have attempted to find the signal again with more powerful, advanced equipment, but all efforts have been fruitless, only adding to the mystery behind it. Where did the signal go, and why was it temporary? Does it have a meaning we can't understand? While many have stumbled around the meaning and source of the signal, action wasn't taken until recently.

More than 35 years after the signal was found, a response is being sent. Regaining the attention of media ranging from NPR to Fox News, the idea is to beam digitized versions of tweets and a Stephen Colbert video about the extraterrestrials toward where the signal originated.

Are we really tweeting @ #Aliens? Could they favorite our tweet or message back? What would they think of us if they saw Colbert and a bunch of tweets that they could not understand? If we were to receive a message that contained a mess of foreign text that no one could decipher, how would we handle it? Only time will tell... For now, what exists 120 light-years away will have to be left up to speculation and imagination.
When I think about growing up and hearing about coffee, one phrase comes to mind: mixed signals. I feel like there is constantly new research and a shifting social opinion on whether coffee is healthy or unhealthy for you. I also feel that people in society hold certain "facts" about coffee that are completely bogus. Personally, my family and I have never been coffee drinkers, which I find is somewhat rare in our society. We have never had health problems, but that is no evidence that coffee is bad for you.


Dr. Patricia Fitzgerald, a nutritionist, allows her patients to keep drinking their daily cup of joe! An article in Clinical Nutrition says that coffee drinkers have a lower risk of diabetes. Other research cited by Dr. Fitzgerald shows that coffee can lower the risk of prostate and skin cancer. Researchers from the University of South Florida and the University of Miami found that the caffeine in coffee can actually delay Alzheimer's disease and lengthen your life. Researchers from Harvard University said that women who drink two to three cups of caffeinated coffee a day have a 15% lower risk of depression than women who don't drink coffee. That same study concluded that regular coffee drinkers have a decreased risk of stroke and heart failure.

Clearly, there seems to be some great health benefits from drinking coffee! I see a lot of pros, but how about the cons?

Mark Hyman, a practicing physician, has personal experience with coffee. He talks about how in college he would use coffee to study for exams, and when he would work long days in the E.R., he would need coffee all day long, like a drug addict. After he quit coffee, he felt reenergized and realized that he had been living on borrowed energy. This is because caffeine is addictive. Roland Griffiths, a professor in the departments of psychiatry and neuroscience at the Johns Hopkins School of Medicine, said that experiments have shown that as little as one hundred milligrams of caffeine a day -- about the equivalent of a cup of coffee -- can trigger a physical addiction, whose withdrawal can lead to headaches, muscle pains and stiffness, lethargy, nausea, vomiting, depressed mood, and irritability.


Beyond addiction, coffee triggers a release of dopamine, which helps you focus, concentrate, and remember. But over time, repeated caffeine use blunts this response, making it less effective. Drinking coffee can increase homocysteine, which raises your risk of heart disease, depression, cancer, and dementia. Homocysteine also depletes vitamins and minerals in your body, including magnesium. Coffee can increase your risk of osteoporosis, liver damage, diarrhea, reflux, and heartburn. It can also increase the risk of stillbirths and iron deficiency in mothers and babies.

Clearly there are pros and cons to drinking coffee and to not drinking coffee. Personally, it makes complete sense to me to stay away from it. Yes, there are some benefits, but there are many negatives! If there were only one negative, I would probably still stay away from it! There is nothing wrong with drinking coffee once or twice a week, but habitual use is dangerous. I think people have to start opening their eyes and realize the consequences and true dangers that coffee presents.

Facebook. Twitter. Reddit. YouTube. All websites that our society "abuses" on a second-to-second basis. Within a 10-minute walk through campus, I'd be willing to bet that half of the people you see are using social media one way or another. We've become so engulfed in social media that a bulk of our free time, and even our work time, involves it. If you were to sit behind all of the students in a lecture hall, you'd think you were at Facebook's headquarters. Society's obsession with this phenomenon is becoming increasingly evident, and I think I know why.


First, I would like to clarify that I personally use Facebook and Twitter regularly (maybe too regularly), so I can vouch for the credibility of my assumptions. So let's break it down: what is social networking? According to this source, social networking is a social instrument of communication. Great, so it's a way that we can communicate with each other through the web. But wasn't something invented in the '70s to do just that? Oh yeah, e-mail. However, modern society often prefers social media over e-mail for communicating with one another. And how could you blame us? Sites like Facebook and Twitter provide fresh, contemporary, and advanced ways of communicating. But our relentless obsession with social media has to be about more than that.

My personal theory for answering this question consists primarily of simple observations. We as humans like to do what is in our best interest; it's simple behavioral economics. Social networking allows people to freely express themselves, or even an alter ego, to the public without facing any of the consequences. In real-world situations we are always directly or indirectly pressured to abide by the norm. There is always that unwritten code of conduct to follow in public settings. For example, it would be unusual and frowned upon to speak loudly and obnoxiously on a quiet bus. The more accepted behavior is to sit and be quiet. The objective is to avoid negative attention, because negative attention often leads to shame and embarrassment, which is obviously not desirable. However, social media allows us to freely exhibit any and all feelings or expressions in a public environment without physically having to deal with the judgment, embarrassment, and/or degradation that we would in reality. It eliminates most if not all of the factors that coincide with the pressures of obeying society's social guidelines.

I've noticed a trend of people, me being one of them, utilizing social media as a tool to publicize how wonderful their lives are and outline the positive aspects of their lives. Facebook and Twitter provide the option to post statements, pictures, comments, invitations, etc., which people use to their boastful advantage. It essentially allows people to rub in the public's face the image of "I'm cool/cooler than you!" This indirect method of showing off gives people a sense of self-satisfaction. The benefits don't even stop there. Facebook allows people to "like" one another's posts and pictures, whereas Twitter allows people to "retweet" or "favorite" others' tweets, all of which serve the same purpose of expressing agreement and approval. Even these features are used for personal benefit; those who provide positive feedback usually receive positive feedback.
The combination of flaunting oneself and receiving positive feedback on said flaunting creates an ongoing cycle with the sole purpose of achieving self-satisfaction. Research even suggests that reward-related areas of the brain release "happiness" chemicals such as dopamine when somebody is pleased or satisfied.

I firmly believe that our obsession with social media is not only because of its power to communicate quickly and efficiently, but mostly because of something a bit more psychological. Have any of you noticed the same things that I did? I welcome all types of feedback.

  Have your parents ever told you that carrots help your eyesight?  Well, they are wrong, because this is a myth.  Carrots contain beta-carotene, a source of Vitamin A, and both matter for eyesight.  But although carrots contain certain nutrients that your body needs for good eyesight, carrots will NOT save your eyes from going blind.
   There are two types of vitamin A: retinoids and carotenoids.  Retinoids are a fatty form of vitamin A that can be found in liver, fish oils and butter.  However, overconsuming these foods will lead to nothing but bad results, such as toxicity and other health problems.  Carotenoids are provitamins that the body converts into Vitamin A.  These include the healthier alternatives such as carrots, sweet potatoes, broccoli, pumpkin, etc.  Even though the body converts these things into Vitamin A, it all depends on how much is already stored in your body.  In simpler words, if your body does not need more Vitamin A because it already has a sufficient amount, then the body does not convert the provitamin.
   Now, after research, I realize where this myth started.  It is not the carrots that are essential for your eyesight; it is the Vitamin A.  A deficiency in this vitamin can lead to night blindness or other eye problems.  Worsening vision at night is one of the first signs of a lack of Vitamin A.  Eating carrots and other sources of Vitamin A can improve your night vision if you are deficient in this vitamin.  If you do not need any more Vitamin A, then these sources of the vitamin will not do anything more for you.
   There was a study done in the late 1990s called the Blue Mountains Eye Study.  This study researched the correlation between Vitamin A and deteriorating night vision in older people.  The researchers found that the people who reported losing their eyesight WERE eating more carrots (just like they were told by their parents).  However, it did not help.  Professor Algis Vingrys from the University of Melbourne's Department of Optometry said, "No amount of carrots will improve your eyesight if you already have a well balanced diet".
   The vision loss, according to the Blue Mountains Eye Study, was caused by age.  The older these people got, the worse their vision got; it had nothing to do with their diet.  This is why no matter how many carrots they ate, their eyesight did not get any better.
   I always give my dog carrots because she is older and I felt it would help with her eyesight.  However, I now realize two things.  One, it is not just carrots that carry the vitamins your eyes need to be as good as they can be.  Two, no matter how many carrots I give her, it won't make her vision any better, because if her body already has the amount it needs, it's not going to convert more.  Also, if her eyesight is deteriorating due to old age, nothing I do will make it any better, just like we saw in the Blue Mountains Eye Study.  Have you ever tried to eat more carrots to make your eyesight better?  Have your parents or anyone you know ever told you about this?  Do you think it is true that once your body has the amount of a certain vitamin that it needs, it stops converting more?

articles used are all in the hyperlinks

Long debated in political and ethical spheres is whether guns kill people or violent people with guns kill people. When you read the question, the answer is probably immediately clear to you because of your set of personal beliefs, but someone else could easily believe the complete opposite. Because of the diversity of opinion on the subject, this potentially unanswerable question has spurred hot debates for decades.

gun pic.jpeg

(Image Credit of the Washington Post)

Following the Aurora movie theater shooting, various blogs and articles were published condemning the ownership of guns, the common reasoning being that owning guns causes people to kill. One blogger claimed that more Americans commit murders because more guns are available for sale. His allegation and others like it are based on data showing that the United States has the most civilian firearms of any nation and a high rate of homicide by firearm. A blogger for the Huffington Post adds that an American is 40% more likely to be killed by a firearm than a Canadian or an Englishman.  Another blogger, Michael Roberts, went as far as to say that owning a gun was the direct cause of the Aurora shooting. Roberts claimed that owning a gun gave the shooter a sense of power and entitlement that led him to act. Both after the shooting and beforehand, many have come to view guns as the cause of violence in the United States.

Contrasting these views, others claim that it is not the guns killing people; it is violent people misusing them. Taking a more controversial position after the shooting, director Michael Moore alleged that it is not guns that kill people; Americans do. His claim was based on the fact that other nations own guns but their homicide rates are not nearly as high. After the tragedy, organizations such as the National Rifle Association, probably trying to protect their own interests, claimed that it was not owning guns that was causing homicides. They claimed instead that violence portrayed in the media led people to commit acts of violence of their own.  Though these views are fairly diverse, they generally agree that owning a gun is not causing the violence.

I think the problem with this debate, and the reason it has continued for so long, is that it is significantly lacking in scientific method. The closest anyone has come to conducting actual science is drawing loose correlations from a string of data. The problem is in the experimentation. How can an experiment be conducted to prove either view valid? It is not ethical to just give people guns and see if they become violent. Because of this we may never know. Are guns forcing our hands, or are some of us naturally violent?