“My goal today is to be better than yesterday, so wait until you see what I do 'tomorrow.'” - Alien Ness

Tuesday, October 19, 2010

The Reality Check Episode 84

Miraculin Tablets + Price Discrimination + Does Eating Late Cause Weight Gain?

Miracle Tablets?
Part 1: 
So, what is this miracle fruit tablet?
The plant is called Synsepalum dulcificum. It produces berries that, once eaten, cause sour foods consumed afterward to taste sweet. According to the few studies done, the effect is due to miraculin, which is used commercially as a sugar substitute. The berry itself has a low sugar content and a mildly sweet tang. It contains a glycoprotein molecule, with some trailing carbohydrate chains, called miraculin. When the fruit is eaten, this molecule binds to the taste buds, and when you then eat sour foods, the molecule makes them taste sweet. No one really knows exactly how it turns the perception of sour into sweet. That's why it's called a miracle fruit tablet.
The berry has been used in West Africa since at least the 18th century, when the European explorer Chevalier des Marchais, who searched for many different fruits during a 1725 excursion to its native West Africa, provided an account of its use there. Marchais noticed that local people picked the berry from shrubs and chewed it before meals.
An attempt was made in the 1970s to commercialize the fruit's ability to turn non-sweet foods into sweet foods without a caloric penalty, but it ended in failure when the FDA classified the berry as a food additive. There were controversial circumstances, with accusations that the project was sabotaged and the research burgled by the sugar industry to prevent the loss of business that would follow from a drop in demand for sugar. The US Food and Drug Administration (FDA) has always denied that pressure was put on it by the sugar industry, but has refused to release any files on the subject. Similar arguments are noted for the FDA's regulation of stevia, now labeled as a "dietary supplement" instead of a "sweetener".
For a time in the 1970s, US dieters could purchase a pill form of miraculin. It was at this time that the idea of the "miraculin party" was conceived. Recently, this phenomenon has enjoyed some revival in food-tasting events, referred to as "flavor-tripping parties" by some. The tasters consume sour and bitter foods, such as lemons, radishes, pickles, hot sauce, and beer, to experience the taste changes that occur.

Part 2:
What is Price Discrimination? 
Price discrimination or price differentiation exists when sales of identical goods or services are transacted at different prices from the same provider. In a theoretical market with perfect information, perfect substitutes, and no transaction costs or prohibition on secondary exchange (or re-selling) to prevent arbitrage, price discrimination can only be a feature of monopolistic and oligopolistic markets, where market power can be exercised. Otherwise, the moment the seller tries to sell the same good at different prices, the buyer at the lower price can arbitrage by selling to the consumer buying at the higher price, but at a slight discount. However, product heterogeneity, market frictions or high fixed costs (which make marginal-cost pricing unsustainable in the long run) can allow for some degree of differential pricing to different consumers, even in fully competitive retail or industrial markets. Price discrimination also occurs when the same price is charged to customers who have different supply costs.
The effects of price discrimination on social efficiency are unclear; typically such behavior leads to lower prices for some consumers and higher prices for others. Output can be expanded when price discrimination is very efficient, but output can also decline when discrimination is more effective at extracting surplus from high-value users than at expanding sales to low-value users. Even if output remains constant, price discrimination can reduce efficiency by misallocating output among consumers.
Price discrimination requires market segmentation and some means to discourage discount customers from becoming resellers and, by extension, competitors. This usually entails using one or more means of preventing any resale: keeping the different price groups separate, making price comparisons difficult, or restricting pricing information. The boundaries set up by the marketer to keep segments separate are referred to as a rate fence. Price discrimination is thus very common in services, where resale is not possible; an example is student discounts at museums. Price discrimination in intellectual property is also enforced by law and by technology. In the market for DVDs, DVD players are designed - by law - with chips to prevent an inexpensive copy of a DVD (for example, one legally purchased in India) from being used in a higher-priced market (like the US). The Digital Millennium Copyright Act has provisions to outlaw circumventing such devices, protecting the enhanced monopoly profits that copyright holders can obtain from price discrimination against higher-priced market segments.
Price discrimination can also be seen where the requirement that goods be identical is relaxed. For example, so-called "premium products" (including relatively simple products, such as cappuccino compared to regular coffee) have a price differential that is not explained by the cost of production. Some economists have argued that this is a form of price discrimination exercised by providing a means for consumers to reveal their willingness to pay.
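To make the segmentation logic concrete, here is a minimal numeric sketch (all prices and buyer counts are hypothetical, not from the episode) of why a seller with a workable rate fence can out-earn any single uniform price:

```python
# Two buyer segments for an identical good (hypothetical numbers).
segments = {
    "students": {"count": 100, "willingness_to_pay": 5.0},
    "adults":   {"count": 100, "willingness_to_pay": 12.0},
}

def revenue_at_uniform_price(price):
    """Revenue with one price for everyone: a segment buys only if the
    price does not exceed its willingness to pay."""
    return sum(s["count"] * price
               for s in segments.values()
               if s["willingness_to_pay"] >= price)

# Best single price: either low enough for everyone, or high for one segment.
candidate_prices = [s["willingness_to_pay"] for s in segments.values()]
best_uniform = max(revenue_at_uniform_price(p) for p in candidate_prices)

# With a rate fence (e.g. requiring student ID), each segment can be
# charged up to its full willingness to pay.
discriminated = sum(s["count"] * s["willingness_to_pay"]
                    for s in segments.values())

print(best_uniform)   # 1200.0 -> charge 12.0 and sell to adults only
print(discriminated)  # 1700.0 -> the fence extracts more total revenue
```

This is also why the arbitrage condition above matters: without the fence, students could resell to adults at any price between 5 and 12, and the two-price scheme would collapse back toward a single price.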
Science Myth of the Week:
Does eating at night make you fat?
It is simply not true.
Read the facts here: Festive medical myths

The Reality Check Episode 83

Easter Origins + Vitamin C for the Common Cold + Brain Calorie Loss

Easter Origins!
Part 1: 
Easter Origins?
Easter Origin - A One-time Event
Easter origin, as a Christian holiday, can be found in the pages of scripture itself. Matthew, Mark, Luke and John, all followers of Jesus, offer their own unique eyewitness accounts of the crucifixion and resurrection of Jesus Christ. It is this culminating event of Christianity that is celebrated on Easter Sunday every year.
Easter Origin - The Resurrection of Jesus Christ
Easter origin actually began as a part of the Jewish Passover, as Christ was crucified and resurrected during Passover week. Christ is believed by Christians to actually be the Passover Lamb spoken of in Exodus, for He Himself became the perfect, sinless sacrifice for the sins of all people. Jews who chose to follow Christ then honored this day in succeeding years during the Passover season, but as Christianity was spread throughout non-Christian nations, the celebration of Easter was gradually combined with pagan "rites of spring" traditions. Modern celebrations are the result of this compromise. At the same time, Easter is often the only day that many people attend church and are introduced to the "Good News" of Jesus Christ.
Easter Origin - Christ Revealed in the Jewish Passover
Easter origin can be traced to the Passover ceremony itself. Christian scholars believe that the Old Testament is Christ concealed, while the New Testament is Christ revealed. Let's hold the elements of the Passover up to the light of the life of Christ. By tradition, the lamb to be sacrificed during the Passover was selected four days before the sacrifice was to be made. Jesus rode into Jerusalem four days before He was crucified. The lamb was customarily slain at 3 p.m. on Passover. Jesus uttered the words "it is finished" and died on the cross at 3 p.m. (this is known traditionally as Good Friday, but many Bible scholars have determined the crucifixion to be on a Wednesday or Thursday). The festival of Unleavened Bread began at sunset. One of the rituals involved the sacrifice of a grain offering, representing the first fruits of the harvest. Jesus, according to the Apostle Paul, became the first fruits of those raised from the dead. During the Passover dinner, three matzahs are put together. Christians see these matzahs as representative of the Father, Son and Holy Spirit. The middle matzah is broken, as Christ said at the Last Supper, "This is My body, broken for you." The middle matzah is also striped and pierced, as Jesus was during His crucifixion, and as was prophesied in Isaiah 53:5, Psalm 22:16 and Zechariah 12:10. This matzah is then wrapped in a white cloth and hidden, just as Christ was wrapped in linen and laid in the tomb.
Easter Origin - The Biblical Accounts
Easter (also known as Resurrection Day), is the event upon which the entire Christian faith hinges. Paul, once a Jewish leader hostile to Christians, became a convert when he met Jesus on the Road to Damascus. As an eyewitness of Christ, Paul made it abundantly clear that without the resurrection, there is no basis for faith in Christ: Now if Christ be preached that he rose from the dead, how say some among you that there is no resurrection of the dead? But if there be no resurrection of the dead, then is Christ not risen: And if Christ be not risen, then is our preaching vain, and your faith is also vain.
When Christ was born, He fulfilled a number of Old Testament prophecies concerning the Messiah. By the time of His crucifixion, resurrection and ascension, He had fulfilled more than 300 of them. These numbers alone provide staggering evidence that Jesus Christ was the promised Messiah. So it is with good reason that Christians the world over regard Easter as a very special event. But in the early days of the church, most Christians were Jewish converts. Because Jesus was crucified and rose again during the Passover season, their celebration of Christ's resurrection was acknowledged during that annual observance of the deliverance from bondage in Egypt. Christian Jews consider the Passover to be symbolic of the time when Christ set all believers free from the penalty of sin and death.
Easter Origin - What Does the Resurrection Mean to You?
Easter origin? Can a man who claims to be God and then rises from the dead actually be God in human form? Is He someone you should follow? C.S. Lewis asked those same questions and came to the conclusion that there are only three possibilities. Jesus Christ claimed to be God. Therefore, to say He is just a "good man" or "great teacher" is to call him a liar. Any sane person who would claim to be God, but who in fact, is not, must then be a madman - a lunatic! If Christ is neither a liar nor a lunatic, then there is only one other possible conclusion - He must be the Lord! If He is the Lord, what does Resurrection Day mean to you?

Part 2:
So, does Vitamin C help with colds? 
At the very first sign of cold symptoms, many people reach right for a bottle of vitamin C supplements. Vitamin C for the common cold is such a widely accepted treatment that we seek it out in lots of products, such as fortified juices, cough drops, and tea.
Vitamin C was first touted for the common cold in the 1970s. But despite its widespread use, experts say there's very little proof that vitamin C actually has any effect on the common cold.
What is vitamin C?
Vitamin C is an important vitamin and antioxidant that the body uses to keep you strong and healthy. Vitamin C is used in the maintenance of bones, muscle, and blood vessels. Vitamin C also assists in the formation of collagen and helps the body absorb iron.
Vitamin C is found naturally in vegetables and fruits, especially oranges and other citrus fruits. This key vitamin is also available as a natural dietary supplement in the form of vitamin C pills and vitamin C chewable tablets.
Can Vitamin C Prevent or Treat Cold Symptoms?
Vitamin C has been studied for many years as a possible treatment for colds, or as a way to prevent colds. But findings have been somewhat inconsistent. Overall, experts have found little to no benefit for vitamin C preventing or treating the common cold.
In a July 2007 study, researchers wanted to discover whether taking 200 milligrams or more of vitamin C daily could reduce the frequency, duration, or severity of a cold. After reviewing 60 years of clinical research, they found that when taken after a cold starts, vitamin C supplements do not make a cold shorter or less severe. When taken daily, vitamin C very slightly shortened cold duration -- by 8% in adults and by 14% in children.
But researchers found the most effect on people who were in extreme conditions, such as marathon runners. In this group, taking vitamin C cut their risk of catching a cold in half.
So what does all this mean?
The average adult who suffers with a cold for 12 days a year would still suffer for 11 days a year if that person took a high dose of vitamin C every day during that year.
For the average child who suffers about 28 days of cold illness a year, taking daily high-dose vitamin C would still mean 24 days of cold illness.
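Those two estimates follow directly from the percentages; here is the arithmetic as a quick back-of-the-envelope sketch (the 8% and 14% reductions are the review's figures, and the 12- and 28-day baselines are the article's own examples):

```python
# Apply the review's duration reductions to the article's example baselines.
adult_baseline_days, child_baseline_days = 12, 28
adult_reduction, child_reduction = 0.08, 0.14

print(round(adult_baseline_days * (1 - adult_reduction), 1))  # 11.0 days
print(round(child_baseline_days * (1 - child_reduction), 1))  # 24.1 days
```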
When vitamin C was tested for treatment of colds in 7 separate studies, vitamin C was no more effective than placebo at shortening the duration of cold symptoms.
Is Vitamin C Safe to Take?
In general, vitamin C is safe to take when ingested through food sources such as fruits and vegetables. For most people, taking vitamin C supplements in the recommended amounts is also safe. The RDA or recommended daily allowance is 90 mg for men and 75 mg for women. High doses of vitamin C (greater than 2000 milligrams per day for adults) may cause kidney stones, nausea, and diarrhea.
If you're unsure about taking vitamin C for colds, talk to your healthcare provider. Your doctor can answer any questions about vitamin C and colds and about any other dietary supplement that you are taking. - WebMD

Science Myth of the Week:
Can you lose weight just by thinking really really hard?
The answer is no, you still have to exercise and eat healthy.

The Reality Check Episode 82

Mike Duffy and Critical Thinking + HFCS Study + Frown Myth

Frowning takes more muscles than smiling?
Part 1:
The Reality Check is going to talk a little about Mike Duffy and what he thinks about critical thinking.
Michael Dennis Duffy is a Canadian Senator and former Canadian television journalist. Prior to his appointment to the upper house he was the Ottawa editor for CTV News Channel, and a host of Mike Duffy Live and Countdown with Mike Duffy on the network. Duffy sits in the Senate as a Conservative, representing Prince Edward Island.
Senator Mike Duffy has attacked the University of King’s College and other Canadian journalism schools for exposing students to Noam Chomsky and critical thinking. In a speech Saturday to Conservative party members in Amherst, Duffy reportedly slammed journalism programs for churning out leftist graduates.

“When I went to the school of hard knocks, we were told to be fair and balanced,” Duffy was quoted from his speech in yesterday’s issue of the Amherst Daily News. “That school doesn’t exist any more. Kids who go to King’s, or the other schools across the country, are taught from two main texts.”

According to Duffy — a former CTV News journalist appointed to the Senate last year by Prime Minister Stephen Harper — those two texts are Manufacturing Consent, Chomsky’s book on mainstream media, and books about the theory of critical thinking.

“When you put critical thinking together with Noam Chomsky, what you’ve got is a group of people who are taught from the ages of 18, 19 and 20 that what we stand for, private enterprise, a system that has generated more wealth for more people because people take risks and build businesses, is bad,” Duffy is quoted as saying. Duffy then told Conservatives they have nothing to apologize for because most Canadians are not “on the fringe where these other people are.”

Kim Kierans, head of the King’s School of Journalism, was surprised to hear Duffy’s comments. She said Manufacturing Consent isn’t part of the curriculum, though students do read some Chomsky. She made no apologies for teaching critical thinking.

“We’re trying to teach people to have critical thinking skills, to hold accountable anyone who is in any way in authority,” she said. “It doesn’t matter if it’s the Conservatives, the NDP, the Green party, they’re all fair game in the sense that they have to be able to be transparent.” - Metro
Listen to the Interview here: The Reality Check Episode 82

Part 2: 
So, does high fructose corn syrup make people fatter than just sugar?

Science Myth of the Week: 
So, does it take more to frown than smile? 
You've likely been told that it takes fewer muscles to smile than it does to frown, and that, in light of this fact, you should smile more often. There are quite a few numbers that get tossed around when this line is used. Some claim it takes 43 muscles to frown and 17 to smile, but open Aunt Milda's chain letter and you might be surprised to learn it takes 26 to smile and 62 to frown. And some naysayers claim it's quite the opposite, that in fact it takes more muscles to smile than to frown.

Monday, October 18, 2010

The Reality Check Episode 81

Buying the Cosmos + TV Ratings + Head Heat Loss Myth

Buying stars?
Part 1:
So, can you actually buy a star and name it?
Sort of! For a fee, several companies will happily "name" a star for you, but as the rest of this piece explains, the name is anything but official.
"Who gave them the right to name stars? And then charge someone for the name?"
The answer is simple: Nobody gave them the right. They just do it.
At least half a dozen companies are offering to attach names to stars while making the designations seem official, providing a fancy certificate and directions for locating the newly named point of light. Their promotional strategies range from harmlessly playful to bordering on fraudulent. Meanwhile the night sky is being populated with unofficial names, at $49.95 a pop, one unsuspecting buyer at a time.
What you really get
It's not hard to grasp the romantic or otherwise wondrous reasons someone might have for buying a star name, especially as a gift. It's also important for potential buyers to know what they'd actually get.
Pretty much nothing, beyond some very expensive paper.
Only the International Astronomical Union (IAU) has the right to officially name celestial objects. It does so for scientific purposes only and does not recognize any commercial naming systems. The IAU, viewed by astronomers as the reputable governing body, is well aware of the sea of commercial star vendors. It has this to say:
"The IAU dissociates itself entirely from the commercial practice of 'selling' fictitious star names."
Some folks wonder, understandably, why stars are not given names in lieu of boring numbers.
The IAU does recognize a handful of ancient star names, given to some of the brightest stars in our sky. But with millions and millions of stars out there, it wisely decided long ago that a numbering system is more useful for scientists.
As the IAU puts it, "Finding Maria Gonzalez in Argentina or John Smith in Britain just from their names is pretty hopeless, but if you know their precise address (perhaps from their social security number) you can contact them without knowing their name at all."
As a web site called Name a Star admits, "Scientists will never want to deal with finding 'Aunt Martha's Star.'" This company deserves a gold star for forthrightness.

Part 2:
Install TV meter boxes in a sampling of homes. These boxes keep track of exactly what a person is watching at any moment, and for how long. The sampling of homes you choose is important; it should include people from a variety of age groups, backgrounds and sexes. Nielsen Media Research, the company in charge of tracking TV ratings for the United States and Canada, keeps meter boxes in about 5,000 U.S. homes at any given time.
Obtain national statistics on the citizens of the United States. The Census can be extremely helpful here, as it breaks down people by age, income, etc.
Take a look at the results you are getting from the meter boxes. Each person who watches a particular show serves as a representative for the part of the U.S. population he or she best fits into. For example, if a house containing a husband and wife in their forties with no kids watched a particular show at a particular time, it is safe to assume that most people fitting that specific description watched the same show as well. You would compare this household with all others matching that exact description, see what percentage of your samples watched the show, and then apply that percentage to the general population.
Multiply the fraction of people meeting a specific description in your sample group who watched a particular show by the number of people in the U.S. who fit that description. The number you get is the estimated number of people who watched that show.
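A minimal sketch of that extrapolation step (all numbers are hypothetical, not real Nielsen data):

```python
# Extrapolate from a metered sample to the full population, as described above.
sample_in_demo = 200              # metered homes matching one demographic profile
sample_watching = 50              # of those, homes tuned to the show
population_in_demo = 24_000_000   # census count for that demographic

share = sample_watching / sample_in_demo        # 0.25 of the sample watched
estimated_viewers = share * population_in_demo  # scale the share up

print(f"Estimated audience in this demographic: {estimated_viewers:,.0f}")
# Estimated audience in this demographic: 6,000,000
```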

Science Myth of the Week:
So, do you lose most of your heat from your head? 
The claim seems to originate with US military survival manuals from the 1970s, which stated that 45% to 48% of body heat is lost through the head.
But it's actually not true: heat loss is roughly proportional to exposed surface area, and the head makes up only about 10% of the body's surface.

The Reality Check Episode 80

IQ Correlations + Dr Yoni Freedhoff Interview + Trillium Myth
Part 1: 
Political, religious and sexual behaviors may be reflections of intelligence, a new study finds.
Evolutionary psychologist Satoshi Kanazawa at the London School of Economics and Political Science correlated data on these behaviors with IQ from a large national U.S. sample and found that, on average, people who identified as liberal and atheist had higher IQs. This applied also to sexual exclusivity in men, but not in women. The findings will be published in the March 2010 issue of Social Psychology Quarterly.
The IQ differences, while statistically significant, are not stunning -- on the order of 6 to 11 points -- and the data should not be used to stereotype or make assumptions about people, experts say. But they show how certain patterns of identifying with particular ideologies develop, and how some people's behaviors come to be.
The reasoning is that sexual exclusivity in men, liberalism and atheism all go against what would be expected given humans' evolutionary past. In other words, none of these traits would have benefited our early human ancestors, but higher intelligence may be associated with them.
"The adoption of some evolutionarily novel ideas makes some sense in terms of moving the species forward," said George Washington University leadership professor James Bailey, who was not involved in the study. "It also makes perfect sense that more intelligent people -- people with, sort of, more intellectual firepower -- are likely to be the ones to do that."
Bailey also said that these preferences may stem from a desire to show superiority or elitism, which also has to do with IQ. In fact, aligning oneself with "unconventional" philosophies such as liberalism or atheism may be "ways to communicate to everyone that you're pretty smart," he said.
The study looked at a large sample from the National Longitudinal Study of Adolescent Health, which began with adolescents in grades 7-12 in the United States during the 1994-95 school year. The participants were interviewed as 18- to 28-year-olds from 2001 to 2002. The study also looked at the General Social Survey, another cross-national data collection source.
Kanazawa did not find that higher or lower intelligence predicted sexual exclusivity in women. This makes sense, because having one partner has always been advantageous to women, even thousands of years ago, meaning exclusivity is not a "new" preference.
For men, on the other hand, sexual exclusivity goes against the grain evolutionarily. With a goal of spreading genes, early men had multiple mates. Since women had to spend nine months being pregnant, and additional years caring for very young children, it made sense for them to want a steady mate to provide them resources.
Religion, the current theory goes, did not help people survive or reproduce necessarily, but goes along the lines of helping people to be paranoid, Kanazawa said. Assuming that, for example, a noise in the distance is a signal of a threat helped early humans to prepare in case of danger.
"It helps life to be paranoid, and because humans are paranoid, they become more religious, and they see the hands of God everywhere," Kanazawa said.
Participants who said they were atheists had an average IQ of 103 in adolescence, while adults who said they were religious averaged 97, the study found. Atheism "allows someone to move forward and speculate on life without any concern for the dogmatic structure of a religion," Bailey said.
"Historically, anything that's new and different can be seen as a threat in terms of the religious beliefs; almost all religious systems are about permanence," he noted.
The study takes the American view of liberal vs. conservative. It defines "liberal" in terms of concern for genetically nonrelated people and support for private resources that help those people. It does not look at other factors that play into American political beliefs, such as abortion, gun control and gay rights.
"Liberals are more likely to be concerned about total strangers; conservatives are likely to be concerned with people they associate with," he said.
Given that human ancestors had a keen interest in the survival of their offspring and nearest kin, the conservative approach -- looking out for the people around you first -- fits with the evolutionary picture more than liberalism, Kanazawa said. "It's unnatural for humans to be concerned about total strangers," he said.

The study found that young adults who said they were "very conservative" had an average adolescent IQ of 95, whereas those who said they were "very liberal" averaged 106. It also makes sense that "conservatism" as a worldview of keeping things stable would be a safer approach than venturing toward the unfamiliar, Bailey said.

Neither Bailey nor Kanazawa identify themselves as liberal; Bailey is conservative and Kanazawa is "a strong libertarian."

Vegetarianism, while not strongly associated with IQ in this study, has been shown to be related to intelligence in previous research, Kanazawa said. This also fits into Bailey's idea that unconventional preferences appeal to people with higher intelligence, and can also be a means of showing superiority.

None of this means that the human species is evolving toward a future where these traits are the default, Kanazawa said.
"More intelligent people don't have more children, so moving away from the trajectory is not going to happen," he said.
Part 2:
Interview with Dr Yoni Freedhoff!
Given the chance, cows nurture their young and form lifelong friendships with one another. They play games and have a wide range of emotions and personality traits. But most cows raised for the dairy industry are intensively confined, leaving them unable to fulfill their most basic desires, such as nursing their calves, even for a single day. They are treated like milk-producing machines and are genetically manipulated and pumped full of antibiotics and hormones that cause them to produce more milk. While cows suffer on factory farms, humans who drink their milk increase their chances of developing heart disease, diabetes, cancer, and many other ailments.
Listen to the Interview here: The Reality Check Episode 80
Science Myth of the Week:
So, is it illegal to pick Trillium from a government area?
Not yet, but soon it will be.

The Reality Check Episode 79

Olympics Revisited + Euphemism Treadmill + Audiobooks vs Books

Part 1:
So, how do we actually decide who won the Winter Olympics? Do we just count the total number of medals? Or do we count who has the most gold medals?
If we count total medals, then the US won. If we count gold medals, then Canada won. But there's another issue: country population. Norway is more than 60 times smaller than the US by population, yet Norway won 23 medals to the US's 37.
We can also look at the number of athletes each country sent and compute medals per athlete. Another wrinkle: if a country sends two teams in one event, it is impossible for both to win gold.
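As a rough illustration of the per-capita point, here is a minimal sketch (medal totals as cited above; Canada's total is added for comparison and is not in the text; populations are approximate 2010 figures):

```python
# Normalize Vancouver 2010 medal counts by population (rough 2010 estimates).
countries = {
    "USA":    {"medals": 37, "population": 309_000_000},
    "Canada": {"medals": 26, "population": 34_000_000},
    "Norway": {"medals": 23, "population": 4_900_000},
}

for name, c in countries.items():
    per_million = c["medals"] / (c["population"] / 1_000_000)
    print(f"{name}: {c['medals']} medals, {per_million:.2f} per million people")

# Norway comes out far ahead once population is factored in
# (~4.69 per million vs ~0.12 for the USA and ~0.76 for Canada).
```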
Funding also matters: how much a country invests in its athletes affects the medal count.
And if you win by only 1/100th of a second, is that really skill, or just chance? If you win by 3 seconds, of course, the victory is clear-cut.
So, basically, don't just blindly look at the medal table and declare which country is best at the Winter Olympics.

Part 2: 
Euphemisms often evolve over time into taboo words themselves, through a process described by W. V. O. Quine and more recently dubbed the "euphemism treadmill" by Steven Pinker. This is the well-known linguistic process known as “pejoration” or “semantic change”.
Words originally intended as euphemisms may lose their euphemistic value, acquiring the negative connotations of their referents. In some cases, they may be used mockingly and become dysphemisms.
In his remarks on the ever-changing London slang, made in Down and Out in Paris and London, George Orwell mentioned both the euphemism treadmill and the dysphemism treadmill. He did not use these now-established terms, but observed and commented on the respective processes as early as 1933.
For example, the term "concentration camp", to describe camps used to confine civilian members of the Boer community in close (concentrated) quarters, was used by the British during the Second Boer War, primarily because it sounded bland and inoffensive. Despite the high death rates in the British concentration camps, the term remained acceptable as a euphemism. However, after Nazi Germany used the expression to describe its death camps in the 1930s and 1940s, the term gained a widespread negative connotation, particularly in connection with the Holocaust.
Also, in some circles, the euphemisms "lavatory" or "toilet", are now considered inappropriate and were replaced with "bathroom" and "water closet", which in turn have been replaced by some with restroom and W.C. These are also examples of euphemisms which are geographically concentrated. The term "restroom" is rarely used outside the United States. "W.C." was previously quite popular in the United Kingdom, but is passing out of favor there, but becoming more popular in France, Germany and Hungary now as the polite term of choice. - Wiki

Science Myth of the Week: 
A new study by Carnegie Mellon University scientists shows that because of the way the brain works, we understand spoken and written language differently, something that has potential implications in the workplace and in education, among many other areas.
In the first imaging study that directly compares reading and listening activity in the human brain, Carnegie Mellon scientists discovered that the same information produces systematically different brain activation. And knowing what parts of the brain fire during reading or listening comprehension affects the answer to one of the classic questions about language comprehension: whether the means of delivery through eyes or ears makes a difference.
"The brain constructs the message, and it does so differently for reading and listening. The pragmatic implication is that the medium is part of the message. Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper," said Carnegie Mellon Psychology Professor Marcel Just, co author of the report that appears in this month's issue of the journal Human Brain Mapping.
Just said that the most recent methods of functional magnetic resonance imaging (fMRI) were applied to measure brain activity during these high-level conceptual processes. Rather than examining the processing of single digits or words, his group is applying brain imaging to societal, workplace, and instructional issues.
"We can now see how cell phone use can affect driving, how reading differs from listening, and how visual thinking is integrated with verbal thinking," Just said.
Using non-invasive fMRI, scientists were able to measure the amount of activity in each of 20,000 peppercorn-sized regions of the brain every three seconds and create visual maps of how the mental work of thinking was allocated throughout the brain from moment to moment. To the scientists' surprise, there were two big differences in the brain activity patterns while participants were reading or listening to identical sentences, even at the conceptual level of understanding the meaning of a sentence.
First, during reading, the right hemisphere was not as active as anticipated, which opens the possibility that there were qualitative differences in the nature of the comprehension we experience in reading versus listening. Second, while listening was taking place, there was more activation in the left hemisphere brain region called the pars triangularis (the triangular section), a part of Broca's area that usually activates when there is language processing to be done or there is a requirement to maintain some verbal information in an active state (sometimes called verbal working memory). The greater amount of activation in Broca's area suggests that there is more semantic processing and working memory storage in listening comprehension than in reading.
Because spoken language is so temporary, each sound hanging in the air for a fraction of a second, the brain is forced to immediately process or store the various parts of a spoken sentence in order to be able to mentally glue them back together in a conceptual frame that makes sense. "By contrast," Just said, "written language provides an 'external memory' where information can be re-read if necessary. But to replay spoken language, you need a mental playback loop (called the articulatory-phonological loop), conveniently provided in part by Broca's area."
The study doesn't attempt to suggest that one means of delivering information is better than another, Just said. "Is comprehension better in listening or in reading? It depends on the person, the content of the text, and the purpose of the comprehension. In terms of persons, some people are more skilled at one form of comprehension and typically exercise a preference for their more skilled form where possible. It may be that because of their experience and biology they are better and more comfortable in listening or reading," he explained. -CM

Sunday, October 17, 2010

The Reality Check Episode 78

Chicken/Egg Precedent Problem + Weasel Words + Gutenberg

Which came first?
Part 1:
So, which came first? The chicken or the egg? 
The question “which came first, the chicken or the egg?” looks at first glance like a matter of straightforward reproductive biology.  But before we can even begin to answer this question, we must define our terms.  So actually, it is a classic case of semantic ambiguity…a problem of meaning and interpretation.  Specifically, while the term “chicken” is biologically unambiguous – we all know what a chicken looks, sounds and tastes like - the term “egg” is somewhat more general and is therefore a possible source of ambiguity.  
Do we mean (1) just any egg, or (2) a chicken egg?  And if we’re talking about a chicken egg, then is a “chicken egg” (2a) an egg laid by a chicken, (2b) an egg containing a chicken, or (2c) both?  Reformulating the question to reflect each possible meaning of “egg” leads to four distinct versions of the chicken-or-egg question.
1.  Which came first, the chicken or (just any old) egg?
2a.  Which came first, the chicken or an egg laid by a chicken?
2b.  Which came first, the chicken or an egg containing a chicken?
2c.  Which came first: the chicken, or an egg laid by and containing a chicken? 
Contrary to popular belief, there is indeed a definite answer to each of these questions. Specifically, the answers are:  (1)  The egg.  (2a)  The chicken.  (2b)  The egg.  (2c)  The chicken.  Given some knowledge of logic and biology, these answers are not hard to verify.  To get this show on - or should that be across? - the road, let’s go through them in order.
First, consider question 1: which came first, the chicken or (just any old) egg?  This question is answered “the egg” because species that lay eggs have been around a lot longer than modern chickens.  For example, we have plenty of fossil evidence that dinosaurs laid eggs from which baby dinosaurs hatched, and dinosaurs predate chickens by millions of years.  Indeed, a growing body of research indicates that dinosaurs were among the biological ancestors of chickens!
Now let’s look at question 2a: which came first, the chicken or an egg laid by a chicken?  The answer to this question is “the chicken” on semantic grounds alone.  That is, if a chicken egg must be laid by a chicken, then before a chicken egg can exist, there must by definition be a chicken around to lay it.  And question 2c - which came first, the chicken or an egg laid by and containing a chicken? - is answered the same way on the same grounds; logically, the fact that a chicken egg must be laid by a chicken precedes and therefore “dominates” the (biologically subsequent) requirement that it contain a chicken.  So whereas we needed paleozoological evidence to answer question 1, questions 2a and 2c require practically no biological knowledge at all! 
Having saved the best for last, let us finally consider the most interesting version, 2b:  which came first, the chicken or an egg containing a chicken?  This version is interesting because an egg containing a chicken might have been laid by a chicken or a non-chicken, which of course affects the answer.  Thanks to modern genetic science, we can now be sure that the egg came first.  This is because reproductive mutations separating a new species from its progenitor generally occur in reproductive rather than somatic DNA and are thus expressed in differences between successive generations, but not in the parent organisms themselves.  While the somatic (body) cells of the parents – e.g. wing cells, drumstick cells and wishbone cells - usually contain only the DNA with which they were conceived, germ (reproductive) cells like ova and spermatozoa contain non-somatic DNA that may have been changed before or during mating by accidental deletion, insertion, substitution, duplication or translocation of nucleotide sequences.  This is what causes the mutation that results in the new species. 
Where an animal qualifies as a member of a given species only if its somatic DNA (as opposed to its reproductive DNA) conforms to the genotype of the species, the parents of the first member of a new species are not members of that new species.  At the same time, all the biological evidence says that the ancestors of modern chickens were already oviparous or egg-laying…that a male and a female member of the ancestral species of the modern chicken, call this species “protochicken”, mated with each other and created an egg.  (Could the first chicken have evolved from a viviparous or live-bearing species, and after being born alive, have started laying eggs?  All the biological evidence says “no”.)  But because their act of mating involved a shuffling of reproductive genes that were not expressed in the body of either parent – if they had been expressed there, the parents would themselves have been members of the new species - the fetus inside the egg was not like them.  Instead, it was a mutant…a modern chicken! 
Only two loose ends remain: the “gradual” and “sudden” extremes of the evolutionary spectrum.  These extremes are evolutionary gradualism - Darwin’s original slow-paced timetable for natural selection - and punctuated evolution, as advocated more recently by evolutionary theorists including the controversial Stephen Jay Gould. 
Gradualism says that mutations are biologically random, but subject to a selection process determined by environmental (external) conditions to which species must adapt over the course of many generations.  Taken to the limit, it implies either that each minor mutation that occurs during the evolutionary change of one species into another is random and independent of any other mutation, in which case a useful combination of mutations is highly improbable, or that each individual mutation confers a selective advantage on the mutant…that every evolutionary advantage of a new species over its precursor decomposes into smaller advantages combined in a more or less linear way.  Unfortunately, this makes it almost impossible to explain complex biological structures that do not break down into smaller structures useful in their own right…structures like bacterial cilia and flagella, and even the human eye. 
The hypothetical gradualistic evolution of one species into another via mutations accumulated over many generations leads to the following question: when does the quality and quantity of mutations justify a distinction between “species”…when does a protochicken become a chicken? It’s a good question, but our chicken-or-egg answers remain valid no matter how we answer it. 
At the other extreme, evolution sometimes appears to progress by leaps and bounds, moving directly from the old to the new in “punctuated” fashion.  And to complicate matters, this sometimes seems to happen across the board, affecting many species at once.  The most oft-cited example of punctuated evolution is the Cambrian Explosion.  Whereas sedimentary rocks that formed more than about 600 million years ago are poor in fossils of multicellular organisms, slightly younger rocks contain a profusion of such fossils conforming to many different structural templates.  The duration of the so-called “explosion”, a mere geological eye blink of no more than 10 million years or so, is inconsistent with gradualism; new organs and appendages must have been popping out faster than the environment alone could have selected them from a field of random mutations.  Clearly, the sudden appearance of a new appendage would leave little doubt about the evolutionary demarcation of ancestral and descendant species.
But the kind of punctuated evolution that occurs between generations is not the end of the line in sheer biological acceleration.  Sometimes, an evolutionary change seems to occur within the lifespan of a single organism!  For example, in the spirit of “ontogeny recapitulates phylogeny”, insect metamorphosis almost seems to hint at an evolutionary process in which an ancient grub or caterpillar underwent a sudden transformation to something with wings and an exoskeleton…or alternatively, in which a hard-shelled flying bug suddenly gave birth to an egg containing a soft and wormy larva.  While that’s not what really happened – as is so often the case, the truth lies somewhere in the middle - what occurred was just as marvelous and just as punctuated. 
What seems to have happened was this.  Due to a reproductive mutation, a whole sequence of evolutionary changes originally expressed in the fetal development of an ancestral arthropod, and originally recapitulated within the womb and egg it inhabited, was suddenly exposed to the environment, or at least to the hive, in a case of “ovum interruptus”.  A fetal stage of morphogenesis that formerly occurred within womb and egg was interrupted when the egg hatched “prematurely”, making the soft fetus into an equally soft larva and giving it a valuable opportunity to seek crucial nourishment from external sources before being enclosed in a pupa, a second egg-like casing from which it later hatched again in its final exoskeletal form.  So metamorphosis turns out to be a case of biological common sense, providing the fetus-cum-larva with an opportunity to acquire the nourishment required for the energy-consuming leap into adulthood. 
Does this affect our answer to the chicken-or-egg question?  Not really.  For even where the life cycle of an organism includes distinct morphological stages, the DNA of egg-laying insects does not change after conception.  And since it is reproductive and not somatic DNA modification that distinguishes one species from the next in line, our answers stand firm.  (Of course, this says nothing of science fiction movies in which something bizarre and insidious causes runaway mutations in the somatic DNA of hapless humans, causing them to evolve into monsters before our very eyes!  Such humans have either undergone a random or radiation-induced “meta-mutation” whereby their genetic code suddenly rearranged itself to incorporate a self-modification routine that is executed somatically, within their own cells, or they are the victims of a space virus which inserted such a routine into their DNA for its own nefarious purposes.)
OK…perhaps there’s yet another loose end.  Asking which of two things came first implies that time flows in a straight line from past to future (those are the “loose ends”).  But what if time were to flow in either direction, or even to loop around, flowing in what amounts to a circle? No more loose ends.  In fact, loops have no ends at all!  But in this case, the answer depends on whether we’re on the forward or reverse side of the loop, heading towards the future or the past.  Another way to formulate this question: does the cause lead to the effect, or is there a sense in which the effect leads to the cause?  Suffice it to say that no matter which way we choose to go, the original answers to the four versions (1, 2a, 2b and 2c) of the chicken-or-egg question are all affected the same way.  They are either all unchanged or all reversed, with no additional ambiguity save that pertaining to the direction of time (not a problem for most non-physicists and non-cosmologists).
Now that we’ve tied up every last loose end, what about the most important question of all, namely what to tell a curious child?  The answer: take your pick of versions.  Some kids will prefer the dinosaur angle of version 1; some kids will prefer the “birds and bees” reproductive biology lesson of version 2b.  In my opinion, if we limit ourselves to one version only, the most valuable explanation is probably that of 2b; but due to its relative complexity, a younger child can probably derive greater benefit from a T. Rex-versus-Triceratops embellishment of version 1.  To exhaust the golden opportunities for logical and scientific instruction, one should of course answer all four versions.  But no matter which way you go, make sure the child knows exactly which version(s) of the question you’re answering.  If you leave out the one he or she had in mind, you’ll no doubt be egged on until it gets answered! - Cognitive Theoretic Model of the Universe.

Part 2: 
What are Weasel Words?
Weasel words is an informal term for words and phrases aimed at creating an impression that something specific and meaningful has been said, when in fact only a vague or ambiguous claim has been communicated.
The name comes from the way weasels were said to suck out the contents of an egg through a small hole, leaving the shell intact. A weasel word is like that empty egg: you show someone something that looks whole, so the statement seems to say more than it really does. But, strictly speaking, it's not lying.

Science Myth of the Week:
So, did Gutenberg invent the printing press?
No. Printing actually originated in China: both woodblock printing and movable type were in use there centuries before Gutenberg.

The Reality Check Episode 77

Chocolate/Strokes + Olympic Mascots + KFC Myth

Does Chocolate prevent strokes?
Part 1: 
So, does chocolate lower the risk of getting strokes?
Eating just one square of chocolate a day can cut the risk of heart attack and stroke by 39%, researchers said today. Eating 7.5g of chocolate daily also leads to lower blood pressure, a study found.

Researchers in Germany followed 19,357 people aged between 35 and 65 for at least a decade. Those who ate the most chocolate - an average of 7.5g a day - had lower chances of heart attacks and stroke than those who ate the least amount (1.7g a day on average). The difference between the two groups amounted to 6g of chocolate - less than one square of a 100g bar.

The study, published in the European Heart Journal, concluded that if those people who ate the least chocolate increased their intake by 6g a day there would be fewer heart attacks and strokes. Of those who ate the least chocolate, there were 219 strokes or heart attacks per 10,000 people, but there could be 85 fewer if they ate 7.5g a day on average, researchers said. Those who ate the most chocolate had a 27% reduced risk of heart attacks and nearly half (48%) the risk of strokes compared with those eating the least amount.

Eating chocolate lowered blood pressure, which accounted for some of the reduced risk, but falls were seen in heart attacks and strokes even when this was taken into account.

Dr Brian Buijsse, a nutritional epidemiologist at the German Institute of Human Nutrition, Nuthetal, Germany, who led the research, said: "People who ate the most amount of chocolate were at a 39% lower risk than those with the lowest chocolate intakes.

"If the 39% lower risk is generalised to the general population, the number of avoidable heart attacks and strokes could be higher because the absolute risk in the general population is higher."

However, he warned people against eating too much chocolate and putting on weight or cutting down the amount of healthy foods they eat. "Small amounts of chocolate may help to prevent heart disease, but only if it replaces other energy-dense food, such as snacks, in order to keep body weight stable," he said.

Cocoa beans contain flavanols, which are thought to have an effect on lowering blood pressure. The experts said dark chocolate has more flavanols than milk chocolate and is therefore likely to be more beneficial.

Frank Ruschitzka, from the European Society of Cardiology (ESC), said: "Basic science has demonstrated quite convincingly that dark chocolate particularly, with a cocoa content of at least 70%, reduces oxidative stress and improves vascular and platelet function.

"However, before you rush to add dark chocolate to your diet, be aware that 100g of dark chocolate contains roughly 500 calories.

"As such, you may want to subtract an equivalent amount of calories, by cutting back on other foods, to avoid weight gain."

Victoria Taylor, senior heart health dietician at the British Heart Foundation, said: "This sounds like a dream for chocolate lovers and just in time for Easter too, but it's important to read the small print with this study.

"The amounts consumed on average by even the highest consumers was about one square of chocolate a day or half a small chocolate Easter egg in a week, so the benefits were associated with a fairly small amount of chocolate.

"Some people will be tempted to eat more than one square, however.

"Chocolate has high amounts of calories and saturated fat which are linked to weight gain and raised cholesterol levels - two of the key risk factors for heart disease. 
"So whilst chocolate, in moderation, can form part of a heart healthy diet it is important to remember to include a variety of other foods including fruit and vegetables and oily fish, as well as getting out and being active for at least 30 minutes a day." - The Independent

Winter Olympics 2010 Mascots
Part 2: 
The mascots for the 2010 Winter Olympics and the 2010 Winter Paralympics were Miga and Quatchi, and Sumi respectively, who had a "sidekick", Mukmuk. The four mascots were introduced on November 27, 2007. They were designed by the Canadian and American duo, Meomi Design. It was the first time the Olympic and Paralympic mascots were introduced at the same time.
The mascots are:
Miga - A mythical sea bear, part orca and part Kermode bear, living off the coast of Vancouver Island. She loves to surf in the summer, especially in Tofino, and snowboard in the winter. Her green scarf was given to her by Mukmuk.
Quatchi - A sasquatch. He comes from the mysterious forests of Canada, wears blue earmuffs, and dreams of being a hockey goalie. He loves to travel and learn about the regional dances and cuisines of every place he visits. He carries his camera around his neck wherever he goes.
Sumi - An animal guardian spirit with the wings of the Thunderbird and the legs of a black bear, who wears an orca-like hat in the artistic style of the Haida people. He lives in the mountains of British Columbia and is a passionate environmentalist. His name comes from the Salish word "sumesh," meaning "guardian spirit."
Mukmuk - A Vancouver Island marmot described as "small and friendly", Mukmuk is not an official mascot but acts as the others' sidekick. As of December 2008 he has joined the other mascots as a plush toy, but does not yet have a life-size mascot costume. His name comes from the Chinuk Wawa word "muckamuck," meaning "food" or "to eat", because of his large appetite.
Miga and Quatchi are mascots for the Olympic Games, while Sumi is the mascot for the Paralympic Games. Aside from the three mascots, Mukmuk is their designated "sidekick". Thus, there are two Olympic mascots and one Paralympic mascot, as well as one "sidekick". - Wiki

Science Myth of the Week:
So, KFC changed its name back in 1991 because they stopped using chicken? Is it true?
It's not true! KFC does not use mutated chickens! They changed the name because of trademark issues with the state of Kentucky, and to hide the word "fried", because the word gives people an image of unhealthy food. 

Saturday, October 16, 2010

The Reality Check Episode 76

Graphology/Handwriting Analysis + Kosher Food + Drinking/Pregnancy

Graphology
Part 1:
The "science" of handwriting. 
There are actually two kinds of handwriting analysis. One is forensic document examination, often done by computer, which compares handwriting samples and reports whether they match. The other is graphology, the pseudoscientific study and analysis of handwriting, especially in relation to human psychology. In the medical field, the term can also refer to the study of handwriting as an aid in diagnosing and tracking diseases of the brain and nervous system. The term is sometimes incorrectly used to refer to forensic document examination. 

Graphology is claimed to be useful for everything from understanding health issues, morality and past experiences to hidden talents and mental problems. However, "in properly controlled, blind studies, where the handwriting samples contain no content that could provide non-graphological information upon which to base a prediction, graphologists do no better than chance at predicting... personality traits...." And even non-experts are able to correctly identify the gender of a writer about 70% of the time.
There are a variety of techniques used by graphologists. Even so, the techniques of these "experts" seem to be reducible to impressions from such things as the pressure exerted on the page, spacing of words and letters, crossed t's, dotted i's, size, slant, speed and consistency of writing. Though graphologists deny it, the content of the writing is one of the more important factors in graphological character assessment. The content of a message, of course, is independent of the handwriting and should be irrelevant to the assessment.
Graphology is another pipe dream of those who want a quick and dirty decision making process to tell them who to marry, who did the crime, who they should hire, what career they should seek, where the good hunting is, where the water, oil, or buried treasure is, etc. Graphology is another in a long list of quack substitutes for hard work. It is appealing to those who are impatient with such troublesome matters as research, evidence analysis, reasoning, logic, and hypothesis testing. If you want results and you want them now and you want them stated in strong, certain terms, graphology is for you. If, however, you can live with reasonable probabilities and uncertainty, you might try another method to pick a spouse or hire an employee. - The Skeptic's Dictionary

Kosher Food Signs
Part 2: 
So, what is Kosher food?
Kosher foods are those that conform to the regulations of the Jewish Halakhic framework. These rules form the main aspect of kashrut, Jewish dietary laws. Reasons for food being non-kosher include the presence of ingredients derived from non-kosher animals or from kosher animals that were not properly slaughtered, a mixture of meat and milk, wine or grape juice (or their derivatives) produced without supervision, the use of produce from Israel that has not been tithed, or even the use of cooking utensils and machinery which had previously been used for non-kosher food. These might include utensils and machines used for making pork or other non-kosher foods.
Kosher food is often claimed to be much healthier for people. But it's actually not true: kosher food is neither better nor worse for you. It's simply the list of foods that observant Jews are permitted to eat.

Science Myth of the Week:
So, does drinking during pregnancy affect the baby?
Of course it does! How can it not? Alcohol messes up our bodies. 

The Reality Check Episode 75

Ginkgo Biloba + Audio Cables + Fugu (Puffer Fish) Myth

Ginkgo Leaves
Part 1:
So, what is Ginkgo Biloba?
Ginkgo is a unique species of tree with no close living relatives. The ginkgo is classified in its own division, the Ginkgophyta, comprising the single class Ginkgoopsida, order Ginkgoales, family Ginkgoaceae, genus Ginkgo and is the only extant species within this group. It is one of the best-known examples of a living fossil, because Ginkgoales other than G. biloba are not known from the fossil record after the Pliocene.
What does Ginkgo Biloba do? 
It is claimed that Ginkgo Biloba remedies can help with many conditions, from depression and memory loss to headaches and dizziness.
So, there was a study involving more than 3,000 people aged 72 and over. They were randomly assigned to take Ginkgo Biloba twice a day or placebo tablets. The overall finding was that Ginkgo Biloba didn't prevent Alzheimer's. The researchers also wanted to see if it helped with memory loss, so participants were given a mental-abilities test every six months. By the end of the study, Ginkgo Biloba hadn't helped with that either. It doesn't harm anyone, but it doesn't help.
In conclusion, Ginkgo Biloba remedies don't help at all.
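To make "didn't help" concrete, here's a minimal sketch (in Python, my own illustration rather than the study's actual analysis) of how such a trial is typically scored: compare the rate of the outcome between the ginkgo arm and the placebo arm with a two-proportion z-test. The counts below are hypothetical, not the real study's numbers.

from math import sqrt, erf

def two_proportion_z(events_a, n_a, events_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = events_a / n_a, events_b / n_b
    pooled = (events_a + events_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Hypothetical counts: 250 of 1500 ginkgo takers vs 245 of 1500 placebo takers
# develop dementia. Nearly equal rates give a large p-value, i.e. no
# detectable benefit from the supplement.
z, p = two_proportion_z(250, 1500, 245, 1500)
print(f"z = {z:.2f}, p = {p:.2f}")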

Part 2: 
A cable is two or more wires running side by side and bonded, twisted or braided together to form a single assembly. In mechanics, cables, otherwise known as wire ropes, are used for lifting, hauling and towing, or for conveying force through tension. In electrical engineering, cables are used to carry electric currents. An optical cable contains one or more optical fibers in a protective jacket that supports the fibers.
Electric cables discussed here are mainly meant for installation in buildings and industrial sites. For power transmission at distances greater than a few kilometers see high voltage cable, power cables, and HVDC. - Wiki
Any current-carrying conductor, including a cable, radiates an electromagnetic field. Likewise, any conductor or cable will pick up energy from any existing electromagnetic field around it. These effects are often undesirable, in the first case amounting to unwanted transmission of energy which may adversely affect nearby equipment or other parts of the same piece of equipment; and in the second case, unwanted pickup of noise which may mask the desired signal being carried by the cable, or, if the cable is carrying power supply or control voltages, pollute them to such an extent as to cause equipment malfunction.
The first solution to these problems is to keep cable lengths in buildings short, since pickup and transmission are essentially proportional to the length of the cable. The second solution is to route cables away from trouble. Beyond this, there are particular cable designs that minimize electromagnetic pickup and transmission. Three of the principal design techniques are shielding, coaxial geometry, and twisted-pair geometry.
Whether or not Monster Cables are worth it is a war that has raged since home theater immemorial. A poster at Audioholics was put in a room with five fellow audiophiles and a Martin Logan SL-3 speaker, level-matched at 75 dB with a 1 kHz tone, playing a mix of "smooth, trio, easy listening jazz" that no one had heard before. In one corner, Monster 1000 speaker cables. In the other, four coat hangers twisted and soldered into a speaker cable.
Seven songs were played while the group was blindfolded and the cables swapped back and forth. Not only "after 5 tests, none could determine which was the Monster 1000 cable or the coat hanger wire," but no one knew a coat hanger was used in the first place.
"Further, when music was played through the coat hanger wire, we were asked if what we heard sounded good to us. All agreed that what was heard sounded excellent; however, when A-B tests occurred, it was impossible to determine which sounded best the majority of the time and which wire was in use." - Gizmodo
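That outcome is easy to sanity-check statistically. Here's a minimal sketch (my own, with made-up trial counts rather than the actual test's data) of how a blind cable comparison can be scored: treat each identification attempt as a guess and ask how surprising the number of correct answers would be if listeners were choosing at pure chance.

from math import comb

def binomial_p_value(hits, trials, p_chance=0.5):
    """One-sided probability of getting at least `hits` correct by luck alone."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(hits, trials + 1))

# Hypothetical: 5 listeners x 6 swaps = 30 guesses, 17 of them correct.
print(f"p = {binomial_p_value(17, 30):.2f}")  # ~0.29: consistent with guessing

Unless the hit count climbs well above 50%, the data are consistent with the listeners being unable to hear any difference between the cables.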

Science Myth of the Week: 
So, there's this Japanese delicacy called fugu (puffer fish). The myth is that this fish may only be prepared by trained professionals, because if prepared incorrectly, the person eating it can die. Is it true?
Yes! The fish's organs contain tetrodotoxin, a potent neurotoxin, so it is a genuinely dangerous delicacy if not prepared correctly; in Japan, only licensed chefs are allowed to serve it.

Friday, October 15, 2010

The Reality Check Episode 74

Counterfeiting + Daniel Loxton Interview + Celery Myth

Part 1:
Counterfeit money is currency that is produced without the legal sanction of the state or government to resemble some official form of currency closely enough that it may be confused for genuine currency. Producing or using counterfeit money is a form of fraud.
Genuine Roman coins were struck using a minting process, not cast, so the coin molds that survive from the period were created by forgers.
Counterfeiting is probably as old as money itself. Before the introduction of paper money, the most prevalent method of counterfeiting involved mixing base metals with pure gold or silver. Another form of counterfeiting is the production of documents by legitimate printers in response to fraudulent instructions. During World War II, the Nazis forged British pounds and American dollars. Today, some of the finest counterfeit banknotes are called superdollars because of their high quality and likeness to the real US dollar. There has been a considerable amount of counterfeiting of euro banknotes and coins since the launch of the currency in 2002.
Some of the ill effects that counterfeit money has on society are: a reduction in the value of real money; an increase in prices due to more money circulating in the economy (an unauthorized, artificial increase in the money supply); a decrease in the acceptability of paper money; and losses, because companies are not reimbursed for counterfeits. Traditionally, anti-counterfeiting measures involved including fine detail with raised intaglio printing on bills, which allows non-experts to easily spot forgeries. On coins, milled or reeded (marked with parallel grooves) edges are used to show that none of the valuable metal has been scraped off. - Wiki

Part 2:
Daniel Loxton is a Canadian writer, illustrator, and skeptic. He is the Editor of Junior Skeptic magazine, a kids’ science section bound into the Skeptics Society’s Skeptic magazine. He writes and illustrates most issues of Junior Skeptic.
Listen to the interview here: The Reality Check Episode 74

Science Myth of the Week: 

The Reality Check Episode 73

Quantum Mechanics + Creation Movie Review + Drinking Myth

Part 1: 
Quantum mechanics basically describes how things behave at the atomic and molecular level, where behavior is completely different from that of everyday objects. It's an extremely complicated theory, but that's the basic definition of it. As Richard Feynman famously put it, "I think I can safely say that nobody understands quantum mechanics."
Just before 1900, it became clear that classical physics was unable to explain certain phenomena. Coming to terms with these limitations of classical physics led to the development of quantum mechanics in the early decades of the 20th century, a major revolution in physical theory. Much of the universe on the largest scale does not neatly conform to classical physics, because of general relativity. Similarly, quantum mechanics means that the universe in the small also does not neatly conform to classical mechanics. The principles of quantum mechanics are difficult for the human mind to understand, because humans are accustomed to reasoning about the world on a scale where classical physics is an excellent approximation. Quantum mechanics is counterintuitive; in the words of Richard Feynman, it deals with "Nature as She is—absurd."
Many fundamental parts of the universe, such as photons, behave in some ways like particles and in other ways like waves. Radiators of photons, such as neon lights, have emission spectra, but these spectra are discontinuous in that only certain frequencies are present. The laws of quantum mechanics predict the energies, the colors, and the spectral intensities of electromagnetic radiation. But the same laws ordain that the more closely one pins down one measure (such as the position of a particle), the less predictable another measure pertaining to the same particle must become. Put another way, measuring position first and then measuring momentum does not have the same outcome as measuring momentum first and then measuring position. Even more disconcerting, pairs of particles can be created as entangled twins -- which means that an action that pins down one characteristic of one particle will instantaneously pin down the same or other characteristic of its entangled twin, regardless of the distance separating the entangled twins.
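The order-of-measurement point has a compact mathematical form: position and momentum correspond to operators that don't commute. Here's a small numerical sketch of my own (not from the episode), discretizing both operators on a grid with hbar = 1, and showing that X*P and P*X differ by exactly the famous commutator [X, P] = i.

import numpy as np

n, dx = 200, 0.05                        # grid size and spacing
x = (np.arange(n) - n // 2) * dx
X = np.diag(x).astype(complex)           # position operator: multiply by x

# Momentum operator P = -i d/dx via a central finite difference (hbar = 1).
P = np.zeros((n, n), dtype=complex)
for i in range(1, n - 1):
    P[i, i - 1] = 1j / (2 * dx)
    P[i, i + 1] = -1j / (2 * dx)

comm = X @ P - P @ X                     # the commutator [X, P]
psi = np.exp(-x**2)                      # a smooth test wavefunction
print(np.round((comm @ psi)[n // 2] / psi[n // 2], 3))  # ~1j: [X, P] acts like i

Because the commutator is nonzero, measuring position then momentum is genuinely different from measuring momentum then position, and that difference is the mathematical root of the uncertainty principle.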

Part 2:
Creation is a 2009 British film (released in the US in 2010). The film is a partly biographical, partly fictionalized account of Charles Darwin's relationship with his eldest daughter, Annie, as he struggles to write On the Origin of Species. British naturalist Charles Darwin is a young father who lives a quiet life in an idyllic village. He is a brilliant and deeply emotional man, devoted to his wife and children. Darwin is especially fond of his eldest daughter Annie, a precocious and inquisitive ten-year-old. He teaches her much about nature and science, including his theory of evolution, and tells her true stories of his travels. Her favorite story, despite the sad ending, is about the young orangutan Jenny, who is brought from Borneo to the London Zoo, where she eventually dies of pneumonia in the arms of her keeper. Darwin is furious when he learns that the family clergyman has made Annie kneel on rock salt as punishment for contradicting him about the existence of dinosaurs, as their existence and extinction contradict the church's position that life is unchanging and perfect and that the Earth is very young.
Having returned from his expedition in the Galapagos Islands fifteen years earlier, Darwin is still working on finishing a manuscript about his findings, which substantiates his theory of evolution. The delay is caused by anxiety about his relationship with his religious wife, Emma, who fundamentally opposes his ideas. Emma worries that she may go to heaven and he may not, separating them for eternity.
After Annie becomes ill in 1851, Darwin takes her to the town of Malvern for James Manby Gully’s water cure therapy, against his wife Emma's will, but Annie becomes weaker and dies peacefully, after her father, at her request, tells her Jenny's story once more. Darwin is devastated. Annie's death sharpens Darwin’s conviction that natural laws have nothing to do with divine intervention. To his contemporaries, this is an idea so dangerous it seems to threaten the existence of God. In a box in Darwin’s study, we discover the notes and observations that will become On the Origin of Species.
The film shows Annie in flashbacks and hallucinations, a vibrant apparition who goads her father to address his fears and finish his masterwork. For a long time Annie's death is a taboo subject between Darwin and Emma, for Darwin fears that Emma blames him for their daughter's death. As a result of the strained relations between Charles and Emma, they stop making love entirely. Anguished, Darwin begins to suffer from a mysterious, fatiguing illness. - Wiki

Science Myth of the Week:
Does mixing drinks really make you drunker?
No, it doesn't. It all ends up in the same place, and no matter what order you add the alcohol, the result is the same: what matters is the total amount of alcohol consumed.


Thursday, October 14, 2010

The Reality Check Episode 72

Gambler's Fallacy + Cryonics + Super Freakonomics Review

Gambler's Fallacy
Part 1: 
So, what is the Gambler's fallacy? 
It's when a coin is flipped while people are gambling: if it comes up tails five times in a row, people will naturally think the next flip must be heads to balance it out. The reverse mistake is thinking the streak must continue, so the next flip will be tails again. In fact, this is basic statistics: under normal conditions, no matter what came before, the next toss is still 1/2 for each side.
It seems too obvious for anyone to fall for, and in a simple case like the one described above, people usually don't. But if a coin has come up tails 20 times in a row, people will be convinced it must be heads next time. The chance of a fair coin landing tails 20 times in a row is 1 in 2^20 (about 1 in a million), so if that does happen, the reasonable suspicion is that something is wrong with the coin, not that heads is now "due."
Amos Tversky and Daniel Kahneman proposed that the gambler's fallacy is a cognitive bias produced by a psychological heuristic called the representativeness heuristic. According to this view, "after observing a long run of red on the roulette wheel, for example, most people erroneously believe that black will result in a more representative sequence than the occurrence of an additional red", so people expect that a short run of random outcomes should share properties of a longer run, specifically in that deviations from average should balance out. When people are asked to make up a random-looking sequence of coin tosses, they tend to make sequences where the proportion of heads to tails stays close to 0.5 in any short segment more so than would be predicted by chance; Kahneman and Tversky interpret this to mean that people believe short sequences of random events should be representative of longer ones. The representativeness heuristic is also cited behind the related phenomenon of the clustering illusion, according to which people see streaks of random events as being non-random when such streaks are actually much more likely to occur in small samples than people expect. -Wiki
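A quick simulation, a sketch of my own rather than anything from the episode, makes the point concrete: a fair coin has no memory, so the flip after a five-tail streak is still heads half the time.

import random

random.seed(0)
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect every flip that immediately follows five tails in a row.
after_streak = [flips[i] for i in range(5, len(flips))
                if flips[i - 5:i] == ["T"] * 5]

print(f"flips following a 5-tail streak: {len(after_streak)}")   # ~31,000
print(f"fraction heads: {after_streak.count('H') / len(after_streak):.3f}")
# ~0.500, not the >0.5 that the gambler's fallacy predicts.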

Part 2: 
Cryonics is the low-temperature preservation of humans and animals who can no longer be sustained by contemporary medicine, with the hope that healing and resuscitation may be possible in the future. Cryopreservation of people or large animals is not reversible with current technology. The stated rationale for cryonics is that people who are considered dead by current legal or medical definitions may not necessarily be dead according to the more stringent information-theoretic definition of death. It is proposed that cryopreserved people might someday be recovered by using highly advanced future technology. The future repair technologies assumed by cryonics are still hypothetical and not widely known or recognized. Cryonics is, therefore, regarded with skepticism by most scientists and physicians, although some do support it. As of 2010, only around 200 people have undergone the procedure since it was first proposed in 1962. In the US, cryonics can only be legally performed on humans after they have been pronounced legally dead. Cryonics procedures ideally begin within minutes of cardiac arrest, and use cryoprotectants to prevent ice formation during cryopreservation. However, the idea of cryonics also includes preservation of people after longer post-mortem delays because of the possibility that brain structures encoding memory and personality may still persist or be inferable. Whether sufficient brain information still exists for cryonics to work under some preservation conditions may be intrinsically unprovable by present knowledge. Therefore, most proponents of cryonics see it as an intervention with prospects for success that vary widely depending on circumstances. -Wiki

SuperFreakonomics
Science Myth of the Week: 
Summary of SuperFreakonomics:
Walking drunk is, per mile, much more deadly than driving drunk.
How pimps are like Realtors.
Why suicide bombers should buy life insurance.
How Iran uses incentives, and not altruism, to get kidney donors.
Children who watch a lot of TV are more likely to engage in crime when they get older.
The profit motive encourages doctors to administer chemotherapy, even though it's not effective in saving more lives.
The Endangered Species Act has perverse incentives for landowners, causing them to clear habitat.
Buying locally produced food increases greenhouse-gas emissions.
You're more likely to solve global warming by throwing sulfur dioxide into the air than through any incentives Al Gore has in mind for getting people to use less energy.
Monkeys can learn the value of money, and once they do, they'll even trade it for sex.