“My goal today is to be better than yesterday so wait until you see what I do "tomorrow."” - Alien Ness

Tuesday, October 19, 2010

The Reality Check Episode 84

Miraculin Tablets + Price Discrimination + Does Eating Late Cause Weight Gain?

Miracle Tablets?
Part 1: 
So, what is this miracle fruit tablet?
The plant is called Synsepalum dulcificum. It produces berries that, when eaten, cause sour foods consumed afterward to taste sweet. According to the few studies done, the effect is due to miraculin, which is used commercially as a sugar substitute. The berry itself has a low sugar content and a mildly sweet tang. It contains a glycoprotein molecule, with some trailing carbohydrate chains, called miraculin. When the fruit is eaten, the molecule binds to the taste buds, and when you then eat sour foods, the molecule makes them taste sweet. No one really knows how it changes the perception from sour to sweet. That's why it's called a miracle fruit tablet.
The berry has been used in West Africa since at least the 18th century, when European explorer Chevalier des Marchais, who searched for many different fruits during a 1725 excursion to its native West Africa, provided an account of its use there. Des Marchais noticed that local people picked the berry from shrubs and chewed it before meals.
An attempt was made in the 1970s to commercialize the fruit's ability to turn non-sweet foods sweet without a caloric penalty, but it ended in failure when the FDA classified the berry as a food additive. There were controversial circumstances, with accusations that the project was sabotaged and the research burgled by the sugar industry to prevent the loss of business that a drop in the need for sugar would cause. The US Food and Drug Administration (FDA) has always denied that pressure was put on it by the sugar industry but has refused to release any files on the subject. Similar arguments are made about the FDA's regulation of stevia, now labeled as a "dietary supplement" instead of a "sweetener".
For a time in the 1970s, US dieters could purchase a pill form of miraculin. It was at this time that the idea of the "miraculin party" was conceived. Recently, this phenomenon has enjoyed some revival in food-tasting events, referred to as "flavor-tripping parties" by some. The tasters consume sour and bitter foods, such as lemons, radishes, pickles, hot sauce, and beer, to experience the taste changes that occur.

Part 2:
What is Price Discrimination? 
Price discrimination or price differentiation exists when sales of identical goods or services are transacted at different prices from the same provider. In a theoretical market with perfect information, perfect substitutes, and no transaction costs or prohibition on secondary exchange (or re-selling) to prevent arbitrage, price discrimination can only be a feature of monopolistic and oligopolistic markets, where market power can be exercised. Otherwise, the moment the seller tries to sell the same good at different prices, the buyer at the lower price can arbitrage by selling to the consumer buying at the higher price, but at a slight discount. However, product heterogeneity, market frictions or high fixed costs (which make marginal-cost pricing unsustainable in the long run) can allow for some degree of differential pricing to different consumers, even in fully competitive retail or industrial markets. Price discrimination also occurs when the same price is charged to customers who have different supply costs.
The effects of price discrimination on social efficiency are unclear; typically such behavior leads to lower prices for some consumers and higher prices for others. Output can be expanded when price discrimination is very efficient, but output can also decline when discrimination is more effective at extracting surplus from high-valued users than expanding sales to low valued users. Even if output remains constant, price discrimination can reduce efficiency by misallocating output among consumers.
Price discrimination requires market segmentation and some means to discourage discount customers from becoming resellers and, by extension, competitors. This usually entails using one or more means of preventing resale, keeping the different price groups separate, making price comparisons difficult, or restricting pricing information. The boundaries set up by the marketer to keep segments separate are referred to as a rate fence. Price discrimination is thus very common in services, where resale is not possible; an example is student discounts at museums. Price discrimination in intellectual property is also enforced by law and by technology. In the market for DVDs, DVD players are designed - by law - with chips to prevent an inexpensive copy of a DVD (for example, one legally purchased in India) from being used in a higher-price market (like the US). The Digital Millennium Copyright Act has provisions to outlaw circumventing such devices, to protect the enhanced monopoly profits that copyright holders can obtain from price discrimination against higher-price market segments.
Price discrimination can also be seen where the requirement that goods be identical is relaxed. For example, so-called "premium products" (including relatively simple products, such as cappuccino compared to regular coffee) have a price differential that is not explained by the cost of production. Some economists have argued that this is a form of price discrimination exercised by providing a means for consumers to reveal their willingness to pay.
Science Myth of the Week:
Does eating at night make you fat?
It is simply not true.
Read the facts here: Festive medical myths

The Reality Check Episode 83

Easter Origins + Vitamin C for the Common Cold + Brain Calorie Loss

Easter Origins!
Part 1: 
Easter Origins?
Easter Origin - A One-time Event
Easter origin, as a Christian holiday, can be found in the pages of scripture itself. Matthew, Mark, Luke and John, all followers of Jesus, offer their own unique eyewitness accounts of the crucifixion and resurrection of Jesus Christ. It is this culminating event of Christianity that is celebrated on Easter Sunday every year.
Easter Origin - The Resurrection of Jesus Christ
Easter origin actually began as a part of the Jewish Passover, as Christ was crucified and resurrected during Passover week. Christ is believed by Christians to actually be the Passover Lamb spoken of in Exodus, for He Himself became the perfect, sinless sacrifice for the sins of all people. Jews who chose to follow Christ then honored this day in succeeding years during the Passover season, but as Christianity was spread throughout non-Christian nations, the celebration of Easter was gradually combined with pagan "rites of spring" traditions. Modern celebrations are the result of this compromise. At the same time, Easter is often the only day that many people attend church and are introduced to the "Good News" of Jesus Christ.
Easter Origin - Christ Revealed in the Jewish Passover
Easter origin can be traced to the Passover ceremony itself. Christian scholars believe that the Old Testament is Christ concealed, while the New Testament is Christ revealed. Let's hold the elements of the Passover up to the light of the life of Christ. By tradition, the lamb to be sacrificed during the Passover was selected four days before the sacrifice was to be made. Jesus rode into Jerusalem four days before He was crucified. The lamb was customarily slain at 3 p.m. on Passover. Jesus uttered the words "it is finished" and died on the cross at 3 p.m. (this is known traditionally as Good Friday, but many Bible scholars have determined the crucifixion to be on a Wednesday or Thursday). The festival of Unleavened Bread began at sunset. One of the rituals involved the sacrifice of a grain offering, representing the first fruits of the harvest. Jesus, according to the Apostle Paul, became the first fruits of those raised from the dead. During the Passover dinner, three matzahs are put together. Christians see these matzahs as representative of the Father, Son and Holy Spirit. The middle matzah is broken, as Christ said at the Last Supper, "This is My body, broken for you." The middle matzah is also striped and pierced, as Jesus was during His crucifixion, and as was prophesied in Isaiah 53:5, Psalm 22:16 and Zechariah 12:10. This matzah is then wrapped in a white cloth and hidden, just as Christ was wrapped in linen and laid in the tomb.
Easter Origin - The Biblical Accounts
Easter (also known as Resurrection Day), is the event upon which the entire Christian faith hinges. Paul, once a Jewish leader hostile to Christians, became a convert when he met Jesus on the Road to Damascus. As an eyewitness of Christ, Paul made it abundantly clear that without the resurrection, there is no basis for faith in Christ: Now if Christ be preached that he rose from the dead, how say some among you that there is no resurrection of the dead? But if there be no resurrection of the dead, then is Christ not risen: And if Christ be not risen, then is our preaching vain, and your faith is also vain.
When Christ was born, He fulfilled a number of Old Testament prophecies concerning the Messiah. By the time of His crucifixion, resurrection and ascension, He had fulfilled more than 300 of them. These numbers alone provide staggering evidence that Jesus Christ was the promised Messiah. So it is with good reason that Christians the world over regard Easter as a very special event. But in the early days of the church, most Christians were Jewish converts. Because Jesus was crucified and rose again during the Passover season, their celebration of Christ's resurrection was acknowledged during that annual observance of the deliverance from bondage in Egypt. Christian Jews consider the Passover to be symbolic of the time when Christ set all believers free from the penalty of sin and death.
Easter Origin - What Does the Resurrection Mean to You?
Easter origin? Can a man who claims to be God and then rises from the dead actually be God in human form? Is He someone you should follow? C.S. Lewis asked those same questions and came to the conclusion that there are only three possibilities. Jesus Christ claimed to be God. Therefore, to say He is just a "good man" or "great teacher" is to call him a liar. Any sane person who would claim to be God, but who in fact, is not, must then be a madman - a lunatic! If Christ is neither a liar nor a lunatic, then there is only one other possible conclusion - He must be the Lord! If He is the Lord, what does Resurrection Day mean to you?

Part 2:
So, does Vitamin C help with colds? 
At the very first sign of cold symptoms, many people reach right for a bottle of vitamin C supplements. Vitamin C for the common cold is such a widely accepted treatment that we seek it out in lots of products, such as fortified juices, cough drops, and tea.
Vitamin C was first touted for the common cold in the 1970s. But despite its widespread use, experts say there's very little proof that vitamin C actually has any effect on the common cold.
What is vitamin C?
Vitamin C is an important vitamin and antioxidant that the body uses to keep you strong and healthy. Vitamin C is used in the maintenance of bones, muscle, and blood vessels. Vitamin C also assists in the formation of collagen and helps the body absorb iron.
Vitamin C is found naturally in vegetables and fruits, especially oranges and other citrus fruits. This key vitamin is also available as a natural dietary supplement in the form of vitamin C pills and vitamin C chewable tablets.
Can Vitamin C Prevent or Treat Cold Symptoms?
Vitamin C has been studied for many years as a possible treatment for colds, or as a way to prevent colds. But findings have been somewhat inconsistent. Overall, experts have found little to no benefit for vitamin C preventing or treating the common cold.
In a July 2007 study, researchers wanted to discover whether taking 200 milligrams or more of vitamin C daily could reduce the frequency, duration, or severity of a cold. After reviewing 60 years of clinical research, they found that when taken after a cold starts, vitamin C supplements do not make a cold shorter or less severe. When taken daily, vitamin C very slightly shortened cold duration -- by 8% in adults and by 14% in children.
But researchers found the greatest effect in people under extreme physical stress, such as marathon runners. In this group, taking vitamin C cut the risk of catching a cold in half.
So what does all this mean?
The average adult who suffers with a cold for 12 days a year would still suffer for 11 days a year if that person took a high dose of vitamin C every day during that year.
For the average child who suffers about 28 days of cold illness a year, taking daily high-dose vitamin C would still mean 24 days of cold illness.
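The arithmetic behind those day counts is easy to check. Here is a minimal sketch, assuming the 8% and 14% reductions quoted above:

```python
# Back-of-envelope check of the cold-duration figures above.
# 8% and 14% are the reported reductions from daily vitamin C.
adult_days, child_days = 12, 28          # cold days per year without supplements
adult_with_c = adult_days * (1 - 0.08)   # about 11 days
child_with_c = child_days * (1 - 0.14)   # about 24 days
print(round(adult_with_c), round(child_with_c))  # 11 24
```

So even with daily high-dose vitamin C, the saving is roughly one day a year for adults and four for children.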
When vitamin C was tested for treatment of colds in 7 separate studies, vitamin C was no more effective than placebo at shortening the duration of cold symptoms.
Is Vitamin C Safe to Take?
In general, vitamin C is safe to take when ingested through food sources such as fruits and vegetables. For most people, taking vitamin C supplements in the recommended amounts is also safe. The RDA or recommended daily allowance is 90 mg for men and 75 mg for women. High doses of vitamin C (greater than 2000 milligrams per day for adults) may cause kidney stones, nausea, and diarrhea.
If you're unsure about taking vitamin C for colds, talk to your healthcare provider. Your doctor can answer any questions about vitamin C and colds and about any other dietary supplement that you are taking. - WebMD

Science Myth of the Week:
Can you lose weight just by thinking really really hard?
The answer is no, you still have to exercise and eat healthy.

The Reality Check Episode 82

Mike Duffy and Critical Thinking + HFCS Study + Frown Myth

Frown takes more muscles than smiles?
Part 1:
The Reality Check is going to talk a little about Mike Duffy and what he thinks about critical thinking.
Michael Dennis Duffy is a Canadian Senator and former Canadian television journalist. Prior to his appointment to the upper house he was the Ottawa editor for CTV News Channel, and a host of Mike Duffy Live and Countdown with Mike Duffy on the network. Duffy sits in the Senate as a Conservative, representing Prince Edward Island.
Senator Mike Duffy has attacked the University of King’s College and other Canadian journalism schools for exposing students to Noam Chomsky and critical thinking. In a speech Saturday to Conservative party members in Amherst, Duffy reportedly slammed journalism programs for churning out leftist graduates. “When I went to the school of hard knocks, we were told to be fair and balanced,” Duffy was quoted from his speech in yesterday’s issue of the Amherst Daily News. “That school doesn’t exist any more. Kids who go to King’s, or the other schools across the country, are taught from two main texts.”
According to Duffy — a former CTV News journalist appointed to the Senate last year by Prime Minister Stephen Harper — those two texts are Manufacturing Consent, Chomsky’s book on mainstream media, and books about the theory of critical thinking. “When you put critical thinking together with Noam Chomsky, what you’ve got is a group of people who are taught from the ages of 18, 19 and 20 that what we stand for, private enterprise, a system that has generated more wealth for more people because people take risks and build businesses, is bad,” Duffy is quoted as saying.
Duffy then told Conservatives they have nothing to apologize for because most Canadians are not “on the fringe where these other people are.”
Kim Kierans, head of the King’s School of Journalism, was surprised to hear Duffy’s comments. She said Manufacturing Consent isn’t part of the curriculum, though students do read some Chomsky. She made no apologies for teaching critical thinking. “We’re trying to teach people to have critical thinking skills, to hold accountable anyone who is in any way in authority,” she said. “It doesn’t matter if it’s the Conservatives, the NDP, the Green party, they’re all fair game in the sense that they have to be able to be transparent.” - Metro
Listen to the Interview here: The Reality Check Episode 82

Part 2: 
So, does high fructose corn syrup make people fatter than just sugar?

Science Myth of the Week: 
So, does it take more to frown than smile? 
You've likely been told that it takes fewer muscles to smile than it does to frown, and that, in light of this fact, you should smile more often. There are quite a few numbers that get tossed around when this line is used. Some claim it takes 43 muscles to frown and 17 to smile, but open Aunt Milda's chain letter and you might be surprised to learn it takes 26 to smile and 62 to frown. And some naysayers claim it's quite the opposite, that in fact it takes more muscles to smile than to frown.

Monday, October 18, 2010

The Reality Check Episode 81

Buying the Cosmos + TV Ratings + Head Heat Loss Myth

Buying stars?
Part 1:
So, can you actually buy a star and name it?
It is true! Although it is quite expensive, you can buy a star and name it.
"Who gave them the right to name stars? And then charge someone for the name?"
The answer is simple: Nobody gave them the right. They just do it.
At least half a dozen companies are offering to attach names to stars while making the designations seem official, providing a fancy certificate and directions for locating the newly named point of light. Their promotional strategies range from harmlessly playful to bordering on fraudulent. Meanwhile the night sky is being populated with unofficial names, at $49.95 a pop, one unsuspecting buyer at a time.
What you really get
It's not hard to grasp the romantic or otherwise wondrous reasons someone might have for buying a star name, especially as a gift. It's also important for potential buyers to know what they'd actually get.
Pretty much nothing, beyond some very expensive paper.
Only the International Astronomical Union (IAU) has the right to officially name celestial objects. It does so for scientific purposes only and does not recognize any commercial naming systems. The IAU, viewed by astronomers as the reputable governing body, is well aware of the sea of commercial star vendors. It has this to say:
"The IAU dissociates itself entirely from the commercial practice of 'selling' fictitious star names."
Some folks wonder, understandably, why stars are not given names in lieu of boring numbers.
The IAU does recognize a handful of ancient star names, given to some of the brightest stars in our sky. But with millions and millions of stars out there, it wisely decided long ago that a numbering system is more useful for scientists.
As the IAU puts it, "Finding Maria Gonzalez in Argentina or John Smith in Britain just from their names is pretty hopeless, but if you know their precise address (perhaps from their social security number) you can contact them without knowing their name at all."
As a web site called Name a Star admits, "Scientists will never want to deal with finding 'Aunt Martha's Star.'" This company deserves a gold star for forthrightness.

Part 2:
Install TV meter boxes in a sampling of homes. These are boxes that keep track of exactly what a person is watching at any moment, and for how long. The sampling of homes you choose is important; they should be people from a variety of different age groups, nationalities and sexes. Nielsen Media Research, the company in charge of tracking TV ratings for the United States and Canada, keeps meter boxes in about 5,000 U.S. homes at any given time.
Obtain national statistics on the citizens of the United States. The Census can be extremely helpful in this, as it breaks down people by age, income, etc.
Take a look at the results you are getting from the meter boxes. Each person who watches a particular show is a representative for the part of the U.S. population they most fit into. For example, if a house containing a husband and wife in their forties with no kids watched a particular show at a particular time, it is safe to assume most people who fit that specific description watched the same show as well. You would compare this sampling with all others that met that exact description, see what percentage of your samples watched a show, and then apply that percentage to the general population.
Multiply the number of people meeting a specific description in your sample group who watched a particular show by the number of people in the U.S. who fit that particular description. The number you get is the estimated number of people who watched that show.
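The extrapolation described in these steps can be sketched in a few lines. All numbers here are hypothetical placeholders, not Nielsen's actual figures:

```python
# Sketch of the ratings extrapolation described above.
def estimate_viewers(sample_watched, sample_size, population_in_group):
    """Scale a demographic sample's viewing share up to the whole population."""
    share = sample_watched / sample_size   # fraction of the sample that watched
    return share * population_in_group     # estimated viewers nationwide

# e.g. 40 of 200 sampled "forties couple, no kids" homes watched the show,
# and (hypothetically) 5,000,000 US households fit that description:
print(estimate_viewers(40, 200, 5_000_000))  # 1000000.0
```

Summing these estimates across every demographic group gives the show's total estimated audience.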

Science Myth of the Week:
So, do you lose most of your heat from your head? 
The claim originated with the US military in the 1970s, which said that 45% to 48% of body heat is lost from the head.
But it's actually not true.

The Reality Check Episode 80


IQ Correlations + Dr Yoni Freedhoff Interview + Trillium Myth
Part 1: 
Political, religious and sexual behaviors may be reflections of intelligence, a new study finds.
Evolutionary psychologist Satoshi Kanazawa at the London School of Economics and Political Science correlated data on these behaviors with IQ from a large national U.S. sample and found that, on average, people who identified as liberal and atheist had higher IQs. This applied also to sexual exclusivity in men, but not in women. The findings will be published in the March 2010 issue of Social Psychology Quarterly.
The IQ differences, while statistically significant, are not stunning -- on the order of 6 to 11 points -- and the data should not be used to stereotype or make assumptions about people, experts say. But they show how certain patterns of identifying with particular ideologies develop, and how some people's behaviors come to be.
The reasoning is that sexual exclusivity in men, liberalism and atheism all go against what would be expected given humans' evolutionary past. In other words, none of these traits would have benefited our early human ancestors, but higher intelligence may be associated with them.
"The adoption of some evolutionarily novel ideas makes some sense in terms of moving the species forward," said George Washington University leadership professor James Bailey, who was not involved in the study. "It also makes perfect sense that more intelligent people -- people with, sort of, more intellectual firepower -- are likely to be the ones to do that."
Bailey also said that these preferences may stem from a desire to show superiority or elitism, which also has to do with IQ. In fact, aligning oneself with "unconventional" philosophies such as liberalism or atheism may be "ways to communicate to everyone that you're pretty smart," he said.
The study looked at a large sample from the National Longitudinal Study of Adolescent Health, which began with adolescents in grades 7-12 in the United States during the 1994-95 school year. The participants were interviewed as 18- to 28-year-olds from 2001 to 2002. The study also looked at the General Social Survey, another cross-national data collection source.
Kanazawa did not find that higher or lower intelligence predicted sexual exclusivity in women. This makes sense, because having one partner has always been advantageous to women, even thousands of years ago, meaning exclusivity is not a "new" preference.
For men, on the other hand, sexual exclusivity goes against the grain evolutionarily. With a goal of spreading genes, early men had multiple mates. Since women had to spend nine months being pregnant, and additional years caring for very young children, it made sense for them to want a steady mate to provide them resources.
Religion, the current theory goes, did not necessarily help people survive or reproduce; rather, it grew out of a useful tendency toward paranoia, Kanazawa said. Assuming that, for example, a noise in the distance signals a threat helped early humans prepare in case of danger.
"It helps life to be paranoid, and because humans are paranoid, they become more religious, and they see the hands of God everywhere," Kanazawa said.
Participants who said they were atheists had an average IQ of 103 in adolescence, while adults who said they were religious averaged 97, the study found. Atheism "allows someone to move forward and speculate on life without any concern for the dogmatic structure of a religion," Bailey said.
"Historically, anything that's new and different can be seen as a threat in terms of the religious beliefs; almost all religious systems are about permanence," he noted.
The study takes the American view of liberal vs. conservative. It defines "liberal" in terms of concern for genetically nonrelated people and support for private resources that help those people. It does not look at other factors that play into American political beliefs, such as abortion, gun control and gay rights.
"Liberals are more likely to be concerned about total strangers; conservatives are likely to be concerned with people they associate with," he said.
Given that human ancestors had a keen interest in the survival of their offspring and nearest kin, the conservative approach -- looking out for the people around you first -- fits with the evolutionary picture more than liberalism, Kanazawa said. "It's unnatural for humans to be concerned about total strangers," he said. The study found that young adults who said they were "very conservative" had an average adolescent IQ of 95, whereas those who said they were "very liberal" averaged 106.
It also makes sense that "conservatism" as a worldview of keeping things stable would be a safer approach than venturing toward the unfamiliar, Bailey said. Neither Bailey nor Kanazawa identify themselves as liberal; Bailey is conservative and Kanazawa is "a strong libertarian."
Vegetarianism, while not strongly associated with IQ in this study, has been shown to be related to intelligence in previous research, Kanazawa said. This also fits into Bailey's idea that unconventional preferences appeal to people with higher intelligence, and can also be a means of showing superiority.
None of this means that the human species is evolving toward a future where these traits are the default, Kanazawa said.
"More intelligent people don't have more children, so moving away from the trajectory is not going to happen," he said.
Part 2:
Interview with Dr Yoni Freedhoff!
Given the chance, cows nurture their young and form lifelong friendships with one another. They play games and have a wide range of emotions and personality traits. But most cows raised for the dairy industry are intensively confined, leaving them unable to fulfill their most basic desires, such as nursing their calves, even for a single day. They are treated like milk-producing machines and are genetically manipulated and pumped full of antibiotics and hormones that cause them to produce more milk. While cows suffer on factory farms, humans who drink their milk increase their chances of developing heart disease, diabetes, cancer, and many other ailments.
Listen to the Interview here: The Reality Check Episode 80
Science Myth of the Week:
So, is it illegal to pick Trillium from a government area?
Not yet, but soon it will be.



The Reality Check Episode 79

Olympics Revisited + Euphemism Treadmill + Audiobooks vs Books

Part 1:
So, how do we actually decide who won the Winter Olympics? Do we just count the total number of medals? Or do we count who has the most gold medals?
If we count total medals, then the US won. If we count gold medals, then Canada won. But there's another issue: country population. Norway is far smaller than the US by population, yet Norway took home 23 medals to the US's 32.
We can also look at the number of athletes each country sends and compute what percentage of them won medals. Another wrinkle: if a country sends two teams to the same event, it is impossible for both to win gold.
Medal counts are also affected by the funding each country gives to its athletes.
So, if you only win by 1/100th of a second, is it really skill, or just chance? Of course, if you win by 3 seconds, you clearly won on merit.
So, basically, don't just blindly look at the numbers and declare which country is better at the Winter Olympics.
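One way to make the comparison less blind is to normalize the medal table, for example per million people or per athlete sent. A rough sketch, where the medal, population, and team-size figures are illustrative placeholders rather than official Vancouver 2010 data:

```python
# Normalizing a medal table two ways: per million residents and per athlete.
# All figures below are illustrative placeholders, not official data.
countries = {
    # name: (total_medals, population, athletes_sent)
    "Country A": (32, 309_000_000, 215),
    "Country B": (26, 34_000_000, 206),
    "Country C": (23, 4_900_000, 100),
}

for name, (total, pop, athletes) in countries.items():
    per_million = total / (pop / 1_000_000)  # medals per million people
    per_athlete = total / athletes           # medals per athlete sent
    print(f"{name}: {per_million:.2f} per million, {per_athlete:.2f} per athlete")
```

Under either normalization the small country can leapfrog the raw-count leader, which is exactly the point the episode makes.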

Part 2: 
Euphemisms often evolve over time into taboo words themselves, through a process described by W. V. O. Quine and more recently dubbed the "euphemism treadmill" by Steven Pinker. This is the well-known linguistic process known as “pejoration” or “semantic change”.
Words originally intended as euphemisms may lose their euphemistic value, acquiring the negative connotations of their referents. In some cases, they may be used mockingly and become dysphemisms.
In his remarks on the ever-changing London slang, made in Down and Out in Paris and London, George Orwell mentioned both the euphemism treadmill and the dysphemism treadmill. He did not use these now-established terms, but observed and commented on the respective processes as early as 1933.
For example, the term "concentration camp", to describe camps used to confine civilian members of the Boer community in close (concentrated) quarters, was used by the British during the Second Boer War, primarily because it sounded bland and inoffensive. Despite the high death rates in the British concentration camps, the term remained acceptable as a euphemism. However, after Nazi Germany used the expression to describe its death camps in the 1930s and 1940s, the term gained a widespread negative connotation, particularly in connection with the Holocaust.
Also, in some circles the euphemisms "lavatory" and "toilet" came to be considered inappropriate and were replaced with "bathroom" and "water closet", which in turn have been replaced by some with "restroom" and "W.C." These are also examples of euphemisms that are geographically concentrated. The term "restroom" is rarely used outside the United States. "W.C." was previously quite popular in the United Kingdom but is passing out of favor there, while becoming more popular in France, Germany and Hungary as the polite term of choice. - Wiki

Science Myth of the Week: 
A new study by Carnegie Mellon University scientists shows that because of the way the brain works, we understand spoken and written language differently, something that has potential implications in the workplace and in education, among many other areas.
In the first imaging study that directly compares reading and listening activity in the human brain, Carnegie Mellon scientists discovered that the same information produces systematically different brain activation. And knowing what parts of the brain fire during reading or listening comprehension affects the answer to one of the classic questions about language comprehension: whether the means of delivery through eyes or ears makes a difference.
"The brain constructs the message, and it does so differently for reading and listening. The pragmatic implication is that the medium is part of the message. Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper," said Carnegie Mellon Psychology Professor Marcel Just, co-author of the report that appears in this month's issue of the journal Human Brain Mapping.
Just said that the most recent methods of functional magnetic resonance imaging (fMRI) were applied to measure brain activity during these high-level conceptual processes. Rather than examining the processing of single digits or words, his group is applying brain imaging to societal, workplace, and instructional issues.
"We can now see how cell phone use can affect driving, how reading differs from listening, and how visual thinking is integrated with verbal thinking," Just said.
Using the non-invasive fMRI, scientists were able to measure the amount of activity in each of 20,000 peppercorn-sized regions of the brain every three seconds and create visual maps of how the mental work of thinking was allocated throughout the brain from moment to moment. To the scientists' surprise, there were two big differences in the brain activity patterns while participants were reading or listening to identical sentences, even at the conceptual level of understanding the meaning of a sentence.
First, during reading, the right hemisphere was not as active as anticipated, which opens the possibility that there were qualitative differences in the nature of the comprehension we experience in reading versus listening. Second, while listening was taking place, there was more activation in the left hemisphere brain region called the pars triangularis (the triangular section), a part of Broca's area that usually activates when there is language processing to be done or there is a requirement to maintain some verbal information in an active state (sometimes called verbal working memory). The greater amount of activation in Broca's area suggests that there is more semantic processing and working memory storage in listening comprehension than in reading.
Because spoken language is so temporary, each sound hanging in the air for a fraction of a second, the brain is forced to immediately process or store the various parts of a spoken sentence in order to be able to mentally glue them back together in a conceptual frame that makes sense. "By contrast," Just said, "written language provides an 'external memory' where information can be re-read if necessary. But to replay spoken language, you need a mental playback loop (called the articulatory-phonological loop), conveniently provided in part by Broca's area."
The study doesn't attempt to suggest that one means of delivering information is better than another, Just said. "Is comprehension better in listening or in reading? It depends on the person, the content of the text, and the purpose of the comprehension. In terms of persons, some people are more skilled at one form of comprehension and typically exercise a preference for their more skilled form where possible. It may be that because of their experience and biology they are better and more comfortable in listening or reading," he explained. -CM

Sunday, October 17, 2010

The Reality Check Episode 78

Chicken/Egg Precedent Problem + Weasel Words + Gutenberg

Which came first?
Part 1:
So, which came first? The chicken or the egg? 
The question “which came first, the chicken or the egg?” looks at first glance like a matter of straightforward reproductive biology.  But before we can even begin to answer it, we must define our terms, because it is actually a classic case of semantic ambiguity…a problem of meaning and interpretation.  Specifically, while the term “chicken” is biologically unambiguous – we all know what a chicken looks, sounds and tastes like - the term “egg” is somewhat more general and is therefore a possible source of ambiguity.  
Do we mean (1) just any egg, or (2) a chicken egg?  And if we’re talking about a chicken egg, then is a “chicken egg” (2a) an egg laid by a chicken, (2b) an egg containing a chicken, or (2c) both?  Reformulating the question to reflect each possible meaning of “egg” leads to four distinct versions of the chicken-or-egg question.
1.  Which came first, the chicken or (just any old) egg?
2a.  Which came first, the chicken or an egg laid by a chicken?
2b.  Which came first, the chicken or an egg containing a chicken?
2c.  Which came first: the chicken, or an egg laid by and containing a chicken? 
Contrary to popular belief, there is indeed a definite answer to each of these questions. Specifically, the answers are:  (1)  The egg.  (2a)  The chicken.  (2b)  The egg.  (2c)  The chicken.  Given some knowledge of logic and biology, these answers are not hard to verify.  To get this show on - or should that be across? - the road, let’s go through them in order.
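The four readings and their answers can be summarized in a small illustrative snippet (the dictionary keys and one-line rationales are ours, condensing the arguments made below, not part of the episode itself):

```python
# Each reading of "chicken or egg?" maps to the answer argued in the text,
# with a one-line reason for each. Labels and structure are illustrative only.
answers = {
    "1: chicken vs. any egg": ("egg", "egg-laying species predate chickens by millions of years"),
    "2a: chicken vs. egg laid by a chicken": ("chicken", "by definition, a chicken must exist before it can lay"),
    "2b: chicken vs. egg containing a chicken": ("egg", "the species-defining mutation is expressed in the offspring, so protochickens laid the first chicken egg"),
    "2c: chicken vs. egg laid by and containing a chicken": ("chicken", "the 'laid by' requirement logically dominates the 'containing' requirement"),
}

for question, (winner, reason) in answers.items():
    print(f"{question} -> {winner} ({reason})")
```

Running it simply prints each version with its answer and rationale, making the symmetry of the four cases easy to see at a glance.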
First, consider question 1: which came first, the chicken or (just any old) egg?  This question is answered “the egg” because species that lay eggs have been around a lot longer than modern chickens.  For example, we have plenty of fossil evidence that dinosaurs laid eggs from which baby dinosaurs hatched, and dinosaurs predate chickens by millions of years.  Indeed, a growing body of research indicates that dinosaurs were among the biological ancestors of chickens!
Now let’s look at question 2a: which came first, the chicken or an egg laid by a chicken?  The answer to this question is “the chicken” on semantic grounds alone.  That is, if a chicken egg must be laid by a chicken, then before a chicken egg can exist, there must by definition be a chicken around to lay it.  And question 2c - which came first, the chicken or an egg laid by and containing a chicken? - is answered the same way on the same grounds; logically, the fact that a chicken egg must be laid by a chicken precedes and therefore “dominates” the (biologically subsequent) requirement that it contain a chicken.  So whereas we needed paleozoological evidence to answer question 1, questions 2a and 2c require practically no biological knowledge at all! 
Having saved the best for last, let us finally consider the most interesting version, 2b:  which came first, the chicken or an egg containing a chicken?  This version is interesting because an egg containing a chicken might have been laid by a chicken or a non-chicken, which of course affects the answer.  Thanks to modern genetic science, we can now be sure that the egg came first.  This is because reproductive mutations separating a new species from its progenitor generally occur in reproductive rather than somatic DNA and are thus expressed in differences between successive generations, but not in the parent organisms themselves.  While the somatic (body) cells of the parents – e.g. wing cells, drumstick cells and wishbone cells - usually contain only the DNA with which they were conceived, germ (reproductive) cells like ova and spermatozoa contain non-somatic DNA that may have been changed before or during mating by accidental deletion, insertion, substitution, duplication or translocation of nucleotide sequences.  This is what causes the mutation that results in the new species. 
Where an animal qualifies as a member of a given species only if its somatic DNA (as opposed to its reproductive DNA) conforms to the genotype of the species, the parents of the first member of a new species are not members of that new species.  At the same time, all the biological evidence says that the ancestors of modern chickens were already oviparous or egg-laying…that a male and a female member of the ancestral species of the modern chicken, call this species “protochicken”, mated with each other and created an egg.  (Could the first chicken have evolved from a viviparous or live-bearing species, and after being born alive, have started laying eggs?  All the biological evidence says “no”.)  But because their act of mating involved a shuffling of reproductive genes that were not expressed in the body of either parent – if they had been expressed there, the parents would themselves have been members of the new species - the fetus inside the egg was not like them.  Instead, it was a mutant…a modern chicken! 
Only two loose ends remain: the “gradual” and “sudden” extremes of the evolutionary spectrum.  These extremes are evolutionary gradualism - Darwin’s original slow-paced timetable for natural selection - and punctuated evolution, as advocated more recently by evolutionary theorists including the controversial Stephen Jay Gould. 
Gradualism says that mutations are biologically random, but subject to a selection process determined by environmental (external) conditions to which species must adapt over the course of many generations.  Taken to the limit, it implies either that each minor mutation that occurs during the evolutionary change of one species into another is random and independent of any other mutation, in which case a useful combination of mutations is highly improbable, or that each individual mutation confers a selective advantage on the mutant…that every evolutionary advantage of a new species over its precursor decomposes into smaller advantages combined in a more or less linear way.  Unfortunately, this makes it almost impossible to explain complex biological structures that do not break down into smaller structures useful in their own right…structures like bacterial cilia and flagella, and even the human eye. 
The hypothetical gradualistic evolution of one species into another via mutations accumulated over many generations leads to the following question: when does the quality and quantity of mutations justify a distinction between “species”…when does a protochicken become a chicken? It’s a good question, but our chicken-or-egg answers remain valid no matter how we answer it. 
At the other extreme, evolution sometimes appears to progress by leaps and bounds, moving directly from the old to the new in “punctuated” fashion.  And to complicate matters, this sometimes seems to happen across the board, affecting many species at once.  The most oft-cited example of punctuated evolution is the Cambrian Explosion.  Whereas sedimentary rocks that formed more than about 600 million years ago are poor in fossils of multicellular organisms, slightly younger rocks contain a profusion of such fossils conforming to many different structural templates.  The duration of the so-called “explosion”, a mere geological eye blink of no more than 10 million years or so, is inconsistent with gradualism; new organs and appendages must have been popping out faster than the environment alone could have selected them from a field of random mutations.  Clearly, the sudden appearance of a new appendage would leave little doubt about the evolutionary demarcation of ancestral and descendant species.
But the kind of punctuated evolution that occurs between generations is not the end of the line in sheer biological acceleration.  Sometimes, an evolutionary change seems to occur within the lifespan of a single organism!  For example, in the spirit of “ontogeny recapitulates phylogeny”, insect metamorphosis almost seems to hint at an evolutionary process in which an ancient grub or caterpillar underwent a sudden transformation to something with wings and an exoskeleton…or alternatively, in which a hard-shelled flying bug suddenly gave birth to an egg containing a soft and wormy larva.  While that’s not what really happened – as is so often the case, the truth lies somewhere in the middle - what occurred was just as marvelous and just as punctuated. 
What seems to have happened was this.  Due to a reproductive mutation, a whole sequence of evolutionary changes originally expressed in the fetal development of an ancestral arthropod, and originally recapitulated within the womb and egg it inhabited, were suddenly exposed to the environment, or at least to the hive, in a case of “ovum interruptus”.  A fetal stage of morphogenesis that formerly occurred within womb and egg was interrupted when the egg hatched “prematurely”, making the soft fetus into an equally soft larva and giving it a valuable opportunity to seek crucial nourishment from external sources before being enclosed in a pupa, a second egg-like casing from which it later hatched again in its final exoskeletal form.  So metamorphosis turns out to be a case of biological common sense, providing the fetus-cum-larva with an opportunity to acquire the nourishment required for the energy-consuming leap into adulthood. 
Does this affect our answer to the chicken-or-egg question?  Not really.  For even where the life cycle of an organism includes distinct morphological stages, the DNA of egg-laying insects does not change after conception.  And since it is reproductive and not somatic DNA modification that distinguishes one species from the next in line, our answers stand firm.  (Of course, this says nothing of science fiction movies in which something bizarre and insidious causes runaway mutations in the somatic DNA of hapless humans, causing them to evolve into monsters before our very eyes!  Such humans have either undergone a random or radiation-induced “meta-mutation” whereby their genetic code suddenly rearranged itself to incorporate a self-modification routine that is executed somatically, within their own cells, or they are the victims of a space virus which inserted such a routine into their DNA for its own nefarious purposes.)
OK…perhaps there’s yet another loose end.  Asking which of two things came first implies that time flows in a straight line from past to future (those are the “loose ends”).  But what if time were to flow in either direction, or even to loop around, flowing in what amounts to a circle? No more loose ends.  In fact, loops have no ends at all!  But in this case, the answer depends on whether we’re on the forward or reverse side of the loop, heading towards the future or the past.  Another way to formulate this question: does the cause lead to the effect, or is there a sense in which the effect leads to the cause?  Suffice it to say that no matter which way we choose to go, the original answers to the four versions (1, 2a, 2b and 2c) of the chicken-or-egg question are all affected the same way.  They are either all unchanged or all reversed, with no additional ambiguity save that pertaining to the direction of time (not a problem for most non-physicists and non-cosmologists).
Now that we’ve tied up every last loose end, what about the most important question of all, namely what to tell a curious child?  The answer: take your pick of versions.  Some kids will prefer the dinosaur angle of version 1; some kids will prefer the “birds and bees” reproductive biology lesson of version 2b.  In my opinion, if we limit ourselves to one version only, the most valuable explanation is probably that of 2b; but due to its relative complexity, a younger child can probably derive greater benefit from a T. Rex-versus-Triceratops embellishment of version 1.  To exhaust the golden opportunities for logical and scientific instruction, one should of course answer all four versions.  But no matter which way you go, make sure the child knows exactly which version(s) of the question you’re answering.  If you leave out the one he or she had in mind, you’ll no doubt be egged on until it gets answered! - Cognitive Theoretic Model of the Universe.

Part 2: 
What are Weasel Words?
Weasel words is an informal term for words and phrases aimed at creating an impression that something specific and meaningful has been said, when in fact only a vague or ambiguous claim has been communicated.
The term comes from the weasel's reputed habit of poking a hole in an egg and sucking out the contents, leaving the shell intact. A weasel word works the same way: the claim looks whole from the outside, but its substance has been drained away, so the statement appears to say more than it really does. And crucially, it's not technically lying.

Science Myth of the Week:
So, did Gutenberg invent the printing press?
No. Printing was invented in China: woodblock printing dates back more than a thousand years, and Bi Sheng developed movable type around 1040 CE, four centuries before Gutenberg. Gutenberg's genuine contribution was the metal movable-type printing press in 15th-century Europe, which made mass printing practical in the West, but he did not invent printing itself.