Saturday, December 31, 2016

How Does Our Brain React to Sugary Tastes?

By Kate Dailey

"Sweetie," "Sugar," and "Honey." There's a reason we call our loved ones flavor-derived nicknames. "We're all born liking sweet tastes," says Dr. Alexei B. Kampov-Polevoi, a professor of psychiatry at the University of North Carolina Chapel Hill. "It's kind of the yardstick for all pleasures." But what does it mean for food to taste sweet? And how does that taste affect our brains and our bodies? 

The desire for sweetness is hardwired into humans--give babies a little sugar on their lips and they'll smile. That's because up until the advent of artificial additives, sweet flavors signified calorie-dense foods. "If you're sitting there on the savanna and trying not to be eaten by something else, you want to be able to make a quick decision about what's good to eat," says Steven Munger, an associate professor of Anatomy and Neurobiology at the University of Maryland. A sweet snack indicated that not only was it probably not poisonous, it also would provide ample energy. While we can't blame everything on our prehistoric ancestry, the desire for sweetness is well-rooted in primitive genetics.
"All mammals - mice, dogs, humans—with the exception of cats, use the same types of genes and genetic mechanisms to detect sweet flavor," says Munger. (Cats have since mutated so that they no longer have the gene for detecting sweet food).

Still, some people crave sweetness more than others: women are more likely than men to have a preference for sweet food, a fact that appears to be tied to hormones. "During the menstrual cycle, the mood as well as the desire to eat sweets can fluctuate," says Dr. Kampov-Polevoi. "That indicates that sex hormones are involved."
Children, too, are more drawn to sweet foods than adults are, which makes sense. "Things that I loved as a child just taste so obnoxiously sweet to me now," says Danielle Reed, a member of the Monell Chemical Senses Center in Philadelphia. Her research showed that as children grew, their preference and liking for sweet foods declined. The research indicates that the desire for sweetness is linked to stages of development. "When you're growing you need the [extra] calories, and when you stop growing you don't," says Reed.

Whether one likes the taste of sugar a little or a lot, sweet foods act on everyone's brain in the same way--by producing a rush of chemicals, including dopamine, which creates an opiate-like effect. "In Sweden, sweet-tasting foods like sugar solutions are used as an anesthetic for minor surgeries," says Dr. Kampov-Polevoi. Sugar water is also used in the US on babies for minor procedures like blood draws. It's also a go-to staple for recovering addicts, who find that bingeing on sugary snacks can sometimes help fight the urge to drink.

No matter how you get your sugar fix, the brain reacts the same: whether your source is artificial sweeteners, high fructose corn syrup, or fructose. "A sugar is a sugar is a sugar," says Barry Popkin, professor of global nutrition at the University of North Carolina. (At one point, researchers thought that artificial sweeteners might make the body even hungrier, since they offer a sweet taste with no calories, leaving the body wanting more. That has proven not to be true, says Popkin.)


How we consume that sugar, however, does make a difference. Humans process sweetness in beverages differently than sweetness in food - that is, by barely responding to it at all. Though our brains associate the taste of sugar with calorie-dense food, drinking highly sweetened beverages doesn't curb our overall caloric intake. When test subjects eat 500 calories of sugar-rich food, says Popkin, they're likely to eat 500 fewer calories sometime during the day. Not so with sweet beverages. Even when researchers stir spoonfuls of sugar into a regular glass of water, subjects still fail to compensate for those extra calories elsewhere.


That's trouble. An increase in sugary beverages has translated into a two-thirds to three-fourths increase in overall calorie consumption over the last 20 years, says Popkin. We can speculate about what this means for our waistline--some studies claim that the rise in American obesity can be directly linked to the rise in inexpensively produced soft drinks - but we are less sure about other long-term implications.   

What's also unknown is the long-term effect of eating sweet foods over a lifetime. Americans are currently living in what could be called the Sweetest Generation: our access to sweet and sugary foods, especially beverages like soda and juice, is at an all-time high. "For the past 1000 years, it's mostly been breast milk followed by water," says Popkin.  Now, we get about twenty percent of our calories from sugary drinks, a number that's skyrocketed in the past 20 years. We're eating more sugary foods than ever, and researchers are still unsure of the consequences. 

"If you consume caffeine over time, you habituate to it, and it has a different effect," says Popkin. "With drug use, you habituate to a certain amount of drugs and need more over time. When it comes to sweetness, we don't understand the long-term effect." We do know that we love sweets - and that the country is sweeter than ever. But in this case, being sweet may not be a good thing.  

One Church, Two Lanterns and the Start of the American Revolution

Nicole is on assignment in Boston to solve a history mystery. We all know the story of how, during the American Revolution, lanterns in a church steeple signaled that the British were on the move. But after more than 200 years, we still don’t know just who hung those lanterns. Plus, Nicole gets to go where few visitors are allowed and picks up more clues toward solving just who hung the signal lanterns for Paul Revere.

Sunday, December 25, 2016

Violence Among Teens Can Spread Like a Disease, Study Finds

If you’re a teenager, how do you know whether it’s cool to smoke cigarettes, curse or get a cartilage piercing? Look around: To find out what’s socially acceptable, impressionable adolescents generally turn to their peers. Now, new research finds that this social dynamic also plays out when it comes to more violent behaviors.

A new study, published yesterday in the American Journal of Public Health, draws on surveys of thousands of teens to reveal how the people around you influence your tendency to engage in violence. The authors report that adolescents are far more likely to commit a violent act if a friend has already done so—adding evidence to a mounting theory that violence in communities can spread like a disease.

The study was born of an unusual collaboration between Ohio State University social psychologist Brad Bushman and OSU political scientist Robert Bond. Bushman, who has written and lectured extensively on humans and violence, was interested in exploring the model of violence spreading like a contagious disease that had been popularized by University of Illinois at Chicago epidemiologist Gary Slutkin. Bond had expertise in analyzing social networks. "We just really hit it off and decided that we should try to find a way to merge our research interests," Bond says.

For the study, the two tracked the behavior of more than 90,000 American teenagers at 142 schools, who were surveyed in class starting in the mid-1990s as part of the National Longitudinal Study of Adolescent to Adult Health. By accessing follow-up interviews that were done with nearly 6,000 of the teenagers years later, the researchers were able to see whether they had practiced violent behavior in the past year—namely, getting into a serious fight, pulling a weapon on someone or hurting someone badly enough that they needed medical attention.

The teenagers were then asked to identify five male and five female friends, who were subsequently interviewed by the surveyors about their violent behavior. With this web of data, Bond and Bushman were able to piece together nodes of violence and their effect on the people connected to them.

What they found supported the contagion model. Teenagers were 48 percent more likely to have been in a serious fight, 140 percent more likely to have pulled a weapon and 183 percent more likely to have hurt someone badly enough to require medical attention if they knew someone who had done the same. Moreover, the influence of one violent person can spread through up to four degrees of separation. In other words, if your friend's friend's friend's friend practices violent behavior, it's more likely you will too.
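
That "four degrees of separation" finding is easy to picture as a graph traversal: start at one individual and walk outward along friendship links, stopping after four hops. Here is a minimal, hypothetical sketch in Python; the toy network and names are invented for illustration and are not the study's data.

```python
from collections import deque

def within_degrees(friends, start, max_hops=4):
    """Return everyone reachable from `start` within `max_hops` friendship links."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_hops:
            continue  # don't walk past the fourth hop
        for friend in friends.get(person, []):
            if friend not in seen:
                seen[friend] = seen[person] + 1
                queue.append(friend)
    del seen[start]
    return seen  # maps each person to their degrees of separation

# Toy friendship chain: A-B-C-D-E-F.
friends = {
    "A": ["B"], "B": ["A", "C"], "C": ["B", "D"],
    "D": ["C", "E"], "E": ["D", "F"], "F": ["E"],
}
print(within_degrees(friends, "A"))
# {'B': 1, 'C': 2, 'D': 3, 'E': 4} -- F sits five hops out, beyond the effect
```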

"People who exhibit these kinds of behaviors tend to be friends with one another," Bond says, adding: "They're teenagers. They're still sort of learning how to navigate their social environment."

For years, social scientists have theorized that violent behavior can spread from person to person like an illness, infecting whole neighborhoods and communities. This contagious theory was pioneered by Slutkin, who spent his early career working to prevent the spread of communicable diseases such as tuberculosis in San Francisco and Somalia, and AIDS in Uganda.

After returning to the U.S., Slutkin was troubled by the amount of violent crime he saw present in American culture. “I saw that these kids were killing each other,” he says. Soon, he started to see parallels between how violence was being viewed and treated by officials and how the AIDS epidemic was mismanaged and underfunded. “[Violence] is the only contagious epidemic that is not being managed by the health sector,” Slutkin says. “It's been fundamentally misdiagnosed.”

In 2000, Slutkin founded the movement Cure Violence to gain support for viewing violence as a contagious disease as opposed to solely a criminal justice issue. Cure Violence uses epidemiological techniques to target the people most at risk of spreading violence, working to stop its spread by “interrupting” violence before it starts. Slutkin has given a TED Talk on his approach, which was featured in the 2011 documentary The Interrupters. Cure Violence’s model, however, has faced resistance from law enforcement suspicious of treating violent criminals as victims, and Chicago dropped the program from its policing efforts in 2013.

Slutkin says that Bushman and Bond’s study adds to the now “thousands of studies that show the contagion of violence.” It also shows evidence that different forms of violence can be similarly contagious, from physical fights to violence using weapons, he says. This supports what he’s seen in his work. “We all unconsciously copy each other, especially with violence,” Slutkin says.

When it comes to other communicable diseases—say, a virus—the best way to avoid falling ill is to avoid the bug in the first place. Bushman thinks that avoiding exposure is also the best way to prevent violent behavior in teenagers. He also believes that the same contagious model could be used to spread non-violent behavior: By training teenagers to practice more empathy, schools and social workers could unleash positive behavior into social networks that would spread to people who don't receive treatment directly, he says.

Bond pointed to school-based violence prevention programs already in place across America to train students to practice peaceful conflict resolution, and said that their research could lead to better targeting of teenagers who would have the most social influence on their networks. "Those types of programs might be a lot more effective," Bond says, "because they're affecting not only who is directly affected by it, but the other people who see the changes in those people's behavior."

For future research, Bond is considering collecting his own data on how teenagers process and react to violence in some kind of a laboratory setting, while Bushman is interested in studying how violence could spread through other kinds of social networks, such as networks of terrorists on social media or in neighborhoods worldwide.

Slutkin, meanwhile, still hopes that people and governments will someday adopt his model of ending preventable violence. He draws parallels between his situation and that of astronomer Galileo Galilei, who faced opposition when his observations of the planets and moons didn’t fit with the prevailing theory of an Earth-centered solar system. “The theory was wrong,” Slutkin says. “It required a new theory.”

Enter the TeenDrive365 Video Challenge for your chance to win $15,000!

ENTER BY FEBRUARY 23 FOR YOUR CHANCE TO WIN $15,000!

The 2017 TeenDrive365 Video Challenge is accepting entries!

Learning to drive is one of the coolest times for a teen. But statistics show it can also be extremely dangerous. You have the power to inspire your fellow teens, maybe better than a teacher or a parent can, because you know what messages will be the most powerful. Create a 30-60 second video for your fellow teen drivers that highlights the importance of safe teen driving and you could win $15,000, or one of 14 other prizes!

Contest ends February 23, 2017

CHECK OUT THE PRIZES:

  • 1st Place: $15,000 and the chance to work with a Discovery film crew to reshoot your video into a TV-ready PSA!
  • 2nd Place: $10,000 and a behind-the-scenes trip to a Velocity show taping
  • 3rd Place: $7,500
  • People's Choice: $5,000 and a behind-the-scenes trip to a Velocity show taping
  • 4th Place - 10th Place: $2,500
  • Four Regional Winners: $1,000

Click here to Enter

Thank you to B-Forc for letting us know about this opportunity!

The Surprising Benefits of Entrepreneurship

By Marcelle Yeager

More and more people are considering starting their own gig. It offers flexibility and the opportunity to be your own boss. The possibility of making a lot of money may excite you, or your primary driver may be the chance to follow your passion. When you think about whether to go for it, you're probably thinking of questions like: Can I risk several years of financial and job insecurity? Am I OK with failing? Do I have the right expertise to do this? What do I do if it doesn't work?

These are all completely valid and necessary questions to ask yourself. You need to carefully weigh your answers and fully consider the pros and cons of starting your own business. While it's a lot of hours and hard work that could either never pay off or take many years to pay off, there are a lot of upsides, including some unexpected benefits you may not have considered before.

Development of new skills. When you start or join a company in its infancy, you usually take on a piece of everything. Though you might not have done sales before, if you're passionate enough about what you're doing and the whole idea of going it on your own, you will learn as you go.

Some processes will involve a steep learning curve, and others you just flat out should not or may not want to engage in. It's important to realize at some point that you cannot do it all if you want to grow fast. Figure out what you are most interested in trying and feel you may be good at (or already know you are!) and outsource whatever is not on that list.

The areas below are characteristic of almost all organizations and it's likely that as an entrepreneur, you'll be involved in at least some parts of each. If you're willing to invest money and time, you can choose in which ones you want direct involvement.

Sales – A lot of entrepreneurs have never done sales, but that doesn't mean you won't be able to sell the thing you've created. Whether it's a product or service, you will be engaging in sales the whole time you own your business. You may not think, "Hey! I'm doing sales!" but when you talk about your company with potential partners and clients, you're selling.

Relationship building – This is not much different from sales. You will be going to events, making phone calls and sending emails and materials out in order to meet prospective partners and clients.

Operations – If you have a product, you need to understand logistics and supply chain techniques. If you have a service, you will use operations principles in order to determine and manage your workflow.

Marketing – A lot of entrepreneurs are uncomfortable with marketing, but the good news is that there are many marketing and web design freelancers out there who can help. If you do outsource, you still should try to determine your brand and purpose on your own and how to best communicate it before hiring someone. After all, it will still be you selling and pitching to clients and partners.

Information technology – Even if you haven't built a website before, there are website design companies like Squarespace that offer user-friendly templates and instructions. You can easily update your site on your own when needed without having to rely on a web designer. If your company requires a complex interface, however, it is probably worth hiring a designer.

Accounting and finance – When you first start out, you may be able to use a program like QuickBooks to track your finances and handle invoices. You may need to look into payment applications to determine how you'll be paid. As you grow, you may consider outsourcing these functions to save you time.

Human resources – If you are managing a team of employees, contractors or vendors, you're performing HR functions. You're handling pay, benefits and performance.

Other – There are a variety of other skills you will likely use at one point or another as you build your business, such as strategic planning, product development, fundraising and customer service. You should consult with a lawyer regarding contracts and insurance needs.

Building your career network. As you attend events and have phone chats with partners and clients, you build your network. Often when you meet one person, they suggest someone else with whom you should connect. As a result, your web grows bigger and bigger. You may not see now how these people might fit into your future career, but you've effectively grown your network while growing your business.

Meeting inspiring entrepreneurs. Most entrepreneurs love talking to other entrepreneurs because they've had others serve as their own mentors and inspiration. Talking to other business owners may result in close friendships, partnerships or simply spark ideas that you hadn't considered before. These are long-lasting connections that can take you in a variety of different directions should you decide to change your business model or simply do something else.

As you build a business, you gain a large number of unanticipated skills and contacts. The more open you are to opportunity, the better you will do – not only in your present endeavor, but also down the road in unknown but exciting territory.

From A Land Where Music Was Banned — To Carnegie Hall

Our top story focuses on how music, once illegal in Afghanistan, is now once again becoming part of the lives of some Afghan teens. We take a look at the documentary “Dr. Sarmast’s Music School,” which tells the story of bringing music back to Afghanistan and the birth of the Afghan Youth Orchestra. Plus, Daniella reports on what may be called a “musical miracle”: visiting teens from Afghanistan getting the chance to play music side by side with American high school students. While these young musicians started off worlds apart, we learn how they quickly came together preparing for a very special musical performance.

Monday, December 19, 2016

Meet Alberto Garcia, who has created a Smart Anti-concussion Football Helmet

In recent years, a growing mountain of evidence has linked the brutal action of American football and frequent concussions suffered by players to long-term brain damage. When one Texas high school player suffered a concussion, he set out to develop a more protective helmet and shoulder pads, drawing inspiration from nature.

Alberto Garcia began working on a solution to the problem of brain trauma in America's favorite sport as a science fair project while a high school sophomore. He was inspired by the fact that animals like rams and woodpeckers, which are constantly impacting things with their heads, have natural stabilizers around their necks to prevent the whiplash motion after impact that contributes to brain damage in humans.

Garcia created a helmet and shoulder pads system with an integrated Arduino microcontroller connected to sensors in the helmet that stabilizes the head upon impact. When the sensors detect an impact above a certain threshold, the stabilizers lock the helmet in place to keep the athlete's head and brain from being jarred back and forth.
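
Garcia's actual firmware isn't public, but the control loop described above - poll the sensors, compare each reading against a threshold, fire the stabilizers - is simple to sketch. The following Python stand-in simulates that logic; the threshold value, hold time, and sensor readings are all hypothetical.

```python
import time

IMPACT_THRESHOLD_G = 40.0  # hypothetical trigger level, in units of g
LOCK_DURATION_S = 0.5      # hypothetical hold time after a hit

def control_loop(sensor_readings, set_stabilizers):
    """Poll the impact sensor; on a hard hit, briefly lock the helmet in place."""
    for g in sensor_readings:
        if g >= IMPACT_THRESHOLD_G:
            set_stabilizers(True)   # lock: disperse whiplash energy into the stabilizers
            time.sleep(LOCK_DURATION_S)
            set_stabilizers(False)  # release once the jolt has passed

# Simulated hardware so the sketch runs end to end.
readings = [2.1, 3.0, 55.7, 4.2]  # one big hit amid normal play

def fake_stabilizers(locked):
    print("stabilizers", "LOCKED" if locked else "released")

control_loop(readings, fake_stabilizers)
```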

"If you reduce the whiplash motion of the neck, then you can reduce the odds of receiving a spinal cord or neck injury because all that energy is dispersed into the stabilizers," Garcia said.

The system takes a much more proactive approach than other supplemental equipment, like the neck band we reported on that seeks to passively reduce the risk of concussion by increasing blood flow to the skull so the brain has less room to slosh around after impact.

New helmet designs are also more flexible to absorb more impact, but this is the first system we've seen that acts more like a vehicle seat belt for the entire skull.

The sensors in Garcia's system also transmit data about the force of impacts to the sidelines, providing data that could help in the diagnosis of concussions.

One key obstacle to the success of such a system is ensuring that it isn't so heavy and bulky that it could interfere with a player's mobility and gameplay. Garcia says his system only weighs five pounds (2.3 kg) and he's tested and modified it numerous times.

The project was a factor in Garcia's admission to Texas Tech, where he's still developing the system and researching the market for it. He says he's had interest from the Air Force and Navy, which thought it might have potential for use by fighter pilots.

Garcia has a provisional patent on his invention and continues to look into possible uses in contact sports, the automotive industry and the military.

How to Become a Great Babysitter

Alexa takes us to a Red Cross babysitting class. The class teaches teens how to properly care for a child, including how to change a diaper, put kids to bed, stay safe on the job, and what to do in an emergency. The course also gives advice on how to advertise your babysitting services.

How Much Sugar Is in Your Cereal?

Tyler reports on the issue of sugar in cereal. Sugar has been linked to obesity and diabetes. The Environmental Working Group tested 84 cereals and found that three out of four of them failed the nutrition guidelines set by the federal government. The guidelines are voluntary, but they ask that cereal be no more than 26 percent sugar by weight. However, some cereals were found to be more than 50 percent sugar. To figure out how much sugar is in a cereal, read the label. Ingredients are listed in order of the amount used in the food. When looking at the ingredients, keep in mind that sugar can go by many names, including honey, high fructose corn syrup and molasses. If some form of "sugar" heads the ingredients list, that cereal has more sugar than any other ingredient.
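
The 26 percent guideline is simple arithmetic on the nutrition label: divide the grams of sugar per serving by the serving size in grams. A quick sketch in Python, using made-up label numbers for illustration:

```python
def sugar_percent_by_weight(sugar_g, serving_g):
    """Percent of a serving's weight that is sugar."""
    return 100.0 * sugar_g / serving_g

# Hypothetical label: 12 g of sugar in a 30 g serving.
pct = sugar_percent_by_weight(12, 30)
print(f"{pct:.0f}% sugar by weight")       # 40% sugar
print("within 26% guideline:", pct <= 26)  # False -- this cereal fails
```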

Tuesday, December 13, 2016

SPIDER-MAN: HOMECOMING

A young Peter Parker/Spider-Man (Tom Holland), who made his sensational debut in Captain America: Civil War, begins to navigate his newfound identity as the web-slinging super hero in Spider-Man: Homecoming. Thrilled by his experience with the Avengers, Peter returns home, where he lives with his Aunt May (Marisa Tomei), under the watchful eye of his new mentor Tony Stark (Robert Downey, Jr.). Peter tries to fall back into his normal daily routine – distracted by thoughts of proving himself to be more than just your friendly neighborhood Spider-Man – but when the Vulture (Michael Keaton) emerges as a new villain, everything that Peter holds most important will be threatened.

Release date: July 7, 2017 (USA)
Director: Jon Watts
Film series: Spider-Man
Music composed by: Michael Giacchino

What Were The Japanese Internment Camps?

Eden tells us about a difficult time in US history. During World War Two, after Japan bombed the United States’ Pearl Harbor, terrible discrimination began against Japanese Americans. The US government rounded up people of Japanese ancestry and sent them to what were essentially prison camps. It was not until the end of the war that the Japanese Americans were finally released. Today, the “National Japanese American Memorial for Patriotism during World War II” stands in Washington DC. The memorial lists the names of the 10 relocation camps along with the numbers of those forced to live at each camp. It also pays tribute to the thousands of Japanese Americans who fought for the United States in World War Two.

Thursday, December 8, 2016

Why is Chicago called the “Windy City”?

By Evan Andrews

The origins of Chicago’s famous nickname are not entirely clear. The most obvious explanation is that it comes from the frigid breezes that blow off Lake Michigan and sweep through the city’s streets. However, another popular theory holds that it was coined in reference to Chicago’s bloviating residents and politicians, who were deemed to be “full of hot air.” Proponents of the “windbag” view usually cite an 1890 article by New York Sun newspaper editor Charles Dana. At the time, Chicago was competing with New York to host the 1893 World’s Fair (Chicago eventually won), and Dana is said to have cautioned his readers to ignore the “nonsensical claims of that windy city.” Dana is often credited with popularizing the “Windy City” moniker, yet according to David Wilton’s book “Word Myths: Debunking Linguistic Urban Legends,” researchers have never managed to find his original article. Many now dismiss it as a myth.

Even if Dana’s editorial does exist, it’s unlikely that either he or the World’s Fair debate were responsible for popularizing Chicago’s nickname. Etymologist Barry Popik, a longtime researcher of the Windy City question, has uncovered evidence that the name was already well established in print by the 1870s—several years before Dana. Popik also dug up references showing that it functioned as both a literal reference to Chicago’s windy weather and a metaphorical jab at its supposedly boastful citizenry. Many of the citations are found in newspapers from other Midwest cities, which were in a rivalry with Chicago over who was the region’s main metropolis. For example, an 1876 headline in the Cincinnati Enquirer used the phrase “That Windy City” in reference to a tornado that swept through Chicago. “The Cincinnati Enquirer’s use is clearly double-edged,” Popik told the Chicago Tribune in 2006. “They used the term for windy speakers who were full of wind, and there was a wind-storm in Chicago. It’s both at once.” Since Chicago had previously used its lake breezes to promote itself as a summertime vacation spot, Popik and others conclude that the “Windy City” name may have started as a reference to weather and then taken on a double meaning as the city’s profile rose in the late-19th century.

Interestingly, although Chicago may have gotten its nickname in part because of its fierce winds, it’s not the breeziest town in the United States. In fact, meteorological surveys have often rated the likes of Boston, New York and San Francisco as having higher average wind speeds.

Meet the 13-Year-Old Jazz Musician Who Was Nominated For a Grammy

Don’t call Joey Alexander a genius. Yes, the 13-year-old is a celebrated jazz artist and one of the youngest Grammy nominees in history, for his debut album My Favorite Things. But he doesn’t like the labels that come with being preternaturally talented.

“I really don’t think I’m a genius or a prodigy,” he says. “I want people to dig my music, and not care about who I am.”

Alexander was born in the summer of 2003 in Denpasar, the sweltering capital city of the Indonesian island province of Bali. His family is a musical one: his mother’s sister is the Indonesian pop singer Nafa Urbach; his father dabbles in piano and guitar.

“My parents told me that when I was in my mother’s womb, they’d play jazz greats for me,” he says. He remembers hearing jazz for the first time when he was three — Louis Armstrong and Thelonious Monk — and started playing himself three years later. He saw an electric keyboard and thought it was a toy. “And then I found the keys, and I just felt the sound,” he says.

His father taught him the basics of the piano, but Alexander largely taught himself how to play. He and his dad would attend jam sessions at local jazz clubs. One day Joey went up to play. “Afterwards, I was just, like, ‘wow,’” he says.

His potential was obvious to all who cared to listen, and plenty did. In Jakarta, Indonesia’s capital and largest city, he performed for the storied jazz pianist Herbie Hancock. Then in 2014, the jazz trumpeter Wynton Marsalis, artistic director for Jazz at Lincoln Center, spotted him playing on YouTube. Marsalis invited him to play at the Center’s gala that year, and has since become a mentor.

Alexander says his discovery was “God’s plan.” A devout Christian, the teenager alternates hours of piano practice with bible study. His faith has helped keep him grounded, he says. “My music, it’s a gift from God, and it’s a gift I’ve had to learn. It takes hard work and focus.”

What’s next? The future looks bright; he just released his second album, called Countdown, after a John Coltrane track. But adulthood is also looming, and all that it brings. “When he really experiences life — when he has his first heartbreak, say — we’re going to see his music evolve,” his drummer, Ulysses Owens Jr., says. “As he gets older, he’ll have more to say.”

How long was the Hundred Years’ War?

The series of intermittent conflicts between France and England that took place during the 14th and 15th centuries wasn’t classified as the “Hundred Years’ War” until 1823. Traditionally, the war is said to have begun in 1337 when Philip VI attempted to reclaim Guyenne (part of the region of Aquitaine in southwestern France) from King Edward III—who responded by laying claim to the French throne—and to have lasted until 1453 when the French claimed victory over the disputed territory at the Battle of Castillon. By this calculation, the Hundred Years’ War actually lasted 116 years.

However, the origin of the periodic fighting could conceivably be traced nearly 300 years earlier to 1066, when William the Conqueror, the duke of Normandy, subjugated England and was crowned king. Because William remained technically a vassal of the king of France (as duke of Normandy), his simultaneous new role as king of England ushered in a complex web of dynastic marriages in which descendants of both the French and English kingdoms could arguably lay claim to the same territories. Over time, these overseas possessions resulted in inevitable clashes, and by 1337, Philip VI’s declaration that Edward III had forfeited his right to Guyenne was just the push Edward needed to renew his claim to the French throne as the nephew and closest male relative of King Charles IV, who had died in 1328.

From the French perspective, the conventional dates attributed to the Hundred Years’ War (1337-1453) marked the beginning and end of English hostilities on French soil. However, the English retained possession of the port city of Calais until 1558 and continued to assert a claim to the French throne until King George III finally relinquished the title in 1800.

What’s the World’s Smallest Flowering Plant?

The world’s smallest flowering plant is the watermeal, or Wolffia globosa. Found all over the planet, this bright green oval plant is smaller than a grain of rice!

Wolffia is the smallest genus of the aquatic plants known as duckweeds, which are part of the family Lemnaceae. Usually found floating in masses in freshwater lakes, streams, and marshes, they are rootless and rarely flower, mostly reproducing when the main plant, or “mother frond,” grows a new segment, or “daughter frond,” from one end. Wolffia also produces the world’s smallest fruit.


According to the International Lemna Association (ILA), this tiny plant and its relatives could help our planet in a big way. The nonprofit organization is dedicated to promoting duckweed as a fast-growing, sustainable crop with a wide variety of uses. Duckweed is eaten by ducks and other aquatic birds, along with certain fish such as tilapia, but it can also be used in the diets of chickens, pigs, and cattle.

According to John W. Cross, author of the website The Charms of Duckweed, these plants quickly absorb the minerals they need for growth, as well as other organic nutrients, from the water they’re floating in. Duckweeds are especially good at taking phosphates and nitrogen out of water — two substances that need to be removed during sewage treatment and from farming runoff. Yet when grown on sewage or animal waste, duckweeds normally don’t retain toxins, so they can be used as feed or to fertilize crops.

The world’s smallest flowering plant, Wolffia globosa, has come into bloom at the Tsukuba Botanical Garden in Japan.

Wolffia globosa with the larger-leaved Spirodela polyrhiza. Photo: Eric Guinther

The ILA website says that duckweed has other potential commercial applications: it could be a source of renewable and sustainable fuel to replace fossil fuels. Also, because it contains around 44 percent protein, it can be used to make bioplastics. Cross notes that genetic engineers are modifying duckweeds to produce low-cost pharmaceuticals such as vaccines.

The Mummy

Believed to be safely entombed in a crypt deep beneath the unforgiving desert, an ancient queen (Sofia Boutella), whose destiny was unjustly taken from her, is awakened in our current day, bringing malevolence with her that has grown over millennia.

Release date: June 9, 2017 (USA)
Director: Alex Kurtzman
Music composed by: Brian Tyler
Editor: Paul Hirsch
Production company: K/O Paper Products

Saturday, December 3, 2016

Meet the Woman Who Invented the Automatic Dishwasher

Josephine Garis Cochrane was an independent woman of the mid-1800s. When she married husband William Cochran, she took his last name but added an “e” to the end. And when she realized no one had yet created a proper automatic dishwasher, she invented one herself!

Josephine led a comfortable life in Shelbyville, Illinois. William was a successful businessman, and the couple often held dinner parties in their large home. She even had servants to clean up afterward. But one morning after a party, she found some of her good china had gotten chipped. She was so upset, she decided to wash the dishes herself from then on. It wasn’t long before Josephine wondered why no one had invented a machine to do the job … and soon she had sketched out the idea that would become the first commercially successful automatic dishwasher.

Her design used water pressure to clean, just as today’s dishwashers do. It had wire compartments for the dishes, which fit into a wheel inside a copper boiler. A motor turned the wheel while soapy water sprayed onto the dishes. It was practical, but Josephine had a hard time finding a mechanic who would build her machine the way she wanted instead of insisting on building it HIS way. She finally found a man named George Butters to work with, and the Garis-Cochran Dish-Washing Machine was patented in 1886, three years after her husband died.

Josephine thought her invention would appeal to other housewives, but it was more of a hit with hotels and restaurants, maybe because it was an expensive appliance for a regular family to buy. She opened her own factory in 1897, and personally sold her machines nearly up until her death in 1913. In 1926, her company was bought by Hobart, which eventually became the modern appliance giant KitchenAid.

What is a Duke of Edinburgh’s Award?

Kristina does a special report on the royal achievements of some teens! The Duke of Edinburgh’s Awards were started by Prince Philip in the 1950s. The goal is to encourage young adults around the world to embrace self-improvement. In part one, Kristina takes us to the awards ceremony in Nashville, Tennessee, where she was not only reporting but also an award recipient. On hand at the ceremony was Prince Philip’s youngest son, His Royal Highness, the Prince Edward, who travels the world attending these ceremonies. Plus, Kristina gets a chance to sit down with royalty: she interviews His Royal Highness, the Prince Edward.

Roman Aqueducts: The Dawn of Plumbing

How did the ancient Romans deal with plumbing? They built huge and extensive aqueducts - a word that comes from the Latin for "waterway." These underground and aboveground channels, typically made of stone, brick, and volcanic cement, brought fresh water for drinking and bathing as much as 50 to 60 miles from springs or rivers. Aqueducts helped keep Romans healthy by carrying away used water and waste, and they also took water to farms for irrigation.

So how did aqueducts work? The engineers who designed them used gravity to keep the water moving. If the channel was too steep, water would run too quickly and wear out the surface. Too shallow, and water would stagnate and become undrinkable. The Romans built tunnels to get water through ridges, and bridges to cross valleys.
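
That balancing act comes down to simple arithmetic: a channel's gradient is its total fall divided by the distance it covers. A small illustrative sketch in Python (the numbers and the "acceptable" band below are made up for illustration, not historical specifications):

```python
def gradient_m_per_km(drop_m, length_km):
    """Average slope of a channel: meters of fall per kilometer of run."""
    return drop_m / length_km

# Illustrative channel: 15 m of total fall spread over a 50 km run.
slope = gradient_m_per_km(15, 50)
print(f"{slope:.2f} m of fall per km")  # 0.30 m/km: a gentle, steady grade

# Hypothetical design band: too steep scours the channel, too flat stagnates.
TOO_STEEP, TOO_FLAT = 3.0, 0.1
print("acceptable:", TOO_FLAT <= slope <= TOO_STEEP)
```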

Once it reached a city, the water flowed into a main tank called a castellum. Smaller pipes took the water to the secondary castella, and from those the water flowed through lead pipes to public fountains and baths, and even to some private homes. It took 500 years to build Rome’s massive system, which was fed by 11 separate aqueducts. To this day, Rome’s public fountains run constantly, as do smaller faucets that provide fresh water to anyone who stops for a drink.

The empire stretched across an immense part of the world, and wherever the Romans went they built aqueducts — in as many as 200 cities around the empire.  Their arched bridges are among the best preserved relics of that empire, in part because many aqueducts kept working for centuries, long after the Romans had retreated. You can still see their arches in Bulgaria, Croatia, France, Germany, Greece, Israel, Lebanon, Spain, Tunisia, and other former Roman colonies.

Aqueduct of Segovia

Monday, November 28, 2016

Smurfs: The Lost Village

With the evil wizard Gargamel (Rainn Wilson) hot on their trail, Smurfette (Demi Lovato), Brainy (Danny Pudi), Clumsy (Jack McBrayer) and Hefty (Joe Manganiello) embark on a journey through the Forbidden Forest to find a mysterious village.

Release date: April 7, 2017 (USA)
Director: Kelly Asbury
Screenplay: Pamela Ribon
Music composed by: Christopher Lennertz
Art director: Marcelo Vignali

What Foods Were Served At The First Thanksgiving?

For many Americans, the Thanksgiving meal includes seasonal dishes such as roast turkey with stuffing, cranberry sauce, mashed potatoes and pumpkin pie. The holiday feast dates back to November 1621, when the newly arrived Pilgrims and the Wampanoag Indians gathered at Plymouth for an autumn harvest celebration, an event regarded as America’s “first Thanksgiving.” But what was really on the menu at the famous banquet, and which of today’s time-honored favorites didn’t earn a place at the table until later in the holiday’s 400-year history?

TURKEY

While no records exist of the exact bill of fare, the Pilgrim chronicler Edward Winslow noted in his journal that the colony’s governor, William Bradford, sent four men on a “fowling” mission in preparation for the three-day event. Wild—but not domestic—turkey was indeed plentiful in the region and a common food source for both English settlers and Native Americans. But it is just as likely that the fowling party returned with other birds we know the colonists regularly consumed, such as ducks, geese and swans. Instead of bread-based stuffing, herbs, onions or nuts might have been added to the birds for extra flavor.

Turkey or no turkey, the first Thanksgiving’s attendees almost certainly got their fill of meat. Winslow wrote that the Wampanoag guests arrived with an offering of five deer. Culinary historians speculate that the deer was roasted on a spit over a smoldering fire and that the colonists might have used some of the venison to whip up a hearty stew.

FRUITS AND VEGETABLES

The 1621 Thanksgiving celebration marked the Pilgrims’ first autumn harvest, so it is likely that the colonists feasted on the bounty they had reaped with the help of their Native American neighbors. Local vegetables that likely appeared on the table include onions, beans, lettuce, spinach, cabbage, carrots and perhaps peas. Corn, which records show was plentiful at the first harvest, might also have been served, but not in the way most people enjoy it now. In those days, the corn would have been removed from the cob and turned into cornmeal, which was then boiled and pounded into a thick corn mush or porridge that was occasionally sweetened with molasses.

Fruits indigenous to the region included blueberries, plums, grapes, gooseberries, raspberries and, of course, cranberries, which Native Americans ate and used as a natural dye. The Pilgrims might have been familiar with cranberries by the first Thanksgiving, but they wouldn’t have made sauces and relishes with the tart orbs. That’s because the sacks of sugar that traveled across the Atlantic on the Mayflower were nearly or fully depleted by November 1621. Cooks didn’t begin boiling cranberries with sugar and using the mixture as an accompaniment for meats until about 50 years later.

FISH AND SHELLFISH

Culinary historians believe that much of the Thanksgiving meal consisted of seafood, which is often absent from today’s menus. Mussels in particular were abundant in New England and could be easily harvested because they clung to rocks along the shoreline. The colonists occasionally served mussels with curds, a dairy product with a similar consistency to cottage cheese. Lobster, bass, clams and oysters might also have been part of the feast.

POTATOES

Whether mashed or roasted, white or sweet, potatoes had no place at the first Thanksgiving. After encountering it in its native South America, the Spanish began introducing the potato to Europeans around 1570. But by the time the Pilgrims boarded the Mayflower, the tuber had neither doubled back to North America nor become popular enough with the English to hitch a ride. New England’s native inhabitants are known to have eaten other plant roots such as Indian turnips and groundnuts, which they may or may not have brought to the party.

PUMPKIN PIE

Both the Pilgrims and members of the Wampanoag tribe ate pumpkins and other squashes indigenous to New England—possibly even during the harvest festival—but the fledgling colony lacked the butter and wheat flour necessary for making pie crust. Moreover, settlers hadn’t yet constructed an oven for baking. According to some accounts, early English settlers in North America improvised by hollowing out pumpkins, filling the shells with milk, honey and spices to make a custard, then roasting the gourds whole in hot ashes.

Wednesday, November 23, 2016

That Time America Outlawed Pinball

By James McClure

When you hear the word "prohibition," alcohol and marijuana likely come to mind. But America has banned a number of other vices and recreations over the years - including, of all things, pinball.

The modern, coin-operated version of pinball was invented in Chicago in the 1920s, where it was seen as another game of chance that people could bet on at speakeasies and other nefarious joints during alcohol prohibition. As a result, it quickly became associated with gangsters and the rest of the city's criminal underworld.

The criminal association quickly led to pinball being seen as a gateway for harder vices (sound familiar?).

"Pinball machines are a harmful influence because of their strong tendency to instil desire for gambling in immature young people," said Lewis Valentine, who was New York City's police commissioner from 1934-1945.

New York Mayor Fiorello LaGuardia agreed. In the early 1940s, he launched a moral crusade against pinball, which he called an "evil" that robbed the public through the "pockets of schoolchildren in the form of nickels and dimes given them as lunch money." The campaign resulted in New York becoming the first major American city to ban the game in 1942.

Other cities, including Los Angeles, Philadelphia and even Chicago, followed suit. But few enforced the ban with as much fervor as New York under Mayor LaGuardia, who orchestrated prohibition-style raids involving police smashing thousands of pinball machines with sledgehammers and axes before dumping them in the city's rivers.

Saved by the flipper

Like so many rounds of pinball, the machine itself was saved by the flipper. The iconic piece of today's game wasn't invented until 1947. Before then, players had to shake and tilt the table to maneuver the ball, making pinball a game of chance like gambling on slot machines. But the flipper allowed the game's backers to argue that pinball was actually based on skill.

That's the case they made in 1976, when pinball had its day in court. In April of that year, Roger Sharpe - a writer for the New York Times and GQ, who also happened to be a savvy pinballer - was called as a star witness by New York's Music and Amusement Association (MAA). Sharpe was asked to play rounds of pinball in a Manhattan courtroom to demonstrate that pinball was a game of skill, not chance. Sharpe did just that when he amazed legislators by calling his shot like a billiard player.

Thus New York overturned pinball prohibition; other cities soon followed, and Sharpe became known as the Babe Ruth of pinball.

Meet the First Native American Woman Doctor

By Christopher Klein

In an era when women couldn’t vote and Native Americans were denied citizenship, Susan La Flesche shattered not just one barrier, but two, to become the first Native American woman doctor in the United States. A new book details how the 19th-century trailblazer overcame racial and gender biases in a white patriarchal world to graduate at the top of her medical school class, care for an entire reservation and raise children while pursuing a full-time career.

Eight-year-old Susan La Flesche sat at the bedside of an elderly woman, puzzled as to why the doctor had yet to arrive. After all, he had been summoned four times, and four times he had promised to come straight away. As the night grew longer, the sick woman’s breathing grew fainter until she died in agony before the break of dawn. Even to a young girl, the message delivered by the doctor’s absence was painfully clear: “It was only an Indian.”

That searing moment stoked the fire inside Susan to one day heal the fellow members of her Omaha tribe. “It has always been a desire of mine to study medicine ever since I was a small girl,” she wrote years later, “for even then I saw the need of my people for a good physician.”

Born in a buckskin teepee on the Omaha Indian Reservation in northeast Nebraska on June 17, 1865, Susan was never given a traditional Omaha name by her mixed-race parents. Her father, Chief Joseph La Flesche (also known as “Iron Eye”), believed his children as well as his tribe were now living in a white man’s world in which change would be the only constant. “As the chief guardian of welfare, he realized they would have to adapt to white ways or simply cease to survive,” says Joe Starita, author of “A Warrior of the People: How Susan La Flesche Overcame Racial and Gender Inequality to Become America’s First Indian Doctor.” “He began an almost intense indoctrination of his four daughters. They would have to speak English and go to white schools.”

While Iron Eye insisted that Susan learn the tribe’s traditional songs, beliefs, customs and language in order to retain her Omaha identity, he also sent her to a Presbyterian mission school on the reservation where she learned English and became a devout Christian. At the age of 14, she was sent east to attend a girls’ school in Elizabeth, New Jersey, followed by time at Virginia’s Hampton Institute, where she took classes with the children of former slaves and other Native Americans.

Female physicians, late 19th century. Susan LaFlesche is in the second row from the back, fourth woman from the right. (Credit: Legacy Center, Drexel University College of Medicine)

Omaha means “against the current,” and few members of the tribe embodied the name better than La Flesche, as she proved by enrolling in the Woman’s Medical College of Pennsylvania at a time when even the most privileged of white women faced severe discrimination. Starita points to articles published in journals such as Popular Science Monthly that argued that women faced an intellectual disadvantage because their brains were smaller than those of men or that their menstrual cycles made them unfit for scientific pursuits. A Harvard doctor even wrote a 300-page thesis asserting that women should be barred from attending college because the stress would harm their reproductive organs. “When you read these theories in scientific journals, you realize what all women were facing,” Starita says.

Still, La Flesche persevered and graduated in 1889 at the top of her 36-woman class to make history by becoming the first Native American woman doctor. Although prodded to remain on the East Coast where she could have lived a very comfortable existence, the 24-year-old La Flesche returned to the reservation to fulfill her destiny.

She became the sole doctor for 1,244 patients spread over a massive territory of 1,350 square miles. House calls were arduous. Long portions of her 20-hour workdays were spent wrapped in a buffalo robe driving her buggy through blankets of snow and biting subzero winds with her mares, Pat and Pudge, her only companions. When she returned home, the woman known as “Dr. Sue” often found a line of wheezing and coughing patients awaiting her. La Flesche’s office hours never ended. While she slept, the lantern lit in her window remained a beacon for anyone in need of help.

La Flesche preached hygiene and prevention along with the healing power of fresh air and sunshine. She also spoke out against the white whiskey peddlers who preyed on the tribe members, continuing her father’s work as a passionate prohibitionist.

As difficult as it may have been to straddle two civilizations, La Flesche “managed to thread the delicate bicultural needle,” according to Starita. “Those with no trust of white doctors flocked to Susan,” he says. “The people trusted her because she spoke their language and knew their customs.”

La Flesche again shattered stereotypes by continuing to work after her 1894 marriage to Henry Picotte, a Sioux from South Dakota, and the birth of their two boys at a time when women were expected to be full-time mothers and homemakers. “If you are looking for someone who was ‘leaning in’ a century before that term was coined, you need look no further than Susan La Flesche,” Starita says. “She faced a constant struggle to serve her people and serve her husband and children. She was haunted that she was spreading herself so thin that she wasn’t the doctor, mother and wife she should be. The very fears haunting her as a woman in the closing years of the 19th century are those still haunting women in the opening years of the 21st century.”

The evils of alcohol that La Flesche railed against came into her home as her husband struggled with the bottle. He contracted tuberculosis, exacerbated by his alcoholism, and died in 1905, leaving La Flesche a widow with two small boys. By this point, the physician needed some healing herself, as her long hours led to chronic pain and respiratory issues. She pressed on, however, and in 1913 opened a hospital near Walthill, Nebraska, the first such facility to be built on reservation land without any support from the federal government. Her hospital was open to anyone who was ill—no matter their age, gender or skin color.

Starita believes that La Flesche, who passed away at the age of 50 on September 18, 1915, faced greater discrimination as a woman than as a Native American. “When I got into the research, I was stunned by how deeply entrenched gender bias was in the Victorian era. White women were largely expected to just raise children and maintain a safe Christian home. One can only imagine where that bar was set for a Native American woman.”

Saturday, November 19, 2016

Who Were the African-American Heroines of NASA?

As America stood on the brink of a Second World War, the push for aeronautical advancement grew ever greater, spurring an insatiable demand for mathematicians. Women were the solution. Ushered into the Langley Memorial Aeronautical Laboratory in 1935 to shoulder the burden of number crunching, they acted as human computers, freeing the engineers from hand calculations in the decades before the digital age. These women proved sharp and successful, and the female population at Langley skyrocketed.

Many of these “computers” are finally getting their due, but conspicuously missing from this story of female achievement are the efforts contributed by courageous, African-American women. Called the West Computers, after the area to which they were relegated, they helped blaze a trail for mathematicians and engineers of all races and genders to follow.

“These women were both ordinary and they were extraordinary,” says Margot Lee Shetterly. Her new book Hidden Figures shines a light on the inner details of these women’s lives and accomplishments. The book is being adapted into a movie that will receive a wide release in January.

“We've had astronauts, we’ve had engineers—John Glenn, Gene Kranz, Chris Kraft,” she says. “Those guys have all told their stories.” Now it’s the women’s turn.

Growing up in Hampton, Virginia, in the 1970s, Shetterly lived just miles away from Langley. Built in 1917, this research complex was the headquarters for the National Advisory Committee for Aeronautics (NACA), which was intended to turn the floundering flying gadgets of the day into war machines. The agency was dissolved in 1958, to be replaced by the National Aeronautics and Space Administration (NASA) as the space race gained speed.

The West Computers were at the heart of the center’s advancements. They worked through equations that described every function of the plane, running the numbers often with no sense of the greater mission of the project. They contributed to the ever-changing design of a menagerie of wartime flying machines, making them faster, safer, more aerodynamic. Eventually their stellar work allowed some to leave the computing pool for specific projects—Christine Darden worked to advance supersonic flight, Katherine Johnson calculated the trajectories for the Mercury and Apollo missions. NASA disbanded the last few human computers in the 1970s as technological advances made their roles obsolete.

The first black computers didn’t set foot at Langley until the 1940s. Though the pressing needs of war were great, racial discrimination remained strong and few jobs existed for African-Americans, regardless of gender. That was until 1941 when A. Philip Randolph, pioneering civil rights activist, proposed a march on Washington, D.C., to draw attention to the continued injustices of racial discrimination. With the threat of 100,000 people swarming to the Capitol, President Franklin D. Roosevelt issued Executive Order 8802, preventing racial discrimination in hiring for federal and war-related work. This order also cleared the way for the black computers, slide rule in hand, to make their way into NACA history.

Exactly how many women computers worked at NACA (and later NASA) over the years is still unknown. One 1992 study estimated the total topped several hundred, but other estimates, including Shetterly’s own, put that number in the thousands.

Johnson at NASA in 1966

As a child, Shetterly knew these brilliant mathematicians as her girl scout troop leaders, Sunday school teachers, next-door neighbors and as parents of schoolmates. Her father worked at Langley as well, starting in 1964 as an engineering intern and becoming a well-respected climate scientist. “They were just part of a vibrant community of people, and everybody had their jobs,” she says. “And those were their jobs. Working at NASA Langley.”

Though she was surrounded by the West Computers and other academics, it took decades for Shetterly to realize the magnitude of the women’s work. “It wasn't until my husband, who was not from Hampton, was listening to my dad talk about some of these women and the things that they have done that I realized,” she says. “That way is not necessarily the norm.”

The spark of curiosity ignited, Shetterly began researching these women. Unlike the male engineers, few of these women were acknowledged in academic publications or for their work on various projects. Even more problematic was that the careers of the West Computers were often more fleeting than those of the white men. Social customs of the era dictated that as soon as marriage or children arrived, these women would retire to become full-time homemakers, Shetterly explains. Many only remained at Langley for a few years.

But the more Shetterly dug, the more computers she discovered. “My investigation became more like an obsession,” she writes in the book. “I would walk any trail if it meant finding a trace of one of the computers at its end.”

She scoured telephone directories, local newspapers, employee newsletters and the NASA archives to add to her growing list of names. She also chased down stray memos, obituaries, wedding announcements and more for any hint at the richness of these women’s lives. “It was a lot of connecting the dots,” she says.

“I get emails all the time from people whose grandmothers or mothers worked there,” she says. “Just today I got an email from a woman asking if I was still searching for computers. [She] had worked at Langley from July 1951 through August 1957.”

Dr. Christine Darden. Courtesy NASA

Langley was not just a laboratory of science and engineering; “in many ways, it was a racial relations laboratory, a gender relations laboratory,” Shetterly says. The researchers came from across America. Many came from parts of the country sympathetic to the nascent Civil Rights Movement, says Shetterly, and backed the progressive ideals of expanded freedoms for black citizens and women.

But life at Langley wasn’t just the churn of greased gears. Not only were the women rarely provided the same opportunities and titles as their male counterparts, but the West Computers lived with constant reminders that they were second-class citizens. In the book, Shetterly highlights one particular incident involving an offensive sign in the dining room bearing the designation: Colored Computers.

One particularly brazen computer, Miriam Mann, took on responding to the affront as her own personal vendetta. She plucked the sign from the table, tucking it away in her purse. When the sign returned, she removed it again. “That was incredible courage,” says Shetterly. “This was still a time when people are lynched, when you could be pulled off the bus for sitting in the wrong seat. [There were] very, very high stakes.”

But eventually Mann won. The sign disappeared.

The women fought many more of these seemingly small battles, against separate bathrooms and restricted access to meetings. It was these small battles and daily minutiae that Shetterly strove to capture in her book. And outside of the workplace, they faced many more problems, including segregated buses and dilapidated schools. Many struggled to find housing in Hampton. The white computers could live in Anne Wythe Hall, a dormitory that helped alleviate the shortage of housing, but the black computers were left to their own devices.

“History is the sum total of what all of us do on a daily basis,” says Shetterly. “We think of capital “H” history as being these huge figures—George Washington, Alexander Hamilton and Martin Luther King.” Even so, she explains, “you go to bed at night, you wake up the next morning, and then yesterday is history. These small actions in some ways are more important or certainly as important as the individual actions by these towering figures.”

The book and movie don’t mark the end of Shetterly’s work. She continues to collect these names, hoping to eventually make the list available online. She hopes to find the many names that have been sifted out over the years and to document their respective life’s work.

The few West Computers whose names have been remembered have become nearly mythical figures, a side effect of how few African-American names are celebrated in mainstream history, Shetterly argues. She hopes her work pays tribute to these women by bringing details of their life’s work to light. “Not just mythology but the actual facts,” she says. “Because the facts are truly spectacular.”

Was Scotland the Birthplace of Golf?

Nicole continues her series, UK OK!, by visiting the historic town of St. Andrews. It’s home to one of the oldest English-speaking universities in the world… and it’s considered the “home of modern golf.”

Which Athletes Get the Biggest Scholarships?

By Andrea Aronson

In the sports-obsessed United States, many students assume that the ticket to a hefty college scholarship is athletic prowess on the field, on the court, or in the pool.

Not so.  Just look at the total numbers.

Between the National Collegiate Athletic Association (NCAA) and the National Association of Intercollegiate Athletics (NAIA), about $3.2 billion in athletic scholarships is disbursed every year.  While that might sound like a hefty chunk of change, the reality is that when you do the math and look at the overall dynamics of athletic scholarships, that number can be misleading.

Remember that of that $3.2 billion, only approximately one-quarter is available to graduating high school seniors.  With an estimated 54,000 incoming first-year athletes potentially receiving scholarships each year, that means that, on average, an athlete might expect to receive around $15,000 in scholarship dollars. Not bad, you may be thinking, but not the mother lode either when you consider the average cost of a college education.
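If you want to check that figure, the arithmetic is a one-liner. Here's a quick back-of-the-envelope sketch in Python using the numbers above (the figures come from this article; the variable names are just for illustration):

    total_athletic = 3.2e9        # NCAA + NAIA athletic scholarship dollars per year
    freshman_share = 0.25         # roughly one-quarter is available to incoming freshmen
    incoming_athletes = 54_000    # estimated first-year athletes receiving awards

    print(round(total_athletic * freshman_share / incoming_athletes))  # 14815, i.e. about $15,000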

Further, keep in mind that “full-ride tuition scholarships” really exist only for a few sports (men’s football and basketball; women’s basketball, volleyball and gymnastics) and for a few players in those sports, and that most athletic scholarships are only a fraction of those averages.  Note, too, that many players at all levels of varsity play are on their teams with no scholarship money at all.  While Division III players never receive athletic scholarships, even Division I and II teams carry several players who receive zero award dollars.

Contrast all this with the facts and figures of academic scholarships.  Individual colleges and universities give away approximately $24 billion in scholarship awards, and the federal government gives away another $22 billion in need-based aid.  About 13.2 million students attend four-year colleges and universities.  Obviously, if we did some math, the average amount going to any one student at a four-year college or university would be pretty tiny.
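The same back-of-the-envelope sketch, applied to the academic side with the figures above, makes “pretty tiny” concrete:

    total_academic = 24e9 + 22e9    # institutional awards plus federal need-based aid
    students = 13.2e6               # students at four-year colleges and universities

    print(round(total_academic / students))  # 3485, roughly $3,500 per student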

But it’s not distributed evenly—just as the athletic scholarship money is distributed unevenly, so too are academic scholarship dollars. No surprise: the best athletes (in certain sports) get more money than other athletes (in other sports).  The best students with the best grades and test scores get more money than other students.

But which is the better bet? Where should you spend the most time and energy in order to get a better scholarship and reduce the cost of college?

Well, we’re betting on academics.

Here’s why: no matter what the scenario, strong academic credentials are appealing to both colleges and college coaches, and yes, they can even help you get recruited for that varsity spot on a collegiate team.

At best, college athletic recruiting is a crapshoot.  Even the most seemingly talented players may not get the kind of coach interest they believe they deserve.  Every year, every coach seeks something different for their team and needs that something to varying degrees.  Depending upon how much they need it, and whether you offer it, the calculus of whether you’ll get recruited, and how much money you might be offered, can change.  Add to this that many sports are not well supported financially at many colleges, and that the large majority of sports are “equivalency sports” with a single bucket of money that has to be divided up across all players, and suddenly you have a recipe for total scholarship unpredictability.  Will you get recruited?  Maybe.  Maybe not.  Will you get scads of scholarship money?  Highly unlikely.

On the other hand, everybody wants a good student, and many institutions are more than willing to provide significant scholarship dollars to get that high-flyer.  There is no gray here. No unpredictability.  No complicated calculation.  And what constitutes a strong student is generally agreed upon across all colleges: a solid performance in classwork as reflected by the high school transcript.  Contrast this with the subjectivity of athletic recruiting, and you’ll see why spending time studying may be a better bet than spending money on that extra session of private coaching.

Plus, coaches are desperate for good students they can recruit, since strong students help a coach on many levels.  Coaches need to meet certain academic standards, both in their recruiting class’s high school performance and in their varsity team’s ongoing collegiate academic achievement.  Often, coaches will have what they consider to be “academic recruits.”  These are players who may not be considered superstars in their sport but who can help buoy the team in the classroom, and, yes, they get actively recruited to be on the team. (Admittedly, these players don’t usually get much in the way of large athletic scholarships, but they often do get sizable merit scholarships because of the strength of their academics.)

So, which athletes get the biggest scholarships?  The ones who don’t rely on their athletic prowess to be the main driver of their potential scholarship dollars and who study, study, study!

Tuesday, November 15, 2016

How Dangerous Is Smoking With Asthma?

Smoking is bad news for everyone, but especially for kids who have asthma. In our continuing series brought to you by the Connecticut Tobacco & Health Trust Fund, Scott takes a look at the effects of smoking if you have asthma.

Friday, November 11, 2016

Why Is Election Day on a Tuesday in November?

Federal law in the United States requires that the presidential election be held every four years on the first Tuesday after the first Monday in November. In modern society that seems like an arbitrary time to hold an election, but it made a lot of sense in the 1800s.

In the early decades of the United States, the date for the election of the president would be set by the individual states. Those various election days, however, almost always fell in November.

The reason was simple: under an early federal law, the electors for the electoral college were to meet in the individual states on the first Wednesday of December. And according to a 1792 federal law, the elections in the states (which would choose the electors) had to be held within a 34-day period before that day.

Beyond meeting legal requirements, holding elections in November made good sense in an agrarian society, as the harvest would have been concluded.

And the harshest winter weather would not have arrived, which was a consideration for those who had to travel to a polling place.

In a practical sense, having the presidential election held on different days in different states was not a major concern in the early decades of the 1800s. Communication was slow, and when it took days or weeks for election results to become known it didn't matter if states held elections at different times.

As communication improved with the introduction of the railroad and the telegraph, it seemed obvious that election results in one state might influence the voting yet to occur in another. And as transportation improved, there was also a fear that voters could travel from state to state and participate in multiple elections.

In the early 1840s, Congress decided to set a standardized date for holding presidential elections across the country.

Congress Standardized Election Day in 1845

In 1845 Congress passed a law establishing that the day for choosing presidential electors (in other words, the day for the popular vote that would determine the electors of the electoral college) would be every four years on the first Tuesday after the first Monday in November.

That formulation was chosen to fall within the time frame determined by the aforementioned 1792 law.

And making the election the first Tuesday after the first Monday also ensured that the election would never be held on November 1, which is All Saints Day, a Catholic holy day of obligation. There is also a legend that merchants in the 1800s tended to do their bookkeeping on the first day of the month, and scheduling an important election on that day might interfere with business.
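For the curious, the rule is simple to compute. Here's a minimal Python sketch (the function name is ours, not anything official):

    from datetime import date, timedelta

    def election_day(year):
        # The first Monday of November falls somewhere from the 1st to the 7th.
        nov1 = date(year, 11, 1)
        first_monday = nov1 + timedelta(days=(0 - nov1.weekday()) % 7)
        # Election Day is the Tuesday immediately after it.
        return first_monday + timedelta(days=1)

    print(election_day(1848))  # 1848-11-07 (a Tuesday, as required)
    print(election_day(2016))  # 2016-11-08

Any year you feed it returns a Tuesday between November 2 and November 8, and never November 1.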

The first presidential election held in accordance with the new law was held on November 7, 1848. In that year's election, the Whig candidate Zachary Taylor defeated Lewis Cass of the Democratic Party, and former president Martin Van Buren, who was running on the ticket of the Free Soil Party.

Why Hold the Presidential Election on a Tuesday?

The choice of a Tuesday is most likely because elections in the 1840s were generally held at county seats, and people in outlying areas would have to travel from their farms into town to vote. Tuesday was chosen as people could begin their travels on a Monday, and thus avoid traveling on the Sunday sabbath.

Holding important national elections on a weekday seems anachronistic in the modern world, and it's no doubt true that Tuesday voting tends to create obstacles and discourages participation.

The introduction of early voting procedures in many American states in recent elections has addressed the problem of having to vote on a specific weekday. But, generally speaking, the tradition of voting for the president every four years on the first Tuesday after the first Monday in November has continued uninterrupted since the 1840s.

How to Deal with Pre-Test Stress?

We all get anxious about exams: some get cold sweats, some get dizzy and faint, some become nauseous. Extreme test anxiety, while rare, can be a tremendous problem. Daniella gets some advice from an expert on how to deal with pre-test stress.

Bigorexia: young men, body image and steroids

At age 13, Nathyn Costello picked up a pair of dumbbells at his friend's place and started working out. He was a skinny kid, and had recently watched a couple of Jean-Claude Van Damme movies. He saw the man in the skin-tight singlet showing off his ripped abs and bulging biceps, watched him high kick and fly kick the bad guy and ultimately get the girl. Costello decided then and there what he wanted to be—big.

Seven years later Costello had achieved his dream; his body was buff and sculpted and there was barely a molecule of fat on his frame. But taking his shirt off in front of other men at footy training, eating a piece of fruit and even going out to dinner with friends had become the stuff of his nightmares.

'The best way to think about muscle dysmorphia is like reverse anorexia,' says Scott Griffiths, a psychologist from the University of Sydney whose research focuses on muscle dysmorphia and eating disorders in men.

'Guys with muscle dysmorphia are not trying to be skinny: their ideal physique is lean, cut, and very big, so the type of dieting and exercise they do is different to people with anorexia,' says Griffiths. 'But it's just as aggressive, so they too can look you in the eye and tell you that they're small, even though they're huge.'

In his early twenties, toned, trim and muscular, Costello would sit in the gym, watch other guys walk past and think how fit and strong they looked. He'd catch sight of himself in the mirror and get upset.

Outside the gym, as a coach of a high-level football team, Costello wouldn't take his shirt off unless he knew he was under a certain body fat percentage, and socializing with friends became almost impossible. When he was invited out to dinner he would eat his own meal before or after, or avoid social activities altogether.

Griffiths says Costello's story is a familiar one among the men he speaks with during his research.

'Once those thoughts, feelings and behaviors escape the context of the gym and they start to interfere in personal relationships, your ability to hold down a job and to do your job properly, that's when you need to have that honest conversation with yourself,' he says.

Costello says he knew something wasn't right with his behavior, but when he saw his body reflected in the mirror, he remained unhappy. So even with his strict diet and workout regime, Costello says he was driven to 'improve the outcome'. He turned to steroids.

Rising from the underground

Steroid use is on the rise in the US.

This recent rise in steroid use is something Ben Ly, a personal trainer, says he has noticed over the past few years among young men, especially 15- to 25-year-olds.

'They're very impatient; they want to put on muscle as fast as they possibly can, they look at muscle magazines and movies, and see these huge, ripped out guys but they don't realize that [a lot of those guys] have something called 'mature muscle', which is five to 15 years’ worth of weight training that has gotten them to that point.'

Ly says he's also noticed that over the past three to four years, steroid use has become less of a taboo subject.

'Illicit steroid use was very underground, no one would mention it. But even in the last year, it's become very open. Just last week I had two young guys [ask about steroids] and approach us directly at the reception counter, it was like they were just asking for a cigarette or a lighter … it's so big and well used now in society that to them, it's not illegal ... and everybody wants to be a part of it.'

This recent rise in steroid use has Costello worried because it reminds him of his own experience with the drugs.

'I knew guys who were using 10 to 30 times [the dosage] I was, and that was very common, and I'd be concerned. These guys would put on a lot more muscle, a lot more quickly, but from my observation they'd also be the ones who were more fragile—they'd run out of the drug, lose a bit of weight, and their confidence would drop through the floor.'

That kind of unregulated and sporadic use also has Griffiths concerned. He says the period of time when someone with muscle dysmorphia is coming off a steroid cycle is critical. He sees some of his clients experience big mood swings and emotional instability, including the risk of suicide.

'For a guy whose body image, self esteem and emotional stability rests heavily, if not exclusively, on how he looks and his appearance, he's going to experience a very rapid drop in muscularity ... and to see that happen over just a couple of days can be quite traumatic.'

Griffiths also says the way steroid users have recently been associated with violence in the media has not helped encourage men to come forward. While the taboo around steroid use has diminished, Griffiths says there's still a stigma for men who want to seek help.

'I think to a large extent men are discouraged from talking about their vulnerabilities, especially when it comes to mental illness. I think the perception that muscle dysmorphia is a disorder that men can suffer from hasn't quite permeated through the public consciousness yet.'

Costello was never diagnosed with muscle dysmorphia, but now, at age 36, he believes he had the disorder in his twenties.

'The truth is I think I just wanted acceptance,' Costello says. 'I was very skinny and wanted to feel more confident, I guess. That's where it started ... the world automatically rewards people who look good, but there aren't enough people who want to be vulnerable to talk about how they're feeling.'
