Mental decline affecting people at earlier ages
Since 1979, the death rate from neurological diseases has risen more in the United States than in any other Western country, increasing 66 percent among men and 92 percent among women. But research at Bournemouth University in the U.K. suggests that this spike is not simply due to people living longer: there has also been an alarming rise in dementia and other neurological disorders among people under 55 years old.
And this “hidden epidemic” is not occurring only in the U.S. The researchers found that deaths from neurological disease among adults under 74 rose significantly in all 16 countries they studied.
Most concerning is what the researchers speculate may underlie the rise in neurological diseases. They reason that the cause is unlikely to be genetic, as three decades is not long enough to reflect a significant genetic change. Earlier onset in Western countries, they noted, could instead be attributed to new technology introduced during this period, including the explosion of electronic devices, microwaves, TVs and mobile phones, along with increased pollution and more chemical additives in foods. The researchers suggested that no single environmental factor is responsible, but rather that the interaction of all these triggers could be causing the spike.
The first vaccination: May 14, 1796
A young English boy named James Phipps is given an unusual treatment by the local country doctor. The physician is Edward Jenner, and he makes an incision in the boy’s arm and applies to it some pus taken from a blister on the hand of a milkmaid who has contracted cowpox. Jenner is acting on a theory based on lore among the area’s farmers that milkmaids who developed cowpox somehow avoided catching smallpox.
The young boy does get sick, but within a week or so, he recovers. Then, on July 1, Jenner takes the next critical step—he applies actual smallpox matter to an incision on Phipps. The boy suffers no ill effects and Jenner believes he may have found a safe way to protect people from smallpox. Using the Latin word for cowpox—vaccinia—Jenner calls his treatment “vaccination.”
Prior to Jenner’s discovery, the most common approach to fighting smallpox was variolation, which involved deliberately exposing people to the smallpox virus. Those people would develop the disease, though often a milder version, and they seemed less likely to contract it again. Variolation may first have been used against smallpox in China as early as 1000 A.D. It also was used in India, then spread to the Middle East and Africa, and by the 18th century it was well established as the treatment of choice in Europe, particularly Great Britain. The practice likewise became accepted in Britain’s colonies, and George Washington required that it be used on men drafted into his Continental Army.
But using the smallpox virus itself as protection came with risks. It killed some people, and it could spread other diseases, such as tuberculosis and syphilis, that were transmitted through the procedure. So Jenner’s use of the cowpox virus, which appeared to have no lasting harmful effects, was potentially a great breakthrough.
Still, his approach wasn’t embraced at first by Great Britain’s medical establishment, which doubted that a country doctor could have made such an important discovery. In fact, he was widely ridiculed in some circles. Critics, especially the clergy, claimed it was repulsive and ungodly to inoculate someone with material from a diseased animal. A satirical cartoon of 1802 showed people who had been vaccinated sprouting cow’s heads.
In time, though, Jenner’s smallpox vaccinations took hold, and in 1840, 17 years after his death, the British government banned all other preventive treatments against smallpox. Strictly speaking, Jenner didn’t invent the concept of vaccination (although he did name it), but his work represented the first scientific attempt to control an infectious disease by deliberate vaccination. Scientists throughout the 19th and 20th centuries followed his model to develop vaccines against many deadly diseases, including polio, whooping cough, measles, tetanus, yellow fever, typhus and hepatitis B.
In 1979, the World Health Organization declared that smallpox, which killed at least 300 million people in the 20th century alone, had been eradicated worldwide and that vaccination against it was no longer necessary.
More slices of history:
Hello, Cheerios: May 1, 1941
First hit workout video: April 24, 1982
Insulin goes mainstream: April 15, 1923
Polio vaccine celebrated: April 12, 1955
First artificial heart: April 4, 1969
First batch of Coca-Cola: March 29, 1886
Elephant man case presented: March 17, 1885
Flu pandemic begins: March 11, 1918
Aspirin is born: March 6, 1899
How the brain tracks motion
As amazing as the human brain is, it does have some limitations, one of which is that it takes one-tenth of a second to process visual information. That delay is especially problematic when perceiving an object in motion, particularly one moving very quickly. But scientists at the University of California, Berkeley found that the brain compensates for the delay by predicting the path of a moving object, so that it can ‘see’ the object and respond to it accordingly.
One fundamental limitation, the researchers noted, is that the brain does not work exactly in real time; it functions more slowly than a computer. By the time the brain has fully processed visual information, that information is slightly dated. This is especially true when it comes to perceiving fast-moving objects.
To compensate for this lag, the brain is able to predict the intended path of a moving object, so the object that the eye ‘sees’ is actually the brain’s prediction of where that object will be one-tenth of a second from the moment you see it.
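The arithmetic behind this idea can be sketched in a few lines of code. This is an illustration of the extrapolation principle described above, not the researchers’ actual model; the function name and the constant-velocity assumption are ours.

```python
# Illustrative sketch (not the Berkeley researchers' model): if visual
# processing lags by ~0.1 s, a simple compensation is to extrapolate the
# object's position along its current trajectory by that delay.

PROCESSING_DELAY_S = 0.1  # roughly one-tenth of a second


def extrapolate_position(position_m: float, velocity_mps: float,
                         delay_s: float = PROCESSING_DELAY_S) -> float:
    """Predict where a moving object will be once the delayed signal
    has been processed, assuming constant velocity over the delay."""
    return position_m + velocity_mps * delay_s


# A ball last registered at the 2.0 m mark, moving at 10 m/s,
# is predicted to be at the 3.0 m mark by the time the brain "sees" it.
print(extrapolate_position(2.0, 10.0))  # 3.0
```

The same logic explains why a fast-moving object appears where it is about to be, rather than where the light that reached the eye actually came from.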
The researchers studied the brains of six volunteers using functional magnetic resonance imaging (fMRI), which indirectly measures brain activity by measuring changes in the blood flow in the brain.
The volunteers’ brains were scanned as they watched an illusion called the “flash-drag effect,” in which brief flashes of light shift over a moving background. The brain interprets the flash as part of the moving background, and therefore engages the prediction mechanism to shift the position of the flash.
Pet owners have lower risk of heart disease
The next time you’re out for a walk with your dog in the pouring rain, remind yourself how much good it’s doing you. A new study published in the journal Circulation found that pet owners, and especially dog owners, have a lower risk of heart disease than people who do not have pets.
The conclusion is based on an analysis of several studies examining the link between pet ownership and health. The findings do not establish causality: researchers are not sure whether owning a pet actually reduces heart disease risk or whether healthy people simply tend to get pets. What they do know is that dog owners, in particular, are 54 percent more likely to get the recommended amount of physical activity than people who do not own a dog.
The research also concluded that owning a pet may be linked to lower blood pressure, lower cholesterol levels and lower incidences of obesity.