The Corroboree
Everything posted by tripsis

  1. This is beginning to get a little exasperating. Meat is an energy source, yes. Protein is also the most difficult macronutrient to digest, so almost 30% of the energy gained from it is expended in digesting it (i.e. thermogenesis). Carbohydrates are the body's preferred energy source. They are also the macronutrient most easily converted to fat. Big, easily spotted animals are an obvious target for hunter-gatherers, yielding a lot of food for the effort expended. Digging into the ground for half a day to obtain a kilo of tubers high in carbohydrates is simply not as productive. It's your "common fricking sense" that's leading you down the path of flawed logic and false conclusions. So there's no point in linking to anything, because what you're saying has no credible, scientific basis? You might as well quit pretending you know what you're talking about then, if all you're doing is parroting other people's baseless opinions. That's not evidence, that's you basing broad statements on your limited individual experience of the world. If anything, that suggests you have a fast basal metabolic rate. Stop digging yourself a hole and go back and read through what I've written. I have already touched on this point. Metabolism. Go do some research. Also, don't go putting words into my mouth. Never once did I state that everything that is eaten is converted to energy. There is a difference between eating food and digesting food. If the food is digested, it will enter the bloodstream. However, not everything you eat will necessarily be digested. Perhaps this is what you've been attempting to say? And now, I'm over this. I have better things to do with my time than argue with you. Go on, have the last word, you know you want to.
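To put rough numbers on the thermogenesis point above, here's a minimal sketch. The thermic-effect fractions and kcal-per-gram figures are approximate textbook ranges assumed for illustration, not measurements from anything cited in this thread:

```python
# Net energy after the thermic effect of food (TEF).
# TEF fractions are approximate ranges (protein ~25-30%, carbs ~5-10%, fat ~0-3%).
TEF = {"protein": 0.28, "carbohydrate": 0.07, "fat": 0.02}
KCAL_PER_GRAM = {"protein": 4, "carbohydrate": 4, "fat": 9}

def net_energy(grams, macro):
    """Energy absorbed minus energy spent digesting, in kcal."""
    gross = grams * KCAL_PER_GRAM[macro]
    return gross * (1 - TEF[macro])

for macro in TEF:
    print(f"100 g {macro}: {net_energy(100, macro):.0f} kcal net")
# 100 g protein      -> 288 kcal net of 400 gross
# 100 g carbohydrate -> 372 kcal net of 400 gross
# 100 g fat          -> 882 kcal net of 900 gross
```

The point the numbers make: protein loses roughly a third of its gross energy to digestion, which is why a calorie of protein is not equivalent, in practice, to a calorie of carbohydrate.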
  2. Silver Banksia plants excel at phosphate saving The Australian plant family is highly efficient in the management of the nutrient December 10, 2013 [Image: Proteaceae are used to a lack of phosphorus. Their efficient management of phosphorus could be of interest for the optimisation of crop plants. © MPI of Molecular Plant Physiology] Plants in the leached soils of Western Australia have developed a special strategy for coping with the scarcity of phosphorus. Together with colleagues from the University of Western Australia, Perth, scientists from the Max Planck Institute of Molecular Plant Physiology in Golm near Potsdam have discovered that plants from the Banksia genus of the Proteaceae family make severe cutbacks, in particular to the RNA found in the ribosomes (rRNA). The cell's protein factories are the biggest consumers of phosphorus; in this way, the plants save on both phosphorus and water. As global phosphorus reserves are in severe decline, the strategies of the Proteaceae could be of interest from the perspective of optimising crop plants through breeding. Plants in Western Australia have to be very tough to survive. The heat is oppressive there, rain is extremely rare, and phosphorus in the form of phosphate is virtually nowhere to be found in the soil. However, this element is crucial to the survival of plants. It attaches itself to sugars and proteins and is a component of DNA, the cell membrane and the energy currency ATP. When phosphorus is scarce, photosynthesis declines and plants hardly grow. This is not, however, the case with some of the plants from the Banksia genus of the Proteaceae family. "These plants grow on soils which contain a hundred times less phosphate than unfertilised soils in Europe," explains Mark Stitt, one of the authors of the study. This is due, on the one hand, to their roots, which resemble toilet brushes and suck every phosphorus atom from the soil with their fine hairs. On the other hand, the plants are extremely prudent in their use of the little phosphorus available to them. They make the biggest savings in the nucleic acids, which account for between 30 and 50 percent of the cell's total phosphorus. Cutbacks are applied in particular to ribosomal RNA, a component of the cell's protein factories. Compared to Proteaceae, the model plant Arabidopsis thaliana has two to four times more ribosomes in its fully-grown leaves and, in the case of young leaves, 10 to 40 times more. Fewer ribosomes produce fewer proteins and enzymes. The plant grows more slowly as a result, but does not present any symptoms of phosphorus deficiency. In fact, only too much phosphorus could prove dangerous for these plants. "The plants can be fertilised to death, as they cannot halt the absorption of phosphate," explains Mark Stitt. Other plants simply shut down phosphate uptake when over-fertilised. "Up to now, we did not know why the Proteaceae, which have adapted to phosphorus deficiency, are no longer able to do this." Presumably, they have simply never been in such a situation, as the soils of Western Australia are very old and weathered and did not acquire additional phosphate in the past from volcanic eruptions, people or animals. The plants are also extremely economical when they form new leaves. Instead of investing simultaneously in growth and the formation of the photosynthesis machinery, which would bind huge volumes of ribosomes and, thus, phosphorus, they focus first on the formation of the leaf and later on the production of the green chlorophyll. In the next phase of their study, the researchers would like to establish whether humans could implement the strategies of the Proteaceae for the efficient use of phosphorus in crop plants, or whether this approach could be associated with disadvantageous characteristics, for example lower yields. Phosphorus is very rare on Earth and deposits are concentrated in very small geographical areas: almost 75 percent of the world's total phosphate rock is found in Morocco and the Western Sahara, and a further 15 percent is distributed between China, Algeria, Syria, South Africa and Jordan. CS/HR Source.
  3. That is one way. A natural diet high in greens and insects, as free-ranging chickens would eat, results in omega-3s too.
  4. I never said you claimed meat was evil, though you have implied - more than once - that it is unhealthy. Except meat doesn't do that. There is plenty of evidence to support what I said; it's not difficult to find if you actually care to educate yourself. This purported "theory" - is it yours, as you initially claimed, or someone else's, as would be the case if you read it somewhere? If the latter, where did you read it and what was the evidence? Considering this "theory" appears to be based on a flawed understanding of digestion, it doesn't strike me as credible.
  5. From weight loss to fundraising, 'ironic effects' can sabotage our best-laid plans Growing body of research shows how efforts backfire in sneaky ways: we fail in our best efforts because of our best efforts [Image: You were trying so hard not to spill the wine – which is why you spilled it. Photograph: Alamy] The great Harvard psychologist Dan Wegner, who died earlier this year, wrote a famous article entitled How To Think, Say, or Do Precisely the Worst Thing for Any Occasion (pdf). It concerned a very specific kind of mistake, which he labelled the "precisely counterintuitive error" – the kind of screw-up so obviously calamitous that you think about it in advance and decide you definitely won't let it happen: spilling red wine on the carpet, say, after resolving all evening not to. This is an example of what psychologists call an "ironic effect": it's not just that we fail in our best efforts, but that we fail because of our best efforts. If you hadn't given much thought to the wine, you'd probably not have disgraced yourself. The depressingly popular field of "positive thinking" is basically one long litany of ironic effects, because trying too hard to be happy makes people miserable. (I explore this in my book The Antidote – and now I just have to hope that this self-promotional reference doesn't have the ironic effect of making you less likely to buy it.) But ironic effects have been cropping up in a whole range of other contexts, too. Here are three reported in the last few weeks alone: Stigmatising obesity makes overweight people eat more, not less For a study in the Journal of Experimental Psychology, a group of women read an article suggesting that overweight people find it harder to get jobs; others read an article making the same point about smokers, while still others read no article at all. Afterwards, they were shown into a break room with bowls full of junk-food snacks. As Tom Jacobs explains at Pacific Standard, among women who already perceived themselves as overweight, those exposed to the weight-related message consumed about 80 calories more, on average, than those who read the article on smoking. (Questionnaire responses also implied they felt less in control of their eating.) Stigmatisation triggers anxiety, which triggers eating. For women who didn't see themselves as overweight, the weight-related article increased their sense of control over their food consumption – which shows why non-overweight people probably shouldn't be in sole charge of designing anti-obesity campaigns: what makes them feel better about food has the opposite effect on the people they need to reach. Campaigns "need to emphasize the positive aspects to losing weight," Jacobs concludes, "rather than the negative aspects of being fat." (Relatedly: moderate exercise is more motivating than hard training.) Supporting a good cause on Facebook makes people less likely to give money or time There are many reasons to be sceptical of the benefits of "slacktivism" – Joseph Kony's still at large, for a start – but new research (pdf) suggests it might be actively damaging.
In a series of experiments at the University of British Columbia, people were invited to make low-cost expressions of support for good causes, either privately (by signing a petition) or publicly (by coming to the front of the room to sign the petition, accepting lapel pins, etc). The results, in brief: the more public the commitment, the less willing people are to give a higher-cost donation of money or time. "Once we’ve shown our support and earned the status associated with joining a cause," explains Adam Grant, outlining the likely mechanism involved, "we feel less obligated to follow through with a meaningful contribution to that cause." He offers some alternative tactics for fundraisers here. Awareness campaigns get forgotten by the people who need them most "Motivated forgetting" is an especially galling species of ironic effect: when a message makes you feel vulnerable – for example, by reminding you of the ways in which your gender or ethnicity places you at a disadvantage – you're more likely to find ways, conscious or otherwise, to forget it, in order to retain a sense of self-control. In a study to be published in the Journal of Consumer Research, marketing experts found that students who were reminded of their university's poor performance were less likely to remember an advertisement offering a discount at the campus bookshop. "Consider an advertisement for breast cancer prevention," the researchers write. "If the ad makes … women’s vulnerability to the disease" salient in their minds, they could "feel threatened and exhibit defensive responses, such as decreased ad memory." In short: if you're trying to change behaviour or beliefs – your own, or other people's – don't assume that the most direct, vigorous or effortful route is necessarily the most effective one. The human mind is much, much more perverse and annoying than that. Source.
  6. Feel free to start any thread you wish, where you can bash whoever you want. In the meantime, I'll keep posting threads I personally find interesting.
  7. What do you think evolution is? We have evolved to eat meat. Period. Do you know why the Inuit can survive on a diet of almost nothing but animals? It's because they eat the organs and stomach contents too. They get all the vitamins and minerals they need by consuming the whole animal, as well as some of what the animal has eaten. We could do that too. For the sake of argument, however, let's assume we couldn't. So what? Almost every other culture on earth has traditionally eaten meat, meaning that they have "adapted to it over many generations". If it conferred ill health, it would have been selected against. On what basis do you disagree? Provide some credible evidence.
  8. Really? Need I point out that big pharma produces vitamins?
  9. Considering my response was to your claim that it takes "ages", that will be hard, as "ages" is rather ambiguous. If you're claiming 8 hours, I won't argue, but that doesn't strike me as "ages". Such as? My point is that we are adapted to eat meat. There are cultures that subsist entirely off meat. If it were that harmful, they would be extinct. And you, mate, need to educate yourself a little more. If food is digested, that energy will enter the bloodstream. If there is more energy than the body immediately requires, it will be converted to fat. Do you understand? Your body doesn't just digest an arbitrary amount and then shit out the other half of the pizza it doesn't need, because it's decided that's enough energy in one sitting.
  10. Depends if you claim this is another piece of propaganda peddled by some organisation with a hidden agenda or not. ;) But go on, interested to hear all opinions.
  11. Might as well check out Lester Meyers' "Orana Cactus World" in Gilgandra while you're about it, but it doesn't compare to the others (haven't been to Dawson's yet). Still worth checking out. Should be a good trip. :)
  12. No need to apologise, I post these articles to incite discussion, so it's good to see that this one has generated so much interest. You can tear apart the study if you like, but I'll pick apart those posts I think lack logic. :)
  13. The Vitamin Myth: Why We Think We Need Supplements Nutrition experts contend that all we need is what's typically found in a routine diet. Industry representatives, backed by a fascinating history, argue that foods don't contain enough, and we need supplements. Fortunately, many excellent studies have now resolved the issue. Paul Offit Jul 19 2013, 9:12 AM ET [Image: slipah/Flickr] On October 10, 2011, researchers from the University of Minnesota found that women who took supplemental multivitamins died at rates higher than those who didn't. Two days later, researchers from the Cleveland Clinic found that men who took vitamin E had an increased risk of prostate cancer. "It's been a tough week for vitamins," said Carrie Gann of ABC News. These findings weren't new. Seven previous studies had already shown that vitamins increased the risk of cancer and heart disease and shortened lives. Still, in 2012, more than half of all Americans took some form of vitamin supplements. What few people realize, however, is that their fascination with vitamins can be traced back to one man. A man who was so spectacularly right that he won two Nobel Prizes and so spectacularly wrong that he was arguably the world's greatest quack. When Albert Einstein was asked what he thought of Pauling's work, he shrugged his shoulders. "It was too complicated for me." In 1931, Linus Pauling published a paper in the Journal of the American Chemical Society titled "The Nature of the Chemical Bond." Before publication, chemists knew of two types of chemical bonds: ionic, where one atom gives up an electron to another; and covalent, where atoms share electrons. Pauling argued that it wasn't that simple -- electron sharing was somewhere between ionic and covalent. Pauling's idea revolutionized the field, marrying quantum physics with chemistry. His concept was so revolutionary in fact that when the journal editor received the manuscript, he couldn't find anyone qualified to review it. For this single paper, Pauling received the Langmuir Prize as the most outstanding young chemist in the United States, became the youngest person elected to the National Academy of Sciences, was made a full professor at Caltech, and won the Nobel Prize in Chemistry. He was 30 years old. In 1949, Pauling published a paper in Science titled "Sickle Cell Anemia, a Molecular Disease." At the time, scientists knew that hemoglobin (the protein in blood that transports oxygen) crystallized in the veins of people with sickle-cell anemia, causing joint pain, blood clots, and death. But they didn't know why. Pauling was the first to show that sickle hemoglobin had a slightly different electrical charge -- a quality that dramatically affected how the hemoglobin reacted with oxygen. His finding gave birth to the field of molecular biology. In 1951, Pauling published a paper in the Proceedings of the National Academy of Sciences titled "The Structure of Proteins." Scientists knew that proteins were composed of a series of amino acids. Pauling proposed that proteins also had a secondary structure determined by how they folded upon themselves. He called one configuration the alpha helix -- later used by James Watson and Francis Crick to explain the structure of DNA. In 1961, Pauling collected blood from gorillas, chimpanzees, and monkeys at the San Diego Zoo.
He wanted to see whether mutations in hemoglobin could be used as a kind of evolutionary clock. Pauling showed that humans had diverged from gorillas about 11 million years ago, much earlier than scientists had suspected. A colleague later remarked, "At one stroke he united the fields of paleontology, evolutionary biology, and molecular biology." Pauling's accomplishments weren't limited to science. Beginning in the 1950s -- and for the next forty years -- he was the world's most recognized peace activist. Pauling opposed the internment of Japanese Americans during World War II, declined Robert Oppenheimer's offer to work on the Manhattan Project, stood up to Senator Joseph McCarthy by refusing a loyalty oath, opposed nuclear proliferation, publicly debated nuclear-arms hawks like Edward Teller, forced the government to admit that nuclear explosions could damage human genes, convinced other Nobel Prize winners to oppose the Vietnam War, and wrote the best-selling book No More War! Pauling's efforts led to the Nuclear Test Ban Treaty. In 1962, he won the Nobel Peace Prize -- the first person ever to win two unshared Nobel Prizes. In addition to his election to the National Academy of Sciences, two Nobel Prizes, the National Medal of Science, and the Medal for Merit (which was awarded by the president of the United States), Pauling received honorary degrees from Cambridge University, the University of London, and the University of Paris. In 1961, he appeared on the cover of Time magazine's Men of the Year issue, hailed as one of the greatest scientists who had ever lived. Then all the rigor, hard work, and hard thinking that had made Linus Pauling a legend disappeared. In the words of a colleague, his "fall was as great as any classic tragedy." The turning point came in March 1966, when Pauling was 65 years old. He had just received the Carl Neuberg Medal. "During a talk in New York City," recalled Pauling, "I mentioned how much pleasure I took in reading about the discoveries made by scientists in their various investigations of the nature of the world, and stated that I hoped I could live another twenty-five years in order to continue to have this pleasure. On my return to California I received a letter from a biochemist, Irwin Stone, who had been at the talk. He wrote that if I followed his recommendation of taking 3,000 milligrams of vitamin C, I would live not only 25 years longer, but probably more." Stone, who referred to himself as Dr. Stone, had spent two years studying chemistry in college. Later, he received an honorary degree from the Los Angeles College of Chiropractic and a "PhD" from Donsbach University, a non-accredited correspondence school in Southern California. Pauling followed Stone's advice. "I began to feel livelier and healthier," he said. "In particular, the severe colds I had suffered several times a year all my life no longer occurred. After a few years, I increased my intake of vitamin C to ten times, then twenty times, and then three hundred times the RDA: now 18,000 milligrams per day." From that day forward, people would remember Linus Pauling for one thing: vitamin C. In 1970, Pauling published Vitamin C and the Common Cold, urging the public to take 3,000 milligrams of vitamin C every day (about 50 times the recommended daily allowance). Pauling believed that the common cold would soon be a historical footnote. 
"It will take decades to eradicate the common cold completely," he wrote, "but it can, I believe, be controlled entirely in the United States and some other countries within a few years. I look forward to witnessing this step toward a better world." Pauling's book became an instant best seller. Paperback versions were printed in 1971 and 1973, and an expanded edition titled Vitamin C, the Common Cold and the Flu, published three years later, promised to ward off a predicted swine flu pandemic. Sales of vitamin C doubled, tripled, and quadrupled. Drugstores couldn't keep up with demand. By the mid-1970s, 50 million Americans were following Pauling's advice. Vitamin manufacturers called it "the Linus Pauling effect." Scientists weren't as enthusiastic. On December 14, 1942, about thirty years before Pauling published his first book, Donald Cowan, Harold Diehl, and Abe Baker, from the University of Minnesota, published a paper in the Journal of the American Medical Association titled "Vitamins for the Prevention of Colds." The authors concluded, "Under the conditions of this controlled study, in which 980 colds were treated . . . there is no indication that vitamin C alone, an antihistamine alone, or vitamin C plus an antihistamine have any important effect on the duration or severity of infections of the upper respiratory tract." Other studies followed. After Pauling's pronouncement, researchers at the University of Maryland gave 3,000 milligrams of vitamin C every day for three weeks to eleven volunteers and a sugar pill (placebo) to ten others. Then they infected volunteers with a common cold virus. All developed cold symptoms of similar duration. At the University of Toronto, researchers administered vitamin C or placebo to 3,500 volunteers. Again, vitamin C didn't prevent colds, even in those receiving as much as 2,000 milligrams a day. In 2002, researchers in the Netherlands administered multivitamins or placebo to more than 600 volunteers. Again, no difference. At least 15 studies have now shown that vitamin C doesn't treat the common cold. As a consequence, neither the FDA, the American Academy of Pediatrics, the American Medical Association, the American Dietetic Association, the Center for Human Nutrition at the Johns Hopkins Bloomberg School of Public Health, nor the Department of Health and Human Services recommend supplemental vitamin C for the prevention or treatment of colds. Although study after study showed that he was wrong, Pauling refused to believe it, continuing to promote vitamin C in speeches, popular articles, and books. When he occasionally appeared before the media with obvious cold symptoms, he said he was suffering from allergies. Then Linus Pauling upped the ante. He claimed that vitamin C not only prevented colds; it cured cancer. In 1971, Pauling received a letter from Ewan Cameron, a Scottish surgeon from a tiny hospital outside Glasgow. Cameron wrote that cancer patients who were treated with ten grams of vitamin C every day had fared better than those who weren't. Pauling was ecstatic. He decided to publish Cameron's findings in the Proceedings of the National Academy of Sciences (PNAS). Pauling assumed that as a member of the academy he could publish a paper in PNAS whenever he wanted; only three papers submitted by academy members had been rejected in more than half a century. Pauling's paper was rejected anyway, further tarnishing his reputation among scientists. Later, the paper was published in Oncology, a journal for cancer specialists. 
When researchers evaluated the data, the flaw became obvious: the cancer victims Cameron had treated with vitamin C were healthier at the start of therapy, so their outcomes were better. After that, scientists no longer took Pauling's claims about vitamins seriously. But Linus Pauling still had clout with the media. In 1971, he declared that vitamin C would cause a 10 percent decrease in deaths from cancer. In 1977, he went even further. "My present estimate is that a decrease of 75 percent can be achieved with vitamin C alone," he wrote, "and a further decrease by use of other nutritional supplements." With cancer in their rearview mirror, Pauling predicted, Americans would live longer, healthier lives. "Life expectancy will be 100 to 110 years," he said, "and in the course of time, the maximum age might be 150 years." Cancer victims now had reason for hope. Wanting to participate in the Pauling miracle, they urged their doctors to give them massive doses of vitamin C. "For about seven or eight years, we were getting a lot of requests from our families to use high-dose vitamin C," recalls John Maris, chief of oncology and director of the Center for Childhood Cancer Research at the Children's Hospital of Philadelphia. "We struggled with that. They would say, 'Doctor, do you have a Nobel Prize?' " Blindsided, cancer researchers decided to test Pauling's theory. Charles Moertel, of the Mayo Clinic, evaluated 150 cancer victims: half received ten grams of vitamin C a day and half didn't. The vitamin C-treated group showed no difference in symptoms or mortality. Moertel concluded, "We were unable to show a therapeutic benefit of high-dose vitamin C." Pauling was outraged. He wrote an angry letter to the New England Journal of Medicine, which had published the study, claiming that Moertel had missed the point. Of course vitamin C hadn't worked: Moertel had treated patients who had already received chemotherapy. Pauling claimed that vitamin C worked only if cancer victims had received no prior chemotherapy. Bullied, Moertel performed a second study; the results were the same. Moertel concluded, "Among patients with measurable disease, none had objective improvement. It can be concluded that high-dose vitamin C therapy is not effective against advanced malignant disease regardless of whether the patient had received any prior chemotherapy." For most doctors, this was the end of it. But not for Linus Pauling. He was simply not to be contradicted. Cameron observed, "I have never seen him so upset. He regards the whole affair as a personal attack on his integrity." Pauling thought Moertel's study was a case of "fraud and deliberate misrepresentation." He consulted lawyers about suing Moertel, but they talked him out of it. Subsequent studies have consistently shown that vitamin C doesn't treat cancer. Pauling wasn't finished. Next, he claimed that vitamin C, when taken with massive doses of vitamin A (25,000 international units) and vitamin E (400 to 1,600 IU), as well as selenium (a basic element) and beta-carotene (a precursor to vitamin A), could do more than just prevent colds and treat cancer; they could treat virtually every disease known to man. 
Pauling claimed that vitamins and supplements could cure heart disease, mental illness, pneumonia, hepatitis, polio, tuberculosis, measles, mumps, chickenpox, meningitis, shingles, fever blisters, cold sores, canker sores, warts, aging, allergies, asthma, arthritis, diabetes, retinal detachment, strokes, ulcers, shock, typhoid fever, tetanus, dysentery, whooping cough, leprosy, hay fever, burns, fractures, wounds, heat prostration, altitude sickness, radiation poisoning, glaucoma, kidney failure, influenza, bladder ailments, stress, rabies, and snakebites. When the AIDS virus entered the United States in the 1970s, Pauling claimed vitamins could treat that, too. On April 6, 1992, the cover of Time -- rimmed with colorful pills and capsules -- declared: "The Real Power of Vitamins: New research shows they may help fight cancer, heart disease, and the ravages of aging." The article, written by Anastasia Toufexis, echoed Pauling's ill-founded, disproved notions about the wonders of megavitamins. "More and more scientists are starting to suspect that traditional medical views of vitamins and minerals have been too limited," wrote Toufexis. "Vitamins -- often in doses much higher than those usually recommended -- may protect against a host of ills ranging from birth defects and cataracts to heart disease and cancer. Even more provocative are glimmerings that vitamins can stave off the normal ravages of aging." Toufexis enthused that the "pharmaceutical giant Hoffman-La Roche is so enamored with beta-carotene that it plans to open a Freeport, Texas, plant next year that will churn out 350 tons of the nutrient annually, or enough to supply a daily 6 milligram capsule to virtually every American adult." The National Nutritional Foods Association (NNFA), a lobbying group for vitamin manufacturers, couldn't believe its good luck, calling the Time article "a watershed event for the industry." As part of an effort to get the FDA off their backs, the NNFA distributed multiple copies of the magazine to every member of Congress. Speaking at an NNFA trade show later in 1992, Toufexis said, "In fifteen years at Time I have written many health covers. But I have never seen anything like the response to the vitamin cover. It whipped off the sales racks, and we were inundated with requests for copies. There are no more copies. 'Vitamins' is the number-one-selling issue so far this year." Although studies had failed to support him, Pauling believed that vitamins and supplements had one property that made them cure-alls, a property that continues to be hawked on everything from ketchup to pomegranate juice and that rivals words like natural and organic for sales impact: antioxidant. Antioxidation vs. oxidation has been billed as a contest between good and evil. The battle takes place in cellular organelles called mitochondria, where the body converts food to energy, a process that requires oxygen and so is called oxidation. One consequence of oxidation is the generation of electron scavengers called free radicals (evil). Free radicals can damage DNA, cell membranes, and the lining of arteries; not surprisingly, they've been linked to aging, cancer, and heart disease. To neutralize free radicals, the body makes its own antioxidants (good). Antioxidants can also be found in fruits and vegetables -- specifically, selenium, beta-carotene, and vitamins A, C, and E. Studies have shown that people who eat more fruits and vegetables have a lower incidence of cancer and heart disease and live longer.
The logic is obvious: if fruits and vegetables contain antioxidants -- and people who eat lots of fruits and vegetables are healthier -- then people who take supplemental antioxidants should also be healthier. In fact, they're less healthy. In 1994, the National Cancer Institute, in collaboration with Finland's National Public Health Institute, studied 29,000 Finnish men, all long-term smokers more than fifty years old. This group was chosen because they were at high risk for cancer and heart disease. Subjects were given vitamin E, beta-carotene, both, or neither. The results were clear: those taking vitamins and supplements were more likely to die from lung cancer or heart disease than those who didn't take them -- the opposite of what researchers had anticipated. In 1996, investigators from the Fred Hutchinson Cancer Research Center, in Seattle, studied 18,000 people who, because they had been exposed to asbestos, were at increased risk of lung cancer. Again, subjects received vitamin A, beta-carotene, both, or neither. Investigators ended the study abruptly when they realized that those who took vitamins and supplements were dying from cancer and heart disease at rates 28 and 17 percent higher, respectively, than those who didn't. In 2004, researchers from the University of Copenhagen reviewed fourteen randomized trials involving more than 170,000 people who took vitamins A, C, E, and beta-carotene to see whether antioxidants could prevent intestinal cancers. Again, antioxidants didn't live up to the hype. The authors concluded, "We could not find evidence that antioxidant supplements can prevent gastrointestinal cancers; on the contrary, they seem to increase overall mortality." When these same researchers evaluated the seven best studies, they found that death rates were 6 percent higher in those taking vitamins. In 2005, researchers from Johns Hopkins School of Medicine evaluated nineteen studies involving more than 136,000 people and found an increased risk of death associated with supplemental vitamin E. Dr. Benjamin Caballero, director of the Center for Human Nutrition at the Johns Hopkins Bloomberg School of Public Health, said, "This reaffirms what others have said. The evidence for supplementing with any vitamin, particularly vitamin E, is just not there. This idea that people have that [vitamins] will not hurt them may not be that simple." That same year, a study published in the Journal of the American Medical Association evaluated more than 9,000 people who took high-dose vitamin E to prevent cancer; those who took vitamin E were more likely to develop heart failure than those who didn't. In 2007, researchers from the National Cancer Institute examined 11,000 men who did or didn't take multivitamins. Those who took multivitamins were twice as likely to die from advanced prostate cancer. In 2008, a review of all existing studies involving more than 230,000 people who did or did not receive supplemental antioxidants found that vitamins increased the risk of cancer and heart disease. On October 10, 2011, researchers from the University of Minnesota evaluated 39,000 older women and found that those who took supplemental multivitamins, magnesium, zinc, copper, and iron died at rates higher than those who didn't. They concluded, "Based on existing evidence, we see little justification for the general and widespread use of dietary supplements."
Two days later, on October 12, researchers from the Cleveland Clinic published the results of a study of 36,000 men who took vitamin E, selenium, both, or neither. They found that those receiving vitamin E had a 17 percent greater risk of prostate cancer. In response to the study, Steven Nissen, chairman of cardiology at the Cleveland Clinic, said, "The concept of multivitamins was sold to Americans by an eager nutraceutical industry to generate profits. There was never any scientific data supporting their usage." On October 25, a headline in the Wall Street Journal asked, "Is This the End of Popping Vitamins?" Studies haven't hurt sales. In 2010, the vitamin industry grossed $28 billion, up 4.4 percent from the year before. "The thing to do with [these reports] is just ride them out," said Joseph Fortunato, chief executive of General Nutrition Centers. "We see no impact on our business." How could this be? Given that free radicals clearly damage cells -- and given that people who eat diets rich in substances that neutralize free radicals are healthier -- why did studies of supplemental antioxidants show they were harmful? The most likely explanation is that free radicals aren't as evil as advertised. Although it's clear that free radicals can damage DNA and disrupt cell membranes, that's not always a bad thing. People need free radicals to kill bacteria and eliminate new cancer cells. But when people take large doses of antioxidants, the balance between free radical production and destruction might tip too much in one direction, causing an unnatural state in which the immune system is less able to kill harmful invaders. Researchers have called this "the antioxidant paradox." Whatever the reason, the data are clear: high doses of vitamins and supplements increase the risk of heart disease and cancer; for this reason, not a single national or international organization responsible for the public's health recommends them. In May 1980, during an interview at Oregon State University, Linus Pauling was asked, "Does vitamin C have any side effects on long-term use of, let's say, gram quantities?" Pauling's answer was quick and decisive. "No," he replied. Seven months later, his wife was dead of stomach cancer. In 1994, Linus Pauling died of prostate cancer. Source.
  14. If that were the case, the sarcasm was lost in text, as it almost always is. I really don't know why people persist in attempts at sarcasm while lacking the ability to convey tone. It just doesn't work. Yet you go on to say you think it's propaganda and imply the scientists have been funded by an organisation with ulterior motives. How is that sarcasm? Correct. Often studies lead to further hypotheses. They never stated it as fact; they implied a possible connection and stated that further research was to be performed. Hence they are going to continue the research, as stated. That's a reasonable point. I'm not saying the study is without its flaws. Personally, the first thing that sprang to mind was the connection between fat and Bilophila. The authors made the connection between meat and increased bile due to fat, but not all meat is fatty. On the other hand, plenty of high-fat vegetable sources exist, most notably many nuts and seeds. To me, it would have made more sense to have a diet of lean meat, a diet of fatty meat, a lean plant diet and a plant diet high in fat. While I agree that it's not a great piece of research, that doesn't automatically mean it's a mouthpiece for some anti-meat agenda.
  15. Use two sieves: one very fine, to wash away the very fine material, and another with mesh the same diameter as the particles you're after. 1 mm is probably too small to provide support; 2 mm should work.
  16. It would be fine. Silica is an essential plant nutrient. Diatomite, my favourite media for cacti, is essentially just silica. Why not just use coarse sand? You can sieve it yourself to control particle size.
  17. M1 is closer to 15 years, if I recall correctly. Rosei 1 is the short-spined fat one; rosei 2 is more slender and has much longer spines. All of them are stunningly beautiful.
  18. True, but at least water is something we truly need.
  19. You're honestly going to make that claim, with no evidence whatsoever? Wow. I just lost a fair bit of respect for you, Sally. So because "Average Joe" is too lazy to read a short article fully, you're going to attack the authors of the study/article? Perhaps you've never done science, Sally, but your example doesn't imply sleight of hand; it reveals how scientists try to understand their results. In this case, eating meat led to an increase of Bilophila bacteria. Those same bacteria have been shown to cause inflammation in a previous study. So the authors have inferred that eating meat may lead to inflammation, yet were clear that they did not measure it in this study specifically and plan on doing so. No sleight of hand there, it's all very clear. Evidence please, or you're simply peddling your own propaganda. Meat does not take that long to digest. Everything about our evolution and anatomy suggests we evolved as omnivores. Nothing about our evolution and anatomy suggests we are supposed to be herbivores. If you wish to subsist purely off plants, that's fine and kudos to you, it is undoubtedly a more environmentally sound way to exist, but don't go peddling bullshit regurgitated by a bunch of ill-informed vegetarians with an agenda. I've been vegan twice. No matter how much I tried to eat a healthy and balanced diet, I always became deficient in both iron and B12. I was also way too skinny. Not the diet for me. If it works for you though, go for it! You really need to read more before basing opinions on your own flawed logic. It doesn't work like that. Do you even understand what the word "digest" means? Using your example, if non-meat food is digested in an hour or two and you eat two greasy vegetarian pizzas, then that means in an hour or two, you've broken that food down into a state that allows it to be absorbed into the bloodstream, where it is either immediately used as energy, burned off in thermogenesis or converted to fat for later use. If you ate two greasy meat pizzas that took up to eight hours to digest, then the energy and nutrients from them are going to enter your bloodstream more slowly. That's the only difference. This is why high-protein meals lead to longer feelings of satiety than meals with little protein. Different macronutrients have different capacities for how much can be burnt off in thermogenesis, i.e. more protein is burnt off in thermogenesis than carbohydrates, and more carbohydrates are burnt off in thermogenesis than fat. Individual metabolism is also a big factor. Some individuals have slow metabolisms and even a slight increase in calorific intake above their daily needs will result in fat production. Others have incredibly high metabolisms and regardless of how much they eat and what they eat, really struggle to gain weight.
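Since the digestion-speed argument keeps coming up, here's a back-of-the-envelope energy-balance sketch. The 7,700 kcal/kg figure for body fat is a common rule of thumb, and the pizza and expenditure numbers are guessed purely for illustration; the point is that a surplus gets stored whether the meal digests in two hours or eight:

```python
# Energy surplus -> fat storage, as a rough rule of thumb.
KCAL_PER_KG_FAT = 7700  # commonly quoted approximation, not an exact constant

def surplus_to_fat_kg(intake_kcal, expenditure_kcal):
    """Estimated fat stored (kg) from a day's surplus; negative means a deficit."""
    return (intake_kcal - expenditure_kcal) / KCAL_PER_KG_FAT

# A normal 2,200 kcal day plus two greasy pizzas (~2,400 kcal, guessed),
# for someone burning 2,500 kcal/day:
print(f"{surplus_to_fat_kg(2200 + 2400, 2500):.2f} kg stored")  # ~0.27 kg
```

Digestion speed changes when that energy reaches the bloodstream, not whether it does.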
  20. Scientists discover vast undersea freshwater reserves Dec 05, 2013 [Image: Australian researchers say they have identified vast reserves of fresh water trapped beneath the ocean floor off Australia, China, North America and South America] Australian researchers said Thursday they had established the existence of vast freshwater reserves trapped beneath the ocean floor which could sustain future generations as current sources dwindle. Lead author Vincent Post, from Australia's Flinders University, said that an estimated 500,000 cubic kilometres (120,000 cubic miles) of low-salinity water had been found buried beneath the seabed on continental shelves off Australia, China, North America and South Africa. "The volume of this water resource is a hundred times greater than the amount we've extracted from the Earth's sub-surface in the past century since 1900," said Post of the study, published in the latest edition of Nature. "Freshwater on our planet is increasingly under stress and strain so the discovery of significant new stores off the coast is very exciting. It means that more options can be considered to help reduce the impact of droughts and continental water shortages." UN Water, the United Nations' water agency, estimates that water use has been growing at more than twice the rate of population growth in the last century due to demands such as irrigated agriculture and meat production. More than 40 percent of the world's population already live in conditions of water scarcity. By 2030, UN Water estimates that 47 percent of people will exist under high water stress. Post said his team's findings were drawn from a review of seafloor water studies done for scientific or oil and gas exploration purposes. "By combining all this information we've demonstrated that the freshwater below the seafloor is a common finding, and not some anomaly that only occurs under very special circumstances," he told AFP. The deposits were formed over hundreds of thousands of years in the past, when the sea level was much lower and areas now under the ocean were exposed to rainfall which was absorbed into the underlying water table. When the polar icecaps started melting about 20,000 years ago these coastlines disappeared under water, but their aquifers remain intact—protected by layers of clay and sediment. Post said the deposits were comparable with the bore basins currently relied upon by much of the world for drinking water and would cost much less than seawater to desalinate. Drilling for the water would be expensive, and Post said great care would have to be taken not to contaminate the aquifers. He warned that they were a precious resource. "We should use them carefully: once gone, they won't be replenished until the sea level drops again, which is not likely to happen for a very long time," Post said.
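As a quick sanity check on the article's figures, inverting the stated ratio gives the implied historical extraction, and a unit conversion shows the scale. This is just arithmetic on the numbers quoted above, nothing from the paper itself:

```python
# Sanity-check the quoted figures.
RESERVE_KM3 = 500_000  # low-salinity water reported under continental shelves
RATIO = 100            # "a hundred times greater" than extraction since 1900

print(f"Implied extraction since 1900: {RESERVE_KM3 / RATIO:,.0f} km^3")  # 5,000 km^3

# 1 km^3 = 1e9 m^3 = 1e12 litres, so the reserve is about 5e17 litres.
print(f"Reserve: {RESERVE_KM3 * 1e12:.1e} litres")
```

The 120,000 cubic miles quoted in the article is consistent too: 500,000 km³ divided by roughly 4.17 km³ per cubic mile gives about 120,000.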
  21. Glass Beads Were Currency For Australian Aborigines Friday, December 06, 2013 [Image: (Mirani Litster and Daryl Wesley)] COFFS HARBOUR, AUSTRALIA—Australian Aborigines were using European glass beads as currency long before sustained contact with Europeans themselves, say Australian National University archaeologists Daryl Wesley and Mirani Litster. They have excavated 30 beads of European manufacture in the Arnhem Land region and think the artifacts were brought to the continent by Macassans, an Indonesian people known to have traveled to the area to harvest sea cucumbers. The Macassans could have traded the beads with the Aborigines, probably in return for access to land. While beads have been found at sites in the area before, it was thought they dated to after 1916, when European missionaries would have brought them to Arnhem Land. But the team found the beads in deposits that long predate the arrival of missionaries. Wesley says the discovery has implications for Aboriginal land claims, which in part are based on the idea that they negotiated with the Macassans for access to their traditional fishing grounds. Source.
  22. Chowing Down On Meat, Dairy Alters Gut Bacteria A Lot, And Quickly by Michaeleen Doucleff December 11, 2013 1:34 PM [Image: To figure out how diet influences the microbiome, scientists put volunteers on two extreme diets: one that included only meat, eggs and cheese and one that contained only grains, vegetables and legumes. Morgan Walker/NPR] Looks like Harvard University scientists have given us another reason to walk past the cheese platter at holiday parties and reach for the carrot sticks instead: Your gut bacteria will thank you. Switching to a diet packed with meat and cheese — and very few carbohydrates — alters the trillions of microbes living in the gut, scientists reported Wednesday in the journal Nature. The change happens quickly. Within two days, the types of microbes thriving in the gut shuffle around. And there are signs that some of these shifts might not be so good for your gut: One type of bacterium that flourishes under the meat-rich diet has been linked to inflammation and intestinal diseases in mice. "I mean, I love meat," says microbiologist Lawrence David, who contributed to the study and is now at Duke University. "But I will say that I definitely feel a lot more guilty ordering a hamburger ... since doing this work," he says. Scientists are just beginning to learn about how our decisions at the dinner table — or the drive-through — tweak our microbiome, that is, the communities of bacteria living in our bodies. But one thing is becoming clear: The critters hanging out in our intestine influence many aspects of our health, including weight and immunity. And interest in studying the links between diet and the microbiome is growing. Previous research in this field had turned up tantalizing evidence that eating fiber can alter the composition of gut bacteria. But these studies had looked at diets over long periods of time — months and even years. David and his colleagues wanted to know whether fiber — or lack of it — could alter gut bacteria more rapidly. To figure that out, the researchers got nine volunteers to go on two extreme diets for five days each. The first diet was all about meat and cheese. "Breakfast was eggs and bacon," David says. "Lunch was ribs and briskets, and then for dinner, it was salami and prosciutto with an assortment of cheeses." The volunteers had pork rinds for snacks. Then, after a break, the nine volunteers began a second, fiber-rich diet at the other end of the spectrum: It all came from plants. "Breakfast was granola cereal," David says. "For lunch, it was jasmine rice, cooked onions, tomatoes, squash, garlic, peas and lentils." Dinner looked similar, and the volunteers could snack on bananas and mangoes. "The animal-based diet is admittedly a little extreme," he says. "But the plant-based diet is one you might find in a developing country." David and the team analyzed the volunteers' microbiomes before, during and after each diet. And the effects of all that meat and cheese were immediately apparent. "The relative abundance of various bacteria species looked like it shifted within a day after the food hit the gut," David says. After the volunteers had spent about three days on each diet, the bacteria in the gut even started to change their behavior. "The kind of genes turned on in the microbes changed in both diets," he says.
In particular, microbes that "love bile" — the Bilophila — started to dominate the volunteers' guts during the animal-based diet. Bile helps the stomach digest fats. So people make more bile when their diet is rich in meat and dairy fats. A study last year found that blooms of Bilophila cause inflammation and colitis in mice. "But we didn't measure levels of inflammation in our subjects," David says. "That's the next step." Instead, he says, his team's data support the overall animal model that Bilophila promotes inflammation, which could ultimately be controlled by diet. "Our study is a proof of concept that you can modify the microbiome through diet," David says. "But we're still a long ways off from being able to manipulate the community in any kind of way that an engineer would be pleased about." Even just classifying Bilophila as "bad bacteria" is a tricky matter, says Dr. Purna Kashyap, a gastroenterologist at the Mayo Clinic in Minnesota. "These bacteria are members of a community that have lived in harmony with us for thousands of years," says Kashyap, who wasn't involved in the study. "You can't just pick out one member of this whole team and say it's bad. Most bacteria in the gut are here for our benefit, but given the right environment, they can turn on us and cause disease." Nevertheless, Kashyap thinks the Nature study is exciting because the findings unlock a potentially new avenue for treating intestinal diseases. "We want to look at diet as a way of treating patients," Kashyap says. "This study shows that short-term dietary interventions can change microbial composition and function." Of course, figuring out exactly how to do that will take much more research. "The paper has made the next leap in the field," Kashyap says. "With discovery comes responsibility. Once you make this big finding, it needs to be tested appropriately." Source: http://www.npr.org/blogs/thesalt/2013/12/10/250007042/chowing-down-on-meat-and-dairy-alters-gut-bacteria-a-lot-and-quickly Edit: seems there's a problem copying the links within the article from NPR. Best to read the article at the source.
  23. Food fuelled with fungi Ecologists are starting to appreciate the power of microbes to make crops hardier. Nicola Jones 10 December 2013 [Image: Maize could survive drought with the help of fungi. SAUL LOEB/AFP/GETTY IMAGES] With the planet’s population booming and climate change threatening traditional ‘bread-basket’ regions, researchers are seeking ways to squeeze more food from the land. Some are taking a sideways approach: instead of trying to produce hardier crops through breeding or genetic modification, they are manipulating the vast array of symbiotic microorganisms that live in plants. Next spring, Adaptive Symbiotic Technologies in Seattle, Washington, will bring to market the first commercial product that harnesses such microorganisms — known as endophytes — to improve crops. The company plans to sell a mixture of fungi for coating rice and maize (corn) seeds, which it says will produce crops with high yields and reduced water use even in harsh conditions. “It’s a real paradigm shift in plant ecology,” says company founder and plant biologist Rusty Rodriguez. “Up till now we have focused on plants as individuals, as we have with animals.” In the same way that biologists are now starting to understand the power and influence of the trillions of microbes living in and on the human body, ecologists are getting to grips with plant microbiomes. The result is powerful. Instead of having to find and introduce into a crop a single gene for a coveted trait such as salt tolerance, researchers can use a slew of interacting genes that comes pre-integrated in a living organism, such as a symbiotic fungus. Conventional breeding has helped to create varieties with increased tolerance to drought, but progress in introducing new genes through genetic manipulation has been slow. Despite decades of research, only one drought-tolerant genetically modified crop has been approved in the United States: Monsanto’s DroughtGard maize, which expresses a stress-response gene from bacteria. Although symbiotic plant–microbe relationships — such as those of the nitrogen-fixing bacteria that live in the roots of legumes — have been known for many decades, applied research in this field is relatively recent. Only in the 1970s did researchers realize that a fungus living in symbiosis with tall fescue grass was responsible for making cattle grazing on infected pastures ill. Scientists in New Zealand later discovered that some endophyte-ridden grasses, although poisonous to livestock, were resistant to attack by weevils. This spawned a niche industry that develops and markets endophyte-hosting turf varieties that repel pest attacks without being toxic to animals. Now some researchers are applying similar philosophies to food crops. The approach bucks a trend of sterilizing and simplifying crops, says Rodriguez. “Agriculture has spent the past century wiping out the microbes living in our plants, through pesticides and fertilizers. Now we’re trying to reverse that.” Endophyte researcher James White at Rutgers University in New Brunswick, New Jersey, agrees. “A lot of companies don’t think this way — they go for chemical control. They think the microbes get in the way,” he says.
“It’s not the paradigm that these microbes are significantly impacting plants. But they are.” There are thought to be millions of endophytic microbes in the world; only a fraction have been identified, and any given plant can host hundreds. Rodriguez’s work began by happy accident. In the early 2000s, while studying the dozen or so plant species that can survive at 50 °C in the hot soils near geothermal vents in Yellowstone National Park in Wyoming, he found that all of them carried a symbiotic fungus. Although neither the plants nor the fungi could tolerate soil temperatures of 40 °C by themselves, together they could (R. S. Redman et al. Science 298, 1581; 2002). Rodriguez and his colleagues later discovered that the fungi were easily transferable: they could grow in anything from watermelons to maize and confer heat- and drought-tolerance on those crops. “The endophytes somehow protect the plants from oxidation, so the plants don’t turn up all their stress defences,” says Rodriguez. Those findings led him to look for other endophytes optimized to tackle the problems likely to be caused to particular food crops by climate change (R. S. Redman et al. PLoS ONE 6, e14823; 2011). The result is a commercial mix of about half a dozen fungi that the team named BioEnsure. Field tests done or commissioned by the company show that, compared to untreated seeds, the product increased maize yields by 85% in Michigan during a 2012 drought, increased seed germination rates by two to five times during 5 °C cold snaps, and enabled maize to use one-third less water. In rice, the scientists saw yield increases of 3–6% in 2012 and 2013, despite drought and early-season planting when temperatures were cool. The crop also used 25–50% less water than normal. BioEnsure has been approved for use by the US Food and Drug Administration and the Department of Agriculture, and independent tests have shown the mixture to be non-toxic. Rodriguez plans eventually to produce targeted endophyte mixes for more crops, including soya beans, wheat, barley and sugar cane. But the question of whether BioEnsure will work in commercial conditions is hard to answer: although Adaptive Symbiotic Technologies’ field-test results are public, they have not been peer reviewed. Richard Richards, who leads research to breed better wheat for the Australian Commonwealth Scientific and Industrial Research Organisation’s plant industry division in Canberra, is dubious. “Typically, there is a metabolic cost of hosting an endophyte, so that crops with endophytes are likely to grow less and be less productive,” he says. Rodriguez counters that “in all the field work we’ve done over 15 years we haven’t seen anything suggesting metabolic cost”. Others are cautiously optimistic. Mogens Nicolaisen, who works with plant pathogens at Aarhus University in Denmark, thinks that endophytes could be a good way to help introduce resistance to both drought and disease, including pathogens such as wheat rust, an area that Rodriguez says he is pursuing. But, Nicolaisen adds, getting the endophytes into seeds and regulating their growth in different environmental conditions will be tricky. “It will be very hard to control,” he says. Source.
  24. I'll PM you that. There must be a way for me to get some seeds! Might need to attempt to contact BPTH Sumatera in Indonesia if I have no luck here, though with only a phone number to call, I foresee this being less than fruitful. Anyone have Rev's contact details?
  25. See another article on the topic here. That claim is not without its evidence. Firstly, the forkhead box protein P2 (FOXP2) gene is implicated in language. This was discovered in the 1990s, when a family in London with an inherited language disability (slurring) was found to have a defect in the FOXP2 gene. The human variant of the FOXP2 gene was recently found in the Neanderthal genome. Additionally, morphological evidence supports the idea: Neanderthals have larger hypoglossal canals, tongue bones similar to those of humans, and an inner ear tuned to human speech frequencies (2-4 kHz). Though this does not definitively tell us whether Neanderthals were able to speak, it does suggest that the genetic change underpinning speech arose before our split from them.