Sunday, September 30, 2012

Stop wasting money on Fish Oil


(Reuters) - Omega-3 fatty acids, found in oily fish such as sardines and salmon and once touted as a way of staving off heart disease and stroke, don't help after all, according to a Greek study.
Based on a review and analysis of previous clinical trials including more than 68,000 participants, Greek researchers whose report appeared in the Journal of the American Medical Association said the fatty acids have no impact on overall death rates, deaths from heart disease, or strokes and heart attacks.
Some examples of the fish oil products that contain Omega-3 fatty acids.

This was true whether they were obtained from supplements such as pills, or from fish in the diet, said the researchers, led by Moses Elisaf at the University Hospital of Ioannina.

"Overall, omega-3...supplementation was not associated with a lower risk of all-cause mortality, cardiac death, sudden death, myocardial infarction, or stroke based on relative and absolute measures of association," Elisef and his team wrote.
A decade ago, medical evidence suggested that boosting omega-3s, including the acids known as EPA and DHA, with food or supplements had a strong protective effect even though the mechanism wasn't understood.
Scientists cited improvements in levels of triglycerides - a type of fat in the blood - as well as blood pressure levels and heart rhythm disturbances.
But since then, the picture has grown clouded. Earlier this year, a group of Korean researchers found that omega-3 supplements had no effect on heart disease or death based on 20,000 participants in previous trials.
The current study pooled results of 18 clinical trials that assigned participants randomly to take either omega-3 supplements, or not. It also includes two trials in which people got dietary counseling to increase their consumption of omega-3 rich foods.
Because the trials in the Greek analysis went as far back as 1989, researchers also considered whether growing use of statins and other medications could explain why later studies failed to support the earlier findings. But Elisaf and his team said that wasn't the case.
Because people who eat a lot of fish have been found to have less heart disease, researchers figured that perhaps putting the supposed "active ingredients" in a pill could provide similar benefits, said Alice Lichtenstein, director of the Cardiovascular Nutrition Laboratory at Tufts University in Boston.
"What we have learned over the years is you can't think about individual nutrients in isolation," she added.
People who eat fish often may be replacing things like steak, hamburgers or quiche, making for a healthier diet.
Instead of supplements, Lichtenstein recommended eating fish at least twice a week, having a diet rich in whole grains and vegetables, getting lots of physical activity, and not smoking.

Wednesday, August 22, 2012

Sitting straight Bad for Backs

Source

Sitting up straight is not the best position for office workers, a study has suggested.

Scottish and Canadian researchers used a new form of magnetic resonance imaging (MRI) to show it places an unnecessary strain on your back.

They told the Radiological Society of North America that the best position in which to sit at your desk is leaning back, at about 135 degrees.

Experts said sitting was known to contribute to lower back pain.

Data from the British Chiropractic Association says 32% of the population spends more than 10 hours a day seated.





Half do not leave their desks, even to have lunch.

Two thirds of people also sit down at home when they get home from work.

Spinal angles

The research was carried out at Woodend Hospital in Aberdeen.

Twenty-two volunteers with healthy backs were scanned using a positional MRI machine, which allows patients the freedom to move - so they can sit or stand - during the test.

Traditional scanners mean patients have to lie flat, which may mask causes of pain that stem from different movements or postures.

In this study, the patients assumed three different sitting positions: a slouching position, in which the body is hunched forward as if leaning over a desk or a video game console; an upright 90-degree sitting position; and a "relaxed" position in which they leaned back at 135 degrees while their feet remained on the floor.

The researchers then took measurements of spinal angles and spinal disk height and movement across the different positions.

Spinal disk movement occurs when weight-bearing strain is placed on the spine, causing the disk to move out of place.

Disk movement was found to be most pronounced with a 90-degree upright sitting posture.

It was least pronounced with the 135-degree posture, suggesting less strain is placed on the spinal disks and associated muscles and tendons in a more relaxed sitting position.

The "slouch" position revealed a reduction in spinal disk height, signifying a high rate of wear and tear on the lowest two spinal levels.

When they looked at all the test results, the researchers said the 135-degree position was the best for backs, and suggested this is how people should sit.

'Tendency to slide'

Dr Waseem Bashir of the Department of Radiology and Diagnostic Imaging at the University of Alberta Hospital, Canada, who led the study, said: "Sitting in a sound anatomic position is essential, since the strain put on the spine and its associated ligaments over time can lead to pain, deformity and chronic illness."

Rishi Loatey of the British Chiropractic Association said: "One in three people suffer from lower back pain and to sit for long periods of time certainly contributes to this, as our bodies are not designed to be so sedentary."

Levent Caglar from the charity BackCare, added: "In general, opening up the angle between the trunk and the thighs in a seated posture is a good idea and it will improve the shape of the spine, making it more like the natural S-shape in a standing posture.

"As to what is the best angle between thigh and torso when seated, reclining at 135 degrees can make sitting more difficult as there is a tendency to slide off the seat: 120 degrees or less may be better."

Thursday, June 21, 2012

Fake Food Everywhere!


Sources here and here

I can't believe the food we have these days is not real

For example

1) Honey.

Pretty much all the major players buy their honey from China. Chinese honey frequently has all of the pollen filtered out of it to disguise its origin, and it's then cut like back-alley cocaine with cheap corn syrup and artificial sweeteners. The FDA says that a substance can't legally be called "honey" if it contains no pollen, and yet most of the stuff tested from the main retailers contained not a trace of it.

2) Soy Sauce

Proper soy sauce takes a pretty long time to make, so many manufacturers have started producing an imitation product that takes only three days to make and has a longer shelf life. It is made from something called "hydrolyzed vegetable proteins," as well as caramel coloring, salt, and our good old friend corn syrup. Most of the soy sauce that you get in packets with your sushi is actually this fake stuff.

3) Wasabi

Speaking of soy sauce, the wasabi is fake too. It's usually made from horseradish mixed with mustard.

4) Saffron

A lot of "top-quality" saffron consists of roughly 10 percent actual saffron. The rest is just random, worthless plant bits, ground up and mixed with the real thing. Quite often, what you are actually using is  saffron-flavored gelatin

5) Olive Oil

As crazy as it sounds, olive oil piracy is one of the Italian Mafia's most lucrative enterprises, to the extent that it appears that most olive oil on the market is either greatly diluted or completely forged by a massive shadow industry that involves major names such as Bertolli.

They've been at it for a while, too -- Joe Profaci, said to be one of the real-life dons who inspired the character of Don Vito Corleone in The Godfather, was known by the moniker of "The Olive Oil King." But evidence suggests that olive oil racketeering has been a major problem in the world for centuries. Hell, the ancient Sumerians had a fraud squad for shady olive oil peddlers.

Today, the stuff that is pawned off to us as quality olive oil is often just a tiny amount of the real thing, mixed with up to 80 percent of ordinary, less than healthy, cheap as muck sunflower oil. That is, if you're getting any olive oil at all. In fact, we're so used to shitty olive oil that apparently food connoisseurs reject the real stuff because it tastes fake to them.

But why would anyone bother? It's freaking olive oil. How much money can there be in it when you can get a bottle for a few bucks at the grocery store? It turns out that, profit-wise, shady olive oil is comparable to cocaine trafficking.

6) Cheese
And sure enough, if you look at people who sell actual cheese, you will find on their ingredient lists all sorts of words like "milk" and "milkfat" and "cheese cultures."

But sitting right in the same aisle with the actual cheese you'll see packages of "American slices."

Note the careful omission of the word "cheese" from those packages. These "pasteurized processed sandwich slices" are to cheese what a hobo is to, you know, someone with a home. The way the supplier's website puts it, the product "...resembles a Processed American Cheese in certain food applications."

7) Maple Syrup

Most brands are just water, high-fructose corn syrup, caramel coloring and various chemicals. Yeah, you could pretty much make the shit yourself in about five minutes. Though at least the Log Cabin people have switched from high-fructose corn syrup to actual sugar.

8) Strawberry Flavoring

Strawberry flavoring (like the kind you get in fast food milkshakes) is a mix of about 50 separate chemicals and none of them have berry in the name. They include:

Amyl acetate, amyl butyrate, amyl valerate, anethol, anisyl formate, benzyl acetate, benzyl isobutyrate, butyric acid, cinnamyl isobutyrate, cinnamyl valerate, cognac essential oil, diacetyl, dipropyl ketone, ethyl acetate, ethyl amyl ketone, ethyl butyrate, ethyl cinnamate, ethyl heptanoate, ethyl heptylate, ethyl lactate, ethyl methylphenylglycidate, ethyl nitrate, ethyl propionate, ethyl valerate, heliotropin, hydroxyphenyl-2-butanone (10 percent solution in alcohol), a-ionone, isobutyl anthranilate, isobutyl butyrate, lemon essential oil, maltol, 4-methylacetophenone, methyl anthranilate, methyl benzoate, methyl cinnamate, methyl heptine carbonate, methyl naphthyl ketone, methyl salicylate, mint essential oil, neroli essential oil, nerolin, neryl isobutyrate, orris butter, phenethyl alcohol, rose, rum ether, g-undecalactone, vanillin, and solvent.

But it does have orris butter, the most delicious butter you can squeeze from an orris, whatever the fuck that is.

It actually could be worse. For example:

1) Pepper

In China, there has been pepper sold that was made entirely of mud.

2) Vodka

Bootleg vodka production is rampant the world over, and bottles that look completely legit on the shelf can contain huge amounts of methanol, which is a kind of alcohol, true, but it's the kind that's used as race car fuel and antifreeze.

3) Flour

In certain parts of Africa, flour from markets is routinely cut with things like alum, chalk, Plaster of Paris and mashed potatoes.

4) Pork Dumplings

Last year, news outlets in China claimed that street vendors selling pork dumplings were actually stuffing them with wet cardboard flavored with pork juice.

5) Curry Powder

Occasionally Indian spices are doctored with substances ranging from lead chromate, which improves color, to sawdust, which makes them bulkier, to actual dried cow shit, which, if it improves anything, really speaks poorly for the quality of the spice to begin with.

Thursday, March 29, 2012

Chocolate 'may help keep people slim'


Source
People who eat chocolate regularly tend to be thinner, new research suggests.
The findings come from a study of nearly 1,000 US people that looked at diet, calorie intake and body mass index (BMI) - a measure of obesity.
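(As an aside, not from the original report: BMI is simply weight in kilograms divided by the square of height in metres. A minimal, illustrative Python sketch, using the standard WHO "normal" range rather than any figures from the study:)

def bmi(weight_kg, height_m):
    # Body mass index = weight (kg) / height (m) squared
    return weight_kg / (height_m ** 2)

# Example: a 70 kg person who is 1.75 m tall
print(round(bmi(70, 1.75), 1))  # 22.9, inside the 18.5-24.9 "normal" range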
It found those who ate chocolate a few times a week were, on average, slimmer than those who ate it occasionally.
Even though chocolate is loaded with calories, it contains ingredients that may favour weight loss rather than fat synthesis, scientists believe.
Despite boosting calorie intake, regular chocolate consumption was related to lower BMI in the study, which is published in Archives of Internal Medicine.
The link remained even when other factors, like how much exercise individuals did, were taken into account.
And it appears it is how often you eat chocolate that is important, rather than how much of it you eat. The study found no link with quantity consumed.
According to the researchers, there is only one chance in a hundred that their findings could be explained by chance alone.
But the findings only suggest a link - not proof that one factor causes the other.
Lead author Dr Beatrice Golomb, from the University of California at San Diego, said: "Our findings appear to add to a body of information suggesting that the composition of calories, not just the number of them, matters for determining their ultimate impact on weight."
This is not the first time scientists have suggested that chocolate may be healthy for us.
Other studies have claimed chocolate may be good for the heart.
Consumption of certain types of chocolate has been linked to some favourable changes in blood pressure, insulin sensitivity and cholesterol level.
And chocolate, particularly dark chocolate, does contain antioxidants which can help to mop up harmful free radicals - unstable chemicals that can damage our cells.
Dr Golomb and her team believe that antioxidant compounds, called catechins, can improve lean muscle mass and reduce weight - at least studies in rodents would suggest this might be so.
Mice fed for 15 days with epicatechin (present in dark chocolate) had improved exercise performance and observable changes to their muscle composition.
They say clinical trials are now needed in humans to see if this is the case.
But before you reach for a chocolate bar, there are still lots of unanswered questions. And in the absence of conclusive evidence, experts advise caution.
While there's no harm in allowing yourself a treat like chocolate now and again, eating too much might be harmful because it often contains a lot of sugar and fat too.

Friday, February 10, 2012

Steve Jobs an Inventor?


Source

The real genius of Steve Jobs.


November 14, 2011




Jobs’s sensibility was more editorial than inventive. “I’ll know it when I see it,” he said.

Not long after Steve Jobs got married, in 1991, he moved with his wife to a nineteen-thirties, Cotswolds-style house in old Palo Alto. Jobs always found it difficult to furnish the places where he lived. His previous house had only a mattress, a table, and chairs. He needed things to be perfect, and it took time to figure out what perfect was. This time, he had a wife and family in tow, but it made little difference. “We spoke about furniture in theory for eight years,” his wife, Laurene Powell, tells Walter Isaacson, in “Steve Jobs,” Isaacson’s enthralling new biography of the Apple founder. “We spent a lot of time asking ourselves, ‘What is the purpose of a sofa?’ ”

It was the choice of a washing machine, however, that proved most vexing. European washing machines, Jobs discovered, used less detergent and less water than their American counterparts, and were easier on the clothes. But they took twice as long to complete a washing cycle. What should the family do? As Jobs explained, “We spent some time in our family talking about what’s the trade-off we want to make. We ended up talking a lot about design, but also about the values of our family. Did we care most about getting our wash done in an hour versus an hour and a half? Or did we care most about our clothes feeling really soft and lasting longer? Did we care about using a quarter of the water? We spent about two weeks talking about this every night at the dinner table.”

Steve Jobs, Isaacson’s biography makes clear, was a complicated and exhausting man. “There are parts of his life and personality that are extremely messy, and that’s the truth,” Powell tells Isaacson. “You shouldn’t whitewash it.” Isaacson, to his credit, does not. He talks to everyone in Jobs’s career, meticulously recording conversations and encounters dating back twenty and thirty years. Jobs, we learn, was a bully. “He had the uncanny capacity to know exactly what your weak point is, know what will make you feel small, to make you cringe,” a friend of his tells Isaacson. Jobs gets his girlfriend pregnant, and then denies that the child is his. He parks in handicapped spaces. He screams at subordinates. He cries like a small child when he does not get his way. He gets stopped for driving a hundred miles an hour, honks angrily at the officer for taking too long to write up the ticket, and then resumes his journey at a hundred miles an hour. He sits in a restaurant and sends his food back three times. He arrives at his hotel suite in New York for press interviews and decides, at 10 P.M., that the piano needs to be repositioned, the strawberries are inadequate, and the flowers are all wrong: he wanted calla lilies. (When his public-relations assistant returns, at midnight, with the right flowers, he tells her that her suit is “disgusting.”) “Machines and robots were painted and repainted as he compulsively revised his color scheme,” Isaacson writes, of the factory Jobs built, after founding NeXT, in the late nineteen-eighties. “The walls were museum white, as they had been at the Macintosh factory, and there were $20,000 black leather chairs and a custom-made staircase. . . . He insisted that the machinery on the 165-foot assembly line be configured to move the circuit boards from right to left as they got built, so that the process would look better to visitors who watched from the viewing gallery.”
Isaacson begins with Jobs’s humble origins in Silicon Valley, the early triumph at Apple, and the humiliating ouster from the firm he created. He then charts the even greater triumphs at Pixar and at a resurgent Apple, when Jobs returns, in the late nineteen-nineties, and our natural expectation is that Jobs will emerge wiser and gentler from his tumultuous journey. He never does. In the hospital at the end of his life, he runs through sixty-seven nurses before he finds three he likes. “At one point, the pulmonologist tried to put a mask over his face when he was deeply sedated,” Isaacson writes:
Jobs ripped it off and mumbled that he hated the design and refused to wear it. Though barely able to speak, he ordered them to bring five different options for the mask and he would pick a design he liked. . . . He also hated the oxygen monitor they put on his finger. He told them it was ugly and too complex.

One of the great puzzles of the industrial revolution is why it began in England. Why not France, or Germany? Many reasons have been offered. Britain had plentiful supplies of coal, for instance. It had a good patent system in place. It had relatively high labor costs, which encouraged the search for labor-saving innovations. In an article published earlier this year, however, the economists Ralf Meisenzahl and Joel Mokyr focus on a different explanation: the role of Britain’s human-capital advantage—in particular, on a group they call “tweakers.” They believe that Britain dominated the industrial revolution because it had a far larger population of skilled engineers and artisans than its competitors: resourceful and creative men who took the signature inventions of the industrial age and tweaked them—refined and perfected them, and made them work.

In 1779, Samuel Crompton, a retiring genius from Lancashire, invented the spinning mule, which made possible the mechanization of cotton manufacture. Yet England’s real advantage was that it had Henry Stones, of Horwich, who added metal rollers to the mule; and James Hargreaves, of Tottington, who figured out how to smooth the acceleration and deceleration of the spinning wheel; and William Kelly, of Glasgow, who worked out how to add water power to the draw stroke; and John Kennedy, of Manchester, who adapted the wheel to turn out fine counts; and, finally, Richard Roberts, also of Manchester, a master of precision machine tooling—and the tweaker’s tweaker. He created the “automatic” spinning mule: an exacting, high-speed, reliable rethinking of Crompton’s original creation. Such men, the economists argue, provided the “micro inventions necessary to make macro inventions highly productive and remunerative.”


Was Steve Jobs a Samuel Crompton or was he a Richard Roberts? In the eulogies that followed Jobs’s death, last month, he was repeatedly referred to as a large-scale visionary and inventor. But Isaacson’s biography suggests that he was much more of a tweaker. He borrowed the characteristic features of the Macintosh—the mouse and the icons on the screen—from the engineers at Xerox PARC, after his famous visit there, in 1979. The first portable digital music players came out in 1996. Apple introduced the iPod, in 2001, because Jobs looked at the existing music players on the market and concluded that they “truly sucked.” Smart phones started coming out in the nineteen-nineties. Jobs introduced the iPhone in 2007, more than a decade later, because, Isaacson writes, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to.” The idea for the iPad came from an engineer at Microsoft, who was married to a friend of the Jobs family, and who invited Jobs to his fiftieth-birthday party. As Jobs tells Isaacson:

This guy badgered me about how Microsoft was going to completely change the world with this tablet PC software and eliminate all notebook computers, and Apple ought to license his Microsoft software. But he was doing the device all wrong. It had a stylus. As soon as you have a stylus, you’re dead. This dinner was like the tenth time he talked to me about it, and I was so sick of it that I came home and said, “Fuck this, let’s show him what a tablet can really be.”


Even within Apple, Jobs was known for taking credit for others’ ideas. Jonathan Ive, the designer behind the iMac, the iPod, and the iPhone, tells Isaacson, “He will go through a process of looking at my ideas and say, ‘That’s no good. That’s not very good. I like that one.’ And later I will be sitting in the audience and he will be talking about it as if it was his idea.”

Jobs’s sensibility was editorial, not inventive. His gift lay in taking what was in front of him—the tablet with stylus—and ruthlessly refining it. After looking at the first commercials for the iPad, he tracked down the copywriter, James Vincent, and told him, “Your commercials suck.”

“Well, what do you want?” Vincent shot back. “You’ve not been able to tell me what you want.”
“I don’t know,” Jobs said. “You have to bring me something new. Nothing you’ve shown me is even close.”
Vincent argued back and suddenly Jobs went ballistic. “He just started screaming at me,” Vincent recalled. Vincent could be volatile himself, and the volleys escalated.
When Vincent shouted, “You’ve got to tell me what you want,” Jobs shot back, “You’ve got to show me some stuff, and I’ll know it when I see it.”

I’ll know it when I see it. That was Jobs’s credo, and until he saw it his perfectionism kept him on edge. He looked at the title bars—the headers that run across the top of windows and documents—that his team of software developers had designed for the original Macintosh and decided he didn’t like them. He forced the developers to do another version, and then another, about twenty iterations in all, insisting on one tiny tweak after another, and when the developers protested that they had better things to do he shouted, “Can you imagine looking at that every day? It’s not just a little thing. It’s something we have to do right.”
The famous Apple “Think Different” campaign came from Jobs’s advertising team at TBWA\Chiat\Day. But it was Jobs who agonized over the slogan until it was right:
They debated the grammatical issue: If “different” was supposed to modify the verb “think,” it should be an adverb, as in “think differently.” But Jobs insisted that he wanted “different” to be used as a noun, as in “think victory” or “think beauty.” Also, it echoed colloquial use, as in “think big.” Jobs later explained, “We discussed whether it was correct before we ran it. It’s grammatical, if you think about what we’re trying to say. It’s not think the same, it’s think different. Think a little different, think a lot different, think different. ‘Think differently’ wouldn’t hit the meaning for me.”
The point of Meisenzahl and Mokyr’s argument is that this sort of tweaking is essential to progress. James Watt invented the modern steam engine, doubling the efficiency of the engines that had come before. But when the tweakers took over, the efficiency of the steam engine swiftly quadrupled. Samuel Crompton was responsible for what Meisenzahl and Mokyr call “arguably the most productive invention” of the industrial revolution. But the key moment, in the history of the mule, came a few years later, when there was a strike of cotton workers. The mill owners were looking for a way to replace the workers with unskilled labor, and needed an automatic mule, which did not need to be controlled by the spinner. Who solved the problem? Not Crompton, an unambitious man who regretted only that public interest would not leave him to his seclusion, so that he might “earn undisturbed the fruits of his ingenuity and perseverance.” It was the tweaker’s tweaker, Richard Roberts, who saved the day, producing a prototype, in 1825, and then an even better solution in 1830. Before long, the number of spindles on a typical mule jumped from four hundred to a thousand. The visionary starts with a clean sheet of paper, and re-imagines the world. The tweaker inherits things as they are, and has to push and pull them toward some more nearly perfect solution. That is not a lesser task.

Jobs’s friend Larry Ellison, the founder of Oracle, had a private jet, and he designed its interior with a great deal of care. One day, Jobs decided that he wanted a private jet, too. He studied what Ellison had done. Then he set about to reproduce his friend’s design in its entirety—the same jet, the same reconfiguration, the same doors between the cabins. Actually, not in its entirety. Ellison’s jet “had a door between cabins with an open button and a close button,” Isaacson writes. “Jobs insisted that his have a single button that toggled. He didn’t like the polished stainless steel of the buttons, so he had them replaced with brushed metal ones.” Having hired Ellison’s designer, “pretty soon he was driving her crazy.” Of course he was. The great accomplishment of Jobs’s life is how effectively he put his idiosyncrasies—his petulance, his narcissism, and his rudeness—in the service of perfection. “I look at his airplane and mine,” Ellison says, “and everything he changed was better.”

The angriest Isaacson ever saw Steve Jobs was when the wave of Android phones appeared, running the operating system developed by Google. Jobs saw the Android handsets, with their touchscreens and their icons, as a copy of the iPhone. He decided to sue. As he tells Isaacson:

Our lawsuit is saying, “Google, you fucking ripped off the iPhone, wholesale ripped us off.” Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google’s products—Android, Google Docs—are shit.

In the nineteen-eighties, Jobs reacted the same way when Microsoft came out with Windows. It used the same graphical user interface—icons and mouse—as the Macintosh. Jobs was outraged and summoned Gates from Seattle to Apple’s Silicon Valley headquarters. “They met in Jobs’s conference room, where Gates found himself surrounded by ten Apple employees who were eager to watch their boss assail him,” Isaacson writes. “Jobs didn’t disappoint his troops. ‘You’re ripping us off!’ he shouted. ‘I trusted you, and now you’re stealing from us!’ ”

Gates looked back at Jobs calmly. Everyone knew where the windows and the icons came from. “Well, Steve,” Gates responded. “I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”

Jobs was someone who took other people’s ideas and changed them. But he did not like it when the same thing was done to him. In his mind, what he did was special. Jobs persuaded the head of Pepsi-Cola, John Sculley, to join Apple as C.E.O., in 1983, by asking him, “Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?” When Jobs approached Isaacson to write his biography, Isaacson first thought (“half jokingly”) that Jobs had noticed that his two previous books were on Benjamin Franklin and Albert Einstein, and that he “saw himself as the natural successor in that sequence.” The architecture of Apple software was always closed. Jobs did not want the iPhone and the iPod and the iPad to be opened up and fiddled with, because in his eyes they were perfect. The greatest tweaker of his generation did not care to be tweaked.
Perhaps this is why Bill Gates—of all Jobs’s contemporaries—gave him fits. Gates resisted the romance of perfectionism. Time and again, Isaacson asks Jobs about Gates, and Jobs cannot resist the gratuitous dig. “Bill is basically unimaginative,” Jobs tells Isaacson, “and has never invented anything, which I think is why he’s more comfortable now in philanthropy than technology. He just shamelessly ripped off other people’s ideas.”
After close to six hundred pages, the reader will recognize this as vintage Jobs: equal parts insightful, vicious, and delusional. It’s true that Gates is now more interested in trying to eradicate malaria than in overseeing the next iteration of Word. But this is not evidence of a lack of imagination. Philanthropy on the scale that Gates practices it represents imagination at its grandest. In contrast, Jobs’s vision, brilliant and perfect as it was, was narrow. He was a tweaker to the last, endlessly refining the same territory he had claimed as a young man.

As his life wound down, and cancer claimed his body, his great passion was designing Apple’s new, three-million-square-foot headquarters, in Cupertino. Jobs threw himself into the details. “Over and over he would come up with new concepts, sometimes entirely new shapes, and make them restart and provide more alternatives,” Isaacson writes. He was obsessed with glass, expanding on what he learned from the big panes in the Apple retail stores. “There would not be a straight piece of glass in the building,” Isaacson writes. “All would be curved and seamlessly joined. . . . The planned center courtyard was eight hundred feet across (more than three typical city blocks, or almost the length of three football fields), and he showed it to me with overlays indicating how it could surround St. Peter’s Square in Rome.” The architects wanted the windows to open. Jobs said no. He “had never liked the idea of people being able to open things. ‘That would just allow people to screw things up.’ ” 


Sunday, January 29, 2012

Forget global warming - it's Cycle 25 we need to worry about

Source


The supposed ‘consensus’ on man-made global warming is facing an inconvenient challenge after the release of new temperature data showing the planet has not warmed for the past 15 years.

The figures suggest that we could even be heading for a mini ice age to rival the 70-year temperature drop that saw frost fairs held on the Thames in the 17th Century.

Based on readings from more than 30,000 measuring stations, the data was issued last week without fanfare by the Met Office and the University of East Anglia Climatic Research Unit. It confirms that the rising trend in world temperatures ended in 1997.

A painting, dated 1684, by Abraham Hondius depicts one of many frost fairs on the River Thames during the mini ice age

Meanwhile, leading climate scientists yesterday told The Mail on Sunday that, after emitting unusually high levels of energy throughout the 20th Century, the sun is now heading towards a ‘grand minimum’ in its output, threatening cold summers, bitter winters and a shortening of the season available for growing food.

Solar output goes through 11-year cycles, with high numbers of sunspots seen at their peak.

We are now at what should be the peak of what scientists call ‘Cycle 24’ – which is why last week’s solar storm resulted in sightings of the aurora borealis further south than usual. But sunspot numbers are running at less than half those seen during cycle peaks in the 20th Century.

Analysis by experts at NASA and the University of Arizona – derived from magnetic-field measurements 120,000 miles beneath the sun’s surface – suggests that Cycle 25, whose peak is due in 2022, will be a great deal weaker still.


According to a paper issued last week by the Met Office, there is a 92 per cent chance that both Cycle 25 and those taking place in the following decades will be as weak as, or weaker than, the ‘Dalton minimum’ of 1790 to 1830. In this period, named after the meteorologist John Dalton, average temperatures in parts of Europe fell by 2C.

However, it is also possible that the new solar energy slump could be as deep as the ‘Maunder minimum’ (after astronomer Edward Maunder), between 1645 and 1715 in the coldest part of the ‘Little Ice Age’ when, as well as the Thames frost fairs, the canals of Holland froze solid.

The world average temperature from 1997 to 2012

Yet, in its paper, the Met Office claimed that the consequences now would be negligible – because the impact of the sun on climate is far smaller than that of man-made carbon dioxide. Although the sun’s output is likely to decrease until 2100, ‘This would only cause a reduction in global temperatures of 0.08C.’ Peter Stott, one of the authors, said: ‘Our findings suggest a reduction of solar activity to levels not seen in hundreds of years would be insufficient to offset the dominant influence of greenhouse gases.’

These findings are fiercely disputed by other solar experts.

‘World temperatures may end up a lot cooler than now for 50 years or more,’ said Henrik Svensmark, director of the Center for Sun-Climate Research at Denmark’s National Space Institute. ‘It will take a long battle to convince some climate scientists that the sun is important. It may well be that the sun is going to demonstrate this on its own, without the need for their help.’

He pointed out that, in claiming the effect of the solar minimum would be small, the Met Office was relying on the same computer models that are being undermined by the current pause in global warming.

CO2 levels have continued to rise without interruption and, in 2007, the Met Office claimed that global warming was about to ‘come roaring back’. It said that between 2004 and 2014 there would be an overall increase of 0.3C. In 2009, it predicted that at least three of the years 2009 to 2014 would break the previous temperature record set in 1998.

World solar activity cycles from 1749 to 2040

So far there is no sign of any of this happening. But yesterday a Met Office spokesman insisted its models were still valid.

‘The ten-year projection remains groundbreaking science. The period for the original projection is not over yet,’ he said.

Dr Nicola Scafetta, of Duke University in North Carolina, is the author of several papers that argue the Met Office climate models show there should have been ‘steady warming from 2000 until now’.

‘If temperatures continue to stay flat or start to cool again, the divergence between the models and recorded data will eventually become so great that the whole scientific community will question the current theories,’ he said.

He believes that as the Met Office model attaches much greater significance to CO2 than to the sun, it was bound to conclude that there would not be cooling. ‘The real issue is whether the model itself is accurate,’ Dr Scafetta said. 

Meanwhile, one of America’s most eminent climate experts, Professor Judith Curry of the Georgia Institute of Technology, said she found the Met Office’s confident prediction of a ‘negligible’ impact difficult to understand.

‘The responsible thing to do would be to accept the fact that the models may have severe shortcomings when it comes to the influence of the sun,’ said Professor Curry. As for the warming pause, she said that many scientists ‘are not surprised’.
Four hundred years of sunspot observations

She argued it is becoming evident that factors other than CO2 play an important role in rising or falling warmth, such as the 60-year water temperature cycles in the Pacific and Atlantic oceans.

‘They have insufficiently been appreciated in terms of global climate,’ said Prof Curry. When both oceans were cold in the past, such as from 1940 to 1970, the climate cooled. The Pacific cycle ‘flipped’ back from warm to cold mode in 2008 and the Atlantic is also thought likely to flip in the next few years.

Pal Brekke, senior adviser at the Norwegian Space Centre, said some scientists found the importance of water cycles difficult to accept, because doing so means admitting that the oceans – not CO2 – caused much of the global warming between 1970 and 1997.

The same goes for the impact of the sun – which was highly active for much of the 20th Century.

‘Nature is about to carry out a very interesting experiment,’ he said. ‘Ten or 15 years from now, we will be able to determine much better whether the warming of the late 20th Century really was caused by man-made CO2, or by natural variability.’

Meanwhile, since the end of last year, world temperatures have fallen by more than half a degree, as the cold ‘La Nina’ effect has re-emerged in the South Pacific.

‘We’re now well into the second decade of the pause,’ said Benny Peiser, director of the Global Warming Policy Foundation. ‘If we don’t see convincing evidence of global warming by 2015, it will start to become clear whether the models are bunk. And, if they are, the implications for some scientists could be very serious.’


