November 6, 2011

If Man's Greenhouse Gases Are The Cause of Global Warming, How Could the Warming Have Started in the 1600's?

Monique Chartier

... that would be 200 years before man started cranking up a whole array of back-saving, medicine-advancing, temperature-moderating, light-extending, food-preserving, comfort-creating evil inventions driven by an equally wicked fuel supply.

When discussing the theory of anthropogenic global warming, it's always a little dangerous to move away from the Central Data Point (man generates only 6% of greenhouse gases) and the Comtois Corollary (therefore, it is far too costly, in every sense of the word, to halt global warming, even if man's 6% is the tipping point). While my mind and heart are entirely in the sceptics' camp, it is easy to get overwhelmed by the data and proclamations of scientists on both sides of the argument, as they go back and forth with irrefutable proof refuting the latest rebuttal in the discussion.

However, last week on Watts Up With That, guest contributor Tony Brown posted a temperature chart which revealed a familiar trend - but at an intriguing start point.

Take several even longer steps backwards, through the medium of Central England temperature (CET), the oldest and most examined instrumental data set in the world, maintained by the UK's Met Office, and this puts GISS into further historic context, in that the temperature rise extends, with numerous advances and reverses, all the way to 1659.

Don't miss the plot of CO2 emissions at the bottom of his graph - completely flat from 1659 until 1845-ish, then a real spike starting around 1946; i.e., almost 300 years after the warming trend had started.

In the back and forth between warmists and sceptics, the accusation of "data cherry picking" has been thrown around a little too often. It would be easy to make it in this instance; however, the start-year of most AGW temperature-and-greenhouse-gas charts is probably not so much a case of cherry-picking but of simply neglecting to look back far enough to see when the trend started. "Let's see, man started generating greenhouse gases in the 1850's; let's look at the temperature trend from that point." "Sure! Makes sense."

Even if not a deliberate mistake, it appears to be a mistake nevertheless. With Brown's temperature chart, AGW advocates must now answer the most fundamental question about their theory:

If the planet started on a warming trend long before man's greenhouse gases arrived on the scene, isn't it possible that the "warming" in Anthropogenic Global Warming is really not "anthropogenic" at all?

Comments, although monitored, are not necessarily representative of the views of Anchor Rising's contributors or approved by them. We reserve the right to delete or modify comments for any reason.

"it is easy to get overwhelmed by the data and proclamations of scientists on both sides of the argument,"

Last week I picked up a copy of the Boston Globe and read a column on Global Warming. The author cited Jim Hansen, "noted NASA climatologist".

This name comes up so often that I Googled it some time ago. Jim Hansen is the same "NASA scientist" who was pushing the "coming ice age" in the 70's. At that time he had conclusive proof that the Earth was rapidly cooling and that we were entering another Ice Age. Those whose memories do not extend back to the 70's may be amazed to hear that the "coming Ice Age" was a cover story for both Time and Newsweek. It was all the rage for about a year.

Since Mr. Hansen's position on the "Coming Ice Age" is being back-pedalled, here is a tad

Posted by: Warrington Faust at November 6, 2011 7:36 PM

Aren't we talking about a little more than one degree Celsius, over a couple of hundred year span?

Do you think there's any chance that our modern thermometers are calibrated just a little more precisely today than they were in the nineteenth century, hmm?

Posted by: brassband at November 6, 2011 7:44 PM


Plus, Hansen does not have a degree in climatology or even a related field. This is his educational background:

B.A., Physics and Mathematics, 1963, University of Iowa
M.S., Astronomy, 1965, University of Iowa
Ph.D., Physics, 1967, University of Iowa


Yes, one degree Celsius over the last 170-ish years, which equates to approximately one hundredth of a degree Fahrenheit of temperature rise per year. That's another AGW data point that has been poorly publicized, presumably because it wouldn't sound scary enough to frighten people and push our elected officials into (misguided) action.
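That rate works out as claimed; here is a quick Python sketch, using the figures stated above (a 1°C total rise over roughly 170 years):

```python
# Convert a 1 degree C total rise over ~170 years to an annual rate in degrees F.
total_rise_c = 1.0                    # total warming, degrees Celsius
years = 170                           # approximate span since the mid-1800s
total_rise_f = total_rise_c * 9 / 5   # a 1 C *change* equals a 1.8 F change
rate_f_per_year = total_rise_f / years
print(round(rate_f_per_year, 4))      # about 0.0106, roughly a hundredth of a degree F per year
```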

Posted by: Monique at November 7, 2011 6:49 AM


In the 1960s, nobody had the hubris to think they could predict the climate. No climatology degrees existed, AFAIK. Those degrees Hansen has are actually a fine foundation for studying heat transport, which is the entire basis of weather and, longer term, climate. No sun, no radiative heat transfer, no weather, no climate.
What he has done with the degrees, that's another matter.

Posted by: chuckR at November 7, 2011 7:24 AM

Ugh, more from our local "science" expert on climate change. Meanwhile back on planet Earth, things get even worse...

"Global CO2 emissions rising faster than worst-case scenarios"

One of the small consolations of the Great Recession was that global greenhouse-gas emissions had dipped slightly, giving the world a few years’ breathing room to figure out how to tackle global warming. But the Copenhagen climate talks fizzled, the world didn’t take advantage of the lull, and the grace period’s now over. According to new data from the Department of Energy’s Oak Ridge National Lab, global carbon-dioxide emissions just saw their biggest one-year rise, a 6 percent jump in 2010.

The striking thing is that emissions are now rising faster than the worst-case scenarios envisioned by the IPCC in its 2007 report. What would this mean for global warming? The chart on the right, from a 2009 study by MIT’s Joint Program on the Science and Policy of Climate Change, lays out the possibilities. If emissions keep growing at their current pace, then the average prediction from MIT’s modeling is that the world could heat up 5.2°C by 2100. But that’s just the average. There’s a 9 percent chance that global surface temperatures could rise more than 7°C — truly uncharted territory. And as we keep adding carbon-dioxide into the air, the odds that we’ll be able to dodge a drastic rise in temperatures become very, very low.

Posted by: Russ at November 7, 2011 9:49 AM

"Facts are meaningless. You could use facts to prove anything that's even remotely true."

Posted by: Homer Simpson at November 7, 2011 10:28 AM

There's NO DOUBT in the scientific community that we're in a period that would be warming anyways because of the natural cycles of the sun and orbit that affect climate. The ice on Greenland and in the arctic never were permanent features of the planet; they come and go in 40K-100K year cycles.

That said, adding this much CO2 to the atmosphere will almost surely exacerbate the changes. Even if it doesn't cause temperature change, the particulate pollution, ocean acidification, and health effects (cancer) of fossil fuels should have us examining other sources of energy. Just because it's going to rain anyways doesn't mean you should defecate on your own sidewalk.

And Russ, I think the clear take-away from the ever-increasing carbon emissions is that even though we KNOW that it's going to have major harmful effects, we just can't stop burning fossil fuels. There's nothing we can do as an individual country about it, and we're not going to convince developing nations that they have to switch to much more expensive technologies for energy. It's just not going to happen.

We should be looking at mitigation strategies, not sticking our head in the sand and denying facts, or pretending that if we just made some changes there would be no problem. Climate change is inevitable, we need tools to mitigate and cope more than anything else.

Posted by: mangeek at November 7, 2011 10:36 AM

Every new generation of technology is substantially cleaner than the last. Top down government solutions can achieve very little to reduce carbon emissions, practically speaking, and grant money doled out by "progressive" bureaucracies to feel-good "green" firms has proven to be an utter waste and disaster thus far, which should not come as a surprise to those familiar with basic economic principles.

Although it is good to be aware of the issue, this is something that we will simply have to wait out because we don't have the technology to solve it at the moment. In 50 or 100 years, that could easily change. It's not immediately obvious that the world will be worse off if the average temperature increases by a couple of degrees in the meantime, and we will all be significantly worse off if government uses the situation as an excuse to impose all manner of draconian restrictions on the economy.

Posted by: Dan at November 7, 2011 10:55 AM

At least you folks won't be able to deny climate change when your houses are all underwater from Greenland's ice sheet melt. Until then, I've given up on arguing. It's like trying to argue with someone who believes dinosaurs and humans were frolicking through the fields at the same time in history... even though I do love me some Flintstones.

I've met Dr. Jim Hansen. He's not a crackpot, and he's not trying to make a buck. If you disagree with him, fine, completely your choice -- but stop trying to say he's unqualified, a quack, or stupid. It makes you look pretty bad yourself.

And if you think physics can't be related to climate science, you're really not even understanding what the terms "physics" and "climate science" even mean, Monique.

Until then, I just sit here and laugh a silent and sad laugh. Coal and Oil interests will continue to keep the market from achieving its own solution by applying monopoly power over distribution media, by exerting undue political influence, and by buying up intellectual property on green technologies they never actually plan to use. That's *exactly* when government needs to step in and allow the market to work properly again, by creating easier paths to market entry for green energy service providers.

Also, Dan, your first argument in your most recent post is so easily torn apart, I almost feel bad doing it... computers continued to get faster and faster in the 1980's and 1990's -- they also drew significantly more wattage, used more heavy metals such as lead in their production, and increased wasteful extra parts as we wanted our computers to "look cooler". Some companies bucked that trend, but by NO means the majority. Myth debunked.

Seriously though, when 90% or more of the world's scientists (across racial, national, and religious boundaries) agree on something, but you think you've got them beat -- how much hubris is that?

What's your degree in Monique? Climate science, right?

PS: This is one of those issues I get worked up about, because I worked on distortion of climate science by the Bush Administration for many years -- they censored scientific reports, they fired federal employees, and they blatantly used non-scientists to lie to the American people about science.

My apologies in advance for any undue offense. My general respect for the integrity of this blog remains.

Posted by: jparis at November 7, 2011 1:07 PM

"What's your degree in Monique? Climate science, right?"

Nope. Thanks for reinforcing one of the points of the post. It doesn't take a scientist to see that 6% is a very small number from the perspective of the planet and, therefore, the case that it is the tipping point needs to be damn strong.

As it turns out, the case is nothing like strong - just the opposite.

Speaking of which, seeing you're into this, JParis, tell me about the heart of AGW: the computer models. Tell me about their consistency one with the other and how they fare "predicting" actual conditions.

Posted by: Monique at November 7, 2011 1:26 PM

That's *exactly* when government needs to step in and allow the market to work properly again, by creating easier paths to market entry for green energy service providers.

Why yes, it works so well with Solyndra and the others that will fail to repay their loans. It works so well that we in RI need to heavily subsidize offshore wind turbines. And worse, we subsidize production, not the creation of the turbines themselves. Where is the incentive to become more cost-efficient? Just another CPFF government program. Venture socialism will always have a higher failure rate than venture capitalism, because of the addition of the political variable: "who do I need to repay for their support?" is added to "what are the chances I can make money on this?"

It would have been more interesting if Obama had offered a $550 million X Prize for the creation and demonstration of a cost-effective commercial green technology. Instead he gave that money to his campaign contributors.

Posted by: chuckR at November 7, 2011 1:58 PM

Monique, all I can do here is point you towards a major treatise on the issue that I helped edit at my time at the Union of Concerned Scientists. It's been many years and two jobs since then, and I don't want to go pulling numbers out of my butt (nor spending hours re-reading something I already read at least 20 times a couple years ago). I was convinced back then... I personally don't need to be re-convinced.

As for your point about what it does and doesn't take a scientist to see... a 6% change in your own body temperature (98.6 F) would translate to 104.5 F (assuming upward change) -- you'd be darned close to dead. So it really does matter that we ask the question "6% of what, and how tolerant is this system to that level of change?"

Meaning, maybe the Earth is tolerant of 6%, maybe it isn't. Your body isn't. So just citing the raw percentage and saying that it's small means absolutely nothing to me.
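The body-temperature comparison above works out as stated; a quick sketch, treating 6% as a straight percentage of the Fahrenheit reading, as the comment does:

```python
# A 6% increase applied directly to a normal body temperature of 98.6 F.
normal_f = 98.6
increased_f = normal_f * 1.06
print(round(increased_f, 1))  # 104.5, a dangerously high fever
```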

For the record, I do have a degree in Computer Science, and I believe the data we present here are both accurate and telling. Link ahoy!

Also, I'm including our report on how the government has systematically distorted climate change science:

UCS is many things, but it doesn't advocate for the election of anybody. It doesn't even advocate for a particular political agenda. The vast majority of our donors and members are in fact scientists (in fields other than climate science as well).

Here I am using "our" like I still work there. It was a wonderfully refreshing place to work in that science was always the focus.

Posted by: jpars at November 7, 2011 2:14 PM


I would *never* use Deepwater or Rhode Island policymaking in general as an example of what should be done in terms of green economic development, any more than I'd use Greece as a model of how to develop an industry other than tourism (and they are kinda failing at that now too).

However, your X-Prize idea would indeed do exactly what I stated: lower the barrier to entry for a new technology or implementation.

So we agree there are good ways government can help the market regain balance, and some very BAD ways as well. Deepwater? 38 Studios? Rhode Island has its head up its butt when it comes to spending what little "common good" money it has (and yes, Democratic Representative Socialism of varying degrees does deal with all that pesky common good stuff that industry has absolutely no incentive to be concerned with)

Posted by: jparis at November 7, 2011 2:22 PM

I agree with Mr Paris on this point, the "we're only responsible for 6% of the carbon dioxide" argument just doesn't hold any meaning...

Try increasing or decreasing the ocean's salinity by 6 percent, or your body temperature, or the intensity of sunlight. When you're talking about an established system of somewhat balanced factors, 6% can make a big difference. Remember that it only took a tiny percentage of people to default on their mortgages for those 'pools' of securities to stop returning value, triggering the CDO payoffs.

I'm not a climate alarmist by any means, but it's clear as day that fossil fuels aren't a sustainable long-term solution for billions of people to live on. Hansen is a rabid alarmist, calling climate change out as the 'end of humanity'. Really?! It won't be the 'end of humanity' if the weather patterns change and the sea levels rise a few feet every century.

Posted by: mangeek at November 7, 2011 2:30 PM

jparis - this is why I'm a convert to smaller government. Generally, they aren't competent at investing money (at least during the last few decades of my lifetime). Perhaps they could instead provide additional tax incentives to foster creation of an X prize from private funds instead. The reason the Feds guaranteed Solyndra and others' loans is because nobody in their right mind thought these companies had a chance in hell. A decent chance in hell is about all you need to get venture capital.

I wonder if the folks still without power in CT are upset at Obama for not spending a large chunk of the $0.75 trillion stimulus on electrical grid upgrades and repair instead of, of, of, God only knows what. But they are probably blinded by the Blue....

Also, the magnitude of that stimulus money would have guaranteed the capital investment in 100 nuclear power stations. It is brutally difficult to sink several billion into a non-producing asset for up to a decade. Lowering the interest rate would at least help. The difference between a Solyndra and a power station is that we know the latter will work and be reasonably cost effective once on line.

Posted by: chuckR at November 7, 2011 2:48 PM

Russ - My statement is "so easily torn apart" by you because your "refutation" is full of fallacies and intellectual dishonesty.

computers continued to get faster and faster in the 1980's and 1990's -- they also drew significantly more wattage, used more heavy metals such as lead in their production, and increased wasteful extra parts as we wanted our computers to "look cooler". Some companies bucked that trend, but by NO means the majority. Myth debunked

First, you inexplicably focus on a single product that isn't even very power-intensive instead of looking at the cumulative effect, which was the topic of my statement. Second, you exclusively focus on the downsides of certain newer computers (certainly not all) and totally ignore the exponential increases in output that outweigh those downsides many times over. Third, you ignore that every other part of the equation has also improved by leaps and bounds - power is far cleaner than it used to be, buildings are more energy efficient, vehicles are more energy efficient, etc., etc. Pollution is far less of a problem in this country than it was decades ago and you have predominantly technology to thank for that improvement.

Posted by: Dan at November 7, 2011 4:47 PM

That was me, Dan, not Russ. I know all us lefties look alike (really, just kidding there).

What part of the world do you live in where a 500W power supply for today's average desktop computer "isn't very power intensive"? Sorry, but when people can't put food on the table, running 500W all day every day (many people don't turn their computers off), that adds up. Dispute that number all you want, but I've got 7 years of IT management experience and 5 years of policy experience on top of that -- this is *one* area where I don't doubt myself.

No, MOST new computers use more power than the computers a generation before them -- they have more peripherals, use faster-spinning hard drives, and have video processing units designed to stream full-motion video (something that generates a lot of heat and demands a lot of cooling and power).

From Moore's Law: transistor dimensions are scaled by 30% (0.7x) every technology generation, thus reducing their area by 50%. This reduces the delay (0.7x) and therefore increases operating frequency by about 40% (1.4x). Finally, to keep electric field constant, voltage is reduced by 30%, reducing energy by 65% and power (at 1.4x frequency) by 50%, since active power = CV²f. Therefore, in every technology generation transistor density doubles, the circuit becomes 40% faster, while power consumption (with twice the number of transistors) stays the same.

(Source there is wikipedia, but Google "Moore's Law" and you'll find another acceptable source if you wish) -- point being, processors have NOT used less power over time -- they have stayed the same while other components of computers have used more power.
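The scaling rules in that quote can be multiplied out to confirm the "power stays the same" conclusion; a sketch of the classic Dennard-scaling arithmetic, using the 0.7x and 1.4x factors given above:

```python
# Dennard-style scaling per technology generation, per the quoted figures.
dim_scale = 0.7                # linear dimensions shrink 30%
area_scale = dim_scale ** 2    # transistor area: ~0.49, i.e. ~50% smaller
freq_scale = 1.4               # operating frequency up ~40%
volt_scale = 0.7               # supply voltage down 30%
cap_scale = dim_scale          # capacitance scales with linear dimension

# Active power per transistor: P = C * V^2 * f
power_per_transistor = cap_scale * volt_scale ** 2 * freq_scale  # ~0.48 of previous generation
# Transistor density doubles, so total chip power is roughly unchanged:
total_power = 2 * power_per_transistor  # ~0.96, close to 1.0
print(round(power_per_transistor, 2), round(total_power, 2))  # 0.48 0.96
```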

And we weren't talking about how much cooler looking, more capable, and faster computers have gotten... we were talking about their power use. If every industry was happy with the same power-usage equation as the computing industry, we'd actually be kinda screwed. So really -- please don't try to "win" on this one, because the facts are the facts.

Now if you want to look at the system as a whole? I was just providing one counter-example that definitely does hold true. Let's look at the system.

I assume you'll agree the DOE's Energy Information Administration isn't going to be ultra-liberal on this issue.

On the surface, 8% of our power is renewable... but wait, 35% of that is hydropower, most of which has been around for generations. You see any new dams being built recently in the US?

Another 24% is "wood". So burning wood. Yeah, pretty sure that's not a "new technology".

So 59% of the 8% is accounted for by dams and burning wood (leaving about 4% actually solar, wind, biomass, and other newer technologies)

And vehicles? They are more energy efficient due to no foresight or innovation on the part of Detroit... that was Korea and Japan leading the way, and when our homegrown companies almost went chapter 11, we finally wised up in the last few years.

So yeah, I'm happy about limited progress, because even 4% of our energy portfolio being newer, cleaner, renewables is a good thing -- but it isn't enough, and more importantly, there's no proof that industry will do this on its own in anything other than a perfunctory manner.

In fact, there's significant evidence to prove they'll lobby to continue to slow it down. People have jobs because of Coal, dirtiest of fuels that it is.

Posted by: jparis at November 7, 2011 5:32 PM

"Pollution is far less of a problem in this country than it was decades ago and you have predominantly technology to thank for that improvement."

I'm not so sure about that. Rhode Island has gone from an 'inefficient, dirty' place to an 'efficient, clean' place, at least on paper. Most of the change has been because of the disappearance of the 'dirty' industries to faraway lands.

We've just externalized a lot of mining, refining, production, and outsourced the jobs.

Posted by: mangeek at November 7, 2011 5:49 PM

"it works so well with Solyndra and the others that will fail to repay their loans. It works so well that we in RI need to heavily subsidize offshore wind turbines."

Quite simply, green energy does not work without a government mandate (imposing distinctly non-market prices) and/or a government subsidy.

I have no stake in this other than a desire not to pay unnecessarily high prices to heat or light my home or drive my car. I will celebrate with the most extreme of the greenies when the Magical, Mystery Fuel has been discovered.

But neither wind nor solar nor wave nor palm tree oil nor any green source currently out there is that fuel. Nor will these fuels ever be rendered affordable or feasible by gov't mandates, a gov't "green energy" program or, most absurdly, a "green jobs" initiative. It's a flat out rip off of the taxpayer, as Solyndra has so vividly demonstrated.

Posted by: Monique at November 7, 2011 5:56 PM

Sorry, jparis. Shutting down Russ is just a reflex at this point.

I don't dispute your figures on increased energy usage per computer because that is not my expertise, but as I stated earlier, I think what you are missing is that a computer can do a job today that would require 10 computers running 10 times as long a decade ago. In this specific situation, the cost is fixed and obvious while the benefits are dispersed, cascading, and difficult to measure, but they most likely outweigh the costs. For example: productivity increases, decreased labor costs, efficient outsourcing and teleworking.

Coal is dirty, but it IS getting cleaner. Coal plants aren't the smog-belching smokestacks that they used to be. My opinion is that we need to double down on nuclear power for a variety of reasons, but technological advances in coal have decreased the pollution it releases by leaps and bounds on both ends. Don't get me started on wind and solar energy - grossly inefficient wastes of money, both.

Efficient vehicles, homes, and devices have helped a lot. Washing machines, lamps, refrigerators and the like eat up much less power. American cars sucked for a long time and their labor costs were ludicrous - they should have gone bankrupt and they didn't deserve a bailout.

Posted by: Dan at November 7, 2011 7:01 PM

"What part of the world do you live in where a 500W power supply for today's average desktop computer "isn't very power intensive?"

At the risk of hijacking this thread, if you're buying 500W power supplies for desktops as an admin or IT manager, you're getting ripped off.

Posted by: Max D at November 7, 2011 7:12 PM

Posted by jparis "At least you folks won't be able to deny climate change when your houses are all underwater from Greenland's ice sheet melt."

In relatively recent times, the age of Viking exploration, Greenland was in large part arable. The sheet ice is a relatively recent situation (the beginning is roughly coterminous with the "Little Ice Age" in Europe) which caused the Norse to abandon their settlements in Greenland. I somehow doubt that New England was underwater in that period. The Norse managed to settle there too. (Did anyone besides me attend the visit of the Viking longboats to Providence? That was about 8 years ago.)

Posted by jparis "my time at the Union of Concerned Scientists."

The very name suggests that there are scientists who are not "concerned". What is their reason for apathy? Do the "unconcerned" scientists have a viable thesis contrary to those who are "concerned"?

Posted by: Warrington Faust at November 7, 2011 8:00 PM

@ Warrington: Oh, let's not be totally silly about the Norse situation. They got sold a bill of goods based upon a name, and mostly, got swindled. When you say "in large part arable", you mean how much? How much have the glaciers really grown in that few hundred years?

Actually, the Union of Concerned Scientists isn't a union at all (501 c3 and c4 fund sources), and the only reason the members consider themselves concerned is because they tend to look at science as part of a larger human context (some cases including politics).

Scientists who aren't "concerned" (sorry, have to laugh -- I got asked this question at conferences all the time: "So, why are you concerned?") -- they may be brilliant folks but just don't care about politics, or don't think their work has anything to do with the larger context of the world.

That's fine, the world has need of tons of those people, and that doesn't mean they don't believe in the scientific method or they deny climate change exists.

@ Max D: I'd agree with you 100% if all I did was blog and email. There are a lot of people out there who:

A) Don't know enough about computers to not get ripped off in that regard if they don't need the power

B) Always want "the best" and buy it regardless of price.

C) Program, use CAD, play video games, use a home scanner or other odd peripherals -- basically actually need the power.

(see, you didn't hijack, just side-track)

@ Dan: You do have me at a mathematical disadvantage if every computer is running at 100% (or even high) capacity all the time.

Assuming those newer power supplies are even a conservative 350W, compare that to the 125W I had powering my first 286-SX25. They run at 100% when they are turned on, and then filter power from there... there's really no other way to make a computer work quickly; it needs power on-demand.

So unless the computer is run at 100%, that efficiency you mentioned is lessened and possibly erased for folks who just use email and such (most of the population).

Posted by: jparis at November 7, 2011 8:28 PM

Posted by jparis "@ Warrington: Oh, let's not be totally silly about the Norse situation. They got sold a bill of goods based upon a name, and mostly, got swindled. When you say "in large part arable", you mean how much? How much have the glaciers really grown in that few hundred years?"

The Norse Vikings arrived in Greenland around 980 AD, during a 300-year-long warm period. Starting around 1100, however, the climate cooled rapidly. Average temperature dropped 4 degrees Celsius (7 degrees Fahrenheit) in only 80 years. Since the Norse—Viking fame aside—were largely farmers, this drop in temperature likely hit them hard. They began to leave Greenland shortly thereafter, and by the mid-15th century their settlements lay abandoned.

It seems there is nothing unusual about "warming" in Greenland. Continuing archeology finds more ancient farms there. Since Greenland is 1700 miles long and 700 miles wide, the amount "arable" in Viking times is yet to be determined. It is worth noting that tropical growth is found under the ice in Antarctica. More interesting is that there are ancient maps which show Antarctica, with startling accuracy, without ice. "Scientists" accept the age of the maps, but are unable to explain how they were derived.

Posted by: Warrington Faust at November 7, 2011 11:12 PM

Hey, I work in I.T. too, and part of my job is energy management.

It's true that modern PCs can use more power than they did in the 1980s and 1990s, but the usage profile has changed, and in most configurations overall energy usage is actually much lower now.

I'm writing this on a mid-range workstation-class box: quad-core CPU, 8GB RAM, three hard drives, and a video card that can push more bits per second than all the computers I've ever owned put together... The power supply is rated for 350 Watts, but even at full-blast it only consumes about 140W, and the giant flat panel display uses a lot less juice than the old CRTs did. The actual consumption for the system at idle is about 60W. Older machines tended to run 'full blast' regardless of if the machine was idle or not; this one literally puts CPU cores to sleep between keyclicks.

So be careful when you compare the amperage rating on the back of the case to what's actually being used... Older PCs may have had 160W power supplies, but they consumed 120W 24 hours a day. Today's PC might have a 350W power supply, but it's asleep 16 hours a day at 6W and only really burning 85W during the other eight hours.
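Using the figures in that comparison, the daily energy budget comes out heavily in the modern PC's favor; a sketch, where the wattages and hours are the ones assumed above:

```python
# Daily energy use, in watt-hours, from the example figures in the comment.
old_pc_wh = 120 * 24            # old PC: 120 W around the clock -> 2880 Wh/day
new_pc_wh = 6 * 16 + 85 * 8     # new PC: 6 W asleep 16 h, 85 W active 8 h -> 776 Wh/day
print(old_pc_wh, new_pc_wh)     # 2880 776
print(round(old_pc_wh / new_pc_wh, 1))  # 3.7 -- the old machine used nearly 4x the energy
```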

At my work, changes I made to the power settings on machines not only saved a bundle, they actually changed the HVAC profiles in some buildings. We were pumping energy into rooms via computers, then pumping more energy into air conditioning to dissipate the heat; now the AC stays off unless there are packed classes doing high-end number-crunching.

Posted by: mangeek at November 8, 2011 12:06 AM

We are about 10,000 years out from the end of the last ice age, which itself lasted approximately 100,000 years. We are warming up. Species which can not adapt will go extinct. Man didn't cause the ice age, just as he didn't cause any of the other periods where the earth's temperature increased.
After 9/11, when we grounded all airplanes in the US, a scientist was doing a sun study. The lack of contrails in the atmosphere due to grounded jets caused an uptick in temperature over the US. Here is a copy of the results.

Posted by: Mark at November 8, 2011 12:28 AM

Related to the airplane/warming data: One of the most reasonable 'coping/mitigation' strategies is actually to move the particulate pollution from the lower atmosphere into the upper atmosphere, where it actually lowers global temperatures like a volcanic eruption. That would let us buy some more time with fossil fuels until we figured out what to do in the long run.

Posted by: mangeek at November 8, 2011 1:02 AM

Posted by Mangeek
"the giant flat panel display uses a lot less juice than the old CRTs did"

I was under the impression that flat screens used considerably more power than the old CRTs. Am I in error again?

Posted by: Warrington Faust at November 8, 2011 6:01 PM

It depends on the underlying technology, Warrington. My 24" LCD uses about 55 Watts when in use. My CRT, which had a similar amount of screen real estate, consumed 75 Watts.

The standby/sleep consumption of the LCD is also lower. CRTs of all sorts have to keep the tube 'warm' to start up quickly.

Posted by: mangeek at November 9, 2011 10:43 AM