Tar sands investor BP says their projected future of unlimited carbon pollution “is a wake-up call, not something any of us would like to see happening.”

Posted by admin | Posted in The Capitol | Posted on 21-01-2011


Guest blogger Andy Rowell of Oil Change International, in a WonkRoom cross-post.

We are on the path to climate chaos, Big Oil has admitted. Both BP and Exxon have conceded that progress on climate change is totally insufficient to stabilize CO2 emissions. Both oil companies have just published their Energy Outlooks, and the outlook looks grim.

In a bleak prognosis for success on reducing carbon dioxide emissions, BP admits in its new Energy Outlook 2030 report, published yesterday, that global CO2 emissions from energy will grow an average of 1.2 percent a year through 2030. In total, BP’s chief economist Christof Ruehl predicts, “to the best of our knowledge,” that CO2 emissions will rise by 27 percent over the next two decades, meaning an increase of about 33bn tons. None of this bodes well for the climate, with even BP chief executive Bob Dudley calling the scenarios a “wake-up call”:

I need to emphasize that this is a projection, not a proposition. It is our dispassionate view of what we believe is most likely to happen on the basis of the evidence. For example, we are not as optimistic as others about progress in reducing carbon emissions. But that doesn’t mean we oppose such progress. As you probably know, BP has a 15 year record of calling for more action from governments, including the wide application of a carbon price. Our base case assumes that countries continue to make some progress on addressing climate change, based on the current and expected level of political commitment. But overall, for me personally, it is a wake-up call, not something any of us would like to see happening.
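Incidentally, the 27 percent figure is just BP’s 1.2 percent annual growth rate compounded over two decades. A quick back-of-envelope check (this uses only the growth rate quoted above, not BP’s own model):

    # Back-of-envelope check of BP's projection: 1.2% annual growth in
    # energy-related CO2 emissions, compounded from 2010 to 2030.
    annual_growth = 0.012                      # 1.2% per year, as quoted above
    years = 20                                 # 2010 -> 2030

    cumulative = (1 + annual_growth) ** years - 1
    print(f"Cumulative increase over {years} years: {cumulative:.1%}")
    # Prints roughly 27%, matching the figure Ruehl cites.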

BP’s estimate is slightly higher than ExxonMobil’s; Exxon believes that CO2 emissions will increase by 25 percent in 20 years, which, according to John Vidal, writing in The Guardian, in effect dismisses “hopes that runaway climate change can be arrested and massive loss of life prevented.”

These projections by BP and Exxon scientists are even gloomier than the projections of the U.S. Energy Information Administration, which projects that energy-related CO2 emissions will “grow by 16 percent from 2009 to 2035.” Exxon argues that oil will still be king in 2030:

In 2030, fossil fuels remain the predominant energy source, accounting for nearly 80 percent of demand. Oil still leads, but natural gas moves into second place on very strong growth of 1.8% a year on average, particularly because of its position as a favored fuel for power generation. Other energy types – particularly nuclear, wind, solar and biofuels – will grow sharply, albeit from a smaller base. Nuclear and renewable fuels will see strong growth, particularly in the power-generation sector. By 2030, about 40 percent of the world’s electricity will be generated by nuclear and renewable fuels.

BP, too, has demand for fossil fuels rising: BP’s “base case” — or most likely projection — points to primary energy use growing by nearly 40 percent over the next twenty years, with 93% of the growth coming from non-OECD countries. The BP report argues that world energy growth over the next twenty years is expected to be dominated by emerging economies such as China, India, Russia and Brazil. Natural gas is expected to be the fastest growing fossil fuel, with coal and oil losing market share as the contribution of fossil fuels as a whole to energy growth slowly declines, falling from 83 percent to 64 percent. Coal use will still grow by 1.2 percent per year, and by 2030 it is likely to provide virtually as much energy as oil, excluding biofuels.

There is some good news: energy diversification will continue. Between 2010 and 2030, the contribution of renewables (solar, wind, geothermal and biofuels) to energy growth is seen increasing from 5 to 18 percent.

What oil is left is predominantly under OPEC control. OPEC’s share of global oil production is set to increase to 46%, a level not seen since 1977, during the decade in which the cartel presided over a series of oil shocks and shortages. In fact, 75 percent of all growth in oil reserves over the next two decades is expected to come from OPEC nations, which include Kuwait, Iran, Angola, Libya, Saudi Arabia, Iraq and Nigeria.

Andy Rowell writes for Oil Change International’s Price of Oil.

JR:  Of course as much as BP claims it would not like to see continued rapid growth in carbon pollution, the UK’s Independent reported last year, “Oil giant BP today signalled it would press on with a controversial Canadian tar sands project despite facing a showdown with environmental campaigners and shareholders.”

The tar sands are among the most carbon-intensive of replacements for conventional petroleum (see “Tar sands — Still dirty after all these years“):

[Figure (shale.jpg): X-axis is the range of potential resource in billions of barrels; Y-axis is grams of carbon per megajoule of final fuel.]


Climate Progress

What If Everything You Knew About Projected Global Warming Was Wrong?

Posted by admin | Posted in The Capitol | Posted on 14-01-2011


One of the dirty secrets in the climate change community is that the IPCC consensus is widely seen as the lower bound of likely warming, because the models do not incorporate the slower but much more powerful positive feedbacks such as increased methane release from permafrost and bogs. In the years since the last IPCC report was released, every study that has looked at the issue has concluded that the models are far more likely to be drastically understating warming and sea level rise than overstating them. One reason for this is the growing work on past climate that shows we’re not even in the same ballpark:

The atmospheric CO2 concentration currently is 390 parts per million by volume (ppmv), and continuing on a business-as-usual path of energy use based on fossil fuels will raise it to ∼900 to 1100 ppmv by the end of this century (see the first figure) (1). When was the last time the atmosphere contained ∼1000 ppmv of CO2? Recent reconstructions (2–4) of atmospheric CO2 concentrations through history indicate that it has been ∼30 to 100 million years since this concentration existed in the atmosphere (the range in time is due to uncertainty in proxy values of CO2). The data also reveal that the reduction of CO2 from this high level to the lower levels of the recent past took tens of millions of years. Through the burning of fossil fuels, the atmosphere will return to this concentration in a matter of a century. Thus, the rate of increase in atmospheric CO2 is unprecedented in Earth’s history.

What was Earth’s climate like at the time of past elevated CO2? Consider one example when CO2 was ∼1000 ppmv at ∼35 million years ago (Ma) (2). Temperature data (5, 6) for this time period indicate that tropical to subtropical sea surface temperatures were in the range of 35° to 40°C (versus present-day temperatures of ∼30°C) and that sea surface temperatures at polar latitudes in the South Pacific were 20° to 25°C (versus modern temperatures of ∼5°C). The paleogeography of this time was not radically different from present-day geography, so it is difficult to argue that this difference could explain these large differences in temperature. Also, solar physics findings show that the Sun was less luminous by ∼0.4% at that time (7)….

Thus, Earth was ∼16°C warmer at 30 to 40 Ma. The conclusion from this analysis—resting on data for CO2 levels, paleotemperatures, and radiative transfer knowledge—is that Earth’s sensitivity to CO2 radiative forcing may be much greater than that obtained from climate models (12–14).

As Joe Romm notes:

Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle. The permafrost contains a staggering “1.5 trillion tons of frozen carbon, about twice as much carbon as contained in the atmosphere,” much of which would be released as methane. Methane is 25 times as potent a heat-trapping gas as CO2 over a 100 year time horizon, but 72 times as potent over 20 years! The carbon is locked in a freezer in the part of the planet warming up the fastest (see “Tundra 4: Permafrost loss linked to Arctic sea ice loss“). Half the land-based permafrost would vanish by mid-century on our current emissions path (see “Tundra, Part 2: The point of no return” and below). No climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra.

OK, so the question to the people who don’t think this is a major problem is simple: why? But before you answer, I will note something about the possible methane release dynamics. In a paper that I regrettably cannot find again (although it was so mathematical that I would expect only a couple of TMV readers would get anything out of it), the authors point out that, based on what we know about permafrost and decomposition, there is a high chance of a random “detonation” of methane. Here is why:

The majority of organic matter waiting to be decomposed is trapped below the surface. As the tundra heats up, the upper layers of this matter warm to the point that they thaw out and anaerobic decomposition begins, producing methane and heat. This methane and heat are largely trapped under the surface because the top layer is compacted in a way that prevents venting, so the heat warms the matter further, increasing the reaction rate and widening the reaction depth. At some point the pressure underground becomes so high that it fractures the top layer and vents. This isn’t speculation; it is a widely observed fact.

What IS speculation is how widely this is occurring and what it means. Scientists on site are noticing an uptick in the number of venting locations, and we are detecting an increase in methane release through satellite measurements, but so far it is a linear increase (meaning predictable…it’s rising, but slowly and steadily). This paper, however, was a mathematical modelling of the dynamics in a compartmentalized model, which basically means that there are different basins of decomposition that are relatively separate but have some connections, a relatively realistic assumption. The authors show that the dynamics of the whole system will exhibit local stability (small increases in venting) but that the region as a whole will be vulnerable to a simultaneous, massive methane release, a so-called “methane bomb.” Oh, and there would be absolutely no way to predict when it would occur until it does; all we can say is that the warmer the Arctic gets, the greater the risk. [For those who know about neuron firing, these are exactly the dynamics of action potentials: you have a quiescent period that holds until the resting potential rises near the threshold, and then an indeterminate amount of time until the coordinated feedback kicks in and you get the unstoppable action potential. This model even produces the same refractory period after the methane release that neurons display after APs.]
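Since I can’t point you to the paper itself, here is a deliberately crude toy sketch of the kind of dynamics it describes: compartments slowly build pressure as the tundra warms, each vents only when a fracture threshold is crossed, and a venting compartment loads its neighbours. Every number in it (compartment count, threshold, coupling, warming rate) is invented purely for illustration; this is emphatically not the paper’s model.

    # Toy illustration (not the actual paper's model) of compartmentalized
    # methane venting: slow build-up, threshold fracture, neighbour coupling.
    import random

    N = 50                 # number of decomposition "basins" (arbitrary)
    THRESHOLD = 1.0        # fracture/venting pressure (arbitrary units)
    COUPLING = 0.15        # fraction of a vented load pushed onto each neighbour
    random.seed(1)

    pressure = [random.uniform(0.0, 0.3) for _ in range(N)]

    def step(warming_rate):
        """Advance one time step; return (compartments vented, total release)."""
        vented_count, released = 0, 0.0
        # slow, temperature-driven build-up in every compartment
        for i in range(N):
            pressure[i] += warming_rate * random.uniform(0.5, 1.5)
        # venting cascade: one fracture can push neighbours over the threshold too
        cascading = True
        while cascading:
            cascading = False
            for i in range(N):
                if pressure[i] >= THRESHOLD:
                    released += pressure[i]
                    vented_count += 1
                    load = COUPLING * pressure[i]
                    pressure[i] = 0.0            # "refractory": compartment resets
                    for j in (i - 1, i + 1):     # nearest-neighbour coupling
                        if 0 <= j < N:
                            pressure[j] += load
                            cascading = True
        return vented_count, released

    for t in range(200):
        n, release = step(warming_rate=0.005 + 0.00005 * t)  # warming accelerates
        if n >= 5:   # arbitrary cutoff for a "coordinated" burst
            print(f"t={t}: {n} compartments vented together, releasing {release:.1f} units")

Run with different seeds, the bursts arrive at somewhat different and unpredictable times, which is the point of the analogy: locally the system looks stable, and then a large fraction of it lets go at once.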

So, in summary: we know we have enough methane trapped in the Arctic regions to increase radiative forcing by an extreme amount; we are increasingly confident that, in the past, that level of GHGs produced temperatures that even at the lower bounds are several times worse than the IPCC projections; and there is a very convincing argument that the nature of the Arctic and its chemistry means it could explode without warning, at a fundamentally unknowable time, with a probability that increases (exponentially) with temperature. And all evidence suggests that the last few decades of warming are almost certainly anthropogenic and that the warming will continue.

Can we ever “prove” any of this stuff? No, we can’t, but at this point the evidence that supports the fears spans an extraordinary range of scientific fields, and the dynamics that suggest we need to be really careful are seen widely in biology, electronics, the stock market and economy, weather formation patterns, crowd behavior, chemical reaction dynamics, ecology, and so on.

We cannot know with certainty what will happen, but we seem to be damn close to (actually past) the danger point at which we may not have any control no matter what we do. People who argue “well, I believe in AGW, but we need to do a risk analysis and take a rational approach weighing cost and risk” had better understand the massive tail risk, which is getting wider with our increased understanding. That kind of tail risk is precisely what wasn’t taken into account for the financial system, and it is the reason why literally “impossible” events transpired. The “impossibility” of rapid warming may prove just as shortsighted.


The Moderate Voice

Science stunner: On our current emissions path, CO2 levels in 2100 will hit levels last seen when the Earth was 29°F (16°C) hotter – Paleoclimate data suggests CO2 “may have at least twice the effect on global temperatures than currently projected by computer models”

Posted by admin | Posted in The Capitol | Posted on 13-01-2011


The disinformers claim that projections of dangerous future warming from greenhouse gas emissions are based on computer models.  In fact, ClimateProgress readers know that the paleoclimate data is considerably more worrisome than the models (see Hansen: ‘Long-term’ climate sensitivity of 6°C for doubled CO2).  That’s mainly because the vast majority of the models largely ignore key amplifying carbon-cycle feedbacks, such as the methane emissions from melting tundra (see Are Scientists Underestimating Climate Change).

Science has just published an important review and analysis of “real world” paleoclimate data in “Lessons from Earth’s Past” (subs. req’d) by National Center for Atmospheric Research (NCAR) scientist Jeffrey Kiehl.  The NCAR release is here: “Earth’s hot past could be prologue to future climate.”  The study begins by noting:

Climate models are invaluable tools for understanding Earth’s climate system. But examination of the real world also provides insights into the role of greenhouse gases (carbon dioxide) in determining Earth’s climate. Not only can much be learned by looking at the observational evidence from Earth’s past, but such knowledge can provide context for future climate change.

The atmospheric CO2 concentration currently is 390 parts per million by volume (ppmv), and continuing on a business-as-usual path of energy use based on fossil fuels will raise it to ∼900 to 1100 ppmv by the end of this century (see the first figure) (1). When was the last time the atmosphere contained ∼1000 ppmv of CO2? Recent reconstructions (2–4) of atmospheric CO2 concentrations through history indicate that it has been ∼30 to 100 million years since this concentration existed in the atmosphere (the range in time is due to uncertainty in proxy values of CO2). The data also reveal that the reduction of CO2 from this high level to the lower levels of the recent past took tens of millions of years. Through the burning of fossil fuels, the atmosphere will return to this concentration in a matter of a century. Thus, the rate of increase in atmospheric CO2 is unprecedented in Earth’s history.

I will repost the references at the end, since this is a review article (see also U.S. media largely ignores latest warning from climate scientists: “Recent observations confirm … the worst-case IPCC scenario trajectories (or even worse) are being realised” — 1000 ppm)

So now the question is — how much warmer was it back then?

What was Earth’s climate like at the time of past elevated CO2? Consider one example when CO2 was ∼1000 ppmv at ∼35 million years ago (Ma) (2). Temperature data (5, 6) for this time period indicate that tropical to subtropical sea surface temperatures were in the range of 35° to 40°C (versus present-day temperatures of ∼30°C) and that sea surface temperatures at polar latitudes in the South Pacific were 20° to 25°C (versus modern temperatures of ∼5°C). The paleogeography of this time was not radically different from present-day geography, so it is difficult to argue that this difference could explain these large differences in temperature. Also, solar physics findings show that the Sun was less luminous by ∼0.4% at that time (7). Thus, an increase of CO2 from ∼300 ppmv to 1000 ppmv warmed the tropics by 5° to 10°C and the polar regions by even more (i.e., 15° to 20°C).

What can we learn from Earth’s past concerning the climate’s sensitivity to greenhouse gas increases? Accounting for the increase in CO2 and the reduction in solar irradiance, the net radiative forcing—the change in the difference between the incoming and outgoing radiation energy—of the climate system at 30 to 40 Ma was 6.5 to 10 W/m2 with an average of ∼8 W/m2. A similar magnitude of forcing existed for other past warm climate periods, such as the warm mid-Cretaceous of 100 Ma (8). Using the proxy temperature data and assuming, to first order, that latitudinal temperature can be fit with a cosine function in latitude (9), the global annual mean temperature at this time can be estimated to be ∼31°C, versus 15°C during pre-industrial times (around 1750) (10). Thus, Earth was ∼16°C warmer at 30 to 40 Ma. The ratio of change in surface temperature to radiative forcing is called the climate feedback factor (11). The data for 30 to 40 Ma indicate that Earth’s climate feedback factor was ∼2°C per W/m2. Estimates (1, 11) of the climate feedback factor from climate model simulations for a doubling of CO2 from the present-day climate state are ∼0.5 to 1°C per W/m2. The conclusion from this analysis—resting on data for CO2 levels, paleotemperatures, and radiative transfer knowledge—is that Earth’s sensitivity to CO2 radiative forcing may be much greater than that obtained from climate models (12–14).
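Kiehl’s back-of-envelope is easy to reproduce from the numbers quoted above; the only figure below that is not in the excerpt is the standard ∼3.7 W/m2 of forcing for a doubling of CO2, which I am assuming in order to convert the model sensitivity into the same units:

    # Reproducing the climate feedback factor arithmetic from the quoted passage.
    delta_t_paleo = 16.0     # deg C warmer at 30-40 Ma than pre-industrial
    forcing_paleo = 8.0      # W/m^2, average net forcing at 30-40 Ma (from the text)

    feedback_paleo = delta_t_paleo / forcing_paleo
    print(f"Paleoclimate feedback factor: {feedback_paleo:.1f} C per W/m^2")  # ~2

    # Fast-feedback model sensitivity: ~3 C per CO2 doubling; a doubling adds
    # ~3.7 W/m^2 of forcing (standard value, assumed here, not from the excerpt).
    feedback_model = 3.0 / 3.7
    print(f"Model feedback factor: {feedback_model:.2f} C per W/m^2")         # ~0.8

The gap between ∼2 and the models’ ∼0.5 to 1°C per W/m2 is the factor of two (or more) at the heart of the paper.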

Indeed, in the release, Kiehl notes his study “found that carbon dioxide may have at least twice the effect on global temperatures than currently projected by computer models of global climate.”

Why is the ‘real world’ warming so much greater than the models project?  The equilibrium climate sensitivity that the vast majority of the models focus on — typically estimated at about 3°C for doubled CO2 (equivalent to about ¾°C per W/m2) — only includes fast feedbacks, such as water vapor and sea ice.  As Hansen has explained in deriving his 6°C ‘long-term’ sensitivity:

Elsewhere (Hansen et al. 2007a) we have described evidence that slower feedbacks, such as poleward expansion of forests, darkening and shrinking of ice sheets, and release of methane from melting tundra, are likely to be significant on decade-century time scales. This realization increases the urgency of estimating the level of climate change that would have dangerous consequences for humanity and other creatures on the planet, and the urgency of defining a realistic path that could avoid these dangerous consequences.

For background on the tundra (and methane), see Science: Vast East Siberian Arctic Shelf methane stores destabilizing and venting:  NSF issues world a wake-up call: “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.”

Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle.  The permafrost permamelt contains a staggering “1.5 trillion tons of frozen carbon, about twice as much carbon as contained in the atmosphere,” much of which would be released as methane.  Methane is 25 times as potent a heat-trapping gas as CO2 over a 100 year time horizon, but 72 times as potent over 20 years!  The carbon is locked in a freezer in the part of the planet warming up the fastest (see “Tundra 4: Permafrost loss linked to Arctic sea ice loss“).  Half the land-based permafrost would vanish by mid-century on our current emissions path (see “Tundra, Part 2: The point of no return” and below).  No climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra.
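To get a rough feel for why that paragraph is so alarming, here is a deliberately crude scale check; the 10 percent release fraction and the assumption that it all emerges as methane are arbitrary placeholders for illustration, not projections from any study:

    # Crude scale check on the permafrost numbers quoted above.
    # The 1,500 Gt of carbon and the GWP factors (25 over 100 yr, 72 over 20 yr)
    # come from the text; the release fraction is an arbitrary illustration.
    permafrost_carbon_gt = 1500.0   # Gt of carbon locked in permafrost
    release_fraction = 0.10         # hypothetical: 10% eventually released as CH4
    gwp_100yr, gwp_20yr = 25, 72    # CH4 warming potency relative to CO2

    ch4_gt = permafrost_carbon_gt * release_fraction * (16.0 / 12.0)  # C -> CH4 mass
    print(f"CH4 released: {ch4_gt:.0f} Gt")
    print(f"CO2-equivalent, 100-yr horizon: {ch4_gt * gwp_100yr:.0f} Gt")
    print(f"CO2-equivalent, 20-yr horizon:  {ch4_gt * gwp_20yr:.0f} Gt")

Even that hypothetical 10 percent slice works out to thousands of gigatons of CO2-equivalent, versus the roughly 30 billion tons of energy-related CO2 the world currently emits each year, which is why no model that omits this feedback should be treated as an upper bound.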

Kiehl’s work is in line with other major studies, like this one:

Scientists analyzed data from a major expedition to retrieve deep marine sediments beneath the Arctic to understand the Paleocene-Eocene Thermal Maximum, a brief period some 55 million years ago of “widespread, extreme climatic warming that was associated with massive atmospheric greenhouse gas input.” This 2006 study, published in Nature (subs. req’d), found Arctic temperatures almost beyond imagination — above 23°C (74°F) — more than 18°F warmer than current climate models had predicted when applied to this period. The three dozen authors conclude that existing climate models are missing crucial feedbacks that can significantly amplify polar warming.

How long might it take for the extra warming to kick in?  That isn’t known for certain, but two major studies looking at paleoclimate data that Kiehl didn’t cite suggest it’s sooner rather than later:

A study published in Geophysical Research Letters (subs. req’d) looked at temperature and atmospheric changes during the Middle Ages. This 2006 study found that the effect of amplifying feedbacks in the climate system — where global warming boosts atmospheric CO2 levels — “will promote warming by an extra 15 percent to 78 percent on a century-scale” compared to typical estimates by the U.N.’s Intergovernmental Panel on Climate Change. The study notes these results may even be “conservative” because they ignore other greenhouse gases such as methane, whose levels will likely be boosted as temperatures warm.

A second study, published in Geophysical Research Letters, “Missing feedbacks, asymmetric uncertainties, and the underestimation of future warming” (subs. req’d), looked at temperature and atmospheric changes during the past 400,000 years. This study found evidence for significant increases in both CO2 and methane (CH4) levels as temperatures rise. The conclusion: If our current climate models correctly accounted for such “missing feedbacks,” then “we would be predicting a significantly greater increase in global warming than is currently forecast over the next century and beyond” — as much as 1.5°C warmer this century alone.

In the longer term, past 2100, if we were to get anywhere near the kind of warming that Kiehl’s analysis of the paleoclimate data suggests we are headed to, that could render large tracts of the planet uninhabitable.  That was the conclusion of a recent PNAS paper coauthored by Matthew Huber, professor of earth and atmospheric sciences at Purdue (release here).  I haven’t blogged on it, but I guess I will have to now.  The bottom line:

“We found that a warming of 12 degrees Fahrenheit would cause some areas of the world to surpass the wet-bulb temperature limit, and a 21-degree warming would put half of the world’s population in an uninhabitable environment,” Huber said. “When it comes to evaluating the risk of carbon emissions, such worst-case scenarios need to be taken into account. It’s the difference between a game of roulette and playing Russian roulette with a pistol. Sometimes the stakes are too high, even if there is only a small chance of losing.”

So don’t even think about what 29°F warming could mean.

Kiehl concludes:

The above arguments weave together a number of threads in the discussion of climate that have appeared over the past few years. They rest on observations and geochemical modeling studies. Of course, uncertainties still exist in deduced CO2 and surface temperatures, but some basic conclusions can be drawn. Earth’s CO2 concentration is rapidly rising to a level not seen in ∼30 to 100 million years, and Earth’s climate was extremely warm at these levels of CO2. If the world reaches such concentrations of atmospheric CO2, positive feedback processes can amplify global warming beyond current modeling estimates. The human species and global ecosystems will be placed in a climate state never before experienced in their evolutionary history and at an unprecedented rate. Note that these conclusions arise from observations from Earth’s past and not specifically from climate models. Will we, as a species, listen to these messages from the past in order to avoid repeating history?

Will we?


References:

  1. S. Solomon et al., Eds., Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge Univ. Press, Cambridge, UK, 2007).
  2. M. Pagani, J. C. Zachos, K. H. Freeman, B. Tipple, S. Bohaty, Science 309, 600 (2005); doi:10.1126/science.1110063.
  3. B. J. Fletcher, S. J. Brentnall, C. W. Anderson, R. A. Berner, D. J. Beerling, Nat. Geosci. 1, 43 (2008).
  4. D. O. Breecker, Z. D. Sharp, L. D. McFadden, Proc. Natl. Acad. Sci. U.S.A. 107, 576 (2010).
  5. P. K. Bijl, S. Schouten, A. Sluijs, G. J. Reichart, J. C. Zachos, H. Brinkhuis, Nature 461, 776 (2009).
  6. P. N. Pearson et al., Geology 35, 211 (2007).
  7. D. O. Gough, Sol. Phys. 74, 21 (1981).
  8. D. L. Royer, Geochim. Cosmochim. Acta 70, 5665 (2006).
  9. G. R. North, J. Atmos. Sci. 32, 2033 (1975).
  10. The cosine temperature expression can be integrated analytically to obtain the global annual mean temperature. Paleotemperatures from (5) for a subtropical location and a high southern latitude location were used to determine the two coefficients in the analytical expression for global mean temperature.
  11. S. E. Schwartz, Clim. Change (2010); doi:10.1007/s10584-010-9903-9.
  12. J. Hansen et al., Open Atmos. Sci. J. 2, 217 (2008).
  13. P. K. Bijl, A. J. Houben, S. Schouten, S. M. Bohaty, A. Sluijs, G. J. Reichart, J. S. Sinninghe Damsté, H. Brinkhuis, Science 330, 819 (2010).
  14. D. J. Lunt et al., Nat. Geosci. 3, 60 (2010).

Climate Progress

Governor-Elect Dan Malloy And Incoming Budget Chief Ben Barnes Tackle The State’s Projected $3.5 Billion Deficit

Posted by admin | Posted in The Capitol | Posted on 22-11-2010


Hartford Courant cartoonist Bob Englehart shows that Governor-elect Dannel Malloy and incoming budget chief Ben Barnes will be tackling the state’s projected $3.5 billion budget deficit.

http://blogs.courant.com/bob_englehart/2010/11/november-21-2010.html

Capitol Watch

AG Who Opposed NCLB is Projected Senate Winner in Conn.

Posted by admin | Posted in The Capitol | Posted on 11-11-2010


CNN is projecting that Richard Blumenthal, the Democratic attorney general who sued the federal government over the No Child Left Behind Act, has beaten Linda McMahon, the former World Wrestling Entertainment chief executive officer, for an open U.S. Senate seat in Connecticut.

Blumenthal filed the suit back in 2005, calling the law an unfunded mandate. On his campaign website, he continues to call for a major overhaul of the law.

McMahon, a former state school board member, wanted to see “choice and competition” through expansion of charter schools.


Politics K-12

Richard Blumenthal Projected To Win in Connecticut

Posted by admin | Posted in The Capitol | Posted on 02-11-2010


He has all but won for sure. I just heard he has 60% of the vote to Linda McMahon’s 38%.


The Moderate Voice

Cook Political Ups Projected Democratic Losses To 48 to 60

Posted by admin | Posted in The Capitol | Posted on 26-10-2010


The Cook Political Report‘s pre-election House outlook is a Democratic net loss of 48 to 60 seats, with higher losses possible. A turnover of just 39 seats would tip majority status into Republican hands. The midterm maelstrom pulling House Democrats under shows no signs of abating; if anything, it has intensified.

Whereas fewer than a third of Democratic Senate seats are up for election, House Democrats are suffering the full violence of this national undertow. Over a quarter of the entire 255-member House Democratic caucus have trailed GOP opponents in at least one public or private survey, and nearly half have tested under 50 percent of the vote in at least one poll.

At this point, only 190 House seats are Solid, Likely or Lean Democratic, while 198 seats are Solid, Likely or Lean Republican, and 47 seats are in the Toss Up column. While there are certain to be at least 43 new members of the House thanks to 41 open seats and two vacancies, between 40 and 50 incumbents (over 95 percent of them Democrats) are likely to lose their seats, making for possibly the largest freshman class since 1992.

Hotline On Call

WTC reconstruction projected to be completed in five to six years. Later than Ground Zero Mosque construction

Posted by admin | Posted in The Capitol | Posted on 18-08-2010


In an exclusive with NY1 News, developer Larry Silverstein discussed the reconstruction of the World Trade Center, including the rebuilding effort for Towers 2, 3, and 4. But the only tower getting any visible action on the site is Tower 4.

Silverstein said he’s proud of the pace of construction:

“Here we are, after nine years of this, and this thing is now moving forward at a terrific pace,” said Silverstein. “And we are thrilled with the acceleration and the reality of what’s transpiring before us on a daily basis…

“It’s several stories above grade at this juncture. And once the typical floors get formed and framed, the structure starts to rise up and the building will begin to take shape quickly,” Silverstein explained.

The reconstruction effort was held up by red tape, financing disagreements, and relative laziness on the part of all parties involved, especially the government. However, I do not share Silverstein’s enthusiasm, and I am not confident in his prediction.

There are conditions to the agreement the Port Authority made with the developers. The PA says Tower 3 will get five stories of retail, and if Silverstein raises $300 million and leases 400,000 square feet of space, New York City, New York State, and the Port Authority will back the construction with an additional $600 million. Tower 2 will be built to street level, and may be built higher if the economy picks up. That won’t happen if we don’t get tax cuts, and not with Democrats in power in Washington and in New York.

NY1 added this bit:

“I think everybody came to the realization that this had to get done, and the Port Authority came to realize they cannot do this without us; we realized we can’t do this without them,” said the developer.

The Port Authority wouldn’t confirm Silverstein’s timetable, but NY1 also reported that “Silverstein is so optimistic he says he’s already talking to potential tenants for Tower 4, who he describes as first-class corporations looking to lease a significant amount of space.”

Compare this to the timetable of the Ground Zero Mosque construction. The imam said it’ll be built by the 10th anniversary of the 9/11 attacks, in 2011. As Rush Limbaugh said on his radio show a few weeks ago, the Ground Zero Mosque will be constructed faster than the rebuilding of the World Trade Center and the memorial for those killed. Unreal.

Cross Posted at Cubachi

Liberty Pundits Blog