Eat that, GW believers!


MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

KitemanSA wrote: There have been several posts here that state or imply, in essence, that it is not possible to detect a 0.5 degree change in temperature given that the array of gauges can't be shown accurate to within ±2 degrees. Sorry, this is not true.

It is true that you will not be able to accurately state what the magnitude of the temperature is, but it is possible through proper statistical methods to extract accurate information about CHANGES in that magnitude.

Of course, those proper statistical methods would look extremely askance at hand-picking the data set.
But the whole deal depends on the magnitude.

So I'll give you that the signal appears to be increasing. By how much? Well that is a mystery. And there is absolutely no way to be sure that the "signal" extracted is not just an artifact of the noise.

You can always decrease the confidence interval to show a signal. But then confidence in the result declines.

BTW would you care to suggest a "proper method"? I'll bet it depends on an estimation of the nature of the signal. So you do correlations at different frequencies until your correlation shows significance. But there is no way to be sure when the noise far exceeds the signal. And that is just the instrumentation noise. I'd love for someone to tell me how you extract a signal when all you have to work with is daily min/max numbers.

Then there are the problems of different instruments, collection methods, variations in record length, and, as you mentioned, post-collection data quality. Put it all together and what have you got?
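As a rough illustration of the point KitemanSA raised and the assumption MSimon is attacking, here is a minimal Monte Carlo sketch. It assumes each gauge's error is independent and zero-mean with a spread of 2 degrees, which is exactly the assumption in dispute; the station count and all numbers are made up for illustration.

Code:
# Hedged sketch: can many +/-2 degree gauges resolve a 0.5 degree CHANGE?
# Assumes independent, zero-mean gauge errors (the disputed assumption).
import numpy as np

rng = np.random.default_rng(0)
n_gauges = 1000        # hypothetical station count
sigma = 2.0            # per-gauge error spread, degrees
true_change = 0.5      # true shift between the two epochs, degrees

epoch1 = 10.0 + rng.normal(0.0, sigma, n_gauges)
epoch2 = 10.0 + true_change + rng.normal(0.0, sigma, n_gauges)

est = epoch2.mean() - epoch1.mean()
stderr = np.sqrt(2.0) * sigma / np.sqrt(n_gauges)   # standard error of the difference
print(f"estimated change: {est:.2f} +/- {stderr:.2f} degrees")

# With 1000 independent gauges the standard error is roughly 0.09 degrees, so
# a 0.5 degree shift stands out.  If the errors are biased, correlated, or the
# stations hand-picked, the effective sample size shrinks and this breaks down.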
Engineering is the art of making what you want from what you can get at a profit.

KitemanSA
Posts: 6179
Joined: Sun Sep 28, 2008 3:05 pm
Location: OlyPen WA

Post by KitemanSA »

MSimon wrote: You can always decrease the confidence interval to show a signal. But then confidence in the result declines.

BTW would you care to suggest a "proper method"? I'll bet it depends on an estimation of the nature of the signal.
True. Time series methods don't always work with spatial data and vice versa. But I think I might try starting with Huang's "Empirical Mode Decomposition" routine, which does pretty well with multi-variate time history data IIRC.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

KitemanSA wrote:
MSimon wrote: You can always decrease the confidence interval to show a signal. But then confidence in the result declines.

BTW would you care to suggest a "proper method"? I'll bet it depends on an estimation of the nature of the signal.
True. Time series methods don't always work with spatial data and vice versa. But I think I might try starting with Huang's "Empirical Mode Decomposition" routine, which does pretty well with multi-variate time history data IIRC.
http://www.keck.ucsf.edu/compjc/pre2004 ... etal98.pdf

The Hilbert transform depends on picking a point in that data and using the data around it to cancel out frequencies. It is quite good at that. Used in Radio to develop the I and Q signals for SSB transmission.

Where it falls down is that it does not decompose endpoints (it does fine in the center of a series). In radio it typically means a 20 millisecond or so delay from signal generation until I/Q decomposition in order to gather enough data points to do the transform.

So we might know what the signal was 10 or 20 years ago. It can't tell us much about today.
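A quick sketch of that endpoint problem, using scipy's FFT-based Hilbert transform on a made-up tone. This is illustrative only, not a radio or climate pipeline, and the sampling rate and tone frequency are arbitrary choices.

Code:
# Hedged sketch: the Hilbert-transform envelope is well behaved in the middle
# of a record but degrades at the endpoints, which is why the most recent
# samples are the least trustworthy.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                               # samples per second
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.3 * t)          # tone whose true envelope is 1.0

envelope = np.abs(hilbert(x))             # magnitude of the analytic signal

mid_err = np.abs(envelope[450:550] - 1.0).max()
edge_err = np.abs(envelope[:20] - 1.0).max()
print(f"envelope error in mid-record:      {mid_err:.4f}")
print(f"envelope error in first 20 samples: {edge_err:.4f}")
# The edge error is noticeably larger than the mid-record error.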
Engineering is the art of making what you want from what you can get at a profit.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

Also a requirement for any sampled data system is that the analysis be limited to frequencies no greater than 1/2 the sampling frequency (for fidelity 1/3 is better). The out of band signals should be limited to less than 1/2 LSB for the full accuracy of the system to be realized. For old data that is impossible.

If there are frequencies above fs/2 in the data they will be aliased to a lower frequency.

So what do we know about climate data?

First - for most of the data the sampling points are not defined (when was the min? When was the max?)

You can adjust for irregular sampling if you know the "exact" period between samples. Good luck with that.

Second, we know that the input to the filter (Hilbert, Fourier, other, pick one) is not properly band limited. If we assume a thermometer has a time constant (e-folding time) of 15 minutes, then there will be high frequencies in the data that are not accounted for (band limited) by taking daily samples. What is required (roughly) for climate data is a thermometer that takes 24 hours to respond (assuming daily collection). Of course, with all that high-frequency content in the signal the filter output is always moving, so at a minimum you have to sample at known times. Or you could take readings over the 24 hours (say every 15 minutes) and apply the proper filter. Now what do you do about old min/max data?

You could get rid of the spurious frequencies (aliases) if you knew the magnitudes, phases, and frequencies of the out of band signals.

Good luck with that.
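A toy sketch of both problems at once: undefined sampling instants and a diurnal cycle well above fs/2 folding into the record. Purely synthetic numbers, no claim about any real station record.

Code:
# Hedged sketch: a pure daily cycle with NO long-term change, read once per
# day at a slowly drifting time of day, produces a spurious "trend".
import numpy as np

hours = np.arange(0, 365 * 24, 1.0)                     # one year of hourly truth
truth = 10.0 + 5.0 * np.sin(2 * np.pi * hours / 24.0)   # diurnal cycle only

# Observation time drifts from midnight to 18:00 over the year, i.e. the
# sampling instants are neither fixed nor recorded.
read_hour = (np.arange(365) * 24 + np.linspace(0.0, 18.0, 365)).astype(int)
daily = truth[read_hour]

print("true change over the year:  0.00 degrees")
print(f"apparent change (last month minus first month): "
      f"{daily[-30:].mean() - daily[:30].mean():+.2f} degrees")
# The several-degree "change" is an artifact of under-sampling energy above
# fs/2 at poorly defined instants, not anything in the underlying climate.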

In climate science we are dealing with (to put it generously) idiots who do not understand the limitations of the data or, worse, crooks who would prefer to bury that knowledge in order to give an aura of certainty greater than the facts warrant.

I'd love to hear Michael Mann give a discourse on how Nyquist relates to his data.
Engineering is the art of making what you want from what you can get at a profit.

Jccarlton
Posts: 1747
Joined: Thu Jun 28, 2007 6:14 pm
Location: Southern Ct

Post by Jccarlton »

MSimon wrote:
Nah, Hansen is old hat ("father of AGW"). He's not in the pockets of anyone as far as I understand,
Ha. Ha. Ha. Ha. Ha. Ha. Ha. Ha. Ha. Ha.

You are hilarious.

http://powerandcontrol.blogspot.com/200 ... ansen.html

http://powerandcontrol.blogspot.com/200 ... ading.html

BTW I set you up and you fell for it. Easy to do when you are a believer.

Start with this premise: the whole field of climate science with very few exceptions is corrupt to the core. Then go looking for the corruption. You will note I found Hansen's double dealing in 2007. It is now two years later and the nictitating membranes are still covering your eyes.

But let me explain Hansen with the money quote from the above links.
Note that Enron was a major broker of natural gas.
Coal bad. Natural gas good.

When doing science the person most easily fooled is yourself. Any belief will bias your search for the truth (or as close an approximation as is possible). Engineers get disabused of that sort of thinking by having to solve real problems in real time. I can't tell you how many times, when I was SURE of the cause of a problem, I was led astray. These days I tend to gather more data and look for causes outside my normal frame of reference.
Don't forget Soros:
http://newsbusters.org/blogs/jake-gonte ... orge-soros
and other progressives:
US Senate Blog
Another US Senate blog
In fact there is so much on Hansen that I have seen over the years that you would have to be willfully blind to not believe he is a shill.

Josh Cryer
Posts: 526
Joined: Sun Aug 31, 2008 7:19 am

Post by Josh Cryer »

TallDave,
And building GCM models against that data based on the unvalidated assumption that small changes in a trace gas are the primary driver of climate then claiming those models are 90% reliable is just crazy.
The models are not based on temperature data, they are based on generalized* physical properties of gases and thermodynamics (none of which rely on climate / weather temperature data). But since you have shown absolutely no evidence that you have read one freaking model paper, instead insisting on code that you will never be able to understand, I really have nothing more to say on this matter.

It's time someone makes an open climate model based on these papers. Someone needs to pay me money and I'll happily do it. I'll even get a skeptic to help me with it, and we can both read the papers together and decide upon how the model should be built. Otherwise I simply don't have the time for such an endeavor. :(

*in this context generalized doesn't mean that the physical properties are not modeled accurately, simply that the computational power necessary to model them perfectly doesn't exist. Hansen's 1988 model was good enough to project future temperatures for a decade, and the 2001 model is accurate to this day; future models will be better.
Polls are now finding the percentage of people who believe AGW is a real and urgent problem is approaching the number who believe Mars is littered with alien artifacts, even despite all the AGW cheerleading from the MSM.
The propaganda campaign is working: science filtered as entertainment rather than knowledge pursuit. One guy, a real scientist, on one side; another guy, a shill for some industry or an idiot blogger, on the other side. That's entertainment. Though of course, the number of people who believe that the Earth is 5000 years old is pretty staggering, so I wouldn't place much stock in the intelligence of the public. (And I am a fan of mob intelligence.) The deniers are increasingly in the creationist camp, in my experience.
That anonymous and deeply stupid article was already completely destroyed here by Eschenbach.
Erm, I linked that article particularly because it has the link to your article and a second rebuttal by the Economist: http://www.economist.com/blogs/democrac ... _gun_still

The fact that you did not realize this leads me to believe that you actually have not read either article.

Darwin Zero is not a smoking gun by any means (sorry, gotta at least give those 22k hits for it as a conspiracy a chance for an alternate view). What it is is an attempt to obfuscate the peer review process and mislead the public. No comments about the little experiment Kilty did? Shame. I'd think a forum full of engineers would be happy to look at it.
You keep pointing to the methodology papers. They aren't following the methodology.
Not established in any way whatsoever.
Yes, Darwin Zero is the smoking gun.
No, sea ice melt, Greenland, the whole Arctic sea, and East and West Antarctica are the smoking gun. Hardiness zone changes are the smoking gun (unless we want to say the USDA is full of crazy environmentalists, of course). More extreme weather patterns are the smoking gun. All predicted to varying extents (with some failures elsewhere, such as Antarctica).

Of course, if we start with throwing the data out, then, well, there ain't shit I can say, right? You disbelieve the data, you have absolutely no alternate data set to share, so we're just frick.





MSimon,
So please Oh Data Master please tell me what is the correct sampling frequency for Climate in order to separate the signal from the out of band noise?
I don't know; the temperature record has been atrocious. But do you disagree with any methods that the scientists have used to come to their conclusions? What is interesting is that we are complaining about error bars in smaller data sets, when the final data sets have error bars of 0.04 (±0.02) or less. They conclude this by using a diverse array of data and a diverse array of statistical methods. Picking at *one set* of temperature data alone does not suffice. It's like saying that because I flipped a coin 10 times and it was heads 8 out of 10 times, statistically speaking 80% of all coin flips will be heads, which would be ridiculous. More like "I cannot make a statement about coin flips based on that data alone." No shit, the scientists don't base the whole of global climate forcing on one data set. This is precisely why denialists try to discredit every single data set on record, without contributing anything to the scientific process (if there *are* problems with them then they'd be better off if they were corrected, and not bemoaned incessantly).

In any event, if CLARREO is accomplished then we will have extraordinarily accurate measurements down to some ridiculous precision (I believe 0.1 K) that would make all denialists cry like babies. Of course, they'd still claim CO2 wasn't the cause. This is why it was necessary for the CO2 monitoring satellite to go up, since they would have complemented each other perfectly.
Dude. You are blowing smoke when you say that you have looked at the data. Seen it? To be sure. Looked with a critical eye? Not on your life.
I downloaded every single bit of it for later perusal. I don't see why I should verify or falsify the scientific data myself, when there are plenty of nutjobs out there who claim the methodology is wrong, and who have whole websites dedicated to making all sorts of graphs about this subject, getting paid a bunch of money for web traffic. They have the time, they could do it. However, I do plan to write some code to do a lot of the analysis for me, but I simply believe that if there was a smoking gun here it would be shown clear as day, without conspiracy-filled innuendo.

Instead what we have are cries for data that already exists. Why? Why would people who can't do a 5 second google search ("climate station histories") perpetuate disinformation that the data doesn't exist?



Jccarlton, character assassination rather than scientific scrutiny doesn't work with me: http://www.columbia.edu/~jeh1/distro_La ... 070927.pdf

I am spending the majority of my time contributing, hopefully, insightful commentary, not character attacks. I could have called Eschenbach all sorts of names, but I refused. Yeah, I get snarky, I admit it, but normally only to people who can actually defend themselves. It really annoys me when people trash-talk someone who cannot defend himself.

Merry Christmas, y'all. And to prove that I don't look at local, short term, small data sets? It's about 12 degrees (F) out right now. And I still believe global warming is happening due to the copious amount of evidence to that effect.
Science is what we have learned about how not to fool ourselves about the way the world is.

KitemanSA
Posts: 6179
Joined: Sun Sep 28, 2008 3:05 pm
Location: OlyPen WA

Post by KitemanSA »

MSimon wrote:The Hilbert transform depends on picking a point in that data and using the data around it to cancel out frequencies. It is quite good at that. Used in Radio to develop the I and Q signals for SSB transmission.

Where it falls down is that it does not decompose endpoints (it does fine in the center of a series). In radio it typically means a 20 millisecond or so delay from signal generation until I/Q decomposition in order to gather enough data points to do the transform.

So we might know what the signal was 10 or 20 years ago. It can't tell us much about today.
Which is why the EMD method is not based on it per se.
5. The empirical mode decomposition method: the sifting process

Knowing the well-behaved Hilbert transforms of the IMF components is only the starting point. Unfortunately, most of the data are not IMFs. At any given time, the data may involve more than one oscillatory mode; that is why the simple Hilbert transform cannot provide the full description of the frequency content for the general data as reported by Long et al. (1995). We have to decompose the data into IMF components. Here, we will introduce a new method to deal with both non-stationary and nonlinear data by decomposing the signal first, and discuss the physical meaning of this decomposition later. Contrary to almost all the previous methods, this new method is intuitive, direct, a posteriori and adaptive, with the basis of the decomposition based on, and derived from, the data.
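For the curious, one pass of the sifting step described in that passage can be sketched in a few lines of Python. This is a rough toy, not Huang et al.'s reference implementation: real EMD iterates the step until an IMF stopping criterion is met and treats the endpoints carefully (the same endpoint issue MSimon raised); the test signal and tolerances here are invented.

Code:
# Hedged sketch of a single EMD sifting pass: envelope the local maxima and
# minima with cubic splines, then subtract the envelope mean from the signal.
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One sifting step; returns the candidate fast component."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 2 or len(minima) < 2:
        return x                        # too few extrema to build envelopes
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2.0    # remove the local mean (slow mode)

# Toy data: a fast oscillation riding on a slow one.
t = np.linspace(0.0, 10.0, 2000)
fast = np.sin(2 * np.pi * 5.0 * t)
x = fast + 0.5 * np.sin(2 * np.pi * 0.3 * t)

h = sift_once(t, x)
err = np.abs(h - fast)[200:1800].max()   # judge away from the endpoints
print(f"max interior error after one sift: {err:.3f}")
# The fast mode is largely recovered away from the ends; the endpoints are
# exactly where the practical difficulties show up.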
PS: Thanks for the link.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

The models are not based on temperature data, they are based on generalized* physical properties of gases and thermodynamics (none of which rely on climate / weather temperature data).
Not really. They don't have enough computer power to do that. What is done is parameterization, which is not the same thing as running a full Navier-Stokes simulation on the atmosphere.

Let me add that without temperature data they have no initial conditions. Nor do they have any way to verify the models. Of course, if the models are initialized and verified against bad data...

Think of the problems in modeling Polywell where the equations of the interactions are accurate to ppm (minimum) and there are only a few interactions to deal with.

Now compare that with atmospheric science where some of the important quantities are not well known to even +/- 50% (clouds). UV is not modeled. The Sun is not modeled. Galactic cosmic rays are not modeled.

Small potatoes, you say? In a linear system, probably. In a chaotic system (lots of non-harmonically related sources and sinks), not on your life.
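To give a feel for that linear-versus-chaotic distinction, here is the standard textbook toy, the Lorenz-63 system, chosen only as an illustration and not as a stand-in for any GCM: a one-part-in-a-million change in the starting point eventually matters completely.

Code:
# Hedged sketch: sensitivity to initial conditions in a chaotic toy system.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_span = (0.0, 40.0)
t_eval = np.linspace(*t_span, 4001)
a = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-9, atol=1e-12)
b = solve_ivp(lorenz, t_span, [1.0 + 1e-6, 1.0, 1.0], t_eval=t_eval, rtol=1e-9, atol=1e-12)

sep = np.linalg.norm(a.y - b.y, axis=0)
i5 = np.searchsorted(t_eval, 5.0)
print(f"separation at t=5 : {sep[i5]:.2e}")
print(f"separation at t=40: {sep[-1]:.2e}")
# The millionth-of-a-unit initial error grows by many orders of magnitude
# until the two runs are unrelated -- which is why "small potatoes" terms can
# dominate the outcome in a chaotic system, unlike in a linear one.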

Now if the best way to find out what Polywell does (simple, well defined, isolated) is to build one, what hope is there that at the current state of computing and understanding we can model a system at least 1,000,000X more complicated?

ZIP.

Should we keep trying? Yes. We will get better at it. Place a $100 trillion bet now on the outcome of the models? You'd lose the money slower by betting on Red AND Black at the roulette wheel.

There is a $1 billion flyer that might be useful though. Spend the money to reduce the costs of solar, wind, and storage so they make economic sense without subsidies for them or taxes on their competitors.

But it is all moot anyway. If Russia and China are not joining in, there is no reason we should. After all, the US CO2 output is large and rising very slowly. The Chinese CO2 output is large and rising quickly, expected to double by around 2020 and double again by 2050.

The US Congress (stupid as it is) is not going to cripple the USA if China does not join in by crippling itself.

And the biggest joke of all? The best near term hope for eliminating coal fired plants is Polywell or something like it. And its biggest proponents are AGW sceptics.

It makes no difference how settled the science is and how good your proofs are if China does not join in.

My advice if you want to make a difference? Quit wasting your time here and spend it instead learning Chinese.

http://wattsupwiththat.com/2009/12/25/g ... s-records/

http://wattsupwiththat.com/2009/12/21/o ... d-in-snow/

http://eureferendum.blogspot.com/2009/1 ... stmas.html (rare blizzard strikes West Texas)

http://eureferendum.blogspot.com/2009/1 ... ought.html (blizzards in China)

http://eureferendum.blogspot.com/2009/1 ... ening.html (blizzards in the UK)

http://eureferendum.blogspot.com/2009/1 ... g-off.html (belief in AGW in the US falls below 50% among likely voters)

http://eureferendum.blogspot.com/2009/1 ... gnals.html (Brits prepared for mild winter - get harsh one)

http://eureferendum.blogspot.com/2009/1 ... ather.html (about two feet of global warming strikes Mid-Atlantic states)

Good one on how the belief in AGW is raising CFCs in the atmosphere:

http://eureferendum.blogspot.com/2009/1 ... -rich.html

related:

http://eureferendum.blogspot.com/2009/1 ... arbon.html

China is getting about 60% of the carbon credits business
Engineering is the art of making what you want from what you can get at a profit.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

KitemanSA wrote:
MSimon wrote:The Hilbert transform depends on picking a point in that data and using the data around it to cancel out frequencies. It is quite good at that. Used in Radio to develop the I and Q signals for SSB transmission.

Where it falls down is that it does not decompose endpoints (it does fine in the center of a series). In radio it typically means a 20 millisecond or so delay from signal generation until I/Q decomposition in order to gather enough data points to do the transform.

So we might know what the signal was 10 or 20 years ago. It can't tell us much about today.
Which is why the EMD method is not based on it per se.
5. The empirical mode decomposition method: the sifting process

Knowing the well-behaved Hilbert transforms of the IMF components is only the starting point. Unfortunately, most of the data are not IMFs. At any given time, the data may involve more than one oscillatory mode; that is why the simple Hilbert transform cannot provide the full description of the frequency content for the general data as reported by Long et al. (1995). We have to decompose the data into IMF components. Here, we will introduce a new method to deal with both non-stationary and nonlinear data by decomposing the signal first, and discuss the physical meaning of this decomposition later. Contrary to almost all the previous methods, this new method is intuitive, direct, a posteriori and adaptive, with the basis of the decomposition based on, and derived from, the data.
PS: Thanks for the link.
Thanks for that. I guess just skimming the paper is insufficient. I'll give it a better effort.

You are welcome.
Engineering is the art of making what you want from what you can get at a profit.

Jccarlton
Posts: 1747
Joined: Thu Jun 28, 2007 6:14 pm
Location: Southern Ct

Post by Jccarlton »


CherryPick
Posts: 33
Joined: Sat Jun 13, 2009 9:39 pm
Location: Finland

AGW - IPCC models

Post by CherryPick »

Josh Cryer wrote: The models are not based on temperature data, they are based on generalized* physical properties of gases and thermodynamics (none of which rely on climate / weather temperature data). But since you have shown absolutely no evidence that you have read one freaking model paper, instead insisting on code that you will never be able to understand, I really have nothing more to say on this matter.

*in this context generalized doesn't mean that the physical properties are not modeled accurately, simply that the computational power necessary to model them perfectly doesn't exist. Hansen's 1988 model was good enough to project future temperatures for a decade, and the 2001 model is accurate to this day; future models will be better.
IPCC's data is available at http://www.ipcc-data.org/

It is also useful to read the IPCC reports, especially AR4 WG1 chapter 8 (for example page 601), to see that the modelers themselves confirm much of what MSimon says about the missing scientific understanding behind the climate models.

As central parts of the climate system are not understood, and as the models' "forecasts" do not match the measurements with any reasonable accuracy, no policies should be based on the models.
--------------------------------------------------------
CherryPick
Ph.D.
Computer Science, Physics, Applied Mathematics

TallDave
Posts: 3141
Joined: Wed Jul 25, 2007 7:12 pm
Contact:

Post by TallDave »

The models are not based on temperature data
Yes, they are; the models have to be built against something. Typically CRU data has been used.
instead insisting on code
Making code available is called "the scientific method."
The fact that you did not realize this leads me to believe that you actually have not read either article.
I did realize it. You apparently don't or you wouldn't keep citing anonymous, well-debunked nonsense.
Darwin Zero is not a smoking gun by any means
Now that's denialism. You cannot adjust temperatures using dissimilar stations more than a thousand miles away.

http://wattsupwiththat.com/2009/12/20/d ... more-14358

There's nothing in the methodology that allows those adjustments.
No, sea ice melt, Greenland, the whole Arctic sea, and East and West Antarctica are the smoking gun.
There is actually more sea ice in the Antarctic, and Arctic sea ice disappeared in the 1930s as well. But sea ice is just a curiosity; it doesn't increase sea level when it melts. East Antarctica has gained mass over a period AGWers are claiming was the warmest in thousands of years.
More extreme weather patterns are the smoking gun.
This is just ignorance. Hurricane activity has actually fallen off the charts.

http://wattsupwiththat.com/2009/09/22/g ... happening/

TallDave
Posts: 3141
Joined: Wed Jul 25, 2007 7:12 pm
Contact:

Post by TallDave »

The propaganda campaign is working, science filtered as entertainment rather than knowledge pursuit. One guy, a real scientist on one side, another guy, a shill for some industry or an idiot blogger on the other side.
Again, this is just ignorance. The amount of money spent by AGWers is larger by orders of magnitude. In your view, the "shills" are the people being outspent thousands to one by environmental advocates who have massive investments in AGW and who get arrested outside coal plants, and the "scientists" are the ones throwing objectivity out the window, assuming their conclusions, ignoring the scientific method, and corrupting the peer review process.

TallDave
Posts: 3141
Joined: Wed Jul 25, 2007 7:12 pm
Contact:

Post by TallDave »

*in this context generalized doesn't mean that the physical properties are not modeled accurately, simply that the computational power necessary to model them perfectly doesn't exist. Hansen's 1988 model was good enough to project future temperatures for a decade, and the 2001 model is accurate to this day; future models will be better.
This is particularly hilarious, since Model A now predicts way too much warming, making it considerably inferior to a naive forecast (and remember, what matters isn't what models predict in 10 years, it's the disaster they predict way out in the future; that's why we're being told to divert trillions of dollars away from carbon energy).

So, in 2015, when the older models are predicting way too much warming, we'll get new models that predict only a little warming in the near future... and so on, and so on. It's the crisis that never quite occurs.

http://wattsupwiththat.com/2009/01/28/f ... g-climate/
To justify using a climate forecasting model, one would need to test it against a relevant naïve model.

We used the Forecasting Method Selection Tree to help determine which method is most appropriate for forecasting long-term climate change. A copy of the Tree is attached as Appendix 1. It is drawn from comparative empirical studies from all areas of forecasting. It suggests that extrapolation is appropriate, and we chose a naïve (no change) model as an appropriate benchmark. A forecasting model should not be used unless it can be shown to provide forecasts that are more accurate than those from this naïve model, as it would otherwise increase error. In Green, Armstrong and Soon (2008), we show that the mean absolute error of 108 naïve forecasts for 50 years in the future was 0.24°C.
Again, they could at least be honest and put giant error bars around the GCM predictions, but that doesn't fit their political agenda.
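The benchmarking idea in the quoted passage can be sketched as follows, with entirely made-up numbers; it only shows how a naïve no-change forecast is scored against an alternative, not the actual Green, Armstrong and Soon evaluation.

Code:
# Hedged sketch: score a naive (no-change) forecast against a simple trend
# extrapolation on a synthetic anomaly-like series.  Synthetic data only; the
# 0.24 C figure quoted above is Green, Armstrong and Soon's, not this toy's.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2010)
noise = np.convolve(rng.normal(0.0, 0.15, years.size + 9), np.ones(10) / 10, "valid")
anomaly = 0.005 * (years - years[0]) + noise     # weak trend plus reddened noise

horizon = 20
naive_err, trend_err = [], []
for i in range(30, years.size - horizon):
    history = anomaly[:i]
    target = anomaly[i + horizon - 1]
    naive_err.append(abs(history[-1] - target))            # no-change forecast
    slope, intercept = np.polyfit(years[:i], history, 1)   # linear extrapolation
    trend_err.append(abs(slope * years[i + horizon - 1] + intercept - target))

print(f"MAE of naive 20-year forecasts: {np.mean(naive_err):.3f} C")
print(f"MAE of trend 20-year forecasts: {np.mean(trend_err):.3f} C")
# The quoted methodology's point: a forecasting model earns its keep only if
# it beats a benchmark like the naive one out of sample.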
Last edited by TallDave on Fri Dec 25, 2009 10:53 pm, edited 1 time in total.

TallDave
Posts: 3141
Joined: Wed Jul 25, 2007 7:12 pm
Contact:

Post by TallDave »

This is fun:

http://reason.com/blog/2009/12/24/what- ... rgets-to-m
However, Friedman fails to mention that another result of "green taxes" is that Danish consumers pay some of the highest electric rates in the world: 42 cents per kilowatt hour compared to 9 cents per kilowatt hour in the U.S. This may explain why the very hospitable Danish lady from whom I rented a room while covering the climate conference was always following behind me to make sure that the lights were off. Friedman's mention of "a 10-cent-a-gallon increase in gasoline taxes" is a bit disingenuous as well. A gallon of gasoline costs $7.76 in Denmark compared to $2.63 in the U.S.
Scary.
