Guest Essay by Dr. Alan Welch FBIS FRAS
Summary. This report presents and analyses the sea level data for 2024. The year 2023 ended with the strong El Niño still driving faster rises in sea level, but during 2024 the El Niño weakened considerably. The rate of rise is now more stable at about 3.3 mm/year, and the so-called “acceleration” has resumed its downward trend. The analysis now uses software to carry out spectral analysis and to converge on a best-fit sinusoidal curve using an iterative approach.
Preamble – Some thoughts regarding “acceleration”
A term that appears many times in sea level reports is acceleration. Unfortunately, I feel it has been misused in situations where no clear acceleration exists but a quadratic curve has been fitted. Such a fit yields a coefficient for the t² term that has the same units as an acceleration but needs to be applied with caution, and extrapolation, especially over many decades, compounds the misuse. For these reasons I have generally written the resulting quantity as “acceleration” throughout the text, but have found its changing value with time useful for judging different models.
I must be careful not to create a “pot calling the kettle black” situation, as my work involves using sinusoidal curves and prediction. The sinusoidal curves are now backed up by spectral analysis, which has also shown short-period cycles due to El Niño effects, longer periods from decadal oscillations and possibly very long periods (c. 1000 years) in some Tidal Gauge data. The prediction involves showing how the sinusoidal curve’s “acceleration” would vary with time over a 70-year period up to about 2060, and comparing this with “acceleration” assessments made every month or so.
Main Findings.
Data releases via the https://climate.nasa.gov/vital-signs/sea-level/ web site were made during 2024 in January, June, July, August, October and November. Although all were analysed, this report uses the latest data, from January 2025. Figure 1 shows the data with linear and quadratic best fits and illustrates how the term “residuals” is defined.

Figure 1
The residuals (measured values minus linear values) are plotted in Figure 2 together with a quadratic best-fit curve, with the standard deviation of the errors added. To check that this process was carried out correctly, the quadratic term is compared with that for the full data and the linear fit is checked as being y = 0x + 0.

Figure 2
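This consistency check is easy to reproduce outside a spreadsheet. Below is a minimal sketch, using synthetic numbers rather than the NASA series (the slope, curvature and noise level are illustrative assumptions): the residuals of a linear fit refit to a straight line as y = 0x + 0, and their quadratic term matches the full-data quadratic fit.

```python
# Synthetic check of the residuals procedure (not the NASA data): residuals
# are measured values minus the linear fit; a straight-line fit to them must
# return y = 0x + 0, and the t^2 term of a quadratic fit to them must equal
# the t^2 term of a quadratic fit to the full data.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 32, 1 / 12)                              # years since 1993
sea = 3.3 * t + 0.04 * t**2 + rng.normal(0, 3, t.size)    # illustrative GMSL, mm

c_lin = np.polynomial.polynomial.polyfit(t, sea, 1)       # [c0, c1]
residuals = sea - (c_lin[0] + c_lin[1] * t)

c_lin_res = np.polynomial.polynomial.polyfit(t, residuals, 1)
c_quad_full = np.polynomial.polynomial.polyfit(t, sea, 2)
c_quad_res = np.polynomial.polynomial.polyfit(t, residuals, 2)

assert np.allclose(c_lin_res, 0.0, atol=1e-8)             # y = 0x + 0
assert abs(c_quad_res[2] - c_quad_full[2]) < 1e-8         # same t^2 term
```

Both properties follow from the linearity of least squares, so they hold for any dataset, which is what makes them a useful spreadsheet check.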
Next, in Figure 3, the residuals are plotted together with a sinusoidal curve having an amplitude of 4.2 mm and a period of 26 years. The value of 26 years has been used for a couple of years, having originally been fitted by eye.

Figure 3
The standard deviation improves on the quadratic fit, from 3.13 to 3.06. The period of 26 years had been retained so as not to keep changing it, but a somewhat higher value may be more appropriate; the above figure was therefore repeated with a period of 29 years, and the standard deviation reduces further to 2.99, as shown in Figure 4. Part of the improvement in the standard deviation is also due to the −0.4 mm constant in the equation.

Figure 4
The choice of 29 years followed from the introduction of two other analytical techniques. The first is the fitting of a sinusoidal curve using a convergence process in which the four parameters in the equation (constant, amplitude, shift and period) are changed in small steps until an error criterion (the square root of the sum of the squares of the differences) reaches a minimum. The process does not guarantee that this is the absolute minimum, but starting with different parameters may help to confirm it, as may using diagonal small increments formed by combining the parameter increments in pairs. The second is spectral analysis, details of which can be found in the Appendix. There are many spectral analysis methods, and whilst the one used (the CLEANest method) may not be the most suitable, checks were performed on a range of simulated curves to judge the suitability of the chosen approach and how to interpret the graphs produced.
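The convergence process can be sketched as follows. This is a simplified illustration on synthetic residuals, not the spreadsheet implementation: the starting values, step sizes and shrink factor are my assumptions, and only one parameter is stepped at a time (no diagonal steps).

```python
# Illustrative step-wise convergence: each of the four parameters (constant,
# amplitude, shift, period) is nudged up or down by its current step, the
# move is kept only if the RMS error drops, and the steps shrink when no
# single move helps.
import numpy as np

def rms(params, t, y):
    c, a, s, p = params
    return np.sqrt(np.mean((y - (c + a * np.sin(2 * np.pi * (t - s) / p))) ** 2))

def fit_sinusoid(t, y, start, steps=(0.1, 0.1, 0.5, 0.5), shrink=0.5, tol=1e-4):
    params = np.array(start, dtype=float)
    steps = np.array(steps, dtype=float)
    best = rms(params, t, y)
    while steps.max() > tol:
        improved = False
        for i in range(4):
            for sign in (+1, -1):
                trial = params.copy()
                trial[i] += sign * steps[i]
                e = rms(trial, t, y)
                if e < best:
                    params, best, improved = trial, e, True
        if not improved:
            steps *= shrink          # no single step helped: refine the grid
    return params, best

# Synthetic residuals with a known 29-year cycle, fitted from a 26-year start
t = np.arange(0, 32, 1 / 12)
y = -0.4 + 4.4 * np.sin(2 * np.pi * (t - 3.0) / 29.0)
p, err = fit_sinusoid(t, y, start=(0.0, 4.0, 2.0, 26.0))
```

Because moves are only ever accepted when they reduce the error, the final error can never exceed that of the starting guess; as noted above, though, nothing guarantees the absolute minimum has been found.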
Inspection of Figures 3 and 4 shows a tendency for the fitted curve to resemble a quartic polynomial. Figure 4A shows the result of applying a quartic curve; whilst the SD is reduced from 3.13 for a quadratic to 3.01, it is still slightly higher than for the 29-year period. A danger with polynomials is that near each end of the data the graph can become excessively influenced by the highest-order term. That is not obvious in this case, but extrapolators extrapolate at your peril!!

Figure 4A
Figure 5 shows the result of the spectral analysis of the residuals in the NASA analysis. The amplitudes are best judged as relative amplitudes indicating the impact of various periods. (I must thank Tonny Vanmunster of the CBA Belgium Observatory for help with his software. When I asked him what the term “theta” on the y axis of the program means, he replied: “Related to CLEANest, and the ‘theta’ value shown on the Y Axis: in the CLEANest method described by Foster (1995), the Y-axis of the periodogram represents the power of the signal at each frequency (or period), but it’s important to note that this is not the same as a classical Fourier power. Instead, the Y-axis in a CLEANest periodogram reflects the amplitude of the cleaned spectral component at each frequency, after the iterative removal of sidelobes caused by uneven time sampling (spectral window effects). You can interpret it as a ‘debiased’ or ‘cleaned’ amplitude, where aliasing has been largely suppressed. So, I would suggest to call it the amplitude or CLEANest amplitude.”)
It is the periods of the various peaks that are most useful.

Figure 5
As shown in the Appendix, when the sinusoidal periods involved are close to, or longer than, the period covered by the data, the highest point on the spectrum fits quite well but is followed by a long, sometimes slowly descending, tail. Figure 6 homes in on the periods below 12 years, where the effects of the El Niños and La Niñas are mainly seen.

Figure 6
Figure 7 shows the El Niño index covering the NASA data range; this data was extracted and processed through the spectral analysis program, producing the spectrum shown in Figure 8.

Figure 7

Figure 8
With a peak amplitude between 10 and 11 years, what role might sunspots play in this? The migration of sunspots towards the equator over the solar cycle and changes in their size, combined with the sunspot number, produce a daily “Total Solar Irradiance” (TSI) as shown in Figure 9, which covers the period of the satellite readings.

Figure 9
The Spectral Analysis of this portion of the TSI resulted in the following.

Figure 10
We need some HUMOUR at this stage. Following the lead of 97% of Climate experts I have fitted the quadratic curve shown above and then extrapolated it to 2100. Get a good supply of suntan lotion in!!

Figure 10A
The following graphs, Figure 11 and Figure 16, are, to my mind, probably becoming the most important ones, as they progressively grow over time. They plot the “acceleration” values against the date on which they were determined, based on the data from the start of 1993 up to that date. Prior to 2012 the “accelerations” were more chaotic, owing to the shorter time periods and large swings in the data. From 2012 onwards the graphs take a more settled form, superimposed on which are several “S”-shaped deviations, each occurring because of an El Niño, whose effect diminishes with time. These occurred in 2015/16 (very strong), in 2018/19 and 2023/24 (strong), and even a weak effect in 2021 is just discernible. Since about 2017 there has been a steady decrease in “acceleration”, with short increases lasting about a year at each large El Niño. These combine to give an average reduction in “acceleration” of about 0.002 mm/year² per year.

Figure 11
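The construction behind these “acceleration” histories can be sketched as follows. The data here are synthetic, and the convention assumed, common in the altimetry literature, is that a quadratic y = c0 + c1·t + c2·t² is fitted to the record from 1993 up to each end date and the “acceleration” quoted as 2·c2.

```python
# Sketch of an expanding-window "acceleration" history (synthetic data):
# for each end date, fit a quadratic to everything from 1993 up to that
# date and record 2*c2 in mm/year^2.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 32, 1 / 12)                       # years since 1993
y = 3.3 * t + 0.04 * t**2 + rng.normal(0, 3, t.size)

end_dates, accel = [], []
for n in range(120, t.size + 1):                   # wait for ~10 years of data
    c = np.polynomial.polynomial.polyfit(t[:n], y[:n], 2)
    end_dates.append(1993 + t[n - 1])
    accel.append(2 * c[2])                         # mm/year^2

# Early values swing widely; with the full window the estimate settles
# near the 0.08 mm/year^2 used to generate the data.
```

This reproduces the qualitative behaviour described above: short windows give chaotic values, and each estimate steadies as the window lengthens.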
Having stated that I think these graphs are very important, I have found only one similar example (Figure 15), in a paper by R. S. Nerem, T. Frederikse and B. D. Hamlington (Ref 7) dated March 2022. Unfortunately, its data only goes up to the end of 2020 and has since been superseded. In my paper (Ref 5) I discussed this change in data, in which some historical values, as far back as 1993, increased by over 8 mm for some, as far as I know, unexplained reason. Three of the figures are reproduced below to show those changes, together with a predicted curve based on a 26-year cycle.


Figure 12 Before changes Figure 13 After changes

Figure 14 Changes in data
The change of 8 mm is about 2.5 times the annual increase in sea level. The graph in Ref 7 was, not unreasonably at the time, used to substantiate the statement that “…the acceleration coefficient becomes stable after 2017”, but it would be interesting to see it updated. One thing I have noticed is that whereas a couple of years ago the “accelerations” were quoted in the form 0.083 ± 0.025 mm/year², representing 1σ, they are now quoted in forms such as 0.09 ± 0.09 mm/year² to 0.08 ± 0.06 mm/year², representing a 90% confidence interval. These changes now lead to a much lower projected limit. At least this lower limit is in line with my work and the Tidal Gauges.
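If the quoted errors are treated as normally distributed, the two conventions differ by a factor of about 1.645 (the two-sided 90% point), which makes the old 1σ quotes and the new 90% intervals directly comparable. A rough conversion (my sketch, not taken from Ref 6):

```python
# Converting between 1-sigma and 90% confidence-interval quotes for a
# normally distributed error.
from statistics import NormalDist

z90 = NormalDist().inv_cdf(0.95)     # two-sided 90% -> z of about 1.645
half_width_90 = z90 * 0.025          # old +/-0.025 (1 sigma) -> ~0.041 at 90%
sigma_from_90 = 0.06 / z90           # new +/-0.06 at 90% -> sigma of ~0.036
```

On this reading the newer ±0.06 mm/year² at 90% corresponds to roughly ±0.036 mm/year² at 1σ, still wider than the ±0.025 quoted a couple of years ago.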
The quoted slopes, accelerations and comments are from Ref 6, “The rate of global sea level rise doubled during the past three decades”. The note below the table is also from Ref 6.
“Table 1 Changes in rates and accelerations during the altimeter record

| End Date | Rate (mm/year) | Acceleration (mm/year²) | 2020–2050 Sea Level Change (mm) |
| --- | --- | --- | --- |
| 2017.99 | 3.3 ± 0.4 | 0.09 ± 0.09 | 171 ± 77 |
| 2018.99 | 3.3 ± 0.4 | 0.09 ± 0.08 | 168 ± 71 |
| 2019.99 | 3.3 ± 0.4 | 0.09 ± 0.08 | 174 ± 64 |
| 2020.99 | 3.3 ± 0.4 | 0.09 ± 0.07 | 169 ± 60 |
| 2021.99 | 3.3 ± 0.3 | 0.08 ± 0.07 | 165 ± 57 |
| 2022.99 | 3.3 ± 0.3 | 0.08 ± 0.06 | 158 ± 55 |
| 2024.00 | 3.3 ± 0.3 | 0.08 ± 0.06 | 169 ± 52 |

Note: Rate and acceleration estimates for different lengths of the satellite altimeter record. The start year is fixed in 1993 but the end year of the record used varies from 2017.99 to 2024. The last row indicates the rate and acceleration estimates over the current full record. Uncertainty estimates denote the 90% confidence interval. Additionally, the extrapolation of the measured rate and acceleration is used to project the sea level change from 2020–2050.”

Figure 15
If the residuals had followed a sinusoidal variation based on the 29-year sinusoidal curve, an equivalent set of “accelerations” could have been determined. Figure 16 shows the “accelerations” as at the end of January 2025 together with the “accelerations” calculated by quadratic curve fitting to a set of data obtained from a 29-year curve and extended to 2065. Compared with the previous 26-year curve, this slightly extends the prediction of when the “accelerations” will reach values similar to those from long-range Tidal Gauge data, between 0 and 0.02 mm/year²; this is shown as occurring at about 2035. The curve as calculated is asymptotic to 0, but it will probably converge on a slightly higher value, in line with the Tidal Gauges.
Should I pop my clogs before, say, 2030, it would be nice to think someone out there will keep adding new “acceleration” values to Figure 16. My grandkids are only 11 and have not yet acquired the spreadsheet bug, so they are not quite ready to take up the mantle.

Figure 16
Conclusions
The slope of the data has settled down to about 3.3 mm/year.
The “acceleration”, having peaked at just over 0.09 mm/year² during the period 2017 to 2020, had reduced to nearly 0.07 mm/year² by early 2023, at which stage an El Niño caused a short-term blip back up to 0.08 mm/year² before the downward trend resumed. With time, the effect of future El Niños on the “accelerations” should gradually reduce.
The introduction of spectral analysis and the iterative determination of a sinusoidal curve have been of great benefit. Because of this, the process has been extended to a much wider range of data, although not all of it has yet been analysed. This amounted to over 50 times the work (spreadsheets and figures) needed to analyse the NASA data alone. I therefore have spreadsheets coming out of my ears(!), but presenting it all in full will need a large amount of extra compiling of results, producing figures, and discussing and finalising the conclusions.
The datasets include the following. As and when each dataset is finalised, I intend to issue a report.
NOAA data
The site https://www.star.nesdis.noaa.gov/socd/lsa/SeaLevelRise/LSA_SLR_timeseries.php issues the sea level data every few months. The sets being processed are those with seasonal signals removed, as measured by the TOPEX, Jason-1, -2, -3 and Sentinel-6MF satellites. The data files analysed run from 1993 to approximately the end of September 2024, although a more recent set takes the values up to the end of 2024. They cover the global data and the data for 24 sub-areas, from the Pacific Ocean down to small seas such as the Adriatic. An interesting presentation is to plot values such as “acceleration” or phase shift for each subset on an easterly basis. Any trends may help in judging whether the derived “accelerations” are meaningful or mainly a manifestation of the method of determination.
Tidal Gauge data
There are hundreds of Tidal Gauge datasets; those selected include very long sets, such as Brest, those recently discussed on WUWT, such as the NY Barrage, and oddball ones with negative slopes and/or negative “accelerations”. The general approach adopted was similar to that used for the satellite data, except that with the longer date ranges a longer-period (> 100 years) curve is fitted and the residuals are calculated with respect to that curve. Also, whereas for the NASA data the spectral analysis was applied only to the residuals, for the Tidal Gauge data it will also be applied to the full data, to possibly pick up very long-term cycles.
North Atlantic, North Sea and Arctic Sea data
These Tidal Gauge datasets were used in Ref 5 to investigate the theory that the global sea levels show an “acceleration” because the satellite coverage is only 95%, and the main missing areas include large parts of the North Atlantic, North Sea and Arctic Sea. If these had a decadal oscillation, it might show up as a sinusoidal variation. At that time the sinusoidal curve was judged by eye to have about a 26-year period, but now a more accurate assessment can be made.
All nine Tidal Gauges will be considered.
Appendix – Spectral Analysis Program
A recent addition to the analysis tools has been the use of spectral analysis software. The program used is called Peranso, described as Light Curve and Period Analysis Software and used mainly in the analysis of variable stars. It was written by Tonny Vanmunster at the Center for Backyard Astrophysics Belgium & Extremadura. As such it has features that are specific to variable stars but can be used to analyse general sets of data.
Because brighter stars have more negative magnitudes, the y data for non-astronomical values must have their signs changed. Also, the system refers to Julian Days, which may not be relevant and can be changed to periods in years.
To use the program for a general set of x–y data, an Excel file can be set up with the x (time) values in column A, an empty column B, and the variable, with its sign changed, in column C, forming a CSV-type file. This data is then copied to the clipboard.
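The same file can be generated by a small script. This is a hypothetical helper (the function and file names are mine, not part of Peranso), shown just to make the three-column, sign-flipped format concrete:

```python
# Hypothetical helper producing the Peranso-style input described above:
# time in column A, an empty column B, and the sign-flipped variable in
# column C of a CSV file.
import csv

def write_peranso_csv(path, times, values):
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        for t, v in zip(times, values):
            w.writerow([t, "", -v])   # sign flipped: brighter = more negative

write_peranso_csv("residuals.csv", [1993.0, 1993.083], [1.2, -0.7])
```

The empty middle field is what appears as “a set of commas” when the file is viewed as plain text.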
Nine cases of data input were produced as shown below (Figure A1) to investigate how the system responds and to investigate the suitability for the type of data found in sea level studies.

Figure A1
The top row is for a single oscillation with the longest period, the second row two oscillations and the third row three oscillations. The first column contains only the main long-term oscillation, the second column adds an intermediate oscillation and the third column an extra short-term oscillation. The resulting spectral plots are shown below in Figure A2.

Figure A2
The “long tail” in the upper row comes about because all the much longer-period curves have a very similar RMS error. About three oscillations are needed before a more definite peak period is obtained, as shown in row 3.
A further case involved using a very short section of a 1000-year curve. The figures are taken from the Peranso screen and show the input (Figure A3) and the spectrum obtained (Figure A4). The peak is shown at a period of 1111 years and is hardly discernible. This slightly higher period generally occurs when the input is a short portion of a much longer curve. The peak is not very obvious; a large range of long-period curves would fit nearly as well.

Figure A3

Figure A4
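The same behaviour can be cross-checked with a different periodogram. The sketch below uses scipy’s Lomb–Scargle rather than CLEANest (which is part of the commercial Peranso package); the periods and grid are my choices. A cycle with about three oscillations inside the 32-year window gives a sharp peak, while a tiny arc of a 1000-year cycle gives only a broad long-period tail.

```python
# Cross-check of the "long tail" effect with scipy's Lomb-Scargle
# periodogram: a short-period cycle inside the data span peaks sharply,
# whereas a short section of a much longer cycle produces a broad plateau
# with no well-defined peak.
import numpy as np
from scipy.signal import lombscargle

t = np.arange(0, 32, 1 / 12)                   # a 32-year, monthly record
periods = np.linspace(5, 2000, 4000)           # trial periods in years
freqs = 2 * np.pi / periods                    # angular frequencies

short = np.sin(2 * np.pi * t / 11)             # ~3 cycles inside the window
long_ = np.sin(2 * np.pi * t / 1000)           # tiny arc of a 1000-year cycle

p_short = lombscargle(t, short - short.mean(), freqs, normalize=True)
p_long = lombscargle(t, long_ - long_.mean(), freqs, normalize=True)

# p_short has a sharp maximum near 11 years; p_long has no comparably
# sharp maximum, just a slowly varying long-period tail.
```

Counting how many trial periods come within 95% of the maximum power shows the contrast: only a handful around the 11-year peak, but a wide swathe of long periods in the 1000-year case, exactly the “a large range of long-period curves would fit nearly as well” situation described above.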
# # # # #
References (not all references appear in this report, but some may appear in follow-on reports)
1. https://wattsupwiththat.com/2022/05/14/sea-level-rise-acceleration-an-alternative-hypothesis/
2. https://wattsupwiththat.com/2022/06/28/sea-level-rise-acceleration-an-alternative-hypothesis-part-2/
4. Nerem, R. S., Beckley, B. D., Fasullo, J. T., Hamlington, B. D., Masters, D., & Mitchum, G. T. (2018). Climate-change-driven accelerated sea-level rise detected in the altimeter era. Proceedings of the National Academy of Sciences of the United States of America, 115(9). First published February 12, 2018.
6. B. D. Hamlington, A. Bellas-Manley, J. K. Willis, S. Fournier, N. Vinogradova, R. S. Nerem, C. G. Piecuch, P. R. Thompson & R. Kopp. The rate of global sea level rise doubled during the past three decades. https://www.nature.com/articles/s43247-024-01761-5
7. R. S. Nerem, T. Frederikse & B. D. Hamlington. Extrapolating Empirical Models of Satellite‐Observed Global Mean Sea Level to Estimate Future Sea Level Change https://repository.library.noaa.gov/view/noaa/54199
# # # # #
Measuring sea level by satellite? Dunno…
Convince me it’s accurate.
I remember a few years ago Willis did a piece on here about satellite data. As I recall there have been 4? The result was that each one showed an increase in the rate of sea level rise compared to the previous one.
It will not be easy to prove or disprove the accuracy of the satellite data; 32 years is far too short a period to do this, but at this stage 32 years of data is all we have, so we either throw it in the dustbin or try to make some sense of it. In 2018 the University of Colorado was pulling all the strings and feeding the BBC and the Guardian horror stories of high acceleration and 80-year extrapolation. Forty years as an engineer taught me the dangers of quadratic (or any polynomial) curve fitting coupled with extrapolation. In the paper I emphasised the importance of Figs 11 and 16 (in the paper I said 11 and 13, which was an error). Fig 16 in particular, as it extends to 2065 and gives over 70 years of a predicted trend. In all the work I have been looking at there seems to be some general sinusoidal behaviour, generally in the 20- to 30-year period range. That covers the NASA data, NOAA data and Tidal Gauges. Interestingly, most long-period Tidal Gauges have accelerations around 0.01 mm/year², a value Fig 16 shows the satellite data may be approaching in the longer term.
I have analysed all 24 NOAA sub-areas, and those in the Indian Ocean area are of interest. The Indian Ocean and the Indonesian Sea have sinusoidal variations of about 15 and 11 years respectively, which means the 32-year analysis period contains 2 or 3 cycles. Performing a Fig 16-type analysis produces a fuller picture of the decay in the “acceleration” values, which follow the form of a slightly under-damped oscillation.
Which “acceleration” figures appear in the calculations depends on the starting point (in the NOAA sub-areas, different areas have different phase shifts), on the period of any oscillation and on the length of the period the analysis covers. My Ref 2 in the paper looked at this (particularly the third item) in more detail.
I hope to produce two more papers to cover the 24 NOAA sub-areas and a few of the Tidal Gauges. A lot of effort is going into these; it may be meaningful and it may not, but if it is all left to U of C we know where that may lead.
Not sure if this is being done or if this is a stupid idea, but there are reservoirs, ponds, etc. all over the world where you likely have calm, flat surface conditions for significant periods and the surface elevation is controlled and known. As satellites fly over these points with known positions and elevations, are they ever measured and used for calibration or verification?
Acceleration via tide gauges:
If the satellite data is following my figure 16 it will join your chart in about 2060 at which stage I will be 122 and looking forward to this and the return of Halley’s Comet!
The “acceleration” in satellite data v tide gauges still looks mostly like badly calibrated satellite data, and shifts to different satellites.
The “acceleration” in satellite data v tide gauges still looks mostly
like badly calibrated satellite data, and shifts to different satellites.
______________________________________________________________________
Some of us think it looks more like the Winston Smiths of the climate mob rewriting the historical data. The graph below shows that Colorado University’s Sea Level Research Group (C-SLRG) altered the data from 1992–1998 enough that they could claim an acceleration of 0.083 mm/yr² when there isn’t any acceleration. We must be due for another change in the data!!
Prior to 2018 there was a long pause in the updates from C-SLRG, and, to toot my own horn, I posted that when C-SLRG resumed regular updates there would be a huge rewrite that increased the rate of sea level rise. They’ve once again paused the updates, so your comment is appropriate.
Couple or three things. From the post –
0.1 mm is the thickness of a thick human hair. 0.01 mm is difficult to measure reproducibly with a 0.01 mm accuracy micrometer, even on a rigid material like steel. The measurements are nonsensical. A fantasy. The crust is in constant motion, resulting in marine fossils found at over 6000 m altitude and fossil fuel remains at depths of over 4000 m. A sea level range of at least 10,000 m – 10 km.
There are no “sinusoidal” curves with predictive power in a chaotic system. Analysing the past, using spectral analysis, or any similar process using Fourier transforms, can produce all sorts of sinusoidal curves, which added together will provide a close approximation to the data provided.
Unfortunately, the chaotic motions of the lithosphere can result in ever changing quasi-periodicity with precisely no predictive value at all.
Given the chaotic three dimensional movements of the crust (including ocean basin walls and floors), and a certain mass of surface water upon it, sea levels must rise and fall – chaotically and unpredictably. History shows evidence of 10 km changes in the past.
Possibly a good reason to investigate something other than the non-predictable future states of a chaotic system.
Agreed, the measurements are very small, but that is no reason not to try to make some sense of them.
I am not combining curves but finding a dominant period (usually 15 to 30 years) and then showing the sun and El Nino effects with periods of 11 years and below.
No reason why not to predict. That is how science progresses – measure, formulate theory, predict, more measurements, etc.
Finally as I said in a previous reply don’t let us leave it all to the U of C.
No, provide hypothesis for something not previously explicable, make prediction, do experiment . . .
Feynman –
You don’t even have a theory. Sorry about that, but it’s true.
If you believe you can predict the weather, the stock market, oil futures, or anything of that nature, good for you! Belief is not fact. Oh, and the assumption “The Sun will rise tomorrow” can be made by any small child. You need to do better than that.
Science moves on in many ways. The starting point could be to provide a hypothesis but surely that usually needs data. Science is a continually revolving process.
Regarding Richard Feynman, my number 1 hero, I hate to slightly disagree with him, but in the quote you provided the experiment must also be right. I can give you a couple of examples from my former life analysing nuclear power concrete containment structures. In the 1960s I was responsible for developing methods of analysis for a range of NPS, including Hartlepool and Heysham 1. These consisted of large cylindrical structures with 8 large vertical penetrations passing through the walls. We tested a range of different-size models for which I provided predictions for the strain gauges. A large-scale concrete model agreed well, but a small-scale Araldite model showed no agreement between prediction and test results. Strain gauges had been attached to the inside of the vertical penetrations in the vertical and horizontal directions along 4 lines 90 degrees apart. I pointed out that if the sets of readings were transposed then theory and experiment agreed, but this was rejected by the laboratory. I insisted the model be cut up to expose the gauges, and on doing so the gauges were seen to be wired incorrectly.
Forward a few years: four identical containment structures had been built and I was involved in the pressure testing. Three passed with flying colours, but the fourth showed no agreement between experiment and theory. The strain gauges consisted of two sets of many gauges in the vertical and horizontal directions, and again agreement could be shown if the sets were transposed. Being concrete, and a working structure, it couldn’t be cut up, so my explanation that the wiring had been transposed was accepted and the test proceeded in small steps up to full test pressure. The strain gauge readings stayed linear up to full pressure and a test certificate was duly signed. That NPS has now nearly reached the end of its working life and will soon be shut down.
You say I don’t have a theory. In a previous paper I was asked why I used a sinusoidal curve, at that time with a 26-year period, and gave a theory: a large part of the North Atlantic, North Sea and Arctic Sea makes up the 5% not measured, and there may be a roughly decadal oscillation in that area. I studied many of the Tidal Gauge datasets in that region and showed this was a possibility, and I am following up, using the spectral analysis and convergent curve-fitting method, to check this more accurately.
Finally I do not accept your last paragraph. I am putting in a lot of effort in to this work. The facts are shown for others to make judgements.
My number 1 scientific hero as well.
I have often used the single-line summary from his “Seeking new laws” public lecture myself to get across the fundamental point, but the full transcript immediately follows it with some important qualifiers …
Richard Feynman would have agreed with your hesitation.
Most people will only take the time to login and compose a comment if they see something “objectionable”.
The vast majority of “silent watchers” appreciate the efforts made by you (and other WUWT article writers), even if the initial reaction is all too often “Hmmmmm …. I’m going to think about that one for a while …” (as was mine in this instance).
Thanks for your reply. Interesting to see the full Feynman quote.
I take your point that people like to complain but don’t take the effort to praise. Thank you the silent majority!!!
You don’t have a theory. You don’t even have a(n) hypothesis.
You seem to be speculating that a “roughly decadal oscillation” exists – which is “roughly completely useless”.
Chaos throws up all manner of aperiodic “oscillations”. That’s the nature of chaos. If you don’t accept that chaos exists, fine. You join a large group who believe they can predict future states of chaotic systems. They can’t – it’s as simple as that.
You don’t have to accept anything that you don’t want to. If you believe that you can predict the future by examining the past, that’s your choice.
“There are no “sinusoidal” curves with predictive power in a chaotic system.”
You must live somewhere the seasons don’t change periodically.
I’m not SURE there’s value in Frequency space analysis of chaotic system data, but I think you fail to prove your assertion.
Your sarcasm is misplaced. Try predicting when the seasons will change. Maybe I should have been clearer, and said “useful predictive power”. A 12 year old assumes that the seasons change. Maybe you can come up with something a little more useful.
I am quite sure there is no useful predictive ability at all. I don’t have to prove anything, and you are free to believe that there is value in anything you want. You can demonstrate you are not ignorant and gullible if you wish, but I don’t believe you are all that concerned about my opinion, are you?
The analyses presented are well beyond anything I did or was exposed to in college during the 1960s. My exposure to sea level involved going to Jekyll Island on the Georgia coast in 1968 with my soon-to-be wife and her mother. Both are now gone but the Island and its dunes and vegetation (using satellite images) looks much the same as it did 57 years ago. Using 3.3 mm/year (?) the sea level rise would be 188 mm or 7.4 inches.
Longshore drift can move tons of sand in 57 years so changes from 1968 might be difficult to see. I suggest a visit for anyone traveling on I-95; it is just 11 miles to the Atlantic Ocean. 🐟
Kip, the problem with satellite sea level data is that it is inhomogeneous. The entire acceleration is due to a single year, as can be seen in this graph plotting the sea level rise for each 9-year-period average.
This sudden jump was convincingly proposed to be due to a change of satellite by Willis Eschenbach in his article:
https://wattsupwiththat.com/2021/02/21/munging-the-sea-level-data/
But could be also due to the humongous 2015-16 El Niño.
And it is correct that the data shows a sea level rise deceleration since 2016.
Javier ==> Alas, I am not the author of this piece — only the “Editor”, meaning only that I uploaded and formatted the essay for the site.
I have facilitated Dr. Welch’s essays over the last couple of years.
In the past, I have included a disclaimer at the beginning that I am not the author, and that I do not support curve fitting of any type under almost all circumstances.
“On a personal note: This is not my hypothesis. I do not support curve fitting in general and an alternate curve fitting would not be my approach to sea level rise. I stand by my most recent opinions expressed in “Sea Level: Rise and Fall – Slowing Down to Speed Up”. Overall, my views have been more than adequately aired in my previous essay on sea levels here at WUWT.”
Note that I also don’t agree with the explanation proposed by Willis, which depends on the same sort of curve fitting and over-analysis of data in which all measurements are within the range of generalized error over nearly the entire data set. Angels on the heads of pins….
Kip, Way too much in the article to absorb in a few minutes, hours or even days. I confine myself to a couple of comments.
I plan to spend the next week or so rereading the article until I’m sure that I understand it. As usual, thanks for posting it.
Alan replying here, as Kip is my go-between who greatly helps and supports my efforts, although that’s not to say he agrees with all my work.
Agree with your point 1. It was U of C’s use of a quadratic, with outrageous use of extrapolation, that jerked me out of a restful retirement and started these studies. I think people use a quadratic because it is simple, without thinking of the consequences.
Not sure about using exponential growth (point 2). Does that imply a feedback process, and can it get out of control like polynomial extrapolation?
Agreed. That is why my fig 16 is important. If correct it shows the satellite data approaching the tidal gauge data analyses, but only after another 30 or 40 years. Can’t wait.
Finally you might want to add some of my other papers to your reading list!!
Alan,
Does exponential growth imply feedback? I haven’t the slightest idea. So maybe? The answer is beyond my pay grade I expect.
What I do know is that some stuff, maybe a lot of stuff, does grow exponentially. Moore’s Law re the number of semiconductor devices that can be created in a chip area is an example. I suspect, for example, that atmospheric CO2 is growing exponentially — but at a VERY low rate of change.
Does that mean sea level rise should be fit as an exponential curve? My guess is: possibly — but the data is so noisy that trying to extract an acceleration by any means may well be a waste of time and effort.
Thanks for your thoughts.
I am not happy with an exponential curve because firstly it will accelerate faster and faster and secondly you can’t extrapolate backwards in any meaningful way. There is a growth curve, the Gompertz Function, that does have a limit. I found it useful when studying Covid figures in 2020. Wikipedia have details of it.
Backward extrapolation can be just as important as forward. There may not be numerical data, but there is anecdotal information such as warm/cold/warm cycles. The satellite data covers too short a period to use backward extrapolation, but I have found that very long Tidal Gauge data sets, like Brest, can be investigated.
Alan
I’m not much of a mathematician but I’m pretty sure there are two (equivalent, I think) ways to extrapolate exponentials backward: R**(-T) or 1/(R**T). That seems to work. One dollar earning two percent interest for 100 years would become 1.02**100 = $7.24. $7.24 extrapolated backwards at two percent for 100 years would be $7.24*(1.02**(-100)) = 0.99936 = $1.00 after rounding.
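For what it’s worth, that arithmetic can be checked in a few lines (a minimal sketch of the dollar example above; nothing here is specific to sea level):

```python
# Forward and backward extrapolation of exponential growth:
# $1 compounding at 2% per year for 100 years, then discounted back.
rate = 1.02
years = 100

forward = 1.00 * rate ** years         # grow $1 forward 100 years
backward = forward * rate ** (-years)  # discount the result back again

print(round(forward, 2))   # roughly $7.24
print(round(backward, 2))  # recovers the original $1.00
```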
Yes, but going backwards would always be decreasing, and no previous peaks, like a previous warm period, could be reached.
I think there is growing evidence that CO2 levels in the atmosphere are exhibiting the Gompertz effect.
I think there is unlikely to be more than a single doubling of the 290 ppm pre-industrial level.
Regarding sea level, what matters is the sea level next to land. I feel perpetually aware that the earth, in its plastic state, massaged by the large gravitational bodies orbiting us, will produce spurious data, when in fact many tidal gauges will show the multi-hundred-year trends that matter where we live.
Interesting paper regardless.
Dear don k and Kip,
I am no statistician, but as far as I can gather Kip is talking about a polynomial of order 2, y = ax + bx^2 + constant, which to my recollection is a partial parabola and not an asymptotic power curve. A parabolic curve predicts a turning point, not a limit as x approaches 0.
Kip is also only using ~30 years of satellite data, when there are many tide gauge datasets that exceed 30 years.
The essential question for satellite altimeter data (and satellite data generally) is this: how can a satellite passing over the same ‘cell’ of heaving ocean every 20 days or so, from 200 or so km above the surface, ‘measure’ sea level, relative to a theoretical point at the centre of the earth (which moves around), with any degree of accuracy?
Regardless of how complex the methodology is (and it is complex), that is surely the starting-point of any investigation.
My take is that satellite data are full of humbug and not fit for the purpose of monitoring sea level rise, acceleration or anything else.
Yours sincerely,
Dr Bill Johnston
http://www.bomwatch.com.au
As I mentioned in my reply to Don K, the paper is my (Alan Welch) work and Kip has no blame for its content at all. Kip helps me load it onto WUWT, gives helpful advice, and is quite critical of what I say at times, but sees the benefit of a wide approach to this difficult topic.
In my work I am addressing Tidal Gauges as well; I have included them in previous work and will do more with them in the future. The problem is that the satellite data exists, and unless you think it is all fabricated, it is all we have to work with in conjunction with the Tidal Gauge data. Given another 40 years we will, hopefully, be in a better position to judge.
If we don’t look we won’t find and it will all be down to others to present their take on the data.
Thanks Alan, and apologies for mixing up who is who.
When fitting curves like you are doing, the fitted curve will try to ‘follow’ all the patterns in the data. This includes the 18.61-year nodal cycle and the 8.85-year cycle of lunar perigee, which shows up as a quasi 4.4-year cycle (see the paper by Ivan Haigh et al., Journal of Geophysical Research, Vol. 116, C06025, doi:10.1029/2010JC006645).
There is also this book – the Bible of tidal analysis:
Pugh, D. J. (1987), Tides, Surges and Mean Sea‐Level: A Handbook for Engineers and Scientists, 472 pp., John Wiley, Chichester, U.K. I found a copy of it online and downloaded it, years ago now. I tried to publish an analysis of the combined tide gauge dataset (US + Australia’s Pacific Sea Level Monitoring Program dataset) for Tuvalu.
After several years of struggle and back-and-forth, the Journal simply refused to publish it.
When you have a 30-year dataset, you also have a problem. Periodogram analysis of raw data (which my application automatically detrends beforehand) will also pick up the ENSO signal, which has a period of (very approximately) 5-7 years.
So you have an El Niño/La Niña curve to think about, especially as your data is centred on a massive El Niño, with La Niñas on each side.
You could detrend the data and see if you can fit a LOWESS curve to track and remove the ENSO signal – the BoM SOI is a detrended signal, but I don’t know which ENSO index you have used. Also, it would be much easier if you expressed time in month-centred DeciYears [for monthly data, Year + (month-0.5)/12] instead of years since ….
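As an aside, the month-centred DeciYear convention just above is a one-liner (the function name here is invented, purely for illustration):

```python
def deciyear(year, month):
    """Month-centred decimal year: Year + (month - 0.5) / 12."""
    return year + (month - 0.5) / 12.0

# January 1993 is centred at ~1993.0417; July 2024 at ~2024.5417.
print(round(deciyear(1993, 1), 4))
print(round(deciyear(2024, 7), 4))
```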
Then you have the possibility that the ‘trough’ corresponds to one or more of the major cycles (or not).
I organised for someone to provide tidal predictions for Tuvalu, which saved a lot of mucking around. US data started in the 1950s I think and as I recall, I stitched the data using an overlap then tested for a discontinuity.
If I seem vague, it’s because this was done at least 20 years ago. While I may have the paper and the data on an old back-up drive, there is probably not much value in trying to dig it up now.
I found there was no discernible trend. Furthermore, I compared a satellite-derived dataset from CSIRO with satellite data from U Colorado (could have been Hawaii). Although derived from the same satellites, the numbers and trends were different??
All the best,
Bill
Bill
I expect that most “polynomial” fits are done using spreadsheets like Excel. A quadratic is just an order 2 polynomial. Higher order polynomials are also available from spreadsheets and are easy to use. The second order term is “acceleration”. The third order term is the rate of change of “acceleration”, etc.
The not very obvious problem with high-order polynomials is that each additional term potentially adds another inflection point — a change of direction in the curve. I’m pretty sure that the result of using high-order polynomials and then grabbing the 2nd — “acceleration” — term is that fits within the data look increasingly good with more terms, but the slopes at the ends of the data are increasingly weighted toward the initial and final few points. You end up with great fits that are likely to be lousy for prediction outside the range.
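That behaviour is easy to demonstrate. The sketch below (pure Python; the `polyfit` helper is a toy normal-equations solver, not Excel’s or NumPy’s) fits a short window of a slow sine — data with no genuine acceleration in it — and shows the t² “acceleration” coefficient changing substantially with the polynomial order chosen:

```python
import math

def polyfit(xs, ys, degree):
    """Toy least-squares polynomial fit via the normal equations
    with Gaussian elimination -- for illustration only."""
    n = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n                         # back substitution
    for i in reversed(range(n)):
        tail = sum(A[i][j] * coeffs[j] for j in range(i + 1, n))
        coeffs[i] = (b[i] - tail) / A[i][i]
    return coeffs                              # coeffs[k] multiplies x**k

# A short window of a slow sine: like ~30 years of a much longer cycle,
# so there is no real acceleration in the underlying process.
xs = [i / 10.0 for i in range(31)]
ys = [math.sin(2 * math.pi * x / 12.0) for x in xs]

quad = polyfit(xs, ys, 2)
quartic = polyfit(xs, ys, 4)
print(quad[2], quartic[2])  # the t^2 term depends heavily on the order used
```

The point is not the particular numbers but that the second-order coefficient is an artefact of the basis chosen, not a physical acceleration.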
====
I know that the radar altimetry satellites use a signal with a beamwidth of about 10km. My understanding is that the return signal is sort of a complicated sum of reflections from myriad points on the surface where the surface happens to be perpendicular to the radar beam. I infer that the satellite instrumentation looks at the return signal as it rises from zero and decides the (probable) surface level based on some criteria that may or may not be valid. I can’t think of any way to check. The claim is that individual measurements are accurate to about +/- 2cm. The back of my envelope here says they probably get a million or so readings a day from most of the ice free surface of the Earth. That suggests that their daily summary values could possibly be accurate to around +/- 0.02mm.
A lot of assumptions there.
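The 1/√N arithmetic behind that envelope is one line (assuming, generously, that the individual measurement errors really are independent and random):

```python
import math

# If each individual altimeter reading were good to +/- 20 mm (2 cm) and
# the errors were independent, the standard error of the mean of N
# readings shrinks as 1/sqrt(N).
per_reading_mm = 20.0
n_readings = 1_000_000      # rough order of a day's worth of readings

standard_error_mm = per_reading_mm / math.sqrt(n_readings)
print(standard_error_mm)    # 0.02 mm, but only if the errors are independent
```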
On the other hand. Tide gauges have a bunch of problems as well. They can be damaged by ships. They may move around the port with scant documentation as facilities change. The long record ones predate any serious concern about sea level changes. They were presumably put there to tell ships roughly how much water they had under their keel. Maintenance activities many decades ago may well have offset their readings a bit. Maybe multiple times.
And I haven’t said anything about local tectonics which is another concern.
I vaguely recall that the rate of change of acceleration is labeled “jerk.”
And ,what about average wind speeds ?
If speeds are higher , wave/swell heights go up …….
😉
Sweet Old Bob
As winds increase, the crests will indeed get higher. But the troughs will get lower. MAYBE that averages out. On the other hand it doesn’t take all that much wind before the wind starts blowing the tops off swells causing breaking waves. The waves are no longer sinusoids. Perhaps that can be dealt with rationally. Or maybe not. I’ve never come across an explanation of how that is handled. I admit to a certain amount of curiosity.
I get that Don k, which is why prediction CIs become impossibly wide the further-out is the prediction. Also, the more coefficients the higher is R^2, even though beyond order-2, they may be meaningless.
The cyclic nature of SL data (+ SOI) also means that data are highly autocorrelated. A thought that crossed my mind for a short dataset, is to calculate first- or second-differences as a starter, check for autocorrelation, and see if there a trend – is there a tendency for period-to-period differences to increase with time?
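A sketch of that diagnostic on synthetic data (my own toy construction: a linear trend plus an 18.61-year cycle). Note that for a smooth cycle, the first differences remain strongly autocorrelated, which is part of the problem:

```python
import math

def lag1_autocorr(series):
    """Lag-1 autocorrelation coefficient of a series."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in series)
    return num / den

def first_diff(series):
    return [b - a for a, b in zip(series, series[1:])]

# Synthetic monthly "sea level": 3.3 mm/yr trend + an 18.61-year cycle.
t = [i / 12.0 for i in range(360)]  # 30 years of monthly points
level = [3.3 * x + 20.0 * math.sin(2 * math.pi * x / 18.61) for x in t]

print(lag1_autocorr(level))              # very close to 1: raw data
print(lag1_autocorr(first_diff(level)))  # still close to 1: differencing a
                                         # smooth cycle leaves a smooth cycle
```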
Cheers,
Bill
Dr. Johnston ==> Credit where credit is due — the essay (and the earlier efforts along this line) are all the work of Dr. Alan Welch.
As Dr. Welch points out in several of his replies here today, I do not agree with much of this type of curve fitting nor the entire subject of “sea level rise accelerations”.
To see my view on sea level rise, you can read any or all of the links produced by this Google search.
I know, I messed up; see above: https://wattsupwiththat.com/2025/06/06/measuring-and-analysing-sea-levels-using-satellites-during-2024/#comment-4080937
Cheers,
b.
CO2 increases on a near-linear basis. Backradiation shows a log decay with an increase in concentration. There is no physical reason why increasing CO2 would cause an acceleration of either temperature or sea level. There must be something else causing the change. Also, the oceans are warming, and visible radiation warms the oceans, not IR backradiation.
Well done Kip!
Thanks for the praise but I think I contributed as well!!!!!!
Also thanks for giving me access to this site to present my work.
I originally tried PNAS but at peer review stage I was thrown out – wonder who the peer reviewer was.
Being a long retired OAP in a rural area of the UK with no connection with any group it is difficult to publish. I tried ArXiv and ResearchGate but they don’t want odd folk like me!!
Now you know how Elon feels 🙂
Measuring sea level is difficult since the signal is buried in many much larger signals with a wide spectrum of frequencies. However, if the goal is to determine whether or not you will drown, surely the question is whether land is being inundated or exposed. That measurement is NOT difficult. Take the poster child of sea level rise drowning an entire nation, Bangladesh, but also ocean atolls (https://doi.org/10.1002/wcc.557), and coastal areas around the globe. Over the past 40 years of sea level rise, whatever the tiny number may be and whether it is accelerating positively or negatively, a net increase in coastal land area (DOI: 10.1038/nclimate3111) of more than 13,000 km2 has occurred, 1000 km2 in Bangladesh alone due to river sediments (https://www.geospatialworld.net/news/bangladesh-gaining-land-not-losing-scientists/). Erosion beats sea level rise, hands down. Island atolls are constructed from coral and nourish themselves; 9 of 10 are growing or stable. The measurements are both accurate and precise, and require no extrapolations or curve fitting.
Every time I read numbers like these calculated from satellites, I simply don’t believe the ridiculous confidence intervals claimed. There is no physically possible way to get measurements accurate to 0.01 mm from these satellites. Even the GPS satellites don’t claim that level of accuracy (GPS accuracy is measured in feet). Just a few of the issues that make the 0.01 mm accuracy impossible: all the satellites are falling toward the Earth, all at different rates, and all the rates vary all the time due to atmospheric conditions. The satellites use laser for measurements; at ranges of hundreds of thousands of feet, the change in the speed of light in atmosphere versus vacuum creates additional error. Then there is the fact that the measurement of sea level is corrected using the humidity level: what is the difference in humidity 1 mm above sea level and at sea level, especially considering waves?
I have no doubt that there are additional confounding issues that I am not aware of. I also know technology improves every year. But the claim of accuracy to hundredths of millimeters is ridiculous, a tenth of a millimeter seems very unlikely, and 1 mm seems barely possible. I welcome additional thoughts, and would be glad to hear any other comments on this issue, as I am just a reasonably knowledgeable amateur when it comes to orbital mechanics and the other parts of physics involved.
Thanks for your comments.
There is a lot of what you say that I agree with. The total process from readings to analysis contains many phases.
The satellites take readings. Somewhere and somehow these are processed and issued as a much reduced dataset. These data can then be analyzed. And finally theories can be formulated.
I come in at the 3rd stage although some amount of theorizing creeps in.
With respect to measurement, each 10 days about 400,000 readings are taken, and after (I am not sure what) processing, about 3/4 of these are used to create a single result, to which other filters are applied to remove seasonal etc. effects. Then every 6 sets are averaged to give a 60-day moving average. It’s all down to large numbers of readings, which may or may not be acceptable. I believe Kip Hansen may have written on this.
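A sketch of the kind of smoothing described (the numbers are invented; only the 6-set trailing average reflects the description above):

```python
def moving_average(values, window=6):
    """Trailing mean over `window` consecutive entries; six 10-day
    values give a ~60-day smoother, as described above."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Toy 10-day sea level values in mm (invented, purely for illustration).
ten_day = [10.0, 12.0, 9.0, 11.0, 13.0, 11.0, 14.0, 12.0]
print(moving_average(ten_day))  # three overlapping 60-day averages
```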
The final data file(s) are issued at irregular intervals on sites such as https://climate.nasa.gov/vital-signs/sea-level/ or https://www.star.nesdis.noaa.gov/socd/lsa/SeaLevelRise/LSA_SLR_timeseries.php. For unknown reasons, occasionally some historical data gets altered, even after 30 years. Why?
Plenty of scope for jiggery-pokery to go on, or even for a thought of conspiracy theory to raise its head.
But having said all that, I come in at the analysis stage. In 2018 the Univ of Colorado issued their paper that sent all the Climate Crisis press (BBC, Guardian) into a twist, got Swedish schoolgirls truanting, normally reasonable people damaging paintings, or blocking major motorways. I thought there must be a better analysis approach, and so my series of papers came about.
Up to now it has mainly concerned the NASA dataset but I have 2 other papers underway. The first looks at the NOAA data for 24 various sub-areas such as the Atlantic Ocean or the Bering Sea. The second looks at the Tidal Gauge Data, a vast data set found on
https://sealevel.info/MSL_global_thumbnails5.html.
One early finding for the NOAA data for the Indian Ocean region is that any Decadal Oscillations seem to be on a shorter (11 to 15 year) time scale. Hence calculating how the “acceleration” changes with time (as in my Fig 16) results in a much fuller picture that compares well with my prediction in Figure 16 up to 2065. Not sure if this has any significance; it is out of my knowledge zone. Papers on decadal oscillations tend to concentrate on the 60ish-year period, so the satellite era is a bit short for a fuller picture.
All a bit iffy, but I think it is worth the effort to analyse the data in various ways and present the findings. Others can then pen their opinions.
dbakerber
If you’re still monitoring this thread. A few minor points.
The satellite altimeters use radar, not lasers. Why? I expect because aiming a very tight beam from a slightly unstable platform and getting a detectable signal back from a variably tilted and constantly moving target from 1300km away would be too difficult.
The later instruments use dual frequency radars in order to compute atmospheric delays.
GPS is presumably much more accurate if one is in space and using GPS signals that don’t pass through (significant) atmosphere. Also satellite positions over time are highly correlated, subject to known physical laws, and, conceptually at least knowable with great precision. They also use DORIS a sort of backwards GPS with fixed transmitters on the ground and a receiver on the satellite.
There are “handbooks” for the missions with extensive discussions of error sources, and data handling. I, at least, am impressed with the level of detail. The Jason-3 handbook is at https://www.ncei.noaa.gov/sites/default/files/2021-01/Jason-3%20Products%20Handbook.pdf There are similar documents for earlier instruments.
CAVEAT: My impression is that the claimed 2mm measurement accuracy is a political goal used to sell the project before it was funded. Whether they achieved it or not I have to be impressed with their efforts.
From the above article:
First,
Followed by,
TILT! If one believed there was a true sinusoidal-type causation underlying the variation in a given data set (such as SLR data over time), then it would be more proper to derive a sinusoidal function to fit to the data, instead of a non-periodic curve fit such as a quadratic equation.
Put most simply, sin(x) cannot be correctly expressed as a quadratic equation of variable x.
The sinusoidal fit(s), of one or more simultaneous frequencies, are easily obtained by performing Fourier analysis of the data set. Now, I see that you actually did such for your Figures 3 and 4 in the above article, yielding sinusoidal periods of 26 and 29 years, respectively, and then “somewhat” in Figures 5, 6, 8, 12 and 13.
Since the above article started with the sentence:
I think there is reason to question that a fundamental sinusoidal oscillation with a period of 26–29 years or so can be used to credibly “analyze” a small portion, representing only 3–4% of a full cycle, much less analyzing such by a quadratic curve fit approximation, and even less so considering the non-periodic nature of perturbations such as El Niños and La Niñas.
I think you have misunderstood much of my work.
The quadratic curve is what the likes of the University of Colorado are using, and I totally reject it. My curves in Figs 3 and 4 are not from Fourier analysis but are each a single-period curve obtained by a convergence method I developed to minimise the square root of the sum of the squares of the errors, backed up by a spectral analysis. In your remarks, what does “somewhat” imply? The paper presents the analyses of the sea level data for 2024 and covers the additional data during that year. If you look at the graphs, they all cover 1993 to Jan 2025, the full period. My previous paper covered the additional data for 2023. It would be crazy to analyse one year of data in isolation. Ideally we need 60 or more years, but we have to make do with the 32 years we have.
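In outline, a single-period convergence of this kind works as follows: for each trial period the amplitude and phase fit is ordinary linear least squares, so the search reduces to finding the period that minimises the residual sum of squares. A toy sketch on entirely synthetic data (not the actual code behind the figures):

```python
import math

def sinusoid_rss(ts, ys, period):
    """For a fixed trial period, fit y = b*sin(wt) + c*cos(wt) by linear
    least squares (closed-form 2x2 normal equations) and return the
    residual sum of squares."""
    w = 2 * math.pi / period
    s = [math.sin(w * t) for t in ts]
    c = [math.cos(w * t) for t in ts]
    sss = sum(v * v for v in s)
    scc = sum(v * v for v in c)
    ssc = sum(u * v for u, v in zip(s, c))
    ssy = sum(u * y for u, y in zip(s, ys))
    scy = sum(u * y for u, y in zip(c, ys))
    det = sss * scc - ssc * ssc
    b = (ssy * scc - scy * ssc) / det
    c2 = (scy * sss - ssy * ssc) / det
    return sum((y - b * si - c2 * ci) ** 2 for y, si, ci in zip(ys, s, c))

# Synthetic de-trended "sea level": a 29-year cycle plus a small wiggle.
ts = [1993 + i / 12.0 for i in range(384)]  # 32 years of monthly points
ys = [5.0 * math.sin(2 * math.pi * (t - 1993) / 29.0) + 0.3 * math.sin(t)
      for t in ts]

# Scan trial periods and keep the one with the smallest RSS; a real
# implementation would then refine iteratively around the minimum.
best = min(range(20, 41), key=lambda p: sinusoid_rss(ts, ys, p))
print(best)  # the scan recovers the 29-year period
```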
Thanks for contributing, but I feel you have not added anything to the debate.
Indeed, perhaps I have misunderstood your work. If so, I apologize.
I’ll just note in my defense this statement of yours just above your Figure 16:
“Figure 16 shows the “accelerations” as at the end of Jan 2025 with the “accelerations” calculated using quadratic curve fitting to a set of data obtained from a 29-year curve and extended to 2065.”
(my bold emphasis added)
Thanks for the reply.
When I have used quadratic curves, it is to replicate the work that groups like U of C have carried out. That leads to what they imply is an acceleration but what I call “acceleration”, as it is an outcome of the method used and not physically meaningful.
Using a sinusoidal curve and repeating the process produces another, smoother curve. Plotting them together will, I hope, be useful for making judgements between the 2 approaches, but the process is very slow and needs to stretch to 2060(ish), although by 2035 we should have a good feeling about it. As the song says, “Time goes by so slowly”.
Something that would be really interesting to see again is an analysis of how pumped groundwater is affecting sea level. All over the world, including in the US, water is pumped from underground for agricultural purposes. Most of this underground water is trapped in geological formations (the Ogallala Aquifer, e.g.) so it is in effect new water into the system. How much of that was pumped in the 20th century, and how has that affected sea level? It seems like I saw one paper on the subject several years ago; it would be worthwhile to see it revisited.
Dr. Welch, thanks for an interesting article. However, I fear you’re not aware that the claimed “acceleration” of the satellite sea level data is nothing more than an incorrect splice of the various satellites. See my post here. This is the money graph from that post.
As you can see, the first two satellites show no acceleration, nor do the last three … but when they are wrongly spliced together, presto, acceleration.
Best to you and yours,
w.
I am aware of this graph.
I am doing some more processing along these lines but need some time to bring it together.
As this discussion is nearly 4 days old and coming to a natural end, could I contact you by email directly?
I am on alankwelch@gmail.com
Alan
Of course. Check your email.
Best,
w.