Hi all. First post here. I assure you this is not homework; I'm a retiree following my curiosity. Start with 2048 consecutive daily numbers. I ran the Fourier analysis tool in Gnumeric (an Excel clone for Linux) and got 2 columns (real + imaginary) of 2048 numbers each.
The program manual told me how to run the analysis. The next question is: what do the output numbers mean?
Another option... is there a canned program somewhere (free; runs under Linux) that can find periodicities in data sets?
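To make the question concrete, here is roughly the NumPy equivalent of what I ran in Gnumeric (a sketch only; "daily_flux.txt" is a hypothetical one-value-per-line file standing in for my real data):

```python
# Rough NumPy stand-in for the Gnumeric Fourier analysis tool.
# "daily_flux.txt" is a hypothetical file: 2048 consecutive daily values, one per line.
import numpy as np

x = np.loadtxt("daily_flux.txt")   # 2048 daily numbers
X = np.fft.fft(x)                  # complex DFT, same length as the input

# These two arrays should correspond to the two output columns
# (real + imaginary) that Gnumeric produced.
real_col = X.real
imag_col = X.imag
print(real_col[:5], imag_col[:5])
```

What I can't tell from the manual is how to turn row k of those two columns into a period in days.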
Offline
Hi knighthawk;
Welcome to the forum. You are trying to find a periodic relationship in the data? Daily, weekly, monthly, what? I am asking what you think the period is, because then you can post, say, 3 or 4 periods and I can check it.
What I would do first is graph the data. Your own eyes are a great way to spot patterns. If the data looks polynomial, then a least-squares fit is done and the residuals are analysed. If the data looks periodic, then a discrete Fourier fit can be attempted.
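Something like this is what I mean, as a rough sketch with made-up numbers rather than your data:

```python
# Rough sketch of "fit a trend, then look at the residuals", using
# made-up data: a linear drift plus a 50-day wave plus noise.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(2048)                                  # day index
y = 100 + 0.02 * t + 8 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 3, t.size)

coeffs = np.polyfit(t, y, deg=1)                     # least-squares line (pick the degree by eye)
trend = np.polyval(coeffs, t)
residuals = y - trend                                # what is left after removing the trend

print("residual standard deviation:", residuals.std())
```

If the residuals are just noise, you are done; if they still wave up and down, that is where the Fourier fit comes in.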
In mathematics, you don't understand things. You just get used to them.
If it ain't broke, fix it until it is.
Always satisfy the Prime Directive of getting the right answer above all else.
Offline
What I would do first is graph the data. Your own eyes are a great way to spot patterns. If the data looks polynomial, then a least-squares fit is done and the residuals are analysed. If the data looks periodic, then a discrete Fourier fit can be attempted.
Actually, it was after plotting the data that I noticed what look like periodicities. That's what started me looking for an objective analysis, versus simply eyeballing it.
Here's the background... I've been following the 10.7 cm solar flux data from Penticton. Unlike sunspots, where humans estimate the area of the spots, the 10.7 cm flux is an objective measurement. Furthermore, this measurement is apparently immune to the Livingston-Penn effect, but that's a whole other off-topic item.
There are 3 flux readings taken per day, and I run a script under Linux that reads the downloaded file and calculates the mean value for each day. I still have to do some manual cleanup (a rough sketch of what I mean follows the list)...
1) There are a few 1-day gaps, where the data is missing. I insert the average of the 2 surrounding days.
2) An occasional freak class-X flare will send one observation's value through the roof. I throw that one out, and use the average of the other 2 daily observations.
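Roughly, the cleanup amounts to something like this in Python/pandas (a sketch only; the made-up observations and the 25% outlier rule are just stand-ins for what I actually do in the spreadsheet):

```python
# Sketch of the manual cleanup, with made-up observations standing in
# for the real file: 3 readings per day, one flare spike, one missing day.
import pandas as pd

obs = pd.DataFrame({
    "date": ["2011-01-01"] * 3 + ["2011-01-02"] * 3 + ["2011-01-04"] * 3,
    "flux": [80.1, 80.5, 79.9,  81.0, 350.0, 80.8,  82.2, 82.0, 81.9],
})
obs["date"] = pd.to_datetime(obs["date"])

# Item 2: drop a flare-spiked reading (here: anything more than 25% above
# that day's median, which is just a guess at a rule) and average the rest.
def daily_mean(readings):
    return readings[readings <= 1.25 * readings.median()].mean()

daily = obs.groupby("date")["flux"].apply(daily_mean)

# Item 1: fill a 1-day gap with the average of the two surrounding days
# (linear interpolation over a single missing day is the same thing).
daily = daily.asfreq("D").interpolate(limit=1)

print(daily)
```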
I plot the data in an Excel-compatible spreadsheet. The most obvious periodicity to look for was approximately 27 or 28 days. That's a "Carrington rotation", which, in plain English, is how long it takes the sun to complete one rotation as viewed from Earth. I.e. a noisy sunspot that was pointing at us goes around and is pointing at us again. That didn't help as much as I had hoped.
Despite how noisy the data is, I noticed pronounced spikes approximately a month and a half apart; 47 days, as near as I can tell. Plotting the 47-day running mean greatly smoothed out the data and showed yet another cycle, of approximately 7 and a half months.
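For what it's worth, the smoothing step looks roughly like this in NumPy (made-up numbers again, with a 47-day and a ~225-day wave standing in for the real flux):

```python
# Sketch of the 47-day running mean on made-up data containing a 47-day
# and a ~225-day component plus noise.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(2048)
flux = (100 + 8 * np.sin(2 * np.pi * t / 47)
            + 5 * np.sin(2 * np.pi * t / 225)
            + rng.normal(0, 6, t.size))

window = 47
smoothed = np.convolve(flux, np.ones(window) / window, mode="valid")

print("raw std:", flux.std(), "smoothed std:", smoothed.std())
```

Averaging over exactly one 47-day cycle cancels that cycle almost completely, which I assume is why the longer wave stands out in the smoothed curve.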
To further confuse the issue, the sun was in a solar minimum from mid 2007 through the end of 2009, so the periodicity would be missing during that time.
BTW, I was going to post explanatory links, but apparently new members can't post URLs.
Offline
Hi;
Tell me what the sites are and I will post them for you.
Offline
Sorry, I don't see any links for emailing you directly. Did you mean to cut up the URLs or leave off the http?
Offline
hi knighthawk
Welcome to the forum.
Just do this and bobbym can find it and put the link in too. (cut off as much as needed or split up to fool the robot)
http://www.mathisfunforum.com/viewtopic.php?id=17828
Bob
Children are not defined by school ...........The Fonz
You cannot teach a man anything; you can only help him find it within himself..........Galileo Galilei
Sometimes I deliberately make mistakes, just to test you! …………….Bob
Offline
It looks like ftp links aren't filtered. That's the data, which is the important part.
The solar flux data can be pulled from
ftp://ftp.geolab.nrcan.gc.ca/data/solar_flux/daily_flux_values/fluxtable.txt
The column you want is "fluxadjflux"
For earlier data, browse through
ftp://ftp.geolab.nrcan.gc.ca/data/solar_flux/daily_flux_values/
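If it helps, here is roughly how I turn that file into one value per day (a sketch only; I'm assuming whitespace-separated columns named in a header row, so adjust the parsing to match the actual file):

```python
# Sketch: read fluxtable.txt, keep the "fluxadjflux" column, and average
# the (usually 3) observations per day.  Parsing details are assumptions.
import pandas as pd

df = pd.read_csv("fluxtable.txt", sep=r"\s+")

# Drop any decorative separator rows that don't parse as numbers.
df["fluxadjflux"] = pd.to_numeric(df["fluxadjflux"], errors="coerce")
df = df.dropna(subset=["fluxadjflux"])

# One mean value per calendar day ("fluxdate" is formatted YYYYMMDD).
daily = df.groupby("fluxdate")["fluxadjflux"].mean()
print(daily.tail())
```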
Offline
Hi knighthawk;
Thanks, I have the .txt file.
Offline
Is there a way I can send you a spreadsheet or a CSV data file? As I mentioned, there is some cleanup required of the raw data. I've already done that.
Wild hunch time. Is the "7 and 1/2 months" cycle actually 224.7 days? I'm almost hoping that it's not.
Offline
Hi;
First, let me look at the raw data that I have.
Offline
Hi;
Here is my plot of the first 1000 pieces of raw data.
Where do you go from here?
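For the record, the plot is nothing fancier than this (assuming the raw flux values have been dumped to a one-value-per-line text file, which is a guess at your setup):

```python
# Plot of the first 1000 raw flux values against their index.
import numpy as np
import matplotlib.pyplot as plt

raw = np.loadtxt("flux_raw.txt")       # hypothetical one-value-per-line dump
plt.plot(raw[:1000])
plt.xlabel("observation number")
plt.ylabel("10.7 cm flux")
plt.show()
```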
Offline
Hi;
Here is my plot of the first 1000 pieces of raw data.
Where do you go from here?
I don't think we're on the same page. Is that the first 1000 points or the first 1000 days of ftp://ftp.geolab.nrcan.gc.ca/data/solar_flux/daily_flux_values/fluxtable.txt ? It doesn't look familiar. Remember what I said earlier about multiple observations per day: the leftmost column "fluxdate" is formatted "YYYYMMDD", e.g. "20041028" is 2004/10/28, and I take the average value of all the observations for each day.
My original question was about the meaning of the Fourier analysis output when taking a bunch of data and analyzing it as per (replace "fff" with "www" to get past the filter)
fff.brainmapping.org/NITP/PNA/tests/ProblemSet3_files/FourierExcel.htm
I originally fed in the most recent 2048 days; this analysis needs the number of data points to be a power of 2.
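To make that concrete, here is roughly the NumPy equivalent of what I fed into that worksheet: take the most recent 2048 daily means and read off the strongest periods, using the fact that bin k of an N-point transform corresponds to a period of N/k days. ("daily_means.txt" is a hypothetical one-value-per-line file of my cleaned daily averages.)

```python
# Rough NumPy equivalent of the Fourier worksheet, applied to the most
# recent 2048 daily means and reported as periods in days.
import numpy as np

x = np.loadtxt("daily_means.txt")[-2048:]   # the worksheet recipe wants a power-of-2 length
x = x - x.mean()                            # remove the mean so bin 0 doesn't swamp everything
X = np.fft.rfft(x)
power = np.abs(X) ** 2

N = 2048
strongest = np.argsort(power[1:])[::-1][:5] + 1   # the 5 strongest non-zero bins
for k in strongest:
    print(f"bin {k}: period ~ {N / k:.1f} days")
```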
Offline
Okay, I just plugged in the first 1000 pieces of data. If you want to send me the CSV that you have, I will contact you at your email address.
Offline