call system ('wc -l ' // LoadFile // ' > trashme-loadcts.txt') ! get number of lines

For those who don't do programming: this counts the number of lines in the file. It does so by getting an external (Unix) utility to do the counting, making it store the result in a temporary file (trashme-loadcts.txt), and then reading that file back to learn how many lines there are before deleting it. The count is then used as the test for whether the file is finished in the subsequent line-reading loop.
read (3,"(i8)"), NLine
call system ('rm trashme-loadcts.txt')
XLine=0

There are a bunch of related no-nos here to do with bounds checking and trusting input - including the interesting point that the temporary file has the same name for every user, meaning that if, by some mischance, two people were running loadmikeh at the same time on different files in the same place, one could get the wrong file length (or crash because the file had been deleted before it could be read). Now I'm fairly sure that this process is basically single-user and that the input files are going to be correct when this is run but, as Harry discovers in chapter 8, sometimes it isn't clear what the right input file is:
read (2,"(i7,49x,2i4)"), FileBox,Beg,End
XExe=mod(FileBox,NExe) ; XWye=nint(real(FileBox-XExe)/NExe)+1
read (2,"(4x,12i5)"), (LineData(XMonth),XMonth=1,NMonth)
Data(XYear,1:12,XBox) = LineData(1:12)
if (XLine.GE.NLine) exit
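If I'm reading the mod/nint pair above correctly, it decodes a linear box number (FileBox) into a grid column and row, with NExe boxes per row. A Python transcription of that reading - my interpretation, since the meaning of FileBox isn't documented in the source:

```python
def box_to_grid(filebox, nexe):
    """Split a linear box id into (column, row), assuming nexe boxes per
    row and 1-based rows, mirroring the Fortran mod/nint pair above."""
    xexe = filebox % nexe            # XExe = mod(FileBox, NExe)
    xwye = (filebox - xexe) // nexe + 1  # XWye = (FileBox - XExe)/NExe + 1
    return xexe, xwye
```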
Had a hunt and found an identically-named temperature database file which
did include normals lines at the start of every station. How handy - naming
two different files with exactly the same name and relying on their location
to differentiate! Aaarrgghh!! Re-ran anomdtb:
"cat " + ptsfile + " | wc -l"

This would almost merit an entry at thedailywtf.com but for the fact that it is far from the worst 'feature' of this particular program. This file has some other features that get mentioned in chapter 18, and subsequently in chapter 20, where our intrepid hero encounters a whole bunch of poorly documented required files squirrelled away in Mark New's old directory (good thing it wasn't his new one, with the news of nude photos in it :) )
AGREED APPROACH for cloud (5 Oct 06).
For 1901 to 1995 - stay with published data. No clear way to replicate
process as undocumented.
For 1996 to 2002:
1. convert sun database to pseudo-cloud using the f77 programs;
2. anomalise wrt 96-00 with anomdtb.f;
3. grid using quick_interp_tdm.pro (which will use 6190 norms);
4. calculate (mean9600 - mean6190) for monthly grids, using the
published cru_ts_2.0 cloud data;
5. add to gridded data from step 3.
This should approximate the correction needed.
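A reading of steps 4 and 5 above as plain arithmetic, sketched in Python with toy arrays (the dict-of-grids layout and the array shapes are my assumptions for illustration, not Harry's code):

```python
import numpy as np

def normals_offset(published):
    """Step 4: (mean over 1996-2000) minus (mean over 1961-1990), per
    month and grid cell, from the published cloud grids.
    `published` maps year -> (12, ncells) array."""
    m9600 = np.mean([published[y] for y in range(1996, 2001)], axis=0)
    m6190 = np.mean([published[y] for y in range(1961, 1991)], axis=0)
    return m9600 - m6190

# toy data: 2 cells, constant fields, so the offset is easy to eyeball
published = {y: np.full((12, 2), 50.0) for y in range(1961, 1991)}
published.update({y: np.full((12, 2), 53.0) for y in range(1996, 2001)})

offset = normals_offset(published)   # 3.0 everywhere in this toy case
gridded = np.full((12, 2), 1.5)      # step 3 output, anomalies wrt 96-00
corrected = gridded + offset         # step 5: re-express wrt 61-90 norms
```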
On we go.. firstly, examined the spc database.. seems to be in % x10.
Looked at published data.. cloud is in % x10, too.
First problem: there is no program to convert sun percentage to
cloud percentage. I can do sun percentage to cloud oktas or sun hours
to cloud percentage! So what the hell did Tim do?!! [As I keep asking;
but I digress]
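For what it's worth, the two conversions Harry says he does have could in principle be chained through oktas. A purely hypothetical sketch - the linear sun/cloud complement below is my assumption, not a CRU formula:

```python
def sun_pct_to_cloud_oktas(sun_pct):
    # ASSUMED linear complement: full sun -> 0 oktas, no sun -> 8 oktas
    return 8.0 * (1.0 - sun_pct / 100.0)

def oktas_to_cloud_pct(oktas):
    # oktas are eighths of sky cover, so this step is just a unit change
    return 100.0 * oktas / 8.0

def sun_pct_to_cloud_pct(sun_pct):
    """Chain the two conversions to get the missing sun% -> cloud% step."""
    return oktas_to_cloud_pct(sun_pct_to_cloud_oktas(sun_pct))
```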
Then - comparing the two candidate spc databases:
I find that they are broadly similar, except the normals lines (which
both start with '6190') are very different. I was expecting that maybe
the latter contained 94-00 normals, what I wasn't expecting was that
they are in % x10 not %! Unbelievable - even here the conventions have
not been followed. It's botch after botch after botch. Modified the
conversion program to process either kind of normals line.
What this is doing is finding out whether a station may influence a given grid point (i.e. how close they are). It does this not by means of basic trig functions but by creating a virtual graphic, drawing circles on it, and seeing whether they overlap! There are a couple of problems here. Firstly, as the next few lines report, IDL sometimes doesn't like drawing virtual circles and throws an error.

; map all points with influence radii - that is with decay distance [IDL variable is decay]
; this is done in the virtual Z device, and essentially finds all points on a 2.5 deg grid that
; fall outside station influence
while dummymax gt -9999 do begin
if keyword_set(test) eq 0 then begin
set_plot,'Z' ; set plot window to "virtual"
erase,255 ; clear current plot buffer, set background to white
endif else begin
for i=0.0,nel do begin
x=x<179.9 & x=x>(-179.9)
y=y>(-89.9) & y=y<89.9
catch,error_value ; avoids a bug in IDL that throws out an occasional
; plot error in virtual window
if error_value ne 0 then begin

When that happens we just skate over the point and (presumably) therefore assume it has no influence - which is, um, very reassuring for people trying to reproduce the algorithm in a more rational manner.
..well that was, erhhh.. 'interesting'. The IDL gridding program calculates whether or not a
station contributes to a cell, using.. graphics. Yes, it plots the station sphere of influence then
checks for the colour white in the output. So there is no guarantee that the station number files,
which are produced *independently* by anomdtb, will reflect what actually happened!!
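That "more rational manner" would be an explicit distance test. A sketch using the standard haversine great-circle formula (an illustration only, not the CRU code - the station list and decay-distance parameter names are mine):

```python
import math

def gc_dist_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance between two points via haversine (km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def influenced(cell_lat, cell_lon, stations, decay_km):
    """True if any (lat, lon) station lies within decay_km of the cell -
    no virtual graphics device, no plotting, no colour test."""
    return any(gc_dist_km(cell_lat, cell_lon, s_lat, s_lon) <= decay_km
               for s_lat, s_lon in stations)
```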
Fortunately our hero is able to fix this (although it isn't at all clear to me how the new process is integrated with the old one).
Well I've just spent 24 hours trying to get Great Circle Distance calculations working in Fortran,
with precisely no success. I've tried the simple method (as used in Tim O's geodist.pro), and the
more complex and accurate method found elsewhere (wiki and other places). Neither gives me results
that are anything near reality. FFS.
Worked out an algorithm from scratch. It seems to give better answers than the others, so we'll go