British Cave Monitoring Centre (Home)

Freshen the Data

This is an outline page - still to be formatted and expanded

View Uploaded Files & Technical: View Logs.

The logger data is uploaded automatically, on a schedule (details below). You can trigger a manual upload by clicking one of the buttons below.

Note: 15-Jun-2024 – Due to a slow broadband link, the uploads from Pooles Cavern are taking a long time, and the BCRA server is timing out before all of the files have been transferred. The slow link is most likely caused by an electrical fault on the phone line somewhere between Pooles Cavern and the BT cabinet. Some work-arounds have been implemented, as described below.

  1. Both PHP and curl now have large execution-time limits. In freshen.php I have set $time_limit = 5000; and in the cron job I have set --max-time 5000 ; 5000 s is about 83 minutes.
  2. The code in freshen.php has been altered so that only a quarter of the files are handled in any one operation. The cron job now executes every six hours, at 01, 07, 13 and 19 GMT, and each execution handles a different set of files. Additionally, the freshen_log file is now a set of four files, and the system error output goes to a second set of four log files.
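The mapping from run time to file set is not stated explicitly above, but the four run hours (01, 07, 13, 19 GMT) suggest a simple derivation. The sketch below is an assumption about how a chunk number could be computed from the hour; chunk_for_hour is an illustrative helper, not part of freshen.php.

```shell
# Hypothetical: map the cron run hours 1, 7, 13, 19 (GMT) to chunks 1..4.
# The real freshen.php may select its file set differently.
chunk_for_hour() {
  # hours 1,7,13,19 -> (h+5)/6 -> chunks 1,2,3,4 (integer division)
  echo $(( ($1 + 5) / 6 ))
}
```

For example, `chunk_for_hour 13` yields chunk 3, matching the 13:13 GMT cron entry below.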

Because the code was written in a hurry, and because debugging it was difficult, it uses what might be considered a rather convoluted procedure, as follows...

  • An incremental upload of the CSV data files using TinyTag's 'restart' option.
  • A full upload of the 'raw' data files
  • For identification, the CSV files are named using the restart sequence number
  • A list of refreshes is available in the *.restart.txp files
  • See notes above, which describe how the files that are processed depend on the time of day that the program is run

Note that the 'fractured' nature of the incremental CSV files may make them difficult to use.
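One way to make the fractured incremental files usable is to stitch them back together in restart-sequence order. This is a minimal sketch only; the file names here are illustrative (the real files are named with the TinyTag restart sequence number, as noted above).

```shell
# Hypothetical reassembly of incremental CSV chunks. Create three
# dummy chunk files named with zero-padded sequence numbers, then
# concatenate them; zero-padding means plain lexicographic glob order
# is also numeric order, so no explicit sort is needed.
d=$(mktemp -d)
printf 'row-a\n' > "$d/logger.0001.csv"
printf 'row-b\n' > "$d/logger.0002.csv"
printf 'row-c\n' > "$d/logger.0010.csv"
cat "$d"/logger.*.csv > "$d/combined.csv"
```

If the real sequence numbers are not zero-padded, a numeric sort on the sequence field would be needed before concatenating.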

Technical note: this runs freshen.php

  • A full upload of the CSV data files
  • A full upload of the 'raw' data files
  • See notes above, which describe how the files that are processed depend on the time of day that the program is run

Note that *.csv files will be overwritten, but the individual 'refresh' files already on the server will be left alone - although David Gibson might delete these manually as part of the ongoing development/debugging work.

Technical note: this runs freshen.php?restart=no

  • DEBUG: do not fetch any data files from the Gateway

Technical note: this runs freshen.php?debug=yes

Points to note...

At the moment, a CRON setting causes the "Replace the Stored Logs" function (see buttons above) to be executed as follows (E&OE). Please check with David Gibson to see if the CRON settings quoted below are still current. Prior to 15-June-2024 the setting was ...

CRON settings:
Times: 22 1,13 * * * (that is, at minute 22 past hours 01 and 13, every day)
Command: curl http://bcra.org.uk/data/freshen.php?restart=no\&cron=yes
   >/home/bcra/public_html/data/logs/freshen_log.html 2>&1

After 15-June-2024 the settings are

13 1 * * *	
myChunk=1; myPath=/home/bcra/public_html/data/logs/; \
curl https://bcra.org.uk/data/freshen.php?restart=no\&cron=yes --max-time 5000 \
>${myPath}freshen_log_${myChunk}.html 2>${myPath}freshen_ERR_${myChunk}.txp	
    
13 7 * * *	
myChunk=2; myPath=/home/bcra/public_html/data/logs/; \
curl https://bcra.org.uk/data/freshen.php?restart=no\&cron=yes --max-time 5000 \
>${myPath}freshen_log_${myChunk}.html 2>${myPath}freshen_ERR_${myChunk}.txp	

13 13 * * *
myChunk=3; myPath=/home/bcra/public_html/data/logs/; \
curl https://bcra.org.uk/data/freshen.php?restart=no\&cron=yes --max-time 5000 \
>${myPath}freshen_log_${myChunk}.html 2>${myPath}freshen_ERR_${myChunk}.txp	

13 19 * * *
myChunk=4; myPath=/home/bcra/public_html/data/logs/; \
curl https://bcra.org.uk/data/freshen.php?restart=no\&cron=yes --max-time 5000 \
>${myPath}freshen_log_${myChunk}.html 2>${myPath}freshen_ERR_${myChunk}.txp	



This page, http://bcra.org.uk/data/freshen.html was last modified on Sun, 16 Jun 2024 11:24:11 +0100
Server: britiac4.british-caving.org.uk (31.25.186.126)