Hi,
When a backup is run (Administration->Backup) the process seems to run correctly. In fact, I can see the backup being created in /tmp/openemr_web_backup/emr_backup. But it never downloads.
Any thoughts?
Ubuntu 11.04
Apache version 2.2.17
MySQL version 5.1.54
OpenEMR v4.1.0 (5)
PHP version 5.3.5
Hi,
Is anything showing up in the PHP error log or the JavaScript error log? Also, is it happening on all browsers or just a specific one? And is the script timing out?
-brady
When I try to open the openemr.tar.gz file from the tmp folder I get the following:
gzip: stdin: invalid compressed data-crc error
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
Okay, I answered my own question as to what the problem is: in the php.ini file (just search for it), increase max_execution_time from its current setting to something a little more generous, like 1080 - this gives the script enough time and the backup will work fine.
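For anyone else hitting this, the change is a single directive (the php.ini path below is just a guess for a stock Ubuntu/Apache install; yours may be elsewhere, which is why you have to search for it):

sudo grep -n max_execution_time /etc/php5/apache2/php.ini
# change that line to something like:
#   max_execution_time = 1080
sudo service apache2 reload    # reload Apache so the new limit takes effect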
Until it gets too big again. The on-demand backup via the browser is not a long-term solution. You need to implement an automated backup script…
It's not hard: you need only use mysqldump to copy the database out to text, and any kind of tar, zip, copy, rsync, or xcopy to make a snapshot of it all and put it somewhere other than the same machine.
There is a Linux example in contrib/util called backup_oemr.sh; it would be easy to do a Windows cmd shell version of it.
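Roughly, a minimal sketch of what such a script does (the database name, password, web root, and destination host here are only placeholders - backup_oemr.sh itself does more, so use it as the reference):

#!/bin/sh
# Dump the database to text with tables locked (--opt is the default and includes --lock-tables)
mysqldump --opt -u openemr -p'secret' openemr > /tmp/openemr_db.sql
# Bundle the web directory and the dump into one dated archive
tar -czf /tmp/openemr_backup_$(date +%Y%m%d).tar.gz /var/www/openemr /tmp/openemr_db.sql
# Push the archive to a different machine (rsync, scp, a USB drive - anything off this server)
rsync -a /tmp/openemr_backup_*.tar.gz backupuser@backuphost:/backups/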
I have found that running backups via the web client, whether through the OpenEMR backup selection or via phpMyAdmin, can fail, and on large databases they frequently fail silently.
The ONLY safe way to back up your database is from the command line with the MySQL tables locked during the dump.
Use the example provided in contrib/util/backup_oemr.sh. If you are Windows-based, either install Cygwin so you have a Unix shell, or convert the script to Windows .cmd format.
Or… find a good MySQL GUI backup tool that IS NOT browser based.
Also, to clarify (correct me if I am wrong), mysqldump by default has the --opt parameter on, which locks tables (so, as long as you use this command without any parameters that explicitly remove the locks, it should be fine; that being said, it is always good practice to include --opt in the command, which is done in the backup method within OpenEMR).
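If you want to double-check that on your own install, the mysqldump help output lists what --opt turns on (the exact wording varies by MySQL version):

mysqldump --help | grep -A 2 -- --opt
# expect --lock-tables to be listed among the options --opt enables,
# along with a note that it is enabled by default and disabled with --skip-opt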
The backup script should work (since it uses mysqldump with locking and an appropriate copy of the web files), but I think the main issue here is the download failing.
I agree with Tony (in a way). Ideally, the user implements an automated command line tool to back up their OpenEMR frequently and off-site. However, the natural progression for new users will likely be to use the internal OpenEMR backup mechanism (i.e., limit their initial resource investment) until they become comfortable enough to either implement a more sophisticated method or hire somebody to do it. (This is where documentation on various backup methods in the wiki would be extremely helpful.)
To debug, I am curious to know, in both Jeff's and rdh61's cases, the files/directories and sizes you see at /tmp/openemr_web_backup after running the backup script (note the files will keep getting bigger while the script is running, so wait until the files stop growing).
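For example, on the server (assuming you have shell access), something like this shows what is there and whether it is still growing:

ls -lh /tmp/openemr_web_backup/
# or watch the sizes update every few seconds while the backup runs:
watch -n 5 'du -sh /tmp/openemr_web_backup/*'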