scifi000 wrote on Friday, December 11, 2015:
What is the easiest and most secure method for daily backups of patient data? Thanks.
fsgl wrote on Friday, December 11, 2015:
The most secure is any method done offline, with the backup media encrypted.
scifi000 wrote on Friday, December 11, 2015:
What is the easiest method? Does OpenEMR have a built-in backup function?
scifi000 wrote on Friday, December 11, 2015:
Windows 7, 8 and Ubuntu
sunsetsystems wrote on Friday, December 11, 2015:
For several backup methods see: http://open-emr.org/wiki/index.php/OpenEMR_Wiki_Home_Page#Supplementary
The approach I like and reasons why are at: http://open-emr.org/wiki/index.php/Automated_Backups_to_an_Alternate_Server
fsgl wrote on Friday, December 11, 2015:
The easiest backup is found in Administration->Backup, which is fine for Linux but less reliable on Windows.
I’ve had loss of documents & configuration problems with LBV forms when emr_backup.tar was restored on Windows. CVerk’s backup (first link above) has worked flawlessly for the past 3 years.
Another advantage is that this backup includes a copy of the Package, which means a quicker turnaround. The built-in utility requires a download of OpenEMR, then a restore of the web directory & database, before you can get back to work.
With either method, a system image should also be created, in the event the primary backup fails or the server goes belly up. If you have a spare device, you can be up & running in the time it takes for the morning coffee break.
I prefer to monitor the backup in real time instead of automating the task. Recently one of the Windows Updates changed the level of User Account Control, causing a permission problem with CVerk’s backup. Had I not been watching the backup, I would have blithely assumed that every backup had completed correctly.
tmccormi wrote on Friday, December 11, 2015:
We use a remote backup server controller to launch automated backups that first do a dated, compressed mysqldump of the entire database server (not just the openemr instance) and then use dirvish (an rsync tool) to take a remote, incremental copy of the entire system.
There is also a paid service we like that will do this (contact me off-list if you are interested). The service puts a backup appliance in the doctor’s office that can cover all their devices (including desktops), and then pulls from the appliance to a data warehouse for long-term storage.
–Tony
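For anyone who wants to roll this by hand, Tony's approach above (a dated, compressed dump of everything, followed by an rsync-style remote copy) could be sketched roughly as below. The paths, the function name, and the plain-rsync stand-in for dirvish are all assumptions, not his actual setup:

```shell
#!/bin/bash
# Sketch only: dated/compressed full-server mysqldump, then a remote copy.
# backup_dir, remote, and credentials are placeholders to adapt.

backup_openemr() {
  local backup_dir="$1"   # e.g. /var/backups/openemr
  local remote="$2"       # e.g. backup@alternate-server:/srv/backups
  local stamp
  stamp=$(date +%Y%m%d-%H%M%S)

  mkdir -p "$backup_dir"

  # Dated, compressed dump of all databases, not just the openemr schema.
  mysqldump --all-databases --single-transaction \
    | gzip > "$backup_dir/all-databases-$stamp.sql.gz"

  # Remote incremental copy; dirvish wraps rsync, plain rsync shown here.
  rsync -a "$backup_dir/" "$remote/"
}
```

Run from cron, e.g. `backup_openemr /var/backups/openemr backup@host:/srv/backups`, and prune old dumps on whatever schedule suits you.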
aethelwulffe wrote on Saturday, December 12, 2015:
Or use FBackup on Windows to take a shadow copy of your whole web directory on your Windows machines… or turn on File History, indexing, and backups on Windows 8 and 10.
FBackup is seriously mediocre free backup software… but it has a huge list of plugins!
I have heard that GFI Backup is better, but I have never run it.
Since I grab my XAMPP directory, FTP stuff, and mail stuff as separate jobs, the databases for the last two can get out of sync with the others, but it has never been an issue.
I also shut down all services and do a regular manual copy of the important directories on occasion. These backups are my only off-site copies, and they are tested by being dumped onto another machine. They always run; no database rebuilding necessary.
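That "stop everything, then copy" routine translates to a few lines of script on Linux too. The sketch below is an assumption about what such a job might look like, with placeholder service names; the archive destination and directory list are passed in by the caller:

```shell
#!/bin/bash
# Sketch of an occasional "cold" copy: stop the web and database services
# so files are quiescent, archive the important directories, restart.
# Service names (apache2, mysql) are assumptions for a typical LAMP box.

cold_backup() {
  local dest="$1"; shift   # where the archive goes, e.g. /mnt/offsite
  local stamp
  stamp=$(date +%Y%m%d)

  systemctl stop apache2 mysql       # quiesce before copying
  tar -czf "$dest/openemr-cold-$stamp.tar.gz" "$@"
  systemctl start apache2 mysql
}
```

Usage would be something like `cold_backup /mnt/offsite /var/www/openemr /var/lib/mysql`; copying the raw database files only yields a consistent backup because the database server is stopped first.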
scifi000 wrote on Sunday, March 13, 2016:
Hello, I went into OpenEMR's Administration->Backup, pushed the "create backup" button, and it created a tar file. What are the steps to restore the backup if I need to? Thanks.
fsgl wrote on Sunday, March 13, 2016:
For Linux, see this.
With Windows I would suggest using CVerk’s backup instead of emr_backup.tar, because recovery is simply deleting the old xampp folder & copying the backup to the C: drive.
Recovery instructions are given in my first link above.
Create system images periodically, because backups, especially those run via cron jobs, can turn out to be duds. Macrium Reflect can be used for Linux as well.
In addition to attending the backup, one should test its fidelity by restoring it. That is the only way to detect a dud.
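For the Linux restore asked about above, a minimal sketch follows. It assumes the archive unpacks into a compressed web-directory tarball named openemr.tar.gz and a gzipped dump named openemr.sql.gz, and that the web root and database name are the stock ones; verify the actual file names inside your emr_backup.tar before relying on this, and note that extracting into the web root normally requires root:

```shell
#!/bin/bash
# Sketch of restoring the Administration->Backup archive on Linux.
# Inner file names (openemr.tar.gz, openemr.sql.gz) and the "openemr"
# database name are assumptions; check your own archive first.

restore_emr_backup() {
  local archive="$1"   # the emr_backup.tar file
  local webroot="$2"   # e.g. /var/www/html
  local workdir
  workdir=$(mktemp -d)

  # Unpack the outer archive into a scratch directory.
  tar -xf "$archive" -C "$workdir"

  # Restore the web directory.
  tar -xzf "$workdir/openemr.tar.gz" -C "$webroot"

  # Restore the database (prompts for the MySQL root password).
  gzip -dc "$workdir/openemr.sql.gz" | mysql -u root -p openemr
}
```

This doubles as the fidelity test mentioned above: run it against a spare machine or a scratch web root and confirm the restored site actually comes up.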