ryry wrote on Tuesday, May 29, 2012:
We have an OpenEMR installation that’s been running since June of last year. Here are some statistics on our database tables (row count / data size):
openemr_postcalendar_events: 10,819 / 5.1MB
form_encounter: 8,881 / 1.8MB
forms: 43,482 / 5.6MB
patient_data: 4,620 / 1.4MB
users: 37 / 8.8KB
The report in question is “interface/reports/appt_encounter_report.php” (Reports > Visits > Appt-Enc).
It consumes 99% of our SQL server’s CPU while it runs. To rule out our environment, we also installed a vanilla OpenEMR on its own CentOS machine outside of our server cluster, and the result is the same:
http://webdev.secured-domain.net/chart1.png - Load averages
http://webdev.secured-domain.net/chart2.png - CPU utilization
http://webdev.secured-domain.net/chart3.png - mysqld process
This high load makes all other operations on that same OpenEMR installation slow to a crawl while the report is running.
The report does eventually finish in about 40 minutes.
How do other OpenEMR users deal with this? Are users told to run the report after hours so it doesn’t cripple OpenEMR during the day? Are they told they can only run these reports against a sufficiently small amount of data?
Brady has mentioned that the query can be optimized, but how should it be optimized? Why wouldn’t it be optimized to begin with?
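For what it’s worth, a common cause of a report pegging the CPU at this data volume is a correlated per-row query, which behaves like an unindexed nested-loop join, rather than the row counts themselves. Here is a minimal, database-free Python sketch of the difference between scanning per row and probing a prebuilt lookup (the role an index plays); all names and numbers are hypothetical, not taken from the actual report query:

```python
# Sketch: matching appointments to encounters by (patient id, date).
# The naive version is O(n*m), like a correlated subquery with no
# usable index; the indexed version is O(n+m), like an index probe.
appointments = [{"pid": i % 500, "date": i} for i in range(2_000)]
encounters = [{"pid": i % 500, "date": i} for i in range(2_000)]

def match_naive(appts, encs):
    # For each appointment, scan every encounter.
    return sum(1 for a in appts for e in encs
               if e["pid"] == a["pid"] and e["date"] == a["date"])

def match_indexed(appts, encs):
    # One pass to build a lookup set, one pass to probe it.
    index = {(e["pid"], e["date"]) for e in encs}
    return sum(1 for a in appts if (a["pid"], a["date"]) in index)

# Both strategies find the same matches; only the cost differs.
assert match_naive(appointments, encounters) == \
       match_indexed(appointments, encounters)
```

If the report’s SQL is doing the naive version over our ~10k events and ~40k form rows, that alone could explain the 40-minute runtime, and an index on the joined columns would presumably fix it.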
It seems to us that reports like these don’t scale, and we’ll have to tell our clients either that they can no longer use them or that they need separate, dedicated hardware purely for generating reports.