Docker - not for the faint of heart

I’m so irritated.

Remember when you were able to replace the battery in your cell phone without dealing with the time and expense of taking it to someone else? Well, we’ve reached that crossroads with the “Dockerization” of OpenEMR. Until I’m able to wrap my head around the Docker system, I’m taking this time (as a non-developer) to vent my frustrations over what was once, but no longer is, a simple process. Remember when you could just make a change to a file and FTP it up to its target location? Those days are gone. I made a change to the patient_list.php module and got it to work on my local home-grown, non-Docker installation of OpenEMR on Ubuntu. Now it’s time to upload it to the AWS Cloud version, but guess what? There are three possible locations where it could go:

/$ sudo find -name patient_list.php

At this point I could be bold and just replace all three files via FTP, but in the back of my mind I can’t help but think I’m going to break something. After initially realizing I have no idea what to do, I crank up the old Google search and find this article:

It lists these steps for editing files in a Docker container:

  1. Find the container ID of a running container
  2. Log in to the Docker container using the container ID
  3. Update the package manager
  4. Install the required package (vi, nano, vim, etc.)
  5. Edit the file using either vim or nano
  6. Alternatively, install a vim editor via the Dockerfile
  7. Or use a remote editor by exposing port 22
  8. Best practices for editing the file
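For what it’s worth, the first two steps boil down to a couple of `docker` commands, and since the file was already fixed on the host, `docker cp` can skip the in-container editor entirely. A rough sketch (the container ID and the in-container path are placeholders, not my actual setup, so verify yours first):

```shell
# List running containers to find the one running OpenEMR
docker ps

# Open a shell inside it (replace CONTAINER_ID with the real ID from docker ps)
docker exec -it CONTAINER_ID sh

# Or, instead of editing inside the container, copy the file you already
# fixed on the host straight into the container (the destination path below
# is a placeholder -- locate the real one with find inside the container)
docker cp patient_list.php CONTAINER_ID:/path/inside/container/patient_list.php
```

One caveat I’ve since learned: changes made this way land in the container’s writable layer, so they vanish if the container is ever recreated from its image. A bind mount or a rebuilt image is the durable fix.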

Are you freaking kidding me? I feel like the guy at the fast-food restaurant who just wants to use the bathroom and instead of being able to get a token to open the door (put in coin, turn knob and you’re in) I’m presented with a touch-screen displaying - you guessed it - a Rubik’s cube! I know I’m sounding like a meat-headed sloth that wants to stick with the old way of doing things, but considering the cornucopia of responsibilities I have as an IT professional, this just adds one more layer of complexity to what I already have on my plate.

Once I get a fundamental understanding of the Docker system down pat I suspect I’ll change my tune, but for the sake of saving time, I guess that’s why we have services like Fiverr to pawn this work off onto people in the know. Just had to get this off my chest.

</end of rant grrrrrr arrrrgh>

It’s fine @Help_Desk, sometimes we just need to get it off our chest.

I too find Docker frustrating at times and rarely use it; however, I realize Docker has its place and for many it is a valuable tool. Tool is the keyword here, in my opinion. Originally, Docker was intended to let developers quickly develop and test software in different environments, and it grew from there.

I personally am old school and simply have a hard time wrapping my head around the need to run in the cloud, or even to use Docker for production sites that are fairly static and don’t require many updates.

If running on AWS, I prefer to just use a machine with a normal OpenEMR install. Better yet, with computer reliability so good these days, spending a thousand dollars on a computer, plugging it into the router, and setting it in a corner of the office somewhere is far more cost-effective than a cloud solution.

Each of us has to determine what is best for them and whether they want to spend the time and effort learning Docker. That said, many consider it a challenge and find it fun to learn new things and add to their arsenal of abilities. I bet you’re one of those.
Good luck.

Thanks man. I’m definitely an advocate for life-long and accelerated learning; it’s just that the time pressure of figuring this out in a production environment has stuck in my craw.

AWS was the right move for us, as we had 4.x installed on a local CentOS 6.10 server on a VPN. When the power or internet went out, we were screwed. AWS with 2FA has been a steady rock with no access problems.

Hmm, strange. I’d think if you lose power then your workstations would also be down.
Concerning internet, the same is true unless one has battery backup. Even so, the nice thing about a local server is that if the internet is down due to vendor problems, etc., office work can continue over the local intranet. IMO, this is the way medical software should be run, i.e. on a local intranet unplugged from the internet.

I have my family’s clinic set up this way where only one or two admins have access from outside the office via VPN.

Yeah, we have 2 campuses and remote workers, so cloud is the solution for us. Plus I don’t want to be in the data center business; I have enough on my plate.