Imagine you’re working on an enterprise-scale application landscape project. Monoliths are being replaced with microservices via the strangler pattern, while parts of the landscape are migrated to the cloud.
One of the new services is ready to be called by many systems. But now you recognize that the landscape relies heavily on certificate-based authentication – and your colleague has a hard disk with some certificates for several stages of the old monolith environments.
What can you do now?
That’s the question I faced in recent weeks. A green-field approach was not feasible, so I needed to find another solution.
Request certificates ASAP
Several static environments have been removed during the project, to be replaced with dynamically created cloud environments. As a consequence, most existing certificates will not be used in the future.
New certificates are needed in addition to the certificates for the existing environments.
Waiting for certificates can become a blocking issue sooner than expected.
Pipelines and environments are continuously developed and improved. I wanted to automate testing pipelines, including proper certificates – but how do you do that if the certificates don’t exist yet?
Self-signed certificates can help, but only a bit: they won’t be accepted by the target systems, but they can be used to build the pipeline.
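Such a throwaway certificate is one `openssl` call away. A minimal sketch – the subject name and validity below are placeholders, not values from the actual project:

```shell
#!/usr/bin/env sh
# Generate a self-signed certificate to unblock pipeline development.
# It will not be accepted by real peers, but it lets you wire up the pipeline.
set -eu
workdir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -sha256 -days 30 -nodes \
  -subj "/CN=pipeline-test.local" \
  -keyout "$workdir/key.pem" -out "$workdir/cert.pem"
# inspect what we just created
openssl x509 -in "$workdir/cert.pem" -noout -subject
```

Once the real certificates arrive, only the files change – the pipeline stays the same.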
Lesson learned: request certificates as soon as possible to avoid waiting games.
Result: the application needs 233 certificates for authentication purposes.
Use secure storage for certificates
The cloud environment offers a managed secret and certificate storage. But what if your secure storage is incompatible with your certificate format? One option is to transform certificates into a compatible format and back to the format you need.
I chose to go in a different direction: certificates must live in the secure storage without format transformation. The reason: I wanted to avoid increasing the complexity of the deployment pipeline (which is already quite complex).
Encoding the certificate as Base64 helped here. The pipeline can read the certificate from the secure storage and pass it to Kubernetes as a K8s Secret.
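For completeness, such a format round trip is usually a pair of `openssl pkcs12` calls. A sketch with throwaway files – names and the export password are placeholders:

```shell
#!/usr/bin/env sh
# Round-trip a certificate between PEM and PKCS#12 with openssl.
set -eu
workdir=$(mktemp -d)
# a throwaway key/cert pair to demonstrate with
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
  -keyout "$workdir/key.pem" -out "$workdir/cert.pem"
# PEM -> PKCS#12 (e.g. to match what the secure storage accepts)
openssl pkcs12 -export -in "$workdir/cert.pem" -inkey "$workdir/key.pem" \
  -passout pass:changeit -out "$workdir/bundle.p12"
# PKCS#12 -> PEM (back to what the deployment needs)
openssl pkcs12 -in "$workdir/bundle.p12" -passin pass:changeit -nodes \
  -out "$workdir/restored.pem"
```

Every such transformation is one more step the pipeline has to get right, which is exactly the complexity I wanted to avoid.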
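This works out nicely because Kubernetes stores Secret data Base64-encoded anyway, so the value read from the secure storage can be templated into the manifest as-is. A sketch with a hypothetical secret name and key:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: service-client-cert   # hypothetical name
type: Opaque
data:
  # the Base64 string read from the secure storage goes here unchanged,
  # because Kubernetes expects Secret data to be Base64-encoded
  client.p12: <base64 value from secure storage>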
If certificates go into Git, encrypt them
Automate all the things. Does that include importing certificates into an environment? For me it did, especially while waiting to receive all the needed certificates. Therefore I chose to write scripts for encoding certificates and importing them into the secure storage.
But I don’t want to be the only person capable of executing this setup task. Furthermore, I want scripts and resources to be version-controlled. So: Git?
Yes, Git – but encrypted. There is a very straightforward tool for encrypting a repository: git-crypt.
It also supports GPG identities, which makes it quite easy to allow colleagues to decrypt the repository.
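The encoding half of such a script is small. A minimal sketch, assuming GNU coreutils; the commented-out HashiCorp Vault call is a hypothetical stand-in for whatever CLI your secure storage provides:

```shell
#!/usr/bin/env sh
set -eu

# encode_cert FILE OUTDIR: write a single-line Base64 copy of FILE into OUTDIR
# (GNU base64; -w0 disables line wrapping) and print the output path
encode_cert() {
  out="$2/$(basename "$1").b64"
  base64 -w0 "$1" > "$out"
  printf '%s\n' "$out"
}

# demo with a throwaway file standing in for a real certificate bundle
workdir=$(mktemp -d)
printf 'not-a-real-cert' > "$workdir/service.p12"
encoded=$(encode_cert "$workdir/service.p12" "$workdir")

# hypothetical import step -- CLI and secret path depend on your storage:
# vault kv put "secret/certs/service" cert=@"$encoded"

base64 -d "$encoded"
```

With the import call filled in for your storage, the script can be run per environment by anyone on the team.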
As I usually tell everybody how cool GitKraken is and how much I like using it: please don’t use it together with git-crypt without checking the configuration details. I don’t know the specifics, but git-crypt works by acting as a filter/diff driver. GitKraken uses its own diff tool by default – which can lead to unencrypted files being committed even though you listed them as encrypted in .gitattributes. Once that happens, you may never be able to erase the certificate from history – especially if you are using a managed remote.
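For reference, git-crypt selects files to encrypt via .gitattributes. A minimal configuration, assuming certificates live under a hypothetical certs/ directory, looks like this (the second line, recommended by the git-crypt README, prevents .gitattributes itself from being encrypted):

```
certs/** filter=git-crypt diff=git-crypt
.gitattributes !filter !diff
```

After `git-crypt init` and `git-crypt add-gpg-user <key-id>`, matching files are transparently encrypted on commit. Running `git-crypt status` shows which files are actually encrypted – worth doing before pushing, especially from a GUI client.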
Should certificates remain in Git?
Once I had migrated all needed certificates into the secure storage, the question “do I keep the repo?” arose. The answer “no!” came to mind immediately. After reconsidering and some discussion, I stayed with no. Having a secure storage is such a huge advantage that I want to use it as much as possible. It must be the single source of truth.
TL;DR
Using Git for tracking certificates is useful, but it didn’t feel right even the first time I thought about it. Still, I needed it to fill organizational gaps. If you ever get into this situation, encrypting is absolutely necessary, and git-crypt is easy to use for this purpose. Always use a secure storage like Vault if possible.