Technology + Management + Innovation

Three Reasons Why Docker is a Game Changer

by Jake Bennett

Containers represent a fundamental evolution in software development, but not for the reasons most people think.

[Image: shipping containers being unloaded]

Docker’s rapid rise to prominence has put it on the radar of almost every technologist today, IT professionals and developers alike. Docker containers hold the promise of providing the ideal packaging and deployment mechanism for microservices, a concept that has also seen a surge in popularity.

But while the industry loves its sparkling new technologies, it is also deeply skeptical of them. Until a new technology has been battle-tested, it’s just an unproven idea with a hipster logo, so it’s not surprising that Docker is being evaluated with a critical eye—it should be.

To properly assess Docker’s utility, however, it’s necessary to follow container-based architecture to its logical conclusion. The benefits of isolation and portability, which get most of the attention, are reason enough to adopt Docker containers. But the real game changer, I believe, is the deployment of containers in clusters. Container clusters managed by a framework like Google’s Kubernetes allow for the true separation of application code and infrastructure, and enable highly resilient and elastic architectures.

It is these three benefits in combination—isolation, portability, and container clustering—that are the real reasons why Docker represents such a significant evolution in how we build and deploy software. Containers further advance the paradigm shift in application development brought about by cloud computing by providing a higher layer of abstraction for application deployment, a concept we’ll explore in more detail later.

Is Docker worth it?

However, you don’t get the benefits of containers for free: Docker does add a layer of complexity to application development. The learning curve for Docker itself is relatively small, but it gets steeper when you add clustering. The question then becomes: is the juice worth the squeeze? That is, do containers provide enough tangible benefit to justify the additional complexity? Or are we just wasting our time following the latest fad?

Certainly, Docker is not the right solution for every project. The Ruby/Django/Laravel/NodeJS crowd will be the first to point out that their PaaS-ready frameworks already give them rapid development, continuous delivery and portability. Continuous integration platform provider Circle CI wrote a hilarious post poking fun at Docker, transcribing a fictitious conversation in which a Docker evangelist tries to explain the benefits of containers to a Ruby developer. The resulting confusion about the container ecosystem perfectly captures the situation.



Strengthen Your AWS Security by Protecting App Credentials and Automating EC2 and IAM Key Rotation

by Jake Bennett

Effective information security requires following strong security practices during development. Here are three ways to secure your build pipeline, and the source code to get you started.


One of the biggest headaches faced by developers and DevOps professionals is the problem of keeping the credentials used in application code secure. It’s just a fact of life. We have code that needs to access network resources like servers and databases, and we have to store these credentials somewhere. Even in the best of circumstances this is a difficult problem to solve, but the messy realities of daily life further compound the issue. Looming deadlines, sprawling technology and employee turnover all conspire against us when we try to keep the build pipeline secure. The result is “credential detritus”: passwords and security keys littered across developer workstations, source control repos, build servers and staging environments.

Use EC2 Instance Profiles

A good strategy for minimizing credential detritus is to reduce the number of credentials that need to be managed in the first place. One effective way to do this in AWS is by using EC2 Instance Profiles. An Instance Profile is an IAM Role that is assigned to an EC2 instance when it’s created. Once this is in place, any CLI or SDK call to AWS resources made by code running on the instance executes within the security context of the Instance Profile. This is extremely handy because it means that you don’t need to worry about getting credentials onto the instance when it’s created, and you don’t need to manage them on an ongoing basis; AWS automatically rotates the keys for you. Instead, you can spend your time fine-tuning the IAM Role’s security policy to ensure that it grants the least privilege needed to get the job done.
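To make this concrete, here is a simplified, illustrative sketch of the credential-resolution order the AWS SDKs follow. The real chain has more steps, and the function name and return labels here are hypothetical; the point is that explicit credentials win when present, and code on an instance with a profile falls through to the metadata service, so no keys ever need to live on the box.

```python
import os

def resolve_credential_source(env=None):
    """Mimic (in simplified form) the AWS SDK credential chain:
    explicit environment keys win; otherwise code running on EC2
    falls back to the Instance Profile metadata service."""
    env = os.environ if env is None else env
    if "AWS_ACCESS_KEY_ID" in env and "AWS_SECRET_ACCESS_KEY" in env:
        return "environment"
    if "AWS_PROFILE" in env:
        return "shared-config-profile"
    # Nothing configured locally: on an instance launched with an
    # Instance Profile, the SDK fetches temporary keys automatically.
    return "instance-profile-metadata"

# On a cleanly provisioned instance with no keys configured,
# the chain bottoms out at the metadata service:
print(resolve_credential_source(env={}))  # → instance-profile-metadata
```

Because the SDK walks this chain on every process start, rotating the temporary keys is invisible to your application code.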

Get Credentials Out of Source Control

EC2 Instance Profiles are a big help, but they won’t completely eliminate the need to manage credentials in your application. There are plenty of non-AWS resources that your code requires access to, and EC2 Instance Profiles won’t help you with those. For these credentials, we need another approach. This starts by making sure that credentials are NEVER stored in source control. A good test I’ve heard is this: if you were forced to push your source code to a public repository on GitHub today, then you should be able to sleep well tonight knowing that no secret credentials would be revealed. How well would you sleep if this happened to you?

The main problem is that source code repositories typically have a wide audience, including people who shouldn’t have access to security credentials. And once you check in those credentials, they’re pretty much up there forever. Moreover, if you use a distributed SCM like Git, then those credentials are stored along with your source code on all of your developers’ machines, further increasing your exposure. The more breadcrumbs you leave lying around, the more likely rats will end up infesting your home. This appears to be what happened in the Ashley Madison hack that took place earlier this year: hard-coded credentials stored in source control were implicated as a key factor in the attack. Apparently, their source code was littered with the stuff. Not good.
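One simple way to keep secrets out of the repository is to read them from the environment at runtime, so the value lives only in your deployment tooling. The sketch below assumes a hypothetical `DB_PASSWORD` variable; the failure message is the important part, since a loud error beats a silently hard-coded fallback.

```python
import os

def get_db_password():
    """Read the database password from the environment instead of
    hard-coding it, so the secret never enters source control."""
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        # Fail fast and loudly rather than falling back to a
        # default value checked into the repo.
        raise RuntimeError(
            "DB_PASSWORD is not set; export it in the deploy "
            "environment rather than committing it to the repo.")
    return password
```

The same pattern extends to any non-AWS credential: API keys, SMTP passwords, and so on all become environment concerns rather than source-code concerns.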



Using Vagrant, Chef and IntelliJ to Automate the Creation of the Java Development Environment

by Jake Bennett

The long path to DevOps enlightenment begins with the developer’s IDE. Here’s how to get started on the journey: in this article we walk through the steps for automating the creation of a virtual development environment.

[Image: cloud laptop with Vagrant and Chef]

One of the challenges faced by software developers working on cloud applications and distributed systems today is setting up the developer workstation in an environment composed of an ever-growing number of services and technologies. It was already hard enough to configure developer workstations for complex monolithic applications, and it’s even harder now that we’re breaking those applications down into multiple microservices and databases. If your developers’ workstations have become fragile beasts that produce builds only by the grace of God and through years of mystery configuration settings, then you are in trouble. Seek medical help immediately if you are experiencing any of the following symptoms:

  • The onboarding of new developers takes days or even weeks because getting a new development machine configured is a time-consuming and error-prone process.
  • The words “But the code works on my machine” are uttered frequently within your organization.
  • Bugs are often discovered in production that don’t occur in development or staging.
  • The documentation for deploying the application to production is a short text file with a last modified date that’s over a year old.

The good news is that there are technologies and practices to remedy these problems. The long-term cure for this affliction is cultivating a DevOps culture within your organization. DevOps is the hybrid of software development and infrastructure operations. With the rise of virtualization and cloud computing, these two formerly separate departments have found themselves bound together like conjoined twins. In the cloud, hardware is software, and thus software development now includes infrastructure management.
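As a first taste of what that looks like in practice, here is a minimal Vagrantfile sketch (Vagrantfiles use Ruby syntax) that provisions a VM with Chef Solo. The box name and the `dev_workstation` cookbook are placeholders for whatever your project actually uses.

```ruby
# Vagrantfile -- a minimal sketch; box name and cookbook are illustrative.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"            # base image for the VM
  config.vm.network "forwarded_port", guest: 8080, host: 8080

  # Provision with Chef Solo so every developer gets the same machine.
  config.vm.provision "chef_solo" do |chef|
    chef.cookbooks_path = "cookbooks"
    chef.add_recipe "dev_workstation"          # hypothetical cookbook
  end
end
```

With this file checked into the repo, `vagrant up` rebuilds the entire development environment from scratch, which is exactly the cure for the mystery-configuration symptoms listed above.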