Publishing Docker Containers to Amazon ECR with Bitbucket Pipelines

Jim O'Halloran • September 7, 2021


I've been using a virtually identical container setup in a few different projects. Until now, I've just been copy/pasting the full container build configuration into each project and trying to remember to update every copy whenever it changes. That was fine while the number of projects was small, but as time went on it became more of a pain.

What I wanted to create was a single repository that contained the Docker configuration for a series of "generic" images, and have those images built and stored in a private repository on Amazon's ECR (Elastic Container Registry) where they can be pulled down and used by each individual project. I also didn't want to do the build and upload to ECR dance manually. Given that these containers are updated only occasionally and Bitbucket Pipelines gives us 50 minutes for free every month (and the repo will be on Bitbucket), it seemed like using Pipelines would be a good option to automate this process.

Create ECR IAM User

To begin, create a user in IAM with the following policy (the standard set of actions needed to push images to ECR) and save the user's Access and Secret keys (you'll need these later).

  "Version": "2012-10-17",
  "Statement": [
      "Effect": "Allow",
      "Action": [
      "Resource": "*"

You'll want to use a dedicated user, because their Access key and Secret Key will need to be stored in the pipeline, and you probably don't want pipelines having access to your entire AWS account when only ECR access is required.
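If you'd rather script this than click through the console, the same user can be set up from the AWS CLI. This is just a sketch: the user name `ecr-push-user`, the policy name `ecr-push`, and the file name `ecr-push-policy.json` (containing the JSON above) are placeholders of my own choosing.

```shell
# Create a dedicated user for the pipeline (names are placeholders)
aws iam create-user --user-name ecr-push-user

# Attach the ECR push policy shown above, saved locally as ecr-push-policy.json
aws iam put-user-policy \
    --user-name ecr-push-user \
    --policy-name ecr-push \
    --policy-document file://ecr-push-policy.json

# Generate the Access/Secret key pair you'll store in the pipeline
aws iam create-access-key --user-name ecr-push-user
```

The `create-access-key` output includes the Secret key only once, so save it immediately.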

Create Repository

Create your repository containing a Dockerfile (and any related files) and your bitbucket-pipelines.yml file. I used the following layout:

docker/  # Home for all container related files
docker/common/  # My project has several containers building from a common base
docker/common/Dockerfile  # Describes how to build common container
docker/common/<other build files here>
docker/app/  # App server builds from common
docker/app/Dockerfile  # Describes how to build app server container
docker/app/<other build files here>

Then the bitbucket-pipelines.yml looks like this:

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          # build the image
          - DATE=$(date -u +%Y%m%d)
          - docker build -t foo-common docker/common
          - docker build -t foo-app docker/app

          # use the pipe to push to AWS ECR
          - pipe: atlassian/aws-ecr-push-image:1.4.2
            variables:
              IMAGE_NAME: foo-common
              TAGS: '${BITBUCKET_BRANCH}-${DATE} ${BITBUCKET_BRANCH} latest'
          - pipe: atlassian/aws-ecr-push-image:1.4.2
            variables:
              IMAGE_NAME: foo-app
              TAGS: '${BITBUCKET_BRANCH}-${DATE} ${BITBUCKET_BRANCH} latest'

ECR Repositories

Create ECR repositories for each of the containers you're building. The important thing here is that the names should match the IMAGE_NAME specified in the pipeline (foo-common and foo-app in my case).
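You can create these in the ECR console, or from the CLI. A sketch, assuming the same IAM credentials and repository names as above (`us-east-1` is a placeholder for whichever region you use):

```shell
# One ECR repository per image; names must match IMAGE_NAME in the pipeline
aws ecr create-repository --repository-name foo-common --region us-east-1
aws ecr create-repository --repository-name foo-app --region us-east-1
```

Whichever region you choose here is the one you'll later set as AWS_DEFAULT_REGION in the pipeline.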

Pipeline Configuration

Now commit and push your files to Bitbucket (standard Git stuff here), and enable Pipelines for your repository. Under "Deployments" add Repository variables for each of AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION, using the keys you saved when creating the IAM user and the region your ECR repositories live in.

Pipelines should now be able to build the images for each of your components and push them to your private ECR repositories automatically.
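Once a build has run, each project can pull the images in the usual way. A sketch; the account ID `123456789012` and region `us-east-1` are placeholders for your own values.

```shell
# Authenticate Docker against your private ECR registry
# (123456789012 and us-east-1 are placeholders)
aws ecr get-login-password --region us-east-1 \
    | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Pull whichever tag the project wants
docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/foo-common:latest
```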

I chose to tag my images three ways: "latest", indicating the most recent build (for projects that just don't care); the branch name (I'm using different branches for different PHP versions, so a project can get the latest build for its PHP version); and branchname-date, to allow a project to lock in a specific build if it has to. How you tag your images is totally up to you!
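To make the tagging scheme concrete, here's what the pipe's TAGS value expands to for one build. The branch name `php81` is hypothetical; the date comes from the `DATE=$(date -u +%Y%m%d)` step in the pipeline.

```shell
# Simulate the tag expansion from the pipeline (branch name is hypothetical)
BITBUCKET_BRANCH=php81
DATE=$(date -u +%Y%m%d)
TAGS="${BITBUCKET_BRANCH}-${DATE} ${BITBUCKET_BRANCH} latest"
echo "$TAGS"   # e.g. "php81-20210907 php81 latest"
```

So each build pushes the same image under three names, from most specific (locked to a day) to least specific (latest).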