
Docker Automated Builds

When it comes to "dockerizing" your application, chances are that your apps are under continuous development and need a new Docker image built frequently as the application is updated.
If you haven't gone through my previous articles on Docker Overview and Docker Basics, please go through those first.

Solution: Automate the Docker build using the Automated Build feature of hub.docker.com.

How? Let's find out.

All we have to do is link Docker Hub and Github to automate the build process. The following steps walk you through it with a sample project on Github.

Step 1: Create accounts on both Github.com and hub.docker.com

Step 2: Create a repository on Github as per your requirements.

Step 3: Create a file named "Dockerfile" with the following contents:


FROM ubuntu:16.04

USER root
WORKDIR /root

# Install Apache in a single layer
RUN apt-get update -y \
    && apt-get install -y apache2

# EXPOSE only documents the container port; mapping it to a host port
# (e.g. 8080:80) is done at "docker run" time with -p
EXPOSE 80

# "RUN service apache2 restart" only runs during the build and does not
# persist, so start Apache in the foreground when the container starts
CMD ["apache2ctl", "-D", "FOREGROUND"]
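
Optionally, before committing, you can check that the Dockerfile builds cleanly on your own machine (assuming Docker is installed locally; the image name apache-demo is just an example):

docker build -t apache-demo .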


Step 4: Commit the Dockerfile and merge it into the master branch.
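
For reference, a minimal sequence might look like this (the branch name add-dockerfile is just an example; you can also commit directly to master):

git checkout -b add-dockerfile
git add Dockerfile
git commit -m "Add Dockerfile"
git push origin add-dockerfile

Then open a pull request on Github and merge it into master.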

Step 5: Go to hub.docker.com

Open Create > Create Automated Build
Click Automated Build with Github

Step 6: Follow the steps to grant hub.docker.com full access to your Github account.

Step 7: Select your Github repository, fill out the description, and click Create.

Step 8: Commit some changes to your Github repository to trigger the build (for example, a change to the README.md file).
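
For example, a trivial commit like this is enough to kick off a new build (assuming you are working directly on master):

echo "Trigger automated build" >> README.md
git add README.md
git commit -m "Update README"
git push origin master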

Step 9: Come back to hub.docker.com, open your automated build repository, and open the build section. You will find one build running; it will take a couple of minutes to finish.

When the build finishes, your Docker image will be available with the latest tag.
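
Once the build succeeds, you can pull and run the image anywhere (replace the username and repository placeholders with your own; the 8080:80 port mapping is just an example):

docker pull <your-username>/<your-repo>:latest
docker run -d -p 8080:80 <your-username>/<your-repo>:latest

With the Dockerfile above, Apache should then respond on port 8080 of the host.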
