Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early. By integrating regularly, you can detect errors quickly and locate them more easily.
In this tutorial you will learn how to set up Jenkins using Docker and then deploy a React application to AWS S3, completing the entire CI/CD process.

First, create the React application you will be building and deploying:

```shell
npx create-react-app cicd_tutorial
cd cicd_tutorial
npm start
```
Prerequisites:
Minimum hardware requirements:

- 256 MB of RAM
- 1 GB of drive space (although 10 GB is a recommended minimum if running Jenkins as a Docker container)

Software requirements:

- Java: see the Java Requirements page
- Web browser: see the Web Browser Compatibility page
Go to the Docker website and download the Docker installer for your OS.
The recommended Docker image to use is the jenkinsci/blueocean image (from the Docker Hub repository). This image contains the current Long-Term Support (LTS) release of Jenkins (which is production-ready) bundled with all Blue Ocean plugins and features. This means that you do not need to install the Blue Ocean plugins separately.
Run the following commands in your terminal:

```shell
docker pull jenkinsci/blueocean
docker run -p 8080:8080 jenkinsci/blueocean
```
Then:

1. Note the admin password printed in the container log.
2. Open a browser at http://localhost:8080.
3. Run the initial setup wizard and choose the “recommended plugins”.
4. Browse to http://localhost:8080/blue.

You now have the Jenkins Blue Ocean image running in Docker.
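As a quick sanity check, you can confirm the container is actually up. This is an optional sketch, assuming Docker is installed and its daemon is reachable; it skips gracefully otherwise:

```shell
# Optional sanity check: confirm a Jenkins container is running.
# Skips gracefully if Docker (or its daemon) is unavailable.
docker info >/dev/null 2>&1 || { echo "Docker daemon not reachable; skipping check"; exit 0; }
# List containers started from the jenkinsci/blueocean image
docker ps --filter ancestor=jenkinsci/blueocean --format '{{.Names}}: {{.Status}} ({{.Ports}})'
```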
Next, create a bridge network in Docker using the following docker network create command:

```shell
docker network create jenkins
```
Create the following volumes to share the Docker client TLS certificates needed to connect to the Docker daemon and persist the Jenkins data using the following docker volume create commands:
```shell
docker volume create jenkins-docker-certs
docker volume create jenkins-data
```
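If you want to double-check that the network and volumes were created, a hedged sketch (it skips when Docker isn't reachable):

```shell
# Optional check: confirm the jenkins network and volumes exist.
docker info >/dev/null 2>&1 || { echo "Docker daemon not reachable; skipping check"; exit 0; }
docker network ls --filter name=jenkins --format '{{.Name}} ({{.Driver}})'
docker volume ls --filter name=jenkins --format '{{.Name}}'
```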
In order to execute Docker commands inside Jenkins nodes, download and run the docker:dind Docker image using the following docker container run command:
```shell
docker container run --name jenkins-docker --rm --detach \
  --privileged --network jenkins --network-alias docker \
  --env DOCKER_TLS_CERTDIR=/certs \
  --volume jenkins-docker-certs:/certs/client \
  --volume jenkins-data:/var/jenkins_home \
  --volume "$HOME":/home \
  --publish 3000:3000 \
  docker:dind
```
Run the jenkinsci/blueocean image as a container in Docker using the following docker container run command (bearing in mind that this command automatically downloads the image if this hasn’t been done):
```shell
docker container run --name jenkins-tutorial --rm --detach \
  --network jenkins \
  --env DOCKER_HOST=tcp://docker:2376 \
  --env DOCKER_CERT_PATH=/certs/client \
  --env DOCKER_TLS_VERIFY=1 \
  --volume jenkins-data:/var/jenkins_home \
  --volume jenkins-docker-certs:/certs/client:ro \
  --volume "$HOME":/home \
  --publish 8080:8080 \
  jenkinsci/blueocean
```
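If you miss the admin password in the log, you can read it from the running container. An optional sketch, assuming the `jenkins-tutorial` container name used above; it skips when the container isn't running:

```shell
# Optional: retrieve the initial admin password from the running container.
docker container inspect jenkins-tutorial >/dev/null 2>&1 \
  || { echo "jenkins-tutorial container not found; skipping"; exit 0; }
docker container exec jenkins-tutorial cat /var/jenkins_home/secrets/initialAdminPassword 2>/dev/null \
  || echo "Password file not present (setup may already be complete)"
```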
Once you run this command, open your browser and go to http://localhost:8080.
Clone your React project (e.g. the CICD Jenkins project) somewhere under the /home directory of the Jenkins container – i.e. /home/Documents/GitHub/YOUR_REACT_PROJECT_NAME. You will now write a Jenkinsfile, which you’ll be checking into your locally cloned Git repository. First, create an initial Pipeline that downloads a Node Docker image and runs it as a Docker container (which will build your simple Node.js and React application). Also add a “Build” stage to the Pipeline that begins orchestrating this whole process.
Create a Jenkinsfile at the root of your project:

```groovy
pipeline {
    agent {
        docker {
            image 'node:6-alpine'
            args '-p 3000:3000'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'npm install'
            }
        }
    }
}
```
Save the Jenkinsfile and commit it to your repository. When Jenkins picks up the change from the CICD tutorial Git repository, it runs the Build stage (defined in the Jenkinsfile) on the Node container. During this time, npm downloads the many dependencies necessary to run your Node.js and React application, which are ultimately stored in the node_modules workspace directory (within the Jenkins home directory). Similarly, you can add a Test stage as well. Once that is added, your Jenkinsfile will look something like this:
```groovy
pipeline {
    agent {
        docker {
            image 'node:6-alpine'
            args '-p 3000:3000'
        }
    }
    environment {
        CI = 'true'
    }
    stages {
        stage('Build') {
            steps {
                sh 'npm install'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
    }
}
```
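Before pushing, you can approximate what the pipeline does by running the same image locally. This is an optional sketch under the assumption that Docker is available and you run it from the project root (it skips otherwise):

```shell
# Optional local dry run of the Build and Test stages, using the same
# node:6-alpine image the pipeline's agent uses.
docker info >/dev/null 2>&1 || { echo "Docker daemon not reachable; skipping"; exit 0; }
[ -f package.json ] || { echo "Run this from your React project root; skipping"; exit 0; }
grep -q react package.json || { echo "Not a React project; skipping"; exit 0; }
docker run --rm -v "$PWD":/app -w /app -e CI=true node:6-alpine \
  sh -c 'npm install && npm test'
```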
In order to deploy to your S3 bucket, add a Production stage to your Jenkinsfile as per the code below:
```groovy
stage('Production') {
    steps {
        withAWS(region: 'YOUR_BUCKET_REGION', credentials: 'CREDENTIALS_FROM_JENKINS_SETUP') {
            s3Delete(bucket: 'YOUR_BUCKET_NAME', path: '**/*')
            s3Upload(bucket: 'YOUR_BUCKET_NAME', workingDir: 'build', includePathPattern: '**/*')
        }
    }
}
```
We will set up these credentials in Jenkins in the next step. Now your Jenkinsfile should look something like this:
```groovy
pipeline {
    agent {
        docker {
            image 'node:6-alpine'
            args '-p 3000:3000'
        }
    }
    environment {
        CI = 'true'
        HOME = '.'
        npm_config_cache = 'npm-cache'
    }
    stages {
        stage('Install Packages') {
            steps {
                sh 'npm install'
            }
        }
        stage('Test and Build') {
            parallel {
                stage('Run Tests') {
                    steps {
                        sh 'npm run test'
                    }
                }
                stage('Create Build Artifacts') {
                    steps {
                        sh 'npm run build'
                    }
                }
            }
        }
        stage('Production') {
            steps {
                withAWS(region: 'YOUR_BUCKET_REGION', credentials: 'CREDENTIALS_FROM_JENKINS_SETUP') {
                    s3Delete(bucket: 'YOUR_BUCKET_NAME', path: '**/*')
                    s3Upload(bucket: 'YOUR_BUCKET_NAME', workingDir: 'build', includePathPattern: '**/*')
                }
            }
        }
    }
}
```
To make the site publicly readable, attach the following bucket policy to your S3 bucket (note that the policy `Version` must be the fixed AWS policy-language date `2012-10-17`):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
        }
    ]
}
```

On the IAM Permissions page, click Attach existing policies directly and search for AmazonS3FullAccess. Now make any change to your React app and commit the file to your local repo. If everything is set up properly, your newly committed change will be uploaded to S3, and you can browse your app through the bucket endpoint.
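To confirm the deployment, you can also list the bucket contents with the AWS CLI. An optional sketch — set BUCKET to your real bucket name first; it skips when the CLI isn't installed or the placeholder is still in place:

```shell
# Optional: verify the build artifacts landed in S3.
BUCKET="${BUCKET:-YOUR_BUCKET_NAME}"   # replace with your bucket name
command -v aws >/dev/null 2>&1 || { echo "AWS CLI not installed; skipping"; exit 0; }
if [ "$BUCKET" = "YOUR_BUCKET_NAME" ]; then echo "Set BUCKET first; skipping"; exit 0; fi
aws s3 ls "s3://$BUCKET/" --recursive
```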