Continuous Delivery in depth: pipelines

In this first installment, we’ll explain how to use Jenkins pipelines with Stratio Big Data to get a complete lifecycle trace from the development team to the final production environment.

During the “Lunch &amp; Learn” on Stratio Continuous Delivery we looked at some of the problems; now we will explain them so that you can easily understand the nature of the main bugs and the solution we implemented (something we will cover in the second part).

Pipelines are code

Each of our pipelines lives in a private GitHub group under Stratio’s organization, where we have several elements:

  • L.groovy
  • Libvars.groovy
  • Libpipeline.groovy
  • Dev-project.groovy

L.groovy is the main library of shared methods and is used to parse files, check code, compile, run tests and create Docker images: more than 70 methods, mostly private. Jenkins pipelines allow us to load it automatically from an internal Jenkins repository, but to make things easier, we skipped that option and left the file on GitHub.
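
As an illustration only (this is not Stratio's actual code; the method names and the Maven profile are hypothetical), a shared-library file in the style of l.groovy could look like this:

```groovy
// Hypothetical sketch of a shared-library file in the style of l.groovy.
// Private methods are only callable from other methods inside this library.
private void mavenBuild(String profile) {
    // 'sh' is the standard Jenkins pipeline step for running shell commands
    sh "mvn -B clean compile -P${profile}"
}

// Public methods are exposed to any pipeline that loads this file
def doBuild() {
    mavenBuild('default')
}

return this // required so the Jenkins 'load' step returns this script object
```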

Libvars.groovy holds the shared variables. Groovy allows untyped variables, but some are typed for easier maintenance. Some of these variables are constants, such as URLs (internal Nexus, Gitolite or Docker registry), Slack channels and default versions.
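
A minimal sketch of what such a variables file might contain (the concrete hostnames, channel name and version below are made-up placeholders, not Stratio's real values):

```groovy
// Hypothetical sketch of libvars.groovy: typed constants plus untyped variables.
import groovy.transform.Field

// Typed constants: typos and misuse are caught earlier, easing maintenance
@Field String NEXUSURL     = 'http://nexus.internal.example:8081'
@Field String REGISTRYURL  = 'registry.internal.example:5000'
@Field String SLACKCHANNEL = '#builds'
@Field String DEFAULTJDK   = '1.8'

// Groovy also allows untyped variables, filled in later by each pipeline
def MODULE
def TIMEOUT

return this // required so the Jenkins 'load' step returns this script object
```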

Libpipeline.groovy contains the primary method. It decides which type of operations will be performed for the current job. We will talk about this file later.

Dev-project.groovy is the true pipeline: true because it loads the three previous files, sets the values of the variables and invokes the main method above. As an example, we can take a look at one of Stratio’s open source projects (Stratio Crossdata), with comments explaining each goal:



import groovy.transform.Field

@Field def l

node('master') { // Common element load
    def cdpwd = pwd().split('/').reverse()[0].replaceAll('@.*', '')
    l = load "../${cdpwd}@script/l.groovy"
    l.v = load "../${cdpwd}@script/libvars.groovy"
    l.pipeline = load "../${cdpwd}@script/libpipeline.groovy"
}

// Some metadata for checking out, identifying and messaging about warnings/errors
l.v.MODULE = 'crossdata' 
l.v.MAIL = 'crossdata@stratio.com'
l.v.REPO = 'Stratio/crossdata'
l.v.ID = 'xd'
l.v.SLACKTEAM = 'stratiocrossdata'
l.v.FAILFAST = true 

// Stratio is polyglot, and so are its developments. 
// We need to know what build tool we have to use
l.v.BUILDTOOL = 'maven' 

// Should we deploy to sonatype oss repository (so maven artifacts become public)
l.v.FOSS = true 

// Each PR gets statuses, as soon as each run action passes or fails
l.v.PRStatuses = ['Compile', 'Unit Tests', 'Integration Tests', 'Code Quality'] 

l.v.MERGETIMEOUT = 70  // Timeouts for each kind of operation we could perform
l.v.PRTIMEOUT = 30
l.v.RELEASETIMEOUT = 30
l.v.SQUASHTIMEOUT = 15

l.v.MERGEACTIONS = { // If our git hook sent a payload related to a PR being merged 
                  l.doBuild()

                  parallel(UT: {
                     l.doUnitTest()
                  }, IT: {
                     l.doIntegrationTest()
                  }, failFast: l.v.FAILFAST)

                  l.doPackage() //Might be a tgz, deb, jar, war
                  // java-scaladocs are published to our s3 bucket 
                  // (such as http://stratiodocs.s3-website-us-east-1.amazonaws.com/cassandra-lucene-index/3.0.6.1/)
                  l.doDoc() 

                  parallel(CQ: {
                     // Static code analysis with Sonarqube 
                     // and coveralls.io (for some FOSS projects)
                     l.doCodeQuality() 
                  }, DEPLOY: {
                     l.doDeploy()
                  }, failFast: l.v.FAILFAST)

                  // And push it to our internal docker registry
                  //, for a later usage in tests and demos
                  l.doDockerImage() 
                  // A Marathon cluster deploys the previously built image
                  l.doMarathonInstall('mc1')
                  l.doAcceptanceTests(['basic', 'auth', 'cassandra', 'elasticsearch', 'mongodb', 'mesos', 'yarn'])
                 }

l.v.PRACTIONS = { // If our git hook sent a payload about a PR being opened or synced
               l.doBuild()

               parallel(UT: {
                  l.doUnitTest()
               }, IT: {
                  l.doIntegrationTest()
               }, failFast: l.v.FAILFAST)

               l.doCodeQuality()
               // We deploy a subset of our wannabe packages to a staging repo
               l.doStagePackage()
               // Works like Packer, building a temporary Docker image, so a container can be used for testing
               l.doPackerImage()
               l.doAcceptanceTests(['basic', 'auth', 'cassandra', 'elasticsearch', 'mongodb', 'mesos', 'yarn'])
              }

l.v.BRANCHACTIONS = { // We could receive a hook signalling a branch to be forged
                   l.doBranch()
                  }

l.v.RELEASEACTIONS = { // So we could release a final version
                    l.doRelease()
                    l.doDoc()
                    l.prepareForNextCycle()
                    // This time the image is the real deal
                    // It will end @ Docker Hub (https://hub.docker.com/r/stratio/)
                    l.doDockerImage() 

                    // Deploying again, to a production Marathon cluster
                    l.doMarathonInstall('mc2')
                    // Let the world know a new version is released, and spread its changelog
                    l.doReleaseMail() 
                   }

l.v.SQUASHACTIONS = {
                   // Currently just checks a PR's statuses, rebases it,
                   // invokes l.v.PRACTIONS, and merges the PR
                   l.doSquash() 
                  }

l.pipeline.roll()

Returning to libpipeline.groovy, we can see how some of the previously configured variables are used:
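
The original screenshot of libpipeline.groovy is not reproduced here. As a hedged sketch only (not Stratio's actual implementation; the EVENT variable and the switch structure are assumptions), its main method could dispatch the closures configured above like this:

```groovy
// Hypothetical sketch of libpipeline.groovy's roll() method:
// pick which configured closure to run, based on the event that triggered the job.
def roll() {
    switch (l.v.EVENT) {   // EVENT would be derived from the git hook payload
        case 'merge':
            timeout(l.v.MERGETIMEOUT)   { l.v.MERGEACTIONS() }   // minutes by default
            break
        case 'pr':
            timeout(l.v.PRTIMEOUT)      { l.v.PRACTIONS() }
            break
        case 'branch':
            l.v.BRANCHACTIONS()
            break
        case 'release':
            timeout(l.v.RELEASETIMEOUT) { l.v.RELEASEACTIONS() }
            break
        case 'squash':
            timeout(l.v.SQUASHTIMEOUT)  { l.v.SQUASHACTIONS() }
            break
    }
}

return this
```

The 'timeout' wrapper is the standard Jenkins pipeline step, which matches how the four *TIMEOUT variables are declared in dev-project.groovy, one per kind of operation.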

Some of the features not mentioned above are the brightest: before the integration and acceptance tests run, several Docker images are selected, started and configured. When the tests finish, the containers are destroyed, so the tests always enjoy a clean environment.
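
This clean-environment behaviour can be sketched as follows; the method name, container names and image versions are illustrative assumptions, not Stratio's actual code:

```groovy
// Hypothetical sketch: start service containers for a test run, always clean up.
def withTestEnvironment(Closure tests) {
    try {
        // 'sh' runs on the build agent; image names/versions are illustrative
        sh 'docker run -d --name it-cassandra cassandra:2.2'
        sh 'docker run -d --name it-mongo mongo:3.2'
        tests()   // e.g. { l.doIntegrationTest() }
    } finally {
        // Destroy the containers so the next run gets a clean environment
        sh 'docker rm -f it-cassandra it-mongo'
    }
}
```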

As you can imagine, we can consult both public and private repositories in different git providers (GitHub, GitLab, Bitbucket). We can work with Maven as well as other build tools.

And since most settings can be defined by each development team, some of them are read from each git repository.
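
One plausible way to do this, sketched with standard Jenkins pipeline steps (the config file name and its keys are hypothetical, not Stratio's actual convention):

```groovy
// Hypothetical sketch: let each team override pipeline defaults
// from a file committed to its own repository.
def loadProjectOverrides() {
    // 'fileExists' and 'load' are standard Jenkins pipeline steps;
    // '.cd-config.groovy' is an illustrative file name
    if (fileExists('.cd-config.groovy')) {
        def overrides = load '.cd-config.groovy'
        if (overrides.PRTIMEOUT) { l.v.PRTIMEOUT = overrides.PRTIMEOUT }
        if (overrides.BUILDTOOL) { l.v.BUILDTOOL = overrides.BUILDTOOL }
    }
}
```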

Javier Delgado is an unofficial evangelist for the use of Jenkins for continuous tasks (inspection, testing, delivery), passionate about automation and a speaker at various conferences on these matters. A computer engineer and faithful follower of continuous learning, he currently works as a DevOps engineer at Stratio, a Spanish-American big data company and pioneer in offering large companies a complete digital transformation around their data through a single product.

Original article published in Stratio’s blog.
