As previously discussed, there are multiple methods for creating pipelines, including the use of shared projects. In this article, we will delve deeper into the concept of shared projects and explore their benefits.
In the DevOps realm, it is common to have multiple projects that need to be tested, built, and deployed using technologies such as Docker and AWS. Writing a separate pipeline for each of these projects, repeating tasks like building JAR files and creating Docker images, quickly becomes tedious and time-consuming. By using shared projects, you can write that common logic once and reuse it everywhere, freeing you up to focus on other important tasks and keep learning and growing.
Prerequisites
- Basic knowledge of the terminal
- A Jenkins server
- A Docker Hub account
- A GitHub or GitLab account
- A smile on your face (put up that smile, friend!)
Introduction
A shared project in Jenkins is a way for multiple pipelines to access common resources, such as utility scripts, configuration files, and libraries. This can help to improve the maintainability and reusability of your pipeline code, as well as ensure consistency across different parts of your application.
Image: a shared project providing common resources to multiple pipelines
Once a shared project is set up, it can be referenced in other pipeline jobs using the library step or the @Library annotation. This allows the pipeline to access the shared resources and use them in the pipeline script, for example by calling a shared function or using a shared variable.
// Load the shared library dynamically from Git inside the pipeline script
library identifier: 'jenkins-shared-library@main', retriever: modernSCM(
    [$class: 'GitSCMSource',
     remote: 'https://github.com/noucair/jenkins-shared-library',
     credentialsId: 'github-code'
    ]
)
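Alternatively, if the library has been registered as a global shared library under Manage Jenkins (the name 'jenkins-shared-library' below assumes that is how it was registered), it can be loaded with the @Library annotation at the top of the Jenkinsfile:

@Library('jenkins-shared-library@main') _
// the trailing underscore gives the annotation something to attach to
// when no import statement follows it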
The full source code for this shared project is available at https://github.com/noucair/jenkins-shared-library.
Shared project composition
A shared project can be composed of various components such as scripts, libraries, configuration files, and other resources that are used by multiple pipelines. These components can be organized in a structured way within the shared project, making it easy to find and use the resources you need, as shown in the layout sketch below.
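As a rough sketch, a Jenkins shared library repository conventionally follows this layout (the exact contents of the linked repository may differ):

jenkins-shared-library/
├── src/            Groovy classes, e.g. src/com/example/Docker.groovy
├── vars/           global pipeline steps callable directly from Jenkinsfiles
└── resources/      static helper files such as scripts or config templates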
To provide a more detailed example, consider using a script that groups all Docker tasks, such as logging in, pushing, and building images, into a single class for ease of use in a pipeline.
#!/usr/bin/env groovy
package com.example

// Helper class that wraps common Docker tasks so pipelines can reuse them.
// The pipeline script is passed in so the class can call steps such as
// sh, echo, and withCredentials.
class Docker implements Serializable {

    def script

    Docker(script) {
        this.script = script
    }

    // Build a Docker image from the Dockerfile in the current workspace.
    def buildDockerImage(String IMAGE_NAME) {
        script.echo "building the docker image..."
        script.sh "docker build -t $IMAGE_NAME ."
    }

    // Log in to Docker Hub using the 'dockerhub' username/password credential.
    def dockerLogin() {
        script.withCredentials([script.usernamePassword(credentialsId: 'dockerhub', passwordVariable: 'PASS', usernameVariable: 'USER')]) {
            script.sh "echo $script.PASS | docker login -u $script.USER --password-stdin"
        }
    }

    // Push the image to the registry (assumes dockerLogin() was called first).
    def dockerPush(String IMAGE_NAME) {
        script.sh "docker push $IMAGE_NAME"
    }
}
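To see how the class is used, here is a minimal sketch of a Jenkinsfile that loads the library and calls the Docker helpers. The image name is a placeholder, and the sketch assumes the library and the 'dockerhub' credential are configured as described above:

@Library('jenkins-shared-library@main') import com.example.Docker

// Instantiate the helper, passing the pipeline script so it can run steps
def dockerTools = new Docker(this)

pipeline {
    agent any
    environment {
        IMAGE_NAME = 'your-dockerhub-user/demo-app:1.0'   // placeholder image name
    }
    stages {
        stage('Build image') {
            steps {
                script {
                    dockerTools.buildDockerImage(env.IMAGE_NAME)
                }
            }
        }
        stage('Push image') {
            steps {
                script {
                    dockerTools.dockerLogin()
                    dockerTools.dockerPush(env.IMAGE_NAME)
                }
            }
        }
    }
}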
Conclusion
In summary, a shared project lets you gather scripts, classes, and configuration files in one place and reuse them across pipeline jobs. Instead of repeating Docker login, build, and push logic in every Jenkinsfile, you write it once, as in the Docker class above, and every pipeline that loads the library benefits from the same consistent, maintainable code.