Tuesday, December 19, 2023

How to trigger a Jenkins job from another Jenkins job | Jenkins job Integrating another Jenkins Job | Jenkins Pipeline job triggering another Jenkins Job

A Jenkins job can be triggered in many different ways. This article provides steps to trigger a Jenkins job from another Jenkins job.


Pre-requisites:
  • A Jenkins instance up and running
  • At least two jobs: the job that triggers and the job to be triggered

Scenario #1 (post-build) - How to trigger a Jenkins job from another freestyle job?

1. Log in to the Jenkins instance.
2. Open any existing freestyle build job.
3. Click on Configure.




4. Go to Post-build Actions.

5. Click Add post-build action --> Build other projects.

6. Type the name of the job (project) you want to trigger, and check Trigger only if build is stable.


7. Save the job.
8. Build the job now. Once the current job finishes, it will trigger the next job immediately.
Check the console output of the current job; you will see that it triggered the second job.

9. Go to the second job and check its console output. You will see it was triggered by the first job.

Scenario #2 (pre-build) - How to trigger a Jenkins job from another freestyle job?

1. Open your freestyle build job.
2. Click on Configure.
3. Click on Build Triggers.
4. Check Build after other projects are built.
    


Select the source job, which will be built first; once its build is stable, it will trigger this job. Also check Trigger only if build is stable.

5. Save the job. 

6. Run the first job. Once it succeeds, it will trigger this job. (A pipeline equivalent of this upstream trigger is sketched below.)
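
For pipeline jobs, the same upstream trigger can be expressed in a Jenkinsfile. A minimal sketch, assuming the upstream job is named firstJob (a hypothetical name):

pipeline {
    agent any
    triggers {
        // Run this job after 'firstJob' completes successfully
        upstream(upstreamProjects: 'firstJob', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by the upstream job'
            }
        }
    }
}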


Scenario #3 - How to trigger any Jenkins job from a pipeline job:

pipeline {
    agent any
    stages {
        stage('Trigger Another Job') {
            steps {
                // Trigger 'mySecondJob' without waiting for it to finish
                build job: 'mySecondJob', wait: false
            }
        }
    }
}
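
The build step also accepts parameters and can wait for the downstream result. A minimal sketch, assuming mySecondJob defines a string parameter named ENV (a hypothetical parameter):

stage('Trigger With Parameters') {
    steps {
        // Wait for the downstream job; this build fails if it fails
        build job: 'mySecondJob',
              wait: true,
              parameters: [string(name: 'ENV', value: 'dev')]
    }
}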




Wednesday, November 29, 2023

Jenkins CI/CD Pipeline Optimization Best Practices | Optimizing Jenkins CI/CD pipelines

Optimizing Jenkins CI/CD pipelines is crucial for achieving faster, more efficient, and reliable software delivery. Here are some best practices and strategies for optimizing Jenkins pipelines:

1. Parallelization:

  • Parallel Stages: Break down your pipeline into stages and parallelize independent stages to run concurrently. This can significantly reduce the overall pipeline execution time.

    stages {
        stage('Build') {
            steps {
                script {
                    parallel(
                        unit_tests: {
                            // Run unit tests
                        },
                        integration_tests: {
                            // Run integration tests
                        }
                    )
                }
            }
        }
        // Other stages...
    }

2. Artifact Caching:

  • Use Caches: Cache dependencies and build outputs between pipeline runs so redundant build steps can be skipped. Note that core declarative pipeline has no built-in caches option; caching is typically provided by plugins such as Job Cacher. A related housekeeping option, buildDiscarder, keeps build history from growing unbounded:

    pipeline {
        options {
            // Keep only the last 5 builds
            buildDiscarder(logRotator(numToKeepStr: '5'))
        }
        // Pipeline stages...
    }

3. Agent Utilization:

  • Node Pools: Distribute builds across multiple Jenkins agents or node pools to leverage available resources effectively. Adjust the number of executors on each agent based on workload.

    pipeline {
        agent { label 'docker' }
        // Pipeline stages...
    }

4. Incremental Builds:

  • Only Build Changes: Set up your pipeline to trigger builds only for changes in relevant branches. Use Git SCM polling or, preferably, webhooks to trigger builds on code changes (see the sketch below).
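
  A minimal sketch of SCM polling in a declarative pipeline (the schedule is an assumption; webhooks avoid polling overhead entirely):

    pipeline {
        agent any
        triggers {
            // Poll the SCM about every 5 minutes; build only when new commits exist
            pollSCM('H/5 * * * *')
        }
        stages {
            stage('Build') {
                steps {
                    echo 'Building changed code'
                }
            }
        }
    }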

5. Artifact Promotion:

  • Promote Artifacts: Promote artifacts from one environment to another instead of rebuilding them. This maintains consistency across environments and reduces build times (see the sketch below, which uses the Copy Artifact plugin).
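
  A minimal sketch using the Copy Artifact plugin; the upstream job name build-job and the WAR filter are assumptions:

    stage('Promote Artifact') {
        steps {
            // Reuse the WAR produced by the upstream build instead of rebuilding it
            copyArtifacts(projectName: 'build-job', selector: lastSuccessful(), filter: '**/*.war')
        }
    }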

6. Pipeline DSL Optimization:

  • Code Reusability: Use shared libraries and functions to avoid duplicating code across multiple pipeline scripts. This promotes code reusability and simplifies maintenance.

7. Conditional Execution:

  • When Conditions: Use the when directive to conditionally execute stages based on certain criteria, such as branch names or environment variables.

    stage('Deploy to Production') {
        when {
            expression { params.DEPLOY_TO_PROD == 'true' }
        }
        steps {
            // Deployment steps
        }
    }

8. Artifact Cleanup:

  • Clean Workspace: Include a step to clean up the workspace at the end of each build to avoid accumulation of unnecessary artifacts and files.
    post {
        always {
            cleanWs()
        }
    }

9. Pipeline Visualization:

  • Blue Ocean: Consider using the Blue Ocean plugin for Jenkins, which provides a more visually appealing and intuitive view of your pipeline.

10. Monitoring and Analytics:

  • Collect Metrics: Implement monitoring and analytics to collect data on pipeline performance. Identify bottlenecks and areas for improvement.

11. Pipeline as Code:

  • Declarative Syntax: Use the declarative syntax for Jenkins pipeline scripts whenever possible. It is more concise and easier to read.
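
  For illustration, a minimal declarative pipeline looks like this:

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    echo 'Building...'
                }
            }
        }
    }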

12. Use Jenkins Shared Libraries:

  • Library Usage: If you have common functionality across multiple pipelines, consider moving that logic into a shared library. This promotes code reuse and centralizes maintenance.
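
  A minimal sketch, assuming a shared library named my-shared-lib (hypothetical) that defines a global step in vars/sayHello.groovy:

    // vars/sayHello.groovy (in the shared library repository)
    def call(String name) {
        echo "Hello, ${name}!"
    }

    // Jenkinsfile using the library
    @Library('my-shared-lib') _
    pipeline {
        agent any
        stages {
            stage('Greet') {
                steps {
                    // Calls the step provided by the shared library
                    sayHello('team')
                }
            }
        }
    }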

13. Artifact Signing and Verification:

  • Security Checks: Integrate security checks into your pipeline, including artifact signing and verification steps, to ensure the integrity and authenticity of your artifacts.

14. Automated Testing:

  • Automated Tests: Include automated tests for your pipeline scripts to catch issues early. Jenkins provides testing frameworks like Jenkins Pipeline Unit for this purpose.

15. Infrastructure as Code:

  • Infrastructure Automation: Treat your Jenkins infrastructure as code. Use tools like Docker and Kubernetes for scalable and reproducible Jenkins environments.
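
  As an example, a reproducible Jenkins controller can be started from the official Docker image (the port and volume choices here are assumptions):

    # Run the Jenkins LTS image with a persistent home volume
    docker run -d --name jenkins \
      -p 8080:8080 -p 50000:50000 \
      -v jenkins_home:/var/jenkins_home \
      jenkins/jenkins:lts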

Friday, November 10, 2023

How to Integrate Artifactory with Azure DevOps | Upload Artifacts from Azure DevOps YAML Pipelines to Artifactory | Artifactory and Azure DevOps Integration | Upload Java WAR file into Artifactory from Azure Pipelines

How to integrate Artifactory with Azure DevOps for uploading build artifacts?



Artifactory is one of the most popular binary repository managers. It is a Java-based open-source tool used for storing build artifacts and Docker images.

Some of the key features of Artifactory:
  • Supports 27 different package types, including Helm charts and Docker images, regardless of tech stack
  • A single source of truth for all your binaries
  • Integrates with all major CI/CD tools
  • Role-based authorization for teams to manage artifacts
  • Lets you create local, remote, and virtual repositories
Pre-requisites:
  • An Artifactory instance up and running
  • An Azure DevOps organization with the JFrog extension installed and a service connection to Artifactory
  • A Java project (with a pom.xml) hosted in GitHub

We will automate the build of a Java project hosted in GitHub: an Azure DevOps YAML pipeline will use Maven to build and package the code, then upload the WAR file into Artifactory.

Steps to be followed to create a pipeline:

Create a Pipeline:
Log in to your Azure DevOps dashboard: https://dev.azure.com


Go to Pipelines and click New.
Select your SCM; in my case, the Java web app code is in GitHub.



Select the Java repo, then select Maven from the options. This will generate a YAML file for us.

Add a task for uploading the WAR file:
Click Show assistant, search for "generic", and select JFrog Generic Artifacts.


Choose Upload as the command, choose the service connection for Artifactory, enter *.war as the pattern, and enter the name of the repository where the artifact will be uploaded. Click Add.


Now save the pipeline and run it.

Azure DevOps YAML Pipeline Code

Using the pipeline code below, we can automate the build with Maven and upload the WAR file into Artifactory:

# Build your Java project using Maven and upload the WAR file to Artifactory

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'MyWebApp/pom.xml'
    mavenOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    goals: 'package'

- task: JFrogGenericArtifacts@1
  inputs:
    command: 'Upload'
    connection: 'Artifactory_svc_conn'
    specSource: 'taskConfiguration'
    fileSpec: |
      {
        "files": [
          {
            "pattern": "*.war",
            "target": "libs-snapshot-local"
          }
        ]
      }
    failNoOp: true

Now log in to Artifactory and go to Artifactory --> Artifacts. We can see the WAR file that was uploaded.


This is how we can easily integrate Artifactory with Azure DevOps for uploading build artifacts.


Monday, October 30, 2023

Fix for Kubernetes Deployment Error using helm chart | Error: kubernetes cluster unreachable: exec plugin: invalid apiversion "client.authentication.k8s.io/v1alpha1"

 Error: kubernetes cluster unreachable: exec plugin: invalid apiversion "client.authentication.k8s.io/v1alpha1"


 

Root cause and fix:
Newer Helm releases bundle a Kubernetes client that no longer accepts the v1alpha1 exec-credential API version still referenced in the kubeconfig, which is why the cluster connection is rejected. Downgrading Helm to version 3.8.2 resolves the issue:

curl -L https://git.io/get_helm.sh | bash -s -- --version v3.8.2
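
After the script completes, verify the installed version:

helm version --short
# Should print something like v3.8.2+g<commit>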