Thursday, May 12, 2022

AWS, Azure Cloud and DevOps Coaching Online Classes | May 2022 Schedule

Are you in IT? Tired of your current work? Not able to make good progress in your career?

Are you out of a job? Looking for a break into IT? Are you interested in learning DevOps?

Did you get laid off from your previous job due to Covid-19?

You are in the right place to kick-start your career in DevOps. DevOps is one of the top and hottest IT skills right now. Almost all employers are currently struggling to find the right resources for their teams, people who can do DevOps and automation work. You could be that person by attending this coaching program.

DevOps Coaching Classes schedule for May 2022:

- Weekdays batch (Mondays/Wednesdays): starts May 25th, 6:00 PM to 8:00 PM CST
- Weekends batch (Sat/Sun): starts May 29th, 11:35 AM to 01:30 PM CST on Saturdays & 02:00 PM to 04:00 PM CST on Sundays

DevOps Coaching Highlights:

- Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Puppet, Docker, AWS IAM, ECR, Docker registry, and the AWS and Azure cloud platforms.

- Coach has 22+ years of professional IT experience, including 6+ years in DevOps/Cloud/Automation.

- Many students from this coaching program have already been placed successfully in reputed companies.

- Works as a Sr. DevOps Coach/Architect at one of the top IT services companies in the USA.

- Unique program: less theory, more hands-on lab exercises.

- Resume preparation is done with each candidate personally.

- One-to-one interview coaching.

- Coaching is purely hands-on and fully job relevant.

- 100% job assistance.

- Coached 1200+ people successfully over the past four and a half years; many students have been placed with large enterprises in the DFW, Charlotte, Houston, Austin, Chicago, Florida, Seattle, Bay Area, Ohio, NJ and NY areas.

Contact no #: +1(469)733-5248
Email id: devops.coaching@gmail.com
Contact: Coach

Tuesday, May 3, 2022

stderr: remote: Bitbucket Cloud recently stopped supporting account passwords for Git authentication. remote: App passwords are recommended | Fix for this issue | How to Create app passwords in Bitbucket?

When you create freestyle or pipeline jobs in Jenkins and try to check out a project from Bitbucket using your Bitbucket account password, you may get this error.

Reason for this error:

Beginning March 1, 2022, Bitbucket Cloud users will no longer be able to use their account passwords when using Basic authentication for Git over HTTPS and the Bitbucket Cloud REST API. The removal of account password usage for Basic authentication when using Git over HTTPS and/or the Bitbucket Cloud REST API is due to Bitbucket Cloud's ongoing effort to align with internal infrastructure and improve Atlassian account security. App passwords are substitute passwords for a user's account and are designed to be used for a single purpose with limited permissions.



How to Create App Passwords in Bitbucket?

Go to Bitbucket --> Settings
Click on App passwords --> Create app password



Now enter a label name and select Read/Write under Repositories.

Click on Create; this will generate the app password.
Now you can use this app password in Jenkins.
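
Before wiring the app password into Jenkins credentials, a quick sanity check from a shell can confirm it works. This is only a rough sketch; the username, workspace, repository and password values below are placeholders you need to replace with your own.

# Clone over HTTPS using your Bitbucket username (not the email address) and the generated app password.
git clone https://<your_bitbucket_username>:<your_app_password>@bitbucket.org/<your_workspace>/<your_repo>.git

In Jenkins itself, store the username and app password as a "Username with password" credential and select that credential in the job's SCM configuration instead of the old account password.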

Provision Ubuntu 18.04 EC2 Instance | How to create EC2 instance in AWS console | Launch Ubuntu 18.04 instance in AWS

 How to create EC2 instance in AWS console using new UI experience?

What is EC2 instance? 

It is a virtual server provided by AWS. We will be using this EC2 instance to set up both Jenkins and Tomcat. Please follow the steps below to create an EC2 instance.

Watch here for live demo:

Steps:
1. Login to the AWS console by clicking this link -->  https://aws.amazon.com/console/
Click on All services, click on Compute --> click on EC2


2. Click on Launch instance


3. Enter EC2 as the Name and 2 as the number of instances


4. Select Ubuntu and choose Ubuntu Server 18.04 as the AMI




5. Select t2.small as the instance type
6. Click on Create new key pair


7. Choose an existing key pair if you have one; otherwise create a new one and give it a name such as myJenkinsKey. Make sure you download the key to your local machine. Please do NOT use spaces or special characters in the key name.



8. Under Network settings, Click Edit



Add 8080 as the port range and select Anywhere as the source type; this should set 0.0.0.0/0 as the source.

9. Enter 10 GB as the storage size.
Then make sure the values in the Summary appear as below:


10. Click on Launch Instance.

Click on View instances

Now you should be able to view instances in AWS console.
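
If you prefer the command line, the same launch can be approximated with the AWS CLI. This is only a rough sketch of the console steps above: the AMI ID and security group ID are placeholders, and you would look up the current Ubuntu 18.04 AMI for your region.

# Launch two t2.small Ubuntu 18.04 instances with a 10 GB root volume (placeholder IDs below).
aws ec2 run-instances \
  --image-id ami-xxxxxxxxxxxxxxxxx \
  --instance-type t2.small \
  --count 2 \
  --key-name myJenkinsKey \
  --security-group-ids sg-xxxxxxxx \
  --block-device-mappings '[{"DeviceName":"/dev/sda1","Ebs":{"VolumeSize":10}}]' \
  --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=EC2}]'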


Connect to EC2 instance from local machine:
Please click the below link to understand the steps for connecting to an EC2 instance from your local machine (Windows or Mac laptop).

http://www.cidevops.com/2018/02/how-to-connect-to-ec2-instance-from.html
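
As a quick reference, connecting from a Mac/Linux terminal (or Git Bash on Windows) usually looks like the sketch below; the key file name and public DNS value are placeholders based on the steps above.

# Restrict the key permissions, then SSH in as the default "ubuntu" user of the Ubuntu AMI.
chmod 400 myJenkinsKey.pem
ssh -i myJenkinsKey.pem ubuntu@<your-ec2-public-dns>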

Thursday, April 28, 2022

AWS, Azure Cloud and DevOps Coaching Online Classes | May 2022 Schedule

Are you in IT? Tired of your current work? Not able to make good progress in your career?

Are you out of a job? Looking for a break into IT? Are you interested in learning DevOps?

Did you get laid off from your previous job due to Covid-19?

You are in the right place to kick-start your career in DevOps. DevOps is one of the top and hottest IT skills right now. Almost all employers are currently struggling to find the right resources for their teams, people who can do DevOps and automation work. You could be that person by attending this coaching program.

DevOps Coaching Classes schedule for May 2022:

- Weekdays batch (Mondays/Wednesdays): starts May 25th, 6:00 PM to 8:00 PM CST
- Weekends batch (Sat/Sun): starts May 29th, 11:35 AM to 01:30 PM CST on Saturdays & 02:00 PM to 04:00 PM CST on Sundays

DevOps Coaching Highlights:

- Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Puppet, Docker, AWS IAM, ECR, Docker registry, and the AWS and Azure cloud platforms.

- Coach has 22+ years of professional IT experience, including 6+ years in DevOps/Cloud/Automation.

- Many students from this coaching program have already been placed successfully in reputed companies.

- Works as a Sr. DevOps Coach/Architect at one of the top IT services companies in the USA.

- Unique program: less theory, more hands-on lab exercises.

- Resume preparation is done with each candidate personally.

- One-to-one interview coaching.

- Coaching is purely hands-on and fully job relevant.

- 100% job assistance.

- Coached 1200+ students successfully over the past four and a half years; many students have been placed with large enterprises in the DFW, Charlotte, Houston, Austin, Chicago, Florida, Seattle, Bay Area, Ohio, NJ and NY areas.

To join the coaching classes, contact the coach below:

Contact no # : +1(469)733-5248
Email id: devops.coaching@gmail.com
Contact: Coach

Saturday, April 23, 2022

Create Azure Pipeline using YAML | Create Azure YAML Pipeline | Build Pipelines in Azure DevOps to Deploy into Azure Cloud | How to Deploy Java WebApp into Azure Web App in Azure Cloud

Building pipelines in Azure DevOps is really easy; you can migrate your web applications from anywhere into Azure Cloud by using Azure Pipelines.



Watch the steps in YouTube Channel:

Pre-requisites:


(If you already have a WebApp set up in Azure cloud, move to the next step.)
You need to create a WebApp in Azure Cloud. A WebApp is an App Service (essentially Platform as a Service) provided by Azure Cloud, and you can migrate any web application to it.

Click here to learn how to create WebApp in Azure Portal.

Create Azure Build YAML pipeline in Azure DevOps

Login to Azure DevOps, go to your project dashboard.

Click on Pipelines --> new pipeline


Select GitHub or wherever your source code is


Select your code repo

Choose the below option --> Maven package Java project web app to Linux on Azure


Choose the Azure subscription and click on Continue.
Enter your Microsoft account details so Azure Pipelines can authenticate with Azure cloud to deploy the WebApp into App Service.

Now choose the web app name from the drop-down.

Click on Validate and configure

Now review the Azure pipeline code
and make sure the path of pom.xml is set properly as MyWebApp/pom.xml





Click on Save and Run

Once you save, the pipeline YAML file is committed to your repository; you can confirm this by logging into your GitHub repo.


Now the build should be running. Once the build is successful, we need to give permission for the deployment into Azure. Click on Review.






This confirms that the webapp is deployed successfully in Azure cloud.


Verify the webapp deployment into Azure cloud

Go to the App settings of the WebApp, copy the URL and append /MyWebApp:

https://enteryourwebappurl/MyWebApp
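
You can also check from a shell. This is a small hedged sketch; the resource group and web app names are placeholders for your own values, and it assumes the Azure CLI is logged in to the same subscription.

# Look up the default host name of the web app, then hit the deployed context path.
az webapp show --resource-group <your-resource-group> --name <your-webapp-name> --query defaultHostName -o tsv
curl -I https://<your-webapp-name>.azurewebsites.net/MyWebApp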


That's it. That is how you deploy a Web App from Azure Pipelines into Azure cloud.

Sunday, March 27, 2022

Install Jenkins on Ubuntu 18.04 using Docker Compose | Setup Jenkins on AWS EC2 Ubuntu instance | How to set up Jenkins on an Ubuntu EC2 instance using Docker?

Please follow the steps below to install Jenkins using Docker Compose on an Ubuntu 18.04 instance.

What is Docker Compose?
Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration.
 
The purpose of docker-compose is to do what the docker CLI does, but to issue multiple commands much more quickly. To make use of docker-compose, you encode the commands you were running before into a docker-compose.yml file.
 
Run docker-compose up and Compose starts and runs your entire app.

Change Host Name to Jenkins
sudo hostname Jenkins

Perform update first
sudo apt update

Now let's start with the Docker Compose installation:

Install Docker-Compose
sudo apt-get install docker-compose -y

Add current user to docker group
sudo usermod -aG docker $USER

Create directory
mkdir ~/jenkins

Jenkins Setup

Create docker-compose.yml
This YAML file has all the configuration for installing Jenkins.
sudo vi docker-compose.yml 

version: '3.3'
services:
  jenkins:
    image: jenkins/jenkins:lts
    restart: unless-stopped
    privileged: true
    user: root
    ports:
      - 8080:8080
    container_name: jenkins
    volumes:
      - ~/jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock
      - /usr/local/bin/docker:/usr/local/bin/docker

Now execute the compose file using Docker compose command:
sudo docker-compose up -d 



Make sure Jenkins is up and running
sudo docker-compose logs --follow
You can also get the admin password


How to get Jenkins admin password in another way?
Identify Docker container name

sudo docker ps


Get admin password by executing below command
sudo docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword


Access Jenkins in web browser

Now go to the AWS console. Click on EC2, then click on the running instances link. Select the checkbox of the EC2 instance where you installed Jenkins. Click on Actions and copy the value from step 4 that says --> Connect to your instance using its Public DNS:

Now go to your browser and enter the public DNS name or public IP address with port number 8080.
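
Optionally, a quick sanity check from the EC2 shell confirms Jenkins is answering on port 8080 before you try the browser:

# Expect an HTTP response (typically 403 until Jenkins is unlocked) if the container is up.
curl -I http://localhost:8080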

Unlock Jenkins
You will get the Unlock Jenkins screen. Run the admin password command shown above in Git Bash (Ubuntu console).

Copy the password and paste it in the browser.
Then click on Install suggested plugins.
Also create a username and password.
For a quick lab setup you can enter admin for both the username and the password.
Click on Save and Finish. Click on Start using Jenkins. Now you should see a screen like below:



That's it. You have setup Jenkins successfully using Docker compose. 
Please watch the steps in our YouTube channel.

Thursday, March 3, 2022

How to store Terraform state file in Azure Storage | How to manage Terraform state in Azure Blob Storage

One of the amazing features of Terraform is that it tracks the infrastructure you provision. It does this by means of state. By default, Terraform stores state locally in a file named terraform.tfstate. This does not work well in a team environment, where a developer who wants to make a change needs to make sure nobody else is updating the same state at the same time. You need to use remote storage for the state file.




With remote state, Terraform writes the state data to a remote data store, which can then be shared between all members of a team. Terraform supports storing state in many ways including the below:

  • Terraform Cloud
  • HashiCorp Consul
  • Amazon S3
  • Azure Blob Storage
  • Google Cloud Storage
  • Alibaba Cloud OSS
  • Artifactory or Nexus 

We will learn how to store the state file in Azure Blob Storage. We will be creating an Azure storage account and a container.

Watch the steps in YouTube Channel:

Pre-requisites:

Steps:

Configure the remote state storage account

Before you use Azure Storage as a backend, you must create a storage account. We will create it using a shell script:

#!/bin/bash
RESOURCE_GROUP_NAME=tfstate
STORAGE_ACCOUNT_NAME=tfstate$RANDOM
CONTAINER_NAME=tfstate
# Create resource group
az group create --name $RESOURCE_GROUP_NAME --location eastus
# Create storage account
az storage account create --resource-group $RESOURCE_GROUP_NAME --name $STORAGE_ACCOUNT_NAME --sku Standard_LRS --encryption-services blob
# Create blob container
az storage container create --name $CONTAINER_NAME --account-name $STORAGE_ACCOUNT_NAME


This should have created the resource group, storage account and container.
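
To double-check from the shell, you can list the containers in the new account. This assumes the variable names from the script above and that your logged-in account has permission to list containers (otherwise pass an account key).

# Confirm the tfstate container exists in the generated storage account.
az storage container list --account-name $STORAGE_ACCOUNT_NAME --query "[].name" -o tsv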


Configure terraform backend state 

To configure the backend state, you need the following Azure storage information:

    • storage_account_name: The name of the Azure Storage account.
    • container_name: The name of the blob container.
    • key: The name of the state store file to be created.
    • access_key: The storage access key.
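
A common way to supply access_key without hard-coding it in backend.tf is the ARM_ACCESS_KEY environment variable. The sketch below reuses the variable names from the shell script above; adjust them if your names differ.

# Fetch the storage account key and expose it to Terraform via ARM_ACCESS_KEY.
ACCOUNT_KEY=$(az storage account keys list --resource-group $RESOURCE_GROUP_NAME --account-name $STORAGE_ACCOUNT_NAME --query '[0].value' -o tsv)
export ARM_ACCESS_KEY=$ACCOUNT_KEY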
    Create backend.tf file

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "=2.63.0"
    }
  }
  backend "azurerm" {
    resource_group_name  = "tfstate"
    storage_account_name = "<storage_acct_name>"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "demo-rg" {
  name     = "demo-rg"
  location = "eastus"
}


terraform init

terraform apply

Type yes when prompted.

This should have created the state file called terraform.tfstate in the container inside the Azure storage account.

You can view the remote state file info:
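
From the command line, one way to confirm the state blob exists (the container name and key match the backend block above; the storage account name is your own) is:

# List blobs in the tfstate container; terraform.tfstate should appear after the first apply.
az storage blob list --account-name <storage_acct_name> --container-name tfstate --query "[].name" -o tsv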


    This is how you can store terraform state information remotely.