Wednesday, July 7, 2021

AWS : Exam : Practice Exam : Mock Exam : Cloud Practitioner

https://geekflare.com/aws-practice-test/


https://digitalcloud.training/certification-training/

https://digitalcloud.training/certification-training/aws-certified-cloud-practitioner/

https://digitalcloud.training/courses/free-aws-certified-cloud-practitioner-practice-exam/


https://www.testpreptraining.com/


https://www.whizlabs.com/cart/ 

WELCOME


https://karanawsbucket.s3.us-east-1.amazonaws.com/Blogger_Images/products_whizcard-clf-c01-01-06.pdf

https://karanawsbucket.s3.us-east-1.amazonaws.com/Blogger_Images/products_csaa-whizcard-_-revised_14_06_2021.pdf

Azure : DevOps : Pipeline : YAML

 You can organize pipeline jobs into stages. Stages are the major divisions in a pipeline: "build this app", "run these tests", and "deploy to pre-production" are good examples of stages. They are logical boundaries in your pipeline where you can pause the pipeline and perform various checks.


Pipeline > Stages > Stage > Jobs > Job > Steps > Step
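A minimal skeleton showing that hierarchy (the stage, job, and step names here are only illustrative):

stages:
- stage: Build                  # a stage groups related jobs
  jobs:
  - job: Compile                # a job is a series of steps run on one agent
    steps:
    - script: echo "compiling"  # a step is a single script or task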


jobs:
- job: A
  steps:
  - bash: echo "A"
- job: B
  steps:
  - bash: echo "B"

  

  

If you organize your pipeline into multiple stages, you use the stages keyword.


If you choose to specify a pool at the stage level, then all jobs defined in that stage will use that pool unless otherwise specified at the job level.

stages:
- stage: A
  jobs:
  - job: A1
  - job: A2
- stage: B
  jobs:
  - job: B1
  - job: B2
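As noted above, a pool set at the stage level applies to every job in that stage unless a job overrides it. A minimal sketch (the pool names are illustrative):

stages:
- stage: A
  pool: StagePool              # A1 and A2 inherit this pool
  jobs:
  - job: A1
  - job: A2
    pool: JobPool              # A2 overrides the stage-level pool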

  

When you define multiple stages in a pipeline, by default they run sequentially in the order in which you define them in the YAML file. The exception is when you add dependencies: with dependencies, stages run in the order of their dependsOn requirements.

stages:
- stage: string
  dependsOn: string
  condition: string
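A concrete sketch of dependsOn (stage and job names are illustrative): both deploy stages wait for Build and then run in parallel with each other.

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo "building"
- stage: Deploy_US
  dependsOn: Build             # starts only after Build succeeds
  jobs:
  - job: DeployUS
    steps:
    - script: echo "deploying to US"
- stage: Deploy_EU
  dependsOn: Build             # also waits for Build, runs in parallel with Deploy_US
  jobs:
  - job: DeployEU
    steps:
    - script: echo "deploying to EU"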

  -----------------------------------------------------------

You can organize your pipeline into jobs. Every pipeline has at least one job. A job is a series of steps that run sequentially as a unit; in other words, a job is the smallest unit of work that can be scheduled to run.

In the simplest case, a pipeline has a single job. In that case, you do not have to explicitly use the job keyword.
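A sketch of that simplest case: pool and steps at the top level, no jobs or job keyword (the image matches the example further below).

pool:
  vmImage: 'ubuntu-16.04'
steps:
- bash: echo "single implicit job"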

  

jobs:
- job: myJob
  timeoutInMinutes: 10
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - bash: echo "Hello world"

  

  

# JDK 11 location noted from the hosted agent:
/usr/lib/jvm/adoptopenjdk-11-hotspot-amd64

update-alternatives --config java    # interactive: choose the active java
update-alternatives --list java      # list the registered java binaries
echo ls /etc/alternatives            # print the command before running it
ls /etc/alternatives                 # show every alternatives symlink
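A sketch of running those checks as pipeline steps and pointing JAVA_HOME at that JDK via a logging command (the JDK path is the one noted above and is an assumption about the agent image):

steps:
- bash: |
    ls /usr/lib/jvm
    update-alternatives --list java
  displayName: Inspect installed JDKs
- bash: echo "##vso[task.setvariable variable=JAVA_HOME]/usr/lib/jvm/adoptopenjdk-11-hotspot-amd64"
  displayName: Point JAVA_HOME at JDK 11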

  

  

https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops


https://faun.pub/reduce-your-build-time-using-caching-in-azure-pipelines-7a7bd0201cee




variables:
  MAVEN_CACHE_FOLDER: $(HOME)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'

- task: Cache@2
  inputs:
    key: '"funcs" | maven | "$(Agent.OS)" | **/pom.xml'
    restoreKeys: |
      "funcs" | maven | "$(Agent.OS)"
      "funcs" | maven
    path: $(MAVEN_CACHE_FOLDER)
  displayName: Cache Maven local repo

  

https://stackoverflow.com/questions/66161852/is-there-a-predefined-variable-for-home-vsts-in-azure-pipelines/66161853#66161853

  

  

Pool: Hosted Ubuntu 1604

Agent: Hosted Agent



Pool: Azure Pipelines

Image: ubuntu-16.04

Agent: Hosted Agent

Azure : DevOps : Pipelines : Task : Cache Task

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops#maven


Maven

Maven has a local repository where it stores downloads and built artifacts. To enable caching, set the maven.repo.local option to a path under $(Pipeline.Workspace) and cache this folder.

Example:

YAML
variables:
  MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'

steps:
- task: Cache@2
  inputs:
    key: 'maven | "$(Agent.OS)" | **/pom.xml'
    restoreKeys: |
      maven | "$(Agent.OS)"
      maven
    path: $(MAVEN_CACHE_FOLDER)
  displayName: Cache Maven local repo

- script: mvn install -B -e

If you are using a Maven task, make sure to also pass the MAVEN_OPTS variable because it gets overwritten otherwise:
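A sketch of passing it through, assuming the Maven@3 task and its mavenOptions input (the -Xmx value is illustrative):

- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m $(MAVEN_OPTS)'   # keeps the cached repo path flag when the task sets MAVEN_OPTS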

Friday, July 2, 2021

Install Git On Linux

https://linuxconcept.com/install-git-on-rhel-7-operating-system/

https://phoenixnap.com/kb/how-to-install-rpm-file-centos-linux

http://opensource.wandisco.com/rhel/7/git/x86_64/

https://stackoverflow.com/questions/21820715/how-to-install-latest-version-of-git-on-centos-7-x-6-x

https://superuser.com/questions/1190269/relationship-between-yum-repo-and-rpm

https://blog.thewatertower.org/2019/04/24/modifying-systemd-unit-files/

https://www.tecmint.com/list-all-running-services-under-systemd-in-linux/

https://superuser.com/questions/513159/how-to-remove-systemd-services

https://www.digitalocean.com/community/tutorials/how-to-use-systemctl-to-manage-systemd-services-and-units

https://askubuntu.com/questions/795226/how-to-list-all-enabled-services-from-systemctl

https://access.redhat.com/documentation/en-us/red_hat_jboss_enterprise_application_platform/7.0/html/installation_guide/configuring_jboss_eap_to_run_as_a_service

https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/v2-linux?view=azure-devops

https://devblogs.microsoft.com/premier-developer/azure-devops-setting-up-repository-permissions/

https://stackoverflow.com/questions/2853803/how-to-echo-shell-commands-as-they-are-executed

-----------------------------------------------------------------------------------------------------------

https://linuxconcept.com/install-git-on-rhel-7-operating-system/


git --version                      # check the currently installed version
sudo yum remove git                # remove the stock (older) git package

# The repo definition must live in /etc/yum.repos.d/ for yum to pick it up:
sudo touch /etc/yum.repos.d/wandisco-git.repo
sudo vi /etc/yum.repos.d/wandisco-git.repo

[wandisco-git]
name=WANdisco Git Repository
baseurl=http://opensource.wandisco.com/rhel/7/git/x86_64/
enabled=1
gpgcheck=1
gpgkey=http://opensource.wandisco.com/RPM-GPG-KEY-WANdisco

# Import the repository's GPG key, then install the newer git build:
sudo rpm --import http://opensource.wandisco.com/RPM-GPG-KEY-WANdisco
sudo yum install git


Scheduled Trigger : CICD : Azure DevOps

https://stackoverflow.com/questions/52429366/azure-devops-is-it-possible-to-schedule-a-release-for-a-specific-day-and-time/58980088#58980088



Edit your release and click the "Schedule set" icon under Artifacts. You can enable the schedule and use "Add a new time" for repeated execution.
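For YAML pipelines, the equivalent is a schedules block with a cron expression (times are in UTC); a sketch with an illustrative branch name:

schedules:
- cron: "0 3 * * 1-5"          # 03:00 UTC on weekdays
  displayName: Nightly scheduled run
  branches:
    include:
    - main
  always: true                 # run even when nothing has changed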

Thursday, July 1, 2021

AWS: Instance Types : Instance Families : Instance Sizes : Instance Pricing : EC2

https://aws.amazon.com/ec2/instance-types/

https://aws.amazon.com/ec2/instance-explorer

Instance type                    | c5.4xlarge       | c5.2xlarge       | c5.xlarge        | c5.large
Instance size                    | 4xlarge          | 2xlarge          | xlarge           | large
Hypervisor                       | nitro            | nitro            | nitro            | nitro
vCPUs                            | 16               | 8                | 4                | 2
Architecture                     | x86_64           | x86_64           | x86_64           | x86_64
Cores                            | 8                | 4                | 2                | 1
Threads per core                 | 2                | 2                | 2                | 2
Sustained clock speed (GHz)      | 3.4              | 3.4              | 3.4              | 3.4
Memory (GiB)                     | 32               | 16               | 8                | 4
Network performance              | Up to 10 Gigabit | Up to 10 Gigabit | Up to 10 Gigabit | Up to 10 Gigabit
Max network interfaces           | 8                | 4                | 4                | 3
IPv4 addresses per interface     | 30               | 15               | 15               | 10
IPv6 addresses per interface     | 30               | 15               | 15               | 10
On-Demand Linux pricing (USD/hr) | 0.68             | 0.34             | 0.17             | 0.085


Analytics

  • Elasticsearch Service  [Amazon Elasticsearch Service]
  • MSK  [Amazon Managed Streaming for Apache Kafka]


https://aws.amazon.com/ec2/instance-types/

Network performance and clock speed may stay the same within an instance family.

Instance Family (determines processor speed / network performance):

  • General Purpose: a1, t2, t3
  • Compute Optimized: c4, c5
  • Memory Optimized / RAM
  • Accelerated Computing / HW Accelerator
  • Storage Optimized / EBS

Instance Size (determines vCPUs, cores, memory/RAM): nano, micro, small, medium, large, xlarge, 2xlarge, 4xlarge



30.5 Days


In the c5 family, large has 4 GiB RAM, 1 core, and 2 vCPUs; similarly, xlarge has 8 GiB RAM, 2 cores, and 4 vCPUs.



Burst is related to EC2 Performance

Amazon EC2 allows you to choose between Fixed Performance Instances (e.g. M5, C5, and R5) and Burstable Performance Instances (e.g. T3). Burstable Performance Instances provide a baseline level of CPU performance with the ability to burst above the baseline.


A newer instance generation is generally cheaper than an older/previous generation if the other specifications stay the same.






Azure - Pipeline - Add Approver for Stage

https://learn.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass