Wednesday, July 7, 2021

Browser : Chrome: Mixed Content, Http/Https

https://developer.mozilla.org/en-US/docs/Web/Security/Mixed_content/How_to_fix_website_with_mixed_content

https://blog.chromium.org/2020/02/protecting-users-from-insecure.html


How to fix your website

The best strategy to avoid mixed content blocking is to serve all the content as HTTPS instead of HTTP.
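To find the offending references, a quick (hedged) approach is to grep the site sources for hard-coded http:// URLs; the site/ directory below is just a placeholder:

grep -rn --include='*.html' --include='*.css' --include='*.js' 'http://' site/

# Once the HTTPS versions are confirmed to exist, a blunt in-place rewrite (back up first):
grep -rl 'http://' site/ | xargs sed -i 's|http://|https://|g'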

AWS:CloudShell:CLI: aws ec2 describe-instances

https://thehftguy.com/2016/03/10/how-to-export-amazon-ec2-instances-to-a-csv-file/

https://gmusumeci.medium.com/how-to-export-aws-ec2-instances-in-multiple-aws-regions-and-multiple-aws-accounts-to-excel-csv-ce283af0ed90

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Filtering.html

https://docs.aws.amazon.com/cli/latest/reference/ec2/describe-instances.html

https://docs.aws.amazon.com/cloudshell/latest/userguide/working-with-cloudshell.html


AWS Cloud Shell 

aws ec2 describe-instances --filters "Name=tag:Environment,Values=QA"  --output json

aws ec2 describe-instances --filters "Name=tag:Environment,Values=QA" --output table  >  QA_EC2_Instances.txt   (note: --output table is a human-readable grid, not TSV)


Fields to extract: InstanceId, InstanceType, PrivateIpAddress


aws ec2 describe-instances \
  --filters "Name=tag:Environment,Values=QA" \
  --query 'Reservations[*].Instances[*].{InstanceId:InstanceId,InstanceType:InstanceType,PrivateIpAddress:PrivateIpAddress}' \
  --output json \
  > QA_EC2_Instances.json
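To turn that JSON into the CSV the export articles above describe, one option is to flatten it with jq (it is normally available in CloudShell); a rough sketch, assuming the file produced above:

jq -r '.[][] | [.InstanceId, .InstanceType, .PrivateIpAddress] | @csv' QA_EC2_Instances.json > QA_EC2_Instances.csv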



aws ec2 describe-instances --filters Name=instance-state-name,Values=running --query "Reservations[*].Instances[*].InstanceId" --output text
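The text output is convenient for scripting. A rough sketch that loops over the running instance IDs (the Audited tag is only an illustration):

for id in $(aws ec2 describe-instances \
    --filters Name=instance-state-name,Values=running \
    --query "Reservations[*].Instances[*].InstanceId" --output text); do
  # example per-instance action; replace with whatever is actually needed
  aws ec2 create-tags --resources "$id" --tags Key=Audited,Value=true
done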


aws iam list-access-keys --user-name  john_doe

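To sweep every IAM user instead of naming one, something like this loop should work (untested sketch):

for user in $(aws iam list-users --query 'Users[].UserName' --output text); do
  echo "== $user =="
  aws iam list-access-keys --user-name "$user" --output table
done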

AWS : Exam : Practice Exam : Mock Exam : Cloud Practitioner

https://geekflare.com/aws-practice-test/


https://digitalcloud.training/certification-training/

https://digitalcloud.training/certification-training/aws-certified-cloud-practitioner/

https://digitalcloud.training/courses/free-aws-certified-cloud-practitioner-practice-exam/


https://www.testpreptraining.com/


https://www.whizlabs.com/cart/ 

WELCOME


https://karanawsbucket.s3.us-east-1.amazonaws.com/Blogger_Images/products_whizcard-clf-c01-01-06.pdf

https://karanawsbucket.s3.us-east-1.amazonaws.com/Blogger_Images/products_csaa-whizcard-_-revised_14_06_2021.pdf

Azure : Devops : Pipeline :YAML

 You can organize pipeline jobs into stages. Stages are the major divisions in a pipeline: "build this app", "run these tests", and "deploy to pre-production" are good examples of stages. They are logical boundaries in your pipeline where you can pause the pipeline and perform various checks.


Pipeline > Stages > Stage > Jobs > Job > Steps > Step


jobs:
- job: A
  steps:
  - bash: echo "A"

- job: B
  steps:
  - bash: echo "B"

  

  

If you organize your pipeline into multiple stages, you use the stages keyword.

If you choose to specify a pool at the stage level, then all jobs defined in that stage will use that pool unless otherwise specified at the job level.

stages:
- stage: A
  jobs:
  - job: A1
  - job: A2

- stage: B
  jobs:
  - job: B1
  - job: B2

  

When you define multiple stages in a pipeline, by default they run sequentially in the order in which you define them in the YAML file. The exception to this is when you add dependencies: with dependencies, stages run in the order of the dependsOn requirements.

stages:
- stage: string
  dependsOn: string
  condition: string

  -----------------------------------------------------------

You can organize your pipeline into jobs. Every pipeline has at least one job. A job is a series of steps that run sequentially as a unit. In other words, a job is the smallest unit of work that can be scheduled to run.

In the simplest case, a pipeline has a single job. In that case, you do not have to explicitly use the job keyword.

  

jobs:
- job: myJob
  timeoutInMinutes: 10
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - bash: echo "Hello world"

  

  

JDK path on the hosted Ubuntu agent: /usr/lib/jvm/adoptopenjdk-11-hotspot-amd64

# update-alternatives --config java    # interactive; use --list in a pipeline step instead
update-alternatives --list java
ls /etc/alternatives
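If a build step needs Java 11 explicitly, a minimal script step could export JAVA_HOME to the path noted above (assuming that path exists on the chosen agent image):

export JAVA_HOME=/usr/lib/jvm/adoptopenjdk-11-hotspot-amd64
export PATH="$JAVA_HOME/bin:$PATH"
java -version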

  

  

https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops


https://faun.pub/reduce-your-build-time-using-caching-in-azure-pipelines-7a7bd0201cee




variables:
  MAVEN_CACHE_FOLDER: $(HOME)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'

- task: Cache@2
  inputs:
    key: '"funcs" | maven | "$(Agent.OS)" | **/pom.xml'
    restoreKeys: |
    path: $(MAVEN_CACHE_FOLDER)
  displayName: Cache Maven local repo

  

https://stackoverflow.com/questions/66161852/is-there-a-predefined-variable-for-home-vsts-in-azure-pipelines/66161853#66161853

  

  

Pool: Hosted Ubuntu 1604

Agent: Hosted Agent



Pool: Azure Pipelines

Image: ubuntu-16.04

Agent: Hosted Agent

Azure : Devops : Pipelines: Task ::: Cache Task

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops

https://docs.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops#maven


Maven

Maven has a local repository where it stores downloads and built artifacts. To enable, set the maven.repo.local option to a path under $(Pipeline.Workspace) and cache this folder.

Example:

YAML
variables:
  MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'

steps:
- task: Cache@2
  inputs:
    key: 'maven | "$(Agent.OS)" | **/pom.xml'
    restoreKeys: |
      maven | "$(Agent.OS)"
      maven
    path: $(MAVEN_CACHE_FOLDER)
  displayName: Cache Maven local repo

- script: mvn install -B -e

If you are using a Maven task, make sure to also pass the MAVEN_OPTS variable, because it gets overwritten otherwise.

Friday, July 2, 2021

Install Git On Linux

https://linuxconcept.com/install-git-on-rhel-7-operating-system/

https://phoenixnap.com/kb/how-to-install-rpm-file-centos-linux

http://opensource.wandisco.com/rhel/7/git/x86_64/

https://stackoverflow.com/questions/21820715/how-to-install-latest-version-of-git-on-centos-7-x-6-x

https://superuser.com/questions/1190269/relationship-between-yum-repo-and-rpm

https://blog.thewatertower.org/2019/04/24/modifying-systemd-unit-files/

https://www.tecmint.com/list-all-running-services-under-systemd-in-linux/

https://superuser.com/questions/513159/how-to-remove-systemd-services

https://www.digitalocean.com/community/tutorials/how-to-use-systemctl-to-manage-systemd-services-and-units

https://askubuntu.com/questions/795226/how-to-list-all-enabled-services-from-systemctl

https://access.redhat.com/documentation/en-us/red_hat_jboss_enterprise_application_platform/7.0/html/installation_guide/configuring_jboss_eap_to_run_as_a_service

https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/v2-linux?view=azure-devops

https://devblogs.microsoft.com/premier-developer/azure-devops-setting-up-repository-permissions/

https://stackoverflow.com/questions/2853803/how-to-echo-shell-commands-as-they-are-executed

-----------------------------------------------------------------------------------------------------------

https://linuxconcept.com/install-git-on-rhel-7-operating-system/


git --version

sudo yum remove git

sudo touch /etc/yum.repos.d/wandisco-git.repo

sudo vi /etc/yum.repos.d/wandisco-git.repo

[wandisco-git]
name=Wandisco GIT Repository
baseurl=http://opensource.wandisco.com/rhel/7/git/x86_64/
enabled=1
gpgcheck=1
gpgkey=http://opensource.wandisco.com/RPM-GPG-KEY-WANdisco


sudo rpm --import  http://opensource.wandisco.com/RPM-GPG-KEY-WANdisco


sudo yum install git
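Rough check that the newer Git really came from the WANdisco repo:

git --version
yum info git | grep -i -E 'version|from repo'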


Azure - Pipeline - Add Approver for Stage

https://learn.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass