anitsh / til

Today I Learned (TIL) - GitHub `Issues` used as a daily learning management system for taking notes and storing resource links.
https://anitshrestha.com.np
MIT License
75 stars 11 forks

Course - Google Cloud Computing Foundations: Infrastructure in Google Cloud #471

Open anitsh opened 3 years ago

anitsh commented 3 years ago


Course - Google Cloud Computing Foundations: Infrastructure in Google Cloud https://google.qwiklabs.com/course_templates/154 https://google.qwiklabs.com/course_sessions/116497

Quest - Perform Foundational Infrastructure Tasks in Google Cloud https://google.qwiklabs.com/quests/118

Lessons - 4,5,6

How are user identities created in Cloud IAM? Answer: User identities are created outside of Google Cloud using a Google-administered domain. Reason: Creating users and groups within Google Cloud is not possible.

Which of the following is not an encryption option for Google Cloud?

Scripted encryption keys is not an option with Google Cloud.

If a Cloud IAM policy gives you Owner permissions at the project level, your access to a resource in the project may be restricted by a more restrictive policy on that resource. False. The effective policy is the union of the parent policy and the resource policy, so a less restrictive parent policy overrides a more restrictive resource policy.


anitsh commented 3 years ago

Lesson 4 https://youtu.be/_LTnis5hy-Y

Lesson 5 What protocol is used by REST APIs? HTTP

Cloud Pub/Sub is a messaging service, not a message-processing service. You write your own applications to process the messages stored in Cloud Pub/Sub.

Which of the following API Management Systems can be used on legacy systems? Cloud Endpoints Cloud Gateway

Lesson 6 What is Identity-Aware Proxy? Identity-Aware Proxy (IAP) is a Google Cloud Platform service that intercepts web requests sent to your application, authenticates the user making the request using the Google Identity Service, and only lets the requests through if they come from a user you authorize. In addition, it can modify the request headers to include information about the authenticated user.
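As a rough sketch of what that looks like from the application side, here is a minimal parser for the identity header IAP adds to proxied requests. The header name comes from the IAP docs; a production app must also verify the signed JWT in `x-goog-iap-jwt-assertion`, which this sketch skips.

```javascript
// Extract the authenticated user's email from the header IAP sets on
// requests it lets through. The value looks like
// "accounts.google.com:user@example.com".
// NOTE: illustrative only; real code must verify x-goog-iap-jwt-assertion.
function authenticatedUser(headers) {
  const value = headers["x-goog-authenticated-user-email"];
  if (!value) return null; // request did not come through IAP
  return value.split(":").pop();
}

console.log(authenticatedUser({
  "x-goog-authenticated-user-email": "accounts.google.com:ana@example.com"
})); // ana@example.com
```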

Project-level roles

Google Cloud's Identity and Access Management (IAM) service lets you create and manage permissions for Google Cloud resources. Cloud IAM unifies access control for Google Cloud services into a single system and provides a consistent set of operations. In this hands-on lab you learn how to assign a role to a second user and remove assigned roles associated with Cloud IAM. More specifically, you sign in with 2 different sets of credentials to experience how granting and revoking permissions works from Google Cloud Project Owner and Viewer roles.

You should see Browser, Editor, Owner, and Viewer roles. These four are known as primitive roles in Google Cloud. Primitive roles set project-level permissions and, unless otherwise specified, they control access to and management of all Google Cloud services.

| Role Name | Permissions |
| --- | --- |
| roles/viewer | Permissions for read-only actions that do not affect state, such as viewing (but not modifying) existing resources or data. |
| roles/editor | All viewer permissions, plus permissions for actions that modify state, such as changing existing resources. |
| roles/owner | All editor permissions, plus permissions to manage roles and permissions for a project and all resources within the project, and to set up billing for a project. |
| roles/browser (beta) | Read access to browse the hierarchy for a project, including the folder, organization, and Cloud IAM policy. This role doesn't include permission to view resources in the project. |

Quiz

How are user identities created in Cloud IAM? User identities are created outside of Google Cloud using a Google-administered domain. Creating users and groups within Google Cloud is not possible.

Which of the following is not an encryption option for Google Cloud?

- Customer-managed encryption keys (CMEK)
- Google encryption by default
- Customer-supplied encryption keys (CSEK)
- Scripted encryption keys

Scripted encryption keys is not an option with Google Cloud.

If a Cloud IAM policy gives you Owner permissions at the project level, your access to a resource in the project may be restricted by a more restrictive policy on that resource. False. The effective policy is the union of the parent policy and the resource policy, so a less restrictive parent policy overrides a more restrictive resource policy.
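The union rule can be modeled in a few lines. This is a toy illustration of the quiz point, not the real IAM API; `effectiveRoles` is a made-up helper name.

```javascript
// Toy model: the effective policy is the union of bindings at the
// parent (project) level and on the resource itself.
function effectiveRoles(projectRoles, resourceRoles) {
  return new Set([...projectRoles, ...resourceRoles]);
}

// Owner at the project level is NOT restricted by a narrower
// resource-level policy: the union still contains roles/owner.
const roles = effectiveRoles(["roles/owner"], ["roles/viewer"]);
console.log(roles.has("roles/owner")); // true
```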

anitsh commented 3 years ago

Perform Foundational Infrastructure Tasks in Google Cloud

Challenge Scenario

You are just starting your junior cloud engineer role with Jooli Inc. So far you have been helping teams create and manage Google Cloud resources.

Your challenge

You are now asked to help a newly formed development team with some of their initial work on a new project around storing and organizing photographs, called memories. You have been asked to assist the memories team with the initial configuration of their application development environment; you receive a request to complete the following tasks:

Some Jooli Inc. standards you should follow:

```sh
# Not sure we need to create this project and set it
gcloud config set project memories

gcloud config set compute/zone us-east1-b
gcloud config set compute/region us-east1

# Probably not needed
gcloud pubsub subscriptions create --topic a-thumbnail a-thumbnail

gsutil mb gs://a-thumbnail-bucket
```

```javascript
/* globals exports, require */
//jshint strict: false
//jshint esversion: 6
"use strict";
const crc32 = require("fast-crc32c");
const gcs = require("@google-cloud/storage")();
const PubSub = require("@google-cloud/pubsub");
const imagemagick = require("imagemagick-stream");

exports.thumbnail = (event, context) => {
  const fileName = event.name;
  const bucketName = event.bucket;
  const size = "64x64";
  const bucket = gcs.bucket(bucketName);
  const topicName = "a-thumbnail";
  const pubsub = new PubSub();
  if (fileName.search("64x64_thumbnail") === -1) {
    // doesn't have a thumbnail, get the filename extension
    var filename_split = fileName.split('.');
    var filename_ext = filename_split[filename_split.length - 1];
    var filename_without_ext = fileName.substring(0, fileName.length - filename_ext.length);
    if (filename_ext.toLowerCase() == 'png' || filename_ext.toLowerCase() == 'jpg') {
      // only support png and jpg at this point
      console.log(`Processing Original: gs://${bucketName}/${fileName}`);
      const gcsObject = bucket.file(fileName);
      let newFilename = filename_without_ext + size + '_thumbnail.' + filename_ext;
      let gcsNewObject = bucket.file(newFilename);
      let srcStream = gcsObject.createReadStream();
      let dstStream = gcsNewObject.createWriteStream();
      let resize = imagemagick().resize(size).quality(90);
      srcStream.pipe(resize).pipe(dstStream);
      return new Promise((resolve, reject) => {
        dstStream
          .on("error", (err) => {
            console.log(`Error: ${err}`);
            reject(err);
          })
          .on("finish", () => {
            console.log(`Success: ${fileName} → ${newFilename}`);
            // set the content-type
            gcsNewObject.setMetadata({
              contentType: 'image/' + filename_ext.toLowerCase()
            }, function(err, apiResponse) {});
            pubsub
              .topic(topicName)
              .publisher()
              .publish(Buffer.from(newFilename))
              .then(messageId => {
                console.log(`Message ${messageId} published.`);
              })
              .catch(err => {
                console.error('ERROR:', err);
              });
          });
      });
    }
    else {
      console.log(`gs://${bucketName}/${fileName} is not an image I can handle`);
    }
  }
  else {
    console.log(`gs://${bucketName}/${fileName} already has a thumbnail`);
  }
};
```
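The naming logic inside the function can be isolated into a tiny standalone helper, which makes the transformation easier to see: `photo.jpg` becomes `photo.64x64_thumbnail.jpg`. `thumbnailName` is a made-up name for this sketch; the deployed function does this inline.

```javascript
// Standalone version of the thumbnail-naming logic used above:
// strip the extension (keeping the dot), append "<size>_thumbnail.<ext>".
function thumbnailName(fileName, size) {
  const parts = fileName.split(".");
  const ext = parts[parts.length - 1];
  const withoutExt = fileName.substring(0, fileName.length - ext.length);
  return withoutExt + size + "_thumbnail." + ext;
}

console.log(thumbnailName("photo.jpg", "64x64")); // photo.64x64_thumbnail.jpg
```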

```sh
gcloud functions deploy thumbnail \
  --runtime nodejs10 \
  --trigger-resource photographs \
  --trigger-event google.storage.object.finalize \
  --stage-bucket thumbnail-bucket
```

Review Incomplete Tasks in the Quest

This is the final task of the quest: https://google.qwiklabs.com/quests/118

So there are other tasks which were incomplete.

Cloud Storage

Cloud Storage allows world-wide storage and retrieval of any amount of data at any time. You can use Cloud Storage for a range of scenarios including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download. In this hands-on lab you will learn how to use the Cloud Console to create a storage bucket, then upload objects, create folders and subfolders, and make those objects publicly accessible.

Create a Bucket In the Cloud Console, go to Navigation menu > Storage > Browser. Click Create Bucket.

Bucket naming rules:

- Do not include sensitive information in the bucket name, because the bucket namespace is global and publicly visible.
- Bucket names must contain only lowercase letters, numbers, dashes (-), underscores (_), and dots (.). Names containing dots require verification.
- Bucket names must start and end with a number or letter.
- Bucket names must contain 3 to 63 characters. Names containing dots can contain up to 222 characters, but each dot-separated component can be no longer than 63 characters.
- Bucket names cannot be represented as an IP address in dotted-decimal notation (for example, 192.168.5.4).
- Bucket names cannot begin with the "goog" prefix.
- Bucket names cannot contain "google" or close misspellings of "google".
- Also, for DNS compliance and future compatibility, you should not use underscores (_) or have a period adjacent to another period or dash. For example, ".." or "-." or ".-" are not valid in DNS names.
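A quick way to internalize the rules above is to encode them as a validator. This sketch covers most of the listed rules but deliberately skips the 222-character dotted-name case and "close misspellings of google"; `isValidBucketName` is a made-up helper, not anything from the Cloud Storage client library.

```javascript
// Partial check of the bucket naming rules listed above (illustrative).
function isValidBucketName(name) {
  if (name.length < 3 || name.length > 63) return false;          // length
  if (!/^[a-z0-9][a-z0-9._-]*[a-z0-9]$/.test(name)) return false; // charset, start/end
  if (/^\d+\.\d+\.\d+\.\d+$/.test(name)) return false;            // dotted-decimal IP
  if (name.startsWith("goog") || name.includes("google")) return false;
  return true;
}

console.log(isValidBucketName("my-travel-bucket")); // true
console.log(isValidBucketName("192.168.5.4"));      // false
```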

Each bucket has a default storage class, which you can specify when you create your bucket. You can stop publicly sharing an object by removing the permission entry that has allUsers.

```sh
gsutil mb gs://YOUR-BUCKET-NAME/
gsutil cp ada.jpg gs://YOUR-BUCKET-NAME
gsutil cp -r gs://YOUR-BUCKET-NAME/ada.jpg .
gsutil cp gs://YOUR-BUCKET-NAME/ada.jpg gs://YOUR-BUCKET-NAME/image-folder/
gsutil ls gs://YOUR-BUCKET-NAME
gsutil ls -l gs://YOUR-BUCKET-NAME/ada.jpg
gsutil acl ch -u AllUsers:R gs://YOUR-BUCKET-NAME/ada.jpg
gsutil acl ch -d AllUsers gs://YOUR-BUCKET-NAME/ada.jpg
gsutil rm gs://YOUR-BUCKET-NAME/ada.jpg
gsutil rm -r gs://anit-bucket-1
```

```sh
gsutil -m cp -r gs://spls/gsp067/python-docs-samples .  # copy sample app
gsutil mb -p [PROJECT_ID] gs://[BUCKET_NAME]            # create a new Cloud Storage bucket for your Cloud Function
```

Cloud IAM


- https://google.qwiklabs.com/focuses/551 (Cloud IAM: Qwik Start)
- https://google.qwiklabs.com/focuses/1035 (IAM Custom Roles)
- https://google.qwiklabs.com/focuses/10599 (Cloud Monitoring: Qwik Start)

Navigation menu > IAM & Admin > IAM. Remove Project Viewer access.

`gcloud auth list` : lists the credentialed accounts in your Google Cloud project.

Cloud Monitoring

Cloud Monitoring provides visibility into the performance, uptime, and overall health of cloud-powered applications. Cloud Monitoring collects metrics, events, and metadata from Google Cloud, Amazon Web Services, hosted uptime probes, application instrumentation, and a variety of common application components including Cassandra, Nginx, Apache Web Server, Elasticsearch, and many others. Cloud Monitoring ingests that data and generates insights via dashboards, charts, and alerts. Cloud Monitoring alerting helps you collaborate by integrating with Slack, PagerDuty, HipChat, Campfire, and more.

Create a VM. gcloud compute ssh gcelab2 --zone us-central1-c : Connect to the instance

Monitor a Compute Engine virtual machine (VM) instance with Cloud Monitoring. You'll also install monitoring and logging agents for your VM, which collect more information from your instance, including metrics and logs from third-party apps.

Agents collect data and then send or stream info to Cloud Monitoring in the Cloud Console.

The Cloud Monitoring agent is a collectd-based daemon that gathers system and application metrics from virtual machine instances and sends them to Monitoring. By default, the Monitoring agent collects disk, CPU, network, and process metrics. Configuring the Monitoring agent allows third-party applications to get the full list of agent metrics.

It is best practice to run the Cloud Logging agent on all your VM instances.

```sh
curl -sSO https://dl.google.com/cloudagents/add-monitoring-agent-repo.sh
sudo bash add-monitoring-agent-repo.sh
sudo apt-get update
sudo apt-get install stackdriver-agent
```

```sh
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh
sudo apt-get update
sudo apt-get install google-fluentd
```

Create an uptime check Uptime checks verify that a resource is always accessible. For practice, create an uptime check to verify your VM is up.

Create an alerting policy Use Cloud Monitoring to create one or more alerting policies.

Create a dashboard and chart You can display the metrics collected by Cloud Monitoring in your own charts and dashboards. In this section you create the charts for the lab metrics and a custom dashboard.

Pub/Sub

The Pub/Sub basics As stated earlier, Google Cloud Pub/Sub is an asynchronous global messaging service. There are three terms in Pub/Sub that appear often: topics, publishing, and subscribing.

To sum it up, a producer publishes messages to a topic and a consumer creates a subscription to a topic to receive messages from it.
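That producer/topic/subscription relationship can be sketched as a toy in-memory model. This is not the real `@google-cloud/pubsub` API; the class and method names here are made up purely to illustrate the flow.

```javascript
// Toy in-memory model of Pub/Sub's three concepts:
// a topic, publishing to it, and subscriptions that receive copies.
class Topic {
  constructor(name) {
    this.name = name;
    this.subscriptions = [];
  }
  subscribe() {
    const sub = { queue: [] }; // each subscription gets its own queue
    this.subscriptions.push(sub);
    return sub;
  }
  publish(message) {
    // every subscription attached to the topic receives the message
    this.subscriptions.forEach((s) => s.queue.push(message));
  }
}

const topic = new Topic("myTopic");
const sub = topic.subscribe();
topic.publish("Hello");
console.log(sub.queue); // [ 'Hello' ]
```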

```sh
gcloud pubsub topics create myTopic
gcloud pubsub topics list
gcloud pubsub topics delete Test1
```

```sh
gcloud pubsub subscriptions create --topic myTopic mySubscription
gcloud pubsub topics list-subscriptions myTopic
gcloud pubsub subscriptions delete Test1
```

```sh
gcloud pubsub topics publish myTopic --message "Hello"

gcloud pubsub subscriptions pull mySubscription --auto-ack
```

You published 4 messages to your topic, but only 1 was output. Now is a good time to note a couple of features of the pull command that often trip developers up:

```sh
gcloud pubsub topics publish myTopic --message "Publisher is starting to get the hang of Pub/Sub"
gcloud pubsub topics publish myTopic --message "Publisher wonders if all messages will be pulled"
gcloud pubsub topics publish myTopic --message "Publisher will have to test to find out"

gcloud pubsub subscriptions pull mySubscription --auto-ack --limit=3
```

Add a flag to your command so you can output all three messages in one request. You may not have noticed, but you have actually been using a flag this entire time: the --auto-ack part of the pull command is a flag that automatically acknowledges each pulled message so it is not redelivered.

--limit is another flag that sets an upper bound on the number of messages to pull.

Cloud Functions

Cloud Functions is a serverless execution environment for building and connecting cloud services. With Cloud Functions you write simple, single-purpose functions that are attached to events emitted from your cloud infrastructure and services. Your Cloud Function is triggered when an event being watched is fired. Your code executes in a fully managed environment. There is no need to provision any infrastructure or worry about managing any servers.

Cloud Functions can be written in Node.js, Python, and Go, and are executed in language-specific runtimes. You can also take your Cloud Function and run it in any standard Node.js runtime, which makes both portability and local testing a breeze.

Cloud Functions provides a connective layer of logic that lets you write code to connect and extend cloud services. Listen and respond to a file upload to Cloud Storage, a log change, or an incoming message on a Cloud Pub/Sub topic. Cloud Functions augments existing cloud services and allows you to address an increasing number of use cases with arbitrary programming logic. Cloud Functions have access to the Google Service Account credential and are thus seamlessly authenticated with the majority of Google Cloud services such as Datastore, Cloud Spanner, Cloud Translation API, Cloud Vision API, as well as many others. In addition, Cloud Functions are supported by numerous Node.js client libraries, which further simplify these integrations.

Cloud events are things that happen in your cloud environment. These might be things like changes to data in a database, files added to a storage system, or a new virtual machine instance being created.

Events occur whether or not you choose to respond to them. You create a response to an event with a trigger. A trigger is a declaration that you are interested in a certain event or set of events. Binding a function to a trigger allows you to capture and act on events. For more information on creating triggers and associating them with your functions, see Events and Triggers.

Cloud Functions removes the work of managing servers, configuring software, updating frameworks, and patching operating systems. The software and infrastructure are fully managed by Google so that you just add code. Furthermore, provisioning of resources happens automatically in response to events. This means that a function can scale from a few invocations a day to many millions of invocations without any work from you.

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on the Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

https://cloud.google.com/sdk/gcloud/reference/functions/event-types/list

```sh
mkdir gcf_hello_world
cd gcf_hello_world
nano index.js
```

```javascript
/**
 * Background Cloud Function to be triggered by Pub/Sub.
 * This function is exported by index.js, and executed when
 * the trigger topic receives a message.
 *
 * @param {object} data The event payload.
 * @param {object} context The event metadata.
 */
exports.helloWorld = (data, context) => {
  const pubSubMessage = data;
  const name = pubSubMessage.data
    ? Buffer.from(pubSubMessage.data, 'base64').toString()
    : "Hello World";

  console.log(`My Cloud Function: ${name}`);
};
```
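The decoding step is worth trying locally: Pub/Sub delivers the message body base64-encoded in the event's `data` field, and the handler decodes it. A hypothetical local simulation of that step, with a hand-built event object:

```javascript
// Simulate the payload a Pub/Sub-triggered function receives:
// the message body arrives base64-encoded in event.data.
const event = { data: Buffer.from("World").toString("base64") };

const name = event.data
  ? Buffer.from(event.data, "base64").toString()
  : "Hello World";

console.log(`My Cloud Function: ${name}`); // My Cloud Function: World
```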

gsutil mb -p [PROJECT_ID] gs://[BUCKET_NAME]

When deploying a new function, you must specify --trigger-topic, --trigger-bucket, or --trigger-http. When deploying an update to an existing function, the function keeps the existing trigger unless otherwise specified.

For this lab, you'll set the --trigger-topic as hello_world.

```sh
gcloud functions deploy thumbnail --stage-bucket labb1 --trigger-topic labb1 --runtime nodejs10 --region us-east1
```

--stage-bucket=STAGE_BUCKET When deploying a function from a local directory, this flag's value is the name of the Google Cloud Storage bucket in which source code will be stored. Note that if you set the --stage-bucket flag when deploying a function, you will need to specify --source or --stage-bucket in subsequent deployments to update your source code.

```sh
gcloud functions describe thumbnail
gcloud functions logs read thumbnail
```

- https://cloud.google.com/sdk/gcloud/reference/functions/deploy#--stage-bucket
- https://cloud.google.com/functions/docs/calling/storage
- https://cloud.google.com/sdk/gcloud/reference/functions/deploy#--trigger-bucket