Week 8 GCP Notes

Google Cloud Computing Foundations Course - Week 8 Lecture Notes Summary

1. The Purpose of APIs (Lecture 37)


• What is an API?
• An API (Application Programming Interface) provides a clean interface for disparate
software resources to communicate using a universal structure, abstracting away
unnecessary complexity.
• APIs open up integration opportunities by providing a standardized communication channel between otherwise incompatible systems.
• REST APIs:
• REST (Representational State Transfer) is the most popular architectural style for APIs,
widely used due to its scalability and stateless nature.
• REST APIs use HTTP methods such as GET, POST, PUT, and DELETE to perform operations on resources (see the sketch after this list).
• REST APIs are well-suited for cloud applications and mobile devices due to their
stateless architecture.
• Security and Authentication:
• APIs often use OAuth for authentication and tokens for security, ensuring secure
interactions without storing session state.
• Challenges in Managing APIs:
• Key challenges include deciding on formats for API descriptions, handling
authentication, scaling the API infrastructure, and logging/monitoring API usage.
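
To make the REST operations and token-based security above concrete, here is a minimal Python sketch using the requests library. The base URL, resource paths, and access token are placeholders invented for illustration; only the HTTP methods and the bearer-token pattern come from the notes.

```python
import requests

BASE_URL = "https://api.example.com/v1"   # hypothetical API base URL
TOKEN = "ya29.EXAMPLE_ACCESS_TOKEN"       # placeholder OAuth 2.0 access token

# Token-based auth: the bearer token travels in a header on every request,
# so no session state needs to be stored on the server (REST's statelessness).
headers = {"Authorization": f"Bearer {TOKEN}"}

# GET: read a resource
resp = requests.get(f"{BASE_URL}/orders/42", headers=headers)
print(resp.status_code, resp.json())

# POST: create a new resource
requests.post(f"{BASE_URL}/orders", headers=headers,
              json={"item": "widget", "quantity": 3})

# PUT: replace an existing resource
requests.put(f"{BASE_URL}/orders/42", headers=headers,
             json={"item": "widget", "quantity": 5})

# DELETE: remove a resource
requests.delete(f"{BASE_URL}/orders/42", headers=headers)
```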

2. Cloud Endpoints: Distributed API Management System (Lecture 38)


• What is Cloud Endpoints?
• Cloud Endpoints enables developers to build, deploy, and manage APIs on Google Cloud backends, offering robust API management.
• It allows you to control API access, generate API keys, and validate API calls using JSON Web Tokens (JWT); see the sketch after this list.
• Integration with GCP Services:
• It integrates with Auth0 and Firebase for user authentication and can be deployed on App
Engine, Google Kubernetes Engine (GKE), or any Kubernetes environment using a
proxy container.
• Monitoring and Analytics:
• Key API metrics like latency, error rates, and usage patterns can be monitored using
Stackdriver Logging and Stackdriver Trace.
• BigQuery integration allows further analysis of API logs and performance data.
• OpenAPI and gRPC:
• Cloud Endpoints supports both the OpenAPI specification and gRPC for API description and implementation.
• It also supports service-to-service authentication and user authentication with Google
services.
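
Below is a minimal sketch of how a client might call an API fronted by Cloud Endpoints, passing an API key for access control and a JWT for user authentication. The host name, key, and token values are placeholders; the exact security configuration depends on the OpenAPI definition deployed with the service.

```python
import requests

# Hypothetical host of an API deployed behind Cloud Endpoints (e.g. on App
# Engine or GKE, with the Endpoints proxy container in front of the backend).
HOST = "https://my-api.endpoints.my-project.cloud.goog"

API_KEY = "AIza...EXAMPLE"          # placeholder API key from the Cloud Console
ID_TOKEN = "eyJhbGciOi...EXAMPLE"   # placeholder JWT (e.g. a Firebase or Auth0 ID token)

# Endpoints validates the API key (access control, quotas) and the JWT
# (user authentication) before the request reaches the backend.
resp = requests.get(
    f"{HOST}/v1/greeting",
    params={"key": API_KEY},                          # API key as a query parameter
    headers={"Authorization": f"Bearer {ID_TOKEN}"},  # JWT in the Authorization header
)
print(resp.status_code, resp.text)
```
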
3. Using Apigee Edge for API Management (Lecture 39)
• What is Apigee Edge?
• Apigee Edge is a comprehensive API management platform that creates a proxy layer in
front of backend services, offering advanced features like rate limiting, caching, security,
and analytics.
• Apigee Edge provides additional value with transformation, fault handling, and
analytics, helping manage and optimize APIs for other companies or legacy systems.
• API Gateway and Microservices:
• Apigee Edge can serve as an API gateway, abstracting clients from the internal
partitioning of an application into microservices.
• It is particularly useful in scenarios where legacy systems are gradually replaced by
microservices.
• Use Cases for Legacy Systems:
• Engineers can implement API facades for legacy applications, enabling modern API interactions without needing to update outdated protocols or interfaces (see the facade sketch after this list).
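
The facade idea can be sketched as a tiny gateway service. This is not Apigee Edge configuration itself (Apigee proxies are defined through its own console and proxy bundles); it is a hypothetical Flask stand-in that shows the pattern: expose a modern REST resource and translate calls into a legacy backend's interface.

```python
from flask import Flask, jsonify
import requests

app = Flask(__name__)

# Hypothetical legacy backend that only speaks an old-style query interface.
LEGACY_URL = "http://legacy.internal:8080/cgi-bin/customer_lookup"

@app.route("/v1/customers/<customer_id>")
def get_customer(customer_id):
    # The facade presents a modern REST resource to clients while hiding the
    # legacy protocol; this is the transformation role an API gateway such as
    # Apigee Edge plays in front of backend services.
    legacy_resp = requests.get(LEGACY_URL, params={"cust": customer_id})
    record = legacy_resp.text.split("|")  # assume a pipe-delimited legacy format
    return jsonify({"id": record[0], "name": record[1], "status": record[2]})

if __name__ == "__main__":
    app.run(port=8000)
```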

4. Managed Messaging Services (Lecture 40)


• Use Cases for Managed Messaging Systems:
• Messaging systems are critical in applications that need to ingest, process, and analyze
large volumes of data, such as IoT applications or user engagement data in games.
• Complex business processes involving multiple interacting applications can benefit from
messaging systems for handling background operations and analytics.
• Applications of Managed Messaging:
• Workload Distribution: Large queues of tasks can be distributed efficiently among workers (e.g., Compute Engine instances); see the Pub/Sub sketch after this list.
• Asynchronous Workflows: Messaging systems enable asynchronous order processing,
where workers process tasks based on topic subscriptions.
• Event Notifications: Notifications can be distributed to subscribed downstream services
when significant events, like user sign-ups, occur.
• Additional Use Cases:
• Cache Refresh: Invalidation events can be published to update distributed caches.
• Logging to Multiple Systems: Logs can be streamed to multiple systems for monitoring
or querying.
• Data Streaming: Real-time data streams, such as sensor data, can be continuously sent
to cloud servers for analysis.
• Reliability Improvement: Systems in different zones can subscribe to the same
message topics to recover from failures in a specific zone or region.
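
A minimal publish/subscribe sketch with the google-cloud-pubsub client library ties several of these use cases together (workload distribution, asynchronous workflows, event notifications). The project, topic, and subscription names are placeholders, and the topic and subscription are assumed to already exist.

```python
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

PROJECT = "my-project"          # placeholder project ID
TOPIC = "orders"                # placeholder topic (assumed to exist)
SUBSCRIPTION = "order-workers"  # placeholder subscription (assumed to exist)

# Publisher side: e.g. a web frontend emitting an event notification.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
future = publisher.publish(topic_path, b'{"order_id": 42}', origin="web")
print("published message id:", future.result())

# Subscriber side: e.g. a pool of workers processing tasks asynchronously.
# Workers sharing one subscription split the workload; a second subscription
# (say, for analytics) would independently receive every message as well.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def callback(message):
    print("received:", message.data)
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # block and process messages for 30 s
except TimeoutError:
    streaming_pull.cancel()
```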

Questions with Answers


1. Q: What is the main purpose of an API? A: An API provides a standardized interface for
different software systems to communicate while abstracting unnecessary details.
2. Q: What are the key operations in a REST API? A: REST APIs use HTTP methods like
GET, POST, PUT, and DELETE to perform operations on resources.
3. Q: Why are REST APIs well-suited for cloud applications? A: REST APIs are stateless,
meaning they don’t require session data to be stored, making them scalable and ideal for cloud
environments.
4. Q: What is the role of OAuth in APIs? A: OAuth is used to authenticate users and authorize
access to API resources securely, often via tokens.
5. Q: What challenges do organizations face when managing APIs? A: Common challenges
include scaling infrastructure, handling authentication, logging API calls, and monitoring
performance metrics.
6. Q: What is Cloud Endpoints, and what does it offer? A: Cloud Endpoints is a distributed
API management system that allows developers to build, deploy, and manage APIs on Google
Cloud backends, providing features like access control, monitoring, and security.
7. Q: How can developers authenticate API users in Cloud Endpoints? A: Developers can
authenticate users using Firebase, Auth0, or Google authentication, and validate API calls with
JSON Web Tokens.
8. Q: What performance metrics can be monitored in Cloud Endpoints? A: Critical metrics
like latency, error rates, and traffic volume can be monitored using Stackdriver Logging and
Stackdriver Trace.
9. Q: How does Apigee Edge help in managing APIs? A: Apigee Edge provides an API proxy
layer, offering features like security, rate limiting, caching, and analytics, allowing better
management of API services.
10. Q: What is an API gateway, and why is it important? A: An API gateway provides a layer of
abstraction between clients and backend services, allowing developers to partition applications
into microservices while maintaining a unified API interface.
11. Q: How does Apigee Edge assist in transitioning legacy systems to microservices? A:
Apigee Edge allows developers to create API facades or adapters, gradually transitioning legacy
systems to modern APIs without rewriting the entire application.
12. Q: What are managed messaging services, and why are they important? A: Managed
messaging services enable the ingestion, transformation, and analysis of large volumes of data,
facilitating real-time communication between multiple applications.
13. Q: How does a messaging system improve workload distribution? A: Messaging systems
distribute large queues of tasks among multiple workers, improving the efficiency and balance
of workloads.
14. Q: What is the role of messaging systems in event-driven architectures? A: Messaging
systems allow services to publish events, which can then be consumed by other services
subscribed to receive notifications about those events.
15. Q: How can messaging systems help in cache management? A: Messaging systems can
publish invalidation events that update cache entries, ensuring data consistency across
distributed systems.
16. Q: What is the benefit of logging to multiple systems via a messaging system? A: Logs can
be written to multiple systems simultaneously, enabling detailed monitoring, querying, and data
analysis across different platforms.
17. Q: How does Cloud Pub/Sub facilitate real-time data streaming? A: Cloud Pub/Sub allows
data from processes or IoT devices to be streamed to cloud servers for real-time processing and
analysis.
18. Q: What is the purpose of service-to-service authentication in Cloud Endpoints? A:
Service-to-service authentication ensures secure communication between microservices,
preventing unauthorized access.
19. Q: Why is scaling infrastructure a challenge when managing APIs? A: As the number of
API users grows, the infrastructure must handle increased traffic, requiring careful planning for
scalability and performance optimization.
20. Q: How does Cloud Endpoints support different API specifications? A: Cloud Endpoints
supports both OpenAPI and gRPC specifications, allowing developers to define APIs using
widely accepted formats.
21. Q: What is gRPC, and how does it differ from REST? A: gRPC is a high-performance,
open-source RPC framework that uses HTTP/2 and Protocol Buffers for communication, whereas
REST relies on standard HTTP methods and is more commonly used for web services.
22. Q: What are the benefits of using Stackdriver for monitoring APIs? A: Stackdriver
provides real-time insights into API performance, including metrics like latency, error rates, and
request volume, enabling quick issue detection.
23. Q: How does Apigee Edge enhance security for APIs? A: Apigee Edge provides security
features like token-based authentication, rate limiting, and encryption, ensuring secure API
interactions.
24. Q: What is the role of Cloud Pub/Sub in balancing workloads across multiple systems? A:
Cloud Pub/Sub distributes tasks to multiple systems or services, ensuring efficient workload
management and improved reliability.
25. Q: How does Apigee Edge handle fault tolerance in API management? A: Apigee Edge
provides fault-handling capabilities, allowing it to manage API errors and ensure continuous
operation during service interruptions.
26. Q: Why is token-based security preferred for APIs? A: Token-based security ensures that
user credentials are not transmitted with each request, improving security by using temporary
tokens for API authentication.
27. Q: How does asynchronous workflow processing benefit from messaging systems? A:
Messaging systems allow tasks to be processed asynchronously, ensuring that workflows are not
blocked by dependencies and can scale efficiently.
28. Q: How does Apigee Edge support API transformation? A: Apigee Edge can transform API
requests and responses, ensuring compatibility between different systems or legacy applications.
29. Q: What is the benefit of caching in API management? A: Caching reduces the load on
backend systems by storing frequently requested data closer to the API clients, improving
performance and response times.
30. Q: How does Apigee Edge handle analytics for API usage? A: Apigee Edge provides
detailed analytics on API usage, including metrics like request volume, performance trends, and
user activity, helping optimize API operations.
