Google Cloud Logging – Stackdriver


Google Cloud Logging

  • Answers the questions “Who did what, where, and when?” within GCP projects
  • Maintains tamper-proof audit logs for each project and organization
  • Logs buckets are a regional resource, which means the infrastructure that stores, indexes, and searches the logs is located in a specific geographical location. Google manages that infrastructure so that the applications are available redundantly across the zones within that region.

Cloud Logging Buckets

  • For each Google Cloud project, Logging automatically creates two logs buckets: _Required and _Default.
    • _Required bucket
      • holds Admin Activity audit logs, System Event audit logs, and Access Transparency logs
      • retains them for 400 days
      • no charge for the logs stored in _Required
      • the retention period of the logs stored here cannot be modified
    • _Default bucket
      • holds all other ingested logs in a Google Cloud project except for the logs held in the _Required bucket
      • logs stored here are chargeable
      • logs are retained for 30 days, by default, and the retention can be customized from 1 to 3650 days (a sketch of updating the retention follows this list)
    • neither bucket can be deleted
  • All logs generated in the project are stored in the _Required and _Default logs buckets, which live in the project that the logs are generated in
  • Logs buckets have only regional availability, including those created in the global region.
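
The _Default bucket’s retention can be changed with gcloud or programmatically. Below is a minimal sketch using the google-cloud-logging Python library’s generated config client; the project ID and the 365-day value are placeholder assumptions, not values from this article.

    # Sketch: customize the _Default bucket's retention (1..3650 days).
    # "my-project" is a hypothetical project ID; 365 is an assumed target.
    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
    from google.cloud.logging_v2.types import LogBucket, UpdateBucketRequest
    from google.protobuf import field_mask_pb2

    client = ConfigServiceV2Client()
    request = UpdateBucketRequest(
        name="projects/my-project/locations/global/buckets/_Default",
        bucket=LogBucket(retention_days=365),
        update_mask=field_mask_pb2.FieldMask(paths=["retention_days"]),
    )
    updated = client.update_bucket(request=request)
    print("New retention (days):", updated.retention_days)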

Cloud Logging Types

Google Cloud Platform Logs

  • Google Cloud platform logs are service-specific logs that can help debug and troubleshoot issues, as well as better understand the Google Cloud services used
  • The logs visible vary depending on which Google Cloud resources are used in the Google Cloud project or organization (a sketch of listing a project’s log names follows).
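
One way to see which platform logs a project actually has is to list its log names. A minimal sketch with the google-cloud-logging library’s generated client; the project ID is a placeholder.

    # Sketch: list the log names that have entries in a project.
    # "my-project" is a hypothetical project ID.
    from google.cloud.logging_v2.services.logging_service_v2 import LoggingServiceV2Client

    client = LoggingServiceV2Client()
    for log_name in client.list_logs(parent="projects/my-project"):
        print(log_name)  # e.g. projects/my-project/logs/cloudaudit.googleapis.com%2Factivity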

Security Logs

  • Audit Logs
    • Cloud Audit Logs includes three types of audit logs: Admin Activity, Data Access, and System Event.
    • Cloud Audit Logs provide audit trails of administrative changes and data access on Google Cloud resources (a sketch of reading audit log entries follows this list).
      • Admin Activity audit logs
        • enabled by default
        • no additional charge
        • record administrative actions and API calls that modify resource configuration or metadata
        • have 400-day retention
      • System Event audit logs
        • enabled by default
        • no additional charge
        • record Google Cloud system events, e.g. a GCE live migration
        • have 400-day retention
      • Data Access audit logs
        • record API calls that create, modify, or read user-provided data
        • 30-day retention
        • disabled by default, as the size can be huge
        • charged beyond free limits
        • recorded only for resources that require signing in to Google Cloud; publicly shared resources are not covered
  • Access Transparency Logs
    • provide logs of actions taken by Google staff when accessing your Google Cloud content
    • can help track compliance with the organization’s legal and regulatory requirements
    • have 400-day retention
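
A minimal sketch of reading recent Admin Activity audit entries with the google-cloud-logging library; the project ID and the timestamp in the filter are placeholder assumptions.

    # Sketch: read recent Admin Activity audit log entries.
    # "my-project" and the timestamp are hypothetical.
    from google.cloud import logging

    client = logging.Client(project="my-project")
    FILTER = (
        'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" '
        'AND timestamp>="2024-01-01T00:00:00Z"'
    )
    for entry in client.list_entries(filter_=FILTER, order_by=logging.DESCENDING,
                                     max_results=10):
        info = entry.payload  # audit entries carry a protoPayload (a dict here)
        print(entry.timestamp, info.get("methodName"),
              info.get("authenticationInfo", {}).get("principalEmail"))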

User Logs

  • User logs are generated by user software, services, or applications and written to Cloud Logging using a logging agent, the Cloud Logging API, or the Cloud Logging client libraries (a sketch using a client library follows this list)
  • Agent logs
    • produced by a logging agent installed on the VMs, which collects logs from user applications and the VM itself
    • covers log data from third-party applications
    • charged beyond free limits
    • 30-day retention
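
A minimal sketch of the client-library path using the google-cloud-logging Python library; the log name and payloads are illustrative, and credentials are assumed to come from the environment.

    # Sketch: write user logs via the Cloud Logging client library.
    # "my-app" is a hypothetical user-defined log name.
    from google.cloud import logging

    client = logging.Client()
    logger = client.logger("my-app")

    logger.log_text("Application started")                    # plain-text entry
    logger.log_struct({"event": "order_placed", "id": 1234},  # structured entry
                      severity="INFO")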

Cloud Logging Export

  • Log entries are stored in logs buckets for a specified length of time, i.e. the retention period, and are then deleted and cannot be recovered
  • Logs can be exported by configuring log sinks, which then continue to export log entries as they arrive in Logging. A sink includes a destination and a filter that selects the log entries to export (a sketch of creating a sink follows this list).
  • Exporting involves writing a filter that selects the log entries to be exported, and choosing a destination from the following options:
    • Cloud Storage: JSON files stored in buckets, for long-term retention
    • BigQuery: Tables created in BigQuery datasets, for analytics
    • Pub/Sub: JSON messages delivered to Pub/Sub topics, to stream to other resources. Supports third-party integrations, such as Splunk
    • Another Google Cloud project: Log entries held in Cloud Logging logs buckets.
  • Every time a log entry arrives in a project, folder, billing account, or organization resource, Logging compares the log entry to the sinks in that resource. Each sink whose filter matches the log entry writes a copy of the log entry to the sink’s export destination.
  • Exporting happens for new log entries only; it is not retrospective
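
A minimal sketch of creating a sink to Cloud Storage with the google-cloud-logging library; the sink name, bucket, and project are placeholder assumptions, and the destination bucket must separately grant the sink’s writer identity permission to create objects before entries flow.

    # Sketch: export audit log entries to a GCS bucket via a log sink.
    # "audit-logs-to-gcs", "my-project", and "my-audit-archive" are hypothetical.
    from google.cloud import logging

    client = logging.Client(project="my-project")
    sink = client.sink(
        "audit-logs-to-gcs",
        filter_='logName:"cloudaudit.googleapis.com"',           # match audit logs
        destination="storage.googleapis.com/my-audit-archive",   # GCS destination
    )
    sink.create()
    # Grant sink.writer_identity roles/storage.objectCreator on the bucket.
    print("writer identity:", sink.writer_identity)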

GCP Certification Exam Practice Questions

  • Questions are collected from the Internet and the answers are marked as per my knowledge and understanding (which might differ from yours).
  • GCP services are updated every day, and both the answers and questions might be outdated soon, so research accordingly.
  • GCP exam questions are not updated to keep pace with GCP updates, so even if the underlying feature has changed, the question might not be updated.
  • Open to further feedback, discussion and correction.
  1. Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention. What should you do?
    1. Create an export to the sink that saves logs from Cloud Audit to BigQuery.
    2. Create an export to the sink that saves logs from Cloud Audit to a Coldline Storage bucket.
    3. Write a custom script that uses logging API to copy the logs from Stackdriver logs to BigQuery.
    4. Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store logs to Cloud SQL.

