Cloud Armor
- Cloud Armor protects the applications from multiple types of threats, including distributed denial-of-service (DDoS) attacks and application attacks like cross-site scripting (XSS) and SQL injection (SQLi).
- Cloud Armor provides protection only to applications running behind an external HTTP(S), TCP Proxy, or SSL Proxy load balancer.
- Cloud Armor supports applications deployed on Google Cloud, in a hybrid deployment, or in a multi-cloud architecture.
- Cloud Armor is implemented at the edge of Google’s network in Google’s points of presence (PoP).
- Security policies protect applications running behind a load balancer from DDoS and other web-based attacks
- A backend service can have only one security policy associated with it
- Prioritized rules define configurable match conditions, actions (allow or deny), and evaluation order within a security policy
- Cloud Armor provides a Preview mode that helps evaluate the effect of rules before enforcing them in production.
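The prioritized-rule model above can be sketched as follows. This is a conceptual illustration, not the Cloud Armor API: rules are evaluated in ascending priority order, the first match wins, and a catch-all default rule sits at the lowest precedence (priority 2147483647). The rule fields mirror the `gcloud` flags `--priority`, `--src-ip-ranges`, and `--action`, but the data shapes here are assumptions for the sketch.

```python
import ipaddress

# Hypothetical in-memory representation of a Cloud Armor security policy.
rules = [
    {"priority": 1000, "src_ip_ranges": ["203.0.113.0/24"], "action": "deny"},
    {"priority": 2000, "src_ip_ranges": ["198.51.100.0/24"], "action": "allow"},
    # Default rule: matches everything, lowest precedence (highest number).
    {"priority": 2147483647, "src_ip_ranges": ["0.0.0.0/0"], "action": "allow"},
]

def evaluate(client_ip: str) -> str:
    """Return the action of the first rule (by priority) matching client_ip."""
    addr = ipaddress.ip_address(client_ip)
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if any(addr in ipaddress.ip_network(cidr) for cidr in rule["src_ip_ranges"]):
            return rule["action"]
    return "allow"  # unreachable while a default rule is present

print(evaluate("203.0.113.7"))  # deny  (matches rule at priority 1000)
print(evaluate("192.0.2.1"))    # allow (falls through to the default rule)
```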
Cloud Identity-Aware Proxy
- Identity-Aware Proxy (IAP) allows managing access to HTTP-based apps both on Google Cloud and outside of Google Cloud.
- IAP intercepts the web requests sent to the application, authenticates the user making the request using the Google Identity Service, and only lets the requests through if they come from an authorized user. In addition, it can modify the request headers to include information about the authenticated user.
- IAP helps establish a central authorization layer for applications accessed over HTTPS, enabling an application-level access control model instead of relying on network-level firewalls.
- IAP uses Google identities and IAM, and can also leverage external identity providers such as OAuth (Facebook, GitHub, Microsoft), SAML, and others.
- Identity-Aware Proxy (IAP) can be configured to use JSON Web Tokens (JWT) as signed headers to make sure that a request to the app is authorized and doesn’t bypass IAP
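The signed-header check above can be sketched in stdlib Python. Note the heavy simplification: real IAP JWTs arrive in the `x-goog-iap-jwt-assertion` header and are signed with ES256, verified against Google's published public keys; here a shared HMAC secret stands in for that asymmetric signature so the claim checks (issuer, audience, expiry) can be shown without third-party libraries. The secret and audience values are hypothetical.

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-signing-key"  # hypothetical stand-in for Google's signing key

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict) -> str:
    """Produce an HS256 JWT (illustration only; IAP itself uses ES256)."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps(claims).encode())
    sig = _b64(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify(token: str, expected_aud: str) -> dict:
    header, payload, sig = token.split(".")
    expected = _b64(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    # The checks IAP's documentation calls for: issuer, audience, expiry.
    if claims["iss"] != "https://cloud.google.com/iap":
        raise ValueError("bad issuer")
    if claims["aud"] != expected_aud:
        raise ValueError("bad audience")
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims

token = sign({"iss": "https://cloud.google.com/iap",
              "aud": "/projects/123/apps/my-app",
              "sub": "user:alice@example.com",
              "exp": time.time() + 600})
claims = verify(token, "/projects/123/apps/my-app")
```

Checking the signed header inside the app (not just at the proxy) is what prevents a request that somehow bypasses IAP from being trusted.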
Cloud Data Loss Prevention – DLP
- Cloud Data Loss Prevention – DLP is a fully managed service designed to help discover, classify, and protect sensitive data.
- provides two key features
- Classification is the process of inspecting the data to know what data exists, how sensitive it is, and the likelihood of a match.
- De-identification is the process of removing, masking, or replacing sensitive information from data.
- uses information types – or infoTypes – to define what it scans for, like credit card numbers, email addresses, etc.
- provides various built-in infoType detectors and supports custom ones
- supports inspection rules to fine-tune scan results using
- Exclusion rules decrease the number of findings
- Hotword rules increase the quantity or change the likelihood value of findings
- provides likelihood, which indicates how likely it is that a piece of data matches a given infoType like VERY_LIKELY or POSSIBLE, etc.
- supports Text Classification and Redaction
- supports Image Classification and Redaction, where the image is handled using its base64 encoded version
- supports storage classification with scans on data stored in Cloud Storage, Datastore, and BigQuery
- supports scanning of binary, text, image, Microsoft Word, PDF, and Apache Avro files
- supports templates, which help decouple configuration information from the implementation of requests and manage large-scale rollouts
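The classification and de-identification flow above can be sketched with stdlib regexes. This is a conceptual illustration only: the real service uses managed infoType detectors and transformation configs (e.g. character masking), not these hypothetical patterns, and its detectors are far more accurate than a regex.

```python
import re

# Hypothetical stand-ins for two built-in infoType detectors.
INFO_TYPES = {
    "EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CREDIT_CARD_NUMBER": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def inspect(text: str) -> list:
    """Return (infoType, match) findings, like a DLP inspect request."""
    return [(name, m.group()) for name, rx in INFO_TYPES.items()
            for m in rx.finditer(text)]

def deidentify(text: str, mask: str = "#") -> str:
    """Mask every finding character-for-character, like DLP character masking."""
    for rx in INFO_TYPES.values():
        text = rx.sub(lambda m: mask * len(m.group()), text)
    return text

sample = "Contact alice@example.com, card 4111 1111 1111 1111."
print(inspect(sample))
print(deidentify(sample))
```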
Security Command Center – SCC
- is a security and risk management platform for Google Cloud
- helps generate curated insights that provide a unique view of incoming threats and attacks to the assets, which include organization, projects, instances, and applications
- displays possible security risks, called findings, that are associated with each asset.
- provides services
- Security Health Analytics provides managed vulnerability assessment scanning that can automatically detect the highest severity vulnerabilities and misconfigurations across assets.
- Web Security Scanner custom scans provide granular information about application vulnerability findings like outdated libraries, XSS, etc.
- Cloud Data Loss Prevention discovers, classifies, and protects sensitive data
- Cloud Armor protects Google Cloud deployments against threats
- Anomaly Detection identifies security anomalies for the projects and VM instances, like potential leaked credentials and coin mining, etc.
- Container Threat Detection can detect the most common container runtime attacks
- integrates with Forseti Security, the open-source security toolkit, and third-party security information and event management (SIEM) applications
- Event Threat Detection monitors the organization’s Cloud Logging stream and consumes logs to detect Malware, Cryptomining, etc.
- Phishing Protection helps prevent users from accessing phishing sites by classifying malicious content that uses the brand and reporting the unsafe URLs to Google Safe Browsing
- provides Continuous Exports, which automatically manage the export of new findings to Pub/Sub.
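A Continuous Export delivered through a Pub/Sub push subscription can be consumed as sketched below. The push body wraps a base64-encoded JSON notification whose `finding` object carries fields such as `category`, `state`, and `resourceName`; the exact field set shown here is an assumption for illustration.

```python
import base64, json

def parse_notification(push_body: dict) -> dict:
    """Decode a Pub/Sub push payload and return the embedded SCC finding."""
    data = base64.b64decode(push_body["message"]["data"])
    return json.loads(data)["finding"]

# Hypothetical push payload, shaped like what a push endpoint would receive.
finding = {"category": "OPEN_FIREWALL", "state": "ACTIVE",
           "resourceName": "//compute.googleapis.com/projects/p/global/firewalls/fw"}
push_body = {"message": {"data": base64.b64encode(
    json.dumps({"finding": finding}).encode()).decode()}}

parsed = parse_notification(push_body)
print(parsed["category"])  # OPEN_FIREWALL
```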
DDoS Protection and Mitigation
- Distributed Denial of Service (DDoS) Protection and Mitigation is a shared responsibility between Google Cloud and the Customer
- DDoS attack is an attempt to render the service or application unavailable to the end-users using multiple sources
- DDoS Protection and Mitigation Best Practices
- Reduce the Attack Surface
- Isolate and secure the network using VPCs, subnets, firewall rules, tags, and IAM
- Google provides Anti-spoofing protection and Automatic isolation between virtual networks
- Isolate Internal Traffic
- Use private IPs and avoid using public IPs
- Use NAT Gateway and Bastion host
- Use Internal Load Balancer for internal traffic
- Enable Proxy-based Load Balancing
- HTTP(S) or SSL Proxy load balancers use Google Front End (GFE) infrastructure, which helps mitigate and absorb Layer 4 and below attacks
- Disperse traffic across multiple regions
- Scale to Absorb the Attack
- Use GFE for protection
- Use Anycast-based load balancing to provide a single anycast IP for the frontend
- Use Autoscaling to scale backend services as per the demand
- Protection using CDN Offloading
- A CDN acts as a proxy and can serve cached content, reducing the load on the origin servers
- Deploy Third-party DDoS Protection solutions
- App Engine Deployment
- A fully multi-tenant system with isolation
- Google Cloud Storage
- Use signed URLs to access Google Cloud Storage
- API Rate Limiting
- Define rate limiting based on the number of allowed requests
- API rate limits are applied on a per-project basis
- Resource Quotas
- Quotas help prevent unforeseen spikes in usage
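The per-project API rate limiting practice above is commonly implemented as a token bucket, sketched here with stdlib Python. The rates, capacities, and bucket-per-project layout are illustrative assumptions, not a Google Cloud API.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allow a request only if a token is available."""
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens proportionally to the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per project, mirroring per-project API rate limits.
buckets = {"project-a": TokenBucket(rate=5, capacity=10)}
results = [buckets["project-a"].allow() for _ in range(12)]
print(results.count(True))  # roughly the burst capacity, here 10
```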
Access Context Manager
- Access Context Manager allows organization administrators to define fine-grained, attribute-based access control for projects and resources
- helps prevent data exfiltration
- helps reduce the size of the privileged network and move to a model where endpoints do not carry ambient authority based on the network.
- helps define desired rules and policy but isn’t responsible for policy enforcement. The policy is configured and enforced across various points, such as VPC Service Controls.
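An access level combining attribute conditions can be sketched as below. The condition shape (IP subnetworks plus a corporate-device flag) is a simplified assumption for illustration; real access levels support more attributes (regions, OS policy, etc.) and are enforced elsewhere, e.g. by VPC Service Controls.

```python
import ipaddress

def satisfies(condition: dict, request: dict) -> bool:
    """Return True if the request's attributes meet the access-level condition."""
    ip_ok = any(ipaddress.ip_address(request["ip"]) in ipaddress.ip_network(cidr)
                for cidr in condition.get("ip_subnetworks", ["0.0.0.0/0"]))
    device_ok = (not condition.get("require_corp_device")
                 or request.get("corp_device", False))
    return ip_ok and device_ok

# Hypothetical access level: corporate network range AND a managed device.
corp_network_level = {"ip_subnetworks": ["10.0.0.0/8"], "require_corp_device": True}

print(satisfies(corp_network_level, {"ip": "10.1.2.3", "corp_device": True}))   # True
print(satisfies(corp_network_level, {"ip": "192.0.2.9", "corp_device": True}))  # False
```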
FIPS 140-2 Validated
- The NIST developed the Federal Information Processing Standard (FIPS) Publication 140-2 as a security standard that sets forth requirements for cryptographic modules, including hardware, software, and/or firmware, for U.S. federal agencies.
- FIPS 140-2 Validated certification was established to aid in the protection of digitally stored unclassified, yet sensitive, information.
- Google Cloud Platform uses a FIPS 140-2 validated encryption module called BoringCrypto in its production environment.
- Data in transit to the customer and between data centers, and data at rest are encrypted using FIPS 140-2 validated encryption.
- BoringCrypto module that achieved FIPS 140-2 validation is part of the BoringSSL library.
- BoringSSL library as a whole is not FIPS 140-2 validated
- In order to operate using only FIPS-validated implementations:
- Google’s Local SSD storage product is automatically encrypted with NIST approved ciphers, but Google’s current implementation for this product doesn’t have a FIPS 140-2 validation certificate. If you require FIPS-validated encryption on Local SSD storage, you must provide your own encryption with a FIPS-validated cryptographic module.
- Google automatically encrypts traffic between VMs that travels between Google data centers using NIST-approved encryption algorithms, but this implementation does not have a FIPS validation certificate. If you require this traffic to be encrypted with a FIPS-validated implementation, you must provide your own.
- Clients connecting to Google infrastructure with TLS clients must be configured to require use of secure FIPS-compliant algorithms; if the TLS client and GCP’s TLS services agree on an encryption method that is incompatible with FIPS, a non-validated encryption implementation will be used.
- Applications built and operated on GCP might include their own cryptographic implementations; in order for the data they process to be secured with a FIPS-validated cryptographic module, you must integrate such an implementation yourself.
- All Google Cloud regions and zones currently support FIPS 140-2 validated encryption.
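Requiring FIPS-compliant algorithms on the client side, as the TLS point above describes, can look like the following sketch using Python's `ssl` module. The cipher string is illustrative (ECDHE key exchange with AES-GCM, excluding ChaCha20 and CBC suites); actual FIPS compliance also depends on the underlying OpenSSL build using a validated cryptographic module, and TLS 1.3 cipher suites are configured separately and unaffected by `set_ciphers`.

```python
import ssl

# Restrict a TLS client to FIPS-compatible algorithm choices (illustrative).
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE+AESGCM")  # excludes ChaCha20 and CBC-mode suites for TLS 1.2

names = {c["name"] for c in ctx.get_ciphers()}
print(sorted(names))
```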