AWS Certified Solutions Architect – Professional (SAP-C01) Exam Learning Path
AWS Certified Solutions Architect – Professional (SAP-C01) exam is the upgraded pattern of the previous Solutions Architect – Professional exam, which was released last year (2018) and upgraded this year. I recently passed the latest pattern, and the difference between the previous pattern and the latest pattern is quite significant. The amount of overlap between the Associate and Professional exams, and even between the Solutions Architect and DevOps exams, has drastically reduced.
AWS Certified Solutions Architect – Professional (SAP-C01) exam basically validates
Design and deploy dynamically scalable, highly available, fault-tolerant, and reliable applications on AWS
Select appropriate AWS services to design and deploy an application based on given requirements
Migrate complex, multi-tier applications on AWS
Design and deploy enterprise-wide scalable operations on AWS
AWS Certified Solutions Architect – Professional (SAP-C01) Exam Summary
AWS Certified Solutions Architect – Professional (SAP-C01) exam runs for a total of 170 minutes with 75 questions. The questions and answer options are quite long and there is a lot of reading to be done, so be sure you are prepared and manage your time well. As always, mark the questions for review, move on, and come back to them after you are done with the rest.
One of the key tactics I followed when solving any question was to read the question, use paper and pencil to draw a rough architecture, and focus on the areas that need improvement. Trust me, you will be able to eliminate 2 answers for sure and then need to focus on only the other two. Read the remaining 2 answers to check where they differ; that would help you reach the right answer, or at least have a 50% chance of getting it right.
AWS Certified Solutions Architect – Professional (SAP-C01) focuses a lot on concepts and services related to Scalability, High Availability, Disaster Recovery, Migration, Security and Cost Control.
AWS Certified DevOps Engineer – Professional (DOP-C01) Exam Learning Path
AWS Certified DevOps Engineer – Professional (DOP-C01) exam is the upgraded pattern of the DevOps Engineer – Professional exam, which was released last year (2018). I recently attempted the latest pattern, and AWS has done quite a good job of improving it further, compared to the old one, to include more DevOps-related questions and services.
AWS Certified DevOps Engineer – Professional (DOP-C01) exam basically validates
Implement and manage continuous delivery systems and methodologies on AWS
Implement and automate security controls, governance processes, and compliance validation
Define and deploy monitoring, metrics, and logging systems on AWS
Implement systems that are highly available, scalable, and self-healing on the AWS platform
Design, manage, and maintain tools to automate operational processes
AWS Certified DevOps Engineer – Professional (DOP-C01) Exam Summary
AWS Certified DevOps Engineer – Professional exam runs for a total of 170 minutes with 75 questions (I had always assumed it to be 65), and I just managed to complete the exam with 20 minutes remaining. So be sure you are prepared and manage your time well. As always, mark the questions for review, move on, and come back to them after you are done with the rest.
One of the key tactics I followed when solving the DevOps Engineer questions was to read the question, use paper and pencil to draw a rough architecture, and focus on the areas that need improvement. Trust me, you will be able to eliminate 2 answers for sure and then need to focus on only the other two. Read the remaining 2 answers to check where they differ; that would help you reach the right answer, or at least have a 50% chance of getting it right.
AWS Certified DevOps Engineer – Professional exam covers a lot of concepts and services related to Automation, Deployments, Disaster Recovery, HA, Monitoring, Logging and Troubleshooting. It also covers security and compliance related topics.
Be sure to cover the following topics
Whitepapers are the key to understanding Deployments and DR
Google Cloud – Professional Data Engineer Certification Learning Path
After completing my Google Cloud – Professional Cloud Architect certification exam, I was looking into the Google Cloud – Professional Data Engineer exam, and luckily Google Cloud was running a pilot for its latest updated Professional Data Engineer certification exam. I applied for the free pilot and had a chance to appear for the exam. The pilot exam was 4 hours – 95 questions (as compared to 2 hrs – 50 questions). The results would be out in March 2019, but I can assure you the overall exam is quite exhaustive. Once again, the exam covers not only the gamut of services and concepts but also focuses on logical thinking and practical experience.
Quick summary of the exam
Wide range of Google Cloud data services and what they actually do. It includes Storage and a LOT of Data services
Not much on Compute and Network is covered
Questions sometimes test your logical thinking rather than any concept regarding Google Cloud.
Hands-on experience matters; if you have not worked on GCP before, make sure you do lots of labs, else you would be absolutely clueless about some of the questions and commands
Tests are updated for the latest enhancements.
The pilot exam does not cover the case studies. But given my Professional Cloud Architect exam experience, make sure you cover the case studies beforehand.
Be sure that NO online course or practice test is going to cover everything. I did Coursera and Linux Academy, which are really vast, but hands-on or practical knowledge is a MUST.
The list of topics is quite long, but the ones you need to be sure to cover are
Cloud IAM provides administrators the ability to manage cloud resources centrally by controlling who can take what action on specific resources.
Understand how IAM works and how rules apply, especially the hierarchy from Organization -> Folder -> Project -> Resources
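As a rough illustration of the hierarchy, here is a minimal gcloud sketch of granting roles at different levels (the project ID, folder ID, and members are hypothetical placeholders):
```
# Grant a role at the project level - inherited by all resources in the project
gcloud projects add-iam-policy-binding my-project \
    --member="user:alice@example.com" \
    --role="roles/storage.objectViewer"

# Grant a role at the folder level - inherited by every project under the folder
gcloud resource-manager folders add-iam-policy-binding 123456789 \
    --member="group:data-team@example.com" \
    --role="roles/viewer"
```
Permissions are inherited downwards, so a binding at the folder level applies to every project and resource beneath it.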
Cloud Dataprep to clean and prepare data. It can be used for anomaly detection.
It does not need any programming language knowledge and can be driven entirely through its graphical interface.
Be sure to know it or try it hands-on on a dataset.
Cloud Dataproc to handle existing Hadoop/Spark jobs.
You need to know how to improve the performance of the Hadoop cluster as well :). Know how to configure the cluster to use all the cores (hint – spark executor cores) and handle out-of-memory errors (hint – executor memory).
Know how to install other components (hint – initialization actions); see the sketch below.
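A minimal sketch of creating a Dataproc cluster with an initialization action and Spark tuning properties (the bucket, script, and values are hypothetical):
```
# Create a cluster, run a custom install script on startup,
# and tune Spark executor cores and memory
gcloud dataproc clusters create my-cluster \
    --num-workers 2 \
    --initialization-actions gs://my-bucket/install-component.sh \
    --properties spark:spark.executor.cores=2,spark:spark.executor.memory=4g
```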
Cloud Datalab is an interactive tool for exploration, transformation, analysis and visualization of your data on Google Cloud Platform.
It is based on Jupyter.
Cloud Composer is a fully managed workflow orchestration service based on Apache Airflow.
Pipelines are configured as directed acyclic graphs (DAGs).
It works even when a workflow lives on-premises, in multiple clouds, or fully within GCP.
It provides the ability to author, schedule, and monitor your workflows in a unified manner.
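For a feel of the workflow, a minimal sketch of creating a Composer environment and deploying a DAG file to it (the environment name, location, and DAG file are hypothetical):
```
# Create a Cloud Composer (managed Airflow) environment
gcloud composer environments create my-env --location us-central1

# Upload a DAG definition; Airflow picks it up automatically
gcloud composer environments storage dags import \
    --environment my-env --location us-central1 \
    --source my_pipeline_dag.py
```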
Google certainly expects the Data Engineer to know some of the Data Scientist's material as well.
Storage Transfer Service allows import of large amounts of online data into Google Cloud Storage, quickly and cost-effectively. Online data is the key here, as it supports AWS S3, HTTP/HTTPS and other GCS buckets. If the data is on-premises, you need to use the gsutil command (see the sketch after this list).
Transfer Appliance to transfer large amounts of data quickly and cost-effectively into Google Cloud Platform. Check the data size; it would always be compared against the Storage Transfer Service or gsutil commands.
BigQuery Data Transfer Service to integrate with third-party services and load data into BigQuery
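To make the on-premises vs online distinction concrete, here is a minimal gsutil sketch for pushing on-premises data to Cloud Storage (the paths and bucket are hypothetical):
```
# Copy a local (on-premises) directory to a GCS bucket in parallel
gsutil -m cp -r /data/exports gs://my-bucket/exports

# Or keep a local directory and a bucket in sync
gsutil -m rsync -r /data/exports gs://my-bucket/exports
```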
Google Cloud – Professional Cloud Architect Certification learning path
Re-certified !!!! Google Cloud – Professional Cloud Architect certification exam is one of the toughest exams I have appeared for. Even though it was a recertification, the preparation level was the same as the first time. The gamut of services and concepts it tests your knowledge on is really vast.
Google Cloud – Professional Cloud Architect Certification Summary
Has 50 questions to be answered in 2 hours.
Covers a wide range of Google Cloud services and what they actually do.
Includes Compute, Storage, Network and even Data services
Questions sometimes test your logical thinking rather than any concept regarding Google Cloud.
Hands-on is a MUST; if you have not worked on GCP before, make sure you do lots of labs, else you would be absolutely clueless about some of the questions and commands
Make sure you cover the case studies beforehand. I got ~15 questions (almost 5 per case study) and they can really be a savior for you in the exam.
Be sure that NO online course or practice test is going to cover everything. I did Linux Academy (a bit old now), which is really vast, but hands-on or practical knowledge is a MUST.
Google Cloud – Professional Cloud Architect Certification Resources
Google Kubernetes Engine, powered by the open source container scheduler Kubernetes, enables you to run containers on Google Cloud Platform.
Kubernetes Engine takes care of provisioning and maintaining the underlying virtual machine cluster, scaling your application, and operational logistics such as logging, monitoring, and cluster health management.
A node pool is a subset of machines that all have the same configuration, including machine type (CPU and memory) and authorization scopes. Node pools represent a subset of nodes within a cluster; a container cluster can contain one or more node pools. Hint: to add new machine types, you need to add a new node pool, as an existing one cannot be edited.
Be sure to create a Kubernetes cluster and configure it to host an application
Understand how to make the cluster auto-repairing and auto-upgrading. Hint – node auto-upgrade and auto-repair features
Very important to understand where to use gcloud commands (to create and manage the cluster) and where to use kubectl commands (to manage the cluster's workloads); see the sketch after this list
Very important to understand how to increase cluster size and enable autoscaling for the cluster
Know how to manage secrets like database passwords
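As a rough sketch of the gcloud (cluster lifecycle) vs kubectl (workload management) split mentioned above — all names, zones, and sizes here are hypothetical:
```
# gcloud manages the cluster itself: creation, autoscaling, auto-repair/upgrade
gcloud container clusters create my-cluster \
    --zone us-central1-a --num-nodes 3 \
    --enable-autoscaling --min-nodes 1 --max-nodes 5 \
    --enable-autorepair --enable-autoupgrade

# New machine types need a new node pool; an existing pool cannot be edited
gcloud container node-pools create high-mem-pool \
    --cluster my-cluster --zone us-central1-a \
    --machine-type n1-highmem-4

# kubectl manages what runs inside the cluster, e.g. secrets for DB passwords
kubectl create secret generic db-credentials \
    --from-literal=password=changeme
```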
Cloud Functions is a lightweight, event-based, asynchronous compute solution that allows you to create small, single-purpose functions that respond to cloud events without the need to manage a server or a runtime environment.
Remember that Cloud Functions is serverless and scales from zero to peak demand and back to zero as the demand changes.
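A minimal deployment sketch, assuming a Pub/Sub-triggered function (the function name, topic, and runtime are hypothetical; the function source is taken from the current directory):
```
# Deploy a function that runs on every message published to a topic;
# scaling up from zero and back down is handled by the platform
gcloud functions deploy process-event \
    --runtime python37 \
    --trigger-topic my-events
```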
Virtual Private Cloud
Understand Virtual Private Cloud (VPC), subnets, and how to host applications within them. Hint – a VPC is global and spans regions, while subnets are regional
Understand how firewall rules work and how they are configured. Hint – focus on network tags. Also, there are 2 implicit firewall rules – default ingress deny and default egress allow
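A minimal sketch of a tag-based rule (the network, tag, and ranges are hypothetical):
```
# Allow inbound HTTP/HTTPS only to instances carrying the 'web-server' tag
gcloud compute firewall-rules create allow-web \
    --network my-vpc \
    --direction INGRESS \
    --allow tcp:80,tcp:443 \
    --target-tags web-server \
    --source-ranges 0.0.0.0/0
```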
Cloud SQL newly added MS SQL Server support. Previously, for HA, it required setting up SQL Server on Compute Engine using Always On Availability Groups with Windows Failover Clustering, with nodes placed in different subnets.
Cloud Spanner is a fully managed, mission-critical relational database service.
It provides a scalable online transaction processing (OLTP) database with high availability and strong consistency at global scale.
It is globally distributed and can scale to handle more than 10TB.
It is not a direct replacement for an existing database and would need migration.
There are no direct options for Oracle yet.
Know Cloud Datastore and Bigtable
Cloud Datastore provides a document database for web and mobile applications. Datastore is not for analytics.
Understand Datastore indexes and how to update indexes for Datastore (see the sketch below).
It can be configured as multi-regional or regional.
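Index updates are driven by an index.yaml file; a minimal sketch (the kind and property names are hypothetical):
```
# Write a composite index definition for queries that filter on 'done'
# and sort by 'priority' descending
cat > index.yaml <<'EOF'
indexes:
- kind: Task
  properties:
  - name: done
  - name: priority
    direction: desc
EOF

# Create/update the indexes in Datastore
gcloud datastore indexes create index.yaml
```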
Bigtable provides a column database suitable for both low-latency single-point lookups and precalculated analytics.
Understand that Bigtable is not for long-term storage, as it is quite expensive.
BigQuery provides a scalable, fully managed enterprise data warehouse (EDW) with SQL and fast ad-hoc queries.
Remember it is most suitable for historical analysis.
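As an illustration of ad-hoc querying, a minimal bq CLI sketch (the project, dataset, and table are hypothetical):
```
# Run a standard-SQL ad-hoc aggregation over a table
bq query --use_legacy_sql=false \
  'SELECT status, COUNT(*) AS orders
   FROM `my-project.sales.orders`
   GROUP BY status'
```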
Memorystore and Firebase did not feature in any of the questions
Although there is a different certification for Data Engineer, the Cloud Architect exam does cover data services. Data services are also part of the use cases, so be sure to know about them
Know the Big Data stack and understand which service fits the different layers of ingest, store, process, analytics, use
The key services which mainly need to be covered are –
Cloud Storage as the medium to store data as data lake
Cloud Pub/Sub as the messaging service to capture real-time data, especially IoT.
Pub/Sub is designed to provide reliable, many-to-many, asynchronous messaging between applications, especially for real-time IoT data capture.
Cloud Storage can generate notifications through Object Change Notification
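A minimal sketch using the newer Pub/Sub notifications mechanism for bucket changes (the topic and bucket names are hypothetical):
```
# Publish a Pub/Sub message (JSON payload) whenever objects in the
# bucket are created, updated, or deleted
gsutil notification create -t my-topic -f json gs://my-bucket
```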
Cloud Dataflow to process, transform, and transfer data; it is the key service integrating storage and analytics.
Cloud BigQuery for storage and analytics. Remember BigQuery provides the same cost-effective option for storage as Cloud Storage
Cloud Dataprep to clean and prepare data. Hint – it can be used for anomaly detection.
Cloud Dataproc to handle existing Hadoop/Spark jobs. Hint – use it to replace existing Hadoop infrastructure.
Cloud Datalab is an interactive tool for exploration, transformation, analysis and visualization of your data on Google Cloud Platform
Know the standard pattern: Cloud Pub/Sub -> Dataflow -> BigQuery (see the sketch below)
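A minimal sketch of standing up this pattern with the Google-provided Dataflow template (the job, project, topic, and table names are hypothetical):
```
# Launch a streaming Dataflow job from the Google-provided
# Pub/Sub-to-BigQuery template
gcloud dataflow jobs run pubsub-to-bq \
    --gcs-location gs://dataflow-templates/latest/PubSub_to_BigQuery \
    --parameters \
inputTopic=projects/my-project/topics/my-topic,\
outputTableSpec=my-project:analytics.events
```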