I mentioned in my last post that I’ve recently begun studying for some of the Salesforce architect exams, and last week I had the chance to take the Data Architecture and Management Designer certification exam and passed!
I sort of took this exam on a whim last Thursday. I was reviewing the exam guide and felt pretty confident in a number of the sections, even with limited study, so I booked the exam for that night. That’s one of the great things about these online proctored tests: you can schedule your test for 15 minutes from now and be on your way with no problem. Also, at the time of this writing (due to Covid), Salesforce is allowing you to use internal webcams during the exam (normally you have to use an external webcam, which I know a lot of people don’t have). This makes it super convenient, and I hope Salesforce keeps this policy post-Covid.
One other thing to keep in mind about this particular exam is that you only need 58% to pass. A lot of the other exams require 65% to pass, so you’ve got a 7% buffer here to give you a bit more confidence. I’m certainly not an advocate of learning just enough to scrape by on the exam, as I don’t think it will translate well for you in the real world, but I’ll take any buffer I can get 🙂
Before I get into the details of this certification, I’ll just say that although I did pass the exam, there were a few core topics that came up multiple times during the test that I just was not confident in. Specifically, I felt like there was a lot of emphasis on making sure you understood what limits exist for things like big objects, external objects, etc. So understanding a concept and when it should be implemented wasn’t necessarily enough – you really needed to know the nitty-gritty details.
Let’s get into the exam sections.
Data modeling/Database Design: 25%
- Compare and contrast various techniques and considerations for designing a data model for the Customer 360 platform. (e.g. objects, fields & relationships, object features).
- Given a scenario, recommend approaches and techniques to design a scalable data model that obeys the current security and sharing model.
- Compare and contrast various techniques, approaches and considerations for capturing and managing business and technical metadata (e.g. business dictionary, data lineage, taxonomy, data classification).
- Compare and contrast the different reasons for implementing Big Objects vs Standard/Custom objects within a production instance, alongside the unique pros and cons of utilizing Big Objects in a Salesforce data model.
- Given a customer scenario, recommend approaches and techniques to avoid data skew (record locking, sharing calculation issues, and excessive child to parent relationships).
I think if you’ve spent a good amount of time as an admin or developer, a lot of this section will come naturally to you. If you haven’t, getting a solid understanding of CRM / Salesforce fundamentals is super important (field types, relationships, best practices, etc.). The two callouts I’ll make are around Big Objects and deferred sharing calculations. The Trailhead for this exam covers Big Objects fairly well, but as I mentioned above, don’t skim over the limits sections. Read through this tip sheet for deferring sharing calculations, which covers the basics.
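To make the data-skew idea concrete: one well-known Salesforce guideline is to avoid having more than roughly 10,000 child records under a single parent, since that single parent becomes a record-locking and sharing-recalculation hot spot. Here’s a minimal, hypothetical sketch (the sample IDs and function name are mine, not any Salesforce API) of how you might flag skewed parents from an export of child-to-parent IDs:

```python
from collections import Counter

# Commonly cited Salesforce guideline: keep child record counts per parent
# under roughly 10,000 to avoid ownership/lookup data skew.
SKEW_THRESHOLD = 10_000

def find_skewed_parents(child_parent_ids, threshold=SKEW_THRESHOLD):
    """Return {parent_id: child_count} for parents exceeding the threshold."""
    counts = Counter(child_parent_ids)
    return {parent: n for parent, n in counts.items() if n > threshold}

# Example: a single "bucket" account owning far too many child records.
children = ["001A"] * 12_000 + ["001B"] * 500
print(find_skewed_parents(children))  # {'001A': 12000}
```

In practice you’d run a count aggregate (e.g. a SOQL `COUNT(Id) ... GROUP BY AccountId`) rather than pulling raw IDs, but the check is the same idea.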
Master Data Management: 5%
- Compare and contrast the various techniques, approaches and considerations for implementing Master Data Management Solutions (e.g. MDM implementation styles, harmonizing & consolidating data from multiple sources, establishing data survivorship rules, thresholds & weights, leveraging external reference data for enrichment, Canonical modeling techniques, hierarchy management.)
- Given a customer scenario, recommend and use techniques for establishing a “golden record” or “system of truth” for the customer domain in a Single Org
- Given a customer scenario, recommend approaches and techniques for consolidating data attributes from multiple sources. Discuss criteria and methodology for picking the winning attributes.
- Given a customer scenario, recommend appropriate approaches and techniques to capture and maintain customer reference & metadata to preserve traceability and establish a common context for business rules
This section is only 5% of the exam, so I wouldn’t spend too much effort here. If you are familiar with MDM concepts, you are going to be just fine. If you aren’t, or don’t know what MDM is, go through the Trailhead, as it gives you a good enough understanding for this exam. Also, unrelated to the exam, I’ve worked with a number of customers who feel like MDM is a thing of the past and isn’t relevant anymore. That topic probably deserves a post of its own, but I’ll just say that in my opinion, MDM still has a place and a role, although it may look slightly different than it has in the past. And being familiar with the concepts will help you have conversations with folks who still use, or once used, the approach.
Salesforce Data Management: 25%
- Given a customer scenario, recommend appropriate combination of Salesforce license types to effectively leverage standard and custom objects to meet business needs.
- Given a customer scenario, recommend techniques to ensure data is persisted in a consistent manner.
- Given a scenario with multiple systems of interaction, describe techniques to represent a single view of the customer on the Salesforce platform.
- Given a customer scenario, recommend a design to effectively consolidate and/or leverage data from multiple Salesforce instances.
This Trailhead gives a good intro to the Customer 360 concepts. You should also be familiar with multi-org strategies, such as the hub-and-spoke model. Lastly, I suggest memorizing the high-level access each license type gives you. I was particularly blind to Lightning Platform Starter vs Lightning Platform Plus, so pick out the differences using a table like this.
Data Governance: 10%
- Given a customer scenario, recommend an approach for designing a GDPR compliant data model. Discuss the various options to identify, classify and protect personal and sensitive information.
- Compare and contrast various approaches and considerations for designing and implementing an enterprise data governance program.
This is another small section on the test, and I have to imagine that anybody actively working in Salesforce is at least familiar with GDPR concepts. I found that these types of questions were easy to reason through. If you aren’t familiar with GDPR, you can read up on it here.
Large Data Volume considerations: 20%
- Given a customer scenario, design a data model that scales considering large data volume and solution performance.
- Given a customer scenario, recommend a data archiving and purging plan that is optimal for customer’s data storage management needs.
- Given a customer scenario, decide when to use virtualised data and describe virtualised data options.
This section is covered pretty well in the main exam Trailhead. Understanding key concepts around data archiving will be helpful, such as knowing the differences between archiving, purging, deleting, etc. A lot of the limits I mentioned above will help drive you to the answers for these questions.
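One way I keep the archive/purge vocabulary straight is to think of it as a retention decision per record: archiving moves stale data out of the live org (to a Big Object or external store) while purging permanently removes data that is past retention. The thresholds and function below are entirely hypothetical, just to illustrate the distinction:

```python
from datetime import date, timedelta

# Hypothetical retention rules for illustration only -- real values come
# from your data governance / legal retention policies.
ARCHIVE_AFTER = timedelta(days=2 * 365)  # move out of the live org after ~2 years
PURGE_AFTER = timedelta(days=7 * 365)    # delete entirely after ~7 years

def disposition(last_modified: date, today: date) -> str:
    """Classify a record as keep, archive, or purge based on its age."""
    age = today - last_modified
    if age >= PURGE_AFTER:
        return "purge"    # past retention: remove from the archive too
    if age >= ARCHIVE_AFTER:
        return "archive"  # stale but still retained: move to archive storage
    return "keep"         # active data: leave it in the live org

today = date(2021, 1, 1)
print(disposition(date(2020, 6, 1), today))  # keep
print(disposition(date(2017, 1, 1), today))  # archive
print(disposition(date(2013, 1, 1), today))  # purge
```

The exam cares that you know these are different operations with different storage and recoverability implications, not the specific thresholds.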
Data Migration: 15%
- Given a customer scenario, recommend appropriate techniques and methods for ensuring high data quality at load time.
- Compare and contrast various techniques for improving performance when migrating large data volumes into Salesforce.
- Compare and contrast various techniques and considerations for exporting data from Salesforce.
Think Data Loader, data exports, Bulk API, parallel vs serial API usage, etc. Again, you are going to get asked about key differences in each of these strategies, so you’ve got to know where the weak spots are. Also, brush up on PK chunking if you haven’t used it before.
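The core idea behind PK chunking is simple: instead of one giant query, the Bulk API carves the table into independent ID ranges (100,000 records per chunk by default, enabled via the `Sforce-Enable-PKChunking` header) that can be extracted in parallel. Salesforce does this server-side on real record IDs; the numeric sketch below is just my simplified illustration of the chunking logic itself:

```python
# Simplified illustration of PK chunking: split a large keyspace into
# fixed-size ID ranges, each of which becomes its own extract batch
# (conceptually: WHERE Id >= start AND Id <= end). Real Salesforce IDs
# are base-62 strings, not integers -- this uses ints to show the idea.

def pk_chunks(min_id: int, max_id: int, chunk_size: int = 100_000):
    """Yield (start, end) boundaries covering [min_id, max_id] inclusive."""
    start = min_id
    while start <= max_id:
        end = min(start + chunk_size - 1, max_id)
        yield (start, end)
        start = end + 1

# 250,000 records -> three independent, parallelizable chunks.
for lo, hi in pk_chunks(1, 250_000):
    print(lo, hi)
```

Because each chunk filters on the indexed primary key, every batch stays fast even when the table holds hundreds of millions of rows, which is exactly the large-data-volume scenario the exam likes to probe.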
Let me know if you have any resources that were particularly helpful for you and best of luck with the exam!