
DATA INTEGRATION CHECKLIST
QA Item
Validated
Data source QA page:
Data source 1
Data source 2
Data source 3
Data source 4
Date ranges, data granularity, and breakouts are as expected when comparing Datorama reports against the Data Provider's reports.
Are all accounts from the discovery workbook integrated?
Are all profiles correct?
Have the data load rules (if any) correctly excluded unwanted data from the ingested data sets?
Do top-line metrics (impressions, clicks, conversions, etc.) line up with what is in the Data Provider's platform? (See the comparison sketch after this checklist.)
When applying date filters, do metrics (excluding reach, frequency, and uniques) match data for the same date range from the data provider?
Are there top line KPIs that are not represented on the Dashboard that you would like to see for QA purposes?
Are there Dimensions you expect to see that you are not seeing?
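One way to spot-check the metric-match and date-filter items above is to compare exported totals programmatically. The sketch below is a minimal example in Python/pandas, assuming two hypothetical exports (datorama_export.csv and provider_export.csv) that both contain Date, Impressions, Clicks, and Conversions columns; the file names, columns, and 0.5% tolerance are placeholders to adapt to your data sources.

# Minimal spot-check sketch: compare top-line metrics between a Datorama
# export and the data provider's report for the same date range.
# File names, column names, and tolerance are assumptions; adjust as needed.
import pandas as pd

METRICS = ["Impressions", "Clicks", "Conversions"]
START, END = "2024-01-01", "2024-01-31"  # example validation window

def totals(path):
    df = pd.read_csv(path, parse_dates=["Date"])
    window = df[(df["Date"] >= START) & (df["Date"] <= END)]
    return window[METRICS].sum()

datorama = totals("datorama_export.csv")
provider = totals("provider_export.csv")

for metric in METRICS:
    diff_pct = abs(datorama[metric] - provider[metric]) / max(provider[metric], 1) * 100
    status = "OK" if diff_pct < 0.5 else "CHECK"  # 0.5% tolerance is only an example
    print(f"{metric}: Datorama={datorama[metric]:,.0f} "
          f"Provider={provider[metric]:,.0f} diff={diff_pct:.2f}% [{status}]")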
HARMONIZATION CALCULATIONS CHECKLIST
General QA Page
QA Item
Validated
When it comes to linkage between two or more data sources, can you confirm that all required entities and attributes appear in the same row alongside their respective measurements? (See the linkage sketch after this checklist.)
Do the exceptions (if any) to the applied harmonization logic make sense? (Please check the exception tables for each harmonization method that was applied.)
Where classifications have been applied to the raw data, are the classifications correct?
Please verify whether the Unclassified values (if any) are as expected.
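For the linkage item above, the following is a minimal sketch of the kind of check involved, assuming hypothetical delivery.csv and buy.csv files that share a Placement ID key: a left join should put the entities and attributes from both sources on the same row with their measurements, and any rows that fail to link surface as the exception set.

# Linkage sanity-check sketch: join two sources on a shared key and list
# rows that fail to link. File and column names are illustrative only.
import pandas as pd

delivery = pd.read_csv("delivery.csv")   # e.g. Placement ID, Impressions, Clicks
buy = pd.read_csv("buy.csv")             # e.g. Placement ID, IO Name, Planned Cost

linked = delivery.merge(buy, on="Placement ID", how="left", indicator=True)

# Rows whose key exists only in the delivery data are the "exceptions"
exceptions = linked[linked["_merge"] == "left_only"]
print(f"Linked rows: {len(linked) - len(exceptions)} | Unlinked (exception) rows: {len(exceptions)}")
print(exceptions[["Placement ID"]].drop_duplicates().head())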
Harmonization Fields QA Page
QA Item
Validated
Please verify the accuracy/output of the logic applied to each harmonization field (a sketch of one such field rule follows this checklist).
Harmonization Field 1
Harmonization Field 2
Harmonization Field 3
Do the exceptions (if any) for each of the applied harmonization rules make sense? (Please check the exception table for each harmonized field.)
Harmonization Field 1 exception table
Harmonization Field 2 exception table
Harmonization Field 3 exception table
Please verify whether the Unclassified values (if any) are as expected.
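A minimal sketch of how one harmonization field might be reproduced for validation, assuming a hypothetical rule that derives a Channel value from patterns in the campaign name; anything the rule cannot classify falls into Unclassified, which is what the exception tables and the Unclassified check above are meant to surface. The rules, file name, and column names are illustrative only.

# Harmonization-field sketch: derive a "Channel" field from campaign names
# via pattern rules, with an explicit Unclassified fallback. Rules are examples.
import pandas as pd

rules = {               # pattern fragment -> harmonized value (illustrative)
    "_social_": "Social",
    "_search_": "Search",
    "_display_": "Display",
}

def harmonize_channel(campaign_name: str) -> str:
    name = campaign_name.lower()
    for fragment, channel in rules.items():
        if fragment in name:
            return channel
    return "Unclassified"   # these rows should show up in the exception table

df = pd.read_csv("datorama_export.csv")   # assumed to contain a "Campaign Name" column
df["Channel"] = df["Campaign Name"].map(harmonize_channel)
print(df["Channel"].value_counts(dropna=False))   # review the Unclassified volume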
KPI QA Page
QA Item
Validated
Please verify the accuracy/output of each of the applied KPI calculations.
KPI 1
KPI 2
KPI 3
When switching granularities through the provided filters, is the behaviour of the KPI as expected?
Do the exceptions (if any) for each of the KPI calculations make sense? (Please check the exception tables for each KPI calculation.)
KPI 1 exception table
KPI 2 exception table
KPI 3 exception table
Are the aggregation functions for each KPI (sum totals, averages, lifetime values, etc.) working as expected? (See the aggregation sketch after this checklist.)
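The aggregation check above matters most for ratio KPIs: a KPI such as CTR should be re-derived from summed components at each granularity rather than averaged row by row. A minimal sketch, assuming hypothetical Clicks and Impressions columns in a Datorama export:

# KPI aggregation sketch: ratio KPIs must be re-derived from summed components,
# not averaged row by row. File and column names are assumptions.
import pandas as pd

df = pd.read_csv("datorama_export.csv", parse_dates=["Date"])

# Correct total CTR: sum the components first, then divide.
total_ctr = df["Clicks"].sum() / df["Impressions"].sum()

# Common mistake: averaging a pre-computed row-level CTR (weights all rows equally).
naive_ctr = (df["Clicks"] / df["Impressions"]).mean()

print(f"Component-based CTR: {total_ctr:.4%}  |  Row-averaged CTR: {naive_ctr:.4%}")

# The same check at a coarser granularity, e.g. monthly:
monthly = df.groupby(df["Date"].dt.to_period("M"))[["Clicks", "Impressions"]].sum()
print((monthly["Clicks"] / monthly["Impressions"]).rename("CTR"))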
MCC QA Page
QA Item
Validated
Please validate the data in the table provided to verify that the Delivery data to Buy data connections are as expected.
Are the aggregation functions for each MCC-related metric (e.g. Actualized Cost, Recalculated Cost, Fixed Cost) working as expected?
Are the delivery vs. budget pacing calculations working as expected? (A worked pacing sketch follows this checklist.)
Please check the provided exception table to verify whether the IOs that did not get connected to delivery data are as expected (for instance, the delivery date range falls outside the IO flight duration, or keys used to link the Buy and delivery data are missing from the delivery data sets).
When toggling between different delivery date ranges via the date range filter, are the IO calculations behaving as expected?
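As a worked reference for the pacing item above, pacing is commonly expressed as actualized delivery cost to date versus the budget expected to have been spent at that point in the IO flight (assuming even spend across the flight). The flight dates, budget, and cost below are placeholder values, and your solution may use a different pacing definition.

# Pacing sketch: compare actualized delivery cost against the budget expected
# to date, assuming even spend across the IO flight. All inputs are examples.
from datetime import date

flight_start = date(2024, 1, 1)
flight_end = date(2024, 3, 31)
io_budget = 90_000.00        # planned IO budget
actualized_cost = 42_500.00  # delivery cost to date (e.g. Actualized Cost)
as_of = date(2024, 2, 15)

flight_days = (flight_end - flight_start).days + 1
elapsed_days = (min(as_of, flight_end) - flight_start).days + 1

expected_to_date = io_budget * elapsed_days / flight_days
pacing_pct = actualized_cost / expected_to_date * 100

print(f"Expected spend to date: {expected_to_date:,.2f}")
print(f"Pacing: {pacing_pct:.1f}% (100% = on pace)")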
FINAL VISUALIZATION QA CHECKLIST
QA Item
Validated
Fidelity to functional prototype: Does the final dashboard (or dashboards) conform to the signed-off wireframes, mockups, and prototype?
Are all available filters, toggles, drop-downs, etc. usable across each section of the page?
Do global filters (such as page-level date filters) work as expected?
Do filters adjust the appropriate data sets as expected, such as filtering a table, graph, or any other widget?
Does conditional formatting (if any) show the appropriate direction (e.g. positive = green)?
Are the download/export features working as expected?
Is the data displayed accurately, and does it match when comparing with your data vendors/platforms? The below are examples of validation criteria; please include other data points used for spot checks if applicable (a grouped comparison sketch follows this checklist):
Time period used for validating data (1 day, 1 week, 1 month, 3 months etc.)
Data validated for Business Units (e.g. 3 BUs validated)
Data validated by Channel, Campaign, Media Buy etc.
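For the spot-check item above, comparisons are easier to review when both exports are grouped by the same dimension. A minimal sketch, assuming hypothetical dashboard and vendor exports that both contain Channel and Impressions columns; the file names and dimension are placeholders.

# Dimension-level spot-check sketch: compare a metric by Channel between the
# dashboard export and the vendor export. File and column names are examples.
import pandas as pd

dash = pd.read_csv("dashboard_export.csv").groupby("Channel")["Impressions"].sum()
vendor = pd.read_csv("vendor_export.csv").groupby("Channel")["Impressions"].sum()

compare = pd.DataFrame({"Dashboard": dash, "Vendor": vendor}).fillna(0)
compare["Diff %"] = (compare["Dashboard"] - compare["Vendor"]) / compare["Vendor"].replace(0, 1) * 100
print(compare.round(2))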
SOLUTION DOCUMENTATION QA CHECKLIST
QA Item
Validated
Are all points mentioned in the documentation clearly laid out and in the expected amount of detail?
Is the solution design diagram indicative of the overall implementation flow?
Are all the harmonized fields and business rules clearly stated in the document and understood?
Are all the KPI/custom metric calculations clearly stated in the document and understood?
Does the 'Scaling and Changes' section in the document provide enough detail on how the solution can be taken forward?
Are all maintenance activities (if applicable) under the 'Repeatable Tasks' section clearly detailed and understood?