What you'll learn

  • How to structure a control matrix across four blocks with 28 columns that satisfy ISAE 3402.16 and ISAE 3402.23
  • How to write control descriptions that pass the five-element test (WHO, WHAT, WHEN, EVIDENCE, EXCEPTION)
  • How IPE flagging prevents the most common PCAOB finding on system-generated reports
  • How the [ISAE 3402 template pack](/templates) pre-populates 11 worked example controls across seven control objectives

You've been asked to build the control matrix for an ISAE 3402 engagement. You open a blank spreadsheet. Twenty minutes later you have five columns: control objective, control description, frequency, type, and a notes field. That structure will generate review comments before fieldwork even starts.

An ISAE 3402 control matrix that survives review requires a minimum of 28 columns across four distinct blocks (identification, classification, linkage, assessment), with every control description passing a five-element test covering who performs it, what they do, when, what evidence it produces, and what happens when exceptions occur.

Why five columns is not enough

ISAE 3402.16(a) lists eight mandatory description criteria for the service organisation's system. The control matrix is where most of those criteria land. A five-column matrix cannot address: the link between controls and risks (ISAE 3402.23), the distinction between key and non-key controls with documented rationale, IPE identification, CUEC dependencies, subservice organisation carve-out or inclusive method treatment, or the pre-defined evidence expectations that set the baseline for testing.

Every missing column becomes a gap that the reviewer fills with a question. Five questions in the review means the matrix goes back for rework. Twenty-eight columns answered upfront means the reviewer traces a complete chain from control objective to testing conclusion without interruption.

The column count is not arbitrary. It reflects what the standard requires, what inspectors check, and what the testing protocol needs as input. Each column feeds forward into subsequent tabs: the testing protocol pulls frequency, population, risk level, and key/non-key classification directly from the matrix. If those columns do not exist, the tester invents them on the fly, and consistency breaks.

The four-block structure

The ISAE 3402 template pack organises the 28 columns into four blocks. Each block serves a different purpose in the audit chain.

Identification covers the first six columns. These answer: which control objective does this control serve, what process area does it belong to, what is its unique identifier, what does the control actually do (per the five-element test), and who owns it. The control objective description follows a formula: "Controls provide reasonable assurance that [specific operational outcome]." Vague objectives ("controls over IT") fail the specificity test in ISAE 3402.18.

Classification covers columns seven through thirteen. These answer: is the control manual, automated, or IT-dependent manual? How often does it operate? Is it key or non-key (with mandatory rationale)? What system does it run in? Does it rely on information produced by the entity (IPE), and if so, how will completeness and accuracy of that IPE be tested? This block is where most review comments originate because it requires judgment calls that teams often skip or answer generically.

Linkage covers columns fourteen through twenty. These connect the control to the rest of the engagement file. Every control must trace to a risk in the risk assessment. Every control maps to a COSO component and to specific financial statement assertions. If a CUEC dependency exists, the linkage block records it. If a subservice organisation is involved, the carve-out or inclusive method is documented here. Without this block, the control matrix is a standalone document. With it, the matrix becomes a node in a connected chain.

Assessment covers columns twenty-one through twenty-eight. These define the testing expectations. What evidence should the tester expect to see? What testing approach will be used (and the standard's prohibition on inquiry alone under ISAE 3402.25(a) is reinforced here)? What is the sample size basis? What is the population? Design effectiveness and operating effectiveness get separate columns because a control that is well-designed but does not operate consistently has a fundamentally different deficiency profile. Exception notes and prior-year references close the block.
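The four-block layout can be modelled as a simple data structure, which makes the 28-column count easy to verify before any rows are written. A minimal sketch in Python, where the column names are illustrative placeholders rather than the template pack's exact headings (the text names fewer fields than some blocks contain, so a few names here are assumptions):

```python
# Illustrative sketch of the four-block, 28-column layout (6 + 7 + 7 + 8).
# Column names are placeholders, not the template pack's exact headings.
MATRIX_BLOCKS = {
    "identification": [
        "control_objective", "process_area", "control_id",
        "control_title", "control_description", "control_owner",
    ],
    "classification": [
        "control_type", "frequency", "key_flag", "key_rationale",
        "system", "ipe_flag", "ipe_description",
    ],
    "linkage": [
        "risk_reference", "coso_component", "assertions",
        "cuec_reference", "subservice_org", "subservice_treatment",
        "cross_reference",
    ],
    "assessment": [
        "evidence_expected", "testing_approach", "sample_size_basis",
        "population", "design_effectiveness", "operating_effectiveness",
        "exception_notes", "prior_year_reference",
    ],
}

# The structural claim from the text: four blocks, 28 columns in total.
assert len(MATRIX_BLOCKS) == 4
assert sum(len(cols) for cols in MATRIX_BLOCKS.values()) == 28
```

Encoding the layout once, rather than retyping headers per engagement, also means the testing protocol can pull frequency, population, risk level, and key/non-key classification from the same definitions.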

The five-element control description test

The single most common deficiency in ISAE 3402 files is a vague control description. "Management reviews the payroll report" tells the tester almost nothing. Compare that to a description that passes all five elements:

WHO performs the control: the Payroll Manager (a role title, never a person's name).

WHAT they do: reviews the monthly payroll variance checklist, comparing total payroll cost per department to the approved budget and prior month, investigating variances exceeding 5%.

WHEN they do it: by the 15th of the following month.

What EVIDENCE the control produces: signed variance checklist with investigation notes for all flagged departments, retained in the payroll SharePoint folder.

What happens when an EXCEPTION occurs: variances exceeding 10% are escalated to the Finance Director with a written explanation within two business days.

A control description that omits any element creates problems downstream. Without WHO, the tester cannot confirm the right person performed the control. Without WHEN, the tester cannot determine whether the control operated on time. Without EVIDENCE, the tester does not know what to inspect. Without EXCEPTION, the tester has no benchmark for evaluating whether deviations were handled appropriately.

The five-element test applies to every control in the matrix. Automated controls still need it: WHO is the system, WHAT is the specific validation or calculation, WHEN is real-time or batch (specify which), EVIDENCE is the system log or rejection report, EXCEPTION is the error handling routine.
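Because the test is mechanical, it can be automated as a completeness check on each description before review. A minimal sketch, assuming the description is captured as five separate fields (the field names and function are hypothetical, not part of the template pack):

```python
# The five elements every control description must contain.
REQUIRED_ELEMENTS = ("who", "what", "when", "evidence", "exception")

def missing_elements(description: dict) -> list:
    """Return the five-element fields that are absent or blank."""
    return [e for e in REQUIRED_ELEMENTS
            if not str(description.get(e, "")).strip()]

# The vague description from the text fails four of five elements.
vague = {"what": "Management reviews the payroll report"}

# The worked payroll variance review passes all five.
complete = {
    "who": "Payroll Manager",
    "what": "Reviews monthly payroll variance checklist; investigates variances over 5%",
    "when": "By the 15th of the following month",
    "evidence": "Signed variance checklist in the payroll SharePoint folder",
    "exception": "Variances over 10% escalated to the Finance Director within two business days",
}

print(missing_elements(vague))     # ['who', 'when', 'evidence', 'exception']
print(missing_elements(complete))  # []
```

Running this across the whole matrix surfaces every description a tester would otherwise have to query during fieldwork.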

IPE flagging: the column that prevents the PCAOB's top finding

Information produced by the entity (IPE) is any system-generated report or data extract that the auditor uses as audit evidence. The PCAOB's 2024 Staff Alert identified IPE testing failures as the most common deficiency in service organisation engagements: firms relied on system-generated access listings, payroll registers, and exception reports without testing whether those reports were complete and accurate.

The control matrix contains a dedicated IPE flag column. When a control relies on a system-generated report as evidence (a user access listing from the ERP, a payroll variance report, a change management log), the IPE flag is set to "Y." A companion column then requires documentation of: which report constitutes the IPE, which system generates it, and the completeness and accuracy approach the tester will use.

This flag is set at the matrix stage, not during testing. The reason: if IPE identification happens only when the tester encounters a report in the field, some IPE will be missed. By flagging it in the matrix, the testing protocol inherits the flag and includes IPE testing steps automatically.

For the 11 example controls in the pack, IPE flags are pre-populated. The quarterly access review relies on the user access listing (IPE). The payroll variance review relies on the payroll register (IPE). The change management procedure relies on the CAB meeting minutes generated by the ITSM tool (IPE). Each flagged control has a pre-written completeness and accuracy approach that the tester can adapt to the specific entity's systems.
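Carrying the flag from the matrix into the testing protocol can be sketched as a simple transformation: any row with the IPE flag set inherits completeness and accuracy steps automatically. The function and field names below are hypothetical; the example control and report are taken from the pack's worked examples:

```python
def build_test_steps(control: dict) -> list:
    """Derive testing-protocol steps from a matrix row. IPE-flagged
    controls automatically inherit completeness and accuracy steps,
    so nothing depends on the tester noticing the report in the field."""
    steps = [f"Inspect evidence: {control['evidence']}"]
    if control.get("ipe_flag") == "Y":
        steps.append(f"Test completeness of IPE: {control['ipe_report']}")
        steps.append(f"Test accuracy of IPE: {control['ipe_report']}")
    return steps

# Worked example: quarterly access review relying on a system report.
access_review = {
    "evidence": "Signed quarterly access review",
    "ipe_flag": "Y",
    "ipe_report": "User access listing from the ERP",
}
for step in build_test_steps(access_review):
    print(step)
```

The point of the sketch is the direction of dependency: the protocol reads the flag, so missing IPE testing becomes a visible blank in the matrix rather than a silent omission in the field.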

Key versus non-key classification

Every control in the matrix receives a key or non-key classification. The distinction matters for two reasons: sample sizes are higher for key controls, and a deviation in a key control is more likely to trigger a gap analysis entry.

The rationale column requires the team to answer three questions. What risk does this control address? Does a compensating control exist? What happens if this control fails?

A control is key when it is the primary control for a risk and no compensating detective control would catch failures between operating cycles. A control is non-key when it supplements a primary control. In the pack's worked examples, the quarterly access review by the Information Security Manager is classified as key because no compensating control prevents unauthorised access between quarterly review cycles. The annual access certification by department heads is non-key because the quarterly review is the primary control.

A vague classification with no rationale ("key because it's important") is the finding most likely to generate a review comment. The rationale column forces specificity.

Worked example: Van der Berg Payroll Services B.V.

Entity: Van der Berg Payroll Services B.V., a Dutch payroll processing bureau with €34M revenue, processing payroll for 112 client entities. Type II engagement covering 1 January to 31 December 2025. Seven control objectives across ITGC, payroll processing, change management, and financial reporting.

  1. The engagement team opens the control matrix and defines the payroll processing control objective: "Controls provide reasonable assurance that payroll transactions are processed accurately, completely, and only with proper authorisation." The process area is Payroll Processing. Documentation note: the payroll processing objective description follows the formula template. Objective links to the inaccurate payroll output risk and the unauthorised payroll changes risk in the risk assessment tab.

  2. The team populates the payroll variance review control using the five-element test. WHO: Payroll Manager. WHAT: Reviews monthly payroll variance checklist, comparing total payroll cost per department to approved budget and prior month, investigating variances exceeding 5%. WHEN: By the 15th of the following month. EVIDENCE: Signed variance checklist with investigation notes, retained in payroll SharePoint folder. EXCEPTION: Variances exceeding 10% escalated to Finance Director within two business days. Documentation note: control description field contains all five elements in sequence. Control type classified as IT-dependent manual (relies on system-generated variance report but performed by a person).

  3. The classification block records: frequency is monthly (12 occurrences per year), key control (rationale: "Primary detective control over payroll accuracy. No other control independently verifies total payroll cost against budget. If this control fails, variances up to 10% go undetected until quarterly financial review"), system is the payroll application, IPE flag is Y. Documentation note: IPE description reads "Payroll variance report generated by [Payroll System]. Completeness: reconcile report total to general ledger payroll expense. Accuracy: reperform 2 variance calculations."

  4. The linkage block connects the payroll variance review control to the inaccurate payroll output risk, COSO component Control Activities, assertions Accuracy and Completeness, the payroll authorisation complementary user entity control (user entities must authorise payroll transactions before submission), no subservice organisation dependency. Documentation note: the payroll authorisation CUEC cross-reference means the control's effectiveness depends on user entities submitting authorised data. If the payroll authorisation CUEC does not operate, this control could process accurately but on fraudulent input.

  5. The assessment block sets: evidence expected is the signed checklist with investigation notes, testing approach is inspection plus reperformance, sample size basis is 3 to 4 items per standard sampling guidance for monthly controls at 95% confidence, population is 12 monthly reviews, design effectiveness assessed as effective per the payroll variance review design walkthrough. Documentation note: operating effectiveness left blank at matrix stage. Populated during testing phase. Exception column pre-populated with "No exceptions" as default, to be updated if deviations found.

The completed matrix row for the payroll variance review control contains 28 populated fields. A reviewer can trace from the control objective through the description, classification, risk linkage, and testing parameters without asking a single clarifying question.
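That "no clarifying questions" standard can be checked mechanically before the matrix goes to review, by scanning each row for blank fields. A sketch, assuming rows are stored as dicts keyed by column name (the column and field names are illustrative):

```python
def blank_fields(row: dict, expected_columns: list) -> list:
    """Return the columns that are missing or empty in a matrix row.
    Any hit is a gap the reviewer would otherwise fill with a question."""
    return [c for c in expected_columns
            if not str(row.get(c, "")).strip()]

# Illustrative subset of the 28 columns and a partially completed row.
columns = ["control_id", "control_description", "key_rationale", "ipe_flag"]
row = {
    "control_id": "PAY-01",
    "control_description": "Payroll variance review (five-element description)",
    "ipe_flag": "Y",
}
print(blank_fields(row, columns))  # ['key_rationale']
```

Note that operating effectiveness is deliberately blank at the matrix stage, so a real pre-review check would exclude the columns populated only during testing.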

Practical checklist for control matrix construction

  1. Define all control objectives before writing individual controls. Each objective follows the formula: "Controls provide reasonable assurance that [specific outcome]." Vague objectives generate vague controls (ISAE 3402.18).

  2. Write every control description using the five-element test. Read the description and confirm you can answer: who, what, when, what evidence, what exception handling. If any element is missing, the tester will not know what to look for.

  3. Flag every IPE-dependent control in the matrix before testing begins. For each flag, document the report name, source system, and your completeness and accuracy approach. Do not leave this for fieldwork (PCAOB 2024 Staff Alert).

  4. Document key/non-key rationale by answering three questions: what risk, what compensating control, what consequence of failure. A one-word classification with no rationale will generate a review comment.

  5. Verify that every control traces to at least one risk in the risk assessment. A control with no risk linkage has no purpose in the matrix.

Common mistakes

  • Writing control descriptions that omit the exception-handling element. The AFM's 2025 inspection findings noted that 23 of 32 ISAE 3402 files had insufficient evidence for at least one control, and the root cause in most cases was a description so vague that the tester could not determine what "operating effectively" looked like.

  • Classifying all controls as key. When every control is key, the classification carries no information. The distinction exists to focus testing effort on controls where failure has the greatest consequence. If 90% of controls are key, the team has not applied judgment.

  • Omitting the linkage block entirely. A control matrix without risk linkage is a list of controls, not an audit working paper. ISAE 3402.23 requires the auditor to identify risks that threaten control objective achievement. Controls that do not map to identified risks cannot demonstrate they address those risks.
