What you'll learn
- Which IT controls belong in an ISAE 3402 report and how ITGCs relate to application controls
- How testing differs for automated controls, manual controls, and IT-dependent manual controls
- What sample sizes apply and where to find the correct ISAE 3402 paragraph references
- How long an ISAE 3402 engagement typically takes from planning to report issuance
You've been asked to perform an ISAE 3402 engagement on a managed IT services provider. The control matrix from last year's auditor has 47 controls, 30 of them labelled "IT general controls." Your first question is the same one every practitioner asks: which of these actually belong in the report, and how many of them do I need to test? The second question follows immediately: what does "testing" even look like for an automated control that fires 10,000 times a day?
An ISAE 3402 engagement on IT controls requires the service auditor to test IT general controls (ITGCs) that support the reliability of application controls, and to test the application controls themselves when they are relevant to the control objectives in the system description. The scope depends on which controls the service organisation's management includes in the system description, but the service auditor must evaluate whether that description is complete and whether the included controls are sufficient to meet the stated control objectives.
What IT controls belong in an ISAE 3402?
The answer depends on the service organisation's system description. ISAE 3402.13 requires management to provide a description of the service organisation's system that includes, among other things, the control objectives and related controls. The service auditor's job under ISAE 3402.14 is to obtain sufficient appropriate evidence to form an opinion on whether the description is fairly presented and the controls are suitably designed (Type I) or suitably designed and operating effectively (Type II).
IT controls enter the scope when the service organisation's processes rely on technology to achieve control objectives. For a managed IT services provider, this is obvious: virtually all controls are IT controls. For a payroll bureau, the split is different. Some controls are manual (a supervisor reviews exception reports), some are automated (the system calculates tax withholdings based on coded tables), and some are IT-dependent manual controls (a supervisor reviews a system-generated report, but the reliability of that report depends on ITGCs).
IT general controls typically fall into four categories relevant to an ISAE 3402:
- Logical access controls. User provisioning, de-provisioning, password policies, privileged access management, periodic access reviews. These protect the integrity of application controls by ensuring only authorised personnel can modify system configurations and data.
- Change management. How changes to applications, operating systems, and databases are requested, tested, approved, and deployed. A change management failure can disable an automated control without anyone noticing until the next audit.
- IT operations. Job scheduling, backup procedures, incident management, monitoring. These ensure the IT environment remains available and recoverable.
- Data centre physical security. Physical access restrictions, environmental controls, visitor logs. Relevant when the service organisation operates its own data centres.
Not every ITGC needs to appear in every ISAE 3402 report. A service organisation that uses a public cloud provider (AWS, Azure) for infrastructure can carve out the data centre controls by referencing the cloud provider's own SOC 2 or ISAE 3402 report under the carve-out method. The service organisation's report then covers only the controls it manages directly.
How do ITGCs cascade to application controls?
This is the question that separates a good ISAE 3402 engagement from a checklist exercise.
ITGCs do not exist in isolation. They support application controls. If the ITGC layer is weak, the application controls built on top of it cannot be trusted even if they appear to operate correctly.
Consider an automated three-way match in an accounts payable system. The system compares the purchase order, goods receipt, and invoice before releasing a payment. That control is automated and fires every time an invoice is processed. But its reliability depends on several ITGCs: (1) that only authorised users can modify the matching tolerance thresholds, (2) that any change to the matching logic went through change management, (3) that the system access controls prevent a single user from creating both the purchase order and approving the invoice, and (4) that the underlying data has not been corrupted by an unmanaged system change.
If the change management ITGC fails (a developer pushed a code change directly to production without testing or approval), the three-way match might still appear to operate, but you have no assurance that it is performing the check it was designed to perform. The ITGC failure cascades upward.
In the ISAE 3402 control matrix, this relationship is documented by linking each application control to the ITGCs it depends on. When an ITGC fails testing, the service auditor must evaluate the impact on every application control that relies on it. This is why the linkage column exists.
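The linkage column described above is, in essence, a mapping from each application control to the ITGCs it depends on. A minimal sketch of how that cascade can be traced (control names here are hypothetical, not from any real matrix):

```python
# Illustrative ITGC-to-application-control linkage; control names are hypothetical.
# Each application control lists the ITGCs whose failure would undermine it.
LINKAGE = {
    "three_way_ap_match": ["change_management", "access_provisioning", "sod_enforcement"],
    "payroll_gross_to_net": ["change_management", "access_provisioning"],
    "bank_rec_auto_match": ["change_management", "it_operations"],
}

def impacted_application_controls(failed_itgc, linkage=LINKAGE):
    """Return the application controls whose reliability depends on a failed ITGC."""
    return sorted(app for app, itgcs in linkage.items() if failed_itgc in itgcs)

# A change management failure cascades to every dependent application control:
print(impacted_application_controls("change_management"))
# → ['bank_rec_auto_match', 'payroll_gross_to_net', 'three_way_ap_match']
```

The point of the sketch is the direction of the query: when an ITGC fails, the evaluation runs upward from the ITGC to every dependent application control, not the other way around.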
How do you test automated controls vs manual controls?
The testing approach differs because the nature of the control differs.
Automated controls operate identically every time, assuming the underlying system configuration has not changed. A system-enforced password policy either requires eight-character passwords with complexity rules or it does not. Testing an automated control involves two elements: (1) confirming the configuration is correct (inspect the system settings), and (2) confirming the configuration has not changed during the reporting period (review change management logs). If the ITGC environment is strong and no changes occurred, a single inspection of the configuration may be sufficient for a Type II report because the control operated the same way every time.
Manual controls depend on human execution, which varies. A monthly access review might be performed thoroughly in January, skipped in March, and done superficially in August. Testing manual controls requires sampling across the reporting period. The service auditor selects a sample of instances (monthly reviews, weekly approvals, daily reconciliations) and tests whether each instance was performed as designed.
IT-dependent manual controls combine both elements. The control is manual (a person reviews a report and takes action) but depends on the accuracy and completeness of a system-generated report. Testing requires two steps: (1) test the ITGCs supporting the report (is the report complete and accurate?), and (2) test a sample of instances where the person performed the review. Skipping the first step is the most common error in ISAE 3402 IT control testing. The reviewer sees that the supervisor signed off on the exception report every month, but nobody checked whether the report itself captured all exceptions.
Information produced by the entity (IPE) is the formal term for these system-generated reports. Every ISAE 3402 engagement should include IPE testing as a separate procedure, not just as an afterthought in the manual control test.
What sample sizes apply?
This is where most practitioners reach for the wrong paragraph reference.
ISAE 3402 paragraphs 24-29 and A28-A36 govern the service auditor's procedures, including testing of controls. These are the correct references for sampling decisions. Paragraphs A47-A54 address modified opinions and have nothing to do with sample sizes. The confusion arises because many template working papers (including some from major networks) cite A47-A54 in the sampling section.
For manual controls, sample sizes follow the same logic as ISA 530 sampling for tests of controls. The sample size depends on the expected deviation rate, the tolerable deviation rate, and the confidence level. In practice, most ISAE 3402 engagements use the following as a baseline:
- Daily controls (250+ occurrences per year): 25 items, assuming a tolerable deviation rate of 9-10% at 90% confidence
- Weekly controls (52 occurrences): 9-15 items
- Monthly controls (12 occurrences): All 12 (the population is small enough to test in full)
- Quarterly controls (4 occurrences): All 4
- Annual controls (1 occurrence): Test the single instance
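The baseline above reduces to a simple lookup by control frequency. A sketch, encoding the sample sizes stated here (these are common-practice figures, not mandated by ISAE 3402; firm methodology should set the actual thresholds):

```python
def baseline_sample_size(occurrences_per_year):
    """Baseline sample size for manual controls, per the table above.

    Thresholds mirror common practice and are not prescribed by ISAE 3402.
    Small populations (monthly, quarterly, annual) are tested in full.
    """
    if occurrences_per_year >= 250:    # daily controls
        return 25
    if occurrences_per_year >= 52:     # weekly controls
        return 15                      # upper end of the 9-15 range
    return occurrences_per_year        # monthly/quarterly/annual: test all

print(baseline_sample_size(250))  # → 25
print(baseline_sample_size(52))   # → 15
print(baseline_sample_size(12))   # → 12
```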
For automated controls with strong ITGCs, a single test of the configuration (plus evidence of no changes during the period) can cover the entire population because the control operated identically every time. This is the efficiency gain from strong ITGCs: it reduces the sampling burden on application controls.
When a deviation is found, the service auditor must determine whether to expand the sample. One deviation in a sample of 25 produces a deviation rate of 4%. Whether this exceeds the tolerable deviation rate depends on the engagement's defined threshold. The ISAE 3402 testing protocol in ciferi's pack includes a pre-defined deviation criteria section that forces the team to set these thresholds before fieldwork begins (not after the first deviation appears).
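The deviation check itself is simple arithmetic. A sketch, with the tolerable rate as an engagement-specific assumption set before fieldwork:

```python
def evaluate_deviation(deviations, sample_size, tolerable_rate):
    """Compare the observed deviation rate with the pre-defined tolerable rate.

    tolerable_rate is engagement-specific and must be fixed before fieldwork.
    This compares point estimates only; a statistical approach would also
    project an upper deviation limit before concluding.
    """
    observed = deviations / sample_size
    return observed, observed <= tolerable_rate

rate, within = evaluate_deviation(deviations=1, sample_size=25, tolerable_rate=0.09)
print(f"observed {rate:.0%}, within tolerable threshold: {within}")
# → observed 4%, within tolerable threshold: True
```

When the observed rate breaches the threshold, the options are to expand the sample, treat the control as not operating effectively, or identify a compensating control; which applies is a judgment the working papers must document.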
What is the difference between Type I and Type II?
A Type I report (ISAE 3402.8(a)) covers the fairness of the system description and the suitability of the design of controls as of a specified date. The service auditor inspects the controls but does not test whether they operated effectively over a period.
A Type II report (ISAE 3402.8(b)) covers everything in a Type I report plus the operating effectiveness of controls over a specified period (typically 9-12 months, with 12 months being the standard expectation from user auditors).
The practical difference is significant. A Type I report tells the user auditor "these controls were designed correctly as of 30 September 2025." It says nothing about whether anyone actually performed the monthly access review in January, or whether the automated matching control was functioning correctly throughout the year. User auditors relying on a Type I report must perform their own procedures to obtain evidence about operating effectiveness during the period, per ISA 402.16.
Most service organisations start with a Type I report and transition to a Type II the following year. The Type I engagement typically follows a 3-4 month "readiness" period during which the service organisation implements its controls; because a Type I speaks to design as of a specified date rather than over a period, the report is issued as of a date at the end of that readiness work. The Type II engagement then covers the full subsequent year.
A Type I report has a limited shelf life. After the first year, user auditors expect a Type II. Continuing to issue Type I reports year after year raises questions about why the service organisation has not transitioned.
How long does an ISAE 3402 engagement take?
Timelines vary, but a typical first-year ISAE 3402 engagement follows this pattern:
Planning and scoping (4-6 weeks). Define the system boundaries, draft control objectives with management, identify the controls to include, build the control matrix, determine the reporting period, and agree the engagement timeline. This phase takes longer than most teams expect because the service organisation often does not have a clean, complete list of its own controls.
System description review (2-3 weeks). Management prepares the system description. The service auditor reviews it for completeness, accuracy, and compliance with ISAE 3402.14(a)-(b). Multiple iterations are common in year one.
Design testing and walkthrough (2-4 weeks). For a Type I, this is the testing phase. For a Type II, this is the first stage. The service auditor performs walkthroughs of each control, confirming it is designed to meet the stated control objective.
Operating effectiveness testing (4-8 weeks, Type II only). This is the fieldwork-heavy phase. The service auditor selects samples, inspects evidence, reperforms controls, and documents deviations. IT controls typically require system access to review configurations, logs, and change records.
Reporting and finalisation (2-4 weeks). Draft the report, share with management for factual accuracy review, resolve any findings, obtain management's written assertion, and issue the final report.
Total elapsed time: 12-20 weeks for a first-year Type I, 16-28 weeks for a first-year Type II. Year-two engagements are faster because the system description, control matrix, and scoping decisions carry forward.
Who signs the report?
The service auditor signs the ISAE 3402 report. This must be a qualified auditor (RA or equivalent) at a firm licensed to perform assurance engagements. The engagement partner reviews and approves the report before issuance.
Management of the service organisation provides a written assertion (ISAE 3402.12) that accompanies the report. This assertion confirms that the system description is fairly presented, the controls are suitably designed, and (for Type II) the controls operated effectively throughout the period. The assertion is signed by an appropriate officer of the service organisation (typically the CEO, CFO, or CTO).
The report is addressed to the service organisation. It is distributed to user entities and their auditors, usually under a restricted-use clause. The service auditor does not issue the report directly to user auditors.
Worked example
Dekker Cloud Services B.V. Revenue: €12M. Provides hosted ERP and payroll processing for 45 mid-market clients across the Netherlands and Belgium. First-year ISAE 3402 Type II engagement. Reporting period: 1 April 2025 to 31 March 2026.
Define control objectives. Working with Dekker's IT director, identify seven control objectives: logical access management, change management, data backup and recovery, incident management, payroll processing accuracy, journal entry authorisation, and bank reconciliation processing. Documentation note: Record how each control objective was determined to be relevant to user entities' financial reporting. Cross-reference to the system description.
Build the control matrix. Map 22 controls to the seven objectives. Fourteen are ITGCs, including access provisioning, quarterly access review, annual access recertification, change request approval, testing before deployment, deployment approval, daily backup execution, monthly backup integrity check, incident logging, incident escalation SLA, and physical access restriction. Eight are application controls (automated payroll gross-to-net calculation, payroll exception report review, three-way AP match, journal entry dual authorisation, bank reconciliation automated matching, manual bank reconciliation review, segregation of duties enforcement, data input validation). Documentation note: For each control, complete the five-element description: who performs it, what they do, when they do it, what evidence they produce, and what happens when an exception occurs.
Test ITGCs first. Test the quarterly access review (population: 4 reviews, test all 4). Review access review documentation for completeness: all users reviewed, actions taken for inappropriate access, sign-off by the IT security manager. One deviation found: the Q2 review was completed 11 days late. Assess whether the delay constitutes an operating effectiveness failure. Documentation note: Record the deviation, management's explanation (staff absence), the compensating detective control (automated alerting on privileged access changes operated throughout Q2), and the conclusion on whether the control objective was still met.
Test application controls. For the automated payroll gross-to-net calculation, inspect the system configuration (tax tables, contribution rates) and confirm no changes to the calculation logic during the period via the change management log. For the payroll exception report review (monthly, population: 12), test all 12 instances. Verify the reviewer signed off, exceptions were investigated, and corrections were processed within the defined SLA. Documentation note: For each application control, document the link to the supporting ITGCs. Note that the automated calculation's reliability depends on the change management and access controls tested in the preceding ITGC step.
Assess deviations and form the opinion. The Q2 access review delay is the only deviation. The compensating control (automated alerting) operated throughout the period, providing partial coverage. Severity assessment: LOW. No opinion modification required. Documentation note: Record the deviation in the gap analysis with the severity rating, compensating control evaluation, and the conclusion that the deviation does not affect the overall opinion.
The final Type II report covers all seven control objectives with an unmodified opinion and a description of the single deviation in the access review control.
Practical checklist
- Before fieldwork, define the tolerable deviation rate for each control. Setting this after the first deviation creates the appearance of working backward from the result (ISAE 3402, paragraphs 24-29).
- Link every application control to the ITGCs it depends on. When an ITGC fails, trace the impact upward to all dependent application controls before concluding.
- For automated controls, test the configuration once and verify no changes occurred during the period via change management logs. Do not sample automated controls as if they were manual.
- Test IPE (information produced by the entity) separately. Confirm the completeness and accuracy of every system-generated report used in a manual or IT-dependent manual control.
- Cite paragraphs 24-29 and A28-A36 for sampling decisions. If your working papers reference A47-A54 in the sampling section, correct them. Those paragraphs address modified opinions.
- For first-year engagements, build the timeline with management input. A realistic first-year Type II takes 16-28 weeks from planning to report issuance.
Common mistakes
- Citing ISAE 3402 paragraphs A47-A54 as the basis for sample size decisions. These paragraphs govern modified opinions, not sampling. The correct references are paragraphs 24-29 and A28-A36. This error propagates through template working papers and is rarely caught until an inspection.
- Testing automated controls with the same sample sizes used for manual controls. An automated control with strong ITGCs and no configuration changes during the period does not require a sample of 25. A single inspection of the configuration, combined with change management evidence, is sufficient.
- Failing to test IPE. A supervisor's monthly sign-off on an exception report means nothing if the report itself is incomplete. The PCAOB and AFM have both flagged IPE testing as a recurring deficiency in service organisation engagements.
Related content
- ISAE 3402 glossary entry. Definitions of ITGCs, application controls, control objectives, and the Type I / Type II distinction.
- ISAE 3402 working paper pack. Includes the control matrix with ITGC-to-application-control linkage, testing protocol with correct paragraph references, and gap analysis with severity assessment methodology.
- FUTURE POST: ISAE 3402 vs SOC 2: which report does your client need?. If your client's user entities include US-based companies, you may need a SOC 2 alongside the ISAE 3402.