Document Comparison

PCI-Secure-Software-ROV-Template-v1_1.pdf PCI-Secure-Software-ROV-Template-v1_2.pdf
31% similar
180 → 165 Pages
49844 → 50153 Words
570 Content Changes

Content Changes

570 content changes. 248 administrative changes (dates, page numbers) hidden.

Added p. 5
• Secure Software Requirements and Assessment Procedures (Secure Software Standard) Version 1.2.

It is the mandatory template for Secure Software Assessors completing a Secure Software Assessment.

Do not delete any content from the Secure Software ROV Template. The Introduction section of this document may be deleted, but the assessor must follow the instructions in this section while documenting the assessment.
Added p. 6
• Secure Software Requirements and Assessment Procedures (Secure Software Standard).

• Secure Software Program Guide (Secure Software Program Guide)

• Secure Software Attestation of Validation (Secure Software AOV)

• Glossary of Terms, Abbreviations, and Acronyms (SSF Glossary)

• Qualification Requirements for SSF Assessors (SSF Assessor Qualification Requirements)

Documenting the Assessment Findings and Observations

The results of the Secure Software Assessment are documented within the Detailed Findings and Observations section of the Secure Software ROV Template. An example layout of the Detailed Findings and Observations section is provided in Table 1.

N/A (Not Applicable) The control objective does not apply to the assessed software or software vendor. All “N/A” responses require detailed reporting on the testing performed and the results of the tests that explain why the control objective does not apply. In general, all control objectives within a given module must be satisfied and cannot be marked as N/A, unless there are legitimate technical constraints …
Added p. 10
• Appendix A, Additional Information Worksheet

PCI Secure Software Standard v1.2 Report on Validation

Software Vendor Name:

Payment Software Name:

DBA (doing business as):

Company main website:

Added p. 13
Lead Assessor email address:

Lead Assessor PCI credentials:
Added p. 13
QA reviewer PCI credentials:
Added p. 13
Assessor Name: Assessor PCI Credentials: Assessor Role or Function During the Assessment:

Section 2.2 of the Qualification Requirements for SSF Assessors specifies the independence requirements that the Assessor Company must adhere to at all times when conducting Secure Software Assessments. Assessors are encouraged to review the independence requirements prior to completing the following table:
Added p. 14
Confirmation of Consultation Services Provided:

Description of Services Provided:
Added p. 14
Product or Service Name: Product Description:
Added p. 16
Already listed on PCI SSC website? Yes No PCI identifier (if applicable):
Added p. 17
Describe a typical implementation of the software (for example, how it is configured in the execution environment, how it typically interacts with other components or services, where those components or services reside, and who is responsible for maintaining them).

<Insert payment software architecture diagram(s) here>

Required dependencies are those that would render the assessed payment software inoperable or useless if unavailable. Such dependencies typically involve hardware, software, or services that must be purchased, licensed, and/or maintained separately by an implementing entity (such as a merchant or other type of entity). These types of dependencies do not include hardware, software, or services that are packaged and distributed with the assessed software.
Added p. 19
• Provider / Supplier: The manufacturer and/or supplier of the device. Also referred to as “device vendor.”

• Make / Model #: The device name and/or model number.

• Version(s) Supported: The version(s) of the device that is/are supported by the assessed payment software.

• Version(s) Tested: The version(s) of the device that was/were used during the software assessment.

• PTS Approval #: The PTS approval number for the associated PCI-approved PTS device(s), where applicable. This information need only be specified if the required device(s) has been approved by PCI SSC under the PCI PIN Transaction Security (PTS) Point-of-Interaction (POI) device validation program.

Note 1: POI device approval listings that appear similar or identical on the PCI SSC List of Approved PTS Devices may be associated with different versions of the PTS POI Standard. Be sure the correct listing is referenced and used during the assessment. The most recent device approvals should be referenced.

Note 2: …
Added p. 20
• Provider / Supplier: The manufacturer and/or supplier of the software.

• Software Name: The name of the software.

• Software Description: A brief description of the software type and/or intended function (e.g., database, web server).

• Version(s) Supported: The version(s) of the software supported by the assessed payment software.

• Version(s) Tested: The version(s) of the software that was/were used during the software assessment.

Provider / Supplier Software Name Software Description Version(s) Supported Version(s) Tested
Added p. 21
• Sensitive Data Type: The type of data deemed sensitive. Examples include Account Data, authentication credentials, cryptographic keys, etc.

• Sensitive Data Elements: The names of the individual data elements in relation to the Sensitive Data Type. Examples include PAN/SAD, username/password, etc.

• Protection Requirements: Indicates whether the data requires confidentiality protection, integrity protection, or both.

• Storage Locations: The locations where sensitive data is stored persistently. Examples include file [name], table [name], etc.

Sensitive Data Type Sensitive Data Elements Protection Requirements Storage Location(s)

2.4.2 Sensitive Data Flows

Provide high-level data flow diagrams that show the details of all sensitive data flows, including:

• All flows and locations of encrypted sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment).

• All flows and locations of clear-text sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment).

• How and where sensitive data is stored, processed and/or transmitted.

• The …
Added p. 23
Assessment start date:
Added p. 24
Assessed Modules / Justification (if excluded from the software assessment):

• Core Requirements: N/A

• Module A: Account Data Protection Requirements

• Module B: Terminal Software Requirements

• Module C: Web Software Requirements

3.2.2 Requirements Deemed Not Applicable

Identify any control objectives and test requirements that were determined to be “Not Applicable” to the assessed software or the assessed software vendor. List applicable control objectives and test requirements in the order they appear in Section 4, “Detailed Findings and Observations” (adding additional rows as needed).

Important Note: A “Not Applicable” finding is only acceptable where the control objective has been verified to be not applicable to the assessed software through an appropriate degree of testing. All “Not Applicable” responses MUST be tested, and details MUST be provided to describe how it was determined that a control objective does not apply to the assessed software.

Control Objective or Test Requirement #:

Describe how it was determined that the requirement …
Added p. 28
Where sampling is used, samples must be representative of the total population of possible items. The sample size must be sufficiently large and diverse to provide assurance that the selected sample accurately reflects the overall population.

The required attributes that must be specified for each sample set are described below:

• Reference #: A reference number used to uniquely identify each sample set. Generic values such as “Set-1,” “Set-2,” and so on may be used in lieu of formal reference numbers.

• Sample Description: A brief description of the items sampled. For example, “a sample of software updates” or “a sample of user IDs.”

• Total Sampled: The number of items included in the sample set. This could also be expressed in other relevant terms, such as lines of code (if applicable).

• Total Population: The total number of possible items available for testing.

• Sample Justification: The Assessor’s justification for why the Total Sampled is …
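The representativeness requirement above can be sketched in a few lines. This is an illustrative example only: the user-ID population, sample size, and helper name are hypothetical, and a real assessment would justify the sample size against the actual population.

```python
import random

def build_sample_set(population, sample_size, reference, description, justification):
    """Draw a simple random sample and record the attributes the ROV template
    asks for (Reference #, Sample Description, Total Sampled, Total Population,
    Sample Justification)."""
    if sample_size > len(population):
        raise ValueError("sample size exceeds population")
    return {
        "Reference #": reference,
        "Sample Description": description,
        "Total Sampled": sample_size,
        "Total Population": len(population),
        "Sample Justification": justification,
        # Uniform sampling without replacement, so every item is equally likely
        # to be selected regardless of where it sits in the population.
        "Items": random.sample(population, sample_size),
    }

# Hypothetical population of 200 user IDs.
user_ids = [f"user-{i:03d}" for i in range(1, 201)]
sample = build_sample_set(
    user_ids, 30, "Set-1", "a sample of user IDs",
    "30 of 200 IDs drawn uniformly at random across all user roles",
)
print(sample["Total Sampled"], "of", sample["Total Population"])  # 30 of 200
```

Drawing uniformly at random is one common way to make a sample defensible as representative; stratified sampling (e.g., per role or per code module) is another.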
Added p. 29
Validated: All applicable control objectives are marked “In Place,” and therefore Secure Software Name(s) and Version(s) has achieved full validation with the PCI Secure Software Standard.

The ROV was completed according to the PCI Secure Software Standard Version 1.2, in adherence with the instructions therein.

All information within this ROV represents this Secure Software Assessment in all material respects.
Added p. 30
This assessment was conducted in a manner intended to preserve at all times the professional judgment, integrity, impartiality, and professional skepticism of the SSF Assessor Company.

This Report on Validation accurately identifies, describes, represents, and characterizes all factual evidence that the SSF Assessor Company and its Assessor Employees gathered, generated, discovered, reviewed and/or determined in their sole discretion to be relevant to this assessment while performing the assessment.

The judgments, conclusions, and findings contained in this Report on Validation (a) accurately reflect and are based solely upon the factual evidence described immediately above, (b) reflect the independent judgments, findings and conclusions of the SSF Assessor Company and its Assessor Employees only, acting in their sole discretion, and (c) were not in any manner influenced, directed, controlled, modified, provided or subjected to any prior approval by the assessed Vendor, any contractor, representative, professional advisor, agent or affiliate thereof, or any other person or …
Added p. 32
1.1.b The assessor shall examine evidence to confirm that information is maintained that describes where sensitive data is stored. This includes the storage of sensitive data in temporary storage (such as volatile memory), semi-permanent storage (such as RAM disks), non-volatile storage (such as magnetic and flash storage media), or in specific locations or form factors (such as with an embedded system that is only capable of local storage).

R1 Identify the evidence obtained that details the locations where sensitive data is stored.

R1 Identify the evidence obtained that details the security controls that are implemented to protect sensitive data during storage, processing, and transmission.

R1 Describe each of the tests performed, including the tool(s) and/or method(s) used, to confirm that the evidence obtained in Test Requirement 1.1.a accurately reflects the sensitive data stored, processed, and transmitted by the assessed software.

R2 Describe each of the software tests performed, including the tool(s) and/or …
Added p. 37
R1 Identify the evidence obtained that details all interfaces (user interfaces, APIs, etc.) that are accessible or that can be made accessible (through user input or interaction) upon software installation, initialization, or first use.
Added p. 37
R1 Indicate whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication, such as those that are provided by the execution environment or that reside outside of the execution environment.

R2 If R1 is “No,” then describe what the assessor observed that confirms that none of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data.

2.1.d The assessor shall test the software to determine whether any of the interfaces identified in Test Requirement 2.1.a expose functions or services that have publicly disclosed vulnerabilities by conducting a search on the exposed protocols, methods, or services in public vulnerability repositories such as that maintained within the National Vulnerability Database.

R1 Indicate whether any of the interfaces identified in Test Requirement 2.1.a rely on any protocols, functions, or ports that are known to contain vulnerabilities.

R2 If R1 is “No,” then describe what …
Added p. 48
R1 Identify the evidence obtained that details the sensitive data that is retained by the software for transient use.

R2 Describe the mechanisms used or relied upon by the software to securely delete sensitive data from transient storage facilities once the purpose for retaining this data has been fulfilled.

R1 Indicate whether any sensitive data identified in Test Requirement 3.2.a is stored using immutable objects.

R2 If R1 is “Yes,” then describe the constraints that exist that require or otherwise necessitate the use of immutable objects.

R3 If R1 is “Yes,” then describe the protection mechanisms implemented to mitigate the risks posed by the use of immutable objects.

R2 If R1 is “Yes,” then describe the methods implemented to protect this data when retained for this purpose.

R3 If R1 is “Yes,” then describe how the software handles this data when debugging, error finding, and/or testing functions are terminated.

R2 If R1 is “Yes,” then identify the …
Added p. 50
R1 Indicate whether the software stores any sensitive data within the code itself (e.g., is ‘hardcoded’).

R2 If R1 is “No,” describe how the assessor confirmed that no sensitive data is stored within the code.

R3 If R1 is “Yes,” then identify the evidence obtained that details the locations within the code where this data is stored.

R4 If R1 is “Yes,” then describe the methods implemented to protect this data from unauthorized disclosure and/or modification (as applicable).
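An assessor answering R2 (“No”) typically supports the conclusion with a static scan of the codebase for secret-like material. The sketch below shows the general idea only; the regex patterns and function name are illustrative assumptions, and a real assessment would use a fuller ruleset (entropy analysis, dedicated scanning tools) rather than two regexes.

```python
import re

# Illustrative patterns only -- not an exhaustive or official check.
SECRET_PATTERNS = [
    # Assignments such as: password = "hunter2", api_key = 'abc123'
    re.compile(r"""(password|passwd|secret|api_key)\s*=\s*["'][^"']+["']""",
               re.IGNORECASE),
    # Embedded PEM private keys.
    re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),
]

def scan_source(text):
    """Return (line_number, line) pairs that look like hardcoded secrets."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits

source = 'db_host = "localhost"\npassword = "hunter2"\n'
print(scan_source(source))  # [(2, 'password = "hunter2"')]
```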

R2 If R1 is “Yes,” then identify the evidence obtained that details the cryptographic algorithms and modes of operation used or relied upon by the software.

R2 If R1 is “Yes,” then describe the protection methods used or relied upon by the software for this purpose.

R3 If R1 is “Yes,” then describe the methods implemented to ensure that these protection methods are present in all environments where the software is designed to be executed.

3.3.e Where user input …
Added p. 58
In Place Not in Place N/A 4.1 Attack scenarios applicable to the software are identified.

Note: This control objective is an extension of Control Objective 10.1. Validation of both control objectives should be performed at the same time.

R1 Identify the evidence obtained that details the software vendor’s analysis of potential threats and attack scenarios applicable to the assessed software.

R2 Identify the date when the software vendor’s threat analysis was last performed or updated.

Where such industry standards are not used, the assessor shall confirm that the methodology used provides equivalent coverage for the attack scenarios applicable to the software under evaluation.

R1 Identify the evidence obtained that details the software vendor’s threat-modeling methodology.

R2 Indicate whether the software vendor’s threat-modeling methodology is based on industry-standard methods or guidelines.

R5 Describe any other assessment activities performed and/or findings for this test requirement.

R3 Describe what the assessor observed in the evidence obtained in Test Requirement …
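One industry-standard threat-modeling method the requirement above alludes to is STRIDE. As a minimal sketch of how attack scenarios are enumerated from it (the component names are hypothetical; a real model would come from the vendor’s architecture and data-flow diagrams):

```python
from itertools import product

# STRIDE threat categories (an industry-standard threat-modeling method).
STRIDE = [
    "Spoofing", "Tampering", "Repudiation",
    "Information disclosure", "Denial of service", "Elevation of privilege",
]

# Hypothetical software components for illustration.
components = ["payment API", "card-data store", "admin console"]

# Enumerate every (component, threat) pair as a starting checklist;
# each pair is then analyzed, mitigated, or justified as out of scope.
scenarios = [(c, t) for c, t in product(components, STRIDE)]
print(len(scenarios))  # 18 candidate attack scenarios
```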
Added p. 60
R1 Identify the evidence obtained that details the methods implemented to mitigate each of the threats identified in Test Requirement 4.1.a.

R2 Identify the evidence obtained that details the software vendor’s justification(s) for any threats identified in Test Requirement 4.1.a that were not mitigated.

R2 If R1 is “Yes,” then describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that all such settings are applied upon software installation, initialization, or first use.

4.2.c Where user input or interaction can disable, remove, or bypass any such mitigations, the assessor shall examine evidence and test the software to confirm that such action requires authentication and authorization and that guidance on the risk of such actions is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether users can disable, remove, or bypass any of the settings identified in Test Requirement 4.2.b.

R2 If …
Added p. 64
R1 Describe the assessment activities performed and what the assessor observed in the evidence obtained that confirms that all authentication mechanisms relied upon by the software require unique user identification.

R2 If R1 is “Yes,” then identify the evidence obtained that confirms that unique identification is required for each different program and system accessing these APIs.

R2 If R1 is “Yes,” then describe the methods implemented within or by the software to protect these authentication credentials from attempts to intercept them in transit.

R2 Describe any other assessment activities performed and/or findings for this control objective.
Added p. 66
R1 Identify the evidence obtained that demonstrates that all authentication methods implemented by the software are evaluated to determine whether they contain known vulnerabilities.

R2 Describe how the implementation of these authentication methods mitigates vulnerabilities common to those methods.

R1 Identify the evidence obtained that details the software vendor’s analysis of the implemented authentication methods and their ability to resist attacks common to such methods.

R2 Describe how the software vendor evaluates the robustness of each of the implemented authentication methods and how the software vendor determines whether the authentication credentials are sufficiently strong to resist attacks.

R1 Identify the evidence obtained that details the access privileges granted to critical assets by default, and the software vendor’s justification for granting such access.

5.4.b The assessor shall examine evidence and test the software to identify the level of access that is provided to critical assets and to confirm that such access correlates with the evidence examined …
Added p. 70
R1 Identify the evidence obtained that details the locations within the software where sensitive data is transmitted outside of the physical execution environment.

R2 Identify the evidence obtained that details the protection requirements for sensitive data transmitted outside of the physical execution environment.

6.2.c Where third-party or execution-environment features are relied upon for the security of the transmitted data, the assessor shall examine evidence to confirm that guidance on how to configure such features is provided to stakeholders in accordance with Control Objective 12.1.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that all applicable ingress and egress methods enforce a secure version of the protocol with end-point authentication prior to transmission.

R2 If R1 is “Yes,” then describe the method(s) used or relied upon by the software to ensure that strong cryptography is always enforced where sensitive data is transmitted outside of the …
Added p. 72
R1 Indicate whether the software relies on cryptography for the protection of sensitive data during storage, processing, or transmission.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that all cryptography relied upon for the protection of sensitive data during storage, processing, or transmission complies (or can be configured to comply) with all applicable sections of Control Objective 7.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the assessed software to use these methods in a secure manner.

R1 Indicate whether the software relies on asymmetric cryptography to encrypt sensitive data during storage, transmission, or processing.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained through examination and testing that confirms private keys are not used to protect the confidentiality of sensitive data during storage, transmission, or processing.

• …
Added p. 81
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms these keys are installed and stored in a manner that provides for dual control, or that protects the keys from unauthorized substitution or modification where dual control is infeasible.

In Place Not in Place N/A 7.3.a The assessor shall examine evidence and test the software to identify all random number generators used by the software and to confirm that all random number generation methods:

R1 Identify the evidence obtained that details all locations within the software where random numbers are required.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that the third-party software, platforms, or libraries do not contain or otherwise expose any known vulnerabilities that would compromise their ability to securely generate random numbers.

R2 If R1 is “Yes,” then describe what the assessor observed in the …
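The distinction the random-number requirements draw is between cryptographically secure generators and general-purpose ones. A minimal Python sketch of that distinction (the variable names are illustrative):

```python
import secrets
import random

# Suitable for security use: secrets draws from the OS CSPRNG.
session_token = secrets.token_hex(16)   # 32 hex characters = 128 bits
nonce = secrets.randbits(128)

# NOT suitable for keys, tokens, or nonces: random uses the Mersenne
# Twister, which is deterministic and whose internal state can be
# recovered from observed outputs.
insecure_value = random.getrandbits(128)

print(len(session_token))  # 32
```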
Added p. 87
R1 Indicate whether the software maintains its own activity tracking records (even if only temporarily).

8.3.b Where the software utilizes external or third-party systems for the maintenance of tracking data, such as a log server, the assessor shall examine evidence to confirm that guidance on the correct and complete setup and/or integration of the software with the external or third-party system(s) is provided to stakeholders in accordance with Control Objective 12.1.

In Place Not in Place N/A 8.4.a The assessor shall examine evidence and test the software to confirm that the failure of the activity-tracking mechanism(s) does not violate the integrity of existing records by confirming that:

R1 Describe the protection methods implemented to prevent existing activity tracking records and data from being overwritten or corrupted when activity tracking mechanisms fail.

8.4.b The assessor shall examine evidence and test the software to confirm that the integrity of activity tracking records is maintained …
Added p. 133
R1 Describe what the assessor observed in the evidence obtained that confirms the software vendor’s guidance does not conflict with the payment terminal vendors’ security guidance for the PCI PTS POI devices included in the software assessment.
Added p. 134
In Place Not in Place N/A C.1.1 All software components and services are documented or otherwise cataloged in a software bill of materials (SBOM).

In Place Not in Place N/A C.1.1 The assessor shall examine evidence to confirm that information is maintained that describes all software components and services comprising the software solution, including:

• All proprietary software libraries, packages, modules, and/or code packaged in a manner that enables them to be tracked as a freestanding unit of software.

• All third-party and open-source frameworks, libraries, and code embedded in or used by the software during operation.

• All third-party software dependencies, APIs, and services called by the software during operation.

R1 Identify the evidence obtained that details the assessed software’s bill of materials (SBOM).

C.1.2 The SBOM describes each of the primary components and services in use, as well as their secondary transitive component relationships and dependencies to the greatest extent feasible.

In Place Not in …
Added p. 135
R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the evidence obtained in Test Requirement C.1.2.a.

R2 Indicate whether software testing identified any components or services used during software operation that were not reflected in the SBOM.

R3 If R2 is “Yes,” then describe why the assessor considers it acceptable for these components or services to be excluded from the SBOM.

C.1.3 Where the software is provided “as a service,” the SBOM includes information describing the software dependencies present in the production software execution environment to the greatest extent feasible.

In Place Not in Place N/A C.1.3.a The assessor shall examine evidence to confirm that the SBOM describes all dependencies present in the production software execution environment that the software relies upon for operation or to satisfy security requirements in this standard.

R1 Indicate whether the software is provided “as-a-service”.

R2 If …
Added p. 136
R3 If R2 is “Yes,” then describe why the assessor considers it acceptable for these dependencies to be excluded from the SBOM.
Added p. 137
In Place Not in Place N/A C.1.4.a The assessor shall examine evidence to confirm that information is maintained in the SBOM that describes the following for each component and service in use, including secondary component relationships and dependencies:

• The original source/supplier of the component or service.

• The name of the component or service as defined by the original supplier.

• A description of the relationship(s) between the component or service and other components/services embedded in or used by the software.

• The version of the component or service as defined by the original supplier to differentiate it from previous or other versions.

• The name of the author who designed/developed the component or service.

• Any other identifiers provided by the original supplier to uniquely identify the component or service.

R1 Describe the overall structure of the SBOM, the nomenclature and attributes used, and how the SBOM accounts for components and services.
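The C.1.4 attribute list maps naturally onto the component fields of common SBOM formats. Below is a minimal, CycloneDX-inspired sketch of one SBOM entry; the field names, example component, and package URL are illustrative assumptions, not the template’s required schema.

```python
import json

component = {
    "supplier": "Example Org",                # original source/supplier
    "name": "libexample",                     # name as defined by the supplier
    "version": "2.4.1",                       # differentiates it from other versions
    "author": "Example Org Crypto Team",      # who designed/developed it
    # A package URL (purl) is one common "other unique identifier".
    "purl": "pkg:generic/example-org/libexample@2.4.1",
    "description": "TLS helper library embedded in the payment software",
    # Secondary (transitive) component relationships and dependencies.
    "relationships": [
        {"dependsOn": "libfoo@1.0.3", "type": "transitive"},
    ],
}
print(json.dumps(component, indent=2))
```

Formats such as CycloneDX and SPDX standardize these fields, which also makes the SBOM machine-readable for the vulnerability monitoring described in C.1.6.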

C.1.4.b The assessor shall …
Added p. 138
R1 Describe the software vendor’s processes for generating SBOMs and how it ensures one is generated for each new software release.

C.1.6 Vulnerabilities in third-party components and services are monitored and managed in accordance with Control Objective 10.

In Place Not in Place N/A C.1.6.a The assessor shall examine evidence to confirm that third-party components and services present in and/or in use by the software are regularly monitored for vulnerabilities in accordance with Control Objective 10.1.

R1 Describe how the software vendor leverages the SBOM to monitor and manage vulnerabilities in third-party components and services.

C.1.6.b The assessor shall examine evidence to confirm that vulnerabilities in third-party components and services are identified and are patched or otherwise mitigated in a timely manner in accordance with Control Objective 10.2.

R1 Identify the evidence obtained that confirms that vulnerabilities in third-party components are patched or mitigated in a timely manner.

C.1.7 Where software components and/or resources are hosted …
Added p. 140
In Place Not in Place N/A C.2.1 User access to sensitive functions and sensitive resources exposed through Internet-accessible interfaces is authenticated.

In Place Not in Place N/A C.2.1 Using information obtained in Test Requirements 1.2.a and 2.1.a in the Core Requirements, the assessor shall examine evidence to identify all sensitive functions and sensitive resources exposed through Internet-accessible interfaces.

R1 Identify the evidence obtained that details all sensitive functions and sensitive resources that are exposed, or that may be exposed, through Internet-accessible interfaces.

C.2.1.1 The methods implemented to authenticate user access to sensitive functions and sensitive resources use industry-standard mechanisms.

In Place Not in Place N/A C.2.1.1.a The assessor shall examine evidence to identify all methods implemented by the software to authenticate access to sensitive functions and sensitive resources.

R1 Describe the method(s) relied upon by the software to authenticate access to the sensitive functions and sensitive resources identified in Test Requirement C.2.1.

C.2.1.1.b The …
Added p. 142
In Place Not in Place N/A C.2.1.2 Using information obtained in Test Requirement C.2.1.1.a, the assessor shall examine evidence to confirm that the authentication methods implemented are sufficiently strong and robust to protect authentication credentials in accordance with Control Objective 5.3 in the Core Requirements section.

R1 Describe how the methods implemented to authenticate user access to the sensitive functions and sensitive resources identified in Test Requirement C.2.1 mitigate the likelihood of authentication credentials being forged, spoofed, guessed, or otherwise compromised by an unauthorized entity.

C.2.1.3 Authentication decisions are enforced within a secure area of the software. In Place Not in Place N/A C.2.1.3.a The assessor shall examine evidence to identify where within the software architecture authentication decisions are enforced.

R1 Describe the locations within the software architecture where authentication decisions are enforced.

C.2.1.3.b The assessor shall examine evidence to confirm that all authentication decisions are enforced within a secure area of the software …
Added p. 143
R1 Indicate whether client-side or browser-based functions, scripts, or data are used for authenticating access to software interfaces.

R2 If R1 is “Yes,” then describe how the software uses these functions, scripts, and data for authenticating access to software interfaces.

R3 Describe the methods implemented to protect these functions, scripts, and data from compromise or manipulation by an unauthorized entity.

C.2.2 Access to all Internet-accessible interfaces is restricted to explicitly authorized users only. In Place Not in Place N/A C.2.2.a Using information obtained in Test Requirement 2.1.a in the Core Requirements section, the assessor shall examine evidence to identify all software interfaces that are exposed to the Internet or that can be configured in a way that exposes them to the Internet.

R1 Identify the evidence obtained that details the software interfaces that are exposed, or that could be configured in a way to expose them, to the Internet.

C.2.2.b The assessor shall examine evidence …
Added p. 144
• implemented correctly;

• appropriate for the types of users expected to use the interface; and

• do not expose known vulnerabilities.

R1 Describe what the assessor observed in the evidence obtained that confirms that the methods identified in Test Requirement C.2.2.b to restrict access to software interfaces are appropriate for the type(s) of interface provided and do not expose known vulnerabilities.

C.2.2.d Where the methods used to authorize access to Internet-accessible interfaces are user configurable, or otherwise require user input or interaction, the assessor shall examine evidence to confirm that appropriate guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes the configurable options available and how to configure each method securely.

R1 Indicate whether any of the methods identified in Test Requirement C.2.2.b requires or enables users to configure those methods.

R2 If R1 is “Yes,” then identify the evidence obtained that details the configurable options available and the software …
Added p. 145
C.2.3 Access to all software functions and resources exposed through Internet-accessible interfaces is restricted to explicitly authorized users only.

In Place Not in Place N/A C.2.3 Using information obtained in Test Requirement C.2.2.a, the assessor shall examine evidence to identify all software functions and resources that are exposed, or that can be configured in a way that exposes them, through Internet-accessible interfaces.

R1 Identify the evidence obtained that details all software functions or resources that are exposed, or that can be exposed, through APIs or other interfaces.

C.2.3.1 The software ensures the enforcement of access control rules at both the function level and resource level with fine-grained access control capabilities.

In Place Not in Place N/A C.2.3.1.a Using information obtained in Test Requirement C.2.3, the assessor shall examine evidence to determine how the software controls access to individual functions and resources exposed (or potentially exposed) through Internet-accessible interfaces.

R1 Describe the methods relied upon …
Added p. 146
• appropriate for the type of function(s) and resource(s) provided; and

R1 Describe what the assessor observed in the evidence obtained that confirms the methods described in Test Requirement C.2.3.1.a are implemented correctly and do not expose known vulnerabilities.

C.2.3.1.c Where the methods used to authorize access to the functions and resources exposed (or potentially exposed) through Internet-accessible interfaces are user configurable or otherwise require user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes the mechanisms and configurable options available to restrict access to the functions and resources exposed through these interfaces, and how to configure such mechanisms.

R1 Indicate whether any of the methods described in Test Requirement C.2.3.1.a are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure such methods securely.

C.2.3.1.d Where …
Added p. 147
R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that access to functions and resources exposed through APIs or other interfaces requires users to be explicitly authorized before access is granted.

C.2.3.2 Authorization rules are enforced upon each user request to access software functions and resources through Internet-accessible interfaces.

In Place Not in Place N/A C.2.3.2.a Using information obtained in Test Requirement C.2.3.1.a, the assessor shall examine evidence to confirm that authorization checks are performed each time users request access to a function or resource exposed (or potentially exposed) through Internet-accessible interfaces to verify they are authorized for the function, resource, and type of access requested.

R1 Describe how the software verifies whether users are authorized to access functions or resources exposed through APIs or other interfaces.
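The enforcement model behind C.2.3.2.a can be illustrated with a short sketch (not part of the ROV template; all names are hypothetical): authorization is re-evaluated on every request for the specific function, resource, and access type, rather than relying on a decision cached from an earlier call.

```python
# Illustrative per-request authorization check; grants, names, and the
# resource/access model are assumptions, not taken from the standard.
from dataclasses import dataclass, field


class AuthorizationError(Exception):
    pass


@dataclass
class User:
    name: str
    grants: set = field(default_factory=set)  # e.g. {("refunds", "read")}


def authorize(user: User, resource: str, access: str) -> None:
    """Raise unless the user is explicitly granted this resource/access pair."""
    if (resource, access) not in user.grants:
        raise AuthorizationError(f"{user.name} may not {access} {resource}")


def handle_request(user: User, resource: str, access: str) -> str:
    # The check runs on every request, per C.2.3.2 -- never only at login.
    authorize(user, resource, access)
    return f"{access} on {resource} permitted for {user.name}"
```

An assessor's report would describe where an equivalent check sits in the assessed software's request path and confirm it cannot be bypassed.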

C.2.3.2.b The assessor shall examine evidence and test the software to confirm that access …
Added p. 148
R1 Identify the evidence obtained that details the locations within the software architecture where authorization and access control decisions are enforced.

C.2.3.3.b The assessor shall examine evidence to confirm that all access control decisions are enforced within a secure area of the software architecture.

R1 Describe what the assessor observed in the evidence obtained that confirms all authorization and access control decisions are enforced within a secure area of the software architecture.

C.2.3.3.c The assessor shall examine evidence and test the software to confirm that client-side or browser-based functions, scripts, and data are never solely relied upon for access control purposes.

R1 Indicate whether client-side or browser-based functions, scripts, or data are relied upon for access control purposes.

R2 If R1 is “Yes,” then describe how the software uses these functions, scripts, or data for access control.

R3 If R1 is “Yes,” then describe the methods implemented to ensure the compromise of client-side or browser-based …
Added p. 149
In Place Not in Place N/A C.3.1 The software enforces or otherwise supports the use of the latest HTTP Security Headers to protect Internet accessible interfaces from attacks.

In Place Not in Place N/A C.3.1.a The assessor shall examine evidence to confirm the software supports the use of the latest HTTP Security Headers, and to determine the options available and how such settings are configured.

R1 Identify the evidence obtained that details the primary set of HTTP Security Headers and configuration options that are supported by the software.
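A minimal sketch of the kind of default header set an assessor might document under C.3.1 (the header values here are illustrative defaults, not values mandated by the standard):

```python
# Hypothetical baseline of HTTP Security Headers; the exact set and values
# an assessor records depend on the assessed software.
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "Content-Security-Policy": "default-src 'self'",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "no-referrer",
}


def apply_security_headers(response_headers: dict) -> dict:
    """Merge baseline security headers, letting explicit per-response
    values take precedence over the defaults."""
    merged = dict(SECURITY_HEADERS)
    merged.update(response_headers)
    return merged
```

Where the software provider controls these settings (C.3.1.b), the assessor would confirm the configured values are current and justified.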

C.3.1.b Where HTTP Security Headers are configured and controlled by the software provider, the assessor shall examine evidence to confirm that the software is configured to use the latest available HTTP Security Headers and that the configuration settings are reasonable and justified.

R1 Indicate whether HTTP Security Headers are configured and controlled by the assessed software or entity.

R2 If R1 is “Yes,” then describe what the …
Added p. 150
In Place Not in Place N/A C.3.2.a Using information obtained in Test Requirement C.2.1.a, the assessor shall examine evidence to identify all interfaces that accept data input from untrusted sources.

R1 Identify the evidence obtained that details all APIs and other interfaces that accept input data from untrusted sources.

C.3.2.b Where the software accepts input from untrusted sources, the assessor shall examine evidence to identify the data format(s) expected by the software for each input field and the parsers and interpreters involved in processing the input data.

R1 Identify the evidence obtained that details the data format(s) expected for each of the input fields identified in Test Requirement C.3.2.a and the parsers or interpreters involved in the processing of the input data.

C.3.2.c Using information obtained in Test Requirement 4.1.a in the Core Requirements section, the assessor shall examine evidence to determine whether attacks that target all such parsers and interpreters are acknowledged in …
Added p. 151
R1 Indicate whether any of the security controls described in Test Requirement C.3.2.d are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on configuring these security controls securely.

C.3.2.1 Industry-standard methods are used to protect software inputs from attacks that attempt to exploit vulnerabilities through the manipulation of input data.

In Place Not in Place N/A C.3.2.1.a Using information obtained in Test Requirement 4.2.a in the Core Requirements section, the assessor shall examine evidence to identify all software security controls implemented to mitigate attacks that attempt to exploit vulnerabilities through the manipulation of input data.

R1 Describe the methods relied upon by the software to mitigate attempts to exploit vulnerabilities in parsers and interpreters through the manipulation of input data.
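One industry-standard mechanism of the kind C.3.2.1 refers to is allowlist (positive) input validation: each field is constrained to its expected format before any parser or interpreter sees it. A hedged sketch, with hypothetical field names and patterns:

```python
import re

# Illustrative allowlist validation; field names and formats are assumptions.
FIELD_PATTERNS = {
    "invoice_id": re.compile(r"^[A-Z]{2}-\d{6}$"),
    "amount": re.compile(r"^\d{1,7}(\.\d{2})?$"),
}


def validate_input(field: str, value: str) -> str:
    """Reject any value that does not match the expected format exactly."""
    pattern = FIELD_PATTERNS.get(field)
    if pattern is None or not pattern.fullmatch(value):
        raise ValueError(f"rejected input for field {field!r}")
    return value
```

Allowlisting is generally preferred over blocklisting known-bad patterns, since it fails closed when an attacker finds an encoding the blocklist missed.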

C.3.2.1.b The assessor shall examine evidence to confirm that the methods implemented to protect against such attacks use industry-standard mechanisms and/or techniques that are:

R1 Describe …
Added p. 152
• Are implemented correctly in accordance with available guidance, and

• Do not expose any vulnerabilities.

R1 Describe what the assessor observed in the evidence obtained that confirms the methods implemented to protect against attempts to exploit vulnerabilities in parsers and interpreters through the manipulation of input data are implemented correctly and do not expose vulnerabilities.

C.3.2.2 Parsers and interpreters are configured with the most restrictive configuration feasible.

In Place Not in Place N/A C.3.2.2.a Using information obtained in Test Requirement C.3.2.b, the assessor shall examine evidence to identify the configurations for each parser or interpreter used to process untrusted input data.

R1 Identify the evidence obtained that details the (default) configurations for each parser or interpreter used to process untrusted input data.
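As one concrete example of a "most restrictive configuration feasible," an XML parser can be locked down so that document type declarations and external entity resolution (the vectors behind XXE attacks) are refused outright. A sketch using Python's standard-library expat parser:

```python
import xml.parsers.expat


def make_restrictive_xml_parser() -> xml.parsers.expat.XMLParserType:
    """Expat parser configured in the spirit of C.3.2.2: no DTDs,
    no external entity resolution."""
    parser = xml.parsers.expat.ParserCreate()

    def forbid_doctype(*_args):
        raise ValueError("DOCTYPE declarations are not permitted")

    parser.StartDoctypeDeclHandler = forbid_doctype
    # Refuse to resolve any external entity reference (defense in depth;
    # unreachable once DOCTYPEs are rejected).
    parser.ExternalEntityRefHandler = lambda *_a: 0
    return parser
```

The assessor's report would document the equivalent settings for each parser or interpreter the assessed software actually uses.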

C.3.2.2.b For each of the parsers/interpreters and the configurations identified, the assessor shall examine evidence to confirm that parsers and interpreters are configured with the most restrictive set of capabilities feasible and …
Added p. 153
In Place Not in Place N/A C.3.3.a Using information obtained in Test Requirements C.2.1.a and C.2.2, the assessor shall examine evidence to identify all Internet accessible interfaces and the functions and resources exposed (or potentially exposed) through those interfaces, and to determine where such interfaces, functions, and resources may be susceptible to resource starvation attacks.

R1 Identify the evidence obtained that details the interfaces, functions, and resources potentially susceptible to resource starvation attacks.

C.3.3.b Where such interfaces, functions, and resources are potentially susceptible to resource starvation attacks, the assessor shall examine evidence to confirm that:

• The threat of such attacks is documented in accordance with Control Objective 4.1, and
Added p. 153
R1 Describe what the assessor observed in the evidence obtained that confirms that the threats related to resource starvation attacks are documented in the software vendor’s threat analysis.

R2 Describe the security controls implemented to protect against resource starvation attacks.
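A common control against resource starvation on Internet-accessible interfaces is request rate limiting. A minimal token-bucket sketch (illustrative only; the standard does not prescribe a specific mechanism):

```python
import time


class TokenBucket:
    """Token-bucket rate limiter: requests spend tokens, which replenish
    at a fixed rate up to a burst ceiling."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens replenished per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False   # request rejected; server resources are preserved
```

Related controls an assessor might document include connection limits, request timeouts, and bounded payload sizes.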

C.3.3.c The assessor shall examine evidence to confirm that the software security controls implemented to mitigate resource starvation and other similar attacks on Internet accessible interfaces are designed and implemented in accordance with applicable industry standards and best practices.

R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to protect against resource starvation attacks are aligned with industry standards and best practices regarding such protections.
Added p. 154
R1 Indicate whether software security controls to protect against resource starvation attacks are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on configuring such methods.

C.3.4 Software security controls are implemented to protect Internet accessible interfaces from malicious file content.

In Place Not in Place N/A C.3.4.a Using information obtained in Test Requirement C.2.1.a, the assessor shall examine evidence to identify all Internet accessible interfaces that accept file uploads and the file types permitted.

R1 Identify the evidence obtained that details the software interfaces that accept file uploads and the file types supported.

C.3.4.b Where the software accepts file uploads over Internet accessible interfaces, the assessor shall examine evidence to confirm that:

• The threat of attacks on file upload mechanisms is documented in accordance with Control Objective 4.1, and

R2 Describe the software security controls implemented to mitigate common attacks that target file upload mechanisms.
Added p. 155
R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to mitigate attacks on file upload mechanisms are designed in accordance with applicable industry-standard methods.

C.3.4.d The assessor shall examine evidence to confirm that the software security controls implemented to mitigate attacks on file upload mechanisms include methods to restrict the file types permitted by the file upload mechanisms.

R1 Describe the methods implemented by the software to restrict the types of files permitted and the types of files permitted by default.

C.3.4.e The assessor shall examine evidence to confirm that the software security controls implemented to mitigate attacks on file upload mechanisms include methods to restrict the maximum number and size of files permitted for upload.

R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to overwhelm or exploit file parsing mechanisms using excessive file sizes or excessive file uploads.

C.3.4.f …
Added p. 156
R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to remotely execute malicious code through direct calls to uploaded files.

C.3.4.h The assessor shall examine evidence to confirm that the use of file-parsing mechanisms does not rely on file names or file extensions for security purposes.

R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to trick the software into interpreting files of one type as another type.
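The control in C.3.4.h, deciding file type from content rather than from the file name or extension, is typically implemented with magic-byte checks. A sketch with two well-known signatures (the helper names are hypothetical):

```python
from typing import Optional

# Standard magic numbers for PNG and PDF; an assessed product would cover
# whatever file types it actually permits.
MAGIC_SIGNATURES = {
    "png": b"\x89PNG\r\n\x1a\n",
    "pdf": b"%PDF-",
}


def sniff_file_type(data: bytes) -> Optional[str]:
    """Identify the file type from its leading bytes."""
    for file_type, magic in MAGIC_SIGNATURES.items():
        if data.startswith(magic):
            return file_type
    return None


def accept_upload(filename: str, data: bytes, allowed: set) -> bool:
    """Decide on content alone; the extension in `filename` is ignored."""
    return sniff_file_type(data) in allowed
```

A renamed file (e.g., a PDF uploaded as "invoice.png") is therefore rejected even though its extension matches the allowlist.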

C.3.4.i Where the implementation of software security controls is user configurable or otherwise requires user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes how to configure such mechanisms.

R1 Indicate whether any of the software security controls implemented to protect against attacks on file parsing mechanisms are user configurable or require user input or interaction to be …
Added p. 157
• The threat of hostile object creation and data tampering attacks is documented in accordance with Control Objective 4.1, and

R1 Describe what the assessor observed in the evidence obtained that confirms that threats to interfaces that accept and process data objects as inputs are documented.

R2 Describe the software security controls implemented to mitigate common attacks on these types of interfaces and functions.

C.3.5.c The assessor shall examine evidence to confirm that the software security controls implemented to mitigate hostile object creation and data tampering attacks are implemented in accordance with applicable industry standards and best practices.

R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to mitigate hostile object creation and data tampering attacks are designed in accordance with applicable industry-standard methods.

C.3.5.d The assessor shall examine evidence to confirm that the software security controls implemented to mitigate hostile object creation and data tampering …
Added p. 158
R1 Describe what the assessor observed in the evidence obtained that confirms that file-parsing mechanisms do not contain or otherwise expose vulnerabilities.

C.3.5.g Where the software accepts serialized objects as inputs, the assessor shall examine evidence to confirm that software security controls are implemented to protect against deserialization attacks and that such security controls adhere to applicable industry standards and best practices.

R1 Indicate whether the software accepts serialized objects as inputs.

R2 If R1 is “Yes,” then describe the software security controls implemented to protect against deserialization attacks.
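One well-documented hardening pattern for C.3.5.g, taken from the Python `pickle` documentation, is a restricted unpickler that refuses to resolve any class reference, so object-injection gadget chains cannot execute (shown here as an illustration; the assessed software may use a different serialization format entirely):

```python
import io
import pickle


class RestrictedUnpickler(pickle.Unpickler):
    """Refuse to load any global (class or function) from the stream,
    blocking the usual deserialization attack path."""

    def find_class(self, module, name):
        raise pickle.UnpicklingError(f"global {module}.{name} is forbidden")


def safe_loads(data: bytes):
    """Deserialize only primitive containers (lists, dicts, strings, numbers)."""
    return RestrictedUnpickler(io.BytesIO(data)).load()
```

Formats that cannot encode arbitrary objects (such as JSON) avoid the problem class altogether and are often the stronger design choice.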

C.3.5.h Where the software security controls implemented to protect against hostile object creation and data tampering are user configurable or otherwise require user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes how to configure such mechanisms.

R1 Indicate whether any of the software security controls are user configurable or …
Added p. 159
In Place Not in Place N/A C.3.6.a The assessor shall examine evidence to determine if and/or how the software supports cross-origin access to Internet accessible interfaces, and to confirm that access to software APIs and resources from browser-based scripts is disabled by default.

R1 Indicate whether the software supports cross-origin access to software interfaces.

R2 If R1 is “Yes,” then describe the mechanisms implemented to restrict access to API endpoints and resources from browser-based scripts.
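The deny-by-default posture C.3.6 calls for is usually implemented as an explicit origin allowlist: no CORS headers are emitted unless the requesting origin is known, so browsers block cross-origin reads by default. A sketch with a hypothetical origin:

```python
# Illustrative CORS decision; the allowlisted origin is an assumption.
ALLOWED_ORIGINS = {"https://merchant.example"}


def cors_headers(request_origin: str) -> dict:
    """Return CORS response headers only for explicitly allowlisted origins."""
    if request_origin in ALLOWED_ORIGINS:
        return {
            "Access-Control-Allow-Origin": request_origin,  # echo, never "*"
            "Vary": "Origin",
        }
    return {}  # no CORS headers -> the browser blocks the cross-origin read
```

Echoing the specific origin (rather than `*`) and emitting `Vary: Origin` keeps caches from leaking one origin's grant to another, which is the kind of detail C.3.6.b asks the assessor to confirm is justified.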

C.3.6.b Where cross-origin access is enabled, the assessor shall examine evidence to confirm that the reasons for enabling cross-origin access are reasonable and justified, and that access is restricted to the minimum number of origins feasible.

C.3.6.c The assessor shall test the software to confirm that the claims made by the assessed entity regarding cross-origin access are valid. At a minimum, testing is expected to include functional testing using forensic tools/techniques.

R1 Describe each of the tests performed, including …
Added p. 161
In Place Not in Place N/A C.4.1 Sensitive data transmissions are encrypted in accordance with Control Objectives 6.2 and 6.3.

In Place Not in Place N/A C.4.1.a Using information obtained in Test Requirement 6.2.a, the assessor shall examine evidence to determine how communications are handled by the software, including those between separate systems in the overall software architecture.

R1 Identify the evidence obtained that details the full architecture of the assessed software, including all components that reside both within and outside the physical execution environment.

C.4.1.b Where the software allows or otherwise supports the transmission of sensitive data between users and systems in different security contexts, the assessor shall examine evidence to confirm that all such communications are encrypted using strong cryptography in accordance with Control Objectives 6.2 and 6.3.

R1 Describe what the assessor observed in the evidence obtained that confirms that communications between components in different security contexts are encrypted using …
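The "strong cryptography" intent of C.4.1.b translates, for TLS clients, into settings like certificate and hostname verification plus a modern protocol floor. A sketch using Python's standard `ssl` module (the specific floor shown is illustrative; the assessed software's requirements govern the actual configuration):

```python
import ssl


def make_tls_client_context() -> ssl.SSLContext:
    """TLS client context with certificate/hostname verification enabled
    and TLS 1.2 as the minimum protocol version."""
    ctx = ssl.create_default_context()            # secure defaults: verifies
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # certs and hostnames
    return ctx
```

Equivalent evidence for the assessed software would show cipher suites, protocol versions, and certificate validation behavior for every link between security contexts.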
Modified p. 1
Payment Card Industry (PCI) Software Security Framework Secure Software Template for Report on Validation Version 1.1
Payment Card Industry (PCI) Software Security Framework Secure Software Template for Report on Validation Version 1.2
Removed p. 5
This Reporting Template is mandatory for all Secure Software Assessment report submissions to PCI SSC.

Do not delete any content from any place in this document, including this section and the versioning above. These instructions are important for the assessor as they complete reporting, but also provide context for the report recipient(s). The inclusion of additional text or sections is permitted within reason, as noted above.

Software Security Framework

• Secure Software Requirements and Assessment Procedures Software Security Framework

• Secure Software Program Guide Software Security Framework

• Secure Software Attestation of Validation Software Security Framework

• Glossary of Terms, Abbreviations, and Acronyms Software Security Framework

• Qualification Requirements for Assessors
Modified p. 5
• Secure Software Template for Report on Validation (hereafter referred to as the Secure Software ROV Reporting Template) is for use with the PCI Software Security Framework
• Secure Software Template for Report on Validation (Secure Software ROV Template) is for use with the PCI Software Security Framework
Modified p. 5
Secure Software Requirements and Assessment Procedures (PCI Secure Software Standard) Version 1.1 and is the mandatory template for Secure Software Assessors completing a Secure Software Assessment. The Secure Software ROV Reporting Template provides reporting instructions and a reporting template for Secure Software Assessors. This template assures a consistent level of reporting against the PCI Secure Software Standard for all assessors.
Using this Document The PCI Secure Software Report on Validation Template provides reporting instructions and a reporting template for Secure Software Assessors. This template assures a consistent level of reporting against the PCI Secure Software Standard for all assessors.
Modified p. 5
Tables have been included in this template to assist with the reporting process for certain lists and other information as appropriate. You can modify the tables in this template to increase or decrease the number of rows or to change column width. The assessor may add appendices to include relevant information that is not addressed by the current organization. However, the assessor must not remove any details from the tables provided in this document. Customization is acceptable, such as the …
Tables have been included in this template to assist with the reporting process for certain lists and other information as appropriate. The tables in this template may be modified to increase or decrease the number of rows or to change column width. Additional appendices may be added if there is relevant information that is not addressed in the current template. However, the assessor must not remove any details from the tables provided in this document. Minor customizations, such as the …
Modified p. 5
A Secure Software Assessment involves thorough testing and assessment activities from which the assessor generates detailed work papers for each control objective and its associated test requirements. These work papers contain comprehensive records of the assessment activities, including observations, configurations, process information, interview notes, documentation excerpts, references, screenshots, and other evidence collected during the assessment. The Secure Software Report on Validation (ROV) is effectively a summary of evidence derived from the assessor’s work papers to describe how the assessor performed …
A Secure Software Assessment involves thorough testing and assessment activities from which the assessor generates detailed workpapers for each control objective and its associated test requirements. These workpapers contain records of the assessment activities, including observations, configurations, process information, interview notes, documentation excerpts, references, screenshots, and other evidence collected during the assessment. The Secure Software Report on Validation (ROV) acts as a comprehensive summary of the testing activities performed and the information that is collected during the Secure Software Assessment. …
Removed p. 6
1. Contact Information and Report Summary

3. Assessment Overview

4. Assessor Company Attestations

5. Findings and Observations The Secure Software ROV Reporting Template also includes the following Appendices:

Documenting the Assessment Findings and Observations Within the Findings and Observations section of the Secure Software ROV Reporting Template is where the detailed results of the software assessment are documented. In this section, an effort was made to efficiently use space and provide a snapshot view of assessment results (Summary of Assessment Results) ahead of the detailed reporting that is to be specified in the “Reporting Details: Assessor’s Response” column. An example layout of the Findings and Observations section is provided in Table 1.
Modified p. 6 → 5
Appendix A, Additional Information Worksheet Appendix B, Testing Environment Configuration for Secure Software Assessments All numbered sections must be thoroughly and accurately completed. The Secure Software ROV Reporting Template also contains instructions to help ensure that Secure Software Assessors supply all required information for each section. All responses should be entered in the applicable location or table provided in the template. Responses should be specific, but efficient. Details provided should focus on the quality of detail, rather than lengthy, repeated …
All numbered sections must be thoroughly and accurately completed. The Secure Software ROV Template also contains instructions to help ensure that Secure Software Assessors supply all required information for each section. All responses should be entered in the applicable location or table provided in the template. Responses should be specific, but efficient. Details provided should focus on the quality of detail, rather than lengthy, repeated text. Copying text from the control objectives or test requirements is discouraged, as it does …
Removed p. 7
Table 1. Findings and Observations Control Objectives and Test Requirements Reporting Instructions Reporting Details:

N/A (Not Applicable) The control objective does not apply to the organization or their software development practices. All “N/A” responses require reporting on the testing performed to confirm the “N/A” status. Note that a “N/A” response still requires a detailed description explaining how it was determined that the control objective does not apply.
Modified p. 7
Assessor’s Summary of Assessment Findings (check one) 1.1 Detailed Control Objective Summary In Place N/A Not in Place 1.1.a Test Requirement Reporting Instruction Reporting Instruction For the Summary of Assessment Findings, there are three results possible: In Place, Not Applicable (N/A), and Not in Place. Only one selection is to be made for each control objective. Table 2 provides a helpful representation when considering which selection to make. Reporting details and results should be consistent throughout the ROV, as well as …
Table 1. Detailed Findings and Observations Control Objectives / Test Requirements Reporting Instructions Assessor’s Findings Control Objective 1: Control Objective Title Parent Control Objective Summary In Place Not in Place N/A 1.1 Child Control Objective Summary In Place Not in Place N/A 1.1.a Test Requirement R1 Reporting Instruction R2 Reporting Instruction For the Summary of Assessment Findings, there are three results possible: In Place, Not in Place, and N/A (Not Applicable). Only one selection is to be made for each control …
Modified p. 7
In Place The expected testing has been performed and all elements of the control objective have been met.
In Place The expected testing has been performed and all elements of the control objective have been met. Detailed testing must be performed and reporting provided that demonstrates how the assessor confirmed the control objective is In Place.
Removed p. 8
Summarize Summarize how the software prevents sensitive data from being processed until initialization is complete.

The response would provide a high-level overview of a security control, process, mechanism, or tool that is implemented or used by the vendor to satisfy a control objective. For example, summarizing a security control or protection mechanism would include information about what is implemented, what it does, and how it meets its purpose.
Modified p. 8
To provide consistency in how Secure Software Assessors document their findings, the reporting instructions use standardized terms. Those terms and the context in which they should be interpreted is provided in Table 3.
To provide consistency in how Secure Software Assessors document their findings, the reporting instructions use standardized terms. Those terms and the context in which they should be interpreted are provided in Table 3.
Modified p. 8
Identify Identify the documentation and evidence examined that outlines all configuration options provided by the software.
Identify Identify the evidence obtained that details all configuration options provided by the software.
Modified p. 8
The response would be either “yes” or “no”.
The response would be either “Yes” or “No.”
Modified p. 9
• Provide sufficient detail and information to demonstrate a finding of “In Place” or “N/A”.
• Provide sufficient detail, information, and rationale to demonstrate a finding of “In Place” or “N/A.”
Modified p. 9
• Ensure the response covers all applicable systems, processes, and components including those provided by third parties.
• Ensure the response covers all applicable systems, processes, components, APIs, and functions including those provided by third parties.
Modified p. 9
• Provide full dates where dates are required, using either “dd/mmm/yyyy” or “mmm/dd/yyyy” format, and using the same format consistently throughout the document.
• Provide full dates where dates are required, using the “dd-mm-yyyy” format consistently throughout the document.
Modified p. 9
• Do not report items as “In Place” unless they have been verified as being “In Place”.
• Do not report items as “In Place” unless they have been verified as being “In Place.”
Modified p. 10
Appendix A, Additional Information Worksheet Appendix B, Testing Environment Configuration for Secure Software Assessments Appendix A is optional and may be used to add extra information to support the assessment findings if the information is too large to fit in the Reporting Details: Assessor Response column within the Findings and Observations section. Examples of information that may be added in Appendix A include diagrams, flowcharts, or tables that support the Secure Software Assessor’s findings. Any information recorded in Appendix A …
Appendix B, Testing Environment Configuration for Secure Software Assessments Appendix A is optional and may be used to add extra information to support the assessment findings if the information is too large to fit in the Assessor’s Findings column within the Detailed Findings and Observations section. Examples of information that may be added in Appendix A include diagrams, flowcharts, or tables that support the Secure Software Assessor’s findings. Any information recorded in Appendix A should reference back to the …
Modified p. 10
Appendix B is mandatory and must be used to confirm that the environment used by the assessor to conduct the Secure Software Assessment was configured in accordance with Section 4.5.1 of the Secure Software Program Guide. This confirmation must be submitted to PCI SSC along with the completed Report on Validation (ROV).
Appendix B is mandatory and must be used to confirm that the environment used by the assessor to conduct the Secure Software Assessment was configured in accordance with Section 4.6.1 of the Secure Software Program Guide. This confirmation must be submitted to PCI SSC along with the completed Report on Validation (ROV).
Removed p. 11
1. Contact Information and Report Summary 1.1 Contact Information Software Vendor Contact Information Company name: Company contact name:

Confirmation that internal QA was fully performed on the entire submission per requirements in the relevant program documentation.

Note: If “No,” this is not in accordance with PCI Program requirements.
Modified p. 11 → 12
Contact e-mail address: Contact phone number:
Contact email address:
Modified p. 11 → 13
Assessor e-mail: Assessor phone number:
Lead Assessor phone number:
Modified p. 11 → 13
QA reviewer e-mail address:
QA reviewer email address:
Removed p. 12
Note: This date must be shown as the “Secure Software ROV Completion Date” in the Secure Software AOV.

Timeframe of assessment (start date to completion date):

Identify date(s) spent onsite at the Software Vendor, if applicable:

Describe how time was spent onsite at the Software Vendor, how time was spent performing remote assessment activities, and how time was spent on validation of remediation activities:

Note: Provide range of dates for each activity.
Removed p. 13
Is the software already listed on the PCI SSC List of Validated Payment Software? Yes No If “yes,” provide the Validated Payment Software name and PCI Identifier:

Payment Software Type for this software (Please refer to Section A.2 of the Secure Software Program Guide for a detailed explanation of software types):

Describe how the software is designed (for example, as a standalone application, as a component or library, or as part of a suite of applications) Describe a typical implementation of the software (for example, how it is configured in the execution environment or how it typically interacts with other systems or components).
Modified p. 13 → 16
2. Software Overview 2.1 Software Details Software name tested: Software version tested (wildcards not permitted):
Software name tested: Software version(s) tested:
Modified p. 13 → 16
(01) POS Suite/General (02) Payment Middleware (03) Payment Gateway/Switch (04) Payment Back Office (05) POS Admin (06) POS Specialized (07) POS Kiosk (08) POS Face-to-Face/POI (09) Shopping Cart / Store Front (10) Card-Not-Present (11) Automated Fuel Dispenser (12) Payment Component Describe the software function and purpose (for example, the types of transactions performed, the specific payment acceptance channels supported, etc.):
(01) POS Suite/General (02) Payment Middleware (03) Payment Gateway/Switch (04) Payment Back Office (05) POS Admin (06) POS Specialized (07) POS Kiosk (08) POS Face-to-Face/POI (09) Shopping Cart / Store Front (10) Card-Not-Present (11) Automated Fuel Dispenser (12) Payment Component Describe the general software function and purpose (for example, the types of transactions performed, the payment acceptance channels supported, etc.).
Removed p. 14
Device Make / Manufacturer Device Model Name / Number Device Version (wildcards permitted) Device Description (e.g., device type, function, etc.)
Example: Acme, Inc. Acme POS v1.x Integrated POS, secure card reader and pin-entry device.
Removed p. 15
Software Vendor / Owner Software Name Software Version (wildcards permitted) Software Description (e.g., type, function, etc.)
Example: Acme, Inc. Acme E-commerce Server v2.x Web/application server
2.5 Other Required Software Components
Does the assessed payment software rely on any other third-party or proprietary software, APIs, or components to provide its intended functionality? If “yes,” identify and list all software, APIs, and components the assessed payment software relies upon to provide the full scope of its intended functionality:

Software Vendor / Owner Software Name Software Version (wildcards permitted) Software Description (e.g., type, function, etc.)
Example: Acme, Inc. Acme Crypto Library v3.x Suite of cryptographic libraries used for authentication and data protection.
Removed p. 16
Note: Additional rows may be added to accommodate additional sensitive data types. Refer to the Software Security Framework

• Glossary of Terms, Abbreviations, and Acronyms for more information on how Sensitive Data is defined.

Sensitive Data Store (file [name], table [name], etc.) Sensitive Data Type (e.g., Account Data, authentication credentials, etc.) Description of Sensitive Data Elements (e.g., PAN/SAD, username/password, etc.) Summary of How the Sensitive Data is Handled (e.g., stored, processed, transmitted, etc.) Summary of How the Sensitive Data is Protected (e.g., encrypted during transmission, hashed during storage, etc.)
2.7 Overview of Sensitive Functions Provided
Identify the sensitive functions provided by the software and describe how each is protected:

Note: Additional rows may be added to accommodate additional sensitive data types. Refer to the Software Security Framework

• Glossary of Terms, Abbreviations, and Acronyms for more information on how Sensitive Functions are defined.

Sensitive Function Type (e.g., user authentication, data encryption, encryption key management, etc.) …
Removed p. 17
Sensitive Resource Name (e.g., LDAP, libcrypto, keychain, etc.) Associated Sensitive Function (user authentication, data encryption, encryption key management, etc.) Source / Provider (The entity that maintains the code, e.g., Microsoft, Verifone, Ingenico, etc.) Summary of How Interactions are Secured (e.g., mutual authentication, access control, obfuscation, etc.)
2.9 Sensitive Data Flows

• Provide high-level data flow diagrams that show the details of all sensitive data flows, including:
o All flows and locations of encrypted sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment)
o All flows and locations of clear-text sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment)

• For each data flow, identify the following:
o How and where sensitive data is stored, processed and/or transmitted
o The specific types and details of the sensitive data involved (e.g., full track, PAN, PIN, expiry date, user IDs, passwords, etc.)
o All components …
Removed p. 19
3. Assessment Overview
3.1 Assessment Scope
Identify the requirement modules within the Secure Software Standard the software was assessed to:

Note: if the payment software stores, processes, or transmits Account Data, the software must be assessed to both the Core Requirements and the Account Data Protection module.

Core Requirements Module A

• Account Data Protection Requirements Module B

• Terminal Software Requirements
3.2 Hardware Platforms and Components Tested
Identify/describe all hardware platforms and components the assessed payment software was tested on/with during the assessment:

Device Make / Manufacturer Device Model Name / Number Device PCI Approval Number (if applicable) Device Hardware and/or Firmware Version (no wildcards) Device Description (e.g., device type, function, etc.)
Removed p. 20
Software Vendor / Owner Software Name Software Version (no wildcards) Software Description (e.g., type, function, etc.)
3.4 System Configurations Tested
Describe each unique combination of hardware and software (including those identified in Sections 3.2 and 3.3) used to validate the payment software, as well as other important details of the testing environment (for example, how the various platforms and components are configured to communicate with one another, whether any of the hardware/software components were virtualized, etc.).

Describe who provided the environment(s) where the software was tested (e.g., the Secure Software Assessor Company, the software vendor, a third-party, a combination of two/all three, etc.):
Removed p. 21
Reference Number Document Name (including version, if applicable) Document Description / Purpose Document Generation Method Document Date (date last updated)
Doc-1 Manual Automated
Doc-2 Manual Automated
Doc-3 Manual Automated
Doc-4 Manual Automated
Doc-5 Manual Automated
3.6 Individuals Interviewed
Identify and list the individuals interviewed during testing:

Reference Number Individual’s Name Role / Job Title Organization Summary of Topics Covered (high-level summary only)
Removed p. 22
Reference Number Test Type / Description (e.g., type of test performed, forensic tools used, etc.) Test Scope (e.g., software components and/or features evaluated) Control Objectives Covered (No generic references)
Test-1 Manual source code review User authentication module 5.1, 5.2, 5.3
Removed p. 23
Note: When a reporting instruction asks to identify a sample, the Secure Software Assessor must identify the items sampled (for example, as “Set-1”) in the table below and then specify the corresponding sample set reference number in the Assessor Response field next to the applicable reporting instruction in the Findings and Observations section. The existing rows representing pre-defined sample sets must not be deleted. However, the assessor may add rows to this table as needed to accommodate additional sample sets.

Where sampling is used (or where instructed), samples must be representative of the total population. The sample size must be sufficiently large and diverse to provide assurance that the selected sample accurately reflects the overall population, and that any resultant findings based on a sample are an accurate representation of the whole. In all instances where a Secure Software Assessor’s finding is based on a representative sample rather than the complete …
Removed p. 24
4. Assessor Company Attestations
A duly authorized representative of the Assessor Company hereby confirms the following:
Removed p. 24
Note: This section must be printed and signed manually, or digitally signed using a legally recognized electronic signature.
Modified p. 24 → 31
Assessor Employee Name: Assessor Company Name:
Lead Assessor Name: SSF Assessor Company Name:
Removed p. 25
5. Findings and Observations
Minimizing the Attack Surface
The attack surface of the software is minimized. Confidentiality and integrity of all software critical assets are protected, and all unnecessary features and functions are removed or disabled.

1.1.b For each item of sensitive data, the assessor shall examine vendor evidence to confirm that evidence describes where this data is stored, and the applicable security controls implemented to protect the data. This includes in temporary storage (such as volatile memory), semi-permanent storage (such as RAM disks), and non-volatile storage (such as magnetic and flash storage media).

For each item of sensitive data identified in 1.1.a, identify the documentation and evidence examined that describes where each item of sensitive data is stored (including storage in temporary locations, semi-permanent locations, and non-volatile locations), and the security controls implemented to protect the sensitive data.
Modified p. 25 → 32
Control Objective and Test Requirements Reporting Instructions Reporting Details: Assessor’s Response Summary of Assessment Findings (check one) Control Objective 1: Critical Asset Identification All software critical assets are identified 1.1 All sensitive data stored, processed, or transmitted by the software is identified. In Place N/A Not in Place 1.1.a The assessor shall examine vendor evidence to confirm that it details all sensitive data that is stored, processed, and/or transmitted by the software. At a minimum, this shall include all payment …
In Place Not in Place N/A 1.1 All sensitive data stored, processed, or transmitted by the software is identified. In Place Not in Place N/A 1.1.a The assessor shall examine evidence to confirm that information is maintained that details all sensitive data that is stored, processed, and/or transmitted by the software. At a minimum, this shall include all payment data; authentication credentials; cryptographic keys and related data (such as IVs and seed data for random number generators); and system configuration …
Modified p. 25 → 32
Identify the documentation and evidence examined that identifies and describes all sensitive data that is stored, processed, and transmitted by the software.
R1 Identify the evidence obtained that details the sensitive data that is stored, processed, and transmitted by the assessed software.
Removed p. 26
Identify the documentation and evidence examined that describes where the software implementation enforces storage of sensitive data within a specific location or form factor.

Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used, and the scope of each test.

Describe any discrepancies that were found between the information obtained through documentation and evidence review and information obtained through software testing.

Note: The assessor may require and rely on assistance from the software vendor to complete this test requirement (such as through access to a dedicated test environment). Any such specific assistance must be documented by the assessor.

Identify the documentation and evidence examined in support of this test requirement, including the documentation and evidence examined in 1.1.a.
Modified p. 26 → 33
1.1.d The assessor shall examine vendor evidence and test the software to validate the information provided by the vendor in Test Requirement 1.1.a.
1.1.d The assessor shall test the software to validate the evidence obtained in Test Requirements 1.1.a through 1.1.c.
Removed p. 27
Describe any discrepancies that were found between the information obtained through the documentation and evidence reviews and the information obtained through the software testing.

1.1.f The assessor shall examine vendor evidence and test the software to identify the cryptographic implementations that are supported by the software, including (but not limited to) cryptography used for storage, transport, and authentication. The assessor shall confirm that the cryptographic data for all of these implementations is supported by the vendor evidence, and that the evidence describes whether these are implemented by the software itself, through third-party software, or as functions of the execution environment.
Modified p. 27 → 32
Identify the documentation and evidence examined in support of this test requirement.
R2 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 27 → 34
1.1.g The assessor shall examine vendor evidence and test the software to identify any accounts or authentication credentials supported by the software, including both default and user created accounts. The assessor shall confirm that these accounts and credentials are supported by the vendor evidence.
1.1.g The assessor shall examine evidence and test the software to identify the accounts and authentication credentials supported by the software (including both default and user-created accounts) and to confirm that these accounts and credentials are supported by the evidence examined in Test Requirements 1.1.a through 1.1.c.
Removed p. 28
Describe the criteria the assessor used to determine whether configuration options provided by the software have the potential to impact the security of sensitive data.

1.1.i When cryptography is used to protect any sensitive data, the assessor shall examine vendor evidence to confirm that these cryptographic methods and materials are identified.
Modified p. 28 → 34
1.1.h The assessor shall examine vendor evidence and test the software to identify any configuration options provided by the software that can impact sensitive data, including through separate files or scripts, or internal functions, menus and options provided by the software. The assessor shall confirm that these are supported by the vendor evidence.
1.1.h The assessor shall examine evidence and test the software to identify the configuration options provided by the software that can impact sensitive data (including those provided through separate files or scripts, internal functions, or menus and options), and to confirm that these are supported by the evidence examined in Test Requirements 1.1.a through 1.1.c.
Modified p. 28 → 34
Describe each of the configuration options that can impact the security of sensitive data.
R1 Identify the evidence obtained that details the configuration options provided by the assessed software that can impact the security of sensitive data.
Removed p. 29
Indicate whether the software uses cryptography to protect any sensitive data (yes/no).

If “yes,” identify the documentation and evidence examined that identifies and describes all cryptographic methods and materials used to protect sensitive data.

Describe what the assessor observed in the documentation, evidence and software test results that confirms that all software functions that are designed to store, process, and transmit sensitive data are documented.
Removed p. 30
Describe any discrepancies found between the sensitive data identified through the documentation and evidence reviews in 1.1.a and the sensitive data identified through the documentation and evidence reviews in this test requirement.
Modified p. 30 → 35
1.2.c Where the sensitive functions or sensitive resources are provided by third-party software or systems, the assessor shall examine third-party software or system evidence and test the software to confirm that the vendor software is correctly following the guidance for this third-party software.
1.2.c Where the sensitive functions or sensitive resources are provided by third-party software or systems, the assessor shall examine evidence and test the software to confirm that the software correctly follows available guidance for the third-party software.
Modified p. 30 → 35
Note: For example, by reviewing the security policy of a PTS, FIPS 140-2, or FIPS 140-3 approved cryptographic system.
Note: For example, by reviewing the security policy of a PTS or FIPS 140-2 or 140-3 approved cryptographic system.
Modified p. 30 → 35
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether any sensitive functions or sensitive resources are provided by third-party software or systems.
R1 Indicate whether the assessed software relies upon any sensitive functions or sensitive resources during operation that are provided by third-party software or systems.
Modified p. 30 → 35
Indicate whether any sensitive functions or sensitive resources are provided by third-party software or systems (yes/no).
R1 Identify the evidence obtained that details the sensitive functions and sensitive resources provided or relied upon by the assessed software.
Removed p. 31
1.2.d The assessor shall examine vendor evidence and test the software to confirm that the sensitive functions and sensitive resources provided or used by the software are supported by the vendor evidence.

Describe any documentation and evidence examined in support of this test requirement, including the documentation and evidence examined in 1.2.a.

• The vendor defines classification criteria for identifying critical assets.

Describe how the software vendor’s inventory of all critical assets is maintained (e.g., the format, location, etc.).
Modified p. 32 → 36
Vendor classification criteria identifies the confidentiality, integrity, and resiliency requirements for each critical asset.
The software vendor defines criteria for classifying critical assets in accordance with the confidentiality, integrity, and resiliency requirements for each critical asset.
Modified p. 32 → 36
• An inventory of all critical assets with appropriate classifications is defined.
• An inventory of all critical assets with appropriate classifications is maintained.
Modified p. 32 → 36
Describe what the assessor observed in the documentation and evidence examined that confirms the software vendor has identified the confidentiality, integrity, and resiliency requirements for each critical asset.
R1 Identify the evidence obtained that details the confidentiality, integrity, and resiliency requirements for each critical asset.
Removed p. 33
Describe each of the tests performed in support of this test requirement, including tool(s) or method(s) used and the scope of each test.

Describe any discrepancies found between the information obtained through the documentation and evidence reviews and the information obtained through the software testing.
Removed p. 33
Describe each of the software tests performed in support of this test requirement, including tool(s) or method(s) used and the scope of each test.
Modified p. 33 → 37
In Place N/A Not in Place 2.1.a The assessor shall examine vendor evidence and test the software to identify any software APIs or other interfaces that are provided or exposed by default upon installation, initialization, or first use. For each of these functions, the assessor shall confirm that the vendor has documented and justified its use as part of the software architecture. Testing shall include methods to reveal any exposed functionality of the software (such as scanning for listening services …
In Place Not in Place N/A 2.1.a The assessor shall examine evidence and test the software to identify any software APIs or other interfaces that are provided or exposed by default upon installation, initialization, or first use. For each of these interfaces, the assessor shall confirm that the vendor has documented and justified its use as part of the software architecture. Testing shall include methods to reveal any exposed interfaces or other software functionality (such as scanning for listening services …
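Test Requirement 2.1.a calls for methods that reveal exposed functionality, such as scanning for listening services. The idea can be sketched as follows; the loopback host and port range are arbitrary illustrative choices, and a real assessment would use dedicated tooling and also cover UDP services, local sockets, and interfaces exposed by the software itself:

```python
import socket

def find_listening_services(host="127.0.0.1", ports=range(1, 1025), timeout=0.2):
    """Return the TCP ports on `host` that currently accept a connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds,
            # i.e., something is listening on that port.
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Comparing the returned list against the interfaces documented by the vendor highlights any exposed services that are missing from the software architecture documentation.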
Modified p. 33 → 37
Note: This includes functions which are auto-enabled as required during operation of the software.
Note: This includes functions that are auto-enabled as required during operation of the software.
Modified p. 33 → 37
2.1.b The assessor shall test the software to determine whether any of the functions identified in Test Requirement 2.1.a rely on external resources for authentication. If such resources are relied upon, the assessor shall examine vendor evidence to identify what methods are required to ensure proper authentication remains in place and shall confirm that these methods are included in the assessment of all other requirements of this standard.
2.1.b The assessor shall test the software to determine whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication. Where such resources are relied upon, the assessor shall examine evidence to confirm that methods are implemented to ensure that proper authentication remains in place and that these methods are included in the assessment of other applicable requirements in this standard.
Modified p. 33 → 37
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software relies on external resources for authentication.
R2 Describe what the assessor observed in the evidence obtained that confirms that all accessible interfaces are reflected in the software vendor’s documentation.
Modified p. 34 → 37
If “yes,” describe each of the methods implemented by the software to ensure proper authentication remains in place for the APIs and other interfaces provided or exposed by the software.
R3 If R1 is “Yes,” then describe the methods that are implemented to ensure that proper authentication always remains in place during software operation.
Modified p. 34 → 37
Describe what the assessor observed in the documentation, evidence, and software test results that determines whether any of the APIs or other interfaces identified in 2.1.a rely on external resources for the protection of sensitive data during transmission.
R2 If R1 is “No,” then describe what the assessor observed that confirms that none of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication.
Modified p. 34 → 38
2.1.c The assessor shall test the software to determine whether any of the functions identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data during transmission. If such resources are relied upon, the assessor shall examine vendor evidence to identify what methods are required to ensure proper protection remains in place and shall confirm that these methods are included in the assessment of all other requirements of this standard.
2.1.c The assessor shall test the software to determine whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data during transmission. Where such resources are relied upon, the assessor shall examine evidence to confirm that methods are implemented to ensure proper protection remains in place and that these methods are included in the assessment of other applicable requirements in this standard.
Modified p. 34 → 38
Indicate whether any of the APIs or other interfaces identified in 2.1.a rely on external resources for the protection of sensitive data during transmission (yes/no).
R1 Indicate whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data.
Modified p. 34 → 38
If “yes,” describe each of the methods implemented to ensure proper protection of sensitive data remains in place and why the methods are appropriate for their intended purpose.
R3 If R1 is “Yes,” then describe the methods that are implemented to ensure that the protection of sensitive data always remains in place during software operation.
Removed p. 35
Identify the public vulnerability repositories that were searched (manually or electronically) in conjunction with the software testing in this test requirement.

2.1.e Where vulnerabilities in exposed functions exist, the assessor shall examine vendor evidence and test the software to confirm the following:

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether any vulnerabilities exist in exposed software functions, APIs, or other interfaces.
Modified p. 35 → 39
The mitigations implemented by the software vendor to minimize exploit of these weakness have been identified.
Methods are implemented to mitigate the exploitation of these weaknesses.
Modified p. 35 → 39
• The risks posed by the use of known vulnerable protocols, functions, or ports is documented.
• The risks posed by the use of known vulnerable protocols, functions, or ports are documented.
Modified p. 35 → 39
• Clear and sufficient guidance on how to correctly implement sufficient security to meet the security and control objectives of this standard is made available to stakeholders per Control Objective 12.1.
• Clear and sufficient guidance on how to correctly implement sufficient security to meet applicable control objectives in this standard is provided to stakeholders in accordance with Control Objective 12.1.
Removed p. 36
If “yes,” complete the remaining reporting instructions for this test requirement.

Describe each of the mitigations implemented in the software to mitigate the risks posed by the vulnerabilities in the exposed methods or services, and why each mitigation is appropriate for its intended purpose.

Describe the software vendor’s justification for the software’s use of known vulnerable functions, protocols, and ports, and how the assessor concluded the vendor’s justification(s) are reasonable, given the risks involved.

2.1.f The assessor shall examine vendor evidence and test the software to confirm available functionality matches what is described in vendor documentation. Testing shall include methods to reveal any exposed functionality of the software (such as scanning for listening services where applicable).

Note: This test requirement is redundant with test requirement 2.1.a. Reporting instructions are intentionally left blank. No further instruction needed.
Removed p. 37
Where access to third-party functions is prevented through implemented mitigations, the assessor shall test the software to confirm that they do not rely on a lack of knowledge of the functions as their security mitigation method (e.g., by simply not documenting an otherwise accessible API interface) and to verify the mitigations in place are effective at preventing the insecure use of such third-party functions.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all software functions exposed by third-party modules are either disabled or unable to be accessed by default.

Indicate whether any third-party modules or functions are exposed (through APIs or other interfaces) by default (yes/no).
Removed p. 37
Describe each of the third-party modules or functions that are exposed by default and the software vendor’s justification for doing so. For each instance where third- party modules or functions are exposed by default, also describe why the assessor considers the vendor’s justification for each exception reasonable.
Removed p. 38
2.2.b Where any software security controls, features, and functions are enabled only upon initialization or first use, the assessor shall test the software to confirm that no sensitive data can be processed until this initialization process has been completed.
Modified p. 38 → 40
In Place N/A Not in Place 2.2.a The assessor shall examine vendor evidence and test the software to identify all software security controls, features and functions, and to confirm that any such controls, features and functions relied upon by the software for the protection of critical assets are enabled upon installation, initialization, or first use of the software.
In Place Not in Place N/A 2.2.a The assessor shall examine evidence and test the software to identify all software security controls, features and functions relied upon by the software for the protection of critical assets and to confirm that all are enabled upon installation, initialization, or first use of the software.
Modified p. 38 → 40
Indicate whether any of the software security controls, features, and functions identified in 2.2.a are enabled only upon initialization or first use, rather than upon installation (yes/no).
R1 Indicate whether any software security controls relied upon by the software for the protection of critical assets can only be enabled upon software initialization or first use.
Modified p. 38 → 40
Describe any circumstances that exist that prevent security controls, features, and functions from being enabled upon installation (for example, the software is only available as a service).
R1 Identify the evidence obtained that details the software security controls, features, and functions relied upon by the software for the protection of critical assets.
Removed p. 39
2.2.c Where user input or interaction is required to enable any software security controls, features, or functions (such as the installation of certificates) the assessor shall examine vendor evidence to confirm that there is clear and sufficient guidance on the process provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

Indicate whether any of the software security controls, features, or functions identified in 2.2.a require user input or interaction prior to being enabled (yes/no).

2.2.d The assessor shall examine vendor evidence and test the software to confirm that following the software vendor’s implementation guidance required in Control Objective 12.1 results in all security-relevant software security controls, features, and functions being enabled prior to the software enabling processing of sensitive data.
Modified p. 39 → 41
Identify the documentation and evidence examined that contains the software vendor’s implementation guidance for properly enabling each of the security controls, features, and functions that require user input or interaction prior to being enabled.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance for configuring all configurable software security controls, features, or functions.
Removed p. 40
Describe any discrepancies found between the information obtained through the documentation and evidence reviews in 2.3.a and the information obtained through the software testing performed in this test requirement.
Modified p. 40 → 41
In Place N/A Not in Place 2.3.a The assessor shall examine vendor evidence to identify all default credentials, keys, certificates, and other critical assets used for authentication by the software.
In Place Not in Place N/A 2.3.a The assessor shall examine evidence to identify the default credentials, keys, certificates, and other critical assets used for authentication by the software.
Modified p. 40 → 42
Note: It is expected that this analysis will include, but not necessarily be limited to, the use of entropy analysis tools to look for hardcoded cryptographic keys, searches for common cryptographic function calls and structures such as SBoxes and big-number library functions (and tracing these functions backwards to search for hardcoded keys), as well as checking for strings containing common user account names or password values.
Note: It is expected that this analysis will include, but not necessarily be limited to, the use of entropy analysis tools to look for hardcoded cryptographic keys, searches for common cryptographic function calls and structures such as S-Boxes and big-number library functions (and tracing these functions backwards to search for hardcoded keys), as well as checking for strings containing common user account names or password values.
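The entropy-analysis technique this note refers to can be illustrated with a minimal sketch; the 64-byte window and 5.5 bits-per-byte threshold below are arbitrary heuristic choices for illustration, not values from the standard:

```python
import math

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte for `data` (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def flag_high_entropy_windows(blob: bytes, window=64, threshold=5.5):
    """Yield offsets of fixed-size windows whose entropy suggests key-like material.

    Heuristic: random key bytes score near the maximum for the window size,
    while text, code, and sparse data score well below it.
    """
    for off in range(0, max(len(blob) - window + 1, 0), window):
        if shannon_entropy(blob[off:off + window]) >= threshold:
            yield off
```

High-entropy windows that do not correspond to compressed or legitimately random resources are candidates for manual review as possible hardcoded keys.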
Removed p. 41
Identify the documentation and evidence examined that contains the software vendor’s implementation guidance for disabling or changing all authentication credentials or keys for built-in accounts, where user input or interaction is required.

Describe what the assessor observed in the software testing results that confirms that default authentication credentials or keys for built-in accounts are not used by the software after software installation, initialization, or first use.
Modified p. 41 → 42
Indicate whether any authentication methods require user input or interaction to disable or change authentication credentials or keys for built-in accounts (yes/no).
R1 Indicate whether user input or interaction is required to disable or change default authentication credentials for built-in accounts.
Removed p. 42
Describe what the assessor observed in the software testing results that confirms that default authentication credentials or keys for built-in accounts are not used to protect the storage and transmission of sensitive data after software installation, initialization, or first use.

Describe what the assessor observed in the documentation and evidence that indicates that the vendor’s justifications for the privileges and resources required by the software are reasonable.

Describe what the assessor observed in the documentation and evidence that confirms whether the software requires elevated privileges to access any resources required by the software.
Modified p. 42 → 43
In Place N/A Not in Place 2.4.a The assessor shall examine vendor evidence to identify all privileges and resources required by the software and to confirm the evidence describes and reasonably justifies all privileges and resources required, including explicit permissions for access to resources, such as cameras, contacts, etc.
In Place Not in Place N/A 2.4.a The assessor shall examine evidence to identify the privileges and resources required by the software and to confirm that information is maintained that describes and reasonably justifies all privileges and resources required, including explicit permissions for access to resources, such as cameras, contacts, etc.
Modified p. 42 → 43
2.4.b Where limiting access is not possible (e.g., due to the architecture of the solution or the execution environment in which the software is executed), the assessor shall examine vendor evidence to identify all mechanisms implemented by the software to prevent unauthorized access, exposure, or modification of critical assets, and to confirm there is clear and sufficient guidance on properly implementing the mechanisms provided in the software vendor’s implementation guidance made available to stakeholders. Identify the documentation and evidence examined in support …
2.4.b Where limiting access is not possible due to the architecture of the solution or the execution environment in which the software is executed, the assessor shall examine evidence to identify all mechanisms implemented by the software to prevent unauthorized access, exposure, or modification of critical assets, and to confirm that guidance on properly implementing and configuring these mechanisms is provided to stakeholders in accordance with Control Objective 12.1.
Modified p. 42 → 44
Identify the documentation and evidence examined in support of this test requirement.
R5 Describe any other assessment activities and/or findings for this test requirement.
Removed p. 43
Identify the documentation and evidence examined that contains the software vendor’s implementation guidance on properly configuring such mechanisms.

Describe any discrepancies between the information obtained through the documentation and evidence reviews in 2.4.a and the information obtained through the software testing in this test requirement.
Modified p. 43
Describe the mechanisms implemented by the software to prevent unauthorized access, exposure, or modification of critical assets.
R3 If R1 is “Yes,” then describe the methods or mechanisms implemented to prevent unauthorized users from using the privileges to access, expose, or modify critical assets.
Modified p. 43 → 44
2.4.c The assessor shall test the software to confirm that access permissions and privileges are assigned according to the vendor evidence. The assessor shall, where possible, use suitable tools for the platform on which the software is installed to review the permissions and privileges of the software itself, as well as the permissions and privileges of any resources, files, or additional elements generated or loaded by the software during use.
2.4.c The assessor shall test the software to confirm that access permissions and privileges are assigned according to the evidence examined in Test Requirement 2.4.a. The assessor shall, where possible, use suitable tools for the platform on which the software is installed to review the permissions and privileges of the software itself, as well as the permissions and privileges of any resources, files, or additional elements generated or loaded by the software during use.
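As one hedged example of the kind of platform tooling 2.4.c describes, a POSIX review can walk the files the software creates or loads and flag modes broader than the documented evidence allows. The group/other-writable check below is an illustrative heuristic, not the standard's definition of excessive privilege.

```python
import os
import stat


def review_permissions(paths):
    """Return (path, mode) pairs for files whose POSIX mode grants write
    access to group or other users -- a common red flag when reviewing
    resources, files, or elements generated or loaded by the software.
    POSIX-only sketch; Windows ACLs need different tooling."""
    findings = []
    for path in paths:
        mode = os.stat(path).st_mode
        if mode & (stat.S_IWGRP | stat.S_IWOTH):
            findings.append((path, oct(stat.S_IMODE(mode))))
    return findings
```

The assessor would compare the returned modes against the permissions and privileges asserted in the evidence examined under 2.4.a.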
Removed p. 44
Describe each of the tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.

Describe what the assessor observed in documentation, evidence and software test results that confirms that the software does not use legacy features of the intended execution environment, and that only recent and secured functions are implemented.

Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s justifications for the privileges assigned to each of the default accounts are reasonable.
Modified p. 44 → 45
In Place N/A Not in Place 2.5.a The assessor shall examine the vendor evidence to identify all default accounts provided by the software and to confirm vendor evidence includes reasonable justification for the privileges assigned to these accounts.
In Place Not in Place N/A 2.5.a The assessor shall examine evidence to identify all default accounts provided by the software and to confirm that the privileges assigned to these accounts are justified and reasonable.
Modified p. 44 → 45
2.5.b The assessor shall test the software to confirm that all default accounts provided or used by the software are supported by the vendor evidence.
2.5.b The assessor shall test the software to confirm that all default accounts provided or used by the software are supported by the evidence examined in Test Requirement 2.5.a.
Removed p. 45
Describe what the assessor observed in documentation, evidence, and software test results that confirms that all exposed software functions (APIs and other interfaces) are protected from unauthorized use and modification.
Modified p. 45
2.5.c The assessor shall examine vendor evidence and test the software to confirm that exposed functionalities (i.e., APIs) are protected from use by unauthorized users to modify account privileges and elevate user access rights.
2.5.c The assessor shall examine evidence and test the software to confirm that exposed interfaces, such as APIs, are protected from attempts by unauthorized users to modify account privileges and elevate user access rights.
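The kind of server-side control 2.5.c tests for can be illustrated with a minimal privilege gate around an exposed function. The session dictionary and privilege names here are hypothetical; production software would bind this check to its real authentication and session layer.

```python
from functools import wraps


def requires_privilege(privilege: str):
    """Reject calls to an exposed function unless the caller's session
    carries the named privilege -- the kind of check expected around any
    interface that can alter account privileges or elevate access."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(session, *args, **kwargs):
            if privilege not in session.get("privileges", set()):
                raise PermissionError(f"{privilege} privilege required")
            return fn(session, *args, **kwargs)
        return wrapper
    return decorator


@requires_privilege("admin")
def set_account_privileges(session, account, new_privileges):
    # Hypothetical privileged operation; only reachable past the gate.
    return (account, set(new_privileges))
```

In testing, the assessor would attempt the call with an unprivileged session and confirm the request is refused rather than silently honored.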
Removed p. 46
Describe what the assessor observed in the documentation and evidence that indicates that the vendor’s justifications for retaining each item of persistent sensitive data are reasonable.

Describe any discrepancies found between the functions and services identified through the documentation and evidence reviews and the functions and services identified through the software testing.
Modified p. 46
In Place N/A Not in Place 3.1.a The assessor shall examine vendor evidence to identify what sensitive data is collected by the software for use beyond any one transaction, the default time period for which it is retained, and whether the retention period is user- configurable, and to confirm vendor evidence includes reasonable justification for retaining the sensitive data.
In Place Not in Place N/A 3.1.a The assessor shall examine evidence to identify the sensitive data that is collected by the software for use beyond any one transaction, the default time period for which it is retained, and whether the retention period is user-configurable, and to confirm that the purpose for retaining the sensitive data in this manner is justified and reasonable.
Modified p. 46
3.1.b The assessor shall test the software to confirm that all available functions or services designed for the retention of sensitive data are supported by the vendor evidence.
3.1.b The assessor shall test the software to confirm that all available functions or services designed for the retention of sensitive data are supported by the evidence examined in Test Requirement 3.1.a.
Removed p. 47
Describe what the assessor observed in the documentation, evidence, and software test results that indicates that all persistent sensitive data stored or retained for debugging, error finding, or testing purposes are protected in accordance with Control Objective 6.

Describe the software configuration options, mechanisms, etc., that must be enabled explicitly and authorized by the user before any storage and/or retention of persistent sensitive data is permitted by the software.

Describe the specific features and functions implemented by the software to ensure that all persistent sensitive data retained for debugging, error finding, or testing are securely deleted upon closure of the software in accordance with Control Objective 3.4.
Modified p. 47
Indicate whether the software facilitates the storage and/or retention of persistent sensitive data for debugging, error finding, or testing purposes (yes/no).
R1 Indicate whether the software facilitates the persistent storage of sensitive data for the purposes of debugging, error finding, or system testing.
Removed p. 48
Identify the documentation and evidence examined that identifies and describes all instances in the software in which user input or interaction is required to configure the retention period of persistent sensitive data.

Describe what the assessor observed in the software vendor’s guidance that indicates clear and sufficient instruction is provided to stakeholders on configuring the retention periods of persistent sensitive data and the secure deletion procedures of the software.
Modified p. 48 → 49
Indicate whether the software requires user input or interaction to configure the retention period for any persistent sensitive data stored or retained by the software (yes/no).
R1 Indicate whether the software requires or otherwise enables users to configure the retention periods for sensitive data stored in temporary storage facilities.
Modified p. 48 → 52
Identify the documentation and evidence examined that contains the software vendor’s implementation guidance for configuring the retention periods for persistent sensitive data stored or retained by the software.
R1 Identify the evidence obtained that details the methods implemented to render persistent sensitive data irretrievable when no longer needed.
Removed p. 49
Note: The assessor should refer to Control Objective 1 to identify all critical assets, including transient sensitive data.

Describe what the assessor observed in the documentation and evidence examined that indicates that the software vendor’s justifications for retaining each item of sensitive data for transient use are reasonable.

Note: The assessor should refer to Control Objective 1 to identify all sensitive functions and services.

Describe what the assessor observed in the software test results that confirms the software does not use immutable objects to store transient sensitive data.

Describe any discrepancies found between the functions and services for retaining transient sensitive data identified through the documentation and evidence reviews in 3.2.a and the functions and services for retaining transient sensitive data identified through the software testing in this test requirement.
Modified p. 49 → 48
In Place N/A Not in Place 3.2.a The assessor shall examine vendor evidence to identify all sensitive data that is retained by the software for transient use, what triggers the secure deletion of this data, and confirm reasonable justification exists for retaining the data. This includes data that is stored only in memory during the operation of the software.
In Place Not in Place N/A 3.2.a Using information obtained in Test Requirement 1.1.a, the assessor shall examine evidence to identify all sensitive data that is retained by the software for transient use, what triggers the secure deletion of this data, and to confirm that the purposes for retaining the data are justified and reasonable. This includes data that is stored only in memory during the operation of the software.
Modified p. 49 → 48
3.2.b The assessor shall test the software to confirm that all available functions or services that retain transient sensitive data are supported by vendor evidence and do not use immutable objects.
3.2.b Using information obtained in Test Requirement 1.2.a, the assessor shall test the software to confirm that all available functions or services that retain transient sensitive data are supported by evidence examined in Test Requirement 3.2.a and do not use immutable objects.
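The prohibition on immutable objects exists because they cannot be scrubbed in place. A minimal Python illustration: holding a value in a mutable bytearray allows an explicit overwrite, whereas str or bytes copies persist in memory until garbage collection, and even then the freed memory is not zeroed. (This is a sketch of the concept; CPython internals such as reallocation can still leave copies.)

```python
def wipe(buf: bytearray) -> None:
    """Overwrite a mutable buffer in place so the sensitive value no
    longer resides at that memory location. Immutable objects offer no
    such guarantee."""
    for i in range(len(buf)):
        buf[i] = 0


# Transient sensitive data held in a mutable bytearray, not str/bytes.
pan = bytearray(b"4111111111111111")  # test PAN, not real data
wipe(pan)
```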
Removed p. 50
Describe what the assessor observed in the documentation, evidence, and software test results that indicate that all transient sensitive data stored or retained for debugging, error finding, or testing purposes are protected in accordance with Control Objective 6.

Describe the software configuration options, mechanisms, etc., that must be enabled explicitly and authorized by the user before any storage and/or retention of transient sensitive data for debugging, error finding, or testing is permitted.

Describe the specific features and functions implemented by the software to ensure that all transient sensitive data retained for debugging, error finding, or testing is securely deleted upon closure of the software in accordance with Control Objective 3.4.
Modified p. 50 → 49
Indicate whether the software facilitates the storage or retention of transient sensitive data for debugging, error finding, or testing (yes/no).
R1 Indicate whether the software facilitates the storage of sensitive data in temporary storage facilities for the purposes of debugging, error finding, or system testing.
Removed p. 51
Describe what the assessor observed in the software vendor’s implementation guidance that indicates clear and sufficient instruction is provided to stakeholders on configuring the retention periods for transient sensitive data and configuring the secure deletion of such data when no longer needed.
Modified p. 51 → 47
Describe what the assessor observed in the documentation and evidence that confirms whether the software enables users to configure retention periods for transient sensitive data.
R1 Indicate whether the software requires or otherwise enables users to configure the retention period for persistently stored sensitive data.
Modified p. 51 → 47
Identify the documentation and evidence examined that contains the software vendor’s implementation guidance for configuring the retention periods for transient sensitive data stored or retained by the software.
R2 If R1 is “Yes,” identify the evidence obtained that details the options available to users to configure the retention periods for this data.
Modified p. 51
Indicate whether the software enables users to configure the retention periods for transient sensitive data (yes/no).
R1 Indicate whether the software requires or otherwise enables users to configure methods to protect sensitive data in storage facilities (transient or persistent).
Removed p. 52
In Place N/A Not in Place 3.3.a The assessor shall examine the vendor evidence to identify the protection methods implemented for all sensitive data during storage and transmission.

3.3.b The assessor shall test the software to confirm that no additional storage of sensitive data is included.

Describe what the assessor observed in the software testing results that confirms the software does not provide for any additional storage or retention of sensitive data beyond that which was identified in the testing for Control Objectives 3.1 and 3.2.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether any transient sensitive data is stored outside of temporary variables during retention.
Modified p. 52 → 50
3.3.c Where sensitive data is stored outside of temporary variables within the code itself, the assessor shall test the software to confirm that sensitive data is protected using either strong cryptography or other methods that provide an equivalent level of security.
3.3.b Where sensitive data is stored outside of temporary variables within the code itself, the assessor shall test the software to confirm that sensitive data is protected using either strong cryptography or other methods that provide an equivalent level of security.
Removed p. 53
Identify the software structures and locations used to store each item of sensitive data (persistent or transient), where such data is stored outside of temporary variables during retention.

For each item of sensitive data (persistent or transient) that is stored outside of temporary variables during retention, describe what the assessor observed in the documentation, evidence, and software test results that confirms that all instances of sensitive data stored outside of temporary variables during retention are protected using either strong cryptography or other methods that provide equivalent security.
Modified p. 53 → 50
3.3.d Where protection methods use cryptography, the assessor shall examine vendor evidence and test the software to confirm that the cryptographic implementation complies with Control Objective 7 of this standard.
3.3.c Where protection methods use cryptography, the assessor shall examine evidence and test the software to confirm that the cryptographic implementation complies with Control Objective 7 of this standard.
Removed p. 54
For each of the cryptographic methods used, describe what the assessor observed in the documentation, evidence, and software test results that confirms that each cryptographic implementation complies with Control Objective 7.

3.3.e Where sensitive data is protected using methods other than strong cryptography, the assessor shall examine vendor evidence and test the software to confirm that the protections are present in all environments where the software is designed to be executed, are correctly implemented, and are covered by the vendor evidence.
Modified p. 54 → 47
Identify the cryptographic methods used to protect sensitive data stored or retained by the software.
R3 If R1 is “Yes,” then describe the methods implemented to protect sensitive data when retained for this purpose.
Modified p. 54 → 50
Indicate whether protection methods other than strong cryptography are used to protect any sensitive data during storage or retention (yes/no).
R1 Indicate whether the software uses or relies on cryptography for the protection of stored sensitive data (transient or persistent).
Removed p. 55
Describe any discrepancies found between the protection methods identified through the documentation and evidence reviews and the protection methods identified through the software testing.

Indicate whether the software requires user input or interaction to configure the protection methods identified in 3.3.a, 3.3.d and 3.3.e (yes/no).

Identify the documentation and evidence examined that contains the software vendor’s implementation guidance for instructing users where and how to configure all protection mechanisms that require user input or interaction.
Modified p. 55 → 49
3.3.f Where users are required to configure protection methods, the assessor shall examine vendor evidence to confirm that there is clear and sufficient guidance on this process provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.
3.2.d Where users can configure retention of transient sensitive data, the assessor shall examine vendor evidence to confirm that clear and sufficient guidance on this process is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.
Removed p. 57
Describe any additional software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.

Identify all platform and implementation issues that may complicate the erasure of transient sensitive data.

Identify the methods implemented to ensure non-transient sensitive data is rendered unrecoverable.

Describe how the methods to render transient sensitive data unrecoverable accommodate for platform-specific issues, such as flash wear-levelling algorithms or SSD over-provisioning.
Modified p. 57 → 50
Identify the documentation and evidence examined that identifies and describes all methods implemented by the software to securely delete non-transient sensitive data when no longer required.
R1 Identify the evidence examined that details the methods implemented and/or relied upon to protect sensitive data (both transient and persistent) during retention.
Modified p. 57 → 52
3.4.b The assessor shall examine vendor evidence and test the software to identify any platform or implementation level issues that complicate the secure deletion of non-transient sensitive data and to confirm that any non-transient sensitive data is securely deleted using a method that ensures that the data is unrecoverable after deletion. Methods may include (but are not necessarily limited to) overwriting the data, deletion of cryptographic keys (of sufficient strength) which have been used to encrypt the data, or platform …
3.4.b The assessor shall examine evidence and test the software to identify any platform or implementation level issues that complicate the secure deletion of non-transient sensitive data and to confirm that any non-transient sensitive data is securely deleted using a method that ensures that the data is rendered unrecoverable. Methods may include (but are not necessarily limited to) overwriting the data, deletion of cryptographic keys (of sufficient strength) that have been used to encrypt the data, or platform-specific functions that
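The overwrite method this requirement mentions can be sketched as below. This is a best-effort illustration: as the adjacent requirements note, wear-levelled flash, SSD over-provisioning, and copy-on-write filesystems can leave the original blocks untouched, which is why deleting the encryption key of sufficient strength (crypto-erase) is often the more reliable option on such media.

```python
import os


def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Best-effort secure deletion for in-place-writable media:
    overwrite the file contents with random bytes, force them to disk,
    then unlink. On SSDs, wear-levelled flash, or copy-on-write
    filesystems this may NOT touch the original blocks; prefer
    crypto-erase there."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())
    os.unlink(path)
```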
Modified p. 57 → 52
Indicate whether there are any platform or implementation level issues that may complicate the secure deletion of non- transient sensitive data (yes/no).
R1 Indicate whether known platform or implementation-level issues exist that complicate the secure deletion of sensitive data from persistent data stores.
Removed p. 58
Describe what the assessor observed in the software testing results that confirms the methods implemented by the software to render non-transient sensitive data unrecoverable are implemented correctly.
Removed p. 59
In Place N/A Not in Place 3.5.a The assessor shall examine vendor evidence to identify all secure deletion methods for all transient sensitive data and to confirm that these methods ensure that the data is unrecoverable after deletion.

3.5.b The assessor shall examine vendor evidence and test the software to identify any platform or implementation level issues that complicate the erasure of such transient sensitive data

• such as abstraction layers between the code and the hardware execution environment

Identify all platform and implementation issues that complicate the erasure of transient sensitive data.

Identify each of the methods implemented to minimize the risk posed by these complications.
Modified p. 59 → 54
• and to confirm what methods have been implemented to minimize the risk posed by these complications.
R2 If R1 is “Yes,” then describe the methods implemented to mitigate the risks associated with such complications.
Modified p. 59 → 54
Indicate whether there are any platform or implementation level issues that may complicate the erasure of transient sensitive data (yes/no).
R1 Indicate whether known platform or implementation-level issues were discovered that complicate the secure deletion of sensitive data from transient data stores.
Removed p. 60
Describe what the assessor observed in the software testing results that confirms that the methods implemented by the software to render transient sensitive data unrecoverable are implemented correctly and applied to all transient sensitive data.
Modified p. 61 → 55
• Execution environments that may be vulnerable to remote side-channel attacks to expose sensitive data, such as attacks that exploit cache timing or branch prediction within the platform processor.
• Execution environments that may be vulnerable to remote side-channel attacks to expose sensitive data, such as attacks that exploit cache timing or branch prediction within the platform processor.
Modified p. 61 → 55
• Automatic storage or exposure of sensitive data by the underlying execution environment, such as through swap-files, system error logging, keyboard spelling, and auto-correct features, etc.
• Automatic storage or exposure of sensitive data by the underlying execution environment, such as through swap-files, system error logging, keyboard spelling, and auto-correct features.
Modified p. 61 → 55
• Sensors or services provided by the execution environment that may be used to extract or leak sensitive data such as through use of an accelerometer to capture input of a passphrase to be used as a seed for a cryptographic key, or through capture of sensitive data through use of cameras, near-field communication (NFC) interfaces, etc.
• Sensors or services provided by the execution environment that may be used to extract or leak sensitive data, such as through use of an accelerometer to capture input of a passphrase to be used as a seed for a cryptographic key, or through capture of sensitive data through use of cameras or near-field communication (NFC) interfaces.
Modified p. 61 → 55
Describe what the assessor observed in the documentation and evidence that confirms the software vendor’s analysis accounts for the attack vectors described in this test requirement.
R2 Describe what the assessor observed in the evidence obtained that confirms the software vendor’s analysis accounts for the attack vectors described in this test requirement.
Modified p. 61 → 55
Identify any additional sensitive data disclosure attack vectors covered in the vendor’s analysis.
R1 Identify the evidence obtained that details the software vendor’s sensitive data disclosure attack vector analysis.
Modified p. 61 → 55
Identify the date(s) when the software vendor’s analysis was last updated.
R4 Identify the date when the software vendor’s analysis was last performed or updated.
Removed p. 62
Describe what the assessor observed in the documentation, evidence, and software test results that indicate that the mechanisms that protect against the unintended disclosure of sensitive data are implemented correctly.

3.6.c The assessor shall examine vendor evidence to confirm that clear and sufficient guidance on the proper configuration and use of such mitigations is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

Describe what the assessor observed in the software testing results that confirms that all mitigation controls that protect against the unintended disclosure of sensitive data are implemented correctly.
Modified p. 62 → 56
3.6.d The assessor shall test the software using forensic tools to identify any sensitive data residue in the execution environment, and to confirm that all mitigation controls are correctly implemented and the software does not expose or otherwise reveal sensitive data.
3.6.d The assessor shall test the software to identify any sensitive data residue in the execution environment, and to confirm that protection methods are implemented correctly and the software does not expose or otherwise reveal sensitive data to unauthorized users.
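A common way to perform the residue test in 3.6.d is to seed the software with known marker values, capture a memory dump, disk image, or swap file after use, then search the capture for those markers. A minimal search sketch follows; the marker values are test data chosen by the assessor, not anything defined by the standard.

```python
def scan_for_residue(image: bytes, markers):
    """Search a captured image (memory dump, disk image, swap file) for
    seeded test values and report every offset at which each marker
    still appears. An empty result supports the claim that no sensitive
    data residue remains in that capture."""
    found = {}
    for marker in markers:
        offsets, start = [], image.find(marker)
        while start != -1:
            offsets.append(start)
            start = image.find(marker, start + 1)
        if offsets:
            found[marker] = offsets
    return found
```

Real forensic tooling would also search for encoded or fragmented forms of the markers; a plain byte search is only the baseline.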
Modified p. 62 → 58
Describe what the assessor observed in the documentation and evidence that confirms the software vendor’s guidance is appropriate and sufficient to result in the secure configuration and use of all such mitigations.
R4 If R2 is “No,” then describe what the assessor observed in the evidence obtained that confirms the software vendor’s methodology provides equivalent coverage to industry-standard methods.
Removed p. 63
Control Objective and Test Requirements Reporting Instructions Reporting Details: Assessor’s Response Summary of Assessment Findings (check one) Control Objective 4: Critical Asset Protection Critical assets are protected from attack scenarios.

Identify the documentation and evidence examined that identifies and describes all software attack scenarios and the protection mechanisms implemented to mitigate those attacks.

Identify the documentation and evidence examined that identifies and describes the vendor’s method(s) for determining software attack scenarios.

Indicate whether industry-standard methods are the basis for the software vendor’s method(s) (yes/no).

If “no,” describe how the software vendor’s method(s) provides coverage of the software attack scenarios that is equivalent to industry-standard methods.
Modified p. 63 → 58
4.1.b The assessor shall examine vendor evidence to determine whether any specific industry-standard methods or guidelines were used to identify relevant attack scenarios, such as the threat model guidelines. Where such industry standards are not used, the assessor shall confirm that the methodology used provides an equivalent coverage of the attack scenarios and methods for the software.
4.1.b The assessor shall examine evidence to determine whether any specific industry-standard methods or guidelines were used to identify relevant attack scenarios.
Modified p. 63 → 58
If “yes,” identify the industry-standard methods or guidelines used.
R3 If R2 is “Yes,” then identify the industry- standard methods or guidelines used.
Removed p. 64
Describe what the assessor observed in the documentation and evidence that indicates that all sensitive data entry and egress points and the authentication and trust model(s) applied to these points are covered in the vendor’s attack analysis.

Describe what the assessor observed in the documentation and evidence that indicates that all software data flows, network segments, and authentication/privilege boundaries are covered in the software vendor’s attack analysis.

Describe what the assessor observed in the documentation and evidence that indicates that all static IPs, domains, URLs, or ports required by the software for operation were covered in the vendor’s attack analysis.
Modified p. 64 → 59
• All critical assets managed by and all sensitive resources used by the system are documented.
• All critical assets managed, and all sensitive resources used by the system are documented.
Modified p. 64 → 59
• All entry and egress methods for sensitive data by the software, as well as the authentication and trust model applied to each of these entry/egress points, are defined.
• All entry and egress points for sensitive data, as well as the authentication and trust model applied to each of these entry/egress points, are documented.
Modified p. 64 → 59
• All data flows, network segments, and authentication/privilege boundaries are defined.
• All data flows, network segments, and authentication/privilege boundaries are documented.
Modified p. 64 → 59
Identify the individual who is assigned formal ownership for the software under evaluation.
R1 Identify the individual assigned formal responsibility for the security of the assessed software.
Modified p. 64 → 59
Summarize the software vendor’s method(s) for defining and measuring the probability and impact of potential exploits against the assessed software.
R2 Identify the evidence obtained that details the software vendor’s methodology for measuring the probability and impact of potential exploits.
Removed p. 65
• Consideration for the installed environment of the software, including any considerations for the size of the install base, are documented.

• All attack surfaces that must be mitigated, such as implementing insecure user prompts or separating open protocol stacks; storage of sensitive data post authorization or storage of sensitive data using insecure methods, etc.

Describe what the assessor observed in the documentation and evidence that indicates that cryptography and cryptographic elements, such as cipher modes, were considered in the vendor’s attack analysis.

Describe what the assessor observed in the documentation and evidence that indicates that the software execution environment was considered in the vendor’s attack analysis.

Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s attack analysis covers the use of insecure methods.

4.1.d The assessor shall examine vendor evidence to confirm that the threat model created is reasonable to address the potential risks posed by the install …
Modified p. 65 → 59
• Considerations for cryptography elements like cipher modes, protecting against timing attacks, padded oracles, brute force, “rainbow table” attacks, dictionary attacks against the input domain, etc. are documented.
• Considerations for cryptography elements like cipher modes, and protecting against relevant attacks such as timing attacks, padded oracles, brute force, “rainbow table” attacks, and dictionary attacks against the input domain are documented.
Modified p. 65 → 59
Identify the documentation and evidence examined in support of this test requirement.
R6 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 65 → 60
• Execution environment implementation specifics or assumptions such as network configurations, operating system security configurations, etc. are documented.
• Execution environment implementation specifics or assumptions, such as network configurations and operating system security configurations, are documented.
Removed p. 66
If “yes,” describe what the assessor observed in the documentation and evidence that indicates that the vendor’s justification(s) for any lack of mitigation is reasonable.

4.2.b The assessor shall examine vendor evidence and test the software to confirm that the implemented mitigation methods are reasonable for the threat they address.

Describe what the assessor observed in the documentation, evidence and software test results that indicates that the implemented mitigation methods are appropriate for the threats they are intended to address.

4.2.c Where any mitigations rely on settings within the software, the assessor shall test the software to confirm that such settings are applied by default, before first processing any sensitive data, upon installation, initialization, or first use of the software.
Modified p. 66 → 61
Indicate whether any of the threats identified in Control Objective 4.1 were not mitigated (yes/no).
R1 Indicate whether any of the mitigations identified in Test Requirement 4.2.a rely on features of the execution environment.
Modified p. 66 → 61
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R2 If R1 is “Yes,” then describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that changes to those settings require user authentication and authorization.
Removed p. 67
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether any software mitigations or protection mechanisms rely on configurable settings within the software.

Describe what the assessor observed in the documentation, evidence, and software test results that indicate that all settings and values required to configure such mitigations are applied by default.

Indicate whether any such settings and mitigations can be disabled, removed, or bypassed by user input or interactions (yes/no).

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that settings or values required to configure threat mitigations cannot be disabled, removed, or bypassed without requiring strong user authentication and authorization.
Modified p. 67 → 61
Indicate whether any of the mitigations identified in 4.2.a rely on software configuration settings or values (yes/no).
R1 Indicate whether any of the mitigations identified in Test Requirement 4.2.a rely on settings within the software.
Removed p. 68
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether software protection mechanisms rely on features of the execution environment.

Indicate whether any protection mechanisms rely on features of the execution environment (yes/no).
Modified p. 68 → 61
4.2.d When any mitigations rely on features of the execution environment, the assessor shall examine vendor evidence to confirm that guidance is provided to the software users to enable such settings as part of the install process.
4.2.d When any mitigations rely on features of the execution environment, the assessor shall examine evidence to confirm that guidance is provided to stakeholders on how to enable such settings in accordance with Control Objective 12.1.
Modified p. 68 → 61
Identify the documentation and evidence that contains the software vendor’s guidance on enabling, configuring, and using protection methods provided by the execution environment.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to enable and configure the software to securely use these features.
Modified p. 68 → 62
Indicate whether the intended execution environment provides any APIs to query the status of mitigation controls (yes/no).
R1 Indicate whether the software relies on APIs provided by the execution environment to query the status of software security controls.
Removed p. 70
Describe what the assessor observed in the documentation and evidence that confirms that authentication requirements are defined for all roles based on critical-asset classification, type of access, and level of privilege.
Modified p. 70 → 63
5.1.b The assessor shall examine vendor evidence and test the software to confirm that all access to critical assets is authenticated and authentication mechanisms are implemented correctly.
5.1.b The assessor shall examine evidence and test the software to confirm that access to critical assets is authenticated and authentication mechanisms are implemented correctly.
Modified p. 70 → 63
Describe what the assessor observed in the documentation, evidence, and software test results that indicate that each of the authentication mechanisms are implemented correctly.
R2 Describe what the assessor observed in the evidence obtained that confirms that access to all critical assets is authenticated.
Removed p. 71
Describe what the assessor observed in the documentation and evidence that indicates that all data associated with authentication credentials is treated as a critical asset.
Modified p. 71 → 64
Indicate whether the software relies on or supports the use of additional mechanisms for secure non-console access to the software (yes/no).
R1 Indicate whether the software relies on or supports the use of external authentication mechanisms.
Modified p. 71 → 64
5.1.d The assessor shall examine vendor evidence to confirm that any sensitive data associated with credentials, including public keys, is identified as a critical asset.
5.1.d The assessor shall examine evidence to confirm that sensitive data associated with authentication credentials, including public keys, is identified as a critical asset.
Modified p. 71 → 65
Identify the documentation and evidence that contains the software vendor’s guidance on configuring additional authentication mechanisms for secure non-console access to the software.
R1 Identify the evidence obtained that details the software vendor’s guidance on the proper use and protection of user authentication credentials.
Removed p. 72
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all implemented authentication methods require unique identification.

5.2.b Where interfaces, such as APIs, allow for automated access to critical assets, the assessor shall examine vendor evidence and test the software to confirm that unique identification of different programs or systems accessing the critical assets is required (for example, through use of multiple public keys) and that guidance on configuring a unique credential for each program or system is included in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.
Modified p. 72 → 65
Indicate whether the software provides APIs or other interfaces to enable automated access to critical assets (yes/no).
R1 Indicate whether the software provides or otherwise facilitates automated API access to critical assets.
Removed p. 73
Identify the documentation and evidence that contains the software vendor’s guidance on configuring unique credentials for each program or system to which automated access to critical assets is provided via APIs or other interfaces.

If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that indicate that the authentication mechanisms that supply its identification parameters in this way are protected.
Modified p. 73 → 65
5.2.c Where identification is supplied across a non-console interface, the assessor shall test the software to confirm that authentication mechanisms are protected.
5.2.c Where identification is supplied across a non-console interface, the assessor shall test the software to confirm that authentication credentials are protected from attacks that attempt to intercept them in transit.
Modified p. 73 → 65
Indicate whether identification is supplied across a non-console interface (yes/no).
R1 Indicate whether any authentication credentials (user, API, etc.) are supplied across a non-console interface.
Removed p. 74
Describe what the assessor observed in the software vendor’s guidance that confirms that users are instructed not to share identification and authentication parameters between individuals or programs.

5.2.e The assessor shall examine vendor evidence, including source code of the software, to confirm that there are no additional methods for accessing critical assets.

Describe how and the extent to which source code was examined to confirm that the software provides no other methods for accessing critical assets.

Describe what the assessor observed in the documentation, evidence, and source code that indicates that the software provides no additional methods, other than those identified in 5.2.a.
Modified p. 74 → 64
Identify the documentation and evidence examined that contains the software vendor’s guidance on the use of identification and authentication parameters.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the software to securely use such authentication mechanisms.
Removed p. 75
In Place N/A Not in Place 5.3.a The assessor shall examine vendor evidence to confirm that all implemented authentication methods were evaluated to identify the details of known vulnerabilities or attack methods on the authentication method, and how the implementation mitigates against such attacks. The evidence must also illustrate that the implementation used in the software was considered. For example, a fingerprint may be uniquely identifiable to an individual, but the ability to spoof or otherwise bypass such technology can be highly dependent on the way the solution is implemented.
Removed p. 75
Describe what the assessor observed in the documentation and evidence that confirms that the implementation of each authentication method was evaluated to identify known vulnerabilities, attack methods, vectors, or patterns that might enable an attacker to compromise or, otherwise, circumvent the authentication method.

Describe what the assessor observed in the documentation and evidence that indicates that the protection mechanisms implemented to mitigate the probability and impact of potential attacks on the software’s authentication methods are appropriate for their intended purpose and that any residual risk has been reasonably justified.

Describe the methods used by the software vendor to evaluate the robustness of the implemented authentication methods and how they are consistent with industry-accepted methods.
Modified p. 75 → 66
5.3.b The assessor shall examine vendor evidence to confirm that implemented authentication methods are robust and that robustness of the authentication methods was evaluated using industry-accepted methods.
5.3.b The assessor shall examine evidence to confirm that the implemented authentication methods are robust, and that the robustness of the authentication methods was evaluated using industry-accepted methods.
Removed p. 76
Describe what the assessor observed in the documentation and evidence that indicates that the implemented authentication methods are reasonably sufficient to protect them from being forged, spoofed, leaked, guessed, or circumvented.

Describe what the assessor observed in the software test results that provides reasonable assurance that the authentication mechanisms are implemented correctly and do not expose vulnerabilities.
Modified p. 76 → 66
5.3.c The assessor shall test the software to confirm the authentication methods are implemented correctly and do not expose vulnerabilities.
5.3.c The assessor shall test the software to confirm that the authentication methods are implemented correctly and do not expose vulnerabilities.
Modified p. 76 → 66
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test, to confirm that all authentication methods implemented by the software are implemented correctly and do not contain or otherwise expose vulnerabilities.
Removed p. 77
5.4.b The assessor shall examine vendor evidence and test the software to identify what access is provided to critical assets and confirm that such access correlates with the vendor evidence. The test to confirm access is restricted should include attempts to access critical assets through user accounts, roles, or services which should not have the required privileges.

Describe any discrepancies found between the access provided to critical assets identified through the documentation and evidence reviews and the access to critical assets identified through the software testing performed in support of this test requirement.
Modified p. 77 → 67
In Place N/A Not in Place 5.4.a The assessor shall examine vendor evidence to confirm that the vendor has clearly identified and reasonably justified the required access for all critical assets.
In Place Not in Place N/A 5.4.a The assessor shall examine evidence to confirm that information is maintained that identifies and justifies the required access for all critical assets.
Modified p. 77 → 69
Describe what the assessor observed in the documentation and evidence that indicates that the access requirements and justification(s) for each critical asset are reasonable for the software’s intended function.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms the security properties are valid for all platforms where the software is intended to be deployed.
Removed p. 78
Describe what the assessor observed in the documentation, evidence, and software test results that indicates that security methods implemented to protect sensitive data during persistent storage properly address all defined protection requirements and identified attack scenarios.
Modified p. 78 → 68
Describe what the assessor observed in the documentation and evidence that indicates that protection requirements for all sensitive data stored by the software are defined.
R2 Describe what the assessor observed in the evidence obtained that confirms the implemented security controls address all the protection requirements identified in Test Requirement 6.1.a.
Modified p. 78 → 68
6.1.b The assessor shall examine vendor evidence and test the software to confirm that security methods implemented to protect all sensitive data during storage appropriately address all defined protection requirements and identified attack scenarios.
6.1.b The assessor shall examine evidence and test the software to confirm that security controls are implemented to protect sensitive data during storage and that they address all defined protection requirements and identified attack scenarios.
Modified p. 78 → 68
Note: The assessor should refer to Control Objective 1 to identify all critical assets and Control Objective 4 to identify all attack scenarios applicable to the software.
Note: The assessor should refer to evidence obtained in the testing of Control Objective 1.1 to determine all sensitive data retained by the software, and Control Objective 4.1 to identify all attack scenarios applicable to the software.
Removed p. 79
Based on the documentation, evidence, and the results of the software testing performed in support of this test requirement, indicate whether cryptography is used by the software for securing sensitive data during persistent storage (yes/no).

Describe what the assessor observed in the documentation, evidence, and the software test results that indicates that in all instances where cryptography is used for securing sensitive data during persistent storage, the cryptography implementation complies with Control Objective 7.
Modified p. 79 → 68
If “yes,” complete the remaining reporting instructions for this test requirement.
R2 Describe any other testing activities performed and/or findings for this test requirement.
Removed p. 80
Indicate whether the software uses index tokens for securing sensitive data during persistent storage (yes/no).

6.1.e Where protection methods rely on security properties of the execution environment, the assessor shall examine vendor evidence and test the software to confirm that these security properties are valid for all platforms which the software targets, and that they provide sufficient protection to the sensitive data.
Modified p. 80 → 69
Describe what the assessor observed in the documentation, evidence, and software test results that indicates that index tokens are generated in a way that ensures there is no correlation between the value and the sensitive data being referenced.
R2 If R1 is “Yes,” then describe how the index tokens are generated in a way that ensures there is no correlation between the token value and the sensitive data being referenced.
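The requirement above hinges on index tokens being statistically independent of the data they reference. One compliant pattern (sketched below with invented names; not template content) draws the token from a CSPRNG and keeps the mapping in a protected lookup table:

```python
import secrets

# Illustrative sketch only. The token is drawn from a CSPRNG, not derived
# from the sensitive value, so no correlation exists between the token and
# the data it references.
_vault = {}  # token -> sensitive value; would live in protected storage

def tokenize(sensitive_value: str) -> str:
    token = secrets.token_urlsafe(16)
    _vault[token] = sensitive_value
    return token

def detokenize(token: str) -> str:
    return _vault[token]
```

Tokenizing the same value twice yields two unrelated tokens, which is exactly the “no correlation” property this test requirement probes.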
Removed p. 81
Indicate whether any protection mechanisms implemented by the software to safeguard sensitive data rely on execution environment security properties (yes/no).

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that the security properties, upon which the protection mechanisms rely, exist for all platforms included in the software evaluation.

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that the protection mechanisms that rely on execution environment security properties are appropriate for the type of sensitive data they are to protect.
Removed p. 82
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether software protection mechanisms rely on third-party software security properties to safeguard sensitive data.

Indicate whether protection mechanisms implemented by the software to safeguard sensitive data rely on third-party software security properties (yes/no).

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that the protection mechanisms that rely on third-party software security properties are appropriate for the type of sensitive data they are to protect.
Modified p. 82 → 70
Describe what the assessor observed in the documentation and evidence that confirms that there are no unmitigated vulnerabilities in the third-party software, whose security properties are relied upon by the software’s protection methods.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms there are no unmitigated vulnerabilities in the third-party software.
Removed p. 83
Describe what the assessor observed in the documentation and evidence that confirms protection requirements are defined for all locations where the software transmits sensitive data.

6.2.b The assessor shall examine vendor evidence and test the software to confirm that for each of the ingress and egress methods that allow for transmission of sensitive data with confidentiality considerations outside of the physical execution environment, sensitive data is always encrypted with strong cryptography prior to transmission or is transmitted over an encrypted channel using strong cryptography.
Modified p. 83 → 71
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all transmissions of sensitive data outside of the physical execution environment are encrypted prior to transmission using strong cryptography or are transmitted over an encrypted channel that uses strong cryptography.
R1 Describe what the assessor observed in the evidence obtained that confirms that sensitive data transmitted outside of the execution environment is encrypted using strong cryptography.
Removed p. 84
Describe any additional software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.

Indicate whether the software relies upon any third-party or execution-environment features to ensure the security of transmitted sensitive data (yes/no).
Modified p. 84 → 71
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software relies upon third- party or execution environment features for securing sensitive data during transmission.
R1 Indicate whether the software relies on third-party software or features of the execution environment to protect sensitive data during transmission.
Modified p. 84 → 71
Identify the documentation and evidence that contains the software vendor’s guidance on the configuration and use of all third-party or execution-environment features to protect transmitted sensitive data.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the assessed software to use these features in a secure manner.
Removed p. 85
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether transport layer encryption is used to secure the transmission of sensitive data.

If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that confirms that all ingress and egress methods used to transmit sensitive data enforce secure versions of the TLS protocol with end-point authentication.
Modified p. 85 → 71
Indicate whether transport layer encryption (TLS) is used to secure the transmission of sensitive data (yes/no).
R1 Indicate whether transport layer encryption is relied upon to protect sensitive data during transmission.
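For context on what “secure versions of the TLS protocol with end-point authentication” can look like in practice, here is a minimal sketch using Python’s standard `ssl` module (an illustration, not an assessment procedure): `PROTOCOL_TLS_CLIENT` enables certificate and hostname verification by default, and `minimum_version` pins a protocol floor.

```python
import ssl

# Sketch: a client context that refuses anything below TLS 1.2 and performs
# end-point authentication (certificate and hostname verification).
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.minimum_version = ssl.TLSVersion.TLSv1_2

# PROTOCOL_TLS_CLIENT enables these by default; shown here for clarity.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True
```

A socket wrapped with `context.wrap_socket(sock, server_hostname=...)` would then refuse TLS 1.0/1.1 handshakes and unverified certificates.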
Removed p. 86
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether software methods that encrypt sensitive data for transmission allow for different types of cryptography or different levels of security to be used.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all forms of cryptography used for encrypting sensitive data transmissions are strong cryptography.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms strong cryptography is enforced at all times by the software during transmission.
Modified p. 86 → 72
Indicate whether the methods implemented by the software to encrypt sensitive data for transmission allow for the use of different types of cryptography or different levels of security (yes/no).
R1 Indicate whether the encryption methods implemented to protect sensitive data during transmission allow for the use of different types of cryptography or cryptography with different effective key strengths.
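The phrase “cryptography with different effective key strengths” refers to the comparable-strength equivalences published in NIST SP 800-57 Part 1. The sketch below (illustrative names and an illustrative subset of the published values, not part of the template) shows why mixing algorithms bounds the overall strength at the weakest one:

```python
# Approximate effective security strengths in bits, following the
# comparable-strength tables in NIST SP 800-57 Part 1 (illustrative subset).
EFFECTIVE_STRENGTH_BITS = {
    ("TDEA", 168): 112,   # three-key triple DES
    ("RSA", 2048): 112,
    ("RSA", 3072): 128,
    ("ECC", 256): 128,    # e.g., NIST P-256
    ("AES", 128): 128,
    ("AES", 256): 256,
}

def weakest_link(algorithms):
    # The effective strength of a combined design is bounded by its
    # weakest algorithm/key-size pair.
    return min(EFFECTIVE_STRENGTH_BITS[a] for a in algorithms)
```

For example, AES-256 data encryption wrapped by RSA-2048 key transport yields an effective strength of 112 bits, not 256.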
Removed p. 87
Describe what the assessor observed in the documentation, evidence, and software test results that indicate that all uses of cryptography for the purpose of securing critical assets complies with Control Objective 7.

6.3.b Where cryptographic methods provided by third-party software or aspects of the execution environment or platform on which the application is run are relied upon for the protection of sensitive data, the assessor shall examine vendor evidence and test the software to confirm that the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1 provides clear and sufficient detail for correctly configuring these methods during the installation, initialization, or first use of the software.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software relies upon third-party software, platforms, or libraries for cryptographic services.

If “yes,” identify the documentation and evidence that contains the software vendor’s guidance on the …
Modified p. 88 → 73
6.3.c Where asymmetric cryptography such as RSA or ECC is used for protecting the confidentiality of sensitive data, the assessor shall examine vendor evidence and test the software to confirm that private keys are not used for providing confidentiality protection to the data.
6.3.c Where asymmetric cryptography such as RSA or ECC is used for protecting the confidentiality of sensitive data, the assessor shall examine evidence and test the software to confirm that private keys are not used for providing confidentiality protection to the data.
Modified p. 88 → 75
If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that confirms private keys are not used to provide confidentiality protection for sensitive data.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms these implementations do not expose or otherwise contain vulnerabilities.
Removed p. 89
Identify each of the unapproved algorithms or modes of operation used to protect critical assets.

For each unapproved algorithm or mode of operation used to protect critical assets, describe how each is used with approved algorithms to ensure a cryptographic key strength is equivalent to that of the approved algorithms.
Modified p. 89 → 73
Indicate whether the software uses any unapproved cryptographic algorithms or modes of operation to protect critical assets (yes/no).
R1 Indicate whether the software relies on any cryptographic methods provided by third-party software or the execution environment to protect sensitive data.
Modified p. 89 → 74
In Place N/A Not in Place 7.1.a The assessor shall examine the vendor evidence to confirm that, where cryptography is relied upon (in whole or in part) for the security of the critical assets:
In Place Not in Place N/A 7.1.a The assessor shall examine evidence to determine how cryptography is used for the protection of critical assets and to confirm that:
Modified p. 89 → 74
Use of any unapproved algorithms must be in conjunction with approved algorithms and implemented in a manner that does not reduce the equivalent cryptographic key strength provided by the approved algorithms.
The implementation of non-standard algorithms does not reduce the equivalent cryptographic key strength provided by the industry-standard algorithms.
Modified p. 89 → 74
Identify the industry-accepted cryptographic algorithms and modes of operation that are used in the software.
• Industry-standard cryptographic algorithms and modes of operation are used.
Modified p. 89 → 75
Industry-accepted cryptographic algorithms and modes of operation are used in the software as the primary means for protecting critical assets; and
Only documented cryptographic algorithms and modes of operation are used in the software.
Removed p. 90
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the cryptographic algorithms and modes of operation in use are implemented correctly (that is, fit-for-purpose).
Modified p. 90 → 74
Describe the mechanisms implemented to protect the cryptographic algorithms and modes of operation used by the software against common cryptographic attacks.
R1 Identify the evidence obtained that details the cryptographic algorithms and cipher modes relied upon by the software for the protection of sensitive data.
Modified p. 90 → 75
• Only documented cryptographic algorithms and modes are used in the software and are implemented correctly, and
R1 Identify the evidence obtained that confirms that only documented cryptographic algorithms and cipher modes are relied upon by the software for the protection of sensitive data.
Modified p. 90 → 75
Protections are incorporated to prevent common cryptographic attacks such as the use of the software as a decryption oracle, brute-force or dictionary attacks against the input domain of the sensitive data, the re-use of security parameters such as IVs, or the re-encryption of multiple datasets using linearly applied key values (such as XOR’d key values in stream ciphers or one-time pads).
Protection methods are implemented to mitigate common attacks on cryptographic implementations (for example, the use of the software as a decryption oracle, brute-force or dictionary attacks against the input domain of the sensitive data, the re-use of security parameters such as IVs, or the re-encryption of multiple datasets using linearly applied key values, such as XOR’d key values in stream ciphers or one-time pads).
Modified p. 90 → 76
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that only documented cryptographic algorithms and modes of operation are used in the software.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that only industry-standard padding methods are used.
Removed p. 91
For each of the cryptographic algorithms or modes of operation that require a unique value per encryption operation or session, describe how the implementation of those algorithms and modes of operation protect against common cryptographic attacks.
Modified p. 91 → 75
Indicate whether any of the implemented cryptographic algorithms and supporting modes of operation require a unique value per encryption operation or session (yes/no).
R1 Indicate whether any of the cryptographic implementations relied upon for the protection of sensitive data require a unique value per encryption operation or session.
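Cipher modes such as GCM or CTR are typical examples of implementations that “require a unique value per encryption operation”: reusing a nonce under the same key breaks confidentiality. A minimal sketch of per-operation nonce generation from a CSPRNG (illustrative only; the 12-byte size is the conventional GCM nonce length, not a template mandate):

```python
import secrets

def fresh_nonce(size: int = 12) -> bytes:
    # One new random value per encryption operation: never reused under the
    # same key, and never derived from the plaintext being protected.
    return secrets.token_bytes(size)

# Sanity check of the per-operation uniqueness property.
nonces = {fresh_nonce() for _ in range(10_000)}
assert len(nonces) == 10_000  # 12 random bytes: collisions vanishingly rare
```

An assessor testing this requirement would look for exactly this property: no fixed, counter-shared, or plaintext-derived values appearing across operations or software instances.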
Modified p. 91 → 76
7.1.d Where padding is used prior to/during encryption, the assessor shall examine vendor evidence and test the software to confirm that the encryption operation always incorporates an industry-accepted standard padding method.
7.1.e Where hash functions are used to protect sensitive data, the assessor shall examine evidence and test the software to confirm that:
Removed p. 92
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that whenever an encryption operation uses padding, it always uses an industry-accepted standard padding method.

Identify each of the industry-accepted padding methods used by the software.

7.1.e Where hash functions are used within the software, the assessor shall:

• Examine publicly available literature and research to identify vulnerable algorithms that can be exploited, and

Identify each of the approved, collision-resistant hash algorithms and methods used by the software for the protection of sensitive data.
Modified p. 92 → 76
Test the software to confirm that only approved, collision-resistant hash algorithms and methods are used with a salt value of appropriate strength, generated using a secure random number generator.
Only approved, collision-resistant hash algorithms and methods are used for this purpose, and
Modified p. 92 → 76
Indicate whether the software uses any hash functions for the protection of sensitive data (yes/no).
R1 Indicate whether the software relies upon hash functions for the protection of sensitive data.
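As an illustrative aside (not part of the template), the construction 7.1.e describes can be sketched as a collision-resistant hash over a salt of appropriate strength drawn from a secure random number generator. Helper name and 128-bit salt size are assumptions; SHA-256 stands in for whichever approved algorithm the software actually uses:

```python
import hashlib
import secrets

def salted_hash(value: bytes) -> tuple:
    """Protect a sensitive value with a salted, collision-resistant hash.
    The salt comes from a CSPRNG so identical inputs produce distinct
    stored digests."""
    salt = secrets.token_bytes(16)  # 128-bit salt from a secure RNG
    return salt, hashlib.sha256(salt + value).digest()
```

Storing the salt alongside the digest allows later verification while defeating precomputed-table attacks on the hashed value.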
Removed p. 94
In Place N/A Not in Place
7.2.a The assessor shall examine vendor evidence and test the software to confirm that:

• All cryptographic keys that are used for providing security to critical assets (including both confidentiality and authenticity), as well as for providing other security services to the software (such as authentication of end-point or software updates), have a unique purpose. For example, no key may be used for both encryption and authentication operations.

Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all keys have defined generation methods, and no secret or private cryptographic keys are relied upon for the security of critical assets that are shared between software instances, except when a common secret or private key is used to secure …
Modified p. 94 → 78
• All keys have defined generation methods, and no secret or private cryptographic keys relied upon for security of critical assets are shared between software instances, except when a common secret or private key is used for securing the storage of other cryptographic keys that are generated during the installation, initialization, or first use of the software (e.g., white-box cryptography).
• All keys have defined generation methods, and no secret or private cryptographic keys relied upon for security of critical assets are shared between software instances, except when a common secret or private key is used for securing the storage of other cryptographic keys that are generated during the installation, initialization, or first use of the software (for example, white-box cryptography).
Modified p. 94 → 78
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all cryptographic keys that provide security to critical assets or other security services to the software have a unique purpose.
R1 Describe what the assessor observed in the evidence obtained that confirms that all cryptographic keys that provide security to critical assets or other security services to the software have a unique purpose.
Removed p. 95
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all cryptographic keys have an equivalent bit strength of at least 128 bits, in accordance with industry standards.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all keys have a defined crypto-period aligned with industry standards, and that methods are implemented to retire and/or update each key at the end of the defined crypto-period.

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that all key-generation functions implement one-way functions or other irreversible key-generation processes, and that no reversible key calculation modes are used to directly create new keys from an existing key.
Modified p. 95 → 78
• All keys have a defined crypto-period aligned with industry standards, and methods are implemented to retire and/or update each key at the end of the defined crypto-period.
• All keys have a defined cryptoperiod aligned with industry standards, and methods are implemented to retire and/or update each key at the end of the defined cryptoperiod.
Modified p. 95 → 79
• The integrity and confidentiality of all secret and private cryptographic keys managed by the software are protected when stored (e.g., encrypted with a key-encrypting key that is at least as strong as the data-encrypting key and is stored separately from the data-encrypting key, or as at least two full-length key components or key shares, in accordance with an industry-accepted method).
• The integrity and confidentiality of all secret and private cryptographic keys managed by the software are protected when stored (for example, encrypted with a key-encrypting key that is at least as strong as the data-encrypting key and is stored separately from the data-encrypting key, or as at least two full-length key components or key shares, in accordance with an industry-accepted method).
Modified p. 95 → 79
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all keys have a defined generation or injection process, and that the process ensures sufficient entropy for the key.
R6 Describe what the assessor observed in the evidence obtained that confirms that all keys have a defined generation or injection process, and that the process ensures sufficient entropy for the key.
Modified p. 95 → 80
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the integrity and confidentiality of all secret and private cryptographic keys managed by the software are protected when stored.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that the authenticity of these public keys is preserved.
Removed p. 96
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software uses cryptography to protect cryptographic keys.
Modified p. 96 → 79
Identify the documentation and evidence examined in support of this test requirement.
R8 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 96 → 79
Indicate whether the software uses cryptography to protect any cryptographic keys (yes/no).
R1 Indicate whether the software relies on cryptography to protect other cryptographic keys during storage or transmission.
Modified p. 96 → 79
If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that confirms that all cryptographic keys used to protect other cryptographic keys provide an effective key strength equal to or greater than the keys they protect.
R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that all cryptographic keys used to protect other keys possess an equal or greater effective key strength as the key(s) they protect.
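As an illustrative aside (not part of the template), the "equal or greater effective key strength" comparison in R2 can be sketched against the approximate strength equivalences published in NIST SP 800-57. The algorithm names and lookup table below are illustrative, not drawn from the ROV:

```python
# Approximate effective strengths (bits) per NIST SP 800-57 comparisons.
EFFECTIVE_STRENGTH = {
    "AES-128": 128,
    "AES-256": 256,
    "TDEA-3KEY": 112,
    "RSA-2048": 112,
    "RSA-3072": 128,
}

def kek_is_strong_enough(kek_alg: str, protected_alg: str) -> bool:
    """A key-encrypting key must be at least as strong as the key it
    protects; otherwise the KEK becomes the weakest link."""
    return EFFECTIVE_STRENGTH[kek_alg] >= EFFECTIVE_STRENGTH[protected_alg]
```

Note that nominal key length is not effective strength: RSA-2048 (about 112 bits effective) would not qualify to wrap an AES-128 key under this rule.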
Removed p. 97
• Effective and expiration date

• Key length

Identify the documentation and evidence examined in support of this test requirement.

Identify the documentation and evidence that contains the software vendor’s inventory of all cryptographic keys used by the software.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the authenticity of all public keys used in the software is maintained.
Modified p. 97 → 80
Indicate whether the software uses public keys (yes/no).
R1 Indicate whether the software relies on public keys for the protection of sensitive data.
Modified p. 97 → 80
7.2.d Where public or white-box keys are not unique per software instantiation, the assessor shall examine vendor evidence and test the software to confirm that methods and procedures to revoke and/or replace such keys (or key pairs) exist.
7.2.e Where public or white-box keys are not unique per software instantiation, the assessor shall examine evidence to confirm that methods and procedures to revoke and/or replace such keys (or key pairs) exist.
Removed p. 98
If “yes,” for each public or white-box key used by the software that is not unique to each software instance, describe how the software (or the software vendor) revokes and/or replaces such keys (or key pairs).

7.2.e Where the software relies upon external files or other data elements for key material (such as for public TLS certificates), the assessor shall examine vendor evidence to confirm that clear and sufficient guidance on how to install such key material in accordance with this standard (including details noting any security requirements for such key material) is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

Describe any additional software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.

Indicate whether the software relies upon external files or other data elements for cryptographic key materials (yes/no).
Modified p. 98 → 80
Indicate whether any public or white-box keys are used by the software that are not unique to each software instance (yes/no).
R1 Indicate whether the software relies upon public or white-box keys that are not unique to each software instance.
Modified p. 98 → 80
Describe what the assessor observed in the documentation, evidence, and software test results that indicates that the software relies upon external files or other data elements for cryptographic materials.
R1 Indicate whether the software relies upon external files or other external sources for cryptographic key material.
Removed p. 99
7.2.f Where public keys are used, the assessor shall examine vendor evidence and test the software to confirm that public keys manually loaded or used as root keys are installed and stored in a way that provides dual control (to a level that is feasible on the execution environment), preventing a single user from replacing a key to facilitate a man-in-the-middle attack, easy decryption of stored data, etc. Where complete dual control is not feasible (e.g., due to a limitation of the execution environment), the assessor shall confirm that the methods implemented are appropriate to protect the public keys.

Describe what the assessor observed in the documentation, evidence, and software test results that suggests the software uses manually loaded public keys or uses public keys as root keys.

If “no,” skip to 7.2.g. If “yes,” complete the remaining reporting instructions for this test requirement.

Describe what the assessor observed in the documentation, …
Modified p. 99 → 81
Indicate whether the software uses any manually loaded public keys or uses public keys as root keys (yes/no).
R1 Indicate whether the software uses or relies upon public keys that are manually loaded or used as root keys.
Removed p. 100
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all secret and private keys are managed in a way that ensures split knowledge over each key.

Describe any circumstances that make the absolute split knowledge of secret or private keys infeasible. Also describe the methods implemented to protect the secret and private keys in the absence of split knowledge.
Removed p. 101
Describe what the assessor observed in the documentation, evidence, and software test results that confirms the software vendor has defined crypto-periods for each of the cryptographic keys used for the protection of sensitive data.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that methods are implemented to “roll” cryptographic keys used for the protection of sensitive data at the end of their defined crypto-period.

In Place N/A Not in Place
7.3.a The assessor shall examine vendor evidence to confirm that all random number generation methods implemented in the software:

Describe any additional software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.
Modified p. 102 → 81
• Use at least 128 bits of entropy prior to the output of any random numbers from the random number generator.
• Use at least 128 bits of entropy prior to the output of any random numbers.
Modified p. 102 → 81
Describe what the assessor observed in the documentation and evidence that confirms all random number generation methods implemented use at least 128 bits of entropy prior to the output of any random numbers from the random number generator.
R2 Describe what the assessor observed in the evidence obtained that confirms that all random number generation methods implemented use at least 128 bits of entropy prior to the output of any random numbers from the random number generator.
Modified p. 102 → 81
Describe what the assessor observed in the documentation and evidence that confirms that sufficient entropy (at least 128 bits) is always provided or produced upon start-up or entry of other predictable states of the system.
R3 Describe what the assessor observed in the evidence obtained that confirms that sufficient entropy (at least 128 bits) is always provided or produced upon start-up or entry of other predictable states of the system.
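As an illustrative aside (not part of the template), the "at least 128 bits of entropy before any output" property can be sketched as a hash-based DRBG that refuses to generate until seeded. This is a simplified sketch under assumed names, not the construction any particular software uses (real implementations follow designs such as NIST SP 800-90A):

```python
import hashlib
import os

class SketchDRBG:
    """Hash-based DRBG sketch: refuses to emit output until it has been
    seeded with at least 16 bytes (128 bits) of OS entropy."""

    def __init__(self):
        self.state = b""

    def reseed(self, n_bytes: int = 32):
        # Fold fresh OS entropy into the internal state.
        self.state = hashlib.sha256(self.state + os.urandom(n_bytes)).digest()

    def random_bytes(self, n: int) -> bytes:
        if len(self.state) < 16:
            raise RuntimeError("insufficient entropy: reseed first")
        out = b""
        while len(out) < n:
            # Ratchet the state forward for each output block.
            self.state = hashlib.sha256(self.state + b"gen").digest()
            out += self.state
        return out[:n]
```

The assessor-facing point is the guard in `random_bytes`: output is structurally impossible before the entropy requirement is met, including after a restart into a predictable state.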
Modified p. 102 → 82
7.3.b Where the vendor is relying upon previous assessment of the random number generator, or source of initial entropy, the assessor shall examine the approval records of the previous assessment and test the software to confirm that this scheme and specific approval include the correct areas of the software in the scope of its assessment, and that the vendor claims do not exceed the scope of the evaluation or approval of that software. For example, some cryptographic implementations approved under …
7.3.c Where the software vendor relies on a previous assessment of the random number generator or source of initial entropy, the assessor shall examine evidence (such as the approval records of the previous assessment) to confirm that this scheme and specific approval include the correct areas of the software in the scope of its assessment, and that the vendor claims do not exceed the scope of the evaluation or approval of that software. For example, some cryptographic implementations approved under …
Modified p. 102 → 82
Indicate whether the software relies upon a previous assessment of a random number generator or source of initial entropy to meet this control objective (yes/no).
R1 Indicate whether the software relies upon any random number generators that have been previously assessed to ensure they comply with industry-accepted standards for random number generation.
Removed p. 103
Describe what the assessor observed in the documentation and evidence that confirms that all vendor claims pertaining to the random number generation function(s) used do not exceed the scope of evaluation or approval of those random number generation functions.

7.3.c Where third-party software, platforms, or libraries are used for all or part of the random number generation process, the assessor shall examine current publicly available literature to confirm that there are no publicly known vulnerabilities or concerns with the software that may compromise its use for generating random values in the software under test.

Where problems are known, but have been mitigated by the software vendor, the assessor shall examine vendor evidence and test the software to confirm that the vulnerabilities have been sufficiently mitigated.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software relies on third-party software, platforms, or libraries for random number …
Modified p. 103 → 82
The assessor shall test the software to confirm that third-party software, platforms, or libraries are correctly integrated, implemented, and configured.
R1 Indicate whether the software relies upon third-party software, platforms, or libraries for all or part of the random number generation process.
Removed p. 104
Describe how the documentation, evidence, and software test results confirm that there are no known vulnerabilities or other weaknesses present in the third-party software, platforms, or libraries used as part of the random number generation process that would compromise the software’s ability to generate sufficiently random values.

Describe any vulnerabilities that exist in the third-party software, platforms, or libraries that are used by the software as part of the random number generation process. Also describe the methods implemented in the software to mitigate those vulnerabilities.

7.3.d The assessor shall examine vendor evidence and test the software to confirm that methods have been implemented to prevent or detect (and respond to) the interception, or “hooking,” of random number calls that are serviced from third-party software, or the platform on which the software is executed.

Summarize how the software mitigates the interception or “hooking” of random number calls to/from the third-party software, platforms, or libraries providing …
Removed p. 105
Describe each of the software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test. Also describe how the assessor obtained at least 128MB of data output from each random number generator implemented by the software.

Describe what the assessor observed in the documentation, evidence, and software test results that indicate that random number values cannot be statistically correlated.

Describe how the assessor confirmed that the methods used to generate the data output from the implemented random number generation function(s) ensure that the data is produced as it would be produced by the software during normal operation.
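As an illustrative aside (not part of the template), the statistical-correlation check described above is typically a battery of randomness tests over a large sample (such as the 128 MB mentioned). A minimal Python sketch of one such test, the monobit frequency check; the helper name is assumed and real assessments use full suites such as NIST SP 800-22:

```python
import secrets

def monobit_fraction(data: bytes) -> float:
    """Fraction of 1-bits in the sample; output from a sound random
    number generator should sit very close to 0.5."""
    ones = sum(bin(byte).count("1") for byte in data)
    return ones / (len(data) * 8)
```

A strongly biased fraction (far from 0.5) is a quick signal that the generator's output is statistically correlated and merits deeper testing.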
Removed p. 106
Note: If sufficient entropy is not provided, then 7.4.c must be completed.
Modified p. 106 → 83
In Place N/A Not in Place
7.4.a The assessor shall examine vendor evidence and test the software to confirm that the methods used for the generation of all cryptographic keys and other material (such as IVs, “k” values for digital signatures, etc.) have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.
In Place Not in Place N/A
7.4.a The assessor shall examine evidence and test the software to confirm that the methods used for the generation of all cryptographic keys and other material (such as IVs, “k” values for digital signatures, and so on) have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.
Modified p. 106 → 83
Describe what the assessor observed in the documentation, evidence, and software test results that indicate that the methods used to generate cryptographic keys and other key material have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.
R1 Describe what the assessor observed in the evidence obtained that confirms that all methods used to generate cryptographic keys and other material have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.
Removed p. 107
• Any methods used for generating keys directly from a password/passphrase enforce an input domain that is able to provide sufficient entropy, such that the total possible inputs are at least equal to that of the equivalent bit strength of the key being generated (e.g., a 32-hex-digit input field for an AES128 key).

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the methods used to generate keys directly from a password/passphrase provide sufficient entropy, such that the total possible inputs are at least equal to that of the equivalent bit strength of the key being generated.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that passwords/passphrases are passed through an industry-standard key-derivation function that provides a work factor of at least 10,000 for any attempt to brute-force the password/passphrase value.
Modified p. 107 → 84
The passphrase is passed through an industry-standard key-derivation function, such as PBKDF2 or bcrypt, which extends the work factor for any attempt to brute-force the passphrase value. The assessor shall confirm that a work factor of at least 10,000 is applied to any such implementation.
Passphrases are passed through an industry-standard key-derivation function, such as PBKDF2 or bcrypt, which extends the work factor for any attempt to brute-force a passphrase value. The assessor shall confirm that a work factor of at least 10,000 is applied to any such implementation.
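As an illustrative aside (not part of the template), PBKDF2 with the 10,000-iteration floor named above can be sketched directly from the Python standard library. The helper name, salt handling, and 16-byte output length are assumptions for the example:

```python
import hashlib

def derive_key(passphrase: str, salt: bytes, iterations: int = 10_000) -> bytes:
    """Stretch a passphrase into a 128-bit key via PBKDF2-HMAC-SHA256.
    The iteration count is the work factor, held at or above the
    10,000 minimum the test requirement names."""
    if iterations < 10_000:
        raise ValueError("work factor below required minimum")
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               iterations, dklen=16)
```

The same passphrase and salt always reproduce the key, while every brute-force guess must pay the full iteration cost.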
Modified p. 107 → 84
Indicate whether any cryptographic keys used by the software are generated through processes that require direct user interaction (yes/no).
R1 Indicate whether the software uses or relies upon cryptographic keys that are generated through a process that requires direct user interaction.
Removed p. 108
• Clear and sufficient guidance is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1 that any passphrase used must be:
  o Randomly generated itself, using a valid and secure random process: an online random number generator must not be used for this purpose.
  o Never implemented by a single person, such that one person has an advantage in recovering the clear key value, violating the requirements for split knowledge.

Identify the documentation and evidence that contains the software vendor’s guidance on using passwords/passphrases that do not violate requirements for split knowledge.
Modified p. 108 → 84
Identify the documentation and evidence that contains the software vendor’s guidance on using passwords/passphrases that are randomly generated using a valid and secure random process.
R4 Identify the evidence obtained that details the software vendor’s guidance on generating keys this way in a secure manner.
Removed p. 109
Indicate whether any instances were found where third-party software, platforms, or libraries used as part of the random number generation process could not produce sufficient entropy (yes/no).

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that appropriate mitigations are implemented to compensate for the lack of sufficient entropy.

Identify the documentation and evidence that contains the software vendor’s guidance on securely configuring and using third-party software, platforms, or libraries that are part of the random number generation process.

Control Objective and Test Requirements
Reporting Instructions
Reporting Details: Assessor’s Response
Summary of Assessment Findings (check one)

Control Objective 8: Activity Tracking
All software activity involving critical assets is tracked.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all access attempts and use of critical assets are tracked and traced to a unique identification for the person, system, or entity accessing …
Modified p. 110 → 86
In Place N/A Not in Place
8.1.a The assessor shall examine vendor evidence and test the software to confirm that all access attempts and usage of critical assets are tracked and traceable to a unique identification for the person, system, or entity performing the access.
In Place Not in Place N/A
8.2.a The assessor shall examine evidence and test the software to confirm that the tracking method(s) implemented capture specific activity performed, including:
Removed p. 111
For each of the activities identified in this test requirement, describe how the software captures each activity in its activity logs or other activity tracking mechanism(s).
Modified p. 111 → 86
In Place N/A Not in Place
8.2.a The assessor shall examine vendor evidence and test the software to confirm that the tracking method(s) implemented capture specific activity performed, including:
8.2.b The assessor shall examine evidence and test the software to confirm that the tracking method(s) implemented provide the following:
Modified p. 111 → 86
• Enablement of any privileged modes of operation
• Enablement of any privileged modes of operation.
Modified p. 111 → 86
• Disabling of encryption of sensitive data
• Disabling of encryption of sensitive data.
Modified p. 111 → 86
• Decryption of sensitive data
• Decryption of sensitive data.
Modified p. 111 → 86
• Exporting of sensitive data to other systems or processes
• Exporting of sensitive data to other systems or processes.
Modified p. 111 → 86
• Failed authentication attempts
• Failed authentication attempts.
Modified p. 111 → 86
• Disabling or deleting a security control or altering security functionality

Identify the documentation and evidence examined in support of this test requirement.
• Disabling or deleting a security control or altering security functions.
Removed p. 112
Describe how the information identified in this test requirement is presented in the software activity logs or other activity tracking mechanism(s).

8.2.c The assessor shall test the software to confirm that sensitive data is not directly recorded in the tracking data.
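As an illustrative aside (not part of the template), 8.2.c's rule that sensitive data never lands in tracking data is commonly enforced by scrubbing log messages before they are written. The pattern, helper name, and masking style below are assumptions for the example, not the template's prescribed method:

```python
import re

# Matches contiguous runs of 13-19 digits that look like a PAN.
PAN_PATTERN = re.compile(r"\b\d{13,19}\b")

def scrub(message: str) -> str:
    """Mask all but the last four digits of any PAN-like number
    before the message reaches the activity tracking log."""
    return PAN_PATTERN.sub(
        lambda m: "*" * (len(m.group()) - 4) + m.group()[-4:], message
    )
```

An assessor would test the inverse property: feeding known sensitive values through every logged code path and confirming none appear verbatim in the tracking output.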
Modified p. 112 → 86
• A unique identification for the person, system, or entity performing the access
• A unique identification for the individual, system, or entity accessing or using critical assets.
Modified p. 112 → 86
• A timestamp for each tracked event
• A timestamp for each tracked event.
Modified p. 112 → 86
• Details on what critical asset has been accessed

Identify the documentation and evidence examined in support of this test requirement.
• Details on what critical asset has been accessed.
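As an illustrative aside (not part of the template), the fields 8.2 enumerates (unique identifier, timestamp, activity, and asset) can be sketched as one structured log record. Field names and the JSON-lines format are assumptions; the template does not mandate a record layout:

```python
import json
import time

def tracking_record(actor_id: str, event: str, asset: str) -> str:
    """One log line carrying the fields 8.2 calls for: a unique actor
    identifier, a UTC timestamp, the event performed, and the critical
    asset that was accessed."""
    return json.dumps({
        "actor": actor_id,
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "event": event,
        "asset": asset,
    })
```

Structured records make the assessor's traceability check mechanical: every entry either carries all required fields or it does not.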
Modified p. 112 → 88
Describe what the assessor observed in any documentation, evidence, and software test results that confirms that sensitive data is not recorded in activity logs or in the output of other activity tracking mechanisms.
R1 Describe what the assessor observed in the evidence obtained that confirms that the integrity of activity tracking data and records is always maintained.
Removed p. 113
In Place N/A Not in Place
8.3.a Where the activity records are managed by the software, including only temporarily before being passed to other systems, the assessor shall examine vendor evidence and test the software to confirm that the protection methods are implemented to protect completeness, accuracy, and integrity of the activity records.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software manages activity records.

Indicate whether activity records are managed by the software (yes/no).

Describe how the protection methods for ensuring the integrity of activity records mitigate the risk of unauthorized modification of the activity records.
Modified p. 113 → 87
Describe the protection methods implemented by the software to protect the integrity of activity records.
R2 If R1 is “Yes,” then describe the methods implemented by the software to ensure the completeness, accuracy, and integrity of its activity tracking records.
Removed p. 114
The assessor shall test the software to confirm methods are implemented to secure the authenticity of the tracking data during transmission to the log storage system, and to confirm that this protection meets the requirements of this standard (for example, authenticity parameters must be applied using strong cryptography) and that any account or authentication parameters used for access to an external logging system are protected.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software uses or supports the use of external systems for storing or maintaining activity tracking data.

Describe the methods implemented by the software to secure the authenticity of activity tracking data during transmission to third-party or external activity tracking or log storage system(s).
Modified p. 114 → 87
Indicate whether the software uses or supports the use of other systems for storing or maintaining activity tracking data (yes/no).
R1 Indicate whether the software relies upon or supports the use of external and/or third-party activity tracking mechanisms.
Modified p. 114 → 87
Identify the documentation and evidence that contains the software vendor’s guidance on how to securely configure the integration of the software with third- party or external activity tracking and/or log storage system(s).
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the software to use these activity tracking mechanisms in a secure manner.
Removed p. 115
In Place N/A Not in Place
8.4.a The assessor shall examine vendor evidence and test the software to confirm that failure of the activity tracking system does not violate the integrity of existing records. The assessor shall explicitly confirm that:

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the software does not overwrite any tracking data upon a restart of the software, and each new start only appends to existing datasets or creates a new tracking dataset.

Where unique dataset names are relied upon by the software for maintaining the integrity between execution instances, describe the methods implemented in the software to prevent other software (or instance of the same software) from overwriting or rendering invalid any existing data sets.

Describe any conditions that exist that make it difficult for the software to apply file privileges to assist with maintaining the integrity of the activity …
Modified p. 115 → 89
Where possible the software applies suitable file privileges to assist with maintaining the integrity of the tracking dataset (such as applying an append only access control to a dataset once created). Where the software does not apply such controls, the assessor shall confirm reasonable justification exists describing why this is the case, why the behavior is sufficient, and what additional mitigations are applied to maintain the integrity of the tracking data.
The software applies, where possible, suitable file privileges to assist with maintaining the integrity of the tracking dataset (such as applying an append-only access control to a dataset once created). Where the software does not apply such controls, the assessor shall confirm reasonable justification exists describing why this is the case, why the behavior is sufficient, and what additional mitigations are applied to maintain the integrity of the tracking data.
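As an illustrative aside (not part of the template), the append-only behavior described above can be sketched with POSIX open flags in Python. This shows the "each start only appends" property, not the OS-level access control (such as an append-only file attribute) the requirement also contemplates; the helper name is assumed:

```python
import os

def append_record(path: str, record: str) -> None:
    """Append one tracking record. O_APPEND positions every write at
    end-of-file, so a restart can only add entries; it can never seek
    back over and overwrite existing ones."""
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o600)
    try:
        os.write(fd, (record + "\n").encode())
    finally:
        os.close(fd)
```

The restrictive 0o600 mode is a second, complementary control: only the owning account can read or alter the dataset at all.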
Removed p. 116
Describe what the assessor observed in the documentation, evidence, source code, and software test results that confirms whether attempts to circumvent or overwrite activity tracking mechanisms and data are possible.

For each of the software tests that could not be performed or do not produce the expected result, describe the factors that prevent such tests from producing results as expected and the additional protections implemented to protect the integrity of activity tracking records.
Modified p. 116 → 89
• Preventing the creation of new dataset entries by preventing further writing to the media on which the dataset is located (e.g., by using media that has insufficient available space).
• Preventing the creation of new dataset entries by preventing further writing to the media on which the dataset is located (for example, by using media that has insufficient available space).
Modified p. 116 → 89
Describe each of the software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.
R1 Describe each of the assessment activities performed, including the tool(s) or method(s) used and the scope of each test, to confirm that the integrity of activity tracking records is always maintained.
Modified p. 116 → 89
Indicate whether any of the tests specified in this test requirement could not be performed or did not produce the expected result (yes/no).
R2 Indicate whether any of the tests specified in this test requirement could not be performed.
Removed p. 117
In Place N/A Not in Place 9.1.a The assessor shall examine vendor evidence and test the software to confirm that, where possible, the software implements a method to validate the integrity of its own executable and any configuration options, files, and datasets that it relies upon for operation (such that unauthorized, post-deployment changes can be detected).

Indicate whether the execution environment or other factors prevent the software from validating the integrity of its own files (yes/no).

Describe how and to what extent the assessor attempted to identify other methods for validating the authenticity of software executables, configuration files, and datasets.
Modified p. 117 → 91
Where the execution environment prevents this, the assessor shall examine vendor evidence and current publicly available literature on the platform and associated technologies to confirm that there are indeed no methods for validating authenticity. The assessor shall then test the software to confirm controls are implemented to minimize the associated risk.
Where the execution environment prevents this, the assessor shall examine evidence (including publicly available literature on the platform and associated technologies) to confirm that there are indeed no methods for validating authenticity, and that additional security controls are implemented to minimize the associated risk.
Modified p. 117 → 91
Identify the documentation and evidence examined for this test requirement.
R4 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 117 → 91
Describe the method(s) implemented by the software to validate the integrity and authenticity of its own files.
R1 Describe the methods that are implemented or relied upon to validate the integrity of the software’s execution and configuration files.
Removed p. 118
Describe the software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.

Describe the software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.

Describe the frequency with which integrity values used by the software and dataset(s) upon which the software relies for secure operation are checked.

Describe the actions the software takes upon the failure of such integrity checks.

Describe what the assessor observed in the documentation, evidence, and software testing results that confirms whether the software uses cryptographic primitives to detect anomalies.
Modified p. 118 → 91
9.1.b The assessor shall examine vendor evidence and test the software to confirm that integrity values used by the software and dataset(s) upon which it relies for secure operation are checked upon software execution, and at least every 36 hours thereafter (if the software continues execution during that time period). The assessor shall confirm what action the software takes upon failure of these checks and confirm that the processing of sensitive data is halted until this problem is remediated.
9.1.b The assessor shall examine evidence and test the software to confirm that integrity values used by the software and dataset(s) upon which it relies for secure operation are checked upon software execution, and at least every 36 hours thereafter (if the software continues execution during that time period).
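The 9.1.a/9.1.b pattern (recorded integrity values checked at execution, with processing halted on mismatch) amounts to a digest manifest. A minimal sketch, assuming SHA-256 manifests rather than any vendor-specific mechanism; a running implementation would also re-run `verify_manifest` on a timer at least every 36 hours.

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    # Stream the file so large executables and datasets hash in constant memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_manifest(manifest):
    # manifest maps file path -> expected SHA-256; a non-empty result means
    # an unauthorized post-deployment change was detected, and the software
    # should halt processing of sensitive data until remediated.
    return [p for p, want in manifest.items() if sha256_of(p) != want]

cfg = os.path.join(tempfile.mkdtemp(), "config.ini")
with open(cfg, "w") as f:
    f.write("timeout=30\n")
manifest = {cfg: sha256_of(cfg)}
print(verify_manifest(manifest))   # [] -> intact
with open(cfg, "a") as f:
    f.write("debug=true\n")        # simulated unauthorized change
print(verify_manifest(manifest))   # [cfg] -> check fails
```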
Modified p. 118 → 91
9.1.c Where cryptographic primitives are used by any anomaly-detection methods, the assessor shall examine vendor evidence and test the software to confirm that cryptographic primitives are protected.
9.1.c Where cryptographic primitives are used by any anomaly-detection methods, the assessor shall examine evidence and test the software to confirm that the cryptographic primitives are protected.
Removed p. 119
If “yes,” describe what the assessor observed in the documentation, evidence, and software testing results that confirms that protection mechanisms are implemented to protect cryptographic primitives, and that those protections are appropriate for their intended purpose.

Describe what the assessor observed in the documentation and evidence that confirms whether stored values are used by the software to detect anomalies.

Indicate whether stored values are used by the software to detect anomalies (yes/no).

If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that confirms that protection mechanisms are implemented to protect stored values, and that the protection mechanisms are appropriate for their intended purpose.
Modified p. 119 → 92
9.1.d Where stored values are used by any anomaly-detection methods, the assessor shall examine vendor evidence and test the software to confirm that these values are considered sensitive data and protected accordingly.
9.1.d Where stored values are used by any anomaly-detection methods, the assessor shall examine evidence and test the software to confirm that these values are considered sensitive data and are protected accordingly.
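Why 9.1.c/9.1.d treat these primitives and stored values as sensitive is easiest to see with a keyed check value. A hedged sketch: the `os.urandom` key below is a stand-in for whatever protected key storage the vendor actually uses.

```python
import hashlib
import hmac
import os

# Assumption: in a real deployment the key lives in protected storage,
# separate from the dataset it protects; os.urandom stands in for that here.
KEY = os.urandom(32)

def check_value(record):
    # A keyed HMAC rather than a bare hash: an attacker able to modify the
    # stored record cannot recompute a matching check value without the key.
    return hmac.new(KEY, record, hashlib.sha256).hexdigest()

tag = check_value(b"balance=100")
print(hmac.compare_digest(tag, check_value(b"balance=100")))  # True  -> intact
print(hmac.compare_digest(tag, check_value(b"balance=999")))  # False -> tampered
```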
Removed p. 120
Describe what the assessor observed in the documentation, evidence, and results of the software tests that confirms whether configuration or other data set values can be modified by the software during execution.

Indicate whether configuration or other dataset values (relied upon by the software for operation) can be modified by the software during execution (yes/no).

For each of the integrity protections implemented, describe how the implementation allows for updates during execution, while ensuring that the integrity of the values can be validated after the update.
Modified p. 120 → 92
Describe the integrity protections implemented to protect configuration or other dataset values from modification during software execution.
R2 If R1 is “Yes,” then describe the methods implemented to protect stored values from unauthorized modification or disclosure.
Removed p. 121
Describe each of the software tests performed in support of this test requirement, including the tool(s)/method(s) used and the scope of each test.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that controls are implemented to prevent brute-force attacks on account, password, or cryptographic-key input fields, and that the controls are appropriate for their intended purpose.
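One common shape for the brute-force controls referenced here is a per-account failure counter with a temporary lockout. The thresholds below are illustrative, not values taken from the standard, and `ThrottledLogin` is a hypothetical class.

```python
import time

MAX_ATTEMPTS = 5        # illustrative threshold, not mandated by the standard
LOCKOUT_SECONDS = 300   # illustrative lockout window

class ThrottledLogin:
    # Per-account failure counter with a temporary lockout: one common way
    # (not the only acceptable one) to blunt brute-force attacks on
    # account, password, or cryptographic-key input fields.
    def __init__(self, check):
        self.check = check   # callable(account, password) -> bool
        self.state = {}      # account -> (failures, locked_until)

    def attempt(self, account, password):
        failures, locked_until = self.state.get(account, (0, 0.0))
        if time.time() < locked_until:
            return "locked"
        if self.check(account, password):
            self.state.pop(account, None)
            return "ok"
        failures += 1
        if failures >= MAX_ATTEMPTS:
            locked_until = time.time() + LOCKOUT_SECONDS
        self.state[account] = (failures, locked_until)
        return "failed"

login = ThrottledLogin(lambda acct, pwd: pwd == "correct-password")
for _ in range(MAX_ATTEMPTS):
    print(login.attempt("alice", "guess"))         # failed (five times)
print(login.attempt("alice", "correct-password"))  # locked
```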
Removed p. 122
Identify the documentation and evidence examined in support of this test requirement, including the documentation and evidence examined in 10.1.a.
Removed p. 122
Control Objective and Test Requirements Reporting Instructions Reporting Details: Assessor’s Response Summary of Assessment Findings (check one) Control Objective 10: Threat and Vulnerability Management The software vendor identifies, assesses, and manages threats and vulnerabilities in its payment software.
Removed p. 122
Identify the documentation and evidence examined that identifies and describes the attack methods applicable to the software.

Describe what the assessor observed in the documentation and evidence examined in 10.1.a that indicates that the software vendor has conducted a reasonably comprehensive assessment of the common software attack methods applicable to the software and the software’s susceptibility to them.

10.1.c The assessor shall examine vendor evidence to confirm that mitigations against each identified attack vector exist, and that the vendor’s software release process includes validation of the existence of these mitigations.

Describe what the assessor observed in the documentation and evidence that confirms that protection mechanisms are implemented to mitigate each of the common attacks applicable to the software, and that the mitigations are appropriate for their intended purpose.
Modified p. 122 → 94
10.1.b The assessor shall examine vendor evidence to confirm that the list of common attacks is valid for the software the vendor has produced and shall note where this does not include common attack methods detailed in industry-standard references such as OWASP and CWE lists.
10.1.b The assessor shall examine evidence to confirm that the identified attacks are valid for the software and shall note where this does not include common attack methods detailed in industry-standard references such as OWASP and CWE lists.
Removed p. 123
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor has implemented testing processes to verify that all protection mechanisms implemented in the software to mitigate each of the potential attacks identified in 10.1.a remain in place and operate effectively.

Describe what the assessor observed in the documentation and evidence that confirms that such processes are implemented throughout the entire software lifecycle.
Removed p. 124
Describe the automated tools used as part of the vendor’s testing process to detect vulnerabilities in software code and during execution, and why they are appropriate for the software architecture and the software development languages and frameworks in use.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor’s testing process accounts for the entire code base, including third-party, open-source, or other shared components and libraries.

Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s testing process reasonably accounts for all common vulnerabilities and attack methods applicable to the software.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor’s testing process demonstrates a history of successfully finding software vulnerabilities and remediating them prior to software release.
Modified p. 124 → 95
• Includes the use of tools for security testing that are appropriate for detecting applicable vulnerabilities and are suitable for the software architecture, development languages, and frameworks used in the development of the software.
• Includes the use of security testing tools that are suitable for the software architecture, development languages, and frameworks used in the development of the software.
Modified p. 124 → 95
• Accounts for the entire code base, including detecting vulnerabilities in third-party, open-source, or shared components and libraries.
• Accounts for the entire code base and detects vulnerabilities in third-party, open-source, or shared components and libraries.
Modified p. 124 → 95
• Demonstrates a history of finding software vulnerabilities and remediating them prior to retesting of the software.
• Demonstrates a history of finding software vulnerabilities and remediating them prior to software release.
Removed p. 125
Summarize the software vendor’s vulnerability ranking/categorization scheme and how it aligns with other industry-standard schemes.
Modified p. 125 → 96
The vendor implements an industry-standard vulnerability-ranking system (such as CVSS) that allows for the categorization of vulnerabilities.
An industry-standard vulnerability-ranking system (such as CVSS) is used to classify/categorize vulnerabilities.
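For reference, the CVSS qualitative rating scale mentioned above maps v3.1 base scores to severity bands as sketched below; a vendor's own categorization scheme may differ, provided it aligns with an industry standard.

```python
def cvss_severity(score):
    # CVSS v3.1 qualitative severity rating scale.
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

print(cvss_severity(9.8))  # Critical
print(cvss_severity(5.3))  # Medium
print(cvss_severity(2.0))  # Low
```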
Modified p. 125 → 96
For all vulnerabilities, the vendor provides a remediation plan; it is unacceptable for a known vulnerability to remain unmitigated for an indefinite period.
A remediation plan is maintained for all detected vulnerabilities that ensures vulnerabilities do not remain unmitigated for an indefinite period.
Modified p. 125 → 96
Describe how the software vendor’s process ensures vulnerabilities do not remain unmitigated indefinitely.
R2 Describe how the software vendor ensures that known vulnerabilities do not remain unmitigated indefinitely.
Removed p. 126
Describe the assessor’s rationale for why the software vendor’s security update release criteria are reasonable given the software’s intended purpose and function, and the critical assets maintained by the software.
Modified p. 126 → 97
• Reasonable criteria exist for releasing software updates to fix security vulnerabilities.
• Reasonable criteria are defined for releasing software updates to fix security vulnerabilities.
Modified p. 126 → 97
• Security updates are made available to stakeholders in accordance with defined criteria.
• Security updates are made available to stakeholders in accordance with the defined criteria.
Modified p. 126 → 97
Summarize the software vendor’s criteria for how and how often the vendor releases software updates to fix security vulnerabilities.
R2 Describe the software vendor’s criteria and process for determining when to delay security updates to address known vulnerabilities.
Modified p. 126 → 97
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor makes security updates available to stakeholders in accordance with its own defined security update release criteria and does not make exceptions on a regular basis.
R1 Describe what the assessor observed in the evidence obtained that confirms that security updates are provided to stakeholders in a timely manner.
Modified p. 126 → 100
In Place N/A Not in Place 11.1.a The assessor shall examine vendor evidence to confirm that:
12.1.b The assessor shall examine evidence to confirm that the guidance:
Removed p. 127
Identify the documentation and evidence examined in support of this requirement.

Identify the software update sample selected in support of this test requirement.

Indicate whether any evidence was obtained that suggests the software vendor did not provide security fixes to stakeholders in accordance with its own defined criteria (yes/no).

If “yes,” for each instance identified describe the vendor’s justification for not providing security fixes, and why the assessor considers each exception reasonable given the risk posed by the continued existence of known vulnerabilities in the software.
Removed p. 128
In Place N/A Not in Place 11.2.a The assessor shall examine vendor evidence to confirm that the method by which the vendor releases software updates ensures the integrity of the software and its code during transmission and install. Where user instructions are required to validate the integrity of the code, the assessor shall confirm that clear and sufficient guidance to enable the process to be correctly performed is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the software vendor’s methods for delivering software updates protect the integrity of the software and its code during transmission and installation (or implementation).
Modified p. 128 → 98
Indicate whether the software requires user interaction to validate the integrity of the software update code (yes/no).
R1 Indicate whether the software requires user input or interaction to validate the integrity of software updates prior to implementation.
Modified p. 128 → 98
If “yes,” identify the documentation and evidence that contains the software vendor’s guidance on how to verify the integrity of software update code.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to validate the integrity of software updates.
Removed p. 129
Describe what the assessor observed in the documentation and evidence that confirms whether the software provides software integrity validation methods that are cryptographically insecure.

If “yes,” describe how the documentation and evidence demonstrates that the software distribution methods provide a suitable chain of trust.

11.2.c The assessor shall examine vendor evidence to confirm that the software vendor informs users of the software updates, and that clear and sufficient guidance on how they may be obtained and installed is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.
Modified p. 129 → 98
Indicate whether the methods to protect the integrity of software update code are cryptographically insecure (yes/no).
R1 Indicate whether the methods used to validate the integrity of software updates are not cryptographically secure.
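A minimal illustration of the distinction this requirement draws: a cryptographically secure digest, compared in constant time, as opposed to a checksum such as CRC32 for which collisions are trivial. This sketch assumes the digest is published over an authenticated channel; stronger schemes sign the package itself (e.g., Ed25519) so the chain of trust travels with the update.

```python
import hashlib
import hmac

def verify_update(package, published_sha256):
    # SHA-256 of the received package compared against the digest the
    # vendor publishes over an authenticated channel. A non-cryptographic
    # checksum (CRC32, simple parity) would not satisfy the requirement,
    # since matching values are trivial to forge.
    digest = hashlib.sha256(package).hexdigest()
    return hmac.compare_digest(digest, published_sha256)

pkg = b"contents of update-v1.2 package"
published = hashlib.sha256(pkg).hexdigest()
print(verify_update(pkg, published))                 # True  -> install
print(verify_update(pkg + b" tampered", published))  # False -> reject
```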
Modified p. 129 → 99
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor informs users and other stakeholders when security updates are available.
R2 Describe what the assessor observed in the evidence obtained that confirms guidance is provided to stakeholders on how to implement software updates.
Modified p. 129 → 100
Identify the documentation and evidence that contains the software vendor’s guidance on how to obtain and install security updates.
R1 Identify the evidence obtained that details the software vendor’s guidance for stakeholders on how to install and/or configure the security features of the software.
Removed p. 130
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides users with mitigation techniques when vulnerabilities are detected in the software and a security patch cannot be provided in a timely manner.

11.2.e The assessor shall examine vendor evidence to confirm the update mechanisms cover all software, configuration files, and other metadata that may be used by the software for security purposes, or which may in some way affect security.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor’s update mechanisms cover all software, configuration files, and metadata used by the software for security purposes or could affect the security of the software.
Removed p. 131
In Place N/A Not in Place 12.1.a The assessor shall examine vendor evidence to confirm that the vendor creates and provides, to all stakeholders, clear and sufficient guidance to allow for the secure installation and use of the software.

Summarize how the software vendor makes implementation guidance available to all stakeholders, whether it’s provided as a single document, multiple documents, a series of independent notifications, or content posted on the software vendor’s website, and so on.

12.1.b The assessor shall examine vendor evidence to confirm that the guidance:

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides stakeholders with instructions on how to securely configure the platform and/or environment in which the software is to execute, including configuring any platform parameters or other resources upon which the software relies.
Modified p. 131 → 100
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides stakeholders with guidance on how to install and configure the software, including any third-party software required.
R3 Describe what the assessor observed in the evidence obtained that confirms the software vendor provides guidance on how to install and maintain cryptographic keys managed by the software.
Removed p. 132
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides stakeholders with instructions on how to manage all cryptographic keys used by the software, including how those keys are to be managed (that is, distributed, loaded, removed, changed, or destroyed, etc.).

Describe what the assessor observed in the documentation and evidence that confirms that users are never instructed to disable security settings or parameters in the installed environment that support software operation.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor identifies all requirements within this standard that are not applicable and provides reasonable explanations for why each requirement is not applicable.
Modified p. 132 → 100
• Includes instructions for key management (e.g., use of keys, how keys are distributed, loaded, removed, changed, destroyed, etc.)
• Includes instructions for key management (for example, the use of keys and how they are distributed, loaded, removed, changed, and destroyed.)
Modified p. 132 → 100
• Does not instruct the user to disable security settings or parameters within the installed environment, such as anti-malware software or firewall or other network-level protection systems.
• Does not instruct the user to disable security settings or parameters within the installed environment, such as anti- malware software or firewall or other network-level protection systems.
Modified p. 132 → 101
• Provides justification for any requirements in this standard that are to be assessed as not applicable. For each of these, the assessor shall confirm reasonable justification exists for why this is the case and confirm that it agrees with their understanding and the results of their testing of the software.
• Provides justification for any requirements in this standard that are to be assessed as not applicable. For each of these, the assessor shall confirm justification exists for why this is the case and shall confirm that it agrees with their understanding and the results of their software testing.
Modified p. 132 → 103
Describe what the assessor observed in the documentation and evidence that confirms that users are never instructed to execute the software in a privileged mode higher than the minimum privilege necessary for software operation.
R2 Describe what the assessor observed in the evidence obtained that confirms all locations where sensitive data is stored in the assessed software is identified in the vendor guidance.
Modified p. 132 → 103
Describe what the assessor observed in the documentation and evidence that confirms that the guidance provided to stakeholders clearly identifies the version(s) of the software to which it applies.
R3 Describe what the assessor observed in the evidence obtained that confirms that the software vendor provides guidance on how to securely delete sensitive data from storage locations.
Removed p. 133
Control Objective and Test Requirements Reporting Instructions Reporting Details: Assessor’s Response Summary of Assessment Findings (check one) Control Objective A.1: Sensitive Authentication Data Sensitive authentication data is not retained after authorization.

…unless the software is intended only for use by issuers or organizations that support issuing services.

In Place N/A Not in Place A.1.1.a For each instance of sensitive authentication data identified in Control Objective 1, the assessor shall test the software, including generation of error conditions and log entries, and usage of forensic tools and/or methods, to identify all potential storage locations and to confirm that the software does not store sensitive authentication data after authorization. This includes temporary storage (such as volatile memory), semi-permanent storage (such as RAM disks), and non-volatile storage (such as magnetic and flash storage media).

Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software stores sensitive authentication …
Modified p. 133 → 101
Identify the documentation and evidence examined in support of this test requirement.
R7 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 133 → 102
A.1.1 The software does not store sensitive authentication data after authorization (even if encrypted)
In Place Not in Place N/A A.1.1 The software does not store sensitive authentication data after authorization (even if encrypted) unless the software is intended only for use by issuers or organizations that support issuing services.
Removed p. 134
In Place N/A Not in Place A.2.1 The assessor shall examine the instructions prepared by the software vendor and confirm the documentation includes the following guidance for customers, integrators, and resellers:

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders (including customers, integrators, and resellers) that identifies and describes all locations within the software and its execution environment where cardholder data is stored.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders (including customers, integrators, and resellers) on how to securely delete cardholder data wherever it is stored.
Modified p. 134 → 102
Identify the documentation and evidence in support of this test requirement.
R4 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 134 → 103
A.2.1 The software vendor provides guidance to customers regarding secure deletion of cardholder data after expiration of the customer-defined retention period.
In Place Not in Place N/A A.2.1 The software vendor provides guidance to stakeholders regarding secure deletion of cardholder data after expiration of defined retention period(s).
Modified p. 134 → 103
A list of all locations where the software stores cardholder data.
All locations where the software stores cardholder data.
Modified p. 134 → 103
Instructions on how to securely delete cardholder data stored by the payment software, including data stored on underlying software or systems (such as OS, databases, etc.).
How to securely delete cardholder data stored by the payment software, including cardholder data stored on underlying software or systems (such as in OS files or in databases).
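The overwrite-based deletion such guidance often describes can be sketched as below. This is best-effort only; the platform caveats in the comments are exactly why the standard requires explicit vendor guidance rather than assuming deletion works everywhere.

```python
import os
import tempfile

def best_effort_secure_delete(path):
    # Overwrite-then-unlink sketch. Journaling filesystems, SSD wear
    # leveling, and backups can all retain copies of the data, which is
    # why real guidance may instead point to platform-specific tools or
    # to encrypted storage with destruction of the key.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))  # overwrite contents in place
        f.flush()
        os.fsync(f.fileno())       # push the overwrite to the media
    os.remove(path)

p = os.path.join(tempfile.mkdtemp(), "chd.tmp")
with open(p, "w") as f:
    f.write("4539851234567890")    # illustrative PAN
best_effort_secure_delete(p)
print(os.path.exists(p))           # False
```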
Modified p. 134 → 103
Instructions for configuring the underlying software or systems (such as OS, databases, etc.) to prevent inadvertent capture or retention of cardholder data (for example, system backup or restore points).
How to configure the underlying software or systems to prevent the inadvertent capture or retention of cardholder data (for example, by system backup or restore points).
Modified p. 134 → 103
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders (including customers, integrators, and resellers) on how to configure the software or underlying systems to prevent the inadvertent capture or retention of cardholder data.
R4 Describe what the assessor observed in the evidence that confirms the software vendor provides guidance on how to configure the underlying platform to prevent the inadvertent capture or retention of cardholder data.
Removed p. 135
• Details of all instances where PAN is displayed.

• Confirmation that the payment software masks PAN to display a maximum of the first six and last four digits by default on all displays.

• Instructions for how to configure the software to display more than the first six/last four digits of the PAN (includes displays of the full PAN).

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders that identifies all locations within the software or underlying systems where PAN is displayed.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders that instructs them to mask all displays of PAN to a maximum of the first six and last four digits by default, and how to do so.

If “yes,” describe what the assessor observed in the documentation and evidence that confirms …
Modified p. 135 → 103
In Place N/A Not in Place A.2.2.a The assessor shall examine vendor evidence, including the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1, to confirm the guidance includes the following instructions for customers and integrators/resellers:
In Place Not in Place N/A A.2.1 The assessor shall examine evidence to confirm that guidance is provided to stakeholders in accordance with Control Objective 12.1 that details:
Modified p. 135 → 104
Based on the documentation and evidence, indicate whether the software supports the full display of PAN (yes/no).
R1 Describe the options available within the software to restrict the display of PAN.
Removed p. 136
Identify any additional documentation and evidence in support of this test requirement.

Identify any additional documentation and evidence in support of this test requirement.

Describe what the assessor observed in the documentation, evidence, and software test results that indicates that all displays of PAN are masked to a maximum of the first six and last four digits by default.

A.2.2.c The assessor shall examine vendor evidence and test the software to confirm that for each instance where the PAN is displayed, the instructions for displaying more than the first six/last four digits are accurate.
Modified p. 136 → 104
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that by following the software vendor’s guidance on configuring PAN masking, the software only displays first six and last four digits of the PAN.
R1 Describe what the assessor observed in the evidence obtained that confirms stakeholders are provided guidance on how to configure available PAN-masking features and options.
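The first-six/last-four default-masking rule the requirement tests is a one-liner; this sketch assumes a digits-only PAN string at least ten characters long.

```python
def mask_pan(pan):
    # Default display: no more than the first six and last four digits.
    return pan[:6] + "*" * (len(pan) - 10) + pan[-4:]

print(mask_pan("4539851234567890"))  # 453985******7890
```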
Removed p. 137
In Place N/A Not in Place A.2.3.a The assessor shall examine vendor evidence, including the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1 to verify the guidance includes the following:

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders that identifies all available options to render cardholder data unreadable, and that instructions are also provided to securely configure those options.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders that identifies all instances where cleartext cardholder data is output by the software and instructs stakeholders that they are responsible for rendering such instances of PAN unreadable.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders instructing them that they are responsible for ensuring that PAN remains …
Modified p. 137 → 106
• Details of any configurable options for each method used by the software to render cardholder data unreadable, and instructions on how to configure each method for all locations where cardholder data is stored by the payment software.
• Details of any configurable options for each method used to render cardholder data unreadable, and instructions on how to configure each method for all locations where cardholder data is stored.
Modified p. 137 → 106
• A list of all instances where cardholder data may be output for the customer to store outside of the payment application, and instructions that the customer is responsible for rendering the PAN unreadable in all such instances.
• A list of all instances where cardholder data may be output for storage outside of the payment application, and instructions that the implementing entity is responsible for rendering the PAN unreadable in all such instances.
Modified p. 137 → 106
• Instruction that if debugging logs are ever enabled (for example, for troubleshooting purposes) and they include the PAN, they must be protected, disabled as soon as troubleshooting is complete, and securely deleted when no longer needed.
• Instruction that if debugging logs are ever enabled (for troubleshooting purposes) and they contain PAN, they must be protected, that debugging must be disabled as soon as troubleshooting is complete, and that debugging logs must be securely deleted when no longer needed.
Removed p. 138
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software creates both tokenized and truncated versions of the same PAN.

Where the software creates both tokenized and truncated versions of the same PAN, describe what the assessor observed in the documentation, evidence, and software test results that confirms that tokenized and truncated versions of the PAN cannot be correlated to reconstruct the original PAN.
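The correlation risk described above can be made concrete: if the "token" is derived deterministically from the PAN (for example, an unsalted hash) and a truncated copy exposes the first six and last four digits, a 16-digit PAN leaves only 10**6 candidates for the masked middle digits, which is trivially brute-forceable. A minimal sketch, with illustrative function names and a common truncation format (not taken from the standard):

```python
import hashlib
from typing import Optional

def truncate(pan: str) -> str:
    """Keep first six and last four digits, mask the middle (a common truncation format)."""
    return pan[:6] + "*" * (len(pan) - 10) + pan[-4:]

def weak_token(pan: str) -> str:
    """An unsalted hash used as a 'token' -- shown only to illustrate the correlation risk."""
    return hashlib.sha256(pan.encode()).hexdigest()

def correlate(truncated: str, token: str) -> Optional[str]:
    """With first six + last four known, only 10**6 middle candidates remain for a 16-digit PAN."""
    first6, last4 = truncated[:6], truncated[-4:]
    for middle in range(10**6):
        candidate = "{}{:06d}{}".format(first6, middle, last4)
        if weak_token(candidate) == token:
            return candidate
    return None
```

This is why the standard requires that tokenized and truncated versions of the same PAN not be correlatable: a token that is independent of the PAN (random) defeats this attack, while a deterministic derivation does not.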
Modified p. 138 → 105
• Index tokens and pads, with the pads being securely stored
• Index tokens and pads, with the pads being securely stored.
Modified p. 138 → 105
• Strong cryptography, with associated key-management processes and procedures.
• Strong cryptography, with associated key- management processes and procedures.
Modified p. 138 → 105
Note: The assessor should examine several tables, files, log files and any other resources created or generated by the software to verify the PAN is rendered unreadable.
Note: The assessor should examine several tables, files, log files, and any other resources created or generated by the software to verify the PAN is rendered unreadable.
Modified p. 138 → 105
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that PAN is rendered unreadable wherever it is stored persistently.
R2 Describe what the assessor observed in the evidence obtained that confirms hashing is not used to render PAN unreadable.
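Of the approved methods listed above, "index tokens and pads" replaces the PAN with a random value that has no mathematical relationship to it, which is why it is acceptable where hashing is not. A minimal in-memory sketch; the class and method names are hypothetical, and a real pad store must itself be secured (for example, encrypted at rest with strict access control):

```python
import secrets

class TokenVault:
    """Illustrative index-token approach: the PAN is replaced by a random
    token, and the token-to-PAN mapping (the 'pad') is stored separately."""

    def __init__(self):
        self._pad = {}  # in a real system this mapping must be securely stored

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)  # random value, not derived from the PAN
        self._pad[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._pad[token]
```

Because the token is generated randomly rather than computed from the PAN, possession of the token alone (without the securely stored pad) reveals nothing about the original value.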
Modified p. 138 → 106
Identify the documentation and evidence examined in support of this test requirement.
R6 Describe any other assessment activities performed and/or findings for this test requirement.
Removed p. 139
Describe what the assessor observed in the documentation, evidence, and software test results that confirms whether the software generates files for use outside of the software.

Where the software creates or generates files for use outside of the software, describe what the assessor observed in the documentation, evidence, and software test results that confirms that PAN is either excluded from all such files, or it is rendered unreadable by the software.
Removed p. 140
Indicate whether any evidence was found through the documentation, evidence, or software test results that suggests that the software vendor stores PAN on vendor systems (yes/no).
Modified p. 140 → 106
If “yes,” describe what the assessor observed in the documentation, evidence, and software test results that confirms that PAN is rendered unreadable wherever and whenever it is stored on vendor systems.
R5 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms instructions on how to render PAN unreadable where it is stored for troubleshooting purposes.
Removed p. 141
Control Objective and Test Requirements | Reporting Instructions | Reporting Details: Assessor’s Response | Summary of Assessment Findings (check one)
Control Objective B.1: Terminal Software Documentation. The software architecture is documented and includes diagrams that describe all software components and services in use and how they interact.

Describe what the assessor observed in the documentation and evidence to conclude that all third-party and open- source components, external services, and APIs used by the software are documented.

Describe what the assessor observed in the documentation and evidence to conclude that all UI’s and APIs provided or made accessible by the software are documented.
Modified p. 141 → 108
B.1.1 The software vendor maintains documentation that describes all software components, interfaces, and services provided or used by the software.
In Place Not in Place N/A B.1.1 The software vendor maintains documentation that describes all software components, interfaces, and services provided or used by the software.
Modified p. 141 → 108
In Place N/A Not in Place B.1.1 The assessor shall examine all relevant documentation and evidence necessary to confirm that the software vendor maintains documentation describing the software’s overall design and function including, but not limited to, the following:
In Place Not in Place N/A B.1.1 The assessor shall examine evidence to confirm that documentation is maintained that describes the software’s overall design and function including, but not limited to, the following:
Modified p. 141 → 108
Identify the documentation and evidence examined that identifies and describes all third-party and open-source components, external services, and Application Programming Interfaces (APIs) used by the software.
R2 Describe what the assessor observed that confirms the software design documentation covers all third-party and open-source components, external services, and APIs used by the software.
Modified p. 141 → 108
Identify the documentation and evidence that identifies and describes all User Interfaces (UI) and APIs provided or made accessible by the software.
R3 Describe what the assessor observed that confirms the software design documentation covers all interfaces and APIs provided or made accessible by the software.
Modified p. 141 → 108
Identify any other documentation or evidence examined in support of this test requirement.
R4 Describe any other assessment activities performed and/or findings for this test requirement.
Removed p. 142
Identify the documentation and evidence examined that describes how sensitive data is securely deleted from storage when no longer needed.
Modified p. 142 → 109
In Place N/A Not in Place B.1.2.a The assessor shall examine all relevant documentation and evidence necessary to confirm that the software vendor maintains documentation describing all sensitive data flows including, but not limited to, the following:
In Place Not in Place N/A B.1.2.a The assessor shall examine evidence to confirm that documentation is maintained that describes all sensitive data flows including, but not limited to, the following:
Modified p. 142 → 109
Identify the documentation and evidence examined that identifies and describes the sensitive data that is stored, processed, or transmitted by the software.
R2 Describe what the assessor observed in the evidence obtained that confirms it details all sensitive data stored, processed, and transmitted by the software.
Modified p. 142 → 109
Identify the documentation and evidence examined that identifies and describes the locations where sensitive data is stored.
R1 Identify the evidence obtained that details all data flows involving sensitive data.
Removed p. 143
Identify the documentation and evidence examined that identifies and describes all possible error conditions for each software function that handles sensitive data.

Identify the documentation and evidence examined that identifies and describes the cryptographic algorithms and modes of operation used and supported for each instance where cryptography is used to protect sensitive data.

Identify the documentation and evidence examined that identifies and describes how cryptographic keys are managed for each instance where cryptography is used to protect sensitive data.
Modified p. 143 → 110
Identify the documentation and evidence examined that identifies and describes all software functions that handle sensitive data.
R1 Identify the evidence obtained that details all software functions that handle sensitive data.
Modified p. 143 → 110
Identify the documentation and evidence examined that identifies and describes all inputs and outputs for each software function that handles sensitive data.
R2 Describe what the assessor observed in the evidence obtained that confirms it details all inputs, outputs, and possible error conditions for each function that handles sensitive data.
Modified p. 143 → 110
Identify any other documentation and evidence examined in support of this test requirement.
R4 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 144 → 111
In Place N/A Not in Place B.1.3 The assessor shall examine all relevant documentation and evidence necessary to confirm that the software vendor maintains documentation describing all configurable options provided or made available by the software that can impact the security of sensitive data including, but not limited to, the following:
In Place Not in Place N/A B.1.3 The assessor shall examine evidence to confirm that documentation is maintained that describes all configurable options provided or made available by the software that can impact the security of sensitive data including, but not limited to, the following:
Modified p. 144 → 111
Identify the documentation and evidence examined that identifies and describes all configurable options that enable access to sensitive data.
R2 Describe what the assessor observed in the evidence obtained that confirms it details all configurable options that facilitate access to sensitive data.
Modified p. 144 → 111
Identify the documentation and evidence examined that identifies and describes all configurable options that enable modification of mechanisms used to protect sensitive data.
R3 Describe what the assessor observed in the evidence obtained that confirms it details all configurable options that facilitate modification of mechanisms used to protect sensitive data.
Modified p. 144 → 111
Identify the documentation and evidence examined that identifies and describes all remote access features, functions, and parameters provided or made available by the software.
R4 Describe what the assessor observed in the evidence obtained that confirms it details all remote access features, functions, and parameters provided or made available by the software.
Modified p. 144 → 111
Identify the documentation and evidence examined that identifies and describes all remote update features, functions, and parameters provided or made available by the software.
R5 Describe what the assessor observed in the evidence obtained that confirms it details all remote update features, functions, and parameters provided or made available by the software.
Modified p. 144 → 111
Identify the documentation and evidence examined that describes the default settings for each configurable option.
R6 Describe what the assessor observed in the evidence obtained that confirms it details the default settings for each configurable option.
Modified p. 144 → 111
Identify any other documentation and evidence examined in support of this test requirement.
R7 Describe any other assessment activities performed and/or findings for this test requirement.
Removed p. 145
Describe what the assessor observed in the documentation and evidence that confirms that the software is intended for deployment on PCI-approved POI devices.

Describe what the assessor observed in the documentation and evidence that confirms that the device characteristics of the POI devices supported by the software match the device characteristics specified in the PCI SSC’s List of Approved PTS Devices.
Modified p. 145 → 112
B.2.1 The software is intended for deployment and operation on payment terminals (i.e., PCI-approved POI devices).
In Place Not in Place N/A B.2.1 The software is intended for deployment and operation on payment terminals (PCI-approved POI devices).
Modified p. 145 → 112
In Place N/A Not in Place B.2.1 The assessor shall examine all relevant software documentation and evidence necessary to determine the payment terminals upon which the software is to be deployed. For each of the payment terminals identified in the software documentation and included in the software assessment, the assessor shall examine the payment terminal’s device characteristics and compare them with the following characteristics specified in the PCI SSC’s List of Approved PTS Devices to confirm they match:
In Place Not in Place N/A B.2.1 The assessor shall examine evidence to determine the payment terminals upon which the software is to be deployed. For each of the payment terminals identified and included in the software assessment, the assessor shall examine the payment terminal’s device characteristics and compare them with the following characteristics specified in the PCI SSC’s List of Approved PTS Devices to confirm they match:
Modified p. 145 → 112
• Firmware version number(s) Identify the documentation and evidence examined that identifies the POI devices supported by the software.
• Firmware version number(s) R1 Identify the evidence obtained that details the PCI PTS POI devices supported by the software.
Modified p. 145 → 112
Identify any other documentation and evidence examined in support of this test requirement.
R3 Describe any other assessment activities performed and/or findings for this test requirement.
Removed p. 146
Describe how and the extent to which the source code was examined to determine whether the software supports external communications.

Describe what the assessor observed in the documentation, evidence, and source code that confirms whether the software supports external communications.

Indicate whether the software supports external communications (yes/no).
Removed p. 146
If “yes,” complete the reporting instructions for test requirements B.2.2.b through B.2.2.2.
Modified p. 146 → 112
In Place N/A Not in Place B.2.2.a The assessor shall examine all relevant software documentation and source code necessary to determine whether the software supports external communications.
In Place Not in Place N/A B.2.2.a The assessor shall examine evidence (including source code) to determine whether the software supports external communications.
Modified p. 146 → 112
Identify the documentation and evidence examined that confirms whether the software supports external communications.
R1 Indicate whether the software supports the use of external communication methods.
Removed p. 147
Indicate whether there are any discrepancies between the list of external communication methods used by the software and the list of PCI-approved external communication methods provided by the payment terminal. (yes/no).

If “no,” skip to B.2.2.c.

Describe each of the discrepancies between the external communication methods supported by the software and those included in the payment terminal’s PTS POI device evaluation.

For each of the noted discrepancies, describe the assessor’s rationale for why the discrepancy does not violate the control objective.
Modified p. 147 → 112
Identify the external communication methods included in the payment terminal’s PTS POI device evaluation.
B.2.2 The software uses only the external communication methods included in the payment terminal’s PTS device evaluation.
Removed p. 148
Describe what the assessor observed in the documentation and evidence that confirms whether the software relies on the Open Protocols features of the payment terminal.

If “yes,” complete the remaining reporting instructions for test requirements B.2.2.1 through B.2.2.2.
Modified p. 148 → 113
Describe what the assessor observed in the documentation, evidence, and source code that confirms the software only uses the PCI-approved external communication methods included in the payment terminal’s PTS POI device evaluation and does not implement its own external communication methods (that is, its own IP stack).
B.2.2.c The assessor shall examine evidence (including source code) to confirm that the software uses only the external communication methods included in the payment terminal’s PTS device evaluation and does not implement its own external communication methods or IP stack.
Modified p. 148 → 113
B.2.2.1 Where the software relies on the Open Protocols features of the payment terminal, the software is developed in accordance with the payment terminal vendor’s security guidance/policy.
B.2.2.1 Where the software relies on the Open Protocols feature of the payment terminal, the software is developed in accordance with the payment terminal vendor’s security guidance/policy.
Modified p. 148 → 113
In Place N/A Not in Place B.2.2.1 The assessor shall examine all relevant payment terminal documentation, including the payment terminal vendor’s security guidance/policy, and all relevant software vendor process documentation and software design documentation to confirm that the software is developed in accordance with the payment terminal vendor’s security guidance/policy.
In Place Not in Place N/A B.2.2.1 The assessor shall examine all relevant payment terminal documentation (including the payment terminal vendor’s security guidance/policy) and all relevant software vendor process documentation and software design documentation to confirm that the software is developed in accordance with the payment terminal vendor’s security guidance/policy.
Modified p. 148 → 113
Indicate whether the software relies on the Open Protocols features of the payment terminal (yes/no).
R1 Indicate whether the software relies upon the Open Protocols features of the PCI PTS POI devices included in the software assessment.
Removed p. 149
In Place N/A Not in Place B.2.2.2 The assessor shall examine all relevant software documentation and source code to confirm that the software does not circumvent, bypass, or add additional services or protocols to the Open Protocols of the payment terminal as approved and documented in the payment terminal vendor’s security guidance/policy. This includes the use of:
Modified p. 149 → 114
B.2.2.2 The software does not circumvent, bypass, or add additional services or protocols to the Open Protocols of the payment terminal as approved and documented in the payment terminal vendor’s security guidance/policy. This includes the use of link layer protocols, IP protocols, security protocols, and IP services.
B.2.2.2 The software does not circumvent, bypass, or add additional services or protocols to the Open Protocols of the payment terminal as approved and documented in the payment terminal vendor’s security guidance/policy. This includes the use of:
Modified p. 149 → 114
• IP Services Identify the documentation and evidence examined in support of this test requirement.
R2 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 149 → 115
Describe what the assessor observed in the documentation, evidence, and source code that confirms that the software does not circumvent, bypass, or add additional services or protocols to the Open Protocols approved as part of the payment terminal’s PCI PTS device evaluation.
R1 Describe what the assessor observed in the evidence obtained that confirms the software does not bypass or render ineffective any encryption methods provided by the payment terminal.
Removed p. 150
If “yes,” complete the reporting instructions for test requirements B.2.3.b through B.2.3.d.
Modified p. 150 → 115
In Place N/A Not in Place B.2.3.a The assessor shall examine all relevant software documentation and source code necessary to determine whether the software facilitates encryption of sensitive data. Where the software does provide such a function, the assessor shall confirm the software does not bypass or render ineffective any encryption methods or account data security methods implemented by the payment terminal as follows:
In Place Not in Place N/A B.2.3.a The assessor shall examine evidence (including source code) to determine whether the software provides encryption of sensitive data. Where the software does provide such a function, the assessor shall confirm the software does not bypass or render ineffective any encryption methods or account data security methods implemented by the payment terminal as follows:
Modified p. 150 → 115
Indicate whether the software supports the encryption of sensitive data (yes/no).
R1 Indicate whether the software provides its own methods to facilitate the encryption of sensitive data.
Modified p. 150 → 115
B.2.3.b The assessor shall examine all relevant payment terminal documentation, including payment terminal vendor security guidance/policy, necessary to determine which encryption methods are provided by the payment terminal.
B.2.3.b The assessor shall examine all relevant payment terminal documentation (including payment terminal vendor security guidance/policy) to determine which encryption methods are provided by the payment terminal.
Modified p. 150 → 115
Identify each of the encryption methods provided by the payment terminal.
R2 If R1 is “Yes,” then describe the encryption methods provided by the device(s).
Modified p. 150 → 116
Describe what the assessor observed in the documentation, evidence, and source code that determines whether the software supports the encryption of sensitive data.
R2 If R1 is “No,” then describe what the assessor observed in the evidence obtained that confirms the methods provided by the software to encrypt sensitive data provide for strong cryptography.
Removed p. 151
Describe what the assessor observed in the documentation, evidence, and source code that confirms that the software does not bypass or render ineffective any encryption methods provided by the payment terminal.

B.2.3.d Where the software facilitates encryption of sensitive data, but the payment terminal is not required to provide approved encryption methods (per the PCI PTS POI Standard), the assessor shall examine all relevant software documentation and source code necessary to confirm that the encryption methods used or implemented by the software for encrypting sensitive data provide “strong cryptography” and are implemented in accordance with Control Objectives 7.1 and 7.2.
Removed p. 151
Describe what the assessor observed in the documentation and evidence that confirms whether the payment terminal is required to provide approved encryption methods as part of its PCI PTS POI device evaluation.

Indicate whether the payment terminal is required to provide approved encryption methods as part of its PCI PTS POI device evaluation (yes/no).

If “yes,” skip to B.2.4.

If “no,” complete the remaining reporting instructions for this test requirement.

Indicate whether the software requires random number values for cryptographic operations involving sensitive data or sensitive functions (yes/no).

If “yes,” complete the reporting instructions for test requirements B.2.4.b through B.2.4.c.
Modified p. 152 → 116
In Place N/A Not in Place B.2.4.a The assessor shall examine all relevant software documentation and source code necessary to determine whether the software requires random values to be generated for any cryptographic operations involving sensitive data or sensitive functions.
In Place Not in Place N/A B.2.4.a The assessor shall examine evidence (including source code) to determine whether the software requires random values to be generated for any cryptographic operations involving sensitive data or sensitive functions.
Modified p. 152 → 116
Describe what the assessor observed in the documentation, evidence, and source code examined that confirms whether the software requires random number values for cryptographic operations involving sensitive data or sensitive functions.
R1 Indicate whether the software relies on random values to be generated for cryptographic operations involving sensitive data or sensitive functions.
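The distinction B.2.4 tests for can be illustrated by contrasting a cryptographically secure random source with a predictable one. In practice the software must use the random number generation functions approved in the terminal's PTS device evaluation; Python's `secrets` module stands in here purely for illustration, and the function names are hypothetical:

```python
import secrets
import random

def make_nonce_secure(n=12):
    """Unpredictable bytes from a CSPRNG -- the kind of source cryptographic
    operations require (stand-in for the terminal's evaluated RNG)."""
    return secrets.token_bytes(n)

def make_nonce_insecure(n=12):
    """A seeded Mersenne Twister: its output is fully reproducible, so any
    key, IV, or nonce derived from it is predictable to an attacker."""
    rng = random.Random(1234)
    return bytes(rng.randrange(256) for _ in range(n))
```

The insecure variant returns the same "random" bytes on every call, which is exactly the failure mode an assessor is looking for when confirming the software does not implement its own random number generation.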
Removed p. 153
If “no,” skip to B.2.4.c.

Indicate whether there are any discrepancies between the list of random number generation functions used by the software and the list of PCI-approved random number generation functions provided by the payment terminal (yes/no).

Identify the discrepancies found between the random number functions used by the software and those included in the payment terminal’s PTS POI device evaluation.

For each of the noted discrepancies, describe the assessor’s rationale for why the discrepancy does not violate the parent control objective.
Removed p. 154
Describe what the assessor observed in the documentation, evidence, and source code that confirms that the software only uses the PCI-approved random number generation functions included in the payment terminal’s PCI PTS POI device evaluation and does not implement its own random number generation functions.
Removed p. 155
Identify all discrepancies found between the logical interfaces identified through documentation and evidence reviews and the logical interfaces identified through source code reviews.

For each of the noted discrepancies, describe the software vendor’s justification for why the discrepancy exists and the assessor’s rationale for why the discrepancy is considered acceptable.
Modified p. 155 → 117
In Place N/A Not in Place B.2.5.a The assessor shall examine all relevant software documentation and source code necessary to determine all logical interfaces of the software, including:
In Place Not in Place N/A B.2.5.a The assessor shall examine evidence (including source code) to determine all logical interfaces of the software, including:
Modified p. 155 → 117
• The logical interfaces intended for sharing clear-text account data, such as those used to pass clear- text account data back to the approved firmware of the payment terminal.
• The logical interfaces intended for sharing clear-text account data, such as those used to pass clear-text account data back to the approved firmware of the payment terminal.
Modified p. 155 → 120
Identify the documentation and evidence examined in support of this test requirement.
R1 Identify the evidence obtained to support this test requirement.
Removed p. 156
Describe what the assessor observed in the documentation, evidence, and source code in B.2.5.a that indicates that the software does not support sharing of clear-text account data directly with other software through its own logical interfaces.

B.2.5.c The assessor shall install and configure the software in accordance with the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods (commercial tools, scripts, etc.) the assessor shall test the software using all software functions that handle account data to confirm that the software does not facilitate the sharing of clear-text account data directly with other software through its own logical interfaces.
Modified p. 156 → 121
Describe what the assessor observed in the software testing results that confirms that the software does not facilitate the sharing of cleartext account data directly with any other software through its own logical interfaces.
R2 If R1 is “No,” then describe what the assessor observed that confirms the software does not support the loading of files outside of the base software package(s).
Modified p. 156 → 124
Describe each of the tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm the findings in Test Requirement B.3.1.a.
Removed p. 157
Describe what the assessor observed in the documentation, evidence, and source code that confirms whether the software connects to and/or uses any shared resources provided by the payment terminal.

If “yes,” complete the remaining reporting instructions for this test requirement and test requirement B.2.6.b.
Modified p. 157 → 119
In Place N/A Not in Place B.2.6.a The assessor shall examine all relevant software documentation and source code necessary to determine whether and how the software connects to and/or uses any shared resources provided by the payment terminal, and to confirm that:
In Place Not in Place N/A B.2.6.a The assessor shall examine evidence (including source code) to determine whether and how the software connects to and/or uses any shared resources provided by the payment terminal, and to confirm that:
Modified p. 157 → 119
• The software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1 includes detailed instructions for how to configure the software to ensure secure integration with shared resources.
• The guidance required in Control Objectives 12.1 and B.5.1 includes detailed instructions for how to configure the software to ensure secure integration with shared resources.
Modified p. 157 → 119
• The software vendor’s implementation guidance for secure integration with such shared resources is in accordance with the payment terminal vendor’s security guidance/policy.
• The required guidance for secure integration with shared resources is in accordance with the payment terminal vendor’s security guidance/policy.
Modified p. 157 → 119
Indicate whether the software connects to or uses any shared resources provided by the payment terminal (yes/no).
R1 Indicate whether the software relies on any shared resources provided by the PCI PTS POI devices that are included in the software evaluation.
Modified p. 157 → 119
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders on how to configure the software and securely integrate with each of the shared resources provided by the payment terminal.
R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance for how to configure the software to securely integrate with the shared resources.
Modified p. 157 → 122
Identify the documentation and evidence examined in support of this test requirement.
R2 Identify the evidence obtained to support these findings.
Removed p. 158
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that the software’s connection to and use of any shared resources provided by the payment terminal are handled securely.
Modified p. 158 → 119
B.2.6.b The assessor shall install and configure the software in accordance with the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods (commercial tools, scripts, etc.) the assessor shall test the software using all software functions that use or integrate shared resources to confirm that any connections to or use of shared resources are handled securely.
B.2.6.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods, the assessor shall test the software using all software functions that use or integrate shared resources to confirm that any connections to or use of shared resources are handled securely.
Modified p. 158 → 120
In Place N/A Not in Place B.2.7.a The assessor shall examine all relevant payment terminal documentation, including the payment terminal vendor’s security guidance/policy, necessary to determine whether and how application segregation is enforced by the payment terminal.
In Place Not in Place N/A B.2.7.a The assessor shall examine all relevant payment terminal documentation (including the payment terminal vendor’s security guidance/policy) to determine whether and how application segregation is enforced by the payment terminal.
Modified p. 158 → 125
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm the findings in Test Requirement B.3.1.1.a.
Removed p. 159
Describe what the assessor observed in documentation, evidence, and source code that confirms that the software does not include functions that would enable it to bypass or defeat any device-level application segregation controls provided by an underlying payment terminal.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders on how to cryptographically sign software files in a manner that supports cryptographic authentication of all such files by an underlying payment terminal’s firmware.
Modified p. 159 → 120
B.2.8 All software files are cryptographically signed to facilitate cryptographic authentication of the software files by the payment terminal firmware.
B.2.8 All software files are cryptographically signed to enable cryptographic authentication of the software files by the payment terminal firmware.
Modified p. 159 → 120
In Place N/A Not in Place B.2.8.a The assessor shall examine the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1 to confirm it includes detailed instructions for how to cryptographically sign the software files in a manner that facilitates the cryptographic authentication of all such files by the payment terminal.
In Place Not in Place N/A B.2.8.a The assessor shall examine the guidance required in Control Objectives 12.1 and B.5.1 to confirm that it includes detailed instructions for how to cryptographically sign the software files in a manner that enables the cryptographic authentication of all such files by the payment terminal.
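The sign-then-authenticate flow that Control Objective B.2.8 requires can be sketched as follows. This is a minimal illustrative sketch, not the standard's prescribed mechanism: the key, file contents, and use of HMAC are assumptions made for brevity. Production payment terminals authenticate software files with asymmetric signatures (e.g., RSA or ECDSA), with only the public key embedded in the terminal firmware.

```python
import hashlib
import hmac

# Hypothetical signing key, for illustration only; a real vendor holds
# an asymmetric private key and the terminal firmware embeds the
# corresponding public key.
SIGNING_KEY = b"vendor-signing-key"

def sign_file(data: bytes) -> bytes:
    """Vendor side: produce a detached signature over the file contents."""
    return hmac.new(SIGNING_KEY, data, hashlib.sha256).digest()

def authenticate_file(data: bytes, signature: bytes) -> bool:
    """Firmware side: reject any file whose signature does not verify."""
    expected = hmac.new(SIGNING_KEY, data, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

payload = b"payment-app.bin contents"
sig = sign_file(payload)
assert authenticate_file(payload, sig)              # untampered file loads
assert not authenticate_file(payload + b"x", sig)   # tampered file rejected
```

The point of the detached signature is that every file outside the base package (B.2.8.c) can carry its own signature and be checked independently before the terminal loads it.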
Removed p. 160
Describe what the assessor observed in the documentation, evidence, and software test results that confirms that all software files are cryptographically signed in a manner that supports the cryptographic authentication of all software files by an underlying payment terminal’s firmware.

Describe how and the extent to which source code was examined to support this test requirement.

Indicate whether evidence was found to suggest that the software supports the loading of files outside of the base software package(s) (yes/no).
Modified p. 160 → 119
Identify any additional documentation and evidence examined in support of this test requirement.
R4 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 160 → 121
B.2.8.c Where the software supports the loading of files outside of the base software package(s), the assessor shall determine whether each of those files is cryptographically signed in a manner that facilitates the cryptographic authentication of those files by the payment terminal. For any files that cannot be cryptographically signed, the assessor shall justify why the inability to cryptographically sign each such file does not adversely affect the security of the software or the underlying payment terminal.
B.2.8.c Where the software supports the loading of files outside of the base software package(s), the assessor shall examine evidence and test the software to determine whether each of those files is cryptographically signed in a manner that enables the cryptographic authentication of those files by the payment terminal. For any files that cannot be cryptographically signed, the assessor shall justify why the inability to cryptographically sign such files does not adversely affect the security of the software or the …
Modified p. 160 → 121
Describe what the assessor observed in the documentation, evidence, and source code that confirms whether the software supports the loading of files outside of the base software package(s).
R1 Indicate whether the software supports the loading of files outside of the base software package.
Removed p. 161
Where files were found that were not or could not be cryptographically signed, describe the assessor’s rationale for why the inability to cryptographically sign and authenticate such files does not adversely affect the security of the software or an underlying payment terminal.

Describe what the assessor observed in the documentation, evidence, and source code that confirms whether the software supports EMV® payment transactions.
Removed p. 162
B.2.9 The integrity of software prompt files is protected in accordance with Control Objective B.2.8. In Place N/A Not in Place B.2.9.a The assessor shall examine all relevant software documentation and source code necessary to determine whether the software supports the use of data entry prompts and/or prompt files. Where the software supports such features, the assessor shall confirm the software protects the integrity of those prompts as defined in Test Requirements B.2.9.b through B.2.9.c.

If “yes,” complete the remaining reporting instructions for test requirements B.2.9.b through B.2.9.c.
Modified p. 162 → 122
Describe what the assessor observed in the documentation, evidence, and source code that confirms whether the software supports the use of data entry prompts and/or prompt files.
R1 Indicate whether the software supports the use of data entry prompts or prompt files.
Modified p. 162 → 122
Indicate whether the software supports the use of data entry prompts and/or prompt files (yes/no).
R1 Indicate whether the software supports EMV® payment transactions.
Removed p. 163
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides guidance to stakeholders on how to cryptographically sign all prompt files in a manner that supports the cryptographic authentication of those files in accordance with Control Objective B.2.8.

B.2.9.c The assessor shall install and configure the software in accordance with the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods (commercial tools, scripts, etc.) the assessor shall confirm that all prompt files are cryptographically signed in a manner that facilitates the cryptographic authentication of those files by the payment terminal in accordance with B.2.8.

Describe what the assessor observed in any documentation, evidence, and software test results that confirms that all prompt files are cryptographically signed in a manner that supports the cryptographic authentication of those files by an underlying payment terminal …
Modified p. 163 → 121
Identify any documentation and evidence examined in support of this test requirement.
R3 Describe any other assessment activities performed and/or findings for this test requirement.
Removed p. 164
In Place N/A Not in Place B.3.1.a The assessor shall examine all relevant software documentation and source code necessary to identify all external inputs to the software. For each user or other external input, the assessor shall examine all relevant software documentation and source code to confirm that inputs conform to a list of expected characteristics and that all input that does not conform to expected characteristics is rejected by the software or otherwise handled securely.

Describe what the assessor observed in the documentation, evidence, and source code that confirms that all software inputs are checked upon data entry to determine whether the data conforms to a set of expected characteristics.

Describe what the assessor observed in the documentation, evidence, and source code that confirms that all input data that does not conform to the set of expected characteristics are either rejected or handled in a secure manner.
Modified p. 164 → 124
B.3.1 The software validates all user and other external inputs. Note: Control Objectives B.3.1 through B.3.3 are extensions of Control Objective 4.2. Validation of these control objectives should be performed at the same time.
Note: Control Objectives B.3.1 through B.3.3 are extensions of Control Objective 4.2. Validation of these control objectives should be performed at the same time.
Removed p. 165
Describe what the assessor observed in the software testing results that confirms that all software inputs are validated upon data entry.

In Place N/A Not in Place B.3.1.1.a The assessor shall examine all relevant software documentation and source code necessary to identify all terminal software functions where string values are passed as inputs to confirm that all strings are checked for text or data that can be erroneously or maliciously interpreted as a command.

Describe how the assessor ensured that all software inputs where input data is passed to the software as a string value were identified and checked for compliance with the parent control objective.
Modified p. 165 → 124
Describe what the assessor observed in the software testing results that confirms all invalid data or data that does not conform to expected characteristics are either rejected or handled securely.
R3 Describe how the software handles input data that does not conform to the expected characteristics.
Removed p. 166
Describe what the assessor observed in the documentation, evidence, and source code that confirms that all such values are either rejected or handled securely.

Where the software handles such input data rather than rejecting it, describe each of the methods implemented by the software to ensure such input data is handled safely.

Describe what the assessor observed in the software testing results that confirms that all input data containing commands is either rejected or handled safely and securely.
Modified p. 166 → 125
B.3.1.1.b The assessor shall install and configure the software in accordance with the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods (commercial tools, scripts, etc.), the assessor shall test the software by attempting to supply each of the identified functions with data that includes commands to confirm that the software either rejects such inputs or otherwise handles such inputs securely.
B.3.1.1.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods, the assessor shall test the software by attempting to supply each of the identified functions with data that includes commands to confirm that the software either rejects such inputs or otherwise handles such inputs securely.
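The string checks that B.3.1.1 calls for are typically implemented as allow-list validation: input must match the field's expected characteristics, so text that could be interpreted as a command never reaches an interpreter. A minimal sketch, assuming a hypothetical cardholder-name field whose expected characteristics are letters, spaces, hyphens, and apostrophes:

```python
import re

# Allow-list for a hypothetical cardholder-name input: anything outside
# the expected character set (including shell or SQL metacharacters
# that could be erroneously interpreted as a command) is rejected.
NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z '\-]{0,25}$")

def validate_name(value: str) -> bool:
    """Return True only when the input conforms to expected characteristics."""
    return bool(NAME_PATTERN.fullmatch(value))

assert validate_name("O'Brien")                       # conforming input accepted
assert not validate_name("Smith; DROP TABLE cards--") # SQL-style command rejected
assert not validate_name("$(reboot)")                 # shell-style command rejected
```

An assessor testing B.3.1.1.b would supply each identified string input with payloads like the rejected ones above and confirm the software refuses or safely neutralizes them.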
Removed p. 167
Describe what the assessor observed in the documentation, evidence, and source code that confirms that the software conducts checks to confirm that buffers are sized appropriately wherever the software handles buffers.
Modified p. 167 → 126
In Place N/A Not in Place B.3.1.2.a The assessor shall examine all relevant software documentation and source code necessary to identify all software functions that handle buffers and process data supplied by external inputs. For each of the noted functions, the assessor shall confirm that each of the identified functions:
In Place Not in Place N/A B.3.1.2.a The assessor shall examine evidence (including source code) to identify all software functions that handle buffers and process data supplied from untrusted sources. For each of the noted functions, the assessor shall confirm that each of the identified functions:
Modified p. 167 → 126
• Conducts checks that confirm that buffers are sized appropriately for the data they are intended to handle, including consideration for underflows and overflows.
• Conducts checks to confirm that buffers are sized appropriately for the data they are intended to handle, including consideration for underflows and overflows.
Modified p. 167 → 126
Describe how the assessor ensured that all software inputs that handle buffers and process externally provided data were identified.
R3 Describe how the software ensures that buffers are sized appropriately for the data they are intended to store.
Modified p. 167 → 126
Describe what the assessor observed in the documentation, evidence, and source code that confirms that only unsigned variables are used to define buffer sizes.
R2 Identify the evidence obtained that demonstrates that only unsigned variables are used to define buffer sizes.
Modified p. 167 → 126
Describe what the assessor observed in the documentation, evidence, and source code that confirms that all input data that violates buffer size or other memory allocation thresholds are rejected or handled securely.
R4 Describe how the software handles input data that violates buffer size or any other memory allocation thresholds.
Removed p. 168
Describe what the assessor observed in the software testing results that confirms that all input data that violates buffer size or other memory allocation thresholds is rejected or handled securely.

In Place N/A Not in Place B.3.2.a The assessor shall examine all relevant software documentation and source code necessary to identify all software functions that handle the sensitive data predefined in Control Objective 1.2. For each of the noted software functions, the assessor shall confirm that each function:

Describe what the assessor observed in the documentation, evidence, and source code that confirms that the software performs checks on return values to ensure that sensitive data is not inadvertently leaked through error codes or messages.
Modified p. 168 → 126
B.3.1.2.b The assessor shall install and configure the software in accordance with the software vendor’s implementation guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods (commercial tools, scripts, etc.) the assessor shall test the software by attempting to supply each noted function with inputs that violate buffer size thresholds to confirm that the software either rejects or securely handles all such attempts.
B.3.1.2.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods, the assessor shall test the software by attempting to supply each noted function with inputs that violate buffer size thresholds to confirm that the software either rejects or securely handles all such attempts.
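The buffer-handling checks in B.3.1.2 reduce to: validate the input's length against the buffer's capacity before writing, and express sizes only as non-negative quantities. A sketch under assumed names (the buffer size and field are hypothetical); Python's `len()` is always non-negative, which mirrors the requirement that only unsigned variables define buffer sizes (in C the length parameter would be `size_t` for the same reason):

```python
BUFFER_SIZE = 32  # hypothetical fixed-size buffer for a track-data field

def store_input(buffer: bytearray, data: bytes) -> bool:
    """Copy data into the buffer only if it fits; reject threshold violations."""
    if len(data) == 0 or len(data) > BUFFER_SIZE:
        return False  # underflow/overflow attempt: reject the input
    buffer[: len(data)] = data
    return True

buf = bytearray(BUFFER_SIZE)
assert store_input(buf, b";123=456")           # in-bounds input accepted
assert not store_input(buf, b"A" * 64)         # overflow attempt rejected
```

Testing per B.3.1.2.b then consists of supplying each identified function with inputs that exceed (or undercut) the size threshold and confirming rejection or secure handling.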
Modified p. 168 → 126
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the findings in Test Requirement B.3.1.2.a.
Removed p. 169
Describe what the assessor observed in the software testing results that confirms that sensitive data is not exposed or inadvertently leaked through error codes or messages.

In Place N/A Not in Place B.3.3.a The assessor shall examine all relevant software documentation and source code necessary to identify all software functions that rely on synchronous processing. For each of the noted functions, the assessor shall confirm that protection mechanisms have been implemented in the software to mitigate race conditions.

Describe what the assessor observed in the documentation, evidence, and source code that confirms protection mechanisms are implemented in the software to mitigate race conditions.
Modified p. 169 → 127
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the findings in Test Requirement B.3.2.a.
Removed p. 170
Describe what the assessor observed in the software testing results that confirms that the software is resistant to race conditions.
Modified p. 170 → 128
Describe each of the software tests performed in support of this test requirement, including the tool(s) or method(s) used and the scope of each test.
R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the findings in Test Requirement B.3.3.a.
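The race-condition protection mechanisms B.3.3 asks the assessor to confirm usually amount to serializing read-modify-write sequences on shared state. A minimal sketch with assumed names (a hypothetical transaction counter shared between payment and settlement threads); without the lock, the increment below interleaves across threads and loses updates:

```python
import threading

class TransactionCounter:
    """Shared counter; the lock is the protection mechanism against races."""

    def __init__(self) -> None:
        self._count = 0
        self._lock = threading.Lock()

    def record(self) -> None:
        with self._lock:        # serializes the read-modify-write
            self._count += 1

    @property
    def count(self) -> int:
        with self._lock:
            return self._count

counter = TransactionCounter()
threads = [
    threading.Thread(target=lambda: [counter.record() for _ in range(1000)])
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter.count == 8000  # no updates lost under concurrency
```

An assessor would exercise the identified synchronous functions concurrently (as in the threaded driver above) and confirm the results remain consistent.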
Removed p. 171
Summarize the software vendor’s process for testing the software for vulnerabilities or other software flaws prior to software release.
Modified p. 171 → 129
Identify the documentation and evidence examined in support of this test requirement.
R7 Describe any other assessment activities performed and/or findings for this test requirement.
Modified p. 171 → 129
B.4.1 A documented process is maintained and followed for testing software for vulnerabilities prior to each update or release.
In Place Not in Place N/A B.4.1 A documented process is maintained and followed for testing software for vulnerabilities prior to each update or release.
Modified p. 171 → 129
In Place N/A Not in Place B.4.1.a The assessor shall examine all relevant software documentation and evidence necessary to confirm that the software vendor maintains a documented process in accordance with Control Objective 10.2 for testing the software for vulnerabilities prior to each update or release, and that the documented process includes detailed descriptions of how the vendor tests for the following:
In Place Not in Place N/A B.4.1.a The assessor shall examine evidence to confirm that the software vendor maintains a documented process in accordance with Control Objective 10.2 for testing the software for vulnerabilities prior to each update or release, and that the documented process includes detailed descriptions of how the vendor tests for the following:
Modified p. 171 → 129
Describe when and the frequency with which the software is tested for the presence or use of any unnecessary ports and protocols prior to software release.
R1 Describe how the software is tested for the presence of unnecessary ports and protocols and the frequency of this testing.
Modified p. 171 → 129
Describe when and the frequency with which the software is tested for the unintended storage, transmission, or output of clear-text account data.
R2 Describe how the software is tested for the unintended storage, transmission, and output of clear-text account data and the frequency of this testing.
Modified p. 171 → 129
Describe when and the frequency with which the software is tested for the presence of any default user accounts with default or static access credentials.
R3 Describe how the software is tested for the presence of built-in user accounts with default or static authentication credentials and the frequency of this testing.
Modified p. 171 → 129
Describe when and the frequency with which the software is tested for the presence of any hard-coded authentication credentials in code or in configuration files.
R4 Describe how the software is tested for the presence of hard-coded authentication credentials and the frequency of this testing.
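One common form of the pre-release testing B.4.1 describes for hard-coded credentials is a static scan of source and configuration files. A minimal illustrative sketch, not a vendor's actual tooling; the pattern and variable names are assumptions, and real scanners (secret-detection tools) use far richer rule sets:

```python
import re

# Flag string literals assigned to names that suggest authentication
# credentials, the kind of finding B.4.1 expects the vendor's
# pre-release testing process to surface.
CREDENTIAL_PATTERN = re.compile(
    r"(?i)\b(password|passwd|secret|api_key|token)\b\s*=\s*['\"][^'\"]+['\"]"
)

def find_hardcoded_credentials(source: str) -> list:
    """Return each suspicious assignment found in the given source text."""
    return [m.group(0) for m in CREDENTIAL_PATTERN.finditer(source)]

sample = 'timeout = 30\npassword = "hunter2"\nuser = input()'
assert find_hardcoded_credentials(sample) == ['password = "hunter2"']
```

A scan like this would run on every update or release candidate, with hits triaged before the software ships.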
Removed p. 172
B.4.1.b The assessor shall examine all relevant documentation and evidence necessary (such as software testing artifacts, etc.) to confirm that the software is tested for vulnerabilities prior to each release and that the testing covers the following:

Describe what the assessor observed in the documentation and evidence that confirms that the software is routinely tested for the presence or use of unnecessary ports or protocols.

Describe what the assessor observed in the documentation and evidence that confirms that the software is routinely tested for the unintended storage, transmission, or output of clear-text account data.

Describe what the assessor observed in the documentation and evidence that confirms that the software is routinely tested for the presence of default user accounts with default or static access credentials.

Describe what the assessor observed in the documentation and evidence that confirms that the software is routinely tested for the presence of hard-coded authentication credentials in code or in …
Modified p. 172 → 129
Describe when and the frequency with which the software is tested for the presence of any faulty or ineffective software security controls.
R6 Describe how the software is tested for the presence of faulty or ineffective security features or functions and the frequency of this testing.
Modified p. 172 → 130
• The presence of any default user accounts with default or static access credentials.
• The presence of any default user accounts with static access credentials.
Modified p. 172 → 130
Identify the documentation and evidence examined in support of this test requirement.
R1 Identify the evidence obtained that confirms the findings for this test requirement.
Removed p. 173
Describe what the assessor observed in the documentation and evidence that confirms that the software is routinely tested for the presence of faulty or ineffective software security controls.
Removed p. 174
In Place N/A Not in Place B.5.1 The assessor shall examine all relevant software documentation and evidence necessary to confirm that the software vendor provides detailed implementation guidance to stakeholders in accordance with Control Objective 12.1 on how to securely implement and operate the software for all applicable payment terminals.

Describe what the assessor observed in the documentation and evidence that confirms that the software vendor’s guidance is provided to stakeholders in accordance with Control Objective 12.

Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s guidance covers all available software security options and parameters.
Modified p. 174 → 131
B.5.1 The software vendor provides implementation guidance on how to implement and operate the software securely for the payment terminals on which it is to be deployed. Note: This control objective is an extension of Control Objective 12.1. Validation of these control objectives should be performed at the same time.
In Place Not in Place N/A B.5.1 The software vendor provides implementation guidance on how to implement and operate the software securely for the payment terminals on which it is to be deployed.
Modified p. 174 → 131
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor provides detailed guidance to stakeholders on how to securely implement and operate the software for all payment terminals on which the software is to be deployed.
R1 Identify the evidence obtained that details the software vendor’s guidance on the implementation and operation of the software for applicable payment terminals.
Modified p. 174 → 131
In Place N/A Not in Place B.5.1.1 The assessor shall examine software vendor implementation guidance to confirm it includes detailed instructions on how to configure all available security options and parameters of the software in accordance with Control Objective B.1.3.
In Place Not in Place N/A B.5.1.1 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions on how to configure all available security options and parameters of the software in accordance with Control Objective B.1.3.
Removed p. 175
Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s guidance covers all payment terminal security features and functions used by the software.

Describe what the assessor observed in the documentation and evidence that demonstrates that the software vendor’s guidance covers the secure configuration and use of security features and functions for all payment terminals upon which the software is to be deployed.
Modified p. 175 → 131
In Place N/A Not in Place B.5.1.2 The assessor shall examine the software vendor implementation guidance to confirm it includes detailed instructions on how to securely configure the software to use the security features and functions of the payment terminal where applicable.
B.5.1.2 Implementation guidance includes detailed instructions for how to securely configure the software to use the security features and functions of the payment terminal where applicable.
Modified p. 175 → 132
In Place N/A Not in Place B.5.1.3 The assessor shall examine the software vendor implementation guidance to confirm it includes detailed instructions on how to configure the software to securely integrate or use any shared resources provided by the payment terminal in accordance with Control Objective B.2.6.
In Place Not in Place N/A B.5.1.3 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions on how to configure the software to securely integrate or use any shared resources provided by the payment terminal in accordance with Control Objective B.2.6.
Modified p. 175 → 142
Identify the documentation and evidence examined in support of this test requirement.
R2 Identify the evidence obtained that supports this finding.
Modified p. 175 → 159
Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s guidance covers all payment terminal shared resources used by the software.
R1 Describe what the assessor observed in the evidence obtained that confirms access is restricted to the minimum number of origins feasible.
Removed p. 176
In Place N/A Not in Place B.5.1.4 The assessor shall examine the software vendor implementation guidance to confirm it includes detailed instructions on how to cryptographically sign the software files in a manner that facilitates the cryptographic authentication of all such files by the payment terminal in accordance with Control Objective B.2.8.

Describe what the assessor observed in the documentation and evidence that demonstrates that the software vendor’s guidance covers cryptographic signing for all payment terminals upon which the software is to be deployed.

In Place N/A Not in Place B.5.1.5 The assessor shall examine the software vendor implementation guidance to confirm it includes detailed instructions for stakeholders to cryptographically sign all prompt files in accordance with Control Objective B.2.9.
Modified p. 176 → 132
B.5.1.4 Implementation guidance includes detailed instructions on how to cryptographically sign the software files in a manner that facilitates the cryptographic authentication of all such files by the payment terminal.
B.5.1.4 Implementation guidance includes detailed instructions on how to cryptographically sign the software files in a manner that enables the cryptographic authentication of all such files by the payment terminal.
Modified p. 176 → 133
Describe what the assessor observed in the documentation and evidence that indicates that the software vendor’s guidance covers cryptographic signing for all software files.
R1 Describe what the assessor observed in the evidence obtained that confirms that vendor guidance includes instructions on how to cryptographically sign prompt files.
Modified p. 176 → 133
B.5.1.5 Implementation guidance includes instructions for stakeholders to cryptographically sign all prompt files.
In Place Not in Place N/A B.5.1.5 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions for stakeholders to cryptographically sign all prompt files in accordance with Control Objective B.2.9.
Modified p. 176 → 154
Describe what the assessor observed in the documentation and evidence that confirms that the software vendor’s guidance covers cryptographic signing of all prompt files.
R1 Describe what the assessor observed in the evidence obtained that confirms that threats to interfaces that accept file uploads are documented.
Removed p. 177
Describe what the assessor observed in the documentation and evidence that demonstrates the software vendor’s guidance for securely configuring the underlying payment terminal aligns with the payment terminal vendor’s security guidance for each payment terminal upon which the software is to be deployed.
Modified p. 177 → 133
In Place N/A Not in Place B.5.2 The assessor shall examine the payment terminal vendor’s security guidance/policy and the software implementation guidance required in Control Objective B.5.1 to confirm that the software implementation guidance aligns with the payment terminal vendor’s security guidance/policy.
In Place Not in Place N/A B.5.2 The assessor shall examine evidence (including the payment terminal vendor’s security guidance/policy and the guidance required in Control Objective B.5.1) to confirm that the guidance aligns with the payment terminal vendor’s security guidance/policy.
Modified p. 178 → 162
Control Objective Test Requirement Additional Information 3.2 3.2.b A table containing an inventory of all open-source components used by the vendor’s software is attached to this ROV.
Control Objective Test Requirement Additional Information Ex: 3.2 3.2.b A table containing an inventory of all open-source components used by the vendor’s software is attached to this ROV.
Removed p. 179
B.2 Confirmation of Testing Environment Configuration For each of the unique combinations of testing hardware, software and system configurations specified in Section 3.4, confirm the following:
Modified p. 179 → 163
B.1 Confirmation of Testing Environment Used The Secure Software Assessor Company’s Testing Environment, as described in Section 4.5.1 of the Secure Software Program Guide, was used for this assessment.
B.2 Confirmation of Testing Environment Used Indicate whether the Secure Software Assessor Company’s Testing Environment(s) described in Section B.1 adhere(s) to the requirements specified in Section 4.6.1 of the Secure Software Program Guide.
Modified p. 179 → 163
Note: If “no,” then provide reasons why the Secure Software Assessor Company Test Environment is not capable of properly and fully testing all functions of the Payment Software and describe the alternative environment(s) used in the field below:
If “No,” then provide reasons why the Secure Software Assessor Company Test Environment is not capable of properly and fully testing all functions of the Payment Software and describe the alternative environment(s) used in the field below:
Modified p. 179 → 164
Note: If any of the questions below are determined to be “not applicable,” please select “No” for the response and provide a detailed explanation as to why the questions are not applicable in B.3 where prompted.
Note: If any of the questions below are determined to be “not applicable,” select “No” for the response and provide a detailed explanation as to why the questions are not applicable in B.4 where prompted.
Removed p. 180
B.3 Attestation of Test Environment Validation Provide the name of the Secure Software Assessor who attests that all items in table B.1 and B.2 were validated and all details are consistent with the details in the rest of the Report on Validation.
Modified p. 180 → 165
If any of the items in B.2 were marked as “No,” please describe why those items could not be confirmed and why the circumstances surrounding the lack of confirmation are acceptable.
If any of the items in B.3 were marked as “No,” describe why those items could not be confirmed and why the circumstances surrounding the lack of confirmation are acceptable.