Supporting Document
Mandatory Technical Document
Evaluation Activities for Network Device
cPP
February-2015
Version 1.0
CCDB-2015-01-001
Foreword
This is a supporting document, intended to complement the Common Criteria version 3 and
the associated Common Evaluation Methodology for Information Technology Security
Evaluation.
Supporting documents may be “Guidance Documents”, which highlight specific approaches
and application of the standard to areas where no mutual recognition of its application is
required and which, as such, are not of a normative nature, or “Mandatory Technical
Documents”, whose application is mandatory for evaluations whose scope is covered by that
of the supporting document. Not only is the application of the latter class mandatory, but
certificates issued as a result of their application are recognized under the CCRA.
This supporting document has been developed by the Network International Technical
Community (NDFW-iTC) and is designed to be used to support the evaluations of products
against the cPPs identified in section 1.1.
Technical Editor: Network International Technical Community (NDFW-iTC)
Document history:
V1.0, 27 February 2015 (published version)
V0.4, 26 January 2015 (incorporates changes due to comments received from CCDB
review)
V0.3, 17 October 2014 (released version following public review, submitted for CCDB
review)
V0.2, 13 October 2014 (internal draft in response to public review comments, for iTC
review)
V0.1, 5 September 2014 (Initial release for public review)
General Purpose: See section 1.1.
Field of special use: This Supporting Document applies to the evaluation of TOEs claiming
conformance with the collaborative Protection Profile for Network Devices [NDcPP] and
collaborative Protection Profile for Stateful Traffic Filter Firewalls [FWcPP].
Acknowledgements:
This Supporting Document was developed by the Network International Technical
Community with representatives from industry, Government agencies, Common Criteria
Test Laboratories, and members of academia.
Table of Contents
1 INTRODUCTION ............................................................................................... 8
1.1 Technology Area and Scope of Supporting Document ......................................................................... 8
1.2 Structure of the Document ...................................................................................................................... 8
1.3 Glossary .................................................................................................................................................... 9
2 EVALUATION ACTIVITIES FOR SFRS ......................................................... 10
2.1 Security Audit (FAU) ............................................................................... 10
2.1.1 FAU_GEN.1 Audit data generation ............................................................................... 10
2.1.2 FAU_GEN.2 User identity association ........................................................................... 11
2.1.3 FAU_STG.1 Protected audit trail storage ....................................................................... 11
2.1.4 FAU_STG_EXT.1 Protected audit event storage ........................................................... 11
2.1.5 FAU_STG_EXT.2 Counting lost audit data ................................................................ 13
2.1.6 FAU_STG_EXT.3 Display warning for local storage space ........................................ 13
2.2 Cryptographic Support (FCS) .............................................................. 14
2.2.1 FCS_CKM.1 Cryptographic Key Generation ................................................................. 14
2.2.2 FCS_CKM.2 Cryptographic Key Establishment ........................................................... 16
2.2.3 FCS_CKM.4 Cryptographic Key Destruction ................................................................ 19
2.2.4 FCS_COP.1(1) Cryptographic Operation (AES Data Encryption/Decryption) .............. 20
2.2.5 FCS_COP.1(2) Cryptographic Operation (Signature Generation and Verification) ...... 23
2.2.6 FCS_COP.1(3) Cryptographic Operation (Hash Algorithm) ......................................... 24
2.2.7 FCS_COP.1(4) Cryptographic Operation (Keyed Hash Algorithm) .............................. 25
2.2.8 FCS_RBG_EXT.1 Extended: Cryptographic Operation (Random Bit Generation) ....... 26
2.2.9 FCS_HTTPS_EXT.1 HTTPS Protocol........................................................................... 27
2.2.10 FCS_IPSEC_EXT.1 IPsec Protocol ........................................................................... 27
2.2.11 FCS_SSHC_EXT.1 SSH Client................................................................................. 36
2.2.12 FCS_SSHS_EXT.1 SSH Server ................................................................................ 39
2.2.13 FCS_TLSC_EXT.1 Extended: TLS Client Protocol ................................................. 42
2.2.14 FCS_TLSC_EXT.2 Extended: TLS Client Protocol with authentication .................. 46
2.2.15 FCS_TLSS_EXT.1 Extended: TLS Server Protocol ................................................. 50
2.2.16 FCS_TLSS_EXT.2 Extended: TLS Server Protocol with mutual authentication ...... 52
2.3 Identification and Authentication (FIA) .............................................................................. 55
2.3.1 FIA_PMG_EXT.1 Password Management ................................................................... 55
2.3.2 FIA_UIA_EXT.1 User Identification and Authentication ............................................. 56
2.3.3 FIA_UAU_EXT.2 Password-based Authentication Mechanism ................................... 56
2.3.4 FIA_UAU.7 Protected Authentication Feedback .......................................................... 57
2.3.5 FIA_X509_EXT.1 X.509 Certificate Validation ........................................................... 57
2.3.6 FIA_X509_EXT.2 X.509 Certificate Authentication .................................................... 58
2.3.7 FIA_X509_EXT.3 Extended: X509 Certificate Requests .............................................. 59
2.4 Security management (FMT) ................................................................................................ 60
2.4.1 FMT_MOF.1(1)/TrustedUpdate ..................................................................................... 60
2.4.2 FMT_MOF.1(2)/TrustedUpdate ..................................................................................... 60
2.4.3 FMT_MOF.1(1)/Audit ................................................................................................... 60
2.4.4 FMT_MOF.1(2)/Audit ................................................................................................... 61
2.4.5 FMT_MOF.1(1)/AdminAct ............................................................................................ 61
2.4.6 FMT_MOF.1(2)/AdminAct ............................................................................................ 61
2.4.7 FMT_MOF.1/LocSpace Management of security functions behaviour ........................ 62
2.4.8 FMT_MTD.1 Management of TSF Data ....................................................................... 62
2.4.9 FMT_MTD.1/AdminAct Management of TSF Data ..................................................... 62
2.4.10 FMT_SMF.1 Specification of Management Functions .................................................. 63
2.4.11 FMT_SMR.2 Restrictions on security roles .............................................................................. 63
2.5 Protection of the TSF (FPT) .................................................................................................. 63
2.5.1 FPT_SKP_EXT.1 Protection of TSF Data (for reading of all symmetric keys) ............ 63
2.5.2 FPT_APW_EXT.1 Protection of Administrator Passwords .......................................... 64
2.5.3 FPT_TST_EXT.1 TSF testing ........................................................................................ 64
2.5.4 FPT_TST_EXT.2 Self tests based on certificates ........................................................... 65
2.5.5 FPT_TUD_EXT.1 Trusted Update ................................................................................. 65
2.5.6 FPT_TUD_EXT.2 Trusted Update based on certificates ............................................... 67
2.5.7 FPT_STM.1 Reliable Time Stamps ............................................................................... 68
2.5.8 FPT_FLS.1/LocSpace Failure with preservation of secure state .................................... 68
2.6 TOE Access (FTA) ................................................................................................................. 69
2.6.1 FTA_SSL_EXT.1 TSF-initiated Session Locking ......................................................... 69
2.6.2 FTA_SSL.3 TSF-initiated Termination ......................................................................... 69
2.6.3 FTA_SSL.4 User-initiated Termination ........................................................................ 69
2.6.4 FTA_TAB.1 Default TOE Access Banners ................................................................... 70
2.7 Trusted path/channels (FTP) ................................................................................................ 70
2.7.1 FTP_ITC.1 Inter-TSF trusted channel ............................................................................ 70
2.7.2 FTP_TRP.1 Trusted Path ................................................................................................ 71
3 EVALUATION ACTIVITIES FOR SARS ........................................................ 73
3.1 ASE: Security Target Evaluation ......................................................................................... 73
3.1.1 Conformance claims (ASE_CCL.1) ............................................................................... 73
3.2 ADV: Development ................................................................................................................ 74
3.2.1 Basic Functional Specification (ADV_FSP.1)................................................................ 74
3.3 AGD: Guidance Documents .................................................................................................. 75
3.3.1 Operational User Guidance (AGD_OPE.1) .................................................................... 75
3.3.2 Preparative Procedures (AGD_PRE.1) ........................................................................... 76
3.4 ATE: Tests .............................................................................................................................. 77
3.4.1 Independent Testing – Conformance (ATE_IND.1) ....................................................... 77
3.5 AVA: Vulnerability Assessment ............................................................................................ 78
3.5.1 Vulnerability Survey (AVA_VAN.1) ............................................................................. 78
4 REQUIRED SUPPLEMENTARY INFORMATION .......................................... 80
5 REFERENCES ............................................................................................... 81
A. VULNERABILITY ANALYSIS ........................................................................ 82
A.1 Introduction ....................................................................................................................................... 82
A.2 Additional Documentation ................................................................................................................ 82
A.3 Sources of vulnerability information ............................................................................................... 83
A.4 Process for Evaluator Vulnerability Analysis ................................................................................. 85
A.5 Reporting ............................................................................................................................................ 86
A.6 Public Vulnerability Database Entries for Flaw Hypotheses ........................................................ 87
A.7 Additional Flaw Hypotheses ............................................................................................................. 87
A.8 iTC Activities – cPP and Supporting Document Maintenance ...................................................... 87
B. NETWORK DEVICE EQUIVALENCY CONSIDERATIONS ........................... 90
B.1 Introduction ....................................................................................................................................... 90
B.2 Evaluator guidance for determining equivalence ........................................................................... 90
B.3 Strategy .............................................................................................................................................. 92
B.4 Test presentation/Truth in advertising ............................................................................................ 93
List of tables
Table 1 - Evaluation Equivalency Analysis .......................................................................... 92
1 Introduction
1.1 Technology Area and Scope of Supporting Document
1 This Supporting Document defines the Evaluation Activities associated with
the collaborative Protection Profile for Network Devices [NDcPP] and the
collaborative Protection Profile for Stateful Traffic Filter Firewalls [FWcPP].
2 The Network Device technical area has a number of specialised aspects, such
as those relating to the secure implementation and use of protocols, and to
the particular ways in which remote management facilities need to be
assessed across a range of different physical and logical interfaces for
different types of infrastructure devices. This degree of specialisation, and
the associations between individual SFRs in the cPP, make it important for
both efficiency and effectiveness that evaluation activities are given more
specific interpretations than those found in the generic CEM activities.
3 This Supporting Document is mandatory for evaluations of products that
claim conformance to any of the following cPP(s):
a) collaborative Protection Profile for Network Devices [NDcPP]
b) collaborative Protection Profile for Stateful Traffic Filter Firewalls [FWcPP].
4 Although Evaluation Activities are defined mainly for the evaluators to
follow, the definitions in this Supporting Document aim to provide a
common understanding for developers, evaluators and users as to what
aspects of the TOE are tested in an evaluation against the associated cPPs,
and to what depth the testing is carried out. This common understanding in
turn contributes to the goal of ensuring that evaluations against the cPP
achieve comparable, transparent and repeatable results. In general the
definition of Evaluation Activities will also help Developers to prepare for
evaluation by identifying specific requirements for their TOE. The specific
requirements in Evaluation Activities may in some cases clarify the meaning
of SFRs, and may identify particular requirements for the content of Security
Targets (especially the TOE Summary Specification), user guidance
documentation, and possibly supplementary information (e.g. for entropy
analysis or cryptographic key management architecture – see section 4).
1.2 Structure of the Document
5 Evaluation Activities can be defined for both Security Functional
Requirements and Security Assurance Requirements. These are defined in
separate sections of this Supporting Document.
6 If any Evaluation Activity cannot be successfully completed in an evaluation
then the overall verdict for the evaluation is a ‘fail’. In rare cases there may
be acceptable reasons why an Evaluation Activity may be modified or
deemed not applicable for a particular TOE, but this must be agreed with the
Certification Body for the evaluation.
7 In general, if all Evaluation Activities (for both SFRs and SARs) are
successfully completed in an evaluation then it would be expected that the
overall verdict for the evaluation is a ‘pass’. To reach a ‘fail’ verdict when
the Evaluation Activities have been successfully completed would require a
specific justification from the evaluator as to why the Evaluation Activities
were not sufficient for that TOE.
8 Similarly, at the more granular level of Assurance Components, if the
Evaluation Activities for an Assurance Component and all of its related SFR
Evaluation Activities are successfully completed in an evaluation then it
would be expected that the verdict for the Assurance Component is a ‘pass’.
To reach a ‘fail’ verdict for the Assurance Component when these Evaluation
Activities have been successfully completed would require a specific
justification from the evaluator as to why the Evaluation Activities were not
sufficient for that TOE.
1.3 Glossary
9 For definitions of standard CC terminology see [CC] part 1.
10 cPP – collaborative Protection Profile
11 CVE – Common Vulnerabilities and Exposures (database)
12 iTC – International Technical Community
13 SD – Supporting Document
14 Supplementary information – information that is not necessarily included
in the Security Target or guidance documentation, and that may not
necessarily be public. Examples of such information could be entropy
analysis, or description of a cryptographic key management architecture used
in (or in support of) the TOE. The requirement for any such supplementary
information will be identified in the relevant cPP (see description in section
4).
2 Evaluation Activities for SFRs
2.1 Security Audit (FAU)
2.1.1 FAU_GEN.1 Audit data generation
2.1.1.1 Guidance Documentation
15 The evaluator shall check the guidance documentation and ensure that it lists
all of the auditable events and provides a format for audit records. Each audit
record format type must be covered, along with a brief description of each
field. The evaluator shall check to make sure that every audit event type
mandated by the cPP is described and that the description of the fields
contains the information required in FAU_GEN.1.2, and the additional
information specified in the table of audit events.
16 The evaluator shall also make a determination of the administrative actions
that are relevant in the context of the cPP. The evaluator shall examine the
guidance documentation and make a determination of which administrative
commands, including subcommands, scripts, and configuration files, are
related to the configuration (including enabling or disabling) of the
mechanisms implemented in the TOE that are necessary to enforce the
requirements specified in the cPP. The evaluator shall document the
methodology or approach taken while determining which actions in the
administrative guide are security relevant with respect to the cPP. The
evaluator may perform this activity as part of the activities associated with
ensuring that the corresponding guidance documentation satisfies the
requirements related to it.
2.1.1.2 Tests
17 The evaluator shall test the TOE’s ability to correctly generate audit records
by having the TOE generate audit records for the events listed in the table of
audit events and administrative actions listed above. This should include all
instances of an event: for instance, if there are several different I&A
mechanisms for a system, the FIA_UIA_EXT.1 events must be generated for
each mechanism. The evaluator shall test that audit records are generated for
the establishment and termination of a channel for each of the cryptographic
protocols contained in the ST. If HTTPS is implemented, the test
demonstrating the establishment and termination of a TLS session can be
combined with the test for an HTTPS session. Logging of all activities
related to trusted update should be tested in detail and with utmost diligence.
When verifying the test results, the evaluator shall ensure the audit records
generated during testing match the format specified in the guidance
documentation, and that the fields in each audit record have the proper
entries.
18 Note that the testing here can be accomplished in conjunction with the
testing of the security mechanisms directly.
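Although the SD does not mandate any tooling, the comparison of generated audit records against the documented format lends itself to automation. The sketch below assumes a hypothetical single-line record format (ISO timestamp, event type, subject identity, outcome) invented purely for illustration; a real evaluation would derive the pattern from the TOE's guidance documentation.

```python
import re

# Hypothetical record format, invented for illustration:
#   <ISO timestamp> <event type> user=<subject identity> outcome=<success|failure>
# FAU_GEN.1.2 requires at least: date/time, event type, subject identity, outcome.
RECORD_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<event>[A-Z0-9_.]+)\s+"
    r"user=(?P<user>\S+)\s+"
    r"outcome=(?P<outcome>success|failure)\b"
)

def check_record(line):
    """Return the parsed required fields if the record matches the
    documented format, else None."""
    m = RECORD_RE.match(line)
    return m.groupdict() if m else None

good = "2015-02-27T10:15:00 FIA_UIA_EXT.1 user=admin outcome=success"
print(check_record(good))               # all required fields present
print(check_record("malformed line"))   # None: fails the format check
```

A script of this kind only supplements the evaluator's judgement; field semantics (e.g. whether the subject identity is meaningful) must still be checked manually.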
2.1.2 FAU_GEN.2 User identity association
19 This activity should be accomplished in conjunction with the testing of
FAU_GEN.1.1.
2.1.3 FAU_STG.1 Protected audit trail storage
2.1.3.1 TSS
20 The evaluator shall examine the TSS to ensure it describes the amount of
audit data that are stored locally and how these records are protected against
unauthorized modification or deletion. The evaluator shall ensure that the
TSS describes the conditions that must be met for authorized deletion of
audit records.
2.1.3.2 Guidance Documentation
21 The evaluator shall examine the guidance documentation to determine that it
describes any configuration required for protection of the locally stored audit
data against unauthorized modification or deletion.
2.1.3.3 Tests
22 The evaluator shall perform the following tests:
a) Test 1: The evaluator shall access the audit trail as an unauthorized administrator and attempt to modify and delete the audit records. The
evaluator shall verify that these attempts fail.
b) Test 2: The evaluator shall access the audit trail as an authorized administrator and attempt to delete the audit records. The evaluator
shall verify that these attempts succeed. The evaluator shall verify
that only the records authorized for deletion are deleted.
2.1.4 FAU_STG_EXT.1 Protected audit event storage
2.1.4.1 TSS
23 The evaluator shall examine the TSS to ensure it describes the means by
which the audit data are transferred to the external audit server, and how the
trusted channel is provided.
24 The evaluator shall examine the TSS to ensure it describes the amount of
audit data that are stored locally; what happens when the local audit data
store is full; and how these records are protected against unauthorized access.
25 If the TOE complies with FAU_STG_EXT.2 the evaluator shall verify that
the numbers provided by the TOE according to the selection for
FAU_STG_EXT.2 are correct when performing the tests for
FAU_STG_EXT.1.3.
26 The evaluator shall examine the TSS to ensure that it details the behaviour of
the TOE when the storage space for audit data is full. When the option
‘overwrite previous audit record’ is selected this description should include
an outline of the rule for overwriting audit data. If ‘other actions’ are chosen
such as sending the new audit data to an external IT entity, then the related
behaviour of the TOE shall also be detailed in the TSS.
2.1.4.2 Guidance Documentation
27 The evaluator shall also examine the guidance documentation to ensure it
describes how to establish the trusted channel to the audit server, as well as
describe any requirements on the audit server (particular audit server
protocol, version of the protocol required, etc.), as well as configuration of
the TOE needed to communicate with the audit server.
28 The evaluator shall also examine the guidance documentation to determine
that it describes the relationship between the local audit data and the audit
data that are sent to the audit log server. For example, when an audit event is
generated, is it simultaneously sent to the external server and the local store,
or is the local store used as a buffer and “cleared” periodically by sending the
data to the audit server?
29 The evaluator shall also ensure that the guidance documentation describes all
possible configuration options for FAU_STG_EXT.1.3 and the resulting
behaviour of the TOE for each possible configuration. The description of
possible configuration options and resulting behaviour shall correspond to
those described in the TSS.
2.1.4.3 Tests
30 Testing of the trusted channel mechanism for audit will be performed as
specified in the associated assurance activities for the particular trusted
channel mechanism. The evaluator shall perform the following additional
test for this requirement:
a) Test 1: The evaluator shall establish a session between the TOE and the audit server according to the configuration guidance provided.
The evaluator shall then examine the traffic that passes between the
audit server and the TOE during several activities of the evaluator’s
choice designed to generate audit data to be transferred to the audit
server. The evaluator shall observe that these data are not able to be
viewed in the clear during this transfer, and that they are successfully
received by the audit server. The evaluator shall record the particular
software (name, version) used on the audit server during testing.
31 The evaluator shall perform operations that generate audit data and verify
that these data are stored locally. The evaluator shall then perform
operations that generate audit data until the local storage space is exceeded,
and verify that the TOE complies with the behaviour defined in
FAU_STG_EXT.1.3. Depending on the configuration this means that the
evaluator has to check the content of the audit data when the local storage is
just filled to the maximum, and then verify that:
a) The audit data remain unchanged when further auditable events occur, and that recording resumes once the local storage for audit data is
cleared (for the option ‘drop new
audit data’ in FAU_STG_EXT.1.3).
b) The existing audit data is overwritten with every new auditable event that should be tracked according to the specified rule (for the option
‘overwrite previous audit records’ in FAU_STG_EXT.1.3)
c) The TOE behaves as specified (for the option ‘other action’ in FAU_STG_EXT.1.3).
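The observable difference between cases (a) and (b) can be modelled independently of any particular TOE. The following sketch is purely illustrative (the class and policy names are invented, and real storage limits are measured in bytes rather than record counts), but it shows the two behaviours the evaluator is distinguishing:

```python
from collections import deque

class BoundedAuditStore:
    """Toy model of local audit storage exhibiting the two
    FAU_STG_EXT.1.3 behaviours exercised in the tests above."""

    def __init__(self, capacity, policy):
        assert policy in ("drop_new", "overwrite_oldest")
        self.capacity = capacity
        self.policy = policy
        self.records = deque()

    def log(self, record):
        if len(self.records) < self.capacity:
            self.records.append(record)
            return True
        if self.policy == "drop_new":
            return False            # store full: the new record is discarded
        self.records.popleft()      # overwrite: the oldest record gives way
        self.records.append(record)
        return True

    def clear(self):
        self.records.clear()        # recording resumes after clearing

drop = BoundedAuditStore(3, "drop_new")
for i in range(5):
    drop.log(f"event-{i}")
print(list(drop.records))   # events 0..2 survive; events 3 and 4 were dropped

wrap = BoundedAuditStore(3, "overwrite_oldest")
for i in range(5):
    wrap.log(f"event-{i}")
print(list(wrap.records))   # oldest entries overwritten: events 2..4 remain
```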
2.1.5 FAU_STG_EXT.2 Counting lost audit data
32 This activity should be accomplished in conjunction with the testing of
FAU_STG_EXT.1.2 and FAU_STG_EXT.1.3.
2.1.5.1 TSS
33 The evaluator shall examine the TSS to ensure that it details the possible
options the TOE supports for information about the number of audit records
that have been dropped, overwritten, etc. if the local storage for audit data is
full.
2.1.5.2 Guidance Documentation
34 The evaluator shall also ensure that the guidance documentation describes all
possible configuration options and the meaning of the result returned by the
TOE for each possible configuration. The description of possible
configuration options and explanation of the result shall correspond to those
described in the TSS.
35 The evaluator shall verify that the guidance documentation contains a
warning for the administrator about the loss of audit data when the local
storage for audit records is cleared.
2.1.5.3 Tests
36 The evaluator shall verify that the numbers provided by the TOE according
to the selection for FAU_STG_EXT.2 are correct when performing the tests
for FAU_STG_EXT.1.3.
2.1.6 FAU_STG_EXT.3 Display warning for local storage space
37 This activity should be accomplished in conjunction with the testing of
FAU_STG_EXT.1.2 and FAU_STG_EXT.1.3.
2.1.6.1 TSS
38 The evaluator shall examine the TSS to ensure that it details how the user is
warned before the local storage for audit data is full.
2.1.6.2 Guidance Documentation
39 The evaluator shall also ensure that the guidance documentation describes
how the user is warned before the local storage for audit data is full and how
this warning is displayed or stored (since there is no guarantee that an
administrator session is running at the time the warning is issued, it is
probably stored in the log files). The description in the guidance
documentation shall correspond to the description in the TSS.
2.1.6.3 Tests
40 The evaluator shall verify that a warning is issued by the TOE before the
local storage space for audit data is full.
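The expected behaviour is simple to state: the warning must appear before, not at, exhaustion of local storage. A minimal sketch, with an invented fill threshold of 90% standing in for whatever trigger the TOE actually documents:

```python
def storage_warning(used, capacity, threshold=0.9):
    """Illustrative check: a warning must be issued once local audit
    storage crosses a configured fill threshold, before it is full.
    The 90% default is an assumption, not a requirement of the SD."""
    if used >= capacity:
        return "audit storage FULL"
    if used / capacity >= threshold:
        return f"audit storage at {used}/{capacity}: nearly full"
    return None

print(storage_warning(95, 100))   # warning issued before the store is full
print(storage_warning(50, 100))   # None: no warning yet
```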
2.2 Cryptographic Support (FCS)
2.2.1 FCS_CKM.1 Cryptographic Key Generation
2.2.1.1 TSS
41 The evaluator shall ensure that the TSS identifies the key sizes supported by
the TOE. If the ST specifies more than one scheme, the evaluator shall
examine the TSS to verify that it identifies the usage for each scheme.
2.2.1.2 Guidance Documentation
42 The evaluator shall verify that the AGD guidance instructs the administrator
how to configure the TOE to use the selected key generation scheme(s) and
key size(s) for all uses defined in this PP.
2.2.1.3 Tests
43 Note: The following tests require the developer to provide access to a test
platform that provides the evaluator with tools that are typically not found on
factory products.
Key Generation for FIPS PUB 186-4 RSA Schemes
44 The evaluator shall verify the implementation of RSA Key Generation by the
TOE using the Key Generation test. This test verifies the ability of the TSF
to correctly produce values for the key components including the public
verification exponent e, the private prime factors p and q, the public modulus
n and the calculation of the private signature exponent d.
45 Key Pair generation specifies 5 ways (or methods) to generate the primes p
and q. These include:
a) Random Primes:
Provable primes
Probable primes
b) Primes with Conditions:
Primes p1, p2, q1, q2, p and q shall all be provable primes
Primes p1, p2, q1, and q2 shall be provable primes and p and q shall be probable primes
Primes p1, p2, q1, q2, p and q shall all be probable primes
46 To test the key generation method for the Random Provable primes method
and for all the Primes with Conditions methods, the evaluator must seed the
TSF key generation routine with sufficient data to deterministically generate
the RSA key pair. This includes the random seed(s), the public exponent of
the RSA key, and the desired key length. For each key length supported, the
evaluator shall have the TSF generate 25 key pairs. The evaluator shall verify
the correctness of the TSF’s implementation by comparing values generated
by the TSF with those generated from a known good implementation.
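The comparison against a known good implementation ultimately reduces to arithmetic consistency checks on the key components named above. The following is a minimal sketch of those checks using a toy key size; it is not a FIPS 186-4 conformance tool, and a real test would additionally verify the primality of p and q and use the key lengths claimed in the ST.

```python
from math import gcd

def lcm(a, b):
    return a // gcd(a, b) * b

def check_rsa_components(p, q, n, e, d):
    """Consistency checks on RSA key components (e, p, q, n, d):
    n must be the product of p and q, and d must be the inverse of e
    modulo lcm(p-1, q-1). Primality of p and q is not checked here."""
    if p * q != n:
        return False
    lam = lcm(p - 1, q - 1)        # Carmichael function of n
    if gcd(e, lam) != 1:
        return False
    return (e * d) % lam == 1

# Toy-sized example for illustration only:
p, q, e = 61, 53, 17
n = p * q                          # 3233
d = pow(e, -1, lcm(p - 1, q - 1))  # modular inverse of e: 413
print(check_rsa_components(p, q, n, e, d))   # True
```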
Key Generation for Elliptic Curve Cryptography (ECC)
FIPS 186-4 ECC Key Generation Test
47 For each supported NIST curve, i.e., P-256, P-384 and P-521, the evaluator
shall require the implementation under test (IUT) to generate 10
private/public key pairs. The private key shall be generated using an
approved random bit generator (RBG). To determine correctness, the
evaluator shall submit the generated key pairs to the public key verification
(PKV) function of a known good implementation.
FIPS 186-4 Public Key Verification (PKV) Test
48 For each supported NIST curve, i.e., P-256, P-384 and P-521, the evaluator
shall generate 10 private/public key pairs using the key generation function
of a known good implementation and modify five of the public key values so
that they are incorrect, leaving five values unchanged (i.e., correct). The
evaluator shall obtain in response a set of 10 PASS/FAIL values.
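The heart of a PKV function is checking that a candidate public key lies on the curve. The sketch below shows this on-curve test for NIST P-256, using the published domain parameters; a full PKV additionally checks ranges and (for full validation) the order of the point, so this is only the central step.

```python
# Sketch of the core of a public key validation (PKV) check for NIST
# P-256: verify that a candidate public key (x, y) satisfies
# y^2 = x^3 - 3x + b (mod p). Constants are the published NIST P-256
# domain parameters.
P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b
GX = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
GY = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5

def on_curve(x: int, y: int) -> bool:
    if not (0 <= x < P and 0 <= y < P):
        return False                      # range check
    return (y * y - (x * x * x - 3 * x + B)) % P == 0

assert on_curve(GX, GY)          # the base point G is on the curve
assert not on_curve(GX, GY + 1)  # a corrupted key value is rejected
```

The corrupted-key assertion mirrors the test above: five of the ten submitted values are modified and must come back FAIL.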
Key Generation for Finite-Field Cryptography (FFC)
49 The evaluator shall verify the implementation of the Parameters Generation
and the Key Generation for FFC by the TOE using the Parameter Generation
and Key Generation test. This test verifies the ability of the TSF to correctly
produce values for the field prime p, the cryptographic prime q (dividing p-
1), the cryptographic group generator g, and the calculation of the private
key x and public key y.
50 The Parameter generation specifies 2 ways (or methods) to generate the
cryptographic prime q and the field prime p:
Primes q and p shall both be provable primes
Primes q and field prime p shall both be probable primes
51 and two ways to generate the cryptographic group generator g:
Generator g constructed through a verifiable process
Generator g constructed through an unverifiable process.
52 The Key generation specifies 2 ways to generate the private key x:
len(q) bit output of RBG where 1 ≤ x ≤ q-1
len(q) + 64 bit output of RBG, followed by a mod q-1 operation where 1 ≤ x ≤ q-1
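The relationships the Parameter Generation and Key Generation test exercises can be checked arithmetically. The sketch below verifies q | p-1, that g generates the order-q subgroup, the private key range, and the public key derivation; toy-sized primes stand in for real 2048-bit parameters.

```python
# Sketch of arithmetic checks behind the FFC Parameter/Key Generation
# test: q must divide p-1, g must have order q, and y must equal
# g^x mod p. Toy-sized values for illustration only.
def check_ffc(p: int, q: int, g: int, x: int, y: int) -> bool:
    return (
        (p - 1) % q == 0          # q divides p-1
        and 1 < g < p
        and pow(g, q, p) == 1     # g generates the order-q subgroup
        and 1 <= x <= q - 1       # private key range
        and y == pow(g, x, p)     # public key derivation
    )

# Toy example: p = 23, q = 11 (11 divides 22), g = 4 has order 11 mod 23
p, q, g = 23, 11, 4
x = 7
y = pow(g, x, p)                  # 8 for these toy values
assert check_ffc(p, q, g, x, y)
```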
2.2.2.3 Tests
Key Establishment Schemes
59 The evaluator shall verify the implementation of the key establishment
schemes supported by the TOE using the applicable tests below.
SP800-56A Key Establishment Schemes
60 The evaluator shall verify a TOE's implementation of SP800-56A key
agreement schemes using the following Function and Validity tests. These
validation tests for each key agreement scheme verify that a TOE has
implemented the components of the key agreement scheme according to the
specifications in the Recommendation. These components include the
calculation of the DLC primitives (the shared secret value Z) and the
calculation of the derived keying material (DKM) via the Key Derivation
Function (KDF). If key confirmation is supported, the evaluator shall also
verify that the components of key confirmation have been implemented
correctly, using the test procedures described below. This includes the
parsing of the DKM, the generation of MACdata and the calculation of
MACtag.
Function Test
61 The Function test verifies the ability of the TOE to implement the key
agreement schemes correctly. To conduct this test the evaluator shall
generate or obtain test vectors from a known good implementation of the
TOE supported schemes. For each supported key agreement scheme-key
agreement role combination, KDF type, and, if supported, key confirmation
role- key confirmation type combination, the tester shall generate 10 sets of
test vectors. The data set consists of one set of domain parameter values
(FFC) or the NIST approved curve (ECC) per 10 sets of public keys. These
keys are static, ephemeral or both depending on the scheme being tested.
62 The evaluator shall obtain the DKM, the corresponding TOE’s public keys
(static and/or ephemeral), the MAC tag(s), and any inputs used in the KDF,
such as the Other Information field OI and TOE id fields.
63 If the TOE does not use a KDF defined in SP 800-56A, the evaluator shall
obtain only the public keys and the hashed value of the shared secret.
64 The evaluator shall verify the correctness of the TSF’s implementation of a
given scheme by using a known good implementation to calculate the shared
secret value, derive the keying material DKM, and compare hashes or MAC
tags generated from these values.
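The central property the Function test confirms is that both parties derive the same shared secret Z. The sketch below shows a dhEphem-style FFC computation of Z with toy parameters, comparing hashes of Z as described above rather than Z itself; real schemes use SP 800-56A domain parameters and key sizes.

```python
# Sketch of the shared-secret computation the Function test exercises:
# both parties must derive the same Z, and a harness can compare
# hashes of Z. Toy-sized FFC parameters for illustration only.
import hashlib

p, q, g = 23, 11, 4            # toy FFC domain parameters
x_a, x_b = 3, 6                # private keys of parties A and B
y_a, y_b = pow(g, x_a, p), pow(g, x_b, p)   # public keys

z_a = pow(y_b, x_a, p)         # A computes Z from B's public key
z_b = pow(y_a, x_b, p)         # B computes Z from A's public key
assert z_a == z_b              # agreement on the shared secret Z

# Compare hashes of Z, as the Function test does:
h = hashlib.sha256(z_a.to_bytes(2, "big")).hexdigest()
```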
65 If key confirmation is supported, the TSF shall perform the above for each
implemented approved MAC algorithm.
Validity Test
66 The Validity test verifies the ability of the TOE to recognize another party’s
valid and invalid key agreement results with or without key confirmation. To
conduct this test, the evaluator shall obtain a list of the supporting
cryptographic functions included in the SP800-56A key agreement
implementation to determine which errors the TOE should be able to
recognize. The evaluator generates a set of 24 (FFC) or 30 (ECC) test vectors
consisting of data sets including domain parameter values or NIST approved
curves, the evaluator’s public keys, the TOE’s public/private key pairs,
MACTag, and any inputs used in the KDF, such as the other info and TOE id
fields.
67 The evaluator shall inject an error in some of the test vectors to test that the
TOE recognizes invalid key agreement results caused by the following fields
being incorrect: the shared secret value Z, the DKM, the other information
field OI, the data to be MACed, or the generated MACTag. If the TOE
supports full or partial (ECC only) public key validation, the evaluator
will also individually inject errors in both parties’ static public keys, both
parties’ ephemeral public keys and the TOE’s static private key to assure the
TOE detects errors in the public key validation function and/or the partial
key validation function (in ECC only). At least two of the test vectors shall
remain unmodified and therefore should result in valid key agreement results
(they should pass).
68 The TOE shall use these modified test vectors to emulate the key agreement
scheme using the corresponding parameters. The evaluator shall compare the
TOE’s results with the results using a known good implementation verifying
that the TOE detects these errors.
SP800-56B Key Establishment Schemes
69 The evaluator shall verify that the TSS describes whether the TOE acts as a
sender, a recipient, or both for RSA-based key establishment schemes.
70 If the TOE acts as a sender, the following assurance activity shall be
performed to ensure the proper operation of every TOE supported
combination of RSA-based key establishment scheme:
a) To conduct this test the evaluator shall generate or obtain test vectors from a known good implementation of the TOE supported schemes.
For each combination of supported key establishment scheme and its
options (with or without key confirmation if supported, for each
supported key confirmation MAC function if key confirmation is
supported, and for each supported mask generation function if KTS-
OAEP is supported), the tester shall generate 10 sets of test vectors.
Each test vector shall include the RSA public key, the plaintext
keying material, any additional input parameters if applicable, the
MacKey and MacTag if key confirmation is incorporated, and the
outputted ciphertext. For each test vector, the evaluator shall perform
a key establishment encryption operation on the TOE with the same
inputs (in cases where key confirmation is incorporated, the test shall
use the MacKey from the test vector instead of the randomly
generated MacKey used in normal operation) and ensure that the
outputted ciphertext is equivalent to the ciphertext in the test vector.
71 If the TOE acts as a receiver, the following assurance activities shall be
performed to ensure the proper operation of every TOE supported
combination of RSA-based key establishment scheme:
a) To conduct this test the evaluator shall generate or obtain test vectors from a known good implementation of the TOE supported schemes.
For each combination of supported key establishment scheme and its
options (with or without key confirmation if supported, for each
supported key confirmation MAC function if key confirmation is
supported, and for each supported mask generation function if KTS-
OAEP is supported), the tester shall generate 10 sets of test vectors.
Each test vector shall include the RSA private key, the plaintext
keying material (KeyData), any additional input parameters if
applicable, the MacTag in cases where key confirmation is
incorporated, and the outputted ciphertext. For each test vector, the
evaluator shall perform the key establishment decryption operation
on the TOE and ensure that the outputted plaintext keying material
(KeyData) is equivalent to the plaintext keying material in the test
vector. In cases where key confirmation is incorporated, the evaluator
shall perform the key confirmation steps and ensure that the outputted
MacTag is equivalent to the MacTag in the test vector.
b) The evaluator shall ensure that the TSS describes how the TOE handles decryption errors. In accordance with NIST Special
Publication 800-56B, the TOE must not reveal the particular error
that occurred, either through the contents of any outputted or logged
error message or through timing variations. If KTS-OAEP is
supported, the evaluator shall create separate contrived ciphertext
values that trigger each of the three decryption error checks described
in NIST Special Publication 800-56B section 7.2.2.3, ensure that
each decryption attempt results in an error, and ensure that any
outputted or logged error message is identical for each. If KTS-KEM-
KWS is supported, the evaluator shall create separate contrived
ciphertext values that trigger each of the three decryption error
checks described in NIST Special Publication 800-56B section
7.2.3.3, ensure that each decryption attempt results in an error, and
ensure that any outputted or logged error message is identical for
each.
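The uniformity requirement above can be checked with a small harness that feeds each contrived ciphertext to the decryption interface and compares the observed errors. The decryptor below is a hypothetical stand-in, not a real KTS-OAEP implementation; it only shows the shape of such a harness.

```python
# Sketch of the error-uniformity check: ciphertexts that trigger
# different internal decryption checks must all surface the same
# outward error. mock_kts_oaep_decrypt is a hypothetical stand-in.
class DecryptError(Exception):
    pass

def mock_kts_oaep_decrypt(ct: bytes) -> bytes:
    # Three internal checks, all mapped to one indistinguishable error,
    # as SP 800-56B requires (no error detail may leak to the caller).
    if ct[0] in (1, 2, 3):          # stand-ins for the three checks
        raise DecryptError("decryption error")
    return ct[1:]

def observed_errors(cts):
    msgs = []
    for ct in cts:
        try:
            mock_kts_oaep_decrypt(ct)
        except DecryptError as e:
            msgs.append(str(e))
    return msgs

msgs = observed_errors([bytes([1, 0]), bytes([2, 0]), bytes([3, 0])])
assert len(set(msgs)) == 1     # identical message for every error path
```

A real harness would also time each attempt to confirm the absence of timing variations.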
2.2.3 FCS_CKM.4 Cryptographic Key Destruction
2.2.3.1 TSS
72 The evaluator shall check to ensure the TSS lists each type of plaintext key
material and its origin and storage location.
73 The evaluator shall verify that the TSS describes when each type of key
material is cleared (for example, on system power off, on wipe function, on
disconnection of trusted channels, when no longer needed by the trusted
channel per the protocol, etc.).
74 The evaluator shall also verify that, for each type of key, the type of clearing
procedure that is performed (cryptographic erase, overwrite with zeros,
overwrite with random pattern, or block erase) is listed. If different types of
memory are used to store the materials to be protected, the evaluator shall
check to ensure that the TSS describes the clearing procedure in terms of the
memory in which the data are stored (for example, "secret keys stored on
flash are cleared by overwriting once with zeros, while secret keys stored on
the internal persistent storage device are cleared by overwriting three times
with a random pattern that is changed before each write").
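The two overwrite procedures named in the example can be sketched as follows. This is illustrative only: real key destruction is performed in the TOE's implementation language, where compiler optimizations and hidden memory copies must be accounted for; Python's mutable bytearray merely models the idea.

```python
# Sketch of two clearing procedures from the TSS example above:
# overwrite once with zeros, and repeated overwrites with a random
# pattern that changes before each write. Illustrative model only.
import secrets

def clear_with_zeros(key: bytearray) -> None:
    for i in range(len(key)):
        key[i] = 0

def clear_with_random_pattern(key: bytearray, passes: int = 3) -> None:
    for _ in range(passes):
        pattern = secrets.token_bytes(len(key))  # new pattern each pass
        key[:] = pattern
    clear_with_zeros(key)        # finish in a known all-zero state

k = bytearray(b"\xAA" * 16)
clear_with_zeros(k)
assert k == bytearray(16)        # all bytes now zero
```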
2.2.4 FCS_COP.1(1) Cryptographic Operation (AES Data Encryption/Decryption)
2.2.4.1 Tests
AES-CBC Known Answer Tests
75 There are four Known Answer Tests (KATs), described below. In all KATs,
the plaintext, ciphertext, and IV values shall be 128-bit blocks. The results
from each test may either be obtained by the evaluator directly or by
supplying the inputs to the implementer and receiving the results in response.
To determine correctness, the evaluator shall compare the resulting values to
those obtained by submitting the same inputs to a known good
implementation.
76 KAT-1. To test the encrypt functionality of AES-CBC, the evaluator shall
supply a set of 10 plaintext values and obtain the ciphertext value that results
from AES-CBC encryption of the given plaintext using a key value of all
zeros and an IV of all zeros. Five plaintext values shall be encrypted with a
128-bit all-zeros key, and the other five shall be encrypted with a 256-bit all-
zeros key.
77 To test the decrypt functionality of AES-CBC, the evaluator shall perform
the same test as for encrypt, using 10 ciphertext values as input and AES-
CBC decryption.
78 KAT-2. To test the encrypt functionality of AES-CBC, the evaluator shall
supply a set of 10 key values and obtain the ciphertext value that results from
AES-CBC encryption of an all-zeros plaintext using the given key value and
an IV of all zeros. Five of the keys shall be 128-bit keys, and the other five
shall be 256-bit keys.
79 To test the decrypt functionality of AES-CBC, the evaluator shall perform
the same test as for encrypt, using an all-zero ciphertext value as input and
AES-CBC decryption.
80 KAT-3. To test the encrypt functionality of AES-CBC, the evaluator shall
supply the two sets of key values described below and obtain the ciphertext
value that results from AES encryption of an all-zeros plaintext using the
given key value and an IV of all zeros. The first set of keys shall have 128
128-bit keys, and the second set shall have 256 256-bit keys. Key i in each
set shall have the leftmost i bits be ones and the rightmost N-i bits be zeros,
for i in [1,N].
81 To test the decrypt functionality of AES-CBC, the evaluator shall supply the
two sets of key and ciphertext value pairs described below and obtain the
plaintext value that results from AES-CBC decryption of the given ciphertext
using the given key and an IV of all zeros. The first set of key/ciphertext
pairs shall have 128 128-bit key/ciphertext pairs, and the second set of
key/ciphertext pairs shall have 256 256-bit key/ciphertext pairs. Key i in
each set shall have the leftmost i bits be ones and the rightmost N-i bits be
zeros, for i in [1,N]. The ciphertext value in each pair shall be the value that
results in an all-zeros plaintext when decrypted with its corresponding key.
82 KAT-4. To test the encrypt functionality of AES-CBC, the evaluator shall
supply the set of 128 plaintext values described below and obtain the two
ciphertext values that result from AES-CBC encryption of the given plaintext
using a 128-bit key value of all zeros with an IV of all zeros and using a 256-
bit key value of all zeros with an IV of all zeros, respectively. Plaintext value
i in each set shall have the leftmost i bits be ones and the rightmost 128-i bits
be zeros, for i in [1,128].
83 To test the decrypt functionality of AES-CBC, the evaluator shall perform
the same test as for encrypt, using ciphertext values of the same form as the
plaintext in the encrypt test as input and AES-CBC decryption.
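The KAT-3 key sets and KAT-4 plaintext set share one pattern: an N-bit value whose leftmost i bits are ones and remaining N-i bits are zeros. A small generator for these vectors might look like this:

```python
# Sketch of a generator for the KAT-3/KAT-4 input pattern described
# above: N-bit values with the leftmost i bits set, for i in [1, N].
def left_ones(i: int, n_bits: int) -> bytes:
    """N-bit big-endian value with the leftmost i bits ones."""
    value = ((1 << i) - 1) << (n_bits - i)
    return value.to_bytes(n_bits // 8, "big")

# KAT-4 plaintexts: 128 values of 128 bits each
plaintexts = [left_ones(i, 128) for i in range(1, 129)]
assert plaintexts[0] == bytes([0x80]) + bytes(15)   # leftmost bit only
assert plaintexts[-1] == bytes([0xFF] * 16)         # all ones

# KAT-3 key sets: 128 128-bit keys and 256 256-bit keys
keys_128 = [left_ones(i, 128) for i in range(1, 129)]
keys_256 = [left_ones(i, 256) for i in range(1, 257)]
```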
AES-CBC Multi-Block Message Test
84 The evaluator shall test the encrypt functionality by encrypting an
i-block message where 1 < i ≤ 10. The evaluator shall choose a key, an IV
and a plaintext message of length i blocks and encrypt the message, using
the mode to be tested, with the chosen key and IV. The ciphertext shall be
compared to the result of encrypting the same plaintext message with the
same key and IV using a known good implementation.
85 The evaluator shall test the decrypt functionality by decrypting an
i-block message where 1 < i ≤ 10. The evaluator shall choose
a key, an IV and a ciphertext message of length i blocks and decrypt the
message, using the mode to be tested, with the chosen key and IV. The
plaintext shall be compared to the result of decrypting the same ciphertext
message with the same key and IV using a known good implementation.
AES-CBC Monte Carlo Tests
86 The evaluator shall test the encrypt functionality using a set of 200 plaintext,
IV, and key 3-tuples. 100 of these shall use 128 bit keys, and 100 shall use
256 bit keys. The plaintext and IV values shall be 128-bit blocks. For each 3-
tuple, 1000 iterations shall be run as follows:
# Input: PT, IV, Key
for i = 1 to 1000:
if i == 1:
CT[1] = AES-CBC-Encrypt(Key, IV, PT)
PT = IV
else:
CT[i] = AES-CBC-Encrypt(Key, PT)
PT = CT[i-1]
87 The ciphertext computed in the 1000th iteration (i.e., CT[1000]) is the result
for that trial. This result shall be compared to the result of running 1000
iterations with the same values using a known good implementation.
88 The evaluator shall test the decrypt functionality using the same test as for
encrypt, exchanging CT and PT and replacing AES-CBC-Encrypt with AES-
CBC-Decrypt.
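One Monte Carlo trial following the pseudocode above can be sketched as a driver loop. A placeholder 128-bit keyed permutation (not real AES) stands in for the block encryption so the chaining and feedback logic can be shown and run self-contained; with a real AES primitive the same driver reproduces an AESAVS-style CBC MCT trial.

```python
# Sketch of one 1000-iteration CBC Monte Carlo trial. toy_block_encrypt
# is a placeholder for AES-Encrypt(Key, block), NOT real AES.
import hashlib

BLOCK = 16  # 128-bit blocks

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Deterministic keyed stand-in for the AES block encryption.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_mct_trial(key: bytes, iv: bytes, pt: bytes) -> bytes:
    """Return CT[1000] for one (PT, IV, Key) 3-tuple."""
    chain = iv                              # CBC chaining value
    ct_prev = b""
    for i in range(1, 1001):
        ct = toy_block_encrypt(key, xor(chain, pt))  # CT[i]
        pt = iv if i == 1 else ct_prev      # PT for the next iteration
        chain = ct                          # CT[i] feeds the next block
        ct_prev = ct
    return ct

result = cbc_mct_trial(b"\x00" * 16, b"\x00" * 16, b"\x01" * 16)
assert len(result) == BLOCK
```

The trial result (CT[1000]) would then be compared to the same trial run on a known good implementation.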
AES-GCM Test
89 The evaluator shall test the authenticated encrypt functionality of AES-GCM
for each combination of the following input parameter lengths:
128 bit and 256 bit keys
a) Two plaintext lengths. One of the plaintext lengths shall be a non-zero integer multiple of 128 bits, if supported. The other plaintext
length shall not be an integer multiple of 128 bits, if supported.
b) Three AAD lengths. One AAD length shall be 0, if supported. One AAD length shall be a non-zero integer multiple of 128 bits, if
supported. One AAD length shall not be an integer multiple of 128
bits, if supported.
c) Two IV lengths. If 96 bit IV is supported, 96 bits shall be one of the two IV lengths tested.
90 The evaluator shall test the encrypt functionality using a set of 10 key,
plaintext, AAD, and IV tuples for each combination of parameter lengths
above and obtain the ciphertext value and tag that results from AES-GCM
authenticated encrypt. Each supported tag length shall be tested at least once
per set of 10. The IV value may be supplied by the evaluator or the
implementation being tested, as long as it is known.
91 The evaluator shall test the decrypt functionality using a set of 10 key,
ciphertext, tag, AAD, and IV 5-tuples for each combination of parameter
lengths above and obtain a Pass/Fail result on authentication and the
decrypted plaintext if Pass. The set shall include five tuples that Pass and
five that Fail.
92 The results from each test may either be obtained by the evaluator directly or
by supplying the inputs to the implementer and receiving the results in
response. To determine correctness, the evaluator shall compare the resulting
values to those obtained by submitting the same inputs to a known good
implementation.
2.2.5 FCS_COP.1(2) Cryptographic Operation (Signature Generation and Verification)
2.2.5.1 Tests
ECDSA Algorithm Tests
ECDSA FIPS 186-4 Signature Generation Test
93 For each supported NIST curve (i.e., P-256, P-384 and P-521) and SHA
function pair, the evaluator shall generate 10 1024-bit long messages and
obtain for each message a public key and the resulting signature values R and
S. To determine correctness, the evaluator shall use the signature verification
function of a known good implementation.
ECDSA FIPS 186-4 Signature Verification Test
94 For each supported NIST curve (i.e., P-256, P-384 and P-521) and SHA
function pair, the evaluator shall generate a set of 10 1024-bit message,
public key and signature tuples and modify one of the values (message,
public key or signature) in five of the 10 tuples. The evaluator shall obtain in
response a set of 10 PASS/FAIL values.
RSA Signature Algorithm Tests
Signature Generation Test
95 The evaluator shall verify the implementation of RSA Signature Generation
by the TOE using the Signature Generation Test. To conduct this test the
evaluator must generate or obtain 10 messages from a trusted reference
implementation for each modulus size/SHA combination supported by the
TSF. The evaluator shall have the TOE use their private key and modulus
value to sign these messages.
96 The evaluator shall verify the correctness of the TSF’s signature using a
known good implementation and the associated public keys to verify the
signatures.
Signature Verification Test
97 The evaluator shall perform the Signature Verification test to verify the
ability of the TOE to recognize another party’s valid and invalid signatures.
The evaluator shall inject errors into the test vectors produced during the
Signature Verification Test by introducing errors in some of the public keys
e, messages, IR format, and/or signatures. The TOE attempts to verify the
signatures and returns success or failure.
98 The evaluator shall use these test vectors to emulate the signature verification
test using the corresponding parameters and verify that the TOE detects these
errors.
2.2.6 FCS_COP.1(3) Cryptographic Operation (Hash Algorithm)
2.2.6.1 TSS
99 The evaluator shall check that the association of the hash function with other
TSF cryptographic functions (for example, the digital signature verification
function) is documented in the TSS.
2.2.6.2 Guidance Documentation
100 The evaluator checks the AGD documents to determine that any
configuration that is required to configure the required hash sizes is present.
2.2.6.3 Tests
101 The TSF hashing functions can be implemented in one of two modes. The
first mode is the byte-oriented mode. In this mode the TSF only hashes
messages that are an integral number of bytes in length; i.e., the length (in
bits) of the message to be hashed is divisible by 8. The second mode is the
bit-oriented mode. In this mode the TSF hashes messages of arbitrary length.
As there are different tests for each mode, an indication is given in the
following sections for the bit-oriented vs. the byte-oriented tests.
102 The evaluator shall perform all of the following tests for each hash algorithm
implemented by the TSF and used to satisfy the requirements of this PP.
Short Messages Test - Bit-oriented Mode
103 The evaluators devise an input set consisting of m+1 messages, where m is
the block length of the hash algorithm. The length of the messages range
sequentially from 0 to m bits. The message text shall be pseudorandomly
generated. The evaluators compute the message digest for each of the
messages and ensure that the correct result is produced when the messages
are provided to the TSF.
Short Messages Test - Byte-oriented Mode
104 The evaluators devise an input set consisting of m/8+1 messages, where m is
the block length of the hash algorithm. The length of the messages range
sequentially from 0 to m/8 bytes, with each message being an integral
number of bytes. The message text shall be pseudorandomly generated. The
evaluators compute the message digest for each of the messages and ensure
that the correct result is produced when the messages are provided to the
TSF.
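For SHA-256 (block length m = 512 bits) the byte-oriented input set above has m/8 + 1 = 65 messages of 0 to 64 bytes. A sketch of building the set, with Python's hashlib serving as the known good implementation:

```python
# Sketch of the byte-oriented Short Messages test set for SHA-256
# (m = 512 bits, so 65 pseudorandom messages of 0..64 bytes), using
# hashlib as the "known good implementation" side of the comparison.
import hashlib
import os

m = 512                                   # SHA-256 block length in bits
messages = [os.urandom(n) for n in range(m // 8 + 1)]   # 0..64 bytes

reference_digests = [hashlib.sha256(msg).digest() for msg in messages]

# A real harness would now submit `messages` to the TSF and compare
# its digests against `reference_digests`.
assert len(messages) == 65
assert len(messages[0]) == 0 and len(messages[-1]) == 64
```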
Selected Long Messages Test - Bit-oriented Mode
105 The evaluators devise an input set consisting of m messages, where m is the
block length of the hash algorithm (e.g. 512 bits for SHA-256). The length of
the ith message is m + 99*i, where 1 ≤ i ≤ m. The message text shall be
pseudorandomly generated. The evaluators compute the message digest for
each of the messages and ensure that the correct result is produced when the
messages are provided to the TSF.
Selected Long Messages Test - Byte-oriented Mode
106 The evaluators devise an input set consisting of m/8 messages, where m is
the block length of the hash algorithm (e.g. 512 bits for SHA-256). The
length of the ith message is m + 8*99*i, where 1 ≤ i ≤ m/8. The message text
shall be pseudorandomly generated. The evaluators compute the message
digest for each of the messages and ensure that the correct result is produced
when the messages are provided to the TSF.
Pseudorandomly Generated Messages Test
107 This test is for byte-oriented implementations only. The evaluators randomly
generate a seed that is n bits long, where n is the length of the message digest
produced by the hash function to be tested. The evaluators then formulate a
set of 100 messages and associated digests by following the algorithm
provided in Figure 1 of [SHAVS]. The evaluators then ensure that the correct
result is produced when the messages are provided to the TSF.
2.2.7 FCS_COP.1(4) Cryptographic Operation (Keyed Hash Algorithm)
2.2.7.1 TSS
108 The evaluator shall examine the TSS to ensure that it specifies the following
values used by the HMAC function: key length, hash function used, block
size, and output MAC length used.
2.2.7.2 Tests
109 For each of the supported parameter sets, the evaluator shall compose 15 sets
of test data. Each set shall consist of a key and message data. The evaluator
shall have the TSF generate HMAC tags for these sets of test data. The
resulting MAC tags shall be compared to the result of generating HMAC
tags with the same key using a known good implementation.
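For one illustrative parameter set (HMAC-SHA-256 with a 256-bit key and 32-byte MAC; the sizes are assumptions for the sketch), the reference side of the comparison can be built with Python's hmac module as the known good implementation:

```python
# Sketch of generating the 15 reference HMAC tags for one parameter
# set, with the hmac stdlib module as the known good implementation.
import hmac
import hashlib
import os

test_data = [(os.urandom(32), os.urandom(64)) for _ in range(15)]

reference_tags = [
    hmac.new(key, msg, hashlib.sha256).digest()
    for key, msg in test_data
]

# A harness would compare the TSF's tags against reference_tags,
# ideally with a constant-time comparison:
tsf_tag = reference_tags[0]               # stand-in for the TSF output
assert hmac.compare_digest(tsf_tag, reference_tags[0])
assert len(reference_tags) == 15 and len(reference_tags[0]) == 32
```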
2.2.8 FCS_RBG_EXT.1 Extended: Cryptographic Operation (Random Bit Generation)
110 Documentation shall be produced—and the evaluator shall perform the
activities—in accordance with Appendix D of [NDcPP].
2.2.8.1 Tests
111 The evaluator shall perform 15 trials for the RNG implementation. If the
RNG is configurable, the evaluator shall perform 15 trials for each
configuration. The evaluator shall also confirm that the guidance
documentation contains appropriate instructions for configuring the RNG
functionality.
112 If the RNG has prediction resistance enabled, each trial consists of (1)
instantiate DRBG, (2) generate the first block of random bits (3) generate a
second block of random bits (4) uninstantiate. The evaluator verifies that the
second block of random bits is the expected value. The evaluator shall
generate eight input values for each trial. The first is a count (0 – 14). The
next three are entropy input, nonce, and personalization string for the
instantiate operation. The next two are additional input and entropy input for
the first call to generate. The final two are additional input and entropy input
for the second call to generate. These values are randomly generated.
“generate one block of random bits” means to generate random bits with
number of returned bits equal to the Output Block Length (as defined in
NIST SP800-90A).
113 If the RNG does not have prediction resistance, each trial consists of (1)
instantiate DRBG, (2) generate the first block of random bits (3) reseed, (4)
generate a second block of random bits (5) uninstantiate. The evaluator
verifies that the second block of random bits is the expected value. The
evaluator shall generate eight input values for each trial. The first is a count
(0 – 14). The next three are entropy input, nonce, and personalization string
for the instantiate operation. The fifth value is additional input to the first call
to generate. The sixth and seventh are additional input and entropy input to
the call to reseed. The final value is additional input to the second generate
call.
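The no-prediction-resistance call sequence can be sketched as a trial driver. The ToyDRBG below is a deterministic hash-based stand-in, not an SP 800-90A DRBG; it only demonstrates the instantiate/generate/reseed/generate/uninstantiate sequence and the "second block equals the expected value" comparison.

```python
# Sketch of one no-prediction-resistance trial: (1) instantiate,
# (2) generate, (3) reseed, (4) generate, (5) uninstantiate.
# ToyDRBG is a hash-based stand-in, NOT an SP 800-90A DRBG.
import hashlib

OUTLEN = 32   # output block length of the toy construction, in bytes

class ToyDRBG:
    def instantiate(self, entropy, nonce, pstring):
        self.state = hashlib.sha256(entropy + nonce + pstring).digest()

    def reseed(self, entropy, add_input):
        self.state = hashlib.sha256(self.state + entropy + add_input).digest()

    def generate(self, add_input=b""):
        self.state = hashlib.sha256(self.state + add_input).digest()
        return hashlib.sha256(b"out" + self.state).digest()[:OUTLEN]

def run_trial(entropy1, nonce, pstring, add1, entropy2, add2, add3):
    drbg = ToyDRBG()
    drbg.instantiate(entropy1, nonce, pstring)   # (1) instantiate
    drbg.generate(add1)                          # (2) first block
    drbg.reseed(entropy2, add2)                  # (3) reseed
    block2 = drbg.generate(add3)                 # (4) second block
    drbg.uninstantiate = lambda: None            # (5) uninstantiate
    return block2

# Identical trial inputs must yield the same expected second block:
args = (b"e1" * 16, b"n" * 16, b"p", b"a1", b"e2" * 16, b"a2", b"a3")
assert run_trial(*args) == run_trial(*args)
```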
114 The following paragraphs contain more information on some of the input
values to be generated/selected by the evaluator.
Entropy input: the length of the entropy input value must equal the seed
length.
Nonce: If a nonce is supported (CTR_DRBG with no Derivation Function
does not use a nonce), the nonce bit length is one-half the seed length.
Personalization string: The length of the personalization string must be
less than or equal to the seed length.
DISCARD (e.g., drop the packet), and PROTECT (e.g., encrypt the packet)
actions defined in RFC 4301.
119 As noted in section 4.4.1 of RFC 4301, the processing of entries in the SPD
is non-trivial and the evaluator shall determine that the description in the
TSS is sufficient to determine which rules will be applied given the rule
structure implemented by the TOE. For example, if the TOE allows
specification of ranges, conditional rules, etc., the evaluator shall determine
that the description of rule processing (for both inbound and outbound
packets) is sufficient to determine the action that will be applied, especially
in the case where two different rules may apply. This description shall cover
both the initial packets (that is, no SA is established on the interface or for
that particular packet) as well as packets that are part of an established SA.
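First-match processing over an ordered SPD is one common rule structure the TSS description must pin down. The sketch below shows why ordering matters when rules overlap; the rule fields and matching predicate are simplified stand-ins (a real SPD also matches ports, protocols, directions, and ranges).

```python
# Sketch of first-match SPD processing for the three RFC 4301 actions.
# Simplified: matches destination address only.
from ipaddress import ip_address, ip_network

# Ordered SPD: earlier rules take precedence, so overlapping rules are
# resolved by position, which is why ordering must be documented.
SPD = [
    ("10.0.0.0/24", "DISCARD"),   # drop this subnet first
    ("10.0.0.0/8",  "PROTECT"),   # encrypt the rest of 10/8
    ("0.0.0.0/0",   "BYPASS"),    # everything else flows in the clear
]

def spd_action(dst: str) -> str:
    addr = ip_address(dst)
    for net, action in SPD:
        if addr in ip_network(net):
            return action            # first matching rule wins
    return "DISCARD"                 # default-deny if nothing matches

assert spd_action("10.0.0.7") == "DISCARD"
assert spd_action("10.9.9.9") == "PROTECT"
assert spd_action("192.0.2.1") == "BYPASS"
```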
FCS_IPSEC_EXT.1.3
120 The evaluator checks the TSS to ensure it states that the VPN can be
established to operate in transport mode and/or tunnel mode (as identified in
FCS_IPSEC_EXT.1.3).
FCS_IPSEC_EXT.1.4
121 The evaluator shall examine the TSS to verify that the algorithms AES-CBC-
128 and AES-CBC-256 are implemented. If the ST author has selected either
AES-GCM-128 or AES-GCM-256 in the requirement, then the evaluator
verifies the TSS describes these as well. In addition, the evaluator ensures
that the SHA-based HMAC algorithm conforms to the algorithms specified
in FCS_COP.1(4) Cryptographic Operations (for keyed-hash message
authentication).
FCS_IPSEC_EXT.1.5
122 The evaluator shall examine the TSS to verify that IKEv1 and/or IKEv2 are
implemented.
123 For IKEv1 implementations, the evaluator shall examine the TSS to ensure
that, in the description of the IPsec protocol, it states that aggressive mode is
not used for IKEv1 Phase 1 exchanges, and that only main mode is used. It
may be that this is a configurable option.
FCS_IPSEC_EXT.1.6
124 The evaluator shall ensure the TSS identifies the algorithms used for
encrypting the IKEv1 and/or IKEv2 payload, and that the algorithms AES-
CBC-128, AES-CBC-256 are specified, and if others are chosen in the
selection of the requirement, those are included in the TSS discussion.
FCS_IPSEC_EXT.1.7
125 The evaluator shall ensure the TSS identifies the lifetime configuration
method used for limiting the IKEv1 Phase 1 SA lifetime and/or the IKEv2
SA lifetime. The evaluator shall verify that the selection made here
corresponds to the selection in FCS_IPSEC_EXT.1.5.
FCS_IPSEC_EXT.1.8
126 The evaluator shall ensure the TSS identifies the lifetime configuration
method used for limiting the IKEv1 Phase 2 SA lifetime and/or the IKEv2
Child SA lifetime. The evaluator shall verify that the selection made here
corresponds to the selection in FCS_IPSEC_EXT.1.5.
FCS_IPSEC_EXT.1.9
127 The evaluator shall check to ensure that, for each DH group supported, the
TSS describes the process for generating "x". The evaluator shall verify that
the TSS indicates that the random number generated that meets the
requirements in this PP is used, and that the length of "x" meets the
stipulations in the requirement.
FCS_IPSEC_EXT.1.11
128 The evaluator shall check to ensure that the DH groups specified in the
requirement are listed as being supported in the TSS. If there is more than
one DH group supported, the evaluator checks to ensure the TSS describes
how a particular DH group is specified/negotiated with a peer.
FCS_IPSEC_EXT.1.12
129 The evaluator shall check that the TSS describes the potential strengths (in
terms of the number of bits in the symmetric key) of the algorithms that are
allowed for the IKE and ESP exchanges. The TSS shall also describe the
checks that are done when negotiating IKEv1 Phase 2 and/or IKEv2
CHILD_SA suites to ensure that the strength (in terms of the number of bits
of key in the symmetric algorithm) of the negotiated algorithm is less than or
equal to that of the IKE SA that is protecting the negotiation.
FCS_IPSEC_EXT.1.13
130 The evaluator ensures that the TSS identifies RSA and/or ECDSA as being
used to perform peer authentication. The description must be consistent with
the algorithms as specified in FCS_COP.1(2) Cryptographic Operations (for
cryptographic signature).
131 If pre-shared keys are chosen in the selection, the evaluator shall check to
ensure that the TSS describes how pre-shared keys are established and used
in authentication of IPsec connections. The description in the TSS shall also
indicate how pre-shared key establishment is accomplished for TOEs that
can generate a pre-shared key as well as TOEs that simply use a pre-shared
key.
FCS_IPSEC_EXT.1.14
132 The evaluator shall verify that the TSS describes how the DN in the
certificate is compared to the expected DN.
2.2.10.2 Guidance Documentation
FCS_IPSEC_EXT.1.1
133 The evaluator shall examine the guidance documentation to verify it instructs
the Administrator how to construct entries into the SPD that specify a rule
for processing a packet. The description shall cover all three cases: rules that
cause packets to be encrypted/decrypted, rules that cause packets to be
dropped, and rules that allow packets to flow through the TOE without being
encrypted. The evaluator shall determine that the description in the guidance
documentation is consistent with the description in the TSS, and that the
level of detail in the guidance documentation is sufficient to allow the
administrator to set up the SPD in an unambiguous fashion. This includes a
discussion of how the ordering of rules impacts the processing of an IP
packet.
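The impact of rule ordering can be illustrated with a minimal first-match model. This is a hypothetical sketch by the editor, not any particular TOE's implementation; the actions follow the RFC 4301 terminology (PROTECT, BYPASS, DISCARD).

```python
# Hypothetical first-match SPD sketch: the first rule whose selectors
# match decides whether a packet is protected, bypassed, or discarded,
# so a broader rule placed earlier would shadow a narrower one.

from ipaddress import ip_address, ip_network

# Each rule: (destination network selector, action per RFC 4301).
SPD = [
    ("192.0.2.10/32", "DISCARD"),   # specific host rule must come first...
    ("192.0.2.0/24",  "PROTECT"),   # ...or this broader rule would shadow it
    ("0.0.0.0/0",     "BYPASS"),
]

def spd_lookup(dst: str) -> str:
    """Return the action of the first rule matching the destination."""
    for net, action in SPD:
        if ip_address(dst) in ip_network(net):
            return action
    return "DISCARD"  # implicit final discard for unmatched traffic

assert spd_lookup("192.0.2.10") == "DISCARD"
assert spd_lookup("192.0.2.20") == "PROTECT"
assert spd_lookup("198.51.100.1") == "BYPASS"
```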
FCS_IPSEC_EXT.1.3
134 The evaluator shall confirm that the guidance documentation contains
instructions on how to configure the connection in each mode selected.
FCS_IPSEC_EXT.1.4
135 The evaluator checks the guidance documentation to ensure it provides
instructions on how to configure the TOE to use the algorithms, and, if
either AES-GCM-128 or AES-GCM-256 has been selected, the guidance
instructs how to use these as well.
FCS_IPSEC_EXT.1.5
136 The evaluator shall check the guidance documentation to ensure it instructs
the administrator how to configure the TOE to use IKEv1 and/or IKEv2 (as
selected), and uses the guidance to configure the TOE to perform NAT
traversal for the following test (if selected).
137 If the IKEv1 Phase 1 mode requires configuration of the TOE prior to its
operation, the evaluator shall check the guidance documentation to ensure
that instructions for this configuration are contained within that guidance.
FCS_IPSEC_EXT.1.6
138 The evaluator ensures that the guidance documentation describes the
configuration of the mandated algorithms, as well as any additional
algorithms selected in the requirement. The guidance is then used to
configure the TOE to perform the following test for each ciphersuite
selected.
FCS_IPSEC_EXT.1.7
139 The evaluator shall verify that the values for SA lifetimes can be configured
and that the instructions for doing so are located in the guidance
documentation. If time-based limits are supported, the evaluator ensures that
the Administrator is able to configure Phase 1 SA values for 24 hours.
Currently there are no values mandated for the number of bytes; the
evaluator just ensures that this can be configured if selected in the
requirement.
FCS_IPSEC_EXT.1.8
140 The evaluator shall verify that the values for SA lifetimes can be configured
and that the instructions for doing so are located in the guidance
documentation. If time-based limits are supported, the evaluator ensures that
the Administrator is able to configure Phase 2 SA values for 8 hours.
Currently there are no values mandated for the number of bytes; the
evaluator just ensures that this can be configured if selected in the
requirement.
FCS_IPSEC_EXT.1.11
141 The evaluator ensures that the guidance documentation describes the
configuration of the mandated algorithms, as well as any additional
algorithms selected in the requirement. The guidance is then used to
configure the TOE to perform the following test for each ciphersuite
selected.
FCS_IPSEC_EXT.1.13
142 The evaluator ensures the guidance documentation describes how to set up
the TOE to use certificates with RSA and/or ECDSA signatures and public
keys.
143 The evaluator shall check that the guidance documentation describes how
pre-shared keys are to be generated and established. The description in the
guidance documentation shall also indicate how pre-shared key
establishment is accomplished for TOEs that can generate a pre-shared key
as well as TOEs that simply use a pre-shared key.
144 In order to construct the environment and configure the TOE for the
following tests, the evaluator will ensure that the guidance documentation
describes how to configure the TOE to connect to a trusted CA, and ensure a
valid certificate for that CA is loaded into the TOE and marked “trusted”.
FCS_IPSEC_EXT.1.14
145 The evaluator shall ensure that the guidance documentation includes
configuration of the expected DN for the connection.
2.2.10.3 Tests
FCS_IPSEC_EXT.1.1
146 The evaluator uses the guidance documentation to configure the TOE to
carry out the following tests:
a) Test 1: The evaluator shall configure the SPD such that there is a rule for dropping a packet, encrypting a packet, and allowing a packet to
flow in plaintext. The selectors used in the construction of the rule
shall be different such that the evaluator can generate a packet and
send packets to the gateway with the appropriate fields (fields that are
used by the rule - e.g., the IP addresses, TCP/UDP ports) in the
packet header. The evaluator performs both positive and negative test
cases for each type of rule (e.g., a packet that matches the rule and
another that does not). The evaluator observes, via the audit trail and
packet captures, that the TOE exhibited the expected behavior: appropriate
packets were dropped, allowed to flow without modification, or encrypted
by the IPsec implementation.
b) Test 2: The evaluator shall devise several tests that cover a variety of scenarios for packet processing. As with Test 1, the evaluator ensures
both positive and negative test cases are constructed. These scenarios
must exercise the range of possibilities for SPD entries and
processing modes as outlined in the TSS and guidance
documentation. Potential areas to cover include rules with
overlapping ranges and conflicting entries, inbound and outbound
packets, and packets that establish SAs as well as packets that belong
to established SAs. The evaluator shall verify, via the audit trail and
packet captures, for each scenario that the expected behavior is
exhibited, and is consistent with both the TSS and the guidance
documentation.
FCS_IPSEC_EXT.1.2
147 The assurance activity for this element is performed in conjunction with the
activities for FCS_IPSEC_EXT.1.1.
148 The evaluator uses the guidance documentation to configure the TOE to
carry out the following tests:
149 The evaluator shall configure the SPD such that there is a rule for dropping a
packet, encrypting a packet, and allowing a packet to flow in plaintext. The
evaluator may use the SPD that was created for verification of
FCS_IPSEC_EXT.1.1. The evaluator shall construct a network packet that
matches the rule to allow the packet to flow in plaintext and send that packet.
The evaluator should observe that the network packet is passed to the proper
destination interface with no modification. The evaluator shall then modify a
field in the packet header such that it no longer matches the evaluator-
created entries (there may be a “TOE created” final entry that discards
packets that do not match any previous entries). The evaluator sends the
packet, and observes that the packet was dropped.
FCS_IPSEC_EXT.1.3
150 The evaluator shall perform the following test(s) based on the selections
chosen:
a) Test 1 (conditional): If tunnel mode is selected, the evaluator uses the guidance documentation to configure the TOE to operate in tunnel
mode and also configures a VPN peer to operate in tunnel mode. The
evaluator configures the TOE and the VPN peer to use any of the
allowable cryptographic algorithms, authentication methods, etc. to
ensure an allowable SA can be negotiated. The evaluator shall then
initiate a connection from the TOE to connect to the VPN peer. The
evaluator observes (for example, in the audit trail and the captured
packets) that a successful connection was established using the tunnel
mode.
b) Test 2: The evaluator uses the guidance documentation to configure the TOE to operate in transport mode and also configures a VPN peer
to operate in transport mode. The evaluator configures the TOE and
the VPN peer to use any of the allowed cryptographic algorithms,
authentication methods, etc. to ensure an allowable SA can be
negotiated. The evaluator then initiates a connection from the TOE to
connect to the VPN peer. The evaluator observes (for example, in the
audit trail and the captured packets) that a successful connection was
established using the transport mode.
FCS_IPSEC_EXT.1.4
151 The evaluator shall configure the TOE as indicated in the guidance
documentation configuring the TOE to use each of the supported algorithms,
attempt to establish a connection using ESP, and verify that the attempt
succeeds.
FCS_IPSEC_EXT.1.5
152 Tests are performed in conjunction with the other IPsec evaluation activities.
153 (conditional): The evaluator shall configure the TOE as indicated in the
guidance documentation, and attempt to establish a connection using an
IKEv1 Phase 1 connection in aggressive mode. This attempt should fail. The
evaluator should then show that main mode exchanges are supported.
154 (conditional): The evaluator shall configure the TOE so that it will perform
NAT traversal processing as described in the TSS and RFC 5996, section
2.23. The evaluator shall initiate an IPsec connection and determine that the
NAT is successfully traversed.
FCS_IPSEC_EXT.1.6
155 The evaluator shall configure the TOE to use the ciphersuite under test to
encrypt the IKEv1 and/or IKEv2 payload and establish a connection with a
peer device, which is configured to only accept the payload encrypted using
the indicated ciphersuite. The evaluator will confirm that the indicated
algorithm was the one used in the negotiation.
FCS_IPSEC_EXT.1.7
156 When testing this functionality, the evaluator needs to ensure that both sides
are configured appropriately. From RFC 5996: “A difference between IKEv1
and IKEv2 is that in IKEv1 SA lifetimes were negotiated. In IKEv2, each
end of the SA is responsible for enforcing its own lifetime policy on the SA
and rekeying the SA when necessary. If the two ends have different lifetime
policies, the end with the shorter lifetime will end up always being the one to
request the rekeying. If the two ends have the same lifetime policies, it is
possible that both will initiate a rekeying at the same time (which will result
in redundant SAs). To reduce the probability of this happening, the timing of
rekeying requests SHOULD be jittered.”
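The jitter the RFC recommends can be sketched as scheduling each end's rekey at a randomised point shortly before the hard lifetime. This is an editor's illustration only; the 85-95% window is an assumption chosen for the example, not a value mandated by the cPP or the RFC.

```python
# Illustrative jittered-rekey sketch: pick a rekey instant at a random
# fraction of the configured lifetime, so two peers with identical
# lifetime policies rarely initiate rekeying at the same moment.

import random

def rekey_time(lifetime_s: float, rng: random.Random) -> float:
    """Pick a rekey instant between 85% and 95% of the configured lifetime
    (window chosen for illustration)."""
    return lifetime_s * rng.uniform(0.85, 0.95)

rng = random.Random(0)
t = rekey_time(24 * 3600, rng)            # 24-hour Phase 1 SA
assert 0.85 * 24 * 3600 <= t < 24 * 3600  # always rekeys before expiry
```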
157 Each of the following tests shall be performed for each version of IKE
selected in the FCS_IPSEC_EXT.1.5 protocol selection:
a) Test 1 (Conditional): The evaluator shall configure a maximum lifetime in terms of the number of bytes allowed following the
guidance documentation. The evaluator shall configure a test peer
with a byte lifetime that exceeds the lifetime of the TOE. The
evaluator shall establish an SA between the TOE and the test peer,
and determine that once the allowed number of bytes through this SA
is exceeded, a new SA is negotiated. The evaluator shall verify that
the TOE initiates a Phase 1 negotiation.
b) Test 2 (Conditional): The evaluator shall configure a maximum lifetime of 24 hours for the Phase 1 SA following the guidance
documentation. The evaluator shall configure a test peer with a
lifetime that exceeds the lifetime of the TOE. The evaluator shall
establish an SA between the TOE and the test peer, maintain the
Phase 1 SA for 24 hours, and determine that once 24 hours has
elapsed, a new Phase 1 SA is negotiated. The evaluator shall verify
that the TOE initiates a Phase 1 negotiation.
FCS_IPSEC_EXT.1.8
158 When testing this functionality, the evaluator needs to ensure that both sides
are configured appropriately. From RFC 5996: “A difference between IKEv1
and IKEv2 is that in IKEv1 SA lifetimes were negotiated. In IKEv2, each
end of the SA is responsible for enforcing its own lifetime policy on the SA
and rekeying the SA when necessary. If the two ends have different lifetime
policies, the end with the shorter lifetime will end up always being the one to
request the rekeying. If the two ends have the same lifetime policies, it is
possible that both will initiate a rekeying at the same time (which will result
in redundant SAs). To reduce the probability of this happening, the timing of
rekeying requests SHOULD be jittered.”
159 Each of the following tests shall be performed for each version of IKE
selected in the FCS_IPSEC_EXT.1.5 protocol selection:
a) Test 1 (Conditional): The evaluator shall configure a maximum lifetime in terms of the number of bytes allowed following the
guidance documentation. The evaluator shall configure a test peer
with a byte lifetime that exceeds the lifetime of the TOE. The
evaluator shall establish an SA between the TOE and the test peer,
and determine that once the allowed number of bytes through this SA
is exceeded, a new SA is negotiated. The evaluator shall verify that
the TOE initiates a Phase 2 negotiation.
b) Test 2 (Conditional): The evaluator shall configure a maximum lifetime of 8 hours for the Phase 2 SA following the guidance
documentation. The evaluator shall configure a test peer with a
lifetime that exceeds the lifetime of the TOE. The evaluator shall
establish an SA between the TOE and the test peer, maintain the
Phase 2 SA for 8 hours, and determine that once 8 hours has elapsed,
a new Phase 2 SA is negotiated. The evaluator shall verify that the
TOE initiates a Phase 2 negotiation.
FCS_IPSEC_EXT.1.10
160 (conditional) If the first selection is chosen, the evaluator shall check to
ensure that, for each DH group supported, the TSS describes the process for
generating each nonce. The evaluator shall verify that the TSS indicates that
a random number generated in accordance with the requirements in this PP
is used, and that the length of the nonces meets the stipulations in the
requirement.
161 (conditional) If the second selection is chosen, the evaluator shall check to
ensure that, for each PRF hash supported, the TSS describes the process for
generating each nonce. The evaluator shall verify that the TSS indicates that
a random number generated in accordance with the requirements in this PP
is used, and that the length of the nonces meets the stipulations in the
requirement.
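For the second selection, the length stipulation ties the nonce size to the negotiated PRF. A minimal sketch of that arithmetic follows; reading the requirement as "at least 128 bits and at least half the PRF output size" is the editor's interpretation of the selection text, and the PRF output sizes are the standard SHA digest lengths.

```python
# Illustrative nonce-length arithmetic for the PRF-based selection:
# the nonce must be at least 128 bits, and at least half the output
# size of the negotiated PRF hash (editor's reading, for illustration).

PRF_OUTPUT_BITS = {
    "HMAC-SHA-256": 256,
    "HMAC-SHA-384": 384,
    "HMAC-SHA-512": 512,
}

def min_nonce_bits(prf: str) -> int:
    """Minimum nonce length in bits for the given negotiated PRF."""
    return max(128, PRF_OUTPUT_BITS[prf] // 2)

assert min_nonce_bits("HMAC-SHA-256") == 128  # half of 256 equals the floor
assert min_nonce_bits("HMAC-SHA-512") == 256  # half of 512 dominates
```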
FCS_IPSEC_EXT.1.11
162 For each supported DH group, the evaluator shall test to ensure that all
supported IKE protocols can be successfully completed using that particular
DH group.
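When planning the per-group test runs, a small reference table of the group identifiers can help. The group names below follow the IANA IKE registry; the security-strength estimates are the commonly cited NIST SP 800-57 figures and are included as an assumption, since the cPP itself only lists the groups.

```python
# Reference table (editor's assumption, not cPP text): IANA DH group
# numbers commonly selected in the requirement, with modulus/curve size
# and approximate security strength per NIST SP 800-57 estimates.

DH_GROUPS = {
    14: ("2048-bit MODP", 112),
    19: ("256-bit random ECP", 128),
    20: ("384-bit random ECP", 192),
    24: ("2048-bit MODP with 256-bit POS", 112),
}

for group, (name, strength) in sorted(DH_GROUPS.items()):
    print(f"DH group {group}: {name}, ~{strength}-bit strength")
```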
FCS_IPSEC_EXT.1.12
163 The evaluator simply follows the guidance to configure the TOE to perform
the following tests.
a) Test 1: This test shall be performed for each version of IKE supported. The evaluator shall successfully negotiate an IPsec
connection using each of the supported algorithms and hash functions
identified in the requirements.
b) Test 2: This test shall be performed for each version of IKE supported. The evaluator shall attempt to establish an SA for ESP that
selects an encryption algorithm with more strength than that being
used for the IKE SA (i.e., symmetric algorithm with a key size larger
than that being used for the IKE SA). Such attempts should fail.
c) Test 3: This test shall be performed for each version of IKE supported. The evaluator shall attempt to establish an IKE SA using
an algorithm that is not one of the supported algorithms and hash
functions identified in the requirements. Such an attempt should fail.
d) Test 4: This test shall be performed for each version of IKE supported. The evaluator shall attempt to establish an SA for ESP
(assumes the proper parameters were used to establish the IKE SA)
that selects an encryption algorithm that is not identified in
FCS_IPSEC_EXT.1.4. Such an attempt should fail.
FCS_IPSEC_EXT.1.13
164 For efficiency's sake, the testing that is performed may be combined with the
testing for FIA_X509_EXT.1, FIA_X509_EXT.2 (for IPsec connections),
and FCS_IPSEC_EXT.1.1. The following tests shall be repeated for each
peer authentication selected in the FCS_IPSEC_EXT.1.1 selection above:
a) Test 1: The evaluator shall configure the TOE to use a private key and associated certificate signed by a trusted CA and shall establish
an IPsec connection with the peer.
b) Test 2 [conditional]: The evaluator shall generate a pre-shared key off-TOE and use it, as indicated in the guidance documentation, to
establish an IPsec connection with the peer.
FCS_IPSEC_EXT.1.14
165 The evaluator shall, if necessary, configure the expected DN according to the
guidance documentation. The evaluator shall send a peer certificate signed
by a trusted CA with a DN that does not match an expected DN and verify
that the TOE denies the connection.
2.2.11 FCS_SSHC_EXT.1 SSH Client
2.2.11.1 TSS
FCS_SSHC_EXT.1.2
166 The evaluator shall check to ensure that the TSS contains a description of the
public key algorithms that are acceptable for use for authentication, that this
list conforms to FCS_SSHC_EXT.1.5, and ensure that password-based