Last Updated on February 21, 2022 by Admin 3

CCSP: Certified Cloud Security Professional (CCSP): Part 22

  1. Data labels could include all the following, except:

    • Multifactor authentication
    • Access restrictions
    • Confidentiality level
    • Distribution limitations

    Explanation:
    All the others might be included in data labels, but multifactor authentication is a procedure used for access control, not a label.

  2. In the cloud motif, the data owner is usually:

    • The cloud provider
    • In another jurisdiction
    • The cloud customer
    • The cloud access security broker
    Explanation:
    The data owner is usually considered the cloud customer in a cloud configuration; the data in question is the customer’s information, being processed in the cloud. The cloud provider is only leasing services and hardware to the customer. The cloud access security broker (CASB) only handles access control on behalf of the cloud customer, and is not in direct contact with the production data.
  3. The goals of DLP solution implementation include all of the following, except:

    • Elasticity
    • Policy enforcement
    • Data discovery
    • Loss mitigation
    Explanation:
    DLP does not have anything to do with elasticity, which is the capability of the environment to scale up or down according to demand. All the rest are goals of DLP implementations.
  4. What is the intellectual property protection for a useful manufacturing innovation?

    • Trademark
    • Copyright
    • Patent
    • Trade secret
    Explanation: 
    Patents protect processes (as well as inventions, new plant life, and decorative patterns). The other options protect different types of intellectual property.
  5. The most pragmatic option for data disposal in the cloud is which of the following?

    • Cryptoshredding
    • Overwriting
    • Cold fusion
    • Melting
    Explanation: 
    We don’t have physical ownership, control, or even access to the devices holding the data, so physical destruction, including melting, is not an option. Overwriting is a possibility, but it is complicated by the difficulty of locating all the sectors and storage areas that might have contained our data, and by the likelihood that constant backups in the cloud increase the chance we’ll miss something as it’s being overwritten. Cryptoshredding is the only reasonable alternative. Cold fusion is a red herring.
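The idea behind cryptoshredding can be sketched in a few lines: encrypt the data before it reaches cloud storage, then destroy the only key. The cipher below is a toy (a SHA-256-derived keystream XOR), used purely to illustrate the principle; a real deployment would use a vetted scheme such as AES-GCM.

```python
import hashlib
import secrets

def keystream_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustrative only -- not a real encryption scheme."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Encrypt the record before it ever lands in cloud storage.
key = secrets.token_bytes(32)
ciphertext = keystream_cipher(b"customer record #4471", key)

# Cryptoshredding: destroying the key renders the ciphertext unreadable,
# no matter how many cloud replicas or backups still hold copies of it.
key = None  # the only copy of the key is now gone
```

Because the same operation encrypts and decrypts, anyone holding the key could recover the plaintext; once the key is destroyed, every replica of the ciphertext scattered across the provider's storage is effectively disposed of.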
  6. In the cloud motif, the data processor is usually:

    • The cloud customer
    • The cloud provider
    • The cloud access security broker
    • The party that assigns access rights
    Explanation: 
    In legal terms, when “data processor” is defined, it refers to anyone who stores, handles, moves, or manipulates data on behalf of the data owner or controller. In the cloud computing realm, this is the cloud provider.
  7. What is the intellectual property protection for the tangible expression of a creative idea?

    • Trade secret
    • Copyright
    • Trademark
    • Patent
    Explanation: 
    Copyrights are protected tangible expressions of creative works. The other answers listed are answers to subsequent questions.
  8. The goals of SIEM solution implementation include all of the following, except:

    • Dashboarding
    • Performance enhancement
    • Trend analysis
    • Centralization of log streams
    Explanation: 
    SIEM does not intend to provide any enhancement of performance; in fact, a SIEM solution may decrease performance because of additional overhead. All the rest are goals of SIEM implementations.
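The stated SIEM goals (centralizing log streams, trend analysis, dashboarding) can be illustrated with a minimal sketch; the log lines and source names below are hypothetical.

```python
from collections import Counter

# Hypothetical log lines from several sources, to be pulled into one stream.
streams = {
    "firewall": ["DENY 10.0.0.5", "DENY 10.0.0.5", "ALLOW 10.0.0.9"],
    "webserver": ["GET /login 401", "GET /login 401", "GET / 200"],
}

# Centralization: merge every stream, tagging each event with its origin.
central = [(src, line) for src, lines in streams.items() for line in lines]

# Trend analysis / dashboarding: aggregate counts across the merged stream.
trend = Counter(line for _, line in central)
print(trend.most_common(2))
```

Note the overhead hinted at in the explanation: every event is copied into the central store before any analysis runs, which is why SIEM adds load rather than improving performance.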
  9. Data masking can be used to provide all of the following functionality, except:

    • Secure remote access
    • Test data in sandboxed environments
    • Authentication of privileged users
    • Enforcing least privilege
    Explanation: 
    Data masking does not support authentication in any way. All the others are excellent use cases for data masking.
  10. All of the following are terms used to describe the practice of obscuring original raw data so that only a portion is displayed for operational purposes, except:

    • Tokenization
    • Masking
    • Data discovery
    • Obfuscation
    Explanation: 
    Data discovery is a term used to describe the process of identifying information according to specific traits or categories. The rest are all methods for obscuring data.
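Masking, the simplest of the obscuring techniques listed, can be sketched in a few lines of Python; the helper name and defaults here are illustrative, not from any particular product.

```python
def mask(value: str, visible: int = 4, char: str = "*") -> str:
    """Masking/obfuscation: obscure the raw value, displaying only the
    trailing `visible` characters for operational purposes."""
    if len(value) <= visible:
        return char * len(value)
    return char * (len(value) - visible) + value[-visible:]

print(mask("4111111111111111"))        # ************1111
print(mask("123-45-6789"))             # *******6789
```

A call-center screen, for example, might show only the masked form, so operators can confirm the last four digits without ever seeing the full value.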
  11. DLP solutions can aid in deterring loss due to which of the following?

    • Power failure
    • Performance
    • Bad policy
    • Malicious disclosure
    Explanation: 
    DLP tools can identify outbound traffic that violates the organization’s policies. DLP will not protect against losses due to performance issues or power failures. The DLP solution must be configured according to the organization’s policies, so bad policies will attenuate the effectiveness of DLP tools, not the other way around.
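The policy-enforcement role described above boils down to matching outbound content against organization-defined patterns. The rules below are hypothetical examples of what a DLP policy might contain, sketched with Python's `re` module.

```python
import re

# Hypothetical policy patterns an organization might configure in a DLP tool.
POLICY_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan_outbound(message: str) -> list[str]:
    """Return the names of the policy rules the outbound message violates."""
    return [name for name, pattern in POLICY_PATTERNS.items()
            if pattern.search(message)]

print(scan_outbound("Invoice for SSN 123-45-6789 attached"))  # ['ssn']
```

This also shows why bad policy undermines DLP: the tool can only flag what the configured patterns describe.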
  12. All the following are data analytics modes, except:

    • Data mining
    • Agile business intelligence
    • Refractory iterations
    • Real-time analytics
    Explanation: 
    All the others are data analytics methods, but “refractory iterations” is a nonsense term thrown in as a red herring.
  13. What are the U.S. State Department controls on technology exports known as?

    • DRM
    • ITAR
    • EAR
    • EAL
    Explanation: 
    ITAR (International Traffic in Arms Regulations) is a Department of State program; EAR (Export Administration Regulations) is administered by the Department of Commerce. Evaluation assurance levels (EALs) are part of the Common Criteria standard from ISO. Digital rights management (DRM) tools are used for protecting electronic processing of intellectual property.
  14. When crafting plans and policies for data archiving, we should consider all of the following, except:

    • The backup process
    • Immediacy of the technology
    • Archive location
    • The format of the data
    Explanation: 
    The backup process, the archive location, and the format of the data are all necessary considerations when planning data archiving; how new or "immediate" the technology is does not bear on archiving policy.
  15. DLP solutions can aid in deterring loss due to which of the following?

    • Device failure
    • Randomization
    • Inadvertent disclosure
    • Natural disaster
    Explanation: 
    DLP solutions may protect against inadvertent disclosure. Randomization is a technique for obscuring data, not a risk to data. DLP tools will not protect against risks from natural disasters, or against impacts due to device failure.
  16. DLP can be combined with what other security technology to enhance data controls?

    • SIEM
    • Hypervisors
    • DRM
    • Kerberos
    Explanation: 
    DLP can be combined with DRM to protect intellectual property; both are designed to deal with data that falls into special categories. SIEMs are used for monitoring event logs, not live data movement. Kerberos is an authentication mechanism. Hypervisors are used for virtualization.
  17. The goals of SIEM solution implementation include all of the following, except:

    • Dashboarding
    • Performance enhancement
    • Trend analysis
    • Centralization of log streams
    Explanation: 
    SIEM does not intend to provide any enhancement of performance; in fact, a SIEM solution may decrease performance because of additional overhead. All the rest are goals of SIEM implementations.
  18. Data masking can be used to provide all of the following functionality, except:

    • Test data in sandboxed environments
    • Authentication of privileged users
    • Enforcing least privilege
    • Secure remote access
    Explanation: 
    Data masking does not support authentication in any way. All the others are excellent use cases for data masking.
  19. Cryptographic keys for encrypted data stored in the cloud should be ________________ .

    • Not stored with the cloud provider
    • Generated with redundancy
    • At least 128 bits long
    • Split into groups
    Explanation: 
    Cryptographic keys should not be stored along with the data they secure, regardless of key length. We don’t split crypto keys or generate redundant keys (doing so would violate the principle of secrecy necessary for keys to serve their purpose).
  20. Tokenization requires two distinct _________________ .

    • Personnel
    • Authentication factors
    • Encryption keys
    • Databases
    Explanation: 
    In order to implement tokenization, there will need to be two databases: the database containing the raw, original data, and the token database containing tokens that map to original data. Having two-factor authentication is nice, but certainly not required. Encryption keys are not necessary for tokenization. Two-person integrity does not have anything to do with tokenization.
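The two-database arrangement the explanation describes can be sketched directly; the dictionaries below stand in for the protected original-data store and the token vault, and all names are illustrative.

```python
import secrets

# Two distinct stores, as the explanation describes: one holds the raw
# data, the other maps random surrogate tokens back to it.
original_db: dict[int, str] = {}   # record id -> raw value (tightly protected)
token_db: dict[str, int] = {}      # token -> record id (the token vault)

def store_and_tokenize(value: str) -> str:
    """Store the raw value and hand back a random token in its place."""
    record_id = len(original_db) + 1
    original_db[record_id] = value
    token = secrets.token_urlsafe(12)   # random; no mathematical relation to value
    token_db[token] = record_id
    return token

def detokenize(token: str) -> str:
    """Resolve a token back to the original value via both databases."""
    return original_db[token_db[token]]

t = store_and_tokenize("4111-1111-1111-1111")
print(t != "4111-1111-1111-1111")   # applications downstream see only the token
```

Because the token is random rather than derived from the data, no encryption key is involved; compromise of the token store alone reveals nothing without the separate original-data store.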