CCSP Domain 2: Cloud Data Security

Section B Review: Data Protection (10 scenarios)
This section review tests your ability to apply concepts from the preceding modules to realistic exam scenarios. Work through each question, commit to an answer, then reveal the reasoning. Focus on understanding WHY the correct answer is right and why the distractors are wrong.

Scenario 1

An organization uses envelope encryption via a cloud KMS. An attacker compromises one data encryption key (DEK). The security team panics and wants to rotate ALL keys across the entire system. Is this necessary?

  1. A) No — DEKs cannot be rotated without recreating all data
  2. B) No — envelope encryption isolates blast radius. Only the data encrypted by the compromised DEK is at risk; other DEKs and the KEK remain secure. Only the compromised DEK needs rotation, and only the affected data needs re-encryption
  3. C) Yes — any key compromise means the entire encryption system is compromised
  4. D) Yes — the KEK is automatically compromised when any DEK is exposed
Answer & reasoning

Correct: B

Envelope encryption's primary benefit is blast-radius containment. Each DEK protects a limited scope of data. Compromising one DEK only exposes that scope. The KEK and other DEKs remain secure.
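The containment logic can be sketched as follows. This is a toy model, not a real KMS: the XOR stream construction, the `keystream` helper, and the record names are illustrative stand-ins for an authenticated cipher such as AES-GCM managed by a cloud KMS.

```python
# Toy envelope-encryption sketch (illustration only; a real system would use
# an authenticated cipher such as AES-GCM via a KMS, not this XOR stream).
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with a key-derived stream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# One KEK wraps many DEKs; each DEK protects only its own scope of data.
kek = secrets.token_bytes(32)
deks = {name: secrets.token_bytes(32) for name in ("orders", "users")}
plaintexts = {"orders": b"order #1001", "users": b"alice,bob"}
vault = {}  # name -> (wrapped DEK, ciphertext)
for name, dek in deks.items():
    vault[name] = (xor_crypt(kek, dek), xor_crypt(dek, plaintexts[name]))

# Compromising the "orders" DEK exposes only the "orders" data;
# the "users" ciphertext and the KEK are untouched.
stolen_dek = deks["orders"]
assert xor_crypt(stolen_dek, vault["orders"][1]) == b"order #1001"
assert xor_crypt(stolen_dek, vault["users"][1]) != b"alice,bob"
```

Rotating after the compromise means generating a new "orders" DEK, re-encrypting only that record, and re-wrapping the new DEK under the KEK; nothing else changes.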

Scenario 2

A compliance requirement mandates that credit card numbers in a cloud database be protected in a way that removes the database from PCI DSS scope while still allowing authorized transaction processing. Which technique is MOST appropriate?

  1. A) AES-256 encryption with customer-managed keys
  2. B) Dynamic data masking for database queries
  3. C) Tokenization — replacing credit card numbers with tokens removes PCI scope from systems handling only tokens, while the secure token vault allows authorized recovery for processing
  4. D) SHA-256 hashing of credit card numbers
Answer & reasoning

Correct: C

Tokenization replaces sensitive data with tokens that have no mathematical relationship to the original. Systems handling only tokens are out of PCI scope. The token vault (in scope) enables authorized recovery. Encryption keeps data in scope because the encrypted data is still cardholder data.
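A minimal token-vault sketch, with hypothetical `TokenVault`, `tokenize`, and `detokenize` names; real tokenization services add access control, auditing, and format-preserving tokens.

```python
import secrets

class TokenVault:
    """Maps random tokens to original PANs. Tokens carry no mathematical
    relationship to the card number, so systems holding only tokens are
    out of PCI DSS scope; only this vault remains in scope."""
    def __init__(self):
        self._vault = {}  # token -> PAN (the only in-scope store)

    def tokenize(self, pan: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, not derived from the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]  # authorized recovery for processing

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"              # downstream systems see only this
assert vault.detokenize(token) == "4111111111111111"
```

Contrast this with encryption: an encrypted PAN is still cardholder data, so the systems storing it stay in scope.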

Scenario 3

A development team needs production-quality data for testing. They copy the production database to a test environment and encrypt it with a different key. A security assessor says this is insufficient. Why?

  1. A) Encrypted production data in test is adequately protected
  2. B) The assessor is being overly cautious
  3. C) Encryption alone does not address the core issue — test environment users who have the test encryption key can decrypt and access full production data. The data should be statically masked to permanently replace sensitive values with realistic but fictional data
  4. D) The test encryption key should be the same as the production key
Answer & reasoning

Correct: C

Encrypting production data in test just moves the access problem to key management. Anyone with the test key decrypts full production data. Static masking permanently replaces sensitive data with fictional equivalents that cannot be reversed, which is the appropriate solution.
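A static-masking sketch under stated assumptions: `mask_record` is a hypothetical helper, and real masking tools additionally preserve referential integrity and data formats across tables.

```python
# Sensitive columns are permanently replaced with realistic but fictional
# values before the copy ever reaches the test environment.
import random

def mask_record(record: dict, rng: random.Random) -> dict:
    """Return a masked copy; original values are not derivable from it."""
    return {
        "id": record["id"],                            # non-sensitive key kept
        "name": f"User{rng.randrange(10000, 99999)}",  # fictional name
        "ssn": f"{rng.randrange(900) + 100:03d}-"
               f"{rng.randrange(100):02d}-"
               f"{rng.randrange(10000):04d}",          # fictional SSN
    }

rng = random.Random(7)  # seeded only so the example is repeatable
prod = {"id": 42, "name": "Alice Smith", "ssn": "123-45-6789"}
test_copy = mask_record(prod, rng)
assert test_copy["id"] == 42                    # still usable for testing joins
assert test_copy["name"] != "Alice Smith"       # irreversible: no key exists
```

Unlike the encrypted copy in the scenario, there is no key that turns `test_copy` back into production data.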

Scenario 4

A DLP system monitors cloud storage and email but not SaaS collaboration tools. An employee shares a sensitive financial report through a team messaging platform. The DLP does not detect or prevent this. What control gap exists?

  1. A) The employee should have known not to share sensitive data through messaging
  2. B) Email DLP should have intercepted the message since it traverses the same network
  3. C) The DLP deployment is incomplete — it covers storage and email but not SaaS collaboration channels. A CASB with DLP capabilities should be deployed to extend monitoring to all SaaS data paths including messaging
  4. D) DLP cannot monitor SaaS messaging platforms
Answer & reasoning

Correct: C

DLP must cover all data paths. SaaS messaging platforms are a separate data path from email and storage. CASBs extend DLP monitoring to SaaS applications through API integration and proxy capabilities.
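The gap analysis behind this answer can be expressed as a simple set difference; the channel names below are hypothetical examples, not a standard taxonomy.

```python
# Compare the data paths DLP actually monitors against the data paths in use;
# anything in the difference can move sensitive data without inspection.
def dlp_coverage_gaps(monitored: set, in_use: set) -> set:
    """Data paths where sensitive data can leave without DLP inspection."""
    return in_use - monitored

monitored = {"email", "cloud_storage"}
in_use = {"email", "cloud_storage", "team_messaging", "file_sync"}
gaps = dlp_coverage_gaps(monitored, in_use)
assert gaps == {"team_messaging", "file_sync"}  # candidates for CASB coverage
```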

Scenario 5

An organization pseudonymizes customer data, and its DPO declares that the data is no longer personal data under GDPR. An auditor challenges this. Who is correct?

  1. A) Neither — GDPR does not distinguish between pseudonymized and anonymized data
  2. B) The DPO — as long as the re-identification key is stored in a different system
  3. C) The auditor — under GDPR, pseudonymized data remains personal data because re-identification is possible using the separately stored key. Only true anonymization (irreversible) removes GDPR classification
  4. D) The DPO — pseudonymized data cannot identify individuals
Answer & reasoning

Correct: C

GDPR explicitly states that pseudonymized data is still personal data (Recital 26). The existence of a re-identification capability, however securely stored, means GDPR obligations apply. Only true anonymization removes the data from GDPR scope.
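The distinction can be sketched in code. The keyed-HMAC construction is one common pseudonymization approach, and the field names are illustrative; the point is that a re-identification capability exists in one case and not the other.

```python
# Pseudonymization keeps a re-identification capability (data stays personal
# data under GDPR); anonymization destroys the link irreversibly.
import hashlib
import hmac
import secrets

REID_KEY = secrets.token_bytes(32)  # stored separately -- but it exists

def pseudonymize(name: str) -> str:
    """Keyed HMAC pseudonym: whoever holds REID_KEY can recompute the
    mapping and link pseudonyms back to known individuals."""
    return hmac.new(REID_KEY, name.encode(), hashlib.sha256).hexdigest()[:12]

def anonymize(records: list) -> list:
    """Drop identifiers entirely; no key anywhere can restore them."""
    return [{k: v for k, v in r.items() if k != "name"} for r in records]

records = [{"name": "Alice", "spend": 120}, {"name": "Bob", "spend": 80}]
pseudo = [{"pid": pseudonymize(r["name"]), "spend": r["spend"]} for r in records]
# The key holder can re-identify: recompute Alice's pseudonym and match it.
assert pseudo[0]["pid"] == pseudonymize("Alice")  # still personal data
anon = anonymize(records)
assert all("name" not in r for r in anon)         # link destroyed
```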

Scenario 6

A cloud application generates hundreds of DLP false positive alerts daily. The security team starts ignoring them. During this period, actual sensitive data transfers go undetected. What is the MOST effective improvement?

  1. A) Tune DLP policies: implement exact data matching for high-confidence detections, refine content inspection patterns, add context analysis, and create tiered alerting with automatic actions for high-confidence matches and human review for medium-confidence
  2. B) Accept the false positive rate as a cost of doing business
  3. C) Disable DLP and rely on encryption alone for data protection
  4. D) Hire additional analysts to process all alerts manually
Answer & reasoning

Correct: A

Alert fatigue is a critical DLP failure mode. The solution is tuning for precision: exact data matching provides near-zero false positives, context analysis improves accuracy, and tiered alerting ensures critical alerts surface while lower-confidence alerts are handled appropriately.
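The tiering described above can be sketched as follows; the PAN regex, the hash-based exact-match index, and the action names are all simplified illustrations, not a production DLP policy.

```python
# Tiered DLP alerting: exact data matching gives high-confidence hits that
# trigger automatic blocking; pattern-only hits go to human review.
import hashlib
import re

# Hashes of known sensitive records form the exact-data-matching index.
KNOWN = {hashlib.sha256(s.encode()).hexdigest() for s in ["4111111111111111"]}
CARD_PATTERN = re.compile(r"\b\d{16}\b")  # naive PAN regex, for illustration

def classify(message: str) -> str:
    for candidate in CARD_PATTERN.findall(message):
        if hashlib.sha256(candidate.encode()).hexdigest() in KNOWN:
            return "block"   # exact match: near-zero false positives, auto-act
        return "review"      # pattern match only: medium confidence, human review
    return "allow"           # no match: no alert, no analyst time wasted

assert classify("card 4111111111111111 attached") == "block"
assert classify("ref 1234567812345678 invoice") == "review"
assert classify("see you at noon") == "allow"
```

Only the middle tier consumes analyst attention, which is what breaks the alert-fatigue cycle.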

Scenario 7

Users upload encrypted ZIP files to personal cloud storage, bypassing DLP content inspection. The organization's DLP monitors network traffic for sensitive content. Why does this bypass succeed?

  1. A) DLP content inspection cannot see inside encrypted files — the encrypted ZIP appears as a binary blob, hiding its contents. Controls must operate before encryption (endpoint DLP) or policies must block encrypted uploads to unapproved destinations
  2. B) The personal cloud storage service blocks DLP inspection
  3. C) Network DLP only monitors downloads, not uploads
  4. D) DLP should be able to inspect encrypted file contents
Answer & reasoning

Correct: A

Encryption is a DLP blind spot. If content is encrypted before DLP inspection, the DLP system sees only ciphertext. Endpoint DLP (inspecting before encryption) or policy-based controls (blocking encrypted uploads to unapproved services) address this gap.
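A short demonstration of the blind spot: a content regex that catches a card number in plaintext finds nothing once the payload is compressed and encrypted. The XOR step is a toy stand-in for the real ZIP encryption in the scenario.

```python
# What network DLP sees before and after client-side encryption.
import re
import zlib

PAN = re.compile(rb"\b\d{16}\b")

def network_dlp_sees_pan(payload: bytes) -> bool:
    """Content inspection: pattern-match the bytes on the wire."""
    return PAN.search(payload) is not None

plaintext = b"wire transfer for card 4111111111111111"
encrypted = bytes(b ^ 0x5A for b in zlib.compress(plaintext))  # opaque blob

assert network_dlp_sees_pan(plaintext)      # endpoint DLP, pre-encryption: hit
assert not network_dlp_sees_pan(encrypted)  # network DLP sees only ciphertext
```

Inspecting `plaintext` is what endpoint DLP does; once only `encrypted` crosses the network, the remaining control is policy (block encrypted uploads to unapproved destinations), not content inspection.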

Scenario 8

An organization discovers sensitive data in 47 cloud storage locations but was only aware of 12. Automated classification tools achieve 95% accuracy. The security architect focuses on reducing the 5% false positive rate. What is the GREATER concern?

  1. A) False negatives within the 5% — sensitive data misclassified as non-sensitive receives insufficient protection, creating compliance violations and breach exposure. False negatives are a greater risk than false positives
  2. B) Both false positives and negatives are equally concerning
  3. C) The 95% accuracy rate is insufficient for any deployment
  4. D) False positives — they waste security team resources
Answer & reasoning

Correct: A

False negatives mean sensitive data is treated as non-sensitive, receiving inadequate protection. This creates direct security and compliance gaps. False positives waste resources but do not create security vulnerabilities. Tuning should prioritize reducing false negatives.
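The asymmetry shows up in the metrics: recall (missed sensitive data) drives breach exposure, while precision only drives analyst workload. The counts below are hypothetical, chosen to match the scenario's 95% figure.

```python
def precision(tp: int, fp: int) -> float:
    """Of everything flagged sensitive, how much really was."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Of everything actually sensitive, how much was caught."""
    return tp / (tp + fn)

# Hypothetical scan of 1,000 sensitive objects across the 47 locations:
tp, fp, fn = 950, 30, 50   # "95% accuracy" can still hide 50 misses
assert recall(tp, fn) == 0.95              # 50 sensitive objects unprotected
assert precision(tp, fp) > recall(tp, fn)  # tuning should target the misses
```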

Scenario 9

A data label of 'Restricted' is applied to a cloud storage object. When the object is copied to a different cloud service, the label is not preserved. DLP policies in the destination service do not flag the object. What happened?

  1. A) DLP policies should work independently of labels
  2. B) The copy operation automatically changes the classification
  3. C) The destination service does not support data labels
  4. D) Label non-persistence during data movement means the copied data loses its classification enforcement. DLP and access controls in the destination that rely on labels treat the unlabeled copy as unclassified, potentially applying insufficient protection
Answer & reasoning

Correct: D

Labels make classification enforceable by automated systems. If labels are not preserved during data movement, classification-based controls cannot function on the copy. This is a common gap that must be addressed through label persistence mechanisms or compensating controls.
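The failure mode can be sketched as follows; the object and metadata shapes are hypothetical, standing in for whatever tagging mechanism each cloud service uses.

```python
# Label non-persistence: a naive copy drops classification metadata, so
# label-driven controls in the destination see an unclassified object.
def naive_copy(obj: dict) -> dict:
    """Copies content only -- the common failure mode across services."""
    return {"content": obj["content"], "labels": {}}

def label_preserving_copy(obj: dict) -> dict:
    """Carries classification metadata along with the data."""
    return {"content": obj["content"], "labels": dict(obj["labels"])}

def dlp_flags(obj: dict) -> bool:
    """Label-driven control: acts only on objects labeled Restricted."""
    return obj["labels"].get("classification") == "Restricted"

source = {"content": "Q3 forecast", "labels": {"classification": "Restricted"}}
assert dlp_flags(source)
assert not dlp_flags(naive_copy(source))         # the gap in the scenario
assert dlp_flags(label_preserving_copy(source))  # the required behavior
```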

Scenario 10

A client-side encryption implementation encrypts all data before uploading to cloud storage. The organization later wants to use the cloud provider's search and analytics services on this data. They cannot. Why?

  1. A) Client-side encrypted data is ciphertext from the provider's perspective — cloud-native services like search, indexing, and analytics require plaintext access to function. This is the fundamental tradeoff between maximum security and cloud-native functionality
  2. B) The organization needs to share their encryption keys with the provider
  3. C) The encryption algorithm blocks cloud service access
  4. D) Client-side encryption is incompatible with cloud storage services
Answer & reasoning

Correct: A

Client-side encryption means the CSP only sees ciphertext and cannot perform operations that require understanding the data content. This is the inherent tradeoff: maximum security via client-side encryption versus maximum functionality via server-side encryption.
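The tradeoff fits in a few lines: the provider can search plaintext it can read, but a client-side-encrypted upload is an opaque blob to it. The XOR is again a toy stand-in for real client-side encryption.

```python
def provider_search(stored: bytes, term: bytes) -> bool:
    """What a cloud search service does: match against the bytes it stores."""
    return term in stored

def client_side_encrypt(data: bytes, key: int) -> bytes:
    """Toy cipher: the provider never sees the key or the plaintext."""
    return bytes(b ^ key for b in data)

doc = b"quarterly revenue report"
assert provider_search(doc, b"revenue")                   # server-side: works
assert not provider_search(client_side_encrypt(doc, 0x5A), b"revenue")  # opaque
```

Indexing, analytics, and format conversion fail for the same reason the search fails: every provider-side operation needs plaintext.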

Up Next – Module 22: Information Rights Management (IRM)