CCSP Domain 2: Cloud Data Security

Section A Review: Data Fundamentals (10 scenarios)
This section review tests your ability to apply concepts from the preceding modules to realistic exam scenarios. Work through each question, commit to an answer, then reveal the reasoning. Focus on understanding WHY the correct answer is right and why the distractors are wrong.

Scenario 1

A SaaS provider's terms of service claim ownership of all data uploaded to their platform. The customer's legal team discovers this during a contract review. What is the fundamental issue?

  A) SaaS providers typically own all uploaded data per industry standard
  B) Data ownership belongs to the data creator/collector (the customer). Terms claiming provider ownership conflict with the fundamental principle that data ownership does not transfer when data is processed in the cloud
  C) The terms only apply to metadata, not actual customer data
  D) Ownership terms are unenforceable in cloud contracts
Answer & reasoning

Correct: B

Data ownership does not transfer when data moves to the cloud. Terms of service that claim provider ownership contradict this principle and should be renegotiated before signing.

Scenario 2

A cloud provider replicates customer data across three regions for availability. The customer requests data deletion and receives confirmation that the primary region data is deleted. What remains unaddressed?

  A) The other regions will auto-delete through eventual consistency
  B) Data copies in the other two regions, plus backup snapshots, CDN caches, and log files across all regions remain unaddressed. Complete deletion requires confirming all copies are removed or rendered unrecoverable
  C) Only regulated data needs deletion from backup copies
  D) Nothing — primary deletion is sufficient for compliance
Answer & reasoning

Correct: B

Data dispersion means copies exist across multiple locations. Deletion verification must cover all regions, replicas, backups, caches, and logs. Crypto-shredding addresses this by rendering all copies unrecoverable.
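The crypto-shredding idea can be sketched in a few lines: if every replica stores only ciphertext, destroying the single encryption key renders all copies unrecoverable at once, no matter how many regions hold them. The toy XOR-keystream cipher below is for illustration only; real systems use authenticated encryption (e.g. AES-GCM) with keys held in a KMS.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only --
    # production systems use AES-GCM or similar via a KMS.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying it twice recovers the plaintext
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)        # the one key, held by the customer
record = b"patient-123: MRI study"
ciphertext = xor(record, key)        # only ciphertext is ever replicated

replicas = [ciphertext] * 3          # copies in three regions, plus backups
assert all(xor(c, key) == record for c in replicas)

key = None  # crypto-shredding: destroy the key
# Every replica, backup, and cache copy is now unrecoverable --
# no per-region deletion or verification sweep required.
```

The design point: deletion assurance moves from "find and erase every copy" (hard to verify) to "destroy one key" (easy to verify).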

Scenario 3

A company migrates 10TB of medical imaging to cloud storage. They select high-performance block storage for the archive. An architect suggests object storage with an archive tier instead. Who is correct?

  A) The original choice — block storage ensures the fastest retrieval
  B) Both are equivalent for this use case
  C) Neither — file storage with NFS is most appropriate for medical images
  D) The architect — medical imaging archives are write-once, rarely accessed, unstructured data that is ideally suited for object storage with archive/cold tier, providing massive cost savings over block storage
Answer & reasoning

Correct: D

Medical image archives are large, unstructured, write-once, rarely accessed files — the ideal use case for object storage with archive tier. Block storage's performance advantage is wasted on archival data and costs significantly more.
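The cost gap is easy to see with back-of-the-envelope arithmetic. The per-TB prices below are hypothetical round numbers chosen for illustration, not any provider's actual rates; real block-versus-archive price gaps are commonly one to two orders of magnitude.

```python
# Illustrative monthly cost for a 10 TB archive.
# Prices are assumed round numbers, NOT real provider pricing.
ARCHIVE_TB = 10
BLOCK_PER_TB = 100.0   # premium block storage, assumed $/TB-month
COLD_PER_TB = 1.0      # object-storage archive tier, assumed $/TB-month

block_monthly = ARCHIVE_TB * BLOCK_PER_TB   # 1000.0
cold_monthly = ARCHIVE_TB * COLD_PER_TB     # 10.0
ratio = block_monthly / cold_monthly        # 100x at these assumed rates
```

At these assumed rates the block-storage choice pays a 100x premium for IOPS the archive workload never uses.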

Scenario 4

During data flow mapping, a security team discovers that internal microservices communicate over unencrypted HTTP within a cloud VPC. The team lead argues encryption is unnecessary inside the VPC. A breach through a compromised microservice later exposes inter-service traffic. What lesson does this illustrate?

  A) Only traffic crossing VPC boundaries needs encryption
  B) Internal cloud traffic must be encrypted because VPCs share physical infrastructure. A compromised service can capture unencrypted traffic from other services within the same VPC. Defense in depth requires encryption everywhere
  C) The breach was caused by the compromised microservice, not the lack of encryption
  D) VPCs are inherently insecure and should not be used
Answer & reasoning

Correct: B

Unencrypted internal traffic means any compromised service can observe communication between other services. Encrypting all traffic, including internal, limits the blast radius of a compromise.
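In practice "encrypt everywhere" means every inter-service call rides TLS, even inside the VPC. A minimal sketch of a hardened client-side TLS context in Python's standard `ssl` module follows; many deployments instead delegate this to a service mesh that enforces mutual TLS transparently.

```python
import ssl

# Baseline TLS context for service-to-service calls inside the VPC.
# A sketch: service meshes typically enforce mutual TLS for you.
ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
ctx.check_hostname = True                     # verify the peer's identity
ctx.verify_mode = ssl.CERT_REQUIRED           # never accept unauthenticated peers
```

With this context, a compromised neighbor service capturing packets sees only TLS ciphertext, which is exactly the blast-radius limit the reasoning above describes.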

Scenario 5

A GDPR data controller uses a cloud processor. The processor subcontracts to a sub-processor in a country without an EU adequacy decision. No Standard Contractual Clauses are in place. What is the compliance status?

  A) Compliant — the processor handles all GDPR obligations
  B) Non-compliant — personal data is being transferred to a third country without a legal transfer mechanism. The controller is accountable for ensuring all processors and sub-processors have appropriate safeguards
  C) Non-compliant — but only the sub-processor faces regulatory consequences
  D) Compliant — sub-processors are not covered by GDPR transfer restrictions
Answer & reasoning

Correct: B

The controller is accountable for the entire processing chain. Sub-processors in third countries without adequacy decisions require legal transfer mechanisms (SCCs). The controller must ensure these are in place throughout the chain.

Scenario 6

API credentials for a cloud storage service are found in a public GitHub repository. The credentials were committed 3 hours ago and the commit has since been deleted. What is the immediate response?

  A) Immediately rotate the compromised credentials. Automated scrapers continuously monitor public repositories and may have captured the credentials within seconds of the commit. Git history deletion does not guarantee the credentials were not harvested
  B) No action needed — the commit was deleted
  C) Monitor the cloud account for unusual activity before taking action
  D) Contact GitHub to permanently purge the commit from their servers
Answer & reasoning

Correct: A

Automated credential-scraping tools can capture exposed secrets within seconds. Deleting the commit does not eliminate the risk. Immediate credential rotation is the only safe response, followed by auditing for any unauthorized access during the exposure window.
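The scraping threat works because access keys follow recognizable formats that trivial regexes can match at scale; the same idea powers the pre-commit hooks that prevent the leak in the first place. The patterns below are illustrative (the `AKIA` prefix is a widely documented AWS access-key-ID format; the generic `api_key` pattern is an assumption, not a vendor standard).

```python
import re

# Patterns resembling common credential formats (illustrative, not exhaustive)
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS-style access key ID
    re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9/+=]{20,}['\"]"),
]

def find_secrets(text: str) -> list[str]:
    """Return all substrings of `text` that match a known secret pattern."""
    hits = []
    for pat in SECRET_PATTERNS:
        hits.extend(pat.findall(text))
    return hits

# A hypothetical committed line -- a scraper would flag it within seconds:
diff = 'client = connect(api_key="Zm9vYmFyYmF6cXV4MTIzNDU2Nzg=")'
assert find_secrets(diff)
```

Running this as a pre-commit hook is a preventive control; the post-exposure response is still immediate rotation, because you must assume an attacker's scanner ran first.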

Scenario 7

A company classifies data into four levels but stores all levels in the same cloud storage bucket with the same access controls and encryption settings. What is the fundamental problem?

  A) Classification without differentiated controls is meaningless. If all data receives the same protection regardless of classification, there is no practical benefit from classifying. Controls must be tiered to match classification levels
  B) Uniform storage simplifies management and reduces costs
  C) The four-level classification scheme is too complex
  D) Cloud storage does not support differentiated access controls
Answer & reasoning

Correct: A

Classification's purpose is to drive appropriate controls. Storing all data with identical protections means classification has no operational effect. Restricted data receiving the same controls as public data is under-protected.
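"Classification drives controls" can be made concrete as a lookup from level to control set, so that storage provisioning code cannot skip the tiering. The four levels and their settings below are hypothetical labels for illustration, not a prescribed scheme.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ControlSet:
    encryption: str   # who manages the keys
    access: str       # who may read the data
    logging: bool     # whether access is audited

# Hypothetical tiering: controls strengthen with each classification level
CONTROLS = {
    "public":       ControlSet("provider-managed",       "anyone",            False),
    "internal":     ControlSet("provider-managed",       "all-employees",     True),
    "confidential": ControlSet("customer-managed",       "need-to-know",      True),
    "restricted":   ControlSet("customer-managed + HSM", "named-individuals", True),
}

def controls_for(level: str) -> ControlSet:
    # Unknown level raises KeyError loudly instead of defaulting to weak controls
    return CONTROLS[level]
```

The failure mode in the scenario is equivalent to every level mapping to the same `ControlSet`: the labels exist, but nothing downstream consumes them.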

Scenario 8

A cloud architect discovers that the CDN caching their web application is storing personalized content containing user preferences and account details in edge nodes across 40 countries. This was not in the original data flow assessment. What is the impact?

  A) Edge nodes automatically comply with local data protection laws
  B) The data flow assessment is incomplete. Personalized content in CDN edge nodes creates data residency exposure in 40 countries, potentially violating privacy regulations. The CDN caching policy must be reviewed and the data map updated
  C) CDNs are exempt from data residency requirements
  D) Only the origin server data is subject to privacy regulations
Answer & reasoning

Correct: B

CDN-cached personalized data creates data residency exposure in every country with edge nodes. This was a data flow mapping gap. The organization must review what is cached, update its data map, and ensure compliance with all applicable jurisdictions.
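One standard remediation is to mark personalized responses as uncacheable via HTTP `Cache-Control` directives (`private` bars shared caches such as CDNs; `no-store` bars storage entirely). A minimal sketch of the origin-side decision, with a hypothetical helper name:

```python
def response_headers(personalized: bool) -> dict:
    """Hypothetical helper: choose cache headers per response type."""
    if personalized:
        # 'private' excludes shared caches (CDN edges); 'no-store'
        # forbids retaining the response at all
        return {"Cache-Control": "private, no-store"}
    # Static assets may safely sit at the edge for an hour
    return {"Cache-Control": "public, max-age=3600"}

assert "no-store" in response_headers(True)["Cache-Control"]
```

This keeps user preferences and account details out of the 40-country edge footprint while static content still benefits from edge caching.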

Scenario 9

An organization discovers that cloud storage buckets have been publicly accessible for 8 months. The buckets were created by a developer for a short-term project and forgotten. Three of the buckets contain customer PII. What controls should prevent this?

  A) Preventive controls: organization-wide policies blocking public access by default. Detective controls: CSPM tools scanning for misconfigurations. Governance controls: resource tagging requirements and regular access reviews
  B) Encrypting all bucket contents, which makes public access irrelevant
  C) Requiring developers to use private buckets manually
  D) Better developer training on security best practices
Answer & reasoning

Correct: A

Multiple control layers are needed: preventive (block public access by default), detective (automated configuration scanning), and governance (resource tagging and reviews). No single control is sufficient, and relying on developer behavior is unreliable.
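The detective layer is essentially a scheduled scan of the resource inventory against policy. The sketch below runs such a check over a toy inventory standing in for a cloud API listing; the bucket records and rules are illustrative, not any CSPM product's actual schema.

```python
# Toy inventory standing in for a cloud storage API listing (illustrative)
buckets = [
    {"name": "prod-invoices", "public": False, "tags": {"owner": "billing"}},
    {"name": "tmp-demo-data", "public": True,  "tags": {}},
]

def audit(buckets: list[dict]) -> list[tuple[str, str]]:
    """Detective control: flag public or ownerless buckets for review."""
    findings = []
    for b in buckets:
        if b["public"]:
            findings.append((b["name"], "publicly accessible"))
        if not b["tags"].get("owner"):
            findings.append((b["name"], "missing owner tag"))
    return findings

# The forgotten demo bucket is caught on the first scan, not after 8 months
assert audit(buckets) == [("tmp-demo-data", "publicly accessible"),
                          ("tmp-demo-data", "missing owner tag")]
```

The owner-tag rule is what catches "created for a short-term project and forgotten": an ownerless resource has nobody to answer an access review, so it gets escalated or removed.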

Scenario 10

A company uses provider-managed encryption for cloud storage. A security assessor warns about provider insider threats. The company argues that the provider's background checks are sufficient. What is the gap?

  A) Adding network encryption addresses the insider threat
  B) Background checks adequately address insider threats for cloud providers
  C) Provider-managed keys mean provider employees can theoretically access decryption capabilities. Customer-managed or customer-provided keys ensure only the customer can decrypt data, providing technical protection against provider insider threats rather than relying solely on personnel controls
  D) Provider insider threats are theoretical and do not warrant additional controls
Answer & reasoning

Correct: C

Background checks are personnel controls that reduce but do not eliminate insider threat risk. Customer-managed keys provide a technical control: even if a provider insider accesses the storage, they cannot decrypt the data without the customer's keys.
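The key-custody argument can be shown by separating what the provider holds from what the customer holds. In the toy sketch below (XOR-with-hash stands in for real AES encryption), the provider side contains only ciphertext, so full storage access yields nothing without the customer's key.

```python
import hashlib
import secrets

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy cipher for illustration only; real systems use AES via a KMS/HSM
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

customer_key = secrets.token_bytes(32)  # never leaves the customer's KMS

# Everything the provider -- including any insider -- can see:
provider_side = {"object": toy_encrypt(b"salary data", customer_key)}

# Full storage access still yields only ciphertext:
assert provider_side["object"] != b"salary data"
# Only the key holder (the customer) can recover the plaintext:
assert toy_encrypt(provider_side["object"], customer_key) == b"salary data"
```

This is the technical-versus-personnel distinction: background checks lower the probability an insider tries, while customer-held keys remove the capability even if one does.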

Up Next: Module 17 – Encryption and Key Management