Domain 2: Asset Security Module 16 of 84

Data Lifecycle Management

CISSP Domain 2 — Asset Security B — Data Lifecycle and Controls 10–12 minutes

The Question Behind the Question

Most organizations can tell you what data they have. Fewer can tell you where it goes. Almost none can tell you who is accountable for a specific dataset at each stage of its existence.

That gap is what this module addresses. The CISSP exam tests whether you understand that data governance is not a one-time classification exercise — it is an ongoing accountability chain that follows data from the moment it is created until it is permanently destroyed. Every stage introduces different risks, and every stage demands different controls.

If you remember one thing: the exam cares about who makes decisions about data, not who implements those decisions technically.


The Six Lifecycle Stages

Data moves through six stages. Each stage changes the risk profile and the controls that apply:

Create — Data comes into existence through user input, system generation, sensor collection, or acquisition from external sources. At creation, classification must occur. This is the point where the data owner assigns a label and determines handling requirements. If classification does not happen at creation, it rarely happens later — and unclassified data is unprotected data.
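The "classify at creation" rule can be enforced programmatically. Below is a minimal sketch in Python: the record type, label set, and field names are all hypothetical illustrations, not part of any standard, but they show the principle that a record without a valid label should never come into existence.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical label set; real classification schemes vary by organization.
VALID_LABELS = {"Public", "Internal", "Confidential", "Restricted"}

@dataclass
class Record:
    """A data record that cannot be created without a classification label."""
    content: str
    classification: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self):
        # Enforce labeling at the Create stage: unclassified data is rejected
        # here, rather than discovered unprotected later in the lifecycle.
        if self.classification not in VALID_LABELS:
            raise ValueError(f"unknown classification: {self.classification!r}")

r = Record("Q3 revenue forecast", "Confidential")
print(r.classification)  # Confidential

try:
    Record("untagged note", "")
except ValueError as exc:
    print("rejected:", exc)
```

Rejecting the unlabeled record at construction time is the point: classification becomes a precondition of storage, not an afterthought.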

Store — Data is committed to a storage medium: database, file system, cloud object store, physical media. Storage introduces persistence risk. Data at rest needs encryption, access controls, and integrity verification. The storage location also determines jurisdictional applicability — where data is stored affects which regulations apply.

Use — Data is actively processed, viewed, or manipulated by users or systems. This is the stage where data is most exposed because it must be in a readable state. Controls include access restrictions, audit logging, DLP monitoring, and screen capture prevention for sensitive data.

Share — Data is transmitted to other users, systems, or organizations. Sharing introduces transmission risk and third-party risk simultaneously. Controls include encryption in transit, access authorization, data loss prevention rules, and contractual obligations for recipients.

Archive — Data moves to long-term storage because it is no longer actively used but must be retained for business, legal, or regulatory reasons. Archived data still requires protection proportional to its classification. Common mistakes: archiving without encryption, losing track of archived data locations, or failing to apply retention schedules to archives.

Destroy — Data is permanently eliminated. Destruction must be verifiable and irreversible. The method depends on the media type and classification level: cryptographic erasure for encrypted storage, degaussing for magnetic media, physical destruction for highest classifications. Destruction must be documented with certificates of destruction for regulated data.
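Cryptographic erasure deserves a concrete illustration: if all copies of a dataset are encrypted under one key, destroying the key renders every copy permanently unreadable, wherever it resides. The sketch below uses a toy SHA-256 counter-mode keystream purely for self-containment; production systems use vetted ciphers such as AES, and the function and store names here are illustrative assumptions.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 over key + counter. Illustrative only;
    # real crypto-erasure relies on standard ciphers (e.g., AES-XTS/GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def _xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

key_store = {}  # dataset id -> key material (hypothetical key manager)

def encrypt(dataset_id: str, plaintext: bytes) -> bytes:
    key = secrets.token_bytes(32)
    key_store[dataset_id] = key
    return _xor(plaintext, keystream(key, len(plaintext)))

def decrypt(dataset_id: str, ciphertext: bytes) -> bytes:
    key = key_store[dataset_id]  # raises KeyError after crypto-erasure
    return _xor(ciphertext, keystream(key, len(ciphertext)))

def crypto_erase(dataset_id: str) -> None:
    # Destroying the key destroys access to every ciphertext copy at once,
    # including backups and archives the organization cannot physically reach.
    del key_store[dataset_id]

ct = encrypt("patient-archive", b"sensitive record")
assert decrypt("patient-archive", ct) == b"sensitive record"
crypto_erase("patient-archive")
# decrypt("patient-archive", ct) now fails: the data is unrecoverable.
```

This is why cryptographic erasure suits cloud and distributed storage, where physically destroying every medium holding a copy is impractical.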


Data States: At Rest, In Transit, In Use

Independent of its lifecycle stage, data always exists in one of three states. These states determine which technical controls apply:

Data at rest is stored data — on disk, in a database, on tape. The primary threats are unauthorized access and theft of the storage medium. Controls: encryption (AES-256, full-disk encryption, database-level encryption), access controls, physical security of storage media.

Data in transit is data moving between systems — across a network, between data centers, to a cloud provider. The primary threats are interception and modification. Controls: TLS/SSL, VPN tunnels, IPsec, message-level encryption for sensitive payloads.

Data in use is data being actively processed in memory or CPU. This is the hardest state to protect because the data must be decrypted for processing. Emerging controls include confidential computing, secure enclaves (Intel SGX, AMD SEV), and homomorphic encryption for specific use cases. For the exam, know that data in use is the most difficult state to protect and that traditional encryption alone does not cover it.

A common exam pattern: a question describes data moving between states and asks which control applies. Match the state to the control category. Encryption at rest does not protect data in transit. TLS does not protect data at rest. Neither protects data in use.
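That matching exercise can be captured as a simple lookup. The control catalog below is a hypothetical condensation of the controls listed above, keyed by state; the point it demonstrates is that each state has its own control category and no entry covers another state.

```python
# Hypothetical control catalog keyed by data state, mirroring the exam
# pattern: identify the state first, then choose from that state's controls.
CONTROLS_BY_STATE = {
    "at_rest":    ["AES-256 full-disk encryption", "database-level encryption",
                   "access controls", "physical media security"],
    "in_transit": ["TLS", "IPsec", "VPN tunnels", "message-level encryption"],
    "in_use":     ["secure enclaves (Intel SGX, AMD SEV)",
                   "confidential computing", "homomorphic encryption"],
}

def controls_for(state: str) -> list:
    """Return the control category for a data state, or fail loudly."""
    try:
        return CONTROLS_BY_STATE[state]
    except KeyError:
        raise ValueError(f"unknown data state: {state!r}")

print(controls_for("in_transit"))
# TLS appears only under in_transit: it does not protect data at rest,
# and nothing in the at-rest or in-transit lists protects data in use.
assert "TLS" not in controls_for("at_rest")
```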


Data Ownership Roles

CISSP distinguishes five data-related roles. Confusing these roles is one of the most common exam mistakes:

Data Owner — A senior business leader (not IT) who is accountable for the data. The owner determines classification level, approves access, defines retention requirements, and accepts risk. The data owner does not implement controls — they authorize and direct. On the exam, when a question asks "who decides the classification?" the answer is the data owner.

Data Custodian — Typically IT operations. The custodian implements and maintains the technical controls that the data owner mandates: backups, encryption, access control enforcement, storage management. The custodian is responsible for the how, not the what.

Data Steward — Focuses on data quality, metadata, and ensuring data is used according to policy. Stewards define data standards, maintain data dictionaries, and ensure consistency across systems. Think of the steward as the quality and compliance layer between the owner and the custodian.

Data Processor — An entity (often third-party) that processes data on behalf of the data controller. Under GDPR, the processor acts only on the controller's instructions. A cloud provider hosting your customer database is a processor. They do not decide what happens to the data — they execute your instructions.

Data Controller — The entity that determines the purposes and means of data processing. Under GDPR, the controller decides why data is collected and what is done with it. Your organization is the controller for the customer data it collects, even if a third-party processor stores it.

The distinction that matters most on the exam: owners decide, custodians implement, controllers direct, processors execute.
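The chain can be expressed as a lookup from decision to accountable role, which is exactly how these exam questions resolve. The task names below are illustrative phrasings, not official terminology.

```python
# Hypothetical task-to-role map following the ownership chain:
# owners decide, custodians implement, controllers direct, processors execute.
RESPONSIBILITY = {
    "set classification level":        "data owner",
    "approve access requests":         "data owner",
    "accept residual risk":            "data owner",
    "run backups":                     "data custodian",
    "enforce encryption settings":     "data custodian",
    "maintain data dictionary":        "data steward",
    "determine purpose of processing": "data controller",
    "process on controller's orders":  "data processor",
}

def who_is_accountable(task: str) -> str:
    # Unmapped tasks are a governance gap, not a reason to guess.
    return RESPONSIBILITY.get(task, "unmapped: escalate to governance")

print(who_is_accountable("set classification level"))  # data owner
print(who_is_accountable("run backups"))               # data custodian
```

Note that every decision-type task resolves to a business role, never to IT; the custodian entries are all implementation tasks.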


Data Mapping and Flow Documentation

You cannot protect data you cannot find. Data mapping documents where data originates, where it flows, where it is stored, who accesses it, and how it leaves the organization.

Data flow documentation answers four questions:

  1. Where does the data originate? — User input, system integration, third-party feed, sensor collection
  2. Where does it move? — Between internal systems, to cloud services, to third parties, across borders
  3. Where does it rest? — Primary databases, backup systems, archives, endpoint devices, SaaS platforms
  4. Where does it leave? — Reports, API calls, email, file transfers, third-party sharing

Data mapping becomes especially important for privacy regulations. GDPR requires organizations to demonstrate they know what personal data they hold, where it is processed, and who it is shared with. Without a current data map, compliance is impossible to prove.

For the exam, data mapping is a governance control that enables all other data protection controls. You cannot classify data you have not inventoried. You cannot apply retention policies to data you cannot locate. You cannot report breaches involving data you did not know existed.
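A data map is ultimately a structured inventory, and its completeness can be checked mechanically against the four flow questions. The dataset names and field names below are illustrative assumptions, not drawn from any regulation or standard.

```python
# A minimal data-map entry answers all four flow questions.
REQUIRED_FIELDS = {"origin", "movements", "storage_locations", "egress_points"}

data_map = {
    "customer_pii": {
        "origin": "web signup form",
        "movements": ["CRM -> analytics warehouse", "CRM -> email provider"],
        "storage_locations": ["crm-db (EU)", "warehouse (US)", "backups"],
        "egress_points": ["monthly marketing export", "support API"],
    },
    "payment_records": {
        "origin": "checkout service",
        "movements": ["checkout -> payment processor"],
        # storage_locations and egress_points undocumented:
        # this entry cannot support a regulatory processing-activity report.
    },
}

def gaps(entry: dict) -> set:
    """Return the flow questions a data-map entry fails to answer."""
    return REQUIRED_FIELDS - entry.keys()

for name, entry in data_map.items():
    missing = gaps(entry)
    if missing:
        print(f"{name}: incomplete map, missing {sorted(missing)}")
```

Running a check like this before a regulator asks is the difference between producing a records-of-processing report and the scenario in Question 3 below.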


Pattern Recognition

Data lifecycle questions on the CISSP follow these patterns:

  • "Who should classify the data?" → Data owner (a business leader, never IT)
  • "Who implements the backup schedule?" → Data custodian (IT operations)
  • "Data is being transmitted to a partner" → Sharing stage, data in transit — apply encryption in transit and contractual controls
  • "The organization doesn't know where its customer data resides" → Data mapping deficiency — you cannot protect what you cannot find
  • "A cloud provider stores data on the organization's behalf" → Processor/controller relationship, not ownership transfer

Trap Patterns

  • "The IT director should classify the data" — Classification is a business decision made by the data owner (a business leader), not an IT function. IT implements the controls that the classification requires.
  • "Encryption protects data in all three states" — Traditional encryption protects data at rest and in transit. Data in use must be decrypted for processing, making it the most difficult state to protect.
  • "Data archival eliminates the need for security controls" — Archived data retains its classification and requires protection proportional to its sensitivity. Archival changes the access pattern, not the security requirement.
  • "Transferring data to a cloud provider transfers ownership" — Moving data to a processor does not change who owns or controls the data. The controller retains accountability.

Scenario Practice


Question 1

A healthcare organization stores patient records in a cloud-hosted database. The cloud provider manages the infrastructure but does not make decisions about what data is collected or how it is used.

Under GDPR terminology, what role does the cloud provider fulfill?

A. Data owner
B. Data controller
C. Data processor
D. Data steward

Answer & reasoning

Correct: C

The cloud provider processes data on behalf of the healthcare organization (the controller) without determining the purposes or means of processing. The provider acts on the controller's instructions, which defines the processor role under GDPR.


Question 2

An organization discovers that sensitive financial data classified as "Confidential" was moved to an archive three years ago without encryption. The archive is on a network-accessible file server.

What is the FIRST action the security manager should take?

A. Delete the archived data immediately
B. Assess the exposure and apply encryption to the archived data based on its classification
C. Reclassify the data as "Public" since it is archived
D. Move the data back to the primary database

Answer & reasoning

Correct: B

Archived data retains its original classification and requires commensurate protection. The security manager must assess the scope of the exposure first, then apply the controls the classification demands. Deletion may violate retention requirements. Reclassification without business justification is improper. Moving data does not address the encryption gap.


Question 3

A multinational company cannot produce a complete inventory of where its customer personal data is stored across its 12 subsidiaries. A regulator has requested documentation of all personal data processing activities.

What capability gap does this reveal?

A. Insufficient encryption across subsidiaries
B. Lack of a data classification policy
C. Absence of data mapping and flow documentation
D. Missing data loss prevention technology

Answer & reasoning

Correct: C

The inability to produce a data inventory across subsidiaries indicates the organization has not mapped its data flows. Data mapping documents where data originates, moves, is stored, and who accesses it. Without it, the organization cannot satisfy regulatory requests for processing activity records.


Key Takeaway

Data does not sit still, and neither should your governance over it. Every stage of the lifecycle and every state of existence introduces different risks that demand different controls.

The ownership chain is the foundation: owners decide classification, custodians implement protection, controllers direct processing, processors execute instructions. When a CISSP question asks who is accountable, follow the chain — accountability always sits with the business, not with IT.

Next Module — Module 17: Asset Retention