How to Migrate Data to Data-Level Protection

This guide explains how to migrate structured data to Ubiq encryption, tokenization, or masking, covering bulk conversion, progressive encryption, and legacy platform transitions.

It covers two primary scenarios:

  1. Moving from unprotected cleartext data to Ubiq
  2. Migrating from an existing protection platform to Ubiq

In both cases, the objective is the same: sensitive fields become unreadable by default and are only revealed when an authorized identity is validated at runtime.

Migrating from Unprotected Data to Ubiq

Overview

When sensitive data is currently stored in cleartext, the migration challenge is straightforward but operationally sensitive. You must transition the dataset to a protected state without breaking applications, disrupting workflows, or introducing inconsistency.

There are two primary migration models:

  1. A scheduled bulk conversion
  2. A progressive, no-downtime conversion

Most production environments favor progressive conversion. Some teams use a hybrid approach depending on system criticality.

Preparation

Before applying protection, you should clearly define:

  • Which fields are sensitive
  • What protection method each field requires
  • Where protection will be enforced
  • How identity authorization will be handled

For each field, confirm format constraints, length requirements, and whether deterministic behavior is required for joins or lookups. These characteristics influence whether encryption, tokenization, or masking is appropriate.

You must also determine where Ubiq will be invoked. In many environments, this occurs at the application layer so data is protected at creation time. In others, especially where applications cannot be modified, the database layer is the preferred enforcement boundary.

Identity enforcement must also be defined up front. Ubiq can integrate with Entra ID, Okta, or other IAM systems. You should decide which identities are permitted to decrypt, which are encrypt-only, and which have no access to protected fields.

Finally, logging and observability should be enabled prior to migration. Encrypt and decrypt operations should be auditable from day one.

Scheduled Bulk Conversion

A bulk conversion protects all historical data in a controlled window. This model is best suited for systems that can tolerate a maintenance window and where immediate full protection is required.

In this approach, you deploy Ubiq integration first, configure protection policies, and then execute a conversion job that processes existing rows and updates sensitive fields in place. During this window, write operations are typically paused to avoid partial state inconsistencies.
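The conversion job described above can be sketched as follows. This is a minimal illustration, not Ubiq's actual API: `ubiq_protect` is a hypothetical stand-in for the real SDK encrypt call, and SQLite stands in for the production database.

```python
import sqlite3

def ubiq_protect(value: str) -> str:
    # Placeholder for the real Ubiq SDK call -- NOT actual encryption.
    return "ubiq:" + value[::-1]

def bulk_convert(conn, table, column, key_column, batch_size=500):
    """Protect every row of `column` in place, in controlled batches.
    Table and column names are trusted constants here, not user input."""
    rows = conn.execute(f"SELECT {key_column}, {column} FROM {table}").fetchall()
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        conn.executemany(
            f"UPDATE {table} SET {column} = ? WHERE {key_column} = ?",
            [(ubiq_protect(value), key) for key, value in batch],
        )
        conn.commit()  # commit per batch to bound transaction size

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, ssn TEXT)")
conn.executemany("INSERT INTO customers (ssn) VALUES (?)",
                 [("123-45-6789",), ("987-65-4321",)])
bulk_convert(conn, "customers", "ssn", "id")
```

Because writes are paused during the window, the job can safely read all rows once and update them batch by batch without worrying about concurrent modifications.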

This model provides a clean cutover. Once complete, the dataset is fully protected and normal operations resume under Ubiq enforcement.

The primary tradeoff is the operational coordination required for the maintenance window.

Progressive Conversion Without Downtime

In environments where downtime is unacceptable, a progressive model is more appropriate.

There are two common patterns.

The first is encrypt-on-write. In this model, all new and updated records are protected immediately. Existing records remain cleartext until they are modified. Over time, the dataset gradually transitions as records are touched.
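Encrypt-on-write can be sketched as below. A dictionary stands in for the datastore, and `ubiq_protect` is a hypothetical placeholder for the real Ubiq SDK call, not its actual interface.

```python
def ubiq_protect(value: str) -> str:
    # Placeholder for the real Ubiq SDK call -- NOT actual encryption.
    return "ubiq:" + value[::-1]

store = {}  # stands in for the database table

def save_record(record_id, ssn):
    # Every insert or update goes through protection. Rows that are
    # never written again remain cleartext until a background job
    # or a read-path conversion catches them.
    store[record_id] = ubiq_protect(ssn)

store[1] = "111-22-3333"        # pre-existing cleartext row
save_record(2, "444-55-6666")   # new write is protected immediately
save_record(1, "111-22-3333")   # legacy row converts when modified
```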

The second is encrypt-on-access. In this model, when a record is read, the system detects whether the value is already protected. If it is not, the system protects it and writes it back before returning the cleartext to the authorized identity. This model converges more quickly than encrypt-on-write without requiring a maintenance window.
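The encrypt-on-access read path can be sketched like this, again with hypothetical `ubiq_protect`/`ubiq_reveal` placeholders rather than the real SDK calls:

```python
PREFIX = "ubiq:"

def ubiq_protect(value):            # placeholder, NOT real encryption
    return PREFIX + value[::-1]

def ubiq_reveal(value):             # placeholder, NOT real decryption
    return value[len(PREFIX):][::-1]

store = {1: "111-22-3333"}          # cleartext legacy row

def read_record(record_id):
    value = store[record_id]
    if not value.startswith(PREFIX):
        # Cleartext detected: protect it and write it back before
        # returning the value to the authorized caller.
        store[record_id] = ubiq_protect(value)
        return value
    return ubiq_reveal(value)       # normal protected read

first = read_record(1)   # converts the row as a side effect
second = read_record(1)  # now served from the protected value
```

Because hot records are read far more often than they are updated, this path converts them early, which is why convergence is faster than encrypt-on-write alone.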

In either progressive approach, you must reliably distinguish between cleartext and protected values. This is typically done using deterministic markers such as prefixes or metadata columns that indicate protection state. Avoid relying solely on decrypt-failure detection unless it is tightly controlled and well understood.
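A metadata-column marker can be sketched as follows; the `ssn_protected` flag and the trivial `ubiq_protect` placeholder are illustrative assumptions, not Ubiq specifics. The flag makes conversion idempotent without inspecting the value itself:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A dedicated state column records protection state deterministically,
# instead of guessing from the shape of the value.
conn.execute("""CREATE TABLE customers (
    id INTEGER PRIMARY KEY,
    ssn TEXT,
    ssn_protected INTEGER NOT NULL DEFAULT 0)""")
conn.execute("INSERT INTO customers (ssn) VALUES ('123-45-6789')")

def ubiq_protect(value):  # placeholder for the real SDK call
    return value[::-1]

def protect_row(conn, row_id):
    ssn, flag = conn.execute(
        "SELECT ssn, ssn_protected FROM customers WHERE id = ?",
        (row_id,)).fetchone()
    if flag == 0:  # deterministic check: only convert cleartext rows
        conn.execute(
            "UPDATE customers SET ssn = ?, ssn_protected = 1 WHERE id = ?",
            (ubiq_protect(ssn), row_id))
        conn.commit()

protect_row(conn, 1)
protect_row(conn, 1)  # idempotent: second call sees the flag and skips
```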

Many teams add a background conversion process that scans for remaining unprotected rows and converts them in controlled batches. This ensures predictable convergence while maintaining full system availability.
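A background converter of this kind might look like the sketch below, using a prefix marker to find remaining cleartext rows and a small batch limit to bound each pass; in production, each pass would typically be throttled or scheduled rather than run in a tight loop. All names here are illustrative placeholders.

```python
import sqlite3

def ubiq_protect(value):  # placeholder, NOT real encryption
    return "ubiq:" + value[::-1]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, pan TEXT)")
conn.executemany("INSERT INTO t (pan) VALUES (?)",
                 [(f"40000000000000{i:02d}",) for i in range(5)])

def convert_batch(conn, limit=2):
    """Convert up to `limit` unprotected rows; return how many were done."""
    rows = conn.execute(
        "SELECT id, pan FROM t WHERE pan NOT LIKE 'ubiq:%' LIMIT ?",
        (limit,)).fetchall()
    conn.executemany("UPDATE t SET pan = ? WHERE id = ?",
                     [(ubiq_protect(pan), row_id) for row_id, pan in rows])
    conn.commit()
    return len(rows)

batches = 0
while convert_batch(conn) > 0:  # stop once no unprotected rows remain
    batches += 1
```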

Validation

After migration begins, validate both functionality and enforcement.

Authorized identities must be able to decrypt. Unauthorized identities must not. Applications should function normally. Reporting, exports, analytics, and downstream systems must be tested for compatibility. Logging should clearly reflect encrypt and decrypt events.
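The enforcement checks above can be expressed as simple assertions. This sketch uses an in-process policy dictionary purely for illustration; in a real deployment the authorization decision is made by Ubiq's integration with your IAM system (Entra ID, Okta, etc.), and the identity and permission names here are invented.

```python
# Hypothetical identities and permissions -- illustrative only.
POLICY = {"billing-svc": {"decrypt"}, "etl-job": {"encrypt"}}

def ubiq_reveal(value, identity):
    # Placeholder decrypt with an access check in front of it.
    if "decrypt" not in POLICY.get(identity, set()):
        raise PermissionError(f"{identity} may not decrypt")
    return value[len("ubiq:"):][::-1]

protected = "ubiq:" + "123-45-6789"[::-1]

# Authorized identity recovers the cleartext.
assert ubiq_reveal(protected, "billing-svc") == "123-45-6789"

# Encrypt-only identity is refused.
try:
    ubiq_reveal(protected, "etl-job")
    denied = False
except PermissionError:
    denied = True
```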

Migrating from Another Protection Platform to Ubiq

Overview

When migrating from another protection platform, the complexity increases because existing values are already protected. During the transition, systems must temporarily support both legacy and Ubiq protection schemes.

There are two primary approaches: a bulk re-key conversion and a dual-mode progressive migration.

Pre-Migration Assessment

Start by inventorying where protected data exists and how it is currently used. Identify which applications access it, whether deterministic behavior is required, and how legacy keys are managed.

You must confirm that you have a reliable path to decrypt legacy values. This includes access to legacy keys, runtime libraries, and a safe test environment for validation.

Define target Ubiq protection profiles carefully. Ensure outputs remain format-compatible and that downstream system expectations are met. Even if both systems use format-preserving methods, their outputs will not be identical.

Bulk Re-Key Conversion

In a bulk re-key model, you schedule a controlled event where legacy values are decrypted and immediately re-encrypted or tokenized using Ubiq. Applications are paused during this event to avoid inconsistency.
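The re-key step itself is a decrypt-then-protect pipeline, sketched below. `legacy_decrypt` stands in for the outgoing platform's SDK and `ubiq_protect` for Ubiq's; both names and formats are assumptions for illustration, not real APIs.

```python
def legacy_decrypt(value):          # placeholder for the legacy SDK
    return value[len("legacy:"):]

def ubiq_protect(value):            # placeholder for the Ubiq SDK
    return "ubiq:" + value[::-1]

def rekey(value):
    # Decrypt and re-protect in one step so cleartext is never
    # persisted or logged between the two operations.
    return ubiq_protect(legacy_decrypt(value))

rows = ["legacy:123-45-6789", "legacy:987-65-4321"]
rows = [rekey(value) for value in rows]
```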

This model eliminates dual-mode complexity quickly but requires coordination and downtime.

Dual-Mode Progressive Migration

In most large environments, a dual-mode period is necessary.

During this phase, applications or database logic must be capable of detecting whether a value is protected by the legacy platform or by Ubiq.

New writes should always use Ubiq. Reads must support both formats. When a legacy-protected value is read, it can optionally be converted to Ubiq protection before being written back. Over time, the dataset converges fully to Ubiq.
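A dual-mode read path with opportunistic conversion can be sketched as follows. The `legacy:`/`ubiq:` prefixes and all function names are illustrative placeholders; in practice, format detection depends on how each platform marks its output.

```python
def legacy_decrypt(value):          # placeholder for the legacy SDK
    return value[len("legacy:"):]

def ubiq_protect(value):            # placeholder for the Ubiq SDK
    return "ubiq:" + value[::-1]

def ubiq_reveal(value):             # placeholder for the Ubiq SDK
    return value[len("ubiq:"):][::-1]

store = {1: "legacy:111-22-3333",   # row protected by the old platform
         2: ubiq_protect("444-55-6666")}

def read(record_id):
    value = store[record_id]
    if value.startswith("legacy:"):
        clear = legacy_decrypt(value)
        store[record_id] = ubiq_protect(clear)  # opportunistic conversion
        return clear
    return ubiq_reveal(value)

a = read(1)  # legacy path; converts the row to Ubiq as a side effect
b = read(1)  # second read is served entirely through Ubiq
```

Once a scan confirms no `legacy:`-format values remain, the legacy branch can be deleted, which is the milestone the next section describes.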

It is critical to avoid indefinite dual-mode operation. Define a clear milestone for disabling the legacy decrypt path once all records have transitioned.

Cutover and Decommission

Once no legacy-protected values remain, disable legacy decrypt logic, remove associated libraries, and retire legacy key dependencies. Confirm logging and monitoring reflect only Ubiq operations.

Recommended Approach

For most production systems, progressive migration is the safest path. Bulk conversion is appropriate when operational windows are available.

Always use deterministic protection-state detection. Always maintain full observability. And always define a clear end state where only Ubiq enforcement remains active.


© 2026 Ubiq Security, Inc. All rights reserved.