Oracle Key Vault's multi-master architecture allows multiple OKV nodes to actively participate in key management across geographically distributed environments. Each node can read and write data, ensuring high availability, fault tolerance, and seamless scalability. This setup not only eliminates a single point of failure but also supports consistent key access and synchronization across sites, making it ideal for enterprise-grade security and disaster recovery.
Active-Active Architecture:
All nodes in the cluster are active and capable of processing client requests, including cryptographic operations and key management functions.
Automatic Synchronization:
Nodes automatically replicate and synchronize key and metadata changes to ensure consistency across the cluster.
Fault Tolerance:
If one node becomes unavailable, other nodes continue to operate without disruption, ensuring high availability.
Geographically Distributed Support:
Nodes can be deployed across different data centers or geographic regions to support disaster recovery and improve performance for distributed applications.
Certificate-Based Trust Establishment:
Each node uses certificates to establish trust and securely exchange data with other nodes in the cluster.
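To see what a node actually presents, you can pull its server certificate over TLS and inspect it. This is purely an illustration using Python's standard library and the controller hostname used later in this walkthrough; the real cluster trust setup exchanges certificates through the OKV web console, as described below.

```python
import ssl

# Fetch the PEM-encoded certificate served on the OKV web console port (443).
# Illustration only: actual cluster trust is established by exchanging
# certificates through the OKV management console, not by this probe.
pem = ssl.get_server_certificate(("okv.oracle.com", 443))
print(pem)
```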
In this article, I’ll walk you through the steps to configure a multi-master cluster in Oracle Key Vault (OKV). For this demonstration, we will set up a 2-node cluster, although best practice is to use at least three or four nodes so the cluster can better tolerate node failures.
High Availability, Scalability, and Fault Tolerance are built in.
All nodes must be running the same OKV version to join the cluster.
NTP-based time synchronization is mandatory across all nodes.
Nodes actively serve endpoints and maintain an identical dataset.
Cluster Size Limits:
Minimum: 2 nodes
Maximum: 16 nodes
Supports multiple read-write node pairs across data centers.
We will use the following configuration:
Cluster Name: okv-cl01
Cluster Subgroup Name: okv-cl01-sg01
Controller Node – 192.168.56.210 – okv.oracle.com
Acts as the first master node in the cluster.
Candidate Node – 192.168.56.211 – okv02.oracle.com
Joins the cluster as a candidate node after the controller is configured.
Pre-Configuration Notes
Before starting the configuration, make sure to consider the following:
Backups are Critical
If OKV runs on a virtualized platform, take a full VM snapshot before proceeding. A failed configuration is not easy to roll back.
Time Synchronization is Mandatory
All nodes must use NTP (Network Time Protocol) to keep their clocks aligned; desynchronized clocks will prevent successful cluster formation (see the drift-check sketch after this list).
Static IPs are Required
Assign a static IP address to every OKV node before starting the configuration.
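As a quick sanity check of clock alignment, here is a minimal SNTP query sketch in Python (standard library only). Run it on each node and compare the reported drift; pool.ntp.org is a placeholder for whatever NTP source your environment actually uses.

```python
import socket
import struct
import time

NTP_UNIX_OFFSET = 2208988800  # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def sntp_time(server: str = "pool.ntp.org", timeout: float = 5.0) -> float:
    """Return the current time (Unix seconds) reported by an NTP server."""
    # 48-byte SNTP request: LI=0, Version=3, Mode=3 (client).
    request = b"\x1b" + 47 * b"\x00"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(request, (server, 123))
        response, _ = sock.recvfrom(48)
    # Transmit timestamp: 32-bit seconds field at bytes 40-43 of the response.
    ntp_seconds = struct.unpack("!I", response[40:44])[0]
    return ntp_seconds - NTP_UNIX_OFFSET

drift = abs(sntp_time() - time.time())
print(f"Local clock drift vs NTP: {drift:.2f} seconds")
```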
To ensure proper communication between OKV nodes and endpoints, make sure the following ports are open in your firewall:
TCP 5696 – Required for secure key management (KMIP) traffic between OKV and its endpoints.
TCP 443 – To access the OKV web console.
TCP 22 – For SSH access (administrative use only).
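Before going further, you may want to confirm these ports are actually reachable. A minimal sketch, assuming the hostnames from the configuration above and a machine that should be able to reach both nodes:

```python
import socket

# (host, port) pairs to verify; hosts match the cluster configuration above.
CHECKS = [
    ("okv.oracle.com", 5696),    # KMIP traffic
    ("okv.oracle.com", 443),     # web console
    ("okv02.oracle.com", 5696),
    ("okv02.oracle.com", 443),
]

for host, port in CHECKS:
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"{host}:{port} reachable")
    except OSError as exc:
        print(f"{host}:{port} NOT reachable ({exc})")
```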
To enable cluster mode in OKV, follow these high-level steps:
Initial Node Selection:
When you select the first OKV node for cluster setup, it becomes the Controller Node. This node manages cluster membership and coordination.
All subsequent nodes you add to the cluster are treated as Candidate Nodes. These nodes synchronize with the controller and become part of the cluster.
Cluster Formation:
Once all candidate nodes successfully join, the OKV cluster becomes fully functional. The cluster ensures high availability and load balancing across nodes.
First, select the controller node and enter the required controller settings. After completing the controller setup, you can proceed to configure the candidate node settings.
Steps (summary):
To add a candidate node to the OKV cluster, follow these steps using the server details defined above:
Access the Candidate Node:
Log in to the candidate server (okv02.oracle.com) via SSH or console access.
Set Hostname and Network Configuration:
Ensure the hostname and IP address are correctly set and resolvable from the controller node (a resolution-check sketch follows these steps).
Sync Time Settings:
Configure NTP on the candidate node, ideally against the same time source as the controller, to prevent cluster issues.
Install or Verify OKV Software:
Confirm that the Oracle Key Vault software is installed and that its version matches the controller node.
Retrieve and Import Controller Certificate:
Obtain the controller certificate from okv.oracle.com (the controller node). Copy it to the candidate node and place it in the appropriate section.
Generate and Share Candidate Certificate:
Generate the candidate node's certificate and copy it back to okv.oracle.com (the controller) to complete the mutual trust setup.
Join the Cluster from the Controller Node:
Use the OKV web console to initiate the cluster join process with okv02.oracle.com as the candidate node.
After entering all the required details, initiate the conversion of the server to a candidate node.
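Before initiating the join, it is worth double-checking that each hostname resolves to the expected static IP from wherever you run these steps. A small sketch using the addresses from this walkthrough:

```python
import socket

# Expected name-to-address mapping from this walkthrough's configuration.
EXPECTED = {
    "okv.oracle.com": "192.168.56.210",    # controller node
    "okv02.oracle.com": "192.168.56.211",  # candidate node
}

for host, expected_ip in EXPECTED.items():
    try:
        resolved = {info[4][0] for info in socket.getaddrinfo(host, None, socket.AF_INET)}
    except socket.gaierror as exc:
        print(f"{host}: resolution failed ({exc})")
        continue
    status = "OK" if expected_ip in resolved else f"MISMATCH (expected {expected_ip})"
    print(f"{host} -> {sorted(resolved)} {status}")
```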
Throughout this guide, we’ve walked through the process of setting up a 2-node OKV cluster, starting from the initial node configuration to successfully adding a candidate node. By following the necessary pre-configuration steps, ensuring proper synchronization, and securely managing certificates, you can create a robust OKV deployment that meets the needs of today’s complex and data-driven enterprise environments.
Remember that it’s crucial to secure the recovery password and verify all configurations before completing the setup. A well-planned and executed multi-master configuration can safeguard your encryption keys and ensure your systems remain resilient against potential threats and failures.
By leveraging OKV’s multi-master architecture, you can stay ahead of the evolving cybersecurity landscape and maintain continuous protection for your sensitive data across various environments and geographies.