Formatted question description: https://leetcode.ca/all/1879.html
1879. Minimum XOR Sum of Two Arrays
Level
Hard
Description
You are given two integer arrays nums1 and nums2 of length n.

The XOR sum of the two integer arrays is (nums1[0] XOR nums2[0]) + (nums1[1] XOR nums2[1]) + ... + (nums1[n - 1] XOR nums2[n - 1]) (0-indexed).

- For example, the XOR sum of [1,2,3] and [3,2,1] is equal to (1 XOR 3) + (2 XOR 2) + (3 XOR 1) = 2 + 0 + 2 = 4.

Rearrange the elements of nums2 such that the resulting XOR sum is minimized.

Return the XOR sum after the rearrangement.
Example 1:
Input: nums1 = [1,2], nums2 = [2,3]
Output: 2
Explanation: Rearrange nums2 so that it becomes [3,2].
The XOR sum is (1 XOR 3) + (2 XOR 2) = 2 + 0 = 2.
Example 2:
Input: nums1 = [1,0,3], nums2 = [5,3,4]
Output: 8
Explanation: Rearrange nums2 so that it becomes [5,4,3].
The XOR sum is (1 XOR 5) + (0 XOR 4) + (3 XOR 3) = 4 + 4 + 0 = 8.
Constraints:
n == nums1.length
n == nums2.length
1 <= n <= 14
0 <= nums1[i], nums2[i] <= 10^7
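
Since n is at most 14, trying every permutation of nums2 (n! orderings, up to 14! ≈ 8.7 * 10^10) is far too slow, while iterating over the 2^n <= 16384 subsets of nums2 is cheap. For very small inputs, a minimal brute-force reference in Python can be used to sanity-check a solution; the function name brute_force_xor_sum below is illustrative only, not part of the problem.

from itertools import permutations

def brute_force_xor_sum(nums1, nums2):
    # Try every ordering of nums2 and keep the smallest XOR sum.
    # Only practical for very small n (roughly n <= 8).
    return min(
        sum(a ^ b for a, b in zip(nums1, perm))
        for perm in permutations(nums2)
    )

# Matches the provided examples:
# brute_force_xor_sum([1, 2], [2, 3]) == 2
# brute_force_xor_sum([1, 0, 3], [5, 3, 4]) == 8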
Solution
Use dynamic programming over bitmask-compressed states. Create a 2D array dp with n rows and 2^n columns, where dp[i][j] is the minimum XOR sum obtained by pairing the first i + 1 elements of nums1 with the elements of nums2 whose indices are the set bits of the bitmask j (so only masks with i + 1 set bits are meaningful). The transition removes one used index at a time: dp[i][j] = min over set bits k of j of dp[i - 1][j - (1 << k)] + (nums1[i] XOR nums2[k]). Finally, return dp[n - 1][2^n - 1].
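
A minimal bottom-up sketch of this table in Python, mirroring the recurrence above (the helper name is illustrative only):

def minimum_xor_sum_table(nums1, nums2):
    n = len(nums1)
    INF = float('inf')
    # dp[i][j]: minimum XOR sum pairing nums1[0..i] with the nums2
    # elements whose indices are the set bits of mask j.
    dp = [[INF] * (1 << n) for _ in range(n)]
    for k in range(n):
        dp[0][1 << k] = nums1[0] ^ nums2[k]
    for i in range(1, n):
        for j in range(1 << n):
            if bin(j).count('1') != i + 1:
                continue  # only masks with i + 1 chosen elements matter
            for k in range(n):
                if j >> k & 1 and dp[i - 1][j ^ (1 << k)] != INF:
                    dp[i][j] = min(dp[i][j],
                                   dp[i - 1][j ^ (1 << k)] + (nums1[i] ^ nums2[k]))
    return dp[n - 1][(1 << n) - 1]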
-
import java.util.Arrays;

class Solution {
    public int minimumXORSum(int[] nums1, int[] nums2) {
        int length = nums1.length;
        // dp[i][j]: minimum XOR sum pairing nums1[0..i] with the nums2
        // elements whose indices are the set bits of mask j.
        int[][] dp = new int[length][1 << length];
        for (int i = 0; i < length; i++)
            dp[0][1 << i] = nums1[0] ^ nums2[i];
        for (int i = 1; i < length; i++) {
            Arrays.fill(dp[i], Integer.MAX_VALUE);
            for (int j = 1; j < 1 << length; j++) {
                int remain = j;
                while (remain > 0) {
                    int curBit = remain & (-remain); // lowest set bit of the remaining mask
                    int k = Integer.numberOfTrailingZeros(curBit); // index of that bit in nums2
                    if (dp[i - 1][j - curBit] != Integer.MAX_VALUE) { // skip unreachable states to avoid overflow
                        dp[i][j] = Math.min(dp[i][j], dp[i - 1][j - curBit] + (nums1[i] ^ nums2[k]));
                    }
                    remain -= curBit;
                }
            }
        }
        return dp[length - 1][(1 << length) - 1];
    }
}
-
// OJ: https://leetcode.com/problems/minimum-xor-sum-of-two-arrays/
// Time: O(N * 2^N)
// Space: O(2^N)
class Solution {
    int m[16384] = {[0 ... 16383] = INT_MAX}, N; // m[mask]: memo; the next index i equals popcount(mask)
    int dp(vector<int> &A, vector<int> &B, int i, int mask) {
        if (i == N) return 0;
        if (m[mask] != INT_MAX) return m[mask];
        int ans = INT_MAX;
        for (int j = 0; j < N; ++j) {
            if (mask >> j & 1) continue; // B[j] already used
            ans = min(ans, (A[i] ^ B[j]) + dp(A, B, i + 1, mask | (1 << j)));
        }
        return m[mask] = ans;
    }
public:
    int minimumXORSum(vector<int>& A, vector<int>& B) {
        N = A.size();
        return dp(A, B, 0, 0);
    }
};
-
# 1879. Minimum XOR Sum of Two Arrays
# https://leetcode.com/problems/minimum-xor-sum-of-two-arrays
from functools import cache
from typing import List

class Solution:
    def minimumXORSum(self, nums1: List[int], nums2: List[int]) -> int:
        n = len(nums2)

        @cache
        def go(mask):
            # The number of nums2 elements already used tells us which
            # nums1 element is matched next.
            i = bin(mask).count('1')
            if i >= len(nums1):
                return 0
            count = float('inf')
            for j in range(n):
                if mask & (1 << j) == 0:
                    count = min(count, (nums1[i] ^ nums2[j]) + go(mask + (1 << j)))
            return count

        return go(0)
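
A quick check of the Python solution against the two provided examples (assuming the class above is in scope as Solution):

s = Solution()
print(s.minimumXORSum([1, 2], [2, 3]))        # expected: 2
print(s.minimumXORSum([1, 0, 3], [5, 3, 4]))  # expected: 8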