Formatted question description: https://leetcode.ca/all/1879.html
1879. Minimum XOR Sum of Two Arrays
Level
Hard
Description
You are given two integer arrays nums1 and nums2 of length n.
The XOR sum of the two integer arrays is (nums1[0] XOR nums2[0]) + (nums1[1] XOR nums2[1]) + ... + (nums1[n - 1] XOR nums2[n - 1]) (0-indexed).
- For example, the XOR sum of [1,2,3] and [3,2,1] is equal to (1 XOR 3) + (2 XOR 2) + (3 XOR 1) = 2 + 0 + 2 = 4.
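The definition translates directly to code. A minimal sketch (the helper name xor_sum is hypothetical, used only to illustrate the formula):

def xor_sum(nums1, nums2):
    # Sum of element-wise XORs, per the definition above.
    return sum(a ^ b for a, b in zip(nums1, nums2))

print(xor_sum([1, 2, 3], [3, 2, 1]))  # 4, matching the worked example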
Rearrange the elements of nums2 such that the resulting XOR sum is minimized. Return the XOR sum after the rearrangement.
Example 1:
Input: nums1 = [1,2], nums2 = [2,3]
Output: 2
Explanation: Rearrange nums2 so that it becomes [3,2].
The XOR sum is (1 XOR 3) + (2 XOR 2) = 2 + 0 = 2.
Example 2:
Input: nums1 = [1,0,3], nums2 = [5,3,4]
Output: 8
Explanation: Rearrange nums2 so that it becomes [5,4,3].
The XOR sum is (1 XOR 5) + (0 XOR 4) + (3 XOR 3) = 4 + 4 + 0 = 8.
Constraints:
n == nums1.length
n == nums2.length
1 <= n <= 14
0 <= nums1[i], nums2[i] <= 10^7
Solution
Use dynamic programming with bitmask-compressed states. Create a 2D array dp of n rows and 2^n columns, where dp[i][j] represents the minimum XOR sum of the first i + 1 numbers in nums1 when j is the bitmask of the elements of nums2 already used (bit k of j is set if nums2[k] has been paired with some element of nums1). The last pairing must use some nums2[k] whose bit is set in j, which gives the transition dp[i][j] = min over set bits k of j of (dp[i - 1][j - 2^k] + (nums1[i] XOR nums2[k])). Finally, return dp[n - 1][2^n - 1].
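Since the examples are tiny, a brute-force reference that tries every ordering of nums2 is handy for validating the implementations below. This is a sketch of my own (the function name is not part of any solution) and runs in O(n!) time, so it is only for testing small inputs:

from itertools import permutations

def minimum_xor_sum_brute(nums1, nums2):
    # Try every ordering of nums2 and keep the smallest XOR sum.
    return min(
        sum(a ^ b for a, b in zip(nums1, perm))
        for perm in permutations(nums2)
    )

print(minimum_xor_sum_brute([1, 2], [2, 3]))        # 2 (Example 1)
print(minimum_xor_sum_brute([1, 0, 3], [5, 3, 4]))  # 8 (Example 2)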
- Java
import java.util.Arrays;

class Solution {
    public int minimumXORSum(int[] nums1, int[] nums2) {
        int length = nums1.length;
        int[][] dp = new int[length][1 << length];
        // Base case: pair nums1[0] with each single candidate from nums2.
        for (int i = 0; i < length; i++)
            dp[0][1 << i] = nums1[0] ^ nums2[i];
        for (int i = 1; i < length; i++) {
            Arrays.fill(dp[i], Integer.MAX_VALUE);
            for (int j = 1; j < 1 << length; j++) {
                int remain = j;
                while (remain > 0) {
                    int curBit = remain & (-remain); // lowest set bit still unprocessed
                    // Index of curBit; numberOfTrailingZeros avoids the
                    // floating-point rounding risk of log-based extraction.
                    int k = Integer.numberOfTrailingZeros(curBit);
                    dp[i][j] = Math.min(dp[i][j], dp[i - 1][j - curBit] + (nums1[i] ^ nums2[k]));
                    remain -= curBit;
                }
            }
        }
        return dp[length - 1][(1 << length) - 1];
    }
}
- C++
// OJ: https://leetcode.com/problems/minimum-xor-sum-of-two-arrays/
// Time: O(N * 2^N)
// Space: O(2^N)
#include <algorithm>
#include <climits>
#include <vector>
using namespace std;

class Solution {
    // Memo keyed by mask only: popcount(mask) determines how many
    // nums1 elements are already paired, so `i` is implied by `mask`.
    // 16384 = 2^14 covers the maximum n = 14; the range initializer
    // `[0 ... 16383]` is a GCC extension.
    int m[16384] = {[0 ... 16383] = INT_MAX}, N;
    int dp(vector<int> &A, vector<int> &B, int i, int mask) {
        if (i == N) return 0;
        if (m[mask] != INT_MAX) return m[mask];
        int ans = INT_MAX;
        for (int j = 0; j < N; ++j) {
            if (mask >> j & 1) continue; // B[j] already used
            ans = min(ans, (A[i] ^ B[j]) + dp(A, B, i + 1, mask | (1 << j)));
        }
        return m[mask] = ans;
    }
public:
    int minimumXORSum(vector<int>& A, vector<int>& B) {
        N = A.size();
        return dp(A, B, 0, 0);
    }
};
- Python
# 1879. Minimum XOR Sum of Two Arrays
# https://leetcode.com/problems/minimum-xor-sum-of-two-arrays
from functools import cache
from typing import List

class Solution:
    def minimumXORSum(self, nums1: List[int], nums2: List[int]) -> int:
        n = len(nums2)

        @cache
        def go(mask):
            # popcount(mask) tells us which nums1 index is paired next.
            i = bin(mask).count('1')
            if i >= len(nums1):
                return 0
            count = float('inf')
            for j in range(n):
                if mask & (1 << j) == 0:  # nums2[j] not used yet
                    count = min(count, (nums1[i] ^ nums2[j]) + go(mask + (1 << j)))
            return count

        return go(0)
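A quick check of the memoized version against the two examples above, run as a standalone script:

s = Solution()
print(s.minimumXORSum([1, 2], [2, 3]))        # 2 (Example 1)
print(s.minimumXORSum([1, 0, 3], [5, 3, 4]))  # 8 (Example 2)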
- Go
import "math/bits"

func minimumXORSum(nums1 []int, nums2 []int) int {
	n := len(nums1)
	f := make([]int, 1<<n)
	for i := range f {
		f[i] = 1 << 30
	}
	f[0] = 0
	for i := 0; i < 1<<n; i++ {
		// popcount(i) elements of nums2 are used, so nums1[k] is the
		// element paired last when reaching mask i.
		k := bits.OnesCount(uint(i)) - 1
		for j := 0; j < n; j++ {
			if i>>j&1 == 1 {
				f[i] = min(f[i], f[i^1<<j]+(nums1[k]^nums2[j]))
			}
		}
	}
	return f[(1<<n)-1]
}

func min(a, b int) int {
	if a < b {
		return a
	}
	return b
}
- TypeScript
function minimumXORSum(nums1: number[], nums2: number[]): number {
    const n = nums1.length;
    const f: number[] = Array(1 << n).fill(1 << 30);
    f[0] = 0;
    for (let i = 0; i < 1 << n; ++i) {
        // popcount(i) elements of nums2 are used, so nums1[k] is the
        // element paired last when reaching mask i.
        const k = bitCount(i) - 1;
        for (let j = 0; j < n; ++j) {
            if (((i >> j) & 1) === 1) {
                f[i] = Math.min(f[i], f[i ^ (1 << j)] + (nums1[k] ^ nums2[j]));
            }
        }
    }
    return f[(1 << n) - 1];
}

// SWAR popcount: counts set bits without a loop.
function bitCount(i: number): number {
    i = i - ((i >>> 1) & 0x55555555);
    i = (i & 0x33333333) + ((i >>> 2) & 0x33333333);
    i = (i + (i >>> 4)) & 0x0f0f0f0f;
    i = i + (i >>> 8);
    i = i + (i >>> 16);
    return i & 0x3f;
}