2913. Subarrays Distinct Element Sum of Squares I
Description
You are given a 0-indexed integer array `nums`.

The distinct count of a subarray of `nums` is defined as:

- Let `nums[i..j]` be a subarray of `nums` consisting of all the indices from `i` to `j` such that `0 <= i <= j < nums.length`. Then the number of distinct values in `nums[i..j]` is called the distinct count of `nums[i..j]`.

Return the sum of the squares of distinct counts of all subarrays of `nums`.
A subarray is a contiguous non-empty sequence of elements within an array.
Example 1:
Input: nums = [1,2,1]
Output: 15
Explanation: Six possible subarrays are:
[1]: 1 distinct value
[2]: 1 distinct value
[1]: 1 distinct value
[1,2]: 2 distinct values
[2,1]: 2 distinct values
[1,2,1]: 2 distinct values
The sum of the squares of the distinct counts in all subarrays is equal to 1² + 1² + 1² + 2² + 2² + 2² = 15.
Example 2:
Input: nums = [1,1]
Output: 3
Explanation: Three possible subarrays are:
[1]: 1 distinct value
[1]: 1 distinct value
[1,1]: 1 distinct value
The sum of the squares of the distinct counts in all subarrays is equal to 1² + 1² + 1² = 3.
Constraints:
- `1 <= nums.length <= 100`
- `1 <= nums[i] <= 100`
Solutions
Solution 1: Enumeration
We can enumerate the left endpoint index $i$ of the subarray. For each $i$, we enumerate the right endpoint index $j$ in the range $[i, n)$ while maintaining a set (or frequency array) $s$ of the elements of $nums[i..j]$: after adding $nums[j]$ to $s$, the size of $s$ is the distinct count of $nums[i..j]$, so we add the square of the size of $s$ to the answer.

After the enumeration, we return the answer.
The time complexity is $O(n^2)$, and the space complexity is $O(n)$. Here, $n$ is the length of the array $nums$.
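For example, with $nums = [1, 2, 1]$, the inner loop for $i = 0$ yields distinct counts $1, 2, 2$ (contributing $1 + 4 + 4 = 9$), for $i = 1$ it yields $1, 2$ (contributing $1 + 4 = 5$), and for $i = 2$ it yields $1$ (contributing $1$), so the total is $9 + 5 + 1 = 15$, matching Example 1.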
```java
class Solution {
    public int sumCounts(List<Integer> nums) {
        int ans = 0;
        int n = nums.size();
        for (int i = 0; i < n; ++i) {
            int[] s = new int[101]; // frequency of each value in nums[i..j]
            int cnt = 0; // number of distinct values in nums[i..j]
            for (int j = i; j < n; ++j) {
                if (++s[nums.get(j)] == 1) {
                    ++cnt;
                }
                ans += cnt * cnt;
            }
        }
        return ans;
    }
}
```
```cpp
class Solution {
public:
    int sumCounts(vector<int>& nums) {
        int ans = 0;
        int n = nums.size();
        for (int i = 0; i < n; ++i) {
            int s[101]{}; // frequency of each value in nums[i..j]
            int cnt = 0;  // number of distinct values in nums[i..j]
            for (int j = i; j < n; ++j) {
                if (++s[nums[j]] == 1) {
                    ++cnt;
                }
                ans += cnt * cnt;
            }
        }
        return ans;
    }
};
```
```python
from typing import List


class Solution:
    def sumCounts(self, nums: List[int]) -> int:
        ans, n = 0, len(nums)
        for i in range(n):
            s = set()  # distinct values in nums[i..j]
            for j in range(i, n):
                s.add(nums[j])
                ans += len(s) * len(s)
        return ans
```
```go
func sumCounts(nums []int) (ans int) {
	for i := range nums {
		s := [101]int{} // frequency of each value in the current subarray
		cnt := 0        // number of distinct values seen so far
		for _, x := range nums[i:] {
			s[x]++
			if s[x] == 1 {
				cnt++
			}
			ans += cnt * cnt
		}
	}
	return
}
```
```ts
function sumCounts(nums: number[]): number {
    let ans = 0;
    const n = nums.length;
    for (let i = 0; i < n; ++i) {
        const s: number[] = Array(101).fill(0); // frequency of each value
        let cnt = 0; // number of distinct values in the current subarray
        for (const x of nums.slice(i)) {
            if (++s[x] === 1) {
                ++cnt;
            }
            ans += cnt * cnt;
        }
    }
    return ans;
}
```
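As a quick sanity check, the enumeration can be verified against both examples. Below is a minimal sketch in Python; the free function `sum_counts` is a hypothetical standalone wrapper around the same logic as the listings above, not the LeetCode method signature.

```python
from typing import List


def sum_counts(nums: List[int]) -> int:
    # Hypothetical standalone wrapper; same enumeration as the solutions above.
    ans, n = 0, len(nums)
    for i in range(n):
        seen = set()  # distinct values in nums[i..j]
        for j in range(i, n):
            seen.add(nums[j])
            ans += len(seen) ** 2
    return ans


# The two examples from the problem statement.
assert sum_counts([1, 2, 1]) == 15  # Example 1
assert sum_counts([1, 1]) == 3      # Example 2
print("both examples pass")
```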