I am getting a wrong answer on test cases 1 and 2, but as per my understanding of the question, my code gives correct output when I debug it. Any help, please?
Any hint/corner case where my code could fail?
Share your code via ide.codingblocks.com so that I can have a look.
Till then, I suggest you use this approach (there is a short code sketch after the steps):
Note: If all elements in a subarray arr[i…j] are distinct, then the sum of the lengths of the distinct-element subarrays that start at index i is ((j-i+1)(j-i+2))/2. How? The possible lengths of subarrays starting at i are 1, 2, 3, …, j-i+1, so the sum is 1 + 2 + … + (j-i+1) = ((j-i+1)(j-i+2))/2.
1. First find the largest subarray (with distinct elements) starting from the first element.
2. Using the formula above, count the sum of lengths of the subarrays that start at the first element.
3. To find the next distinct-element window, increment the starting point i by one, then extend the ending point j as long as the elements of arr[i+1…j] remain distinct; repeat the same way for every starting point.
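Here is a minimal C++ sketch of that sliding window, in case it helps (my own rough illustration, not tested against the judge; sumOfLengths is just an illustrative name, and I'm assuming int elements with a total that fits in a long long):

```cpp
#include <iostream>
#include <unordered_set>
#include <vector>
using namespace std;

// Sum of lengths of all distinct-element subarrays, using the
// window idea above. Here j is exclusive, so the current window
// is arr[i..j-1] and the inclusive formula ((j-i+1)(j-i+2))/2
// becomes len*(len+1)/2 with len = j - i.
long long sumOfLengths(const vector<int>& arr) {
    int n = arr.size();
    unordered_set<int> window;  // elements of the current window
    long long ans = 0;
    int j = 0;
    for (int i = 0; i < n; ++i) {
        // Grow the window while the next element is still new.
        while (j < n && window.find(arr[j]) == window.end()) {
            window.insert(arr[j]);
            ++j;
        }
        // Subarrays starting at i have lengths 1, 2, ..., j - i.
        long long len = j - i;
        ans += len * (len + 1) / 2;
        // Drop arr[i] so the window is valid for the next start.
        window.erase(arr[i]);
    }
    return ans;
}

int main() {
    vector<int> a = {1, 2, 3};
    cout << sumOfLengths(a) << endl;  // prints 10
    return 0;
}
```

Every element is inserted into and erased from the set at most once, so this runs in O(n) on average.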
For the sample input [1, 2, 3], your approach says the largest distinct subarray is the array itself, so plugging those values in, your formula gives me 6, but the expected answer is 10. What is wrong? I think the formula you gave is wrong.
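(Working it out per starting index, as step 3 seems to intend: i = 0 contributes (3·4)/2 = 6, i = 1 contributes (2·3)/2 = 3, and i = 2 contributes (1·2)/2 = 1, for a total of 10, which matches the brute-force sum 1+1+1+2+2+3.)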
I went to GFG to understand it. Thanks for the help!