I want to know what I'm missing in this algorithm.
Can someone please pinpoint the issue?
What is the problem in this algorithm?
Can you please tell me where you are facing the issue?
Some test cases are failing, but I'm unable to figure out why.
The logic seems to be incorrect; you need to loop from both sides and then take the sum at each index.
Sorry, but I didn't get it.
Wait, I'll share the approach and code.
We create two arrays, 'inc' and 'dec':
- inc[i] stores the length of the increasing subarray ending at index i.
- dec[i] stores the length of the decreasing subarray starting at index i.
- Computing these gives us the length of the increasing and decreasing subarrays at each index in O(n) time.
- We find the length of the longest bitonic subarray by taking the maximum of inc[i] + dec[i] - 1 over all i.
- We subtract one since the element at index i is counted in both the increasing and decreasing subarray lengths.
Algorithm
Initialize inc[0] to 1 and dec[n-1] to 1.
Creating the inc[] array
a. For i = 1 to n-1 (the last index of the array): if arr[i] > arr[i-1], then inc[i] = inc[i-1] + 1; else, inc[i] = 1.
Creating the dec[] array
a. For i = n-2 down to 0: if arr[i] > arr[i+1], then dec[i] = dec[i+1] + 1; else, dec[i] = 1.
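Since the original code isn't shown here, this is a minimal sketch of the approach above in Python; the function name and test values are illustrative, not from the original submission.

```python
def longest_bitonic_subarray(arr):
    n = len(arr)
    if n == 0:
        return 0
    inc = [1] * n  # inc[i]: length of the increasing run ending at i
    dec = [1] * n  # dec[i]: length of the decreasing run starting at i
    # Left-to-right pass builds inc[]
    for i in range(1, n):
        if arr[i] > arr[i - 1]:
            inc[i] = inc[i - 1] + 1
    # Right-to-left pass builds dec[]
    for i in range(n - 2, -1, -1):
        if arr[i] > arr[i + 1]:
            dec[i] = dec[i + 1] + 1
    # Subtract 1 because arr[i] is counted in both inc[i] and dec[i]
    return max(inc[i] + dec[i] - 1 for i in range(n))

print(longest_bitonic_subarray([12, 4, 78, 90, 45, 23]))  # → 5 ([4, 78, 90, 45, 23])
```

Note that both loops run only over valid indices (1 to n-1 and n-2 down to 0); looping to n in the first pass would read past the end of the array.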
I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important. It helps us improve our platform and provide you with
the learning experience you deserve.
If you still have some questions or don't find the answers satisfactory, you may reopen
the doubt.