We are given an array A of size N and an integer K denoting the number of subarrays that we need to divide A into. Every element can appear in at most one subarray.
For an array A of length N (1-indexed), we define F(A) = 1*A[1] + 2*A[2] + … + N*A[N], i.e. the sum of i * A[i] over all i in [1, N].
If S_1, S_2, …, S_K are the K non-empty subarrays, then the cost of this division is max(F(S_1), F(S_2), …, F(S_K)).
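
For concreteness, here is how I read the definition of F, as a minimal Python sketch (the function name and the use of enumerate for 1-based positions are just my own phrasing of the definition above):

```python
def F(S):
    # sum of (1-based position within the subarray) * element
    return sum(i * x for i, x in enumerate(S, start=1))

print(F([1, 2]))  # 1*1 + 2*2 = 5
```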
For N = 4, K = 3, and A = [1, 2, 3, 4], there are three possible divisions:
- [1,2], [3], [4] => max(1*1 + 2*2, 3, 4) = 5
- [1], [2,3], [4] => max(1, 1*2 + 2*3, 4) = 8
- [1], [2], [3,4] => max(1, 2, 1*3 + 2*4) = 11
The answer is the minimum cost over all divisions, i.e. 5.
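
To double-check the example, I wrote an exhaustive brute force (function names are my own; it tries every way to place K-1 cuts among the N-1 gaps between elements):

```python
from itertools import combinations

def F(S):
    # sum of (1-based position within the subarray) * element
    return sum(i * x for i, x in enumerate(S, start=1))

def min_cost_brute(A, K):
    N = len(A)
    best = float('inf')
    # a division corresponds to a choice of K-1 cut points among the N-1 gaps
    for cuts in combinations(range(1, N), K - 1):
        bounds = (0, *cuts, N)
        cost = max(F(A[bounds[j]:bounds[j + 1]]) for j in range(K))
        best = min(best, cost)
    return best

print(min_cost_brute([1, 2, 3, 4], 3))  # 5, matching the example
```

This reproduces the answer 5, but it enumerates C(N-1, K-1) divisions, so it is hopeless for the given constraints, which is why I'm looking for a better approach.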
Constraints:
- 1 <= N <= 10^5
- 1 <= K <= N
- 1 <= A[i] <= 10^6
How should I approach this kind of problem?