Getting a runtime error in all the test cases

#include<bits/stdc++.h>
using namespace std;
#define ll long long

int main() {
    ll ans = INT_MIN;
    ll n, m;
    cin >> n >> m;                       // read n and m BEFORE sizing the array

    // point counts per cell, sized after n is known and zero-initialised
    // (declaring dp[n+1][n+1] before reading n, and never setting it to 0,
    //  is what causes the runtime error)
    vector<vector<ll>> dp(n, vector<ll>(n, 0));

    ll x, y;
    for (ll i = 0; i < m; i++) {
        cin >> x >> y;
        dp[x - 1][y - 1]++;              // input is 1-based, store 0-based
    }

    // prefix sums down every column
    for (ll i = 1; i < n; i++)
        for (ll j = 0; j < n; j++)
            dp[i][j] += dp[i - 1][j];

    // prefix sums across every row -> dp[i][j] is now the full 2D prefix sum
    for (ll i = 0; i < n; i++)
        for (ll j = 1; j < n; j++)
            dp[i][j] += dp[i][j - 1];

    // for every partition point (i, j) take the minimum of the four box sums
    for (ll i = 1; i < n; i++) {
        for (ll j = 1; j < n; j++) {
            ll topLeft     = dp[i][j];
            ll topRight    = dp[i][n - 1] - dp[i][j];
            ll bottomLeft  = dp[n - 1][j] - dp[i][j];
            ll bottomRight = dp[n - 1][n - 1] - dp[n - 1][j] - dp[i][n - 1] + dp[i][j];
            ans = max(ans, min({topLeft, topRight, bottomLeft, bottomRight}));
        }
    }
    cout << ans << endl;
    return 0;
}
I am getting a runtime error in all the test cases.

Here is the link to the code: https://ide.codingblocks.com/s/329215

Just take the prefix sum of every row and, after that, the prefix sum of every column, so that pre(i, j) holds the number of points in the rectangle from (1, 1) to (i, j).
Then, if you partition the grid at cell (i, j):
the top-left box sum is pre(i, j),
the top-right box sum is pre(i, m) (that row's last-column prefix) minus the top-left box sum,
the bottom-left box sum is pre(n, j) (that column's last-row prefix) minus the top-left box sum,
and the bottom-right box sum is pre(n, m) (the whole grid) minus pre(i, m) minus pre(n, j) plus pre(i, j) ( pre(i, j) gets subtracted two times, once inside pre(i, m) and once inside pre(n, j), so it has to be added back once ).
Update ans at every step as
ans = max( ans, min( all four boxes ) );
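
Purely as a sketch of those four formulas (my own illustration, not the linked solution; it assumes a zero-indexed n x n prefix table pre where pre[i][j] already holds the 2D prefix sum up to cell (i, j), and names like bestPartition are made up here):

#include<bits/stdc++.h>
using namespace std;
typedef long long ll;

// pre[i][j] = number of points in rows 0..i, columns 0..j (2D prefix sum).
// Returns the best "maximum over partition points of the minimum box sum".
ll bestPartition(const vector<vector<ll>>& pre) {
    ll n = pre.size();
    ll best = 0;
    for (ll i = 0; i + 1 < n; i++) {
        for (ll j = 0; j + 1 < n; j++) {
            ll topLeft     = pre[i][j];
            ll topRight    = pre[i][n - 1] - pre[i][j];
            ll bottomLeft  = pre[n - 1][j] - pre[i][j];
            // pre[i][j] was subtracted twice (once in each strip), add it back
            ll bottomRight = pre[n - 1][n - 1] - pre[i][n - 1] - pre[n - 1][j] + pre[i][j];
            best = max(best, min({topLeft, topRight, bottomLeft, bottomRight}));
        }
    }
    return best;
}

int main() {
    // tiny 3x3 example: one point in every corner and one in the centre
    vector<vector<ll>> pre = {{1, 0, 1},
                              {0, 1, 0},
                              {1, 0, 1}};
    ll n = pre.size();
    for (ll i = 1; i < n; i++)                 // prefix down each column
        for (ll j = 0; j < n; j++) pre[i][j] += pre[i - 1][j];
    for (ll i = 0; i < n; i++)                 // prefix across each row
        for (ll j = 1; j < n; j++) pre[i][j] += pre[i][j - 1];
    cout << bestPartition(pre) << endl;        // prints 1 for this example
    return 0;
}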
You can see my implementation; I have commented the relevant part for better understanding. It is better if you work through one small example on a notebook first to get a crystal-clear understanding.