Defining the size of an array

If I define an array as arr[n*(n+1)/2], then for n = 5 and larger the program either crashes or shows an error.
But if I take a variable p, store the value in it, and define the array as arr[p], I don’t get any error.
Why is this so?

Also, in the Fibonacci pattern, if I input 10 rows, why are some of the numbers in the 10th row negative?
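For reference, here is a minimal sketch of the kind of pattern I mean (my actual code may differ slightly):

#include <iostream>
using namespace std;
int main()
{
    int rows = 10;
    int a = 0, b = 1; // consecutive Fibonacci numbers held in plain int
    for (int i = 1; i <= rows; i++)
    {
        // row i prints the next i Fibonacci numbers
        for (int j = 0; j < i; j++)
        {
            cout << a << " ";
            int next = a + b; // once the values exceed INT_MAX (~2.1e9), this wraps to a negative value in practice
            a = b;
            b = next;
        }
        cout << endl;
    }
}

With 10 rows the triangle prints 55 Fibonacci numbers, and around the 47th one the values exceed INT_MAX, so the later entries of the 10th row come out negative.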

Hi Rohan
The first part of your question is a little ambiguous. I guess if you store 5 in n, i.e. n = 5, and then declare the array as arr[n*(n+1)/2], the program should work.
Is this what you are trying to ask?

Not exactly.
I first ask the user to input the value of n, so we have a definite value of n. Then I define an array as arr[n*(n+1)/2]. It works fine for smaller values of n, but for n = 5 or greater, the program crashes or shows an error.

Now, if I take a variable p as p = n*(n+1)/2 and define the array as arr[p], it works fine even for larger values of n.
What difference does computing n*(n+1)/2 separately make?
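Roughly, the two versions look like this (a minimal sketch; the names p, arr1 and arr2 are just for illustration):

#include <iostream>
using namespace std;
int main()
{
    int n;
    cin >> n;
    // Version 1: size written directly in the declaration
    // (the one that crashes for me once n >= 5)
    int arr1[n*(n+1)/2];
    // Version 2: size computed into a variable first (works for me)
    int p = n*(n+1)/2;
    int arr2[p];
    arr1[0] = arr2[0] = 0; // touch both arrays to avoid unused-variable warnings
    cout << "declared both arrays with " << p << " elements" << endl;
}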

Hi
#include <bits/stdc++.h>
using namespace std;
int main()
{
    int n;
    cout << "Enter number: ";
    cin >> n;
    // variable-length array sized by a runtime expression
    // (a compiler extension, not standard C++)
    int arr[(n*(n+1))/2];
    for (int i = 0; i < (n*(n+1))/2; i++)
        cin >> arr[i];
    cout << endl;
    for (int i = 0; i < (n*(n+1))/2; i++)
        cout << arr[i] << " "; // separate the values when printing
}
I tried this; it works for me even for larger values of n.
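A note on what might be going on: int arr[(n*(n+1))/2] with a runtime n is a variable-length array, which is a compiler extension (supported by GCC and Clang) rather than standard C++, and as far as the language is concerned the two forms (the inline expression vs. a precomputed p) should behave identically. So if only one version crashes for you, the cause is likely elsewhere, e.g. the array being declared before n is read in, or the stack running out for very large n. If you want something portable, here is a minimal sketch using std::vector, which allocates on the heap instead of the stack:

#include <iostream>
#include <vector>
using namespace std;
int main()
{
    int n;
    cout << "Enter number: ";
    cin >> n;
    // vector allocates its elements on the heap,
    // so large sizes don't overflow the stack
    vector<int> arr(n*(n+1)/2);
    for (int i = 0; i < (int)arr.size(); i++)
        cin >> arr[i];
    cout << endl;
    for (int i = 0; i < (int)arr.size(); i++)
        cout << arr[i] << " ";
}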


I hope I’ve cleared your doubt. Please rate your experience here.
Your feedback is very important: it helps us improve our platform and provide you
the learning experience you deserve.

If you still have some questions or don’t find the answers satisfactory, you may reopen
the doubt.