Cannot get a response from the crawler!

After running the command, I am not getting the HTML response from the crawler! Please help me out!
Below is the link to the code and an image of the command prompt:

Hey @prerit_goyal,
can you please upload the full repository to Google Drive and share it with me?
I need to check the other files too.

myfile is the name: https://github.com/Prerit2809/CodingBlocksCourse

Hey @prerit_goyal,
the code file under the spiders folder has a different name; just change that and it will work.

You need it to be name="myfile"

The problem still persists, even after changing the name!

Okay, let me check it on my system then.

Can you also let me know whether, after changing the name, the output screen is the same as before?
Also, update your code on GitHub with the position where you changed the name.

Yes, I have changed the name in the repo. The output screen is still the same as before.

You don't need to change the name of the code file.
In the code, there is a variable named name, so you just need to change that to
name = "myfile"

Oh, sorry! I tried changing the name, but the output screen is still the same!

Hey @prerit_goyal,
did it work?

No, the output is still the same!

scrapy_name

Change this to
name = "myfile"

Did you do so?

Yes, I made those changes and my output was still the same.
P.S.: Now I have changed them in the repo files too!

Hey @prerit_goyal,
sorry for the late response.

I was running your Scrapy code. Though I didn't figure out what the problem is, it wasn't running for me either.
So what I would suggest is to create a new project and start over on that.

Oh, okay, I'll try it again!

Yeah, try that once. That should work.

I removed the for loop here and used the shortcut method instead. Now Scrapy is working fine.
Could you please tell me why the for loop portion was not working correctly?

I don't know why this worked and not the for loop.
It was your code only; I just made a new project and it worked successfully.

Can you tell me what commands you used to make the new project?

scrapy startproject quotes
scrapy genspider spider quotes.toscrape.com/

Now my question is: what if I have to scrape multiple pages (same site), then what should I do?