Not able to access file and output CSV

I have uploaded my code to Google Drive. Please tell me where I should open the Scrapy file from.
If I give the attribute name in my program it gives an error, but without it no file opens containing the code or the CSV file with the output. I have uploaded my file to Google Drive as bookstore.ipynb. Please check.

Hey @surbhi11, you need to share the link of your .ipynb notebook. Right-click on the file in Google Drive, click on 'Get shareable link', and then share the link here.

The link is:
https://drive.google.com/file/d/1pUH0nMHhzRj2K2odCzhqTRduVZAmjGfu/view?usp=sharing

Hey @surbhi11, I cannot access the file, so change its sharing settings to 'Anyone with the link can view'.

I have changed the sharing settings

Hey @surbhi11, there is some issue with the file: it is not opening in Jupyter Notebook. Could you share snapshots of where you wrote the code, or simply share a screenshot of the file showing the code lines?

Sir, I have shared the correct file again. The link is: https://drive.google.com/file/d/10s4oao4osMQv_YG4cRe5SqeZPrDln0GZ/view?usp=sharing

Please check.

Hey @surbhi11, first of all, make a .py file instead of an .ipynb, as the mentor told you in the previous lectures. After that, open a command prompt and go to the folder where your Scrapy project is; you will need to use the 'cd' command repeatedly to move into the right folders. You can list the contents of the current directory with the 'dir' command in the Windows command prompt.
Then use 'scrapy crawl quotes' etc., as shown in the previous videos, for example:
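A hedged sketch of the whole command-prompt session; the folder path and the spider name here are only examples, so substitute your own project path and the `name` defined in your spider:

```
REM Move into the Scrapy project folder (the one that contains scrapy.cfg).
REM This path is hypothetical; use your own.
cd C:\Users\DELL\Desktop\cb\my_first_project

REM List the contents of the current directory to check where you are.
dir

REM Run the spider by its `name` attribute, not by the file name.
scrapy crawl quotes

REM Scrapy can also write the scraped items straight to a CSV file.
scrapy crawl quotes -o output.csv
```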

Happy Learning :blush:

Sir, where do I type the scrapy crawl command? I am unable to type anything in the command prompt after opening Jupyter Notebook, and what do I do after that?

Hey @surbhi11, you can open a new command prompt window from your Start menu and type the command there. Also make sure you write your code in Sublime or another plain text editor and save it with the correct file name and a .py extension. Watch the previous video carefully and you will get it.
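For reference, a minimal sketch of what such a .py spider file could look like; the spider name, the URL, and the CSS selectors are illustrative assumptions, not your actual assignment code:

```python
import scrapy

class BookstoreSpider(scrapy.Spider):
    # The value of `name` is what you pass to `scrapy crawl`.
    name = "bookstore"
    # books.toscrape.com is a common practice site; replace with your target.
    start_urls = ["http://books.toscrape.com/"]

    def parse(self, response):
        # Yield one item (a plain dict) per book listed on the page.
        for book in response.css("article.product_pod"):
            yield {
                "title": book.css("h3 a::attr(title)").get(),
                "price": book.css("p.price_color::text").get(),
            }
```

Save a file like this inside your project's spiders folder and run it with `scrapy crawl bookstore`.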

Happy Learning :blush:

Sir, I have saved my code in Jupyter Notebook with a .py extension. After going into the directories, when I type the crawl command it gives the following:

```
C:\Users\DELL\Desktop\cb\my_first_project\my_first_project\spiders>scrapy crawl bookstore.py
Traceback (most recent call last):
  File "C:\Users\DELL\anaconda3\Scripts\scrapy-script.py", line 10, in <module>
    sys.exit(execute())
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\cmdline.py", line 144, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\crawler.py", line 280, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\crawler.py", line 152, in __init__
    self.spider_loader = self._get_spider_loader(settings)
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\crawler.py", line 146, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\spiderloader.py", line 68, in from_settings
    return cls(settings)
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\spiderloader.py", line 24, in __init__
    self._load_all_spiders()
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\spiderloader.py", line 51, in _load_all_spiders
    for module in walk_modules(name):
  File "C:\Users\DELL\anaconda3\lib\site-packages\scrapy\utils\misc.py", line 78, in walk_modules
    submod = import_module(fullpath)
  File "C:\Users\DELL\anaconda3\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "C:\Users\DELL\Desktop\cb\my_first_project\my_first_project\spiders\bookstore.py", line 7, in <module>
    "scrolled": true
NameError: name 'true' is not defined

C:\Users\DELL\Desktop\cb\my_first_project\my_first_project\spiders>
```

Hey @surbhi11, PM me here. I will follow up there to clear the doubt as soon as possible.
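In the meantime, two hints from the traceback itself. First, `scrapy crawl` expects the spider's `name` attribute, not the file name, so it should be `scrapy crawl bookstore`, not `scrapy crawl bookstore.py`. Second, the failing line `"scrolled": true` looks like Jupyter notebook metadata, which suggests the .ipynb file was renamed to .py rather than exported: the file still contains JSON, where booleans are written `true`, while Python spells them `True`, hence the NameError. A sketch of exporting the notebook properly, assuming it is named bookstore.ipynb:

```
REM Convert the notebook into a plain Python script (writes bookstore.py),
REM then move that file into the project's spiders folder and run it by name.
jupyter nbconvert --to script bookstore.ipynb
scrapy crawl bookstore -o books.csv
```

Even simpler, copy only the actual Python code out of the notebook into a new bookstore.py in a text editor, as suggested above.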

I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important. It helps us improve our platform and hence provide you
the learning experience you deserve.

On the off chance you still have some questions or do not find the answers satisfactory, you may reopen
the doubt.