Scrapy not working

Here is the log:

(base) C:\Users\Suprateek Chatterjee>scrapy startproject project3
New Scrapy project 'project3', using template directory 'd:\anaconda\lib\site-packages\scrapy\templates\project', created in:
C:\Users\Suprateek Chatterjee\project3

You can start your first spider with:
cd project3
scrapy genspider example example.com

(base) C:\Users\Suprateek Chatterjee>cd project3

(base) C:\Users\Suprateek Chatterjee\project3>scrapy crawl quotes_spider
2019-06-04 00:13:00 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: project3)
2019-06-04 00:13:00 [scrapy.utils.log] INFO: Versions: lxml 4.2.5.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 18.7.0, Python 3.7.0 (default, Jun 28 2018, 08:04:48) [MSC v.1912 64 bit (AMD64)], pyOpenSSL 18.0.0 (OpenSSL 1.0.2p 14 Aug 2018), cryptography 2.3.1, Platform Windows-10-10.0.17134-SP0
Traceback (most recent call last):
File "d:\anaconda\lib\site-packages\scrapy\spiderloader.py", line 69, in load
return self._spiders[spider_name]
KeyError: 'quotes_spider'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "d:\anaconda\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "d:\anaconda\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "D:\Anaconda\Scripts\scrapy.exe\__main__.py", line 9, in <module>
File "d:\anaconda\lib\site-packages\scrapy\cmdline.py", line 150, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "d:\anaconda\lib\site-packages\scrapy\cmdline.py", line 90, in _run_print_help
func(*a, **kw)
File "d:\anaconda\lib\site-packages\scrapy\cmdline.py", line 157, in _run_command
cmd.run(args, opts)
File "d:\anaconda\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
self.crawler_process.crawl(spname, **opts.spargs)
File "d:\anaconda\lib\site-packages\scrapy\crawler.py", line 171, in crawl
crawler = self.create_crawler(crawler_or_spidercls)
File "d:\anaconda\lib\site-packages\scrapy\crawler.py", line 200, in create_crawler
return self._create_crawler(crawler_or_spidercls)
File "d:\anaconda\lib\site-packages\scrapy\crawler.py", line 204, in _create_crawler
spidercls = self.spider_loader.load(spidercls)
File "d:\anaconda\lib\site-packages\scrapy\spiderloader.py", line 71, in load
raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: quotes_spider'

First, Scrapy looks for the scrapy.cfg file, which defines your project structure and project name. Running `scrapy crawl` outside the project folder, where there is no scrapy.cfg, will not work. Also note that the `KeyError: 'Spider not found: quotes_spider'` in your log means Scrapy could not find any spider whose `name` attribute is `quotes_spider` in your project's spiders module.

Have you set up the SPIDER_MODULES setting?
Check out: http://doc.scrapy.org/en/latest/topics/settings.html#spider-modules
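For reference, a `settings.py` generated by `scrapy startproject project3` should already contain something like the following (values assumed from the default template), so Scrapy knows where to look for spiders:

```python
# project3/settings.py -- relevant lines from the default startproject template
BOT_NAME = 'project3'

SPIDER_MODULES = ['project3.spiders']      # where Scrapy searches for spiders
NEWSPIDER_MODULE = 'project3.spiders'      # where `scrapy genspider` puts new ones
```

If these lines are missing or point at the wrong module, the spider loader will raise the same "Spider not found" error.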