Scrapy throwing "Spider not found" error

2021-07-22 20:16:57 [scrapy.utils.log] INFO: Scrapy 2.5.0 started (bot: myproject)
2021-07-22 20:16:57 [scrapy.utils.log] INFO: Versions: lxml 4.6.3.0, libxml2 2.9.5, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 21.2.0, Python 3.9.6 (tags/v3.9.6:db3ff76,
Jun 28 2021, 15:26:21) [MSC v.1929 64 bit (AMD64)], pyOpenSSL 20.0.1 (OpenSSL 1.1.1k 25 Mar 2021), cryptography 3.4.7, Platform Windows-10-10.0.19043-SP0
2021-07-22 20:16:57 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
Traceback (most recent call last):
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\spiderloader.py", line 75, in load
    return self._spiders[spider_name]
KeyError: 'quotes_spider.py'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\users\asus\appdata\local\programs\python\python39\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "c:\users\asus\appdata\local\programs\python\python39\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\ASUS\AppData\Local\Programs\Python\Python39\Scripts\scrapy.exe\__main__.py", line 7, in <module>
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\cmdline.py", line 145, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\cmdline.py", line 100, in _run_print_help
    func(*a, **kw)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\cmdline.py", line 153, in _run_command
    cmd.run(args, opts)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\commands\crawl.py", line 22, in run
    crawl_defer = self.crawler_process.crawl(spname, **opts.spargs)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\crawler.py", line 191, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\crawler.py", line 224, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\crawler.py", line 228, in _create_crawler
    spidercls = self.spider_loader.load(spidercls)
  File "c:\users\asus\appdata\local\programs\python\python39\lib\site-packages\scrapy\spiderloader.py", line 77, in load
    raise KeyError(f"Spider not found: {spider_name}")
KeyError: 'Spider not found: quotes_spider.py'

Never mind, I typed "quotes_spider.py" instead of "quotes_spider". The crawl command takes the spider's name attribute, not the filename.
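
For reference, a minimal sketch of how the name lookup works (the class, URLs, and parse logic below are assumptions based on the filename, not the actual spider from the question). Whatever string is assigned to the spider's name attribute is what scrapy crawl looks up, independent of the .py file name:

    # quotes_spider.py (hypothetical contents, assumed for illustration)
    import scrapy

    class QuotesSpider(scrapy.Spider):
        # `scrapy crawl` matches this string, not the filename.
        name = "quotes_spider"
        start_urls = ["https://quotes.toscrape.com/"]

        def parse(self, response):
            # Yield one item per quote block on the page.
            for quote in response.css("div.quote"):
                yield {
                    "text": quote.css("span.text::text").get(),
                    "author": quote.css("small.author::text").get(),
                }

With that in place, run "scrapy crawl quotes_spider" (no ".py") from inside the project directory; "scrapy list" prints the spider names the loader actually knows about, which is a quick way to catch this kind of typo.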