Not working: `ImportError: cannot import name 'to_native_str'`

Scrapy 2.7 still shipped `to_native_str` in `scrapy/utils/python.py` (https://github.com/scrapy/scrapy/blob/2.7/scrapy/utils/python.py), but it was already marked `deprecated('to_unicode')`, and newer Scrapy releases have removed it entirely. `scrapy_cookies` still imports it, so the crawler fails to start. I want to fix this error.

Full output:
(project_name) ➜ tutorial scrapy crawl quotes -O quotes-humor.json -a tag=humor
2024-02-25 23:08:19 [scrapy.utils.log] INFO: Scrapy 2.11.1 started (bot: tutorial)
2024-02-25 23:08:19 [scrapy.utils.log] INFO: Versions: lxml 5.1.0.0, libxml2 2.12.3, cssselect 1.2.0, parsel 1.8.1, w3lib 2.1.2, Twisted 23.10.0, Python 3.10.11 (main, Jun 4 2023, 13:06:58) [Clang 14.0.3 (clang-1403.0.22.14.1)], pyOpenSSL 24.0.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.5, Platform macOS-13.3.1-arm64-arm-64bit
2024-02-25 23:08:19 [scrapy.addons] INFO: Enabled addons:
[]
2024-02-25 23:08:19 [asyncio] DEBUG: Using selector: KqueueSelector
2024-02-25 23:08:19 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
2024-02-25 23:08:19 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
2024-02-25 23:08:19 [scrapy.extensions.telnet] INFO: Telnet Password: 6682ff1da07bfe18
2024-02-25 23:08:19 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.logstats.LogStats']
2024-02-25 23:08:19 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'tutorial',
'FEED_EXPORT_ENCODING': 'utf-8',
'NEWSPIDER_MODULE': 'tutorial.spiders',
'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
'SPIDER_MODULES': ['tutorial.spiders'],
'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
Unhandled error in Deferred:
2024-02-25 23:08:19 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 265, in crawl
return self._crawl(crawler, *args, **kwargs)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 269, in _crawl
d = crawler.crawl(*args, **kwargs)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/twisted/internet/defer.py", line 2256, in unwindGenerator
return _cancellableInlineCallbacks(gen)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/twisted/internet/defer.py", line 2168, in _cancellableInlineCallbacks
_inlineCallbacks(None, gen, status, _copy_context())
--- <exception caught here> ---
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/twisted/internet/defer.py", line 2000, in _inlineCallbacks
result = context.run(gen.send, result)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 158, in crawl
self.engine = self._create_engine()
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 172, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/core/engine.py", line 99, in __init__
self.downloader: Downloader = downloader_cls(crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__
DownloaderMiddlewareManager.from_crawler(crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/middleware.py", line 90, in from_crawler
return cls.from_settings(crawler.settings, crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/middleware.py", line 66, in from_settings
mwcls = load_object(clspath)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/utils/misc.py", line 79, in load_object
mod = import_module(module)
File "/Users/user/.pyenv/versions/3.10.11/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy_cookies/downloadermiddlewares/cookies.py", line 9, in <module>
from scrapy.utils.python import to_native_str
builtins.ImportError: cannot import name 'to_native_str' from 'scrapy.utils.python' (/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/utils/python.py)
2024-02-25 23:08:19 [twisted] CRITICAL:
Traceback (most recent call last):
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/twisted/internet/defer.py", line 2000, in _inlineCallbacks
result = context.run(gen.send, result)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 158, in crawl
self.engine = self._create_engine()
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/crawler.py", line 172, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/core/engine.py", line 99, in __init__
self.downloader: Downloader = downloader_cls(crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__
DownloaderMiddlewareManager.from_crawler(crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/middleware.py", line 90, in from_crawler
return cls.from_settings(crawler.settings, crawler)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/middleware.py", line 66, in from_settings
mwcls = load_object(clspath)
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/utils/misc.py", line 79, in load_object
mod = import_module(module)
File "/Users/user/.pyenv/versions/3.10.11/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy_cookies/downloadermiddlewares/cookies.py", line 9, in <module>
from scrapy.utils.python import to_native_str
ImportError: cannot import name 'to_native_str' from 'scrapy.utils.python' (/Users/user/.pyenv/versions/3.10.11/envs/project_name/lib/python3.10/site-packages/scrapy/utils/python.py)
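Since `to_native_str` was, on Python 3, only a deprecated alias for `to_unicode`, one possible workaround is to restore that alias where `scrapy_cookies` expects it. The sketch below is hedged: the `to_unicode` body is a simplified stand-in written for illustration, not Scrapy's exact upstream code.

```python
# The ImportError comes from scrapy_cookies/downloadermiddlewares/cookies.py
# importing a helper that newer Scrapy removed. In Scrapy 2.x, to_native_str
# was just a deprecated alias for to_unicode, so the alias can be restored.

def to_unicode(text, encoding="utf-8", errors="strict"):
    """Return the str representation of `text`, decoding bytes if needed."""
    if isinstance(text, str):
        return text
    if not isinstance(text, (bytes, bytearray)):
        raise TypeError(
            f"to_unicode must receive bytes or str, got {type(text).__name__}"
        )
    return text.decode(encoding, errors)

# On Python 3 the removed helper behaved like to_unicode:
to_native_str = to_unicode
```

In `cookies.py` itself, a version-tolerant import may be the cleaner fix: `try: from scrapy.utils.python import to_native_str` followed by `except ImportError: from scrapy.utils.python import to_unicode as to_native_str`, so the middleware works on both old and new Scrapy.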