==> Synchronizing chroot copy [/home/leming/armv7/root] -> [leming]...done
==> Making package: scrapy 2.12.0-1 (Thu Dec 26 13:35:53 2024)
==> Retrieving sources...
  -> Found scrapy-2.12.0.tar.gz
==> WARNING: Skipping verification of source file PGP signatures.
==> Validating source files with sha512sums...
    scrapy-2.12.0.tar.gz ... Passed
==> Validating source files with b2sums...
    scrapy-2.12.0.tar.gz ... Passed
==> Making package: scrapy 2.12.0-1 (Thu Dec 26 13:36:12 2024)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

warning: dependency cycle detected:
warning: python-incremental will be installed before its python-twisted dependency

Packages (46) libxslt-1.1.42-2 python-attrs-23.2.0-4 python-autocommand-2.2.2-7 python-automat-22.10.0-7 python-cffi-1.17.1-2 python-charset-normalizer-3.4.0-5 python-click-8.1.7-4 python-constantly-23.10.4-2 python-filelock-3.16.1-2 python-hyperlink-21.0.0-7 python-idna-3.10-2 python-incremental-22.10.0-7 python-jaraco.collections-5.0.1-2 python-jaraco.context-5.3.0-3 python-jaraco.functools-4.1.0-1 python-jaraco.text-4.0.0-2 python-jmespath-1.0.1-5 python-more-itertools-10.5.0-1 python-platformdirs-4.3.6-2 python-pyasn1-0.6.0-2 python-pyasn1-modules-0.4.0-2 python-pycparser-2.22-3 python-requests-2.32.3-4 python-requests-file-2.1.0-1 python-six-1.16.0-10 python-typing_extensions-4.12.2-3 python-urllib3-1.26.20-4 python-wheel-0.45.0-3 python-cryptography-44.0.0-1 python-cssselect-1.2.0-8 python-defusedxml-0.7.1-7 python-itemadapter-0.8.0-5 python-itemloaders-1.3.2-2 python-lxml-5.3.0-2 python-packaging-24.2-3 python-parsel-1.8.1-4 python-protego-0.3.1-3 python-pydispatcher-2.0.7-3 python-pyopenssl-24.3.0-1 python-queuelib-1.7.0-2 python-service-identity-24.2.0-2 python-setuptools-1:75.2.0-4 python-tldextract-5.1.3-2 python-twisted-24.3.0-4 python-w3lib-2.1.2-4 python-zope-interface-7.2-1

Total Download Size:  0.25 MiB
Total Installed Size: 85.59 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
python-cssselect-1.2.0-8-any downloading...
python-w3lib-2.1.2-4-any downloading...
python-parsel-1.8.1-4-any downloading...
python-queuelib-1.7.0-2-any downloading...
python-itemloaders-1.3.2-2-any downloading...
python-pydispatcher-2.0.7-3-any downloading...
python-service-identity-24.2.0-2-any downloading...
python-itemadapter-0.8.0-5-any downloading...
python-protego-0.3.1-3-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
checking available disk space...
:: Processing package changes...
installing python-pycparser...
installing python-cffi...
Optional dependencies for python-cffi
    python-setuptools: "limited api" version checking in cffi.setuptools_ext [pending]
installing python-cryptography...
installing python-cssselect...
installing python-defusedxml...
installing python-itemadapter...
installing python-jmespath...
installing libxslt...
Optional dependencies for libxslt
    python: Python bindings [installed]
installing python-lxml...
Optional dependencies for python-lxml
    python-beautifulsoup4: support for beautifulsoup parser to parse not well formed HTML
    python-cssselect: support for cssselect [installed]
    python-html5lib: support for html5lib parser
    python-lxml-docs: offline docs
    python-lxml-html-clean: enable htmlclean feature
installing python-packaging...
installing python-w3lib...
installing python-parsel...
installing python-itemloaders...
installing python-protego...
installing python-pydispatcher...
installing python-pyopenssl...
installing python-queuelib...
installing python-attrs...
installing python-pyasn1...
installing python-pyasn1-modules...
installing python-service-identity...
Optional dependencies for python-service-identity
    python-idna: for Internationalized Domain Names support [pending]
installing python-more-itertools...
installing python-jaraco.functools...
installing python-jaraco.context...
installing python-autocommand...
installing python-jaraco.text...
Optional dependencies for python-jaraco.text
    python-inflect: for show-newlines script
installing python-jaraco.collections...
installing python-platformdirs...
installing python-wheel...
Optional dependencies for python-wheel
    python-keyring: for wheel.signatures
    python-xdg: for wheel.signatures
    python-setuptools: for legacy bdist_wheel subcommand [pending]
installing python-setuptools...
installing python-idna...
installing python-charset-normalizer...
installing python-urllib3...
Optional dependencies for python-urllib3
    python-brotli: Brotli support
    python-certifi: security support
    python-cryptography: security support [installed]
    python-idna: security support [installed]
    python-pyopenssl: security support [installed]
    python-pysocks: SOCKS support
installing python-requests...
Optional dependencies for python-requests
    python-chardet: alternative character encoding library
    python-pysocks: SOCKS proxy support
installing python-six...
installing python-requests-file...
installing python-filelock...
installing python-tldextract...
installing python-automat...
Optional dependencies for python-automat
    python-graphviz: for automat-visualize
    python-twisted: for automat-visualize [pending]
installing python-constantly...
installing python-hyperlink...
installing python-click...
installing python-incremental...
installing python-typing_extensions...
installing python-zope-interface...
installing python-twisted...
Optional dependencies for python-twisted
    gobject-introspection-runtime: for GObject Introspection support
    python-appdirs: for using conch
    python-bcrypt: for using conch
    python-cryptography: for using conch [installed]
    python-h2: for http2 support
    python-idna: for TLS client hostname verification [installed]
    python-priority: for http2 support
    python-pyasn1: for using conch [installed]
    python-gobject: for GObject Introspection support
    python-pyopenssl: for TLS client hostname verification [installed]
    python-pyserial: for serial support
    python-service-identity: for TLS client hostname verification [installed]
    tk: for using tkconch
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Packages (3) python-pyproject-hooks-1.2.0-3 python-build-1.2.2-3 python-installer-0.7.0-10

Total Installed Size: 0.47 MiB

:: Proceed with installation? [Y/n]
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
checking available disk space...
:: Processing package changes...
installing python-pyproject-hooks...
installing python-build...
Optional dependencies for python-build
    python-pip: to use as the Python package installer (default)
    python-uv: to use as the Python package installer
    python-virtualenv: to use virtualenv for build isolation
installing python-installer...
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Retrieving sources...
-> Found scrapy-2.12.0.tar.gz ==> WARNING: Skipping all source file integrity checks. ==> Extracting sources... -> Extracting scrapy-2.12.0.tar.gz with bsdtar ==> Starting build()... * Getting build dependencies for wheel... running egg_info creating Scrapy.egg-info writing Scrapy.egg-info/PKG-INFO writing dependency_links to Scrapy.egg-info/dependency_links.txt writing entry points to Scrapy.egg-info/entry_points.txt writing requirements to Scrapy.egg-info/requires.txt writing top-level names to Scrapy.egg-info/top_level.txt writing manifest file 'Scrapy.egg-info/SOURCES.txt' reading manifest file 'Scrapy.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' no previously-included directories found matching 'docs/build' warning: no previously-included files matching '__pycache__' found anywhere in distribution warning: no previously-included files matching '*.py[cod]' found anywhere in distribution adding license file 'LICENSE' adding license file 'AUTHORS' writing manifest file 'Scrapy.egg-info/SOURCES.txt' * Building wheel... running bdist_wheel /usr/lib/python3.13/site-packages/setuptools/_distutils/cmd.py:111: SetuptoolsDeprecationWarning: bdist_wheel.universal is deprecated !! ******************************************************************************** With Python 2.7 end-of-life, support for building universal wheels (i.e., wheels that support both Python 2 and Python 3) is being obviated. Please discontinue using this option, or if you still need it, file an issue with pypa/setuptools describing your use case. By 2025-Aug-30, you need to update your project and remove deprecated calls or your builds will no longer be supported. ******************************************************************************** !! self.finalize_options() running build running build_py creating build/lib/scrapy copying scrapy/robotstxt.py -> build/lib/scrapy copying scrapy/exceptions.py -> build/lib/scrapy copying scrapy/middleware.py -> build/lib/scrapy copying scrapy/spiderloader.py -> build/lib/scrapy copying scrapy/shell.py -> build/lib/scrapy copying scrapy/resolver.py -> build/lib/scrapy copying scrapy/responsetypes.py -> build/lib/scrapy copying scrapy/__init__.py -> build/lib/scrapy copying scrapy/cmdline.py -> build/lib/scrapy copying scrapy/pqueues.py -> build/lib/scrapy copying scrapy/addons.py -> build/lib/scrapy copying scrapy/exporters.py -> build/lib/scrapy copying scrapy/dupefilters.py -> build/lib/scrapy copying scrapy/link.py -> build/lib/scrapy copying scrapy/signals.py -> build/lib/scrapy copying scrapy/squeues.py -> build/lib/scrapy copying scrapy/extension.py -> build/lib/scrapy copying scrapy/mail.py -> build/lib/scrapy copying scrapy/crawler.py -> build/lib/scrapy copying scrapy/statscollectors.py -> build/lib/scrapy copying scrapy/__main__.py -> build/lib/scrapy copying scrapy/signalmanager.py -> build/lib/scrapy copying scrapy/logformatter.py -> build/lib/scrapy copying scrapy/interfaces.py -> build/lib/scrapy copying scrapy/item.py -> build/lib/scrapy creating build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/depth.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/httperror.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/referer.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/__init__.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/offsite.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/urllength.py -> 
build/lib/scrapy/spidermiddlewares creating build/lib/scrapy/commands copying scrapy/commands/edit.py -> build/lib/scrapy/commands copying scrapy/commands/version.py -> build/lib/scrapy/commands copying scrapy/commands/genspider.py -> build/lib/scrapy/commands copying scrapy/commands/fetch.py -> build/lib/scrapy/commands copying scrapy/commands/view.py -> build/lib/scrapy/commands copying scrapy/commands/list.py -> build/lib/scrapy/commands copying scrapy/commands/shell.py -> build/lib/scrapy/commands copying scrapy/commands/check.py -> build/lib/scrapy/commands copying scrapy/commands/__init__.py -> build/lib/scrapy/commands copying scrapy/commands/startproject.py -> build/lib/scrapy/commands copying scrapy/commands/bench.py -> build/lib/scrapy/commands copying scrapy/commands/crawl.py -> build/lib/scrapy/commands copying scrapy/commands/settings.py -> build/lib/scrapy/commands copying scrapy/commands/parse.py -> build/lib/scrapy/commands copying scrapy/commands/runspider.py -> build/lib/scrapy/commands creating build/lib/scrapy/extensions copying scrapy/extensions/feedexport.py -> build/lib/scrapy/extensions copying scrapy/extensions/corestats.py -> build/lib/scrapy/extensions copying scrapy/extensions/debug.py -> build/lib/scrapy/extensions copying scrapy/extensions/periodic_log.py -> build/lib/scrapy/extensions copying scrapy/extensions/memusage.py -> build/lib/scrapy/extensions copying scrapy/extensions/logstats.py -> build/lib/scrapy/extensions copying scrapy/extensions/closespider.py -> build/lib/scrapy/extensions copying scrapy/extensions/httpcache.py -> build/lib/scrapy/extensions copying scrapy/extensions/__init__.py -> build/lib/scrapy/extensions copying scrapy/extensions/spiderstate.py -> build/lib/scrapy/extensions copying scrapy/extensions/memdebug.py -> build/lib/scrapy/extensions copying scrapy/extensions/postprocessing.py -> build/lib/scrapy/extensions copying scrapy/extensions/statsmailer.py -> build/lib/scrapy/extensions copying scrapy/extensions/throttle.py -> build/lib/scrapy/extensions copying scrapy/extensions/telnet.py -> build/lib/scrapy/extensions creating build/lib/scrapy/contracts copying scrapy/contracts/__init__.py -> build/lib/scrapy/contracts copying scrapy/contracts/default.py -> build/lib/scrapy/contracts creating build/lib/scrapy/linkextractors copying scrapy/linkextractors/lxmlhtml.py -> build/lib/scrapy/linkextractors copying scrapy/linkextractors/__init__.py -> build/lib/scrapy/linkextractors creating build/lib/scrapy/http copying scrapy/http/cookies.py -> build/lib/scrapy/http copying scrapy/http/__init__.py -> build/lib/scrapy/http copying scrapy/http/headers.py -> build/lib/scrapy/http creating build/lib/scrapy/core copying scrapy/core/scraper.py -> build/lib/scrapy/core copying scrapy/core/__init__.py -> build/lib/scrapy/core copying scrapy/core/scheduler.py -> build/lib/scrapy/core copying scrapy/core/spidermw.py -> build/lib/scrapy/core copying scrapy/core/engine.py -> build/lib/scrapy/core creating build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/robotstxt.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpproxy.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/redirect.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/stats.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/defaultheaders.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/cookies.py -> 
build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpauth.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpcache.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/ajaxcrawl.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/__init__.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/offsite.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/useragent.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpcompression.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/retry.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/downloadtimeout.py -> build/lib/scrapy/downloadermiddlewares creating build/lib/scrapy/utils copying scrapy/utils/ssl.py -> build/lib/scrapy/utils copying scrapy/utils/ossignal.py -> build/lib/scrapy/utils copying scrapy/utils/sitemap.py -> build/lib/scrapy/utils copying scrapy/utils/project.py -> build/lib/scrapy/utils copying scrapy/utils/test.py -> build/lib/scrapy/utils copying scrapy/utils/spider.py -> build/lib/scrapy/utils copying scrapy/utils/_compression.py -> build/lib/scrapy/utils copying scrapy/utils/testproc.py -> build/lib/scrapy/utils copying scrapy/utils/trackref.py -> build/lib/scrapy/utils copying scrapy/utils/versions.py -> build/lib/scrapy/utils copying scrapy/utils/gz.py -> build/lib/scrapy/utils copying scrapy/utils/curl.py -> build/lib/scrapy/utils copying scrapy/utils/benchserver.py -> build/lib/scrapy/utils copying scrapy/utils/conf.py -> build/lib/scrapy/utils copying scrapy/utils/httpobj.py -> build/lib/scrapy/utils copying scrapy/utils/serialize.py -> build/lib/scrapy/utils copying scrapy/utils/datatypes.py -> build/lib/scrapy/utils copying scrapy/utils/testsite.py -> build/lib/scrapy/utils copying scrapy/utils/python.py -> build/lib/scrapy/utils copying scrapy/utils/reactor.py -> build/lib/scrapy/utils copying scrapy/utils/console.py -> build/lib/scrapy/utils copying scrapy/utils/asyncgen.py -> build/lib/scrapy/utils copying scrapy/utils/__init__.py -> build/lib/scrapy/utils copying scrapy/utils/ftp.py -> build/lib/scrapy/utils copying scrapy/utils/request.py -> build/lib/scrapy/utils copying scrapy/utils/display.py -> build/lib/scrapy/utils copying scrapy/utils/boto.py -> build/lib/scrapy/utils copying scrapy/utils/deprecate.py -> build/lib/scrapy/utils copying scrapy/utils/decorators.py -> build/lib/scrapy/utils copying scrapy/utils/template.py -> build/lib/scrapy/utils copying scrapy/utils/iterators.py -> build/lib/scrapy/utils copying scrapy/utils/signal.py -> build/lib/scrapy/utils copying scrapy/utils/misc.py -> build/lib/scrapy/utils copying scrapy/utils/response.py -> build/lib/scrapy/utils copying scrapy/utils/url.py -> build/lib/scrapy/utils copying scrapy/utils/defer.py -> build/lib/scrapy/utils copying scrapy/utils/engine.py -> build/lib/scrapy/utils copying scrapy/utils/job.py -> build/lib/scrapy/utils copying scrapy/utils/log.py -> build/lib/scrapy/utils creating build/lib/scrapy/selector copying scrapy/selector/unified.py -> build/lib/scrapy/selector copying scrapy/selector/__init__.py -> build/lib/scrapy/selector creating build/lib/scrapy/loader copying scrapy/loader/__init__.py -> build/lib/scrapy/loader creating build/lib/scrapy/settings copying scrapy/settings/__init__.py -> build/lib/scrapy/settings copying scrapy/settings/default_settings.py -> 
build/lib/scrapy/settings creating build/lib/scrapy/spiders copying scrapy/spiders/sitemap.py -> build/lib/scrapy/spiders copying scrapy/spiders/feed.py -> build/lib/scrapy/spiders copying scrapy/spiders/__init__.py -> build/lib/scrapy/spiders copying scrapy/spiders/crawl.py -> build/lib/scrapy/spiders copying scrapy/spiders/init.py -> build/lib/scrapy/spiders creating build/lib/scrapy/pipelines copying scrapy/pipelines/media.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/files.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/__init__.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/images.py -> build/lib/scrapy/pipelines creating build/lib/scrapy/http/response copying scrapy/http/response/text.py -> build/lib/scrapy/http/response copying scrapy/http/response/xml.py -> build/lib/scrapy/http/response copying scrapy/http/response/__init__.py -> build/lib/scrapy/http/response copying scrapy/http/response/json.py -> build/lib/scrapy/http/response copying scrapy/http/response/html.py -> build/lib/scrapy/http/response creating build/lib/scrapy/http/request copying scrapy/http/request/rpc.py -> build/lib/scrapy/http/request copying scrapy/http/request/form.py -> build/lib/scrapy/http/request copying scrapy/http/request/__init__.py -> build/lib/scrapy/http/request copying scrapy/http/request/json_request.py -> build/lib/scrapy/http/request creating build/lib/scrapy/core/downloader copying scrapy/core/downloader/tls.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/middleware.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/webclient.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/__init__.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/contextfactory.py -> build/lib/scrapy/core/downloader creating build/lib/scrapy/core/http2 copying scrapy/core/http2/stream.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/protocol.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/__init__.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/agent.py -> build/lib/scrapy/core/http2 creating build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/datauri.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/__init__.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/ftp.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http10.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/file.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/s3.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http11.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http2.py -> build/lib/scrapy/core/downloader/handlers running egg_info writing Scrapy.egg-info/PKG-INFO writing dependency_links to Scrapy.egg-info/dependency_links.txt writing entry points to Scrapy.egg-info/entry_points.txt writing requirements to Scrapy.egg-info/requires.txt writing top-level names to Scrapy.egg-info/top_level.txt reading manifest file 'Scrapy.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' no previously-included directories found matching 'docs/build' warning: no previously-included files matching '__pycache__' found anywhere in 
distribution warning: no previously-included files matching '*.py[cod]' found anywhere in distribution adding license file 'LICENSE' adding license file 'AUTHORS' writing manifest file 'Scrapy.egg-info/SOURCES.txt' /usr/lib/python3.13/site-packages/setuptools/command/build_py.py:218: _Warning: Package 'scrapy.templates.project' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project' to be distributed and are already explicitly excluding 'scrapy.templates.project' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.13/site-packages/setuptools/command/build_py.py:218: _Warning: Package 'scrapy.templates.project.module' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project.module' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project.module' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project.module' to be distributed and are already explicitly excluding 'scrapy.templates.project.module' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. 
You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.13/site-packages/setuptools/command/build_py.py:218: _Warning: Package 'scrapy.templates.project.module.spiders' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project.module.spiders' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project.module.spiders' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project.module.spiders' to be distributed and are already explicitly excluding 'scrapy.templates.project.module.spiders' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.13/site-packages/setuptools/command/build_py.py:218: _Warning: Package 'scrapy.templates.spiders' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.spiders' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.spiders' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). 
You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.spiders' to be distributed and are already explicitly excluding 'scrapy.templates.spiders' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) copying scrapy/VERSION -> build/lib/scrapy copying scrapy/mime.types -> build/lib/scrapy copying scrapy/py.typed -> build/lib/scrapy creating build/lib/scrapy/templates/project copying scrapy/templates/project/scrapy.cfg -> build/lib/scrapy/templates/project creating build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/__init__.py -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/items.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/middlewares.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/pipelines.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/settings.py.tmpl -> build/lib/scrapy/templates/project/module creating build/lib/scrapy/templates/project/module/spiders copying scrapy/templates/project/module/spiders/__init__.py -> build/lib/scrapy/templates/project/module/spiders creating build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/basic.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/crawl.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/csvfeed.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/xmlfeed.tmpl -> build/lib/scrapy/templates/spiders installing to build/bdist.linux-armv7l/wheel running install running install_lib creating build/bdist.linux-armv7l/wheel creating build/bdist.linux-armv7l/wheel/scrapy creating build/bdist.linux-armv7l/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/depth.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/httperror.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/referer.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/offsite.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/urllength.py -> build/bdist.linux-armv7l/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/robotstxt.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/commands copying build/lib/scrapy/commands/edit.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/version.py 
-> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/genspider.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/fetch.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/view.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/list.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/shell.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/check.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/startproject.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/bench.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/crawl.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/settings.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/parse.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/commands/runspider.py -> build/bdist.linux-armv7l/wheel/./scrapy/commands copying build/lib/scrapy/exceptions.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/extensions copying build/lib/scrapy/extensions/feedexport.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/corestats.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/debug.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/periodic_log.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/memusage.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/logstats.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/closespider.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/httpcache.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/spiderstate.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/memdebug.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/postprocessing.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/statsmailer.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/throttle.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/extensions/telnet.py -> build/bdist.linux-armv7l/wheel/./scrapy/extensions copying build/lib/scrapy/middleware.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/contracts copying build/lib/scrapy/contracts/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/contracts copying build/lib/scrapy/contracts/default.py -> build/bdist.linux-armv7l/wheel/./scrapy/contracts creating build/bdist.linux-armv7l/wheel/scrapy/linkextractors copying build/lib/scrapy/linkextractors/lxmlhtml.py -> build/bdist.linux-armv7l/wheel/./scrapy/linkextractors copying 
build/lib/scrapy/linkextractors/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/linkextractors creating build/bdist.linux-armv7l/wheel/scrapy/http copying build/lib/scrapy/http/cookies.py -> build/bdist.linux-armv7l/wheel/./scrapy/http copying build/lib/scrapy/http/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/http copying build/lib/scrapy/http/headers.py -> build/bdist.linux-armv7l/wheel/./scrapy/http creating build/bdist.linux-armv7l/wheel/scrapy/http/response copying build/lib/scrapy/http/response/text.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/response copying build/lib/scrapy/http/response/xml.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/response copying build/lib/scrapy/http/response/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/response copying build/lib/scrapy/http/response/json.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/response copying build/lib/scrapy/http/response/html.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/response creating build/bdist.linux-armv7l/wheel/scrapy/http/request copying build/lib/scrapy/http/request/rpc.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/request copying build/lib/scrapy/http/request/form.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/request copying build/lib/scrapy/http/request/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/request copying build/lib/scrapy/http/request/json_request.py -> build/bdist.linux-armv7l/wheel/./scrapy/http/request creating build/bdist.linux-armv7l/wheel/scrapy/core copying build/lib/scrapy/core/scraper.py -> build/bdist.linux-armv7l/wheel/./scrapy/core creating build/bdist.linux-armv7l/wheel/scrapy/core/downloader copying build/lib/scrapy/core/downloader/tls.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader copying build/lib/scrapy/core/downloader/middleware.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader copying build/lib/scrapy/core/downloader/webclient.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader creating build/bdist.linux-armv7l/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/datauri.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/ftp.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http10.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/file.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/s3.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http11.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http2.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader copying build/lib/scrapy/core/downloader/contextfactory.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/downloader copying build/lib/scrapy/core/__init__.py -> 
build/bdist.linux-armv7l/wheel/./scrapy/core copying build/lib/scrapy/core/scheduler.py -> build/bdist.linux-armv7l/wheel/./scrapy/core copying build/lib/scrapy/core/spidermw.py -> build/bdist.linux-armv7l/wheel/./scrapy/core creating build/bdist.linux-armv7l/wheel/scrapy/core/http2 copying build/lib/scrapy/core/http2/stream.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/http2 copying build/lib/scrapy/core/http2/protocol.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/http2 copying build/lib/scrapy/core/http2/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/http2 copying build/lib/scrapy/core/http2/agent.py -> build/bdist.linux-armv7l/wheel/./scrapy/core/http2 copying build/lib/scrapy/core/engine.py -> build/bdist.linux-armv7l/wheel/./scrapy/core copying build/lib/scrapy/spiderloader.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/shell.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/resolver.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/robotstxt.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpproxy.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/redirect.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/stats.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/defaultheaders.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/cookies.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpauth.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpcache.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/ajaxcrawl.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/offsite.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/useragent.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpcompression.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/retry.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/downloadtimeout.py -> build/bdist.linux-armv7l/wheel/./scrapy/downloadermiddlewares copying build/lib/scrapy/responsetypes.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/cmdline.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/pqueues.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/addons.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/templates creating build/bdist.linux-armv7l/wheel/scrapy/templates/project creating 
build/bdist.linux-armv7l/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/settings.py.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/middlewares.py.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/items.py.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/pipelines.py.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module creating build/bdist.linux-armv7l/wheel/scrapy/templates/project/module/spiders copying build/lib/scrapy/templates/project/module/spiders/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project/module/spiders copying build/lib/scrapy/templates/project/scrapy.cfg -> build/bdist.linux-armv7l/wheel/./scrapy/templates/project creating build/bdist.linux-armv7l/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/basic.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/xmlfeed.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/csvfeed.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/crawl.tmpl -> build/bdist.linux-armv7l/wheel/./scrapy/templates/spiders copying build/lib/scrapy/VERSION -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/utils copying build/lib/scrapy/utils/ssl.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/ossignal.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/sitemap.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/project.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/test.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/spider.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/_compression.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/testproc.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/trackref.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/versions.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/gz.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/curl.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/benchserver.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/conf.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/httpobj.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/serialize.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/datatypes.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/testsite.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/python.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/reactor.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils 
copying build/lib/scrapy/utils/console.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/asyncgen.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/ftp.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/request.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/display.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/boto.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/deprecate.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/decorators.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/template.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/iterators.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/signal.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/misc.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/response.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/url.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/defer.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/engine.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/job.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils copying build/lib/scrapy/utils/log.py -> build/bdist.linux-armv7l/wheel/./scrapy/utils creating build/bdist.linux-armv7l/wheel/scrapy/selector copying build/lib/scrapy/selector/unified.py -> build/bdist.linux-armv7l/wheel/./scrapy/selector copying build/lib/scrapy/selector/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/selector copying build/lib/scrapy/exporters.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/loader copying build/lib/scrapy/loader/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/loader copying build/lib/scrapy/dupefilters.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/link.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/signals.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/squeues.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/extension.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/mail.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/mime.types -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/crawler.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/statscollectors.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/__main__.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/signalmanager.py -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/py.typed -> build/bdist.linux-armv7l/wheel/./scrapy copying build/lib/scrapy/logformatter.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/settings copying build/lib/scrapy/settings/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/settings copying build/lib/scrapy/settings/default_settings.py -> build/bdist.linux-armv7l/wheel/./scrapy/settings copying build/lib/scrapy/interfaces.py -> 
build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/spiders copying build/lib/scrapy/spiders/sitemap.py -> build/bdist.linux-armv7l/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/feed.py -> build/bdist.linux-armv7l/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/crawl.py -> build/bdist.linux-armv7l/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/init.py -> build/bdist.linux-armv7l/wheel/./scrapy/spiders copying build/lib/scrapy/item.py -> build/bdist.linux-armv7l/wheel/./scrapy creating build/bdist.linux-armv7l/wheel/scrapy/pipelines copying build/lib/scrapy/pipelines/media.py -> build/bdist.linux-armv7l/wheel/./scrapy/pipelines copying build/lib/scrapy/pipelines/files.py -> build/bdist.linux-armv7l/wheel/./scrapy/pipelines copying build/lib/scrapy/pipelines/__init__.py -> build/bdist.linux-armv7l/wheel/./scrapy/pipelines copying build/lib/scrapy/pipelines/images.py -> build/bdist.linux-armv7l/wheel/./scrapy/pipelines running install_egg_info Copying Scrapy.egg-info to build/bdist.linux-armv7l/wheel/./Scrapy-2.12.0-py3.13.egg-info running install_scripts creating build/bdist.linux-armv7l/wheel/Scrapy-2.12.0.dist-info/WHEEL creating '/build/scrapy/src/scrapy-2.12.0/dist/.tmp-4x0ehgz5/Scrapy-2.12.0-py2.py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it adding 'scrapy/VERSION' adding 'scrapy/__init__.py' adding 'scrapy/__main__.py' adding 'scrapy/addons.py' adding 'scrapy/cmdline.py' adding 'scrapy/crawler.py' adding 'scrapy/dupefilters.py' adding 'scrapy/exceptions.py' adding 'scrapy/exporters.py' adding 'scrapy/extension.py' adding 'scrapy/interfaces.py' adding 'scrapy/item.py' adding 'scrapy/link.py' adding 'scrapy/logformatter.py' adding 'scrapy/mail.py' adding 'scrapy/middleware.py' adding 'scrapy/mime.types' adding 'scrapy/pqueues.py' adding 'scrapy/py.typed' adding 'scrapy/resolver.py' adding 'scrapy/responsetypes.py' adding 'scrapy/robotstxt.py' adding 'scrapy/shell.py' adding 'scrapy/signalmanager.py' adding 'scrapy/signals.py' adding 'scrapy/spiderloader.py' adding 'scrapy/squeues.py' adding 'scrapy/statscollectors.py' adding 'scrapy/commands/__init__.py' adding 'scrapy/commands/bench.py' adding 'scrapy/commands/check.py' adding 'scrapy/commands/crawl.py' adding 'scrapy/commands/edit.py' adding 'scrapy/commands/fetch.py' adding 'scrapy/commands/genspider.py' adding 'scrapy/commands/list.py' adding 'scrapy/commands/parse.py' adding 'scrapy/commands/runspider.py' adding 'scrapy/commands/settings.py' adding 'scrapy/commands/shell.py' adding 'scrapy/commands/startproject.py' adding 'scrapy/commands/version.py' adding 'scrapy/commands/view.py' adding 'scrapy/contracts/__init__.py' adding 'scrapy/contracts/default.py' adding 'scrapy/core/__init__.py' adding 'scrapy/core/engine.py' adding 'scrapy/core/scheduler.py' adding 'scrapy/core/scraper.py' adding 'scrapy/core/spidermw.py' adding 'scrapy/core/downloader/__init__.py' adding 'scrapy/core/downloader/contextfactory.py' adding 'scrapy/core/downloader/middleware.py' adding 'scrapy/core/downloader/tls.py' adding 'scrapy/core/downloader/webclient.py' adding 'scrapy/core/downloader/handlers/__init__.py' adding 'scrapy/core/downloader/handlers/datauri.py' adding 'scrapy/core/downloader/handlers/file.py' adding 'scrapy/core/downloader/handlers/ftp.py' adding 'scrapy/core/downloader/handlers/http.py' adding 'scrapy/core/downloader/handlers/http10.py' adding 
'scrapy/core/downloader/handlers/http11.py' adding 'scrapy/core/downloader/handlers/http2.py' adding 'scrapy/core/downloader/handlers/s3.py' adding 'scrapy/core/http2/__init__.py' adding 'scrapy/core/http2/agent.py' adding 'scrapy/core/http2/protocol.py' adding 'scrapy/core/http2/stream.py' adding 'scrapy/downloadermiddlewares/__init__.py' adding 'scrapy/downloadermiddlewares/ajaxcrawl.py' adding 'scrapy/downloadermiddlewares/cookies.py' adding 'scrapy/downloadermiddlewares/defaultheaders.py' adding 'scrapy/downloadermiddlewares/downloadtimeout.py' adding 'scrapy/downloadermiddlewares/httpauth.py' adding 'scrapy/downloadermiddlewares/httpcache.py' adding 'scrapy/downloadermiddlewares/httpcompression.py' adding 'scrapy/downloadermiddlewares/httpproxy.py' adding 'scrapy/downloadermiddlewares/offsite.py' adding 'scrapy/downloadermiddlewares/redirect.py' adding 'scrapy/downloadermiddlewares/retry.py' adding 'scrapy/downloadermiddlewares/robotstxt.py' adding 'scrapy/downloadermiddlewares/stats.py' adding 'scrapy/downloadermiddlewares/useragent.py' adding 'scrapy/extensions/__init__.py' adding 'scrapy/extensions/closespider.py' adding 'scrapy/extensions/corestats.py' adding 'scrapy/extensions/debug.py' adding 'scrapy/extensions/feedexport.py' adding 'scrapy/extensions/httpcache.py' adding 'scrapy/extensions/logstats.py' adding 'scrapy/extensions/memdebug.py' adding 'scrapy/extensions/memusage.py' adding 'scrapy/extensions/periodic_log.py' adding 'scrapy/extensions/postprocessing.py' adding 'scrapy/extensions/spiderstate.py' adding 'scrapy/extensions/statsmailer.py' adding 'scrapy/extensions/telnet.py' adding 'scrapy/extensions/throttle.py' adding 'scrapy/http/__init__.py' adding 'scrapy/http/cookies.py' adding 'scrapy/http/headers.py' adding 'scrapy/http/request/__init__.py' adding 'scrapy/http/request/form.py' adding 'scrapy/http/request/json_request.py' adding 'scrapy/http/request/rpc.py' adding 'scrapy/http/response/__init__.py' adding 'scrapy/http/response/html.py' adding 'scrapy/http/response/json.py' adding 'scrapy/http/response/text.py' adding 'scrapy/http/response/xml.py' adding 'scrapy/linkextractors/__init__.py' adding 'scrapy/linkextractors/lxmlhtml.py' adding 'scrapy/loader/__init__.py' adding 'scrapy/pipelines/__init__.py' adding 'scrapy/pipelines/files.py' adding 'scrapy/pipelines/images.py' adding 'scrapy/pipelines/media.py' adding 'scrapy/selector/__init__.py' adding 'scrapy/selector/unified.py' adding 'scrapy/settings/__init__.py' adding 'scrapy/settings/default_settings.py' adding 'scrapy/spidermiddlewares/__init__.py' adding 'scrapy/spidermiddlewares/depth.py' adding 'scrapy/spidermiddlewares/httperror.py' adding 'scrapy/spidermiddlewares/offsite.py' adding 'scrapy/spidermiddlewares/referer.py' adding 'scrapy/spidermiddlewares/urllength.py' adding 'scrapy/spiders/__init__.py' adding 'scrapy/spiders/crawl.py' adding 'scrapy/spiders/feed.py' adding 'scrapy/spiders/init.py' adding 'scrapy/spiders/sitemap.py' adding 'scrapy/templates/project/scrapy.cfg' adding 'scrapy/templates/project/module/__init__.py' adding 'scrapy/templates/project/module/items.py.tmpl' adding 'scrapy/templates/project/module/middlewares.py.tmpl' adding 'scrapy/templates/project/module/pipelines.py.tmpl' adding 'scrapy/templates/project/module/settings.py.tmpl' adding 'scrapy/templates/project/module/spiders/__init__.py' adding 'scrapy/templates/spiders/basic.tmpl' adding 'scrapy/templates/spiders/crawl.tmpl' adding 'scrapy/templates/spiders/csvfeed.tmpl' adding 'scrapy/templates/spiders/xmlfeed.tmpl' adding 
'scrapy/utils/__init__.py' adding 'scrapy/utils/_compression.py' adding 'scrapy/utils/asyncgen.py' adding 'scrapy/utils/benchserver.py' adding 'scrapy/utils/boto.py' adding 'scrapy/utils/conf.py' adding 'scrapy/utils/console.py' adding 'scrapy/utils/curl.py' adding 'scrapy/utils/datatypes.py' adding 'scrapy/utils/decorators.py' adding 'scrapy/utils/defer.py' adding 'scrapy/utils/deprecate.py' adding 'scrapy/utils/display.py' adding 'scrapy/utils/engine.py' adding 'scrapy/utils/ftp.py' adding 'scrapy/utils/gz.py' adding 'scrapy/utils/httpobj.py' adding 'scrapy/utils/iterators.py' adding 'scrapy/utils/job.py' adding 'scrapy/utils/log.py' adding 'scrapy/utils/misc.py' adding 'scrapy/utils/ossignal.py' adding 'scrapy/utils/project.py' adding 'scrapy/utils/python.py' adding 'scrapy/utils/reactor.py' adding 'scrapy/utils/request.py' adding 'scrapy/utils/response.py' adding 'scrapy/utils/serialize.py' adding 'scrapy/utils/signal.py' adding 'scrapy/utils/sitemap.py' adding 'scrapy/utils/spider.py' adding 'scrapy/utils/ssl.py' adding 'scrapy/utils/template.py' adding 'scrapy/utils/test.py' adding 'scrapy/utils/testproc.py' adding 'scrapy/utils/testsite.py' adding 'scrapy/utils/trackref.py' adding 'scrapy/utils/url.py' adding 'scrapy/utils/versions.py' adding 'Scrapy-2.12.0.dist-info/AUTHORS' adding 'Scrapy-2.12.0.dist-info/LICENSE' adding 'Scrapy-2.12.0.dist-info/METADATA' adding 'Scrapy-2.12.0.dist-info/WHEEL' adding 'Scrapy-2.12.0.dist-info/entry_points.txt' adding 'Scrapy-2.12.0.dist-info/top_level.txt' adding 'Scrapy-2.12.0.dist-info/RECORD' removing build/bdist.linux-armv7l/wheel Successfully built Scrapy-2.12.0-py2.py3-none-any.whl ==> Entering fakeroot environment... ==> Starting package()... ==> Tidying install... -> Removing libtool files... -> Purging unwanted files... -> Removing static library files... -> Stripping unneeded symbols from binaries and libraries... -> Compressing man and info pages... ==> Checking for packaging issues... ==> Creating package "scrapy"... -> Generating .PKGINFO file... -> Generating .BUILDINFO file... -> Generating .MTREE file... -> Compressing package... ==> Leaving fakeroot environment. ==> Finished making: scrapy 2.12.0-1 (Thu Dec 26 13:36:51 2024) ==> Cleaning up...
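
Note on the source validation near the top of the log: makepkg checked scrapy-2.12.0.tar.gz against both the sha512sums and b2sums arrays of the PKGBUILD (b2sums entries are BLAKE2b-512 digests). The same check can be reproduced by hand; the sketch below uses Python's hashlib, with hypothetical EXPECTED_* placeholders that would have to be copied from the PKGBUILD.

import hashlib

# Placeholders: substitute the real digests from the PKGBUILD's
# sha512sums=() and b2sums=() arrays before running.
EXPECTED_SHA512 = "..."
EXPECTED_B2 = "..."

def file_digest(path, algo):
    # Stream the tarball in 1 MiB chunks so large sources need not fit in memory.
    h = hashlib.new(algo)
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

tarball = "scrapy-2.12.0.tar.gz"
print("sha512:", "Passed" if file_digest(tarball, "sha512") == EXPECTED_SHA512 else "FAILED")
print("b2:", "Passed" if file_digest(tarball, "blake2b") == EXPECTED_B2 else "FAILED")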
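
The "Getting build dependencies for wheel... / Building wheel..." phase is the python-build frontend pulled in as a buildtime dependency (python-installer is presumably used later in package() to install the wheel into the package directory, which the log does not show). A minimal sketch of the same kind of non-isolated wheel build through build's ProjectBuilder API, assuming it is run from the extracted scrapy-2.12.0 source directory:

# Sketch only: mirrors what the python-build frontend does here, using its
# ProjectBuilder API in the current environment (i.e. without build isolation).
from build import ProjectBuilder

builder = ProjectBuilder(".")  # run from the extracted scrapy-2.12.0/ tree

# Corresponds to "* Getting build dependencies for wheel..." in the log.
print(builder.get_requires_for_build("wheel"))

# Corresponds to "* Building wheel..."; returns the path of the built .whl.
wheel_path = builder.build("wheel", "dist")
print(wheel_path)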
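
Two warnings in the build are upstream packaging issues rather than problems with the PKGBUILD. The bdist_wheel.universal deprecation means the py2.py3 universal-wheel flag (normally universal = 1 under [bdist_wheel] in setup.cfg, or --universal on the command line) should simply be dropped now that Python 2 is no longer supported. The repeated "Package would be ignored" warnings about scrapy.templates.* state their own fix: either list those data-only template directories explicitly in the packages configuration or switch to namespace-package discovery. A hedged sketch of the latter, using an illustrative setup() call rather than Scrapy's actual packaging configuration:

# Illustrative only: demonstrates the discovery change suggested by the
# setuptools warning; the options shown are not taken from Scrapy's setup.py.
from setuptools import find_namespace_packages, setup

setup(
    name="scrapy",
    # find_namespace_packages() also matches directories without an
    # __init__.py, such as data-only template folders, which plain
    # find_packages() silently skips.
    packages=find_namespace_packages(include=["scrapy", "scrapy.*"]),
    # Ship the .tmpl/.cfg files declared via MANIFEST.in / package data.
    include_package_data=True,
)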