Deploying Scrapy project to Scrapinghub fails


My Scrapy project works fine on my local machine, but I get an error when deploying it to Scrapinghub:

$ shub deploy

Packing version 88e88d8-master
Deploying to Scrapy Cloud project "8888888"
Deploy log last 30 lines:
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/mycrawler/spiders/first_spider.py", line 4, in <module>
  File "/app/__main__.egg/mycrawler/items.py", line 4, in <module>
ImportError: No module named myCrawlerHelper
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}

{"status": "error", "message": "Internal error"}
Deploy log location: /var/folders/1w/x1jxnccs57d9h60kwwnsk83c0000gn/T/shub_deploy_irxzg9x8.log
Error: Deploy failed: b'{"status": "error", "message": "Internal error"

I have some helper functions collected in a file called myCrawlerHelper.py, and I import them into my spiders and into items.py. I believe the problem is related to this.
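For context, the project layout looks roughly like this (simplified), with the helper file sitting at the project root next to scrapy.cfg:

myproject/
    scrapy.cfg
    myCrawlerHelper.py
    mycrawler/
        __init__.py
        items.py
        settings.py
        spiders/
            __init__.py
            first_spider.py

and items.py imports the helper as a plain top-level module (MycrawlerItem and format_price below are simplified stand-ins for my actual item and helpers):

# mycrawler/items.py
import scrapy
import myCrawlerHelper  # this is the import that fails on Scrapy Cloud

class MycrawlerItem(scrapy.Item):
    name = scrapy.Field()
    # one of the helpers cleans up scraped prices on export
    price = scrapy.Field(serializer=myCrawlerHelper.format_price)

As far as I can tell this works locally because Scrapy adds the directory containing scrapy.cfg to sys.path, but that file apparently doesn't make it into the egg that shub builds.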

I use Splash as well. I also noticed that the traceback paths reference Python 2.7, although I use Python 3.6 locally.
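My scrapinghub.yml currently contains just the project id, so I suspect the deploy falls back to a default Python 2.7 stack. Would pinning a Python 3 stack help, something like the following? (The exact stack name, scrapy:1.6-py3 here, is a guess that would need checking against the available Scrapy Cloud stacks.)

project: 8888888
stacks:
  default: scrapy:1.6-py3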

How can I resolve this?
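One fix I'm considering, based on the egg path in the traceback (/app/__main__.egg/mycrawler/...), is to move myCrawlerHelper.py inside the mycrawler package so it gets bundled into the deployed egg, and to import it through the package path instead of as a top-level module. Roughly:

# mycrawler/items.py, after moving the helper into the package
import scrapy
from mycrawler import myCrawlerHelper  # or: from . import myCrawlerHelper

class MycrawlerItem(scrapy.Item):
    name = scrapy.Field()
    price = scrapy.Field(serializer=myCrawlerHelper.format_price)  # format_price is illustrative

Would that be the correct approach, or is there a way to keep the helper at the project root and still have shub package it?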
