I am deploying a Portia spider with scrapyd. When scheduling it, I pass in URLs for every link-parsing run.
Example:
The URL crawled by the spider (call it URL_1) is http://www.example.com/query1, and the URL I pass in to extract content from (call it URL_2) is http://www.example.com/query2.
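
For reference, this is roughly how I schedule the crawl and pass the URL in. I post to scrapyd's schedule.json endpoint; the project name, spider name, and the name of the extra argument are just placeholders for what I actually use:

    import requests

    # Placeholder project/spider names and scrapyd address. Any extra
    # parameter in the request is forwarded to the spider as a spider
    # argument; "start_urls" is only the name I use here.
    resp = requests.post(
        "http://localhost:6800/schedule.json",
        data={
            "project": "myproject",                         # placeholder
            "spider": "www.example.com",                    # placeholder Portia spider name
            "start_urls": "http://www.example.com/query2",  # URL_2, the page to extract from
        },
    )
    print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}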
The spider extracts the content from URL_2 and stores it in the respective items. That part works fine.
The item has a [URL] field, and URL_2 is currently stored in it, but what I want is for URL_1 to be stored in the [URL] field instead.
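
To make it concrete, this is what the [URL] field contains now versus what I want it to contain (other item fields omitted):

    # What the scraped item currently looks like (simplified):
    {"URL": "http://www.example.com/query2"}  # URL_2

    # What I want instead:
    {"URL": "http://www.example.com/query1"}  # URL_1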
Any solution?
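
To illustrate the behaviour I am after: in a hand-written Scrapy spider I could carry URL_1 along in request.meta, roughly as in the sketch below (the spider name, selectors, and field names are only placeholders). I do not know how to get the same result from a Portia-generated spider deployed on scrapyd.

    import scrapy

    class ExampleSpider(scrapy.Spider):
        # All names here are placeholders, not my real spider.
        name = "example"
        start_urls = ["http://www.example.com/query1"]  # URL_1

        def parse(self, response):
            # response.url here is URL_1, the page whose links are followed.
            for href in response.css("a::attr(href)").getall():
                # Carry URL_1 along so the callback can store it in the item.
                yield response.follow(
                    href,
                    callback=self.parse_item,
                    meta={"source_url": response.url},
                )

        def parse_item(self, response):
            # response.url here is URL_2, but the item keeps URL_1 instead.
            yield {
                "URL": response.meta["source_url"],     # URL_1
                "content": response.css("body").get(),  # extracted from URL_2
            }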