Scrapy

Scrapy support for Sublime Text

Installs

  • Total 1K
  • Win 863
  • Mac 314
  • Linux 241


Scrapy Snippets for Sublime Text

This package provides handy snippets for working with Scrapy, the Python framework for writing web spiders. It also provides command palette commands for parsing request parameters from Postman.

Installation

Clone this package into your Packages/ directory. No Package Control support yet :)

Snippets

Snippets assume your spider has a BASE_URL attribute. This prevents code repetition and keeps your code portable in case the website changes its base URL.
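For illustration, a spider following this convention might look like the sketch below. ExampleSpider and product_url are made-up names, not part of the package, and scrapy.Spider is left out so the sketch stays self-contained:

```python
# A minimal sketch of the convention the snippets rely on: one BASE_URL
# attribute per spider, with every URL built from it. (A real spider would
# subclass scrapy.Spider.)
class ExampleSpider:
    name = "example"
    BASE_URL = "https://mysite.com"

    def product_url(self, product_id):
        # If the site moves to a new domain, only BASE_URL changes.
        return f"{self.BASE_URL}/products/{product_id}"

print(ExampleSpider().product_url(42))
# https://mysite.com/products/42
```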

Snippets for the most common requests are provided:

  • Get: for sending a request with URL query parameters, like mysite.com?parameter=value
  • Post: for x-www-form-urlencoded requests
  • Json: for POST requests with an application/json content type
  • A redirect method, useful when you need to send a GET request without parameters
  • A last method that opens the received response in the browser and starts an ipdb shell, useful for debugging the response or experimenting with XPaths
  • An asp parameter extractor, for easily getting __VIEWSTATE, __VIEWSTATEGENERATOR and __EVENTVALIDATION from a response
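As a rough illustration of the Get case, a URL with query parameters can be built from BASE_URL with the standard library's urlencode. The values here are made up, and the real snippet expands inside a spider, where the resulting URL would feed a scrapy.Request (omitted so the sketch is self-contained):

```python
from urllib.parse import urlencode

# Illustrative values; in a spider the resulting URL would be passed
# to scrapy.Request.
BASE_URL = "https://mysite.com"
params = {"parameter": "value", "page": "2"}
url = f"{BASE_URL}/search?{urlencode(params)}"
print(url)
# https://mysite.com/search?parameter=value&page=2
```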

Commands

Format from Postman

If you're using Postman to inspect your requests to a site, you can copy the parameters in 'bulk-edit' mode and paste them into your spider. Then select the lines you just pasted and, from the command palette, execute Scrapy: Format from Postman. The parameters will be formatted as the keys and values of a Python dictionary.
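As an illustration (the plugin's exact output format may differ), Postman's bulk-edit mode gives one key:value pair per line, and the conversion amounts to something like:

```python
# Postman bulk-edit text: one "key:value" pair per line (values made up).
bulk_edit = "username:admin\npassword:secret"

# Rewrite each pair as a quoted Python dict entry, one per line.
formatted = ",\n".join(
    f"'{key}': '{value}'"
    for key, value in (line.split(":", 1) for line in bulk_edit.splitlines())
)
print(formatted)
# 'username': 'admin',
# 'password': 'secret'
```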

Instance Item

Copy an item class definition into your spider. Then select the class definition lines and execute Scrapy: Yield Item. The definition will be formatted into a yield ClassInstance() statement.
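For example (ProductItem and its fields are hypothetical, and the command's exact output may differ), the transformation from copied definition to yield statement amounts to:

```python
# A copied item definition (hypothetical class and fields).
definition = (
    "class ProductItem(scrapy.Item):\n"
    "    name = scrapy.Field()\n"
    "    price = scrapy.Field()"
)

# Pull out the class name and field names, then rebuild the selection
# as a yield statement with one keyword argument per field.
lines = definition.splitlines()
class_name = lines[0].split()[1].split("(")[0]
fields = [line.split("=")[0].strip() for line in lines[1:]]
stmt = "yield {}(\n{})".format(
    class_name, "".join(f"    {field}=...,\n" for field in fields)
)
print(stmt)
# yield ProductItem(
#     name=...,
#     price=...,
# )
```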