pyspider

A Powerful Spider (Web Crawler) System in Python. TRY IT NOW!

Tutorial: http://docs.pyspider.org/en/latest/tutorial/
Documentation: http://docs.pyspider.org/
Release notes: https://github.com/binux/pyspider/releases

Sample Code

from pyspider.libs.base_handler import *


class Handler(BaseHandler):
    # Project-wide settings applied to every self.crawl request (empty here).
    crawl_config = {
    }

    # Entry point; re-scheduled every 24 hours.
    @every(minutes=24 * 60)
    def on_start(self):
        self.crawl('http://scrapy.org/', callback=self.index_page)

    # Pages handled by this callback are treated as fresh for 10 days
    # and will not be recrawled within that period.
    @config(age=10 * 24 * 60 * 60)
    def index_page(self, response):
        # Follow every absolute link found on the page.
        for each in response.doc('a[href^="http"]').items():
            self.crawl(each.attr.href, callback=self.detail_page)

    # The returned dict is captured as the result for this page.
    def detail_page(self, response):
        return {
            "url": response.url,
            "title": response.doc('title').text(),
        }
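
To try the script, create a new project in the WebUI, paste the code into the script editor, and click Run; the dicts returned by detail_page appear in the result viewer and are stored in the result database.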

Demo

Installation

WARNING: The WebUI is open to the public by default and can be used to execute arbitrary commands, which may harm your system. Please run it inside an internal network or enable need-auth for the WebUI.
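
For example, authentication can be enabled through a config file in the style of the bundled config_example.json (a minimal sketch; the credentials below are placeholders):

{
  "webui": {
    "username": "admin",
    "password": "change-me",
    "need-auth": true
  }
}

Start pyspider with the -c/--config option pointing at this file, e.g. pyspider -c config.json.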

Quickstart: http://docs.pyspider.org/en/latest/Quickstart/
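
In short, the Quickstart boils down to something like the following (a sketch, assuming a supported Python version):

pip install pyspider
pyspider all

Then open http://localhost:5000/ in your browser to reach the WebUI.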

Contribute

TODO

v0.4.0

  • a visual scraping interface like Portia

License

Licensed under the Apache License, Version 2.0