Upgrade PyPy for CI, and test both 3.5 (oldest) and 3.6 (newest) (scrapy#4504)

* Upgrade PyPy for CI, and test both 3.5 (oldest) and 3.6 (newest)

* Log a detailed error message to discover why MockServer is not working

* Go for all lines!

* Disable tests based on mitmproxy while running on PyPy

* Fix test_get_func_args for PyPy 3.6+

* Make testPayloadDefaultCiphers work regardless of OpenSSL default ciphers

* Crossing fingers…

* Rename: testPayloadDefaultCiphers → testPayloadDisabledCipher

* Test the PyPy version currently documented as the minimum required version

* Fix the PYPY_VERSION tag

* Update the documentation about supported PyPy versions

* Also test the latest 3.5 Python version with PyPy

* Fix the PYPY_VERSION value for the latest 3.5 version

* Use pinned dependencies for asyncio and PyPy tests against oldest supported Python versions

* Fix PyPy installation for the pypy3-pinned Tox environment

* Try installing Cython

* Maybe PyPy requires lxml 3.6.0?

* install.rst: minor clarification

* lxml 4.0.0 is required on PyPy

* Require setuptools 18.5+

* Revert "Require setuptools 18.5+"

This reverts commit 017ec33.

* Maintain lxml as a dependency if setuptools < 18.5 is used
Gallaecio authored Jul 16, 2020
1 parent 9a74a71 commit d29bec6
Showing 8 changed files with 80 additions and 51 deletions.
16 changes: 11 additions & 5 deletions .travis.yml
@@ -18,19 +18,25 @@ matrix:
- env: TOXENV=typing
python: 3.8

- env: TOXENV=pypy3
- env: TOXENV=pinned
python: 3.5.2
- env: TOXENV=asyncio
- env: TOXENV=asyncio-pinned
python: 3.5.2 # We use additional code to support 3.5.3 and earlier
- env: TOXENV=pypy3-pinned PYPY_VERSION=3-v5.9.0

- env: TOXENV=py
python: 3.5
- env: TOXENV=asyncio
python: 3.5 # We use specific code to support >= 3.5.4, < 3.6
- env: TOXENV=pypy3 PYPY_VERSION=3.5-v7.0.0

- env: TOXENV=py
python: 3.6
- env: TOXENV=pypy3 PYPY_VERSION=3.6-v7.3.1

- env: TOXENV=py
python: 3.7

- env: TOXENV=py PYPI_RELEASE_JOB=true
python: 3.8
dist: bionic
@@ -42,9 +48,9 @@ matrix:
dist: bionic
install:
- |
if [ "$TOXENV" = "pypy3" ]; then
export PYPY_VERSION="pypy3.5-5.9-beta-linux_x86_64-portable"
wget "https://bitbucket.org/squeaky/portable-pypy/downloads/${PYPY_VERSION}.tar.bz2"
if [[ ! -z "$PYPY_VERSION" ]]; then
export PYPY_VERSION="pypy$PYPY_VERSION-linux64"
wget "https://bitbucket.org/pypy/pypy/downloads/${PYPY_VERSION}.tar.bz2"
tar -jxf ${PYPY_VERSION}.tar.bz2
virtualenv --python="$PYPY_VERSION/bin/pypy3" "$HOME/virtualenvs/$PYPY_VERSION"
source "$HOME/virtualenvs/$PYPY_VERSION/bin/activate"
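The install step above builds the download URL from the matrix's PYPY_VERSION tag. A minimal sketch of that same expansion, using the 3.6 value from the matrix (no download performed; the URL pattern is copied from the script above):

```shell
#!/bin/sh
# Expand a PYPY_VERSION matrix tag into the tarball URL, as the
# .travis.yml install step does.
PYPY_VERSION="3.6-v7.3.1"                   # value from the build matrix
PYPY_VERSION="pypy${PYPY_VERSION}-linux64"  # full build name
URL="https://bitbucket.org/pypy/pypy/downloads/${PYPY_VERSION}.tar.bz2"
echo "$URL"
```

This is why the earlier commit message notes fixing "the PYPY_VERSION tag": the tag must match the official PyPy tarball naming exactly for the expansion to resolve.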
14 changes: 0 additions & 14 deletions docs/faq.rst
@@ -64,20 +64,6 @@ Here's an example spider using BeautifulSoup API, with ``lxml`` as the HTML parser:

.. _BeautifulSoup's official documentation: https://www.crummy.com/software/BeautifulSoup/bs4/doc/#specifying-the-parser-to-use

.. _faq-python-versions:

What Python versions does Scrapy support?
-----------------------------------------

Scrapy is supported under Python 3.5.2+
under CPython (default Python implementation) and PyPy (starting with PyPy 5.9).
Python 3 support was added in Scrapy 1.1.
PyPy support was added in Scrapy 1.4, PyPy3 support was added in Scrapy 1.5.
Python 2 support was dropped in Scrapy 2.0.

.. note::
For Python 3 support on Windows, it is recommended to use
Anaconda/Miniconda as :ref:`outlined in the installation guide <intro-install-windows>`.

Did Scrapy "steal" X from Django?
---------------------------------
12 changes: 9 additions & 3 deletions docs/intro/install.rst
@@ -4,12 +4,18 @@
Installation guide
==================

.. _faq-python-versions:

Supported Python versions
=========================

Scrapy requires Python 3.5.2+, either the CPython implementation (default) or
the PyPy 5.9+ implementation (see :ref:`python:implementations`).


Installing Scrapy
=================

Scrapy runs on Python 3.5.2 or above under CPython (default Python
implementation) and PyPy (starting with PyPy 5.9).

If you're using `Anaconda`_ or `Miniconda`_, you can install the package from
the `conda-forge`_ channel, which has up-to-date packages for Linux, Windows
and macOS.
44 changes: 28 additions & 16 deletions setup.py
@@ -18,12 +18,39 @@ def has_environment_marker_platform_impl_support():
return parse_version(setuptools_version) >= parse_version('18.5')


install_requires = [
'Twisted>=17.9.0',
'cryptography>=2.0',
'cssselect>=0.9.1',
'itemloaders>=1.0.1',
'lxml>=3.5.0',
'parsel>=1.5.0',
'PyDispatcher>=2.0.5',
'pyOpenSSL>=16.2.0',
'queuelib>=1.4.2',
'service_identity>=16.0.0',
'w3lib>=1.17.0',
'zope.interface>=4.1.3',
'protego>=0.1.15',
'itemadapter>=0.1.0',
]
extras_require = {}

if has_environment_marker_platform_impl_support():
extras_require[':platform_python_implementation == "CPython"'] = [
'lxml>=3.5.0',
]
extras_require[':platform_python_implementation == "PyPy"'] = [
# Earlier lxml versions are affected by
# https://bitbucket.org/pypy/pypy/issues/2498/cython-on-pypy-3-dict-object-has-no,
# which was fixed in Cython 0.26, released on 2017-06-19, and used to
# generate the C headers of lxml release tarballs published since then, the
# first of which was:
'lxml>=4.0.0',
'PyPyDispatcher>=2.1.0',
]
else:
install_requires.append('lxml>=3.5.0')


setup(
@@ -67,21 +94,6 @@ def has_environment_marker_platform_impl_support():
'Topic :: Software Development :: Libraries :: Python Modules',
],
python_requires='>=3.5.2',
install_requires=[
'Twisted>=17.9.0',
'cryptography>=2.0',
'cssselect>=0.9.1',
'itemloaders>=1.0.1',
'lxml>=3.5.0',
'parsel>=1.5.0',
'PyDispatcher>=2.0.5',
'pyOpenSSL>=16.2.0',
'queuelib>=1.4.2',
'service_identity>=16.0.0',
'w3lib>=1.17.0',
'zope.interface>=4.1.3',
'protego>=0.1.15',
'itemadapter>=0.1.0',
],
install_requires=install_requires,
extras_require=extras_require,
)
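The setup.py split above keys `extras_require` on PEP 508 environment markers (`platform_python_implementation`), which setuptools only honors from 18.5 on; older setuptools falls back to a plain `lxml>=3.5.0` pin. A small sketch of the version gate itself, using the same `parse_version` comparison as the diff's helper (the function name here is illustrative, not the one in setup.py):

```python
from pkg_resources import parse_version

def supports_platform_impl_markers(setuptools_version):
    # Environment markers keyed on platform_python_implementation
    # require setuptools >= 18.5.
    return parse_version(setuptools_version) >= parse_version('18.5')

print(supports_platform_impl_markers('18.4'))    # too old: keep the plain lxml pin
print(supports_platform_impl_markers('41.0.0'))  # new enough: use marker-keyed extras
```

With the markers available, pip itself picks `lxml>=4.0.0` plus `PyPyDispatcher` on PyPy and `lxml>=3.5.0` on CPython, so a single sdist serves both interpreters.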
2 changes: 2 additions & 0 deletions tests/test_proxy_connect.py
@@ -60,6 +60,8 @@ def _wrong_credentials(proxy_url):

@skipIf(sys.version_info < (3, 5, 4),
"requires mitmproxy < 3.0.0, which these tests do not support")
@skipIf("pypy" in sys.executable,
"mitmproxy does not support PyPy")
@skipIf(platform.system() == 'Windows' and sys.version_info < (3, 7),
"mitmproxy does not support Windows when running Python < 3.7")
class ProxyConnectTestCase(TestCase):
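The new `skipIf` above detects PyPy by looking for the substring `"pypy"` in the interpreter path. An alternative check (not what this commit uses, shown only for comparison) asks the interpreter directly via the stdlib:

```python
import platform

# platform.python_implementation() reports the interpreter name directly,
# without depending on how the executable happens to be named.
IS_PYPY = platform.python_implementation() == 'PyPy'
print(platform.python_implementation())
```

The path-substring check matches the virtualenv layout set up in `.travis.yml`, where the interpreter lives under a `pypy…` directory, so both approaches agree in this CI setup.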
6 changes: 5 additions & 1 deletion tests/test_utils_python.py
@@ -4,6 +4,7 @@
import platform
import unittest
from itertools import count
from sys import version_info
from warnings import catch_warnings

from scrapy.utils.python import (
@@ -214,9 +215,12 @@ def __call__(self, a, b, c):
else:
self.assertEqual(
get_func_args(str.split, stripself=True), ['sep', 'maxsplit'])
self.assertEqual(get_func_args(" ".join, stripself=True), ['list'])
self.assertEqual(
get_func_args(operator.itemgetter(2), stripself=True), ['obj'])
if version_info < (3, 6):
self.assertEqual(get_func_args(" ".join, stripself=True), ['list'])
else:
self.assertEqual(get_func_args(" ".join, stripself=True), ['iterable'])

def test_without_none_values(self):
self.assertEqual(without_none_values([1, None, 3, 4]), [1, 3, 4])
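The version gate above exists because `str.join` renamed its documented parameter from `list` to `iterable` in Python 3.6. `get_func_args` is Scrapy's own helper, but the underlying rename can be observed with the stdlib on a modern CPython (this is an illustration of the change, not the test's code path):

```python
import inspect

# On Python >= 3.6, str.join declares its argument as "iterable";
# before 3.6 it was "list", which is what the test's version gate handles.
params = list(inspect.signature(" ".join).parameters)
print(params)
```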
6 changes: 4 additions & 2 deletions tests/test_webclient.py
@@ -413,7 +413,9 @@ def testPayload(self):
self.getURL("payload"), body=s, contextFactory=client_context_factory
).addCallback(self.assertEqual, to_bytes(s))

def testPayloadDefaultCiphers(self):
def testPayloadDisabledCipher(self):
s = "0123456789" * 10
d = getPage(self.getURL("payload"), body=s, contextFactory=ScrapyClientContextFactory())
settings = Settings({'DOWNLOADER_CLIENT_TLS_CIPHERS': 'ECDHE-RSA-AES256-GCM-SHA384'})
client_context_factory = create_instance(ScrapyClientContextFactory, settings=settings, crawler=None)
d = getPage(self.getURL("payload"), body=s, contextFactory=client_context_factory)
return self.assertFailure(d, OpenSSL.SSL.Error)
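The renamed test pins `DOWNLOADER_CLIENT_TLS_CIPHERS` to a single suite so the handshake fails against a server that does not offer it, regardless of OpenSSL's defaults. The same client-side restriction can be sketched with the stdlib `ssl` module, independent of Scrapy's context factory (assuming an OpenSSL build that ships this standard suite):

```python
import ssl

# Restrict a client context to the single cipher suite used in the test.
# set_ciphers() affects the TLS <= 1.2 cipher list; TLS 1.3 suites may
# still appear alongside it in get_ciphers().
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers('ECDHE-RSA-AES256-GCM-SHA384')
names = [c['name'] for c in ctx.get_ciphers()]
print(names)
```

Pinning the cipher in the test's own settings is what makes it independent of the OpenSSL default cipher list, which is the point of the rename from `testPayloadDefaultCiphers`.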
31 changes: 21 additions & 10 deletions tox.ini
@@ -58,11 +58,6 @@ deps =
commands =
pylint conftest.py docs extras scrapy setup.py tests

[testenv:pypy3]
basepython = pypy3
commands =
py.test {posargs:--durations=10 docs scrapy tests}

[pinned]
deps =
-ctests/constraints.txt
@@ -85,7 +80,6 @@ deps =
Pillow==3.4.2

[testenv:pinned]
basepython = python3
deps =
{[pinned]deps}
lxml==3.5.0
@@ -104,6 +98,27 @@ deps =
reppy
robotexclusionrulesparser

[testenv:asyncio]
commands =
{[testenv]commands} --reactor=asyncio

[testenv:asyncio-pinned]
commands = {[testenv:asyncio]commands}
deps = {[testenv:pinned]deps}

[testenv:pypy3]
basepython = pypy3
commands =
py.test {posargs:--durations=10 docs scrapy tests}

[testenv:pypy3-pinned]
basepython = {[testenv:pypy3]basepython}
commands = {[testenv:pypy3]commands}
deps =
{[pinned]deps}
lxml==4.0.0
PyPyDispatcher==2.1.0

[docs]
changedir = docs
deps =
@@ -135,7 +150,3 @@ deps = {[docs]deps}
setenv = {[docs]setenv}
commands =
sphinx-build -W -b linkcheck . {envtmpdir}/linkcheck

[testenv:asyncio]
commands =
{[testenv]commands} --reactor=asyncio
