
Python 3.10 compatibility #166

Merged (17 commits, Mar 10, 2022)
24 changes: 18 additions & 6 deletions .github/workflows/main.yml
@@ -12,20 +12,26 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        include:
+        include:
+          - python-version: "pypy3"
+            env:
+              TOXENV: "msgpack"
+          - python-version: "pypy3"
+            env:
+              TOXENV: "json"
+
           - python-version: "2.7"
             env:
               TOXENV: "msgpack"
           - python-version: "2.7"
            env:
               TOXENV: "json"
-          - python-version: "pypy3"
+          - python-version: "3.5"
             env:
               TOXENV: "msgpack"
-          - python-version: "pypy3"
+          - python-version: "3.5"
             env:
               TOXENV: "json"
 
           - python-version: "3.6"
             env:
               TOXENV: "msgpack"
@@ -50,7 +56,13 @@ jobs:
           - python-version: "3.9"
             env:
               TOXENV: "json"
+          - python-version: "3.10"
+            env:
+              TOXENV: "msgpack"
+          - python-version: "3.10"
+            env:
+              TOXENV: "json"
 
     steps:
     - uses: actions/checkout@v2
     - name: Set up Python ${{ matrix.python-version }}
@@ -65,4 +77,4 @@ jobs:
         tox
 
     - name: Upload to codecov
-      uses: codecov/codecov-action@v2
+      uses: codecov/codecov-action@v2
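
The two new 3.10 jobs exercise the breakage this PR fixes: Python 3.10 removed the ABC aliases (``Iterable``, ``Iterator``, ``MutableMapping``, ...) that ``collections`` had re-exported and that had been deprecated since Python 3.3. A minimal standalone repro, not part of the PR itself::

    try:
        from collections import Iterable        # gone in Python 3.10+
    except ImportError:
        from collections.abc import Iterable    # its home since Python 3.3

    assert isinstance([1, 2, 3], Iterable)
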
1 change: 1 addition & 0 deletions .gitignore
@@ -15,3 +15,4 @@ docs/_build
 .DS_Store
 pytestdebug.log
 .idea
+coverage.xml
11 changes: 10 additions & 1 deletion README.rst
@@ -2,16 +2,25 @@
 Client interface for Scrapinghub API
 ====================================
 
 .. image:: https://img.shields.io/pypi/v/scrapinghub.svg
    :target: https://pypi.org/project/scrapinghub
 
+.. image:: https://img.shields.io/pypi/pyversions/scrapinghub.svg
+   :target: https://pypi.org/project/scrapinghub
+
+.. image:: https://github.com/scrapinghub/python-scrapinghub/actions/workflows/main.yml/badge.svg
+   :target: https://github.com/scrapinghub/python-scrapinghub/actions/workflows/main.yml
+
+.. image:: https://codecov.io/gh/scrapinghub/python-scrapinghub/branch/master/graph/badge.svg
+   :target: https://app.codecov.io/gh/scrapinghub/python-scrapinghub
+
 The ``scrapinghub`` is a Python library for communicating with the `Scrapinghub API`_.
 
 
 Requirements
 ============
 
-* Python 2.7 or above
+* Python 2.7 or Python 3.5+
 
 
 Installation
11 changes: 4 additions & 7 deletions requirements-test.txt
@@ -1,8 +1,5 @@
 mock
-vcrpy==1.10.3
-# FIXME remove the constraint after resolving
-# https://github.com/pytest-dev/pytest/issues/2966
-pytest<3.3.0
-pytest-cov<2.6.0
-pytest-catchlog
-responses==0.10.6
+vcrpy
+pytest
+pytest-cov
+responses
12 changes: 6 additions & 6 deletions scrapinghub/client/collections.py
@@ -1,7 +1,7 @@
 from __future__ import absolute_import
-import collections
 
 from six import string_types
+from six.moves import collections_abc
 
 from ..hubstorage.collectionsrt import Collection as _Collection
 
@@ -82,7 +82,7 @@ def iter(self):
 
         :return: an iterator over collections list where each collection is
             represented by a dictionary with ('name','type') fields.
-        :rtype: :class:`collections.Iterable[dict]`
+        :rtype: :class:`collections.abc.Iterable[dict]`
         """
         return self._origin.apiget('list')
 
@@ -130,9 +130,9 @@ class Collection(object):
         >>> for elem in foo_store.iter(count=1)):
         ...     print(elem)
         [{'_key': '002d050ee3ff6192dcbecc4e4b4457d7', 'value': '1447221694537'}]
 
     - get generator over item keys::
 
         >>> keys = foo_store.iter(nodata=True, meta=["_key"]))
         >>> next(keys)
         {'_key': '002d050ee3ff6192dcbecc4e4b4457d7'}
@@ -185,7 +185,7 @@ def delete(self, keys):
         The method returns ``None`` (original method returns an empty generator).
         """
         if (not isinstance(keys, string_types) and
-                not isinstance(keys, collections.Iterable)):
+                not isinstance(keys, collections_abc.Iterable)):
             raise ValueError("You should provide string key or iterable "
                              "object providing string keys")
         self._origin.delete(keys)
@@ -219,7 +219,7 @@ def iter(self, key=None, prefix=None, prefixcount=None, startts=None,
         :param requests_params: (optional) a dict with optional requests params.
         :param \*\*params: (optional) additional query params for the request.
         :return: an iterator over items list.
-        :rtype: :class:`collections.Iterable[dict]`
+        :rtype: :class:`collections.abc.Iterable[dict]`
         """
         update_kwargs(params, key=key, prefix=prefix, prefixcount=prefixcount,
                       startts=startts, endts=endts,
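
The pattern above repeats across the client: direct ``collections`` ABC lookups become ``six.moves.collections_abc``, which resolves to ``collections.abc`` on Python 3 and to ``collections`` on Python 2. A standalone sketch of the guard used in ``Collection.delete()`` (the ``validate_keys`` helper name is ours, for illustration only)::

    from six import string_types
    from six.moves import collections_abc

    def validate_keys(keys):
        # Accept a single string key or any iterable of keys, as delete() does.
        if (not isinstance(keys, string_types) and
                not isinstance(keys, collections_abc.Iterable)):
            raise ValueError("You should provide string key or iterable "
                             "object providing string keys")

    validate_keys('002d050ee3ff6192dcbecc4e4b4457d7')  # single key: accepted
    validate_keys(['key1', 'key2'])                    # iterable of keys: accepted
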
8 changes: 4 additions & 4 deletions scrapinghub/client/frontiers.py
@@ -99,7 +99,7 @@ def iter(self):
         """Iterate through frontiers.
 
         :return: an iterator over frontiers names.
-        :rtype: :class:`collections.Iterable[str]`
+        :rtype: :class:`collections.abc.Iterable[str]`
         """
         return iter(self.list())
 
@@ -174,7 +174,7 @@ def iter(self):
         """Iterate through slots.
 
         :return: an iterator over frontier slots names.
-        :rtype: :class:`collections.Iterable[str]`
+        :rtype: :class:`collections.abc.Iterable[str]`
         """
         return iter(self.list())
 
@@ -321,7 +321,7 @@ def iter(self, **params):
 
         :param \*\*params: (optional) additional query params for the request.
         :return: an iterator over fingerprints.
-        :rtype: :class:`collections.Iterable[str]`
+        :rtype: :class:`collections.abc.Iterable[str]`
         """
         origin = self._frontier._frontiers._origin
         path = (self._frontier.key, 's', self.key, 'f')
@@ -358,7 +358,7 @@ def iter(self, mincount=None, **params):
         :param \*\*params: (optional) additional query params for the request.
         :return: an iterator over request batches in the queue where each
             batch is represented with a dict with ('id', 'requests') field.
-        :rtype: :class:`collections.Iterable[dict]`
+        :rtype: :class:`collections.abc.Iterable[dict]`
         """
         origin = self._frontier._frontiers._origin
         path = (self._frontier.key, 's', self.key, 'q')
2 changes: 1 addition & 1 deletion scrapinghub/client/items.py
@@ -107,7 +107,7 @@ def list_iter(self, chunksize=1000, *args, **kwargs):
             down by `chunksize`.
 
         :return: an iterator over items, yielding lists of items.
-        :rtype: :class:`collections.Iterable`
+        :rtype: :class:`collections.abc.Iterable`
         """
 
         start = kwargs.pop("start", 0)
2 changes: 1 addition & 1 deletion scrapinghub/client/projects.py
@@ -63,7 +63,7 @@ def iter(self):
         Provided for the sake of API consistency.
 
         :return: an iterator over project ids list.
-        :rtype: :class:`collections.Iterable[int]`
+        :rtype: :class:`collections.abc.Iterable[int]`
         """
         return iter(self.list())
4 changes: 2 additions & 2 deletions scrapinghub/client/proxy.py
@@ -106,7 +106,7 @@ def iter(self, _path=None, count=None, requests_params=None, **apiparams):
 
         :param count: limit amount of elements.
         :return: an iterator over elements list.
-        :rtype: :class:`collections.Iterable`
+        :rtype: :class:`collections.abc.Iterable`
         """
         update_kwargs(apiparams, count=count)
         apiparams = self._modify_iter_params(apiparams)
@@ -165,7 +165,7 @@ def iter(self):
         """Iterate through key/value pairs.
 
         :return: an iterator over key/value pairs.
-        :rtype: :class:`collections.Iterable`
+        :rtype: :class:`collections.abc.Iterable`
         """
         return six.iteritems(next(self._origin.apiget()))
9 changes: 4 additions & 5 deletions scrapinghub/hubstorage/resourcetype.py
@@ -1,12 +1,11 @@
-import time
 import json
-import socket
 import logging
-from collections import MutableMapping
+import socket
+import time
 
 import six
-from six.moves import range
 import requests.exceptions as rexc
+from six.moves import range, collections_abc
 
 from .utils import urlpathjoin, xauth
 from .serialization import jlencode, jldecode, mpdecode
@@ -230,7 +229,7 @@ def stats(self):
         return next(self.apiget('stats', chunk_size=STATS_CHUNK_SIZE))
 
 
-class MappingResourceType(ResourceType, MutableMapping):
+class MappingResourceType(ResourceType, collections_abc.MutableMapping):
 
     _cached = None
     ignore_fields = ()
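
``MappingResourceType`` stays dict-like by mixing in the ABC from its new import location. As a reminder of what that contract involves, a simplified, self-contained sketch (not the class above): ``MutableMapping`` requires five methods and derives the rest of the mapping API from them::

    from six.moves import collections_abc

    class CachedMapping(collections_abc.MutableMapping):
        # get(), pop(), update(), items(), setdefault() etc. are all
        # derived by the ABC from the five methods below.

        def __init__(self, data=None):
            self._cached = dict(data or {})

        def __getitem__(self, key):
            return self._cached[key]

        def __setitem__(self, key, value):
            self._cached[key] = value

        def __delitem__(self, key):
            del self._cached[key]

        def __iter__(self):
            return iter(self._cached)

        def __len__(self):
            return len(self._cached)

    settings = CachedMapping({'job_runtime_limit': 24})
    settings.update(added=1)  # derived method, courtesy of the ABC
    assert dict(settings.items()) == {'job_runtime_limit': 24, 'added': 1}
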
7 changes: 4 additions & 3 deletions setup.py
@@ -28,18 +28,19 @@
     package_data={'scrapinghub': ['VERSION']},
     install_requires=['requests>=1.0', 'retrying>=1.3.3', 'six>=1.10.0'],
     extras_require={'msgpack': mpack_required},
+    python_requires='>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'License :: OSI Approved :: BSD License',
         'Operating System :: OS Independent',
         'Programming Language :: Python',
         'Programming Language :: Python :: 2',
         'Programming Language :: Python :: 2.7',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.3',
         'Programming Language :: Python :: 3.5',
         'Programming Language :: Python :: 3.6',
         'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
+        'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: Implementation :: CPython',
         'Programming Language :: Python :: Implementation :: PyPy',
         'Topic :: Internet :: WWW/HTTP',
2 changes: 1 addition & 1 deletion tests/client/conftest.py
@@ -66,8 +66,8 @@ def project(client):
     return client.get_project(TEST_PROJECT_ID)
 
 
-@my_vcr.use_cassette()
 @pytest.fixture(scope='session')
+@my_vcr.use_cassette()
 def spider(project, request):
     # on normal conditions you can't create a new spider this way:
     # it can only be created on project deploy as usual
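
The swap above is an ordering fix, not a cosmetic one: decorators apply bottom-up, so ``@pytest.fixture`` has to be outermost for pytest to register the cassette-wrapped function; the old order handed VCR the object returned by ``pytest.fixture`` instead of the fixture body. A toy model with stand-in decorators (both hypothetical, for illustration only)::

    import functools

    def use_cassette(func):            # stand-in for my_vcr.use_cassette()
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # the real decorator records/replays HTTP around the call
            return func(*args, **kwargs)
        return wrapper

    def fixture(func):                 # stand-in for pytest.fixture(...)
        func._is_fixture = True        # pytest registers what it receives here
        return func

    @fixture        # applied second: registers the already-wrapped function
    @use_cassette   # applied first: wraps the plain fixture body
    def spider():
        return 'test-spider'
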
6 changes: 3 additions & 3 deletions tests/client/test_frontiers.py
@@ -1,8 +1,8 @@
 import time
 from types import GeneratorType
-from collections import Iterable
 
 from six import string_types
+from six.moves import collections_abc
 
 from scrapinghub.client.frontiers import Frontiers, Frontier, FrontierSlot
 from ..conftest import TEST_FRONTIER_SLOT
@@ -36,7 +36,7 @@ def test_frontiers(project, frontier, frontier_name):
 
     # test for iter() method
     frontiers_names = frontiers.iter()
-    assert isinstance(frontiers_names, Iterable)
+    assert isinstance(frontiers_names, collections_abc.Iterable)
     assert frontier_name in list(frontiers_names)
 
     # test for list() method
@@ -58,7 +58,7 @@ def test_frontier(project, frontier):
     _add_test_requests_to_frontier(frontier)
 
     slots = frontier.iter()
-    assert isinstance(slots, Iterable)
+    assert isinstance(slots, collections_abc.Iterable)
     assert TEST_FRONTIER_SLOT in list(slots)
 
     slots = frontier.list()
5 changes: 2 additions & 3 deletions tests/client/test_job.py
@@ -1,6 +1,5 @@
-from collections import Iterator
-
 import pytest
+from six.moves import collections_abc
 
 from scrapinghub.client.items import Items
 from scrapinghub.client.jobs import Job
@@ -224,7 +223,7 @@ def test_metadata_delete(spider):
 def test_metadata_iter_list(spider):
     job = spider.jobs.run(meta={'meta1': 'data1', 'meta2': 'data2'})
     meta_iter = job.metadata.iter()
-    assert isinstance(meta_iter, Iterator)
+    assert isinstance(meta_iter, collections_abc.Iterator)
     meta_list = job.metadata.list()
     assert ('meta1', 'data1') in meta_list
     assert ('meta2', 'data2') in meta_list
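
The metadata test checks for an ``Iterator`` rather than a plain ``Iterable``; the distinction, in a self-contained snippet using the same ``six.moves`` import::

    from six.moves import collections_abc

    pairs = [('meta1', 'data1'), ('meta2', 'data2')]
    assert isinstance(pairs, collections_abc.Iterable)      # a list is iterable
    assert not isinstance(pairs, collections_abc.Iterator)  # but not an iterator

    meta_iter = iter(pairs)
    assert isinstance(meta_iter, collections_abc.Iterator)  # lazy, single pass
    assert next(meta_iter) == ('meta1', 'data1')
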
6 changes: 3 additions & 3 deletions tests/client/test_projects.py
@@ -1,10 +1,10 @@
 import types
-from collections import defaultdict, Iterator
+from collections import defaultdict
 
 import pytest
 import responses
-from six.moves import range
 from requests.compat import urljoin
+from six.moves import range, collections_abc
 
 from scrapinghub import ScrapinghubClient
 from scrapinghub.client.activity import Activity
@@ -288,7 +288,7 @@ def test_settings_delete(project):
 def test_settings_iter_list(project):
     project.settings.set('job_runtime_limit', 24)
     settings_iter = project.settings.iter()
-    assert isinstance(settings_iter, Iterator)
+    assert isinstance(settings_iter, collections_abc.Iterator)
     settings_list = project.settings.list()
     assert ('job_runtime_limit', 24) in settings_list
     assert settings_list == list(settings_iter)
2 changes: 1 addition & 1 deletion tests/hubstorage/conftest.py
@@ -62,8 +62,8 @@ def hsproject(hsclient):
     return hsclient.get_project(TEST_PROJECT_ID)
 
 
-@my_vcr.use_cassette()
 @pytest.fixture(scope='session')
+@my_vcr.use_cassette()
 def hsspiderid(hsproject):
     return str(hsproject.ids.spider(TEST_SPIDER_NAME, create=1))
4 changes: 2 additions & 2 deletions tox.ini
@@ -4,12 +4,12 @@
 # and then run "tox" from this directory.
 
 [tox]
-envlist = py{36,py3,37,38,39}-{json,msgpack}
+envlist = py{27,35,36,py3,37,38,39}-{json,msgpack}
 
 [testenv]
 deps =
     -r{toxinidir}/requirements-base.txt
     -r{toxinidir}/requirements-test.txt
     msgpack: -r{toxinidir}/requirements.txt
     pypy-msgpack: -r{toxinidir}/requirements-pypy.txt
-commands = py.test --cov=scrapinghub --cov-report=xml
+commands = py.test --cov=scrapinghub --cov-report=xml {posargs: scrapinghub tests}