❇️ Use prawcore bound to Niquests instead of Requests #2037

Closed
wants to merge 4 commits
20 changes: 10 additions & 10 deletions docs/getting_started/configuration.rst
@@ -23,10 +23,10 @@ Environment variables have the highest priority, followed by keyword arguments t
Using an HTTP or HTTPS proxy with PRAW
--------------------------------------

PRAW internally relies upon the requests_ package to handle HTTP requests. Requests
PRAW internally relies upon the niquests_ package to handle HTTP requests. Niquests
supports use of ``HTTP_PROXY`` and ``HTTPS_PROXY`` environment variables in order to
proxy HTTP and HTTPS requests respectively [`ref
<https://requests.readthedocs.io/en/master/user/advanced/#proxies>`_].
<https://niquests.readthedocs.io/en/latest/user/advanced.html#proxies>`_].

Given that PRAW exclusively communicates with Reddit via HTTPS, only the ``HTTPS_PROXY``
option should be required.
@@ -41,19 +41,19 @@ variable can be provided on the command line like so:
Configuring a custom requests Session
-------------------------------------

PRAW uses requests_ to handle networking. If your use-case requires custom
PRAW uses niquests_ to handle networking. If your use-case requires custom
configuration, it is possible to configure a custom Session_ instance and then use it
with PRAW.

For example, some networks use self-signed SSL certificates when connecting to HTTPS
sites. By default, this would raise an exception in requests_. To use a self-signed SSL
certificate without an exception from requests_, first export the certificate as a
sites. By default, this would raise an exception in niquests_. To use a self-signed SSL
certificate without an exception from niquests_, first export the certificate as a
``.pem`` file. Then configure PRAW like so:

.. code-block:: python

import praw
from requests import Session
from niquests import Session


session = Session()
@@ -69,11 +69,11 @@ certificate without an exception from requests_, first export the certificate as

The code above creates a custom Session_ instance and `configures it to use a custom
certificate
<https://requests.readthedocs.io/en/master/user/advanced/#ssl-cert-verification>`_, then
passes it as a parameter when creating the :class:`.Reddit` instance. Note that the
<https://niquests.readthedocs.io/en/latest/user/advanced.html#ssl-cert-verification>`_,
then passes it as a parameter when creating the :class:`.Reddit` instance. Note that the
example above uses a :ref:`password_flow` authentication type, but this method will work
for any authentication type.

.. _requests: https://requests.readthedocs.io
.. _niquests: https://niquests.readthedocs.io

.. _session: https://2.python-requests.org/en/master/api/#requests.Session
.. _session: https://niquests.readthedocs.io/en/latest/user/advanced.html
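For reference, a minimal end-to-end sketch of the custom-Session pattern the updated configuration.rst describes, now using Niquests. The certificate path and credentials are placeholders, and the snippet assumes the usual ``requestor_kwargs={"session": ...}`` hook on :class:`.Reddit` from the existing PRAW docs:

.. code-block:: python

    import praw
    from niquests import Session

    session = Session()
    session.verify = "/path/to/certfile.pem"  # trust the exported self-signed certificate

    reddit = praw.Reddit(
        client_id="CLIENT_ID",
        client_secret="CLIENT_SECRET",
        username="USERNAME",
        password="PASSWORD",
        user_agent="custom-session example by u/USERNAME",
        requestor_kwargs={"session": session},  # hand the custom Session to prawcore
    )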
4 changes: 2 additions & 2 deletions docs/getting_started/multiple_instances.rst
@@ -52,8 +52,8 @@ Multiple Threads
PRAW is not thread safe.

In a nutshell, instances of :class:`.Reddit` are not thread-safe for a number of reasons
in its own code and each instance depends on an instance of ``requests.Session``, which
is not thread-safe [`ref <https://github.com/psf/requests/issues/2766>`_].
in its own code and each instance depends on an instance of ``niquests.Session``, which
is thread-safe.

In theory, having a unique :class:`.Reddit` instance for each thread, and making sure
that the instances are used in their respective threads only, will work.
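A rough sketch of the "one :class:`.Reddit` instance per thread" approach this section suggests, with placeholder credentials; each instance (and its underlying ``niquests.Session``) stays confined to its own thread, which is a working pattern in theory rather than an officially supported one:

.. code-block:: python

    import threading

    import praw


    def worker():
        # Build a dedicated read-only Reddit instance inside the thread and
        # never share it with other threads.
        reddit = praw.Reddit(
            client_id="CLIENT_ID",
            client_secret="CLIENT_SECRET",
            user_agent="per-thread example by u/USERNAME",
        )
        print(reddit.subreddit("redditdev").title)


    threads = [threading.Thread(target=worker) for _ in range(4)]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()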
4 changes: 2 additions & 2 deletions docs/tutorials/refresh_token.rst
@@ -14,9 +14,9 @@ following:

.. code-block:: python

import requests
import niquests

response = requests.get(
response = niquests.get(
"https://www.reddit.com/api/v1/scopes.json",
headers={"User-Agent": "fetch-scopes by u/bboe"},
)
31 changes: 20 additions & 11 deletions praw/models/reddit/subreddit.py
@@ -13,10 +13,9 @@
from warnings import warn
from xml.etree.ElementTree import XML

import websocket
from niquests.exceptions import HTTPError, ReadTimeout, Timeout
from prawcore import Redirect
from prawcore.exceptions import ServerError
from requests.exceptions import HTTPError

from ...const import API_PATH, JPEG_HEADER
from ...exceptions import (
@@ -3067,30 +3066,40 @@ def _submit_media(
"""
response = self._reddit.post(API_PATH["submit"], data=data)
websocket_url = response["json"]["data"]["websocket_url"]
connection = None
ws_response = None
if websocket_url is not None and not without_websockets:
try:
connection = websocket.create_connection(websocket_url, timeout=timeout)
ws_response = self._reddit._core._requestor._http.get(
websocket_url,
timeout=timeout,
).raise_for_status()
except (
OSError,
websocket.WebSocketException,
BlockingIOError,
HTTPError,
Timeout,
) as ws_exception:
msg = "Error establishing websocket connection."
raise WebSocketException(msg, ws_exception) from None

if connection is None:
if ws_response is None:
return None

try:
ws_update = loads(connection.recv())
connection.close()
except (OSError, websocket.WebSocketException, BlockingIOError) as ws_exception:
ws_update = loads(ws_response.extension.next_payload())
except (
ReadTimeout,
HTTPError,
) as ws_exception:
msg = "Websocket error. Check your media file. Your post may still have been created."
raise WebSocketException(
msg,
ws_exception,
) from None
finally:
if (
ws_response.extension is not None
and ws_response.extension.closed is False
):
ws_response.extension.close()
if ws_update.get("type") == "failed":
raise MediaPostFailed
url = ws_update["payload"]["redirect"]
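For context, a standalone sketch of the Niquests WebSocket flow the new ``_submit_media`` code relies on, limited to the calls visible in this diff (the URL is a placeholder, and the extension behaviour is assumed from this PR rather than from independent documentation):

    import niquests

    # Requesting a wss:// URL upgrades the connection; the WebSocket is then
    # exposed through response.extension instead of response.content.
    response = niquests.get("wss://ws.example.invalid/updates", timeout=10)
    response.raise_for_status()

    try:
        message = response.extension.next_payload()  # block until the next frame
        print(message)
    finally:
        if response.extension is not None and not response.extension.closed:
            response.extension.close()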
13 changes: 8 additions & 5 deletions pyproject.toml
@@ -21,9 +21,9 @@ classifiers = [
"Topic :: Utilities"
]
dependencies = [
"prawcore >=2.4, <3",
"prawcore@git+https://github.com/Ousret/prawcore@feat-niquests",
"update_checker >=0.18",
"websocket-client >=0.54.0"
"niquests[ws]>=3.10,<4"
]
dynamic = ["version", "description"]
keywords = ["reddit", "api", "wrapper"]
@@ -56,9 +56,7 @@ readthedocs = [
test = [
"betamax >=0.8, <0.9",
"betamax-matchers >=0.3.0, <0.5",
"pytest >=2.7.3",
"requests >=2.20.1, <3",
"urllib3 ==1.26.*, <2"
"pytest >=2.7.3"
]

[project.urls]
@@ -78,6 +76,11 @@ extend_exclude = ['./docs/examples/']
profile = 'black'
skip_glob = '.venv*'

[tool.pytest.ini_options]
# This avoids pytest loading betamax+Requests at startup,
# which lets us patch Betamax so that it uses Niquests instead.
addopts = "-p no:pytest-betamax"

[tool.ruff]
target-version = "py38"
include = [
82 changes: 81 additions & 1 deletion tests/conftest.py
@@ -4,7 +4,51 @@
import socket
import time
from base64 import b64encode
from sys import platform
from sys import platform, modules
from urllib.parse import urlparse

import requests
import niquests
import urllib3

# Betamax is tied to Requests, and Niquests is almost entirely
# compatible with it, so we can fool Betamax without much effort.
modules["requests"] = niquests
modules["requests.adapters"] = niquests.adapters
modules["requests.models"] = niquests.models
modules["requests.exceptions"] = niquests.exceptions
modules["requests.packages.urllib3"] = urllib3

# Niquests no longer has a compat submodule, but Betamax needs one.
# No worries: since Betamax explicitly depends on Requests, we can
# hand it the real requests.compat module.
modules["requests.compat"] = requests.compat

# Doing the import now makes Betamax work with Niquests,
# with no extra effort.
import betamax

# The base mock does not implement close(), which our HTTP client
# requires, so stub it with a no-op.
betamax.mock_response.MockHTTPResponse.close = lambda _: None


# Betamax has a tiny bug in its URI matcher:
# https://example.com != https://example.com/
# and Niquests does not enforce the trailing '/'
# when preparing a Request.
def _patched_parse(self, uri):
parsed = urlparse(uri)
return {
"scheme": parsed.scheme,
"netloc": parsed.netloc,
"path": parsed.path or "/",
"fragment": parsed.fragment,
}


betamax.matchers.uri.URIMatcher.parse = _patched_parse

import pytest

@@ -31,6 +75,42 @@ def _get_path(name):
return _get_path


@pytest.fixture(autouse=True)
def lax_content_length_strict(monkeypatch):
import io
import base64
from betamax.util import body_io
from urllib3 import HTTPResponse
from betamax.mock_response import MockHTTPResponse

# Some of our cassettes are pretty much broken: the declared
# Content-Length does not always match the body, so disable
# enforced content-length here.
def _patched_add_urllib3_response(serialized, response, headers):
if "base64_string" in serialized["body"]:
body = io.BytesIO(
base64.b64decode(serialized["body"]["base64_string"].encode())
)
else:
body = body_io(**serialized["body"])

h = HTTPResponse(
body,
status=response.status_code,
reason=response.reason,
headers=headers,
preload_content=False,
original_response=MockHTTPResponse(headers),
enforce_content_length=False,
)

response.raw = h

monkeypatch.setattr(
betamax.util, "add_urllib3_response", _patched_add_urllib3_response
)


def pytest_configure(config):
pytest.placeholders = Placeholders(placeholders)
config.addinivalue_line(
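As an aside, a tiny standalone illustration of the ``sys.modules`` aliasing trick the conftest above relies on; the snippet is hypothetical but uses only standard Python import machinery:

    import sys

    import niquests

    # Install the alias before anything imports "requests"...
    sys.modules["requests"] = niquests

    # ...so later imports, including those inside third-party code such as
    # Betamax, transparently resolve to Niquests.
    import requests

    assert requests is niquests
    session = requests.Session()  # actually a niquests.Session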
4 changes: 2 additions & 2 deletions tests/integration/__init__.py
@@ -4,8 +4,8 @@
from urllib.parse import quote_plus

import betamax
import niquests
import pytest
import requests
from betamax.cassette import Cassette

from praw import Reddit
@@ -70,7 +70,7 @@ def read_only(self, reddit):
@pytest.fixture(autouse=True)
def recorder(self):
"""Configure Betamax."""
session = requests.Session()
session = niquests.Session()
recorder = betamax.Betamax(session)
recorder.register_serializer(PrettyJSONSerializer)
with betamax.Betamax.configure() as config: