TypeError: issubclass() arg 1 must be a class #1936

@aardvarkk

Description

Confirm this is an issue with the Python library and not the underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

When calling OpenAI.beta.chat.completions.parse in a highly concurrent environment with a class as the response_format, I get the following error:

    response = self.client.beta.chat.completions.parse(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 156, in parse
    return self._post(
           ^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1280, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 957, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1063, in _request
    return self._process_response(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1162, in _process_response
    return api_response.parse()
           ^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_response.py", line 319, in parse
    parsed = self._options.post_parser(parsed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 150, in parser
    return _parse_chat_completion(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 122, in parse_chat_completion
    construct_type_unchecked(
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 445, in construct_type_unchecked
    return cast(_T, construct_type(value=value, type_=type_))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 519, in construct_type
    return type_.construct(**value)  # type: ignore[arg-type]
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 230, in construct
    fields_values[name] = _construct_field(value=values[key], field=field, key=key)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 394, in _construct_field
    return construct_type(value=value, type_=type_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 513, in construct_type
    if not is_literal_type(type_) and (issubclass(origin, BaseModel) or issubclass(origin, GenericModel)):
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class

This only happens in a multi-threaded environment. I'm churning through a large dataset, so I use about 100 threads to make these API requests in parallel; if I reduce the thread count to 1, the error goes away.
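For what it's worth, the message in the traceback is exactly what issubclass() raises whenever its first argument isn't a real class — e.g. if the race leaves the resolved type origin as None instead of a class. A minimal standalone illustration (not the library's actual code):

```python
# Minimal illustration of the failure mode (not the library's code):
# issubclass() raises this TypeError whenever its first argument is
# not a class object.
origin = None  # stand-in for whatever the race leaves in place of a class

try:
    issubclass(origin, BaseException)
except TypeError as exc:
    print(exc)  # issubclass() arg 1 must be a class
```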

To work around this, I believe I can stop using the beta parse method with the provided response_format.
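Concretely, the workaround I have in mind looks something like the following sketch: call the non-beta endpoint and validate the JSON myself with pydantic, so the library's shared response-construction path is never involved. The sample payload below stands in for the real completion content:

```python
from enum import Enum

from pydantic import BaseModel

class Bar(Enum):
    C = "C"

class Qux(Enum):
    D = "D"

class Foo(BaseModel):
    a: Bar
    b: Qux

# In the real code, `content` would come from a plain
# client.chat.completions.create(...) call instead of .parse():
#   content = completion.choices[0].message.content
content = '{"a": "C", "b": "D"}'  # sample payload standing in for the API response

# Validation happens entirely in my own thread, with no shared library state.
foo = Foo.model_validate_json(content)
print(foo.a, foo.b)
```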

To Reproduce

  1. Call OpenAI.beta.chat.completions.parse with 100 threads simultaneously

Code snippets

from enum import Enum

from openai import OpenAI
from pydantic import BaseModel

class Bar(Enum):
    C = "C"

class Qux(Enum):
    D = "D"

class Foo(BaseModel):
    a: Bar
    b: Qux

def main():  # Called by 100 threads concurrently
    OpenAI(api_key="...").beta.chat.completions.parse(
        model="gpt-4o-mini",
        seed=0,
        temperature=0,
        messages=[
            {"role": "system", "content": "..."},
            {"role": "user", "content": "..."},
        ],
        response_format=Foo,
    )
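For completeness, the 100-thread fan-out around main() is roughly the following (run_parallel is a hypothetical helper name, not part of my actual code):

```python
from concurrent.futures import ThreadPoolExecutor

def run_parallel(task, n_threads=100):
    # Invoke `task` once per worker thread, mirroring how main() above
    # is called concurrently in the repro.
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        futures = [pool.submit(task) for _ in range(n_threads)]
        return [f.result() for f in futures]

# In the repro: run_parallel(main, n_threads=100)
```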

OS

Ubuntu 22.04.4 LTS

Python version

Python v3.11.10

Library version

openai v1.57.2

    Labels

    bug: Something isn't working