
Windows: calling LM Studio (127.0.0.1:1234/v1) returns 502 #3044

@susky900213

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

(base) PS C:\WINDOWS\system32> pip show openai
Name: openai
Version: 2.30.0
Summary: The official Python library for the openai API
Home-page: https://github.com/openai/openai-python
Author:
Author-email: OpenAI support@openai.com
License: Apache-2.0
Location: C:\Users\yun\miniconda3\Lib\site-packages
Requires: anyio, distro, httpx, jiter, pydantic, sniffio, tqdm, typing-extensions
Required-by: agentscope, agentscope-runtime, reme_ai

Traceback (most recent call last):
File "C:\work\test.py", line 21, in test_lm_studio
response = await client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<3 lines>...
)
^
File "C:\Users\yun\miniconda3\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2714, in create
return await self._post(
^^^^^^^^^^^^^^^^^
...<49 lines>...
)
^
File "C:\Users\yun\miniconda3\Lib\site-packages\openai\_base_client.py", line 1884, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yun\miniconda3\Lib\site-packages\openai\_base_client.py", line 1669, in request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502

To Reproduce

(Same `pip show openai` output and traceback as above.)
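A 502 (Bad Gateway) is normally produced by an intermediary, not by a local server answering directly; on Windows a common culprit is an `HTTP_PROXY`/`HTTPS_PROXY` environment variable routing the 127.0.0.1 request through a proxy. A quick way to narrow this down, independent of the openai library, is to probe the endpoint with the standard library while explicitly bypassing any proxy settings (a diagnostic sketch, not part of the original report; the URL assumes LM Studio's default port):

```python
import urllib.error
import urllib.request

def probe(url: str, timeout: float = 5.0):
    """GET a URL with proxies disabled.

    Returns (status, body) if the server answered (even with an error
    status such as 502), or (None, reason) if it was unreachable.
    """
    # An empty ProxyHandler overrides HTTP_PROXY/HTTPS_PROXY from the env.
    opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
    try:
        with opener.open(url, timeout=timeout) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        # The server (or a proxy) answered with an error status.
        return e.code, e.read().decode("utf-8", "replace")
    except (urllib.error.URLError, OSError) as e:
        return None, str(e)

if __name__ == "__main__":
    status, detail = probe("http://127.0.0.1:1234/v1/models")
    print(status, detail[:200])
```

If this prints `200` with a model list while the openai client still gets a 502, the request is likely being intercepted (proxy env vars, or `NO_PROXY` missing `127.0.0.1`); if it also fails, LM Studio's server is not serving on that address.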

Code snippets

OS

Windows 10

Python version

Python 3.13.12

Library version

2.30.0
