OpenAI Adds WebSocket Support to the Responses API, Accelerating Long-Chain Tasks by 40%

Summary

OpenAI has added WebSocket support to its Responses API, improving long-chain task execution speed by roughly 40%. The mode maintains a persistent connection and supports incremental input, so workflows with more than 20 tool calls run faster. It is compatible with the Zero Data Retention (ZDR) standard and allows context resumption via previous_response_id. Each connection is limited to a maximum duration of 60 minutes.

PANews, February 25: According to OpenAI’s developer documentation, OpenAI has introduced a WebSocket mode for its Responses API, optimized for complex workflows that make frequent tool calls. By keeping a persistent connection open and supporting incremental input, the mode speeds up execution by approximately 40% for task chains involving more than 20 tool calls. The WebSocket mode is also compatible with the Zero Data Retention (ZDR) standard and enables low-latency context continuation via previous_response_id. Each connection is currently limited to a maximum duration of 60 minutes.

Disclaimer: The information on this page may have been obtained from third parties and does not necessarily reflect the views or opinions of KuCoin. This content is provided for general informational purposes only, without any representation or warranty of any kind, nor shall it be construed as financial or investment advice. KuCoin shall not be liable for any errors or omissions, or for any outcomes resulting from the use of this information. Investments in digital assets can be risky. Please carefully evaluate the risks of a product and your risk tolerance based on your own financial circumstances. For more information, please refer to our Terms of Use and Risk Disclosure.