OpenAI has introduced a new Batch API that offers up to a 50 percent discount on asynchronous tasks such as summarization, translation, and image classification. With the Batch API, results for large numbers of API requests are returned within 24 hours. To use it, you upload a JSONL file containing the requests in batch format. Currently, only the /v1/chat/completions endpoint is supported. In addition to the request file and endpoint, you specify a 24-hour completion window and can optionally attach custom metadata to the batch. The new Batch API reduces costs and provides higher rate limits for asynchronous workloads. OpenAI expects this to enable more efficient use of its APIs for applications that send a large number of requests.
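For illustration, here is a minimal sketch of what submitting a batch could look like with the OpenAI Python SDK. The file name, model, and metadata are placeholder assumptions, not details from the announcement.

```python
# Minimal sketch, assuming the OpenAI Python SDK (openai >= 1.x) and a local
# file "requests.jsonl" with one request per line, for example:
# {"custom_id": "req-1", "method": "POST", "url": "/v1/chat/completions",
#  "body": {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Summarize ..."}]}}
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload the JSONL file containing the batched requests
batch_file = client.files.create(
    file=open("requests.jsonl", "rb"),
    purpose="batch",
)

# Create the batch: request file, endpoint, 24-hour completion window, optional metadata
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={"description": "nightly summarization job"},  # hypothetical metadata
)

print(batch.id, batch.status)  # poll client.batches.retrieve(batch.id) until it completes
```

Once the batch finishes within the 24-hour window, the results can be downloaded as an output file and matched back to the original requests via their custom IDs.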