Short

According to a recent survey by ZoomRx, two-thirds of the 20 largest pharmaceutical companies prohibit their employees from using ChatGPT, citing concerns about the security of sensitive internal data. Nevertheless, many life sciences professionals use ChatGPT regularly, even though 83 percent say the technology is "overrated." Despite the potential efficiency gains AI could bring to drug development, most pharmaceutical companies remain cautious because of security risks and see AI primarily as a way to cut costs, while concerns about data security and privacy persist. But there are exceptions: according to OpenAI, Moderna is using ChatGPT in a pilot project called Dose ID to analyze and visualize large data sets and determine the optimal vaccine dose.

Short

OpenAI has filed a motion to dismiss Elon Musk's lawsuit, Bloomberg reports. OpenAI calls Musk's claim that the company abandoned its altruistic principles in favor of profits "revisionist history" and argues that Musk wants to exploit OpenAI's success for his own competing AI company, xAI. In his lawsuit, Musk quotes OpenAI's founding charter, in which the company promises to make its products open source for the benefit of the general public. OpenAI counters that it never promised not to monetize its technology, and that the charter does not treat open sourcing as an absolute obligation: it applies only where open source is actually useful, a question that must be reassessed on an ongoing basis. According to OpenAI, its partnership with Microsoft does not violate any agreements. A hearing on the motion to dismiss is scheduled for April 24, Bloomberg reports.

Short

OpenAI has announced major updates to its Assistants API. The API now includes an improved retrieval tool called file_search, which can ingest up to 10,000 files per Assistant, a 500x increase over the previous version. Alongside file_search, OpenAI introduces vector_store objects that automate file parsing, chunking, and embedding for seamless search. The update also adds granular control over token usage, support for common model configuration parameters such as temperature and top_p, and the ability to use fine-tuned models, initially limited to gpt-3.5-turbo-0125. The API now supports streaming, and streaming and polling helpers have been added to the OpenAI Node and Python SDKs. Developers should consult the migration guide to learn how to upgrade their tool usage to the latest version of the Assistants API.
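
As a rough illustration, here is a minimal sketch of how file_search and a vector_store might be wired together with the OpenAI Python SDK; the model name, file names, and vector store name are illustrative assumptions rather than details from the announcement, and SDK namespaces may differ slightly between versions.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Create a vector store and upload files to it; parsing, chunking, and
# embedding are handled automatically (file names here are placeholders).
vector_store = client.beta.vector_stores.create(name="product-docs")
client.beta.vector_stores.file_batches.upload_and_poll(
    vector_store_id=vector_store.id,
    files=[open("manual.pdf", "rb"), open("faq.pdf", "rb")],
)

# Create an Assistant that answers questions via the file_search tool,
# backed by the vector store created above.
assistant = client.beta.assistants.create(
    model="gpt-4-turbo",
    instructions="Answer questions using the attached documentation.",
    tools=[{"type": "file_search"}],
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)
```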

Short

OpenAI introduces a new Batch API that offers a discount of up to 50 percent for asynchronous API tasks such as summarization, translation, and image classification. With the Batch API, results for large numbers of API requests are returned within 24 hours. To use it, you upload a JSONL file containing the requests in batch format; currently, only the /v1/chat/completions endpoint is supported. In addition to the request file and endpoint, you must specify a 24-hour completion window, and custom metadata can optionally be attached to the batch. The new Batch API reduces costs and provides higher limits for asynchronous tasks, which OpenAI expects will enable more efficient use of its APIs for applications that require a large number of requests.
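
A minimal sketch of this workflow with the OpenAI Python SDK, assuming a prepared requests.jsonl file in the documented batch request format; the file name and metadata values are illustrative assumptions.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Each line of the JSONL file is one request, e.g.:
# {"custom_id": "req-1", "method": "POST", "url": "/v1/chat/completions",
#  "body": {"model": "gpt-3.5-turbo-0125",
#           "messages": [{"role": "user", "content": "Summarize ..."}]}}
batch_file = client.files.create(
    file=open("requests.jsonl", "rb"),
    purpose="batch",
)

# Create the batch job: endpoint, 24-hour completion window, optional metadata.
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={"job": "nightly-summaries"},  # illustrative metadata
)

# Later: poll the batch status and download the results file once completed.
status = client.batches.retrieve(batch.id)
print(status.status)  # e.g. "validating", "in_progress", "completed"
```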
