OpenAI on Tuesday announced new tools and features for its enterprise-grade API offering, including the Assistants API. The AI firm has made its API suite available to larger businesses that require customized solutions for their workloads. Highlighting some of its key clients such as Klarna, Morgan Stanley, Oscar, Salesforce, and Wix, the company said it launched the new features to better support their needs. Two new tools and six new Assistants API features have been introduced. Alongside these, OpenAI has also provided options for businesses to use its AI products in a more cost-effective manner.

Announcing the update in a blog post, OpenAI said, "We're deepening our support for enterprises with new features that are useful to larger businesses and any developers who are rapidly growing on our platform." The first new tool, called Private Link, is designed to provide advanced security for businesses that share their organizational data with the OpenAI cloud. Private Link provides a direct connection between Microsoft Azure and OpenAI, reducing exposure to the open internet. Additionally, the AI firm has rolled out native Multi-Factor Authentication (MFA) for added security and to help meet access control requirements.

Another tool, designed for better administrative control, has also been introduced. Dubbed Projects, it allows administrators to scope roles and API keys to specific projects, restrict which models are made available, and set usage- and rate-based limits. Project owners will also be able to create service account API keys, which grant access to a project without being tied to an individual user, as illustrated in the sketch below.
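The sketch below shows how such a project-scoped key might be used from code. It assumes a recent version of the official openai Python SDK (which accepts a project identifier on the client); the API key and project ID shown are placeholders, not values from the announcement.

```python
# Minimal sketch: using a project-scoped (service account) API key with the
# official openai Python SDK. The key and project ID are placeholders; real
# values would come from the Projects dashboard described above.
from openai import OpenAI

client = OpenAI(
    api_key="sk-proj-...",      # placeholder service-account / project-scoped key
    project="proj_example123",  # placeholder project ID
)

# Requests made with this client count against the project's own usage and
# rate limits, and only the models enabled for that project are available.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from a scoped project"}],
)
print(response.choices[0].message.content)
```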

OpenAI has also introduced several new features in the Assistants API. The file_search retrieval tool can now draw on up to 10,000 files per assistant, 500 times the previous limit. The company claims it supports parallel queries through multi-threaded searches and offers improved reranking and query rewriting. Streaming support has also been added. In addition, the API gains new vector_store objects for managing file storage and a tool_choice parameter for forcing the use of a specific tool in a given run. The API also supports fine-tuned GPT-3.5 Turbo models.
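A minimal sketch of how these pieces fit together is shown below, assuming a recent openai Python SDK; the beta helper names may differ between SDK versions, and the file name, instructions, and question are placeholders rather than anything from the announcement.

```python
# Sketch of the Assistants API features described above: a vector_store object,
# a file_search-enabled assistant, tool_choice, and streaming. Assumes a recent
# openai Python SDK; "handbook.pdf" and the prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Create a vector_store and attach a document to it (file_search can now
#    draw on up to 10,000 files per assistant).
vector_store = client.beta.vector_stores.create(name="product-docs")
with open("handbook.pdf", "rb") as f:
    client.beta.vector_stores.files.upload_and_poll(
        vector_store_id=vector_store.id, file=f
    )

# 2. Create an assistant that uses file_search backed by that vector store.
assistant = client.beta.assistants.create(
    model="gpt-3.5-turbo",  # a fine-tuned GPT-3.5 Turbo model ID could also be used
    instructions="Answer questions using the attached documentation.",
    tools=[{"type": "file_search"}],
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)

# 3. Run a thread, forcing the file_search tool via tool_choice and streaming
#    the response as it is generated.
thread = client.beta.threads.create(
    messages=[{"role": "user", "content": "What does the handbook say about refunds?"}]
)
with client.beta.threads.runs.stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    tool_choice={"type": "file_search"},
) as stream:
    for delta in stream.text_deltas:
        print(delta, end="", flush=True)
```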

For cost management, the AI firm has incorporated two new methods. First, businesses with sustained token-per-minute usage can request access to provisioned throughput at discounts of up to 50 percent. For non-urgent workloads, customers can use the new Batch API, with requests priced up to 50 percent below standard rates; results are still returned within 24 hours, and the Batch API comes with much higher rate limits.
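Submitting a batch works by uploading a JSONL file of requests and then creating a batch job against it. The sketch below assumes the openai Python SDK; "batch_input.jsonl" is a placeholder file name.

```python
# Minimal sketch of a Batch API submission, assuming the openai Python SDK.
# Each line of the JSONL input file is one queued request.
from openai import OpenAI

client = OpenAI()

# Upload the JSONL file containing the queued requests.
batch_file = client.files.create(
    file=open("batch_input.jsonl", "rb"),
    purpose="batch",
)

# Submit the batch; results are returned asynchronously within 24 hours
# at roughly half the standard per-token price.
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
print(batch.id, batch.status)
```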

