Microsoft brings OpenAI-powered Deep Research to Azure AI Foundry agents
Tuesday, July 8, 2025, 01:26 PM, from InfoWorld
Microsoft has added the OpenAI-developed Deep Research capability to its Azure AI Foundry Agent Service to help enterprises integrate research automation into their business applications.
The integration is made possible by the Deep Research API and SDK, which developers can use to embed, extend, and orchestrate Deep Research-as-a-service across an enterprise's ecosystem, including its data and existing systems, Yina Arenas, VP of product at Microsoft's Core AI division, wrote in a blog post.

Developers can use Deep Research to automate large-scale, source-traceable insights; programmatically build and deploy agents as services invokable by apps, workflows, or other agents; and orchestrate complex tasks using Logic Apps, Azure Functions, and Foundry connectors, Arenas added. Essentially, the new capability is designed to help enterprises enhance their AI agents to conduct deeper analysis of complex data, enabling better decision-making and productivity, said Charlie Dai, vice president and principal analyst at Forrester. "All major industries will benefit from this, such as investment insights generation for finance, drug discovery acceleration for healthcare, and supply chain optimization for manufacturing," Dai added.

How does Deep Research work?

At its core, Deep Research combines OpenAI and Microsoft technologies, such as the o3-deep-research model, other GPT models, and Grounding with Bing Search, when integrated into an agent. When an agent with Deep Research receives a research request, whether from a user or another application, it taps GPT-4o and GPT-4.1 to interpret the intent, fill in any missing details, and define a clear, actionable scope for the task. Once the task has been defined, the agent activates the Bing-powered grounding tool to retrieve a refined selection of recent, high-quality web content.
After this step, the o3-deep-research model initiates the research process, reasoning through the gathered information. Rather than simply summarizing content, it evaluates, adapts, and synthesizes insights across multiple sources, adjusting its approach as new data emerges. The process culminates in a structured report that documents not only the answer, but also the model's reasoning path, source citations, and any clarifications requested during the session, Arenas explained.

Competition, pricing, and availability

Microsoft isn't the only hyperscaler offering a deep research capability. "Google Cloud already provides Gemini Deep Research with its Gemini 2.5 Pro. AWS hasn't offered cloud services on it, but it showcased Bedrock Deep Researcher as a sample application to automate the generation of articles and reports," Dai said. Microsoft itself offers the deep research capability inside its office suite of applications, as Researcher in Microsoft 365 Copilot. OpenAI, too, has added deep research to its generative AI-based assistant, ChatGPT.

In terms of pricing, Deep Research inside the Azure AI Foundry Agent Service will cost enterprises $10 per million input tokens and $40 per million output tokens for the o3-deep-research model alone. Cached inputs for the model will cost $2.50 per million tokens, the company said. Enterprises will also incur separate charges for Grounding with Bing Search and for the base GPT model used for clarifying questions, it added.
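The multi-step pipeline described above (intent clarification, Bing-powered grounding, then deep reasoning and report generation) can be sketched in plain Python. This is an illustrative stand-in only: the function names and report structure are hypothetical, while the actual service wires GPT-4o/GPT-4.1, Grounding with Bing Search, and o3-deep-research together inside Azure AI Foundry.

```python
# Hypothetical sketch of the Deep Research agent flow; not the Azure SDK.

def clarify_scope(request: str) -> str:
    """Stand-in for the GPT-4o/GPT-4.1 step: interpret intent, fill in
    missing details, and define an actionable research scope."""
    return f"scope: {request}"

def ground_with_search(scope: str) -> list:
    """Stand-in for the Bing-powered grounding step that retrieves a
    refined selection of recent, high-quality web content."""
    return [f"source-1 ({scope})", f"source-2 ({scope})"]

def deep_research(scope: str, sources: list) -> dict:
    """Stand-in for o3-deep-research: reasons over the gathered material
    and synthesizes insights rather than summarizing them."""
    return {
        "answer": f"synthesized findings on {scope}",
        "reasoning_path": ["evaluate sources", "adapt approach", "synthesize"],
        "citations": sources,  # the final report is source-traceable
    }

def run_agent(request: str) -> dict:
    scope = clarify_scope(request)
    sources = ground_with_search(scope)
    return deep_research(scope, sources)

report = run_agent("EV battery supply chain risks")
print(report["citations"])
```

The key point the sketch captures is that the output is not a bare answer but a structured report carrying the reasoning path and citations alongside it.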
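To make the o3-deep-research pricing concrete, here is a small cost sketch using only the per-token rates quoted above. The workload figures in the example are made up for illustration, and the separate charges for Grounding with Bing Search and the base GPT model are not modeled here.

```python
# Listed o3-deep-research rates (USD per 1M tokens).
INPUT_PER_M = 10.00
OUTPUT_PER_M = 40.00
CACHED_INPUT_PER_M = 2.50

def o3_deep_research_cost(input_tokens: int, output_tokens: int,
                          cached_input_tokens: int = 0) -> float:
    """Model-only cost; Bing grounding and the base GPT model bill separately."""
    return (input_tokens / 1e6 * INPUT_PER_M
            + output_tokens / 1e6 * OUTPUT_PER_M
            + cached_input_tokens / 1e6 * CACHED_INPUT_PER_M)

# Hypothetical run: 2M fresh input tokens, 0.5M output tokens,
# 1M cached input tokens -> 20 + 20 + 2.50 = $42.50
print(o3_deep_research_cost(2_000_000, 500_000, 1_000_000))  # 42.5
```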
https://www.infoworld.com/article/4018689/microsoft-brings-openai-powered-deep-research-to-azure-ai-...