Microsoft aims to extend its ecosystem of AI-powered apps and services, called “copilots,” with plug-ins from third-party developers.
Today at its annual Build conference, Microsoft announced that it’s adopting the same plug-in standard its close collaborator, OpenAI, introduced for ChatGPT, its AI-powered chatbot — allowing developers to build plug-ins that work across ChatGPT, Bing Chat (on the web and in the Microsoft Edge sidebar), Dynamics 365 Copilot, Microsoft 365 Copilot and the newly launched Windows Copilot.
“I think over the coming years, this will become an expectation for how all software works,” Kevin Scott, Microsoft’s CTO, said in a blog post shared with TechCrunch last week.
Bold pronouncements aside, the new plug-in framework lets Microsoft’s family of “copilots” — apps that use AI to assist users with various tasks, such as writing an email or generating images — interact with a range of different software and services. Using IDEs like Visual Studio, Codespaces and Visual Studio Code, developers can build plug-ins that retrieve real-time information, incorporate company or other business data and take action on a user’s behalf.
A plug-in could let the Microsoft 365 Copilot, for example, make arrangements for a trip in line with a company’s travel policy, query a site like WolframAlpha to solve an equation or answer questions about how certain legal issues at a firm were handled in the past.
Customers in the Microsoft 365 Copilot Early Access Program (plus ChatGPT Plus subscribers) will gain access to new plug-ins from partners in the coming weeks, including Atlassian, Adobe, ServiceNow, Thomson Reuters, Moveworks and Mural. Bing Chat, meanwhile, will see new plug-ins added to its existing collection from Instacart, Kayak, Klarna, Redfin and Zillow, and those same Bing Chat plug-ins will come to Windows within Windows Copilot.
The OpenTable plug-in allows Bing Chat to search across restaurants for available bookings, for example, while the Instacart plug-in lets the chatbot take a dinner menu, turn it into a shopping list and place an order to get the ingredients delivered. Meanwhile, the new Bing plug-in brings web and search data from Bing into ChatGPT, complete with citations.
A new framework
Scott describes plug-ins as a bridge between an AI system, like ChatGPT, and data a third party wants to keep private or proprietary. A plug-in gives an AI system access to those private files, enabling it to, for example, answer a question about business-specific data.
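To make that "bridge" pattern concrete, here's an illustrative sketch — not Microsoft's or OpenAI's actual code — of the sort of small web API a plug-in typically wraps: a single endpoint over private, business-specific records that a copilot could query at answer time instead of relying on its training data. The endpoint path, data store and field names are all hypothetical.

```python
# Hypothetical example of the "bridge" pattern: a tiny web API exposing private,
# business-specific records so a copilot plug-in can fetch them at answer time.
# The endpoint, data store and field names are illustrative, not taken from
# Microsoft's or OpenAI's actual plug-in code.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Acme internal knowledge plug-in API")

# Stand-in for a private data store the model was never trained on.
LEGAL_CASES = {
    "noncompete-2021": "Settled out of court; hiring policy updated in Q3 2021.",
    "trademark-2022": "Resolved with a cease-and-desist letter; no litigation.",
}

class Answer(BaseModel):
    query: str
    matches: list[str]

@app.get("/legal/search", response_model=Answer)
def search_legal_cases(q: str) -> Answer:
    """Return internal case notes matching the query; the copilot can cite
    these instead of guessing from its training data."""
    needle = q.lower()
    hits = [note for key, note in LEGAL_CASES.items()
            if needle in key or needle in note.lower()]
    return Answer(query=q, matches=hits)
```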
There’s certainly growing demand for such a bridge as privacy becomes a major issue with generative AI, which has a tendency to leak sensitive data, like phone numbers and email addresses, from the datasets on which it was trained. Looking to minimize risk, companies including Apple and Samsung have banned employees from using ChatGPT and similar AI tools over concerns that staff might mishandle confidential data and leak it to the system.
“What a plugin does is it says, ‘Hey, we want to make that pattern reusable and set some boundaries about how it gets used,’” John Montgomery, CVP of AI platform at Microsoft, said in a canned statement.
There are three types of plug-ins within Microsoft’s new framework: ChatGPT plug-ins, Microsoft Teams message extensions and Power Platform connectors.
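For the ChatGPT-style plug-ins on that list, OpenAI’s published approach pairs a short manifest file (ai-plugin.json) with an OpenAPI description of the underlying service; the chat host reads both to decide when and how to call the plug-in. Below is a rough, hypothetical sketch of such a manifest, written out in Python for illustration — the service name, URLs and contact details are placeholders, and the exact fields Microsoft’s copilots will expect may differ.

```python
import json

# Hypothetical manifest in the style OpenAI documented for ChatGPT plug-ins:
# a human- and model-facing description plus a pointer to an OpenAPI spec.
# Every value here (names, URLs, email) is a placeholder, not a real service.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Acme Travel",
    "name_for_model": "acme_travel",
    "description_for_human": "Book trips that comply with Acme's travel policy.",
    "description_for_model": (
        "Search flights and hotels and check them against Acme's corporate "
        "travel policy before suggesting a booking."
    ),
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://plugin.example.com/openapi.yaml"},
    "logo_url": "https://plugin.example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

# The manifest is conventionally served from the plug-in's own domain at
# /.well-known/ai-plugin.json so the chat host can discover it.
print(json.dumps(manifest, indent=2))
```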
Teams message extensions, which allow users to interact with a web service through buttons and forms in Teams, aren’t new. Nor are Power Platform connectors, which act as a wrapper around an API that allows the underlying service to “talk” to apps in Microsoft’s Power Platform portfolio (e.g., Power Automate). But Microsoft’s expanding their reach, letting developers tap new and existing message extensions and connectors to extend Microsoft 365 Copilot, the company’s assistant feature for Microsoft 365 apps and services like Word, Excel and PowerPoint.
For instance, Power Platform connectors can be used to import structured data into the “Dataverse,” Microsoft’s service that stores and manages data used by internal business apps, which Microsoft 365 Copilot can then access. In a demo during Build, Microsoft showed how Dentsu, a public relations firm, tapped Microsoft 365 Copilot together with a plug-in for Jira and data from Atlassian’s Confluence without having to write new code.
Microsoft says that developers will be able to create and debug their own plug-ins in a number of ways, including through its Azure AI family of apps, which is adding capabilities to run and test plug-ins on private enterprise data. Azure OpenAI Service, Microsoft’s managed, enterprise-focused product designed to give businesses access to OpenAI’s technologies with added governance features, will also support plug-ins. And Teams Toolkit for Visual Studio will gain features for piloting plug-ins.
Transitioning to a platform
As for how they’ll be distributed, Microsoft says that developers will be able to configure, publish and manage plug-ins through the Developer Portal for Teams, among other places. They’ll also be able to monetize them, although the company wasn’t clear on how, exactly, pricing will work.
In any case, with plug-ins, Microsoft’s playing for keeps in the highly competitive generative AI race. Plug-ins transform the company’s “copilots” into aggregators, essentially — putting them on a path to becoming one-stop shops for both enterprise and consumer customers.
Microsoft no doubt perceives the lock-in opportunity as increasingly key as the company faces competitive pressure from startups and tech giants alike building generative AI, including Google and Anthropic. One could imagine plug-ins becoming a lucrative new source of revenue down the line as apps and services rely more and more on generative AI. And it could allay the fears of businesses that claim generative AI trained on their data violates their rights; Getty Images and Reddit, among others, have taken steps to prevent companies from training generative AI on their data without some form of compensation.
I’d expect rivals to answer Microsoft’s and OpenAI’s plug-in framework with plug-in frameworks of their own. But Microsoft has a first-mover advantage, as OpenAI had with ChatGPT. And that shouldn’t be underestimated.