Satya Nadella, the chief executive officer (CEO) of Microsoft, started his keynote at the Ignite 2023 event on November 15 with a light-hearted quip about pulling an all-nighter for the Cricket World Cup semifinal, which took place the same day. What followed was a range of announcements, largely pertaining to the company's work and launches in the artificial intelligence domain.
Here are some of the major highlights:
One of the most important announcements made at the event was the launch of Microsoft Azure Maia and Microsoft Azure Cobalt, two company-designed chips optimised for AI. Maia is an AI accelerator chip designed to run cloud-based training and inference for workloads like OpenAI models, Bing, and GitHub Copilot. Cobalt, on the other hand, is a cloud-native chip based on Arm architecture, built for general-purpose workloads and optimised for performance, efficiency, and cost. Microsoft also announced the general availability of Azure Boost for faster storage and networking.
Extending the ‘Copilot experience’
“We believe in the future there will be a Copilot for everyone and for everything you do,” says the Microsoft Ignite announcement blog. In this spirit, the company said that it is extending its Copilot offerings across solutions for the benefit of every role and function, including office workers, front-line workers, developers, and IT professionals. Some of the Microsoft Copilot-related updates and announcements include Microsoft Copilot for Service, which offers role-based support; Copilot in Microsoft Dynamics 365 Guides; and the new Microsoft Copilot Studio, which lets organisations build their own copilots from scratch or adapt out-of-the-box copilots.
Nvidia AI Foundry service
Nvidia will now run its AI Foundry service on Azure to help enterprises and startups develop, tune, and deploy custom AI models. The Nvidia AI Foundry service comprises a collection of foundation models, the NVIDIA NeMo framework and tools, and NVIDIA DGX Cloud AI supercomputing and services. This suite of products and solutions will help enterprises create custom generative AI models. Businesses can then deploy their models with NVIDIA AI Enterprise software on Azure to power generative AI applications.
Microsoft demonstrated the preview of Azure AI Studio, a unified and trustworthy platform for organisations to build, test, and deploy AI apps. Using this platform, developers and firms can build their own copilots, train their own models, or ground third-party foundation models using proprietary data. Further, the company has made Vector Search, a feature of Azure AI Search, generally available for enterprise generative AI applications.
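To make the vector search idea concrete: a query and each document are represented as embedding vectors, and retrieval ranks documents by similarity to the query vector. The toy sketch below uses made-up 3-dimensional embeddings and exact cosine similarity purely for illustration; production systems such as Azure AI Search use real embedding models and approximate-nearest-neighbour indexes instead.

```python
# Toy illustration of the retrieval idea behind vector search:
# rank documents by cosine similarity between embedding vectors.
# The embeddings below are made-up 3-d vectors, not real model output.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend document embeddings (hypothetical values).
docs = {
    "chips": [0.9, 0.1, 0.0],
    "copilot": [0.1, 0.8, 0.2],
    "cricket": [0.0, 0.2, 0.9],
}

# Pretend embedding of the query "custom AI silicon".
query = [0.85, 0.15, 0.05]

# Rank document ids from most to least similar to the query.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked[0])  # → chips
```

The same ranking step is what a vector index accelerates: instead of scoring every document, it narrows the candidates first, trading a little recall for large speed gains.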
GPT-4 Turbo, which was announced by OpenAI at its first-ever developer conference this month, will be available in public preview in Azure OpenAI Service by the end of November 2023, while the GPT-3.5 Turbo model with a 16K token prompt length is now generally available.
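For developers, access to these models goes through Azure OpenAI Service deployments rather than OpenAI's own endpoint. The sketch below shows roughly what such a call looks like with the openai Python SDK (1.x); the endpoint, deployment name, and API version here are placeholders I have assumed for illustration, not values stated in the article, so check the current Azure documentation before use.

```python
# Hedged sketch: calling a chat model deployed on Azure OpenAI Service
# with the openai Python SDK (>= 1.x). All credentials and names below
# are placeholders, not real values.
import os

def build_chat_request(deployment: str, user_prompt: str) -> dict:
    """Assemble the request body for a chat completion call."""
    return {
        # On Azure, "model" is the name you gave your *deployment*,
        # not the underlying model id.
        "model": deployment,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "max_tokens": 256,
    }

# Only attempt a live call if credentials are configured in the environment.
if os.getenv("AZURE_OPENAI_API_KEY") and os.getenv("AZURE_OPENAI_ENDPOINT"):
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2023-12-01-preview",  # placeholder; verify against docs
    )
    req = build_chat_request("gpt-35-turbo-16k", "Summarise the Ignite keynote.")
    resp = client.chat.completions.create(**req)
    print(resp.choices[0].message.content)
```

Swapping the deployment name is all it takes to move between the GPT-3.5 Turbo 16K model and a GPT-4 Turbo preview deployment once the latter is available in a subscription.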
Microsoft has now expanded the Copilot Copyright Commitment (CCC) to customers using Azure OpenAI Service; the expanded commitment will be called the Customer Copyright Commitment. Under it, the company has published new documentation to help Azure OpenAI Service customers mitigate the risk of infringing content.