ChatGPT now chatting via Azure OpenAI Service

Nov 13, 2023 | Hi-network.com

Microsoft is making the ChatGPT large language model available in preview through its Azure OpenAI Service, paving the way for developers to integrate the model into a host of enterprise development and end-user applications.

Microsoft appears to have had several users working with this integration already, listing ODP Corporation (the parent company of Office Depot and OfficeMax), Singapore's Smart Nation Digital Government Office, and contract management software provider Icertis as reference customers.

Developers using the Azure OpenAI Service can use ChatGPT to add a variety of features to applications, like recapping call center conversations, automating claims processing, and even creating new advertisements with personalized content, among other things.
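As a rough illustration of what that integration looks like from a developer's point of view, here is a minimal sketch of a call-summarization request against an Azure OpenAI ChatGPT deployment, using the 0.x-era openai Python SDK's Azure mode. The endpoint, deployment name ("gpt-35-turbo"), API version, and the summarization prompt are illustrative assumptions, not details from Microsoft's announcement.

```python
import os
import openai

# Assumed Azure OpenAI configuration; replace with your own resource details.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2023-03-15-preview"
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

transcript_text = (
    "Agent: Thanks for calling, how can I help?\n"
    "Customer: My order arrived damaged and I'd like a replacement.\n"
    "Agent: I'm sorry about that. A replacement will ship tomorrow."
)

# Ask the ChatGPT deployment to recap the call, one of the use cases cited above.
response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # name of your ChatGPT model deployment (assumed)
    messages=[
        {"role": "system", "content": "You summarize call-center transcripts in two sentences."},
        {"role": "user", "content": transcript_text},
    ],
    temperature=0.2,
)

print(response["choices"][0]["message"]["content"])
```

The same pattern extends to the other scenarios Microsoft cites: only the system prompt and the user content change, while the deployment and request shape stay the same.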

Generative AI like ChatGPT is already being used to enhance Microsoft's product offerings, as well. For instance, according to the company, the premium version of Teams can use AI to create chapters in conversations and automatically generated recaps, while Viva Sales can offer data-driven guidance and suggest email content to help teams reach their customers.

Enterprise use cases for Azure OpenAI ChatGPT

Ritu Jyoti, IDC group vice president for worldwide AI and automation research, said that the proposed use cases make a lot of sense, and that she expects much of the initial usage of Microsoft's new ChatGPT-powered offering to be internally focused within enterprises.

"For [example], helping HR put together job descriptions, helping employees with internal knowledge management and discovery - in other words, augmenting employees with internal search," she said.

The pricing of the service works by tokens - one token covers about four characters' worth of a given query in written English, with the average paragraph clocking in at about 100 tokens, and a 1,500-word essay at about 2,048. According to Jyoti, one reason GPT-3-based applications became more popular just before ChatGPT went viral is that OpenAI's pricing dropped to about $0.02 per 1,000 tokens.

ChatGPT via Azure costs even less, at $0.002 per 1,000 tokens, making the service potentially more economical than using an in-house large language model, she added.
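To make the arithmetic concrete, here is a short sketch that applies the figures quoted above (roughly four characters per token, about 2,048 tokens for a 1,500-word essay) to both price points. The four-characters-per-token estimate is the article's rule of thumb; real token counts depend on the tokenizer.

```python
def estimated_tokens(text: str) -> int:
    """Very rough estimate: about one token per four characters of English text."""
    return max(1, len(text) // 4)

def cost_usd(tokens: int, price_per_1k_tokens: float) -> float:
    """Cost of processing a given number of tokens at a per-1,000-token price."""
    return tokens / 1000 * price_per_1k_tokens

essay_tokens = 2048  # roughly a 1,500-word essay, per the figures above

print(cost_usd(essay_tokens, 0.02))    # GPT-3-era pricing:   ~$0.04
print(cost_usd(essay_tokens, 0.002))   # ChatGPT via Azure:   ~$0.004
```

At roughly a tenth of the earlier per-token price, an essay-length request drops from about four cents to well under half a cent, which is the economics behind Jyoti's comment.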

"I think the pricing is great," Jyoti said.

Microsoft appears to be operating the service with an emphasis on responsible AI practices, according to Gartner vice president and distinguished analyst Bern Elliot - perhaps having learned lessons from incidents where a chatbot front end to Bing displayed strange behavior, including a conversation with a New York Times reporter in which the chatbot declared its love and tried to convince him to leave his spouse.

"I think Microsoft, historically, has taken responsible AI very seriously, and that's to their credit," he said. "Having a strong track record for ethical use and for delivering enterprise-grade privacy is positive, so I think that's in their favor."

That's a key consideration, he said, given the concerns raised by AI use in the enterprise - data protection and contextualization of data sets, more specifically. The latter issue generally centers on making sure that enterprise AI tools pull answers from the right base of information, which helps ensure those answers are correct and reduces the "hallucinations" seen in more general-use AI.
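One common way to apply that contextualization in practice is to place retrieved enterprise documents in the prompt so the model answers from them rather than from its general training data. The sketch below assumes the same Azure configuration as the earlier example; retrieve_passages() is a hypothetical stand-in for whatever internal search or knowledge base an enterprise already has.

```python
import os
import openai

# Assumed Azure OpenAI configuration, as in the earlier sketch.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2023-03-15-preview"
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

def retrieve_passages(question: str) -> list:
    """Hypothetical placeholder: query an internal knowledge base for relevant passages."""
    return ["Claims over $10,000 require a second adjuster's sign-off."]

def grounded_answer(question: str) -> str:
    """Answer a question using only retrieved company documents as context."""
    context = "\n".join(retrieve_passages(question))
    response = openai.ChatCompletion.create(
        engine="gpt-35-turbo",  # assumed deployment name
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer only from the provided company documents. "
                    "If the answer is not in them, say you don't know.\n\n"
                    f"Documents:\n{context}"
                ),
            },
            {"role": "user", "content": question},
        ],
        temperature=0,
    )
    return response["choices"][0]["message"]["content"]

print(grounded_answer("Who has to approve a $15,000 claim?"))
```

Constraining the model to a vetted document set in this way is one interpretation of the "contextualization" Elliot describes, though enterprises combine it with access controls and review processes rather than relying on prompting alone.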

