Wednesday, 03 May 2023 08:56

ChatGPT Might Get a Private Option for Business, According to Microsoft


A recent incident in which Samsung employees unknowingly exposed sensitive data via ChatGPT has raised concerns in multiple industries. In banking and finance, several companies put a stop to the use of ChatGPT, and certain regulators began investigating how its use could leak PII or other financial information. To combat this new obstacle to business adoption, Microsoft is looking to offer a private business option that would exclude user input from being used to train the LLM.

The use of AI, including LLMs, to automate business functions is nothing new. The term AI has been batted around for many years as a buzzword attached to any product designed to learn an environment and adapt to its needs. Products like Cylance Protect touted AI as their foundational difference. Most of these products are (in massively simplified terms) large databases with statistical engines (Cylance called it the Math Model) that allow them to make the most favorable decision faster than a human can.

With ChatGPT, the ability to build out proposals, agendas, and schedules, and even analyze code and scripts, makes it very attractive as an automation tool. The problem, as it stands now, is that anything input into ChatGPT is used to train the LLM to make it more comprehensive and “better”. So, when someone at Samsung asked ChatGPT to review sensitive data, the LLM kept that information available; all anyone had to do was ask the right question, and it regurgitated the data back out.

This is something of a flaw in ChatGPT, but not one that OpenAI is really set up to deal with. In comes Microsoft, with its right to resell ChatGPT technology, to save the day (and make a few bucks). The Microsoft plan is to run a version of ChatGPT on “private” cloud servers in Azure, giving each business its own walled garden in terms of data input. This peace of mind is not going to be cheap, though: according to information floating around on the web, Microsoft’s Azure-based private ChatGPT could cost upwards of ten times the price of a regular subscription. It will be interesting to see whether the benefit of using an LLM for automation outweighs the increased cost.
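For a rough sense of what a company-scoped deployment could look like, below is a minimal sketch in Python (using the requests library) of calling a chat model hosted in a business’s own Azure OpenAI resource rather than the public ChatGPT service. The resource name, deployment name, environment variable, and API version shown here are placeholders for illustration and are not details of Microsoft’s rumored private offering; the point is simply that prompts would go to an endpoint inside the company’s own Azure subscription.

# Minimal sketch: sending a prompt to a company-scoped Azure OpenAI deployment.
# The resource name, deployment name, and API key variable are hypothetical
# placeholders; a business would substitute its own private Azure resource details.
import os
import requests

RESOURCE = "contoso-private-gpt"   # hypothetical Azure OpenAI resource name
DEPLOYMENT = "gpt-35-turbo"        # hypothetical model deployment name
API_VERSION = "2023-05-15"         # Azure OpenAI API version current at the time of writing

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

response = requests.post(
    url,
    headers={
        "api-key": os.environ["AZURE_OPENAI_KEY"],  # key stays within the company's Azure tenant
        "Content-Type": "application/json",
    },
    json={
        "messages": [
            {"role": "user", "content": "Summarize this internal proposal..."}
        ]
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])

The draw for businesses is less the API call itself than where the data lands: requests terminate at a resource the company controls instead of a shared public service.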

For those who do not trust Microsoft, OpenAI is also rumored to be working on its own private subscription service. This service would be set not to use input to train the model by default and would be hosted in AWS as opposed to Azure.

As things stand, we were not able to find information on when these services will be available, although we would expect something fairly soon given the industry reaction to Samsung’s source-code leak.

 
