We noted earlier this week, “JPMorgan Chase & Co. might spark a trend among other Wall Street banks to restrict chatbots in the office.” And we were correct.
Bloomberg reported that Bank of America Corp., Citigroup Inc., Deutsche Bank AG, Goldman Sachs Group Inc., and Wells Fargo & Co. are restricting the AI-powered chatbot ChatGPT.
The Wall Street banks have blocked employees’ use of ChatGPT because of compliance issues with third-party software.
Bank of America execs told employees that ChatGPT is prohibited from business use, Bloomberg said, citing people with direct knowledge of the matter. Internal meetings at BofA revealed the chatbot technology must be properly vetted before it can be used for business communications. At Deutsche Bank, a spokesperson said the staff could no longer use the chatbot.
A Wells Fargo spokesperson said:
“We are imposing usage limits on ChatGPT, as we continue to evaluate safe and effective ways of using technologies like these.”
Citigroup has blocked all access to ChatGPT, while Goldman Sachs has restricted the AI-powered bot on the trading floor, according to Financial News, citing sources.
JPMorgan was the first big bank to clamp down on ChatGPT earlier this week. The move was primarily based on compliance issues tied to third-party software.
Financial News explains the reason for the wave of Wall Street banks banning ChatGPT this week:
Highly-regulated banks and financial institutions are notoriously cautious about allowing their employees access to third-party software and websites. Most still do not enable staff to access social media platforms such as Facebook and Instagram when working in the office or using company devices.
US regulators also handed out more than $2bn in fines to a dozen large investment banks for employees’ unauthorized use of messaging platforms, including WhatsApp, largely by those in trading floor functions.
Before the ban, Bloomberg noted some of the uses of ChatGPT by industry insiders:
- A salesperson at a US bank used ChatGPT’s search engine on his personal device to get an overview of a client. The task was completed in less time than it would take to scour the internet, but the person said it couldn’t be used in an internal report and had to be cross-checked for accuracy.
- An oil trader used a version of ChatGPT to write a research note on the outlook for crude. It read well, she said, but the information was out-of-date and had to be fixed.
- A stock trader in Taipei used it to compile key takeaways from US earnings, sparing himself tedious copying and pasting between documents. Still, he based investment decisions on his own notes.
- And a bond trader in mainland China wrote routine reports on policy analysis using AI to save time — part of which she then spent carefully fact-checking.
… and just like that, 15 seconds later — ChatGPT writes a full-year outlook on the oil market.
There’s no doubt the AI-powered platform has saved the time of some traders and bankers. But some have called that into question:
“It may save time, but we don’t know if it’s true, which is the biggest downside of the tool.
“It can be used like an intelligent colleague in the office, going over your work and improving it,” Oded Netzer, a professor at Columbia Business School who researches data and technology, told Bloomberg.
Even with the chatbot’s impressive array of capabilities — from writing research reports, computer code, poems, songs, and even entire movie plots, to passing law, business, and medical exams — it’s not perfect, and some of the answers it produces have been found to contain errors or to be incredibly woke, and racist.
The head of trading at a top US bank spoke with Bloomberg on the condition of anonymity. He said ChatGPT has its limitations, explaining that trading pits were automated years ago by algos, but pointed out that other segments of banking, such as fixed-income markets, could be automated down the line.
And why the hesitation to quickly embrace ChatGPT? Well, there’s this: “When the SEC knocks on your door and asks why did you execute that transaction, you have to have a better answer than, ‘Well, the machine told me to,’” Larry Tabb, an analyst at Bloomberg Intelligence, explained.
Tyler Durden
Fri, 02/24/2023 – 18:40