Bitget Restricts Use of AI Tools Following Troubling Results and Negative User Experiences

Fredrik Vold
An AI Chip. Source: Adobe

Crypto derivatives exchange Bitget has decided to limit how much it uses artificial intelligence (AI) tools like ChatGPT due to the potential for misinformation.

According to the exchange, its AI-based chatbots, which it has previously used to handle customer service inquiries, have produced problematic results, including false information about crypto-related topics.

The move to limit the use of AI tools also comes after the exchange conducted a survey that found crypto traders have had negative experiences using AI chatbots, Blockworks reported on Monday.

The outlet cited Gracy Chen, Bitget’s managing director, as saying that the exchange wants to prioritize “a blend of human expertise and technological innovation” going forward.

This is important given that society’s understanding of artificial intelligence is still at a nascent stage, she said.

“The crypto landscape is complex and ever-changing; it requires keen human insight and intuition to navigate its many twists and turns,” Chen said, while noting that humans remain indispensable for the time being:

“AI tools, while robust and resourceful, lack the human touch necessary to interpret market nuances and trends accurately.”

In her comments, Chen also pointed out that technological tools like AI are only as good as the data they are fed and trained on.

“In our journey with ChatGPT, we’ve learned that AI tools are only as effective as their latest update, training, and the data they’ve been fed,” the Bitget director said.

She added that AI should never be relied on as a replacement for professional financial advice or one's own research.

“It’s essential to remember that these tools, while powerful, are not infallible,” Chen said.