I was doing some research on exchanges and found this on ChatGPT:
As of now, Monero (XMR) is not listed on any major US-compliant cryptocurrency exchanges. This is primarily due to regulatory concerns over its privacy features, which make it difficult for exchanges to comply with KYC and AML regulations.
Monero’s focus on privacy and anonymity poses challenges for regulatory compliance, leading most US-based exchanges to avoid listing it. As a result, if you’re looking to trade Monero and you are a US citizen, you will typically need to use international exchanges that may have fewer regulatory constraints but come with their own risks.
For reference, here are a few popular international exchanges where Monero is available, although they may not be fully compliant with US regulations:
- Binance (Global, not Binance.US)
- Kraken (Though Kraken is US-based, it has delisted Monero for US customers)
- KuCoin
- Huobi
- Gate.io
Please be aware that using these exchanges may involve legal and financial risks, including the possibility of violating US regulations. It is always advisable to stay informed about the latest regulatory developments and to consider the legal implications of using non-compliant exchanges.
Did I miss something, or is this another authoritarian bit of hopium that made its way into the language model?
The news that you missed is that Large Language Models (LLMs) like ChatGPT are unreliable sources of information. Look for another source if you need reliable information about anything.
Yes, so LLMs may work as an interactive search engine, but what they say isn’t necessarily correct; it’s just statistically probable to sound correct.
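To illustrate what “statistically probable” means here, this is a toy sketch (invented words and probabilities, not any real model or API) of how a language model picks its next word: it samples from a learned probability distribution, with no check on whether the resulting sentence is true.

```python
import random

# Toy next-word distribution for the prompt "Monero is listed on ...".
# The probabilities are made up for illustration; a real LLM learns
# weights like these from training text and has no database of
# verified facts behind them.
next_word_probs = {
    "Kraken": 0.40,    # common in training data, so highly probable
    "Binance": 0.35,
    "Coinbase": 0.20,  # plausible-sounding but factually wrong
    "nowhere": 0.05,
}

def sample_next_word(probs):
    """Pick a word in proportion to its probability, true or not."""
    words = list(probs)
    weights = [probs[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

print("Monero is listed on", sample_next_word(next_word_probs))
```

The model is optimizing for “sounds like the training data,” which often coincides with being correct but is not the same thing.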
Right. As far as I know, an LLM will not give you a proper source if you ask it how it knows some piece of information. A website found through a search engine will have a source for its info (or it’s probably unreliable if it has none).
You as a human being have a right and a responsibility to know the source of information and to use your reasoning abilities to decide whether the source is reliable. An LLM interrupts this process. I don’t understand how people are absorbing information without sources or any way to critically assess whether it might be accurate.
Providing sources is what makes me like perplexity.ai.
I am discovering that they are only as good as the information the wokesters fed them during training, so agreed.
Removed by mod