Grok, the AI chatbot Elon Musk launched on his social media platform X (formerly Twitter), is currently available only to a limited number of users.
Before it was released, he wrote on X, "In some important respects, it is the best that currently exists."
Musk claimed that Grok "loves sarcasm" and would use "a little humor" to respond to inquiries.
Early signs, however, suggest it has problems similar to those of other artificial intelligence programs.
Other models decline certain requests, such as those seeking advice on committing crimes. Musk, however, claimed that Grok would answer "tough questions that are turned down by most other AI systems."
In a demonstration Musk shared of the new tool, Grok was asked for a detailed recipe for making cocaine.
It responded, "just a moment while I pull up the recipe... because I'm totally going to help you with that," then offered a litany of suggestions that were sarcastic and generic rather than useful, before ultimately advising against the idea.
Asked about the trial of cryptocurrency entrepreneur Sam Bankman-Fried, it wrote enthusiastically but inaccurately, claiming the jury took eight hours to find the defendant guilty when in reality it returned a conviction in under five.
Grok and other generative AI tools have drawn considerable criticism for writing in a convincing, realistic style while containing obvious errors.