Microsoft's ChatGPT-style AI assistant, which is integrated into its office apps, will become publicly available on November 1 after a period of testing.
If someone is unable to attend a Teams meeting, Microsoft 365 Copilot can provide a summary of the discussions.
Additionally, it can quickly draft emails, Word documents, spreadsheet graphs, and PowerPoint presentations.
Microsoft claims the tool will end "drudgery," but critics worry that technology like this could replace jobs.
There are also concerns that it could leave companies dangerously dependent on AI-powered assistance.
Because it does not currently flag content that was not created by humans, it may also fall foul of new AI regulations. Under China's AI rules and Europe's AI Act, people must be made aware when they are interacting with AI rather than a human.
According to Microsoft 365 chief executive Collette Stallbaumer, that responsibility falls on the person using Copilot.
"It is a tool, and people have a responsibility to use it responsibly," she stated.
I had the unique chance to test out Copilot before it was made available to the general public.
It uses the same technology as ChatGPT, developed by OpenAI, a company in which Microsoft has invested billions of dollars.
Because Copilot is integrated with user accounts and draws on their personal or business data, I conducted my demo on the laptop of Microsoft employee Derek Snyder.
Microsoft says the data is handled securely and will not be used to train the underlying technology.
"You only have access to data that you would otherwise be allowed to see," Stallbaumer said. "It respects data policies."
My initial impression is that Copilot will be both a helpful tool and a formidable rival for office workers, particularly in businesses looking to cut costs.