Microsoft Copilot offers protected AI experience for UO

A more secure form of artificial intelligence has arrived at the University of Oregon.

Everyone at the UO now has access to Microsoft Copilot with data protection, a generative AI tool that provides critical safeguards for your data.

Just like ChatGPT or DALL-E, Microsoft's web-based AI chatbot can create images, write text or code, analyze data, summarize webpages and perform numerous other tasks. In fact, the web version of Copilot uses the same underlying AI models as DALL-E and ChatGPT.

What sets Copilot apart from other tools are the data protections it offers for the UO community.

As long as you're logged in with your UO account, the data you enter into Copilot will be encrypted and won't be used to train any AI models outside the university. 

"We're thrilled to provide UO students, faculty and staff with a powerful AI tool that offers greater privacy and security," said Abhijit Pandit, vice president and chief information officer. "For that reason, we recommend Copilot as the first tool in our UO AI toolkit."

To access the protected version of Copilot, visit microsoft365.com/copilot. You’ll be prompted to log in with your UO email address and password unless you’re already logged into UO Microsoft services. A small green shield badge in the upper right indicates you’re using Copilot with data protection. The look and feel should resemble the UO Microsoft portal, with a green banner at the top of the window and your UO profile picture or initials in the upper right.

The university provided students with access to the protected version of Copilot in late September. Microsoft had previously given employees access automatically.

An October update to Copilot brought enhanced data protections, feature updates and other changes, such as a new link and tweaks to the shield icon.

Notably, Copilot now saves past chats, like ChatGPT does.

Some features, such as the ability to upload an image, have been temporarily disabled. Microsoft plans to restore them soon.

UO cybersecurity staff stress the importance of staying mindful when composing AI prompts, even in the protected version of Copilot. People should avoid entering any sensitive personal or UO data. For now, only low-risk, or "green," data is appropriate.

Moderate- and high-risk data (amber and red) shouldn't be entered in Copilot. That includes data protected by FERPA or HIPAA — the federal laws protecting student records and health records, respectively — among other types of information.

"We're recommending a cautious approach," said José Domínguez, chief information security officer. "We will continue evaluating Copilot to determine whether it can handle more sensitive data."

Tips for getting started with Copilot with data protection are available in the UO Service Portal.

Guidance about AI and instruction is available in the Artificial Intelligence Resource Guide from the Office of the Provost.

Information Services staff point out that the rate of change in generative AI tools may outpace the speed of announcements to the UO community.

"By the time you read this, Copilot may already look different or have new features," said Melody Riley, associate chief information officer for enterprise solutions. "We and others will continue providing more updates and guidance in the coming months."

Microsoft uses the name "Copilot" for a range of AI tools. Copilot with data protection is web-based. Other versions of Copilot, including Copilot for Microsoft 365, which is deeply integrated into the Microsoft apps, aren't broadly available at the UO, although some limited pilot projects are underway to test such tools. Groups interested in piloting generative AI tools are encouraged to contact Information Services.

Anyone with questions can submit a ticket through the Microsoft Office 365 Support Request page in the UO Service Portal or contact the IT staff who support their unit or the Technology Service Desk.

—By Nancy Novitski, University Communications
—Top photo: AdriaVidal - stock.adobe.com

Reminder about employee data-handling responsibilities
UO employees are responsible for knowing the sensitivity of the data they handle, for using approved collaboration and storage tools appropriate for that data, and for following the other controls associated with each class of data. The Information Security Office can help identify secure and compliant solutions. (Some UO cybersecurity reference pages require a Duck ID login.)