Rapid generative AI (GenAI) adoption is the top-ranked issue for the next two years for legal, compliance and privacy leaders, according to a recent survey by Gartner.
In a September 2023 survey of 179 legal, compliance and privacy leaders, 70% of respondents identified rapid GenAI adoption as a top concern.
Gartner experts have identified key areas that organizations need to address:
Limited visibility into key risks
The ease of adoption, widespread applicability, and ability of GenAI tools to perform a range of different business tasks mean that assurance teams will have limited visibility into new risks.
“New processes to detect and manage these risks will take time to roll out, leaving businesses exposed in the interim,” said Stuart Strome. “Legal leaders should adapt preexisting, well-established and widely distributed risk monitoring and management practices until new processes can be implemented. For example, they might modify data inventories, records of processing activities, or privacy impact assessments to track GenAI usage.”
Lack of employee clarity on acceptable use
Employees will lack clarity on what constitutes acceptable use of the technology because they are unfamiliar with the rules governing it. Legal leaders should build consensus on ‘must avoid’ outcomes and institute controls to minimize the likelihood of those outcomes, while championing acceptable use cases in policies and guidance.
“Legal leaders need to institute a mandatory human review of GenAI output, prohibit entering enterprise IP or personal information into public tools such as ChatGPT, and develop policies that require clear indication of GenAI provenance on any public-facing output,” said Strome. “It’s important to include real-world examples of prohibited and acceptable GenAI usage in policy guidance and alert employees when policies are updated. Further, consider working with IT to develop embedded controls, such as popups in GenAI tools that require users to attest they are not using the tools for prohibited cases.”
Need for AI governance
As GenAI tools rapidly become more ubiquitous, poor accountability for negative outcomes could create unacceptable legal and privacy risks. Yet for most companies, AI governance will not fit neatly into existing functional organizational structures, and the expertise needed may be scattered throughout the business or may not exist at all. Legal leaders need to clearly document roles and responsibilities for approvals, policy management, risk management and training for GenAI.
“Legal leaders should advocate for establishing a cross-functional steering committee, or for modifying the mandate of an existing committee, to establish principles and standards for use, and to align on roles and responsibilities related to AI governance,” said Strome.