As generative AI grows, data security controls are critical

A continuous, holistic approach is needed to address threats

April 30, 2025
Business services | Cybersecurity | Technology risk consulting

The growing adoption of artificial intelligence systems across operations creates transformative opportunities for businesses. But AI also comes at a cost: an urgent need to protect these systems from threats that traditional security controls often fail to fully address.

A modern approach to AI security demands a defence-in-depth strategy that spans secure data ingestion, model training and deployment, infrastructure hardening, and continuous monitoring.

Here is a look at data security controls businesses should consider incorporating as a foundational layer to protect generative AI systems. 

How to protect your business

Generative AI platforms are reshaping productivity and decision-making across sectors, but they also come with distinct risk vectors such as:

  • Model poisoning (malicious data injection)
  • Model theft and intellectual property loss
  • Prompt injection attacks
  • Jailbreaking and unauthorized use
  • Compliance breaches due to data exposure

Generative AI tools and large language model-powered assistants also interact with user inputs and business content in ways that may inadvertently expose sensitive or regulated data.

These risks can result in business disruption, regulatory non-compliance, financial loss and reputational harm—so ensuring the right security is in place is critical. 

Data security controls that businesses should consider

Monitoring AI interactions

Ensure that generative AI tools are not processing, storing or inadvertently exposing sensitive data such as personally identifiable information, financial records, intellectual property or confidential business strategies.
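
For illustration, below is a minimal Python sketch of this kind of monitoring check, assuming prompts can be intercepted and logged at an internal gateway before they reach the AI tool. The pattern names and categories are illustrative assumptions; a production deployment would typically rely on a dedicated classification or data loss prevention engine rather than hand-written rules.

  import logging
  import re

  logging.basicConfig(level=logging.INFO)

  # Illustrative patterns for sensitive data; real deployments would use a
  # dedicated classification or DLP engine rather than hand-written regexes.
  SENSITIVE_PATTERNS = {
      "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
      "payment_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
  }

  def monitor_interaction(user_id: str, prompt: str) -> list[str]:
      """Record each AI interaction and flag categories of sensitive data it appears to contain."""
      findings = [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(prompt)]
      if findings:
          logging.warning("AI prompt from %s flagged for: %s", user_id, ", ".join(findings))
      else:
          logging.info("AI prompt from %s passed monitoring checks", user_id)
      return findings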

Enforcing data loss prevention policies

Extend policies to cover AI-assisted applications so that AI-generated or AI-handled content adheres to enterprise data protection guidelines.

Implementing blocking and redaction controls

Introduce rule-based policies to automatically block or redact classified or sensitive data from being sent to or returned by AI platforms.
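
A minimal sketch of such a rule-based policy in Python, assuming requests pass through an internal proxy before reaching the AI platform; the patterns, classification labels and return convention are assumptions for illustration, not a specific product's API.

  import re

  # Illustrative redaction rules: pattern to match and the replacement token to substitute.
  REDACTION_RULES = [
      (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
      (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "[REDACTED-EMAIL]"),
  ]

  # Illustrative classification labels that should block a request outright.
  BLOCKING_LABELS = {"confidential", "restricted"}

  def apply_policy(text: str, classification: str | None = None) -> str | None:
      """Return None to block classified content; otherwise return the text with sensitive patterns redacted."""
      if classification and classification.lower() in BLOCKING_LABELS:
          return None
      for pattern, replacement in REDACTION_RULES:
          text = pattern.sub(replacement, text)
      return text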

Strengthening endpoint security

Use the appropriate tools to ensure devices interacting with generative AI platforms are compliant with corporate security standards and appropriately managed.

Applying network access controls

Cloud access security broker tools can monitor and control AI access across different cloud environments, allowing precise control over how and where AI tools are used.

Preventing data exfiltration

Insider risk management tools can detect unusual patterns such as excessive prompt activity, signs of potential data leakage or anomalous usage behaviours associated with generative AI tools.
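
As a simplified illustration of the "excessive prompt activity" signal, the Python sketch below flags users whose prompt volume in a review window is a statistical outlier; the threshold and input format are assumptions, and real insider risk tools combine many more signals.

  from collections import Counter
  from statistics import mean, stdev

  def flag_excessive_prompt_activity(prompt_log: list[str], sigma: float = 3.0) -> set[str]:
      """Flag users whose prompt count exceeds the population mean by more than `sigma` standard deviations.

      `prompt_log` is assumed to contain one user ID per prompt issued in the review window.
      """
      counts = Counter(prompt_log)
      volumes = list(counts.values())
      if len(volumes) < 2:
          return set()
      baseline, spread = mean(volumes), stdev(volumes)
      return {user for user, n in counts.items() if n > baseline + sigma * spread}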

Implementing content filtering

Set up automated detection and filtering mechanisms for high-risk terms and phrases in AI inputs and outputs to reduce the risk of sensitive data exposure.
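
A minimal sketch of term-based filtering in Python; the watch list is purely illustrative, and in practice such lists are maintained by the data governance function and often supplemented with classifier-based detection.

  # Illustrative watch list of high-risk terms; real lists come from data governance.
  HIGH_RISK_TERMS = {"project falcon", "unreleased earnings", "customer master list"}

  def find_high_risk_terms(text: str) -> list[str]:
      """Return any watch-list terms present in an AI input or output."""
      lowered = text.lower()
      return [term for term in HIGH_RISK_TERMS if term in lowered]

  def exchange_is_allowed(prompt: str, response: str) -> bool:
      """Allow the exchange only when neither the prompt nor the response matches the watch list."""
      return not (find_high_risk_terms(prompt) or find_high_risk_terms(response))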

Adopting zero-trust principles

Ensure AI operates within a zero-trust architecture—which enforces strict identity, device and access controls—so that generative AI capabilities are available only to authorized users under the principle of least privilege.
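
To illustrate the least-privilege idea, the Python sketch below gates a generative AI capability on identity, device compliance and role; the capability names and checks are assumptions for illustration rather than a reference architecture.

  from dataclasses import dataclass

  @dataclass
  class AccessRequest:
      user_id: str
      roles: frozenset
      device_compliant: bool
      mfa_verified: bool

  # Illustrative mapping of generative AI capabilities to the roles allowed to use them.
  CAPABILITY_ROLES = {
      "draft_marketing_copy": {"marketing", "communications"},
      "summarize_contracts": {"legal"},
  }

  def is_authorized(request: AccessRequest, capability: str) -> bool:
      """Grant access only when identity, device and role checks all pass (least privilege)."""
      allowed = CAPABILITY_ROLES.get(capability, set())
      return request.mfa_verified and request.device_compliant and bool(set(request.roles) & allowed)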

The takeaway 

Securing generative AI is not a one-time initiative. It’s a continuous journey that requires coordinated effort among the cybersecurity, data governance and compliance functions, together with the AI and machine learning teams.

As organizations harness the power of generative AI, embedding cybersecurity into every phase of the AI lifecycle—from data ingestion to model deployment—is essential.

By proactively implementing the right controls and governance structures, businesses can unlock the full value of generative AI while mitigating risks, maintaining trust and ensuring regulatory compliance.

RSM contributors

  • Atul Ojha
    Partner
  • Vaishnavi Vaidyanathan
    Director
  • Arindam Hajra
    Director

Is your business concerned about its cybersecurity position?

Connect with our team to see how we can help