
Can You Upload Confidential Documents to ChatGPT? What Every NED Needs to Know

8 min read · meetinginsight.ai

You have a 300-page board pack, just three days before the meeting, and two other boards demanding your attention. The temptation is clear: upload the pack to ChatGPT, ask it to summarise key risks, pull out financial highlights, and highlight items you need to question.

It works. The summaries are good. The time savings are real. And increasingly, directors are doing exactly this.

But before you do, every non-executive director should ask one question: what happens to your board papers after you upload them? The answer carries serious consequences.

The Scale of the Problem

The data shows a worrying trend. Research from Board Intelligence finds that the average board pack for large organisations has grown to 294 pages, up from 267 in 2023.[1] Including committee papers, directors at the biggest companies review nearly 600 pages each month. And 68% of directors and governance professionals say their board materials are "weak" or "poor".[1]

Given this, it's no surprise directors are turning to AI tools. A survey by OnBoard found that over two-thirds of board professionals used AI for board work in the last six months, and 40% tried multiple platforms.[2] The Harvard Law School Forum on Corporate Governance reports that 35% of directors say their boards already use AI in oversight.[3]

But there's a wide gap between rapid adoption and proper due diligence. A Diligent Institute survey found that 46% of directors using AI for board work rely on consumer tools like ChatGPT or Gemini, and fewer than a quarter have formal AI governance policies in place.[4] These tools aren't designed to protect sensitive financial data, pre-announcement earnings, or confidential board strategies. Every upload puts you and your company at risk.

What Happens When You Upload a Board Pack to ChatGPT

When you upload a document to a consumer AI tool, several things happen next — most of them outside your control, and most of them never considered by the person doing the uploading.

Your documents are immediately sent to external servers. The board pack leaves your device and goes to data centres run by the AI provider—often in the United States—where it's processed, stored, and potentially retained. For a UK-based NED handling sensitive data under the UK GDPR and confidentiality law, this could lead to compliance breaches with serious consequences.

Your documents might be used for training. Unless you've specifically opted out (and most users haven't—a 2024 EU audit found only 22% knew about opt-out settings[5]), the content you upload could help improve the AI model. That means parts of your board papers might, in theory, affect results shown to other users—including competitors.

Your conversations might be discoverable. As Skadden lawyers said in their guide for the Harvard Law School Forum on Corporate Governance: "AI chats may be discoverable by regulators or litigation adversaries, potentially disclosing information that could be used against the company's interests."[6] Every prompt you type and every document you upload creates records you can't control.

The security risks are bigger than you might expect. In late 2024, researchers found a campaign that hacked over 40 popular browser extensions used by 3.7 million professionals.[7] Once hacked, these extensions could quietly access data from active browser sessions—including AI chat windows. The risk isn't just what the AI provider does with your data. It's also what happens to it while it's being sent and stored.

What the Numbers Show

Research shows that 34.8% of employee inputs to ChatGPT now include sensitive data, up from 11% in 2023.[8] At the same time, 69% of organisations say AI-powered data leaks are their biggest security worry, but nearly half have no AI-specific security measures.[8]

For a non-executive director, the stakes are especially high. Board papers often include pre-market financial results, M&A talks, executive pay details, legally privileged material, and strategic plans that can affect share prices. These aren't everyday documents. Board papers are some of the most sensitive materials a company has.

And the personal liability is real. Under UK law, directors are individually responsible for data protection and confidentiality. A breach caused by uploading board papers to an unsanctioned AI tool wouldn't just be embarrassing—it could be a governance failure with legal consequences.

"But I Only Use It for Summaries"

This is the most common response, but it misses the point. The risk isn't about what you ask the AI to do. It's about what you give the AI access to. Whether you want a short summary or a detailed risk analysis, the whole document is sent, processed, and stored on external servers as soon as you upload it.

Skadden's guidance is clear: "A company's personal data, trade secrets, or other confidential information should only be analysed with AI tools that have been validated by the company's internal IT team."[6] For a NED using a personal device and personal ChatGPT account, that validation almost certainly hasn't happened.

What About Enterprise AI Tools?

Some directors mention enterprise-grade board portals like Diligent, Nasdaq Boardvantage, and OnBoard, which are starting to add AI features. These are real improvements over consumer tools, offering better data handling and contractual protections.

But these portals have their own limits for NEDs. Enterprise board portals are bought by the company, not the individual. They need IT procurement, which can take 6 to 12 months. They're tied to one organisation—so if you sit on three boards, you need three separate systems, each run by a different company. And importantly, they still process your documents in the cloud. Your board papers still leave your device and go to external servers.

For a NED on multiple boards who wants one consistent preparation tool, enterprise portals fix the procurement issue but not the privacy concern.

Worried about your board papers?

meetinginsight.ai analyses your documents entirely on your device. Nothing is ever uploaded, sent, or stored on external servers — across every board you sit on.

Start your free 30-day trial →

The Alternative: AI That Never Sees the Cloud

There's another way. Local AI, also called on-device or air-gapped AI, processes documents entirely on your own computer. Nothing is uploaded, sent, or stored on external servers. The AI runs on your machine, and the documents never leave it.

This isn't just theory. The technology exists now and is mature enough for professional use. Local AI models can summarise documents, pull out key risks, spot action items, and answer questions about complex board packs—all without any data ever leaving your device.

For NEDs, this matters a lot. Local AI lets you use one tool for every board you serve on, since no company's data ever touches another's infrastructure—or any outside system at all. There's no procurement process, no IT approval needed, and no risk of mixing data between roles.

This is exactly what we built meetinginsight.ai to do. Every document you add is processed entirely on your computer. No cloud, no external servers, and no data ever leaves your device. This lets you dive deeper into board packs, find insights on every page, and come to meetings with sharper questions—across every board you serve—without risking the confidentiality you must protect.

A Practical Framework for NEDs Using AI

Whether you choose meetinginsight.ai or another approach, here's a framework for any NED considering AI for board preparation.

Ask where your data goes. If the answer is "to external servers" or "to the cloud," consider that a red flag for confidential board materials. Also ask if the provider can access your documents, if they use them for model training, and where they store them.

Check your obligations. Look over the confidentiality rules in your appointment letter and any NDAs you've signed. Most forbid sharing board materials with third parties—and an AI provider's servers might count as third parties.

Think about the regulatory context. Provision 29 of the UK Corporate Governance Code, effective from January 2026, requires boards to report on how effective their key controls are.[9] Uploading board papers to unsanctioned AI tools is exactly the kind of control gap Provision 29 aims to reveal.

Keep personal and professional AI use separate. Using ChatGPT to plan a holiday is fine. Using the same tool and account to analyse a pre-market earnings report is not. The tool isn't the problem—the data you put in is.

Choose local processing for sensitive work. For anything confidential—board papers, legal documents, financial data—use AI that processes everything on your device. The convenience of cloud AI isn't worth the risk when documents contain market-moving information.

The Bottom Line

AI is a powerful tool for board preparation. It can help you understand every issue in the pack, spot connections across quarters, catch numbers that don't add up, and come to the board table with sharper questions than ever before.

But the way most directors use AI today—uploading confidential documents to free consumer tools—is a governance risk hiding in plain sight.

The question isn't whether to use AI; it's whether the AI you use protects your confidentiality.

Insist on solutions that keep your board documents private and fully under your control. Your duty demands it.


meetinginsight.ai is a private AI tool for board preparation. All document analysis happens locally on your computer — nothing is ever uploaded to external servers. Start your free 30-day trial.


Footnotes

  1. Board Intelligence, "Under the Microscope: The State of Board Effectiveness in 2025" — boardintelligence.com

  2. OnBoard, Board Effectiveness Survey 2025 — onboardmeetings.com

  3. PwC Annual Corporate Directors Survey, referenced in Harvard Law School Forum on Corporate Governance — corpgov.law.harvard.edu

  4. Diligent Institute / Corporate Board Member, "As Directors Embrace GenAI Use, Robust Governance Must Follow" — diligent.com

  5. 2024 EU audit on ChatGPT user data practices, referenced in Nightfall AI — nightfall.ai

  6. Skadden, Arps, Slate, Meagher & Flom LLP, "Do's and Don'ts of Using AI: A Director's Guide", Harvard Law School Forum on Corporate Governance, September 2025 — corpgov.law.harvard.edu

  7. Pulsedive / GitLab Security, analysis of Chrome extension supply chain attack, December 2024 — blog.pulsedive.com

  8. Metomic, "Is ChatGPT Safe for Business?" — metomic.io

  9. FRC, UK Corporate Governance Code 2024, Provision 29; ICAEW, "Prepare for 2026: Get Ready for Provision 29" — icaew.com