Copilot broke audit logs, but Microsoft won't tell customers

pistachioapp.com

758 points by Sayrus a day ago


planb - 19 hours ago

I am assigned to develop a company-internal chatbot that accesses confidential documents, and I am having a really hard time communicating this problem to executives:

As long as not ALL the data the agent has access to is checked against the rights of the user placing the current request, there WILL be ways to leak data. This means vector databases, search indexes, or fancy "AI search databases" would either have to exist on a per-user basis or track the access rights along with the content, which is infeasible and does not scale.

And as access rights are complex and can change at any given moment, that would still be prone to race conditions.
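
A minimal sketch of what that per-request check implies, assuming a hypothetical retrieval layer where every indexed chunk carries the ACL it had at crawl time and a live can_read() callback is the authority (nothing here is a real Copilot or vendor API):

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Chunk:
        doc_id: str
        text: str
        acl: set  # principals that could read the document at index time

    def retrieve_for_user(
        query: str,
        user: str,
        search: Callable[[str], List[Chunk]],    # hypothetical vector/keyword search
        can_read: Callable[[str, str], bool],    # live permission check: (user, doc_id) -> bool
    ) -> List[Chunk]:
        """Return only chunks the requesting user may see right now.

        The ACL stored with the chunk is only a stale hint; the authoritative
        answer comes from the live can_read() call, because rights may have
        changed since the document was indexed.
        """
        allowed = []
        for chunk in search(query):
            if user not in chunk.acl:          # cheap pre-filter on the stale ACL
                continue
            if can_read(user, chunk.doc_id):   # authoritative per-request check
                allowed.append(chunk)
        return allowed

Even this still races between the can_read() call and the moment the text reaches the model, which is the point above: the check narrows the window but cannot close it.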

lokar - a day ago

Wait, copilot operates as some privileged user (that can bypass audit?), not as you (or better, you with some restrictions)

That can’t be right, can it?

jeanlucas - a day ago

A better title would be: Microsoft Copilot isn't HIPAA compliant

A title like this will get it fixed faster.

fulafel - a day ago

> CVEs are given to fixes deployed in security releases when customers need to take action to stay protected. In this case, the mitigation will be automatically pushed to Copilot, where users do not need to manually update the product and a CVE will not be assigned.

Is this a feature of CVE or of Microsoft's way of using CVE? It would seem this vulnerability would still benefit from having a common ID to be referenced in various contexts (e.g. vulnerability research). Maybe there needs to be another numbering system that enumerates these kinds of cases and doesn't depend on the vendor.

aetherspawn - 19 hours ago

Trying to get off Microsoft right now for LOB apps … the incompetence (multiple hacks over the last few months, an SSO zero-day, and now learning that Copilot ignores permissions when searching because the indexer runs as global admin) is getting just plain scary.

nzeid - a day ago

Hard to count the number of things that can go wrong by relying directly on an LLM to manage audit/activity/etc. logs.

What was their bug fix? Shadow prompts?

degamad - a day ago

One thing that's not clear in the write-up here: *which* audit log is he talking about? Sharepoint file accesses? Copilot actions? Purview? Something else?

eleveriven - 20 hours ago

This is exactly the kind of issue that makes trust in large vendors like Microsoft feel more like a gamble than a guarantee

nerdjon - 14 hours ago

I am very curious how, realistically, they can reliably fix this.

So my understanding is that the database/index Copilot uses had already crawled this file, so of course it would not need to access the file again to report the information in it.

But then, how do you fix that? Do you then tie audit reports to accessing parts of the database directly? Or are we instructing the LLM to do something like...

"If you are accessing knowledge pinky promise you are going to report it so we can add an audit log"

This really needs some communication from Microsoft on exactly what happened here and how it is being addressed, because right now it should raise alarm bells for any company using Copilot where people have access to sensitive data that needs to be strictly monitored.
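
One plausible shape of a fix, purely as a sketch since the post does not say what Microsoft actually changed: emit the audit record in the retrieval layer at the moment indexed content is handed to the model, rather than asking the LLM to report its own reads. All names below are made up for illustration:

    import json
    import time
    from typing import List

    def fetch_chunks_with_audit(doc_ids: List[str], user: str, index, audit_log):
        """Hypothetical retrieval wrapper: every chunk served from the index
        produces an audit record, whether or not the model later cites the file."""
        results = []
        for doc_id in doc_ids:
            chunk = index.get(doc_id)            # assumed index lookup
            audit_log.write(json.dumps({         # logging is a side effect of serving data
                "ts": time.time(),
                "user": user,
                "doc_id": doc_id,
                "source": "copilot-retrieval",
            }) + "\n")
            results.append(chunk)
        return results

The key property is that the log entry is a side effect of serving the data, so no prompt can talk the model out of writing it.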

jayofdoom - a day ago

Generally speaking, anyone can file a CVE. Go file one yourself and force their response. This blogpost puts forth reasonably compelling evidence.

QuadmasterXLII - a day ago

This seems like a five-alarm fire for HIPAA, is there something I’m missing?

usr1106 - a day ago

How does their auditing even work? Auditing should happen at the kernel level, and I sure hope they don't have Copilot in their kernel. So how can any access go unaudited?

Well, the article did not say whether the unaudited access was possible in the opposite order after a fresh start: first ask without a reference and get the content without an audit log entry, then ask without any limitation and get an audit log entry.

Did Copilot just keep a buffer/copy/context of what it had retrieved earlier in the sequence described? I guess that would go without a log entry for any program. So what did MS change or fix? Producing extra audit log entries from user space?
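
If the gap really was previously retrieved content sitting in a buffer or cache, the same principle would have to apply there too. A hypothetical sketch, not anything Microsoft has described:

    def get_document(doc_id: str, user: str, cache: dict, load, audit):
        """Serve from cache when possible, but record the access either way.
        `load` and `audit` are assumed callables, not any real Copilot API."""
        if doc_id in cache:
            audit(user, doc_id, source="cache")   # cache hits still count as access
            return cache[doc_id]
        content = load(doc_id)
        cache[doc_id] = content
        audit(user, doc_id, source="store")
        return content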

Foobar8568 - a day ago

We have cases where Purview was missing logs. Fun stuff when we tried to put together a postmortem at my work.

Microsoft tools can't be trusted anymore; something really broke since COVID...

troad - a day ago

Microsoft's ham-fisted strategy for trying to build a moat around its AI offering, by shoving everyone's documents in it without any real informed consent, genuinely beggars belief.

It will not successfully create a moat - turns out files are portable - but it will successfully tick off a huge number of users and institutions, and inevitably cause years of litigation and regulatory attention.

Are there no adults left at Microsoft? Or is it now just Copilot all the way up?

zavec - a day ago

Just to make sure I'm understanding footnote one correctly: it shows up (sometimes before and hopefully every time now) as a copilot event in the log, and there's no corresponding sharepoint event?

From a brief glance at the O365 docs it seems like the `AISystemPluginData` field indicates that the event in the screenshot showing the missing access is a Copilot event (or maybe they all get collapsed into one event; I'm not super familiar with O365 audit logs), and I'm inferring from the footnote that there's not another SharePoint event somewhere in either the old or new version. But if there is one, that could at least be a mitigation if you needed to search activity from before the fix.
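
As a rough mitigation for pre-fix activity, one could cross-reference the two event streams offline. A sketch that assumes you have exported the unified audit log to CSV with Operation and ObjectId columns; the operation names are assumptions and may not match your tenant's export exactly:

    import csv

    def copilot_reads_without_file_events(audit_csv_path: str):
        """List object IDs that appear in Copilot interaction events but have
        no matching SharePoint FileAccessed event in the same export."""
        copilot_objects, file_access_objects = set(), set()
        with open(audit_csv_path, newline="") as f:
            for row in csv.DictReader(f):
                op = row.get("Operation", "")
                obj = row.get("ObjectId", "")
                if op == "CopilotInteraction":     # assumed operation name
                    copilot_objects.add(obj)
                elif op == "FileAccessed":         # assumed operation name
                    file_access_objects.add(obj)
        return sorted(copilot_objects - file_access_objects)

Anything this returns only tells you a Copilot event exists with no matching file event, not why, so treat it as a starting point for manual review.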

overgard - a day ago

I don’t know much about audit logs, but the more concerning thing to me is that it sounds like it’s up to the program reading the file to register an access? Shouldn’t that be something at the file system level? I’m a bit baffled why this is a Copilot bug instead of a file system bug, unless Copilot has special privileges? (Also to that: ick!)

xet7 - a day ago

https://archive.is/PRTRA

Josh5 - a day ago

Are they even sure that the AI actually accessed the content that second time? LLMs are really good at making up shit. I have tested this by asking various LLMs to scrape data from my websites while watching the access logs. Many times they don't, and just rely on some sort of existing data or spout a bunch of BS. Gemini is especially bad about this. I have not used Copilot myself, but my experience with other AI makes me curious about this.
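
A cheap way to run that test, along the lines the comment describes: publish a page at a unique, never-linked URL, ask the model about it, then check the access log for the probe path. A minimal sketch assuming an Apache/nginx-style access log:

    import re
    import uuid

    def make_probe_url(base: str = "https://example.com") -> str:
        """Generate a unique, never-linked URL to ask the model about."""
        return f"{base}/probe-{uuid.uuid4().hex}.html"

    def was_fetched(access_log_path: str, probe_url: str) -> bool:
        """True if any line of the access log requested the probe path.
        If the model 'summarizes' the page without this returning True, it made it up."""
        path = re.escape(probe_url.split("/", 3)[-1])
        pattern = re.compile(rf"GET /{path}")
        with open(access_log_path) as log:
            return any(pattern.search(line) for line in log)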

sub7 - a day ago

Windows and any software coming out of Redmond today is pure spyware with little to zero utility.

This Clippy 2.0 wave of apps will obviously be rejected by the market but it can't come soon enough.

The higher $msft gets, the more pressure they have to be invasive and shittify everything they do.

thayne - 13 hours ago

I'm curious how they fixed this. Did they actually ensure the audit log is updated, or did they just give copilot some new instructions to update the audit log that could possibly be bypassed with the right prompt?

dmitrijbelikov - a day ago

Nobody usually bothers with logging actions on files; that's how it is almost everywhere. Handling file downloads is not trivial, there are many nuances, for example (rough sketch below):

- format
- where to store them
- logging
- info via headers
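
A toy example of the logging and headers items, sketched with Flask purely for illustration (paths, header names, and log format are made up):

    import logging
    from flask import Flask, request, send_file
    from werkzeug.utils import secure_filename

    app = Flask(__name__)
    audit = logging.getLogger("file-audit")
    logging.basicConfig(level=logging.INFO)

    @app.route("/files/<name>")
    def download(name: str):
        name = secure_filename(name)   # avoid path traversal in the requested name
        # Log the access before streaming, so the record exists even if the
        # transfer is aborted; include who asked and from where.
        audit.info("user=%s file=%s ip=%s",
                   request.headers.get("X-User", "anonymous"),
                   name, request.remote_addr)
        # as_attachment sets Content-Disposition so the browser downloads
        # the file instead of rendering it.
        return send_file(f"./storage/{name}", as_attachment=True, download_name=name)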

Pavilion2095 - 19 hours ago

Whoever designed this should be fired: https://ibb.co/yGHf2yB

self_awareness - 20 hours ago

> You might be thinking, “Yikes, but I guess not too many people figured that out, so it’s probably fine.”

To you, the reader of this comment: if you thought like this, the problem is also in you.

t0lo - 19 hours ago

Good old moralsoft

fud101 - 13 hours ago

I hope this guy doesn't get Copilot banned at work. What a tool.

thenaturalist - a day ago

Hardly ever have I seen corporate incentives so aligned to overhype the capabilities of a technology while it is as raw and unpolished as this one.

The bubble bursting will be epic.

stogot - a day ago

Remember when CISA called Microsoft’s security culture deficient?

https://www.cisa.gov/sites/default/files/2025-03/CSRBReviewO...

And remember when the Microsoft CEO responded that they will care about security above all else?

https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec...

Doesn’t seem like they’re doing that, does it?

heywire - a day ago

I am so tired of Microsoft cramming Copilot into everything. Search at $dayjob is completely borked right now. It shows a page of results, but then immediately pops up some warning dialog you cannot dismiss, saying Copilot can’t access some file “” or something. Every VSCode update I feel like I have to turn off Copilot in some new way. And now apparently it’ll be added to Excel as well. Thankfully I don’t have to use anything from Microsoft after work hours.

TheRoque - a day ago

In my opinion, using AI tools for programming at the moment, unless in a sandboxed environment and on a toy project, is just ludicrous. The amount of shady things going on in this domain (AI trained on stolen content, no proper attribution, no proper way to audit what's going out to third-party servers, etc.) should be a huge red flag for any professional developer.

conartist6 - 21 hours ago

Lie, cheat.