The first thing you discover isn't an AI problem
When you accept that AI use is already happening, the first instinct is to write a policy. Don't paste customer data into public tools. Don't upload contracts. Sanitise before you send. These are reasonable rules, and you should write them. But the policy itself is the easy part, and it's not the part that matters most.
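A rule like "sanitise before you send" can be made concrete with tooling. The sketch below is a minimal, illustrative sanitiser that masks obvious identifiers before text leaves the building; the regex patterns and labels are assumptions for illustration, and any real policy would need a far richer ruleset than this.

```python
import re

# Illustrative patterns only -- a real sanitisation policy needs much
# more than two regexes. These catch common email addresses and
# phone-like digit runs.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def sanitise(text: str) -> str:
    """Replace matched identifiers with a labelled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text
```

Even a crude pass like this makes the rule checkable rather than aspirational, which is usually the difference between a policy people follow and one they file.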
What matters is what you discover while writing it. You discover that nobody has a clean answer to the question "where does our customer data actually live?" You discover that there are folders on your shared drive that haven't been touched in four years and that nobody is sure who owns. You discover that the boundary between "internal" and "confidential" was never really defined — it was just understood. You discover that half of what people are pasting into AI tools is data they shouldn't have had quite such easy access to in the first place.
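Questions like "which folders haven't been touched in four years?" are answerable with a few lines of code. A hedged sketch, assuming a locally mounted share; the root path and the four-year threshold are placeholders taken from the example above, not a recommendation:

```python
import time
from pathlib import Path

def stale_files(root: str, years: int = 4):
    """Yield files under `root` not modified in `years` years.

    Uses modification time only -- a rough proxy for abandonment,
    and a starting point for the 'who owns this?' conversation.
    """
    cutoff = time.time() - years * 365 * 24 * 3600
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path
```

The point of a script like this is not cleanup automation; it is that the inventory question stops being unanswerable.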
This is the real value of taking shadow AI seriously. It is a forcing function. It surfaces a generation of accumulated information mess that the business has been able to ignore because nothing was reaching into it at the speed an AI does. Suddenly, the question of who has access to what — and whether that file is current, draft, or abandoned — becomes urgent in a way it wasn't before.
The companies that get this right treat the policy moment as a chance to clean house, not just to draw lines. The ones that get it wrong write the policy, file it, and continue to assume that the data underneath is in better shape than it is.