Reporting Concerns and Photo Safety
What to do if you see something that shouldn't be there, and what we do when you report it.
Date created: 2026-05-03
Pfotos is meant to be a safe space for families. Most of the time it is. But the internet is the internet, and occasionally someone might upload something that doesn't belong. This article explains how to report a concern, what happens after, and how we keep you safe in the process.
What to report
You can report:
- A photo that appears to show illegal content (especially anything involving children)
- A photo that's been uploaded without consent
- A user account that's behaving abusively
- Any content that violates our community standards
You can find our full Safety policy at pfotos.co/safety.
How to report
1. Open the photo or memory you want to report
2. Tap the More menu (the three dots)
3. Choose Report a concern
4. Pick the reason that fits best
5. Add any details that would help us understand
6. Submit
What happens after you report
Your report goes to our moderation team. Depending on what was reported, one or more of the following happens:
- Our team reviews the content
- Automated systems scan the content: Microsoft's PhotoDNA matches it against hash databases of known illegal material, and Google's Vision AI flags potentially concerning imagery
- If the content matches known illegal material, we report it directly to the National Center for Missing & Exploited Children (NCMEC) through their CyberTipline — which is the legally required path for that kind of content
- If the content violates our policies but isn't illegal, we take action ranging from removing the content to suspending the account
- We email you to let you know we received your report (we may not be able to share details of the outcome for legal reasons)
Will the person I reported know it was me?
No. Pfotos was designed so that the user being reported never sees who reported them. This is intentional — families should be able to flag concerns without fear of retaliation.
If we report content to NCMEC or law enforcement, the report identifies Pfotos as the reporting platform, not you as the reporter. Your identity is kept internal to Pfotos — we use it only to track patterns of reporting (so we can detect, for example, if a single user is filing many false reports as harassment), never to expose you to anyone outside our moderation team.
How seriously do we take this?
Very seriously. We've built our moderation pipeline beyond the legal minimum. The NCMEC CyberTipline integration is live, not just promised in a policy document. PhotoDNA hash matching runs on every upload. The Vision AI scan flags potentially concerning material before it's even visible to other users.
The reason we go further than required is in the founding story of the product. Pfotos exists to help families preserve what matters. That premise falls apart entirely if the platform isn't safe for the families it's serving — especially the children in those families. So we treat safety as a first-class commitment, not a checkbox.
What if I'm not sure whether to report?
Report it. We'd rather review a hundred concerns that turn out to be misunderstandings than miss the one that's real. There's no penalty for a good-faith report.
What if I see something concerning OUTSIDE Pfotos?
If you see content involving the exploitation of a child anywhere on the internet, you can report it directly to NCMEC at CyberTipline.org or by calling 1-800-843-5678. Don't wait — they're available 24/7.
If you're a child or young person (or anyone for that matter!) who needs help right now, contact the Crisis Text Line by texting HOME to 741741 (US/Canada) or call 988 (US Suicide & Crisis Lifeline).