‘It just disappeared’: How Meta’s safety crackdown upended one Victoria woman’s life

Imagine waking up one morning to find that decades’ worth of your personal online history has been erased. Irreplaceable photos and videos have vanished. Mementos from loved ones who have since passed have disappeared.

And not only that. Completely out of the blue, with no reason, justification, or warning, you are being falsely accused of “child exploitation.”

Given how much we now rely on social media, this sounds like an unthinkable nightmare. But as companies such as Meta grow increasingly powerful, it is becoming a reality for more and more people.

Victoria resident Courtnay Paige is one such victim of this seemingly arbitrary, impersonal process.

For Paige, a realtor and mother of two, the accusation seems ridiculous.

But the consequences are devastating.

On July 23, 2025, Paige found that her Facebook and Instagram accounts — both personal and business — had been suspended.

When she attempted to address the issue, she says Meta — which owns Facebook and Instagram — only gave vague explanations for the suspensions, with references relating to child exploitation, abuse, or nudity.

“There is zero on either account, on any of my accounts, that would ever have [warranted] this false allegation,” she said.



Paige says she tried creating new accounts multiple times. For several months, she attempted and failed to regain access to her accounts after they were repeatedly suspended. Multiple appeals resulted in nothing.

In November, she started yet another Instagram account and kept the content minimal and strictly business-focused. To gain access to customer support, Paige went so far as to have the account verified, signing up to pay a monthly fee of $25.

“I was just like, we will keep it simple, it will just be basically business and a couple of photos, and there is nothing that can go wrong there,” she said.

But go wrong it did, and this account was removed as well.

“I had my account up for, I want to say a month, maybe not even,” she said.

“And then it just disappeared.”

At some point, Paige heard back from a Meta representative, who told her her case would be reviewed. She was warned that it could take a while. She hasn’t heard back.

‘Expanding teen account protections’

The timing of Paige’s issues is no coincidence. On the same date that her accounts were suspended — July 23, 2025 — Meta announced it was “expanding teen account protections and child safety features.”

“At Meta, we work to protect young people from both direct and indirect harm,” the platform posted on its newsroom page.

“We’re bringing some Teen Account protections to adult-managed accounts that primarily feature children and are continuing to crack down on accounts that may seek to abuse them.”

The vague wording doesn’t exactly break down what that means, but the post goes on to mention “accounts run by adults that primarily feature children.”

“These include adults who regularly share photos and videos of their children, and adults – such as parents or talent managers – who run accounts that represent teens or children under 13,” it said.



This phrasing is broad and appears to mean that the millions of parents who post pictures of family members online could be in violation of some terms of use.

“While these accounts are overwhelmingly used in benign ways, unfortunately there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules.”

It also mentioned having taken down nearly 135,000 Instagram accounts “for leaving sexualized comments or requesting sexual images from adult-managed accounts featuring children under 13.”

“We also removed an additional 500,000 Facebook and Instagram accounts that were linked to those original accounts.”

Paige’s accounts didn’t even come close to meeting those criteria; in fact, her most recent business account only featured pictures of properties for sale.

Meta says it will notify account holders when it removes an account that "had interacted inappropriately with their content, encouraging them to be cautious and to block and report."

The post did not say who would receive that information, or how those who are accused could go about appealing such decisions.

Disconcerting limbo

Dr. Mike Zajko, an associate professor of sociology at UBC Okanagan, says Paige’s experience is part of a larger pattern. Since July, many users around the world have reported being locked out of their Meta accounts — sometimes permanently and with little to no explanation.

“This is not an uncommon situation at the moment, given the sorts of reports that have been coming out of a number of countries and various people expressing frustration at having their accounts suspended or disabled,” Zajko said.

While the situation may not be rare, the implications can be significant, he says.

“If you’re highly dependent on your social media profile to keep in touch with people or operate a business, that sort of thing, being stuck in this kind of limbo is very disconcerting,” he said.

Zajko says Meta has to juggle the competing obligations of removing harmful content while maintaining users’ rights.



“One of the responsibilities that they have is towards young people and keeping them safe, and they’ve been under some considerable legal pressure around that front recently,” he said.

“So for them it’s a matter of how they choose to balance those responsibilities and which ones they favour at any given moment in time.”

For users, however, the way the platform goes about this can be deeply problematic.

“From our perspective, it’s very much a black box, and we don’t know how those judgments are made on the outside,” he said. “Meta is very opaque in terms of how it makes these decisions and what it bases those decisions on.”

“If you are affected by one of these suspensions, they’re typically not going to tell you a whole lot.”

Often, users can only speculate as to what triggered the suspension.

“I’ve heard that sometimes it’s images being flagged; obviously, it could be words, it could be complaints,” Zajko said.

“There’s a lot of automation that Meta has been using to deal with the volume of these complaints, because we’re talking about billions of users around the world.”



He also points out that social media platforms are, by their very nature, not subject to the kinds of accountability standards that would exist in regulated industries or legal contexts.

“In a criminal case, you would have to involve disclosing all of the relevant evidence and that sort of thing,” he said.

“Meta isn’t held to any of those standards.”

He notes that while mistakes do happen and some accounts are reinstated, the process is unpredictable and frustrating. The deeper issue, he says, is society’s growing dependence on social media platforms for personal connections, memory storage, and business operations. This means we may be increasingly vulnerable to Meta’s unilateral decisions.

And, as seen in Paige’s case, the platform’s lack of transparency can leave people feeling powerless, he says.

In order to reduce the risk of accounts being removed, he says, people should consider maintaining an online presence on platforms that are not under the Meta umbrella. These include TikTok, X, LinkedIn, Reddit, YouTube, and Snapchat — although each of these has its own problems and limitations.

Or you can remove yourself from social media altogether.

‘The biggest gift’

Victoria resident Jessica Karpa has also had issues with her Meta accounts, which, like Paige’s, had been flagged as harmful to children.

And, like Paige, Karpa has no idea what triggered the issue.

For her, the trouble actually started before the platform announced its expanded protections.

“My account was first suspended in June, and then I appealed it, and then it took two weeks to retrieve access to my account,” she said.

“I would guess a human had a chance to take a look at it and verify that I was not, in fact, a danger to children.”

That wasn’t the end, however. In July, it was once again flagged for the same reason.

“I appealed it the same as I had before, and I am still yet to regain access to my account,” Karpa said.



She has decided not to pursue the appeal any further.

“I haven’t tried to escalate it further because, honestly, life without social media has been pretty freaking great,” she said.

“When I do get it back, I’m just going to download all of my content and delete it once and for all.”

Karpa says she felt she had a “severe addiction to social media.”

“I would spend hours and hours and hours on my phone, and now… I hardly touch it,” she said.

“I am on it maybe three times a day.”

She says she is completely done with social media.

“Yeah, it sucks to lose all the memories, but I’m finally able to live my life without all the noise,” she said.

“This has been the biggest gift.”

‘I’ve been begging, pleading’

Now, nearly five months since this saga began, Paige continues to fight for her accounts.

She says she has been unable to speak to any sort of human decision-maker throughout her appeal process.

She has kept documentation of her efforts to get the issue resolved and has sent numerous emails and submitted countless appeals, without success.

“I’ve been begging, pleading,” she said. “I’ve gotten nowhere.”



Paige believes — but hasn’t been able to confirm — that facial recognition technology may be involved, with each new account automatically being associated with the originally suspended ones for that reason.

Either way, the impersonal aspect of the process is frustrating.

“You need the human element, unfortunately,” she said.

“When things like this are happening, and that’s just being overridden by AI, and there is no human element even involved, it’s a pretty scary future to be in.”

1130 NewsRadio has reached out to Meta for comment.
