It is no secret that I am active on Instagram. @foodieExhalingLife is the only social media account I maintain. I am not on Facebook, X, Threads, or TikTok, despite Instagram’s persistent nudging. I do not operate multiple accounts—public or private. What I share is not content manufactured for engagement, but fragments of life as it is, captured on an iPhone and posted without pretense.
*Snapped from the streets of Lisbon, Portugal*
I spend about an hour each week reporting, deleting, and blocking men on Instagram. I have become efficient at it. It is a small, if persistent, cost of maintaining a public account. Even so, I often think about young girls on the platform. Are we expecting thirteen-year-olds to manage sexually aggressive messages, unsolicited images of genitalia, and links to pornography? What of the men who attempt to lure them with saccharine flattery, or those who offer weekly payments in exchange for nothing more than conversation?
When I read The Wall Street Journal’s “His Job Was to Make Instagram Safe for Teens. His 14-Year-Old Showed Him What the App Was Really Like,” I was struck less by surprise than by the absence of meaningful action. How many of us think of Instagram as a hunting ground for sexual predators? Meta, Instagram’s parent company, operates a multibillion-dollar business—not a safeguard for children. With projected revenue of $59.6 billion, any intervention that meaningfully reduces user engagement would, inevitably, affect its bottom line.
Mark Zuckerberg did not acquire Instagram for $1 billion in 2012 to police predation. He acquired it to expand a business. That much is clear.
And yet, the question remains.
It is the responsibility of parents—not Meta—to educate children about sexual predators and to ensure their social media accounts are private. Responsible parenting requires active involvement, and that must extend to digital life. Instagram’s algorithm does what it is designed to do: it learns from engagement. If you have ever liked a video of puppies, you understand how quickly your feed becomes saturated with them. Now imagine the feed of a predator who engages with images of young girls. The system does not distinguish intent; it amplifies it.
Is it Meta’s responsibility to protect children from predators? Corporate responsibility, social ethics, and basic decency would suggest yes. But is it their responsibility alone when parents knowingly expose their children to the same risks?
Do parents not bear the greater responsibility?
It need not be your child’s account. If you maintain a public profile and notice increased engagement on posts featuring your child, it is worth examining who that audience is. Images are not static. They are copied, altered, and redistributed—often in ways you will never see. To assume otherwise is willful blindness.
While the minimum age for an Instagram account in the United States is thirteen, children younger than that can appear on parent-managed accounts. As reported in The Wall Street Journal’s “The Influencer Is a Young Teenage Girl. The Audience Is 92% Adult Men,” the term that lingered with me was not “influencer,” but something far less palatable.
Pimping.
I understand the appeal. Children want visibility, and parents want to support them. Influence brings attention, products, and money. On the surface, it is an easy bargain to accept.
But when parents knowingly manage accounts where the majority of followers—and paying subscribers—are adult men, where images of their children circulate in private groups accompanied by explicit commentary, and still choose not to intervene, the question is no longer one of ambition. It is one of complicity.
Instagram maintains policies against monetizing child-focused accounts through subscriptions, though enforcement remains unclear. Even imperfect enforcement exceeds the inaction of parents who refuse to report, delete, or block for fear of diminishing engagement. Predators, after all, engage deeply—through prolonged viewing, likes, comments, and direct messages. Engagement, in this context, becomes indistinguishable from exploitation.
I am not conservative when it comes to sex or the business surrounding it. I support the legalization of prostitution with regulation and age protections. But this is not that.
Anyone who profits from the exploitation of minors—or facilitates their objectification—should be held criminally accountable. When it becomes known that a child is the subject of sexual fixation, and an adult continues to provide images that sustain that fixation, intent becomes secondary. Participation is enough.
And when that participation comes from a parent, the implications are far more severe.
What happens when fascination becomes action?
Can responsibility still be deferred?
We cannot hold Instagram solely accountable while ignoring the role of parents who knowingly expose their children to predators. When images of children are monetized, when engagement is prioritized over safety, when warning signs are ignored in favor of opportunity, the line is no longer ambiguous.
It is crossed.
It is immoral. When images provide sexual gratification, what distinguishes them from pornography? When money is made from that exchange—directly or indirectly—what distinguishes it from the business of sex?
These are not abstract questions.
They demand an answer.
When did it become acceptable for parents to knowingly place their children in the path of those who would exploit them?