AI Is the New Priesthood

How do you know that grass is green? Or that elephants exist?
If you've never seen grass or an elephant with your own eyes, do you "know" how they look?
When my daughter was 3 or 4, we grew white strawberries hydroponically. Her teacher asked her to draw strawberries, and she drew white ones with little red dots. The teacher said, "How creative." I looked over and thought: the teacher doesn't know that white strawberries exist.
Generative AI is built on probability. To an AI, a strawberry is 99% likely to be red. If a child asks an AI to "draw a strawberry," it will never draw a white one unless specifically prompted to.
If the AI always provides the "most likely" answer, children lose the ability to imagine the outliers. They begin to believe that the average, the AI's output, is the absolute.
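To make that concrete, here is a minimal, purely illustrative sketch in Python. The probabilities, labels, and function names are invented for this example and do not come from any real model; the point is only the difference between always returning the most probable answer and sampling from the full distribution.

```python
import random

# Hypothetical probabilities a model might assign to strawberry color.
# These numbers are invented for illustration, not taken from any real model.
color_probs = {"red": 0.99, "white": 0.01}

def most_likely(probs):
    # Always return the single most probable label: the "average" answer.
    return max(probs, key=probs.get)

def sample(probs):
    # Sample from the full distribution, so rare outcomes can still appear.
    labels, weights = zip(*probs.items())
    return random.choices(labels, weights=weights, k=1)[0]

print(most_likely(color_probs))   # always "red"
white_count = sum(sample(color_probs) == "white" for _ in range(10_000))
print(white_count)                # roughly 100: the white strawberries survive
```

The first function is the Priest: it erases the white strawberry every single time. The second at least leaves room for the outlier.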
What happens when technology doesn't just answer questions but sets the baseline for what you know? What happens when your entire perception of the world and of truth sits behind the prompts you give to an AI?
Enter the AI priesthood and a world in which truth is what you're told, not what you experience.
Oranges have seeds
Years ago, as a graduate student, my duties included teaching biology courses. To my surprise, when we got to the section on reproduction in plants, many of the students didn't realize that oranges had seeds. A little stunned, I asked, "Have you never seen seeds in oranges before?" They replied no. I then asked about grapes, and it was the same. Then one student said, "Oh, so that's why they're labeled seedless grapes. I'd never thought about it before."
How can you "know" what you've never experienced?
Just as those students didn't know oranges had seeds because the system removed them, children won't know the seeds of truth because the AI has smoothed them over for efficiency.
The AI Priest
Priests historically served as intermediaries between humankind and the supernatural. The priest had the answers, the people had the questions.
Today the people still have the questions, but now the AI has the answers. When "Google it" became the default way to get answers, people still had to choose the best response, often visiting multiple web pages and selecting the most reputable sources. But even then, the types of questions being asked were different from those now being asked of AI.
AI talks back. Its fluency across subjects positions it as an oracle among oracles. There's no question it won't attempt to answer. It mimics empathy. It offers companionship. It offers comfort. For many people in their formative years who have never experienced these gestures, this becomes their baseline of truth.
AI is the high priest and those who build it and understand it are members of the priesthood.
The AI Priesthood
Within the priesthood there are roles and responsibilities. Some build the machine, some socialize it, some teach others to use it, much like a monastery has roles.
And as users interact with AI, tithes are collected in subscription fees. If you pay the price, the AI remembers you and your conversations. The more you pay, the more access you get to its knowledge.
All the while, they never realize that the answers can be found outside of AI, in the material world.
In this new monastery, the High Priests admit that the machine is a "black box." They build the architecture, but cannot always explain the output. We are asking our children to have faith in a logic that even its creators cannot fully map.
The AI Confessor
We talk about AI companions in the context of dating, of loneliness. But the deeper concern is authority. What happens when children seek moral answers from AI? When they ask "Is this wrong?" or "What should I do?" and the machine answers without hesitation?
The AI is not just a companion. It is a confessor. It is a spiritual guide. It provides moral clarity on demand, without the weight of lived experience, without the humility of not knowing. It speaks with the certainty of a priest but without the centuries of tradition, doubt, and reformation that tempered religious authority.
We are handing our children a moral oracle that has never suffered, never failed, never sat in the dark wondering if it made the right choice.
The Ritual of the Prompt
For a child, the "prompt" is becoming a form of prayer.
In the past, if you wanted to know about the world, you had to encounter it. In the search engine era, you had to seek it. In the AI era, you simply ask, and the Priest provides.
The danger is that the Priest only provides what is in the "sacred text" of the training data. If the data says strawberries are red, the child stops believing in white ones. If the data says oranges are seedless, the child stops understanding how life reproduces.
We are raising a generation of "Digital Parishioners" who may never realize that the "Truth" they receive from the machine is just a highly polished average of our own past, not a reflection of the world's infinite, messy, and "white strawberry" possibilities.
Knowledge isn't wisdom
Intelligence in its simplest form is an organism's ability to detect and interact successfully with its environment. It isn't measured by rote memorization and recall; those constructs, so central to modern education systems, are largely artificial. It's no wonder, then, that the newest priest among us was built to recall information and deliver it in familiar ways. We've conditioned ourselves for years to stop searching beyond the horizon for things unseen. Instead we accept, we consume, without questioning the way things are.
If we continue to do that, how will we "know" what's reality, what is truth in the absence of experience? Knowledge isn't wisdom. Knowing that strawberries are red isn't the same as knowing when they're in season or fit to eat.
The AI knows a strawberry is red because it processed 10 billion images. My daughter knows that white strawberries exist because she grew and touched them.
The heresy of discovery
I see the temptation to let the AI Priest lead the way. It is efficient. It is safe. It is "seedless." But if we allow our children's baseline of reality to be set by a mathematical average, we are robbing them of the most human act there is: the heresy of discovery.
We must teach them that truth isn't found in the prompt, but in the dirt, the seeds, and the white strawberries that the machine doesn't know exist. Our job is no longer just to provide answers, but to protect the child's right to go looking for them in the land of the living.
About the Author
Sophia Banton is an AI leader working at the intersection of AI strategy, communication, and human impact. With a background in bioinformatics, public health, and data science, she brings a grounded, cross-disciplinary perspective to the adoption of emerging technologies.
Beyond technical applications, she explores GenAI’s creative potential through storytelling and short-form video, using experimentation to understand how generative models are reshaping narrative, communication, and visual expression.


