A Framework for Faithful Practice
Leveraging AI's power while keeping prayer, discernment, community, and the image-bearer at the center of our work.
"Whoever serves, let them serve with the strength that God provides, so that in all things God may be glorified through Jesus Christ."
— 1 Peter 4:11

Section 01
Six principles form the spine of the framework. Each is a posture — not a rule — that must be carried into the work.
AI has knowledge and reason but not wisdom or discernment. It can help us gain knowledge and understanding of a topic, and it can suggest how to apply what is learned, but it lacks the wisdom to be trusted with deciding what to do.
It is a tool of this age. It will not inherit the next.
AI's thinking is conformed to the patterns of the world rather than renewed in the Spirit. It cannot bypass the testing required to discern God's will. Only the renewed mind, led by the Spirit, can do that work.
And more than this: AI is not neutral. It exerts a spiritual gravity. Using it shapes us — toward calculation over presence, toward optimization over relationship, toward simulation over reality — even when we intend otherwise. The image-bearer must be aware that the tool is acting on them while they act on it.
AI opens up productivity and efficiency that previously did not exist. But the friction we used to face — the slowness of writing, the cost of research, the time of building — was itself a form of discernment-by-necessity. As that friction disappears, the discernment it once enforced must now be made explicit.
The human is the image-bearer, not the AI. Before assigning any task to AI, the image-bearer prays for help, guidance, and spiritual fruit. The AI is a tool wielded from a place of prayer — never a substitute for it.
Two postures to watch for: the first is pride dressed as integrity — clinging to a skill we have worked years to develop when AI is now better, refusing the humility of using the better tool. The second is hubris dressed as enablement — celebrating new capability AI grants us without having earned the wisdom to use it well. Both are spiritual failures. The image-bearer must hold both at once.
Our ability to evaluate AI output is bounded by our own knowledge, experience, and cultural context. There will be times when AI produces answers we cannot adequately judge — and times when our judgment of AI is itself in error. This is the catch: the very thing that makes discernment necessary (AI exceeding our experience) also makes it harder (our experience being too narrow to evaluate).
This is why discernment cannot be a solo activity. It requires community — image-bearers with different experiences and blind spots than ours, and where the work crosses cultural or contextual lines, voices from those contexts — and ultimately the Holy Spirit.
AI generates plausible content, not necessarily true content. Its outputs sound like human writing because they are statistically shaped to do so — but a confident answer is not a true answer, and a coherent sentence is not a faithful one.
The image-bearer carries the burden of verifying truth before publishing, sharing, or acting on AI output. In ministry contexts this matters doubly: a single hallucinated fact or mistranslated verse can erode trust that took years to build.
Section 02
Every question, problem, idea, or job to be done must pass through a four-question filter before any AI work begins.
Section 03
Once a job has passed the filter, one more question: "Should AI play a central role in this work, or only an adjacent one?" This is not a question of capability ("can AI do this?") but of weight.
For most work that has passed the four-question filter, AI can play a central role — but only when paired with a strong human-in-the-loop. The image-bearer must remain active in the work: verifying, shaping, redirecting, and taking responsibility for what is produced. AI is wielded; the human is never bypassed.
A small set of tasks has human presence, confession, or relational depth as a constitutive element of the work — not as a quality enhancement but as the thing that makes the work what it is. If the whole point of a task is presence, AI cannot substitute — it can only counterfeit. In these categories AI is not forbidden, but it is restricted to an adjacent role.
AI can support exegetical study, surface cross-references, draft outlines, and stretch our reading. It must not produce final theological content that reaches an audience without faithful human review.
AI can help with preparation, follow-up, and research. It must not be the conversation itself. An AI's appearance of empathy in grief is a form of deception. The value of a real voice on the line is the whole point.
Worship, communion, confession, prayer with another, the sermon delivered live — these require human presence as a constitutive element. AI may support preparation; it does not mediate the moment.
AI can support research, generate options, and clarify thinking. It cannot discern God's call. That work belongs to the Spirit, the image-bearer, and the community of faith around them.
For work where AI can play a central role, the question becomes: how present must the image-bearer be? Three intensities for three kinds of stakes.
AI assists with research and drafting only. The image-bearer makes every consequential choice and reviews every word.
AI produces drafts and proposals. Nothing leaves the loop without the image-bearer reviewing and approving. The default for most work.
The image-bearer reviews periodically and always on outputs that carry public weight. Reserved for low-stakes work that is easy to correct.
Section 04
AI tends to pull the image-bearer away from four anchors that hold a life of faith together. The framework asks not just how AI is used but whether these anchors remain intact through its use.
Are we rooted in Scripture, the witness of the saints, the wisdom of the Church across generations? Or is our diet of formation now mostly AI-mediated, AI-summarized, AI-curated?
AI can present anything as new and timeless at once, dissolving real tradition into an endless present. The past is an anchor only if we read it directly.
Are we rooted in embodied community — face-to-face conversation, shared meals, common worship, real accountability? Or is more and more of our relational life now mediated by algorithms and conducted with screens?
AI can simulate presence without delivering it. People are an anchor only if we are physically with them.
Are we rooted in actual geography — a parish, a neighborhood, a city, a piece of land we tend? Or are we increasingly "anywhere," meaning nowhere?
AI flourishes on placelessness; faithful life requires location. Place is an anchor only if we stay long enough to be shaped by it.
Are we rooted in active prayer — not optimized, not productive, not measurable? Or has prayer itself become another performance to be improved?
AI cannot pray. Neither can a person who has substituted optimization for it.
Section 05
This section maps the workflow that follows once a job has passed the four-question filter. Two lanes run on top of the four anchors of rootedness, which feed discernment into every gate.
Every act of work begins here. Before a problem is framed, before a question is asked, the image-bearer orients toward God's glory as the destination — not just the criterion.
AI assists; the human carries the central work.
AI carries the labor, with checks between every step.
Past, People, Place, Prayer — discernment fed into every gate.
Both lanes converge at Review. The review feeds back into the next cycle.
Section 06
Review is not just "did it work?" — it's a discernment step that asks six distinct questions.
Did this work bring glory to God? Or did it center us, our productivity, our convenience?
Was what we produced actually true? Did we verify, or did we trust the AI's confidence?
Did this work produce love, joy, peace, patience, kindness — or anxiety, pride, dependency?
Did this serve our neighbor? Did stewardship and inclusion shape the outcome?
Was the way we worked God-honoring? Did we pray? Did we listen? Or did we just ship?
Beyond this single piece of work — what pattern of use is taking shape in our lives, our team, our ministry? Each individual use can be defensible while the accumulated pattern forms something we would not choose.
Section 07
The framework's principles do not always pull in the same direction. Real decisions involve trade-offs, and pretending otherwise produces brittle frameworks.
Section 08
Most AI use happens inside teams, organizations, and ministry contexts. The principle: match the discernment community to the scope of the work.
When a team uses AI together, the discernment burden cannot be silently distributed. Someone must explicitly hold the question: "Should we be doing this at all?"
What does discernment look like when a client is paying you to use AI for them? Consultants owe their clients more than execution — they owe them the discernment work the client may not know to ask for.
When ministry crosses cultural lines, discernment requires voices from the recipient culture. The moment the work crosses a cultural boundary, the discernment community needs to cross with it.
Section 09
Use this one-page reference to evaluate any AI decision quickly. Each principle and question reduces to a single test.