Generative Artificial Intelligence for Local Churches: An Update
On June 18, 2024, the Office of General Counsel presented a webinar in partnership with the Insurance Board, "Generative Artificial Intelligence for Local Churches." The webinar introduced Local Churches to generative artificial intelligence ("genAI"), discussed risks and benefits, and emphasized the need for churches to develop a policy on the use of genAI. This article provides a brief reminder of the overall risks of generative artificial intelligence, discusses some legal updates in copyright law, and identifies risks in using AI notetakers. If you are new to generative artificial intelligence, please visit the Insurance Board's Online Learning webpage and scroll down to Human Resources to view the webinar and download the related policy checklist, then come back to this article.
Generative Artificial Intelligence Risks
A primary risk of using genAI models to generate content and information, answer questions, or conduct research is that the output may be incomplete or wrong. OpenAI, creator of ChatGPT, a popular large language model (LLM), acknowledges in its Terms of Use that "Output may not always be accurate." This risk requires human review and assessment of output created by genAI models like ChatGPT and other tools, whether they are stand-alone tools or built into other systems the Local Church is already using. Additional risks include output that violates the U.S. Copyright Act, perpetuates discrimination and stereotypes, or plagiarizes others' works. Uploading data into genAI models may present copyright protection, data privacy, and security risks. Please view the webinar to learn more about each of these risks.
Copyright Infringement Update
One risk the webinar addressed is copyright infringement: both the infringement of others' rights by a Local Church, and the infringement of the Local Church's copyright when a church's works are used to train a genAI model without authorization. The law continues to develop in this area.
Are works a church produces using genAI tools copyrightable and registrable?
The U.S. Copyright Office released Part 2 of its report, Copyright and Artificial Intelligence, on January 29, 2025. Part 2 specifically addresses the copyrightability of works produced using genAI. A church that produces music, liturgies, Bible studies, educational curricula, or other works should know that AI-generated material within these works may not be registrable in the U.S. Copyright Office or otherwise protected under the copyright laws. While human-created elements of these works can still be registered, the Copyright Office will analyze each work on a case-by-case basis. The Copyright Office is clear that human-created prompts used to create the output are not sufficient to show human control over the expressive elements of a work.
Are a church’s rights in its own works infringed by a genAI company using the works to train its model?
This question is currently being litigated. While some decisions have been issued in these cases, the cases are not over and may result in additional decisions or be appealed.

Bartz v. Anthropic PBC is a putative class action in the Northern District of California brought by authors alleging that Anthropic trained its genAI model on the authors' works. Some of the works were obtained legally by purchasing them; some were obtained illegally from book pirating websites. The judge found that where Anthropic obtained the works legally, using the works to train the genAI model was protected by the fair use doctrine because the use of the works was transformative. (For more information on fair use, see the Columbia University Libraries Fair Use page. Many academic institutions have similar pages.) Where the works were obtained illegally, the judge found the use to be infringing and rejected a fair use defense. Following that decision, Anthropic reached a settlement with the authors, though the settlement must still overcome some legal hurdles and the process is ongoing at the time of this publication.

Kadrey v. Meta is a case in the same jurisdiction where a judge also found Meta's use of the authors' works to train its genAI model to be transformative and protected by fair use. Unlike in Bartz, the judge did not find the questionable way that Meta obtained the works to be dispositive. The judge's analysis focused instead on the plaintiffs' failure to show the market for their work was harmed. Judgment was entered for Meta.

Many other lawsuits alleging copyright infringement are pending in different jurisdictions. Because the law is not settled, we cannot make generalizations as to a church's rights in its own works when the works are used without authorization to train a genAI model. Churches should be aware, however, that this use of their works, even without permission, may be considered fair use, and not infringement, under the U.S. Copyright Act.
Does a church’s use of text, images, or music produced by a genAI model infringe on others’ copyrights?
The Office of General Counsel is not aware of any current lawsuits bringing copyright infringement claims against end users whose use of a genAI tool results in output that arguably infringes copyrights owned by others. If a Local Church chooses to use these tools in creating content, it should be aware that outputs may be so similar to the works on which the models were trained that an infringement claim is possible. For example, in a different ongoing lawsuit against Anthropic by music publishers, the complaint alleged that when prompted to write a short piece of fiction in the style of Louis Armstrong, the genAI tool generated a response containing large portions of the lyrics to Armstrong's song, "What a Wonderful World." The output did not acknowledge the title, authorship, or ownership of those lyrics. (End users of the technology are not defendants in this case.) The best way for a Local Church to protect itself against claims of infringement is to engage in careful human review, pursuant to its policy, of any genAI-created content or works.
Risks of AI Notetakers
AI notetakers are a class of genAI tools that may be offered as plug-ins or built into electronic meeting platforms like Zoom and Teams. Popular tools include Otter.ai, Zoom AI Companion, and Read AI, though many others exist. A notetaker tool "listens" to an electronic meeting while it is happening. The tool may transcribe and/or summarize the meeting, capture "takeaways" or "next steps," or assign tasks to meeting attendees. The tool may also integrate with calendars and other applications. A Local Church that uses these tools should understand the risks they pose. AI notetakers can be challenging to manage without a strong policy that is monitored and enforced, because individuals like church council members, volunteers, and staff (if not otherwise prohibited) may have their own notetakers operating during electronic meetings, along with any notetaker the church is using as host of the meeting.
Understand the data security and privacy risks.
Read the terms and conditions of any AI notetaker carefully and understand how the tool uses and protects the church’s data. The tool may provide the church’s data to the company that owns the notetaker to train the AI model, or to third parties, both of which may present data disclosure risks for the church. The church may decide to limit its use to tools that assure the church’s data is protected and not used to train AI models or provided to third parties.
Beyond the data security risks, AI notetakers generate meeting summaries that may allow personal, confidential, and even erroneous information to be easily shared with others. These summaries may be stored indefinitely by many people, increasing the chance of inadvertent or unwanted disclosure.
Notify and obtain consent to record from meeting attendees.
AI notetakers operate essentially by recording the meeting. If the Local Church's standing rules, governing documents, or policies prohibit or restrict meeting recordings, AI notetakers should also be prohibited or restricted, unless a new policy is approved by the church. As with other methods of recording, a Local Church using an AI notetaker in its meetings should ensure that all meeting attendees are aware an AI notetaker is recording the meeting and either consent to the recording or leave the meeting. The church must determine whether individual meeting attendees should be permitted to use their own notetakers during church meetings. If not, unwanted notetakers may need to be removed from the meeting by the meeting host. The use of AI notetakers by staff and during staff meetings should be addressed in the church's human resource policies. While the church's policy may be to permit or require the recording of staff meetings, by AI notetaker or otherwise, staff should still be reminded that a recording is taking place; indeed, in some jurisdictions the law may require this notice. Staff should be permitted to use only notetakers approved by the church in conducting church business and should strictly limit access to, and dissemination of, the summaries.
An individual’s notetaker may try to attend the meeting even if the individual does not.
Because many notetakers offer calendar integration, an individual’s AI notetaker may show up at a meeting that is on the individual’s calendar, even if that individual does not attend the meeting. A church should strongly consider prohibiting any individual’s AI notetaker from attending a meeting the individual is not attending.
Do not automatically distribute summaries, and never distribute summaries without review.
Notetaker summaries may be incorrect or incomplete, lack context, attribute comments or statements to the wrong individuals, and/or provide irrelevant action items. Summaries should be reviewed and corrected at the earliest opportunity by a designated individual who attended the meeting to ensure accuracy and usefulness. Turn off the setting that automatically sends summaries to all meeting invitees. Be aware that distributing a summary through the notetaker may require meeting participants to log into a third-party platform or create an account with the notetaker tool, which may grant the notetaker unanticipated permissions.
AI notetaker summaries are not meeting minutes.
Churches routinely have official business meetings under parliamentary procedure rules (such as Robert's Rules) that require minutes. AI notetaker summaries are not minutes. Minutes should be prepared separately and approved at the next meeting. The meeting minutes, not the notetaker summary, are the official record of the meeting; once the minutes are approved, consider whether recordings and summaries should be discarded.
The best way for Local Churches to manage these risks is through a clear, strictly enforced policy addressing the use of AI notetakers. Expect to train staff and volunteers on the policy, and become familiar with the technology so that you can identify inappropriate use of notetaker tools.
For best practices in managing other AI risks, see the Insurance Board’s Fall 2025 edition of The Steward.
The information provided in this article is not legal advice. If you need legal advice, please consult with an attorney.