
On April 8, Fannie Mae published Lender Letter LL-2026-04. Two pages. No press conference. And it quietly redrew the operating rules for every mortgage company in America that uses artificial intelligence.
The letter establishes a mandatory governance framework for any seller or servicer that uses AI or machine learning in connection with originating or servicing Fannie Mae loans. The compliance deadline is August 6, 2026, which is 120 days from publication.
This follows Freddie Mac, which enforced its own AI governance requirements starting March 3.
Both GSEs are now saying the same thing: if you use AI, you govern it, you document it, you own it, and you prove it on demand.
Most branches I have spoken with are not ready. Many do not even know this letter exists.
Before I break down what the letter requires, let me answer the question that matters most to you.
WHO DOES THIS ACTUALLY APPLY TO?
LL-2026-04 is addressed to "All Fannie Mae Single-Family Sellers and Servicers." That means the entity that holds Fannie Mae seller/servicer approval. In practical terms, that is the retail bank, the independent mortgage bank, or the credit union that funds and sells the loan.
The CEO, COO, or Chief Compliance Officer at that institution is ultimately accountable. Freddie Mac's version goes further and specifically requires sign-off from the CIO, CTO, CISO, or CRO.
If you are a branch manager or production leader at one of these institutions, this letter did not land on your desk. It landed on your CEO's desk. But here is the reality: if nobody on the production floor understands what AI tools are in use and raises the issue internally, the deadline arrives and the branch is exposed. You are the person closest to the tools. That makes you the person best positioned to lead this conversation.
If you are a mortgage broker, LL-2026-04 does not directly bind you. Brokers are not seller/servicers. However, the letter explicitly requires seller/servicers to govern the AI use of their vendors, subcontractors, and third-party originators to the same standard they govern their own. Wholesale lenders will start updating their TPO agreements to require AI disclosures and governance documentation from the brokers they work with. If you are a broker using AI tools in your origination process, expect your wholesale partners to start asking about it before August.
WHAT THE LETTER ACTUALLY REQUIRES
I read the full letter and cross-referenced it against the Cooley law firm's legal analysis, National Mortgage News coverage, and Freddie Mac Guide Section 1302.8, the provision it builds on. Here is what it comes down to.
If you use AI or ML anywhere in origination or servicing (automated underwriting, document processing, fraud detection, income verification, quality control, chatbots, borrower communications), you must have written policies and procedures covering the full lifecycle of every AI system: development, implementation, use, maintenance, and risk measurement.
Those policies must be transparent and communicated to every employee whose job touches AI. They must incorporate what Fannie Mae calls "trustworthy and ethical AI/ML." They must reflect an understanding of legal and regulatory requirements. They must align with your institution's risk tolerance. And they must have a designated owner who implements, maintains, and reviews them at least annually.
You must also comply with Fannie Mae's Information Security and Business Resiliency Supplement. And you must manage and govern any subcontractor or vendor's use of AI to a standard "no less protective" than your own.
Upon request from Fannie Mae, you must promptly disclose what AI you are using, why you are using it, how you are using it, and what safeguards you have put in place.
FREDDIE MAC IS EVEN TOUGHER
If your company only sells to Fannie Mae, the requirements above are your floor. If you sell to both GSEs, and most mid-size lenders do, you need to meet the higher standard.
Freddie Mac's version, effective since March 3, is more prescriptive. It requires active assessment of AI systems against specific attack vectors. It mandates regular internal and external audits measured against named industry standards, specifically NIST 800-53 and ISO 27001. It requires ongoing monitoring for performance degradation and bias. It demands segregation of duties with documented accountability structures. And it requires senior management approval of AI policies.
The biggest difference: Freddie Mac includes a broad indemnification clause. If your AI causes a loss, you absorb it. Fannie Mae's letter makes no reference to indemnification, but the accountability is implicit.
As one industry CEO summarized it: Fannie Mae is "principles-based" while Freddie Mac is "prescriptive." You need to satisfy both.
WHY THIS IS REALLY ABOUT STRUCTURE, NOT COMPLIANCE
I have been thinking a lot about what separates the lenders who will handle this well from the ones who will scramble in July.
It is not resources. It is not budget. It is whether they built their AI adoption on a foundation or on momentum.
For years, lenders have adopted AI tools because they work. They speed things up. They catch fraud. They reduce costs. That is clarity. You understand that AI improves your operation.
But clarity without governance is fragile. The lender who deploys an AI fraud detection tool without documenting how it works, who oversees it, what happens when it flags a file incorrectly, and how bias is monitored has clarity but no structure. LL-2026-04 just applied the pressure.
The branches that will thrive through August and beyond are the ones that treat this not as a compliance checkbox but as an opportunity to build something durable. A governance program that holds under regulatory scrutiny. A vendor management framework that does not depend on one person remembering to check. A documented escalation process for when AI gets it wrong.
Systems replace vigilance. The lenders who understand that will not just survive this deadline. They will be the ones their partners and borrowers trust most on the other side of it.
YOUR 100-DAY ACTION PLAN (FOR BRANCH MANAGERS AND PRODUCTION LEADERS)
You are probably not the person who signs the governance policy. But you are the person who knows the production floor better than anyone in the C-suite. Here is how to lead this conversation.
Step 1: Inventory every AI and ML tool in your branch's origination workflow. Include anything your team touches that automates a decision, processes a document, verifies information, or communicates with a borrower. Include vendor tools. Most branches will find 5 to 12 tools they did not realize were AI-powered.
Step 2: For each tool, write down what it does, why your branch uses it, how it fits into your workflow, and whether you know what safeguards exist. You do not need to answer every question. The point is to see where the gaps are.
Step 3: Bring the inventory to your CEO, COO, or compliance lead. Frame it as: "Fannie Mae's August 6 deadline requires us to document and govern every AI tool we use. Here is what I have found on the production floor. We need a designated owner and a plan."
Step 4: Ask about vendors. Request AI governance documentation from every third-party provider your branch uses. If a vendor cannot provide it, flag it immediately. Under the letter, their compliance gap is your company's compliance gap.
Step 5: Volunteer to help build the training component. The letter requires that policies be communicated to all personnel whose jobs touch AI. You know who those people are. Offer to lead the rollout at your branch.
Step 6: Document everything you do. The word "documented" appears repeatedly in both the Fannie Mae and Freddie Mac frameworks. If it is not written down, it does not exist for compliance purposes.
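The inventory in Steps 1 and 2 can live in a spreadsheet, but even a small script makes the gaps visible at a glance. Here is a minimal sketch, not a compliance template: the tool names and vendors are hypothetical, and the fields simply mirror the letter's disclosure language (what you use, why, how, and what safeguards exist).

```python
# Illustrative AI tool inventory with a gap report.
# Tool names, vendors, and fields are hypothetical examples, not a
# compliance template; adapt the fields to your own governance policy.

from dataclasses import dataclass, field

@dataclass
class AITool:
    name: str
    vendor: str
    what_it_does: str                 # what the tool automates
    why_we_use_it: str                # business purpose
    workflow_stage: str               # where it sits in the workflow
    safeguards: list[str] = field(default_factory=list)
    owner: str = ""                   # designated accountable person
    vendor_docs_on_file: bool = False # governance docs received from vendor

def gap_report(tools: list[AITool]) -> list[str]:
    """Flag the missing pieces a governance policy will need."""
    gaps = []
    for t in tools:
        if not t.owner:
            gaps.append(f"{t.name}: no designated owner")
        if not t.safeguards:
            gaps.append(f"{t.name}: no documented safeguards")
        if not t.vendor_docs_on_file:
            gaps.append(f"{t.name}: no vendor governance documentation")
    return gaps

inventory = [
    AITool("DocBot OCR", "Acme Docs",
           "extracts data from borrower documents",
           "speeds up processing", "document intake"),
    AITool("FraudScan", "Example Analytics",
           "scores files for fraud risk",
           "catches fraud earlier", "underwriting",
           safeguards=["manual review of every flag"],
           owner="QC lead", vendor_docs_on_file=True),
]

for gap in gap_report(inventory):
    print(gap)
```

Even a rough version of this, printed and dated, is the kind of documented record both frameworks keep asking for: every tool listed, every gap named, every gap assigned to someone.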
FOR BROKERS
Your action plan is simpler but still urgent. Inventory the AI tools in your origination process. Prepare documentation that describes what you use, how you use it, and what safeguards are in place. Expect your wholesale partners to request this information in the coming weeks as they build their own governance programs. Being ready before they ask puts you ahead of every other broker on their roster.
THE BIGGER PICTURE
This letter is not a one-time event. The Cooley analysis noted that sellers and servicers should expect the GSEs to revisit and refine these standards over time and make AI a new focal point of examinations. In February, a Pennsylvania homeowner sued a lender alleging AI-generated marketing calls violated the Telephone Consumer Protection Act. Scrutiny is increasing on every front.
The branches that move on this now will not just be compliant by August. They will be the ones that lenders, partners, and borrowers trust most. Because they built something that holds.
Most mid-size lenders do not have a playbook for this yet. Neither did I. So I am building one from scratch for my own branch and documenting every step as I go.
First in. Documenting everything. Building the frameworks for what comes next.
If you want to talk through what this looks like for your operation, reply to this email or find me on LinkedIn. I am not hard to find.
Josh
