How to Write an AI RFP - What to Include and What to Skip
Writing an RFP for an AI project is different from writing one for standard software. Most AI RFPs we see either over-specify the technical approach (locking out good solutions) or under-specify the problem (leading to proposals that can't be compared).
Having responded to many AI RFPs and helped clients write them, I've developed a strong opinion about what works. Here's a practical guide to writing an AI RFP that attracts quality vendors and produces proposals you can actually evaluate.
Why Most AI RFPs Miss the Mark
The standard enterprise RFP format doesn't work well for AI projects. Here's why.
Traditional RFPs specify the solution. They describe exactly what to build - screens, features, integrations. AI projects shouldn't work this way because the best approach depends on the data, and you often don't know the optimal solution until you've explored the data.
Traditional RFPs assume fixed scope. AI projects have inherent uncertainty. The model might need different training data. The accuracy target might require a different approach than planned. Good AI vendors will tell you this; rigid RFPs discourage honesty.
Traditional RFPs focus on features. AI projects should focus on outcomes. You care about "process 500 invoices per day with 95% accuracy," not "build a convolutional neural network with three hidden layers."
The best AI RFPs describe the problem clearly, define success in measurable terms, and give vendors room to propose their best approach.
The Structure That Works
Here's a section-by-section breakdown of an effective AI RFP.
Section 1 - Company Background
Keep this brief. Two to three paragraphs covering:
- What your organisation does
- Your industry and scale (revenue, employees, locations)
- Relevant context about your technology environment
- Why you're exploring AI now
Don't include your entire company history or organisational chart. Vendors need enough context to understand your world, nothing more.
Section 2 - The Business Problem
This is the most important section. Invest time here.
Describe the current state. What's the process today? Who does it? How long does it take? What are the pain points? What's the volume?
Quantify the impact. How much is this costing you? Include direct costs (labour, errors) and indirect costs (delays, missed opportunities).
Describe the desired future state. What does the process look like after the AI is working? Be specific about outcomes, not technology.
Example of a good problem description:
Our claims processing team manually reviews approximately 2,000 insurance claims per week. Each claim takes an average of 12 minutes to assess, categorise, and route to the appropriate handler. Error rates in categorisation are approximately 15%, leading to delays and rework. We want to automate the initial assessment and routing, reducing manual processing time by at least 70% and improving categorisation accuracy to 95% or higher.
Example of a bad problem description:
We want to use AI to improve our claims processing.
The difference should be obvious. The first gives a vendor everything they need to propose a solution. The second gives them nothing.
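The figures in the good example also let a vendor sanity-check the business case before proposing. A rough back-of-envelope sketch using the numbers from the example above (the $45/hour loaded labour rate is an assumption for illustration only):

```python
# Impact estimate from the example problem description.
# The hourly rate is an illustrative assumption, not from the RFP.
claims_per_week = 2000
minutes_per_claim = 12
hourly_rate = 45  # assumed loaded labour cost, $/hour

manual_hours_per_week = claims_per_week * minutes_per_claim / 60
weekly_labour_cost = manual_hours_per_week * hourly_rate

target_reduction = 0.70  # the RFP's "at least 70%" goal
hours_saved_per_week = manual_hours_per_week * target_reduction
annual_saving = hours_saved_per_week * hourly_rate * 52

print(f"Manual effort: {manual_hours_per_week:.0f} hours/week")
print(f"Saving at 70% reduction: ${annual_saving:,.0f}/year")
```

Even a rough calculation like this tells vendors whether their likely price fits the value on the table.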
Section 3 - Success Criteria
Define what "done" looks like with measurable criteria.
Include:
- Performance targets: Accuracy, speed, volume capacity
- Minimum viable outcome: The lowest acceptable result that still justifies the investment
- Stretch goals: What would make this a standout success
- Business metrics: ROI expectations, payback period
- Operational requirements: Uptime, latency, throughput
Be realistic. If you set targets too high, good vendors will either pad their pricing or decline to bid. If you set them too low, you won't get the value you need.
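One way to keep success criteria honest is to write them as checkable thresholds with both a minimum and a target. A minimal sketch, using illustrative figures borrowed from the claims-processing example (the metric names and numbers are assumptions, not a standard):

```python
# Success criteria expressed as minimum/target thresholds.
# All names and values are illustrative assumptions.
success_criteria = {
    "categorisation_accuracy": {"minimum": 0.90, "target": 0.95},
    "manual_time_reduction":   {"minimum": 0.50, "target": 0.70},
}

def evaluate(measured: dict) -> dict:
    """Classify each measured metric against its thresholds."""
    results = {}
    for metric, thresholds in success_criteria.items():
        value = measured[metric]
        if value < thresholds["minimum"]:
            results[metric] = "below minimum"
        elif value < thresholds["target"]:
            results[metric] = "viable"
        else:
            results[metric] = "target met"
    return results

print(evaluate({"categorisation_accuracy": 0.93,
                "manual_time_reduction": 0.72}))
```

Framing criteria this way forces a useful conversation: which outcomes merely justify the spend, and which would count as a standout result.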
Section 4 - Data Description
Describe what data you have, honestly. Vendors will find out the true state of your data eventually, so being upfront saves everyone time.
Include:
- Data types and sources: What data is relevant and where does it live
- Volume: How much data is available for training and testing
- Quality: Your honest assessment of data quality (completeness, consistency, accuracy)
- Access: How the data can be accessed (APIs, database exports, manual extraction)
- Sensitivity: Privacy, compliance, or security classifications
- Labels: Whether you have examples of correct outputs for training
If you're not sure about your data situation, say so. A vendor who proposes a data assessment as the first step is being responsible.
Section 5 - Technical Environment
Describe your current technology landscape as it relates to this project.
Include:
- Cloud platform (Azure, AWS, Google Cloud, on-premises)
- Key systems the AI will need to integrate with
- Authentication and security requirements
- Data sovereignty requirements
- Existing AI or ML infrastructure (if any)
- Preferred programming languages or frameworks (if any, but be careful about being too prescriptive)
Section 6 - Scope and Phasing
Define what's in scope for this engagement and what's not.
A good approach is to define phases:
- Phase 1 - Discovery and Data Assessment: Validate the data, refine requirements, confirm feasibility
- Phase 2 - Proof of Concept: Build a working prototype demonstrating the approach
- Phase 3 - Production Development: Build the production system with integrations
- Phase 4 - Deployment and Optimisation: Deploy, monitor, and tune
You can ask vendors to quote on Phase 1 with indicative pricing for later phases. This reduces risk for everyone - you're not committing to a full build until the data has been assessed and the approach has been validated.
Section 7 - Team and Governance
Describe:
- Your internal project team (who vendors will work with)
- Decision-making process (who approves what)
- Communication expectations (cadence, format)
- Reporting requirements (progress updates, risk reporting)
Be clear about your internal capacity. If your team has limited availability for the project, say so. It's better for vendors to plan around constraints than to discover them mid-project.
Section 8 - Evaluation Criteria
Tell vendors how you'll evaluate their proposals. This is a courtesy that also improves the quality of responses you receive.
A sample weighting:
| Criterion | Weight |
|---|---|
| Understanding of the problem and proposed approach | 30% |
| Relevant experience and track record | 25% |
| Team composition and qualifications | 20% |
| Pricing and value for money | 15% |
| Risk management and governance | 10% |
Adjust weights to reflect what matters most to your organisation. But sharing them upfront means vendors focus their proposals on what you actually care about.
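Applied mechanically, the sample weighting above is just a weighted sum. A short sketch of how scores might be combined (the vendor names and raw scores out of 10 are made up for illustration):

```python
# Weighted scoring using the sample weights from the table above.
# Vendor names and raw scores are hypothetical.
weights = {
    "problem_understanding": 0.30,
    "experience":            0.25,
    "team":                  0.20,
    "pricing":               0.15,
    "risk_governance":       0.10,
}

vendor_scores = {
    "Vendor A": {"problem_understanding": 9, "experience": 7, "team": 8,
                 "pricing": 6, "risk_governance": 7},
    "Vendor B": {"problem_understanding": 6, "experience": 9, "team": 7,
                 "pricing": 9, "risk_governance": 8},
}

def weighted_total(scores: dict) -> float:
    return sum(scores[criterion] * w for criterion, w in weights.items())

for vendor, scores in sorted(vendor_scores.items(),
                             key=lambda kv: weighted_total(kv[1]),
                             reverse=True):
    print(f"{vendor}: {weighted_total(scores):.2f} / 10")
```

Note how the weighting can flip the ranking: a vendor who is cheaper and more experienced can still lose to one who understood the problem better, because problem understanding carries the largest weight.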
Section 9 - Commercial Requirements
Keep this practical:
- Budget range or budget envelope (sharing this gets you more realistic proposals)
- Preferred pricing model (time and materials, fixed price, hybrid)
- Payment terms
- Intellectual property expectations (who owns the models, code, and data)
- Confidentiality requirements
A note on budget: many RFPs don't share a budget range. This is a mistake. You'll get proposals ranging from $50K to $500K, and three-quarters of them will be outside your range. Sharing a budget range (even a broad one) helps vendors calibrate their proposals and saves everyone time.
Section 10 - Process and Timeline
Lay out:
- RFP issue date
- Deadline for vendor questions
- Proposal submission deadline
- Evaluation period
- Shortlisted vendor presentations (if applicable)
- Decision date
- Intended project start date
Allow enough time. Two weeks to respond to a complex AI RFP is too short. Three to four weeks is reasonable.
What to Skip
Just as important as what to include is what to leave out.
Don't Specify the Model Architecture
"We want a solution built using GPT-4 with a fine-tuned classifier" - don't do this. You're hiring an AI development company because they have the expertise to recommend the right approach. Specifying the architecture limits your options and may result in a worse outcome.
Don't Require a Fixed Price for the Entire Project
AI projects have genuine uncertainty. Requiring a fixed price for everything from discovery to deployment forces vendors to pad their estimates significantly or make assumptions they can't verify. Instead, ask for fixed pricing on discovery/PoC and indicative ranges for later phases.
Don't Write 60 Pages
Long RFPs don't get better responses. They get copy-pasted answers from previous proposals. The best responses come from RFPs that are clear, specific, and under 15 pages. Focus on quality of information, not volume.
Don't Include Irrelevant Requirements
Standard procurement requirements (insurance certificates, company registration details, financial statements) should be requested separately or as appendices. They clutter the RFP and distract from the substance.
Don't Ask for a Full Project Plan
At the RFP stage, vendors don't know enough to create a meaningful detailed project plan. They can provide an approach, high-level timeline, and key milestones. A detailed plan comes after discovery.
How to Evaluate AI RFP Responses
Once responses come in, here's how to evaluate them effectively.
Look for Honesty About Uncertainty
Good AI vendors will identify risks and uncertainties in their proposals. Vendors who promise guaranteed outcomes without caveats are either inexperienced or misleading you. A proposal that says "we expect 85-92% accuracy based on similar projects, with final performance dependent on data quality" is more trustworthy than one that says "we guarantee 95% accuracy."
Evaluate the Questions They Ask
The best vendors will submit clarifying questions before their proposal. The quality of those questions reveals their experience. Questions about data quality, edge cases, integration complexity, and success criteria are good signs. Receiving no questions at all is a warning sign.
Check the Team, Not Just the Company
AI projects are delivered by people, not brands. A proposal from a large consultancy staffed with junior graduates is worth less than a proposal from a specialist firm with experienced AI engineers. Ask who specifically will work on your project and review their backgrounds.
Compare Approaches, Not Just Prices
The cheapest proposal is rarely the best value. Compare the proposed approaches - what's included, what's excluded, what assumptions have been made. A $200K proposal with thorough data assessment, testing, and post-launch support is better value than a $120K proposal that skips data validation and ends at deployment.
Watch for Template Responses
If large sections of the proposal read like they could apply to any client, the vendor didn't invest time in understanding your problem. Look for specific references to your problem, your data, your industry, and your constraints.
A Note on Procurement Processes
Enterprise procurement processes are designed for buying known commodities. AI projects aren't known commodities. If your procurement team insists on a rigid process, work with them to adapt it.
Useful adaptations:
- Allow for a discovery phase before committing to a full build
- Include a "go/no-go" decision after the proof of concept
- Build in flexibility for scope adjustments based on what the data reveals
- Weight technical capability and relevant experience more heavily than price
The goal of procurement is to get the best outcome for the organisation. For AI projects, that means finding the best team with the right experience - not just the lowest price.
Getting Started
Writing an AI RFP well takes effort, but it pays off in better proposals, easier evaluation, and ultimately a better project outcome. If you're preparing to go to market for an AI project and want feedback on your RFP before it goes out, we're happy to share our perspective.
Explore our AI development services, learn about our AI consulting approach, or contact us to discuss your upcoming project.