
Recording Bid Outcomes

Capture win/loss/no-bid results, document debrief insights, and create feedback loops to improve future decision accuracy

Updated 2026-03-30 · 26 min read


Recording bid outcomes transforms your bid decision process from static predictions to a continuously learning system. Every win, loss, and no-bid decision generates valuable data that improves future decision accuracy, calibrates win probability models, and reveals systematic strengths and blind spots.

Overview

The outcome recording workflow captures three critical data points:

  1. Decision Made: What did you decide? (Go, No-Go, Deferred)
  2. Outcome Achieved: What actually happened? (Win, Loss, No-Bid, Cancelled)
  3. Lessons Learned: Why did it turn out this way? (Debrief insights, competitive factors, decision validation)
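These three data points map naturally onto a structured record stored per opportunity. A minimal sketch in Python (the field names and enums are illustrative, not a prescribed schema):

from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Decision(Enum):
    GO = "go"
    NO_GO = "no_go"
    DEFERRED = "deferred"

class Outcome(Enum):
    WIN = "win"
    LOSS = "loss"
    NO_BID = "no_bid"
    CANCELLED = "cancelled"

@dataclass
class BidOutcomeRecord:
    opportunity: str                  # e.g. "$1.2M Cybersecurity Assessment, DoD"
    decision: Decision                # what we decided
    decision_date: date
    predicted_win_probability: float  # 0.0-1.0, fixed at decision time
    outcome: Outcome | None = None    # filled in once the award is announced
    lessons_learned: list[str] = field(default_factory=list)

Freezing the predicted probability at decision time is what makes the later calibration analysis (see "Model Calibration" below) honest.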

Capturing all three creates a feedback loop that continuously improves your bid decision process.

Success

Organizations that systematically record and analyze outcomes improve win probability prediction accuracy by 30-40% over 12-18 months and increase overall win rates by 15-20% through continuous learning.

Recording Go/No-Go Decisions

Decision Recording Workflow

No-Bid Decision Recording

No-bid decisions are equally valuable for learning:

No-Bid Decision Record:

Opportunity: $1.2M Cybersecurity Assessment, Dept of Defense
Decision: NO-BID
Decision Score: 48 (Qualified No-Go)
Decision Date: 2026-02-10

Rationale:
Primary Reason: Competitive Position (Score: 38/100)
- TechCorp is incumbent with strong performance (CPARS 4.6/5)
- TechCorp has 6 DoD contracts vs. our 1 DoD contract
- TechCorp has executive relationship with customer CIO
- We have no clear differentiator to overcome incumbent advantage

Secondary Reasons:
- Win probability only 28% (below 35% threshold for pursuit)
- Resource constrained (2 active Strong Go pursuits consuming 80% of capacity)
- Better opportunities in pipeline (Federal Health Cloud Migration, Strategic Fit 85)

Decision Maker: VP Business Development (John Smith)
Approval: CEO (Jane Doe) - concurred with no-bid recommendation

Alternative Actions Considered:
1. Pursue solo: Rejected (28% win probability too low, would waste $40K proposal budget)
2. Team with partner: Rejected (no partner available who could offset incumbent advantage)
3. Defer and re-evaluate: Rejected (competitive position unlikely to improve before deadline)

Strategic Implications:
- Focus on building DoD past performance through smaller contracts
- Develop executive relationships at DoD for future large opportunities
- Track for re-compete in 3-5 years when incumbent contract expires

Follow-Up Actions:
- Monitor award announcement to see who won (competitive intelligence)
- Confirm TechCorp won and award value (validate our competitive assessment)
- If non-incumbent won, analyze why (did we misjudge incumbent strength?)

Note

No-bid decisions should be tracked just as rigorously as go decisions. They provide valuable data on decision thresholds and help validate that you're appropriately selective. If you never no-bid, you're probably pursuing too many low-probability opportunities.

Deferred Decision Recording

Sometimes you need more information before deciding:

Deferred Decision Record:

Opportunity: $2M AI/ML Platform, Provincial Health
Decision: DEFERRED
Decision Score: 66 (Qualified Go, borderline)
Decision Date: 2026-03-01

Reason for Deferral:
- Win probability highly uncertain (55% ±25% - wide confidence interval)
- Missing critical information: Is TechCorp (incumbent) rebidding?
- Customer engagement needed to validate our AI/ML approach
- Teaming partner (DataScience Inc.) hasn't committed yet

Information Required:
1. Confirm whether TechCorp is rebidding (BD to call customer program manager)
2. Validate AI/ML approach (Technical lead to meet with customer CTO)
3. Secure teaming commitment (VP to finalize teaming agreement with DataScience)

Decision Timeline:
- Information gathering: March 1-7 (1 week)
- Final go/no-go decision: March 8 (22 days before proposal deadline)
- If GO: Proposal kickoff March 9

Contingency Plan:
- If TechCorp rebids AND DataScience won't commit: NO-BID (win probability drops to ~35%)
- If TechCorp rebids AND DataScience commits: GO (win probability ~58%, acceptable)
- If TechCorp doesn't rebid: GO (win probability ~72%, strong go)

Decision Criteria:
GO if: (TechCorp not rebidding) OR (TechCorp rebidding AND DataScience commits AND customer validates approach)
NO-BID if: TechCorp rebidding AND (DataScience won't commit OR customer doesn't validate approach)

Deferred decisions should have a clear information-gathering plan and a firm decision deadline to avoid "decide to decide later" paralysis.
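When the criteria are this explicit, they can even be captured as an executable rule so the final call is mechanical once the missing information arrives. A hypothetical sketch of the criteria above:

def final_decision(techcorp_rebidding: bool,
                   datascience_commits: bool,
                   customer_validates: bool) -> str:
    """Encodes the GO / NO-BID criteria from the deferred decision record."""
    if not techcorp_rebidding:
        return "GO"      # ~72% win probability: strong go
    if datascience_commits and customer_validates:
        return "GO"      # ~58% win probability: acceptable
    return "NO-BID"      # win probability drops toward ~35%

Writing the rule down before gathering the information also guards against moving the goalposts once the answers come in.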

Recording Bid Outcomes

Win Outcome Recording

Loss Outcome Recording

Losses are even more valuable than wins for learning.

No-Bid Outcome Recording

Track no-bid outcomes to validate decision thresholds:

No-Bid Outcome Record:

Opportunity: $2.5M Cloud Security, Dept of Defense
Decision: NO-BID (Decision Score: 42, Win Probability: 25%)
Decision Date: 2026-01-20
Decision Rationale: TechCorp incumbent with strong performance; missing mandatory security clearances; weak competitive position

Award Information (monitored):
Award Date: 2026-03-01
Winner: TechCorp Solutions (incumbent retained contract)
Award Value: $2.3M

Validation of No-Bid Decision:
✓ TechCorp won (as predicted) - validates competitive position assessment
✓ Award value was $2.3M (vs. estimated $2.5M) - customer estimate was close
✓ No other known competitors won (suggests they also no-bid or were not competitive)

Outcome Analysis:
- Win probability 25% was likely accurate (incumbent won as expected)
- No-bid decision was correct (pursuing would have wasted ~$50K proposal budget with <25% win rate)
- Competitive position assessment (score 35) was accurate (we were weak vs. strong incumbent)

Lessons Learned:
✓ No-bid discipline is working (avoided low-probability pursuit)
✓ Competitive assessment was accurate (TechCorp strength was as evaluated)
✓ Decision threshold (no-bid below 35% win probability) is appropriate

Counterfactual Analysis:
"What if we had pursued despite no-bid decision?"
- Investment: $50K proposal cost
- Win probability: 25%
- Expected value: $2.3M × 25% = $575K
- Expected profit: $2.3M × 15% margin × 25% = $86K
- ROI: ($86K - $50K) / $50K = 72%

Analysis: Even at a 25% win probability, the expected ROI would have been 72% (positive in isolation). However:
- We had 2 active Strong Go pursuits (win probability 70-75%) with ROI > 200%
- Resource constraint prevented pursuing all opportunities
- Pursuing low-probability opportunity would have diverted resources from high-probability pursuits
- Portfolio-level decision to focus resources on Strong Go opportunities was correct

Conclusion: No-bid decision validated; discipline maintained
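The counterfactual arithmetic in this record is simple enough to script so it can be rerun for every no-bid. A sketch using the figures above (margin and proposal cost are the record's estimates):

award_value   = 2_300_000  # actual award value ($2.3M)
win_prob      = 0.25       # assessed win probability at decision time
profit_margin = 0.15       # assumed margin
proposal_cost = 50_000     # estimated cost to propose

expected_revenue = award_value * win_prob                  # $575,000
expected_profit  = award_value * profit_margin * win_prob  # $86,250
roi = (expected_profit - proposal_cost) / proposal_cost    # ~0.72 -> 72%

print(f"Expected profit: ${expected_profit:,.0f}  ROI: {roi:.0%}")

As the analysis notes, a positive single-opportunity ROI is not sufficient; comparing against the ROI of the pursuits that would be displaced is what settles the portfolio-level call.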

Tip

Always monitor award outcomes for no-bid opportunities. If you consistently no-bid opportunities that are won by similar competitors, you may be too conservative. If you no-bid and strong competitors also no-bid (procurement cancelled or weak winner), you correctly identified a poor opportunity.

Debrief Best Practices

Formal Debriefs

Most government customers offer formal debriefs:

Requesting Debrief:

Email Template:

Subject: Debrief Request - Cloud Migration RFP #12345

Dear [Contracting Officer],

Thank you for the opportunity to propose on the Cloud Migration contract (RFP #12345).

We would like to request a formal debrief to better understand the evaluation results and how we can improve future proposals. Specifically, we would appreciate feedback on:

1. Our technical approach (strengths and weaknesses relative to the winner)
2. Our past performance evaluation (what would have strengthened our rating?)
3. Our pricing (was our price competitive? How did it compare to the winner?)
4. Overall proposal strengths and areas for improvement

We are available for a debrief call at your earliest convenience. Please let us know your availability.

Thank you for your time and consideration.

Regards,
[Your Name]
[Company]

Debrief Meeting Structure:

  1. Listen First: Let customer provide prepared feedback without interruption
  2. Ask Clarifying Questions: Understand specific evaluation factors and scores
  3. Compare to Winner: Ask how winner's approach differed from yours
  4. Identify Improvement Areas: Ask what would have made your proposal stronger
  5. Thank Customer: Express appreciation for feedback and time

Questions to Ask in Debrief:

  • "What were the key strengths of the winning proposal?"
  • "How did our technical approach compare to the winner's? What did they propose that we didn't?"
  • "What could we have done to strengthen our past performance evaluation?"
  • "Was our price competitive? If not, what drove the difference?"
  • "Were there any weaknesses or concerns in our proposal that cost us points?"
  • "What would you recommend we do differently next time?"

Informal Debriefs

Customer relationships often enable informal feedback:

Informal Debrief Conversation:

[Call with customer program manager, 1 week after loss]

You: "Thanks for taking my call. I wanted to follow up on the Cloud Migration contract. I know TechCorp won, and I respect the decision. I'm calling to learn - what could we have done differently?"

Customer: "Your proposal was very strong. It came down to past performance and continuity. TechCorp has 5 years of history with us, and we know they can deliver. Your technical approach was actually more innovative, but we're risk-averse given the criticality of this system."

You: "That makes sense. If we want to be competitive for future opportunities with your team, what would you recommend?"

Customer: "Build a track record with us on smaller projects. Show us you can deliver. Once we have confidence in your team, innovation becomes more attractive than continuity."

You: "That's very helpful. Are there smaller upcoming opportunities where we could demonstrate our capabilities?"

Customer: "Actually, we have a $300K data analytics pilot coming up in Q3. That might be a good entry point."

You: "I'd love to discuss that. Thank you for the feedback and the opportunity."

[Outcome: Loss converted to relationship development and future opportunity identification]

Learning from Outcomes

Model Calibration

Use outcomes to improve win probability accuracy:

Quarterly Calibration Analysis:

Q1 2026 Outcomes Review:

Predicted vs. Actual:

High Win Probability (70-80% predicted):
- 5 opportunities
- 4 wins, 1 loss
- Actual win rate: 80% (within expected range)
- Calibration: GOOD

Moderate Win Probability (50-60% predicted):
- 8 opportunities
- 4 wins, 4 losses
- Actual win rate: 50% (within expected range)
- Calibration: EXCELLENT

Low Win Probability (30-40% predicted):
- 3 opportunities (note: we should rarely pursue this range)
- 0 wins, 3 losses
- Actual win rate: 0% (expected ~35%, but small sample)
- Calibration: ACCEPTABLE (small sample size)

Overall Model Accuracy:
- Mean Absolute Error: 12.3% (target < 15%)
- Prediction Accuracy: GOOD

Identified Biases:
- Federal health opportunities: Win rate 75% vs. predicted 62% (UNDERESTIMATING healthcare domain advantage)
- DoD opportunities: Win rate 33% vs. predicted 48% (OVERESTIMATING our DoD competitiveness)

Model Adjustments:
1. Increase competitive position score for federal health opportunities where we have healthcare expertise (+8 points)
2. Decrease win probability for DoD opportunities against incumbents (-10% instead of -8%)
3. Add "executive relationship" factor to competitive position (currently missing)

Expected Impact: Improve prediction accuracy by 3-5 percentage points
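Once outcomes are recorded consistently, this bucketed predicted-vs-actual comparison is easy to automate. A minimal sketch (the outcome data is illustrative, and the per-bucket absolute error shown here is one plausible way to compute the report's Mean Absolute Error):

from statistics import mean

# (predicted win probability, won) pairs -- illustrative data only
outcomes = [
    (0.75, True), (0.72, True), (0.78, False), (0.70, True), (0.74, True),
    (0.55, True), (0.52, False), (0.58, True), (0.50, False),
    (0.35, False), (0.32, False), (0.38, False),
]

BUCKETS = [(0.30, 0.40), (0.50, 0.60), (0.70, 0.80)]

def calibration_report(outcomes, buckets=BUCKETS):
    """Compare predicted vs. actual win rate within each probability bucket."""
    for lo, hi in buckets:
        bucket = [(p, won) for p, won in outcomes if lo <= p <= hi]
        if not bucket:
            continue
        predicted = mean(p for p, _ in bucket)
        actual = mean(1.0 if won else 0.0 for _, won in bucket)
        yield (lo, hi), len(bucket), predicted, actual, abs(predicted - actual)

for (lo, hi), n, predicted, actual, err in calibration_report(outcomes):
    print(f"{lo:.0%}-{hi:.0%}: n={n} predicted={predicted:.0%} "
          f"actual={actual:.0%} |error|={err:.0%}")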

Pattern Recognition

Identify systematic win/loss patterns:

Win/Loss Pattern Analysis:

Win Patterns (12-month analysis):

HIGH WIN RATE SCENARIOS (70%+ win rate):
✓ Federal health opportunities + healthcare domain expertise
✓ Small business set-aside + we qualify
✓ Existing customer + strong past performance
✓ Technical evaluation-heavy (>60%) + our capabilities exceed requirements

MODERATE WIN RATE SCENARIOS (40-60% win rate):
• New customer + strong capability match + competitive pricing
• Existing customer + moderate past performance + innovation opportunity
• Federal non-health + strong technical approach

LOW WIN RATE SCENARIOS (<30% win rate):
✗ DoD + incumbent competitor + we lack DoD past performance depth
✗ Price-dominant evaluation (>50%) + commodity services
✗ Competitor has executive relationship + our relationship is weak
✗ 7+ bidders + no clear differentiator

Strategic Implications:
1. PURSUE AGGRESSIVELY: Federal health, small business set-asides, existing customers with strong performance
2. PURSUE SELECTIVELY: New customers, federal non-health (evaluate case-by-case)
3. AVOID: DoD against incumbents (unless we have 5+ DoD contracts), price-dominant commodity work

Resource Allocation:
- 50% of resources on high win rate scenarios (expected 70% win rate × 50% = 35% overall)
- 40% of resources on moderate win rate scenarios (expected 50% win rate × 40% = 20% overall)
- 10% of resources on strategic investments (low win rate but high strategic value)
- Target: ~55% overall win rate (35% + 20%; the strategic tranche adds little to the blended rate)
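The blended target is just a weighted average of allocation share × expected win rate per tier. A quick sketch of the arithmetic (the strategic tier's win rate is an assumption, since the 55% target above excludes it):

tiers = {
    "high":      {"share": 0.50, "win_rate": 0.70},  # 35% contribution
    "moderate":  {"share": 0.40, "win_rate": 0.50},  # 20% contribution
    "strategic": {"share": 0.10, "win_rate": 0.15},  # assumed low win rate
}

core = sum(t["share"] * t["win_rate"]
           for name, t in tiers.items() if name != "strategic")    # 0.55
blended = sum(t["share"] * t["win_rate"] for t in tiers.values())  # 0.565

print(f"Core target: {core:.0%}  Blended incl. strategic: {blended:.1%}")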

Continuous Improvement

Outcome Review Cadence:

Frequency           | Activity                                                                           | Participants
After Every Outcome | Record outcome, capture debrief insights                                           | BD lead, proposal manager
Monthly             | Review all outcomes from past month, update competitor profiles                    | BD team
Quarterly           | Calibrate win probability model, analyze patterns, adjust decision thresholds      | BD leadership, finance, executive
Annually            | Strategic review of decision framework, priority adjustments, major model updates  | Executive team, board

Best Practices

Record Outcomes Promptly

Warning

Record outcomes within 1 week of customer announcement. Memory fades quickly, and debrief insights are most valuable when captured fresh. Delaying outcome recording reduces learning value by 50%+.

Outcome Recording Checklist:

  • Award details (date, value, winner)
  • Debrief scheduled (within 2 weeks of announcement)
  • Initial lessons learned captured
  • Competitor intelligence updated
  • Win probability prediction accuracy analyzed
  • Strategic value realization tracked (for wins)

Embrace Losses as Learning Opportunities

Winning teaches you what works. Losing teaches you what doesn't work. Both are valuable.

Loss Analysis Framework:

  1. What did we do well? (preserve these practices)
  2. What could we improve? (specific, actionable improvements)
  3. What was outside our control? (competitive factors, customer preferences)
  4. Would we make the same decision again? (validate go/no-go decision logic)

Example:

Loss: $1.2M DoD Cybersecurity

What We Did Well:
✓ Strong technical approach (scored 88/100)
✓ Competitive pricing (2nd lowest)
✓ High-quality proposal (customer said "very competitive")

What We Could Improve:
• Build deeper DoD past performance (1 contract wasn't enough)
• Develop executive relationships at DoD
• Better assess incumbent strength before deciding to pursue

Outside Our Control:
• TechCorp's 5-year incumbent advantage
• Customer's risk aversion (preferred continuity)
• TechCorp CEO's relationship with customer CIO

Decision Validation:
- Win probability was 52% (loss was within expected range)
- Decision to pursue was marginal (Decision Score 68, barely Qualified Go)
- In hindsight: Should have no-bid and focused resources on stronger opportunities

Lesson: Be more selective about challenging strong incumbents; build DoD past performance first

Share Learnings Across Organization

Debrief Sharing Process:

  1. Capture: Proposal manager documents debrief insights
  2. Analyze: BD team analyzes patterns and lessons
  3. Share: Monthly "Lessons Learned" meeting with all BD, technical, and executive staff
  4. Apply: Update proposal templates, win strategies, and decision models based on learnings

Lessons Learned Meeting Agenda:

Monthly Lessons Learned Meeting - April 2026

Outcomes This Month:
- 2 wins, 3 losses, 1 no-bid

Win #1: $850K Federal Health Cloud Migration
- Key win factor: Healthcare domain expertise
- Lesson: Healthcare compliance framework is strong differentiator
- Action: Make healthcare compliance framework standard for all health opportunities

Win #2: $600K Provincial Education DevOps
- Key win factor: Competitive pricing (15% below nearest competitor)
- Lesson: Education sector is price-sensitive
- Action: Adjust pricing strategy for education sector (target 10-15% below market)

Loss #1: $1.2M DoD Cybersecurity
- Key loss factor: Incumbent advantage + past performance gap
- Lesson: 1 DoD contract insufficient for large DoD opportunities
- Action: Target smaller DoD opportunities to build past performance before pursuing large ones

... [continue for all outcomes]

Model Updates:
- Increase healthcare domain advantage in competitive position scoring
- Decrease win probability for DoD opportunities against incumbents
- Add pricing competitiveness factor for education sector

Decision Threshold Updates:
- No change this month (model is well-calibrated)
