The past couple of months have seen numerous updates from the Department of the Army on the MAPS IDIQ. The latest draft, Draft #5 (released March 10, 2026), introduces significant changes from the earlier drafts, and the Final RFP is now anticipated in the near term. A Virtual Listening Session was held on March 12th to walk industry through the updates, and the Government has requested survey input to shape the Final RFP.
This blog, however, isn’t a draft-by-draft changelog. For the earlier changes, you can go through the blog we published on MAPS covering the MAPS Contract Guide and Draft 3.
Many U.S. government contractors, especially small businesses, treat eligibility as the only gate to a MAPS award and overlook how the Army MAPS Scorecard translates readiness into an actual advantage at award time. With upcoming solicitation cycles tightening competition and funding realities pressing firms to perform, now is the time to treat MAPS readiness as a measurable, strategic asset rather than paperwork.
In this blog, we will break down the MAPS Scorecard and explain how it works, what it measures, and how contractors can assess and plan their readiness before submission.
What Is the Army MAPS Scorecard and Why Does It Matter?
The MAPS program centers on a structured evaluation of a firm’s capabilities, past performance, and organizational readiness, distilling them into a measurable profile that, alongside the mandatory gate criteria, translates into award probability. While the Gate confirms eligibility to compete, the Scorecard gauges actual competitive posture across capabilities and execution risk.
Small businesses often misunderstand scoring, assuming that a strong past performance record or robust processes automatically translate into a high MAPS score. In reality, evaluators weigh not only the existence of documentation but also the strength, relevance, and recency of the evidence provided. A favorable Scorecard can meaningfully tilt the odds of a MAPS award by signaling maturity, risk control, and scalable operations: factors that affect bid/no-bid decisions, pricing strategy, and capture approach.
The strategic implications of the scorecard should inform bid selection and resourcing decisions, rather than waiting for the final solicitation to be released. A firm with a deliberate MAPS posture can pursue opportunities with a clearer line of sight to competitive advantages, while a firm with ad hoc readiness risks undermining its chances of winning.
While the MAPS Gate Criteria aren’t the focus of this post, the table below clarifies how the Gate Criteria and the Scorecard differ:
| Feature | Gate Criteria (Phase I) | MAPS Scorecard (Phases II–IV) |
| --- | --- | --- |
| Nature | Pass/fail binary | Quantitative point system |
| Purpose | Establishes baseline eligibility | Determines competitive ranking |
| Flexibility | Non-negotiable requirements | Strategic point optimization |
| Outcome | Entry into the competition | Award selection and seat placement |
How Is the Army MAPS Scorecard Structured?
The Scorecard organizes readiness evidence into four core pillars. Each pillar focuses on distinct signals evaluators use to assess risk, capability, and execution potential. Understanding what evaluators examine in each area helps small firms prioritize where to invest time and money.
1. Systems and Rate Structure
Evaluators seek proof of mature, repeatable delivery processes that demonstrate consistent execution. Systems maturity signals an ability to manage rates, schedules, and delivery risks at scale. While certain systems are mandatory gates for large businesses, small businesses and emerging large businesses can earn substantial credits for having government-approved contractor business systems, with point values that vary meaningfully by system type.
For firms with informal processes, the absence of these approved systems creates a “point gap” that is difficult to close during the proposal window. Points are awarded for official approval letters or audit reports from the Defense Contract Audit Agency (DCAA), Defense Contract Management Agency (DCMA), or a Cognizant Federal Agency (CFA). Under Draft 5, the key systems and their point values for Small Businesses and Emerging Large Businesses are:
- Accounting System (3,500 points): Proves financial maturity and ability to handle cost-plus work. This is the highest-value system available to small businesses and should be a top investment priority.
- Purchasing System (2,500 points): Signals institutional procurement controls and supply chain discipline.
- Property Management System (2,000 points): Demonstrates accountability for government-furnished property across contract performance.
The maximum available under this pillar for small businesses is 8,000 points. For large businesses, the scoring structure differs: only the Property Management System earns additional scorecard points (2,500 points), since the Accounting and Purchasing systems are mandatory gate criteria for them rather than scored differentiators.
2. Certifications and Compliance
Beyond the mandatory ISO 9001 and CMMC Level 2 requirements in the Gate, the Scorecard rewards firms with advanced certifications. These certifications carry significant weight because they demonstrate verified risk reduction and data integrity controls.
For Small Businesses and Emerging Large Businesses, the available certification credits under Draft 5 are as follows, up to a maximum of 5,000 points:
- Top Secret Facility Clearance: 1,000 points
- CMMC Level 2 C3PAO (scheduled, without approved conditional in place): 1,000 points
- CMMC Conditional Level 2 C3PAO (active and approved): 2,000 points
- CMMC Final Level 2 C3PAO or higher (active and approved): 3,000 points
- ISO/IEC 27001:2022 (active and approved): 1,000 points
The tiered CMMC structure rewards firms that have completed a third-party assessment over those who are merely scheduled for one. Achieving a Final Level 2 C3PAO certification carries the highest single certification value and meaningfully separates competitors in tight scoring tiers.
Strategic timing is critical here. Small firms should not wait for the final RFP to pursue these credentials. Early certifications allow the required “evidence of use” to mature, ensuring the firm can provide the necessary artifacts for Phase II verification review.
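To make the stacking arithmetic concrete, here is a minimal sketch of how these credits might add up for a small business. The point values come from the list above; the assumptions that only the highest CMMC tier counts and that all other credits stack up to the 5,000-point maximum are ours, not the solicitation’s:

```python
# Hypothetical sketch of Draft 5 certification credits for small businesses.
# Point values come from the list above; treating the CMMC tiers as mutually
# exclusive (only the highest held tier counts) is an assumption.
CERT_POINTS = {
    "TS Facility Clearance": 1000,
    "CMMC L2 scheduled": 1000,
    "CMMC L2 conditional": 2000,
    "CMMC L2 final": 3000,
    "ISO 27001:2022": 1000,
}

def cert_credit(held, cap=5000):
    """Sum certification credits, counting only the best CMMC tier."""
    cmmc = [c for c in held if c.startswith("CMMC")]
    best_cmmc = max((CERT_POINTS[c] for c in cmmc), default=0)
    other = sum(CERT_POINTS[c] for c in held if not c.startswith("CMMC"))
    return min(best_cmmc + other, cap)

print(cert_credit(["CMMC L2 final", "ISO 27001:2022"]))  # 4000
```

Under these assumptions, a firm holding a Final Level 2 C3PAO and ISO/IEC 27001:2022 reaches 4,000 of the available 5,000 points, leaving the Top Secret Facility Clearance as the remaining gap.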
3. Past Performance Portfolio
Qualifying Projects (QPs) are the primary scoring lever on the MAPS vehicle. Small businesses may submit up to three QPs per domain, provided they meet the minimum $2.5 million value threshold and have at least one year of performance history within the last four years.
A critical new distinction in Draft 5 is the split between two QP types, which determines which scoring criteria apply:
- LOE (Level of Effort) QPs: contract types such as T&M, Labor-Hour, and most cost-reimbursement types; evaluated on Vacancy Rate and Time to Fill Rate.
- Outcome-Based QPs: fixed-price types, CPFF Completion, grants, and agreements; evaluated on Schedule and Completeness instead.
Offerors must correctly identify each QP type on the Qualifying Project Form, as the applicable scorecard criteria differ entirely.
The scoring for QPs is graduated and objective, rewarding specific performance attributes across six factors:
Performance Quality: The Government evaluates the latest finalized CPARS or PPQ across all elements (Quality, Schedule, Cost Control, Management, and Small Business Subcontracting). Point awards per QP:
- Exceptional in all elements: 10,000 points
- Very Good (or combination of Very Good and higher) in all elements: 7,500 points
- Satisfactory (or combination of Satisfactory and higher) in all elements: 4,500 points
- Neutral in all elements: 1,000 points
- Marginal or below: 0 points
- Maximum available: 30,000 points (across three QPs)
Relevance: Evaluated as a percentage of technical capability alignment to the proposed domain’s PWS requirements:
- 100% match: 7,000 points per QP
- 75–99.99%: 5,000 points
- 50–74.99%: 2,500 points
- 25–49.99%: 1,000 points
- 0–24.99%: 0 points
- Maximum available: 21,000 points
Dollar Value: Scored on a tiered scale based on total contract value. For Small Businesses:
- Greater than $10M: 4,000 points per QP
- $5M–$10M: 2,000 points
- $2.5M–$5M: 1,000 points
- Maximum available: 12,000 points
Passthrough Rate (New in Draft 5): The Government now evaluates how much of the labor was performed by subcontractors versus the prime. This is calculated as Total Subcontractor Labor Dollars divided by Total Labor Dollars. Lower subcontractor dependency earns more points:
- 0–30% subcontracted: 6,000 points per QP
- 30–50% subcontracted: 2,500 points
- 50–65% subcontracted: 1,000 points
- Greater than 65% subcontracted: 0 points
- Maximum available: 18,000 points
This new factor is one of the highest-value single scoring categories on the entire vehicle and strongly rewards firms that self-perform the majority of their labor. Firms that relied heavily on teaming arrangements or subcontracting in prior contracts should evaluate how this affects their QP selections.
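The arithmetic is simple enough to self-check before selecting QPs. A hedged sketch using the tier values listed above (how Draft 5 treats exact boundary values such as 30% is an assumption here):

```python
def passthrough_points(sub_labor_dollars, total_labor_dollars):
    """Estimate Draft 5 Passthrough Rate points for a single QP.

    Passthrough Rate = Total Subcontractor Labor Dollars / Total Labor Dollars.
    Tier values follow the published list; boundary handling is assumed.
    """
    rate = 100.0 * sub_labor_dollars / total_labor_dollars
    if rate <= 30:
        return 6000
    if rate <= 50:
        return 2500
    if rate <= 65:
        return 1000
    return 0

# Example: $1.2M of $5M in labor subcontracted is a 24% passthrough rate.
print(passthrough_points(1_200_000, 5_000_000))  # 6000
```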
Recency: Ongoing work or work with a period of performance end date within two years of the final solicitation release earns 1,000 points per QP. Maximum available: 3,000 points.
NAICS Alignment: Direct alignment with the domain’s primary NAICS code earns 1,000 points per QP. Maximum available: 3,000 points.
Small businesses must curate their portfolio strategically. A project with a glowing narrative but low dollar value, or a “Satisfactory” CPARS, may be a weaker choice than a larger project with “Exceptional” ratings, even if the latter is less “mission-exciting.” And under Draft 5’s Passthrough Rate factor, a project where the prime self-performed the bulk of the labor carries meaningful additional point value.
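The tradeoff described above can be checked numerically. Here is a rough sketch comparing two candidate QPs on just the Performance Quality and Dollar Value factors (tier values from the lists above; scoring a QP by its lowest CPARS element rating is our simplification):

```python
# Points when all CPARS/PPQ elements are at or above the given rating.
QUALITY_POINTS = {"Exceptional": 10_000, "Very Good": 7_500,
                  "Satisfactory": 4_500, "Neutral": 1_000, "Marginal": 0}

def dollar_value_points(total_value):
    """Small-business dollar-value tiers; boundary handling is assumed."""
    if total_value > 10_000_000:
        return 4_000
    if total_value >= 5_000_000:
        return 2_000
    if total_value >= 2_500_000:
        return 1_000
    return 0  # below the $2.5M QP threshold

def qp_score(lowest_rating, total_value):
    return QUALITY_POINTS[lowest_rating] + dollar_value_points(total_value)

# "Mission-exciting" $3M project with Satisfactory ratings vs.
# a $12M project rated Exceptional across the board:
print(qp_score("Satisfactory", 3_000_000))   # 5500
print(qp_score("Exceptional", 12_000_000))   # 14000
```

Even before the Relevance, Passthrough Rate, Recency, and NAICS factors are counted, the larger, higher-rated project carries more than double the points in this sketch.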
4. Management & Organizational Capability
This pillar evaluates workforce stability and risk mitigation through quantitative metrics. The Army has moved to objective data on staffing responsiveness and retention, and under Draft 5, these criteria apply differently depending on QP type.
Vacancy Rate (LOE QPs only): The Scorecard evaluates the vacancy rate for the last full year of performance on each LOE QP, calculated as:
Vacancy Rate = (Vacant Positions at Year End / Total Proposed Positions) × 100
Point awards per LOE QP:
- 0% vacancy: 5,000 points
- 1–4.99%: 3,500 points
- 5–8.99%: 2,750 points
- 9–11.99%: 2,000 points
- 12–14.99%: 1,000 points
- 15% or higher: 0 points
- Maximum available: 15,000 points (across three LOE QPs)
Time to Fill (LOE QPs only): The Scorecard also incorporates average Time to Fill for any job vacancy on each LOE QP during the last full year of performance:
Time to Fill = Date of Offer Acceptance − Date Vacancy Opened (mean average for multiple vacancies)
Point awards per LOE QP:
- 30 calendar days or less: 5,000 points
- 31–45 days: 3,500 points
- 46–50 days: 2,750 points
- 51–60 days: 2,000 points
- 61–74 days: 1,000 points
- 75 days or more: 0 points
- Maximum available: 15,000 points (across three LOE QPs)
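Both staffing metrics can be pre-computed from HR records well before submission. A minimal sketch under the tier values above (again, exact boundary handling in the final rubric is an assumption):

```python
from datetime import date

def vacancy_rate_points(vacant_at_year_end, total_proposed):
    """Vacancy Rate = (vacant positions at year end / total proposed) x 100."""
    rate = 100.0 * vacant_at_year_end / total_proposed
    if rate == 0:
        return 5000
    if rate < 5:
        return 3500
    if rate < 9:
        return 2750
    if rate < 12:
        return 2000
    if rate < 15:
        return 1000
    return 0

def time_to_fill_points(vacancies):
    """Mean calendar days from vacancy opening to offer acceptance.

    `vacancies` is a list of (date_opened, date_accepted) pairs.
    """
    days = [(accepted - opened).days for opened, accepted in vacancies]
    avg = sum(days) / len(days)
    if avg <= 30:
        return 5000
    if avg <= 45:
        return 3500
    if avg <= 50:
        return 2750
    if avg <= 60:
        return 2000
    if avg < 75:
        return 1000
    return 0

# 2 vacant seats out of 40 proposed at year end -> 5% vacancy rate
print(vacancy_rate_points(2, 40))  # 2750
# One vacancy filled in 40 days
print(time_to_fill_points([(date(2025, 1, 1), date(2025, 2, 10))]))  # 3500
```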
Schedule and Completeness (Outcome-Based QPs only): For QPs classified as Outcome-Based, the Vacancy Rate and Time to Fill criteria do not apply. Instead, the Government evaluates:
- Schedule: Whether contractual milestones and delivery outcomes were met on time (up to 5,000 points per QP; max 15,000 points).
- Completeness: Whether 100% of requirements were fulfilled, and whether measurable customer-recognized efficiencies (cost savings, reduced oversight burden, improved reliability, etc.) were delivered, with higher efficiency counts earning more points (up to 5,000 points per QP; max 15,000 points).
Firms should assess their QP portfolio with both the LOE/Outcome-Based split and these criteria in mind, as the applicable scoring categories differ substantially between contract types.
How Should Small Businesses Plan Their MAPS Score?
Strategic planning for the MAPS Scorecard must begin well before the submission deadline. Small businesses should approach this as a “point engineering” exercise rather than a creative writing task.
1. Quantitative Gap Assessment
Contractors should conduct an immediate audit of their top three potential QPs per domain. This involves stress-testing each claim against the current Draft 5 rubric to identify where the evidence is weak. If a firm discovers that its highest-value project has a 20% vacancy rate, it must decide how to mitigate the impact. Equally important: identify whether each QP is LOE or Outcome-Based, since this determines which scoring criteria apply, and an incorrect classification will result in missing or misapplied points.
2. Evidence Capture and Alignment
Firms should build an evidence matrix that links each scorecard criterion to a specific, verifiable artifact, such as a CPARS report, a DCMA approval letter, or an ISO certificate. This clarity reduces last-minute scrambling and ensures the self-score withstands Phase II verification, where the Government only adjusts scores downward: a mismatch reduces your score, and a valid point you failed to claim will not be added for you. For insights on aligning capture with the broader vehicle goals, contractors can reference our MAPS Contract Guide on our website.
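In practice, the evidence matrix can be as simple as a spreadsheet or a small script that flags unsubstantiated claims. A hypothetical sketch (criterion and artifact names are illustrative, not from the solicitation):

```python
# Each scored criterion maps to the artifact that substantiates it,
# or None if the proof does not yet exist (a scoring gap).
evidence_matrix = {
    "Accounting System (3,500 pts)": "DCAA approval letter, Jul 2024",
    "Purchasing System (2,500 pts)": None,    # gap: audit not scheduled
    "CMMC Final Level 2 (3,000 pts)": None,   # gap: assessment pending
    "QP1 Performance Quality": "finalized CPARS, Mar 2025",
    "QP1 Passthrough Rate": "labor-dollar breakout from accounting system",
}

gaps = [criterion for criterion, artifact in evidence_matrix.items()
        if artifact is None]
print(gaps)  # criteria still lacking a verifiable artifact
```

Any criterion left in `gaps` at submission time is a claim the Phase II verification review can only adjust downward.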
3. Resource Allocation Tradeoffs
Contractors must decide whether to submit immediately or delay to strengthen their posture. For instance, waiting for a DCAA accounting system audit (worth 3,500 points for small businesses) or a CMMC Final Level 2 C3PAO assessment (worth 3,000 points) could meaningfully improve a score, justifying a delay in entry for a more certain win in a future “on-ramp” or later draft cycle. Similarly, firms should assess whether their most relevant projects are LOE or Outcome-Based and whether the corresponding staffing data, schedule adherence records, or efficiency documentation is ready to substantiate those criteria.
What Common Mistakes Do Contractors Make When Interpreting the Scorecard?
Recent industry feedback highlights several recurring pitfalls that can be fatal to a MAPS bid:
Overstated Self-Scores: Claiming points for “scheduled” audits or “planned” certifications that are not finalized at the time of submission will result in immediate downward adjustments during verification. The one exception is the CMMC scheduled assessment credit (1,000 points for SBs), which specifically requires documented proof of a scheduled C3PAO assessment.
Over-reliance on Commercial Past Performance: While commercial work is accepted, failing to tailor the narrative to federal relevance and recency requirements can lead to lower confidence ratings.
Ignoring the LOE vs. Outcome-Based Distinction: A significant number of firms will misclassify their QPs, either completing Vacancy Rate and Time to Fill data for an Outcome-Based QP, or Schedule and Completeness data for an LOE QP. Draft 5 is explicit: offerors shall complete only the applicable criteria for each QP type. Misclassification will cost points.
Underestimating the Passthrough Rate Factor: The Passthrough Rate is now one of the highest-value single scoring categories (up to 18,000 points), yet many firms have not yet factored it into their QP selection. A project where the prime self-performed the majority of the labor earns up to 6,000 points per QP; a heavily subcontracted project could earn zero. This single criterion can make or break a competitive score.
Ignoring Rate Realism: Small businesses often bid aggressively low to win a seat. However, if a price is so low that it suggests the contractor doesn’t understand the scope or cannot maintain labor quality, evaluators may flag it as a performance risk. Note that Draft 5 sets the fair and reasonable price band for the Postaward Conference FFP at no less than $50 and no more than $100.
Misinterpreting Tie-Breaking Logic: Many firms ignore the tie-breaker mechanisms, not realizing that ties for final small-business seats are resolved first by the percentage of CPARS/PPQ element ratings over the last three years rated as Exceptional, then by Very Good, not by narrative strength.
Also Read: MAPS Contract Guide: Key Insights for IT and Professional Services Businesses
Frequently Asked Questions About the Army MAPS Scorecard
1. Is passing the Gate enough to win a MAPS award?
No. Clearing the Gate only confirms eligibility. The Scorecard determines the competitive ranking of all eligible offerors, and only the top 25 small businesses per domain receive an award under Draft 5’s planned award structure (70 total per domain: 30 large businesses with 15 reserved for emerging large businesses, 25 small businesses, and 15 commercial-sector vendors).
2. How important is federal past performance?
It is critical. While non-federal work is accepted, it must be presented in a way that directly links outcomes to MAPS categories, including risk mitigation and scalability. Federal projects with active CPARS are the gold standard for scoring high in the quality and relevance categories. Additionally, Draft 5’s Passthrough Rate factor further advantages firms with a strong record of self-performing federal contract labor.
3. Can small businesses compete without advanced certifications?
Yes, but certifications serve as powerful point multipliers. They signal formalized processes that lower the Army’s perceived risk, which is often the deciding factor in tight scoring tiers. Under Draft 5, the maximum certification credit available to small businesses is 5,000 points, with ISO/IEC 27001:2022 and a CMMC Final Level 2 C3PAO together accounting for 4,000 of those points.
4. What changed most significantly in Draft 5 from a scoring perspective?
Draft 5 introduced the Passthrough Rate as an entirely new scoring factor worth up to 18,000 points, restructured performance quality points upward (Exceptional now earns 10,000 per QP, up from 7,000), raised relevance scoring to a maximum of 7,000 points per QP, and introduced the LOE vs. Outcome-Based QP distinction that determines which organizational capability criteria apply to each project. Firms that have not reviewed their scoring posture against Draft 5 may be significantly underestimating both their potential score and their competitive exposure.
The Army MAPS Scorecard is a strategic differentiator that transforms corporate readiness into a measurable competitive advantage. Viewing MAPS readiness as a proactive, evidence-based planning exercise enables small businesses to make smarter bid decisions and allocate resources where they will have the greatest impact on award probability.
If you’re evaluating MAPS readiness for your organization, our team can help you assess options and build a pragmatic roadmap. Explore how our MAPS readiness advisory can support outcomes such as clearer capture strategies and stronger proposal narratives, or get in touch to discuss your scenario. For additional context and practical guidance, don’t miss the resources from last year’s MAPS webinar, and register for the upcoming MAPS Masterclass.





