Dismas Charities, Inc. v. U.S.

United States Court of Federal Claims
Jun 10, 2004
No. 04-304 C (Fed. Cl. Jun. 10, 2004)

Opinion

No. 04-304 C.

Filed: June 10, 2004 Reissued: July 14, 2004.

This opinion was originally filed under seal on June 10, 2004, pursuant to this Court's March 9, 2004 protective order. The parties were given an opportunity to advise the Court of their views with respect to any "protected information" referred to in the opinion that they asserted was required to be redacted under the terms of the protective order. The parties jointly requested certain redactions. The Court agreed with some of the parties' initial proposed redactions, but not others. At the Court's request, the parties submitted revised proposed redactions. The Court agreed with the revised proposed redactions and redacted the materials requested by the parties. The Court's redactions are indicated by asterisks in brackets ([* * *]). The Court has also, in this reissued opinion, corrected errata.

Daniel S. Herzfeld, Shaw Pittman LLP, McLean, Virginia, for plaintiff. Alex D. Tomaszczuk, Shaw Pittman LLP, McLean, Virginia, of counsel.

John H. Williamson, Trial Attorney, Mark A. Melnick, Assistant Director, David M. Cohen, Director, Peter D. Keisler, Assistant Attorney General, United States Department of Justice, Washington, D.C., for defendant. Todd Bailey, Assistant General Counsel, Federal Bureau of Prisons, of counsel.

Joseph A. Camardo, Jr., Law Firm of Joseph A. Camardo, Jr., Auburn, N.Y., for defendant-intervenor. Kevin M. Cox, Law Firm of Joseph A. Camardo, Jr., Auburn, N.Y., of counsel.



Bid Protest; best value; lowest price technically acceptable; arbitrary and capricious; judgment on the administrative record; RCFC 56.1; source selection decision; source selection evaluation panel.

OPINION AND ORDER


Plaintiff, Dismas Charities, Inc. ("Dismas"), filed this bid protest action on March 5, 2004, alleging that the Federal Bureau of Prisons ("BOP") improperly awarded Solicitation No. 200-0669-SE to Bannum, Inc. ("Bannum"). Plaintiff also filed, on March 5, a motion for preliminary injunction. After a discussion among the Court and the parties, an expedited briefing and argument schedule was established. As a result, plaintiff agreed to forgo seeking a preliminary injunction and agreed to proceed to an adjudication on the merits. Accordingly, plaintiff's motion for preliminary injunction, filed March 5, 2004, was treated as a motion for judgment on the administrative record and for permanent injunction. Bannum's motion to intervene was granted on March 16, 2004. Bannum filed a response in opposition to Dismas's motion for judgment on the administrative record and permanent injunction on April 2, 2004. Defendant, United States ("the Government"), filed an opposition to plaintiff's motion for permanent injunction and a cross-motion for judgment on the administrative record on April 8, 2004. The cross-motions were fully briefed as of April 28, 2004, and the Court heard oral argument on May 7, 2004. For the following reasons, plaintiff's motion for judgment on the administrative record is DENIED, and defendant's motion for judgment on the administrative record is GRANTED. The award to Bannum is upheld.

I. Background

A. The Solicitation

BOP issued Solicitation No. 200-0669-SE ("the solicitation" or "RFP") in May 2001 to procure community corrections center services, commonly referred to as "halfway houses." The solicitation stated that "[t]he Government contemplates award of a firm-fixed, unit-price, Indefinite-Delivery, Requirements type contract with a two-year base period and three one-year options resulting from this solicitation." Administrative Record ("AR") 197; RFP § L.4. Section L of the solicitation also provided that the contract would be awarded on a best-value basis: "The Government intends to award a contract or contracts resulting from this solicitation to the responsible offeror(s) whose proposal(s) represents the best value after evaluation in accordance with the factors and subfactors in the solicitation." AR 9, 195; RFP §§ B.1(a), L.2.

1. Evaluation Factors in the RFP

Section M of the RFP identified the evaluation factors: "The four Evaluation Factors/Criteria are Past Performance, Technical, Management, and Price, with Past Performance being the most important with the remaining three factors having equal weight." AR 207; RFP § M.5. The Technical factor included three subfactors: (1) Reports/Policy/Procedures; (2) Facility; and (3) Overall Programs Approach. Id.

The RFP's Section "M.3 Technical Evaluation Panel" established the process that BOP would use to apply the evaluation criteria:

The evaluation criteria at M.5 will be utilized by a Source Selection Evaluation Board (SSEB) in analyzing each Technical Proposal submitted in response to this solicitation. The SSEB will score each response on each element. Offerors' scores will be computed to arrive at a total score. The total score shall determine the proposals that are included within the competitive range.

AR 206; RFP § M.3. The Source Selection Plan stated that proposals would be evaluated using a 1000 point scale. AR 4025. Offerors could receive up to 325 points for Past Performance, and could receive up to 225 points for each of the three remaining factors (Cost, Management, and Technical). See, e.g., AR 4073.

The RFP also described the manner in which cost would be taken into account:

Should evaluations result in substantially "technically equal" scores, cost will be a major factor in the selection for contract award. Should evaluations result in acceptable proposals with significant differences in technical scores, cost will be regarded, but not be predominant in the determination of the proposal offering most benefit and greatest value to the Government.

AR 206-07; RFP § M.3.

The scoring of the Past Performance factor was to be "highly influential" in the selection of the awardee. AR 207; RFP § M.5 — Factor I(c). The solicitation stated that BOP, in evaluating Past Performance, would consider an offeror's performance record, performance deficiencies, quality of work, timely performance, effectiveness of management, facility maintenance and repairs, labor standards compliance, and personnel management practices. AR 208; RFP § M.5 — Factor I(c). The evaluation of the Technical factor was to be based upon each proposed physical plant in regard to suitability, age, condition, location, compliance with safety standards, documentation and procedures, and descriptions of the offeror's operational procedures in performing the statement of work's requirements. AR 208; RFP § M.5 — Factor II. The evaluation of the Management factor was to be based upon management capability, previous successful performance of similar contracts, the qualifications and experience of offerors, and employment practices and policies. AR 208; RFP § M.5 — Factor III. The evaluation of the Cost factor was to be based upon the proposed rate per "inmate day." AR 208; RFP § M.5 — Factor IV.

2. The Source Selection Decision

BOP formed a Source Selection Evaluation Panel ("SSEP") to evaluate the offerors' proposals. AR 639. The panel had four members: Mary Martin, Community Corrections Specialist, who was the panel chair, AR 3966, 3967; Susie Mance, Contract Oversight Specialist ("COS"), AR 3973; VanDella Menifee, Community Corrections Manager ("CCM"), AR 639, 663; and Sheila Thompson, Contracting Officer ("CO"), AR 663. Each panel member evaluated each proposal using an "Evaluation Checklist" that posed 164 questions for the evaluators to address. See, e.g., AR 730-57. The Evaluation Checklist contained instructions that set forth the four-step process for the SSEP to evaluate the offerors' proposals. AR 731-32, 756-57.

Step 1 stated: "You must use the proposal and answer each Evaluation Checklist question. Circle the appropriate raw point (0 thru 5) which best indicates the offeror's ability, as conveyed by the proposal, to successfully accomplish the specification." AR 731. Step 2 directed the panel members to total the raw points, and Step 3 directed them to report the raw points to the chairperson with an indication as to whether discussions with the offeror were required to resolve any deficiencies. AR 756. Step 4 stated that "[t]he SSEP Chairperson will return all discussion responses to you, i.e., deficiencies, clarifications or excesses. You must evaluate the responses and determine if an adjustment to the raw points is warranted. If so, adjust the raw points on the Evaluation Checklist and re-total." AR 756. The raw points were then converted into final points using a mathematical formula.

Four offerors submitted bids: Bannum, Dismas, Correctional Services Corporation ("CSC"), and RanHall Correctional ("RanHall"). Each member of the panel completed an Evaluation Checklist for each offeror. After the panel members evaluated the proposals, they convened on August 23, 2001. AR 639. The panel concluded that RanHall was not in the competitive range. AR 639. By memorandum dated October 24, 2001, panel chair Martin requested that CO Thompson conduct discussions with the three remaining offerors to address areas of concern identified by the panel. AR 639-47. By memorandum dated January 2, 2002, panel chair Martin requested that CO Thompson continue discussions with the three offerors to address further concerns of the panel. AR 637-38. The Evaluation Checklists indicate that, as a result of these discussions with the offerors, the panel members revised the raw score point totals at least once. AR 684.

3. Re-Scoring of Proposals

On May 16, 2002, panel chair Martin returned the Evaluation Checklists to the panelists for justification of their scores. The panelists were instructed to provide comments for each score given, to identify the page number in the solicitation that addressed the requirement, and to provide a narrative evaluation of each checklist subject. See AR 3975, file memorandum by panel chair Martin dated October 22, 2002 (providing an explanation for the contract file as to why the proposals had been re-scored). These instructions were not unique to this BOP solicitation. Rather, they were "generic" instructions for panel members on other solicitations for community corrections center services as well. In a "Memorandum for Panel Members" dated May 29, 2002, panel chair Martin provided the following guidance regarding how to score offerors' proposals:

Due to recent enhancements made to the evaluation process, more in-depth comments are required on all evaluation checklists. Attached are your completed checklists for the above referenced solicitation. Please review your checklists for both offerors and ensure a notation is made for scores you assigned for checklist elements. For example, if you score an element 3, you should note the offeror met the minimum requirements and indicate where in the technical proposal this element was addressed. If a score of 4 or a 5 was given, note what the offeror did to receive an above average score. If a 0, 1, or 2 was given, state what was deficient or what needed clarification.

AR 3966.

Panel chair Martin, in her October 22, 2002 file memorandum, recorded that on July 8, 2002, COS Mance submitted her Evaluation Checklists. See supra at 4; AR 3975. Even after the May 29, 2002 memorandum, panel chair Martin determined that "[m]ore explicit comments were required" on COS Mance's Checklists, and they were returned to Ms. Mance for further comments on September 18, 2002. AR 3975. CCM Menifee was also sent new checklists on this date. Id. COS Mance and CCM Menifee returned their re-scored Checklists for the three offerors to panel chair Martin, with an explanation about how they had re-scored the proposals. AR 3967 (dated September 27, 2002), 3973 (dated October 1, 2002). CCM Menifee's September 27, 2002 memorandum stated that "I originally scored numerous 4 and 5 for all offerors but after getting additional instructions and understanding the process better, I now submit the following revised scores. This was my first panel and my lack of experience with the evaluation process, also attributed to the inflated scores." AR 3967. Similarly, in a memorandum dated October 1, 2002, COS Mance told panel chair Martin that she had re-evaluated the proposals for all three offerors and "[a]fter receiving a better understanding of the evaluation process, the majority of the 5 ratings were reduced to 3 ratings." AR 3973.

CO Thompson recorded her comments on the front of her Evaluation Checklists:

When proposals were re-evaluated, it became apparent that during my initial evaluations, I rated various elements of the proposals with scores of 4's and 5's, that should have been scored at 3. This was the first evaluation team that I had participated in. After re-evaluating the proposals it became clear that I did not have a good understanding of exactly what should have been done, and therefore scored various elements with scores that were not appropriate. Therefore, I have re-scored the proposals to reflect the correct scoring.

AR 730, 979, 1104.

CO Thompson drafted her comments directly on the front of her previous Checklists, which are dated as of the date of the original scoring. The Checklists do not appear to indicate the date on which they were re-scored, nor the date on which CO Thompson drafted her explanation as to why she re-scored the proposals.

4. March 19, 2003 Source Selection Decision Document

On March 19, 2003, Stewart Rowles, Administrator of the Community Corrections Branch of BOP and the Source Selection Authority for this procurement, signed a Source Selection Decision Document ("SSDD") determining that contract award should be made to Bannum based upon its October 31, 2002 proposal. AR 3976-80. Before any award was issued, however, BOP's Compliance and Review Contract Office prepared an April 11, 2003 pre-award review memorandum in response to a request from CO Thompson. AR 1889-96.

The pre-award review memorandum identified several concerns based upon a review of the procurement file. First, it stated that "the file contains no documentation regarding the significant decreases in the technical and management evaluations performed by the technical panel members." AR 1892. Second, it stated that "[t]he past performance evaluations continue to contain the same errors and inconsistencies that were noted under previous reviews." Id. Third, it stated that the SSDD should be "based upon an in-depth comparative assessment against all source selection criteria in the solicitation. . . . No comparative assessment between Bannum's, Dismas [sic], and CSC's proposals are [sic] provided. The SSA needs to explain why the Government considers the successful offer a better value in comparison with the other competing offer[s]." AR 1892-93 (emphasis in original). The pre-award review memorandum also noted that "[t]he ratings, scores, and other quantifiable measures used during the source selection process should be used as guides to support the decision process, not to make the decision." AR 1893.

To address the first concern, panel chair Martin prepared a memorandum to CO Thompson, dated August 19, 2003. AR 635. The memorandum officially confirmed that panel chair Martin had requested that all panel members re-evaluate the proposals to ensure that they applied the scoring system provided in the Evaluation Checklists. AR 635. To address the second concern, CO Thompson prepared a memorandum dated October 10, 2003, that analyzed the past performance of the three offerors determined to be in the competitive range. AR 4032-57.

5. October 28, 2003 Source Selection Decision Document

Mr. Rowles prepared and signed a revised SSDD on October 28, 2003. AR 4058-64. The three-page "Past Performance Summary" section of CO Thompson's October 10, 2003 memorandum that addressed the past performance of all three offerors, AR 4055-57, was incorporated into the October 28, 2003 SSDD. AR 4061-63. The March 19, 2003 SSDD had addressed only Bannum's past performance, not that of the other offerors. AR 3979.

The "Management" and "Technical" sections of the March 19, 2003 SSDD, which had also addressed only Bannum's proposal, were revised. AR 3979-80. The "Management" and "Technical" sections of the October 28, 2003 SSDD made a comparative assessment of all three offerors' proposals, as recommended in the April 11, 2003 pre-award review memorandum. AR 4063-64.

BOP's Compliance and Review Contract Office then prepared a second pre-award review memorandum — this time reviewing the October 28, 2003 SSDD. AR 6243-75. This pre-award review memorandum, dated November 10, 2003, identified several typographical errors and minor miscalculations in the SSDD, which were shown in a marked-up copy of the October 28, 2003 SSDD that was attached to the pre-award review memorandum. AR 6294, 6258-64. The memorandum stated that "[t]he calculation errors are minimal and will not effect [sic] the award decision." AR 6249. One of the errors identified on the mark-up was a reference at the bottom of the third page of the SSDD to "[* * *]% of the IGE," which the mark-up noted should be corrected to "[* * *]%." AR 6260. The pre-award review memorandum also noted that on page 25 of CO Thompson's October 10, 2003 memorandum regarding past performance, AR 4056, the reference to "[* * *]" excellent ratings for Dismas should have been "[* * *]." AR 6249.

Plaintiff, in its initial brief, relied on these mistakes as bases for its claim. After the Government supplemented the administrative record with the November 10, 2003 pre-award review memorandum that highlighted these mistakes, plaintiff stated, in its reply brief, that "it withdraws its challenge to what appeared to be an improper evaluation of the number of Dismas's Past Performance references and Dismas's cost in comparison to the Government independent cost estimate, which evaluation was part of the November 2003 source selection decision. The Government's documentation . . . appears to support its assertion that these mistakes were merely typos." Pl.'s Reply at 2 n. 1.

6. November 14, 2003 Source Selection Decision Document

On November 14, 2003, Mr. Rowles signed a revised SSDD. AR 4072-78. The November 14 SSDD corrected the typographical and calculation errors identified in the November 10, 2003 pre-award review memorandum. AR 4074, 4076. The SSDD stated that all three offerors "addressed the Management factor requirements during discussions" ( i.e., the offerors satisfied the Management requirements of the solicitation) and all three offerors had satisfactorily addressed the Technical factor. AR 4077-78. Dismas had received an overall average rating of [* * *] for the Past Performance factor, as compared to [* * *] for Bannum, but both offerors' ratings fell within the range (3.66-4.33) for an adjectival rating of "[* * *]" established in CO Thompson's October 10, 2003 performance evaluation memorandum. The SSDD incorporated the summary contained in that memorandum. AR 4034, 4035, 4045.

The SSDD stated that out of a possible 1000 total points, Bannum received [* * *], Dismas received [* * *], and CSC received [* * *]. AR 4073. The final score matrix was as follows:

Name                 Past Performance   Technical    Management   Cost         Total
                     325 (Max)          225 (Max)    225 (Max)    225 (Max)    1000 (Max)
Bannum, Inc.         [* * *]            [* * *]      [* * *]      [* * *]      [* * *]
Dismas Charities     [* * *]            [* * *]      [* * *]      [* * *]      [* * *]
CSC                  [* * *]            [* * *]      [* * *]      [* * *]      [* * *]

AR 4073. The SSDD also stated that "[t]he current pricing offers are: Bannum $[* * *] ([* * *]% of the IGE), CSC $[* * *] ([* * *]% of the IGE), and Dismas $[* * *] ([* * *]% of the IGE)." Id. The SSDD determined that Bannum's proposal offered the best value:

[T]he Source Selection Authority (SSA) must determine if the highest overall rated proposal of Dismas Charities, with the highest price, has perceived benefits which merit the additional cost. . . . It is the opinion of the SSA that the offer submitted by Bannum, Incorporated is the most advantageous to the government as it meets or exceeds the minimum requirements of the contract at the lowest proposed price. A review of the strengths and weaknesses of the proposals, reveals that the services Dismas Charities is offering do not warrant paying the premium or difference of $[* * *] in cost over the life of the contract which consists of a two year base and three option periods of one year each. As outlined in Section M.3 of the solicitation, "Should evaluations result in substantially `technically equal' scores, cost will be a major factor in the selection for contract award."

AR 4073. The SSDD continued, noting that "with Bannum's and Dismas's total overall points being virtually equal prior to price considerations, price takes on a priority role. When considering non-cost [Past Performance, Technical, and Management] percentage points between the two offerors with the highest non-cost scores (Bannum and Dismas), there is a [* * *]% point difference between the two offerors." AR 4075.

The November 14, 2003 SSDD concluded that Bannum would be awarded the contract: "It is my determination that award to Bannum, Incorporated, who was rated the highest of the three offerors in Technical, Management, and Cost, best meets the needs of the Government. Furthermore, the prices proposed by Bannum represent a savings to the Government of $[* * *] for the entire contract period, when compared to the offeror with the highest overall point total but highest proposed price." AR 4078.

B. The GAO Decision

On November 17 and 18, 2003, BOP sent to the three offerors notices of the award of the contract to Bannum. AR 4089-90, 4098, 4099. On December 3, 2003, Dismas filed a bid protest with the General Accounting Office ("GAO"). AR 4137-58. On February 20, 2004, the General Counsel of the GAO issued an opinion denying Dismas's bid protest. AR 4311-14. On March 5, 2004, Dismas filed the present action in this court.

II. Discussion

A. Jurisdiction and Standard of Review for Bid Protest Actions

The Court has jurisdiction over this bid protest action pursuant to 28 U.S.C. § 1491(b) (2000). Section 1491(b)(4) explicitly provides that in any action under § 1491(b), "[t]he courts shall review the agency's decision pursuant to the standard set forth in section 706 of title 5," the Administrative Procedure Act ("APA"), 5 U.S.C. §§ 701- 706 (2000). Section 706(2) provides that the reviewing court shall: "Hold unlawful and set aside agency action, findings, and conclusions found to be (A) arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law . . . or (D) without observance of procedure required by law." 5 U.S.C. § 706(2)(A) (2000); see also Citizens to Preserve Overton Park, Inc. v. Volpe, 401 U.S. 402, 416 (1971). To prevail in a bid protest, the plaintiff must prove the arbitrary and capricious nature of the Government's actions by a preponderance of the evidence. Ellsworth Associates, Inc. v. United States, 45 Fed. Cl. 388, 392 (1999).

Judicial review of agency contracting decisions is "extremely limited." CACI Field Servs. v. United States, 13 Cl. Ct. 718, 725 (1987). Contracting officers may properly exercise wide discretion in their evaluation of bids and in their application of procurement regulations. Id. The court cannot substitute its judgment for that of the agency, even if reasonable minds could reach differing conclusions, and must give deference to the agency's findings and conclusions. Seaborn Health Care, Inc. v. United States, 55 Fed. Cl. 520, 523 (2003); CRC Marine Servs., Inc. v. United States, 41 Fed. Cl. 66, 83 (1998). "The disappointed bidder has the `heavy burden' of showing that the award decision `had no rational basis.'" Impresa Construzioni Geom. Domenico Garufi v. United States, 238 F.3d 1324, 1332 (Fed. Cir. 2001). The question before the court is not whether the agency's decision was right or wrong; instead, the court "must determine whether that decision was the result of a considered process, rather than an arbitrary and capricious choice based on factors lacking any intrinsic rational basis or relationship to the questions at issue." CW Government Travel, Inc. v. United States, 53 Fed. Cl. 580, 590 (2002). Generally, the details of technical rating decisions involve discretionary determinations that a court will not second guess. Id. (citing E.W. Bliss Co. v. United States, 77 F.3d 445, 449 (Fed. Cir. 1996)). Procurement officials have substantial discretion to determine which proposal represents the best value for the government. Overstreet Elec. Co. v. United States, 59 Fed. Cl. 99, 108 (2003) (citing Lockheed Missiles & Space Co., Inc. v. Bentsen, 4 F.3d 955, 958 (Fed. Cir. 1993)).

B. Standard of Review for Judgment on the Administrative Record

From a procedural standpoint, bid protest actions are considered upon cross-motions for judgment on the administrative record pursuant to RCFC 56.1. See World Travel Service v. United States, 49 Fed. Cl. 431, 438 (2001). The standards applicable to a motion for judgment on the administrative record differ from a RCFC 56 motion for summary judgment. See Lion Raisins, Inc. v. United States, 51 Fed. Cl. 238, 246-47 (2001); Tech Systems, Inc. v. United States, 50 Fed. Cl. 216, 222 (2001). The statements and counter-statements of facts prepared pursuant to RCFC 56.1 argue the significance and weight accorded to the facts that were the basis for the agency decision. Tech Systems, 50 Fed. Cl. at 222. The inquiry in a review of the administrative record in a bid protest is whether, given all the disputed and undisputed facts, a protester has met its burden of proof that an award is arbitrary, capricious . . . or violates to prejudicial effect an applicable procurement regulation. Id. (citing CCL Serv. Corp. v. United States, 48 Fed. Cl. 113, 119 (2000)); See also PGBA, LLC v. United States, 60 Fed. Cl. 196, 204 n. 11 (2004).

Under this standard, it is well settled that "the focal point for judicial review should be the administrative record already in existence, not some new record made initially in the reviewing court." Camp v. Pitts, 411 U.S. 138, 142 (1973). That record consists of the materials and files that were before the agency at the time the decision was made. See Florida Power & Light Co. v. Lorion, 470 U.S. 729, 743 (1985); Federal Power Comm'n v. Transcontinental Gas Pipe Line Corp., 423 U.S. 326, 331 (1976). The administrative record should not include materials created or obtained subsequent to the time the decision-maker decided to take the challenged agency action or materials adduced through discovery by opponents of the agency's actions in de novo proceedings in court. See id.

C. Dismas Has Standing to Pursue This Action, and the Action Is Timely

In order to maintain standing to sue in a bid protest action, a protestor must be an "interested party." See 28 U.S.C. § 1491(b)(1). The Tucker Act, however, does not define the term "interested party." The United States Court of Appeals for the Federal Circuit, therefore, has adopted the definition of "interested party" set forth in the Competition in Contracting Act ("CICA"). Northrop Grumman Corp. v. United States, 50 Fed. Cl. 443, 455-56 (2001) (citing Am. Fed'n of Gov't Employees v. United States, 258 F.3d 1294, 1300-02 (Fed. Cir. 2001)). The CICA defines an "interested party" as "an actual or prospective bidder or offeror whose direct economic interest would be affected by the award of the contract or by failure to award the contract." Id., 31 U.S.C. § 3551(2) (2000).

Bannum, in its opposition to Dismas's motion for judgment on the administrative record, argued that Dismas lacks standing to protest the award to Bannum because Dismas allegedly did not have a facility or the required permits or approvals from the City of Savannah by the RFP submission deadline. There is no evidence in the record, however, that supports this contention. The Government, in its response to plaintiff's proposed additional facts, conceded that Dismas had the appropriate permits and an acceptable facility, such that it would have been eligible for award of the contract. The Government confirmed this conclusion at oral argument. Bannum did not file a reply brief, nor did it pursue this point at oral argument. Accordingly, the Court finds that Dismas has standing to pursue this bid protest.

Bannum Opp. at 5.

Def.'s Resp. to P's Additional Facts at ¶¶ 24-27.

Tr. at 67-68.

Id. at 68.

Bannum additionally urged this Court to dismiss the protest as untimely. Citing the GAO opinion in this case, Bannum posits that because "BOP reports, and Dismas does not dispute, that Dismas was provided with the scores on which the arguments are based no later than October 2, 2003," Dismas's failure to file suit in this court prior to March 5, 2004 renders its protest untimely. This court, however, is not bound by the bid protest timeliness rules of the GAO. See 28 U.S.C. § 1491(b)(3) ("in exercising jurisdiction under this subsection, the courts shall give due regard to . . . the need for expeditious resolution of the action"); Software Testing Solutions, Inc. v. United States, 58 Fed. Cl. 533, 535 (2003) ("[t]his court, with all due respect, fails to see how a GAO rule that self-limits that agency's advisory role constitutes a limit, either legally or prudentially, on this court's exercise of jurisdiction").

Bannum Opp. at 7.

While Dismas may have been aware of the scores as early as October 2, 2003, Dismas was not informed of BOP's decision until November 18, 2003. AR 4099. Dismas timely requested a debriefing on November 20, 2003, within three days of learning of BOP's award decision. AR 4152. On December 3, 2003, Dismas filed a bid protest with GAO. AR 4137-48. On March 5, 2004, within 10 days of learning of the GAO decision denying its protest, Dismas filed its complaint and motion for injunctive relief in this court. See AR 4312; complaint. Dismas has diligently pursued its rights with respect to this procurement. The Court finds that Dismas's bid protest action, filed on March 5, 2004, is timely.

D. FAR Provisions Governing Source Selection

Federal Acquisition Regulation ("FAR") Subpart 15.3 governs Source Selection. FAR 15.303(b) provides that the source selection authority shall, inter alia:

1) Establish an evaluation team, tailored for the particular acquisition, that includes appropriate contracting, legal, logistics, technical, and other expertise to ensure a comprehensive evaluation of offers;
2) Ensure consistency among the solicitation requirements, notices to offerors, proposal preparation instructions, evaluation factors and subfactors, solicitation provisions or contract clauses, and data requirements;
3) Ensure that proposals are evaluated based solely on the factors and subfactors contained in the solicitations;
4) Consider the recommendations of advisory boards or panels; and
5) Select the source or sources whose proposal is the best value to the Government.
48 C.F.R. § 15.303.

FAR 15.304, entitled "Evaluation factors and significant subfactors," provides that "all factors and significant subfactors that will affect contract award and their relative importance shall be stated clearly in the solicitation. . . . The rating method need not be disclosed in the solicitation. The general approach for evaluating past performance information shall be described." 48 C.F.R. § 15.304(d). The solicitation shall also state, at a minimum, whether all evaluation factors other than cost or price, when combined, are (1) significantly more important than cost or price; (2) approximately equal to cost or price; or (3) significantly less important than cost or price. 48 C.F.R. § 15.304(e) (citing 10 U.S.C. § 2305(a)(3)(A)(iii)).

FAR 15.305 governs "proposal evaluation." This provision states, in pertinent part: "An agency shall evaluate competitive proposals and then assess their relative qualities solely on the factors and subfactors specified in the solicitation. Evaluations may be conducted using any rating method or combination of methods, including color or adjectival ratings, numerical weights, and ordinal rankings. The relative strengths, deficiencies, significant weaknesses, and risks supporting proposal evaluations shall be documented in the contract file." 48 C.F.R. § 15.305(a).

The source selection decision is governed by FAR 15.308:

The source selection authority's (SSA) decision shall be based on a comparative assessment of proposals against all source selection criteria in the solicitation. While the SSA may use reports and analyses prepared by others, the source selection decision shall represent the SSA's independent judgment. The source selection decision shall be documented, and the documentation shall include the rationale for any business judgments and tradeoffs made or relied on by the SSA, including benefits associated with additional costs. Although the rationale for the selection decision must be documented, that documentation need not quantify the tradeoffs that led to the decision.
48 C.F.R. § 15.308.

E. The Source Selection Decision Was Neither Arbitrary nor Capricious and Was Otherwise in Accordance With Law

In its motion for injunctive relief, Dismas contends that BOP's decision to award a contract to Bannum was arbitrary, capricious, an abuse of discretion, and contrary to law and that Dismas was prejudiced by BOP's actions for several reasons: (1) BOP disparately and unequally downgraded Dismas's Technical and Management scores; (2) BOP's re-scoring had the effect of changing the solicitation from a "best value" to "lowest price technically acceptable," see 48 C.F.R. §§ 15.101, 15.102; (3) BOP improperly evaluated the Technical proposals by using an inaccurate point scoring system that did not reflect the true measure of proposals; (4) BOP failed to properly evaluate the Past Performance factor in conducting the price/technical tradeoff determination; and (5) BOP failed to conduct a price reasonableness analysis using the prices offered by Bannum. The Court will address these contentions seriatim.

1. BOP Did Not "Disparately and Unequally" Downgrade Dismas's Technical and Management Scores

As indicated above, BOP used Evaluation Checklists to generate raw scores of the proposals under the Technical and Management factors. The raw scores were later converted into the 1000-point scale as discussed supra at 3, 4, 8 and infra at 19-20. Each Evaluation Checklist included 164 questions that scored the proposals on a scale of 0 through 5, with 5 being the best score for each question. AR 787-81. Each member of the SSEP added up the scores from her Evaluation Checklist for each of the Technical subfactors and the Management factor for each proposal. Then, the raw points of the SSEP members were averaged for each Technical subfactor and for the Management factor to arrive at the consensus raw points score for each proposal. This consensus raw points score was then converted mathematically into the weighted point score.
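
The record describes this averaging and conversion only in general terms and does not reproduce BOP's exact conversion formula. The short sketch below is therefore only an illustration of one plausible reading of the process, assuming a simple proportional conversion from consensus raw points to the weighted maxima in the Source Selection Plan; the panelist totals used are hypothetical.

    # A minimal sketch of the SSEP scoring flow described above. The proportional
    # conversion is an assumption made for illustration; the opinion does not quote
    # BOP's actual conversion formula, and the panelist totals below are hypothetical.
    from statistics import mean

    # Maximum raw points per checklist area (questions x 5 points each), per the
    # opinion: Reports/Policy/Procedures (81 questions), Facility (31),
    # Overall Programs Approach (7), and Management (45).
    MAX_RAW = {
        "Reports/Policy/Procedures": 81 * 5,
        "Facility": 31 * 5,
        "Overall Programs Approach": 7 * 5,
        "Management": 45 * 5,
    }

    # Weighted maxima from the Source Selection Plan: 75 points for each Technical
    # subfactor (225 total) and 225 points for the Management factor.
    WEIGHTED_MAX = {
        "Reports/Policy/Procedures": 75,
        "Facility": 75,
        "Overall Programs Approach": 75,
        "Management": 225,
    }

    def consensus_raw(panelist_totals):
        """Average the raw point totals recorded by the individual SSEP members."""
        return mean(panelist_totals)

    def weighted_score(area, raw):
        """Assumed proportional conversion: raw share of the maximum times the weight."""
        return raw / MAX_RAW[area] * WEIGHTED_MAX[area]

    # Hypothetical panelist totals for one offeror on the Facility subfactor:
    facility = consensus_raw([101, 99, 104, 100])             # consensus of 101 raw points
    print(round(weighted_score("Facility", facility), 2))     # 101/155 x 75 = 48.87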

After three rounds of evaluating proposals, the SSEP Scoring Sheet revealed that Dismas had won the competition on the Technical factor (including all Technical subfactors — Reports/Policy/Procedure, Facility, and Overall Approach) and on the Management factor:

Averaged Raw Point Scores of SSEP          Dismas    Bannum    Difference between
After Three Rounds of Scoring                                  Dismas and Bannum

Reports/Policy/Procedure                   320.50    301.25    19.25
Facility                                   123.50    106.00    17.50
Overall Programs Approach                   26.75     23.75     3.00
Management                                 177.00    154.50    22.50
Total                                      647.75    585.50    62.25

Pl. App. 24. After these scores had been tabulated, BOP determined that it needed to re-score the proposals, as discussed supra at 4-5. The results of the re-scoring of proposals were as follows:

Averaged Raw Point Scores of SSEP          Dismas    Bannum    Difference between
After Re-Scoring                                               Dismas and Bannum

Reports/Policy/Procedure                   256.50    253.50     3.00
Facility                                   101.00     99.50     1.50
Overall Programs Approach                   21.50     23.00    -1.50
Management                                 148.50    153.50    -5.00
Total                                      527.50    529.50    -2.00

Pl. App. 24. Dismas correctly points out that while Bannum's raw scores were reduced by 56 points, Dismas's raw scores were reduced by approximately double that amount (120.25 points). The re-scoring had the effect of turning a competition in which Dismas had a significant advantage into one that was roughly even with respect to the Technical and Management factors. Prior to re-scoring, Dismas's proposal had a [* * *] weighted-point advantage over Bannum's proposal with respect to the Technical and Management factors. After the re-scoring, Dismas's proposal had a [* * *] weighted-point disadvantage vis-à-vis Bannum's proposal with respect to the Technical and Management factors. AR 4073.

Dismas contends that because its score was decreased more than Bannum's in the re-score, BOP treated Dismas unfairly. Dismas states that "generally, an agency may not evaluate offerors disparately, but must rationally evaluate proposals and treat each offeror fairly in conducting evaluations." Seattle Security Servs., Inc. v. United States, 45 Fed. Cl. 560, 569 (2001). "It is fundamental that the contracting agency must treat all offerors equally; it must evaluate offers evenhandedly against common requirements and evaluation criteria." Id. Dismas is correct on the law. The facts, however, do not support Dismas's allegation that it received unfair treatment.

There is no indication that the panel applied different standards when re-scoring the three proposals. The mere fact that the re-scoring resulted in Dismas's score going down — a disparate result — does not mean that Dismas was treated unfairly or differently. BOP recognized a problem with the scoring and rectified it prior to award — an action that should be commended, not discouraged. Plaintiff has made no showing that the Government re-scored in order to alter the results. Government employees are presumed to act in good faith, and this presumption can only be overcome by clear and convincing evidence of bad faith. Am-Pro Protective Agency, Inc. v. United States, 281 F.3d 1234, 1240 (Fed. Cir. 2002).

The record indicates that the re-scoring was undertaken, not to alter the results of the competition, but because the panel chair was concerned that the panel members were incorrectly scoring the proposals. There were several instances where panel members had awarded 4s and 5s in response to questions in which the offerors had met, but not exceeded, the specifications, and thus merited 3s according to the numerical scale applicable to the evaluation. AR 730, 979, 1104, 3966-67, 3973-75. In the re-scoring, BOP sought to ensure that the evaluators applied the scoring methodology set forth in the instructions to the Evaluation Checklists. AR 3966, 3975. The Evaluation Checklists explained what each raw point score should represent:

0 = The proposal did not comment on the specification.
1 = The approach did not correctly address the specification. It is either deficient, unclear or excessive. The SSEP member must specifically identify: the deficiency ( i.e., what and why it is incorrect); the clarification ( i.e., what and why it is unclear); or the excess ( i.e., offering more than the specification requires). To increase the raw points assigned, the offeror must correct the deficiency or clarify the issue by demonstrating or providing evidence which appropriately addresses the specification.
2 = The approach poorly addressed the specification. The approach only indicated compliance. However, it did not explain, present samples or provide other documents which would indicate a likelihood of success. The SSEP member must specifically ask the offeror to resubmit the approach. To increase the raw points assigned, the offeror must explain how the requirement would be accomplished.
3 = The approach addressed the specification. The approach reasonably explained how the specification would be accomplished and met the minimum specifications.
4 = The approach addressed the specification in a satisfactory manner. The approach provided a reasonable and effective explanation.
5 = The approach addressed the specification in an excellent manner. The approach provided a reasonable and effective explanation using efficient innovation to accomplish the specification. It also provided samples which conveyed an excellent-to-outstanding approach.

AR 788-89.

Examples of inconsistent application of the point system are particularly noticeable in COS Mance's Checklists. The written comments recorded to the right side of the number on her Evaluation Checklists appear to indicate that initially she focused on less easily distinguishable aspects of the scoring criteria, such as whether she thought that the proposal offered a "reasonable" explanation (warranting a 3) as opposed to a "reasonable and effective" explanation (required for a 4 or 5). AR 790-812. As a result, she initially did not assign scores in a manner consistent with the scoring system set forth on the Evaluation Checklists. The instructions to the Evaluation Checklists required that if the proposal "provided a reasonable and effective explanation" a score of 4 or 5 was appropriate, depending upon whether the proposal was "satisfactory" (4) or "excellent" (5). AR 788-89. On COS Mance's Evaluation Checklist for Dismas, she wrote next to question 8, "a reasonable effective explanation provided," and it appears that she initially assigned a score of 5. AR 791. For question 9, Ms. Mance again initially assigned a score of 5, but she wrote next to the score "a reasonable explanation provided," which corresponded to a score of 3 rather than 5. Id. Additionally, even though she assigned scores of 5 for questions 11-14, for question 11, she wrote "reasonably addressed," but next to questions 12-14, she wrote "reasonable effective." Id. The record supports the Government's assertion that it was this type of discrepancy that prompted panel chair Martin to suggest that the panel members re-score the proposals.

The panel members re-scored all of the proposals, not just Dismas's. The panel members appear to have gone through the Evaluation Checklists question by question and re-scored the proposals following panel chair Martin's written guidance. They had previously noted the page number references where the response to each question could be found within the proposal, so they were able to go back to that part of the proposal and determine the appropriate score for that question. While the re-scoring resulted in many 5s being reduced to 3s, some 4s and 5s remained. After the re-score, Dismas received fifty-nine 4s and thirty-two 5s. AR 686-715, 730-812. Bannum received twenty-eight 4s and forty-six 5s. AR 686-715, 730-812. Thus, contrary to Dismas's allegations, even after Ms. Martin's May 29, 2002 memorandum, the panel members continued to exercise independent judgment in their scoring. The decision as to whether an offeror should have scored a 3, 4, or 5 on any question is properly left to the discretion of the agency. This type of decision is part of "the minutiae of the contracting process," which involves the sort of "discretionary determination that the court will not second-guess." CW Government Travel, 53 Fed. Cl. at 590. BOP's decision to re-score, as well as the re-scoring itself, was the result of a considered process and, therefore, was not arbitrary or capricious. See id.

2. BOP's Re-Scoring of Proposals Was Reasonable and Did Not Contradict the Use of Best Value Criteria or the Instructions for Completion of the Evaluation Checklists

Dismas contends that the instructions contained in panel chair Martin's memorandum converted the solicitation from a "best value" procurement to a "lowest price technically acceptable" procurement. The main basis for Dismas's theory is that by requiring a justification for scores above and below a 3, Ms. Martin's May 29, 2002 memorandum created a disincentive to give scores other than a 3. Thus, panel members gave more 3s after the re-scoring, with the result that all the offerors had virtually identical scores on the Technical and Management factors, thereby necessarily increasing the importance of the Price factor.

Pl.'s Reply at 2.

A "best value" determination allows the Government to "consider award to other than the lowest price offeror or other than the highest technically rated offeror." 48 C.F.R. § 15.101-1(a). Alternatively, the "lowest price technically acceptable" source selection process "is appropriate when best value is expected to result from selection of the technically acceptable proposal with the lowest evaluated price," and "tradeoff is not permitted." 48 C.F.R. § 15.101-2 (a) and (b)(2).

Ms. Martin's memorandum instructed the panel members to articulate a reason for all ratings, not just those higher than a three. If a question received a score of 0-2, the panel member had to describe the nature of the deficiency. If a score of 3 was given, the panel member had to state that the response met the minimum requirements of the solicitation. If a panel member gave a score of 4 or 5, she had to state why the response exceeded the minimum requirements of the solicitation. AR 3966. Requesting a written comment along with a numerical score was the only way for the panel chair to ensure that the members consistently applied the Evaluation Checklist instructions. BOP's actions in this regard seem eminently reasonable in a case such as this where there are new, inexperienced panel members.

Additionally, the mere fact that a majority of the scores were 3s does not mean BOP actually conducted a "lowest price technically acceptable" solicitation. The only conclusion that can be drawn from the record is that most of the responses only met the minimum requirements, and therefore a score of 3 was appropriate. Offerors were free to provide proposals that exceeded the requirements and would have received higher scores had they done so. In fact, as discussed supra at 16, offerors did receive higher scores when they exceeded the minimum requirements. The requirement to justify the scores did not compel 3s across the board. Rather, the quality of the proposals determined the scores. While Dismas may be unhappy with the results of the re-score, it has failed to meet its burden of proving, by a preponderance of the evidence, that the re-scoring process was arbitrary or capricious. See Ellsworth Associates, 45 Fed. Cl. at 392.

At oral argument, Dismas advanced the theory that the solicitation provided that at least one of the offerors was required to receive a 5 on each question on the Evaluation Checklist, and that the failure of BOP to use this scoring method resulted in a "lowest price technically acceptable" procurement rather than a "best value" solicitation. The language that Dismas relies on to support this theory is found at M.3 of the contract: "The evaluation criteria at M.5 will be utilized by a Source Selection Evaluation Board (SSEB) in analyzing each Technical Proposal submitted in response to this solicitation. The SSEB will score each response on each element, giving the highest score to the best response for each element." AR 206.

Tr. at 26-27.

Interpretation of the terms of a government contract (or solicitation) is a matter of law. Fortec Constructors v. United States, 760 F.2d 1288, 1291 (Fed. Cir. 1985). Whether a solicitation's provisions are ambiguous is also a question of law. Overstreet Elec., 59 Fed. Cl. at 112. This Court must begin its analysis by construing the plain language of the solicitation. Id. For the reasons discussed below, the plain language of the solicitation clause at M.3 does not support Dismas's assertion that for each of the 164 questions on the Evaluation Checklist at least one offeror should have received a score of 5.

The term "element" is not defined in the solicitation. Dismas suggests that "each element" means each question on the Evaluation Checklist. This proposed definition for "element" is supported by the fact that the May 29, 2002 "generic memorandum" instructing panel members on how to complete the Evaluation Checklists used the term "element" to mean "question." ("Please review your checklist for both offerors and ensure a notation is made for scores you assigned for checklist elements"). See supra at 5; AR 3966 (emphasis added). But even assuming that "element" means "Evaluation Checklist question," Dismas's argument that at least one offeror was required to receive a 5 on each "element" is unpersuasive.

According to the RFP and the Source Selection Plan, BOP was to score each proposal objectively against the requirements of the solicitation. Panel members were to give a score of 3 if the response "met the minimum specifications," a score of 4 if the response was "satisfactory," and a score of 5 if the response was "excellent." See supra at 15-16; AR 788-89. "The best response" on a given "element" or question may be one that only meets the "minimum requirements," earning a 3. If that is the case, then as long as no other offeror got higher than a 3 on that question, the "best response" received the "highest score." See AR 206.

Id. at 57-58.

The term "highest score" is not the same as "maximum points available." This distinction is bolstered by the fact that section M.5 of the solicitation, under Factor IV — Cost, states that "the lowest rate, as indicated by the Business Proposal will receive the maximum points available under the Cost/Price factor." AR 208 (emphasis added). If the solicitation had intended to require that at least one bidder get a 5 on each question, it would have used language similar to the language relating to cost quoted above. The solicitation, on its face, did not require the panel members to give a 5 to at least one offeror for each of the questions on the Technical/Management Checklist.

Furthermore, the fact that BOP scored the Evaluation Checklists objectively, rather than comparatively, does not convert the solicitation to a "lowest price technically acceptable" procurement. In fact, such an objective scoring method is consistent with the FAR requirements for evaluating proposals under a "best value" procurement. See 48 C.F.R. § 15.305(a), discussed supra at 12 (stating that "an agency shall evaluate competitive proposals and then assess their relative qualities" (emphasis added)). Thus, it was proper for BOP to use the Evaluation Checklist to rate the proposals against objective criteria, and then evaluate the relative strengths and weaknesses of the proposals based on that analysis.

A significant indication that BOP did not engage in a "lowest price technically acceptable" solicitation is that BOP actually conducted a tradeoff. BOP determined that even though Dismas scored higher on the non-price factors, the small margin was not worth the extra price. AR 4074. Had BOP conducted a "lowest price technically acceptable" procurement, no tradeoff would have been permitted. 48 C.F.R. § 15.101-2 (b)(2).

3. The Point Scoring System that BOP Used to Evaluate the Technical Proposals Was Proper and Consistent With the Solicitation

Dismas contends that "BOP's mechanical scoring methodology under the Technical factor does not reflect the actual evaluation of proposals." Specifically, Dismas alleges that BOP's method for converting the scores from raw points to weighted points was improper. Dismas asserts that it should have received a higher rating on the Technical factor than Bannum because the total of the raw scores that it received for the three Technical subfactors was higher than Bannum's (379 versus 376). See supra table at 14. While the methodology used by BOP may be, at first glance, a bit confusing to some, it is not irrational. Dismas has not met its burden of showing that the evaluation of the proposals did not "accurately reflect the actual differences in the proposals." See CSE Constr. Co. v. United States, 58 Fed. Cl. 230, 244 (2003).

Pl.'s Mot. at 11.

Id.

The Solicitation expressly provided for three Technical subfactors: "Reports/Policy/Procedures," "Facility," and "Overall Programs Approach." AR 207; RFP § M.5. Each subfactor was evaluated according to its own set of questions in the Evaluation Checklist. BOP had determined that it needed many more questions to properly evaluate Reports/Policy/Procedures (81 questions, each scored 0-5, worth a raw score of up to 405), than it needed to evaluate Facility (31 questions, each scored 0-5, worth a raw score of up to 155), or Overall Programs Approach (7 questions, each scored 0-5, worth a raw score of up to 35). See, e.g., AR 713. However, for purposes of the 1000-point scale established in the Source Selection Plan, BOP weighted each of the three Technical subfactors equally — 75 points each. AR 4025. That decision is consistent with the terms of the RFP because the RFP did not provide that any one Technical subfactor would have more weight than the others. AR 207.

There were 45 questions pertaining to the Management factor, for a total of 164 questions.

Because Reports/Policy/Procedure had more questions, and therefore more raw points, each raw point was worth less when it was converted to the weighted point score. In contrast, because there were so few questions relating to Overall Programs Approach, each raw point related to that subfactor significantly impacted the weighted score. The Overall Programs Approach raw points were particularly important in this case because Bannum received a higher raw score for Overall Programs Approach (23) than did Dismas (21.5) (a difference of 1.5 raw points). This difference, however, translated into a 3.75-point difference in the weighted scores in Bannum's favor.
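
To make the arithmetic behind this effect concrete, the short sketch below compares the weighted value of a single raw point under each of the three equally weighted (75-point) Technical subfactors, again assuming the proportional conversion posited in the earlier sketch. Because the record does not quote BOP's actual formula, these figures illustrate the disparity rather than reconstruct the 3.75-point difference reported in the record.

    # Weighted value of one raw point under each 75-point Technical subfactor,
    # assuming the proportional conversion sketched earlier (an assumption, not
    # BOP's documented formula).
    SUBFACTOR_MAX_RAW = {
        "Reports/Policy/Procedures": 81 * 5,   # 405 raw points available
        "Facility": 31 * 5,                    # 155
        "Overall Programs Approach": 7 * 5,    # 35
    }

    for subfactor, max_raw in SUBFACTOR_MAX_RAW.items():
        print(f"{subfactor}: {75 / max_raw:.3f} weighted points per raw point")

    # Output: roughly 0.185 for Reports/Policy/Procedures, 0.484 for Facility, and
    # 2.143 for Overall Programs Approach -- a raw point on the 7-question subfactor
    # moves the weighted score more than eleven times as much as a raw point on the
    # 81-question subfactor.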

While the result was unfortunate for Dismas, there was nothing improper about how many questions BOP chose to utilize for each Technical subfactor in its Evaluation Checklist. As noted supra at 9 and 17, generally, the details of technical rating decisions involve discretionary determinations of contracting officials that a court will not second guess. CW Government Travel, 53 Fed. Cl. at 590. Such is the case here. BOP had the discretion to determine how many questions it needed to properly evaluate a subfactor. It seems reasonable that BOP would need different numbers of questions to assess different factors and subfactors of the proposals. The record does not suggest otherwise. Additionally, because the RFP did not state that the subfactors would carry different weights, AR 207, it was acceptable and rational for BOP to translate the raw scores from the Evaluation Checklists into a weighted score that reflected equal weight for each Technical subfactor. The agency's decision to do so was not arbitrary or capricious. See id.; Banknote Corp. of America, Inc. v. United States, 365 F.3d 1345, 1355-56 (Fed. Cir. 2004) (holding that where the RFP did not indicate the relative weight of evaluation factors, the contracting officer acted reasonably in assigning equal weight).

4. BOP Properly Evaluated and Weighted Past Performance

The GAO issued a ruling in which it rejected Dismas's claim regarding the evaluation and weighting of Dismas's Past Performance. Dismas Charities, Inc., B-289575.2, B-289575.3 (Feb. 20, 2004). "When the GAO has dealt with similar situations in the past, their opinions are not binding on this tribunal, but such decisions when reasonable and persuasive provide useful guidance to the court." Ellsworth Associates, 45 Fed. Cl. at 394 n. 3. While we agree with GAO's ultimate decision on this issue, we have conducted what we believe to be a full, thorough, and independent analysis of both the record and the arguments of the parties. We have not simply adopted the conclusions of the GAO.

The Solicitation ranked Past Performance as the most important factor. AR 207; RFP § M.5. In particular, the Solicitation stated, "The assessment of the offeror's past performance will be used as a means of receiving a more favorable evaluation than another whose record is acceptable, even though both may have acceptable technical and management proposals." Id. Furthermore, the Past Performance factor was to be "highly influential in determining the relative merits of the offeror's proposals and in selecting the offeror whose proposal is considered most advantageous to the Government." Id.

The record shows that BOP complied with the terms of the solicitation and properly evaluated and weighted the Past Performance factor. In distributing the 1000-point scale among the various factors, BOP allocated 325 points to Past Performance and 225 points each to Technical, Management, and Cost. See, e.g., AR 4073. Despite Dismas's several arguments to the contrary, by allocating points in this manner, BOP complied with section M.5. Thus, once BOP tallied the points for the non-price factors and discovered that the proposals of Dismas and Bannum were substantially "technically equal," there was no need to further consider Past Performance prior to determining that price would be "a major factor in the selection for contract award." AR 206-07.

Beyond the point allocation, the record contains extensive evidence that BOP was fully aware of the past performance of all competitors in the competitive range. The record contains an October 10, 2003 memorandum from the Contracting Officer detailing, in approximately 25 pages, the strengths and weaknesses of the Past Performance of Dismas, AR 4045-4052, Bannum, AR 4035-4044, and CSC, AR 4052-4057. This memorandum lists the contracts for which Past Performance information was received and notes which contracts were used in computing the offerors' Past Performance scores. AR 4032-4057. The record indicates that Mr. Rowles carefully reviewed the October 10 memorandum and incorporated parts of that memorandum into the November 14, 2003 SSDD. AR 4075-4077.

Dismas contends that the three pages devoted to Past Performance in the SSDD were insufficient because the SSDD did not contain a "head-to-head" comparison of the offerors' various strengths and weaknesses and did not list any weaknesses for Dismas or Bannum. Tr. at 8; Pl.'s Mot. at 15. According to Dismas, this lack of comparison occurred despite the critique of the aborted March 2003 source selection decision. See discussion supra at 6; AR 1889-96. In that critique, the BOP contracting specialist stated:

The FAR requires that the SSDD be based on an in-depth comparative assessment against all source selection criteria in the solicitation. . . . Conclusive statements based solely on ratings or scores are not acceptable. The SSA needed to compare, contrast and articulate the benefits, strengths, weaknesses and other elements of discrimination and value between both the successful offeror and the unsuccessful offeror(s).

AR 1892-93 (emphasis in original). The record belies the assertion that BOP did not comply in its November 14 SSDD with the recommendations of BOP's Compliance and Review Contract Office. The March 2003 SSDD contained only one paragraph related to Past Performance, and that paragraph discussed only Bannum's Past Performance. AR 3979. Similarly, the analysis for Management and Technical addressed only Bannum's proposal. AR 3979. In contrast, the November 14, 2003 SSDD discussed all the offerors' proposals and stated why Bannum's was better under the Management and Technical factors. AR 4072-78.

Also in response to the March 2003 critique, Mr. Rowles significantly expanded the Past Performance section of the SSDD to almost three pages. AR 4075-77. He explained how many points each offeror received and the adjectival rating into which those points translated, and he discussed salient features of the offerors' Past Performance. Id. While the SSDD does not expressly state, for example, that "Dismas's food quality is more important to BOP than exercise equipment provided by Bannum," the Court does not consider such an omission to be fatal. Although the SSDD does not contain such a direct comparison, the record supports the conclusion that BOP carefully weighed the strengths and weaknesses of each offeror's Past Performance in a comparative manner. AR 4075-77, 4032-57. Additionally, the failure of the SSDD to list weaknesses of Bannum or Dismas was not improper because the CO's Past Performance memorandum stated, for both Bannum and Dismas, that "only minor improvements are needed. Therefore, no further discussion regarding past performance is necessary." AR 4044, 4052. It was therefore reasonable for Mr. Rowles to conclude that whatever weaknesses existed were not significant enough to warrant mention in the SSDD. Mr. Rowles made an independent judgment, based on Ms. Thompson's report, that Bannum's and Dismas's weaknesses need not be discussed because the performance problems that had existed on past contracts had been resolved. See generally 48 C.F.R. § 15.308.

The SSDD expressly states that "a review of the strengths and weaknesses, of the proposals reveals that the services Dismas Charities is offering do not warrant paying the premium or difference of $[* * *] in cost over the life of the contract. . . ." AR 4074. While the SSDD could, perhaps, have been more artfully crafted, it complies with the requirements of FAR § 15.308. Mr. Rowles stated the rationale for the business judgment and the tradeoff that he made. BOP reasonably concluded that Dismas's advantage in Past Performance was simply not worth the extra cost, given the fact that the proposals of Bannum and Dismas were substantially "technically equal" when all non-cost factors (Past Performance, Technical, and Management) were considered. AR 4074.

5. BOP Properly Evaluated the Reasonableness of Bannum's Prices

a. BOP's Failure to Use the Actual Prices Proposed by Bannum in Evaluating the Reasonableness of Bannum's Prices Was Harmless Error

To prevail in a protest, the protestor must show not only a significant error in the procurement process, but also that the error prejudiced it. Data General Corp. v. Johnson, 78 F.3d 1556 (Fed. Cir. 1996) (citing LaBarge Prods., Inc. v. West, 46 F.3d 1547, 1556 (Fed. Cir. 1995)). To establish prejudice, a protestor must show that there was a "substantial chance" that it would have received the contract award but for the alleged errors. Id. "Generally, the requirement of proving prejudice prevents an unsuccessful bidder from overturning a contract award due to a harmless violation of a statute or regulation on the part of the government." TRW Envtl. Safety Sys., Inc. v. United States, 18 Cl. Ct. 33, 67 (1989). Not every error compels the setting aside of an award. Allied Technology Group, Inc. v. United States, 39 Fed. Cl. 125, 133 (1997). The Court must "consider the significance of errors in the procurement process when deciding whether the overturning of an award is appropriate." Id. (citing Data General Corp., 78 F.3d at 1562). This court has held that de minimis errors do not require the overturning of an award. Id. De minimis errors are those that are so insignificant when considered against the solicitation as a whole that they can safely be ignored and the main purpose of the contemplated contract will not be affected if they are. Id.

Dismas cites 48 C.F.R. § 15.404-1(a)(1) for the proposition that the contracting officer was required to conduct a price reasonableness analysis of Bannum's proposal. That provision states that "the contracting officer is responsible for evaluating the reasonableness of the offered prices." 48 C.F.R. § 15.404-1(a)(1). FAR 15.305(a)(1) provides that "[n]ormally, competition establishes price reasonableness. Therefore, when contracting on a firm-fixed-price . . . comparison of the proposed prices will usually satisfy the requirement to perform a price analysis, and a cost analysis need not be performed." 48 C.F.R. § 15.305(a)(1).

Even assuming, however, that BOP was required to conduct a price reasonableness analysis in this firm-fixed-price procurement, BOP's use of $[* * *] and $[* * *] per inmate day (rather than Bannum's actual figures of $[* * *] and $[* * *] per inmate day) was harmless error. While BOP used incorrect numbers, that had no effect upon the validity of the price analysis. On April 29, 2003, BOP determined that the range of bids that would be accepted as reasonable was $[* * *] to $[* * *]. AR 4029. This range was based upon price data from three of Dismas's other facilities in Georgia. Id. FAR 15.404-1(b)(2)(vi) specifically provides that the "Government may use various price analysis techniques and procedures to ensure a reasonable price. Examples of such techniques include but are not limited to . . . comparison of proposed prices with prices obtained through market research for the same or similar items." 48 C.F.R. § 15.404-1(b)(2)(vi). Bannum's April 19, 2003 proposed prices of $[* * *] per inmate day for the two-year base period, and $[* * *] per inmate day for each of the three option years, AR 1788, fell within the "reasonable" range described above. Furthermore, when BOP conducted the head-to-head comparison between Bannum's and Dismas's prices, it accurately calculated that Dismas's proposal was approximately $[* * *] more than Bannum's over the life of the contract. See AR 4073 ($[* * *]), Pl.'s Mot. at 21 n. 13 ($[* * *]). Accordingly, BOP's error was harmless. The Court, therefore, will not overturn the award based on this mistake. See Allied Technology Group, 39 Fed. Cl. at 133.
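
Stated generally, the price-reasonableness screen described above reduces to a range check: a proposed per-inmate-day rate p is reasonable if it falls within the band BOP derived from comparable-facility price data. The symbols below are generic placeholders, used only because the dollar figures in the record are redacted.

$$
p_{\text{low}} \;\le\; p \;\le\; p_{\text{high}}
$$

Because Bannum's actual proposed per-diem rates satisfied this inequality, an analysis using the correct figures would have reached the same conclusion as the analysis BOP actually performed, which is why the substitution of incorrect figures was harmless.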

b. BOP Was Not Required to Account for the Time Value of Money When Comparing the Proposals of Dismas and Bannum

Dismas next complains that BOP did not account for the time value of money. Dismas, however, is unable to point to anything in the solicitation, the FAR, or the administrative record that places such a requirement on the Government. It was not arbitrary or capricious for the Government to assume that it would exercise all the options provided for in the contract and would therefore reap the benefits of Bannum's lower-priced option years. Despite the calculations discussed during oral argument, Tr. at 37, 69, it is unclear that, even if the Government had accounted for the time value of money, any change in the perceived savings would have been significant. The contracting agency may properly exercise wide discretion in its evaluation of bids and in its application of procurement regulations. See CACI Field Servs., 13 Cl. Ct. at 725. Even if we agreed with plaintiff that it would make sense for the Government to conduct such an analysis, it would not be appropriate for the Court to substitute its judgment for that of the agency. Seaborn Health Care, 55 Fed. Cl. at 523. The decision not to consider the time value of money was within the agency's discretion, and Dismas has not met its burden of showing that BOP acted in an arbitrary or capricious manner, see Ellsworth Associates, 45 Fed. Cl. at 392, or that it was prejudiced by BOP's action, Data General Corp., 78 F.3d at 1562.
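
For context, an analysis of the sort Dismas urged would ordinarily discount each year's expected payments to present value using a standard formula such as the one below; it is offered only to illustrate what such an exercise would involve, not to suggest that the solicitation or the FAR called for it.

$$
PV \;=\; \sum_{t=0}^{T} \frac{C_t}{(1+r)^{t}}
$$

Here C_t denotes the payments expected in contract year t and r a chosen discount rate. Discounting would reduce the present value of savings realized in later years, but, as noted above, nothing in the record suggests that any resulting change in the perceived savings would have been significant.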

c. BOP Properly Assessed the Risk of Bannum's Proposed Prices

Dismas lastly contends that BOP improperly neglected to assess the risk [* * *] [of] Bannum's [* * *] price proposal. Dismas claims that [* * *] Bannum will be unable to [* * *] comply with the salary and wage requirements of the Services Contract Act, 41 U.S.C. §§ 351-354 (2000), and its implementing regulations, 48 C.F.R. § 52.222-41. Pl.'s Reply at 16. The requirements of the Services Contract Act were incorporated by reference into the solicitation at Section I. AR 159. The Court does not agree with this argument. First, the fact that Bannum's prices were within the acceptable range indicates that the Government reasonably concluded that the risk to which Dismas refers would likely not materialize. Second, because the contract was firm-fixed-price, any such risk would fall on Bannum, not the Government. Third, the Government determined that Bannum was a responsible offeror. BOP conducted sufficient inquiry into the risks posed by the [* * *] prices proposed by Bannum [* * *]. The Court will not overturn BOP's assessment simply because Dismas disagrees with BOP's conclusion. See Seaborn Health Care, 55 Fed. Cl. at 523.

CONCLUSION

Having determined that BOP's decision to award the contract to Bannum was neither arbitrary nor capricious and was otherwise in accordance with law, the Court need not address the issue of injunctive relief. The Court ORDERS that plaintiff's motion for judgment on the administrative record is DENIED and that defendant's motion for judgment on the administrative record is GRANTED. The Court further ORDERS that the award of the contract to Bannum pursuant to Solicitation No. 200-0669-SE is UPHELD. Each party shall bear its own costs.

IT IS SO ORDERED.

