SEAPORT-E

The Naval Sea Systems Command (NAVSEA) procures over half a billion dollars of Professional Support Services (PSS) each year for its headquarters Directorates, Program Executive Offices (PEOs), and field activities. To meet the Navy strategic sourcing wedge, NAVSEA committed to $250M in savings by procuring PSS more efficiently. Coupled with this need, the Office of the Secretary of Defense (OSD) directed that 50% of all support services be procured using performance-based contracting by the year 2005. At the time, NAVSEA had more than 450 separate PSS contracts supporting its requirements.

Most of these efforts were not integrated from a Command perspective, used a multitude of different procurement processes, and did not leverage corporate buying habits or e-business to streamline those processes. In addition, the services were predominantly procured on a level-of-effort basis rather than under performance-based terms.

NAVSEA established the SeaPort Office to meet both the NAVSEA strategic sourcing wedge and the OSD performance-based contracting directive while bringing order to NAVSEA PSS acquisitions. The vision was to provide a faster, better, and cheaper means of procuring PSS. The strategy developed in October 2000 involved a product line solution containing three components:

(1) Develop and award Multiple Award IDIQ contracts (MACs) using innovative acquisition techniques to achieve the NAVSEA strategic wedge, to conform to the OSD performance-based contracting directive, and to bring order to PSS acquisitions.

(2) Exploit existing e-business opportunities and create an automated, intuitive, web-based, e-procurement portal to provide services quickly and easily in an "amazon.com" environment.

(3) Create a website that continually provides customers and suppliers with fresh information, opportunities, training, metrics, and useful links to associated sites.

An important tactic used to implement this vision and obtain "buy-in" was to involve senior leadership and working-level representatives in the design and implementation of the MACs and the portal through multiple Integrated Product Teams (IPTs). These IPTs continuously and aggressively communicated the vision, strategy, and status to leadership at all working levels.

On April 2, 2001, in an unprecedented period of less than 6 months, SeaPort became a reality when all three of these initiatives converged. The MACs were awarded to twenty-one exceptionally well-qualified industry partners, the e-business portal became operational, and its front door website, www.seaport.navy.mil, was launched. These components combined to provide a faster, better and cheaper process to acquire PSS within the Command.

SEAPORT-E: TASK ORDERS

TO / DO / TI          | ZONE | DESCRIPTION                                                                                     | AWARD DATE | ISSUED BY
N00178-04-D-4036-HR01 | 4    | Technical and Engineering Services for Critical Protection Systems                              | 01/31/07   | NSWC, PANAMA CITY
N00178-04-D-4036-0009 | 2    | Test and Evaluation and Associated Support Services for Integrated Combat Systems Test Facility | 06/20/10   | NSWC, DAHLGREN DIVISION
N00178-04-D-4036-0008 | 3    | Warfare Systems Development Support                                                             | 03/29/10   | NSWC, DAHLGREN DIVISION
N00178-04-D-4036-0007 | 3    | SSDS MK2 Program Support                                                                        | 02/01/07   | NSWC, DAHLGREN DIVISION
N00178-04-D-4036-0006 | 2    | Engineering and Technical Support Services                                                      | 05/26/05   | NSWC, INDIAN HEAD DIVISION
N00178-04-D-4036-0005 | 3    | Advanced Sensor Distribution Systems Support                                                    | 04/01/05   | NSWC, DAHLGREN DIVISION
N00178-04-D-4036-0004 | 2    | Total Ship Training Systems and Test and Evaluation Support                                     | 03/11/05   | NSWC, PORT HUENEME DIVISION
N00178-04-D-4036-0003 | 3    | SSDS MK2 Program Support                                                                        | 08/01/04   | NSWC, DAHLGREN DIVISION

Quality Assurance

The DRS Team ensures that each tasked deliverable, effort, and development process has clear metrics indicating acceptable results. We are dedicated to service quality in product and deed. Our commitment to quality is evident in the fact that we hold ISO 9001:2000 certification.

Our approach to achieving quality is the same for products and effort-oriented tasks.

  • Plan the effort in clearly measurable steps
  • Monitor execution progress to plan
  • Intercede immediately and aggressively if progress deviates from expectations

A quality effort begins with careful planning, generally represented in the task Work Breakdown Structure (WBS). Progress to plan, as reflected in the WBS schedule, becomes the measure of effectiveness for the execution of each order and deliverable.

The task WBS must contain objectively measurable milestones at intervals not exceeding two weeks. Subjective estimates of completion are avoided because they are susceptible to undue optimism and often ignore contingent factors that affect progress. Requiring that intermediate milestones be auditable means that, for example, computer code or a document section must be demonstrably complete. While completion of a document section is verifiable by reading, completion of computer code may have to be exercised; appropriate tools must therefore be in place to provide the required level of validation for the product being produced.

Measurable milestones must come at short intervals so that trends are detected early and consistently. In addition, WBS element completions must be scheduled so that no task dependent on another is scheduled for completion before the task it depends on. This ensures that interdependent products and efforts can be assessed as complete and that no surprises await "that one last detail." For example, an appendix is completed before the document section that references it, so the document is not assessed as complete while the appendix remains unfinished. Software development imposes a similar requirement: code that a unit depends on must be available when that unit is exercised or audited for objective validation.
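The dependency-ordering rule above is, in effect, a topological ordering of the WBS. A minimal sketch follows, using Python's standard-library `graphlib`; the milestone names and dependency graph are hypothetical illustrations, not drawn from an actual task order.

```python
from graphlib import TopologicalSorter

# Hypothetical WBS fragment: each milestone maps to the set of
# milestones it depends on.  Names are illustrative only.
wbs_dependencies = {
    "appendix_a": set(),
    "module_a": set(),
    "document_body": {"appendix_a"},   # body references appendix A
    "final_review": {"document_body"},
    "unit_test_b": {"module_a"},       # unit B exercises module A's code
}

# A valid completion schedule is any topological order of this graph:
# every milestone appears only after everything it depends on.
schedule = list(TopologicalSorter(wbs_dependencies).static_order())
print(schedule)

# Check the scheduling rule: no milestone precedes one of its dependencies.
for task, deps in wbs_dependencies.items():
    assert all(schedule.index(d) < schedule.index(task) for d in deps)
```

A schedule built this way guarantees that when a dependent milestone is audited, everything it relies on already exists and can be exercised.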

Monitor

DRS monitors quality as a combination of schedule completion and technically acceptable milestone demonstration. Because the project WBS includes explicitly verifiable milestones at relatively frequent intervals, progress "markers" are frequent and reveal trends against plan early in the development process. Assuming that the more difficult parts of an effort are either scheduled for lengthier completion or broken down into interim stages, there should be no pronounced deviation from plan once a small portion of the work is complete. Where a single task might be misestimated, a large number of smaller tasks averages out initial estimation errors and establishes a valid progress trend soon after development is underway. In addition, a larger number of smaller milestones ensures that expenditures remain proportional to effort.
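The progress-trend idea above can be sketched as a simple calculation: of the milestones that were planned to be complete by a given date, what fraction were actually demonstrated on time? The milestone names and dates below are hypothetical illustrations, not drawn from an actual task order.

```python
from datetime import date

# Hypothetical milestone record: (name, planned completion, actual
# completion or None if not yet demonstrated).
milestones = [
    ("requirements review", date(2024, 1, 12), date(2024, 1, 12)),
    ("design section 1",    date(2024, 1, 26), date(2024, 1, 29)),  # late
    ("design section 2",    date(2024, 2, 9),  date(2024, 2, 9)),
    ("prototype unit",      date(2024, 2, 23), None),               # not yet due
]

def progress_to_plan(milestones, as_of):
    """Fraction of milestones planned by `as_of` that were demonstrated
    on time.  Many small milestones average out estimation error in any
    single one, so this ratio becomes a meaningful trend early on."""
    due = [m for m in milestones if m[1] <= as_of]
    if not due:
        return 1.0
    on_time = [m for m in due if m[2] is not None and m[2] <= m[1]]
    return len(on_time) / len(due)

print(progress_to_plan(milestones, date(2024, 2, 15)))  # 2 of 3 due were on time
```

Computed frequently, this ratio flags a developing schedule slip while most of the effort still lies ahead, which is exactly when intervention is cheapest.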

Our planning is unique in emphasizing measurable milestones, short achievement intervals, and an ordering that completes independent milestones before dependent ones. This process supports effective project monitoring: audits of work in progress by either the DRS Team project manager or a government official can be performed not only with confidence but also with little notice, little effort, and minimal impact on continued development progress. The DRS Team will further formalize this process by putting the WBS and progress reporting on our website so that progress can be monitored at any time. The potential for unannounced audits based on an open view of progress to plan enhances attention to schedule and further encourages on-time development.

Maximize

DRS strives to maximize quality in its products and efforts by making valid progress to plan so that efforts are never rushed. In addition, our bi-weekly management coordination teleconferences ensure that products and efforts remain continually directed at the needs of the requiring user, the Directorate, and the sponsor.

Our approach to monitoring progress ensures that any deficiency in production rate or product performance is detected at the earliest possible stage. The requirement that verifiable task elements be scheduled in order of dependence ensures that no “long pole gotchas” invalidate apparently smooth progress at the end of a project.

When problems are observed, DRS responds with aggressive and redundant corrective measures as further described below. DRS has not had a delivery issue in the last three years other than those caused by availability of Government-furnished information or material that was outside the control of the DRS Team and its immediate customer.