Chapter 5
The 1995 Process and Procedures



Composition of the 1995 Defense Base Closure and Realignment Commission

The commissioners chosen to serve on the 1995 round of the Defense Base Closure and Realignment Commission have diverse backgrounds in public service, business, and the military. In accordance with the enacting statute, two commissioners were nominated in consultation with the Speaker of the U.S. House of Representatives, two in consultation with the U.S. Senate Majority Leader, and one commissioner with the advice of each of the House and Senate Minority Leaders. The two remaining nominations were made independently by the President.

The Commission staff was drawn from divergent backgrounds encompassing government, law, academia, and the military. In addition to those hired directly by the Commission, other staff were detailed from the Department of Defense, the General Accounting Office, the Department of Commerce, the Environmental Protection Agency, the Federal Aviation Administration, and the Federal Emergency Management Agency. The expertise provided by the detailees from these diverse government agencies contributed significantly to the Commission’s independent review and analysis effort.

The Commission’s review and analysis staff was divided into five teams — Army, Navy, Air Force, Interagency Issues, and Cross Service. A direct-hire civilian managed each of the teams, in accordance with the amended law, which also limits the number of Department of Defense detailees to 20 percent of the analysts.



THE 1995 BASE CLOSURE PROCESS


Key Provisions of the Law

Public Law 101-510, as amended, requires the Secretary of Defense to submit a list of proposed military base closures and realignments to the Commission by March 1, 1995 (see Appendix F). In accordance with the statute, these recommendations must be based upon a force-structure plan submitted to Congress with the Department of Defense budget request for Fiscal Year 1996, and upon final criteria developed by the Secretary of Defense and approved by Congress. For the 1995 Commission process, the Secretary of Defense announced in December, 1994, that the final criteria would be identical to those used during the 1991 and 1993 base closure rounds.

The Secretary of Defense based the force-structure plan on an assessment of the probable threats to national security during the six-year period beginning in 1995, as well as the anticipated levels of funding that would be available for national defense (see Appendix G).

The final criteria cover a broad range of military, fiscal, and environmental considerations. The first four criteria, which relate to military value, were given priority consideration. The remaining four criteria, which address return on investment, economic impact, community infrastructure, and environmental impact, are important factors that may weigh against the military value criteria (see Appendix H).

The law requires the Commission to hold public hearings on base closure and realignment recommendations of the Secretary of Defense and on any changes proposed by the Commission to those recommendations. The Commission must report its findings to the President by July 1, 1995, based on its review and analysis of the Secretary of Defense’s recommendations. To change any of the Secretary’s recommendations, the Commission must find the Secretary deviated substantially from the force-structure plan or final selection criteria.

Once the President receives the Commission’s final report, he has until July 15, 1995, to approve or disapprove the recommendations. If approved, the report is sent to the Congress, which then has 45 days to reject the report by a joint resolution of disapproval; otherwise, the report has the force of law. If the President disapproves the Commission’s recommendations in whole or in part, he must transmit to the Commission and the Congress his reasons for disapproval. The Commission then has until August 15, 1995, to submit a revised list of recommendations to the President. At that point, the President either forwards the revised list to Congress by September 1, 1995, or the 1995 base closure process is terminated with no action taken to close or realign bases. The law prohibits the President or Congress from making any amendments to the recommendations, thereby requiring an "all-or-nothing" acceptance or rejection of the recommendations.

The 1995 Commission thoroughly analyzed all of the information used by the Secretary of Defense to prepare the recommendations. The Commission held a total of 13 investigative hearings in Washington, D.C. Military department representatives directly responsible for the Secretary’s recommendations testified before the Commission. In addition, several defense and base closure experts from the Federal government and private sector testified about the specifics of the base closure process, the potential impacts of the Secretary of Defense’s recommendations, and ways the Federal government could better assist communities with re-use activities. The Commissioners and staff members conducted 206 fact-finding visits to military activities recommended by the Secretary of Defense and considered by the Commission for closure or realignment. Furthermore, the Commission held 16 regional hearings to hear directly from communities nationwide, heard from hundreds of Members of Congress who testified before the Commission, and received over 200,000 letters from concerned citizens across the country. Finally, the Commission received input from the General Accounting Office, as required by the base closure statute, which included a report containing its evaluation of DoD’s selection process (see Appendix O and Appendix P).

Based on the information gathered and the analyses performed, alternatives and further additions to the Secretary’s list were considered. To perform a thorough analysis and consider all reasonable options, the Commissioners voted on March 7, 1995, and on May 10, 1995, to add a total of 36 installations for further consideration as alternatives and additions to the 146 bases recommended for closure or realignment by the Secretary of Defense. As required by law, the Commission published the required notice on May 17, 1995, in the Federal Register to inform communities their bases were under consideration by the Commission for possible closure or realignment. Public hearings were held for each of the installations the Commission added for consideration and each major base was visited by at least one commissioner (see Appendix J).


The Office of the Secretary of Defense (OSD) Guidance to the Military Departments and Defense Agencies



The Deputy Secretary of Defense established the policy, procedures, authorities, and responsibilities for base realignment and closure (BRAC) actions by memorandum dated January 7, 1994. This policy guidance assigned the Secretaries of the military departments and the Directors of the defense agencies responsibility for providing the Secretary of Defense with recommendations for closures and realignments. This policy also required the Secretaries of the military departments and Directors of the defense agencies to develop recommendations based exclusively upon the force-structure plan and final selection criteria, consider all U.S. military installations (as defined in the law) equally, analyze their base structure using like categories of bases, use objective measures for the selection criteria wherever possible, and allow for the exercise of military judgment in selecting bases for closure and realignment.

The Deputy Secretary also established the BRAC 95 Review Group and the BRAC 95 Steering Group to oversee the entire BRAC process. The BRAC 95 Review Group was composed of senior level representatives from each of the military departments, Chairpersons of the BRAC 95 Steering Group and each Joint Cross-Service Group, and other senior officials from the Office of the Secretary of Defense, Joint Staff, and Defense Logistics Agency. It provided oversight and policy for the entire BRAC process. The BRAC 95 Steering Group assisted the Review Group in exercising its authorities.

The Assistant Secretary of Defense for Economic Security was given the responsibility to oversee the 1995 process, and was delegated authority to issue additional instructions.

The Chairman of the Joint Chiefs issued the interim force-structure plan, as directed by the Deputy Secretary’s January 7, 1994 memorandum, on February 7, 1994. The Department issued the final selection criteria in the Federal Register on December 9, 1994. The Deputy Secretary provided the final force-structure plan on January 11, 1995. This Plan was updated on February 22, 1995, by the Deputy Secretary to reflect budget decisions, and was provided to Congress and the Commission on the same day.


Joint Cross-Service Functions

The 1993 Defense Base Closure and Realignment Commission recommended that the Department of Defense develop procedures for considering potential joint or common activities among the military departments. For BRAC 95, the Deputy Secretary directed the creation of Joint Cross-Service Groups (JCSGs) to consider these issues in conjunction with the military departments.

In the January 7, 1994, BRAC policy guidance, and further articulated in BRAC Policy Memorandum Number Two (issued on November 2, 1994), the Deputy Secretary announced a process involving both JCSGs and the individual military departments. This process was designed to establish alternatives for closure and realignment in situations involving common support functions for five functional areas. The five functional areas were: Depot Maintenance, Military Medical Treatment Facilities, Test and Evaluation, Undergraduate Pilot Training, and Laboratories. Additionally, the Department created an Economic Impact Group.

The Economic Impact Group included representatives from the military departments and the Office of the Secretary of Defense. For a year, the Group reviewed methods for analyzing economic impact, established common measures and approaches, and developed a computer-based system to facilitate the analysis of economic impact, including cumulative economic impact.

The Department considered both cumulative economic impact and historical trends of economic activity as part of the economic impact criterion. In response to concerns raised by the 1993 Defense Base Closure and Realignment Commission and the General Accounting Office, DoD analyzed economic impact and cumulative economic impact as relative measures for comparing alternatives. DoD did not establish threshold values, above which it would remove bases from consideration.

Economic impact was considered at two stages in the process. The military departments, in developing their recommendations, developed and analyzed data reflecting the economic impacts of prior BRAC rounds, as well as proposed Department actions during the current round. Once the service recommendations were made to the Secretary of Defense, the economic impacts were reviewed again, to determine whether there were instances in which separate service actions might have affected the same locality.

Each of the Joint Cross-Service Groups developed excess capacity reduction goals, established data collection procedures and milestone schedules for cross-service analysis of common support functions, and presented alternatives to the military departments for their consideration in developing recommendations. The JCSGs issued their alternatives to the military departments in November, 1994, and the departments were to consider these alternatives as part of their ongoing BRAC analysis.


The Army Process

The Army grouped all installations into categories with similar missions, capabilities, and characteristics. After developing a set of measurable attributes related to DoD’s four selection criteria for military value, the Army assigned weights to reflect the relative importance of each measure. The Army then collected data on its installations and, using established quantitative techniques, combined the measured attributes and weights to assemble installation assessments.
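To illustrate the kind of weighted-sum calculation such an assessment implies, a minimal sketch follows; the attribute names, weights, and scores are hypothetical, and the sketch is not the Army's actual model.

# Illustrative sketch only: a weighted-sum installation assessment of the
# kind described above; attribute names, weights, and scores are hypothetical.

def installation_assessment(scores, weights):
    """Combine attribute scores (0-100) into a single assessment,
    with the weights normalized to sum to 1."""
    total_weight = sum(weights.values())
    return sum(scores[attr] * weights[attr] / total_weight for attr in weights)

# Three hypothetical attributes tied to the military value criteria.
weights = {"mission_requirements": 0.5,
           "land_and_facilities": 0.3,
           "mobilization_capacity": 0.2}
installation = {"mission_requirements": 85,
                "land_and_facilities": 70,
                "mobilization_capacity": 60}
print(round(installation_assessment(installation, weights), 1))  # 75.5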

Using both the installation assessments and its stationing strategy, the Army determined the military value of each installation. These appraisals represented the Army’s best judgment on the relative merit of each installation and were the basis for selecting installations that were studied further for closure or realignment.

Once the list of final study candidates received approval by the Secretary of the Army, a variety of alternatives were examined in an effort to identify the most feasible and cost-effective way to close or realign. The Army applied DoD’s remaining four selection criteria by analyzing the financial, economic, community, and environmental impacts of each alternative using DoD’s standard models. The Army’s senior leaders reviewed the results of these analyses and discontinued studies of alternatives they found financially or operationally unfeasible.

During the course of the study effort, the Army Audit Agency performed independent tests and evaluations to check mathematical computations and ensure the accuracy of data and reasonableness of assumptions throughout every step of analysis. The General Accounting Office monitored the Army’s process from the very beginning and met regularly with the Army’s auditors, as well as officials from The Army Basing Study (TABS) office.


The Navy Process

The Secretary of the Navy established a Base Structure Evaluation Committee (BSEC), and a Base Structure Analysis Team (BSAT) to provide staff support to the BSEC. The BSEC had eight members, consisting of senior Department of the Navy (DoN) career civilians and Navy flag and Marine Corps general officers, who were responsible for developing recommendations for closure and realignment.

The BSAT was composed of military and civilian analysts who were tasked to collect data and to perform analysis for the BSEC. The Naval Audit Service reviewed the activities of the BSEC and the BSAT to ensure compliance with the approved Internal Control Plan and audited the accuracy and reliability of data provided by DoN activities. The Office of the General Counsel provided senior-level legal advice and counsel.

In compliance with the Internal Control Plan, a Base Structure Data Base (BSDB) was developed. Data included in the BSDB had to be certified as accurate and complete by the officer or civilian employee who initially generated data in response to the BSEC request for information, and then at each succeeding level of the chain of command. In conjunction with the requirement to keep records of all meetings that were part of the decision-making process, the BSDB and the certification policy were designed to ensure the accuracy, completeness, and integrity of the information upon which the DoN recommendations were based.

The BSEC developed five major categories for organizing its military installations for analysis and evaluation: Operational Support, Industrial Support, Technical Centers/Laboratories, Educational/Training, and Personnel Support/Other. These categories were then further divided into 27 subcategories to ensure that like installations were compared to one another and to allow identification of total capacity and military value for an entire category of installations. Within these 27 subcategories were 830 individual Navy or Marine Corps installations or activities, each of which was reviewed during the BRAC 95 process.

Data calls were issued to these installations, tailored to the subcategory in which the activity was grouped, to obtain the relevant certified information relating to capacity and military value. Conglomerate activities having more than one significant mission received multiple capacity data calls and military value analyses relating to those missions. The certified responses to these data calls were entered into the BSDB and formed the sole basis for BSEC determinations.

Capacity analysis compared the present base structure to the future force-structure requirement for each subcategory of installations to determine whether excess base structure capacity existed. If total capacity was greater than the future required capacity, excess capacity was determined to exist, and the military value of each installation in the subcategory was evaluated. If there was no meaningful excess capacity, no further closure or realignment analysis was conducted. Of the 27 subcategories, eight demonstrated little or no excess capacity.
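A minimal sketch of this excess-capacity test follows, assuming capacity can be expressed in a single common unit per subcategory; the installation figures and the requirement are hypothetical.

# Illustrative sketch only: determine whether a subcategory holds excess
# capacity relative to the future force-structure requirement.

def excess_capacity(current_capacities, future_requirement):
    """Return the excess capacity (zero if none) for a subcategory."""
    total = sum(current_capacities)
    return max(0.0, total - future_requirement)

# Four installations in a hypothetical subcategory, in common capacity units.
capacities = [120.0, 95.0, 80.0, 60.0]
requirement = 270.0   # capacity needed to support the future force structure
print(excess_capacity(capacities, requirement))  # 85.0 -> proceed to military value analysis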

The remaining 19 subcategories underwent military value analysis to assess the relative military value of installations within each subcategory, using a quantitative methodology that was as objective as possible. Information from the military value data call responses was displayed in a matrix and scored by the BSEC according to its relative importance for the particular subcategory. A military value score was therefore a relative measure only within the subcategory in which an installation was analyzed, used to compare the installations in that subcategory against one another.

The results of the capacity analyses and military value analyses were then subjected to configuration analysis. Multiple solutions were generated that would satisfy capacity requirements for the future force structure while maintaining the average military value of the retained installations at a level equal to or greater than the average military value for all of the installations in the subcategory.
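The configuration-analysis step can be sketched as a search over which installations to retain, as below; the brute-force enumeration and the capacity and military value figures are illustrative only and do not reproduce the Navy's actual optimization model.

# Illustrative sketch only: enumerate retained sets of installations whose
# total capacity meets the future requirement and whose average military
# value is at least the subcategory-wide average.
from itertools import combinations

def feasible_configurations(bases, requirement):
    """bases: list of (name, capacity, military_value) tuples."""
    overall_avg = sum(mv for _, _, mv in bases) / len(bases)
    solutions = []
    for k in range(1, len(bases) + 1):
        for keep in combinations(bases, k):
            capacity = sum(cap for _, cap, _ in keep)
            avg_mv = sum(mv for _, _, mv in keep) / len(keep)
            if capacity >= requirement and avg_mv >= overall_avg:
                solutions.append([name for name, _, _ in keep])
    return solutions

# Hypothetical subcategory of four installations.
bases = [("A", 120, 80), ("B", 95, 65), ("C", 80, 70), ("D", 60, 40)]
print(feasible_configurations(bases, requirement=270))
# [['A', 'B', 'C'], ['A', 'B', 'C', 'D']] -> retaining A, B, C (closing D) is a candidate solution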

The configuration analysis solutions were then used by the BSEC as the starting point for the application of military judgment in the development of potential closure and realignment scenarios to undergo return on investment analysis. Additionally, the Joint Cross-Service Groups generated numerous alternatives derived from their analysis of data and information provided by the military departments. As a result of the scenario development portion of the process, the BSEC developed 174 scenarios involving 119 activities.

Cost of Base Realignment Actions (COBRA) analyses were conducted on all of these scenarios. The BSEC used the COBRA algorithms as a tool to ensure that its recommendations were cost effective.

The impact on the local economic area was calculated using the DoD BRAC 95 Economic Impact Data Base. The BSEC also evaluated the ability of the existing local community infrastructure at potential receiving installations to support additional missions and personnel. The impact of increases in base personnel on such infrastructure items as off-base housing availability, public and private schools, public transportation, fire and police protection, health care facilities, and public utilities was assessed.

Once the BSEC had determined the candidates for closure or realignment, an environmental summary was prepared which compared the environmental management efforts at losing and gaining sites. Differences in environmental management effort were presented as they relate to such programs as threatened or endangered species, wetlands, cultural resources, land use, air quality, environmental facilities, and installation restoration sites. The environmental impact analysis permitted the BSEC to obtain a comprehensive picture of the potential environmental impacts arising from the recommendations for closure and realignment.


The Air Force Process

The Secretary of the Air Force appointed a Base Closure Executive Group of six general officers and seven comparable (Senior Executive Service) civilians. Additionally, an Air Staff-level Base Closure Working Group was formed to provide staff support and additional detailed expertise for the Executive Group. Plans and Programs General Officers from the Major Commands (MAJCOM) met on several occasions with the Executive Group to provide mission specific expertise and greater base-level information. Additionally, other potential service impacts were coordinated by a special inter-service working group.

The Executive Group developed a Base Closure Internal Control Plan that was approved by the Secretary of the Air Force. This plan provided structure and guidance for all participants in the base closure process, including procedures for data gathering and certification.

The Executive Group reviewed all active and Air Reserve Component (ARC) installations in the United States that met or exceeded the Section 2687, Title 10 U.S.C. threshold of 300 direct-hire civilians authorized to be employed. Data on all applicable bases was collected via a comprehensive and detailed questionnaire answered at base level with validation by the MAJCOM and Air Staff. All data was evaluated and certified in accordance with the Air Force Internal Control Plan. As an additional control measure, the Air Force Audit Agency was tasked to continuously review the Air Force process for consistency with the law and DoD policy and to ensure the data collection and validation process was adequate. A baseline capacity analysis evaluated the physical capability of a base to accommodate additional force structure and other activities (excess capacity) beyond what was programmed to be stationed at the base.

All data used in the preparation and submission of information and recommendations concerning the closure or realignment of military installations was certified as to its accuracy and completeness by appropriate officials at base level, MAJCOM, and Air Staff level. In addition, the Executive Group and the Secretary of the Air Force certified that all information contained in the Air Force detailed analysis and all supporting data were accurate and complete to the best of their knowledge and belief.

The Executive Group placed all bases in categories, based on the installation’s predominant mission. When considered by category, the results of the baseline capacity analysis represented the maximum potential base closures that could be achieved within each category. The results of the baseline capacity analysis were then used in conjunction with the approved DoD force-structure plan in determining base structure requirements. Other factors were also considered to determine actual capabilities for base reductions. The capacity analysis was also used to identify cost effective opportunities for the beddown of activities and aircraft dislocated from bases recommended for closure and realignment.

Bases deemed militarily or geographically unique or mission-essential were approved by the Secretary of the Air Force for exclusion from further closure consideration. Capacity was analyzed by category, based on a study of current base capacity and the future requirements imposed by the force-structure plan. Categories and subcategories having no excess capacity were recommended to and approved by the Secretary of the Air Force for exclusion from further study.

All non-excluded active component bases in the remaining categories were individually examined on the basis of all eight selection criteria established by the Secretary of Defense, with over 250 sub-elements to the grading criteria. These sub-elements were developed by the Air Force to provide specific data points for each criterion.

Under Deputy Secretary of Defense direction, the Executive Group and the Secretary of the Air Force considered and analyzed the results of the efforts of Joint Cross-Service Groups in the areas of Depot Maintenance, Laboratories, Test and Evaluation, Undergraduate Pilot Training, and Military Treatment Facilities including Graduate Medical Education. The Joint Cross-Service Groups established data elements, measures of merit, and methods of analysis for their functional areas. The Air Force collected data as requested by the joint groups, following the Air Force’s Internal Control Plan. After receiving data provided by each of the services, the joint groups developed functional values and alternatives for the activities under their consideration. These alternatives were reported to the military departments for consideration in their processes.

The ARC category, comprised of Air National Guard and Air Force Reserve bases, warrants further explanation. First, these bases do not readily compete against each other, as ARC units enjoy a special relationship with their respective states and local communities. Under federal law, relocating Guard units across state boundaries is not a practical alternative. In addition, careful consideration must be given to the recruiting needs of these units. Realignment of ARC units onto active or civilian, or other ARC installations, however, could prove cost effective. Therefore, the ARC category was examined for cost effective relocations to other bases.


The Defense Logistics Agency Process

The Defense Logistics Agency (DLA) is not directly identified in the DoD force-structure plan. DLA therefore developed Concepts of Operations to translate the effects of the force-structure plan into the Agency’s mission planning.

The DLA Director established a Base Realignment and Closure Executive Group comprised of appropriate senior executives from the Agency’s business and staff areas. The Group included both senior level civilian and military personnel, and was chaired by the Principal Deputy Director.

The Executive Group served as senior advisors to direct the 1995 study effort and present activity realignment and closure candidates for the Director’s final recommendation to the Secretary of Defense. A BRAC Working Group was also established under the direction of the Executive Group. The Working Group developed analytical tools, collected and analyzed certified data, developed and evaluated alternative scenarios for Executive Group consideration, conducted sensitivity analyses, and compiled documentation to support the final recommendations.

The DLA BRAC analysis process ensured that all of the Agency’s activities were fully evaluated. Formal charters were developed for the Executive Group and the Working Group, and audit and internal control plans were developed to document the collection and use of accurate certified data.

The Executive Group aggregated activities into categories and subcategories based on similarity of mission, capabilities, and attributes. From these, the following categories were defined: Distribution Depots, Inventory Control Points, Service/Support, and Command and Control Activities. Subcategories were defined within the categories to ensure that the activities were evaluated in a fair and consistent manner. Where possible, activities were compared to peers of similar function and size. Activities identified for closure as a result of previous BRAC decisions were not evaluated.

Comprehensive data calls were designed to support analysis of excess capacity; military value; and economic, environmental, and community impacts with certified data. The data call questionnaires were carefully designed to ensure uniform interpretation of questions, level of detail, and documentation requirements. Sources for the data were specified to the greatest extent practical.

DLA conducted an excess capacity analysis for each of the BRAC activity categories and subcategories. Sites with significant excess capacity could be considered as possible receiver sites in potential realignment recommendations.

The purpose of the military value analysis was to determine the relative ranking of each activity with respect to other activities in the same category or subcategory. OSD provided the military departments and the defense agencies with a list of selection criteria to be used as part of the military value analysis. The Executive Group developed more distinctive measures to assess the military value of DLA activities. The Measures of Merit used to develop military value were Mission Scope, Mission Suitability, Operational Efficiencies, and Expandability.

The next step was to identify potential realignment or closure candidates and eliminate the remaining activities from further consideration. Military value, in conjunction with military judgment, was the primary consideration in determining prospective realignment or closure candidates. Once an alternative was conceived, it was evaluated for reasonableness and then either refined or abandoned. DLA worked closely with each military department during this process to identify and consider potential excess space for joint use, to evaluate the impact of military department recommendations on its activities, and to ensure that the impacts of military department recommendations were appropriately factored into the Agency’s recommendations.

The DLA BRAC Working Group evaluated potential realignment and closure scenarios using the COBRA model. The analysis results were reviewed by the BRAC Working Group and presented to the Executive Group for further consideration.

Each scenario was considered in terms of its overall risk, benefit, and cost to the strategic direction of DLA and the interests of DoD. Based on its review and best military judgment, the Executive Group made individual recommendations to the Director. After the approval of the Director, the recommendations were then returned to the Working Group for economic, community infrastructure, and environmental impact assessments. The Working Group reported its findings to the Executive Group for further consideration as appropriate.

An Internal Control Plan for the collection and analysis of data was developed for the BRAC 95 process. The plan, issued May 23, 1994, was reviewed and approved by the DoD Inspector General (DoDIG) and the General Accounting Office (GAO).

The DoDIG personnel were responsible for data validation, fully participated in the Executive and Working Group meetings, and observed the Working Group analysis process.

GAO representatives also participated in the DLA BRAC 95 process and attended Executive Group meetings, observed the Working Group analysis process, and visited selected field activities to observe the data collection and data validation processes.

Upon completion of the impact assessments, recommendations were returned to the Executive Group. The Working Group presented the results of the impact analyses and supported additional Executive Group deliberations. The Executive Group discussed the impact assessments, conducted an extensive review of each recommendation, and approved selected recommendations.


Defense Investigative Service Process

The Defense Investigative Service (DIS) Director established a Base Realignment and Closure Executive Group comprised of appropriate principals from headquarters, chaired by the Deputy Director, Resources. The Executive Group acted as senior advisors to direct the analysis effort and present the Director’s final recommendations to the Secretary of Defense. A BRAC Working Group was established under the direction of the Executive Group. The Working Group was comprised of four headquarters elements and two investigations control and automation elements. An Internal Control Plan was developed to ensure that data was consistent and standardized, accurate and complete, certifiable, verifiable, auditable by external audit and inspection agencies, and replicable using documentation developed during data collection.

The selection process consisted of five steps to gather data and conduct analyses: 1) collect data; 2) analyze military value; 3) develop alternatives; 4) perform COBRA analyses; and 5) determine impacts.

Military value criteria were given priority consideration. Since the DoD selection criteria were designed specifically with the military services in mind, the Executive Group developed more distinctive measures to assess the military value of DIS activities. The Measures of Merit used to develop military value were Mission Essentiality, Mission Suitability, Operational Efficiencies, and Expandability.

The DIS used the COBRA model to assess the relative costs, savings, and return on investment of the alternatives. Working Group members gathered the necessary data regarding personnel, construction, and renovation.

The potential economic impact on communities was evaluated through the use of the BRAC 95 Economic Impact Data Base. The ability of the infrastructure at potential losing and receiving locations to support each alternative was evaluated by the Executive and Working Groups. Impacts were also evaluated in terms of readiness, effectiveness, and efficiency with regard to the ability of DIS to support its customers. The analysis also considered potential environmental impacts at both the losing and gaining sites for each alternative.

The COBRA results, community and environmental impacts, and supporting rationale were presented to the Executive Group for consideration and selection of the Agency’s final recommendation to the Secretary of Defense.


Office of the Secretary of Defense/Joint Chiefs of Staff Review

Using certified data, the Secretaries of the military departments and Directors of the defense agencies developed their recommendations based on the approved final selection criteria and force structure plan, and submitted their base closure and realignment recommendations to the Secretary of Defense for review and approval. As part of the Secretary’s review, the Assistant Secretary of Defense for Economic Security provided for Joint Staff and OSD review of the recommendations received from the military departments and defense agencies.

The Joint Staff reviewed the recommendations from a warfighting perspective to ensure they would not adversely affect the military readiness capabilities of the armed services. The Chairman of the Joint Chiefs of Staff endorsed all the military department and defense agency recommendations without objection.

Key staff elements of the Office of the Secretary of Defense and the Joint Staff also reviewed the recommendations to ensure they would not sacrifice necessary capabilities and resources. The Assistant Secretary of Defense for Economic Security reviewed the recommendations to ensure all eight selection criteria were considered and the recommendations were consistent with the force structure plan. This review also assured that DoD policies and procedures were followed and that the analyses were objective and rigorous.

The Secretary approved the recommendations of the military departments and defense agencies and officially transmitted his list of closures and realignments to the 1995 Defense Base Closure and Realignment Commission on February 28, 1995.


COMMISSION REVIEW

The Commission established five teams within its Department of Review and Analysis — one team to review each respective Service’s application of the military value criteria to the base closure process, an Interagency Issues Team which reviewed the Defense Agencies’ application of the military value criteria to the base closure process, and a Cross Service Team to review the application of military value to depots, test and evaluation, and laboratories. Each team analyzed the services’ methodology to ensure general compliance with the law, to confirm the accuracy of data, and to determine whether base-specific recommendations were properly offered by the Secretary of Defense.

In addition, the Interagency Issues Team analyzed the final four criteria — Return on Investment, Economic Impacts, Community Infrastructure, and Environmental Impacts — across all services. The Interagency Issues Team also provided analysis on airspace issues when applicable.


Criteria 1 – 4: Military Value

In accordance with PL 101-510, as amended, all of the information used by the Secretary of Defense to prepare recommendations must be sent to Congress, the Commission, and the Comptroller General. Within the Commission, each team began its review and analysis with an examination of the documents provided by the services. First, teams determined whether the recommendations were based on the force-structure plan and eight criteria, and whether all bases were considered equally. Next, the teams considered if categories, subcategories, and base exclusions were reasonable.

Each of the teams reviewed the process the services used to assess military value, as well as the reasonableness of the data they used. Each team examined the capacity analyses performed by the services and highlighted installation categories that required additional scrutiny. Specific data analyses included a review and independent analysis of the COBRA input data and military construction cost estimates, as well as the capacity of receiver installations to accept missions.

Throughout the review and analysis process, the Commission staff maintained an active and ongoing dialogue with base-associated communities who made significant contributions to the entire process. Staff members also accompanied Commissioners on base visits, attended regional hearings, and visited closure and realignment candidates and receiving installations.


Criteria 5 – 8: Costs, Savings, and Impacts

While the first four selection criteria assessed military value and were given priority consideration, the remaining criteria were also applied in base closure and realignment evaluations. Because these criteria were not driven by military considerations specific to a service, the Commission’s Interagency Issues Team evaluated criteria application across all services to ensure process uniformity and compliance with the legal requirement to evaluate recommendations based on the final selection criteria.


Criterion 5: Return on Investment

As prescribed by OSD policy guidance, the COBRA model was used by the services and defense agencies to calculate costs, savings, net present value, and return on investment for base closure and realignment actions. Return on investment was the expected payback period in years for each proposed base closure or realignment. The COBRA input data consisted of standard factors, which generally remained constant, and base/scenario factors which were unique. Standard factor examples included civilian pay, national median home price, discount rates, and costs per mile of moving personnel and equipment. Examples of base/scenario factors included the number of authorized personnel at a base, the size of the base, the number of personnel moving, and construction costs required by the move. The output data were used by each of the services and defense agencies in their decision-making process.
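As a simplified illustration of the payback and net-present-value arithmetic such an analysis reports (the actual COBRA model covers many more cost elements; the dollar figures and discount rate below are hypothetical):

# Illustrative sketch only: payback period and net present value for a
# hypothetical closure with a one-time cost and steady annual savings.

def payback_period(one_time_cost, annual_savings):
    """Years of recurring savings needed to recover the up-front cost."""
    return one_time_cost / annual_savings

def net_present_value(one_time_cost, annual_savings, years, discount_rate):
    """Discounted recurring savings over the horizon, less the up-front cost."""
    pv_savings = sum(annual_savings / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return pv_savings - one_time_cost

cost, savings = 120.0, 30.0   # $ millions, hypothetical
print(payback_period(cost, savings))                            # 4.0 years
print(round(net_present_value(cost, savings, 20, 0.0275), 1))   # approx. 336.8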

All of the COBRA runs used by the services and defense agencies in formulating their recommendations were provided to the Commission with the Secretary’s list. Other COBRA runs were submitted by the services and defense agencies upon Commission request. The Commission thoroughly reviewed the services’ and defense agencies’ data throughout its evaluation process.

The Commission also generated and ran its own COBRA models to evaluate various alternative realignment and closure scenarios. In total, including the original DoD submission COBRA runs, the staff received or generated nearly 400 COBRA runs for evaluation and consideration. Ten percent of these COBRA runs were generated by communities and submitted to the Commission for evaluation. In a number of these cases, the communities’ analyses identified important cost and savings issues.

Another vital function performed by the Review and Analysis Interagency Issues Team was to track the costs and savings estimates of DoD recommendations throughout the review and analysis process. During the time from February 28, 1995, when the list of recommendations was submitted to the Commission, until the final deliberations in late June, DoD modified the return on investment calculations for 64 of the original 146 recommendations. Several of these revised COBRA runs substantially changed the estimate of the costs and savings associated with a particular realignment or closure action. In general, DoD originally underestimated the cost of executing realignment or closure actions and overestimated their projected savings.


Criterion 6: Economic Impact

Two economists on the Commission’s Review and Analysis Interagency Issues Team, one detailed from the Department of Commerce (DOC) and one from the Federal Emergency Management Agency (FEMA), validated DoD’s compliance with Criterion 6 on economic impact. Their review included (1) analysis of the economic procedures provided to the services by DoD’s Joint Cross-Service Group on Economic Impact, (2) validation of personnel changes resulting from the current BRAC action, in particular verifying consistency in personnel changes between the Economic Impact Data Base (EID) and the COBRA personnel summary reports, (3) validation of the employment data used in the economic impact equation and the historical economic data used to demonstrate actual economic activity, (4) validation of the economic areas assigned to installations, and (5) analysis of the indirect job multipliers used to measure indirect job impacts.

The services generally complied with the OSD guidance to estimate economic impact, and these impacts represented a worst-case estimate of job loss. The economic procedures used by the services complied with commonly used economic practice for measuring regional economic impacts. Personnel changes were consistent between the EID and COBRA for the majority of installations. Where inconsistencies occurred, the Commission directed the services to resolve them. Economic data were validated by comparing the data in the EID with economic reports generated by the services and by validating these data against their sources: DOC’s Bureau of Economic Analysis and the Labor Department’s Bureau of Labor Statistics. The Commission validated the assignment of installations to appropriate economic areas, consistent with the Office of Management and Budget’s Revised Standard for Defining Metropolitan Areas, as appropriate.

The Commission, with further assistance from FEMA, assessed the indirect job multipliers used by the services to estimate indirect job losses by independently computing multipliers for 32 major bases included on the Secretary’s list. In most cases, the multipliers used by the services were greater than those estimated by FEMA. Where the FEMA multipliers were greater, the Commission questioned DoD’s Joint Cross-Service Group on Economic Impact about the apparent discrepancies. The Commission found, through these discussions, that the lower DoD multipliers resulted from adjustments to standard multipliers to account for the lower wages of DoD military personnel and their use of on-base services, compared with DoD civilian personnel. After this review, the Commission concluded that the indirect job multiplier values used by the services were consistent and complied with good economic practice.
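As a simple illustration of how an indirect job multiplier is applied (the multiplier value, direct job losses, and area employment below are hypothetical):

# Illustrative sketch only: translate direct job losses into total estimated
# job losses in an economic area using an indirect job multiplier.

def total_job_loss(direct_jobs_lost, indirect_multiplier):
    """Direct losses plus the indirect losses implied by the multiplier."""
    return direct_jobs_lost + direct_jobs_lost * indirect_multiplier

def percent_of_area_employment(jobs_lost, area_employment):
    """Job losses expressed as a share of total area employment."""
    return 100.0 * jobs_lost / area_employment

lost = total_job_loss(direct_jobs_lost=3000, indirect_multiplier=0.8)
print(lost)                                                # 5400.0 total jobs
print(round(percent_of_area_employment(lost, 250000), 2))  # 2.16 percent of area employment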


Criterion 7: Community Infrastructure

The Commission’s Review and Analysis Interagency Issues Team validated DoD’s compliance with Criterion 7, the ability of both the existing and potential receiving communities’ infrastructure to support forces, missions, and personnel. DoD did not provide specific guidance on how the services should evaluate this criterion. The services determined their own measures of community infrastructure adequacy, based as much as possible on existing data sources. Each service appeared to address its measures adequately, so no substantial deviation from the established criteria was identified.

Army: In its report to the Commission, the Army stated that Criterion 7 was addressed together with Criterion 6, using DoD’s standard model to evaluate economic impacts. The Army provided no additional description of its evaluation of community infrastructure. Some of the attributes selected for the Army’s military value analysis suggested that community infrastructure may have been taken into account in the analysis. These attributes included workforce statistics, cost of living index, family housing, health care index, and variable housing allowance.

Navy: The Navy rated selected aspects of community infrastructure in its military value analysis, including on- and off-base housing, child care availability, commute distance, access to education and health care, and crime statistics. Community infrastructure factors were rated and assigned weights for calculation within each installation category. The Navy’s data calls contained comprehensive listings and statistics on workforce attributes, spouse employment, education options, and the ability of local infrastructure to accept growth at various levels.

Air Force: The Air Force quantified and rated six sub-elements: off-base housing, transportation, crime rate, medical care, education, and off-base recreation. The Air Force assigned color-coded ratings to the six sub-elements, which were then averaged into a single color-coded rating for community infrastructure. The analysis relied on various national, local, and service-specific data sources. The Variable Housing Allowance (VHA) survey evaluated various cost-related factors for individual bases and was used to derive the VHA paid to enlisted personnel. The Air Force used VHA data to assess off-base housing and commute information. It should be noted that the objectives of the VHA survey (to measure the need for VHA) tend to lead survey respondents to maximize negative responses. Thus, quality-of-life data derived from the VHA survey may show a negative bias toward community infrastructure.

Defense Agencies: The Defense Logistics Agency assessed community impact by using data on local economic indicators, transportation, utilities, workforce availability, housing, education, health care, crime, and climate/environment. Data sources included Bureau of the Census, Department of Commerce, state agencies, local transit authorities, and published business directories.


Criterion 8: Environmental Impact

An environmental analyst detailed to the Commission’s Review and Analysis Interagency Issues Team from the Environmental Protection Agency validated DoD’s compliance with Criterion 8 on environmental impact. The review included (1) review of DoD guidance to the services and defense agencies, (2) review of each service’s analysis and recommendations, (3) review of selected base-specific data calls for each service, and (4) interviews with an environmental analyst from the BRAC staff of each service to clarify interpretation of the DoD guidance.

The Department required consideration of environmental impacts for closing, realigning, and receiving installations. Specifically, seven environmental attributes were to be evaluated: threatened and endangered species, wetlands, historic and archeological sites, pollution control, hazardous materials/wastes, land and air uses, and programmed environmental costs/cost avoidances.

Guidance issued in December 1994 addressed environmental restoration and compliance costs. The policy stated that environmental restoration costs at closing bases were not to be considered in cost-of-closure calculations, and cited DoD’s legal obligation for environmental restoration at any base, whether or not it closes. Environmental compliance costs, however, could be a factor in a base closure or realignment decision, and were estimated for all facilities.

The services and defense agencies generally complied with the DoD guidance in their evaluation of environmental impacts. The services applied different weighting factors to environmental criteria, and some services selected certain environmental criteria to incorporate in their military value analysis. Specific comments follow:

Army: The Army assessed some environmental impacts in its military value assessment as environmental carrying capacity, which measured ability to conduct current missions, receive additional units, and expand operations in light of environmental constraints. The Army also assessed environmental impacts and costs in Installation Environmental Baseline Summaries. Army documentation indicated that environmental factors did not impede any recommended BRAC action.

Navy: The Navy selected certain environmental factors to include in most of its military value calculations, under Environment and Encroachment. These factors were selected and weighted differently for each subcategory of Navy facilities, as some environmental criteria were considered more significant to certain types of facilities. Of all environmental factors measured within military value evaluations, air quality was often assigned the greatest weight. All required environmental attributes and costs were assessed qualitatively in the base-specific environmental data calls.

Air Force: The Air Force quantified air quality as one of seven sub-elements in its military value analysis under Criterion II (Availability and Conditions of Land, Facilities, and Associated Airspace). The Air Force addressed and weighted all other environmental elements in general in Section VIII (Environmental Impact). Additional environmental information and costs were summarized in the base-specific data calls but were not weighted as criteria for comparison. The categories and level of detail for compliance costs varied from one base to another, and did not allow for effective comparison between bases.

Defense Agencies: The Defense Logistics Agency sent environmental questionnaires to its installations and forwarded the responses to the Commission. DLA stated that any environmental factors that would limit an installation’s ability to expand were assessed. In two cases, Tracy/Sharpe and Ogden, air quality nonattainment was viewed as a potential limitation on expansion. The Defense Investigative Service completed an environmental analysis for the structure from which it will move.

General comments: Air quality presented particular concerns for realigning and receiving candidate installations. BRAC 95 was the first round to consider the conformity regulations under the 1990 Clean Air Act, which prohibits a federal agency from supporting an action unless it determines that the action conforms to the air quality implementation plan for the area.

The Air Force appeared to assign air quality greater weight than the other services did in considering its military value implications. The Air Force and DLA considered the probability of obtaining conformity determinations in making their recommendations. Although the Navy identified areas where conformity determinations might be required, its recommendations assumed that implementation was possible, even at significant cost. The Army’s documentation did not indicate that air conformity concerns affected closures or realignments.

The Role of the General Accounting Office (GAO)

In compliance with Public Law 101-510, as amended, GAO evaluated DoD’s selection process, provided the Commission and Congress a report containing its detailed analysis of the process, and testified before the Commission on April 17, 1995.

The GAO reported to Congress and the Commission that the services’ selection process was generally sound, well documented, and should result in substantial savings. However, the recommendations and selection process were not without problems and, in some cases, raised questions about the reasonableness of specific recommendations. At the same time, GAO noted that improvements were made to the process from prior rounds, including more precise categorization of bases and activities, resulting in more accurate comparisons between like facilities and functions, and better analytical capabilities.

GAO reported that the DoD and its components included the requirement to use certified data, i.e., information that was accurate and complete to the best of the originator’s knowledge and belief. This requirement was designed to overcome concerns about the consistency and reliability of data used in the process. GAO also found that the services improved their cost and savings estimates for BRAC 1995 recommendations. In developing cost estimates, they took steps to develop more current and reliable sources of information and placed greater reliance, where practicable, on standardized data. Some components sought to minimize the costs of base closures by avoiding unnecessary military construction. For example, the Navy proposed a number of changes to prior BRAC decisions that will further reduce infrastructure and avoid some previously planned closure costs.

The 1993 Defense Base Closure and Realignment Commission required DoD to explore opportunities for cross service use of common support assets. For the 1995 round, the Department of Defense established Joint Cross-Service Review Groups to provide the services with alternatives for realignments and closures in the areas of depot maintenance, laboratories, test and evaluation facilities, undergraduate pilot training, and medical treatment facilities. GAO found that DoD’s attempt at reducing excess capacity by proposing cross-service alternatives yielded some results. Agreements for consolidating similar work done by two or more of the services were limited, however, and opportunities to achieve additional reductions in excess capacity and infrastructure were missed. This was particularly true of depot maintenance activities and laboratory facilities.

GAO also found that although the services have improved their processes with each succeeding BRAC round, some process problems continued to be identified. In particular, the Air Force’s process remained largely subjective and not well documented; also, it was influenced by preliminary estimates of base closure costs that changed when a more focused analysis was made. For these and other reasons, GAO questioned a number of the Air Force’s recommendations. To a lesser extent, some of the services’ decisions affecting specific closures and realignments also raised questions. For example, GAO found the Secretary of the Navy’s decision to exclude certain facilities from closure for economic impact reasons was not consistently applied.

As stated above, GAO reported that, as in the past, key aspects of the Air Force’s 1995 process remained largely subjective and not well documented. Documentation of the Air Force’s process was too limited for GAO to fully substantiate the extent of Air Force deliberations and analyses. However, GAO determined that the initial analytical phases of the Air Force process were significantly influenced by preliminary estimates of base closure costs. For example, some bases were removed from initial consideration based on these estimates. Also, in some instances, closure costs appeared to materially affect how the bases were valued.

Relative to the Navy, GAO concluded its process was generally thorough and well documented. It pointed out, however, that the Secretary of the Navy excluded four activities in California, and one in Guam, from consideration for closure because of concerns over the loss of civilian positions. For the activities in California, the Secretary based his decision on the cumulative economic impact of closures from all three prior BRAC rounds. But the economic impact of the four California activities, as defined by OSD criteria, was less on a locality basis than that for similar activities recommended for closure in other states either by the Navy or by other DoD components. In this case, however, OSD did not take exception to the inconsistency.

GAO also found the Army’s process and recommendations to be generally sound. GAO asserted, however, that the Army did not fully adhere to its regular process in assessing military value when recommending minor and leased facilities for closure. In selecting 15 minor sites for closure, the Army based its decision on the judgment of its major commands, which assessed the sites as excess and of low military value. In considering leased facilities, the Army relied on its stationing strategy and its guidance to reduce leases, but did not assess the facilities separately as it did other installations. These decisions thus departed somewhat from the process the Army used for its installations.

Regarding the Defense Logistics Agency, GAO reported that its process and recommendations were well documented and flowed logically.

Finally, GAO found that the Defense Investigative Service’s recommendation was well documented and generally sound.

