Another consideration was to keep the overall maintenance costs for the JSF low, along with the number of maintenance hours and maintainers required. This is important because most of the cost of a weapons system lies not in acquisition but in life-cycle costs, which are at least twice as high (Kutner 2001). Secondly, the number of qualified maintainers, both military and contractor, is dropping, and without enough maintainers aircraft and other weapons spend more time on the ground or in the base and are not available for operations (Thornton 2001, 205).

Apart from the great benefits the armed forces can gain from modelling and simulation in weapons design, there are also problems and unaddressed issues. Ideally, the armed forces can work through SBA with industry much more closely and optimize a weapons design before the prototype is built. The customer can look at the computer models and suggest improvements, and simulations can then determine how design changes affect overall functionality. This, however, would imply the sharing of data and simulation technology between the developer and the customer and would require a great deal of trust. The companies naturally fear that they may lose their competitive advantage if they have to share the technology of their design and simulation tools with the government or even with other companies (Kutner 2001). At the moment the Pentagon has no consistent policy with respect to SBA, and it is expected that the problem will have to be solved once it becomes necessary to simulate not only single weapons systems but whole networks of integrated weapons systems. This will be the case with the Future Combat System programme, which connects 14 different platforms, ranging from an unmanned howitzer to a man-portable UAV, into a single network (Kutner 2001).

Taking this one step further is the idea of developing a completely new generation of weapons on the computer, making it ready for production, but then putting the blueprints on the shelf and starting to develop the next generation, which is the concept of ‘skipped generations’ (Defense & Aerospace Electronics 1993). If a new weapon is needed, it can easily be produced from the blueprints. Otherwise it is never produced, and there are no costs of maintaining and operating unnecessary weapons, while weapons design can always incorporate the newest technologies. In other words, governments could have ‘virtual arsenals’ of conventional or, more likely, non-conventional weapons. The obvious problem with this is that such weapons systems are never sufficiently tested, only in computer simulations, which can have varying degrees of accuracy. If a new weapon has to be rushed into battle, there is no operational experience available, which turns the weapon into a liability rather than an asset. Under operational conditions a military commander will naturally prefer battle-proven weapons and will be reluctant to grant any new weapon a more extensive role, which makes the idea of skipped generations rather questionable. What use is a weapon that has not been sufficiently tested?
This leads directly to the next problem: the process of weapons testing and evaluation.

Weapons Testing and Evaluation

Weapons testing and evaluation (T&E) relates to activities which are intended to ensure the quality of equipment and reduce the risks for the armed forces. T&E is usually done by the armed forces’ and national research labs, but for the reasons outlined in the previous chapter, the research labs now tend to contract out these activities to private companies. In the US, IT companies like SAIC, Titan, CACI, Anteon, and MITRE offer services relating to weapons testing and evaluation. In Britain the majority of test facilities are now operated by Qinetiq (UK MoD 2005, 10). In some cases the complete T&E of a new weapons system is contracted out, and the contractor takes care of everything required to field-test it and produces an evaluation report containing its findings. Few companies have the capability and expertise for a complete test. Usually a contractor tests only components of a weapons system, or certain aspects of it. Some companies just provide instrumentation devices, testing software, or related engineering support services. Live testing usually takes place on government test ranges and facilities (which are rented by the contractors) under the supervision, or with the active participation, of government employees (UK MoD 2005, xix). There are also problems with this arrangement, as contractors complain about delays caused by understaffing and about facilities which are not up to date in terms of technical standards. This has led to the more recent tendency of some contractors acquiring their own test ranges and facilities. In some cases contractors manage test facilities and conduct tests for the armed forces. For example, SAIC has its own Test and Analysis business unit, which works together with the armed forces to evaluate the capabilities of combat and combat support systems. The company also holds a $500 million contract for supporting the Air Force’s Operational Test and Evaluation Center. SAIC’s activities range from scientific studies through the development of instrumentation devices to large field tests of new systems. SAIC has done this kind of work for the Air Force since 1989.

Increasingly, prime contractors like Lockheed Martin, Northrop Grumman or Raytheon do the test and evaluation of systems and components, often even of their own products, which raises serious questions about the objectivity of these tests. Of course, there are practical reasons for this: testing is an integral part of the development phase, and the contractor knows best which tests are needed; testing by the military research labs is time-consuming, as it depends on available resources; and leaving most of the testing to the contractors is in the end cheaper. To shorten the development phase and to reduce costs, modelling and simulation is now used in virtually every test programme (Fox, Boito and Younossi 2004, xxi). The costs of live tests can be substantial. In the testing of missile defence systems, which are on the one hand an extreme case, but on the other a weapons system in line with the general trend towards high-tech, a single test has come to cost $80 to $100 million (Graham 2003, 205). The interceptor missile alone costs more than $35 million ($24 million for the kill vehicle and $11 million for the booster) (Graham 2003, 294). Naturally, only a few tests are conducted, and these generate the data used in simulations for performance assessments. Fewer tests also mean that less data is generated for analysis, which weakens the results of the analysis and the accuracy of the models designed with the test data. As the reliance on simulations has grown, there is now the great danger pointed out by Wayne Biddle: ‘Our weapons tests now use so much computer modelling and simulation that no one knows whether some new arms really work’ (quoted from Fox, Boito and Younossi 2004, xxi). As a result, many of the most complex weapons systems have been fielded without sufficient testing, and their actual performance has fallen far short of expectations. Out of a whole range of possible examples, three impressive failures shall be briefly discussed in this section: the Sgt. York Divad, the Patriot missile system and missile defence systems.

Sgt. York Divad

A very illustrative example of the problems connected with testing is the Sgt. York Divad (Division Air Defense) automatic gun system, which was ordered by the US Army in 1978
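The statistical point made above, that fewer live tests mean less data and therefore less accurate performance models, can be illustrated with a minimal simulation sketch. The figures below are purely hypothetical and are not drawn from any actual test programme: we assume an invented ‘true’ hit probability of 0.8 and compare how widely estimates of that probability scatter when they are based on 5, 50 or 500 test shots.

```python
import random
import statistics

random.seed(1)

TRUE_HIT_PROB = 0.8  # hypothetical 'true' reliability; an assumption for illustration only

def estimate_from_tests(n_tests, n_repeats=10_000):
    """Repeat an n_tests-shot test campaign many times and collect the
    estimated hit rates, to see how much the estimates scatter."""
    estimates = []
    for _ in range(n_repeats):
        hits = sum(random.random() < TRUE_HIT_PROB for _ in range(n_tests))
        estimates.append(hits / n_tests)
    return estimates

for n in (5, 50, 500):
    est = estimate_from_tests(n)
    print(f"{n:3d} tests: mean estimate {statistics.mean(est):.3f}, "
          f"spread (std dev) {statistics.stdev(est):.3f}")
```

The spread of the estimates shrinks roughly with the square root of the number of tests, so a programme that can afford only a handful of $100 million live tests is left with a correspondingly wide band of uncertainty around any performance claim, and that uncertainty propagates into every simulation calibrated on the same data.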
