

A KBE Genetic-Causal Cost Modelling Methodology for Manufacturing Cost Contingency Management

R. Curran1

Delft University of Technology, Delft, 2629 HS

M. Gilmour2

Queens University Belfast, Northern Ireland,

C. McAlleenan and P. Kelly3

Bombardier Aerospace Belfast, Airport Road, Belfast, NI, UK

Abstract

The paper provides validated evidence of a robust methodology for the management of lean manufacturing cost contingency, with a particular focus on contingency regarding recurring work content. A truly concurrent engineering process is established by capturing a range of knowledge from the design, manufacturing and procurement functional areas, and analyzing it in a form that can be automatically accessed with no specialist manufacturing knowledge, thus enabling the company to quickly estimate their designs based on the company's lean manufacturing capability. A use-case study is presented which estimated the rolled-up assembly time for a thin-walled stiffened structure to be within ±7% of the manual estimate. A rolled-up accuracy of ±10% was indicative in terms of total part and assembly cost, although it was evident that some of the individual cost deltas deviated more significantly, tending to cancel each other out in the roll-up. However, the key contribution of the paper is the implementation and validation of the use of the Genetic-Causal principle in developing a lean manufacturing cost contingency methodology. Manufacturing cost models are established according to the principle of 'causality', which establishes which cost drivers should be used to generate the cost relations associated with the cost breakdown structure elements for various manufacturing processes. However, the 'genetic' principle is imposed through the hypothesis that the estimated costs of a new product will be similar to a company's historical cost performance, relative to process selection, product definition and lean manufacturing capability. The latter use of the 'genetic' principle then presupposes that any new cost estimates generated are by nature not definitive, and that rather one should be aiming to estimate the cost contingency. Consequently, the paper goes on to develop this hypothesis through an aircraft fuselage use-case to show how the lean concept of value-added and non-value-added work content is consistent with this approach and indeed leads to cost uncertainty and sensitivity analysis. Consequently, the paper advocates the formalization of an initially heuristic approach, and so it can be described inherently as a Knowledge Based Engineering (KBE) approach: capturing, structuring and formalizing manufacturing cost knowledge for application through knowledge bases and rules.

I. Introduction

It is often quoted that conceptual design influences up to 80% of product cost, yet the majority of cost reduction tools in aerospace companies focus on later stages of product design or production [1-7]. What cost estimation is carried out in the early design stage occurs more at a global product level, following on from the bid cost estimating effort, using parametric techniques or cost estimating relations that connect high-level aircraft performance/design attributes to manufacturing cost. Although this effort is appropriate for setting a target cost, it is not tightly coupled to the product or process definition and is therefore less effective in managing development and product costs through cost-related design and engineering changes.

A key driver in the presented work was the recognized difficulty in evaluating and subsequently tracking the direct recurring costs that are associated with a complex product as it evolves through its development process to the stage of being manufactured [8-14].

1 Chairholder, Air Transport & Operations (ATO) Department, [email protected]. AIAA Senior Member: 241840.
2 PhD Researcher, now graduated.
3 Bombardier Aerospace Employees.

9th AIAA Aviation Technology, Integration, and Operations Conference (ATIO), 21 - 23 September 2009, Hilton Head, South Carolina

AIAA 2009-7107

Copyright © 2009 by Prof. R. Curran. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.


The work presented aims to illustrate a more concurrent engineering process by capturing a mixture of knowledge from the design, manufacturing and procurement functional areas and putting that into a form that can be automatically accessed with no specialist manufacturing knowledge, thus enabling the company to quickly estimate their designs based on actual company manufacturing capability. It will be shown that this ability can have a profound effect on product development, when cost is integrated as an analytical performance metric.


Figure 1. Comparison of EAC to Sold Level to ensure design activity meets target cost

Often within aerospace, the selling price is the target price for the contract, with the supplier being allowed a certain profit margin. The target cost is determined as: Target Cost = Target Price – Target Profit. The selling or target price is often specified in the Request for Proposal (RFP), and analysts in this industry will embark on a top-down costing exercise using the overall market value as the starting point. Internal programmes are expected to create a bottom-up estimate based on a Bill of Material (BOM) that is used to verify that they can meet the target cost. Once the value of the work package has been agreed, the target cost becomes the sold level cost of the piece of work, which the company is contractually obliged to meet through the absorption of any deviations. It is for this reason that the initial BOM-based estimate is so crucial and why so much effort is currently expended in its generation. The Estimate at Completion (EAC) evaluation is a rolling cost estimate that is evaluated as the design activity progresses through detailed design and manufacture, as illustrated in Fig. 1. The EAC is compared to the Sold Level in order to evaluate whether the design is on target to meet the target cost objective.
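As a minimal sketch of this bookkeeping (not taken from the paper; the function names and all figures are illustrative assumptions), the target cost relation and the EAC-versus-sold-level comparison could be expressed as:

```python
# Minimal sketch of the target-cost / EAC bookkeeping described above.
# All names and figures are illustrative assumptions, not values from the paper.

def target_cost(target_price: float, target_profit: float) -> float:
    """Target Cost = Target Price - Target Profit."""
    return target_price - target_profit

def eac_status(eac: float, sold_level: float) -> str:
    """Compare a rolling Estimate At Completion (EAC) to the Sold Level."""
    delta = eac - sold_level
    if delta <= 0:
        return f"on/under target by {-delta:.1f}"
    return f"over target by {delta:.1f} (deviation absorbed by the supplier)"

if __name__ == "__main__":
    sold = target_cost(target_price=1_000_000.0, target_profit=120_000.0)
    print(sold)                                   # 880000.0, the sold level cost
    print(eac_status(eac=905_000.0, sold_level=sold))
```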

II. Research Context

The authors have tried to capture some of the main reasons why cost estimation in the early stages of production is so challenging, and some of the most common issues tend to be:

1. the partial amount of product and process data available;
2. the limited amount of design data that is available being highly volatile and likely to change;
3. a lack of manufacturing knowledge by those responsible for either designing the product or those responsible for putting together the estimate;
4. a lack of clarity of the causal cost drivers of a design;
5. the estimation process being resource intensive; and
6. its high dependence on subjective expert judgment.

Figure 2. Knowledge capture of highly variable production constraints that suggest the implausibility of full deterministic cost prediction


Furthermore, Fig. 2 displays in some more detail the knowledge captured with regard to these challenges, with the knowledge ontology categorizing generic issues according to Personnel, Procedure, Product Data, Environment and System. Essentially, there are of course 'causal' reasons for incurring cost, which provide the analysis architecture, but the nature of the challenges identified necessitates a 'genetic' approach, which implies that the resultant manufacturing cost will be a function of the product definition and manufacturing process selection, but that these costs cannot be definitively predicted and rather will emerge in a similar way to previous product/process ancestry.

This paper explores the potential of integrating two main theoretical areas of research that could facilitate the above research goal, namely: the Genetic-Causal Cost Theory (GCT) and the Knowledge Based Engineering (KBE) approach. Curran has been exploring and applying his GCT approach to cost modeling since around the beginning of the 21st century [15-26]. The original proposition was that GCT requires: 1) classifying the generic cost elements that are linked to particular (genetic) design (product) attributes; 2) developing causal parametric relations that link those genetic attributes to the resultant manufacturing (or life cycle) costs. In proceeding with a hierarchical design-oriented classification there are a number of key aspects that can be considered as genetic, cost being a result of these aspects that relate to design definition. The relevant information from these aspects can be thought of as bits of genetic information that are coded into the design and which give rise to cost. The actual cost is only fixed if all things remain equal; otherwise it will vary with environmental factors such as rates and interest, or process efficiency factors that vary from company to company. Therefore, any scientific cost prediction is still termed an estimate, as the prediction is the most likely potential cost given: a) the nature of the design, b) the available manufacturing processes, and c) changeable environmental factors (that might additionally influence manufacture). The Genetic-Causal method utilizes: 1) Form: being the required shape; 2) Process: being the available conversion processes; and 3) Material: determining the characteristics. Consequently, design information is absolutely fundamental to the understanding of manufacturing cost, according to the genetic cost coding imposed by the designer through the impact of their decisions on form, process and material. There is also recognition of the impact of environmental 'noise' in influencing the causal impact of form, process and material. Notwithstanding, these causal relations can be modeled using observed statistical significance with appropriate normalization for environmental factors. This results in scientifically based relations that numerically link cost to causal sources that are explicit to design definition. Apart from being a highly generic cost modeling technique, the Genetic-Causal method is also well suited to use within an integrated design platform, as changes in the design (for performance benefit) can be mapped through to cost in order to directly trade off manufacturing cost relative to some global objective functions, as exemplified later in this paper.
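The following minimal sketch (with an assumed design driver and invented historical data, not the authors' model) illustrates what a causal parametric cost estimating relation of this kind looks like when fitted to a company's historical 'ancestry':

```python
# Illustrative sketch of a genetic-causal style cost estimating relation (CER):
# process std hours are regressed against a single causal design driver using a
# company's historical parts. The driver choice and all data are hypothetical.
import numpy as np

driver = np.array([0.8, 1.4, 2.1, 2.9, 3.6, 4.4])    # illustrative design attribute
hours = np.array([3.1, 4.6, 6.2, 8.0, 9.3, 11.1])    # illustrative normalized std hours

slope, intercept = np.polyfit(driver, hours, 1)       # simple causal parametric relation

def cer_estimate(x: float) -> float:
    """Estimate std hours for a new part from its causal driver value."""
    return slope * x + intercept

print(f"hours ~= {slope:.2f} * driver + {intercept:.2f}")
print(cer_estimate(2.5))
```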

Increasing organizational and supply chain complexity, accelerating technical change and impending retirement or loss of company employees are frequently mentioned drivers for companies to move towards the adoption of knowledge-based systems [27, 28]. Knowledge-based systems use knowledge management methodologies and techniques to capture, store and use knowledge from various sources in order to be able to meet business objectives and requirements. This is reflected in the following definition of knowledge management [27]: “Knowledge management is a systematic, organized, explicit and deliberate ongoing process of creating, disseminating, applying, renewing and updating intellectual and knowledge-based assets of the organization to achieve organizational objectives”. In general, knowledge-based systems are set up to achieve one or more of the following objectives [27]:

• Knowledge Capitalization: learning from the past by knowledge retention & re-use.
• Project Accompaniment: learning from present activities by knowledge sharing and exchange.
• Innovation: moving towards future benefits by leveraging organizational knowledge assets.
• Cost reductions: achieving cost reductions through 'first-time right' adoption enabled by knowledge sharing.


Figure 3. Knowledge-based systems - application taxonomy

When reviewing applications of knowledge-based systems, two major streams can be identified (as illustrated in Fig. 3). These streams differ primarily in their intended functionality. Though systems from these streams can be based upon a shared knowledge base that captures explicit and tacit knowledge in the organization, in practice the differences in intended functionality will put different requirements on the knowledge base content [29-32] and will result in different spectra in using the same knowledge base (Fig. 3). As such, these two streams require different approaches towards the capture and use of knowledge. The two major streams are:

• Expert Systems: these applications are primarily aimed at supporting the end user by offering advice, recommendations or explicit cases based on previously attained relevant knowledge. Case-Based Reasoning [31] or Rule-Based approaches [33, 34] are frequently used to construct Expert Systems. In general, Expert Systems supply critical knowledge towards the user and/or offer explicit advice, but the user frequently still has to process, analyse and decide based upon the offered knowledge source(s) or knowledge inferences.

• Knowledge-Based Engineering (KBE) applications: Knowledge-Based Engineering [34] 'is a technology based on the use of dedicated software tools that are able to capture and re-use product and process engineering knowledge. The main objective of KBE is reducing time and cost of product development by means of:

  - Automation of repetitive, non-creative design tasks.
  - Support of multi-disciplinary integration starting from the conceptual phase of the design process.'

The automation of repetitive, non-creative design tasks is often based on rule-based [34] or parametric approaches [18]. In general, the user is freed from performing routine tasks, making more time available for 'pure' design tasks.

Given the previous establishment of the industrial problem in carrying out effective manufacturing cost estimating very early in the product development process, we can formulate the main research goal as:

“To develop and validate an improved approach to estimating and managing product manufacturing cost from the earliest stages of the development cycle.”

Innovative research demands that a new research hypothesis should be the foundation of any study that hopes to make a long term contribution to the body of knowledge within a domain (alluded to through the research goal). The basic research hypothesis to be tested through the work presented is as follows:

Figure 4. Breakdown of Product Work Content (Adapted from ILO, 1978)

“That it is possible to integrate heuristic and formalized modelling principles into the theoretical modelling architecture provided by Knowledge Based Engineering (KBE) theory, in order to establish a defining theoretical basis for the understanding, definition and analysis of cost”

Therefore, the ultimate Research Question is:

"Can the Genetic-Causal Cost Theory and Lean Thinking be integrated into a KBE architectural framework in order to provide an applied cost contingency management capability that ensures the provision of sufficient cost information, so as to improve the management and control of product development costs?"

Consequently, reflecting on the research goal previously set out and the research question, we can define the research scope of the study presented as:

• Moving from aiming for a project target cost to actually managing the target cost relative to the product development process.

• Improving and enabling the evolutionary process from very early product definition (Bid/Grey BOM) and data generation through informed decision making.

• Increasing the speed and reliability of the cost modelling & improving understanding for cost management and integration.

• Developing a conceptual system architecture and application framework that embodies current expert knowledge and industrial needs.

III. Theoretical Methodology for Cost Contingency Modelling

Manufacturing companies have been embracing Lean theory following the success of the Toyota Company, in the attempt to improve the efficiency of their operations. A product and process that is developed or re-engineered along lean principles should be optimised for minimum-effort production. Work in excess of a theoretical minimum can result for a number of different reasons, each of which can be linked to one or more of the seven wastes that lean theory acts to eliminate. Such reasons are:

1. Work added due to product design defects, which results in:
   a. the waste of over-processing;
      i. this can be avoided by good communication between the methods and design functions to ensure products are 'designed for manufacture'.
2. Work added due to ineffective production and operation methods, such as:
   a. the transportation of parts arising from bad facility layout;
   b. the excess motion of operators due to poorly designed work stations;
   c. defects arising from poor selection of processes and tools;
   d. waiting arising from poor scheduling.
3. Work added due to poor planning by management, such as:
   a. excess inventory;
   b. over-production.
4. Ineffective time due to the worker:
   a. time taken off above the allocated rest periods, i.e. lateness, deliberately working slowly.



The breakdown of the total work content experienced by an organisation during production is displayed in Fig. 4, where the actual work content of a product is composed of the theoretical minimum work content plus that work which arises from defects in the design of the product and process. The work content associated with problems in the design of the product and process can only become quantifiable once the product and process have been completely specified. Ineffective time arising from worker inefficiency and poor managerial planning is only experienced, and hence accurately quantifiable, once production begins. Thus the total time that will be required for production will be the sum of the basic work content, the work due to product/process defects and the ineffective time. Lean practices adopted post product and process definition act to reduce the amount of excess work due to ineffectiveness and process defects, with only minimal scope for product change.

Consequently, work content estimated at the concept stage (when neither product nor process is fully defined) can only accurately quantify the basic work content. The use of standard hours negates the need to take operator learning into consideration within the proposed methodology. The standard hour measurement assumes that work is carried out by experienced, competent personnel and that the process has attained steady-state conditions. In reality it takes time for an operator to become experienced and competent with the processes relating to a particular product, especially in situations where there is an extensive amount of manual work, such as that experienced in aerospace assembly operations. Initial work times will therefore be much greater than those experienced after a number of products have been manufactured and lessons have been learnt; the principle is displayed in Fig. 5 for a typical learning curve [36]. Learning is dependent on the nature of the product (complexity) and process (high automation versus low automation). Identification of suitable learning rates to be used on the programme is the responsibility of senior management, and these are established by the Manufacturing and Commercial business functions prior to project start-up. These rates are based on a mixture of standardised industry rates, such as those offered by the RAND Corporation, and organisational past experience on previous contracts. Fig. 5 also displays the potential effect that the introduction of lean principles post production has on the standard hours for the process.
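As a brief illustration of the learning-curve behaviour described above (a sketch using the common Wright-type unit model; the 85% learning rate and first-unit hours are assumed values, not figures from the paper or from RAND):

```python
# Sketch of a learning curve: hours for the n-th unit follow T_n = T_1 * n^b,
# where b = log(learning rate) / log(2). Rate and first-unit hours are assumed.
import math

def unit_hours(first_unit_hours: float, unit_number: int, learning_rate: float = 0.85) -> float:
    """Hours for the n-th unit under a constant learning rate."""
    b = math.log(learning_rate) / math.log(2.0)
    return first_unit_hours * unit_number ** b

# Early units take far longer than the steady-state 'standard hour' value.
for n in (1, 2, 4, 8, 50):
    print(n, round(unit_hours(120.0, n), 1))
```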

Figure 5. Work Breakdown, and the Effect of Lean in the Context of the Product Learning Curve

The methodology for the process mapping is illustrated in Fig. 6. It is readily evident that there is an enforced integration of the Manufacturing and Procurement functions, with only one feedback loop for design revision. In addition, data management tools and knowledge repositories have been utilized to facilitate the automated retrieval of knowledge and automatic generation of data, with clear demarcation of responsibility in terms of the data categorization.



Figure 6. A generic KBE solution to work content and materials estimation (to the right) facilitated through the genetic-causal CENTRE-ing approach

The overall product estimate is the sum of the estimates associated with each of the line items within the BOM:

Product Std Hours Estimate = Σ (Line Item Std Hour Estimate).

The resultant estimate is therefore a sum of parts, and it is useful to be able to quantify the associated range, which can be considered as estimate assurance or uncertainty analysis. Uncertainty analysis concerns the evaluation of the 'goodness' or confidence associated with a generated estimate and indicates the credibility of the study. Uncertainty can arise due to:

1. Uncertainty with regard to the product design

2. Uncertainty in the cost estimation approaches

Product design inputs into the cost model are considered to be fixed (i.e. single-point estimates) and so sensitivity analysis considers the effect of changes to design parameters on the total estimate. Therefore, parts of the product can be ranked according to the overall effect that changes to their design have on the total product estimate. There is then the opportunity to identify parts that require more design attention if manufacturing time is to be minimized. Each line item estimate has an associated range of values, or a confidence interval, expressed as a mean, minimum and maximum value. The upper and lower bounds for the range can easily be discerned from the upper and lower bounds of the estimates for each of the line items:

(Product Std Hours Estimate)(Lower Bound) = Σ (Line Item Std Hour Estimate)(Lower Bound)

(Product Std Hours Estimate)(Upper Bound) = Σ (Line Item Std Hour Estimate)(Upper Bound)

Likewise, the mean of this range can be gauged from the mean of each of the individual estimates:

(Product Std Hours)(Mean) = Σ (Line Item Std Hour)(Mean)

These values are important as they specify the limits of the estimate, i.e. they add credibility and confidence when referencing estimates by stating that the actual value will likely not fall below (Product Std Hours)(Lower Bound), nor will it likely exceed (Product Std Hours)(Upper Bound), and that it is most likely to be (Product Std Hours)(Mean).
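A tiny sketch of this deterministic roll-up (the line items and their bounds are illustrative placeholders, not data from the study):

```python
# Deterministic roll-up of line-item bounds: the product-level lower bound,
# mean and upper bound are simply the sums of the per-line-item values.
line_items = {              # name: (lower, mean, upper) std hours, assumed values
    "skin":      (8.0, 10.0, 14.0),
    "stringers": (4.0, 5.0, 7.5),
    "assembly":  (18.0, 22.0, 30.0),
}

lower = sum(v[0] for v in line_items.values())
mean = sum(v[1] for v in line_items.values())
upper = sum(v[2] for v in line_items.values())
print(f"Product Std Hours: lower={lower}, mean={mean}, upper={upper}")
```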


Figure 7. Generation of the Total System estimate

To quantify the likelihood of experiencing an actual value within this range of estimates, every possible instance of each variable needs to be considered and summed against every possible instance of the other variables within the system. The frequency of each potential output obtained must then be converted into a percentage probability, as illustrated in Fig. 7. Using Monte Carlo analysis, the variables can be randomly sampled and the resulting outputs considered to be representative of the whole system. However, it is primarily the distribution of output from the system that is of interest, as this aids in the understanding of the behavior of the output of the system. Fig. 7 displays the individual Probability Density Functions (PDFs) for the components of a particular system and illustrates how these can be added to obtain the PDF for the global system. Each variable is assigned a random value from its known or assumed probability distribution and these are input into the system model, with the process repeated for each of the variables many times over. The results of the simulation are also displayed in the form of Cumulative Distribution Function (CDF) graphs. The central limit theorem states that the sum of a large number of variables can be approximated by a normal distribution. The standard deviation can be used as a measure of confidence in the results space, whilst the CDF facilitates the quantification of the certainty associated with a particular point estimate.
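A minimal sketch of this Monte Carlo roll-up (the three line items, their triangular bounds and the sample size are assumed for illustration only, not drawn from the study):

```python
# Monte Carlo roll-up sketch: each BOM line item gets a triangular
# (min, most-likely, max) std-hours distribution, samples are summed, and
# percentiles are read off the empirical CDF. All inputs are assumed.
import numpy as np

rng = np.random.default_rng(42)
line_items = {                      # (min, mode, max) std hours, illustrative
    "skin":      (8.0, 10.0, 14.0),
    "stringers": (4.0, 5.0, 7.5),
    "assembly":  (18.0, 22.0, 30.0),
}

n = 10_000
total = np.zeros(n)
for lo, mode, hi in line_items.values():
    total += rng.triangular(lo, mode, hi, size=n)    # sample each line item

p10, p50, p90 = np.percentile(total, [10, 50, 90])   # points on the CDF
print(f"10%/50%/90% total std hours: {p10:.1f} / {p50:.1f} / {p90:.1f}")
print(f"standard deviation: {total.std():.2f}")
```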

Sensitivity analysis involves studying how the outputs of a system change with variations in the assumptions or inputs of the system. It is useful for identifying those system variables that are important in controlling system behavior. Within the context of the prototype tool, sensitivity analysis is conducted in order to identify those parts or assemblies within the product that will alter the total product manufacturing time most, should their individual design attributes change. All product attributes are subject to change, especially in the early product development stage. To apply sensitivity analysis for the purpose of estimate control, product design attributes can be changed within likely ranges. The attributes are increased or decreased by a specified amount and then the estimation activity is rerun. These new estimates can be compared to the original estimated values and the relative differences calculated at a sub-system and at a total-system level. These relative differences in the estimates, both at a component level and at an all-up level, are then considered the sensitivities of the product estimate to the changes made in the product design attributes. Two different types of sensitivity can be calculated, namely, (1) the sensitivity of each individual part estimate compared to the original estimate:

% Sensitivity of Part Estimate = [(Part Estimate(±20%) – Part Estimate(Original)) / Part Estimate(Original)] × 100

and (2) the total product estimate sensitivity due to a change in one individual part:

% Sensitivity of Total Product Estimate = [(Part Estimate(±20%) – Part Estimate(Original)) / Total Product Estimate(Original)] × 100

It is the second sensitivity, that of the total system sensitivity to the change in part cost, that is of most interest. Each part or assembly within the product can be ranked according to the relative effect on the cost of the overall system. This ranking defines the priority given to the part during design changes in order to best control the manufacturing estimate. Due to the non-continuous method of estimation achieved through use of the categorization technique, it is appropriate to consider the maximum potential variation in input parameters,


and determine the effect this has on the generated estimate. This differs from a more typically employed sensitivity technique, in which the change in total system output per unit change in input is calculated.
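The ranking procedure described above might be sketched as follows (the stand-in estimating relation, the parts and their driving dimensions are all assumed for illustration):

```python
# Sketch of the sensitivity ranking: each part's driving dimension is perturbed
# by +20%, the estimate is rerun, and parts are ranked by their effect on the
# total product estimate. The CER and the parts are hypothetical placeholders.
def part_hours(length_m: float) -> float:
    """Hypothetical stand-in for a line-item estimating relation."""
    return 1.5 * length_m + 0.8

parts = {"frame": 2.0, "stringer": 4.5, "skin": 6.0}           # driving dimension (m)
baseline = {name: part_hours(d) for name, d in parts.items()}
total_baseline = sum(baseline.values())

sensitivities = {}
for name, d in parts.items():
    perturbed = part_hours(d * 1.2)                             # +20% change rerun
    # total-product sensitivity (%) to the change in this one part
    sensitivities[name] = 100.0 * (perturbed - baseline[name]) / total_baseline

for name, s in sorted(sensitivities.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:+.1f}% of total product estimate")
```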

IV. Application to an Aircraft Fuselage Use-Case

As with the approaches developed for dealing with unknowns in the development of the fabrication models, Watson's enhanced multiple linear regression statistical analysis tool will again be utilised in order to generate statistical relations that link known assembly attributes to fastener counts. The development of such relations is only semi-causal, exploiting observed correlation between such attributes and the fastener counts; they do not fully model the real causal relationships.

The primary difficulty with relying on a regression-based approach is the need for a sufficient number of similar past or current products on which to base the analysis. This is particularly problematic with assemblies, as the further up the product structure the SSC sits, the fewer instances of it there are, and therefore the lower the certainty with which statistical models can be constructed.

Fuselage SSCs have been chosen in this instance to present the potential, and the limitations, of the approach. Fuselage assemblies have been chosen primarily due to the availability of suitable data at BAB. In all, 67 sub-assemblies were considered, all of which relate specifically to regional jets. The 67 assemblies are from 3 derivative regional aircraft programs and are made up of the following SSCs: skin panels – 19 sub-assemblies; floor sections – 4 floor sub-assemblies; longitudinal joints – 17 longitudinal joint assemblies; circumferential joints – 5 circumferential joint assemblies; door surrounds – 7 door surrounds; major load bearing structures – 13 different load bearing structures; door structures – 2. As skin panels offer the greatest quantity of data points, they will be used to demonstrate the regression models. All panels considered are metallic and mechanically fastened.

Canvassing the observations of the domain experts, it was observed that the total number of fasteners in a panel would potentially correlate with (1) the total surface area of the panel and (2) the total part count in the assembly. Both of these attributes will be known at the concept design stage and both can be determined from the BOM.

Table 1. Correlation matrix for skin panel fastener counts and known assembly attributes

                               Surface Area   Rivets   Bolts   Total Part Count   Total
                               (sq inches)                     in Assembly        Fasteners
Surface Area (sq inches)           1.00
Rivets                             0.78         1.00
Bolts                              0.10        -0.12     1.00
Total Part Count in Assembly       0.64         0.81    -0.28        1.00
Total Fasteners                    0.81         0.98     0.06        0.76             1.00

To validate these observations a correlation analysis was carried out, the results of which are summarized in Table 1. The high degree of co-dependence between surface area and part count precludes the use of both of these attributes in the same relationship; therefore relationships for each must be developed independently. Carrying out linear regression analysis, in the first instance based on the surface area of the skin panels, and secondly on the number of parts, gives the graphs displayed in Fig. 8 and 9.
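The regression step itself is straightforward; the sketch below fits total fastener count against one attribute at a time and reports R², using invented placeholder arrays rather than the BAB dataset:

```python
# Sketch of the single-attribute regression: least-squares fit of total fastener
# count against surface area, then against part count, with R^2 reported.
# The data arrays are invented placeholders, not the BAB skin panel data.
import numpy as np

area = np.array([4000.0, 8000.0, 12000.0, 18000.0, 25000.0])   # sq inches, illustrative
parts = np.array([60, 110, 150, 220, 300])                      # total part count, illustrative
fasteners = np.array([1200, 2300, 3100, 4800, 6600])            # total fasteners, illustrative

def fit_and_r2(x: np.ndarray, y: np.ndarray):
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return slope, intercept, r2

for label, x in (("area", area), ("part count", parts)):
    m, c, r2 = fit_and_r2(x, fasteners)
    print(f"fasteners ~= {m:.3f} * {label} + {c:.1f}  (R^2 = {r2:.2f})")
```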


Figure 8. Total fastener count vs. skin panel area for BAB regional jet skin panel assemblies

Figure 9. Total fastener count vs. total part count for BAB regional jet skin panel assemblies

The R-squared values for the relations of total fastener count with skin panel area and with total part count are 0.65 and 0.58 respectively. It is noted that rivets are much more prevalent than the lightweight structural bolts, and both of these figures could be improved marginally by considering the relationship of both variables with rivets alone.

A more detailed breakdown of the composition of skin panels could potentially offer better results. For example, consider the constituent parts when grouped by their fabrication centre. The observation is made that machined parts, being thicker than sheet metal and extruded parts, are more likely to require the use of bolts as opposed to rivets. Multiple linear regression on this basis resulted in an adjusted R-squared value of 0.62, which is no improvement over that already considered. Indeed, the difficulty with using this particular set of variables is the limited number of data points and the larger number of variables, thus calling into question the validity of the derived relationship.

Also consider the breakdown of parts by the size categories defined previously, where the assumption in this instance is that larger parts are more likely to require a greater quantity of fasteners. The R-squared value is then improved to 0.82, the problem again being one regarding the credibility of the relationship, since there are 5 independent variables but only 19 data points. Fig. 10 displays the error associated with the fastener estimates for each of the skin panels.

[Figure 8 data: linear fit y = 0.2605x + 2.1361, R2 = 0.65; axes: Skin Panel Area (sq inches) vs. Total Fastener Count.]

[Figure 9 data: linear fit y = 22.496x + 1048.2, R2 = 0.5833; axes: Total Part Count vs. Total Fastener Count.]


Figure 10. Percentage Error for the skin panel assemblies of the various statistical estimating techniques.

The fundamental issue with regard to using a statistically based approach is that it is not fully representative of the product being developed. The assumption is made that the new product is similar to all current products. For example, consider the data set used for the development of the previous relations; included within the dataset are three skin panels that are significantly different from the other panels, namely panels 14, 15 and 16. These panels are located in the fuselage centre section just above the wing pick-up points. They have significantly large cut-outs for emergency exit doors, and included in their structure are three heavy G frames that are used to attach the fuselage to the wings. As these entities carry high loads they are fastened to the skin panel by means of lightweight bolts as opposed to rivets. Consequently, for these centre panels the lightweight bolts equate to about 30% of the total fastener count, whilst for the other panels the proportion is less than 2%. Their inclusion within the skin panel family of parts therefore skews the relationships developed for that family, consequently reducing the accuracy of the estimating relation. It is observed in Fig. 10 that consistently the most accurate technique is that of the regression based on the size groupings. As expected, larger parts will require a larger number of rivets. This relationship is only appropriate, though, where the size category is based on the same dimension along which parts are to be joined, in this instance the part length. Consider the assembly of a skin panel, where the majority of fasteners are consumed by the attachment of the frames and stringers to the skins; both of these parts are fastened along their length. On another type of SSC, such as a floor structure, where parts are not joined along their length but at their end points, the relationship is likely to be less obvious and it is therefore more difficult to generate statistically based relations.

Similarly, the issue regarding a purely statistically based approach is that, although it offers the ability to develop fastener estimation models in a relatively quick manner, the accuracy and representativeness of these models may be questionable, and they are really only applicable when sufficient data exist with regard to past products, which, for the majority of SSCs, is seldom the case.

Figure 11a and 11b. Total parts broken down for new skin panel assembly & Mean Manufacturing hours broken up by Manufacturing Centre

[Figure 10 data: percentage error per skin panel (1-19) for estimates of total fasteners based on Part Count, Panel Surface Area, constituent part type, and Size Groupings.]

[Figure 11 categories: Assembly, Hawlmark, Machine Shop, Systems, Composite, C04, Made by Centre not identified.]


V. Full (Use-Case) Validation for a New Aircraft Fuselage

A validation exercise was carried out for a new fuselage skin panel assembly, which originated from a new regional aircraft. The skin panel BOM, as provided by the design engineers, contained 51 different line items. Again, the majority of parts are sheet metal details, fabricated in the sheet metal manufacturing facility. Fig. 11a and 11b show a breakdown of the part count and mean manufacturing hours respectively by manufacturing centre. Again, the largest amount of manufacturing time (45%) is consumed by the assembly of the product, whilst, although 87% of the product's parts are sheet metal, these account for 42% of the manufacturing time. The one chemi-milled part, the fuselage panel skin, contributes 10% of the total manufacturing time. Fig. 12 displays the spread of the total manufacturing estimate and the risk associated with each value within the range. The 10%, 50% and 90% figures for the total manufacturing time estimate are 100, 110 and 119 hours respectively. Due to the large number of parts to be manufactured in the sheet metal facility, it is appropriate to carry out a Monte Carlo simulation to determine the spread of the total hours for that facility.

Figure 12. Uncertainty Analysis for new skin panel for all up manufacturing time

Figure 13. Sheet metal manufacturing hours for new skin panel assembly

This is displayed in Fig. 13. For the sheet metal manufacturing centre, the 10%, 50%, and 90% figures for the amount of work for which it will be responsible within this product are 37, 45, and 55 hours respectively. If the sheet metal fabrication manager were being prudent when providing a quote on the amount of work for which they would be responsible with regard to the product manufacture, they could, for example, quote a figure as high as 54 hours, which would be associated with an 80% probability that it would not be exceeded. This illustrates how the approach can provide greater estimate understanding and indicates how estimates can be managed, not just with regard to the product design but also for manufacturing operations. Fig. 14 displays a sensitivity analysis for the assembly, again based on a generalised +/-20% change in the product dimensional attributes. It is noted that the parts that require control during the design activity with regard to their overall dimensions are the assembly, the chemi-milled skin, the reinforcing frame angles and two machined fittings.

[Figure 14 data: percentage change in total product standard time due to a +/-20% change in dimensions, per line item/part (1-51); most sensitive items: Panel Assembly, Skin, Reinforcing Frame Angles, Machined Fittings.]

Figure 14. Sensitivity Analysis for new skin panel assembly

In this instance it was requested that the assembly estimate be broken down and displayed graphically based on the skill/resource required, as shown in Fig. 15, as each of these resources has a different cost rate for its time. Consequently, this breakdown offers greater insight with regard to the cost of the product, which in turn acts to facilitate greater cost control. These estimates of cost contingency are being used by the industrial collaborator and therefore we have validation at an expert level, while the design of the new fuselage panel is so similar to the historical cases used in the genetic-causal cost modelling that we can assume a similar level of accuracy to that presented for the historical test case. However, one of the main propositions in this paper is that any (new) cost estimates generated are by nature not definitive, and that rather one should be aiming to estimate the cost contingency. It is a case of estimating what the new product could cost in order to establish a rational cost target that can facilitate cost management, as illustrated in Fig. 1.
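A short sketch of the resource-rate breakdown discussed above (the hours and labour rates are illustrative assumptions, not the collaborator's figures):

```python
# Assembly std hours split by skill/resource and costed at different labour
# rates, as described in the text. All numbers are illustrative assumptions.
hours_by_resource = {"FITTER": 28.0, "RIVETTER": 12.0, "AUTORIVET": 5.0}   # std hrs
rate_by_resource = {"FITTER": 65.0, "RIVETTER": 55.0, "AUTORIVET": 90.0}   # cost per hr

cost = sum(hours_by_resource[r] * rate_by_resource[r] for r in hours_by_resource)
print(f"assembly labour cost: {cost:.2f}")
```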

Figure 15. Breakdown of mean assembly hours by resource for new skin panels

[Figure 15 resources: FITTER STD HRS, RIVETTER STD HRS, AUTORIVET STD HRS.]


VI. Discussion and Conclusions

The main research aim was stated as: 'To develop and validate an improved approach to estimating and managing product manufacturing cost from the earliest stages of the development cycle'. Subsequently, the ultimate Research Question was stated as: 'Can the Genetic-Causal Cost Theory and Lean Thinking be integrated into a KBE architectural framework in order to provide an applied cost contingency management capability that ensures the provision of sufficient cost information, so as to improve the management and control of product development costs?'. In addressing the main research themes, one of the key theoretical contributions of the paper is the implementation and validation of the use of the Genetic-Causal principle in developing a lean manufacturing cost contingency methodology. Manufacturing cost models are established according to the principle of 'causality', which establishes which cost drivers should be used to generate the cost relations associated with the cost breakdown structure elements for various manufacturing processes. However, the 'genetic' principle is imposed through the hypothesis that the estimated costs of a new product will be similar to a company's historical cost performance, relative to process selection, product definition and lean manufacturing capability. The latter use of the 'genetic' principle then presupposes that any new cost estimates generated are by nature not definitive, and that rather one should be aiming to estimate the cost contingency. Consequently, the paper goes on to develop this hypothesis through an aircraft fuselage use-case study to show how the lean concept of value-added and non-value-added work content is consistent with this approach and indeed leads to the establishment of the cost uncertainty and sensitivity analysis.

The paper advocates the formalization of an initially heuristic approach, and so it can be described inherently as a Knowledge Based Engineering (KBE) approach: capturing, structuring and formalizing manufacturing cost knowledge for application through knowledge bases and rules. The KBE framework has provided a recurring material and work content estimation methodology that can be used from the earliest stage, when design definition first becomes available. It has used the concepts of Genetic-Causal Costing along with Lean Thinking to address the value-added and non-value-added elements in cost contingency management. In addition, a typical KBE-driven product-level costing architecture was utilized in automating the approach for the collaborating company. This included sensitivity and uncertainty modeling to highlight the essence of the cost control that was investigated against a new product development programme from which a use-case study was taken. The use-case study estimated the rolled-up assembly time for a thin-walled stiffened structure to be within ±7% of the manual estimation. A rolled-up accuracy of ±10% was indicative in terms of total part and assembly cost, although it was evident that some of the individual cost deltas deviated more significantly, tending to cancel each other out in the roll-up. However, one of the main propositions established is that any (new) cost estimates generated are by nature not definitive, and that rather one should be aiming to estimate the target cost and contingency, and then control the development costs according to that range. It is a case of estimating what the new product is likely to cost and then using the cost contingency targets to facilitate cost management and control. The collaborating company have adopted the tool and indeed even preferred to use its results on the use-case study over and above those of the manual estimation.

Therefore, the presented work has provided a validated method of robust manufacturing cost contingency management. A truly concurrent engineering process has been established by capturing a mixture of knowledge from the design, manufacturing and procurement functional areas, and putting it in a form that can be automatically accessed with no specialist manufacturing knowledge, thus enabling the company to quickly estimate their designs based on actual company manufacturing capability. Moreover, the system developed for the industrial collaborator has increased the speed and reliability of the cost modeling while also improving understanding for cost contingency management and integration.

Acknowledgements

The research work that forms the basis for this paper was carried out by Dr. Mark Gilmour as part of his PhD studies (under Prof. Curran's academic supervision) and was funded by Bombardier Aerospace Belfast. Dr. Gilmour carried out all the basic work and was greatly assisted by many staff at Bombardier Aerospace Belfast (BAB), in addition to Peter Kelly, who managed the project, and Conor McAlleenan, who was the day-to-day supervisor and industrial mentor. Furthermore, a great debt of thanks goes to Bombardier for 'housing and adopting' Dr. Gilmour on-site for much of his PhD time, and for embracing the research concept that was entirely co-developed (in a quite exemplary university-industry collaboration) and subsequently implemented within the Methods Function at BAB.

References

[1]. Watson, P., Curran, R., Murphy, A. and Cowan, S. (2006). "Cost Estimation of Machined Parts within an Aerospace Supply Chain", Journal of Concurrent Engineering, Vol. 14, No. 1, pp. 17-26.


[2]. Jiao, J. and Tseng, M. (1999). "A pragmatic approach to product costing based on standard time estimation", International Journal of Operations and Management, Vol. 19, No. 7, pp. 738-755.
[3]. Bode, J. (1998). "Decision support with neural networks in the management of research and development: concepts and application to cost estimation", Information & Management, Vol. 34, pp. 33-40.
[4]. Curran, R., Rush, C., Roy, R. and Raghunathan, S. (2002). "Current Cost Estimating Practice in Aerospace", Proceedings of Concurrent Engineering: Research and Applications (CE2002), pp. 894-902.
[5]. Brinke, E.T. (2002). "Costing Support and Cost Control in Manufacturing: A Cost Estimation Tool applied in the sheet metal domain", PhD Thesis, PrintPartners Ipskamp, Enschede, The Netherlands, ISBN 90-365-1726-5.
[6]. Cooper, R. and Kaplan, R.S. (1991). The Design of Cost Management Systems: Text, Cases and Readings, Prentice-Hall, New Jersey.
[7]. Sheldon, D.F., Huang, G.Q. and Perks, K. (1991). "Design for Cost: Past Experience and Recent Development", Journal of Engineering Design, Vol. 2, No. 2, pp. 127-139.
[8]. Singh, V. (1997). "Systems Integration – Coping with Legacy Systems", 8(1), pp. 24-28.
[9]. Feigenbaum, A.V. (1983). Total Quality Control, McGraw-Hill, New York.
[10]. Meyer, C. (1993). Fast Cycle Time: How to Align Purpose, Strategy and Structure for Speed, Free Press, New York.
[11]. Hammer, M. and Champy, J. (1995). Reengineering the Corporation, Nicholas Brealey Publishing, London.
[12]. Suri, R. (1998). Quick Response Manufacturing: A Companywide Approach to Reducing Leadtimes, Productivity Press, Portland, Oregon.
[13]. Towill, D.R. (1999). "Management theory: Is it of any practical use? Or, how does a fad become a paradigm?", Engineering Management Journal, June, pp. 111-122.
[14]. Lee and Olds, "Integration of Cost Modeling and Business Simulation into Conceptual Launch Vehicle Design", Paper No. AIAA 97-3911.
[15]. Curran, R., Kundu, A.K., Wright, J.M., Crosby, S., Price, M., Raghunathan, S. and Benard, E. (2006). "Modeling of aircraft manufacturing cost at the concept stage", The International Journal of Advanced Manufacturing Technology, Feb 2006, pp. 1-14.
[16]. Curran, R., Price, M., Raghunathan, S., Benard, E., Crosby, S., Castagne, S. and Mawhinney, P. (2005). "Integrating Aircraft Cost Modeling into Conceptual Design", Concurrent Engineering: Research and Applications, Sage Publications, Vol. 13, No. 4, pp. 321-330, ISSN 1063-293X.
[17]. Curran, R., Kundu, A., Raghunathan, S. and Eakin, D. (2001). "Costing Tools for Decision Making within Integrated Aerospace Design", Journal of Concurrent Engineering Research, 9(4), pp. 327-338.
[18]. Curran, R., Raghunathan, S. and Price, M. (2004). "A Review of Aircraft Cost Modeling and Genetic Causal Approach", Progress in Aerospace Sciences, Vol. 40, No. 8, pp. 487-534.
[19]. Curran, R., Rothwell, A. and Castagne, S. (2006). "Numerical Method for Cost-Weight Optimisation of Stringer-Skin Panels", AIAA Journal of Aircraft, Vol. 43, No. 1, pp. 264-274.
[20]. Curran, R., Price, M., Raghunathan, S., Benard, E., Crosby, S., Castagne, S. and Mawhinney, P. (2005). "Integrating Aircraft Cost Modelling into Conceptual Design", Concurrent Engineering: Research and Applications, Sage Publications, Vol. 13, No. 4, pp. 321-330, ISSN 1063-293X.
[21]. Curran, R., Kundu, A.K., Wright, J.M., Crosby, S., Price, M., Raghunathan, S. and Benard, E. (2006). "Modelling of aircraft manufacturing cost at the concept stage", The International Journal of Advanced Manufacturing Technology, Vol. 31, No. 3-4, pp. 407-420.
[22]. Curran, R., Castagne, S., Early, J., Price, M., Raghunathan, S., Butterfield, J. and Gibson, A. (2007). "Aircraft Cost Modelling using the Genetic Causal Technique within a Systems Engineering Approach", The Aeronautical Journal, Royal Aeronautical Society.
[23]. Curran, R., Castagne, S., Rothwell, A., Price, M. and Murphy, A. (2009). "Uncertainty and Sensitivity Analysis in the Design Optimization of Operating Cost Relative to Manufacturing Cost and Structural Integration", AIAA Journal of Aircraft, resubmitted after minor review modifications.
[24]. Curran, R. and Watson, P. (2008). "Cost CENTRE-ing: An Agile Cost Estimating Methodology for Procurement", Proceedings of the 15th International Conference on Concurrent Engineering, Springer, ISBN 978-1-84800-971-4, pp. 89-112.
[25]. Curran, R., Kundu, A., Raghunathan, S. and McFadden, R. (2002). "Impact of Aerodynamic Surface Tolerance on Aircraft Cost Driver", Journal of Aerospace Engineering, 216(G1), pp. 29-39.
[26]. Curran, R., Kundu, A.K., Raghunathan, S. and McFadden, R. (2003). "Influence of Manufacturing Tolerance on Aircraft Direct Operating Cost (DOC)", Journal of Materials Processing Technology, Elsevier Science BV, ISSN 0924-0136, pp. 239-247.
[27]. Ammar-Khodja, S. and Bernard, A. (2008). "An overview on knowledge management", in A. Bernard and S. Tichkiewitch (eds.), Methods and Tools for Effective Knowledge Life-Cycle Management, pp. 3-21, Springer, Berlin.


[28]. Hackbarth, G. (1998). "The impact of organizational memory on IT systems", in E. Hoadley and I. Benbasat (eds.), Proceedings of the Fourth Americas Conference on Information Systems, pp. 588-590.
[29]. Van der Laan, A.H. and Van Tooren, M.L.J. (2005). "Parametric modeling of moveables for structural analysis", Journal of Aircraft, Vol. 42, No. 6, pp. 1606-1614.
[30]. Alavi, M. and Leidner, D.E. (2001). "Review: Knowledge management and knowledge management systems: Conceptual foundations and research issues", MIS Quarterly, Vol. 25, No. 1, pp. 107-136.
[31]. Haque, B.U., Belecheanu, R.A., Barson, R.J. and Pawar, K.S. (2000). "Towards the application of case based reasoning to decision-making in concurrent product development (concurrent engineering)", Knowledge-Based Systems, Vol. 13, pp. 101-112.
[32]. Kitamura, Y., Kashiwase, M., Fuse, M. and Mizoguchi, R. (2004). "Deployment of an ontological framework of functional design knowledge", Advanced Engineering Informatics, Vol. 18, No. 2, pp. 115-127.
[33]. Er, A. and Dias, R. (2000). "A rule-based expert system approach to process selection for cast components", Knowledge-Based Systems, Vol. 13, pp. 225-234.
[34]. Mitchell, T., Buchanan, B., DeJong, G., Dietterich, T., Rosenbloom, P. and Waibel, A. (1990). "Machine Learning", Annual Review of Computer Science, Vol. 4, pp. 417-433.
[35]. La Rocca, G. (2009). "KBE Techniques to support Aircraft Multi-Disciplinary Design and Optimization", unpublished doctoral dissertation, Delft, The Netherlands.
[36]. Butterfield, J., Curran, R., Watson, G., Craig, C., Raghunathan, S., Collins, R., Edgar, T., Higgins, C., Burke, R., Kelly, P. and Gibson, C. (2007). "Use of Digital Manufacturing to Improve Operator Learning in Aerospace Assembly", 7th AIAA Aviation Technology, Integration and Operations Conference (ATIO) / 2nd Centre of Excellence for Integrated Aircraft Technologies (CEIAT) International Conference on Innovation and Integration in Aerospace Sciences, Hastings Europa Hotel, Belfast, Northern Ireland, 18-20 September 2007.

