Barry Boehm, "Managing Software Productivity and Reuse," Computer, Volume 32, Issue 9, September 1999, pp. 111-113 (pdf)
Your organization can choose from three main strategies for improving its software productivity. You can work faster, using tools that automate or speed up previously labor-intensive tasks. You can work smarter, primarily through process improvements that avoid or reduce non-value-adding tasks. Or you can avoid unnecessary work by reusing software artifacts instead of custom-developing them for each project. Which strategy will produce the highest payoff?
Added October 10th, 2000
Barry Boehm, Chris Abts, "COTS Integration: Plug and Pray?" Computer, Volume 32, Number 1, January 1999, pp. 135-138 (pdf)
For most software applications, the use of commercial off-the-shelf products has become an economic necessity. Gone are the days when downsized industry and government information technology organizations had the luxury of trying to develop—and at greater expense, maintain—their own database, network, and user-interface management infrastructure. Viable COTS products are climbing up the protocol stack, from infrastructure into application solutions in such areas as office and management support, electronic commerce, finance, logistics, manufacturing, law, and medicine. For small and large commercial companies, time-to-market considerations also exert a strong pressure toward COTS-based solutions.
Added October 10th, 2000
Nenad Medvidovic, David S. Rosenblum, Jason E. Robbins, David F. Redmiles, "Modeling Software Architectures in the Unified Modeling Language," IEEE Computer, January 1999 (pdf)
The Unified Modeling Language (UML) is a family of design notations that is rapidly becoming a de facto standard software design language. UML provides a variety of useful capabilities to the software designer, including multiple, interrelated design views, a semi-formal semantics expressed as a UML meta model, and an associated language for expressing formal logic constraints on design elements. However, UML currently lacks support for capturing and exploiting certain architectural concerns whose importance has been demonstrated through the research and practice of software architectures. In particular, UML lacks direct support for modeling and exploiting architectural styles, explicit software connectors, and local and global architectural constraints. This paper presents two strategies for supporting such architectural concerns within UML. One strategy involves using UML “as is,” while the other incorporates useful features of existing architecture description languages (ADLs) as UML extensions. We discuss the applicability, strengths, and weaknesses of the two strategies. The strategies are applied to three ADLs that, as a whole, represent a broad cross-section of present-day ADL capabilities.
Added November 13th, 2000
Sunita Chulani, Barry Boehm, Bert Steece, "Calibrating Software Cost Models Using Bayesian Analysis," IEEE Transactions on Software Engineering, July-August 1999, pp. 573-583 (pdf)
The COCOMO II.1997 software cost estimation model was originally formulated using behavioral analyses and an expert-judgement Delphi process to determine initial values of its cost drivers and scale factors. Using a multiple regression analysis approach, we then calibrated the model on a dataset consisting of 83 projects. The regression analysis produced results that occasionally contradicted the expert-judgement results: e.g., making a product more reusable caused it to be less expensive rather than more expensive to develop. These counterintuitive results arose because the COCOMO II database violated, to some extent, the following restrictions imposed by multiple linear regression [Briand92, Chulani98]:
(i) the number of datapoints should be large relative to the number of model parameters (i.e., there are many degrees of freedom). Unfortunately, collecting data has been, and continues to be, one of the biggest challenges in the software estimation field, caused primarily by immature processes and management reluctance to release cost-related data.
(ii) no data items are missing. Data frequently contains missing information because the data collection activity has a limited budget or because of a lack of understanding of the data being reported.
(iii) there are no outliers. Extreme cases frequently occur in software engineering data because of a lack of precision in the data collection process.
(iv) the predictor variables (cost drivers and scale factors) are not highly correlated. Unfortunately, because cost data is historically rather than experimentally collected, correlations among the predictor variables are unavoidable.
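The consequences of violating these restrictions can be sketched in a few lines of Python on synthetic data. This is purely illustrative (it is not the COCOMO II dataset, and the driver names and numbers are invented); it shows how a small sample with highly correlated predictors yields an ill-conditioned regression whose coefficient estimates can contradict expert judgement:

```python
import numpy as np

# Illustrative sketch: calibrate a log-linear cost model
#   log(PM) = b0 + b1*log(KSLOC) + c*log(REUSE)
# by ordinary least squares on a small, collinear synthetic sample.
rng = np.random.default_rng(0)

n = 12                                # few datapoints relative to parameters
log_size = rng.uniform(1.0, 4.0, n)   # hypothetical log(KSLOC) values
# A "reuse" driver highly correlated with size (cost data is historical,
# not experimental, so such correlations are unavoidable)
log_reuse = 0.9 * log_size + rng.normal(0.0, 0.05, n)
# True generating process: reuse makes development slightly MORE expensive
log_pm = 1.0 + 1.1 * log_size + 0.2 * log_reuse + rng.normal(0.0, 0.3, n)

X = np.column_stack([np.ones(n), log_size, log_reuse])
coef, *_ = np.linalg.lstsq(X, log_pm, rcond=None)

# Near-collinear columns make the design matrix ill-conditioned, so the
# fitted coefficients are unstable and can even flip sign.
print("condition number:", round(np.linalg.cond(X), 1))
print("fitted coefficients (b0, b1, c):", np.round(coef, 3))
```

With so few datapoints and near-collinear predictors, small perturbations in the data swing the fitted coefficients widely; this is the instability that motivates combining the regression estimates with expert judgement.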
Barry Boehm, Dan Port, Alexander Egyed, Marwan Abi-Antoun, "The MBASE Life Cycle Architecture Milestone Package: No Architecture Is An Island," WICSA '99 (pdf)
This paper summarizes the primary criteria for evaluating software/system architectures in terms of key system stakeholders’ concerns. It describes the Model Based Architecting and Software Engineering (MBASE) approach for concurrent definition of a system’s architecture, requirements, operational concept, prototypes, and life cycle plans. It summarizes our experiences in using and refining the MBASE approach on 31 digital library projects. It concludes that a Feasibility Rationale demonstrating consistency and feasibility of the various specifications and plans is an essential part of the architecture’s definition, and presents the current MBASE annotated outline and guidelines for developing such a Feasibility Rationale.
Added August 17th, 1998
Barry Boehm, Dan Port, "Conceptual Modeling Challenges for Model-Based Architecting and Software Engineering (MBASE)," Proceedings, Conceptual Modeling Symposium (pdf)
The difference between failure and success in developing a software-intensive system can often be traced to the presence or absence of clashes among the models used to define the system’s product, process, property, and success characteristics. (Here, we use a simplified version of one of Webster’s definitions of “model”: a description or analogy used to help visualize something. We include analysis as a form of visualization.)
Section 2 of this paper introduces the concept of model clashes, and provides examples of common clashes for each combination of product, process, property, and success models. Section 3 introduces the Model-Based Architecting and Software Engineering (MBASE) approach for endowing a software project with a mutually supportive base of models. Section 4 presents examples of applying the MBASE approach to a family of digital library projects. Section 5 summarizes the main conceptual modeling challenges involved in the MBASE approach, including integration of multiple product views and integration of various classes of product, process, property, and success models. Section 6 summarizes current conclusions and future prospects.
Added August 20th, 1998
Philippe Kruchten, Alexander Egyed, "Rose/Architect: A Tool to Visualize Architecture," HICSS, January 1999 (pdf)
Rational Rose is a graphical software modeling tool, using the Unified Modeling Language (UML) as its primary notation. It offers an open API that allows the development of additional functionality (“add-ins”). In this paper, we describe Rose/Architect, a Rose™ “add-in” used to visualize architecturally-significant elements in a system’s design, developed jointly by University of Southern California (USC) and Rational Software. Rose/Architect can be used in forward engineering, marking architecturally significant elements as they are designed and extracting architectural views as necessary. But it can be even more valuable in reverse engineering, i.e., extracting missing key architectural information from a complex model. This model may have been reverse-engineered from source code using the Rose reverse engineering capability.
Added September 24th, 1998
Sunita Chulani, Bert Steece, "A Bayesian Software Estimating Model Using a Generalized g-Prior Approach," IEEE Transactions on Software Engineering, Special Issue on Empirical Methods in Software Engineering, July-August 1999, pp. 573-583 (pdf)
Soon after the initial publication of the COCOMO II model, the Center for Software Engineering (CSE) began an effort to empirically validate COCOMO II. By January 1997, they had a dataset consisting of 83 completed projects collected from several Commercial, Aerospace, Government and FFRDC organizations. CSE used this dataset to calibrate the COCOMO II.1997 model parameters. Because of uncertainties in the data and/or respondents' misinterpretations of the rating scales, CSE developed a pragmatic calibration procedure for combining sample estimates with expert judgement. Specifically, this calibration of the COCOMO II.1997 parameters assigned a 10% weight to the regression estimates while expert-judgement estimates received a weight of 90%. This calibration procedure yielded effort predictions within 30% of the actuals 52% of the time.
CSE continued the data collection effort and the database grew from 83 datapoints in 1997 to 161 datapoints in 1998. Using this data and a Bayesian approach that can assign differential weights to the parameters based on the precision of the data, we provide an alternative calibration of COCOMO II. Intuitively, we prefer this approach to the uniform 10% weighted average approach described above because some of the effort multipliers and scale factors are more clearly understood than others. The sample information for well-defined cost drivers receives a higher weight than that given to the less precise cost drivers. This calibration procedure yielded significantly better predictions; that is our version of COCOMO II gives effort predictions within 30% of the actuals 76% of the time. The reader should note that these predictions are based on out-of-sample data (projects) as described in the 'Cross Validation' section (i.e. section 5).
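The two mechanisms described above can be sketched as follows: a precision-weighted combination of an expert prior with a regression estimate, and the PRED(30) accuracy measure. This is a hedged illustration; the `combine` and `pred` helpers and all numbers are hypothetical, not the published COCOMO II calibration:

```python
import numpy as np

def combine(expert_mean, expert_var, sample_mean, sample_var):
    """Precision-weighted combination of an expert prior with a regression
    (sample) estimate: the more precise source gets the larger weight."""
    w_expert = (1.0 / expert_var) / (1.0 / expert_var + 1.0 / sample_var)
    return w_expert * expert_mean + (1.0 - w_expert) * sample_mean

def pred(actuals, estimates, level=0.30):
    """PRED(30): fraction of projects whose estimate is within 30% of the actual."""
    actuals = np.asarray(actuals, dtype=float)
    estimates = np.asarray(estimates, dtype=float)
    rel_err = np.abs(estimates - actuals) / actuals
    return float(np.mean(rel_err <= level))

# A well-understood driver (tight sample variance) leans toward the data;
# a noisy one would lean toward the expert prior.
print(combine(expert_mean=1.10, expert_var=0.04, sample_mean=0.90, sample_var=0.01))
# Hypothetical actual vs. estimated effort for four projects
print(pred([100, 200, 300, 400], [120, 190, 450, 410]))
```

Note the contrast with a fixed 10%/90% split: here the weight varies per driver with the precision of its sample information, which is the intuition behind the Bayesian calibration.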
Added September 24th, 1998
Donald J. Reifer, Barry Boehm, Sunita Chulani, "The Rosetta Stone: Making COCOMO 81 Files Work With COCOMO II," CrossTalk, February 1999, pp. 11-15 (pdf)
As part of our efforts to help COCOMO users, we, the COCOMO research team at the Center for Software Engineering at the University of Southern California (USC), have developed the Rosetta Stone for converting COCOMO 81 files to run using the new COCOMO II software cost estimating model. The Rosetta Stone is very important because it allows users to update estimates made with the earlier version of the model so that they can take full advantage of the many new features incorporated into the COCOMO II package. This paper describes both the Rosetta Stone and guidelines for making the job of conversion easy.
Added September 24th, 1998
Barry Boehm, Dan Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999, pp. 36-48 (pdf)
"No scene from prehistory is quite so vivid as that of the mortal struggles of great beasts in the tar pits... Large system programming has over the past decade been such a tar pit, and many great and powerful beasts have thrashed violently in it... Everyone seems to have been surprised by the stickiness of the problem, and it is hard to discern the nature of it. But we must try to understand it if we are to solve it." Fred Brooks, 1975
Several recent books and reports have confirmed that the software tar pit is at least as hazardous today as it was in 1975. Our research into several classes of models used to guide software development (product models, process models, property models, success models), has convinced us that the concept of model clashes among these classes of models helps explain much of the stickiness of the software tar-pit problem.
We have been developing and experimentally evolving an approach called MBASE -- Model-Based (System) Architecting and Software Engineering -- which helps identify and avoid software model clashes. Section 2 of this paper introduces the concept of model clashes, and provides examples of common clashes for each combination of product, process, property, and success model. Sections 3 and 4 introduce the MBASE approach for endowing a software project with a mutually supportive set of models, and illustrate the application of MBASE to an example corporate resource scheduling system. Section 5 summarizes the results of applying the MBASE approach to a family of small digital library projects. Section 6 presents conclusions to date.
Added September 24th, 1998
Barry Boehm, Marwan Abi-Antoun, Dan Port, Julie Kwan, Anne Lynch, "Requirements Engineering, Expectations Management, and The Two Cultures," Proceedings of the 4th IEEE International Symposium on Requirements Engineering, June 1999, pp. 14-22 (pdf)
In his seminal work, The Two Cultures, C.P. Snow found that science and technology policymaking was extremely difficult because it required the combined expertise of both scientists and politicians, whose two cultures had little understanding of each other's principles and practices [Snow, 1959].
During the last three years, we have conducted over 50 real-client requirements negotiations for digital library applications projects. These negotiations largely involve professional librarians as clients and five- to six-person teams of computer science MS-degree students as developers. We have found that this two-cultures problem is one of the most difficult challenges to overcome in determining a feasible and mutually satisfactory set of requirements for these applications.
During the last year, we have been experimenting with expectations management and domain-specific lists of "simplifiers and complicators" as a way to address the two-cultures problem for software requirements within the overall digital library domain. Section 2 of this paper provides overall motivation and context for addressing the two-cultures problem and expectations management as significant opportunity areas in requirements engineering. Section 3 discusses the digital library domain and our stakeholder Win-Win and Model-Based (System) Architecting and Software Engineering (MBASE) approach as applied to digital library projects. Section 4 discusses our need for better expectations management in determining the requirements for the digital library projects and products over the first two years, and describes our approach in year 3 to address the two-cultures problem via expectations management. Section 5 summarizes results to date and future prospects.
Added September 24th, 1998
Barry Boehm, Kevin J. Sullivan, "Software Economics: Status and Prospects," Information and Software Technology, Volume 41, Number 14, November 1999, pp. 937-946 (pdf)
Software is valuable when it produces information in a manner that enables people and systems to meet their objectives more effectively. Software engineering techniques have value when they enable software developers to build more valuable software. Software economics is the sub-field of software engineering that seeks improvements which enable software engineers to reason more effectively about important economic aspects of software development, including cost, benefit, risk, opportunity, uncertainty, incomplete knowledge and the value of additional information, implications of competition, and so forth. In this paper, we survey the current status of selected parts of software economics, highlighting the gaps both between practice and theory and between our current understanding and what is needed.
Added July 18th, 2008
Barry Boehm, Hoh Peter In, "Cost vs. Quality Requirements: Conflict Analysis and Negotiation Aids," Software Quality Professional, Volume 1, Number 2, March 1999, pp. 38-50 (pdf)
The process of resolving conflicts among software quality requirements is complex and difficult because of incompatibility among stakeholders' interests and priorities, complex cost-quality requirements dependencies, and an exponentially increasing resolution option space for larger systems. This paper describes an exploratory knowledge-based tool, the Software Cost Option Strategy Tool (S-COST), which assists stakeholders in 1) surfacing appropriate resolution options for cost-quality conflicts; 2) visualizing the options; and 3) negotiating a mutually satisfactory balance of quality requirements and cost.
S-COST operates in the context of the USC-CSE WinWin system (a groupware support system for determining software and system requirements as negotiated win conditions), QARCC (Quality Attribute and Risk Conflict Consultant, a support system for identifying quality conflicts in software requirements), and COCOMO (Constructive Cost Model). Initial analysis of its capabilities indicates that its semiautomated approach provides users with improved capabilities for addressing cost-quality requirements issues.
Added November 11th, 2005
Jongmoon Baik, "The Effects of CASE Tools on Software Development Effort," Qualifying Report for partial fulfillment of Computer Science Department requirements (pdf)
It is common knowledge that software tools have played a critical role in the software engineering process by improving software quality and productivity. A huge number of CASE (Computer Aided Software Engineering) tools have been produced to assist tasks in the software development process since the end of the 1970s. Many studies in the CASE field were done in the 1980s and the early 1990s to provide more effective CASE technologies and environments. While research in this field is no longer as active, software developers use a range of CASE tools, typically assembled over time, to support tasks throughout the software process. The diversity and proliferation of software tools in the current CASE market make it difficult to understand what kinds of tasks are supported and how much effort can be reduced by using software tools in a software development process. A big challenge is to alleviate these difficulties in the software engineering community. The primary goals of this research are to establish a framework for classifying software tools according to their support in a software lifecycle, to provide tool rating scales with which software tools can be effectively evaluated, and to analyze the effect of software tools on software development effort.
Added November 9th, 1999
Alexander Egyed, Nikunj R. Mehta, Nenad Medvidovic, "Software Connectors and Refinement in Product Families," International Workshop on Software Architecture Families 2000 (pdf)
Product families promote reuse of software artifacts such as architectures, designs and implementations. Product family architectures are difficult to create due to the need to support variations. Traditional approaches emphasize the identification and description of generic components, which prove too rigid to support variations in each product. This paper presents an approach that supports analyzable family architectures using generic software connectors that provide bounded ambiguity and support flexible product families. It describes the transformation from a family architecture to a product design through a four-way refinement and evolution process.
Added November 2nd, 1999
Jung-Won Park, Dan Port, Barry Boehm, "Supporting Distributed Collaborative Prioritization," APSEC, Sixth Asia-Pacific Software Engineering Conference (APSEC'99), 1999, pp. 560 (pdf)
Software developers are seldom able to implement stakeholders' requirements fully when time and resources are limited. To solve this problem, requirements engineers together with the stakeholders must prioritize requirements. The problem is exacerbated when the stakeholders are not all in the same place and/or cannot collaborate at the same time. We have constructed a system called the Distributed Collaboration and Prioritization Tool (DCPT) to support distributed, collaborative prioritization. In this paper, we discuss the prioritization model implemented within DCPT and give examples of using the tool. We also discuss DCPT's integration with USC's WinWin requirements capture and negotiation system.
Added September 17th, 1999
Jongmoon Baik, Sunita Chulani, Ellis Horowitz, "Software Effort and Schedule Estimation Using The Constructive Cost Model: COCOMO II," submitted for ICSE 99 Informal Demo (pdf)
During development of a software product, several questions arise: How long will it take to develop? How much will it cost? How many people will be needed?
In answering these questions, several others arise: What are the risks involved if we compress the schedule by a certain fraction? Can we invest more in strategies such as tools, reuse, and process maturity and get higher productivity, quality and shorter cycle times? How can the cost and schedule be broken down by component, stage and activity?
COCOMO II facilitates the planning process by enabling one to answer the above questions using a parametric model that has been calibrated to actual completed software projects collected from Commercial, Aerospace, Government and non-profit organizations. COCOMO II consists of three submodels, Applications Composition, Early Design, and Post-Architecture, each offering increased fidelity the further along one is in the project planning and design process; however, only the Early Design and Post-Architecture models have been calibrated and implemented in the software.
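As a rough illustration of the parametric form underlying the Post-Architecture submodel, the sketch below computes effort as a size power law adjusted by effort multipliers and scale factors. The constants `A` and `B` and the sample ratings are placeholders, not an authoritative COCOMO II calibration:

```python
import math

# Sketch of the Post-Architecture effort form:
#   PM = A * Size^E * product(EM_i),  with  E = B + 0.01 * sum(SF_j)
# A, B, and the sample ratings below are illustrative placeholders.

def cocomo_effort(ksloc, effort_multipliers, scale_factors, A=2.94, B=0.91):
    """Estimated effort in person-months for a project of `ksloc` KSLOC."""
    E = B + 0.01 * sum(scale_factors)          # diseconomy-of-scale exponent
    return A * ksloc ** E * math.prod(effort_multipliers)

# Hypothetical project: 100 KSLOC, a few non-nominal cost-driver ratings,
# mid-range scale-factor ratings (all values invented for illustration)
pm = cocomo_effort(100.0,
                   effort_multipliers=[1.0, 1.17, 0.87],
                   scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68])
print("estimated person-months:", round(pm, 1))
```

Because the scale factors feed the exponent, effort grows faster than linearly with size whenever their sum pushes `E` above 1, which is how the model captures diseconomies of scale.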
Added July 14th, 1999
Barry Boehm, Dan Port, "When Models Collide: Lessons From Software System Analysis," IT Professional, Volume 1, Number 1, January-February 1999, pp. 49-56 (pdf)
This paper analyzes several classes of model clashes encountered on large, failed IT projects (e.g., Confirm, Master Net), and shows how the MBASE approach could have detected and resolved the clashes.
The first step in developing either an application or a system is to visualize it. And when you visualize a system, you can’t help but use intellectual models to reason about what you’re building and how it will behave. The model can be a pattern you follow or an analogy you use. Whatever the form, models are ubiquitous: developers use them in building a small stand-alone package or a large custom system. Customers use them to visualize what they think they’re getting from developers.
Added July 8th, 1999
Ellis Horowitz, Joo H. Lee, June Sup Lee, "WinWin: a System for Negotiating Requirements," ICSE '99 (pdf)
WinWin is a system that aids in the capture and recording of system requirements. It also assists in negotiation. The WinWin system has been available for several years and is being used by dozens of software development groups. In this presentation we will go over the capabilities of the system and discuss how it might be used on your software development project.
Added July 8th, 1999
Alexander Egyed, Barry Boehm, "Comparing Software System Requirements Negotiation Patterns," Journal for Systems Engineering, John Wiley & Sons, 1999 (pdf)
Over a period of two years, two largely independent experiments were conducted at the University of Southern California (USC). In 1995, 23 three-person teams negotiated the requirements for a hypothetical library system. Then, in 1996, 14 six-person teams negotiated the requirements for real-world digital library systems.
A number of hypotheses were created to test how more realistic software projects differ from hypothetical ones. Other hypotheses address differences in uniformity and repeatability of negotiation processes and results. The results indicate that repeatability in 1996 was even harder to achieve than in 1995. Nevertheless, this paper presents some surprising commonalities between the two years that indicate some areas of uniformity.
Specifically, we found that the more realistic projects required more time to resolve conflicts and to identify options (alternatives) than the hypothetical ones. Further, the 1996 projects created more artifacts although they exhibited less artifact interconnectivity, implying a more divide-and-conquer negotiation approach. In terms of commonalities, we found that people factors such as experience did have effects on negotiation patterns (especially in 1996), and that users and customers were most significant (in terms of artifact creation) during goal identification, whereas the developers were more significant in identifying issues (conflicts) and options. We also found that both years exhibited strange, yet similar, disproportional stakeholder participation.
Added July 7th, 1999
Barry Boehm, Alexander Egyed, "Optimizing Software Product Integrity through Life-Cycle Process Integration," Computer Standards and Interfaces, Volume 21, Issue 1, May 1999, pp. 63-75 (pdf)
Managed and optimized: these are the names of levels 4 and 5 of the Capability Maturity Model (CMM), respectively. With these names, the Software Engineering Institute (SEI) pays tribute to the fact that, once the process has been defined, higher process maturity, and with it higher product maturity, can only be achieved by improving and optimizing the life-cycle process itself.
In the last three years, we have had the opportunity to observe more than 50 software development teams planning, specifying and building library-related, real-world applications. This environment provided us with a unique way of introducing, validating and improving the life-cycle process with new principles such as the WinWin approach to software development.
This paper summarizes the lessons we have learned in our ongoing endeavor to integrate the WinWin life-cycle process. In doing so, we not only describe which techniques have proven useful in getting the developer’s task done but also give the reader some insight into how to tackle process improvement itself. As more and more companies reach CMM level 2 or higher, this task of managing and optimizing the process becomes increasingly important.
Added July 7th, 1999
Alexander Egyed, Nenad Medvidovic, "Extending Architectural Representation in UML with View Integration," UML '99 (pdf)
UML has established itself as the leading OO analysis and design methodology. Recently, it has also been increasingly used as a foundation for representing numerous (diagrammatic) views that are outside the standardized set of UML views. One example is architecture description languages (ADLs). The main advantages of representing other types of views in UML are 1) a common data model and 2) a common set of tools that can be used to manipulate that model. However, attempts at representing additional views in UML usually fall short of their full integration with existing views. Integration extends representation by also describing interactions among multiple views, thus capturing the inter-view relationships. This work describes a view integration framework and demonstrates how an architecture description language, which was previously only represented in UML, can now be fully integrated into UML.
Added May 13th, 1999
Alexander Egyed, Cristina Gacek, "Automatically Detecting Mismatches during Component-Based and Model-Based Integration," ASE '99 (pdf)
A major emphasis in software development is placed on identifying and reconciling architectural and design mismatches. Those mismatches happen during software development on two levels: while composing system components (e.g., COTS or in-house developed) and while reconciling view perspectives. Composing components into a system and 'composing' views (e.g., diagrams) into a system model are often seen as somewhat distinct aspects of software development. However, as this work shows, their approaches to detecting mismatches complement each other very well. In both cases, the composition process may result in mismatches caused by clashes between development artifacts. Our component-based integration approach is more high-level and can be used early on for risk assessment while little information is available. Model-based integration, on the other hand, needs more information to start with but is more precise and can handle large amounts of redundant information. This paper describes both integration approaches and discusses their commonalities and differences. Both integration approaches are automatable, and some tool support is already available.
Added May 13th, 1999
Alexander Egyed, "Trace Observer: A Reengineering Approach to View Integration" (pdf)
Developing software in phases (stages) using multiple views (e.g., diagrams) is a major cause of inconsistencies between and within views. Views exhibit redundancies because they repeatedly use the same modeling information to represent related information within different perspectives. As such, redundancy becomes a vital ingredient in handling complexity by allowing a complex problem (model) to be divided into smaller, comprehensible problems (closed-world assumption).
However, this type of approach comes with a price tag: redundant views must be kept consistent, and at a time when more and more development methodologies are used, this task becomes very time consuming and costly. We have therefore investigated ways to automate the identification of view mismatches, and to this end we have created a view integration framework. This paper describes this framework and shows how scenario executions and their observations can help in automating parts of that framework. Trace Observer, as this technique is called, can assist in cross-referencing (mapping) high-level model elements, and it may also be used for transforming model elements so that different types of views may interpret them.
Added May 13th, 1999
Jung-Won Park, Dan Port, Barry Boehm, Hoh Peter In, "Supporting Distributed Collaborative Prioritization for WinWin Requirements Capture and Negotiations," Proceedings of 3rd World Multiconference on Systemics, Cybernetics and Informatics (SCI'99), IIIS, Volume 2, pp. 578-584 (pdf)
One of the most common problems within a risk-driven collaborative software development effort is prioritizing items such as requirements, goals, and stakeholder win conditions. Requirements have proven particularly sticky in this regard, as they often cannot be fully implemented when time and resources are limited, introducing additional risk to the project. A practical approach to mitigating this risk, in alignment with the WinWin development approach, is to have the critical stakeholders for the project collaboratively negotiate requirements into priority bins, which are then scheduled into an appropriate incremental development life cycle.
We have constructed a system called the Distributed Collaboration and Prioritization Tool (DCPT) to assist in collaborative prioritization of development items. DCPT offers a structurally guided approach to collaborative prioritization, much in the spirit of USC's WinWin requirements capture and negotiation system. In this paper, we discuss the prioritization models implemented within DCPT via an actual prioritization of new WinWin system features. We also discuss DCPT's two-way integration with the WinWin system, some experiences using DCPT, and current research directions.
Added April 5th, 1999
Alexander Egyed, "Using Patterns to Integrate UML Views," Proceedings of the 3rd Ground Systems Architecture Workshop (GSAW'99), El Segundo, CA, March 1999 (pdf)
Patterns play a major role during system composition (synthesis) in fostering the reuse of repeatable design and architecture configurations. This paper investigates how knowledge about patterns may also be used for system analysis to verify the conceptual integrity of the system model.
To support an automated analysis process, this work introduces a view integration framework. Since each view (e.g. diagram) adds an additional perspective of the software system to the model, information from one view may be used to validate the integrity of other views. This form of integration requires a deeper understanding as to what the views mean and what information they can share (or constrain). Knowledge about patterns, both in structure and behavior, are thereby a valuable source for view integration automation.
Added April 2nd, 1999
Alexander Egyed, "Integrating Architectural Views in UML," Qualifying Report for partial fulfillment of Computer Science Department requirements (pdf)
To support the development of software products we frequently make use of general-purpose software development models and tools such as the Unified Modeling Language (UML). However, software development in general and software architecting in particular (which is the main focus of our work) require more than what those general-purpose models can provide. Architecting is about:
1) modeling the real problem adequately
2) solving the model problem
3) interpreting the model solution in the real world.
In doing so, a major emphasis is placed on mismatch identification and reconciliation within and among architectural views (such as diagrams). We often find that this latter aspect, the analysis and interpretation of (architectural) descriptions, is under-emphasized in most general-purpose languages. We architect not only because we want to build (compose) but also because we want to understand. Thus, architecting has a lot to do with analyzing and verifying the conceptual integrity, consistency, and completeness of the product model.
The emergence of the Unified Modeling Language (UML), which has become a de facto standard for OO software development, is no exception. This work describes causes of architectural mismatches in UML views and shows how integration techniques can be applied to identify and resolve them in a more automated fashion. To that end, this work introduces a view integration framework and describes its major activities – Mapping, Transformation, and Differentiation. To deal with the integration complexity and scalability of our approach, the concept of a VIR (view-independent representation) is introduced and described.
Added March 11th, 1999
Sunita Chulani, Barry Boehm, Bert Steece, "Bayesian Analysis of Empirical Software Engineering Cost Models," IEEE-TSE; Special Issue on Empirical Methods, Volume 25, Issue 4, July 1999, pp. 573-583 (pdf)
The most commonly used technique for empirical calibration of software cost models has been the classical multiple regression approach. As discussed in this paper, multiple regression imposes assumptions that are frequently violated by software engineering datasets. The source data are also generally imprecise in reporting size, effort, and cost-driver ratings, particularly across different organizations. This results in empirical models that do not perform well when used for prediction. This paper illustrates the problems faced by the multiple regression approach during the calibration of one of the popular software engineering cost models, COCOMO II. It describes the pragmatic 10% weighted-average approach used for the first publicly available calibrated version [Clark98]. It then shows how a more sophisticated Bayesian approach can alleviate some of the problems faced by multiple regression. It compares and contrasts the two empirical approaches and concludes that the Bayesian approach is better and more robust than multiple regression.
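The core of the Bayesian idea the abstract alludes to can be shown with a minimal sketch: for a single cost-driver coefficient, an expert-judgment prior and a noisy regression estimate are combined into a precision-weighted posterior, so a high-variance data estimate gets pulled toward the expert consensus. This is an illustration of the general technique, not the paper's calibration code, and all numbers below are hypothetical.

```python
def bayesian_combine(prior_mean, prior_var, data_mean, data_var):
    """Posterior mean and variance for a normal prior combined with a
    normal data-driven estimate (precision-weighted average)."""
    w_prior = 1.0 / prior_var  # precision of the expert-opinion prior
    w_data = 1.0 / data_var    # precision of the regression estimate
    post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
    post_var = 1.0 / (w_prior + w_data)
    return post_mean, post_var

# A noisy regression estimate (variance 0.04) is pulled toward a tighter
# expert prior (variance 0.01); the posterior lands much closer to the prior.
mean, var = bayesian_combine(prior_mean=1.10, prior_var=0.01,
                             data_mean=1.40, data_var=0.04)
```

The posterior variance is smaller than either input variance, reflecting that both sources of evidence contribute information.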
Added March 11th, 1999
Barry Boehm, "Making RAD Work for Your Project," extended version of IEEE Computer, Volume 32, Number 3, March 1999, pp. 113-114,117 (pdf)
A significant recent trend we have observed among our USC Center for Software Engineering's industry and government Affiliates is that reducing the schedule of a software development project was becoming considerably more important than reducing its cost. This led to an Affiliates' Workshop on Rapid Application Development (RAD) to explore its trends and issues. Some of the main things we learned at the workshop were:
There are good business reasons why software development schedule is often more important than cost.
There are various forms of RAD. None are best for all situations. Some are to be avoided in all situations.
For mainstream software development projects, we could construct a RAD Opportunity Tree which helps sort out the best RAD mixed strategy for a given situation.
Added March 8th, 1999
Alexander Egyed, "Automating Architectural View Integration in UML" (pdf)
Architecting software systems requires more than what general-purpose software development models can provide. Architecting is about modeling, solving, and interpreting, and in doing so placing a major emphasis on mismatch identification and reconciliation within and among architectural views (such as diagrams). The emergence of the Unified Modeling Language (UML), which has become a de facto standard for OO software development, is no exception. This work describes causes of architectural mismatches in UML views and shows how integration techniques can be applied to identify and resolve them in a more automated fashion.
Added March 3rd, 1999
Sunita Chulani, Barry Boehm, "Modeling Software Defect Introduction and Removal: COQUALMO (COnstructive QUALity MOdel)" (pdf)
Cost, schedule and quality are highly correlated factors in software development. They basically form three sides of the same triangle. Beyond a certain point (the "Quality is Free" point), it is difficult to increase the quality without increasing either the cost or schedule or both for the software under development. Similarly, development schedule cannot be drastically compressed without hampering the quality of the software product and/or increasing the cost of development. Watts Humphrey, at the LA SPIN meeting in December '98, highlighted that "Measuring Productivity without caring about Quality has no meaning". Software estimation models can (and should) play an important role in facilitating the balance of cost/schedule and quality.
Recognizing this important association, an attempt is being made to develop a quality model extension to COCOMO II; namely COQUALMO. An initial description of this model focusing on defect introduction was provided in [Chulani97a]. The model has evolved considerably since then and is now very well defined and calibrated to Delphi-gathered expert opinion. The data collection activity is underway and the aim is to have a statistically calibrated model by the onset of the next millennium.
The many benefits of cost/quality modeling include:
Resource allocation: The primary but not the only important use of software estimation is budgeting for the development life cycle.
Tradeoff and risk analysis: An important capability is to enable 'what-if' analyses that demonstrate the impact of various defect removal techniques and the effects of personnel, project, product and platform characteristics on software quality. A related capability is to illuminate the cost/schedule/quality trade-offs and sensitivities of software project decisions such as scoping, staffing, tools, reuse, etc.
Time-to-market initiatives: An important additional capability is to support cost/schedule/quality planning and control through breakdowns by component, stage, and activity, thereby facilitating time-to-market initiatives.
Software quality improvement investment analysis: A very important capability is to estimate costs and defect densities and to assess the return on investment of quality initiatives such as the use of mature tools, peer reviews, and disciplined methods.
Added March 3rd, 1999
Copyright 2008 The University of Southern California
The written material, text, graphics, and software available on this page and all related pages may be copied, used, and distributed freely as long as the University of Southern California as the source of the material, text, graphics or software is always clearly indicated and such acknowledgement always accompanies any reuse or redistribution of the material, text, graphics or software; also permission to use the material, text, graphics or software on these pages does not include the right to repackage the material, text, graphics or software in any form or manner and then claim exclusive proprietary ownership of it as part of a commercial offering of services or as part of a commercially offered product.