University of Southern California
    
Center for Systems and Software Engineering

Technical Reports

USC-CSE-2000-534

Barry Boehm, Dan Port, Mohammed Al-Said, "Avoiding the Software Model-Clash Spiderweb," Computer, Volume 33, Issue 11, November 2000, pp. 120-122 (pdf)

Analysts frequently describe troubled projects with the tarpit metaphor used so effectively in Fred Brooks’s The Mythical Man-Month (2nd ed., Addison-Wesley, Reading, Mass., 1995). We have found a similarly effective metaphor: Think of a troubled software project as an insect caught in a spiderweb of sticky constraints, trying desperately to break free before the spider arrives to feed.

Added May 15th, 2001


USC-CSE-2000-533

Alexander Egyed, Nenad Medvidovic, "A Formal Approach to Heterogeneous Software Modeling," FASE 2000 (pdf)

The problem of consistently engineering large, complex software systems of today is often addressed by introducing new, “improved” models. Examples of such models are architectural, design, structural, behavioral, and so forth. Each software model is intended to highlight a particular view of a desired system. A combination of multiple models is needed to represent and understand the entire system. Ensuring that the various models used in development are consistent relative to each other thus becomes a critical concern. This paper presents an approach that integrates an architectural model with a number of design models and ensures consistency across them. The goal of this work is to combine the respective strengths of a powerful, specialized (architecture-based) modeling approach with a widely used, general (design-based) approach. We have formally addressed the various details of our approach, which has allowed us to construct a large set of supporting tools to automate the related development activities. We use an example application throughout the paper to illustrate the concepts.

Added November 2nd, 1999


USC-CSE-2000-532

Nikunj R. Mehta, Nenad Medvidovic, Sandeep Phadke, "Towards a Taxonomy of Software Connectors," 22nd International Conference on Software Engineering, 2000 (pdf)

Software systems of today are frequently composed from prefabricated, heterogeneous components that provide complex functionality and engage in complex interactions. Existing research on component-based development has mostly focused on component structure, interfaces, and functionality. Recently, software architecture has emerged as an area that also places significant importance on component interactions, embodied in the notion of software connectors. However, the current level of understanding and support for connectors has been insufficient. This has resulted in their inconsistent treatment and a notable lack of understanding of what the fundamental building blocks of software interaction are and how they can be composed into more complex interactions. This paper attempts to address this problem. It presents a comprehensive classification framework and taxonomy of software connectors. The taxonomy is used both to understand existing software connectors and to suggest new, unprecedented connectors. We demonstrate the use of the taxonomy on the architecture of an existing, large system.

Added November 16th, 1999


USC-CSE-2000-531

Bradford Clark, "Quantifying the Effects of Process Improvement on Effort," IEEE Software, Volume 17, Issue 6, November 2000, pp. 65-70 (pdf)

When organizations make many improvements concurrently, software project managers have no way of determining how much improvement is due to process maturity versus other factors. Using a 161-project sample, the article isolates the effect of process maturity on effort from the effects of other factors, concluding that an increase of one process maturity level can reduce development effort by 4% to 11%.
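
As a rough, purely hypothetical reading of that range (the baseline figure below is invented for illustration, not taken from the paper), the sketch applies a 4% to 11% per-level reduction to a 100 person-month project:

    # Hypothetical illustration of the reported 4%-11% per-level effort reduction.
    baseline_pm = 100.0                      # assumed baseline effort, person-months
    for reduction in (0.04, 0.11):           # low and high ends of the reported range
        one_level = baseline_pm * (1 - reduction)
        two_levels = baseline_pm * (1 - reduction) ** 2  # assumes the effect compounds per level
        print(f"{reduction:.0%} per level: +1 level -> {one_level:.1f} PM, "
              f"+2 levels -> {two_levels:.1f} PM")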

Added July 18th, 2008


USC-CSE-2000-530

Barry Boehm, Ellis Horowitz, Raymond Madachy, Chris Abts, "Future Trends, Implications in Cost Estimation Models," CrossTalk, April 2000 (pdf)

The rapid pace of change in software technology requires everybody in the software business to continually rethink and update their practices just to stay relevant and effective. This article discusses this challenge first with respect to the USC COCOMO II software cost modeling project, and then for software-intensive organizations in general. It then presents a series of adaptive feedback loops by which organizations can use COCOMO II-type models to help cope with the challenges of change.

Added November 11th, 2005


USC-CSE-2000-529

Barry Boehm, "Safe and Simple Software Cost Analysis," IEEE Software, September/October 2000, pp. 14-17 (pdf)

There are a number of simple software cost analysis methods, but they may not always be safe. The simplest is to base your cost estimate on the typical costs or productivity rates of your previous projects. This will work well if your new project doesn’t have any cost-critical differences from your previous projects. But it won’t be safe if some critical cost-driver has changed for the worse.
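
A minimal sketch of that simplest method follows; all project figures are hypothetical, purely to make the arithmetic concrete:

    # Estimate a new project's effort from past projects' average productivity.
    past_size_ksloc = [32.0, 48.0, 20.0]     # completed projects, size in KSLOC (hypothetical)
    past_effort_pm  = [160.0, 230.0, 95.0]   # their actual effort, person-months (hypothetical)
    productivity = sum(past_size_ksloc) / sum(past_effort_pm)   # KSLOC per person-month

    new_size_ksloc = 50.0
    estimate_pm = new_size_ksloc / productivity
    print(f"Estimated effort: {estimate_pm:.0f} person-months")
    # Safe only if no cost-critical driver (required reliability, team experience,
    # platform volatility, ...) has changed for the worse relative to those projects.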

Added June 24th, 2004


USC-CSE-2000-528

Barry Boehm, Richard Fairley, "Software Estimation Perspectives," IEEE Software, November/December 2000, pp. 22-26 (pdf)

How much is 68 + 73? Engineer: “It’s 141.” Short and sweet. Mathematician: “68 + 73 = 73 + 68 by the commutative law of addition.” True, but not very helpful. Accountant: “Normally it’s 141, but what are you going to use it for?”

Added June 24th, 2004


USC-CSE-2000-527

Barry Boehm, "Requirements that Handle IKIWISI, COTS, and Rapid Change," Computer, July 2000, pp. 99-102 (pdf)

In the good old days, dealing with software requirements was relatively easy. Software requirements were the first order of business and took place before design, cost estimation, planning, or programming. Of course, it wasn’t simple.

Added June 24th, 2004


USC-CSE-2000-526

Barry Boehm, "The Art of Expectations Management," Computer, January 2000, pp. 122-124 (pdf)

One of the most valuable skills a software professional can develop, expectation management is something surprisingly few people know or practice. I’ve witnessed more than 100 stakeholder software requirement negotiations in which inflated expectations about the simplicity of the problem or ease of providing a solution have caused the most difficulty. Expectations management holds the key to providing win-win solutions to these situations.

Added June 24th, 2004


USC-CSE-2000-525

Barry Boehm, "Spiral Development: Experience, Principles, and Refinements," Spiral Development Workshop, edited by Wilfred J. Hansen, February 9, 2000 (pdf)

Spiral development is a family of software development processes characterized by repeatedly iterating a set of elemental development processes and managing risk so it is actively being reduced. This paper characterizes spiral development by enumerating a few “invariant” properties that any such process must exhibit. For each, a set of “variants” is also presented, demonstrating a range of process definitions in the spiral development family. Each invariant excludes one or more "hazardous spiral look-alike" models, which are also outlined. This report also shows how the spiral model can be used for a more cost-effective incremental commitment of funds, via an analogy of the spiral model to stud poker. An important and relatively recent innovation to the spiral model has been the introduction of anchor point milestones. The latter part of the paper describes and discusses these.

Added February 22nd, 2001


USC-CSE-2000-524

Alexander F. Egyed, "Automatically Validating Model Consistency During Refinement," submitted to ICSE 2001 (pdf)

Automated consistency checking between software development models still remains a complex and non-scalable problem. Current solutions are frequently only able to detect small numbers of inconsistency types, often under less than realistic assumptions. This paper introduces a new approach to consistency checking based on model transformation. Our approach uses transformation to translate and to interpret model information between different types of views (e.g., diagrams) in order to simplify their comparison. Transformation-based consistency checking, in the manner we use it, has never been attempted before and, as this paper will demonstrate, has significant benefits including (1) increased variety of automatically detectable inconsistencies, (2) improved scalability, (3) ability to handle incomplete, ambiguous model specifications, and (4) ability to define domain- and model-independent inconsistency rules. This paper will illustrate our approach in the context of model refinement and abstraction using a complex example. Our approach is fully tool supported.

Added October 10th, 2000


USC-CSE-2000-520

Alexander Egyed, Nenad Medvidovic, Cristina Gacek, "Component-Based Perspective on Software Mismatch Detection and Resolution," IEE Proceedings - Software Engineering, Volume 147, Number 6, December 2000, pp. 225-236 (pdf)

Existing approaches to modeling software systems all too often neglect the issue of component mismatch identification and resolution. The traditional view of software development over-emphasizes synthesis at the expense of analysis - the latter frequently being seen as a problem one only needs to deal with during the integration stage towards the end of a development project. This paper discusses three software modeling and analysis techniques, all tool supported, and emphasizes the vital role analysis can play in identifying and resolving risks early on. This work also combines model based development with component based development (e.g., COTS and legacy systems) and shows how their mismatch detection capabilities complement each other in providing a more comprehensive coverage of development risks.

Added November 13th, 2000


USC-CSE-2000-516

Nenad Medvidovic, Peyman Oreizy, Richard N. Taylor, Rohit Khare, Michael Guntersdorfer, "An Architecture-Centered Approach to Software Environment Integration" (pdf)

Software architecture research has yielded a variety of powerful techniques for assisting in the design, implementation, and long-term evolution of complex, heterogeneous, distributed, multi-user applications. Since software development environments are themselves applications with these characteristics, it is natural to examine the effectiveness of an architectural approach to constructing and changing them. We report on our experience in creating a family of related environments in this manner. The environments encompass a range of services and include commercial off-the-shelf products as well as custom-built tools. The particular architectural approach adopted is fully reflexive: the environments are used in their own construction and evolution. We also report on some engineering experiences, in particular with our use of XML as the vehicle for supporting a common and extensible representation of architectural models, including the model of the environment itself. Generally applicable lessons from the experience are described.

Added November 13th, 2000


USC-CSE-2000-515

Alexander Egyed, Paul Gruenbacher, Nenad Medvidovic, "Refinement and Evolution Issues between Requirements and Product Line Architectures" (pdf)

Though acknowledged as very closely related, requirements engineering and architecture modeling have to a large extent been pursued independently of one another, particularly in the large body of software architecture research that has emerged over the past decade. The dependencies and constraints imposed by elements of one on those of the other are not well understood. This paper discusses a number of relevant relationships we have identified in the process of trying to relate the WinWin requirements engineering approach with architecture- and design-centered approaches (e.g., C2 and UML). This paper further discusses their relevance to product line issues, which currently are obscured by the fusion of product-specific and product-line information.

Added November 13th, 2000


USC-CSE-2000-514

Nikunj R. Mehta, Nenad Medvidovic, Marija Mikic-Rakic, "Why Consider Implementation-Level Decisions in Software Architectures?" submitted to 4th International Workshop on Software Architectures, Limerick, Ireland, June 2000 (pdf)

Software architecture provides a high-level abstraction of the structure, behavior, and properties of a software system aimed at enabling early analysis of the system and its easier implementation. Often, however, important details about a system are left to be addressed in its implementation, resulting in differences between conceptual and concrete architectures. This paper describes an approach towards bringing these two closer by making certain implementation-level decisions explicit in the architecture. Specifically, we focus on the choices made in modeling and implementing component interactions. The process is based on a taxonomy of software connectors that the authors have developed to better understand component interactions, and an architectural framework developed to support a variety of connectors.

Added November 13th, 2000


USC-CSE-2000-513

Nenad Medvidovic, David S. Rosenblum, Richard N. Taylor, "Heterogeneous Typing for Software Architectures," submitted to ACM Transactions on Software Engineering and Methodology, 1999 (pdf)

Software architectures have the potential to substantially improve the development and evolution of large, complex, multi-lingual, multi-platform, long-running systems. However, in order to achieve this potential, specific architecture-based modeling, analysis, and evolution techniques must be provided. This paper motivates and presents one such technique: a type system for software architectures, which allows flexible, controlled evolution of software components in a manner that preserves the desired architectural relationships and properties. Critical to the type system is a framework that divides the space of subtyping relationships into a small set of well defined categories. The paper also investigates the effects of large-scale development and off-the-shelf reuse on establishing type conformance between interoperating components in an architecture. An existing architecture is used as an example to illustrate a number of different applications of the type system to architectural modeling and evolution.

Added November 13th, 2000 


USC-CSE-2000-511

Barry Boehm, Victor R. Basili, "Gaining Intellectual Control of Software Development," IEEE Computer, Volume 33, Number 5, May 2000, pp. 27-33 (pdf)

Recent disruptions caused by several events have shown how thoroughly the world has come to depend on software. The rapid proliferation of the Melissa virus hinted at a dark side to the ubiquitous connectivity that supports the information-rich Internet and lets e-commerce thrive. Although the oft-predicted Y2K apocalypse failed to materialize, many software experts insist that disaster was averted only because countries around the globe spent billions to ensure their critical software would be Y2K-compliant. When denial-of-service attacks shut down some of the largest sites on the Web last February, the concerns caused by the disruptions spread far beyond the complaints of frustrated customers, affecting even the stock prices of the targeted sites.

Added October 10th, 2000


USC-CSE-2000-509

Barry Boehm, Kevin J. Sullivan, "Software Economics: A Roadmap," Proceedings of the Conference on The Future of Software Engineering, ICSE 2000, Limerick, Ireland, 2000, pp. 319-343 (pdf, doc)

The fundamental goal of all good design and engineering is to create maximal value added for any given investment. There are many dimensions in which value can be assessed, from monetary profit to the solution of social problems. The benefits sought are often domain-specific, yet the logic is the same: design is an investment activity. Software economics is the field that seeks to enable significant improvements in software design and engineering through economic reasoning about product, process, program, and portfolio and policy issues. We summarize the state of the art and identify shortfalls in existing knowledge. Past work focuses largely on costs, not on benefits, thus not on value added; nor are current technical software design criteria linked clearly to value creation. We present a roadmap for research emphasizing the need for a strategic investment approach to software engineering. We discuss how software economics can lead to fundamental improvements in software design and engineering, in theory and practice.

Added October 10th, 2000


USC-CSE-2000-507

Barry Boehm, "Spiral Development: Experience, Principles, and Refinements," Spiral Experience Workshop, February 9, 2000 (pdf)

This presentation opened the USC-SEI Workshop on Spiral Development Experience and Implementation Challenges held at USC February 9-11, 2000. The workshop brought together leading executives and practitioners with experience in transitioning to spiral development of software-intensive systems in the commercial, aerospace, and government sectors. Its objectives were to distill the participants’ experiences into a set of critical success factors for transitioning to and successfully implementing spiral development, and to identify the most important needs, opportunities, and actions to expedite organizations’ transition to successful spiral development. To provide a starting point for addressing these objectives, I tried in this talk to distill my experiences in developing and transitioning the spiral model at TRW; in using it in system acquisitions at DARPA; in trying to refine it to address problems that people have had in applying it in numerous commercial, aerospace, and government contexts; and in working with the developers of major elaborations and refinements of the spiral model such as the Software Productivity Consortium’s Evolutionary Spiral Process [SPC, 1994] and Rational, Inc.’s Rational Unified Process [Royce, 1998; Kruchten 1999; Jacobson et al., 1999]. I’ve modified the presentation somewhat to reflect the experience and discussions at the Workshop.

Added March 28th, 2000


USC-CSE-2000-506

Barry Boehm, "Unifying Software Engineering and System Engineering," Computer, Volume 3, Number 3, March 2000, pp. 114-116 (pdf)

Rapid change in information technology brings with it a frequent need to undo the effects of previous culture change efforts. This process, while often challenging and frustrating, offers numerous rewards for success. Organizations can change from slow, reactive, adversarial, separated software and systems engineering processes to unified, concurrent processes. These processes better suit rapid development of dynamically changing software-intensive systems involving COTS, agent, Web, multimedia, and Internet technology.

Added October 10th, 2000


USC-CSE-2000-505

Barry Boehm, Chris Abts, Sunita Chulani, "Software Development Cost Estimation Approaches - A Survey," Qualifying Exam Report (Sunita Chulani) (pdf)

This paper summarizes several classes of software cost estimation models and techniques: parametric models, expertise-based techniques, learning-oriented techniques, dynamics-based models, regression-based models, and composite-Bayesian techniques for integrating expertise-based and regression-based models. Experience to date indicates that neural-net and dynamics-based techniques are less mature than the other classes of techniques, but that all classes of techniques are challenged by the rapid pace of change in software technology. The primary conclusion is that no single technique is best for all situations, and that a careful comparison of the results of several approaches is most likely to produce realistic estimates.

Added April 10th, 2000


USC-CSE-2000-504

Jongmoon Baik, Barry Boehm, "Empirical Analysis of CASE Tool Effects on Software Development Effort," ACIS International Journal of Computer & Information Science, Volume 1, Issue 1, Winter 2000, pp. 1-10 (pdf)

During the last couple of decades, CASE (Computer Aided Software Engineering) tools have played a critical role in the improvement of software productivity and quality by assisting tasks in software development processes. Many initiatives in the field were pursued in the 1980’s and 1990’s to provide more effective CASE technologies and development environments. Even though the CASE field is no longer an active research area, most software development teams use a huge range of CASE tools that are typically assembled over some period with the hope of productivity and quality improvements throughout the software development process. The variety and proliferation of tools in the current CASE market make it difficult to understand what kinds of tasks are supported and how much effort can be reduced by using CASE tools. In this paper, we provide a classification of CASE tools by activity coverage in a software development lifecycle. We also report an experimental result of a Bayesian analysis of CASE tool effects, using an extended set of tool rating scales from COCOMO (COnstructive COst MOdel) II with which CASE tools can be effectively evaluated.

Added March 20th, 2000


USC-CSE-2000-503

Chris Abts, "A Perspective on the Economic Life Span of COTS-based Software Systems: the COTS-LIMO Model," submitted to the Workshop on Continuing Collaborations for Successful COTS Development, ICSE 2000, Limerick, Ireland, June 4-5, 2000 (pdf)

The use of commercial-off-the-shelf (COTS) components is becoming ever more prevalent in the creation of large software systems. The rationale usually cited for this trend is that by using COTS components, immediate short-term gains in direct development effort & schedule are possible—admittedly, often as a trade-off for a more complicated long-term post-deployment maintenance environment. Even so, the conventional wisdom is that generally, the more of the system that can be built using COTS components, the better. Anecdotal evidence recently gathered while conducting data collection interviews for the COCOTS COTS integration cost model suggests, however, that there may be diminishing returns in trying to maximize the use of COTS components in a system development. Beyond a certain point, an increase in the number of COTS components in a system may actually reduce the system's overall economic life span rather than increase it. This paper discusses why this may be true, at least in some cases, and proposes a new economic COTS life span model, COTS-LIMO, as a way of possibly examining these effects. As of this writing, the suggestion being made here that increasing the number of COTS components in a system ultimately produces diminishing returns can only be called a hypothesis. But if proven true, even in some cases, then this could have significant implications for current policy decisions being made by governments and organizations encouraging an ever expanding use of COTS components in software system developments.

Added March 8th, 2000


USC-CSE-2000-502

Chris Abts, Barry Boehm, Elizabeth Bailey Clark, "Empirical Observations on COTS Software Integration Effort Based on the Initial COCOTS Calibration Database," Proceedings of the Twenty-Fifth Annual Software Engineering Workshop (SEW 25), NASA/Goddard Space Flight Center, 2000 (pdf)

As the use of commercial-off-the-shelf (COTS) components becomes ever more prevalent in the creation of large software systems, the need for the ability to reasonably predict the true lifetime cost of using such software components grows accordingly. This paper presents empirically-based findings about the effort associated with activities found to be significant in the development of systems using COTS components. The findings are based upon data collected for the purpose of calibrating the COCOTS COTS software integration cost model, an extension to the COCOMO II cost model designed to capture costs COCOMO does not. A brief overview of COCOTS is presented to put the data in perspective, including its relation to COCOMO II. A set of histograms is then shown summarizing the effort data collected to date. The paper concludes with some observations suggested by an examination of that calibration data.

Added March 8th, 2000


USC-CSE-2000-501

Chris Abts, Barry Boehm, Elizabeth Bailey Clark, "COCOTS: A COTS Software Integration Lifecycle Cost Model - Model Overview and Preliminary Data Collection Findings," Proceedings ESCOM-SCOPE 2000 Conference, Munich, Germany, April 18-20, 2000, pp. 325-333 (pdf)

As the use of commercial-off-the-shelf (COTS) components becomes ever more prevalent in the creation of large software systems, the need for the ability to reasonably predict the true lifetime cost of using such software components grows accordingly. In using COTS components, immediate short-term gains in direct development effort & schedule are possible, but usually as a trade-off for a more complicated long-term post-deployment maintenance environment. In addition, there are risks associated with COTS software separate from those of creating components from scratch. These unique risks can further complicate the development and post-deployment situations. This paper discusses a model being developed as an extension of the COCOMO II cost model. COCOTS attempts to predict the lifecycle costs of using COTS components by capturing the more significant COTS risks in its modeling parameters. The current state of the model is presented, along with some preliminary findings suggested by an analysis of calibration data collected to date. The paper concludes with a discussion of the on-going effort to further refine the accuracy and scope of COCOTS.

Added March 8th, 2000


USC-CSE-2000-500

Barry Boehm, Chris Abts, Jongmoon Baik, A. Winsor Brown, Sunita Chulani, Bradford Clark, Ellis Horowitz, Ray Madachy, Donald J. Reifer, Bert Steece, "USC COCOMO II.2000" (pdf)

This manual presents two models, the Post-Architecture and Early Design models. These two models are used for Application Generator, System Integration, or Infrastructure developments [Boehm et al. 2000]. The Post-Architecture model is a detailed model that is used once the project is ready to develop and sustain a fielded system. The system should have a life-cycle architecture package, which provides detailed information on cost driver inputs, and enables more accurate cost estimates. The Early Design model is a high-level model that is used to explore architectural alternatives or incremental development strategies. This level of detail is consistent with the general level of information available and the general level of estimation accuracy needed.

The Post-Architecture and Early Design models use the same approach for product sizing (including reuse) and for scale factors. These will be presented first. Then, the Post-Architecture model will be explained followed by the Early Design model.
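
For readers unfamiliar with the model's overall shape, the following sketch shows the core effort equation shared by both submodels, using the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91); the ratings passed in the example call are illustrative placeholders, not values from the manual:

    def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers, a=2.94, b=0.91):
        """Nominal COCOMO II effort estimate (person-months).

        ksloc              -- size in thousands of source lines of code
        scale_factors      -- the five scale-factor ratings (SF_j)
        effort_multipliers -- cost-driver ratings (EM_i): 17 in the
                              Post-Architecture model, 7 in Early Design
        a, b               -- COCOMO II.2000 calibration constants
        """
        e = b + 0.01 * sum(scale_factors)    # exponent capturing (dis)economies of scale
        em_product = 1.0
        for em in effort_multipliers:
            em_product *= em
        return a * ksloc ** e * em_product

    # Illustrative call: 100 KSLOC, mid-range scale factors, all-nominal (1.00) cost drivers.
    print(round(cocomo_ii_effort(100, [3.2] * 5, [1.0] * 17), 1))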

Added September 8th, 2008


Copyright 2008 The University of Southern California

The written material, text, graphics, and software available on this page and all related pages may be copied, used, and distributed freely as long as the University of Southern California as the source of the material, text, graphics, or software is always clearly indicated and such acknowledgement always accompanies any reuse or redistribution of the material, text, graphics, or software. Permission to use the material, text, graphics, or software on these pages does not include the right to repackage the material, text, graphics, or software in any form or manner and then claim exclusive proprietary ownership of it as part of a commercial offering of services or as part of a commercially offered product.