University of Southern California
    
Center for Systems and Software Engineering

Technical Reports

USC-CSSE-2010-536

Jo Ann Lane, "SoS Management Strategy Impacts on SoS Engineering Effort," Proceedings of the International Conference on Software Process, Paderborn, Germany, July 8-9, 2010 (pdf)

To quickly respond to changing business and mission needs, many organizations are integrating new and existing systems with commercial-off-the-shelf (COTS) products into network-centric, knowledge-based, software-intensive systems of systems (SoS). With this approach, system development processes to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high-level components are evolving and are being referred to as SoS Engineering (SoSE). This research shows that there exist conditions under which investments in SoSE have positive and negative returns on investment, provides the first quantitative determination of these conditions, and points out directions for future research that would strengthen the results.

Added May 2, 2011


USC-CSSE-2010-535

Barry Boehm, Jennifer Bayuk, Abhi Desmukh, Robert Graybill, Jo Ann Lane, Alan Levin, Azad Madni, Mike McGrath, Arthur B. Pyster, Stas Tarchalski, Richard Turner, Jon Wade, "Systems 2020 Strategic Initiative," SERC, Technical Report SERC-2010-TR-009, August 29, 2010 (pdf)

The Department of Defense (DoD) increasingly faces a mix of relatively foreseeable and unforeseeable threat and opportunity profiles. This means that DoD technological superiority relies on rapid and assured development, fielding, and evolution of progressively more complex and interoperable defense systems. Meeting these challenges requires DoD to design and build an entirely new class of adaptive systems that allow the Department to operate with far greater speed and agility. Mr. Lemnios, the Director, Defense Research and Engineering (DDR&E), requested a study of systems engineering research areas that enable agile, assured, efficient, and scalable systems engineering approaches to support the development of these systems. This report addresses the four highest-potential research areas determined by the study: Model Based Engineering (MBE), Platform Based Engineering (PBE), Capability on Demand (COD), and Trusted System Design (TSD). It elaborates each research area, characterizing them in terms of current state of the art and state of the practice, and identifies the most promising research topics. It then proposes next steps to create a DoD-wide Systems 2020 initiative based on a coordinated set of high-leverage, game-changing activities comprising systems engineering research, technology maturation, and pilot-based transition into practice.

Added February 28th, 2011


USC-CSSE-2010-534

Di Wu, Qi Li, Mei He, Barry Boehm, Ye Yang, Supannika Koolmanojwong, "Analysis of Stakeholder/Value Dependency Patterns and Process Implications: A Controlled Experiment," Proceedings of the 43rd Annual Hawaii International Conference on System Sciences, January 2010 (pdf)

Different classes of information system stakeholders depend on different values to be successful. Understanding stakeholders' value dependencies is critical for developing software-intensive systems. However, there is no universal one-size-fits-all stakeholder/value metric that can be applied for a given system. This paper presents an analysis of major classes of stakeholders' value priorities using the win-win prioritization results from 16 real-client graduate software engineering course projects. Findings from this controlled experiment further verify and extend the hypothesis that "different stakeholders have different value propositions," bridge the value-understanding gaps among different stakeholders, and are beneficial for further reasoning about stakeholders' utility functions and for providing process guidance for software projects involving various classes of stakeholders.

Added December 29th, 2009


USC-CSSE-2010-533

Di Wu, Da Yang, Barry Boehm, "Finding Success in Rapid Collaborative Requirements Negotiation Using Wiki and Shaper," Proceedings of the 43rd Annual Hawaii International Conference on System Sciences, January 2010 (pdf)

Defining requirements without satisfying success-critical stakeholders often leads to expensive project failures. Enabling interdisciplinary stakeholders to rapidly and effectively collaborate in development of globally-usable software-intensive systems remains a major challenge. At USC, 32 real-client, graduate-level team projects experimented with using the wiki-based requirements negotiation support tool WikiWinWin over a two-year period. Data collected from these projects indicated that project outcome is correlated with several usage aspects, including early use, amount of use, frequency of use, shaper use, and evolution of negotiation artifacts. Several changes made based on our first year's experience also showed improvements in cost-effectiveness. User feedback generally confirmed that using a wiki-based negotiation tool was beneficial, and that improving the wiki tool's ease of use would yield further client satisfaction in the future.

Added December 29th, 2009


USC-CSSE-2010-532

Ali Afzal Malik, Barry Boehm, Yan Ku, Ye Yang, "Comparative Analysis of Requirements Elaboration of an Industrial Product," 2010 2nd International Conference on Software Technology and Engineering (ICSTE), San Juan, Puerto Rico, October 3-5, 2010 (pdf)

Requirements for a software development project are gradually refined as more information becomes available. This process of requirements elaboration can be quantified using an appropriate set of metrics. This paper reports the results of an empirical study conducted to analyze the requirements elaboration of an industrial software process management tool, SoftPM, used by more than 300 Chinese commercial software organizations. After adjusting for the effects of overlaps among different versions of SoftPM, multi-level requirements data are gathered and elaboration factors for each version are obtained. These elaboration data are compared with the data from a previous empirical study that analyzed requirements elaboration of a set of different small e-services projects. This comparison reveals that the elaboration factors of different SoftPM versions have much less variation, confirming the intuition that projects with similar characteristics have comparable elaboration factors.
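As a rough illustration of the metric involved (the definition here is inferred from the abstract's description, and the counts are hypothetical, not SoftPM's), an elaboration factor is simply the ratio of requirement counts between adjacent specification levels:

```python
# Toy illustration of an elaboration factor: the ratio of the number of
# requirements at a lower (more detailed) level to the number at the
# adjacent higher level. The counts below are hypothetical.
def elaboration_factor(high_level_count, detailed_count):
    return detailed_count / high_level_count

# e.g., 12 capability-level requirements elaborated into 84 detailed ones
ef = elaboration_factor(12, 84)   # -> 7.0
```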

Added January 18th, 2011


USC-CSSE-2010-531

Barry Boehm, Jo Ann Lane, "Evidence-Based Software Processes," New Modeling Concepts for Today's Software Processes, Lecture Notes in Computer Science, 2010, Volume 6195/2010, pp. 62-73 (pdf)

Many software projects fail because they commit to a set of plans and specifications with little evidence that, if these are used on the project, they will lead to a feasible system being developed within the project's budget and schedule. An effective way to avoid this is to make the evidence of feasibility a first-class developer deliverable that is reviewed by independent experts at key decision milestones: shortfalls in evidence are risks to be considered in going forward. This further implies that the developer will create and follow processes for evidence development. This paper provides processes for developing and reviewing feasibility evidence, and for using risk to determine how to proceed at major milestones. It also provides quantitative results on "how much investment in evidence is enough," as a function of the project's size, criticality, and volatility.

Added January 13th, 2011


USC-CSSE-2010-530

Qi Li, Fengdi Shu, Barry Boehm, Qing Wang, "Improving the ROI of Software Quality Assurance Activities: An Empirical Study," New Modeling Concepts for Today's Software Processes, Lecture Notes in Computer Science, 2010, Volume 6195/2010, pp. 357-368 (pdf)

Review, process audit, and testing are three main quality assurance (QA) activities during the software development life cycle. They complement each other in examining work products for defects and improvement opportunities to the largest extent. Understanding the effort distribution and inter-correlation among them can facilitate project planning, improve software quality within budget and schedule, and support continuous process improvement. This paper reports empirical findings on the effort distribution patterns of the three types of QA activities from a series of incremental projects in China. The results of the study offer implications on how to identify which type of QA activity is insufficient while others might be overdone, how to balance effort allocation and planning for future projects, and how to improve the weak parts of each QA activity, thereby improving the Return On Investment (ROI) of QA activities and the effectiveness of the whole process under the specific organizational context.

Added January 12th, 2011


USC-CSSE-2010-529

Barry Boehm, "The Changing Nature of Software Evolution," IEEE Software, Volume 27, Issue 4, July-August 2010, pp. 26-29 (pdf)

Traditionally, software evolution took place after software development put a system in place. However, the pace of change in technology and competition has changed the nature of software evolution to a continuous process, in which there's no neat boundary between development and evolution. Many traditional software development assumptions and practices haven't recognized this changing nature and increasingly find themselves in deep trouble as a result. Minimizing development costs by adopting numerous off-the-shelf products often leads to unaffordable evolution costs as vendors ship new releases and stop supporting the old ones. Assuming that a single form of evolutionary development covers all situations often leads to unrealistic commitments and dead-end systems as situations change.

Added January 12th, 2011


USC-CSSE-2010-528

Jo Ann Lane, Barry Boehm, Mark Bolas, Azad Madni, Richard Turner, "Critical Success Factors for Rapid, Innovative Solutions," New Modeling Concepts for Today’s Software Processes, Lecture Notes in Computer Science, 2010, Volume 6195/2010, pp. 52-61 (pdf)

Many of today's problems are in search of new, innovative solutions. However, the development of new and innovative solutions has proven elusive for many organizations, resulting in considerable expenditures of effort and dollars with no solution, or a mediocre solution delivered late to the marketplace or customer. This paper describes the results of research conducted to identify the critical success factors employed by several successful, high-performance organizations in the development of innovative systems. These critical success factors span the technical, managerial, people, and cultural aspects of the innovative environment.

Added January 12th, 2011


USC-CSSE-2010-527

Bradford Clark, Ray Madachy, Thomas Tan, Barry Boehm, Wilson Rosa, "Building Cost Estimating Relationships for Acquisition Decision Support," 25th Annual COCOMO Forum, USC, Los Angeles, November 2010 (pdf)

Research Objectives
• Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
– Characterize different Application Domains and Operating Environments within DoD
– Analyze collected data for simple Cost Estimating Relationships (CERs) within each domain
– Develop rules of thumb for missing data
• Make collected data useful to oversight and management entities
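A simple CER of the kind described above often takes the power-law form Effort = a × Size^b, fitted by least squares in log-log space. The sketch below shows that fitting procedure under stated assumptions; the data points and the `fit_cer` helper are illustrative inventions, not values from the SRDR analysis:

```python
import math

# Fit log(PM) = log(a) + b*log(KSLOC) by ordinary least squares,
# yielding the power-law CER: Effort = a * Size^b.
def fit_cer(sizes_ksloc, efforts_pm):
    xs = [math.log(s) for s in sizes_ksloc]
    ys = [math.log(e) for e in efforts_pm]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical domain data: (KSLOC, person-months) pairs.
a, b = fit_cer([10, 20, 50, 100], [40, 95, 300, 700])
effort_30k = a * 30 ** b   # estimate for a new 30-KSLOC program
```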

Added November 15th, 2010


USC-CSSE-2010-526

Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm, "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty,'" expanded version of USC-CSSE-2010-525 (pdf)

Accurate software cost and schedule estimates are essential, especially for large software projects. However, once the required effort has been estimated, little is done to recalibrate and reduce the uncertainty of the initial estimates. To address this problem, we have developed and used a framework to continuously monitor software project progress and readjust the estimated effort, utilizing the Constructive Cost Model II (COCOMO II) and the Unified CodeCount Tool developed by the University of Southern California. As a software project progresses, we gain more information such as complexity, architecture resolution, and people capability, as well as the actual source lines of code developed and effort spent. This information is then used to assess and re-estimate the effort required to complete the remainder of the project. As the effort estimates grow more accurate with less uncertainty, project quality and goals can be better assured within the available resources. The paper thus also provides and analyzes empirical data on how projects evolve within the familiar software "cone of uncertainty."
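The COCOMO II equation at the heart of such re-estimation has the published form PM = A × Size^E × ∏EM. The sketch below uses the COCOMO II.2000 constants (A = 2.94, E = 0.91 + 0.01 × ΣSF); the 40-KSLOC remaining size and the all-nominal driver ratings are hypothetical inputs, not data from the paper:

```python
# Sketch of the COCOMO II effort equation used for mid-project
# re-estimation. A and the 0.91 exponent base are the published
# COCOMO II.2000 constants; the inputs below are hypothetical.
def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
    """Effort in person-months: PM = A * Size^E * prod(EM)."""
    A = 2.94                                  # COCOMO II.2000 constant
    E = 0.91 + 0.01 * sum(scale_factors)      # diseconomy-of-scale exponent
    em = 1.0
    for m in effort_multipliers:
        em *= m
    return A * ksloc ** E * em

# Hypothetical re-estimate: 40 KSLOC of remaining work, nominal
# scale-factor ratings and all 17 effort multipliers at nominal (1.0).
pm = cocomo2_effort(40, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 17)
```

Because E exceeds 1.0 at nominal ratings, doubling the remaining size more than doubles the estimated remaining effort, which is why shrinking the uncertainty in the size estimate matters so much.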

Added November 8th, 2010


USC-CSSE-2010-525

Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm, "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty,'" ASE2010 (pdf)

Accurate software cost and schedule estimates are essential, especially for large software projects. However, once the required effort has been estimated, little is done to recalibrate and reduce the uncertainty of the initial estimates. To address this problem, we have developed and used a framework to continuously monitor software project progress and readjust the estimated effort, utilizing the Constructive Cost Model II (COCOMO II) and the Unified CodeCount Tool developed by the University of Southern California (USC). As a software project progresses, we gain more information about the project itself, which can then be used to assess and re-estimate the effort required to complete the project. With more accurate estimates and less uncertainty, project quality and goals can be better assured within the available resources. The paper thus also provides and analyzes empirical data on how projects evolve within the familiar software "cone of uncertainty."

Added November 8th, 2010


USC-CSSE-2010-524

Barry Boehm, Jo Ann Lane, "Improved Acquisition Processes Through Incremental Commitments," NDIA 2010 (pdf)

The wide variety of software-intensive systems needed to support the new horizons of evolving technology, system and software complexity, high dependability, global interoperability, emergent requirements, and adaptability to rapid change makes traditional and current one-size-fits-all process models infeasible. This tutorial presents the process framework, principles, practices, and case studies for a new model developed and being used to address these challenges. It has a series of risk-driven decision points that enable projects to converge on whatever combination of agile, plan-driven, formal, legacy-oriented, reuse-oriented, or adaptive processes best fits a project's situation. The tutorial discusses the decision table for common special cases; exit ramps for terminating non-viable projects; support of concurrent engineering of requirements, solutions, and plans; and evidence-based commitment milestones for synchronizing the concurrent engineering. The tutorial will include case studies and exercises for participants' practice and discussion.

Added October 26th, 2010


USC-CSSE-2010-523

Stephen Blanchette Jr., Steven Crosson, Barry Boehm, "Evaluating the Software Design of a Complex System of Systems," CMU/SEI Tech Report CMU/SEI-2009-TR-023, January 2010 (pdf)

Schedule- or event-driven reviews are a crucial element of any major software development project. Such reviews tend to focus on different aspects of development, and different types of reviews provide different benefits. The sum of these reviews, however, is inadequate to address the needs of software development in a complex system of systems (SoS) environment. What is needed is a true, evidence-driven, SoS-level evaluation capable of providing an overall assessment of, and insight into, the software development effort in that context.

This report discusses the application of the Lifecycle Architecture (LCA) event to what was an enormously complex SoS program: the Army’s Future Combat Systems (FCS). From the FCS experience, readers will gain insight into the issues of applying the LCA in an SoS context and be ready to apply the lessons learned in their own domains.

Added October 7th, 2010


USC-CSSE-2010-522

Barry Boehm, "Extending Software Engineering Research Outside the Digital Box," Proceedings, NCO-SDP Workshop on the Future of Software Engineering Research, November 2010 (pdf)

Since software is developed to run on computers, there is a tendency to focus computer science and software engineering on how best to get software to run on computers.  But, engineering is different from science: the Webster definition of “engineering” is “the application of science and mathematics by which the properties of matter and the sources of energy in nature are made useful to people.” Thus, it would follow that the responsibility of software engineering and its research would include the utility to people of the software and the software-reliant artifacts they use, beyond thinking within purely digital boxes.  This position paper addresses two perspectives on the future of software engineering when viewed in this broader context.

Added September 1st, 2010


USC-CSSE-2010-521

Vu Nguyen, LiGuo Huang, Barry Boehm, "An Analysis of Trends in Productivity and Cost Drivers over Years" (pdf)

Software engineering practices have evolved considerably over the last four decades, changing the way software systems are developed and delivered. This paper reports our empirical analysis on how changes in software engineering practices are reflected in the COCOMO cost drivers and how software productivity has evolved over the years. The analysis is based on the COCOMO data set of 341 software projects developed between 1970 and 2009. We find that while the overall ratings for many cost drivers related to the use of software tools, software processes, platform, and personnel tend to follow certain trends over the years, the overall ratings of the majority of cost drivers remain unchanged. We also find that the average software productivity has increased by 6 times over the last 40 years. Moreover, this trend is seen to accelerate noticeably in the last 10 years.

Added August 24th, 2010


USC-CSSE-2010-519

Supannika Koolmanojwong, Barry Boehm, "The Incremental Commitment Model Process Patterns for Rapid-Fielding Projects," ICSP 2010, Paderborn, Germany (pdf)

To provide better services to customers and avoid being left behind in a competitive business environment, organizations can draw on a wide variety of ready-to-use software and technologies to build software systems at a very fast pace. Rapid fielding plays a major role in developing software systems that provide a quick response to organizational needs. This paper investigates the appropriateness of current software development processes and develops new software development process guidelines, focusing on four process patterns: Use single Non-Developmental Item (NDI), NDI-intensive, Services-intensive, and Architected Agile. Currently, there is no single software development process model that is applicable to all four process patterns, but the Incremental Commitment Model (ICM) can help a new project converge on a process that fits its process drivers. This paper also presents process decision criteria in terms of these drivers and relates them to the ICM Electronic Process Guide.

Added August 23rd, 2010


USC-CSSE-2010-518

Vu Nguyen, Barry Boehm, Phongphan Danphitsanuphan, "Assessing and Estimating Corrective, Enhancive, and Reductive Maintenance Tasks: A Controlled Experiment," APSEC Special Issue, Information and Software Technology Journal, 2010 (pdf)

This paper describes a controlled experiment of student programmers performing maintenance tasks on a C++ program. The goal of the study is to assess the maintenance size, effort, and effort distributions of three different maintenance types and to describe estimation models to predict the programmer's effort on maintenance tasks. Twenty-three graduate students and a senior majoring in computer science participated in the experiment in a software engineering lab. Each student was asked to perform the maintenance tasks required for one of the three task groups. The impact of different LOC metrics on maintenance effort was also evaluated by fitting the collected data into various estimation models. The results of our study suggest that corrective maintenance is much less productive than enhancive and reductive maintenance. Our results generally confirm previous results concluding that program comprehension activities require as much as 50% of total effort in corrective maintenance. Moreover, the best software effort model can estimate the time of 79% of the programmers with an error of 30% or less. The model achieved this performance level by using the LOC added, modified, and deleted metrics as independent size variables.
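The general shape of such a model can be sketched as an ordinary least-squares fit with LOC added, modified, and deleted as the independent size variables. Note this is only an assumed form for illustration; the task data and recovered coefficients below are synthetic, not the paper's fitted model:

```python
import numpy as np

# Fit effort ~ b0 + b1*LOC_added + b2*LOC_modified + b3*LOC_deleted
# by ordinary least squares. All data below are synthetic.
def fit_effort_model(added, modified, deleted, effort_hours):
    X = np.column_stack([np.ones(len(effort_hours)), added, modified, deleted])
    coef, *_ = np.linalg.lstsq(X, np.asarray(effort_hours, dtype=float),
                               rcond=None)
    return coef  # [b0, b1, b2, b3]

# Synthetic maintenance tasks: LOC added/modified/deleted per task,
# with effort generated from known coefficients for the demonstration.
added    = [10, 20,  5, 30,  8]
modified = [ 5,  3, 15, 10,  2]
deleted  = [ 2,  1,  4,  0,  9]
hours    = [2 + 0.1 * a + 0.2 * m + 0.05 * d
            for a, m, d in zip(added, modified, deleted)]
b = fit_effort_model(added, modified, deleted, hours)
```

On noise-free synthetic data the fit recovers the generating coefficients exactly; with real task data, the residuals would drive the 30%-error accuracy measure the paper reports.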

Added August 13th, 2010


USC-CSSE-2010-517

George Edwards, Yuriy Brun, Nenad Medvidovic, "Automated Analysis and Code Generation for Domain-Specific Models" (pdf)

Domain-specific languages (DSLs) are able to concisely and intuitively express the essential features of system designs because they use the abstractions and patterns that are most useful and natural for the system under development. However, leveraging system models specified in a DSL for automated analysis and code generation requires the implementation of specialized analysis and code generation tools for each DSL. In this paper, we describe a strategy for creating model analysis and code generation tools that can be applied to a large family of DSLs, and can consequently be easily reused off-the-shelf with new languages and applied in new contexts. The key innovation underlying our strategy is the use of metamodels to automatically synthesize configurations and plug-ins for flexible analysis and code generation frameworks. The result is that software engineers utilizing a DSL can perform automated model analysis and code generation without having to develop custom tools, greatly reducing the effort required to utilize a DSL.

Added August 11th, 2010


USC-CSSE-2010-516

Barry Boehm, Jo Ann Lane, Supannika Koolmanojwong, Richard Turner, "Architected Agile Solutions for Software-Reliant Systems," Proceedings, INCOSE 2010 (pdf)

Systems are becoming increasingly reliant on software due to needs for rapid fielding of “70%” capabilities, interoperability, net-centricity, and rapid adaptation to change. The latter need has led to increased interest in agile methods of software development, in which teams rely on shared tacit interpersonal knowledge rather than explicit documented knowledge. However, such capabilities often need to be scaled up to higher levels of performance and assurance, requiring stronger architectural support. Several organizations have recently transformed themselves by developing successful combinations of agility and architecture that scale up to projects of up to 100 personnel. This paper identifies a set of key principles for such architected agile solutions for software-reliant systems, and illustrates them with several case studies.

Added August 2nd, 2010


USC-CSSE-2010-515

A. Winsor Brown, Barry Boehm, Supannika Koolmanojwong, "Software Cost Estimation in the Incremental Commitment Model," Systems Research Forum, Volume 4, Issue 1, June 2010 (pdf)

Complex, software-intensive systems — especially those with multiple software component developers — and Directed Systems of Systems (DSOS) or Acknowledged Systems of Systems (ASOS) need approaches to control the development and estimate the software development costs and schedules. This paper introduces a next-generation synthesis of the spiral model and other leading process models into the Incremental Commitment Model (ICM). The ICM emphasizes architecting systems (or DSOSs) to encapsulate subsystems (or systems) undergoing the most rapid change, and having agile systems engineers handle longer-range change traffic to rebaseline the plans for future increments. Systems engineers do this while largely plan-driven teams develop and continuously verify and validate (V&V) the current increment, as is usually required for safe or secure software.

Our approach for estimating software development cost of systems is the Constructive Incremental Commitment Cost Model (COINCOMO) and its tool, which currently implements together in one tool the Constructive Cost Model (COCOMO II), and the Constructive Phased Schedule and Effort Model (COPSEMO).

Added July 8th, 2010


USC-CSSE-2010-514

David Woollard, Chris A. Mattmann, Daniel Popescu, Nenad Medvidovic, "KADRE: Domain-Specific Architectural Recovery For Scientific Software Systems" (pdf)

Scientists today conduct new research via software-based experimentation and validation in a host of disciplines, including materials science, life sciences, astronomy, and physics. Scientific software represents a significant investment due to its complexity and longevity, with a life span of sometimes decades. Unfortunately, there is little reuse of scientific software beyond small libraries, increasing development and maintenance costs. Modern workflow and grid technologies offer a promising medium for reuse in this domain; however, reuse at the workflow level is quite different from that of software libraries. To alleviate this disconnect, we have developed KADRE, a domain-specific architecture recovery approach and toolset to aid automatic and accurate identification of workflow components in existing scientific software. KADRE improves upon traditional state-of-the-art general clustering techniques, helping to promote component-based reuse of scientific kernels within the domain.

Added June 28th, 2010


USC-CSSE-2010-513

Ivo Krka, Yuriy Brun, Daniel Popescu, Joshua Garcia, Nenad Medvidovic, "Using Dynamic Execution Traces and Program Invariants to Enhance Behavioral Model Inference," New Ideas and Emerging Results Track, 32nd International Conference on Software Engineering (ICSE 2010) (pdf)

Software behavioral models have proven useful for design, validation, verification, and maintenance. However, existing approaches for deriving such models sometimes overgeneralize what behavior is legal. We outline a novel approach that utilizes inferred likely program invariants and method invocation sequences to obtain an object-level model that describes legal execution sequences. The key insight is using program invariants to identify similar states in the sequences. We exemplify how our approach improves upon certain aspects of the state-of-the-art FSA-inference techniques.

Added May 24th, 2010


USC-CSSE-2010-512

A. Winsor Brown, Supannika Koolmanojwong, "USC's Two Semester Software Engineering Graduate Project Course," The 2nd International Symposium on Engineering Education and Educational Technologies: EEET 2010 (pdf)

For over 12 years, USC's Computer Science (CSCI) Department has been offering a two-semester software engineering course designed by Dr. Barry Boehm and required for the CS department's Specialization in Software Engineering. From the beginning, it has involved real projects for real clients. The courses focus on activities not normally covered in regular computer science courses. While the focus is on the software engineering of the projects, each project is done in the context of systems engineering, employing the Incremental Commitment Model (ICM). Project teams are self-organizing and select the projects they wish to work on. Over the years, various online tools have been developed specifically to support the courses.

Added May 24th, 2010


USC-CSSE-2010-511

Ivo Krka, "Enabling Requirements Elaboration Through Synthesis, Refinement, and Analysis of Partial Behavior Models" (pdf)

In this thesis, I propose a holistic approach for capturing, analyzing, and refining behavioral requirements of a software system by synthesizing, analyzing, and refining Modal Transition Systems in a scalable manner.

Added May 17th, 2010


USC-CSSE-2010-510

Ivo Krka, Nenad Medvidovic, "Supporting Refinement of Partial Behavior Models Under Model Composition and Abstraction" (pdf)

During requirements elicitation and preliminary design, a system's behavior is typically partially specified: some behavior is defined as either forbidden or required, while other behavior is not yet categorized as either of those. The goal is then to gradually refine the specification and finally arrive at a complete behavioral description. Partial-behavior modeling formalisms, such as Modal Transition Systems, can support such gradual refinement. However, several challenges still remain, particularly in the context of hierarchical architectural specifications where (sub)system models are composed of smaller subsystems and components, and, in turn, abstracted to be made more compact and analyzable. Refinement of a behavior specification can be performed using models of varying scopes (e.g., different subsystems) and levels of abstraction, depending on the stakeholder needs. The primary challenge then becomes ensuring that a refinement of a model is correct when that model is a composition (or abstraction) of other models; this problem has not been addressed in the existing literature. In this paper, we propose a framework that supports reasoning about behavior refinement of composite and abstract models. Our framework assures that (1) a refinement of a composition is realized with refinements of the individual composed models and (2) a refinement of an abstract model is interpreted in terms of a refinement of the detailed model.

Added May 17th, 2010


USC-CSSE-2010-509

Raymond Madachy, Barry Boehm, Dan Houston, "Modeling Software Defect Dynamics," SoftwareTech, Volume 13, Number 1, April 2010, pp. 26-34 (pdf)

Recent enhancements to the COnstructive QUALity MOdel (COQUALMO) help in assessing defect dynamics to better understand the tradeoffs of different processes and technologies for reducing defects.

Added May 10th, 2010


USC-CSSE-2010-508

Ricardo Valerdi, Barry Boehm, "COSYSMO: A Systems Engineering Cost Model," Genie Logiciel, March 2010, Number 92, pp. 2-6 (pdf)

Building on the synergy between systems engineering and software engineering, we have developed a parametric model to estimate systems engineering costs. The goal of this model, called COSYSMO (Constructive Systems Engineering Cost Model), is to more accurately estimate the time and effort associated with performing the systems engineering tasks in complex systems. This article describes how COSYSMO was developed and summarizes its size drivers and effort multipliers. We conclude with an example estimate to illustrate the use of the model to estimate systems engineering cost.

Added May 4th, 2010


USC-CSSE-2010-507

Judith Dahmann, George Rebovich Jr., Jo Ann Lane, Ralph Lowry, John Palmer, "Systems of Systems Test and Evaluation Challenges," IEEE SoSE 2010 (pdf)

A growing number of military capabilities are achieved through a system of systems approach, and this trend is likely to continue in the foreseeable future. Systems of systems differ from traditional systems in ways that require tailoring of systems engineering processes to successfully deliver their capabilities. This paper describes the distinct characteristics of systems of systems that impact their test and evaluation, discusses their unique challenges, and suggests strategies for managing them. The recommendations are drawn from the experiences of active system of systems engineering practitioners.

Added April 26th, 2010


USC-CSSE-2010-506

Jo Ann Lane, Tim Bohn, "Using SysML to Evolve Systems of Systems," to be sent to the SE Journal for consideration soon (pdf)

The recent Department of Defense (DoD) guidebook, Systems Engineering for Systems of Systems, describes how traditional systems engineering activities have evolved to support systems engineering at the system of systems (SoS) level. Part of the research for this guidebook probed the application of modeling and simulation to support SoS systems engineering. The findings indicated that limited modeling and simulation are currently used, but that additional support would be useful if models could be quickly generated and used to support needed decision making. This paper presents an approach to using Systems Modeling Language (SysML) models to support these needs.

Added April 26th, 2010


USC-CSSE-2010-505

Judith Dahmann, George Rebovich Jr., Jo Ann Lane, "System Engineering Artifacts for SoS," IEEE Systems 2010 Conference (pdf)

This paper describes system of systems (SoS) systems engineering (SE) artifacts, compares and contrasts them with similar ones developed and used for individual systems, and explains how they are used to guide SoS engineering processes. The paper concludes with next steps for using SoS artifacts to continue maturing the understanding of SoS SE in an international cooperative effort with the United Kingdom, Australia, and Canada.

Added April 26th, 2010


USC-CSSE-2010-504

Jo Ann Lane, Ricardo Valerdi, "Accelerating System of Systems Engineering Understanding and Optimization through Lean Enterprise Principles," IEEE Systems 2010 Conference (pdf)

By applying a lean enterprise lens to studies of the evolving field of system of systems engineering (SoSE), it has been observed that many SoSE teams are developing processes that are consistent with many lean enterprise principles. These SoSE processes are designed to efficiently evolve the group of systems to meet new needs using limited resources. This paper provides further insights and recommendations for the evolution of system of systems processes using lean concepts. We conclude with a discussion of the potential conflicts between SoSE and lean paradigms and provide thirteen SoS case studies to illustrate the emphasis on lean thinking.

Added April 26th, 2010


USC-CSSE-2010-503

Ali Afzal Malik, Barry Boehm, "An Empirical Study of the Efficacy of COCOMO II Cost Drivers in Predicting a Project's Elaboration Profile" (pdf)

A project's elaboration profile consists of a set of ratios called elaboration factors that quantify the step-wise expansion of a project's requirements from very high-level business objectives to very low-level source lines of code. Knowledge of a project's elaboration profile can be extremely useful in deriving an early estimate of its size. The real challenge, however, is to predict the elaboration profile. Can the COCOMO II cost drivers be used for this purpose? This paper attempts to answer this question. It examines the elaboration profiles and COCOMO II cost driver ratings of 25 small real-client projects. The data collection process is thoroughly described, and the cost drivers relevant at each stage of elaboration are identified. Relationships between elaboration factors and relevant cost drivers are analyzed using simple as well as multiple regression. The results indicate that there is no magical formula for predicting the various elaboration factors just from the values of COCOMO II cost drivers. This may be due to confounding factors, which are highlighted in this paper.
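The paper's simple-regression analysis style can be sketched with an ordinary least-squares fit of one elaboration factor against one cost-driver rating. The data points below are fabricated placeholders, not the paper's 25-project dataset.

```python
# Least-squares sketch: fit y = slope * x + intercept, where x is a
# cost-driver rating and y an elaboration factor.

def simple_regression(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical ratings vs. elaboration factors for four projects.
ratings = [0.9, 1.0, 1.1, 1.2]
factors = [4.0, 5.0, 5.5, 7.0]
slope, intercept = simple_regression(ratings, factors)
```

A weak fit across many such driver/factor pairs is exactly the kind of evidence behind the paper's "no magical formula" conclusion.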

Added April 5th, 2010


USC-CSSE-2010-502

Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm, "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty'," ASE '10 (pdf)

Accurate software cost and schedule estimates are essential, especially for large software projects. However, once the required effort has been estimated, little is done to recalibrate and reduce the uncertainty of the initial estimates. To address this problem, we have developed and used a framework to continuously monitor software project progress and readjust the estimated effort, utilizing the Constructive Cost Model II (COCOMO II) and the Unified CodeCount Tool developed by the University of Southern California. As a software project progresses, we gain more information, such as complexity, architecture resolution, and people capability, as well as the actual source lines of code developed and effort spent. This information is then used to assess and re-estimate the effort required to complete the remainder of the project. As the effort estimates grow more accurate and less uncertain, the project's quality goals can be better assured within the available resources. The paper thus also provides and analyzes empirical data on how projects evolve within the familiar software "cone of uncertainty."
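The continuous-reassessment idea can be sketched with the COCOMO II.2000 post-architecture form, PM = A * KSLOC^E * prod(EM) with E = B + 0.01 * sum(scale factors), where A = 2.94 and B = 0.91 are the published COCOMO II.2000 constants. The remaining-size step below is a simplified reading of the framework, not the paper's exact procedure.

```python
import math

A, B = 2.94, 0.91  # COCOMO II.2000 calibration constants

def cocomo_effort(ksloc, scale_factors, effort_multipliers):
    """Estimated effort in person-months for a given size in KSLOC."""
    E = B + 0.01 * sum(scale_factors)
    return A * ksloc ** E * math.prod(effort_multipliers)

def reestimate_remaining(total_ksloc_est, completed_ksloc, sf, em):
    """Re-estimate effort for the code not yet written, using the actual
    completed size (e.g., reported by a code-counting tool) to shrink the
    remaining-size estimate as the project progresses."""
    remaining = max(total_ksloc_est - completed_ksloc, 0.0)
    return cocomo_effort(remaining, sf, em)
```

At each assessment point the measured completed size, together with updated scale-factor and multiplier ratings, replaces part of the original guess, which is how the estimate's uncertainty narrows over time.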

Added March 30th, 2010


USC-CSSE-2010-501

Barry Boehm, Dan Ingold, Kathleen Dangle, Rich Turner, Paul Componation, "Early Identification of SE-Related Program Risks," CSER 2010 (pdf)

This paper summarizes the results of a Department of Defense (DoD) Systems Engineering Research Center (SERC) project to synthesize analyses of DoD Systems Engineering (SE) effectiveness risk sources into a lean framework and toolset for early identification of SE-related program risks. It includes concepts of operation which enable project sponsors and performers to agree on the nature and use of more effective evidence-based reviews. These enable early detection of missing SE capabilities or personnel competencies with respect to a framework of Goals, Critical Success Factors (CSFs), and Questions determined from leading DoD early-SE CSF analyses. The SE Effectiveness Measurement (EM) tools enable risk-based prioritization of corrective actions: shortfalls in evidence for each question represent early uncertainties which, when combined with the relative system impact of a negative answer to the question, translate into the degree of risk that needs to be managed to avoid system overruns and incomplete deliveries.
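The risk-based prioritization the abstract describes can be sketched as a classic risk-exposure ranking: each question's evidence shortfall acts as a probability-of-loss proxy and is combined with system impact. The field names and 0-to-1 scales below are illustrative assumptions, not the toolset's actual interface.

```python
# Rank questions by risk exposure = evidence shortfall * system impact,
# so the largest-exposure gaps get corrective action first.

def prioritize(questions):
    """questions: list of (name, evidence_shortfall, system_impact),
    each rated on a 0..1 scale. Returns the list sorted with the
    highest risk exposure first."""
    return sorted(questions, key=lambda q: q[1] * q[2], reverse=True)

# Hypothetical review findings.
risks = prioritize([
    ("architecture feasibility evidence", 0.8, 0.9),
    ("staffing plan evidence", 0.3, 0.5),
    ("COTS interoperability evidence", 0.6, 0.4),
])
```

Sorting by exposure rather than by shortfall alone keeps a low-impact gap from crowding out a moderate gap in a high-impact area.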

Added March 16th, 2010


USC-CSSE-2010-500

Barry Boehm, Jo Ann Lane, "DoD Systems Engineering and Management Implications for Evolutionary Acquisition of Major Defense Systems," A DoD SERC Quick-Look Study and CSER 2010 Invited Presentation (pdf)

This DoD Systems Engineering Research Center (SERC) briefing was commissioned as a quick-look study by Ms. Kristen Baldwin, Director of Systems Analysis within the ODDR&E Systems Engineering (SE) organization and Deputy Director of the SERC.   The recent DoD Instruction 5000.02 and the Congressional Weapon System Acquisition Reform Act (WSARA) have recommended evolutionary acquisition as the preferred strategy for Major Defense Acquisition Programs (MDAPs).  Since DoD SE has largely been performed under non-evolutionary acquisition practices, there are questions about what needs to be changed about DoD SE practices and SE-related acquisition management practices to enable SE to function more effectively.   The study is presented as a briefing with notes to emphasize that it is not an exhaustive study, but one to set the context for more detailed analyses and initiatives.

The study found that there are several forms of evolutionary acquisition, and that there is no one-size-fits-all SE approach that is best for all situations.  For rapid-fielding situations, an easiest-first, get something working, evolutionary SE approach is best.  But for enduring systems, an easiest-first evolutionary SE approach is likely to produce an unscalable system whose architecture is incompatible with achieving high levels of safety and security.   The study also found that evolutionary acquisition requires much higher sustained levels of SE effort, earlier and continuous integration and test, pro-active approaches to address sources of system change, greater levels of concurrent engineering, and achievement reviews based on evidence of feasibility vs. evidence of plans, activity, and system descriptions.

The study also found that many traditional acquisition practices are incompatible with effective SE of evolutionary acquisition.  These include assumptions that full-capability requirements can be specified up front along with associated full-capability plans, budgets, schedules, work breakdown structures, and earned-value management targets; that most systems engineers can be dismissed after PDR; and that all forms of requirements change or “creep” should be discouraged.  The study also found that other inhibitors to effective SE need to be addressed, such as underbudgeting (SE is the first victim of inadequate budgets); contracting provisions emphasizing functional definition before key performance parameters are addressed; and management temptations to show rapid progress on easy initial increments while deferring the hard parts until later increments.

Based on these findings, the study recommended that significant initiatives be undertaken to provide the necessary EvA infrastructure of SE-related acquisition and development processes, contracting and incentive structures, milestone decision criteria, financing, program management, and staffing, along with research support to address current gaps between EvA infrastructure needs and capabilities.  Projects attempting to succeed at EvA while burdened by DoD’s current acquisition infrastructure could lead to sufficient numbers of projects “failing with EvA” to cause policymakers to give up on it before it has a chance to succeed.

Added March 15th, 2010


Copyright 2010 The University of Southern California

The written material, text, graphics, and software available on this page and all related pages may be copied, used, and distributed freely as long as the University of Southern California as the source of the material, text, graphics or software is always clearly indicated and such acknowledgement always accompanies any reuse or redistribution of the material, text, graphics or software; also permission to use the material, text, graphics or software on these pages does not include the right to repackage the material, text, graphics or software in any form or manner and then claim exclusive proprietary ownership of it as part of a commercial offering of services or as part of a commercially offered product.