University of Southern California
    
Center for Systems and Software Engineering



Technical Reports

USC-CSSE-2007-748

Chiyoung Seo, Sam Malek, Nenad Medvidovic, "An Energy Consumption Framework for Distributed Java-Based Software Systems," Proceedings of the Twenty-Second ACM/IEEE International Conference on Automated Software Engineering (ASE 2007), Atlanta, Georgia, November 5-7, 2007, pp. 421-424 (pdf)

In this paper we define and evaluate a framework for estimating the energy consumption of Java-based software systems. Our primary objective in devising the framework is to enable an engineer to make informed decisions when adapting a system's architecture, such that the energy consumption on hardware devices with a finite battery life is reduced, and the lifetime of the system's key software services increases. Our framework explicitly takes a component-based perspective, which renders it well suited for a large class of today's distributed, dynamic, and mobile applications. The framework allows the engineer to estimate the software system's energy consumption at construction time and refine it at runtime. In a large number of distributed application scenarios, the framework showed very good precision on the whole, giving results that were within 5% (and often less) of the actually measured power losses incurred by executing the software. While our empirical evidence suggests that the framework is broadly applicable as-is, our work to date has highlighted a number of future enhancements.
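
As a rough illustration of the component-based accounting described above (a hypothetical sketch, not the authors' framework), the following charges each component a computational cost for CPU time plus a communication cost for bytes sent; all constants, names, and numbers are invented:

    # Hypothetical component-level energy accounting in the spirit of the
    # framework above. Each interface invocation is charged a CPU part and
    # a transmission part; the cost constants are assumed, not measured.
    CPU_COST_PER_MS = 0.8    # assumed millijoules per ms of CPU time
    TX_COST_PER_KB = 0.05    # assumed millijoules per KB transmitted

    def component_energy(invocations):
        """invocations: list of (cpu_ms, kb_sent) pairs, one per call."""
        return sum(cpu * CPU_COST_PER_MS + kb * TX_COST_PER_KB
                   for cpu, kb in invocations)

    def system_energy(profile):
        """profile: mapping of component name -> list of invocations."""
        return sum(component_energy(calls) for calls in profile.values())

    profile = {
        "Scheduler": [(12.0, 0.4), (8.5, 0.1)],
        "Router": [(3.2, 2.0)],
    }
    print(round(system_energy(profile), 2), "mJ (estimated)")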

Added May 10th, 2006


USC-CSSE-2007-747

Jo Ann Lane, "COSOSIMO Parameter Definitions," Symposium on Complex Systems Engineering, January 11-12, 2007 (pdf)

The Constructive System-of-Systems (SoS) Integration Cost Model (COSOSIMO) is designed to estimate the effort associated with the Lead System Integrator (LSI) activities to define the SoS architecture, identify sources to either supply or develop the required SoS component systems, and eventually integrate and test these high level component systems. This technical report is an update to the COSOSIMO parameter definitions dated March 2006 and describes the parameters for each of the COSOSIMO sub-models. The parameters include a set of size drivers that are used to calculate a nominal effort for the sets of activities associated with the sub-model and a set of cost drivers that are used to adjust the nominal effort based on related SoS architecture, process, and personnel characteristics.
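
The report itself defines the actual COSOSIMO parameters; purely as a schematic of the shape described above (size drivers yielding a nominal effort, cost drivers scaling it), a COCOMO-family sketch might look as follows, with placeholder constants rather than calibrated COSOSIMO values:

    # Schematic COCOMO-family shape of a cost-model sub-model: size drivers
    # produce a nominal effort, which cost drivers then scale. A, B, and all
    # driver values are illustrative placeholders, not COSOSIMO parameters.
    def sub_model_effort(size_drivers, cost_drivers, A=2.5, B=1.1):
        nominal = A * sum(size_drivers) ** B      # nominal person-months
        adjustment = 1.0
        for multiplier in cost_drivers:           # architecture, process,
            adjustment *= multiplier              # personnel characteristics
        return nominal * adjustment

    # e.g., two size drivers (interfaces, component systems) and two
    # cost drivers (one penalty, one credit):
    print(round(sub_model_effort([40, 25], [1.2, 0.9]), 1), "person-months")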

Added May 11th, 2006


USC-CSSE-2007-746

Jesal Bhuta, Chris A. Mattmann, Nenad Medvidovic, Barry Boehm, "A Framework for the Assessment and Selection of Software Components and Connectors in COTS-Based Architectures," Sixth Working IEEE/IFIP Conference on Software Architecture, Mumbai, India, 2007 (pdf)

Software systems today are composed from prefabricated commercial components and connectors that provide complex functionality and engage in complex interactions. Unfortunately, because of the distinct assumptions made by developers of these products, successfully integrating them into a software system can be complicated, often causing budget and schedule overruns. A number of integration risks can often be resolved by selecting the 'right' set of COTS components and connectors that can be integrated with minimal effort. In this paper we describe a framework for selecting COTS software components and connectors ensuring their interoperability in software-intensive systems. Our framework is built upon standard definitions of both COTS components and connectors and is intended for use by architects and developers during the design phase of a software system. We highlight the utility of our framework using a challenging example from the data-intensive systems domain. Our preliminary experience in using the framework indicates an increase in interoperability assessment productivity by 50% and accuracy by 20%.

Added September 19th, 2006


USC-CSSE-2007-745

George Edwards, Sam Malek, Nenad Medvidovic, "Scenario-Driven Dynamic Analysis of Distributed Architectures," Proceedings of the 10th International Conference on Fundamental Approaches to Software Engineering (FASE), March 2007 (pdf)

Software architecture constitutes a promising approach to the development of large-scale distributed systems, but architecture description languages (ADLs) and their associated architectural analysis techniques suffer from several important shortcomings. This paper presents a novel approach that reconceptualizes ADLs within the model-driven engineering (MDE) paradigm to address their shortcomings. Our approach combines extensible modeling languages based on architectural constructs with a model interpreter framework that enables rapid implementation of customized dynamic analyses at the architectural level. Our approach is demonstrated in the eXtensible Tool chain for Evaluation of Architectural Models (XTEAM), a suite of ADL extensions and model transformation engines targeted specifically for highly distributed, resource-constrained, and mobile computing environments. XTEAM model transformations generate system simulations that provide a dynamic, scenario- and risk-driven view of the executing system. This information allows an architect to compare architectural alternatives and weigh trade-offs between multiple design goals, such as system performance, reliability, and resource consumption. XTEAM provides the extensibility to easily accommodate both new modeling language features and new architectural analyses.

Added October 30th, 2006


USC-CSSE-2007-744

Yue Chen, "Stakeholder Value Driven Threat Modeling for Off The Shelf Based Systems," 2007 ICSE Doctoral Symposium (pdf)

This paper abstract summarizes the Threat Modeling method based on Attacking Path Analysis (T-MAP), which quantifies and prioritizes security threats by calculating the total severity weights of relevant attacking paths for Commercial Off The Shelf (COTS) based systems. Compared to existing approaches, T-MAP is dynamic and sensitive to system stakeholder value priorities and the IT environment. It distills the technical details of thousands of relevant software vulnerabilities into management-friendly numbers at a high level. In its initial usage in a large IT organization, T-MAP has demonstrated significant strength in prioritizing COTS vulnerabilities and estimating security investment effectiveness, as well as in COTS security assessment early in the project life cycle. Furthermore, a software tool has been developed to automate T-MAP.

Added December 6th, 2006


USC-CSSE-2007-743

Raymond Madachy, Barry Boehm, Jo Ann Lane, "Assessing Hybrid Incremental Processes for SISOS Development," Software Process Improvement and Practice, Wiley, 2007 (pdf)

New processes are being assessed to address modern challenges for Software-Intensive Systems of Systems (SISOS), such as coping with rapid change while simultaneously assuring high dependability. A hybrid agile and plan-driven process based on the spiral lifecycle has been outlined to address these conflicting challenges with the need to rapidly field incremental capabilities in a value-based framework. A system dynamics model has been developed to assess the incremental hybrid process and support project decision-making. It estimates cost and schedule for multiple increments of a hybrid process that uses three specialized teams, and also considers the mission value of software capabilities. It considers changes due to external volatility and feedback from user-driven change requests, and dynamically re-estimates and allocates resources in response to the volatility. Deferral policies and team sizes can be experimented with, and it includes tradeoff functions between cost and the timing of changes within and across increments, length of deferral delays, and others. We illustrate how the model can be used to determine optimal agile team size to handle changes. Both the hybrid process and simulation model are being evolved on a very large scale incremental SISOS project and other potential pilots.

Added December 18th, 2006


USC-CSSE-2007-742

Hasan Kitapci, Barry Boehm, "Using a Hybrid Method for Formalizing Informal Stakeholder Requirements Inputs," Software Process Improvement and Practice, Wiley, 2007 (pdf)

The success of software development depends on the quality of the requirements specification. Moreover, good – sufficiently complete, consistent, traceable, and testable – requirements are a prerequisite for later activities of the development project. Without understanding what the stakeholders really want and need, and writing these requirements down, projects will not deliver what the stakeholders wanted.

During the development of the WinWin negotiation model and the EasyWinWin requirements negotiation method, we have gained considerable experience in capturing informal requirements in over 100 projects. However, the transition from informal representations to semi-formal and formal representations is still a challenging problem.

Based on our analysis of the projects to date, we have developed an integrated set of gap-bridging methods as a hybrid method to formalize informal stakeholder requirements inputs. The basic idea is that orchestrating these gap-bridging methods throughout the requirements engineering process can significantly reduce requirements-related problems and ease the transition to formality.

Added December 19th, 2006


USC-CSSE-2007-741

Barry Boehm, Apurva Jain, "The Value-Based Theory of Systems Engineering: Identifying and Explaining Dependencies," INCOSE 2007, San Diego, CA, June 24-28, 2007 (pdf)

The Value-Based Theory of Systems Engineering brings together many interdisciplinary theoretical lenses into a state of synchrony and allows reasoning about systems in different dimensions, and at various levels of abstraction. The theory’s primary strength is in its ability to identify and work through the dependencies of most socio-political-technical systems and explain success in such contexts by situating the success-critical stakeholders at the forefront. In this paper we present the 4+1 theoretical lenses of the Value-Based Theory of Systems Engineering with a core emphasis on the Dependency Theory – the first and most complex of the four component theories.

Added July 18th, 2008


USC-CSSE-2007-740

Barry Boehm, Apurva Jain, "Developing a Process Framework Using Principles of Value-Based Software Engineering," Software Process: Improvement and Practice, Volume 12, Issue 5, September 2007, pp. 377-385 (pdf)

In this article we present a software process framework using the 4 + 1 theory and principles of value-based software engineering (VBSE). The value-based process framework serves as a 6-step process guide, and explains critical interactions between the five theories in the 4 + 1 theory of value-based software engineering. This article also applies the process framework to a supply chain organization through a case study analysis to illustrate its strength in practice.

Added July 18th, 2008


USC-CSSE-2007-739

Barry Boehm, "Future Challenges and Rewards for Software Engineers," DoD Software Tech News, October 2007, pp. 6-12 (pdf)

A clear trend for the future of software engineering is illustrated by Figure 1, showing the growth in percentage of aircraft requirements involving software control.

This makes software engineering skills increasingly valuable and software careers increasingly influential, but it also places significant responsibilities on software engineers to ensure that their software will be able to deliver high levels of dependability. Some additional future trends discussed below will make this goal increasingly challenging, but also increasingly important to address. These trends are: uncertainty and emergence; rapid change; multifaceted dependability; diversity; and interdependence.

Added July 16th, 2008


USC-CSSE-2007-738

Barry Boehm, Jo Ann Lane, "Putting Systems to Work: Processes for Expanding System Capabilities Through System of Systems Acquisitions," Symposium on Complex Systems Engineering, January 11-12, 2007 (pdf)

Business and mission pressures to provide new complex system capabilities quickly are leading organizations to pursue the development of systems of systems.  This allows us to get additional “mileage” from our existing but somewhat aging systems by putting them to work in a system of systems environment with other systems in order to achieve the new desired capabilities.  Our experiences in helping to define, acquire, develop, and assess 21st century software-intensive systems of systems (SISOS) have taught us that traditional 20th century acquisition and development processes do not work well on such systems.  This article summarizes the characteristics of such systems, and indicates the major problem areas in using traditional processes on them.  We also present new processes that we and others have been developing, applying, and evolving to address 21st century SISOS.  These include extensions to the risk-driven spiral model to cover broad (many systems), deep (many supplier levels), and long (many increments) acquisitions needing rapid fielding, high assurance, adaptability to high change traffic, and complex interactions with evolving, often complex, Commercial Off-the-Shelf (COTS) products, legacy systems, and external systems.

Added April 16th, 2008


USC-CSSE-2007-737

Vu Nguyen, Sophia Deeds-Rubin, Thomas Tan, Barry Boehm, "A SLOC Counting Standard," COCOMO II Forum 2007 (pdf)

Source Lines of Code (SLOC or LOC) is one of the most widely used sizing metrics in industry and the literature. It is the key input for most major cost estimation models, such as COCOMO, SLIM, and SEER-SEM. Although the SEI and the IEEE have established SLOC definitions and guidelines to standardize counting practice, inconsistency in SLOC measurements still exists in industry and research. This problem makes the SLOC metric incomparable among organizations and cost estimates inaccurate. This report presents a set of counting standards that defines what and how to count SLOC. Our experience with the development and use of the USC CodeCount™ toolset, a popular utility that automates the SLOC counting process, suggests that this problem can be alleviated by the use of a reasonable and unambiguous counting standard guide and with the support of a configurable counting tool.
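
As a minimal illustration of the kind of rule such a counting standard pins down (count non-blank, non-comment logical lines), the sketch below is an assumption-laden toy, not the CodeCount implementation or the published standard:

    # Toy logical-SLOC counter: blank lines and full-line comments are
    # excluded. Real standards (and USC CodeCount) handle many more cases,
    # e.g. block comments and multi-statement lines; the comment prefix
    # convention here is just an assumption.
    def count_sloc(lines, comment_prefix="//"):
        count = 0
        for line in lines:
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                count += 1
        return count

    source = [
        "// driver",
        "int main() {",
        "    return 0;   // trailing comments still count as code",
        "}",
        "",
    ]
    print(count_sloc(source))   # -> 3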

Added April 2nd, 2008


USC-CSSE-2007-736

Jesal Bhuta, Barry Boehm, "A Framework for Identification and Resolution of Interoperability Mismatches in COTS-Based Systems," 2nd International Workshop on Incorporating COTS Software into Software Systems: Tools and Techniques (co-located with the 29th International Conference on Software Engineering), Minneapolis, May 2007 (pdf)

Software systems today are frequently composed from prefabricated commercial components that provide complex functionality and engage in complex interactions. Such projects that utilize multiple commercial-off-the-shelf (COTS) products often confront interoperability conflicts resulting in budget and schedule overruns. These conflicts occur because of the incompatible assumptions made by developers of these products. Identification of such conflicts and planning strategies to resolve them is critical for developing such systems under budget and schedule constraints. In this paper we present an attribute-based framework that can be used to perform high-level and automated interoperability assessment to filter out COTS product combinations whose integration will not be feasible within the project constraints. Our framework is built upon standard definitions of both COTS components and connectors and is intended for use by architects and developers during the design phase of a software system. Our preliminary experience in using the framework indicates an increase in interoperability assessment productivity by 50% and accuracy by 20%.

Added February 22nd, 2008


USC-CSSE-2007-734

Jesal Bhuta, Barry Boehm, "Attribute-Based COTS Product Interoperability Assessment," International Conference on COTS-Based Software Systems, Alberta, Canada, February/March 2007 (pdf)

Software systems today are frequently composed from prefabricated commercial components that provide complex functionality and engage in complex interactions. Such projects that utilize multiple commercial-off-the-shelf (COTS) products often confront interoperability conflicts resulting in budget and schedule overruns. These conflicts occur because of the incompatible assumptions made by developers of these products. Identification of such conflicts and planning strategies to resolve them is critical for developing such systems under budget and schedule constraints. Unfortunately, acquiring information to perform interoperability analysis is a time-intensive process. Moreover, the increase in the number of COTS products available to fulfill similar functionality leads to hundreds of COTS product combinations, further complicating the COTS interoperability assessment landscape. In this paper we present a set of attributes that can be used to define COTS interoperability-specific characteristics. COTS product definitions based on these attributes can be used to perform high-level and automated interoperability assessment to filter out COTS product combinations whose integration will not be feasible within project constraints. In addition to the above attributes, we present a tool that can be used to assess COTS-based architectures for interoperability conflicts, reducing the overall effort spent in performing interoperability analysis. Our preliminary experience in using the framework indicates an increase in interoperability assessment productivity by 50% and accuracy by 20%.
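
As a toy illustration of attribute-based filtering (the attribute names and the matching rule are invented, not the paper's attribute set), two products can be flagged whenever a shared interaction attribute disagrees:

    # Hypothetical attribute-based interoperability filter: each product is
    # described by interaction attributes, and a pair is flagged as a
    # potential mismatch if any shared attribute disagrees.
    def mismatches(product_a, product_b):
        shared = product_a.keys() & product_b.keys()
        return {k for k in shared if product_a[k] != product_b[k]}

    db_server = {"control": "synchronous", "data_format": "binary"}
    web_client = {"control": "asynchronous", "data_format": "binary"}

    conflicts = mismatches(db_server, web_client)
    if conflicts:
        print("potential interoperability conflicts:", conflicts)  # {'control'}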

Added February 22nd, 2008


USC-CSSE-2007-733

Yue Chen, Barry Boehm, Luke Sheppard, "Measuring Security Investment Benefit for Off the Shelf Software Systems - Stakeholder Value Driven Approach," The 2007 Workshop on the Economics of Information Security (WEIS 2007), CMU, PA, June 2007 (pdf)

This paper presents the Threat Modeling method based on Attacking Path Analysis (T-MAP), which quantifies security threats by calculating the total severity weights of relevant attacking paths for Commercial Off The Shelf (COTS) based systems. Compared to existing approaches, T-MAP is sensitive to an organization’s business value priorities and IT environment. It distills the technical details of thousands of relevant software vulnerabilities into management-friendly numbers at a high level, and systematically establishes traceability and consistency from management-level organizational value propositions to technical-level security threats and corresponding mitigation strategies. In its initial usage in a large IT organization, T-MAP has demonstrated promising strength in prioritizing and estimating security investment effectiveness, as well as in evaluating the security performance of COTS systems. In the case study, we demonstrate the steps of using T-MAP to analyze the cost-effectiveness of how system patching, user account control, and firewalls can improve security. In addition, we introduce a software tool that automates T-MAP.
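
A simplified sketch of the severity-weight idea follows; the per-path factors and numbers are placeholders, not the published T-MAP formulas:

    # Illustrative scoring in the spirit of T-MAP: each attack path gets a
    # weight, and the system's threat score is the sum over paths. The
    # three factors and all values below are invented placeholders.
    def path_weight(asset_value, vuln_severity, exposure):
        # each factor assumed normalized to [0, 1]
        return asset_value * vuln_severity * exposure

    attack_paths = [
        # (value of threatened asset, vulnerability severity, exposure)
        (0.9, 0.8, 0.7),   # e.g. customer DB via unpatched web server
        (0.5, 0.6, 0.9),   # e.g. internal wiki via default password
    ]

    total = sum(path_weight(*p) for p in attack_paths)
    print("total threat weight:", round(total, 3))

    # Candidate countermeasures can then be ranked by how much total
    # weight they remove per dollar spent.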

Added February 22nd, 2008


USC-CSSE-2007-732

Hasan Kitapci, Barry Boehm, "Using a Hybrid Method for Formalizing Informal Stakeholder Decisions," 40th Annual Hawaii International Conference on System Sciences (HICSS'07), 2007, p. 283 (pdf)

Decisions are hard to make when available information is incomplete, inconsistent, and ambiguous. Moreover, good – sufficiently complete, consistent, traceable, and testable – requirements are a prerequisite for successful projects. Without understanding what the stakeholders really want and need and writing these requirements in a concise, understandable, and testable manner, projects will not develop what the stakeholders wanted, leading to either major late rework or project termination.

During the development of the WinWin negotiation model and the EasyWinWin requirements negotiation method, we have gained considerable experience in capturing decisions made by stakeholders in over 100 projects. However, the transition from informal decisions to requirements specification is still a challenging problem.

Based on our analysis of the projects to date, we have developed an integrated set of gap-bridging methods as a hybrid method to support stakeholders in making better decisions, in order to reduce requirements-related problems and ease the transition to formality.

Added February 22nd, 2008


USC-CSSE-2007-731

Monvorath Phongpaibul, Barry Boehm, "A Replicate Empirical Comparison between Pair Development and Software Development with Inspection," ESEM 2007 (pdf)

In 2005, we compared development effort and software quality between software development with Fagan’s inspection and pair development.  Three experiments were conducted in Thailand: two classroom experiments and one industry experiment.  We found that in the classroom experiments, the pair development group required less average development effort than the inspection group while achieving the same or higher level of quality. The industry experiment showed pair development to have slightly more effort but about 40% fewer major defects. However, since this set of experiments was conducted in Thailand, the results might differ in other countries due to the impact of cultural differences. To investigate this we conducted another experiment with Computer Science graduate students at USC in Fall 2006.  Unfortunately, the majority of the graduate students who participated in the experiment were from India, a country whose culture is not much different from Thailand’s [18], [19].  As a result, we cannot compare the impact of cultural differences in this paper.  However, the results showed that the experiment can be replicated in other countries where the cultures are similar.

Added February 22nd, 2008


USC-CSSE-2007-730

Monvarath Phongpaibul, Supannika Koolmanojwong, Alexander Lam, Barry Boehm, "Comparative Experiences with Electronic Process Guide Generator Tools," ICSP 2007, pp. 61-72 (pdf)

The primary objective of all software engineering courses is to help students learn how to develop successful software systems with good software engineering practices. Various tools and guidelines are used to help students gain as much knowledge as possible. USC’s Center for Systems and Software Engineering (CSSE) has found that the keystone course in learning software engineering is a year-long real-client team project course. Over the last ten years, CSSE has evolved a set of guidelines for the course, and has experimented with early approaches to creating electronic process guides for the MBASE (Model-Based (Systems) Architecting and Software Engineering) Guidelines using Spearmint/EPG. Currently, CSSE has been developing and experimenting with the Eclipse Process Framework (EPF) to situate the LeanMBASE Guidelines. This paper reports our comparative experiences of using the earlier and current tools to generate electronic process guides. In our analysis, we used the objectives defined by Humphrey and Kellner [17] to compare the process tools. The evaluation identifies some research challenges and areas for future research work.

Added February 22nd, 2008


USC-CSSE-2007-729

Supannika Koolmanojwong, Barry Boehm, "An Empirical Study on MBASE and LeanMBASE," ESEM 2007, p. 496 (pdf)

From 1998 to 2005, the Model-Based (Systems) Architecting and Software Engineering (MBASE) guidelines were used successfully in the keystone two-semester real-client team-project graduate software engineering course sequence. However, to fit small-sized projects with limited schedules, MBASE was trimmed to reduce its heavy documentation effort. The result, LeanMBASE, is a lightweight software process framework that helps teams identify high-value activities and balance the development workload; it is now used in the software engineering course. This paper reports the comparison and improvement of projects that use MBASE and LeanMBASE in terms of content, performance, and customer satisfaction.

Added February 22nd, 2008


USC-CSSE-2007-728

Jo Ann Lane, F. Stan Settles, Barry Boehm, "Assessment of Process Modeling Tools to Support the Analysis of System of Systems Engineering Activities," Proceedings of the Fifth Annual Conference on Systems Engineering Research, March 2007 (pdf)

Many organizations are attempting to provide new system capabilities through the net-centric integration of existing systems into systems of systems (SoS).  The engineering activities used to architect and develop these SoS are often referred to as SoS Engineering (SoSE).  Recent reports are indicating that SoSE activities are considerably different from classical systems engineering (SE) activities.  Other systems engineering experts believe that there is nothing really different with respect to systems engineering activities or component-based engineering in the SoS environment—that there are only differences in scale and complexity. To better understand SoSE, studies are currently underway to evaluate the differences between classical SE and SoSE.  This paper summarizes process areas to be investigated in the SE-SoSE comparison and then analyzes and evaluates several types of process modeling tools in order to identify a set of tools that can be used to capture classical SE and SoSE process characteristics for further comparison.

Added February 22nd, 2008


USC-CSSE-2007-725

Jo Ann Lane, Barry Boehm, "System-of-Systems Cost Estimation: Analysis of Lead System Integrator Engineering Activities," Information Resources Management Journal, Vol. 20, No. 2, 2007 (pdf)

As organizations strive to expand system capabilities through the development of system-of-systems (SoS) architectures, they want to know "how much effort" and "how long" to implement the SoS. In order to answer these questions, it is important to first understand the types of activities performed in SoS architecture development and integration and how these vary across different SoS implementations. This paper provides results of research conducted to determine types of SoS Lead System Integrator (LSI) activities and how these differ from the more traditional system engineering activities described in Electronic Industries Alliance (EIA) 632 (“Processes for Engineering a System”). This research further analyzed effort and schedule issues on “very large” SoS programs to more clearly identify and profile the types of activities performed by the typical LSI and to determine organizational characteristics that significantly impact overall success and productivity of the LSI effort. The results of this effort have been captured in a reduced-parameter version of the Constructive SoS Integration Cost Model (COSOSIMO) that estimates LSI SoS Engineering (SoSE) effort.

(This was an invited expansion of a paper that received a best paper award at the 2006 Symposium on Information Systems Research and Systems Approach, part of InterSymp 2006, sponsored by the International Institute for Advanced Studies in Systems Research and Cybernetics.)

Added February 22nd, 2008


USC-CSSE-2007-724

Supannika Koolmanojwong, Monvorath Phongpaibul, Natachart Laoteppitak, Barry Boehm, "Comparative Experiences with Software Process Modeling Tools for the Incremental Commitment Model" (pdf)

The Incremental Commitment Model (ICM) is a new-generation process model that focuses on the incremental growth of success-critical stakeholder satisfaction, system definition, and stakeholder commitment. The ICM has been introduced in systems engineering, but not software engineering. In Fall 2008, the ICM will be used as the process model for developing software systems in the USC graduate software engineering course. Hence, two significantly different software process modeling tools were selected to create the electronic process guidelines for this course. This paper reports our comparative experiences with an adaptability-tolerant framework, the Eclipse Process Framework Composer (EPFC), and a precision-oriented process definition language, Little-JIL, in creating the ICM electronic guide. In addition, the paper provides a tool comparison analysis using Humphrey and Kellner's criteria and an evaluation by a target user group. The evaluation identifies some research challenges and areas for future research work.

Added December 15th, 2007


USC-CSSE-2007-720

Ricardo Valerdi, Ray Madachy, "Impact and Contributions of MBASE on Software Engineering Graduate Courses," Journal of Systems and Software, Volume 80, Issue 8, August 2007 (pdf)

As the founding Director of the Center for Software Engineering, Professor Barry Boehm developed courses that have greatly impacted the education of software engineering students. Through the use of the MBASE framework and complementary tools, students have been able to obtain real-life software development experience without leaving campus. Project team clients and the universities have also benefited. This paper provides evidence on the impact of Dr. Boehm’s frameworks on courses at two universities, and identifies major contributions to software engineering education and practice.

Added November 8th, 2007


USC-CSSE-2007-719

Tim Menzies, Oussama Elrawas, Jairus Hihn, Martin S. Feather, Ray Madachy, Barry Boehm, "The Business Case for Automated Software Engineering," Automated Software Engineering, Proceedings of the Twenty-Second IEEE/ACM International Conference on Automated Software Engineering, Atlanta, Georgia, USA, 2007, pp. 303-312 (pdf)

Adoption of advanced automated SE (ASE) tools would be favored if a business case could be made that these tools are more valuable than alternate methods. In theory, software prediction models can be used to make that case. In practice, this is complicated by the “local tuning” problem. Normally, predictors for software effort and defects and threats use local data to tune their predictions. Such local tuning data is often unavailable.

This paper shows that assessing the relative merits of different SE methods need not require precise local tunings. STAR1 is a simulated annealer plus a Bayesian post-processor that explores the space of possible local tunings within software prediction models. STAR1 ranks project decisions by their effects on effort, defects, and threats. In experiments with two NASA systems, STAR1 found that ASE tools were necessary to minimize effort/defects/threats.
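
STAR1 itself is not reproduced here, but the simulated-annealing core it builds on follows the textbook pattern sketched below; the toy objective standing in for the effort/defect/threat predictors is invented:

    # Textbook simulated-annealing skeleton of the kind STAR1 builds on
    # (not STAR1 itself). Improvements are always accepted; worsenings are
    # accepted with Boltzmann probability under a cooling temperature.
    import math
    import random

    def anneal(initial, neighbor, score, steps=10_000, t0=1.0):
        current, best = initial, initial
        for step in range(1, steps + 1):
            temp = t0 / step                       # cooling schedule
            candidate = neighbor(current)
            delta = score(candidate) - score(current)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                current = candidate
            if score(current) < score(best):
                best = current
        return best

    # Toy stand-in objective: minimize (x - 3)^2 over one decision variable.
    result = anneal(0.0,
                    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
                    score=lambda x: (x - 3.0) ** 2)
    print(round(result, 2))   # close to 3.0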

Added November 8th, 2007


USC-CSSE-2007-718

N/A


USC-CSSE-2007-717

Jo Ann Lane, Ricardo Valerdi, "Synthesizing SoS Concepts for Use in Cost Modeling," 2007 Wiley Periodicals, Inc. Syst Eng 10, pp. 297-308 (pdf)

Today’s need for more complex, capable systems in a short timeframe is leading many organizations towards the integration of existing systems into network-centric, knowledge-based system-of-systems (SoS). This presents new acquisition challenges in the area of cost estimation because of the lack of commonly accepted definitions and roles. Software and system cost models to date have focused on the software and system development activities of a single system. When viewing the new SoS architectures, one finds that the cost associated with the design and integration of these SoSs is not handled well, if at all, in current cost models. This paper looks at commonly cited definitions of SoS, then evaluates these definitions to determine if they adequately describe and converge on a set of SoS characteristics in the areas of product, development process, and development personnel that can be used to define boundaries and key parameters for an initial SoS cost model. Sixteen SoS definitions are synthesized to provide reasonable coverage for different properties of SoSs. Two examples are used to illustrate key characteristics relevant to cost modeling.

Added November 8th, 2007


USC-CSSE-2007-716

Jo Ann Lane, Barry Boehm, "Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation - A DACS State-of-the-Art Report," A DACS State-of-the-Art Report, August 2007 (pdf)

Many Department of Defense (DoD) organizations are attempting to provide new system capabilities through the net-centric integration of existing software-intensive systems into a new system often referred to as Software-Intensive System of Systems (SISOS). The goal of this approach is to build on existing capabilities to produce new capabilities not provided by the existing systems in a timely and cost-effective manner. Many of these new SISOS efforts such as the Future Combat Systems are of a size and complexity unlike their predecessor systems and cost estimation tools such as the Constructive Cost Model (COCOMO) suite are undergoing significant enhancements to address these challenges. This report describes the unique challenges of SISOS cost estimation, how current tools are changing to support these challenges, as well as on-going efforts to further support SISOS cost estimation needs. This report concentrates heavily on the COCOMO-based models and tools, but also incorporates activities underway by other cost model vendors.

Added November 1st, 2007


USC-CSSE-2007-715

Barry Boehm, Jo Ann Lane, "Using the Incremental Commitment Model to Integrate System Acquisition, Systems Engineering, and Software Engineering," an expanded version of the CrossTalk article, October 2007 (pdf)

One of the top recommendations to emerge from the October 2006 Deputy Under Secretary of Defense (DUSD) Acquisition, Technology, and Logistics (AT&L) Defense Software Strategy Summit was to find ways of better integrating software engineering into the systems engineering and acquisition process.  Concurrently, a National Research Council study was addressing the problem of better integrating human factors into the systems engineering and acquisition process.  This paper presents a model that emerged from these and related efforts that shows promise of improving these integrations.  This model, called the Incremental Commitment Model (ICM), organizes systems engineering and acquisition processes in ways that better accommodate the different strengths and difficulties of hardware, software, and human factors engineering approaches.  It also provides points at which they can synchronize and stabilize, and at which their risks of going forward can be better assessed and fitted into a risk-driven stakeholder resource commitment process. 

Added July 17th, 2007


USC-CSSE-2007-714

Yuriy Brun, Nenad Medvidovic, "Discreetly Distributing Computation via Self-Assembly" (pdf)

One aspect of large networks, such as the Internet, is the colossal amount of computation their nodes could perform if that computation were distributed efficiently.  The Internet has already enabled solving some problems, e.g., NP-complete problems, that were unlikely to have been solved on individual computers.  However, the methods leading to those solutions disclosed the inputs, algorithms, and outputs to the Internet nodes.  It has even been argued in the literature that it is not possible to ask an entity for help with solving NP-complete problems without disclosing the input and algorithm.

In this paper, we present an architectural style that distributes computation over a network discreetly, such that no small group of computers (asymptotically smaller than O(n log n) for an n-bit input) knows the algorithm or the input.  The style abstracts away the distribution and only requires writing non-parallel code, automating in turn the parallelization of computation.  Further, the style is fault- and adversary-tolerant (malicious, faulty, and unstable nodes may not break the computation) and scalable (communication among the nodes does not increase with network or problem size).  Systems designed and constructed according to the style free the architect from having to worry about these non-functional properties.  We formally argue that our architectural style has all three properties: discreetness, fault and adversary tolerance, and scalability.

Added July 17th, 2007


USC-CSSE-2007-713

LiGuo Huang,  Barry Boehm, Hao Hu, Jidong Ge, Jian Lü, Cheng Qian, "Modeling a Value-Based Process Based On Object Petri Nets and Its Application in the ERP Domain," International Journal of Software and Informatics, Fall 2007 (pdf)

Commercial organizations increasingly need software processes sensitive to business value, quick to apply, and capable of early analysis for subprocess consistency and compatibility. This paper presents experience in applying a lightweight synthesis of a Value-Based Software Quality Achievement (VBSQA) process and an Object-Petri-Net-based process model (called VBSQA-OPN) to achieve a manager-satisfactory process for software quality achievement in an on-going ERP software project in China. The results confirmed that 1) the application of value-based approaches was inherently better than the value-neutral approaches adopted by most ERP software projects; 2) the VBSQA-OPN model provided project managers with a synchronization and stabilization framework for process activities, success-critical stakeholders, and their value propositions; 3) process visualization and simulation tools significantly increased management visibility and controllability for the success of the software project.

Added July 4th, 2007


USC-CSSE-2007-712

Barry Boehm, Jo Ann Lane, "Using the Incremental Commitment Model to Integrate System Acquisition, Systems Engineering, and Software Engineering," CrossTalk Journal, Volume 20, Number 10, 2007 (pdf)

One of the top recommendations to emerge from the October 2006 Deputy Under Secretary of Defense (DUSD) Acquisition, Technology, and Logistics (AT&L) Defense Software Strategy Summit was to find ways of better integrating software engineering into the systems engineering and acquisition process.  Concurrently, a National Research Council study was addressing the problem of better integrating human factors into the systems engineering and acquisition process.  This paper presents a model that emerged from these and related efforts that shows promise of improving these integrations.  This model, called the Incremental Commitment Model (ICM), organizes systems engineering and acquisition processes in ways that better accommodate the different strengths and difficulties of hardware, software, and human factors engineering approaches.  It also provides points at which they can synchronize and stabilize, and at which their risks of going forward can be better assessed and fitted into a risk-driven stakeholder resource commitment process. 

Added July 4th, 2007


USC-CSSE-2007-711

Yue Chen, Barry Boehm, "Stakeholder Value Driven Threat Modeling for Off The Shelf Based Systems," The 29th International Conference on Software Engineering (ICSE), Doctoral Symposium, Minneapolis, MN, May 2007 (pdf)

As the usage of third-party Commercial-Off-The-Shelf (COTS) and open source software continues to increase, COTS security has become a major concern for many organizations whose daily business relies extensively upon a healthy IT infrastructure. Yet, according to the 2006 CSI/FBI computer crime survey, 47% of the surveyed organizations spent no more than 2% of their IT budget on security. Faced with limited IT resources and fast-changing Internet threats, the ability to prioritize security vulnerabilities and address them efficiently has become a critical success factor for every security manager.

As is well known, the security impacts of vulnerabilities can be specified in terms of Confidentiality, Integrity, and Availability (CIA). These attributes can have very different business implications in different contexts. Unfortunately, most current leading vulnerability rating systems by CERT, Microsoft, NIST, and Symantec are value-neutral, static, and treat CIA equally. To date, it is still very difficult to prioritize security vulnerabilities efficiently with quantitative evidence because of the lack of effective metrics and historical data, and the complex and sensitive nature of security.

We propose to (1) establish a framework, namely the Threat Modeling method based on Attacking Path Analysis (T-MAP), which is sensitive to stakeholder value propositions, to dynamically prioritize COTS vulnerabilities and model security threats; (2) grow a comprehensive COTS vulnerability database that supports T-MAP; (3) automate T-MAP with an XML framework that abstracts the process.

Added February 22nd, 2008


USC-CSSE-2007-710

Barry Boehm, Jo Ann Lane, "Using the Incremental Commitment Model to Achieve Successful System Development" (pdf)

Studies to evaluate the usage and success of the spiral development model have shown mixed results—many successes but many misinterpretations and neglect of its underlying principles, leading to continuing development problems and failed systems.   This article describes a recent improvement to the spiral development process model that is easier to understand, is harder to misinterpret, and better enables integration of the human, hardware, and software aspects of a software-intensive system’s development and evolution.  This model, called the Incremental Commitment Model (ICM), embodies the principles underlying the spiral model, but organizes its process into multiple views (including a spiral view) that are more straightforward to apply and better aligned with mainstream system acquisition phases and milestones.

Added May 7th, 2007


USC-CSSE-2007-709

DeWitt T. Latimer IV, "Acquiring and Engineering Robotic Systems," Qual Exam Report (pdf)

Based on the observation that some robotic technologies fail to be successfully acquired and/or transitioned into long-term operations, this work proposes to study the various robotic engineering methods employed at the acquisition level. The goal is to determine which methods are being utilized, their effectiveness, and whether there are any observable gaps in the practice of robotics engineering at the acquisition level. In the conduct of this research, a framework for analyzing robotic case studies will be presented, a survey of current practice by robotic system engineering acquisition personnel will be conducted, and heuristics for engineering support of cost analysis will be reviewed. These case studies, the survey, and the cost analysis will be used to determine evidence of engineering methods, forming a preliminary body of knowledge for new engineers involved in robotic systems acquisition. Finally, any gaps in practice will be catalogued for future research.

Added May 7th, 2007


USC-CSSE-2007-708

Yue Chen, Barry Boehm, Luke Sheppard, "Measuring Security Investment Benefit for Off the Shelf Software Systems - A Stakeholder Value Driven Approach," The Sixth Workshop on the Economics of Information Security (WEIS 2007) (pdf)

This paper presents the Threat Modeling method based on Attacking Path Analysis (T-MAP), which quantifies security threats by calculating the total severity weights of relevant attacking paths for Commercial Off The Shelf (COTS) based systems. Compared to existing approaches, T-MAP is sensitive to an organization’s business value priorities and IT environment. It distills the technical details of thousands of relevant software vulnerabilities into management-friendly numbers at a high level. In its initial usage in a large IT organization, T-MAP has demonstrated significant strength in prioritizing and estimating security investment effectiveness, as well as in evaluating the security performance of COTS systems. In the case study, we demonstrate the steps of using T-MAP to analyze the cost-effectiveness of how system patching, user account control, and firewalls can improve security. In addition, we introduce a software tool that automates T-MAP.

Added May 4th, 2007


USC-CSSE-2007-706

Vito Perrone, Chris A. Mattmann, Sean Kelly, Dan Crichton, Anthony Finkelstein, Nenad Medvidovic, "A Reference Framework for Requirements and Architecture in Biomedical Grid Systems," Proceedings of the 2007 IEEE International Conference on Information Reuse and Integration (IEEE IRI-07), Las Vegas, NV, August 13-15, 2007, pp. 418-423 (pdf)

In this paper we introduce the work done to define a framework for requirements and architectural understanding in biomedical grid computing systems. A set of core requirements for biomedical grids has been identified on the basis of our experience in the analysis and development of several biomedical and other grid systems, including the National Cancer Institute's Early Detection Research Network (EDRN) in the US and the National Cancer Research Institute (NCRI) Platform in the UK. The requirements have been specified taking into account different points of view and are intended as a core set that can be extended on the basis of project-specific aspects. These are also mapped to existing architectures of biomedical grid systems and their constituent components. Such a framework is intended as a guide for equipping developers with conceptual tools to avoid costly mistakes when architecting biomedical grid systems.

Added March 24th, 2007


USC-CSSE-2007-705

Jo Ann Lane, F. Stan Settles, Barry Boehm, "Assessment of Process Modeling Tools to Support the Analysis of System of Systems Engineering Activities," Conference on Systems Engineering Research (CSER) 2007 (pdf)

Many organizations are attempting to provide new system capabilities through the net-centric integration of existing systems into systems of systems (SoS).  The engineering activities used to architect and develop these SoS are often referred to as SoS Engineering (SoSE).  Recent reports are indicating that SoSE activities are considerably different from classical systems engineering (SE) activities.  Other systems engineering experts believe that there is nothing really different with respect to systems engineering activities or component-based engineering in the SoS environment—that there are only differences in scale and complexity. To better understand SoSE, studies are currently underway to evaluate the differences between classical SE and SoSE.  This paper summarizes process areas to be investigated in the SE-SoSE comparison and then analyzes and evaluates several types of process modeling tools in order to identify a set of tools that can be used to capture classical SE and SoSE process characteristics for further comparison.

Added March 23rd, 2007


USC-CSSE-2007-704

Jo Ann Lane, "Understanding Differences Between System of Systems Engineering and Traditional Systems Engineering," PhD Qual Exam Report (pdf)

Today’s need for more complex, more capable systems in a short timeframe is leading more organizations towards the integration of existing systems, Commercial-Off-the-Shelf (COTS) products, and new systems into network-centric, knowledge-based systems of systems (SoS).  With this development approach, system development processes to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high level components are evolving and are being referred to as SoS Engineering (SoSE).  Recent reports are indicating that SoSE activities are considerably different from the more Traditional Systems Engineering (TSE) activities.  Other systems engineering experts believe that there is nothing really different with respect to system engineering activities or component-based engineering in the SoS environment—that there are only differences in scale and complexity. However, most of these beliefs are opinions based on ad hoc observations, albeit from experts working in the SoSE arena, and not substantiated by case study analyses or data.  The goal of this research is to investigate SoSE through the study of several large-scale SoSE programs to determine if there are significant differences between SoSE and TSE processes and, if so, to describe them in terms of key cost drivers and impacts to associated effort.

This research effort surveys both SoSE projects and relatively large-scale, complex TSE projects to identify key SoS characteristics and SoSE processes, then develops process models for a set of SoSE and TSE projects in order to compare the SoSE activities and associated effort of these projects with TSE activities and effort. The resulting analysis is designed to answer the question: are SoSE processes different from TSE processes and, if so, how? This research effort will provide valuable insights into SoSE as well as provide data to support the on-going development of SoSE cost models at the University of Southern California (USC) Center for Systems and Software Engineering (CSSE).

This proposal presents 1) a statement of the research topic and the intended research contribution, 2) a review of relevant literature, 3) the proposed methodology for addressing the SoSE research illustrated using a sample SoSE and TSE project, and 4) a plan for the completion of this research dissertation.

Added March 23rd, 2007


USC-CSSE-2007-703

Yuriy Brun, "Solving NP-Complete Problems in the Tile Assembly Model," A subsequent version has been published as Theoretical Computer Science, Volume 395, Issue 1, April 17, 2008, pp. 31-46 (pdf)

Formalized study of self-assembly has led to the definition of the tile assembly model, a highly distributed parallel model of computation that may be implemented using molecules or a large computer network such as the Internet. Previously, I defined deterministic and nondeterministic computation in the tile assembly model and showed how to add, multiply, and factor. Here, I extend the notion of computation to include deciding subsets of the natural numbers, and present a system that decides SubsetSum, a well-known NP-complete problem. The computation is nondeterministic and each parallel assembly executes in time linear in the input. The system requires only a constant number of different tile types: 49. I describe mechanisms for finding the successful solutions among the many parallel assemblies, explore bounds on the probability of such a nondeterministic system succeeding, and prove that this probability can be made arbitrarily close to 1.
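
The tile system cannot be reproduced here, but its nondeterministic behavior can be emulated in ordinary code: each parallel assembly corresponds to one branch of the recursion below, which guesses, for every value, whether to include it in the candidate subset:

    # Emulation of nondeterministic SubsetSum: every branch of the
    # recursion plays the role of one parallel assembly, guessing per
    # element whether to take it. (Exponential on a sequential machine,
    # of course; the tile model runs the branches in parallel.)
    def subset_sum(values, target):
        def branch(i, remaining):
            if remaining == 0:
                return True                  # an accepting "assembly"
            if i == len(values) or remaining < 0:
                return False
            # the two nondeterministic tile choices: take or skip values[i]
            return (branch(i + 1, remaining - values[i])
                    or branch(i + 1, remaining))
        return branch(0, target)

    print(subset_sum([3, 9, 8, 4], 12))   # True  (3 + 9, or 8 + 4)
    print(subset_sum([3, 9, 8], 5))       # False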

Added February 19th, 2007


USC-CSSE-2007-702

Daniel Popescu, "Framework for Replica Selection in Fault-Tolerant Distributed Systems" (pdf)

This paper describes my term project, developed in the course CS 589, Software Engineering for Embedded Systems. The term project was to be a design and implementation of a novel application or development tool that exploits one or more existing approaches to software engineering in the context of embedded systems, demonstrates a novel idea in this domain, or overcomes a known significant challenge posed by embedded systems.

In my project I examined how to select replica components in fault-tolerant systems to increase the overall reliability of the system while considering the additional costs of the deployed replica components. As a result, I developed a framework for different replica selection algorithms and evaluated five selection strategies within this framework.
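
As one plausible example of a strategy such a framework could host (an invented stand-in, not one of the paper's five strategies), a greedy rule might add the cheapest, most reliable replica options until a cost budget is exhausted:

    # Invented greedy replica-selection strategy: replicas are assumed to
    # fail independently, so the replicated service is lost only if every
    # chosen replica fails.
    def system_unreliability(failure_probs):
        product = 1.0
        for p in failure_probs:
            product *= p
        return product

    def greedy_select(candidates, budget):
        """candidates: list of (failure_probability, cost) replica options."""
        chosen, spent = [], 0.0
        # simple heuristic ordering: prefer options with a low
        # failure-probability x cost product
        for fail_p, cost in sorted(candidates, key=lambda c: c[0] * c[1]):
            if spent + cost <= budget:
                chosen.append((fail_p, cost))
                spent += cost
        return chosen, system_unreliability([p for p, _ in chosen])

    options = [(0.02, 5.0), (0.10, 2.0), (0.25, 1.0)]
    print(greedy_select(options, budget=3.0))
    # -> ([(0.1, 2.0), (0.25, 1.0)], 0.025)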

Added January 18th, 2007


USC-CSSE-2007-701

David Woollard, Chris Mattmann, Nenad Medvidovic, "Injecting Software Architectural Constraints into Legacy Scientific Applications," 11th IEEE European Conference on Software Maintenance and Reengineering (CSMR07), Amsterdam, the Netherlands, March 2007 (pdf)

While software architectures have been shown to aid developers in maintenance, reuse, and evolution as well as many other software engineering tasks, there is little language-level support for these architectural concepts in legacy programming languages such as Fortran and C. Because many existing scientific codes are written in legacy programming languages, it is difficult to integrate them into architected software systems. By wrapping these scientific codes in architecturally-aware Java interfaces, we are able to componentize legacy programs, integrating them into systems built with first-class architectural elements while meeting the performance and throughput requirements of scientific codes.

Added January 4th, 2007


USC-CSSE-2007-700

Da Yang, Barry Boehm, Ye Yang, Qing Wang, Mingshu Li, "Coping with the Cone of Uncertainty: An Empirical Study of the SAIV Process Model," Lecture Notes in Computer Science, Springer Berlin / Heidelberg, Volume 4470/2007, pp. 37-48 (pdf)

There is large uncertainty in software cost in the early stages of software development due to requirements volatility, incomplete understanding of the product domain, reuse opportunities, market change, etc. This makes it an increasingly challenging issue in the IT field to deliver software on time, within budget, and with satisfactory quality. In this paper, we introduce the Schedule as Independent Variable (SAIV) approach and present an empirical study of how it is used to cope with cost uncertainty and deliver customer-satisfactory products in 8 USC (University of Southern California) projects. We also investigate the success factors and best practices in managing the uncertainty of cost.
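
The core SAIV move is to hold the delivery date fixed and vary the feature content. A hypothetical sketch (feature names, efforts, and the pessimism factor are all invented) might look like this:

    # Illustrative SAIV-style core selection: treat the schedule as fixed
    # and admit the highest-priority features whose conservative effort
    # estimates fit. The 1.3 pessimism factor is an assumed ratio of a
    # high-confidence estimate to the nominal estimate.
    PESSIMISM = 1.3

    def saiv_core(features, capacity_pm):
        """features: list of (name, priority, nominal_effort_pm)."""
        selected, used = [], 0.0
        for name, _, effort in sorted(features, key=lambda f: f[1]):
            conservative = effort * PESSIMISM
            if used + conservative <= capacity_pm:
                selected.append(name)
                used += conservative
        return selected   # lower-priority features become deferrable

    backlog = [("login", 1, 2.0), ("reports", 2, 3.0), ("theming", 3, 2.5)]
    print(saiv_core(backlog, capacity_pm=8.0))   # ['login', 'reports']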

Added February 22nd, 2008


Copyright 2008 The University of Southern California

The written material, text, graphics, and software available on this page and all related pages may be copied, used, and distributed freely as long as the University of Southern California as the source of the material, text, graphics or software is always clearly indicated and such acknowledgement always accompanies any reuse or redistribution of the material, text, graphics or software; also permission to use the material, text, graphics or software on these pages does not include the right to repackage the material, text, graphics or software in any form or manner and then claim exclusive proprietary ownership of it as part of a commercial offering of services or as part of a commercially offered product