Session 1 |
09:00 - 09:15 | Welcome note and introduction |
09:15 - 09:30 | Summary of TwinPeaks, Matthias Galster |
09:30 - 09:50 |
A Knowledge-Assisted Framework to Bridge Functional and Architecturally Significant Requirements
Preethu Rose and Balaji Balasubramaniam
(Tata Consultancy Services, India)
The disciplines of requirements engineering (RE) and software architecture (SA) are fundamental to the success of software projects. The synergistic relationship between these two disciplines has long been acknowledged by academicians and practitioners alike. To build successful and cost-effective software systems, we must understand and leverage the linkages between functional and architectural requirements. We discuss a knowledge-assisted approach that establishes traceability between functional and architectural requirements. The approach classifies requirements into a problem (functional) context and a solution (architectural) context. The functional context is called the Functional Requirement Viewpoint (FRV). The architectural context is further categorized into three sub-contexts, namely the Functional Architecture Viewpoint (FAV), the Technical Architecture Viewpoint (TAV), and the Deployment Architecture Viewpoint (DAV). Although the approach separates the problem domain and the solution domain explicitly, it facilitates the concurrent development of requirements and architectural specifications, appreciating the necessary interplay between the two.
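As a minimal illustration (with hypothetical requirement IDs and element names, not the authors' tooling), the viewpoint-based trace links the abstract describes could be recorded as:

```python
# Sketch of trace links from a functional requirement (FRV) to the three
# architectural viewpoints named in the abstract: FAV, TAV, and DAV.
# All identifiers and element names below are invented for illustration.
from collections import defaultdict

trace = defaultdict(list)

def link(fr_id: str, viewpoint: str, element: str):
    """Record that a functional requirement traces to an architectural element."""
    assert viewpoint in {"FAV", "TAV", "DAV"}, "unknown architectural viewpoint"
    trace[fr_id].append((viewpoint, element))

link("FR-12", "FAV", "PaymentService")           # functional architecture
link("FR-12", "TAV", "REST/JSON interface")      # technical architecture
link("FR-12", "DAV", "payments-cluster node")    # deployment architecture

print(trace["FR-12"])
```

Keeping the problem-side ID (`FR-12`) separate from the solution-side elements mirrors the approach's explicit problem/solution split while still letting both sides evolve concurrently.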
|
09:50 - 10:10 |
An Ontological Framework for Architecture Model Integration
Arvind Kiwelekar and Rushikesh K. Joshi
(Dr. Babasaheb Ambedkar Technological University, India; IIT Bombay, India)
Architecture model integration is the process of defining correspondences between high-level architecture elements and other software elements from artifacts developed during various phases of the product life cycle. A primary purpose of this integration is to address the semantic gap that results from the scattering of concerns across multiple representations of application domain entities throughout a series of models. This paper presents an ontological approach aimed at addressing the semantic gap. A reference architecture ontology and an architecture knowledge base are the central components of the proposed framework. Around these components, three framework processes, namely interpretation, representation, and refinement, are integrated. The framework processes are uniformly applied to derive architecture models from artifacts developed during requirements, low-level design, and implementation.
|
10:10 - 10:30 |
Helping System Engineers Bridge the Peaks
Neha Rungta, Oksana Tkachuk, Suzette Person, Jason Biatek, Michael W. Whalen, Joseph Castle, and Karen Gundy-Burlet
(NASA Ames Research Center, USA; NASA Langley Research Center, USA; University of Minnesota, USA)
In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.
|
10:30 - 11:00 | coffee break |
Session 2 |
11:00 - 12:00 |
What Drives Design?
Rick Kazman (Keynote)
University of Hawaii, Honolulu, HI, USA
|
12:00 - 12:30 |
A Framework for Identifying and Analyzing Non-functional Requirements from Text
Vibhu Saujanya Sharma, Roshni R. Ramnani, and Shubhashis Sengupta
(Accenture Technology Labs, India)
Early identification of Non-Functional Requirements (NFRs) is important because it has a direct bearing on the design and architecture of the system. NFRs form the basis for architects to create the technical architecture of the system, which acts as the scaffolding within which its functionality is delivered. Failure to identify and analyze NFRs early on can result in unclassified, incomplete, or conflicting NFRs, which typically leads to costly rework in later stages of software development. In practice, this activity is primarily done manually. In this paper, we present a framework to automatically detect and classify non-functional requirements from textual natural language requirements. Our approach to identifying NFRs is based on extracting multiple features by parsing the natural language requirement, whereby the presence of a certain combination of, and relationship among, the features uniquely identifies the requirement as an NFR of a particular category. These features are specified as pattern-based rules written in a human-readable, domain-specific language that we have defined, which makes it easy to create and extend rules. Our approach has been implemented as a prototype tool, and we also present the results of applying it to a publicly available requirements corpus.
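To give a flavor of rule-based NFR classification (the rules and categories below are hypothetical stand-ins, not the authors' DSL or feature set), a bare-bones version could look like this:

```python
import re

# Hypothetical pattern rules, loosely in the spirit of the paper's approach:
# each NFR category is identified by textual patterns in the requirement.
RULES = {
    "performance":  [r"\bwithin \d+\s*(ms|seconds?)\b", r"\bresponse time\b",
                     r"\bthroughput\b"],
    "security":     [r"\bencrypt(ed|ion)?\b", r"\bauthenticat(e|ion)\b",
                     r"\baccess control\b"],
    "availability": [r"\bavailab(le|ility)\b", r"\buptime\b"],
}

def classify_nfr(requirement: str):
    """Return the NFR categories whose patterns match the requirement text."""
    text = requirement.lower()
    return [cat for cat, patterns in RULES.items()
            if any(re.search(p, text) for p in patterns)]

print(classify_nfr("The system shall respond within 200 ms under peak load."))
# -> ['performance']
```

The paper's framework goes further by parsing the sentence and combining multiple features per rule; the sketch above only shows the general pattern-matching idea.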
|
12:30 - 02:00 | Lunch |
Session 3 |
02:00 - 02:30 |
Engineering Support for Virtual Integration
Mike Whalen (Invited talk)
University of Minnesota, USA
Nuseibeh's TwinPeaks paper makes the argument that system design naturally iterates between activities in the "problem space" where we do goal and requirements exploration and the "solution space" where we explore means to implement aspects of the system. In previous work, we pointed out that this idea naturally describes hierarchical construction, in which design choices made at higher levels of abstraction levy requirements on system components at lower levels of abstraction. Thus, whether an aspect of the system is a design choice or a requirement depends largely on one's vantage point within the hierarchy of system components.
Given a hierarchy of requirements and design information, we can perform "virtual integration"; that is, determine whether the system (at some level in the hierarchy) meets its functional, performance, and resource requirements given the requirements of its subcomponents. If the subcomponents are correctly implemented with respect to their requirements, this approach should allow smooth system integration. Unfortunately, we often specify certain aspects of the system incompletely or create a set of requirements that is unrealizable, that is, one for which it is impossible to create an implementation that simultaneously satisfies all of the requirements. In this talk, I discuss tool support for performing virtual integration with confidence, along with some examples using the Architecture Analysis & Design Language (AADL) and the AGREE plugin for formal verification.
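A toy example of the kind of check virtual integration enables (component names and numbers are hypothetical; AGREE performs this compositionally over formal contracts, not simple sums):

```python
# "Virtual integration" of a resource requirement: given worst-case latency
# bounds promised by subcomponents, check the system-level latency budget
# before any implementation exists. Purely illustrative.
def virtually_integrate(budget_ms: float, component_bounds_ms: dict) -> bool:
    """System meets its budget if the worst-case component latencies,
    composed sequentially, stay within it."""
    worst_case = sum(component_bounds_ms.values())
    return worst_case <= budget_ms

pipeline = {"sense": 5.0, "filter": 12.0, "actuate": 8.0}
print(virtually_integrate(30.0, pipeline))  # -> True: 25.0 ms <= 30.0 ms
```

If the check fails, the requirements at this level are unrealizable as decomposed: either the budget or a subcomponent's bound must be renegotiated, which is exactly the iteration between the peaks the talk describes.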
|
02:30 - 03:00 |
Exploring the Twin Peaks using Probabilistic Verification Techniques
Anitha Murugesan, Lu Feng, Mats P. E. Heimdahl, Sanjai Rayadurgam, Michael W. Whalen, and Insup Lee
(University of Minnesota, USA; University of Pennsylvania, USA)
System requirements and system architecture/design co-evolve as the understanding of both the problem at hand and the solution to be deployed evolves---the Twin Peaks concept. Modeling of requirements and solution is a promising approach for exploring the Twin Peaks. Commonly, such models are deterministic because of the choice of modeling notation and available analysis tools. Unfortunately, most systems operate in an uncertain environment and contain physical components whose behaviors are stochastic. Although much can be learned from modeling and analysis with commonly used tools, e.g., Simulink/Stateflow and the Simulink Design Verifier, the SCADE toolset, etc., the results from the exploration of the Twin Peaks will---by necessity---be inaccurate and can be misleading; inclusion of the probabilistic behavior of the physical world provides crucial additional insight into the system's required behavior, its operational environment, and the solution proposed for its software. Here, we share our initial experiences with model-based deterministic and probabilistic verification approaches while exploring the Twin Peaks. The intent of this paper is to demonstrate how probabilistic reasoning helps illuminate weaknesses in system requirements, environmental assumptions, and the intended software solution that could not be identified using deterministic techniques. We illustrate our experience through a medical device subsystem, modeled and analyzed using the Simulink/Stateflow (deterministic) and PRISM (probabilistic) tools.
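A back-of-the-envelope illustration of what probabilistic analysis adds (PRISM automates this over full Markov models; the scenario and numbers below are hypothetical, not from the paper's case study):

```python
# Suppose a sensor reading is lost with probability p on each cycle, and an
# alarm must fire within n cycles. A deterministic model only asks *whether*
# the alarm can fire; a probabilistic model quantifies how likely it is to
# fire within the deadline.
def prob_alarm_within(n: int, p_loss: float) -> float:
    """Probability that at least one reading gets through in n cycles,
    assuming independent losses per cycle."""
    return 1.0 - p_loss ** n

for n in (1, 3, 10):
    print(f"deadline {n} cycles: P(alarm fires) = {prob_alarm_within(n, 0.05):.6f}")
```

A deterministic verifier would report the requirement "the alarm eventually fires" as satisfiable for every n, hiding the fact that a one-cycle deadline leaves a 5% failure probability; this is the kind of weakness the paper argues probabilistic reasoning exposes.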
|
03:00 - 03:30 |
Architecturally Savvy Personas
Jane Cleland-Huang (Invited talk)
|
03:30 - 04:00 | coffee break |
Session 4 |
04:00 - 04:15 |
One Minute Madness (all participants)
|
04:15 - 04:30 |
Quiet brainstorming / reflection (Xavier Franch)
|
04:30 - 05:15 |
Presentation of reflections, general discussion (all participants)
|
05:15 - 05:30 |
Wrap up
|