Seilevel

Sunday, September 17, 2006

Day 1 RE'06 (update from Joy)

“That’s neat, but…so what?”

I noticed myself frequently thinking these words at RE’06. Like Tony, I found myself disappointed that academia seems removed from the practical world of requirements engineering. I believe it’s supposed to work like this - the academics do research in new areas, come up with brilliant new ideas, and then apply them to problems in industry. While there were a few exceptions to this, most of what I saw failed to be clearly applicable to real problems.

I will provide summaries of talks that Tony has not already commented on.

I saw the third paper presented in the “Languages, Methods and Tools” session, “Making Mobile Requirements Engineering Tools Usable and Useful”. The authors were trying to address what mobile computing devices mean to the requirements engineering field. The authors developed a prototype application for a PDA to be used to elicit requirements. Such a tool would allow requirements to be discovered in the work-based environment (while walking around even). The prototype had obvious limitations, though: a small screen, no keyboard, and general usability problems.

The interesting question for me was - in what situations would this be useful? How often do we really need to gather requirements where we cannot use a laptop (or even a tablet PC)? The presenter indicated there was a specific need for someone to do this, but he did not elaborate on what it was.

The afternoon of Day 1 was spent in the session of research papers on “Non-Functional Requirements”.

The first paper in this group was titled “The Detection and Classification of Non-Functional Requirements with Application to Early Aspects”. The authors’ goal was to develop a method to be able to automatically detect and classify the non-functional requirements in standard documentation materials (notes, emails, documentation). In simple terms, there are key words associated with non-functional requirement types. For each sentence in the document, the key words are used to calculate a likelihood of it being of that particular type, and if it is above a threshold, it is then classified as a requirement of that type.
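The approach as described above amounts to keyword matching with a score cutoff. Here is a minimal sketch of that idea; the keyword lists, the scoring scheme, and the threshold value are my own illustrative assumptions, not the authors' actual method or data:

```python
# Hedged sketch of keyword-threshold classification of non-functional
# requirements. Keyword sets and threshold are illustrative assumptions.
KEYWORDS = {
    "performance": {"fast", "latency", "throughput", "seconds"},
    "security": {"encrypt", "password", "access", "authorize"},
    "usability": {"intuitive", "easy", "training", "learn"},
}

THRESHOLD = 0.5  # assumed cutoff; a real system would tune this empirically

def classify(sentence):
    """Return the NFR types whose keyword score for this sentence meets the threshold."""
    words = set(sentence.lower().split())
    results = []
    for nfr_type, keywords in KEYWORDS.items():
        hits = len(words & keywords)
        score = hits / len(keywords)  # fraction of this type's keywords present
        if score >= THRESHOLD:
            results.append(nfr_type)
    return results

print(classify("The system must encrypt the password before it grants access"))
# -> ['security']
```

Even this toy version makes the tradeoff visible: the keyword lists determine both what gets found and what gets misclassified.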

It’s not clear to me that such methods provide a real benefit. If the method misses requirements (and I will argue it must), you still have to parse through your documentation manually so that you do not miss any. If it also produces false positives (and again I don’t think that can be avoided), you still have to read through all of the extracted requirements to find the ones that are not actually requirements or are misclassified. Given that, I’m skeptical these methods save any work.

I found the second paper in this session, “Emotional Requirements in Video Games”, to be quite interesting. The premise is that video games are designed using emotional requirements, with a target emotional state and means to induce that state in a player. Emotional requirements describe the story of the player’s experience, where and when the emotions should be felt, and how they vary over time. This type of requirement is very prevalent in video game software and can be challenging to capture, so this paper provides an approach to that problem. They propose using emotional terrain maps (where there is emotion), emotional intensity maps (what emotion) and timelines (how it varies) as visual representations of the requirements.

There was a brief conversation at the end by other researchers in the room, proposing additional applications of the techniques. Suggestions included process workflows, such as optimizing workflow on a factory floor or robot planning. There was also an interesting discussion about further work the gaming industry would like to see on how to verify this type of requirement.

The third paper in this session was called “Towards Regulatory Compliance: Extracting Rights and Obligations to Align Requirements with Regulations”. There are regulations that software systems often must enforce. The authors developed a systematic method to parse the legal documentation to extract and prioritize the rights and obligations. They identify the constraints, resolve ambiguities and trace back to the original policy to demonstrate compliance. In their research, they used HIPAA as a case study.

The final event of the day for us was the poster session. Most of the posters were challenging to understand, and the students were not present to explain them when we stopped by. I was somewhat intrigued by one titled “So, You Think You are a Requirements Engineer?” The authors present a suggested map of skill areas against experience, where the skill areas are elicitation, analysis, communication, validation and management. I’m mostly curious to see the findings of an empirical study in the community that they have started.

Thursday, September 14, 2006

Day 1 IEEE 14th Annual Requirements Engineering Conference

This is the first time that I have been to this conference, and honestly I didn't know what to expect. I had several goals: I wanted to meet some of the well-known names in the requirements field, I wanted to see what the cutting-edge research in requirements looked like, and I wanted to get a better understanding of current best practices.

What I have discovered so far is enlightening and also somewhat disappointing. If I had to summarize with one key theme, it is that academia is completely disconnected from the real world of requirements definition.

The first session this morning was the keynote by Mary Beth Rosson from Penn State. Her speech was titled "End Users Who Meet Their Own Requirements". The talk discussed how users are programming directly today and that they do not do analysis and design. For example, Excel spreadsheets and Word macros are common, but very little analysis is done when they are developed. According to Rosson, the key is to provide users with the tools to help them get the solution correct. At first I thought that the topic was not relevant to my work on large scale enterprise systems. However as I thought about it more, I realized that many of the projects that we work on are conversions of systems built by the business side into applications that can be supported by the IT organization. These applications are typically extremely complex Excel spreadsheets or Microsoft Access databases.

The next set of presentations were research papers by doctoral candidates. The first paper was "A Case Study in Systematic Improvement of Language for Requirements." This was probably the clearest of the presentations and possibly the most applicable to the real world. The essence of the presentation was a model for measuring the risk associated with your glossary (they termed the entries "concepts"). By counting the number of concepts and multiplying by the number of shared concepts, the author claimed that you could get a measure of the risk associated with your definitions.
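As I understood it, the measure reduces to simple arithmetic over the glossary. A minimal sketch, with entirely hypothetical glossary data (the paper's actual counting rules may differ):

```python
# Hedged sketch of the glossary risk measure as described above:
# risk = (number of concepts) * (number of shared concepts).
# The glossary contents below are made up for illustration.
glossary = {
    "order": ["sales", "fulfillment"],   # concept -> groups that use the term
    "customer": ["sales"],
    "shipment": ["fulfillment", "billing"],
}

num_concepts = len(glossary)
num_shared = sum(1 for users in glossary.values() if len(users) > 1)
risk = num_concepts * num_shared
print(risk)  # 3 concepts, 2 shared -> 6
```

The appeal is that the inputs are easy to count; whether the product is a meaningful risk score is exactly the question the author was trying to answer.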

The second paper was entitled "Generating Hierarchical State Machines From Use Case Charts". The authors of this paper took use case charts and created additional UML 2.0 language constructs. They also created a methodology to derive state models from the use case models. Frankly, this didn't appear to have much practical value, since building the initial use case charts is simply too much work.

Stay tuned for a summary of the afternoon session!

Wednesday, September 13, 2006

Problems May Be Your Problem

Requirements are often touted as the foundation of a good software project. I agree that without good requirements, you will find it very difficult to build the right solution.

What, then, is the foundation for the requirements?

A clear problem statement is, in many cases, just as important as the requirements themselves. It serves as the base from which the features and requirements are built. With some systems, the problem is obvious and it is clear that the solution will add value. With most enterprise systems, however, the problems and benefits are not so clear.

In these muddy waters, you should stop and analyze your problem statements in detail. Eliciting these issues is sometimes easy (when everyone hates the software, for example), but finding out the pain points of the users and management often requires the same degree of facilitation used in requirements gathering.

For example, a company may know that their overall processing time for orders is too long. They may not, however, understand all of the individual problem statements that add up to that company-wide issue. Breaking this problem down into individual subcomponents for analysis and definition requires the exact same skill set used in solution analysis.

Several useful 'buckets' for problem analysis are:
- Effort (the amount of work required by your employees to accomplish a task)
- Duration (the amount of real elapsed time required to accomplish a task)
- Risk (the propensity for error involved in a task)

The results of such a problem analysis might appear as follows:

Effort
-E01: Order entry takes too much effort
-E02: Order fulfillment takes too much effort

Duration
-D01: Order entry takes too long
-D02: Order fulfillment takes too long

Risk
-R01: Order entry may be inaccurate

You would, of course, define what 'too much effort' and 'too long' mean for your system in measurable terms.

Once you have the problem defined at this level, you can more easily create features that address each of these low-level problem statements.

Feature 01 - Automatically populate shipping information based on customer's ID number.
Feature 02 - Automated order fulfillment system.

You can then map these features back to the problem statements for traceability.

Feature 01 satisfies problem E01, D01, and R01.
Feature 02 satisfies problem E02 and D02.
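This traceability map is simple enough to keep in a script or spreadsheet. A minimal sketch using the example IDs above (the coverage check at the end is my own addition, not part of the post's method):

```python
# Feature-to-problem traceability for the order-processing example above.
# Problem and feature IDs follow the post; the coverage check is an extra.
problems = {"E01", "E02", "D01", "D02", "R01"}

traceability = {
    "Feature 01": {"E01", "D01", "R01"},  # auto-populate shipping information
    "Feature 02": {"E02", "D02"},         # automated order fulfillment
}

covered = set().union(*traceability.values())
uncovered = problems - covered
print(sorted(uncovered))  # [] -> every problem statement is addressed
```

The payoff of keeping the map explicit is that an uncovered problem statement, or a feature that maps to nothing, jumps out immediately.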

With a clear definition of the problem in hand, writing the requirements for each of these features becomes much easier. You can understand the original intent of the feature and the issue it was addressing. This also allows you to trace back ROI when the software is complete to see if the solution really addressed the problem.

RE '06

Joy and I are at the 14th annual Requirements Engineering conference in Minneapolis. Over the next three days we will report on the sessions, which represent the state of the art in the requirements engineering field.

http://www.ifi.unizh.ch/req/events/RE06/