In Quest of Requirements Engineering Research that Industry needs
Daniel Méndez, Technical University of Munich, Germany
Daniel very kindly made his slides available at
https://www.slideshare.net/mendezfe/in-quest-of-requirements-engineering-research-that-industry-needs
Guest Editorial Reflections on Requirements
The first paper on RE was published in 1977, and the social challenges it raised are ones we still face today. He talks about a Human Social Environment in which human needs and an intangible dimension lie.
Today, it is easy to find examples of industrial projects that fail due to a poor understanding of the system requirements.
As an example, he talked about this case:
http://www.dw.com/en/germany-seeks-toll-collect-compensation/a-1324248
33% of problems in software are related to requirements.
36% of the RE problems lead to software project problems.
Now, our challenges are bigger because we have distributed systems (as discussed yesterday in Sharan's keynote).
Some of the research targets problems that are ill-understood, and the proposed solutions "pretend" to work universally when in fact they do not.
No matter how you look at it, there is a gap between RE research and practice
So how can we develop research that is in fact needed by industry?
1) Turn RE research into a theory-centric discipline
He gives an example from GORE (goal-oriented requirements engineering), citing a literature review paper by Jennifer Horkoff, which shows that of more than a hundred case studies reported in the literature, only 20 were "real" case studies, like the ones Fabio discussed on the first day.
He makes a joke: there are more people who believe in elves (54%) than practitioners who use GORE in industry (5%).
RE is largely dominated by conventional wisdom:
- Lack of empirical awareness
- Neglecting the particularities of practical contexts
- Neglecting the relation to existing evidence
The first takeaway of this talk:
We should move from conventional wisdom to evidence-based, theory-centric, context-sensitive RE research.
One of the biggest problems at this moment is the lack of data, especially because this is the kind of sensitive data that industrial partners are reluctant to give access to.
2) Understanding the practitioner's problems
NaPIRE - http://www.re-survey.org
They run a bi-yearly, replicated, globally distributed family of surveys, based on collaborative design, analysis, and synthesis.
The goal of this survey is to understand the status quo in practice with respect to requirements elicitation, requirements documentation, requirements change and alignment, RE standards, and RE improvement.
He shows a chart created from the survey results, analyzing many problems such as insufficient support from the project leader, inconsistent requirements, and unmeasurable non-functional requirements (NFRs).
He also shows a model in which he analyses the problems that arise as a result of flaws in customer communication.
One thing he has started to investigate is whether agile methodologies, which supposedly take communication with the customer into account, are actually doing so.
Empirical studies should try to understand the following dimensions: context - causes - problems - effects.
This is the approach they take in NaPIRE.
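As a purely illustrative sketch (my own, not NaPIRE's actual instrument or data model), one could imagine coding a single practitioner response along these four dimensions roughly like this:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SurveyObservation:
    """Hypothetical coding of one practitioner response along the four
    dimensions mentioned in the talk: context - causes - problems - effects."""
    context: Dict[str, str]                        # e.g. domain, process model, team size
    causes: List[str] = field(default_factory=list)
    problems: List[str] = field(default_factory=list)
    effects: List[str] = field(default_factory=list)

# Invented example values; not real NaPIRE data.
obs = SurveyObservation(
    context={"domain": "automotive", "process": "agile", "team_size": "12"},
    causes=["communication flaws with the customer"],
    problems=["incomplete / hidden requirements"],
    effects=["change requests late in the project"],
)
print(obs.problems)
```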
How can we profit from NaPIRE:
- The results of the survey are available for public use.
- He also invites us to join the initiative.
There are other initiatives worth joining, e.g. RE Pract, Elena
3) Academia and Industrial Collaboration
He gives some personal examples of participation in industrial projects, some of which were also successful for research purposes, while others were not (mainly for social reasons, such as data access).
Success factors for collaboration with industry (according to practitioners):
- Shared long-term vision and project goals
- Common problem- and domain-understanding
- Proper project organization: manageable-sized sub-problems, minimal-invasive solutions, regular on-site work and meetings, fast feedback loops
- Proper mindset and principles: pragmatism and lack of idealism when solving industry problems.
Challenge: engaging in academia-industry collaboration while preserving our integrity.
---
Q&A:
One thing we should understand is that RE is very, very young compared with established sciences such as physics.
Instead of saying we need agile RE, we should be investigating under which conditions agile RE is appropriate for a particular industrial sector.
For a long time, the model-based systems development community has been talking about metamodels, and we should take into account that a metamodel's developer and its readers each have their own mental models when writing and interpreting such models. In RE, we are just at the beginning of understanding how to address this problem; enabling shared semantics is very important.
Roel says that the difference between research and consultancy is that in the latter we aim to solve a problem that has been solved many times before, while this is not the case for the former.
Case studies should be replicated so that we can compare results. However, replications are rarely accepted as publications because reviewers claim that the case study has already been done.
I raised a question about what kind of evidence is expected from a research paper for the work to be considered consistently evaluated. This is a problem for those of us outside the empirical studies community, because it is impossible to achieve the kind of rigor that is sometimes expected of us. Daniel said he thinks that publications containing only claims have a place, as they raise discussion. The biggest problem he sees is strong claims made without evidence to support them: if researchers claim that something works under a particular condition, they should provide some evidence to support it.