How machine learning can deliver intelligence to requirements engineering

If you have ever walked into a restaurant and tried to order something that’s not on the menu, you understand that this is a risky proposition. If you fail to clearly define exactly what you want, you may end up with something vastly different from what you envisioned. Just saying “I want a dish with spinach, garbanzo beans and onion” could yield any number of possibilities. Are they steamed, baked or sautéed? Are they cooked separately or pureed into a lumpy mass? Chances are, you’re not a chef, so you may not know all the lingo or technique it takes to get the tasty dish you’re looking for.

At its core, this is a question of poorly-defined requirements — and it’s just as much an issue in software development as it is in the kitchen.

In software, requirements arise throughout a product’s lifecycle — from planning, through development or construction, to operations and decommissioning. Keeping track of all these stakeholder requirements is the job of requirements engineering, a critical part of the application lifecycle.

Good requirements engineering can prevent expensive changes, flag incorrect or dangerous requirements before they make it to production, and ensure that software is delivered with exactly the functionality intended.

The challenge of requirements quality

Despite the benefits of requirements engineering, it is difficult to track the quality of requirements without checking each one individually. A good requirement has various characteristics. It should be unambiguous, testable, precise, understandable, feasible and independent — all of which need to be reviewed by an experienced (and expensive) specialist. If that engineer finds mistakes, the person who wrote the requirement must be informed, which can lead to complex, time-consuming feedback loops.
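To make that checklist concrete, here is a minimal, purely illustrative rule-based pre-check of the kind a reviewer or a simple tool might run over a single requirement. The weak-word list, the thresholds and the Finding structure are assumptions made for this sketch, not part of any tool discussed in this article.

```python
# Illustrative only: a simple rule-based pre-check for common requirement weaknesses.
# The weak-word list, thresholds and data structure are assumptions for this sketch.
import re
from dataclasses import dataclass

WEAK_WORDS = {"should", "could", "fast", "easy", "user-friendly",
              "approximately", "as appropriate", "etc."}

@dataclass
class Finding:
    rule: str
    detail: str

def pre_check(requirement: str) -> list[Finding]:
    """Flag phrasing that tends to make a requirement ambiguous or untestable."""
    findings = []
    lowered = requirement.lower()

    # Unambiguous: vague or non-binding wording is a common source of misreadings.
    for word in sorted(WEAK_WORDS):
        if word in lowered:
            findings.append(Finding("ambiguity", f"vague term '{word}'"))

    # Testable and precise: a verifiable requirement usually names a measurable condition.
    if "shall" in lowered and not re.search(r"\d", requirement):
        findings.append(Finding("testability", "no measurable quantity or threshold"))

    # Independent: very long sentences often bundle several requirements into one.
    if len(requirement.split()) > 40:
        findings.append(Finding("independence", "consider splitting the requirement"))

    return findings

print(pre_check("The system should respond fast and be user-friendly."))
```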

Also, there is seldom a training process for requirement writers, which leads to mistakes being made again and again. Experience has shown that approximately 60% of all mistakes arise in the analysis phase of the development process, and about 80% of requirements must be processed more than once.

Checking every written requirement individually is very time-consuming and a bottleneck to development. Consequently, developers must work with low-quality requirements that may be misunderstood. This leads to functionality that differs from what was intended — which triggers costly rework.

Application lifecycle management

Application lifecycle management (ALM) can track requirements and enable faster, more frequent releases while maintaining end-to-end traceability and visibility. With these types of tools, it’s possible for teams of any size to define, develop, test and manage complex software systems.

ALM software like Polarion also accelerates development, improves final product quality and helps development teams coordinate and cooperate. However, it does not inherently solve the problem of poorly written requirements — in other words, the old “garbage in, garbage out” conundrum still applies.

Machine learning to the rescue

One effective way to address the problem of requirements quality is to integrate machine learning into the process. A well-trained machine learning algorithm can prevent errors by checking every requirement as it is written.
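As an illustration of the general idea (not a description of any specific product’s implementation), a small supervised text classifier could learn from requirements that reviewers have already rated and then score new ones as they are written. The tiny training set below is invented for the sketch.

```python
# Illustrative sketch only: a small supervised classifier that scores requirement text.
# The training data is invented; a real system would learn from thousands of
# requirements that reviewers have already rated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "The system shall respond to a login request within 2 seconds.",
    "The export function shall write CSV files with UTF-8 encoding.",
    "The software should be fast and easy to use.",
    "Users want the app to work well on most devices.",
]
labels = ["good", "good", "bad", "bad"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

new_requirement = "The interface should be intuitive."
print(model.predict([new_requirement])[0])                      # predicted quality class
print(round(model.predict_proba([new_requirement]).max(), 2))   # model confidence
```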

In one development project, Atos integrated a machine learning algorithm named reQlab, developed by IT-Designers GmbH as an additional network service for application lifecycle management systems like Polarion. Atos integrated the algorithm into the client’s Polarion landscape and customized it to match their approach to writing requirements.
Once a requirement is written, it is automatically checked by the algorithm, which rates it on a scale of one to five — from bad (1) to ambiguous (2-4) to good (5). Good requirements are automatically passed on to the development team without further intervention. The others are either kicked back to the writer or flagged for the requirements engineering team.
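Sketched in code, and assuming nothing more than an integer score returned by the checker, that routing could look like the following; the identifiers and messages are hypothetical.

```python
# Hypothetical routing logic, assuming the checker returns an integer score
# from 1 (bad) to 5 (good). Names and messages are invented for this sketch.
def route_requirement(req_id: str, score: int) -> str:
    if score == 5:
        return f"{req_id}: passed on to the development team"
    if score == 1:
        return f"{req_id}: returned to the author with the reported findings"
    # Scores 2-4 are ambiguous cases that the requirements engineering team reviews.
    return f"{req_id}: flagged for the requirements engineering team"

for req_id, score in [("REQ-101", 5), ("REQ-102", 3), ("REQ-103", 1)]:
    print(route_requirement(req_id, score))
```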

Any employee who writes a requirement gets immediate feedback on their submission and learns whether or not a developer could properly interpret and implement the requirement. Errors and weaknesses are provided directly to the user, along with suggested improvements. Below are some examples of poorly-written requirements and the feedback that the algorithm provided.


The benefits of intelligent requirements engineering

With a machine learning solution like this, the number of errors that occur in the early phases of a development project can be drastically reduced. This is critical, because anyone familiar with the software development lifecycle knows that the later you make changes, the higher the costs.

Business users do not need extensive knowledge or training on the standards, rules and technical jargon specific to software requirements, because the algorithm checks them automatically. Users are provided with immediate feedback — enabling a continuous improvement process that not only fixes bad requirements quickly, but improves future quality as well.

By helping business users fix borderline requirements and reserving the expensive specialist engineers for only the worst of the worst, this approach reduces the overall cost of development. It also improves communication between developers and business users, and prevents misunderstandings in the development process.

In the end, this is a great example of how intelligent solutions powered by AI and machine learning can add value to every aspect of your digital value chain — saving time, money and energy, and ensuring that you deliver the right functionality at the right time.

At Atos, we are committed to helping our clients transform for the future, and we dedicate ourselves to finding innovative solutions to even the most routine day-to-day challenges they face.

By Felix Otto, Junior Consultant

Posted on: October 11, 2021

About Felix Otto
Junior Consultant
Felix started working as a consultant and developer in the IT industry a year ago, after finishing his three-year dual-study program in cooperation with Atos. He is currently working on a Polarion ALM project, helping to customize and improve Polarion landscapes. Before his work in the Polarion environment, he was part of a machine learning project building a digital twin for the manufacturing industry. Besides his work, he is studying computer science in economics for a Master of Science degree.
