-- Steve McConnell, author of "Code Complete" and "Rapid Development"
Despite my years of experience implementing, upgrading, and migrating content management systems in validated environments, every project I’m on seems to bring its own set of challenges when it comes to getting good requirements. Different (and sometimes conflicting) needs often bubble up from various groups of users, which can make requirements prioritization a challenge. And while regulated organizations typically provide a well-defined system development lifecycle, they may not be as attentive to providing a robust set of tools for the BA to elicit requirements early in the project, when they’re really needed. Additionally, delivery estimates for implementation sometimes end up being based on arbitrary mandates rather than a realistic timeline that reflects the true level of effort to deliver. These dynamics lead to the inevitable “scope creep” – meaning that at some point the scope of the work increases because we didn’t have the right requirements defined up front, aligned with a sensible timeline for implementation.
The Waterfall Approach
From an implementation standpoint, regulated organizations have traditionally used a waterfall approach. This method consists of well-defined phases, each with specific deliverables. Formal approvals on these deliverables must be received in order to move on to the next phase of the project. Additionally, scope cannot be changed once the system is in a validated state without completing a formal change request. Typically, the change request process looks something like this:
- Initiate a change request form
- Determine and then document the specific change
- Submit the change request for approval
- Get all of the necessary approvals
- Update the schedule and scope
- Once all of the above is completed, proceed with making the required updates
So when using a waterfall approach, it’s very possible that the implementation effort will look something like this:
- The BA interviews users and/or conducts workshops with key stakeholders, and writes everything down in notes/minutes
- Once the BA believes all the requirements have been captured, a (typically text-based) functional spec is developed, reviewed, and formally approved
- Development team constructs the solution based on the functional spec
- The system test team develops system test scripts, and the BA develops UAT scripts
- The system test team performs dry runs of the system test scripts, followed by formal execution
- The BA performs dry runs of the UAT scripts, followed by formal UAT execution by the users
The problem with this approach is that the users may not get a chance to really use the system until formal UAT. Also, they may not know what requirements they actually need until they see the system and use it to perform their “day in the life” tasks. Delaying this level of hands-on interaction increases the risk of changes late in the implementation cycle. If the system must be modified at a late stage to accommodate new requirements, the result is lost time and increased cost as previously developed work is reworked or discarded.
The Agile Approach
Agile has been used effectively in many different industries and environments to counter the challenges of following the traditional approach. Typically, agile projects look something like this:
- A business sponsor is identified
- The agile team is formed (typically consisting of a team lead, developer, product owner and a business stakeholder)
- The team works with the business sponsor to understand the basic solution they are looking for
- The work cycle timeframe is defined (typically a 2-4 week iteration known as a “sprint”)
- The team dives right in and starts development
- Working software is implemented at the end of the sprint, and then demonstrated to the users
- If additional features are needed, the team will continue onto the next sprint
Agile is designed to embrace change. Once the priority of a requested change is determined, the request is added to a backlog (a prioritized features list containing a short description of what’s desired). This allows the team to continuously respond to changing business priorities vs. trying to define all of the requirements up front before beginning development. Working this way provides the following benefits:
- The delivery cycle with agile is shorter, so time is saved going from concept to delivery
- It enables ongoing communications with the business
- Frequent demonstrations generate new ideas for the next sprint
- The team adapts & updates as they go along to ensure that what’s delivered optimizes value
- It provides the ability to respond to changing and evolving business needs
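At its core, a backlog is nothing more than a prioritized list of short feature descriptions that the team pulls from at each sprint. A minimal sketch in Python (the item names and priorities are purely illustrative, not from any real tool):

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class BacklogItem:
    priority: int  # lower number = higher business priority
    description: str = field(compare=False)  # excluded from ordering

# Hypothetical backlog for a document management system
backlog = [
    BacklogItem(2, "Bulk export of approved documents"),
    BacklogItem(1, "Electronic signature on document approval"),
    BacklogItem(3, "Configurable review reminder emails"),
]

# The team always pulls the highest-priority item into the next sprint;
# re-sorting after priorities change is how the backlog "embraces change".
backlog.sort()
next_item = backlog[0]
print("Next sprint candidate:", next_item.description)
```

When business priorities shift, only the priority values change and the list is re-sorted; nothing about earlier sprints has to be unwound.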
The requirement for Computer System Validation
In regulated Pharmaceutical environments, the FDA mandates the need to perform Computer System Validation. According to GAMP (Good Automated Manufacturing Practice), Computer System Validation is: “Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes.” Additionally, “validation of new systems serves two purposes: 1) To avoid any intolerable risk to patient safety or to the business, and 2) to maximize the business benefits to be derived from the new system.”
Validated environments typically embrace a System
Development Lifecycle, which comes with a defined set of documentation at each
phase of the project:
- Planning (Strategic Objectives, Feasibility Study, Scope, To Be Process flow diagrams, etc.)
- Specification (URS, FS, Validation Plan)
- Design (HDS, SDS)
- Construction (Code/Code Reviews)
- Testing (IQ/OQ/PQ Plan and Protocol, IQ/OQ/PQ scripts, IQ/OQ/PQ execution, IQ/OQ/PQ Summary, Validation Summary Report)
- Operation and Maintenance (Maintaining the Validated State)
The issue here, as I see it, isn’t with having a formal SDLC. In regulated environments, it exists for good reason. The challenge for the BA comes down to having a good method for eliciting and verifying the right requirements, because reliance on a large text-based functional spec defined up front to communicate system functions is too complicated and nebulous for users to comprehend. While I know very well that a functional spec is a required validation deliverable – and does have value in terms of organizing and capturing all requirements in one place (as well as a means of tracing to test scripts) – on its own it’s a dreadful tool for BAs to use as a method of requirements elicitation. What users state they want for inclusion in a functional spec may not ultimately be what they need, because defining requirements tends to be an evolutionary process. A well-formed functional spec is junk if it doesn’t properly capture requirements for a solution that adds business value.
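The tracing value mentioned above boils down to a simple mapping from requirement IDs to test script IDs, which makes coverage gaps easy to spot. A rough sketch in Python (all identifiers are made up for illustration):

```python
# Hypothetical requirements-to-test-script traceability mapping.
# Keys are functional spec requirement IDs; values are the OQ/PQ
# scripts that verify each requirement.
trace = {
    "FS-001": ["OQ-010", "PQ-003"],  # e-signature requirement
    "FS-002": ["OQ-011"],            # audit trail requirement
    "FS-003": [],                    # not yet covered by any script
}

# Any requirement with no linked script is a validation gap to close
# before formal execution begins.
untraced = [req for req, scripts in trace.items() if not scripts]
print("Requirements without test coverage:", untraced)
```

Validation teams typically maintain this as a traceability matrix document; the point is only that the structure is trivial once the functional spec actually reflects the right requirements.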
What’s the compromise?
So based on what we know about agile and computer system validation, using agile in the strict sense – deploying working software in small increments to a validated production environment – doesn’t seem feasible or practical, since deployments aren’t permitted without all of the formal testing and approvals on the required documentation. Therefore, I believe the best option is to follow a hybrid approach. Simply put, the idea is to leverage elements of agile to capture good requirements, and then use the work products generated from this effort as an input into the formal validation phase.
This approach would provide the following benefits:
- It reduces project risk, since it makes the system validation effort more predictable
- The functional spec practically “writes itself”
- Users will have an easier time understanding the functional spec, since it reflects what they’ve already seen visually rather than just a text-based abstraction
- This leads to quicker/easier approvals, and fewer (if any) re-approvals due to changes
- It should cause fewer issues (and change controls) during the testing phase
- UAT should run more smoothly
Leveraging models to capture good requirements
If for whatever reason incremental development isn’t feasible, it’s still possible to leverage agile principles through the use of models. In this case, requirements tend to be best developed through simple but effective visual models that properly represent the user’s “day in the life”. The value of requirements models is that they are understandable to both users and IT, and they help bridge the communication gap that often exists between the two groups. As the models are refined, the requirements continue to develop and become clearer.
What about COTS?
Commercial off-the-shelf software (COTS) is typically thought of as a way to save time and money on expensive custom programming. But if not properly managed, it can have potentially disastrous results. What looks nice during a demo may crash in production (functional issues), cause the users to struggle with the system (usability issues), or have capabilities that don’t fit with your business (process issues).
Again, we can leverage elements of agile by starting with a good set of user requirements defined as user stories. We also need to understand enough about the different products available on the market so that the gaps between our user requirements and what’s currently available through COTS solutions are known. Also, take the time to participate in vendor demos, and use quantifiable selection criteria to choose the right vendor. Next, we need to determine all of the estimated implementation costs and the level of effort (LOE) to close the gaps between our user requirements and what the vendor provides.
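One common way to make the selection criteria quantifiable is a weighted scoring model: weight each criterion by importance, score each vendor against it, and compare the totals. A minimal sketch in Python (the criteria, weights, vendors, and scores below are hypothetical, not a recommendation):

```python
# Hypothetical weighted-scoring sketch for COTS vendor selection.
# Weights should sum to 1.0; scores here are on a 1-10 scale.
weights = {
    "fit_to_requirements": 0.4,
    "usability": 0.3,
    "cost": 0.2,
    "vendor_support": 0.1,
}

vendors = {
    "Vendor A": {"fit_to_requirements": 8, "usability": 7, "cost": 7, "vendor_support": 9},
    "Vendor B": {"fit_to_requirements": 7, "usability": 9, "cost": 5, "vendor_support": 8},
}

def weighted_score(scores):
    # Sum of (weight x score) across all criteria
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in vendors.items():
    print(f"{name}: {weighted_score(scores):.2f}")

best = max(vendors, key=lambda v: weighted_score(vendors[v]))
print("Selected:", best)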
Once the vendor is chosen, then before diving right into the validation phase, take the time to conduct a series of demos in a sandbox environment with the users (especially on newly added functionality that was requested) to ensure that it meets the business needs. Be prepared to do multiple demos if issues are found, if updates are made based on user feedback, or if different groups need to focus on specific features. Next, conduct hands-on informal UAT in the development system with the users prior to approving the functional spec and commencing the validated phase (preferably using training materials that represent real “day in the life” scenarios). This upfront investment should pay dividends downstream when it comes time to formally validate the system.
Conclusion
Agile and waterfall methodologies each have their own respective strengths and benefits. When used together in regulated environments, the value of both can be optimized. I’ve found that this approach is well received by the QA and validation folks, since it helps improve quality and makes the validation process more predictable. The key for the BA is to sell this concept early on to project/program managers and other decision makers by focusing on the benefits of using agile concepts prior to formal system validation. Having at least one or two success stories to draw upon can make this sell much easier.