Sunday, August 25, 2019


Building Customer Centric Teamwork


This article on building customer centric teamwork in a regulated environment shows that even when working in a highly regulated environment that follows a strict system lifecycle methodology, our first priority as BAs should always be to work with people with empathy and understanding.  We will learn how to apply specific techniques that give stakeholders visibility up front in the process.  This allows us to achieve the benefits of ongoing and efficient communication while meeting the strict quality guidelines that are mandated in a regulated environment.



Wednesday, July 5, 2017

Requirements Techniques for Implementing your Content Management System


"Business analysis is important.
Business analysts matter."
     -- International Institute of Business Analysis

“…content is and continues to be a business asset worthy of investment”
     -- Content Strategy Alliance

Over the past 20 years, I've had a role in implementing different types of applications that are used to manage content:
  • Document Management
  • Web Content Management
  • Component Content Management
  • Imaging
  • Scanning & OCR
Here is a deck I put together which shows examples of how to leverage certain techniques contained in the International Institute of Business Analysis “A Guide to the Business Analysis Body of Knowledge® (BABOK® Guide)” when implementing content management systems: Requirements Techniques for Implementing your CMS

Sunday, April 10, 2016

Being Agile in a Validated Environment

"The most difficult part of requirements gathering is not the act of recording what the user wants, it is the exploratory development activity of helping users figure out what they want"

   -- Steve McConnell, author of "Code Complete" and "Rapid Development"

Despite my years of experience with implementing, upgrading and migrating content management systems in validated environments, it seems that every project I’m on has its own unique challenges when it comes to getting good requirements.  Different (and sometimes conflicting) needs often bubble up from various groups of users, which can make requirements prioritization a challenge.  Also, while regulated organizations typically provide a well defined system development lifecycle, they may not be as attentive about providing a robust set of tools for the BA to elicit requirements early in the project, when they’re really needed.  Additionally, delivery estimates for implementation at times end up being based on arbitrary mandates vs. a realistic timeline that reflects the true level of effort to deliver.  These dynamics lead to the inevitable “scope creep” – meaning, at some point the scope of the work increases because we didn’t have the right requirements defined up front, aligned with a sensible timeline for implementation.
The Waterfall Approach

From an implementation standpoint, regulated organizations have traditionally leveraged a waterfall approach.  This method consists of well defined phases, each of which has specific deliverables.  Formal approvals on these deliverables must be received in order to move on to the next phase in the project.  Additionally, scope cannot be changed once the system is in a validated state without completing a formal change request.  Typically, the change request process looks something like this:
  • Initiate a change request form
  • Determine and then document the specific change
  • Submit the change request for approval
  • Get all of the necessary approvals
  • Update the schedule and scope
  • Once all of the above is completed, proceed with making the required updates
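The gated nature of this process can be pictured as an ordered checklist, where updates cannot begin until every prior step is complete. Here is a minimal sketch in Python; the step names are my own shorthand, not from any specific quality system:

```python
# Sketch of the change request gates as an ordered checklist.
# Step names are illustrative shorthand, not from any specific
# quality system.
CHANGE_REQUEST_STEPS = [
    "initiate_change_request_form",
    "document_specific_change",
    "submit_for_approval",
    "receive_all_approvals",
    "update_schedule_and_scope",
]

def next_step(completed):
    """Return the next required gate, or None once all are complete."""
    if len(completed) >= len(CHANGE_REQUEST_STEPS):
        return None
    return CHANGE_REQUEST_STEPS[len(completed)]

def can_proceed_with_updates(completed):
    """The required updates may begin only after every gate has been passed."""
    return list(completed) == CHANGE_REQUEST_STEPS
```

The point of the sketch is simply that the sequence is strict: skipping or reordering a gate is not an option in a validated state.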

So when using a waterfall approach, it’s very possible that the implementation effort will look something like this: 
  • BA interviews users and/or conducts workshops with key stakeholders, and writes everything down in notes/minutes
  • Once the BA believes all the requirements have been captured, a (typically text based) functional spec is developed, reviewed and formally approved
  • Development team constructs the solution based on the functional spec
  • System test team develops system test scripts, and BA develops UAT scripts
  • System test team performs dry runs of system test scripts, followed by formal execution
  • BA performs dry runs of UAT scripts, followed by formal UAT execution by the users

The problem with this approach is that the users may not get a chance to really use the system until formal UAT.  Also, they may not know what requirements they actually need until they see the system and use it to perform their “day in the life” tasks.  Delaying this level of hands-on interaction increases the risk of changes late in the implementation cycle. If the system must be modified at a late stage to accommodate new requirements, then lost time and increased cost result as previously developed work is changed or removed.

The Agile approach
Agile has been effectively used in many different industries and environments to counter the challenges of following the traditional approach. Typically, agile projects look something like this:
  • A business sponsor is identified
  • The agile team is formed (typically consisting of a team lead, developer, product owner and a business stakeholder)
  • The team works with the business sponsor to understand the basic solution they are looking for
  • The work cycle timeframe is defined (i.e. typically as a 2-4 week phase known as a “sprint”)
  • The team dives right in and starts development
  • Working software is implemented at the end of the sprint, and then demonstrated to the users
  • If additional features are needed, the team will continue onto the next sprint

Agile is designed to embrace change. So once the priority of a requested change is determined, the request is added to a backlog (a prioritized features list that contains a short description of what’s desired).  This allows the team to continuously adapt to changing business priorities vs. trying to define all of the requirements up front before beginning development.   Working this way provides the following benefits:
  • The delivery cycle with agile is shorter, so time is saved going from concept to delivery
  • It enables ongoing communications with the business
  • Frequent demonstrations generate new ideas for the next sprint
  • The team adapts & updates as they go along to ensure that what’s delivered optimizes value
  • It provides the ability to respond to changing and evolving business needs
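To make the backlog idea concrete, here is a minimal sketch of a prioritized backlog in Python; the structure and field choices are my own illustration, not any particular agile tool's model:

```python
# Sketch: a backlog as a priority-ordered list of short feature
# descriptions. New requests are slotted in by priority rather than
# forcing all requirements to be defined up front.
import heapq

class Backlog:
    def __init__(self):
        self._items = []
        self._counter = 0  # tie-breaker keeps insertion order stable

    def add(self, description, priority):
        """Add a feature request. Lower number = higher priority."""
        heapq.heappush(self._items, (priority, self._counter, description))
        self._counter += 1

    def next_for_sprint(self, capacity):
        """Pull the top-priority items for the upcoming sprint."""
        pulled = []
        while self._items and len(pulled) < capacity:
            _, _, description = heapq.heappop(self._items)
            pulled.append(description)
        return pulled
```

A mid-project request added with a high priority simply jumps ahead of older, lower-priority items when the next sprint is planned, which is how the team absorbs change without re-baselining a spec.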

The requirement for Computer System Validation

In regulated pharmaceutical environments, the FDA mandates the need to perform Computer System Validation. According to GAMP (Good Automated Manufacturing Practice), Computer System Validation is: “Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes.”  Additionally, “validation of new systems serves two purposes: 1) To avoid any intolerable risk to patient safety or to the business, and 2) to maximize the business benefits to be derived from the new system.”
Validated environments typically embrace a System Development Lifecycle, which comes with a defined set of documentation at each phase of the project:
  • Planning (Strategic Objectives, Feasibility Study, Scope, To Be Process flow diagrams, etc.)
  • Specification (URS, FS, Validation Plan)
  • Design (HDS, SDS)
  • Construction (Code/Code Reviews)
  • Testing (IQ\OQ\PQ Plan and Protocol, IQ\OQ\PQ scripts, IQ\OQ\PQ execution, IQ\OQ\PQ Summary, Validation Summary Report)
  • Operation and Maintenance (Maintaining the Validated State)

The issue here as I see it isn’t with having a formal SDLC.  In regulated environments, it exists for good reason.  The challenge for the BA comes down to having a good method for eliciting and verifying the right requirements, because reliance on a large text based functional spec defined up front for communicating system functions is too complicated and nebulous for users to comprehend. While I know very well that a functional spec is a required validation deliverable - and does have value in terms of organizing and capturing all requirements in one place (as well as a means of tracing to test scripts) – on its own it’s a dreadful tool for BAs to use as a method of requirements elicitation.  What users state they want for inclusion in a functional spec may not ultimately be what they need, because defining requirements tends to be an evolutionary process. A well formed functional spec is junk if it doesn’t properly capture requirements for a solution that adds business value.
What’s the compromise?
So based on what we know about agile and computer system validation – using agile in the strict sense (deploying working software in small increments within a validated production environment) doesn’t seem feasible or practical, since deployments aren’t permitted without all of the formal testing and approvals on the required documentation.  Therefore, I believe that the best option is to follow a hybrid approach.  Simply put – the idea is to leverage elements of agile to capture good requirements, and then use the work products generated from this effort as an input into the formal validation phase.  This approach provides the following benefits:
  • Reduces project risk, since this increases the chances that the system validation effort will be more predictable
  • The functional spec practically “writes itself”
  • Users will have an easier time understanding the functional spec, since it represents what they’ve already seen through a visual representation vs. just a text based abstraction
  • This leads to quicker/easier approvals, and fewer (if any) re-approvals due to changes
  • It should cause fewer issues (and change controls) during the testing phase
  • UAT should run smoother

Leveraging models to capture good requirements

If for whatever reason incremental development isn’t feasible, it’s still possible to leverage agile principles through the use of models. In this case, requirements tend to be best developed through the use of simple but effective visual models that properly represent the user’s “day in the life”. The value of requirements models is that they are understandable to both users and IT, and they help to bridge the communication gap which often exists between the two groups. As the models are further refined, the requirements continue to develop as they become clearer.

What about COTS?
Commercial off-the-shelf software (COTS) is typically thought of as a way to save time and money on expensive custom programming.  But, if not properly managed, it can have potentially disastrous results. What may look nice during a demo may crash in production (i.e. functional issues), cause the users to struggle with the system (i.e. usability issues), or have capabilities that don’t fit with your business (i.e. process issues).

Again – we can leverage elements of agile by starting with a good set of user requirements defined as user stories.  We also need to understand enough about the different available products on the market, so that the gaps between our user requirements and what’s currently available through COTS solutions are known. Also, take the time to participate in vendor demos, and use quantifiable selection criteria to choose the right vendor.  Next, we need to determine all of the estimated implementation costs and the LOE to close the gaps between our user requirements and what the vendor provides.
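As an illustration of what "quantifiable selection criteria" can look like, here is a simple weighted scoring sketch in Python; the criteria, weights, and scores are all made up for the example:

```python
# Sketch: weighted scoring for COTS vendor selection.
# Each vendor is scored 1-5 per criterion; weights reflect relative
# importance to the business. All names and values are illustrative.
criteria_weights = {
    "fit_to_user_stories": 0.4,
    "usability": 0.3,
    "cost_to_close_gaps": 0.2,   # higher score = cheaper to close gaps
    "vendor_support": 0.1,
}

vendor_scores = {
    "Vendor A": {"fit_to_user_stories": 4, "usability": 3,
                 "cost_to_close_gaps": 5, "vendor_support": 4},
    "Vendor B": {"fit_to_user_stories": 5, "usability": 4,
                 "cost_to_close_gaps": 3, "vendor_support": 3},
}

def weighted_score(scores):
    """Combine per-criterion scores into one weighted total."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank vendors by their weighted totals, highest first.
ranked = sorted(vendor_scores,
                key=lambda v: weighted_score(vendor_scores[v]),
                reverse=True)
```

The value of agreeing on the weights before the demos is that it forces an explicit conversation about what matters most, so the scoring isn't retrofitted afterward to justify a favorite vendor.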
Once the vendor is chosen, then before diving right into the validation phase – take the time to conduct a series of demos in a sandbox environment with the users (especially on newly added functionality that was requested) to ensure that it meets the business needs.  Be prepared to do multiple demos if issues are found, if updates are made based on user feedback, or if different groups need to focus on specific features. Next, conduct hands on informal UAT in the development system with the users prior to approving the functional spec and commencing the validated phase (preferably using training materials that represent the real “day in the life” scenarios).  This upfront investment should pay dividends downstream when it comes time to formally validate the system.

Conclusion
Agile and Waterfall methodologies each have their own respective strengths and benefits. When used together in regulated environments, the value of both can be optimized.  I’ve found that this approach seems to be well received by the QA and validation folks, since it helps to improve quality as well as make the validation process more predictable. The key for the BA is to sell the benefits of using agile concepts prior to formal system validation to the project/program managers and other decision makers early on. Having at least one or two success stories to draw upon can make this much easier.

Sunday, July 5, 2015

The Importance of the Current State Assessment

“There is never enough time to do it right the first time, but there is always enough time to do it over.”

Classic Murphy’s Law … a good one for those of us who’ve worked long enough in IT (and sooner or later have had to come to terms with it…)

When it comes to performing a current state assessment - my own experience has been that for projects of a certain size and complexity, it’s definitely the right thing to do.  Yet sometimes we’re still told that there’s “not enough time” to do it.  Having been the BA on various projects where the choice has been to go one way or the other - I can say for sure that the benefits of performing a current state assessment can work wonders for project delivery.  Unfortunately, the costs of not performing one when it’s needed can have quite the opposite effect.

What is a Current State Assessment?

A current state assessment should minimally capture the following 3 components:
  • As-Is business process
  • Key business stakeholders
  • Current issues (i.e. "pain-points")

It should always be one of the first steps undertaken for significant automation projects.   Yet project leads may show resistance towards investing in a current state assessment – either because they don’t understand its value, or because we’re told there isn’t time to do it.  But not investing in a current state assessment can have a negative impact on capturing good requirements.  This in turn increases the risk of implementing a limited solution that doesn’t fully meet the needs of the system’s users.  Therefore, it’s my belief that the value of a current state assessment is that (at least in the long run) it can actually save the project time & money – as well as reduce risk.  The good news is that the time and effort to perform the current state assessment should be more easily quantifiable when compared to estimating the time and effort to implement the actual solution.

If a current state assessment is not planned (or has not been performed), the BA should push hard to ensure that this activity happens.   Listed below are some considerations you can bring to light if you believe that a current state assessment is needed, but you’ve come across a level of resistance:

1)     It helps us get to know our stakeholders & the optimal way to elicit requirements from them

A current state assessment allows us to understand the work that needs to be performed by the organization, and who needs to complete it.  Knowing this will help us identify the key stakeholders, their role within the organization, and their level of influence on the implementation. This can provide us with the opportunity to develop positive relationships at the beginning of the project, and help reduce the risk of missing key stakeholders (who later in the project may feel left out of the process). Also, from a project planning standpoint - when certain activities need to occur in a specific order (i.e. scope definition phase, followed by requirements workshops, etc.) - it’s critical to know who the key stakeholders are as early as possible.

Identifying the stakeholders early on also provides us with the opportunity to use important elicitation techniques during the current state assessment (i.e. such as surveys, interviews, etc.), which can help us to better understand their concerns and opinions.  This can also help us elicit information on as-is processes, issues, gaps, and desired requirements.

In addition, having this level of understanding will help the BA determine the different types of requirements models to be used during a workshop.  This promotes more efficient requirements workshops, since the problem to be solved is better understood.
 
2)      It helps us determine the capability gaps between the current state and desired future state

There is a certain level of performance that the organization is getting from the current solution (even if it’s manual).  Therefore, we need to make sure that any key benefits the organization is receiving from the current solution are not lost when implementing the new solution.  These benefits should become apparent while capturing the current state, and having this knowledge will be necessary when the time comes to determine the potential value that the new solution would bring.

For instance, if upgrading an existing solution, the business may ask the following questions:
  • Will features that currently add value be retained or enhanced?
  • Will existing issues or pain points be reduced or eliminated?
  • Will the new solution also include new capabilities?

Providing answers to these questions can help us work with the business to determine what changes need to be made to the current state so that any new features can be optimally consumed.  Having this understanding also allows us to make decisions that involve process change vs. configuring the solution. This also provides us with the ability to present recommendations up front vs. stumbling across solutions later in the project, which can cause rework or postponement of features.

3)      We can inventory our existing information assets

Having at least a high level understanding of what’s in the current system can be critical to understanding what will be involved with migrating to the new system.  For example, if we are upgrading an EDMS, then we will want to inventory information such as content types, number of documents, taxonomy being used (if any) and the cleanliness of the metadata.  Having this information can at least give us a high level view of our existing system’s current state, so that we can then plan for the effort involved with migrating to the new platform.
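As a rough illustration, even a small script over a metadata export can produce this kind of inventory. The record fields below are assumptions about an export format, not any specific EDMS product's schema:

```python
# Sketch: summarize an EDMS metadata export to gauge migration effort.
# Each record is a dict of document metadata; the field names
# ("content_type", "owner") are illustrative, not tied to any product.
from collections import Counter

def inventory(documents):
    return {
        "total_documents": len(documents),
        "by_content_type": Counter(d.get("content_type", "unknown")
                                   for d in documents),
        # Rough metadata-cleanliness signal: how many documents are
        # missing fields we would consider required after migration?
        "missing_metadata": sum(
            1 for d in documents
            if not d.get("content_type") or not d.get("owner")
        ),
    }
```

Even these three numbers (volume, content type spread, and metadata gaps) give a defensible first estimate of migration cleanup effort.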

4)      It helps with defining the change strategy

Since performing a current state assessment helps us determine the future state, organizations can use the knowledge gained from this to help define and plan for their change strategy. The change strategy will involve performing an impact assessment that defines what set of activities, communications, training, and updates to documented procedures will be required. Additionally, the organization’s ability to adopt these changes will drive the timing for the release of different features.

5)      It helps with assessing potential future solutions

A current state assessment will help us define the organization’s future needs. Having the correct set of needs will then help us outline a cohesive set of user requirements, which will in turn enable the desired future state. This will better position us to accurately assess alternative solutions that can meet these user requirements.  In addition, it helps prevent the organization from being overwhelmed by a chosen solution that they are not ready for or capable of consuming.

Additional thoughts or ideas are certainly welcome – especially from those of you who have also reaped the benefits of performing a current state assessment.  Or, if perhaps Murphy showed up at your project and decided that you should charge ahead without a current state assessment when it was truly needed – it would be good to hear about your experience as well.


Sunday, November 2, 2014

Process vs. Technology - Getting the Right Combination


I think one of the all time great quotes as it relates to technology was from none other than Bill Gates: "The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency."  (No wonder he's the second richest person in the world).

I somewhat experienced the meaning of this quote firsthand as a volunteer BA for a local athletic organization that was in need of a website upgrade. We developed a solid business case that was approved by the Board of Directors. We sent out a survey to our user community, and received a high response rate.  We developed a complete set of user stories and use case/storyboards that were the outcome of three elicitation sessions with a focus group.  The web developer provided a beautiful site per exactly what was specified.  User acceptance testing was well defined, organized, and executed.  The site went live on time and within budget.  Everything seemed to be going perfectly.  But once we went live, a very important feature (the mailing list) was temporarily abandoned due to confusion about what process was needed to leverage it.

If everything seemed to go well and we did all of the right things, then what went wrong?

Looking back, I think what happened is that we went right from our business objectives to what we wanted the website to provide us from a solution standpoint.  If we could go back and do it all over again, I would probably have suggested that we first step back and ask a few simple (but important) questions:

1.What business policies should the system enforce?
2.What is our current business process, and what are its "pain-points"?
3.What improvements can we realistically make to this business process using new technology, but also given our constraints (i.e. timeline, money, people, etc.)?

Sometimes these questions aren't always easy to answer without first performing a level of due diligence.  That said, I think it's fairly safe to say that in order to get the most out of automation, there are three elements that should be taken into consideration up front:

1.Business policy
2.Business process
3.Feasible solution capabilities

All three are important, and need to be in alignment with each other.

For example, is our business policy: “All parents of an athlete who is paid for the upcoming season are to receive important email communications from the club secretary”?  Or is this our business policy: “All families of an athlete who is paid for the upcoming season are to receive important email communications from the club secretary?" It’s important for us to define this, because each of these policies will very likely drive out two very different mailing lists - one with mostly two people per athlete, the other with primarily one.

Once we have our business policy in place, the next step would be to develop the business rules.  For example, we could state:

1.“All parents who would like to be added to the mailing list must first register with the website”
2.“All parents who would like to be added to the mailing list must first send in payment for the athlete"
3."All parents must not be subscribed to the mailing list without their knowledge by a third party"
4."All email addresses on the mailing list must be accurate and cannot contain typos"

Having this cohesive set of rules in place can then help us answer some leading questions. For example:

1.Who really needs to be on our mailing list?
2.Is there one type of mailing list, or are there multiple?
3.How frequently will these mailing list(s) need to be refreshed? (i.e. Once per season?  Once per calendar year?  Never - it just continues to grow, and we let registered users determine if/when they want to unsubscribe?)

The value of this exercise is that we can then begin to think about what could be handled by way of automation, and what our new process will look like.

Asking ourselves some additional questions can then help us further envision the solution scope:

1.Do we have (or can we implement) the processes that can leverage what we want from the solution?
2.Do we have the resources to implement and support the solution we are asking for?
3.Do we have the funding to pay for them? (Especially when weighed against other competing needs that might have more benefit to the organization as a whole).

For example, perhaps we could have integrated the website with a more sophisticated mailing list software that provides a "confirmed opt-in" or "double opt-in" (which automatically sends an e-mail message to the address and requests a response so the recipient agrees to be on the list). Additionally, users could have the ability to unsubscribe on their own whenever desired. But if we always need to perform business rule #2 above (regardless of the technical solution in place), then perhaps going through all the work (and spending the additional money) to implement a more sophisticated tool like this might be overkill.

So perhaps having a very simple mechanism like this might do the trick:

1.Website admin exports the entire list of registered users (at the end of the "open enrollment" period), and sends the list to the secretary
2.Secretary verifies registered parents vs. athletes that are paid for in the upcoming season
3.Secretary sends the verified list back to the website admin
4.Website admin uploads this filtered list into an email group on the back end

Additionally, if someone drops out mid-season, the process might be as simple as sending a request to an unsubscribe inbox, which is then handled manually as a one-off.
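This manual process boils down to a simple filter of registered users against paid athletes, which is worth seeing explicitly before paying for more sophisticated tooling. The data shapes here are made up for illustration:

```python
# Sketch: build the seasonal mailing list by keeping only registered
# users whose athlete is paid for the upcoming season. The data
# shapes (email -> athlete name, set of paid athletes) are
# illustrative, not from any real system.
def build_mailing_list(registered_users, paid_athletes):
    """registered_users maps email -> athlete name; paid_athletes is a
    set of athlete names paid for the upcoming season."""
    return sorted(email for email, athlete in registered_users.items()
                  if athlete in paid_athletes)

def unsubscribe(mailing_list, email):
    """Handle a one-off mid-season unsubscribe request."""
    return [e for e in mailing_list if e != email]
```

When the whole list is around 100 addresses per season, a filter this simple (run once at the end of open enrollment) may genuinely be all the "automation" the process needs.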

Although this type of solution would probably not be feasible for a decent sized company that leverages an email marketing list with thousands of subscribers - it might work quite well for a school athletic team with a modest budget and only about 100 or so people who are on the email list per season.

Lesson Learned: Sometimes when business users state what they want, it isn't exactly what they need to accomplish their goals.  Likewise, sometimes as BAs we may assume that, as requirements are elicited, business users will understand and communicate all of the implications to their business process. To help us develop some common ground, it might be useful to first take the time to capture our business policies and rules, followed by an assessment of what process changes make sense in light of the solution capabilities that are feasible.

Sunday, April 7, 2013

Deliverables vs. Delivering


I remember years ago during football season watching a commercial produced by IBM. This guy was on a beach with a bunch of washed up monitors and I think some other pieces of equipment. I recall his basic message was that you could produce the greatest technology in the world – but if it doesn't add business value - then it’s basically “junk.”

It’s interesting how sometimes on our projects we become rigidly associated with certain SDLC deliverables. From a project management standpoint, this isn’t necessarily a bad thing, since it helps provide a clear means of tracking progress. But if we’re not careful, what can happen is that we become so focused on our deliverables that sometimes we lose sight of who really has the most at stake here. And that’s the user.

I’m not saying that deliverables aren’t important. As BAs, we certainly should have a level of
accountability for the work we are responsible for. It’s just that in the end these splendid artifacts we work so hard to perfect become fairly useless if the solution doesn’t meet the needs of the business.

It’s my belief that a true test of success is the user base getting the business value they expected from the solution in the first place. Keeping the business engaged (with the right level of communication) consistently throughout the project can help increase our chances of achieving this outcome. Also, using good elicitation techniques to get the right requirements from the start is essential. However, if users seem confused or disconnected during acceptance testing, then that’s a red flag. Therefore, it’s important that we use acceptance testing as an opportunity to elicit feedback on how we can help smooth the transition to go-live.

For instance, simple things like having a good user guide in place - which walks the user through everything from getting access to instructions on how they can get the most from the system to do their work - can go a long way towards ensuring true success. Therefore, incorporating user feedback throughout the process not only ensures that our own deliverables are correct, but the users will feel more engaged.  This will significantly help our chances of delivering a successful outcome.

Lesson learned: Focus on the user first, and keep them actively involved throughout the
implementation of the solution.  This not only increases our chance of implementing a solution that works, but it helps us avoid the great misfortune of delivering “junk.”