Seeing the forest through the trees
A Multi-Part IT Service Management Novella
Part 4: “Assess for Success”
By Mukesh Barot, Navy IT Service Management Office & Phil Withers, Navy ITSMO Contractor Support Staff
Editor’s Note: This is the fourth installment in a series designed to highlight the products and services of the Navy IT Service Management Office, relating their capabilities in a business-case story format spread over the succeeding chapters of an IT Service Management novella:
Part 1: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5299
Part 2: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5640
Part 3: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5924
He was thinking that he was in a really good place. Recapping the sequence of events in his mind, Bob thought about his initial foray into IT Service Management after being named a Process Owner, and how overwhelming it all seemed at the time. Had it not been for the resources and assistance he found at the Navy IT Service Management Office, doubtless he’d be referring to ITSM as a dirty four-letter word, as many of his fellow Process Owners did. “Ah, but with those resources…” He absent-mindedly motioned with his finger as if checking off an imaginary to-do list. He had gotten a firm grip on the governance aspects of his process, ensuring that the process managers were trained in the scope of their responsibilities. He had created and promulgated a cohesive strategic communication plan to foster messaging unity for his process in concert with the enterprise strategy. By using the Navy Process Reference Model (NPRM), he had based his process on international standards and industry best practice, and he now had a service quality management plan, based on a Navy ITSMO guide and template, that captured the metrics for his process and stepped logically through implementing and sustaining end-to-end service quality. These were all good things, to be sure.
But he furrowed his brow as he recalled his last conversation with Sally. He had been discussing all of these accomplishments with several other process owners who, by the way, jotted down the links to the various Navy ITSMO documents and information contained on their wiki site. “I should get a commission,” he thought. During that discussion, he had nonchalantly stated that the enterprise Request Fulfillment process was running like a top (or words to that effect).
“How do you know?” Sally said.
“Pardon me?” Bob had been holding court and was on a roll when the question stopped him cold.
Sally continued, “How do you know your process is, as you say, running like a top? What quantifiable measurements have you put in place that show an improvement trend, with identified gaps that let you focus your improvement efforts going forward?”
Of course, Bob knew about the Process Capability Assessment Model and Tool, and had even listened to the overview brief on the Navy ITSMO wiki portal. He was about to name-drop the PCAT when Sally added: “…furthermore, have you identified your process SWOT criteria? A SWOT analysis is a key deliverable that shows leadership you have indeed done the hard work of establishing a process capability baseline – where you are right now – and that you have started on a well-defined path to process improvement – where you want to be in six months, a year, or longer.”
“I was just getting to that!” Bob had said with a wide grin.
He wasn’t grinning now. After that impromptu meeting, he had made a beeline for his office and, sitting at his desk, scoured the local hard-drive folder he had made when he harvested the majority of the Navy ITSMO products that most concerned him. He paused for a second, looked at the ceiling, and tried several phonetic variations: “SWAT? SWOT? Surface Warfare…something...Officer…Training?” It wouldn’t come. Snapping back to the task at hand, he found what he was looking for in the “Assessment” folder: the Process Capability Assessment Model/Tool Assessors Guide from the Navy ITSMO’s Assessment and Audit Library. There was other ‘stuff’ in the folder as well – the PCAT Plan Template, the Report Template, a SWOT Template (Yeah!), a training brief, and other supporting documentation. For now, he opened the PCAT Assessors Guide to get a feel for the scope of the content.
The guide was well laid out. Chapters 2 and 3 interested him the most: Assessment Team Roles; Practices for Interviews, Constraints, and Communication; and Assessment Process Activities. He read through the roles and their associated responsibilities: Sponsor, Coordinator, Observer, Lead Assessor, and Assessor. He found it interesting that here too, as with so many of the Navy ITSMO products and artifacts, the PCAT was based on an international standard: ISO/IEC-15504, Information Technology — Process Assessment. He was also intrigued by the distinction between the assessment model and the tool itself. As it turns out, the ‘tool’ is simply an automated Microsoft Excel spreadsheet that accepts the assessor input and creates a bar-graph depiction of the numerical values. The red meat is the model itself. Almost immediately he noticed the alignment with all the material he had studied in Service Quality Management – the Deming Cycle was on full display as the engine for the capability baseline and incremental process improvement activity.
He took some time for an in-depth review of chapter 3 to become familiar with the activities involved in a process assessment. The sequential activities were displayed in a flow chart:
- Plan & Organize
- Data Collection & Assessment
- Assessment Analysis
- Assessment Reporting
- Presentation & Internal Review
Scoping the assessment seemed to be a big deal according to the documentation, since there were five different levels of capability that a process could be assessed against. These levels were in keeping with the ISO standard: from the lowest level – Level 1 – where the assessors are simply looking at whether the process is actually accomplishing its stated purpose and outcomes, all the way up to Level 5, where the process is not only managed, established, and predictable, but also optimized – meaning it is continually improving through innovation and optimization activities to meet mission and enterprise goals.
Bob also noted the guide revealed seven focus areas the assessors are to concentrate on. In Level 1, only two of these focus areas are assessed: Purpose and Outcomes.
“That makes sense,” he thought. “No use worrying about higher process capability levels if your process can’t even fulfill its stated purpose or isn’t producing its expected outcomes.”
In Levels 2 through 5, the remaining five focus areas come into play:
- Management Activities
- Interfaces (both internal and external)
- Roles and Teams
- Information
- Work Products
And each of the focus areas for these higher capability levels is assessed using more stringent criteria designed to drive a normalized and repeatable process that is continually examined for efficiency and effectiveness – in other words, continually improved.
“Well, that’s all well and good, I suppose, but just what are the criteria that the assessors use for each capability level?” He mulled that question as he scanned the contents, and then hit upon the answer: the attributes applied against the seven focus areas are taken directly from the Navy Process Reference Model (NPRM)! He nodded his head in approval. This made total sense to him. The reference model on which he had based the design of his Request Fulfillment process was also being used to supply the criteria against which the process was being assessed. “That’s why it’s so important to have an architecture model,” he said with an air of satisfaction.
He continued to review the document. He noticed that while the trained assessors used numerical values to assign weight to the evidence they collected about how a process was performing at a particular level, those values would fall into a predefined rating scale that allowed a measure of flexibility in considering how well a process had attained the particular level. The rating scale was, again, drawn from the ISO standard:
- Not achieved (0% to 15%) – There is little or no evidence of achievement of the defined attribute
- Partially achieved (15% to 50%) – There is some evidence of an approach to, and some achievement of, the defined attribute
- Largely achieved (50% to 85%) – There is evidence of a systematic approach to, and significant achievement of, the defined attribute
- Fully achieved (85% to 100%) – There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute
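The four rating bands amount to a simple mapping from a numeric achievement percentage to a rating label. A minimal sketch of that mapping is shown below; this is an illustration only, not the logic of the official PCAT spreadsheet, and the handling of scores landing exactly on a boundary (15%, 50%, 85%) is an assumption.

```python
def rating(percent: float) -> str:
    """Map an achievement percentage (0-100) to its ISO rating band."""
    if not 0 <= percent <= 100:
        raise ValueError("score must be between 0 and 100")
    if percent <= 15:
        return "Not achieved"
    if percent <= 50:
        return "Partially achieved"
    if percent <= 85:
        return "Largely achieved"
    return "Fully achieved"

print(rating(10))   # Not achieved
print(rating(60))   # Largely achieved
print(rating(90))   # Fully achieved
```

In practice the assessors produce the percentage from weighted evidence; the band then communicates the result to leadership in a standard vocabulary.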
The assessors use the PCAT rating sheet to record their evidence, which could include documents, written artifacts, observations and interviews with key process practitioners. Each assessor (a minimum of two assessors is required – more is better) records their evidence independently and assigns their scores. Later in the process, the Lead Assessor helps to de-conflict any wildly disparate ratings among the assessors for the same evidence and they begin the process of “synthesizing” a single rating. Then when all of the focus areas for an assessment level have been assessed per the assessment plan, the assessors really earn their keep by performing a SWOT analysis.
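The de-confliction step described above can be pictured in miniature: each assessor scores the same evidence independently, and the Lead Assessor steps in only when the scores diverge too widely to average into a single synthesized rating. This is a hypothetical sketch; the 20-point spread threshold and the simple averaging are illustrative assumptions, not PCAT rules.

```python
def synthesize(scores: list[float], spread_limit: float = 20.0):
    """Return (synthesized_score, needs_discussion) for one piece of evidence.

    scores: independent ratings from each assessor for the same evidence.
    """
    if len(scores) < 2:
        raise ValueError("a minimum of two assessors is required")
    spread = max(scores) - min(scores)
    if spread > spread_limit:
        # Wildly disparate ratings: the Lead Assessor facilitates
        # discussion before a single rating can be synthesized.
        return None, True
    return sum(scores) / len(scores), False

print(synthesize([60.0, 70.0]))   # (65.0, False)
print(synthesize([20.0, 80.0]))   # (None, True)
```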
“There’s that non-word again,” Bob mumbled at the mention of yet another analysis. But as he read through the guide, he discovered that the real, tangible value in the assessment comes from the analysis of the assessment results to discover the actionable Strengths, Weaknesses, Opportunities and Threats (SWOT) that the organization can take onboard to make quantifiable and measurable improvement in its processes. “Aha! SWOT!” He jotted the acronym down on a sticky note, pulled it off the pad and stuck it on the frame of his computer monitor.
After the hard work of capturing the SWOT analysis comes the reporting – reports and their formats are agreed to in the assessment plan during the Plan & Organize activity of the assessment. This is followed by the actual presentation to the sponsor. “This is all really good stuff, and straightforward too,” Bob thought, which was quickly followed by a conclusion: “I need to get Assessor Training.”
He reviewed the ITSMO Process Capability Assessment Tool Brief and made another sticky note as a reminder to put in a service request for PCAT Assessor training on the Navy ITSMO Service Request System. Leaning back in his chair, he mused aloud, “If I can get my process assessed at Level 1 capability for Purpose and Outcomes, and come out with at least a Largely Achieved rating, then I will have established an objective and quantifiable baseline of process capability against which future measurements can be compared. Scheduling iterative assessments will foster a continual improvement culture that can serve as a pattern for other processes to follow.”
He smiled to think how far he had come in his thinking – it was no longer just about his process, but rather it was about multiple processes supporting end-to-end quality service delivery to the customer, a lesson he learned from the Navy ITSMO’s Service Quality Management practice. With his process design based on an approved enterprise architecture model, and his assessment methodology based on international standards linked to that model and supported by a tool, he was certain he had the winning formula to not just improve his service delivery, but to show that improvement to leadership.
About the Navy ITSMO
Chartered in April 2012, the Navy ITSMO provides IT Service Management thought leadership and assistance by creating usable products and services for the Navy ITSM community. The Navy ITSMO strives for alignment of enterprise IT architecture through discrete but interlocking practice areas to help define and support organizational IT governance and management requirements. The Navy ITSMO résumé boasts industry-certified expertise in ITIL, COBIT, Program and Project Management, DoDAF, IT Risk Management and Control, IT Skills Framework, Service Quality, CMMI, ISO/IEC-20000, ISO/IEC-15504, Information Security, Enterprise IT Governance, and Assessment and Audit.