Figure 5. MOC scoping practices of 4 companies.
Case Study: An Ideal, Scoped MOC
The quality of scoping, illustrated in the previous section, ranges from marginal (Company A) to superb (Company N). Most companies had clearly identifiable gaps in their MOC scoping approaches. For the record, none of companies A through O were Gateway clients when they created their MOC procedures, so I can't take responsibility for their failures or claim credit for the successes (Company N)!
But I would like to share with you a project that we completed last month.
The client is a global chemical company. They have about 100 sites worldwide; a few are quite large (hundreds of employees), and the majority are quite small (fewer than 10 people per site). Highly hazardous chemicals, of the kind found in Appendix A of the PSM regulation, exist at a few sites, but not the majority. Everyone is obliged to use the MOC process, whether they are at a PSM-regulated site or not.
At the start of the project, the existing MOC procedure was similar to Company H, shown in Figure 4, although I must emphatically state that the client is not Company H.
The company had already assembled a team of about 6 key stakeholders who participated in face-to-face meetings, plus an additional 5 in Europe who initially participated by teleconference but attended the final meeting in person.
The scoping redesign took place over 3 facilitated team meetings, with individual work in the interim.
The first team meeting was a full day. The team “tore apart” the existing process. Every item was questioned, and, after having read this newsletter, you already know what the questions are: what? what kind? who? why? when? This highlighted data gaps, like the lack of an official list of units for all the plants.
Between the first and second meetings, the facilitator reconstituted the scoping process based on the input from the first meeting. The second meeting was again face-to-face, but this time only a half day.
The facilitator presented the updated scoping process at the start of the second meeting. The team critiqued what the facilitator had prepared. This meeting was considerably more enjoyable, since it’s always more fun to critique the facilitator’s work than it is to critique one’s own work (meeting 1).
Between the second and third meetings, the facilitator quickly updated the scoping materials, and distributed them. Each team member was asked to be extremely critical and identify any scoping detail that they did not entirely agree with. This feedback was consolidated, and the final scoping ruleset was updated by the facilitator.
The third meeting was only 2 hours, and really served to ratify the work that the team had accomplished.
What was the outcome?
Using the evaluation approach described in the previous sections, the new MOC scoping approach at this company is as shown in Figure 6. To make the symbols in all the diagrams comparable, they are sized to the same scale; that is, a symbol with a given area in Figure 6 represents the same number of action items as a symbol with the same area in Figure 5.
The new scoping approach at this company incorporates all of the positive attributes previously identified:
- 100% of action items are fully-formed: the action, the type of action, the role, the reason for the action, and the timing are all specified
- All the action items that can be specified¹ up front are specified. Little is left to individuals' memories.
- The separation between redlining (blue actions during Change Design) and document updating (blue actions during Close-Out) is recognized.
- The quantity and extent of hazards at this company are less than at Company N. So it's reasonable that there are fewer scoping items, as indicated by the area of the symbols in Figure 6 being smaller than the area of the symbols in Figure 5.
Figure 6. An ideal, scoped MOC.
Can You Do This at Your Site?
The critical resource at most sites today is manpower. People have “too much to do” and are often resistant to taking on a new, perhaps open-ended initiative.
The team’s contribution to this activity was as follows:
- Initial meeting: 8 hours for each team member
- Interim data gathering: 8 hours for the team leader
- Second meeting: 4 hours for each team member
- Interim review: 2 hours for each team member; 8 hours for the team leader
- Third meeting: 2 hours for each team member
To achieve this optimized level of team involvement, a properly experienced facilitator is a necessity. If you and your team are able to make this time commitment, better-scoped MOCs may be within your grasp as well!
The new scoping process has been implemented in the client’s electronic MOC system. The users are very, very enthusiastic about the improvements to the MOC system.
¹ Certain action items, like PHA follow-up items and PSSR punchlist items, obviously cannot be determined during scoping, and are added into the process later, as needed.
Hoff, R., 2011, "Scoping – Action Items," MOC Best Practices, 5(2).
OSHA, 1992, "Process Safety Management of Highly Hazardous Chemicals," 29 CFR 1910.119, OSHA, Washington.
Appendix A: MOC Scoping Coding
As stated in the body of this newsletter, a “fully-formed” scoping question would contain all 5 elements:
- Information: Does this change involve the plant firewater system?
- Action: Review the MOC
- Role: Area Safety Rep.
- Action Item Type: Review
- State: Approvals state
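The 5 elements above can be sketched as a small record type. Here is a minimal Python illustration; the class and field names are my own shorthand, not terminology from any particular MOC system:

```python
from dataclasses import dataclass

@dataclass
class ScopingItem:
    """One fully-formed MOC scoping action item (field names are illustrative)."""
    information: str       # the scoping question asked of the MOC originator
    action: str            # what must be done if the answer is "yes"
    role: str              # who performs the action
    action_item_type: str  # e.g. "Review", "Sign-off", "Perform"
    state: str             # the MOC workflow state in which the action occurs

def fully_formed(item: ScopingItem) -> bool:
    """An item is fully formed only when all 5 elements are specified."""
    return all([item.information, item.action, item.role,
                item.action_item_type, item.state])

# The firewater example from above, with all 5 elements filled in.
firewater = ScopingItem(
    information="Does this change involve the plant firewater system?",
    action="Review the MOC",
    role="Area Safety Rep.",
    action_item_type="Review",
    state="Approvals",
)
```

With this structure, the firewater example is fully formed, while an item that leaves the role or state blank would not be.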
While the information, action, and role can be directly determined from the scoping questions, the "action item type" and "state" usually could not be extracted from the MOC procedures, because many would regard action item type and state as the author's terminology, and perhaps not industry standard.
Most of the time, the form/procedure gives no indication of when the task is to be performed. But we'll assume that the people running the process are sensible and will behave reasonably. So certain things are allocated to certain states, even if the procedure doesn't explicitly spell it out. Specifically:
- Anything labeled “approval” will be a sign-off type action item, performed during the Approvals state,
- Anything labeled “review” will be a review type, conducted in the Approvals state,
- Anything labeled “redline” will be a perform type, conducted in the Change Design state,
- Anything labeled "update" will be a perform type, conducted in the Close-Out state,
- Anything related to impact analysis will be in the Impact Analysis state, if it uses a verb. If it's informational, we leave it as informational,
- Anything using the word “PSSR” will be in the PSSR state,
- Anything that is described as a plant notification will be in the PSSR state,
- Anything that is described as training will be in the Implementation state.
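These defaulting rules amount to a small keyword-to-classification table. A hedged Python sketch follows: the keywords and state names come from the list above, but the function itself is my own illustration, and where the list specifies only a state, the action item type is left as None:

```python
# Keyword -> (action item type, MOC state), per the defaulting rules above.
# The impact-analysis rule (verb vs. informational wording) needs human
# judgment, so it is not encoded here.
DEFAULTS = [
    ("approval",     ("Sign-off", "Approvals")),
    ("review",       ("Review",   "Approvals")),
    ("redline",      ("Perform",  "Change Design")),
    ("update",       ("Perform",  "Close-Out")),
    ("pssr",         (None,       "PSSR")),            # state only, per the list
    ("notification", (None,       "PSSR")),
    ("training",     (None,       "Implementation")),
]

def classify(label):
    """Return (action_item_type, state) for a scoping-item label,
    or None when no defaulting rule applies."""
    text = label.lower()
    for keyword, result in DEFAULTS:
        if keyword in text:
            return result
    return None
```

For example, `classify("Update the operating procedures")` yields a perform-type item in the Close-Out state, while a label matching no keyword falls through for manual allocation.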