IMCS 17,3
248 Received 12 September 2008 Revised 11 November 2008 Accepted 14 November 2008
Determinants of the critical success factor of disaster recovery planning for information systems
Wing S. Chow and Wai On Ha
Department of Finance and Decision Sciences, Hong Kong Baptist University, Kowloon Tong, Hong Kong, China
Abstract
Purpose – Recent disaster recovery planning (DRP) literature has mainly focused on qualitative research, while neglecting the quantification of critical success factors (CSFs) for the information systems function (ISF). This paper aims to address this issue.
Design/methodology/approach – This paper first conducts an extensive literature review, and then identifies 62 DRP measurement items for ISF. A questionnaire survey based on these 62 measurement items is used for data collection, and 129 managers of DRP in ISF participated in the study.
Findings – Through the use of convergent factor analysis, this paper identifies ten DRP CSFs for ISF. They are: DRP documentation; DRP steering committee and DRP testing; DRP policy and goals; DRP training; DRP maintenance and staff involvement; DRP minimum IS processing requirements; top management commitment to DRP; prioritization of IS functions/services; external, off-site back-up system; and internal, on-site back-up system.
Originality/value – This paper determines success factors based on a set of decision variables gathered from an extensive literature review of DRP in information systems.
Keywords Information systems, Critical success factors, Disasters
Paper type Literature review
Information Management & Computer Security, Vol. 17 No. 3, 2009, pp. 248-275. © Emerald Group Publishing Limited, 0968-5227. DOI 10.1108/09685220910978103
1. Introduction
People have typically associated disasters with catastrophic events such as fires, hurricanes, and tornadoes (Rosenthal and Himel, 1991; Blake, 1992; Krousliss, 1993). Wrobel (1997) summarizes that disasters can be caused by three different phenomena, namely natural causes, human error, and intentional causes. The definition of a disaster is thus not solely confined to natural catastrophic events like the tsunami in South-east Asia in 2004 or Hurricane Katrina on the Gulf Coast of the USA in 2005, but may also include any event that significantly affects the operation of an organization, such as human error in data entry, or intentional acts like the September 11 attacks in the USA in 2001. The process of disaster recovery planning (DRP) sets out a series of strategies that enable the resumption of critical functions for an organization to an acceptable level of service in the event of any of these various types of disaster. DRP is considered to be one of the most critical management issues for both private and public organizations in our present age of digitalization. The reason for this is that a halt of any information systems function (ISF) service may be truly devastating for the operational capacity and reputation of an organization (Elstien, 1999).
Putting an effective DRP in place can reduce the severity of a potential disaster as well as allay staff anxiety, and can then speed up the recovery processes where necessary (Pember, 1996; Rutherford and Myer, 2000). DRP is not a new concept for large organizations around the world, because many governments ordered organizations within their borders to implement DRP systems for their ISF services, so that their computer systems could survive a possible Y2K bug. To date, much anecdotal evidence has been published supporting the significance of DRP for ISF. The published evidence includes that on IT disaster preparedness by Nelson (2006) and Botha and von Solms (2004), the contingency planning guide for information technology systems by the US National Institute of Standards and Technology, and the ISO/IEC 27000 series of international standards, which are reserved for a family of information security management standards. However, crucially missing from the DRP literature is any overview study that determines DRP critical success factors (CSFs) for ISF based on relevant measurement items collected from an extensive literature review. The objectives of this paper are three-fold: to provide a literature review on the "critical success factors" for DRP, to outline a methodology for assessing the CSFs in an organization based on the perceived importance assigned by DRP managers at a given time, and to provide empirical research to substantiate the previous two objectives.
The following sections will review the DRP literature and relevant DRP measurement items, discuss the model development, present the research methodology, and discuss the findings, before offering a conclusion.
2. Disaster recovery planning
There is much literature documenting the success factors of DRP implementation. These success factors, based on either an organizational or departmental level perspective, are described in terms of events, items, components, operational steps, or constructs. At the organizational level, Francis (1993) examines the DRP process of Chi/Cor Information Management, Inc. and reports that DRP implementation should comprise the following ten processes: project organization, business impact analysis, security review, strategy development, back-up and recovery, alternative site selection, disaster recovery plan development, testing, maintenance, and periodic audit. Dwyer et al. (1994) summarize nine DRP success factors from the reference guidelines of the systems auditability and control report, namely organizing and managing the project, performing organizational impact analysis, determining minimum processing requirements, analyzing risks, prioritizing tasks to recover, analyzing alternatives and selecting strategy, developing the plan, testing the plan, and maintaining the plan. Similarly, Smith and Sherwood (1995) propose nine DRP success factors, namely policy statement, planning responsibilities, incident management responsibility, business impact analysis, recovery strategies, training and awareness, testing, maintenance and review, and documentation.
One study by Vartabedian (1999) details a total of 14 successful operation events for implementing DRP: problem recognition; need justification; management support; dollar, time, and resource commitment; recovery team selection; business impact analysis; risk analysis; plan content; responsibilities of the recovery team; back-up procedures; disaster implementation task; post-plan activities; testing; and maintenance.
At the departmental level, Wong et al. (1994) propose that an effective DRP for ISF should consist of nine procedural steps: obtaining top management commitment, establishing a planning commitment, performing risk and impact analysis, prioritizing recovery needs, selecting a recovery plan, selecting a vendor and developing agreement, developing and implementing the plan, testing the plan, and continually testing and evaluating the plan. Blatnik (1998) describes the DRP implementation for ISF as including nine successful events, namely enforcement of policy, threat analysis, back-up strategies, training, testing, documentation, regular reviews, regular updates, and ISF personnel participation. Adam (1999) suggests that DRP implementation in human resources departments should take into account the following DRP components: compiling an emergency schedule, assigning an emergency crew, establishing an alternative site, training, back-up, and updating the plan. For a simple DRP for use in human resources departments, Solomon (1994) proposes three basic elements: assessment of the vulnerabilities and resources, examination of communication options, and exploration of alternative locations. In a study of DRP implementation within small and large accounting firms, Ivancevich et al. (1997) state that the following measurement items can be used to ensure a successful DRP system: management support, risk analysis, fire and water insurance, identification of critical applications, alternative site processing, off-site storage, file back-up procedures, communication policies to handle disruption of phone lines or cables, testing the plan, documentation of immediate action, documentation of the plan, and updating the plan.
3. DRP constructs and their measurement items
This section presents an extensive literature review on DRP constructs for ISF and their measurement items. Table I shows a comprehensive list of DRP constructs gathered from the IS-related literature. In this table, we have identified 14 DRP constructs and surveyed their measurement items from IS-related literature. These constructs are as follows:
(1) obtaining the ISF top management commitment for DRP (hereafter, "top management commitment");
(2) establishing ISF policy and goals for DRP (policy and goals);
(3) forming the ISF steering committee for DRP (steering committee);
(4) performing IS risk assessment and impact analysis for DRP (risk assessment and impact analysis);
(5) prioritizing IS functions for DRP (prioritization);
(6) determining the minimum IS processing requirement for DRP (minimum processing requirement);
(7) selecting IS alternative sites for DRP (alternative site);
(8) developing IS back-up storage systems for DRP (back-up storage);
(9) forming an IS recovery team for DRP (recovery team);
(10) training ISF staff to activate DRP (training);
(11) launching DRP testing in ISF (testing);
(12) documenting DRP procedures for ISF (documentation);
Table I. A list of DRP constructs in literature
(13) maintaining DRP in ISF (maintenance); and
(14) encouraging the participation of ISF personnel for DRP (ISF personnel participation).
The following section describes each of these DRP constructs, together with their relevant measurement items.
3.1 Top management commitment
Top management commitment is considered the most vital construct for the success of DRP. Ginn (1989) states three reasons to support such a claim: first, top management finalizes an annual budget to support DRP implementation in an organization; second, top management decides when and how the DRP should be implemented in an organization; third, top management dictates the level of cooperation and support that should be provided by the various departments when a DRP is launched in an organization. Furthermore, this construct is considered critically important because DRP requires long-term planning and involves ongoing capital investment (Chow, 2000; Cerullo et al., 1994). The DRP measurement items for the top management commitment construct are to:
(1) provide adequate financial support for DRP development (Zolkos, 2000; Pember, 1996; Ferraro and Hayes, 1998);
(2) commit to DRP development (Gluckman, 2000; Ginn, 1989; Rohde and Haskett, 1990);
(3) support DRP development (Bodnar, 1993; Coleman, 1993); and
(4) accept responsibility for DRP quality of output (Warigon, 1999; Carlson and Parker, 1998).
The Appendix reveals these four measurement items, which are Q1-Q4.
3.2 Policy and goals
The policy and goals of DRP must be clearly outlined so that the DRP is clear to everyone in the organization. It is important to establish policy and goals for an effective DRP in an organization. The purpose of a policy statement is to set the guidelines for disaster recovery and define who is accountable for the DRP planning process (McNurlin, 1988; Turner, 1994). The policy statement should clearly define realistic DRP goals and their objectives (Salzman, 1998). With clear goals and objectives, recovery strategies can be established in a cost-effective manner (Rothstein, 1998). The DRP measurement items for the policy and goals construct are to:
(1) define the scope (Iyer and Sarkis, 1998);
(2) define goals and objectives (Meade, 1993; Kovacich, 1996);
(3) define realistic goals and objectives (Hiatt and Motz, 1990);
(4) establish policy (Petroni, 1999; Turner, 1994); and
(5) have a clear vision (Zolkos, 2000).
The Appendix reveals these five measurement items, which are Q5-Q9.
3.3 Steering committee
A steering committee must be formed and appointed by the top management, so that all functional units offer their full cooperation. The DRP steering committee should have adequate visibility and be granted an appropriate degree of authority (Bodnar, 1993). The steering committee should consist of representatives from each functional unit so that their views on critical DRP events can be accurately gathered (Wong et al., 1994). The steering committee may also include external DRP consultants because they can make recommendations objectively, without the consideration of office politics (Peach, 1991). The DRP construct of steering committee includes the following measurement items, the:
(1) formation of a steering committee (Hawkins et al., 2000; Wong et al., 1994);
(2) participation of representatives from all functional units (Cerullo and Cerullo, 1998; Wong et al., 1994); and
(3) participation of external consultants (Leary, 1998; Smith and Sherwood, 1995).
The Appendix reveals these three measurement items, which are Q10-Q12.
3.4 Risk assessment and impact analysis
Risk assessment and impact analysis determine how long an organization can survive without the support of critical business functions when a disaster strikes (Rothstein, 1998). All critical functions must be pre-determined before a DRP strategy is chosen for an organization (Chow, 2000). Here, the "risk assessment" identifies the business events/functions that are most likely to pose threats to a firm (Gallegos and Wright, 1988), and the "impact analysis" refers to the evaluation of the consequences of a disaster, such as the financial and non-financial loss of business functions (Doughty, 1991). The DRP measurement items for this construct are to:
(1) analyze financial loss (Wong et al., 1994; Doughty, 1991);
(2) analyze security control (Gilbert, 1995; Bodnar, 1993);
(3) analyze adverse events (Haubner, 1994; Rosenthal and Himel, 1991);
(4) analyze security weakness (Menkus, 2000; Moch, 1999); and
(5) rank the security risk (Coult, 1999; Cerullo and Cerullo, 1998).
The Appendix reveals these five measurement items, which are Q13-Q17.
3.5 Prioritization
The prioritization construct refers to the ranking of the business functions on which a firm is most dependent for surviving a disaster. Prioritization is an important element in DRP success because not all business functions are equally important and susceptible to the disruption caused by a disaster (Kull, 1982). Thus, critical missions should be given high priority for recovery (Coleman, 1993). The DRP measurement items for this construct are to prioritize:
(1) critical functions (Wong et al., 1994; Murphy, 1991);
(2) critical applications (Wong et al., 1994; Kull, 1982);
(3) recovery activities (Lee and Ross, 1995; Frost, 1994);
(4) requirements (Lee and Ross, 1995; Frost, 1994); and
(5) recovery schedule (Cerullo and Cerullo, 1998).
The Appendix reveals these five measurement items, which are Q18-Q22.
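The ranking exercise described above amounts to scoring each business function by its importance and its tolerance for downtime, then sorting. The sketch below is a minimal illustration of that idea; the function names, impact ratings, and downtime figures are invented for this example and are not data from the study.

```python
# Hypothetical sketch: rank business functions for recovery priority.
# Each function gets a criticality score combining financial impact
# (1-5) and maximum tolerable downtime (lower tolerance -> higher
# priority). All figures are illustrative assumptions.

functions = [
    # (name, financial impact 1-5, tolerable downtime in hours)
    ("order processing", 5, 4),
    ("payroll", 3, 72),
    ("email", 2, 24),
    ("data warehouse reporting", 1, 168),
]

def priority_score(impact, downtime_hours):
    """Higher impact and shorter tolerable downtime raise the score."""
    return impact / downtime_hours

# Most critical functions first in the recovery schedule.
ranked = sorted(functions, key=lambda f: priority_score(f[1], f[2]), reverse=True)
for name, impact, downtime in ranked:
    print(f"{name}: score={priority_score(impact, downtime):.3f}")
```

Any scoring rule could be substituted; the point is only that prioritization turns qualitative judgments about criticality into an explicit recovery ordering.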
3.6 Minimum processing requirement
When a disaster strikes, no organization has sufficient time or resources to recover every business function in a short period of time. An effective DRP should, therefore, address the minimum processing requirement that would ensure that company operations are recovered to an acceptable level (Coleman, 1993). The minimum processing requirement determines an acceptable recovery time, that is, the point in time to which data must be restored (Myatt, 1999), and the maximum allowable downtime of business functions that a company or functional unit can withstand (Wong et al., 1994). Relevant personnel should be consulted to form the minimum processing requirement, so that the actual requirement is properly captured (Doughty, 1991). The DRP measurement items for this construct are to determine:
(1) the minimum processing requirement (Baker, 1995; Heng, 1996);
(2) the maximum allowable downtime (Myers, 1999; Wong et al., 1994);
(3) the recovery time (Gluckman, 2000; Douglas, 1998; Tilley, 1995); and
(4) what data must be stored (Myatt, 1999; Douglas, 1998; Tilley, 1995).
The Appendix reveals these four measurement items, which are Q23-Q26.
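The two quantities above — an estimated recovery time and a maximum allowable downtime per function — can be checked against each other mechanically. The sketch below is a hypothetical illustration of that check; all figures are invented and do not come from the paper.

```python
# Hypothetical sketch: verify that the estimated recovery time for each
# business function fits within the maximum allowable downtime that the
# functional unit says it can withstand. Figures are invented examples.

requirements = {
    # function: (estimated recovery time in hours, max allowable downtime in hours)
    "order processing": (2, 4),
    "payroll": (48, 72),
    "email": (30, 24),
}

def meets_requirement(recovery_time, max_downtime):
    """True if the plan restores the function within its allowed downtime."""
    return recovery_time <= max_downtime

# Functions whose recovery plan misses the minimum processing requirement.
shortfalls = [name for name, (rt, md) in requirements.items()
              if not meets_requirement(rt, md)]
print("functions missing the requirement:", shortfalls)
```

Any shortfall flagged this way signals that either the recovery strategy must be strengthened or the stated downtime tolerance renegotiated with the relevant personnel.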
3.7 Alternative site
Firms that are highly dependent on IS applications must consider an alternative site at which they can back up their IS resources/databases, so that these can be recovered easily in the event of a disaster (Blake, 1992). Alternative sites can be operated at either an external site or an in-house site, and can be implemented in the mode of a hot site, a cold site, mobile recovery facilities, or a mirrored site (Hawkins et al., 2000). Each of these alternative-site options involves trade-offs, so a firm must establish its selection criteria and then perform a cost-benefit analysis (Yiu and Tse, 1995). The DRP measurement items for this construct are to establish:
(1) a clear vision of an alternative site (Blake, 1992);
(2) an in-house site (Leary, 1998; Peach, 1991); and
(3) an external site (Hawkins et al., 2000; Rothstein, 1998; Wong et al., 1994).
The Appendix reveals these three measurement items, which are Q27-Q29.
3.8 Backup storage
Backup storage refers to how all relevant IS data should be backed up and kept in a safe place from which they can, when needed, be retrieved quickly for restoration. Backup storage can be based either on- or off-site (Rohde and Haskett, 1990). On-site backup storage refers to the storage of data/resources in different locations within an organization (Hawkins et al., 2000), whereas off-site backup storage refers to the storage of data/resources on a remote site. Arnell (1990) indicates that off-site storage should be located far enough from the present company that the backup storage is not affected by the same disaster. The DRP measurement items for this construct are to:
(1) establish off-site backup (Hawkins et al., 2000; Rohde and Haskett, 1990);
(2) establish on-site backup (Hawkins et al., 2000; Rohde and Haskett, 1990);
(3) establish a backup schedule (Haubner, 1994); and
(4) subscribe to insurance coverage (Coult, 1999; Eckerson, 1992).
The Appendix reveals these four measurement items, which are Q30-Q33.
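One way to reason about the backup-schedule item above is that the backup interval bounds the worst-case data loss: a disaster striking just before the next run loses up to one full interval of work. The intervals and site labels in the sketch below are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch: the backup interval bounds the worst-case data
# loss, since a disaster striking just before the next backup run loses
# up to one full interval of data. Intervals below are invented examples.

backup_interval_hours = {
    "on-site": 1,    # e.g. hourly snapshots kept within the organization
    "off-site": 24,  # e.g. nightly copy shipped to a remote site
}

def worst_case_data_loss(interval_hours):
    """Maximum hours of work lost if disaster strikes just before the
    next scheduled backup completes."""
    return interval_hours

# If a site-wide disaster destroys the on-site copies (Arnell's point
# about distance), recovery falls back to the off-site copy, so the
# worst case is governed by the off-site interval.
local_loss = worst_case_data_loss(backup_interval_hours["on-site"])
site_wide_loss = worst_case_data_loss(backup_interval_hours["off-site"])
print(f"local incident: up to {local_loss}h lost; "
      f"site-wide disaster: up to {site_wide_loss}h lost")
```

This is why the backup schedule and the off-site/on-site split are listed as separate measurement items: tightening one without the other leaves the worst case unchanged.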
3.9 Recovery team
The recovery team coordinates recovery tasks in an effective manner when a disaster strikes (Donovan et al., 1999). Miller (1997) states that a team approach to managing the recovery process in the event of a disaster is important for two reasons: first, not all relevant staff may be present when a disaster strikes; second, when more of the right people are involved, more intelligent answers to DRP problems may be generated. ISF personnel must also participate in the recovery team so that a faster restoration of ISF functions/services can be ensured in the event of a disaster. The DRP measurement items for this construct are to:
(1) participate in the recovery team (Miller, 1997; Fong, 1991); and
(2) be involved in the recovery team (McNurlin, 1988; Fong, 1991).
The Appendix reveals these two measurement items, which are Q34-Q35.
3.10 Testing
A series of test programs needs to be developed to make sure the DRP is a complete and accurate product. Testing can be designed in such a way that the weaknesses of a DRP can be identified (Lee and Ross, 1995). The DRP should also be tested in such a way that it renders minimal disturbance to the daily operations of an organization. The DRP measurement items for this construct are to test the plan:
(1) as though a disaster had happened (Edwards and Cooper, 1994; Wong et al., 1994);
(2) by duplicating processing (Wong et al., 1994);
(3) by recovering procedures (Wong et al., 1994);
(4) by recovering functions (Edwards and Cooper, 1994; Wong et al., 1994); and
(5) by simulating a disaster (Edwards and Cooper, 1994; Wong et al., 1994).
The Appendix reveals these five measurement items, which are Q36-Q40.
3.11 Training
Once the DRP is developed, all staff involved in the plan must know their roles and duties. A training program is therefore required to ensure that all staff understand their positions, which will subsequently reduce the potential for operational errors and the opportunity for miscommunication when the plan is implemented during a real disaster (Turner, 1994). Training is thus essential. The DRP measurement items for this construct are to train staff in:
(1) disaster responsiveness (Salzman, 1998);
(2) stress management (Paton and Flin, 1999; Turner, 1994);
(3) team skills (Solomon, 1994; Paton and Flin, 1999);
(4) damage assessment (Turner, 1994);
(5) notification (Turner, 1994);
(6) responsibility and duties (Smith and Sherwood, 1995); and
(7) the relocation of departments (Morwood, 1998).
The Appendix reveals these seven measurement items, which are Q41-Q47.
3.12 Documentation
The "documentation" construct refers to a set of manuals and procedures that outlines for the DRP recovery team all respective events related to DRP in the event of a disaster. In addition, the exact details of the functions, personnel, responsibilities, contact names and numbers, and equipment of the DRP, involved with a disaster, must be documented (Jacobs and Weiner, 1997). The DRP measurement items for this construct are to document:
(1) recovery procedures (Salzman, 1998; Doughty, 1993);
(2) the procedures of emergency responsibility (Salzman, 1998; Ferraro and Hayes, 1998);
(3) the roles and responsibilities of the recovery team (Salzman, 1998; Doughty, 1993);
(4) team members and contact numbers (Miller, 1997; Jacobs and Weiner, 1997);
(5) backup operations (Salzman, 1998; Miller, 1997);
(6) the personnel to contact (Coult, 1999; Jacobs and Weiner, 1997);
(7) the notification procedure (Coult, 1999; Jacobs and Weiner, 1997); and
(8) the list of suppliers/vendors (Pember, 1996; Miller, 1997).
The Appendix reveals these eight measurement items, which are Q48-Q55.
3.13 Maintenance
Rothstein (1988) claims that the creation of a DRP without periodic testing and ongoing maintenance is worse than not having a DRP at all. The maintenance construct is important to reduce the likelihood of incorrect decisions being made and to decrease the stress of disaster-team members during the recovery process (Frost, 1994). To keep up with ever-changing information system technology, the DRP should be reviewed and tested on a regular basis (Peterson and Perry, 1999). Each time a DRP is altered, those changes must be updated in the plan. The DRP measurement items for this construct are to:
(1) test the DRP periodically (Lee and Ross, 1995; Karakasidis, 1997);
(2) evaluate the DRP continually (Mitome et al., 2001; Miller, 1997);
(3) review the DRP regularly (Ebling, 1996; Matthews, 1994);
(4) update DRP changes (Norman, 1993; Korzeniowski, 1990); and
(5) audit the DRP (Bodnar, 1993; Francis, 1993).
The Appendix reveals these five measurement items, which are Q56-Q60.
3.14 ISF personnel participation
ISF personnel must participate in and monitor the development processes of DRP in an organization (Wong et al., 1994). Although ISF personnel may not be DRP team members, they should contribute their technical knowledge at all stages (Rutherford and Myer, 2000). Since ISF personnel should know their duties and responsibilities within the disaster recovery process, they should review the plan and check whether the recovery operation procedures operate as planned. In addition, ISF personnel should review the DRP regularly from a technical standpoint so that ISF service disruption is kept to a minimum (Blatnik, 1998). The DRP measurement items for this construct are to participate in: (1) recovery duties and responsibility (Blatnik, 1998; Butler, 1997); and (2) reviewing the DRP (Blatnik, 1998; Baker, 1995). The Appendix reveals these two measurement items, which are Q61-Q62.
4. Methodology
4.1 Study subject
The research sample was based on the directory of Top 2000 Foreign Enterprise in Hong Kong 1999. Two criteria were adopted when selecting potential respondents: first, the organizations of all selected candidates had to have a DRP for their ISF; second, the selected candidates had to have experience of, and be in charge of, the DRP in their present organization. ISF departments were contacted directly to explain the purpose of this research and to obtain the names and positions of target respondents. Subsequently, a total of 500 potential participants were contacted, and a questionnaire, together with a covering letter, was mailed to them.
4.2 Instrument development
The 62 measurement items of DRP, as shown in the Appendix, were used in the first part of our questionnaire. Managers were asked to evaluate the importance of these measurement items on a scale of 1-5, where a value of 5 represented the most important and a value of 1 the least important. These measurement scores were later used to determine the critical success constructs of the DRP for ISF. The second part was used to collect demographic data on our respondents.
4.3 Data collection procedure
A structured questionnaire with a cover letter was used in the data collection. A complete set of questionnaires was sent to each selected respondent with a return envelope. In total, 139 replies were returned, though ten were incomplete and so were discarded. Therefore, 129 questionnaires were used for the data analysis, which constituted a response rate of 26.7 percent. Table II shows the characteristics of the respondents. Of these, about 23 percent held an IS director position, 70 percent were managers, and the remaining 7 percent held the position of IS executives. About 80 percent of the respondents had more than three years' working experience with the DRP system in ISF, with the remaining 20 percent having one to three
years' DRP working experience. This information shows that the respondents had gained in-depth DRP knowledge in their organizations. About 19-20 percent of the respondents worked in each of the banking/finance/insurance and manufacturing sectors, and about 15 percent came from the IT industry. The information also shows that 38 percent of the companies hired more than 500 employees, about 21 percent hired 251-500 employees, and the remaining companies hired fewer than 250 employees. About 85 percent of the respondents claimed that IS services are important to daily operations in their organizations, with about 11 percent rating the dependency on IS services as crucial and critical, and the remaining 4 percent as neutral. Respondents' DRP experience ranged from seven to 25 years.

Table II. Characteristics of the respondents

Respondent characteristics                  Total numbers   Percentage
Position
  Director                                       30            23.3
  Manager                                        90            69.8
  Executives                                      9             7.0
Years handling DRP in ISF
  1-3 years                                      23            17.8
  Over 3 years                                  106            82.2
Company size
  1-50                                            8             6.2
  51-100                                         21            16.3
  101-250                                        24            18.6
  251-500                                        27            20.9
  Above 500                                      49            38.0
Industry type
  Banking/finance/insurance                      24            18.6
  Retail/wholesale                               11             8.5
  Manufacturing                                  25            19.4
  Restaurant/hotel                                4             3.1
  Transportation/shipping                         5             3.9
  Real estate                                     2             1.6
  Computers/telecommunications/networking        19            14.7
  Medicine/health                                 2             1.6
  Media/publishing                                6             4.7
  Utilities                                       5             3.9
  Others                                         25            20.2

5. Results
This paper first analyzes the content validity, the mean scores of the proposed constructs, the CSFs of DRP, their reliability, and their labeling.
5.1 Content validity
In this part, the content validity of the instrument is assessed. Content validity refers to the extent to which the measurement items of a construct/component actually represent the meaning of that construct/component (Babbie, 1992). A content validity test is judged subjectively by researchers rather than proven by statistical testing (Saraph et al., 1989). Because the contents of the proposed instrument in this paper are derived from an extensive literature review, as reported in Section 3, the content validity of the instrument is reasonably justified.
5.2 Mean scores of proposed DRP constructs
Table III reports the mean scores of the proposed DRP constructs. In this table, a mean score of 5 for a DRP construct represents that the construct is highly important, and vice versa for a value of 1. It is shown that the construct of backup storage has the highest score (mean score = 4.118), while the construct of alternative site has the lowest score (mean score = 3.155).
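The construct statistics in Table III are simple aggregates of the 1-5 item scores. The sketch below illustrates the computation; the response matrix is simulated (the published raw survey data are not available), and only two of the fourteen constructs are shown.

```python
import numpy as np

# Simulated stand-in for the survey data: 129 respondents rating 62 items
# on the 1 (least important) to 5 (most important) scale of Section 4.2.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(129, 62))

# Question ranges per construct follow Table III (1-based, inclusive).
constructs = {
    "Top management commitment (F1)": (1, 4),
    "Backup storage (F8)": (30, 33),
}

for label, (first, last) in constructs.items():
    scores = responses[:, first - 1:last]  # 0-based column slice
    # The construct mean/SD pool every respondent's rating of every item.
    print(f"{label}: mean={scores.mean():.2f}, SD={scores.std(ddof=1):.2f}")
```

With the actual questionnaire data in place of the simulated matrix, this reproduces the mean and SD columns of Table III construct by construct.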
5.3 Convergent analysis for DRP CSFs
This study applied the construct convergent approach, a rigorous way to create a factor structure matrix in which the correlations of all measurement items with every construct are listed (Compeau and Higgins, 1995). Construct analysis with varimax rotation was used to determine the DRP critical success constructs for ISF. Table IV shows the total variance explained by the convergent analysis; a total of 11 CSFs of DRP is proposed, which together explain a cumulative 76 percent of the total variance. The Kaiser-Meyer-Olkin measure of sampling adequacy is 0.868. Table V reveals the corresponding factor loadings. In this table, a construct loading of ≥ 0.5 was considered a significant contribution to a CSF for DRP. Table V also shows that ten proposed measurement items were removed from the data analysis because two of them (Q27 and Q56) were shared by two constructs, and the other eight (Q14, Q15, Q16, Q17, Q19, Q35, Q61, and Q62) all have factor loadings of less than 0.5. We also removed the 11th CSF in Table V, which has only one measurement item (Q12), because its reliability value, which is studied for all CSFs in the next section, cannot be computed.
5.4 Reliability and labeling of DRP CSFs
A reliability test refers to the examination of the internal consistency of a measurement instrument. The Cronbach's alpha (α) coefficient is the most popular method adopted in assessing reliability, and a high alpha value (close to 1) of the
Table III. Mean scores and standard deviations of proposed DRP constructs

DRP construct                               Question numbers   No. of items   Mean   SD
Top management commitment (F1)              Q1-Q4                    4        4.00   0.79
Policy and goals (F2)                       Q5-Q9                    5        3.92   0.79
Steering committee (F3)                     Q10-Q12                  3        3.17   0.95
Risk assessment and impact analysis (F4)    Q13-Q17                  5        3.67   0.84
Prioritization (F5)                         Q18-Q22                  5        4.02   0.77
Minimum processing requirement (F6)         Q23-Q26                  4        3.89   0.80
Alternative site (F7)                       Q27-Q29                  3        3.16   1.06
Backup storage (F8)                         Q30-Q33                  4        4.12   0.67
Recovery team (F9)                          Q34-Q35                  2        3.66   0.95
Testing (F10)                               Q36-Q40                  5        3.41   0.96
Training (F11)                              Q41-Q47                  7        3.41   0.79
Documentation (F12)                         Q48-Q55                  8        3.74   0.90
Maintenance (F13)                           Q56-Q60                  5        3.53   0.88
ISF personnel participation (F14)           Q61-Q62                  2        3.81   0.91

Note: these questions correspond to the measurement items shown in the Appendix
Table IV. Total variance explained
(The table lists, for each extracted component, the initial eigenvalues and the extraction and rotation sums of squared loadings, with the percentage and cumulative percentage of variance explained; the eleven retained components account for a cumulative 76 percent of the total variance. Note: extraction method: principal component analysis.)
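The extraction summarized in Table IV and the rotated loadings behind Table V can be reproduced in outline with NumPy. The sketch below runs on synthetic data standing in for the 129 x 62 response matrix: it extracts principal components of the item correlation matrix, retains those with eigenvalues above 1 (the Kaiser criterion), applies a standard varimax rotation, and flags loadings of at least 0.5 as significant, mirroring the procedure described in Section 5.3. It is an illustrative reconstruction, not the authors' SPSS run.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a p x k loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Standard varimax update: SVD of the criterion gradient.
        grad = loadings.T @ (rotated ** 3
                             - rotated @ np.diag((rotated ** 2).sum(0)) / p)
        u, s, vt = np.linalg.svd(grad)
        rotation = u @ vt
        if s.sum() < criterion * (1 + tol):
            break
        criterion = s.sum()
    return loadings @ rotation

# Synthetic stand-in for the standardized 129-respondent x 62-item matrix.
rng = np.random.default_rng(1)
data = rng.normal(size=(129, 62))

# Principal components of the item correlation matrix.
corr = np.corrcoef(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(corr)
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

keep = eigenvalues > 1.0                      # Kaiser criterion
loadings = eigenvectors[:, keep] * np.sqrt(eigenvalues[keep])
cumulative = eigenvalues[keep].sum() / eigenvalues.sum()  # cf. Table IV

rotated = varimax(loadings)                   # cf. Table V
significant = np.abs(rotated) >= 0.5          # the >= 0.5 cut-off of Section 5.3
```

Because varimax is an orthogonal rotation, each item's communality (row sum of squared loadings) is unchanged; only the distribution of loadings across factors is simplified, which is what makes the ≥ 0.5 cut-off meaningful for assigning items to factors.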
Table V. Rotated component matrix
(The table lists the rotated loadings of measurement items Q1-Q62 on the eleven extracted critical success factors; loadings of 0.5 or above mark a significant contribution. Notes: extraction method: principal component analysis; rotation method: varimax with Kaiser normalization.)
corresponding construct represents high reliability. Nunnally (1967) suggests that an alpha value of 0.7 or higher is normally considered an acceptable level. This study adopted the refinement process proposed by Norusis (1993), which retains those measurement items of a DRP construct that contribute to the maximum Cronbach's alpha value. Table VI reveals the maximum alpha values for the ten significant CSFs of Table V. It is shown that the α-values of all suggested DRP CSFs, which range from 0.812 to 0.944, exceed this minimum acceptable level. The last column of Table VI gives the label of each CSF that best describes the constitution of its measurement items. Each of these labels is discussed further in the next section.
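Cronbach's α and the item-retention refinement described above can be sketched as follows. The coefficient is α = k/(k-1) · (1 − Σ item variances / variance of the summed scale), and the greedy drop loop is one illustrative reading of the Norusis (1993) refinement, not the authors' exact SPSS procedure; the example data are simulated.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def refine(items):
    """Greedily drop items while doing so raises alpha, retaining the
    subset that yields the maximum value (illustrative reading of the
    Norusis-style refinement described above)."""
    kept = list(range(items.shape[1]))
    best = cronbach_alpha(items[:, kept])
    improved = True
    while improved and len(kept) > 2:
        improved = False
        for col in list(kept):
            trial = [c for c in kept if c != col]
            trial_alpha = cronbach_alpha(items[:, trial])
            if trial_alpha > best:
                best, kept, improved = trial_alpha, trial, True
                break
    return kept, best

# Example: five items driven by one latent factor, plus one unrelated item
# that drags alpha down and is a candidate for removal.
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))
scale = latent + 0.4 * rng.normal(size=(200, 5))
scale = np.hstack([scale, rng.normal(size=(200, 1))])
kept, alpha = refine(scale)
```

By construction the refined α is never below the α of the full item set, which matches the idea of retaining the items that contribute to the maximum Cronbach's alpha value.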
6. Discussion
Many managers value highly the process of DRP for ISF, but the CSFs and measurements around the process remain unclear. This paper reports an extensive survey conducted on the topic of DRP applications in ISF: the survey identified 62 measurement items that can serve as a basis to determine DRP CSFs for ISF. Based on the data collected from a questionnaire survey of DRP practices for ISF in Hong Kong, this paper identifies ten DRP CSFs, each of which is discussed further in this section.
The first DRP CSF, which consists of eight measurement items, is DRP documentations. This DRP CSF shows that an effective DRP requires everyone involved to know their DRP roles, and that the firm should adopt measures to document all relevant events related to the activation of the DRP when and if a disaster strikes, such as who should be in charge of the DRP, what actions should be taken (i.e. both recovery and emergency responses), how to activate the communication channel (including the person to contact and back-up operations), and the responsibility and role of each individual during a disaster.
The second DRP CSF, which also consists of eight measurement items, is the DRP steering committee and DRP testing. The DRP steering committee should involve representatives from all functional areas, so that different perspectives on DRP problems can be clearly identified. In addition, the DRP should not only be actively tested through simulation scenarios of all IS functions, but should also be examined for its validity in both processing and customer services, should a real disaster occur.
The third DRP CSF, which consists of five measurement items, is DRP policy and goals. This refers to top management involvement in identifying the DRP vision, scope, and goals of IS, with realistic applications for their firm. This involvement should lead to an appreciation of the necessary scope of work, labor, and costs for an effective DRP for ISF.
The fourth DRP CSF, which consists of six measurement items, is DRP training. This refers not only to training in the responsibilities and duties of all staff involved in DRP, but also to training in stress management, damage assessment, team collaboration, and DRP awareness.
The fifth DRP CSF, which consists of four measurement items, is DRP maintenance and staff involvement. The active maintenance of DRP implies that the DRP should be regularly and continually reviewed and evaluated, and that it should be updated with
Table VI. Reliability of the proposed ten CSFs

Critical success factor   No. of items retained   Label
1                                  8              DRP documentations
2                                  8              DRP steering committee and DRP testing
3                                  5              DRP policy and goals
4                                  6              DRP training
5                                  4              DRP maintenance and staff involvement
6                                  3              DRP minimum IS processing requirement
7                                  4              Top management commitment to DRP
8                                  4              Prioritization of IS functions/services
9                                  2              External, off-site back-up system
10                                 2              Internal, on-site back-up system

Notes: the maximum Cronbach's α values of the ten CSFs range from 0.812 to 0.944; the measurement items correspond to those shown in Table V
any relevant changes immediately. All concerned personnel must be actively involved in this maintenance process, so that everyone is aware of the changes.
The sixth DRP CSF, which consists of three measurement items, is the DRP minimum IS processing requirement. The four areas that dictate the survivability of firms without IS support may include the maximum allowable downtime for firms, the maximum allowable downtime for customers, the minimum functional services to all concerned parties, and the maximum time by which all IS services are restored. These maximum downtime requirements are particularly crucial for firms highly dependent on electronic commerce, such as e-trading and e-banking.
The seventh DRP CSF, which consists of four measurement items, is top management commitment to DRP. This commitment must first be shown in the DRP budget allocation, and then in exhibiting leadership through involvement and participation in DRP events, such as personally attending DRP meetings. These activities would promote staff enthusiasm toward DRP.
The eighth DRP CSF, which consists of four measurement items, is prioritization of IS functions/services. It is suggested that firms should establish a rule to prioritize which functions, activities, and system requirements are the most critical, and then decide which of these should be recovered first, and in what order the rest should follow, should a disaster strike. Without the determination of these priorities, firms could easily fall into a chaotic situation because DRP personnel would not be able to decide which priority jobs to recover first.
The ninth DRP CSF, which consists of two measurement items, is the external, off-site back-up processing site for DRP. This CSF suggests that an external back-up processing site should be established, so that IS operations/services would still be functional if a disaster were to strike a local office.
Many industries, such as banking, have adopted this measure to ensure that their IS functions would not be totally halted if such a disaster were to happen.
The tenth DRP CSF, which consists of two measurement items, is the internal, on-site back-up processing site for DRP. This final DRP CSF suggests that all firms should establish an internal back-up processing site so that IS operations/services could be immediately restored in the event of a disaster. The recovery of IS operations/services through this practice is claimed to be more efficient than recovery through the ninth CSF (the off-site back-up), especially when the disaster is a small-scale one, such as a small fire in a computer room or a virus attack on local servers.
7. Conclusion
DRP is considered equally important at both the organizational and developmental levels. Most companies nowadays have developed their DRP, but there is a lack of quantitative reports on the CSFs for DRP implementation. This paper offers a solution to this problem: first, by providing a comprehensive DRP literature review; second, by identifying 62 DRP measurement items; and third, by verifying ten DRP CSFs for the information systems function. These ten DRP CSFs are: DRP documentations; the DRP steering committee and DRP testing; DRP policy and goals; DRP training; DRP maintenance and staff involvement; DRP minimum IS processing requirements; top management commitment to DRP; prioritization of IS functions/services; provision of external, off-site back-up systems; and provision of an internal, on-site back-up system.
The findings of this paper can serve as a basis for many future DRP research studies. One suggestion is to apply these CSFs to serve as a foundation for developing a corporate DRP strategy, similar to the organizational system model proposed by Ravichandran and Rai (2000). Other suggestions include studying how DRP CSFs contribute to the success of DRP performance within a firm, or how their applicability to large enterprises compares with their applicability to small- and medium-sized enterprises.
References
Adam, M. (1999), “Preventing the chain react”, HR Magazine, January, pp. 28-32.
Arnell, A. (1990), Handbook of Effective Disaster/Recovery Planning, McGraw-Hill, New York, NY.
Babbie, E. (1992), The Practice of Social Research, 6th ed., Wadsworth, Belmont, CA.
Baker, G. (1995), “Quick recoveries”, CA Magazine, August, pp. 49-53.
Blake, W.F. (1992), “Making recovery a priority”, Security Management, Vol. 36 No. 4, pp. 71-4.
Blatnik, G.J. (1998), “Point of failure recovery plan”, IS Audit & Control Journal, Vol. 4 No. 1, pp. 24-7.
Bodnar, G.H. (1993), “Data security and contingency planning”, Internal Auditing, Winter, pp. 74-80.
Botha, J. and von Solms, R. (2004), “Cyclic approach to business continuity planning”, Information Management & Computer Security, Vol. 12 No. 4, pp. 328-37.
Butler, J.G. (1997), Contingency Planning and Disaster Recovery: Protecting your Organization’s Resources, Computer Technology Research Corp., Montgomery, AL.
Carlson, S.J. and Parker, D.J. (1998), “Disaster recovery planning and accounting information systems”, Review of Business, Winter, pp. 10-15.
Cerullo, M.J. and Cerullo, V. (1998), “Key factors to strengthen the disaster contingency and recovery planning process”, Information Strategy: The Executive’s Journal, Winter, pp. 37-43.
Cerullo, M.J., McDuffie, R.S. and Smith, L.M. (1994), “Planning for disaster”, The CPA Journal, June, pp. 34-8.
Chow, W.S. (2000), “Success factors for IS disaster recovery planning in Hong Kong”, Information Management & Computer Security, Vol. 8 No. 2, pp. 80-6.
Coleman, R. (1993), “Six steps to disaster recovery”, Security Management, February, pp. 61-2.
Compeau, D. and Higgins, C. (1995), “Computer self-efficacy: development of a measure and initial test”, MIS Quarterly, Vol. 19 No. 2, pp. 189-211.
Coult, G. (1999), “Disaster recovery”, Managing Information, Vol. 6 No. 3, pp. 31-5.
Devargas, M. (1999), “Survival is not compulsory: an introduction to business continuity planning”, Computers & Security, Vol. 18 No. 1, pp. 35-46.
Donovan, T., Rosson, B. and Eichstadt, B. (1999), “Preparing carriers for . . .”, Telephony, June, pp. 180-8.
Doughty, K. (1991), “Performing a business impact analysis”, EDPACS, Vol. 18 No. 9, pp. 1-7.
Doughty, K. (1993), “Auditing the disaster recovery plan”, EDPACS, Vol. 21 No. 3, pp. 1-12.
Douglas, W.J. (1998), “A systematic approach to continuous operations”, Disaster Recovery Journal, Summer, pp. 1-3.
Dwyer, P.D., Friedberg, A.H. and McKenzie, K.S. (1994), “It can happen here: the importance of continuity planning”, IS Audit & Control Journal, Vol. 1, pp. 30-5.
Ebling, R.G. (1996), “Establishing safe harbor: how to develop a successful disaster recovery program”, Risk Management, September, pp. 53-6.
Eckerson, W. (1992), “Andrew reminds users of need for disaster planning”, Network World, Vol. 2, p. 56, September.
Edwards, B. and Cooper, J. (1994), “Testing the disaster recovery plan”, Information Management & Computer Security, Vol. 3 No. 1, pp. 21-7.
Elstien, C. (1999), “Reliance on technology”, Enterprise Systems Journal, July, pp. 38-40.
Ferraro, A. and Hayes, S. (1998), “Auditors add value to the business continuity program”, IS Audit & Control Journal, Vol. 5, pp. 47-50.
Fong, A. (1991), The Disaster Recovery Plan, Business Research Centre, School of Business, Hong Kong Baptist University, Kowloon Tong, Hong Kong.
Francis, B. (1993), “Recover from distributed disasters”, Datamation, December, pp. 73-6.
Frost, C. (1994), “Effective responses for proactive enterprises: business continuity planning”, Disaster Prevention and Management, Vol. 3 No. 1, pp. 7-15.
Gallegos, F. and Wright, D.M. (1988), “Evaluating data security: the initial review”, Data Security, Spring, pp. 8-15.
Gilbert, L. (1995), “The function of a business continuity plan”, EDPACS, March, pp. 12-15.
Ginn, R.D. (1989), “The case for continuity”, Security Management, January, pp. 84-90.
Gluckman, D. (2000), “Continuity . . . recovery”, Risk Management, March, p. 45.
Haubner, D. (1994), “Data processing disaster recovery planning considerations for tandem network”, EDPACS, Vol. 19 No. 11, pp. 1-7.
Hawkins, S.M., Yen, D.C. and Chou, D.C. (2000), “Disaster recovery planning: a strategy for data security”, Information Management & Computer Security, Vol. 8 No. 5, pp. 222-9.
Heng, G.M. (1996), “Developing a suitable business continuity planning methodology”, Information Management & Computer Security, Vol. 4 No. 2, pp. 11-13.
Hiatt, C. and Motz, A. (1990), “Disaster recovery planning: what it should be, what it is, how to improve it”, EDPACS, Vol. 17 No. 9, pp. 1-9.
Hurwicz, M. (2000), “When disaster strikes”, Network Magazine, January, pp. 44-50.
Ivancevich, D.M., Hermanson, D.R. and Smith, L.M. (1997), “Accounting information system controls and disaster recovery plans”, Internal Auditing, Spring, pp. 13-20.
Iyer, R. and Sarkis, J. (1998), “Disaster recovery planning in an automated manufacturing environment”, IEEE Transactions on Engineering Management, Vol. 45 No. 2, pp. 163-75.
Jacobs, J. and Weiner, S. (1997), “The CPA’s role in disaster”, The CPA Journal, November, pp. 56-8.
Karakasidis, K. (1997), “A project planning process for business continuity”, Information Management & Computer Security, Vol. 5 No. 2, pp. 72-8.
Korzeniowski, P. (1990), “How to avoid disaster with a recovery plan”, Software Magazine, February, pp. 46-55.
Kovacich, G.L. (1996), “Establishing: a network security programme”, Computers & Security, Vol. 15 No. 6, pp. 486-98.
Krivda, C.D. (1998), “Planning for the worst”, Midrange Systems, September, pp. 10-14.
Krousliss, B. (1993), “Disaster recovery planning”, Catalog Age, Vol. 10 No. 12, p. 98.
Kull, D. (1982), “Disaster recovery: just in case”, Computer Decisions, September, pp. 180-209.
Leary, M.K. (1998), “A rescue plan for your LAN”, Security Management, March, pp. 53-60.
Lee, S. and Ross, S. (1995), “Disaster recovery planning for information systems”, Information Resources Management Journal, Summer, pp. 18-23.
McNurlin, B.C. (1988), “Trends in disaster”, I/S Analyzer, Vol. 26 No. 11, pp. 1-12.
Marcella, A. Jr and Rauff, J. (1994), “Automated disaster recovery plan auditing prospects for using expert systems to evaluate disaster recovery plans”, EDPACS, Vol. 21 No. 9, pp. 1-16.
Matthews, G. (1994), “Disaster management: controlling the plan”, Managing Information, Vol. 1 Nos 7/8, pp. 24-7.
Meade, P. (1993), “Taking the risk out of disaster recovery services”, Risk Management, February, pp. 20-6.
Menkus, B. (2000), “Defining security threats to information processing application”, EDPACS, February, pp. 9-15.
Miller, H.J. (1997), “A guide to planning for the business recovery of an administrative business unit”, EDPACS, April, pp. 9-16.
Mitome, Y., Speer, K.D. and Swift, B. (2001), “Embracing disaster with contingency planning”, Risk Management, May, pp. 18-27.
Moch, C. (1999), “Taking cover: Bell Atlantic presents businesses with security options”, Telephony, August, p. 62.
Morwood, G. (1998), “Business continuity: awareness and training programmes”, Information Management & Computer Security, Vol. 6 No. 1, pp. 28-32.
Murphy, J.H. (1991), “Taking the disaster out of recovery”, Security Management, August, pp. 61-6.
Myatt, P.B. (1999), “Going in for analysis”, Security Management, April, pp. 75-9.
Myers, K.N. (1999), Manager’s Guide to Contingency Planning for Disasters: Protecting Vital Facilities and Critical Operations, 2nd ed., Wiley, New York, NY.
Nelson, K. (2006), “Examining factors associated with IT preparedness”, Proceedings of the 39th Hawaii International Conference on System Science, pp. 1-10.
Newton, J. and Cudlipp, G. (1998), “From chaos to control”, Canadian Insurance, November, pp. 20-22, 26.
Norman, G. (1993), “Disaster recovery after downsizing”, Computers & Security, Vol. 12 No. 3, pp. 225-9.
Norusis, M.J. (1993), SPSS for Windows Professional Statistics Release 6.0, SPSS, Chicago, IL.
Nunnally, J.C. (1967), Psychometric Theory, McGraw-Hill, New York, NY.
Paton, D. and Flin, R. (1999), “Disaster stress: an emergency management perspective”, Disaster Prevention and Management, Vol. 8 No. 4, pp. 261-7.
Peach, S. (1991), “Disaster recovery: an unnecessary cost burden or an essential feature of any DP installation”, Computers & Security, Vol. 10 No. 6, pp. 565-8.
Pember, M.E. (1996), “Information disaster planning: an integral component of corporate risk management”, Records Management Quarterly, April, pp. 31-7.
Peterson, D.M. and Perry, R.W. (1999), “The impacts of disaster exercises on participants”, Disaster Prevention and Management, Vol. 8 No. 4, pp. 241-54.
Petroni, A. (1999), “Managing information systems’ contingencies in banks: a case study”, Disaster Prevention and Management, Vol. 8 No. 2, pp. 101-10.
Ravichandran, T. and Rai, A. (2000), “Quality management in systems development: an organizational system perspective”, MIS Quarterly, Vol. 24 No. 3, pp. 381-415.
Rohde, R. and Haskett, J. (1990), “Disaster recovery planning for academic computing centers”, Communications of the ACM, Vol. 33 No. 6, pp. 652-7.
Rosenthal, P. and Himel, B. (1991), "Business resumption planning: exercising your emergency response teams", Computers & Security, Vol. 10 No. 6, pp. 497-514.
Rothstein, P.J. (1988), "Up and running", Datamation, October, pp. 86-96.
Rothstein, P.J. (1998), "Disaster recovery: in the line of fire", Managing Office Technology, May, pp. 26-30.
Rutherford, K. and Myer, G. (2000), "Business continuity: do you have a plan?", Canadian Underwriter, April, pp. 38-41.
Salzman, T. (1998), "An audit work program for reviewing IS disaster recovery plans (conclusion)", EDPACS, Vol. 25 No. 7, pp. 8-20.
Saraph, J.V., Benson, P.G. and Schroeder, R.G. (1989), "An instrument for measuring the critical factors of quality management", Decision Sciences, Vol. 20 No. 4, pp. 810-29.
Smith, M. and Sherwood, J. (1995), "Business continuity planning", Computers & Security, Vol. 14 No. 1, pp. 14-23.
Solomon, C.M. (1994), "Bracing for emergencies", Personnel Journal, April, pp. 74-83.
Tilley, K. (1995), "Work area recovery planning: the key to corporate survival", Disaster Prevention and Management, Vol. 13 Nos 9/10, pp. 49-53.
Turner, D. (1994), "Resources for disaster recovery", Security Management, August, pp. 61-7.
Vartabedian, M. (1999), "For want of a nail", Call Center Solutions, February, pp. 40-8.
Warigon, S. (1999), "Preparing and auditing an IS security incident-handling plan", EDPACS, Vol. 26 No. 10, pp. 1-13.
Wong, B.K., Monaco, J.A. and Sellaro, C.L. (1994), "Disaster recovery planning: suggestions to top management and information systems managers", Journal of Systems Management, May, pp. 28-32.
Wrobel, L.A. (1997), The Definitive Guide to Business Resumption Planning, Artech House, Norwood, MA.
Wroblewski, M.E. (1982), "Contingency planning for DP", Data Management, May, pp. 25-7.
Yiu, K. and Tse, Y.Y. (1995), "A model for disaster recovery planning", IS Audit & Control Journal, Vol. 5, pp. 45-51.
Zolkos, R. (2000), "To rebound from disaster requires advance plans", Business Insurance, Vol. 34 No. 9, pp. 2-4.
Further reading
Kerlinger, F.N. (1986), Foundations of Behavioral Research, 3rd ed., Holt, Rinehart & Winston, New York, NY.
Zajac, B.P. Jr (1989), "Disaster recovery – are you really ready?", Computers & Security, Vol. 8, pp. 297-8.
Appendix

Table AI. Measurement items for DRP constructs (sample references in parentheses)

Construct "Top management commitment"
Q1. Top management provides adequate financial support for DRP (Pember, 1996; Ferraro and Hayes, 1998)
Q2. Top management commits to DRP (Ginn, 1989; Rohde and Haskett, 1990)
Q3. Top management supports DRP (Bodnar, 1993; Coleman, 1993)
Q4. Top management assumes ultimate responsibility for DRP (Carlson and Parker, 1998; Kull, 1982; Warigon, 1999)

Construct "Policy and goals"
Q5. Top management defines the scope of DRP (Iyer and Sarkis, 1998)
Q6. Top management defines the goals and objectives of DRP (Meade, 1993; Kovacich, 1996)
Q7. Top management defines realistic goals and objectives of DRP (Hiatt and Motz, 1990)
Q8. Top management establishes the policy of DRP (Turner, 1994; Dwyer et al., 1994; Petroni, 1999)
Q9. Top management has a clear vision of DRP (Zolkos, 2000)

Construct "Steering committee"
Q10. A formal steering committee has been formed for the disaster recovery plan (Karakasidis, 1997; Jacobs and Weiner, 1997)
Q11. Representatives from different departments participate in the steering committee (Cerullo and Cerullo, 1998; Wroblewski, 1982)
Q12. External consultants participate in the steering committee (Leary, 1998; Pember, 1996)

Construct "Risk assessment and impact analysis"
Q13. The financial impact of losing business functions has been analyzed (Krivda, 1998; Wong et al., 1994)
Q14. The degree of security control of business functions has been analyzed (Newton and Cudlipp, 1998; Gilbert, 1995; Bodnar, 1993)
Q15. The likelihood of potential adverse events to business functions has been analyzed (Rosenthal and Himel, 1991; Hiatt and Motz, 1990; Haubner, 1994)
Q16. The security weaknesses of business functions have been analyzed (Moch, 1999; Menkus, 2000)
Q17. The security risks of business functions have been ranked (Cerullo and Cerullo, 1998; Coult, 1999)

Construct "Prioritization"
Q18. Prioritization has been established for critical functions (Gilbert, 1995; Peach, 1991)
Q19. Prioritization has been established for critical applications (Salzman, 1998; Marcella and Rauff, 1994)
Q20. Prioritization has been established for recovery activities (Blake, 1992; Smith and Sherwood, 1995)
Q21. Prioritization has been established for requirements (Karakasidis, 1997; Smith and Sherwood, 1995)
Q22. Prioritization has been established for recovery schedules (Cerullo and Cerullo, 1998; Cerullo et al., 1994)

Construct "Minimum processing requirement"
Q23. A minimum processing requirement for business functions has been determined (Heng, 1996)
Q24. The maximum allowable downtime of business functions has been determined (Hurwicz, 2000; Wong et al., 1994)
Q25. An acceptable recovery time for business functions has been determined (Rothstein, 1988; Myatt, 1999; Frost, 1994)
Q26. The point in time to which business functions' data must be restored has been determined (Tilley, 1995; Elstien, 1999)

Construct "Alternative site"
Q27. A clear vision of the alternative processing site has been established (Blake, 1992)
Q28. An in-house alternative processing site has been established (Leary, 1998)
Q29. An external alternative processing site has been established (Hawkins et al., 2000; Rothstein, 1988)

Construct "Backup storage"
Q30. Off-site backup storage has been established for recovering business functions (Edwards and Cooper, 1994; Haubner, 1994; Marcella and Rauff, 1994)
Q31. On-site backup storage has been established for recovering business functions (Krousliss, 1993; Miller, 1997)
Q32. A regular backup schedule has been established for business functions (Haubner, 1994)
Q33. Proper insurance coverage has been subscribed for business functions (Eckerson, 1992; Lee and Ross, 1995)

Construct "Recovery team"
Q34. Relevant personnel participate in the recovery team (Fong, 1991; Miller, 1997; McNurlin, 1988)
Q35. Relevant personnel are involved in the recovery team (Edwards and Cooper, 1994; McNurlin, 1988)

Construct "Testing"
Q36. The disaster recovery plan is tested as though a real disaster occurred (Wong et al., 1994; Edwards and Cooper, 1994; Leary, 1998)
Q37. The disaster recovery plan is tested by duplication of regular processing (Wong et al., 1994)
Q38. The disaster recovery plan is tested by testing individual recovery procedures (Wong et al., 1994; Edwards and Cooper, 1994)
Q39. The disaster recovery plan is tested by testing recovery functions (Wong et al., 1994; Edwards and Cooper, 1994)
Q40. The disaster recovery plan is tested by simulating actual disaster conditions through interrupting service (Edwards and Cooper, 1994; Leary, 1998)

Construct "Training"
Q41. Training in disaster response is given to recovery personnel (Peterson and Perry, 1999; Salzman, 1998)
Q42. Training in stress management is given to recovery personnel (Paton and Flin, 1999; Turner, 1994)
Q43. Training in team skills is given to recovery personnel (Solomon, 1994; Paton and Flin, 1999)
Q44. Training in damage assessment is given to recovery personnel (Turner, 1994)
Q45. Training in notification procedures when a disaster strikes is given to recovery personnel (Turner, 1994)
Q46. Training in recovery responsibilities and duties is given to recovery personnel (Smith and Sherwood, 1995; Morwood, 1998)
Q47. Training in relocation of the department when a disaster strikes is given to recovery personnel (Morwood, 1998)

Construct "Documentation"
Q48. Recovery procedures have been documented (Francis, 1993; Doughty, 1993)
Q49. Emergency response procedures have been documented (Salzman, 1998)
Q50. Individual disaster recovery team roles and responsibilities have been documented (Salzman, 1998; Doughty, 1993)
Q51. Recovery team members and contact numbers have been documented (Miller, 1997)
Q52. Backup operations have been documented (Miller, 1997; Salzman, 1998)
Q53. Personnel to contact when disaster strikes have been documented (Coult, 1999)
Q54. Disaster notification procedures have been documented (Salzman, 1998)
Q55. A suppliers/vendors list has been documented (Pember, 1996; Miller, 1997)

Construct "Maintenance"
Q56. The disaster recovery plan is tested periodically (Lee and Ross, 1995; Karakasidis, 1997)
Q57. The disaster recovery plan is evaluated continually (Miller, 1997; Mitome et al., 2001)
Q58. The disaster recovery plan is reviewed regularly (Ebling, 1996; Matthews, 1994)
Q59. The disaster recovery plan is updated with changes (Norman, 1993; Korzeniowski, 1990)
Q60. The disaster recovery plan is audited (Bodnar, 1993; Francis, 1993)

Construct "ISF personnel participation"
Q61. ISF personnel participate in recovery duties and responsibilities (Blatnik, 1998; Smith and Sherwood, 1995)
Q62. ISF personnel participate in reviewing the disaster recovery plan (Blatnik, 1998; Baker, 1995)
Corresponding author
Wing S. Chow can be contacted at: [email protected]