EDUCATION SERVICES
Commvault Certified Professional Exam Prep Course April 2016
Legal Notices

Information in this document, including URL and other website references, represents the current view of Commvault Systems, Inc. as of the date of publication and is subject to change without notice to you. Descriptions or references to third party products, services or websites are provided only as a convenience to you and should not be considered an endorsement by Commvault. Commvault makes no representations or warranties, express or implied, as to any third party products, services or websites. The names of actual companies and products mentioned herein may be the trademarks of their respective owners. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious. Complying with all applicable copyright laws is the responsibility of the user.

This document is intended for distribution to and use only by Commvault customers. Use or distribution of this document by any other persons is prohibited without the express written permission of Commvault. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Commvault Systems, Inc. Commvault may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Commvault, this document does not give you any license to Commvault's intellectual property. COMMVAULT MAKES NO WARRANTIES OF ANY KIND, EXPRESS OR IMPLIED, AS TO THE INFORMATION CONTAINED IN THIS DOCUMENT.

©1999-2016 Commvault Systems, Inc. All rights reserved. Commvault, Commvault and logo, the "C" logo, Commvault Systems, Solving Forward, SIM, Singular Information Management, Simpana, Commvault Galaxy, Unified Data Management, QiNetix, Quick Recovery, QR, CommNet, GridStor, Vault Tracker, InnerVault, QuickSnap, QSnap, Recovery Director, CommServe, CommCell, IntelliSnap, ROMS, Simpana OnePass, CommVault Edge and CommValue are trademarks or registered trademarks of Commvault Systems, Inc. All other third party brands, products, service names, trademarks, or registered service marks are the property of and used to identify the products or services of their respective owners. All specifications are subject to change without notice.

All right, title and intellectual property rights in and to the Manual is owned by Commvault. No rights are granted to you other than a license to use the Manual for your personal use and information. You may not make a copy or derivative work of this Manual. You may not sell, resell, sublicense, rent, loan or lease the Manual to another party, transfer or assign your rights to use the Manual or otherwise exploit or use the Manual for any purpose other than for your personal use and reference. The Manual is provided "AS IS" without a warranty of any kind and the information provided herein is subject to change without notice.
Table of Contents

Legal Notices
Table of Contents
Introduction
Commvault® Certified Professional V11 Exam Prep Course
Education Advantage
Class Resources
CVLab On Demand Lab Environment
Commvault® Education Career Path
Education Services V11 Course Roadmap
Education Services V11 Certification Roadmap
Automated Proactive Solutions System (APSS)
Welcome to Commvault
Course Overview
Module 1: CommCell® Administration Review
CommCell® Administration
CommCell® Installation Process
CommCell® Upgrade Process
Standard Agent Installation
CommServe® Server
CommServe® DR Backup
CommCell® Console
Job Controller
Event Viewer
MediaAgents and Indexing
V2 Indexing
Index Process for Data Protection Jobs
Index Checkpoint and Backup Process
Alerts and Reports
Alerts
Reports
Security
Role Based Security
User Quotas
Firewall Configuration
Firewall Topologies
Module 2: Storage Configuration and Media Management
Disk Libraries
SAN Data Server
Tape Libraries
Tape Management
Destructive and non-Destructive Media Handling
Deduplication
Deduplication Data Movement Process
Transactional Deduplication Database
Content Aware Deduplication
Data Verification
Partition Deduplication Design
Storage Policies
Storage Policy Structure
Global Secondary Copy
Storage Policy Settings
Storage Policy Copy Settings
Storage Policy Administration
Retention
Job Based Retention
Data Aging for Non-Deduplicated Data
Additional Retention Settings
Object Based (Subclient) Retention
Module 3: Clients and Agents
Clients
Client Administration
Client Computer Groups
File System Agent and Subclient Configuration
Filtering
Subclients
Virtual Server Agent (VSA)
Virtual Server Agent Backup Process
Virtual Server Agent Roles
Transport Modes
Module 4: Data Protection and Recovery
Data Protection Jobs
Using Schedules and Schedule Policies
Job Management
Controlling Job Activity
Recovery Operations
Using Find
Using Browse
Performance and Troubleshooting
Stream Management Primer
Commvault Performance Settings
Thank You
Introduction
Commvault® Certified Professional V11 Exam Prep Course
Education Advantage
The Commvault® Education Advantage product training portal contains a set of powerful tools to enable Commvault customers and partners to better educate themselves on the use of the Commvault software suite. The portal includes:
Training Self-Assessment Tools
Curriculum Guidance based on your Role in your Commvault Enterprise
Management of your Commvault Certifications
Access to Practice Exams and Certification Preparation Tools
And more!
Class Resources
Course manuals and activity guides are available for download for Instructor-Led Training (ILT) and Virtual Instructor-Led Training (vILT) courses. It is recommended to download these documents the day prior to attending class to ensure the latest document versions are being used. Self-paced eLearning courses can be launched directly from the EA page. If an eLearning course is part of an ILT or vILT course, it is a required prerequisite and should be viewed prior to attending class. If an ILT or vILT class will be using the Commvault® Virtual Lab environment, a button will be used to launch the lab on the first day of class. Commvault® certification exams can be launched directly from the EA page. If you are automatically registered for an exam as part of an ILT or vILT course, it will be available on the final day of class. There is no time limit on when the exams need to be taken, but it is recommended to take them as soon as you feel you are ready.
CVLab On Demand Lab Environment
The Commvault Virtual Lab (CVLab) environment is now available to our global customers. The CVLab allows you access to a vital learning tool that provides a flexible method for gaining hands-on experience with the Commvault® software platform. You will have anywhere/anytime access to a powerful lab environment to practice installations, test configurations, review current version capabilities or review any lab exercises. The CVLab shares a common console with our Education Advantage (EA) portal and is accessible 24-hours a day up to the amount of connect time purchased. The CVLab time can be purchased as standalone on-demand CVLab time, or to extend lab time for training courses attended. Extending CVLab time must be purchased within 48-hours after class end time in order to maintain your lab progress from the training course. Whether purchasing on-demand or extending, CVLab connect time may be purchased in four-hour blocks in any quantity. Access will be available for 90 days from point of purchase and is priced at just one Training Unit per four-hour block.
Commvault® Education Career Path
The Commvault next generation platform leapfrogs legacy solutions in capabilities and functionality, fully modernizing the performance, security, compliance, and economic benefits of a holistic data management strategy. The key concepts covered in this first step learning module highlight the core features of Commvault's new platform. To realize the full value of these features, Commvault provides multiple levels of education and certification, from core training through specialized learning sessions, and from introductory modules for those new to Commvault to master level training for Commvault power-users.
Education Services V11 Course Roadmap
Education Services V11 Certification Roadmap
Automated Proactive Solutions System (APSS)
The Commvault® Automated Proactive Solution System (APSS) is a revolutionary tool providing real-time monitoring and automated healing capabilities. The system uses workflows and custom scripts to collect and analyze data and proactively provide solutions to issues in a CommCell® environment. This active system will notify Commvault administrators of health issues and allow them to enable APSS automatic healing. Commvault administrators can view issues and APSS solutions using the Proactive Support icon in Maintenance Advantage. When issues are detected by the APSS system, a pop-up notification is displayed in MA allowing the user to dismiss the notification or display solutions. Automated solutions can be applied by downloading and deploying the corresponding APSS solution workflow. The APSS dashboard provides a comprehensive overview of the CommCell environment. CommServe database health, SLA charts, job summary, system alerts, and open incidents can be viewed. Additional custom widgets can be added to the dashboard. To start using APSS, log on to Maintenance Advantage (MA), complete your user profile, then download and install the APSS application.
Welcome to Commvault
The WELCOME TO COMMVAULT interactive introduction provides web-based information to help you get the most out of your total Commvault experience. This program is intended for those new to Commvault and enables you to better use our support processes and knowledge materials, and to learn the basics of Commvault® software. This 20 minute program is your map to understanding Commvault's support, education, and online resources and immediately improves our success in working together.
http://download.commvault.com/unsecure/welcome-to-commvault/
Course Overview
Module 1: CommCell® Administration Review
CommCell® Administration
CommCell® Installation Process
The first component to be installed in a new CommCell® environment is the CommServe® server. Once it is installed, the next step is to install MediaAgent software and detect and configure libraries. Policy configuration for storage policies, schedule policies, subclient policies and global filters should be done prior to installing any client agents. When installing client agents, options to associate the default subclient for the agent with the policies can be selected, so preconfiguring policies makes the agent deployment process smoother.
CommCell® Upgrade Process
Prior to upgrading a CommCell® environment it is critical to perform a CommServe DR backup. In the event of problems during the upgrade process, the environment can be rolled back to ensure CommCell operations can continue. The first component to be upgraded must be the CommServe server. The upgrade process can be an in-place upgrade or a fresh installation of the CommServe server. It is recommended that you have the CommServe database inspected by Commvault prior to upgrading. This can be done by uploading the database dump to cloud.commvault.com. Check the Commvault Online Documentation for complete instructions for CommServe database inspection. MediaAgents should be upgraded next and libraries should be tested to ensure everything is functioning properly. Clients can then be upgraded on an as-needed basis. Note that with Commvault® software, client agents up to two versions back can coexist with a CommServe server and MediaAgents at the latest version.
Standard Agent Installation
Agent Installation Methods
Commvault client agents can be deployed using several methods:
Push installation can be used to push Commvault agents to one or more remote machines.
Standard installation can be used to install software on a local machine.
Custom package installation is used to create smaller, distributable installation packages to be deployed on remote machines.
Decoupled installation can be used to install Commvault agents on machines without connectivity to the CommServe server.
Restore only agents can be installed on machines without allocating licenses.
For each agent installation method, specific common settings can be selected based on preconfigured settings in the CommCell environment:
Client computer group
Subclient policy
Storage policy
Global filters
CommServe® Server
The CommServe® server is the central management system within a CommCell environment. All activity is coordinated and managed by the CommServe server. The CommServe system runs on a Windows platform and maintains a Microsoft SQL metadata database. This database contains all configuration information. It is important to note that Commvault software does not use a centralized catalog system like most other backup products. This means the metadata database on the CommServe server will be considerably smaller than databases that contain catalog data. Due to the small size of the database, an automated backup of the database is executed by default every morning at 10:00 AM.
Key points regarding the CommServe server:
For CommServe server high availability, the following options are available:
The CommServe server can be clustered, which is recommended for larger environments where high availability is critical.
The CommServe server can be virtualized, which is suitable for smaller environments.
It is ABSOLUTELY CRITICAL that the CommServe database is properly protected. By default, every day at 10 AM a CommServe DR backup job is conducted. This operation can be completely customized and set to run multiple times a day if required. All activity is conducted through the CommServe server; therefore, it is important that communication between the CommServe server and all CommCell environment resources always be available.
CommServe® DR Backup
By default, every day at 10:00 AM the CommServe DR backup process is executed. This process first dumps the CommServe SQL database to the following folder:
For upgrades: \CommVault\Simpana\CommServeDR
For new installations: \CommVault\Content Store\CommServeDR
An Export process will then copy the folder contents to a user-defined drive letter or UNC path. A Backup phase will then back up the DR metadata and any user-defined log files to a location based on the storage policy associated with the backup phase of the DR process. All processes, schedules and export/backup locations are customizable in the DR Backup Settings applet in the Control Panel.
Export
The Export process will copy the contents of the \CommServeDR folder to the user-defined export location. A drive letter or UNC path can be defined. The export location should NOT be on the local CommServe server. If a standby CommServe server is available, define the export location as a share on the standby server. By default, five metadata backups will be retained in the export location. It is recommended to have enough disk space to maintain one week's worth of DR exports.
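The export retention behavior can be pictured as a simple rotation: each run copies the newest dump to the export location and prunes copies beyond the configured count. The following is a minimal illustrative sketch, not Commvault's actual implementation; the folder paths in the usage example are hypothetical and the keep count of five matches the default described above.

import shutil
from datetime import datetime
from pathlib import Path

def export_dr_dump(dump_dir, export_dir, keep=5):
    # Copy the latest DR dump into a timestamped folder at the export location.
    export_dir = Path(export_dir)
    export_dir.mkdir(parents=True, exist_ok=True)
    target = export_dir / datetime.now().strftime("DR_%Y-%m-%d_%H-%M-%S")
    shutil.copytree(dump_dir, target)
    # Prune the oldest exports beyond the retention count (five by default).
    exports = sorted(p for p in export_dir.iterdir() if p.is_dir())
    for old in exports[:-keep]:
        shutil.rmtree(old)

# Hypothetical usage, exporting to a share on a standby CommServe server:
# export_dr_dump(r"C:\CommVault\Content Store\CommServeDR", r"\\standby-cs\DRExports")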
Backup
The Backup process is used to back up the DR metadata to protected storage. This is accomplished by associating the backup phase with a storage policy. A default DR storage policy is automatically created when the first library is configured in the CommCell environment. Although the backup phase can be associated with a regular storage policy, it is recommended to use a dedicated DR storage policy to protect the DR metadata.
The correct answer is D.
The CommCell® console requires IIS to run as a remote web-based application.
http://documentation.commvault.com/commvault/v11/article?p=features/commcell_console/alternate_iis_server.htm
The only answer that is not true is D.
Disaster Recovery settings are configured in the Control Panel. This is where the number of metadata exports (default of five) is configured. The default schedule is daily at 10:00 AM.
http://documentation.commvault.com/commvault/v11/article?p=features/disaster_recovery/c_cs_dr_backup.htm
Exam Tip: Read test questions carefully and look for key words such as NOT, Only, and Minimum within the question.
The correct answer is F.
An IP address can always be used in place of a fully qualified domain name. This will avoid issues with DNS name resolution. Optionally, a hosts file can be placed on each server. For host name resolution, the hosts file will be checked prior to contacting a DNS server to resolve the name.
http://documentation.commvault.com/commvault/v11/article?p=features/network/nw_reqts.htm
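For example, a hosts file entry on each server might look like the following (the host names and IP addresses are hypothetical; on Windows the file is %SystemRoot%\System32\drivers\etc\hosts, on Linux it is /etc/hosts):

192.168.10.10    commserve01.company.local    commserve01
192.168.10.20    mediaagent01.company.local   mediaagent01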
CommCell® Console
Accessing the CommCell® Console
The CommCell® console Graphical User Interface (GUI) can be accessed via any of the following four methods/locations:
Local
The CommCell® console is, by default, installed along with the CommServe® software, enabling direct GUI access from the CommServe host.
Web-Based
The CommCell Console can be accessed remotely via any Java-capable web browser. For Java web access you must have installed a compatible version of the Java™ Runtime Environment (JRE). See the Commvault Online Documentation and Support Knowledge Base for the latest compatible Java version information. Web-based access also requires that the CommServe host (or an alternate access host in the same domain) have IIS and the CommCell Console installed.
Alternate IIS Host
For better security, an alternate IIS host can be used to enable web-based CommCell Console access. As with local web-based access, the alternate IIS host requires installation of IIS and the CommCell Console.
Remote Host
The CommCell Console can be installed as a stand-alone application on any remote host (currently Linux, Macintosh, or Windows platforms are supported). Remote access in this configuration can use port 8401 to be accessible on the CommServe host. Updates to a CommCell Console installed on a remote host must be applied manually.
Job Controller
Filtering the Job Controller Window
Right-click in the Job Controller window | Filters. To add a new filter, select the New Filter button. To apply an existing filter, select the filter from the Filters drop-down box.
Changing Job Status
Right-click job | select Suspend, Resume or Kill
A dialog box will appear to confirm the status change of a job and provide an explanation dialog box. Explanations for job status changes can be viewed in the Job Summary report.
Job Details
Right-click job | Details, or double-click the job
Details for specific jobs can be used to provide information on job status, data path, media usage or job errors.
Event Viewer
Filtering the Event Viewer
Double-down arrow | Filter | Select the field down arrow and select the appropriate filter
The event viewer can be filtered based on the available fields. Although some filters, such as Date, don't have a practical application, other fields such as Computer, Program or Event Code can be effective in quickly locating specific events.
Searching Event Log
Right-click in event viewer | Search Events
Although only 200 to 1,000 events are displayed in the event viewer, the entire event log can be searched from the event viewer. The default total number of events retained is 10,000. When right-clicking anywhere in the event viewer, select the option to search events. Events can be searched by time range, severity and job ID. If common searches are frequently conducted, the search criteria can be saved as a query and run at any time.
Setting Event Log Retention
Home tab | Control Panel | Configure | System
By default, the event log will retain 10,000 events or 7 days of events. When the event logs reach their upper limit, the oldest events will be pruned from the event logs.
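The pruning behavior described above can be pictured with a small illustrative sketch (not Commvault's actual implementation); the defaults of 10,000 events and 7 days come from the text above.

from datetime import datetime, timedelta

def prune_events(events, max_events=10000, max_age_days=7):
    # events: list of (timestamp, message) tuples, ordered oldest to newest.
    cutoff = datetime.now() - timedelta(days=max_age_days)
    recent = [e for e in events if e[0] >= cutoff]   # drop events older than 7 days
    return recent[-max_events:]                      # keep only the newest 10,000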
The correct answer is C.
Filters in the job controller can be set to filter by client groups, clients, and agent types.
http://documentation.commvault.com/commvault/v11/article?p=features/job_management/job_management_advanced.htm
The correct answer is D.
An event search can be conducted based on time range, severity, or job ID. To run an event search, right-click in the event viewer window and then select search events.
http://documentation.commvault.com/commvault/v11/article?p=features/event_viewer/c_event_viewer_overview.htm
MediaAgents and Indexing
Physical vs. Virtual MediaAgent
Commvault recommends using physical MediaAgents to protect physical and virtual data. The advantages of using a physical MediaAgent are better performance, more versatility as a multi-purpose data mover (protecting both VMs and physical data), and resiliency. A MediaAgent can be virtualized if all performance requirements, including CPU, RAM, index cache location and deduplication database location, are being met.
Data Pipe
MediaAgents can be used to back up client data over a network, or in a dedicated configuration where the client and MediaAgent are installed on the same server and use a LAN-free (preferred) path to back up data directly to storage.
Cyclic Redundancy Checks (CRC)
By default, the 'Validation on Media' and 'Validation on Network' Cyclic Redundancy Check options are enabled for all MediaAgents. The CRC information is used when conducting validation operations for data in protected storage. There is a minimal performance impact on protection jobs (1 – 2%) and it is strongly recommended to always leave these settings enabled. It is also recommended to regularly run data validation jobs, especially for deduplicated data.
Optimize for Concurrent LAN Backup (SDT Pipeline)
The Scalable Data Transport (SDT) pipeline is designed to optimize MediaAgent resource allocation when protecting a large number of streams over a network connection. The MediaAgent setting 'Optimize for concurrent LAN backup' is used to enable or disable the SDT pipeline and is enabled by default.
V2 Indexing
V2 indexing works by using a persistent index database maintained at the backup set level. During subclient data protection jobs, log files are generated with all protected objects. The logs will be played into the index database.
Index Process for Data Protection Jobs
Indexing data is located in a persistent index database. One index database will maintain records for all objects within a backup set, so all subclients within the same backup set will write to the same index database. The database is created and maintained on the data mover MediaAgent once the initial protection job of a subclient within a backup set completes. Index databases are located in the index cache location on the MediaAgent. During data protection jobs, log files are generated with records of protected objects. The maximum size of a log will be 10,000 objects or a complete chunk. Once a log is filled or a new chunk is started, a new log file is created and the closed log will be written to the index database. By writing index logs to the database while the job is still running, the indexing operations of the job run independently of the actual job, allowing a job to complete even if log operations are still committing information to the database. At the conclusion of each job, the log files are written to storage along with the job. This is an important distinction from traditional indexing, which would copy the entire index to storage. By copying just logs to storage, indexes require significantly less space in storage, which is a big benefit when protecting large file servers. Since the index database is not copied to storage at the conclusion of each job, a special IndexBackup subclient is used to protect index databases.
Index Checkpoint and Backup Process
During data protection jobs, logs are committed to the index database and are also kept in the index cache. In the event that an index database is lost or becomes corrupt, a backup copy of the index database is restored from media and the log files in the index cache are replayed to the database. If the index cache location is lost, the database and logs are restored from media and the logs will be replayed into the database. These recovery methods provide complete resiliency for index recovery. The index databases are protected with a system-created subclient on the MediaAgent called IndexBackup. An index backup operation is scheduled to run every eight hours. During the backup operation, index databases are checked to determine if they qualify for protection. The two primary criteria used to determine if a database qualifies for protection are one million changes or 30 days since the last backup (a sketch of this qualification check follows the list below). If the index database qualifies, three actions will occur:
A database checkpoint will occur
The database will be compacted
The database will be backed up to the storage policy associated with the IndexBackup subclient
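The qualification check mentioned above can be sketched as follows. This is an illustrative simplification, not Commvault's internal logic; the thresholds are the defaults stated in the text (one million changes or 30 days).

from datetime import datetime, timedelta

def qualifies_for_index_backup(changes_since_backup, last_backup,
                               change_threshold=1_000_000, age_threshold_days=30):
    # An index database qualifies when either threshold is crossed.
    too_many_changes = changes_since_backup >= change_threshold
    too_old = datetime.now() - last_backup >= timedelta(days=age_threshold_days)
    return too_many_changes or too_old

# Example: a database with 50,000 changes last backed up 45 days ago still
# qualifies because of its age:
# qualifies_for_index_backup(50_000, datetime.now() - timedelta(days=45))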
Database Checkpoint
Checkpoints are used to indicate a point in time at which a database was backed up. Once the database is protected to storage, any logs that are older than the checkpoint can be deleted from the index cache location.
Database Compaction
During data aging operations, deleted jobs are marked in the database as unrecoverable, but objects associated with the job remain in the database. The compaction operation deletes all aged objects and compacts the database.
Database Backup
Once the checkpoint and compaction occur, the database will be backed up to the primary copy location of the storage policy. Three copies of the database will be kept in storage and normal storage policy retention rules are ignored. During the index backup process, the database will be frozen and no browse or find operations can be run against the database. Each database that qualifies for backup is protected sequentially, minimizing the freeze time. Data protection jobs will not be affected by the index backup.
Index Cache Settings
All object level data protection jobs will use indexes for all operations. These indexes are maintained in the index cache. Improper configuration of the index cache can result in job failures and long delays in browse and recovery operations. Consider the following when designing and configuring the index cache:
Index cache should be located on dedicated high-speed disks, preferably solid state disks. Do NOT put the index cache on the system drive. The location of the cache can be changed by right-clicking the MediaAgent and selecting the Catalog tab in its properties.
Size the index cache appropriately based on the size of your environment and the estimated number of objects that will be protected. It is much better to overestimate than underestimate index cache size. Sizing guidelines are available in the Commvault Online Documentation.
The default retention time for V1 indexes in the index cache is 15 days. If you will be frequently browsing for data older than 15 days, increase this setting and allocate enough disk space for the index cache. V2 indexes are persistent and are not pruned based on index cache retention settings.
Index files are automatically protected, so there is no need to perform backups of the index cache location.
The correct answer is B.
V2 indexing maintains an index database at the backup set level. V1 indexing maintains indexes at the subclient level.
http://documentation.commvault.com/commvault/v11/article?p=features/index/c_v2_overview.htm
The correct answer is B. Index transaction logs are related to a specific job. At the conclusion of the job, they are copied to the media where the job data is located. Index database backups are protected with a special subclient on the MediaAgent. The subclient is scheduled to run automatically every eight hours.
Test Tip: Be careful with questions where an answer seems mostly correct. In this question, both C and D are accurate answers but not the full answer. Always read each answer to determine which is the best choice.
Alerts and Reports
Alerts
Console Alerts
When configuring alerts, console alerts can be selected as a notification method. Once an alert is triggered, it appears in the Console Alerts window within the CommCell® browser. Right-click on an alert to view details, delete, mark as read or unread, or to insert a note. Console alerts can be pinned or deleted using the icons at the bottom of the window.
Reports
CommCell® reports can be configured from the Reports tab in the CommCell toolbar. Reports can be:
Scheduled
Saved to a specific location
Saved as report templates
Use the Time Range tab to set the scope of the report and use the Output tab to select the output format:
HTML
PDF
Text file - which is saved as a CSV file for spreadsheet import.
You can also choose the Output method. Choices include scheduling, save as script, save as template or save to a disk location. The following lists some common CommCell® reports:
The Job Summary report, which can be used to view data protection, data recovery and administrative jobs.
The CommCell® Readiness report, which can be used as a status report for CommCell components such as clients, MediaAgents, library storage capacity and index caches.
The CommCell® Configuration report, which provides CommCell configuration, license usage and update status of CommCell components.
The Job Schedule report, which can be used to view schedules for client computer groups, clients and administrative jobs.
The Data Retention Forecast and Compliance report, which can be used to view jobs in storage, the media they are located on, and the estimated time the data will age.
The correct answer is D.
An alert can be tested by right-clicking the alert and selecting Test.
http://documentation.commvault.com/commvault/v11/article?p=features/alerts/alerts_advanced.htm
Test Tip: Be aware of answers that state a limiting capability within Commvault software. Answer A states that the action cannot be performed from the CommCell console. Not only is Commvault a feature-rich software solution with many options, but it is also not likely that an exam question will ask about a feature or capability that does not exist.
The correct answer is B.
Commvault software can perform A, C, and D but cannot send reports directly to a printer. The report can be saved as a PDF or emailed, where it can then be printed.
Security
Role Based Security
Role based security transcends limitations of traditional user and user group security by separating the user or group from permissions. Role based security is based on three components:
User or user group – can be a local CommCell user / user group or a domain user / user group
Role – defines a set of permissions not tied to any user or user group
Entity – the component that joins the user / user group with the associated role
The separation of user / user group (who), role (permissions), and entity (what) allows a user or user group to have different permissions depending on what their role is for a specific entity.
Example: A user requires backup and recovery permissions for a file server. The same user requires restore-only permissions for a mail server. The user is associated with the file server entity and assigned the backup and recovery role. The same user is assigned to the mail server entity with the recovery role.
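The who / permissions / what separation can be pictured with a small data-model sketch. This is illustrative only, not Commvault's internal representation, and the user, role and entity names below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityAssociation:
    user: str    # who: a local CommCell or domain user / user group
    role: str    # permissions: a named set of permissions
    entity: str  # what: the client, client group or other CommCell entity

# The example from the text: one user, two different roles on two entities.
associations = [
    SecurityAssociation("domain\\jsmith", "Backup and Recovery", "FileServer01"),
    SecurityAssociation("domain\\jsmith", "Recovery Only", "MailServer01"),
]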
Roles After Upgrading from Previous Version
Prior to Commvault version 11, all permissions (formerly called capabilities) were associated with a CommCell user group. When upgrading Commvault software, a role will be created for each user group and permissions will be assigned to the role based on the capabilities of the old user group. For each user group, a role will automatically be created prefixed with _Role. These roles will automatically be assigned to entities along with the user groups.
User Quotas
Domain users can have data protection quotas enforced for file based backups. Quotas can be set at the group or user level. If quotas are set at the group level, they can be overridden at the user level. How user quotas work (a sketch of the threshold logic follows the list below):
When a user reaches 90% of their defined quota, a warning email will be sent to the user.
When a user reaches 110% of quota, backups will not run for systems owned by the client.
To fall below these thresholds, the user either has to delete data or the Commvault administrator has to increase the user’s quota.
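A sketch of the threshold logic listed above, using the default percentages from the text (illustrative only; the quota and usage figures in the example are hypothetical):

def quota_action(used_gb, quota_gb):
    ratio = used_gb / quota_gb
    if ratio >= 1.10:
        return "backups stop for systems owned by the user"
    if ratio >= 0.90:
        return "warning email sent to the user"
    return "no action"

# Example with a hypothetical 100 GB quota:
# quota_action(95, 100)   -> warning email sent to the user
# quota_action(115, 100)  -> backups stop for systems owned by the user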
Firewall Configuration
When CommCell® components need to communicate or move data through a firewall, firewall settings must be configured for each component. This can be done by configuring individual firewall settings for a specific client, or firewall settings can be applied to a client computer group. For example, if a client needs to communicate with a CommServe server through a firewall and back up data to a MediaAgent through a firewall, all three components would require firewall configuration. There are three primary methods for connecting through a firewall:
Direct – where the CommCell components communicate directly with each other through a firewall.
Through a proxy – where CommCell components use a proxy in a demilitarized zone (DMZ) to communicate with each other.
Gateway – where CommCell components communicate through a gateway resource.
There are four configuration tabs available:
Incoming connections
Incoming ports
Outgoing routes
Options
Configuring Incoming Connections
The incoming connections tab is used to determine if other CommCell components can connect to the client or client group where the firewall settings are being configured. There are three connection options:
Open connection – there are no firewall restrictions. In this case, no incoming connections need to be configured.
Restricted – there are firewall port restrictions in place and a component on the other side of the firewall can reach the component that is currently being configured.
Blocked – there are firewall port restrictions in place and a component on the other side of the firewall can NOT reach the component that is currently being configured.
Commvault software uses port 8400 as the default communication port for all CommCell traffic. When firewall settings are enabled for a CommCell component, by default, port 8403 will be used as a listening port for any inbound connection attempts. Additionally, a dynamic port range can be configured to provide additional data traffic ports for backup and recovery operations.
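As a quick way to illustrate the defaults above, a generic TCP reachability check can be run against the listening port (8403 by default) from the other side of the firewall. This is a plain socket test, not a Commvault utility, and the host name in the example is hypothetical.

import socket

def port_reachable(host, port=8403, timeout=5.0):
    # Attempts a TCP connection to the given host and port.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: port_reachable("mediaagent01.company.local", 8403)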
Firewall Topologies
Firewall topologies provide a simplified template to deploy firewall configurations to CommCell resources. One-Way, Two-Way, or Proxy firewall topologies can be configured. Once the simplified topology is configured, advanced firewall settings are still available at the client group and client levels to further configure settings if desired.
The correct answer is B.
The primary concept in Commvault V11 security is separating users and user groups from permissions. Roles are created, which define the permissions. Users and user groups, local or from Active Directory, are linked to a role when configuring security settings for an entity. Although answers A and B are not wrong, answer B is the complete answer.
http://documentation.commvault.com/commvault/v11/article?p=features/user_admin/user_admin.htm
The correct answer is C.
The default listening port for all firewall configurations within Commvault is 8403, which is a Commvault registered port. Whether setting firewall parameters at the client level, at the client group level, or using firewall topologies, the same default listening port is used.
http://documentation.commvault.com/commvault/v11/article?p=features/firewall/direct_connections.htm
Module 2: Storage Configuration and Media Management
Disk Libraries
A disk library is a logical container which is used to define one or more paths to storage, called mount paths. These paths are defined explicitly to the location of the storage and can be defined as a drive letter or a UNC path. Within each mount path, writers can be allocated, which defines the total number of concurrent streams for the mount path.
Disk Library Streams
Disk library stream settings, by default, are not throttled at the library or mount path level. Throttling is handled by the MediaAgent (default 100) and the storage policy (default 50). So if two storage policies are writing to the same MediaAgent, a maximum of 100 streams will be used. Depending on the type of library being used, MediaAgent power, and network bandwidth, these numbers can be modified.
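The interaction of these defaults can be shown with a small calculation; this is a simplified illustration of the limits described above, not a Commvault API.

def effective_streams(mediaagent_limit, policy_stream_limits):
    # Concurrent streams are capped by the MediaAgent limit and by the sum of
    # the device stream limits of the storage policies writing to it.
    return min(mediaagent_limit, sum(policy_stream_limits))

# Two storage policies at the default of 50 streams each, writing to one
# MediaAgent at the default of 100: effective_streams(100, [50, 50]) -> 100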
Disk Library Properties
Setting low disk space watermark
Disk library Properties | General tab
Disk Library Low Watermark can be configured with a percentage threshold that will report an event to the event viewer when total disk capacity for all mount paths reaches or falls below the specified percentage. Alerts can also be configured to notify users when the threshold has been reached.
Determine the order in which multiple mount paths will be used
Disk library Properties | Mount Paths tab
Mount Path usage determines the order in which mount paths will be written to when multiple mount paths are configured for a disk library.
Fill & Spill - The default setting for mount path usage; uses mount paths in the order in which they were created.
Spill & Fill (load balancing) - Load balances device streams between all mount paths in the library.
Which should you use for best performance? For disk libraries, part of the bottleneck will be the disks themselves, but the other part will be the I/O path to the disks. The use of Fill & Spill or Spill & Fill (load balancing) should be determined by how the mount paths were created and the I/O path to each mount path.
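The two usage patterns can be sketched as simple mount path selection strategies; this is an illustrative simplification, not Commvault's resource manager logic.

def pick_fill_and_spill(mount_paths):
    # Fill & Spill: use mount paths in creation order; move to the next path
    # only when the current one has no free writers.
    for mp in mount_paths:  # list ordered by creation
        if mp["active_writers"] < mp["max_writers"]:
            return mp
    return None  # all mount paths are full

def pick_spill_and_fill(mount_paths):
    # Spill & Fill: send the next stream to the path with the fewest active writers.
    free = [mp for mp in mount_paths if mp["active_writers"] < mp["max_writers"]]
    return min(free, key=lambda mp: mp["active_writers"]) if free else None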
Validating a Mount Path Mount path validation is run from the CommCell console. This operation tests performance of library mount paths. The validation operation is configured using the following parameters:
MediaAgent
File size
Number of writers
Block size
Number of files
SAN Data Server
SAN Data Server allows multiple MediaAgents to connect to storage through a Linux proxy MediaAgent using Fibre Channel or iSCSI connections. The Linux MediaAgent can host direct attached disk storage which is then presented to other MediaAgents as a shared disk library. This allows multiple client / MediaAgents to backup over the SAN connection to a single shared disk library instead of each client / MediaAgent using a dedicated library. When using the SAN Data Server and Commvault deduplication, better deduplication ratios can be achieved. SAN Data Server is best suited for protecting clustered applications or when client / MediaAgents have SAN connectivity and it's desirable to avoid using the IP network to conduct backup operations.
The correct answer is C. Commvault software provides a fragmentation analysis tool but it is recommended to follow the storage hardware vendor's guidance regarding disk defragmentation. http://documentation.commvault.com/commvault/v11/article?p=features/disk_library/advanced.htm
The correct answer is C. The spill and fill option will load balance backup streams across multiple mount paths.
http://documentation.commvault.com/commvault/v11/article?p=features/disk_library/advanced.htm
The correct answer is D. Local paths defined by a drive letter, or UNC paths with user permissions to read and write to the disk, are supported. Mapped network drives are not supported.
http://documentation.commvault.com/commvault/v11/article?p=features/disk_library/best_practices.htm
Tape Libraries
A tape or removable media library is a library where media can be added, removed and moved between multiple libraries. The term removable media is used to specify various types of removable media supported by Commvault software, including tape and USB disk drives, which can be moved between MediaAgents for data protection and recovery operations. Removable media libraries will be divided into the following components:
Library - the logical representation of a library within a CommCell environment. A library can be dedicated to a MediaAgent or shared between multiple MediaAgents. Sharing of removable media libraries can be static or dynamic depending on the library type and the network connection method between the MediaAgents and the library.
Master drive pool - a physical representation of drives within a library. An example of master drive pools would be a tape library with different drive types like LTO4 and LTO5 drives within the same library.
Drive pool - can be used to logically divide drives within a library. The drives can then be assigned to protect different jobs.
Scratch pool - can be defined to manage media which can then be assigned to different data protection jobs. Custom scratch pools can be defined and media can be assigned to each pool. Custom barcode patterns can be defined to automatically assign specific media to different scratch pools, or media can manually be moved between scratch pools in the library.
Using the Library & Drive Configuration Tool
Libraries are either detected (e.g., tape device, library controller) or added (e.g., disk, cloud, IP-based controller). Essential to both is the ability of the MediaAgent to correctly see/access the device. Prior to any detection or adding of devices to a MediaAgent, confirm the physical and logical view of the device from the operating system. If multiple similar devices are involved (e.g., a multi-drive library), all such devices should be at the same firmware level.
Detection
The system only detects devices for which device drivers are loaded. A detected device may have the following status:
Success - indicates that the system has all of the information necessary to use the device.
Partially configured, detect fail - connection error - status when the detection fails due to an error connecting to the MediaAgent.
Partially configured, detect fail - device not found - status when the detection fails due to a missing device.
Note: Some devices (e.g., the library associated with a stand-alone drive) have no detection status, since they are virtual entities and as such have no hardware components that can be detected.
Exhaustive Detection
Modern tape drives have serial numbers which are used by the Commvault software to properly place a drive physically and logically within a library. Older drives without serial numbers require manual locating. Exhaustive detection is the process of associating drive numbers to their correct SCSI address. This is done by mounting a media to each of the drives in the library to obtain the drive's SCSI address.
Tape Management
Depending on the media group, certain administrative tasks can be performed for the group and tapes within the group. Managing tapes in different media groups provides the Commvault administrator with greater flexibility and simplifies the management of media. It is important to understand the capabilities and limitations of media management within the various logical media groups available. The following actions can be performed on tapes in any media group:

Export - Physically export a tape out of the library.
Move - Logically move tapes between media groups.
Verify Media - Physically verify the OML header information against CommCell tape metadata and the barcode label.
View Contents - Logically view active and aged jobs on a tape.
Delete - Logically delete the existence of a tape from the CommServe database.
Delete Contents - Logically delete contents by marking all jobs as aged and recycling the tape back into a scratch pool.
Erase Media - Physically erase data by writing a new OML header to the tape.
Mark Media Bad - Logically mark a tape bad to prevent it from being reused.
Media Refresh - Refresh active jobs on existing tapes by writing the jobs to new tapes.
Destructive and Non-Destructive Media Handling
Deleting Tapes
A Delete operation is a logical operation that deletes the existence of the tape from the CommCell® environment and CommServe® database. The delete operation will not physically erase any data and the tape does not need to be in the library when it is deleted. This is also considered a non-destructive operation, which means a tape can only be deleted if no active jobs are currently retained on the tape. This means that the Delete option will not be available for tapes in the Assigned Media group.
Delete Tape Contents
Delete Contents is a logical operation that will automatically mark all jobs on a tape as aged. During a Delete Contents operation a confirmation message will appear and the administrator must type erase and reuse media. The administrator will then be prompted to select a scratch group to move the tape to. Data on the tape is logically marked aged, so the data can still be recovered up to the point where the tape is mounted and the OML header is overwritten.
The most common situation where the Delete Contents operation is used is when there is not enough spare media to run scheduled jobs. This typically happens when storage policies are improperly configured or retention expectations are unrealistic compared to the capacity available to store data. If an administrator frequently uses the Delete Contents option to free up tapes for jobs, consider readdressing environment configurations or purchasing more tapes. Manual operations such as this add the potential for human error accidentally deleting critical data.
Erasing Tapes
Erase Media is a physical operation that will mount the tape and overwrite the OML header. Once the header is overwritten, data cannot be recovered using any method Commvault® software provides. This is considered a destructive operation, so it cannot be performed on any tapes where jobs are actively being retained. The option to erase media will be available in all logical groups except the Assigned Media group.
The correct answer is D. Jobs on tape media are logically marked aged in the CommServe database but job metadata is kept until the OML header on the tape is physically overwritten. Jobs on an aged tape in a scratch pool can be browsed and data can be restored.
http://documentation.commvault.com/commvault/v11/article?p=features/disaster_recovery/other/overview.htm
The correct answer is False. A library does not need to be attached to the server when installing MediaAgent software. The library and drive configuration tool is used to detect and configure libraries connected to a MediaAgent. http://documentation.commvault.com/commvault/v11/article?p=features/library_drive_config/library_drive_configuration_overview.htm
The correct answer is C. To prevent mixed retention on tape, only jobs from the same storage policy copy can be appended to a tape.
Deduplication
Commvault® software has a unique set of deduplication features that are not available with most third party deduplication solutions. By taking full advantage of Commvault® Deduplication, you can reduce storage and network resource requirements, shrink backup windows, efficiently copy data to off-site locations, and copy deduplicated data to tape, disk, or to a cloud environment. Commvault Deduplication offers the following benefits:
Efficient use of storage media
Efficient use of network bandwidth
Significantly faster Synthetic Full operations
Significantly faster auxiliary copy operations
Efficient use of tape media
Resilient indexing and restorability
The Deduplication Process and Data Protection
The following steps illustrate the deduplication process during a data protection job.
1. Production data is read from the source location and written into a memory buffer. This memory buffer is filled based on the defined block size. Note that the block size is referred to as a data block with a default of 128 KB.
2. A signature is then generated on the data block. The signature uniquely represents the bit makeup of the block.
3. The signature is compared in the Deduplication Database (DDB) to determine if the data block already exists.
A. If it does exist, the data block in the memory buffer is discarded and pointers to the existing block in storage are referenced instead.
B. If it does not exist, the data block is written to protected storage.
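The three steps above can be sketched as pseudocode. This is a conceptual illustration only: the 128 KB block size matches the default described above, but the hash function, the dictionary standing in for the DDB, and the storage callback are assumptions, not Commvault internals.

import hashlib

BLOCK_SIZE = 128 * 1024   # default data block size described above
ddb = {}                  # stand-in for the Deduplication Database (signature -> block location)

def protect(source_file, write_block_to_storage):
    references = []
    with open(source_file, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)                     # 1. fill the memory buffer
            if not block:
                break
            signature = hashlib.sha256(block).hexdigest()  # 2. generate a signature for the block
            if signature in ddb:                           # 3A. duplicate: discard block, keep a pointer
                references.append(ddb[signature])
            else:                                          # 3B. unique: write block and record it in the DDB
                location = write_block_to_storage(block)
                ddb[signature] = location
                references.append(location)
    return references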
Deduplication Data Movement Process
DASH Full
A read optimized synthetic DASH Full uses the Commvault deduplication feature to logically perform synthesized full backups without moving any data. This can be accomplished because Commvault deduplication tracks the location of all blocks on disk storage. After the initial foundation full is run and subsequent incremental jobs are run, all block data required for the synthetic full is already present in the deduplicated disk storage location. Since deduplication will only store a unique block once in storage, the DASH Full operation will only reference the blocks in storage and not actually copy them. The DASH Full operation will generate a new index file signifying that a full backup was run and update the deduplication database with block record data that is used for data aging purposes. DASH Full backups are the preferred method of running full backup jobs and can dramatically reduce backup windows.
DASH Copy
A DASH Copy is an optimized auxiliary copy operation which only transmits unique blocks from the source library to the destination library. It can be thought of as an intelligent replication which is ideal for consolidating data from remote sites to a central data center and for copying backups to DR sites. DASH Copy has several advantages over traditional replication methods:
DASH Copies are auxiliary copy operations, so they can be scheduled to run at optimal time periods when network bandwidth is readily available. Traditional replication would replicate data blocks as they arrive at the source.
Not all data on the source disk needs to be copied to the target disk. Using the subclient associations of the secondary copy, only the data required to be copied is selected. Traditional replication would require all data on the source to be replicated to the destination.
Different retention values can be set for each copy. Traditional replication would use the same retention settings for both the source and target.
DASH Copy is more resilient than traditional replication. If the source disk data becomes corrupt, the target is still aware of all data blocks existing on the disk. This means after the source disk is repopulated with data blocks, duplicate blocks will not be sent to the target, only changed blocks. Traditional replication would require the entire replication process to start over if the source data became corrupt.
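A minimal sketch of the DASH Copy idea follows, assuming the source job is described by a list of block signatures and the destination MediaAgent keeps its own DDB; only signatures missing at the destination cause block data to cross the network. All function names and parameters are illustrative, not Commvault APIs.

def dash_copy(job_signatures, read_block, destination_ddb, send_block, send_reference):
    """Copy a job to a secondary copy, transmitting only blocks the destination lacks."""
    for signature in job_signatures:
        if signature in destination_ddb:
            # Block already exists at the destination: send only a reference.
            send_reference(signature)
        else:
            # Unique block: transmit the data and register it in the destination DDB.
            block = read_block(signature)
            location = send_block(signature, block)
            destination_ddb[signature] = location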
Transactional Deduplication Database
The transactional deduplication database provides significant resiliency benefits. Several components make up the transactional deduplication database: in-memory logs and a disk log.
In Memory Logs
In Memory logs are linked to portions of the deduplication database and are dynamically added to memory by the system. There are three memory logs: one active log, which records all database changes, a pending commit log, and a merge commit log.
The active log will record changes for 20 seconds. Once the active log is closed it becomes a pending commit log and a new active log is started. While the active log receives changes and the pending commit log closes, the merge commit log is committed to an on-disk log.
DiskDB Log
The DiskDB log resides in the DDB location and receives updates from the memory logs. In addition to receiving changes from the In Memory logs, it is also used to commit records to the deduplication database. In the event of an unplanned MediaAgent shutdown, the DiskDB log is used to bring the DDB to a consistent state.
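The log rotation described above can be visualized with a small state model. This is purely conceptual; the class name, the flush method, and the 20 second interval used here are assumptions based only on the description above, but it shows how every change lands in an active log and reaches the DiskDB log a couple of rotations later.

import time

class TransactionalDDBLogs:
    """Conceptual model of the in-memory log rotation (active -> pending -> merge -> disk)."""
    def __init__(self, rotation_seconds=20):
        self.rotation_seconds = rotation_seconds
        self.active, self.pending, self.merge = [], [], []
        self.last_rotation = time.time()

    def record(self, change):
        if time.time() - self.last_rotation >= self.rotation_seconds:
            self._rotate()
        self.active.append(change)              # all DDB changes hit the active log first

    def _rotate(self):
        self._flush_to_diskdb_log(self.merge)   # merge commit log is committed to the on-disk log
        self.merge, self.pending = self.pending, self.active
        self.active = []                        # a new active log starts recording
        self.last_rotation = time.time()

    def _flush_to_diskdb_log(self, entries):
        pass  # stand-in for writing committed records to the DiskDB log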
Content Aware Deduplication
The concept of content aware deduplication is to identify what type of data is being protected and adjust how deduplication is implemented. Consider a deduplication appliance that receives data from a backup application. The appliance can't detect files, databases, or metadata generated from the backup application. Commvault deduplication is integrated into agents so it understands what is being protected. This provides significant space saving benefits and results in faster backup, restore and synthetic full backup operations.
Object Based Content Aware Deduplication
Since most file objects are not equally divisible by a set block size, such as 128 KB, Commvault® Deduplication uses a content aware approach to generate signatures. If an object that is 272 KB in size is deduplicated, it divides into two full 128 KB blocks with a remainder of 16 KB. In this case two 128 KB deduplication blocks will be hashed and compared. The remaining 16 KB will be hashed in its entirety. In other words, Commvault® Deduplication will not add more data to the deduplication buffer to fill out the final block. The result is that if the object containing the three deduplication blocks never changes, all three blocks will always deduplicate against themselves.
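The 272 KB example works out as follows. The sketch is illustrative arithmetic only; it simply shows how an object is cut into 128 KB blocks with the remainder hashed as its own smaller block.

BLOCK_SIZE_KB = 128

def split_object(object_size_kb):
    """Return the deduplication block sizes for an object (content aware split)."""
    full_blocks, remainder = divmod(object_size_kb, BLOCK_SIZE_KB)
    blocks = [BLOCK_SIZE_KB] * full_blocks
    if remainder:
        blocks.append(remainder)   # trailing data is hashed in its entirety, not padded
    return blocks

print(split_object(272))   # [128, 128, 16] - three blocks, as in the example above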
Database and Log Content Aware Deduplication
Database applications often provide built-in compression which will compress blocks before Commvault generates signatures on the blocks. The application level compression results in inconsistent blocks being deduplicated each time a backup runs, which results in poor deduplication ratios. When Commvault compression is used during backups instead of application compression, the application agent will detect the database backup and generate a signature on uncompressed data. After the signature has been generated, the block will then be compressed. This leads to improved deduplication ratios.
Log files are constantly changing with new information added and old information truncated. Since the state of the data is constantly changing, deduplication will provide no space saving benefits. During log backup jobs, the application agent will detect the log backup and no signatures are generated, saving CPU and memory resources on the production system and speeding up backups by eliminating signature lookups in the deduplication database.
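The ordering matters because compression changes the byte pattern that gets hashed. The sketch below contrasts the two pipelines; the hash and compression functions are generic stand-ins, not the actual Commvault signature algorithm or codec.

import hashlib, zlib

def backup_block_content_aware(block):
    # Database backup with Commvault compression: signature on the uncompressed data,
    # then compress - identical database blocks keep producing identical signatures.
    signature = hashlib.sha256(block).hexdigest()
    return signature, zlib.compress(block)

def backup_block_precompressed(compressed_block):
    # Application-level compression: the data arrives already compressed, so even a
    # small change upstream scrambles the bytes and signatures rarely repeat.
    return hashlib.sha256(compressed_block).hexdigest(), compressed_block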
Data Verification
Verification of Existing Jobs on Disk and Deduplication Database
Full verification for disk and DDB uses checksum data to verify block integrity by reading data chunks (Sfiles), uncompressing and decrypting them, and using CRC information to validate block integrity. This option also verifies chunk metadata using CRC checks. Any blocks failing the check will be marked in the DDB. New blocks generating the same signature as a block marked bad are re-written to disk and a new signature entry is written to the DDB. This verification method also verifies chunk integrity between the DDB and disk library.
Verification of Deduplication Database
The verification of the deduplication database option performs all of the same tasks as the verification of existing jobs on disk and the deduplication database except metadata chunk validation.
Quick Verification of Deduplication Database
The quick verification option verifies chunk integrity between the DDB and disk library.
Incremental Verification
Incremental data verification verifies data integrity for new jobs added since the last verification job. This option is available when running the 'verification of deduplication database' or 'verification of existing jobs on disk and deduplication database' options. Since this method only verifies new jobs, full verification jobs should periodically be executed.
Partition Deduplication Design
Partition deduplication provides higher scalability and deduplication efficiency by allowing more than one Deduplication Database (DDB) to exist within a single Deduplication Engine. It works by logically dividing signatures between multiple databases. For example, all signatures that begin with a zero will be managed by the first database and all records starting with a one will be managed in a second database. If two deduplication partitions are used, it effectively doubles the size of the Deduplication Store. Currently Commvault® software supports up to four database partitions.
How Partitioned Databases Work
During data protection jobs, partitioned Deduplication Databases (DDBs) and the data protection operation work using the following logic:
1. Signature is generated at the source - For primary data protection jobs using Client Side Deduplication, the source location is the client. For auxiliary DASH copy jobs, the source MediaAgent generates signatures.
2. Based on the generated signature, it is sent to its respective database - The database compares the signature to determine if the block is duplicate or unique.
3. The defined storage policy data path is used to protect data - Regardless of which database the signature is compared in, the data path remains consistent throughout the job. If GridStor® Round-Robin has been enabled for the storage policy primary copy, jobs will load balance across MediaAgents.
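A sketch of the routing idea, assuming signatures are hex strings and the partition is chosen from the leading digit of the signature; the exact routing rule used by the software is not documented here, so treat this purely as an illustration of dividing the signature space between databases.

NUM_PARTITIONS = 2   # Commvault supports up to four partitions per engine

def partition_for(signature_hex, num_partitions=NUM_PARTITIONS):
    """Pick the DDB partition that owns this signature (illustrative rule only)."""
    return int(signature_hex[0], 16) % num_partitions

# Signatures whose leading digit is even go to partition 0, odd to partition 1.
print(partition_for("0a3f..."))   # 0
print(partition_for("1b77..."))   # 1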
The correct answer is C. The transactional deduplication database uses in-memory logs and a disk log to cache database updates, much like a database uses log files for database integrity. In the event of an unplanned MediaAgent shutdown, the disk log is used to bring the deduplication database to a consistent state. The deduplication database should never be located on NAS storage.
The correct answer is B. An incremental data verification job verifies all chunk data written since the last time the verification job ran. Commvault software Version 11 supports data protection jobs running while the verification operation is running.
http://documentation.commvault.com/commvault/v11/article?p=features/deduplication/t_verifying_the_deduplicated_data.htm
The correct answer is A. A new feature in Commvault Version 11 software is the ability to generate signatures prior to Commvault compression. This results in better deduplication ratios. Encryption can optionally be used.
The correct answer is D. For best performance, it is not recommended to place the deduplication database on a separate MediaAgent. Using high speed solid state or PCI storage configured in a RAID 10 or 5 array and locally attached to a MediaAgent provides the best performance at scale. http://documentation.commvault.com/commvault/v11/article?p=features/deduplication/deduplication_building_block.htm
Storage Policies
A storage policy defines the lifecycle management rules for all protected data. In its most basic form, a storage policy can be thought of as a container with one or more sets of rules that define how data will be managed. These rules are called storage policy copies. Within the storage policy, multiple copies can be created, each with their own set of rules. There are three primary rules that every storage policy copy must have defined:
What data will be protected is determined through subclient association
Where it will be stored is determined by the data path
How long it will be kept is determined by retention
Storage Policy Structure The concept of storage policy copies is that the data from the production environment only has to be moved to protected storage once. Once the data is in protected storage, the storage policy logically manages and maintains independent copies of the data. This allows for great flexibility when managing data based on the three key aspects of data protection: disaster recovery, data recovery, and data archiving. There are three types of storage policy copies
Primary Copy
Secondary Synchronous Copy
Secondary Selective Copy
Primary Copy
A storage policy primary copy sets the primary rules for protected data. Each storage policy can have two primary copies: a primary snap copy and a primary classic copy. A primary snap copy is used to manage protected data using the Commvault IntelliSnap® feature and a primary classic copy will manage traditional agent based data protection jobs. Most rules defined during the policy creation process can be modified after it has been created.
Secondary Copies There are two types of secondary copies:
Secondary Synchronous
Secondary Selective
Synchronous Copy
A synchronous copy defines a secondary copy that synchronizes protected data with a source copy. All valid data (jobs that completed successfully) written to the source copy will be copied to the synchronous copy via an update process called an auxiliary copy operation - this means all full, incremental, differential, transaction log, or archive jobs from a source copy will also be managed by the synchronous copy. Synchronous copies are useful when you want a consistent point-in-time copy at any point within the cycle of all protected data available for restore. Synchronous copies are used to meet the following requirements:
Consistent point-in-time copies of data required to restore data to a particular point-in-time within a cycle.
Copies that are required to be sent off-site daily.
To provide the ability to restore multiple versions of an object from a secondary copy within a cycle.
Selective Copy
A selective copy allows automatic selection of specific full backups or manual selection of any backup for additional protection. Selective copy options allow the time based automatic selection of: all, weekly, monthly, quarterly, half-year, and/or yearly full backups. Advanced options allow you to generate selective copies based on a frequency of number of cycles, days, weeks, or months. You can also choose the Do Not Automatically Select Jobs option which allows you to use auxiliary copy schedules to determine when copies of full backups will be made. Selective copies are used to meet the following requirements:
Data being sent off-site weekly, monthly, quarterly, or yearly.
Archiving point-in-time copies of data for compliance and government regulations.
Global Secondary Copy
Global Secondary Copy policies allow multiple storage policy copies using a tape data path to be associated with a single global secondary copy. This is based on the same concept as global deduplication policies, but global secondary copies only apply to tape copies. If multiple secondary copies require the same retention and encryption settings, using a global secondary copy reduces the number of tapes required during auxiliary copy operations and improves performance.
Storage Policy Settings
Device Streams
Device streams determine how many concurrent write operations will be performed to the library. This setting should equal the number of tape drives or the total number of writers for all mount paths in a disk library. If the number of drives or writers change, the Device Streams setting can be modified in the storage policy's properties.
Storage Policy Copy Settings
Combine to Streams
A storage policy can be configured to allow the use of multiple streams for primary copy backup. Multi-streaming of backup data is done to improve backup performance. Normally, each stream used for the primary copy requires a corresponding stream on each secondary copy. In the case of tape media for a secondary copy, multi-stream storage policies will consume multiple media. The combine to streams option can be used to consolidate multiple streams from source data onto fewer media when secondary copies are run. This allows for better media management and the grouping of like data onto media for storage.
Storage Policy Administration Hiding Storage Policies Storage Policies can be hidden from view within the CommCell Console by selecting the Hide Storage Policy check box in the Storage Policy Properties. Once hidden it will not appear in the Storage Policies list and more importantly, subclients cannot be associated with a hidden Storage Policy. In order to hide a Storage Policy, no subclients can be associated with the policy. Hiding Storage Policies is an important feature because if a Storage Policy that is managing protected data is deleted, all the managed data will be pruned during the next data aging operation.
Deleting Storage Policies
If a Storage Policy is deleted, all protected data associated with the Storage Policy and all policy copies will be pruned during the next data aging operation. It is strongly recommended to hide the Storage Policy instead of deleting it. To delete a Storage Policy, perform the following:
1. In the Storage Policy properties view the Associations tab to ensure no subclients are associated with the policy. A Storage Policy cannot be deleted if subclients are associated with the policy.
2. On the Storage Policy, right-click | select View | Jobs. De-select the option to Specify Time Range then click OK. This step will display all jobs managed by all copies of the Storage Policy. Ensure that there are no jobs being managed by the policy and then exit from the job history.
3. Right-click on the Storage Policy | Select All Tasks | Delete. Read the warning dialog box then click OK. Type erase and reuse media, then click OK.
The correct answer is B. If a storage policy copy will use a global deduplication database, it must be set during initial configuration. Since the global deduplication policy defines the data path and holds all deduplicated data, a storage policy copy's global deduplication setting cannot be modified after the copy is created. http://documentation.commvault.com/commvault/v11/article?p=features/deduplication/t_creating_a_spc_with_global_dedup.htm
The correct answer is D. The combine to streams option determines the number of destination device streams. If three source streams need to be consolidated into a single stream, setting the combine to streams parameter to one will accomplish this. Multiplexing the streams can also be set, but only after enabling the combine to streams option for the secondary copy. http://documentation.commvault.com/commvault/v11/article?p=features/streams/advanced.htm
The correct answer is C. Hiding a storage policy will prevent the policy from appearing in the CommCell console, subclient associations selection, and in the subclient drop-down box. http://documentation.commvault.com/commvault/v11/article?p=features/storage_policies/storage_policies_how_to.htm
Retention
Job Based Retention
Policy based retention settings are configured in the storage policy copy Retention tab. The settings for backup data are Days and Cycles. For archive data the retention is configured in Days. Retention can also be set through schedules or applied retroactively to a job in a storage policy copy.
Days
A day is a 24-hour time period defined by the start time of the job. Each 24-hour time period is complete whether a backup runs or not. In this way a day is considered a constant.
Cycles
A cycle is traditionally defined as a complete full backup and all dependent incremental, differential, or log backups; up to, but not including, the subsequent full. In real world terms a cycle is all backup jobs required to restore a system to a specific point-in-time. To better understand what a cycle is, we will reference a cycle as Active or Complete. As soon as a full backup completes successfully it starts a new cycle which will be the active cycle. The previous active cycle will be marked as a complete cycle.
An active cycle will only be marked complete if a new full backup finishes successfully. If a scheduled full backup does not complete successfully, the active cycle will remain active until such time that a full backup does complete. On the other hand, a new active cycle will begin and the previous active cycle will be marked complete when a full backup completes successfully regardless of scheduling.
Data Aging for Non-Deduplicated Data
There are two processes that will be performed during a data aging operation. Aging simply marks jobs that have exceeded retention as aged. Pruning will physically delete eligible disk jobs or recycle a tape when all jobs on it have been marked aged. The Data Aging process will compare the current retention settings of the storage policy copy to jobs in protected storage. Any jobs that are eligible to be aged will be marked aged. By default the data aging process runs every day at 12PM. This can be modified and multiple data aging operations can be scheduled if desired. Pruning is also part of the data aging process. How pruning occurs depends on whether jobs are on disk or tape. Jobs on a disk library will be pruned, which physically deletes the data from the disk. If deduplication is being used, job blocks that are not being referenced by other jobs will be deleted. If Managed Disk Space is enabled, the jobs will remain until the disk library reaches the upper watermark threshold defined in the Library Properties.
For tape media, when all jobs on the tape have been marked as aged, and there are no auxiliary copies dependent on the jobs, the tape will be moved into a scratch pool and data will be overwritten when the tape is picked for new data protection operations. In this case the data is not deleted and can still be recovered by browsing for aged data, until the tape label is overwritten. If the storage policy copy option 'mark media to be erased after recycling' has been selected or if the tape is manually picked to be erased, the data will physically be destroyed. This is done by overwriting the OML header of the tape, making the data unrecoverable through the CommCell environment or using Media Explorer.
Rules for Aging Data There are several rules that are applied during the data aging process: 1. Both days and cycles criteria must be met for aging to occur. 2. Data is aged in complete cycles. 3. Days criteria is not dependent on jobs running on a given day.
Rule 1: Both CYCLES and DAYS criteria must be met before data will age
Commvault software uses AND logic to ensure that both retention parameters are satisfied. Another way of looking at this is that the longer of the two values of cycles and days within a policy copy will always determine the time data will be retained for.
Rule 2: Data is aged in complete cycles
Backup data is managed within a storage policy copy as a cycle or a set of backups. This will include the full, which designates the beginning of a cycle, and all incrementals or differentials. When data aging is performed and retention criteria allow for data to be aged, the entire cycle is marked as aged. This process ensures that jobs will not become orphaned, resulting in dependent jobs (incremental or differential) existing without the associated full.
Rule 3: Day is based on a 24 hour time period
A day will be measured as a 24 hour time period from the start time of a data protection operation. Days are considered constants since, regardless of a backup being performed or completed successfully, the time period will always be counted. If a backup fails, backups are not scheduled, or the power goes out, a day will still count towards retention. This is why it is so critical to measure retention in cycles and days. If retention was just managed by days and no backups were run for a few weeks, all backup data may age off leaving no backups.
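A simplified model of the three rules is sketched below, assuming each cycle records its start time and whether it is complete. It is only an illustration of the AND logic and whole-cycle aging, not the actual data aging algorithm, and the way the newer-cycle count is compared to the cycles setting is an assumption.

from datetime import datetime, timedelta

def cycles_eligible_to_age(cycles, retain_days, retain_cycles, now=None):
    """Return the complete cycles that could age under days AND cycles retention.

    cycles: list of dicts like {"start": datetime, "complete": bool}, oldest first.
    """
    now = now or datetime.now()
    complete = [c for c in cycles if c["complete"]]        # Rule 2: only whole, complete cycles age
    aged = []
    for index, cycle in enumerate(complete):
        newer_complete_cycles = len(complete) - index - 1
        days_met = now - cycle["start"] >= timedelta(days=retain_days)   # Rule 3: days always count
        cycles_met = newer_complete_cycles >= retain_cycles
        if days_met and cycles_met:                        # Rule 1: AND logic - both criteria must be met
            aged.append(cycle)
    return aged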
Additional Retention Settings
Spool Copy
The Spool Copy option can be used to take advantage of fast disk read/write access and its multi-streaming capabilities when there is limited capacity available on the disks. A spool copy is a no retention copy. Data is spooled to disk and then copied to a secondary copy. Once the data is successfully copied to the secondary copy, the data on disk will be pruned and the space will immediately be available for new backups. The spool copy option is not available when using deduplication.
Extended Retention Standard retention allows you to define the length of time based on cycles and days that you want to retain data. Extended retention allows you to define specific retention in days that you want to keep full backups for. It allows you to extend the basic retention by assigning specific retention to fulls based on criteria configured in the extended retention settings. Basically it allows you to set a grandfather, father, son tape rotation scheme.
Managed Disk Space Managed Disk Space is a feature used with disk libraries not using Commvault deduplication, which allows data to reside on the disk beyond its retention settings. This allows you to increase the chances of recovering data faster from primary storage on disk without changing retention settings. Managed data on disk is treated the same as retained data for data recovery.
Managed data will be held on the disk beyond the standard retention settings until an upper threshold is reached. A monitoring process will detect data exceeding the upper threshold and then delete aged jobs from the media until a lower threshold is reached. It is important to note that only aged jobs will be pruned. If all aged jobs are pruned and the lower threshold is not met, no more pruning will occur.
Retention Set Through Schedules
Retention can be extended beyond the defined storage policy primary copy retention through a schedule or schedule policy. This is done by setting the Extend Job Retention options in the Media tab of Advanced Options. The default setting is to use storage policy primary copy retention settings. You can set schedule based retention for a specified number of days or infinitely retain the data. Retention settings at the schedule level cannot be shorter than the retention defined in the storage policy primary copy.
Retention Applied to Job in Policy Copy
Retention for a job in a primary or secondary storage policy copy can be retroactively modified by going to the job history for the copy. This can be done by selecting the storage policy copy where the job is located, right-clicking the copy and selecting View | Jobs. Specify the time range of the job then click OK. Right-click on the job and select Retain Job. The job can be retained infinitely or until a specific date. The job icon will change to reflect that the job has been pegged down.
Object Based (Subclient) Retention
Object based retention uses the principles of synthetic full backups to create, in a way, a carry forward image file. When an object is deleted from the production environment, the object is logged with the point in time it was deleted. The object will be carried forward with each subsequent synthetic full backup based on the number of days set in the subclient retention tab. When the days have been exceeded, the object will no longer be carried forward, and once the synthetic full exceeds storage policy copy retention it is pruned from protected storage. So if the subclient retention is set to 90 days, once the item is deleted it will be carried forward with each synthetic full backup for a period of 90 days.
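A small sketch of the carry-forward decision follows, assuming the deletion time of each item is logged as described above; the data structure and names are illustrative only.

from datetime import timedelta

SUBCLIENT_RETENTION_DAYS = 90   # example value from the text above

def carry_forward(deleted_items, synthetic_full_time):
    """Return deleted items still carried into the next synthetic full (conceptual)."""
    cutoff = timedelta(days=SUBCLIENT_RETENTION_DAYS)
    return [item for item in deleted_items
            if synthetic_full_time - item["deleted_at"] <= cutoff]

# An item deleted 30 days before the synthetic full is still carried forward;
# one deleted 120 days before it is not.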
The correct answer is D. Storage policy copy retention uses AND logic: both cycles and days criteria must be met for jobs to age. http://documentation.commvault.com/commvault/v11/article?p=features/data_aging/data_aging_overview.htm
Test Tip: Questions such as this use similar answers with subtle variations. It is important to carefully read each answer. The key difference between answers C and D is the word 'or' versus 'and'.
The correct answer is C. Since the basic retention is 30 days and the end of month fulls need to be retained for 90 days, the simplest approach is to use extended retention. http://documentation.commvault.com/commvault/v11/article?p=features/data_aging/data_aging_advanced.htm
Test Tip: This question does not indicate that the disk has deduplication enabled. This is just a logic question and a test of awareness of Commvault retention options. Whether the library is disk or tape, the answer is the same. The trick to this question is that when using deduplication enabled disk libraries, it is strongly NOT recommended to use extended retention rules.
The correct answer is B. If a subclient is re-associated to a different storage policy, the old storage policy will never have additional full backups written and managed by it. Therefore, theoretically the data would never age in the old storage policy. To avoid this issue, when a subclient is re-associated to a new storage policy, data in the old storage policy will age by days only.
http://documentation.commvault.com/commvault/v11/article?p=features/data_aging/faq.htm
Module 3: Clients and Agents
Clients
Commvault software uses agents to communicate with file systems and applications that require protection. Any server with an Agent installed on it is referred to as a Client. Each Agent contains code that is used to communicate directly with the system requiring protection. The Agent will communicate using APIs or scripts that are native to the file system or application. For example, a Windows 2008 file system can use VSS to protect file data, so the Windows Agent will have the option to enable VSS during backup operations. The Agent will then have a backup set defined. The backup set is a complete representation of all data the agent is responsible to protect. Within the backup set, subclients are used to define the actual data requiring protection. By default, a Default Subclient is used to define ALL data requiring protection within the backup set. Additional subclients can be created to define specific content requiring protection. When content is defined within the user defined subclient, it will automatically be excluded from the default subclient. An example of a custom subclient could be defining a specific drive containing user data where VSS will be initiated for the drive during backup jobs to ensure all open files are protected.
Client Administration Check client connectivity (check readiness) Connectivity to a client and all Storage Policy data paths for configured subclients within the client can be checked and reported on at the client level. This is actually a mini version of the CommCell Readiness report. It will ensure that the CommServe server can communicate with the client machine. It will also check data path connectivity to all MediaAgent and library paths for Storage Policies that are associated with subclients configured for the client.
Releasing a License Releasing a license is a logical operation that can be applied to clients or agents. Releasing a license will ‘grey out’ the client or agent so data can still be recovered. Release License key points:
The client or agent will appear greyed out in the CommCell console. This means that data will still be retained in protected storage and can be restored (out of place), but the client cannot be backed up unless the license is reapplied.
If the CommCell licensing structure is agent based, the license will be available to install on another system.
If the CommCell licensing structure is capacity based, the size of data for the deconfigured client or agent will not count against the capacity usage of the CommCell environment.
Released licenses can be re-applied to the client by using the Reconfigure option.
Agents and the client can be completely removed from the CommCell environment by using the Delete option.
Client Activity Control Data protection and data recovery jobs can be enabled or disabled in the Activity Control tab in the Client Properties. If activity is disabled an Enable after a Delay button will be displayed. This can be used to automatically enable the activity on a specific date and time. Client activity control is useful when a client will be offline, since any scheduled operations are by default ignored if activity is disabled.
Client Update Status Update status for clients can be viewed by selecting the Client Computers entity in the CommCell Browser or through the Client Properties page in the Version tab. Summary Update Status View for all Clients The current Commvault software version, service pack level, update status and operating system platform can be viewed for all Clients by selecting Client Computers icon in the CommCell Browser. This will display all clients in the CommCell environment providing summary information on their status. Checking Detailed Update Status for a Client The Version tab will display the current version of software, service pack level and status of each package installed on a client.
Client Computer Groups
Client Computer Groups can be used to group like clients to simplify CommCell administration. Clients can be added to one or more computer groups. There are several methods for adding clients to groups:
During installation the client group can be selected.
In the client group properties select the clients and include them in the group.
In the client properties in the Group tab select the group or groups to add the client to.
Client Computer Groups provide the following benefits:
Simplified navigation when locating clients within the CommCell console.
Configuring user group security to manage entire computer groups.
Simplified activity control such as enabling or disabling data protection or recovery for an entire group.
Applying updates, bandwidth throttling, firewall configurations, etc. to entire groups at the same time.
Executing schedule policies at the group level.
Assigning computer groups when configuring reports and alerts will automatically add/remove clients when changes are made to the group.
The correct answer is C. Connectivity to a client can be tested using the check readiness option. This will check the readiness of any client regardless of what operating system is used. http://documentation.commvault.com/commvault/v11/article?p=features/reports/types/readiness_check_overview.htm
Test Tip: In this question, three of the four answers are almost identical. A and B narrow the criteria of the check readiness by specifying an operating system where C does not. Be aware of these questions and double check the answers, as these types of questions are most commonly challenged by test takers with the comment 'there is more than one answer'. In this case, there is only one answer.
The correct answer is A. Audit trails are configured for the CommCell environment and not specific entities. All other answers can be used with client computer groups.
Client computer groups: http://documentation.commvault.com/commvault/v11/article?p=features/client_group/client_computer_group_how_to.htm
Audit trail: http://documentation.commvault.com/commvault/v11/article?p=features/audit_trail/audit_trail.htm
File System Agent and Subclient Configuration Filtering Filters are defined at the global and subclient level to remove the specific folders and objects that do not require protection. Global filters defined in the Global Filters applet in Control Panel can be automatically or manually associated with subclients. If global filters are associated with a subclient, the choice to override the global filters will be available. Global and subclient filter key points:
Global filters can be defined for Windows, UNIX, Exchange and Virtual Servers.
To enforce global filters on all subclients, enable the Use Global Filters on all Subclients check box.
Subclient settings to inherit global filters are configured as:
On - always use global filters
Off - never use global filters
Cell Level Policy - only use global filters if the Use Global Filters on all Subclients check box has been enabled.
Subclient filters include exclusion and exception filter entries:
Exclusion filters determine which folders and/or objects will be excluded from the subclient.
Exceptions are an override for exclusion and global filters. This means any folders and/or objects defined in the exception entry will be protected by the subclient.
Subclients
Subclients are used to define data that will be protected in a containerized format. Each subclient container will manage specific content within a backup set. Each backup set will have one or more subclients. Key points for subclients:
Subclient contents can be defined as drives, folders, files or UNC paths.
Storage policy used to manage the subclient can be defined.
Scripts can be inserted prior to scan and after backups.
Filter settings can be configured for global and local filters.
IntelliSnap® technology can be enabled and the storage array managing subclient data can be defined.
Data transfer options including compression, deduplication and encryption can be configured.
Default Subclient
By default most Agents will have a Default Subclient. During the initial installation of the agent software an option to associate agent data with a storage policy is provided. This determines the storage policy that will manage the Default Subclient data. All subclients must be associated with a storage policy to protect the data. The default subclient acts as a catch all for all data managed within a backup set. This means the default subclient will automatically detect and protect all data the agent is responsible to protect. When custom subclients are defined, any data managed by the custom subclient will automatically be excluded from the default subclient. This is the concept of mutual exclusiveness of contents within a backup set. Data is mutually exclusive to the subclient in which it is defined and data cannot be defined in multiple subclients within the backup set. The concept of Commvault software is to Copy Once and Reuse Extensively (CORE). In other words, protect the data to the storage policy and use secondary copies to create additional copies of data. There are situations where protecting data from the source location multiple times may be required. To accomplish this you can create additional backup sets.
Modifying Contents of the Default Subclient
The content of the default subclient is represented by a slash (backslash for Windows based agents and forward slash for Linux/Unix based clients). It is strongly NOT recommended to modify the contents of the default subclient. Modifying this content will disable the auto detect functionality of the default subclient. If this is done, any future content required for protection must be explicitly added to the subclient contents.
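The mutual exclusiveness rule can be pictured as simple set arithmetic: the default subclient effectively covers everything the agent can see minus whatever custom subclients claim. This is a conceptual sketch with made-up drive letters and subclient names, not how the software stores content.

def default_subclient_content(all_detected_volumes, custom_subclients):
    """Content effectively protected by the default subclient (conceptual model)."""
    claimed = set()
    for contents in custom_subclients.values():
        claimed.update(contents)   # custom subclient content is excluded from the default automatically
    return sorted(set(all_detected_volumes) - claimed)

volumes = ["C:\\", "D:\\", "E:\\"]
custom = {"UserData": ["D:\\"]}
print(default_subclient_content(volumes, custom))   # ['C:\\', 'E:\\']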
Defining Contents The Contents tab is used to define the content for the subclient. There are several important points that should be understood before configuring subclient content:
The contents of a Default Subclient for most backup agents is a \ (Windows based) or / (Linux/Unix based). This represents an auto detection functionality that will protect any newly added volumes. If the default subclient's content is modified, the \ or / will be removed and auto detection will be disabled. It is NOT recommended that the contents of the default subclient be modified. If only certain drives are to be protected, use the Filter tab to exclude the drives.
Content can be added by browsing (Browse button) or manually entered (Add Paths button). Use the Add Paths button to enter UNC paths to protect data on systems that do not have Commvault agents installed. An impersonate user box will prompt you to enter a user account with proper permissions to read the data from the shared location. This feature is only recommended when protecting small amounts of data.
The option to Backup System State can be used to protect system state data for Windows servers. By default, the default subclient will automatically protect system state data. If required, a separate subclient can be defined to specifically protect system state. Only one subclient within a backup set can be designated to protect system state data.
Data Readers
Data Readers determine the number of concurrent read operations that will be performed when protecting a subclient. For file system agents, by default, the number of readers permitted for concurrent read operations is based on the number of physical disks available. The limit is one reader per physical disk. If there is one physical disk with two logical partitions, setting the readers to 2 will have no effect. Having too many simultaneous read operations on a single disk could potentially cause the disk heads to thrash, slowing down read operations and potentially decreasing the life of the disk. The Data Readers setting is configured in the General tab of the subclient and defaults to two readers.
Allow multiple readers within a drive or mount point
When a disk array containing several physical disks is addressed logically by the OS as a single drive letter, the Allow multiple readers within a drive or mount point option can be used as an override. This will allow a backup job to take advantage of the fast read access of a RAID array.
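The reader limit can be modeled as below; this is a conceptual sketch only, where the override flag corresponds to the Allow multiple readers within a drive or mount point setting and the function name is illustrative.

def effective_readers(configured_readers, physical_disks, allow_multiple_per_disk=False):
    """Concurrent read streams actually used for a file system subclient (conceptual)."""
    if allow_multiple_per_disk:
        return configured_readers                    # RAID array behind one drive letter: use them all
    return min(configured_readers, physical_disks)   # otherwise limited to one reader per physical disk

print(effective_readers(2, physical_disks=1))                                  # 1 - extra reader has no effect
print(effective_readers(4, physical_disks=1, allow_multiple_per_disk=True))   # 4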
Storage Policy Association
Traditionally, data protection has been approached at the server level, where an entire server is protected as one piece. Commvault software handles data protection at the subclient level, where each subclient, when protected, runs as a separate job. If three subclients are defined within a backup set and they all run at the same time, three jobs will appear in the Job Controller. Content defined at the subclient level is directed to storage through a storage policy. This means that data defined in different subclients on the same server can be directed to different storage and have different retention settings.
Pre/Post Scripts
Pre/Post Process scripts can be used to quiesce applications prior to protection. This is very useful when protecting proprietary database systems or for quiescing databases within virtual machines prior to using the Commvault Virtual Server Agent to snap and back up the virtual machine.
The correct answer is A The subclient data readers option determines the maximum number of concurrent virtual machines to backup. If there are more streams available than virtual machines requiring protection, individual VMs are multi-streamed at the disk level. http://documentation.commvault.com/commvault/v11/article?p=products/vs_vmware/config_adv.htm
The correct answer is C The Optimized Scan method, configured in the subclient by selecting 'Optimized Scan', provides the fastest scan method by maintaining a small metadata database on the volume. When the scan phase is initiated, the database is queried to determine which items should be backed up. http://documentation.commvault.com/commvault/v11/article?p=products/windows/c_file_scan_methods.htm
The correct answer is D When a custom subclient is created, the contents of that subclient are automatically removed from the default subclient. If the custom subclient is later deleted, the default subclient, acting as a catch-all, will automatically protect all data defined in the deleted subclient. http://documentation.commvault.com/commvault/v11/article?p=products/windows/t_modify_default_subclient.htm
The correct answer is D Filters are configured at the global level, in the Control Panel, and at the subclient level. http://documentation.commvault.com/commvault/v11/article?p=features/filters/content_filters_regular_expressions.htm
Virtual Server Agent (VSA)
There are three primary methods Commvault software can use to protect virtual environments:
Virtual Server Agent (VSA)
Agents installed within virtual machines
IntelliSnap® Technology
Virtual Server Agent (VSA)
The Commvault Virtual Server Agent (VSA) interacts with the hosting hypervisor to provide protection at the virtual machine level. This means agents do not need to be installed directly on the virtual machines, although installing restore-only agents will provide a simplified method for restoring data back to the VM. Depending on the hypervisor application being used and the virtual machine's operating system, different features and capabilities will be available. The VSA interfaces with the hypervisor's APIs and provides capabilities inherent to the application. As hypervisor capabilities improve, the Commvault VSA agent will be enhanced to take advantage of new capabilities.
Agent Based Protection
Agent based protection uses Commvault agents installed directly in the virtual machine. When an agent is installed in the VM, it will appear in the CommCell console just like a regular client and the functionality will be exactly the same as an agent installed on a physical host. The main advantage of this configuration is that all the features available with Commvault agents can be used to protect data on the VM. For applications, using agents provides complete application awareness for all data protection operations.
Hardware Snapshots with the IntelliSnap® Feature
The Commvault IntelliSnap® feature provides integration with supported hardware vendors to conduct, manage, and back up snapshots. This technology can be used to snap VMs at the data store level and back them up to protected storage. The process for protecting virtual machines is similar to performing snapshots with the VSA agent directly interfacing with the hosting hypervisor application. VSA will first quiesce the virtual machine and then the IntelliSnap® feature will use vendor APIs to perform a hardware snapshot of the data store. The data store will then be mounted on an ESX proxy and all VMs registered. The VMs can then be backed up and indexes generated for granular level recovery. The snapshots can also be maintained for live browse and recovery. The backup copies can be used for longer term retention and granular browse and recovery. Using the VSA and IntelliSnap agents provides a high availability, disaster recovery, and data recovery solution. Specific configuration and hardware are required to implement this solution.
Virtual Server Agent Backup Process
VSA works by communicating with the hosting hypervisor to initiate software snapshots of virtual machines. Once the VMs are snapped, VSA will back them up to protected storage. The following steps illustrate the process of backing up VMware virtual machines:
1. The Virtual Server Agent communicates with the hypervisor instance to locate virtual machines defined in the subclient that require protection.
2. Once the virtual machines are located, the hypervisor prepares the virtual machine for the snapshot process.
3. The virtual machine is placed in a quiescent state. For Windows VMs, VSS is engaged to quiesce disks.
4. The hypervisor then conducts a software snapshot of the virtual machine.
5. The virtual machine metadata is extracted.
6. The backup process then backs up all virtual disk files.
7. Once the disks are backed up, indexes are generated for granular recovery (if enabled).
8. The hypervisor then deletes the snapshots.
Virtual Server Agent Roles
Virtual Server Agent (VSA) proxies are defined at the instance level of the VSA pseudo-client. The top listed VSA proxy is designated as the coordinator and all other proxies are designated as data movers. The coordinator is responsible for communicating with the hypervisor to get information about VMs and distribute VM backups to data mover proxies. Data mover proxies communicate with the coordinator proxy and provide information on available resources and job status. In the event that the coordinator proxy is unavailable, the next proxy in the list assumes the role of coordinator. If a data mover proxy becomes unavailable, the coordinator proxy assigns jobs to other available proxies.
Virtual Machine Backup Process
When a VSA subclient backup starts, the coordinator receives a list of all virtual machines defined in the subclient. Based on a defined set of rules, the coordinator creates a dynamic VM queue to determine the order in which virtual machines will be protected and which VSA proxies will back up each virtual machine.
Subclient Data Readers
The Data Readers setting in the Advanced tab of the subclient defines the maximum number of streams used for the backup. When the job starts, if there are more VMs than available streams, each VM is allocated a single stream. If there are more streams than VMs, the coordinator automatically instructs the data mover proxy to use multiple streams for the VM backups. Depending on the number of available streams, each virtual disk in the VM can be backed up as a single stream. This process is dynamic, so as a job progresses and more streams become available and fewer VMs require protection, multiple streams can be used to protect individual VMs.
DataStore Distribution If VMs within a subclient exist across multiple DataStores, the coordinator will assign VMs to proxies, one VM per DataStore until the maximum stream count is reached. Each VM will be assigned to a different data mover proxy, balancing stream loads across proxies based on proxy resources. This will distribute the load across multiple DataStores which will improve backup performance and maintain a healthy DataStore state. In addition to the subclient Data Readers setting, a hard limit can be set for the maximum number of concurrent VMs that can be protected within a single DataStore using the nVolumeActivityLimit additional setting.
VM and VSA Proxy Distribution Rules
DataStore distribution is the primary rule that determines the order in which VMs will be backed up. Additional rules that determine VM backup order are (illustrated in the sketch after this list):
1. Number of proxies available to back up a VM. The fewer proxies available, the higher in the queue the VM will be. This is also dependent on transport mode. If the transport mode is set to Auto (default), SAN has the highest priority, followed by HotAdd and then NBD mode. If a specific transport mode is defined in the subclient, only proxies capable of protecting the VM can be used; this could reduce the number of available proxies, which could result in a higher queue priority.
2. Number of virtual disks. VMs with more virtual disks will be higher in the queue.
3. Size of virtual machine. Larger VMs will be higher in the queue.
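The sketch below models only the three secondary ordering rules listed above (DataStore distribution remains the primary rule). It is a hypothetical illustration, not Commvault code, and the VM attributes shown are invented.

```python
# Illustrative sketch of the VM queue ordering rules described above.
# Not Commvault code; field names and sample values are hypothetical.

from dataclasses import dataclass

@dataclass
class VMInfo:
    name: str
    eligible_proxies: int   # proxies able to protect this VM (after transport mode filtering)
    virtual_disks: int
    size_gb: int

def queue_priority(vm: VMInfo):
    # Fewer eligible proxies -> earlier; more disks -> earlier; larger -> earlier.
    return (vm.eligible_proxies, -vm.virtual_disks, -vm.size_gb)

vms = [
    VMInfo("web01", eligible_proxies=3, virtual_disks=1, size_gb=80),
    VMInfo("db01",  eligible_proxies=1, virtual_disks=4, size_gb=500),
    VMInfo("app01", eligible_proxies=3, virtual_disks=2, size_gb=200),
]

for vm in sorted(vms, key=queue_priority):
    print(vm.name)   # db01, app01, web01
```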
Stream Allocation and Proxy Throttling During backup operations, the coordinator proxy will gather information on each data mover proxy to determine the default maximum stream count each proxy can handle. This will be based on the following:
10 streams per CPU.
1 stream per 100 MB available RAM.
When the coordinator assigns jobs to the data mover proxies, it will evenly distribute jobs until the default maximum number of streams on a proxy is reached. Once the threshold is reached, it will no longer assign additional jobs to that proxy. If all proxies are handling their maximum number of streams and there are still streams available, the coordinator will assign additional jobs to proxies using a round-robin method.
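A minimal sketch of the per-proxy ceiling described above, assuming the effective default is the lower of the CPU based and RAM based limits; the proxy specifications are hypothetical and this is not Commvault code.

```python
# Illustrative sketch of the default per-proxy stream ceiling described above
# (10 streams per CPU, 1 stream per 100 MB of available RAM).
# Assumption: the effective ceiling is the lower of the two limits.

def default_max_streams(cpus: int, available_ram_mb: int) -> int:
    return min(10 * cpus, available_ram_mb // 100)

proxies = {"proxy1": (4, 8192), "proxy2": (2, 2048)}   # name: (CPUs, available RAM in MB)

for name, (cpus, ram) in proxies.items():
    print(name, default_max_streams(cpus, ram))
# proxy1 -> min(40, 81) = 40 streams
# proxy2 -> min(20, 20) = 20 streams
```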
Throttling can be hard set on a per proxy basis using the following additional settings (registry keys):
nStreamsPerCPU limits the number of streams per CPU on the proxy.
nMemoryMBPerStream sets the required memory on the proxy for each stream.
nStreamLimit sets a limit on the total number of streams for a proxy.
bHardStreamLimit sets a hard stream limit across all proxies within the VSA instance.
Transport Modes
VMware® Transport Modes
The VMware VADP framework provides three transport modes to protect virtual machines:
SAN transport mode
HotAdd mode
NBD and NBD SSL mode
Each of these modes has advantages and disadvantages. Variables such as physical architecture, source data location, ESXi resources, network resources, and VSA proximity to MediaAgents and storage will all have an effect on determining which mode is best to use. It is also recommended to consult with Commvault for design guidance when deploying the Commvault software in a VMware environment.
SAN Transport Mode
SAN transport mode can be used on a VSA proxy with direct Fibre Channel or iSCSI access to snapshot VMs in the source storage location. This mode provides the advantage of avoiding network movement of VM data and eliminates load on production ESXi servers. Virtual machines can be backed up through the VSA and to the MediaAgent. If the VSA is installed on a proxy server configured as a MediaAgent with direct access to storage, LAN-free backups can be performed.
HotAdd Mode
HotAdd mode uses a virtual VSA in the VMware environment. This requires all data to be processed and moved through the VSA proxy on the ESXi server. HotAdd mode has the advantage of not requiring a physical VSA proxy and does not require direct SAN access to storage. It works by 'hot adding' virtual disks to the VSA proxy and backing up the disks and configuration files to protected storage. A common method of using HotAdd mode is to use Commvault deduplication with client side deduplication, DASH Full, and an incremental forever protection strategy. Using Changed Block Tracking (CBT), only changed blocks within the virtual disk will have signatures generated and only unique block data will be protected.
NBD Mode
NBD mode uses a VSA proxy installed on a physical host. VSA connects to VMware and snapshots are moved from the VMware environment over the network to the VSA proxy. This method requires adequate network resources and it is recommended to use a dedicated backup network when using NBD mode.
The correct answer is C Commvault writes data in chunks. For disk library, the default chunk size is 2 GB. When a chunk completes, it is logged in the database and information is written to index files. In the case of committing a VSA backup, at least one chunk, 2 GB of data and one full VM must be completed. http://documentation.commvault.com/commvault/v11/article?p=products/vs_xen/t_commit_backup_job.htm
The correct answer is A Files and folders cannot be filtered from a VMware VSA backup. Disks, VMs and DataStores can be filtered. http://documentation.commvault.com/commvault/v11/article?p=products/vs_vmware/config_adv.htm
The correct answer is A During VSA backups, the VSA coordinator dynamically allocates resources based on all VMs currently queued to back up. This process is not on a subclient by subclient basis. So if two VSA subclients defined within the same VSA instance are running at the same time, the VSA coordinator will dynamically assign resources for all VMs within both subclients. http://documentation.commvault.com/commvault/v11/article?p=products/vsa/c_vsa_load_balance_v11.htm
Module 4: Data Protection and Recovery
Data Protection Jobs Using Schedules and Schedule Policies
Commvault® software uses a standard scheduler for scheduling all operations within the CommCell® environment. The following operations can be scheduled within a CommCell environment:
1. Data protection operations
2. Data recovery operations
3. CommServe DR backup
4. Reports
5. Data aging
The correct answer is D The Start New Media and Mark Media Full settings in the advanced backup options can be used to isolate specific jobs on tape media. This is commonly done when specific jobs, in this case weekend full backups, need to be isolated on tape for export to DR and archive facilities. Another common use case is isolating jobs with extended retention rules on separate tape media.
The correct answer is C All of the above job types can be scheduled with a schedule policy except for restore jobs.
The correct answer is A If stream resources are not available when a job enters the backup phase, the job status will show waiting and the delay reason will display 'no resources'. Based on job priorities, when stream resources become available, streams will be allocated to jobs in a waiting state. http://documentation.commvault.com/commvault/v11/article?p=features/job_management/job_management_advanced.htm
Job Management
Controlling Job Activity
Commvault software offers a great deal of flexibility for controlling job activity. Job activity key points:
If activity is disabled on a parent object of the CommCell tree, activity is automatically disabled for any child objects.
Activity can be disabled until manually enabled or set to automatically re-enable at a specific date and time.
If activity is enabled for a parent object in the CommCell tree, activity can be enabled or disabled for any child objects.
When activity is disabled or enabled, the icon where the activity was set will change to reflect the current activity state.
What Activity can be Controlled
CommCell Level
All activity for the entire CommCell environment can be enabled / disabled.
Disabling activity will disable all activity for the CommCell.
Enabling (default) allows activity to be controlled at child levels.
Data management (data protection) can be enabled or disabled.
Activity that can be enabled or disabled, by level (CommCell / Client Group / Client / Agent / Backup Set / Subclient):
All activity: Yes / No / No / No / No / No
Data Management: Yes / Yes / Yes / Yes / No / Yes
Data Recovery: Yes / Yes / Yes / Yes / No / Yes
Auxiliary Copy: Yes / No / No / No / No / No
Data Aging: Yes / Yes / No / No / No / No
Schedule Activity: Yes / No / Yes (enabled/disabled through the schedule view for the client) / Yes (through the schedule view for the agent) / Yes (through the schedule view for the backup set) / Yes (through the schedule view for the subclient)
Enabling or Disabling CommCell Activity
Disabling Job Activity
If job activity is disabled at any level within the CommCell tree, it will automatically disable activity for any child objects within the tree. Activity cannot be overridden at any child levels. Example: a client computer group representing clients for a specific location is disabled for maintenance. By disabling activity at the group level, all clients within the group will automatically be disabled.
Enabling Job Activity
If job activity is enabled at any level within the CommCell tree, activity can be disabled at any child level object. This provides granular activity control down to the subclient level. Example: a specific client has a maintenance window scheduled. By disabling the activity for that client, no operations will run. All other clients within the group will operate normally.
Enabling After a Delay
If activity is disabled at any level in the CommCell tree, the option Enable after a Delay can be used to set a date and time when activity will automatically be re-enabled.
Job Priorities
Commvault software implements a robust method for configuring job priorities. There are three different number values that make up a job priority: the job type, the client, and the agent. The three numbers are combined to form a three digit priority level. In Commvault software, the value zero has the highest priority and the value nine has the lowest priority.
Each job type has a specific priority value associated with it. CommCell administrative operations such as data aging and the CommServe DR backup have a zero level priority. Restore operations also have a zero level priority. Backup operations have a one level priority. Auxiliary copy jobs have a two level priority. Client priorities are configured for individual clients in the Job Options tab of the client properties. The default client priority is six. Agent priorities are configured for each agent type in the Job Management applet in Control Panel and have a default priority of six.
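As an illustration of how the three digits combine, the sketch below assumes the job type digit is the most significant, followed by the client and agent digits. It is a worked example of the description above, not Commvault code.

```python
# Illustrative sketch of combining the three priority digits described above
# (lower number = higher priority). Digit ordering is an assumption for illustration.

JOB_TYPE_PRIORITY = {"restore": 0, "backup": 1, "auxiliary_copy": 2}

def combined_priority(job_type: str, client_priority: int = 6, agent_priority: int = 6) -> int:
    """Concatenate job type, client, and agent digits into one three digit value."""
    return JOB_TYPE_PRIORITY[job_type] * 100 + client_priority * 10 + agent_priority

print(combined_priority("backup"))                       # 166
print(combined_priority("restore"))                      # 66, i.e. priority 066
print(combined_priority("backup", client_priority=2))    # 126 -> outranks the default 166
```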
Operation Rules
Operation windows allow the Commvault administrator to designate blackout windows in which designated operations will not run. These rules can be set at the global, client computer group, client, agent, and subclient levels. Different operation windows can be defined for data protection jobs, recovery jobs, copy jobs, and administrative jobs. Each defined operation window can have one or more Do Not Run intervals defined. Different operation rules can be specified for the same operation type to define specific time intervals for different days of the week.
Job starts during an operation window blackout period
If a job starts while an operation window is currently preventing jobs from running, it will be placed in a Queued state. This applies to both indexed and non-indexed jobs. Once the operation window is lifted and jobs are able to run, the jobs will change to a running state.
Job is running and an operation window blackout period becomes active
If a job is currently running and an operation window blackout period becomes active, indexed and non-indexed jobs will behave in the following ways:
Indexed based jobs will finish writing their current chunk and then be placed in a waiting state. When the blackout period is lifted, the job will continue from the most recent successfully written chunk.
Non-indexed jobs will continue writing and will ignore the operation window blackout period.
The correct answer is D When a restrictive activity control, such as disabling data protection jobs, is configured, it affects all child objects within the tree. If data protection activity is disabled at the CommCell level, all data protection activity within the CommCell is disabled. If data protection is disabled at a client group level, then all clients within that group will have data protection activity disabled. http://documentation.commvault.com/commvault/v11/article?p=features/job_management/c_job_management.htm
The correct answer is D Operation windows define 'do not run intervals' for data protection jobs. In this question, using the operation window will place the mailbox backup into a waiting state for the duration of the operation window, in this case 1:00 AM to 3:00 AM. http://documentation.commvault.com/commvault/v11/article?p=features/operation_window/c_cc_oper_win_overview.htm
The correct answer is B Answers A, B and C list the three job preemption control options in the Job Management configuration. Of the three, the only one enabled by default is 'restore preempts other jobs'.
http://documentation.commvault.com/commvault/v11/article?p=features/job_management/c_job_priority_overview.htm
Recovery Operations
Recovery Methods
Commvault software provides several different methods for data recovery. Depending on the situation, each of these methods will have advantages and disadvantages. Recovery methods can be divided into two main categories: indexed and non-indexed recovery.
Indexed Based Recovery Methods
Find
Provides the ability to enter search criteria such as myfile.txt or *.docx to search for specific files. This option is useful when you know specifically what file or files need to be restored.
Browse
Provides the ability to browse for all protected data using the folder hierarchal structure (like Windows Explorer). This method is useful when multiple files, folders or drives need to be restored.
Restore
Provides the ability to enter a drive, folder path or file path such as F:\users\jdoe that is required for restore. This option is useful when you know the specific location for data required for restore.
Full system restore
Provides the ability to restore an entire server in the event of a full system crash. This method requires that all data on the server including system state data has been protected. It also requires a base operating system to be installed and the Windows file system agent. This method is useful when the operating system can be reinstalled or if base images are being deployed to servers.
1-Touch restore
Provides the ability to restore an entire server in the event of a full system crash. This method uses a boot image to boot the system with a temporary operating system. It will then reinstall the operating system using unattended answer files, reinstall the file system agent and then initiate a full system restore. This method is useful when a system needs to be rebuilt with minimum administrator effort.
Non-indexed Based Recovery Method
Restore by Job
Provides the ability to perform a non-indexed restore using one or more streams for one or more jobs. This method is useful in disaster recovery scenarios when the index cache is not available. An indexed based restore would have to restore index files from media before the restore can begin, whereas this method immediately begins restoring data. This method is also beneficial when backup jobs have been multi-streamed, since multiple streams can be used to restore the data. Indexed based restore methods are always single streamed.
Using Find
Your first and best tool for locating data within protected storage is Find. The Find task is available at the backup set level and within the Restore Browse. Find can scan multiple indexes within a specified range of backup time, looking for a specific filename or pattern (wildcards). You can also limit the scope of the search to a specific folder or folder structure. Matching results are displayed, including all versions of the file within the specified time range. You can select to restore any, all, or specific version(s) of the found file. Note that if multiple versions are restored, each version will have a sequential number appended to the filename, starting with 1 for the most recent version of the file. With e-mail, you can also use Find to search on data within the From, To, and Received fields of the message. Note that not all agents support the Find task and the scope is restricted to a single client/backup set.
Using Browse
A browse operation allows the administrator to browse through the folder structure to select files and folders to restore. Multiple files and folders can be selected for recovery operations. If a parent object in the folder structure is selected, then all objects within the parent folder will automatically be selected for restore. When selecting a file that was modified multiple times during a cycle, a specific version of the file or all versions can be selected for recovery.
Image and Non-Image Browsing
Each time a backup operation is conducted, an image file is generated which represents a view of the folder structure at the time the backup occurred. When a browse operation is conducted, by default an image browse is used to present the folder structure as it existed at the browse date and time. This is done by displaying the folder structure from the most recent image file prior to the point in time being browsed. So if a browse is being conducted
on Wednesday at 2:00 PM and the most recent backup was run on Tuesday at 10:00 PM, the image file from the 10:00 PM backup will be used. This image browse method is used to produce a consistent structure of the data based on the browse time, which is important since folder structures may change from day to day during a cycle. When restoring an entire folder structure, it is important that the structure represents a specific point when a backup was conducted and does not represent data for the entire cycle. This is best explained using temporary files as an example. Temporary files and folders can be generated, deleted, and regenerated multiple times during a cycle. Each time a backup is run, the file and folder structure will be different based on which files existed at that specific point in time. When a restore operation is run, you wouldn't want every temporary file and folder to be restored, just a particular point or day.
Where the image browse option is good for restoring file and folder structures to a particular point-in-time, it could also result in deleted items not showing up when a browse operation is conducted. For example, if on Wednesday at 2:00 PM a browse operation is run using the Tuesday 10:00 PM image file, and a file or folder was deleted on Tuesday at 2:00 PM, the deleted files will not appear in the browse results. This is because when the 10:00 PM image file was created, the deleted files were not present. There are two methods to ensure deleted items are displayed during browse operations:
1. Select the Show Deleted Items check box - This runs what is referred to as a No-Image browse. In this case the image files are bypassed and the browse operation returns results from the index cache, which will show all items backed up from the point the full was run. This method is useful when recovering user data that has been deleted, but may not be a good choice when restoring an entire folder structure, especially if the folder structure was modified during the cycle.
2. Specify a date and time to browse - If you know when the data was deleted, specify that date and time in the browse options. So if data was deleted at 2:00 PM on Tuesday and you specify Tuesday as the browse date, then the most recent image file prior to the point the browse is being conducted would be Monday at 10:00 PM. Since the data was deleted on Tuesday, it would be present in the image file from Monday night and will show up in the browse results.
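The image selection logic above amounts to choosing the most recent image file at or before the browse time. The sketch below is a hypothetical illustration using the Monday/Tuesday/Wednesday example, not Commvault code.

```python
# Illustrative sketch of image browse selection described above:
# pick the most recent image file created at or before the browse time.
# Not Commvault code; timestamps are hypothetical.

from datetime import datetime

image_files = [                      # one image per backup in the cycle
    datetime(2016, 4, 18, 22, 0),    # Monday 10:00 PM
    datetime(2016, 4, 19, 22, 0),    # Tuesday 10:00 PM
]

def select_image(browse_time: datetime) -> datetime:
    candidates = [t for t in image_files if t <= browse_time]
    return max(candidates)

# Browsing Wednesday 2:00 PM uses the Tuesday 10:00 PM image:
print(select_image(datetime(2016, 4, 20, 14, 0)))   # 2016-04-19 22:00:00
# Specifying Tuesday 2:00 PM as the browse time uses the Monday 10:00 PM image:
print(select_image(datetime(2016, 4, 19, 14, 0)))   # 2016-04-18 22:00:00
```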
The correct answer is B The ‘image browse’ method uses the most recent image file for the browse operation as a filter to show only items present from the most recent backup preceding the browse time.
http://documentation.commvault.com/commvault/v11/article?p=features/browse/browsing_data_how_to.htm
The correct answer is A When a restore of a file is requested, the find operation provides the most efficient method to locate the file. The find feature allows the use of wildcards and a date range to be specified so the exact time or name of the file is not required. http://documentation.commvault.com/commvault/v11/article?p=features/find/find.htm
The correct answer is C The show deleted items option removes the image filter and browses directly to the index. This will display all items within the cycle whether they existed at the browse point or not. http://documentation.commvault.com/commvault/v11/article?p=features/browse/browsing_data_how_to.htm
Performance and Troubleshooting
Stream Management Primer
Data streams are what the Commvault software uses to move data from source to destination. The source can be production data or Commvault protected data. A destination stream will always be to Commvault protected storage. Understanding the data stream concept allows a CommCell environment to be optimally configured to meet protection and recovery windows. This concept is discussed in greater detail in the following sections.
Primary Protection Streams
Primary data protection streams originate at the source file or application that is being protected. One or more read operations will be used to read the source data. Once the data is read from the source, it is processed by the agent and then sent to the MediaAgent as Job Streams. The MediaAgent then processes the data, arranges the data into chunks, and writes the data to storage in Device Streams.
Secondary Copy Streams
For most data centers, performance is the key requirement when performing primary backups from the production servers. When copying data to secondary copies, media management becomes the primary focus. Grouping data with like retention and storage requirements allows for more efficient media management. Using multiple secondary copies allows data with like requirements to be managed on the same media set. Using options such as combine to streams and multiplexing for secondary copies improves overall performance and media management.
Data Readers
Data Readers determine the number of concurrent read operations that will be performed when protecting a subclient. For file system agents, by default, the number of readers permitted for concurrent read operations is based on the
number of physical disks available. The limit is one reader per physical disk. If there is one physical disk with two logical partitions, setting the readers to 2 will have no effect. Having too many simultaneous read operations on a single disk could potentially cause the disk heads to thrash, slowing down read operations and potentially decreasing the life of the disk. The Data Readers setting is configured in the General tab of the subclient and defaults to two readers.
Allow multiple readers within a drive or mount point
When a disk array containing several physical disks is addressed logically by the OS as a single drive letter, the Allow multiple readers within a drive or mount point option can be used as an override. This allows a backup job to take advantage of the fast read access of a RAID array. If this option is not selected, the Commvault software will use one read operation during data protection jobs.
Storage Policy Device Streams
Device streams are configured in the properties of the storage policy. The general rule of thumb is that the number of device streams configured in a storage policy should always equal the number of drives or writers of all libraries defined in the storage policy primary copy. Configuring fewer streams may be used to throttle parallel throughput, but that does not make the most efficient use of the devices, and there are other means to restrict allocation of devices. If the number of device streams is greater than the total number of resources available, no benefit will be gained. The Commvault software uses a throttling mechanism to always use the lowest stream value throughout the data movement process.
Disk Library Device Streams
For disk libraries, the number of device streams is based on the total number of mount path writers for all mount paths within the library. If a disk library has two mount paths with ten writers each, a total of twenty device streams can write to the library. It is important to note that since disk libraries allow multiple write operations, multiplexing is not recommended. By increasing the number of mount path writers, more job streams can be written to device streams on a one-to-one ratio. If network, MediaAgent, and disk resources are adequate, increasing the number of writers for a mount path will have a positive effect on data protection performance.
Tape Library Device Streams
For tape libraries, one sequential write operation can be performed to each drive. If there are eight drives in the library, then no more than eight device streams will be used. By default, each job stream will write to a device stream. To allow multiple job streams to be written to a single tape drive, multiplexing can be enabled. The multiplexing factor determines how many job streams can be written to a single device stream. If a multiplexing factor of four is set and there are eight drives, a total of thirty-two job streams can be written to eight device streams.
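The two device stream examples above reduce to simple arithmetic. The sketch below restates them with hypothetical library specifications; it is illustrative only, not Commvault code.

```python
# Illustrative arithmetic for the device stream examples above.
# Not Commvault code; library specs are hypothetical.

def disk_device_streams(mount_paths: int, writers_per_mount_path: int) -> int:
    # Disk library: one device stream per mount path writer
    return mount_paths * writers_per_mount_path

def tape_job_streams(drives: int, multiplexing_factor: int = 1) -> int:
    # Tape library: one device stream per drive; multiplexing packs
    # several job streams into each device stream
    return drives * multiplexing_factor

print(disk_device_streams(2, 10))   # 20 device streams
print(tape_job_streams(8))          # 8 job streams (no multiplexing)
print(tape_job_streams(8, 4))       # 32 job streams into 8 device streams
```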
Combine to Streams
A storage policy can be configured to allow the use of multiple streams for the primary copy backup. Multi-streaming of backup data is done to improve backup performance. Normally, each stream used for the primary copy requires a corresponding stream on each secondary copy. In the case of tape media for a secondary copy, multi-stream storage policies will consume multiple media. The Combine to Streams option can be used to consolidate multiple streams from source data onto fewer media when secondary copies are run. This allows for better media management and the grouping of like data onto media for storage.
Disk Library Block Size
Use 64 KB block size for target volume
The standard buffer size used by the Commvault software and most transmission methods is 64 KB. The standard block size used by the Windows NTFS file system is 4 KB. UNIX and NetWare file systems use a standard of 8 KB. The
low block size is for backward compatibility with smaller disks (< 2 GB). When writing a 64 KB buffer to smaller disk blocks, there is overhead involved in acquiring and writing each block. If you use a 64 KB block size, there is no additional overhead. CPU loading is reduced, with potential for improved I/O.
Commvault Performance Settings
Chunks
Commvault software writes protected data to media in chunks. During data protection jobs, as each chunk is written to media, indexes and the CommServe database are updated. For indexed based jobs this creates points at which a job can be resumed if network, client, or MediaAgent problems occur. In this case the job can continue from the most recent successfully written chunk. It also allows indexed jobs to be recovered up to the most recent chunk if the job fails to complete. This partial recovery is performed by using the Restore by Job option. In this case the data can be recovered up to the most recently written chunk.
As a general rule, the larger the chunk size, the more efficient the protection operation will be. When jobs are conducted over unreliable links, such as WAN backups, decreasing the chunk size may improve overall performance. If any disruption occurs during the job, any data written to media since the last chunk update to the index cache and CommServe server will have to be rewritten. In this case a smaller chunk size limits the amount of data that has to be re-transmitted over the link (a worked example follows the default sizes listed below). For reliable clients, MediaAgents, and networks, a larger chunk size will improve performance.
Chunk sizes are determined by the job type being performed and the media being used. Depending on the media type, the default chunk sizes are:
Disk storage uses 2 GB chunk sizes.
Tape media writes chunks based on whether the job is an indexed or non-indexed job type:
- 4 GB chunk sizes for indexed based backups.
- 16 GB chunk sizes for non-indexed jobs.
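The WAN trade-off described above can be shown with a quick calculation: only data written since the last committed chunk must be re-sent after a disruption. The sketch below is illustrative only; the transfer amounts are hypothetical and the modulo model is a simplification, not Commvault code.

```python
# Illustrative sketch of the chunk size trade-off on an unreliable link.
# Assumption: data up to the last committed chunk boundary does not need re-sending.

def uncommitted_gb(total_written_gb: float, chunk_size_gb: float) -> float:
    """Data written since the last committed chunk boundary -- this is
    roughly what must be re-sent if the link drops at this point."""
    return total_written_gb % chunk_size_gb

# The link drops after 5.7 GB of a job has been transferred:
print(uncommitted_gb(5.7, 4.0))   # about 1.7 GB to re-send with 4 GB chunks
print(uncommitted_gb(5.7, 0.5))   # about 0.2 GB to re-send with 512 MB chunks
```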
Chunk size can be configured using the following methods:
Tape Media - Chunk size can be set globally in the Media Management applet in Control Panel.
Tape or Disk Media - Chunk size can be set at the data path level in the storage policy copy. This is done in the data path properties in the Data Path tab of the policy copy. Settings at the data path level will override settings configured in Control Panel.
Blocks
Block size determines the size of the blocks written to protected storage. This setting can be modified to meet hardware requirements and to improve performance. The default block size Commvault software uses is 64 KB. Block size can be set in the data path properties in the Data Path tab of the storage policy copy. It is important to note that block size is hardware dependent. This means that increasing this setting requires all network cards, host bus adapters, switches, operating systems, and drives to support the block size. Consider this aspect not only in the production environment but also in any DR locations where recovery operations may be performed.
Setting Concurrency Parameters
Disk Writers
The previous section on disk I/O contains a more detailed discussion on configuring the number of disk writers. Setting the number of disk writers is a balance between MediaAgent capacity, drive configuration, data stream speed, and overall throughput.
Multiplexing Factor
The previous section on tape I/O contains a more detailed discussion on setting a multiplexing factor. Note that a multiplexing factor for disk libraries is unnecessary and not recommended. For deduplication enabled libraries, multiplexing cannot be enabled.
Network Agents
Network Agents are parallel processes that read/write buffers to the transmission path. If not fully used, they consume resources that might be used elsewhere. Each MediaAgent has a setting in its properties: Optimize for concurrent LAN backups. This is enabled by default. If this setting is enabled, the number of Network Agents is forced to 1 and changing the value has no effect.
Pipeline Buffers
Data pipe buffers determine the amount of shared memory allocated on each computer for data pipes. The size of each buffer is 64 KB. By default, 30 data pipe buffers are established on each server for data movement operations. You can increase the data transfer throughput from the client by increasing the number of data pipe buffers. When you increase the number of data pipe buffers, more shared memory is consumed by the client or MediaAgent. This may degrade the server performance. Therefore, before increasing the number of data pipe buffers, ensure there is adequate shared memory available. You can optimize the number of data pipe buffers by monitoring the number of concurrent backups completed on the server. Pipeline buffers are configured on a client or MediaAgent by adding the additional setting (registry key) nNumPipelineBuffers.
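To estimate the shared memory impact of raising nNumPipelineBuffers, the sketch below multiplies the 64 KB buffer size by the buffer count and the number of concurrent jobs. The assumption that each concurrent job consumes its own set of pipeline buffers is mine, and the figures are illustrative only, not Commvault sizing guidance.

```python
# Illustrative arithmetic for pipeline buffer memory consumption.
# Assumption: each concurrent job uses its own set of 64 KB buffers.

BUFFER_KB = 64

def shared_memory_mb(buffers_per_pipeline: int, concurrent_jobs: int) -> float:
    return buffers_per_pipeline * BUFFER_KB * concurrent_jobs / 1024

print(shared_memory_mb(30, 10))    # default 30 buffers, 10 concurrent jobs -> 18.75 MB
print(shared_memory_mb(300, 10))   # raising nNumPipelineBuffers to 300 -> 187.5 MB
```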
Page 115 115 of of 122
Education Services
Commvault Certified Professional Exam Prep
April 2016
The correct answer is E Block size is hardware dependent. All MediaAgent operating systems, network devices, and tape libraries that will be backing up or restoring data must support the configured block size.
The correct answer is A The CVPing utility is used to ping an IP address using a specific port. http://documentation.commvault.com/commvault/v11/article?p=features/network/network_tools.htm
The correct answer is A Data Interface Pairs (DIPs) are used to specify a source and destination network interface for two CommCell resources, such as a client and a MediaAgent, to communicate through. DIPs can be configured using host names or IP addresses. In this case, using an IP address will avoid DNS lookup issues. http://documentation.commvault.com/commvault/v11/article?p=features/data_interface_pairs/data_interface_pairs.htm
Thank You
At Commvault, we have a strong focus on providing quality education. We use a 3-tier student survey process to assess your learning experience, how the training affected your ability to be more productive using Commvault products, and finally how the training impacted your ability to enhance and improve the impact Commvault products have in your data management environment.
1. The initial 'Learning Experience' or course survey can be done as soon as your course is complete via Education Advantage. We'll show how to launch the survey on the next slide and take 10-15 minutes for all to complete it.
2. The 'Learning Effectiveness' follow-up survey is sent to all students about 6 weeks after your course via email. We are looking for your input on how you were able to apply the skills learned in your environment and whether there is content we need to add to our courses to better address your skills needs (something that may not be evident at course completion).
3. 3-6 months after completing your course you will receive the Education Value Survey via email from TechValidate. We use a third party to collect, audit, and validate these survey responses. This survey is used to assess the impact training has had on your business and data management environment. Were you better able to leverage Commvault products, with better performance and better resource usage? Were you better skilled, reducing reliance on customer support for product usage queries over time? Finally, we ask, based on your Commvault learning experience, how likely you would be to recommend Commvault training to a friend or colleague. This one question produces an overall learner satisfaction (or Net Promoter) score. This metric is used to measure (at a high level) how we are doing overall.
We strive to meet your highest expectations and earn the highest survey marks. If we fail to meet your expectations with the learning experience, please provide specific comments on how we can improve. We take all comments seriously and will adjust our offerings to better support your needs.
COMMVAULT.COM | 888.746.3849 | EA.COMMVAULT.COM ©2016 COMMVAULT SYSTEMS, INC. ALL RIGHTS RESERVED.