Protection Manager
Tuesday, July 3, 2012
NetApp UserGroup
René Meier, Professional Service Consultant
Christian Oriet, Systems Engineer
Agenda
OnCommand 5
Use cases
Architecture
Technical details
Naming conventions
SnapVault
Application datasets
Tips, tricks and best practices
Introduction: OnCommand 5.1 Naming changes
Old Name              New Name
Protection Manager    OnCommand Unified Manager
Provisioning Manager  OnCommand Unified Manager
Operations Manager    Refers to the old UI
DFM Server            OnCommand Core Package
Use cases
centralized solution
monitoring / alarming
automation
protection based on SLAs (policies)
efficient for small to large environments
simple restore
Architecture
[Architecture diagram: a primary site replicates to a backup site via SnapMirror / SnapVault / OSSV / vFiler-DR. OnCommand Core hosts Protection Manager, Provisioning Manager, and Performance Advisor; the OnCommand Unified Manager Management Console provides the UI. Control and data flows are shown separately.]
Technical details
New properties (introduced in PM OC 5)
Maximum concurrent baseline jobs: 25
– The Jobs queue window in NMC displays the queued jobs
Maximum concurrent update jobs (default): 100
Naming conventions
Protection Manager   NetApp behavior
Local Backup         Snapshot on primary
Backup               Snapshot on primary, transferred to secondary
Mirror               Snapshot on primary, mirrored to secondary
DR Backup            Same as Backup, plus a failover mechanism
DR Mirror            Same as Mirror, plus a failover mechanism
Remote backup only   No local backups; triggers a SnapVault update
                     (used by the SnapManagers and OSSV)
Protection Manager: the way to a dataset
1. Schedule
2. Protection policy
3. Resource pool
4. Dataset
Live Demo
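The four steps above can also be driven from the OnCommand Core CLI. The following sketch assumes the dfpm command set; the dataset, policy, and resource-pool names are hypothetical, and the exact arguments should be verified with `dfpm help`.

```shell
# 1. + 2. Pick a schedule and a protection policy
dfpm schedule list
dfpm policy list

# 4. Create the dataset and attach the "Back up" policy
dfpm dataset create demo_ds
dfpm dataset modify -p "Back up" demo_ds

# 3. Attach a resource pool for the backup destination,
#    then add the primary volume as a dataset member
dfpm dataset respool add demo_ds backup_pool
dfpm dataset add demo_ds filer1:/vol/vol_data
```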
Relationship import
SnapVault
CLI
SystemManager
Import to Protection Manager
Live Demo
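For reference, a SnapVault qtree relationship created outside of PM typically looks like this on the 7-Mode CLI; the system, volume, and qtree names are examples. Once the relationship exists, it can be imported through NMC's relationship-import wizard.

```shell
# On the secondary system: baseline a SnapVault qtree relationship
sec> snapvault start -S pri:/vol/vol_data/q1 sec:/vol/sv_vol_data/q1

# Verify the transfer status
sec> snapvault status
```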
Importing Relationships
Only individual qtree relationships can be imported into a dataset
– The entire volume cannot be imported
– Hence, newly created qtrees cannot be protected automatically
– This limitation does not apply to VSM or QSM
When a relationship is imported, the previous relationship schedule (on Data ONTAP)
must be deleted manually
– For instance, importing a VSM relationship does not remove the corresponding
entries from the snapmirror.conf file
After a relationship is imported, PM cannot restore from snapshots created
before the import
– Retiring those snapshots manually therefore becomes crucial, because PM
cannot expire them based on the protection policy's retention time
The secondary volumes of imported relationships are not automatically resized by
Protection Manager
It is useful to monitor the lag time
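When the old ONTAP-side schedules mentioned above are removed by hand, the 7-Mode commands look roughly like this; the volume and snapshot-schedule names are examples.

```shell
# SnapVault: drop the old snapshot schedules on primary and secondary
pri> snapvault snap unsched vol_data sv_hourly
sec> snapvault snap unsched sv_vol_data sv_hourly

# VSM: remove the relationship's line from /etc/snapmirror.conf on the
# secondary (edit the file, e.g. via rdfile/wrfile or a CIFS/NFS mount)
```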
Application Datasets
An application dataset is created by a SnapManager product through
SnapDrive's integration with Protection Manager
– You cannot create an application dataset manually
What makes an application dataset unique?
– The application dataset only manages the remote backup schedule
and retention
– The local backup schedule and retention are handled by the
SnapManager
How to create an application dataset:
1. SnapDrive setup
2. Protection policy (remote backup only)
3. SnapManager setup
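On SnapDrive for UNIX, for example, the DFM server registration in the SnapDrive setup step can be sketched as follows; the user and host names are placeholders, and SnapDrive for Windows uses its own configuration wizard instead.

```shell
# Register the OnCommand (DFM) server with SnapDrive for UNIX
snapdrive config set -dfm dfm_admin dfm-server.example.com

# Verify the configuration; afterwards, select the "remote backup only"
# protection policy in the SnapManager configuration wizard
snapdrive config list
```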
Tips, tricks and best practices
Disable Performance Advisor in very large environments
– dfm option set perfadvisortransport=disabled
– dfm host set <host-id> perfadvisortransport=disabled
Disable the following option to prevent system load caused by WAFL
scanners
– dfm option set snapDeltaMonitorEnabled=no
(this stops the "Overwrite Rate" column of the "Volume Overwrite Rate" report from updating)
A Mirror policy always uses VSM
– never QSM
QSM is used instead of SnapVault
– when the SnapVault license is missing
– and the option pmQSMBackupPreferred is enabled (default: disabled)
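Enabling the QSM fallback mentioned above is presumably a matter of flipping the listed option; `enabled` is assumed here as the counterpart of the documented default `disabled`.

```shell
# Check the current value (default: disabled)
dfm option list pmQSMBackupPreferred

# Prefer QSM when the SnapVault license is missing
dfm option set pmQSMBackupPreferred=enabled
```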
PM now uses DSS, dynamic secondary sizing (disabled when upgraded from 3.8)
– Option name: dpDynamicSecondarySizing
dfm option list dpDynamicSecondarySizing
dfm option set dpDynamicSecondarySizing=enabled
Be aware of performance issues beyond a thousand relationships
(ensure that the PM job engine does not become a bottleneck)
– Too many relationships within a single dataset
– Hundreds of datasets with only one volume/relationship each
– Best practice: 40-50 relationships per dataset
Backup and DR pre/post scripts
– If a custom script fails, the PM job fails
– Example: a DR script that creates shares
Scheduling
– Spread the load
– Consider the deduplication schedule
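A pre/post script therefore needs a clean exit status. The sketch below is a hypothetical post-script skeleton; the DP_DATASET_NAME variable is illustrative, not a confirmed PM environment variable.

```shell
#!/bin/sh
# Hypothetical PM post-backup script: PM fails the job when the script
# exits nonzero, so only return a nonzero status on a real error.
echo "post-processing for dataset ${DP_DATASET_NAME:-unknown}"
# ... e.g. re-create CIFS shares on the DR secondary here ...
exit 0
```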
Retiring Relationships from PM
Never ever delete a dataset to retire a relationship from PM
This would automatically mark the relationship as unwanted, and
the reaper functionality would kick in
What is the right way to delete a relationship?
Start by relinquishing the relationship
(check out "dfm dataset relinquish help")
Then remove the primary volume (safe!)
Then remove the secondary volume
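Following the pointer to `dfm dataset relinquish help`, the retirement sequence can be sketched like this; the exact member-removal syntax is an assumption and should be checked against the built-in help.

```shell
# 1. Relinquish: PM marks the relationship as external instead of reaping it
dfm dataset relinquish <relationship-id-or-name>

# 2. Remove the primary volume from the dataset (safe once relinquished)
dfm dataset remove <dataset-name> <primary-volume>

# 3. Then remove the secondary volume from the dataset
dfm dataset remove <dataset-name> <secondary-volume>
```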
Learning Resources and contacts
TRs and KBs
– TR-3440 – Sizing Guide
– TR-3710 – Best Practices Guide
– KB 1011950 – Distributed DFM
– TR-3690 – Access to the DFM Database
– TR-3655 – DFM DR
– TR-3767 – DFM HA
GSS
– www.communities.netapp.com