NetBackup IT Analytics Data Collector Installation and Configuration Guide
- Section I. Introduction
- Introduction
- Install and configure a Data Collector
- Step-1: Choose operating system and complete prerequisites
- Installing the Data Collector software
- Configure SSL
- Section II. Data Protection
- Configuration for Cohesity DataProtect
- Configuration for Commvault Simpana
- Open TCP/IP access to the Commvault database
- Set up a read-only user in the CommServe server
- Configuration for EMC Avamar
- Import EMC Avamar server information
- Configuration for EMC Data Domain backup
- Configuration for Dell EMC NetWorker backup & recovery
- Importing generic backup data
- Configuration for generic backup
- CSV format specification
- Configuration for HP Data Protector
- Architecture overview (HP Data Protector)
- Configure the Data Collector server in Cell Manager (HP Data Protector)
- Configuration for IBM Spectrum Protect (TSM)
- Architecture overview (IBM Spectrum Protect - TSM)
- Import IBM Spectrum Protect (TSM) information
- Configuration for NAKIVO Backup & Replication
- Configuration for Oracle Recovery Manager (RMAN)
- Configuration for Rubrik Cloud Data Management
- Configuration for Veeam Backup & Replication
- Configuration for Veritas Backup Exec
- Section III. Storage (Capacity)
- Configuration for Compute Resources
- Configuration for DELL EMC Elastic Cloud Storage (ECS)
- Configuration for Dell EMC Unity
- Configuration for EMC Data Domain storage
- Configuration for EMC Isilon
- Configuration for EMC Symmetrix
- Configuration for EMC VNX Celerra
- Configuration for EMC VNX CLARiiON
- Configuration for EMC VPLEX
- Configuration for EMC XtremIO
- Configuration for FUJITSU ETERNUS Data Collector
- Configuration for Hitachi Block
- Configuring a Hitachi Device manager user
- Configuration for Hitachi Content Platform (HCP)
- Hitachi Content Platform system management console
- Hitachi Content Platform tenant management console
- Configuration for Hitachi NAS
- Configuration for Hitachi Vantara All-Flash and Hybrid Flash Storage
- Configuration of Host inventory
- Host Access Privileges, Sudo Commands, Ports, and WMI Proxy Requirements
- Configure host Discovery policies to populate the host Inventory
- Validate host connectivity
- Host Inventory search and host Inventory export
- Configure and edit host probes
- Propagate Probe Settings: Copy Probes, Paste Probes
- Configuration for HP 3PAR
- Configuration for HP EVA
- Configuration for HPE Nimble Storage
- Configuration for HPE StoreOnce
- Configuration for IBM Enterprise
- Configuration for IBM COS
- Configuration for IBM SVC
- Configuration for IBM XIV
- Configuration for Microsoft Windows server
- Configuration for NetApp-7
- Configuration for NetApp StorageGRID
- Configuration for NetApp Cluster
- Configuration for NetApp E-Series
- Configuration for NEC HYDRAstor
- Configuration for Pure Storage FlashArray
- Section IV. Compute (Virtualization and Host Collection)
- Configuration for VMware
- Configuration for IBM VIO
- Configuration for Microsoft Hyper-V
- Section V. Cloud
- Configuration for Amazon Web Services (AWS)
- Mandatory probe user privileges
- Link AWS accounts for Collection of consolidated billing data
- Configuration for Google Cloud Platform
- Configuration for OpenStack Ceilometer
- Configuration for OpenStack Swift
- Configuration for Microsoft Azure
- Section VI. Fabric
- Configuration for Brocade switch
- Configuration for Cisco switch
- Configuration for Brocade Zone alias
- Configuration for Cisco Zone alias
- Section VII. File Analytics
- Configuration for File Analytics
- Host Discovery and Collection File Analytics probe
- Adding a File Analytics Data Collector policy
- File Analytics Export Folder Size and Folder Depth
- Section VIII. Data Collection Validation and Troubleshooting
- Validate data collection
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- Uninstalling the Data Collector
- Appendix A. Firewall Configuration: Default Ports
- Appendix B. Load historic events
- Load Veritas NetBackup events
- Appendix C. CRON Expressions for Policy and Report Schedules
- Appendix D. Maintenance Scenarios for Message Relay Server Certificate Generation
Specifying the File Analytics folder depth
A parameter, fa.export.folderDepth, is available to specify the folder depth for File Analytics. To specify the folder depth for the report summary, add the following parameter when executing the command: -Dfa.export.folderDepth=x, where "x" is the depth. By default, the depth is set to 1.
To turn off reporting on parents, add the following parameter when executing the command: -Dfa.export.includeParents=No. By default, reporting on parents is turned on.
To specify the name of the output file, use -Dfa.export.reportFileName=SomeReportName.csv. If this parameter is not specified, the default output file is report.csv.
For example:
java -classpath /opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/ -Dfa.export.folderDepth=2 -Dfa.export.includeParents=No -Dtest.resourceLocation=/opt/aptare/portal/WEB-INF/classes/ com.aptare.sc.service.fa.FaSubDirectoryReport
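The same command can also name the output file by adding the fa.export.reportFileName parameter described above. In this illustrative variant, FolderDepthSummary.csv is only a placeholder file name:
java -classpath /opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/ -Dfa.export.folderDepth=2 -Dfa.export.includeParents=No -Dfa.export.reportFileName=FolderDepthSummary.csv -Dtest.resourceLocation=/opt/aptare/portal/WEB-INF/classes/ com.aptare.sc.service.fa.FaSubDirectoryReport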
As an example, the table that follows uses these directory structures to show the results of different parameter values:
D1
D1/SD1
D1/SD1/SD2
D2/SD3
D3
This table illustrates the expected results given the different parameter values:
| fa.export.folderDepth | fa.export.includeParents | Directories Included in Report |
|---|---|---|
| 0 | N/A | D1, D2, D3 |
| 1 | N/A | D1, D1/SD1, D2, D2/SD3, D3 |
| 2 | No | D1/SD1, D1/SD1/SD2, D2/SD3, D3 |
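The selection behavior shown in the table can be sketched in a few lines of Java. The class below is purely illustrative and is not product code: it assumes that a directory's depth is the number of path separators in its path, that every directory at or above the requested depth is reported, and, as inferred from the table, that includeParents=No omits a top-level folder when one of its subfolders is reported.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: reproduces the directory selection shown in the
// table above for fa.export.folderDepth and fa.export.includeParents.
public class FolderDepthExample {

    // Assumed depth rule: number of "/" separators (D1 -> 0, D1/SD1 -> 1, D1/SD1/SD2 -> 2).
    static int depth(String dir) {
        return dir.length() - dir.replace("/", "").length();
    }

    static List<String> select(List<String> dirs, int folderDepth, boolean includeParents) {
        List<String> report = new ArrayList<>();
        for (String dir : dirs) {
            if (depth(dir) > folderDepth) {
                continue; // deeper than the requested summary depth
            }
            // Inferred from the table: with includeParents=No, a top-level folder
            // is dropped when at least one of its subfolders is reported.
            boolean hasReportedChild = dirs.stream()
                    .anyMatch(d -> d.startsWith(dir + "/") && depth(d) <= folderDepth);
            if (!includeParents && depth(dir) == 0 && hasReportedChild) {
                continue;
            }
            report.add(dir);
        }
        return report;
    }

    public static void main(String[] args) {
        // The directory structure from the example above (D2 exists as the parent of D2/SD3).
        List<String> dirs = Arrays.asList("D1", "D1/SD1", "D1/SD1/SD2", "D2", "D2/SD3", "D3");

        System.out.println(select(dirs, 0, true));  // [D1, D2, D3]
        System.out.println(select(dirs, 1, true));  // [D1, D1/SD1, D2, D2/SD3, D3]
        System.out.println(select(dirs, 2, false)); // [D1/SD1, D1/SD1/SD2, D2/SD3, D3]
    }
}
```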