NetBackup IT Analytics Help
- Section I. Introducing NetBackup IT Analytics
- Section II. Certified configurations
- Portal and database servers
- Data Collector server configurations
- Capacity Manager configurations
- Array/LUN performance Data Collection
- EMC Isilon array performance metrics
- NetApp Cluster-Mode performance metrics
- EMC Symmetrix enhanced performance metrics
- Host access privileges, sudo commands, ports, and WMI proxy requirements
- Cloud configurations
- Virtualization Manager configurations
- File Analytics configurations
- Fabric Manager configurations
- Backup Manager configurations
- ServiceNow configurations
- Internal TCP port requirements
- Section III. End user
- Understand the portal
- About the Admin tab
- Explore your inventory
- Hierarchy toolbar to organize your data
- Show objects
- Use attributes to organize your data
- Pin reports - saving reports with inventory objects
- Assign attributes in the inventory list view
- Get acquainted with reports
- About badging
- Generate and maintain reports
- Select Report Scope
- Group hosts by attributes
- Search for hosts in the report Scope Selector
- Backup Manager advanced scope selector settings
- Solution reports scope selector settings
- Units of Measure in Reports
- Customize report filter logic
- Sort columns in reports
- Convert tabular report to chart
- Distribute, share, schedule, and alert
- Scheduling Exported Reports and Dashboards
- Organize reports
- Work with the dynamic template designer
- Dynamic Template Designer Quick Start
- Converting to a Homogeneous, Product-Specific Template
- Dynamic Template Function Configurations
- Create Fields with the Field Builder
- Scope Selector Component - Custom Filter
- Configure a Bar Chart Dynamic Template
- Steps to Create a Bar Chart Dynamic Template
- Configure an Area/Stacked Area Chart Dynamic Template
- Line Charts for Performance Metrics
- Line Chart Field Requirements
- One Object Per Line Chart, One or More Metrics Per Chart
- Multiple Objects Per Line Chart, One Metric Per Chart
- Example of a Stacked Bar Chart Dynamic Template
- Create a Sparkline Chart in a Tabular Dynamic Template
- Adding or Editing Methods
- Validate and Save a Method
- Work with the SQL template designer
- Configure SQL Template Scope Selector Components
- Sample SQL Queries
- Number, Size, Date, and Time Formatting
- Alignment, Aggregation, Bar Type, and Bar Type Color
- Pipelined functions for report query building
- APTlistOfDates
- aptStringConcat
- getServerAttributeValue
- getObjectAttributeValue
- getChildServerGroupContextById
- getServerGroupContextById
- secsToHoursMinSecs
- APTgetTapeDriveStatusName
- getFullPathname
- listJobSummaryAfterRestart
- listJobSummaryAfterRestartNBW
- listJobSummaryAfterRestart for NetWorker Backup Jobs
- listOfBackupWindowDates
- listChargebackCatByVOLSDetail
- listChargebackCatByNcVolDetail
- listChargebackCatByFSDetail (for HNAS)
- listChargebackCatByFSDetail (for EMC Isilon)
- listChargebackByLUNSummary
- listChargebackByLUNDetail
- listChargebackCatByLUNSummary
- listChargebackCatByLUNDetail
- Alert configuration
- Add/Edit an Alert Policy
- Manage hosts, backup servers, and host groups
- NetBackup Master Servers
- Manage attributes and objects
- Provide Portal access and user privileges
- Setting / Resetting passwords
- Managing user group home pages (Administrator)
- Configure master schedules and backup windows
- Add, edit, and move policies
- Add/Edit a threshold policy
- Capacity Chargeback policy types
- Solutions administration
- Manage and monitor data collection
- About data collection tasks
- Add/Edit data collectors
- Review collectors and collection status
- Upgrade Data Collectors
- Work with Capacity Manager host data collection
- Host access requirements
- Manage credentials
- Configure host discovery policies to populate the host discovery and collection view
- Validate host connectivity
- Search and export in host discovery and collection
- Propagate probe settings: Copy probes, paste probes
- Discovery policies for Veritas NetBackup
- View and manage system notifications
- Customize with advanced parameters
- Access control advanced parameters
- General Data Collection advanced parameters
- Cloud data collection advanced parameters
- Host discovery and collection advanced parameters
- Backup Manager advanced parameters
- Capacity Manager advanced parameters
- File Analytics advanced parameters
- Virtualization Manager advanced parameters
- Manage your Portal environment
- Analyze files
- Troubleshoot the Portal
- Retrieving log files
- Debug
- Attribute inheritance overrides
- Understand report data caching
- Section IV. Report Reference
- Introduction to NetBackup IT Analytics
- Alert Reports
- Risk Mitigation Solution Reports
- Risk Mitigation Reports
- Storage Optimization Solution Reports
- System Administration Reports
- Oracle Job Overview
- Capacity Manager Reports
- Application Capacity Reports
- Array Capacity Utilization Reports
- Array Capacity & Utilization (Generic Data)
- Array Capacity & Utilization (IBM SVC View)
- Array Capacity and Utilization (IBM XIV View)
- Array Capacity and Utilization (NetApp View)
- Array Capacity and Utilization (NetApp Cluster)
- NetApp Storage System Detail
- Array Capacity and Utilization (OpenStack Swift)
- IBM Array Site Summary
- IBM Array Detail
- LUN Utilization Summary
- NetApp Aggregate Detail
- NetApp Cluster-Mode Aggregate Detail
- NetApp Plex Details
- NetApp Volume Details
- NetApp Cluster-Mode Volume Detail
- NetApp StorageGrid Tenant Summary
- Available/Reclaimable Capacity Reports
- Capacity at Risk Reports
- Capacity Chargeback Reports
- Host Capacity Utilization Reports
- SnapMirror Reports
- SnapVault Reports
- Capacity Forecasting Reports
- Storage Performance Reports
- Mission Control for Performance Analysis
- Thin Provisioning Reports
- Hitachi Dynamic Provisioning Pool Utilization
- File Analytics Reports
- Virtualization Manager Reports
- Understanding the Datastore
- VM Server Detail
- VM Snapshot Summary
- VM Detail
- Datastore Utilization Summary
- Datastore Detail
- Fabric Manager Reports
- Host to Storage Dashboard
- Backup Manager Management Reports
- Error Log Summary
- Job Duration Report
- Veeam Backup & Replication Job Summary Report (Homogeneous)
- Veeam and RMAN Job Details Report
- Adding a Note to a Job
- Job Volume Summary Report
- NetBackup deduplication to MSDP savings
- Backup Administration Reports
- Host Details
- IBM Spectrum Protect (TSM) Storage Pools Dashboard
- Backup Media Management Reports
- TSM Tape Media Detail Table
- Backup Service Level Agreement (SLA) Reports
- Determining and Improving Backup Start Time Performance
- Determining and Improving Backup Success Performance
- Determining and Improving Backup Duration Performance
- Backup Storage Utilization Reports
- Backup Manager Forecasting Reports
- Backup Billing and Usage Reports
- Backup Policies Reports
- HP Data Protector Backup Specification Detail
- Public Cloud Reports
- Section V. NetBackup IT Analytics Exporter Installation and Configuration
- Section VI. Data Collector Installation and Troubleshooting
- Installing the Data Collector Software
- Validating Data Collection
- Uninstalling the Data Collector
- Manually Starting the Data Collector
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- CRON Expressions and Probe Schedules
- Clustering Data Collectors with VCS and Veritas NetBackup (RHEL 7)
- Clustering Data Collectors with VCS and Veritas NetBackup (Windows)
- Maintenance Scenarios for Message Relay Server Certificate Generation
- Section VII. Data Collection for the Cloud
- Pre-Installation Setup for Amazon Web Services (AWS)
- Create an AWS IAM user
- Link AWS accounts for Collection of consolidated billing data
- Pre-Installation Setup for OpenStack Ceilometer
- Pre-Installation Setup for OpenStack Swift
- Pre-Installation Setup for Microsoft Azure
- Google Cloud Platform (GCP)
- Section VIII. Data Collection for Data Protection (Backup)
- Introduction
- Pre-Installation setup for Commvault Simpana
- Open TCP/IP access to the Commvault database
- Set up a read-only user in the CommServe server
- Pre-Installation setup for Cohesity DataProtect
- Pre-Installation setup for EMC Avamar
- Import EMC Avamar server information
- Pre-Installation setup for EMC Data Domain Backup
- Pre-Installation setup for EMC NetWorker
- Architecture overview (EMC NetWorker)
- Pre-Installation setup for Dell EMC NetWorker Backup & Recovery
- Pre-Installation setup for generic backup
- CSV format specification
- Pre-Installation setup for HP Data Protector
- Architecture overview (HP Data Protector)
- Configure the Data Collector server in Cell Manager (HP Data Protector)
- Pre-Installation setup for IBM Spectrum Protect (TSM)
- Architecture overview (IBM Spectrum Protect -TSM)
- Import IBM Spectrum Protect (TSM) information
- Pre-Installation setup for IBM Spectrum Protect Plus
- Pre-Installation setup for NAKIVO Backup & Replication
- Pre-Installation setup for Veritas Backup Exec
- Pre-Installation setup for Veritas NetBackup
- Prerequisites to use SSH and WMI (Veritas NetBackup)
- Prerequisites for NetBackup collection over SSH (Kerberos option)
- Veritas NetBackup 8.1 (or later) requirements for centralized collection
- Configuring file analytics in NetBackup Data Collector policy
- Pre-Installation setup for Veritas SaaS backup
- Pre-Installation setup for Oracle Recovery Manager (RMAN)
- Pre-Installation setup for Rubrik Cloud Data Management
- Pre-Installation setup for Veeam Backup & Replication
- Discovery policies for Veritas NetBackup
- About Discovery types
- About SNMP probes
- Appendix B. Load Historic Events
- Load Veritas NetBackup events
- Section IX. Data Collection for Fabrics
- Section X. Data Collection for File Analytics
- Pre-Installation Setup for File Analytics
- Host Discovery and Collection File Analytics probe
- Adding a File Analytics Data Collector policy
- File Analytics Export Folder Size and Folder Depth
- Section XI. Data Collection for Replication
- Section XII. Data Collection for Storage (Capacity)
- Data Collection for Capacity Overview
- Pre-Installation setup for Compute Resources
- Pre-Installation Setup for Dell Compellent
- Pre-Installation Setup for DELL EMC Elastic Cloud Storage (ECS)
- Pre-Installation Setup for EMC Data Domain Storage
- Pre-Installation Setup for EMC Isilon
- Pre-Installation Setup EMC Symmetrix
- Pre-Installation Setup for Dell EMC Unity
- Pre-Installation Setup for EMC VNX Celerra
- Pre-Installation Setup for EMC VNX CLARiiON
- Pre-Installation Setup for EMC VPLEX
- Pre-Installation Setup for EMC XtremIO
- Pre-Installation Setup for Hitachi Block
- Configuring a Hitachi Device manager user
- Pre-Installation Setup for Hitachi Content Platform (HCP)
- Hitachi content platform system management console
- Hitachi content platform tenant management console
- Pre-Installation Setup Hitachi NAS
- Pre-Installation Setup for Hitachi Vantara All-Flash and Hybrid Flash Storage
- Host Inventory Pre-Installation Setup
- Host Access Privileges, Sudo Commands, Ports, and WMI Proxy Requirements
- Configure host Discovery policies to populate the host Inventory
- Validate host connectivity
- Host Inventory search and host Inventory export
- Configure and edit host probes
- Propagate Probe Settings: Copy Probes, Paste Probes
- Pre-Installation Setup for HP 3PAR
- Pre-Installation Setup for HP EVA
- Pre-Installation setup for HPE Nimble Storage
- Pre-Installation Setup for Huawei OceanStor
- Pre-Installation Setup for IBM COS
- Pre-Installation Setup for IBM Enterprise
- Pre-Installation Setup for NetApp E-Series
- Pre-Installation Setup for IBM SVC
- Pre-Installation Setup for IBM XIV
- Pre-Installation Setup for Infinidat InfiniBox
- Pre-installation setup for FUJITSU Data Collector
- Pre-Installation setup for Infinidat InfiniGuard
- Pre-Installation Setup for NetApp-7
- Pre-Installation setup for NetApp StorageGRID
- Pre-Installation Setup for Microsoft Windows Server
- Pre-Installation Setup for NetApp Cluster
- Pre-Installation Setup for Pure Storage FlashArray
- Pre-Installation Setup for Veritas NetBackup Appliance
- Section XIII. Data Collection for Virtualization
- Pre-Installation setup for VMware
- Pre-Installation setup for IBM VIO
- Pre-Installation setup for Microsoft Hyper-V
- Section XIV. System Administration
- Preparing for Updates
- Backing Up and Restoring Data
- Monitoring NetBackup IT Analytics
- Accessing NetBackup IT Analytics Reports with the REST API
- Defining NetBackup Estimated Tape Capacity
- Automating Host Group Management
- Categorize host operating systems by platform and version
- Load relationships between hosts and host group
- Automate NetBackup utilities
- Scheduling utilities to run automatically
- Attribute Management
- Importing Generic Backup Data
- Backup Job Overrides
- Managing Host Data Collection
- System Configuration in the Portal
- Performance Profile Schedule Customization
- Configuring AD/LDAP
- Configuring Single Sign On (SSO) Using Security Assertion Markup Language (SAML)
- Changing Oracle Database User Passwords
- Integrating with CyberArk
- Tuning NetBackup IT Analytics
- Defining Report Metrics
- Working with Log Files
- Portal and data collector log files - reduce logging
- Data collector log file naming conventions
- Portal log files
- SNMP Trap Alerting
- SSL Certificate Configuration
- Configure virtual hosts for portal and / or data collection SSL
- Keystore on the portal server
- Portal Properties: Format and Portal Customizations
- Data Retention Periods for SDK Database Objects
- Troubleshooting
- Section XV. Portal Install and Upgrade (Windows)
- Installing the Portal on a Windows server
- Task 3: Installing Oracle application binaries (Windows)
- Upgrade NetBackup IT Analytics Portal on Windows
- Upgrade and migrate to a new server
- Section XVI. Portal Install and Upgrade (Linux)
- Install the NetBackup IT Analytics Portal on a Linux server
- Installer-based deployment
- Upgrade NetBackup IT Analytics Portal on Linux
- Upgrade NetBackup IT Analytics Portal
- Data Collector upgrades
- Oracle patches for the database server
- Upgrade and Migrate to a new server
- Upgrade and migrate to a new server
- Appendix C. X Virtual Frame Buffer
- Section XVII. Licensing
- License installation and guidelines
- License overview
- Verify the current license configuration
- Storage suite
- Protection suite
- Backup Manager
- Complete suite
- Managing licenses
- Configure the Data Collector policy to exclude the object
- License management from the command line
- Troubleshooting
- Section XVIII. Inventory reports and operations
Manual steps for database import / export using Data Pump
Follow the steps to execute Data Pump Export in a Linux Environment
- Log in to the Linux database server and switch to the aptare user.
- Ensure that the file /opt/aptare/database/tools/expdp_scdb.par is owned by the aptare user and has 755 permissions.
- Ensure the Oracle listener and Oracle services are running.
- Run the following commands:
su - aptare
sqlplus / as sysdba
alter session set container=scdb;
Note:
The alter session set container=scdb; command is required only for a container database. Ignore it in a non-CDB environment.
CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
To use a preferred folder such as /new_directory_path instead:
CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Export the database using the following command:
/opt/aptare/oracle/bin/expdp parfile=/opt/aptare/database/tools/expdp_scdb.par
- You can also choose to ignore the par file and include the parameters directly in the expdp command. In other words, the above command can be replaced by the following, which can also be executed as the aptare user:
/opt/aptare/oracle/bin/expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y directory=datapump_dir dumpfile=aptare_scdb.exp logfile=export_scdb.log CONTENT=ALL flashback_time=systimestamp
After successful completion, the data pump export file aptare_scdb.exp is saved in the /tmp directory of the Linux database server. If you specified a preferred directory, aptare_scdb.exp is saved to that location (such as /new_directory_path).
- This step is required only if the database is exported from NetBackup IT Analytics version 10.5 or above. Execute the command cp /opt/aptare/datarcvrconf/aptare.ks /tmp to copy the aptare.ks file to the /tmp folder.
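Taken together, the Linux export steps above can be sketched as one script. This is a dry-run sketch, not a product tool: it only prints the commands in order, since expdp, sqlplus, and the aptare account exist only on the database server. The paths, the scdb service name, and the system/aptaresoftware credentials are taken from the steps above.

```shell
#!/bin/sh
# Dry-run sketch of the Linux Data Pump export steps above.
# It prints each command instead of executing it.

EXPDP=/opt/aptare/oracle/bin/expdp
PARFILE=/opt/aptare/database/tools/expdp_scdb.par
DUMP_DIR=/tmp        # or a preferred folder such as /new_directory_path

# Directory object created inside sqlplus (as sysdba; a container
# database needs "alter session set container=scdb;" first):
SQL_SETUP="CREATE OR REPLACE DIRECTORY datapump_dir AS '$DUMP_DIR';"

# Export via the par file, run as the aptare user:
EXPORT_CMD="$EXPDP parfile=$PARFILE"

# Equivalent par-less form from the document:
EXPORT_CMD_INLINE="$EXPDP system/aptaresoftware@//localhost:1521/scdb FULL=Y \
directory=datapump_dir dumpfile=aptare_scdb.exp logfile=export_scdb.log \
CONTENT=ALL flashback_time=systimestamp"

echo "SQL> $SQL_SETUP"
echo "$EXPORT_CMD"
echo "cp /opt/aptare/datarcvrconf/aptare.ks $DUMP_DIR   # 10.5+ exports only"
```

On the database server you would run the printed commands in order; the dump file then lands in the chosen directory as aptare_scdb.exp.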
Follow the steps to execute Data Pump Import in a Linux Environment
- Place the export file aptare_scdb.exp created by the Data Pump export in the /tmp directory. If you used a different preferred directory (for example /new_directory_path), place aptare_scdb.exp there instead.
- Ensure that the aptare_scdb.exp file is owned by the aptare user and has 755 permissions.
- Ensure that the files /opt/aptare/database/tools/unlock_portal_linux.sql and /opt/aptare/database/tools/impdp_scdb.par are owned by the aptare user and have 755 permissions.
- As the root user, stop all Oracle and Aptare services by running the following command: /opt/aptare/bin/aptare stop
- As the root user, start the Oracle services by running the following command: /opt/aptare/bin/oracle start
- Ensure the Oracle listener is running. As the aptare user, check the listener status with the following command: lsnrctl status
- Run the following commands:
su - aptare
sqlplus / as sysdba
alter session set container=scdb;
Note:
The alter session set container=scdb; command is required only for a container database. Ignore it in a non-CDB environment.
drop user aptare_ro cascade;
drop user portal cascade;
CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';
To use a preferred folder such as /new_directory_path instead:
CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';
- Run the following command as the aptare user:
/opt/aptare/oracle/bin/impdp parfile=/opt/aptare/database/tools/impdp_scdb.par
- You can also choose to ignore the par file and include the parameters directly in the impdp command. In other words, the above command can be replaced by the following, which can also be executed as the aptare user:
/opt/aptare/oracle/bin/impdp system/aptaresoftware@//localhost:1521/scdb schemas=portal,aptare_ro directory=datapump_dir dumpfile=aptare_scdb.exp logfile=import_scdb.log
- When the import completes in a non-CDB environment, remove the first command, alter session set container = scdb;, from the file unlock_portal_linux.sql and run the following command as the aptare user.
Note:
Removing alter session set container = scdb; is required only for a non-CDB environment; no change is needed for a container database.
sqlplus / as sysdba
@/opt/aptare/database/tools/unlock_portal_linux.sql
- After exiting sqlplus, execute the following command as the aptare user:
sqlplus portal/portal@//localhost:1521/scdb
@/opt/aptare/database/tools/validate_sp.sql
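The Linux import sequence above can likewise be condensed into a dry-run sketch. Again, nothing here executes: the script only prints each command in order, and every path and identifier comes from the steps above.

```shell
#!/bin/sh
# Dry-run sketch of the Linux Data Pump import steps above.
# Prints each command in order rather than executing it.

IMPDP=/opt/aptare/oracle/bin/impdp
PARFILE=/opt/aptare/database/tools/impdp_scdb.par
IMPORT_CMD="$IMPDP parfile=$PARFILE"

echo "/opt/aptare/bin/aptare stop    # as root: stop Oracle and Aptare services"
echo "/opt/aptare/bin/oracle start   # as root: start Oracle services only"
echo "lsnrctl status                 # as aptare: confirm the listener is up"
echo "sqlplus / as sysdba            # then, inside sqlplus:"
echo "SQL> alter session set container=scdb;   -- container databases only"
echo "SQL> drop user aptare_ro cascade;"
echo "SQL> drop user portal cascade;"
echo "SQL> CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';"
echo "$IMPORT_CMD                    # as aptare"
echo "sqlplus / as sysdba @/opt/aptare/database/tools/unlock_portal_linux.sql"
echo "sqlplus portal/portal@//localhost:1521/scdb @/opt/aptare/database/tools/validate_sp.sql"
```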
Post Steps:
Go to the /tmp directory and check the file import_scdb.log. If you specified a preferred directory, check for import_scdb.log in that location.
Check the log file for compilation warnings for the packages: view apt_v_solution_history_log, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user.
Note:
If you are importing the database from 10.4, upgrade the portal to the 10.5 build after the import.
This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above. Execute the following commands to copy the aptare.ks file to the datarcvrconf folder:
cp /tmp/aptare.ks /opt/aptare/datarcvrconf/
chown aptare:tomcat /opt/aptare/datarcvrconf/aptare.ks
chmod 664 /opt/aptare/datarcvrconf/aptare.ks
Run updateUser.sh to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.sh admin123 newPassword
Restart all Oracle and Aptare services by running /opt/aptare/bin/aptare restart as the root user.
Log in to the portal application using the application account.
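The post-step log check lends itself to a small helper. This is a hedged sketch: the function name check_import_log and the sample log lines are illustrative, while the list of packages whose compilation warnings are expected (and self-resolved) comes from the post steps above. Any ORA- line that survives the filter deserves a closer look.

```shell
#!/bin/sh
# Scan a Data Pump import log, filtering out the compilation warnings
# the post steps describe as expected and self-resolved, so that only
# unexpected ORA- lines remain. check_import_log is an illustrative
# helper name, not a product tool.

check_import_log() {
    log="$1"
    # Packages whose compilation warnings are expected (from the post steps):
    expected='apt_v_solution_history_log|cmv_adaptor_pkg|avm_common_pkg|sdk_common_pkg|server_group_package|load_package|common_package|util'
    grep 'ORA-' "$log" | grep -Eiv "$expected"
}

# Fabricated sample log, for illustration only:
cat > /tmp/import_scdb_sample.log <<'EOF'
ORA-39082: Object type PACKAGE BODY:"PORTAL"."COMMON_PACKAGE" created with compilation warnings
ORA-31684: Object type USER:"PORTAL" already exists
EOF

check_import_log /tmp/import_scdb_sample.log
```

Against the sample log, the expected COMMON_PACKAGE warning is filtered out and only the ORA-31684 line is reported.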
Follow the steps for Windows Data Pump Export
- Log in to the Windows database server.
- Ensure the Oracle TNS listener and Oracle services are running.
- Ensure the Aptare user has access to the file c:\opt\oracle\database\tools\expdp_scdb_win.par
Run the following commands:
sqlplus system/aptaresoftware@//localhost:1521/scdb
create or replace directory datapump_dir as 'c:\opt\oracle\logs';
Exit
- After exiting sqlplus, execute the following command: c:\opt\oracle\bin\expdp parfile=c:\opt\oracle\database\tools\expdp_scdb_win.par
- You can also choose to ignore the par file and include the parameters directly in the expdp command. In other words, the above command can be replaced by the following: c:\opt\oracle\bin\expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y DIRECTORY=datapump_dir LOGFILE=export_scdb.log DUMPFILE=aptare_scdb.exp CONTENT=ALL FLASHBACK_TIME=systimestamp
- After successful completion, the data pump export file aptare_scdb.exp is saved in the c:\opt\oracle\logs directory of the Windows database server.
- Copy the file c:\opt\datarcvrconf\aptare.ks to the c:\opt\oracle\logs folder.
Note:
This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above.
Follow the steps for Windows Data Pump Import
- Log in to the Windows database server.
- The Aptare user should already have access to the import files c:\opt\oracle\database\tools\unlock_portal_win.sql and c:\opt\oracle\database\tools\impdp_scdb_win.par. If the Oracle user does not have read and execute privileges on these files, grant those privileges before starting the import.
- Place the export file aptare_scdb.exp in the c:\opt\oracle\logs directory.
- If the name of the export file is capitalized, change it to lower case. For example, change 'APTARE_SCDB.EXP' to 'aptare_scdb.exp'.
- Stop all Oracle and Aptare services by running stopAllServices from the Windows services tab.
- Start OracleServicescdb from the Windows services tab and ensure the Oracle TNS listener is running.
Run the following commands:
sqlplus / as sysdba
alter session set container = scdb; (this command is required only for a container database; otherwise no switch to a container database is needed)
DROP USER aptare_ro CASCADE;
DROP USER portal CASCADE;
CREATE OR REPLACE DIRECTORY datapump_dir AS 'c:\opt\oracle\logs';
EXIT;
- After exiting sqlplus, execute the following command:
c:\opt\oracle\bin\impdp parfile=c:\opt\oracle\database\tools\impdp_scdb_win.par
- You can also choose to ignore the par file and include the parameters directly in the impdp command. In other words, the above command can be replaced by the following: c:\opt\oracle\bin\impdp "sys/*@//localhost:1521/scdb as sysdba" SCHEMAS=portal,aptare_ro DIRECTORY=datapump_dir LOGFILE=import_scdb.log DUMPFILE=aptare_scdb.exp
- After the import is complete, execute the following command: sqlplus "sys/*@//localhost:1521/scdb as sysdba" @c:\opt\oracle\database\tools\unlock_portal_win.sql
- After exiting sqlplus, execute the following command: sqlplus portal/portal@//localhost:1521/scdb @c:\opt\oracle\database\tools\validate_sp.sql
Post Steps:
To check the import logs, go to c:\opt\oracle\logs and check the file import_scdb.log.
Check the log file for compilation warnings for the packages: view apt_v_solution_history_log, cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, util. These compilation warnings are addressed by the script itself and no action is required from the user.
Note:
If you are importing the database from 10.4, upgrade the portal to the 10.5 build after the import.
Copy the saved file c:\opt\oracle\logs\aptare.ks to the c:\opt\datarcvrconf\ folder. Ensure that the file is owned by the NetBackup IT Analytics user and has appropriate read and write access.
Note:
This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above.
After successful completion of the import process, run stopAllServices from the services tab on Windows.
Run startAllServices from the services tab on Windows.
Run updateUser.bat from the utils directory to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.bat admin123 newPassword
Log in to the portal application using the application account.