NetBackup IT Analytics System Administrator Guide

Last Published:
Product(s): NetBackup IT Analytics (11.6)

Merge duplicate hosts

To merge duplicate hosts, you must have a local CSV copy of the Duplicate Host Analysis report. This report serves as an input to the duplicate host merging script.

Recommendations:

Follow these recommendations before you merge duplicate hosts:

  • Carefully set the report scope and generate the Duplicate Host Analysis report, as its CSV export copy serves as an input for the host merge script.

  • Use a copy of the original CSV export as the input for the merge duplicate hosts script. The original CSV can serve as a reference in the future.

  • Since the host merge process is irreversible, it must be executed by an administrator with comprehensive knowledge of backup solutions.

  • Back up the database before performing the host merge since the process is irreversible.

Merge duplicate hosts

Since merging duplicate hosts is an advanced process, make sure you have followed all the recommendations suggested above.

Merging duplicate hosts is performed in the following order:

  1. Generate the Duplicate Host Analysis report on the NetBackup IT Analytics Portal and export it in CSV format.

  2. Edit the CSV copy of the Duplicate Host Analysis report for the host merge script.

  3. Run the host merge script using the CSV input.

Step-1: Generate and export Duplicate Host Analysis report
  1. Access the Duplicate Host Analysis report from the Reports tab > System Administration reports.
  2. Click the Duplicate Host Analysis report name. Use the descriptions in the table below to set the required report scope and generate the report.

    Host Group

    Allows you to select the host groups, backup servers, or hosts for the report scope. Your selection narrows the scope and helps find duplicates more efficiently by targeting specific host groups or host names.

    Find Duplicates Using:

    • Host Name

    • Display Name

    You can choose between Host Name (default) and Display Name. Both searches are case-sensitive.

    • For Host Name, the system compares the internal host names to find duplicates. This is the default criterion in the legacy host merge option.

    • For Display Name, the system uses the display or external names of the hosts to find duplicates.

    Host Type for the Duplicate Host

    • Clients Only

    • All

    • Clients Only allows you to find duplicates only for hosts that are identified as Clients (hosts backed up by any backup system).

    • All detects duplicates for all types of hosts.

    Surviving Host Selection Criteria

    • Highest Job Count

    • Most Recently Updated

    Allows you to specify the criteria used to select the surviving host among the duplicates when performing a host merge.

    • Highest Job Count: Selects the host with the most associated jobs as the surviving host. This is the default criterion of the legacy host merge option, as a higher job count suggests that the host has more data associated with it.

    • Most Recently Updated: Selects the most recently updated host as the surviving host. Use this option when the duplicate hosts found are no longer actively collecting new data, as it helps to retain the most current host.

    Cascade into sub-groups

    The scope selector default is to cascade to all child sub-groups when generating the report. If you prefer to report ONLY on the host group you selected, uncheck Cascade into sub-groups.

    Filter by Common Attributes

    Select this checkbox to have the report scope apply attributes using "AND" logic; the report then displays only the results at the intersection of the selected criteria. If this checkbox is not selected, the report applies attributes using "OR" logic. For example, if you select the attribute values Campbell, Engineering, and Cost Center 1 and select Filter by Common Attributes, the report displays only the results that contain all three attribute values. If you do not select Filter by Common Attributes, the report displays all results with the attributes Campbell, Engineering, or Cost Center 1.

    Apply Attributes to Backup Servers

    Select this checkbox to apply the attributes only to the backup servers, instead of hosts.

  3. After generating the report, export it in CSV format to your system.
  4. Create a copy of the CSV report and prepare the copy for the host merge script as described in the next step.
Step-2: Edit the CSV copy of the Duplicate Host Analysis report

The Duplicate Host Analysis report displays one row for each suspected duplicate pair. For example, if host A has three potential duplicates, the report displays three rows, one for each duplicate.

Update the values of the following columns in the CSV copy as suggested below:

  1. Surviving Host: The default value of this column is Main, which indicates that the duplicate is merged into the Main host. To change the surviving host, change the value to Duplicate; the hosts in that row are then merged into the Duplicate host. Main and Duplicate are the only acceptable values in this column.

  2. Is Duplicate Host's Merge supported: This column supports only Yes and No as values. Delete all the rows containing the value No from the report CSV that you plan to use as input for the host merge process.

Do not make any modifications to the report CSV other than those described above. Your report CSV is now ready to serve as the input for the host merge script.
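The two CSV edits above can also be scripted. The sketch below is a minimal helper, assuming the exported column headers are exactly "Surviving Host" and "Is Duplicate Host's Merge supported" (verify against your own export before relying on it). It drops the unsupported rows and validates the surviving-host values so the file passes a clean copy to the merge script.

```python
import csv

SURVIVING = "Surviving Host"
SUPPORTED = "Is Duplicate Host's Merge supported"

def prepare_merge_csv(src_path, dst_path):
    """Copy the Duplicate Host Analysis CSV, dropping rows whose merge is
    not supported and checking that Surviving Host is Main or Duplicate."""
    with open(src_path, newline="") as src:
        reader = csv.DictReader(src)
        rows = []
        for row in reader:
            # The merge script rejects rows that are not supported.
            if row[SUPPORTED].strip().lower() == "no":
                continue
            # Main and Duplicate are the only acceptable values.
            if row[SURVIVING].strip() not in ("Main", "Duplicate"):
                raise ValueError(f"Invalid Surviving Host value: {row[SURVIVING]!r}")
            rows.append(row)
        fieldnames = reader.fieldnames
    with open(dst_path, "w", newline="") as dst:
        writer = csv.DictWriter(dst, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)  # number of duplicate pairs kept for the merge
```

Run this on a copy of the export, keeping the original CSV for reference as recommended above.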

Step-3: Run the host merge script using the report CSV

The host merge script can perform a pre-assessment, during which it evaluates the CSV for errors and suggests corrections before proceeding further. Ensure the pre-assessment succeeds before you proceed to merge the hosts; any error in the report CSV causes the script to abort the process. When you run the script, you must provide the report CSV path (including the file name), the log file path, and the log file name.

Caution:

As the host merge process is irreversible, you must back up your database and follow all the recommendations suggested above before you proceed.

To merge duplicate hosts:

  1. Run the host_merge.sql script from the ../database/tools directory.

    You can run the script from SQL*Plus or SQL Developer as the portal user or an equivalent user that has access to all the schema tables.

  2. Enter the following details when requested by the script:
    • Enter 1 or 2: Enter 1 to run the pre-assessment when you run the script for the first time. You can enter 2 when the pre-assessment is successful.

    • Enter Duplicate Host Analysis CSV file name with full path: Enter the report CSV file path including the file name.

    • Enter log file path: Enter the location of the log file (without the file name).

    • Enter log file name: Enter the name of the log file.

    After your pre-assessment is successful, repeat this step with option 2 to complete the host merge.
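Because host_merge.sql reads its answers from interactive prompts, a wrapper can feed them on stdin. This is a hedged sketch only: the answer order (mode, CSV path, log path, log file name) follows the prompts described above, while the SQL*Plus connect string shown is a placeholder you must replace with your portal schema credentials.

```python
import shutil
import subprocess

def build_responses(mode, csv_path, log_dir, log_name):
    """Answers for host_merge.sql in prompt order:
    1 = pre-assessment, 2 = merge; then CSV path, log path, log file name."""
    if mode not in (1, 2):
        raise ValueError("mode must be 1 (pre-assessment) or 2 (merge)")
    return "\n".join([str(mode), csv_path, log_dir, log_name]) + "\n"

def run_host_merge(mode, csv_path, log_dir, log_name, connect="portal@scdb"):
    """Run host_merge.sql (from the ../database/tools directory) via SQL*Plus.
    The default connect string is a hypothetical placeholder."""
    if shutil.which("sqlplus") is None:
        raise RuntimeError("sqlplus not found on PATH")
    return subprocess.run(
        ["sqlplus", "-s", connect, "@host_merge.sql"],
        input=build_responses(mode, csv_path, log_dir, log_name),
        capture_output=True,
        text=True,
    )
```

Run the pre-assessment first (mode 1) and inspect the log before repeating the call with mode 2 to perform the merge.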

Merge duplicate hosts script run samples

Figure: Sample of how the host_merge.sql script is run

Figure: Sample 1 of an unsuccessful pre-assessment

Figure: Sample 2 of an unsuccessful pre-assessment

Figure: Successful pre-assessment

Figure: Host merge sample