NetBackup IT Analytics System Administrator Guide

Last Published:
Product(s): NetBackup IT Analytics (11.4)
  1. Introduction
    1. NetBackup IT Analytics Overview
    2. Purpose of this document
  2. Preparing for updates
    1. About upgrades and updates
    2. Determine the data collector version
    3. Data collector updates with an aptare.jar file
    4. Manual download of the aptare.jar file
    5. Portal updates
  3. Backing up and restoring data
    1. Best practices for disaster recovery
    2. Oracle database backups
    3. File system backups
    4. Oracle database: Cold backup
    5. Oracle database: Export backups
    6. Scheduling the oracle database export
    7. Oracle database: On demand backup
    8. Restoring the NetBackup IT Analytics system
    9. Import the Oracle database
    10. Manual steps for database import / export using data pump
  4. Monitoring NetBackup IT Analytics
    1. Starting and stopping portal server software
    2. Starting and stopping the reporting database
    3. Starting and stopping data collectors
    4. Monitoring tablespaces
  5. Accessing NetBackup IT Analytics reports with the REST API
    1. Overview
    2. Authentication for REST APIs
    3. Extracting data from tabular reports (with pagination)
    4. Exporting reports
    5. Exporting custom dashboards
  6. Defining NetBackup estimated tape capacity
    1. NetBackup estimated tape capacity overview
    2. Estimated capacity notes
    3. Updating the estimated capacity table
    4. Listing volume pool IDs and media types
  7. Automating host group management
    1. About automating host group management
    2. Task overview: managing host groups in bulk
    3. Preparing to use PL/SQL utilities
    4. General utilities
    5. Categorize host operating systems by platform and version
      1. Use Regular Expressions to Override or Modify Default Host OS Categorization
      2. Host OS Categorization Default Settings
      3. Utility to Update Host OS Categorizations
      4. Categorize Host Operating Systems On Demand
    6. Identifying a host group ID
    7. Move or copy clients
    8. Organize clients by attribute
    9. Move host group
    10. Delete host group
    11. Move hosts and remove host groups
    12. Organize clients into groups by backup server
    13. Merge duplicate backup clients
    14. Bulk load utilities
      1. Load host aliases
      2. Load details of new hosts or update existing hosts
      3. Load relationships between hosts and host group
        1. Sample Audit File (output from load_package.loadGroupMemberFile)
    15. Veritas NetBackup utilities
    16. Automate NetBackup utilities
      1. Scheduling a NetBackup Utility Job to Run Automatically
    17. Organize clients into groups by management server
    18. Set up an inactive clients group
    19. Set up a host group for clients in inactive policies
    20. Set up clients by policy
    21. Set up clients by policy type
    22. IBM Tivoli storage manager utilities
    23. Set up clients by policy domain
    24. Set up clients by IBM Tivoli storage manager instance
    25. Scheduling utilities to run automatically
      1. Sample .sql file (setup_ora_job.sql) to set up an automatic job
  8. Attribute management
    1. Attribute bulk load utilities
    2. Attribute naming rules
    3. Rename attributes before upgrading
    4. Load host attributes and values
    5. Load attributes and values and assign to hosts
    6. Load array attributes and values and assign to arrays
    7. Overview of application attributes and values
    8. Load application database attributes and values
    9. Load MS Exchange organization attributes and values
    10. Load LUN attributes and values
    11. Load switch attributes and values
    12. Load port attributes and values
    13. Load Subscription attributes and values
  9. Importing generic backup data
    1. About generic backup data collection
      1. Considerations
    2. Configuring generic backup data collection
    3. CSV Format Specification
      1. EXAMPLE: genericBackupJobs.csv
    4. Manually loading the CSV file
  10. Backup job overrides
    1. Overview
    2. Configure a backup job override
  11. Managing host data collection
    1. Identifying hosts by WWN to avoid duplicates
    2. Setting a host's priority
    3. Determining host ranking
    4. Loading host and WWN relationships
    5. Loading the host HBA port data
    6. Create a CSV file
    7. Execute the script
  12. System configuration in the Portal
    1. System configuration in the Portal
    2. System configuration: functions
    3. Navigation overview
    4. System configuration parameter descriptions: Additional info
    5. Anomaly detection
    6. Data collection: Capacity chargeback
    7. Database administration: database
    8. Host discovery: EMC Avamar
    9. Host discovery: Host
    10. Events captured for audit
    11. Custom parameters
      1. Adding/editing a custom parameter
      2. Portal customizations
      3. Configuring global default inventory object selection
      4. Restricting user IDs to single sessions
      5. Customizing date format in the report scope selector
      6. Customizing the maximum number of lines for exported reports
      7. Customizing the total label display in tabular reports
      8. Customizing the host management page size
      9. Customizing the path and directory for File Analytics database
      10. Configuring badge expiration
      11. Configuring the maximum cache size in memory
      12. Configuring the cache time for reports
  13. Performance profile schedule customization
    1. Overview
    2. Customize the performance profile schedule
  14. LDAP and SSO authentication for Portal access
    1. Overview
      1. Active directory tools
      2. Using LDP to find the base DN
      3. Using LDP to search active directory
    2. Configure AD/LDAP
      1. AD/LDAP configuration for authentication
      2. AD/LDAP Configuration for authentication and authorization
      3. Migrate portal users when AD/LDAP authentication is configured
      4. Migrate portal users with LDAP authentication and authorization configured
    3. Configure single sign-on (SSO)
      1. Single sign-on (SSO) prerequisites
      2. Setting up the external Identity Provider (IDP) server
        1. Users and groups in the external LDAP directory
        2. Registering with the IDP server
      3. Activate single Sign-on (SSO) in the portal
      4. SSO troubleshooting and maintenance
  15. Change Oracle database user passwords
    1. Overview
    2. Database connection properties
    3. Modify the Oracle database user passwords
    4. Modify the Oracle database user passwords for split architecture
    5. Determine if Oracle is using the default login password
  16. Integrate with CyberArk
    1. Introduction
    2. CyberArk setup prerequisites
    3. Setting up the portal to integrate with CyberArk
  17. Tuning NetBackup IT Analytics
    1. Before you begin tuning
    2. Tuning the portal database
    3. Performance recommendations
    4. Reclaiming free space from Oracle
    5. Portal / Data receiver Java memory settings
  18. Working with log files
    1. About debugging NetBackup IT Analytics
    2. Turn on debugging
    3. Database logging
    4. Portal and data collector log files - reduce logging
      1. Portal Log Files
      2. Data Collector Log Files
    5. Database SCON logging - reduce logging
    6. Refreshing the database SCON log
    7. Logging user activity in audit.log
    8. Logging only what a user deletes
    9. Logging all user activity
    10. Data collector log files
    11. Data collector log file organization
    12. Data collector log file naming conventions
      1. Sample Vendor.Product Naming Convention
      2. Log File Names Based on Data Collector Generation
      3. Checkinstall Log
      4. Test Connection Log
      5. Log file naming convention by collected system
    13. General data collector log files
    14. Find the event / meta collector ID
    15. Portal log files
      1. Managing Apache Log Files
    16. Database log files
    17. Installation / Upgrade log files
  19. Defining report metrics
    1. Changing backup success percentage
    2. Changing job status
  20. SNMP trap alerting
    1. Overview
    2. SNMP configurations
    3. Standard OIDs
    4. Data in an alerting trap
      1. Example of policy based alert
  21. SSL certificate configuration
    1. SSL certificate configuration
    2. SSL implementation overview
    3. Obtain an SSL certificate
    4. Update the web server configuration to enable SSL
    5. Configure virtual hosts for portal and / or data collection SSL
      1. SSL Implementation for the Portal Only
      2. SSL Implementation for Data Collection Only
      3. SSL Implementation for Both the Portal and Data Collection
    6. Enable / Disable SSL for a Data Collector
    7. Enable / Disable SSL for emailed reports
    8. Test and troubleshoot SSL configurations
    9. Create a self-signed SSL certificate
    10. Configure the Data Collector to trust the certificate
    11. Keystore file locations on the Data Collector server
    12. Import a certificate into the Data Collector Java keystore
    13. Keystore on the portal server
      1. Features that Require the SSL Certificate
      2. Add a Certificate into the Portal Keystore
      3. Update a Certificate in the Portal Keystore
      4. Download a Certificate from the Portal Keystore
    14. Add a virtual interface to a Linux server
    15. Add a virtual / secondary IP address on Windows
  22. Portal properties: Format and portal customizations
    1. Introduction
    2. Configuring global default inventory object selection
    3. Restricting user IDs to single sessions
    4. Customizing date format in the report scope selector
    5. Customizing the maximum number of lines for exported reports
    6. Customizing the total label display in tabular reports
    7. Customizing the host management page size
    8. Customizing the path and directory for file analytics database
    9. Configuring badge expiration
    10. Configuring the maximum cache size in memory
    11. Configuring the cache time for reports
    12. Configuring LDAP to use active directory (AD) for user group privileges
  23. Data retention periods for SDK database objects
    1. Data retention periods for SDK database objects
    2. Data aggregation
      1. Pre-requisites
      2. Data aggregation and retention levels
    3. Find the domain ID and database table names
    4. Retention period update for SDK user-defined objects example
    5. SDK user-defined database objects
    6. Capacity: default retention for basic database tables
    7. Capacity: default retention for EMC Symmetrix enhanced performance
    8. Capacity: Default retention for EMC XtremIO
    9. Capacity: Default retention for Dell EMC Elastic Cloud Storage (ECS)
    10. Capacity: Default retention for Windows file server
    11. Capacity: Default retention for Pure Storage FlashArray
    12. Cloud: Default retention for Amazon Web Services (AWS)
    13. Cloud: Default retention for Microsoft Azure
    14. Cloud: Default retention for OpenStack Ceilometer
    15. Configure multi-tenancy data purging retention periods
  24. Troubleshooting
    1. Troubleshooting user login problems
    2. Forgotten password procedure
    3. Login issues
    4. Connectivity issues
    5. Data Collector and database issues
      1. Insufficient Privileges
      2. Remove an Inactive Hitachi Array from the Database
      3. Report Emails are not Being Sent
        1. Additional Email Troubleshooting Recommendations
      4. General Reporting Issues
      5. Performance Issues
    6. Portal upgrade performance issues
  25. Appendix A. Kerberos based proxy user's authentication in Oracle
    1. Overview
      1. Pre-requisite
    2. Exporting service and user principal's to keytab file on KDC
    3. Modifications for Oracle
    4. Modifications for Portal
  26. Appendix B. Configure TLS-enabled Oracle database on NetBackup IT Analytics Portal and data receiver
    1. About Transport Layer Security (TLS)
    2. TLS in Oracle environment
    3. Configure TLS in Oracle with NetBackup IT Analytics on Linux in split architecture
    4. Configure TLS in Oracle with NetBackup IT Analytics on Linux in non-split architecture
    5. Configure TLS in Oracle with NetBackup IT Analytics on Windows in split architecture
    6. Configure TLS in Oracle with NetBackup IT Analytics on Windows in non-split architecture
    7. Configure TLS in user environment
  27. Appendix C. NetBackup IT Analytics for NetBackup on Kubernetes and appliances
    1. Configure embedded NetBackup IT Analytics Data collector for NetBackup deployment on appliances (including Flex appliances)
    2. Configure NetBackup IT Analytics for NetBackup deployment on Kubernetes

Manual steps for database import / export using data pump

Manual steps for Linux Data Pump Export (CDB and non-CDB environments)

Follow these steps to execute the Data Pump Export in a Linux environment:

  1. Log in to the Linux database server and switch to the aptare user.
  2. Ensure that the file /opt/aptare/database/tools/expdp_scdb.par is owned by the aptare user and has 755 permissions.
  3. Ensure the Oracle listener and Oracle services are running.
  4. Run the following commands:
    su - aptare
    sqlplus / as sysdba
    alter session set container=scdb;

    Note:

    The alter session set container=scdb; command is required only for a container database. Omit it in a non-CDB environment.

    CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';

    To use a preferred folder such as new_directory_path instead:

    CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';

  5. Export the database using the following command:
    /opt/aptare/oracle/bin/expdp parfile=/opt/aptare/database/tools/expdp_scdb.par
  6. Alternatively, you can skip the par file and pass the parameters directly on the expdp command line. In that case, the command above can be replaced by the following command, also run as the aptare user (an illustrative parameter-file equivalent is sketched after this procedure).
    /opt/aptare/oracle/bin/expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y directory=datapump_dir dumpfile=aptare_scdb.exp logfile=export_scdb.log CONTENT=ALL flashback_time=systimestamp

    After successful completion, the Data Pump export file aptare_scdb.exp is saved in the /tmp directory of the Linux database server.

    If you specified a preferred directory, aptare_scdb.exp is saved to that location (such as /new_directory_path).

  7. This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above. Run cp /opt/aptare/datarcvrconf/aptare.ks /tmp to copy the aptare.ks file to the /tmp folder.
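
For reference, a parameter file equivalent to the inline expdp command in step 6 might look like the following. This is an illustrative sketch only, mirroring the parameters of the inline command; the expdp_scdb.par file shipped with the product may contain additional or different parameters, and the connect string (system/aptaresoftware@//localhost:1521/scdb) is still supplied on the expdp command line.

    FULL=Y
    DIRECTORY=datapump_dir
    DUMPFILE=aptare_scdb.exp
    LOGFILE=export_scdb.log
    CONTENT=ALL
    FLASHBACK_TIME=systimestamp
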
Manual steps for Linux Data Pump Import (CDB and non-CDB environments)

Follow these steps to execute the Data Pump Import in a Linux environment:

  1. Place the export file aptare_scdb.exp created by the Data Pump export in the /tmp directory.

    If you prefer a different directory (for example, /new_directory_path), place aptare_scdb.exp there instead.

  2. Ensure the aptare_scdb.exp file is owned by the aptare user and has 755 permissions.
  3. Ensure that the files /opt/aptare/database/tools/unlock_portal_linux.sql and /opt/aptare/database/tools/impdp_scdb.par are owned by the aptare user and have 755 permissions.
  4. As the root user, stop all Oracle and Aptare services by running: /opt/aptare/bin/aptare stop
  5. As the root user, start the Oracle services by running: /opt/aptare/bin/oracle start
  6. Ensure the Oracle listener is running. As the aptare user, check the listener status with: lsnrctl status
  7. Run the following commands:
    su - aptare
    sqlplus / as sysdba
    alter session set container=scdb;

    Note:

    The alter session set container=scdb; command is required only for a container database. Omit it in a non-CDB environment.

    drop user aptare_ro cascade;

    drop user portal cascade;

    CREATE OR REPLACE DIRECTORY datapump_dir AS '/tmp';

    To use a preferred folder such as new_directory_path instead:

    CREATE OR REPLACE DIRECTORY datapump_dir AS '/new_directory_path';

  8. Run the following command as the aptare user:
    /opt/aptare/oracle/bin/impdp parfile=/opt/aptare/database/tools/impdp_scdb.par
  9. Alternatively, you can skip the par file and pass the parameters directly on the impdp command line. In that case, the command above can be replaced by the following command, also run as the aptare user.
    /opt/aptare/oracle/bin/impdp system/aptaresoftware@//localhost:1521/scdb schemas=portal,aptare_ro directory=datapump_dir dumpfile=aptare_scdb.exp logfile=import_scdb.log
  10. When the import completes in a non-CDB environment, remove the first command 'alter session set container = scdb;' from the file unlock_portal_linux.sql (see the sketch after this procedure) and run the following commands as the aptare user.

    Note:

    Removing 'alter session set container = scdb;' is required only for a non-CDB environment; no change is needed for a container database.

    sqlplus / as sysdba
    @/opt/aptare/database/tools/unlock_portal_linux.sql
  11. After exiting from sqlplus, execute the following commands as the aptare user:
    sqlplus portal/portal@//localhost:1521/scdb
    @/opt/aptare/database/tools/validate_sp.sql
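
For step 10, on a non-CDB system you can remove the container command from the unlock script with a one-line edit. The following is a minimal sketch, assuming GNU sed is available; it keeps a backup copy and should be run as the aptare user:

    # Non-CDB only: back up the unlock script, then delete the 'alter session set container' line.
    cp /opt/aptare/database/tools/unlock_portal_linux.sql /opt/aptare/database/tools/unlock_portal_linux.sql.bak
    sed -i '/alter session set container/Id' /opt/aptare/database/tools/unlock_portal_linux.sql
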
Post-procedure steps
  1. Go to the /tmp directory and check the file import_scdb.log.

    If you specified a preferred directory, check for import_scdb.log in that location.

  2. Check the log file for compilation warnings for the view apt_v_solution_history_log and the packages cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, and util. These compilation warnings are addressed by the script itself and no action is required from the user (see the log-check sketch after this procedure).

    Note:

    If you are importing a database from version 10.4, upgrade the portal to the 10.5 build after the import.

  3. This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above. Run the following commands to copy the aptare.ks file to the datarcvrconf folder.

    cp /tmp/aptare.ks /opt/aptare/datarcvrconf/
    chown aptare:tomcat /opt/aptare/datarcvrconf/
    chmod 664 /opt/aptare/datarcvrconf/aptare.ks
  4. Run updateUser.sh to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.sh admin123 newPassword

  5. Restart all Oracle and Aptare services by running /opt/aptare/bin/aptare restart as the root user.

  6. Log in to the Portal application using the application account.
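
As a quick supplementary check (not part of the documented procedure), you can also scan the import log for Oracle errors from the shell. Adjust the path if you used a preferred directory:

    # Flag any ORA- errors in the Data Pump import log; the compilation warnings for the
    # packages listed in step 2 are expected and are corrected by the script itself.
    grep -iE 'ORA-[0-9]+' /tmp/import_scdb.log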

Manual steps for Windows Data Pump Export (CDB and non-CDB environments)

Follow these steps for the Windows Data Pump Export:

  1. Log in to the Windows database server.
  2. Ensure the Oracle TNS listener and Oracle services are running.
  3. Ensure the aptare user has access to the file c:\opt\oracle\database\tools\expdp_scdb_win.par.

    Run the following commands (an optional check of the datapump_dir directory mapping is sketched after this procedure):

    sqlplus system/aptaresoftware@//localhost:1521/scdb

    create or replace directory datapump_dir as 'c:\opt\oracle\logs';

    exit

  4. After exiting sqlplus, execute the following command: c:\opt\oracle\bin\expdp parfile=c:\opt\oracle\database\tools\expdp_scdb_win.par
  5. Alternatively, you can skip the par file and pass the parameters directly on the expdp command line. In that case, replace the command above with: c:\opt\oracle\bin\expdp system/aptaresoftware@//localhost:1521/scdb FULL=Y DIRECTORY=datapump_dir LOGFILE=export_scdb.log DUMPFILE=aptare_scdb.exp CONTENT=ALL FLASHBACK_TIME=systimestamp
  6. After successful completion, the Data Pump export file aptare_scdb.exp is saved in the C:\opt\oracle\logs directory of the Windows database server.

  7. Copy the file c:\opt\datarcvrconf\aptare.ks to the c:\opt\oracle\logs folder.

    Note:

    This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above.
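
Optionally, before running expdp you can confirm that the datapump_dir directory object points at the expected path. This query is not part of the documented procedure; it assumes a privileged sqlplus connection (for example, the system user used above):

    SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATAPUMP_DIR';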

Manual steps for Windows Data Pump Import (CDB and non-CDB environments)

Follow these steps for the Windows Data Pump Import:

  1. Log in to the Windows database server.
  2. The aptare user should already have access to the import files c:\opt\oracle\database\tools\unlock_portal_win.sql and c:\opt\oracle\database\tools\impdp_scdb_win.par. If the Oracle user does not have read and execute privileges on these files, grant those privileges before starting the import.
  3. Place the export file aptare_scdb.exp in the c:\opt\oracle\logs directory.
  4. If the name of the export file is capitalized, change it to lower case (for example, rename 'APTARE_SCDB.EXP' to 'aptare_scdb.exp').
  5. Stop all Oracle and Aptare services by running stopAllServices from the Windows Services tab.
  6. Start OracleServicescdb from the Windows Services tab and ensure the Oracle TNS listener is running.

    Run the following commands:

    sqlplus / as sysdba

    alter session set container = scdb; (this command is required only for a container database; omit it in a non-CDB environment)

    DROP USER aptare_ro CASCADE;

    DROP USER portal CASCADE;

    CREATE OR REPLACE DIRECTORY datapump_dir AS 'c:\opt\oracle\logs';

    EXIT;

  7. After exiting sqlplus, execute the following command:

    c:\opt\oracle\bin\impdp parfile=c:\opt\oracle\database\tools\impdp_scdb_win.par

  8. Alternatively, you can skip the par file and pass the parameters directly on the impdp command line (an optional check that the portal and aptare_ro schemas were recreated is sketched after this procedure). In that case, replace the command above with: c:\opt\oracle\bin\impdp "sys/*@//localhost:1521/scdb as sysdba" SCHEMAS=portal,aptare_ro DIRECTORY=datapump_dir LOGFILE=import_scdb.log DUMPFILE=aptare_scdb.exp
  9. After the import is complete, execute the following command: sqlplus "sys/*@//localhost:1521/scdb as sysdba" @c:\opt\oracle\database\tools\unlock_portal_win.sql
  10. After exiting sqlplus, execute the following command: sqlplus portal/portal@//localhost:1521/scdb @c:\opt\oracle\database\tools\validate_sp.sql
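
As an optional verification (not part of the documented procedure), you can confirm from a privileged sqlplus session that the import recreated the portal and aptare_ro schemas before moving on to the post-procedure steps:

    SELECT username, account_status FROM dba_users WHERE username IN ('PORTAL', 'APTARE_RO');
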
Post-procedure steps
  • To check the import logs, go to c:\opt\aptare\oracle\logs and check the file import_scdb.log.

  • Check the log file for compilation warnings for the view apt_v_solution_history_log and the packages cmv_adaptor_pkg, avm_common_pkg, sdk_common_pkg, server_group_package, load_package, common_package, and util. These compilation warnings are addressed by the script itself and no action is required from the user.

    Note:

    If you are importing a database from version 10.4, upgrade the portal to the 10.5 build after the import.

  • Copy the saved file c:\opt\oracle\logs\aptare.ks to the c:\opt\datarcvrconf\ folder. Ensure that the file is owned by the NetBackup IT Analytics user and that the user has read and write access to the copied file.

    Note:

    This step is required only if the database was exported from NetBackup IT Analytics version 10.5 or above.

  • After successful completion of the import process, run stopAllServices from the Windows Services tab.

  • Run startAllServices from the Windows Services tab.

  • Run updateUser.bat from the utils directory to change the password of the application account. For example, to change the password for the admin123 application user, run: updateUser.bat admin123 newPassword

  • Log in to the Portal application using the application account.