APTARE IT Analytics Help
- Section I. Introducing APTARE IT Analytics
- Section II. Certified configurations
- Portal and database servers
- Data Collector server configurations
- Capacity Manager configurations
- Array/LUN performance Data Collection
- EMC Isilon array performance metrics
- NetApp Cluster-Mode performance metrics
- EMC Symmetrix enhanced performance metrics
- Host access privileges, sudo commands, ports, and WMI proxy requirements
- Cloud configurations
- Virtualization Manager configurations
- File Analytics configurations
- Fabric Manager configurations
- Backup Manager configurations
- ServiceNow configurations
- Internal TCP port requirements
- Section III. End user
- Understand the portal
- About the Admin tab
- Explore your inventory
- Hierarchy toolbar to organize your data
- Show objects
- Use attributes to organize your data
- Pin reports - saving reports with inventory objects
- Assign attributes in the inventory list view
- Get acquainted with reports
- About badging
- Generate and maintain reports
- Select Report Scope
- Group hosts by attributes
- Search for hosts in the report Scope Selector
- Backup Manager advanced scope selector settings
- Solution reports scope selector settings
- Units of Measure in Reports
- Customize report filter logic
- Sort columns in reports
- Distribute, share, schedule, and alert
- Scheduling Exported Reports and Dashboards
- Organize reports
- Work with the dynamic template designer
- Dynamic Template Designer Quick Start
- Converting to a Homogeneous, Product-Specific Template
- Dynamic Template Function Configurations
- Create Fields with the Field Builder
- Scope Selector Component - Custom Filter
- Configure a Bar Chart Dynamic Template
- Steps to Create a Bar Chart Dynamic Template
- Configure an Area/Stacked Area Chart Dynamic Template
- Line Charts for Performance Metrics
- Line Chart Field Requirements
- One Object Per Line Chart, One or More Metrics Per Chart
- Multiple Objects Per Line Chart, One Metric Per Chart
- Example of a Stacked Bar Chart Dynamic Template
- Create a Sparkline Chart in a Tabular Dynamic Template
- Adding or Editing Methods
- Validate and Save a Method
- Work with the SQL template designer
- Configure SQL Template Scope Selector Components
- Sample SQL Queries
- Number, Size, Date, and Time Formatting
- Alignment, Aggregation, Bar Type, and Bar Type Color
- Pipelined functions for report query building
- APTlistOfDates
- aptStringConcat
- getServerAttributeValue
- getObjectAttributeValue
- getChildServerGroupContextById
- getServerGroupContextById
- secsToHoursMinSecs
- APTgetTapeDriveStatusName
- getFullPathname
- listJobSummaryAfterRestart
- listJobSummaryAfterRestartNBW
- listJobSummaryAfterRestart for NetWorker Backup Jobs
- listOfBackupWindowDates
- listChargebackCatByVOLSDetail
- listChargebackCatByFSDetail
- listChargebackByLUNSummary
- listChargebackByLUNDetail
- listChargebackCatByLUNSummary
- listChargebackCatByLUNDetail
- Alert configuration
- Add/Edit an Alert Policy
- Manage hosts, backup servers, and host groups
- NetBackup Master Servers
- Manage attributes and objects
- Provide Portal access and user privileges
- Setting / Resetting passwords
- Managing user group home pages (Administrator)
- Configure master schedules and backup windows
- Add, edit, and move policies
- Add/Edit a threshold policy
- Capacity Chargeback policy types
- Solutions administration
- Manage and monitor data collection
- About data collection tasks
- Add/Edit data collectors
- Review collectors and collection status
- Upgrade Data Collectors
- Work with Capacity Manager host data collection
- Host access requirements
- Manage credentials
- Configure host discovery policies to populate the host discovery and collection view
- Validate host connectivity
- Search and export in host discovery and collection
- Propagate probe settings: Copy probes, paste probes
- Discovery policies for Veritas NetBackup
- About Discovery types
- View and manage system notifications
- Customize with advanced parameters
- Access control advanced parameters
- General Data Collection advanced parameters
- Cloud data collection advanced parameters
- Host discovery and collection advanced parameters
- Backup Manager advanced parameters
- Capacity Manager advanced parameters
- File Analytics advanced parameters
- Virtualization Manager advanced parameters
- Manage your Portal environment
- Analyze files
- Troubleshoot the Portal
- Retrieving log files
- Debug
- Attribute inheritance overrides
- Understand report data caching
- Understand the portal
- Section IV. Report Reference
- Introduction to APTARE IT Analytics
- Alert Reports
- Risk Mitigation Solution Reports
- Risk Mitigation Reports
- Storage Optimization Solution Reports
- System Administration Reports
- Oracle Job Overview
- Capacity Manager Reports
- Application Capacity Reports
- Array Capacity Utilization Reports
- Array Capacity & Utilization (Generic Data)
- Array Capacity & Utilization (IBM SVC View)
- Array Capacity and Utilization (IBM XIV View)
- Array Capacity and Utilization (NetApp View)
- Array Capacity and Utilization (NetApp Cluster)
- NetApp Storage System Detail
- Array Capacity and Utilization (OpenStack Swift)
- IBM Array Site Summary
- IBM Array Detail
- LUN Utilization Summary
- NetApp Aggregate Detail
- NetApp Cluster-Mode Aggregate Detail
- NetApp Plex Details
- NetApp Volume Details
- NetApp Cluster-Mode Volume Detail
- Available/Reclaimable Capacity Reports
- Capacity at Risk Reports
- Capacity Chargeback Reports
- Host Capacity Utilization Reports
- SnapMirror Reports
- SnapVault Reports
- Capacity Forecasting Reports
- Storage Performance Reports
- Mission Control for Performance Analysis
- Thin Provisioning Reports
- Hitachi Dynamic Provisioning Pool Utilization
- File Analytics Reports
- Virtualization Manager Reports
- Understanding the Datastore
- VM Server Detail
- VM Snapshot Summary
- VM Detail
- Datastore Utilization Summary
- Datastore Detail
- Fabric Manager Reports
- Host to Storage Dashboard
- Backup Manager Management Reports
- Error Log Summary
- Job Duration Report
- Veeam Backup & Replication Job Summary Report (Homogeneous)
- Veeam and RMAN Job Details Report
- Adding a Note to a Job
- Job Volume Summary Report
- NetBackup deduplication to MSDP savings
- Backup Administration Reports
- Host Details
- IBM Spectrum Protect (TSM) Storage Pools Dashboard
- Backup Media Management Reports
- TSM Tape Media Detail Table
- Backup Service Level Agreement (SLA) Reports
- Determining and Improving Backup Start Time Performance
- Determining and Improving Backup Success Performance
- Determining and Improving Backup Duration Performance
- Backup Storage Utilization Reports
- Backup Manager Forecasting Reports
- Backup Billing and Usage Reports
- Backup Policies Reports
- HP Data Protector Backup Specification Detail
- Public Cloud Reports
- Section V. Data Collector Installation and Troubleshooting
- Installing the Data Collector Software
- Validating Data Collection
- Uninstalling the Data Collector
- Manually Starting the Data Collector
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- CRON Expressions and Probe Schedules
- Clustering Data Collectors with VCS and Veritas NetBackup (RHEL 7)
- Clustering Data Collectors with VCS and Veritas NetBackup (Windows)
- Installing the Data Collector Software
- Section VI. Data Collection for the Cloud
- Pre-Installation Setup for Amazon Web Services (AWS)
- Create an AWS IAM user
- Link AWS accounts for Collection of consolidated billing data
- Pre-Installation Setup for OpenStack Ceilometer
- Pre-Installation Setup for OpenStack Swift
- Pre-Installation Setup for Microsoft Azure
- Pre-Installation Setup for Amazon Web Services (AWS)
- Section VII. Data Collection for Data Protection (Backup)
- Introduction
- Pre-Installation setup for Commvault Simpana
- Open TCP/IP access to the Commvault database
- Set up a read-only user in the CommServe server
- Pre-Installation setup for Cohesity DataProtect
- Pre-Installation setup for EMC Avamar
- Import EMC Avamar server information
- Pre-Installation setup for EMC Data Domain Backup
- Pre-Installation setup for EMC NetWorker
- Architecture overview (EMC NetWorker)
- Pre-Installation setup for Dell EMC NetWorker Backup & Recovery
- Pre-Installation setup for generic backup
- CSV format specification
- Pre-Installation setup for HP Data Protector
- Architecture overview (HP Data Protector)
- Configure the Data Collector server in Cell Manager (HP Data Protector)
- Pre-Installation setup for IBM Spectrum Protect (TSM)
- Architecture overview (IBM Spectrum Protect -TSM)
- Import IBM Spectrum Protect (TSM) information
- Pre-Installation setup for IBM Spectrum Protect Plus
- Pre-Installation setup for NAKIVO Backup & Replication
- Pre-Installation setup for Veritas Backup Exec
- Pre-Installation setup for Veritas NetBackup
- Prerequisites to use SSH and WMI (Veritas NetBackup)
- Prerequisites for NetBackup collection over SSH (Kerberos option)
- Veritas NetBackup 8.1 (or later) requirements for centralized collection
- Configuring file analytics in NetBackup Data Collector policy
- Pre-Installation setup for Veritas SaaS backup
- Pre-Installation setup for Oracle Recovery Manager (RMAN)
- Pre-Installation setup for Rubrik Cloud Data Management
- Pre-Installation setup for Veeam Backup & Replication
- Discovery policies for Veritas NetBackup
- About Discovery types
- About SNMP probes
- Appendix A. Load Historic Events
- Load Veritas NetBackup events
- Section VIII. Data Collection for Fabrics
- Section IX. Data Collection for File Analytics
- Pre-Installation Setup for File Analytics
- Host Discovery and Collection File Analytics probe
- Adding a File Analytics Data Collector policy
- File Analytics Export Folder Size and Folder Depth
- Pre-Installation Setup for File Analytics
- Section X. Data Collection for Replication
- Section XI. Data Collection for Storage (Capacity)
- Data Collection for Capacity Overview
- Pre-Installation Setup for Dell Compellent
- Pre-Installation Setup for DELL EMC Elastic Cloud Storage (ECS)
- Pre-Installation Setup for EMC Data Domain Storage
- Pre-Installation Setup for EMC Isilon
- Pre-Installation Setup for EMC Symmetrix
- Pre-Installation Setup for Dell EMC Unity
- Pre-Installation Setup for EMC VNX Celerra
- Pre-Installation Setup for EMC VNX CLARiiON
- Pre-Installation Setup for EMC VPLEX
- Pre-Installation Setup for EMC XtremIO
- Pre-Installation Setup for Hitachi Block
- Configuring a Hitachi Device manager user
- Pre-Installation Setup for Hitachi Content Platform (HCP)
- Hitachi content platform system management console
- Hitachi content platform tenant management console
- Pre-Installation Setup for Hitachi NAS
- Pre-Installation Setup for Hitachi Vantara All-Flash and Hybrid Flash Storage
- Host Inventory Pre-Installation Setup
- Host Access Privileges, Sudo Commands, Ports, and WMI Proxy Requirements
- Configure host Discovery policies to populate the host Inventory
- Validate host connectivity
- Host Inventory search and host Inventory export
- Configure and edit host probes
- Propagate Probe Settings: Copy Probes, Paste Probes
- Pre-Installation Setup for HP 3PAR
- Pre-Installation Setup for HP EVA
- Pre-Installation Setup for Huawei OceanStor
- Pre-Installation Setup for IBM COS
- Pre-Installation Setup for IBM Enterprise
- Pre-Installation Setup for NetApp E-Series
- Pre-Installation Setup for IBM SVC
- Pre-Installation Setup for IBM XIV
- Pre-Installation Setup for Infinidat InfiniBox
- Pre-installation setup for FUJITSU Data Collector
- Pre-Installation setup for Infinidat InfiniGuard
- Pre-Installation Setup for NetApp-7
- Pre-Installation setup for NetApp StorageGRID
- Pre-Installation Setup for Microsoft Windows Server
- Pre-Installation Setup for NetApp Cluster
- Pre-Installation Setup for Pure Storage FlashArray
- Pre-Installation Setup for Veritas NetBackup Appliance
- Section XII. Data Collection for Virtualization
- Pre-Installation setup for VMware
- Pre-Installation setup for IBM VIO
- Pre-Installation setup for Microsoft Hyper-V
- Section XIII. System Administration
- Preparing for Updates
- Backing Up and Restoring Data
- Monitoring APTARE IT Analytics
- Accessing APTARE IT Analytics Reports with the REST API
- Defining NetBackup Estimated Tape Capacity
- Automating Host Group Management
- Categorize host operating systems by platform and version
- Load relationships between hosts and host group
- Automate NetBackup utilities
- Scheduling utilities to run automatically
- Attribute Management
- Importing Generic Backup Data
- Backup Job Overrides
- Managing Host Data Collection
- System Configuration in the Portal
- Performance Profile Schedule Customization
- Configuring AD/LDAP
- Configuring Single Sign On (SSO) Using Security Assertion Markup Language (SAML)
- Changing Oracle Database User Passwords
- Integrating with CyberArk
- Tuning APTARE IT Analytics
- Defining Report Metrics
- Working with Log Files
- Portal and data collector log files - reduce logging
- Data collector log file naming conventions
- Portal log files
- SNMP Trap Alerting
- SSL Certificate Configuration
- Configure virtual hosts for portal and / or data collection SSL
- Keystore on the portal server
- Portal Properties: Format and Portal Customizations
- Advanced Configuration for NetBackup Discovery
- Data Retention Periods for SDK Database Objects
- Troubleshooting
- Section XIV. Portal Install and Upgrade (Windows)
- Installing the Portal on a Windows server
- Task 3: Installing Oracle application binaries (Windows)
- Upgrade APTARE IT Analytics Portal on Windows
- Upgrade the Oracle database application binaries to 19c (Windows)
- Upgrade APTARE IT Analytics Portal
- Oracle patches for the database server
- Upgrade and migrate to a new server
- Installing the Portal on a Windows server
- Section XV. Portal Install and Upgrade (Linux)
- Install the APTARE IT Analytics Portal on a Linux server
- Installer-based deployment
- Upgrade APTARE IT Analytics Portal on Linux
- Upgrade Oracle database application binaries to 19c (Linux)
- Upgrade APTARE IT Analytics Portal
- Data Collector upgrades
- Oracle patches for the database server
- Upgrade and Migrate to a new server
- Upgrade and migrate to a new server
- Upgrade and migrate to a new server
- Appendix B. X Virtual Frame Buffer
- Install the APTARE IT Analytics Portal on a Linux server
- Section XVI. Licensing
- License installation and guidelines
- License overview
- Verify the current license configuration
- Storage suite
- Protection suite
- Backup Manager
- Complete suite
- Managing licenses
- Configure the Data Collector policy to exclude the object
- License management from the command line
- Troubleshooting
- License installation and guidelines
- Section XVII. Inventory reports and operations
Upgrade Oracle database application binaries (Linux)
Ensure that the APTARE IT Analytics server does not have any other Oracle database instances installed. Also note the instructions provided with the confirmation of your purchase agreement, and consult Veritas Support if you require additional assistance.
To upgrade the Oracle database binaries:
- Perform a cold backup of your Oracle database. This means that you physically copy or back up the files to another location. This cold backup simplifies the restore process in the event of unanticipated data loss.
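For example, after stopping the Portal and Oracle services with your normal shutdown procedure, the database file directories can be copied to backup storage. This is a minimal sketch: the /backup destination is a placeholder, and /data01 through /data06 are the default APTARE database directories; adjust both to your environment.
# Stop the Portal and Oracle services first, then archive the data directories.
tar czf /backup/aptare_oracle_cold_$(date +%Y%m%d).tar.gz /data01 /data02 /data03 /data04 /data05 /data06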
- Export your Oracle database. This can be done manually prior to the upgrade, or you can elect to have it done as part of the upgrade process.
- Verify that you have the current version of the Oracle 19c Installer binaries.
- Total temporary file system (tmpfs) memory must be 24 GB or greater; otherwise, Oracle will fail to start. Increase the size of tmpfs, typically in /etc/fstab.
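For example, a 24 GB tmpfs can be configured in /etc/fstab and applied without a reboot (a minimal sketch; adjust the size to your system):
# /etc/fstab entry: tmpfs   /dev/shm   tmpfs   defaults,size=24g   0 0
mount -o remount,size=24g /dev/shm
df -h /dev/shm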
- Download the following Oracle patches from Veritas and keep them in a directory on the server where Oracle Database will be installed.
p31281355_190000_Linux-x86-64.zip
p30565805_198000DBRU_Linux-x86-64.zip
The path to this directory is required by the Oracle database installer during the upgrade process; the installer applies these patches as part of the upgrade.
The Oracle installer installs only the patches listed above. Any other patches released by Oracle must be installed manually. See Apply Oracle-recommended patches.
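For example, the patches can be staged in a dedicated directory that the aptare user can write to. The directory name /opt/aptare/patches19c is only an example, and the commands assume the aptare user and group already exist from the current installation.
mkdir -p /opt/aptare/patches19c
cp p31281355_190000_Linux-x86-64.zip p30565805_198000DBRU_Linux-x86-64.zip /opt/aptare/patches19c/
chown -R aptare:aptare /opt/aptare/patches19c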
- Verify that the Oracle database does not have any invalid objects. The installer checks the database for invalid objects and, if any are found, prompts you to delete them. It is recommended that you consult Support before deleting invalid objects.
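To check for invalid objects yourself before running the installer, a query such as the following can be run as SYSDBA (illustrative only; run it as the Oracle software owner, typically the aptare user):
sqlplus -s / as sysdba <<'EOF'
SELECT owner, object_type, object_name FROM dba_objects WHERE status = 'INVALID';
EOF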
- Log in as root on the server where the APTARE IT Analytics database will be installed. Typically, this is also the Portal server.
- Place the ISO image into the /mnt directory.
- Mount the ISO image that you downloaded.
mkdir /mnt/diskd
mount -o loop <sc_dbinstaller_XXXXX_XXX_linux.iso> /mnt/diskd
where you substitute the relevant name of the ISO file that you downloaded.
- Enter the following commands to start the installer:
cd /
/mnt/diskd/install_oracle.sh
This command copies the Oracle binaries into /opt/aptare/oracle19c.
- Press Enter to read the entire EULA and accept the agreement. The upgrade process begins by detecting the existing Oracle installation and switching to upgrade mode.
- Provide the absolute directory path where recommended Oracle patches are downloaded.
Note:
The aptare user must have write access to the directory where these patches are downloaded.
The database upgrade process installs the Oracle security patches if they are available on the system. Enter the absolute directory path where these patches are downloaded.
- Enter PROCEED to proceed with the upgrade.
This takes 3-5 minutes to complete, as it installs files into /opt/aptare/oracle19c.
Creating group aptare...Done.
Creating user aptare...with default Group aptare... Done.
Creating group dba...Done.
Adding user aptare to group dba...Done.
Creating ORACLE_HOME directory in /opt/aptare/oracle ... Done.
Setting up database directories /data01 /data02 /data03 /data04 /data05 /data06... Done.
Installing ORACLE binaries in /opt/aptare/oracle19c ...
Extracting files...
Please wait, this process will take 3-5 minutes to complete... Done.
Setting permissions for oracle files ... Done.
Done.
- The Oracle database installer installs the recommended Oracle patches. Before installing them, the installer verifies the Oracle inventory and re-creates it if any issue is found.
- After the extraction of Oracle 19c binaries, the pre-upgrade process begins. This includes:
Acknowledgement that a cold backup was performed. As part of the upgrade process, Oracle 19c binaries are installed on your system, the database is upgraded using the AutoUpgrade utility, and it is converted to a container database (CDB). A cold backup of the Oracle data files is required to protect against possible errors and data loss.
Compatibility check: verifies that the existing database is compatible for a direct upgrade to Oracle 19c.
Database export. This is required. As part of this process you can either verify an existing export or direct the upgrader to perform the export for you.
Verification of a database export: An Oracle database export is required. This export is in addition to the full Oracle file-system cold backup. If you exported the database before starting the upgrade process, enter SKIP and then enter the existing database export file name and location.
OR
Export the database: An Oracle database export is required. If you did not export and verify the database before the upgrade, the upgrader can export it for you. This export is in addition to the full Oracle file-system cold backup. Enter PROCEED to export the database and enter a location for the upgrader to place the files. This step can take 20-30 minutes, depending on the size of the database.
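For reference, a manual full export with Oracle Data Pump might look like the following sketch. This is illustrative only: the export_dir directory object, its /backup/oracle_export path, and the dump file name are placeholders, and the upgrader can perform the export for you instead. Run the commands as the Oracle software owner (typically aptare).
# Create a directory object pointing at a location with enough free space.
sqlplus -s / as sysdba <<'EOF'
CREATE OR REPLACE DIRECTORY export_dir AS '/backup/oracle_export';
EOF
# Full database export; expdp prompts for the system password.
expdp system FULL=Y DIRECTORY=export_dir DUMPFILE=scdb_full.dmp LOGFILE=scdb_full_exp.log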
- Once the pre-upgrade processes are complete, the utility completes the Oracle upgrade.
- After the successful completion of the pre-upgrade process, the database upgrade starts using the AutoUpgrade utility. During the process, the following is displayed:
[exec] Autoupgrade Utility Started.
[exec] aptare
[exec] AutoUpgrade tool launched with default options
[exec] Processing config file ...
[exec] +--------------------------------+
[exec] | Starting AutoUpgrade execution |
[exec] +--------------------------------+
[exec] 1 databases will be processed
After successful completion of the process, the following is displayed:
[exec] Autoupgrade Utility Started.
[exec] aptare
[exec] AutoUpgrade tool launched with default options
[exec] Processing config file ...
[exec] +--------------------------------+
[exec] | Starting AutoUpgrade execution |
[exec] +--------------------------------+
[exec] 1 databases will be processed
[exec] Job 100 completed
[exec] ------------------- Final Summary --------------------
[exec] Number of databases [ 1 ]
[exec]
[exec] Jobs finished successfully [1]
[exec] Jobs failed [0]
[exec] Jobs pending [0]
[exec] ------------- JOBS FINISHED SUCCESSFULLY -------------
[exec] Job 100 for scdb
[exec] Autoupgrade Utility Finished.
Logs for the entire upgrade process are located at:
/opt/aptare/upgrade/logs/upgrade19c/upgrade19c.log
Logs for the auto upgrade process are located at:
/opt/aptare/upgrade/logs/upgrade19c/scdb_upd_logs/scdb/xxx/autoupgrade_<YYYYMMDD>.log
where <YYYYMMDD> is the creation date.
Summary logs are located at:
/opt/aptare/upgrade/logs/upgrade19c/global_logs/cfgtoollogs/upgrade/auto/autoupgrade.log
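To confirm that the upgrade completed cleanly, these logs can be scanned from the shell, for example:
grep -i error /opt/aptare/upgrade/logs/upgrade19c/upgrade19c.log
tail -n 50 /opt/aptare/upgrade/logs/upgrade19c/global_logs/cfgtoollogs/upgrade/auto/autoupgrade.log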
Note:
Any tuning done to the Oracle configuration file initscdb.ora on the previous version of Oracle is not carried over to the initscdb.ora file in 19c. These changes must be applied again.
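If you kept a copy of the previous initscdb.ora (for example, as part of the cold backup), a diff helps identify the tuning parameters to reapply. The backup path below is a placeholder, and the new file is assumed to be in the standard $ORACLE_HOME/dbs location of the 19c installation.
# Compare the pre-upgrade copy against the new 19c parameter file.
diff /backup/initscdb.ora.pre19c $ORACLE_HOME/dbs/initscdb.ora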