APTARE IT Analytics Help
- Section I. Introducing APTARE IT Analytics
- Section II. Certified configurations
- Portal and database servers
- Data Collector server configurations
- Capacity Manager configurations
- Array/LUN performance Data Collection
- EMC Isilon array performance metrics
- NetApp Cluster-Mode performance metrics
- EMC Symmetrix enhanced performance metrics
- Host access privileges, sudo commands, ports, and WMI proxy requirements
- Cloud configurations
- Virtualization Manager configurations
- File Analytics configurations
- Fabric Manager configurations
- Backup Manager configurations
- ServiceNow configurations
- Internal TCP port requirements
- Section III. End user
- Understand the portal
- About the Admin tab
- Explore your inventory
- Hierarchy toolbar to organize your data
- Show objects
- Use attributes to organize your data
- Pin reports - saving reports with inventory objects
- Assign attributes in the inventory list view
- Get acquainted with reports
- About badging
- Generate and maintain reports
- Select Report Scope
- Group hosts by attributes
- Search for hosts in the report Scope Selector
- Backup Manager advanced scope selector settings
- Solution reports scope selector settings
- Units of Measure in Reports
- Customize report filter logic
- Sort columns in reports
- Distribute, share, schedule, and alert
- Scheduling Exported Reports and Dashboards
- Organize reports
- Work with the dynamic template designer
- Dynamic Template Designer Quick Start
- Converting to a Homogeneous, Product-Specific Template
- Dynamic Template Function Configurations
- Create Fields with the Field Builder
- Scope Selector Component - Custom Filter
- Configure a Bar Chart Dynamic Template
- Steps to Create a Bar Chart Dynamic Template
- Configure an Area/Stacked Area Chart Dynamic Template
- Line Charts for Performance Metrics
- Line Chart Field Requirements
- One Object Per Line Chart, One or More Metrics Per Chart
- Multiple Objects Per Line Chart, One Metric Per Chart
- Example of a Stacked Bar Chart Dynamic Template
- Create a Sparkline Chart in a Tabular Dynamic Template
- Adding or Editing Methods
- Validate and Save a Method
- Work with the SQL template designer
- Configure SQL Template Scope Selector Components
- Sample SQL Queries
- Number, Size, Date, and Time Formatting
- Alignment, Aggregation, Bar Type, and Bar Type Color
- Pipelined functions for report query building
- APTlistOfDates
- aptStringConcat
- getServerAttributeValue
- getObjectAttributeValue
- getChildServerGroupContextById
- getServerGroupContextById
- secsToHoursMinSecs
- APTgetTapeDriveStatusName
- getFullPathname
- listJobSummaryAfterRestart
- listJobSummaryAfterRestartNBW
- listJobSummaryAfterRestart for NetWorker Backup Jobs
- listOfBackupWindowDates
- listChargebackCatByVOLSDetail
- listChargebackCatByFSDetail
- listChargebackByLUNSummary
- listChargebackByLUNDetail
- listChargebackCatByLUNSummary
- listChargebackCatByLUNDetail
- Alert configuration
- Add/Edit an Alert Policy
- Manage hosts, backup servers, and host groups
- NetBackup Master Servers
- Manage attributes and objects
- Provide Portal access and user privileges
- Setting / Resetting passwords
- Managing user group home pages (Administrator)
- Configure master schedules and backup windows
- Add, edit, and move policies
- Add/Edit a threshold policy
- Capacity Chargeback policy types
- Solutions administration
- Manage and monitor data collection
- About data collection tasks
- Add/Edit data collectors
- Review collectors and collection status
- Upgrade Data Collectors
- Work with Capacity Manager host data collection
- Host access requirements
- Manage credentials
- Configure host discovery policies to populate the host discovery and collection view
- Validate host connectivity
- Search and export in host discovery and collection
- Propagate probe settings: Copy probes, paste probes
- Discovery policies for Veritas NetBackup
- About Discovery types
- View and manage system notifications
- Customize with advanced parameters
- Access control advanced parameters
- General Data Collection advanced parameters
- Cloud data collection advanced parameters
- Host discovery and collection advanced parameters
- Backup Manager advanced parameters
- Capacity Manager advanced parameters
- File Analytics advanced parameters
- Virtualization Manager advanced parameters
- Manage your Portal environment
- Analyze files
- Troubleshoot the Portal
- Retrieving log files
- Debug
- Attribute inheritance overrides
- Understand report data caching
- Section IV. Report Reference
- Introduction to APTARE IT Analytics
- Alert Reports
- Risk Mitigation Solution Reports
- Risk Mitigation Reports
- Storage Optimization Solution Reports
- System Administration Reports
- Oracle Job Overview
- Capacity Manager Reports
- Application Capacity Reports
- Array Capacity Utilization Reports
- Array Capacity & Utilization (Generic Data)
- Array Capacity & Utilization (IBM SVC View)
- Array Capacity and Utilization (IBM XIV View)
- Array Capacity and Utilization (NetApp View)
- Array Capacity and Utilization (NetApp Cluster)
- NetApp Storage System Detail
- Array Capacity and Utilization (OpenStack Swift)
- IBM Array Site Summary
- IBM Array Detail
- LUN Utilization Summary
- NetApp Aggregate Detail
- NetApp Cluster-Mode Aggregate Detail
- NetApp Plex Details
- NetApp Volume Details
- NetApp Cluster-Mode Volume Detail
- Available/Reclaimable Capacity Reports
- Capacity at Risk Reports
- Capacity Chargeback Reports
- Host Capacity Utilization Reports
- SnapMirror Reports
- SnapVault Reports
- Capacity Forecasting Reports
- Storage Performance Reports
- Mission Control for Performance Analysis
- Thin Provisioning Reports
- Hitachi Dynamic Provisioning Pool Utilization
- File Analytics Reports
- Virtualization Manager Reports
- Understanding the Datastore
- VM Server Detail
- VM Snapshot Summary
- VM Detail
- Datastore Utilization Summary
- Datastore Detail
- Fabric Manager Reports
- Host to Storage Dashboard
- Backup Manager Management Reports
- Error Log Summary
- Job Duration Report
- Veeam Backup & Replication Job Summary Report (Homogeneous)
- Veeam and RMAN Job Details Report
- Adding a Note to a Job
- Job Volume Summary Report
- NetBackup deduplication to MSDP savings
- Backup Administration Reports
- Host Details
- IBM Spectrum Protect (TSM) Storage Pools Dashboard
- Backup Media Management Reports
- TSM Tape Media Detail Table
- Backup Service Level Agreement (SLA) Reports
- Determining and Improving Backup Start Time Performance
- Determining and Improving Backup Success Performance
- Determining and Improving Backup Duration Performance
- Backup Storage Utilization Reports
- Backup Manager Forecasting Reports
- Backup Billing and Usage Reports
- Backup Policies Reports
- HP Data Protector Backup Specification Detail
- Public Cloud Reports
- Section V. Data Collector Installation and Troubleshooting
- Installing the Data Collector Software
- Validating Data Collection
- Uninstalling the Data Collector
- Manually Starting the Data Collector
- Data Collector Troubleshooting
- Host resources: Check host connectivity using standard SSH
- Host resources: Generating host resource configuration files
- Configuring parameters for SSH
- CRON Expressions and Probe Schedules
- Clustering Data Collectors with VCS and Veritas NetBackup (RHEL 7)
- Clustering Data Collectors with VCS and Veritas NetBackup (Windows)
- Section VI. Data Collection for the Cloud
- Pre-Installation Setup for Amazon Web Services (AWS)
- Create an AWS IAM user
- Link AWS accounts for Collection of consolidated billing data
- Pre-Installation Setup for OpenStack Ceilometer
- Pre-Installation Setup for OpenStack Swift
- Pre-Installation Setup for Microsoft Azure
- Section VII. Data Collection for Data Protection (Backup)
- Introduction
- Pre-Installation setup for Commvault Simpana
- Open TCP/IP access to the Commvault database
- Set up a read-only user in the CommServe server
- Pre-Installation setup for Cohesity DataProtect
- Pre-Installation setup for EMC Avamar
- Import EMC Avamar server information
- Pre-Installation setup for EMC Data Domain Backup
- Pre-Installation setup for EMC NetWorker
- Architecture overview (EMC NetWorker)
- Pre-Installation setup for Dell EMC NetWorker Backup & Recovery
- Pre-Installation setup for generic backup
- CSV format specification
- Pre-Installation setup for HP Data Protector
- Architecture overview (HP Data Protector)
- Configure the Data Collector server in Cell Manager (HP Data Protector)
- Pre-Installation setup for IBM Spectrum Protect (TSM)
- Architecture overview (IBM Spectrum Protect - TSM)
- Import IBM Spectrum Protect (TSM) information
- Pre-Installation setup for IBM Spectrum Protect Plus
- Pre-Installation setup for NAKIVO Backup & Replication
- Pre-Installation setup for Veritas Backup Exec
- Pre-Installation setup for Veritas NetBackup
- Prerequisites to use SSH and WMI (Veritas NetBackup)
- Prerequisites for NetBackup collection over SSH (Kerberos option)
- Veritas NetBackup 8.1 (or later) requirements for centralized collection
- Configuring file analytics in NetBackup Data Collector policy
- Pre-Installation setup for Veritas SaaS backup
- Pre-Installation setup for Oracle Recovery Manager (RMAN)
- Pre-Installation setup for Rubrik Cloud Data Management
- Pre-Installation setup for Veeam Backup & Replication
- Discovery policies for Veritas NetBackup
- About Discovery types
- About SNMP probes
- Appendix A. Load Historic Events
- Load Veritas NetBackup events
- Section VIII. Data Collection for Fabrics
- Section IX. Data Collection for File Analytics
- Pre-Installation Setup for File Analytics
- Host Discovery and Collection File Analytics probe
- Adding a File Analytics Data Collector policy
- File Analytics Export Folder Size and Folder Depth
- Section X. Data Collection for Replication
- Section XI. Data Collection for Storage (Capacity)
- Data Collection for Capacity Overview
- Pre-Installation Setup for Dell Compellent
- Pre-Installation Setup for DELL EMC Elastic Cloud Storage (ECS)
- Pre-Installation Setup for EMC Data Domain Storage
- Pre-Installation Setup for EMC Isilon
- Pre-Installation Setup for EMC Symmetrix
- Pre-Installation Setup for Dell EMC Unity
- Pre-Installation Setup for EMC VNX Celerra
- Pre-Installation Setup for EMC VNX CLARiiON
- Pre-Installation Setup for EMC VPLEX
- Pre-Installation Setup for EMC XtremIO
- Pre-Installation Setup for Hitachi Block
- Configuring a Hitachi Device manager user
- Pre-Installation Setup for Hitachi Content Platform (HCP)
- Hitachi content platform system management console
- Hitachi content platform tenant management console
- Pre-Installation Setup for Hitachi NAS
- Pre-Installation Setup for Hitachi Vantara All-Flash and Hybrid Flash Storage
- Host Inventory Pre-Installation Setup
- Host Access Privileges, Sudo Commands, Ports, and WMI Proxy Requirements
- Configure host Discovery policies to populate the host Inventory
- Validate host connectivity
- Host Inventory search and host Inventory export
- Configure and edit host probes
- Propagate Probe Settings: Copy Probes, Paste Probes
- Pre-Installation Setup for HP 3PAR
- Pre-Installation Setup for HP EVA
- Pre-Installation Setup for Huawei OceanStor
- Pre-Installation Setup for IBM COS
- Pre-Installation Setup for IBM Enterprise
- Pre-Installation Setup for NetApp E-Series
- Pre-Installation Setup for IBM SVC
- Pre-Installation Setup for IBM XIV
- Pre-Installation Setup for Infinidat InfiniBox
- Pre-installation setup for FUJITSU Data Collector
- Pre-Installation setup for Infinidat InfiniGuard
- Pre-Installation Setup for NetApp-7
- Pre-Installation setup for NetApp StorageGRID
- Pre-Installation Setup for Microsoft Windows Server
- Pre-Installation Setup for NetApp Cluster
- Pre-Installation Setup for Pure Storage FlashArray
- Pre-Installation Setup for Veritas NetBackup Appliance
- Section XII. Data Collection for Virtualization
- Pre-Installation setup for VMware
- Pre-Installation setup for IBM VIO
- Pre-Installation setup for Microsoft Hyper-V
- Section XIII. System Administration
- Preparing for Updates
- Backing Up and Restoring Data
- Monitoring APTARE IT Analytics
- Accessing APTARE IT Analytics Reports with the REST API
- Defining NetBackup Estimated Tape Capacity
- Automating Host Group Management
- Categorize host operating systems by platform and version
- Load relationships between hosts and host group
- Automate NetBackup utilities
- Scheduling utilities to run automatically
- Attribute Management
- Importing Generic Backup Data
- Backup Job Overrides
- Managing Host Data Collection
- System Configuration in the Portal
- Performance Profile Schedule Customization
- Configuring AD/LDAP
- Configuring Single Sign On (SSO) Using Security Assertion Markup Language (SAML)
- Changing Oracle Database User Passwords
- Integrating with CyberArk
- Tuning APTARE IT Analytics
- Defining Report Metrics
- Working with Log Files
- Portal and data collector log files - reduce logging
- Data collector log file naming conventions
- Portal log files
- SNMP Trap Alerting
- SSL Certificate Configuration
- Configure virtual hosts for portal and/or data collection SSL
- Keystore on the portal server
- Portal Properties: Format and Portal Customizations
- Advanced Configuration for NetBackup Discovery
- Data Retention Periods for SDK Database Objects
- Troubleshooting
- Section XIV. Portal Install and Upgrade (Windows)
- Installing the Portal on a Windows server
- Task 3: Installing Oracle application binaries (Windows)
- Upgrade APTARE IT Analytics Portal on Windows
- Upgrade the Oracle database application binaries to 19c (Windows)
- Upgrade APTARE IT Analytics Portal
- Oracle patches for the database server
- Upgrade and migrate to a new server
- Section XV. Portal Install and Upgrade (Linux)
- Install the APTARE IT Analytics Portal on a Linux server
- Installer-based deployment
- Upgrade APTARE IT Analytics Portal on Linux
- Upgrade Oracle database application binaries to 19c (Linux)
- Upgrade APTARE IT Analytics Portal
- Data Collector upgrades
- Oracle patches for the database server
- Upgrade and Migrate to a new server
- Upgrade and migrate to a new server
- Appendix B. X Virtual Frame Buffer
- Section XVI. Licensing
- License installation and guidelines
- License overview
- Verify the current license configuration
- Storage suite
- Protection suite
- Backup Manager
- Complete suite
- Managing licenses
- Configure the Data Collector policy to exclude the object
- License management from the command line
- Troubleshooting
- Section XVII. Inventory reports and operations
Load relationships between hosts and host group
Description

Imports host-to-host-group relationships from a comma-delimited file. You can optionally audit host movement, which records the details each time a host is added, removed, or moved. See Sample Audit File (output from load_package.loadGroupMemberFile).
Usage

execute load_package.loadGroupMemberFile('<file_name>', '<recycle_group>', <remove_old_entries>, '<audit_pathname>', '<audit_output_file>', <do_log>);

Where:

- file_name is the fully qualified path to the CSV file, for example /opt/aptare/database/hosts.csv.
- recycle_group is the full path to the group into which deleted hosts are moved (that is, the "recycle bin").
- remove_old_entries controls whether relationships in the Reporting Database that are not in the file are removed. If set to 1, hosts that previously had a relationship to a host group, and whose relationship is no longer represented in the file, are moved to the recycle group. If set to 0, those hosts are left in place.
- audit_pathname is the full path to the audit file, not including the filename.
- audit_output_file is the name of the audit file where the audit results are stored.
- do_log turns the auditing function on or off, so that all host movements are logged in audit_output_file. Enter a numeric value: 0 or 1, where 0 = No and 1 = Yes.

Example command:

execute load_package.loadgroupmemberfile('/opt/aptare/database/movehosts.csv','/Global1/Recycle',1,'/opt/aptare/database','movehosts.out',1);
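For illustration only, the same example call can also be issued as an anonymous PL/SQL block, which keeps each argument readable when the list is long. The parameter values below are taken from the example command above; how you connect to the reporting database (for example, the SQL*Plus credentials for your Portal installation) varies by environment and is not shown here.

```sql
-- Sketch: anonymous PL/SQL block equivalent of the example command above.
-- Run it from a SQL*Plus session connected to the Portal's reporting database
-- (connection details depend on your installation).
BEGIN
  load_package.loadGroupMemberFile(
    '/opt/aptare/database/movehosts.csv',  -- file_name: CSV of host-to-host-group relationships
    '/Global1/Recycle',                    -- recycle_group: group that receives de-listed hosts
    1,                                     -- remove_old_entries: 1 = prune relationships missing from the file
    '/opt/aptare/database',                -- audit_pathname: directory for the audit file
    'movehosts.out',                       -- audit_output_file: audit file name
    1                                      -- do_log: 1 = record host movements in the audit file
  );
END;
/
```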
Load File Specification

The specification for the comma-delimited file is as follows:

path_to_host_group, internal_name1, internal_name2, internal_name3, etc.

Where path_to_host_group is the fully qualified path to the host group into which the hosts should be added, and internal_name1 is the internal name of a host within the existing host group hierarchy.

Example:

/APTARE/Test, testhost01, testhost02
/APTARE/Infrastructure, testhost02, testhost03

Detailed specification for each field follows:

internal_name CHAR(64) NOT NULL
Data Constraints

- The first field, path_to_host_group, must be the full path to an existing host group. If any host groups in the path_to_host_group field value do not exist, the utility creates them.
- Field values cannot contain embedded commas.
- The CSV file must exist and be readable.
- The recycle group folder must exist.
- Each row must specify at least one host; otherwise, the row is not processed.
Logic Conditions

- If you list hosts after the path_to_host_group field and those hosts are located in the existing host group hierarchy, the utility adds those hosts to the specified host group.
- If a host with the specified internal name does not exist in the hierarchy, the relationship is not added. The host must already be configured in the Reporting Database.
- If any host groups in the path_to_host_group field value do not exist, the utility creates them.
- If the remove_old_entries parameter is set to 1, the utility assumes that the file contains all the required relationships. In other words, for each host group that you specify in the file, only the listed hosts remain in that group after you run the utility. If a host group previously contained other hosts that are no longer listed in the file, the utility removes those hosts from the host group and moves them to the recycle folder.
- The utility does not delete host groups from the Reporting Database; it only removes members of a host group.
- If a host group in the Reporting Database is not listed in the file, the utility takes no processing action against that host group.
- Host groups with many hosts can be split across multiple lines for ease of file maintenance; for example, the host group and some of its hosts appear on the first line, and the same host group with other hosts appears on subsequent lines (see the example after this list).
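To make the split-line case concrete, the following is a hypothetical load file fragment; the group path and host internal names are placeholders, not values from this document. Both lines add hosts to the same group, and with remove_old_entries set to 1, any host that was previously in /APTARE/Production but appears on neither line is moved to the recycle group.

```
/APTARE/Production, prodhost01, prodhost02, prodhost03
/APTARE/Production, prodhost04, prodhost05
```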
Logging

The utility logs all additions, updates, warnings, and errors to the scon.log file, which is located under /tmp by default on Linux systems and under C:\opt\oracle\logs on Windows systems.

Logging strings are typically in the following format:

Date -> Time -> Level -> load_package:sub_routine -> Action

Where:

- Level is DBG, INFO, WARN, or ERR.
- sub_routine is the subroutine that is being executed (for example, loadServerLine).
- Action is the action that was being reported on.

Example:

14-MAY 19:00:06 ERR load_package:loadServerGroupMembers: Host group /APTARE/Business Views/Regional Offices/Connecticut does not exist on line 6 of the data load file