HPE Software Products: Operations Analytics
Hello Experts,
AQL: Error executing AQL [aqlrawlog(<aqllit></aqllit>,$starttime,$endtime,"",$limit)] Cause was Raw log search failed on one or more host(s) due to following reasons: javax.xml.ws.WebServiceException: Could not send Message.
Could you please advise me on how to fix this? I have changed the query, but I always get the same error!
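A `javax.xml.ws.WebServiceException: Could not send Message` during a raw log search usually means the OpsA server cannot reach the Logger web service at all (network, DNS, certificate, or a stopped Logger), so changing the AQL query will not help. Before touching the query, it may be worth verifying basic reachability from the OpsA server; the hostname and port below are placeholders for your environment, not values from this thread:

```shell
# Placeholders: replace logger.example.com and 443 with your Logger
# host and the port its web service listens on.
ping -c 2 logger.example.com                 # name resolution + routing
curl -vk https://logger.example.com:443/     # TLS handshake + HTTP reachability
```

If curl fails or hangs here, the AQL error is a symptom of the connection problem rather than of the query itself.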
We are going to install HP OpsA 2.31 using the virtual appliance.
Is ArcSight Logger 6.2 compatible with HP OpsA 2.31 and 2.32?
We have OpsA connected to Vertica in a single-node installation (hostname.domain). We need to migrate Vertica to a cluster and use a virtual IP address (FQDN). How do we change the OpsA Server/Collector to connect to the new virtual FQDN?
We want to install HP OpsA 2.31 on a Vertica database that is shared with the OBR database on the same Vertica server.
We require the Vertica configuration steps for OpsA.
Do we need to execute opsa-vertica_2.31_setup.bin from HP Operations Analytics,
or do we need to execute opsa-server-postinstall.sh from the OpsA server?
Kindly suggest the steps for the same.
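For what it's worth, a common split between the two scripts named above is sketched below; the install path and the exact sequence are assumptions, so please verify them against the OpsA 2.31 installation guide before running anything:

```shell
# Hypothetical sketch -- verify against the OpsA 2.31 install guide.

# Option 1: dedicated Vertica for OpsA. The installer sets up its own
# Vertica instance on the target host:
./opsa-vertica_2.31_setup.bin

# Option 2: reuse an existing Vertica (for example one shared with OBR).
# Skip the Vertica installer and run the post-install configuration on
# the OpsA server, pointing it at the existing Vertica host when
# prompted (install path below is assumed, not confirmed):
/opt/HP/opsa/bin/opsa-server-postinstall.sh
```

Whether a Vertica shared between OBR and OpsA is a supported layout is a sizing and supportability question best confirmed with the install guide or with support.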
While configuring the BPM collection, I get the error below.
2017-01-30 02:04:46,752 INFO [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.createCollection:173] (http-/0.0.0.0:8080-1) Validation failed before creating collection BPM
Could you please help me figure out the changes I need to make to log4j.properties or another suitable file.
I want to monitor OpsA itself in our environment. Could anyone please suggest the best way to do that?
Is there any documentation available on this?
Please provide some leads toward a solution.
I created shares for the CSV files and mounted them to /opt/HP/OV/nnm on the collector. I can see the CSV files, but nothing is showing up in the dashboard. Please help.
NNMi custom polling is configured.
root@hpsatvld5350:/opt/HP/opsa/data/nnm # ls
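Two things may be worth checking here (the `opsa` service-account name is an assumption based on typical installs): the post mentions mounting to /opt/HP/OV/nnm but the prompt lists /opt/HP/opsa/data/nnm, so the collection's configured source directory should be confirmed to match the actual mount point, and the collector's service account needs read access to the CSVs:

```shell
# Hypothetical checks -- adjust paths to match your collection config.
mount | grep nnm                          # is the share mounted where expected?
ls -l /opt/HP/opsa/data/nnm               # are the CSV files visible here?
su - opsa -c 'ls /opt/HP/opsa/data/nnm'   # can the service account read them?
```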
Operations Analytics: error when trying to integrate ArcSight Logger
Logger registration validation completed successfully but connection to logger web service failed: org.apache.cxf.binding.soap.SoapFault: unknown
All, I am currently trying to find the OpsA Smart Connectors ... please help; the current download location is not available.
All OpsA Smart Connectors are available on HP Live Network at:
2016-11-04 09:19:47,484 ERROR [com.hp.opsa.dataaccess.logger.impl.LoggerConnectionManager.loadConnectionInfoFromDB:88] (http-/0.0.0.0:8080-1) [DALClient-0009] No raw log source found for the tenant opsa_default
2016-11-04 09:19:47,485 ERROR [com.hp.opsa.dataaccess.logger.impl.LogDataAPIImpl.getRawLogSourceType:622] (http-/0.0.0.0:8080-1) [DALClient-0009] Can't recognize RowLog source file for tenant , probably no RowLog source been configured opsa_default
We have a working metric integration for NewRelic and BSMC 10.01 (with the special character hotfix). The metrics are pulled from NewRelic using the DoItWise Integration Hub and then sent to BSMC. Now we are trying to set up an OpsA collection to pull these in a similar manner to the OpsA<>BSMC integrations for SCOM and Nagios.
I checked the BSMC databases and the CODA object output, and created a collection XML using the SCOM collection as a template. The DB structure in BSMC for NewRelic is a bit different compared to SCOM, though. With SCOM we have one class per metric:
Data source: BsmIntSCOMMetrics
LogicalDisk:% Free Space NON R64 % Free Space
which translates as:
BsmIntSCOMMetrics.LogicalDisk:% Free Space.% Free Space
With NewRelic we have one datasource, two classes and many metrics in each one. For example:
Data source: NewRelic
Server NON R64 System_CPU_System_percent
Server NON R64 System_Disk_All_Utilization_percent
It goes like:
I modified the XML accordingly, and data is being collected and processed. However, the actual metrics are not collected properly. Using the metrics mentioned above, here is an example:
Metrics collected and processed in OpsA:
The collection that OpsA performs runs every 5 minutes, while the NewRelic metrics in BSMC are collected every minute. It seems OpsA collects metrics in a random manner across the 5 minutes since the last collection: it takes one timestamp from the 5-minute timeframe for one metric and picks the other metrics at random times within the same timeframe. In the second example above (metrics collected and processed in OpsA), the 'System_CPU_System_percent' sample is collected at 1478245925 rather than 1478245985, while the 'System_Disk_All_Utilization_percent' sample is collected at 1478245985.
I tried changing the key attributes in the XML (the default is RelatedCi, as per the BSMC object output), but that did not help. I also tried using RelatedCi together with CollectionTime, but that did not work either.
Does anyone have a suggestion on how to proceed?
All, I am trying to add the ArcSight Logger and I receive an error at the publish operation. Please help. OpsA 2.31 and Logger 6.
We fed events from OMi to OpsA back on 9/21/16. We liked a specific message group and added a keyword to increase the significance of specific events so that they would be identified as significant, to help identify an issue that occurred between 12p and 3p.
That worked back on 9/21, and continued to work through at least mid-October (the last time we went back and looked at that time period).
Today (11/2) we looked at it again, and the log and event analytics pane shows ZERO significant events for the 12p-3p period on 9/21.
Has anyone seen this behavior? Any ideas regarding why these events are no longer significant?
I'm trying to pull HPOO and HPCSA logs. What is the best way to pull them into the collector?
I am trying to correlate metrics within OpsA.
Metric correlation can be done within the same collection; for example, CPU and memory metrics can be correlated since they are both in the "oa_sysperf_global" collection.
Is it possible to correlate metrics across different collections? For example, can a CPU metric be correlated with a server response time metric from RUM, and if so, how?
Thanks for your help,
Good day, I hope someone can assist. I am trying to install OpsA on a Linux VM and the IP check keeps failing. The installation log shows the same error displayed in the installation wizard: make sure that there is only one IPv4 address and that it is not a loopback address, otherwise the OpsA postinstall will be corrupted. I cannot click Next to proceed from there. I tried setting the network adapter to bridged, NAT, and host-only, with no luck. Any ideas?
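The installer's IP check can be reproduced by hand. The sketch below is a minimal approximation of that check, not the installer's actual logic: it counts non-loopback IPv4 addresses in `ip -4 -o addr` output. It is run here against a sample output string (a hypothetical host with one NIC) so the parsing is visible and repeatable:

```shell
# Sample `ip -4 -o addr` output for a hypothetical host with one NIC.
sample='1: lo    inet 127.0.0.1/8 scope host lo
2: eth0    inet 10.1.2.3/24 scope global eth0'

# Field 4 is the address/prefix; skip loopback, count the rest.
# OpsA expects exactly one non-loopback IPv4 address.
count=$(printf '%s\n' "$sample" | awk '$4 !~ /^127\./ { n++ } END { print n+0 }')
echo "$count"
```

On a real host, replace the sample with the live `ip -4 -o addr` output. Hypervisor side effects (an extra host-only adapter, a `virbr0` bridge, or a second NIC added by the VM tooling) are common reasons this count comes out as 0 or 2 and the wizard refuses to continue.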
I wanted one clarification:
Can Operations Analytics be installed on a single machine, i.e. Server, Collector, Logger, and Vertica
all in one machine? I am talking about the trial version.
Hi,
I need one piece of information.
The OpsA 2.31 installation document describes the media as shown in the attached doc.
When I tried to download the software, I got one single zip file of 7.2 GB with this description:
Is this the same content combined from all 4 zip files, or is this a different one?
As far as I know, OpsA uses an automatically configured Flex Connector for self-monitoring.
However, the integration asks for the serial number of the Logger host. How can I get the serial number of the Logger host?
To view Operations Analytics collector logs, you need to run opsa-flex-config.sh on each Operations Analytics Collector host and perform the following steps from the command line:
* Enter the serial number of the Logger host for which you want to configure the Operations Analytics Log File Connector for HP Arcsight Logger.