HPE Software Products: Operations Analytics

Is ArcSight Logger 6.2 compatible with HP OpsA 2.31 and 2.32?

Hi Team,

We are going to install HP OpsA 2.31 using the virtual appliance.

Is ArcSight Logger 6.2 compatible with HP OpsA 2.31 and 2.32?

Kindly suggest.

Regards,

Bharat

 

 

Change OpsA connection to Vertica Database Hostname

Hi All,

We have OpsA connected to Vertica in a single-node installation (hostname.domain). We need to migrate Vertica to a cluster and use a virtual IP address (FQDN). How do we change the OpsA Server/Collector to connect to the new virtual FQDN?
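A possible starting point, offered only as a hedged sketch: the Vertica connection details for the server are supplied through opsa-server-postinstall.sh (the script referenced elsewhere in this digest), so re-running it and entering the cluster's virtual FQDN may repoint the OpsA server. The script location, prompts, and example FQDN below are assumptions to verify against the configuration guide.

# Sketch only -- path and prompts are assumptions, not verified against the product docs.
cd /opt/HP/opsa/scripts            # assumed location; adjust to where opsa-server-postinstall.sh lives in your install
./opsa-server-postinstall.sh       # interactive; enter the new virtual FQDN (e.g. vertica-vip.example.com) for the Vertica host
# Then confirm the OpsA server and collectors resolve and reach the new name on Vertica's default port 5433:
nslookup vertica-vip.example.com
nc -zv vertica-vip.example.com 5433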

Regards

Sant'Ana, L

 

Vertica DB configuration for HP OpsA 2.31

Hi experts,

We want to install HP OpsA 2.31 on a Vertica database that is shared with the OBR database on the same Vertica server.

We need the Vertica database configuration steps for OpsA.

Do we need to execute opsa-vertica_2.31_setup.bin (from the HP Operations Analytics 2.31 Vertica Integration zip file) on the Vertica server?

Or do we need to execute opsa-server-postinstall.sh from the OpsA server?

Kindly suggest the steps for the same.
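A possible sequence, hedged as an assumption and based only on the two scripts named above (the Vertica Integration guide remains the authority, especially since the server already hosts the OBR database):

# Assumed order of operations -- not verified against the 2.31 install guide.
# 1) On the Vertica server: install the OpsA database pieces
./opsa-vertica_2.31_setup.bin        # from the HP Operations Analytics 2.31 Vertica Integration zip file
# 2) On the OpsA server: run the post-install configuration and point it at that Vertica host
./opsa-server-postinstall.sh         # interactive; supply the Vertica hostname, database name, and credentials
# Since the host already runs the OBR database, confirm beforehand that an additional database
# (or schema) is acceptable for your Vertica licensing and sizing.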

Regards,

Bharat

 

 

 

 

BPM configuration ERROR

Hi Team,

While configuring the BPM collection, I get the below error.

2017-01-30 02:04:46,752 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.createCollection:173] (http-/0.0.0.0:8080-1) Validation failed before creating collection BPM
2017-01-30 02:04:46,752 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:59] (http-/0.0.0.0:8080-1) Validation level 0:
2017-01-30 02:04:46,752 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:60] (http-/0.0.0.0:8080-1) Status: ERROR
2017-01-30 02:04:46,753 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:59] (http-/0.0.0.0:8080-1) Validation level 1: Skipping Ping, assuming host is up
2017-01-30 02:04:46,753 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:60] (http-/0.0.0.0:8080-1) Status: SUCCEEDED
2017-01-30 02:04:46,753 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:59] (http-/0.0.0.0:8080-1) Validation level 1: Connection successful
2017-01-30 02:04:46,753 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:60] (http-/0.0.0.0:8080-1) Status: SUCCEEDED
2017-01-30 02:04:46,753 INFO  [com.hp.opsa.collections.impl.CollectionsManagementServiceImpl.log:59] (http-/0.0.0.0:8080-1) Validation level 1: ERROR: Failed to obtain connection to BSM BUS. log4j:WARN No appenders could be found for logger (org.apache.axis2.description.AxisService).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

 

Could you please help me figure out the changes I need to make to log4j.properties or another suitable file?
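The log4j warnings are most likely only hiding the stack trace of the real failure ("Failed to obtain connection to BSM BUS"), so the BSM gateway host, port, and credentials used by the BPM collection are worth checking first. If you still want the Axis2 client to log properly, a minimal log4j 1.2 configuration such as the sketch below should silence the warning and surface the underlying exception; where OpsA actually loads log4j.properties from is an assumption you will need to confirm for your installation.

# Sketch only -- write a minimal log4j.properties and place/merge it wherever the collection
# process loads its log4j configuration from (location is an assumption).
cat > /tmp/log4j.properties <<'EOF'
log4j.rootLogger=WARN, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p [%c] %m%n
EOF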

 

Regards,

Girish

 

Monitoring OpsA

Hi,

I want to monitor OpsA in our environment. Could anyone please suggest the best way to do that?

Is there any document available on the same?

Please provide some leads to the solution.
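Until someone points to an official self-monitoring guide, a few generic checks can be scripted on the OpsA server itself; the port below is inferred from the log lines elsewhere in this digest and the paths are assumptions, so treat this purely as a sketch.

# Generic availability checks -- hypothetical, adjust ports and paths to your installation.
ps -ef | grep -i [o]psa                        # are the OpsA / JBoss processes running?
curl -skI http://localhost:8080/ | head -n 1   # is the application server answering on port 8080?
df -h /opt/HP/opsa                             # disk space under the assumed OpsA home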

 

Thank you

 

 

NNMi data not showing up in OpsA

 

 

I created shares for the CSV files and mounted them to /opt/HP/OV/nnm on the collector. I see the CSV files, but nothing is showing up in the dashboard. Please help.

 

NNMi custom polling is configured.

 

root@hpsatvld5350:/opt/HP/opsa/data/nnm # ls

f_Hour_InterfaceMetrics_20161115130000_001.csv.gz f_Raw_ComponentMetrics_20161115141510_001.csv.gz

f_Raw_ComponentMetrics_20161115133507_001.csv.gz   f_Raw_ComponentMetrics_20161115142010_001.csv.gz

f_Raw_ComponentMetrics_20161115134020_001.csv.gz   f_Raw_ComponentMetrics_20161115142510_001.csv.gz

f_Raw_ComponentMetrics_20161115134510_001.csv.gz   f_Raw_ComponentMetrics_20161115143011_001.csv.gz

f_Raw_ComponentMetrics_20161115135011_001.csv.gz   f_Raw_ComponentMetrics_20161115143512_001.csv.gz

f_Raw_ComponentMetrics_20161115135510_001.csv.gz   f_Raw_ComponentMetrics_20161115144007_001.csv.gz

f_Raw_ComponentMetrics_20161115140010_001.csv.gz   f_Raw_ComponentMetrics_20161115144506_001.csv.gz

f_Raw_ComponentMetrics_20161115140511_001.csv.gz   f_Raw_ComponentMetrics_20161115145010_001.csv.gz

f_Raw_ComponentMetrics_20161115141005_001.csv.gz   f_Raw_ComponentMetrics_20161115145509_001.csv.gz

root@hpsatvld5350:/opt/HP/opsa/data/nnm #
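A few generic checks that may narrow this down; the file names are taken from the listing above, while the log directory is an assumption. Note also that the post mentions mounting to /opt/HP/OV/nnm while the files are listed under /opt/HP/opsa/data/nnm, so it is worth confirming which path the NNMi collection was actually configured with.

# Sketch only -- the log location is an assumption; adjust to your install.
mount | grep -i nnm                                    # which share is mounted, and where?
ls -l /opt/HP/opsa/data/nnm | head                     # are the files readable by the collector user?
zcat /opt/HP/opsa/data/nnm/f_Raw_ComponentMetrics_20161115141005_001.csv.gz | head -n 3   # valid CSV content?
grep -ri nnm /opt/HP/opsa/log 2>/dev/null | tail       # assumed log directory; look for collection errors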

 

Operations Analytics: error when trying to integrate ArcSight Logger

I am getting an error when trying to integrate ArcSight Logger with Operations Analytics.

Error message:

Logger registration validation completed successfully but connection to logger web service failed: org.apache.cxf.binding.soap.SoapFault: unknown

 

OpsA SmartConnectors

All, I am currently trying to find the OpsA SmartConnectors. Please help; the current location is not available.

 

All OpsA SmartConnectors are available on HP Live Network at:

 

https://hpln.hp.com/contentoffering/smart-connectors-operations-analytics-and-operations-log-intelligence

 

OpsA Logger integration: no raw log source found for the tenant opsa_default

2016-11-04 09:19:47,484 ERROR [com.hp.opsa.dataaccess.logger.impl.LoggerConnectionManager.loadConnectionInfoFromDB:88] (http-/0.0.0.0:8080-1) [DALClient-0009] No raw log source found for the tenant opsa_default

2016-11-04 09:19:47,485 ERROR [com.hp.opsa.dataaccess.logger.impl.LogDataAPIImpl.getRawLogSourceType:622] (http-/0.0.0.0:8080-1) [DALClient-0009] Can't recognize RowLog source file for tenant , probably no RowLog source been configured opsa_default

 

Issues with setting up a metric collection for NewRelic in OpsA

Hi all,

 

We have a working metric integration for NewRelic and BSMC 10.01 (with the special character hotfix). The metrics are pulled from NewRelic using the DoItWise Integration Hub and then they are sent to BSMC. Now we are trying to set up an OpsA collection to pull these in a similar manner to the OpsA<>BSMC integrations for SCOM and Nagios.

I checked the BSMC databases, CODA object output and created a collection XML using the SCOM collection as a template. The DB structure in BSMC for NewRelic is a bit different compared to SCOM though. With SCOM we have one class per metric:

<datasource>.<classname>.<metricname>

For example:

Data source: BsmIntSCOMMetrics

LogicalDisk:% Free Space NON R64 % Free Space

which translates as:

BsmIntSCOMMetrics.LogicalDisk:% Free Space.% Free Space

 

With NewRelic we have one datasource, two classes and many metrics in each one. For example:

Data source: NewRelic

Server NON R64 System_CPU_System_percent

Server NON R64 System_Disk_All_Utilization_percent

This translates as:

NewRelic.Server.System_CPU_System_percent

NewRelic.Server.System_Disk_All_Utilization_percent

 

I modified the XML accordingly, and data is being collected and processed. However, the actual metrics are not collected properly. Using the metrics mentioned above, here is an example:

BSMC DB:

CollectionTime: 1478245985

System_CPU_System_percent: 55.31

System_Disk_All_Utilization_percent: 33.21

 

Metrics collected and processed in OpsA:

CollectionTime: 1478245985

System_CPU_System_percent: 13.01

System_Disk_All_Utilization_percent: 33.21

 

The collection that OpsA performs runs every 5 minutes, while the NewRelic metrics in BSMC are collected every minute. It seems that OpsA picks metrics at random from within the 5 minutes since the last collection: it takes one timestamp from that 5-minute window but collects the other metrics at random points within the same window. In the second example above (metrics collected and processed in OpsA), the 'System_CPU_System_percent' sample was collected at 1478245925, not 1478245985, while 'System_Disk_All_Utilization_percent' was collected at 1478245985.

 

I tried changing the key attributes in the XML (default is RelatedCi as per the BSMC object output) but that did not help. I tried to use RelatedCi and CollectionTime, but that also did not work.

Does anyone have a suggestion on how to proceed?

 

Thanks,

Alex

 

 

OpsA integration with ArcSight Logger: error at publish operation

All, I am trying to add the ArcSight Logger and I receive an "Error at publish operation" message. Please help. OpsA 2.31 and Logger 6.

 

 

Events from an earlier time period were, but are no longer, significant

We fed events from OMi to OpsA back on 9/21/16. We liked a specific message group and added a keyword to increase the significance of specific events so that they would be identified as significant, to help identify an issue that occurred between 12 p.m. and 3 p.m.

That worked back on 9/21, and continued to work through at least mid-October (the last time we went back and looked at that time period).

Today (11/2) we looked at it again, and the log and event analytics panes show ZERO significant events for the 12 p.m. to 3 p.m. period on 9/21.

Has anyone seen this behavior?  Any ideas regarding why these events are no longer significant?

Thanks

 

OpsA SmartConnectors to communicate with HPOO and HPCSA

I'm trying to pull HPOO and HPCSA logs. What is the best way to pull them into the collector?

 

OpsA default login URL

Please help. What is the default URL for logging into the OpsA server and collector? I have configured the OpsA server, Vertica, and collector to talk to each other. Now, what is the default URL to log in to the server and collector?
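Not an authoritative answer, only a reachability sketch: port 8080 appears as the application-server port in the log lines elsewhere in this digest, so checking that port on the OpsA server is a reasonable first step; the exact console URL and context path are assumptions to confirm in the configuration guide.

# Hypothetical check -- port and hostname placeholder are assumptions.
curl -skI http://<opsa-server-fqdn>:8080/ | head -n 1   # does the application server answer on 8080?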

 

OpsA Metric Correlation

Hi there,

I am trying to correlate metrics within OpsA.

Metric correlation can be done within the same collection. For example, CPU and memory metrics can be correlated since they are both in the "oa_sysperf_global" collection.

Is it possible to correlate metrics across different collections? For example, can a CPU metric and a server response time metric from RUM be correlated, and how?

Thanks for your help,

Burak

 

 

 

OpsA server installation fails on IP check

Good day, I hope someone can assist. I am trying to install OpsA on a Linux VM and the IP check keeps failing. The installation log shows an error (the same as displayed in the installation wizard): make sure that there is only one IPv4 address and that it is not a loopback address, otherwise the OpsA post-install will be corrupted. I cannot click Next to proceed from there. I tried setting the network adapter to bridged, NAT, and host-only with no luck. Any ideas?
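Based on the wording of that check (exactly one non-loopback IPv4 address), a couple of generic commands will show what the installer is probably seeing; the usual fix is to disable the extra interface or address and make sure the hostname resolves to the one remaining address rather than to 127.0.0.1.

# Generic diagnostics -- not OpsA-specific commands.
ip -4 addr show                       # list all IPv4 addresses; there should be exactly one besides 127.0.0.1
hostname -f                           # the fully qualified hostname the installer will use
getent hosts $(hostname -f)           # should resolve to that single non-loopback IPv4 address
cat /etc/hosts                        # check for a "127.0.0.1 <hostname>" line mapping the host to loopback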

 

Operations Analytics Installation

Hello Everyone,

I wanted one clarification.

Can Operations Analytics be installed on a single machine? I mean the server, collector, Logger, and Vertica,

all in one machine? I am talking about the trial version.

 

Operations Analytics

Hi,

I need one piece of information.

In the OpsA 2.31 installation document, it is given as in the attached doc.

When I tried to download the software, I got one single zip file of 7.2 GB with the description:

"HP_Operations_Analytics_for_HP_OneView_Sept_2015_Z7550-96174.zip".

Is this the same as the 4 zip files combined, or is this a different one?

 

 

 

OpsA self-monitoring

Hi people,

As far as I know, OpsA uses an automatically configured Flex connector for self-monitoring.

However, for the integration it asks for the serial number of the Logger host. How can I get the serial number of the Logger host?

 

To view Operations Analytics collector logs, you need to run opsa-flex-config.sh on each Operations Analytics Collector host and perform the following steps from the command line:

  1. Review the list of Logger hosts already configured for the opsa_default tenant.

  2. Enter the serial number of the Logger host for which you want to configure the Operations Analytics Log File Connector for HP Arcsight Logger.

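A minimal sketch of the steps above, assuming the script sits under the OpsA home seen elsewhere in this digest (/opt/HP/opsa); the exact path is an assumption to verify on your collector hosts.

# Sketch only -- the scripts path is an assumption; the steps mirror the list above.
cd /opt/HP/opsa/scripts                # assumed location of the collector scripts
./opsa-flex-config.sh                  # run on each Operations Analytics Collector host
# 1. The script lists the Logger hosts already configured for the opsa_default tenant,
#    including their serial numbers.
# 2. Enter the serial number of the Logger host for which you want to configure the
#    Operations Analytics Log File Connector for HP ArcSight Logger.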

 

HP OpsA - ArcSight Logger integration issue

Good day

 

We would like to find out the following.

 

We currently have OpsA 2.31 integrated with ArcSight Logger.

Windows events are being populated; however, when we look at the arcsight_log_stream table in Vertica, it seems it is only populating some of the 130 fields from Logger.

Has anyone had the same issue? I can see there are some config files that I could possibly edit, but I would like to see if anyone else has done this before while I am awaiting an answer from HPE Support.
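One way to see which columns are actually being filled versus left empty is to query the table directly on the Vertica node; this is only a sketch, and the vsql user, database, and example column name are assumptions to adjust for your environment.

# Hypothetical checks from the Vertica node -- adjust user/database/schema names.
vsql -U dbadmin -c "SELECT column_name FROM v_catalog.columns WHERE table_name ILIKE 'arcsight_log_stream';"
vsql -U dbadmin -c "SELECT COUNT(*) AS rows_total, COUNT(name) AS name_filled FROM arcsight_log_stream;"
# 'name' above is just an example column -- repeat the COUNT(<column>) check for each field you expect Logger to populate.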

 

Thanks
