Oracle Exalytics Evaluation Metrics
Posted on June 10, 2013
Author: Ron Woodlock, Performance Architects

It is self-evident that porting an existing application onto an Exalytics server will improve performance. However, anecdotal evidence of a possible performance improvement is generally not enough to justify an organization’s adoption of new technology. What is needed is a comparative set of performance and cost metrics for Exalytics and the legacy platform.

As a result, we’ve put together a set of measures and sample metrics for an Oracle Business Intelligence Enterprise Edition (OBIEE) application to assist in identifying high-quality candidate applications for migration to Exalytics, as well as in evaluating the performance and cost of those pilot applications once migrated.

Possible performance metrics include:

  • Data set size. Given the capabilities of the Exalytics environment, it would be best to select a large data set. Metrics in this category might include current footprint and average growth in data set size over a specific period of time.
  • Data usage. A large data set that isn’t used isn’t going to challenge the Exalytics environment. Candidate applications should have significant usage patterns (e.g., high peaks and breadth of data queried).
  • Peak number of concurrent users. The number of concurrent users is a general indicator of the volume of data being accessed and somewhat overlaps with other volume and size metrics, but is a good comparative measure.
  • BI Answers response times. Select a subset of BI Answers queries using some predefined criteria such as run time, most used, largest number of run-time calculations, etc. These measures should be consistent across evaluated applications.
  • BI Dashboard response times. Since dashboards often have a high profile, it may be helpful to include dashboards with metrics that are used by decision makers. Again, select dashboards consistently across the population of applications using predefined criteria.
  • Delivers response times. The response times for Delivers are less visible to the end-user. However, these jobs may have been set up using Delivers because they have long run times. In this case, there may be an opportunity to redevelop these as Answers or Dashboards and provide a quick win. Regardless, predefined criteria should be developed for scoring across applications.
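To make the response-time measures above comparable across applications and platforms, it helps to compute the same summary statistics (e.g., median and 95th-percentile run time) from each environment's query run times. The sketch below is illustrative only: the sample run times are invented, and capturing real values would come from your own usage-tracking data, not from any API shown here.

```python
# Illustrative sketch: summarize query run times into comparable metrics.
# All run-time values below are hypothetical example data.
from statistics import median, quantiles

def response_time_summary(run_times_seconds):
    """Summarize a list of query run times (seconds) into comparable metrics."""
    times = sorted(run_times_seconds)
    return {
        "count": len(times),
        "median_s": median(times),
        "p95_s": quantiles(times, n=20)[-1],  # 95th-percentile run time
        "max_s": times[-1],
    }

# Hypothetical run times for the same set of BI Answers queries,
# captured on the legacy platform and on Exalytics:
legacy = [4.2, 5.1, 6.8, 12.0, 3.9, 7.4, 22.5, 5.6]
exalytics = [0.8, 1.1, 1.4, 2.9, 0.7, 1.6, 4.2, 1.0]

print(response_time_summary(legacy))
print(response_time_summary(exalytics))
```

Using identical statistics for every evaluated application keeps the comparison consistent, per the predefined-criteria point above.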

Unlike the performance measures, possible Total Cost of Ownership (TCO) analysis criteria will require some creativity to compare the legacy and Exalytics environments effectively. Installation comparison criteria are likely to be difficult to develop. If specific time or cost measures are unavailable, quasi-anecdotal measures can be adopted. To do this effectively, build descriptive language into the scoring rubric (e.g., “installation requires significant troubleshooting” = score of n). For each measure, keep in mind the objective: determining the relative cost of the legacy platform compared to the Exalytics environment.
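One way to operationalize such quasi-anecdotal scoring is a simple rubric that maps agreed-upon descriptive statements to numeric scores. The statements and score values below are placeholders, not an Oracle-provided scale; the actual rubric should come out of stakeholder review.

```python
# Hypothetical rubric: descriptive installation-experience statements
# mapped to numeric scores (lower = less cost/effort). Labels and values
# are illustrative examples only.
INSTALL_RUBRIC = {
    "installed cleanly with no troubleshooting": 1,
    "minor issues resolved within a day": 2,
    "installation required significant troubleshooting": 4,
    "installation required vendor escalation": 5,
}

def score_observation(rubric, statement):
    """Return the agreed numeric score for a descriptive observation."""
    return rubric[statement]

print(score_observation(INSTALL_RUBRIC,
                        "installation required significant troubleshooting"))
```

Writing the statements down in advance is what makes the scores comparable across platforms, even when no hard time or cost figures exist.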

Here is an example of how you could score your existing environment versus your prospective Exalytics environment:

Measure       Metric                                  Scoring Criteria
------------  --------------------------------------  --------------------------------------------------
Data usage    Number of queries per period            Criteria should be developed based on some of the
                                                      most-used subject areas; set a high value on those
                                                      as a baseline.
              Number of tables queried per period
              Volume of data queried per period
Installation  Total installation time for hardware    Standalone metric.
              Average number of issues post install   Standalone metric.

Measures and metrics are going to include some subjectivity and could be open to critique. To mitigate this risk, include key stakeholders in the development or approval of the measures in advance. Developing measures and metrics, as well as capturing data for analysis, will support the overall Exalytics evaluation and adoption strategy.


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.
