Wednesday, 18 November 2020

Generating Reports from Db2 historical monitoring (db2histmon) data

 - David Sciaraffa, Software Engineering Manager – IBM Db2


(This blog is a follow-up to my previous blog which describes the Db2 Historical Monitoring (db2histmon) framework.)

The Db2 Historical Monitoring (db2histmon) scripts (available here) collect a broad range of Db2 and operating system diagnostic data and retain it for a period of time, allowing for basic triage of many types of issues. Diagnostic information is collected at various intervals (ranging from 1 minute to 1 hour, depending on the kind of data), and the collections can be easily customized and expanded. The scripts collect data only while the database is activated, which helps reduce the generation of unnecessary redundant data.

This blog provides an overview of the new report generation script.


The new report generation script can be used to scrape previously generated db2histmon data collections and produce summary reports in multiple flavours, similar to the reports produced by the MONREPORT module. The reports are useful for analyzing various metrics, and for highlighting statistical outliers, across previous time periods.

Downloading the db2histmon report script:

The full set of db2histmon scripts can be downloaded from the open-source Github repo: https://github.com/IBM/db2histmon.
The report.py script itself is located in the 4_report/ sub-folder:

(db2inst1@myhost1) /home/db2inst1/db2histmon-master/
$ ls
 
1_setup/
2_loader/ 
3_quickparser/ 
4_report/

Pre-requisites:

  • The scripts can be deployed in any Unix or Windows environment where Db2 is supported.
  • Python 2 or higher.
  • The Python pandas, argparse, and numpy modules are also required to execute the report.py script.
Using pip to install the pandas, argparse, and numpy modules:
(db2inst1@myhost1) /home/db2inst1/
  $ pip install argparse
Collecting argparse
  Downloading argparse-1.4.0-py2.py3-none-any.whl (23 kB)
Installing collected packages: argparse
Successfully installed argparse-1.4.0
(db2inst1@myhost1) /home/db2inst1/
  $ pip install numpy
Collecting numpy
  Downloading numpy-1.19.4-cp38-cp38-win_amd64.whl (13.0 MB)
     || 13.0 MB 656 kB/s
Installing collected packages: numpy
Successfully installed numpy-1.19.4
(db2inst1@myhost1) /home/db2inst1/
  $ pip install pandas
Collecting pandas
  Downloading pandas-1.1.4-cp38-cp38-win_amd64.whl (8.9 MB)
     || 8.9 MB 1.1 MB/s
Collecting python-dateutil>=2.7.3
  Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
     || 227 kB 1.7 MB/s
Installing collected packages: python-dateutil, pandas
Successfully installed pandas-1.1.4 python-dateutil-2.8.1
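
Before running report.py, it can help to verify that the required modules import cleanly under the Python interpreter you intend to use. Here is a minimal sketch (the module list simply mirrors the pre-requisites above; it only imports each module and prints its version where available):

# check_modules.py -- a minimal sketch that verifies the modules required by
# report.py can be imported by the current Python interpreter.
import importlib

for name in ("argparse", "numpy", "pandas"):
    try:
        module = importlib.import_module(name)
        version = getattr(module, "__version__", "version unknown")
        print("{:10s} OK  ({})".format(name, version))
    except ImportError:
        print("{:10s} MISSING - install it with: pip install {}".format(name, name))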

Types of reports:

The report.py script is currently capable of generating three unique types of reports:

DBSUMMARY - Provides a summary of metrics for the entire database, similar to the MONREPORT.DBSUMMARY procedure.

CONNECTION - Provides a summary of metrics for each connection, similar to the MONREPORT.CONNECTION procedure.

PKGCACHE - Provides a summary of metrics for the top executed statements that are still cached within the package cache, similar to the MONREPORT.PKGCACHE procedure.


A report will display various metrics and data points for every data collection time period (or only for the time periods specified in the arguments), as described later.
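
Since each report type is selected with a single --report value, one way to produce all three reports in a single pass is a small wrapper script. The sketch below is only an illustration: it assumes report.py writes the report text to standard output (as the examples further below suggest), uses the lowercase report names shown in the usage text, and uses a placeholder data path.

# generate_all_reports.py -- a minimal sketch (not part of db2histmon) that runs
# report.py once per report type and saves each report to its own text file.
import subprocess

DATA_PATH = "/path/to/db2histmon/data_files/"   # placeholder path

for report in ("dbsummary", "connection", "pkgcache"):
    outfile = "report_{}.txt".format(report)
    with open(outfile, "w") as f:
        # assumes report.py prints the report to stdout
        subprocess.check_call(["python", "report.py", "-r", report, DATA_PATH],
                              stdout=f)
    print("wrote " + outfile)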


Historical data which is used to generate the report:

The report.py script will scrape historical monitoring data that was previously collected by the Db2 Historical Monitoring (db2histmon) framework. 

Please see the 'The Data Collection and Data Archive folders' section of my previous blog, which describes the Db2 Historical Monitoring (db2histmon) framework, if you're unsure where this historical monitoring data may reside.
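
If you're not sure what is present under that folder, a quick way to take inventory is to list the collection sub-folders and the time range they span. A minimal sketch, assuming the sub-folders are named with the YYYY-mm-dd-HH-MM-SS.ffffff timestamp format shown in the report.py usage text later in this blog:

# list_collections.py -- a minimal sketch that lists db2histmon collection
# sub-folders and the overall time range they cover.  Assumes sub-folders are
# named with the YYYY-mm-dd-HH-MM-SS.ffffff timestamp format.
import os
import sys

path = sys.argv[1] if len(sys.argv) > 1 else "."
collections = sorted(d for d in os.listdir(path)
                     if os.path.isdir(os.path.join(path, d)))

print("{} collections found under {}".format(len(collections), path))
if collections:
    print("earliest: " + collections[0])
    print("latest:   " + collections[-1])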

For reference, I'm including a screen-shot showing the contents of the db2histmon data collection folder:




 

Usage and options:

The report.py script has the following usage options:

(db2inst1@myhost1) /home/db2inst1/db2histmon-master/4_report/ 
   $ python report.py -h
 
usage: report.py [-h]  [-r {dbsummary,connection,pkgcache}]
                 [-st START_TIME] [-et END_TIME] [-p PERIOD]
                 [-s] [-m MEMBERS [MEMBERS ...]] 
                 [-ah APPLICATION_HANDLES [APPLICATION_HANDLES ...]] path

positional arguments:
  path                  path where collection directories are stored

optional arguments:
  -h, --help            show this help message and exit
  -r {dbsummary,connection,pkgcache}, --report {dbsummary,connection,pkgcache}
                        name of report to be generated
  -st START_TIME, --start_time START_TIME
                        display collections after this time, format can be (YYYY-mm-dd HH:MM:SS) or copied from a
                        collection (YYYY-mm-dd-HH-MM-SS.ffffff)
  -et END_TIME, --end_time END_TIME
                        display collections before this time, format can be (YYYY-mm-dd HH:MM:SS) or copied from a
                        collection (YYYY-mm-dd-HH-MM-SS.ffffff)
  -p PERIOD, --period PERIOD
                        display collections at a lesser frequency, for example an interval of 3 will show every third
                        collection
  -s, --stats           show min, max, mean, std for each series of values
  -m MEMBERS [MEMBERS ...], --members MEMBERS [MEMBERS ...]
                        filter certain reports by member
  -ah APPLICATION_HANDLES [APPLICATION_HANDLES ...], --application_handles APPLICATION_HANDLES [APPLICATION_HANDLES ...]
                        filter certain reports by application handle
 

Use the --report option to specify the type of report to generate, either DBSUMMARY, CONNECTION, or PKGCACHE.

The last argument specifies the folder path where the db2histmon historical monitoring data resides.

I'll explain the other options within the examples further below.


Example 1. Generate a DBSUMMARY report (on all available historical monitoring data)


(db2inst1@myhost1) /home/db2inst1/db2histmon-master/4_report/
$ python report.py -r DBSUMMARY /path/to/db2histmon/data_files/
 





Example 2. Generate a report on a subset of time-intervals, by using the START_TIME and END_TIME options

In some cases, the quantity of available or archived db2histmon historical monitoring data collections is very large, which can make the reports too long to be readable.

We can use the --start_time and --end_time options to limit the data collection time-intervals that we wish to generate a report on.
In this example, we wish to include all data collections that were generated after 2020-09-29 02:20 but before 2020-09-29 04:30:

(db2inst1@myhost1) /home/db2inst1/db2histmon-master/4_report/
$ python report.py -r DBSUMMARY --start_time "2020-09-29 02:20:00" --end_time "2020-09-29 04:30:00" /path/to/db2histmon/data_files/
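
As the usage text shows, the time values can either be typed as "YYYY-mm-dd HH:MM:SS" or copied verbatim from a collection folder name. If you prefer the plain form, converting a folder-name timestamp is a one-liner with the standard datetime module (the folder name below is only an illustrative example):

# a minimal sketch: convert a collection folder timestamp
# (YYYY-mm-dd-HH-MM-SS.ffffff) into the plain "YYYY-mm-dd HH:MM:SS" form
# accepted by --start_time and --end_time.
from datetime import datetime

folder_name = "2020-09-29-02-20-13.123456"          # hypothetical folder name
ts = datetime.strptime(folder_name, "%Y-%m-%d-%H-%M-%S.%f")
print(ts.strftime("%Y-%m-%d %H:%M:%S"))             # -> 2020-09-29 02:20:13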
 




Example 3. Generate a report with time-intervals condensed into larger time PERIODs.

We can also use the --period option to combine time-intervals from the data collection set into larger sets, such that fewer (but longer-duration) time-intervals are displayed in the report.
In this example, many of our original db2histmon historical monitoring data collections are approximately 5 minutes apart (as can be seen in the previous report example above). By specifying the --period 3 option, we aggregate the data that would normally be reported in three separate time-periods into one time-period:

(db2inst1@myhost1) /home/db2inst1/db2histmon-master/4_report/
$ python report.py -r DBSUMMARY --period 3  /path/to/db2histmon/data_files/
 



Example 4. Generate a report showing MIN, MAX, MEAN statistical values.

Understanding the statistical values associated with a particular metric can be useful when looking for anomalies in the data set.  In this example, I specify the --stats option to include the statistics for each reported value:

(db2inst1@myhost1) /home/db2inst1/db2histmon-master/4_report/
$ python report.py -r DBSUMMARY --stats /path/to/db2histmon/data_files/
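
The statistics shown by --stats are ordinary descriptive statistics computed over each metric's per-interval values. For intuition, here is a small pandas sketch over a made-up series of values for a single metric (the numbers are purely illustrative):

# a minimal sketch: the --stats columns (min, max, mean, std) are standard
# descriptive statistics over a metric's per-collection values.
import pandas as pd

# made-up per-collection values for a single metric (e.g. total commits)
values = pd.Series([120, 135, 118, 460, 129, 131])

print(values.agg(["min", "max", "mean", "std"]))
# the outlier (460) stands out once the max is compared against the mean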




IBM Db2 - Technical (Lab) Advocate Newsletter - November 2020

 


IBM Db2 & Db2 on Cloud
Technical (Lab) Advocate Newsletter
November 2020


    Announcement - IBM Db2 Version 11.5 Modification Pack 5 is Now Available!

    Mod Pack Summary of Changes: https://ibm.co/3nHLIn7
    Db2 11.5.5.0 Download Page: https://www.ibm.com/support/pages/node/6368787
    Db2 11.5.5.0 Full APAR Fix List: https://www.ibm.com/support/pages/node/1102545
    Db2 11.5.5.0 Client Drivers Download Page: https://www.ibm.com/support/pages/node/323035

    Some Key Highlights:
    • Advanced Log Space Management - Improvements for mirrored logging support and monitoring.
    • Automated failover to HADR standby with Pacemaker (on Linux for on-premises and non-containerized cloud deployments) for production environments.
    • Support for Snowflake data sources through federation JDBC and ODBC connections.
    • Support for MySQL Community Edition data sources through federation ODBC connections.

    • Automatic Recompress feature enhancement for columnar tables.
    • pureScale - Automated periodic cluster topology verification, for daily cluster health checks.
    • pureScale - Improved Remote Direct Memory Access (RDMA) connectivity validation, using Db2 Lightspeed RDMA ping.
    • JDBC 4.3 specification API support; OpenJDK 13 runtime support; OData enhancement.
    • pureScale - Concurrent online fix pack update.

    • Enhanced query performance against column organized tables.
    • Db2 Historical Monitoring (db2histmon) - generate MONREPORT-style reports on historical monitor data.
    • New schema level access control of database objects.

    • UPDATE and DELETE statement SKIP LOCKED DATA support for CDE tables.
    • Machine Learning Optimizer technology preview refresh.
    and much more: https://ibm.co/3nHLIn7


    Webinar Series - Db2 Offering Management Webinar Series (including Db2 v11.5.5 Deep Dive)

    The Db2 Offering Management team provides a series of informative webinars covering Db2 roadmaps, strategies, new feature and capability overviews, and much more:
    • Unboxing Db2 11.5.5 - Get an overview of all features and enhancements in Db2 11.5.5.
    • Performance and Security Updates - Hear about the latest enhancements, including topics such as schema-level authorization, compression improvements, and compact varchar enhancements.
    • Db2 Modern Workloads - Hear about the latest enhancements focused on supporting modern workloads, including the ability to handle mixed workloads in a high-performing manner, integrated data virtualization, and the ability to leverage a multi-model data engine spanning JSON and Graph, with demos of Db2's NoSQL and NewSQL capabilities.
    • Db2 Containers and Cloud Pak for Data - Hear about the latest enhancements for containerized deployment, and why it makes business and technical sense to move to Db2 on CP4D.
    • Availability and Resiliency Updates - Hear about the latest enhancements in availability and recoverability, with the introduction of automated HADR with Pacemaker.
    • Db2 on Cloud - If you are considering Db2 Cloud services for your workloads, this session will cover both Db2 services - Db2 on Cloud (OLTP) and Db2 Warehouse on Cloud. We will also discuss the strategy and roadmap so this will be a great session if you are strategizing to move to Db2 on Cloud soon or are already on it.
    • Data Management Console for Db2 Administrators - Learn about the latest updates in the Data Management console to help Db2 Administrators manage Db2 environments.
    • New Packaging and Pricing - This session will cover the upcoming editions of Db2, its pricing and entitlement migration paths from previous versions to new versions.

    Register for access to recordings, as well as upcoming webinars:
    https://mailchi.mp/8613d131efe0/db2-nebula-webinar-landing-page


    Webinar - Db2 is Developer Inclusive

    In this session, the team will delve into topics such as: Integration with programming languages; Employing in-database machine learning; Leveraging REST APIs. The team will also provide a preview of Db2 11.5.5!

    When: 11th November, 11:00AM EST
    https://ibm.webcasts.com/starthere.jsp?ei=1392215&tp_key=2d29881a0c


    Webinar - Db2 is AI ready

    Please join Quentin Presley (Development Manager) and Roger Bitar (Offering Manager) as they detail how enterprises can become smarter with a data management system that is AI-ready. During this session, learn how Db2 augments, embeds, and infuses AI on a flexible and scalable architecture, offering governance to facilitate innovative data estates and infrastructure.
    https://ibm.webcasts.com/starthere.jsp?ei=1389272&tp_key=6e769cdbb6


    Problem Determination - Db2 Historical Monitoring (db2histmon) - New Report Generation

    To aid with problem determination and improve the availability of diagnostic and monitoring data at the first occurrence of a problematic event, the new Db2 Historical Monitoring (db2histmon) scripts and framework are now available. These scripts deploy a broad set of monitoring and diagnostic data collections, at varying intervals, and manage the archival and purging of historical data. Data collection sets can be customized as needed. A new report generation script can create MONREPORT-style reports based on historical monitoring data.
    Download the db2histmon scripts: https://github.com/IBM/db2histmon
    Learn more about deploying and using the db2histmon scripts: https://bit.ly/31uc580
    Learn more about the db2histmon report generation scripts: 
    https://bit.ly/36RMc3f


    Conferences & Summits & User Groups

    Over the years, IDUG has become known for hosting dynamic conferences packed with the cutting-edge content you need. This year will be no exception. Be sure to make your plans to attend. With great sessions, keynote speakers, workshops, and more, you won't want to miss this event.

    Upcoming Events: https://www.idug.org/events/calendar
    IDUG EMEA Db2 Tech Conference - November 16-19, 2020
    IDUG Latin America Virtual Db2 Tech Seminars - December 8-9, 2020
    IDUG Australasia Conference - March 15-19, 2021


    Upcoming Regional User Groups:
    Find Other Local Db2 User Groups in your area: https://www.idug.org/rug


    Webinar - New developments in Db2 pureScale

    Join Toby Haynes (Senior Manager, IBM Db2) as he provides an overview of recent enhancements to Db2 pureScale, improving on its already impressive extreme-availability architecture.
    https://www.dbisoftware.com/blog/db2nightshow.php?id=800


    New Offering - Db2 on Cloud Standard and Enterprise Plans

    Db2 on Cloud offers new Standard and Enterprise plans that include the following features:
    • Independent scaling of compute and storage
    • A three-node high-availability configuration spanning multiple availability zones (for high-availability plans)
    • Self-service managed backup with point-in-time restore
    • Private endpoint connectivity
    All customers on legacy plans (Flex or Precise Performance) must move to a Standard or Enterprise plan. For more information, see the following blogs:
    https://www.ibm.com/cloud/blog/announcements/ibm-db2-on-cloud-elastic-plans
    https://www.ibm.com/cloud/blog/announcements/deprecation-of-the-db2-on-cloud-legacy-plans-and-availability-of-new-replacement-plans


    Webinar - Intro to Db2 Managed Service for Enterprises

    Learn about the managed Db2 service on cloud for OLTP workloads. With Managed Database as a Service, users can focus on their applications, leaving the management to the service provider. IBM Db2 on Cloud Managed Service's robust features provide the depth to handle customer workload requirements such as high availability, scalability, on-demand capacity, and security.

    https://video.ibm.com/playlist/633288


    Webinar - Intro to Db2 Warehouse on Cloud

    Db2 Warehouse on Cloud is the elastic Software-as-a-Service (SaaS) offering from IBM that runs on multiple cloud platforms. Get to know the offering and learn about the new and upcoming features that make Db2 Warehouse on Cloud support larger warehouse environments with better Disaster Recovery (DR) capabilities.
    https://video.ibm.com/recorded/128072408


    Webinar - Db2 for SAP

    Learn why Db2 is best suited for SAP workloads, offering tight integration, accelerated performance, and reliability, all on a wide choice of platforms.
    https://video.ibm.com/playlist/633288/video/127806532


    Webinar - IBM Replication – Data Delivery in 2020

    IBM Gold Consultant and replication expert Frank Fillmore delivered an information-packed presentation covering an overview of the IBM replication solutions that are available today. Whether you are a current IBM replication user, investigating replication for a new edge system, or working with a competitor's solution, this webinar will help you understand what IBM replication solutions support today, and the latest sources, targets, and deployment models to help your organization successfully deliver fast and accurate replicated data.
    https://bit.ly/31gqKDw


    Webinar - Data and AI pre-configured: IBM Cloud Pak for Data System vs. the competition

    IBM Cloud Pak® for Data System, a preconfigured “cloud in a box,” helps companies modernize their data and quickly get AI projects up and running. Join experts from IBM and analyst firm Cabot Partners to learn why IBM Cloud Pak for Data System is a leader in hyper-converged platforms for data and AI. This webinar will walk through a comprehensive competitive analysis of IBM Cloud Pak for Data System against AWS Outposts, Google Anthos, Microsoft Azure Stack/Hybrid Cloud and Oracle Cloud at Customer.
    https://bit.ly/3dnDTxa


    _____________________________________________________________________________________________

    Support - Overview of IBM Support Framework, Guidelines, and Escalation methods
    Please review the IBM Support guidelines for a description of the IBM support framework; how to get set up for IBM support; methods to help yourself via Watson search; and how to escalate a support case.
    https://www.ibm.com/support/pages/ibm-support-guide

    Roadmaps - Db2 Development Roadmaps are Accessible to Everyone
    Curious about the features and functions committed to upcoming versions of Db2? You can now explore live development roadmaps for Db2 and the IBM Analytics family of products. This content is live and subject to change.
    https://ibm-analytics-roadmaps.mybluemix.net/

    Db2 Community - Follow the Db2 Developer and Administrator Community
    Share. Solve. Do More.
    https://developer.ibm.com/data/db2/

    Follow-us! - Stay up to date on the latest news from the Db2 team
    IBM Support Community: (https://www.ibm.com/mysupport/)
    IBM DB2 DeveloperWorks (http://bit.ly/sig_Blog)
    IBM DB2 Twitter page (https://twitter.com/db2_support)
    IBM Youtube Channel (http://bit.ly/sig_YouTube)
    developerWorks Answers (forum): https://developer.ibm.com/answers/topics/


