Saturday, December 4, 2010

XTIVIA Colorado Springs Holiday Party

I think the Holiday party in Colorado Springs was a success. Most of the team was able to join us, and we even had a visit from Santa Claus from the North Pole. Here is the link to view the photos that I took.

http://picasaweb.google.com/drobincorvette/20101204#

We had some great food and even homemade beer that was really tasty! Thank you to all the XTIVIA employees in Colorado Springs!

We look forward to seeing pictures and stories from the other offices.

Happy Holidays!

Friday, December 3, 2010

Larry Vs. HP

Came across this today - should HP worry? Or is this a case of Larry being Larry?

http://online.wsj.com/article/SB10001424052748703377504575651260196782120.html?mod=WSJ_business_whatsNews

The Data Quality Challenge

According to a study published by The Data Warehousing Institute (TDWI) entitled Taking Data Quality to the Enterprise through Data Governance, some data quality issues are primarily technical in nature, such as the extra time required for reconciling data (85%) or delays in deploying new systems (52%). Other problems are closer to business issues, such as customer dissatisfaction (69%), compliance problems (39%) and revenue loss (35%). Poor-quality data can also cause problems with costs (67%) and credibility (77%).

Another statement that we come across frequently is: “Less than 5% of our data is bad.”

If you consider a fact table with 10 million rows, then about 500,000 of them are bad - a huge problem for analytics, operations, or anything else you may want to do with the data!

Even if you choose to ignore all the statistics and studies, it comes down to just a few questions:

  1. How do we get the information that we need and where do we get it from?
  2. Can we trust this information?
  3. What does it mean and how can we get this information in the format we need?

Data Quality Definition: Data is commonly deemed to be of high quality if it correctly represents the real-world constructs to which it refers.

The implementation of Data Quality processes and procedures can dramatically affect the quality and therefore usability of the information in your Data Warehouse.

The Goal: The reason for building a Data Warehouse is to consolidate data from multiple sources, both internal and external, into an enterprise solution. This consolidation involves ensuring data consistency, integrity and quality. The data warehouse is the medium for providing enriched analytical data to empower better decision making. The goal is not to copy the same old bad information from multiple locations, providing little or no benefit to the enterprise.

We must consider two key factors to achieve this goal.

Data Profiling identifies the problems. It provides snapshots of a company's data quality and measures the evolution of data quality over time.

Data Cleansing corrects, or "cleanses," incomplete or inconsistent data by cross-checking against other databases and reference data. It also enriches data by providing value-added information that improves the quality and usefulness of existing data.
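To make the profiling idea concrete, here is a minimal sketch in Python. The dataset, field names, and validity rules are entirely hypothetical - a real profile would run against your actual tables and reference data - but it shows the basic pattern: measure completeness and cross-check values against reference data, then report the percentages.

```python
# Minimal data-profiling sketch. The customer records, field names, and
# VALID_STATES reference set below are hypothetical examples.

customers = [
    {"id": 1, "email": "a@example.com", "state": "CO"},
    {"id": 2, "email": None,            "state": "TX"},
    {"id": 3, "email": "c@example.com", "state": "XX"},  # invalid state code
]

VALID_STATES = {"CO", "TX", "NY"}  # reference data to cross-check against

def profile(rows):
    """Return simple completeness and validity metrics for the dataset."""
    total = len(rows)
    missing_email = sum(1 for r in rows if not r["email"])
    bad_state = sum(1 for r in rows if r["state"] not in VALID_STATES)
    return {
        "rows": total,
        "pct_missing_email": 100.0 * missing_email / total,
        "pct_invalid_state": 100.0 * bad_state / total,
    }

report = profile(customers)
print(report)
```

Running this kind of profile on a schedule gives you the "snapshots over time" described above: the percentages become a trend line you can watch as cleansing efforts take effect.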

I will post more on the two factors soon...

DB2 table reorg performance - Problem Solved

One of our DB2 clients has a 600 GB database that contains two large tables (between 10 and 20 GB each) that needed to be reorged. Our options were limited because the tables need to stay online 24x7. Testing on the development system (same hardware, version of DB2, and data) projected that an INPLACE online reorg would take several weeks! I tested with an INPLACE reorg so the process could be started, stopped, paused and resumed as needed.
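For reference, an inplace reorg can be controlled with commands along these lines (the table name here is a placeholder; check your DB2 version's REORG syntax before use):

```sql
-- Start an online (inplace) reorg while the table stays writable
REORG TABLE MYSCHEMA.BIGTABLE INPLACE ALLOW WRITE ACCESS;

-- Pause it during peak hours, then resume later
REORG TABLE MYSCHEMA.BIGTABLE INPLACE PAUSE;
REORG TABLE MYSCHEMA.BIGTABLE INPLACE RESUME;
```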

The row size of each table was near the 4k pagesize limit. So as a test, I moved these tables from a 4k pagesize tablespace to an 8k pagesize tablespace. This process included creating the new 8k bufferpool and tablespace, creating the clone tables with a slightly different name in the new tablespace, loading from cursor from the old table to the new table, dropping all constraints on the old table, renaming the old table to *_OLD, then renaming the new table to the correct name and recreating all the indexes, constraints and views. Finally, a runstats was required on the tables and all indexes.
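The steps above can be sketched roughly as follows. All object names, sizes, and page sizes are illustrative, and a real migration would also need the DDL to drop and recreate the specific constraints, indexes, and views on the tables involved:

```sql
-- Create the 8k bufferpool and tablespace (sizes are placeholders)
CREATE BUFFERPOOL BP8K SIZE 10000 PAGESIZE 8K;
CREATE TABLESPACE TS8K PAGESIZE 8K BUFFERPOOL BP8K;

-- Clone the table into the new tablespace and load from cursor
CREATE TABLE MYSCHEMA.BIGTABLE_NEW LIKE MYSCHEMA.BIGTABLE IN TS8K;
DECLARE C1 CURSOR FOR SELECT * FROM MYSCHEMA.BIGTABLE;
LOAD FROM C1 OF CURSOR INSERT INTO MYSCHEMA.BIGTABLE_NEW;

-- Swap the names (after dropping constraints on the old table;
-- indexes, constraints and views are then recreated on the new one)
RENAME TABLE MYSCHEMA.BIGTABLE TO BIGTABLE_OLD;
RENAME TABLE MYSCHEMA.BIGTABLE_NEW TO BIGTABLE;

-- Refresh optimizer statistics
RUNSTATS ON TABLE MYSCHEMA.BIGTABLE AND INDEXES ALL;
```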

The INPLACE reorg now takes 10 minutes! This made the table reorgs simple to run at any time because they are never offline. The reorg helps performance and helps to limit disk space usage.
