Thursday, December 18, 2014

Resolve to Gain New Knowledge and Skill

As we approach the end of 2014 and look forward to next year, I want to encourage you to take stock of your knowledge and skill as they pertain to DB2 for i and data centric design and programming.

Your value to employers, clients, partners, and colleagues will diminish over time unless you "sharpen the saw".

To that end, I want to call your attention to a couple of public classes we are offering in Rochester, Minnesota (aka: the home office).

_________


DB2 for i Advanced SQL and Data Centric Programming

February 17-19, 2015

Skills taught


_________


DB2 for i SQL Performance Monitoring, Analysis and Tuning

February 24-27, 2015

Skills taught


_________


If you need more detail on what the respective classes cover, or why the knowledge and skill are critical success factors in the world of database engineering, please contact me. And if you want to explore a private and/or customized session, we can assist there as well.


“Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

 ― Abraham Lincoln


Wednesday, October 8, 2014

Trust, but Verify

I am often asked about the risk of migrating to a new version/release of IBM i: "should we go to 7.1 or 7.2?"

The same can be said about moving to the latest technology refresh (7.1 TR9 and 7.2 TR1 were just announced, by the way).

I prefer to talk about the rewards of keeping up with the advances in technology - keeping the tool belt fully stocked, so to speak.

So, should you install the latest and greatest?  My usual answer is "yes, and...".

Whether you are configuring new hardware, putting on a new set of group PTFs, installing the latest TR or migrating to IBM i 7.2, my sincere advice is based on an old Russian proverb:

Trust, but Verify

What this really means is that YOU should be testing the new hardware, testing the group PTFs, and verifying the TR code or the latest version of IBM i.  And I don't mean giving it a spin for a few days on the development system.  I'm talking about proper and adequate testing; a real verification of the features and functions. Find out for yourself: do they behave as advertised?
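Verification starts with knowing exactly what is installed. As a minimal sketch, assuming the QSYS2.GROUP_PTF_INFO catalog view provided by IBM i Services (column names as I recall them from the documentation - verify against your release), you can document the "before" state of the system with a simple query:

    -- List installed PTF groups and their levels so the pre-upgrade
    -- state of the system is recorded before anything changes
    SELECT PTF_GROUP_NAME,
           PTF_GROUP_DESCRIPTION,
           PTF_GROUP_LEVEL,
           PTF_GROUP_STATUS
      FROM QSYS2.GROUP_PTF_INFO
     ORDER BY PTF_GROUP_NAME;

Save the output with the date; it becomes the first data point in your verification record.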

Now here is the issue...  proper and adequate testing must be based on science, and some art.

SCIENCE, as in, using the scientific method:

  • Purpose or Question
  • Research
  • Hypothesis
  • Experiment
  • Analysis
  • Conclusion

And ART, as in - you have to be clever about how, when and where you apply the science.  If you are not testing the business processes that produce the transactions occurring in the production environment, you are not actually verifying anything, nor are you mitigating any risk. You are just fooling yourself.  And if you cannot pin down the variables and repeat the process consistently, the experiment will be inconclusive, and a waste of time.  I don't know how many times I have been in awkward conversations that go something like this:

DB2 user: "we just upgraded, my queries don't run fast anymore"

Mike: "I'm sorry to hear this... do you have any information captured about how the queries were running prior to the upgrade"?

DB2 user: "no"

Mike: "can you tell me how the queries were running prior to the upgrade"?

DB2 user: "yes, fast"

Mike: *heavy sigh*
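Don't be that DB2 user. Before the upgrade, capture a baseline you can compare against afterward. Here is a minimal sketch - the library, table, and query names are all hypothetical, and the BEGIN/END form assumes your release supports dynamic compound statements:

    -- Hypothetical baseline table for recording query observations
    CREATE TABLE VERIFYLIB.QUERY_BASELINE (
      QUERY_NAME  VARCHAR(64) NOT NULL,
      ENVIRONMENT VARCHAR(32) NOT NULL,   -- e.g. '7.1 TR9' or '7.2 TR1'
      ROW_COUNT   BIGINT,
      STARTED_AT  TIMESTAMP   NOT NULL,
      ENDED_AT    TIMESTAMP   NOT NULL
    );

    -- Time one representative production query and record the result
    BEGIN
      DECLARE V_START TIMESTAMP;
      DECLARE V_ROWS  BIGINT;
      SET V_START = CURRENT TIMESTAMP;
      SELECT COUNT(*) INTO V_ROWS
        FROM SALES.ORDER_DETAIL            -- hypothetical production table
       WHERE ORDER_DATE >= CURRENT DATE - 30 DAYS;
      INSERT INTO VERIFYLIB.QUERY_BASELINE
        (QUERY_NAME, ENVIRONMENT, ROW_COUNT, STARTED_AT, ENDED_AT)
        VALUES ('ORDERS_LAST_30_DAYS', '7.1 TR9', V_ROWS,
                V_START, CURRENT TIMESTAMP);
    END;

Run the same script after the upgrade with a new ENVIRONMENT value, and the awkward conversation becomes a simple comparison. A snapshot of the SQL plan cache taken before the upgrade is also worth its weight in gold for deeper analysis.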



When it comes to DB2 data integrity and data processing, three fundamental things need to be tested and verified:

  1. Correct results
  2. Performance
  3. Scalability

Correct results is the obvious one - did my request or process produce the expected answer or result?

Performance gets a lot of attention - did my request or process complete in the time expected?

Scalability is much more difficult to understand - did my request or process complete in the time expected when running with the full data set and under the stress of all the normal production activity?
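For the correct results check, SQL's set operators make the comparison itself a set-based operation. A minimal sketch, assuming the same query's output was captured into hypothetical before and after tables with identical formats:

    -- Any row returned here exists in one result set but not the other
    (SELECT * FROM VERIFYLIB.RESULT_BEFORE
      EXCEPT
     SELECT * FROM VERIFYLIB.RESULT_AFTER)
    UNION ALL
    (SELECT * FROM VERIFYLIB.RESULT_AFTER
      EXCEPT
     SELECT * FROM VERIFYLIB.RESULT_BEFORE);

An empty result set means the two runs agree; anything else warrants investigation.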

My recommendation is that you get in a position to test (and verify!) that the new hardware and/or software meets your requirements BEFORE implementing anything in the production environment.  And while you are at it, verify your rollback strategy in case something does slip by.

When it comes to testing and verifying DB2 for i, the point person should be your database engineer. If you don't have one, now is a good time to establish the position, install the candidates, and provide training and support. Don't forget to give them clear responsibility and authority to do the job.

If you don't have, or don't want to invest in, a full-fledged testing environment, or you want the subject matter experts to look over your shoulder, make a visit to IBM Rochester, Minnesota and embark on a performance and scalability engagement.

If you would like to discuss the science and art of verifying DB2 for i, please contact me.  We are here to help you be successful, and to mitigate risk.

Monday, September 15, 2014

Setting the Right SQL Course


The following guidance and course correction are compliments of DB2 Subject Matter Expert and Center of Excellence team member, Kent Milligan!
___________


Having been around DB2 for i since I started my IBM career a few “years” ago, it’s been a lot of fun to watch the usage of SQL grow in the i community over time. Some folks are using SQL with new databases and applications, but the vast majority of people are using SQL to modernize existing databases and applications.

Moving an existing database from DDS to SQL DDL (Data Definition Language) is a pretty straightforward process.  A new SQL statement is created to replace each DDS-created object in your DB2 for i database. And many IBM i customers have successfully completed this conversion from DDS to SQL DDL.
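As a minimal sketch of what that one-for-one object conversion looks like (every name here is hypothetical), a DDS physical file keyed on customer number and a keyed logical file over it might become:

    -- SQL DDL replacing a hypothetical DDS-created physical file
    CREATE TABLE APPLIB.CUSTOMER (
      CUSTNO    DECIMAL(7, 0)  NOT NULL,
      CUSTNAME  VARCHAR(50)    NOT NULL,
      CUSTSTAT  CHAR(1)        NOT NULL DEFAULT 'A',
      CREDLIMIT DECIMAL(11, 2) NOT NULL DEFAULT 0,
      PRIMARY KEY (CUSTNO)
    );

    -- SQL index replacing a hypothetical keyed DDS logical file
    CREATE INDEX APPLIB.CUSTNAME_IX
        ON APPLIB.CUSTOMER (CUSTNAME);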

When it comes to modernizing data access in applications, the transition to SQL is more challenging.  A significant number of IBM i developers have struggled with this change because their natural reaction is to replace each native record-level access request with an SQL DML (Data Manipulation Language) statement.  They are so quick to move to SQL that they forget that SQL is designed for set-based processing.

I think this car tire repair picture does a good job highlighting the issue with performing one-for-one replacements with SQL DML - functionally your programs will continue to work, but application performance is going to suffer just like the speed and handling of this “repaired” car.
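To make the contrast concrete, here is a hedged sketch reusing the hypothetical CUSTOMER table from above. The one-for-one replacement mimics record-level access with one statement per row; the set-based version hands the entire operation to the database engine in a single statement:

    -- Row-at-a-time (the pattern to avoid), sketched in comments:
    --   read next CUSTOMER record
    --   UPDATE APPLIB.CUSTOMER
    --      SET CREDLIMIT = CREDLIMIT * 1.10
    --    WHERE CUSTNO = :current-custno
    --   repeat until end of file

    -- Set-based: one statement over the whole qualifying set
    UPDATE APPLIB.CUSTOMER
       SET CREDLIMIT = CREDLIMIT * 1.10
     WHERE CUSTSTAT = 'A';

The set-based statement gives the query optimizer the whole problem at once, instead of thousands of tiny ones.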