Tip of Iceberg

Subject: PC iceberg: PC spreadsheets and databases

A week ago we started checking PC spreadsheets and databases using <a vendor's package>.  I have mixed emotions about the results. On one hand, the package seems to work very well and provides some detailed assessments of the PC files. On the other hand, it has uncovered some monsters under the ice.

<Snip>

If our continuing assessment plays out like the trial on this tool, we're in deep trouble!!! The sheer volume of suspect PC files could overwhelm our ability to review, not to mention upgrade, all of them. Some of these systems have been around for quite a while and some are very new.  Some developers are simply computer-literate business folks and others are professional programmers.   So far, it hasn't made much difference who built what.

Message: Look REAL CLOSE at your PC applications. It may change your whole perspective on the depth of the Y2K problem.

Larry,

Let me offer a few comments for you to consider about your pilot.

First, have you looked at how recently each spreadsheet and database was last accessed (I did not say the last "saved" date, I said the last "accessed" date) before you run your scan? If you run the scan first, you will lose this information because the date of your scan will become the last-accessed date. Hopefully you will find some that have not been opened for the last year or so, and you may be able to reduce the quantity of files you have to remediate that way. Be careful with this: opening the files in read-only mode (or even just copying them) may update the last-accessed date. The largest problem I found was that my backup programs updated the last-accessed date. You also need to make this decision in consultation with the user; maybe he has some spreadsheets that are only opened for end-of-year processing or something.
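
In case a sketch helps, here is roughly what I mean in Python. The root folder, the file extensions, and the one-year cutoff are only placeholders for whatever fits your shop; the point is simply to capture the last-accessed dates before a scan, a copy, or a backup refreshes them.

import time
from pathlib import Path

ROOT = Path(r"C:\UserFiles")        # placeholder: wherever the PC files live
EXTENSIONS = {".xls", ".mdb"}       # placeholder: spreadsheet and database types
ONE_YEAR = 365 * 24 * 60 * 60       # cutoff in seconds

now = time.time()
stale = []
for path in ROOT.rglob("*"):
    if path.suffix.lower() not in EXTENSIONS:
        continue
    # st_atime is the last-accessed time; merely copying or backing up a
    # file (or running a scanning tool over it) may already have refreshed it.
    atime = path.stat().st_atime
    if now - atime > ONE_YEAR:
        stale.append((atime, path))

# Oldest first: candidates to set aside after checking with the user
# (some may only be opened for end-of-year processing).
for atime, path in sorted(stale):
    print(time.ctime(atime), path)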

Second, have you considered further reducing your effort by cutting down the quantity of duplicate files that you have to remediate? This can safely be done by identifying duplicate files before you remediate, and then distributing copies of the remediated spreadsheet to the PCs that held identical copies. How many times do you want to remediate the same spreadsheet? Again, you have to be careful here: duplicate files do not necessarily have the same name, size, or date, but they may still be duplicates. Successful spreadsheets are especially prone to being created on one PC and emailed to coworkers, multiplying your iceberg many times over.
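
One way to find the byte-for-byte copies, no matter what they were renamed to or where they were emailed, is to hash the file contents. A minimal sketch along the same lines as above (the root folder is again a placeholder, and note that reading the files may itself refresh their last-accessed dates, so do this after the inventory in the first point):

import hashlib
from collections import defaultdict
from pathlib import Path

ROOT = Path(r"C:\UserFiles")  # placeholder root

def content_digest(path, chunk_size=1 << 20):
    # Hash the raw bytes so identical copies match even when the file
    # name, folder, or date stamp differs.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

copies = defaultdict(list)
for path in ROOT.rglob("*.xls"):
    copies[content_digest(path)].append(path)

for digest, paths in copies.items():
    if len(paths) > 1:
        # Remediate one, then redistribute it to the others.
        print(f"{len(paths)} identical copies:", *paths, sep="\n  ")

The catch is that a copy someone opened and re-saved will no longer hash identical, which is where the structural grouping in the next point picks up the slack.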

Third, you can reduce the effort further with some high-level planning. If you remediate structurally identical spreadsheets together, the work will be less because you will find the same things in each batch. For example, if you have a spreadsheet that is updated monthly with the current month's sales statistics, the monthly copies are all structurally identical; only the numbers and dates have changed. What needs to be done to one needs to be done to all of them.
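
If the files are in a format a library can read, even the batching can be automated. Here is a sketch of the idea using Python and the third-party openpyxl library; openpyxl only reads the newer .xlsx format, so treat this purely as an illustration: fingerprint the formulas and their cell addresses while ignoring the literal numbers and dates, so the monthly copies of the same report fall into one batch.

import hashlib
from collections import defaultdict
from pathlib import Path

from openpyxl import load_workbook  # third-party; pip install openpyxl

ROOT = Path(r"C:\UserFiles")  # placeholder root

def structural_fingerprint(path):
    # Hash only sheet names, cell addresses, and formula text; the
    # literal values in input cells are deliberately ignored.
    digest = hashlib.sha256()
    workbook = load_workbook(path, data_only=False)
    for sheet in workbook.worksheets:
        for row in sheet.iter_rows():
            for cell in row:
                if cell.data_type == "f":  # formula cell
                    digest.update(f"{sheet.title}!{cell.coordinate}={cell.value}".encode())
    return digest.hexdigest()

batches = defaultdict(list)
for path in ROOT.rglob("*.xlsx"):
    batches[structural_fingerprint(path)].append(path)

for fingerprint, paths in batches.items():
    print(f"Batch of {len(paths)}:", *(p.name for p in paths), sep="\n  ")

However you group them, the payoff is the same: figure out the fix once per batch, then apply it to every member.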

Fourth, how are you testing your spreadsheets? It is my understanding that some tool manufacturers are telling people to re-run the assessment tool to "test" the spreadsheet, and that is it. Larry, you have been in the Y2K remediation business for some time now; how many mainframe applications have you been willing to declare fully tested by merely re-running your assessment tool? What was the reason for that? Do you think the same thing applies to mission-critical applications that happen to be located on desktops?

Vic Fanberg
http://www.dateWise.com