HP3000-L Archives

February 1998, Week 4

HP3000-L@RAVEN.UTC.EDU


From: Richard Gambrell <[log in to unmask]>
Date: Mon, 23 Feb 1998 10:05:02 -0600
Content-Type: text/plain
Neil wrote:
>
> I distinctly remember (in 1975) having a conversation with my Data
> Processing Manager at the time during the early design phase of a
> medical insurance system (which can be described as date-rich), and we
> discussed at length the pros and cons of adding centuries to the
> dates.
>

This seems to address the critical ethical consideration - were the risks
assessed and disclosed - even if management isn't as willing as Neil's
was to discuss the alternatives.  But that doesn't tell the whole story:
what of management that always takes the short-term way out (see my
story below)?

[snip discussion]
> Only in 1990, when the system was totally rewritten did we add
> centuries.
>

This raises what seems to me a critical issue with Y2K decisions. How
could design decisions take into account the number of years before a
major rewrite of the application?  In most environments I've heard
about, projecting the schedule for the next upgrade or rewrite is
impossible.

> So, the Year 2000 problem may well be placed firmly in the laps of the
> bean counters :)
>
> Neil
>

In my experience, we were always doing enhancements and partial rewrites
to existing applications, or had requirements for tight integration with
existing applications, which effectively stopped us from introducing
4-digit years.  One of our major design goals (for the most part
achieved) was consistency in the user interfaces, so this also
reinforced keeping date handling the same as in the rest of the
applications.  Only with truly independent applications did we design in
4-digit years.
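For readers who haven't wrestled with it: the usual stopgap for a
2-digit year field is a "century window" that guesses the century from
a pivot value.  A minimal sketch in Python - the pivot of 50 is my
illustrative assumption, not anything from our actual applications:

```python
# Century-window heuristic for expanding a 2-digit year.
# PIVOT is an illustrative assumption: two-digit years below it
# are treated as 20xx, the rest as 19xx.
PIVOT = 50

def expand_year(yy: int) -> int:
    """Map a 2-digit year to a 4-digit year using a fixed pivot."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a 2-digit year (0-99)")
    return 2000 + yy if yy < PIVOT else 1900 + yy

# expand_year(98) gives 1998; expand_year(5) gives 2005.
```

Of course this only defers the ambiguity by a few decades - dates more
than a window's width apart still collide - which is why a real rewrite
with 4-digit years is the only actual fix.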

The date issue and similar risk decisions were initially discussed with
management, but management (even IT management) always wanted the most
user functionality for the time invested and paid little attention to
the underlying foundation or ease of maintenance and administration
issues.

We (programming management and analysts) eventually realized that if we
felt it was too risky to take a shortcut, or wanted to improve
maintainability, we just had to do it and not discuss it with IT
management.  This strategy, combined with management undercutting
staffing for programming projects, eventually led to the whole
enhancement and upgrade process getting bogged down.  In reaction,
programming moved back toward a quick-and-dirty "fix it and go on"
approach, leading to real maintenance headaches (i.e., what are now my
headaches).

Where does the ethical thing lead here?

It would be my hope (tempered with a great deal of practical skepticism)
that an ethics code and "standard" criteria for professionalism could be
used as a defense against management shortsightedness, at least insofar
as blame falls on the application designer.

Richard - "No wonder my head hurts" - Gambrell
