HP3000-L Archives

February 1998, Week 1

HP3000-L@RAVEN.UTC.EDU

Subject:
From: Eric J Schubert <[log in to unmask]>
Reply-To: Eric J Schubert <[log in to unmask]>
Date: Sun, 1 Feb 1998 17:08:07 -0500
Nick Demos  [log in to unmask] says:
>1.  The 3000 has been upgraded, e. g. PARISC just like autos.
>2.  What's rear wheel drive got to do with it, anyway.

<off-the-subject> There is a limit to the horsepower and weight you can
put through a front wheel drive car and still handle it, especially
steering during acceleration.  I don't know the magic number, but beyond
too much horsepower, rear wheel drive is required (or so I read in a
mechanics magazine years ago...) <on-the-subject>

In a nutshell, 'rear wheel drive' represents the design of cars of the
1960s and early 1970s: large-frame, high-horsepower cars yielding 6 to 14
miles per gallon.  The market changed.  The appearance of smaller, lighter
front wheel drive cars from the former big-car companies represented a
commitment to 're-invention' of long-held 'core' drive train and chassis
technologies (breaking ranks with previous decades of more of the same
design...) in response to new market and environmental demands.

But this car-to-computing analogy has little to do with big-car technology.
It has more to do with attitudes toward a changing computing climate,
including the methods of process re-invention used to meet those challenges.

Starting in the late 1980s, the computing industry's environment began to
change drastically.  Again, being very simplistic about a very complex
subject...

What was changing in computing?

 1. Work pushed to the desktop, i.e., client-server.
 2. High-speed networks and the Internet.
 3. Unix workstations.

Which caused an industry-wide response...

 1. Standard, more open interfaces to data sources.
 2. Host-based Internet/intranet computing services.
 3. Productivity gains for users over their work processes.
 4. Enterprise-wide services.
 5. Multi-tier architectures.

The HP3000's response to these new market demands was at times uncertain
and slow, but it is slowly being pulled along today.

The Future?

I see the future of computing being defined by software components.
Software components allow an integrator to create applications visually,
utilizing objects from many sources.  Distributed objects become a machine
abstraction that lets an integrator create applications that live on any
remote server, more reliably and more uniformly than was ever possible
under previous methods.  Three tier gives the integrator fine control over
thin or thick clients, distributing the workload as they see best.
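
To make the tiering concrete, here is a minimal sketch in Java.  All names
are hypothetical and the wire mechanism (CORBA, DCOM, or the like) is left
abstract; the point is that the client codes against a middle-tier
interface, which in turn codes against a data-tier interface:

    // Data tier: think of an HP3000 holding IMAGE data.
    interface CustomerStore {
        String lookupCustomer(int id);
    }

    // Middle tier: the business object the client actually talks to;
    // under CORBA or DCOM this would be a remote object behind an ORB.
    interface CustomerService {
        String creditSummary(int customerId);
    }

    // The middle tier composes the data tier; the client never needs
    // to know which machine either tier runs on.
    class CustomerServiceImpl implements CustomerService {
        private final CustomerStore store;
        CustomerServiceImpl(CustomerStore store) { this.store = store; }
        public String creditSummary(int customerId) {
            return "Credit summary for " + store.lookupCustomer(customerId);
        }
    }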

Again, being simplistic, the computing needs of the future are:

 1. Three-tier DCOM or CORBA distributed-object infrastructures.
 2. Simple-to-use <intelligent> resource-management agents.
 3. Rapid Application Development with OO component-based software.

Three tier suggests multiple computing platforms, which to a single-platform
vendor is like suggesting to a GM exec that the traffic jam will end because
there is no future need for cars.  But perhaps the HP3000 could re-invent
itself as the first-tier data server, or the second-tier CORBA ORB (or
both), of a three-tier distributed object infrastructure.

Perhaps.

NETWORK-BASED SERVICES MODEL:

One of the strongest shifts toward distributed object computing is to break
apart the gobbledygook code of the traditional terminal-host application
into distinct network-reachable services providing specific content.  SQL is
an old example of a standard service, but object technologies have the
potential to create many kinds of 'smart' services with many kinds of
content.  For example, consider a ticker-tape stock market service over the
Internet.  All that is required is a thin client display device for the
remote objects.
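
As an illustration only, the ticker-tape service could be sketched in Java
RMI (standing in for a CORBA or DCOM object bus; every name below is
hypothetical):

    import java.rmi.Naming;
    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.server.UnicastRemoteObject;

    // The published contract: the only thing a thin client sees.
    interface StockTicker extends Remote {
        double lastPrice(String symbol) throws RemoteException;
    }

    // Host-side object; where it runs is invisible to clients.
    class TickerServer extends UnicastRemoteObject implements StockTicker {
        TickerServer() throws RemoteException { super(); }
        public double lastPrice(String symbol) throws RemoteException {
            return 42.0;  // canned quote; a real server would consult a feed
        }
        public static void main(String[] args) throws Exception {
            Naming.rebind("StockTicker", new TickerServer());
        }
    }

    // The thin client: a display device for the remote object.
    class TickerClient {
        public static void main(String[] args) throws Exception {
            StockTicker t = (StockTicker)
                Naming.lookup("rmi://quotes.example.com/StockTicker");
            System.out.println("HWP last: " + t.lastPrice("HWP"));
        }
    }

An rmiregistry must already be running on the server host.  The point is
that the thin client codes against the published interface, never against a
particular machine.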

But object technologies aren't worth a hill of beans if there is no
interface agreement among service providers.  They could end up just as
disjoint as any other technology, each vendor seeking its niche market.
Unfortunately, this seems to happen whenever no one can agree.  In the end,
the customer loses and pays for more complexity.

No matter how fouled up the market becomes with vendors looking after their
own bottom lines, the customer is not in this game.  Customers will find
ways to subvert these incompatible vendor activities and will try to
simplify their operations.  I believe that is why you will once again start
to see hype about mainframe computing as a possible oasis from the fray.
Collaboration among vendors, such as the IMAGE TPI agreements, can go a long
way toward satisfying customers.  More of this should be done, such as
common security and access controls.

So, yes, I believe there is a future for the 3000 if it can re-invent
itself.  Object technologies can offer the 3000 a future much like its
beginning with VPLUS and terminal-host programming.  Replace VPLUS with an
OO IDE and create a middleware object bus.  Perhaps the missing multi-media
and other functionality of IMAGE could be compensated for in the middleware.
The ideal situation is to re-invent the entire suite as an integrated
package.  Although the basic mode is transaction processing (just as a car's
basic mode is transportation), the technologies that achieve that result
shouldn't look anything like what came decades before.  Yes, cars are a good
analogy.
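
A rough sketch of 'compensating in the middleware', assuming a hypothetical
Java wrapper over an IMAGE dataset (the real intrinsics, DBOPEN, DBGET and
friends, are imagined behind it):

    // Hypothetical wrapper; the IMAGE intrinsics are assumed to sit
    // behind this interface on the data tier.
    interface ImageDataset {
        byte[] getRecord(String key);
    }

    // Middle-tier object that adds what IMAGE lacks, for example
    // serving multi-media content keyed to IMAGE records.
    class CatalogService {
        private final ImageDataset parts;

        CatalogService(ImageDataset parts) { this.parts = parts; }

        byte[] partRecord(String partNo) {
            return parts.getRecord(partNo);  // plain IMAGE data
        }

        byte[] partPhoto(String partNo) throws java.io.IOException {
            // IMAGE holds no photo BLOBs, so the middleware compensates:
            // the image lives in an ordinary file keyed by part number.
            java.io.File f = new java.io.File("/photos/" + partNo + ".jpg");
            byte[] buf = new byte[(int) f.length()];
            java.io.DataInputStream in = new java.io.DataInputStream(
                new java.io.FileInputStream(f));
            try { in.readFully(buf); } finally { in.close(); }
            return buf;
        }
    }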

If you don't think technologies can re-invent themselves, consider that
digital television is now mandated by the FCC and DVD is here, at a cost of
billions of dollars.  That makes the 3000's problems look pale in
comparison.

My ideas at times may be 'subversive',
but are never more of the same.

Eric Schubert
