Good bye

Les hlhowell at pacbell.net
Sat Feb 2 19:26:10 UTC 2008


On Sat, 2008-02-02 at 12:13 -0600, Les Mikesell wrote:
SNIP!
> I don't object so much to installing a new system on a new machine 
> because I normally keep my old ones running to cover anything that won't 
> work immediately.  Once everything is running correctly though, there is 
> no excuse for breaking it and it should not be necessary to reinstall an 
> operating system for the life of the hardware.
> 
> > I presume most of 
> > the major manufacturers are the same. It's Vista or bust. Now let's 
> > see: we had Windows 98, Windows NT, Windows 2000, Windows XP, and 
> > Windows Vista, all in the last decade. That's not counting the home 
> > versions versus the professional versions. Lots of these had 
> > incompatibilities.
> 
> You can find exceptions, but just about every third party program would 
> run across that set because the commonly used interfaces were 
> maintained.  And if you expect a 5-year useful life for hardware, most 
> of those lasted that span with security updates once MS recognized the 
> need for them.
> 
> > A chance ;-) XP SP2 hasn't fixed XP's problems, so why should Vista 
> > SP2 be expected to fix Vista's?
> 
> What problems do you still see in XP or 2000+?  My updated post-SP2 
> Windows machines are as stable/reliable as anything running Linux.  My 
> laptop sometimes gets into an odd state after many 
> standby/wake-up-on-a-different-wireless-network operations, but I 
> haven't been able to make that work at all under Linux for a 
> comparison.  I haven't closely tracked the size/number of updates, but 
> I'd guess that there is more update churn in even the CentOS 5.x 
> distro than in a pre-Vista Windows.  That's not a completely fair 
> comparison because of the additional apps in the Linux distros, but a 
> few years back I would have promoted Linux as the more stable choice.
> 
Of course the churning that is going on could possibly be related to the
advent of high-speed DVDs, Bluetooth, USB, USB 2.0, FireWire, and RAID,
all of which are still undergoing the kinds of development and
consolidation that IDE went through in the late '80s and '90s.

	Also, we are asking so much more of our systems, with 3D graphics,
multiple processors, threading, and processors loading in parallel and
switching streams on the fly.

	I worked in the IC test field.  ICs have gone from a 7-year cycle in
the early '80s to an 18-month cycle in 2005 when I retired.  I suspect
that, since two generations of devices have passed since then, the
cycle has dropped another 10-20%.  At the same time, processor, memory,
and system speeds have gone from about 30 MHz at the top end to over
8 GHz today.  Test programs up until 2002 were typically generated by
one individual coding like a madman, simultaneously designing the
hardware to run the devices and writing the code.  When I left in 2004,
a typical program took a team of 3 to 4 to develop, was required to run
4 devices in parallel for mixed-signal parts and two devices in
parallel for RF ASICs, and had to be developed in about 7 months,
including all correlation and other issues.  One man couldn't have done
it even if he were a wizard.  The devices changed as well, from maybe a
few thousand gates to SoCs with a full processor, memory, RF, IF,
audio, and peripheral drivers on board.  A cellphone was just about
three chips, I think, and today they are almost disposable, even with
cameras on board.

	All of this impacts OS software as well, as the advances require
processing more and more diverse data and stranger new applications
that no one could have seen coming (Second Life or Croquet, anyone?).
All of this churns the software and its interface requirements.  Think
about just video, for example: there are dozens of compression formats,
multiple forms of displays, and several algorithms to accomplish each
part of the process.  Add in stop, pause, play, fast forward, fast
reverse, and reverse playback, plus differing audio standards of 2, 3,
4, 5, 6, and more channels, along with audio decoding and encoding to
the correct reproduction standard, and it gets a bit complex just to
see a twenty-second clip of some politician selling the latest snake
oil.

	Someone said the only thing constant is the rate of change, which,
to those of us old enough, gives the perception that it is logarithmic.

	I know that this doesn't add much, but think about this.  My first
computer was one printed circuit board.  It had 2K of memory, an 8080
(I think), and four LED number displays with a hex keypad.  I had voice
output (duty-cycle operations on a single bit driving a speaker), and
tape storage using an audio cassette at about 8K baud that I wrote
myself.  It ran at a whopping 2 MHz, I think (I had some bits before
that, but they were mostly just experiments with soldering chips
together on protoboard).
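	For anyone who never played with those single-bit speakers: the
trick was simply toggling one output bit, where the toggle rate sets
the pitch and the on/off ratio (the duty cycle) sets the loudness.  A
rough modern sketch in C (not the original 8080 code, which is long
gone), writing unsigned 8-bit samples to stdout instead of an output
port, so you can pipe it through something like `aplay -r 8000 -f U8`
to hear it:

/* Sketch of single-bit "speaker" output: one bit, toggled in software.
 * The toggle rate sets the pitch; the duty cycle sets the volume.
 * Instead of an output port, samples go to stdout as unsigned 8-bit PCM. */
#include <stdio.h>

#define RATE 8000                         /* assumed sample rate, Hz */

/* Play a tone: frequency in Hz, duty cycle in percent, length in ms. */
static void tone(int freq, int duty, int ms)
{
    int period = RATE / freq;             /* samples per full cycle    */
    int high   = period * duty / 100;     /* samples with the bit "on" */
    int total  = RATE / 1000 * ms;

    for (int i = 0; i < total; i++) {
        int bit = (i % period) < high;    /* the lone speaker bit      */
        putchar(bit ? 0xFF : 0x00);
    }
}

int main(void)
{
    tone(440, 50, 300);                   /* full square wave: loudest */
    tone(440, 10, 300);                   /* same pitch, much quieter  */
    tone(220, 50, 300);                   /* an octave lower           */
    return 0;
}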

	I am writing this on a system with over 800 GB of storage, running
at 2 GHz with dual processors and 1 GB of memory, over a network that
runs at multiple gigabits at least part of the way, and all of you can
see it and read it all over the world in seconds if things go well.

That is the past 30 years.  What will full immersion and higher
technology bring in 30 more years?  Any guesses?  How will the
processors, operating systems, networking and hardware keep up with
that?  Will we be part of the next three decades of development?  

Along with that, I like this quote:

"Faced with the choice between changing one's mind and proving that
there is no need to do so, almost everyone gets busy on the proof."
                                                — John Kenneth Galbraith
I like to think I am on the changing-mind side of that quote...
 
Regards,
Les H



