Computers -- Not PCs vs. Macs

John Tuttle

The thread on PC vs Mac reminds me that arguing over the merits of these two systems is a little bit like finding fault with a talking dog because it uses incorrect grammar. Either of these machines is vastly more powerful and easier to use than the computers large research universities ran 30 years ago. Here is a small personal example.

The Postage Stamp Problem.
An envelope may carry no more than h stamps, and one has available k integer-valued stamp denominations. Given h and k, find the maximal integer n = n(h,k) such that all integer postage values from 1 to n can be made up. For example, n(3,6) = 52, and the two solutions are (1,4,6,14,17,29) and (1,3,7,9,19,24). This means that every integer from 1 to 52 can be written as a sum of 1, 2, or 3 of the integers in (1,4,6,14,17,29), with repetitions allowed (and likewise for the other set). Moreover, no choice of six denominations covers a longer initial run. Also, n(3,7) = 70, with a single solution (1,4,5,15,18,27,34).

http://en.wikipedia.org/wiki/Postage_stamp_problem
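
For anyone who wants to play with it, here is a minimal C++ sketch (my own, not any of the versions mentioned below) that checks a candidate set of denominations: a fewest-stamps table gives the longest run 1..n coverable with at most h stamps, and the solutions above come out as expected. The function name reach() and the overall structure are assumptions for illustration only.

#include <algorithm>
#include <iostream>
#include <vector>

// Longest run 1..n coverable with at most h stamps drawn (with repetition)
// from the given denominations.
int reach(int h, const std::vector<int>& denoms) {
    int maxVal = h * *std::max_element(denoms.begin(), denoms.end());
    std::vector<int> need(maxVal + 1, h + 1);   // fewest stamps to make each value
    need[0] = 0;
    for (int v = 1; v <= maxVal; ++v)
        for (int d : denoms)
            if (d <= v) need[v] = std::min(need[v], need[v - d] + 1);
    int n = 0;
    while (n < maxVal && need[n + 1] <= h) ++n; // stop at the first gap
    return n;
}

int main() {
    std::cout << reach(3, {1, 4, 6, 14, 17, 29}) << "\n";     // expect 52
    std::cout << reach(3, {1, 3, 7, 9, 19, 24}) << "\n";      // expect 52
    std::cout << reach(3, {1, 4, 5, 15, 18, 27, 34}) << "\n"; // expect 70
}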

This problem goes back at least to 1937. I first heard about it in 1974 and wrote a program to solve it. Over the years I have implemented the algorithm in FORTRAN, IBM 370 Assembler, Pascal, C, FORTH and C++. Some results for n(3,7) are summarized in the following table:


Year  Computer/Chip         Language      Time (seconds)
1974  IBM 370/158           FORTRAN       45
1974  IBM 370/158           Assembler     15
1984  Apple ][ (6502)       FORTH         21,600
1990  Mac SE (8 MHz 68000)  Think Pascal  276
1990  16 MHz 386SX          Turbo C++     53
1994  90 MHz Pentium        Visual C++    1.5
1998  400 MHz Pentium II    Visual C++    0.25

The IBM 370/158 was a large mainframe and the main academic computer for the University of Illinois at Chicago. It filled a large room, RENTED for $50,000 a month, and an IBM systems engineer had his own on-site office.

The algorithm is a good test of raw CPU speed: it is a branch-and-bound search consisting mostly of simple integer arithmetic and array subscripting, and it runs entirely in memory.
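
To make that concrete, here is a rough sketch of the search in C++ (not the original program, and without the pruning bound a real branch-and-bound version would use, so it is slower but gives the same answers for small h and k). Denominations are extended one at a time, and the next denomination can never exceed the current reach plus one, or the run 1..n would break. The reach() helper is the same fewest-stamps routine as in the sketch above.

#include <algorithm>
#include <iostream>
#include <vector>

// Longest run 1..n coverable with at most h stamps (as in the earlier sketch).
int reach(int h, const std::vector<int>& denoms) {
    int maxVal = h * *std::max_element(denoms.begin(), denoms.end());
    std::vector<int> need(maxVal + 1, h + 1);
    need[0] = 0;
    for (int v = 1; v <= maxVal; ++v)
        for (int d : denoms)
            if (d <= v) need[v] = std::min(need[v], need[v - d] + 1);
    int n = 0;
    while (n < maxVal && need[n + 1] <= h) ++n;
    return n;
}

// Depth-first search over increasing denomination sets starting from {1}.
void search(int h, int k, std::vector<int>& denoms,
            int& best, std::vector<int>& bestSet) {
    int r = reach(h, denoms);
    if ((int)denoms.size() == k) {
        if (r > best) { best = r; bestSet = denoms; }
        return;
    }
    // The next denomination must stay within reach + 1, or 1..n would have a gap.
    for (int d = denoms.back() + 1; d <= r + 1; ++d) {
        denoms.push_back(d);
        search(h, k, denoms, best, bestSet);
        denoms.pop_back();
    }
}

int main() {
    int h = 3, k = 6, best = 0;
    std::vector<int> denoms{1}, bestSet;
    search(h, k, denoms, best, bestSet);
    std::cout << "n(" << h << "," << k << ") = " << best << "\n";  // expect 52
    for (int d : bestSet) std::cout << d << " ";                   // one maximal set
    std::cout << "\n";
}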

The running time grows exponentially with the number of denominations. For kicks I tried n(3,8) and n(3,9) on my 2.8 GHz dual-core Pentium, with times of 0.55 seconds and 12.4 seconds.
 
You did not write an assembler version for the Apple 6502? Lazy! :)

21,600 seconds? You could have done it in 6 hours, while waiting for the result.

It would be interesting to look at the cost per instruction executed per second over the years. It's been an interesting ride, with lots more to come.
 
Sam said:
It's been an interesting ride, with lots more to come.

The trend now is multicore rather than trying to make the individual cores faster. So, you'll probably have to rewrite the program again to make it multithreaded if you want to see future performance improvements.
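
As a very rough illustration of what that might look like (a sketch only, building on the search() and reach() helpers from the earlier sketches, declared here but not repeated): each choice of the second denomination gets its own task, and the best result wins.

#include <algorithm>
#include <future>
#include <vector>

int reach(int h, const std::vector<int>& denoms);                  // from the earlier sketch
void search(int h, int k, std::vector<int>& denoms,
            int& best, std::vector<int>& bestSet);                 // from the earlier sketch

int parallelBest(int h, int k) {
    int top = reach(h, {1});   // legal choices for the 2nd denomination: 2 .. top + 1
    std::vector<std::future<int>> tasks;
    for (int d = 2; d <= top + 1; ++d)
        tasks.push_back(std::async(std::launch::async, [=] {
            std::vector<int> denoms{1, d}, bestSet;
            int best = 0;
            search(h, k, denoms, best, bestSet);   // each task explores its own subtree
            return best;
        }));
    int best = 0;
    for (auto& t : tasks) best = std::max(best, t.get());
    return best;
}

The subtrees are independent, so no locking is needed; the only shared work is taking the maximum at the end.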
 
Sam said:
You did not write an assembler version for the Apple 6502? Lazy! :)

21,600 seconds? You could have done it in 6 hours, while waiting for the result.

In 1984 it was pretty heady stuff to have a computer on your desk that could solve a problem like that. I did know 6502 assembler at one time, but you're right, I was too lazy. I said the language was FORTH, but it was actually my Pascal version, running on a Pascal implementation I had written for the Apple ][ in FORTH. If you know how FORTH works, that essentially makes it written in FORTH.

wab said:
The trend now is multicore rather than trying to make the individual cores faster. So, you'll probably have to rewrite the program again to make it multithreaded if you want to see future performance improvements.

I'm too lazy to multi-thread it, or at least I have more interesting things to do.
 
I learned 6502 assembly on the Apple in 1984 too. At the time, I worked for Hayden Software. One of the company's products was the Macro Assembler. I think I still have its manual. Pascal, yes. FORTH, no.
 
Didn't Hayden publish a couple of chess programs? Microchess and Sargon? I loved Sargon.
 
Yes, Sargon. I loved it too. It also published a popular word processor (can't remember the name).
 
John Tuttle said:
The thread on PC vs Mac reminds me that arguing over the merits of these two systems is a little bit like finding fault with a talking dog because it uses incorrect grammar.

My dog just said, "Bite me."
 
Some of my first endeavors were using PAL-III on a PDP-8, then I built and coded in assembler on one of these:

http://www.jitterjunction.com/firstcomputer.htm

Then I got my hands on an Altair and a PDP-11/40.

Compared with the days when you had to write your own basic I/O routines, keep most of your own "OS" routines in a 'toolbag', and enter a lot of code with toggle switches and push buttons... things have certainly come a long, long way.

I'm also not sure those dang things actually ever were programmed to do anything useful. Just making them blink some lights in a particular series was pretty tough coding ;)
 
The Postage Stamp Problem.
An envelope may carry no more than h stamps, and one has available k integer-valued stamp denominations. Given h and k, find...

What, are you with the Italian Post Office?
(which BTW I'm sure ran better in 1937 than it does now with computers.. :( )

I'm unsure what this has to do with real life -- I'd always understood that computers were optimized to perform certain kinds of calculations.. moon launch being different from chess-playing being different from video-streaming being different from 3-D graphics, etc.

Now THIS is funky!
http://www.newscientisttech.com/article/mg18925405.700.html
----
A quantum computer program has produced an answer without actually running.

The idea behind the feat, first proposed in 1998, is to put a quantum computer into a “superposition”, a state in which it is both running and not running. ...

With the right set-up, the theory suggested, one would sometimes get an answer out of the computer even though the program did not run. And now researchers from the University of Illinois at Urbana-Champaign have improved on the original design and built a non-running quantum computer that really works.

They send a photon into a system of mirrors and other optical devices, which included a set of components that run a simple database search by changing the properties of the photon.

The new design includes a quantum trick called the Zeno effect. Repeated measurements stop the photon from entering the actual program, but allow its quantum nature to flirt with the program's components - so it can become gradually altered even though it never actually passes through.

"It is very bizarre that you know your computer has not run but you also know what the answer is," says team member Onur Hosten.

This scheme could have an advantage over straightforward quantum computing. "A non-running computer produces fewer errors," says Hosten.

----

:LOL: :LOL: :LOL:
Can't argue with the guy there!

More:
http://www.physorg.com/news11087.html


Yes, we have come a long way!
First I routinely used this:
[image]

Then this:
[image]

Then this:
[image]

Then this (which I still have and use on rare occasions; still works after 25 years! Have changed batteries 1x.):
[image]

This:
[image]

And this:
[image]
.. well, even some of the "youngsters" here can imagine the rest. And this walk down 'memory' lane was over the course of what? 10 years?.. not even. More like 6 or 8.
 