" Moore's Law is ending " , well, not quite

Programmers as you define them are a dime a dozen. But good software designers are scarce.

True. Coding is only half the battle. Making it usable is a totally different thing.

I've seen this deteriorate a lot. We used to have users who would use a stopwatch, and complain if some process took a fraction of a second longer than it used to. And they were right. If you do the same thing 1,000 times a day, doing it even a little quicker adds up.

Fast forward to now. Software is spit out without any consideration for the efficiency of use. Want to process something? Sure. Click here. Dismiss this dialog. Drag there. Take your hand off the mouse and move over to the keyboard. Then back to the mouse. Repeat a dozen times. That's all it takes! Easy, huh?

In my assembly class we had one assignment that was all done in machine code. Just a bunch of DCs. That class shook out all the people who didn't get it.

I had to write a database program in assembly class. The whole thing, input, storage, retrieval, display. Made me realize how inefficient some of the database systems I'd work with later really were.

I knew a guy at another MegaCorp who told me his hiring strategy. He'd give the candidate a compiler print-out, and ask what the program did. He dismissed out of hand the clowns who looked only at the comment lines. The ones who looked at the source code interested him. But the ones who looked at the machine code over on the right-hand side, those were the space-men he wanted! He would have hired me, and it would have been fun to work for him, but the commute to his company would have been brutal.
 
A rare bird is a software coder who is also a good user interface designer. Some of the best user interface designers are not strong coders. Good teams and corporations recognize this.

I will say that I don't necessarily like the radical simplification of some UIs that have been all the rage since the iPod came out. The pendulum may have swung too far in one direction. But you have a point. The ones in the other direction can drive you crazy.
 
I think we're saying the same thing. The problem is dumbing down the program to get good reviews by beginners or journalists, rather than meeting the needs of the real users who are going to have to run the thing, day in and day out.

Something can be so simple to use that it does nothing. Sometimes, the job you're trying to accomplish is complicated, and overly simplified software only makes it harder.
 
What an honor! Are you able to post links to the interviews, or is that Intel internal use only?

Too bad he has Parkinson's, looks like the medication is slowing his speech a bit. Hope he stays strong for a long time yet.

-ERD50

It was fun. There wasn't much more to the interview because his son had asked to keep it short. As soon as it gets posted, I'll send you a link.
 
I think we're saying the same thing. The problem is dumbing down the program to get good reviews by beginners or journalists, rather than meeting the needs of the real users who are going to have to run the thing, day in and day out.

Something can be so simple to use that it does nothing. Sometimes, the job you're trying to accomplish is complicated, and overly simplified software only makes it harder.


Yes! We are talking about the same thing. Some of the phone apps expect you to know the magic gesture to get something to happen. Even then, you may be SOL. There is simple and clean, and then there is clean and dumb.


I was also thinking back a ways, to when it was the opposite. You'd have everything available, but it took five cascades of a pulldown menu to do anything. Your mouse slips, and you choose the wrong thing. That was bad too.
 
From what I understand, there are still a few generations 'assured' in the pipeline, out to 2025 or so. This includes new techniques like gate-all-around.

The shrinking game is as good as over after the current step, though. There's still lots to be done on economic efficiency (and yields), I believe.

On the other hand, if you look at the raw computing power available per $ (or watt), we are where we need to be pretty much for every single application there is, including simulating a complete human brain for a few hundred dollars. Strong statement, I know.

The problem is in the architectures (not just software). A properly designed for purpose chip can beat a generic one easily by a factor of 100. We just don't know what architectures are best yet or how to program for them.

Quantum is nowhere near practical use yet, and won't ever be useful but for a handful of problems.
 
Only know device physics and transistor design for dummies, but over the course of 30+ years in semiconductor process development, aka “wafer bitch”, I helped develop and/or implement various processes, from APCVD doped SiO2 using 100% silane and phosphine (flame deposition!) to ALD Hi-K metal gate using HfO2 and all sorts in between. Did single wafer plasma processing on “homemade” reactors before the AMAT P5000 came along. Was still doing litho with an old Karl Suss contact printer in a university lab when I retired (and still in use). In my early days, poly-si was 450nm, and 256k DRAM was high-tech. I remember a first 1meg DRAM celebration around ‘86. Near the end of my alleged career at Megacorp, copper, dual damascene, and dielectrics containing carbon and/or a nano-porous structure were being implemented on 140nm pitch interconnect.

Was fun, for a while!
 
..... I helped develop and/or implement various processes, from APCVD doped SiO2 using 100% silane and phosphine .......

Was fun, for a while!

Aren't those chemicals known as "One Step"? You know, get it on you or in your lungs, and you get one step away and then drop dead?
 

Heh...

Though silane is more noted for its pyrophoric qualities.

Toxic, explosive, corrosive, oxidizing, ionizing, you name it, we used it!

I have some stories...
 
I always thought the Programming 101 class should be Assembler.
We had a computer running WATFOR in our engineering lab. I would enter two instructions in octal and cause the computer to read cards and write to the printer. Everyone was impressed!
 
There's a corollary to Moore's Law, named after Digital Equipment's Gordon Bell, which says that every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Well, Honeywell (remember them?) has introduced their quantum computer:

https://www.honeywell.com/newsroom/... (Honeywell announces their quantum computer)

The success of the cloud enables this approach.

They are building a 480,000 sq. ft. production facility on the outskirts of Denver (on the former StorageTek property).
 
On the other hand, if you look at the raw computing power available per $ (or watt), we are where we need to be pretty much for every single application there is, including simulating a complete human brain for a few hundred dollars. Strong statement, I know.

The problem is in the architectures (not just software). A properly designed for purpose chip can beat a generic one easily by a factor of 100. We just don't know what architectures are best yet or how to program for them.
Yep, agree.

The market also weighs into this. The architectures are boxed into the paradigm of laptops, phones, servers, etc. And the architecture has to be backward compatible, sometimes to the dark ages. Just moving data around inside these chips is a challenge.

At Mega, we did some really interesting stuff with custom architectures. Problem is that the market is limited and producing such is very expensive. It helps if 1000s of other applications can be grafted onto the architecture, otherwise nobody wants to produce it.

That's the beauty of Moore's law. Vacuum tubes -> transistors -> semiconductors had millions of applications.
 
Ah the memories... (pun intended)


Just think, or cringe, at what computing could do if all these gigabytes and gigahertz were used as efficiently as the 16KB of memory on the lunar lander. With the plans for AI and surveillance systems, maybe all the bloatware is a good thing.


When I left Megacorp, our distributed storage system was being used by No Such Agency and cities like NYC to record every license plate image that rolled past their cameras and keep those images on-line (fast access, not archived to tape) for 5 years. I always felt like I needed to take a shower after helping make their systems work better.


Our CEO's response to the ethical questions raised: "If we don't build it, somebody else will".
 
Wow - guess he didn't know about how well that worked in Nuremberg a few years back.

Or a bit milder: if someone else does it, at least your name is not on it. Or even: if someone else builds it, they'll probably do a poor job (since we're so great, right? ;)).

At least he was upfront about his values.
 
At Mega, we did some really interesting stuff with custom architectures. Problem is that the market is limited and producing such is very expensive. It helps if 1000s of other applications can be grafted onto the architecture, otherwise nobody wants to produce it.

I'm reasonably hopeful that with the GPU as a platform there is new room to maneuver (CUDA), although the latest hardware release from NVIDIA was not so great.
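
For anyone who hasn't touched it, here's a minimal sketch of what "the GPU as a platform" looks like in practice. The kernel, names, and sizes are just placeholders for illustration, not anything from Mega or from NVIDIA's samples:

```
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Toy SAXPY kernel: each GPU thread handles one element (y = a*x + y).
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                 // 1M elements, arbitrary size
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers: note the explicit data movement on both sides of the kernel
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, dx, dy);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check it
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);          // expect 4.0 (2*1 + 2)

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Compile it with nvcc and it runs, but the telling part is that for something this trivial the copies to and from the card typically cost more than the kernel itself, which circles back to the earlier point that just moving the data around is the real challenge.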
 