" Moore's Law is ending " , well, not quite

The bottleneck has changed. When Moore stated his law, and for quite a while after that, memory capacity and processor speed were the bottlenecks. Every application was constrained by memory limitations and processing speed. Nowadays, software is the bottleneck in that (with some notable exceptions) most programs only require a small fraction of the memory and processing speed that's available. And if more of either is needed, it's dirt cheap.


I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.
 
I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.

It would also weigh a few tons, what with all the core memory and all. :)
 
I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.


When I first started working as a software engineer back in the early 70s, we had core memory that typically cost a buck a byte... In other words, 32 kilobytes of memory would go for $32 grand, and these were 1975 dollars! In today's dollars that would be about $150 grand. So my new laptop has 8GB of memory, and if my calculations are correct, the memory alone would cost $37.5 billion... Not quite half a trillion, but no chump change either.



My little desktop has 32 gig of memory, and I paid about $500 for that memory. That's about $150 billion worth of memory at 1975 prices.
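
Just for fun, here's a quick sanity check of that arithmetic in Python. The $1-per-byte price and the roughly 4.7x inflation factor implied by $32K then versus ~$150K now are taken from the posts above, so treat it as a back-of-the-envelope sketch, not a rigorous cost model:

```python
# Sanity check of the "buck a byte" figures above.
# Assumptions (from the posts, not authoritative): core memory at $1/byte in 1975,
# and the ~4.7x inflation implied by $32K (1975) -> ~$150K (today).

DOLLARS_PER_BYTE_1975 = 1.0
INFLATION_FACTOR = 150_000 / 32_000   # about 4.7x

def memory_cost_today(gigabytes):
    """Cost of this much memory at 1975 core prices, in today's dollars."""
    total_bytes = gigabytes * 1e9     # treating 1 GB as 10^9 bytes, as in the post
    return total_bytes * DOLLARS_PER_BYTE_1975 * INFLATION_FACTOR

print(f"8 GB laptop:   ${memory_cost_today(8) / 1e9:.1f} billion")   # ~$37.5 billion
print(f"32 GB desktop: ${memory_cost_today(32) / 1e9:.1f} billion")  # ~$150 billion
```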
 
No doubt about it, they are approaching the practical physical limits of transistors per unit area, but I don't think that's any big cause for concern; it's only one measure. And chips are so powerful now that this is not a bad place to be plateauing in that regard.

I think they can still improve power dissipation with better insulators to reduce leakage, and better thermal management. And they will continue to get better at making them within those limits, so costs should continue to come down. And with that experience, mid-range parts will be easier to make with the higher tech levels, so on average we will still see advancement, even if the very top end is stagnating somewhat. Kind of like the trickle-down effect we see in automobiles - what was once only on high end cars starts appearing on mid-range cars, and eventually a lot of it makes it to all levels.

And of course, software can be more efficient. And new networking techniques and other ideas will keep innovation flowing for a long time. I sometimes wonder if this isn't a good thing in some ways: it will force designers to come up with better ways of doing things, instead of the 'easy', brute-force "we need faster computers" approach.

Keeping with the car analogy, a piston engine is basically the same as it was 120 years ago, a piston with exploding gasoline, and gasoline is basically the same as it was. The laws of physics set limits on what that can do. But cars continue to advance at a pretty good pace. It's not all about the engine.

-ERD50
 
Hmm, in a paper funded by the government, researchers conclude that the government should fund more research. :facepalm:

Anyway, the problem with Moore's law is physics, not software. Moore's original 1965 observation was that transistor density doubles every year (he later revised it to every two years). But with single-digit-nanometer lithography (really, really tiny transistors), heat dissipation and electromigration are becoming insurmountable problems. OTOH, AMD, Nvidia, Samsung, and Intel are very smart and are working on other ways to increase computing power, with advances in chiplets, memory technologies, 3D stacking, and a general move away from single-processor solutions.

Here's Moore's paper http://www.monolithic3d.com/uploads/6/0/5/5/6055488/gordon_moore_1965_article.pdf
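
To put a rough number on that doubling: it's just exponential growth. A quick back-of-the-envelope sketch (the starting counts below are approximate illustrative figures, not data from the paper) shows how aggressive the original extrapolation was and why today's transistor counts suggest the pace has slowed:

```python
# Back-of-the-envelope projection of Moore's doubling claim.
# Starting counts are rough illustrative figures, not data from the paper.

def projected_transistors(start_count, start_year, year, years_per_doubling):
    """Component count projected forward assuming a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / years_per_doubling)

# Moore's 1965 extrapolation: roughly 64 components per chip, doubling every year.
print(f"{projected_transistors(64, 1965, 1975, 1.0):,.0f}")    # ~65,000 by 1975

# The commonly cited 1975 revision: doubling every two years from there.
print(f"{projected_transistors(65_536, 1975, 2019, 2.0):,.0f}")
# ~275 billion under ideal doubling -- well above the tens of billions
# on real 2019 chips, which is one way to see that the pace has slowed.
```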
 
And of course, software can be more efficient. And new networking techniques and other ideas will keep innovation flowing for a long time. I sometimes wonder if this isn't a good thing in some ways: it will force designers to come up with better ways of doing things, instead of the 'easy', brute-force "we need faster computers" approach.

-ERD50

Oh did you say a mouthful!
Back in the good old days some of us would count assembly instructions to accomplish a given task. My last deal around the performance of a Java application was enlightening. The development team said it was done, but the system required 5x the hardware of the C++ version. Some smart folks started decomposing what was occurring, and poor coding techniques turned out to be responsible for most of the additional system requirements.
 
Oh did you say a mouthful!
Back in the good old days some of us would count assembly instructions to accomplish a given task. My last deal around the performance of a Java application was enlightening. The development team said it was done, but the system required 5x the hardware of the C++ version. Some smart folks started decomposing what was occurring, and poor coding techniques turned out to be responsible for most of the additional system requirements.

I saw this sort of thing too. Developers always get the newest hardware, and tend to push it to its limits. No surprise it doesn't perform well when the customers try to run that stuff on older hardware.

And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.

I always thought the Programming 101 class should be Assembler.
 
I had an opportunity to interview Gordon Moore last Sept for an Intel Alumni event celebrating Intel's 50th anniversary.

He turns 90 on Jan 3rd and is suffering from Parkinson's. So he looked a bit frail, but the moment we turned on the cameras and started recording him, he perked up and gave a terrific introduction to our event. https://tinyurl.com/y9gj5mqw


I didn't ask about the law since I'm sure he is sick of it. I asked him about life after Intel and philanthropy. However, my coworker did. He did agree that it is slowing down, but he did think it would stick around for another 50 years.
 
There's a corollary to Moore's Law named after Digital Equipment's Gordon Bell, which says every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Government spending on faster integrated circuits would be about as productive as researching better buggy whips.
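
As an aside, that hundredfold-per-decade figure lines up almost exactly with the popular 18-month reading of Moore's Law (a rule of thumb, not Moore's own number), which a one-liner makes clear:

```python
# A hundredfold improvement per decade is roughly what a doubling
# every ~18 months works out to.
months_per_doubling = 18
print(2 ** (120 / months_per_doubling))   # ~101x per decade
```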
 
There's a corollary to Moore's Law named after Digital Equipment's Gordon Bell, which says every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Government spending on faster integrated circuits would be about as productive as researching better buggy whips.


That's not completely true. Government spending on Sematech in the 1980s and early '90s did help US semiconductor manufacturers regain competitiveness with the Japanese (before losing it to the South Koreans this century). Reasonable people can disagree about its overall effectiveness, but it wasn't completely useless.

I met Gordon Bell a couple of times at the Boston Computer Museum. Didn't realize there was a law named after him. Two computer giants named Gordon... weird.
 
The slowing down of Moore's Law is forcing new types of innovation.

For example, if general purpose CPUs aren't getting faster as quickly as they used to, designers are designing other types of processing units.

We've seen a real revolution in GPUs over the last decade or so. We're starting to see machine learning processors start to do something similar.

An example of this is how a modern iPhone contains CPUs, GPUs, and Apple's Neural Engine. This is a better way to spend the transistor budget than just on CPUs.
 
I had an opportunity to interview Gordon Moore last Sept for an Intel Alumni event celebrating Intel's 50th anniversary.

He turns 90 on Jan 3rd and is suffering from Parkinson's. So he looked a bit frail, but the moment we turned on the cameras and started recording him, he perked up and gave a terrific introduction to our event. https://tinyurl.com/y9gj5mqw


I didn't ask about the law since I'm sure he is sick of it. I asked him about life after Intel and philanthropy. However, my coworker did. He did agree that it is slowing down, but he did think it would stick around for another 50 years.

What an honor! Are you able to post links to the interviews, or is that Intel internal use only?

Too bad he has Parkinson's; it looks like the medication is slowing his speech a bit. Hope he stays strong for a long time yet.

-ERD50
 
The slowing down of Moore's Law is forcing new types of innovation.

For example, if general purpose CPUs aren't getting faster as quickly as they used to, designers are designing other types of processing units.

We've seen a real revolution in GPUs over the last decade or so. We're starting to see machine learning processors start to do something similar.

An example of this is how a modern iPhone contains CPUs, GPUs, and Apple's Neural Engine. This is a better way to spend the transistor budget than just on CPUs.

Back around 2005, when Steve Jobs was frustrated that Motorola/Freescale/IBM wasn't keeping the PPC up with Intel, there was a lot of speculation about what they would do. I was thinking that maybe they had some new whiz-bang software/firmware to offload tons of stuff to a GPU, and blow everyone out of the water w/o needing to focus on the CPU so much. Good thing I didn't have any money on that bet! :LOL:

But it seemed like a reasonable thing to speculate on. I haven't kept up with the tech I guess, I'll need to go read up on that Neural Engine, and hope I can understand at least some of it.

-ERD50
 
And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.


+100
 
...I was thinking that maybe they had some new whiz-bang software/firmware to offload tons of stuff to a GPU, and blow everyone out of the water w/o needing to focus on the CPU so much. Good thing I didn't have any money on that bet! :LOL:

I had a TI99 "home computer" back when that was a thing. It had a separate graphics chip, what we'd now call a GPU. It had a ton of other benefits over the Intel chips of the day, like the 8080 and 8088.

Beyond offloading tedious graphics chores to the GPU, it had a tiny bit of internal RAM into which you could load whole programs and blow the doors off any Intel box on the simplistic benchmarks that were in use at the time. A cool feature was "soft registers." Rather than having to do a series of PUSH and POP instructions every time your code branched to another routine, you just did one context-switch instruction. When you switched back, all the register values were as you left them.

Despite being far advanced technology-wise, TI lost the chip wars to Intel, and we all had to go back to using more primitive coding techniques.
 
I always thought the Programming 101 class should be Assembler.


Or better yet, machine code. We had an old PDP-8 computer at my college, and our assignment was to write a program on paper in machine code, then enter it using the switches on the front of the panel. Then run it, with the output being the lights on the front of the panel. I learned more doing that assignment than one could ever imagine.
 
And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.


Programmers as you define them are a dime a dozen. But good software designers are scarce. I worked with a lot of people who knew all the ins and outs of the programming language and were very good at implementing something when given a detailed design document. But if those same people were told "create a system that does X", you'd be met with a blank look.
 
write a program on paper in machine code, then enter it using the switches on the front of the panel. Then run it, with the output being the lights on the front of the panel.

Great memory. That's how we had to boot the Univac 1050-II.
 
Or better yet, machine code. We had an old PDP-8 computer at my college, and our assignment was to write a program on paper in machine code, then enter it using the switches on the front of the panel. Then run it, with the output being the lights on the front of the panel. I learned more doing that assignment than one could ever imagine.

In my assembly class we had one assignment that was all done in machine code. Just a bunch of DCs. That class shook out all the people who didn't get it.

I was an assembly programmer and we would frequently patch modules in machine code. The folks I worked with all had the same skillset. Having the source code was a luxury. There was a time when Megacorp couldn't reassemble the production source and have it generate the production binaries. A few of us were put in a corner until we could recreate it.
 
But let's face it. We're not going to turn back the clock and have people flipping switches at a console to enter programs. (Yes, I did that too.) What we need is good education about what happens.

One very good thing my Megacorp did in recent years was open up internal tech education to engineers after a freeze of a few years. It was a good investment. We were encouraged to take a Python class. The instructor they hired was excellent.

He created a 3-line program that did a boatload of stuff. You can do that in Python easily by "dotting on" functions to operations in a form of object-based programming. So he said, looks efficient, right? Nope. We then examined the bytecode, then compiled the bytecode (instead of interpreting it) and looked at the ultimate disassembly. What a monster! A lot of the younger set in the class were amazed and really took the lesson to heart.
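
For anyone who wants to try the same exercise, the standard library's dis module is all it takes. Here's a toy example in that spirit (my own hypothetical code, not the instructor's), where one chained expression expands into a surprisingly long disassembly:

```python
import dis

# A short "dotted-on" one-liner in the spirit of the class example
# (hypothetical code, not the instructor's).
def summarize(words):
    return ", ".join(sorted({w.strip().lower() for w in words if w}))

# The source reads as one line; the disassembly runs to dozens of instructions.
dis.dis(summarize)
```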

So they just became aware of what they were actually doing. What I think is happening is that this kind of education mostly isn't occurring, and programs are just bloating like mad because the foundational concepts are not taught.
 
Data entry in "Data Processing 101"

Had to wrangle with one of these in J.C., then hand a stack of cards to the 360 operator.

https://en.wikipedia.org/wiki/Keypunch

Quickly deduced I wasn't cut out to be a COBOL or Assembler programmer.

Briefly worked for a now-defunct computer company where machine language programming was part of the job. Washed out quickly. Mind-numbing work for me. Some folks liked it.
 
This kind of reminds me of driving cars.

Model T: manual gear selection, manual timing adjustment, hand throttle.

'20s: manual gear selection with rev matching, foot throttle. Automatic timing! Brakes and steering manual. Choke start.

'40s: synchromesh gear shifting! Brakes and steering manual. Choke start.

'50s: automatic gear shifting! High-end cars have power steering and brakes. Automatic choke.

'70s: only the cheap cars have manual tranny, steering, brakes, choke.

'90s: convenience features added, all else automatic.

'10s: auto lane control, braking, etc.

We're not going back. One could argue that if you got rid of all that "crap", the car would weigh 1,000 lbs less and have less environmental impact. Not going to happen.
 