" Moore's Law is ending " , well, not quite
01-02-2019, 11:54 AM   #1
Lakewood90712

Maybe just slowing down a bit.

Carnegie Mellon U. link to an article on future chip advances:

https://www.cit.cmu.edu/news-events/...ectronics.html

01-02-2019, 12:22 PM   #2
Which Roger
The bottleneck has changed. When Moore stated his law, and for quite a while after that, memory capacity and processor speed were the bottlenecks. Every application was constrained by memory limitations and processing speed. Nowadays, software is the bottleneck in that (with some notable exceptions) most programs only require a small fraction of the memory and processing speed that's available. And if more of either is needed, it's dirt cheap.


I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.
01-02-2019, 01:04 PM   #3
JoeWras

Quote:
Originally Posted by Which Roger
I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.
It would also weigh a few tons, what with all the core memory and all.
01-02-2019, 01:05 PM   #4
Cut-Throat

Quote:
Originally Posted by Which Roger
I recently read that if each byte of memory cost the same today as it did 50 years ago, a typical laptop would cost half-a-trillion dollars.

When I first started working as a software engineer back in the early 70s, we had core memory that typically cost a buck a byte... In other words, 32 kilobytes of memory would go for $32 grand... And these were 1975 dollars! In today's dollars that would be about $150 grand. So my new laptop has 8GB of memory, and if my calculations are correct, the memory alone would cost $37.5 billion... Not quite half a trillion, but no chump change either.



My small desktop has 32 gig of memory, and I paid about $500 for that memory. That's about $150 billion (in today's dollars) worth of memory at 1975 prices.
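
If anyone wants to check the math, here it is in Python. A quick sketch: the $1/byte 1975 price and the $32k-to-$150k inflation factor come from this post, and a GB is treated as a round billion bytes, which is what the $37.5 billion figure implies.

Code:
# Back-of-the-envelope check of the "buck a byte" math above (a sketch;
# the $1/byte 1975 price and the implied inflation factor are taken
# from the post, not independently sourced).
PRICE_PER_BYTE_1975 = 1.00                 # dollars per byte
INFLATION_1975_TO_NOW = 150_000 / 32_000   # ~4.69x, implied by $32k -> $150k

def memory_cost_today(gigabytes: float) -> float:
    """Today's-dollar cost of `gigabytes` of RAM at 1975 core-memory prices."""
    bytes_total = gigabytes * 10**9        # decimal GB, as the post assumes
    return bytes_total * PRICE_PER_BYTE_1975 * INFLATION_1975_TO_NOW

print(f"8 GB laptop:   ${memory_cost_today(8):,.0f}")    # ~$37.5 billion
print(f"32 GB desktop: ${memory_cost_today(32):,.0f}")   # ~$150 billion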
01-02-2019, 01:05 PM   #5
ERD50
No doubt about it, they are approaching the practical physical limits of transistors per area, but I don't think that's any big cause for concern; it's only one measure. And chips are so powerful now that this is not a bad place to be plateauing in that regard.

I think they can still improve power dissipation with better insulators to reduce leakage, and better thermal management. And they will continue to get better at making them within those limits, so costs should continue to come down. And with that experience, mid-range parts will be easier to make with the higher tech levels, so on average we will still see advancement, even if the very top end is stagnating somewhat. Kind of like the trickle-down effect we see in automobiles - what was once only on high end cars starts appearing on mid-range cars, and eventually a lot of it makes it to all levels.

And of course, software can be more efficient. And new networking techniques and other ideas will keep innovation flowing for a long time. I sometimes wonder if this isn't a good thing in some ways - it will force designers to come up with better ways of doing things, instead of the 'easy', brute-force "we need faster computers" approach.

Keeping with the car analogy, a piston engine is basically the same as it was 120 years ago (a piston driven by exploding gasoline), and gasoline is basically the same as it was. The laws of physics set limits on what that can do. But cars continue to advance at a pretty good pace. It's not all about the engine.

-ERD50
01-02-2019, 01:16 PM   #6
USGrant1962
Hmm, in a paper funded by the government, researchers conclude that the government should fund more research.

Anyway, the problem with Moore's law is physics, not software. Moore's law states that transistor density doubles every year (Moore himself later revised that to every two years). But with single-digit-nanometer lithography (really, really tiny transistors), heat transfer and electromigration are becoming insurmountable obstacles. OTOH, AMD, Nvidia, Samsung, and Intel are very smart and are working on other ways to increase computing power, with advances in chiplets, memory technologies, 3D stacking, and a general move away from single-processor solutions.

Here's Moore's original 1965 paper: http://www.monolithic3d.com/uploads/...65_article.pdf
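
For concreteness, the law is just an exponential: n(t) = n0 * 2^((t - t0)/T). A minimal Python sketch; the Intel 4004's roughly 2,300 transistors (1971) are used as an illustrative starting point, not a figure from the paper or this thread.

Code:
# Minimal sketch of the doubling law: n(t) = n0 * 2**((t - t0) / T),
# with T = 1 year in Moore's original 1965 paper and T = 2 after his
# 1975 revision.
def projected_transistors(n0: float, t0: int, t: int, doubling_years: float) -> float:
    return n0 * 2 ** ((t - t0) / doubling_years)

# Illustrative only: ~2,300 transistors on the 1971 Intel 4004.
for year in (1981, 1991, 2001, 2011):
    print(year, round(projected_transistors(2300, 1971, year, doubling_years=2.0)))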
__________________
FI and Semi-ER March 24, 2017
Consulting to stay engaged

"All models are wrong, some are useful." - George Box
"There is always a well-known solution to every human problem: neat, plausible, and wrong." - H.L. Mencken
01-02-2019, 02:08 PM   #7
MRG

Quote:
Originally Posted by ERD50

And of course, software can be more efficient. And new networking techniques and other ideas will keep innovation flowing for a long time. I sometimes wonder if this isn't a good thing in some ways - it will force designers to come up with better ways of doing things, instead of the 'easy', brute-force "we need faster computers" approach.

-ERD50
Oh, did you say a mouthful!
Back in the good old days some of us would count assembly instructions to accomplish a given task. My last deal around the performance of a Java application was enlightening. The development team said it was done, but the system required 5x the hardware of the equivalent C++ application. Some smart folks started decomposing what was occurring; poor coding techniques were responsible for most of the additional system requirements.
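
As a toy illustration of the kind of thing those decompositions turn up (hypothetical; the post doesn't say what the actual Java code did wrong), the same lookup done two ways can differ by orders of magnitude in machine cost:

Code:
# A classic poor-coding-technique example: an O(n) list scan per lookup
# vs. a one-time set build with O(1) lookups. Same answers, very
# different hardware requirements at scale.
import timeit

items = list(range(10_000))
queries = list(range(0, 20_000, 2))

def slow_lookup():
    return [q in items for q in queries]      # list membership: O(n) each

def fast_lookup():
    member = set(items)                       # build once: O(n)
    return [q in member for q in queries]     # then O(1) each

print("list scan:", timeit.timeit(slow_lookup, number=3))
print("set probe:", timeit.timeit(fast_lookup, number=3))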
01-02-2019, 04:18 PM   #8
CaptTom

Quote:
Originally Posted by MRG
Oh, did you say a mouthful!
Back in the good old days some of us would count assembly instructions to accomplish a given task. My last deal around the performance of a Java application was enlightening. The development team said it was done, but the system required 5x the hardware of the equivalent C++ application. Some smart folks started decomposing what was occurring; poor coding techniques were responsible for most of the additional system requirements.
I saw this sort of thing too. Developers always get the newest hardware, and tend to push it to its limits. No surprise it doesn't perform well when the customers try to run that stuff on older hardware.

And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.

I always thought the Programming 101 class should be Assembler.
01-02-2019, 04:41 PM   #9
clifp
I had an opportunity to interview Gordon Moore last Sept for an Intel Alumni event celebrating Intel's 50th anniversary.

He turns 90 on Jan 3rd and is suffering from Parkinson's, so he looked a bit frail, but the moment we turned on the cameras and started recording he perked up and gave a terrific introduction to our event. https://tinyurl.com/y9gj5mqw


I didn't ask about the law since I'm sure he is sick of it; I asked him about life after Intel and philanthropy. However, my coworker did ask. He did agree that it is slowing down, but he thought it would stick around for another 50 years.
01-02-2019, 05:25 PM   #10
gcgang
There's a corollary to Moore's Law named after Digital Equipment's Gordon Bell, which says every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Government spending on faster integrated circuits would be about as productive as researching better buggy whips.
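
For scale, Bell's hundredfold-per-decade cadence lines up with Moore-style doubling compounded over ten years. A quick check, assuming price/performance doubles roughly every 18 months (a common paraphrase, not a figure from this thread):

Code:
# Doubling every ~1.5 years compounds to roughly 100x per decade,
# the drop in price that Bell's law keys off.
print(2 ** (10 / 1.5))   # ~101.6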
__________________
You know that suit they burying you in? Thar ain’t no pockets in that suit, boy.
01-02-2019, 05:37 PM   #11
clifp

Quote:
Originally Posted by gcgang
There's a corollary to Moore's Law named after Digital Equipment's Gordon Bell, which says every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Government spending on faster integrated circuits would be about as productive as researching better buggy whips.

That's not completely true. Government spending on SEMATECH in the 1980s and early '90s did help US semiconductor manufacturers regain competitiveness with the Japanese (before losing it to the South Koreans this century). Reasonable people can disagree about its overall effectiveness, but it wasn't completely useless.

I met Gordon Bell a couple of times at the Boston Computer Museum. I didn't realize there was a law named after him. Two computer giants named Gordon... weird.
01-02-2019, 05:46 PM   #12
mpeirce
The slowing down of Moore's Law is forcing new types of innovation.

For example, if general-purpose CPUs aren't getting faster as quickly as they used to, designers are building other types of processing units.

We've seen a real revolution in GPUs over the last decade or so. We're starting to see machine-learning processors do something similar.

An example of this is how a modern iPhone contains CPUs, GPUs, and Apple's Neural Engine. This is a better way to spend the transistor budget than just on CPUs.
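
A toy sketch of that idea: route each kind of task to the unit best suited for it. The unit names and task kinds here are illustrative, not Apple's actual scheduler or APIs.

Code:
# Toy model of heterogeneous dispatch: spend the "transistor budget" on
# specialized units and send each task where it runs best.
from dataclasses import dataclass

@dataclass
class Unit:
    name: str
    suited_for: set

UNITS = [
    Unit("CPU", {"control-flow", "branchy"}),
    Unit("GPU", {"data-parallel", "graphics"}),
    Unit("NPU", {"ml-inference"}),   # e.g., something like a Neural Engine
]

def dispatch(task_kind: str) -> str:
    for unit in UNITS:
        if task_kind in unit.suited_for:
            return unit.name
    return "CPU"                     # general-purpose fallback

for kind in ("branchy", "graphics", "ml-inference"):
    print(f"{kind:12s} -> {dispatch(kind)}")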
01-02-2019, 07:39 PM   #13
ERD50

Quote:
Originally Posted by clifp
I had an opportunity to interview Gordon Moore last Sept for an Intel Alumni event celebrating Intel's 50th anniversary.

He turns 90 on Jan 3rd and is suffering from Parkinson's, so he looked a bit frail, but the moment we turned on the cameras and started recording he perked up and gave a terrific introduction to our event. https://tinyurl.com/y9gj5mqw

I didn't ask about the law since I'm sure he is sick of it; I asked him about life after Intel and philanthropy. However, my coworker did ask. He did agree that it is slowing down, but he thought it would stick around for another 50 years.
What an honor! Are you able to post links to the interviews, or is that Intel internal use only?

Too bad he has Parkinson's; it looks like the medication is slowing his speech a bit. Hope he stays strong for a long time yet.

-ERD50
01-02-2019, 07:44 PM   #14
ERD50

Quote:
Originally Posted by mpeirce
The slowing down of Moore's Law is forcing new types of innovation.

For example, if general-purpose CPUs aren't getting faster as quickly as they used to, designers are building other types of processing units.

We've seen a real revolution in GPUs over the last decade or so. We're starting to see machine-learning processors do something similar.

An example of this is how a modern iPhone contains CPUs, GPUs, and Apple's Neural Engine. This is a better way to spend the transistor budget than just on CPUs.
Back around 2005, when Steve Jobs was frustrated that Motorola/Freescale/IBM weren't keeping the PPC up with Intel, there was a lot of speculation about what Apple would do. I was thinking that maybe they had some new whiz-bang software/firmware to offload tons of stuff to a GPU, and blow everyone out of the water w/o needing to focus on the CPU so much. Good thing I didn't have any money on that bet!

But it seemed like a reasonable thing to speculate on. I haven't kept up with the tech, I guess; I'll need to go read up on that Neural Engine and hope I can understand at least some of it.

-ERD50
01-02-2019, 09:01 PM   #15
RetiredAt55.5

Quote:
Originally Posted by CaptTom
And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.

+100
01-03-2019, 07:53 AM   #16
CaptTom

Quote:
Originally Posted by ERD50
...I was thinking that maybe they had some new whiz-bang software/firmware to offload tons of stuff to a GPU, and blow everyone out of the water w/o needing to focus on the CPU so much. Good thing I didn't have any money on that bet!
I had a TI99 "home computer" back when that was a thing. It had a separate graphics chip, what we'd now call a GPU. It had a ton of other benefits over the Intel chips of the day, like the 8080 and 8088.

Beyond offloading tedious graphics chores to the GPU, it had a tiny bit of internal RAM into which you could load whole programs and blow the doors off any Intel box on the simplistic benchmarks that were in use at the time. A cool feature was "soft registers." Rather than having to do a series of PUSH and POP commands every time your code branched to another routine, you just did one context-switch command. When you switched back, all the register values were as you left them.

Despite being far advanced technology-wise, TI lost the chip wars to Intel, and we all had to go back to using more primitive coding techniques.
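
To make the "soft registers" idea concrete: the TI's CPU kept its registers in RAM, so a context switch just repointed a workspace pointer instead of pushing and popping each register. A loose Python model of the concept (illustrative only, not real TMS9900 semantics):

Code:
# "Soft registers": registers live in RAM, and a context switch is just
# repointing the workspace pointer. Caller registers are never copied.
memory = [0] * 256       # pretend RAM
wp = 0                   # workspace pointer: "registers" are memory[wp:wp+16]

def get_reg(n):
    return memory[wp + n]

def set_reg(n, value):
    memory[wp + n] = value

def call_with_workspace(new_wp, routine):
    """One 'context switch': save WP, repoint, run, restore."""
    global wp
    old_wp = wp
    wp = new_wp
    routine()
    wp = old_wp          # caller's registers are exactly as they were left

set_reg(0, 42)                                   # caller's R0
call_with_workspace(16, lambda: set_reg(0, 7))   # callee gets its own R0
print(get_reg(0))                                # 42: caller's R0 untouched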
01-03-2019, 08:38 AM   #17
Which Roger

Quote:
Originally Posted by CaptTom
I always thought the Programming 101 class should be Assembler.

Or better yet, machine code. We had an old PDP-8 at my college, and our assignment was to write a program on paper in machine code, then enter it using the switches on the front panel. Then we ran it, with the output being the lights on the panel. I learned more doing that assignment than one could ever imagine.
01-03-2019, 08:43 AM   #18
Which Roger

Quote:
Originally Posted by CaptTom
And I truly believe most "programmers" today don't really code. They drag, they drop, they putter with some high-level languages, but don't really have a clue about how their code actually works. Many don't really understand how any of the hardware works.

Programmers as you define them are a dime a dozen, but good software designers are scarce. I worked with a lot of people who knew all the ins and outs of the programming language and were very good at implementing something when given a detailed design document. But if those same people were told "create a system that does X," you'd be met with a blank look.
01-03-2019, 08:51 AM   #19
steelyman
This thread reminded me of a Dilbert strip where they discuss programming in binary:

http://www.dilbert.com/strip/1992-09-08
01-03-2019, 08:54 AM   #20
GrayHare
The next big jump is quantum computing.

Tags
chip, intel, moores law, transistor

