mpeirce
The next big jump is quantum computing.
It is and yet it isn't. We'll only know when we get there.
Programmers as you define them are a dime a dozen. But good software designers are scarce.
In my assembly class we had one assignment that was all done in machine code. Just a bunch of DCs. That class shook out all the people who didn't get it.
Fast forward to now. Software is spit out without any consideration for the efficiency of use. Want to process something? Sure. Click here. Dismiss this dialog. Drag there. Take your hand off the mouse and move over to the keyboard. Then back to the mouse. Repeat a dozen times. That's all it takes! Easy, huh?
I will say that I don't necessarily like the radical simplification of some UIs that have been all the rage since the iPod came out. The pendulum may have swung too far in one direction. But you have a point. The ones that go in the other direction can drive you crazy.
What an honor! Are you able to post links to the interviews, or is that Intel internal use only?
Too bad he has Parkinson's, looks like the medication is slowing his speech a bit. Hope he stays strong for a long time yet.
-ERD50
I think we're saying the same thing. The problem is dumbing down the program to get good reviews by beginners or journalists, rather than meeting the needs of the real users who are going to have to run the thing, day in and day out.
Something can be so simple to use that it does nothing. Sometimes, the job you're trying to accomplish is complicated, and overly simplified software only makes it harder.
… I helped develop and/or implement various processes, from APCVD doped SiO2 using 100% silane and phosphine …
Was fun, for a while!
Aren't those chemicals known as "One Step"? You know, get it on you or in your lungs and you take one step and then drop dead?
We had a computer running WATFOR in our engineering lab. I would enter two instructions in octal and cause the computer to read cards and write to the printer. Everyone was impressed! I always thought the Programming 101 class should be Assembler.
Well, Honeywell (remember them?) has introduced their quantum computer. There's a corollary to Moore's Law named after Digital Equipment's Gordon Bell, which says that every decade a hundredfold drop in the price of processing power engenders a new computer architecture. Mainframes to PCs to the cloud to what next?
Yep, agree. On the other hand, if you look at the raw computing power available per $ (or watt), we are pretty much where we need to be for every single application there is, including simulating a complete human brain for a few hundred dollars. Strong statement, I know.
The problem is in the architectures (not just software). A chip properly designed for its purpose can easily beat a generic one by a factor of 100. We just don't know yet what architectures are best or how to program for them.
Our CEO's response to the ethical questions raised: "If we don't build it, somebody else will."
At Mega, we did some really interesting stuff with custom architectures. The problem is that the market is limited and producing such chips is very expensive. It helps if thousands of other applications can be grafted onto the architecture; otherwise nobody wants to produce it.