Unemployed - Just in time

imoldernu

Ever wonder how the blacksmith felt when Henry Ford introduced the automobile to America?

Since so many members here have been in the IT world, it may feel similar as computer science becomes the John Deere of the 21st century... and perhaps sooner rather than later. As plows and tractors revolutionized agriculture, the number of people needed to feed the world dropped, leaving millions of farmworkers by the wayside.

So, now... the business of programming may be seeing a new kind of revolution, according to this:
Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming - ...

http://www.nytimes.com/2013/12/29/science/brainlike-computers-learning-from-experience.html?partner=MYWAY&ei=5065&_r=0

(If you have trouble getting through the NYT paywall, try googling the headline for other links to the story)

As I recall the days when jobs disappeared in New England in the 1940s and 1950s, when the textile industry moved away and my mom and dad had to change professions mid-life, it makes me wonder if the day of the programming professional will become just a memory. While we'll always need people, will we need so many? Will the level of experience and education be as important?

Not so long ago, we used to say "Buck Rogers" when talking about this stuff. Robots and rockets. How do you see the future for the IT professional? In a year... in 10 years... and longer? Is the article "Buck Rogers"?
 
How do you see the future for the IT professional?

The Matrix or The Terminator. SW or HW. Any day now, and either way, humanity is doomed. Robots will soon take over.
 
Reading through the article, the thing that jumped out at me was the statement that, with today's technology, a computer that models 100% of the neurons in a human brain would require as much power as two major US cities. This tells me that major breakthroughs in reducing the power needs of computing hardware are needed before this technology can become widespread.

Without those breakthroughs, I think everyday IT work is safe for at least 10 years and probably longer. There could be shorter term benefits in specific applications.
 
Things will be a lot different, but in the end, it'll probably require even more people to keep things running. Back in the '80s my employer was really gung-ho on the "paperless office" concept, yet everything we did along those lines seemed to generate even more paper. Simplifying things sometimes manages to create even more work.

The future will still need people to run the data centers and hardware, manage projects, program in higher level languages that haven't been created yet, and troubleshoot when things do go wrong. The need for management and administration will probably increase more than the need for the actual programmers. I see the IT industry alive and well for a very long time, just a lot different than it is today. And certain jobs in IT will probably not be considered to be as highly skilled / highly paid as they are today - that shift has already begun of course!
 
The Matrix or The Terminator. SW or HW. Any day now, and either way, humanity is doomed. Robots will soon take over.

I see it as a hybrid of the two: First, robots will be built to do the menial tasks - like farming. Then, another revolutionary change will occur which will render most of the robots unnecessary for farming (just like with the humans and farming). Then, the masses of unemployed robots will be competing with the unemployed people for jobs.

That is when the war will start. :)
 
Someone else on this forum a while back used the expression "Adapt or Die." Agree.
 
yep

When I started in IT, 4GLs were coming out; they were going to increase a programmer's productivity so much we wouldn't need programmers anymore. When that failed, the buzzword was "upper CASE": model out the data and processes, that was the answer. Then lower CASE came out; it read the upper CASE models and generated indecipherable 3GL code. Yeah, that didn't work so well.

About the same time, Codd and Date became real popular. You had to have a relational database with data in at least 3rd normal form. Surely that will fix the issues. Why, we can just give the business folks reporting tools and they can write their own reports - no need for IT.

Then came object-oriented (OO) languages - the problem was all that old procedural code. Let's go to Smalltalk, it fixes all problems. Oh wait, maybe C++ is the answer. Oh, those silly memory leaks. OK, let's fix that with a platform-neutral language, Java; it manages the memory, so surely that's the answer. Wait, we need a container to provide services - JEE will fix all problems. Well, not quite: we need distributed transactions and load balancing, so throw a sprayer in there and now we have redundant applications that never go down. Oh wait, we're wasting too many resources (silo applications) - virtualization fixes all issues. What do you mean I ran out of virtual resources? I had to pretend they existed, so how can I run out?

Well, at the end of all that, we see a 400-million-dollar website deployed - see how well the technology works.

Heck yes I'm in, you guys start designing, somebody will figure out what this thing is supposed to do.

Sorry for the rant, great article. I'm sure I missed as many "magic answers" as I wrote about.

MRG
 
Self-learning chips will likely not function properly. For instance, I work in air balancing, and every year some engineering company claims to have developed self-balancing devices to save money on balancers. Guess what... they don't work when installed. They are expensive failures that still require the manpower for the systems to function. I assume that computer code will follow the same pattern.
 
I suspect almost every generation has worried strenuously about the end of civilization at the hands of technology, and many convincing articles have been written in support. "Too many people" has been a recurring favorite. Creative destruction is messy business, yet we seem to muddle through, and often even prosper. Most people are experts at identifying problems (ad nauseam); far fewer also look for solutions. And some answers simply can't be "seen" until there's a real need. Thank goodness there are still people who live to rise to the challenge of being told no, it can't be done...

"Every invention was once thought to be impossible."

"All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident."
 
Reading through the article, the thing that jumped out at me was the statement that, with today's technology, a computer that models 100% of the neurons in a human brain would require as much power as two major US cities. This tells me that major breakthroughs in reducing the power needs of computing hardware are needed before this technology can become widespread.
The paragraph that jumped out at me was this one:

"But last year, Google researchers were able to get a machine-learning algorithm, known as a neural network, to perform an identification task without supervision. The network scanned a database of 10 million images, and in doing so trained itself to recognize cats."

I'm thinking that once we can train a computer to search for, and view, pictures and videos of cats on the internet, we can free up a huge amount of time, making us all much more productive in the long run :LOL:
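
For anyone curious what "trained itself without supervision" means in miniature, here's a toy sketch - just Python with NumPy and scikit-learn (both assumed installed), and nothing like Google's actual system or the chip in the article. An unsupervised algorithm groups unlabeled points into two clusters without ever being told which point belongs to which group:

Code:
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two blobs of points; the algorithm is never told which blob is which.
blob_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
blob_b = rng.normal(loc=[5.0, 5.0], scale=0.5, size=(100, 2))
unlabeled = np.vstack([blob_a, blob_b])

# k-means discovers the two groups on its own - no labels supplied.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(unlabeled)
print(model.labels_[:5], model.labels_[-5:])  # the two blobs end up with different cluster ids

Obviously a far cry from scanning 10 million images for cats, but it's the same basic idea: the structure comes out of the data itself, not from a programmer spelling it out.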
 
I'm willing to bet that in my lifetime, COBOL will still be around :)
 
The Matrix or The Terminator. SW or HW. Any day now, and either way, humanity is doomed. Robots will soon take over.

There is another option. I think it was a Robert Heinlein short story - people become the robots' pets, and the robots strive to keep them happy lest they become destructive.

Sort of like the way we treat puppies....:LOL:
 
I'm willing to bet that in my lifetime, COBOL will still be around :)

+1
Folks would be amazed if they knew how much COBOL code they count on and use. There may be a pretty web front end, but COBOL still does most of the heavy lifting.
MRG
 
In song(?)

"Down at the well they got a new machine
The foreman says it cuts man-power by fifteen
Yeah but that ain't natural yeah, well so old Clay would say
You see he's a horse-drawn man until his dying day"

Elton John, "Country Comfort"
 
My DW asked me about this article this morning. Neural networks are nothing new. Maybe a chip to take advantage of this type of programming is new... I don't know. It could be. Specialized chips are nothing new either.

Also, neural networks are just as statistical as any other statistical programming. The so-called "learning" is very much dependent on the training data set and how it is fed into the system. It's sort of like how random number generators are not really random.
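
A tiny sketch of that point - just NumPy and a made-up one-parameter "model," not anything from the article: the "learned" answer is a deterministic function of the training data and the seed, the same way a seeded random number generator always produces the same sequence.

Code:
import numpy as np

def train_tiny_model(x, y, seed):
    # "Random" initialization is reproducible once the seed is fixed.
    w = np.random.default_rng(seed).normal()
    # Crude gradient descent fitting y ~ w * x.
    for _ in range(200):
        w -= 0.5 * np.mean((w * x - y) * x)
    return w

x = np.linspace(0.0, 1.0, 50)
print(train_tiny_model(x, 2.0 * x, seed=42))  # about 2.0
print(train_tiny_model(x, 2.0 * x, seed=42))  # identical: same data, same seed
print(train_tiny_model(x, 5.0 * x, seed=42))  # about 5.0: different data, different "learning"

Change the training data and the "learned" answer changes with it - there's nothing mystical going on.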

There are a number of brain modeling projects that have recently been approved for funding in the US and Europe. It could be that hardware developments in this area are influenced by these projects. I wish them well. I am not expecting the end of civilization at this point.

I told my DW that this sounds like one of those stories they wrote a while ago and held off until the time between Christmas and New Years.
 
Changes will be gradual enough for people in the workforce. Computer technologies last a lot longer in maintenance mode than we realize.
 
Pssst.....have your kids/grand-kids take up
1. Mechanic
2. Plumber
3. Electrician
4. Dry stone waller (bit specialized....but I like them):)
5. Gardener
6. Woodworker
etc
etc
These are good jobs and they will be there years from now.
 
I love these articles and books like "The Singularity is Near," because the changes (which are probably inevitable) are so fascinating. Unfortunately, or maybe fortunately, this stuff is so complicated that it may take decades or centuries before the radical changes predicted come to fruition. In the meantime, dribs and drabs of cool things steadily pop into our lives and keep us smiling and nodding our heads. Every once in a while some of these changes may be quite disruptive, leading to a lot of angst and some fun bubbles in the market.

Ultimately I am with easysurfer -- COBOL will still be with us decades hence. Heck, chunks of it will probably be wrapped in 10G software in the first platform we use for uploading our minds.
 
:)
With some contrary opinions, sounds like HAL hasn't arrived yet. Will settle for the next 10 to 15 years...
Après moi, le déluge...
 
I'll bet those smart chips and robots can't stop the big asteroid strike from happening. And when it hits, we are back to the dinosaurs again and Firecalc won't be worth a hill of beans.:D
 
Ultimately I am with easysurfer -- COBOL will still be with us decades hence. Heck, chunks of it will probably be wrapped in 10G software in the first platform we use for uploading our minds.


Heh. I had to deal with a fair amount of 'legacy' code in one of my jobs, as well as being the de facto corporate 'memory' for Why Things Work That Way. My business card job title was "Programmer/Archaeologist".

"So, the Smalltalk-80 system invokes a C++ library to format a string that is pushed to a FORTH interpreter..."
 