COBOL for COVID?

My last job was in 2005, programming in COBOL for IBM on a health care account.

I was not programming in 1999, but I was involved in testing date changes for Y2K. The programmers were thought of as gods, saving thousands of programs.

It would be a good opportunity for people who were toying with the idea of returning to work since it would be a temporary job. I'm sure not interested...
 
I was a professional COBOL programmer and analyst for 15 years, until we finally migrated to other languages. Yes, I started with punch cards, typed my own, and quickly learned to use sequence numbers on the right side of the cards in case a deck got dropped. We had a card sorter machine that could re-sort them.

Way too much rust has accumulated over the years for me to deal with IMS or memory dumps. I probably could still understand most code and debug compile errors, but I think NJ will have offers from others who remember a lot more than I do at this point.
 
Worked with COBOL a time or two. Yep, I'm a mainframe dinosaur :cool:.
 
Well, COBOL was still going strong in 2000. Plenty of new COBOL code being written. But I do remember some folks taking a break from whatever they were doing at the time, and voluntarily doing Y2K 'fixes' in COBOL and PL/I because the pay was pretty high. Those folks were 'job shoppers' (contractors) who could hop from job to job.


As 2000 approached I had been in teaching for nearly a decade. Yet, I was offered a salary that would more than double my pay for a few years if I would work on fixing the Y2K problem. I did not take the bait because my teaching job came with one thing nobody was offering anymore - a pension. It was a smart decision. The old COBOL programmers were dropped like hot potatoes by the end of 2001. I was shocked. Shocked!!
 
Used COBOL in college a bit; after joining Megacorp I did not use it for years, until the mid-90s when a project required it. I am good at quickly picking up or refreshing on programming languages, so when Y2K came around I was golden because of that skill. It helped get me stock options to keep me (and others) from jumping ship. I have not used it much since then, though a few CICS (some of you may know what that is :)) and COBOL situations came up where my knowledge was still useful in providing code samples that had to be written in COBOL.
 
I was an electrical engineering student and took FORTRAN and PL/1 for the first two programming courses. The instructor worked in the campus computing center and screened me in for a job as a student programmer.

My first job assignment was to write a simple check register in COBOL for the finance dept. All it had to do was read the card data for each check and print it on paper. On my own, I added a total of all the checks at the bottom of the register.
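A check register like that is only a few dozen lines of COBOL. Something along these lines, sketched from memory in later COBOL-85 style, with made-up names and DISPLAY standing in for a real print file (so, definitely not the original):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CHECKREG.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
      *> The real input was a card deck; a plain file stands in here.
           SELECT CHECK-CARDS ASSIGN TO "CHECKS.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  CHECK-CARDS.
       01  CARD-REC.
           05  CK-NUMBER   PIC 9(6).
           05  CK-PAYEE    PIC X(30).
           05  CK-AMOUNT   PIC 9(7)V99.
       WORKING-STORAGE SECTION.
       01  WS-EOF          PIC X VALUE "N".
       01  WS-TOTAL        PIC 9(9)V99 VALUE ZERO.
       01  WS-TOTAL-EDIT   PIC Z,ZZZ,ZZZ,ZZ9.99.
       01  WS-AMOUNT-EDIT  PIC Z,ZZZ,ZZ9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT CHECK-CARDS
           PERFORM UNTIL WS-EOF = "Y"
               READ CHECK-CARDS
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END PERFORM PRINT-ONE-CHECK
               END-READ
           END-PERFORM
           CLOSE CHECK-CARDS
           MOVE WS-TOTAL TO WS-TOTAL-EDIT
           DISPLAY "TOTAL OF ALL CHECKS: " WS-TOTAL-EDIT
           STOP RUN.

      *> Print one check line and add its amount to the running total.
       PRINT-ONE-CHECK.
           MOVE CK-AMOUNT TO WS-AMOUNT-EDIT
           DISPLAY CK-NUMBER "  " CK-PAYEE "  " WS-AMOUNT-EDIT
           ADD CK-AMOUNT TO WS-TOTAL.

The whole "improvement" was the one ADD statement and the final total DISPLAY.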


My improvised total on the test data deck did not match the finance dept's manual total, so they would not accept my program into production.



About a week later the dept head came back and said my program had found an erroneous entry that they had been struggling to find... my total was right, theirs was wrong. That was the start of four very successful years working as a mainframe student programmer. FORTRAN, PL/1, COBOL (including Report Writer), JCL, 370 Assembler... having an office with my own tube and access to the print room was a boon to doing my homework without fighting for an open CRT.

I expect I'll have a green screen related nightmare tonight.
 
I learned assembler, PL/I and ALGOL/Pascal (very similar) in college. I was a programmer/analyst for 4 years before moving into management, mostly in COBOL although I was the maintenance lead for an old assembler "TP" system (teleprocessing - very rudimentary online order system). Being on call for the nightly batch runs was awful - hauling boxes of printouts home to debug JCL issues in the middle of the night, not being able to go anywhere during your week on call (long before cell phones), etc.

Oh, and my programs were all Y2K compliant in the 1980s, and any program I had to change got its date logic fixed at the same time. Some of the programs had already been around nearly 20 years so I figured they just might last 20 more.
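For anyone who never had to do one of those fixes, it usually meant one of two things: widen the year to four digits where the files could change, or bolt on a century window where they couldn't. A bare-bones sketch of the windowing trick (the pivot value of 50 is my assumption; every shop picked its own):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. YWINDOW.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YY     PIC 99 VALUE 03.  *> 2-digit year from old file
       01  WS-YYYY   PIC 9(4).         *> expanded 4-digit year
       01  WS-PIVOT  PIC 99 VALUE 50.  *> window pivot - an assumption
       PROCEDURE DIVISION.
       MAIN-PARA.
      *> Years below the pivot are taken as 20xx, the rest as 19xx.
           IF WS-YY < WS-PIVOT
               COMPUTE WS-YYYY = 2000 + WS-YY
           ELSE
               COMPUTE WS-YYYY = 1900 + WS-YY
           END-IF
           DISPLAY "YY " WS-YY " EXPANDS TO " WS-YYYY
           STOP RUN.

Windowing only postpones the problem, of course, which is why programs with four-digit years baked in from the start never needed the late-night heroics.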
 
I should probably tell my sister; she was head of IT for the State of NJ back in the '70s!
 
[Attached: Y2K.jpg]

The mention of Y2K reminded me of this cartoon (attached, I hope!).
 
I still have a lot of old mainframe programming and system manuals they can buy from me.
 
But what a scene it was at the midnight punch-card running centers! I was not in computer science, but I had friends who used the cards, and I vividly remember the noisy and crowded midnight life in the basement of the Computer Science building on campus, as the lowliest of college students ran their cards and generated results. Does anyone else remember the scene? Must have been the end of the '70s.

I remember that but we were learning FORTRAN IV :LOL:
 

Fortunately I was not in CS, but I still had to run stuff (Calcomp runs, etc.). Since I was doing undergrad research in a lab (not classes), the prof I was working with was able to provide me with a "special card", so I dumped my punch cards (which I had to punch myself) into the deck with the existing program cards.... at the FRONT of the queue!! No waiting! I'm sure the rest behind me were not amused.... (this was in the mid-'70s).
(My sibling has written so much spaghetti code in COBOL (and assembly for subroutines) for govie mainframes... retired for years now, and I know there's no way they'd return for even $100/hr. Maybe $400 might be tempting.)
 
There are many places that still use mainframes besides the government - some of whom you probably do business with. It is not simple to "just rewrite it".
 
Apparently a LOT of legacy unemployment systems are still coded or partially coded in COBOL. Remarkably, not many young-uns know it. States are apparently hiring: https://www.dailymail.co.uk/news/ar...utdated-computer-amid-unemployment-chaos.html

Next up: will I need to brush up on those JCL skills from long ago?:dance:
Indeed there are. COBOL became popular in the late '60s and '70s and stayed in wide use into the '90s. Y2K didn't see many legacy systems replaced; they were only upgraded in COBOL because replacing them was too risky. By then COBOL wasn't taught in many universities and definitely wasn't a cool language on your resume. Developers like writing in newer OO languages with all their cool tools. Who wants to maintain decades-old code that nobody wants to pay great money for?

Expect both the shortage of COBOL skillsets and the 2038 date issues to be ignored.
 
When I was in college in the 1970's, I took about 4 courses in FORTRAN. Then I took one course in COBOL. That one COBOL course made me realize that no way was I going into IT, so I continued on and got my Mechanical Engineering degree. And the rest is history.....:LOL:
 
What are the 2038 date issues?

I had to look it up, but here's the problem:

The Year 2038 problem (also called Y2038 or Y2k38 or Unix Y2K) relates to representing time in many digital systems as the number of seconds passed since 00:00:00 UTC on 1 January 1970 and storing it as a signed 32-bit integer. Such implementations cannot encode times after 03:14:07 UTC on 19 January 2038. Similar to the Y2K problem, the Year 2038 problem is caused by insufficient capacity used to represent time.
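To put numbers on it: the biggest value a signed 32-bit integer can hold is 2,147,483,647, and that many seconds after midnight UTC on 1 January 1970 works out to exactly 03:14:07 UTC on 19 January 2038. Since this is a COBOL thread, here's that arithmetic checked with COBOL's date intrinsics (just a quick sketch, nothing authoritative):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2038.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-DAYS     PIC 9(8).
       01  WS-SECONDS  PIC 9(10).
       PROCEDURE DIVISION.
       MAIN-PARA.
      *> Whole days from the Unix epoch (1970-01-01) to 2038-01-19.
           COMPUTE WS-DAYS = FUNCTION INTEGER-OF-DATE(20380119)
                           - FUNCTION INTEGER-OF-DATE(19700101)
           END-COMPUTE
      *> Add the 03:14:07 time of day, converted to seconds.
           COMPUTE WS-SECONDS = WS-DAYS * 86400
                              + (3 * 3600) + (14 * 60) + 7
           END-COMPUTE
      *> Shows 2147483647, the top of a signed 32-bit integer.
      *> One more second after that and the counter wraps negative.
           DISPLAY "SECONDS AT 2038-01-19 03:14:07 UTC: " WS-SECONDS
           STOP RUN.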
 
DW just retired from one of those mainframe shops. They will have mainframes for the foreseeable future.

It is not a sin.

Open source and Linux are not panaceas.

As for the 2038 problem: at my last megacorp, we started working on that in 2001 and had it fixed within a few years as part of our new hardware rollout. They didn't want to go through that again. Of course, they've since been drifting to Linux and may have accidentally inherited the problem in the switchover.
 
It was also a single course in COBOL that began my move away from majoring in CS. I had a problem with the professor and with the grade I received in the course, a grade that barely kept me off the Dean's List. Then I didn't like the next course for CS majors at all, so I dropped it and changed my major to Economics, a much better major for me anyway. While I still liked programming, having CS as an equivalent minor still looked great on my resume and set me up to be a "big fish in a small pond" when I began my actuarial career a few years later.

Part of being an end-user included having to deal with actual systems analysts in the IT area. I liked dealing with them because I could speak their language, so to speak (and I knew enough COBOL), and they liked dealing with me compared to other end-users who usually knew very little about programming.
 
When I took my first COBOL class as an electrical engineer I really didn't care for it.
After working as a student programmer for campus admin for a couple of years doing nothing but COBOL, I was assigned an LRU memory cache task in a systems programming class.

The instructor had stored records of numbers representing memory blocks on tape.
We were to read the data and, at the end, spit out which blocks were still in memory.
All the students on the sysadmin side of the shop thought "this is a systems programming class, we have to use assembler". It took them 2 weeks of fiddling to learn how to access and read the tape drive in assembler. I think it took 5 minutes using COBOL.

When your only tool is a hammer, every problem becomes a nail.
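For anyone curious what the COBOL side looked like: read the reference data, keep a small table ordered from most- to least-recently-used, and dump the table at the end. A rough sketch along those lines (the file name and the 4-frame cache size are made up, and this is certainly not the class solution):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. LRUSIM.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
      *> The class data was on tape; a plain file keeps this runnable.
           SELECT REF-FILE ASSIGN TO "BLOCKS.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  REF-FILE.
       01  REF-REC.
           05  BLOCK-NO    PIC 9(6).
       WORKING-STORAGE SECTION.
       01  WS-EOF          PIC X  VALUE "N".
       01  WS-CACHE-SIZE   PIC 99 VALUE 4.   *> frame count - made up
       01  WS-COUNT        PIC 99 VALUE 0.
       01  WS-POS          PIC 99.
       01  WS-I            PIC 99.
       01  WS-CACHE.
           05  WS-FRAME    PIC 9(6) OCCURS 20 TIMES.
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT REF-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ REF-FILE
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END PERFORM TOUCH-BLOCK
               END-READ
           END-PERFORM
           CLOSE REF-FILE
           DISPLAY "BLOCKS STILL IN MEMORY, MOST RECENT FIRST:"
           PERFORM VARYING WS-I FROM 1 BY 1 UNTIL WS-I > WS-COUNT
               DISPLAY "  " WS-FRAME (WS-I)
           END-PERFORM
           STOP RUN.

      *> Record one reference: slot 1 is always the most recent block.
       TOUCH-BLOCK.
           MOVE 0 TO WS-POS
           PERFORM VARYING WS-I FROM 1 BY 1 UNTIL WS-I > WS-COUNT
               IF WS-FRAME (WS-I) = BLOCK-NO
                   MOVE WS-I TO WS-POS
               END-IF
           END-PERFORM
           IF WS-POS = 0
      *>     Not in cache: take a new slot, or reuse the LRU slot.
               IF WS-COUNT < WS-CACHE-SIZE
                   ADD 1 TO WS-COUNT
               END-IF
               MOVE WS-COUNT TO WS-POS
           END-IF
      *>   Slide the newer entries down and put this block on top.
           PERFORM VARYING WS-I FROM WS-POS BY -1 UNTIL WS-I < 2
               MOVE WS-FRAME (WS-I - 1) TO WS-FRAME (WS-I)
           END-PERFORM
           MOVE BLOCK-NO TO WS-FRAME (1).

On the mainframe the SELECT would just name a DD, and whether that DD pointed at tape or disk was the JCL's problem, which is why the COBOL side took minutes.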
 

Not just Unix. Every IBM system (mainframe, midrange, and UNIX) has the issue in one form or another. Mainframe and midrange systems are relative to midnight 1900; Unix, per your comment, is relative to 1970. There are applications that use that date/time for sequencing data; it was used as a unique identity before some languages supported one.

Megacorp had used it for sequencing data. I remember a meeting pre-Y2K where the subject was brought up. As a peer and I left the meeting we had a little sidebar, and I asked how he thought we should fix the issue. His comment: "If that code and database are still around in 2038, we have bigger issues." In 2020 that code is still running well.
 

YIKES! I can see it now: space planes falling from the skies, banks losing all their records and keeping our money, self-driving cars driving over cliffs, and my VCR blinking 12:00 all over again.
 