Oh hi there! You’re still here?
To my 5 loyal readers from last time, I deeply apologise for the notification mail that landed in your spambox! This time, I'll send about five of them straight to your oh-so-well-organised inbox. If you're new to this blog, well, then I advise you to start from the beginning 😉
I have to admit that sometimes writing this blog makes me feel like I’m writing my thesis. I can say what I want in five words but still make it twenty. Oh well…
Wait! No, nevermind.
Now back to the nerdy jibber-jabber
IBM Master the Mainframe Part Two – Challenge #01 & #02
I’m going to try and complete two challenges today. The first challenge is about the same as last time. I just need to prepare a dataset but this time it’s just a tad more advanced.
The screenshot above shows how I submitted a job called part2. The job allocates member #01 of the P2.OUTPUT partitioned data set to the DD name SYSUT2, and the country data to SYSUT1. The DUMMY parameter indicates that no data set is allocated to SYSIN, so no control statements are read. IEBGENER is a utility that generates or copies a data set, and with an empty SYSIN it performs a straight copy from SYSUT1 to SYSUT2. So this generated PAGE 0001.
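For the curious, a job like this boils down to just a handful of JCL statements. The sketch below is from memory, not the actual challenge JCL: the job card details and the input dataset name are my own placeholders, and only IEBGENER, the DD names and P2.OUTPUT(#01) come from the steps above.

```jcl
//PART2    JOB 1,NOTIFY=&SYSUID
//* Straight copy: SYSUT1 (input) -> SYSUT2 (output)
//STEP1    EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DSN=ZOS.PUBLIC.COUNTRY.DATA,DISP=SHR
//SYSUT2   DD DSN=&SYSUID..P2.OUTPUT(#01),DISP=SHR
//SYSIN    DD DUMMY
```

With SYSIN set to DUMMY, IEBGENER has no control statements to process and simply copies the input to the output.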
Then all I have to do to get my hands on challenge #02 is to logoff, take a bathroom break and logon again.
They made me watch a lecture video…
And it was great! I rated it 10/10 on IBMDB. Seriously though… I watched lecture video 7 which explains the ISPF user interface in depth.
This video more or less explains what I had to do in part 1. If they had offered a link to this video back then, it would have answered a lot of the questions I had. It also showed a nice shortcut for navigating the ISPF panels.
If you prefix your navigation command with an equals sign, you no longer need to work your way down through the submenus. =3.4 brings me right to Utilities > Dslist.
Let’s start checking the dataset.
The dataset is there. The next objective is to copy P2.OUTPUT to a new list. I've issued the print command and I'll terminate ISPF to see which list I'm copying to.
It looks like I'll copy my data to Z30163.SPF2.LIST. I'll exit this interface by choosing process option 4, which lets me keep the new dataset.
I went back to my dataset utility and there it is! Z30163.SPF2.LIST! My baby! Welp. Now let's copy the data from P2.OUTPUT to SPF2.LIST. I'll tab to SPF2.LIST and issue the co (copy) command. A new interface pops up: the copy entry panel.
Wait a minute! Am I copying the data from SPF2.LIST to p2.output? I’m kind of confused. Anyway, can’t complain, let’s perform the action Houston!
It appears I was wrong! Don't worry… I'm used to it (sad smiley). "Do it!", says Senator Palpatine!
Aaand it’s copied! I’m now ready to complete challenge 3 of part 2. What I did might not look like much to you, but it actually took me two hours because I’m writing this blog and doing the challenge at the same time.
No idea why I’m making excuses, but I just did. If you’re angry or have mixed feelings about this then mail your complaints to [email protected]
A word about bandwidth
Hey you! I bought the cheapest hosting package (about 99 cents a month) there is, so I’d appreciate it if you put your browser into data-saving mode when visiting my blog. KIDDING! No no, I’m very happy to have you here 🙂
If you got this far, then you'll be the first to read about my next blogging idea. Of course I'll be blogging about challenge 3 and the other challenges first, but I'm in need of a new car and I'm broke.
So… I might build a crawler that complies with the site's robots.txt Crawl-delay. The crawler will notify me whenever a new and interesting car shows up on the website. Maybe I could collaborate with my machine learning friends and have them apply NLP plus a classifier to detect whether a listing is about a broken car, spare parts or a brand-new one.
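Just to sketch the polite-crawling part: Python's standard library already ships a robots.txt parser, urllib.robotparser, that understands Crawl-delay. Everything below is hypothetical (the robots.txt content, the car-crawler user agent and the URLs are made up); a real crawler would fetch the file from the listing site with set_url() and read() instead of parsing an inline string.

```python
import time
import urllib.robotparser

# Hypothetical robots.txt content; in practice you'd fetch the real one
# with rp.set_url("https://the-listing-site/robots.txt") and rp.read().
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def polite_fetch(urls, user_agent="car-crawler"):
    """Yield only the URLs robots.txt allows, waiting Crawl-delay
    seconds between requests."""
    delay = rp.crawl_delay(user_agent) or 0
    for url in urls:
        if rp.can_fetch(user_agent, url):
            yield url          # the actual HTTP request would go here
            time.sleep(delay)  # honour the site's Crawl-delay
```

The classifier part would then only ever see pages the site explicitly allows, at a rate the site asked for.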
If you’re interested, let me know!
PS: Guess I have something to fix every blog post… hyperlinks now open in a new tab. Want to stay updated? Follow me via RSS or subscribe with your mail address.