Why I don’t like the ‘Linux model’

There’s a thing I’m going to call the ‘Linux model’. Not because it pertains ONLY to Linux, but because most of what’s wrong with this model often starts with Linux and stuff that runs (best) on Linux.

In a way, this is really a story about all the stuff that’s broken in JupyterHub, but it goes beyond that… it’s the general model that’s broken, and that model really owes its roots to Linux.

Basically, when you install something on a Linux box – and even the OS for the Linux box itself – it’s probably broken. That is, *something* won’t work after installing it, and there is no way, short of digging into some code somewhere, of ever fixing it.

Worse, the breakage is often complex and intricate – buried in a log somewhere is a message along the lines of “package X failed due to expecting library Y to be x.x.x but was z.z.z”, or some similarly obscure thing that takes days to figure out, if you ever figure it out at all.

You can paste the error into Google, and most of the time you get a dozen hits – all questions on StackOverflow asking the same thing and getting precious little of value in response.

Worse, you are expected to manually update packages on an almost continuous basis, and (of course) such updates often break things that were working fine before the update. Yet if you don’t update, something ELSE will break.

The entire model is broken.

What triggered this particular rant today is that I spent ages figuring out how to (finally) install C++ into JupyterHub so I could run C++ notebooks. Yesterday, I found it broken. The log complains about a library *supplied by the maintainer of this C++ package* having the wrong date compared to what’s expected. It doesn’t matter. C++ in JupyterHub is now broken, and good luck finding anyone to respond with anything useful. Even less likely is that the C++ maintainer will fix it anytime soon.

That’s the other problem with the Linux model. Everything is well documented and often supplied with tutorials. BUT… THEY ARE ALL YEARS OUT OF DATE. Worse, the stuff they describe has changed so much in the years since that you cannot follow the tutorial without being worse off than if you’d just thrown mud at a wall.

The biggest problem with the Linux model is that no one really cares. “I did this really cool thing in 2012 but now I’m bored and… who cares” seems to be the mantra of every developer. Nothing is maintained for long. It’s becoming obvious that nothing is really being used either – otherwise the failures would be noticed and (hopefully) fixed.

Overall, it’s a really depressing time to be trying to actually do anything on a Linux box.

Reflections on Computer Horsepower

I’m rapidly on my way to becoming an old codger. This Christmas Break I soldered together a couple of hardware kits that emulate some old and older computers. One was an Altair 8800 copy, which in its day was one of the very first “personal computers” ever sold. The other kit was a PDP11/70 replica, which was some of the first “big iron” I ever programmed on.

Now, as testament to my codgerhood, my first computer experience was at the UofC on a CDC Cyber 170, followed by the Honeywell Multics system that replaced the CDC at the UofC a few years later.

My first job post-graduation was at a company using two IBM 3033 mainframes, each of which filled a large room. The laser printer filled an equally large room, but that’s another story (it was VERY fast).

From there I worked with various other systems, including the above-mentioned (actual) PDP 11/70s and, at one point, even some time on a Cray YMP.

But this isn’t about “big iron”, it’s about the personal computer. My first was a TRS-80 Model I. I bought a bare silk-screened expansion board, sourced the parts and soldered it together myself, as I could not afford the “official” one. Later I bought a TRS-80 Model III, then the Model 4 and finally a Model 4P, which I still have, complete with all manuals and software.

But in amongst that time came the IBM PC. It changed the world simply because it was IBM and it seemed *everyone* (or every company) bought one.

I never owned an IBM PC, nor a clone PC. My first foray into “modern” (i.e. post-IBM) PC ownership came when Tandy brought out the Model 2000. This was based on the 80186 chip, which was a “hybrid” – not an 8086 and not an 80286, but something in between. It was a great machine, and much more affordable (for the time) than a “286”.

As I struck out on my own consulting, I bought one of the newest “386” machines, and it cost me $6000. But for the time it was the greatest, fastest machine you could buy.

I lived, worked, and owned PCs through the 486 era and into the “Pentium” machines. By then the operating systems were firmly Windows based. I skipped Windows 1 through 3, but with Windows 3.1 it finally came into its own. Windows for Workgroups (3.11) was a really nice system at the time, and I did quite a bit of work on it.

Then came Windows 95, which “changed the world”. Certainly it brought the internet to the common computer owner, as well as a pretty decent OS. Buggy, but decent. Then came Windows 98 and Windows ME (pronounced “meh” – as in “what the hell is this piece of crap???”). By then I’d gravitated to Windows NT, which had one great feature – it worked and worked well.

Through this we had Pentiums. They got faster, but they were Pentiums.

Eventually, sometime after 2000, Intel started putting out the Core i series – i3, i5, i7. Each one had more cores and was faster than its predecessor. AMD also had multi-core chips, and there was, for a time, a nice “arms race” of computing horsepower.

At the end of April 2012, I built my current PC system. It uses a quad-core Intel Core i7-3770K, an Asus Sabertooth Z77 ATX motherboard, 16GB of RAM, a couple of fancy graphics cards, a fancy case with water cooling, two SSDs and a Blu-ray writer. All state-of-the-art for early 2012. I bought the components and assembled it myself, and it was (and is) a very nice system.

It was also considered very fast and high performance. That particular i7 (3770K) was quad core, and quick.

But what I’ve noticed since then is… nothing. I *think* you can buy processors with more cores, and probably faster ones, but today I realized that although I still get tech-type feeds, I haven’t actually heard much in the past few years about “newer, faster, better” processors.

It’s as if we’ve exhausted that particular line of “faster, better” in personal computing. I suspect that for 99% of the market, ANYTHING you buy today is plenty fast enough. The other 1% is gamers, and perhaps if I got gaming feeds or magazines I would hear more about “faster gaming machines”, but I do wonder.

Have we really reached the end of the “faster, better” in computing hardware?

I also wondered: if I wanted to find out the FASTEST computer you could use today, how would I even go about finding it? Yeah, there’s “the google”, but I’ve also started noticing that with all the “targeted results” based on what you like, it’s getting harder and harder to find any REAL information on the internet these days.

<sigh> I guess I really am becoming an old codger.

2018 Christmas Break Soldering

I bought a couple of Vintage Computer replica kits in the summer, but did not have time to work on them due to home renovations. One kit was a replica PDP11/70, the other a replica Altair 8800.

I decided that they would be perfect “Christmas Break” projects and so kept them until then.

Over the Christmas Break, I got them out and started building them. I started first with the PDP11/70 kit, or PiDP11 as it’s called. It features a manufactured plastic case and switches that recreate the complete look of a vintage PDP11/70. There is also a front panel, a professional-grade circuit board and all the components (switches, resistors, LEDs, diodes, etc.). The kit uses a Raspberry Pi (Model 3B recommended) running software called simh to drive the replica. Basically you run the Pi’s Linux, and simh runs as a process on top of that – reading the switches and driving the LEDs.

The kit was straightforward to solder together, and ended up taking most of one afternoon and evening to build. When complete, it looks and works very much like the PDP11/70s I have used in the past, minus the loud whirring noise of the giant disk packs and fans.

The second kit was the Altair 8800 replica, which again featured a case (bamboo this time), front panel, circuit board and bags of components. The Altair 8800 replica emulates the Intel 8080 of the early personal-computer days using an Arduino Due rather than a Raspberry Pi. This kit was more complex, and took an entire day to solder together and assemble.

I had a few initial issues with the Altair kit, as it features a bluetooth serial port as well as an SD card reader to hold various “disk pack” images. At first I could not get either the bluetooth or the SD reader working. Some email discussion with the kit designer indicated that the bluetooth card, though powered, was not initialized unless you manually configured it in the software setup. Once that was done, the bluetooth worked perfectly and has become my preferred communication channel with the replica. The SD reader was more interesting, in that the metal ‘can’ protecting the pins was bent, preventing full insertion of the SD card. Once that was fixed, the SD reader worked perfectly, as did the replica.

It’s been fun keying a few simple programs into both replicas using the front panel switches, but the real power comes from all the operating systems both replicas support.

The Altair 8800 replica, or “AltairDuino”, offers CP/M, Altair DOS, many games and other amusements. The PiDP11 offers RTS11, BSD 2.11, three flavors of Unix and a real-time OS once used in SCADA industrial control.

I really enjoy playing with these old machines. Given the current state of obsolescence and the love of many to consign everything unwanted to dumpsters, I’ll likely never own full-size originals, but these are a lot of fun.

JupyterHub Chronicles

I’ve continued to work with JupyterHub since my last post, and have made significant progress towards my overall goal of creating a real system for developing a programming course.

The first development was to recreate my work to date on a new server: Ubuntu 18.04 Server, as opposed to the Desktop edition I had been using. I also moved this server to VirtualBox (now V6) on a different machine. The new machine acts as a file server and has capacity to spare, plus it stays on “as a server” all the time.

Installing Ubuntu 18.04 Server on the machine was not difficult, and following my scripts I was able to create JupyterHub on the new server, with full encryption and networked through “huntrods.com”. I also recreated the various demo logins to allow me to share this work with other colleagues.

I finished developing “Unit 0” for my Java programming course, and also explored other uses, such as notebooks for my Network Java Programming course. There were some issues, but most of the programs work.

I also found some significant shortcomings in SciJava, so I contacted the developers for more documentation. Their response was “move to BeakerX, as it has a full Java implementation”. They also informed me that SciJava might be End-Of-Life soon, which would be unfortunate.

I then installed BeakerX on my single-user Ubuntu Desktop, following guidelines from one of the developers. It worked, so I tried installing it on the Ubuntu Server as well. After one set of instructions failed, I reverted to the method that had worked for many of the other packages, and it worked.

I now have full-featured Java running on JupyterHub under BeakerX. There is one outstanding issue that affects both BeakerX-Java and SciJava: neither will accept user input from the keyboard.
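To make that limitation concrete, here is a minimal sketch (a made-up cell, not one from my course notebooks) of the kind of code that trips it – anything that blocks reading from System.in just sits there, because keyboard input never arrives:

import java.util.Scanner;

// Hypothetical notebook cell: reads a line from the keyboard and echoes it back.
Scanner keyboard = new Scanner(System.in);
System.out.print("Enter your name: ");
String name = keyboard.nextLine();   // this is where the cell stalls - no keyboard input ever arrives
System.out.println("Hello, " + name);

Run as ordinary command-line Java this works fine; in the notebook kernels, that nextLine() call is where things go wrong.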

Another limit of BeakerX-Java is that it won’t run fragments of code that aren’t legal Java. Example: SciJava will evaluate “10+23” and output “33”, while BeakerX-Java gives an error, just as “real” Java would (which is what BeakerX provides).

It turns out (from the developer) that SciJava is really a Java+Groovy hybrid, which is great for what I’d been doing, but isn’t really “real” Java.
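A small illustration of the difference (paraphrasing the behaviour I saw, rather than quoting actual course cells):

// SciJava (the Groovy-flavoured kernel) happily evaluates a bare expression:
10 + 23                          // the cell output shows 33

// BeakerX-Java wants a complete, legal Java statement instead:
System.out.println(10 + 23);     // prints 33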

Either I modify my Unit 0, or I go with SciJava in some notebooks and BeakerX-Java in others.

However, it’s great to have full-blown Java available in my notebooks.

JupyterHub – it’s been a long journey (and it’s not over yet…)

I started working with Jupyter Notebooks in late November (2018), and was rewarded fairly quickly with the ability to create notebooks for Java (SciJava), Chemistry (rdkit), Engineering (scipy), graphics (matplotlib) and Geography (Basemap).

However, the real sticking point was that these were all notebooks running under a single-user Jupyter on a local user account, on a VirtualBox Ubuntu Linux server (18.10) that I’d created.

The real goal was to create a Jupyter system that would work for multiple users, so that I could use it for my new revision of “Introduction to Computing – Java” for Athabasca University. This meant running JupyterHub.

Along the way I moved to Ubuntu 18.04 LTS (the long-term support release) and spent hours on Google, YouTube and the plethora of Jupyter (and JupyterHub) pages. There were many frustrations along the way, from a complete communications breakdown in forums while trying to get a site certificate (letsencrypt), to documentation and tutorials written in 2015 and never updated, even though everything (and I do mean everything) has changed in the time since.

By December 5, I was able to create a functioning JupyterHub on huntrods.com with the proper certificate. The only kernel running was Python3, but it featured either PAM (local user) authentication or OAuth (Github login) authentication, so I was pretty happy.

BUT… (and this is huge) I really needed SciJava, or writing a Java course would be a bust.

The breakthrough came this week – yesterday, in fact. After repeated ‘banging head against the wall’ attempts, I was able to install SciJava for all users. With that success, it was relatively simple to install the other libraries (noted above) so that all my single-user demonstration notebooks ran in the new JupyterHub.

I was off and running, and quickly wrote my first notebook for the Java course. It’s everything I wanted, and more. It’s really a new way of “doing programming”: a mix of documentation and program code that work almost seamlessly together. Instead of a book with dead code examples, the code is ‘alive’ – press Run and it executes. Better still, the student can change the ‘book code’ and immediately see the change take effect. It’s brilliant!
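As a purely hypothetical illustration (not the actual Unit 0 material), a ‘live’ cell can be as simple as this:

// Students can change these numbers and press Run to see the new result immediately.
int apples = 3;
int oranges = 4;
System.out.println("Total fruit: " + (apples + oranges));

The student edits the values, re-runs the cell, and the output updates right there in the page – that is the whole appeal.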

Today I worked on getting the Hub automated with supervisor. My next project is to store the notebook pages in a Git repository, either GitHub or local to the server, and then refresh them whenever users log in to the Hub.

Eventually I’ll use Git for notebook version management for all users, but one step at a time.


Posting a Jupyter Notebook in WordPress

[embedded notebook export: richard-java]
This is a new SciJava Notebook

What’s happening

  • The notebook is viewed and running from a browser on a completely different machine
  • Jupyter Lab is running on Ubuntu in the background as user anaconda
  • Jupyter Lab is accessible over the local network via https & secured with a password

this is really very cool
Why?

  • because it is
  • because I said so
  • did I mention it’s really cool?
In [1]:
System.out.println("Numbers:");
for(int i = 0; i < 100; i++) {
    System.out.print(i + " ");
}
System.out.println("Done.");
Numbers:
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 Done.

And now… The Sieve of Eratosthenes…

  • the code was on my local pc
  • the code was just cut and pasted into this notebook
  • the code ran first time (well, it did on the PC as well)…
In [2]:
public class Seive {

static int MAX = 1000;

    public static void main(String[] args) {
        int[] stones = new int[MAX+1];

        // initialize
        for(int i = 2; i <= MAX; i++) {
            stones[i] = i;
        }

        // remove non-primes
        for(int i = 2; i <= MAX/2; i++) {
            for(int j = i+i; j <= MAX; j += i) {
                stones[j] = 0;
            }
        }

        // display the primes
        System.out.println("Primes");
        for(int i = 2; i <= MAX; i++) {
            if(stones[i] > 0) System.out.print(stones[i] + " ");
        }
        System.out.println();
    }
}
Primes
2 3 5 7 11 13 17 19 23 29 31 37 41 43 47 53 59 61 67 71 73 79 83 89 97 101 103 107 109 113 127 131 137 139 149 151 157 163 167 173 179 181 191 193 197 199 211 223 227 229 233 239 241 251 257 263 269 271 277 281 283 293 307 311 313 317 331 337 347 349 353 359 367 373 379 383 389 397 401 409 419 421 431 433 439 443 449 457 461 463 467 479 487 491 499 503 509 521 523 541 547 557 563 569 571 577 587 593 599 601 607 613 617 619 631 641 643 647 653 659 661 673 677 683 691 701 709 719 727 733 739 743 751 757 761 769 773 787 797 809 811 821 823 827 829 839 853 857 859 863 877 881 883 887 907 911 919 929 937 941 947 953 967 971 977 983 991 997 

a web link using regular old html…
Huntrods Zone


Email Oops

I woke up one morning, and checked my email as usual. All was good. A little later, I wanted to send a reply to one message.

It would not send. I kept getting “timeout on mail server” errors. I tried several things, and nothing worked. Finally, I called my email provider to ask if the mail server was in some way affected.

Nope. But then I was asked a series of questions about my config. Apparently “everything” was wrong with it. I made the changes they recommended, but hated them, as the password was now sent in plain text. Yuk. But… at least I could send email again.

Later in the day I was doing some other work, and had reason to open the taskbar notification area (Windows 7). I noticed something odd: the Pulse Connect icon showed it was active. I have to use Pulse to create a secure tunnel to AU in order to view the exams I mark. Usually I activate the tunnel, mark the exam and then disconnect. This day, however, I saw that I was still connected.

Acting on a hunch, I disconnected the Pulse tunnel. Then I opened my email and reset the configurations to what I had before the morning phone call. Lo and behold, I could send email again with a secure password.

SO – the tunnel to AU was interfering with access to my email provider’s SMTP (send) server. Interesting. Something to note in case I do that again.

OpenBSD Weirdness… and NO permanent solution

A couple of weeks ago I accidentally turned off one of my UPSes. It had been beeping a battery warning every morning, complete with yellow warning light, so I was inspecting it and wanted to silence the alarm. Well, hitting the big button is NOT the way to silence the alarm. Yes, it does silence the alarm, but it does so by turning OFF the UPS.

Oops. That killed three servers: my OpenBSD web server and firewall, my Solaris Tomcat box and my backup file server. I restored power, then did a ‘hot swap’ for new batteries, which sadly did not solve the beeping problem. I suspect it’s just old age on that UPS and it’s now cranky. Oh well.

Meantime, all three servers came back up without apparent incident. Except… my home info server running under Tomcat on the Solaris box was unavailable. I checked it on a local port and it was working fine, but it was not reachable via the firewall server.

After much checking, a couple of reboots and some web reading, it became apparent that the OpenBSD firewall did NOT load the packet filter (PF) rules when it booted. As soon as I manually loaded them, the Tomcat server was again available.

I searched and searched, but there is absolutely no reason I can find as to why a working OpenBSD server would fail to load the PF rules on boot. The rules are good; there were no error messages at all in the boot logs, and it’s always worked in the past.

For now I’ve just made a note to check the date and the PF rules whenever that server gets rebooted, which fortunately is about once every several years. I also need to keep my fingers off the big UPS button!

Network weirdness gets… weird

On Feb 1, all network traffic became … weird. The symptom could best be described as DNS requests being slowed by more than 2x… just enough so that attempts to connect to web servers, mail servers, web pages etc. would randomly fail on the first attempt, then work on subsequent attempts.

It was so bad that everything I did on the web, including email, had to be done twice. Once to fail, then again to connect. I noticed the problem on all devices, not just windows PCs but the iPad and iPhone as well, so I was pretty sure it was more than a device specific issue.

I waited a day or so, hoping it would clear, but after it became apparent it wasn’t getting better, I called Telus. They checked my modem and connection and pronounced all good. They also stated there were no DNS server issues.

So I was stumped.

Finally this week I decided to try the universal computer fix-all: I rebooted the Telus high speed ADSL modem. As soon as the reboot was done, it was clear that the problems were also gone.

I am used to certain types of hardware running for ages without needing a reboot, so I was a little surprised that the modem needed a reboot to clear its buffers, or whatever got mangled.

For now I’m just going to monitor and see how frequently this occurs, but it sure was weird.


Notes from all over for Dec 22

Just some notes on stuff that’s happening as of Dec 22.

Linda’s Windows 10 computer, after a few configuration teething pains, is running quite well. Getting rid of the lock screen took 3 attempts as Microsoft is determined to foist this crap on users, even to the tune of disabling workarounds with each new update. It remains to be seen if my efforts will work for the longer term as MS is so very determined.

We did blow ‘edge’ away. It’s easily the worst browser I’ve ever seen. Basically, it has almost zero configuration options, and the few it does have it ignores. Gone forever and gladly back to Firefox. Likewise the default ‘mail’ app is gone and Thunderbird again rules the emails. Like edge, ‘mail’ is another MS app that can’t even play nice – not even with other MS things like Outlook. What a damaged, untested, unprofessional piece of crap.

I did install Office 2016 this week thanks to a “Home Use Program” deal from MS. Because Athabasca U bought into the whole MS lock-in, we get to buy home versions for really cheap (like $13 for Office 2016 Pro!). It’s OK. I personally prefer Office 2013, because that was the last version without “THE RIBBON”. Yet another unwanted MS user interface “update”.

As for my AU work: I can’t hear people on the phone very well, and certainly not upset callers who may talk fast and in a higher register. After consulting other AU academics, I bought a “MagicJack” from the main website, as it was on sale. It does come from the USA and took a while to arrive, and the free phone number is USA-only, but it does indeed do what it claims. I paid the extra $10 to get a CDN number (Edmonton exchange) and then had AU tie it to my academic 1-888 number. By yesterday it was all working tickety-boo. Better yet, any voicemail message gets emailed to me as an audio file, so I can keep track. I can use a headset when calling anywhere in North America (free), so it’s awesome. Eventually I plan to see if it could replace most of the land line features, but not yet. First I want to see it in action.

I bought a leak detector for my underwater camera, and it came after almost a month in the postal system. Still, not bad coming from Slovenia. It’s really well built and should provide extra protection against flooding for the big underwater camera system.

Speaking of which, the replacement Kraken ring light/strobe came a few weeks ago and worked correctly right out of the box. Nice to know it wasn’t simply user error, but rather some issue with the optical strobe sensor.

That’s all for now. Time for a Christmas break.

Merry Christmas to all, and a very Happy New Year!