Why I don’t like the ‘Linux model’

There’s a thing I’m going to call the ‘Linux model’. Not because it pertains ONLY to Linux, but because most of what’s wrong with this model often starts with Linux and stuff that runs (best) on Linux.

In a way, this is really a story about all the stuff that’s broken in JupyterHub, but it goes beyond that… it’s the general model that’s broken, and the model really owes its roots to Linux.

Basically, when you install something on a Linux box – even the OS for the Linux box itself – it’s probably broken. That is, *something* won’t work after installing it, and there is no way, short of digging into some code somewhere, to ever fix it.

Worse, the breakage is often complex and intricate – buried in a log somewhere is a message along the lines of “package X failed due to expecting library Y to be x.x.x but was z.z.z”, or some similarly obscure “thing” that takes days to figure out, if you ever do.

You can paste the error into Google, and what you get most of the time is a dozen hits – all questions on StackOverflow asking the same thing and getting precious little of value in response.

Worse, you are expected to manually update packages on an almost continuous basis, and (of course) such updates often break things that were working fine before the update. Yet if you don’t update, something ELSE will break.

The entire model is broken.

What triggered this particular rant today is that I spent ages figuring out how to (finally) install C++ into JupyterHub so I could run C++ notebooks. Yesterday, I found it broken. The log complains about a library *supplied by the maintainer of this C++ package* being the wrong date compared to what’s expected. It doesn’t matter. C++ in JupyterHub is now broken, and good luck finding anyone to respond with anything useful. Even less likely is that the C++ maintainer will fix it anytime soon.

That’s the other problem with the Linux model. Everything is well documented and often supplied with tutorials. BUT… THEY ARE ALL YEARS OUT OF DATE. Worse, the stuff they describe has changed so much in the years since that you cannot follow the tutorial without being worse off than if you’d just thrown mud at a wall.

The biggest problem with the Linux model is that no one really cares. “I did this really cool thing in 2012 but now I’m bored and … who cares” seems to be the mantra of every developer. Nothing is maintained for long. It’s becoming obvious that nothing is really being used either; otherwise the failures would be noticed and (hopefully) fixed.

Overall, it’s a really depressing time to be trying to actually do anything on a Linux box.

JupyterHub Chronicles

I’ve continued to work with JupyterHub since my last post, and have made significant progress towards my overall goal of creating a real system for developing a programming course.

The first development was to recreate my work to date on a new server: Ubuntu 18.04 Server, as opposed to Desktop, which I had been using. I also moved this server to VirtualBox (now V6) on a different machine. The new machine acts as a file server and has capacity to spare, plus stays on “as a server” all the time.

Installing Ubuntu 18.04 Server on the machine was not difficult, and following my scripts I was able to create JupyterHub on the new server, with full encryption and networked through “huntrods.com”. I also recreated the various demo logins to allow me to share this work with other colleagues.

I finished developing “Unit 0” for my Java programming course, and explored other uses such as my Network Java Programming course. There were some issues, but most of the programs worked.

I also found some significant shortcomings in SciJava, so I contacted the developers for more documentation. Their response was “move to BeakerX, as it has a full Java implementation”. They also informed me that SciJava might be End-Of-Life soon, which would be unfortunate.

However, I installed BeakerX on my single-user Ubuntu Desktop, following guidelines from one of the developers. It worked, so I then tried installing it on the Ubuntu Server. After one set of instructions failed, I reverted to the method that had worked for many of the other packages, and that succeeded.
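
For anyone trying to reproduce this, the commonly documented route for BeakerX at the time was conda from the conda-forge channel; roughly the following (channel, package name and the pip alternative are from the BeakerX docs as I understand them, so treat this as a sketch rather than gospel):

    # install BeakerX (Java, Groovy, Scala and other JVM kernels) into the active conda environment
    conda install -c conda-forge beakerx

    # pip-based alternative: the kernelspecs then need registering by hand
    # pip install beakerx && beakerx install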

I now have a full-featured Java kernel running on JupyterHub under BeakerX. There is one outstanding issue that affects both BeakerX-Java and SciJava: neither will accept user input from the keyboard.

Another limitation of BeakerX-Java is that it won’t run fragments of code that aren’t real Java. For example, SciJava will evaluate “10+23” and output “33”, while BeakerX-Java gives an error, just as “real” Java would (and “real” Java is what BeakerX provides).

It turns out (from the developer) that SciJava is really a Java+Groovy hybrid, which is great for what I’d been doing, but isn’t really “real” Java.
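
To illustrate the difference with a made-up cell (not from my actual notebooks): the Groovy side of SciJava happily evaluates a bare expression, whereas under a pure-Java kernel you have to write a complete statement to see the result.

    // SciJava (Groovy-flavoured): a bare expression is evaluated and echoed back
    10 + 23

    // BeakerX-Java: wrap the expression in a statement instead
    System.out.println(10 + 23);   // prints 33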

Either I modify my Unit 0, or I use SciJava in some notebooks and BeakerX-Java in others.

However, it’s great to have full-blown Java available in my notebooks.

JupyterHub – it’s been a long journey (and it’s not over yet…)

I started working with Jupyter Notebooks in late November (2018), and was rewarded fairly quickly with the ability to create notebooks for Java (SciJava), Chemistry (rdkit), Engineering (scipy), graphics (matplotlib) and Geography (Basemap).

However, the real sticking point was that these were all notebooks running under a single local user account, on a VirtualBox Ubuntu Linux server (18.10) that I’d created.

The real goal was to create a Jupyter system that would work for multiple users, so that I could use it for my new revision of “Introduction to Computing – Java” for Athabasca University. This meant running JupyterHub.

Along the way I moved to Ubuntu 18.04 LTS (the long-term support release) and spent hours on Google, YouTube and the plethora of Jupyter (and JupyterHub) pages. There were many frustrations along the way, from a complete communications breakdown in forums while trying to get a site certificate (Let’s Encrypt), to documentation and tutorials written in 2015 and never updated when everything (and I do mean everything) changed in the time since.

By December 5, I was able to create a functioning JupyterHub on huntrods.com with the proper certificate. The only kernel running was Python3, but it supported either PAM (local user) authentication or OAuth (GitHub login) authentication, so I was pretty happy.
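
The relevant parts of jupyterhub_config.py boil down to a handful of lines. Here is a sketch of that kind of setup – the certificate paths, callback URL and OAuth credentials are placeholders, and the exact option names can shift between JupyterHub/oauthenticator versions:

    # /etc/jupyterhub/jupyterhub_config.py -- sketch only, values are placeholders
    c = get_config()

    # serve the Hub over HTTPS using the Let's Encrypt certificate
    c.JupyterHub.ssl_cert = '/etc/letsencrypt/live/huntrods.com/fullchain.pem'
    c.JupyterHub.ssl_key = '/etc/letsencrypt/live/huntrods.com/privkey.pem'

    # the default authenticator is PAM (local Linux accounts);
    # to switch to GitHub logins, use the oauthenticator package instead:
    # c.JupyterHub.authenticator_class = 'oauthenticator.GitHubOAuthenticator'
    # c.GitHubOAuthenticator.oauth_callback_url = 'https://huntrods.com/hub/oauth_callback'
    # c.GitHubOAuthenticator.client_id = 'YOUR_GITHUB_CLIENT_ID'
    # c.GitHubOAuthenticator.client_secret = 'YOUR_GITHUB_CLIENT_SECRET'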

BUT… (and this is huge) I really needed SciJava, or writing a Java course would be a bust.

The breakthrough came this week – yesterday, in fact. After repeated ‘banging head against the wall’ attempts, I was able to install SciJava for all users. With that success, it was relatively simple to install the other libraries (noted above) so that all my single-user demonstration notebooks ran in the new JupyterHub.
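
The essence of the “for all users” part is simply running the installs as root against the same environment that JupyterHub spawns notebook servers from, so the kernelspecs land in a system-wide location instead of one user’s home directory. Roughly like this – the environment path and the exact conda package name are from memory, so treat it as a sketch:

    # run as root, targeting the shared environment JupyterHub uses (path is an assumption)
    sudo /opt/anaconda3/bin/conda install -c conda-forge scijava-jupyter-kernel

    # confirm the kernelspec is registered where every user can see it
    jupyter kernelspec list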

I was off and running, and quickly wrote my first notebook for the Java course. It’s everything I wanted, and more. It’s really a new way of “doing programming”, a mix of documentation and program code that works almost seamlessly together. Instead of a book with dead code examples, the code is ‘alive’ – press run and it executes. Better still, the student can change the ‘book code’ and immediately see the change take effect. It’s brilliant!

Today I worked on getting the Hub automated with supervisor. My next project is to store the notebook pages in a Git repository, either GitHub or local to the server, and then refresh them whenever users log in to the Hub.
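
For the automation piece, supervisor only needs a small program section; a minimal sketch (the jupyterhub binary and config paths are placeholders for wherever they actually live on your system):

    ; /etc/supervisor/conf.d/jupyterhub.conf -- sketch, adjust paths to your install
    [program:jupyterhub]
    command=/usr/local/bin/jupyterhub -f /etc/jupyterhub/jupyterhub_config.py
    directory=/etc/jupyterhub
    autostart=true
    autorestart=true
    user=root

After that, “supervisorctl reread” followed by “supervisorctl update” (or restarting the supervisor service) should have the Hub starting automatically.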

Eventually I’ll use Git for notebook version management for all users, but one step at a time.


Email Oops

I woke up one morning, and checked my email as usual. All was good. A little later, I wanted to send a reply to one message.

It would not send. I kept getting “timeout on mail server” errors. I tried several things, and nothing worked. Finally, I called my email provider to ask if the mail server was in some way affected.

Nope. But then I got asked a series of questions about my config. Apparently “everything” was wrong with it. I made the changes they recommended, but hated them, as the password was now sent in plain text. Yuk. But… at least I could send email again.

Later in the day I was doing some other work, and had reason to open the taskbar notification area (Windows 7). I noticed something odd: the Pulse Connect icon showed it was active. I have to use Pulse to create a secure tunnel to AU in order to view exams that I mark. Usually I activate the tunnel, mark the exam and then disconnect. However, this day I saw that I was still connected.

Acting on a hunch, I disconnected the Pulse tunnel. Then I opened my email and reset the configurations to what I had before the morning phone call. Lo and behold, I could send email again with a secure password.

SO – the tunnel to AU was interfering with access to my email provider’s SMTP (send) server. Interesting. Something to note in case I do that again.

Notes from all over for Dec 22

Just some notes on stuff that’s happening as of Dec 22.

Linda’s Windows 10 computer, after a few configuration teething pains, is running quite well. Getting rid of the lock screen took 3 attempts as Microsoft is determined to foist this crap on users, even to the tune of disabling workarounds with each new update. It remains to be seen if my efforts will work for the longer term as MS is so very determined.

We did blow ‘edge’ away. It’s easily the worst browser I’ve ever seen. Basically, it has almost zero configuration options, and the few it does have it ignores. Gone forever and gladly back to Firefox. Likewise the default ‘mail’ app is gone and Thunderbird again rules the emails. Like edge, ‘mail’ is another MS app that can’t even play nice – not even with other MS things like Outlook. What a damaged, untested, unprofessional piece of crap.

I did install Office 2016 this week thanks to a “Home User Program” deal from MS. Because Athabasca U bought into the whole MS lock-in, we get to buy home versions for really cheap (like $13 for Office 2016 Pro!). It’s OK. I personally prefer Office 2003, because that was the last version without “THE RIBBON” – yet another unwanted MS user interface “update”.

As for my AU work, I can’t hear people on the phone very well, and certainly not upset people who may talk fast and in a higher register. After consultation with other AU academics, I bought “MagicJack” from the main website as it was on sale. It does come from the USA and took a while to arrive, and the free phone number is USA-only, but it does indeed do what it claims. I paid the extra $10 to get a CDN number (Edmonton exchange) and then had AU tie it to my academic 1-888 number. By yesterday it was all working tickety-boo. Better yet, any voicemail message gets emailed to me as an audio file so I can keep track. I can use a headset when calling anywhere in North America (free), so it’s awesome. Eventually I plan to see if it could replace most of the land line features, but not yet. First I want to see it in action for a while.

I bought a leak detector for my underwater camera, and it came after almost a month in the postal system. Still, not bad coming from Slovenia. It’s really well built and should provide extra protection against flooding for the big underwater camera system.

Speaking of which, the replacement Kraken ring light/strobe came a few weeks ago, and worked correctly right out of the box. Nice to know it wasn’t simply user error but rather some issue with the optical strobe sensor.

That’s all for now. Time for a Christmas break.

Merry Christmas to all, and a very Happy New Year!

Robot Builder (a new book)

I just received a new book from the publisher yesterday.

Robot Builder – The Beginner’s Guide to Building Robots

John Baichtal

Que books, 2015. ISBN-13: 978-0-7897-5149-2  ISBN-10: 0-7897-5149-6

Cost (back cover): $39.99 CDN, $34.99 US

I found it to be a very good resource for all aspects of robot building, and it has some really innovative ideas for robots that I have not seen in course projects to date. It is a good source of supplies as well.

Playing with Moodle

I’ve been playing with Moodle for a few years now. I first installed Moodle 1.9 on my server and then tried creating a couple of courses, which turned out to be fairly painless. The courses I chose were based on my notes from teaching C in the 1990s.

Fast forward to last year, when I tried updating to Moodle 2.0. All did not go well, as the “brains of Moodle” decided to remove some packages from Moodle itself, requiring the user to have them pre-installed on the server instead. I was able to find and install all but the zip file support, and without all the required packages, Moodle would not install.

I eventually upgraded the server to a newer version of the OS, which had the zip package. Once that was done, installation of Moodle 2.0 was quite painless. Unfortunately, I lost the courses in the process.

This week I decided to try adding the courses again. The new Moodle has a nicer look and feel, and much improved tools. It also has a plethora of options and choices, making some decisions much more difficult. Fortunately there is a really good help system that offers tips as you work, so deciding things like “page or lesson?” is reasonable.

In the end it took under two hours to create and fully populate my two courses (C I and C II) from my old MS Word notes. There are some quirks (why do some list entries appear with a shadow border and others without?), but all in all it was fun and I have my courses on-line again.

One reason for installing Moodle on my server and playing with it has to do with Athabasca University using Moodle as its primary course delivery mechanism. It pays to know the tools, but you can’t just do anything you like with someone else’s servers, so building my own allows me free rein to play and learn.

One thing I did learn, albeit too late for my current course revision, is that it’s really easy to create curriculum pages in Moodle.

At AU we have this blend of content – some in Moodle, some in a thing called Alfresco. Sadly, editing Alfresco content remotely is nigh-on impossible. For the current course, I had to resort to having the Alfresco content cut-and-pasted into MS Word documents by local experts, then emailed to me for editing, then cut-and-pasted back into the Alfresco documents. Yikes, what a process!

As I said, I wish I’d known how easy it was to create “pages” for content in Moodle, as I would have put all the content back into Moodle (via pages) and erased the Alfresco links. It would have turned a rough multi-week editing process into a few days’ work.

Live and learn.