Linux distros: your initial boot process is broken

EVERY LINUX DISTRO SHOULD BOOT TO NETWORK SSH LOGIN READY, especially those customized for hardware boards.

Nearly every new hardware board released in recent months comes with its own custom ‘distro’ (short for distribution) of linux. This would be a great thing, except that these distros are always crippled by one fatal flaw.

Specifically, every new hardware distro assumes that every user wants to plug in a keyboard, a monitor and maybe a mouse, fire up the new board, and CONFIGURE the thing from the ‘console’.

WHY? THIS IS THE 21st CENTURY.

Here is what I think should be the default for every new linux distro:

1. boot using DHCP. While I prefer static IPs on my own internal network, I still have a DHCP server ready for just this occasion. If the developer writes the MAC address on the board, I can even associate a specific IP with that MAC address.

2. boot with SSH active and provide a default root password. Even better: provide a default user account with a password, one which can use sudo or ‘su -’ (to root) with a supplied sudo or root password.

In other words, boot to network login ready. With that, I can take it from there and customize the config any way I want. (A rough sketch of what I mean follows this list.)

3. bonus round: boot to a graphical user interface if you like, so that those who simply must have a keyboard, mouse and monitor can still get their jollies.
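
For the distro builders wondering what that looks like in practice, here is a rough sketch of the kind of first-boot setup I mean, assuming a systemd-based image and an ISC dhcpd server on my end. The interface match, user name and passwords are placeholders; the point is that all of it can be baked into the image before the board ever ships.

    # /etc/systemd/network/20-wired.network -- bring up any wired interface with DHCP
    [Match]
    Name=en*

    [Network]
    DHCP=yes

    # enable networking and the ssh daemon in the image (service name varies by distro)
    systemctl enable systemd-networkd sshd

    # ship a documented default user instead of console-only root
    useradd -m -G wheel board                 # 'board' and 'wheel' are placeholders
    echo 'board:changeme' | chpasswd          # documented default password
    chage -d 0 board                          # expire it so the first login forces a change

    # and on my side, a dhcpd host entry pins the board to a known address,
    # assuming the vendor prints the MAC on the board
    host new-board {
        hardware ethernet 00:11:22:33:44:55;  # MAC from the label
        fixed-address 192.168.1.50;
    }

None of this requires a monitor, a keyboard or a mouse. The exact mechanics will differ between init systems and distros, but the result is the same: power it up, wait for DHCP, ssh in.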

Linux is ubiquitous, and why that’s not always a good thing

As someone who has been using and working with unix and unix-like operating systems since the early 1980s, I am growing increasingly frustrated with linux.

Linux has become the de facto industry-standard server platform for all things web, and certainly for any open source project. The problem is that everyone who develops on the linux platform seems to assume that because it’s ‘almost good enough’ with respect to security, developing with linux assumptions is good enough for everyone.

But that’s not true. It’s not that linux is insecure, but rather that many of the choices made in creating the popular linux distros entail less security than could be achieved. And there’s the problem. Try to install a ‘produced on linux’ product on a more secure operating system, or on an operating system with higher security settings, and the install will fail.

Examples include wordpress, moodle and elgg; all latest versions, and all of which fail to install on a stock (ultra-secure) OpenBSD system. The problem is with permissions, ownership and groups. In order to install one of the above packages on OpenBSD, one is forced to change groups and file permissions from secure settings to much less secure settings before the install will succeed.
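
To make that concrete: OpenBSD’s httpd runs chrooted under /var/www as the unprivileged www user, and a typical wordpress-style install only proceeds after loosening things roughly like this (the paths and exact commands are illustrative, and vary by package):

    # secure default: the document tree is owned by root, and the www account can only read it.
    # what the installers effectively demand:
    chown -R www:www /var/www/htdocs/wordpress   # hand the whole tree to the web-server account
    chmod -R u+w /var/www/htdocs/wordpress       # ...so it can rewrite its own files

In other words, the same account that serves requests can now modify the code it serves, which is exactly the property a hardened default is trying to prevent.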

It’s all very frustrating: taking an ultra-secure operating system and intentionally crippling some of its security just to get popular linux-developed packages to install and run.

It’s not that linux itself is necessarily at fault, but rather the typical developer mentality of “it worked on my machine, so the problem is you”. This attitude seems to pervade much of modern software development. And that is not a good thing.

Age, experience and fame

(originally posted nov 7, 2007)

Today I received another request from some publishing outfit to participate in one of their IT surveys. I deleted the email, but then pondered for a moment on the question of “why did I delete that?”, or more specifically “why did I delete that now?”.

When I was younger, I did these things all the time. I was flattered that they wanted my input. After all, when I started I was a kid with no “street cred”, so I was thrilled that anyone would ask for my participation. Somehow, I thought it showed that I knew something, or was somehow “famous”.

Well, not anymore. As I’ve matured in this industry, I’ve come to realize a couple of things. The first is that everyone wants your input. Of course, once it’s been filtered and blended and homogenized and extruded, your input is not worth the time you spent answering the questions.

Not only that, but I’ve also come to realize that, after 30+ years, I actually know quite a bit. That knowledge was often hard-won. The cost in terms of grey hairs and such makes this information all the more valuable.

I worked at a consulting company many years ago, and they had a very interesting pricing system. Simply doing the job was one cost. But if the client wanted the job done, PLUS training for their staff, then the job was priced as two items, the basic cost plus something they called “technology transfer”. Often, the technology transfer cost was significantly more than the basic job cost. When I inquired about this (I was pretty new at the time), I was told “work is cheap. Knowledge is expensive”.

That stuck with me, and has been borne out in all the rest of my career. I’ve spent a couple of decades as a consultant. One thing I’ve learned is that what I know is far more valuable than the jobs I do for clients. After all, if I can do a job faster and better than the competition, it’s probably because I have some tools and techniques (or experience and knowledge) that they don’t have. That is worth something.

So now, when someone asks me for advice or to fill in a survey, my first (internal) response is “what’s in it for me?”. Not in a mean way, but simply asking how my expertise is going to be valued, evaluated and compensated. Back then, I was flattered to be surveyed. I thought they were doing me a favor by seeking me out. Now, I know that it is the other way around. I am giving them knowledge based on my expertise and experience. If they want it, they will have to compensate me for the “technology transfer”.

Do the best you can

(originally posted nov 20, 2008)

I got one of those “Life’s Little Instruction Calendar” dealies a couple of Christmases ago. They are now on volume XIII (or whatever), and the lack of decent material really shows. There have been one or two good ones, but most were totally unmemorable. A few have caused me to write “B.S.” in pencil over the day’s trite saying.

However, the one for November 18 was just too much. It must be commented upon:

If you are doing a job you despise, do it as well as you can. Miraculously, you’ll discover it’s not as disagreeable as you thought.

This is not only B.S., it’s really, really BAD advice. If you follow this advice, you will discover, much to your horror, that you have become the “go-to” person for that job, FOREVER. Doing a good job at a horrible task labels you as the person who gets handed all the horrible tasks that no one else wants.

j.r. crofter’s advice would be:

If you are doing a job you despise, do the absolute worst job you possibly can. This will ensure that the next time this job is handed out, you will NOT be on the list of candidates for the task.

Unless, for some reason, you enjoy being a doormat for your boss and co-workers.

The Internet

(originally posted dec 3, 2008)

I hate the internet.

Well, actually, I love the internet.

I’m just glad as hell that I didn’t have the internet in:

  1. pre-school
  2. grade school
  3. junior high
  4. high school
  5. university

Because if the internet had existed in its present form back when I was doing 1-4 (above), then today I would most likely be over 50 (which I am), working at a job (when I could break away from the internet), asking “…would you like fries with that?”.

I’d also probably be over 1000lbs, unable to move (except to surf the web) and eat only doritos or cheesits or some similar plastic-cheeze flavored deep-fried extruded-paste pseudo-food.

Yep. Thanks to having NO INTERNET as a kid, teen and young adult, I actually got to do things like play outside, read books, and GRADUATE.

I’m also very grateful that there were no available computers when I was growing up (at least not until university, since I was in rural Canada), and ESPECIALLY no COMPUTER GAMES. Just thinking about all the time I wasted on computer games AFTER I had a career is scary enough. Imagine if I had had even the rickety games of the 1980s back in school. Scary stuff, kids!

PHP

(originally posted jun 8, 2010)

I am not very fond of PHP.

Really.

PHP is one of the “go to” languages for web development. Actually, I suspect it’s the “go to” language of the same bunch that embraced VB (visual basic) in the ’90s.

You know – the ones who can program well enough to get into REAL trouble, but not well enough to write code that is elegant.

Sure, most PHP code (like older VB code) works, but has almost no ability to handle anything outside the meagre boundaries of the original problem. Give it some weird input, and it crashes like Windows ME.

I’ve been resurrecting an older PHP project this week, and that’s what got me thinking about all this. The code I’ve found (it was buried in a rather non-obvious place on the server) works, but it looks like crap (visually). The logic is a true cobbled-together nightmare, with every single thing being a separate source file.

Worse, and this is perhaps the crux of my argument, the code is a mish-mash of programming “stuff”. And it’s 100% “good” PHP. There are regular expressions right next to weird function calls right next to cryptic commands, arguments and stuff that looks like it came out of some horrid bash script.

The real problem with PHP is that it’s a utility language, and that means it’s been cobbled together from bits and pieces of all the other utility languages that came before it… shell scripting, awk, grep (and other unix ‘stuff’), perl and who knows what other languages… all thrown together in a washing machine’s spin cycle to tumble around into PHP.

In short – horrid. Certainly in danger of becoming a “write only” language.

Software “Engineering”

(originally posted oct 9, 2012)

… is NOT Engineering. It’s not really software either. It’s mostly age-old project management drivel in a shiny new wrapper.

If you examine the discipline closely, you will notice it’s not really about software. Students may take a C++ course or two, but aside from a few projects they don’t delve as deeply into programming or programming topics as the older ‘Computer Science’ discipline does. What you get instead is methodologies. Not just any methodologies, but the newest and shiniest ‘agile’ type methodologies. Any exposure to older methodologies comes only as ‘bad example’ object lessons. In the end, you are not being taught software, you are being taught management. They might just as well call the program “Computer MBA”.

I took Engineering in the 80’s. Back then, there were four main disciplines: Chemical, Civil, Electrical and Mechanical. Each one had a common core for the first two years, in which everyone took the same courses. We all took Math (lots of math), Statics, Dynamics, Physics, Materials, Design and Drafting, Economics (yuk!) and options. After second year, if you passed, you went on to third and fourth year where you declared your specialty and took two years of courses specific to that discipline.

One thing was constant through all four years, no matter the discipline. You learned how to solve problems, not how to memorize or regurgitate classroom lectures. Almost every Engineering exam was open book, and most allowed you to bring in a ‘formula sheet’, not that it did much good. You were expected to take what had been discussed in the lectures and text, plus the assignment work, and extrapolate solutions to novel problems posed on the exams. Assignments were the same – adapt, derive, extrapolate, solve. Partial marks for showing your work were worth more than the correct answer. Knowing HOW to get the correct answer was as important as getting the correct answer. (In fact, if all you wrote down was the correct answer you would receive a mark of zero in some courses – showing the path to the solution was that important to some professors.)

So what does this have to do with so-called ‘software engineering’? Lots. People enrolled in software engineering at many colleges and universities do not have to take ANY core Engineering courses. They take some computer science courses instead, but only the ones with the ‘special’ appellation of “Software Engineering – xxx” in the name. Basically, they take methodology courses. While I’m sure this produces good methodology majors, it does not produce an Engineer. An Engineer solves problems. Engineers arrive at this capability by taking all those courses in their four years (especially in the first two years) that introduce Engineering and problem solving to the students, and require them to master some of these skills to advance.

Until and unless “software engineering” programs require all students in the program to take the first two years of Engineering core before branching into the software side of things, their graduates are emphatically NOT, in my opinion, Engineers.

Problem solving and design

Yesterday I read an interesting blog post:

http://www.yosefk.com/blog/why-bad-scientific-code-beats-code-following-best-practices.html

… although I think the heading “best practices” is a misnomer. I tend to agree with him, given that I too came from a scientific/engineering “solve the problem, not all the universe’s problems” coding mentality.

Too often programmers choose to generalize their design to allow for all contingencies, including some that may never occur. The problem is that in addition to extending the development time, the final product can become top heavy and difficult to debug and maintain.

Although I learned much from agile design (and have since rejected much of the fluff that goes with it), one thing I’ve always done in my own design is "write the simplest thing that will work". The trick, as they say, is deciding just what "will work" really means in the context of the problem to be solved.