Tag Archives: Computerworld

Migrating from commercial software

I'm sure we all remember the colourful discussion on Exchange and open source alternatives (start here for my final post on the topic) that took place between myself and Alex Scoble just a few weeks back.

Lots of interesting comments, from both sides, came out from that discussion, and like all good debates there were no real winners.

Now, there's another active conversation (I use the term loosely) on the topic taking place on Slashdot after the release of the Zimbra Collaboration Suite. Inevitably some of the same topics are coming up - including the down, and up, sides of Exchange's data model, hardware requirements and flexibility. All of them come with fairly good comments and observations from both sides of the fence.

Attitudes to Rita = attitudes to IT management

Thankfully the damage caused by Rita was not as significant as people had feared. It caused huge amounts of damage in some regions, but nothing like the devastation that was feared had it hit Houston directly.

What was interesting this time was the response from people to get out of the areas most likely to be affected. I have, however, seen a few interesting comments from those who stayed, including the statement from one woman who said that the potential threat was overrated and that she was glad she had stayed, especially compared to having to spend hours in queues of cars getting out of, and then back into, the city.

Get up to speed on Grids

I love grid technology; I've been using it for years without really knowing what it was, and now I spend part of my time explaining to people how it can best be used and how to develop grid-based applications.

Sometimes explaining the benefits is difficult; often explaining the fundamentals is even more complicated, because the basics of how a grid system works are completely different to how we have been using computing power for the last 30 years.

In reality of course it is only an evolution of the computing technology that is available. In the early days we used monolithic mainframes, then we moved to desktops, through client server, and we've even dabbled in terminal services (something I pondered about earlier this week).
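The core idea behind a grid can be shown in miniature: carve one large job into independent chunks and farm them out to whatever compute is available. Here's a toy sketch in Python that uses `multiprocessing` as a single-machine stand-in for real grid nodes (the example task, counting primes by trial division, is just something conveniently divisible, not anything grid-specific):

```python
# Toy illustration of the grid idea: split one large job into
# independent chunks and farm them out to separate workers.
# multiprocessing is a single-machine stand-in for real grid
# nodes, but the carve-up-and-collect pattern is the same.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) - one independent work unit."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    # Carve 0..100,000 into ten chunks, one per work unit.
    chunks = [(i * 10_000, (i + 1) * 10_000) for i in range(10)]
    with Pool(4) as pool:
        results = pool.map(count_primes, chunks)
    print(sum(results))  # total primes below 100,000
```

On a real grid the chunks would be dispatched to separate machines rather than local processes, but the programming model is the same: no shared state between work units, results combined at the end.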

Peer to peer power

As we use more and more computers, power, or rather producing it, is becoming a big issue, with environmental and political angles, but more critically simple physical and mathematical ones.

I know in the US, West Coast especially, brownouts and power cuts are becoming quite common. We're not quite there yet in the UK, but there are rumours and issues bubbling up about where exactly we are going to get all the power we are demanding here in the UK.

Years ago, in New Scientist, I came across the idea of Combined Heat and Power (CHP) Boilers. These work pretty much like the boilers we have in nearly all homes here in the UK, and use Natural Gas to heat water which we use in our baths and showers and which provide the necessary hot water for our central heating system. I'm no expert on US heating systems, but I suspect the same basic idea is used in American homes.

Five common IT manager mistakes

There's a great piece here on Five common mistakes that Linux IT managers make which is more than just the usual top five mistakes.

To summarize the top five, they are listed as:

  1. Reactive, not Proactive
  2. Failing to emphasize documentation and training
  3. Failing to assess strengths and weaknesses
  4. Too much, too quickly
  5. Security as a secondary priority

On the whole there is nothing here that should surprise most managers; where this article differs is that it relates many of the issues back to the Linux/open source community and the issues faced by managers and their organizations.

Getting kids into programming

If we want to get new blood into the programming market, we need to start with kids when they are young.

If I had to suggest something, I'd recommend Python. It is easy to learn, object based, and the techniques you learn could easily be transferred to Java (or use Jython) or C++.
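To give a flavour of why Python works as a first language, here's the sort of first program a kid could type in and then start changing (a guess-the-number game; the split into a small function plus an interactive loop is just one way to write it):

```python
# A first program a child could type in and change: guess-the-number.
# Python reads almost like English - no boilerplate stands between
# a beginner and the fun part.
import random

def hint(secret, guess):
    """Tell the player how their guess compares to the secret number."""
    if guess < secret:
        return "Too low!"
    if guess > secret:
        return "Too high!"
    return "You got it!"

if __name__ == "__main__":
    secret = random.randint(1, 20)
    guess = None
    while guess != secret:
        guess = int(input("Guess a number from 1 to 20: "))
        print(hint(secret, guess))
```

The same structure, a function plus a loop, translates almost line for line into Java or C++ later on, which is part of the appeal.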

But Python isn't for everybody. So why not introduce them to programming with their very own programming language?

Take a look at KPL. It's a new language designed to give quick gratification to kid programmers. It's loosely based on BASIC (although parts of it felt more like Modula-2, a Pascal derivative, to me...), but the theory remains sound.

Blog search spam

Google announced their Blog Search system last week.

Unfortunately, it is already succumbing to that annoying 'use our catalog system instead' type of spam (and sorry, but it is, essentially, a form of spam) that uses adverts and affiliate information to drive sales by ensuring that their pages appear at the top, just because they've mentioned the search term three million times on their home page.

If you want an example, search for 'linux terminal server schools', which I was searching for while looking for a story about linux terminal services in schools for my previous post.

Setting up a Linux terminal server

For some organizations, the upgrade costs from their older hardware are high enough that they don't bother to upgrade, but they still want access to newer software and faster, more responsive systems.

Using terminal services is an obvious solution...

The principles of terminal services are nothing new in the computing world. Ignoring the networking and technological aspect, the basics of terminal servers go back to the big, bad old days of massive single computers (the "mainframe") and hundreds of individual green-screened terminals connected to them over serial cables.

This was how computing was thirty years ago and why so many people are used to the thought of a single computer being responsible for everything, leading to numerous questions from users as to what the 'box' is under the monitor.

Perl best practice

I love Perl, and I'm not afraid to admit it, but because it is such an easy to use language it is very easy to fall into some bad practices.

Damian Conway has written this excellent little guide to Perl best practices.

I don't agree with the order of what he suggests; mostly because I think most Perl programmers don't follow the model that would make it practical.

Top-down development is great: mapping out and planning all of the different elements in advance. But Perl makes it so easy to go bottom-up, starting with features and functionality and then back-hacking the system into a module with testing and other bits added afterwards, that designing the module interface first is probably not how most Perl code actually gets written.
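The bottom-up workflow is easy to show in miniature. This sketch is in Python rather than Perl (the equivalent in Perl would be a module plus Test::More), but the back-hack step is the same: a function that started life as a throwaway line in a script gets wrapped in a module, and the tests are bolted on after the fact as doctests:

```python
# Bottom-up in miniature: slugify() started as a throwaway one-liner
# in a script; wrapping it in a module with doctests is the
# "back-hack" step - interface and tests added after the fact,
# not designed up front.

def slugify(title):
    """Turn a post title into a URL slug.

    >>> slugify("Perl best practice")
    'perl-best-practice'
    >>> slugify("  Standardizing   Linux ")
    'standardizing-linux'
    """
    return "-".join(title.lower().split())

if __name__ == "__main__":
    import doctest
    doctest.testmod()
```

It works, and it's how a lot of small modules come into being, but it's the opposite of the interface-first order a best-practices guide would have you follow.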

Standardizing Linux

A constant battle exists between the flexibility and choice in the Linux marketplace (hence the hundreds of different Linux distributions) and standardizing on a suite of tools, libraries and components that make the deployment of a Linux application and management techniques that much easier.

If you look around for pre-packaged (that is, pre-compiled) software from companies that either don't want to release through open source (Oracle, IBM and others) or want to distribute an easy-to-use version of their software, you'll find that they often list a variety of different versions for each of the main Linux distributions (SUSE, Red Hat etc.). Have a non-standard distribution and you can have issues. The reason for these different versions is that seemingly inconsequential differences between Linux distributions (such as different library versions, or missing libraries) can make the difference between an application working and failing.
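The C library version is the classic example of one of those inconsequential-looking differences: a binary linked against a newer glibc than the target distribution provides will simply refuse to start. As a small sketch, Python's standard `platform` module can report which glibc the running interpreter was built against (it returns empty strings on non-glibc systems such as macOS or Windows):

```python
# Pre-compiled binaries are linked against specific shared library
# versions - glibc above all.  Run them on a distribution with an
# older libc and the dynamic linker refuses to load them.
# platform.libc_ver() gives a portable peek at the glibc version.
import platform

def describe_runtime():
    """Return (libc_name, libc_version) this interpreter runs against."""
    name, version = platform.libc_ver()
    return name, version

if __name__ == "__main__":
    name, version = describe_runtime()
    if name:
        print(f"C library: {name} {version}")
    else:
        print("Not a glibc system (or version not detectable)")
```

Multiply that one check across every shared library an application links against and you can see why vendors end up shipping a separate build per distribution, and why standardization efforts keep coming up.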