Keep it Simple

KISS


Have you ever needed to build a web site with dynamic CMS features and capabilities, but felt that using one of the usual suspects (Drupal, Joomla!, WordPress, etc.) would be the equivalent of using a sledgehammer to kill a fly?

That’s certainly the feeling I had when I finally got around to refactoring a small site of mine that has been on the Internet since 1997.  There had been numerous updates and tweaks over the years, but the site was still based on static HTML pages and a bunch of customized (and poorly organized) JavaScript.  Adding or modifying content was a manual process.

I had almost talked myself into using WordPress, since all I really needed was the framework and the equivalent of the WordPress Page content type.  But at the last moment I stumbled upon a lightweight CMS called GetSimple.  It is advertised as “The Simplest Content Management System Ever” and appeared to have the basic features and capabilities I was looking for, so I downloaded a copy and installed it in a subdirectory off my web server’s root.

Next came the configuration, which was a simple one-page fill-in-the-forms exercise.  So far, so good.

I then did my typical thing, which is to pretend that I am all-knowing and proceed without bothering to read the documentation.  The first order of business was to develop a theme for the site, which I did by using GetSimple’s default theme as a learning tool and modifying it to meet my needs.  GetSimple themes are similar to those used in Drupal and WordPress: they use PHP function calls to retrieve the raw content, along with XHTML markup and cascading style sheets (CSS) to manage the content layout and presentation.  Here’s an example extracted from the default template.php file:

<div id="bodycontent">
    <div class="post">
        <h1><?php get_page_title(); ?></h1> 
    <div class="postcontent">
        <?php get_page_content(); ?>
    </div>
</div>

In GetSimple templates, breadcrumbs, sidebars, and other similar constructs are called components.  They are generated by the get_component() function, as shown in the following example:

<?php get_component('sidebar'); ?>

Now that I had become an “expert” GetSimple theme designer, the next step was to generate some data.  GetSimple has only one content type, and it is aptly named “Page.”  However, a different template can be assigned to each page, so for all practical purposes one can create any number of content types by developing different templates.  A number of other options are also associated with each page; most are typical of those found in many other CMSs, and they can be seen in the page’s XML data file shown further below.
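
Just to make the idea concrete, here is a sketch of what a hypothetical page-specific template, say one for a contact page, might look like.  It reuses only the template functions shown above; the 'contactform' component name and the surrounding markup are my own assumptions, not anything shipped with GetSimple.

<div id="bodycontent">
    <div class="post">
        <h1><?php get_page_title(); ?></h1>
        <div class="postcontent">
            <?php get_page_content(); ?>
        </div>
        <!-- hypothetical component assumed to hold the contact-form markup -->
        <?php get_component('contactform'); ?>
    </div>
</div>

Assign this template to the contact page and the default template to everything else, and the site effectively has two “content types” without the CMS knowing anything about it.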

GetSimple differs from a number of today’s open source CMSs in that it does not use MySQL or PostgreSQL as its DBMS.  Instead, it stores database entries as XML-formatted flat files in the data subdirectory of the GetSimple root.  The example shown below is the XML file for a Contact Us page based on a template named bbcontact.php.

<?xml version="1.0" encoding="UTF-8"?>
<item>
    <pubDate>Sun, 21 Feb 2010 09:25:34 -0500</pubDate>
    <title><![CDATA[Contact HR's Big Box]]></title>
    <url><![CDATA[contact-hrs-big-box]]></url>
    <meta></meta>
    <metad></metad>
    <menu><![CDATA[Contact Us]]></menu>
    <menuOrder><![CDATA[5]]></menuOrder>
    <menuStatus><![CDATA[Y]]></menuStatus>
    <template><![CDATA[bbcontact.php]]></template>
    <parent></parent>
    <content></content>
    <private></private>
</item>
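
GetSimple’s own data-access layer isn’t shown here, but to give a feel for how approachable this flat-file format is, here is a minimal, purely illustrative PHP sketch (not GetSimple’s API) that reads a page file like the one above using the SimpleXML extension.  The file path is an assumption about a typical install.

<?php
// Illustrative sketch only -- not GetSimple's internal code.
// Parse one page file and print a few of its fields.
$file = 'data/pages/contact-hrs-big-box.xml';   // assumed location

$page = simplexml_load_file($file);
if ($page === false) {
    die("Could not parse $file\n");
}

// Casting an element to string returns its text, including CDATA content.
echo 'Title:    ' . (string) $page->title      . "\n";
echo 'Menu:     ' . (string) $page->menu       . "\n";
echo 'Template: ' . (string) $page->template   . "\n";
echo 'In menu:  ' . (string) $page->menuStatus . "\n";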

That’s about it.  I’m not going to go into any more detail here, but will mention that GetSimple’s capabilities also include friendly URLs, WYSIWYG editing, file uploading, automatic page backups, and scheduled events via cron.  As of the latest release it also supports plugins, but there are not very many available at this time.

In conclusion, I found GetSimple to have an almost flat learning curve, a simple and easy-to-understand administrative user interface (UI), and more than adequate features and capabilities to support my needs for this particular site.  I have no real basis for judging how scalable GetSimple is, but I would expect it to be more than capable as a platform for a small site with basic content management needs.

Goodbye Old Friend


I recently completed a major update of my oldest internal Linux server here at hrpr.com.  The server, whose name is Doofus, is hosted on a Dell Dimension XPS T500 that has been in service here since April 1999.

For the last eight years, Doofus has been running Slackware, which is a hard-core, no-frills, very stable Linux distribution that has been around since dirt was new.  If you aren’t on speaking terms with command line shells such as Bash and are not familiar with the Linux Directory Structure and its contents, then Slackware is probably not for you. 

Back to the update stuff…

I was getting ready to download the latest and greatest version of Slackware and install it on Doofus when it occurred to me that I had recently downloaded the latest version of Ubuntu Linux and made an installation CD-ROM from it.  It was the Desktop Edition, but for a local test server it would do just fine.  I was also curious to see how GNOME would perform on a Pentium III with 256 MB of RAM.  After all, I was essentially starting with a blank slate and could revert to Slackware if I really wasn’t satisfied with Ubuntu.

I did the Ubuntu install and, as usual, it was fast and uneventful.  Doofus rebooted and up came the GNOME GUI.  And surprise, surprise!  It was quite responsive: the mouse tracked smoothly and GUI applications ran fast enough to meet my “wait-time” expectations.  Not that it mattered much, since Doofus would be running 99.99999% of the time as a server with remote administrative access via SSH and/or Webmin.

I installed the LAMP stack and a BOINC client for running SETI@home tasks.  And that was it: Doofus was back online doing its thing.

I am still a Slackware fan and highly recommend it to anyone who enjoys tinkering at the command line and/or wants to learn Linux from the ground up.  However, Ubuntu brings added value such as automatic notification of security updates, a more feature-rich package manager, and other administrative tools that make life a bit easier.

And who knows, next week I may change my mind again and install Slackware Linux on Doofus yet one more time.

When Half is Not Enough


Back in the day, when the TCP/IP protocol suite was the new kid on the block, one of the classic issues was how to deal with half-open connections:

A TCP connection is considered “half-open” when one party thinks the connection has been closed and the other party thinks the connection is still open.

One solution to the half-open connection problem is the old “When in Doubt, Time Out” rule: if no packets have been sent or received on an existing connection for some preset period of time, unilaterally close the connection.  Not elegant, but it works great.
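
To make the rule concrete, here is a small, purely illustrative PHP sketch of a client applying it.  The host name and the 30-second limit are arbitrary choices of mine, not anything prescribed by the TCP specifications.

<?php
// Illustrative sketch: apply "when in doubt, time out" on the client side.
$host    = 'portal.hrpr.com';
$timeout = 30;   // arbitrary limit on how long we tolerate silence

// Give up if the connection cannot even be established in time.
$fp = fsockopen($host, 80, $errno, $errstr, $timeout);
if ($fp === false) {
    die("Connect failed: $errstr ($errno)\n");
}

// Give up on an established connection that goes quiet.
stream_set_timeout($fp, $timeout);

fwrite($fp, "HEAD / HTTP/1.0\r\nHost: $host\r\n\r\n");
$response = fread($fp, 512);

$meta = stream_get_meta_data($fp);
if ($meta['timed_out']) {
    // The other side may still think the connection is open;
    // as far as we are concerned, it is now closed.
    echo "No data within $timeout seconds -- closing.\n";
}
fclose($fp);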

The Internet has grown exponentially over time, and TCP/IP has become one of those things that is just there and works the way it is supposed to 99.999999 percent of the time.  Today’s average PC user really doesn’t think or care about half-open connections as defined above anymore, or even about how TCP/IP does what it does!

Why am I writing about this?  Well, over the past week or so, connections to two of my web sites have been bouncing, i.e., the monitoring service I use has been reporting intermittent loss of connectivity:

Alert Type: Site Not Available
Result: Failed
Time: August 11, 2009 10:39:23
HostName/URL: portal.hrpr.com
Monitor Name: hrpr.com
Service: http

Alert Type: Site is Available
Result: Ok
Time: August 11, 2009 11:09:24
HostName/URL: portal.hrpr.com
Monitor Name: hrpr.com
Service: http

I opened a trouble ticket with my host provider and, after a few back-and-forth conversations with them, ended up with the following more or less “final” answer on the root cause of my problem.  Although not mentioned below, the implication was that the PHP scripts used by my sites were exceeding the server’s PHP memory and/or processing time limits, and I knew this wasn’t the case:

Well, unfortunately, I have to maintain that the problem lies elsewhere at this time. Your server is rock solid stable, your apache instance is unwavering, and the ONLY reason I’ve found, other than an external connection issue, somewhere between yourself (or your monitoring service) and our servers, is the process watcher entries. If this were our issue, I’d be very quick to point it out. We’re not really about shifting blame here…check out http:// for shining examples of our "we screwed up, we’ll fix it" mentality.

However, in this case, the two possible issues are a bad connection between yourself and our servers, and the process watcher/resource issues. Again, if you want a useful diagnosis, we’ll need to see a traceroute taken at the time that the problem is occurring. I never suggested you had networking issues anywhere else…but incorrect routing, or network issues along the specific path that your connection takes can cause a perceived interruption in service. You may also try checking your site via proxy such as http:// the next time it appears to be down.

Let me know if you have any other questions!

After reading this response, please consider visiting the URL below to comment on its quality.

Thanks!

http://…

To which I replied:

OK, I give up, you win. I’ll continue trying to isolate it on my own.  (Plus a few other choice words I will not repeat here.)

As suggested, I did visit the "URL below" and sent a comment on the quality of the response.  As you might suspect, my comment was somewhat negative.

A few hours later, I got this in an email from the host provider’s support team:

I think I found your issue. I apologize that this was over-looked but this actually isn’t supposed to happen as on our newer servers we have implemented FTP connection limits to stop them from piling up. Basically I saw your user had a huge number of stale FTP connections.

There was about 100 of them. Our process watcher will kill your processes if you have too many processes running and that’s why your processes were being killed (99% of the time they are killed for memory usage not process count). Our process watcher won’t kill FTP or shell connections as its hard to tell if the connections are legit or not but they are still counted towards your total process limit.

I killed all your stale FTP connections so I think you won’t be getting killed by our process watcher anymore (or not nearly as much). It looks like all your processes that were killed were due to the process count limit (not memory).

I apologize that this was over-looked by … but I can understand why as this isn’t even possible on our newer servers because we limit the FTP connections to 7 total by IP or account and again 99% of the time processes are killed for memory usage not process count. Please feel free to email me at … if you are still running into these issues and think this issue might have popped back up.

In 25 words or less: the intermittent connectivity problems were being caused by half-open (stale) FTP connections.

The host’s FTP server thought the connections were still open, while as far as my FTP client was concerned they had long since been closed.  The nasty side effect was that those stale connections still counted toward my process limit, so other processes, such as those spawned by the HTTP server, were getting killed off.
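
Purely for illustration, here is a hedged PHP sketch of an FTP client that sets a timeout and closes its connection explicitly instead of wandering off.  The host name and credentials are placeholders.

<?php
// Illustrative sketch only; host and credentials are placeholders.
$host = 'ftp.example.com';
$user = 'someuser';
$pass = 'secret';

// Refuse to wait forever for the control connection.
$conn = ftp_connect($host, 21, 30);
if ($conn === false) {
    die("Could not connect to $host\n");
}

// Apply the same 30-second limit to subsequent operations.
ftp_set_option($conn, FTP_TIMEOUT_SEC, 30);

if (!ftp_login($conn, $user, $pass)) {
    ftp_close($conn);
    die("Login failed\n");
}

// ... transfer files here ...

// Say goodbye properly instead of leaving a half-open session behind.
ftp_close($conn);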

What’s the moral to this story?  Well, there really isn’t one.  I just wanted to write this down somewhere as it brought back memories to me of days gone by.