Category Archives: General

Linux QoS Library (LQL) Released

It has finally happened. I have gotten a release of the Linux QoS Library (LQL) out the door.

Releasing software is actually a bit of a nerve-wracking process. The worst part is not creating the announcement emails or filling out Freshmeat’s forms; the worst part is worrying about what has been forgotten.

  • Missing files in the distribution? Hopefully, make distcheck covers that.
  • Bad or broken API documentation, e.g., spelling errors.
  • Not enough testing – What if it doesn’t work on other systems?
  • Design flaws – It is Free Software after all. Everyone can see your mistakes.

A big part of me would have liked to spend an indefinite amount of time getting a ‘perfect’ release, something I was really 100% happy with. However, that goes against the release early, release often strategy that Free Software uses to such great effect. Besides, I would probably never be 100% happy with the code base anyway. Perhaps the single most important reason for this release is to let others know that the project exists.

Announcement
The Linux QoS Library (LQL) provides a GPL-licensed, GObject-based C API to manipulate the network queueing disciplines, classes and classifiers in the Linux kernel. LQL does not use the TC command as a back-end. Instead, LQL communicates with the Linux kernel via Netlink sockets, the same way TC does.
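
To give a rough idea of the API’s flavour, here is a minimal sketch in C. The connection-handling names (lql_connection_new and friends) are assumptions for illustration only, not the documented 0.5.0 API; see the project page for the real interface.

/* Illustrative sketch; the lql_connection_* names are assumed,
 * not taken from the released API. */
#include <glib-object.h>

LQLConnection *con;

g_type_init();                  /* initialize the GObject type system */
con = lql_connection_new();     /* open a Netlink socket to the kernel */
if (con == NULL)
        g_print("Could not open a connection to the kernel.\n");
/* ... create and install qdiscs, classes and classifiers here ... */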

0.5.0 — 2004-08-30

  • Initial public release.
  • I wanted to get 100% API doc coverage and a lot more testing done before I made a public release, but I decided to go with the release early, release often strategy.
  • 86% API documentation coverage. A lot of the undocumented API is for the U32 classifier implementation, which I am not that fond of. I think this API will change quite a bit.
  • What LQL really needs is much more testing in larger applications.
  • I make absolutely no promises that any of the API will be stable. I expect the API to change as larger programs are built with it and new limitations (and bugs) are found.

Please see http://www.coverfire.com/lql/ for more information.

Download:
http://www.coverfire.com/lql/download/lql-0.5.0.tar.gz

Fundamentalist

A while ago I was listening to an interview about the IMF’s economic policies in Argentina. One of the people being interviewed (I don’t remember the name, unfortunately) offered a definition of a fundamentalist that I thought was insightful. What follows is only a paraphrase, as it has been several weeks since I heard the interview.

Fundamentalist: A person who believes in a set of rules or ideas so strongly that, even after powerful evidence that these rules are failing, the person thinks the only problem is that the rules are not being enforced strongly enough.

Weekend

It was a good weekend. Saturday was Kevin’s (Karen’s brother) Buck & Doe. I had a really good time. I got to be the barbecue guy, which is always lots of fun at a big party. I wish I knew how many hamburgers we cooked. The only bad part of the day was the mild sunburn that resulted; you would think I would know better by now. After some post-party cleanup, Karen and I spent Sunday afternoon on a few-hour motorcycle ride with my parents. The basic route was Sebringville, London, Woodstock and back via Embro Road. Sometimes I miss having my own motorcycle.

Today I received my copy of Mono: A Developer’s Notebook. I ordered it from Chapters in the middle of last week. I love buying books on the Internet. This book looks like a good quick-start guide to Mono and C#. Exactly what I need.

Secure remote backup

Every once in a while I see posts on mailing lists where people wonder about doing remote backups. I figured it might be worthwhile to describe how I have been doing my home workstation backups for the last few years. Hopefully this will be useful to someone.

I consider a backup system that requires frequent manual attention pretty much useless, mainly because it is hard to maintain proper backup discipline when busy or away. This is especially true of media-based backups. Swapping tapes or CDs to make a backup is annoying enough that the backup probably won’t get done as often as it should. Networks allow the backup process to be automated by having each system back itself up regularly and automatically to another host on the network. However, making backups to another host in the same building doesn’t help much when the building burns down. If you have computer equipment at two secure locations with a large pipe between them, automatic off-site backups are pretty easy. Unfortunately, most individuals are not this lucky. However, with the proliferation of broadband it is quite possible that you know someone with a big enough pipe that you could send backup data to them during off hours.

This remote computer may be owned by your best friend, but do you really want to make your backup data available to them? Even if you do trust this person, maybe they don’t look after their machine as well as you do; their computer could be cracked, etc. Clearly the remote system needs to be considered untrusted. The data is going to have to be encrypted.

My backup script basically does the following (a simplified sketch follows the list):

  • Create a tarball of all of the data to backup.
  • bzip2 the file to make the transfer shorter.
  • Run GnuPG to encrypt the file to myself.
  • Use SCP to transfer the file to the remote system.
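
Here is a simplified sketch of that pipeline as a shell script. The key ID, host name and paths are placeholders, not the configuration from the real backup.sh:

#!/bin/sh
# Simplified sketch of the backup pipeline described above.
KEYID="me@example.com"        # OpenPGP key to encrypt to (placeholder)
HOST="friend.example.com"     # untrusted remote host (placeholder)
FILE="backup-`date +%Y%m%d`.tar.bz2.gpg"

# tarball -> bzip2 -> GnuPG -> scp, as in the list above
tar -cf - /home/me | bzip2 |
    gpg --encrypt --recipient "$KEYID" --output "/tmp/$FILE" &&
    scp "/tmp/$FILE" "$HOST:backups/"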

This requires that you have an OpenPGP key (via GnuPG) and SSH (SCP) access to the remote host. Transferring the file with some other, less secure method shouldn’t reduce the security of the system too much. The only problem would be if someone sniffed your authentication information and then deleted the files from the remote host. Since the files are encrypted, downloading them doesn’t do the bad guy any good.

This system is not suited to backing up your media library, mostly because of bandwidth limitations but also because incremental backups are not possible; the entire backup is sent every time.

Though the point of this entry was just to put the idea of doing backups this way out there for Google to index, I have made a copy of my backup.sh available. The script is quite simple but should provide a good starting point for anyone interested in taking the implementation further. This particular script is set up to do daily and weekly backups. It has two configuration options that specify plain-text files containing lists of directories to exclude from the daily and weekly backups (see man tar for the exclude-file format). What I do is exclude everything but frequently changing directories from the daily backup and only exclude media directories from the weekly; an example follows.
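
The exclude files are just one tar pattern per line. For example, a daily exclude list might look like this (the paths are made up for illustration):

home/me/music
home/me/photos
home/me/archive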

There is one obvious catch-22 with this system. Your GnuPG keys are stored in ~/.gnupg, and this directory is backed up and encrypted with those keys. If your computer is lost, the only copy of your data you have left is encrypted, and you now have no way to decrypt your backup. So, you need to keep a separate backup copy of your GnuPG keys somewhere else. Since you have a pass-phrase on your key (you had better, anyway), these files are already encrypted.
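
GnuPG itself can write out an ASCII copy of the secret key to store somewhere safe (the file name is arbitrary):

gpg --armor --export-secret-keys > secret-keys.asc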

In order to make this backup system automatic (and hence useful) it needs to be able to transfer the backup file without user intervention. With SCP this can be accomplished by creating an SSH key pair with no pass-phrase. This allows the host holding the keys to log in to the remote host without a password, i.e., without user intervention. Then the SSH_OPTIONS variable in the script can be modified to point SCP at this key. Now you can set the script up as a cron job and get your backups done automatically every night. MD5 sums are used to verify the successful transfer of the backup, and the script will also email you if the backup failed.
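
For example (the key path and schedule are illustrative, not from the script):

ssh-keygen -t rsa -N "" -f ~/.ssh/backup-key   # empty pass-phrase
# in backup.sh: SSH_OPTIONS="-i $HOME/.ssh/backup-key"
# crontab entry to run the backup at 03:30 every night:
30 3 * * * /home/me/bin/backup.sh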

This script could be made a bit smarter so that it deletes old backups from the remote host; it does not do that right now. You’ll have to log in to the remote host once in a while to delete old backups. How often you need to do this depends on how much space the remote host has available.
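
If you wanted to automate the cleanup, something along these lines could prune backups older than 30 days (an untested sketch; the host and path are placeholders):

ssh friend.example.com 'find backups/ -name "backup-*" -mtime +30 | xargs rm -f'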

Testing Meme Propagation In Blogspace: Add Your Blog!

This posting is a community experiment that tests how a meme, represented by this blog posting, spreads across blogspace, physical space and time. It will help to show how ideas travel across blogs in space and time and how blogs are connected. It may also help to show which blogs are most influential in the propagation of memes. The dataset from this experiment will be public, and can be located via Google (or Technorati) by doing a search for the GUID for this meme (below).

The original posting for this experiment is located at: Minding the Planet (Permalink: http://novaspivack.typepad.com/nova_spivacks_weblog/2004/08/a_sonar_ping_of.html) — results and commentary will appear there in the future.

Please join the test by adding your blog (see instructions, below) and inviting your friends to participate — the more the better. The data from this test will be public and open; others may use it to visualize and study the connectedness of blogspace and the propagation of memes across blogs.

The GUID for this experiment is: as098398298250swg9e98929872525389t9987898tq98wteqtgaq62010920352598gawst (this GUID enables anyone to easily search Google (or Technorati) for all blogs that participate in this experiment). Anyone is free to analyze the data of this experiment. Please publicize your analysis of the data, and/or any comments by adding comments onto the original post (see URL above). (Note: it would be interesting to see a geographic map or a temporal animation, as well as a social network map of the propagation of this meme.)

INSTRUCTIONS

To add your blog to this experiment, copy this entire posting to your blog, and then answer the questions below, substituting your own information, below, where appropriate. Other than answering the questions below, please do not alter the information, layout or format of this post in order to preserve the integrity of the data in this experiment (this will make it easier for searchers and automated bots to find and analyze the results later).

REQUIRED FIELDS (Note: Replace the answers below with your own answers)

(1) I found this experiment at URL: http://planet.gnome.org

(2) I found it via “Newsreader Software” or “Browsing the Web” or “Searching the Web” or “An E-Mail Message”: “Browsing the Web”

(3) I posted this experiment at URL: http://www.coverfire.com

(4) I posted this on date (day, month, year): 04/08/04

(5) I posted this at time (24 hour time): 08:55:00

(6) My posting location is (city, state, country): London, ON, CA

OPTIONAL SURVEY FIELDS (Replace the answers below with your own answers):

(7) My blog is hosted by: Self

(8) My age is: 26

(9) My gender is: Male

(10) My occupation is: Student

(11) I use the following RSS/Atom reader software: Blam

(12) I use the following software to post to my blog: WordPress

(13) I have been blogging since (day, month, year): 25/05/2004

(14) My web browser is: Epiphany

(15) My operating system is: Linux

Linux QoS API update

My Linux QoS library is coming along nicely. It took me a few days, but I changed the whole API to be based on GObject. The main reason for this time-consuming change is to make bindings to other languages easier; primarily I am interested in Mono and Python bindings. As much as I love good old C, I have no intention of building a GUI application in C if I don’t have to.

I am now building small apps that use the API and discovering the features that are needed but that I didn’t think of before.

Everyone who develops software using free software libraries should take a moment to thank the developers for their time and hard work. Developing a sane API is hard.

Political Compass

Political Compass is an interesting site. It asks you to fill out a short questionnaire and then gives you a rating of where you stand on the political map. For reference, the site also gives the values for some famous people like Hitler, Stalin and Gandhi.

My location according to this site is:
Economic Left/Right: -4.38
Social Libertarian/Authoritarian: -5.85
which puts me in the same area as Gandhi. Since I don’t know that much about Gandhi’s politics, I really don’t know what that means.

It’s probably not wise to take this thing too seriously, but I would have thought myself to be much closer to Paul Martin than this graph shows. Also, I certainly do not think collectivism is the best way to run the economy, but somehow I ended up towards that part of the scale.

[Image: political compass map]

Ironic

It’s particularly ironic that a few hours after I posted my little rant on police fund-raising, I discovered that the driver’s-side window had been smashed out of Karen’s car and the steering wheel ripped apart. Looks like some pretty stupid thieves. It was around midnight when I noticed this. At ~05:00 I was awoken by the sound of another car window being smashed. I called the police. Hopefully they caught someone.

Update: Two other cars in the parking lot had windows smashed too. As far as I know, none of the cars were actually stolen. Pretty lame thieves.

Linux QoS API

My wrapper library for the Linux QoS system is coming along nicely. Here are the function calls necessary to add a filter that matches the destination address field in the IP header.

filt = classifier_u32_new();                       /* new U32 classifier */
classifier_u32_set_class(filt, class);             /* class matching packets go to */
classifier_u32_set_priority(filt, 5);
classifier_u32_set_protocol(filt, IP);
classifier_u32_set_interface(filt, 3);             /* interface index */
classifier_u32_add_match_ip_dst(filt, "x.x.x.x");  /* destination IP to match */
if (!qos_classifier_u32_add(con, filt)) {
	g_print("Adding filter failed.\n");
}

I am now trying to figure out how to handle the representation of the currently installed filters and classes in the API. Somehow the application needs a representation of these things so it can make changes. Since the structure of the queueing disciplines, classes and filters is very much like a tree, I think a tree may be the best way to go.
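
As a sketch of what I mean, one possible shape for such a tree node (these names are not part of LQL; they are just an illustration using GLib types):

/* Hypothetical tree node; not part of the LQL API. */
typedef struct _QosNode QosNode;

struct _QosNode {
        enum { NODE_QDISC, NODE_CLASS, NODE_FILTER } kind;
        GObject *object;    /* the qdisc, class or filter this node wraps */
        QosNode *parent;
        GList   *children;  /* list of QosNode pointers */
};

Each installed queueing discipline would be a root, its classes the children, and filters attached below that.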

Canadian Election

Well, it looks like there will be a minority Liberal government in Canada this term. Considering the polls said the Conservatives might form a minority government, the final numbers are pretty surprising. For anyone reading this outside of Canada: the Liberals have held a strong majority in parliament for the last three terms.

Ontario, Canada’s largest province, is the traditional Liberal stronghold. Ontario elects ~1/3 of the seats in parliament, so this support has translated into a lot of seats. There have been a few screw-ups in the government recently that have hurt the Liberals pretty badly. Combine this with the fact that the traditionally divided political right united into one party for this election, and things could have spelled doom for the Liberals. Indeed, the polls showed the Liberals and Conservatives running neck and neck. Why were the pre-election polls so far off? Here is my little theory on what happened. Ontarians didn’t want to elect the Conservatives, but they did want to punish the Liberals for their problems. So some Ontarians told the pollsters that they were not going to vote Liberal, but when the day finally came their support fell on the traditional side. I am a strong Liberal supporter, but at the start of the election I probably would have done the same. Besides, lying to pollsters is fun.