Category Archives: General

CBC and Internet streaming

George Farris alerted me to a change in the way CBC does radio streaming on the Internet. Previously, CBC supported QuickTime, RealPlayer and Windows Media Player. As of August 30, 2004, they support only Windows Media Player. CBC has an announcement about this change, which they ironically call “Live CBC Radio streaming improvements”. You can find the link to it on this page.

From the perspective of a desktop Linux user this decision is bad. There is no easy way to listen to Windows Media streams on Linux. RealPlayer was not open source, but at least it worked on Linux. Since the CBC audio streams were the only reason I had RealPlayer installed, I guess I can remove it now. I always felt bad about having it installed anyway.

From a free software perspective this decision doesn’t really change much. Both Real and Windows Media are proprietary formats. I wish there were some way to make the general computing audience understand that free does not equal standard. A standard way to stream audio or video over the Internet would allow anyone to do their own implementation. The free downloads offered by Microsoft and Real in the form of Windows Media Player and RealPlayer are just a way to lock the user into their technology. Standards that allow others to innovate drive the world forward, not proprietary lock-in.

There is hope for standards-based Internet media. Xiph.Org is building a set of truly free audio and video codecs. Also, Fluendo is building a free software streaming media server that will use the free codecs from Xiph.Org.

Queueing

A little while ago I asked a friend to proofread my LQL project page. We ended up in a discussion about the proper spelling of “queueing”. Bob blogged about this conversation over here, so I won’t bother repeating the outcome. However, I would like to add this one link that provides additional information.

Bell Canada call answer tip

I can’t believe I didn’t realize this before. If you have Bell Canada‘s call answer service, you can speak to the person who is leaving a message by pressing the flash or link button on your phone. This makes voice mail more effective for screening calls, and it lets you speak to the caller even if you couldn’t get to the phone quite fast enough.

Statistics Canada

Statistics Canada is the Canadian government agency that generates statistics on an amazing number of things. I had no idea they were keeping statistics on computer stuff.

This article summarizes some of the information.
The Daily – Computer and peripherals price indexes

Here you can search the database, but it looks like there is a small cost attached ($3.00).
Computer price indexes, by type of purchaser, monthly

Linux QoS Library (LQL) Released

It has finally happened. I have gotten a release of the Linux QoS Library (LQL) out the door.

Releasing software is actually a bit of a nerve-wracking process. The worst part is not creating the announcement emails or filling out Freshmeat‘s forms; it is worrying about what has been forgotten.

  • Missing files in the distribution? Hopefully, make distcheck covers that.
  • Bad or broken API documentation, i.e., spelling errors.
  • Not enough testing – what if it doesn’t work on other systems?
  • Design flaws – it is Free Software after all; everyone can see your mistakes.

A big part of me would have liked to spend an indefinite amount of time getting a ‘perfect’ release, something I was really 100% happy with. However, that goes against the “release early, release often” strategy that Free Software uses to such great effect. Besides, I would probably never be 100% happy with the code base anyway. Perhaps the single most important reason for this release is to let others know that the project exists.

Announcement
The Linux QoS Library (LQL) provides a GPL-licensed, GObject-based C API to manipulate the network queueing disciplines, classes and classifiers in the Linux kernel. LQL does not use the tc command as a back-end; instead, it communicates with the Linux kernel via Netlink sockets, the same way tc does.
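
For context, here is roughly what manipulating a queueing discipline, a class and a classifier looks like with the tc command; the device name, rate and match rule below are arbitrary examples. LQL exposes the same kernel functionality through C function calls rather than by shelling out to tc.

  # Attach an HTB queueing discipline to eth0 (example device).
  tc qdisc add dev eth0 root handle 1: htb default 10
  # Add a class that limits traffic to 512kbit (example rate).
  tc class add dev eth0 parent 1: classid 1:10 htb rate 512kbit
  # Add a U32 classifier that steers SSH traffic into that class.
  tc filter add dev eth0 parent 1: protocol ip u32 match ip dport 22 0xffff flowid 1:10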

0.5.0 — 2004-08-30

  • Initial public release.
  • I wanted to get 100% API doc coverage and a lot more testing done before making a public release, but I decided to go with the “release early, release often” strategy.
  • 86% API documentation coverage. A lot of the undocumented API is for the U32 classifier implementation, which I am not that fond of. I think this API will change quite a bit.
  • What LQL really needs is much more testing in larger applications.
  • I make absolutely no promises that any of the API will be stable. I expect the API to change as larger programs are built with it and new limitations (and bugs) are found.

Please see http://www.coverfire.com/lql/ for more information.

Download:
http://www.coverfire.com/lql/download/lql-0.5.0.tar.gz

Fundamentalist

A while ago I was listening to an interview about the IMF‘s economic policies in Argentina. One of the people being interviewed (I don’t remember the name, unfortunately) offered a definition of fundamentalists that I thought was insightful. What follows is only a paraphrase, as it has been several weeks since I heard the interview.

Fundamentalist: a person who believes in a set of rules or ideas so strongly that, even after powerful evidence that the rules are failing, the person thinks the only problem is that the rules are not being enforced strongly enough.

Weekend

It was a good weekend. Saturday was Kevin’s (Karen’s brother) Buck & Doe. I had a really good time. I got to be the barbecue guy, which is always lots of fun at a big party. I wish I knew how many hamburgers we cooked. The only bad part of the day was the mild sunburn that resulted. You would think I would know better by now. After some post-party cleanup, Karen and I spent Sunday afternoon on a few-hour motorcycle ride with my parents. The basic route was Sebringville, London, Woodstock and back via Embro Road. Sometimes I miss having my own motorcycle.

Today I received my copy of Mono: A Developer’s Notebook. I ordered it from Chapters in the middle of last week. I love buying books on the Internet. This book looks like a good quick-start guide to Mono and C#. Exactly what I need.

Secure remote backup

Every once in a while I see posts on mailing lists where people wonder about doing remote backups. I figured it might be worthwhile to describe how I have been doing my home workstation backups for the last few years. Hopefully, this will be useful to someone.

I consider a backup system that requires frequent manual attention pretty much useless, mainly because it is hard to maintain proper backup discipline when busy or away. This is especially true of media-based backups. Swapping tapes or CDs to make a backup is annoying enough that the backup probably won’t get done as often as it should. Networks allow the backup process to be automated by having each system back itself up regularly and automatically to another host on the network. However, making backups to another host in the same building doesn’t help much when the building burns down. If you have computer equipment at two secure locations with a large pipe between them, automatic off-site backups are pretty easy. Unfortunately, most individuals are not this lucky. However, with the proliferation of broadband, it is quite possible that you know someone with a big enough pipe that you could send backup data to them during off hours.

This remote computer may be owned by your best friend, but do you really want to make your backup data available to them? Even if you do trust this person, maybe they don’t look after their machine as well as you do; their computer could be cracked, and so on. Clearly, the remote system needs to be considered untrusted, so the data is going to have to be encrypted.

My backup script basically does the following (a sketch appears after the list):

  • Create a tarball of all of the data to back up.
  • bzip2 the file to make the transfer shorter.
  • Run GnuPG to encrypt the file to myself.
  • Use SCP to transfer the file to the remote system.
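
For illustration, here is a minimal sketch of those four steps as a shell script. The paths, the GnuPG key ID and the remote host name are hypothetical placeholders; the real backup.sh adds the exclude lists, MD5 verification and failure emails described below.

  #!/bin/sh
  # Minimal sketch of the backup steps; paths, key ID and host are examples.
  DATE=`date +%Y%m%d`
  BACKUP=/tmp/backup-$DATE.tar

  # 1. Create a tarball of the data to back up.
  tar -cf $BACKUP /home/dan
  # 2. bzip2 the file to make the transfer shorter.
  bzip2 $BACKUP
  # 3. Encrypt the file to my own key (gpg appends .gpg to the name).
  gpg --encrypt --recipient dan@example.com $BACKUP.bz2
  # 4. Transfer the file to the remote system.
  scp $BACKUP.bz2.gpg backup@remote.example.com:backups/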

Thus, this requires that you have an OpenPGP key (via GnuPG) and SSH (SCP) access to the remote host. Transferring the file with some other, less secure method shouldn’t reduce the security of the system too much. The only problem would be if someone sniffed your authentication information and then deleted the files from the remote host. Since the files are encrypted, downloading them doesn’t do the bad guy any good.

This system is not suited to backing up your media library, mostly because of bandwidth limitations but also because incremental backups are not possible; the entire backup is sent every time.

Though the point of this entry was just to put the idea of doing backups this way out there for Google to index, I have made a copy of my backup.sh available. The script is quite simple but should provide a good starting point for anyone interested in taking the implementation further. This particular script is set up to do daily and weekly backups. It has two configuration options that specify plain-text files containing lists of directories to exclude from the daily and weekly backups (see man tar for the exclude file format). What I do is exclude everything but frequently changing directories from the daily backup, and exclude only media directories from the weekly one.
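
As an illustration, a daily exclude file might look something like this; the file name and directory names are made up, and tar treats every line in the file as a pattern:

  $ cat exclude-daily.txt
  /home/dan/music
  /home/dan/photos
  /home/dan/archive
  $ tar -cf backup.tar -X exclude-daily.txt /home/dan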

There is one obvious catch-22 with this system. Your GnuPG keys are stored in ~/.gnupg, and this directory is backed up and encrypted with these very keys. If your computer is lost, the only copy of your data you have left is encrypted, and you now have no way to decrypt your backup. So, you need to keep a separate backup copy of your GnuPG keys somewhere else. Since you have a passphrase on your key (you had better, anyway), these files are already encrypted.
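
One way to make that separate copy is to export the keys to a pair of files and stash them on a CD or another machine; the output file names here are arbitrary:

  # Export all public and secret keys as ASCII-armoured files.
  gpg --export --armor > public-keys.asc
  gpg --export-secret-keys --armor > secret-keys.asc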

In order to make this backup system automatic (and hence useful) it needs to be able to transfer the backup file without user intervention. With SCP this can be accomplished by creating an SSH key pair with no passphrase. This allows the host that holds the keys to log in to the remote host without a password, i.e., without user intervention. Then the SSH_OPTIONS variable in the script can be modified to point SCP at this key. Now you can set up the script as a cron job and get your backups done automatically every night. MD5 sums are used to verify the successful transfer of the backup, and the script will also email you if the backup failed.
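
Setting that up looks roughly like this; the key file name, the remote user and host, and the cron schedule are all placeholders:

  # Generate a key pair with an empty passphrase (-N "").
  ssh-keygen -t rsa -N "" -f ~/.ssh/backup-key
  # Install the public key on the remote host.
  ssh backup@remote.example.com 'cat >> ~/.ssh/authorized_keys' < ~/.ssh/backup-key.pub
  # Point the script at the key, e.g. SSH_OPTIONS="-i $HOME/.ssh/backup-key",
  # then run it nightly from cron (crontab -e):
  #   0 3 * * * /home/dan/bin/backup.sh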

This script could be made a bit smarter so that it deleted old backups from the remote host; it does not do that right now. You’ll have to log in to the remote host once in a while to delete old backups. How often you will need to do this depends on how much space the remote host has available.

Testing Meme Propagation In Blogspace: Add Your Blog!

This posting is a community experiment that tests how a meme, represented by this blog posting, spreads across blogspace, physical space and time. It will help to show how ideas travel across blogs in space and time and how blogs are connected. It may also help to show which blogs are most influential in the propagation of memes. The dataset from this experiment will be public, and can be located via Google (or Technorati) by doing a search for the GUID for this meme (below).

The original posting for this experiment is located at: Minding the Planet (Permalink: http://novaspivack.typepad.com/nova_spivacks_weblog/2004/08/a_sonar_ping_of.html) — results and commentary will appear there in the future.

Please join the test by adding your blog (see instructions, below) and inviting your friends to participate — the more the better. The data from this test will be public and open; others may use it to visualize and study the connectedness of blogspace and the propagation of memes across blogs.

The GUID for this experiment is: as098398298250swg9e98929872525389t9987898tq98wteqtgaq62010920352598gawst (this GUID enables anyone to easily search Google (or Technorati) for all blogs that participate in this experiment). Anyone is free to analyze the data of this experiment. Please publicize your analysis of the data, and/or any comments by adding comments onto the original post (see URL above). (Note: it would be interesting to see a geographic map or a temporal animation, as well as a social network map of the propagation of this meme.)

INSTRUCTIONS

To add your blog to this experiment, copy this entire posting to your blog, and then answer the questions below, substituting your own information, below, where appropriate. Other than answering the questions below, please do not alter the information, layout or format of this post in order to preserve the integrity of the data in this experiment (this will make it easier for searchers and automated bots to find and analyze the results later).

REQUIRED FIELDS (Note: Replace the answers below with your own answers)

(1) I found this experiment at URL: http://planet.gnome.org

(2) I found it via “Newsreader Software” or “Browsing the Web” or “Searching the Web” or “An E-Mail Message”: “Browsing the Web”

(3) I posted this experiment at URL: http://www.coverfire.com

(4) I posted this on date (day, month, year): 04/08/04

(5) I posted this at time (24 hour time): 08:55:00

(6) My posting location is (city, state, country): London, ON, CA

OPTIONAL SURVEY FIELDS (Replace the answers below with your own answers):

(7) My blog is hosted by: Self

(8) My age is: 26

(9) My gender is: Male

(10) My occupation is: Student

(11) I use the following RSS/Atom reader software: Blam

(12) I use the following software to post to my blog: WordPress

(13) I have been blogging since (day, month, year): 25/05/2004

(14) My web browser is: Epiphany

(15) My operating system is: Linux

Linux QoS API update

My Linux QoS library is coming along nicely. It took me a few days, but I changed the whole API to be based on GObject. The main reason for this time-consuming change is to make bindings to other languages easier. Primarily I am interested in Mono and Python bindings. As much as I love good old C, I have no intention of building a GUI application in C if I don’t have to.

I am now building small apps that use the API and discovering the features that are needed but that I didn’t think of before.

Everyone who develops software using free software libraries should take a moment to thank the developers for their time and hard work. Developing a sane API is hard.