The Biotron at UWO

A major construction project called the Biotron is under way at UWO right now. For details of this really interesting project, take a look at the main Biotron website or this article from Canada.com.

A new biology super-lab under construction in London, Ont., will make Canada a testing ground for the latest ideas in disease, ecosystems and agriculture from all over the world.

They will be able to create the frigid darkness of Arctic winter, or the steamy heat of a tropical rainforest. They’ll be able to see what happens to a genetically modified crop under realistic conditions, without letting it escape into the environment.

“This is the first time anything like that has ever been built. So we’re absolutely unique,” says Duncan Hunter, associate dean of science at Western.

“There is no other facility like this in the world.”

“Everybody likes to describe their project, their facility, as being unique and world class. This one truly will be when it’s up and running.”

Pirates of the Mediterranean

In the autumn of 68 B.C. the world’s only military superpower was dealt a profound psychological blow by a daring terrorist attack on its very heart. Rome’s port at Ostia was set on fire, the consular war fleet destroyed, and two prominent senators, together with their bodyguards and staff, kidnapped.

The incident, dramatic though it was, has not attracted much attention from modern historians. But history is mutable. An event that was merely a footnote five years ago has now, in our post-9/11 world, assumed a fresh and ominous significance. For in the panicky aftermath of the attack, the Roman people made decisions that set them on the path to the destruction of their Constitution, their democracy and their liberty. One cannot help wondering if history is repeating itself.

Starting at the top

Hello, Young Workers: One Way to Reach the Top Is to Start There

Lost in the argument over whether young people today know how to work, however, is the mounting evidence produced by labor economists of just how important it is for current graduates to ignore the old-school advice of trying to get ahead by working one’s way up the ladder. Instead, it seems, graduates should try to do exactly the thing the older generation bemoans — aim for the top.

The recent evidence shows quite clearly that in today’s economy starting at the bottom is a recipe for being underpaid for a long time to come. Graduates’ first jobs have an inordinate impact on their career path and their “future income stream,” as economists refer to a person’s earnings over a lifetime.

Network neutrality: Where analogies fail

I find it interesting that so much of the discussion surrounding net neutrality centers on analogies to other aspects of the modern world. A lot of these analogies are related to the transportation of goods. Courier companies such as UPS and FedEx, as well as the highway network in general, are the most common examples. In one of the first articles on net neutrality, Saving the Net, Doc Searls argues that the transport analogy is a major impediment to the pro-neutrality side and offers a competing analogy. This post is not about which analogy is better; it is about the problems that occur when using any analogy to discuss a complex topic.

It is easy to understand why people use analogies to discuss complex topics like net neutrality. By allowing knowledge and understanding from one area to be applied to something new, analogies are essentially a way of simplifying the world. As with any simplification, some detail is always lost.

Analogy is a poor tool for reasoning, but a good analogy can be very effective in developing intuition.
— Barbara Simons and Jim Horning
(Communications of the ACM, Sept 2005, Inside Risks)

The very fact that analogies apply old information to new situations should give us pause in using analogy as a reasoning tool.

To see an example of this problem, one need only examine what is perhaps the most common analogy used by the anti-neutrality folks: UPS and other courier companies offer high-priority (overnight) service as well as normal service, without the negative consequences the pro-neutrality crowd fears.

In order for a courier company to begin to offer overnight package delivery, the company must add new capacity to its delivery operations. For example, a company that ships packages by truck will need to add aircraft to its operations to support cross-continent overnight delivery. Once these aircraft are in place it does not make economic sense to fly them lightly loaded. Instead, the courier company will begin to fill the remainder of the space in the planes with lower-priority packages. This has the benefit of reducing the courier’s costs by reducing the number of trucks that are necessary. There is also another, unintended benefit. Although some customers have not paid for overnight delivery, the additional high-speed capacity greatly increases the chance they will get that level of service anyway. As the volume of high-priority packages grows, the courier must also grow its high-priority capacity.

Compare the above situation to packet prioritization on the Internet. Unlike the courier company example, adding a high-priority service does not require that the bandwidth provider add new capacity to its operations. There is no way to make light go faster. Packet prioritization simply gives the marked packets first crack at the existing capacity. Assuming the network is properly provisioned (not heavily loaded), the difference in service quality between high- and low-priority packets is very small, probably unnoticeable.
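To make the difference concrete, here is a minimal sketch in Python of a strict-priority scheduler sharing a fixed-capacity link. Everything in it (the simulate function, the traffic model, the numbers) is my own invention for illustration, not a model of any real router:

    import random
    from collections import deque

    def simulate(util, ticks=10000, capacity=10, hi_fraction=0.3, seed=1):
        # Strict-priority scheduling on a link of fixed capacity.
        # Each tick the link forwards up to `capacity` packets; arrivals
        # average util * capacity packets per tick. No new capacity is
        # ever added -- the priority flag only changes who goes first.
        rng = random.Random(seed)
        hi, lo = deque(), deque()          # queues hold enqueue times
        hi_waits, lo_waits = [], []
        for now in range(ticks):
            # Roughly binomial arrivals with mean util * capacity.
            arrivals = sum(1 for _ in range(2 * capacity) if rng.random() < util / 2)
            for _ in range(arrivals):
                (hi if rng.random() < hi_fraction else lo).append(now)
            for _ in range(capacity):      # high priority gets first crack
                if hi:
                    hi_waits.append(now - hi.popleft())
                elif lo:
                    lo_waits.append(now - lo.popleft())
        mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
        return mean(hi_waits), mean(lo_waits)

    for util in (0.5, 0.95):
        hi_wait, lo_wait = simulate(util)
        print(f"{util:.0%} load: high waits {hi_wait:.2f} ticks, low waits {lo_wait:.2f} ticks")

On a lightly loaded link the two classes finish essentially together; the gap only opens up as utilization approaches capacity.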

There is also the issue of reverse economic incentives. In order for customers who are paying for high-priority service to notice an improvement, the network must be congested. This creates the strange situation where allowing the network to become congested (not upgrading) could result in more customers paying for high-priority service, thus increasing the bandwidth provider’s profits.
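The sketch above shows the perverse incentive directly: at 50% utilization the two classes receive essentially the same service, so nobody has a reason to pay for priority. Only the heavily loaded run produces a gap worth paying for, and the provider controls the load by choosing when to upgrade.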

[Before anyone complains, I realize there are other aspects to network QoS such as number of hops in a path etc. I am not attempting to explain all aspects of network operations.]

On the surface, the analogy between high-priority package shipment and high-priority packet delivery seems like a good one. Upon closer examination, simple physical limitations show these two worlds to have very different operational characteristics and completely opposite unintended side effects.

The point of this post is not to argue about the exact details of packet forwarding or courier company operations. The point is that centering the discussion of complex topics like network neutrality on analogies to other systems is foolish. The lost detail results in uninformed decisions.

The Semicolon Wars

Interesting programming language article.

The Semicolon Wars from American Scientist.

A catalog maintained by Bill Kinnersley of the University of Kansas lists about 2,500 programming languages. Another survey, compiled by Diarmuid Piggott, puts the total even higher, at more than 8,500. And keep in mind that whereas human languages have had millennia to evolve and diversify, all the computer languages have sprung up in just 50 years. Even by the more-conservative standards of the Kinnersley count, that means we’ve been inventing one language a week, on average, ever since Fortran.

Network neutrality: The cell network

From Newsforge, Today’s cell phone system argues for retaining network neutrality.

Consider the closed, anti-innovation system that is the cell phone network. Do you want the Internet to be like that? Is that the best solution for the rest of the economy and society in general?

James Glass (not his real name) is the owner of a company currently trying to navigate the minefield of running a third-party service on the cell phone networks. He is writing the article pseudonymously because the cell phone companies have the power and freedom to crush his company by blocking it from their networks.

The Future of Computing

The Future of Computing: From mainframes to microblades, farewell to GHz CPUs provides a nice overview of trends in CPU and system design. I have a couple of comments to add.

When in the late 1950s computers became fast enough to relieve some of the coding burden from the shoulders of programmers, high-level languages such as Ada, Algol, Fortran and C were developed. While sacrificing code efficiency big time, these high-level languages allowed us to write code faster and thus extract more productivity gains from computers.

As time passed we kept sacrificing software performance in favor of developer productivity gains, first by adopting object-oriented languages and more recently by settling on garbage-collected memory, runtime-interpreted languages and ‘managed’ execution. It is these “developer productivity” gains that kept the pressure on hardware developers to come up with faster and faster processors. So one may say that part of the reason we ended up with gigahertz-fast CPUs was “dumb” (lazy, uneducated, expensive — pick your favorite epithet) developers.

Although true in some sense, the term developer productivity is a bit of a misnomer here. High(er) level tools and design methodologies do not just save developer time; they make modern software possible. I seriously doubt that creating a web browser, or any of the other huge pieces of software we use every day, in assembly language is a tractable problem. Even if the problem could be brute-forced, the resulting software would likely have a far higher defect rate than current software.
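As a small illustration of the point (my example, not one from the article): fetching a web page and pulling out its title takes a handful of lines in a high-level, garbage-collected language, using nothing but Python's standard library. The equivalent in assembly, with sockets, TLS, HTTP parsing and buffer management all done by hand, would be a project in itself.

    import re
    import urllib.request

    # Fetch a page and print its <title>; the URL is just a placeholder.
    with urllib.request.urlopen("https://example.com") as resp:
        html = resp.read().decode("utf-8", errors="replace")

    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    print(match.group(1).strip() if match else "no title found")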

In the long term it makes little sense to burden the CPU with DVD playback or SSL encryption. These and similar tasks should, and with time will, be handled completely by dedicated hardware that is going to be far more efficient (power- and performance-wise) than the CPU.

This completely ignores one of the most important aspects of fast general-purpose CPUs: flexibility. For instance, a computer that relies on an MPEG decoder for video playback becomes useless when content is provided in another format. Continuing with this example, innovation in the area of video codecs would also become very difficult.
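Here is a sketch of what that flexibility buys, with every name invented for the example (this is not any real decoding API): a fixed-function chip ships with its format list frozen into silicon, while the software table on a general-purpose CPU can grow with a simple update.

    class HardwareDecoder:
        # Stands in for a fixed-function chip: the format list is burned
        # in at manufacture time and can never change.
        SUPPORTED = frozenset({"mpeg2"})

        def decode(self, fmt, data):
            if fmt not in self.SUPPORTED:
                raise NotImplementedError(f"chip cannot decode {fmt!r}")
            return f"decoded {len(data)} bytes of {fmt} in hardware"

    # The software side is just a table, extensible at any time.
    software_codecs = {"mpeg2": lambda d: f"decoded {len(d)} bytes of mpeg2 on the CPU"}

    def play(fmt, data, hw=HardwareDecoder()):
        try:
            return hw.decode(fmt, data)        # fast, efficient path
        except NotImplementedError:
            codec = software_codecs.get(fmt)   # flexible CPU fallback
            if codec is None:
                raise
            return codec(data)

    # When a new format appears, only the software table needs an entry;
    # the hardware decoder is stuck with what it shipped with.
    software_codecs["newcodec"] = lambda d: f"decoded {len(d)} bytes of newcodec on the CPU"
    print(play("newcodec", b"\x00" * 1024))

A real system would probe the hardware up front rather than catch exceptions, but the shape of the argument is the same: the frozen format set is the problem.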

Despite the nitpicks, there is a lot of good information in the article.

Ottawa, OLS and the war museum

Arrived in Ottawa today for OLS. Managed to get in early enough to make it over to the new (2005?) Canadian War Museum. Unfortunately, there were only two hours left before closing. Two hours was not nearly long enough to do the museum justice. Even if you have been to the previous war museum, you should go again. The new building is gorgeous and there is a lot more stuff to look at. If you like to read everything in a museum, you need to budget a LOT more than two hours.

For those new to Ottawa, walking to the war museum from OLS will take under 30 minutes.

[Photos from the Canadian War Museum: 20060718-cwm-1.jpg through 20060718-cwm-10.jpg]