A new way to look at networking

I finally got around to watching A new way to look at networking yesterday. This is a talk given by Van Jacobson at Google in 2006 (yes, it has been on my todo list for a long time). It is definitely worth watching if you are interested in networking.

A couple of quick comments (These are not particularly deep or anything. This is mostly for my own reference later.):

  • He says that the current Internet was designed for conversations between end nodes but we’re using it for information dissemination.
    • Me: This distinction relies on the data being disseminated to each user being identical. However, in the vast majority of cases even data that looks identical on the surface, such as web site content, is actually unique for each visitor. Any site with advertisements or customizable features is a good example. As a result, we are still using the Internet for conversations in most situations.
  • He outlines the development of networking:
    • The phone network was about connecting wires. Conversations were implicit.
    • The Internet added metadata (the source and destination) to the data which allowed for a much more resilient network to be created. The Internet is about conversations between end nodes.
    • He wants to add another layer where content is addressable rather than the source or destination.
  • He argues for making implicit information explicit so the network can make more intelligent decisions.
    • This is what IP did by adding the source and destination to data.
  • His idea of identifying the data, not the source or destination, is very interesting. A consequence of this model is that data must be immutable and identifiable, and must build in metadata such as the version and the date. It strikes me how the internal operation of the Git version control system matches these requirements.
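The Git parallel can be made concrete. Here is a minimal Python sketch of content addressing in the style of Git's object store: the data's name is derived from its content (plus a small header mirroring Git's `blob <size>\0` convention), so identical content always gets the same identifier and any change produces a new one. The function name `content_address` is my own; this is an illustration, not a full Git implementation.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Name a piece of data by hashing its content, Git-blob style."""
    header = b"blob %d\0" % len(data)          # Git's blob header convention
    return hashlib.sha1(header + data).hexdigest()

v1 = content_address(b"hello world\n")
v2 = content_address(b"hello world!\n")
assert v1 == content_address(b"hello world\n")  # same content, same name
assert v1 != v2                                 # any edit yields a new name
```

Because the identifier is a pure function of the content, the data is effectively immutable (a "modified" object is simply a new object), which is exactly the property a content-addressed network would rely on.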

Cloud computing

It’s pretty hard not to notice the buzz around ‘cloud computing’. In large part this is due to the new services being offered by Amazon. Who would have thought that a bookseller could become the infrastructure for a new generation of Internet start-ups?

Here’s a bit of information to whet your appetite. Somehow I have to find the time to play with these technologies.

Scaling with the clouds
Surviving the storm

Drawing (nearly) unlimited power from the sky
Drawing power from the sky, part 2

Free speech in Canada

In The Inquisition In Canada my friend Bob outlines how Human Rights Commissions (HRCs) are being abused.

Last week’s Cross Country Checkup episode, titled “Are There Legitimate Limits to Free Expression?”, also delves into the role of the HRCs as part of a larger discussion on free speech. You can find a nice introduction to this topic in the episode’s introduction (text) or you can download the whole show (MP3). Several people close to this issue are interviewed, as well as callers from across the country.

There was also a quote from someone (unfortunately I don’t remember who) which sums the issue up nicely (paraphrasing):

You have a right to not be exposed to hate but you don’t have a right to not be offended.

A few interesting things

ETech: Lessig Calls for Geeks to Code Money Out of Politics

But he doesn’t think legislators are by and large crooks who are taking bribes in exchange for votes. In fact, he says we may have the least bribery in our nation’s history.

But the money still corrupts in a number of ways. For instance, legislators, like scientists funded by drug companies, internalize their supporters’ interests.

“Money corrupts the process of reasoning”

“They get a sixth sense of how what they do might affect how they raise money.”

Let’s start with a public version control system that lets all of society see who is adding what to legislation and other important government documents.

Drugs, Body Modifications May Create Second Enlightenment

Coffee debuted in the late 17th century in Oxford, England — leading to rowdy coffee houses, jittery arguments and even an attempt by King Charles II to ban the substance for inspiring seditious behavior.
The other consequence: the Enlightenment.

I’ve often heard coffee houses mentioned as meeting places in various historical and revolutionary contexts. The idea that the coffee itself was to some extent the source of the revolutionary ideas is new to me.

The Adoption-Led Market

The switch to a mesh topology in society has led to easy access for everyone to Free software created by open source communities. The result is an emerging approach which is rapidly spreading for smaller software projects and in my view is the future of all software acquisition. The emerging approach is an adoption-led market.

In this approach, developers select from available Free software and try the software that fits best in their proposed application. They develop prototypes, switch packages as they find benefits and problems and finally create a deployable solution to their business problem. At that final point, assuming the application is sufficiently critical to the business to make it worthwhile to do so, they seek out vendors to provide support, services (like defect resolution) and more. Adoption-led users are not all customers; they only become so when they find a vendor with value to offer.

How green is your web page?

Saving carbon emissions with HTTP caching.

Assume a fully loaded server uses 100W. Six servers, year-round, consume 5,000 kilowatt-hours per year or approximately 500-1000 pounds of CO2 emissions.
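The quoted energy figure checks out with simple arithmetic, assuming six servers drawing a constant 100 W each all year (the article's stated load; real draw varies with utilization, and the CO2 conversion depends on the local grid, so only the kilowatt-hours are verified here):

```python
# Back-of-the-envelope check of the article's figure:
# six servers at 100 W, running around the clock for a year.
servers = 6
watts_per_server = 100
hours_per_year = 24 * 365          # 8760 hours

kwh_per_year = servers * watts_per_server * hours_per_year / 1000
print(kwh_per_year)                # 5256.0, i.e. roughly the quoted 5,000 kWh
```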

Best Practices for Speeding Up Your Web Site

No quote but it’s still worth reading if you are interested in web development.

Silencing scientists at Environment Canada

The current Federal government has decided that federally funded scientists at Environment Canada should not be allowed to speak with reporters directly. In the past reporters could freely contact Environment Canada scientists with science questions. Now all questions must go through an information officer. The leaked reasons for this change are particularly worrisome. I fail to see how the fact that “interviews sometimes result in surprises to minister and senior management” outweighs the public’s access to the scientists it funds.

Wasn’t this supposed to be the open and accountable government?

Environment Canada ‘muzzles’ scientists’ dealings with media

Or listen to the first few minutes of this week’s Sunday edition for a bit of commentary.

Magazine titles and operating systems

A little while ago I was standing in front of the computer magazine section at my local Chapters when I noticed something interesting. There were three magazines with “Windows” in the title, three with “Mac” in the title, and four with “Linux” in the title. Of course this is hardly statistically significant in terms of the magazine industry as a whole, but it does show how Linux is becoming much more mainstream.

End-to-end in standards and software

Two things. Both relate to Microsoft, but that is just a coincidence.

The first

Apparently IE8 will allow the HTML author to specify the name and version number of the browser that the page was designed for. For example, the author can add a meta tag that says essentially “IE6”. IE8 will see this tag and switch to rendering pages like IE6 does. Apparently this came about because IE7 became more standards compliant, thereby ‘breaking’ many pages, especially those on intranets which require the use of IE. The new browser version tag will allow MS to update the browser engine without breaking old pages.

As a result, they will be forced to maintain the old broken HTML rendering engine (or at least its behavior) for a very long time. This will consume development resources that could otherwise be put into improving IE. It will also increase the browser’s size, its complexity, and undoubtedly its number of bugs.

As for the pages broken by newer, more standards compliant browsers, what is their value? Any information, in a corporate intranet or otherwise, that has value will be updated to retain that value. If no one bothers to update a page, it was probably nearly worthless anyway. Also, most of the HTML pages now in use are generated by a templating system of some kind; it’s not like each and every page will have to be edited by hand.

The second

The Linux kernel development process is notorious for improving (breaking) the kernel’s internal driver APIs. This means that a driver written for version 2.6.x might not even compile against 2.6.x+1, let alone be binary compatible. This of course causes all kinds of trouble for companies not willing to open source their drivers.

However, the advantages of this process are huge. It is completely normal that during development the author will learn a lot about how the particular problem can be solved. By allowing the internal APIs to change, the Linux kernel development model lets authors apply this newfound knowledge rather than be slowed down by past mistakes. As I already mentioned, this causes problems for binary-only kernel drivers, but if the product has value the manufacturer will update the driver to work with the new kernel release. If it doesn’t have value, the driver won’t get updated and the kernel doesn’t have to carry around the baggage of supporting the old, inferior design. How does this relate to Microsoft? From Greg Kroah-Hartman:

Now Windows has also rewritten their USB stack at least 3 times, with Vista, it might be 4 times, I haven’t taken a look at it yet. But each time they did a rework, and added new functions and fixed up older ones, they had to keep the old api functions around, as they have taken the stance that they can not break backward compatibility due to their stable API viewpoint. They also don’t have access to the code in all of the different drivers, so they can’t fix them up. So now the Windows core has all 3 sets of API functions in it, as they can’t delete things. That means they maintain the old functions, and have to keep them in memory all the time, and it takes up engineering time to handle all of this extra complexity. That’s their business decision to do this, and that’s fine, but with Linux, we didn’t make that decision, and it helps us remain a lot smaller, more stable, and more secure.

So what was the point?

I don’t know what to make of these two little stories, but the latter has been bothering me for some time. Where does the responsibility for dealing with change belong? The Internet has taught us that we should push as much work as possible to the ends of the network; the alternative is rapidly growing complexity and inflexibility in the core. It seems to me that this applies to both of the situations I outlined here as well.