Short packet

A DPI, networking and joy of technology blog.

Wednesday, March 17, 2010

Dear TSA

I understand your need to rifle through people's luggage in the name of security every now and then. I also appreciate that you, unlike the Chinese, take the time to leave a little note saying that you've had a look at my... well, frankly, dirty laundry and toothpaste.

However, for the sake of simplifying your work, I have one slight suggestion:

If your employees would try actually opening the bags in the usual manner before resorting to breaking the lock with a screwdriver, it might save a few seconds per bag.

For my particular type of bag, the procedure is as follows:

1. Grab hold of clasp (doubles as a locking mechanism) with either hand.

2. Open clasp by moving said hand, still holding the clasp.

3. Release clasp.

Repeat this three times.

For extra credit:

4. Make sure to fasten the clasps after you're done with my underwear. If you had to open three clasps to gain access to the bag, chances are there'll still be three to close once you're done with it (bags are funny that way).

I'd do an illustration of the workflow here, but I'm afraid your comment form doesn't seem to allow submissions by means of crayon on a napkin. Imagine a crude drawing of a bag with a happy face instead of a sad face and you're not far off.

All the best,

Kriss Andsten

Involuntary customer

Friday, March 5, 2010

Pluring by Plura

Just the other day, I was pondering how there's very little interesting tech news that isn't already covered in depth by the major players. There really isn't much benefit to me, or my three point five readers, if I just parrot stuff from Ars or Gizmodo. Or Slashdot, god forbid - they're doing a swell enough job parroting themselves.

But I digress. Just the other day, I ran across this outfit, Plura Processing. Business plan? Sell cluster CPU cycles. Whose CPU cycles? Yours.

Plura offers code that can be embedded in or alongside various web content, mainly Flash games and the like. Said code requests work units from Plura, computes whatever needs to be computed and submits the result. Sure, they don't pay a whole lot - $2.60 per CPU month - but it's a rather cute business model nonetheless.
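The fetch/compute/submit cycle is simple enough to sketch. This is purely illustrative - Plura's actual client is embedded Flash/JS and I'm inventing the work-unit format and function names here - but the shape of the loop is the point:

```python
# Hypothetical sketch of an embedded worker's fetch/compute/submit
# cycle. The work-unit format and all function names are invented;
# network calls are stubbed out with plain data structures.

def fetch_work_unit(queue):
    """Stand-in for requesting a work unit from the coordinator."""
    return queue.pop(0) if queue else None

def compute(unit):
    """Whatever needs to be computed -- here, a toy checksum."""
    return sum(unit["data"]) % 251

def submit_result(results, unit_id, value):
    """Stand-in for posting the result back to the coordinator."""
    results[unit_id] = value

def run_worker(queue, results):
    # Keep pulling units until the coordinator has nothing left.
    while (unit := fetch_work_unit(queue)) is not None:
        submit_result(results, unit["id"], compute(unit))

queue = [{"id": i, "data": list(range(i, i + 5))} for i in range(3)]
results = {}
run_worker(queue, results)
print(results)  # one result per work unit
```

The interesting engineering is in what the sketch leaves out: redundant scheduling and result verification, since you can't trust a random visitor's browser to compute honestly.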

It's going to be quite interesting to see what reception it gets once it hits the mainstream press, especially since it's quite easy to draw a parallel between this and P2P. If this were bandwidth, would people see it the same way as when it's CPU? Is it OK to 'donate' to the site operator, or will they be seen as stealing your resources, hogging your CPU and kicking your cat?

Given a friendly reception, it's not at all inconceivable that we'll be seeing this sort of setup in streaming media services in the future. It'd be a match that makes business sense: The server streams the content, the client takes the content, offloads it to the GPU and uses the CPU cycles to pay for some of it. Sure, it won't pay for bandwidth (yet), but could be a nice little side revenue.

On the technical side, I'm not too sure that I like it - but then again, if stuff like this makes the case for 'free' content on the Internet a bit sweeter, it might not be entirely bad.

(Linguistics of the day: Pluring is a play on the local name of John D. Rockerduck - von Pluring. It's also slang for cash. Quite fitting, in a way.)

Tuesday, December 30, 2008

Content-Aware DPI. Nirvana or dead end?

I've been seeing some buzz in the industry regarding content-aware DPI solutions. While I think the term is a bit of an oxymoron - show me DPI that isn't content-aware and still does any sort of meaningful recognition - in this case it's really a fancy name for P2P content awareness.

The general idea is that providers will use this to differentiate between freely distributable and non-freely distributable material over P2P. This is interesting in several ways.

To set the stage: DPI boxes are pretty capable nowadays and relatively cheap - maybe not in euros, but on a cost-per-subscriber basis. So it's quite possible to do fairly advanced (and in some cases invasive) stuff if you want to munge the data. If you combine a box that sees all the traffic for a given provider with an external service that traverses P2P networks, you can piece together quite an accurate picture of what data the subscribers are hauling. If you care about this for whatever reason, don't just assume that encryption is the holy grail for avoiding it.
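The combination is conceptually simple. A crawler on the P2P side builds an index of piece hashes for known works; the in-line box then only has to match hashes it observes in subscriber traffic against that index. A toy sketch of the matching step, with all names and data invented for illustration:

```python
# Hypothetical sketch of matching observed P2P pieces against an
# index built by an external crawler. Everything here is invented
# for illustration; real systems would index torrent/piece hashes
# at much larger scale.

import hashlib

def piece_hash(piece: bytes) -> str:
    """Hash a content piece the way the index was built."""
    return hashlib.sha1(piece).hexdigest()

# Index built offline by the crawler: piece hash -> known work.
crawled_index = {
    piece_hash(b"some well-known movie piece"): "Known Movie (2008)",
}

def classify(observed_piece: bytes) -> str:
    """Label a piece seen in subscriber traffic, if the index knows it."""
    return crawled_index.get(piece_hash(observed_piece), "unknown")

print(classify(b"some well-known movie piece"))  # matched against the index
print(classify(b"random home video piece"))      # unknown
```

Note what this implies for encryption: if the box can correlate flows with swarm membership learned from the crawler side, it doesn't necessarily need to read the payload at all.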

Who would like this?

From a provider point of view, it'd be possible to treat the 'illegal' traffic differently - shaping, prioritizing, blocking or sending it down a different/cheaper network path. Whether this is a good thing or a bad thing is entirely up to the observer. In terms of euros and cents, P2P accounts for 75% of the total traffic transferred in some European markets, the majority of it movies. Make a dent in this and you can postpone network investments for a while. In a world where hardware gets cheaper and cheaper down the line, postponing an investment for a year can mean a lot of money saved. There are also possible bandwidth savings on top of this.

From a governmental point of view, this is technology that somewhat reliably allows for stopping copyright infringement. It's also cheap enough that providers can be mandated to implement it (I'm sure providers would see this differently, for obvious reasons).

From the copyright-sensitive and lawyer-happy trade group point of view, this is nirvana. Paint yourselves as victims and ask the government to protect this important GNP-improving revenue stream. "It's not only the right thing to do, it's also the reasonable thing, right?"

What would it lead to?

Better legal music and video services, for one. If we see something like RIAA-friendly P2P filters implemented, it will mean that JaneRegularUser will have trouble using her torrent/gnutella client to fetch music and movies, and will either have to update to a newer version (which might not be available right off the bat) or use something else. We're seeing an emergence of services that fill this 'something else' gap. Judging by Spotify usage statistics, there are already quite a few people using it, and the number is steadily increasing - I'd estimate roughly 1.5% of Swedish internet users, compared to 0.3% six weeks ago.

There might be a rise in darknet activity, but darknet limitations - latency, and outright virtual hooliganism (messing the network up) - won't make this attractive to JaneUser, especially not compared to legal means of sourcing the material.

P2P clients will adapt. There are quite a few ways to screw DPI gear (most of which I haven't seen in the wild yet) and we'll be seeing more of that. There'll be a spot of an arms race, and the end result will probably be P2P that looks quite different from what we're used to - from a networking perspective, at least. I'm quite sure P2P would still be identifiable as P2P - identifying the content in transit, however, might not be doable.

How to avoid this?

Simple. Don't go down that path. As mentioned, we're seeing legal means of distributing music (Spotify, others) and video (Hulu, iPlayer, many more) gaining popularity as is. P2P is largely a convenience thing for the majority of users, and as the legal means of distribution get more convenient than P2P, well... you do the math.

From a governmental perspective, we have a small but fairly deep-pocketed set of trade groups who would really love stricter control over what's being transferred over the Internet. Most of their problems stem from their reluctance to get with the times. Sure, consumers are breaking laws, but when pretty damn large swathes of the population do it, and seem quite keen on going for a legal option where one exists, it becomes more of a non-issue. Less regulation is a good thing in this case, methinks.

Friday, November 21, 2008

The majormulticide has begun

We've seen a number of indie MMOs come and go, but now a majorly hyped and expensive production - Tabula Rasa - is closing its doors early next year.

MMOs have been seen as a good way forward for gaming - sure, the upfront expenses to be a mainstream player are somewhat staggering, but the gains if and when you get there are pretty neat, and it's one very efficient way of bypassing the entire problem of piracy.

With the early success of EverQuest and the breakthrough of World of Warcraft, World + Dog decided that now was a good time to develop an MMO and get a cut - 'now' being 2005-2006, with some major launches last year and this year.

Judging by European and North American traffic, though, we'll be seeing a few more of the major releases go titsup sooner or later. I'd judge Age of Conan to be a good candidate, due to a very low player count and a (probably) fairly expensive development cycle.

Games with very strong IP ties - Lord of the Rings, Warhammer and, naturally, Warcraft - seem to be doing alright, alongside EVE and Second Life. Warhammer's a bit too new to judge yet, but if they manage to keep quality reasonably high, I have a hard time seeing them do a sharp nosedive anytime soon. Various small indie games are also doing well - Tibia is actually right up there competing with Warhammer and LOTRO in some regions.

A number of the big, expensive productions in the sci-fi and fantasy space without very good IP ties will have a hard time recouping development costs and justifying further development, though. As such, they'll look less attractive compared to the few that do sport a larger userbase, and to the absolute multitude of Free2Play games whose content tends to be limited but hard to beat on price. Many will die.

This is not to say that there isn't room for original ideas in the MMO space - but it seems that in order to sell anything new, you need either well known IP or a way of stealing large chunks of community from World of Warcraft. Thus far, few seem to have found a way of executing the latter. Design elements from some of the more novel, yet successful, games such as EVE Online and Second Life could well be key to this.

It's also quite possible that the next big mainstream title could come as a dark horse out of a Korean software house as a Free2Play game that appeals to western audiences.

Thursday, November 6, 2008

White spaces, DPI and net neutrality

With the recent White Spaces news (which I think is pretty neat - more competition is needed in the States, and the US is a pretty damn big area to cover by the usual means), I'll go out on a limb and make a prediction: traffic management will be required. We're talking somewhat larger cell sizes than WiFi, and the medium is decidedly shared.

I've seen nothing about a public access mandate for this space, so let's assume it's all about commercial ISPs. Any provider entering the fray here would be competing against the incumbent telco or cableco and would need to compete on price and service, pretty much.

It's certainly doable, but don't expect the Network Neutrality Marvel at work here. It won't be both fast and every-packet-is-equal. One or the other, perhaps, but not both. And if you're selling service, I'm pretty sure I know which one goes away first.

This might perhaps be even more applicable if the AWS bit goes through. Citing that article:

"The FCC now says that the ultimate winner of its AWS spectrum auction must use up to 25 percent of its capacity to provide free, two-way broadband Internet service at data rates of at least 768 kilobits per second in the downstream direction. "

Right. And this, again, is over a shared medium? Let's say that 25% of the capacity works out to 25 Mbps per frequency and cell. With at least 768 kbps per user, they'll have a hell of a hard time covering peak usage. I suppose it boils down to how many frequencies the radios can muster and how many users they're getting, but it's a service that'll suffer more as it gets more popular - and being free, I can see a lot of casual Internet users picking it up.

You'd probably see ~0.15 Mbps per active subscriber at peak, giving us ~160 users per frequency and cell before you start seeing bad congestion. And that assumes the users follow a pretty average usage demographic - a few YouTube addicts or filesharing users will skew that rather badly, since we're looking at a pretty low number of total users per cell/frequency.
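The arithmetic is easy to sanity-check. Assuming the 25 Mbps free-tier figure and my 0.15 Mbps per-subscriber estimate (both back-of-the-envelope numbers, not anything the FCC published):

```python
# Back-of-the-envelope check of the cell-capacity figures above.
capacity_mbps = 25.0        # assumed free-tier share per frequency and cell
mandated_min_mbps = 0.768   # the FCC's 768 kbps downstream floor
peak_avg_mbps = 0.15        # assumed average demand per active subscriber

# Average-demand case: how many subscribers fit before congestion.
users_before_congestion = capacity_mbps / peak_avg_mbps

# Worst case: everyone simultaneously pulling the mandated minimum.
users_at_mandated_rate = capacity_mbps / mandated_min_mbps

print(round(users_before_congestion))  # ~167 users on average demand
print(round(users_at_mandated_rate))   # ~33 users all at 768 kbps at once
```

The gap between those two numbers is exactly why a few heavy users hurt so much at this scale: the statistical averaging that saves wired ISPs doesn't have enough users per cell to work with.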

I admire the notion, but I think they'll need host fairness, ideally some smart queueing, and perhaps bulk-services shaping for it to be very usable at peak. Pretty much the same deal as 3G and WiFi providers today.
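By shaping I mean something in the token-bucket family: each flow or host gets a sustained rate plus a burst allowance, and anything over it gets queued or dropped. A minimal sketch of the idea - not how any particular vendor implements it, and the 768 kbps/one-packet-burst parameters are just for illustration:

```python
# A minimal token-bucket shaper, purely illustrative. Rates are in
# bits/second; time is passed in explicitly to keep it testable.

class TokenBucket:
    def __init__(self, rate_bps: float, burst_bits: float):
        self.rate = rate_bps        # sustained rate in bits/second
        self.capacity = burst_bits  # maximum burst size in bits
        self.tokens = burst_bits    # bucket starts full
        self.last = 0.0             # timestamp of the last check

    def allow(self, packet_bits: int, now: float) -> bool:
        # Refill tokens for the elapsed time, capped at the burst size.
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= packet_bits:
            self.tokens -= packet_bits
            return True
        return False  # over the rate: queue or drop the packet

# 768 kbps sustained, one full-size (1500-byte) packet of burst.
bucket = TokenBucket(rate_bps=768_000, burst_bits=1_500 * 8)
print(bucket.allow(1_500 * 8, now=0.0))    # first packet passes
print(bucket.allow(1_500 * 8, now=0.001))  # second one so soon does not
```

Host fairness is the harder half: a shaper like this per subscriber keeps any one host from hogging the cell, but you still need queueing on top of it so that interactive traffic isn't stuck behind bulk transfers.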

Friday, October 24, 2008

DPI and US Telco execs

While I've actually got the time to write for once in a blue moon, I might as well throw in some other news as well: a presentation written by Jon Linden, a veep at Procera. (Disclaimer: Procera is also where I happen to work, so please apply a grain of salt.)

What makes it interesting is that it's aimed at higher-ups at telcos and was presented at the USTelecom Business Executive Forum. I doubt there's a unified angle for the (traffic management) DPI industry when it comes to pushing features and agendas, so I couldn't really call this the view of the industry.

Nonetheless, this is one take on it, and it's a pretty short read. Go for the speaker notes first; the presentation doesn't make as much sense without them.

Sorry, HOW much?

While reading the new TLD application announcement at ICANN, most of the announcement looked like one would expect it to. One thing, however...

"The total fee per applicant takes into account close to $US13 million invested by ICANN since October 2007 to put the design of the implementation program in place. It includes allocated staff time, direct consulting expenses and other fixed costs. This cost will be allocated across the new gTLD applications until it is reclaimed and amounts to $US26,000."

13 million USD for that? Okay - there's surely more beneath the surface than meets the eye. 200 pages, six languages, lawyers, translators and whatnot involved, no doubt - but 13 million?
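The cost-recovery scheme also tells you something the announcement doesn't spell out - how many applications ICANN is counting on:

```python
# What the $26,000 recovery fee implies about expected application
# volume, using the figures quoted from ICANN's announcement.
invested_usd = 13_000_000      # "close to $US13 million" since October 2007
per_application_usd = 26_000   # recovered from each new gTLD application

applications_to_break_even = invested_usd / per_application_usd
print(int(applications_to_break_even))  # 500 applications
```

Five hundred applications to recoup the design work alone. Make of that what you will.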

I'm in the wrong line of work, obviously.