The project is called Maelstrom, and it is certainly interesting, even if there are a number of questions that need to be addressed. The idea of Maelstrom is to develop a browser (currently based on Chromium) that caches and seeds a web page once a user has accessed it. With this in mind, the more people in the swarm, the faster a website and its content would be available. At least that is the theory. Where things start to fall apart is with sites that pull many pieces of content from many locations, all stored in a database. The Maelstrom browser would not have access to everything a site has, so it would only speed up pages that are already seeded. New pages and new content would still have to come from the original publishing server, so they would be no faster than they are today.
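To make that trade-off concrete, here is a minimal sketch of what the fetch logic might look like. The function names and the swarm lookup are my own assumptions for illustration, not Maelstrom's actual API; the point is simply that an unseeded page always falls through to the origin server.

```python
import urllib.request

def fetch_page(url, swarm):
    """Hypothetical swarm-first fetch: use seeded copies when they exist,
    otherwise fall back to the original publishing server."""
    peers = swarm.peers_seeding(url)  # assumed lookup: who is seeding this URL?
    if peers:
        # Seeded page: pieces can come from many peers at once,
        # so more seeders generally means a faster load.
        return swarm.download(url, peers)
    # Unseeded or brand-new page: no speedup at all -- this request is
    # exactly as fast (or slow) as the origin server.
    with urllib.request.urlopen(url) as response:
        return response.read()
```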
However, that is not all that could go wrong with Maelstrom. There is also the problem of data fragmentation. If you update the content on a page, you could end up with 100 systems holding the old data and 10 holding the new. Pull from a randomly chosen copy and you are roughly 91% likely (100 out of 110 copies) to get the outdated page rather than the new one. Of course, there could be a method to remove cached content or force a refresh, but that puts you right back in the position of a brand-new site with no seeds.
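The arithmetic behind that figure is simple: if a reader pulls the page from a random copy, the odds of getting the stale version are just the stale share of all copies.

```python
stale_copies = 100
fresh_copies = 10

# Probability of landing on the outdated page when pulling from a random copy.
p_stale = stale_copies / (stale_copies + fresh_copies)
print(f"{p_stale:.0%}")  # 91%
```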
Beyond simple web content there is streaming media, for example a site that hosts its own podcasts or videos. To benefit from a torrent swarm, that content would have to be stored across the swarm, which means a full copy of that media sitting on every seeding system in it. That could eat up a lot of storage on people's systems very quickly when you consider how much bandwidth streaming media uses. It also rules out use by Netflix and other streaming companies; they are not going to want their content sitting on anyone else's machines.
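A rough back-of-the-envelope number (the bitrate and viewing time are my own assumptions, not anything Maelstrom publishes) shows how quickly that storage adds up:

```python
# Back-of-the-envelope storage cost of caching a streamed video locally.
# Assumed numbers, purely for illustration.
bitrate_mbps = 5   # typical-ish 1080p streaming bitrate
hours_watched = 2  # one movie-length stream

gigabytes = bitrate_mbps * hours_watched * 3600 / 8 / 1000
print(f"{gigabytes:.1f} GB per cached stream")  # ~4.5 GB
```

Multiply that by every show a household watches and by every seeding machine in the swarm, and the disk cost is no longer trivial.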
Lastly, and possibly more frightening, is the chance that someone injects malicious code into their copy of a site. Since you are pulling bits and pieces of a live document from many peers, a tampered copy could in effect poison the swarm and infect anyone who visits the site. This can happen with conventionally hosted sites too, but there it is easier to protect against, and you have a single source to clean instead of thousands.
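For what it is worth, the usual defense in torrent-style systems is to check every downloaded piece against a hash from a trusted manifest; a minimal sketch is below (the function name and manifest idea are my own assumptions, not Maelstrom's design). Even then, the check only helps if the manifest itself is trusted and current, which circles back to the fragmentation problem above.

```python
import hashlib

def verify_piece(piece: bytes, expected_sha256: str) -> bool:
    """Check one downloaded piece against the hash the publisher listed for it.
    A tampered piece from a malicious peer fails the check and can be
    re-fetched from a different peer instead."""
    return hashlib.sha256(piece).hexdigest() == expected_sha256
```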
Maelstrom is interesting. However, there are too many problems for me to believe it is a viable technology for more than a handful of sites. Still, perhaps Maelstrom could be the basis for a new way to serve websites in the cloud… or it could just end up being a really bad idea that goes nowhere.