Off-Topic > Off-Topic - Tiny Tux's Corner

package distribution by bittorrent


solorin:
http://benjaminkerensa.com/2012/05/30/package-updates-over-bittorrent-protocol

gerald_clark:
Not too practical for lots of individual small packages.

solorin:
you're basically making the same point as the third comment down.

granted, there needs to be a certain volume of traffic and peers
before bittorrent's more apparent advantages come into play.

however, a decentralized model empowers the community over gatekeepers.
i think that inherent advantage alone is reason enough to consider these technologies.

i think there are probably ways to implement this where one system could kick in for the other
under the right conditions,
or to have two pieces of infrastructure in place side by side.
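a minimal sketch of that "kick in" idea: prefer the swarm when it looks healthy, fall back to a plain mirror otherwise. the function name, the 5-seeder threshold, and the URLs are all made up for illustration.

```python
# Hybrid source selection: use bittorrent when the swarm is healthy,
# otherwise default to the central mirror. Threshold is a made-up example.

MIN_SEEDERS = 5  # hypothetical cutoff for a "healthy" swarm

def choose_source(seeders: int, mirror_url: str, magnet_uri: str) -> str:
    """Return which transport to use for a package download."""
    if seeders >= MIN_SEEDERS:
        return magnet_uri   # enough peers: let the swarm carry the load
    return mirror_url       # thin swarm: fall back to the central mirror

# usage (both URIs are hypothetical)
print(choose_source(12, "http://mirror.example/pkg.tcz", "magnet:?xt=urn:btih:abc"))
print(choose_source(1, "http://mirror.example/pkg.tcz", "magnet:?xt=urn:btih:abc"))
```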

for example a section of the wiki could be reserved to post torrents,
and each individual packager could provide seeding.
with DHT, i believe there doesn't even need to be a centralized tracker anywhere.
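right: with DHT, peers find each other by the torrent's infohash, which is just the SHA-1 of the bencoded "info" dictionary, so a magnet link alone is enough to join a swarm with no tracker. a sketch of how that hash is derived, using a toy info dict with placeholder values (real ones come from an actual .torrent file):

```python
import hashlib

def bencode(obj) -> bytes:
    """Minimal bencoder (ints, byte strings, lists, dicts) per BEP 3."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        # keys must be byte strings in sorted order
        items = sorted(obj.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError("cannot bencode %r" % (obj,))

# Toy single-file "info" dict; values are placeholders, not a real package.
info = {
    b"name": b"example.tcz",
    b"piece length": 262144,
    b"pieces": b"\x00" * 20,  # stand-in for the SHA-1 of the one piece
    b"length": 1234,
}

# The infohash identifies the torrent on the DHT; a magnet link is just that hash.
infohash = hashlib.sha1(bencode(info)).hexdigest()
magnet = "magnet:?xt=urn:btih:" + infohash
print(magnet)
```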

this is in off topic anyway. i just posted it to see if i could spark any imaginations.

curaga:
Require each packager to seed = require them to be always on, and to have a fast upload connection. That would be a big hurdle IMHO, what with data caps, per-MB charges, and even the always-on requirement itself.

The main isos already have torrents; they aren't officially posted, but with the published md5sums you can be sure they're genuine. (Though I don't see the point: the isos are small, and you can already max out a download by pulling from more than one mirror with aria2.)
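that md5sum check can also be done with nothing but the standard library, streaming the file so a large iso never has to fit in RAM. the filenames in the usage comment are hypothetical:

```python
import hashlib

def md5_of(path: str, chunk: int = 1 << 20) -> str:
    """Stream a file through md5 in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verify(path: str, expected: str) -> bool:
    """Compare against a published checksum, ignoring case and whitespace."""
    return md5_of(path) == expected.strip().lower()

# usage (filenames are hypothetical):
# ok = verify("Core-current.iso",
#             open("Core-current.iso.md5.txt").read().split()[0])
```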

solorin:
this is still being locked in to thinking in a top-down centralized manner.

>Require each packager to seed = require them to be always on, and to have a fast upload connection.

that's actually what the current system of distribution requires of its one central node,
and it's exactly what the swarm can address if properly cultivated.
but y'all know how bittorrent works, no need to rehash.

the third poster (in the article comments) and you both make an interesting point tho:
the useful pieces of information being shared are quite small,
whether they're the isos (small compared to other distros),
packages (many packages in the repo are already tiny),
or updates to binary packages (smaller still),
and bittorrent traditionally makes more sense for transferring large files.
but i'm not so sure that's a good reason to dismiss it so flippantly.

if there are enough peers sharing at any given moment, the benefits would still
be apparent - particularly if they are all trying to hit one or a small subset of mirrors
for an update. it's the size and behavior of the swarm that's actually more relevant here.

if you don't have enough sharing peers, then yes, the topology will default to the central-server
model, and you're left where you started, as you pointed out above.
but as the network scales there should be efficiencies to be gained.
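a back-of-envelope model of that scaling: with one central server, everyone splits the same upload pipe; in an idealised swarm, every peer's upload adds to the pool, so aggregate capacity grows with the peer count. all the numbers below (package size, link speeds) are made-up assumptions, and the swarm case ignores overhead and partial seeds:

```python
# Rough time to deliver one package to N peers, central vs. swarm.
# All figures are illustrative assumptions, not measurements.

def central_time(size_mb: float, server_up_mbps: float, peers: int) -> float:
    """Everyone shares one server's upload pipe."""
    return size_mb * 8 * peers / server_up_mbps

def swarm_time(size_mb: float, server_up_mbps: float,
               peer_up_mbps: float, peers: int) -> float:
    """Idealised swarm: each peer's upload adds to the server's."""
    total_up = server_up_mbps + peers * peer_up_mbps
    return size_mb * 8 * peers / total_up

pkg, srv, peer = 10.0, 100.0, 1.0  # 10 MB package, 100 Mbit/s server, 1 Mbit/s peers
for n in (10, 100, 1000):
    print(n, round(central_time(pkg, srv, n), 1),
          round(swarm_time(pkg, srv, peer, n), 1))
```

with these assumed numbers the two models are close at 10 peers, but at 1000 peers the central server alone takes an order of magnitude longer than the swarm — which is the "efficiencies as you scale" point in one calculation.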

o no i've gone and rehashed bittorrent for y'all. o well.
