TANSTAAFL

LiveJournal (who host this blog) will no longer let new users sign up for their advertising-free “Basic” account. Instead, new users can get the “Plus” account, which has adverts (if you’re using some quaint non-Firefox browser which still shows you such things), or they can get the “Paid” account, which doesn’t.

The announcement of this change followed LJ’s standard practices of bungling and evasion when communicating with their customers, which new-ish owners SUP correctly describe as “the values and legacy of LiveJournal”. This has annoyed a few people, but I’m not sure why, because they should be used to it by now.

Anyhoo, livredor and hairyears are hosting some interesting discussions about it, here and here. hairyears makes the point that buying LJ is not just about buying people’s writings: you’re also getting stewardship of a community (or lots of communities) with their own values. My impression is that this applies more to LJ than to “proper” blogging sites, because of LJ’s mix of blogging and what we’d now call social networking. Social networking sites have the feel of places we go with our friends, so it’s not very surprising that we can be vociferous in defending them (LJ isn’t the only one with epic failures of customer relations: Facebook had the Feed and Beacon debacles).

Servers and bandwidth are not free, as GreatestJournal has been finding out (the hard way). But how do you make money out of such a prickly bunch? danahboyd’s commenters have some good suggestions.

Geeks who still use Usenet (you remember, Usenet) have suggested a peer-to-peer system as a way around all this nonsense (see the comments on both livredor and hairyears’s postings). This sort of thing is a reflex response from geeks to any outside manipulation of their stuff, until their enthusiasm is curbed by older and wiser geeks. Having been curbed, I realise that you’d need good answers to questions about how you make such a thing work, how you make it usable by non-geeks, and, related to that, how you interest people who don’t think the peer-to-peer part is intrinsically cool. Freenet has been around a long time and hasn’t become popular. BitTorrent has, because it gets people something they want (warez, pr0n, TV programmes, Linux DVDs) in a way which scales better than the centralised alternative.

I think robhu is right to say that the web browser has to remain as the interface (though that in itself makes security interesting), but it’s not clear that HTTP has to be the transport for such a thing. His idea of a federation of LJ-like servers is interesting, but once you centralise, you’re back to the question of how the people running the big servers make any money. There might be a place for the Usenet model, where each ISP runs a server for their users, or perhaps for the MSP model (which Usenet is moving to as its popularity declines), where I pay the people running a good Usenet server a yearly fee to access it.

The network effects are a killer: you need something special to get off the ground and up to the stage where people are joining because other people are there. That, or you bodge your thing on the side of an existing infrastructure: can we do this using XMPP or Usenet or email, I wonder?
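
For what it’s worth, the Usenet option is the easiest of those to sketch. Here is a minimal illustration, assuming Python’s standard nntplib and an entirely hypothetical news server and journal group; it says nothing about the hard parts (access control, friends-locking, and who pays for the servers).

```python
# A sketch only: pushing one journal entry over NNTP with nntplib (which,
# fittingly, was removed from the standard library in Python 3.12).
# The addresses, server and group names below are invented.
import io
import nntplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "someone <someone@example.org>"
msg["Newsgroups"] = "journal.someone"             # hypothetical journal group
msg["Subject"] = "TANSTAAFL"
msg.set_content("Servers and bandwidth are not free...")

with nntplib.NNTP("news.example.org") as server:  # hypothetical server
    server.post(io.BytesIO(msg.as_bytes()))       # POST the article
```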

19 Comments on "TANSTAAFL"


  1. 1) I like and take your point about reflex responses. I don’t think the NANAE guy is talking about the same problem, though. We’re not trying to stop spies reading our stuff. Spies can read our stuff anyway, and they always will be able to. What we’re trying to do is have social networking in such a way that, if we dissociate ourselves from a part of the WSOGMM which we used to frequent because of a policy change we disagree with, it isn’t nearly impossible for us to keep our social connections.

    2) I was saying to someone the other day that something that web forums don’t have is a way of reading them offline, and I was about to suggest some kind of REST-based thing, and then I thought… NNTP. Duh. I may post about that if I get a spare moment (see the sketch after point 3 below).

    3) XMPP/Jabber itself is an interesting case study. The whole idea was that it was a Free, decentralised IM system. And for years it was wonderful and cool and nobody used it. Then suddenly LJ-back-when-LJ-was-cool started pushing rebadged Jabber, and then Google started pushing rebadged Jabber, and now freakin’ AOL are pushing rebadged Jabber. It seems to be a snowball effect. And people seem to have wanted the decentralisation all along – one of the ways the corporations have sold it to customers is that you can talk to people on “other” IM networks! I don’t know how it happened, but it would be good to know.
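
    To put some flesh on point 2: here is a rough sketch of offline reading over NNTP, again using Python’s standard nntplib, with the server and group names invented for illustration. A real client would remember which article numbers it had already fetched.

    ```python
    # Sketch: pull recent articles from a (hypothetical) journal newsgroup
    # and save them locally for offline reading.
    import nntplib

    with nntplib.NNTP("news.example.org") as server:      # hypothetical server
        resp, count, first, last, name = server.group("journal.someone")
        start = max(first, last - 20)                     # roughly the last 20 articles
        resp, overviews = server.over((start, last))
        for number, fields in overviews:
            resp, info = server.article(number)
            with open(f"{number}.eml", "wb") as f:
                f.write(b"\r\n".join(info.lines))
            print(number, fields["subject"])
    ```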

    Reply

    1. 1) Yes, my advocacy of Freenet as the way of distributing spam blacklists isn’t intended as a solution to quite the same problem, but it’s an example of the tendency to see distributed systems and a bit of crypto as The Answer. Some of the policy changes on LJ do seem to be out of the fear of censorship: if the Dukes of Hazzard found out that the distributed system was harbouring paedophiles (as it undoubtedly would if it were truly uncensored), they’d go after that in the media and law enforcement, just as LJ feared they would, and then we’d see how the system held up against those sorts of attacks.

      3) It looks like Google may have been responsible for making Jabber popular. Maybe what you’d need for a distributed system is a powerful advocate.

      Reply

  2. Decentralization ought to be an easy sell to anyone who has just been bitten by the downside of centralization, provided you have a working answer at that point; otherwise they’ll just go looking for a less-evil centralized service. That’s fine, though; if you believe that centralized systems are inherently and inevitably flawed, those events will keep on occurring until you get round to producing your answer (and you have a bigger list of things to put in your “we told you so” response to each disaster).

    Reply

  3. “but once you centralise, you’re back to the question of how the people running the big servers make any money”
    The same way pre-SUP LJ did.

    Presumably they made money back then; it’s just that now they want to make more.

    Reply

      1. My experience has been that the site has always been fast and fine, and that experience goes back to before 6A. Then again, I was always a paid user, so I got the priority queue thing.

        I’m not sure that’s a big problem. The site was really popular before 6A took over, and it’s not clear that people would be unhappy with things being like they were before; anyway, over time this becomes less of a problem as bandwidth, disk, and CPU get cheaper.

        Reply

  4. We’re not far from a decentralised position.

    We have OpenID and RSS, which gives us everything bar the comments part of it. If comments went back to the originating server then we’d have everything…

    Reply

    1. So, is there anyone working on a standard for sending comments back to the source post? All that’s needed is an XML tag in Atom or RSS for a URL to post the comment to. Of course anyone who automatically displayed anything sent to such an address on the site would be a fool.

      Reply

      1. There are a couple of ways it could work. If Bob is reading Alice’s journal on bob.com, then bob.com could either accept his comment submission itself and forward it, or there could simply be a tag which says “comments go to this URL at alice.com” (and another which says “there are currently n comments” so you can avoid the hacks which do this with images embedded in the RSS feed). In the RSS/OpenID model where a person’s blog still lives somewhere (rather than being totally distributed, Usenet-style), I think the latter is preferable. You read comments on Alice’s blog and make new ones at alice.com, having logged in using OpenID.

        [Edited for typos]
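
        For what those two tags might look like in practice: the Atom Threading Extensions (RFC 4685) already define a “replies” link relation and a comment-count attribute, so a sketch (with invented URLs) could build the entry like this. Whether alice.com would actually accept comments POSTed to that address is exactly the standards question raised above.

        ```python
        # Sketch: an Atom entry advertising "comments are posted here" and
        # "there are currently n comments", using the link relation and count
        # attribute from the Atom Threading Extensions (RFC 4685). URLs invented.
        import xml.etree.ElementTree as ET

        ATOM = "http://www.w3.org/2005/Atom"
        THR = "http://purl.org/syndication/thread/1.0"
        ET.register_namespace("", ATOM)
        ET.register_namespace("thr", THR)

        entry = ET.Element(f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}title").text = "TANSTAAFL"
        ET.SubElement(entry, f"{{{ATOM}}}id").text = "tag:alice.com,2008:post-42"
        ET.SubElement(entry, f"{{{ATOM}}}link", {
            "rel": "replies",                          # comments live (and are made) here
            "href": "http://alice.com/posts/42/comments",
            f"{{{THR}}}count": "19",                   # replaces the embedded-image hack
        })

        print(ET.tostring(entry, encoding="unicode"))
        ```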

        Reply

      2. You’d have the same rules that we do at the moment – anyone can send comments to LJ, but only ones which fit my standards end up on my journal (no anonymous ones, for a start).
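
        A sketch of what that might look like at the receiving end, with the Comment type and screening queue entirely hypothetical: the journal which owns the comment URL applies its own rules, and nothing is displayed automatically.

        ```python
        # Sketch: local policy at the journal receiving submitted comments.
        # Nothing is shown automatically; these rules are invented to mirror
        # the LJ-style settings described above.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Comment:
            openid: Optional[str]   # None means an anonymous submission
            body: str

        def accept(comment, friends, screening_queue):
            if comment.openid is None:
                return False                        # no anonymous ones, for a start
            if comment.openid not in friends:
                screening_queue.append(comment)     # hold strangers for screening
                return False
            return True                             # friends' comments appear straight away
        ```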

        Reply

    2. There’s also the question of how you get access to friends-locked stuff. A model which has lots of blogging sites using OpenID and RSS presumably has one blogging site pulling locked postings from another. That’s extending the trust boundary to encompass all the sites which are pulling in those locked postings. This makes me uneasy, because it means that people who friend someone offsite aren’t just trusting the individual person, they’re also trusting the other site (currently, if you’re using a site as your OpenID provider, that site can impersonate you anyway, but AFAIK there’s no standard way to get a secured RSS feed, so attacks are harder to automate).

      The friends-only stuff is part of what makes LJ different from standard blogging sites, so it’s a feature that would need careful thought in any putative replacement.

      Reply

      1. I don’t see how you can get away from this problem without decrypting stuff / installing apps / etc… all of which would require the end user to install something locally (not going to happen), or would require browsers to be quite a lot more advanced than they are now.

        I think we’re living in a world where the closest we’re going to get (and not have it be so annoying that people won’t use it) is where Anne trusts Bob, so she puts Bob on her friends list; Bob uses a different blogging site, so when Anne adds Bob it says “This will also mean trusting $otherbloggingsite.com, are you sure?” (if Anne currently has no friends on $otherbloggingsite.com).

        Another possible solution might be to do decryption with JavaScript in the browser, but AFAIK there is no easy way to store keys (or anything) locally.

        Reply

    3. There’s another question: if I decide I’m pissed off with LJ, how do I move my existing content to another site? There’s no standard blog archival format at the moment. You can get all your posts and comments off LJ in an XML format, but the place you’re going to needs to be able to import it all.
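
      As a sketch of the import half of that: assuming (and this is an assumption, not a documented format) an export file whose entries carry eventtime, subject and event elements, and a hypothetical post_to_new_site() on the receiving end, the importer is not much code.

      ```python
      # Sketch: walk an exported-posts XML file and hand each entry to whatever
      # posting interface the new site turns out to have. The element names and
      # post_to_new_site() are assumptions, not a documented format or API.
      import xml.etree.ElementTree as ET

      def post_to_new_site(post):
          print("would import:", post["time"], repr(post["subject"]))   # stand-in

      def import_backup(path):
          tree = ET.parse(path)
          for entry in tree.getroot().iter("entry"):
              post_to_new_site({
                  "time": entry.findtext("eventtime"),
                  "subject": entry.findtext("subject", default=""),
                  "body": entry.findtext("event", default=""),
              })

      import_backup("exported-posts.xml")    # hypothetical export file
      ```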

      Reply

      1. In my imaginary future where we replace LJ, most providers would be running the same blogging software, just as the other LJ clones run LJ’s own software now. If we wrote such software ourselves, it would be terribly easy to make an export utility that allowed a user to move from one journalling provider to another (actually I think it ought to be automated and built in somehow so the user can do it all on the web, although providers might not like that idea*).

        Given that these other sites don’t exist yet, I don’t think it’s a major problem that there is no way to import XML backups of LJ into them. I don’t think it would be particularly hard for a geek to write an import utility (well, and to design the thing so you can import comments from people who aren’t actually on this site, etc), so I’m not too worried by that either.

        This is something that ought to be done. I mean, the LJ software is GPLed; maybe someone could bolt some stuff onto that and make it do everything we want.

        * Then again, they might – in my imaginary future we make some LJ clone, GPL the source, and then people either run sites for fun, because they want to run their own site on their own server in their mother’s basement, or they ‘compete’ on things other than screwing everyone over, i.e. we end up with companies running these sites that are more like Red Hat and less like Microsoft.

        Reply

        1. Atom allows for both posting and syndicating using very similar formats. Backing up in that format and using it to post with would allow for simple cross-platform backup/restore.
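
          A rough sketch of the restore half, assuming the new host speaks the Atom Publishing Protocol (RFC 5023); the collection URI is invented, and a real client would also have to deal with authentication.

          ```python
          # Sketch: re-posting one backed-up Atom entry via the Atom Publishing
          # Protocol (RFC 5023). The collection URI is invented; a successful POST
          # comes back as 201 Created with the new post's address in Location.
          import urllib.request

          entry = (
              '<entry xmlns="http://www.w3.org/2005/Atom">'
              "<title>TANSTAAFL</title>"
              "<updated>2008-03-13T12:00:00Z</updated>"
              '<content type="text">Servers and bandwidth are not free...</content>'
              "</entry>"
          )

          req = urllib.request.Request(
              "http://newhost.example.org/atom/posts",       # hypothetical collection URI
              data=entry.encode("utf-8"),
              headers={"Content-Type": "application/atom+xml;type=entry"},
              method="POST",
          )
          with urllib.request.urlopen(req) as resp:
              print(resp.status, resp.headers.get("Location"))
          ```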

          Reply

  5. Correct me if I’m wrong, but the only thing that needs a central server is ID generation and verification. OpenID goes some way to solving that.

    Yes, the server has tools for displaying posts to a selected audience – the hardest part of the process – but almost all the other problems go away with decentralisation. Scale and storage especially: collectively, LJ is enormous but an individual journal is a small amount of text with visitor traffic measured in the hundreds.

    The peer-to-peer alternative will launch and set sail if, and only if, a straightforward ‘run this script on your webspace’ installer is made available, and it’s made even easier to own your own webspace. And that, in turn, is all about getting a web hosting company (or better still, an ISP) involved. But freely, non-exclusively, and on the commercial model that the p2p journal-out-of-the-box is a great way of getting new customers.

    Better still, a basic LJ-clone that is so standardised that non-geeks can have a user interface for uploading styles and widgets is an obvious winner. Admittedly, it’d be a lot of work to write; but the reward is an anarchist’s dream – it does worse than compete with MySpace: it breaks their business model.

    Reply

    1. If I’m not misunderstanding you, this seems to be a third sort of model: everyone has their own personal webspace and runs the server software there (the other two are ewx’s Usenet with rich content and crypto, and andrewducker’s standardisation of protocols for a smaller number of dedicated blogging servers to talk to each other).

      The bog-standard personal webspace you get with most ISPs doesn’t let you run scripts at all. Running your own blog on your personal site means you pay extra for the deal where you get a database and some ability to run scripts on the server. Those deals don’t provide a standardised environment, so writing an installer is probably hard. Have a look at the installation instructions for WordPress, for example.

      If there were a way to get the thing bootstrapped, you might find ISPs offering the blogging setup as a standard service, like email is (or Usenet used to be). But at the moment the sort of server-side stuff you need for individuals to set this up themselves is expensive, because you’re buying a general-purpose hosting account, so you don’t get the economies of scale that someone running a dedicated blog “farm” gets. I think andrewducker’s idea, which implies specialist MSPs for bloggers, like we have now but with better communication between them, is more likely to be a goer.

      Reply

      1. With standard protocols you can have both mega-sites like LJ and individual sites, all using the same methods for communicating back and forth.

        And you don’t _need_ scripting access to have a blog. You could create it dynamically on your PC and then upload the results. Not as slick, but doable.
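
        A sketch of that workflow, assuming nothing more than the FTP upload which plain ISP webspace already offers; the host, login and entries below are all invented.

        ```python
        # Sketch: render entries to static HTML on your own PC, then upload the
        # results to plain ISP webspace over FTP. No server-side scripting needed.
        import ftplib
        import io

        entries = [("TANSTAAFL", "Servers and bandwidth are not free...")]

        pages = {}
        for number, (title, body) in enumerate(entries):
            pages[f"post-{number}.html"] = (
                f"<html><head><title>{title}</title></head>"
                f"<body><h1>{title}</h1><p>{body}</p></body></html>"
            )

        with ftplib.FTP("ftp.example-isp.net") as ftp:     # hypothetical ISP host
            ftp.login("username", "password")              # placeholder credentials
            for name, html in pages.items():
                ftp.storbinary(f"STOR {name}", io.BytesIO(html.encode("utf-8")))
        ```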

        Reply
