A thing I found while investigating how to get journal backups going again in the wake of LJ’s most recent debacle:

A while back, geeks kept saying that LiveJournal should be more like Usenet news: instead of mucking about with all the tedious web forum stuff, it’d be nice to have a program which let you read comments and entries, kept track of threading and of which comments you’d already read, and so on. (Remembering what you’ve read on LJ was the motivation for my LJ New Comments script, but that doesn’t avoid LJ’s clunky interface.)

This was tricky as there was no obvious way to get all the comments from an entry. There was the old comment export thing, but that only works on your own journal. You could “screen scrape” with a program that tried to pull the comments from the human-readable versions of LJ’s pages, but that’s considered rude because of the load it’d put on LJ’s server, and it’s fragile as it might break if LJ changes the human-readable output.

Luckily, LJ added a bunch of new stuff to its existing interface for “clients” (programs which access LJ, like Semagic). This includes the getcomments method, which allows you to get all the comments on any entry you can see.
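For the curious, here’s roughly what talking to that interface looks like from Python. The endpoint and the getcomments method name are as per LJ’s client protocol, but the exact parameter set is my assumption from the other methods, so treat this as a sketch rather than gospel. The one solid detail is how entry ids work: the id you see in an entry’s URL (the “ditemid”) is the internal itemid times 256, plus the entry’s “anum”.

```python
# Sketch of calling LJ's getcomments method over XML-RPC. The parameter
# names ("journal", "ditemid", the auth fields) are my guess from LJ's
# other client-protocol methods, not a tested invocation.
import xmlrpc.client

def public_item_id(itemid, anum):
    """The ditemid seen in entry URLs is itemid * 256 + anum."""
    return itemid * 256 + anum

def fetch_comments(server_url, journal, ditemid, auth):
    """Ask the server for the comments on one entry (hypothetical wrapper)."""
    server = xmlrpc.client.ServerProxy(server_url)
    params = {"journal": journal, "ditemid": ditemid, "ver": 1}
    params.update(auth)  # challenge-response auth fields, per the protocol docs
    return server.LJ.XMLRPC.getcomments(params)
```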

Add this to the existing machine-readable stuff (Atom feeds, getfriendspage) and you could probably write either a client specific to LJ (the iPhone client is the reason LJ added the getcomments method, by the looks of it) or a proxy to turn the whole thing into NNTP and let you use conventional Usenet clients. Who’s first?
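The NNTP proxy is less magic than it sounds: a newsreader threads articles using Message-ID and References headers, so the proxy just has to mint an id per comment and walk the parent chain. A toy version, assuming getcomments hands back something with “dtalkid” and “parentdtalkid” fields (my guess at the field names) and using a made-up Message-ID scheme:

```python
# Toy of the NNTP-proxy idea: synthesise Message-ID and References headers
# from an LJ comment tree, which is all a newsreader needs for threading.
# The comment field names and the lj-proxy.invalid id scheme are made up.

def message_id(journal, ditemid, talkid):
    return "<%s.%s.%s@lj-proxy.invalid>" % (journal, ditemid, talkid)

def references(comments, journal, ditemid, talkid):
    """Walk up the parent chain, emitting ancestor Message-IDs oldest first."""
    by_id = {c["dtalkid"]: c for c in comments}
    chain = []
    parent = by_id[talkid].get("parentdtalkid")
    while parent:  # a parent id of 0 means a top-level comment
        chain.append(message_id(journal, ditemid, parent))
        parent = by_id[parent].get("parentdtalkid")
    return " ".join(reversed(chain))
```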

(Personally, I still plan to be off once I can actually back up this journal, including the comments of my esteemed readers. But I won’t stop reading, so this would be a nifty toy even for me.)

Edit: another thing this allows is third parties offering comment feeds of your journal: someone could write a thing which turned the comments from an LJ entry into an Atom feed. Real blogs have these, so LJ could too.
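That part is mostly plumbing: fetch the comments, then serialise them as Atom entries. A minimal sketch using only the standard library, where the shape of the comment dicts (poster, id, date, text) is invented for illustration:

```python
# Minimal sketch of a third-party comment feed: turn a list of comments
# into an Atom feed. The comment dict shape is made up for illustration.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def comments_to_atom(entry_url, comments):
    ET.register_namespace("", ATOM)
    feed = ET.Element("{%s}feed" % ATOM)
    ET.SubElement(feed, "{%s}title" % ATOM).text = "Comments on " + entry_url
    ET.SubElement(feed, "{%s}id" % ATOM).text = entry_url
    for c in comments:
        entry = ET.SubElement(feed, "{%s}entry" % ATOM)
        ET.SubElement(entry, "{%s}title" % ATOM).text = "Comment by " + c["poster"]
        ET.SubElement(entry, "{%s}id" % ATOM).text = "%s#comment-%d" % (entry_url, c["id"])
        ET.SubElement(entry, "{%s}updated" % ATOM).text = c["date"]
        content = ET.SubElement(entry, "{%s}content" % ATOM, type="text")
        content.text = c["text"]
    return ET.tostring(feed, encoding="unicode")
```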

The recent abolition of free accounts with no ads on LiveJournal provoked some interesting comments on LJ itself, and on the wider question of how social networking sites can make any money.

In a nice turn of phrase, antennapedia speculates that LJ may have “begun the descent through the levels of credible ownership” (which is presumably antennapedia’s reason for producing a migration tool to assist in moving your journal to another server which uses LJ code). chipotle has some interesting numbers (although some are probably faulty) and some speculation on where the Russian overlords are heading.

There are the expected “let’s all go somewhere else” projects which will set up a page on Sourceforge/Google Code, argue about what to implement and then die (elsejournal, for example). synecdochic knows a thing or two, having worked for LJ in the past, and may have a credible proposal, although I’m curious about some of the technicalities.

After each fresh stupidity from LJ, a bunch of people bugger off to existing LJ clones which are running the Open Source parts of LJ’s code. GreatestJournal staggered under the weight. InsaneJournal is holding up, except when their hosting provider accidentally turns them off. synecdochic rightly worries about InsaneJournal in the long term, because scaling up your website when it gets popular is a hard problem, requiring equipment and people who don’t come cheap. synecdochic also has some insights into how that worked for LJ itself, if you’re interested.

Wired has a brief piece pointing out that nobody’s quite worked out how you make money off social sites yet. Perhaps you don’t: unoriginal1729 reckons search engines will always have the edge, because they can serve appropriate ads at the point where you’re actually looking to buy something, rather than speculatively advertising based on inferring things from your interests. Maybe the thing which precipitates a working version of the geeks’ dream of Usenet-plus-crypto-magic will be all the centralised sites running out of money.

LiveJournal (who host this blog) will no longer let new users sign up for their advertising-free “Basic” account. Instead, new users can get the “Plus” account, which has adverts (if you’re using some quaint non-Firefox browser which still shows you such things), or they can get the “Paid” account, which doesn’t.

The announcement of this change followed LJ’s standard practices of bungling and evasion when communicating with their customers, which new-ish owners SUP correctly describe as “the values and legacy of LiveJournal”. This has annoyed a few people, but I’m not sure why, because they should be used to it by now.

Anyhoo, livredor and hairyears are hosting some interesting discussions about it, here and here. hairyears makes the point that buying LJ is not just about buying people’s writings, you’re also getting stewardship of a community (or lots of communities) with their own values. My impression is that this applies more to LJ than to “proper” blogging sites, because of LJ’s mix of blogging and what we’d now call social networking. Social networking sites have the feel of places we go with our friends, so it’s not very surprising that we can be vociferous in defending them (LJ isn’t the only one with epic failures of customer relations: Facebook had the Feed and Beacon debacles).

Servers and bandwidth are not free, as GreatestJournal has been finding out (the hard way). But how do you make money out of such a prickly bunch? danahboyd’s commenters have some good suggestions.

Geeks who still use Usenet (you remember, Usenet) have suggested a peer-to-peer system as a way around all this nonsense (see the comments on both livredor’s and hairyears’s postings). This sort of thing is a reflex response from geeks to any outside manipulation of their stuff, until their enthusiasm is curbed by older and wiser geeks. Having been curbed, I realise that you’d need good answers to questions about how you make such a thing work, how you make it usable by non-geeks, and, related to that, how you interest people who don’t think the peer-to-peer part is intrinsically cool. Freenet has been around a long time and hasn’t become popular. BitTorrent has, because it gets people something they want (warez, pr0n, TV programmes, Linux DVDs) in a way which scales better than the centralised alternative.

I think robhu is right to say that the web browser has to remain as the interface (though that in itself makes security interesting), but it’s not clear that HTTP has to be the transport for such a thing. His idea of a federation of LJ-like servers is interesting, but once you centralise, you’re back to the question of how the people running the big servers make any money. There might be a place for the Usenet model, where each ISP runs a server for their users, or perhaps for the MSP model (which Usenet is moving to as its popularity declines), where I pay the people running a good Usenet server a yearly fee to access it.

The networking effects are a killer: you need something special to get off the ground and up to the stage where people are joining because other people are there. That, or you bodge your thing on the side of an existing infrastructure: can we do this using XMPP or Usenet or email, I wonder?