cupidsbow (cupidsbow) wrote,

How to Post Del.icio.us Links to LJ

A couple of people have been asking how I got the robot to pull links from my del.icio.us feed and turn them into LJ posts.

It was astoundingly easy, thanks to wistfuljane, who has written the clearest and most user-friendly set of instructions it's ever been my pleasure to follow: [How-To] Delicious Glue.

I'm not even going to try and summarise. But before you go, I do have one other useful bit of advice. If you want the compilation post to go to an LJ community (as opposed to a personal journal), it won't work without another step. (ETA: murklins has solved this. See comments for info on her new script.)

Because of the way LJ communities are set up, the del.icio.us feed can't log in to them directly, which it needs to do when it makes the post. I got around this by creating a shell livejournal, rec_room_bot. Once the automated post is listed there, I just go in, copy it, and then paste it to rec_room. I can also make any editorial changes, add tags, memories, etc, at the same time.

If rec_room wasn't already an established comm, I'd just have created a journal specifically for the feed; but the extra step isn't a particular burden, and is much, much faster than making each rec individually as I used to.


ETA: I was asked about how to create multiple robot posts, separated by fandom.

ETA 2: Ignore the first point below and go to comments instead. The lovely murklins has written a new script which does this, and I'm sure she'd let you have it if you ask nicely.

Okay, I've actually got two answers to this:

  1. I'm using the shell LJ, rec_room_bot, as the site the robot posts to, which means I have to cut and paste each rec set from there and re-post over on rec_room. As I'm doing that already, it's no big deal to edit the HTML and make any changes I want.

    Because of that, at the moment I'm just getting one dump of everything I've recced every 24 hours, and then I'm splitting it up manually as I repost. I'm familiar with HTML, and the code is very basic anyway, so this is no great burden on my time.


  2. ETA: The tags you list in the lj.php file define which tags appear on the LJ post. They don't define which links the robot harvests. Every public link you've posted to del.icio.us in each 24-hour period is picked up. There's no way to limit it, although you can go in and edit the autogenerated post afterwards, as I discuss above.

    Thanks to wistfuljane, amothea and bibliotech for the fantastic workshopping that's going on here.

    I have been considering getting the robot to do the split by fandom for me, should circumstances change.

    I assume every tag listed in lj.php would appear in the same harvest post, rather than creating separate posts for each tag (at present I'm harvesting all tags by using the empty "" option). So I think I'd need a new lj.php for each separate post I wanted.

    What I'd do is create sub-directories on my website, and name them after the tags I want to harvest separately. Then I'd create different versions of lj.php, with different tags listed, and save them in the subdirectories with the matching names. [ETA from wistfuljane: As for separate posts, creating different .php scripts (lj-songvids.php, lj-sga.php) should be sufficient. You don't need to have separate xmlrpc.inc files.]

    Then I'd create multiple robot "thingys" on del.icio.us, one for each automated post I wanted: one bot for songvids at 23:10, one for SGA at 23:20, one multifandom one for all the other fandom tags at 23:40, etc.

    This would work fine, as all my posts have a fandom tag, although songvids present a problem as they have fandom tags as well as the 'songvid' tag. But if I did songvids first, they would already have been harvested, so might not get picked up by the later bots. I shall have to test this.

    [ETA from wistfuljane: If I remember correctly, the bot would pick up any links within the last 24 hours regardless of whether they have been harvested already or not. I have no idea how you would work around that except manually - although, I guess you could try "fandomname+fiction"... I don't think it would work though.]
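To make the confirmed behaviour concrete, the relevant part of lj.php might look something like this. This is only a sketch: the variable names are my inventions for illustration, not necessarily what the script actually uses.

```php
<?php
// Hypothetical excerpt from lj.php -- variable names are
// illustrative only, not the script's real ones.

// Tags applied to the generated LJ post. These label the post;
// they do NOT filter which del.icio.us links get harvested.
$post_tags = "recs, links";

// Every public link bookmarked in the last 24 hours is picked up,
// regardless of its del.icio.us tags; "private" links are skipped.
?>
```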
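The speculative multi-post setup from point 2 could be laid out something like this (file names, times, and the variable name are examples only; per the ETAs above, the harvest itself can't be filtered by tag, so this remains untested speculation):

```php
<?php
// Hypothetical layout, following wistfuljane's suggestion of one
// script per post rather than separate subdirectories:
//
//   lj-songvids.php  -> run by a del.icio.us robot at 23:10
//   lj-sga.php       -> run by a second robot at 23:20
//   lj.php           -> catch-all robot at 23:40
//
// All three can share the same xmlrpc.inc. Each copy would differ
// only in its settings, e.g. in lj-songvids.php:
$post_tags = "songvids, recs";  // tags on the generated LJ post
?>
```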


I have also been wondering about the del.icio.us privacy setting. Posts marked "private" are not picked up by the robot (thank you, wistfuljane). I am still wondering whether the robot will pick up posts in retrospect, the day after a link's setting is changed from "don't share" to public. ETA: Nope. It only picks up what's been newly posted within the 24-hour window, so if you want something formerly 'private' to be picked up, delete and repost the link to del.icio.us.

If people have further info, please let me know. It seems useful to pool knowledge.
Tags: admin, links, software