Hmm, thanks for the link. But reading through those posts, it looks like the methods that still work are just standard page-sucking techniques, the kind that would get us banned from their site if we tried automating them as a service for our users. (The person who recommended wget notes that you have to explicitly override LJ's robots.txt.)