QZ qz thoughts
a blog from Eli the Bearded

Site Revamp


Making this blog has prompted me to look at the rest of the site. The main home page was rather sad. I've spruced it up, dropped the downloadable font for the logo, and replaced it with an image. It's a lot smaller, and it allowed me to color the image in. The new logo uses "Stereolab" by Blue Vinyl Fonts. A Stereolab logo is in the rotation from the original set of random logos. The designer has changed websites since then (but still doesn't have a good SSL certificate). That font lacks a blank version with the upper graph, so I used the dots version, edited the dots out, filled the outline with a gentle red-to-purple gradient, then superimposed black lines on top of the colored ones in the graph. I think the effect is much nicer than the old "Impact Label" font and is half the size. (The designer of that, Michael Tension, does not have his own site.)

I also changed the Braille content and made the table adjustable in size for smaller screens. Then I labeled the links with some simple categories and put a plain text H1 tag at the end for text browsers.

Beyond the home page, I also cleaned up the cardboard shelves how-to, which had suffered from Flickr images going away. Fortunately, I have a complete mirror of my Flickr pages that dates back to when I gave up on that site. So finding the images by their old Flickr IDs was trivial.

Usenet post filter


Okay, I have added a lot more old reviews, and along the way I wrote a filter that converts Usenet posts to qzpostfilt markup. I call it qznewsfilt.

It does much, but not all, of the grunt work of the conversion. I still have to manually add filenames and tags, fix line-wrapped underscore-for-italics issues, convert quotes to blockquotes (where I mean quotes from books and the like, not quotes from other posts), excise quotes from other posts (that context isn't valid on the web), and fix link text.
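
The underscore issue, at least, is mostly mechanical. As a hypothetical sketch (this is not qznewsfilt itself, just the shape of the fix), a split _italic_ span can be spotted by counting underscores on adjacent lines and rejoining them:

def rejoin_wrapped_italics(lines):
    # If a line ends inside an _italic_ span (odd number of underscores)
    # and the next line closes it (also odd), join the two so the span
    # is whole again.
    out = []
    for line in lines:
        if out and out[-1].count('_') % 2 == 1 and line.count('_') % 2 == 1:
            out[-1] = out[-1].rstrip() + ' ' + line.lstrip()
        else:
            out.append(line)
    return out

print(rejoin_wrapped_italics(["I watched _Movie", "Name_ again last night."]))
# ['I watched _Movie Name_ again last night.']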

For the last of those, the link text, I found that many, but not all, could be fixed with a vi macro. I created it using the handy q macro recorder in vim.

k:.s,[^(]*,_&_ ,^M0Djf"p

A paragon of clarity, isn't it? Broken down:

k                         move up a line
 :.s                      search and replace on current line
    ,[^(]*,               everything up to first open parens
          ,_&_ ,          surround that string now with underscores and end space
                ^M        this is a literal carriage return (Enter) to run the :s
                  0       move cursor to start of line
                   D      delete to end of line
                    j     move down a line
                     f"   find first " on the line
                       p  paste in last deleted fragment

I used it to turn:

Movie Name
.a https://imdb.example/url/ "at IMDB"

into


.a https://imdb.example/url/ "_Movie Name_ at IMDB"

On the less frequent

Movie Name (Year)
.a https://imdb.example/url/ "at IMDB"

examples, it produced output that needed a little whitespace fixing afterwards:


.a https://imdb.example/url/ "_Movie Name _ (Year)at IMDB"

But overall, not too bad.

Rewriting History


I didn't use this blog between 2008 and February 2020. But now there are "posts" with movie reviews, TV reviews, book reviews, and short-film reviews from late 2014 to early 2020.

I've taken old Usenet posts of mine and applied a bit of my new markup and made them into posts, timestamped to match when I posted them. I've excised the quoted text from other people that appeared in some of them, but it's mostly legacy writing of mine. I'll probably add more from before 2014 at some point.
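
Blosxom normally dates an entry by the file's modification time, so the backdating presumably comes down to setting each converted file's mtime from the original post's Date: header. A hypothetical sketch (the path and date string here are made up):

import email.utils, os

def backdate(entry_path, usenet_date_header):
    # Parse an RFC 2822 style Date: header and set the entry file's
    # mtime (and atime) to it, so blosxom shows the old date.
    when = email.utils.parsedate_to_datetime(usenet_date_header).timestamp()
    os.utime(entry_path, (when, when))

backdate("reviews/movie-name.txt", "Tue, 03 Mar 2015 18:24:00 -0800")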

This is to get double use out of my own writing and to provide some historical content that might be worth exploring as I start to add new reviews.

Dead links


A lot of the old posts are about sites that do not work anymore. Some changed URLs and some are just gone. It bothered me.

So I wrote a quick tool to check the links in posts, add a "deadlink" tag, and, if it is the old standard link blog post format, edit the post so the bad link is no longer an actual <a href> but is just shown as text.

I use the "spider" mode of wget at the heart of the check, with a command line nearly out of the manpage. In a small inital run, I found that the default retries for unresponsive sites were excessive. I adjusted that down.

linkchecker
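
That script isn't reproduced here, but as a rough sketch of the idea (the helper name and the exact flags are my own choices, not necessarily what linkchecker uses), the core test is a wget --spider call with the retry count turned well down:

import subprocess

def link_is_alive(url):
    # --spider checks the URL without saving anything; --tries and
    # --timeout keep unresponsive hosts from stalling the whole run.
    result = subprocess.run(
        ["wget", "--spider", "--tries=2", "--timeout=15", "--quiet", url])
    return result.returncode == 0   # nonzero exit means a dead link

if not link_is_alive("https://imdb.example/url/"):
    print("tag the post deadlink and downgrade the <a href> to plain text")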

Entries that I find a new link for will get tagged fixedlink to highlight that it's an old post which has had edits.

Also, after yesterday's logo post, I fixed two bugs in the post filter. One was serious, dealing with a lack of an anchor on the .code operations; the other was minor, putting punctuation outside of anchor tags. At the same time I added a <pre> handler which also accepts ``` style markdown fences, and I updated the tests to match.
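
The fence handling is simple enough to sketch. Hypothetically (this is the shape of the idea, not the actual qzpostfilt code), a ``` line toggles a <pre> block and everything inside is passed through escaped:

import html

def fences_to_pre(lines):
    out, in_fence = [], False
    for line in lines:
        if line.strip().startswith("```"):
            # A fence line toggles the block open or closed.
            out.append("</pre>" if in_fence else "<pre>")
            in_fence = not in_fence
        elif in_fence:
            out.append(html.escape(line))  # keep code literal inside <pre>
        else:
            out.append(line)
    return out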