Improvements on Manjaro Security Updates

I’ll give credit where it is due. I had previously criticized Manjaro for holding back all package updates as this ignored security issues. But it appears that Manjaro has a new security policy, which means that packages that are rated as “Critical” or “High” in the Arch Security Advisories get pushed through their “quality assurance” process more quickly.


Posting so more people can get entertainment value from this email.

From: Daniel Skowroński <>
Subject: --asroot

Hi Allan,
I’d like to say you are moron if you were thinking that commiting 61ba5c961e4a3536c4bbf41edb348987a9993fdb to pacman was good idea!
I am Arch Linux user becouse it allows me to do virtually anything so I manage some of my arch-based servers from root account. I also have netbook with root account only for my own reasons. And you’ve destroyed not only my routine but also several programs that were depending on that. Screw you with reverting package version or using patch from aur – it’s supposed to work out-of-the-box even on Arch.

No greetings,
Daniel Skowroński

PS. You could have at least let anybody else sign-off that imbecile commit…

Who You Gonna Call?


Nobody… Arch Linux has the latest glibc release (2.20) so is not affected by the GHOST bug, which was fixed upstream in glibc-2.18 (without realising the security implications).

For those that want to know more about security issues and updates in Arch Linux, check out the arch-security mailing list.

Replacing “makepkg --asroot”

As an alarming number of people have noticed, the pacman-4.2 release removed the --asroot option from makepkg. This means that you can no longer build packages as the root user. There are good reasons for this, and the option was only included due to issues we had building under fakeroot (only the package() function gets run under fakeroot these days, and there have been no issues with fakeroot in a while anyway).

Even if your PKGBUILD file is not malicious, there are good examples of things going wrong by accident. Remember the bumblebee bug that deleted /usr due to an extra space? Or, just this week, a Steam bug that deleted a user’s home directory? Do you still want to run build code as root? OK then… I am going to show you how not to!

Firstly, we need a build directory. I suggest /home/build. Putting this directory directly under /root will not work unless you want to relax its 700 permissions to allow the nobody user read/write access1. I suppose you could, as you are running as root… but I will use /home/build. Create the directory and set permissions with the following:

mkdir /home/build
chgrp nobody /home/build
chmod g+ws /home/build
setfacl -m u::rwx,g::rwx /home/build
setfacl -d --set u::rwx,g::rwx,o::- /home/build

Not that people running makepkg as root need to know what code is doing before they run it… but I’ll explain what is happening here. Firstly, we create the /home/build directory, make it owned by the nobody group and ensure that group has write permission. We also set the setgid flag on the directory (the “s” in g+ws – it is the setgid bit, not the sticky bit) so all files created within it are owned by the nobody group. Then we set ACLs to ensure all files and directories created in /home/build have group read/write permissions.

Now to building your package! Get your PKGBUILD into your new build directory and run makepkg as the nobody user. You can do this using su, but sudo has the advantage that the command can be aliased. Installing sudo does not create a security risk as you are already running as root! You also do not need to configure anything, as root has full sudo permissions by default2. Build your package using:

sudo -u nobody makepkg

Done… I’d add “alias makepkg='sudo -u nobody makepkg'” to your ~/.bashrc so you never have to type this again.

There is still a problem here. If you download and manually extract a package sourceball, or use an AUR helper such as cower to do so, the group write permissions get lost:

[root@arya build]# cower -d pacman-git
:: pacman-git downloaded to /home/build
[root@arya build]# ls -ld pacman-git/
drwxr-xr-x+ 2 root nobody 4096 Mar 21 2013 pacman-git/

Doing “chmod -R g+w pacman-git/” will fix this. There is probably a way to avoid this – at least when manually extracting the tarball – but I have no interest in figuring it out. Otherwise, it is a two line function.

And if this does not satisfy you, revert that patch that removed --asroot. It should still revert cleanly.

1 makepkg checks directory write permissions using the full path so fails if any parent directories are not writable. I guess this could be fixed if someone was interested.

2 Note that to have makepkg install missing dependencies and install your built package without being asked for the nobody user’s password (which would be difficult to answer…), you will need to configure nobody to run “sudo pacman” without a password.

Two PGP Keyrings for Package Management in Arch Linux

Both the pacman package manager and the makepkg tool for building packages verify files using PGP signatures. However, these two pieces of software do it using different keyrings. There seems to be a lot of confusion about this and misinformation is spreading at a rapid pace, so I’ll attempt to clarify it here!

Pacman Package File Signature Verification
By default, pacman is set up to verify every package using a PGP signature. It has its own keyring for this purpose, located at /etc/pacman.d/gnupg/. This keyring is initialized during the Arch Linux install – a root key is created and the Arch Linux master keys are locally signed by the root key. The master keys sign all Arch Developer and Trusted User keys, creating an effective web-of-trust from your pacman root key to each of the packager keys, allowing verification of package files.

If you want to allow the installation of package files from a non-official repository, you need to either disable signature verification (don’t do that…) or trust the packager’s signing key. To do this, you first need to verify their key ID, which should be well publicized. Then you import it into the pacman keyring using “pacman-key --recv-key <KEYID>” and signify that you trust the key by locally signing it with your pacman root key by running “pacman-key --lsign <KEYID>”.

Makepkg Source File Signature Verification
When building a package, the source files are often (and should be!) signed, with a signature file available for download alongside the source file. This typically has the same name as the source file with the extension .sig or .asc. makepkg will automatically verify the signature if it is listed in the source array.


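For example, a source array for a signed tarball might look like this – “libfoo” and its URL are invented for illustration:

```shell
# Illustrative PKGBUILD fragment -- "libfoo" and the URL are made up.
# The brace expansion lists both the tarball and its detached signature;
# makepkg downloads both and verifies the signature automatically.
source=(https://example.org/releases/libfoo-1.0.tar.gz{,.sig})
```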
However, makepkg needs some information to verify the source signature: the public PGP key of the person who signed the source file, and for that key to be trusted. The difference here is that you do not trust whoever provided the source file to provide packages for your system (or at least you should not the vast majority of the time), so your user’s keyring is used instead. To get the key, use “gpg --recv-key <KEYID>” and trust it (once suitably verified) using “gpg --lsign <KEYID>”.

If you provide a package to the AUR, it would be a lot of work for everyone to suitably verify a PGP key and locally sign it. To demonstrate that you have verified the key, you can add the following to the PKGBUILD:

validpgpkeys=('F37CDAB708E65EA183FD1AF625EF0A436C2A4AFF') # Carlos O'Donell

Now makepkg will trust that key, even if it is not trusted in the package builder’s PGP keyring. The builder will still need to download the key, but that can be automated in their gpg.conf file.
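As a sketch of that automation – this is just a configuration suggestion, and the keyserver choice is arbitrary – the following lines in ~/.gnupg/gpg.conf make gpg fetch unknown public keys when it first needs them:

```
# ~/.gnupg/gpg.conf
keyserver hkp://pool.sks-keyservers.net
keyserver-options auto-key-retrieve
```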

Hopefully that clarifies the two separate types of PGP signature verification happening in pacman and makepkg and explains why they should be separate… Now can people stop recommending that the pacman keyring is imported into the user’s keyring and vice versa?

Pacman-4.2 Released

I released pacman-4.2 on the 19th of December – which is only marginally after the end of August as originally planned… We had 52 contributors provide patches to this release. Andrew takes the prize for most commits. Here are the top 10:

$ git shortlog -a -s -n v4.1.0..v4.2.0
   164  Andrew Gregory
   139  Allan McRae
    66  Dave Reisner
    26  Jason St. John
    20  Florian Pritz
    18  Pierre Neidhardt
    15  Olivier Brunel
     9  Jeremy Heiner
     9  Jonathan Frazier
     8  Dan McGee

The real prize goes to the person who caused the first reported bug. That could have been Dave, but he caught it just in time. And I mean just! I posted to IRC “any ideas for the tag message” and the response I got was “I think I broke updpkgsums”. The shame of being first is inversely proportional to your commit count. (The small typos discovered so far do not count…)

Packaging Changes
There have been a couple of useful features added to makepkg. The main ones are:

Architecture Specific Fields: The source and depends (and related fields) now can all specify architecture specific values. For example:


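A sketch of how this looks in practice – the package and URLs here are invented:

```shell
# Invented PKGBUILD fragment -- "foo" is not a real package.
arch=('i686' 'x86_64')
source=("https://example.org/foo-1.0.tar.gz")
source_x86_64+=("https://example.org/foo-1.0-x86_64-extra.bin")
md5sums=('SKIP')
md5sums_x86_64+=('SKIP')
depends=('glibc')
depends_x86_64+=('foo-64bit-helper')
```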
The source for a given architecture is used in addition to the global source. The ‘+=‘ when specifying extra sources for an architecture does nothing different than just using ‘=‘, but I use it to serve as a reminder that these are additional values. Thanks to Dave!

Templating PKGBUILDs: Many PKGBUILDs share a similar build system, making them highly redundant. This is an attempt to reduce the redundancy by providing a template system. The easiest way to describe this is using an example, so I will use a potential perl module template. We create a file /usr/share/makepkg-template/perl-module-1.0.template. In this file are the build(), check() and package() functions and any common boilerplate. As this is our current version, it is also symlinked to perl-module.template. In our PKGBUILD, we would add:

# template input; name=perl-module;

and run makepkg-template. Now look in the PKGBUILD and you will see that line is replaced with:

# template start; name=perl-module; version=1.0;
build() {
# template end;

If we ever need to update the template, we create perl-module-2.0.template and update the symlink. Now run makepkg-template -n to update the PKGBUILD. Read “man makepkg-template” for more details. Thanks to Florian!
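For the curious, here is a guess at what such a perl-module template file might contain – a sketch only, not the template actually shipped, and the $_dist variable is an invented convention:

```shell
# Sketch of /usr/share/makepkg-template/perl-module-1.0.template.
# "$_dist" (the CPAN distribution name) is an invented variable here.
build() {
  cd "$srcdir/$_dist-$pkgver"
  perl Makefile.PL INSTALLDIRS=vendor
  make
}

check() {
  cd "$srcdir/$_dist-$pkgver"
  make test
}

package() {
  cd "$srcdir/$_dist-$pkgver"
  make DESTDIR="$pkgdir" install
}
```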

Incremental VCS Builds: Previously makepkg would remove its working copy of the VCS source directory before starting a new build. Now makepkg will just update the source copy (or attempt to in the case of SVN…) and build the package. This brings VCS builds in line with those using non-VCS sources. A new option -C/--clean was added to makepkg to remove the old $srcdir before building for cases where incremental builds fail. Thanks to Lukáš (and sorry it took me so long to deal with your patches)!

Source Package Information: To avoid things like the AUR attempting to parse bash to display information from a source tarball, we now provide a .SRCINFO file in an easily parseable format. Thanks to Dave!
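The format itself is a simple key/value layout. A minimal sketch for an invented package:

```
pkgbase = example
	pkgdesc = An invented example package
	pkgver = 1.0
	pkgrel = 1
	arch = any

pkgname = example
```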

Package Functions are Mandatory: The use of package() functions in PKGBUILDs was introduced a long time ago. Now it is mandatory for a PKGBUILD to have one (the exception being metapackages that do not have a build() function either). Now that fakeroot usage is limited to the packaging step, the use of fakeroot is mandatory and building as root is disabled.

Misc. Changes: Other things of interest:

  • Static libraries are only removed with options=('!static') if they have a shared counterpart
  • Source signatures are required to be from a trusted source or listed in the validpgpkeys array. We also support style source signing
  • Split packages can no longer override pkgver/pkgrel/epoch as that was a silly idea…

Pacman Changes

No we don’t have hooks… They are strongly planned for the next release.

Directory Symlink Handling: Example time! Arch Linux has a /lib -> /usr/lib symlink. Previously, if pacman was installing a package and it found files in /lib, it would follow the symlink and install it in /usr/lib. However the filelist for that package still recorded the file in /lib. This caused heaps of difficulty in conflict resolving – primarily the need to resolve every path of all package files to look for conflicts. That was a stupid idea! So now if pacman sees a /lib directory in a package, it will detect a conflict with the symlink on the filesystem. If you were using this feature to install files elsewhere, you probably need to look into what a bind mount is! Note that this change requires us to correct the local package file list for any package installed using this mis-feature, so we bumped the database version. Upgrade using pacman-db-upgrade. Thanks to Andrew!

Added an --assume-installed Option: I believe this option was invented during a perl update. Almost all compiled perl modules have a dependency on a specific perl version, so with a major perl update all the modules need to be updated at the same time – or you can use -d to ignore dependency versions, but that applies to all packages and not just perl. This is not a problem with the Arch repositories, where all packages are updated at the same time, but if you have lots of perl modules from the AUR, you will need to remove those, update, then rebuild them. Instead you can use --assume-installed perl=5.18 and all those packages depending on perl=5.18 will not complain. Thanks to Florian!

Repository Usage Configuration: A new configuration keyword was added for repositories – Usage. It can take the values Sync, Search, Install, Upgrade and All. For example, I have the [staging] and [multilib-testing] repositories in my pacman.conf with the Sync usage. That way I can look at what is in these repositories without using them for package updates. Thanks to Dave!
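In pacman.conf, that setup looks something like the following – the Include line is the usual mirrorlist boilerplate:

```
# Keep the [staging] database synced so its contents can be inspected,
# but never install or upgrade packages from it.
[staging]
Usage = Sync
Include = /etc/pacman.d/mirrorlist
```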

Misc. Changes: Other changes to pacman:

  • Improved dependency ordering – the dependency ordering did not go deep enough into the tree to ensure correct installation order.
  • A warning is printed if a directory on the filesystem has different permissions to the one being “installed” from the package.
  • Lock files should now never be left behind…
  • Various speed-ups, memory leak plugs and bug fixes

See here for a more complete list of changes.

And I have just realized that the only major change I contributed was the requiring of package() functions, which I am told means 1/3 of the AUR will not build! It feels good to be back to breaking things…

Shellshock and Arch Linux

I’m guessing most people have heard about the security issue that was discovered in bash earlier in the week, which has been nicknamed Shellshock. Most of the details are covered elsewhere, so I thought I would post a little about the handling of the issue in Arch.

I am the Arch Linux contact on the restricted oss-security mailing list. On Monday (at least in my timezone…), there was a message saying that a significant security issue in bash would be announced on Wednesday. I let the Arch bash maintainer know, and he got some details.

This bug was CVE-2014-6271. You can test if you are vulnerable by running

x="() { :; }; echo x" bash -c :

If your terminal prints “x”, then you are vulnerable. This is actually simpler to understand than it appears… First we define a function x() which just runs “:”, which does nothing. After the function there is a semicolon, followed by a simple echo statement – this is a bit strange, but there is nothing stopping us from doing that. Then this whole function/echo bit is exported as an environment variable to the bash shell call. When bash loads, it notices a function in the environment and evaluates it. But we have “hidden” the echo statement in that environment variable, and it gets evaluated too… Oops!

The announcement of CVE-2014-6271 was made at 2014-09-24 14:00 UTC. Two minutes and five seconds later, the fix was committed to the Arch Linux [testing] repository, where it was tested for a solid 25 minutes before releasing into our main repositories.

About seven hours later, it was noticed that the fix was incomplete. The simplified version of this breakage is

X='() { function a a>\' bash -c echo

This creates a file named “echo” in the directory where it was run. To track the incomplete fix, CVE-2014-7169 was assigned. A patch was posted by the upstream bash developer to fix this issue on the Wednesday, but not released on the bash ftp site for over 24 hours.

With a second issue discovered so quickly, it is important not to take an overly reactive approach to updating, as you run the risk of introducing even worse issues (despite repeated bug reports and panic in the forums and IRC channels). While waiting for the dust to settle, another patch was posted by Florian Weimer (from Red Hat). This is not a direct fix for any vulnerability (however, see below), but rather a hardening patch that attempts to limit potential issues when importing functions from the environment. During this time there were also patches posted that disabled importing functions from the environment altogether, but that is probably an overreaction.

About 36 hours after the first bug fix, packages were released for Arch Linux that fixed the second CVE and included the hardening patch (which upstream appears to be adopting with minor changes). There were also two other more minor issues found during all of this that were fixed as well – CVE-2014-7186 and CVE-2014-7187.

And that is the end of the story… Even the mystery CVE-2014-6277 is apparently covered by the unofficial hardening patch that has been applied. You can test your bash install using the bashcheck script. On Arch Linux, make sure you have bash>=4.3.026-1 installed.

And because this has been a fun week, upgrade NSS too!

Recovering Windows 7 Backup on Windows 8.1

My wife’s laptop recently died and was replaced. Unlike last time, there were regular backups with the last performed only a day or two before the failure.

So all is fine… Right?

Turns out not so much. The new laptop runs Windows 8.1. I plugged in the backup drive and opened the settings to look for the Backup and Restore link… and nothing. File History sounded promising, but different.

It turns out Windows 8 introduced a Time Machine like backup tool, so they phased out the old Windows 7 style backup. An option was provided in Windows 8 to restore “old” Windows 7 backups, but that was removed in Windows 8.1. That means people had a one year window to update from Windows 7 to Windows 8 and restore their files before Windows 8.1 was released. Not helpful!

The only method I could find that worked to restore this backup was:

  1. “Obtain” a copy of Windows 7 and make a VM
  2. Restore the files, selecting to restore the files to a different location (onto your backup USB drive if there is room)
  3. Copy the restored files across to your Windows 8.1 install (there will be a couple of files you can not overwrite – ignoring them seemed to work)

It is almost as if I should have never used the Windows 7 backup feature at all and just copied the files onto the backup drive… Now I have to decide how long the File History tool in Windows 8 will be supported or if I need to find another backup solution.

Interesting Links – March 2014

Only a couple of weeks late this time…

  • A longstanding bug was found in GnuTLS
  • Mozilla introduced a “new” JPEG library
  • Libreoffice now has “fresh” and “stable” releases
  • Python-3.4 was released
  • Android games can soon connect with iOS games
  • The Full Disclosure list was shut down, and resurrected
  • BBQLinux – yet another Arch derivative
  • How to add multiple versions of a function optimized for different architectures in GCC
  • Google Drive space became rather cheap
  • Facebook released warp – a fast C and C++ preprocessor
  • Google is replacing GTK+ in their browser with a new toolkit
  • Musl libc 1.0 was released
  • The quest to compile the Linux kernel with LLVM is ongoing
  • The Linux Foundation’s Introduction to Linux course is going to be free this “summer”
  • Apple open sourced their AArch64 backend for LLVM, so there are now two…
  • A new debugger allowing you to replay your code multiple times
  • Why you should not rerelease software without changing the version number (it is really annoying…)
  • Take a browse of old MS-DOS and Word source code
  • Vote for NASA’s new spacesuit look

Anime Guide 2013

Another year has passed, so once again it is time for me to provide my (not very) insightful opinions of the anime that finished their run in 2013. As in previous years, I give my opinion without providing an actual review.

This year’s installment is longer than usual because I got myself a Nexus 7 to use on my train ride to work. I also tend to focus more on short series because there is less risk with the time investment, so that also bumps the number in the list. And, let’s begin!

Anime of the Year

From the New World
(TV Series, 25 episodes)

This was a hotly contested position, in that there were several other series I could have equally put here.* This series won due to having a very strong and unique storyline that takes risks many anime will not. From early on you are given the sense of something not quite right underlying the otherwise peaceful world. It is that question that keeps you going during the slower paced episodes that provide the needed background information early in the series. And don’t judge it too early, because there are time skips taking the main characters into adulthood. I think this series could have been improved by not being 25 episodes long. Not because there was filler added to stretch it out, but because it fitted exactly 25 episodes and the breaks between episodes seemed out of place. I think I would have enjoyed this even more if I had watched all the episodes one after the other, or had this been a series of five movies instead.

* The other very strong contenders were Attack on Titan, Psycho-Pass and The Eccentric Family.


Attack on Titan
(TV Series, 25 episodes)

I’m sure many people consider this the anime of the year. It does have a very interesting story, with the titans remaining imposing throughout. I also found the animation in the plentiful action sequences really highlighted the “3D maneuver gear”. But I disliked the protagonist and his over-emoting so much that at one point I was thinking that if he died right now it would be great and we could get on with the real story. I continuously had the impression that this was an anime that started with a bang and was going to fade away, but there was always something that saved it.

(TV Series, 10 episodes)

This was my bit of crazy for the year and I was entertained by all the pretty colours moving around… It is a bit all over the place, which is made even more difficult when there are sets of characters with the same name across different time points. The episode #0 for this series did not help. It is basically an episode from one of the previous incarnations of this story and contained a lot of information with little context. But don’t worry, there is also an additional episode at the end to explain things for you.

(TV Series, 22 episodes)

This was my Anime of the Year in an early draft of this post, mainly because I had not seen a serious dark science fiction anime in a while. And it definitely had some darkness to it – the episodic first half of the series showed some brutal crimes from the criminal’s deranged point of view. On top of the gore of the crimes committed, the implosion/explosion of the criminals was a highlight (in an “I am not a psychopath” kind of way…). The series avoids greatness mainly due to an overly formulaic character set, and it probably needed to be a few episodes shorter as there were a couple of nothing episodes.

Silver Spoon
(TV Series, 11 episodes)

Not a series I thought I would like at face value, but I saw a few positive reviews, noted who the author was, and decided to try it. Very good decision! There is nothing individually outstanding about it, yet it remains consistently solid throughout. Despite my concerns, the comedy aspect did not go overboard. This is one of those series where you will sit down to start it and all of a sudden have finished watching all the episodes but you will have no real idea why.

Steins;Gate the Movie: Burdened Domain of Deja Vu
(Movie, 90 minutes)

The follow-up movie to what I considered to be the series of the year in 2011. Part of what I liked about the series was having no idea what was going on, but knowing that I wanted to. This was not captured by the movie – not surprising given it was a sequel – but it was a decent follow-up to a great series.

The Eccentric Family
(TV Series, 13 episodes)

For a 13 episode anime, this show managed to provide a fairly complete mythology without any episodes whose sole purpose was to explain it. Instead the episodes gradually reveal the world and you are required to piece it together bit by bit. This does lead to a reasonably slow build up and also leaves gaps in our understanding of the world, but I often find that much more enjoyable than shows that rush to tie up every loose end.

Wolf Children (The Wolf Children Ame and Yuki)
(Movie, 117 minutes)

This is really a 2012 anime, but it was released onto DVD in 2013 so it still counts… My blog, my rules! Creates a modern day folktale and focuses purely on how the situation is dealt with by the people involved. Charming in its simplicity.


A Certain Scientific Railgun S
(TV Series, 24 episodes)

I really liked the original Railgun, but this was just more of the same. In fact, it really was more of the same – the first half was a retelling of what we saw in A Certain Magical Index from a different perspective. It would now take a very good story line to get me to watch any more of this franchise.

Archenemy and Hero (Maoyuu Maou Yuusha)
(TV Series, 12 episodes)

It follows politics, economics, religion, technology and war and shows how they interact, reminding me of a less interesting version of Spice and Wolf. While it is generally fairly serious, it switches to the outrageous at the blink of an eye. The setup is very weak, but it all fits together if we assume the Hero trusts anyone with exceedingly large breasts (even by anime standards).

Beyond the Boundary
(TV Series, 12 episodes)

A show that was bad while being good. It managed to combine a vast variety of anime clichés in a single series in a way that was at least interesting.

Blast of Tempest
(TV Series, 24 episodes)

This show started out being quite enjoyable. Once most of the mystery disappeared around half way, pseudo-intelligent conversations became the method for moving the plot along. Quoting Shakespeare everywhere just adds to the pretentiousness.

Blood Lad
(TV Series, 10 episodes)

A series being 10 episodes long is really quite strange, so you would think they had a solid story line planned… But I saw not much planning of anything beyond the first episode. The lack of story was hidden by introducing a couple of new characters every episode. Yet the series was somehow not that bad.

Gargantia on the Verdurous Planet
(TV Series, 13 episodes)

Welcome to a mecha anime that spends most of its time not being mecha. The whole middle section of this anime is completely unnecessary, managing a beach episode and one revolving around belly dancing. However, the underlying elements of the story were interesting enough. It is a pity they were used so poorly.

Gingitsune: Messenger Fox of the Gods
(TV Series, 12 episodes)

This series was so laid back it did not even have a plot, which is what I think made it strangely likable. The title character was definitely the strongest and frequently stole the show despite not being the main focus.

Pokemon Origins
(TV Special, 4 episodes)

“Gritty reboot” more closely following the original games. The focus is only on the major aspects of those games at the expense of story telling. Fine, there are four episodes so you can not do too much. But it was disappointing when I realized that this was really an advertisement for Pokemon X and Y.

Servant x Service
(TV Series, 13 episodes)

I chose to watch this because all the characters were adults. It turns out adults taking care of business is not what I would consider great entertainment, but it was oddly refreshing and there was nothing I found too bad about it either.

Sunday Without God
(TV Series, 12 episodes)

Builds a unique world and deals with an interesting situation, but this is almost all that the series has going for it. The lead character is the demise of the series.

The Devil Is a Part-Timer!
(TV Series, 13 episodes)

This series is probably the best of the rest. I just could not justify it being in the “Recommended” section, but it seemed better than “Average”. Some of the comedy becomes tired near the end of the series, which further dragged down a fairly weak finish. I was impressed that this series managed to get a “beach” episode without technically having one.

Tiger and Bunny the Movie: The Beginning
(Movie, 93 minutes)

Another movie that saw theatrical release in 2012 but came out on DVD in 2013. I was a bit disappointed in this given its series was one of my favourite anime of 2011. Not because the movie ruined it, but because the first half was essentially the first part of the anime retold. The second half was great! Hopefully the next movie will make it up to me.


Galilei Donna
(TV Series, 11 episodes)

This series promised much in the first episode and rapidly proceeded to ignore that any such promises were ever made. It is one of the most spectacular crashes I have ever seen a series take.

Outbreak Company
(TV Series, 12 episodes)

Do you know what a fantasy world full of magic and the traditional fantasy races needs introduced from Japan? Otaku culture. That’s right… Well, I suppose that is one way to try making your audience feel like they are important. Too much self-serving “isn’t anime great” for my liking, even if it was attempted humour.

RDG: Red Data Girl
(TV Series, 12 episodes)

This actually has all the potential to be very good. The story builds and builds… to nothing. I feel like I missed the final few episodes, but I am not drawn in enough to check if there is an OVA or movie to follow.

The Garden of Words
(Movie, 46 minutes)

The highlight of this movie was the animation of rain drops falling in the pond during the opening scenes.

The Unlimited Hyobu Kyosuke
(TV Series, 12 episodes)

The word “unlimited” is what is wrong with this anime. It means there is never any risk that the protagonist is going to be harmed in any way. Imagine, someone saying “this battle is really difficult given I am committing about 1% of my strength to it – I’m in real danger here”. Bah.

Did Not Finish

Flowers of Evil
(TV Series, 13 episodes)

I figure I should not include this in the Sub-Par section as I did not complete watching it and it may have improved substantially. What was quite “novel” about this series was the use of rotoscoping, although it does lose a lot of detail from people even at a short distance from the camera. A bit disappointing given the background art was very well done. I persevered as long as possible watching this telling myself “I have never dropped an anime, it is only 13 episodes, I’m sure I can make it…”. I gave up part way into episode five, read a review saying there was a big reveal a few episodes on, skimmed through to that episode, decided there was no hope and stopped.

Honourable Mention

Hellsing Ultimate
(OVA, 10 episodes)

This missed making it into my 2012 list, because the final episode was released at the very end of the year. Not that speed was essential here given the first episode was released in 2006. The time taken – and I suppose budget – is clearly shown with great animation, particularly in fight scenes. I think it would be a better series as just a dark horror vampire versus Nazi versus Englishmen story without the humour parts.

Space Brothers
(TV Series, 99 episodes)

This did not make my list this year or last year because it had not finished yet. It looks like it is taking a break after episode 99 for a while. I would have said much better things about this series at the end of 2012. Currently I say it is still worth watching the ~100 episodes that have been released, but it can be slow going and a bit predictable at times, and the recap at the start of each episode has become too long. Despite that, it is still an anime I watch immediately on release. It is also my constant reminder to find out how many episodes a series has before starting to watch it.

And that is the end for 2013. Anything good I missed?