My 2014 in Video Games

2014 was a somewhat crazy year overall – if someone had told me that Seth Rogen and James Franco would be involved in one of the biggest network security scandals, I would have asked if they’d overdosed on Swift on Security. Anyway, let’s talk about video games, because that stuff is fun, and 2014 has been a fun year for me.

In March, Facebook bought Oculus VR for $2 billion. In late December, there are still no real games that actually work on it – just more DevKits, Samsung’s Gear VR, Sony’s announcement of Project Morpheus and Google’s Cardboard VR kit. All this stuff is neat, and my original Oculus Kickstarter kit sure was impressive. But behind the hype and impressive technology, I’m still waiting for the actual games – not just proofs of concept or fancy YouTube videos, but real, actual products. At the moment, I fear that VR’s second incarnation is facing the same fate as the Kinect. (If someone says Augmented Reality now, it’s important to remember that Google Glass is dead in the water. I think that overall, wearables are too cumbersome, but it will be interesting to see if smart watches can do something, despite their pathetic battery runtimes. I doubt it.)

Speaking of the Kinect: Dance Central was amazing, and I had high hopes for Fantasia: Music Evolved. I even made space in my living room. Fantasia has all the ingredients of a fun game, except for two things: the menu structure is weird, and there is no real “Free Play” mode. This game seems predestined for having fun with children, but there is no way to just play around without being stopped by the game, so I guess it’s more suited for teenagers and older. It’s a great game – I guess I just had the wrong expectations.

Early in the year, Might & Magic X: Legacy was released. I have blogged about the game before, and even though it’s not Game-of-the-Year good, it was an enjoyable trip down memory lane, back to Parts 5 and 6. It’s missing the giant story arc that made the first five games so memorable, but I got what I wanted out of it.

Many of my Kickstarters arrived, beginning with Redux: Dark Matters and Broken Age. Redux ironically isn’t as good as the original DUX 1.5, but I enjoyed the soundtrack (especially Stage 6 Alarming Area, The End). Broken Age made headlines because it was split in two, with Part 1 released in January and Part 2 still pending. I haven’t played Part 1 yet because I want the adventure to be complete so I can play it as a whole. Despite the massive delays (there’s a lesson in how blowing far past the Kickstarter target is actually a bad thing, as scope explodes), I’m hopeful that the game will be awesome – I’ve avoided reviews because of spoilers, but what I heard was positive.

In February, Broken Sword 5 followed, also split into two parts (Part 2 followed in April). I enjoyed it very much – it was classic Broken Sword again, with top-notch voice acting and an engaging plot (I loved the Gnostics stuff). I wish they had extended the last location, though, because I’d have loved to see more of its architecture and history.

Jane Jensen’s Moebius: Empire Rising was the next adventure, and the first one I’m on the fence about – mainly because it forced me too much in one direction, and I felt as helpless as the main character (who’s kind of an asshole), just going along with the plot. (I wrote about the ability to say ‘no’ because Moebius just didn’t give me a convincing reason to go along with the plot at the beginning.)

Next up, in May, came one of the games I absolutely loved this year: Tesla Effect: A Tex Murphy Adventure. Yes, it is cheesy and technically unimpressive, but a) it’s Tex Murphy and b) it’s a really enjoyable B-movie. There was a bittersweet announcement that Aaron Conners was working on two more novels, the second of which might be the basis of one last Tex Murphy game. I’d love for them to remake the old games with a more modern interface, but I guess the FMV just wouldn’t hold up? If you like detective pulp fiction, I recommend picking up the book as well.

Wasteland 2 came out in September and proudly held up the old-school flag. The game’s first act is significantly stronger than the second, which loses a bit of cohesiveness. I liked the turn-based combat system and the dry, cynical humor. The mad monks were fun, and Damonta was a good finale to the first act. The second act had good set pieces – the Angel Oracle, for example – but it just felt too loosely connected. Enjoyable nevertheless.

I’m still waiting for my Dreamcast version of Pier Solar HD, but from what I’m seeing, reviews are good. It’s out on Steam, so that’s what I’ll play once I find enough time. Jagged Alliance: Flashback is another Kickstarter-backed game that I haven’t had time to really play, although I immediately ran into a bug that made me lose the game – definitely an authentic Jagged Alliance experience :) It looks and plays well, though.

Last but not least, Elite: Dangerous was released with a lot of fanfare – fanfare mostly centered around the removal of offline play. This is one of the caveat emptor lessons of Kickstarter: just because you give money doesn’t mean you actually get exactly what was promised. It sucks that the controversy takes away some of the hype around the game, but that fault lies with the creators. Overall though, 2014 was a year in which fears about Kickstarter-backed games were largely dispelled: there were many releases, and for the most part the games were good, or at least good enough to be worth the money.

Apart from Kickstarter, there have been a bunch of indie and semi-indie games. Divinity: Original Sin was funded through Kickstarter, but I bought it regularly through Steam. It’s a bit confusing at the beginning (who are we, and what are we doing?), but it’s one of the best isometric RPGs in recent years. Once it got going, I enjoyed the ride all the way to the end – it’s one of the best games of 2014, indie or not. A true indie game, Escape Goat 2, came out on PC and PS4 and is a really fun puzzle game. It reminds me a lot of early 8-bit games like Solomon’s Key or Spherical, even though it’s very different. In any case, I recommend playing it.

Speaking of goats: Goat Simulator is one of the most glorious games of the year, to the point of the developers adding a free MMO simulator DLC. Words cannot do the game justice. On one hand, it’s like a fart joke that’s funny but goes on too long; on the other hand, it never stops being funny. Whether you play it for 5 minutes or 5 hours, it’s hilarious, just because you keep discovering so many more ways to break things.

On the topic of glorious releases: Suikoden II was finally released on the PlayStation Network, for PS3 and PS Vita. This is one of the best RPGs ever made, and we’ve been asking for years to get it on the PSOne Classics store. It’s $9.99, and the only excuse for not buying it is if you only own a PS4, because for some stupid reason Sony doesn’t support PSOne Classics on their flagship console. This might be a good time to look into a PS Vita: not only do you get Suikoden I and II, but also a non-messed-up version of Final Fantasy VII and all the mainline Persona games, including both parts of Persona 2. The HD remaster of Final Fantasy X and X-2 is available for it as well.

Another console that had a great 2014 was the Wii U. Donkey Kong Country: Tropical Freeze is everything we love about Donkey Kong Country, including the unforgiving difficulty level. Mario Kart 8 is worth the price of a Wii U alone. The character roster is a bit hit-and-miss, but the vehicles are fun, the track design is amazing, and the DLC is actually worthwhile (Link, complete with a motorcycle, and the dragon track are worth the price of the first DLC pack alone). I hear great things about Super Smash Bros., but I could never get into any of the Smash Bros. games, so I have nothing to say except that the Wii U has a big enough roster of great games – some exclusive – to make it a worthwhile console.

There were also a number of AAA game releases. Castlevania: Lords of Shadow 2 continued the story of the first part – a story that I absolutely loved. The game itself has some rough edges (e.g., stupid stealth sections), and I guess Castlevania purists despise the lack of exploration options, but what they’ve done with the Belmont heritage was really enjoyable. Note that Part 2 starts out with spoilers – playing it first kind of ruins the surprise at the end of the original Lords of Shadow.

Another sequel I had been looking forward to was Dragon Age: Inquisition. It was worth the wait: the game is amazing and a worthy entry in a series whose mainline games are all worthwhile. Extra bonus points for not repeating the Mass Effect 3 ending SNAFU (which killed that series for me – I doubt I’ll buy the next Mass Effect game on release).

The last game I bought this year was Telltale’s Game of Thrones. It’s basically an interactive episode of the show with quick-time events disguised as a game attached to it, but that’s okay. It really is Game of Thrones, with all the intrigue and murder that make the books and TV show so addicting. I’m always torn about Telltale: they know how to tell awesome stories (see also The Wolf Among Us, Back to the Future or The Walking Dead), but the episodic nature of their games often causes weak parts in the middle, because they have to bolt an ending onto an episode that’s really just glue between the preceding and succeeding ones.

Last but not least, a sad note: Ralph Henry Baer, possibly the first video game pioneer and father of the Magnavox Odyssey, passed away in December. It’s interesting to see how far video games have come, and how far they still have to go.

faml – A Markup Language for browsers and node.js

A common request on many websites is to offer the user some light formatting capability: bold, italic, links, maybe lists. It shouldn’t clutter the markup too much, and it should leave little room for error.

John Gruber’s Markdown is one of the most popular markup languages, but it has a few features that I commonly need to tweak or remove altogether. For my needs, I have customized existing Markdown parsers to remove features (most recently the excellent stmd.js), but now I’ve decided to create a little markup language of my own:

faml – A Markup Language

The syntax may be inspired by Markdown, but it is really its own thing. I only included the things I need, and there is generally just one way of doing things (e.g., emphasis is added through asterisks). The code is based on stmd.js but heavily changed and broken up differently.

You can check out the source, documentation and JavaScript files on GitHub or play with it in the browser. It is also published to npm, allowing you to just npm install faml. I have example code for web browsers and for node.js.

The current version is 0.9 because I’m still working on things like the tree that the parser returns (it contains a bunch of unnecessary stuff), adding tests, and giving it a nice homepage.

But it’s there for people to play with :)

var parser = new faml.FamlParser();
var renderer = new faml.FamlRenderer();
var input = "test *with emph*";
var parsed = parser.parse(input);       // parse the input into a tree
var rendered = renderer.render(parsed); // render that tree to HTML
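
And for node.js, a minimal sketch – this assumes the npm package exports the same FamlParser/FamlRenderer constructors as the browser build:

// node.js – assumes require("faml") exposes the same constructors
// as the browser build; the exact output markup is my guess.
var faml = require("faml");

var parser = new faml.FamlParser();
var renderer = new faml.FamlRenderer();

var html = renderer.render(parser.parse("test *with emph*"));
console.log(html); // presumably something like <p>test <em>with emph</em></p>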

Standard Flavored Markdown Tips

Today, some of the biggest users of Markdown have given a gift to the Internet: Standard Flavored Markdown (read more in Jeff Atwood’s blog post).

I played with it for an hour and I’m absolutely in love with it, for three reasons:

  1. It’s rock solid and mature – try nesting ordered and unordered lists in any combination and watch it just do the right thing, something many implementations struggle with
  2. It comes with a reference implementation in C and JavaScript
  3. The JavaScript implementation is easy to extend (I have not done anything with the C version)

I was able to replace the Markdown parser in our current application with the stmd.js reference parser and got up and running immediately.

Here are some tips:

The Parser and Renderer are two different things

Instead of just taking Markdown and giving you HTML, stmd.js consists of a separate Parser and Renderer. This is massively useful, because it means you can massage the parsed Markdown tree before you render it, and you can also change how the Markdown is rendered without messing with the parsing code. Look at this example:

var parser = new stmd.DocParser();
var renderer = new stmd.HtmlRenderer();
var input = "this **is** a\r\n" +
            "test.\r\n\r\n" +
            "With Inline-<b>HTML</b> as well";

var ast = parser.parse(input);

var html = renderer.render(ast);

document.getElementById("output").innerHTML = html;

Set a breakpoint (with Firebug or whatever JavaScript debugger you use) and look at the glorious ast. Look at the collections of children and at the tokens, and you might see why this is so great: you can monkey around with this without having to worry about HTML rendering.
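
As a small illustration, here’s a sketch of such monkeying around (my code, not part of stmd.js – it assumes the 2014 AST shape, where nodes carry a t type tag plus children/inline_content collections, and text tokens look like {t: "Str", c: "..."}):

// Walk the parsed tree before rendering anything.
function walk(node, visit) {
  visit(node);
  var kids = (node.children || []).concat(node.inline_content || []);
  kids.forEach(function (child) { walk(child, visit); });
}

// Example: count how many Str tokens the parse produced.
var strCount = 0;
walk(ast, function (node) { if (node.t === "Str") { strCount++; } });
console.log(strCount + " Str tokens found");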

Treat newlines as linebreaks

This is possibly the #1 request people have when they first try out Markdown. Normally, you need two spaces at the end of a line to force a line break – otherwise, the newline is treated as a space.

The parser already identifies a simple newline as a Softbreak token. The default renderer renders softbreaks as \n – that is, a newline in the HTML source, which doesn’t translate into a visible line break. Changing this is trivial:

var renderer = new stmd.HtmlRenderer();
renderer.softbreak = "<br/>";

Now, every newline inserts a proper <br/> tag.

Disallow all HTML

Markdown allows inline HTML, since its original audience were programmers and bloggers. However, in some environments it may be required to disable any and all inline HTML. To do that, we tell the parser not to generate any Html tokens:

var parser = new stmd.DocParser();
parser.inlineParser.parseHtmlTag = function() { return 0; }

All HTML Tags will now be interpreted as Str tokens and thus escaped on rendering.

Read the Source Code

The source is on GitHub, and I highly recommend reading through stmd.js to understand how it works and where the extensibility points are. I wish the Parser and Renderer were in two separate files, but it’s still very straightforward. Yes, there is a regex which parses HTML, but since Markdown doesn’t support just any HTML but rather a defined subset, this is fine.

You should almost never have to edit stmd.js directly. Monkey-patch it, yes – but that can live in your consumer code.

This library is a gift.

Thank you, Standard Flavored Markdown team.

Windows Group Authentication and MVC 4/5/Web API problems

I’m working in a Windows environment where I want to authenticate users using a Windows security group. For example, foo\mstum is in the group foo\users, and I want to tell the application to only allow users in foo\users to access the site.

This is usually simple – just add this inside <system.web>:

    <authorization>
      <allow roles="foo\users" />
      <deny users="*" />
    </authorization>

Now, there is a problem with this when using a modern (that is, MVC 4 or 5, or Web API) application: it doesn’t work.

Why? Because by default, Simple Membership is enabled, and it doesn’t support Windows groups at all – you can check the source code of WebMatrix.WebData.SimpleRoleProvider.

The solution is to add an appSetting that disables Simple Membership:

    <add key="enableSimpleMembership" value="false"/>

Now the application is back to the previous behavior of using System.Web.Security.WindowsTokenRoleProvider, which does support Windows groups.
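
With that setting in place, the usual role checks work against Windows groups again – a minimal sketch (the controller and group names are just examples):

// Assumes Windows Authentication is enabled and the
// WindowsTokenRoleProvider is active again (see appSetting above).
using System.Web.Mvc;
using System.Web.Security;

[Authorize(Roles = @"foo\users")] // only members of foo\users get in
public class HomeController : Controller
{
    public ActionResult Index()
    {
        // Uses the same role provider under the hood.
        string[] groups = Roles.GetRolesForUser();
        return View(groups);
    }
}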

There is a second gotcha, though: it seems that the WindowsTokenRoleProvider does not support Universal security groups. In my tests, only Global or Domain Local security groups showed up when calling GetRolesForUser. I have not found out why that is, or whether there is a way to make it support Universal security groups. Do note that Distribution groups (“mailing lists”) are not supported in any case.

Through the Fence

I shot my first two rolls of 35mm film in 15 years or so, and as expected, the majority of pictures only serve as a way for me to understand what the different camera settings do.

One of the photos that came out well is this shot of the Sand Canyon Bike Trail in Irvine, CA, taken through the fence on the bridge over it.

It was shot on black & white film (Ilford XP2 Super 400) with a Nikon FG 35mm SLR and a 50mm lens, at an aperture of f/22. That way, the background is in focus (I have a shot at f/3.5 with a blurry background as well, but it’s not as effective).

The one thing I dislike is that the right side lacks detail – I think underexposing it a little might have been the better choice. I love the way the film grain looks: unlike digital camera noise, the grain doesn’t look like a compression artifact, and it gives the picture a nice vintage touch.

Video Games and the Ability to say ‘No’

Imagine Philip Marlowe turning down the job from General Sternwood at the beginning of The Big Sleep. Or Bilbo Baggins standing his ground and not joining Thorin Oakenshield’s company in The Hobbit. Or Han Solo refusing to take Luke and Obi-Wan in Star Wars.

There wouldn’t be a story. Books and movies work because their characters make choices that put them or others into situations which advance the plot. Books and movies are also passive media, where the audience can only follow the road that’s already set out for them.

Video games, on the other hand, are an interactive medium. While one can argue that the audience is still just following a predetermined road with only a few forks to choose from – basically choose-your-own-adventure books with graphics – there is still a far greater amount of customization and choice possible.

Unfortunately, most video games don’t include an option to just say ‘No’ to something that happens in-game.

For example, in the beginning of Moebius: Empire Rising, the main character gets an offer from a client. The dialogue seems to allow me to just tell them ‘No’. But then, nothing else happens. I can go back to the starting location, but there is nothing else to do there. There is no consequence, and worse, there is no driving force to make me change my mind.

There is simply no explanation why I have to say ‘Yes’ to the client.

Similarly, Indiana Jones and the Last Crusade doesn’t include an option to turn down Donovan’s offer at the beginning, but there is a good reason for that: Indy’s father has gone missing on the very same mission. This at least gives me a strong reason to go along with it.

Saying ‘No’ doesn’t have to lead to a satisfactory ending. In the same Indiana Jones game, we get to meet Hitler in Berlin – in a dialogue that includes an option labeled ‘Throw a punch’.

I believe most games nowadays wouldn’t have that option; they might even make the whole scene a cutscene. Why? Because choosing it has the guard in the back kill you, leading to a game over screen.

And that’s fine. Not every choice a player can make is required to lead to a happy ending in the story. Ending the story prematurely due to the decisions of the player is a valid option. But boy, did it feel good to select that ‘Throw a punch’ option.

In Shadowrun Returns: Dragonfall (the excellent DLC campaign), the game starts with a friend being killed and the player taking over her old team. There is no choice here, no way to just say ‘Boy, shadowrunning is too dangerous, I’m out!’ That may be okay because, once again, a clear motive is given.

But as you stand before the Final Boss, you hear about the Evil Plan. And I thought, ‘Hey, that’s actually kind of good – I no longer want to stop you.’ Except that wasn’t an option. Of course, immediately afterwards I heard about The Price To Pay and thought, ‘Whoa, that’s a bit steep.’ But again, there was no option to just say ‘Sure, I’m in!’ I wouldn’t have expected a successful outcome – maybe my team would have turned against me, or maybe the Final Boss would have killed me because there are no partners in this project.

Either option would’ve been fine, and one can argue that the choice does exist if I had decided differently earlier. (It’s hard to write these posts while avoiding spoilers, so the only thing I can say is that the second-to-last mission is the one with the real choice.)

It’s hard to get this right. Having meaningless choices is often worse than having no choices at all, because in the first case I, as a player, immediately see all the unrealized potential, while in the second case it may be easier to string me along – unless it’s done too obviously. The latter is nowadays called a Cinematic Experience, because strictly linear gameplay has a negative connotation to it.

More modern games often use a morality system – Renegade/Paragon in Mass Effect, the Dark/Light side of the Force in Knights of the Old Republic, even the alignment in many Dungeons & Dragons games. Very often though, these systems are much too confined, too absolute, literally too black-and-white. The dark path is often the psychotic mass murderer, while the light path ends up with a character that’s holier than Mother Teresa. Try playing a mafia boss who has no qualms about nuking an entire city but donates a lot of his money to the local orphanage because he loves children and likes kittens. This post on Dorkly describes some of the issues better.

But even with a morality system, there is often no way to just say ‘No’ – often it’s just a choice between saying ‘Yes’ through diplomacy (light) or through force (dark).

The Fallout series made a more conscious effort to have choice in the game – the epilogues reflect upon your choices, letting us know what happened to the people we’ve interacted with. Mass Effect 3’s ending – as insulting as it was – also got that part right.

But again, these games didn’t offer the option to say ‘You know what? Screw you all, I’ll retire to the countryside and become a farmer!’

Choices add cost – it’s the dilemma of creating content that a single player may never see. Yes, all the content in the game will be seen by the player base as a whole, but often not by a single person. So why spend money on ‘optional’ content when a more linear, cinematic experience offers more bang for the buck?

Because to me, it’s boring. Video game characters are flat compared to movie and book characters – there is no Michael Corleone, no Keyser Soze, no Vivian Sternwood in gaming. Sure, there are a lot of memorable characters and moments, but even the more memorable ones only reach Jack Bauer level at best. They are puppets, driven not by choice but by an external force that’s often too intrusive to the story.

Giving the player the option to say ‘No’, to walk away from things their character doesn’t want to do, helps flesh out the character and makes more use of the interactive medium. Sure, punish me for it, give me an early game over screen because I didn’t stop the Final Boss. But leave that decision up to me. Maybe I just flee, not caring if the Final Boss destroys my planet. Maybe that drives me insane and my character commits suicide a few years later. That’s fine, because it was the result of a choice that I made as a player, not a choice that was predetermined for me.

Quantic Dream made great efforts here with Fahrenheit/Indigo Prophecy and especially with Heavy Rain – games in which my character had a good reason to start the quest, but which offered me the freedom to choose how to go on from there, with the consequences of my actions shown to me. It’s not perfect, but Heavy Rain is possibly one of the best positive examples.

People are creative. Look at all the ‘fan fiction’ written for The Sims, a game devoid of much inherent story and entirely choice-driven. Some things eventually become part of the game’s lore (the tragic accident of a family involving a pool with no ladder in The Sims 3), but more importantly, as a player I have the feeling that I’m part of an interactive medium. I’m basically playing with dolls, making up my own story as I go along.

There has to be a better way to merge the story-less, choice-driven nature of The Sims with the story-heavy, choice-less nature of most other games. I’m looking at the indie developer scene here – the people who gave us Papers, Please, Fez, or the aforementioned Shadowrun Returns (which arguably benefited from its rich source material). I feel that if such a game ever exists, it will come from an indie studio.

Help us, Indie-wan Kenobi, you are our only hope for a Keyser Soze or Vivian Sternwood.

Some Ruby concepts explained for .net developers

I’m normally a .net developer – it’s been my bread and butter for the past seven years and will be for several more. But it’s also important to me to keep in touch with other languages out there, including Ruby. Here’s my personal cheat sheet for remembering concepts and naming conventions.

Method Names are lower case and use underscores, as do Method Arguments. The result of the last expression is automatically returned – there is no direct equivalent of void, although nil can serve that purpose.

def my_method(some_argument)
  1 + 1 # implicitly returns 2
end

Local Variables are also lower case with underscores, and no special var keyword is required to declare them.

def some_method
  my_variable = 2
  1 + my_variable
end

Instance Variables – that is, a non-static field in a class – are prefixed with @. Somewhat surprisingly, they can be declared within a method.

class MyClass
  def do_stuff
    @test = 4
  end

  def testing
    2 + @test
  end
end

myc = MyClass.new
puts myc.do_stuff
puts myc.testing

This outputs 4 and 6. If I remove the puts myc.do_stuff line, this throws an error: test.rb:8:in '+': nil can't be coerced into Fixnum (TypeError).

Constructors are methods called initialize:

class MyClass
  def initialize(initial_value)
    @test = initial_value
  end

  def testing
    return 2 + @test
  end
end

myc = MyClass.new(3)
puts myc.testing

This outputs 5. Instance Variables are private by default, but Ruby has three special ways to declare a variable as public: attr_accessor, attr_reader and attr_writer. Changing the class to this:

class MyClass
  attr_reader :test

  # .. rest as above
end

myc = MyClass.new(3)
puts myc.test # outputs 3
myc.test = 4  # undefined method 'test='

So attr_reader is like public dynamic Test { get; private set; } in .net, while attr_writer is like { private get; set; } and attr_accessor is like { get; set; }.

To create property getters and setters, just create methods. In the end, that is what attr_reader etc. are doing, just like the .net auto-property syntax creates actual methods on compilation.

def test=(value)
  puts "I'm a property setter for @test!"
  @test = value
end

def test
  puts "I'm a property getter for @test!"
  return @test
end

Supposedly, attr_ methods are faster than manually implemented methods – I’m not sure if that’s true, but they are definitely the recommended way if you don’t need actual logic in your getters and setters.

The syntax above used a Ruby symbol, as evidenced by the colon – :test. This is the concept that took me the longest to figure out. In a way, symbols are like interned strings in .net: the same symbol always means the same thing, whereas two string instances may not be reference-equal despite having the same content. Generally, symbols should be seen as constant identifiers (they are in fact immutable). I recommend this blog post for some more information, but “interned string” seems to be the best .net analogue I could come up with.
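
A quick way to see the difference (my example – try it in irb):

# The same symbol is always the very same object ...
puts :test.object_id == :test.object_id     # => true
# ... while two equal strings are usually two separate objects.
puts "test".object_id == "test".object_id   # => false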

Class Variables are the equivalent of static fields, and prefixing a method name with self. is the equivalent of marking it static. There are some caveats regarding class variables and inheritance, though – see the sketch after the next example.

class MyClass
  @@static_var = 8

  def initialize(my_value)
    @instance_var = my_value
  end

  def testing
    @instance_var + @@static_var
  end

  def self.static_var=(value)
    @@static_var = value
  end
end

myc = MyClass.new(3)
puts myc.testing  # 11

myc2 = MyClass.new(4)
puts myc2.testing # 12

MyClass.static_var = 10
puts myc.testing  # 13
puts myc2.testing # 14
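
The biggest caveat, sketched out (my example, not part of the original cheat sheet): class variables are shared with subclasses, so a subclass can silently overwrite the parent’s value.

class Base
  @@value = 1

  def self.value
    @@value
  end
end

class Derived < Base
  @@value = 2 # overwrites Base's @@value instead of creating a new one
end

puts Base.value # 2, not 1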

Constants are not prefixed and use SCREAMING_CAPS syntax.

class MyClass
  SOME_CONSTANT = 20

  def testing
    4 + SOME_CONSTANT
  end
end

myc = MyClass.new
puts myc.testing # 24

Class Inheritance uses < BaseClass syntax. Like .net, Ruby does not support multiple inheritance but unlike .net, there are no interfaces.

class MyClass
  def initialize
    @test = 4
  end
end

class MyDerivedClass < MyClass
  def testing
    2 + @test
  end
end

myc = MyDerivedClass.new
puts myc.testing # 6

Modules in Ruby are a bit like single-namespace assemblies in .net. Modules can contain Constants, methods, classes, etc. The include keyword is like using in .net.

module MyModule
  SOME_CONSTANT = 20

  def MyModule.say_hello
    puts "Hello!"
  end
end

class MyClass
  include MyModule

  def testing
    MyModule.say_hello
    4 + SOME_CONSTANT
  end
end

myc = MyClass.new
puts myc.testing # Hello!, followed by 24

Modules do not support inheritance, despite being very class-like (in fact, Ruby’s Class class inherits from Module, which in turn inherits from Object). What’s somewhat noteworthy is that constants do not need to be prefixed with the module name, unless there is something “closer” in scope:

class MyClass
  include MyModule

  SOME_CONSTANT = 30

  def testing
    puts 4 + SOME_CONSTANT            # 34
    puts 4 + MyModule::SOME_CONSTANT  # 24
  end
end

The double colon (::) was described as the namespace resolution operator on Stack Overflow.

There is obviously a lot more to Ruby that doesn’t translate 1:1 to .net, but I hope the above code samples make it a bit easier to understand Ruby as a .net developer.

Relax NG Verification in .net (and a bit of Schematron)

I’ve been working with Docbook V5.0 a bit and started working on some processing tools to support my workflow. One of the big things is that the official Docbook Schema is Relax NG and Schematron.

Relax NG

In .net, you can create a validating XmlReader by passing XmlReaderSettings into XmlReader.Create, but the built-in ValidationType is limited to W3C XML Schema (.xsd) or Document Type Definitions (.dtd). Docbook has schema files for both, but neither is the official standard, owing to a slight lack of features in those schema languages.
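
For comparison, the built-in XSD route looks roughly like this – a sketch of exactly the path that does not cover Relax NG:

// using System.Xml;
// using System.Xml.Schema;
var settings = new XmlReaderSettings { ValidationType = ValidationType.Schema };
settings.Schemas.Add("http://docbook.org/ns/docbook", "docbook.xsd");
settings.ValidationEventHandler += (s, e) => Console.WriteLine(e.Message);

using (var reader = XmlReader.Create("DocbookTest.xml", settings))
{
    while (reader.Read()) { } // validation happens while reading
}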

Thankfully, the Mono team has made a Relax NG library and created a NuGet package that is usable in Microsoft’s .net. The package ID is RelaxNG:

PM> Install-Package RelaxNG

I’ve created a simple Docbook XML File for testing purposes and I’m using the docbookxi.rng schema file (since I’m using XIncludes).

// using System.Xml;
// using System.Xml.Linq;
// using Commons.Xml.Relaxng;
using (XmlReader instance = new XmlTextReader("DocbookTest.xml"))
using (XmlReader grammar = new XmlTextReader("docbookxi.rng"))
using (var reader = new RelaxngValidatingReader(instance, grammar))
{
    XDocument doc = XDocument.Load(reader);
    Console.WriteLine("Document is using Docbook Version " +
                      doc.Root.Attribute("version").Value);
}

There are two ways of handling Validation Errors (in the test case, I’ve duplicated the <title>First Part of Book 1</title> node, which is illegal in Docbook since title can only occur once in that scenario).

If no handler is set up, this throws a Commons.Xml.Relaxng.RelaxngException with an error like Invalid start tag closing found. LocalName = title, NS = http://docbook.org/ns/docbook at line 35, column 14.

The better way is to hook up to the InvalidNodeFound Event which has a signature of bool InvalidNodeFound(XmlReader source, string message):

reader.InvalidNodeFound += (source, message) =>
{
    Console.WriteLine("Error: " + message);
    return true;
};

source is the RelaxngValidatingReader as an XmlReader and allows you to look at the current state for further analysis or error recovery. message is a human-readable message like "Invalid start tag found. LocalName = title, NS = http://docbook.org/ns/docbook". The return value decides whether or not processing continues: if true, the reader skips over the error – in the end, I get a proper XDocument, but of course all guarantees of validity are off. If false, it throws the same RelaxngException as if no event handler were wired up.

Generally, I prefer to use a lambda closure that logs all errors during validation and sets a bool on failure, which then prevents further processing.
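
A minimal sketch of that pattern (the variable names are mine):

bool isValid = true;

reader.InvalidNodeFound += (source, message) =>
{
    Console.WriteLine("Validation error: " + message);
    isValid = false; // remember that validation failed
    return true;     // keep reading, so all errors get logged
};

XDocument doc = XDocument.Load(reader);
if (!isValid)
{
    // stop here – the document is not valid Docbook
}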

Schematron

Now, Relax NG is only one of the two parts of Docbook validation, although arguably the bigger one. Schematron is employed for further validation – for example, that book must have a version attribute if (and only if) it is the root element, or that otherterm on a glosssee must point to a valid glossentry. The Docbook Schematron file is in the sch directory, and for this test I’ve removed the <glossentry xml:id="sgml"> node from the DocbookTest.xml file. This still passes Relax NG, but it is no longer a valid Docbook document.

There isn’t much in terms of Schematron support in .net, but I’ve found a rather ancient library called Schematron.NET, of which I downloaded version 0.6 from 2004-11-02. This is messy, because I have to use the Docbook W3C XML Schema files, which have embedded Schematron rules – basically docbook.xsd, xml.xsd and xlink.xsd from the /xsd directory. Thanks to this article on MSDN for pointing me to the library, and to the fact that Schematron rules can be embedded into an .xsd using the appinfo element.

I also need to make sure to use the XmlTextReader and not any other XmlReader – Liskov be damned!

using (XmlReader instance = new XmlTextReader("DocbookTest.xml"))
{
    var schemas = new XmlSchemaCollection();
    schemas.Add("http://www.w3.org/XML/1998/namespace", "xml.xsd");
    schemas.Add("http://www.w3.org/1999/xlink", "xlink.xsd");
    schemas.Add("http://docbook.org/ns/docbook", "docbook.xsd");

    var schematron = new Validator();
    // the validator is then loaded with the schemas and run against the
    // document, which is what produces the exception below
}

This throws a NMatrix.Schematron.ValidationException with the message

Results from XML Schema validation:
  Error: Reference to undeclared ID is 'sgml'.
  At: (Line: 85, Column: 35)

There doesn’t seem to be an event handler; the code is very 2004-ish, with properties being set after processing. Overall, the whole approach is messy – I’m even validating the whole document against XSD again, after it has already been passed through Relax NG.

The library is also expecting the old Schematron 1.5 namespace of http://www.ascc.net/xml/schematron – which is fine for Docbook 5.0, but will be a problem once Docbook 5.1 comes out, since that uses the ISO Schematron namespace of http://purl.oclc.org/dsdl/schematron.

For 5.0 it does give proper Schematron validation, which is good enough for now, but overall this isn’t a great way to do Schematron validation. I’m not sure there’s a better solution, because I’d love to avoid starting my own .net 4.5 Schematron Validator project :)

My new Lenovo Thinkpad E440

Important Update: In late 2014, Lenovo started shipping their systems with adware that poses significant security threats to its users. I therefore recommend not buying any Lenovo products. The review will stay up for historical purposes, but my next laptop won’t be a Lenovo.

It’s been a while since I bought a new laptop. The last one I blogged about was an ASUS eeePC 1000HE, which is still in use as my sole Windows XP machine for interfacing with my Commodore 64 and for testing how games run on an old Atom with GMA 950 graphics. In 2010, I bought a 13″ MacBook Pro with a 2.4 GHz Core 2 Duo, which served me well until late 2013, when I started wanting something with a higher screen resolution (1280×800 just wasn’t that great for some things) and more power, without sacrificing battery life, Windows 7 compatibility or the ability to actually do work.

I ended up with a Lenovo ThinkPad E440.

Specs and Delivery

I ordered my E440 on February 24 as a BTO (build-to-order) for a bit less than $700, including tax and shipping. For that money, I got:

  • Intel Core i5-4200M CPU (2.5 GHz Dual Core with Hyperthreading)
  • 14.0″ 1600×900 AntiGlare Screen (16:9 Aspect Ratio)
  • Bigger Battery (62WH compared to the stock 48WH)
  • Intel 7260AC Dual Band (2.4/5 GHz) 802.11 ac/a/b/g/n Wireless
  • Windows 7 Professional 64-Bit
  • Intel HD 4600 Graphics
  • 4 GB RAM
  • 500 GB 7200 rpm hard drive and a DVD-R drive

The laptop was delivered on April 1 – that’s 36 days between ordering and delivery, which is rather ridiculous for a business laptop. Lenovo explained that an unexpected surge of orders had clogged up their manufacturing capacity, but still, that was a bit much.

The laptop has two memory slots, one of which was filled with the 4 GB of memory I ordered and one of which was empty. I had a fitting 4 GB module still lying around (Kingston KVR16LS11/4), so I upgraded to 8 GB RAM immediately. I also had a 256 GB Samsung 840 Pro SSD lying around, which immediately replaced the 500 GB hard drive.

Every laptop should use SSDs – they make a massive difference even over 7200 rpm drives, and the lack of moving parts increases the overall resilience. The machine is also extremely quiet, since pretty much only the CPU draws any real power. The graphics chip is an Intel HD 4600 – not the most amazing gaming chip, but it runs Reaper of Souls at 1600×900 perfectly fine, so it’s good enough for my mobile gaming needs.

For reference, the AmazonBasics 14″ sleeve fits perfectly, although there’s no space for any accessories.

Anti-Theft systems and other security features

The E440 comes with Intel Anti-Theft and Computrace. By virtue of being an anti-theft system, the laptop will continuously send data about its location over the internet, and this is enabled by default – not everyone needs that or is comfortable with it. For an in-detail look at Computrace, read this article.

Lenovo allows you not only to enable/disable the features – you can even permanently disable them. They warn you that you can never re-enable them, so I assume it’s wiping the option ROM. After permanently disabling both Intel AT and Computrace, I didn’t see any of the services that Securelist identified running.

The E440 also comes with a TPM chip, useful for BitLocker. A fingerprint reader is an option as well, although I ordered mine without one. Both features can be disabled in the BIOS if there’s no need for them.

Finally, UEFI Secure Boot can be toggled, and what’s even more important, you can enter “Setup Mode”, which allows you to enroll your own keys. This is important if you use a non-Windows OS but still want to use Secure Boot.

Preinstalled Software

I can’t really say too much about the preinstalled software. I noticed that the machine came with a bunch of stuff already on it, but the first thing I did was reinstall Windows on the new SSD.

Lenovo offers Windows 7 Professional as a BTO option, which is great since there’s no good successor to it on the market yet, but they include neither installation media nor a product key sticker (UPDATE: there is a real Windows 7 product key sticker – it sits under the battery). I had a Win7 Professional DVD lying around from another computer and used the free MagicalJellyBean Key Finder to extract the product key from the installation.

When it comes to non-Apple laptops, you should ALWAYS install Windows from scratch if you want a clean installation without any crap on it – but as I said, I didn’t do a thorough investigation of the E440’s preinstall before I wiped it.

The Screen

Two features sold me immediately: the screen is anti-glare, and it’s 1600×900 at 14″. Anti-glare used to be the default for laptop screens because it makes them better to work on, but with the influx of entertainment-focused laptops in the late 90’s, the anti-glare coating was omitted, leading to screens that have deeper blacks for games and movies but are a nightmare to work on.

From what I can see, the screen is a TN panel, not an IPS. This means color distortion when viewed at an angle. The E440 doesn’t distort much when viewed from the side, but the vertical viewing angle isn’t very wide. I’m a programmer, so that works perfectly fine for me, but if you’re in need of accurate color representation, don’t get a TN screen.

1600×900 at 14″ is awesome for me. I have enough real estate to keep everything open that I want, and I can still read it without having to use the utterly broken Windows DPI scaling feature. Here’s a screenshot, click for full size:

Mouse-replacement – Touchpad and Trackpoint

When it comes to laptops, there is one major issue: the touchpad that’s used in lieu of a mouse. Apple’s MacBook touchpad is phenomenal – it’s light-years ahead of anything the Wintel crowd sells. The problem is that I don’t really like Apple’s current lineup of laptops, and since I don’t use Mac OS X anyway (I ran Windows 7 on both my main machine – a mid-2010 Mac Pro – and on my MacBook Pro before I sold it), I could safely look at all the options on the market.

One of the key reasons to go with a Lenovo ThinkPad was the TrackPoint, a little “joystick” sitting in the middle of the keyboard (between the G, H and B keys) that can be used to move the mouse pointer. It takes a little while to get used to, but then it’s pretty awesome and precise. The main touchpad is acceptable as well, but the Synaptics TouchPad driver isn’t as good as what Apple offers in Boot Camp. Specifically, two-finger scrolling has a slight delay before it returns to normal operation, and it doesn’t work in all apps (e.g., in Steam it doesn’t really emulate a scroll wheel).

The Keyboard

The keyboard itself is awesome – the chiclet style and the size of the keys make typing straightforward and easy, and I didn’t have any issues hitting the right key (and only that key, not some neighboring keys as well). What really sold me is that there are dedicated PageUp/PageDown and Home/End/Insert/Delete keys. That actually took a while to get used to, because after working on a MacBook Pro for so long, I’m used to Fn+Up for PageUp or Fn+Left for Home and needed to retrain myself – but now it’s awesome for programming. The one thing I wasn’t willing to relearn, though, is the positioning of the Fn and Ctrl keys: for me, Ctrl is the bottom-left key, with Fn to the right of it. Lenovo has acknowledged this and offers a BIOS option to swap Ctrl/Fn into that order.

The F-keys default to their alternate functions, where F1 is “Mute” and F5 is “Brightness Down”. But again, there’s a simple way to change that: Fn+Esc switches it around, so that F1 is F1 and Fn+F1 is “Mute”. This setting persists across restarts, which is awesome!

It’s definitely one of the best laptop keyboards I’ve worked with.


Lenovo did a spectacular job on the BIOS. It’s a bit sad that this has to be explicitly pointed out, but they allow you to toggle or tweak almost every feature the laptop offers. I assume that, this being a business laptop, they expect to be selling to IT system administrators.

The laptop isn’t too heavy and has good battery life with the 62WH battery – I can get through a whole day of working without any issue. I don’t know whether the 7200 rpm hard drive would draw a noticeable amount of power, as I immediately replaced it with an SSD. Despite being made out of plastic, it doesn’t feel cheap, although of course it’s not in the same league as Apple’s unibody.

The touchpad cannot hold a candle to Apple’s, but as PC touchpads go, it is definitely workable. The TrackPoint is a great way to control the mouse pointer as well, once you’re used to it. Tap-to-click works pretty well, although there’s no “tap the bottom-right corner” to right-click (you can set it to “tap with two fingers to right-click”).

Overall, I’m very pleased with the E440, now that it has finally shown up on my doorstep. For the money, I got almost all the specs that I wanted without having to make compromises – although of course I’m not factoring in the $200 SSD that I still had lying around. But even for the $1000 that it would cost, it’s worth the price for me.

Putting in the SSD and RAM was easy – unscrew three screws at the bottom (with a normal screwdriver, not some special nonsense), unclip the plate the screws were holding, and voila: HDD and RAM are right there. This is how PC laptops have been built since forever, and it’s something I missed on my MacBook Pro, where changing the hard drive was a loathsome operation. Also, the battery is a removable part, as it should be.

I like it. A lot.