Hearthstone – Warrior vs. Paladin

I’ve tried really hard not to look at Hearthstone.  I feel like I play enough Blizzard games, and another one – a free one at that, with a low time commitment – would only serve to lock me out of other games I’d like to pick up.  But videos like these make me want to go sign up for the beta…don’t you devour enough of my time already, Blizzard?



Red Thread Games receives an additional 200,000 Euros in funding!


Red Thread Games has just received another substantial contribution to the funding of Dreamfall: Chapters! Technically, this contribution brings the total funding over 2 million dollars – I wonder if we’ll hear an announcement about The Longest Journey Home now? The Longest Journey Home was the 2-million-dollar stretch goal from the Kickstarter.

HEVC – Making a first HEVC bitstream

Alright!  So now we’ve got the reference encoder and a few options for playing back our files – let’s jump into making some newfangled video files!  This post will give you all the information you need to set up a simple workflow for creating HEVC files.

All of the tools we’ll be using for this demonstration are command line only so I would recommend you set up a work folder to make execution easy.  For myself, I set up my work folder as C:/hevc/ – no fuss, no muss.  All of the tools we use will be stored here as well as input/output files during our encodes.

The first tool we’ll need is the ubiquitous ffmpeg.  We’ll be using ffmpeg to convert source material into raw .yuv video to feed to TAppEncoder.

The second tool is the aforementioned TAppEncoder – the HM10.1 reference encoder.

Finally, we’ll need mp4box if we want to mux our HEVC streams into mp4 files.

For good measure we’ll also pick up a copy of VirtualDubMod – or you can use vanilla VirtualDub if you wish.  We’ll use this if we want to make frame-accurate cuts to our source file in order to segment the workload.

Now that we have all of our tools, we need to create a config file for TAppEncoder to use.  The very first config file I personally tested came from this blog (which also tells you how to build your own TAppEncoder, if you’re so inclined) and I believe it matches the ‘encoder_lowdelay_main.cfg’ provided with the reference encoder.  Create a new text document in your work directory and rename it something simple – I chose test.cfg.  Then paste in one of the pre-made config files and save it.

#======== File I/O =====================
BitstreamFile : mobile.hevc
ReconFile : mobile_out.yuv

FrameRate : 24 # Frame Rate per second
FrameSkip : 0 # Number of frames to be skipped in input
SourceWidth : 352 # Input frame width
SourceHeight : 288 # Input frame height
FramesToBeEncoded : 10 # Number of frames to be coded

#======== Unit definition ================
MaxCUWidth : 64 # Maximum coding unit width in pixel
MaxCUHeight : 64 # Maximum coding unit height in pixel
MaxPartitionDepth : 4 # Maximum coding unit depth
QuadtreeTULog2MaxSize : 5 # Log2 of maximum transform size for
# quadtree-based TU coding (2…6)
QuadtreeTULog2MinSize : 2 # Log2 of minimum transform size for
# quadtree-based TU coding (2…6)
QuadtreeTUMaxDepthInter : 3
QuadtreeTUMaxDepthIntra : 3

#======== Coding Structure =============
IntraPeriod : -1 # Period of I-Frame ( -1 = only first)
DecodingRefreshType : 0 # Random Access 0:none, 1:CDR, 2:IDR
GOPSize : 4 # GOP Size (number of B slice = GOPSize-1)
# Type POC QPoffset QPfactor tcOffsetDiv2 betaOffsetDiv2 temporal_id #ref_pics_active #ref_pics reference pictures predict deltaRPS #ref_idcs reference idcs
Frame1: B 1 3 0.4624 0 0 0 4 4 -1 -5 -9 -13 0
Frame2: B 2 2 0.4624 0 0 0 4 4 -1 -2 -6 -10 1 -1 5 1 1 1 0 1
Frame3: B 3 3 0.4624 0 0 0 4 4 -1 -3 -7 -11 1 -1 5 0 1 1 1 1
Frame4: B 4 1 0.578 0 0 0 4 4 -1 -4 -8 -12 1 -1 5 0 1 1 1 1
ListCombination : 1 # Use combined list for uni-prediction in B-slices

#=========== Motion Search =============
FastSearch : 1 # 0:Full search 1:TZ search
SearchRange : 64 # (0: Search range is a Full frame)
BipredSearchRange : 4 # Search range for bi-prediction refinement
HadamardME : 1 # Use of hadamard measure for fractional ME
FEN : 1 # Fast encoder decision
FDM : 1 # Fast Decision for Merge RD cost

#======== Quantization =============
QP : 32 # Quantization parameter(0-51)
MaxDeltaQP : 0 # CU-based multi-QP optimization
MaxCuDQPDepth : 0 # Max depth of a minimum CuDQP for sub-LCU-level delta QP
DeltaQpRD : 0 # Slice-based multi-QP optimization
RDOQTS : 1 # RDOQ for transform skip

#=========== Deblock Filter ============
DeblockingFilterControlPresent: 0 # Dbl control params present (0=not present, 1=present)
LoopFilterOffsetInPPS : 0 # Dbl params: 0=varying params in SliceHeader, param = base_param + GOP_offset_param; 1=constant params in PPS, param = base_param)
LoopFilterDisable : 0 # Disable deblocking filter (0=Filter, 1=No Filter)
LoopFilterBetaOffset_div2 : 0 # base_param: -13 ~ 13
LoopFilterTcOffset_div2 : 0 # base_param: -13 ~ 13

#=========== Misc. ============
InternalBitDepth : 8 # codec operating bit-depth

#=========== Coding Tools =================
SAO : 1 # Sample adaptive offset (0: OFF, 1: ON)
AMP : 1 # Asymmetric motion partitions (0: OFF, 1: ON)
TransformSkip : 1 # Transform skipping (0: OFF, 1: ON)
TransformSkipFast : 1 # Fast Transform skipping (0: OFF, 1: ON)
SAOLcuBoundary : 0 # SAOLcuBoundary using non-deblocked pixels (0: OFF, 1: ON)

#============ Slices ================
SliceMode : 0 # 0: Disable all slice options.
# 1: Enforce maximum number of LCUs in a slice,
# 2: Enforce maximum number of bytes in a slice,
# 3: Enforce maximum number of tiles in a slice
SliceArgument : 1500 # Argument for 'SliceMode'.
# If SliceMode==1 it represents max. SliceGranularity-sized blocks per slice.
# If SliceMode==2 it represents max. bytes per slice.
# If SliceMode==3 it represents max. tiles per slice.

LFCrossSliceBoundaryFlag : 1 # In-loop filtering, including ALF and DB, is across or not across slice boundary.
# 0:not across, 1: across

#============ PCM ================
PCMEnabledFlag : 0 # 0: No PCM mode
PCMLog2MaxSize : 5 # Log2 of maximum PCM block size.
PCMLog2MinSize : 3 # Log2 of minimum PCM block size.
PCMInputBitDepthFlag : 1 # 0: PCM bit-depth is internal bit-depth. 1: PCM bit-depth is input bit-depth.
PCMFilterDisableFlag : 0 # 0: Enable loop filtering on I_PCM samples. 1: Disable loop filtering on I_PCM samples.

#============ Tiles ================
UniformSpacingIdc : 0 # 0: the column boundaries are indicated by ColumnWidth array, the row boundaries are indicated by RowHeight array
# 1: the column and row boundaries are distributed uniformly
NumTileColumnsMinus1 : 0 # Number of columns in a picture minus 1
ColumnWidthArray : 2 3 # Array containing ColumnWidth values in units of LCU (from left to right in picture)
NumTileRowsMinus1 : 0 # Number of rows in a picture minus 1
RowHeightArray : 2 # Array containing RowHeight values in units of LCU (from top to bottom in picture)

LFCrossTileBoundaryFlag : 1 # In-loop filtering is across or not across tile boundary.
# 0:not across, 1: across

#============ WaveFront ================
WaveFrontSynchro : 0 # 0: No WaveFront synchronisation (WaveFrontSubstreams must be 1 in this case).
# >0: WaveFront synchronises with the LCU above and to the right by this many LCUs.

#=========== Quantization Matrix =================
ScalingList : 0 # ScalingList 0 : off, 1 : default, 2 : file read
ScalingListFile : scaling_list.txt # Scaling list file name. If the file does not exist, the default matrix is used.

#============ Lossless ================
TransquantBypassEnableFlag: 0 # Value of PPS flag.
CUTransquantBypassFlagValue: 0 # Constant lossless-value signaling per CU, if TransquantBypassEnableFlag is 1.

#============ Rate Control ======================
RateControl : 0 # Rate control: enable rate control
TargetBitrate : 1000000 # Rate control: target bitrate, in bps
KeepHierarchicalBit : 1 # Rate control: keep hierarchical bit allocation in rate control algorithm
LCULevelRateControl : 1 # Rate control: 1: LCU level RC; 0: picture level RC
RCLCUSeparateModel : 1 # Rate control: use LCU level separate R-lambda model
InitialQP : 0 # Rate control: initial QP
RCForceIntraQP : 0 # Rate control: force intra QP to be equal to initial QP


Your work folder should now look something like this:


Next we need to get some content to encode.  For this test I’ve downloaded the iPod version of Big Buck Bunny.  I’m using a smaller resolution here because the encoder is very slow and this is really just a test of the encoding process to make sure we have everything set up correctly.

Copy your source file (BigBuckBunny_640x360.m4v in my case) to your work folder.  Rename it something easy to type, like bbb.m4v.  Because this is a test we don’t want to process the entire video file, which is over 14,000 frames.  First we’ll convert the file into a raw AVI so we can cut it into pieces accurately.  To create the raw .avi file, open a command prompt and type the following:

cd c:/hevc/

ffmpeg -i bbb.m4v -pix_fmt yuv420p -vcodec rawvideo bbb.avi
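Raw yuv420p video is enormous, which is why we’ll only keep a short test segment. As a quick back-of-the-envelope check (using the rough 14,000-frame length of the full clip mentioned above), each yuv420p frame takes width × height bytes of luma plus half that again for the 4:2:0 chroma planes:

```python
# Estimate the size of raw yuv420p video: each frame stores
# width*height bytes of luma plus half that again for 4:2:0 chroma.
def raw_yuv420p_bytes(width, height, frames):
    bytes_per_frame = width * height * 3 // 2
    return bytes_per_frame * frames

# The full 640x360 clip is over 14,000 frames...
print(raw_yuv420p_bytes(640, 360, 14000) / 2**30)  # ~4.5 GiB

# ...while a ~300-frame test cut stays manageable.
print(raw_yuv420p_bytes(640, 360, 300) / 2**20)    # ~98.9 MiB
```

So make sure you have a few gigabytes free in your work folder before running the conversion.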


Open the resulting file in VirtualDubMod and cut out a segment of 300 or so frames.  Go ahead and disable the audio stream, set video processing to ‘direct stream copy’, and save the file out with a simple filename.  I used ‘bbb_test.avi’.  Next we need to strip the AVI header information so we have just a raw .yuv file.  We do that in much the same way we created our .avi file:

ffmpeg -i bbb_test.avi -pix_fmt yuv420p bbb_test.yuv

This gives us a working .yuv file which we feed to TAppEncoder along with our config file.  But first we need to update our config file to reflect our input.  Open test.cfg and change the following:

BitstreamFile : bbb_test.hevc       # the output file

ReconFile : z1.yuv     # the reconstructed (decoded) yuv output, useful for verifying the encode – I always name it to sort to the bottom of my folder so I can find and delete it easily.

FrameRate : 24    # should match the source framerate

SourceWidth : 640 # Input frame width

SourceHeight : 360 # Input frame height

FramesToBeEncoded : 912 # Number of frames to be coded – should match the number of frames of your source – this is not done automatically!
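Since FramesToBeEncoded isn’t detected automatically, it helps to derive the frame count from the raw file itself: a yuv420p frame is exactly width × height × 1.5 bytes, so the file size determines the count. A small sketch (the filename is just the one used in this walkthrough):

```python
import os

def yuv420p_frame_count(path, width, height):
    """Number of whole frames in a raw yuv420p file."""
    bytes_per_frame = width * height * 3 // 2
    frames, remainder = divmod(os.path.getsize(path), bytes_per_frame)
    if remainder:
        raise ValueError("size is not a multiple of the frame size - "
                         "check your width/height/pixel format")
    return frames

if os.path.exists("bbb_test.yuv"):
    print(yuv420p_frame_count("bbb_test.yuv", 640, 360))
```

If the division leaves a remainder, the width, height, or pixel format in your config doesn’t match the file, and the encode would come out garbled.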

There are plenty of other settings we could change – many of which would have a large impact on quality – but for now we’ll leave those alone.  Once you’ve updated everything, be sure to save the config file, and then we’re ready to run TAppEncoder:

tappencoder -i bbb_test.yuv -c test.cfg

And now we wait!  The HM10.1 reference encoder is single-threaded, so you can keep using your computer in the meantime.


Once the file is encoded you can play it back with one of the tools listed in the previous post.  If you use the Lentoid HEVC decoder, just rename ‘bbb_test.hevc’ to ‘bbb_test.hm10’.  If you’d like to watch it with the Osmo4 player, you’ll need to mux the .hevc file into an .mp4 file by running:

mp4box -add bbb_test.hevc:fps=24 bbb_test.mp4


Congratulations!  You’ve encoded your first HEVC video!


Next time we’ll look at settings to increase video quality (this encode was done at QP 32) and workflow optimizations that will allow us to speed up the encoding process by using faux multi-threading.
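As a preview, the ‘faux multi-threading’ trick is mostly bookkeeping: cut the source into contiguous segments, feed each segment to its own single-threaded TAppEncoder instance, and join the resulting bitstreams afterwards. A sketch of the segment math only (actually launching the encoder instances is left for that next post):

```python
def segment_ranges(total_frames, workers):
    """Split total_frames into `workers` contiguous (start, count)
    ranges that cover every frame exactly once."""
    base, extra = divmod(total_frames, workers)
    ranges, start = [], 0
    for i in range(workers):
        count = base + (1 if i < extra else 0)
        ranges.append((start, count))
        start += count
    return ranges

# Our 912-frame test clip split across 4 encoder instances:
print(segment_ranges(912, 4))  # [(0, 228), (228, 228), (456, 228), (684, 228)]
```

Each (start, count) pair maps directly onto the FrameSkip and FramesToBeEncoded settings in a per-segment copy of the config file.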

You can download the output files here:



HEVC – A first look at h.265 video compression

High Efficiency Video Coding (HEVC) is poised to make some big reveals this year, with many software developers showing demos of their individual HEVC implementations.  In particular, DivX, MainConcept, and Ateme all have beta HEVC encoders that should be available before the close of 2013, and plenty of other companies have at least put out press releases saying they’re working on similar implementations.  A host of decoders have also been released so far, with promises of many more.

HEVC is the successor to the h264 video format and promises similar quality at 50% of the bitrate you would need with h264.  Further, HEVC is targeted at high-resolution video and should see even greater gains when dealing with 4K video and beyond.  A detailed look at how HEVC works is available here – be prepared for your brain to hurt if you’re a layman like me.

As wonderful as the promises of HEVC may be we still don’t have much software to evaluate at this point, let alone content making use of the new format.  But as I’ve always had an interest in video compression on an abstract level I’ve decided to do a series of articles here looking at what software IS available now, how it performs, and hopefully get an idea of how this new format works.

So to begin with let’s look at the software we can use right now to create and view HEVC video streams.  And if you readers happen to know of any further software that should be added to this list please leave a comment with a link so that we can start to figure out this great new format together.

HM10.1 Reference Encoder


For encoding there is only one solution I’m aware of right now that the average user can get their hands on – the HM10.1 reference encoder.  This is the encoder that was used as a proof of concept for the bitstream and has very little in the way of optimization for speed or quality.  For reference, when the h264 MPEG-4 AVC reference encoder first launched it produced files of worse quality than the leading MPEG-4 ASP encoders of the time, such as Xvid.  So you can’t expect this software to show ALL of what HEVC can do, but we can use it to create some initial bitstreams and get an idea of HEVC’s potential.  In particular, I’d note that the reference encoder has no psychovisual optimizations at this point, and those have really been the main refinement in h264 video over the past several years.

I’ve uploaded a build of the HM10.1 reference encoder here: http://www.mediafire.com/download/owe7i0hwgn86btb/hm_10.1_r3419_release.7z  This build was pulled from this forum post if you’d like to get it from the source.

Using the reference encoder is a bit tricky as there aren’t many tools to support it at the moment and the configuration settings are not well documented.  I’ll be doing further posts detailing the workflow to make use of the reference encoder in the future.

For Playback we have several options available at the moment.

GPAC Osmo4 Player

HEVC anime in Osmo4


The Osmo4 player is a standalone video player which supports playback of HEVC streams in the .mp4 container.  Of the test decoders linked here it seems to be the slowest at decoding HEVC video but is the only solution that supports playing HEVC streams from a container.  This means it’s the only player that will let you mux audio with your video if you’d like to have a complete experience with your content.  So overall this is the player I would recommend at the moment and you can grab it here: http://gpac.wp.mines-telecom.fr/player/

Elecard HEVC Player (Alpha)

HEVC anime in the Elecard HEVC Player (Alpha)


Elecard HEVC player (Alpha) is a standalone player that only supports reading raw .hevc bitstreams.  So you can’t mux audio with your video using this solution, and they plaster a bunch of watermarks on the output so I can’t recommend this software for general use, though it may be useful for testing purposes: http://www.elecard.com/en/technology/researchlab/hevc-player.html

Strongene Lentoid HEVC decoder

HEVC anime with the Strongene Lentoid HEVC decoder


The Strongene Lentoid HEVC decoder is a DirectShow filter (I think?) that allows you to decode HEVC bitstream files through Windows Media Player.  It’s a bit strange in that it only decodes HEVC bitstream files with the extension .hm10, but you can easily encode examples with TAppEncoder and rename them from out.hevc to out.hm10.  This is probably the best solution as far as performance goes, with faster seeking and smoother playback than the other two options, but again, it can’t play back .hevc files muxed with audio.  You can download it at: http://xhevc.com/en/hevc/decoder/download.jsp  You also may want to take a look at their sample video files.

That’s the basic usable software that I’ve found available today.  The other option is to compile your own HEVC decoder from OpenHEVC or the libav Smarter Fork.  For myself, I never have luck building such solutions, so I can’t test them here.  I will note, however, that the maintainer of lavfilters has stated that lavfilters will support HEVC once the libav Smarter Fork makes it into libav main (which I guess is pretty obvious).  Here’s hoping that happens sooner rather than later.

So – these are exciting times for armchair video compression hobbyists!  Now that we have some tools to create and view HEVC files, next time we’ll look at actually creating some content.  Until then!

*update* A newer version of the reference encoder has been released.

World of Warcraft subscriptions down – What can Blizzard do?

In a recent financial report Blizzard announced that World of Warcraft subscriptions dropped once again last quarter, down another 1.3 million and bringing the game’s current subscriber count to 8.3 million.  This isn’t great news, but it’s also not unheard of in the middle of an expansion pack, and we have seen a lot of new competition rearing its head lately in the free-to-play arena.  So what can Blizzard do to improve its game and continue to retain subscribers?


Content.  The death knell of any MMORPG is lack of content.  Once players start having trouble finding something worthwhile to do in the game, they go try other things, and once they’ve stopped paying that monthly charge they become less likely to pick the game back up.  Blizzard claims to have addressed this issue by increasing the frequency of their major content patches, but I feel their increased patch schedule is either disingenuous or doesn’t deliver the solution they’ve promised.  Let’s look at the progression of PvE content in Mists of Pandaria and consider ways the flow of the game could be improved overall.

Mists of Pandaria launched with 9 heroic dungeons, 3 raids, 16 raid bosses, 2 world bosses, and numerous scenarios.  Gear progression was fairly straightforward: normal dungeons/scenarios -> heroic dungeons/justice gear -> LFR -> normal raids/valor gear -> heroic raids.  But after that, Blizzard dropped the ball with patch 5.1.  Yes, it was listed as a major content patch, but it didn’t offer any new raiding or dungeon content; the only things it made available for gearing purposes were a few more valor items.  5.2 added a host of new raid content – 12 new raid bosses in all and a few scenarios – but offered no new 5-man content.  This created a bit of a hurdle for leveling new characters, as the gear progression now required normal dungeons -> heroic dungeons -> 5.0 LFR -> 5.2 LFR -> 5.2 normal/valor -> 5.2 heroic.  Basically, Blizzard created no means for casual players to bypass the original 5.0 LFR for progression, while at the same time giving people no incentive beyond a few valor points to play 5-man content.  So the only way for many players – DPS players in particular – to progress is to either sit through hour-plus queues for Mogu’shan Vaults or very slowly build up valor points through 5-mans until they can enter Throne of Thunder LFR.  5.3 was released just recently, and again it has no new raid or 5-man content, but it does add heroic scenarios with gear that is slightly better than 5.2 LFR.  However, you have to form specific groups for heroic scenarios, so it is unlikely that many players will participate in them.

The current content progression looks like this:


For all of Blizzard’s vaunted claims that they’re putting out content faster than ever, we see that very little PvE content has been added to Mists thus far, and the content that has been added is not gated in a way that makes it easily accessible to casual players unless they are willing to grind for valor points.  Now, you can also gain valor through daily quests, and that seems to be the content Blizzard is prioritizing.  Rather than new dungeons, we get lots of daily quests tied to the current main story arc.  I assume Blizzard has prioritized this way because daily quests are easy to implement and have a higher impact on the general population than 5-man dungeons do, but I don’t think that necessarily makes them the better solution.  I do daily quests all the time and I hate them, but I have no choice because of the reputation requirements on valor gear – and as soon as I hit revered with any faction, I immediately stop running its dailies.  So daily quests, while they might statistically show as more significant for the game as a whole, aren’t fun for many people, and they aren’t going to be what drives people to resubscribe.

What Blizzard needs to do if they really want to preserve WoW’s image is expand the current development team and segment their patch development.  They need a simple content creation process with standard releases designed to tie together intelligently – one that offers new players easy inroads to current content while giving current players lots of options and things to see.  My idea for how this would best be achieved is a monthly patch cycle that repeats every three months.

Month one: a ‘big’ raid patch.  One development team should be assigned to do nothing but large raid content with 8-12 bosses.  On these releases we should see gear ilevel jump considerably, as it did from 5.0 to 5.2, along with gated LFR and new valor gear.

Month two: a new, smaller raid with 3-4 bosses.  This would be an optional raid, not necessarily tied to the current story arc, with gear only slightly better than the previous tier (maybe 2-4 ilevels).  Heroic content for this tier would be tuned – by design – to be extremely difficult, and one development team would always be working on these more exotic encounters.

Month three: new scenarios, 5-man content, and perhaps 10-man content.  These dungeons and scenarios would drop gear equivalent to the LFR modes of the previous tier, giving new players a way to gear up and current players a way to fill a slot they just haven’t had luck with.  Blizzard should also consider bringing back 10-man dungeons for a more epic feel in daily dungeoning; that content should drop gear equivalent to the current tier’s LFR, giving players another way into normal raiding.  One dev team should always be working on dungeon and scenario content, not necessarily tied to the main story of the current expansion – a random cave filled with monsters and a bit of backstory is just fine.

The next month, another big content patch starts the cycle over again.

I’d like to see the content progression look something like this:


OPTIONS OPTIONS OPTIONS and loads of content – that’s what will help you keep subscribers, Blizzard.  And while I know that, sadly, 10-man dungeons are probably never coming back, nor will Blizzard ever produce this much content no matter how much money they make – hey, I can dream.

Dreamfall Chapters – A Crowdfunded Sequel from Red Thread Games


For those who may have missed it, Dreamfall: The Longest Journey is finally getting a sequel to (maybe) tie up all the loose ends, courtesy of Kickstarter and the tens of thousands of fans who were willing to back Ragnar Tornquist to make it happen.  The full details of the Kickstarter can still be found at the Kickstarter page, including the stretch goals that were met and backer perks.  Further discussion of the game and announcements have since moved to the official Dreamfall Chapters forums and Red Thread Games’ website.  As of now the game is slated for a November 2014 release on PC, Mac, and Linux – so it should be a good Christmas next year!

For myself, I’m looking forward to this game and its continuation of The Longest Journey universe.  I must admit that I never actually finished the original Dreamfall, so I’m in it more for the tie-in with The Longest Journey at this point, but I have picked up a copy of Dreamfall on Steam to play in the intervening months.  I’ve played several of Ragnar Tornquist’s games over the years, and what I’ve always loved about them is the intricate and interesting world building that backs up whatever stories he presents.  We saw such a world in The Longest Journey with the dual nature of Stark vs. Arcadia, and more recently in The Secret World, where a huge over-arching story was woven through six zones, each filled with its own individual mythologies.  To be blunt, he’s a developer I’m happy to throw my money at, and I’m sure he and his team will put out something memorable.  I remember reading an interview with Tornquist once where he said something along the lines of, “Yes our games are rough around the edges, but they’ve got heart.”  Which is how I’ve always felt about them as well – games like The Longest Journey, Dreamfall, and The Secret World aren’t winners on technical merit as far as graphics go, and they often have difficult gameplay elements, but there’s just something about the characters and the stories that makes them worth playing through and enjoying.

For the time being, Red Thread Games is hiring programmers and artists, lining up voice talent, and stocking its refrigerators with frozen pizzas (do they really need refrigerators in Norway?).  Tornquist and co. have been doing a lot of interviews and releasing a few development videos so far, and that’s probably the best way to keep up to speed with what they’re working on.  The next big news should come at Rezzed, where Red Thread Games will have a developer session and release new content – they’re scheduled to show on June 22 at 3:00 PM.

Another game worth keeping an eye on!

Rift goes Free to Play June 12 – And you can get Storm Legion for free!


There have been a lot of great MMORPGs making the jump to free to play recently – The Secret World, Tera, and now Rift, which is poised to join the herd on June 12.  Five years ago I don’t think most people would have expected there to be so many really great free-to-play options available.


One exciting part of Rift going free to play is that you can currently get Storm Legion and 30 days of free game time just for playing Rift Lite, which will also grant you full access to the game on June 12.  Otherwise you’d have to pay $39.99 for a digital edition of the game or $19.99 for a boxed copy at your local GameStop.  Note that current accounts are not eligible for the 30 days of free game time, but you can still get Storm Legion for free on an existing Rift account if you’re willing to pay $14.99 for a one-month subscription.  This will also grant you some perks when the game goes free to play.  Or you can create a new Trion Worlds account on a free e-mail address.

To get Storm Legion for free you need to create an account on Raptr.  Raptr is a social networking website and instant messenger targeted at video game players – essentially, look at it as a Steam-style interface for tracking game time and achievements, with options for sharing your activity through Facebook, Twitter, and your Raptr friends list.  Personally, it’s not the kind of software I’d use except for one thing: they also offer rewards for some games.  For Rift, if you play for 14 hours you get a free copy of Storm Legion and 30 days of game time, and you can log those hours through Rift Lite, even just idling on the character select screen.


First go to http://www.riftgame.com/en/ to create a Trion Worlds account and download the Rift Lite Client.

Once that’s done, go to http://raptr.com and create an account.  You can tie it to a social media account, but personally I have no prior knowledge of this company, so I’d recommend you make a unique ID and password for your login.  Download the Windows client to track your game time.

Once you’ve installed Raptr it should automatically detect Rift as one of your installed games.  If not, just add it manually.  Boot up Rift and start playing – that’s it!  If it’s your first time playing Rift you can level a character up to 20 to start getting a feel for the game, or if you’re using an old subscription, just create a new character to stand around in Freemarch until you’re rewarded.

After you’ve reached 14 hours you’ll be given a code to enter on the Trion Worlds codes page – again, I’d recommend playing it safe by entering this code through your own browser and not the Raptr client itself (though it’s probably okay).

Enjoy your time in Telara!


The Walking Dead – A Final Review



The Walking Dead has been an interesting game to follow during its development and in the months since its conclusion.  It is a game that seems to draw either high praise or great derision depending upon whom you ask, and people’s impressions of the game as a whole swing from ‘perfect’ to ‘laughable’.

I did a mini-review of the first two episodes of the series quite a while ago and never bothered to write down my final impressions of the game.  I’d like to remedy that now.

Let me start by listing the points of the game I found to be positive.  The voice acting, character interaction, and difficult ethical decisions placed before the player are all laudable.  I would imagine that the majority of the budget went toward voice recording, and having emotionally charged interactions between the actors really helps bring the game world alive.  The story is – in and of itself – a tepid zombie survival story, but the game manages to overcome the uninteresting setting with the smaller, more intimate character stories we get to explore.  The top-notch character performances really help bring that home.  And the moral quandaries ask much more of the player than we find in most other games.  In most FPS games you’ll find people randomly shooting NPCs just to see if they can hurt them, but in The Walking Dead you’re asked to really consider whether it’s acceptable to take a human life, even a life that is as good as over.

For all of that, the game is severely hampered by poor game design, uninspired adventure elements, low-quality art direction, many, many game-breaking bugs, and a lack of meaningful development from the decisions the player is asked to make.  I won’t go into these things in too much depth, but I will say that taken together, and further taken with Telltale Games’ response (virtually none) to those affected by bugs on all platforms, I really can’t recommend this game.

The main problem is the savegame bugs, which have been rife on the PC but have also been seen on the Xbox, PlayStation, and iOS versions of the game.  The game is designed to track your progress through each episode and, at the end, tabulate your ‘major’ decisions against the rest of the community playing the game.  When you begin a new episode this data is passed onward, and the content should match your past decisions.  So in Episode 1 you can choose to save either Doug or Carly; if you save Carly, then you play with Carly in the following episodes – seems obvious enough!  However, the game also allows you to start a chapter with random choices – this is for when you own the whole game and want to start a new game from, say, Episode 3; the game simply makes up decisions for what happened during the first two episodes.  But the game had problems – mainly on PC and Mac – where decisions wouldn’t be carried over from previous episodes and random decisions would be used instead.  Oftentimes people didn’t even realize these changes had taken place, because the impact is generally so small – a different line of dialogue here and there – but it could be much more obvious if suddenly Doug comes back to life and Carly is gone.  This led to a huge disconnect for me, as I was hit with this bug on Episodes 3, 4, and 5.  Basically nothing I did in the last half of the game stuck.  There were unofficial fixes for this problem posted on the forums, but I didn’t make use of them as I had already overwritten my Episode 3 savegame by the time I realized what was happening.  After that I was too disgruntled about the buggy state of the game to take it seriously.  Suddenly being all buddy-buddy with Kenny and not knowing what kind of example I had set for Clementine made it feel like my personal touch on the story had been completely destroyed.

The bugginess of the save system is a symptom of a larger problem with the game – lazy design.  You can see it in so many aspects: the lack of working controller support, the inability to remap input keys, the lazy re-use of the same limited backgrounds over and over rather than creating just a few more interesting places to look at, the poor animation and character modeling, and an engine that runs far worse than its final output warrants.  Telltale comes off as a hack company just wanting to cash in on the brand rather than put serious time into creating a polished piece of art.

And further damning them is the fact that the developers have had no official response to these problems.  They have no direct online customer service to speak of aside from e-mail; all of their tech support is done on their company forums by individuals who specifically state they’re not employed by Telltale.  When you’ve got a GotY hit that has made you millions and millions of dollars, it says something about a company that they’re not willing to pay anyone to help customers experience the game the way it was intended.

So for myself, The Walking Dead went through quite a few revisions in how I felt about it.  In Episode 1 I felt let down by the adventure elements, but in Episode 2 I found I was really liking the game due to its characters and ethical decisions.  In Episode 3 I was miffed that I lost progress and found the story less compelling than Episode 2 – really, not enough happened in Episode 3.  Episodes 4 and 5 saw similar technical problems but finished stronger.  While the story and game ended on a good – not great – note, the technical faults of the product left me feeling really let down and unsatisfied with my purchase.  I won’t be picking up further Telltale games, even if reviewers are willing to throw GotY awards at them.