Tryst with video recording.

Shooting a movie for the Doofus game turned out to be more than a headache; a bad case of migraine I must say. Well, it all began soon after releasing the game. The logical next step was to shoot a movie/video to put on YouTube. What was supposed to be a 2-hour job turned out to be a lot harder than I had anticipated. Most screencap utilities do a pretty good job of capturing screen movies; what I failed to realize is that most of them are hopeless at capturing Direct3D or OpenGL rendered visuals within a game. I am extremely disappointed with the capture software available for recording an in-game movie. I tried several applications, both free and commercial, but all of them turned out to be poor: either extremely slow or extremely buggy.

In the end I had to manually write an AVI capture facility into the engine code; i.e. physically get the back-buffer, StretchRect it into a texture, download it off the GPU and store its contents into an AVI file via a bitmap, frame by frame. Similarly with the music and game sounds, for which I had to code in wave capture in OpenAL. Whew, done! Unfortunately not all went as planned. I soon realized that the video and audio streams in the recorded AVI file went completely out of sync. That's because the game's frame-rate varies considerably while playing, whereas the sound is always played at the same rate, and unlike the game, the AVI file's frame-rate is fixed. So after a one-minute shoot, I could clearly notice a mismatch between video and sound. I tried to correct the problem, but it still persists. That said, at least the results of video capture were better than any third-party application I had tried before, so it wasn't a total waste of time.
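For the curious, the per-frame grab under the Direct3D renderer looks roughly like this (a minimal sketch under Direct3D 9; error handling is omitted, the surfaces would really be created once and reused, and WriteFrameToAVI is a hypothetical stand-in for the VFW-based AVI writer):

    #include <windows.h>
    #include <d3d9.h>

    // Grab one frame: back-buffer -> render-target surface (StretchRect)
    // -> system-memory surface (GetRenderTargetData) -> locked bits.
    void CaptureFrame(IDirect3DDevice9* device, UINT width, UINT height)
    {
        IDirect3DSurface9 *back = 0, *rt = 0, *sysmem = 0;

        device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &back);

        // StretchRect also handles the down-scale (to 640x480 in my case).
        device->CreateRenderTarget(width, height, D3DFMT_X8R8G8B8,
                                   D3DMULTISAMPLE_NONE, 0, FALSE, &rt, 0);
        device->StretchRect(back, 0, rt, 0, D3DTEXF_LINEAR);

        // Pull the pixels off the GPU; this read-back stalls the pipeline,
        // which is a big part of why in-engine capture is slow.
        device->CreateOffscreenPlainSurface(width, height, D3DFMT_X8R8G8B8,
                                            D3DPOOL_SYSTEMMEM, &sysmem, 0);
        device->GetRenderTargetData(rt, sysmem);

        D3DLOCKED_RECT lr;
        if (SUCCEEDED(sysmem->LockRect(&lr, 0, D3DLOCK_READONLY))) {
            // Hypothetical writer: feeds the bitmap bits into a fixed-fps AVI
            // stream (AVIStreamWrite under VFW). Frames must be duplicated or
            // dropped against a wall-clock timer, or the A/V sync drifts;
            // exactly the problem described above.
            // WriteFrameToAVI(lr.pBits, lr.Pitch, width, height);
            sysmem->UnlockRect();
        }

        sysmem->Release(); rt->Release(); back->Release();
    }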

So yeah, I could shoot video clips, albeit not as good as I would have liked. I wanted a 1024×768 video; all I could manage was a 640×480 one at pretty moderate quality, since all the streaming was done into an MPEG-4 compressed stream and there was a noticeable loss in quality. Then came the next challenge: editing the video into a full streaming movie. Windows Movie Maker was the only free option available, and the app is not too difficult to use. However, it encodes videos only in WMV format; I couldn't locate an MPG, AVI or FLV option. That meant I needed to convert the movie to a Flash movie (FLV) so it could be streamed off the Internet using an SWF flash plugin. Bah! WTF! Well, it turns out ffmpeg can re-encode movie files to most formats, including FLV, and it's free. Thank you, ffmpeg.

Then there was YouTube. It seems that when you upload a video, YouTube's servers convert and re-encode the file using really poor quality compression. I am not sure which codec YouTube's FLV encoder uses, but the results turned out to be a blocky, pixelated mess. I guess after so many conversions and format switches, the video quality on YouTube was bound to end up pretty poor. You can compare it with the videos on the Doofus website (the larger one here) and you will understand what I mean.

Bah! The next time I am directly streaming content into an external HD video recorder via the TV-out option of the video card to avoid such craziness!

Qt to go LGPL.

That's really great news! Qt, the open-source, cross-platform tool-kit/framework from Nokia (formerly from Trolltech), is going to be released under the more liberal LGPL license. What this means is that you can now use Qt in any of your projects provided you comply with the LGPL. Does it also mean that we could finally see all those wonderful KDE apps ported across platforms? I sure hope so. KDE is built on top of Qt and shares a lot with it, so it's fair to assume that KDE too could benefit from this move.

Nokia states that having Qt under the LGPL will allow “wider adoption”, and it may very well turn out that way. The earlier GPL license was, in my opinion, hindering the adoption of the toolkit, so this is a welcome development indeed. Qt is a very polished GUI toolkit, there is no denying that; however, the license may not be the only reason why developers choose other tool-kits/frameworks over Qt.

I have used Qt quite a lot in the past, both for commercial and open-source development. However, it's been some time since I last dabbled with the toolkit/framework, and over the years I have slowly moved on to other tool-kits like wxWidgets. I have never been too fond of Qt's moc-compiler, which can be a pain to work with when the project size gets large. Having said that, one can't dismiss the fact that Qt is probably the leading cross-platform toolkit out there. It provides a huge number of widgets and a myriad of functionality that would have to be rewritten or re-invented if one were to use any other toolkit. It offers a strong development environment and an equally strong GUI designer, something often missing in other tool-kits. It has a proven legacy and is used by companies big and small for almost all types of GUI work.
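Moc, for those who haven't used Qt, kicks in on any class that declares the Q_OBJECT macro: it generates a companion source file implementing the signal/slot plumbing, and it's that extra code-generation step (and keeping it wired into the build) that gets unwieldy on big projects. A minimal example of the kind of class moc has to process (this is just the stock signals-and-slots pattern, not code from any of my projects):

    #include <QObject>

    // moc scans this header for Q_OBJECT and emits a moc_*.cpp file
    // implementing the meta-object and signal plumbing; forget to run
    // moc on it and you get the infamous "undefined reference to vtable".
    class Counter : public QObject
    {
        Q_OBJECT
    public:
        Counter() : m_value(0) {}
        int value() const { return m_value; }

    public slots:
        void setValue(int value)
        {
            if (value != m_value) {
                m_value = value;
                emit valueChanged(value);   // resolved via the moc-generated code
            }
        }

    signals:
        void valueChanged(int newValue);

    private:
        int m_value;
    };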

Would I switch to Qt if it were LGPL? No, probably not. I am perfectly happy with the Code::Blocks and wxWidgets combo and I don't see any reason to move to Qt. Most of my projects use pretty complex but consistent UIs, and wxWidgets serves me pretty well in that regard. The game builder I am currently working on fits nicely with the existing wxWidgets framework, and the toolkit offers me more than what I need. So I personally see no reason to switch.

Is it another year already?

A very Happy New Year to all. A bit belated I know, but I was kinda busy doing nothing. Well, not really. Yeah, I have been taking time off, but I was also busy with other activities, most importantly marketing the game.

So what was 2008 like? For me it was pretty uninteresting. Not that I didn't enjoy it; it's just that there was precious little in the way of what I like to do best: research. Most of 2008 was spent on fixing bugs, play testing, hardware testing, level creation and solving some insanely complicated issues that shouldn't have been there in the first place, plus some unavoidable circumstantial problems. Most of the coding was equally uninteresting. The majority of the time was spent on getting things working right with gameplay and design. Not the most pleasurable of things, I must say, at least not for me. That said, a lot of ground work was done w.r.t. the engine, most of which will not have to be repeated for some time to come. So that's a big positive, something I can take away from 2008 as being extremely productive.

Having said that, the biggest hit of the year for me is of course the release of the game, which took far more time than I had initially anticipated. True, it turned out OK (great 😉 ) given the budget, time and resource constraints, but I would have liked to do more. Maybe everything that was missed in this one can quickly be added to the next one. A causal analysis is due, but I would like to hold on to that a bit longer, at least till we finish up the final marketing parts I am currently focusing on. A part of last year was also spent in starting 3D Logic Software, and there were a lot of things that had to be done before we went online. Unfortunately they accounted for a pretty big delay in the launch of the game.

On the tech front, 2008 was equally low-key. Very little in the way of interesting developments; most of what happened was evolutionary rather than revolutionary. On the OS front XP still rules and will probably do so in 2009 as well. However, the year belonged to the underdog, Apple. Both their OS and their products gained significant market share and will probably continue to do so in 2009. Linux has always been interesting and 2009 will be no different; it grows from strength to strength in some areas and remains the same in others. If anything, I am looking forward to Linux in 2009, with some interesting developments on the horizon.

In 2008 we saw a resurgence of the GPU battles, with ATI throwing in some impressive technology, and that's a good thing. For the first time I am an owner of an ATI card (HD 4850), and though NVIDIA held on to the top spot (barely), ATI was close behind, even edging out in front at times during the year. Then again, we can't forget general purpose computing on the GPU. The year has been interesting for the GPU and GPGPU. Powerful cards with supercomputing capability were unveiled, and this year will see more power being packed into cards as the GPU titans clash with better and more powerful weapons at their disposal. Oh, and let's not forget Intel here. Intel finally unveiled Larrabee, so you could very well have another titan arising in those battles.

Personal wish list for 2009.

  • Intel finally comes around to putting a proper on-board GPU with at least good hardware T&L, and releases moderately good drivers.
  • Microsoft releases DirectX 11 for XP along with Vista and Windows 7.
  • OpenGL spec gets an overha….. well, forget it!
  • Linux gets a single package-management/installer system that everyone across the board adopts, and most importantly is easy to use and deploy.
  • The economic downturn ends.
  • All people in the world become sane and the killing of innocent people stops completely.

That's all for now. 😀

Once again a Happy New Year.

…7..6…5..4..3.2.1……Launched!

Doofus Longears The Game.

😀 Yes, we have launched the game. Find its downloadable demo at its very own website (www.doofuslongears.com).

10…9…8…

A lot has happened on the game front as well. First, let me start off by letting people know… I have launched 3D Logic Software. That will be the business name under which the Doofus game(s) will be released.

As far as the game release goes, the countdown has begun! I have not had too much time to update the blog since we were all working hard at the final push towards the finish line. Yes, the Doofus Game is to be released very soon. Keeping my fingers crossed.

Gamepads, Joysticks! How do you play with those?

I integrated joystick support into the game engine a long time ago, but I never actually played the Doofus game using a joystick or a gamepad until now. One of the testers logged an issue last week saying that the game's camera movement was a bit slow with game controllers in general, so I decided to play the game myself with a joystick. For the record, I never play any game with any accessory other than the keyboard and mouse, and after my recent experience with the gamepad, I must say I missed the mouse quite a bit. Maybe it's just me, or maybe I have held a strong disdain towards game controllers ever since my days with God Of War; though I am a total fan of the GOW series, the experience with game controllers while playing that game was more than a little unpleasant. I think I have been playing games with the keyboard and mouse for so long that I have simply grown too accustomed to them. However, and this could very well just be me, I find controlling the camera with the mouse far simpler and more intuitive than with a gamepad or joystick axis.
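The tester's complaint, incidentally, is the sort of thing that comes down to dead-zone and sensitivity tuning in the axis-to-camera mapping. A rough sketch of what that path typically looks like (the function name and constants are illustrative, not the engine's actual code):

    // Map a raw joystick axis to a camera yaw delta, frame-rate independent.
    // rawAxis is normalized to [-1, 1]; dt is the frame time in seconds;
    // sensitivity is in degrees per second at full tilt. A "slow camera"
    // complaint usually just means sensitivity is set too low.
    float AxisToYawDelta(float rawAxis, float dt,
                         float deadZone = 0.15f, float sensitivity = 120.0f)
    {
        float a = (rawAxis < 0.0f) ? -rawAxis : rawAxis;
        if (a < deadZone)
            return 0.0f;                              // ignore stick noise near center
        float t = (a - deadZone) / (1.0f - deadZone); // rescale past the dead zone
        float sign = (rawAxis < 0.0f) ? -1.0f : 1.0f;
        return sign * t * t * sensitivity * dt;       // squared response for finer aim
    }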

I tried a lot of different games this week with a gamepad which, for the better part of this year, has sat inside the cupboard. I told myself, “It's just a matter of time before I get the hang of this thing.” No chance! With every game I try it's the same story: I just give up after struggling with the controller for about 10 minutes. It's been like 3 days and I still can't control the Doofus game's third-person camera, which, by the way, is not at fault 🙂 . For me, controlling Doofus' third-person camera just feels a lot more natural with the mouse than with the gamepad. Not that I can't do it, it just feels a lot more comfortable with the mouse. Fortunately for people who dislike the mouse, Doofus does run perfectly well on any game controller.

Some people say game controllers are great for flight simulators and maneuvering vehicles; sorry, I haven't had time to play those. I can tell you FPS games are almost impossible to play: you can't aim with these things and you get fragged pretty easily. Maybe combat games fare better, but I haven't had time to play those either. I ran the Tomb Raider demo I have on my system and even there I found my gamepad to be more than a challenge.

So, after this bout of testing, the gamepad goes right back into the desk it came from. OK, maybe I have ranted enough for one post!

The HD 4850 and the story with AMD/ATI.

First, the HD 4850. I was testing the game on the new HD 4850 (Palit 512MB) today and observed some interesting things about the graphics card. For one, it gives serious bang for the buck. Doofus 3D clocked about 140 FPS at a resolution of 1024×768, 16x AF, with graphics quality set to high. Even with 2x AA Doofus 3D clocks more than 120 FPS, and I have a strong suspicion the game was going CPU-bound at those frame-rates, since the machine has a 3-year-old CPU. I can tell you for a fact the card is a serious performance monster, but then again Doofus 3D ain't a top-line game. However, for me this is the first time I have seen Doofus 3D under 4x AA and 16x AF running at a playable FPS, since up until now I have had only GeForce 6200 and 6600 (and to some extent 8600) cards. There is no denying that the HD 4850 is more than worth its price for someone who is looking for a budget card and expects to run most of today's top-line games. The card runs a little hot, but that's to be expected given the number of triangles it can push and the effects it can deliver. Hats off to AMD/ATI in that regard. If you are looking for a mid-range card right now, the HD 4850 is excellent value for money.

That was the overview from a non-programming point of view. Now the programmer in me has something to say. The card may be excellent, but things are not all that cozy with ATI's drivers. The OpenGL drivers are a mess, with the bundled driver not even supporting extensions like EXT_stencil_two_side. Even basic functionality (glDrawRangeElements(), for example) seems broken at times, even showing messed-up graphics when using vertex arrays on older cards. The exact same functionality works fine under DirectX. Let's say it's safe to assume that the GL drivers haven't been updated in a while and/or AMD/ATI just isn't interested. The only issues reported in this round of testing were on ATI cards, so I had to literally debug the application on ATI hardware to ascertain that these were indeed driver problems. Some of the issues I have mentioned occur on, guess what, the HD 4850 too. The only workaround seems to be vendor-specific hacks! That doesn't make me a happy programmer at all!
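The vendor-specific hacks amount to something like this (a bare-bones sketch; the flag names are mine, and yes, sub-string matching on the extension string is itself a known sloppy practice):

    #include <windows.h>
    #include <cstring>
    #include <GL/gl.h>

    // Sniff the driver at startup and route around functionality known
    // to misbehave on it.
    bool gUseDrawRangeElements = true;
    bool gUseTwoSideStencil    = true;

    void DetectDriverQuirks()
    {
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        const char* exts   = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));

        if (vendor && std::strstr(vendor, "ATI")) {
            // Fall back to plain glDrawElements on the paths that break.
            gUseDrawRangeElements = false;
        }
        if (!exts || !std::strstr(exts, "GL_EXT_stencil_two_side")) {
            gUseTwoSideStencil = false;   // do the stencil work in two passes instead
        }
    }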

The story with Direct3D is a lot better; no issues were observed under the DirectX renderer of the game. That just tells you something, doesn't it?

Tweaking the game to run on a wide range of hardware.

For the last week I have been involved in a rather uninteresting activity: literally throwing the game at all possible hardware configs hoping it will run. All of this (yes, again) to find out how the game fares when exposed to different hardware configurations. It may seem like this activity is rather mundane; let me assure you, it is. Well, not entirely 😀 . It takes some effort to get a game to scale seamlessly to all kinds of hardware, and currently I am enduring all the pain of crappy drivers and broken functionality, which, should I say, underscores some of the major headaches in real-time graphics development. It's not like you can throw the game out with its peak settings on and expect it to run on a crappy Intel on-board graphics card; such a thing will just end in disaster. The game must scale to different kinds of hardware, seamlessly and effectively, and in our case especially so.

Doofus 3D is uniquely placed. It doesn't aim to be a top-line, hardware-intensive, hard-core-gamer-only, triple-A (AAA) title. Neither is it a 2D game capable of running flawlessly under software-rasterized graphics on your grandma's old-school PC. It is geared towards intermediate-level hardware, the kind most people have on their work laptops and home desktops. This effectively means an extremely wide range of hardware to cater to, which in turn means scaling the game's software paths (internally) based on a *lot* of underlying factors. Assuming a player has a specific piece of functionality available on his hardware setup can be catastrophic: such an assumption could mean the game failing outright on a machine, and the potential loss of a buyer in the end.
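To make that concrete: under the Direct3D renderer, "never assume" boils down to querying the device caps up front and picking a rendering path from what is actually there. A rough sketch (the tier names and thresholds are illustrative, not the engine's actual settings):

    #include <windows.h>
    #include <d3d9.h>

    enum QualityTier { QUALITY_LOW, QUALITY_MEDIUM, QUALITY_HIGH };

    QualityTier PickQualityTier(IDirect3DDevice9* device)
    {
        D3DCAPS9 caps;
        if (FAILED(device->GetDeviceCaps(&caps)))
            return QUALITY_LOW;                    // can't even query? play it safe

        // Without pixel shader 2.0 most of the fancy paths are out.
        if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
            return QUALITY_LOW;

        // Too few simultaneous textures cramps the multi-texturing paths.
        if (caps.MaxSimultaneousTextures < 4)
            return QUALITY_MEDIUM;

        return QUALITY_HIGH;
    }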

While drawing up the specs of Doofus 3D we were especially careful not to go overboard with graphics galore. Even with careful planning there was significant feature creep, and with each new feature that was added, new countermeasures had to be put in place so that the game would still scale to lower-end hardware. Not everything was straightforward, but we did manage to push it through. If you have been following this blog for some time, you will know this is not the first time I have done this; I (personally) run such tests after each beta (feature addition/feature freeze) of the game. That is probably why we haven't faced too many problems this time around.

For Doofus 3D we followed a process that is a bit different from traditional software development. Every beta under this game project was actually a feature-complete, runnable version of the game; before or between betas, every release was an internal alpha version. A beta meant, “a set of features is complete enough to be tested”. After each beta, each feature was tested on various hardware setups. Something like an iterative method of software development, but not quite; I would say a process tailored specifically for our project, and more specifically for our situation, given our limitations.

Doofus 3D runs on most middle-rung hardware without too many problems. It will run on on-board graphics cards too, but I find Intel on-board graphics to be an abomination: hopeless hardware support for 3D graphics and equally crappy driver support! Enough reason for the engine to scale the game down to a low setting when it detects an Intel graphics card. The situation with NVIDIA and ATI cards is a lot better, with ATI's low-end cards (at the same price point) consistently outperforming NVIDIA's. That said, NVIDIA has the most stable hardware and drivers, and most settings work uniformly across cards and driver setups, though there can be problems there as well. ATI's drivers can be buggy at times, and in the case of OpenGL can be totally broken. Fortunately the O2 Engine and the Doofus game can use either Direct3D or OpenGL as rendering APIs. For any high-end, or for that matter most mid-range graphics cards, Doofus 3D is not a problem at all.
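The Intel check itself is simple; under Direct3D 9 it might look like this (a sketch; the function name is mine, and 0x8086 is Intel's PCI vendor ID):

    #include <windows.h>
    #include <d3d9.h>

    bool IsIntelGraphics(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 id;
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            return false;               // can't tell; don't assume the worst
        return id.VendorId == 0x8086;   // Intel's PCI vendor ID
    }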

Doofus 3D hits Code-Freeze.

Whew! The game has officially (finally) hit the last code-freeze today. Actually yesterday, but I tagged the repository today, so it's officially code-freeze today. All the levels are done and most of the internal (alpha) testing is complete. Yeah, the blog has been silent for a while, but that's to be understood: I have been really hard pressed to finish this on time, and yet I overshot my (mostly self-imposed) deadline by a good 15 days; in part due to unforeseen circumstances, in part due to my own faults, but that's the way it goes (generally 😀 ). I am really glad it's finally done and nothing but the most trivial issues remain. There is some artwork left though; I still have to finish up the initial screens, and some sound-tracks and sound-effects need to be incorporated into the game as well. Some parts of the demo version also need to be finished up. However, I am going to hold on to that till I release the RC, for one final bout of testing; that way I will have good feedback on which levels to include in the demo and which to hold off for the full version.

Doofus gets a dose of Optimizations.

Ah! It's the optimization phase of the project and I am knee-deep in both CodeAnalyst and NVIDIA PerfHUD. As far as memory-leak testing goes, most, no, all of the memory-leak testing is done by my own custom memory manager built directly into the engine core, so no third-party leak detectors are needed by the game. AMD's CodeAnalyst is an invaluable utility when it comes to profiling applications for CPU usage, and the fact that it's free makes it even better. NVIDIA PerfHUD is probably the champion among graphics performance utilities and is, I think, vital when it comes to bullet-proofing any graphics application for GPU performance. Too bad it doesn't support OpenGL yet, but the O2 Engine's renderers mirror each other almost exactly, to the point where a performance enhancement under the Direct3D renderer is usually experienced similarly under the OpenGL renderer. I would have really liked PerfHUD to support OpenGL though. There are some issues under GL; for instance, FBOs under OpenGL perform a tad slower than render-targets under Direct3D (on the same hardware), which I must admit has left me a little dumbfounded. Maybe it's just my GPUs (yeah, they are a bit old, I must say) or maybe the drivers are at fault, but I have noticed the performance variance between the two even after considerable experimentation and optimization. It would have been good to have a utility like PerfHUD to probe directly at the draw calls and/or FBO switches. I am trying my luck with GLExpert, but I am not there yet. I must say, however, that GLExpert is nothing compared to PerfHUD.
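For the curious, the idea behind in-engine leak detection is simple enough; here is a bare-bones sketch (the macro names are illustrative, not the O2 Engine's actual interface; a real version also needs thread safety and must avoid allocating through itself):

    #include <cstdio>
    #include <cstdlib>
    #include <map>

    // Record every allocation along with the file/line that made it;
    // whatever is still in the table at shutdown is a leak.
    struct AllocInfo { size_t size; const char* file; int line; };
    static std::map<void*, AllocInfo> gAllocs;

    void* TrackedAlloc(size_t size, const char* file, int line)
    {
        void* p = std::malloc(size);
        if (p) gAllocs[p] = AllocInfo{ size, file, line };
        return p;
    }

    void TrackedFree(void* p)
    {
        gAllocs.erase(p);
        std::free(p);
    }

    void DumpLeaks()   // call at engine shutdown
    {
        for (const auto& kv : gAllocs)
            std::printf("LEAK: %zu bytes at %p (%s:%d)\n",
                        kv.second.size, kv.first, kv.second.file, kv.second.line);
    }

    // Call-site macros so each allocation records its location:
    #define O2_ALLOC(sz) TrackedAlloc((sz), __FILE__, __LINE__)
    #define O2_FREE(p)   TrackedFree(p)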

(Image: AMD CodeAnalyst)

(Image: Doofus running under NVIDIA PerfHUD)