New screens of the Doofus 3D Game.

Update: Doofus Longears – Get ’em Gems has been released and can be found on www.doofuslongears.com

Whew, finally found some time to update the blog. I have been frantically putting the final polish on the game, handling business-related activities, tweaking graphics, ironing out small glitches in gameplay, play-testing levels, and the list goes on!

My major headache was the background. A lot of people had complained that the background was not up to the mark, so I decided to paint a brand new one from scratch. It was a hell of a lot more difficult than I expected, though. Doofus 3D being a cartoon game, I wanted a flamboyant background: rich and colorful, with a distinct cartoon touch. However, it's not quite that simple; it's not as easy as firing up good ol' GIMP and just having a go at painting any ordinary scene. Since you are painting for a sky-box, you have to be a lot more careful and a lot more sensitive about how you handle depth in the scene, plus you have to paint for a full panoramic view. A lot of experimentation went into this one, believe me! Lots of hits and misses later, and after studying some other skyboxes, this is what I ended up with.

As you can see, the background is a whole lot better than the muddy, dingy one from the screenshots of the previous beta. Plus there is something more. Yes, the first pictures of new characters. More later 😉 .

Insanely busy.

I know the blog has been silent for some time now, but I have been insanely busy for the past couple of weeks and it's not looking to ease up any time soon. There are just too many small things that need to be done; I still have a sh**t load of work to catch up on, and I am up to my neck in it. Things haven't been this hectic for quite a few years, probably not since I started out on my own. Reminds me of the days when I used to put in 90+ hours a week. Yeah, those were indeed stressful times: I used to work a full-time job and then somehow find time to come back and work on the engine and the game (*shiver*). Thankfully, even though I am busy now, it's not that stressful at all. You may think the game is the culprit; on the contrary, the game is the least of my worries. It's just small things that keep adding up. The other things you have to get going to start up a business are just plain insane, and yes, things I hate to do 😀 . I guess you have to do all of them at least once to get things rolling.

Fable, Oblivion and the sandbox gameplay.

Fable is a game developed by Lionhead Studios, and to be honest I missed out on it a couple of years back when it first came out. Interestingly, it was my bout with Oblivion that first piqued my interest in this game, since the two seem very similar, and yes, Oblivion is pretty high on my list of all-time favorite games. Fable is interesting because it seems so much like what I have in my head as something I might work on: a very cartoonish backdrop, with very serious, in-depth gameplay and what can be called dark humor. I have only briefly played the game, and it seems like a well-designed game overall. I like it, and like it a lot. While not exactly the same as Oblivion, some similarities do exist between the two and, well, it's hard not to compare them.

For one, although Fable is open-ended, Oblivion allows you more freedom, definitely more than Fable does. (Those of you who don't know what an open-ended or sandbox style game is, read this.) You can play Oblivion at your own pace, and the game can play out differently depending on the choices you make, the quests you complete and how you interact with the world (and NPCs). Fable does offer something similar and does have sandbox-style play, but its gameplay is more linear, or should I say, more linear than Oblivion's. To be fair, Fable predates Oblivion by almost 2 years, so it wouldn't be right to compare them outright. Then again, 2 years isn't such a long time after all.

The one thing I didn't like about Fable, or should I say, didn't appreciate too much, is the fact that you can't deviate from your play area, meaning you can't go anywhere and everywhere in the game world. Exploration is confined to a set area and the player is not allowed to go beyond it. I can understand the technical reasons for such an approach, but Oblivion addresses the same problem very subtly and elegantly: in Oblivion you are free to explore every corner of Cyrodiil, which, I must say, can take quite a while. I have played the game for 8 months now and I still haven't had time to visit every place on the map. The world, along with every cave and dungeon, is simply huge. Coming back to my point about exploration: I think exploration is a critical component of any sandbox-style game. It gives you so much freedom, or should I say an illusion of total freedom, and that is something I have come to appreciate a lot after playing the Elder Scrolls series (Morrowind and Oblivion).

In Fable's favor, it has a fantastic combat system. I would rate it above Oblivion's, and I can safely say Fable allows for a more balanced combat game. To give an example: both Oblivion and Fable let you use mêlée and ranged weapons, but for some reason I never found the use of ranged weapons in Oblivion all that intuitive. I can't really explain why, can't put a finger on one particular reason, but while playing the game I used to get clobbered whenever I used a bow and arrow. In Fable I use both to an equal degree. Both games are sandbox games and both build the player character through the choices the player makes. In Oblivion I ended up as a beefy guy with little resistance to ranged attacks from NPCs; in Fable my character seems to be a great balance of both. Now I can take down NPCs with proper planning and lure them into traps using a combination of mêlée, ranged and magic. On the whole, Fable lets you build a more all-round character.

Fable’s graphics are top notch. Spells and magic, the combat system, weapon augmentations and teleportation are all done wonderfully. Even the cartoonish world is built beautifully, and so are all the NPCs, especially considering the triangle budget and the fact that the game runs flawlessly on a GeForce 6200, with an impressive frame-rate I must add. The camera navigation and the cut-scenes are also pretty good. The graphics complement the gameplay very nicely, and that's what is important; they are not overdone, and that's good. You will find plenty of games whose graphics do nothing more than slow the game down for no apparent reason and serve no function other than to please graphics “fanbois”. Fable does none of that. The only thing I really hate about Fable is its game-save system: you can't save your game in the middle of a quest. All you can do is save the skills you have learned. I can't understand the reason for this; it just defies logic. I generally play games for only 20 to 30 minutes at a stretch, and quests take significantly longer to complete, so this “feature” is a real PITA. This is probably the only real complaint I have with the game.

I am a fan of sandbox-style gameplay, but my interest in Fable was twofold. It's true I like playing games; however, this time around my interest was more academic, as a student of game design: I wanted to see how the game was designed overall. Yeah! I have this crazy idea of actually making a sandbox-style game some day (a long time in the future… or maybe not so long) and Fable seemed too hard to resist. Mind you, I haven't fully played the game yet, but I am already pretty impressed, and the same goes for Oblivion. Barring little quirks, I think both games are equally good at presenting the player with an out-of-the-box experience. Both allow you to build the player character in unique (well, sometimes not so unique) but nonetheless non-monotonous ways. Both games are “different” and it would be unfair to say that one is better than the other. True, they each have their good and bad points, but both are equally enjoyable.

I want this!

I was over at geek.com and the website was carrying a story about a cool little gizmo called the NeoCube. Apparently it's a “puzzle with a billion solutions”, something to keep your mind occupied when you are bored or feeling a bit “gray”. What I like is the way you can manipulate all those little spheres into something cool, and so quickly. Looks like an amazing gadget to let your creativity do the talking. OK, I need this one!


Hardy is here.

Ubuntu 8.04, codenamed Hardy Heron, was released 2 days ago, and since my internet machine has nothing better to do while I finish up the game, I went for a full system upgrade right away. Ubuntu goes from strength to strength with each release of the OS, and the story with Hardy is no different. I have been using Gutsy for the past 6 months, and with the release of Hardy I think XP is in serious danger of losing its number one spot on my list of preferred OSes. I just don't boot into XP these days on my internet PC, and my reservations about Vista are well known. Ubuntu at the moment is nearly all you could want from an OS, though some nagging issues clearly remain. I do, however, still use XP for all my programming work; unfortunately, Windows is where the bulk of the gaming market is. I do plan to release the Doofus game for Linux once I release the Windows version.

I have praised Ubuntu before on this blog, but it is striking how consistently Canonical has managed to do a good job and stick to its motto of providing a simple yet promising Linux distribution that even a common, or should I say non-tech-savvy, person can use. They have successfully managed to change the “Linux is for geeks” attitude into something people can look at and use in their everyday lives. Let's be fair, there are others that are fast catching up and can be considered equally impressive, yet Ubuntu has managed to stay ahead of the curve, just that little bit. It's those small things and annoyances that Ubuntu has managed to address successfully that have led to its popularity. Some people would argue that Ubuntu could not have stood so tall if it weren't standing on the shoulders of teams like Fedora, SUSE and, of course, Debian, without whose support and work Ubuntu would not have been possible. Yes, that's indeed true. However, Ubuntu has made a difference by actually using, and in some cases integrating, the great work done by all these teams and putting together a strong, stable distro that could easily be considered the best of the Linux distros out there.

Little things go a long way. Many people have heard about Linux, probably more than you might think, yet very few have actually used it. Why? Because it's a headache to partition your disks to make room for a Linux partition, and an average-joe user dreads things like that. Enter Wubi! Now, some might say having Linux on an NTFS partition is nothing new; it could be done with several other distros long before Ubuntu was around. But how many of those distros let you do it in a few simple clicks? I put the Ubuntu CD in the drive under XP and the first thing that popped up was the Wubi installer. I could install an entire Linux distro in about 4-5 clicks and a couple of restarts of the machine. I am a long-time Linux user, but even I was surprised how trivial Wubi makes installing Linux. I wouldn't recommend Wubi for the experienced user, but the option is rather cool for a person who has never seen or used Linux before.

However, not every aspect of the distro is flawless. There are some issues that still need work, and it may not all be the distro's fault; some things are still amiss with the community as a whole. Technical issues like sound and WiFi are the ones that come to mind; there are problems there that need to be sorted out. Needless to say, such issues are mostly small, and Ubuntu has addressed a lot of them with Hardy. The only real complaint I have is that I still can't get my front headphone jack to work, not with Gutsy and not with Hardy; I guess this is some weird ALSA problem. Fortunately, the NVIDIA driver is doing a fine job. I remember a time when hardware vendors didn't seem too interested in Linux, but I must say things are changing for the better. Not that long ago you couldn't find a decent driver for your graphics card; now most leading distros come bundled with one.

As a parting note, a few suggestions on downloading and upgrading. I would recommend using BitTorrent, since I found it far faster than the overloaded Ubuntu servers. The CD ISOs can be found on all mirrors. Try this link if you want DVD ISO torrents. Also remember: if you are upgrading from a CD, use the “alternate” version of the install ISO. It is best to use the Update Manager to do the OS upgrade; it's the safest method. If you have downloaded the alternate ISO, you can upgrade without having to actually burn a CD, since Linux can mount ISOs directly and you don't need any special software to do it. Make a directory under /mnt called “isocdrom” and use the command

sudo mount -o loop -t iso9660 ubuntu-8.04-alternate-i386.iso /mnt/isocdrom/

to mount the ISO directly. Then use the command

sudo /mnt/isocdrom/cdromupgrade

to start the upgrade, and follow the instructions. Remember to use the full path “/mnt/isocdrom/cdromupgrade” when starting the upgrade.

Triple booting: Vista – Linux – XP using Vista’s boot loader.

Ah, it's Vista again 😀 but this entry is a bit different from my other ‘Vista’ entries. Did you know you can have a multi-boot system via the Vista boot-loader? OK yes, I am using Vista, or rather I am testing the Doofus game on various Windows versions and Vista just happens to be one of them. No, I am not officially using it (as yet) on dev machines, but since we are heavy into testing, and since past experience has shown us that Vista can be a pretty unreliable OS, we decided on full Vista compatibility testing this time around. However, none of the team has Vista installed on their PCs, so we had to go looking for someone who does. We did find a friend with a Dell laptop that had come with Vista Ultimate, but unfortunately the guy had long since formatted the machine and installed a dual-boot XP-Fedora combination. After a fair bit of convincing and coaxing, I did manage to get him to spare a partition on his machine for Vista.

The problem was, we had to keep the XP-Fedora setup working. However, the Vista install overwrites the MBR, so a GRUB loaded into the MBR is effectively wiped out, preventing a boot into the already installed Linux partition; and that is exactly what happened. I had anticipated the problem; this is not the first time I have worked with multiple OSes and multiple-boot options, and in the past GRUB had served me well in such situations. So I was pretty confident that even if the MBR were overwritten, it would just be a matter of reinstalling GRUB. That's what I did; unfortunately, it didn't work this time around. Maybe because of some problem with chainloader, or maybe Vista just doesn't like anything other than its own bootloader; I was unable to find out why exactly Vista wouldn't boot via GRUB. So I tried something else: booting into Linux via Vista's boot loader, and with a bit of hacking it really worked, quite nicely I must say.

How To:


FL Studio Rocks!

This blog has been all tech stuff and more tech stuff. People must be wondering what it is I otherwise do. Actually, as the story goes, back when I was working (not on gaming but at my other programming job), my hobby was graphics work and modding other games and engines. Funny: my hobby became my job now that I am working on this game, so it was time to take up another one (hobby, that is). What's the next best thing? Creating music, of course! 😀 It so happens I ran across this software called FL Studio 6 months or so back, started fiddling with it, and was soon hooked.

I used the demo for quite a while and was really impressed by the whole product, impressed enough to go and get the full Producer edition of the software. For the clueless, FL Studio is a digital audio workstation (DAW), meaning you can produce music with it. OK, I am still a noob at the whole thing, but even so the software allowed me to create some really good tunes pretty easily. The workflow is not trivial, but you can figure your way around after reading tutorials and following online video tutorials. I am not a good music composer, not by any stretch of the imagination, yet the software let me create pretty decent tracks far more quickly than I had ever thought possible.

The program has near-infinite options for authoring audio, most of which I am completely clueless about. Unfortunately, I haven't got much time right now to look at each and every one, but I hope to get around to understanding them eventually. To someone who has never seen FL Studio, the interface might look intimidating; the number of nuts and bolts on the UI makes you think it would be rather difficult to get things working. Looks, however, can be very deceiving. While not a walk in the park, a few searches on the internet will have even a total noob creating great audio loops in no time. All you really need to do is visit YouTube, where there are more than enough tutorials even for advanced stuff; the FL Studio site itself has more than enough videos to get you started.

FL Studio

I kinda liked the software right from the word ‘go’ and, yes, I am a sucker for music. Unfortunately, all my earlier attempts to produce anything audible, with any other software (or, for that matter, hardware), could only be categorized as ‘noise’. FL Studio just seemed so intuitive. True, there are other, more powerful products on the market, but I think very few can stand up to FL Studio at its price point, not to mention the lifetime free upgrades the product offers. Cool!

UAC popups were designed to annoy.

I was shocked reading this, well, almost; maybe not quite. It seems Microsoft built the UAC prompts into Vista to annoy people, and on purpose at that. The idea was to force developers to write more secure code that would not trigger those UAC pop-ups at all. Oh wow, so this is actually a feature then! Silly me, I was really stupid to think it was just an annoyance. So it's official: all those popups you encounter in Vista are actually the developers' fault and have nothing to do with MS. Dumb developers! Somebody had better teach them correct programming!

To put things in perspective, it's not uncommon to have such a system in place. Most modern Linux distros have a similar concept, i.e. su, sudo, or asking for the root password when doing admin tasks, and most distros today refuse direct root login altogether. That's not a bad thing: root/admin accounts are notoriously easy targets, and many users would otherwise use root as their default login, which is dangerous by any standard. However, I hate UAC because it is far more annoying than, say, sudo. It treats me like an idiot and pops up for the simplest of operations, and in most cases the prompt is unwarranted. The other reason I hate it so much is that the messages can be really cryptic. Most of the time they read to me like downright disclaimers.

I have no problem with MS trying to make Vista a more secure platform; what MS has done with Vista is actually a very good achievement. All previous MS OSes had a very bad reputation for security, and all said, Vista is pretty good, probably the best of the MS OSes thus far in that regard. However, the UAC just goes overboard, and that, actually, is the flaw! Most users are not technically savvy and don't understand what the hell the UAC dialogs are telling them; it's “WTF is that!” when the dialog pops up. Most people I know just turn UAC off because it's annoying. Such systems can get pwned or infected, offsetting all the security Vista provides. A system that was put in place to prevent something ends up doing the reverse!

I don't buy the argument that annoying UAC popups per se will somehow make software vendors write more secure code. Simply having the UAC restrictions in place will ensure that application writers take enough care for their applications to run on a default Vista setup. My argument is that there is actually no need for any popups at all. Developers who want their applications to run on Vista will automatically adhere to the UAC concept. I can understand not allowing an application to write to the registry or preventing files from being placed in the system area, but this can all be done subtly, without a popup: simply disallow the file read/write in the system directory. A developer is smart enough to understand that the file needs to go somewhere else, or that the registry value needs to be placed somewhere other than a restricted area. For heaven's sake, those of us who have programmed on *NIX-based systems have been doing this for years, and I don't remember seeing any popups!

Vote for a Poll!

I am trying out something new on the blog. Some of you may have noticed a mysterious poll appear on the right-hand side. I was browsing through WordPress plugins, came across this neat little poll plug-in, and decided to give it a try. So there it is. Let me know what you think. You may need to turn on JavaScript to enable voting.

Do you like the idea of having polls on this blog?


Misconceptions, discipline and a pragmatic approach.

How disciplined are you in coding? No, seriously, are you a mad code hacker or one of those who take that extra bit of care while coding? Are you paranoid about comments, or do you believe comments are not needed, especially for trivial code snippets? Why am I raising these questions? It so happens I was recently helping someone port code across platforms, and I happened to look at a piece of code, or rather pieces of code, that were an utter disgrace to coding standards. No comments, headers included in headers, crazy loop-backs across libraries, 10 people writing code all over the place, non-standard practices, utter disrespect for memory management and zero design (high-level or low-level). Can you believe somebody using malloc to create a C++ object! I mean seriously, you could hire monkeys to do a better job than that!

OK, enough of the rant already! I can't really disclose who the code was for, since it is production code used by a reputed organization. Yeah, believe it; I still can't, but it just goes to show how disconnected the organization is from what can be considered its most valuable asset. No, not the code: the process! It's not that they aren't paying for it, they are, but the management is, well, too stupid (for lack of a better word) to understand the implications of not having proper coding discipline. On the flip side, you will find organizations where the coding discipline is so rigid that it rarely allows even simple adjustments to the existing process. When coding practices are made far too rigid, they hamper free thought and ultimately retard innovation. This is the other end of the story, where companies are paranoid about coding standards and don't realize that inflexible coding practices can, in some situations, be counterproductive. Standardization is important; having standards helps in many activities, including coding, debugging and code reviews, and can ultimately determine the quality of a product. Having standards helps maintain discipline in the team. In the case above, since the team maintained no standard at all, the code just fell apart over time.

However, overdoing it can also lead to problems. Many times people simply don't understand what a “coding standard” is. I can cite an example here: I was once involved with a team whose coding standards didn't permit the use of RTTI. The wisdom behind that was, “RTTI breaks polymorphism”. Very true, and RTTI should be avoided whenever possible. However, let's not be paranoid; in some situations it does help. RTTI, used subtly, can solve problems that would otherwise require re-engineering the design. Not all class relationships are monolithic, and RTTI can help you in such situations. I am not saying overuse RTTI; I am saying RTTI has its place. To issue a commandment like “Never use RTTI” is just plain lunacy. In our case it led to breaking one class definition into smaller classes, which ultimately led to over-engineering the solution. A problem which otherwise had a very straightforward approach was now distributed across a bunch of classes that had no real use other than to adhere to the “Never use RTTI” rule. Come to think of it, was that even a “coding standard”? This is what I call an invasion of standards into practice: in an attempt to enforce standards and discipline, the team/project leader went overboard and ultimately intruded on what was a design decision. It's definitely not a coding standard.

Coming back to the code I was working on, the other thing I noticed was an attempt at preemptive optimization. Preemptive optimizations are an attempt to increase the run-time performance of a program during coding and/or design. While that's not to say you should deliberately use bad practices, it is often folly to preemptively optimize code based on what you think might be right, because unless you are absolutely sure about what you are doing, you will have wasted your time or, in the worst case, actually made the code slower. What you think is right is often not the case. One thing I remember from the code I saw was a multiplication by 0.5 to halve an integer value instead of a division by 2. The reason? Someone somewhere had read that multiplication is faster than division on CPUs. This is downright crazy, because not only did it not optimize the code, it actually made it a whole lot slower, and no one bothered to verify whether it was indeed true. This is the type of noobish one-upmanship perpetrated by budding programmers who clearly have no real-world experience. A division by 2 produces

mov    edx,DWORD PTR [ebp-12]
mov    eax,edx
shr    eax,0x1f
add    eax,edx
sar    eax,1

whereas a multiplication by 0.5 produces

fild   DWORD PTR [ebp-12]
fld    QWORD PTR ds:0x8048720
fmulp  st(1),st
fnstcw WORD PTR [ebp-22]
movzx  eax,WORD PTR [ebp-22]
mov    ah,0xc
mov    WORD PTR [ebp-24],ax
fldcw  WORD PTR [ebp-24]
fistp  DWORD PTR [ebp-8]
fldcw  WORD PTR [ebp-22]

The code produced by the multiplication is dramatically slower than the division, since the FPU gets involved and the value makes an int-to-float-to-int round trip. Why did that happen? Simple: the compiler is a lot smarter than you give it credit for. It saw the division by 2 and knew the best way to halve the value was to use shift ops. Looks like we have a winner here, and it's not the programmer. An optimizing compiler might be smart enough to clean up even the multiplication version, but my point is there was no need for preemptive optimization in the first place. Modern compilers are pretty smart: a for(int i = 0; i < 4; ++i) will produce the exact same code as for(int i = 0; i < 4; i++). Don't believe me? Verify it. And please don't complain after using a compiler from the 1990s; something like the GCC 4.x series or VC 9.0 is what all of us should be using right now. The only way to really optimize anything is via a performance analysis tool like VTune or CodeAnalyst, not by making blind assumptions about what you think is faster. Remember, 10% of the code takes 90% of the time; the other 90% of the code may require no optimization at all.

The other thing that really annoyed me was that the code was poorly, or rather inconsistently, commented: no comments on function definitions, inconsistent inline comments, whole blocks of code without any comments, algorithm explanations placed out of scope, often in some other .doc file. Just a garbled tub of lard! OK, everyone knows comments are a must, but very few programmers actually understand how good comments ought to be written. Properly commented code boosts productivity significantly. That doesn't mean you have to over-comment; it's a case of working smart rather than working hard, of quality over quantity. I wanted to write more on this, but I figured the entry would get rather long, so instead I will provide a link to relevant info and, of course, to doxygen. Please, people, do everyone a favor and use doxygen-style commenting, please please! The other thing I advocate is keeping all documentation close by, or rather easily accessible. Most documents that get created never get read, or never get read at the right time, because they sit in some obscure directory on some machine somewhere. The intention behind creating the documentation is undoubtedly noble, but none of it is of any real help if it is not accessible at the right time. With some rather trivial tweaks to doxygen, you can easily make that happen; we tried something like this and it was a great hit.

This is not the first time I have worked on such a piece of code, but I still find it difficult to understand how reputed organizations can work this way. Let the facts be known: once upon a time I too was guilty of writing such code, but we all learn from our mistakes. Taking a lesson out of every experience is, I think, the key.