Archive for October, 2007
Well, it looks like it’s time to go and get the 8800 GT after all. I just had a run-in with the recently released Crysis demo, and the only real thing I can say about the graphics is “Superb”. The game has ultra-realistic graphics; “mind-boggling” and “jaw-dropping” would be better words. I haven’t played the demo myself, but I watched someone else play it for a short while. He had Crysis settings pretty much maxed out at a resolution of 1920×1200 on a 19″ monitor, and the game was still running smoothly.
The first thing that caught my eye was the shadows. The game features impressive real-time soft shadows, probably done using shadow maps; other real-time techniques just seem unlikely given the complexity of the geometry in the scene. I am not sure how the team solved the aliasing issues, or maybe it was just the very high resolution and the settings, but there were none of the visible aliasing artifacts that commonly occur with shadow maps.
When you look at the surroundings, you can’t help but wonder how the game manages to push so much foliage and so many trees per scene. I haven’t quite figured that out myself. Maybe it’s some clever instancing tricks, coupled with the obvious raw power of the 8800, that allow such an incredible amount of vegetation to be simulated correctly. You can even shoot down trees and shrubs with your gun, and shooting at leaves will leave bullet holes in them. Amazing, truly amazing.
The rest of the game’s graphics are equally impressive. The atmosphere and sounds complement the effects nicely, and the movement of the sun makes for a very diverse experience, with a dynamic environment lit differently over time. Another thing that impressed me, from a game-engine developer’s point of view, was the destructible terrain. I had heard about the Crysis team using a voxel-based terrain system, which allows for exactly this kind of terrain, but the results in the game are even more impressive. I remember Ken Silverman working on something like this in his Voxlap engine. Interesting!
The AI seems to be good, but since I haven’t played the game myself I couldn’t really tell. Post-processing effects and explosions all look incredibly real. One really cool gameplay feature is that you can hold a guy by his throat and use him as a shield against enemy bullets; yeah, something straight out of a Schwarzenegger movie. I am sure FPS junkies are going to get a kick out of this one.
It’s all about the graphics, then. The graphics the game delivers can be nauseatingly real at times. I am usually not one to fall for a game just because of its graphics, but Crysis will probably be an exception. Maybe it’s just the programmer inside me talking; maybe things will change when I play the game for myself. But I just kept thinking about (and googling to find answers as to) how some of those effects were achieved. More… later!
Sometimes in life catastrophes happen, and when they do you are generally least prepared. As you may recall, I had a bit of a misadventure with a failing hard drive last month. Yes, I consider a failing hard drive a huge calamity, especially if it is on the machine that houses all my R&D projects, and that is exactly what happened. It all started one September morning when I could distinctly hear a “swwwiiing….. click.. click.. click…” sound coming from the hard drive.
For those who don’t know, that sound is a hard drive’s death chant. It’s like the angina that precedes a heart attack, and that is pretty much what happened. The drive failed very soon after, and even though I could get most of the important data out onto a new drive, the majority of my R&D data was left on the disk. I do keep regular backups of my systems, but there is a limit to how much one can back up, especially when the data on the disk runs to something like 100 GB.
Last week I decided to try and retrieve some of my R&D projects from the failed drive. Windows refused to start, and I was left wondering what I could do. Only a part of the disk had gone bad; if I could somehow access the remaining partitions, I could get most of the data out. That is when I decided to try Linux to get at the partitions on the drive. I threw in an Ubuntu live CD and, after some tweaks to fstab, voilà: mounted, readable NTFS partitions! I recovered 100% of my previously lost data!
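For anyone attempting a similar rescue, the key step was mounting the NTFS partitions read-only from the live session. A minimal sketch of the kind of fstab entry involved (the device name /dev/sda2 and the mount point are assumptions; check the output of fdisk to find the real partition on your machine):

```
# Illustrative /etc/fstab entry: mount the NTFS partition read-only,
# so nothing can make the failing drive's situation any worse.
/dev/sda2  /mnt/windows  ntfs  ro,user,noauto,umask=0222  0  0
```

After creating the /mnt/windows directory, a plain `mount /mnt/windows` from the live session makes the partition readable.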
Moral of the story: always keep a recent Linux live CD with you. Like an airbag, you may not use it most of the time, but if it saves you from a crash it’s well worth it.
I just finished off a small project in wxPython and I must say I am pretty impressed with the library. A couple of weeks back I had to do a small assignment (not related to the game or the engine I am currently working on) which required a fair amount of GUI work. The project specification demanded that the software run on Linux, Win32 and Solaris; Solaris support, however, was later dropped due to time constraints. It was a pretty small project (considering, of course, that I have been working on the engine and the game for well over 2 years now): about five days of work in total. I was a bit hesitant to take on the work initially, but changed my mind once it was decided that Python and wxPython were OK.
I’ve been meaning to take wxPython for a spin for a while now, and this seemed like the perfect opportunity. If you have read my Selecting a scripting language articles (1 & 2) you will know I have a secondary interest in Python: I am looking to integrate it into the engine. This gave me an opportunity to look at the wxPython code and bindings in detail. That is a totally different topic in itself, and I will go into further detail in the third part of the series (whenever I get around to writing it, probably soon). Coming back to wxPython: it’s been about 2 years since I last worked with it, and I liked it then and I like it now.
Today wxPython is probably the best Python GUI toolkit, in my opinion. I haven’t worked with PyQt, but I have tried messing around with PyGTK. PyGTK, to be fair, is kinda OK, but it is less well supported than wxPython, and the fact that wxPython is far more extensive in widget and framework support makes it the more attractive choice. PyQt was out of contention for me for two reasons: a) its license situation is murky; I think it is not LGPL-compatible, and you can’t trust it enough to use it in a commercial product without breaking some license somewhere. Too controversial! And b) Qt is just such a horrible framework! I know people are going to disagree with me on this one, but Qt’s moc compiler makes me cringe. I have worked on a large project involving Qt and C++, and those moc_* files can be a nightmare at compile time. Besides, Qt uses archaic and redundant C++ practices to maintain compiler compatibility. So I generally tend to “run away” from Qt and Qt projects.
A US surgeon has devised a vest that, when worn and plugged into your computer, allows you to feel the blows from virtual game characters (read here). Now that is really interesting! It adds a dimension of gameplay never before possible: a consequence of your actions in the virtual game world is felt in the physical world. The vest lets you feel the shocks, stabs and hits occurring inside the game, so players wearing it will now have to deal with the consequences of their actions. Ah, responsibility! No mad shooting in an FPS game from now on. That’s refreshing for a change. It’s no secret that I am a fan of games where the player has to deal with the consequences of his actions. FPS games today lack variety, and this vest, it seems, may just provide the right impetus to what has become a rather monotonous genre.
It is no secret that the gaming industry is dominated by Windows platforms and that the API of choice is DirectX. There are some staunch followers of the OpenGL way of life, but their numbers seem to be dwindling rather rapidly. I have read a lot of blogs claiming that OpenGL is better than DirectX, or vice versa. I even got a mail (or two) from an unknown person recently claiming that my War of the graphic APIs post was biased towards DirectX. Let me assure anyone and everyone that this is certainly not the case. I support both APIs in my engine, and let me say this again: “Both APIs are functionally equivalent. It is not the API that determines the performance of a game, but the underlying hardware.”
The email went on to show some in-game screenshots to claim that OpenGL-based games look better than DirectX games. Anybody who has worked on a game using either API knows what a folly this is. It is not the API that determines how a game looks; it is the artists who create the content, and the engine programmers who provide them the technology (shaders, level builders, script support), that determine how the game looks. Game design also plays an important role. In any case, it’s not the API. The mail didn’t have a valid sender address, so I could not reply. Dude, you could have just posted a comment and I would have been glad to respond.
No, OpenGL is not dying out. However, I am sorry to say, OpenGL is falling behind; it is being increasingly abandoned in favor of DirectX. Don’t believe me? Then read this; at least you’ll believe him. I think I made it pretty clear in my earlier post why that is the case, so I am not going to repeat the same points here.
Things have been very hectic lately, in the midst of which I have been frantically trying to get the blog up and running on a new website. I have finally finished migrating the database, the blog and the comments, without losing anything (yeah!). Migrating the bug database, which holds the full bug-tracking history for the game and the engine, was a real pain, and I must admit I had some anxious moments, but that too is now fully migrated.
So my official website is now www.susheelspace.com (which, by the way, is still being worked on) and the blog is now at blog.susheelspace.com. I doubt I will be able to move the original site very soon, since there are other pressing issues at hand, but I hope to get to it in a couple of weeks.
Migrating any site that runs off a database can be painful. The only way is to export the database to a file and then reload it at the other end, which can be a pain in the neck, especially if the exporter and importer (in my case phpMyAdmin) are of different versions. And if you thought that was painful: in my case I also had to split my original database into 3 separate ones. I ended up doing some heavy SQL work to get the old data into the new databases. I must admit my SQL knife is not as sharp as it once was, and this exercise turned out to be a nice refresher course in the end. Using raw SQL is probably the easiest and most fool-proof way of migrating a site. Programs like phpMyAdmin will export to SQL, but be prepared to tweak the output. I tried other methods, like using a migrator script, but that just didn’t work. My only recommendation is: “Don’t be afraid to get your hands dirty with SQL.”
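The actual work above was done on MySQL through phpMyAdmin, but the splitting idea itself is plain SQL and can be sketched with Python’s built-in sqlite3 module. The table names and the two-way split below are invented for illustration; they are not the real schema:

```python
import sqlite3

# One connection: the "old" database holds everything, and we ATTACH
# fresh databases and copy rows across with plain SQL.
con = sqlite3.connect(":memory:")  # stands in for the old, combined database
con.executescript("""
    CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE bugs  (id INTEGER PRIMARY KEY, summary TEXT);
    INSERT INTO posts VALUES (1, 'Hello'), (2, 'World');
    INSERT INTO bugs  VALUES (1, 'Crash on load');
""")

# Two new databases: one for the blog, one for the bug tracker.
con.execute("ATTACH DATABASE ':memory:' AS blogdb")
con.execute("ATTACH DATABASE ':memory:' AS bugdb")

# The heart of the migration: raw CREATE TABLE ... AS SELECT statements
# that pull each table into the database it now belongs to.
con.execute("CREATE TABLE blogdb.posts AS SELECT * FROM posts")
con.execute("CREATE TABLE bugdb.bugs   AS SELECT * FROM bugs")

print(con.execute("SELECT COUNT(*) FROM blogdb.posts").fetchone()[0])  # 2
print(con.execute("SELECT COUNT(*) FROM bugdb.bugs").fetchone()[0])    # 1
```

With MySQL the mechanics differ (separate databases instead of ATTACH), but the same INSERT … SELECT style of raw SQL does the job.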
Migrating the WordPress database ended up being an anticlimax. I tried messing around with the database directly, but there are incompatibilities between the older and the newer version. After a couple of attempts, the easiest way I found to migrate a WordPress blog to a new site is to use the “Export” option (“Manage >> Export”), which saves the blog contents to an XML file, and then “Import” it on the new site. A word of caution: you may have to hand-edit the XML file to replace the links to your old site, for example changing “www.myoldsite.com/?p=64″ to “www.mynewsite.com/?p=64″. Just make sure you check the entire XML file; I know it can be a pain, but at the very least check the web links. You will also have to copy the blog/wp-content/uploads directory of your old site over to the new one.
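The link fix-up in the export file is mechanical enough to script rather than hand-edit. A minimal sketch in Python, using the example domains from above; the XML snippet is a stand-in for the real contents of the exported file, so the example is self-contained:

```python
# Rewrite old-site links inside a WordPress export. A plain string
# replacement on the raw XML is usually all that is needed; the domains
# are the example ones from the post, and old_xml stands in for the
# contents of the exported XML file.
old_xml = '<link>http://www.myoldsite.com/?p=64</link>'
new_xml = old_xml.replace('www.myoldsite.com', 'www.mynewsite.com')
print(new_xml)  # <link>http://www.mynewsite.com/?p=64</link>
```

In practice you would read the whole export file, run the replacement, and write it back out, then still eyeball the result before importing.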
There is an old saying in software development circles that goes something like this: “A bug can never be created nor destroyed; it can only be changed from one form to another.” While that is not to be taken too literally, bug fixes do often lead to regressions that are difficult to track down and fix, especially the ones that appear at the end of a release cycle or, worst of all, on the end user’s or customer’s machine. Well, not if two researchers, Chad Sterling and Ron Olsson, from UC Davis have their way.
Their research has led to a new debugging technique that reduces a large piece of software into smaller fragments called “variants”, which are then used to track down bugs (reference). The technique is called “chipping”. They have even developed a chipping tool (“chipper”) called ChipperJ, written in Java, which they claim reduced a large program to 20 to 35% of its size. I have no idea exactly how the program does this, but apparently it works from the original source code. While it is debatable whether, and how, such a system would apply to industry-scale software spanning more than a million lines of code, it is certainly something to be interested in if you are a developer. The technique may not remove the human factor altogether from debugging, but it is a novel idea that could push automated testing to another level.
The authors of ChipperJ suggest that their system could be applied to large and complex projects, and their research paper does provide interesting insights into the method. They are of the view that the paper is just a preliminary draft of what promises to be a new approach to debugging software. The method, they claim, can be combined with other, more traditional methods like “slicing” in the debugger to get even better results.
My aging Nokia 6600 finally passed away on Monday, leaving me frantically searching for a new phone. After some deliberation, and after seeing a myriad of models, I finally settled on the Nokia 6300. To tell the truth, it’s a little on the steep side for me. I rarely use any fancy features on a phone except for the FM radio, which, by the way, is a must for me. Hopefully the MP3 player this phone carries will be another feature I use. I rarely talk for long with anyone except a few close friends, and even then not for very long; talking on a phone too much is a waste of time. That is exactly why I hate spending on something like a phone.
The phone is sleek and slim and looks pretty stylish. Yeah, I got suckered in by the looks; couldn’t help it. Call me daft, but that was the major reason I picked this beauty up. OK, to be honest, I did do a spec scan, but the phone has just too many features that I know I will never use.
The phone’s dimensions are 106.4 x 43.6 x 11.7 mm, about the right size to fit in your hand. The body is metallic gray, as you can see in the pic, but the metallic finish picks up a lot of fingerprint marks. The screen is top-notch: 16 million colors give an amazing display quality. The phone features a 2-megapixel camera but, unfortunately, the picture quality is pretty ordinary. I think the camera needs a firmware update, because the pictures appear distinctly dithered when the light intensity goes down. Under normal lighting conditions the camera is kinda OK, though not the best I have seen.
The best feature of the phone is probably the music player. It can play all your songs once they are loaded in a custom converted format, which Nokia Music Manager does very nicely and neatly for you. The sound quality is pretty good; this is one feature I am going to thoroughly use, and maybe overuse. The FM tuner, on the other hand, is pretty ordinary; my 6600 had a better one. Since music files play just fine, I can only conclude that it’s the tuner, not the preamp, that is at fault. It is clearly not up to the mark, and I am disappointed; I had expected Nokia to deliver better FM radio quality.
The phone has an impressive OS, with no issues like slow navigation or faulty menus. I had initially feared the OS would be poor, since earlier phones with Symbian OSes were known to be particularly weak in these areas, but there was nothing of that sort. The phone is built for connectivity: it has a “push-to-talk” (PTT) feature that lets you use your phone as a walkie-talkie radio (haven’t used it though), a mini-USB connector that makes it just way too easy to connect to your PC, and a Bluetooth option that I haven’t tested.
The phone comes with a lot of preloaded applications. Browsing the internet is a charm with the pre-loaded Opera browser. There are a few games (and you thought I wouldn’t… ;)), but they are pathetic. In any case, playing any game on this phone would be a nightmare: the controls and keys are tightly packed and pretty flat, not suitable for gaming, and the (large) screen to (small) keypad ratio makes playing anything awkward. The phone comes with a 128 MB microSD card, which is way too little if you are a big music fan like I am; you will have to go for a 2 or 4 GB card to carry your entire collection.
Lastly, the battery. The phone is power hungry and the battery discharges pretty quickly, which is to be expected from such a power-guzzling display. Battery life is about 2 days with moderate usage. That is pretty low, but since I use the phone sparingly, it’s OK for me.
In conclusion: not much more to add. I am still pretty new to the phone, but I am quite happy with it. There is a lot of functionality I haven’t tested, so there may be parts missing from this short review. The phone is good value for money and, if you get a good bargain, well worth it. If you are a heavy camera user or plan to run a lot of multimedia applications, it is maybe not the ideal choice; but if you are looking for a decent, good-looking phone that can double up as a music player and a small PIM, this could be the one for you.