Is software development inherently unpredictable?

Last week I met a friend of mine who also happens to be a software engineer. As we talked, he got around to complaining about how his team was riddled with problems on a project that was supposed to be well planned and well organized from the start. The project is now over schedule and over budget, and the client is not happy. Most of us in the software industry would just laugh that off: “Oh, that happens everywhere, all the time, so what else is new!” But later it left me wondering why this happens to all but the most trivial software projects. Why do projects planned and organized by people who have worked in the industry for years (which, by the way, includes me) fail to deliver on time and/or on budget, sometimes miserably? What is so difficult about estimating a software development cycle that it goes wrong time and again? Yes, there are always excuses: feature creep, attrition, inexperienced programmers, the weather. But if you look at other fields of engineering, like mechanical or construction, you won’t see similar problems occurring there. Projects do go off schedule, but the situation is much better than what we experience in software. Why? Working in software I have seen projects go off schedule or off budget by 100% or more; worse, some end up as vaporware.

So what exactly is wrong with our time and budget estimation techniques when applied to software? Well, maybe it’s not the estimation techniques at all; maybe it’s our software development processes that are at fault, and they in turn cause the wrong time estimates. Over the years people have come up with several different software development models, but have never agreed on a single one as the correct one. There was, or rather is, the waterfall model, which in my view falls short because it is too rigid. For most projects, changing requirements are a way of life, and the waterfall model is just not geared for changing requirements. If the requirements do change, the phases of the model overlap and throw the whole process into disarray. The waterfall model is also criticized for being too documentation-oriented (requirement, design and other process documents) while focusing less on working methodologies and productivity. There are enough disadvantages to the waterfall model to fill another blog entry, so I will refrain from going too deep here. However, proponents of the waterfall model are blind to the fact that for most projects, specifications do change and there is actually very little the clients can do about it. Sometimes a spec change is a must to deal with rapidly changing customer demands; sometimes things are not clear upfront, especially in domain-specific projects; sometimes it is just something the management doesn’t like or wants changed. Whatever the reason may be, time estimation for a non-trivial project with changing requirements using the waterfall model is next to impossible. (You end up chasing your own tail and doing ad hoc things in the end.)

OK, so it may be clear by now that I hate the waterfall model. Well, I don’t. I just think the usefulness of the waterfall model is limited to projects with rigid specs. It is not something to be used everywhere and anywhere; as the argument above shows, it clearly would not fit a broad spectrum of projects. In the waterfall model’s favour, however, it is very easy to estimate time for a project, provided the model fits the project’s needs. For a project with unchanging specs, this model is probably the best. Discounting the waterfall model leaves us with another popular model, or rather a class of models, called iterative models of software development. Where the waterfall model discourages change, most iterative models embrace it. They scale very well to changing requirements and accommodate spec changes rather easily. There are a lot of different iterative models and each one has its share of advantages and disadvantages. I don’t claim to know every one of them, and I have only used one in my current project, and that too with some custom modifications (hybridized with the waterfall method; more on that later). What I want to focus on is the fact that though iterative models are designed to be scalable, forecasting or predicting time-lines with them is very difficult. If a spec change does come in, it can be easily absorbed by the process, but it will still ultimately end up disrupting the total time estimates for the project.

A modeler with a difference.

A few months back a friend gave me a demo of a 3D software package called Houdini from Side Effects Software. Houdini is used extensively in the film industry and not so much in the game industry (which is dominated by 3ds Max, Maya and, to some extent, XSI), but most big blockbuster films with those great special effects that make you go “whoo”, “wow”, “cool” are pretty much made using this software. I am not exactly sure what my friend was working on, but it seems he was working on an extension for the package and wanted some opinions on a custom file format. The thing that got me interested in Houdini was the way you work with the whole thing. It’s a little different from your conventional modelers (Max, Maya and XSI); in Houdini you basically do everything by combining operators. To tell you the truth, I am not a great 3D artist. I have done most of the artwork in the game, yet my skills leave a lot to be desired. The only modeler I have ever worked with is Blender, and you probably know Blender has a notorious, and sometimes undeserved, reputation for being very difficult to use.

Houdini takes a very different approach to 3D modeling. The reason I liked it is that its entire workflow is highly logic-driven or, as they say, “procedural”. This is an amazing concept and you have to actually see it to understand it fully. The interface presents a hierarchy of node graphs with which you pretty much model everything. The node graphs form a kind of construction history that lets you go back and modify previous steps in a snap. This kind of flexibility means the overall productivity offered by the software is unbelievable. It allows the artist to be as creative as he wants and at the same time allows the entire design process to be non-monotonous or, in other words, non-linear.
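The idea is easy to sketch in code. Here is a toy operator graph in Python (my own invention for illustration; none of these names come from Houdini’s actual API): each node caches its result, and editing an upstream parameter invalidates and re-cooks everything downstream, which is roughly what the construction history buys you.

```python
# Toy "procedural" operator graph (illustrative sketch, not Houdini's API).
# Each node computes from its inputs; changing an upstream parameter
# invalidates downstream caches, like editing construction history.

class Node:
    def __init__(self, op, *inputs, **params):
        self.op, self.inputs, self.params = op, list(inputs), params
        self.dependents = []
        for n in self.inputs:
            n.dependents.append(self)
        self._cache = None

    def set_param(self, key, value):
        """Edit a parameter of this (possibly earlier) step."""
        self.params[key] = value
        self._invalidate()

    def _invalidate(self):
        self._cache = None
        for d in self.dependents:
            d._invalidate()

    def cook(self):
        """Compute (or return cached) output of this node."""
        if self._cache is None:
            self._cache = self.op([n.cook() for n in self.inputs], self.params)
        return self._cache

# Two operators chained together: a point generator and a translate.
grid = Node(lambda ins, p: [(x, 0.0) for x in range(p["cols"])], cols=3)
move = Node(lambda ins, p: [(x + p["dx"], y) for x, y in ins[0]], grid, dx=1.0)

print(move.cook())         # [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
grid.set_param("cols", 2)  # go back and modify a previous step...
print(move.cook())         # ...downstream recooks: [(1.0, 0.0), (2.0, 0.0)]
```

Obviously a real package does vastly more per operator, but the non-linear editing falls out of this structure almost for free.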

I would love to have Houdini-like software for designing a game, and I mean the entire game, composition and everything. Having seen the software at work (and being a 3D game engine developer) made my mind race in a thousand different directions, and I could see so many possibilities in this type of “procedural” flow. The creative potential could be enormous when applied to game creation. Now that I have looked at it, my gut tells me a procedural workflow for any game design/creation/composition software would be a step in the right direction. Another very interesting aspect of this package is reuse. Besides the obvious benefits of a procedural workflow, the software encourages the use, or rather the reuse, of existing solutions and designs. This might sound like something out of a computer programming book, but it’s rather more subtle: create a workflow once, then reuse it for several different solutions with minimal effort. That would be a game designer’s and an artist’s dream come true.

For those interested, there is a free learning edition called Houdini Apprentice provided by Side Effects.

Feed mining using Yahoo Pipes.

I was looking at Yahoo Pipes the other day and found the idea rather interesting. I never have the time to keep up to date with news events (I never have time for anything these days 😀 ). Too much news going on around the world, I guess. I often miss important news and usually end up reading it a day or two late, sometimes even more. Pipes may be just what I need. I have always wished for some way to sort and select exactly what news I want, and how I want it, and Pipes seems to be really good at that.
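The kind of thing Pipes does with feeds, filter, sort, merge, is easy to sketch in plain Python (a rough analogy of the concept only, not Pipes itself; the entries here are made up):

```python
# A rough, Pipes-like filter over feed entries (concept sketch, not
# Yahoo Pipes itself). Entries are (title, published-date) tuples.
from datetime import date

entries = [
    ("New game console announced", date(2007, 12, 10)),
    ("Local weather report",       date(2007, 12, 11)),
    ("Game engine released",       date(2007, 12, 12)),
]

def pipe(entries, keywords):
    """Keep entries whose title matches any keyword, newest first."""
    kept = [e for e in entries
            if any(k.lower() in e[0].lower() for k in keywords)]
    return sorted(kept, key=lambda e: e[1], reverse=True)

for title, published in pipe(entries, ["game"]):
    print(published, title)
# 2007-12-12 Game engine released
# 2007-12-10 New game console announced
```

Chain a few stages like this (fetch, filter, sort, truncate) and you have the mental model behind a pipe: exactly the news you want, in the order you want it.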

Doofus runs on Wine!

Doofus 3D running on Wine!

Just before logging off of Gutsy, I tried running the 0.0.5 beta version of Doofus 3D via Wine and, voilà, it actually ran! On Linux, with no tweaks of any kind, no nothing! Only with the OpenGL driver, of course; Wine has near-absent DirectX support, so it was OpenGL only, but I was still blown away. I didn’t even expect it to work, and I must say I could see no performance issues: almost the same speed as on Windows. I ran most of the unit tests and managed to finish the first 7 levels without any problems (I didn’t try any more, but I am sure they will work just as well). To be frank, I didn’t expect the game to run at all, let alone at the same speed as on Windows.

I had no idea Wine had no DirectX support. I guess the DirectX APIs must be very difficult to port (but who’s complaining when the engine can run on OpenGL just as well). The engine logs show that Wine hooks into the native OpenGL driver on Linux, which I guess is why there is no visible performance loss when running the game with the OpenGL renderer via Wine.

Update (Dec 17th, 2007): After reading Dan’s comments and some experimentation, I could in fact run Doofus 3D via the DirectX renderer. However, the DirectX renderer will not run out of the box, and there might be other legal issues associated with it, which are as yet unclear. You can read the entire comments here. I’d like to thank Dan for pointing out the error in the above post: Wine does have DirectX 9 support.

Gutsy as a Gibbon.

Yesterday I had to take some time off from the game to finish a previous assignment. A couple of months back I was working on a GUI application written with wxPython. It was a rather trivial application, nothing too brain-taxing. Originally written under Windows, it was, as planned, to be ported to Linux. The target system was Gutsy Gibbon (Ubuntu 7.10), so I had no option but to install it on my PC. Now, I am a long-term Red Hat fan and, more recently, Fedora has been my top distro preference. Gutsy was my first experience with Ubuntu/Debian.

I had heard a lot about Ubuntu but never actually tried it before (except once, using the Live CD version to rescue a hard drive). After having used it for about two days, I can tell you outright that it lives up to its reputation. While it may be difficult for me to say whether it is the best, I can certainly tell you for a fact that it is good. I have heard people say it is better than most other distros, but surprisingly I found no evidence to either prove or disprove that with regard to Fedora 7. I am talking purely from a user’s point of view, not a technical one.

The first thing I liked was the installation: hassle-free, clean, neat. OK, that’s probably true of Fedora also, so no points there. The other thing I liked about this distro is that it lets you browse the live DVD/CD version before installation, something I wish Fedora would also adopt; then again, that’s a nice-to-have, not really too important. The installation was remarkably fast for such a large OS, and the OS boots amazingly quickly. Oh yes, and it’s nice to see a distro that finally bundles the proprietary NVIDIA driver, which can be enabled with a single click. Setting up my PPPoE internet connection was surprisingly trivial. Not even on a Windows system was it that easy: there I had to download and install a PPPoE driver, but with Ubuntu it was just click and go! Setting it up on Fedora was a nightmare!

Ubuntu comes with GNOME as the default desktop, which is pretty quick and responsive. The system does not run too many services by default, and that could be one reason it is fast. The default installation doesn’t install too many packages, and no development packages are installed: no Eclipse, no KDevelop, no debugger. You will have to install pretty much everything via Synaptic. Not too much of a problem, but you do have to browse the Ubuntu forums to find out how to get things working, especially if you are a new user. I had to install a lot of packages to get a development environment going, even though it was just Python I intended to use.

On the whole, this distro is clearly made with the average user in mind and tries hard to make Linux a “friendly” OS. It does a good job, but I still found some things lacking. You still have to fire up the good ol’ terminal every now and then. Hmm, not a problem for someone who has been using Linux for about 10 years now, but then again my question is: why the terminal at all?

Overall experience? Not very different from Fedora, believe me. If you have experience with one, you can be just as at home on the other, period. Maybe a few searches on Google are all that is required. I would put Ubuntu marginally above Fedora because it is friendlier (and because of the internet setup thing, and the NVIDIA driver).

Oh yes, the distro never crashed on me, not even once; Fedora does sometimes crash. Anyway, my short stint with Linux is over; back to Windows and the game now.

Apricot: The open game.

The Apricot game was officially announced, though talk about the game has been going on for a long time now. I had almost forgotten about it entirely till I read the story on Gamedev today. A great initiative by the Open Source and Blender communities on the whole, I must say. I am really curious how the team will integrate Crystal Space (CS) with Blender, and it’s definitely something I will be watching closely, though I don’t understand the rationale behind using CS when Ogre was around. I am particularly interested in Blender-CS integration since Blender is also the modeler we are using for assets in the Doofus game (more here). I have grown particularly fond of it since the new compositing and multiple UV mapping tools were introduced. I just hope they make it look nifty, something like UnrealEd 😉 .

Blender is no doubt a good 3D modeler. Barring its unconventional interface, it has all the necessary bells and whistles for full-fledged game development. What the Apricot team is trying to achieve is commendable. I sincerely wish them the very best of luck and hope we get to see a great game soon. Three cheers for Apricot! Hip hip hooray!

Parallel computing using .NET.

My recent entries have all been about news events, and this one is no exception. In fact, it is about the recently released Parallel Extensions for the .NET platform (download). You will of course need the also recently released .NET 3.5 framework update for them to work (it comes bundled with Visual Studio 2008, itself also released recently). On the very same page you will also find an interesting paper titled “The Manycore Shift”, which doesn’t do justice to its title, I must say. The title makes you think it has something to do with many-core programming, but all it does is outline what Microsoft intends to do with its parallel programming model. A marketing gimmick? I’ll let you decide. It looked to me like something written to sell an idea to company execs who have no knowledge of programming and/or the concept of parallel computing. The question is, why is it even shown on an MSDN developers page?

It’s been some time since I downloaded these extensions. Obviously (if you are a frequent reader of this blog, you will know) I am a sucker for any technology that has the potential to provide performance enhancements. Unfortunately, I have very little time to devote to anything right now, so I couldn’t do much testing with the extensions. Still, I did manage to glance through the documentation provided (you can find the CHM on the same download page), and one thing that immediately caught my eye was that the documentation talks about “different approaches for handling parallelism”. The framework allows you to have data and task parallelism using imperative and declarative models. The docs for the extensions are good; well, not great, I must say. They take for granted that the reader has some knowledge of the concept of parallel computing and is pretty thorough with basic threading. Not the friendliest of docs, but not a very big problem for someone with experience of MSDN docs under his belt. The only other library I have seen that allows parallel computation across many cores is Intel’s Threading Building Blocks (TBB), which I think allows task-based parallelism.

“Parallel Extensions to the .NET Framework contains a few different approaches for handling parallelism in your applications including imperative and declarative models for data and task parallelism.” – (from the documentation)
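The data/task distinction is easy to illustrate outside .NET. Here is a sketch of the two concepts in Python (illustrating the ideas only, not the Parallel Extensions API): data parallelism applies one operation across a whole collection, while task parallelism runs distinct, independent jobs concurrently.

```python
# Data vs. task parallelism, sketched with Python's concurrent.futures
# (concept illustration only -- not the .NET Parallel Extensions API).
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def word_count(text):
    return len(text.split())

with ThreadPoolExecutor(max_workers=4) as pool:
    # Data parallelism: the SAME operation over every element,
    # with the pool free to spread elements across workers.
    squares = list(pool.map(square, range(5)))

    # Task parallelism: DIFFERENT independent tasks run concurrently.
    t1 = pool.submit(word_count, "parallel computing with many cores")
    t2 = pool.submit(sum, range(100))
    counts = (t1.result(), t2.result())

print(squares)  # [0, 1, 4, 9, 16]
print(counts)   # (5, 4950)
```

The imperative/declarative split the docs mention is then about how you express this: explicitly submitting work as above, versus declaring a query over data and letting the runtime parallelize it.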

I have not had enough time to experiment with either, and I don’t think too much will change soon. But the reason I am writing this entry is that parallelism is going to become the single most important aspect of program design over the next few years, especially in game development, where hardware and conventional programming approaches are being pushed to their very limit. There will soon be processors with 8 cores, and there has been talk of next-gen game consoles having more than 20 cores. Any system being designed for tomorrow’s needs has to take parallel computing into consideration. With good documentation at hand, the Parallel Extensions are a great way to get your hands dirty with the concept of parallel computing. The extensions also come with some real-world examples, invaluable for learning anything. (I am sure some of you are going to be very interested in the ray-tracer example 😉 .) Go get some!

An Open-Source marketplace.

In an interesting development, SourceForge has released a service for the open-source community whereby open-source projects can now commercialize their services via SourceForge (read here). This is interesting because it will allow small open-source projects listed on SourceForge to sell services, which was previously not possible. As one blogger points out, such an initiative will allow smaller open-source projects to build businesses around their work.

“It’s one thing for a venture-backed open-source startup to develop new channels. It’s quite another for a one or two-person open-source project to do so. Suddenly, however, these small projects have an outlet to the market. A global market.” – (read the entire post here.)

I have been a contributor to an open-source project once, and I can tell you from experience that managing or contributing to a project online is no trivial matter. It takes a lot of time and effort. I have great appreciation for the people who work on free, open-source projects and get almost nothing in return. SourceForge’s move in this regard is a welcome initiative indeed.

No more spam please!

The past couple of weeks have seen too much spam thrown this way (I mean at this blog). Thankfully it won’t appear on the front page, but it’s getting really annoying. I can’t understand why people are trying to spam the hell out of this blog. I get at least about 3 spam pingbacks a day; on really bad days it can go up to 15. The spam is usually something like “We found this great post here… Please read …”, which actually links back to the poster’s/spammer’s site, or who knows where, and the site has God knows what nasty stuff on it. It is my request to anyone reading this blog: please don’t send spam this way. If you want to say something, you are of course welcome to comment on the topic.

Nintendo with an interesting Operating System.

Nintendo has released an operating system named ES (read the translated version; original Japanese version) under an open-source license. The whole idea behind having an OS is still unclear to me. If they were planning on having an OS for their consoles, there are other, more attractive alternatives like Linux. Sony has done something along these lines by having Yellow Dog Linux run on its PS3 consoles.

The website is pretty vague and doesn’t delve too deeply into what the OS is intended for, but it’s clearly something to keep an eye on. Maybe it is just a research project. Maybe the company is still toying with the idea and wants to see what the response from the community will be; who knows.

The OS itself seems pretty interesting. The kernel is written in C++, and it runs natively on x86 and under QEMU. They have a port of the Smalltalk programming language called Squeak. I am not sure what the intention really is. Will it be used for game development on that platform? I guess there must be some way to do C++ programming as well, considering it’s x86-compatible.