Visual Studio Community Edition (Free).

Microsoft has released the free Visual Studio Community Edition. It’s essentially a full-featured Visual Studio IDE (apparently with everything included) that can be used to build all kinds of apps. I was surprised to find support for Python and Git included. Even more surprising were the Apple and Android logos in the “supported platforms” section!

Link: http://www.visualstudio.com/products/visual-studio-community-vs

Direct3D 10/11 coming to Linux … What about games?

No, April 1st is still more than 6 months away, and yes, you heard me right — Direct3D versions 10 and 11 are indeed coming to Linux. How is this even possible? Well, it is possible because nouveau moved on to Gallium3D, which allows the Direct3D API (actually any API) to be exposed via a front end called a state tracker. Interestingly (and there seems to be a lot of confusion about this on public forums), Direct3D will be a native API under Gallium, much like OpenGL is currently. It won’t be something that emulates Direct3D via wrappers around OpenGL — meaning you will be able to write and compile Direct3D code directly on Linux or BSD based systems that support the nouveau driver. Initially I was a bit skeptical of such an approach, since the Direct3D API is integrated with the Win32 API, but the author seems to have solved this by using Wine headers. I don’t know the pitfalls (if any) of such an approach, but it seems to have worked for him, and it would seem the logical path to take (instead of breaking API compatibility). He clearly outlines the motivation behind doing the Direct3D port, and kudos to him for doing something that was all but inevitable given the no-show of Longs Peak.
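
To make that concrete, here is a minimal sketch of what Direct3D 10 code compiled natively on Linux might look like, assuming the state tracker ends up servicing the stock d3d10.h entry points through the Wine development headers (the details below are my guess at the approach, not the author’s actual code):

    // Hypothetical: creating a stock Direct3D 10 device on Linux, assuming
    // the Gallium state tracker services the standard entry points and the
    // d3d10.h header comes from the Wine development packages.
    #include <d3d10.h>

    int main()
    {
        ID3D10Device* device = NULL;

        // The very same call you would make on Windows; under Gallium the
        // state tracker would service it natively for the nouveau driver,
        // with no OpenGL wrapper in between.
        HRESULT hr = D3D10CreateDevice(
            NULL,                        // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,  // hardware acceleration
            NULL,                        // no software rasterizer module
            0,                           // no creation flags
            D3D10_SDK_VERSION,
            &device);

        if (FAILED(hr))
            return 1;

        // ... create buffers, shaders, and draw exactly as on Windows ...

        device->Release();
        return 0;
    }

If that holds up, porting becomes mostly a build-system exercise rather than a renderer rewrite.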

Naturally, a native Direct3D implementation will allow game developers to write code that is cross-platform, and even allow existing engines/games that use Direct3D versions 10 and higher to be ported across to platforms that have a Gallium driver. W00t! This is amazing, almost too good to be true, isn’t it? But before we gamers jump for joy, there are still a few things that have to fall in place before Direct3D on Linux can get up and running. First and foremost is support. Hardware vendors like NVIDIA and AMD must support Gallium in their drivers, or OSS drivers must be written (and are being written) to take their place. This is paramount, since without such an interface no front-end API (Direct3D or OpenGL) will be able to use hardware acceleration via Gallium. Second, and more importantly, the folks at Redmond must allow such an implementation of their Direct3D API. An API itself can’t be copyrighted, and the author seems to have steered clear of any Microsoft code, so theoretically this shouldn’t be a problem. But then again I am no legal eagle, so I can’t really say anything w.r.t. this. There have been rumors that there are patents on sections of Direct3D. I am not sure what that means, or for that matter whether it is even possible to patent sections of an API/library. Things could get messy if Microsoft were to place a cease and desist on this new development. I doubt that would happen, but you never know.

I have to agree, having Direct3D as a native API via Gallium does open up a lot of possibilities for OSS platforms, which have severely lacked games. Accelerated graphics on most systems apart from Windows have had little choice up until now, with OpenGL being the only real option. But does this really mean that all the games being developed will be ported to Linux and other OSS platforms? That’s an interesting question, and the answer isn’t quite that simple. Let’s look at the macro picture of the industry. For AAA games the PC platform isn’t a priority. Most (maybe all) AAA games today are made with consoles in mind. Yes, there may be a PC port, but it’s the consoles that are the main priority. Most (if not all) gamers who play AAA games on the PC spend big on their systems, and most of them already have Windows as their main OS. Some do have *NIX systems, but even those few keep a Windows partition around specifically for games. Porting any software to a new platform isn’t a trivial task. Even with the best coding practices and methods, it requires a lot of resources — which aren’t free. Everything from coding, testing, and maintaining build setups to writing install scripts requires time and money. For a AAA game, or for that matter any game or software, a port to a new platform must show a robust ROI (return on investment). That’s where the crux of the problem lies. There aren’t that many *NIX gamers out there, and if there are, the big studios aren’t seeing them!

Then there are casual games, which are also a big market. Casual games represent a very different kind of audience. A typical casual gamer is a non-technical person who doesn’t even understand what a hardware driver is, let alone jargon like Gallium, Direct3D, OpenGL or, for that matter, Linux. Most casual gamers will have nothing but a moderately powerful laptop with an on-board Intel graphics chip — which came with Windows pre-installed. This is the kind of player who expects the game to install and run with a single click. They don’t understand driver updates or DirectX versions. For them it matters little which API is better or worse, or which platform supports which API. Apart from these two broad segments, there is a whole lot of players who play radical indie games, and this is probably where Linux ports have found some success. This gamer is the tech-savvy computer geek who runs Linux as his/her primary system and isn’t afraid to fire up the console now and then. I must say, some radical indie games have found success in this area. But these games are far from cutting edge. They may be very good games, but you don’t expect Crysis-like graphics from them, and it matters little which API is used, or whether the underlying API runs 5% slower, when your game never drops below the 30 FPS barrier.

There have been lots of debates about OpenGL vs Direct3D, and I’ll refrain from going into that. However, having a choice of accelerated graphics APIs for platforms other than Windows is definitely good all around. Direct3D versions 10 and 11 are well-designed APIs, closely tied to current-generation hardware. But whether all this will translate into more game ports to Linux and the BSDs is still an open question. The community, as always, will play a vital role, and only time will tell how things pan out.

Is it another year already?

A very Happy New Year to all. A bit belated, I know, but I was kinda busy doing nothing. Well, not really. Yeah, I have been taking time off, but I was also busy with other activities, most importantly the marketing of the game.

So what was 2008 like? For me it was pretty uninteresting. Not that I didn’t enjoy it; it’s just that there was precious little in the way of what I like to do best: research. Most of 2008 was spent on fixing bugs, play testing, hardware testing, level creation and solving some insanely complicated issues (issues that shouldn’t have been there in the first place), plus some unavoidable circumstantial problems. Most of the coding that got done was equally uninteresting. The majority of the time was spent on getting things working right with gameplay and design. Not the most pleasurable of things, I must say, at least not for me. That said, a lot of groundwork was done w.r.t. the engine, most of which will not have to be repeated for some time to come. That’s a big positive, something I can take away from 2008 as being extremely productive.

Having said that, the biggest hit of the year for me is of course the release of the game, which took far more time than I had initially anticipated. True, it turned out OK (great 😉 ) given the budget, time and resource constraints, but I would have liked to do more. Maybe all that was missed in this one can quickly be added to the next one. A causal analysis is due; however, I would like to hold off on that a bit longer, at least till we finish the final marketing bits I am currently focusing on. A part of last year was also spent starting 3D Logic Software, and there were a lot of things that had to be done before we went online. Unfortunately, they accounted for a pretty big delay in the launch of the game.

On the tech front, 2008 was equally slow, with very few interesting developments. Most of what happened was evolutionary rather than revolutionary. On the OS front XP still rules and will probably do so in 2009 as well. However, the year belonged to the underdog, Apple. Both their OS and their products gained significant market share and will probably continue to do so in 2009. Linux has always been interesting, and 2009 will be no different. Linux grows from strength to strength in some areas and remains the same in others. If anything, I am looking forward to Linux in 2009; there are some interesting developments on the horizon.

In 2008 we saw a resurgence of the GPU battles, with ATI throwing in some impressive technology, and that’s a good thing. For the first time I am an owner of an ATI card (HD 4850), and though NVIDIA held on to the top spot (barely), ATI was close behind, even edging out in front at times during the year. Then again, we can’t forget general-purpose computing on the GPU. The year was interesting for GPU and GPGPU alike. Cards with supercomputing capability were unveiled, and this year will see even more power being packed into cards as the GPU titans clash with more powerful weapons at their disposal. Oh, and let’s not forget Intel here. Intel finally unveiled Larrabee, so you could very well have another titan arising in those battles.

Personal wish list for 2009.

  • Intel finally comes around to putting a proper on-board GPU with at least decent hardware T&L, and releases moderately good drivers.
  • Microsoft releases DirectX 11 for XP along with Vista and Windows 7.
  • OpenGL spec gets an overha….. well, forget it!
  • Linux gets a single package-management/installer system that everyone across the board adopts, and, most importantly, one that is easy to use and deploy.
  • The economic downturn ends.
  • Everyone in the world becomes sane and the killing of innocent people stops completely.

That’s all for now. 😀

Once again a Happy New Year.

The HD 4850 and the story with AMD/ATI.

First, the HD 4850. I was testing the game on the new HD 4850 (Palit, 512 MB) today and observed some interesting things about the card. For one, it gives serious bang for the buck. Doofus 3D clocked about 140 FPS at a resolution of 1024×768, 16x AF, with graphics quality set to high. Even with 2x AA, Doofus 3D clocks more than 120 FPS, and I have a strong suspicion the game was going CPU-bound at those frame rates, since the machine had a 3-year-old CPU. I can tell you for a fact the card is a serious performance monster, though then again Doofus 3D ain’t a top-line game. However, this is the first time I have seen Doofus 3D under 4x AA and 16x AF running at a playable FPS, since up until now I have had only GeForce 6200, 6600 (and to some extent 8600) cards. There is no denying that the HD 4850 is more than worth its price for someone who is looking for a budget card and expects to run most of today’s top-line games. The card runs a little hot, but that’s to be expected given the number of triangles it can push and the effects it can deliver. Hats off to AMD/ATI in that regard. If you are looking for a mid-range card right now, the HD 4850 is excellent value for money.

That was the overview from a non-programming point of view. Now the programmer in me has something to say. The card may be excellent, but it’s not all that cozy with ATI’s drivers. The OpenGL drivers are a mess, with the bundled driver not even supporting extensions like EXT_stencil_two_side. Even basic functionality (glDrawRangeElements(), for example) seems to be broken at times, even showing messed-up graphics when using vertex arrays on older cards. The exact same functionality works fine under DirectX. Let’s just say it’s safe to assume the GL drivers haven’t been updated in a while, and/or AMD/ATI just isn’t interested. The only issues reported in this round of testing were on ATI cards, so I had to literally debug the application on ATI hardware to ascertain that these were indeed driver problems. Some of the issues I have mentioned occur on, guess what, the HD 4850 as well. The only workaround seems to be vendor-specific hacks! That doesn’t make me a happy programmer at all!
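
For the curious, this is roughly the shape such a hack takes. A minimal sketch, assuming the GL entry points are already loaded (via GLEW or similar); the names here are illustrative, not the actual O2 Engine code:

    // Sketch of a vendor-specific workaround: sniff the driver vendor once
    // at startup and route around entry points known to misbehave there.
    #include <GL/gl.h>
    #include <cstring>

    static bool g_avoidDrawRangeElements = false;

    void detectDriverQuirks()
    {
        const char* vendor =
            reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        // glDrawRangeElements() showed corruption on ATI drivers in testing,
        // so fall back to plain glDrawElements() there.
        if (vendor && std::strstr(vendor, "ATI"))
            g_avoidDrawRangeElements = true;
    }

    void drawIndexed(GLuint start, GLuint end, GLsizei count,
                     const GLushort* indices)
    {
        if (g_avoidDrawRangeElements)
            glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_SHORT, indices);
        else
            glDrawRangeElements(GL_TRIANGLES, start, end, count,
                                GL_UNSIGNED_SHORT, indices);
    }

Ugly, but it keeps the renderer running on hardware the customers actually own.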

The story with Direct3D is a lot better: no issues were observed under the game’s DirectX renderer. That just tells you something, doesn’t it?

Larrabee isn’t necessarily a means to a custom graphics API.

Has the graphics world come full circle now that we have seen Intel’s first tech presentations of Larrabee? Will we see a resurgence of people writing custom software rasterizers? Is the heyday of the GPU truly coming to an end? Are APIs like OpenGL and Direct3D going to become redundant? I have seen these and a lot of similar questions being asked over the past couple of days, with people even going as far as saying that technologies like Larrabee could be used to write custom graphics APIs. This has been partly due to the huge emotional response to the OpenGL debacle a couple of days back, and partly due to the fact that Intel recently unveiled portions of its (up until now mysterious) Larrabee technology. Some people seem to have concluded that we may soon not require the current graphics APIs at all. Larrabee does promise freedom from the traditional fixed-hardware approach: rendering APIs today are closely tied to the underlying hardware, and the graphics programmer using them is thus limited to what the hardware offers.

Technologies like Larrabee do offer immense flexibility and power, and there is no doubt in my mind that, if needed, one could create a custom graphics API with them. Unfortunately, writing custom APIs might not be the answer, and there are good reasons not to do so. The first, and probably what people see as the less important reason, is that APIs like OpenGL and Direct3D are standards, so it is not advisable to dismiss them outright. What if your code needs to be ported to platforms where Larrabee isn’t available? How do you scale a custom API to that hardware? One could argue that you could get more performance by cutting out any layer that sits in between and accessing the Larrabee hardware directly. Call me a skeptic, but I see issues here as well. It may be very easy to hack up a simple rasterizer, but it’s a completely different thing to produce a vector-optimized one, even for a technology like Larrabee. It’s not a trivial task even with the best vector-optimizing compilers from Intel. I would lay my bets on the star team working at Intel to produce a better rasterizer than I probably can, and I am pretty sure this rasterizer will be exposed via Direct3D and/or OpenGL interfaces. Yes, you could probably make certain specific portions of your engine highly optimal using the generic Larrabee architecture, but a custom rendering API may not be the best option.
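
To illustrate the gap: the “easy half” of that argument, a scalar edge-function triangle rasterizer, really can be hacked up in an afternoon (the sketch below assumes counter-clockwise winding and a 32-bit framebuffer). Turning something like this into code that saturates Larrabee’s wide vector units is the part I wouldn’t bet on getting right before Intel’s own team does:

    // A deliberately naive scalar rasterizer using edge functions.
    #include <algorithm>
    #include <cstdint>

    struct Vec2 { float x, y; };

    // Twice the signed area of triangle (a, b, c); >= 0 when c lies to the
    // left of edge ab, i.e. inside for counter-clockwise triangles.
    static float edge(const Vec2& a, const Vec2& b, const Vec2& c)
    {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    void rasterizeTriangle(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                           uint32_t color, uint32_t* fb, int width, int height)
    {
        // Clamp the triangle's bounding box to the framebuffer.
        int minX = std::max(0,          (int)std::min({v0.x, v1.x, v2.x}));
        int maxX = std::min(width - 1,  (int)std::max({v0.x, v1.x, v2.x}));
        int minY = std::max(0,          (int)std::min({v0.y, v1.y, v2.y}));
        int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

        for (int y = minY; y <= maxY; ++y)
            for (int x = minX; x <= maxX; ++x)
            {
                Vec2 p = { x + 0.5f, y + 0.5f };  // sample at the pixel center
                if (edge(v0, v1, p) >= 0 &&
                    edge(v1, v2, p) >= 0 &&
                    edge(v2, v0, p) >= 0)
                    fb[y * width + x] = color;
            }
    }

Correct, readable, and hopelessly slow; the vectorized, tiled, multi-threaded version of this is where the real engineering lives.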

As a piece of technology, Larrabee is very interesting, especially for real-time graphics. For the first time you will have the capacity to be truly (well, maybe not completely) free from the shackles of fixed-function hardware. There are so many more things you could accomplish with it. You could also use Larrabee for other work, like parallel processing, or for running highly vectorized, compute-intensive workloads very efficiently.

Couldn’t have put it better myself….

[Image: OpenGL 3.0 – All in a day’s work!]

OpenGL 3.0 is finally released, and it disappoints.

The ARB has released the much-anticipated OpenGL 3.0 spec, and if you have been following OpenGL’s development for some time, you would know that hopes were riding high that OpenGL 3.0 would be a revolutionary redesign of an ailing and rather old API. It’s none of that, and worse, it’s actually nothing much at all. OpenGL has been dragging along for the past 15 years, adding layer upon layer of mucky extensions, to the point that many expected the ARB to really go ahead and make radical changes in the 3.0 specification. None of that has happened. Most of the radical changes promised have not been delivered. All that seems to have happened is the standardization of already existing extensions by folding them into the core. Sad, really sad.

As a game developer, and more so as someone who has been using OpenGL for the past 8 years, I am pretty disappointed. I was hoping to see a refreshing change to OpenGL. I am at a loss for words here; no, really, I am. There is really nothing more to say. The changes are so shallow that I wonder why this called for a major version bump in the first place. 2.1 to 3.0, phooey, it should have been 2.1.1 instead. Let me put it another way: my current OpenGL renderer, which is based on OpenGL 2.x, could be promoted to 3.0 with maybe 4 or 5 minuscule changes, or perhaps none at all! Where is the Direct3D 10+ level functionality that was hyped? Where is the “radically forward looking” API?
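
To illustrate just how minuscule: about the only change a working 2.x renderer needs is to request a 3.0 context at startup. A sketch for Windows using the WGL_ARB_create_context extension (error handling trimmed; the tokens come from the published extension spec):

    // Promoting a GL 2.x renderer to "OpenGL 3.0": ask the driver for a 3.0
    // context and carry on. Everything else keeps working as before.
    #include <windows.h>
    #include <GL/gl.h>

    // Tokens from the WGL_ARB_create_context extension spec.
    #define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
    #define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092

    typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(
        HDC hDC, HGLRC hShareContext, const int* attribList);

    HGLRC createGL30Context(HDC dc, HGLRC legacyContext)
    {
        // The extension entry point can only be queried while some context
        // is current, so the old 2.x context bootstraps the new one.
        wglMakeCurrent(dc, legacyContext);
        PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
            (PFNWGLCREATECONTEXTATTRIBSARBPROC)
                wglGetProcAddress("wglCreateContextAttribsARB");
        if (!wglCreateContextAttribsARB)
            return 0;  // driver predates OpenGL 3.0

        const int attribs[] = {
            WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
            WGL_CONTEXT_MINOR_VERSION_ARB, 0,
            0
        };
        return wglCreateContextAttribsARB(dc, 0, attribs);
    }

That’s it. Hardly the revolution we were promised.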

What does this say about the future of OpenGL? Sadly, not very much, at least in the gaming arena. OpenGL was already losing ground, and there was a lot of anticipation that the ARB would deliver a newer OpenGL to “take on” Direct3D. I must say that a powerful Direct3D (thanks to DirectX 11) looks all set to become the unequivocal champion of gaming graphics, with OpenGL clearly taking a back seat. While some may argue that OpenGL will continue to flourish in the CAD arena, I am not so sure Direct3D won’t find favor there as well. OpenGL drivers from most vendors already fall short of their Direct3D counterparts. That’s to be expected, and it’s not entirely the vendors’ fault either. What else can they do when they have a 15-year-old API to support, whose legacy functionality is out of touch with modern-day reality?

EDIT: The major thing missing from OpenGL 3.0 was a clean API rewrite. When you compare OpenGL 3.0 with Direct3D 11, it’s how things look from here on forward that bothers me. Direct3D is more streamlined to address developments in hardware, and while vendors could expose similar functionality via OpenGL using vendor-specific extensions, the whole situation doesn’t look too good. Making a driver that is fully OpenGL compatible costs more in manpower, because the specification is so large. Yes, there is an opportunity to deprecate things, but I am not too sure how that will pan out either. Supporting older features on newer hardware means compromises and sacrifices in quality and performance. Driver writers cannot optimize for everything, and that is why, in the end, performance suffers; or, in the worst case, the driver ships broken.

Doofus gets a dose of Optimizations.

Ah! It’s the optimization phase of the project, and I am knee-deep in both CodeAnalyst and NVIDIA PerfHUD. As far as memory-leak testing goes, most (no, all) of it is done by my own custom memory manager built directly into the engine core, so the game needs no third-party leak detectors. AMD’s CodeAnalyst is an invaluable utility when it comes to profiling applications for CPU usage, and the fact that it’s free makes it even better. NVIDIA PerfHUD is probably the champion among graphics performance utilities and, I think, vital for bullet-proofing any graphics application’s GPU performance. Too bad it doesn’t support OpenGL yet, but the O2 Engine’s renderers mirror each other almost to the point where a performance enhancement under the Direct3D renderer shows up almost identically under the OpenGL renderer. I would still have liked PerfHUD to support OpenGL, though, because there are some GL-specific issues. For instance, FBOs under OpenGL perform a tad slower than render targets under Direct3D (on the same hardware), which I must admit has left me a little dumbfounded. Maybe it’s just my GPUs (yeah, my GPUs are a bit old, I must say), or maybe the drivers are at fault, but I have noticed the performance variance even after considerable experimentation and optimization. It would have been good to have a utility like PerfHUD to probe directly at the draw calls and/or FBO switches. I am trying my luck with GLExpert, but I am not there yet. I must say, though, that GLExpert is nothing compared to PerfHUD.
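
For anyone curious what “built directly into the engine core” means, here is a minimal sketch of the idea (the names are illustrative, and the real O2 Engine manager is considerably more involved): every allocation records where it came from, and whatever is still in the table at shutdown is, by definition, a leak.

    // Minimal leak-tracking allocator sketch (single-threaded for brevity).
    #include <cstdio>
    #include <cstdlib>
    #include <map>

    struct AllocInfo { const char* file; int line; std::size_t size; };

    class MemTracker
    {
    public:
        static MemTracker& get() { static MemTracker t; return t; }

        void* alloc(std::size_t size, const char* file, int line)
        {
            void* p = std::malloc(size);
            if (p) live_[p] = AllocInfo{ file, line, size };
            return p;
        }

        void release(void* p)
        {
            if (!p) return;
            live_.erase(p);
            std::free(p);
        }

        ~MemTracker()  // report anything never freed
        {
            for (std::map<void*, AllocInfo>::const_iterator it = live_.begin();
                 it != live_.end(); ++it)
                std::fprintf(stderr, "LEAK: %lu bytes (%s:%d)\n",
                             (unsigned long)it->second.size,
                             it->second.file, it->second.line);
        }

    private:
        std::map<void*, AllocInfo> live_;  // all currently live allocations
    };

    // Engine code allocates through these so file/line tagging comes free.
    #define O2_ALLOC(sz) MemTracker::get().alloc((sz), __FILE__, __LINE__)
    #define O2_FREE(p)   MemTracker::get().release(p)

Routing every engine allocation through a choke point like this is also what makes per-subsystem memory statistics nearly free to add later.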

[Image: AMD CodeAnalyst]

[Image: Doofus running under NVIDIA PerfHUD]

Two great books from NVIDIA

NVIDIA very recently released two great books (free) which are a must for anyone who tinkers with graphics, and specifically with shaders. The first is GPU Gems, which I happen to have in hardcopy (from long before the free release). It’s such an invaluable resource of tricks that are still very much valid to this day. I would recommend it to anyone and everyone who wants to get their hands dirty with graphics. Then yesterday The Cg Tutorial was released. I don’t have a hardcopy of that one, as I already have a pretty good hand at HLSL, and the two languages (HLSL and Cg) are essentially the same. I read through the book and was equally impressed. So if you haven’t already read them, I strongly recommend taking a good, hard look at both.

EDIT [13th May, 2008]: Another one released today. GPU Gems 2.

New year wishes and a look at the year gone by.

First of all, a “Very Happy New Year” to all.

Just to highlight some interesting news and events from the year gone by, plus my own experiences.

  • Games:
    1. It was probably the game of the year (at least as far as I am concerned): BioShock. I enjoyed playing it, even though not on my PC, and I still haven’t completed it. Truly amazing graphics and a new twist on the FPS style of play.
    2. The Elder Scrolls IV: Oblivion. This game didn’t take first place because it was launched not this year but the last; it’s here because I only managed to play and complete it this year. I played it along with the Shivering Isles and Knights of the Nine expansions for more than 5 months 😛 starting July, and I must say I have come to thoroughly enjoy the sandbox-style gameplay the game offers. Don’t be surprised if I start getting crazy ideas about creating games like this in 2008 😉 .
    3. Just when we thought nothing could tax the 8800, Crysis hit! The game takes the best-visuals award of 2007. Amazing eye candy and surely a sign of things to come, though I am not sure about the overall gameplay.
    4. There were a couple of other interesting games as well, like GOW 2 and Gears of War, but I haven’t got my hands on them yet.
  • Programming and Development:
    1. The biggest disappointment was the postponement of the OpenGL 3.0 specs. I was hoping to see at least something concrete, but to no avail. I hope 2008 gives us more to look forward to.
    2. 2007 saw the release of Visual Studio 2008 and its Express editions. Not too much to complain about or praise there. .NET 3.5 was released along with the Studio versions.
    3. While major releases were few and far between, minor releases like Cg 2.0 and Silverlight dominated most of the programming and development news.
  • Personal projects:
    1. The biggest miss was not being able to launch Doofus 3D. Period! The game was slated to release in October/November, but inevitable delays and project pressures resulted in it not being shipped. This has been the biggest disappointment on my side.
    2. The project is, however, still on track, and barring time delays, the product and the engine have become stable and look more and more like a very solid platform for future projects. Most (almost all) of my ideas (some really crazy ones too) have thankfully worked!
    3. My R&D on scripting-engine integration has yielded good results. I remember my promise and will update the blog with some statistical data on this; I’m just tied up with project pressures for now. On the whole, R&D this year was lower than last year.
    4. Got a new website this year, migrated the blog and also have one lined up for the game release.
  • Hardware:
    1. The year belonged to NVIDIA; the 8800 pretty much dominated the graphics scene unchallenged for most of 2007. There was a feeble attempt by AMD(/ATI) at the end of the year, but the HD 3870 and 3850 were plagued with shipping problems, though they have shown impressive figures and amazing value for money considering the price point. However, I expect the green brigade to counter, since they are already well ahead in the race to do so.
    2. Next was Intel, which successfully managed to run the competition (AMD) into the ground with its chips, the Core 2s pretty much dominating the market. The Phenoms are here but have yet to prove themselves. It’s safe to say Intel ruled 2007.
  • Operating Systems:
    1. I have done enough Vista bashing on this blog already, so no more! My sentiments, however, remain unchanged regarding the OS. 2007 was particularly bad for Vista; the OS was given flak in a lot of articles on the web. My recommendation: give the OS a skip for the time being and use XP and/or…
    2. Ubuntu 7.10, code-named Gutsy Gibbon (released in 2007), has been a revelation for me. I have been using it for a month now on my internet PC and am more than happy with it. True, some quirks remain, but Ubuntu is a great OS for, well, everyone and anyone. I recommend it hands down!
  • Misc News:
    1. India wins the Twenty20 World Cup 2007.

New year resolution:

Release Doofus 3D.

A lot of plans in mind, but more on that later.