Visual Studio Community Edition (Free).

Microsoft has released a free Visual Studio Community Edition. It’s basically a free, full-featured Visual Studio IDE (apparently with everything included) that can be used to make all kinds of apps. I was even surprised to find support for Python and Git included. Even more surprising were the Apple and Android logos in the “supported platforms” section!

Link: http://www.visualstudio.com/products/visual-studio-community-vs

Direct3D 10/11 coming to Linux … What about games?

No, April 1st is still more than 6 months away, and yes, you heard me right — Direct3D versions 10 and 11 are indeed coming to Linux. How is this even possible? Well, it is possible since nouveau moved on to Gallium 3D, which allows the Direct3D API (actually any API) to be exposed via a front end called a state tracker. Interestingly (and there seems to be a lot of confusion about this on public forums), Direct3D will be a native API under Gallium, much like OpenGL is currently. It won’t be something that emulates Direct3D via wrappers around OpenGL — meaning you will be able to write and compile Direct3D code directly on Linux or BSD-based systems that support the nouveau driver. Initially I was a bit skeptical of such an approach, since the Direct3D API is integrated with the Win32 API, but the author seems to have solved this by using Wine headers. I don’t know the pitfalls (if any) of such an approach, but it seems to have worked for him, and it would seem the logical path to take (instead of breaking API compatibility). He clearly outlines the motivation behind doing the Direct3D port, and kudos to him for doing something that was all but inevitable given the no-show of Longs Peak.
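To put the “native” part in perspective, here is the kind of plain Direct3D 10 code that should, in theory, compile as-is against the Wine-provided d3d10.h on a system with the Gallium state tracker, with no OpenGL wrapper anywhere in the chain. I haven’t tried this on Linux myself, so treat it purely as an illustrative sketch:

    // Ordinary D3D10 device creation; nothing Linux-specific here. Under
    // Gallium, D3D10_DRIVER_TYPE_HARDWARE would route through the nouveau
    // (or any other Gallium) backend instead of a Windows driver.
    #include <d3d10.h>

    ID3D10Device* CreateDevice()
    {
        ID3D10Device* device = NULL;
        HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                       NULL, 0, D3D10_SDK_VERSION, &device);
        return SUCCEEDED(hr) ? device : NULL;
    }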

Naturally, a native Direct3D implementation will allow game developers to write code that is cross-platform, and even allow existing engines/games that use Direct3D versions 10 and higher to be ported across to platforms that have a Gallium driver. W00t! This is amazing, almost too good to be true, isn’t it? But before we gamers jump for joy, there are still a few things that have to fall into place before Direct3D on Linux can get up and running. First and foremost is support. Hardware vendors like Nvidia and AMD must support Gallium in their drivers, or OSS drivers must be written (and are being written) to take their place. This is paramount, since without such an interface no front-end API (Direct3D or OpenGL) will be able to use hardware acceleration via Gallium. Second, and more importantly, the guys at Redmond must allow such an implementation of their Direct3D API. An API itself can’t be copyrighted, and the author seems to have steered clear of any Microsoft code, so theoretically this shouldn’t be a problem. But then again I am no legal eagle, so I can’t really say anything on that front. There have been rumors that there are patents on sections of Direct3D. I am not sure what that means, or for that matter whether it is even possible to patent sections of an API/library. But things could get messy if Microsoft were to serve a cease and desist on this new development. I doubt that would happen, but you never know.

I have to agree, having Direct3D as a native API via Gallium does open up a lot of possibilities for OSS platforms, which have severely lacked games. Accelerated graphics on most systems apart from Windows has had little choice up until now, with OpenGL being the only real option. But does this really mean that all the games developed and being developed will be ported to Linux and other OSS platforms? That’s an interesting question, and the answer isn’t quite that simple. Let’s look at the macro picture of the industry. For AAA games the PC platform isn’t a priority. Most (maybe all) AAA games today are made with consoles in mind. Yes, there may be a PC port, but it’s the consoles that are the main priority. Most (if not all) gamers who play AAA games on the PC spend big on their systems, and most of them already have Windows as their main OS. Some do have *NIX systems, but even these few keep a Windows partition around specifically for games. Porting any software to a new platform isn’t a trivial task. Even with the best coding practices and methods, it requires a lot of resources — which aren’t free. Everything from coding, testing, maintaining build setups and writing install scripts requires time and money. For a AAA game, or for that matter any game or software, a port to a new platform has to show a robust ROI (return on investment). That’s where the crux of the problem lies. There aren’t that many *NIX gamers out there, and if there are, the big studios aren’t seeing them!

Then there are casual games, which are also a big market. Casual games represent a very different kind of audience. A typical casual gamer is a non-technical person who doesn’t even understand what a hardware driver is, let alone jargon like Gallium, Direct3D, OpenGL or, for that matter, Linux. Most casual gamers will have nothing but a moderately powerful laptop with an on-board Intel graphics chip — which came with Windows pre-installed. This is the kind of player who expects the game to install and run with a single click. They don’t understand driver updates or DirectX versions. For them it matters little which API is better or worse, or which platform supports which API and which doesn’t. Apart from these two broad segments, there are a whole lot of players who will play radical indie games, and this is probably where Linux ports have found some success. This gamer is the tech-savvy computer geek who runs Linux as his/her primary system and isn’t afraid to fire up the console now and then. I must say, some radical indie games have found success in this area. But these games are far from cutting edge. They may be very good games, but you don’t expect Crysis-like graphics from them, and it matters little which API is used, or whether the underlying API runs 5% slower, when your game never drops below the 30 FPS barrier.

There have been lots of debates about OpenGL vs Direct3D, and I’ll refrain from going into that here. However, having a choice of accelerated graphics APIs on platforms other than Windows is definitely good all around. Direct3D versions 10 and 11 are well-designed APIs, closely tied to current-generation hardware. But whether all this will translate into more ports of games to Linux and the BSDs is still an open question. The community, as always, will play a vital role, and only time will tell how things pan out.

DirectX 11 hardware is here.

It’s almost time for Windows 7, and along with it the first lot of DirectX 11 class hardware has started to appear. This time the first off the block was, surprise surprise, ATI. The 5800 series cards were released a couple of days ago and there are already impressive reviews of the new cards all around. I am sure it won’t be long before Nvidia, which has been uncannily silent, comes out with their line-up. So it is safe to assume there will be DirectX 11 class hardware on the shelves going into the Windows 7 release (the Windows 7 RC already has DX 11 support, and it will also be made available for Vista soon). It will, however, still take a few more weeks for the initial euphoria to settle, and we should see prices of the cards drop around the holiday season, which is probably when I will go in for an upgrade as well. I have been running the HD 4850 for some time now, and thus far it’s proving to be sufficient, not only for gaming but also for my programming needs. The HD 4850 has been surprisingly good given its price point, and one would expect the same from the 5800 series given the already positive reviews.

There are a couple of things in favour of DirectX 11. The first is the API itself. DirectX 11 offers more than just a simple evolutionary upgrade (more here). DirectX 10 was mostly a non-event. The enormous success and longevity of XP and the Xbox 360 ensured that version 9 of the API far outlived most expectations (and will probably continue to live for some time to come). The story of DirectX 10 is also intrinsically connected to Vista. Vista’s low adoption meant not enough people were running a DirectX 10 capable software platform, which Microsoft stubbornly refused to port to XP for whatever reason. Even though DirectX 10 class hardware was available during Vista’s reign, nagging hardware issues and poorly implemented drivers meant DirectX 10 never really caught on the way 9 did.

That brings us to the second point in favour of DirectX 11 — Windows 7. XP is old, and I mean seriously old. I am still running a 2004 copy of XP on my machine, and though it’s doing its job admirably, it’s due for an upgrade. Windows 7 seems to have gotten over those annoying little quirks of Vista which we hated and shouted so much about. My hunch is that most people who have stuck with XP will upgrade too. Maybe not on immediate release, but 2-3 months down the line, when things settle in, after those initial bugs have been addressed and more reviews of the OS come out, 7 should slowly see wider adoption. With Vista it seemed like things were rushed and hyped up. In contrast, Microsoft has been careful with Windows 7. The RC of Windows 7 has been somewhat of a “soft launch”, and though I haven’t had the chance to try it out myself, it would seem (from reviews and from what people are saying) that Windows 7 is much better off than Vista was. So it’s fair to assume that 7 will catch on more than Vista did, and in the process DirectX 11 will “get on” to the desktop.

Does this mean DirectX 11 will be the de facto API for coming games? For that, let’s look at the games developed today. Most games today are still developed primarily for DirectX 9.0 class hardware. Why? Consoles, that’s why. You do see AAA titles advertise DirectX 10 and 10.1 support, but even those games are developed with DirectX 9.0 class hardware in mind. Yes, some features can be found here and there, usually eye candy to impress the overzealous graphics fanboi, but the engine and tech itself is designed for platform compatibility, which ironically means not all the features of the newer DirectX versions are exploited. As I said before, DirectX 11 is more than just a simple upgrade to the API; it’s also a new way of doing things. But since older hardware still has to be supported, compromises have to be made. There are probably no AAA titles exclusively for the PC, so even if PCs all around were to have DirectX 11 support, it’s not until the consoles catch up that you will see all the cool things the newer API has to offer come to the fore.
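Interestingly, the new API itself acknowledges this lowest-common-denominator reality: from what has been published, D3D11 device creation accepts a list of “feature levels”, so a single code path can target 11-, 10- and even 9-class hardware and scale its effects accordingly. A hedged sketch of how I understand the mechanism (error handling trimmed, and I haven’t run this myself):

    #include <d3d11.h>

    // Try for full DX11, fall back gracefully towards the console-class
    // baseline most engines are actually built around.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3
    };

    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    obtained;

    HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                                   levels, sizeof(levels) / sizeof(levels[0]),
                                   D3D11_SDK_VERSION, &device, &obtained,
                                   &context);
    // 'obtained' tells the engine which compromises to make at runtime.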

There is little doubt that version 11 of the API will make games look better. But there is so much more to it than just improving looks. Many of the features in the new API mirror hardware changes that have taken place, like the move away from the fixed-function pipeline and the evolution of GPUs into massively parallel compute devices. All this means DirectX 11 is an API to take seriously. But how quickly will games start using all these features? I guess only time will tell.

The HD 4850 and the story with AMD/ATI.

First the HD 4850. I was testing the game on the new HD 4850 (Palit 512MB) today and observed some interesting things about the graphics card. For one, it gives serious bang for the buck. Doofus 3D clocked about 140 FPS at a resolution of 1024×768, AF 16x, with graphics quality set to high. Even with AA 2x, Doofus 3D clocks more than 120 FPS, and I have a strong suspicion the game was going CPU-bound at those frame rates, since the machine had a 3-year-old CPU. I can tell you for a fact the card is a serious performance monster, but then again Doofus 3D ain’t a top-line game. However, for me this is the first time I have seen Doofus 3D running at a playable FPS under 4x AA and 16x AF, since up until now I have only had GeForce 6200 and 6600 (and to some extent 8600) cards. There is no denying that the HD 4850 is more than worth its price for someone who is looking for a budget card and expects to run most of today’s top-line games. The card runs a little hot, but that’s to be expected given the amount of triangles it can push and the effects it can deliver. Hats off to AMD/ATI in that regard. If you are someone who is looking for a mid-range card right now, the HD 4850 is excellent value for money.

That was the overview from a non-programming point of view. Now the programmer in me has something to say. The card may be excellent, but things aren’t all that cozy with ATI’s drivers. The OpenGL drivers are a mess, with the bundled driver not even supporting extensions like EXT_stencil_two_side. Even basic functionality (glDrawRangeElements(), for example) seems to be broken at times, even showing messed-up graphics when using vertex arrays on older cards. The exact same functionality works fine under DirectX. Let’s just say it’s safe to assume the GL drivers haven’t been updated in a while and/or AMD/ATI just isn’t interested. The only issues reported in this round of testing were on ATI cards, so I had to literally debug the application on ATI hardware to ascertain that these were indeed driver problems. Some of the issues I have mentioned occur on, guess what, the HD 4850 too. The only workaround seems to be vendor-specific hacks! That doesn’t make me a happy programmer at all!
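For the curious, this is the shape such a hack takes: sniff the vendor string and route around the broken path. A simplified sketch, not the engine’s actual code:

    #include <GL/gl.h>
    #include <cstring>

    // Returns true when we should avoid glDrawRangeElements() and fall
    // back to plain glDrawElements(), based on the driver vendor.
    bool UseDrawElementsFallback()
    {
        const char* vendor = (const char*)glGetString(GL_VENDOR);
        return vendor != NULL && strstr(vendor, "ATI") != NULL;
    }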

The story with Direct3D is a lot better, and no issues were observed under the DirectX renderer of the game. That just tells you something, doesn’t it!

Larrabee isn’t necessarily a means to a custom graphics API.

Has the graphics world come full circle now that we see Intel’s first tech presentations of Larrabee? Will we see a resurgence of people writing custom software rasterizers? Is the heyday of the GPU truly coming to an end? Are APIs like OpenGL and Direct3D going to become redundant? I have seen these and a lot of similar questions being asked over the past couple of days, with people even going as far as saying that technologies like Larrabee could be used to write custom graphics APIs. This has been partly due to the huge emotional response to the OpenGL debacle a couple of days back, and partly due to the fact that Intel recently unveiled portions of its (up until now mysterious) Larrabee technology. Some people seem to have concluded that we may soon not need the currently used graphics APIs at all. Larrabee does promise freedom from the traditional hardware-based approach. Rendering APIs today are closely tied to the underlying hardware, and the graphics programmer using them is thus limited to what the hardware offers him/her.

Technologies like Larrabee do offer immense flexibility and power. There is no doubt in my mind that, if needed, one could create a custom graphics API using them. Unfortunately, writing custom APIs might not be the answer or even an option, and there are good reasons not to do it. The first, and probably what people see as the less important reason, is the fact that APIs like OpenGL and Direct3D are standards, and it is not advisable to dismiss them outright. What if code needs to be ported across platforms where Larrabee might not be available? How do you scale a custom API to that hardware? One could argue that you could get more performance by cutting out any layer that sits in between and accessing the Larrabee hardware directly. Call me a skeptic, but I see issues here as well. It may be very easy to hack up a simple rasterizer, but it’s a completely different thing to produce a vector-optimized one, even for a technology like Larrabee. It’s not a trivial task even with the best vector-optimizing compilers from Intel. I would lay my bets on the star team working at Intel to produce a better rasterizer than I probably can, and I am pretty sure this rasterizer will be exposed via Direct3D and/or OpenGL interfaces. Yes, you could probably make certain specific portions of your engine highly optimal using the generic Larrabee architecture, but a custom rendering API may not necessarily be the best option.
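To make that concrete, hacking up a naive rasterizer really is a weekend job; here is the heart of one, done with simple edge functions (a toy sketch assuming counter-clockwise winding). Turning something like this into a wide-vector, cache-friendly, fully featured pipeline is the part I would happily leave to Intel’s star team:

    struct Vec2 { float x, y; };

    // Signed-area edge function: positive when p lies to the left of a->b.
    static float Edge(const Vec2& a, const Vec2& b, const Vec2& p)
    {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    // Brute force: test every pixel centre against all three edges.
    void RasterizeTriangle(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                           unsigned* fb, int width, int height, unsigned color)
    {
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
            {
                Vec2 p = { x + 0.5f, y + 0.5f };
                if (Edge(v0, v1, p) >= 0.0f && Edge(v1, v2, p) >= 0.0f &&
                    Edge(v2, v0, p) >= 0.0f)
                    fb[y * width + x] = color;
            }
    }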

As a piece of technology, Larrabee is very interesting, especially for real-time graphics. For the first time you will have the capacity to be truly (well, maybe not completely) free from the shackles of hardware. There is so much more you could accomplish with it, like parallel processing, or doing intensive, highly vectorized computations very efficiently.

Doofus gets a dose of Optimizations.

Ah! It’s the optimization phase of the project and I am knee-deep in both CodeAnalyst and NVIDIA PerfHUD. As far as memory-leak testing goes, most, no, all of the memory-leak testing is done by my own custom memory manager built directly into the engine core, so no third-party leak detectors are needed by the game. AMD’s CodeAnalyst is a utility that is invaluable when it comes to profiling applications for CPU usage, and the fact that it’s free makes it even better. NVIDIA PerfHUD is probably the champion among graphics performance utilities and is, I think, vital when it comes to bulletproofing any graphics application for GPU performance. Too bad it doesn’t support OpenGL yet, but the O2 Engine’s renderers mirror each other almost to the point where a performance enhancement under the Direct3D renderer is experienced almost identically under the OpenGL renderer. I would have really liked PerfHUD to support OpenGL though. There are some issues under GL; for instance, FBOs under OpenGL perform a tad slower than render targets under Direct3D (on the same hardware), which I must admit has left me a little dumbfounded. Maybe it is just my GPU (yeah, my GPUs are a bit old, I must say), or maybe the drivers are at fault, but I have noticed a performance variance between the two even after considerable experimentation and optimization. It would have been good to have a utility like PerfHUD to probe directly at the draw calls and/or FBO switches. I am trying my luck with GLExpert, but I am not there yet. I must say, however, that GLExpert is nothing compared to PerfHUD.
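For those wondering what “built directly into the engine core” means, the idea (in a stripped-down sketch, not the actual O2 Engine code; the macro name below is made up) is simply to record every allocation with file and line, and dump whatever is left at shutdown:

    #include <cstdio>
    #include <cstdlib>
    #include <map>

    struct AllocInfo { const char* file; int line; size_t size; };
    static std::map<void*, AllocInfo> g_allocs; // not thread-safe; sketch only

    void* TrackedAlloc(size_t size, const char* file, int line)
    {
        void* p = malloc(size);
        AllocInfo info = { file, line, size };
        g_allocs[p] = info;
        return p;
    }

    void TrackedFree(void* p)
    {
        g_allocs.erase(p);
        free(p);
    }

    void DumpLeaks() // called once at engine shutdown
    {
        for (std::map<void*, AllocInfo>::iterator it = g_allocs.begin();
             it != g_allocs.end(); ++it)
            printf("LEAK: %u bytes at %s:%d\n", (unsigned)it->second.size,
                   it->second.file, it->second.line);
    }

    // Hypothetical macro so file/line get captured at the call site:
    #define O2_ALLOC(sz) TrackedAlloc((sz), __FILE__, __LINE__)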

AMD CodeAnalyst

Doofus running under NVIDIA PerfHUD

DirectX 9 to DirectX 11, where did 10 go?

This week there was a lot of buzz about DirectX 11. Yes, the newest version of the graphics API was unveiled by Microsoft at the XNA Game Fest, and it has an interesting feature set that, I think, was long overdue. Most of DirectX 11 doesn’t diverge from version 10 (and the mostly uneventful version 10.1), but I think DirectX 11 should see renewed interest from game developers, since it provides features that were desperately needed in light of recent hardware developments. Version 11 (together, of course, with the features of 10 and 10.1) now seems a more complete API for addressing issues related to game and graphics development, and a better solution for the future.

What is really interesting is the emergence of what Microsoft terms the “Compute Shader”, no doubt marketing speak for GPGPU, which they claim will allow the GPU, with its awesome power, to be used for “more than just graphics”, which smells like CUDA (Compute Unified Device Architecture) to me. I wouldn’t be surprised if the two turned out to be very similar (remember Cg/HLSL). In any case, what is important is that such technology will be available to game developers under version 11. Technologies like CUDA (GPGPU) are the requirement of the hour, and this could be why 11 sees a lot more interest than the earlier (10.x) versions.
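To give a flavour of what is being promised: a compute shader is just a kernel dispatched over threads, no rasterization involved. A speculative sketch based on what has been shown so far (HLSL embedded as a string for illustration; the exact syntax may well change before release):

    // Trivial SM5 compute kernel: double every element of a buffer.
    const char* g_computeHLSL =
        "RWStructuredBuffer<float> data : register(u0);  \n"
        "[numthreads(64, 1, 1)]                          \n"
        "void main(uint3 id : SV_DispatchThreadID)       \n"
        "{                                               \n"
        "    data[id.x] = data[id.x] * 2.0f;             \n"
        "}                                               \n";

    // Host side (device, shader and UAV creation omitted): dispatch enough
    // 64-thread groups to cover N elements.
    // context->CSSetShader(cs, NULL, 0);
    // context->CSSetUnorderedAccessViews(0, 1, &uav, NULL);
    // context->Dispatch((N + 63) / 64, 1, 1);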

There is a lot of talk about hardware-based tessellation, but frankly I haven’t seen too many details on that, at least not enough to make a detailed comment. From what little is being said, DirectX 11’s hardware-based tessellation could be used to make models appear “more smooth”. How this translates into an actual implementation will become clear when more details come out. I am hazarding a guess here, but it should be something along the lines of calculating sub-surf LODs in real time and/or doing displacement/bump/normal mapping on the fly, or maybe something in between, or a combination of those techniques. Whatever it is, it should mean really good-looking games in the future.

Issues like multi-threaded rendering and resource handling are things that were a long time coming, and yes, it’s a good thing we will finally see them in the newer version. It just makes my job as a game developer a whole lot easier. Most details on Shader Model 5.0 are pretty sketchy, so I won’t go into things like shader length and function recursion. However, I hope such issues are addressed satisfactorily in the new shader model.
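From what has been described, the multi-threaded rendering model looks roughly like this: worker threads record commands into deferred contexts, and the main thread replays the baked command lists on the immediate context. A sketch of the API flow as I understand it (details may change, and error handling is omitted):

    #include <d3d11.h>

    void WorkerThread(ID3D11Device* device, ID3D11CommandList** outList)
    {
        ID3D11DeviceContext* deferred = NULL;
        device->CreateDeferredContext(0, &deferred);

        // ... record draw calls into 'deferred' exactly as on an
        // immediate context: IASetVertexBuffers, Draw, etc. ...

        deferred->FinishCommandList(FALSE, outList); // bake the recording
        deferred->Release();
    }

    void MainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
    {
        immediate->ExecuteCommandList(list, FALSE); // replay on the GPU
        list->Release();
    }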

So will DirectX 11 succeed where DirectX 10 failed? Will it see mass adoption like DirectX 9? Difficult to say. While most cutting-edge games have adopted DirectX 10, its usage remains low because of several factors. For one, many people still use XP, which doesn’t support version 10 (or greater) of the API (for whatever reason), which means most developers have to adopt the lowest common denominator of the alternatives available, and that is generally DirectX 9.0. Also, many people still don’t have DirectX 10 class hardware, which is another reason not to go for 10.x. The situation with DirectX 10.1 is a total mess. It’s interesting, but there is even talk that NVIDIA might skip 10.1 entirely and aim directly for version 11 class hardware. There is logic to that decision, given that most games (except the really high-end ones) don’t even bother to use DirectX 10, let alone 10.1. All this makes adopting 10.x unattractive for game developers.

Version 11 does bring some really good features to gaming in general, but that is not necessarily why the API will succeed. As a game developer, I think 11 holds some serious promise and could be a success if Microsoft plays its cards right. However, there are some issues (mentioned above) that still bother me. Microsoft is still fixated on releasing version 11 only for Vista, so don’t expect your XP machine to ever run DirectX 11, even if you buy brand-new hardware. That said, like most previous versions, DirectX 11 is backward compatible with versions 10 and 10.1, and even 9.0. It would be impossible for Microsoft to ignore the thousands of games that already use DirectX 9, so it’s almost a given that newer versions of the API will continue to be backward compatible until and unless we see a wholesale migration of games to newer versions, and that could be a long way off, since many games even today are still being produced on version 9.0.

Two great books from NVIDIA

NVIDIA very recently released two great books (free), which are a must for anyone who tinkers with graphics, and specifically with shaders. The first is the GPU Gems book, which I happen to have in hardcover (from long before the free version was released). It’s an invaluable resource of tricks that are still very much valid to this day. I would recommend it to anyone and everyone who wants to get their hands dirty with graphics. Then yesterday The Cg Tutorial was released. I haven’t got a hardcover of that book, as I have a pretty good hand at HLSL and the two (HLSL and Cg) are essentially the same. I read through the book and was equally impressed with it. So if you haven’t already read them, I would strongly recommend taking a good hard look at both.

EDIT [13th May, 2008]: Another one released today. GPU Gems 2.

Cg 2.0 Beta.

NVIDIA released the Cg 2.0 beta a few days back. I downloaded the new version, including the docs, and was browsing through them. What particularly interested me was the line “New OpenGL profiles (gp4vp, gp4gp, and gp4fp) for GeForce 8 extensions” in the features list. It’s a little confusing though, because the docs don’t go into exactly how to use them, or whether they are just some extensions. Another thing that seems amiss is exactly how to access the new DirectX 10 features from Cg, or am I missing something? I don’t have a DirectX 10 compatible card so I can’t really test anything, but I checked out the examples provided with the SDK and there seem to be no examples for DirectX 10.
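My best guess is that the new profiles surface through the usual runtime query, i.e. asking the Cg GL runtime for the best profile the card supports, which on a GeForce 8 should presumably come back as one of the gp4 profiles. Untested guesswork on my part, since I don’t have the hardware:

    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    CGcontext ctx = cgCreateContext();

    // On a GeForce 8 these should (presumably) return gp4vp and gp4fp.
    CGprofile vp = cgGLGetLatestProfile(CG_GL_VERTEX);
    CGprofile fp = cgGLGetLatestProfile(CG_GL_FRAGMENT);

    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                             vp, "main", NULL);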

The docs mention that 2.0 is compatible with 1.5, so I tried out some of my older test shaders written with the previous version (1.5) and they worked OK. But then again, those weren’t terribly extensive.

Response to a rant: Is OpenGL being abandoned?

It is no secret that the gaming industry is dominated by Windows platforms, and the API of choice is DirectX. There are some staunch followers of the OpenGL way of life, but their numbers seem to be dwindling rather rapidly. I have read a lot of blogs claiming that OpenGL is better than DirectX, or vice versa. I even got a mail (or two) from an unknown person recently claiming that my War of the graphic APIs post was rather biased towards DirectX. Let me assure anyone and everyone that this is certainly not the case. I am supporting both APIs in my engine, and let me say this again: “Both APIs are functionally equivalent. It is not the API that determines the performance of a game, but rather the underlying hardware.”

The email further went on to show some in-game screenshots to claim that OpenGL-based games looked better than DirectX games. Now, anybody who has worked on a game using either API knows how much of a folly this is. “It is not the API that determines how a game looks; it is the artists who create the content and the engine programmers who provide the technology (like shaders, level builders and script support) to those artists that determine how the game looks.” Game design also plays an important role. In any case, it’s not the API. The mail didn’t have a valid sender, so I could not reply with my response. In any case: dude, you could have just posted a comment and I would have been glad to respond.

No, OpenGL is not dying out. However, I am sorry to say, OpenGL is falling behind. It is being increasingly abandoned in favor of DirectX. Don’t believe me? OK, read this; at least believe him. I think I made it pretty clear in my earlier post why that is the case, so I am not going to outline the same points again.