Busy busy busy…

Damn, it’s been the longest time ever. No, I haven’t forgotten about writing, but I’ve been involved with a lot of work of late.

  • Game Art integration.
  • Testing.
  • Pushing 3 other projects (2 game related and 1 other misc project).
  • Plus one new project.

Very little time, too much to do. 🙂

It’s getting a bit better, so hopefully you will hear more of my ‘self-centered’, ‘worthless’ rants in the coming days 😀.

Norton 360 4.0.

I received a complimentary copy of Norton 360 a couple of weeks back, but only managed to install it now. Let me start off by saying, I am pretty impressed with the number of products integrated within the Norton 360 package. For a package this large, the installation was pretty silent, quick and quite clean. Everything got installed correctly and I could start using the software in about 5 mins flat. I tried almost everything and found nothing I could really complain about. Most things are pretty straightforward, except maybe the firewall settings, where you do have to turn to the application help if you want to customize things. However, even the firewall was configured correctly from the word go. A full thumbs-up for that. The virus/adware/spyware scans are also pretty good, albeit a bit slow.

I personally never used any Norton product before this one. However, Norton products used to be installed on machines where I used to work, and the common complaint with them was, “They seem to hog computer resources and are really slow”. Well, I can’t say that this version is a resource hog, but it’s still on the slower side. The problem I found with 360 is a large number of page faults, and that could explain why things seem to slow down. I don’t know, my machine has 2 Gigs of memory which may not be a lot, but 2,500,000 page faults in 3 hours is way too much. I guess this must be because of all the checking that goes on while the software is running. The application, however, has a surprisingly low memory footprint.
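If you want to see the same counters for your own process, here is a minimal sketch using Python’s standard-library `resource` module. Note the caveat: `resource` is POSIX-only, so this won’t run on the Windows box Norton was installed on (there, Task Manager’s “Page Faults” column reports the equivalent statistic).

```python
import resource

# Page-fault counters for the current process, as maintained by the OS
# (POSIX only; on Windows, Task Manager shows the same statistic).
# ru_minflt: minor (soft) faults, serviced without any disk I/O.
# ru_majflt: major (hard) faults, which hit the disk and are the ones
#            that actually make a machine feel slow.
usage = resource.getrusage(resource.RUSAGE_SELF)
print(f"minor (soft) page faults: {usage.ru_minflt}")
print(f"major (hard) page faults: {usage.ru_majflt}")
```

A count in the millions over a few hours, as observed above, is dominated by minor faults; it’s the major faults that hurt.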

The software integrates with internet browsers (IE 6 and higher, and Firefox 3.0 and later) to block phishing websites. I, however, am having trouble with the IE integration. Since I installed the software, my IE sometimes becomes unresponsive and sometimes takes ages to start up and load. Firefox integration is pretty good, but I found “Norton site safety” marking some hacker/warez and illegal sites as “safe”. This is clearly a lapse in the software, especially on sites that are known for malware/spyware/viruses and/or phishing. Some were even marked with “transaction protection” and “privacy protection”. At the very least such sites should have been marked as “unknown”.

I can’t comment on the backup system for a simple reason: I have an elaborate backup system of my own for my projects and I don’t want to mess that up. But make no mistake about it, a backup system is integral to any good security solution, and Norton 360 does provide one.

Lastly, Norton 360 also has a module that will tune up your system, clean redundant and temporary files, and optimize your disks for performance. This is an added bonus, and though these things don’t particularly fall into the category of system security, they are probably equally important.

I think Norton 360 is a good, solid, all-round package focused on the security of a system. It does everything that should be done to keep your system safe, and more. The team behind the product has taken into consideration every aspect of security, including having a backup system in place. If worse comes to worst, you have the option of restoring your data from an online source. Having said that, the product did seem a bit slow while scanning. Still, for an average user Norton 360 is a good solution.

Things I liked :-

  • Comprehensive package for your computer security. Good integration of products.
  • A lot of focus on multiple levels and aspects of security including backups. Covers mostly everything you need to keep yourself safe and/or recover from a malware/security related attack.
  • Good support.
  • Easy installation of a complex security solution.

Things I found that can be improved :-

  • Not the fastest around, a bit on the slower side. Could have been much faster.
  • Parts of the user interface could be daunting for a non-technical user. Some configurations could be complex for some users.
  • Does a lot of checking, giving the user a feeling that the system is slowing down.
  • Some warez and dangerous sites were marked as safe.
  • Browser integrations could be improved.
  • A bit pricey.

General tips to protect yourself from malware/spyware/viruses/phishing. (These do not specifically apply to Norton 360)

  • Choose the correct browser and plugins. Firefox has Adblock Plus/WOT/NoScript and a host of other plugins that can reduce the number of unwanted scripts/ads that run on webpages. This has 2 advantages: a) they will make your browsing faster by reducing the number of bloated ads and flash scripts, and b) they automatically reduce the risk of running malicious scripts on unfriendly websites.
  • Avoid warez and illegal download sites like the plague. The No. 1 reason for getting malware on your desktop is visiting and downloading from such sites.
  • Never give out your password, credit card No., or for that matter any personal information to anyone on the internet or on the phone, period! This may sound like stating the obvious, but you would be surprised how easily people are fooled into giving away their personal information. For example, most people don’t think twice about handing over their email address along with their email password when signing up on some social networking site. What if this information is used for identity theft? I am not saying it will happen; but it could happen! Remember those messages you get in your mail: “AFriendOfYours is now using SomeSocialNetworkingSite.com, come and join him/her and be a part of the community!” Most sites like these will ask for your email address with your email password (to get access to your email) so they can automatically connect and invite your friends. These sites will go through your email and build a profile on you, including your habits, your friends, where you go and what you do online. God forbid, if you do online transactions, or have your bank statements emailed to you, then nothing prevents them from knowing all about your finances. This is how phishing takes place.
  • Update and deep scan (antivirus and adware) your system at least once a week. A good antivirus/adware/spyware solution will auto-update regularly.
  • If you are using Windows never turn off the firewall. With XP (and higher) there is a built-in firewall. Norton and others have other more elaborate solutions. Use them, and don’t turn them off under the pretext of faster browsing speed. Firewalls rarely affect browsing speeds. On systems other than Windows, it’s always good to have a firewall.
  • Apply updates to your system regularly.
  • Avoid using your workplace computer for private stuff. Remember, computers at your workplace can be, and often are, monitored. Every key you press can be logged by a key logger, and some of these systems are extremely sophisticated and are actively used by organizations to monitor employees.
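As a small companion to the firewall tip above, here is a sketch, using only Python’s standard-library `socket` module, for checking whether a given TCP port on a host accepts connections. It’s a quick way to verify that a port you believe your firewall blocks really is unreachable. The host and port in the example are placeholders; substitute your own.

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection handles name resolution and the timeout for us.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, filtered, timed out, or unreachable all land here.
        return False

# Example with placeholder host/port: is anything answering locally on 80?
print(port_open("127.0.0.1", 80))
```

Note that a firewall that silently drops packets will make the check wait for the full timeout, whereas a plain closed port is usually refused instantly.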

Bash the Flash.

It’s almost fashionable to bash the flash these days. Everyone is doing it: the big, the small, the wise, and sometimes people who don’t seem to fully understand the argument. For a technology that has been around for almost 15 years, and was probably the only platform capable of delivering rich web content for the better part of that time, some criticisms may sound a bit too harsh — or, are they? Yes, some of it is indeed true. Flash applications have been known to slow a brand new quad-core machine to a crawl while doing nothing more than streaming a simple video. There has been more than one instance where the entire browser froze up because flash hogged every available resource. But, before we go flash bashing, let’s look at why we are so dependent on this technology, and why, after 15 years of loyal service, flash has suddenly become such a thorn that everyone likes to crib about it.

When flash first arrived on the scene, it was this cool new technology wherein you could program interactive webpages, much to the delight of web designers. But as the world would soon realize, there was a downside to depending too heavily on this new technology. In those days fast internet was a luxury of only a few, and flash content on websites would take ages to download and display on dial-up connections (yes, I was in the university back then and couldn’t afford a broadband connection 😀). So flash adoption was initially limited. But as the internet grew and speeds increased, more and more websites started adopting flash. The logical next step for a rich content platform was games. This was exploited by flash game developers, and we began to see more and more flash games being developed. Flash enjoyed renewed interest, and web applications too started being made with flash.

So why flash? Simply because there was no other. If you wanted to make a rich web application, there was no better solution. True, some Javascript workarounds existed, but until recently these were pretty limited when compared to what flash could achieve. But there was an even bigger reason why flash got adopted, and is now on almost every computer system that connects to the internet and is used to browse the web, and that is — streaming video. Yes, there were other competing formats, but most were closed ones and flash was favored over those. Only now does the HTML 5.0 standard talk about streaming video and sound. This HTML revision should have happened 10 years ago; there is no logic to the delay, but it is what it is, and flash was and still is the leading tech/plugin for watching streaming videos on the Internet. The story doesn’t end there; there is still a debate about which video codec/standard to use for HTML 5.0, and patent-encumbered video technologies mean this debate will last a while longer. Also, most streaming media sites deliver content in flash (flv) format and haven’t yet switched to HTML 5.0. So before you go blaming flash for all your browser troubles, think about it — do you have a choice? Well, as it stands today, not quite.
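For concreteness, the HTML 5.0 alternative alluded to above looks like this. The file names are placeholders, and (given the codec debate) a site would offer multiple sources; the content inside the tag is the fallback, which is exactly where a flash player embed would go today:

```html
<!-- Placeholder file names. The browser plays the first source it
     supports; if it supports none, it renders the inner fallback
     content (typically a flash-based player). -->
<video width="640" height="360" controls>
  <source src="clip.mp4" type="video/mp4">
  <source src="clip.ogv" type="video/ogg">
  Your browser does not support HTML 5 video.
</video>
```

No plugin, no separate runtime; which is precisely why the codec question matters so much.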

It’s true that flash has its problems. But these problems were there before, so what’s changed now? The answer is twofold. A) People have started watching more streaming content online, and as a result inadvertently use the flash plugin more. B) A new technology has silently crept up on flash — and that is your humble Javascript. As Javascript got faster, websites got faster as well. Things which were possible with flash could also be done with Javascript. Developers found new ways of writing rich web content using Javascript (AJAX) and slowly started avoiding flash by using equivalent Javascript functionality. Mind you, I am not saying Javascript is a replacement for flash; I am saying you can now do so much more with it than you could earlier. As a result, an obvious comparison with flash was and is being made. Javascript continues to grow, and with the integration of technologies like WebGL it has rapidly narrowed the gap with flash and may even surpass it in some areas.

People blame flash, but it’s not flash that is the problem, it’s the implementation. The flash plugin and its integration with the browser are what cause pains like system slowdowns and browser crashes. Flash today is JIT compiled, much like Javascript, so there are no problems there. Contrary to what some believe, ActionScript is a dialect of ECMAScript much like JavaScript, and is not inferior in any way to the latter.

As I see it, the problem with flash is an engineering one. ActionScript and flash aren’t deficient or outdated in any way, as some would suggest. The fault lies not in the technology that is flash, but in the implementation that is the plugin. If that were cured, flash wouldn’t be at all bad.


Avatar.

It’s been some time since the movie was released, but I only managed to watch Avatar yesterday. OK, before I proceed let me put up a “spoiler alert”: if you haven’t seen it, go watch the movie and then read the rest of the entry ;-).

I would describe the movie as, “great graphics, superbly imaginative environments, great blending of live actors and CG, but a rather bland and ordinary storyline”. The movie is graphics galore, but the story itself is rather dull and predictable. Throughout the movie you can almost sense what’s going to happen next, and that’s exactly what happens — leaving little room for mystery. I am a James Cameron fan (who isn’t?), but in most of his movies he does find a subtle and uncanny way to weave a wacky (but believable) story around the whole action-movie concept. Unfortunately, Avatar doesn’t quite have all of that.

The whole dull-story thing, however, can be easily forgiven, given that most of the time is spent admiring the visual effects, graphics and the stunningly beautiful environments modern CG can achieve. I found the movie rather enjoyable. I guess Avatar is natural fodder for a 3D graphics geek like myself, but apart from that the movie does an excellent job at handling, or rather blending, graphics with real-life actors. You would be forgiven for mistaking CG for reality, especially when live human actors interact with CG actors and the environment. I was doubly interested in how the environment behaved in response to the human actors’ actions. The most difficult part of compositing a 3D CG environment with real actors is the interaction of the CG elements with the actors. The subtle swish of the grass when an actor runs, the rustle of the leaves when an actor goes through a bush: these are the small things that make a CG scene believable. My hunch is all of that was done and captured in real time in a 3D studio environment.

The truly spectacular achievement of the movie/technology, and the one that impressed me the most, is the facial animation. Any computer-modeled facial animation is bound to be hit by the uncanny-valley effect, but in Avatar the facial expressions, though not flawless, do turn a page (no, they are not fully human-like, but they are definitely believable). The technological achievement is commendable, and some critical reviews don’t do justice to what is a pretty good effort on the part of the CG team. I know how hard it is to build a seamless facial animation system (I am working on one myself), and the movie’s mocap technology for simulating facial movements does bring in a lot of realism. A lot of ideas there for future gaming projects.

I am pretty impressed with the movie as a whole. Yes, it has a linear and ordinary story, but it does push the envelope in CG technology. The graphics are stunning, but what is more interesting is the composition of graphics and human actors. For me the facial animation was probably the best part. Capturing a live actor’s facial movements on a CG character is not a new idea, but Avatar does it very well.

Happy new year…

Best wishes and a happy new year to all.

Larrabee isn’t coming just yet.

Hmm… I am disappointed (story). No, I wasn’t expecting the first versions of the technology to be a game changer in graphics or, for that matter, in the HPC or compute world, but I was very, very interested in knowing more about the Larrabee technology. Thus far Intel has only thrown out “bits and pieces” about their new tech, and that in no way gives one a clear picture. No, Intel hasn’t given up on the technology, but it seems to have postponed the release in its current form because the performance targets weren’t being met. Ironically, Intel had initially claimed that Larrabee chips would stand up to discrete solutions from ATI and Nvidia. However, it looks like the tech still needs some work to measure up to that.

At this point all we can do is speculate, but the fact is — building a chip that can do HPC and compute and graphics, and have drivers/software/optimizing compilers working perfectly, is a tall order, even for a giant like Intel. I am sure they have done most of it right, but most of it isn’t all of it, and that’s probably the reason we are seeing the launch canceled in its current form.

Many-core computing is the next big thing, and technologies like Larrabee are the future. I am disappointed because, more than the tech itself, Larrabee would have been a window into how things are shaping up. How does software development scale to the future? Would the new optimizing compilers allow the use of current software methods? Or does it mean a radical shift in the way software systems are built? How would the new tech address task parallelism? I guess we will have to wait a while longer to see how these (and I am sure many more) questions are answered.

Hardware snags.

OK, I have been off the radar for too long. Unfortunately I have been plagued by some hardware problems and have had limited access to the Internet. Most of it is resolved now, so it’s back to normal from today.

DirectX 11 hardware is here.

It’s almost time for Windows 7, and along with that the first lot of DirectX 11 class hardware has started to appear. This time the first off the block was, surprise surprise, ATI. The 5800 series cards were released a couple of days ago, and there are already impressive reviews of the new cards all around. I am sure it won’t be long before Nvidia, which has been uncannily silent, comes out with their line-up. So it is safe to assume that there will be DirectX 11 class hardware on the shelves going into the Windows 7 release (Windows 7 RC already has DX 11 support, which will also be available for Vista soon). It will, however, still take a few weeks more for the initial euphoria to settle, and we should see prices of the cards drop around the holiday season; that is probably when I will go in for an upgrade as well. I have been running the HD 4850 for some time now, and thus far it’s proving sufficient, not only for gaming but also for my programming needs. The HD 4850 has been surprisingly good given its price point, and one would expect the same from the 5800 series given the already positive reviews.

There are a couple of things in favour of DirectX 11. The first is the API itself. DirectX 11 offers more than just a simple evolutionary upgrade (more here). DirectX 10 was mostly a non-event. The enormous success and longevity of XP and the Xbox 360 ensured that version 9 of the API far outlived most expectations (and it will probably continue to live for some time to come). The story of DirectX 10 is also intrinsically connected to Vista. Vista’s low adoption meant not enough people were running a DirectX 10 capable software platform, which Microsoft stubbornly refused to port to XP for whatever reason. Even though 10-class hardware was available during Vista’s reign, nagging hardware issues and poorly implemented drivers meant DirectX 10 never really caught on like 9 did.

That brings us to the second point in favour of DirectX 11 — Windows 7. XP is old, and I mean seriously old. I am still running a 2004 copy of XP on my machine, and though it’s doing its job admirably, it’s due for an upgrade. Windows 7 seems to have gotten over those little annoying quirks of Vista which we hated and shouted so much about. My hunch is most people who have stuck with XP will upgrade too. Maybe not on immediate release, but 2-3 months down the line, when things settle in, after those initial bugs have been addressed and more reviews of the OS come out, 7 should slowly see wider adoption. With Vista it seemed like things were rushed and hyped up. In contrast, Microsoft has been careful with Windows 7. The RC of Windows 7 has been somewhat of a “soft launch”, and though I haven’t had the chance to try it out myself, it would seem (from reviews and from what people are saying) that Windows 7 is much better off than Vista was. So it’s fair to assume that 7 will catch on more than Vista did, and in the process DirectX 11 will “get on” to the Desktop.

Does it mean DirectX 11 will be the de facto API for coming games? For that, let’s look at the games developed today. Yes, most games developed today are still developed primarily for DirectX 9.0 class hardware. Why? Consoles, that’s why. You do see AAA titles advertise DirectX 10 and 10.1 support, but even those games are developed with DirectX 9.0 class hardware in mind. Yes, some features here and there, usually eye-candy to impress your overzealous graphics fanboi, can be found, but the engine and tech itself is designed for platform compatibility. Which ironically means not all the features of the newer DirectX versions are exploited. As I said before, DirectX 11 is more than just a simple upgrade to the API; it’s also a new way to do things. But since the older hardware still has to be supported, compromises have to be made. There are probably no AAA titles exclusively for the PC, so even if PCs all around were to have DirectX 11 support, it’s not until the consoles catch up that you will see all the cool things the newer API has to offer come to the fore.

There is little doubt that version 11 will make games look better. But there is so much more to the API than just improving the looks of games. Many of the features in the new API mirror hardware changes that have taken place, like the move away from the fixed-function pipeline and the evolution of GPUs into massively parallel compute devices. All this means DirectX 11 is an API to take seriously. But how quickly will games start using all these features? I guess only time will tell.

Excuse me, but where’s the menu?

It’s interesting, but our good old menu-bar, yes, the one having “File Edit View …” on it, is slowly disappearing from some of the most popular software apps. Today I happened to wander across to the Firefox proposed UI design page, and the thing I immediately noticed was the absence of the menu-bar in the new designs. Good riddance! No, seriously, how often do you use the browser menu-bar? For me the browser is the second most frequently used application, and 99% of the time I never bother with the menu-bar. I, however, would love every millimetre of screen space while browsing and am more than happy to see the menu-bar go.

There have been some subtle changes in UIs over the past couple of years. No, I am not talking about glass effects and pretty pictures and icons, I am talking about design. Though not the first, it was MS Office that got the world’s attention by replacing the menu-bar with the “Ribbon Control”. A bold step, but the idea was to combine the tool-bar and menu into a single tabbed interface. To be honest, yes, the ribbon idea is cool, maybe not innovative, but definitely cool. The interface had mixed reactions initially, but as people got more and more familiar with it, things started to get comfortable, and soon other applications followed suit. At first glance, having no menu-bar is disorientating for a long-time computer user. I did find it a bit “unusual” to navigate my way around Office 2007 (the one and only time I used it). On the other hand, I never missed the menu-bar (even once) while using the Chrome browser. I guess the whole idea needs a bit of getting used to, but apart from that, I really did like the concept of combining the menus and tools into one single compact unit. Makes sense. Tool-bars, after all, only complement the menus. It is therefore logical that both be combined into one.

I personally feel this use of the tab interface is a step in the right direction when it comes to exposing complex UI. The resurgence of the tab control, and its innovative use in hiding complexity whilst combining two separate controls (the menu-bar and the tool-bar) into a single entity serving both roles, is intuitive and resourceful. A similar usage of the tab control has also found its way into the mobile device world, where it is going to be the mainstay of the Moblin Linux platform. That is interesting indeed. So will we see our “Desktop Application Panels” and “Start Menus”, which are basically menus too, being replaced by tabs soon?

Alienware M17x – The fastest alien amongst laptops.

I was invited to the preview of the Alienware M17x, unveiled by Dell to cater to the high-end, hard-core gaming enthusiast. It was my first experience with the Alienware brand, though I have often read about other high-performance laptops from them. Dell is marketing the M17x as “The most powerful gaming laptop in the universe”. Well, that is probably correct; at least for now the machine is more than capable of pushing anything out there within its resolution limits. Alienware is known for its high-end machines, and this avatar in the Alienware series is no different in keeping with the brand image.

The M17x packs some heavy-duty, top-of-the-line stuff in its guts, probably far more than what is required for a gaming notebook. Ergonomically the M17x is designed to please the hard-core gamer, and also to make a style statement. Complete with flashing lights, a multi-colored keyboard and scintillating sound, every effort has been made to attract your and everybody else’s attention. The whole laptop is designed to look different and will stand out from anything else in the room. If you want to show off your “new gaming laptop”, then the M17x is probably what you should be looking at.

No high-end gaming rig can be complete without a heavy-duty GPU, or should I say GPUs (plural), 2 in fact. The M17x features either dual-SLI Nvidia GTX 260M or GTX 280M GPUs. I would suggest the 280M. (Well, if you are going for a high-end gaming system, you might as well get a top-of-the-line GPU.) The 280M is, as of today, the highest-performing GPU for notebooks. The laptop comes fitted with the Intel Core 2 Extreme mobile processor, and you have the option of choosing a Dual- or Quad-core CPU. The choice of CPU will depend on the type of games played. Games like Oblivion and Fallout 3 are more CPU intensive, since a lot of data is streamed in real time, but in any case I don’t think there should be too many problems even with a Dual-core CPU, since most games won’t go CPU-bound with a powerful GPU setup and fast 1333 MHz GDDR3 RAM. Again, if you are one to play at exceptionally high frame-rates and can’t tolerate even the slightest glitch, then by all means the Quad-core option is also available for the M17x.

While the M17x looks like a laptop, it actually is a mobile desktop. Weighing in at more than 5 kg, it isn’t something you can lug around everywhere you go. The weight must be due to the 2 GPUs, the heat-dissipation hardware and the large battery needed for such a performance monster. The M17x is without a doubt a high-performance gaming rig. I personally tried pushing Crysis at 1440×900 with full AA and AF, and there were no visible hiccups or slowdowns; the gameplay was flawless. I bet it will be able to push every game out there without breaking a sweat. Too bad it only has a 17″ screen. For this kind of performance the 17″ screen looks a tad small. I would have loved to see a larger, higher-resolution monitor, but I guess the compulsions of space and laptop dimensions made 17″ the largest choice.

The only real nag I found were the lights. At first glance you may (or may not) like the flashing lights and the multi-colored keyboard, but once you start using the machine, the lights are nothing more than a distraction, especially in fast-paced games. Well, you have the option to turn them off, so I guess that’s not too much of a bother. Also, the only real advantage of a dual-GPU setup is for systems with enormous resolutions (2560×1600) or for multi-monitor setups. SLI combos are excellent when rendering with very heavy fillrates, and even at its highest resolution the 17″ monitor isn’t quite in the league for dual SLI, considering that it already has the 280/260M. Having 2 GPUs instead of one also means the machine will generate quite a lot of heat, guzzle battery power and weigh substantially more than it would with a single GPU. However, the choice seems to have been made to please the hardest of the hardcore gamers out there. The M17x makes absolutely no compromises on performance, anywhere.

Well, there isn’t much more to say about the machine. My experience with the rig was limited, but it is interesting that Dell launched the Alienware brand in India. India is not known for its hardcore gaming enthusiasts, and you won’t find too many laptops made specifically for “the gamer”, at least nothing in the league of the M17x. Kudos to Dell for that.