Susheel's Blog

Yes I am Alive!

Posted on Nov.01, 2013, under General

Wow! It’s been a very long time, and yes, I am alive and well :) . It’s just that I have had a lot on my hands lately and haven’t been able to pay any attention to the blog. Hopefully I will be updating it more often from now on.


Visual Studio 2010 still too slow!

Posted on Jan.04, 2011, under Programming & Development

Aw sh*t! After two months of active use I can say for sure: Visual Studio 2010 has some serious problems with speed. The IDE hasn’t crashed on me, but it’s just too slow for any large project. I’ve tried everything possible, cleaned out the cache, recreated the IntelliSense files, but the program still keeps slowing down for no apparent reason. It’s really annoying when it suddenly goes into heavy disk access, to the point where even typing becomes impossible. I have racked my brains and fiddled with every tweak I could find on the web, without success. Since our entire project has now moved over to VS 2010, it’s too late to turn back :( !!

UPDATE: I finally managed to fix Visual Studio 2010. Please read the post Speeding up Visual Studio.

Update (not really): Neither I nor most of the programmers I know who use VS 2010 could solve this issue satisfactorily. I would recommend moving projects to Visual Studio 2012, which is much better and more stable than VS 2010. It has been out for a while now, and there is no point in continuing with VS 2010, which clearly has issues with speed and disk access.

Visual Studio 2012 :
Visual Studio 2012 Express :


A Happy New Year.

Posted on Dec.31, 2010, under General

A Happy New Year to all of you.

It’s been a rather slow and uneventful 2010, but here’s hoping for a much more exciting 2011 :-D .


A decade as a software engineer.

Posted on Dec.31, 2010, under General

Me in 2000

  • Favorite Programming Languages – C, C++.
  • Working with (programming languages) – C++.
  • Working with (platforms) – Win32, Linux.
  • Working on – Finance Software, Stock Market (software), Futures and Options (software).
  • Programming Languages – Pascal/Delphi, C, C++.
  • Experimenting with – OpenGL, 3D Graphics, Client-Server communication.

Me in 2010

  • Favorite Programming Languages – Python, Erlang, Haskell.
  • Working with (programming languages) – C++, Python, Lua, HLSL, GLSL.
  • Working with (platforms) – Win32, Linux, Mobile.
  • Working on – Game Development, Misc Finance Software.
  • Programming Languages – Pascal/Delphi, C, C++, Python, Erlang, Haskell, Lua, Javascript, HLSL, GLSL.
  • Experimenting with – Haskell, Erlang, Direct3D 11, HLSL/GLSL.

Hmm… not much of a change there. Surprising (…or not)! :-D


Over-patterning software design.

Posted on Dec.30, 2010, under Game design, General, Programming & Development

Ah! Design patterns! Yes, those seemingly magical concoctions of code that appear to solve all of the problems plaguing software design. So profound is their initial impact that the engineer begins to believe he/she has finally found mythical scrolls of wisdom bestowed by divine beings: after reading through them, every design problem can apparently be deconstructed automatically into a set of familiar design patterns. Using them seems to solve every challenge software engineering has to offer, and the engineer begins to believe that all he/she will ever need on the desk is a copy of those very patterns. Yes, there was a time when I was guilty of the very same thing.

There is also the misconception that patterns are drop-in replacements for traditional software design practices. It’s tempting to approach a design problem with the pre-packaged solution a pattern seems to offer: “Oh, we have a Composite, that means we need a Visitor for collaboration. So let’s use a Visitor then.” That was easy, but what was missed was the overhead of designing something as a Visitor. No one asked why a Visitor was needed, or whether it was needed at all. Often the only reason given for such design decisions is, “… because a design pattern says so.” That’s not what design patterns advocate at all. Excessive use of design patterns while designing software inadvertently leads to over-engineering.
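To make the point concrete, here is a minimal C++ sketch (all names are hypothetical, not from any particular codebase) of the Composite case above. The “over-patterned” route would bolt on a ShapeVisitor hierarchy with accept()/visit() plumbing just to sum areas; a plain virtual method on the Composite often does the same job with far less machinery:

```cpp
#include <memory>
#include <vector>

// A plain virtual function instead of a Visitor: each node knows how to
// report its own area, and the Composite simply aggregates its children.
struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

struct Rect : Shape {
    double w, h;
    Rect(double w, double h) : w(w), h(h) {}
    double area() const override { return w * h; }
};

struct Group : Shape {  // the Composite node
    std::vector<std::unique_ptr<Shape>> children;
    double area() const override {
        double sum = 0.0;
        for (const auto& c : children) sum += c->area();
        return sum;
    }
};
```

A Visitor would earn its keep only if we expected to keep adding new operations over a stable set of node types; for a single operation like this, the indirection is pure engineering cost.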

This contradicts the popular perception that patterns were created to address the most commonly occurring design problems. Yes, that is true, and no, I am not trying to be a design-pattern heretic and declare that patterns are useless. Patterns are in fact very useful when applied correctly. It is true that most software designs can be broken down into sub-designs which can collectively be solved using a combination of design patterns. But just because they can be doesn’t mean they have to be. A designer well versed in pattern use can quickly find adaptable patterns for most design problems, and can probably get them to work together if he or she understands the modalities of pattern behavior. So there is a dichotomy here: design patterns lead to over-engineering, and design patterns are useful. Which is it then?

The truth lies somewhere in between. Most problems with “over-patterning” begin when there is an overbearing urge on the part of the designer to adapt his/her design, and sometimes downright bend it, to fit a design pattern. Just because a pattern fits or solves a problem doesn’t mean it has to be used. Loading a software design with patterns is a mistake. One must remember that patterns add cost, and by cost I mean engineering cost. Strange: an engineering solution adding an engineering cost? But that’s how it is with any engineering problem in any domain. Ironically, if you refer to each pattern’s description you will often see these costs clearly pointed out by the authors. Call them disadvantages, limitations, issues or whatever name you like, but the reality is that they aren’t trivial. An oversight, or a failure to understand their implications for the overall design of a software system, is what leads to overly complex, over-engineered solutions.

An excellent article to read in this regard is Joshua Kerievsky’s Stop Over-engineering.


Speeding up Visual Studio 2010 on XP and Vista.

Posted on Nov.09, 2010, under Programming & Development

A quick press: I was running Visual Studio Express 2010 on XP a couple of days back and found it to be rather slow. IntelliSense was performing horribly and the entire system was sluggish, with a ridiculous amount of disk access, almost to the point where I had to physically shut the system down using the power switch. I initially thought it was an install problem, but ironically realized it wasn’t after losing another half hour on a reinstall. After googling around (which I should have done earlier) I found that other people had similar problems, and the solution is rather simple: you just need to update the Automation API to version 3.0. :-( Windows 7 already has the latest API and doesn’t have this problem.

UPDATE: Another tip from a friend: apparently you can speed things up even more by using the /SafeMode switch. Unfortunately it may create problems with any third-party plugins you have installed in Visual Studio. For Visual Studio Express, which doesn’t support plugins, you can safely try this option. I must say, however, that I didn’t find much of a difference myself on my current project.

UPDATE 2: Apparently all my problems were solved after following steps 1, 2, 3 and 4 mentioned here.

Update (not really): Neither I nor most of the programmers I know who use VS 2010 could solve this issue satisfactorily. I would recommend moving projects to Visual Studio 2012, which is much better and more stable than VS 2010. It has been out for a while now, and there is no point in continuing with VS 2010, which clearly has issues with speed and disk access.

Visual Studio 2012 :
Visual Studio 2012 Express


Direct3D 10/11 coming to Linux … What about games?

Posted on Sep.24, 2010, under DirectX, Game development, Hardware, OpenGL, Programming & Development

No, April 1st is still more than six months away, and yes, you heard me right: Direct3D versions 10 and 11 are indeed coming to Linux. How is this even possible? Well, it is possible since nouveau moved to Gallium3D, which allows the Direct3D API (actually any API) to be exposed via a front end called a state tracker. Interestingly (and there seems to be a lot of confusion about this on public forums), Direct3D will be a native API under Gallium, much like OpenGL is currently. It won’t be something that emulates Direct3D via wrappers around OpenGL, meaning you will be able to write and compile Direct3D code directly on Linux or BSD systems that support the nouveau driver. Initially I was a bit skeptical of this approach, since the Direct3D API is integrated with the Win32 API, but the author seems to have solved this by using Wine headers. I don’t know the pitfalls (if any) of such an approach, but it seems to have worked for him, and it would seem a logical path to take (instead of breaking API compatibility). He clearly outlines the motivation behind the Direct3D port, and kudos to him for doing something that was all but inevitable given the no-show of Longs Peak.

Naturally, a native Direct3D implementation will allow game developers to write cross-platform code, and even allow existing engines/games that use Direct3D 10 and higher to be ported to platforms that have a Gallium driver. W00t! This is amazing, almost too good to be true, isn’t it? But before we gamers jump for joy, a few things have to fall into place before Direct3D on Linux can get up and running. First and foremost is support: hardware vendors like Nvidia and AMD must support Gallium in their drivers, or OSS drivers must be written (and are being written) to take their place. This is paramount, since without such an interface no front-end API (Direct3D or OpenGL) will be able to use hardware acceleration via Gallium. Second, and more importantly, the folks at Redmond must allow such an implementation of their Direct3D API. An API by itself can’t be copyrighted, and the author seems to have steered clear of any Microsoft code, so in theory this shouldn’t be a problem. But then again I am no legal eagle, so I can’t really say anything on that front. There have been rumors of patents on sections of Direct3D. I am not sure what that means, or whether it is even possible to patent sections of an API/library, but things could get messy if Microsoft were to place a cease-and-desist on this new development. I doubt it would happen, but you never know.

I have to agree, having Direct3D as a native API via Gallium does open up a lot of possibilities for OSS platforms, which have severely lacked games. Accelerated graphics on systems other than Windows have had little choice up until now, with OpenGL being the only real option. But does this really mean that all the games being developed will be ported to Linux and other OSS platforms? That’s an interesting question, and the answer isn’t quite that simple. Let’s look at the macro picture of the industry. For AAA games the PC platform isn’t a priority: most (maybe all) AAA games today are made with consoles in mind. Yes, there may be a PC port, but the consoles are the main priority. Most (if not all) gamers who play AAA games on the PC spend a fair amount on their systems, and most of them already have Windows as their main OS. Some do have *NIX systems, but even those few keep a Windows partition around specifically for games. Porting any software to a new platform isn’t a trivial task. Even with the best coding practices and methods, it requires a lot of resources, which aren’t free. Everything from coding, testing and maintaining build setups to writing install scripts takes time and money. For a AAA game, or for that matter any game or software, a port to a new platform has to show a robust ROI (return on investment). That’s where the crux of the problem lies: there aren’t that many *NIX gamers out there, and if there are, the big studios aren’t seeing them!

Then there are casual games, which are also a big market. Casual games represent a very different kind of audience. A typical casual gamer is a non-technical person who doesn’t even understand what a hardware driver is, let alone jargon like Gallium, Direct3D, OpenGL or, for that matter, Linux. Most casual gamers have nothing but a moderately powerful laptop with an on-board Intel graphics chip, which came with Windows pre-installed. This is the kind of player who expects the game to install and run with a single click. They don’t understand driver updates or DirectX versions. For them it matters little which API is better or worse, or which platform supports which API. Apart from these two broad segments, there is a whole set of players who will play radical indie games, and this is probably where Linux ports have found some success. This gamer is the tech-savvy computer geek who runs Linux as his/her primary system and isn’t afraid to fire up the console now and then. I must say, some radical indie games have found success in this area. But these games are far from cutting edge. They may be very good games, but you don’t expect Crysis-like graphics from them, and it matters little what API is used, or whether the underlying API runs 5% slower, when your game never drops below the 30 FPS barrier.

There have been lots of debates about OpenGL vs. Direct3D; I’ll refrain from going into that here. However, having a choice of accelerated graphics APIs on platforms other than Windows is definitely good all around. Direct3D versions 10 and 11 are well-designed APIs, closely tied to current-generation hardware. But whether all this will translate into more game ports to Linux and the BSDs is still an open question. The community, as always, will play a vital role, and only time will tell how things pan out.


Can parallel processing really cut it?

Posted on Jun.13, 2010, under Hardware, Programming & Development, Software

When Larrabee was first delayed and then “postponed”, most of us weren’t surprised (at least I wasn’t). Parallel computing, though advocated as a world saver, isn’t the easiest model to program for. Doing everything in “software” (graphics, HPC and all) might not be as easy as was anticipated. The cold hard reality is that languages like C++, Java and their derivatives (mostly the OOP ones) were never really designed for parallelism. A multi-threading-here and an asynchronous-there doesn’t really cut it. Using the full potential of parallel devices is very challenging indeed. Ironically, most of the code that runs today’s software isn’t geared for parallel computing at all. Neither are today’s programmers.
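A small C++ sketch of what even a “multi-threading-here” costs in practice (function name and thread count are my own, purely illustrative): to parallelize a trivial sum, the loop must be manually partitioned, each thread must accumulate into its own slot to avoid data races, and every thread must be joined. None of this comes for free from the language or from a plain for() loop.

```cpp
#include <algorithm>
#include <numeric>
#include <thread>
#include <vector>

// Parallel sum of a vector. Even this trivial case needs explicit
// partitioning, per-thread accumulators, and joins.
double parallel_sum(const std::vector<double>& v, unsigned nthreads) {
    std::vector<double> partial(nthreads, 0.0);
    std::vector<std::thread> workers;
    const std::size_t chunk = (v.size() + nthreads - 1) / nthreads;
    for (unsigned t = 0; t < nthreads; ++t) {
        workers.emplace_back([&, t] {
            const std::size_t begin = t * chunk;
            const std::size_t end = std::min(v.size(), begin + chunk);
            for (std::size_t i = begin; i < end; ++i)
                partial[t] += v[i];  // each thread writes only its own slot
        });
    }
    for (auto& w : workers) w.join();  // forget this and the program aborts
    return std::accumulate(partial.begin(), partial.end(), 0.0);
}
```

And this is the easy case: a perfectly data-parallel reduction with no shared mutable state. Real workloads with dependencies between iterations are where the model truly starts to hurt.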

Yet experts advocate a parallel computing model for the future. Is it easy to switch to? Is an innovation in hardware design, or a radical new compiler that optimizes away your for() loop, the real answer? A very interesting article to read (even if you are not into graphics and game programming) is :

Very rarely do I quote articles, but this one is really worth a read. Well written and well said.


Busy busy busy…

Posted on Jun.09, 2010, under General

Damn, it’s been the longest time ever. No, I haven’t forgotten about writing, but I’ve been involved with a lot of work of late.

  • Game Art integration.
  • Testing.
  • Pushing 3 other projects (2 game related and 1 other misc project).
  • Plus one new project.

Very little time, too much to do. :-)

It’s getting a bit better, so hopefully you will hear more of my ‘self-centered’ ‘worthless’ rants in the coming days :-D .


Norton 360 4.0.

Posted on Mar.23, 2010, under General

I received a complimentary copy of Norton 360 a couple of weeks back, but only managed to install it now. Let me start off by saying I am pretty impressed with the number of products integrated into the Norton 360 package. For a package this large, the installation was quiet, quick and quite clean. Everything got installed correctly and I could start using the software in about 5 minutes flat. I tried almost everything and found nothing I could really complain about. Most things are pretty straightforward, except maybe the firewall settings, where you do have to turn to the application help if you want to customize things. However, even the firewall was configured correctly from the word go. A full thumbs-up for that. The virus/adware/spyware scans are also pretty good, albeit a bit slow.

I had personally never used a Norton product before this one. However, Norton products used to be installed on machines where I worked, and the common complaint about them was, “They seem to hog computer resources and are really slow.” Well, I can’t say this version is a resource hog, but it is still on the slower side. The problem I found with 360 is a large number of page faults, which could explain why things seem to slow down. My machine has 2 GB of memory, which may not be a lot, but 2,500,000 page faults in 3 hours is way too much. I guess this must be because of all the checking that goes on while the software is running. The application itself, however, has a surprisingly low memory footprint.

The software integrates with internet browsers (IE 6 and higher, and Firefox 3.0 and later) to block phishing websites. I am, however, having trouble with the IE integration: since I installed the software, IE sometimes becomes unresponsive and sometimes takes ages to start up. The Firefox integration is pretty good, but I found “Norton site safety” marking some hacker/warez and illegal sites as “safe”. This is clearly a lapse in the software, especially on sites known for malware/spyware/viruses and/or phishing. Some were even marked with “transaction protection” and “privacy protection”. At the very least such sites should have been marked as “unknown”.

I can’t comment on the backup system, for a simple reason: I have an elaborate backup system of my own for my projects and I don’t want to mess that up. But make no mistake, a backup system is integral to any good security solution, and Norton 360 does provide one.

Lastly, Norton 360 also has a module that will tune up your system, clean redundant and temporary files, and optimize your disks for performance. This is an added bonus; though these things don’t particularly fall into the category of system security, they are probably equally important.

I think Norton 360 is a good, solid all-round package focused on system security. It does everything that should be done to keep your system safe, and more. The team behind the product has taken every aspect of security into consideration, including having a backup system in place: if worse comes to worst, you have the option of restoring your data from an online source. Having said that, the product did seem a bit slow while scanning. Still, for the average user Norton 360 is a good solution.

Things I liked:

  • Comprehensive package for your computer security. Good integration of products.
  • A lot of focus on multiple levels and aspects of security including backups. Covers mostly everything you need to keep yourself safe and/or recover from a malware/security related attack.
  • Good support.
  • Easy installation of a complex security solution.

Things I found that could be improved:

  • Not the fastest around; a bit on the slower side.
  • Parts of the user interface could be daunting for non-technical users, and some configuration could be complex.
  • Does a lot of background checking, giving the user the feeling that the system is slowing down.
  • Some warez and dangerous sites were marked as safe.
  • Browser integration could be improved.
  • A bit pricey.

General tips for protecting yourself from malware/spyware/viruses/phishing (these do not apply specifically to Norton 360):

  • Choose the right browser and plugins. Firefox has Adblock Plus, WOT, NoScript and a host of other plugins that can reduce the number of unwanted scripts/ads that run on webpages. This has two advantages: a) it makes browsing faster by cutting down on bloated ads and Flash scripts, and b) it automatically reduces the risk of running malicious scripts on unfriendly websites.
  • Avoid warez and illegal download sites like the plague. The No. 1 reason for getting malware on your desktop is visiting and downloading from such sites.
  • Never give out your password, credit card number, or for that matter any personal information, to anyone on the internet or on the phone. Period! This may sound like stating the obvious, but you would be surprised how easily people are fooled into giving away personal information. For example, most people don’t think twice about handing over their email address along with their email password when signing up for some social networking site. What if this information is used for identity theft? I am not saying it will happen, but it could! Remember those messages you get in your mail: “AFriendOfYours is now using, come and join him/her and be a part of the community!” Most such sites will ask for your email address and email password (to get access to your account) so they can automatically connect and invite your friends. These sites can go through your email and build a profile on you, including your habits, your friends, where you go and what you do online. God forbid, if you do online transactions or have your bank statements emailed to you, nothing prevents them from knowing all about your finances. This is how phishing takes place.
  • Update and deep scan (antivirus and adware) your system at least once a week. A good antivirus/adware/spyware solution will auto-update regularly.
  • If you are using Windows, never turn off the firewall. XP and later have a built-in firewall; Norton and others offer more elaborate solutions. Use them, and don’t turn them off under the pretext of faster browsing: firewalls rarely affect browsing speed. On systems other than Windows, it’s also always good to have a firewall.
  • Apply updates to your system regularly.
  • Avoid using your workplace computer for private stuff. Remember, computers at your workplace can be, and are, monitored. Every key you press can be logged by a key logger, and some of these systems are extremely sophisticated and actively used by organizations to monitor employees.
