When coding, I keep both hands on the keyboard and both eyes fixed on the code I’m writing. When I don’t know a keyboard shortcut, I reach for the mouse and, in doing so, look away from the main Visual Studio window. My hands stop typing. I’m not productive anymore. For me, being productive is the foundation of my developer happiness.
Loss of productivity = developer unhappiness
Another situation that sometimes occurs is when I’m in the zone, coding like I’m on steroids. We’ve all tried that. It’s the best and most fulfilling part of being a developer: highly productive, developer happiness at max. When I’m in the zone, nothing disturbs me. I don’t notice people entering or leaving the room, and sometimes not even the various status messages inside Visual Studio. Did the build just fail? What about the unit tests? I have no idea, and that could be problematic.
My eyes see what’s on the screen (input) and my hands do the typing (output). The problem is that I have only one pair of hands and one pair of eyes, and I’m using both. I’ve maxed out my IO.
The body parts useful for coding are a limited resource
Mother Nature did provide us with much higher IO bandwidth than what can be achieved using only eyes and hands. One of the unused channels is sound: the ears hear (input) and the voice speaks (output).
So why not try to increase our IO using sound? What if we could hear when the build breaks? What if we could tell our editor, using our voice, to format the document when we’ve forgotten the keyboard shortcut? It seems like a perfectly reasonable idea to me.
To experiment with this idea, I wrote two extensions for Visual Studio 2012.
Voice Commands
This extension lets you speak any command out loud. It recognizes the voice command and executes it in Visual Studio.
It really is impressive how far speech recognition has come since I started playing around with it in the early nineties. All credit goes to the researchers and developers who have kept evolving Windows Speech Recognition through all those years.
I started out pronouncing each word very clearly, expecting that would make the words easier to recognize. I found quite the opposite: you have to speak to it the way you would speak to any normal human being. It felt counterintuitive, but that’s how far speech recognition has come. The accuracy is astonishing.
Voice Commands starts listening when you press Alt+V. Then just speak any command, such as Format Document, Save All, Options, Toggle Bookmark, or Collapse To Definition.
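To give a feel for what happens under the hood, here’s a minimal sketch. It is not the actual Voice Commands source; it assumes the System.Speech recognizer and the DTE automation object, with a hard-coded phrase-to-command table and the Alt+V wiring left out.

```csharp
// Minimal sketch (not the actual Voice Commands source): map a few spoken
// phrases to Visual Studio commands using System.Speech and the DTE object.
using System.Collections.Generic;
using System.Linq;
using System.Speech.Recognition;
using EnvDTE;

public class VoiceCommandListener
{
    // Assumed phrase-to-command table; the real extension covers far more commands.
    private static readonly Dictionary<string, string> Commands = new Dictionary<string, string>
    {
        { "format document",        "Edit.FormatDocument" },
        { "save all",               "File.SaveAll" },
        { "collapse to definition", "Edit.CollapsetoDefinitions" }
    };

    private readonly DTE _dte;
    private readonly SpeechRecognitionEngine _recognizer = new SpeechRecognitionEngine();

    public VoiceCommandListener(DTE dte)
    {
        _dte = dte;

        // Restrict the grammar to the known phrases to keep recognition accurate.
        _recognizer.LoadGrammar(new Grammar(new GrammarBuilder(new Choices(Commands.Keys.ToArray()))));
        _recognizer.SetInputToDefaultAudioDevice();
        _recognizer.SpeechRecognized += OnSpeechRecognized;
    }

    // Invoked when the key binding fires: listen for a single phrase.
    public void Listen()
    {
        _recognizer.RecognizeAsync(RecognizeMode.Single);
    }

    private void OnSpeechRecognized(object sender, SpeechRecognizedEventArgs e)
    {
        string command;
        if (Commands.TryGetValue(e.Result.Text, out command))
        {
            _dte.ExecuteCommand(command); // runs the command just like the menu item would
        }
    }
}
```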
Download: VS Gallery
Source code: GitHub
Farticus
Because every app store needs a fart app – including the Visual Studio Gallery.
Granted, this is a joke. It’s a useful joke, though, because it solves my real problem of occasionally missing build errors by playing a sound every time the build breaks. With Farticus installed, I’ve never missed a build error.
Sayed and I built Farticus with a few goals in mind:
- It should be useful and solve a real problem
- It must be open source
- It should be easy for others to learn from the code
Don’t like the fart sounds? No problem. Fork the project and substitute the sounds with more appropriate ones; “Doh!” or “You can’t do that, Dave” comes to mind. Why not play a sound when unit tests fail as well? Fork the source code and release your own version, or send us a pull request and we’ll add it to Farticus.
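If you want a feel for how an extension like this hooks into the build before you fork it, here’s a minimal sketch. It is not Farticus’s actual code; it assumes the DTE automation object and a placeholder .wav path. The OnBuildDone event fires when a build finishes, and SolutionBuild.LastBuildInfo reports how many projects failed.

```csharp
// Minimal sketch (not Farticus's actual code): play a sound whenever a
// solution build finishes with at least one failed project.
using System.Media;
using EnvDTE;

public class BuildBreakSound
{
    private readonly DTE _dte;
    private readonly BuildEvents _buildEvents; // keep a reference so the COM event source stays alive

    public BuildBreakSound(DTE dte)
    {
        _dte = dte;
        _buildEvents = dte.Events.BuildEvents;
        _buildEvents.OnBuildDone += OnBuildDone;
    }

    private void OnBuildDone(vsBuildScope scope, vsBuildAction action)
    {
        // LastBuildInfo reports the number of projects that failed to build.
        if (_dte.Solution.SolutionBuild.LastBuildInfo > 0)
        {
            new SoundPlayer(@"Resources\fart.wav").Play(); // placeholder path
        }
    }
}
```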
Download: VS Gallery
Source code: GitHub
So there you have it, folks. Two extensions that add a natural user interface to Visual Studio and bring a few more body parts into our coding arsenal.