I thoroughly enjoy delegating error-prone and time-consuming tasks to tools.
Code Editors
I have several years of experience with Vim and have written a small plugin for it. I have gathered plugins that improve on the basic editor, giving me quite a good IDE-like experience. The only features I really miss are refactoring tools.
Database Tools
In addition to the default administration tools, I use Vim with its dbext extension for query development; for bigger jobs I've used Aqua Data Studio.
IDEs
Most of my experience with IDEs is with Eclipse (PHP Development Tools) and its CVS, SVN and Git integrations, and with PHPStorm.
I have also used the Java and Scala development tools environments in Eclipse for writing small programs.
Several years ago at school I used MS Visual Studio with VB.NET for a 40K+ SLOC project. I have also used Visual Studio for C++ development and NetBeans for Java development, both for school work and for my own small applications.
VCS
My current choice of version control is Git and I'm also experienced with CVS and Subversion. I enjoy terse and informative commit comments, short-lived and often merged branches, and binary search over the version history when looking for bugs.
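The binary search I mention is `git bisect`. The sketch below builds a throwaway repository (all names and the "bug" are made up for the demo) so the whole hunt can be watched end to end:

```shell
# Build a disposable repo where commit 7 introduces a "bug",
# then let `git bisect run` find it automatically.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

for i in 1 2 3 4 5 6 7 8 9 10; do
    # From commit 7 onward the tree contains the bug marker.
    if [ "$i" -ge 7 ]; then echo BUG > app.txt; else echo ok > app.txt; fi
    git add app.txt
    git commit -qm "commit $i"
done

# The check script: exit 0 when the tree is good, non-zero when bad.
cat > check.sh <<'EOF'
#!/bin/sh
! grep -q BUG app.txt
EOF
chmod +x check.sh

# Mark HEAD bad and the root commit good, then drive the bisection.
git bisect start HEAD "$(git rev-list --max-parents=0 HEAD)" >/dev/null 2>&1
git bisect run ./check.sh >/dev/null 2>&1
first_bad=$(git log -1 --format=%s refs/bisect/bad)
git bisect reset >/dev/null 2>&1
echo "first bad commit: $first_bad"
```

In real work the check script runs the build or a failing test, and `git bisect run` narrows thousands of commits down to the culprit in a dozen steps.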
Web Tools
Although I occasionally use other browsers like Opera and IE for testing or experimenting, most of my web development work is done with Chrome and its Developer Tools or with Firefox and its Firebug add-on. I use these tools mostly for experimenting with and developing JavaScript, inspecting HTTP traffic, and inspecting and experimenting with HTML and CSS.
Lori, another add-on, gives a nice view of the time spent loading pages.
The Web Developer add-on makes a whole lot of things easier, including developing for smaller screens and testing pages with certain styles or JavaScript disabled. Responsive Design View is also a gem.
The Live HTTP Headers add-on is a good tool for discovering header-related problems, and the Tamper Data add-on allows editing outgoing request headers, which helps with exploratory testing.
Chrome also offers great remote debugging of mobile devices.
For more in-depth exploration I reach for TCP/IP tools like traceroute, dig, netcat and Wireshark.
For automation I use Selenium or command-line HTTP clients like wget and curl.
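Much of this HTTP poking scripts nicely with curl. A tiny offline sketch (a scratch file stands in for a remote URL, so nothing here hits the network) of the `-s`/`-o`/`-w` flags that make curl easy to drop into shell checks:

```shell
# A scratch file stands in for a remote resource so this runs offline.
tmp=$(mktemp)
printf 'hello' > "$tmp"

# -s: no progress bar; -w: print selected transfer variables after
# the request completes.
body=$(curl -s "file://$tmp")
size=$(curl -s -o /dev/null -w '%{size_download}' "file://$tmp")
echo "body=$body size=$size"

# Against a real server the same pattern reports status and timing:
#   curl -s -o /dev/null -w '%{http_code} %{time_total}\n' -L https://example.com/
```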
I am a proponent of mouse gestures: the FireGestures add-on speeds up handling the browser, and beyond its basic commands I've used it as a convenient way of running custom JavaScript in Firefox.
Test Frameworks
Of code-level test tools I have used mostly PHPUnit, Jasmine, Karma and JUnit and I have also created some basic tests with LuaUnit. Other QA tools I have used include Selenium, Burp Suite and Tamper Data.
Automated tests can be very useful, although discretion is needed to avoid sinking too much time into creating and maintaining overly extensive tests that aren't pulling their weight.
I've also found that Selenium performs well for automating applications to produce test data.
I occasionally use destructive test tools like Cynic which try to force the application into errors by simulating odd responses from integration points via TCP and HTTP.
Unix Utilities
I have been a Linux and Unix user for about ten years and have a good working knowledge of the system and its command-line tools. For anything more complex than a few filtering commands, though, I usually edit the data in a powerful editor like Vim or write a script in a general-purpose programming language.
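A typical few-command filter of the kind I mean, run here on a tiny generated "access log" (the data is made up) to rank the most frequent client IPs:

```shell
# Generate a three-line sample log so the pipeline runs anywhere.
log=$(mktemp)
printf '10.0.0.1 GET /\n10.0.0.2 GET /a\n10.0.0.1 GET /b\n' > "$log"

# The classic chain: project a field, sort, count duplicates,
# rank by count, take the top entry.
top=$(awk '{print $1}' "$log" | sort | uniq -c | sort -rn | head -n 1)
echo "$top"
```

Past two or three such stages the pipeline stops being readable, which is when the editor or a scripting language takes over.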
Debuggers
Debuggers are very helpful with hard problems, unfamiliar codebases, or both at once. It's also good practice to occasionally step through the code in a debugger to make sure it does what it's supposed to do.
Profiling
A profiler is the weapon of choice when there's enough talk about performance, when libraries are updated, or when the software has changed after performance optimizations were added. Things change, and optimizations that no longer optimize should be replaced with more maintainable code (if they were already maintainable, they wouldn't need special treatment and hence wouldn't be called optimizations). Profilers also give a view inside the software: not as thorough as a debugger's, but more streamlined and simpler.
Web Servers
I mostly use Nginx or Node.js with Express nowadays, but I have many years of experience with Apache HTTPD. It is quite astounding how much you can do with Apache, although much of that is often better left to the application. I'm reasonably knowledgeable about the HTTP protocol and have set up and tested a WebDAV server using Apache. I was involved in partially porting a PHP4 WebDAV library to PHP5 using the litmus test suite and cadaver, a command-line WebDAV client.
Task Management
I have worked with the following task management applications: Trac, Scarab, Mantis, Bugzilla, Trello and Jira with GreenHopper. For this purpose I prefer a feature-rich tool over a simple one. One of my pet peeves is cryptic bug reports. I err on the side of spending too much time writing a task rather than too little, confident that the reader will save time and the risk of misunderstanding goes down. The time saved could also be my own, and indeed has been a number of times.
Estimating and time tracking are also great features for this category of software, in my opinion.
Server OS
My exposure to servers consists mostly of Linux (Ubuntu, Red Hat, SuSE, Oracle Unbreakable) and SunOS, both as a developer and as a sysadmin. I have some experience using the Oracle VM virtualization solution to supply virtual servers for development.
I enjoy using server monitoring software, but for some cases I like the flexibility of scripting with SSH to get information to and from servers.
Workstations
I've used Windows XP, Mac OS X and Linux (Debian and CentOS, KDE 3 and 4) as a workstation for development. I have used VMware Workstation and VMware Fusion to run virtual machine guests.
Distributions I have used at home for at least a couple of months: Gentoo, Ubuntu, Debian, Fedora and Arch Linux.
Image Software
For image conversions, metadata retrieval and batch jobs I turn to ImageMagick. Tasks requiring modification of only a few images I usually do in GIMP.
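A minimal sketch of the kind of ImageMagick batch job I mean, using a generated test image so it runs anywhere (it assumes ImageMagick 6's `convert` and `identify` commands are on the PATH):

```shell
dir=$(mktemp -d)

# Generate a blank 80x60 canvas to stand in for a real photo.
convert -size 80x60 xc:red "$dir/photo.png"

# Metadata retrieval: width x height.
geom=$(identify -format '%wx%h' "$dir/photo.png")

# Batch job: thumbnail every PNG in the directory.
for f in "$dir"/*.png; do
    convert "$f" -resize 40x30 "${f%.png}-thumb.jpg"
done
thumb=$(identify -format '%wx%h' "$dir/photo-thumb.jpg")
echo "$geom -> $thumb"
```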
PDF
If ImageMagick chokes on a big PDF, or when more flexibility is needed, I have experience using PDFlib for the job.
Using Adobe Acrobat Pro I have maintained JavaScript batch jobs for updating forms on PDF files.
For tasks that require digging deeper inside a PDF, I use the available Linux tools or the Enfocus PitStop Pro plug-in for Adobe Acrobat.
Task Automation
When it comes to deployment, artifact generation and generally saving time on small things, I've used Capistrano, Ant and Grunt, but my favorite is the Unix shell because of its simplicity and versatility. A small shell script with a little SSH often goes a long way in a fraction of the time. The more dedicated tools certainly have their uses, though, when working in a team or when the script itself would grow too complex.
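A shell-with-a-little-SSH deploy, in the spirit of the Task Automation section above, can be sketched like this. The host, paths and artifact name are all hypothetical, and a dry-run guard keeps the script harmless until you mean it:

```shell
#!/bin/sh
# Sketch of a push-style deploy; every name here is a placeholder.
set -eu
host="deploy@app1.example.com"               # hypothetical target
release="/srv/app/releases/$(date +%Y%m%d%H%M%S)"
DRY_RUN=${DRY_RUN:-1}                        # default to dry-run

run() {
    # Print the command in dry-run mode, execute it otherwise.
    if [ "$DRY_RUN" = 1 ]; then echo "would run: $*"; else "$@"; fi
}

run tar -czf /tmp/app.tar.gz -C ./build .
run scp -q /tmp/app.tar.gz "$host:/tmp/"
run ssh "$host" "mkdir -p $release \
    && tar -xzf /tmp/app.tar.gz -C $release \
    && ln -sfn $release /srv/app/current"
```

Run it with DRY_RUN=0 once the printed plan looks right; when a script like this grows much beyond a screenful, Capistrano or a CI pipeline starts to earn its keep.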