S Anand

Portable Apps

I’m totally hooked on portable apps now. You don’t need admin rights to install them. You can run them off a USB stick. They won’t make your machine slower. All the reasons not to install an application vanish.

PortableApps.com is a good starting point. For what it’s worth, here are my portable apps by category (most used on top).

Platforms

  • Firefox. If you’re using IE6, please die. Lack of admin access is no longer an excuse.
  • Cygwin brings UNIX commands to Windows.
  • Portable Ubuntu runs Ubuntu as a window in Windows.

Tiny utilities

  • GDI++ replaces the Windows font engine with a Mac-like rendering. Looks cool.
  • Clip is a command line tool that copies to clipboard. “dir | clip” copies the file listing to the clipboard. Outrageously useful.
  • PicPick takes screenshots of the screen, windows, regions, whatever. And you can edit them too.
  • uTorrent downloads torrents.
  • WinDiff compares two files and tells you the difference.
  • AlwaysOnTopMaker makes any window stay on top of other windows.
  • DiskTT tells you your hard disk (or USB stick) speed.
  • WinHTTrack downloads websites.
  • AllChars lets you type special characters like ½ by typing Alt-1-2 or “ by Alt-`-`. It’s shockingly intuitive.
  • Restoration lets you undelete permanently deleted files.
  • Windirstat tells you what’s taking up space on your disk.
  • Sysinternals is a bunch of system monitoring utilities.
  • Virtual CD-ROM mounts .ISO files. You can use .ISO files without burning them.
  • Autostitch stitches together photos to create panoramas.

Media

  • VLC plays any audio or video file.
  • TightVNC lets you log into other machines like a remote desktop.
  • Audacity lets you record and edit audio.
  • CamStudio lets you record video (screen capture).
  • VirtualDub lets you edit video.
  • MediaCoder converts audio and video from any format to another.
  • GIMP is like Photoshop. You can edit pictures.
  • Inkscape lets you edit vector graphics.

Servers

  • XAMPP installs Apache, MySQL, PHP and Perl at one shot.
  • App Engine is Google’s freemium platform for app hosting.
  • Persevere is a RESTful JSON app server that runs on Java.
  • Tomcat is a JSP server.
  • nginx is a fast web server.
  • CouchDB is a RESTful JSON app server that runs on Erlang.

Development tools

Let me repeat:

  1. You don’t need admin rights to install these.
  2. You can run them off a USB stick.
  3. They won’t make your machine slower.

There’s really no reason whatsoever not to have them on a USB stick at least. They’re cheap.

SSH Tunneling through web filters

You can defeat most web filters by spending around 8-10 cents/hr on Amazon EC2. (It’s usually worth the money. It’s a fraction of the cost of a phone call or a sandwich. And I usually end up wasting that money anyway on calling someone or eating my way out of the misery of corporate proxies.)

Most web filters and proxies block all ports except the HTTP port (80) and the HTTPS port (443). But port 443 carries encrypted traffic, and, as Mark explains:

since all the traffic that passed through the tunnel is supposed to be SSL encrypted (so as to form an unhindered SSL session between the browser and the HTTPS server), there are little or no access controls possible on such a tunnel

That means web filters can’t really block HTTPS traffic. So we can redirect web traffic to a local HTTPS server, and set up a server outside the firewall that redirects them back to the regular servers.

Putty will be our local HTTPS server. Amazon EC2 gives us a server outside the firewall.

So here’s a 16-step recipe to bypass your web filter. (This is the simplest I could make it.)

In Steps 1-7, we’ll launch a server on Amazon EC2 with 2 tweaks. Step 1 enables Port 443, and step 6 re-configures SSH to run on Port 443 instead of on Port 22. (Remember: most proxies block all ports other than 80 and 443). Alestic’s article on how to Automate EC2 Instance Setup with user-data Scripts and this thread on running SSH on port 443 are invaluable.

In Steps 8-13, we’ll set up Putty as our local HTTPS server. Read how to set up Putty as a SOCKS server and how to use Putty with an HTTP proxy. All I did was to combine the two.

In Steps 14-16, we’ll configure the browser to use Putty as the SOCKS server.

Ingredients

  1. Amazon AWS account (sign up for free – you won’t be charged until you use it)
  2. Putty (which may be available on your Intranet, if you’re lucky)

Directions

  1. On the AWS EC2 Console, click on Security Groups and select the default security group. At the bottom, select HTTPS as the connection method, and save it.
  2. Click on Key Pairs, select Create Key Pair and type in some name. Click on the Create button and you’ll be asked to download a key file. Save it somewhere safe.
  3. Run PuttyGen (it comes with Putty), click Load and select the key file you just saved. Now click on Save private key and save it as privatekey.ppk.
  4. Back on the AWS EC2 Console, click on Launch Instance.
  5. Select Community AMIs and find ami-ccf615a5. It’s an Ubuntu Jaunty 9.04 instance that’s been customised to run scripts passed as user-data. You may pick any other Alestic instance. (The screenshot below picks a different instance. Ignore that.)
  6. Continue until you get to Advanced Instance Options. Here, copy and paste the following under User Data. Do not make a mistake here!
    #!/bin/bash
    mv /etc/ssh/sshd_config /etc/ssh/x
    sed "s/^#\?Port.*/Port 443/" /etc/ssh/x > /etc/ssh/sshd_config
    /etc/init.d/ssh restart

  7. Keep pressing Continue and Launch the instance. Once launched, click on “Instances” on the left, and keep refreshing the page until the status turns green (running). Now, copy the Public DNS of the instance.
  8. Run Putty. Type in root@<the-public-DNS-you-just-copied> as the host name, and 443 as the port.
  9. Under Connection > Proxy, set HTTP as the proxy type. Type in the Proxy hostname and Port you normally use to access the Internet. Select Yes for Do DNS name lookup at proxy end. Type in your Windows login ID and password.
  10. Under Connection > SSH, select Enable Compression.
  11. Under Connection > SSH > Auth, click Browse and select the privatekey.ppk file you’d saved earlier.
  12. Under Connection > SSH > Tunnels, type 9090 as the Source port, Dynamic as the Destination, and click Add.
  13. Now click Open. You should get a terminal into your Amazon EC2 instance.
  14. Open your browser, and set the SOCKS server to localhost:9090. For Internet Explorer, go to Tools – Internet Options – Connections – LAN Settings, select Use a proxy …, click on Advanced, and type localhost:9090 as the Socks server. Leave all other fields blank.
  15. For Firefox, go to Tools – Options – Advanced – Network – Settings and select Manual proxy configuration. Set the Socks Host to localhost:9090 and leave all other fields blank.
  16. Also, go to URL about:config, and make sure that network.proxy.socks_remote_dns is set to true.

That’s it. You should now be able to access most blocked sites like Facebook and YouTube.
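
If you want a quick sanity check that traffic really is flowing through the tunnel, a few lines of Python will do it. This is only a sketch and not part of the recipe above; it assumes the PySocks module (pip install PySocks):

import socks  # PySocks

s = socks.socksocket()
s.set_proxy(socks.SOCKS5, 'localhost', 9090, rdns=True)  # let the EC2 end resolve the hostname
s.connect(('www.youtube.com', 443))                       # a site your filter normally blocks
print('Tunnel is up')
s.close()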

Those who favour the command line may want to automate Steps 1-7 by downloading Amazon’s EC2 API tools. EC2 API tools work from behind a proxy too. The commands you’ll need are:

set EC2_HOME=your-ec2-home-directory
set EC2_CERT=your-ec2-certificate
set EC2_PRIVATE_KEY=your-ec2-private-key
ec2-add-keypair mykeypair
ec2-authorize default -p 443
set EC2_JVM_ARGS=-DproxySet=true -DproxyHost=yourproxy ^
-DproxyPort=yourport -Dhttps.proxySet=true ^
-Dhttps.proxyHost=yourproxy -Dhttps.proxyPort=yourport ^
-Dhttp.proxyUser=yourusername -Dhttps.proxyUser=yourusername ^
-Dhttp.proxyPass=yourpassword -Dhttps.proxyPass=yourpassword
ec2-run-instances ami-ccf615a5 --key mykeypair --user-data-file your-startup-file-containing-lines-in-step-6
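
Alternatively, the same launch can be scripted in Python with the boto library. This is only a rough sketch of one way to do it (boto isn’t part of the original recipe, and the keys, proxy details and startup-script filename below are placeholders):

import boto

# Connect to EC2 from behind the corporate HTTP proxy
conn = boto.connect_ec2(aws_access_key_id='YOUR_ACCESS_KEY',
                        aws_secret_access_key='YOUR_SECRET_KEY',
                        proxy='yourproxy', proxy_port=8080)

conn.create_key_pair('mykeypair').save('.')        # writes mykeypair.pem to the current directory
conn.authorize_security_group(group_name='default', ip_protocol='tcp',
                              from_port=443, to_port=443, cidr_ip='0.0.0.0/0')

user_data = open('startup.sh').read()              # the script from step 6
reservation = conn.run_instances('ami-ccf615a5', key_name='mykeypair',
                                 user_data=user_data)
print(reservation.instances[0].id)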

You can go further and use any software (such as Skype) if you install FreeCap. More details are in this article on Secure Firefox and IM with Putty.

Linux users may want to check out ProxyTunnel and this article on Tunneling SSH over HTTP(S).

Update: Follow-ups on Hacker News comments, Twitter, Delicious and Digg.

Open source in corporates

Last month, my first application went live.

I’ve been writing code for 20 years. Not one line of my code has been officially deployed in a corporate. (Loser…)

It’s a happy feeling. Someone defined happiness as the intersection of pleasure and meaning. Writing code is pleasurable. Others using it is meaningful.

But this post isn’t quite about that. It’s about the hoops I’ve had to jump through to make this happen.

I’ve been living in a nightmare since March 2009. That was when I decided that I’d try and get corporates to use open source.

March 2009
It began with a pitch to a VC firm. They were looking to build a content management system (CMS). Normally we’d pull together slides that say we’ll deliver the moon. This time, we put together a demo based on WordPress’ CMS plugins.

The meeting went fabulously well. We said, “Here’s a demo we’ve built for you. Do you like it?” The business lead (Stuart) was drooling and declared that that’s exactly what they wanted. The IT lead (another Stuart) was happy too, but warned the business users: “Just remember: this isn’t how we do development, so don’t get your hopes up that we can deliver stuff like this :-)”

Time to make my point. I asked, “What’s your policy on open source software?”

The business lead went quiet. “I don’t know,” he finally said. Fair enough.

I turned to the IT lead. “Well, we don’t use it as a matter of policy… there are security concerns…” he said.

“Which web server do you use?”

“Oh, OK. I see what you mean. We use Apache. So on a case-by-case basis, we have exceptions. But generally we have security concerns.”

“Why? Do you believe open source software is more insecure than commercial software?”

He thought about it for a while. “Well… maybe. I don’t know.” We debated this a bit. Then we found the real issue: “It’s just that we don’t have control over the process. We don’t know enough about it to decide.”

A couple of weeks later, I tried pitching to a newspaper company. This time, it was our sales team that raised the same question. “But… isn’t open source insecure?”

I didn’t even bother pitching any open source stuff to them. But I’d learnt my lessons:

1. Demo the application. Don’t talk about it.
2. Show it to the business first, and then tackle IT.

Aside: June 2009

In June, I got another chance. I was building the website for a large retailer. The very first thing I did was ask to see the Javascript. Total mess, and filled with browser-incompatible DOM requests. So I went over to their web development team.

“Look, why don’t you guys use a Javascript library? It’ll get you cross browser compatibility and compact maintainable code at the same time.”

And, to their credit, they said, “Sure. Which library?”

I showed them this comparison of jQuery, Dojo, Scriptaculous and MooTools…

… and we agreed on jQuery. So, if nothing else, I’ve managed to get one open source library into a corporate.

July 2009

I was also looking at payments, and the retailer was looking to replace their chargeback application. Since I had a week off, I built a working PCI-compliant prototype on Django. This time, I applied the lessons I’d learned, and demo-ed it to the business, who were thrilled. Time to tackle IT.

I started with the architecture team. Matt on the architecture team was the most approachable. So I went over, demo-ed it, and said, “Matt, this took a week to put together. It’s based on some new technologies. Are you game to try these out?”

He was. And quite enthused about it too. So we put together a proposal for the architecture review board, proposing a new technology stack: Django / Python and MySQL. As before, I showed the demo before I talked technology. I had prepared answers to all security-related questions upfront (and practically memorised section 3 of the PCI guidelines). The clincher, though, was the business case. To build it on Java, it would cost ~1,000 person days. On Django, I’d mostly done it in 5. There was no way of justifying 1,000 person days for an application that could save, at best, £100,000 a year.

So they said “Go ahead, we’re fine if operations and infrastructure are fine.”

It was time to find a Django developer in Infosys. I hunted for a couple of weeks but none was available. (Only 2 people knew Django in the first place.) So that effort got canned, and we were back to the 1,000 person day solution. (Which got canned too, later.)

But in the process, I’d learned my third lesson.

3. If you’re trying new technologies, plan on delivering the application yourself.

October 2009

Another application popped up that looked like a prime candidate for introducing open source. They were using an Excel application to fraud-screen orders, and wanted to make a web app out of it.

I followed the same route as before. Demo it. Show it to business first, then IT. Built it myself. I skipped Architecture, since they’d already approved the technology stack, and took it straight to Infrastructure.

“This application uses Apache as the web server, MySQL as the database, and uses PHP and Javascript for the application logic. Could we get a Linux server to host it?”

Our entire conversation lasted 30 seconds. He said, “No. We use Windows servers” (I was fine)

“… and you’ll need to change Apache to IIS” (fine again)

“… and we don’t support PHP, so it’ll have to be Java or .NET” (I don’t know .NET or Java… but fine)

“… and we don’t support MySQL, it’ll have to be SQL Server” (fine, I guess)

“… and we don’t have DBAs available until January, so you’ll have to wait.” (definitely not good.)

So back to the drawing board on the technology stack. I needed something in Java (I know very little Java, but nothing at all in .NET) and to avoid the DBA headache, it would have to bundle in a database. I first explored key-value stores like CouchDB, Redis, etc. None of them worked on Java. The only one I found that did was Persevere, and it was a JSON data store, which fit perfectly with my plans.

By this time, I’d also learnt my fourth and most important lesson.

4. Don’t try to promote open source. Just deliver the application.

I said, “This is a custom-built application that runs on Java. Could we get a Windows server to host it?”

The answer was “Yes”, and we had it the next day.

PS: December 2009

The application’s deployed and running. It has fraud-screened about 10,000 orders by now.

And the lessons are well learnt. So when someone came over asking if there was an image resizing solution I knew of, I said: “Sure, who’s your business sponsor?” Then I went over and said, “Let me show you this open source application called ImageMagick. It handles aspect ratios correctly, and can crop too. Doesn’t this look professional?” Then I went over to IT and said, “It’s open source, so you can change it. It has Java bindings, so you can integrate it into your environment. It can handle eight 3000×2400 images a second on my puny laptop. It’s used by your competitors. And I can build it for you if you like.”

I might just have my second open source entry into a corporate this year.

Inline form validation

A List Apart’s article on Inline Validation is one of the most informative I’ve read in a while — and it’s backed by solid data.

Some useful lessons:

  1. Inline validation can reduce form completion time by 40%
  2. Use inline validations where the user doesn’t know if they’ll get it wrong (e.g. is a username available?). Don’t use them if the user knows the answer (e.g. their name)
  3. Validate on blur, not on keypress (it’s distracting, and users can’t multitask)

A flaw in rationality

I found this piece from “The Happiness Hypothesis” pretty interesting:

In the 1990s, Damasio found that when certain parts of the orbitofrontal cortex are damaged, patients lose most of their emotional lives. They report that when they ought to feel emotion, they feel nothing, and studies of their autonomic reactions (such as those used in lie detector tests) confirm that they lack the normal flashes of bodily reaction that the rest of us experience when observing scenes of horror or beauty. Yet their reasoning and logical abilities are intact. They perform normally on tests of intelligence and knowledge of social rules and moral principles.

So what happens when these people go out into the world? Now that they are free of the distractions of emotion, do they become hyperlogical, able to see through the haze of feelings that blinds the rest of us to the path of perfect rationality? Just the opposite. They find themselves unable to make simple decisions or to set goals, and their lives fall apart. When they look out at the world and think, “What should I do now?” they see dozens of choices but lack immediate internal feelings of like or dislike. They must examine the pros and cons of every choice with their reasoning, but in the absence of feeling they see little reason to pick one or the other. When the rest of us look out at the world, our emotional brains have instantly and automatically appraised the possibilities. One possibility usually jumps out at us as the obvious best one. We need only use reason to weigh the pros and cons when two or three possibilities seem equally good.

Human rationality depends critically on sophisticated emotionality.

Guess it shouldn’t be a surprise then that models based on rationality fail.

IE6 in Corporates

PPK’s State of the Browser – IE Edition mentions one reason why IE6 will probably stay on for a while.

Now why do I expect IE6 to stick around while IE7 goes down? The answer is simple: Intranets… many office workers will continue to be condemned to IE6.

At work, that is. It’s quite likely that on their private computer at home they run another browser — IE7 or 8, Firefox, or maybe one of the smaller ones.

Basically, most of the IE6 market share comes from office-hour surfing, while it drops significantly in the after-hours period.

I checked the numbers on my site. It’s bang on. Last month, the percentage of IE6 users around noon was a little over 40%. At midnight, the percentage was 20%.

Graph: Percentage of IE6 users over a 24-hour window. Twice as many IE6 users at noon compared to midnight.

Given that the bulk of my audience is from India, I would assume that these statistics are probably representative of Indian corporates. But I guess it means that there’s a fair bit of music listening happening at work. Probably a good thing.

Round buttons with Python Image Library

After much hunting, I finally settled on Hedger Wang’s simple round CSS links as the most acceptable cross-browser round button implementation. The minified CSS is about 2.5KB, and the syntax is very simple. To make an input button into a round button, just wrap it within a <span class="button">:

<span class="button"><input type="submit"></span>

… and it’s just as easy to convert a link into a rounded button:

<a class="button" href="/"><span>Home</span></a>

It works by using a transparent PNG / GIF that looks like this:

The first button is the default button. The second appears on hover. The bottom two are for disabled buttons.

Can we easily create buttons in different colours?

That’s what this post is about: creating that image with round buttons and gradients.

When I tried creating these rounded buttons myself (and trying to do it in an automated way, as usual), I saw 3 possible approaches:

  1. Create it using PowerPoint via Python and export as a PNG.
    So we make a curved box, put in the appropriate gradients and borders, and export it as a PNG. But the problem is I couldn’t figure out how to get transparent PNGs.
  2. Create it in GIMP using the Script-Fu plugin.
    The problem is, I don’t know Scheme or GIMP’s API. So I gave up on this as well.
  3. Create it using Python Image Library.
    This was inspired by Nadia’s PIL Tutorial: How to Create a Button Generator. Let me explain how this works.

The first step is to create a ‘button-mask.png’ like this one (if you’d rather script this step, see the sketch after the list below):

  1. Create a transparent 300 x 120 image in GIMP
  2. Select a box from (0,0) to (300,30)
  3. Shrink it by 2 pixels
  4. Convert it to a rounded rectangle with a radius of 80%
  5. Fill this in white
  6. Copy it to 60 pixels below
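
If you’d rather not click through GIMP, roughly the same mask can be generated with PIL itself. This is a sketch rather than the original method: it assumes a recent Pillow (ImageDraw.rounded_rectangle needs Pillow 8.2 or later), and the corner radius is a guess.

from PIL import Image, ImageDraw

mask = Image.new('RGBA', (300, 120), (0, 0, 0, 0))       # transparent 300 x 120 canvas
draw = ImageDraw.Draw(mask)
for top in (0, 60):                                      # the button, and its copy 60 pixels below
    draw.rounded_rectangle((2, top + 2, 297, top + 27),  # the 300 x 30 box shrunk by 2 pixels
                           radius=12, fill='white')
mask.save('button-mask.png')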

Now, we need code to create a gradient:

from PIL import Image, ImageDraw

def rgb(start, end, ratio):
    # Gradient helper (assumed, not shown in this excerpt): interpolate linearly between two colours
    return tuple(int(s + (e - s) * ratio) for s, e in zip(start, end))

start, end = (192, 192, 224), (255, 255, 255)
grad = Image.new('RGBA', (300, 120))
draw = ImageDraw.Draw(grad)
for y in range(0, 30):
    draw.line(((0,y),(300,y)), fill=rgb(start, end, y/30.0))            # top button: fades to white
    draw.line(((0,y+60),(300,y+60)), fill=rgb(start, end, 1.0-y/30.0))  # hover button: fades from white

Now that the gradient is created, convert that into a round button by loading button-mask.png’s alpha layer onto the gradient:

mask = Image.open('button-mask.png').convert('RGBA').split()[3]   # the mask's alpha channel
border = Image.open('button-border.png').convert('RGBA')
grad.putalpha(mask)                      # clip the gradient to the rounded shape
grad.paste(border, (0, 0), border)       # overlay the border (assumed step: the excerpt loads the border but never uses it)
grad.save('button.png')

There it is: a simple round button generator. You can see a sample of these buttons at my Dilbert search site.
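
And to answer the earlier question about colours: wrapping the gradient-and-mask code in a small function makes it easy to churn out buttons in any colour. Here’s a sketch built on the code above (the make_button name and the green colour are my own additions):

from PIL import Image, ImageDraw

mask = Image.open('button-mask.png').convert('RGBA').split()[3]

def rgb(start, end, ratio):
    # Same linear interpolation helper as above
    return tuple(int(s + (e - s) * ratio) for s, e in zip(start, end))

def make_button(start, end, filename):
    # Draw the two gradient strips, clip them to the rounded mask, and save.
    # Overlay button-border.png here too if you are using a border image.
    grad = Image.new('RGBA', (300, 120))
    draw = ImageDraw.Draw(grad)
    for y in range(0, 30):
        draw.line(((0, y), (300, y)), fill=rgb(start, end, y / 30.0))
        draw.line(((0, y + 60), (300, y + 60)), fill=rgb(start, end, 1.0 - y / 30.0))
    grad.putalpha(mask)
    grad.save(filename)

make_button((192, 192, 224), (255, 255, 255), 'button-blue.png')   # the blue-grey button above
make_button((192, 224, 192), (255, 255, 255), 'button-green.png')  # a green variant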

Error logging with Google Analytics

A quick note: I blogged earlier about Javascript error logging, saying that you can wrap every function in your code (automatically) in a try{} catch{} block, and log the error message in the catch{} block.

I used to write the error message to a Perl script. But now I use Google’s event tracking.

// Inside the catch{} block of each wrapped function:
var s = [];
for (var i in err) { s.push(i + '=' + err[i]); }      // err is the caught exception object
s = s.join(' ').substr(0, 500);                       // keep the event label short
pageTracker._trackEvent("Error", function_name, s);   // function_name identifies the wrapped function

The good part is that it makes error monitoring a whole lot easier. Within a day of implementing this, I managed to get a couple of errors fixed that had been pending for months.

15 years of Dilbert searchable

The Dilbert search index now carries 15 years’ worth of Dilbert comics — over 5,500 strips typed out. This is mainly due to the contributions of BFMartin (over 6 years’ worth of strips), Paul Dorman (over 3 years’ worth of strips), myself (over 3 years’ worth of strips) and a long tail of contributors.

You can search the strips here. While you can find strips as far back as 1989, you won’t see the images earlier than 2002 because geek.nl (whose images I’m shamelessly hotlinking without permission) only holds images that far back. But once you know the date of the comic (say 1991-02-03), you can visit the Dilbert official site at dilbert.com/1991-02-03/ and see the strip.

Dilbert started around 20 years ago. So we’ve covered 75% of all the strips, just 8 months after starting this collaborative effort. A couple of lessons I learnt from this:

  1. Crowd sourcing beats going solo if you’re building content. It’s a no-brainer. There will always be one or two people more passionate about something than you.
  2. The long tail is not very big. There will only be one or two people more passionate about something than you. Don’t expect the long tail contribution to be significant. The value comes from being able to attract “the big fish”.