All posts by Jonathan Lau

Feedbaker UX Review @ Google I/O

This year, I was really fortunate to have a chance to attend Google I/O, Google’s annual developer-focused conference. The two-day conference was held in San Francisco, and featured highly technical, in-depth sessions focused on building web, mobile, and enterprise applications with Google and open web technologies.

Apart from the exciting keynote address, where Google’s latest innovations were announced, I also got to learn a lot from the many technical sessions covering design, development and distribution.

While I was there, I also managed to get some UX feedback for Feedbaker from Matt Gaunt (Developer Advocate for Chrome) and another Googler whose name I didn’t manage to get.

Orbital Milestone 1

As Jon and I will be busy throughout June and July, we dedicated the past few weeks to working full-time on Feedbaker. In the short span of 3 weeks, we spent a total of more than 250 hours on it.


In building Feedbaker, we learnt about and made use of many technologies and components that helped simplify coding and collaboration. The rest of this post is a summary of the components we used, some of which have been covered in greater detail in previous posts.



Our team has a Git repository hosted on GitHub. This helps us collaborate as we work on different parts of the app, and logs each change so that the other team member stays aware of it.



Node.js is a platform built on Chrome’s V8 JavaScript engine. Our team made extensive use of Node.js as we built Feedbaker. Not only is Feedbaker served through Node.js, we also use other packages that run on Node.js to help build, test and develop Feedbaker.


Yeoman is a Node.js module that helps scaffold modern web applications. Our team used the angular-fullstack generator to kickstart our project. This generator helped us to scaffold a webapp that runs on an Express server, with AngularJS as the client-side JavaScript MVC framework and MongoDB as the database.



AngularJS allowed us to write cleaner and more efficient code by separating the logic layer from the presentation layer. It also enabled the reuse of UI components, which means we do not have to edit multiple files if, for example, we want to add an item to the common navigation bar.



Sass (Syntactically Awesome Style Sheets) is an extension of CSS3 which adds nested rules, mixins and more. Feedbaker uses the SCSS syntax for Sass.



Feedbaker uses the Sass-powered version of Bootstrap. With Bootstrap, our team is able to easily extend the base theme to develop a responsive web application without having to deal with too many CSS media queries.



Grunt is a JavaScript task runner that automates the app’s development, testing and build process. As part of the build process, Grunt automatically compiles Sass into CSS and injects Bower dependencies into the app. During development, it watches certain files for changes and refreshes the webapp once the build completes. When the app is built for production, Grunt takes care of minifying HTML, CSS and JavaScript, and optimising images. It also concatenates the many JavaScript and CSS files into single files. All this is done to optimise page loading time: minification increases page load speed by reducing file sizes, while concatenation does the same by reducing the number of requests made to the server.



JSHint is a JavaScript code quality tool. It helps to detect errors and potential problems as we code, and at the same time enforces JavaScript best practices. JSHint works with Grunt to alert our team of any potential issues with our code.



Express is a web application framework for Node.js. Feedbaker uses Express to serve both static pages and dynamic content.



Our team also configured Nginx as a reverse proxy for our web application. This allows us to host multiple node applications on a single server.
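For illustration, the relevant part of such an Nginx configuration might look roughly like this (the domain and port are assumptions, not our actual values):

```nginx
server {
    listen 80;
    server_name example.com;              # assumed domain

    location / {
        # hand every request to the Node.js app listening locally
        proxy_pass http://127.0.0.1:9000; # assumed port
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Each additional Node.js app then gets its own server block with a different server_name and upstream port.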



To enable real-time display of poll results, Feedbaker uses WebSockets to allow for low-latency bi-directional connection between client and server. Socket.IO helps establish a WebSocket connection and allows data to be pushed to the client when fresh data is available.
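Sketched as a helper (the function and event names are hypothetical), the server-side push might look like this, using Socket.IO rooms so that only clients viewing a given poll receive its updates:

```javascript
// Broadcast fresh result counts to every client in this poll's room.
// `io` is the Socket.IO server instance; clients join the room on connect.
function broadcastResults(io, pollId, results) {
  io.to('poll:' + pollId).emit('results', results);
}
```

On the client, a `socket.on('results', …)` handler would then redraw the UI whenever new counts arrive.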



Passport is an authentication middleware for Node.js. It works seamlessly with Express to handle user authentication. In our app, we used Passport to authenticate users with their NUS OpenID.



MongoDB is a NoSQL database. Feedbaker uses MongoDB to store user data and cache session information.



Mongoose is a MongoDB object modeling tool designed to work in an asynchronous environment. Our team used Mongoose to interface with MongoDB, simplifying things such as validation of models.



Forever is a simple CLI tool for ensuring that a given script runs continuously. Our team makes use of this Node.js module on our production server to keep the Feedbaker application running.



Bower is a package manager for front-end packages. It eliminates the need to commit vendor code into the repository. Bower maintains a list of front-end packages the app needs in a file called bower.json. Dependencies are resolved automatically and are downloaded straight from the package maintainer’s git repository.
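A bower.json for an app like ours might look roughly like this (the package names and version ranges are illustrative, not our actual manifest):

```json
{
  "name": "feedbaker",
  "version": "0.0.1",
  "dependencies": {
    "angular": "~1.2.15",
    "bootstrap-sass-official": "~3.1.1"
  }
}
```

Running `bower install` resolves and downloads these, so vendor code never needs to be committed.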



npm is the official package manager for Node.js. Similar to Bower, npm takes care of all the packages needed to build, test and deploy the webapp. npm also tracks these packages in the app’s package.json.

Bug in NUS OpenID

Update (Jul 16): This bug has been patched by Wai Peng.

While integrating Feedbaker with NUS OpenID, I ran into some problems with mapping the OpenID identity to a unique user in the app’s user database. I found out that NUS OpenID accepts the NUSNET id both with and without the network domain, and depending on whether the domain is entered, the system returns a different OpenID identity.

For example, if I were to log in with “a0123456” as my NUSNET id, my OpenID identity would rightly be ““.

However, if I log in using “nusstu\a0123456” as my NUSNET id, I end up getting a different identity.

You are logged in as\a0123456.

When this happens, third-party applications that use NUS as an OpenID provider would identify this user as a different unique user.

After sending in a bug report to the NUS OpenID Developers group, I got a reply from Wai Peng, systems engineer from NUS SoC.

Thanks for finding this. It is _not_ desirable behaviour. I should probably code some checks into this. Will update when I fix it.

Meanwhile, as a temporary fix for the Feedbaker app, I decided to simply strip the domain portion out of the identity.

// remove the "domain\" segment between the last slash and the backslash
openid = identifier.replace(/\/[^\/]*\\/, '/');
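To illustrate with a made-up identity URL (the real provider URL is omitted above), the regex collapses the domain segment between the last slash and the backslash:

```javascript
// hypothetical identifier, for illustration only
var identifier = 'https://provider.example/nusstu\\a0123456';

// "/nusstu\" collapses to "/", leaving just the user id in the path
var openid = identifier.replace(/\/[^\/]*\\/, '/');
// openid is now 'https://provider.example/a0123456'
```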

Poll Results

Today, Jon happened to be in the west and decided to drop by my place to work on the project together. We started the morning by going through our progress over the past week, and fixed a couple of bugs as we tested the app together.

We started off by creating a short link for users to answer polls. Previously, a user would have to type in this long URL in order to answer a poll:

The poll id as stored in the database is MongoDB’s ObjectId, which is rendered as a 24-character hexadecimal string. This would be extremely inconvenient to type out, especially on small mobile devices. As such, we generate a short URL that is saved with each poll when it is created. After implementing the routes, the URL users had to type was much shorter than before.

Before deciding to implement it on our own, we also tried out Google’s URL shortener API. However, the short URLs it produces live on Google’s own domain, and the click statistics for each link are publicly accessible to anyone with the link, which we decided was not ideal.

We also tried using ShortId and Hashids to generate the short link, but eventually settled on generating it randomly.

 // a random string of up to 6 digits and lowercase letters
 Math.random().toString(36).substring(2, 8)
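Wrapped into a helper with a uniqueness check, the idea looks like this (the helper and the isTaken callback are hypothetical; in the app the check would be made against the polls collection):

```javascript
// Generate a random short code of up to 6 digits/lowercase letters,
// retrying until `isTaken` reports the candidate as unused.
function generateShortCode(isTaken) {
  var code;
  do {
    code = Math.random().toString(36).substring(2, 8);
  } while (isTaken(code));
  return code;
}

// usage, with an in-memory stand-in for the database lookup
var used = { abc123: true };
var code = generateShortCode(function (c) { return c in used; });
```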

After getting this done and tested, we proceeded to work on the poll results page. We started by building the API for it, then the user interface for displaying poll results.

After Jon left, I also spent a little bit more time improving the UI of the poll results page. We decided that we should also show results in a chart, so I made use of the Google Charts API to draw the charts dynamically based on the result. After looking around, I came across angular-google-chart, an AngularJS module for Google Charts. I was able to get this working pretty quickly, but spent a lot of time trying to make the chart responsive. As of now, it displays fine on mobile, but could be drawn much bigger on larger screen sizes.

We might also consider allowing users to choose what kind of charts they would like to display (bar charts, pie charts, donut charts, etc).

Another feature we are considering is for poll results to be updated and displayed live. We are looking either at polling the server at a fixed interval, or using WebSockets so that the server can push data to the client whenever a new poll answer is received so that the UI and charts can be redrawn.

Poll Answer

Apart from working on the login redirect yesterday, I also worked on the UI for the poll answer page. Today’s work was mainly storing the poll answers in the database.

I had previously designed the poll schema to embed poll answers within the poll itself. I thought this might be a convenient way to store the data, as a single delete request would remove a poll together with its associated answers. However, I ran into a lot of problems trying to store the data as an embedded document within an existing Poll object.

Eventually, I decided that having a separate collection for poll answers may be a more suitable option.

/lib/models/answer.js —

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

/**
 * Answer Schema
 */
var AnswerSchema = new Schema({
  owner_id: Schema.Types.ObjectId,
  poll_id: Schema.Types.ObjectId,
  answer: Number,
  updated_at: { type: Date, default: Date.now }
});

mongoose.model('Answer', AnswerSchema);

/lib/models/poll.js —

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

/**
 * Poll Schema
 */
var PollSchema = new Schema({
  question: { type: String, required: true, trim: true },
  owner_id: Schema.Types.ObjectId,
  active: { type: Boolean, default: false },
  choices: [ { type: String, required: true, trim: true } ],
  created_at: { type: Date, default: Date.now }
});

PollSchema.path('choices').validate(function (value) {
  return value.length >= 2 && value.length <= 8;
}, 'Number of choices invalid');

mongoose.model('Poll', PollSchema);

Login Redirection

One issue that we have yet to resolve is where to redirect the user after a successful login. Before today, this defaulted to /app/dashboard. However, a user who has yet to log in may be given a direct link to a poll, for example; dumping them on the dashboard after login would provide a very bad user experience, as the user would then have to go back to the source and follow the direct link again.

I had this issue in mind a few days ago, and started some partial implementation. However, I was stuck on how to redirect users back to the authenticated route they were visiting previously. After a few hours of reading and trying, I finally got it working.

The following paragraphs summarise the steps taken in the implementation.

In app.js, if the next route requires authentication and the user is not authenticated, we redirect the user to the login page instead. However, we also append a query string of where to redirect the user to after login.

/a/abc123 --> /login?redirectTo=/a/abc123

The login page takes whatever is in the redirectTo parameter and includes it with the GET request when the user clicks the login button.

--> /login/nus?redirectTo=/a/abc123

A new controller was made to handle OpenID authentication. Before passing control to Passport, we check if there is a redirectTo parameter and, if so, store it in the session. Here, we also do some simple validation to make sure it is a relative path and not a full address, which prevents the application from becoming an open redirect.
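That validation step can be sketched as a small helper (the name is hypothetical): only same-site relative paths are accepted, and anything else falls back to the default dashboard route.

```javascript
// Accept only paths like "/a/abc123"; reject absolute URLs and
// protocol-relative "//host" URLs, which would enable an open redirect.
function safeRedirect(redirectTo, fallback) {
  if (typeof redirectTo === 'string' && /^\/(?!\/)/.test(redirectTo)) {
    return redirectTo;
  }
  return fallback;
}
```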

Upon completion of the OpenID authentication, the app is brought back to /login/nus/return. Here, the authentication controller checks if a redirectTo is stored in the session, and if so passes it as the successRedirect parameter for Passport. The user is then directed to the page they requested before authentication.

/login/nus/return --> /a/abc123

Testing Deployment

After discussions with Jon, we finally decided to buy the domain. It wasn’t my first time getting a domain, so that process was fairly straightforward. After paying for the domain, I went on to deploy a new DigitalOcean droplet and configured the nameservers and DNS records to point to the new instance.

The next hour was spent provisioning the server. These were some things I had to install before getting started with the test deployment:

  • Git
  • Node.js
  • npm
  • MongoDB
  • Compass (used by the build scripts to compile Sass into CSS)
  • Bower
  • Grunt

Once I got those ready, I proceeded to clone the repo, run the build script and start the server. Everything worked pretty well, except that the OpenID return URL was still hard-coded to localhost from my development machine. To fix that, I added base_url as a new item in the config files so I can have different values for development and production. After that was done, everything worked well, and I was pretty happy with the results so far.

I spent some time cleaning up the code, fixing several warnings JSHint was returning. While doing that, I noticed that JSHint was throwing errors that the method confirm() was an undefined global. After reading up more on JSHint, I found that I could suppress that error by adding devel: true to .jshintrc (the config file for JSHint). However, after some discussion with Jon, we decided that removing the confirm() prompt altogether would provide a better user experience.
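For reference, the relevant .jshintrc fragment would have been just this (other options omitted; devel enables browser development globals such as console, alert and confirm):

```json
{
  "devel": true
}
```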

In order to further enhance the UX, I also added a loading state when users click to activate or deactivate a poll. Prior to that, clicking the button would make a PUT request to the API and immediately update the UI with the expected result; errors and failures might therefore cause the user to think the action had been applied when it had not.

To end off the day, I went back to exploring how the app should be deployed. I read about placing a reverse proxy in front of Node.js and decided to try running the app behind Nginx. I also came across node-http-proxy, and might give it a try some time down the road to see which is more suitable.

After spending another hour or so configuring Nginx as a reverse proxy (with help from the Nginx docs and wiki), I was able to get everything working. I also made some changes to the app’s config file so the Node.js server listens only locally, and users cannot bypass Nginx to access it directly.

I tried running the site through Google’s PageSpeed Insights. The results advised me to cache static content to make pages load faster, so I went back and tried to configure Nginx to cache static content.

While doing that, I found that the page was loading a non-existent vendor.css. Apparently, the build script added that file even though I did not have any vendor CSS files. After some investigation, this turned out to be a bug in grunt-usemin that was fixed in a later version. After updating the package, I was able to resolve the issue.

Apart from all this, I also came across and made use of Fiddler, a free web debugging proxy. With it, I was able to introduce a certain amount of latency into the app to simulate real-life use. I found that the My Polls page shows, for a brief moment, that I currently have no polls, until the API responses arrive and the page is updated. With this tool I made some changes to further improve the user experience, and was also able to set breakpoints and manipulate HTTP traffic in real time.

Saving Polls to Database

Today was mainly spent figuring out how to save the Poll model to the database. In doing that, I learnt about AngularJS Providers and Factories, and how they are injected into controllers to get stuff done. I also learnt how to use AngularJS to perform client-side validation, and Mongoose to perform server-side validation.

When a user submits the form to create a new poll, AngularJS first performs client-side validation, then calls the createPoll() method. The controller then uses the Poll factory to make a POST request to the API. The API routes the request to the server’s Poll controller, which performs server-side validation against the provided schema; Mongoose abstracts and simplifies this validation process. The newly created object, together with the poll id, is then returned. A callback in the AngularJS page controller is triggered, redirecting the user to the newly created poll.

I also learnt and made use of angular-moment while trying to improve the application’s user experience. Instead of displaying the date and time a poll was created, this module displays the relative time (e.g. 4 minutes ago) of creation.

I also started off on the poll details page. This page is designed for presenters to show the link of the poll for the audience to visit and answer. In order to make it easier, we planned to generate short URLs and provide QR Codes so that users do not have to type the full, long, URL into their mobile browser or laptop. For the QR Code, I made use of Google’s Chart API to draw and return an image of the QR Code for the corresponding URL used to access the poll.
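Constructing the image URL is a matter of string building; a rough sketch (the helper name is assumed; cht=qr selects the QR chart type, chs the pixel size, chl the data to encode):

```javascript
// Build a Google Chart API URL that renders a QR code for `link`.
function qrCodeUrl(link, size) {
  return 'https://chart.googleapis.com/chart' +
    '?cht=qr' +
    '&chs=' + size + 'x' + size +
    '&chl=' + encodeURIComponent(link);
}

var imageUrl = qrCodeUrl('http://example.com/a/abc123', 300);
```

Dropping imageUrl into an img tag’s src is all the poll details page then needs to display the code.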

We have yet to start on the URL shortener, but will probably make use of an existing URL-shortening service, or perhaps come up with our own within the same domain.

Database Schema Design

Today’s work was mainly adding the “My Polls” page to the application and modelling the database schema for the poll model in Mongoose.

One of the challenges I faced was figuring out how I should associate each poll question with the many poll answers it will have. I had read previously that MongoDB is somewhat better than relational databases at modelling one-to-many relationships.

For example, in WordPress (a blogging platform that uses a relational database), the posts model requires two tables (post and post_meta) in order to represent all the information. This is because a post may, for instance, have zero to many tags, and these cannot be stored in a single table without serialization. In MongoDB, however, information like tags can be stored in an embedded sub-document.

This started me reading about MongoDB’s embedded documents. In coming up with the schema, I also learnt a bit about data model design from the MongoDB docs: when it is better to use embedded data models, and when to use normalised data models.
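Using the WordPress example above, the two approaches look roughly like this (the documents are illustrative):

```javascript
// Embedded: the zero-to-many tags live inside the post document itself,
// so one query (and one delete) covers the post and its tags.
var embeddedPost = {
  _id: 'post1',
  title: 'Hello World',
  tags: ['intro', 'misc']
};

// Normalised: separate documents reference the post by id, much like
// WordPress's post / post_meta tables.
var post = { _id: 'post1', title: 'Hello World' };
var postMeta = [
  { post_id: 'post1', key: 'tag', value: 'intro' },
  { post_id: 'post1', key: 'tag', value: 'misc' }
];
```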

Embedded data model – generally better if you have a one-to-many relationship between entities.

Normalised data model – generally used when the relationship is many-to-many, and when embedding would result in duplication of data.