cross-origin resource sharing

I keep finding myself going over to the Enable CORS website to copy/paste their example code into my server-side code.  They’ve saved me more than once.

Yet again today I was momentarily flummoxed by some seemingly correct JavaScript code in a PhoneGap project that fetches a JSON response from the server.

 $.getJSON(strURL, function(jsonData) {
   // Do nothing at this level but wait for the response
 }).done(function(jsonData) {
   // Do something upon success
 }).fail(function(XMLHttpRequest, textStatus, e) {
   $('#homeStatusParagraph').html('Lookup failed, status=[' +
     textStatus + '], error=[' + e + ']');
 }).always(function(jsonData) {
   // Do nothing here
 });

Interestingly, the fail() section ran instead of the expected done() section.  I then made the call manually to the server address in the strURL variable, and it returned exactly what I expected: a JSON-formatted document.

The status returned from getJSON() was simply “error” and the returned e object was empty, which isn’t very useful for troubleshooting.  What’s actually going on is that the browser is blocking the inclusion of JSON fetched from another computer, for security reasons (the same-origin policy).

Fortunately I’ve dealt with this before and added the code below to my Node.js server’s app.js file.

app.use(passport.initialize());
app.use(passport.session());
app.use(function(req, res, next) {
  // Tell browsers that pages from any origin may read our responses...
  res.header("Access-Control-Allow-Origin", "*");
  // ...and which request headers those pages are allowed to send.
  res.header("Access-Control-Allow-Headers",
    "Origin, X-Requested-With, Content-Type, Accept");
  next();
});
var routes = require("./routes/index");

This immediately fixed the problem and getJSON() on the client now happily worked, parsing the response from the server.

one code to rule them all

JavaScript

Who’d have thought ten years ago that JavaScript would be so popular now?  I think we can reasonably thank Node.js, released back in 2009, for JavaScript’s enduring popularity.  It’s no longer just the browser-side validation tool of its earliest days; it’s a full-blown programming language that’s reached maturity.

Officially, JavaScript’s been on the scene since 1995, over twenty years ago.  The original version was famously written in ten days.  It even appeared that same year on the server side, but didn’t really take off as a backend coding tool until recently.  It wasn’t until Node.js’s asynchronous, event-driven approach that it could truly find its place in mainstream coding.

Standardized JavaScript

Fortunately for all of us, Netscape submitted the proposed JavaScript standard back then to Ecma International to formally bless the language as a standard.  Microsoft’s own version (JScript) differed slightly at the time.  Having an unbiased third party like Ecma bless the standard gave the rest of us some relief in the browser wars then raging among the big players in this space.  Time has passed, and we now await the sixth formal edition of the specification, ECMAScript 6 (also known as ES6 Harmony), being implemented by the various browsers.

JSON

JavaScript Object Notation (JSON) is a useful standard for transferring and storing data.  Its biggest competitor in this space is probably XML and its many subsets.  They’re similar in that both store data marked up with field names, yet they differ in how that markup occurs.

JSON’s popularity now is almost wholly due to Node.js’s domination of the playing field.  It’s simple to open and use JSON data within JavaScript and since Node is the platform of choice, JSON can’t help but be the favorite storage and transfer format.
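Working with JSON from JavaScript really is that simple; here’s a minimal sketch (the gadget object is just an illustrative example):

```javascript
// JSON round-trip in plain JavaScript -- no library needed.
var gadget = { name: "sprocket", qty: 3, tags: ["metal", "small"] };

// Serialize to a JSON string for storage or transfer...
var wire = JSON.stringify(gadget);

// ...and parse it back into a live object on the other side.
var received = JSON.parse(wire);

console.log(received.name); // "sprocket"
```

No schema, no parser library, no mapping layer; the wire format and the in-memory format are essentially the same thing.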

Node.js

I could reasonably assert that there are two types of coders out there:  1) those who haven’t used Node.js yet and 2) those who love it.  It’s an awesome concept: write code in JavaScript and use Node to spawn (run) it.  Node manages an event queue for you and keeps slow operations (“blocking calls”) from holding up the rest of your code.  You can create an entire webserver app within a few minutes with Node, and since JavaScript is such a well-known language among coders, the comfort level of the resulting code is higher than for the alternative languages available.

“There are two types of coders out there:  1) those who haven’t used Node.js yet and 2) those who love it.”

With other languages and development platforms you scale up by breaking your code into multiple threads of execution, and you have to manage inter-thread communication and timing yourself.  In the Node.js world, though, you scale your app by having something bring up another instance of your main app itself.

Hosting a Node.js App

This new model of scaling matches nicely with a variety of cloud virtual computer providers such as Amazon and Microsoft.  Even better, a secondary market of Node.js platform providers like OpenShift and Heroku provide a space for your application to be hosted.  (Originally, you would have to create a virtual computer at Amazon, for example, install all the dependencies to run everything and then add your Node.js app.  But now, a provider like Heroku assumes that you have a Node.js app and they take care of the prep-work for you.)

If you haven’t already done so, check out Red Hat’s OpenShift website as well as Heroku.  Both offer a (typically) free tier if you accept the scalability defaults.  Both work quite well for hosting a Node.js application.  I would say that both sites offer good Getting Started documentation.  I will say that I found the Heroku site to be slightly easier as a beginner.  I’m currently hosting one Node.js app on each of them and am happy with both providers. Note that if your app needs additional “always on” (also known as “worker”) apps then you need to fully understand each provider’s pricing model before getting settled into either arrangement.  You might easily incur an approximately $50/month fee for such an app.  Otherwise, the base scalability of both providers is essentially free.