Sunday, July 27, 2014

4 jQuery Cross-Domain AJAX Request methods


The web has changed, and with it the way we develop websites. Today the web is becoming a place where we build web apps rather than just websites, and we use third-party APIs to create our next mashups. So knowing how to make cross-domain AJAX requests (requests that do not comply with the same-origin policy) is a must. In this article, you will learn 4 cross-domain AJAX request methods (plus 4 bonus legacy methods and links to jQuery plugins).
These methods also come in handy for subtler same-origin policy cases: browsers will throw an error even when you make an AJAX request to the same domain if it uses a different protocol (https instead of http), a different port, or a different subdomain.
This article reviews the following 4 methods, discusses their advantages and disadvantages, and summarises the cases where each is the better choice.
Here is the list of methods:
  1. CORS (Cross-Origin Resource Sharing)
  2. JSONP
  3. window.postMessage
  4. Setting up a local proxy
  5. 4 bonus legacy methods (document.domain, window.name, iframe, flash)
  6. A list of JavaScript libraries and jQuery plugins for making cross-domain requests.
Before we dive into the method details, let's cover the most common cases:
  • Firstly, if you are trying to read data that is available as an RSS feed, you are better off with a universal RSS to JSON converter powered by Google.
  • Secondly, if you are accessing data from some popular website's API, it most likely supports JSONP as well. See their documentation.
JSONP is a cross-browser method that does not rely on any browser hacks. It is supported by all browsers, and many JavaScript libraries provide methods that make JSONP requests seamless.

1. CORS (Cross-Origin Resource Sharing)

CORS is a W3C recommendation supported by all major browsers. It uses HTTP headers to help the browser decide whether a cross-domain AJAX request is safe. Basically, when you make a CORS request, the browser adds an Origin header containing the current page's origin, for example Origin: http://example.com.
The server receiving the CORS request checks whether this origin is allowed and replies with an Access-Control-Allow-Origin response header. Upon receiving the response, the browser checks that this header is present and matches the current origin. If the origins match, the browser carries on with the AJAX request; if not, it throws an error.
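As a sketch, with hypothetical domains (a page on example.com calling api.example.org), the header exchange looks like this:

```http
GET /data HTTP/1.1
Host: api.example.org
Origin: http://example.com

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://example.com
Content-Type: application/json
```

The browser adds the Origin header automatically; your script never sets it by hand.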
To make a CORS request you simply use XMLHttpRequest in Firefox 3.5+, Safari 4+ and Chrome, and the XDomainRequest object in IE8+. When using the XMLHttpRequest object, if the browser sees that you are trying to make a cross-domain request it will seamlessly trigger CORS behaviour.
Here is a JavaScript function that helps you create a cross-browser CORS object.
function createCORSRequest(method, url){
    var xhr = new XMLHttpRequest();
    if ("withCredentials" in xhr){
        // XHR has the 'withCredentials' property only if it supports CORS
        xhr.open(method, url, true);
    } else if (typeof XDomainRequest != "undefined"){
        // If IE, use XDR
        xhr = new XDomainRequest();
        xhr.open(method, url);
    } else {
        // CORS is not supported by this browser
        xhr = null;
    }
    return xhr;
}
The function takes 2 arguments: method, the request method ("GET", "POST", etc.), and url, the URL to send the request to. Here is how to make a "GET" request to Google.
var request = createCORSRequest( "get", "" );
if ( request ){
    // Define a callback function
    request.onload = function(){};
    // Send request
    request.send();
}
Because the CORS specification relies on HTTP headers and all the heavy lifting is done by the browser and server, our code does not need to change. In other words, you can make cross-domain AJAX requests in jQuery like any other request.
$.get('', function( data ) {
  alert( 'Successful cross-domain AJAX request.' );
});

Requirements & Notes

In order to make a CORS request, you need both a browser and a server that support CORS. Check whether yours do.
Also note that if the AJAX request adds any custom HTTP headers, or uses any method other than GET, POST or HEAD, the browser will first make a "preflight" request to check that the server responds with the correct headers before sending the actual request. This adds overhead to your AJAX requests.
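A preflight exchange looks roughly like this (hypothetical endpoint and header names abridged to the CORS-relevant ones):

```http
OPTIONS /data HTTP/1.1
Origin: http://example.com
Access-Control-Request-Method: PUT
Access-Control-Request-Headers: X-Custom-Header

HTTP/1.1 204 No Content
Access-Control-Allow-Origin: http://example.com
Access-Control-Allow-Methods: GET, POST, PUT
Access-Control-Allow-Headers: X-Custom-Header
```

Only after a successful preflight does the browser send the real request.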

Advantages & Disadvantages

CORS is a W3C specification supported by all major browsers, and it is how cross-domain AJAX querying will work in the future. Using CORS is a future-proof bet.
However, it requires both a CORS-supporting server and browser. If you have administrative privileges on the server, you can add CORS support yourself. If you have no control over the server, you are out of luck and will need to choose another method.
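If you do control the server, enabling CORS can be as simple as sending the header. A minimal sketch for Apache (assuming mod_headers is enabled; "*" allows any origin, so restrict it for anything sensitive):

```apache
# Allow cross-origin requests from any origin (tighten for production)
Header set Access-Control-Allow-Origin "*"
```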

2. JSONP (JSON with Padding)

Because of the same-origin policy we cannot make cross-domain AJAX requests, but we can have <script> tags that load JavaScript files from other domains. JSONP exploits this exception to make cross-domain requests by dynamically creating a <script> tag with the necessary URL.
Here is how it works. The server wraps its data, usually in JSON format, in a function call. Upon loading the script, the browser calls that function and passes it the loaded data. This implies the third-party server knows the name of a function in your page, which for obvious reasons is not practical. The workaround is to pass the function name as a parameter in the request URL.
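Under the hood, a JSONP helper can be sketched in plain JavaScript like this (the function and variable names are illustrative, not from any library):

```javascript
function jsonp(url, callback) {
  // Register a uniquely named global function for the server to call
  var name = 'jsonp_cb_' + Math.round(Math.random() * 1e9);
  window[name] = function (data) {
    // Clean up the global and the script tag, then hand over the data
    delete window[name];
    document.body.removeChild(script);
    callback(data);
  };
  // Append the callback name to the URL and load the response as a script
  var script = document.createElement('script');
  script.src = url + (url.indexOf('?') === -1 ? '?' : '&') + 'callback=' + name;
  document.body.appendChild(script);
}
```

The server is then expected to respond with name({...}); the browser executes that response just like any other script.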
Let's see an example. Facebook's Open Graph supports JSONP calls. Here is a normal JSON response:
{
   "id": "10150232496792613",
   "url": "",
   "type": "website",
   "title": "jQuery Howto"
}
The Open Graph documentation says that it takes a callback parameter to turn the JSON response into JSONP. So let's add the callback parameter and turn it into a JSONP request.
/**/ myFunc({
   "id": "10150232496792613",
   "url": "",
   "type": "website",
   "title": "jQuery Howto"
});
Notice how the previous JSON data is now wrapped in myFunc();? So if we had defined a myFunc() function previously, it would have been called with the Open Graph data.
function myFunc( data ){
  console.log( data.title ); // Logs "jQuery Howto"
}
jQuery has built-in support for JSONP requests in its AJAX methods. To trigger a JSONP request, append the callback parameter with ? as its value (e.g. callback=?) to the end of the URL. Here is the previous example using jQuery.
$.getJSON( "", function( data ){
  console.log( data.title ); // Logs "jQuery Howto"
});

// OR using $.ajax()
$.ajax({
  type:     "GET",
  url:      "",
  dataType: "jsonp",
  success: function(data){
    console.log( data.title ); // Logs "jQuery Howto"
  }
});

Requirements & Notes

JSONP has become the de facto method of overcoming same-origin policy restrictions, and it is supported by major data providers (Facebook, Twitter, Google, Yahoo, etc.). To use JSONP, the third-party server must support it; in other words, it must wrap its JSON data in a function call.
Please remember that the returned data is a plain JavaScript file. This means you are running arbitrary JavaScript code within the scope of your domain, with access to all of the user's cookies and data in the browser. This is a serious security concern, which is why you absolutely must trust any server you fetch data from using the JSONP method.

Advantages & Disadvantages

Advantages:
  • Supported by almost all browsers.
  • Supported by major data providers and easy to implement on your own server.
  • Well supported by JavaScript libraries, including jQuery (see examples above).
  • No request overhead.
Disadvantages:
  • The response runs as arbitrary JavaScript code, so using JSONP implies that you absolutely trust the data provider.
  • Requires server support (even though it is easy to implement).
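The server side of JSONP is small enough to sketch. This hypothetical helper wraps a JSON payload in the callback name sent by the client, validating the name first since it ends up inside executable script:

```javascript
// Hypothetical server-side helper: wrap JSON data in the client's callback
function jsonpWrap(callbackName, data) {
  // Reject callback names that could inject arbitrary script
  if (!/^[\w.]+$/.test(callbackName)) {
    throw new Error('invalid callback name');
  }
  return '/**/ ' + callbackName + '(' + JSON.stringify(data) + ');';
}
```

For example, jsonpWrap('myFunc', { title: 'jQuery Howto' }) produces the same /**/ myFunc({...}); shape shown in the Open Graph response above.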

3. window.postMessage

The window.postMessage method was introduced with HTML5. It allows communication between window frames without being subject to the same-origin policy. Using postMessage(), one window can trigger a message event, with attached data, on another window, even if that window is on a different domain, port or protocol. The frame where the event is triggered must add an event listener in order to respond.
Let's see an example. Assume we are on website (1) and would like to make a request to domain (2). We first must obtain a reference to window (2); this can be an iframe's contentWindow, the return value of window.open(), or an entry in window.frames. For our case it's best to create a hidden iframe element and send messages to it. Here is how it looks.
// Create an iframe element
$('<iframe />', { id: 'myFrame', src: '' }).appendTo('body');
// Get a reference to the iframe element
var iframe = $('#myFrame').get(0);
// Send a message with {some: "data"} attached
// (in real code, wait for the iframe to load before posting)
iframe.contentWindow.postMessage( {some: 'data'}, '');
The first argument is the data to be sent; the second is the origin we expect the receiving window to have. If the receiving document's origin does not match this value at the time the message is sent, the browser does nothing and silently drops the message. This is done for security reasons, since the frame's URL may have changed.
The page on server (2) must contain a "message" event listener. Let's use jQuery to do just that:
$(window).on("message", function( e ){
  // jQuery wraps the native event; unwrap it first
  var event = e.originalEvent;
  // We must check event.origin, because anyone can
  // trigger the event (unless you are a public data provider)
  if (event.origin !== "") return;

  // Now let's send window (1) some data back
  event.source.postMessage({name: "Someone", avatar: "url.jpg"}, event.origin);
});
In order to receive the data sent from server (2), we must add another event listener on page (1). Let's update our previous code.
var iframe = $('#myFrame').get(0);
iframe.contentWindow.postMessage( {some: 'data'}, '');

$(window).on("message", function( e ){
  var event = e.originalEvent;
  if (event.origin !== "") return;
  console.log( event.data ); // Logs {name: "Someone", avatar: "url.jpg"}
});

Requirements & Notes

This method is relatively new and not yet used by many services. All the latest major browsers support it; however, IE8 & IE9 support messaging only between a window and its <frame>s and <iframe>s, and IE10 supports messaging between windows only through MessageChannel.
This method is great for intranet projects where you control the environment (you know exactly which browsers are installed, etc.). It also has high performance compared to the other communication methods.

Advantages & Disadvantages

  • No need to install or set up anything on the server.
  • The recommended way of communicating between browser windows.
  • Secure (when used correctly).
  • Not fully supported by older browsers (mainly IE quirks).

4. Setting up a local proxy

This method overcomes the same-origin policy by proxying content on another domain through your own, making the cross-domain issue irrelevant. To use it, you either a) set up your server as a reverse proxy to fetch content from the other server, or b) write a script that does that.
This cross-domain querying solution works because you are actually loading content from your own domain: you request the URL, and the proxy script on your server fetches the content and passes it back to you.
Here is a sample PHP proxy that fetches an RSS feed from FeedBurner.
<?php
// Set your return content type
header('Content-type: application/xml');
// Website URL to open
$url = '';
// Get that website's content
$handle = fopen($url, "r");
// If there is something, read and return it
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
Name the file proxy.php and make your AJAX request to this URL. Here is a jQuery code example:
$("#rssFeeds").load("path/to/proxy.php", function(){
  // Some callback function
});
And this is how you can overcome the jQuery cross-domain request problem with a proxy.

Requirements & Notes

While setting up a proxy script, do not forget to cache the fetched data. This will reduce loading times and save processing on your hosting server.
This method should be your last resort, for when the previous 3 methods do not meet your requirements.

Advantages & Disadvantages

  • Does not rely on browser support.
  • Does not rely on data provider's support.
  • Can be used to solve any cross-domain request problem.
  • Requires setting up a proxy server. The bad news is that not all web hosting companies allow fopen() to other domains by default, though many will enable it on request. My web server was very strict on security, but the script above worked well on it.

5. Legacy methods

Before the new methods of cross-domain requests and messaging were introduced, developers relied on hacks and workarounds. The most popular were:
  • iframe - Include a hidden iframe and change its URL fragment to exchange data. Recent browsers have added security restrictions that throw an error when accessing an iframe's location properties from a different domain.
  • window.name - Changing the window.name property to exchange data, since it survives cross-domain navigation.
  • flash - Flash can communicate with JavaScript and has different security rules, so developers included a Flash object on their pages to make cross-domain requests.
  • document.domain - Used for communication between two subdomains of the same domain, by setting the document.domain property on both pages to the root domain value.

6. Links & Resources for making cross-domain requests

There are many libraries built around cross-domain AJAX problem. Here is a list of notable libraries and plugins.
  • easyXDM - Makes use of all possible cross-domain AJAX request methods and workarounds. If a browser does not support postMessage, CORS, etc. it will fall back to hacks (flash, etc.).
  • jQuery postMessage plugin - a wrapper around postMessage.
  • jQuery.ajax() - read jQuery's AJAX documentation to learn more about its settings that help to configure remote requests.
In this post I tried to collect all the information available on cross-domain AJAX requests. If I missed anything, please let me know in the comments. Like/share for future reference.

Friday, July 25, 2014

Maximizing Your Meta Tags for SEO and CTR

original post: Maximizing Your Meta Tags for SEO and CTR


While it’s true that you no longer need to optimize your meta description and meta keyword tags for Google, that doesn’t mean you should ignore these fields entirely!
In fact, because your page title and meta description are frequently pulled to form the snippet that appears whenever your pages are listed in the natural search results, the content you include in these areas can play a major role in your ability to attract visitors from the SERPs.
With this in mind, here’s what you need to know about optimizing your meta tags for both search engine optimization (SEO) and click-through rates (CTR):

Meta tags should be filled out as completely as possible

The first step to properly optimizing your meta tags is to use as many characters as you’re given access to. Think about your title tag and your meta description tag as being the first introduction that many new visitors will have to your brand, as they encounter this information in your website’s snippet in the natural search results. Given how important this real estate is, why wouldn’t you try to maximize the number of characters you’re allotted?
As a general rule, you should aim for the following character limits within each of your meta tags:
  • Page title – 70 characters
  • Meta description – 160 characters
  • Meta keywords – No more than 10 keyword phrases
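As a sketch, here is what a fully filled-out set of tags might look like within those limits (the site name reuses the Single Grain example below; the wording is hypothetical):

```html
<!-- Page title: aim for up to ~70 characters -->
<title>About Us | Single Grain Digital Marketing - SEO &amp; CRO Leaders</title>
<!-- Meta description: aim for up to ~160 characters, written as ad copy -->
<meta name="description" content="Meet the Single Grain team. Find out how our digital marketing experts can grow your natural search traffic.">
<!-- Meta keywords: no more than 10 phrases, low priority -->
<meta name="keywords" content="digital marketing, seo, cro">
```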

Include Target Keyword Phrases in a Natural Way

Now, as you’re filling out your meta tags, you’ll want to pay special attention to the way your target keyword phrases are included. Just because the practice of keyword stuffing your meta tags has been long since devalued doesn’t mean these phrases shouldn’t be included. It just means you need to be more strategic about using them.
For example, when structuring your title tags, consider including your page title, your brand name, and a phrase that includes your target keyword phrase separated by the “|” symbol, rather than stuffing in as many keyword variations as possible.
Following this formula, a good sample page title tag for the Single Grain “About Us” page might look like:
“About Us | Single Grain Digital Marketing – SEO & CRO Industry Leaders”
A bad example of a sample tag for the same page would be:
“About Us – Single Grain SEO Agency, SEO Marketers, and SEO Consultants”
Although both sample title tags use SEO industry related keywords, the good title tag version is a clearly written, compelling option that utilizes possible target keywords in a natural way.
The same goes for your meta description. Including your target keywords in this field has the added bonus of causing your phrases to be bolded in the natural search results if a search user enters your exact wording into the engine. But while this might make you want to include as many keyword variations into a tag as possible, stick to a single phrase in order to prevent possible over-optimization penalties.
Finally, when it comes to the meta keyword tag, recommendations are mixed. While adding content to this area won’t help to earn your website higher rankings in the SERPs—or rankings for keyword phrases that you haven’t adequately targeted on your website—the keywords found here may play a role in how well your site performs on second- and third-tier search engines. Include keyword variations here if you want, but don’t spend too much time agonizing over which specific versions to list.

Meta Description Should Include a Call to Action

Now comes the fun part…
Remember, your title tags and meta descriptions aren’t just fields that you’re optimizing in the hopes of receiving some nebulous SEO boost. Instead, these fields form your snippet in the natural search results, which means that they must be written to be as compelling as possible!
Imagine encountering the two following snippets in the SERPs:
“How to Build Links in 2013
This article talks about link building techniques that will work well in 2013, including email link prospecting, social media marketing and content marketing.”
“31 Ways to Easily Attract Backlinks in 2013
Are you using dated link building practices that could be harming your brand?  Find out how to effortlessly build links using these 2013-approved techniques.” 
I probably don’t even need to ask which of these articles you’d rather read, right?
In a way, writing good meta descriptions draws on the principles of copywriting as much as it does SEO best practices. This can take practice, but the reward is a higher click-through rate, increased natural search traffic to your website and potentially higher SERPs rankings if—as some SEO experts believe—it’s true that your overall CTR contributes in some small way to your snippet’s placement in the natural search listings.
If you aren’t yet an expert copywriter, consider the following guidelines when it comes to crafting your title tags and meta descriptions:
  • Add a call to action.  Asking people to do something (as in the case of "Find out how" in the example above) often results in readers taking the action you've requested. Other possible calls to action for your meta descriptions include "Discover how," "Read more about," "Click here," or another related variation.
  • Use cliffhangers.  The first meta description shown above gives everything away: there's no real reason for the reader to click through to read the article, as its content is summed up by the snippet. Instead, use cliffhangers in your meta descriptions to encourage viewers to click through for the full story.
  • Write your tags for yourself. Once you’ve come up with a possible meta tag, ask yourself, “Would I click through based on this information?” If your tags don’t yet seem compelling, rewrite them until you come up with something more enticing.
Don’t forget, you can always test out and refine the effectiveness of your meta tag content by changing the information stored in these web page fields periodically. If you notice a spike in natural search traffic upon making a change, it’s possible that you’ve hit on a winning combination of meta tag text.

Meta tags can be improved by the use of structured data

One last thing you can do to improve both your website’s search engine optimization and its appearance in the SERPs (and, consequently, your listing’s overall click-through rate) is to use the structured data fields that create rich snippets for your brand.
Essentially, rich snippets are enhanced SERPs listings that display additional information beyond your title tag and meta description.  This additional information could include a picture, the number of people following you on Google+, or other industry-specific pieces of data (as in the case of cook times on recipe website rich snippets).
The following example (from the search query “potato soup recipe”) shows the difference between a web page that’s been optimized with structured data markup, and one that displays the traditional title tag and meta description snippet:
[Image: rich snippet example for the query "potato soup recipe", comparing a structured-data listing with a plain title/description snippet]
As you might expect, search users who wind up on this results page are significantly more likely to click the result that's been optimized to include an image, recipe rating, cook time, and calorie count. As a result, that site stands to gain much more natural search traffic to its recipe page, compared with the Food Network's plainer listing.
For complete instructions on how to add this type of information to your own site’s pages, take a look at the tutorials provided by Google and Search Engine Land. Then, get to work implementing these tips and the strategies described above on your website. The difference in both your site’s overall SEO valuation and natural search click-through rate can be significant!
Image Credit: Shutterstock / woaiss