Grabbing SEOmoz Metrics with F#
I’m regularly tasked with quickly getting some statistics about a website, and being the geeky developer I am, I tend to use F# scripts to gather this data. This time, I’m going to show you how you can use a method similar to my previous post ‘Grabbing Page Titles with F#’ to grab metrics from SEOmoz.
To try out this code, you should already have an idea what F# is and how it works. If not, check out the F# Developer Center on MSDN.
Simply put, SEOmoz is a company that provides a rich set of APIs for gathering intelligence on links. You can get access to some of their data by using Open Site Explorer, or by getting an SEOmoz account and trying out their API.
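Before you can call the API you need a request signature as well as your Access ID. As I read the SEOmoz docs, the signature is an HMAC-SHA1 of your Access ID and the expiry timestamp, keyed with your secret key, then Base64-encoded and URL-encoded. Treat this sketch as illustrative (the function name and the exact recipe are my own reading, so check the official docs before relying on it):

```fsharp
open System
open System.Text
open System.Web
open System.Security.Cryptography

/// Illustrative sketch: sign a SEOmoz request by computing an
/// HMAC-SHA1 over "<AccessID>\n<Expires>" keyed with your secret
/// key, then Base64-encoding and URL-encoding the result.
/// Check the API docs for the authoritative recipe.
let makeSignature (accessId: string) (expires: string) (secretKey: string) =
    use hmac = new HMACSHA1(Encoding.UTF8.GetBytes(secretKey))
    Encoding.UTF8.GetBytes(accessId + "\n" + expires)
    |> hmac.ComputeHash
    |> Convert.ToBase64String
    |> HttpUtility.UrlEncode
```

The result is what you'd pass as the Signature parameter in the script below.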
Below, I show you how you can use F# Interactive or write an F# script file (.fsx) to gather data from SEOmoz and display it to the screen. SEOmoz returns super-lightweight JSON strings which will need to be deserialized, so we make a reference to the excellent Json.NET library, which you’ll need to include at the top of your script.
```fsharp
#r @"C:\<your-lib-folder>\Newtonsoft.Json.dll"

open System
open System.IO
open System.Net
open System.Web
open Newtonsoft.Json

//
// Note that SEOmoz returns much more data
// than I'm using here, I just wanna keep it
// nice and simple for this blog post! - Jules
//

/// The serializable Metrics class
type Metrics =
    class
        val mutable uu : string
        val mutable pda : string
        val mutable upa : string
    end

/// Get the text from a web server over HTTP
let http (url:string) =
    try
        let req = WebRequest.Create(url)
        use resp = req.GetResponse()
        use stream = resp.GetResponseStream()
        use reader = new StreamReader(stream)
        reader.ReadToEnd()
    with
    | :? UriFormatException -> String.Empty
    | :? WebException -> String.Empty

/// Tidy the URL of the target site for use with SEOmoz
let urlTidy (url:string) =
    match url with
    | _ when url.StartsWith("http://") -> url.Substring(7)
    | _ when url.StartsWith("https://") -> url.Substring(8)
    | _ -> url

/// Get the full URL for the SEOmoz API
let mozify (url:string, acc:string, exp:string, sign:string) =
    "http://lsapi.seomoz.com/linkscape/url-metrics/"
    + HttpUtility.UrlEncode(urlTidy url)
    + "?AccessID=" + acc
    + "&Expires=" + exp
    + "&Signature=" + sign
    + "&Cols=103079215108"

/// Deserialize the JSON to a Metrics object
let deserialize (json:string) =
    match json with
    | _ when json.Length = 0 -> failwith "Got zero length input!"
    | _ when json.[0] = '[' -> failwith "Got JSON array input!"
    | _ -> JsonConvert.DeserializeObject<Metrics>(json)

/// Get URL Metrics for the URL
let getMetrics (url:string, acc:string, exp:string, sign:string) =
    let moz = mozify (url, acc, exp, sign)
    let res = http moz
    deserialize res
```
As you can see above, we have a set of small functions which take in the details of the site you want to check (and your SEOmoz account information) and return a Metrics object, which you can then access any way you like. Note that the SEOmoz URL Metrics API is capable of returning much more data than this; be sure to check out the API reference for a full list of everything you can capture about a URL.
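That Cols value in mozify isn't magic, by the way: it's a sum of bit flags, one per column you want back. The flag values below are my reading of the API reference (4 for the URL, 34359738368 for Page Authority, 68719476736 for Domain Authority), so double-check them against the current docs, but they do add up to the number used in the script:

```fsharp
// Cols is a bit-flag sum selecting which columns the API returns.
// Flag values taken from my reading of the SEOmoz API reference;
// verify against the current docs before using.
let colUrl             = 4L            // the "uu" field
let colPageAuthority   = 34359738368L  // the "upa" field
let colDomainAuthority = 68719476736L  // the "pda" field

let cols = colUrl + colPageAuthority + colDomainAuthority
printfn "%d" cols  // 103079215108, the value used in mozify
```

Want more columns? Just OR (or add) more flags into the sum and extend the Metrics class to match.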
We can test this out simply by calling this function and printing the data as shown here…
```fsharp
// let's test this out...
let m = getMetrics("http://www.blogstorm.co.uk",
                   "<your-seomoz-account>",
                   "<your-seomoz-expires>",
                   "<your-seomoz-signature>")

printfn "URL: %s" m.uu
printfn "Domain Authority: %s" m.pda
printfn "Page Authority: %s" m.upa
```
Which will return results like these…
```
URL: www.blogstorm.co.uk/
Domain Authority: 64.474430465805
Page Authority: 70.2271529562222
```
So there you have it: simple F# scripts can call the SEOmoz API to give you super-fast metrics about a URL. I recently wrote about running F# script files with PowerShell and Vim on my personal blog. Be sure to check it out if you want to turn it up to geek factor 11.