LinkedIn rolled out their one-click endorsement feature this past month. As I’ve traded clicks with friends and colleagues, I’ve been trying to figure out what their real goal is. On the surface, this feature feels redundant with their existing “recommendation” feature. Given their aggressive efforts to promote this new feature, the data they are gathering is clearly important to the development of the algorithms behind their services – but how?
LinkedIn has been fairly quiet about the inner workings of their search engine. This is likely to prevent people from manipulating the results, since there is significant value in appearing on the first page of a LinkedIn search for a lucrative professional skill. They share some basic pointers about how to “be visible” on their help page. Key points from that page:
- There is no single rank for LinkedIn search – results are unique to each user/query
- The profile keywords of both parties (searcher, results) play a significant role
- Rankings are adjusted based on how prior searchers have reacted to your profile
While the above metrics are fine for identifying which candidates are relevant to a search, they don’t rate candidate quality: who actually knows their stuff? What’s missing here is a broader assessment of “page trust” (an SEO concept) – confidence that candidates actually possess the skills they reference on a profile. Google’s search algorithm, for example, evaluates the credibility of a site using link patterns, brand signals, and social activity. With these new features, it looks like LinkedIn may be trying to adapt Google’s PageRank algorithm (or something similar) to ranking candidates for specific skill sets.
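If LinkedIn is indeed adapting something PageRank-like, the intuition is easy to sketch: treat each endorsement as a directed edge and let credibility flow along those edges, so an endorsement from a well-endorsed person counts for more than one from an unknown. Here is a minimal, purely illustrative power-iteration version – the names, graph, and damping factor are all my own assumptions, not anything LinkedIn has published:

```python
# Hypothetical sketch: scoring candidates for one skill by running PageRank
# over an endorsement graph (edge A -> B means "A endorses B for this skill").
# Names and parameters are illustrative assumptions only.

def skill_rank(endorsements, damping=0.85, iterations=50):
    """Power-iteration PageRank over an endorsement graph.

    endorsements: dict mapping each person to the list of people they endorse.
    Returns a dict of scores summing to ~1; a higher score means more
    endorsements, weighted by the credibility of the endorsers themselves.
    """
    people = set(endorsements)
    for targets in endorsements.values():
        people.update(targets)
    n = len(people)
    scores = {p: 1.0 / n for p in people}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in people}
        for endorser, targets in endorsements.items():
            if targets:
                share = damping * scores[endorser] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling node (endorses nobody): spread its score evenly.
                for p in people:
                    new[p] += damping * scores[endorser] / n
        scores = new
    return scores

graph = {
    "alice": ["carol"],
    "bob": ["carol"],
    "carol": ["dave"],
    "dave": [],
}
ranks = skill_rank(graph)
# carol, endorsed by two people, outranks either of her individual endorsers.
```

The interesting property for a recruiter-facing search is the recursive weighting: fifty endorsements from strangers can be worth less than a handful from people who are themselves heavily endorsed for the skill.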
In which an experienced analytics guy advises the younger generation to leave the walled garden of enterprise analytics tools and learn how to write code in a real programming language – specifically, R and Python for data analysis and related programming. But hey, I’m flexible on that point…
“The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.”
- Edsger W. Dijkstra
I was taught a long time ago in some Management 101 course to sandwich constructive criticism between two compliments. So I’ll open with this statement:
SAS and the other BI vendors have done a nice job of bringing statistical computing techniques within the reach of the typical college graduate.
Now pull up a chair and grab yourself some popcorn, because I’m going to bite the hand that fed me for the first half of my career. I spent the first seven years of that career in roles involving heavy use of SAS and a variety of drag-and-drop query tools – the COBOL of the analytics world.
“The dog that trots about finds a bone”
- Southern Country Proverb
After twenty years in the business, I am giving up on the idea of asking brilliant questions. They don’t exist. Ironically, most of the questions that have delivered serious money in the past tended to look like relatively dumb ones…
The first set of significant wins I had in my analytics career was in direct marketing, where I moved the campaign analysis process for a $5MM/year program in-house. From a technical perspective, this was pretty straightforward: write a SAS program to merge our mailing list with our customer file, then aggregate response and sales data. Since a common key (the finders file number) existed on both files, it was a simple matter to join them and summarize the data into an Excel pivot table. Intern-level stuff.
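The mechanics of that match-back are easy to sketch in Python (in the spirit of the advocacy above – the original was a SAS merge). The field names and figures below are invented for illustration:

```python
# Illustrative sketch of a campaign match-back: join the mailing list to the
# customer file on a shared key, then summarize response and sales by campaign.
# Field names ("finder_no", "campaign", ...) are assumptions for this example.

from collections import defaultdict

mailing_list = [
    {"finder_no": 1, "campaign": "spring"},
    {"finder_no": 2, "campaign": "spring"},
    {"finder_no": 3, "campaign": "fall"},
]
customer_file = [
    {"finder_no": 1, "responded": True, "sales": 120.0},
    {"finder_no": 3, "responded": True, "sales": 80.0},
]

# Index the customer file by the common key (the finders file number).
customers = {c["finder_no"]: c for c in customer_file}

summary = defaultdict(lambda: {"mailed": 0, "responses": 0, "sales": 0.0})
for piece in mailing_list:
    row = summary[piece["campaign"]]
    row["mailed"] += 1
    match = customers.get(piece["finder_no"])
    if match and match["responded"]:
        row["responses"] += 1
        row["sales"] += match["sales"]

for campaign, row in sorted(summary.items()):
    rate = row["responses"] / row["mailed"]
    print(campaign, row["mailed"], row["responses"], round(rate, 2), row["sales"])
```

The output of a loop like this drops straight into a pivot table – which is exactly why the original SAS version was intern-level work.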
I started working on my first website a few years ago. Being of a somewhat competitive nature, I was quickly drawn to objective metrics of the size of the “dent in the universe” my little project was making – the number of visitors and the duration of their engagement. This, in turn, led me into the murky world of search engine optimization.
There was the inevitable trip to Barnes & Noble, and I returned home with a tome professing to teach me how to increase my website’s search rank “in a weekend”. From an SEO perspective, its advice boiled down to:
- Performing various keyword shenanigans to convince Google you’re serious
- Using mass-submission services to (low-end) directories and search engines
- Dropping links on: forums, social profiles, article sites, blog comments, etc.
- And for those in a hurry, you could outsource a large portion of this work…
Several weeks later, Google rolled out the Panda algorithm, followed shortly by the first Penguin updates, targeting the lower-quality directories and content farms. Websites that aggressively embraced these tactics were systematically stomped off the face of the search rankings, penalized for building lots of questionable links. Fortunately, I had read a few other articles before I began working…
Driven by a passionate desire to “scratch our own itch”, we released the first draft of a little project we’ve been working on this weekend. As our regular readers are aware, we built our first public website earlier this year. We started running ads earlier this summer (just to pay the server bills) and wanted some perspective on “what good looks like”. This is where things get fuzzy: there aren’t any reliable and comprehensive public sources on revenue benchmarks for a small website. So we decided to build our own data set and create some benchmarks around how much a small website could earn. Here you go…
As you can see below, profit-per-visitor numbers vary widely. More after the jump…
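For anyone who wants to reproduce the core metric against their own numbers, it is a one-liner per site: profit per visitor is simply (revenue minus costs) divided by visitor count. The site names and figures below are made up purely to show the spread:

```python
# Illustrative sketch of the benchmark metric: profit per visitor.
# Site names and figures are invented for the example, not real data.

sites = [
    {"name": "site_a", "visitors": 12000, "revenue": 240.0, "costs": 20.0},
    {"name": "site_b", "visitors": 3000, "revenue": 150.0, "costs": 10.0},
    {"name": "site_c", "visitors": 45000, "revenue": 300.0, "costs": 40.0},
]

# Profit per visitor = (revenue - costs) / visitors, per site per month.
ppv = {s["name"]: (s["revenue"] - s["costs"]) / s["visitors"] for s in sites}

for name, value in sorted(ppv.items(), key=lambda kv: -kv[1]):
    print(f"{name}: ${value:.4f} per visitor")
```

Even with three made-up sites, the spread is an order of magnitude – which matches what we saw in the real data set.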