LinkedIn rolled out their one-click endorsement feature this past month. As I’ve traded clicks with friends and colleagues, I’ve been trying to figure out what their real goal is. On the surface, this feature feels redundant with their existing “recommendation” feature. Given their aggressive efforts to promote this new feature, the data they are gathering is clearly important to the development of the algorithms behind their services – but how?
LinkedIn has been fairly quiet about the inner workings of their search engine. This is likely to prevent people from manipulating the results, since there is significant value in being on the first page of a LinkedIn search for a lucrative professional skill. They share some basic pointers about how to “be visible” on their help page. Key points from their page:
- There is no single rank for LinkedIn search – results are unique to each user/query
- The profile keywords of both parties (searcher, results) play a significant role
- Rankings are adjusted based on how prior searchers have reacted to your profile
While the above metrics are fine for identifying which candidates are relevant to a search, they don’t rate candidate quality: who actually knows their stuff? What’s missing here is a broader assessment of trust, a concept from graph analysis: some signal that candidates actually possess the skills they list on their profiles. For example, Google’s search algorithm incorporates an evaluation of the credibility of a site using link patterns, brand signals, and social activity. With these new features, it looks like LinkedIn may be trying to adapt Google’s PageRank algorithm (or something similar) to ranking candidates for specific skill sets.
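To make the PageRank analogy concrete, here is a minimal power-iteration sketch over a toy endorsement graph. The names and edges are entirely hypothetical, and this is a textbook formulation of PageRank, not anything LinkedIn has confirmed using: an edge A → B means “A endorses B for a skill,” and rank flows along endorsements.

```python
# Minimal PageRank power iteration on a toy endorsement graph.
# An edge (src, dst) means "src endorses dst for a skill".
def pagerank(edges, damping=0.85, iters=50):
    nodes = {n for edge in edges for n in edge}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    out_degree = {n: sum(1 for s, _ in edges if s == n) for n in nodes}
    for _ in range(iters):
        # every node keeps a baseline (1 - damping) share
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        # each endorsement passes a share of the endorser's rank along
        for src, dst in edges:
            new[dst] += damping * rank[src] / out_degree[src]
        # nodes with no outgoing endorsements spread their rank evenly
        dangling = sum(rank[n] for n in nodes if out_degree[n] == 0)
        for n in nodes:
            new[n] += damping * dangling / len(nodes)
        rank = new
    return rank

# Hypothetical endorsements: carol is endorsed by three people.
endorsements = [("alice", "carol"), ("bob", "carol"),
                ("carol", "dave"), ("dave", "carol")]
ranks = pagerank(endorsements)
```

Under this model, a member endorsed by many well-endorsed peers would outrank one with the same keywords but no endorsements, which is exactly the “trust” signal the keyword metrics above lack.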
In which an experienced analytics guy advises the younger generation to leave the walled garden of enterprise analytics tools and learn how to write code in a real programming language, specifically R and Python for data analysis and related programming. But hey, I’m flexible on that point…
The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.
- Edsger Dijkstra
I was taught a long time ago in some Management 101 course to sandwich constructive criticism between two compliments. So I’ll open with this statement:
SAS and the other BI vendors have done a nice job of bringing statistical computing techniques within the reach of the typical college graduate.
Now pull up a chair and grab yourself some popcorn, since I’m going to bite the hand that fed me for the first half of my career. I spent the first seven years of my career in roles involving significant usage of SAS and a variety of drag & drop query tools. The COBOL of the analytics world.
“The dog that trots about finds a bone”
- Southern Country Proverb
After twenty years in the business, I am giving up on the idea of asking brilliant questions. They don’t exist. Ironically, most of the questions that have delivered serious money tended to look like relatively dumb ones…
The first set of significant wins I had in my analytics career was in direct marketing, where I moved the campaign analysis process for a $5MM/year program in-house. From a technical perspective, this was pretty straightforward: write a SAS program to merge our mailing list with our customer file, then aggregate response and sales data. Since a common key existed on both files (finders file number), it was a simple matter to join the files and summarize the data into an Excel pivot table. Intern-level stuff.
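The join-and-summarize step is simple enough to sketch in a few lines of plain Python. The file layouts and field names below are hypothetical stand-ins for the real ones; the shape of the work (match on a shared key, roll up responses and sales by segment) is the point.

```python
# Hypothetical stand-in for the mailing list / customer file merge:
# join on a shared key, then aggregate response and sales by segment.
mailing_list = [
    {"key": 101, "segment": "A"},
    {"key": 102, "segment": "A"},
    {"key": 103, "segment": "B"},
]
customer_file = [
    {"key": 101, "responded": True, "sales": 120.0},
    {"key": 103, "responded": True, "sales": 45.0},
]

# index the customer file by key so the join is a dict lookup
customers = {row["key"]: row for row in customer_file}

summary = {}  # segment -> {"mailed": ..., "responses": ..., "sales": ...}
for rec in mailing_list:
    seg = summary.setdefault(rec["segment"],
                             {"mailed": 0, "responses": 0, "sales": 0.0})
    seg["mailed"] += 1
    match = customers.get(rec["key"])
    if match and match["responded"]:
        seg["responses"] += 1
        seg["sales"] += match["sales"]
```

The `summary` dict is the same rollup a pivot table would give you: mail counts, response counts, and sales per segment.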
Driven by a passionate desire to “scratch our own itch”, we released the first draft of a little project we’ve been working on this weekend. As our regular readers are aware, we built our first public website earlier this year. We started running ads earlier this summer (just to pay the server bills) and wanted some perspective on “what good looks like”. This is where things get hairy: there aren’t any reliable and comprehensive public sources on revenue benchmarks for a small website. So we decided to build our own data set and create some benchmarks around how much a small website could earn. Here you go…
As you can see below, profit-per-visitor numbers vary widely. More after the jump…
As many of you have heard, Google’s search engine data can give advance warning of where a flu epidemic is located. They can track the percentage of the local population who are entering queries related to flu symptoms – once these hit a certain level, there is a high likelihood that a spike in flu cases will be reported through normal channels.
The same approach also applies to video games. A certain fraction of the players use “player assistance tools” (like a Scrabble helper or a list of tips). Demand for these tools is a direct function of player activity, and that fraction tends to stay relatively constant over the life of the game.
Want to see how the demand and usage looks for a video game? Watch the cheaters.