The IAPA salary survey came out a couple of months back, and though it is Analytics-focused it has some interesting results for those of us in the BI world. My key takeaways follow.
From a purely self-interested point of view, Analytics is a well-paid profession, and it's getting more so. Further, recruiters report that finding people is getting harder, which suggests the talent pool is not all that deep and has already been sucked fairly dry – something I experience regularly when trying to find BI talent.
If you want a job in the field, you're best off being in Sydney or Melbourne. There also appears to be a minimum education level of a bachelor's degree, with most professionals holding a master's or higher. Marketing is one of the biggest employers of analysts.
For those in the field there seems to be a mid-career slump in satisfaction (around the ten-year mark). Fresh starters are all excited and lifers seem happy too, but somewhere in the middle the enthusiasm fades.
Despite all the market enthusiasm, a significant proportion of respondents reported an ongoing challenge: analysts struggle to get their organisation to value or act on analytics findings – supporting Eugene Dubossarsky's claim that businesses invest heavily in vanity analytics so they can claim "me too" rather than to derive real value.
Technical takeaways – for all the noise, Big Data is still a Small Concern and regular-sized analytical problems are prevalent. Excel is the #1 tool used to work with data, and if you are more of an integrator, good SQL skills are king.
Last of all, there still seems to be a heavy focus on social media analytics – despite its dubious value – but it pays better. Something which underscores the vanity analytics claims further.
The end of the year is closing in fast, but there are still plenty of chances to learn from specialist providers Agile BI, Presciient and, of course, me!
Topics cover the full spread of DW, BI and Analytics, so there's something for every role in the data-focused organisation.
Build your Data Warehouse in SQL Server & SSIS with the BI Monkey
Nov 24/25 – Are you about to build your Data Warehouse with Microsoft tools and want to do it right first time?
This course is designed to help a novice understand what is involved in building a Data Warehouse, from both a technical architecture and a project delivery perspective. It also gives you basic skills in the tools the Microsoft Business Intelligence suite offers for the job.
In the world of Information Security an Advanced Persistent Threat (APT) “usually refers to a group, such as a government, with both the capability and the intent to persistently and effectively target a specific entity”.
I’ve written and tweeted and otherwise socialled the message about the threat automation poses to human cognitive labour. However, one thing I’ve skipped over – despite being, through my career choice, an implicit part of it – is the APT to human labour that the application of analytics within business represents.
Attempting to control labour productivity and costs has always been an important activity within any operation – more productivity per unit of labour at the lowest possible cost being the key aim (when was the last time you heard business groups advocating higher minimum wages?).
The Science of the Labour Analytics APT
BI & Analytics have enabled this to move from an art – i.e. “I, Bob’s manager, suspect he is a slacker, and should be fired” – to a science – i.e. “I, Bob’s manager, see he is costing more to employ than he generates in revenue, and should be fired.” To people working in sales this is nothing new – targets and bonuses have long been part of the way their performance is measured (gleefully ignoring the evidence that this is counter-productive). Now, however, everyone in the organisation can be assigned a “worth” which they must justify.
Now traditionally some components are more easily allocated value – sales people generate revenue, consultants can be sold – but areas that have been harder are starting to fall into a metricisable state. For example, analytics of customer satisfaction can work out which aspects of service – billing, support, service levels – actually matter, and consequently what the business should spend to get each function delivering the customer satisfaction needed to keep the customer. If support doesn’t really matter, don’t ask for a payrise if you work in that department.
It’s not all dark side, of course – part of the metricisation of labour has meant that some improvements to working life have come along. The realisation that happy employees are more productive has led to companies paying more than lip service (read: the obligatory Christmas party and awkward team-building events) to keeping people happy and feeling like their work is worth expending effort on. So we can all look forward to fewer beatings.
Analysts are the Architects of Unemployment
It may be a bit harsh to suggest this, but I believe that alongside the roboticists, software developers, visionaries and other people building our future, analysts are key players in removing human labour from daily life. At the futurologists’ end of the deal they are designing the learning systems which will allow machines to think, but in the here and now they are creating the basis for working out which sections of business need to be automated first.
At least the good news is that as an Analyst you will probably be one of the last to be fired… by the HR algorithm.
While I’m in a bit of a groove about the future of the workplace, I may as well talk about how there may not be a future for the workplace.
Automation destroyed the working class
The Industrial Revolution was so long ago now that it qualifies as history. The replacement of skilled labour with machines wiped out a whole class of skilled workers, but simultaneously expanded opportunities for unskilled workers to such an extent that overall standards of living rose and most people saw this as a Good Thing(tm). However, since the seventies robotics and computing have been stripping humans from the factory, to the point that the modern factory floor workforce is only a tiny proportion of what it used to be. Similar effects can be found in farming, where vast farms are now run by just a handful of people.
Any repetitive physical task can be completed by a robot – and nobody has questioned this too hard. Factory conditions are harsh and most people don’t want to perform the exact same task hundreds of times a day due to the physical and psychic toll that can take.
However, a clear upshot of this is that unskilled labour has little place in a modern economy. You could perhaps be a driver (a career with probably less than 20 years left before it becomes automated), work in retail (currently being seriously eroded by ecommerce), or construction (safe for now) – but the options are limited and shrinking. If a job doesn’t require physical presence (as bricklaying does) or face-to-face interaction (as most sales do) then it is potentially at risk.
In a debate I’ve been having recently, a friend argued that office workers are more immune… but I think she’s being rather optimistic.
Analytics will destroy the middle class
Famed economist John Maynard Keynes once predicted widespread unemployment “due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour” – i.e. we will make the economy so efficient that we don’t need all available working people to run it any more.
Now this future has long been foreseen by Science Fiction writers and falls across a wide spectrum of possibilities. There’s the wildly optimistic future presented by the late Iain M. Banks in “The Culture”, where machines effectively take care of humanity in a benign manner and give them a life of luxury and freedom. Then there is the darker end, such as UK comic 2000AD‘s character Judge Dredd‘s dystopian Mega Cities, where wealth is concentrated in the hands of the few, 99% of the population is unemployed and lives off far-from-generous state handouts, and life for most people is pretty dismal.
According to a study by Oxford University nearly 47% of US jobs are at high risk of being replaced by automation within the next 20 years. So this may be a reality we need to work out sooner rather than later. If your job involves decision making and it has routine repeatable elements to it then it is at risk of a pattern detecting engine being applied to it and that decision making process delegated to a machine. This could be as simple as approving a loan – something that is largely automated anyway – or as complex as diagnosing cancer.
Now many people may resist this and argue that a machine could never replicate the subtlety of human thinking. To some extent that is true, but the quality of human decision making is often poor, and it is arguable that handing over things such as medical diagnoses to systems that can absorb a volume of data far beyond our poor human brains’ capacity – and assess it rationally and fairly – may well improve the decisions that do get made.
So, perhaps it is time to hail our new AI overlords, and let us pray they are kind to their creators…
One theme that constantly pops up in the BI / Analytics / Big Data world is why – given we have all these amazing tools and models, etc. – the adoption of Analytics is so low. From a Microsoft perspective, Data Mining has been baked into SQL Server since 2005 – and due to negligible uptake has hardly changed since. Now I know from my colleagues in Analytics – and the fact that R continues to grow at a great rate – that it’s not a dead field. Far from it. But it’s not quite at the front of everyone’s minds either.
I think the challenges are human rather than technical. Understanding Analytics often means pushing the mind to the limits of what our poor grey lumps of brain were designed to do. We are rigged to make snap decisions with limited information to aid our survival, not contemplate the likelihood of that wolf being hungry through careful modelling and deep thought and… ouch, why is there a wolf biting my leg?
A great example of this showed up in my Facebook feed recently:
Source: these guys, who I totally don’t endorse as they might be hippies
OH MY GOD POUR ALL THE SODA DOWN THE SINK!!!
Well, er – let’s not rush. As with all internet-circulated health information, the facts are dubiously presented with no link to the source. So first of all, let’s remedy that – this is the study in question:
Cancer Epidemiology, Biomarkers & Prevention, February 2010
Hurrah for open access journals. Reading through the study, the kernel of truth is there – a statistically significant effect indicating that those consuming more than 2 sodas a week had an 85% higher relative risk of pancreatic cancer. I’m not going to scoff at that; 85% is a big uptick in risk. But it is an uptick in Relative Risk – and this is where the above image is misleading.
At face value I would take the 85% figure to mean that if I drink 2 or more cans of soda a week, I have an 85% chance of getting pancreatic cancer, i.e. the Absolute Risk. If this was the case I would ban soda from my house immediately.
However, dig into the maths and for the population study group the actual Absolute Risk of developing pancreatic cancer if you drink no soda is about 1/4500. This makes it a pretty unusual cause of death compared to the big killers like Diabetes, which is a more likely consequence of drinking excess soda. For the population studied who did drink more than 2 sodas a week, the risk jumped to 1/2500. Which is still pretty remote. It also makes for a lousy headline – much better to say the risk has increased by 85% without stating that the number refers to Relative Risk and the Absolute Risk is small. Not to mention that the study admits its findings are far from conclusive.
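Those rough figures can be sanity-checked in a few lines (the 1/4500 and 1/2500 numbers are my back-of-envelope estimates from above, not the study’s exact values):

```python
# Back-of-envelope absolute risks from above (not the study's exact figures)
risk_no_soda = 1 / 4500   # absolute risk with no soda
risk_soda = 1 / 2500      # absolute risk with >2 sodas a week

# Relative risk increase: how much bigger the exposed risk is vs the baseline
relative_increase = (risk_soda - risk_no_soda) / risk_no_soda
print(f"Relative risk increase: {relative_increase:.0%}")   # 80% - same ballpark as the study's 85%

# Absolute risk increase: the extra chance for any one person
absolute_increase = risk_soda - risk_no_soda
print(f"Absolute risk increase: {absolute_increase:.4%}")   # roughly 0.018 percentage points
```

The small gap between my 80% and the study’s 85% just reflects how rounded my 1/4500 and 1/2500 estimates are – the point is that a big Relative Risk jump sits on top of a tiny Absolute Risk.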
Absolute Risk is the chance of something happening to you if all other factors are equal. So, for example, crossing a city street with your eyes closed may have an Absolute Risk of 10% in terms of being hit by a vehicle.
Relative Risk is the adjustment to Absolute Risk when conditions alter. If it’s a highway, that risk of being hit by a car may jump to 70% – seven times the city-street risk, or a 600% increase in Relative Risk. It doesn’t mean you have a 600% chance of getting hit by a vehicle, because – well, it makes no sense to have a more-than-100% chance of something happening.
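To make the distinction concrete, here is a tiny sketch using the made-up street-crossing numbers above (note that going from 10% to 70% is seven times the risk, i.e. a 600% increase):

```python
def relative_risk_increase(baseline_pct: float, exposed_pct: float) -> float:
    """Percentage increase in risk relative to the baseline absolute risk."""
    return (exposed_pct - baseline_pct) / baseline_pct * 100

# Made-up figures from the example: 10% on a city street, 70% on a highway
print(relative_risk_increase(10, 70))  # 600.0 - seven times the risk, not a 600% chance of being hit
```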
What does this have to do with how our brains are wired for Analytics?
It explains why the above image is simultaneously accurate and misleading. The snap decision we make is Soda – Cancer – Big Risk number – Soda Bad. The deeper analysis took a bit longer, and by which point most of us have lost interest.
Analytics struggles to penetrate the human way of working because it doesn’t appeal to our way of thinking, and it takes work to understand. So the message from here is: if you are in Analytics and not being successful, it may not be because your models aren’t brilliant (I’m sure they are) – but because you cannot communicate how they work – and their value – in a way most people’s grey lumpy bits can grasp.
Disclaimer: I may have got some of the maths a bit wrong, particularly around the Absolute Risk of getting Pancreatic cancer, as I only spent 5 minutes trying to work it all out. This post does not constitute medical advice. If you take medical advice from Facebook, Twitter, Blogs or any other form of social media that has never been to Medical School, see a Doctor.