Simplistic Analytics | Go2Cab Top Five Actions to Avoid Simplistic Analysis and Business Insight Mistakes

Go2Cab Actions for Business Analytics (Source: Go2Cab Pty Ltd)

Organisations tend to think of tools as the cure for business transformation, business insight, business analytics and operational challenges. This tendency to treat tools as a miracle cure has increased substantially with the ease of developing and deploying applications on smart mobile devices. The other factor organisations tend to ignore when thinking about business insight is the root cause of the problem.
Organisations rush into investments in band-aid solutions with the justification that getting something done is better than nothing. In doing so, business insight and guidance rely mainly on simplistic analysis: reporting averages, tallies and changes in the percentage of “items” over time. Such “bad” evidence leads people, whether at C-level or downstream at the lowest levels of the operation, to make the wrong decisions because operational dynamics and measurable business objectives are out of sight.

This blog focuses on selected important actions any organisation should take to avoid making the wrong decisions while mistakenly believing it is using “evidence” backed by the right measurement and analysis methods. The reality is that the “evidence” at hand is misleading.
Action 1 for Business Insight and Business Analytics: Get the Right People with the Right Skills
When organisations become interested in exploring business insight, the first common mistake is to launch a batch of surveys or, if the organisation is large enough, to appoint change managers, program and project managers and a suite of business analysts who run a series of workshops to brainstorm “things”. Before you know it, you will most likely have burnt hundreds of thousands of dollars of your budget. The payback and ROI are much greater if you first assemble a team with the right skill set. The team does not have to be large; a few people are typically adequate.

Instead, the role of such a team is to decide which vital-few data to gather, select the best mechanisms for gathering it, facilitate that gathering and properly analyse the data. Equally if not more important, the team must be able to explain clearly, and help you decide, how the outcome of the analysis will be used, by whom, when and for what purpose, to address specific segments of one or more measurable business objectives. Such an approach is the surest path to quickly gaining the sought-after business insight.

Organisations that have demonstrated continued success in using analytics, with significant impact on operational excellence and improved bottom lines, always strike the right balance of team skills, combining subject matter experts, operational staff and system modellers. Analytics and business insight are a lot more than undertaking statistical analysis. Data scientists or statisticians who do NOT fully comprehend the operational dynamics of the business and the challenges faced by operational staff can undertake the most sophisticated analysis on the planet, yet the outcome may mean nothing to the operational arm of the business. The same applies to subject matter experts who can use one or more analysis tools (such as Excel or any other tool) and think that by simply inserting data and clicking a few buttons, the answer will lead to the best decisions. The reality is that modelling the business dynamics of a situation and applying advanced analytics require a thorough understanding of the concepts behind the algorithms used for the analysis. One main reason is that, in addition to classical statistical methods, it is vital to incorporate the behaviour of the “factors” or “elements” as time elapses, leading to predictive outcomes. In this way, you know what to do, how to do it and the expected outcome BEFORE it is too late.
Action 2 for Business Insight and Business Analytics: Do Not Jump into Big Data, Machine Learning and Over-Engineered Analyses
Most organisations hold a large amount of data and either do not know what to do with it or do the wrong thing with it. Classical examples are averaging data and calculating the percentage of “things” over time, labelling such outcomes as the probability of occurrence of “things” and treating the numbers as leading indicators of whether the business is heading in the right or wrong direction. In most cases, the timing of the analysis is wrong anyway, since you are always analysing the past without predicting the future through “proper” analysis.

Examples of such “things” include customer satisfaction, projections of sales and profit margins, mergers and acquisitions, managing a healthy diet, and the reduction or retention of staff. Other examples to consider are the time it takes to discharge patients, the time to clear incidents on roads, travellers' journey times, the cost of an insurance policy, product quality, effective management and scheduling of patrols and other emergency services, customer churn and so on.

It has become the trend that, after a few classical workshops conducted under the banner of business insight, a decision leads to the conclusion that the main reason the organisation is not getting its decisions right is its inability to handle a large amount of data, and hence the cure is to start spending on big data and some machine learning “stuff”. There is nothing wrong with big data and advanced machine learning. However, the organisation MUST first prove the limitations of simpler, yet adequately sophisticated, tools for undertaking the proper analysis before entering the realms of big data. Most classical desktop statistical and system dynamics off-the-shelf tools include very sophisticated machine learning algorithms that can handle millions of rows of data, and vendors of such tools offer very reasonable and affordable price points too. The issue is not about the tools or big data. Before you know it, you will have spent millions of dollars on CRM, case management and ERP systems, workflow engines, business intelligence tools and the like with little or no increase in business yield. And when you figure it out, it is typically too late.
Action 3 for Business Insight and Business Analytics: Do Not Rely on the Arithmetic Average of Data
Sometimes advances in technology backfire: various industry verticals have shown that while a new technology helps achieve something new, the same technology can cause already-solved challenges to resurface. In other cases, the new technology makes it easier for people to misuse it, or people become lazy and rush into decisions before properly “thinking” about the topic. The examples are too many to list, but a few will illustrate the concept.

When stealth technology for fighter jets was in its trial period, experts found that while the technology made the jets almost undetectable by radar, it also made them more susceptible to electronic interference than aircraft built with older technology; the older metal skins provided inherent protection for the electronic devices installed inside the aircraft. Another example is the classical word processor. In the old days, people thought hard while drafting material by hand before typing it on a typewriter, because once the pages were typed there was no room to insert new lines or move words around. Advances in word processing help authors with editing and spell-checking, among many other benefits, but people now tend to think far less about what they want to write, why, and how to write it. Everyone receives numerous emails that make no sense, yet such emails still take time to read before you decide whether or not to reply. Think of how much time you typically waste every week because of misuse of the “tool” alone.

Another classical example, more pertinent to the topic of this blog, is Excel. The tool is one of the best achievements in technology, providing extraordinary power to undertake simple or sophisticated analysis. The problem is that anyone can insert data, calculate the average of something over time, plot colourful tally and other charts and, lo and behold, the average shows an increase or decrease in the prescribed measurement of the item under consideration, which in turn leads to a wrong decision. Decision makers are misled not only by the averages and the charts that represent them but, more importantly, by the belief that they are now using evidence. The average of a data set is truly “evidence” but, in most cases, it is the wrong evidence.
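As a minimal illustration of the point (hypothetical journey-time figures, not real Go2Cab data), the short Python sketch below shows two data sets with the same average but very different risk profiles:

```python
import statistics

# Hypothetical daily journey times (minutes) for two routes over ten days.
route_a = [30, 31, 29, 30, 32, 30, 29, 31, 30, 28]   # consistently around 30 min
route_b = [18, 19, 20, 18, 65, 19, 70, 18, 33, 20]   # usually quick, occasionally awful

for name, times in [("Route A", route_a), ("Route B", route_b)]:
    mean = statistics.mean(times)
    p90 = statistics.quantiles(times, n=10)[-1]         # 90th percentile
    bad_days = sum(t > 40 for t in times) / len(times)  # share of badly late days
    print(f"{name}: mean={mean:.1f} min, 90th percentile={p90:.1f} min, "
          f"days over 40 min={bad_days:.0%}")

# Both averages are 30 minutes, yet Route B leaves travellers badly late on
# roughly one day in five. The average alone hides the risk entirely.
```

The same mean, reported on its own, would make the two routes look identical to a decision maker.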

There is a natural tendency to “centralise” our focus when we think about or discuss a topic. Managers typically ask for a magic number to represent far too many uncertainties or factors. Just think of how many times your manager, or someone else, has asked you for one number to represent the time to complete a task, the delivery of goods, the number of defects in a product and so on. What makes it worse, senior managers tend to average the already-averaged numbers provided by their subordinates. By the time the decision makers get the report (or the numbers), the data is so highly aggregated that the average makes no sense, even if we assume it made some sense at the lowest level at some point in time.
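A small hypothetical sketch of why averaging averages goes wrong: two teams report their own average task times, and a manager averages those two numbers instead of weighting them by how many tasks sit behind each one.

```python
# Hypothetical task times (hours) reported by two teams.
team_a = [2.0, 2.2, 1.9, 2.1]                      # 4 routine tasks
team_b = [8.0, 7.5, 9.0, 8.5, 30.0, 25.0,
          8.2, 7.9, 9.1, 8.8, 26.0, 27.5]          # 12 complex tasks

avg_a = sum(team_a) / len(team_a)
avg_b = sum(team_b) / len(team_b)

# What often happens upstream: average the two reported averages.
average_of_averages = (avg_a + avg_b) / 2

# The pooled figure: weight each average by the number of tasks behind it.
all_tasks = team_a + team_b
true_average = sum(all_tasks) / len(all_tasks)

print(f"Team A average:       {avg_a:.2f} h over {len(team_a)} tasks")
print(f"Team B average:       {avg_b:.2f} h over {len(team_b)} tasks")
print(f"Average of averages:  {average_of_averages:.2f} h  <- understates the workload")
print(f"True overall average: {true_average:.2f} h")
```

Here the average of averages (about 8.3 hours) sits well below the true pooled average (about 11.5 hours), because the small team's figure is given the same weight as the large team's.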

Some organisations think they have it right by using what is commonly known as the three-point approach, exploring worst-case, base-case and best-case numbers (or scenarios). While this is slightly better than relying on one magic number, the situation is not much healthier. Organisations tend to forget that once the lowest or highest “numbers” or “limits” are stated, the inherent assumption is that nothing will or can change beyond those magic thresholds. There is also little or no regard for the distribution, or for the uncertainty in the “change” from the worst to the best scenario passing through the “normal”, as time elapses. In most cases, there is little common understanding or agreed definition of a “scenario” in the first place. To make things worse, once the magic numbers are stated, they are carved in stone, leading to major catastrophic decisions.
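One hedged way to go beyond three fixed numbers is to treat them as parameters of a distribution and simulate. The sketch below uses hypothetical figures and a triangular distribution chosen purely for illustration; the point is the spread of outcomes, not the particular distribution.

```python
import random

random.seed(42)

# Hypothetical three-point estimate for a task duration (days).
best, base, worst = 10, 14, 30

# Treat the three points as a triangular distribution and sample from it.
samples = [random.triangular(best, worst, base) for _ in range(100_000)]

mean = sum(samples) / len(samples)
over_base = sum(s > base for s in samples) / len(samples)
p90 = sorted(samples)[int(0.9 * len(samples))]

print(f"Mean duration:            {mean:.1f} days (not the {base}-day base case)")
print(f"Chance of exceeding base: {over_base:.0%}")
print(f"90th percentile:          {p90:.1f} days")
```

With these illustrative numbers the mean lands around 18 days and the base case is exceeded roughly 80% of the time, which is exactly the kind of information a single “most likely” number hides.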
Action 4 for Business Insight and Business Analytics: Do Not Rely on Percentages and Tally of Data
When you are at a workshop or a meeting, pay attention to the number of times participants refer to the “percentage” of something. The audience's focus locks onto the highest or lowest bars of attractively animated charts, and mental decisions are made unconsciously. Just like “average”, “percentage” is one of the most common words aired among participants, and both are so easy to calculate in Excel. Producing various types of charts, colours and labels quickly becomes the focus, missing the fact that the “percentage” is another commonly accepted, yet often misleading, outcome used by decision makers. The rush into reporting percentages also demonstrates how the ease of use of technology backfires, leading people to make very serious and very fundamental mistakes.

A classical example is averaging categorical data collected via surveys or questionnaires (eg where the answers to questions are “Good”, “Poor”, “High” and “Low”). Another example is calculating the probability of something taking place (or not) using the wrong methods, with little or no regard to data type, uncertainty or the distribution of the data. The situation is worse when people use the “average” and “percentage” to make decisions on the reliability of an “item”. It is worth noting that every figure representing the reliability or probability of something occurring is expressed as a “percentage”, but NOT every “percentage” is a probability or the reliability of an item.
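A hypothetical illustration of the categorical-data trap: two survey results produce the same “average score” once the answers are coded as numbers, even though one audience is broadly lukewarm and the other is sharply split.

```python
from collections import Counter

# Hypothetical survey responses coded on an ordinal scale:
# 1 = Poor, 2 = Fair, 3 = Good, 4 = Excellent
survey_x = [3, 3, 2, 3, 2, 3, 2, 3, 2, 2]   # everyone mildly positive or neutral
survey_y = [4, 4, 1, 4, 1, 4, 1, 4, 1, 1]   # audience split between the extremes

for name, answers in [("Survey X", survey_x), ("Survey Y", survey_y)]:
    avg = sum(answers) / len(answers)
    counts = dict(sorted(Counter(answers).items()))
    print(f"{name}: average score = {avg:.1f}, response counts = {counts}")

# Both "average scores" come out at 2.5, yet Survey Y hides a polarised
# audience. The ordinal labels never justified arithmetic averaging at all.
```

The numeric coding is arbitrary, so the 2.5 carries no real meaning; reporting the full breakdown of responses is the safer habit.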

In some cases, organisations inadvertently misuse proper reliability analysis methods. For example, people mistakenly apply the Weibull distribution to non-hardware domains, such as factors that represent the behaviour of people, factors that contribute to the reliability of journey times in transport, the time to discharge patients, or factors that relate to the reliability of a software application. The common mistake is assuming that the Weibull distribution can represent software or the behaviour of people. One of the main concepts behind the distribution is that the reliability of “items” such as hardware is characterised by things typically going wrong during the initial period of the lifecycle, then stabilising beyond the teething period until the wear-and-tear period kicks in. Organisations tend to forget that software and people's behaviour (a paramount factor in transport, the delivery of customer services and manufacturing-related challenges, to name a few) have nothing to do with “wear and tear”. In some cultures, drivers slow down when they observe an accident in a breakdown lane, causing congestion in the normal “flow” lanes, or they slow down for no apparent reason, building up a queue that blocks intersections. Similarly, once a piece of software works, if the code and operating environment have not changed, the software does not exhibit any wear and tear, unlike a piece of hardware (eg a classical mechanical switch, a spring or a door hinge).
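To make the wear-and-tear argument concrete, the sketch below evaluates the Weibull hazard rate for three illustrative shape parameters (the numbers are arbitrary, chosen only to show the phases of the classic bathtub curve): a falling hazard for teething failures, a constant hazard, and a rising hazard for wear-out.

```python
# Weibull hazard rate: h(t) = (k / lam) * (t / lam) ** (k - 1)
# k < 1: teething phase, the hazard falls over time
# k = 1: constant hazard (equivalent to an exponential, "memoryless" model)
# k > 1: wear-and-tear phase, the hazard rises over time

def weibull_hazard(t: float, k: float, lam: float = 1000.0) -> float:
    """Instantaneous failure rate at time t (hours) for shape k and scale lam."""
    return (k / lam) * (t / lam) ** (k - 1)

for k, phase in [(0.5, "teething"), (1.0, "constant"), (3.0, "wear-out")]:
    rates = [weibull_hazard(t, k) for t in (100, 500, 2000)]
    trend = "flat" if k == 1 else ("falls" if rates[0] > rates[-1] else "rises")
    print(f"shape k={k} ({phase}): hazard {trend} -> "
          + ", ".join(f"{r:.2e}" for r in rates) + " per hour")

# Software whose code and environment do not change has no physical wear-out
# mechanism, so forcing a k > 1 "wear-out" fit onto it misstates the risk.
```

The rising-hazard case is the part of the model that encodes physical degradation, which is precisely what unchanged software and human behaviour do not exhibit.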
Action 5 for Business Insight and Business Analytics: Do Not Ignore the Moving Parts of the Business
The topic of Action 5 is far more than classical statistical analysis. It sits at the heart of a relatively large domain spanning systems engineering, Design of Experiments and business operational dynamics. The one-line summary is that even if you undertake the perceived “proper” analytics, the organisation may still make the wrong decisions, carrying unqualified high risks, if the analysis does NOT incorporate the operational dynamics. When the analysis incorporates the operational dynamics, tightly coupled with the business operational strategy and an ACTIONABLE operating model, one can quickly identify the factors that most influence the desired measurable business objectives.

While one can argue that Excel facilitates complex analyses, it is not easy to visualise the “changes” in operational dynamics as time elapses. Please refer to Stella Architect, FlexSim or ExtendSim as a set of tools that Go2Cab promotes and uses as an alternative to Excel when appropriate. For example, in Excel, when you change a number in one cell, all related numbers change instantly, with NO DELAY. The other typical deficiency of Excel is the lack of traceability of “units” unless you are extremely careful when hooking up formulas. Excel will produce a number regardless of whether or not the units are correct. Such phenomena lead to higher risks when you implement complex cell structures, formulas and linkages between cells, sheets and workbooks. So long as you are NOT dividing by zero, Excel does not care and, in almost all cases, it will produce a number. It is not easy to inherently trace the units associated with numbers, nor is it easy to incorporate the dynamic interactions, as time elapses, between the several factors that most influence the desired outcome.
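A minimal sketch of the “delay” point, using hypothetical numbers rather than any Go2Cab model: a service backlog where extra staff only become productive after a hiring delay. The delayed response is exactly what an instant spreadsheet recalculation cannot show.

```python
# Hypothetical stock-and-flow sketch: a service backlog with a hiring delay.
# New work arrives at a fixed rate; management adds staff in response to the
# backlog, but each hire only becomes productive HIRING_DELAY weeks later.

ARRIVALS_PER_WEEK = 100          # new requests per week
WORK_PER_STAFF = 10              # requests one staff member clears per week
HIRING_DELAY = 6                 # weeks between deciding to hire and starting
TARGET_BACKLOG = 200

backlog = 600.0
staff = 8.0
pipeline = [0.0] * HIRING_DELAY  # hires decided but not yet productive

for week in range(1, 25):
    # Decision rule: hire in proportion to how far the backlog exceeds target.
    hires_decided = max(0.0, (backlog - TARGET_BACKLOG) / 400)
    pipeline.append(hires_decided)
    staff += pipeline.pop(0)     # only decisions made HIRING_DELAY weeks ago arrive

    cleared = min(backlog, staff * WORK_PER_STAFF)
    backlog += ARRIVALS_PER_WEEK - cleared

    if week % 4 == 0:
        print(f"week {week:2d}: backlog={backlog:6.0f}, staff={staff:5.1f}")

# Because of the delay, the backlog keeps growing for weeks after the first
# hiring decision. A static recalculation that adds staff "instantly" would
# never reveal that overshoot-and-correction behaviour.
```

System dynamics tools exist to model exactly these delays and feedback loops; the point of the sketch is only that time-delayed behaviour, not the arithmetic itself, is what the spreadsheet view hides.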

Go2Cab can share numerous real-world examples where businesses and operations have suddenly collapsed simply because Excel could not cut it, despite the right analysis having been undertaken.
Go2Cab's evidence-based decision-making approach will ensure that poorly made decisions do not take place. Any analysis shall meet well-defined criteria that support the Customer's objectives and the performance targets set for the organisation, taking into account operational dynamics, people, change management, learning and knowledge sharing. The outcome of the analysis shall be, at all times, actionable, accurate, relevant and timely. In this way, a visualisation set of tools becomes a perfect solution. Such tools enable ease of visualisation, global access, sharing and auto-notification of the INFLUENCING FACTORS in a manner that is ACCURATE, RELEVANT and TIMELY, with little or no bespoke code or complex IT setup, all delivered quickly at a relatively low cost.
Go2Cab hopes we have shared enough information to help you identify similar situations in your domain or organisation. We also hope that you are better empowered to think through decision-making in different ways.
Please feel free to contact us for a free consultation session. We hope we can offer you some help with actionable, evidence-based decision making.

Please share with us your view or your situation.

Evidence Based Decision Making

Company contact
Go2Cab
Noel Samaan
PO Box 492 492
NSW 1765 Sydney
+6140004431
noel.samaan@go2cab.onmicrosoft.com
https://go2cab.com

Press contact
Go2Cab
Noel Samaan
PO Box 492 492
NSW 1765 Sydney
+6140004431

noel.samaan@go2cab.onmicrosoft.com
https://go2cab.com
