
Worst Practices while deploying a Predictive Model (Contd.)

27 Jul

In my previous article we looked at the worst approaches organizations follow when deploying a predictive analytics project. This article describes how to deploy a successful predictive analytics model.

Successful Predictive Analytics Deployment

Now that we’ve discussed the wrong approach to predictive analytics, let’s look at some of the critical steps that must be taken to ensure its success.

Understanding the Business Need

As mentioned earlier, it is crucial for companies to identify the drivers behind the predictive analytics project in the early planning stages. Once an organization defines what new information it is trying to uncover, what new facts it wants to learn, or what business initiatives need to be enhanced, it can build models and deploy results accordingly.

 Understanding the Data

A thorough collection and exploration of the data should be performed. This enables those building the application to become familiar with the information at hand, so they can identify quality issues, glean initial insights, and detect relevant subsets that can be used to form hypotheses, with input from domain experts, about hidden information. It also ensures that the available data can address the business objective.
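As a minimal sketch of what this exploration might look like in practice, the Python snippet below uses pandas on a hypothetical customer extract; the file and column names (customers.csv, segment, churned) are illustrative assumptions, not a prescription.

```python
import pandas as pd

# Hypothetical customer extract; the file and column names are illustrative.
df = pd.read_csv("customers.csv")

# Get familiar with the information at hand.
print(df.shape)
print(df.dtypes)
print(df.head())

# Identify quality issues: missing values and duplicate records.
print(df.isna().sum().sort_values(ascending=False))
print("duplicate rows:", df.duplicated().sum())

# Glean initial insight and look at relevant subsets, e.g. churn rate by segment.
print(df.describe(include="all"))
print(df.groupby("segment")["churned"].mean())
```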

 Preparing the Data

To get data ready, IT organizations must select tables, records, and attributes from various sources across the business. Data must be transformed, merged, aggregated, derived, sampled, and weighted. It is then cleansed and enhanced to optimize results. These steps may need to be performed multiple times before the data is truly ready for the modeling tool.
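A hedged sketch of this preparation work might look like the following, with the source tables, column names, and sampling rate all being illustrative assumptions:

```python
import pandas as pd

# Hypothetical source extracts; table and column names are assumptions.
customers = pd.read_csv("customers.csv")        # one row per customer
transactions = pd.read_csv("transactions.csv")  # one row per transaction

# Select and aggregate: roll transactions up to one row per customer.
spend = (transactions
         .groupby("customer_id")["amount"]
         .agg(total_spend="sum", purchase_count="count")
         .reset_index())

# Merge the aggregated behavior onto the customer attributes.
df = customers.merge(spend, on="customer_id", how="left")

# Cleanse and derive: fill gaps, add a derived attribute, drop duplicates.
df["total_spend"] = df["total_spend"].fillna(0)
df["purchase_count"] = df["purchase_count"].fillna(0)
df["avg_order_value"] = (df["total_spend"] / df["purchase_count"]).where(df["purchase_count"] > 0, 0)
df = df.drop_duplicates(subset="customer_id")

# Sample if the full volume is not needed for modeling, then hand off.
model_ready = df.sample(frac=0.2, random_state=42)
model_ready.to_csv("model_ready.csv", index=False)
```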

 Modeling

Once the data has been prepared, various modeling techniques should be selected and applied, and their parameters calibrated to optimal values. The choice of modeling technique is determined by the underlying data characteristics or by the desired form of the scoring model. In other words, some techniques explain the underlying patterns in the data better than others, so the outcomes of several modeling methods should be compared. A decision tree might be chosen, for example, when it is important to have an easy-to-interpret set of rules as the scoring model. Several techniques can be applied to the same scenario to produce results from multiple perspectives.
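For example, a hypothetical churn scenario could be modeled with two techniques side by side, a decision tree for its interpretable rules and a logistic regression for comparison. The sketch below uses scikit-learn and the illustrative file produced in the preparation sketch above:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical prepared dataset with a binary target column "churned".
df = pd.read_csv("model_ready.csv")
X = df[["total_spend", "purchase_count", "avg_order_value"]]  # illustrative features
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Apply more than one technique to the same scenario and compare the outcomes.
models = {
    "decision_tree": DecisionTreeClassifier(max_depth=4, random_state=42),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
for name, clf in models.items():
    clf.fit(X_train, y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")

# A shallow decision tree also yields an easy-to-interpret set of scoring rules.
print(export_text(models["decision_tree"], feature_names=list(X.columns)))
```

Keeping the tree shallow (max_depth=4 in this sketch) keeps the extracted rules short enough for business users to read.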

 Evaluation

Thorough assessments should be conducted from two unique perspectives: a technical/data approach often performed by statisticians, and a business approach, which gathers feedback from the business issue owners and end users. This often leads to changes in the model; but while the technical/data evaluation is important, it should not be so stringent that it significantly delays implementation and use of the model. The model’s business value should be the primary test.
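Continuing the modeling sketch above (and reusing its fitted models and hold-out split), an evaluation might pair the usual statistical measures with a rough business-value check; the customer value, offer cost, and success rate below are purely illustrative assumptions:

```python
import numpy as np
from sklearn.metrics import classification_report, roc_auc_score

# Reuse the fitted decision tree and hold-out split from the modeling sketch.
model = models["decision_tree"]
scores = model.predict_proba(X_test)[:, 1]

# Technical/data evaluation: standard statistical measures on held-out data.
print("AUC:", round(roc_auc_score(y_test, scores), 3))
print(classification_report(y_test, model.predict(X_test)))

# Business evaluation: a rough value estimate for acting on the flagged customers.
# The figures below are illustrative assumptions, not real economics.
value_of_retained_customer = 500.0
cost_of_offer = 50.0
offer_success_rate = 0.30

y_true = np.asarray(y_test)
flagged = scores >= 0.5
expected_value = (
    (flagged & (y_true == 1)).sum() * offer_success_rate * value_of_retained_customer
    - flagged.sum() * cost_of_offer
)
print("estimated campaign value:", round(expected_value, 2))
```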

 Deployment

Deployment, the final step, can mean one of two things: the generation of a single report for analysis, or the implementation of a repeatable data mining or scoring application. The goal is to create a reusable application that can generate predictions for large volumes of current data. The results are then distributed to front-line workers in a format they are comfortable with – reports, dashboards, maps, or graphics – to enable proactive decision-making.
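Continuing the same sketch, a repeatable scoring application can be as simple as persisting the chosen model and rerunning it against current data on a schedule, writing the output where reports and dashboards can pick it up; all file and column names here are illustrative:

```python
import joblib
import pandas as pd

# Persist the chosen model once, so scoring is repeatable (names are illustrative).
joblib.dump(model, "churn_model.joblib")

def score_batch(input_path: str, output_path: str) -> None:
    """Score a batch of current customer data and write results for reporting."""
    scoring_model = joblib.load("churn_model.joblib")
    current = pd.read_csv(input_path)
    features = current[["total_spend", "purchase_count", "avg_order_value"]]
    current["churn_risk"] = scoring_model.predict_proba(features)[:, 1]
    # Hand the results to front-line tools: a report, a dashboard feed, and so on.
    current.sort_values("churn_risk", ascending=False).to_csv(output_path, index=False)

# Example of a repeatable run, e.g. scheduled nightly against fresh data.
score_batch("current_customers.csv", "churn_scores.csv")
```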

Avoiding common worst practices and adopting best ones is the key to successfully implementing and using predictive analytics. By knowing which pitfalls to avoid and which important steps to take, companies can accelerate implementation, maximize user adoption, and realize substantial ROI.

About the Author

Shaughn is an industry analyst for business intelligence. For over ten years, he has assisted clients in business systems analysis, software selection, and implementation of enterprise applications. Shaughn is the channel expert for BI in the small and mid-market segments at ZSL and conducts research on leading technologies, products, and vendors in business intelligence, marketing performance management, master data management, and unstructured data. He can be reached at shaughnk@zslinc.com, and you can visit Shaughn’s blog at zslbiservices.wordpress.com.

Worst Practices while deploying a Predictive Model

23 Jul

With the advent of predictive analytics, purely reactive decision making has proven inadequate. Organizations have started to take a more proactive approach, using predictive analytics to make critical decisions and to uncover problems and opportunities. Predictive analytics not only lets an organization forecast problems and opportunities, it also helps keep undesirable future outcomes from happening. Companies can therefore foresee problems well in advance and neutralize them at an early stage. According to research from IDC, organizations using predictive analytics solutions generate an average return on investment of 145 percent. Regrettably, many companies don’t implement it correctly and fail to achieve these desired results.

Common Worst Practices while deploying a Predictive Model

As beneficial as predictive analytics can be to an organization, implementation and deployment projects often fall apart or fail to get underway due to common poor practices, procedures, and decisions, such as:

  • Failing to focus on a specific business initiative that predictive analytics can enhance
  • Ignoring crucial steps, such as data preparation and access, or deployment of results
  • Spending too much time evaluating models
  • Investing in tools that yield little or no returns
  • Failing to operationalize findings

Failing to Focus on a Specific Business Initiative

Often, companies begin building their predictive application with only loose goals in mind, hoping to discover something critical that they don’t already know. They end up trying one analytical model after another, which forces developers into a never-ending cycle of definition, evaluation, and fine-tuning. The better approach, and the basis of a successful predictive analytics endeavor, is to define the project objectives and requirements that will satisfy the business need. Predictive analytics is most effective when it is used to identify expected cases and to apply insight from specific patterns and trends in the existing data to those new cases.

Ignoring Critical Steps

One of the most frequent failures in deploying predictive analytics is skipping critical steps. When deploying a predictive model, many organizations put their effort only into the steps they consider important and ignore the data preparation and access process. In reality, this should be the activity to which the most effort is devoted; data preparation typically accounts for approximately 60 to 80 percent of the cost of a predictive modeling initiative.

Spending Too Much Time on Model Evaluation

Predictive models must be evaluated to determine how accurately they predict patterns. First, they must be measured from a data perspective; then they must be assessed from a business perspective to ensure they will meet end-user expectations and requirements. Accuracy comes at a cost, and companies must decide how precise they need their models to be. Companies often tend to over-evaluate. They add new variables to the models to increase their accuracy, which often requires rebuilding. They test and retest the models endlessly, spending tremendous amounts of time on continuous refinements because the models are not quite perfect. This delays deployment and prevents the organization from realizing the substantial advantages that predictive analytics can offer.

There is a tradeoff to be made between time to market, usefulness, and accuracy. Companies can either sacrifice some precision to accelerate deployment, or hold up implementation and rollout – and delay the realization of benefits – to achieve higher levels of accuracy. The truth is, if a model is better than the current approach to forward-looking decision-making (and it likely is), then it should be considered ready for deployment. No model will ever be perfect, because shifting business strategies and evolving end-user needs require continuous modifications.
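To make that point concrete, the sketch below (on synthetic data, purely for illustration) compares a candidate model against a stand-in for the current rule of thumb; once the model wins, it is arguably ready to deploy:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data, purely to illustrate the decision logic.
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Stand-in for the current approach: a single-feature rule of thumb.
current_rule_scores = X_test[:, 0]

# Candidate predictive model.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
model_scores = model.predict_proba(X_test)[:, 1]

print("current approach AUC:", round(roc_auc_score(y_test, current_rule_scores), 3))
print("candidate model AUC: ", round(roc_auc_score(y_test, model_scores), 3))
# If the model beats the current approach, it is ready enough to deploy;
# further refinement can continue after it starts delivering value.
```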

Investing Heavily in Analytic Tools With Little or No Return

Several common mistakes are made when investing in predictive analytics tools. Companies frequently end up buying expensive, complex analytical tools that are far too sophisticated for their needs. These solutions not only carry very high price tags; they are also typically hard to deploy and difficult for anyone other than statisticians and experienced analysts to use. As a result, they often contain features and functions that will never be used. All of these factors significantly reduce an organization’s ROI.

Failing to Operationalize

 For predictive analytics to succeed, it must be embedded into applications that are leveraged whenever users need to make decisions. If an application is not built and deployed, the effort devoted to creating a model will do nothing to enhance forward-looking decision-making. The results will remain in a document that few people will refer to in support of their daily activities. However, when a model is incorporated into a dashboard or reporting environment, the results will be readily accessible to end users, whenever they need them. This will help to create an analytics-driven culture across the entire business.

How to Avoid Worst Practices

The worst practices we have highlighted don’t have to derail a predictive analytics initiative. In fact, they can all be easily avoided by:

Driving ROI

When planning a predictive application, companies must consider total cost of ownership and anticipated return, to ensure that maximum value is achieved.

Focusing on Bottom-Line Initiatives

Create models that will provide forward-looking intelligence to help solve specific problems (e.g., minimizing customer churn by uncovering the factors that contribute to it) or help to achieve certain goals (e.g., increasing up-sell and cross-sell revenue by understanding what new products customers are most likely to buy).

 Preparing Data

Guarantee the most accurate possible results by ensuring that disparate data is easily and properly accessed and cleansed before the models are created and applied.

 Evaluating the Model Without Over-Evaluating

The model must be tested to ensure that it provides better decision-making capabilities over current analysis methods. But over-evaluation can delay deployment and hinder ROI. It simply needs to be assessed until it is determined that it will provide value. At that point, it can be implemented. The statistical properties of the finished model are secondary to the value it brings to the business.

 Deploying the Results

The insight provided by predictive analysis efforts must be shared with key stakeholders across and beyond the organization. For example, a bank that has predicted which customers are most likely to churn should disseminate that information to all those who interact with those clients, including call center staff and branch personnel. That way, everyone can contribute to correcting the problem and ensure that countermeasures are being implemented.
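A minimal sketch of that dissemination step might split the scored customers by branch and give the call center one consolidated feed; the file and column names (churn_scores.csv, branch, churn_risk) are assumptions for illustration:

```python
import pandas as pd

# Hypothetical scored output from the predictive model.
scores = pd.read_csv("churn_scores.csv")

# Each branch gets its own list of at-risk customers, highest risk first.
at_risk = scores[scores["churn_risk"] >= 0.5].sort_values("churn_risk", ascending=False)
for branch, customers in at_risk.groupby("branch"):
    customers.to_csv(f"at_risk_customers_{branch}.csv", index=False)

# The call center receives the same information as one consolidated feed.
at_risk.to_csv("call_center_at_risk_feed.csv", index=False)
```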

In the next article, we will see how to adopt best practices, the key to successfully implementing and using predictive analytics.
