A cheat sheet on how to tackle an Analytics Project

Analytics helps solve problems across many application areas, such as healthcare, retail, climate science, crime, banking, and fraud. Each of these applications calls for some basic domain knowledge. For instance, to solve a problem for a retail client, you would need a general understanding of retail operations.

However, what happens when you are assigned a problem or project?

Does the approach differ from one domain to another? No. Regardless of domain, the Analytics Project Life Cycle moves through the same typical stages.


The Analytics Project Life Cycle

  1. What is the problem?
  • Understand the type of problem to be analysed – predictive analytics, prescriptive analytics, machine-to-machine implementation, root cause analysis, and so on.
  • Define the problem and the project objective – the steps involved and the metrics for success
  • Understand the scope of the project – specifics, budget, time, other considerations
  • Draft a schedule and scope
  • Decide how to tackle the project – develop a customised in-house solution or implement a vendor product; benchmark candidate products for the latter.
  2. What are the available data sources?
  • ‘Extract’ or compile the available data
  • Evaluate data quality
  • Perform EDA (Exploratory Data Analysis)
  • Populate fields
  • Clean the data to improve quality and consistency
  • Is the available data sufficient to solve the problem?
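The data-quality and EDA bullets above can be sketched in a few lines of Python. The records and field names here are purely illustrative assumptions; real projects would typically use a library such as pandas for this.

```python
# Minimal data-quality / EDA sketch on extracted records
# (the store/sales fields are hypothetical examples).
from statistics import mean, median

records = [
    {"store_id": 1, "daily_sales": 1200.0},
    {"store_id": 2, "daily_sales": None},   # missing value to be flagged
    {"store_id": 3, "daily_sales": 950.0},
    {"store_id": 4, "daily_sales": 1100.0},
]

def profile_field(records, field):
    """Report completeness and summary statistics for one field."""
    values = [r[field] for r in records if r.get(field) is not None]
    return {
        "missing": len(records) - len(values),
        "mean": mean(values),
        "median": median(values),
        "min": min(values),
        "max": max(values),
    }

print(profile_field(records, "daily_sales"))
```

A profile like this answers the last bullet directly: if too many fields are missing or out of range, the available data is not yet sufficient to solve the problem.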
  3. What additional data do you require?
  • Is historical data required?
  • Is the data required in real time?
  • Address the storage and access of such data
  • Which fields are needed?
  • What granularity is desired?
  4. What analysis would you implement?
  • Address the next step – how to ‘Transform’ the data
  • Identify and remove outliers
  • Select an appropriate imputation methodology
  • Conduct cross-correlation analysis
  • Select the best-fit model
  • Apply sensitivity analysis
  • Measure and test the model
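A minimal sketch of the transform-and-model bullets, using only the standard library. The z-score threshold, mean imputation, and simple linear model are illustrative assumptions, not the only choices.

```python
# Transform-and-model sketch: outlier removal, imputation, model fit.
from statistics import mean, stdev

def remove_outliers(values, z=2.0):
    """Drop points more than z standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) <= z * s]

def impute_mean(values):
    """Replace missing entries (None) with the mean of observed values."""
    m = mean(v for v in values if v is not None)
    return [m if v is None else v for v in values]

def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

xs = [1, 2, 3, 4, 5]
ys = impute_mean([2.1, 4.0, None, 8.2, 9.9])   # fill the gap, then fit
slope, intercept = fit_line(xs, ys)
```

In practice the "select best-fit model" step would compare several candidate models on held-out data; this sketch just shows the shape of the pipeline.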
  5. How will you implement or deploy the model?
  • Address encoding or recoding of the model
  • Verify the model – temporal logic, scalability, which verification algorithms to use
  • Check and debug
  • Frequency of updates
  • Need for an API
  • Workflow analysis tools
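One way to handle the "encoding of the model" bullet, sketched with Python's built-in pickle. The artifact layout, version string, and model structure are assumptions for illustration; a production setup would more likely use a model registry or a dedicated serialisation format.

```python
# Deployment sketch: persist a fitted model with version metadata so the
# serving side can verify exactly what it is running.
import pickle

model = {"slope": 1.98, "intercept": 0.11}   # stand-in for a fitted model

artifact = {
    "model": model,
    "version": "1.0.0",          # hypothetical versioning scheme
    "trained_on": "2020-01-01",  # illustrative training date
}

blob = pickle.dumps(artifact)    # in practice, write to a file or registry

loaded = pickle.loads(blob)
def predict(x, m=loaded["model"]):
    """Serve a prediction from the reloaded model."""
    return m["slope"] * x + m["intercept"]
```

Versioning the artifact is what makes the later "frequency of updates" bullet tractable: each retrain ships a new version, and the API can report which one answered a request.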
  6. How will you communicate the results?
  • Dashboard architecture
  • Modes of visualisation
  • Integration of results
  • Reporting
  • Are the key deliverables adequately represented?
  7. Does it require maintenance or monitoring?
  • Post-implementation review and versioning
  • Model monitoring – placing alerts and processes for problem resolution
  • Real-world testing – stress tests, other tools to use
  • Metadata to attach to facilitate future troubleshooting
  • Follow-up on team feedback
  • Recommendations for action
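The "placing alerts" bullet can be as simple as comparing live data against a training-time baseline. The relative-drift rule and the 25% tolerance below are illustrative assumptions; real monitoring would track several statistics per feature.

```python
# Monitoring sketch: flag drift when the live mean of a feature moves
# further from the training baseline than a relative tolerance allows.
from statistics import mean

def drift_alert(baseline_mean, live_values, tolerance=0.25):
    """Return True when relative drift exceeds the tolerance."""
    drift = abs(mean(live_values) - baseline_mean) / abs(baseline_mean)
    return drift > tolerance

baseline = 100.0                                 # hypothetical training mean
print(drift_alert(baseline, [95, 102, 99, 104]))    # stable inputs
print(drift_alert(baseline, [140, 150, 135, 145]))  # drifted inputs
```

When the alert fires, the post-implementation process kicks in: investigate, retrain if needed, and ship a new model version.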
