Analytic Best Practices: Managing a Proof Of Value Project

Learn Data Science
Teradata Employee

Many clients have asked me about the best practices and tools I use to manage a good proof of value (POV) project, or an analytic project in general.  Attached to this blog post is a package containing some of the tools I use to manage an analytic project.

The package contains three documents:

1.  A Kickoff Deck explaining POV Readiness.

2.  A Project Guide Template

3.  A Use Case Scorecard Template

1.  The Kickoff Deck:  This sets the stage and helps everyone understand what is required to run a good project.  It contains the following:

     a.  Team Roles and Responsibilities

     b.  How to define a good use case

     c.  Data and the raw materials of a use case

     d.  Project Management Template

2.  A Project Guide Template:  This contains all the boring project management artifacts required to run any project: 

     a.  Action Items and Risks and Issues List

     b.  Deliverable List and Scope List

     c.  Team Roles and Responsibilities

     d.  Communications Plan

     e.  Data Sources

     f.   High Level Project Plan

     g.  Status Reporting

Below is an artifact detailing the high-level project plan from the template:


3.  Use Case Scorecard:  This document lets you measure the strength of a use case against a set of criteria and answer the question: how do I avoid a science experiment?

The criteria are very simple:

     a.  Is this an appropriate Aster Use Case?  Does it fit the Aster type of analytics?

     b.  What business value does it have?  Does it have cost avoidance/revenue impact/both?

     c.  Actionable:  Can I operationalize the analytic project?  Can I change people and process?

     d.  Data Available:  Do we have the data and is it available?

     e.  Data Size and Scope:  What do I have to do to get the data and prepare the data?  Can it be harmonized?

     f.  Participation:  Do we have buy in from the business, technology, and organizational leadership?
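As a rough illustration, the six criteria above could be turned into a simple weighted rubric. The weights, the 1-5 rating scale, and the function below are my own assumptions for this sketch, not part of the attached scorecard template:

```python
# Hypothetical weighted rubric for the six scorecard criteria.
# Weights and the 1-5 rating scale are illustrative assumptions,
# not values taken from the Use Case Scorecard template.
CRITERIA_WEIGHTS = {
    "aster_fit": 0.20,        # a. appropriate Aster use case?
    "business_value": 0.25,   # b. cost avoidance / revenue impact?
    "actionable": 0.20,       # c. can we operationalize it?
    "data_available": 0.15,   # d. do we have the data?
    "data_size_scope": 0.10,  # e. effort to acquire and harmonize the data
    "participation": 0.10,    # f. business/technology/leadership buy-in
}

def score_use_case(ratings):
    """Return a 0-100 score from per-criterion ratings on a 1-5 scale."""
    total = sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())
    return round(total / 5 * 100, 1)

# Example: strong business value, but shaky data availability.
ratings = {
    "aster_fit": 4,
    "business_value": 5,
    "actionable": 4,
    "data_available": 2,
    "data_size_scope": 3,
    "participation": 4,
}
print(score_use_case(ratings))  # → 77.0
```

A use case that scores well on business value but poorly on data availability, as in the example, is exactly the kind of "science experiment" risk the scorecard is meant to surface early.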

Another area I wanted to highlight was the data subject area.  I have found that understanding these aspects of data has helped me successfully run an analytic project:

  •      Data Source Name - What is the name of the data source?
  •      Use Case Supports - What use case does it support?
  •      Internal/External - Is the data source internal to my organization, or is it external (3rd party)?
  •      Transportation of Data - Where does my data live, and how do I get it to my analytic platform?
  •      Quantity of Data - How much data do I need, and how big is it in terms of volume?
  •      Data Structure - What does my data look like?  What is its high-level structure?
  •      Data Sample - Can I get a data sample representative of the whole?
  •      Business/Technical Metadata - Can I get business metadata about the data as well as technical metadata?
  •      Data Quality Concerns - Do I have data quality, lineage, or completeness concerns?
  •      Esoteric Business Knowledge - What tribal knowledge do I lack?  Are there business rules injected into my data?
  •      Data SME - Who is my data business expert, and who are my technical experts?  Are they available?
  •      Data Privacy/Security/Sensitivity - What privacy issues, compliance risks, or other sensitivity concerns do I have, and what are my remedies?
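One way to make this checklist concrete is to capture it as a small data structure and fill it out per source. The field names, types, and gap checks below are my own illustrative choices, not a Teradata artifact:

```python
from dataclasses import dataclass, field

# Illustrative profile for one data source, mirroring the checklist above.
# Field names and types are assumptions made for this sketch.
@dataclass
class DataSourceProfile:
    name: str                       # Data Source Name
    use_case: str                   # Use Case Supports
    internal: bool                  # Internal (True) or external/3rd party (False)
    transport: str                  # how the data reaches the analytic platform
    volume: str                     # quantity of data needed and its size
    structure: str                  # high-level structure (e.g. relational, JSON, logs)
    sample_available: bool = False  # representative sample obtainable?
    metadata_available: bool = False  # business and technical metadata obtainable?
    quality_concerns: list = field(default_factory=list)  # quality/lineage/completeness
    business_rules: list = field(default_factory=list)    # esoteric rules baked into the data
    smes: list = field(default_factory=list)              # business and technical experts
    sensitivity: str = "none"       # privacy/security/compliance classification

    def open_questions(self):
        """Flag gaps worth resolving before the project kicks off."""
        gaps = []
        if not self.sample_available:
            gaps.append("no representative data sample")
        if not self.metadata_available:
            gaps.append("missing business/technical metadata")
        if not self.smes:
            gaps.append("no data SME identified")
        if self.quality_concerns:
            gaps.append("unresolved data quality concerns")
        return gaps
```

Filling one of these out per source during kickoff gives the team a single place to see which of the questions above are still unanswered.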