There is only one boss. The customer. And he can fire everybody in the company from the chairman on down, simply by spending his money somewhere else.
Determining the value of a customer is a holy grail in CRM Analytics. Organizations adopt different methods to rank customers by the value they bring to the organization, and marketing efforts are designed and budgets spent on customers based on that value rank.
Two popular methods of determining customer value are
RFM (Recency, Frequency, Monetary)
It is a simple and easy-to-implement solution
Performs ranking only, and the parameters can be dynamically adjusted
It requires only basic data about a customer's transaction history with the organization
CLTV (Customer Lifetime Value)
A complex approach; implementation takes more time
Requires more data for the analysis to run
The mathematical model leaves little room for ad-hoc changes
Tries to predict the future prospects of a customer
Organizations typically adopt RFM-based analysis for customer ranking first and then graduate to CLTV-based analysis. RFM analysis deals with three parameters; all three are independent of each other but are tied together by a time context. A common time context should be used, or else the final rank will be skewed.
Recency refers to how recently the customer last placed an order. (e.g. For the year 2016, Customer A's last order was on 17-Jun-2016, as of 31-Jul-2016.)
Frequency refers to the number of orders within the reference time frame. (e.g. For the year 2016, Customer A made on average 1.5 orders per month until 31-Jul-2016.)
Monetary refers to the value of all customer orders within the reference time frame. (e.g. For the year 2016, Customer A's total order value is $12,500 until 31-Jul-2016.)
First, historical customer data is used to determine the individual scores for R, F & M. The method for determining scores depends on business policies, which are adjusted based on different parameters. In the example above, assume that a three-value scale is implemented, i.e. Low = 1, Medium = 2 and High = 3.
Recency: If the customer purchased in the last 30 days, then "High"; else if purchases were made in the last 90 days, then "Medium"; else "Low". In this case, the score for Recency is 2.
Frequency: If the customer purchased 5 or more times in the last 30 days, then "High"; else if 3 or more purchases were made in the last 90 days, then "Medium"; else "Low". In this case, the score for Frequency is 2.
Monetary: If the customer purchased $5,000 or more in value in the last 30 days, then "High"; else if purchases worth $10,000 or more were made in the last 90 days, then "Medium"; else "Low". In this case, the score for Monetary is 2.
The above example illustrates a simple scoring model for RFM analysis. Based on the scores obtained, a simple rank or weighted rank can be derived as the product of the three parameters.
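The scoring rules above can be sketched as follows. The threshold values mirror the example in the text; the function names are my own, and real implementations would pull thresholds from business policy configuration.

```python
# Hypothetical RFM scoring sketch using the three-value scale from
# the text (Low = 1, Medium = 2, High = 3). Thresholds are examples.

def recency_score(days_since_last_order: int) -> int:
    # Purchased within 30 days -> High; within 90 days -> Medium; else Low
    if days_since_last_order <= 30:
        return 3
    if days_since_last_order <= 90:
        return 2
    return 1

def frequency_score(orders_last_30d: int, orders_last_90d: int) -> int:
    # 5+ orders in 30 days -> High; 3+ in 90 days -> Medium; else Low
    if orders_last_30d >= 5:
        return 3
    if orders_last_90d >= 3:
        return 2
    return 1

def monetary_score(value_last_30d: float, value_last_90d: float) -> int:
    # $5,000+ in 30 days -> High; $10,000+ in 90 days -> Medium; else Low
    if value_last_30d >= 5000:
        return 3
    if value_last_90d >= 10000:
        return 2
    return 1

def rfm_rank(r: int, f: int, m: int) -> int:
    # Simple rank as the product of the three scores (ranges 1..27)
    return r * f * m
```

With the Customer A figures from the example (R = 2, F = 2, M = 2), the product rank comes out to 8; a weighted variant would multiply each score by a business-defined weight first.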
The rank thus determined is used for further analysis, and the process is tweaked over time whenever internal or external factors change. In real-world scenarios the scoring models are more complex, and when used in conjunction with predictive analysis they can give deep insight into customer behavior.
AR Collection Effectiveness is a KPI that measures the ability of a company to collect payments from customers in a timely manner. The term "collection" is vague, and many organizations differentiate between receivables and collections. For the sake of simplicity, the assumption here is that "Collection" denotes ALL receivables, including due and overdue amounts.
Finance perspective: the Collection Effectiveness Index (CEI), seen from the Finance side, looks at account balances at different time periods to arrive at the value. The picture shows how the values are derived, which is the industry-wide accepted definition of CEI.
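The industry-standard CEI calculation can be sketched in code as follows. The parameter names are my own; the balances are assumed to be taken at the beginning and end of the measurement period.

```python
def collection_effectiveness_index(beginning_receivables: float,
                                   credit_sales: float,
                                   ending_total_receivables: float,
                                   ending_current_receivables: float) -> float:
    """Standard CEI sketch: amount actually collected divided by the
    amount that was collectable over the period, as a percentage."""
    # What could have been collected (only current receivables remain)
    collectable = beginning_receivables + credit_sales - ending_current_receivables
    # What was actually collected (total receivables remain)
    collected = beginning_receivables + credit_sales - ending_total_receivables
    return collected / collectable * 100
```

For example, with $100,000 beginning receivables, $50,000 credit sales, $60,000 ending total receivables and $40,000 ending current receivables, the index is about 81.8%.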
The AR Collection Effectiveness described here varies slightly from the financial-perspective option and instead looks at the details of invoices over time periods. The invoice value expected to be collected during a time frame, versus the amount actually collected during the same reference time frame, gives a percentage that can be read as the effectiveness of collection activity.
Example: the reference time frame is Apr-2016.
Number of all (i.e. Open and Paid) invoices with a Due Date falling on or before 30-Apr-2016
Value of all invoices with a Due Date falling on or before 30-Apr-2016
Value of payments received against the invoices identified above
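The steps above can be sketched with a few invoice records. The data and figures here are illustrative, not from the article.

```python
from datetime import date

# Hypothetical invoice records: (due_date, invoice_value, amount_paid)
invoices = [
    (date(2016, 4, 10), 1000.0, 1000.0),  # paid in full
    (date(2016, 4, 25), 2000.0,  500.0),  # partially paid
    (date(2016, 5,  5), 1500.0,    0.0),  # due after the reference period
]

cutoff = date(2016, 4, 30)  # reference time frame: Apr-2016

# Open and Paid invoices with a due date on or before the cutoff
due = [(value, paid) for d, value, paid in invoices if d <= cutoff]

expected = sum(value for value, _ in due)   # value expected to be collected
collected = sum(paid for _, paid in due)    # value actually collected

effectiveness = collected / expected * 100  # percentage effectiveness
```

Here $3,000 was due by the cutoff and $1,500 was collected, so the effectiveness for Apr-2016 works out to 50%.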
The above method is a high-level mechanism to measure the effectiveness of collection activities. Many real-world scenarios have to be factored in and fine-tuned per each organization's requirements. Some special scenarios are:
How should overdue amounts be handled?
How are partial payments treated?
Handling of agreements/contracts, e.g. AMC
The collection effectiveness illustrated here will not match the Collection Effectiveness Index (CEI) derived from financial statements. Both methods essentially show the same performance measure for Accounts Receivable as a whole but vary in their approach. The method discussed here takes a detailed approach and looks at individual transactions to calculate the final KPI value.
In most ETL jobs one will have noticed a step "Delete Target Table Index(s)" just prior to writing data into the target data mart or table, after all the processing steps are completed.
It is, in fact, one of the "ETL Best Practices" to adopt while designing ETL data flows: this precursor step brings a significant performance gain.
In a database, any INDEX created has to be maintained in sorted order for the INDEX to be effective. Whenever a record is inserted, subsequent to the INSERT operation the INDEX is reorganized to include the new value. When a hundred records are inserted, the INDEX is reorganized a hundred times (note: depending on the database, index type and DBA actions, this process can be fine-tuned). If the target table has multiple INDEXes, the problem is compounded because all the indexes have to be adjusted.
In data warehousing environments, the number of records to be inserted or updated runs into many thousands on a daily basis, and at this magnitude every performance bottleneck has to be removed. Dropping the INDEX(es) before loading data helps greatly in avoiding these costly database operations.
The loading process adopted follows this sequence:
Drop the index(es) on the target table
Load the processed data into the target table
Recreate the index(es) once the load completes
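The drop-load-recreate pattern can be demonstrated with SQLite from Python. The table and index names are illustrative; in a real warehouse this would be the target fact table and its DBMS-specific DDL.

```python
import sqlite3

# Hypothetical target table with one index, in an in-memory database
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL)")
cur.execute("CREATE INDEX idx_sales_id ON fact_sales (id)")

# 1. Drop the index before the bulk load, so the database does not
#    reorganize it once per inserted row
cur.execute("DROP INDEX idx_sales_id")

# 2. Bulk-load the processed records into the bare table
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(i, i * 1.5) for i in range(1000)])

# 3. Recreate the index in a single pass after the load completes
cur.execute("CREATE INDEX idx_sales_id ON fact_sales (id)")
conn.commit()
```

The index is built once over the loaded data instead of being adjusted a thousand times, which is exactly the saving the precursor step buys in an ETL flow.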
Sales Velocity, or Opportunity Pipeline Velocity, is a KPI that indicates the general direction and speed at which the sales pipeline is moving, based on historical opportunity conversions. It is an indicator of a sales organization's effectiveness vis-a-vis the initiatives taken over time. Sales Velocity can be considered a throughput measure of opportunities through the pipeline.
The classic definition of velocity is "an object's rate of change in position with respect to time". Sales Velocity, as a KPI, measures positive conversions, i.e. Won Opportunities, over a period of time. The widely adopted formula to derive Sales Velocity is:
Sales Velocity = (Number of Opportunities x Average Deal Value x Win Rate) / Length of Sales Cycle (in days)
This standard formula pro-rates the final value to a daily amount based on the data available as of the current date; it should be interpreted as a snapshot of sales velocity. The moment a new opportunity is identified, or the value of an opportunity changes, the pipeline velocity has to be recalculated. If any of the three measures in the numerator increases, or the denominator decreases, the final Sales Velocity moves up.
An alternative approach to deriving Sales Velocity uses the cumulative opportunity value trend over a period of time and calculates the growth percentage between each pair of adjacent periods. The time context should have a beginning and an ending period, with one additional period at the start so that the first growth figure can be calculated. The base measure used in this approach is the Opportunity Value of won opportunities during the respective time periods. Plotting both values, i.e. Opportunity Value and Growth %, similar to the sample on the right, shows the sales velocity over time based on actual data. In the sample graph one can easily identify that growth is falling despite the increase in opportunities, and can take action accordingly.
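The cumulative-trend calculation can be sketched as follows. The quarterly won-opportunity values are invented for illustration; note the extra leading period (Q4 2015) that exists only so the first growth figure can be derived.

```python
# Hypothetical won-opportunity value per period, in $ thousands,
# including one extra period at the start for the first growth figure
won_value = {"Q4 2015": 80.0, "Q1 2016": 100.0,
             "Q2 2016": 115.0, "Q3 2016": 120.0}

periods = list(won_value)            # insertion order = time order
cumulative, growth_pct = {}, {}

# Running cumulative opportunity value per period
running = 0.0
for p in periods:
    running += won_value[p]
    cumulative[p] = running

# Growth % between each pair of adjacent periods
for prev, cur in zip(periods, periods[1:]):
    growth_pct[cur] = (cumulative[cur] - cumulative[prev]) / cumulative[prev] * 100
```

With these numbers the cumulative value keeps rising (180, 295, 415) while growth falls from 125% to about 64% to about 41%, reproducing the pattern the article describes: growth declining even as opportunity value increases.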
In conclusion, both components are crucial to interpreting sales velocity. The historical trend chart shows the past performance of sales velocity, and when combined with the standard formula it exposes the variables a business user can adjust to change the direction of Sales Opportunity Velocity.
Opportunity Conversion Duration is a KPI that measures the time taken for each opportunity that was won during a specific time period. In a typical sales cycle, only a small fraction of sales pursuits end on a favorable note for the organization; the Opportunity Conversion Rate shows the number of sales deals won during a time period. Because of this funnel-like narrowing, the usual choice of visualization for sales-pipeline metrics is a funnel chart.
To arrive at the time taken for the opportunities won during a specific period, one has to check the time period in which each opportunity was actually won. In the example above, Opportunity A was closed in Q2 2016 even though the pursuit started in Q1 2016.
So, Opportunity Conversion Duration can be defined as the average length of time taken across all opportunities won during a specific time period, irrespective of the start date of each individual opportunity.
The following steps give an overview of how to derive this KPI:
Get all opportunities that were won during the time period (e.g. Q2 2016)
Calculate the time taken from the Lead stage to the Signoff stage for each opportunity (e.g. Opp A: 120 days, Opp B: 45 days, Opp C: 71 days)
Take the average of the values from step 2 (e.g. (120 + 45 + 71) / 3, about 79 days)
The average number of days taken to win during Q2 2016 is roughly 79, i.e. the Opportunity Conversion Duration for Q2 2016 is 79 days.
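The three steps above can be sketched directly from the example figures; in practice the durations would be computed as the difference between each opportunity's Signoff date and Lead date.

```python
from statistics import mean

# Days from Lead stage to Signoff stage for each opportunity won in
# Q2 2016 (figures from the example in the text)
durations = {"Opp A": 120, "Opp B": 45, "Opp C": 71}

# Opportunity Conversion Duration = average across all won deals
conversion_duration = mean(durations.values())
```

The average comes out to about 78.7 days, which rounds to the "roughly 79 days" conclusion for Q2 2016.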
Both KPIs related to opportunity conversion, i.e. Rate & Duration, are reported together, and due to the way these KPIs are calculated, especially the time context, it is hard to draw trends (e.g. showing monthly conversion rates for the last 12 months). A sample visualization will look similar to the table below.