Software vs. Human Call Processing: What’s the Difference?

Software and human call processing differ in ways that matter for businesses that rely on the accuracy of phone call information.

“With the introduction of Apple’s Siri and similar voice search services from Google and Microsoft, it is natural to wonder why it has taken so long for voice recognition technology to advance to this level, and we wonder, when can we expect to hear a more human-level performance?” – Baker, Huang, and Reddy

The gap between software speech analytics and human call processing is wide, but it is narrowing. Technology has come a long way in the past 40 years, making enormous leaps in speech analytics. Companies race to patent and secure their advances as the demand for speech-enabled devices grows year after year, and mobile devices increasingly add advanced speech analytics to enhance productivity, make driving safer, and enable hands-free texting.

The goal of speech analytics for businesses is to affordably identify what happened on a phone call, whether the caller was a missed opportunity, and how that information can help both marketing and sales close more business going forward. The goal of the patent creators is to be a little more accurate, with fewer errors than their competitors, so they can claim a superior product. These goals can clash with those of the largest innovators in speech analytics technology, whose goal is to make speech analytics better than human processing.

“Speech analytics in the next 40 years will pass the Turing test.” – Baker, Huang, and Reddy

There are three major problems that software speech analytics has to overcome: background noise, echo or reverberation, and accent or dialect variation. Major scientific theories, algorithms, and models have taken shape around advances in modern computing, allowing innovative ideas to finally become a reality. In the following sections, we will discuss these three major problems that you should consider if you are interested in call processing.

“The basic learning and decoding algorithms have not changed substantially in 40 years.” – Baker, Huang, and Reddy

To properly discuss how these differences impact businesses that use speech analytics to score and process phone calls, we need to know each system's WER. The Word Error Rate (WER) is a standardized way of assessing how well software performs at speech analytics. “The word error rate (WER) is a good measure of the performance of dictation system, and a reasonable approximation of the success of a voice search system.” – Senior

“The best commercial speech analytics systems achieve 30.5% error.” – Case
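
To make that number concrete, here is a rough sketch of how a WER is produced: count the word substitutions, deletions, and insertions needed to turn the software transcript into the human reference transcript, then divide by the number of words in the reference. (This is a simplified JavaScript illustration; real scoring tools normalize punctuation, casing, and ties more carefully.)

// Simplified sketch: WER = word-level edit distance / number of reference words.
function wordErrorRate(reference, hypothesis) {
  var ref = reference.toLowerCase().split(/\s+/);
  var hyp = hypothesis.toLowerCase().split(/\s+/);
  var dp = [];
  // dp[i][j] = edits needed to turn the first j hypothesis words
  // into the first i reference words.
  for (var i = 0; i <= ref.length; i++) {
    dp[i] = [i];
    for (var j = 1; j <= hyp.length; j++) {
      dp[i][j] = (i === 0) ? j : Math.min(
        dp[i - 1][j] + 1,     // deletion (a reference word the software missed)
        dp[i][j - 1] + 1,     // insertion (an extra word the software added)
        dp[i - 1][j - 1] + (ref[i - 1] === hyp[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[ref.length][hyp.length] / ref.length;
}

wordErrorRate('set an appointment for tuesday', 'set a appointment for two day');
// 3 edits over 5 reference words = 0.6, i.e. a 60% WER

A 30.5% WER, in other words, means roughly three out of every ten words in the reference transcript came out wrong in some way.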

Background Noise: How well do Apple, Google, and Microsoft perform?

Background noise directly affects how clearly and accurately a phone conversation can be transcribed. Sources of noise include wind, crowds, music, and even screaming children. Ideally, a phone call is placed by the caller in a quiet place where they can think and talk coherently.

That’s not always the case.

The noisier the caller's environment, the harder it is for software speech analytics to accurately transcribe what the caller is saying, and that shows up as a higher WER. When callers know they are having difficulty being heard by the person on the other end of the line, they try to compensate; this is called the Lombard effect. The Lombard effect often makes software recognition even more difficult because the caller's speech fluctuates:

  • The loudness of the caller’s voice goes up and down.
  • Pitch changes in the caller’s voice.
  • The harmonic rate of words changes for the caller.
  • Syllable duration, intensity, and pausing shift for the caller.

The big takeaway on whether noise is a consideration for you and your business: how often has noise affected your company's calls in the past, and what are you going to do about it?

“Results of our study shows that performance of cloud-based speech analytics systems can be affected by jitter and packet loss; which are commonly occurring over WiFi and cellular and mobile network connections.” – Assefi

If even 25% of your business's phone calls come from cell phones, your business has been negatively affected by noise to some degree; it is ultimately up to you to put a dollar value on that expense. Having human call processing analysts screen and listen to your calls is one way to mitigate the factor of noise in calls.

Echo and Reverberation: Building Robust Call Handling Systems

Software speech analytics must also account for the direction of the voice. A simple way to understand this problem is to carry on a conversation in an empty house with hardwood floors. If you start to hear your voice echo off the walls and floor, it interrupts what you are saying. The Air Force tests echo and reverberation in echo-free (anechoic) rooms; as you can imagine, people typically won't be calling your business from that type of place.

Imagine how difficult it would be to hear someone’s voice with an echo also being picked up by the phone microphone and relayed over the call. Hello… hello… hello…

The best takeaway I can give you on echo and reverberation is this: to train software to account for the doubling effect of echoed sound, recordings of human-scored calls are used over and over again, every single day, as training data. As of today, software is entirely dependent on calls that were already scored by humans to raise the day-to-day accuracy of its phone call transcriptions, using these steps:

  1. A human call analyst is used to evaluate how well a system is at diagnosing echo and sound reverberation in the caller’s environment.
  2. After the phone call has been scored and transcribed by a human call analyst, the information is added to software speech analytics training data.
  3. The speech software then attempts to transcribe the call, and the differences between the human transcription and the software transcription produce a WER.
  4. Rinse and repeat… over and over again, millions of times, to lower the average WER.

Human call analysts are trained and vetted with a similar approach:

  1. Analysts vetted with years of call handling experience go through hours of training on typical calls, difficult calls, and how to avoid word errors.
  2. CallSource human call analysts can work from their chosen locations; they are screened, trained on industry best practices, and motivated to continually improve their accuracy.
  3. Once human call analysts are certified, they begin taking phone calls and work with mentors and colleagues to ensure that processes are being followed.
  4. Rinse and repeat… on their schedule, delivering the lowest WERs in the industry.

“The network must not only learn how to recognize speech sounds, but how to transform them into letters, this is challenging, especially in an orthographically irregular language like English.” – Alex Graves

Graves (quoted above) has researched improving speech models with neural networks and gives insight into how software output can differ from the speaker when decoding single sentences:

Example #1

Speaker: TO ILLUSTRATE THE POINT A PROMINENT MIDDLE EAST ANALYST IN WASHINGTON RECEIVES A CALL FROM ONE CAMPAIGN.

Speech Recognition Software: TWO ALSTRAIT THE POINT A PROMINENT MIDILLE EAST ANALYST IM WASHINGTON RECOUNCACALL FROM ONE CAMPAIGN.

 

Example #2

Speaker: ALL THE EQUITY RAISING IN MILAN GAVE THAT STOCK MARKET INDIGESTION LAST YEAR.

Speech Recognition Software: ALL THE EQUITY RAISING IN MULONG GAVE THAT STACKR MARKET IN JUSTIAN LAST YEAR.

Source: Towards End-to-End Speech Recognition with Recurrent Neural Networks by Alex Graves

Both software speech analytics and human call analysts require robust systems to keep WERs as low as possible. High WERs directly contribute to businesses missing opportunities and misunderstanding what actually happened on phone calls. End-to-end speech software struggles to deal with echo, and state-of-the-art solutions fall further short the worse the noise and echo in the caller's environment become.

Accents and Dialects: The curveball of call handling

According to Wikipedia, the United States has over 30 major dialects of the English language. For native-born Americans, dialect is tied to the region where you were born and raised, and also to the dialects of your parents. Assuming all of your business calls come from callers with a dialect the software has trained on, you can expect a moderately high but acceptable WER for accents. The diversity of your callers weighs heavily on the accuracy of speech analytics; you would see a shocking rise in WERs for callers with a distinct dialect or a unique accent the software has not been trained on.

“The performance of speech analytics systems degrades when speaker accent is different from that in the training set. Accent-independent or accent-dependent recognition both require collection of more training data.” – Liu Wai Kat

Accents and dialects throw a curveball at software as it attempts to decode spoken utterances into coherent sentences. The degraded accuracy produces large gaps and word errors that can completely miss what was said or intended. When the outcome depends on a single word, and that word is interpreted incorrectly, the consequences are detrimental for businesses.

The large studies on accents point out that speech analytics has been unable to conquer the Scottish variety of British English, sometimes with hilarious results (see the video below; it contains adult humor).

 

Only modest progress has been made in dealing with different dialects and accents, because they change the way words sound to software speech analytics. Robust software speech analytics requires a systematic approach, yet it must also adapt to a wide variance in pronunciation.

Human call analysts also struggle with accents and dialects unfamiliar to them. However, one huge advantage CallSource has is the ability to select and hire human analysts who are familiar with those accents. Some call analysts understand large sets of accents and dialects with ease and are fully capable of the challenge.

“The error rate from accent speakers is around 30.89%” – Liu Wai Kat

The quantifiable reasons that accents are difficult to classify, transcribe, and decode come down to the acoustic differences between accent groups. Those differences are difficult to account for while also accounting for noise and echo in the environment. For ESL (English as a second language) speakers with thicker accents, the problems are compounded if they are not in quiet, low-echo environments.

The detection of key phrases during a call is vital to understanding what happened on it. How those key phrases are used determines how the call will ultimately be classified by the system. Marketing and sales cannot move forward effectively without accurate information about what happened during an initial phone call into the business. Improvements will have to be developed for every accent and dialect individually and added to the training data collectively to overcome the problem of high WERs.

Conclusion

You should ask yourself: are you clever enough to handle high phone call accuracy rates? Can you make a difference in your business by knowing what happened on every single call? Can you achieve the results you need even with pages of word errors?

We think you are clever.

We know that you could turn that knowledge into practical business decisions for the future, and by those decisions, you can make waves in your market. Assuming you do not want to deal with all of these issues, just go back to doing what you do best:

  • Asking the caller to call you back from a quieter place over 30% of the time.
  • Asking the caller to move into a room with fewer echoes over 30% of the time.
  • Having to task a sales or service employee to re-listen to 30% of phone calls from a region with specific accents.
  • Asking the caller to call you back from a landline or a location with better reception.

The difference between software-based speech analytics and human call analysts comes down to how much of an impact accuracy makes in your business. Talk to your appointment setters and ask them if they ever have trouble hearing what people are saying; it's a good bet that if they have had trouble, your call processing has had trouble too.

Article References:

  1. Assefi, Mehdi, et al. “An Experimental Evaluation of Apple Siri and Google Speech Recognition.” 2015, www.cs.montana.edu/izurieta/pubs/sede2_2015.pdf.
  2. Hannun, Awni, et al. “Deep Speech: Scaling up End-to-End Speech Recognition.” arXiv:1412.5567, 19 Dec. 2014, arxiv.org/abs/1412.5567.
  3. Kat, Liu Wai, and P. Fung. “Fast Accent Identification and Accented Speech Recognition.” 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP99), 1999, doi:10.1109/icassp.1999.758102.
  4. Senior, Andrew, et al. “An Empirical Study of Learning Rates in Deep Neural Networks for Speech Recognition.” 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, ieeexplore.ieee.org/document/6638963/.
  5. Huang, Xuedong, James Baker, and Raj Reddy. “A Historical Perspective of Speech Recognition.” Communications of the ACM 57, no. 1 (Jan. 2014): 94-103, doi.org/10.1145/2500887.
  6. Zhang, Ying, et al. “Towards End-to-End Speech Recognition with Deep Convolutional Neural Networks.” arXiv:1701.02720, 10 Jan. 2017, arxiv.org/abs/1701.02720.
Kevin Dieny
Marketing Professional

Kevin is a contributing author and product expert in all things digital; he works as a Digital Marketing Analyst at CallSource.

Tracking Persistent User Data Across Your Website with Cookies

How to set up your own cookie to store important marketing-related information about a website visitor.

In a moment that could only be described as déjà vu, I came across a solution to a problem I was having, and it turns out I'm not the only one who has had that kind of problem. I was sending website traffic to one of the company's many websites from a paid platform and was not getting the conversion return I expected. I tested, changed, and tweaked for weeks, hitting a new low in my acquisition confidence, until I tried explaining the situation to a friend of mine.

He very meekly invited me to explain what I was doing, and I gave him the ELI-5 treatment (explain it like I'm five years old). As the words came out of my mouth, I realized that I was not tracking website visitors who navigated away from my landing page. I mean, they were tracked, but they weren't attributed. I knew about Google's multi-channel tracking capabilities and decided to see what the report would hold.

If I had a screenshot of that report it would look like a rainbow, because all of the channels were color-coded and my user reports were all over the place (in terms of channels). [See an example below from the Demo analytics account]

I remember that the top converting channel combination was simply “Direct,” and after that it was an average of 9 channels per conversion. I was in the “Assisted Conversions” report and saw that although my advertisements were fighting to break even in the first month on last-click conversions, they were more than carrying their weight by assisting future conversions.

So Google knows, in one of its reports, that my paid ads are generating conversions, just not always inside the last-click attribution window. Using this information I realized that, on average, it took about nine channel interactions to get a conversion. Not that any one of those nine interactions was valuable on its own, but it did tell me that my paid channels were generating value, and I wanted that information passed into my CRM.

Here was the problem: how do I store and pass marketing attribution data into my CRM from those channels that simply assisted in the conversion to prove their value?

So let’s break this down into the elements of the question that we would need to solve for in order to correctly answer it:

  1. How do I store marketing attribution data from the varying channels?
  2. How do I pass that data into my CRM?
  3. How can I make sense of the data to award credit to the proper channels in order to prove the worth of my paid acquisition efforts?

The first element to solve is figuring out how to store marketing attribution data from the varying channels. There are likely many ways to do this, but I will show you how to create your own cookie to accomplish it. GDPR does not want anyone storing PII (personally identifiable information), but the beauty of this approach is that you store data related to the channels (UTM values), which does not necessarily constitute PII. As long as you aren't storing data that can lead back to a single customer, you are not violating GDPR (call me on this if I'm wrong).

So what information do we want to store?

Staying away from PII, we only need to store data right above the consumer level, and the UTM parameters sit exactly there. In terms of granular data I would use the UTM basics: the Campaign, the Source, the Medium, the Content (Ad Group/Ad Set), and the Keyword (Term). None of those contain PII as long as you do not add it yourself. Beyond the UTM parameters themselves, you will want to store the values in your CRM in fields that correspond to the first time those values were captured and, possibly, the most recent time they were collected.

For example, if the “first” fields are empty, populate them with the UTM values (call them first campaign, first source, etc.). If they are already populated, leave them alone and write the values into another set of fields (last campaign, last source, etc.), overwriting whatever is there each time.
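
As a rough sketch of that first-touch/last-touch rule (the field names here are purely illustrative, not from any particular CRM; the same logic can live in a CRM workflow or in whatever code maps your form fields to CRM fields):

// Hypothetical sketch: "record" stands in for the CRM contact being updated,
// "utm" for the values read from the cookie. Field names are illustrative.
function applyAttribution(record, utm) {
  ['campaign', 'source', 'medium', 'content', 'term'].forEach(function (key) {
    var value = utm['utm_' + key];
    if (!value) { return; }
    if (!record['first_' + key]) {
      record['first_' + key] = value; // set once, on the first touch only
    }
    record['last_' + key] = value;    // always overwrite with the latest touch
  });
  return record;
}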

Here's what you will be storing (feel free to customize this however you want):

  • utm_campaign
  • utm_source
  • utm_medium
  • utm_content
  • utm_term
  • gclid
  • msclkid

The values with ‘utm’ in them will hold the UTM values each time the data is detected and the cookie is created. This creates the effect of a last-click attribution model: every time a user visits from a different channel with UTM values in the URL, the values update. You could foreseeably add any data from a web visit or from a web parameter that you wanted, but make sure you clean up your URLs in Google Analytics with filters (that's for another blog article).

Here’s what you need in order to store this data in a cookie:

  • Google Tag Manager with admin access.
  • The code scripts, variables, and any relevant tags published in GTM to deploy this.
  • I also recommend a preview environment in order to test it.

Custom Cookie Installation Instructions:

  1. Log in to the correct Google Tag Manager account.
  2. Enter the appropriate container (like a property).
  3. Create a new tag.
  4. Set the trigger to fire across ‘All Pages’ (unless you need a specific page-case).
  5. Set the tag configuration to be a Custom HTML tag type.
  6. In the HTML field, add the code below.
  7. Name the tag to comply with your taxonomy structure in GTM and Save it.
  8. At the top, click preview and then view the website where your container will fire.
  9. On the web page, at the bottom (by default), you will see the preview pane; make sure your tag name appears in the list of Tags Fired on This Page.
  10. Next, press F12 on your keyboard (on Windows) or open the developer console (I'm using Google Chrome). Then open the Application tab, and on the left expand Cookies and select your domain.
  11. When you look at the alphabetical list on the right you won't see your UTM fields and values yet… why? Because you must put the UTM parameters into the URL first.
  12. Now try reloading the page, but first add the following parameter text to the end of your page's URL (?utm_campaign=test) so it looks like (domain.com/page?utm_campaign=test).
  13. When you scroll down in the Application > Cookies > domain view, you should see the field utm_campaign and, to the right of it, the value ‘test’. You did it!
  14. Back in Google Tag Manager you can save or play around before you finally publish the changes to your live site.
  15. Last, if you have a requirement to tell your visitors what data you collect, make sure you consult with your legal team what you need to add to your privacy policy and cookie policy.
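
For step 6, the Custom HTML tag needs a script that reads the parameters out of the URL and writes each one to a first-party cookie. Here is a minimal sketch of what that could look like (an illustration, not a drop-in product; adjust the parameter list, cookie expiry, and domain to your own setup):

<script>
  // Minimal sketch: read marketing parameters from the URL and store each one
  // in its own first-party cookie for 30 days (last-click behaviour: later
  // visits with new UTM values overwrite the old ones).
  (function () {
    var params = ['utm_campaign', 'utm_source', 'utm_medium',
                  'utm_content', 'utm_term', 'gclid', 'msclkid'];
    var query = window.location.search.substring(1).split('&');
    var found = {};
    for (var i = 0; i < query.length; i++) {
      var pair = query[i].split('=');
      if (pair[0]) {
        found[decodeURIComponent(pair[0])] = decodeURIComponent(pair[1] || '');
      }
    }
    for (var j = 0; j < params.length; j++) {
      if (found[params[j]]) {
        document.cookie = params[j] + '=' + encodeURIComponent(found[params[j]]) +
          '; path=/; max-age=' + (60 * 60 * 24 * 30);
      }
    }
  })();
</script>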

Step 2: Collecting that data from the cookie and storing it in your CRM

I can't begin to know what CRM you use or whether you are even using a CRM (you know who you are), but if you have digital conversions such as form fills, phone calls, chats, etc., then you can collect this information. The moment of filling out a form, making a call, or initiating a chat is the point where the data you have been storing in the cookie gets associated with that conversion.

I cannot cover a how-to for every type of conversion, but I will detail the form because it's the simplest and can be applied to the other conversion methods depending on what you do. You will capture this information using hidden form fields that pull the data out of the cookie and store it in your CRM at the point the form is submitted. I would also make sure to add the information you are gathering to your privacy policy, or at least mention below the form that you are gathering marketing data for the purpose of serving visitors a better experience.

Some form builders have built-in tools to pull data from a cookie, or proprietary methods that could require customized JavaScript or jQuery (I just have the basics). For example, I know Marketo has a hidden field that can pull from a cookie, which makes it very easy to set this up. If you use HTML forms or have form fields that can be populated from a jQuery snippet (and your web page accepts those languages), then you can use this method:

Add this script to your form or landing page however you can:

// Requires jQuery and the jquery.cookie plugin ($.cookie)
$('input[name="INSERT FIELD NAME"]').val($.cookie('INSERT COOKIE NAME'));
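
If you are storing all of the fields listed earlier, a slightly fuller sketch can loop over them. This assumes hidden inputs whose name attributes match the cookie names, and that jQuery plus the jquery.cookie plugin are loaded on the page:

// Hedged sketch: copy each attribution cookie into the hidden form field
// with the matching name (utm_campaign, utm_source, and so on).
var attributionFields = ['utm_campaign', 'utm_source', 'utm_medium',
                         'utm_content', 'utm_term', 'gclid', 'msclkid'];
$(function () {
  attributionFields.forEach(function (name) {
    var value = $.cookie(name);                    // read the cookie set earlier
    if (value) {
      $('input[name="' + name + '"]').val(value);  // fill the hidden field
    }
  });
});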

Step 3: Making sense of all this data to paint a better picture of multi-channel attribution

The use cases are many, which brings me back to my déjà vu. In the past few years since I figured this out, I've run into dozens of fellow marketers struggling with a problem this could solve. Not every conversation ended with them slapping together their own cookie, but it was a relief for them to learn that there is a solution if they are willing to go through with it.

So here’s my warning: don’t make this, set this up, and then forget about it. You need to have an applicable use case for how you are going to use the data. Even if I tell you, this is amazing if you do it because it helped me do X, Y, or Z… you still need to know how it’s going to benefit you.

Knowing the attribution of your conversions is only half the picture. The other half is how you can impact the conversion for your higher-value customers. Use this data to better understand the differences in your customers. Do not look at your customers in aggregate without considering that they fall into different segments. Users that turn into high-value, long-retention, high-frequency buyers will be your best customers, so spend the effort (extra effort required) to understand them and where they come from.

Slice your customer data by value segments or retention segments and then take a look at your attribution channels to see what you find.

Optional Cookie Code I’ve Seen:

Why use the dataLayer? You could use the dataLayer across domains… and there are other reasons if you want to dig around.

Here is a great resource for you if you want to learn more about the DataLayer:

https://www.simoahava.com/analytics/persist-datalayer-across-pages/

Unifying the User for Marketing Attribution

How to unify marketing insights from silos of data so you have a complete picture of the customer journey.

Unifying the user for marketing attribution requires a unique identifier across the discrete silos of data so they can be merged. Sometimes you have a lot of data that can be merged, but possibly not 100% of it. The good news is that most analytics and data collection tools have a unique identifier. The bad news is that few tools share the same identifiers with the same data in those fields, such as emails, cookie IDs, phone numbers, addresses, identification numbers, or semi-randomly derived values. This sounds complex, and you might be asking yourself, “How do I work with this in my situation?”

(Tip: You will need a field in your data to account for every system or tool’s unique identifier).

First, it's helpful to start with where you want your central source of truth to be. Where do you want to go for the data? What system do you have that is equipped to contain all of the data you need? Data collection is the first step toward unifying the user, and it is why most marketers lean on their CRM or marketing automation tools, as those systems tend to be robust enough to handle it. I recommend a CRM (customer relationship management) tool.

Second, you need to take inventory of all the data you currently have access to. Do not focus on all the data you could have or might have… start with what you have today. You will need to inventory all of the data that touches the customer either directly or indirectly. You might want to prioritize the data that most directly interacts with the customer (where you can see the customer in your data) and sources that are easy for you to access.

Third, you need to model your data inventory to draw out some ideas for how it will all merge together. Use a whiteboard, mind-mapping, or sketching software to help you. To model the data, think of everything in terms of tables (think Excel). Rows will always correspond to your unique identifiers (typically over time). Data that is too large is impossible to work with in software like Excel because it will crash on you; this is why you can't just rip all of the data from everywhere and why you need to begin with a model. You don't need all the rows of data, but you do need all of the columns (the metrics and dimensions) that add context to your data. With this model you can see the unique identifiers that cross silos, and those that do not.

Fourth, you will need to write a recipe of sorts that details what gets merged with what, when, and in what order. When two identifiers match up between two tables of modeled data, you end up with a single table that includes all of the information combined. For example, say you have a table of names and emails, and another table of phone numbers and emails; when you combine them you will have a table of names, emails, and phone numbers. The email column was the unique identifier connecting the two data silos, and wherever the emails matched you now have phone numbers alongside the names. Data you feel you don't need does not have to come along, but this is why we model: so you can see ahead of time what work needs to be done.

How does this whole process look?

Let’s go through a hypothetical example using some free tools and how you could merge that data into a business intelligence tool (in this case we will use Google Data Studio). For our example let’s assume you have the following tools with data in them:

  • Google Analytics (Website data)
  • Email (Could be any email platform)
  • Digital Advertising (Could be any advertising platform)
  • CRM (Could be any customer relationship platform)

Step 1 – Decide what your ultimate source of truth will be.

Among these tools we have three possible sources of truth: Google Analytics, the email platform, or the CRM. I recommend not using Google Analytics because you are not allowed to store personally identifiable information there unless it is hashed or stored in a private way. Depending on how robust your email platform is, you could use it as the source of truth, but typically these systems are not built for that primary function. Feel free to use anything you can to achieve this, but for this example I will use the CRM.

Step 2 – Take inventory of your data sources to get your metrics and dimensions.

Google Drive is free if you have a Gmail account, so for this example I'm creating one and jumping into Google Sheets. Sheets (similar to Excel) is where I will inventory the data and do all of the merging with the Index Match function (see this article if you want to know how I do that).

For Google Analytics you can use the free Analytics add-on to pull your data. I'm going to save you some time: if you use the standard setup of GA, you aren't using the User-ID view and you are not hashing or storing your personal data in custom dimensions/metrics. Let's deal with the basic setup here, in which case the unique identifier is the cookie ID or a UTM parameter. The trouble is, if you aren't capturing this in your email platform or CRM with a form field that picks it up, you will not be able to use this data to merge. You can of course see data in aggregate, like X visitors came from a specific email (assuming you used UTMs), but you cannot see this for each unique customer unless your systems are set up to account for unique user identification. By default they are not set up this way.

So what now? Your standard Google Analytics setup is valuable for website analytics but does not come out of the box with a unique user identification system. You have to do some work to set it up that way. I wanted to include this mention: if you have set up your analytics with a way to match the user to a specific ID that is also stored in your CRM or email provider because you capture it there as well, then power to you! In that case you can use the Analytics add-on to pull any metrics and dimensions (other than the identification field) and add website analytics to your model.

For email, you have the easiest source of data yet, because you at least have the email address of each unique record in your system. After that, any data you have adds context to that email: first names, last names, company information, demographic information, any activities/interactions that took place, and possibly commerce-related fields. Most email platforms have a way to export data, but if you can only download aggregate data (leaving out the individual emails) then you will not be able to do any tying of its data to other systems.

Email to landing page insights might also exist in your email platform but likely will not exist in Google Analytics (tied to a user) by default because Google Analytics removes personally identifiable information like this on purpose.

Digital advertising data collection for the unified user is similar to email: whatever platform you are using for forms is going to have the user identified by email. If, however, you are not using an email platform like this and have your CRM attached to the forms, then you will look there. Either way, the only way to attribute digital advertising activities to users is at the capture point, such as a form, a chat, an email, or a phone call. The point where a user becomes known is often described by many platforms as a “Conversion.” By itself, saying something is a conversion in marketing is not very descriptive.

The last and hopefully the easiest place to get customer data is the customer relationship management platform (aka the CRM). In order to model the data, you need to work with it in its table format. So export all of the user-related data from the Google Analytics (if the unique identifier exists), email, digital advertising, or CRM platforms with at least the relevant dates and the unique identifiers. All of the other metrics, dimensions, and context of your data is not necessary for modeling.

Step 3 – Now we will model the data you've exported and align the unique identifiers.

If your data is not too large you can hold the data from one table in the first Excel tab, and the data from another table in the second Excel tab. Assuming this is the case, we are going to be using Index Matching to merge the second table into the first table (or vice versa). The two data tables need to have at least 1 of the unique identifiers in common. I would start with a table that has the most data potential (like the CRM) as the first tab and merge all data into it.

(Tip: If your data is too large for Excel or Google Sheets, you need a database. SQLite can run on a desktop machine, and you can provision a single file for this purpose. Is it worth it? If you are working with that much data, you should really consider what value improved data insights could provide.)

As you work through the data modeling you need to remember (or better yet, write down) all the steps you took to merge the data. The index part (the first part of the formula) points at the data you want to pull from the second tab into the first tab. The match part (the second part of the formula) compares the two tables to find an exact match of the unique identifier. For example, the index part could point at the names, and the match part could compare emails to ensure they line up.

(Tip: This is where you might realize that the data in one matches the other but there are spaces or formatting issues in one table. Therefore you might need to use format cleaning in the Excel document or change the way data is collected to make sure it’s scrubbed prior.)
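
For example, assuming the first tab holds the CRM export with emails in column A, and the second tab is named Sheet2 with emails in column A and first names in column B (an illustrative layout, not a prescription), the formula for row 2 of the first tab could look like:

=INDEX(Sheet2!B:B, MATCH(A2, Sheet2!A:A, 0))

The MATCH part finds the row in Sheet2 where the email from A2 appears (the 0 asks for an exact match), and the INDEX part returns whatever sits in column B of that row. Copy the formula down the column, then repeat it with a new INDEX column for each field you want to bring across.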

The cleanliness of the data you are merging is a major factor. You need to rinse and repeat the Index Match formula for each column of data in the second tab that you want merged into the first. At the end you will have a much larger first table than you started with.

(Tip: After you are done merging the tables, copy the merged columns and paste them back as values only, so no formulas remain in the cells, just the values.)

Now that you've done this for two tables, open a third tab (if it's possible), or delete the old second tab and start a fresh, empty one. Time to merge the next table into the first, so make sure both tables have at least one unique identifier in common. Continue this for all data tables you wish to merge until you end up with a final table of glory… hah.

Step 4 – The last step is to write down your process and make sure you can account for everything.

Writing down the process in this way prepares you to work with Structured Query Language (SQL), the basic language for processing data in most if not all systems. You are working in Excel because it is a great place to start, but you are learning a skill that can be applied to large volumes of data on a SQL-based server. The merging in SQL is done with joins, just as it is with Index Match in Excel.

Assuming you got this far now it’s time to have some fun in Google Data Studio. You do need a Google Account, but it’s easy to create one (with a strong password) and then jump in there:

Get started by saving your final data table as a (.CSV) file and then upload it as a data source into Google Data Studio.

After your data is in there, create a report. You can jump into a template designed for what you are trying to measure, but usually you can open a blank report and add charts and data visualizations to play around with.

I recommend that at the end of the day you watch some videos (take a training) on how to get the most out of Google Data Studio. At first some of it is daunting but it’s a really great tool and one that has amazing features. Most of what you learn you can even take with you into paid platforms.

Here is a decent Google Data Studio learning video from Google:

I wish I could expand more into different data types, but this can be so subjective for different businesses. At some point I will tackle the User-ID feature in Google Analytics, but it is well covered in other blogs. My experience with user data in Google Analytics is that while it is nice to have, it also means some elements of Google Analytics function differently.

Marketing 101 – Position and Perceptual Mapping

Marketing position and perceptual mapping is a valuable tool used in research to visually represent the comparative metrics and dimensions of products, brands, and services.

“When you throw dirt, you lose ground.” – Texas Saying

One of the issues that crops up in every organization is when everyone has different priorities for tasks and views the weights attached to those tasks differently. Unity is when everyone has a somewhat uniform perspective within the company at any given time. Meetings, special events, group activities, projects, inter-department meetups: all of it has been strategically created with the goal of unifying everyone.

Ultimately everyone works together but you want all of your resources working cohesively and cooperatively to be as efficient as possible. Within this complex chain of efficiency lies position and perceptual mapping. Internally, these tools are used to help unify an organization and realign the goals and priorities so everyone is helping and maximizing their efforts.

Externally, this tool is used in marketing research to inform the company of how certain customers view specific metrics. The best example of this is asking customers which brand offers the best “bang for your buck.” This simple comparison, which we make in our everyday lives, compares the cost metric to the value dimension.

[Image: marketing perceptual positioning map]

The image above is the standard design for any position and perceptual mapping. I will walk you through how to set this up and conduct your own mapping.

Step 1:

Start with a question that, if you knew the answer, would let you make your company, brand, product, or service more focused and therefore more valuable. The question needs a metric and a dimension: two key performance areas that make up the question.

“Compared to our competitors who has the most reliable product for the price?”

Step 2:

Use a representative audience and a low-bias format for delivering the question and collecting answers. The audience should be representative of the population of clients, customers, or whatever group the question pertains to.

The question should be neutral, should not lead the respondent, and should be administered professionally. You want quantifiable data, not qualitative data, so use scales or assigned values to represent the answer choices. You can also compare the elements against each other to create an ordered list, which can be turned into a simple scale.

“On a scale of 1 to 10, with 10 being best and 1 being worst, how reliable is each of these five companies?”

Step 3:

Results should be tallied, statistically reviewed, and presented with all relevant data. The scaling should match what will appear on the map. The answers produce a value for the metric and a value for the dimension (an X-value and a Y-value) that you will essentially graph.

“Company Z has 8 for reliability, but only 4 for price (8, 4).”

Step 4:

Present and evaluate results (think scientific method). There are a few considerations when it comes to mapping. A position and perceptual map is a limited view based on the people interviewed; it represents how people see the elements compared on only that dimension and metric, and it only shows you how something is viewed right now. Maps can become outdated quickly: everyone is competing, trying to interpret trends and predict the future, so anticipating shifts means redoing the map regularly.

“Right now, Company T is viewed as the price leader to our audience with a 10, while Company J is the reliability leader with a 9.”

The perceptual/positioning map is just that: a perspective, a view of how things are in the minds of those asked. These are not rigid maps to buried treasure or gold for your company. In fact, they may often be skewed and contain some amount of error because the measurement is not perfect.

These maps should be used to match and alleviate position imbalances, help you plot goals, and try to stay relevant. You can identify characteristics you may not have considered – representing fresh opportunities and market share you can conquer.

Marketing positioning and perceptual mapping is scientific but limited in scope. Map what truly matters by starting with a hypothesis and testing it with professional research. We all know how hard it is to unify; it's wishful and hopeful, but something must be done to focus and streamline goals and priorities.

Marketing Analysis with Excel – Index Match

A walk-through of marketing analysis with Excel using the Index Match functions for up-to-date, dynamic data across cell ranges.

“[The files] they’re *in* the computer?” – Hansel in Zoolander

Excel is a wonderful marketing analysis tool and one of those universal tools for working with data. When I was working for an agency with multiple clients, we struggled to find an affordable business intelligence solution. Everything kicked off in Excel.

My favorite tool in Excel (as of right now) is the Index Match combination. It actually uses the Index and Match functions together so a reference to a range of data updates dynamically. There are multiple use cases, but I will walk you through how I was taught it and how I still use it today:

[Image: Excel marketing function]

EXCEL:

Unless you already have data to work with, I would set up some sample data to play with and see if you can get this function to work for you before throwing it into official sheets. The formula works by having the Match function find the position of the last cell with data in the range and the Index function return the value in that position, so it always returns the last cell in the range.

Disclaimer: I use this function to return numeric data only, not words/phrases/etc.

[Image: Excel marketing function]

INDEX:

For the Index function we will be using the first block to indicate the row number (ROW#), as pictured above. The row should correspond to the row where your data range exists, running from left to right. Each ROW# should be the same row number (e.g., Row 20 is written 20:20).

Traditionally, you would use the first block to indicate the column where your data exists and from which the final value will be returned.

MATCH:

The second block of the Index function is used to specify the position of the updating data in the range, that is, the last cell with data in it, reading from left to right. Inside it you have the large number, and then the same row you used before, written exactly as you used it, as the second block of the Match.

Traditionally, you would use the final block to indicate the column/row you want to look up against.

The Large Number:

The large number (9.999999E+307, close to the largest number Excel can store) must be there, and as I am not an Excel specialist, I can only say that it must be larger than any number in your data so the formula narrows down to the correct numerical cell. It represents the lookup value, but in our usage it effectively covers the whole numerical range. I believe it could be replaced in order to narrow down to words/phrases if desired.
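
Putting those pieces together, a formula following the pattern described here would look something like this (assuming, for illustration, that the period-by-period data lives in C20:I20; adjust the range to your sheet):

=INDEX(C20:I20, MATCH(9.99999999999999E+307, C20:I20))

MATCH with that enormous lookup value (essentially the largest number Excel can hold) and no exact-match flag returns the position of the last numeric cell in the range, and INDEX then returns the value sitting in that position, which is why the formula always surfaces the most recent period's number.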

[Image: my Index Match function]

The image above is a sample of one of the Index Match functions I am currently using in my Analysis Sheets. I will show you exactly how I am using this function now that you know how it works. Remember I am using this for number data, not words, so test your version accordingly.

[Image: my data sheet]

A:

The A highlight shows the function that I am using, on Row 20.

[Image: A highlight (the function)]

B:

The B highlight shows the Row that I am referencing data from, Row 20.

[Image: B highlight (the row)]

C:

The C highlight references Column I, the final column in the range of data included in the function. Only the furthest-right data in this range will be returned; in this case the value is 537. The bottom two-thirds of my sheet is the Deep Dive section, where all the micro data lives, period after period.

[Image: C highlight (the Deep Dive range)]

D:

The D highlight references the top third of the sheet, the macro view of the data in the Deep Dive. The top is there for easy access and is what Management uses most, or what gets shared. The data marketers care about is in the Deep Dive; it's not pretty down there, but it's where we see opportunities.

[Image: D highlight (top macro view)]

The Index Match function dynamically pulls the most recent (furthest right) data in the sheet and displays it in the top area automatically. This helps us to limit how much data we are inputting and manually updating. There are ways to automate even that to some extent.

All in all I use these tools to maximize the most important work that I do and minimize the tedious and unimportant work. Automation is vital for the mundane tasks. Once you have sheets set up you can jump into the optimization.

If you have any questions, let me know.

Basics of Digital Marketing Analysis

One of the most effective steps you can take as a marketer right now is to start tracking the data of your most important key performance indicators.

“Prove it to me, and you’ll get the budget.” – An old boss

Digital marketing analysis can be broken down for beginners and advanced marketers alike, but only if we are talking about key performance indicators. If you want insight you need data, and not all data is the same: you need the right metrics.

Let’s talk about a few of the most important metrics for each stage of the funnel and why these might be key performance metrics for your own business. My goal here is to give you the tools you need to get started and figure this out on your own and apply it to your own situation.

VISITS

Visits are one of the most important metrics because a visit is the knock at the door, the first glance, the moment your brand makes first contact and starts bringing people back for more. You want to track visits because they represent the beginning of the funnel and come in handy when optimizing traffic.

Tip: Not all visitors are the same – unique visitors are not the same as sessions or page views.

LEADS

Leads are all too often lumped into one fat category. Leads are vital, but not all leads are created equal; they must be segmented as early as possible. If your leads came from newsletters, products, or opt-ins, tag or segment them as such.

Tip: Attribute your leads using UTM parameters and hidden fields in forms as often as possible.

ROI

The final metric is usually reserved for the CFO or the CEO; it's also the one that winds up sucking the life out of directors and managers at the end of a month or quarter. ROI can be tracked weekly, daily, or monthly, however you like, but you must track it. Learn how, and add it.

Tip: I would recommend that you stick to tracking the return on investment from marketing activities only and then adding it to the overall accounting picture at the end.
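
As a simple illustration of the math: if a campaign cost $1,000 in a month and the deals attributed to it brought in $1,500 of revenue, the marketing ROI for that campaign is ($1,500 - $1,000) / $1,000 = 0.5, or 50%.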

Some additional suggestions for KPI metrics to consider tracking are: Audiences, Likes, Shares, Followers, Backlinks, CPC, and many more.

Want more?

I’ve created a special marketing analysis template to get you started. Head on over to the shop or click the link below to learn more.