Data driven marketing: kth.diva-portal.org 1337198/FULLTEXT01.pdf



DEGREE PROJECT IN COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS STOCKHOLM, SWEDEN 2019

Data driven marketing

How to gain relevant insights through Google Analytics

JENNY CARLSSON STÅBI

KTH ROYAL INSTITUTE OF TECHNOLOGY
SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE


ABSTRACT In this report, problems regarding the retrieval, measurement, and analysis of data when analysing marketing effects in the web analytics tool Google Analytics are discussed. A correct setup, configuration, maintenance, and campaign tracking, together with an understanding of the data in Google Analytics, are essential for achieving relevant insights. This is important since many Swedish marketing departments experience issues related to their setup of Google Analytics as well as to the ongoing configuration and maintenance. A literature study was conducted to gather information, focusing on theories from researchers and experts in the fields of web analytics and marketing analytics. Google Analytics data and reports from several Swedish companies were studied to gain a deep understanding of how the tool is used for measuring and analysing marketing effects. Interviews with employees at marketing departments and media agencies were conducted and analysed qualitatively. A thematic analysis of the interviews resulted in eight themes, which are presented in the result section. The result was analysed and discussed in relation to the theory. The interviews showed that there is a difference in knowledge and experience between senior and junior analysts, and that there is a significant learning curve when working in Google Analytics. The junior analysts trusted the data and did not know about campaign tracking and filters, in contrast to the senior analysts, who did not trust the data as a control mechanism and did work with campaign tracking and filters. Furthermore, the senior analysts had a better understanding of the data models in Google Analytics, such as attribution models, which are known to tell different stories depending on which attribution model is being used.
The conclusions are four capabilities: a need for more and better control over the setup and over the data, a wider use of campaign tracking, and wider knowledge both of the data and the data models in Google Analytics and of the business the organisation is conducting, in order to gain relevant insights.




Data driven marketing - How to gain relevant insights through Google Analytics

Jenny Carlsson Ståbi
KTH Royal Institute of Technology
School of Electrical Engineering and Computer Science
[email protected]


Author Keywords Marketing analytics; Web analytics; Google Analytics; Analysis; Measurement; Data driven; Effects; Insights

INTRODUCTION Over the past decade there has been an exponential growth in customer, competitor and marketplace information. This growth has given companies the opportunity to meet their customers’ needs by delivering and connecting the right services and products to the right people, and to do so at the right time, through the right formats, devices and channels. This cannot be done without a proactive and strategic approach to marketing analytics [15]. To adopt such an approach, companies need to be able to measure and analyse their media investments and the effects of their marketing. The area is complex, and there has been, and still is, some uncertainty about what to measure and how to measure advertising. Internet and digital media account for over 50 % of the media investment in Sweden, and the share is steadily increasing. As digital media has grown over the past decade, the requirements on measurement methods in the area have increased. The industry wants to be able to evaluate its digital communication with the same insights as other media [20].

This study focuses on the web analytics tool Google Analytics (GA), which is the most common tool for web analysis, used by approximately 50 % of all websites on the Internet [5]. Since GA is so widely used, it is a well-motivated subject for this report. The business areas examined in this report are editorial, informational, e-commerce, and banks.

Web analytics is all about pattern recognition and is a statistical compilation of data that can give the analyst an idea of why something happened and what is likely to happen next [13]. Marketing analytics involves the technologies and processes used to evaluate and measure the success and value of the marketing efforts [7]. For the marketer this can provide actionable insights into, for example, which marketing effort drew traffic to the website, how paid media campaigns perform, and how to identify and segment customers [13]. Companies with advanced analytic capabilities are likely to outperform their rivals because they are able to make faster, better executed, and more strategic decisions [14].

However, with the increasing amount of statistics available, working in these tools has become increasingly complex. Without prior knowledge, experience or insights, it can be difficult to draw conclusions about what the statistics indicate, or even to determine which statistics are relevant to analyse. The amount of data can be overwhelming, which might lead to no conclusions, or even to wrongful conclusions [21]. Many companies have large amounts of data but lack the expertise to analyse it in order to gain insights. Furthermore, 36 % of marketing departments do not trust their GA data, and up to 50 % of employee time is spent on finding and correcting errors in the data. One reason for these issues is that the setup and installation are not understood [6]. Poor data governance is also an issue; data governance is the capability of an organisation to ensure that high-quality data exist [16]. Another problem is that data collection is rarely verified [6]. This shows that marketing departments are in


need of guidelines for the setup and maintenance of the data, as well as for understanding the data.

GA does not necessarily reflect the truth, and it can be difficult to know whether the data matches reality. This makes it problematic to base important business decisions on insights gained solely from GA data. When using GA to measure and analyse the effects of marketing efforts, some aspects are critical: the initial setup, the configuration and maintenance, campaign tracking, and the capability to draw correct conclusions in order to gain relevant insights [5].

Purpose The goal of this research is to contribute to the field of marketing analytics. The purpose is to assist marketing departments that have problems with their GA, related to the setup, control, and understanding of the data. The objective is to identify these problems and to discuss possible solutions.

Research question Which capabilities are needed when using GA in order to conduct an efficient and reliable analysis of the marketing effects at a Swedish marketing department?

THEORY The theory section starts with marketing-focused facts on GA and data governance. It then covers four identified main problem areas: initial setup, configuration and maintenance, campaign tracking, and gaining insights.

Google Analytics GA is a widely used web analytics tool provided by Google. There is a free version and a paid version of the tool. GA consists of four main components: collection, configuration, processing and reporting [5]. GA can be used in a number of ways by different parts of a company; in this study the focus is on marketing analytics and the metrics most commonly used to measure and analyse marketing effects.

Conversions are desirable, pre-defined (by the analyst) actions the organisation wants its customers to perform on its website, such as buying something, signing up for an email list, or downloading an article. Goals can also be pre-defined in GA, for example a number of visitors or a number of conversions [5]. In GA, the marketing analyst can see the different traffic sources and how much traffic each of them generates to the website. The traffic sources are commonly visualised through bar and pie charts in the GA interface, and can be divided into categories such as "Top Channels". Organic search is when visitors reach the website through a non-paid search, whereas paid search is traffic from search engine results that is the result of paid advertising. Referral traffic is traffic from another website that links to the website. Social is traffic from social networks such as Facebook and Twitter. "Other" is traffic that does not fit into any other source or has been tagged as other via a URL parameter. Direct traffic is a traffic source that is supposed to contain traffic that occurs when users navigate directly to the website by typing the URL into a web browser or by opening a bookmark. However, direct traffic can contain a lot more than that. GA reports traffic as direct when it lacks data on how the session arrived at the website. This can also happen if the referring source has been configured to be ignored, or when the user is using a mobile application, since mobile applications often do not send referral information along when a user clicks a link. In other words, GA uses direct as a fallback for when its processing logic has failed to attribute a session to a particular source. This can make it hard for the analyst to know how close to the truth the traffic sources in GA are. The best way to overcome this obstacle is to tag all marketing efforts with UTM tags, which are discussed further on [11].

There are several plugins that are widely used when working with marketing analytics in GA. Google Ads is a tool that enables companies to buy paid search. Google Tag Manager is a tool that allows the user to manage and deploy marketing tags (snippets of code or tracking pixels) on the website without having to modify its code. This tool comes in handy when a company has a lot of tags to manage, because all of the code is stored in one place. Through Google Tag Manager one can control other tools such as Google Ads, Facebook Pixel and Hotjar [10].

Data governance Data quality is important for companies striving to become data-driven. Data governance can be seen as a structural framework for responsibilities and decision-making rights regarding the use of data in an organisation. This framework includes a company's capabilities to ensure high data quality, and data governance aims at maximising the value of data assets [16]. To obtain high-quality data in an organisation, technical components such as the initial setup, as well as the configuration and maintenance of GA, have to be done correctly [2].

Initial setup If the initial setup has been done incorrectly, the data will almost certainly also be incorrect. For example, if the installation code has been added more than once to a page on the site, the users will be counted more than once, and the reported page views will be at least double the actual page views. Another common pitfall is that the code is not present on every page of the website, leading to no data from the pages where the code is lacking. The initial setup is therefore essential in order to retrieve correct data from the beginning [2]. If a company is having problems with incorrect data, it is often wiser to focus on improving the way new data is created than to try to clean existing data. Making sure that the setup is correct is a first step towards reliable data [18].
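The double-counting and missing-snippet pitfalls described above can be checked for mechanically. The report contains no code, but as an illustrative sketch, the following Python snippet (the tracking ID and page sources are invented examples) flags pages where the GA snippet appears any number of times other than exactly once:

```python
# Sketch: detect pages where the GA tracking snippet appears more than
# once (double-counted page views) or not at all (no data collected).
# The tracking ID and the page sources below are hypothetical.

TRACKING_ID = "UA-12345678-1"

pages = {
    "/index.html": "<script>ga('create','UA-12345678-1');</script>"
                   "<script>ga('create','UA-12345678-1');</script>",
    "/about.html": "<script>ga('create','UA-12345678-1');</script>",
    "/contact.html": "<p>No tracking code here.</p>",
}

def audit_tracking(pages, tracking_id):
    """Return pages whose snippet count is not exactly one."""
    issues = {}
    for path, html in pages.items():
        count = html.count(tracking_id)
        if count != 1:
            issues[path] = count
    return issues

print(audit_tracking(pages, TRACKING_ID))
# {'/index.html': 2, '/contact.html': 0}
```

In a real audit the page sources would be fetched with a crawler rather than hard-coded, but the principle is the same: every page should carry the snippet exactly once.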

Configurations and maintenance After the initial setup, there are important configurations in GA that should be done correctly in order to retrieve correct data. These include filtering out bots and internal traffic, both of which are likely to skew the data if not filtered out. Internal traffic typically skews the data because employees are likely to behave differently from the "usual" user traffic, and it is the customers, not the employees, one wants to analyse [5].

Bots are an increasing problem, as they continue to grow in volume and become more sophisticated. There are good bots and bad bots. Good bots scan websites to help


prospective customers find the correct website, whereas bad bots harm businesses. Bad bots are used for several purposes, including price and content scraping, account takeover and creation, credit card fraud, denial of service, and inventory and gift card balance checking. These bots mimic human behaviour and can be hard to spot. According to Distil Networks, approximately 20 % of all internet traffic is bad bots and 17.5 % is good bots, while 62 % is estimated to be genuine, human traffic (see figure 1). Both good bots and bad bots can skew the GA data so much that the numbers might be invalid as a basis for decision making. Intelligently separating human traffic from bot traffic is essential for making informed business decisions [8]. The ongoing maintenance of the data includes updating configurations continuously, keeping track of the data, and being attentive to sudden changes and abnormalities. If the numbers seem too good to be true, they probably are not true. It is also a good idea to set up several views: a raw data view, where no filters are applied, which can be used as a backup; a master view with all filters applied, used for reporting and analysis; and a testing view with only internal traffic, where one can test new settings, filters, etc. [5].
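The kind of exclusion a GA view filter performs can be sketched in a few lines. The following Python example (the session records and the office network range are made up) keeps only sessions whose source IP falls outside the internal network, which is what an internal-traffic filter does for the master view:

```python
import ipaddress

# Sketch of what a GA internal-traffic view filter does: exclude
# sessions originating from the company's own network so that employee
# behaviour does not skew the analysis. Sessions and the office
# network below are hypothetical.

INTERNAL_NETWORK = ipaddress.ip_network("10.0.0.0/8")

sessions = [
    {"ip": "10.1.2.3", "page": "/pricing"},     # employee on office LAN
    {"ip": "203.0.113.7", "page": "/pricing"},  # external visitor
    {"ip": "10.4.5.6", "page": "/about"},       # employee
]

def exclude_internal(sessions, network):
    """Keep only sessions whose source IP is outside the given network."""
    return [s for s in sessions
            if ipaddress.ip_address(s["ip"]) not in network]

filtered = exclude_internal(sessions, INTERNAL_NETWORK)
print(len(filtered))  # 1
```

In GA the same logic is configured declaratively as an IP-exclusion filter on a view rather than written as code; the sketch only shows what the filter computes.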


Figure 1. Bot traffic pie chart [8]

Campaign tracking Knowing the performance of the marketing, and being able to determine which campaigns generate the best results, are key advantages of marketing analytics. Campaign tracking does not happen automatically in GA, which can be considered a problem from a marketing perspective. If companies use campaigns as part of their marketing strategy, it is essential to be able to track these campaigns in order to evaluate their performance. UTM (Urchin Tracking Module) tagging is a way to tag and track campaigns and to let GA know which campaign is responsible for which traffic and conversions. This is a way of taking control over the data. UTM tags are campaign parameters added to the URLs used in each campaign. By using UTM tagging, it is possible to track sources with more precision. UTM tagging is especially helpful for understanding referral and direct traffic in GA. It should be done for each campaign, including email send-outs, shared links, banners, social media posts, etc. Brian Clifton, a web analytics expert, recommends tagging everything possible, including all website activity such as file downloads and internal search terms. However, even among companies that do incorporate UTM tagging in their marketing process, only 18 % do their UTM tagging correctly [5][6].
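UTM tagging amounts to appending the standard campaign parameters to a campaign's landing-page URL. As a minimal sketch (the landing page and the campaign names are hypothetical), this can be done with Python's standard library:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

# Sketch of UTM tagging: append the standard GA campaign parameters
# (utm_source, utm_medium, utm_campaign) to a landing-page URL.
# The URL and campaign names are invented examples.

def utm_tag(url, source, medium, campaign):
    """Return the URL with UTM campaign parameters appended."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

link = utm_tag("https://www.example.com/spring-sale",
               source="newsletter", medium="email", campaign="spring_2019")
print(link)
# https://www.example.com/spring-sale?utm_source=newsletter&utm_medium=email&utm_campaign=spring_2019
```

In practice such links are usually built with a campaign URL builder or a shared tagging spreadsheet, so that naming conventions stay consistent across all campaigns.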

Gaining insights To gain valuable insights through GA, one must realise that GA is a tool. The insights are gained through the analysis done by skilled analysts. Data without context are just numbers; only when data is put into a broader context can a story begin to emerge. Data should not be siloed and analysed in silos; it should be related to other information such as sales numbers and/or data from complementing tools [5]. Without the right analytic approach, no amount of investment will translate into insights [15]. To translate numbers into insights, companies need analysts with an understanding of the business and its products, the value propositions, the website content, engagement points, processes, and the marketing plan. A good analyst also needs a broad set of skills from overlapping fields, preferably within marketing and the technical field, as well as within business and sales [5].

The amount and type of knowledge an analyst holds is important, as the analyst's knowledge of GA might affect how accurate possible insights are. Passi and Jackson discuss data vision and two different understandings and levels of knowledge: rules bound and rules based. The rules-based view is the more fruitful way of understanding data analytics; it is structured, but not fully determined by mechanical implementations, whereas the rules-bound view organises and analyses the data through the application of abstract and mechanical methods. They argue that data analysis is a craft that is never fully bound by rules, but merely based on rules [17].

Attribution models Attribution modelling concerns how GA allocates traffic or conversion credit to different sources. The standard model is "last non-direct click", which means that the last non-direct touchpoint gets the credit for a conversion. A common problem is that the employees analysing marketing effects do not always have the skills and knowledge required to understand the attribution models in GA [5]. It is important to actually understand the data and the attribution models used in GA in order to gain relevant insights. The "last non-direct click" attribution model means that only one source gets the credit, when in reality the user might have visited the website through several sources before making the purchase. Conclusions based on "last non-direct click", drawn without awareness of this problem, might lead to wrongful decisions [5][12]. A typical customer journey may last a long time, with a number of both analog and digital interaction points on a number of platforms. Therefore it is important to be aware of the attribution models in GA, and what they actually mean [12].
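The "last non-direct click" rule described above can be expressed in a few lines. The following Python sketch (the customer journey is an invented example) shows how the choice of model changes the story: the same journey would credit "direct" under a plain last-click model and "organic" under a first-click model.

```python
# Sketch of GA's default "last non-direct click" attribution rule:
# the conversion credit goes to the most recent touchpoint in the
# journey that is not direct traffic. The journey below is made up.

def last_non_direct_click(touchpoints):
    """Return the source credited for the conversion."""
    for source in reversed(touchpoints):
        if source != "direct":
            return source
    # Fall back to direct if the whole journey is direct traffic.
    return "direct"

journey = ["organic", "email", "social", "direct", "direct"]
print(last_non_direct_click(journey))  # social
```

A plain last-click model would credit "direct" here, and a first-click model would credit "organic", which illustrates why an analyst must know which model a report is using before drawing conclusions from it.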

Responsibilities Problems with unreliable or incorrect data have existed for a long time. Redman [18] argues that the solution is not better technology, but better communication: communication between those creating the data and those using the data. According to Redman, the responsibility for data quality should not lie with IT personnel, but with managers who actually understand the business process and are invested in retrieving correct data. Furthermore, data can be interpreted in a number of ways; therefore it can be an advantage to work as a team and let multiple people try to make sense of the data, to ensure that no key perspectives are overlooked [1].


KPIs It is important to decide in advance what is valuable to measure and analyse, instead of measuring and analysing everything possible and available, because there is a lot of data available, and as many KPIs (Key Performance Indicators). KPIs are important to establish because they define which data metrics are to be collected, measured and analysed. If a company knows what to ask of the data before making the analysis, it is easier to retrieve interesting data than if the questions are asked in retrospect, to the wrong data [5]. There is a lot to consider when choosing which KPIs to measure and analyse. One should always try to turn the existing strategy into practical measures; for example, a new e-commerce site looking to increase sales could measure conversion rate (conversions divided by total visitors). Another important aspect is to be able to decide which measures to ignore. The KPIs should be few, business-specific and based on goals [19]. According to Edgecombe, there are some KPIs that every marketing department should measure: sales revenue, because no company wants to spend money on something that does not generate money; landing page conversion rates, because a landing page that does not generate leads is useless; and organic traffic, because high organic traffic means that people are finding the website on their own. These KPIs are general indicators of whether the marketing is successful or not [9].
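The conversion rate KPI mentioned above, conversions divided by total visitors, can be written as a small helper (the numbers in the example are illustrative):

```python
# Conversion rate as defined in the text: conversions divided by
# total visitors. The example figures are invented.

def conversion_rate(conversions, visitors):
    """Conversion rate as a fraction; 0.0 when there are no visitors."""
    return conversions / visitors if visitors else 0.0

print(conversion_rate(30, 1200))  # 0.025
```

That is, 30 conversions from 1 200 visitors give a conversion rate of 2.5 %.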

Process and documentation As it is important to ask the right questions from the beginning, making sure that the people working with marketing analytics know beforehand how the data will be used is one of the most effective ways of improving data quality and the quality of the analysis [18]. Documenting the process and everything around it is crucial for a company striving to be data driven. Establishing a process and documenting the structuring of the data, as well as anything affecting the data, is a smart way to make sure that the marketing analysis is an iterative, ongoing process [5]. Österberg [21] has developed a model for what the process should look like when working with web analysis:

1. Work with business goals, KPIs and measurement values.

2. Collect your data; while collecting data, develop reports.

3. Analyse and communicate the findings.

4. Make improvements to the site.

If an organisation’s process is built around these four cornerstones, there is potential for effective and successful marketing analysis [21].

Testing Data is only useful if there is a hypothesis describing what it means to the business. If a company has hypotheses, these can be tested in order to determine their relevance. The most common way to test hypotheses is through A/B tests. Usually this is done by directing one group of users to one page and another group to a different page, and comparing the conversion rates of the two segments [5]. While analysis is a way to find out how things are, testing is a way to find out how things could be [1]. However, testing is not always the way to go; experience and expert knowledge can sometimes be more valuable when it comes to hypotheses [5]. A mixture of the two would be optimal when evaluating hypotheses.
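Evaluating an A/B test of the kind described above typically comes down to asking whether the difference in conversion rates between the two variants is larger than chance would explain. As a hedged sketch (one common approach, not the only one; the visitor and conversion counts are invented), a two-proportion z-test can be computed with the standard library:

```python
import math

# Sketch: two-proportion z-test on the conversion rates of two page
# variants in an A/B test. The counts below are hypothetical.

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return (rate_a, rate_b, two-sided p-value) for the difference."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

rate_a, rate_b, p = ab_test(conv_a=120, n_a=2400, conv_b=90, n_b=2400)
print(p < 0.05)  # True
```

Here variant A converts at 5.0 % against 3.75 % for variant B, and the p-value falls below 0.05, so the difference would usually be treated as significant; with smaller samples the same rate difference might not be.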

METHOD In order to answer the research question, several methods have been used. First, a pre-study of the area of marketing analytics and web analytics was done in order to get an understanding of the scope of the research, as well as to be able to define the problem and the research question [4].

A comprehensive literature study was conducted, focusing on marketing analytics, web analytics and GA. Articles, books, academic papers, reports, etc. were used. The sources were discovered either via the search engine Google Scholar or via review of references in previous research. As part of the literature study, GA data and reports were examined in order to get a thorough understanding of how the tool works and what the process of analysing marketing effects looks like in practice.

Semi-structured interviews with employees at marketing departments and media agencies were conducted in order to retrieve qualitative data. Semi-structured interviews are flexible and focus on what the interviewee views as important in explaining and understanding patterns, events, and forms of behaviour. An interview guide with pre-written questions was used as a foundation for the interviews, and follow-up questions were asked based on the interviewee's answers to the pre-written questions. The interviews were recorded, transcribed, analysed, and coded in a qualitative manner. Themes evolved out of these interviews and serve as the basis of the result [4]. These themes were identified through a thematic analysis, a method for identifying, organising, and offering insight into patterns of meaning across a data set. The thematic analysis does not focus on the most common commonalities, but on those commonalities that are important and interesting for answering the research question [3].

The purpose of the interviews was to gain a general understanding of how marketing analysts work in GA (and complementing tools), what data they measure and why, and which problems they experience. The purpose was also to understand how the marketing analysis process and the maintenance of the data are carried out. Twelve interviews were held; half of the interviewees were marketing department employees, and the other half worked at media/communication/strategy agencies where they managed customers' Google Analytics accounts. Half of the interviewees were women and half were men. Half were junior analysts and half were senior analysts. The junior analysts had little or no previous experience of marketing analysis (less than 3 years). The senior analysts had more than 3 years of experience of marketing analysis and considered themselves to be experienced analysts.

The business areas examined were:
• e-commerce
• editorial sites
• informational sites
• banks


RESULT The result has been divided into themes identified through the thematic analysis of the interviews with marketing analysts [3]. All of the interviewed participants worked with marketing analytics, and all of them analysed and evaluated the effects of their organisation's marketing efforts.

Goals The interviewees all stated that they worked toward common goals, and they often based their KPIs on these predefined goals. The goals were derived from the company's strategy and business model. Many of the interviewees had goals set up in GA which they measured against, and had also defined conversions in GA to be able to measure whether users acted as desired. One interviewee said that one goal was that the marketing should not cost more than it brings in, which could be evaluated from GA data. A senior analyst said:

”To work towards pre-defined goals is one of the most important parts of web analysis. If you don't know in advance what you want to achieve, it is hard to measure if the outcome is a success or not. The numbers are just numbers if they are not defined as a goal, and by having goals, one can evaluate the numbers”. CEO at a Management consultant firm

Another interviewee expressed that:

“In connection with campaigns in social media, I compare the results to be able to draw conclusions about whether the campaigns have the desired effect, etc., depending on what the goal has been.”

And continued with:

“If established goals are achieved, we obviously believe that the campaign has been successful. This is something we learn from, as this suggests that the target group was positive to the campaign, which may lead to us doing similar campaigns in the future.” Head of Marketing at an IT-company

The ones who did not work with predefined goals tended to have predefined KPIs or standard reports that they used to compare with the previous year or month. An indicator that a campaign was successful was if the campaign generated revenue higher than the cost of the campaign. A senior analyst described the evaluation process like this:

”The actual sales (in relation to media spend) is a receipt on how well the marketing works, in the end it is all about ROAS / COS (Return on ad spend/Cost of sales). Is the marketing profitable? Data related to user behaviour is fundamental to being able to understand what effects our marketing has and where we can optimise. Say we note higher bounce rate on a particular campaign, then we need to review why the user chooses to leave. Maybe the ad does not match the result on the landing page (the user had expected something else). Maybe we are driving the wrong traffic?” Social and digital specialist at an e-commerce site
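The ROAS/COS evaluation the analyst describes reduces to two simple ratios. A minimal sketch, with invented figures (not from the study):

```python
def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per unit of ad spend."""
    return revenue / ad_spend

def cos(ad_spend: float, revenue: float) -> float:
    """Cost of sales: the share of revenue consumed by ad spend."""
    return ad_spend / revenue

# Illustrative campaign figures (hypothetical):
revenue, spend = 120_000.0, 30_000.0
print(f"ROAS: {roas(revenue, spend):.1f}")  # 4.0 -> each krona spent returns four
print(f"COS:  {cos(spend, revenue):.0%}")   # 25% of revenue spent on ads
```

Under the rule of thumb the interviewees describe, a ROAS below 1.0 (equivalently, a COS above 100 %) would mark the campaign as a failure.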

There were some general KPIs that all analysts mentioned, but the KPIs also differed, depending highly on the type of company. The banks and e-commerce sites focused more on flows and conversion rate than other types of sites. The e-commerce sites looked a lot at conversion rate in different stages of the buying process, and targeted the steps where customers dropped out. The banks had similar processes, but the focus was not as much on sales, as the banks have other incentives (investment, mortgage, lending etc.). One of the analysts mentioned that they used e-commerce tracking, a setup in GA that lets the analyst know even more about purchases on the site.

For editorial and informational sites, some of the more common KPIs were positions, traffic sources, new users, number of page views, pages per session, session length and bounce rate. The editorial and informational sites usually offer content on their sites that they want the visitors to engage with. This means that their goal is to attract a large audience that stays on their site for as long as possible. Positions is an important KPI because a large audience is not always the right audience. These kinds of sites often have target audiences that they want to attract, and they can tell whether they succeed by looking at positions and referral traffic sources. An important aspect for these kinds of sites is that they usually make their money through ads, meaning that they want to attract a large audience that is interested enough in the content to stay for a long time, so that as many ads as possible can be shown to each visitor.

General KPIs for all of the different types of companies were traffic sources, landing pages, number of visitors, number of new visitors, average session length and devices. These KPIs are generally valuable to measure as they provide the analyst with important information such as how the visitor found the site, which landing page is performing the best, how many visitors are new visitors, how long they are staying, and which device they are using.
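As an illustration of how a few of these KPIs relate to raw session data, the sketch below computes bounce rate, pages per session and average session length from invented session records (the field names are assumptions, not a GA export format):

```python
# Hypothetical session records; a "bounce" is a single-page session.
sessions = [
    {"pages": 1, "duration_s": 0,   "new_visitor": True},
    {"pages": 5, "duration_s": 240, "new_visitor": False},
    {"pages": 3, "duration_s": 120, "new_visitor": True},
]

bounce_rate = sum(s["pages"] == 1 for s in sessions) / len(sessions)
pages_per_session = sum(s["pages"] for s in sessions) / len(sessions)
avg_session_length = sum(s["duration_s"] for s in sessions) / len(sessions)
share_new_visitors = sum(s["new_visitor"] for s in sessions) / len(sessions)

print(f"bounce rate: {bounce_rate:.0%}")              # 33%
print(f"pages per session: {pages_per_session:.1f}")  # 3.0
print(f"avg session length: {avg_session_length:.0f} s")  # 120 s
```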

Process, documentation and responsibilities

Some of the interviewees stressed the importance of an established process and documentation. A junior analyst, a digital performance manager who works at an IT-company, said that their process started with asking the questions that they wanted the data to answer; after that, the collection and cleansing of data was done, followed by analysis and then visualisation. The analyst also mentioned that documentation is crucial, especially for big and growing companies that can’t be dependent on one person. If events are documented in a sufficient manner, wrongful insights can be avoided. The majority of the interviewees did not have a determined process when it comes to working in GA, but most did some sort of documentation. The most common documentation was to document the testing of hypotheses through A/B-tests. The majority of the interviewees said that the responsibility for the marketing analysis was on the person working with GA and/or the marketing department. The analyst was often a single person in the organisation. Usually this person did the analysis, visualised it, and then presented the insights and/or findings to the board.

Trust in the data

A senior analyst stated that they kept control over the data and checked it continuously to avoid errors. The analyst also stated that they did not trust the data one hundred percent:


“All of a sudden there are a bunch of bots in there.” CEO at a consultant firm

The analyst explained that these bots had to be taken care of, and mentioned that the data is dynamic and in need of ongoing maintenance in order to be trustworthy. The only interviewees who stated that they did not trust the GA data were the ones with the most experience of web analytics: the senior analysts. The junior analysts all stated that they trusted the data and had no reason not to. The senior analysts mentioned filters for bots and internal traffic as important, whereas the junior analysts were not aware of these filters at all. A junior analyst who worked with an editorial website said that it was not necessary to filter out internal traffic since they were a small team. The senior analysts all mentioned that they did not trust the data fully, and that they would not base financial decisions upon GA data; they used the data to look at trends rather than absolute numbers.

Knowledge and skills

All of the 12 interviewees expressed that they experienced some kind of problem related to GA. These problems varied in size and type, but all of them mentioned the learning curve, which was described as “steep”. In other words, most of them thought that GA was hard to learn as it contains a lot of functions, layers and data. One interviewee expressed that there is a lot to learn in GA, and that it takes some time to grasp and understand the data and the functions. The majority of the interviewees had nothing to do with the installation of GA. Some of them did maintenance of GA and said that they had knowledge of the setup and structure of the tool; these were exclusively senior analysts.

All junior interviewees stated that they did not feel that they had sufficient skills to answer complicated questions related to GA, and that they were “no experts” in the field. The senior interviewees stated that they had a lot of experience within web analytics and marketing analytics, and had reached a plateau of understanding the data. Attribution models were a topic that most interviewees were not familiar with; only a couple of the senior analysts discussed attribution models and the problems related to them. One of these interviewees explained the issues they experienced connected to attribution models:

”They can be misleading if you don't understand these data models. If looking at the standard model, last non-direct click, email send-outs have a very low conversion rate, but when looking at first click, email send-outs have the highest conversion rate.” E-commerce manager at an e-commerce site

The analyst drew the conclusion that email was a good way to draw the customer in and arouse interest in the product. Since the products they worked with (e-commerce) were rather expensive, they reasoned that the customers probably would do some research and visit the site through various touch points before committing to a purchase. The analyst continued by explaining that if you don't know how the attribution models in GA work, it is easy to reach inaccurate insights that might harm future marketing.
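The analyst's point can be made concrete with a toy attribution function. The journeys below are hypothetical, and the two models are simplified stand-ins for GA's first click and last non-direct click models, not GA's actual implementation:

```python
from collections import Counter

def attribute(journeys, model="last_non_direct"):
    """Credit one conversion per journey to a single channel.

    journeys: list of touchpoint lists, ordered first -> last.
    Toy versions of two attribution models.
    """
    credits = Counter()
    for touchpoints in journeys:
        if model == "first":
            credits[touchpoints[0]] += 1
        elif model == "last_non_direct":
            # Last channel that is not "direct"; fall back to the
            # actual last touchpoint if the whole journey is direct.
            non_direct = [t for t in touchpoints if t != "direct"]
            credits[(non_direct or touchpoints)[-1]] += 1
    return credits

# Hypothetical journeys echoing the interviewee's scenario:
journeys = [
    ["email", "organic", "direct"],
    ["email", "paid", "direct"],
    ["organic", "direct"],
]
print(attribute(journeys, "first"))            # email gets most credit
print(attribute(journeys, "last_non_direct"))  # email gets no credit
```

The same data tells two different stories: under first click, email looks like the strongest channel; under last non-direct click, it earns nothing, which is exactly the discrepancy the interviewee described.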

One junior analyst said that their direct traffic was high, as they would expect it to be. The analyst did not have a clear answer to why they assumed that the high amount of direct traffic was accurate; they just drew the conclusion that a lot of people went directly to their website. Contrary to this, another interviewee said that they usually don’t even look at direct traffic, and do not draw conclusions around that traffic source, as it is impossible to know whether or not it reflects the truth.

Only one of the interviewees, a senior analyst who described themselves as proficient within web analysis, stated that they could deliver high quality data to the organisation. That organisation was a big media house with a big team of web analysts working to ensure that the organisation possessed high quality data and maximised its data from a marketing perspective.

Campaign tracking

Half of the interviewees stated that they track their campaigns through UTM tags. The other half did not. It was clear that the senior interviewees were the ones who did track their campaigns, whereas the junior interviewees did not, or had no knowledge of whether or not they used campaign tracking. The ones who did not either did not know what campaign tracking was, or stated that they did not track campaigns because they already saw which channels drew visitors in GA. The ones who did track their campaigns all stressed the importance of UTM tagging, and all had UTM tagging as a requirement when working with campaigns. Some interviewees mentioned that the campaign traffic could end up in “other” or ”direct” if it was not tagged, which would be a waste of potential insights.

One of the junior analysts was not sure whether or not they used campaign tracking:

”We use GA to know where the traffic is coming from, which links and sources. I don't think we are working with UTM-tagging. Maybe we have done it before with a guy who was a specialist. Maybe he has set it up before?” Marketing coordinator at a recruitment agency
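The UTM tagging the senior analysts required amounts to appending standard query parameters to every campaign link. A minimal sketch, where the example URL and campaign names are invented:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def utm_tag(url, source, medium, campaign, content=None, term=None):
    """Append standard UTM parameters so GA can attribute the visit to a
    specific campaign instead of lumping it into "direct" or "other"."""
    params = {
        "utm_source": source,      # e.g. "newsletter", "facebook"
        "utm_medium": medium,      # e.g. "email", "cpc", "social"
        "utm_campaign": campaign,  # campaign name
    }
    if content:
        params["utm_content"] = content  # ad variant, e.g. "banner_a"
    if term:
        params["utm_term"] = term        # paid search keyword
    scheme, netloc, path, query, frag = urlsplit(url)
    query = (query + "&" if query else "") + urlencode(params)
    return urlunsplit((scheme, netloc, path, query, frag))

# Hypothetical campaign link:
print(utm_tag("https://www.example.com/shop", "newsletter", "email", "spring_sale"))
# https://www.example.com/shop?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Generating links through a helper like this (or a shared spreadsheet) also enforces consistent naming, which matters because GA treats differently spelled source/medium values as different channels.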

Insights

All of the interviewees stated that they did come to valuable insights through analysing the data in GA. One interviewee said that they use GA as a receipt of whether a marketing decision was correct, whereas another stated that GA data was used for trend analysis. One interviewee summarised their usage like this:

“The idea is to be able to draw conclusions on campaigns, if a campaign goes well or not, we do not want to waste money on campaigns that doesn't bring in money.” Web analyst at a big retail bank

Another interviewee described the usage as:

“We use GA to follow up on the results of our marketing activities, in order to comprehensively evaluate the effectiveness of different channels.” Social and Digital specialist at an e-commerce site

When asked about insights, some of the interviewees were more cautious about their potential insights than others. One gave an example of an insight as the hunch that “something seems to be wrong”, but said that potential reasons and solutions could only be seen as hypotheses. These hypotheses should be tested (through other tools or user surveys) in order to conclude whether the insight is relevant or not. This was a senior analyst who is the CEO of a management consultant bureau and manages several customers’ GA accounts. The analyst explained that some customers are more eager to test new solutions based on the GA data available, while others are more cautious about making changes early based only on GA data.

Optimisation

All of the interviewees stated that they made changes to the site and/or future content and campaigns based on insights gained through analysing GA data and other, complementing tools and data sources. The changes were tightly connected to the kind of company: for e-commerce, the conversion funnel was important to optimise; for editorial sites, it could be to learn which combination of pictures and text worked best in the marketing campaigns. Informative sites focused on the content of the campaigns, whereas action-oriented sites such as banks changed flows and optimised for the kind of device most commonly used for each page. A junior analyst mentioned that real-time changes were common; they could easily change communication or segment if the campaign or landing page was not doing well. The senior analysts stressed the importance of not acting too fast, and expressed that the data collection often needed more time in order to be trustworthy.

Complementing tools

Almost all of the interviewees answered that they used some kind of complementing tool besides GA. Google plugins were commonly used; Google Tag Manager and Google Ads were used by most of the interviewees. The most common third-party tool was Hotjar, a tool that lets the analyst see how the user uses the site and enables the analyst to ask the user direct questions through pop-ups on the site. When asked why they used these complementing tools, most answered that they used them to learn things that GA could not provide.

ANALYSIS

Initial setup

The majority of the interviewees had nothing to do with the initial setup of GA; usually someone who worked there before, or one of the IT staff, had done the initial setup a while back. This can be seen as an issue, as they would have no idea of whether the setup had been done correctly or not. As mentioned in the theory section, there are several things that might be wrong with the initial setup; for example, the code might be installed more than once, which will generate at least double the true number of visitors. The interviewees relied fully on the person who had done the setup, even though it might have been done years ago, by a person they had never met. The interviewees would have no clue whether GA was installed correctly, and they would not think to check it. Another issue is that many interviewees mentioned that ”the IT person” had installed GA. There is a risk that the IT person did not know which metrics were going to be measured, and did not have knowledge of how the tool was going to be used in the future. Redman [18] emphasises that the responsibility for web analysis should be on managers, not on IT personnel. The interview findings might indicate that if the responsibility is put on IT personnel, who after the installation are disconnected from GA, it might result in poor data governance, since it is not clear who is responsible for the data quality.

Configuration and maintenance

The installation is a one-time task, and if done correctly, there should not be any problems with that part of the GA setup. Something that can be far more time consuming is the configuration and maintenance of the GA data, which is an iterative, ongoing process that needs to be done continuously. Data is dynamic and always changing, which requires that someone looks after the maintenance and makes sure that the data is clean [5]. The senior analysts all mentioned that they did maintenance in the form of cleansing the data, keeping track of bots and working with filters. They could notice if something was wrong with the data, and all of them mentioned that they did not trust the data to 100 %. Some mentioned the bots as problems they had to deal with, but only a couple seemed to realise to what extent bots could skew the numbers in GA. It can be seen as problematic that the majority of the analysts either did not know what bots were (and therefore had not filtered them out) or did not seem to give them much thought at all. Approximately 40 % of all internet traffic is estimated to be bots [8], and if you have not even checked the bot filter box in your GA, the numbers are bound to be skewed. Bots are a great example of how knowledge permeates the process of working in GA. The Rules bound analyst would follow the rules and check the bot filtering box in GA, while the Rules based analyst would do this and still keep in mind that data analysis is a craft where the analyst must use their experience as much as learned rules to make sense of the data, and implement other strategies as well [17].
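In GA itself, bot filtering is mainly a matter of enabling the known-bots checkbox and adding view filters for internal IP ranges. The sketch below only illustrates the same idea applied to exported hit data; the patterns, IP ranges and records are invented for illustration:

```python
import re
from ipaddress import ip_address, ip_network

# Crude stand-ins for a real bot signature list and office IP ranges:
BOT_PATTERNS = re.compile(r"bot|crawler|spider", re.IGNORECASE)
INTERNAL_NETS = [ip_network("10.0.0.0/8"), ip_network("192.168.0.0/16")]

def is_clean(hit: dict) -> bool:
    """Drop hits from known bot user agents and internal office IPs."""
    if BOT_PATTERNS.search(hit.get("user_agent", "")):
        return False
    if any(ip_address(hit["ip"]) in net for net in INTERNAL_NETS):
        return False
    return True

hits = [
    {"ip": "203.0.113.7", "user_agent": "Mozilla/5.0"},
    {"ip": "203.0.113.8", "user_agent": "Googlebot/2.1"},   # bot
    {"ip": "192.168.1.20", "user_agent": "Mozilla/5.0"},    # employee
]
clean = [h for h in hits if is_clean(h)]
print(len(clean))  # 1
```

Real bots are far harder to detect than this (many spoof ordinary user agents), which is part of why the senior analysts treated bot filtering as ongoing maintenance rather than a one-time setting.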

Data governance is another issue which seems to be evident throughout the interviews: the senior analysts did not trust the data, and were aware that the data quality was not high enough for the data to serve as a basis for decisions. At the other end, the junior analysts did not filter out bots and/or internal traffic, which skews the data and results in poor data quality [16].

Campaign tracking

Campaign tracking was used by half of the interviewees, and exclusively by the senior employees. This could indicate that knowledge and experience on the one hand, and campaign tracking on the other, might be correlated. Since GA is such a comprehensive tool, with a learning curve, it might take some experience to reach the point of understanding that campaign tracking is necessary when evaluating the marketing effects. The same goes for gaining knowledge of marketing in general. One interviewee stated that they already knew where the traffic comes from, which is true to some extent. They have the traffic sources, which might be more or less accurate, but they can never fully know which campaign drew which traffic without campaign tracking. One of the junior analysts was not sure whether they used campaign tracking or not, and said that maybe the person who worked in GA previously had set it up. The analyst was the only one who currently worked in GA at that company, which indicates that they probably did not work with campaign tracking at the moment.

Gaining insights

All of the interviewees mentioned the “steep” learning curve in GA, and described the tool as hard and time consuming to learn. The junior analysts expressed that they did not master GA fully, whereas the senior analysts mentioned the learning curve but said that they had reached a plateau in understanding the tool and the data. Analysts might come to wrongful insights because of a lack of knowledge and experience, and if these insights serve as a basis for important decisions within the organisation, this might have a negative impact. A clear example of wrongful insights is the scenario mentioned by one of the senior analysts who worked with e-commerce, about the last non-direct click attribution model in relation to email send-outs. Based on that insight, one could decide to cut the email send-outs from the marketing efforts because of the low conversion rate, which might harm the business, since the insight was not correct.

Direct traffic

One interviewee discussed direct traffic, which they expected to be high since they thought that users were prone to type their URL directly into the browser. This is interesting since it is a hypothesis based on gut feeling. In relation to the theory, which says that direct traffic is likely to be misleading, this hypothesis shows that people trust the data when they perhaps should not. There are some metrics that should be met with some skepticism, direct traffic being one of them, as there are a number of factors that can skew the data [11]. As mentioned in the theory section, tests are a good way to evaluate hypotheses, while experience can also serve as a way to evaluate them [5]. In this particular case, the interviewee might have done everything possible to make sure that the traffic source direct traffic only contained direct traffic, and if this is the case, direct traffic can be a receipt of a strong brand.

Complementing tools

Almost all of the interviewees mentioned that they used complementing tools to measure the marketing effects. The reason was that they wanted to measure things that were not possible through GA. This is positive, as one of the most important parts of gaining insights is being able to put data into a broad perspective and in relation to other data and information. By using complementing tools and other information sources, such as sales numbers, one is able to gain more knowledge of how the organisation is performing [5].

Tests

Many of the interviewees mentioned that they conducted tests to evaluate their hypotheses; most common were A/B-tests. Tests are a way of testing one's hypotheses before acting on them, which is a smart way of avoiding wrongful insights [1]. It is positive that so many interviewees conducted tests, which might be seen as an extra control mechanism before one acts on GA data.

Optimisation

Optimisations were done by all of the interviewees to some extent. Common for the junior analysts was to make fast decisions based on GA data and optimise campaigns in real time. The senior analysts were more cautious, and mentioned that they often needed more data to be able to evaluate the marketing and take action on the insights. Being able to put data into a wider perspective, and not analysing data in silos, is crucial to gaining relevant insights [5]. If one acts on siloed data, without relating the data to other information, there is a risk of rash decisions.

Goals

Goals are commonly mentioned in the theory on web and marketing analytics. Goals and conversions can be set up in GA, which seemed to be common based on the interviews. It was also common to work toward goals, and to measure against goals that were set up at the beginning of a marketing effort. These goals were set from target numbers the organisation wanted to achieve (such as sales numbers or qualitative numbers). Goals were a way to evaluate whether the marketing was successful or not. If the marketing cost more than it earned, most interviewees drew the conclusion that it was a failure. Most of the interviewees had processes for evaluating whether the marketing had the desired effects. Usually the economic indicators were seen as good evaluation metrics, but KPIs related to the users' behaviour on the site were also mentioned as good indicators. If the users behaved in an undesired way, it could indicate that the marketing was unclear or targeted at the wrong audience. Optimisations were made based on these evaluations, which was a way to take action on the insights. Trends were a way of seeing patterns in the data, which could serve as profitable insights in the future. Since most interviewees were interested in, and put a lot of effort into, determining whether the marketing was profitable, it is interesting that only 50 % of them (the senior analysts) used campaign tracking, which is the only way of knowing in detail which campaign generates the most revenue.

DISCUSSION

The purpose of this study was to contribute to the field of marketing by identifying and discussing problems related to marketing analysis with GA, and to assist marketing departments with possible solutions to these problems. The study was qualitative, and consisted of 12 interviews. The result was 8 themes focusing on the interviewees’ problems, processes and experiences related to working in the web analytics tool GA.

The interviews disclosed that there is a learning curve when working in GA, and that it takes knowledge and/or experience to be able to gain relevant insights. Furthermore, the interviews showed that there might be a correlation between how much the interviewees trusted the data in GA, and the level of experience they had working with marketing analysis. The junior analysts trusted the data, whereas the senior analysts did not trust the data. To not trust the data can be seen as a way of keeping control over the data, and to be proactive against wrongful data.

For a company to gain relevant insights that can act as a basis for decisions, there are some capabilities that are essential to possess. On the technical end, the installation, configuration and maintenance need to be done correctly, by someone who understands both the technical area and the business area. Bots and internal traffic need to be filtered out, and the data must be looked over continuously, as the collection of data is an iterative process. Furthermore, companies need to track everything possible through campaign tracking with UTM tagging; not doing so is a waste of valuable information and possible insights. The people working with analysis in GA must have a certain amount of knowledge to be able to reach relevant insights. They must understand data models such as attribution models, and they must look at the data in a broad context to be able to paint a picture and make sense of the data. Establishing processes and documentation guidelines is recommended, as marketing analysis should be an iterative process, and in order to make the process effective [5].

Being data-driven and having a strategic approach to marketing analytics is becoming increasingly important in our digital society [15]. In Sweden, media investment is shifting toward digital channels, which now have a market share of more than 50 %. GA is one of the key components in many companies' marketing analytics processes [5]. As the field of digital marketing is developing rapidly, it is likely that marketing analytics and web analytics will also develop and increase in importance in the near future [20]. There are many problems connected to both web analytics as a whole and GA specifically. The uncertainty about the accuracy of the data, the large number of bots skewing the numbers (and harming businesses), and the learning curve are some factors that make GA a tool with potential for improvement. In the meantime, companies need to take a strategic approach to marketing analytics and be aware of the related problems. Inaccurate insights can lead to decisions that affect the marketing efforts negatively, leading to a negative impact on return on investment. Being data-driven takes effort and skills, and is not something that happens overnight.

At the beginning of this paper, it was stated that 50 % of employee time (in GA) was wasted on finding and correcting errors in the data, and that 35 % did not trust their data. Poor data governance was another widespread problem in many organisations [6]. In this study, similar patterns emerged from the interviews. The senior analysts did not trust the data, and many of the interviewees mentioned that they ”cleaned” the data before it could be used in an analysis. This study also contains some findings that point toward poor data governance. Data governance refers to the capability of an organisation to ensure that high quality data exists. Only one of the interviewees, a senior web analyst expert, stated that they could ensure high quality data in their GA.

As an analyst, there are a large number of things to keep in mind when working with marketing analytics in GA (and complementing tools). One should be aware of the learning curve, and be humble about one's own capabilities. If the analyst has analytical skills, is familiar with the business and product, and has some technical knowledge, that is a good starting point. Marketing analytics is about seeing patterns and trends in data, and connecting the dots [5]. The findings from the interviews suggest that the experience the senior analysts possess might be a great advantage when working with marketing analytics. Before reaching this level of experience, one should try to be cautious about the conclusions one draws from the data, and test the hypotheses before acting on them. The findings also suggest that more education and training in the marketing analytics field might be needed. The training and education should focus on both technical and analytical skills, and take on a Rules based view of data analytics in order to prepare the analysts for their upcoming tasks [17].

The chosen method, semi-structured interviews, was relevant in this study since the research question focused on capabilities, and there was a need for a deep understanding of the processes, knowledge and experience of those who work in GA. It would have been interesting to complement the semi-structured interviews with a survey to gather quantitative data and be able to make generalisations.

Future research in the area is needed, since web and marketing analytics is a fast growing and dynamic field. The related problems and uncertainties call for more research to develop tools and processes that eliminate errors in the data. Additional studies are needed to get a deeper understanding of how the experience and knowledge of web and marketing analysts affect results, as well as how mistakes and errors derived from wrongful data affect organisations' revenue and market position.

CONCLUSION

When working with data driven marketing, one of the main goals is to reach relevant insights that can serve as a basis for decisions within organisations [5]. Reaching relevant insights, however, is not something that happens the moment GA is installed. To obtain these insights, there are a number of capabilities that need to be adopted in an organisation. The research question was: Which capabilities are needed when using Google Analytics in order to conduct an efficient and reliable analysis of the marketing effects at a Swedish marketing department?

The initial setup of GA is critical. There are pitfalls that should be avoided, such as incorporating the code in the wrong place. If the setup is not done correctly, the data will be wrong from the beginning [2]. Findings from the interviews suggest that there is a need for validating the setup, and making sure it is correct.

When it comes to configuration and maintenance in GA, findings from the interviews suggest that there is a general lack of knowledge of bots and filters, and a need for more control over the data. 40 % of internet traffic is estimated to be bots [8]. This shows that it is essential to filter out bots and to keep track of the data continuously. Adopting a Rules based view of data analysis is advised [17].

There seems to be a general lack of knowledge of how to track campaigns, which is found in the interviews as well as in the theory [6]. Every organisation that works with marketing analytics should track its campaigns through UTM tagging; not doing so would be a waste of insights. A common problem is the traffic source ”direct traffic”, which is known to contain more than only direct traffic. This problem can be mitigated by tagging every campaign.

The last, and perhaps most important, capability is to understand how GA works. Understanding the data models and data collection in GA is important in order not to draw wrongful conclusions. The people working with marketing analysis should have analytical and technical skills, and have knowledge of the business, the products and the marketing of the organisation. The key to gaining relevant insights is to put the data in a wider perspective, not silo the data, and relate the data to complementing data and information [5]. Gaining control over the process of marketing analysis, by assuring that these capabilities are obtained in an organisation, is a way of reaching efficient analyses and reliable insights.


REFERENCES

1. Abbamonte, Kiera (2017). How to Better Integrate Analytics Into Your Marketing Strategy. Neil Patel. Retrieved from: https://neilpatel.com/blog/integrate-analytics-into-marketing/

2. Bloom, Kevin (2018). Why Your Google Analytics Data is Wrong and How To Fix It. Hinge Marketing. Retrieved from: https://hingemarketing.com/blog/story/why-your-google-analytics-data-is-wrong-and-how-to-fix-it

3. Braun, V., Clarke, V., Hayfield, N., & Terry, G. (2019). Thematic analysis. Handbook of Research Methods in Health Social Sciences, 843-860.

4. Bryman, Alan (2012). Social Research Methods, 4th ed. Oxford: Oxford University Press.

5. Clifton, Brian (2015). Successful Analytics: Gain Business Insights by Managing Google Analytics. Advanced Web Metrics Ltd.

6. Clifton, Brian (2019). The State of Google Analytics Data Quality. Verified Data. Retrieved from: https://verified-data.com/research/?utm_source=superweek+2019&utm_medium=slides&utm_campaign=research+study

7. Cross, Amy (2018). What is Marketing Analytics? NG Data. April 04. Retrieved from: https://www.ngdata.com/what-is-marketing-analytics/

8. Distil Networks (2019). Bad Bot Report. Retrieved from: https://resources.distilnetworks.com/white-paper-reports/bad-bot-report-2019

9. Edgecomb, Carolyn (2017). The 10 Marketing KPIs You Should Be Tracking. Impact. Retrieved from: https://www.impactbnd.com/the-10-marketing-kpis-you-should-be-tracking

10. Gant, Amanda (2017). What is Google Tag Manager and How to Use It. Orbit Media. Retrieved from: https://www.orbitmedia.com/blog/what-is-google-tag-manager-and-why-use-it/

11. Hansen, Mark (2015). Understanding direct traffic in Google Analytics. Megalytic. April 14. Retrieved from: https://www.megalytic.com/blog/understanding-direct-traffic-in-google-analytics

12. Kaboteh, N., Moosman, O., Petersson, J., Ahnborg, P. (2019). Attribution IAB Sweden white paper. IAB

13. Krieger, Brad (2018). Three Ways Google Analytics Helps Marketers Solve Problems. Dandelion Inc. September 28. Retrieved from: https://www.dandelioninc.ca/blog-posts/2018/9/28/three-ways-google-analytics-helps-marketers-solve-problems

14. Mankins, Michael & Sherer, Lori (2015). Creating Value through Advanced Analytics: Why Advanced Analytics is About Decision-Making, Not Just Technology. Bain & Company. Retrieved from: https://www.bain.com/insights/creating-value-through-advanced-analytics/

15. Mela, Carl & Moorman, Christine (2018). Why Marketing Analytics Hasn’t Lived Up to Its Promise. Harvard Business Review. May 30. Retrieved from: https://hbr.org/2018/05/why-marketing-analytics-hasnt-lived-up-to-its-promise

16. Otto, Boris (2011). Data Governance. Business & Information Systems Engineering, 3: 241. Retrieved from: https://doi-org.focus.lib.kth.se/10.1007/s12599-011-0162-8

17. Passi, Samir & Jackson, Steven (2017). Data Vision: Learning to See Through Algorithmic Abstraction. Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing.

18. Redman, Thomas C. (2013). Data’s Credibility Problem. Harvard Business Review. December. Retrieved from: https://hbr.org/2013/12/datas-credibility-problem

19. Smith, Bernie (2018). KPI Checklists: Develop Meaningful, Trusted KPIs and Reports Using Step-by-step Checklists. Metric Press.

20. Sveriges Annonsörer (2019). Guide för digital annonsering [Guide to digital advertising]. [e-book]

21. Österberg, Marcus (2016). Guide: Webbanalys - en introduktion till mätbarhet på webben [Guide: Web analytics - an introduction to measurability on the web]. Webbstrategi för alla. Retrieved from: http://webbstrategiforalla.se/webbanalys-en-introduktion/


www.kth.se

TRITA-EECS-EX-2019:423