In the arena of top marketing firms, data-driven marketing may seem like the key buzzword of the past few years. In fact, it’s no passing fad. Leveraging analytics to reach target customers has become a key component of any successful digital campaign. According to a recent survey by the Global Alliance of Data-Driven Marketing Associations, a data-driven approach has become the backbone of just about any campaign or messaging, whether that means targeting the right audience or predicting potential success. And as marketing technology continues to make inroads across the industry, it should not be surprising that more businesses want to take a data-driven approach to their marketing.

The use of data to improve and measure the effectiveness of marketing is virtually universal these days. In fact, a recent study found that only about one in 10 marketers still doesn’t use data. Even more complex data techniques, such as integration of third-party data and cross-channel measurement, were found to be widely used.

The survey found that nearly 80 percent of advertising and marketing professionals are now using data-driven techniques to maintain customer databases, measure campaign results across multiple marketing channels, and segment their data for proper targeting. For those of us at the top digital marketing agencies, this finding makes perfect sense. Marketing automation platforms, when configured properly, can easily deliver this type of feedback, and those platforms have been aggressively showcasing these capabilities.

The survey, which targeted both advertising and marketing executives from a wide variety of industries, notes a clear shift in spending patterns. Respondents reported a strong expectation that spending on data-driven efforts would continue to rise: nearly two-thirds said their spending on data analytics for marketing would increase, while only seven percent expected a decline in the coming year.

Data-driven marketing isn’t a flawless solution, at least not yet, which is not unexpected for a relatively new approach that relies on new technologies. A recent analysis from Square Root found that a challenge for many companies is gathering the high-quality data necessary to optimize results. Data analysts are searching for more effective ways to collect, manage and understand data. Forty-four percent of the survey respondents reported that they were still using outdated tools, and a similar share believed they could make decisions without in-depth data. Over a quarter of respondents cited other time wasters, from data-source overkill to bad numbers. Square Root’s study also found that more than half of data professionals felt they could use better training, closely followed by the 49 percent who wanted more user-friendly or updated data tools.

But chief marketing officers and other executives wouldn’t be making these investments if they didn’t believe those investments deliver results. They recognize the benefits to the bottom line.

Want help delivering a data-driven marketing program that delivers clear results? Give us a call and see how Bluetext can help.

If your digital marketing agency team doesn’t have a SMAC roadmap, you may find your company drifting off-course in 2017 and beyond. Here’s a brief refresher course on SMAC.

Social Media

Social media continues to evolve. Platforms now rise and fall by the year rather than by the decade. Here are some emerging trends we expect to keep gaining momentum:
1. Snap’s Evolution Will Result in Interesting New Opportunities.
2. Twitter Fatigue Will Worsen.
3. Users Will Crave More Vicarious Experiences.
4. New Areas of Communication Will Emerge.

Mobile

Mobile devices are the cornerstone of how new business is being built and legacy businesses are reinventing themselves. Mobile devices allow users to constantly update their profile, stay aware of deals and promotions, and track locations and buying habits by virtue of connecting to various wireless signals and near-field communication (NFC) devices.

Here are some emerging trends we expect to keep gaining momentum:
1. Consumers redefine purchase boundaries; mobile marketing, brand partnerships deepen
2. Department stores, mobile marketing partners tackle the ‘Amazon Effect’
3. Programmatic accelerates: brands, tech, marketing continue to invest
4. Next-generation creative, video redefine mobile engagements

Analytics

As databases have grown larger and processors and memory have become capable of chewing through hundreds of millions of records in a short time, we have begun to see how analytics can do more than just track clicks. Analytics can establish links between entities and make intelligent predictions about customer behavior based on knowledge a system has about a customer — knowledge that has been informed by social networking.
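To make that idea concrete, here is a minimal, hypothetical sketch of the kind of prediction such a system might make. The engagement features, synthetic data, and model choice below are illustrative assumptions, not any specific platform’s implementation.

```python
# Hypothetical sketch: predicting conversion likelihood from customer engagement data.
# The features, synthetic data, and thresholds are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Assumed engagement features per customer: site visits, email clicks, social shares
X = rng.poisson(lam=[5, 2, 1], size=(500, 3)).astype(float)

# Synthetic "converted" label, loosely tied to engagement for illustration
y = (X @ np.array([0.4, 0.8, 1.2]) + rng.normal(0, 1, 500) > 4).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new prospect: 8 visits, 3 email clicks, 2 social shares
prospect = np.array([[8.0, 3.0, 2.0]])
print(f"Predicted conversion probability: {model.predict_proba(prospect)[0, 1]:.2f}")
```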

To keep up with the explosion in Big Data, companies are beginning to invest in business intelligence (BI) projects and increasingly sophisticated analytics infrastructure. Here are some emerging trends we expect to keep gaining momentum:
1. Multi-channel Attribution
2. Focus on ‘Return on Analytics Investment’
3. Monetization of Data
4. Exciting new players in the MarTech arena to complement the core analytic platforms

Cloud

The cloud element of SMAC refers to the capability a business has to spin up vast amounts of capacity that are paid for by the minute or hour. Businesses do not need to spend millions of dollars building another data warehouse – they simply rent it from a cloud provider, do their work and turn it off. When the business environment changes, they simply spin up another cluster in the cloud, pay another few hundred dollars and continue building insights.
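As a rough illustration of that pay-as-you-go math, here is a back-of-the-envelope sketch; the node price, cluster size, and job length are assumptions for illustration, not quotes from any cloud provider.

```python
# Back-of-the-envelope cloud cost sketch; all figures are hypothetical assumptions.
NODE_PRICE_PER_HOUR = 0.50   # assumed on-demand price per node, in dollars
NODES = 40                   # assumed cluster size for one analytics job
HOURS = 6                    # assumed job duration

job_cost = NODE_PRICE_PER_HOUR * NODES * HOURS
print(f"One ad-hoc analytics run: ${job_cost:,.2f}")    # $120.00

# Even a weekly run for a year stays a long way from a capital build-out
print(f"52 weekly runs: ${job_cost * 52:,.2f}")          # $6,240.00
```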

Here are some emerging trends we expect to keep gaining momentum:

1. Artificial intelligence (AI) will make personalization a reality in 2017.
2. Self-service will be the new normal.
3. Enhancing the Buyer Journey
4. Google Tag Manager and other granular analytics modules becoming the norm

With buyer sophistication growing daily, marketers need to deliver ever-smarter strategies and campaigns. Are you taking the time to measure how your efforts are working and to think about how you might enhance them, or do you find yourself quickly moving from one campaign to the next?

Need help with your SMAC TALK?  Contact the digital marketing gurus at Bluetext.

Successful digital marketers are constantly evaluating where to put their resources and how to measure the programs they are funding in terms of lead generation and sales. Digitally mature enterprises go one step further: they put their money where their data is. That’s because they know that data-driven marketing is an essential component of their maturity. It provides a foundation for their programs and takes the guesswork out of marketing.

Advanced analytics allow companies to go far beyond baseline metrics by providing the tools to really understand how their target buyers are consuming content, what entices them to engage and interact, and what triggers a conversion. In-depth analytics, including multivariate as well as A/B testing, provide the types of information that enable more automation and personalization to map to each buyer’s journey. A recent survey from Adobe found that digitally mature enterprise organizations plan on growing their measurement programs by 41 percent over the next three years. Digitally mature companies rate the whole customer view, predictive marketing, and attribution modeling as their highest priorities. That means having a clear picture of who the target customer is if they want to deliver a personalized experience that will drive conversion.
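For readers who want a picture of what an A/B test readout looks like in practice, here is a minimal sketch using a two-proportion z-test on hypothetical conversion counts; the visitor and conversion numbers are invented for illustration.

```python
# Minimal A/B test sketch; visitor and conversion counts are hypothetical.
from math import sqrt
from statistics import NormalDist

visitors_a, conversions_a = 5000, 260   # variant A: 5.2% conversion
visitors_b, conversions_b = 5000, 315   # variant B: 6.3% conversion

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)

# Two-proportion z-test for the difference in conversion rates
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
```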

As the survey found, data no longer just informs; it also predicts. “Customers expect digital marketers to know who they are and what they’re interested in.”

Combining in-depth analytics and machine learning begins to give a picture of the entire journey each buyer is on, delivering insights that enable an experience that is relevant to that customer, including his or her preferences, expectations and timing. Providing the right types of content when the target buyer wants that content is the most likely path to turning a prospect into a client. Getting that data in real time from the right analytics and tools offers the most current insights for reacting quickly and putting the best content in front of that audience, responding to what’s happening now, not what took place a week or a month earlier.

Our recommendation is to let a digitally mature brand be your model, and invest in the best analytics that will provide real-time, data-driven insights to meet your marketing and revenue goals.

Let Bluetext assess your digital maturity and analytics so you can meet your lead and revenue targets.

Today CSC launched the 2.0 version of its Digital Briefing Center. CSC’s Digital Briefing Center is where customers, partners and prospects from across the globe can come to learn more about the key technology conversations and shifts CSC is driving into the market.

The center is built on immersive 3D video technology that is completely interactive through HTML5 overlays throughout the user journey.


Following launch, Bluetext’s collaborative creation with CSC’s Digital Marketing team became the top performing component of the csc.com global web presence, a huge feat for a Fortune 500 corporation.

Version 2.0 features new capabilities spanning:

  • Multi-floor scalability
  • Triple screen experience
  • Dynamic social media integration
  • Triggered infographic visualizations synced with briefing videos
  • Chaptered video interactivity

In the following video, CSC’s head of global brand and digital marketing talks about the project:


Contact us to learn how we create innovative digital experiences for brands like yours.

 

Measure. Evaluate. Evolve

At Bluetext, every campaign we execute is different – some clients need to strengthen their brand, some need to sell more services, some need to differentiate versus upstart competitors, and some even need to energize their internal sales force. It is amazing how often this last point is a motivator for our campaigns.

For this reason, the answer to the simple question about our process for measuring success at Bluetext is not always black and white. What are you trying to achieve? Sure, we use Google Analytics or Eloqua or any of the various lead-tracking systems. We also survey the market to establish a baseline of where a brand stands today, and then again in 6, 12 or 18 months. All of these measurements are valid for marketing campaigns, but there is no one-size-fits-all approach.

Just as we recommend that clients understand the sandbox they play in through our messaging and discovery process, it is equally important to determine how you are going to measure success, and then be prepared to course-correct quickly. What are you going to do with that great website or infographic we developed? How will you hit your target audience with it? Channels are always evolving, but success metrics should not.

The old adage “Build it and they will come” simply does not work in our world. Think about these three questions when executing a campaign:

  • What are you trying to achieve?
  • What do you want your audience to do?
  • What message can we deliver to them?

If all of these questions are answered up front, we will work with you to create a powerful campaign delivered via the right channels to achieve the right metrics…that is how we define success with every campaign we execute.

 

Google’s new Hummingbird search engine algorithm is sending shock waves throughout the digital marketing arena. What it means, and how marketers need to adjust their SEO thinking will be on the to-do list for the foreseeable future.

When Google released its latest changes this fall, it used a very clever strategy that took almost everyone involved in SEO by surprise. First, it ran the new algorithm for 30 days before telling anyone. No big announcement, no public launch, just a quiet change. Then it held a press conference to discuss what was quickly recognized as its most significant revision in more than a dozen years. And with a full 30 days’ worth of data under its belt, Google was able to say that the world had not ended as a result of the revision. Not only did the industry feel no seismic disruptions, but by most accounts no one had even noticed.

Hummingbird is a massive change in the way in which the Google search engine returns search results, and it has major implications for the way that companies and organizations need to approach SEO.
First, a little search engine background. Search has always been a game of cat-and-mouse. The marketer’s goal is to use links, keywords, and other tactics to ensure that their website comes up high in relevant searches. Google’s interest is in returning the most relevant results, without favoring a site just because it has tricked the search engine. So, for example, when inbound links were weighted heavily, tacticians could create “link farms” that gave the impression of inbound links that weren’t real. When Google altered the algorithm to degrade unimportant links, new tricks were developed, including keyword stuffing, the heavy use of searched terms throughout a site. Google responded by setting parameters on how many times a keyword could be used in a given paragraph. The back-and-forth continued.
Hummingbird marks a sharp departure from this word-based game. It focuses on context and on what are known as “long-tail” queries to deliver results that are more specific to the needs of an evolving Internet, where mobile devices and voice commands are replacing simple word searches. Hummingbird is supposed to reflect that context when, to use an obvious example, we search for Chinese restaurants. What earlier search engines would deliver was a list of restaurants. But what we really want to learn is a good place to eat that is nearby. The intent of Hummingbird is to understand that context and deliver recommendations of good restaurants in our area. Remember that what makes a place “good” to eat is a subjective notion, and that subjectivity will become very important in how marketers structure their SEO strategy going forward.
That context gets more difficult as people speak their questions rather than type them. For example, while a typed query might read, “nearby Chinese restaurant,” a spoken query might be, “What’s the best place to get Chinese near my home?” Google needs to recognize the actual location of your home, understand that “place” means you want a brick-and-mortar restaurant, and get that “Chinese” is a particular type of restaurant. Knowing all these meanings may help Google go beyond just finding pages with matching words.
Google has reoriented its search algorithm in three very important ways in Hummingbird, and two of those changes have to do with what it determines is “good.”
The first is that Google now rewards good content. That means that long, detailed and well-sourced articles are going to get better results than mere word mentions on a page. Do a search on “slavery” and you will find long articles from The New York Times as well as The Smithsonian magazine. Search for “best rain jackets” and you will get reviews from publications and “How to choose” articles from within the REI site, instead of links directly to items for sale.
The second is that Google is putting links to what it considers to be good content directly on the results page, and is including related articles and other information that it didn’t previously deliver. From a consumer’s point of view, this turns the search results page into a sort of encyclopedia with snippets of content pulled from others’ sites. From a marketer’s perspective, it could mean that viewers will see information from your site but never need to click through to your site to get it. Skeptics have theorized that Google is actually trying to keep you on its page as long as possible in order to run more ads and realize more revenue. Whatever the motive, getting someone to leave the search page for your website is more challenging.
The third is that social media, and in particular Google+, will become a larger part of the search engine equation. Google’s goal is to tap into your network of friends to give you additional insight on your query. Go back to the question about a good nearby Chinese restaurant. If Google sees that friends within your Google+ circles like a particular restaurant, that might be included in the search results.
This is a lot to think about, and requires a different mindset when executing your SEO strategy. If this is starting to make your head spin, join the club. Much of what has been written about Hummingbird so far is difficult for anyone not steeped in algorithm technology to understand. So with that in mind…

Those of us still paying attention saw that the court overseeing the long-running legal battle between Google and authors and publishers ruled against the proposed agreement last week (“Judge Rejects Google’s Deal to Digitize Books,” New York Times). Good luck figuring out what it means and, more importantly, why anyone should care. But it is important, if for no other reason than it is the result of a massive collision between an industry, book publishing, and the realities of the Internet and digital access to information. If you think you’ve seen this movie before, it does look like the fights over Napster and Internet file sharing that decimated the music industry. Only in this case, the major players are attempting to find a legal solution.

I’ve been following this since the American Association of Publishers, together with the Authors Guild, sued Google in 2005 to stop it from copying every book known to man, allegedly in violation of copyright protections. Google’s ambitious project was greeted enthusiastically by researchers, journalists, historians, people who read, just about everybody except those whose intellectual property might be given away for free over the Internet (remember those old companies in the recording industry, and what happened to them after the Internet got popular?). [Full disclosure: Both the AAP and Google have been clients of mine over the years.]

But it is interesting, if perhaps not quite so important for most of us, to understand at least a little of what this is all about. For Google, it was co-founder Larry Page’s effort to digitize books and make them widely available, at least as searchable snippets, for students, researchers, historians, and anyone else wanting to experience the bulk of human knowledge via the Internet. Sounds good enough, and that’s the easy part.

What gets complicated is sorting out the three broad categories of authors. Actually, two are easy, and one is difficult. The first is the volume of works throughout history whose copyright protections no longer apply: think Shakespeare, the Bible, The Iliad and The Odyssey, and so on. Google (and anyone else) is entitled to go for those. The second is the volume of works under copyright protection where the author and publisher are known and active. Think of all the popular authors you know and love, Anne Tyler, Bill Bryson, Sarah Palin, fiction, non-fiction, and everything in between. This is, of course, a little more complicated, but those authors (and their publishers) can actively protect their intellectual property, and they do so by cutting their own deals for licensing rights, directly with Google or via organizations such as the Copyright Clearance Center [another former client]. Or they can choose not to license their works for Internet distribution at all.

That leaves the third category, and that’s what the fight now is really all about. This group includes all of the works where copyright still applies, but where the holder of that copyright, the author or publisher, her relatives, spouse or estate, cannot be found. In this category are (mostly) out-of-print books and other publications, known as “orphan” works, and these are what the judge decided the settlement was not adequately protecting. The terms of the settlement, worked out between Google, the publishers, and the authors, “would have granted Google a ‘de facto monopoly’ and the right to profit from books without the permission of copyright owners,” according to the judge. He called that “unfair.”

What the settlement would essentially do for orphan works is set up an “opt-out” process, where copyright owners could come forward and decide not to participate in the settlement (keeping their works out of Google’s search engines). What the judge believes is appropriate is an “opt-in” process, where works could be included only if the copyright owners came forward and gave permission. The problem with that, of course, is that by definition these people can’t be easily found. That’s the dilemma.

So why do we care? Most of us probably don’t. We are not hot in pursuit of obscure, long-forgotten out-of-print books. Unless, that is, you are a researcher, historian, journalist, academic, blogger, hobbyist, or anyone who likes to know what there is to know about a subject. Then being able to include these works in your scholarly pursuits can open up long-lost information, and maybe even a gold mine of data. The courts will ultimately decide what’s fair, but it is a good example of how the Internet is challenging all of our assumptions, for better or for worse.

 

I’ve been involved in public opinion polling and research for 20 years and am a big believer in understanding where you stand in the public’s mind. When I was in the Clinton White House, we had polls in the field every week. When I wanted to know what the biggest threat was during the campaign finance investigations following the 1996 elections, I knew within a week from our polling that the only theme that could do real damage to the President was if it were shown that foreign adversaries had funneled money illicitly to the campaign. That allowed me and my colleagues on the damage control team to spend our effort defending a much smaller field of play than if we had treated every allegation equally.

 

I don’t think I have ever seen a poll that didn’t contain at least one surprise, no matter what organization, company or industry commissioned it. And I firmly believe that the same is true of the revolution in online research and analytics that is now available. Knowing who your key audience is, what they think of you, and which messages compel them to take action is even more important when there are so many more ways for them to get information than even 15 years ago.

 


I have spent the last five years on the Advisory Board of the Marketing Research Association, the primary trade association representing polling firms, focus group facilities, and a range of other industry suppliers. The Advisory Board has been sounding the alarm for some time that the industry is under siege, not because clients don’t believe research is important, but because technology has made it more readily available without the expense of traditional phone banks and focus groups.

 

The reality is that traditional ways of conducting market research are becoming less effective. Take traditional landlines, long the only way that pollsters could reach target audiences when conducting telephone interviews. Changing demographics mean that fewer and fewer people actually have landlines at home, relying instead on their personal cell phones, which are off-limits to polling firms. So by definition a traditional phone bank will not be able to reach a significant part of the population. And while the traditional methods still return a more statistically valid result with a smaller margin of error, the price of those results may be higher than their ultimate value to a company or organization.
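To put the margin-of-error trade-off in perspective, here is a quick sketch using the standard 95-percent-confidence formula for a simple random sample; the sample sizes are hypothetical.

```python
# Worst-case (p = 0.5) margin of error at 95% confidence; sample sizes are hypothetical.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    return z * sqrt(p * (1 - p) / n)

for n in (400, 1000, 2500):
    print(f"n = {n:>5}: +/- {margin_of_error(n):.1%}")
# n =   400: +/- 4.9%
# n =  1000: +/- 3.1%
# n =  2500: +/- 2.0%
```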

 

Indeed, analytical tools as simple as those provided by Google offer a wealth of market intelligence, even though the margin of error may not be as small as with traditional phone banking. More sophisticated tools, like those provided by Adobe’s Omniture services, can parse through huge volumes of web traffic, for example, to provide valuable insights into who is coming to your website and what they are doing there. Successful companies will take advantage of these analytical resources to understand their target audiences, leveraging technology for the best value and greatest insight.

 

It may be a surprising question. Google, after all, is a search engine, while Facebook is a social network that by all appearances has been the hot property of the past 12 months. The number of users is growing exponentially, and investors believe it is worth billions of dollars. But there is real evidence that Facebook is inadvertently losing its primary attribute, intimate social interaction, which truly differentiates it from the data-driven promise of Google. It may seem counter-intuitive and will take a little explaining, but it all became clear when reading Time Magazine’s Person of the Year cover story in December on Mark Zuckerberg, Facebook’s erratic founder.

 

The gist of the difference between the world’s largest search engine and the world’s dominant social networking site goes something like this:  When you look for information on Google, such as restaurant critiques, movie reviews, suggestions for vacation resorts, or anything else you can think of, you are getting the wisdom of strangers.  When you seek that same information on Facebook, you are getting the recommendations from friends.  The latter would seem to be more valuable for most people because it is from individuals with whom they have those types of relationships, and not from faceless (and often nameless) Internet personalities.

 

That’s just one example of the utility of Facebook. It is also much more of a place to have a dialogue with people with whom you have a relationship: to see what they are up to, what they are reading, how their families are doing, or just to look at their photographs. In other words, where Google is about broadcasting information, Facebook is about engaging in conversation with friends.

 

But look what’s happening on Facebook when it comes to having “friends.” The definition has changed, dramatically. In the off-line world, friends are people with whom you share experiences, spend quality social time, and interact on a personal level. In the Facebook world, the meaning of the term “friend” goes far beyond that: it is anyone who hits the “accept” button.

 

My daughters each have well over 800 friends on Facebook. In their off-line lives, the number is a fraction of that. I have one colleague who has more than 3,000 friends on Facebook. When he asks for movie recommendations, even from his Facebook friends, the response is no different in my mind than from a Google search: it is merely the opinion of strangers who happen to be friends by this new definition.

 

From a practical standpoint, there is no way he can follow the news feeds of those 3,000 friends, nor can he engage in any meaningful conversation. When he posts a status update, links to an article or video, or talks about his weekend, he isn’t engaging in dialogue on a personal level with anyone. He is simply broadcasting his posts, hoping that, as with Google, people will see them. When online friends don’t really have the same attributes as off-line friends, the social networking component disappears.

 

What does this mean for communications professionals? For one, Facebook can be a great place to broadcast information far and wide. But it’s going to become a more difficult place to actually engage key audiences, be they consumers, customers, employees, policy makers, or just friends.