User experience (UX) as defined by Wikipedia involves a person’s emotions about using a particular product, system or service. User experience highlights the experiential, affective, meaningful and valuable aspects of human-computer interaction and product ownership. Additionally, it includes a person’s perceptions of the practical aspects such as utility, ease of use and efficiency of the system. User experience is subjective in nature because it is about individual perception and thought with respect to the system. User experience is dynamic as it is constantly modified over time due to changing circumstances and new innovations.

The two most critical points of Wikipedia's definition are that users are individuals, each with unique goals and behaviors, and that user experience is dynamic. In the name of user-centered design principles, however, designers often segment users into personas based on similar (rather than individual) goals and behaviors, which ultimately defines a user experience that's good for the masses some of the time rather than best for each individual user all of the time.

This definition of UX is also guided by the principle that there are real humans standing behind every brand or product. And while a great UX continuously learns from its users to deliver a more intuitive, human-centered experience, thanks to AI it no longer requires an actual human to deliver it.

And like AI, which "learns" how to avoid design "mistakes" by memorizing the experience of every user, UX is itself a designer's primary source of human intelligence, one that, properly applied, reveals patterns in human interaction that can mitigate frustration and optimize usability.

What this means is that, ideally, designers should continue to employ traditional user-centered design principles to lay the groundwork for major design decisions while leveraging AI to continuously calibrate incremental changes to that design in real time, based on the individual goals and activities of every user.

The compound effect of applying AI to UX in this manner will empower designers to humanize the customer experience for each individual user and usher in a new age of dynamic, truly user-centered design methodology.


In its simplest form, content strategy is the plan for the creation, delivery and governance of useful, usable digital content that meets your company's business objectives. User experience, on the other hand, is ultimately designed to meet the needs of the user, and is therefore expressed in the user's voice.

The outcome of both of these business-focused and user-centered goals is the user experience, and core to that experience is content strategy. Having a plan that bridges user needs and business goals is essential; it is all but impossible to design a great UX without one.

Content strategy creates a global framework for how digital content is prioritized, organized and presented based on its business purpose. UX strategy informs the information architecture, providing the structural context that brings the content to life for the user persona it is designed to reach.

However, website design is often driven solely by UX teams who, absent a comprehensive digital upbringing, tend to overlook content early in the design phase of the project, or worse, after the templates have already been created. This is the point at which projects catch fire and timelines blow up, resulting in a large amount of rework later when the content doesn't fit the templates and either the templates need to be redeveloped or the content reconfigured. Not fun.

At Bluetext, we recommend identifying user needs and expressing them as user stories during the initial design phase. Through those user stories, designers gain a better understanding of the content users need, and can apply more critical thinking to the templates they create to support that content.

With content becoming more critical to brand performance by the hour, designers have to work more closely than ever with content strategists to ensure the UX lets the desired content narrative play out without getting in the way. The more you can embed content strategy into each step of your design process, the better the user experience will be.

 

Google has done it again, quietly making a significant change to the way its algorithms process Google AdWords, a change that could pose a significant challenge for digital marketers if not understood and managed. At Bluetext, we closely monitor every update to how Google's search engine returns query results, and we have posted a number of blogs to let our clients know about these changes and how to address them.

This time, it’s a little different because this change, which Google announced on March 17, addresses AdWords, the tool companies use to implement their keyword purchasing strategies, rather than a revision of its organic search functionality. With this change, marketers may need to adjust their spending programs for purchasing the keywords that drive traffic to their sites.

In the past with AdWords, marketers would select a set of short-tail search terms to be part of their search advertising mix. For example, a hotel chain might include a simple key phrase like "best hotels in Nashville," mirroring the way customers search for a list of places to stay. Up until the latest change, that exact phrase would drive the AdWords results. But Google has decided that people don't always type their searches as that exact phrase, dropping the "in" by mistake or even misspelling it as "on." As a result, Google is expanding its close variant matching capabilities to include additional rewording and reordering for exact match keywords.

What does that mean? In layman's terms, Google will now view what it calls "function words" (prepositions like in and to, conjunctions like for and but, articles like a and the, and similar "connectors") as terms that do not actually impact the "intent" behind the query. Instead, it will ignore these function words in AdWords exact match campaigns, so that the intent of the query matters more than the precise use of these words.
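To make the mechanics concrete, here is a purely illustrative Python sketch. It is not Google's actual matching logic; the word list and comparison are invented for this example. It shows how dropping function words and ignoring word order collapses reworded queries into the same exact-match key, and why intent-changing words like "to" and "from" can't simply be thrown away:

```python
# Illustrative only: a toy normalization, NOT Google's algorithm.
# The word list and logic below are invented for this example.
FUNCTION_WORDS = {"in", "on", "to", "from", "for", "but", "a", "an", "the", "and", "of"}

def normalize(query: str) -> frozenset:
    """Lower-case the query, drop function words, and ignore word order."""
    return frozenset(w for w in query.lower().split() if w not in FUNCTION_WORDS)

# Reworded and reordered queries now collapse to the same exact-match key...
assert normalize("best hotels in Nashville") == normalize("Nashville best hotels")

# ...but a naive version also conflates "to" and "from" queries, which is exactly
# the intent-changing case Google says its matching will preserve.
assert normalize("flights to Nashville") == normalize("flights from Nashville")
```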

Sounds like a good move, because if you search for “best hotels in Nashville” or “Nashville best hotels,” the result will be the same in AdWords.

But what if the search is for “flights to Nashville,” which isn’t the same as “flights from Nashville”? Ignoring the function words “to” or “from” would change the purpose of the query. Google says not to worry, its algorithm will recognize the difference and not ignore those words since they do impact the intent.

Hopefully, Google will make good on that promise. But advertisers who have been briefed on this revision aren’t too certain. Their carefully constructed AdWords investments might take a hit if the function words are not managed precisely to meet this new approach.

We like the old adage "trust, but verify." While we take Google at its word, we know there are always growing pains with these types of revisions. For our clients, we are recommending that they carefully review the terms they are including in their AdWords mix. Our advice: be as precise as you can and factor in how these function words might be perceived before pulling the trigger. Losing traffic to your site because of the placement of a simple word should be a real concern.

Want to think more about your AdWords, search and SEO strategies? Bluetext can help.


With technology advancing the landscape of digital marketing so rapidly and personalization driving more and more customer interaction, organizations are accelerating their deployment of multiple websites and web presences. But how do you determine whether your organization needs a new website, or a digital platform?

As organizations expand, satellite and secondary websites are created, often because it's easier and quicker than integrating with the main site, and because internal IT resources may not be available to take on the project. But that approach has its own consequences. With more digital assets, organizations struggle to keep them up to date, maintain brand consistency and implement a quality control mechanism that can take all of the sites into account. The simple task of changing a logo or updating copy turns into a complex exercise of coordinating multiple groups across different parts of the organization, and getting them to respond when you need them to.

The answer is a move toward an approach based on a digital platform rather than a website, and the distinction is important. If your organization has multiple web presences, you need to ask yourself: "Would I benefit from a platform?" If you don't explore this option, you may be setting yourself up for years of frustration.

Need help determining whether your organization needs a new website or a digital platform?

Download the Bluetext Guide to Building for the Future.

It's the beginning of a new year, and that means industry experts will pull out their magic eight balls, clean their Google glasses, and attempt to see into the future. But as a marketer, it's going to be tricky to understand which trends are real and which ones aren't worth spending time or resources chasing. Here's what we can confidently predict: when technology experts take a stab at projecting the future, they both overestimate the rate of consumer adoption and underestimate the resistance from political regulation. We all love the idea of Amazon's warehouses in the sky delivering our packages by remote-controlled drones, and we may well be able to manufacture self-driving cars that are safe and efficient. But saying yes to UAVs circling our neighborhoods and giving the green light to driverless vehicles? That's a different story altogether.

Digital marketing is evolving by the hour, but we can make some predictions about what's going to be important to marketers this year. Most important is how to survive these changes. So what should we expect in 2017 that may be more down to earth and actually come to pass, and how should you plan your strategy?

To find out, download our 2017 Digital Marketing Survivor Guide.

Today's "need for speed" mantra is evident in everything we do. Your website is no exception. We all want everything to be instantly available at our fingertips, including our online experience. For websites, that means the faster the page speed, the better. Top B2B Marketing Agencies have been working with their clients for the past decade to improve page speed, looking for all sorts of tricks and tips to reduce load time and improve response. Some major players, including Akamai in the hosting space, Google with AMP and Facebook with Instant Articles, have developed significant technologies and innovations that are worth considering for your digital game plan.

A survey from Statistic Brain concluded that the average person's attention span has fallen to 8.25 seconds, down from 12 seconds in 2000, and it is projected to continue to decline. As a marketer, that means you have even less time to grab your audience's attention before they're on to the next shiny object.

Page speed is defined as the load time of one particular page on your website. Ideally, the site is completely rendered and ready to go on screen within a fraction of a second of someone typing in its URL and hitting enter. Does this seem like an impossible ask? The short answer is yes. Since a feat like this is borderline impossible in most cases, we have to set more realistic targets on the way to that lofty goal.

Here are the top three reasons why lightning-fast page speed is essential to the success of your organization's website.

1) It’s all about the User

User experience is the number one priority. Without users, of course, your site is just a jumble of HTML that serves no purpose. Site optimization is key and should be performed often.

  • Fast page load times mean users can navigate the site quickly, increasing pages per session and time on page, and (possibly) decreasing bounce rate.
  • Better numbers for the metrics listed above mean better rankings from Google.
  • Referrals become more likely when a user has had a good experience on your site.

In today’s ultra-competitive marketplace, a positive user experience could easily be the edge between your site and someone else’s.

2) The Fast and the Mobile Friendly

Google expects a mobile page to render above the fold in one second or less. Since more than half of the 3.4 billion daily Google searches are done on mobile devices, it's imperative to have a fast and mobile-friendly site. An experiment by Moz suggests that Google may actually be measuring "time to first byte" (TTFB), which is how long it takes the first byte of information to travel from the server to the browser.
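If you want a rough TTFB number for your own pages, a short script will do. This is a minimal sketch using the third-party requests library, with bluetext.com as a stand-in URL; it is not how Google measures the metric, and it includes DNS and TLS setup time, so treat the result as a ballpark figure:

```python
import time
import requests  # third-party: pip install requests

def approx_ttfb(url: str) -> float:
    """Seconds from request start until the first byte of the response body arrives."""
    start = time.perf_counter()
    with requests.get(url, stream=True, timeout=10) as resp:  # stream=True defers the body download
        resp.raw.read(1)  # pull exactly one byte of the body
        return time.perf_counter() - start

if __name__ == "__main__":
    print(f"~TTFB: {approx_ttfb('https://bluetext.com') * 1000:.0f} ms")
```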

Now that you know what Google’s looking for, there are numerous tools to help pinpoint where improvements could be made on a site’s backend. At Bluetext, we like to take out any guesswork and get our insights straight from Google. Put any URL into Google’s PageSpeed Insights tool and it provides recommended fixes, as well as a speed score on both mobile and desktop.
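You can also pull those scores programmatically. The sketch below assumes the public PageSpeed Insights REST endpoint (v5) and its Lighthouse-based response format; check the current API documentation for the exact fields, and add an API key for anything beyond occasional use:

```python
import requests  # third-party: pip install requests

# Assumes the public PageSpeed Insights v5 endpoint; verify against current docs.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_performance_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0.0-1.0) reported by PageSpeed Insights."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    for strategy in ("mobile", "desktop"):
        print(strategy, pagespeed_performance_score("https://bluetext.com", strategy))
```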

3) Page Speed + Stellar CTA = Increased # of Conversions

Page speed has a direct correlation with the number of conversions, so long as it's paired with an enticing Call to Action (CTA). For example, if a user wants to download a white paper but has to wait for the page to load, that user will lose interest and most likely leave the site. For businesses, that means a prospect bounces and may be lost for good.

Every second counts. Don't wait; start optimizing your site speed today, because if you're not recognizing the need for speed, you might as well go home. Want more tips on making a great first impression? Click here: https://bluetext.com/top-branding-agencies-know-never-get-second-chance-make-first-impression/

Need help speeding up your digital platform to get the performance you want? Contact us.

 

In the game of Internet search, where updates to Google algorithms are studied like tea leaves, every action has a consequence, and it's not always the one intended. Google, for the first time in nearly two years, has announced an update to its Penguin algorithm, which the experts who most closely monitor search engine trends have dubbed Penguin 4.0 and which may have a significant impact on search. But how we got here in the first place is an example of unintended consequences. Here's the background, and what digital marketers need to know about the new update.

As we have written many times, including on the Google “Hummingbird” release and “Mobilegeddon,” search engine optimization is a never-ending game of cat and mouse. The mouse–anyone with a website that they want to get noticed–is always trying to find shortcuts that make their website come up high in a Google search. The cat–the search engine team at Google–wants to eliminate any shortcuts, work-arounds or downright cheating. That’s the game that has been going on for many years, because doing it the way Google wants, which is by having great content that is fresh, original, updated and linked to by other sites, is hard. In fairness, Google has been doing a great job of reacting and responding to the evolving nature of Internet usage and protecting the integrity of its search engine, and has made it much more difficult to game the system.

So here’s where Penguin 4.0 comes in. One of the ways that Google has assessed where a particular webpage should rank in a search query is by looking at the inbound links on that page. The theory goes that if The New York Times is linking to it, it must have the type of credibility and value that warrants a high ranking. Lesser websites have value, just not as much as The New York Times. However, by making inbound links an important component in the rankings, the mice on the other side found ways to artificially have lots of websites link back to key pages, sometimes by buying links or engaging with networks of link builders. That was the first unintended consequence.

In response, Google launched a Penguin update in April 2012, to better catch sites deemed to be spamming its search results, in particular those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings. At first, the Penguin algorithm would identify suspicious links when it crawled the Internet, and simply ignore them when it performed its ranking in response to queries. But sometime in the middle of 2012, Google started punishing websites with bad links, not just ignoring them but actually driving page rankings down for offenders. And that set off a mad scramble as sites needed to somehow get “unlinked” from the bad sites. Which led to unintended consequence number two: Not only did webmasters have to worry about being punished for bad links, they also had to worry about rivals purposely inserting bad links to undermine their competitor’s search results. Ugh!

So, in October of 2012, Google tried to fix the problem it created by offering a "Disavow Links" tool that essentially tells the Google crawlers when they find a bad inbound link that the website in question has "disavowed" that bad link, and therefore please don't punish us for it any longer. Here's how Search Engine Land described the tool at the time: "Google's link disavowal tool allows publishers to tell Google that they don't want certain links from external sites to be considered as part of Google's system of counting links to rank web sites. Some sites want to do this because they've purchased links, a violation of Google's policies, and may suffer a penalty if they can't get the links removed. Other sites may want to remove links gained from participating in bad link networks or for other reasons."
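For reference, the disavow file itself is just a plain-text list uploaded through Google's tool: one full URL or one domain: entry per line, with # marking comments. The entries below are hypothetical examples of the format:

```
# Spammy link network contacted on 1/15/2017; no response
domain:shadyseolinks.example
# Individual page we could not get removed
http://spam-directory.example/widgets/links.html
```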

And that created yet another unintended consequence, because, unfortunately, the Penguin algorithm wasn't updated on a regular basis. So for websites trying to clean up their links, as Search Engine Land put it, "Those sites would remain penalized even if they improved and changed until the next time the filter ran, which could take months. The last Penguin update, Penguin 3.0, happened on October 17, 2014. Any sites hit by it have waited nearly two years for the chance to be free."

Penguin 4.0 addresses that by integrating the Penguin “filter” into the regular crawler sweeps that assess websites on an ongoing basis. Waiting for up to two years for a refresh is now a thing of the past, as suspect pages will now be identified–or freed because they are now clean–on a regular basis.

What does this mean for websites? It’s what we’ve been writing now for a half dozen years. Good SEO doesn’t just happen, and it can’t be manipulated. It takes hard work, an effective strategy, and a long-term view to create the kind of content and links that elevate your brand for your customers, prospects, and the Google search engine. For more tips on navigating Penguin, download our eBook now.

Download Our eBook Now




Last week I was fortunate enough to be invited by BizBash to speak at their DC event entitled “ELEVATE”.

Elevate is a one-day conference where event and meeting professionals rethink and explore the new attendee journey at events. Through in-depth workshops on event marketing, technology, design, sponsorships, and other topics, attendees discover innovative ideas and compelling insights from some of the most influential names in events.

As BizBash.com describes it, “Social media has become a key component in all aspects of business, especially live experiences. Understanding new, emerging platforms and how social media and event marketing strategies merge is an integral part of the event marketing process. In this session, Jason Siegel, founding partner of Bluetext, will discuss how to develop a three-part campaign style approach to social media to maximize event reach. Siegel will share new ways to create urgency to register, how to leverage website personalization, insights on interpreting engagement, and how to seamlessly integrate virtual reality to drive interest and registrations for events.”

The energy and buzz in the Reagan Center were strong, and it was great to get out and meet so many top marketers working in event marketing, virtual reality, and all kinds of experiential work. Please enjoy the presentation I gave below.

To learn more about how Bluetext can elevate your social presence, reach out today:




Find out today how Bluetext can help you take your business to the next level.