Search Engine Optimization (SEO) has been an important factor in website performance for well over two decades. But as understanding of Google’s (and other search engines’) algorithms has grown more refined, website managers have become wiser and more strategic about deploying their SEO strategies. To rank ahead of the competition, modern SEO implementation must be nuanced and frequently updated to keep pace with ever-evolving algorithms. In this article, we’ll look at why website schema will be a key differentiator for organic search.
Traditional Methods of SEO Alone are Insufficient
Traditional methods of search engine optimization, like metatext optimization, H1 headings, and backlinking, are still essential. In fact, they are the mere table stakes every website needs just to be considered operational in 2022. What was once the complete focus of organic search strategies has become the foundational building blocks; true optimization — distinguishing your website from the rest — now takes far more complex and nuanced forms. In 2023, we will see a noticeable shift toward prioritizing schemas.
2023 – Year of the Schema?
Website schema isn’t a brand-new concept – Google has offered “Rich Results” built on “Structured Data” since the 2000s. What has changed, however, is how important these displays are to organic performance. As existing SEO methods become commonplace, the remaining distinguishing factors grow increasingly valuable.
If you aren’t familiar with structured data markup, you may be surprised to learn that you likely interact with it every day. Structured data markup is simply website information formatted in a specific, “structured” schema (as outlined by Google’s engineers) that feeds Google’s Rich Results — the interactive elements (other than sitelinks) that you see on the Search Engine Results Page (SERP). Some common ones include:
- Job Postings
- Related Images
- Related Videos
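To make this concrete, here is a hedged sketch of how structured data is typically embedded in a page: a JSON-LD payload inside a `<script>` tag. The VideoObject fields shown come from the public schema.org vocabulary, but the values are invented for illustration, and the exact properties Google requires for a given Rich Result should be confirmed against its documentation.

```python
import json

# Hypothetical VideoObject markup -- field names follow the public
# schema.org vocabulary; all values are illustrative.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How We Build Websites",
    "description": "A short walkthrough of our design process.",
    "thumbnailUrl": "https://example.com/thumb.jpg",
    "uploadDate": "2022-11-01",
}

# Structured data is embedded in the page as a JSON-LD script tag.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(video_markup, indent=2)
    + "\n</script>"
)
print(snippet)
```

Google’s crawlers read this block alongside the visible page content, which is what makes the format so predictable and machine-friendly.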
Visual Search Density Can Improve Clickthrough
What role does structured data markup play in SEO? The most critical factor is density and representation. A hallowed rule of SEO has been one link per domain (unless you’re lucky enough to get sitelinks). Rich Results are a clear break from that rule — there’s no limit to how many Rich Results one site can return. More results simply mean more opportunities for viewers to click through to your site organically.
Furthermore, vertical density is an important consideration. Without rich results, if a prospect scrolls past your sitelink in the SERP, it’s very unlikely they will ever interact with your link again. In contrast, Rich Results can be inserted at any point in the SERP — giving a site multiple opportunities for visual interaction as a user scrolls.
How to Implement Website Schema on Your Website
It is incredibly important to understand that structured data website schema cannot and will not happen accidentally — website managers need to spend significant time and effort to ensure the data they provide matches what Google is looking for. Even a single error can cause Google to ignore the entire provided schema; Google isn’t looking to apply fuzzy logic or do favors for website schema.
Thankfully, Google provides an easy-to-reference listing of ALL the possible Rich Results and their expected schema. Traversing the whole list is daunting, but it is vital to realize that structured data schema, unlike traditional SEO strategies like metatext or image optimization, is not a cookie-cutter, one-size-fits-all solution. Before embarking on updating your website data to meet Google’s schema requirements, identify what your website’s strengths and priorities are, and then match them with a similarly minded structured data requirement.
For example, a site heavily devoted to recruitment would certainly be interested in adding the Estimated Salary and Job Posting schema items — both would be a big driver for interested applicants. A site with a focus on video could leverage various Video layouts to encourage prospects to click through in a visual way not possible through metatext alone. Websites with more complex service offerings could leverage Q&As and How-tos to convince and coax wary end users. Don’t bother trying to implement every schema — instead focus on what is most relevant for your website.
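For the recruitment example, a JobPosting payload might look like the following hedged sketch. The property names are drawn from the public schema.org JobPosting type; the values (company, salary, location) are entirely hypothetical, and the exact set of required fields should be checked against Google’s Rich Results documentation.

```python
import json

# Hypothetical JobPosting markup -- all values are illustrative.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Senior Web Developer",
    "datePosted": "2022-12-01",
    "hiringOrganization": {"@type": "Organization", "name": "Example Corp"},
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Washington",
            "addressRegion": "DC",
        },
    },
    # This nested structure also feeds the Estimated Salary display.
    "baseSalary": {
        "@type": "MonetaryAmount",
        "currency": "USD",
        "value": {"@type": "QuantitativeValue", "value": 95000, "unitText": "YEAR"},
    },
}

print(json.dumps(job_posting, indent=2))
```

Notice how one payload can serve two related Rich Results (Job Posting and Estimated Salary) — a good illustration of matching schema work to a site’s strengths.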
Google doesn’t create all of these schemas by itself. Many of the definitions come from a consortium of web stakeholders organized through Schema.org, which offers comprehensive online documentation of its data structures. Above all, note that both Schema.org’s and Google’s structures are quite fluid. Web needs constantly shift, and both organizations are in deep conversation with web users at large to identify relevant changes and schema updates. Bluetext recommends bookmarking the Schema.org changelog and referencing it once a quarter. It may be deep in the weeds, but it acts as a bellwether of which schemas and data structures are being actively updated and worked on — a good sign of increasing relevance for that data structure.
Finally, no task is complete without verification. Thankfully, Google has made that easy as well with the Schema Markup and Rich Results testers. You can easily validate website data against what Google’s crawlers see, to ensure your work did not go to waste.
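Before reaching for Google’s testers, a quick local sanity check can catch missing required properties early. This is a hypothetical helper, not an official validator — the required-field sets below are illustrative placeholders, and the authoritative lists live in Google’s Rich Results documentation for each type.

```python
# Hypothetical required-field map -- consult Google's Rich Results
# documentation for the authoritative list for each schema type.
REQUIRED_FIELDS = {
    "JobPosting": {"title", "datePosted", "hiringOrganization", "jobLocation"},
    "VideoObject": {"name", "thumbnailUrl", "uploadDate"},
}

def missing_fields(markup: dict) -> set:
    """Return required properties absent from a JSON-LD markup dict."""
    required = REQUIRED_FIELDS.get(markup.get("@type"), set())
    return required - markup.keys()

# A deliberately incomplete payload: three required fields are absent.
incomplete = {"@type": "JobPosting", "title": "Web Developer"}
print(missing_fields(incomplete))
```

A check like this is no substitute for the real testers — it only confirms presence, not correctness — but it keeps obvious omissions from wasting a validation round-trip.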
SEO is no longer a paint-by-numbers game. People have gotten too smart, and traditional monolithic strategies are simply insufficient by themselves to propel your website to the top of the rankings. Schema and website markup offer a way for dedicated and driven website managers to distinguish their organic results and win organic traffic away from lackadaisical competitors.
If you want some help in identifying opportunities for your website to leverage Rich Results and website schema, come talk to us here at Bluetext.
Maybe you’ve seen one of those large banners across your Google Analytics property: “Universal Analytics will no longer process new data in standard properties beginning July 1st, 2023. Prepare now by setting up and switching over to a Google Analytics 4 property.” Seems problematic, right? Such a warning rings an alarm and raises several good questions for digital marketers, including: What is GA4? Should I switch now? Why is Google making me change? How do I switch? Will I still be able to access my data from previous years? If your mind is buzzing with these questions about your marketing analytics data, you’re not alone. Luckily, Bluetext has done its research and is here to answer some frequently asked questions and quell any lingering fears over this transition. This article will empower you to make an informed decision about Google Analytics 4.
What are Universal Analytics and Google Analytics 4?
Universal Analytics (UA) is Google’s third iteration of its popular web analytics service. If you’ve logged on to Google Analytics in the past decade, you were more likely than not using UA. When UA launched in 2012, it was quite a technological leap, adding advanced features in cross-platform tracking and custom dimensions. It transformed Google Analytics from a simple page-view-tracking platform into a robust data reporting and attribution tool that could compete against some of the largest web-oriented business intelligence platforms, like Tealium. Most importantly, Google provided nearly the whole feature set free of charge.
Google Analytics 4 (GA4) is simply Google’s newest iteration – think of it as a new generation of analytics technologies. The web has transformed significantly since the early 2010s, and Google is merely re-platforming analytics to match today’s realities. GA4 launched in 2019 to little fanfare but only recently gained significant traction in March of this year due to Google’s landmark announcement that GA4 will be the only analytics service it supports in 2023.
Why is Google Switching to Google Analytics 4 and Ending Support for Universal Analytics?
This is a complex question – with some good answers that Google will give you and some answers you’ll need to read between the lines to get. Google’s official statement is that GA4 better reflects the modern web. UA did a woeful job reporting on non-webpage-based metrics, such as those from web apps. It was also cumbersome if your reporting needs didn’t precisely match those of a traditional website experience – e.g., single-page or non-linear web apps. GA4 is more customizable and reflects modern data collection and attribution processes better.
The underlying message here, though, is data privacy. Since UA launched nearly ten years ago, fundamental shifts have occurred in how people and the law treat data privacy on the web. Think of Edward Snowden, GDPR, and the countless data breaches of the last decade. At its core, Google realizes that this enormous cache of web data collected from millions of websites, even if not strictly Personally Identifiable Information (PII), is a huge security risk to the company. GA4 is an attempt to offset some of that risk, either removing it entirely or offloading it to individual companies. GA4’s data collection methods are more anonymized, and data retention is limited to 14 months. Overall, this is a calculated move by Google to push its analytics customers to use tools that won’t put Google in hot water.
What’s similar between Google Analytics 4 and Universal Analytics? What’s different?
While the actual end-user experience may look starkly dissimilar, the foundation remains the same. GA4 will remain an incredibly flexible web analytics platform suitable for most websites today – regardless of whether it’s a personal blog, an online retailer, or a corporate website. Most day-to-day tasks like page view tracking, user attribution, and measuring bounce rates will remain the same. GA4 merely stores these metrics and measurements in alternative locations.
That isn’t to say everything is identical. The significant differences you’ll notice every day are rooted in the architectural shift in hit types. UA treated things like page views, events, and e-commerce tracking as separate entities or “hit types.” GA4, on the other hand, treats them all as “events.” Any tracking item will now be an event: resource downloads, page scrolls, form submissions. Google is thus simplifying the old event architecture by putting everything on the same level – everything is an event with associated customizable event parameters.
For example, under UA, a resource download event might have looked something like this:
- Event Category: Downloads
- Event Action: Resource Download
- Event Label: resource_file_name.doc
Note that regardless of whether it was necessary, Events always took on this three-part hierarchy. GA4 removes this rigid hierarchy. Instead of the arbitrary “Event Action” and “Event Category” dimensions, GA4 lets you create as many custom event parameters as necessary to fully communicate an event’s nature. GA4 can track the event instead as:
- Event: Download
- Download Type: Resource
- File Name: resource_file_name.doc
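The move from UA’s fixed category/action/label hierarchy to GA4’s flat event-plus-parameters model can be sketched as a simple transformation. The event and parameter names below (`download`, `download_type`, `file_name`) are illustrative choices for this example, not an official Google mapping — a real migration should follow Google’s GA4 events reference.

```python
def ua_to_ga4(category: str, action: str, label: str) -> dict:
    """Map a UA category/action/label hit to a flat GA4-style event.

    Event and parameter names here are hypothetical examples,
    not an official migration mapping.
    """
    if category == "Downloads":
        return {
            "name": "download",
            "params": {
                # In GA4, everything is just a custom parameter.
                "download_type": action.replace(" Download", "").lower(),
                "file_name": label,
            },
        }
    # Fallback: carry the old dimensions over as plain parameters.
    return {
        "name": action.lower().replace(" ", "_"),
        "params": {"event_category": category, "event_label": label},
    }

event = ua_to_ga4("Downloads", "Resource Download", "resource_file_name.doc")
print(event)
```

The point of the sketch is the shape of the output: one event name plus however many parameters the event actually needs, rather than three mandatory slots.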
Sessions are also changing. By default, UA defined the end of a session by identifying 30 minutes of inactivity since the last event. GA4 measures the period between the first and last events in a session. GA4 also doesn’t create a new session when a user’s campaign parameters are changed. The major takeaway of these changes is that session numbers will likely be lower in GA4 than in UA.
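The practical effect of these session rules can be seen in a small model. Timestamps below are in minutes, and this is an illustrative simplification, not GA’s internal implementation: sessions split on a 30-minute inactivity gap, and the GA4-style duration runs from a session’s first event to its last.

```python
def split_sessions(event_times, timeout=30):
    """Group event timestamps (in minutes) into sessions, starting a
    new session when the gap since the previous event exceeds timeout.
    A simplified model of inactivity-based session splitting."""
    sessions = []
    for t in sorted(event_times):
        if sessions and t - sessions[-1][-1] <= timeout:
            sessions[-1].append(t)
        else:
            sessions.append([t])
    return sessions

def session_duration(session):
    """GA4-style duration: time between the first and last event."""
    return session[-1] - session[0]

# Events at minutes 0, 5, and 50: the 45-minute gap starts a new session.
sessions = split_sessions([0, 5, 50])
print(len(sessions))                  # number of sessions
print(session_duration(sessions[0]))  # duration of the first session
```

A single-event session has zero duration under this definition, which is one reason aggregate session metrics shift between the two platforms.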
Aside from these two critical areas, there are many other minor changes. While smaller in scope, these changes may affect your reporting, depending on which features you currently rely on. For example, customizable views for properties are going away in GA4. If you depend on different views, you’ll likely have to experiment with custom audience building to replicate the reporting. As mentioned before, GA4 will also only store data from the previous 14 months.
Documenting every change is beyond the scope of this blog post. If interested in getting into the nitty-gritty, read through Google’s documentation on the significant changes.
Do I Need to Switch to Google Analytics 4?
Google states that no further data will be processed after July 1st, 2023 (Customers of 360 Universal Analytics get a small extension to October 1st, 2023). While Google may extend to a further date, make no mistake, Universal Analytics will eventually be completely deprecated. If your business relies on web analytics in any form, you need to start planning soon on what your migration plan looks like – hopefully well before July of next year.
How Can I Switch to Google Analytics 4?
For most websites, merely enabling dual tracking will be sufficient. Google has made an easy setup wizard for GA4. To access it, go to the admin panel for your UA property and click the “GA4 Setup Assistant” link. You can follow Google’s instructions here, but within a few clicks, you’ll have a tracking setup that collects both UA and GA4 data. You’ll already have nearly a year’s worth of GA4 data to review once UA goes offline next year. As noted previously, be aware that no historical data will be present in GA4, even if you use this wizard. That said, it will give an excellent basis of comparison to see the reporting differences, especially as you can compare each month between GA4 and UA up until the cutoff date.
Custom events and e-commerce will require a more personalized and custom approach. We’ll cover these in future guides here at Bluetext, but for now, you can consult Google’s guides on the matter here.
I hope this guide relieved some worries and cleared up some unknowns regarding Universal Analytics and GA4. There’s a lot to cover about GA4, and this guide only scratches the surface. If you have any further questions about UA and GA4, be it migrating data, specific differences, or a transition plan, contact us to learn more about Bluetext’s analytics capabilities.
The decades-long reign of the PC is over, with mobile devices now making up more than 52% of all internet traffic. While plenty of people preach the importance of responsive website design, far fewer have articulated updated guidelines for the reality of today’s internet. Keenly aware of trends as ever, Google has continually refined its search algorithm to keep pace with an increasingly mobile and untethered internet. Advertisers, marketers, and website owners alike need to be aware of these paradigm shifts and how they could impact their sites’ SEO.
Cellphones’ bountiful data has empowered Google to enhance its search engine. Search results are more custom than ever before, incorporating key differentiating factors like time of day, weather, and geography. The search results for a morning bagel in Washington D.C. will look entirely different three hours later in San Francisco.
Optimizing for Local Search
More so than ever before, websites need to be local. Gone are the days of simply tacking on addresses and lists of phone numbers. To be competitive in 2020, websites need to address the mindset and inquiries of the region they serve, be it a street, coast, or country. A quintessential, doughy, foldable New York slice stands in stark contrast to a dense deep-dish pie from Chicago. The top result for pizza in Manhattan won’t waste content merely on cheese, sauce, and pepperoni, but on what distinguishes its slice from its New York brethren. Language, context, and local distinctions are now a mandatory part of website content strategy.
Dealing with Short Attention Spans
Changes to search algorithms are only some of the shifts introduced by the rise of mobile. Attention spans online are shorter than ever, given the ubiquity of the internet and easily accessible information — even more so on mobile, where screen size comes at a steep premium. Hero zones should be leveraged appropriately: a hero should state the most critical information concisely and contain a quick, simple CTA or takeaway. Organic visitors who cannot immediately find an answer to their search query after a glance and a few swipes will assuredly bounce away to a competitor.
Search and Virtual Assistants
Smartphones’ impact on websites has not been limited to mobility and smaller screens. Virtual assistants like Amazon’s Alexa, Google Assistant, and Apple’s Siri fundamentally change how people browse the internet. For many on the go, the automated search functionality provided by these virtual assistants has all but replaced a typical Google search.
How Google and the other virtual assistants parse webpages and present them for voice search is a complex topic, but the vital SEO fundamentals remain in place. Research demonstrates that, unsurprisingly, people are far more conversational in voice searches than in typed ones. Optimized content thus needs to serve this need directly, often through blogs that cover frequent, informal topics such as “What is the best X” or “Y versus Z”.
Google has been increasingly leveraging its structured data for voice search results, largely due to its predictable format and parseable nature. For best results, website owners need to cross-reference website content and identify what data could be passed off to Google using structured data. Articles, menus, locations, events, and reviews are just a handful of the many structured data formats that Google accepts. Conveniently, Google now provides a simple tutorial for anybody familiar with HTML to get started on incorporating structured data and improving their site for voice search.
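Question-and-answer content maps naturally onto structured data. Here is a hedged sketch of an FAQPage payload — the property names come from the public schema.org FAQPage type, while the question and answer text are invented for illustration:

```python
import json

# Hypothetical FAQPage markup -- property names follow schema.org's
# FAQPage type; the question and answer are illustrative.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is the best pizza style in New York?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "A thin, foldable slice is the classic New York style.",
            },
        }
    ],
}

print(json.dumps(faq_markup, indent=2))
```

Because each question/answer pair is an explicit, parseable unit, content in this shape is well suited to being read back verbatim by a voice assistant.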
The shift to mobile devices has opened up new avenues for content creation and design. Location and voice were unheard of topics even a decade ago, but they are here to stay for organic search. It’s up to website owners and marketers whether they take advantage of these new strategies, or get left in the dust.