Get a handle on the delicate balance between competing SEO demands and learn tips to avoid potential pitfalls.
We all want to demonstrate strong results for our clients or stakeholders. But sometimes, pushing one metric to an extreme can undo our efforts. This is most obvious between different disciplines and departments.
For example, designers (or UX / CRO specialists) may think they can increase a site’s conversion rate by 10% by cutting content and giving the site a more streamlined look. But if that 10% increase in conversion rate comes at the cost of 20% of organic traffic, then it’s probably not a good trade.
These conflicts are common, especially between competing disciplines and roles. But even within one discipline, like SEO, similar issues can arise.
This article looks at some competing forces in SEO and how to approach them.
- Volume of URLs: Ranking footprint vs. crawl efficiency
- Links and content: Quality vs. quantity
- Keyword optimization: Sparse vs. spam
- User experience: Speed vs. functionality
- Regional deployment: Local focus vs. global reach
- Internal linking: Connected vs. cumbersome
Volume of URLs: Ranking footprint vs. crawl efficiency
Working on a large site with plenty of webpages? Some SEOs might think more pages and content items are synonymous with a broader indexing (and therefore ranking) footprint.
But more URLs on your site doesn’t always equate to more potential ranking opportunities or organic traffic. This is especially applicable to sites that suffer from poor architecture.
For example, take ecommerce sites that include the product category within product-level URLs and also allow products to be nested within multiple separate categories. In such a situation, you can end up with:
- Mysite.com/category-1/product-1/
- Mysite.com/category-2/product-1/
- Mysite.com/category-3/product-1/
Since all of the above resolve to the same product page (product-1), there are now three URLs for the same page (duplicate content).
This means that Google will eventually end up crawling the same product three times. Two of those crawls could have gone to different products or content, and that content could then have gone on to rank.
So, in this situation, inefficient use of the crawl budget actually ends up harming the velocity at which new content is ranked.
Hopefully, Google will still crawl all the unique, distinct content eventually, but it may take longer, and newly published content will take longer to start performing.
Several other scenarios can cause this same phenomenon.
For example, different filtering combinations on a site with faceted navigation can spawn an exponentially expanding volume of parameter URLs.
A single non-filtered category page may end up with 10 or even 100 parameter variations as different filters are applied.
We can just put canonical tags on the highly duplicative pages that we don’t want Google to index, and that will handle the content duplication issues, right?
While that is true, Google still has to crawl and visit the non-canonical addresses to see that they are non-canonical (to read their embedded canonical tags).
Canonical tags only help to alleviate content duplication, but they don’t really help much with crawl efficiency and content discovery.
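For reference, a canonical tag is simply a link element placed in the head of each duplicate URL, pointing at the preferred address. A minimal sketch using the hypothetical URLs from the list above:

<!-- placed in the <head> of mysite.com/category-2/product-1/ and mysite.com/category-3/product-1/ -->
<link rel="canonical" href="https://mysite.com/category-1/product-1/">

Google treats this as a hint rather than a directive, and it still has to fetch the duplicate URL to read it.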
You could argue that this is where you deploy complex wildcard robots.txt rules. Still, you must be careful in that area, as you can easily unintentionally cut off chunks of organic search traffic.
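As a rough sketch (the parameter names are hypothetical), a targeted rule blocks only the facets you never want crawled, while a broader wildcard can catch far more than intended:

User-agent: *
# Block specific facet parameters from being crawled
Disallow: /*?*color=
Disallow: /*?*size=
# A broader wildcard such as Disallow: /*? would block every parameterized URL,
# including variants you may still want crawled

Test any pattern with a robots.txt testing tool before deploying it.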
The best practice is to implement correct URL architecture and supporting redirects. If you have those in place, not much can go wrong.
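For instance, assuming you settle on category-free product URLs (a hypothetical structure such as mysite.com/products/product-1/) and run an Apache server, a single rule can collapse every category-nested variant onto one address:

# 301 any /category-x/product-y/ URL to its single canonical path
RedirectMatch 301 ^/category-[^/]+/([^/]+)/$ /products/$1/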
Often, canonical tags are deployed as a band-aid solution after issues have already arisen. But they’re quite a messy patch to a more fundamental problem.
Links and content: Quality vs. quantity
On the surface, this seems like a no-brainer. Google has consistently stated that quality content and links matter more than mass-manufactured spam.
SEOs and digital PR specialists often spend weeks attempting to create great content and secure a single high-value placement to knock the competition off their ranking pedestal.
No SEO worth their salt would argue that mass-spun content and spam links are effective tools. These tactics are ineffective if you expect to maintain a long-term online brand, a foundation of business that you can build upon over time.
So, is there a place for quantity on a quality-first web, where higher quality signals matter more?
Yes. If you have worked on large sites for enterprise-tier clients, you’ll know that such brands (and those they compete with) already have high-quality links and content.
Quality never becomes irrelevant, but quantity once again raises its head.
For such clients, the game is about delivering a quantity of quality. In these situations, both dimensions (quantity and quality) matter.
When your site has such powerful ranking equity, every minute you’re not delivering new content that targets new keywords is lost time and traffic.
Every moment you earn three high-value links while your competitor wins 10 can be a moment of failure.
For such high-caliber sites and clients, the goalposts change entirely. The only way to earn high-value links quickly enough is to do very noticeable things in the real world, like:
- Sponsoring charities and speaking at educational institutions.
- High-impact PR stunts and the activity types that connect with an audience. (Hint: not web directory submissions!)
You stop thinking about building links and individual placements and start thinking, “How can we go out there and do something newsworthy?”
While quality is sovereign, don’t forget that quantity is still required within the most competitive spaces.
Quality real-world activity can deliver a quantity of quality links. That’s where you want to be.
Dig deeper: How to use digital PR to drive backlinks and business growth
Keyword optimization: Sparse vs. spam
A content gap analysis can present you with two primary findings.
- A keyword isn’t ranking well enough because the connected content doesn’t make enough of that concept (gap “in” content).
- There’s a missing page that you need to create for your website (gap “of” content).
In the former scenario, you’re likely to open an existing page and work out where you could deploy the underperforming keyword.
Or you might go further and determine whether an additional content section is required.
Either way, you’re browsing a content page and looking for a keyword deployment opportunity. And what’s wrong with that? It’s what we’re paid to do: tweak content and get each item performing to its optimal standard.
We want to identify sparse, thin, underperforming content that doesn’t say enough.
But it’s a thin line between sparse content with too few referenced topics (too few keywords) and spam content, which is nothing but keyword injections.
Even before Google’s well-known Panda update, there were attempts to curb the “keyword enthusiasm” of SEOs.
Content that lacks topical relevance doesn’t have the weight to penetrate Google’s SERPs. By contrast, content that is too optimization-heavy sinks.
Keep these competing forces in mind when optimizing or reducing the optimization intensity of your content. Your content must be heavy enough to penetrate but not so heavy that it sinks.
User experience: Speed vs. functionality
WordPress is known as an SEO-friendly content management system (CMS). But often, site owners want more functionality than the default CMS provides, so they start installing many plugins.
Fairly quickly, site performance deteriorates as pages load slower and slower.
Shortcodes must be queried and converted to HTML / CSS, which involves additional calls to various tables in the database.
Additional scripts pile up in the browser’s main thread, creating execution bottlenecks.
Getting a good balance of page-loading speeds and functionality was fairly easy in the past.
As long as you minified your scripts and sheets, compressed your images and installed a caching plugin, you were good to go.
Those days are over.
Nowadays, Google wants us to pay attention to what happens on the end user’s browser’s main processing thread. There’s no point shipping 5-10 scripts to a user really efficiently if all of that JavaScript then sits in the browser’s main processing thread waiting to be executed.
As such, we now have to consider:
- JavaScript code audits.
- Intelligent JavaScript deployment (only call scripts on the pages where they are needed; see the sketch after this list).
- Server-side rendering (SSR).
- JavaScript parallelization.
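As a minimal sketch of intelligent deployment (the selector, file path and function name are hypothetical), a heavy script can be pulled in only on pages that actually contain the element it powers:

<!-- load the carousel bundle only where a carousel actually exists -->
<script type="module">
  const carousel = document.querySelector('.product-carousel');
  if (carousel) {
    // dynamic import keeps this bundle off every other page's main thread
    import('/js/carousel.js').then(({ initCarousel }) => initCarousel(carousel));
  }
</script>

Because module scripts are deferred by default, the check itself also stays out of the way of first paint.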
You can still achieve high functionality combined with high speed. It just takes a lot more work (and intelligence) than previously.
Forging an effective critical JavaScript/CSS rendering path is not for the faint of heart.
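As a rough illustration of one common pattern (the stylesheet path is hypothetical), the CSS needed to paint the top of the page is inlined while the full stylesheet loads without blocking rendering:

<style>
  /* inline only the rules needed for above-the-fold content */
  header, .hero { display: block; }
</style>
<!-- load the full stylesheet without blocking first paint -->
<link rel="preload" href="/css/main.css" as="style" onload="this.onload=null; this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/css/main.css"></noscript>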
If you can spare the senior development time, you can have a relatively feature-rich and fast site running on mediocre hosting.
It will take more time than ever, so be prepared.
Regional deployment: Local focus vs. global reach
This is a trap that can spring both ways.
You can aim for global reach without sufficient content, architecture and authority. In such a situation, you may wish you had picked a more localized (national) domain instead, or aimed at your local area with NAP (name, address, phone number) signals. Sometimes it’s better to walk before you can run, and over-extending your reach too quickly can lead to failure (success on neither the local nor the global front).
On the other hand, going with a local approach when you have global ambition can really lock you in. For example, it’s unlikely that a .co.uk (UK) domain would rank well in France or Germany.
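One common way to keep the door open is a generic top-level domain with regional subfolders, annotated with hreflang so each market gets its own variant. A minimal sketch with hypothetical URLs:

<!-- placed in the <head> of every regional variant, each listing all of its siblings -->
<link rel="alternate" hreflang="en-gb" href="https://mysite.com/uk/">
<link rel="alternate" hreflang="fr-fr" href="https://mysite.com/fr/">
<link rel="alternate" hreflang="de-de" href="https://mysite.com/de/">
<link rel="alternate" hreflang="x-default" href="https://mysite.com/">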
It’s important to realize that none of these decisions are concrete. If you lock yourself down locally, you can buy new domains and perform site migrations.
In such a situation, you’ll likely lose at least a little of your ranking power, so you should only jump ship (from one domain to another) once your site has gained critical mass.
If you’re only seeing a few hundred organic sessions monthly, it’s probably not time to make that move yet.
Depending on your ambitions, a local or global approach may be better.
If you’re a local vacuum repair shop, few people outside your area are likely to be interested in your business. Aiming for global SEO might be a bit of a reach.
If there are only two other vacuum shops in your local area, a locally targeted SEO campaign would almost guarantee you a spot at the top of the search results for relevant local terms. That’s much less effort than reaching out to potential consumers across the seas.
If you’re a well-known fashion brand and branching out from clothing to other items like scents (cologne), you’d probably expect some business from other countries.
Take the actions that will bring you the fastest revenue with the least effort.
If you’re a small business without enough ranking power to rank globally, go local first and circle back later. Otherwise, aim high and chip away.
Internal linking: Connected vs. cumbersome
Adding a few choice links within your content is great, perhaps to support orphaned pages or top-performing products. That said, there’s such a thing as having too many internal links.
Imagine a page where every other word or phrase was linked to a destination URL. How would you determine where to go?
It would seem like every snippet of text was competing for your attention equally.
This would be problematic for end users of your website. The same can also be said for search engines.
If every text item is a link to somewhere else and every page on your site supplies and receives thousands of links, how can a search engine interpret which pages are more or less valuable?
Even contextual analysis and thematic page categorization become much trickier.
Navigating competing forces in SEO
Balancing competing forces in SEO requires a strategic approach.
- Volume of URLs: Prioritize efficient crawling over excessive content. Optimize URL architecture and redirects to prevent duplication and wasted crawl budget.
- Links and content: Emphasize quality while considering the quantity needed in competitive spaces. Focus on delivering high-value content and seeking impactful real-world link building opportunities.
- Keyword optimization: Strive for a balance between sparse content and keyword stuffing. Ensure topical relevance without overwhelming optimization.
- User experience: Aim for a blend of speed and functionality. Optimize JavaScript deployment, consider server-side rendering, and manage execution bottlenecks for a smooth user experience.
- Regional deployment: Tailor your approach based on goals. For local focus, prioritize NAP signals. For global reach, focus on content, architecture, and authority.
- Internal linking: Maintain a connected structure without overwhelming users or search engines. Include relevant links to enhance navigation while avoiding link overload.
Adopting a holistic and adaptable strategy that respects the nuances of each force can help you make your SEO efforts more manageable.