Google Archives - Bruce Clay, Inc. | SEO and Internet Marketing
https://www.bruceclay.com/blog/tag/google/

5 High-Impact Tips To Optimize Your Google Business Profile
Thu, 06 Mar 2025
https://www.bruceclay.com/blog/tips-optimize-google-business-profile-local-seo/

Woman on phone shopping outside local clothing store.

Optimizing your Google Business Profile (GBP) might be the missing piece in your local SEO strategy.

If you have a business with a local service area, a well-optimized GBP is the key to visibility in the “Local Pack” search results and on Google Maps.

According to a study by BrightLocal, the average business profile is found in over a thousand customer searches each month — 84% of which are discovery searches.

Brightlocal chart showing Proportions of direct and discovery searches per Google Business Profile listing.
Image credit: BrightLocal, “Google My Business Insights Study – Benchmark Your Business’s GMB Insights” (2019)

This means that optimizing your GBP could be exactly what you need to reach new customers who are actively searching for products or services like yours.

In this article, I’ll walk through five key tips to optimize your Google Business Profile and boost your local visibility.

The Benefits of Google Business Profile Optimization

Why optimize your Google Business Profile? Here are a few benefits:

  1. Better visibility: Detailed business profiles can attract people who are simply looking for a product or service, not a business name.
  2. Build trust and engagement: Using high-quality photos, adding specific categories and including positive customer reviews improves your credibility and helps you connect with your customers better.
  3. Direct customer interaction: Google Posts, customer reviews and messaging give you a direct line to your customers, creating more opportunities to build loyalty, respond to inquiries and promote special offers.

5 High-Impact Tips for Optimizing Your GBP

1. Create and Verify

Do you have your Google Business Profile yet? If you do, have you verified it?

First things first: Sign in to your Google Account (or create one using your business email). Then go to business.google.com/create, enter your business name and select it if it appears in the suggestions.

Google find and manage your business page.
Image credit: business.google.com/create

Google offers step-by-step instructions on how to complete this piece here.

Next, you want to verify the profile. A verified Google Business Profile can be a big influence on Local Pack ranking, and you can see evidence of this in the 2023 Whitespark Local Search Ranking Factors Report (No. 9).

Image credit: Local Search Ranking Factors 2023 report, Whitespark

Basically, Google wants to know you are a legitimate business, and there are some steps you have to go through.

The whole process of getting your GBP verified and up and running in search can take a few weeks (or more) even when executed efficiently, depending on the type of verification.

You can see Google’s help file here on verification and explore the different ways to verify, including phone/text, email, video recording, live video call or mail.

2. Fill in All the Details

A more complete GBP may influence how well you do in the search results. So, take your time on this step and make it as robust as you can.

For example, Extra Space Storage’s GBP for Thousand Oaks has multiple photos and a video by the owner, products, categories, hundreds of questions and answers, a robust description, recent updates, links to social profiles and more.

Extra Space Storage GBP on Google Maps

Here are all the details available for you to fill in, as outlined in Google’s help file:

  • Business name: Enter your exact business name as shown on all branding. If you change it post-verification, re-verify.
  • Category: Choose a primary category that represents your business accurately. You can add up to nine additional categories, but more may not necessarily be better. Avoid using categories solely for keywords or attributes.
  • Address and pin location: Input the complete address or leave it blank if customers don’t visit your physical location. Address changes require re-verification.
  • Service area: Specify the local area you serve based on cities, postal codes, etc., to clarify where you deliver or visit.
  • Hours: Set regular hours, plus don’t forget holiday and feature-specific hours.
  • Phone number: In addition to the primary business number, you can add up to two more contact numbers (not fax). Use a local number for your primary, and you can add a toll-free number as a secondary if you have it.
  • Website: Add your website and ensure Googlebot can crawl your site; some business types may add links for online orders or appointments.
  • Social media links: For eligible regions, add one link per social platform, including Facebook, Instagram, LinkedIn and others.
  • Attributes: Add attributes like “Wi-Fi” or “outdoor seating.” Some attributes rely on customer feedback.
  • Business description: Write a description (up to 750 characters) highlighting offerings, history, mission and unique points about the business or brand. Take the time to craft this section as it contributes to the first impression of your business. Avoid URLs, HTML or promotional offers. Experiment with breaking the text up for easy reading (instead of a large wall of text).
  • Opening date: Enter the opening date to inform customers of business longevity; only month and year are required.
  • Menu/services: Use the Menu and Services Editor for accurate menu or service listings (limited feature available for certain industries). Make sure to craft your service offerings with care and include relevant keywords naturally.
  • Products (retail): Add your products and put thought and time into product details, weaving in relevant keywords where it makes sense. You can automatically add products by linking a POS system with Google’s Local Inventory app. This is a limited feature at the time of writing.
  • Hotel features (hotels only): List check-in/check-out times and hotel amenities to keep guests informed.
  • Car dealership inventory (dealers only): Display available cars for sale, manage inventory and use preferred data providers if eligible. This is a limited feature at the time of writing.

Pro tip: Make sure to get your NAP+W (name, address, phone number and website) right as part of a greater strategy for NAP+W consistency across the web — inconsistencies confuse Google.
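
One practical way to keep that NAP+W consistent on your own website is to publish it as schema.org LocalBusiness structured data, so your site, your GBP and other citations all say the same thing. The TypeScript sketch below is illustrative only: the business details are placeholders, and this markup is a complement to, not a requirement of, your Google Business Profile.

```typescript
// Minimal sketch (assumed details): mirror your NAP+W on your own site as
// schema.org LocalBusiness structured data so every listing tells the same story.
const localBusiness = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Hair Salon", // placeholder -- must match your GBP name exactly
  url: "https://www.example.com/",
  telephone: "+1-805-555-0123",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Main St.",
    addressLocality: "Thousand Oaks",
    addressRegion: "CA",
    postalCode: "91360",
  },
};

// Inject the JSON-LD into the page <head> so crawlers see consistent NAP+W.
const ldScript = document.createElement("script");
ldScript.type = "application/ld+json";
ldScript.text = JSON.stringify(localBusiness);
document.head.appendChild(ldScript);
```

Generating this markup from the same data source that powers your site footer and contact page is one way to keep the details from drifting apart.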

3. Optimize Categories

The right category for your GBP may be the most important ranking factor in the Local Pack.

Most SEOs agree, as showcased in the Whitespark 2023 Local Search Ranking Factors study referenced earlier:

Image credit: Local Search Ranking Factors 2023 report, Whitespark

For this, you want to choose the most relevant primary category for your business.

Be specific, such as “hair salon” instead of just “salon.” If your ideal category isn’t available, pick the closest general option from the list.

You should have plenty to choose from though. This list of GBP categories from Propellic shows thousands (almost 4,000).

When you pick your primary category, think about what your business is at its core.

Here, Google gives a guiding principle for picking a category:

“Select categories that complete the statement: “This business IS a” rather than “this business HAS a.” The goal is to describe your business holistically rather than a list of all the services it offers, products it sells, or amenities it features.”

While you can add up to nine more categories, start with an additional two or three — and only if they are relevant.

As Google points out in the help file linked above: “Don’t add a category for every product or service.”

However, it can be helpful to add categories based on special departments or services, as Google says here:

“For example, if you manage a grocery store that includes a pharmacy and deli, choose “Grocery store” as your primary category, then add “Pharmacy” and “Deli” as additional categories.”

Use the attributes feature to refine what you have to offer even more.

Experiment over time to see how different categories may impact your business.


4. Add Images

You want to make sure you do as much as possible to own the first impression of your business. And sometimes photos can be that first impression.

A positive image of your business can drive conversions. BrightLocal research (cited earlier) found that businesses with more than 100 images on their GBP get 520% more calls, 2,717% more direction requests and 1,065% more website clicks than the average business.

BrightLocal graph showing average monthly customer actions by number of images.
Image credit: “Google My Business Insights Study,” BrightLocal.com

Both business owners and customers can upload images, but it’s important that you take responsibility for getting some high-quality images up.

Going back to the example shared earlier with Extra Space Storage, you can see the difference in photo quality between what the owner uploaded and what a customer uploaded.

The owner’s photo is on the top, and the customer’s on the bottom:

Extra Space Self Storage photos.

If you were to only rely on customer photos, you’d be missing out on a big opportunity.

You can include photos such as your logo, a cover photo and additional photos of your business. Don’t forget you can add video, too.

Google has a helpful file on the types of photos you can include and why you’d want to include them, here.

Sample of business photo ideas, “Tips for business-specific photos on your Business Profile,” Google Business Profile Help

You can learn more about photo and video guidelines and specifications in Google’s help file, here.

Aim to keep your photos fresh and update them on a regular basis — try adding a new photo every week.

5. Communicate and Update

There are different ways to engage with your audience through your GBP, and one great way is to take advantage of Google Posts.

Google Posts serve as mini announcements of what’s going on at your place of business. These posts can help you stand out from the competition and engage further with your audience.

Your audience can get notified of your updates, too.

One study, done informally and posted on Moz, found that these posts are highly engaging but that about 40% of businesses do not take advantage of them.

You can post anything from promotions and events to key updates about the business and any relevant news.

This Google help file gives more general information on Google Posts, and Google gives guidelines on the content you create here.

Aim to keep this area of your GBP fresh; posting once a week is a good cadence.

Final Thoughts

There’s much you can do to continuously improve your Google Business Profile optimization.

To start, focus on high-impact tactics like verifying your profile, filling in all the details, selecting precise categories, adding quality images and engaging through regular updates and Google Posts.

Remember, your GBP often serves as a searcher’s first impression of your local business. Keeping your profile accurate, visually appealing and informative not only boosts local search rankings but also builds trust and attracts new customers.

For more local SEO tips, see our big local SEO tips list.

Our SEO experts can help you optimize your Google Business Profile for peak performance and generate more traffic.


FAQ: How can I improve my Google Business Profile visibility to attract more customers through local SEO?

An optimal Google Business Profile is essential to drawing customers through local SEO. Taking steps to optimize and expand your online presence ensures your business can reach more customers.

Set up your Google Business Profile by providing accurate details for your name, address, phone number, website and categories. Google uses these details to match local searches with local businesses more accurately. Be sure to add appealing images and posts that draw customers in.

Ask satisfied customers to leave reviews; they demonstrate your commitment to customer service while building trust in your business. Create an effective review strategy by inviting all reviews – both good and bad – right after a sale or service.

Local citations are of equal importance. Make sure your business information is consistent across various online platforms like directories and social media; this will keep search rankings higher while helping customers quickly locate you.

Include local keywords in your business description and posts to improve your visibility in local searches and make it immediately clear who and where you serve.

Stay engaged with your community by using Google Posts and answering inquiries in the Q&A section. Keep your profile current by regularly adding updates that inform potential customers of special offers or events happening near your location.

Implement these strategies to boost your visibility on Google and attract more traffic and customers. Stay consistent, and adjust as trends and customer preferences change over time.

Step-by-Step Procedure

  1. Sign into your Google Business account.
  2. Verify that all of your business details are complete, current and accurate.
  3. Add images to show off your business, products and services. Make sure they are high-resolution.
  4. Customize your business description by including relevant local keywords.
  5. Take steps to verify your business location using Google’s verification process.
  6. Establish clear business hours, including any holidays that may arise.
  7. Share updates, special offers and exclusive events on Google Posts.
  8. Respond swiftly and appropriately to customer reviews, feedback and inquiries.
  9. Encourage customers to submit positive reviews following purchase.
  10. Monitor customer profile insights to evaluate customer engagement.
  11. Make adjustments to your profile according to feedback and performance data.
  12. Use questions and answers to address common customer inquiries.
  13. Keep your business information consistent across platforms.
  14. Engage with local community events and spread the word through your profile.
  15. Make use of local SEO tools to measure performance.
  16. Adjust based on local search trends.
  17. Join forces with local influencers to increase profile visibility.
  18. Use social media to drive visitors to your Google Business Profile.
  19. Stay relevant by regularly adding fresh content.
  20. Evaluate competitors’ profiles to gain insights and make improvements.
  21. Use Google Ads to complement your organic efforts with paid visibility and additional traffic.
  22. Apply advanced SEO strategies to reach more customers and expand your local SEO visibility.
  23. Look for ways to cross promote your brand and work with other local businesses.
  24. Always review and assess how your local SEO strategy is doing. Make adjustments as necessary.

5 Ways to Future-Proof Your Website Against Google Core Updates
Mon, 03 Feb 2025
https://www.bruceclay.com/blog/ways-future-proof-website-google-core-updates/


Ah, core updates … a mix of thrill, uncertainty and confusion. But we don’t need to fear core updates — if we take the right approach to our websites.

In this article, I’ll give five quick tips on how to future-proof your website so it’s ready when those pesky core updates hit.

What Do Google Core Updates Do?

What happens during a Google core update? Imagine you and your competitors have all done the “right” SEO things to your sites — essentially leveling the playing field in the search results.

Core updates come in and can shift the balance, so some sites win and some sites lose.

This can happen by reweighting certain factors in the algorithm, emphasizing specific variables or prioritizing different types of content (like informational over navigational).

Your competitors may have sites that the algorithm now favors, making them the “least imperfect” in this new landscape.

“Least imperfect” means that no website will ever be optimized perfectly to match the Google algorithm, but some will get more things right than others.

In a core algorithm update, your competition that came out on top didn’t necessarily work harder — they just had the right mix of things.

So, how do you become the “least imperfect” and reclaim your rankings?

Focus on proactive strategies to strengthen your site. That means doubling down on quality, relevance and usability — the areas that will help you stand out in every algorithm iteration.

Now, let’s look at the five areas you should put your SEO efforts into.

5 Ways to Future-Proof Your Website Against Google Core Updates

1. Know Why You’re Creating the Content

Reality check: Why are you creating the content for your SEO program?

If you said “to rank for a keyword,” you’re not alone. While this is a goal in SEO, we can’t let it become the master.

A means to an end (ranking for a keyword) should not overshadow the ultimate purpose (serving your audience).

Google’s guide on creating helpful, reliable, people-first content suggests we think about our “why”:

“Why” is perhaps the most important question to answer about your content. Why is it being created in the first place?

The “why” should be that you’re creating content primarily to help people, content that is useful to visitors if they come to your site directly. If you’re doing this, you’re aligning with E-E-A-T generally and what our core ranking systems seek to reward.

If the “why” is that you’re primarily making content to attract search engine visits, that’s not aligned with what our systems seek to reward.

It’s possible you’ve focused too much on churning out tons of content without taking the time to understand what your audience finds helpful.

So take a deep breath and think about this existential question. You might be surprised at how it could shift your content strategy for the better.

2. Dive into the SQRG and E-E-A-T

Pour a cup of coffee, get comfortable and explore some of SEO’s most-loved acronyms, including the “SQRG” (Search Quality Rater Guidelines) and E-E-A-T (experience, expertise, authoritativeness and trust).

These resources are a wealth of knowledge when it comes to creating quality content that Google wants to reward.

Since quality content is at the heart of many of these Google core updates, we want to better understand what it is that Google values.

Danny Sullivan over at Google gave folks a gentle reminder about this back in 2018, pointing people to the guidelines as a window into how Google Search defines quality.

In a nutshell, the SQRG is useful for anyone doing SEO because it gives us a glimpse into the inner workings of how Google Search defines quality.

The SQRG was created for a Google program where hired staff use the guide to evaluate the search results for certain queries and then report back what they have found.

This, in turn, acts as a feedback loop for Google engineers to make further tweaks to the Search algorithm. It helps Google understand if the changes it’s making to its Search algorithms are producing quality results.

So, you can imagine the potential nuggets of useful information that can be found in the guide.

To expand on this, E-E-A-T is a concept presented in the SQRG that discusses the principles of quality in more detail.

If you’re not quite ready to dive into the hundreds of pages in the SQRG, here is your cheat sheet on E-E-A-T:

  • Trust is at the center of E-E-A-T. A page must be accurate, honest, safe and reliable.
  • Experience is about the need to have first-hand knowledge of a topic, which is obtained through direct interaction or involvement.
  • Expertise is the necessary knowledge or skill needed to write about a topic.
  • Authoritativeness builds on expertise and is about being a go-to source on a topic.

For more on this, I highly recommend you read through Google’s SQRG and explore our in-depth guide on E-E-A-T.

3. Handle AI Content with Care

In a reality where everybody and their brother can use AI to create content, it’s a bit of a Wild West situation.

So, we must be responsible for what we put out there. Google is already fighting AI content in the search results with new spam guidelines targeted to AI-generated content at scale.

Its stance on AI-generated content is a bit more nuanced, though: Google doesn’t care how you create the content, but it does expect the content to uphold quality principles.

In other words, you need to keep the bar high for any AI-generated content, asking the “why” and reviewing the content for E-E-A-T.

Google has gone so far as to suggest that disclosing how you used AI in your content could be a good idea, though I think most companies are still a bit shy about coming clean.

Here’s an excerpt from a Google help file that discusses this:

Many types of content may have a “How” component to them. That can include automated, AI-generated, and AI-assisted content. Sharing details about the processes involved can help readers and visitors better understand any unique and useful role automation may have served.

If automation is used to substantially generate content, here are some questions to ask yourself:

  • Is the use of automation, including AI-generation, self-evident to visitors through disclosures or in other ways?
  • Are you providing background about how automation or AI-generation was used to create content?
  • Are you explaining why automation or AI was seen as useful to produce content?

Overall, AI or automation disclosures are useful for content where someone might think “How was this created?” Consider adding these when it would be reasonably expected.

I’ve written at length about how to create quality content and integrate AI content into your SEO processes in other articles here on the blog, so explore those if you want to go deeper.

4. Invest in Technical SEO for User Experience

A site that runs well creates an environment that is better for SEO.

Technical SEO is about addressing elements of a website you don’t see — the things that give users and search engines a good experience.

Google has recently doubled down on its user experience advice, making “page experience” a focal principle.

Google’s page experience brings together different factors that impact a website’s user experience. Getting these factors right can give you a leg up on your competition in the search results.

There’s much that goes into making a site run well. That’s why we have technical SEO.

This includes strategies like site structure, optimizing for core web vitals, mobile-friendliness, HTTPS, optimizing a website for crawlers and more.

Be sure to check out 5 Technical SEO Best Practices You Can’t Ignore.

5. Get a Website Audit

How will you know if your site can withstand a Google algorithm update? Get a technical SEO audit.

A technical SEO audit looks at all of the things that are hurting website performance and preventing it from ranking.

There are different kinds of audits — some are good for uncovering surface-level problems. However, the best audits take time and effort, and are completed by technical SEO experts.

As I’ve written about before here on the blog, the three typical audits include:

  • The software-driven SEO audit: A software tool uncovers superficial SEO issues (many of which may be useful). However, the tool produces a generic report. This is appropriate when you don’t have an auditing budget, or you want to check some basics yourself before starting with an agency.
  • The blended SEO audit: An SEO vendor uses software for an audit and gives additional SEO insights but without many solutions. They may identify problems that data analytics alone can’t uncover. But, without in-depth solutions, this only points to possible trouble areas.
  • The in-depth technical SEO audit: An SEO vendor or expert performs an in-depth technical audit that requires the expertise of a seasoned SEO who specializes in technical website analysis and SEO business strategy. This is a manual review supported by tools, and it takes many hours (for instance, our audits here at Bruce Clay Inc. take more than 100 hours).

If you’re interested in learning more about audits, here is an article I wrote about it: SEO Audits and Tools: The Good, The Better and The Best.

Final Thoughts

Core updates are unpredictable. We can’t control Google’s algorithm, but we can take steps to ensure our sites are ready for whatever future updates bring.

Remember: your goal is to beat the competition, not the algorithm.

Follow the five steps outlined in this article, and you’ll be able to better weather any core updates that come your way.

Don’t let the next core update catch you off guard! Our SEO experts can analyze your site, identify risks and create a strategy to future-proof your rankings.



FAQ: How do I determine whether my website has been positively or negatively impacted by Google core updates?

Knowing how Google core updates impact your site — both positively and negatively — is key to staying visible online. These updates are major changes to Google’s search algorithms that aim to make results more relevant and higher quality, and they can shift visibility and traffic in either direction. Let’s dive in.

Before doing anything else, monitor your search rankings and traffic using tools such as Google Search Console and Google Analytics. A sudden decrease in rankings or traffic could signal trouble, while an increase after an update is good news. Either way, compare the dates of your performance changes against the dates of known core updates to see whether they line up.

Content quality matters as well. Google rewards high-quality, relevant and trustworthy information; if your material is thin, shallow or outdated, it could get hit hard by an update. Review your content to ensure it meets Google’s standards, refreshing or removing low-quality pieces as appropriate.

User experience matters too, with core updates often favoring websites that deliver an outstanding user experience. Test how quickly your site loads, whether it’s mobile-friendly and how usable it is overall; tools such as PageSpeed Insights can help identify areas for improvement. Negative experiences drive visitors away and can drag down rankings over time.

Engagement metrics such as time on page, click-through rates and return visits provide a measure of how users interact with your content. High engagement signals that your material is relevant; low engagement points to areas for improvement.

Backlinks play an equally vital role; a sudden loss of links or an influx of low-quality links could damage your rankings. Tools like SEMrush and Ahrefs can help you track backlink health.

Researching your competitors is important, and it can certainly be eye-opening. Watch how their performance after an update compares with yours; if they gained ground while you did not, they likely adapted their content and SEO strategies better than you.

Discussion forums and communities for SEO professionals can offer invaluable insight. Sharing experiences may bring to light patterns or issues you hadn’t noticed; ultimately, this collective wisdom can contribute towards improving SEO strategies.

Hire SEO specialists if you’re having difficulty diagnosing the problem. They can conduct an extensive analysis of your site’s performance and recommend tailored strategies to get things back on track quickly, and their expertise can help you make sense of Google’s ever-evolving algorithm changes.

Recognizing the effect of Google’s core updates on your website involves considering various elements: performance metrics, content quality, user experience, engagement levels, backlinks, and competitor behavior. By piecing together this puzzle you will have a clearer view and can take steps to optimize it for future success.

Step-by-Step Procedure

  1. Log into Google Search Console: Once logged into the console, access data such as impressions, clicks, and average positions for more accurate analysis.
  2. Analyze Traffic Patterns: Evaluate changes in traffic before and after a core update.
  3. Analyze Rankings: Use keyword tracking tools to monitor ranking changes for key terms.
  4. Conduct a Content Audit: Evaluate the quality of your content against Google’s benchmarks for measuring quality content.
  5. Assess User Experience Metrics: Assess metrics like speed, mobile friendliness, and usability as indicators of customer experience.
  6. Evaluate Engagement Metrics: Analyze time on page, bounce rates and click-through rates to understand how users interact with your content.
  7. Monitor Backlink Profile: Use tools such as SEMrush to monitor your backlink profile for changes, losses or risky links.
  8. Conduct Competitor Analysis: Evaluate how well your rivals are faring and what strategies they employ.
  9. Join SEO Communities: Participate in forums to exchange knowledge and get input from other practitioners.
  10. Seek Assistance From an SEO Professional: Hire an expert consultant to conduct a full audit and evaluation.
  11. Implement Changes: Once your findings are in, make the necessary modifications to your content and SEO strategy.
  12. Track Progress: Assess the effect of your modifications on website performance.
  13. Remain Aware: Keep informed on SEO trends and any upcoming Google updates in order to stay one step ahead.
  14. Refine Strategies: As needed, adjust your approach based on ongoing analyses and emerging best practices.
  15. Target Core Web Vitals: Focus on improving loading speed, interactivity and visual stability.
  16. Increase Content Authority: Make sure your information comes from reliable sources and is backed by expert opinion.
  17. Diversify Backlinks: Build a varied mix of quality backlinks to create a healthy backlink profile and increase authority.
  18. Produce Content that Satisfies User Intent: Create content that precisely fulfills user search intentions.
  19. Implement Data-Driven Insights: Use analytics to make informed decisions about content and strategy.
  20. Establish a Feedback Loop: Gather user input regularly in order to continuously enhance performance.
  21. Uphold SEO Hygiene: Upgrade plugins and resolve technical errors regularly to maintain SEO hygiene.
  22. Adjust Advertising Strategies: Coordinate ads with SEO efforts for consistent messaging.
  23. Utilize Social Media: Take advantage of social platforms to drive traffic and engage your target audience.
  24. Examine Long-Term Impact: Monitor changes over time in order to ascertain their long-term ramifications.

How To Increase Website Speed for User Engagement
Mon, 11 Nov 2024
https://www.bruceclay.com/blog/how-to-increase-website-user-engagement/

Looking to increase website user engagement and keep visitors on your pages longer? Here are some tips and tools to help.

Website analytics and metrics displayed on laptop.

Increasing organic traffic levels is the number one priority of any SEO program. But merely getting people to your site is not enough.

Attracting and keeping visitors’ attention after they click through to your website is essential to ongoing conversions and search engine success.

Keep in mind that Google always looks at your site’s experience as part of its ranking algorithm.

Read on as I discuss a few ways you can boost website user engagement, including some of the necessary tools.

Prioritizing Speed and Performance

Research conducted by Google indicates that 53% of mobile site visitors will abandon a page if it takes longer than three seconds to load.

Google data on load times and abandonment

Website performance and speed are critical for user engagement. A website that loads at a snail’s pace can cause user frustration and lead visitors to click off before engaging with your content.

Optimize the Experience

Optimizing your website for maximum speed and functionality is fundamental for a satisfying user experience.

Google outlines ways to do this, including looking at key areas such as:

  1. Core web vitals: Are your pages performing well in terms of loading speed, interactivity, and visual stability?
  2. Security: Is your site served over HTTPS?
  3. Mobile usability: Does your content display well on mobile devices?
  4. Ad placement: Does your content avoid excessive or intrusive ads that distract users?
  5. No intrusive interstitials: Do your pages avoid pop-ups that block access to content?
  6. Content layout: Can visitors easily distinguish between the main content and other elements on the page?

To assess your page experience, use resources such as:

  • Core web vitals: Understand how these impact your page ranking.
  • Search Console’s HTTPS report: Monitor your site’s security status.
  • Chrome Lighthouse: This tool identifies page experience improvements, including mobile usability and other performance metrics.

Learn more by reading our page experience series of articles.
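
If you also want field data from real visitors, Google’s open source web-vitals package can report these metrics straight from the browser. The sketch below is a minimal example assuming the web-vitals npm package (v3 or later API); in practice you would send the values to your analytics endpoint rather than the console.

```typescript
// Minimal sketch: collect Core Web Vitals from real visitors using Google's
// open source web-vitals package (v3+ API assumed) and log or beacon them.
import { onCLS, onINP, onLCP, type Metric } from "web-vitals";

function report(metric: Metric): void {
  // Swap console.log for a beacon to your analytics endpoint in production.
  console.log(`${metric.name}: ${metric.value} (${metric.rating})`);
}

onLCP(report); // Largest Contentful Paint: loading speed
onINP(report); // Interaction to Next Paint: interactivity
onCLS(report); // Cumulative Layout Shift: visual stability
```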

Upgrade Your Hosting

Your servers and web hosting provider can impact how your website performs, especially during traffic surges.

Shared hosting, while cost-effective, can lead to slower load times as resources are shared with multiple sites.

Consider upgrading to cloud hosting or dedicated hosting solutions for enhanced speed, performance and security.

These options offer better resource allocation, ensuring a smoother user experience even with increased traffic.

Mastering Content Creation for Engagement

To increase time on page, your content should resonate with your audience and offer practical value.

Not every website visitor will convert into a paying customer, but creating premium content increases the probability of repeat visits and helps to affirm that your website is a trustworthy resource.

But how can you develop content that truly captivates your audience?

Quality

High-quality content is crucial for engagement. A visitor who finds your content valuable is more inclined to return, strengthening engagement over time.

Top-notch content encompasses elements like E-E-A-T (experience, expertise, authoritativeness and trust), ethical journalism and professional-level writing.

Google wants us to focus on people-first, helpful content rather than search engine-first content.

Today, AI can assist in content creation to make things more efficient, but it should be used carefully with human editing to ensure value and uniqueness.

There are many AI content tools out there — I recommend trying our own PreWriter.ai, which automates the time-consuming aspects of writing to move SEO projects forward faster.

Learn more: Crafting High-Quality SEO Content: A Comprehensive Guide

Engaging Videos

Research from Wistia indicates that people spend 1.4 times longer on pages with video content than those without such media.

Wistia graph comparing average time spent on page with and without video.
Image from Wistia.com

Simply put, video content is a highly immersive media channel. Videos can distill complex ideas, making it easier for viewers to understand topics.

This makes videos a great tool for educational content or product demonstrations, boosting audience engagement.

But don’t stop at just creating videos; make sure they are optimized so they have a better chance of showing up for a search.

Learn more: A CMO’s Guide to YouTube SEO in Less Than 5 Minutes

Accessibility

Web design accessibility ensures that everyone can use your website, including those who are living with disabilities.

Incorporating accessible design principles, like providing alt text for images, helps to boost your site’s inclusivity.

The Web Content Accessibility Guidelines (WCAG), developed by the W3C, act as the universal benchmark for web content accessibility geared toward individuals with disabilities, including those who have vision, cognitive, hearing and motor impairments.

WCAG compliance levels run the gamut from minimal to optimal. Start by meeting the basic requirements and use your available resources to build upon them.
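
As a quick starting point for the image side of that work, a small script can surface images that lack an alt attribute entirely. This is a rough spot-check, not a substitute for a full WCAG review, and the selector and reporting below are assumptions for illustration.

```typescript
// Minimal sketch: list <img> elements that have no alt attribute at all.
// (Decorative images should still carry an empty alt="" per WCAG guidance.)
const missingAlt = Array.from(
  document.querySelectorAll<HTMLImageElement>("img:not([alt])")
);

missingAlt.forEach((img) =>
  console.warn("Image missing alt attribute:", img.currentSrc || img.src)
);
console.log(`${missingAlt.length} image(s) need an alt attribute.`);
```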

User Feedback

Deploying a feedback loop is important for refining your website and satisfying customer demands.

Collecting users’ input provides insight into what’s working and which areas could use improvement, ultimately enhancing the user experience overall.

You can use several methods to collect feedback, including surveys, polls and contact forms.

This invaluable information allows you to make data-backed changes and build a website that’s easier to use and more engaging for all users.

Go to Ask Your Target Market (AYTM) for key insights.

Final Thoughts

Getting a firm grasp on user engagement and making efforts to improve it is integral to any website’s digital success.

Constantly assess user habits, amass feedback and tweak your tactics to address your customers’ evolving needs and expectations.

And be sure to make use of helpful tools along the way to amplify your effectiveness.

Need better website user engagement? Our SEO experts can help!

FAQ: How can I effectively improve my website speed to increase engagement and keep users on the site longer?

There are multiple strategies you can employ to speed up your website and enhance user engagement, keeping visitors browsing longer.

Image optimization is one of the key elements to ensure websites load fast. Large images without proper optimization can substantially slow page load times — reduce file sizes by compressing them and converting to modern formats like WebP without compromising quality.
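
For example, a simple build step can handle that conversion in bulk. The sketch below assumes the sharp npm package and a local folder of JPEGs; the paths and quality setting are placeholders to adapt to your own pipeline.

```typescript
// Minimal sketch: batch-convert JPEGs to WebP with the "sharp" npm package.
// The folder path and quality value are placeholders for your own pipeline.
import sharp from "sharp";
import { readdir } from "node:fs/promises";
import path from "node:path";

async function convertToWebP(dir: string): Promise<void> {
  const files = await readdir(dir);
  for (const file of files.filter((f) => /\.jpe?g$/i.test(f))) {
    const input = path.join(dir, file);
    const output = input.replace(/\.jpe?g$/i, ".webp");
    await sharp(input).webp({ quality: 80 }).toFile(output);
    console.log(`Converted ${file} -> ${path.basename(output)}`);
  }
}

convertToWebP("./images").catch(console.error);
```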

Utilize browser caching effectively; it enables frequently visited resources to be stored locally on users’ devices instead of being downloaded again upon subsequent visits. Setting appropriate cache headers will further expedite site speed.
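
What those headers look like depends on your stack. As one illustration, here is a minimal Express sketch that serves fingerprinted static assets with a long-lived, immutable Cache-Control header while keeping HTML revalidated; the paths and durations are assumptions, not universal values.

```typescript
// Minimal sketch: long-lived caching for fingerprinted static assets with
// Express, while HTML stays revalidated. Paths and durations are assumptions.
import express from "express";

const app = express();

// Versioned assets (e.g., app.3f9c2a.js) can be cached hard for a year.
app.use(
  "/assets",
  express.static("public/assets", { maxAge: "365d", immutable: true })
);

// HTML should be revalidated so visitors always get the latest markup.
app.get("/", (_req, res) => {
  res.set("Cache-Control", "no-cache");
  res.send("<h1>Hello</h1>");
});

app.listen(3000);
```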

Minifying CSS, JavaScript and HTML code is another useful strategy to reduce file sizes and speed up loading times. Tools like UglifyJS and CSSNano make this even simpler by automating the process and keeping your code clean.
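
As a small illustration of the JavaScript side, the sketch below runs UglifyJS over a bundled file. The file names are placeholders, and most modern bundlers can perform this step automatically as part of the build.

```typescript
// Minimal sketch: minify a bundled JavaScript file with UglifyJS (mentioned
// above). File names are placeholders; bundlers can automate this same step.
import { minify } from "uglify-js";
import { readFileSync, writeFileSync } from "node:fs";

const source = readFileSync("dist/app.js", "utf8");
const result = minify(source);

if (result.error) {
  throw result.error;
}
writeFileSync("dist/app.min.js", result.code);
console.log(`Minified ${source.length} -> ${result.code.length} bytes`);
```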

It’s important to select the right hosting provider. A top-quality hosting service with resources tailored specifically for your website’s requirements can significantly speed it up. Look for dedicated servers or content distribution networks (CDNs) that distribute content globally to reduce load times regardless of where your users are located.

Implementing lazy loading is an advanced strategy designed to delay loading non-critical resources until they’re needed. Deferring loading until the resources are within view of users decreases initial load times while improving performance.
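
In most cases, native browser lazy loading gets you there with a single attribute. The sketch below applies it via script to images marked with a hypothetical data-lazy attribute; in plain HTML you would simply add loading="lazy" to below-the-fold images and leave above-the-fold images eager.

```typescript
// Minimal sketch: opt below-the-fold images into native lazy loading.
// "data-lazy" is a hypothetical marker attribute; in plain HTML you would
// simply write loading="lazy" on the <img> tag. Keep hero images eager.
document
  .querySelectorAll<HTMLImageElement>("img[data-lazy]")
  .forEach((img) => {
    img.loading = "lazy"; // browser defers the fetch until the image nears view
    img.decoding = "async"; // don't block rendering on image decode
  });
```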

You’ll also want to focus on reducing server response times. Optimize your server configuration, implement HTTP/2 and upgrade resources as appropriate to handle more concurrent users efficiently.

Regularly monitor your site performance using tools like Google PageSpeed Insights and GTmetrix. Look for bottlenecks and areas for improvement.

Implement these strategies to increase your website speed, keep visitors on your pages longer and increase the chances of conversion.

Step-by-Step Procedure

  1. Assess your current website speed. Tools like GTmetrix and Google PageSpeed Insights can help.
  2. Reduce file sizes and convert images into current formats like WebP.
  3. Implement browser caching by setting appropriate cache headers.
  4. Minify CSS, JavaScript and HTML files using automated tools.
  5. Choose a high-quality hosting provider with robust resources.
  6. Consider using a CDN to distribute content globally.
  7. Implement lazy loading for images and videos.
  8. Optimize server configuration for faster response times.
  9. Upgrade your server resources if necessary.
  10. Regularly monitor performance and adjust strategies based on data.
  11. Reduce the number of HTTP requests by combining files.
  12. Use asynchronous loading for JavaScript files.
  13. Eliminate unnecessary plugins and scripts.
  14. Optimize font loading by using font-display: swap.
  15. Implement server-side rendering if using JavaScript frameworks.
  16. Use HTTP/2 for improved loading times.
  17. Serve your site over HTTPS; it is also a prerequisite for HTTP/2.
  18. Test your site across different devices and browsers.
  19. Analyze competitor sites for best practices.
  20. Continuously update and refine your optimization strategies.
  21. Educate your team on web optimization techniques.
  22. Keep up on the latest in technology and trends.
  23. Gather user feedback to identify performance issues.
  24. Set performance goals and track progress over time.

How To Submit a Reconsideration Request To Google
Thu, 20 Jun 2024
https://www.bruceclay.com/blog/how-to-submit-reconsideration-request-google/

Learn the steps to effectively handle a Google reconsideration request and secure your website’s standing in search results.

Man kneeling on a mountain, outstretching arms up in the air.

Your site suddenly loses rankings and traffic or has some weird security alert attached to it. You may need to address some urgent issues on your website and ask Google for forgiveness. This is done through a reconsideration request.

In this article:

  • What Is a Google Reconsideration Request?
  • Why You Might Need to Request Reconsideration
  • How Do You Know If You Need to Request a Review from Google?
  • How to Request Reconsideration from Google
  • More Reconsideration Request Tips
  • How Long Does a Google Reconsideration Request Take?
  • What Happens After a Google Reconsideration Request

What Is a Google Reconsideration Request?

A reconsideration request is Google’s formal process for telling Google how you fixed certain problems on your website. Per Google, a reconsideration request is “a request to have Google review your site after you fix problems identified in a manual action or security issues notification.”

Why You Might Need to Request Reconsideration

There are two main situations when you would request reconsideration from Google:

  1. When Google has issued a manual action, aka “penalty,” against your website because it has violated Google guidelines.
  2. When security issues are detected on your website.

In the case of a manual action, your site can rank lower or be omitted from the search results altogether.

When you have a security issue, affected webpages or websites can appear with a warning label in the search results, or Google will give users a warning page when trying to access the site.

Either way, this is bad for your organic traffic and should be addressed immediately.

Are there any other scenarios where you might need to request a review? Yes, in rare cases.

For instance, say you’ve bought an expired domain that can’t rank because it was previously banned by Google for spamming. You, the new site owner, can request reconsideration.

How Do You Know If You Need to Request a Review from Google?

You will know if there are major detected issues with your website by keeping up-to-date with two reports in Google Search Console:

  1. The security issues report.
  2. The manual actions report.
Security and manual actions reports, Google Search Console

If you don’t understand why you have received a manual action, Google recommends going to the Google Search Central Help Community to try to get answers.

How to Request Reconsideration from Google

Depending on whether you are facing a manual action or security issue, you will need to take a few steps before you request reconsideration from Google.

Manual Actions

If you have a manual action against your website, Google outlines six steps to take in its manual actions report help page (linked earlier) before requesting a review:

  1. Expand the manual action description panel on the report for more information.
  2. See which pages are affected
  3. See the type and short description of the issue, and follow the “Learn more” link to see detailed information and steps to fix the issue. (You can find the detailed information for each action below on this page).
  4. Fix the issue on all affected pages. Fixing the issue on just some pages will not earn you a partial return to search results. If you have multiple manual actions on your site, read about and fix all of them.
  5. Be sure that Google can reach your pages; affected pages should not require a login, be behind a paywall, or be blocked by robots.txt or a noindex directive. You can test accessibility by using the URL Inspection tool (a quick scripted spot-check is also sketched after this list).
  6. When all issues listed in the report are fixed in all pages, select Request Review in this report. In your reconsideration request, describe your fixes. A good request does three things:
    • Explains the exact quality issue on your site.
    • Describes the steps you’ve taken to fix the issue.
    • Documents the outcome of your efforts.
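
For step 5, a small script can complement the URL Inspection tool by spot-checking that a fixed page returns a normal status code and carries no noindex signals. This is a rough sketch assuming Node 18+ for its built-in fetch; the URL is a placeholder and the check is a heuristic, not a full robots.txt audit.

```typescript
// Minimal sketch for Node 18+ (global fetch available): spot-check that a
// fixed page is reachable and carries no noindex signals. The URL is a
// placeholder and the meta-tag regex is only a rough heuristic.
async function checkPage(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "follow" });
  const xRobots = res.headers.get("x-robots-tag") ?? "(none)";
  const html = await res.text();
  const metaNoindex = /<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html);

  console.log(url);
  console.log(`  HTTP status: ${res.status}`); // expect 200, not 401/403/5xx
  console.log(`  X-Robots-Tag: ${xRobots}`); // should not contain "noindex"
  console.log(`  Meta robots noindex: ${metaNoindex}`); // should be false
}

checkPage("https://www.example.com/fixed-page").catch(console.error);
```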

Security Issues

If you have encountered security issues on your website, here are the seven steps to take before you submit a review request (from Google’s security issues help page linked earlier):

  1. Expand the issue description on the Security Issues report.
  2. Read the description of the issue and follow its “Learn more” link for detailed information and steps to fix the issue. (The learn more links point to the descriptions below on this page.)
  3. Use the sample of affected pages provided in the details section to troubleshoot and fix your issue. This list is not necessarily complete, but just a sample of pages on your site affected by this issue. You might have a security issue with no example URLs; this does not mean that no pages are affected, only that we could not generate samples for some reason.
  4. Fix the issue throughout your site. Fixing the issue on just some pages will not earn you a partial return to search results.
  5. If the report lists multiple security issues affecting your site, fix all of them.
  6. Test your fixes.
  7. When all issues listed in the report are fixed in all pages, select Request Review in the Security Issues report. In your reconsideration request, describe your fixes. A good request does three things:
    • Explains the exact quality issue on your site.
    • Describes the steps you’ve taken to fix the issue.
    • Documents the outcome of your efforts.

Requesting a Review

Once you have completed all the necessary steps, you can request a review within Search Console. Google outlines what makes a good reconsideration request:

A good request does three things:

  1. Explains the exact quality issue on your site.
  2. Describes the steps you’ve taken to fix the issue.
  3. Documents the outcome of your efforts.

Google notes that you might even link to the webmaster help forum discussion about your website for more context.

More Reconsideration Request Tips

If you have a manual action against your site due to a partnership with another site, and the partner is doing something that goes against Google’s guidelines, it can reflect poorly on your site. (One example is a paid guest posting partnership.)

Document this in a complete reconsideration request. For instance, if links were purchased in a joint campaign effort, show your effort to clean it up and disconnect ties to the site and paid links.

Always be honest. In a video on reconsideration requests, Google’s John Mueller says, “I recommend not trying to hide anything about your website. Everyone makes mistakes at some point, that’s fine.”

And, while details are important to include, be as clear and as concise as possible: don’t overload the request with unnecessary details.

How Long Does a Google Reconsideration Request Take?

A reconsideration request can take days, weeks or even longer in some cases. Remember, they are reviewed by Google staff manually. In 2020, Google’s Mueller said in a since-deleted tweet that it can sometimes take months for a review:

There’s no defined time for a reconsideration request to be processed; sometimes it’s a week or so, sometimes it’s a few months. It’s not dependent on whether you bought the domain recently (and we see a lot of abuse attempts with expired domains, so it doesn’t mean it’s easier).

🍌 John 🍌 (@JohnMu) October 11, 2020

Google explains:

“Most reconsideration reviews can take several days or weeks, although in some cases, such as link-related reconsideration requests, it may take longer than usual to review your request. You will be informed by email when we receive your request, so you’ll know it is active. You will also receive an email when the review is complete. Please don’t resubmit your request before you get a decision on any outstanding requests.”

Then, be patient. Following up before you get a decision, says Google, can “cause longer turnaround time for the next request, or even get you marked as a repeat offender” (if related to security issues).

What Happens After a Google Reconsideration Request

First, you will receive confirmation that Google has received the request. Then, you will receive confirmation that the request has been processed. This happens in the Manual Action viewer in Search Console.

If the request has gone through, you will be notified that Google has revoked the manual action, for example. On the other hand, Google may inform you that the site still violates its guidelines.

It can take some time to see the results after a reconsideration review. Thankfully, Google has improved the transparency of the reconsideration process, reducing the guesswork of anxious webmasters awaiting reconsideration.

For more information on reconsideration requests, check out all the reconsideration request videos at Google Search Central on YouTube.

Penalized by Google and unsure how to resolve the issues? Our SEO experts can help get your website, online visibility and traffic back on track.

FAQ: How can I effectively handle a Google Reconsideration Request to resolve website issues?

A penalty from Google can significantly affect your online visibility, traffic and website performance. If Google imposes a penalty on your site, you’ll have to fix the issues and then submit a reconsideration request to have them review your site. Let’s discuss how to manage this process to restore your website standing and ensure you’re in compliance with Google’s guidelines.

Understanding Google Penalties
Google issues penalties to websites when they violate its guidelines. These issues generally fall into categories such as algorithmic impacts, manual actions and security issues. Common triggers include unnatural links, keyword stuffing and cloaked content. You’ll have to identify the reason behind the penalty so you can address it and then submit a reconsideration request.

The Reconsideration Request Process
If you are penalized by Google, you’ll need to conduct a thorough website audit to determine the violations. Once you have found and fixed the issues, you’ll have to submit detailed documentation demonstrating how you have rectified the problem, along with a sincere appeal to Google.

Find the Root Causes
You’ll need to do a deep dive into your website’s content, backlinks and SEO practices to find the root causes of the Google penalty.

Corrective Actions
Fixing Google penalties involves steps like removing harmful content and links and updating your SEO tactics. Any actions you take have to comply with Google’s rules and guidelines.

Communicating with Transparency
Clearly outline the steps you’ve taken to fix the issues and provide concrete examples of how the issues have been resolved. Being transparent with your corrective actions tells Google you are committed to maintaining a quality website.

Monitoring Post-Submission
Check the status of your reconsideration request after you’ve submitted it. Be ready to respond in case you get additional queries from Google.

Get Expert Help
In some cases, you may need professional expertise to handle a Google reconsideration request, especially if the site issues are complex. SEO experts can answer your questions, give you advice on how to fix site issues and can guide you throughout the reconsideration process.

Step-by-Step Procedure

  1. Confirm the type of penalty your site has received from Google. Is it algorithmic, a manual action, or a security issue?
  2. Audit your website thoroughly to find all guideline violations.
  3. Correct all issues, including removing or disavowing unnatural links and updating content to fix keyword stuffing.
  4. Document all the actions and changes you have made. Include important information like dates.
  5. Write a clear and concise reconsideration request. Include documented evidence like screenshots or links to removed content, and explain to Google how future violations will be prevented.
  6. When complete, submit the reconsideration request through Google Search Console.
  7. Track the status of your request by monitoring your email and Google Search Console for any communication from Google.
  8. Respond promptly if Google asks for additional information.
  9. Once the Google penalty is lifted, monitor your site’s compliance frequently.
  10. Adjust your SEO strategy to ensure compliance and prevent future penalties.
  11. Regularly update your site’s content and backlinks profile.
  12. Use Google Search Console to receive alerts on potential website problems.
  13. Always stay current on Google guidelines and algorithm updates through ongoing education and SEO training.

Follow these steps to overcome Google penalties and restore your website’s performance in the search results.

The post How To Submit a Reconsideration Request To Google appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/how-to-submit-reconsideration-request-google/feed/ 5
Why Does Google Change My Webpage Title in the Search Results? (And What To Do About It) https://www.bruceclay.com/blog/why-does-google-change-webpage-title-search-results/ https://www.bruceclay.com/blog/why-does-google-change-webpage-title-search-results/#comments Mon, 17 Jun 2024 19:24:17 +0000 https://www.bruceclay.com/?p=224555 Learn why Google sometimes changes webpage titles, how it affects SEO and your site's visibility. Discover effective strategies for adapting titles to improve click-through rates.

The post Why Does Google Change My Webpage Title in the Search Results? (And What To Do About It) appeared first on Bruce Clay, Inc..

]]>
Google homepage displayed on laptop.

A webpage’s title tag plays a part in Google’s search engine ranking algorithm. It also helps get people to click through to your webpage from the search results. Because of that, I always recommend spending time getting it right.

(If you need a refresher on title tags, check out my article on meta tags, including research on why these tags still matter.)

But what happens when you put in your best effort, and Google changes the title in the search results anyway?

It can be frustrating. But it has nothing to do with your webpage’s ability to rank, which I’ll touch on later in this article.

In this article, I’ll cover why Google rewrites titles, what it means for SEO and how you can influence the titles Google displays.

Fact: Google Changes Titles in the Search Results

We’ve known for some time that Google autogenerates snippets for the search results, including the title and description.

Title and description rendering in the search results for BruceClay.com home page.

But as time has passed, it seems Google is taking more liberties, and this was a topic of discussion in 2021.

Today, Google’s help file on influencing your title links in search results says:

“Google’s generation of title links on the Google Search results page is completely automated and takes into account both the content of a page and references to it that appear on the web. The goal of the title link is to best represent and describe each result.

Google Search uses the following sources to automatically determine title links:

  • Content in <title> elements
  • Main visual title shown on the page
  • Heading elements, such as <h1> elements
  • Other content that’s large and prominent through the use of style treatments
  • Other text contained in the page
  • Anchor text on the page
  • Text within links that point to the page
  • Website structured data

… While we can’t manually change title links for individual sites, we’re always working to make them as relevant as possible. You can help improve the quality of the title link that’s displayed for your page by following the best practices.”

Google Rewrites Titles 61% of the Time

Research from Zyppy shows that Google rewrites titles in the search results 61% of the time.

The research highlights common scenarios in which Google rewrote titles:

  • Overly long titles and short titles
  • Using the same keyword more than once
  • Use of title separators, such as dashes “-” or pipes “|”
  • Titles with [brackets] or (parentheses)
  • Identical “boilerplate” used across many titles
  • Missing or superfluous brand names

Much of Zyppy’s research aligns with the best practices documented by Google on titles regarding length, keyword stuffing, branding and boilerplate content.

SEOClarity also researched Google title rewrites and found that titles in position one in the search results were rewritten less often than titles in other positions.

What This Means for SEO

It’s important to remember that Google is not rewriting your title tag. The title tag is the HTML code on the webpage that helps Googlebot understand what the page is about.

That stays the same even if Google rewrites the title for the search results page. This has nothing to do with indexing or ranking. It is simply a rendering issue.

In other words, the title tag is still the information that Google uses for ranking. So, you still need to follow all the best practices for title tags. That is important.

“While we can’t manually change title links for individual sites, we’re always working to make them as relevant as possible.”

-Google Search Central

Are you doing something wrong if Google selects a different title? The answer is maybe.

Often, though, Google changes titles because people don’t search the way they used to. Your webpage may be the best answer for a query, but the title tag may not reflect that.

In these cases, Google will try to figure out whether a different title better represents your webpage and better matches the query.

So long as you are following best practices, then no, you’re not doing anything wrong if Google rewrites your title. That is just Google being Google.

But what about the clickthrough rate? We know that what the title says in the search results can make or break a click.

Unfortunately, there have been some reports of click-throughs decreasing with Google rewrites – with one report showing a 37% drop.

How to Prevent Google from Rewriting Titles, or at Least Help with Relevance

Let’s look at some of the elements that Google says it uses to craft titles, so you can review your page content and ensure it is cohesive.

I’ll use a fictional website that sells honey as an example throughout this section.

Content in <title> Elements

You don’t want to obsess over your title tags, but you do want to put your best foot forward. That includes following the best practices outlined by Google and other best practices, such as those outlined in my article on meta tags.

Main Visual Title or Headline on a Page

Heading Elements, Such as <h1> Elements

A honey e-commerce site might have an H1 tag that reads: “Explore Our Wide Range of Natural Honeys.” Heading tags often summarize or introduce key content themes, much like traditional headlines, so they’re valuable signals for generating descriptive and accurate titles.

(Note that the main visual title and the H1 are one and the same.)

Research from Zyppy (linked earlier) found that “using H1 tags strategically could limit the amount of title rewriting Google might perform on your site.” And “matching your H1 to your title typically dropped the degree of rewriting across the board, often dramatically.”

So make sure you put effort into crafting those H1s on your webpages. See our article, How to Craft a Headline That Gets More Clicks.
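
If you want a quick way to spot-check title/H1 alignment (and title length) on your own pages, here is a minimal sketch. It assumes the third-party requests and BeautifulSoup libraries, and the URL at the bottom is a placeholder; this is an illustration, not a tool referenced in this article.

# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def check_title_alignment(url: str, max_length: int = 60) -> None:
    """Compare a page's <title> to its first <h1> and flag overly long titles."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    h1_tag = soup.find("h1")
    h1 = h1_tag.get_text(strip=True) if h1_tag else ""

    print(f"Title ({len(title)} chars): {title}")
    print(f"H1: {h1}")
    if len(title) > max_length:
        print("Title is long and may be truncated or rewritten in the search results.")
    if title and h1 and title.lower() != h1.lower():
        print("Title and H1 differ; consider aligning them to reduce rewrites.")

# Placeholder URL -- swap in one of your own pages.
check_title_alignment("https://www.example.com/")

Run it against a handful of key pages; any page where the title and H1 tell different stories is a candidate for a rewrite by Google.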

Other Content That’s Prominent Through the Use of Style Treatments

If you have a featured section of content styled prominently, it can indicate key information about the overall page context or focus areas. This helps refine auto-generated titles if headings and title tags provide limited insight.

For example, a site that sells honey may have a featured section prominently displaying the words: “Award-Winning Honey Selection.”

Other Text Contained on the Page

A website selling honey may have an opening paragraph on the page highlighting the company’s commitment to sustainable beekeeping practices.

Although not explicitly tagged as a title or heading, a section of text such as this reflects core content aspects that Google might find helpful for understanding user queries related to sustainability in honey production.

Anchor Text on the Page

In cases where other elements on the page don’t provide clear context, anchor links within content could offer additional clues, especially when they relate to topics of user interest.

For instance, your honey e-commerce site might have a link: “Learn more about our eco-friendly packaging.”

Text Within Links That Point to the Page

A website that sells honey may have reviews across the web that point to a page with anchor text that reads, for example, “best organic honey.”

Since external links vouch for a site’s content and descriptive anchor text signals relevance, these links can help Google identify which aspects of a page are best represented in a title link.

Website Structured Data

Google may use page-specific Schema.org structured data to generate a title.
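
To make that concrete, here is a minimal, illustrative Python sketch that prints a JSON-LD Product block for the fictional honey shop used in this section. The product details are invented for the example; in practice you would generate this markup from your own catalog data (or let your CMS or an SEO plugin produce it) and validate it with Google’s Rich Results Test.

import json

# Hypothetical product data for the fictional honey shop in this example.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Raw Wildflower Honey",
    "description": "Small-batch raw honey from sustainable apiaries.",
    "brand": {"@type": "Brand", "name": "Example Honey Co."},
}

# Wrap the data in the script tag used for JSON-LD structured data.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(product, indent=2)
    + "\n</script>"
)
print(snippet)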

Final Thoughts on What To Do

The fact that Google is rewriting the titles of the search results doesn’t change the recommendation to craft your title tags carefully.

Google uses them for ranking, and there’s always the chance (about a 39% chance if you follow the Zyppy research) that nothing will change in the search results.

So what if you don’t like the way Google is changing your titles – is there any recourse? In its help file on title links, Google suggests you can discuss the issue and get feedback from its help community, but that’s about it:

“If you’re seeing your pages appear in the search results with modified title links, check whether your page has one of the issues that Google adjusts for. If not, consider whether the title link in search results is a better fit for the query. To discuss your pages’ title links and get feedback about your pages from other site owners, join our Google Search Central Help Community.”

But evidence suggests that you can modify your H1 tag to reflect the title you want, and Google may pull from that. Once you do that, you can request a recrawl of the page to get it updated quickly.

Struggling to implement best practices for title tags and other aspects of your SEO program? Our SEO experts can help.

FAQ: How do I ensure my webpage’s title remains effective even if Google changes it?

Webpage titles are very important for SEO and user engagement. Great titles summarize the content of a webpage and entice readers to click through and read it.

Google will sometimes alter webpage titles to better reflect the query or the content on the page. It doesn’t happen to every page, but it underscores the need to write titles that hold up even if Google adjusts them. That requires a solid understanding of SEO practices and Google’s evolving algorithms.

Let’s look at the strategies you can follow to keep your webpage title effective.

Aligning Titles with Content
Webpage titles must accurately represent what the page content is about. Accurate titles align with searcher intent and prevent discrepancies that could prompt Google to change them. Write your titles using primary keywords and make sure that the webpage content delivers on the promises made by the title.

Keyword Optimization
Relevant and targeted keywords in your title signal to Google the subject matter of your webpage. Including too many keywords is considered keyword stuffing and can potentially trigger Google to change the title to better match search intent. Use keywords naturally in titles so they provide clear value to readers.

Adapting to Google’s Guidelines
Google frequently updates its guidelines and algorithms, so you need to stay current on how titles are evaluated and potentially adjusted. User experience is very important to Google; understanding why it may change webpage titles helps you adapt your strategies so that they remain effective.

Crafting Compelling Titles
Separate from SEO, webpage titles need to be compelling enough for users to click on them. Write them so that they evoke curiosity or offer solutions. Even if Google edits it, a title with a strong, relevant message ensures that the main theme will attract users.

Consistency Across Meta Tags
To solidify the theme of your webpage, keep the messaging across the title tag, meta description and the content itself consistent. Consistency reinforces the theme, narrative or information presented, maintaining the integrity of your title even if it is changed by Google.

Monitoring Performance
Frequently monitor your webpage’s performance with tools like Google Search Console or our own SEOToolSet®. Analytics tools can show you how often your titles are being changed, giving you insights into how you can further refine and optimize your titles.
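
As an example of that monitoring step, here is a minimal sketch that pulls clicks, impressions, CTR and average position for your top pages from the Search Console API. Everything here is an assumption for illustration: it presumes you have enabled the API and created a service-account key with access to your property, the file name, site URL and dates are placeholders, and you should verify the resource and method names against Google’s current API documentation.

# pip install google-api-python-client google-auth
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and property URL.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Pull top pages with clicks, impressions, CTR and average position.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-05-01",
        "endDate": "2024-05-31",
        "dimensions": ["page"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    page = row["keys"][0]
    print(f"{page}: {row['clicks']} clicks, CTR {row['ctr']:.1%}, position {row['position']:.1f}")

A sudden drop in CTR for a page whose rankings are stable is one practical signal that the displayed title may have changed and is worth a manual check in the search results.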

Engaging with the SEO Community
Experience is the best teacher. Engage with others in the SEO community to learn about their experiences with title changes. This knowledge can help you develop your own strategy to minimize the impact of Google changing your webpage titles.

While you have no control over Google altering your webpage titles, you can implement these tips into your strategy to keep them effective and relevant for your users.

Step-by-Step Procedure

  1. Find the most relevant and searched terms related to your content through keyword research.
  2. Pick a primary keyword and include it in your webpage title naturally.
  3. Ensure the content on your page directly reflects the title.
  4. Audit your content regularly and update as needed to keep it relevant when search trends and algorithms evolve.
  5. Track webpage title performance and how often they are being changed with tools like Google Search Console or SEOToolSet.
  6. Stay current on the latest Google webpage title management best practices.
  7. A/B test different webpage titles to see which ones perform better in the search results.
  8. Collect user feedback to learn how they perceive and react to your titles. Adjust titles as necessary based on this feedback.
  9. Make sure all meta tags related to the webpage support the same theme as the title.
  10. Review top-performing pages and analyze their titles.
  11. Connect with the SEO community and participate in SEO forums to gain insights into how you should write your titles.
  12. Keep webpage titles under 60 characters so that they fully display in the search results.
  13. Write webpage titles clearly and concisely to improve user understanding and engagement.
  14. When there are major changes to webpage content, be sure to update the title to accurately reflect the new content.
  15. Stay ahead of the curve through continuous SEO training and education.

The post Why Does Google Change My Webpage Title in the Search Results? (And What To Do About It) appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/why-does-google-change-webpage-title-search-results/feed/ 3
Google’s Explosive March Updates: What I Think https://www.bruceclay.com/blog/googles-explosive-march-updates/ https://www.bruceclay.com/blog/googles-explosive-march-updates/#comments Thu, 28 Mar 2024 17:37:33 +0000 https://www.bruceclay.com/?p=216622 Google's flurry of March updates created volatility for SEO. After an initial analysis, here are my thoughts on what's going on.

The post Google’s Explosive March Updates: What I Think appeared first on Bruce Clay, Inc..

]]>
Woman working on laptop, holding a cup of coffee.
On March 5, Google launched its first core algorithm update of 2024, in addition to unveiling updated spam policies and a spam update. Plus, its helpful content system was folded into its core ranking systems.

Google told Search Engine Land it would reduce unhelpful content in the search results by up to 40%.

In addition to sites that were negatively impacted by the core update, Google unleashed a slew of manual actions on sites that violated its most current spam policies.

Some sites were completely de-indexed. Many of these sites were likely driven by AI content.

After an initial analysis of the updates and Google’s documentation, I am going to share what I think is going on with a large part of the latest updates.

The Focus Is Largely on AI Content

Are you using AI content for your website? Now is a good time to evaluate your methods (if Google hasn’t already done that for you in its latest core update).

If you sailed through this last update, does that mean you are immune? Probably not. Google’s algorithms continue to evolve, and Google’s main goal is to take out the trash.

With the rise in popularity of AI content in recent times, Google knew it had a looming problem on its hands.

I discussed at Search Engine Land a while back that AI content could create a world where the quality of an answer in the search results would be average at best — only as good as the AI would allow.

If you consider that many AI content tools are connected to the web, the AI is reading its own generated content to come up with ideas for more generated content.

Google has faced quality problems like this before. Over the years, Google has combated many different tactics that degrade its search results.

This is just another example of something that has become “spam” – AI content. But there are still ways to do it right.

What Google Is Now Saying About AI Content

So what did Google have to say about AI content in its recent announcements?

Here, Google talks about how it is addressing the “abusive behavior” of using AI content:

To better address these techniques, we’re strengthening our policy to focus on this abusive behavior — producing content at scale to boost search ranking — whether automation, humans or a combination are involved.

There’s a lot to unpack in that one statement. Let’s break it down.

Producing Content to Rank: Is It Spam?

Google says it is targeting content at a scale that’s intended to boost search rankings.

But isn’t all content created for SEO intended to rank?

Google has an opinion, but doesn’t go into too much detail:

There are some things you could do that are specifically meant to help search engines better discover and understand your content. Collectively, this is called “search engine optimization” or SEO, for short. Google’s own SEO guide covers best practices to consider. SEO can be a helpful activity when it is applied to people-first content, rather than search engine-first content.

It sounds like what Google is saying is that SEO is an afterthought.

Google seems to downplay SEO so that people follow its guidelines for creating helpful content first, then tack on SEO to make it perform better.

Sounds logical.

Except we all know that SEO isn’t just an add-on: we start with keyword research and trending topics before creating new content so that the content has the best chance of success.

So, again, the question is: Aren’t we already producing content to boost search rankings?

Here’s what I think: Automated mass production of content with no regard for adding value is the problem.

Human-Edited AI Content: Is It Spam?

Google says it’s targeting scaled content whether it is produced entirely by automation, written by humans or created with a combination of the two.

So AI content can be bad even when there is a human touch to it. That means if you are using AI tools, tread carefully.

AI tools are not inherently bad, but abusing them is.

Here, Google talks more about the idea of “scaled content abuse”:

Examples of scaled content abuse include, but are not limited to:

  • Using generative AI tools or other similar tools to generate many pages without adding value for users
  • Scraping feeds, search results, or other content to generate many pages (including through automated transformations like synonymizing, translating, or other obfuscation techniques), where little value is provided to users
  • Stitching or combining content from different web pages without adding value
  • Creating multiple sites with the intent of hiding the scaled nature of the content
  • Creating many pages where the content makes little or no sense to a reader but contains search keywords

The first bullet is key: You can use AI tools all you want, and you can edit AI-generated content all you want, but if you’re not adding something unique – an expert perspective, personal experience, etc., then your content could be a fair target for Google enforcing its spam policies.

The third bullet elaborates: Don’t use AI tools that merely stitch together the information in the search results into a new article without adding some extra value.

It is fine to take keywords, make a list, make a unique outline, and from there devise your own content … just do not plagiarize or create generic content.

Even before the use of AI, content creators would look at the top-ranked pages as research for what they write.

However, even that has potentially caused an existential crisis for Google. Others have written about the downward spiral of the quality of the search results, and Google has been ramping up efforts to surface better content.

In a nutshell, you need to have something original, regardless of how you create the content.

The main point here is to differentiate your content. What is happening with AI content is generic content that doesn’t add anything new to the conversation.

So, What Is Content Spam Now?

Google’s job is to weed out the garbage no matter how the content is created. So it will lean heavily on its algorithms to identify what is quality content.

Google has taken many approaches to this in the past, and it will continue to evolve. In its latest iteration, Google has clarified what spam content is, and had this to say recently about AI content and new spam policies:

Our long-standing spam policy has been that use of automation, including generative AI, is spam if the primary purpose is manipulating ranking in Search results. The updated policy is in the same spirit of our previous policy and based on the same principle. It’s been expanded to account for more sophisticated scaled content creation methods where it isn’t always clear whether low quality content was created purely through automation. [Emphasis added.]

What methods might Google use to further determine quality?

Many things. Perhaps high bounce rates, poor sentiment in reviews, low site trust, a high degree of similarity to other documents, no inbound mentions and lack of website maintenance.

Google also had this to say:

This will allow us to take action on more types of content with little to no value created at scale, like pages that pretend to have answers to popular searches but fail to deliver helpful content.

SEL spoke to Google, and a Google rep clarified:

What are examples of pages that pretend to have answers but fail to deliver? Tucker [a Google rep] explained that those are the pages that start off by stating they will answer your question, lead you on with low-quality content, and never end up giving you the answer to your question:

  • “Our long-standing spam policy has been that use of automation, including generative AI, is spam if the primary purpose is manipulating ranking in Search results. The updated policy is in the same spirit of our previous policy and based on the same principle. It’s been expanded to account for more sophisticated scaled content creation methods where it isn’t always clear whether low quality content was created purely through automation.”
  • “Our new policy is meant to help people focus more clearly on the idea that producing content at scale is abusive if done for the purpose of manipulating search rankings and that this applies whether automation or humans are involved.”

Surviving Google’s Algorithms

We are in the midst of another shift in SEO, where we continue to define quality.

There is no room for flying under the AI radar, or it’s a certain crash. It’s vitally important that organizations stay white hat and stay current on SEO requirements to remain visible and relevant within search results.

Assess and Pivot

For the near future, protection is job No. 1. Navigating changes in search algorithms requires businesses to periodically reevaluate their SEO tactics.

We recommend comprehensive SEO audits to surface any current or potential threats.

Adjusting to algorithm updates, adapting to rules as they change, and abandoning manipulative tactics even if they once worked are all key parts of maintaining high rankings in search engines.

Adopt Best Practices

Adopting best practices is indispensable to top rankings. Just because something is a common SEO practice in your industry doesn’t mean it is good.

Yes, that’s a bit obvious, but if everyone followed best practices, there would not be so much chaos over these updates.

The zone of acceptance is like a water balloon – shifting shape as the rules change. Adhering to core principles can make you almost immune to algorithm updates.

Create Quality Content

Quality content remains key in building a lasting online presence.

By producing user-centric content that offers something unique and of value, you can build trust, authority and credibility within your niche, not only helping with algorithm changes but also strengthening your overall online presence.

Spend time thinking about and creating helpful, people-first content.

Relevant topics are key for maintaining audience engagement and producing high-quality content that resonates with audiences.

By understanding your target audience’s needs and conducting extensive research using SEO tools, you can quickly locate trending or pertinent subjects that resonate with them.

But just be sure you are adding something unique to the conversation, and it’s not a “copy/paste/reword” approach.

See my AI content beginner’s guide if you want to continue leveraging AI’s benefits in your content creation.

Track Progress

Taking a set-it-and-forget-it approach to publishing content will harm you in the end.

As Google releases core updates, you need to understand how your site is faring after the dust settles.

You must use analytics tools to see what is working. Monitor progress, identify issues and pivot as needed.

Have you been impacted by Google’s March updates? Our SEO experts can help you get your SEO program back on track. Reach out to us today.

FAQ: How can businesses’ search rankings survive Google’s explosive March core and spam updates?

Google’s constant algorithm updates can wreak havoc on a business’s search rankings. The good news is you can implement certain strategies to survive Google’s explosive March updates and be prepared to handle any future update. Let’s go over them:

Understand Google’s updates: Familiarize yourself with the specific changes introduced in the March core and spam updates. Always stay current on the latest algorithm updates and adapt your SEO strategy accordingly.

Enhance website content: Craft high-quality, engaging content that caters to your target audience’s needs. Identify keywords that are relevant to your audience and naturally incorporate them. Make sure your content provides value and addresses pain points for your users.

Focus on site speed: User experience is one of the biggest ranking factors in search engines, and site speed is a critical component of that. Google prioritizes fast-loading pages, so focus on improving your website speed by compressing images, reducing code and utilizing caching techniques.
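
As a rough, illustrative starting point, here is a minimal Python sketch that spot-checks server response time and HTML size for a few pages. It assumes the third-party requests library and placeholder URLs; for real diagnostics you would use tools like Google’s PageSpeed Insights or Lighthouse, which measure full page rendering rather than just the initial response.

# pip install requests
import requests

# Placeholder URLs; replace with pages you want to spot-check.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in URLS:
    response = requests.get(url, timeout=30)
    # `elapsed` measures the time from sending the request until the response
    # headers arrive, a rough proxy for server response time.
    seconds = response.elapsed.total_seconds()
    size_kb = len(response.content) / 1024
    flag = "  <-- investigate" if seconds > 1.0 else ""
    print(f"{url}: {seconds:.2f}s to respond, {size_kb:.0f} KB of HTML{flag}")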

Improve mobile experience: Google has moved to mobile-first indexing, meaning that the search engine will prioritize mobile versions of webpages for ranking. Optimizing your site to be mobile friendly improves user experience, ensures your pages load faster and provides a consistent experience across platforms. Improving in this area will help your website survive any future update.

Build authoritative backlinks: A solid backlink profile tells search engines that you are an expert in your field. You can build high-quality backlinks by reaching out to industry influencers or websites for guest blogging opportunities. When your content is great, other people will naturally start to link to it and help establish your brand as a trusted authority.

Optimize on-page elements: Optimizing the on-page elements of your site is a great way to survive Google updates. Why? On-page optimization keeps your site relevant, ensures your content is useful for your users, improves user experience and when done right, it improves your search engine ranking. On-page elements include things like content, images, videos, meta tags, title tags, headers and URLs. Optimizing these elements helps search engines understand what your content is about. The better they can understand, the better your chances of earning high ranking.

Leverage social media: Social media is a powerful marketing tool to engage with your audience and get the word out about your brand. It can attract a wider audience to your content and website, improve your online presence and positively influence search ranking through social signals (likes, shares, comments, etc.).

Monitor and analyze performance: Utilize web analytics tools to track your website’s performance and make data-driven decisions. Continuously monitor your keyword rankings, organic traffic and user behavior and look for ways to improve your strategy.

The rules of SEO change every day for everyone, but with these tips you can futureproof your website to survive any future algorithm update.

Step-by-Step Procedure

  1. Stay up to date with Google’s algorithm updates, including the March core and spam updates.
  2. Analyze your website’s content and improve its quality, relevance and optimization for targeted keywords.
  3. Do thorough keyword research to find the keywords your target audience is searching for. Incorporate the keywords into your content throughout your site.
  4. Always evaluate your content strategy. Optimize your content to match search intent and provide value for your users.
  5. Evaluate your website’s speed and optimize it by compressing images, reducing server response time and leveraging caching techniques.
  6. Optimize your website to be mobile-friendly and responsive. Provide a seamless experience across all devices to significantly improve user experience.
  7. Work to acquire high-quality backlinks and build a solid backlink profile to establish your brand as a trusted authority.
  8. Optimize on-page elements such as content, imagery, videos, meta tags, title tags, headers and URLs. Be sure to include your keywords in these elements.
  9. Connect with your audience on social media platforms. Utilize social media to promote your content and encourage your audience to like, share and comment.
  10. Monitor your website’s performance using web analytics tools, tracking keyword rankings, organic traffic and user behavior. Adjust your SEO strategy as necessary.
  11. Regularly audit your website to identify technical SEO issues. Promptly fix any broken links, duplicate content, or improper redirects you find.
  12. Always stay updated on the latest trends and best practices in SEO. SEO training is the best way to do this.
  13. Engage with others in your industry by participating in forums, guest blogging, or collaborating with influencers. This is a great way to increase your online presence and establish trust.
  14. Struggling to adapt to algorithm updates? Losing traffic and ranking after a core update? You may want to consider hiring a reputable SEO agency or consultant. Our SEO experts can work with you to get better results from your SEO — just let us know how we can help.

The post Google’s Explosive March Updates: What I Think appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/googles-explosive-march-updates/feed/ 3
What Are Google’s Top Ranking Factors? https://www.bruceclay.com/blog/what-are-googles-top-ranking-factors/ https://www.bruceclay.com/blog/what-are-googles-top-ranking-factors/#comments Wed, 22 Nov 2023 18:20:36 +0000 https://www.bruceclay.com/?p=204446 Discover the profound impact of content quality on website rankings. Learn expert strategies to optimize your content for enhanced search engine visibility and user engagement.

The post What Are Google’s Top Ranking Factors? appeared first on Bruce Clay, Inc..

]]>
Tablet displaying Google search engine.

Back in 2016, a Google engineer famously said that Google’s top three ranking factors were content, links and RankBrain. However, this was later disputed by more than one Googler.

Sure, those three factors are likely a huge part of how Google determines rankings, but that’s not an exhaustive list.

It is wrong to think that the algorithm is made up only of these three factors and that each of them carries one-third weight for each query. That’s because the factors in Google’s algorithm change from query to query.

That said, I’ll spend the rest of this article shedding more light on what I believe are the top factors in how your website ranks.

Content

If I were to choose the No. 1 spot among the top ranking factors, it would be content. Content is the fabric of the web. Without content, Google’s search results simply wouldn’t exist.

Now, any website can have content, but quality content is how you rank. That’s because the content you create is important to the efficacy of Google’s search results.

Google wants its users to have a good experience. To do that, the websites featured in its search results must offer good answers to their users’ queries.

To ensure that it features the best content in its search results, Google has created a number of quality systems and guidelines.

Technical Factors

If your site is not crawlable and/or doesn’t perform well, it will likely not do well in the search results.

One of the most important things you can do is make sure your site is optimized from the ground up so that search engines can access, crawl and understand it with ease. And it must provide visitors with a good user experience.

Case in point: Google’s Gary Illyes once said:

“I really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made-up terms by the rank trackers and that they talked more with the developers of the website once done with the first part of this sentence.”

In Google’s Search Quality Evaluator Guidelines, it states that unmaintained sites are low quality:

… unmaintained/abandoned “old” websites or unmaintained and inaccurate/misleading content is a reason for a low Page Quality rating.

In 2021, Google rolled out its page experience update, outlining various technical factors that should be followed to ensure a good user experience. If you want to be rewarded with better ranking, optimize your site to address these signals.

Links

If content is the fabric of the web, links are the strings that tie it together. Ever since PageRank, links have been a significant way search engines determine rankings. Why? Links have always served as a “vote” from one website for another.

Even though Google says links have less impact now than they used to, they are still important.

However, not all links are created equal. Google doesn’t give every link to your site an equal vote. In fact, some links can even result in a negative impact on your website’s ability to rank.

Today, links are no longer a numbers game. Even though Backlinko research shows that the No. 1 result in Google has an average of 3.8 times more backlinks than positions No. 2 to 10, we have seen sites with fewer but higher quality links outrank sites with more.

To get links right, you want to focus on link earning rather than link building, and on the quality and relevance of links versus the quantity. Link earning starts with creating excellent content and a trusted site so that it naturally earns the links it deserves.

In 2020, Google’s John Mueller said links were “definitely not the most important SEO factor.” What does that mean? Google wants people to focus on making a great site first and not worry too much about building tons of links.

RankBrain

RankBrain is probably one of the most misunderstood ranking factors. RankBrain is a machine learning component of Google’s algorithm. It uses multiple data points at the time of a search to help the search engine better interpret the intent of search queries and serve the most relevant search results.

With RankBrain, Google can go beyond factors like the quality of the content or the links to a webpage to help identify the very best answer to a search.

So to survive RankBrain, you have a lot of work to do to ensure that you are creating the type of content that satisfies the query / your keywords. If you don’t, you risk RankBrain deciding that your content is not relevant to a search, and you can lose rankings.

There are countless factors that go into ranking a webpage, video, or image. And those factors change based on the search query. Understanding the top factors, though, is an important first step in knowing how search works.

Unlock your website’s potential with our expert SEO strategies tailored to Google’s dynamic ranking factors—let’s boost your visibility and climb the search results together. Talk to us

FAQ: How can I increase the ranking of my website with Google using its top ranking factors?

Today’s digital world necessitates an effective online presence for any business to remain viable, and Google’s top ranking factors play a pivotal role in your website’s SERP ranking. So how can these elements boost website performance, and how can businesses use them to their benefit? Let’s dig into the primary factors and how to act on them.

At first, it’s essential to recognize that Google uses a sophisticated algorithm to rank websites. Although this algorithm incorporates numerous factors, here are a few primary ones which you should focus on:

  1. Content Quality: Producing high-quality, engaging and relevant content is of utmost importance; Google rewards websites that provide valuable information to their visitors. Invest time in producing engaging articles, blog posts, videos and other forms of media content.
  2. Backlinks: An authoritative backlink network can make all the difference for improving website rankings. Partner with prominent industry websites when possible and prioritize quality over quantity when it comes to backlinks.
  3. Mobile Responsiveness: With Google’s mobile-first indexing, a mobile-responsive website is essential. Google prioritizes websites that deliver seamless user experiences on all devices, so optimize both design and functionality for mobile visitors.
  4. Page Loading Speed: Slow websites can lead to poor user experiences and reduce Google ranking, so make sure yours loads quickly by optimizing images, minifying code and employing caching strategies.
  5. User Experience: Google prioritizes websites that focus on user experience. Focusing on elements like easy navigation, intuitive design and clear calls-to-action will not only increase rankings but will also foster engagement and conversions.

Moving forward, here is how to enhance your website’s ranking:

  1. Leverage Keywords Precisely: Conduct keyword research to identify relevant search terms, then incorporate these naturally into the content, meta tags and headings on your website.
  2. Optimize meta tags: Create attractive titles and descriptions that accurately capture your webpage content while encouraging viewers to visit it.
  3. Take Advantage of Header Tags: Header tags help structure and simplify your content for users and search engines alike, making it more readable for both audiences.
  4. Include alt text with images: Alt text can help search engines understand your images more readily, making your website both accessible and search-engine-friendly.
  5. Regularly update content: Stagnant websites tend not to rank well; to stay ahead, ensure yours remains fresh by regularly publishing articles, blog posts or product updates.
  6. Leverage Social Media: Utilize social media channels such as Twitter and Facebook to increase exposure, drive visitors directly to your website, and potentially secure backlinks from outside sources.
  7. Monitor and Assess Website Performance: Tools like Google Analytics can help you monitor website performance, user activity and traffic sources, providing valuable data that reveals areas needing improvement.
  8. Engage Your Audience: Responding to user comments, queries and feedback is vital in building relationships and maintaining your website’s visibility and reputation. Engaging and building relationships will pay dividends!
  9. Increase Website Security: Search engines prefer secure websites; use SSL encryption and regularly apply software updates to protect your site against malware attacks.
  10. Optimize for local search: If your business has a physical presence, pay special attention to local SEO factors like your Google Business Profile listing, customer reviews and local keywords.
  11. Generate an XML Sitemap: An XML sitemap helps search engines discover and index your website more effectively (see the sketch after this list).
  12. Stay Abreast of Your Competitors: Keep tabs on what strategies, keywords and tactics your competitors employ in order to increase the ranking of their website.
  13. Create an effective internal linking structure: Internal links enable search engines to navigate your website more effectively while also helping establish its hierarchy, dispersing authority more evenly among pages and improving user navigation.
  14. Strive for a positive user experience: Provide visitors with an engaging experience by prioritizing fast-loading pages, intuitive interfaces and quick access to pertinent information.
  15. Stay current: With search engine algorithms ever evolving, staying informed on the latest trends, updates and best practices is key for maintaining and improving the ranking of your website.
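
To illustrate the XML sitemap point above, here is a minimal sketch that uses only Python’s standard library. The URLs are placeholders; in practice a sitemap is usually generated automatically by your CMS or an SEO plugin, and you would submit the resulting file in Google Search Console.

from xml.etree import ElementTree as ET

# Placeholder URL list; in practice you'd pull these from your CMS or a crawl.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in URLS:
    url_element = ET.SubElement(urlset, "url")
    ET.SubElement(url_element, "loc").text = page

# Write the sitemap with an XML declaration, then submit it in Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("Wrote sitemap.xml")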

Utilizing Google’s top ranking factors is the cornerstone of improving website rankings. Apply these expert strategies, deliver valuable content with exceptional user experiences and watch as your rankings soar higher and higher on search engine result pages.

The post What Are Google’s Top Ranking Factors? appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/what-are-googles-top-ranking-factors/feed/ 8
An Up-to-Date History of Google Algorithm Updates https://www.bruceclay.com/blog/google-algorithm-updates/ https://www.bruceclay.com/blog/google-algorithm-updates/#comments Thu, 09 Nov 2023 18:00:42 +0000 https://www.bruceclay.com/?p=72831 The only constant is change — especially when it comes to Google algorithm updates. We've collected all the most notable updates from Google since 2000. With this history, we provide guidance on what to do once an update hits.

The post An Up-to-Date History of Google Algorithm Updates appeared first on Bruce Clay, Inc..

]]>
Algorithm update: A change in the search engine’s ranking formulas that may or may not cause noticeable seismic shifts in which webpages appear at the top of search results, but which is meant to improve the quality of results overall.

Seismograph volatility readings.

An SEO Perspective on Algo Fluctuations

The only thing that’s constant in search engine optimization is change. In one year (2020), for example, Google reported running 600,000+ experiments that resulted in more than 4,500 improvements to Search. That’s a lot of volatility, folks.

Here is our running list of the notable confirmed and major unconfirmed algorithm updates of all time. Below the list, we also explain how to watch for algorithm updates and what to do if you think your site has been impacted.

The following are the major updates, in our view, of all time — the ones that have shaped the face of search and SEO.

If you want to jump to a particular year, be my guest:
2023 | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014 | 2013 | 2012 | 2011 | 2010 | 2009 | Pre-2009

Google Algorithm Updates by Year

2023 ALGORITHM UPDATES

November 2023 – Reviews Update

Google said this reviews update, released on November 8, would be the final one that it would announce. Going forward, the reviews system will be updated on a regular basis.

November 2023 – Core Update

In a rare development, Google announced its second major core update in as many months on November 2. The rollout was completed on November 28. Google said this update was meant to improve on the October 2023 core update.

October 2023 – Core Update

Google completed the third core update of the year on October 19.

October 2023 – Spam Update

The spam update began October 4 and ended October 19. This update addressed the increase in spam being reported by users when searching in their native languages.

September 2023 – Helpful Content System Update

The helpful content system update finished rollout on September 28. Google updated its guidance on AI-generated content, shifting its attitude to be consistent with generating helpful content. Google also revised guidelines on hosting third-party content. It now recommends blocking this content from being indexed if it is unrelated to the website’s main purpose. Learn more about the helpful content system here.

August 2023 – Core Update

The second core update of 2023 began on August 22 and completed September 7. See how it compared to previous updates.

May 2023 – Topic Authority System

Google announced its topic authority system as a way to “better surface relevant, expert, and knowledgeable content” in news queries. This system is not new — Google has been using it for several years, but they are discussing it now to bring more transparency into how ranking works.

April 2023 – Reviews Update

Google updated the name of this update from “product reviews” to just “reviews.” It now evaluates reviews of services, businesses, media — any topic that is reviewable. Google revised the language in its guidance documentation to apply to reviews of all kinds. This update was completed April 25. Learn more here.

March 2023 – Core Update

Google rolled out this update on March 15. It was the first of 2023 and finished rolling out March 28. This update caused more volatility than previous ones.

February 2023 – Product Reviews Update

Google’s product reviews update promotes review content that is of higher quality than the typical standard review information found online. The goal of this update is to “provide users with content that provides insightful analysis and original research, content written by experts or enthusiasts who know the topic well,” according to Google. This update added several new languages including English, Spanish, German, French and Italian among others. Google completed the rollout on March 7.

2022 ALGORITHM UPDATES

December 2022 – Link Spam Update

Hot on the heels of the helpful content system update, Google launched a link spam update on December 14. Google is targeting spammy links with SpamBrain, its AI-based spam-prevention system. Learn more about this update here.

December 2022 – Helpful Content System Update

This is the first update to Google’s helpful content system since its launch in August 2022. It began rolling out December 6, adding new signals and updating it worldwide to include all languages.

October 2022 – Spam Update

On October 19, Google quietly rolled out a spam update to improve detection of search spam. The update finished on October 21. Read more about Google’s spam update here.

September 2022 – Product Reviews Update

Google completed its second product reviews update of 2022 on September 26. Like the first update in July, this update rewards review-based content that is helpful and informative to shoppers.

September 2022 – Core Update

Google began rolling out its second core update of the year on September 12. It was completed two weeks later on September 26. This update appeared to have less of an impact than previous core updates.

August 2022 – Helpful Content Update

Google announced the helpful content update on August 18. This new signal rewards content that Google believes is helpful and informative, rather than content that is purely meant to rank well in search engines. The update rolled out August 25 and was completed on September 9.

Check out our video What to know about the Helpful Content Update to get Bruce’s take on the new signal.

July 2022 – Product Reviews Update

The Product Reviews Update was announced on July 27. While it was expected to take several weeks to roll out, it only took six days, fully rolling out on August 2. This update promotes quality review content to help shoppers learn more about products before purchasing. See Google’s help document on how to write high quality product reviews.

May 2022 – Core Update

This algo update acknowledged by Google caused some noticeable ranking fluctuations, more than some other recent core updates. It rolled out starting May 25 and finished June 9, 2022.

February 2022 – Page Experience Update for Desktop

Core Web Vitals and the page experience ranking factor applied to desktop as well as mobile as of this update, which finished rolling out on March 3, 2022.

2021 ALGORITHM UPDATES

June 2021 – Page Experience Update

Rolled out between mid-June and early September 2021, this update had less impact on rankings than Google’s core updates. Yet despite the fact that the update was somewhat downplayed in the SEO community, Google’s John Mueller confirmed it was more than simply a “tie-breaker” when ranking webpages. He also stated that when webmasters downplay the update, it may also mean that they downplay the impact that the ranking factors have on users.

What it did was create a new page experience ranking factor combining at least six signals related to how a webpage performs for mobile users. (Note that though it initially applied to mobile ranking only, Google rolled out the page experience update for desktop as well in February 2022.)

Core Web Vitals are three new performance metrics introduced with this update. Websites that meet certain performance thresholds can gain some competitive advantage and also improve their site’s user experience.

While improving page experience factors is often technical back-end work, top sites continue to improve their scores. So especially if you’re in a competitive industry, I highly recommend you check your site and get started.

For more details, see our comprehensive e-book, Google’s Page Experience Update: A Complete Guide.

June 2021 – Core Update

In June 2021, Google released a core update and announced another one would be coming the following month in July.

Google told Search Engine Land that the reason the rollout was broken into two phases was that not all of the planned improvements for the June 2021 update were ready. So Google decided to release the parts that were ready and push out the rest the following month.

Many in the industry felt that this was a big update, according to a roundup of data published at Search Engine Land. Subsequently, many felt like the link spam update in July–August 2021 did not have as big of an impact, which is why we aren’t listing it separately here.

Was the June core update related to “your money or your life” webpages? Some thought so.

Google released a blog post that coincided with the June core update, stating that: “core updates are designed to increase the overall relevancy of our search results. In terms of traffic we send, it’s largely a net exchange. Some content might do less well, but other content gains.”

February 2021 – Passage Ranking

On February 11, Google announced it had launched passage ranking for U.S. queries in English.

Passage ranking helps Google Search better choose and rank the most relevant webpages with passages (like blocks of text) that answer very specific queries.

Whereas before, the search engine may have ranked articles that give general information on the query; now, Google can find and rank articles that answer the query the best, even if it’s only within a block of text on the webpage.

Google said this about passage ranking:

Very specific searches can be the hardest to get right, since sometimes the single sentence that answers your question might be buried deep in a web page. We’ve recently made a breakthrough in ranking and are now able to better understand the relevancy of specific passages. By understanding passages in addition to the relevancy of the overall page, we can find that needle-in-a-haystack information you’re looking for.

Google stated that when it is fully rolled out globally, it will impact 7% of search queries.

2020 ALGORITHM UPDATES

December 2020 – Core Update

In early December 2020, Google released a new broad core update. Many industry commentators stated it was a big core update — one of the largest yet — with many sites seeing extreme traffic gains and losses.

As with previous broad core updates, there was no specific ranking factor targeted; rather, broad core updates are an update to how sites are evaluated.

While many webmasters were anticipating this update as a way to recover from losses from the May 2020 update, many were also concerned about the timing of this update, as it occurred during the holiday period.

You can check out an analysis of this core update at Search Engine Land: “Google’s December 2020 core update was big, even bigger than May 2020, say data providers.”

November 2020 – Subtopics Ranking

Some websites may have experienced ranking changes on or around mid-November, and the subtopics ranking change may have been the reason.

Google did not announce this algorithm update (but did discuss it back in October). Google’s Danny Sullivan later confirmed in 2021 that the ranking change went live in November 2020.

It’s worth noting that subtopics ranking is a new feature of the algorithm, as opposed to an update of existing processes.

Google discussed subtopics ranking in October 2020, saying the following:

We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad. As an example, if you search for “home exercise equipment,” we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page. We’ll start rolling this out by the end of this year.

In other words, the subtopics ranking feature is designed to help Google understand how subtopics relate to a query.

As another example, if someone were to search for “SEO,” Google can now understand relevant subtopics such as agencies, conferences, tools, and Google algorithm updates. With this information, it can then show wider-ranging content in the search engine results pages.

For more, see the Search Engine Land article: “Google launched subtopics ranking in mid-November.”

May 2020 Core Update

Google announced a broad core update via Twitter. This update took approximately two weeks to fully roll out.

Many felt that the May update was significant, even for a core update, with many sites seeing significant losses or gains in traffic. Many algorithm-tracking tools registered extreme volatility.

A core update, according to Google, is a broad algorithm update that does not target specific types of queries or pages. Instead, the update is about improving how the search engine assesses content overall to make results more relevant. We are told to compare a core update to refreshing a list of top 100 movies that you made a few years back. Naturally, new items would appear in your list today, and other titles on your list would shift up or down.

Consequently, Google says that “pages that drop after a core update don’t have anything wrong to fix.” And some pages that were “previously under-rewarded” will rank higher.

Moz gives an analysis of the winners and losers of this core update here.

January 2020 – Featured Snippets Update

Google confirmed that results shown in featured snippets would no longer be repeated on the first page of search results. Previously, a featured snippet could be found in “Position 0” as well as one of the top organic listings on the page.

2019 ALGORITHM UPDATES

October 2019 – BERT

Google announced BERT — its deep learning algorithm also known internally at Google as DeepRank — on October 25, 2019. BERT impacts conversational search queries by helping Google to better understand context, including how words like “for” and “to” change the meaning of a search. Google later confirmed there was nothing specific to optimize for, and that BERT affects nearly every search performed.

September 2019 Core Update

Google announced this broad core algorithm update ahead of time. The industry weighed in after it rolled out, and many hypothesized it targeted link spam.

June 2019 Core Update & Site Diversity Update

Google pre-announced this update, which seemed to focus on correcting the way the algorithm evaluated links. It put more weight on the domain authority and trustworthiness of a site’s incoming links.

In my opinion, the “trustworthiness” component of the E-A-T factors rose in importance. Its reach also extended beyond SEO to include sentiment detection. Note that this places responsibility for a ranking loss on marketing in general … No wonder Google has said that there is nothing sites can do specifically to “fix” their rankings after a core update runs. Mess with user sentiment, link sentiment, and quality, and prepare to die.

The search engine simultaneously released a separate site diversity update. The stated goal was to make search results more diverse. For most queries, users no longer see a domain appear more than twice in the top-ranked results, making it harder for a site to “saturate” a topic organically.

Hazards: Sites with too low a percentage of backlinks from trusted websites may have dropped. Those that used to have many pages ranking for a single query also lost some SERP real estate.

Winners: Pages may have risen that were previously under-rewarded. (Aren’t we all?)

March 2019 Core Update

Google confirmed this update, which seemed to fine-tune broad core algorithm changes of the past. Data showed the majority of the websites that were impacted were also impacted by the March 2018 core update and the August 2018 “Medic” update. To prevent naming confusion, Google tweeted the update’s name the same day it was released.

However, the losers weren’t impacted as much as the winners. Research found that sites whose search traffic increased experienced higher rankings site-wide, with no increase in the number of keywords ranked.

On the flip side, the update hurt many sites that provide a poor user experience (due to excessive pop-ups, poor navigation, over-optimization, and so forth).

And, of course, trust was a significant signal. Sites dealing with YMYL (Your Money or Your Life) topics took ranking hits if they were participating in untrusted activities.

Hazards: Google’s March 2019 Core Update behaved like an evolution of previous algorithms, negatively affecting sites that were over-optimized.

Winners: Reputable sites ranking for queries related to particularly sensitive topics, like those about health questions, benefited. According to SearchMetrics, “websites with a strong brand profile and a broad topical focus” also advanced. See, trust matters.

2018 ALGORITHM UPDATES

September 2018 – Small Update

Google confirmed there was a small update made, and reiterated that it wasn’t anything major.

August 2018 Core Update – Medic

Google announced the release of this broad core algorithm update, coined by the industry as “Medic.” Google’s advice? Be more relevant for searches — and not just page by page, but as a whole site. More advice stated that small tweaks to content may not suffice and that holistic changes to the site may be required.

I believe the Medic update was a significant step focused on trust (part of Google’s “E-A-T” quality factors). In my opinion, trusted sites that were interlinked with untrusted sites were penalized.

Learn more:

July 2018 – Speed Update

Fast performance creates a better user experience for searchers clicking on results. After alerting website owners months in advance, Google announced its Speed Update on July 9, 2018. Previously, page load speed factored into only desktop search results. Since this update, slow performance on mobile devices can hurt a site’s mobile rankings.

April 2018 – Broad Core Update

Google confirmed this broad core update. As is the case with all broad core updates, Google indicates that there’s nothing specific to do. However, as indicated in a later confirmation of the March core update (below), it may have been around relevance.

March 2018 – Broad Core Update

Google confirmed this broad core update and reminded webmasters to continue building great content. Later, Google explained that most of its updates are around relevance, not how good or bad a site is.

2017 ALGORITHM UPDATES

December 2017 – Several Updates

Google reacted to the SEO community that dubbed fluctuations in December as “Maccabees.” The search engine said it wasn’t a single update but several minor improvements and that there weren’t any major changes as a result.

March 2017 – Fred

Google confirmed a string of updates with the caveat that it makes three updates per day on average. When asked if the updates had a name, Google jokingly said all updates would be called Fred, and the name stuck. Fred was wide-reaching, focusing on quality across a variety of factors rather than a single process.

One specific target was sites using aggressive monetization tactics that provided a bad user experience. Poor-quality links were also targeted by Fred. Link to an untrusted site, and it lowers your trust … and rankings.

Hazards: Sites with thin, affiliate-heavy, or ad-centered content were targeted.

Winners: Many websites featuring quality, high-value content with minimal ads benefited.

January 2017 – Interstitial Updates

Google pre-announced that it would make updates in January 2017 impacting sites whose pop-ups created a poor user experience. It confirmed the algorithm update on January 10, 2017.

2016 ALGORITHM UPDATES

September 2016 – Penguin Integrated into Core Algorithm

Google confirmed that its webspam algorithm, dubbed “Penguin,” was rolled into the core algorithm. Instead of requiring periodic manual refreshes, Penguin now works in real time. It also became more granular: rather than demoting a whole site for having bad inbound links, it devalues (ignores) links from known spam sites.

However, if a site’s backlink profile is bad enough, Google may still apply a manual action for unnatural links to your website. Google’s John Mueller has also said that an algorithmic penalty can still occur if a site has “a bunch of really bad links” (h/t Marie Haynes).

Friendlier Penguin has not proven to be 100% effective. As a result, many businesses still need help cleaning up their link profiles to restore lost rankings. Google has said that you should not need a disavow file, yet it still welcomes them. To me, that is a very clear signal that we should not rely on the algorithm alone when it comes to backlinks.

Hazards: Sites that had purchased links were targets, as well as those with spammy or irrelevant links, or incoming links with over-optimized anchor text.

Winners: Sites with mostly natural inbound links from relevant webpages got to rise in the SERPs.

See the detailed history of Penguin in the Penguin Algorithm Updates section.

September 1, 2016 – Possum

Possum is an unconfirmed yet widely documented Google algorithm update. This update targeted the local pack. Unlike confirmed updates, the details of the Possum update are a bit less clear. SEOs believe it sought to bring more variety into local SERPs and help increase the visibility of local companies.

With this update, Google seemed to change how it filtered duplicate listings. Before Possum, Google omitted results as duplicates if they shared the same phone number or website. With Possum, Google filtered listings that shared the same address. This created intense competition between neighboring or location-sharing businesses.

Hazards: Businesses with intense competition in their target location could be pushed out of the local results.

Winners: Businesses outside physical city limits had a chance to appear in local listings.

May 2016 – Mobile Update

Google announced ahead of time that it would increase the mobile-friendliness ranking signal in May. Google confirmed the rollout was completed on May 12.

Learn more: Mobile Friendly SEO Ranking Boost Gets Boosted in May

January 2016 – Panda Integrated into Core Algorithm

Google revealed that the Panda algorithm targeting quality was a part of the core algorithm. It was not clear when this happened exactly. Google also confirmed that even though it was part of the core algorithm, it did not operate in real-time.

This significant update is detailed below in the Panda Updates section.

Learn more: Google Explains What It Means To Be Part Of The “Core” Algorithm

2015 ALGORITHM UPDATES

October 2015 – RankBrain

Google revealed to Bloomberg that RankBrain — Google’s artificial intelligence system — was one of its top three ranking signals. Reportedly, 15 percent of queries per day have never been seen by Google before. Initially, RankBrain helped interpret those queries, but it may now be involved in every query. It’s also possible with RankBrain that searchers’ engagement with the search results is a factor in how it determines the relevancy of a result.

According to Google’s Gary Illyes on Reddit,

RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query. It is a really cool piece of engineering that saved our butts countless times whenever traditional algos were like, e.g. “oh look a “not” in the query string! let’s ignore the hell out of it!”, but it’s generally just relying on (sometimes) months old data about what happened on the results page itself, not on the landing page. Dwell time, CTR, … those are generally made up crap. Search is much more simple than people think.

OK, but if RankBrain changes the target, then it is always “right”: if you change the results to match a user-intent profile, then all future clicks will match that profile, because that is all searchers are shown. Are all searches for a particular keyword always informational? RankBrain may think so and push ecommerce sites out of the results. Fortunately, it is often correct.

Hazards: No specific losers, although sites with shallow content, poor UX, or unfocused subject matter won’t be found relevant.

Winners: Sites creating niche content and focusing on keyword intent have a better chance of ranking.

Learn more:

July 2015 – Panda Update 4.2

For more information on this update, please see the Panda section below.

May 2015 – Quality Update to Core Algorithm

Google confirmed (though not right away) a change to how its algorithm processed quality signals. Google stated that the update wasn’t intended to target any particular sites or class of sites, but was an update to the overall ranking algorithm itself.

April 2015 – Mobile-Friendly Update (“Mobilegeddon”)

Google announced ahead of time in February, and confirmed in April, that its mobile-friendly update was rolling out to boost the rankings of mobile-friendly pages.
The update established mobile-friendliness as a ranking signal and laid the foundation for the way Google’s mobile-first search mechanism works today. It was fun watching ecommerce sites try to fit 700 navigation links into a mobile menu. (Side note: There are better ways to handle mobile navigation.)

Hazards: Sites without a mobile-friendly version of the page, or with poor mobile usability, suffered.

Winners: Responsive sites and pages with an existing mobile-friendly version benefited.

2014 ALGORITHM UPDATES

October 2014 – Penguin Update 3.0

For more information on this update, please see the Penguin section below.

September 2014 – Panda Update 4.1

For more information on this update, please see the Panda section below.

July 2014 – “Pigeon” Local Search Algorithm Update

The update dubbed Pigeon shook up the local organic results in Google Web and Map searches.

Google told Search Engine Land that it had made an update to its local ranking signals to provide better results for users. Dubbed “Pigeon,” these updates improved distance and location ranking parameters (“near me”), and incorporated more of the ranking signals used in Google’s main web search algorithms. Google said it probably wouldn’t detail any more changes to the local search algorithm in the future.

Hazards: Local businesses with poor on- and off-page SEO suffered.

Winners: Local companies with accurate NAP information and other SEO factors in place gained rankings.

Learn more: How Do I Rank Higher in Google Local Search? Bruce Clay’s Checklist for Local SEOs

June 2014 – Payday Loan Algorithm Update (3.0)

Google confirmed that the third iteration of the “Payday Loan” algorithm that targets heavily spammed queries was rolling out.

May 2014 – Payday Loan Algorithm Update (2.0)

Google confirmed the second iteration of the “Payday Loan” algorithm, impacting about 0.2% of English queries.

May 2014 – Panda Update 4.0

For more information on this update, please see the Panda section below.

February 2014 – Page Layout Algorithm Refresh

Google announced a refresh of its page layout algorithm. Its impact was not given.

2013 ALGORITHM UPDATES

October 2013 – Penguin Update 2.1

For more information on this update, please see the Penguin Update section below.

August 2013 – Hummingbird

Hummingbird was announced in September (although it had been live since August). It was essentially a complete overhaul of Google’s core algorithm (not an add-on update) and marked the beginning of semantic search as we know it today.

Google needed a way to better understand the user intent behind a search query. Search terms that were similar but different, for example, often generated less-than-desirable results. Take the word “hammer” as an example. Is the searcher looking for the musician, the museum, or a tool to pound nails with?

Google’s Knowledge Graph was a first step. Released the year before Hummingbird, the Knowledge Graph mapped the relationships between different pieces of information about “entities.” It helped the search engine connect the dots and improve the logic of search results.

Hummingbird used semantic search to provide better results that matched the searcher’s intent. It helped Google understand conversational language, such as long-tail queries formed as questions. It impacted an estimated 90% of searches and introduced things like conversational language queries, voice search, and more.

Hazards: Pages with keyword stuffing or low-quality content couldn’t fool Google anymore.

Winners: Pages with natural-sounding, conversational writing and Q&A-style content benefited.

Learn more: Google Hummingbird & The Keyword: What You Need To Know

July 2013 – Panda Update – Recovery

For more information on this update, please see the Panda section below.

June 2013 – Payday Loan Algorithm Update

Google announced a new algorithm to address the quality of results for heavily spammed queries such as “payday loans,” “viagra” and pornography-related keywords. Sites impacted tended to be those involved in link schemes, webspam, and often illegal activities.

Learn more: What You Need to Know About the Google Payday Loan Algorithm Update

May 2013 – Penguin Update 2.0

For more information on this update, please see the Penguin section below.

March 2013 – Panda Update #25

For more information on this update, please see the Panda section below.

January 2013 – Panda Update #24

For more information on this update, please see the Panda section below.

2012 ALGORITHM UPDATES

December 2012 – Panda Update #23

For more information on this update, please see the Panda section below.

November 21, 2012 – Panda Update #22

For more information on this update, please see the Panda section below.

November 5, 2012 – Panda Update #21

For more information on this update, please see the Panda section below.

October 2012 – Penguin Update 1.2

For more information on this update, please see the Penguin section below.

October 2012 – Page Layout Algorithm Update #2

Google announced an update to its page layout algorithm and confirmed it impacted about 0.7% of English queries.

September 27, 2012 – Panda Update #20

See the Panda section below for more details.

September 18, 2012 – Panda Update 3.9.2

See the Panda section below for more details.

September 2012 – Exact-Match Domain Algorithm Update

Google announced an algorithm update dubbed “Exact-Match Domain” that aimed to reduce low-quality exact-match domains in the search results.

Learn more: The EMD Update: Like Panda & Penguin, Expect Further Refreshes To Come

August 2012 – Panda Update 3.9.1

See the Panda section below for more details.

July 2012 – Panda Update 3.9

See the Panda section below for more details.

June 25, 2012 – Panda Update 3.8

See the Panda section below for more details.

June 8, 2012 – Panda Update 3.7

See the Panda section below for more details.

May 2012 – Penguin Update 1.1

For more information on this update, see the Penguin Updates section below.

May 2012 – Knowledge Graph Release

In what Google described as a “critical first step towards building the next generation of search,” it began rolling out the Knowledge Graph. This is basically a knowledge base that maps search terms to real-world entities and the relationships between them.

April 27, 2012 – Panda Update 3.6

See the Panda section below for more details.

April 19, 2012 – Panda Update 3.5

See the Panda section below for more details.

April 2012 – Webspam Update (Penguin)

Google announced an algorithm designed to target sites that were directly violating its quality guidelines. With “Penguin,” link spam became the target of Google’s efforts. This significant update is detailed below in the Penguin Updates section.

March 2012 – Panda Update 3.4

See the Panda section below for more details.

February 27, 2012 – Venice Update

Google announced improvements to ranking for local search results. Dubbed “Venice,” this update to local search took into account the user’s physical location or IP address. This was a major change to how local search worked.

February 2012 – Panda Update 3.3

See the Panda section below for more details.

January 19, 2012 – Page Layout Algorithm Update

Google confirmed that it would be updating its page layout algorithm to penalize sites with overly aggressive “above-the-fold” ads.

January 2012 – Panda Update 3.2

See the Panda section below for more details.

2011 ALGORITHM UPDATES

November 2011 – Panda Update 3.1

See the Panda section below for more details.

November 2011 – Freshness Update

To give users the freshest, most recent search results, Google announced that it would be improving its ranking algorithm to prioritize freshness for certain queries. Google said it “noticeably impacts six to 10 percent of searches, depending on the language and domain you’re searching on.”

October 2011 – Panda Update 3.0

See the Panda section below for more details.

September 2011 – Panda Update 2.5

See the Panda section below for more details.

August 2011 – Panda Update 2.4

See the Panda section below for more details.

July 2011 – Panda Update 2.3

See the Panda section below for more details.

June 2011 – Panda Update 2.2

See the Panda section below for more details.

May 2011 – Panda Update 2.1

See the Panda section below for more details.

April 2011 – Panda Update 2.0

See the Panda section below for more details.

February 2011 – Panda Quality Update

Google announced on its official blog that a new update to reduce rankings for low-quality content had been introduced. Dubbed “Panda,” it took particular aim at content produced by so-called “content farms.”

The initial rollout impacted about 12% of English queries. (You’ll find detailed history in the Panda Algorithm Update section below.)

Hazards: Websites lost rankings if they had duplicate, plagiarized or thin content; user-generated spam; keyword stuffing.

Winners: Original, high-quality, high-relevance content often gained rankings.

January 2011 – Attribution Update

In an effort to reduce spam, Google updated its algorithm to better detect scrapers. Matt Cutts, Google’s head of webspam at the time, revealed the change on his personal blog, saying it was a “pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.”

2010 ALGORITHM UPDATES

June 2010 – Caffeine

Google completed a significant new web indexing system named Caffeine (originally announced in 2009). It enabled Google to speed up its search engine, as well as provide users with fresher content.

Learn more:

May 2010 – Mayday Update

Search Engine Land reported that at an industry event, Google had confirmed the so-called Mayday update. This update significantly reduced long-tail traffic for some sites.

Learn more:

2009 ALGORITHM UPDATES

December 2009 – Real-Time Search Launch

Google announced the release of real-time search, a “dynamic stream of real-time content” that allowed users to see the most recent and relevant tweets, news stories, and more.

Pre-2009 ALGORITHM UPDATES

December 2005 – “Big Daddy”

This infrastructure update worked to improve the quality of search results. It was visible in December and 100% live by March of 2006, as reported by Google’s then-head of webspam, Matt Cutts.

Learn more: Was Big Daddy Too Much for Google to Handle?

September 2005 – “Jagger” Updates Begin

In a series of three updates (September, October, and November), “Jagger” was meant to deal with the increasing amount of webspam in the search results.

Learn more:

November 2003 – Webspam Update (Florida)

The update called “Florida,” which targeted webspam, was the first major Google update to put the kibosh on tactics that had been used in previous years to manipulate rankings.

Learn more: What Happened to My Site on Google?

A Note on Algorithm Changes Pre-“Florida”

Between 2000 and 2003, PageRank would usually be updated monthly, and rankings would fluctuate. Webmasters would often post their findings on WebmasterWorld (before the days of confirmations or announcements from Google).

Learn more: A Brief History of SEO

Panda Algorithm Update

Panda was rolled out in February of 2011, aimed at placing a higher emphasis on quality content. The update reduced the amount of thin and inexpert material in the search results. The Panda filter took particular aim at content produced by so-called “content farms.”

With Panda, Google also introduced a quality classification for pages that became a ranking factor. This classification took its structure from human-generated quality ratings (as documented in its Search Quality Evaluator Guidelines).

Websites that dropped in the SERPs after each iteration of Panda were forced to improve their content in order to recover. Panda was rolled into Google’s core algorithm in January 2016.

History of Panda Updates

From 2011 to 2016, Panda had many data refreshes and updates before being rolled into the core algorithm.

  • Core Algorithm Integration – January 2016: Google revealed through SEM Post and later confirmed that Panda was integrated into the core algorithm.
  • Update 4.2 – July 2015: Google revealed to Search Engine Land that it was slowly rolling out this refresh. It affected 2 to 3 percent of English queries and gave a second chance to sites penalized by Panda in the previous refresh.
  • Update 4.1 – September 2014: Google announced that this refresh was meant to further “help Panda identify low-quality content more precisely.” The refresh impacted 3 to 5 percent of queries.
  • Update 4.0 – May 2014: Google announced a major update that impacted 7.5% of English queries.
  • “Recovery” – July 2013: Google confirmed to Search Engine Land that this refresh was “more finely targeted” than previous ones.
  • Update No. 25 – March 2013: Search Engine Land reported that Google’s Matt Cutts announced a Panda update for March 15, 2013, during the SMX West panel. Tests suggest it happened, though Google never confirmed.
  • Update No. 24 – January 2013: Google announced a refresh that would affect 1.2% of English queries.
  • Update No. 23 – December 2012: Google announced a refresh that would affect 1.3% of English queries.
  • Update No. 22 – November 21, 2012: Google confirmed this update to Search Engine Land and said that 0.8% of English queries were impacted.
  • Update No. 21 – November 5, 2012: Google confirmed to Search Engine Land that an update took place. This refresh affected 0.4 percent of queries worldwide and 1.1% of English queries in the U.S.
  • Panda 20 – September 27, 2012: Google confirmed to Search Engine Land this relatively major update (more than a data refresh) that took more than a week to roll out. The update impacted 2.4% of English queries. Panda 20 marked a change in the naming convention of the update.
  • Update 3.9.2 – September 18, 2012: Google announced a refresh that “noticeably” affected less than 0.7% of queries. Google also said to “expect some flux over the next few days.”
  • Update 3.9.1 – August 2012: Google confirmed a refresh that impacted about 1% of queries.
  • Update 3.9 – July 2012: Google announced a refresh that impacted about 1% of search results.
  • Update 3.8 – June 25, 2012: Google announced a refresh that “noticeably” affected about 1% of queries worldwide.
  • Update 3.7 – June 8, 2012: Google belatedly confirmed this refresh. Less than 1 percent of queries were noticeably impacted in the U.S. and 1% of queries were impacted worldwide. Ranking tools have suggested that this refresh was heavier hitting than others.
  • Update 3.6 – April 27, 2012: Google confirmed to Search Engine Land that it pushed out a refresh on this day, and said it affected very few sites.
  • Update 3.5 – April 19, 2012: Google confirmed that a refresh happened. Search Engine Land published a list of the “winners and losers.”
  • Update 3.4 – March 2012: Google announced a refresh that affected about 1.6% of queries.
  • Update 3.3 – February 2012: Google announced the refresh on its Inside Search blog, saying it would make Panda “more accurate and more sensitive to changes on the web.”
  • Update 3.2 – January 2012: Google confirmed to Search Engine Land that a data refresh had taken place.
  • Update 3.1 – November 2011: Google confirmed that a minor update went out and impacted less than 1% of searches.
  • Update 3.0 – October 2011: Google announced that people should “expect Panda-related flux in the next few weeks,” but that it would have less impact than previous updates at about 2 percent. The update included new signals in the Panda algorithm and a recalculation of how the algorithm affected websites.
  • Update 2.5 – September 2011: Google confirmed to WebProNews that a refresh happened, though declined to share details about the sites impacted by it.
  • Update 2.4 – August 2011: Google announced on its Webmaster Central blog that the Panda update had been rolled out internationally to English-speaking and non-English-speaking countries (except for Japan, Korea, and China). The update impacted 6 to 9 percent of queries in most languages.
  • Update 2.3 – July 2011: Google confirmed to Search Engine Land that it implemented a small data refresh.
  • Update 2.2 – June 2011: Google confirmed to Search Engine Land that a data refresh occurred.
  • Update 2.1 – May 2011: The industry first thought this was a much larger update and could be Panda 3.0, but Google clarified that it was only a small data refresh.
  • Update 2.0 – April 2011: Google announced the first core Panda update, which incorporated additional signals and rolled the algorithm out to all English-speaking Google users worldwide. Only about 2% of U.S. queries were affected.
  • Update 1.0 – February 2011: Google announced on its official blog that a new update to reduce rankings for low-quality sites had been introduced, impacting about 12% of English queries.

Learn more: Understanding Google Panda: Definitive Algo Guide for SEOs

Penguin Algorithm Update

The Penguin update worked to target link spam.

Before Penguin rolled out, Google’s algorithm paid close attention to the sheer volume of links pointing at a page. That made it possible for low-quality pages with a lot of incoming links to rank more prominently than they deserved.

Penguin helped with the mission to make valuable search results as visible as possible by penalizing low-quality content and link spam. Many sites cleaned up their links. But they could stay in Penguin jail for months, unable to regain their lost rankings until Google ran the next update.

Google made Penguin part of its real-time algorithm in September 2016, and a friendlier version emerged.

History of Penguin Updates

From 2012 to 2016, Penguin had several data refreshes and updates before rolling into the core algorithm.

  • Update 4.0 – September 2016: Google announced on its Webmaster Central blog that Penguin was now part of the core algorithm. This meant Penguin worked in real-time and was also more granular.
  • Update 3.0 – October 2014: Google confirmed to Search Engine Land that a data refresh had occurred. Google later said that the update impacted less than 1 percent of English queries.
  • Update 2.1 – October 2013: Google confirmed a data refresh happened. About 1 percent of searches were noticeably affected.
  • Update 2.0 – May 2013: Google’s Matt Cutts confirmed on “This Week in Google” that a significant update took place and impacted 2.3% of English queries.
  • Update 1.2 – October 2012: Google announced that a small refresh was happening. Only 0.3% of English queries would be affected.
  • Update 1.1 – May 2012: Google announced that the first Penguin data refresh had occurred. Less than 0.1% of English searches were impacted.
  • Update 1.0 – April 2012: Google announced on its Inside Search and Webmaster Central blogs that the Penguin update was launched and designed to catch spammers and those going against publisher guidelines. The update would impact about 3% of search queries.

How to Watch for Google Algorithm Changes

With the exception of recent broad core updates, Google rarely announces its algorithm updates. And when it does, it is usually only after others discover them.

With so many tweaks going on daily, it is possible that Google doesn’t know that some changes will be significant enough to mention.

Often the first indication you have is your own website. If your search traffic suddenly jumps or dives, chances are good that Google made an algo update that affected your search rankings.

Where can you go for information when your online world gets rocked? Here’s what I recommend …

Have a “seismograph” in place on your website.

To detect search traffic fluctuations on your own website, you need analytics software. If you haven’t already, install Google Analytics and Google Search Console on your website. They’re free, and they’re indispensable for SEO.
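
If you want to automate part of that seismograph, here is a minimal sketch in Python (my illustration, not an official Google tool). It assumes you have exported a Search Console performance report as a CSV with “Date” and “Clicks” columns; the file name, column names, and 30% threshold are placeholder assumptions you would adjust.

```python
# A rough traffic "seismograph": flag days whose clicks deviate sharply from the recent trend.
# Assumes a Search Console performance export with "Date" and "Clicks" columns (placeholder names).
import pandas as pd

def flag_traffic_anomalies(csv_path: str, threshold: float = 0.30) -> pd.DataFrame:
    df = pd.read_csv(csv_path, parse_dates=["Date"]).sort_values("Date")
    # Baseline = trailing 7-day average, shifted so a given day is not part of its own baseline
    df["baseline"] = df["Clicks"].rolling(7).mean().shift(1)
    df["change"] = (df["Clicks"] - df["baseline"]) / df["baseline"]
    # Keep only days that jumped or dived more than the threshold (default: 30%)
    return df[df["change"].abs() > threshold][["Date", "Clicks", "baseline", "change"]]

if __name__ == "__main__":
    anomalies = flag_traffic_anomalies("search_console_performance.csv")
    print(anomalies.to_string(index=False))
```

A spike or dip flagged this way doesn’t prove an algorithm update hit you; it simply tells you which dates to investigate against the SERP-volatility trackers described next.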

Watch the SERP weather reports.

RankRanger SERP fluctuations chart.

Various websites and tools monitor ranking changes across categories and search markets and report on SERP volatility. Here are places you can check for early warning signs of a search ranking algorithm update:

Follow industry resources.

I’m always reading as an SEO. For the latest Google news, I recommend that you:

What To Do After a Google Update

Think that an algorithm update has penalized your site?

Don’t panic. Remember — nobody truly understands the algorithm. Whatever you’re experiencing may or may not be due to a Google fluctuation. And Google may “fix” it tomorrow, or next week.

With this in mind, get intentional. And stay calm. Decide whether you need to act before you do.

Calm woman meditating with laptop.

Here’s a plan to follow after an algorithm update …

  1. Stay calm.
  2. Get into puzzle-solving mode. Do NOT react or make changes hastily. Instead, gather data. Determine whether your site was impacted by the change and not something else, such as a technical SEO issue. Or it could be that your rankings dived because your competitors moved up in the SERPs. Depending on the cause, you need to do something different in response.
  3. Learn about the update from several sources (see my suggested resources above). Find out what other SEO experts are saying and experiencing.
  4. Adjust your SEO strategy accordingly.
  5. Remember that Google’s ranking algorithms change all the time. What impacts your site today could reverse itself in a month.
  6. Change what makes sense on your website.
  7. Re-evaluate your impact.
  8. If no results have changed, now you can panic.
  9. Call us.

Last Thoughts: You Don’t Need to Beat the Algorithm

Google’s algorithm updates are constant, often unverified, and difficult to anticipate. That doesn’t mean you have to be afraid.

Don’t spend your time trying to figure out a way to beat the algorithm. You’ll waste hours chasing your tail and missing the things that truly matter, like creating a high-quality website that is worthy of ranking.

I like to tell a story to illustrate this …

Imagine you’re out camping with a friend, and a bear shows up. You both take off running, the bear in hot pursuit.

In this situation, do you have to be an Olympic runner to survive?

No — you just have to be faster than your buddy.

In the world of SEO, your mission is to be better than your competition. You don’t need to beat the bear.

So don’t let algorithm updates cause you to make knee-jerk decisions. Instead, be strategic about how and when you respond, and stay informed so you can make those decisions properly.

If you found this helpful, please subscribe to our blog. If you’d like assistance with your website SEO, contact us for a free quote.

FAQ: What steps should I take if my site’s ranking is impacted after an update?

Search engine algorithm updates can significantly impact a website’s ranking, leading to fluctuations in search results. If your site’s ranking has been affected after a recent update, there are crucial steps to take for recovery and reclaiming lost visibility.

Understanding the Impact

The first step is to carefully assess the extent of the ranking drop and identify the specific pages or keywords affected. Compare the timing of the decline against recent algorithm updates. By understanding the impact and the changes made in the update, you can develop a targeted plan for recovery.

Site Audit and Optimization

Conduct a comprehensive site audit to identify any technical issues or violations of search engine guidelines that may have caused the ranking drop. Focus on factors like page load speed, mobile-friendliness, and broken links. Optimize your website’s content by incorporating relevant keywords naturally and ensuring high-quality, valuable content that aligns with user intent.
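
To make one small slice of that audit concrete, here is a hedged Python sketch using the requests library (the URL list is a placeholder). It only checks response codes and server response time; a real audit would also cover crawlability, indexation, structured data, and much more.

```python
# Minimal technical spot check: flag URLs that return errors or respond slowly.
import time
import requests

URLS_TO_AUDIT = [
    "https://www.example.com/",          # placeholder URLs -- swap in your own
    "https://www.example.com/services/",
]

def audit(urls, slow_seconds=2.0, timeout=10):
    for url in urls:
        start = time.time()
        try:
            resp = requests.get(url, timeout=timeout, allow_redirects=True)
            elapsed = time.time() - start
            issues = []
            if resp.status_code >= 400:
                issues.append(f"HTTP {resp.status_code}")
            if elapsed > slow_seconds:
                issues.append(f"slow response ({elapsed:.1f}s)")
            print(url, "->", "; ".join(issues) if issues else "OK")
        except requests.RequestException as exc:
            print(url, "-> request failed:", exc)

if __name__ == "__main__":
    audit(URLS_TO_AUDIT)
```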

Link Profile Analysis

Evaluate your website’s backlink profile to check for suspicious or low-quality links that might have triggered a penalty. Disavow toxic links, and work on acquiring high-quality backlinks from reputable sources to improve your site’s authority.
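
Deciding which links are toxic is a manual judgment call; only the file itself is mechanical. As an illustration, the Python sketch below writes a disavow file in the plain-text format Google’s disavow tool expects (one URL or “domain:” entry per line, “#” for comments). The domain names are purely illustrative placeholders.

```python
# Sketch: turn a hand-reviewed list of toxic linking domains into a disavow file.
# Which domains belong on the list is a human decision; this only handles formatting.
TOXIC_DOMAINS = [
    "spammy-directory.example",   # placeholder domains
    "paid-links.example",
]

def write_disavow_file(domains, path="disavow.txt"):
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Disavow file generated after manual link review\n")
        for domain in sorted(set(domains)):
            # "domain:" disavows all links from that domain, not just a single URL
            f.write(f"domain:{domain}\n")

if __name__ == "__main__":
    write_disavow_file(TOXIC_DOMAINS)
```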

User Experience Enhancement

A positive user experience is crucial for ranking well in search results. Ensure your website is visually appealing and easy to navigate, and that it features clear calls to action, to improve the overall user experience. Doing this will encourage more visitors to remain on your site for extended periods.

Engaging Content Strategy

Craft a content strategy that provides valuable, engaging, and informative content for your audience. Regularly update your website with fresh posts in order to establish authority in your niche area and demonstrate expertise.

Step-by-Step Procedure: How to Recover Lost Rankings After an Update

  1. Evaluate the Ranking Drop
  2. Identify the Affected Pages or Keywords
  3. Understand the Algorithm Update
  4. Plan a Targeted Recovery Strategy
  5. Conduct a Comprehensive Site Audit
  6. Address Technical Issues and Violations
  7. Optimize Website Content
  8. Focus on Mobile-Friendliness
  9. Improve Page Load Speed
  10. Fix Broken Links
  11. Analyze the Backlink Profile
  12. Disavow Toxic Links
  13. Acquire High-Quality Backlinks
  14. Enhance User Experience
  15. Optimize Site Navigation
  16. Create Clear Calls-to-Action
  17. Ensure a Visually Appealing Design
  18. Implement an Engaging Content Strategy
  19. Publish Valuable and Informative Content
  20. Monitor and Analyze Progress

By following these expert strategies and taking appropriate action, you can navigate through algorithm updates and recover lost rankings effectively. Remember to continually monitor your website’s performance and adapt your SEO efforts to stay resilient in the ever-changing landscape of search engine optimization.

The post An Up-to-Date History of Google Algorithm Updates appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/google-algorithm-updates/feed/ 63
Why SEO Basics Still Matter + Evergreen SEO Tips https://www.bruceclay.com/blog/why-seo-basics-matter/ https://www.bruceclay.com/blog/why-seo-basics-matter/#comments Fri, 11 Aug 2023 15:00:44 +0000 https://www.bruceclay.com/blog/?p=22296 There are a ton of advanced Web marketing tactics these days, and the evolution of the field has brought us to a very healthy, holistic approach to digital marketing. But it’s equally important not to lose sight of the basics that allow a website to reach its full potential. We see it time and time again: sites that don’t implement the fundamentals of SEO find obstacles creeping into various parts of their sites, their businesses, their strategies. That’s why SEO basics are the foundation of any successful website.

At the upcoming Search Engine Strategies in San Francisco this August, Bruce Clay presents the session, “Getting Started with SEO.” Conferences host these types of sessions time and time again because the basics of SEO are still very relevant.

This is because:

  • Large brands with complicated websites are unable to take their site to the next step without implementing the basics of SEO on their site.
  • Small business site owners are just getting started in search engine optimization, and need to understand why these tactics exist, and how to implement them.

Inspired by Bruce’s upcoming presentation, I thought we’d use this post to look at what SEO basics still matter and why. But first, let’s explore the “lasting” side of SEO – the approach to SEO that stands the test of time.

Read more of Why SEO Basics Still Matter.

The post Why SEO Basics Still Matter + Evergreen SEO Tips appeared first on Bruce Clay, Inc..

]]>
A, B and C blocks on a table.

Now more than ever, it seems the world is changing. Not too long ago, I spoke about the tipping point for SEO and digital marketing, which is causing high-growth companies to drive even faster.

And then, of course, Google changes an average of 12 things per day in Search, and websites are being impacted for things they aren’t even doing wrong.

Yet with all these changes, the basics of SEO remain the same. It is important not to lose sight of the fundamentals that we know make a difference. They make a difference in Google’s ranking algorithm and make a difference to businesses trying to remain relevant and competitive online.

That is why SEO basics are the foundation of any successful website. Let’s look at this idea in more detail.

What Is an Evergreen SEO Strategy?

No matter what changes are thrown our way, a basic and dependable approach to SEO helps a company’s online presence weather those changes.

That approach includes the following SEO principles:

  1. Beat the competition, not the algorithm
  2. Focus on a whole-SERP SEO strategy
  3. Optimize to be the least imperfect

1. Beat the Competition, Not the Algorithm

There are countless ranking signals in Google’s algorithm, with thousands of updates made to Search yearly. So it’s safe to say that no one knows exactly what the search engine algorithm is looking for.

We do the best we can with the information that’s available to us and the wisdom we have from years of practice, success and failure. We stay on top of developments and test them over and over again.

However, instead of focusing on every little algorithmic possibility, we instead focus on our competition. Our competition is the content ranking on Page 1 of the search results. And we discover that competition through keyword research (more on that later).

For more, read:

2. Focus on a Whole-SERP SEO Strategy

With keywords in tow, you will be able to see what types of search engine results are most prominent for those queries. You will prioritize your SEO and content development efforts there.

For example, some search queries will have more prominent video results, images, or perhaps featured snippets.

Google search engine results page for the query "how to get kool aid out of carpet."
Google SERP for the query “how to get kool aid out of carpet”

In other words, don’t just focus on the “blue links.” When you approach SEO and content development by understanding which features Google believes are the most relevant for a query, you evergreen your SEO program.

For more, read:

3. Optimize To Be the Least Imperfect

Analyze the content and the webpages that are ranking for your desired keywords. What are they doing well? What can you do better?

As mentioned, the goal of SEO is not to try to beat the ranking algorithm, which is infinitely large. The goal of SEO is to beat the competition. And the way to do that is to be least imperfect compared to the competition. Every website is imperfect against the Google algorithm. And when Google evaluates which pages to serve in its search results, it chooses the least imperfect compared to others for that search.

For more, read:

What Are Some Basic SEO Strategies That Get Results?

Here are five basic SEO strategies that are proven to get results:

  1. Keyword and audience research
  2. Information architecture aka SEO siloing
  3. Quality content
  4. Technical and on-page optimization
  5. Linking practices

1. Keyword and Audience Research

SEO keywords are single words or short phrases that represent the search queries that people use in a search engine. Once you have identified your keywords and explored the intent behind them (i.e. what a person is trying to accomplish when they use them in Google), you can do the following and more:

  • Identify and speak the language of the target market
  • Create useful content for your target audience
  • Communicate to Google that a webpage is a relevant match for a query
  • Drive more qualified traffic to appropriate webpages
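
As a quick illustration of the “intent” piece, here is a rough Python heuristic that buckets keywords by common modifiers. The modifier lists are my own illustrative assumptions, not a standard; real research should confirm intent by looking at what actually ranks for each query.

```python
# Rough heuristic: bucket keywords by likely intent based on common modifiers.
# The modifier lists are illustrative, not exhaustive.
INTENT_MODIFIERS = {
    "transactional": ["buy", "price", "cheap", "deal", "near me"],
    "informational": ["how to", "what is", "why", "guide", "tips"],
    "commercial": ["best", "review", "vs", "top", "compare"],
}

def classify_keyword(keyword: str) -> str:
    kw = keyword.lower()
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(mod in kw for mod in modifiers):
            return intent
    return "navigational/unclear"

if __name__ == "__main__":
    for kw in ["buy running shoes", "how to clean running shoes", "best running shoes 2024"]:
        print(f"{kw!r:40} -> {classify_keyword(kw)}")
```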

For more, read:

2. Information Architecture aka SEO Siloing

How you organize the content on your site matters for search engines and users. SEO siloing builds relevance for a website and positions it as an authority on a topic. It also helps website visitors navigate the content with ease, and get complete answers to their questions.

For more, read:

3. Quality Content

Google wants to display the most useful content to its search engine users. So quality content is likely the most important ranking factor to get right.

For more, read:

4. Technical and On-Page Optimization

The performance of your website matters to website visitors. That is why you need to make sure that the site provides a good user experience. You do this through technical SEO practices that optimize the back-end of the website.

In addition, you want to help Google understand what a webpage is about, and this can be accomplished through on-page SEO.
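
For illustration, here is a minimal Python sketch of an on-page spot check. It assumes the requests and beautifulsoup4 libraries are installed, and the URL is a placeholder. It only reports the title tag, meta description, and whether a mobile viewport tag is present; real technical SEO goes much deeper than this.

```python
# Minimal on-page spot check: report title tag, meta description, and viewport tag presence.
import requests
from bs4 import BeautifulSoup

def spot_check(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else None
    description = soup.find("meta", attrs={"name": "description"})
    viewport = soup.find("meta", attrs={"name": "viewport"})
    return {
        "title": title,
        "title_length": len(title) if title else 0,
        "meta_description": description.get("content") if description else None,
        "has_viewport_tag": viewport is not None,
    }

if __name__ == "__main__":
    from pprint import pprint
    pprint(spot_check("https://www.example.com/"))
```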

For more, read:

5. Linking Practices

How you link to other webpages matters in SEO. There are three primary strategies in linking:

  1. Internal links: How you link to pages within your website
  2. Outbound links: Which websites you link to
  3. Inbound links: Which websites link to you

Graphic illustrating the difference between internal links, inbound links and external links.

Each type of linking strategy has its own best practices. It is important to strive for quality and relevance with every link.
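
The first two link types can be checked from your own pages. Here is a hedged Python sketch (assuming requests and beautifulsoup4 are installed; the URL is a placeholder) that splits a page’s links into internal and outbound. Inbound links can’t be measured from your own page, so you need Search Console or a backlink tool for those.

```python
# Sketch: classify the links on a page as internal or outbound, using the page's hostname.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def classify_links(page_url: str) -> dict:
    host = urlparse(page_url).netloc
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    internal, outbound = [], []
    for a in soup.find_all("a", href=True):
        absolute = urljoin(page_url, a["href"])      # resolve relative hrefs
        if urlparse(absolute).scheme not in ("http", "https"):
            continue                                 # skip mailto:, tel:, javascript:, etc.
        (internal if urlparse(absolute).netloc == host else outbound).append(absolute)
    return {"internal": internal, "outbound": outbound}

if __name__ == "__main__":
    links = classify_links("https://www.example.com/")
    print(f"{len(links['internal'])} internal, {len(links['outbound'])} outbound")
```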

For more, read:

Without the basics of SEO, websites suffer when Google changes things or we experience economic or market downturns. This results in knee-jerk reactions that end up costing businesses more in the end than an upfront investment in evergreen SEO strategies.

Looking to add an evergreen SEO strategy to your company plan? Talk to us. We can help.

FAQ: How can I implement an effective evergreen SEO strategy to stay competitive in an ever-changing digital landscape?

With search engine algorithms frequently changing and competition intensifying, adopting an evergreen SEO strategy is essential to remain competitive in the long run.

An evergreen SEO strategy focuses on core principles that remain effective regardless of algorithmic shifts. The key is prioritizing beating the competition rather than chasing after ever-changing search engine algorithms. Conducting thorough keyword research to identify and understand your competition is the first step. By analyzing what content ranks on Page 1 of search results for your target keywords, you gain valuable insights into your competitors’ strategies and areas for improvement.

Adopting a Whole-SERP approach is a crucial aspect of a successful evergreen SEO strategy. It goes beyond focusing solely on traditional blue links in search results. Instead, consider the various features that Google deems relevant for a specific query, such as featured snippets, images, videos, or knowledge graphs. By understanding which elements are prominent for your target keywords, you can tailor your SEO and content development efforts accordingly to maximize visibility and engagement.

Optimizing to be the least imperfect compared to your competition is another fundamental principle of an evergreen SEO strategy. Instead of obsessing over trying to outsmart the ever-changing ranking algorithms, concentrate on delivering the best possible user experience and content quality. Analyze the webpages that rank for your desired keywords and learn from their strengths. By providing more value to your audience and addressing their needs comprehensively, you increase your chances of ranking higher in search results.

Implementing evergreen SEO strategies involves five proven practices:

  1. Conduct comprehensive keyword and audience research to understand user intent and create targeted content.
  2. Optimize your website’s information architecture using SEO siloing to establish authority on specific topics and improve navigation.
  3. Prioritize producing high-quality content that caters to your audience’s needs, as it remains the most significant ranking factor.
  4. Ensure technical and on-page optimization to enhance user experience and facilitate search engine understanding.
  5. Adopt effective linking practices including internal, outbound and inbound links to build authority and relevance for your website.

Mastering the art of evergreen SEO is essential for staying competitive in the ever-changing digital landscape. By focusing on beating the competition, understanding user intent and delivering high-quality, valuable content, businesses can create a solid foundation for long-term online success.

Step-by-Step Evergreen SEO Strategy Implementation: 

  1. Conduct comprehensive keyword research to identify relevant target keywords.
  2. Analyze competitor content that ranks on Page 1 of search results for your chosen keywords.
  3. Identify areas for improvement and potential content gaps in your niche.
  4. Develop a Whole-SERP strategy by considering different types of search engine results for your target queries.
  5. Prioritize content creation that aligns with your audience’s search intent.
  6. Implement SEO siloing to organize your website content and establish topic authority.
  7. Optimize website navigation for user-friendly access to relevant content.
  8. Focus on producing high-quality, valuable content that addresses your audience’s needs and queries.
  9. Ensure technical SEO practices to improve website performance and user experience.
  10. Optimize on-page elements such as meta tags, headers and content structure.
  11. Utilize internal linking to guide users to related content within your website.
  12. Employ outbound links to reputable sources that support and complement your content.
  13. Seek quality inbound links from authoritative websites in your industry.
  14. Monitor and analyze website performance and rankings regularly.
  15. Continuously update and improve your content to remain relevant and valuable.
  16. Stay up-to-date with industry trends and algorithm changes to adapt your strategy.
  17. Leverage social media and other promotional channels to drive traffic to your content.
  18. Encourage user engagement and interaction with your content.
  19. Monitor and respond to feedback and comments from your audience.
  20. Continuously evaluate and refine your evergreen SEO strategy based on performance metrics and changing market demands.

The post Why SEO Basics Still Matter + Evergreen SEO Tips appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/why-seo-basics-matter/feed/ 12
Do WordPress Sites Do Better in Google Search? https://www.bruceclay.com/blog/wordpress-sites-better-google-search/ https://www.bruceclay.com/blog/wordpress-sites-better-google-search/#comments Thu, 13 Jul 2023 18:14:45 +0000 https://www.bruceclay.com/?p=194476 Explore the reasons behind Google's embrace of WordPress, the impact of their partnership, and what makes WordPress sites competitive.

The post Do WordPress Sites Do Better in Google Search? appeared first on Bruce Clay, Inc..

]]>
A bundle of cubes with WordPress logo printed on them.

For several years, people have speculated that websites built on WordPress do better in Google’s search results. So, does this mean Google favors WordPress sites?

Let’s dive into:

Why Google Embraces WordPress

WordPress is behind 43% of the websites on the internet, and its CMS market share is roughly 64%, according to W3Techs.

That makes it the most popular content management system and growing. In fact, from January 2020 to April 2022, WordPress usage grew from 35% to 43% – that’s nearly a 23% increase in just over two years.

And WordPress isn’t just for small businesses. WordPress powers some of the biggest brands and most popular websites, including Zoom.us, Alibaba.com, Bloomberg.com, and Salesforce.com.

Google endorsed WordPress early on, stating back in 2009 that the CMS solved up to 90% of SEO mechanics, with features such as enabling an SEO-friendly site structure, mobile friendliness, and enabling page optimization via meta tags – and that is right out of the box.

Additionally, WordPress plugins enable you to further tailor functionality (including our own Bruce Clay WordPress SEO plugin), providing plenty of ways for optimizing a WordPress website for SEO.

Even with all of these features, WordPress still isn’t perfect for organic search and needs work if you really want to improve your rankings in the search results.

In 2017, Google presented findings at a WordCamp event that showed WordPress performed poorly compared to non-WordPress webpages on a lot of key performance indicators.

Graph from Google presentation showing median speed index between WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017
Graph from Google presentation showing Time to First Interactive metric for WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017
Graph from Google presentation showing First Meaningful Paint metric for WordPress and non-WordPress sites.
Image source: Performance Is User Experience, Google, WordCamp, 2017

(In case you didn’t notice, those performance indicators are the same found in Google’s “page experience” algorithm update that hit in 2021).

So in 2018, Googler Alberto Medina announced a partnership between Google and WordPress to help improve the WordPress ecosystem. (The announcement on his personal blog seems to be currently unavailable.)

Why partner together?

As WordPress expands in market share and becomes the website platform of choice for more websites, Google wants to ensure that WordPress sites perform well in search results.

That equates to a better experience for Google users. And this is what Google really cares about.

Does Google Give WordPress Sites Better Rankings?

Maybe – but it’s not likely due to a ranking signal in its algorithm. And, in fact, Google has stated otherwise.

In 2016, Google’s John Mueller said this on Twitter:

And again in 2021, Mueller said this:

The reason WordPress sites may tend to rank better is WordPress’s SEO-friendly features. Not every CMS allows you to do advanced SEO, and that capability is what you want in a CMS.

As WordPress and its community — including Google — continue working together to enhance it, WordPress sites will be more competitive in search results compared to other website builders.

If you are choosing between WordPress and Wix for creating a website, my advice would be to ensure you can compete effectively with your competition online.

And, if all your competitors use WordPress, it would be advantageous for all parties involved to be competing on an even playing field. You don’t get there by using a website builder that does not enable you to do advanced SEO.

Contact us today to learn more about how you can develop or update your WordPress site to rank in Google.

FAQ: How does the partnership between Google and WordPress benefit search results?

Google’s partnership with WordPress offers multiple benefits for website owners and marketers. It positively impacts search visibility for WordPress users in many ways, including:

Enhanced Crawling and Indexing

The partnership between Google and WordPress has helped improve the crawling and indexing of WordPress websites. A well-structured WordPress site makes it easy for Googlebot to identify and index new or updated pages, so freshly published or updated blog posts can appear in relevant search results sooner for searchers to discover.
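
One practical way to support that crawling is to make sure your XML sitemap (WordPress and popular SEO plugins can generate one automatically) only lists live URLs. Here is a minimal Python sketch; the sitemap URL is a placeholder, and it assumes the requests library is installed.

```python
# Sketch: fetch an XML sitemap and report any listed URL that no longer returns HTTP 200.
# Note: this handles a simple <urlset>; a sitemap index would need one more level of fetching.
import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(sitemap_url: str) -> None:
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    for loc in root.iter(f"{SITEMAP_NS}loc"):
        url = loc.text.strip()
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
        if status != 200:
            print(f"{url} -> HTTP {status}")

if __name__ == "__main__":
    check_sitemap("https://www.example.com/sitemap.xml")
```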

Optimized Site Structure and Mobile-Friendliness

WordPress is widely admired for its user-friendly interface and robust architecture, offering a host of SEO-friendly themes and plugins that can optimize website structure, increase speed and enhance mobile friendliness. This integration aligns perfectly with Google’s emphasis on mobile-first indexing and page experience signals. By leveraging WordPress’s features and Google’s guidelines, website owners can create highly responsive and user-friendly websites that are favored by search engines.

Streamlined Content Publishing and SEO

Content is king in the digital realm, and WordPress empowers website owners to create and publish high-quality content seamlessly. The partnership with Google further enhances this process by providing valuable insights and recommendations for optimizing content for search results. WordPress plugins, such as Yoast SEO, integrate with Google Search Console to offer real-time suggestions for improving keyword targeting, meta descriptions and overall content quality. This collaborative approach ensures that your WordPress site aligns with Google’s search algorithm, maximizing visibility and engagement.

Improved Schema Markup and Rich Snippets

Schema markup, also known as structured data, provides search engines with additional information about your website’s content, enabling them to display rich snippets in search results. WordPress offers numerous plugins that simplify the implementation of schema markup, ensuring that your content stands out in search listings. The Google-WordPress partnership has facilitated the adoption of standardized schema types and improved compatibility between WordPress and Google’s search engine. As a result, website owners can effectively communicate the relevance and context of their content to Google, leading to enhanced visibility and click-through rates.
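
WordPress plugins usually generate this markup for you, but to make the idea concrete, here is a minimal sketch that builds a basic schema.org Article block as JSON-LD. All field values are placeholders, and it is written in Python purely for illustration.

```python
# Sketch: build a basic schema.org Article JSON-LD block and wrap it in the
# <script> tag you would place in a page's head. Field values are placeholders.
import json

def article_jsonld(headline: str, author_name: str, date_published: str) -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author_name},
        "datePublished": date_published,
    }
    return '<script type="application/ld+json">\n%s\n</script>' % json.dumps(data, indent=2)

if __name__ == "__main__":
    print(article_jsonld("Example Article Headline", "Jane Author", "2024-01-01"))
```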

The integration of Google’s search capabilities with the WordPress platform empowers website owners to create user-friendly, SEO-optimized websites that rank well in search results, ultimately driving more organic traffic and increasing online visibility.

The post Do WordPress Sites Do Better in Google Search? appeared first on Bruce Clay, Inc..

]]>
https://www.bruceclay.com/blog/wordpress-sites-better-google-search/feed/ 5