aperchorowicz – Author at Bruce Clay, Inc. (SEO and Internet Marketing)
https://www.bruceclay.com/blog/author/aperchorowicz/

SEO Tools for Microsoft IIS Servers
https://www.bruceclay.com/blog/seo-tools-for-microsoft-iis-servers/ – Tue, 22 Mar 2011

One of the most frustrating things I have come across is a CMS (Content Management System) with poor support for SEO. Basic tools such as generating a sitemap, searching for broken links or checking markup are often unavailable. This post will focus on SEO tools available for websites built on Microsoft IIS and how effectively they cover a website assessment.

At Bruce Clay we group core SEO requirements into three broad categories: Technical, Expertness/Linking and Copywriting, each broken down into sub-areas.

Tools such as the Microsoft IIS Search Engine Optimization Toolkit and the Expression Web SEO Checker help to optimise or support the technical aspects of your website: sitemaps, robots exclusions, HTML markup and site structure.
The IIS SEO Toolkit helps web developers and server administrators improve their website's relevance in search results.

It has a fully featured crawler engine and reporting that help you understand the route search engines take to reach your content. This is important because the preliminary step for search engines is indexing your content. What do you want indexed? Are there parts of your site you do not want a robot to access? And are there any errors in your URLs that are preventing a spider from crawling?

In addition, the toolkit has robots exclusion features and the ability to generate sitemaps based on the architecture of your site. A robots.txt file and an XML sitemap are essential when setting up a website in GWT (Google Webmaster Tools). GWT helps webmasters see how Google is crawling and indexing their site and also provides ranking and traffic data for SEO specialists.
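If your CMS cannot generate these files for you, they are simple enough to produce with a short script. Below is a minimal sketch (not part of the IIS Toolkit) that writes a robots.txt with a Sitemap directive and a bare-bones XML sitemap; the domain and page paths are placeholders.

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical site details -- replace with your own domain and URL list.
DOMAIN = "https://www.example.com"
PAGES = ["/", "/products/", "/contact/"]

# robots.txt: allow everything, block one private area, point to the sitemap.
robots_txt = (
    "User-agent: *\n"
    "Disallow: /private/\n"
    f"Sitemap: {DOMAIN}/sitemap.xml\n"
)

# Minimal XML sitemap: one <url> entry per page with today's date as lastmod.
urls = "\n".join(
    f"  <url>\n"
    f"    <loc>{escape(DOMAIN + path)}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    f"  </url>"
    for path in PAGES
)
sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{urls}\n"
    "</urlset>\n"
)

with open("robots.txt", "w") as f:
    f.write(robots_txt)
with open("sitemap.xml", "w") as f:
    f.write(sitemap_xml)
```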

Microsoft Expression Web is an HTML editor and general web design product. It supports XHTML, XML, CSS and W3C accessibility standards, and integrates with Visual Studio. It was released at the end of 2006 and has had positive reviews. Last year Expression Web 4 introduced a new reporting tool, the SEO Checker.

The SEO Checker analyses page structure and content against 50 rules that the Expression Web team have researched. What does the SEO Checker do? It focuses on HTML markup and relevant content, covering the sub-categories of HTML and structure in the BCA core technical requirements mentioned above.

The SEO Checker's rules provide guidance in the following categories, described in the MSDN blog post Using the Expression Web SEO Checker.

Content is not where search engines expect to find it

e.g. a missing title tag, empty H1, or missing ALT attributes

Relevance: Title tags identify what your page is about and hold important keywords; the title is what appears in the SERP (Search Engine Results Page). The H1 is an important element of page structure and a must-have for grouping content under headings. ALT attributes label images and are needed for accessibility as well as for appearing in Google Images.

Content or markup follows patterns that may be associated with deceptive practices

Relevance: This is a test to see if you are inadvertently stuffing keywords, have excessive content in your page, metadata or ALT attributes, or are using redirection inappropriately, all of which can penalise your position in search engine rankings.

Content or markup interferes with the ability of search engines to analyze a page

Relevance: This is similar to running your page through the W3C HTML Validator. It will look for missing tags and see if HTML elements are nested correctly. It checks that the page is well-formed and can be parsed correctly.
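As a rough illustration of this kind of check, here is a small sketch using Python's standard html.parser module to flag mismatched or unclosed tags. It is a simplified stand-in for a full validator, and the sample HTML is invented.

```python
from html.parser import HTMLParser

# Tags that are legitimately empty in HTML and never need a closing tag.
VOID_TAGS = {"br", "img", "meta", "link", "input", "hr", "area", "base",
             "col", "embed", "source", "track", "wbr"}

class NestingChecker(HTMLParser):
    """Very rough well-formedness check: tags should close in the order they opened."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if not self.stack:
            self.problems.append(f"closing </{tag}> with no matching open tag")
        elif self.stack[-1] != tag:
            self.problems.append(f"</{tag}> closes <{self.stack[-1]}> out of order")
            self.stack.pop()  # drop the mismatched tag so later checks can continue
        else:
            self.stack.pop()

    def report(self):
        return self.problems + [f"<{t}> was never closed" for t in self.stack]

sample = "<html><head><title>Demo</title></head><body><h1>Hi<p>text</h1></p></body></html>"
checker = NestingChecker()
checker.feed(sample)
print(checker.report())
```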

Content or markup reduces the appeal of a page in search-engine results.

Relevance: This analyses how well optimised your title and metadata are. Are elements missing, or are they too short? This is the information displayed in the SERP (Search Engine Results Page).

Content causes search engines to consider two or more pages to be relevant to the same search term.

Relevance: This analyses your site for duplicate content. Are your titles and descriptions unique? This information can also be obtained from the HTML suggestions in Google Webmaster Tools. It also checks for malformed URLs that can result in duplicates, such as session IDs, uppercase characters and redundant forward slashes.
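To see how these URL variants create duplicates, the sketch below normalises a few made-up URLs so that the variants collapse to one canonical form. It assumes the server treats paths case-insensitively (as IIS typically does) and that the session parameter names listed are the ones your site uses.

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only track state and should not create "new" pages (assumed names).
TRACKING_PARAMS = {"sessionid", "sid", "phpsessid"}

def normalise(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # Lowercase the path (IIS paths are typically case-insensitive) and
    # collapse runs of slashes ("//about//us/" -> "/about/us/").
    path = re.sub(r"/{2,}", "/", parts.path.lower()) or "/"
    # Drop session-style parameters and keep the rest in a stable order.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme.lower(), host, path, urlencode(sorted(query)), ""))

variants = [
    "http://www.Example.com/About//Us/?sessionid=abc123",
    "http://www.example.com/about/us/",
    "http://www.example.com//about/us/?SID=999",
]
print({normalise(u) for u in variants})  # ideally a single canonical URL
```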

Content or markup blocks search engines from analysing your site.

Relevance: This checks whether content is being blocked from search engines. Are there any errors preventing your site from being crawled properly? This covers hyperlinks with the nofollow attribute, hyperlinks with too many parameters, and invalid values in the <meta name="robots"> tag.
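A rough sketch of such a check: it flags nofollow links, links with more query parameters than an assumed threshold, and unrecognised values in the robots meta tag. The directive list and the threshold are illustrative, not authoritative, and the sample HTML is invented.

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit, parse_qsl

# Common robots meta directives; this list is illustrative, not exhaustive.
KNOWN_ROBOTS_VALUES = {"all", "index", "noindex", "follow", "nofollow",
                       "none", "noarchive", "nosnippet"}
MAX_PARAMS = 3  # assumed threshold for "too many parameters"

class CrawlabilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.warnings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "href" in a:
            if "nofollow" in (a.get("rel") or "").lower():
                self.warnings.append(f"nofollow link: {a['href']}")
            if len(parse_qsl(urlsplit(a["href"]).query)) > MAX_PARAMS:
                self.warnings.append(f"link with many parameters: {a['href']}")
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            values = {v.strip().lower() for v in (a.get("content") or "").split(",")}
            unknown = values - KNOWN_ROBOTS_VALUES
            if unknown:
                self.warnings.append(f"unrecognised robots directives: {sorted(unknown)}")

sample = (
    '<meta name="robots" content="index, folow">'   # deliberate typo "folow"
    '<a rel="nofollow" href="/partner">Partner</a>'
    '<a href="/search?a=1&b=2&c=3&d=4">Search</a>'
)
checker = CrawlabilityChecker()
checker.feed(sample)
print("\n".join(checker.warnings))
```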

The above tools are great for webmasters and server administrators who maintain an MS IIS site. They are particularly helpful if the site uses a CMS that lacks SEO support. Note that these tools do not analyse Expertness/Linking or Copywriting as described in the BCA core SEO elements above. For link profiling, a good set of tools would be Majestic SEO and Open Site Explorer. For copywriting, the Bruce Clay SEOToolSet can come in handy for optimising content on web pages.

Directory Link Building
https://www.bruceclay.com/blog/directory-link-building/ – Thu, 10 Feb 2011

Directory link building is a common SEO strategy employed in a link building program and can be a fast way to gain new links. It's a tactic that works and is worth investing in as a first step in building links. The advantages of submitting to directories are increased exposure and a potential boost in your rankings: your products and services become accessible not only via search engines but also to customers looking up companies in relevant categories.

So which directories should you start submitting to? A sample of top-tier directories would be DMOZ, Yahoo, Best of the Web and Joe Ant. The next step would be to source local business directories in the same geography as your domain. Unfortunately for Australian domains, the Yahoo!7 Directory stopped receiving submissions some time ago, but an excellent source of reviewed Australian directories is the DMOZ Australian Business and Economy Directories list. These directories have passed the DMOZ editorial review process and are of excellent value.

So when you are on the hunt for new directories, what should you be looking for? Generally look for good quality sites that are not in bad neighborhoods (i.e. that do not share an IP address with disreputable sites) and that don't accept submissions from adult and gambling sites. Good KPIs to analyse would be age of domain, PR, pages indexed and inbound links. The Google Webmaster Central Blog also offers this guideline: "There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes."

When reviewing free directories, make sure you don't opt for reciprocal links, which are quite a common offer. Reciprocal linking may occur naturally within niche industries, but it becomes a four-letter word when your site has a high proportion of one-to-one links.

Then come the questions "Should you submit to paid directories?" and "Does Google see it as a paid link and apply a penalty?" As Matt Cutts explains in an interview, if you are paying for editorial value, that is, a process where editors review the quality and relevance of your site, then it's OK to pay for the directory. Be wary, though, if you're paying for lower quality, automated submissions or the option of choosing anchor text.

A great way to look for highly relevant directories is to use link searches for themed directories. The aim is to use Google to search for directories that have content closely related to your own and a good PR.

E.g.: http://www.google.com.au/#hl=en&site=&source=hp&q=keyword+business+listing

Another great way to find a source of directories is to analyse your competition's backlinks. Using Excel and data from a backlink data specialist like Majestic SEO, you can analyse where your competition is getting its best links from. By running filters in Excel you can harvest a list of directories that your competition has submitted to.
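The same filtering can be scripted. The sketch below assumes a hypothetical backlinks.csv export with a source_url column (column names will vary by provider) and uses a crude keyword heuristic for spotting directory-like referring domains.

```python
import csv
from urllib.parse import urlsplit

# Words that commonly appear in directory domains or paths -- an assumed heuristic.
DIRECTORY_HINTS = ("directory", "listings", "dmoz", "botw", "links")

directories = set()
with open("backlinks.csv", newline="") as f:      # hypothetical export file
    for row in csv.DictReader(f):
        source = row["source_url"]                 # column name is an assumption
        parts = urlsplit(source)
        haystack = (parts.netloc + parts.path).lower()
        if any(hint in haystack for hint in DIRECTORY_HINTS):
            directories.add(parts.netloc.lower())

for domain in sorted(directories):
    print(domain)
```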

Directory submissions are a great way to get your company profile established on the web. However, in the end, natural link building over time is what boosts a site to top rankings. A site with the best tools, services, prices and so on, plus good website content, is what will obtain the best natural links from the web.

SEO Update – November 2010
https://www.bruceclay.com/blog/seo-update-november-2010/ – Mon, 13 Dec 2010

Welcome to our monthly SEO update for November 2010. This update highlights key news in the SEO industry during the month, notable SEO-related blog posts from our bloggers in the US and Australia, articles covered in our global newsletter, and podcasts from our SEM Synergy show on Webmaster Radio.

1. SEO News

a) Google Places

In late October, Google rolled out significant changes to Google Places, which impacted the rankings, layout and number of search results for some location-based searches. For more information on this topic, see our blog post, the latest podcast from SEM Synergy and the videos on Google Places included in the sections below.

b) Bruce Clay – thought leadership video interviews

Didit’s CEO Kevin Lee conducted a number of video interviews with Bruce Clay, CEO of Bruce Clay, Inc. at Ad:Tech New York in November 2010. In these videos, Bruce discusses the SEO implications of a number of topics including Social Media, Personalization Changes to Local SERPs, and Search Engine Algorithm Differentiation.

c) Google showing more results from a domain

Google is now showing more results from one domain in the search results. Previously a domain could appear at most twice in a set of results, with the second result indented. Now the limit appears to be around four, with no results indented. There are also single-line snippets linking to further results. Google stated that "…when our algorithms predict pages from a particular site are likely to be most relevant, it makes sense to provide additional direct links in our search results". This is happening mainly for brand-related terms.

Implication: If your site is expanding to four search results, this gives you new opportunities to showcase content. If you're ranking near the bottom of page 1, your results may get pushed to page 2 if a site above you expands to four results. Ensure multiple pages on your site are optimised from an SEO and click-through-rate perspective.

d) Google getting better at indexing Flash files

Google officially began indexing parts of Flash (SWF) files in 2008 but they have now announced that they are doing a better job of indexing Flash files.

Implication: It has always been somewhat unclear which content and links in Flash objects were being properly indexed. Google has now stated that almost any text a user can see, as well as the URLs in SWF files, can be crawled and indexed. This does not mean you should build your site fully in Flash, but it does mean Google has a better understanding of the content and links within the Flash files used on a site. Ensure this is considered as part of your SEO strategy.

e) Google Hotpot

During the month, Google announced Google Hotpot, a "local recommendations engine powered by you and your friends." The service works as follows: you and your friends rate places, and when you search in Google Places you are recommended places based on your and your friends' likes and dislikes. Google is attempting to make local search results more personal, relevant and trustworthy.

Implication:  Ensure you have your Google Places account correctly optimised and encourage users to write positive reviews about your business.

f) Google now taking poor reviews into account in rankings

Google announced recently that they have adjusted their algorithm and rankings to take into account, in their words, "…merchants, that in our opinion, provide an extremely poor user experience".

Implication: Google is pushing for a better user experience on the web. Rankings are now affected not just by link building and content, but also by poor reviews from customers and potentially other signals reflecting a poor customer (user) experience. Ensure you are managing your online brand reputation and providing good customer service and an exceptional user experience.

g) Google providing new meta tags to help identify original news items

Google is experimenting with two new meta tags that it hopes will help identify the original sources of online content. As Google has stated, "The more accurate metadata that's out there on the web, the better the web will be." These tags are similar to Google's canonical tag except that they are designed for news publishers and only work within Google News. Google also says that if the tags are misused it may decrease the importance assigned to current meta tags on a site, and if this proves to be the case it reserves the right to remove sites from the Google News index.

Implication: If you are a publisher, try these tags if you feel they will add to your SEO strategy, then monitor the impact on rankings and traffic. Note that Google says this is only an experiment at this stage.

2. Blogs

Key blog posts in November:

3. Newsletter

Key newsletter articles in November:

4. Webmaster Radio – SEM Synergy

Podcast in November:

Video and Google TV
https://www.bruceclay.com/blog/video-and-google-tv-2/ – Mon, 08 Nov 2010

The much anticipated Google TV became available to consumers last month via a TV set from Sony and a set-top box from Logitech. Prior to this release, Google had been driving to educate the public on publishing video for the web, most likely in order to increase content for Google TV.

Google TV is, as Google puts it, "Television, meet search engine." Among its features, such as web browsing, apps and recording shows, searching TV and web video will be a core capability. So it comes as no surprise that in the months leading up to this fall, Google has been releasing a lot of material on video sitemaps; video publishers will want their content discoverable on this new medium. Recently at SMX Advanced in Seattle, Google's Matt Cutts, head of the web spam team, stated that Google will be looking more at sitemaps in the future, especially with the coming of Google TV.

More Americans are watching TV and using the Internet together, as seen in Nielsen's Three Screen Report. Americans now spend 35% more time using the Internet and TV simultaneously than they did a year ago – 3.5 hours each month surfing the Internet and watching TV at the same time. This growing market has been targeted not just by Google but also by Apple's iPad, as can be seen in Where and How We'll Use It.

Google wants to be able to index all video on the web, and this medium has been increasing in importance from Google's acquisition of YouTube to the coming of Google TV. So it looks like there is a new market for SEO. Not only is there a focus on making all video on the web searchable through video sitemaps; websites will also be accessible on TV, and Google has released a post – Optimizing Sites for TV – explaining the differences. Current sites will need to provide users with an enhanced TV experience, the "10-foot UI" (user interface). There is also a Google TV Web Site Optimization Checklist that details technical requirements and tests. It looks like there will be versions of websites produced especially for Google TV, much like there are for mobile phones.

So the best way to optimise a video for Google TV is with a video sitemap. In the last three months there have been several posts on the Google Webmaster Central Blog about this, e.g.:

  • Video Sitemaps: Is your video part of a gallery?
  • Google Videos Best Practices
  • Video Sitemaps 101: Making your videos searchable
  • Video Sitemaps: Understanding Location Tags
  • To err is human, Video Sitemap feedback is divine!

These posts detail the technical requirements for constructing a video XML sitemap as well as setting it up in Google Webmaster Tools for your website.

Key information needed about your video includes the page where the video appears, the video title and description, a thumbnail location, the content or player location, and the duration.

For more technical information, see the post on Video Optimisation by Bruce Clay's Chee Chun Foo. And one more important point: a video sitemap will get you found and indexed, but if you want to rank for searched terms you will need some link juice and supporting content. A good first step would be to create a link from your blog to your new video.
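As a rough illustration, the sketch below writes a single-entry video sitemap carrying those fields; the URLs and values are placeholders, and the tag names follow Google's video sitemap schema.

```python
from xml.sax.saxutils import escape

# Placeholder details for one video -- replace with your own.
video = {
    "page": "https://www.example.com/videos/product-demo/",
    "title": "Product demo",
    "description": "A two-minute walkthrough of the product.",
    "thumbnail": "https://www.example.com/thumbs/product-demo.jpg",
    "content": "https://www.example.com/media/product-demo.mp4",
    "duration": 120,  # seconds
}

sitemap = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>{escape(video['page'])}</loc>
    <video:video>
      <video:thumbnail_loc>{escape(video['thumbnail'])}</video:thumbnail_loc>
      <video:title>{escape(video['title'])}</video:title>
      <video:description>{escape(video['description'])}</video:description>
      <video:content_loc>{escape(video['content'])}</video:content_loc>
      <video:duration>{video['duration']}</video:duration>
    </video:video>
  </url>
</urlset>
"""

with open("video-sitemap.xml", "w") as f:
    f.write(sitemap)
```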

If you don't have Google TV and you want a sneak preview of how some providers will present video, check out YouTube's Leanback. You will notice it is a site optimised for large screens: videos start playing immediately, are always presented full screen in HD, and can be browsed easily using the arrow keys on your keyboard.

So how will advertising work on Google TV? Google CEO Eric Schmidt thinks that advertising on Google TV will have an advantage over regular television since it can target ads the same way Google does online. Initially, ads on Google TV will be delivered either through websites or through traditional TV programming. There is also the possibility of delivering ads through Android apps optimised for Google TV, which would compete with broadcast ads running on the same screen, though this is more likely to happen further in the future.

Could Google overlay a paid advert on a carefully crafted and expensive viral series of videos? Looking at YouTube's Leanback, there are Ads by Google that appear in a pop-up at the bottom of the screen.

Google TV has just arrived, and it's going to be interesting to see how well consumers and businesses adopt this new technology – the merging of TV, the Internet and on-demand media.

Using Proper Keyword Density
https://www.bruceclay.com/blog/using-proper-keyword-density/ – Wed, 20 Oct 2010


The goal of search engine optimization is to create web pages that will rank at the top of search engine results for high-volume, converting keywords. Using proper keyword density is one of the factors that search engines look at to see how relevant your site is for a searched term. The formula to calculate keyword density is (NKW / TW) * 100, where NKW is the number of times a specific keyword is repeated in a page of TW (total words).
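A minimal sketch of that calculation, with deliberately naive tokenisation (the sample text is far shorter than a real page, so the percentage is only illustrative):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword density as a percentage: (NKW / TW) * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total_words = len(words)
    keyword_parts = keyword.lower().split()
    # Count occurrences of the (possibly multi-word) keyword in the word stream.
    hits = sum(
        words[i:i + len(keyword_parts)] == keyword_parts
        for i in range(total_words - len(keyword_parts) + 1)
    )
    return 100.0 * hits / total_words if total_words else 0.0

# Toy example; a real page would have far more words and a much lower density.
page_copy = "Keyword density matters, but readable copy matters more than keyword density alone."
print(round(keyword_density(page_copy, "keyword density"), 1))
```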

Optimizing

When optimizing a page, one must ensure that the page's main keyword appears in the following HTML elements and at the optimal keyword density (a simple placement check is sketched after the list below).

  • Title Tag: The title tag should appear first in your HTML's head section. Main keywords should appear here.
  • Meta Description Tag: This is often what search engines will display in a listing and should contain all the keywords that appear in your title.
  • Meta Keywords Tag: This tag is no longer used by the main search engines as it once was, due to the spam technique of keyword stuffing.
  • Heading Tags: Heading tags, e.g. H1 and H2, are important to search engines as they contain the ideas your document is about. Make sure you have at least an H1 on every page in your site and that it contains the appropriate text.
  • Body Content: There are various guides on how many words should appear in your body. SEOBook suggests varying word count from page to page to make your site appear natural to search engines. At Bruce Clay we consider the first 200 words of higher importance than the rest of the body text. In the end, the best way to determine the figure is to run a Multi-Page Analyzer on the competitors that appear first in the SERP (explained below).
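The placement check mentioned above might look something like the sketch below, which uses Python's standard HTML parser to test whether a keyword appears in the title, meta description, first heading and the first 200 words of body text. The sample page is invented.

```python
from html.parser import HTMLParser

class PlacementChecker(HTMLParser):
    """Collect the title, meta description, first H1 and body text from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1 = ""
        self.body_words = []
        self._current = None
        self._in_body = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "body":
            self._in_body = True
        if tag == "title" or (tag == "h1" and not self.h1):
            self._current = tag
        elif tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data
        if self._in_body:
            self.body_words.extend(data.split())

def placement_report(html: str, keyword: str) -> dict:
    p = PlacementChecker()
    p.feed(html)
    kw = keyword.lower()
    first_200 = " ".join(p.body_words[:200]).lower()
    return {
        "title": kw in p.title.lower(),
        "meta description": kw in p.description.lower(),
        "h1": kw in p.h1.lower(),
        "first 200 words": kw in first_200,
    }

sample = (
    "<html><head><title>Blue widgets | Example</title>"
    '<meta name="description" content="Hand-made blue widgets, shipped worldwide.">'
    "</head><body><h1>Blue widgets</h1>"
    "<p>Our blue widgets are built to last and ship worldwide.</p></body></html>"
)
print(placement_report(sample, "blue widgets"))
```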

Overloading your page with a targeted keyword may be considered spam by search engines. Keyword stuffing was a technique used in the past to obtain good rankings. Measuring keyword density with tools such as WeBuildPages or the Bruce Clay SEOToolSet keyword density analyzer will allow you to minimize the over-repetition of non-targeted keywords, while ensuring you have the proper density of target keywords and are not loading your page with irrelevant ones.

There are recommended rules of thumb for optimal keyword density, but the best method to obtain this figure is to run an analysis of the top positions in the SERP for a given keyword and see what Google treats as the best figure. The Bruce Clay SEOToolSet offers a Multi-Page Analyzer that a user can subscribe to.

Keyword Density

Nowadays, keyword density is a less important variable for search engine rankings, but there are still words you have to include within your content, based on competitor research. For example, you might find that your competitors are using main topic keywords more than others. Or you may find that they are using sub-keywords a certain way, and you may want to do something similar based on that analysis. Either way, it's important to consider how words are being used on the page and how best to beat those competitors if you want to obtain higher rankings.

Optimize your SEO strategy: unlock higher rankings by mastering keyword density — discover how proper keyword usage boosts relevance and competitiveness in search engine results. Contact us.

FAQ: How can I effectively use keyword density to improve my website’s SEO?

Search Engine Optimization (SEO) is key to the success of any website in today's digital environment. Keyword density plays a pivotal role in SEO and can directly affect how your pages appear in search engine results. In this article, we'll show how using keyword density well can help boost SEO rankings on any site.

Step-by-Step Process for Effectively Using Keyword Density to Improve SEO

  1. Understanding Keyword Density:

Before diving into the practical aspects, let's clarify keyword density. Keyword density refers to the ratio of occurrences of a key phrase or keyword to the total words on a page, usually expressed as a percentage.

  2. Quality Over Quantity:

While stuffing your content with keywords is tempting, that approach is outdated. Even where keyword stuffing does not draw a penalty, Google will most likely discount the page and it will not perform well. Focus on using keywords naturally, ensuring they add value to your content and keep it reader-friendly.

  3. Research and Select Relevant Keywords:

Utilize keyword research tools like Google Keyword Planner or SEMrush to quickly identify the most pertinent phrases or keywords for your content.

  4. Strategic Placement:

Place your selected keywords strategically throughout your content. They should naturally fit into your text and enhance the reader’s experience. Include them in titles, headings, and the meta description.

  5. Keyword Density Formula:

Calculate keyword density using the formula:

Keyword Density = (Number of keyword occurrences / Total words) x 100

For example, if your page has 500 words and you’ve used your target keyword 10 times, the keyword density is (10 / 500) x 100 = 2%.

  6. Ideal Keyword Density:

While there is no universal ideal keyword density, a safe range is generally between 1% and 3%. However, this can vary depending on your niche and competition.

  7. Monitor and Adjust:

Regularly check your content’s performance using tools like Google Analytics. If your rankings need to be improved, consider adjusting your keyword density, but do so sparingly.

  8. User Experience Matters:

Remember that user experience is paramount. Quality content that keeps visitors engaged will lead to better SEO results, so don’t focus solely on keyword density.

  9. Use Synonyms and Variations:

Rather than overusing a single keyword, incorporate synonyms and variations to make your content more diverse and engaging.

  10. Meta Tags and Alt Text:

Remember to optimize your meta tags and alt text for images with your chosen keywords. These elements play a vital role in SEO.

  11. Mobile Optimization:

Ensure your website is mobile-friendly. With the increasing use of smartphones, search engines prefer mobile-optimized sites.

  12. Backlinks and Internal Links:

Building high-quality backlinks and incorporating internal links can improve your website’s SEO. Use keywords in anchor text where relevant.

  13. Regular Updates:

Keep your content fresh and up-to-date to maintain its relevance and improve SEO over time.

  14. Avoid Duplicate Content:

Search engines penalize duplicate content, so make sure your content is unique and not copied from other sources.

  15. Seek Professional Guidance:

Consider hiring an SEO agency or expert if SEO becomes too time-consuming and complex to manage independently.

Search engine optimization relies on understanding and employing keywords appropriately; however, it must also consider user experience and produce high-quality content for long-term success. Stay up to date with algorithm updates and trends to maintain high rankings on search engines.

This article was updated on November 23, 2023.

Web Accessibility and SEO
https://www.bruceclay.com/blog/web-accessibility-and-seo/ – Tue, 21 Sep 2010

Web accessibility is the practice of making websites usable and accessible by people of all abilities and disabilities. The Web Content Accessibility Guidelines (WCAG) 1.0 were published by the World Wide Web Consortium (W3C) in 1999 in an effort to make web content accessible to people with disabilities. The various categories of people that web accessibility addresses are:

  • Visual: Low vision and blindness
  • Motor/Mobility: Difficulty using the hands or muscles
  • Auditory: Hearing impairments
  • Seizures: Seizures caused by visual strobe or flashing effects
  • Cognitive/Intellectual

According to a research report commissioned by Microsoft and conducted by Forrester Research:

  • 40% (51.6 million) of computer users are likely to benefit from the use of accessible technology due to experiencing mild difficulties or impairments.
  • 17% (22.6 million) of computer users are very likely to benefit from the use of accessible technology due to experiencing severe difficulties or impairments.
  • 43% (56.2 million) of computer users are not likely to benefit from the use of accessible technology due to experiencing no or minimal difficulties or impairments.

So, somewhere in the region of 20% to 40% of computer users could benefit from accessible technology, which should interest companies in the private sector who wish to reach this untapped market share and get more traffic from Google. There are many websites that don't meet accessibility guidelines, and the only real push seems to be coming from government.

In Australia the government has endorsed the Web Content Accessibility Guidelines (WCAG) version 2.0 for all government websites. Under the Disability Discrimination Act 1992 agencies must ensure that people with disabilities have the same fundamental rights to access information as the rest of the community.

So it would make sense that Google, in its drive to push the development of the web and provide an enriched user experience, would reward sites that are accessible. If we look at Google's Webmaster Guidelines, they describe best practices that help Google find, index and rank your site. We can see that the recommendations in the Design and Content Guidelines and the Technical Guidelines closely match the W3C Web Content Accessibility Guidelines.

Designing a website with accessibility in mind not only helps people who are blind or disabled but also provides a coding standard that is important for SEO.

What's Up at Google I/O
https://www.bruceclay.com/blog/whats-up-at-google-io/ – Wed, 26 May 2010

Last week in San Francisco was Google I/O, the company's largest developer conference of the year, featuring two days of technical content covering Android, Google Chrome, Google APIs, GWT, App Engine, open web technologies and more.

The conference's biggest announcement was the introduction of Google TV, a software platform that hardware makers Logitech, Sony and DISH will support. It will combine the web with TV and work just like Google search. It will be compatible with your existing cable/satellite box infrastructure, and Android applications will be accessible on your television. It is built on Android 2.1, will run Google Chrome as the browser and will support Flash; and when Google says Flash, they mean videos and games.


Photo by Warrantedarrest via Creative Commons

Google has noted that the TV market is huge, with 4 billion users worldwide, which vastly exceeds the 1 billion computer users and 2 billion mobile users in the world. They estimate that annual ad spend on US television alone is 70 billion dollars.

The arrival of Google TV in the marketplace has been welcomed, and some see the competition as a positive kick in the butt for companies like Apple and their Apple TV product.

That brings us to another Google versus Apple angle: Google's potential threat to iTunes through its acquisition of Simplify Media, which was also announced at Google I/O.

Until Google's acquisition a couple of months ago, Simplify Media offered a free software application for PC and Mac that let users stream music from their Winamp or iTunes libraries over the internet, from their home computer to other devices they own.

At the conference, Google Vice President of Engineering Vic Gundotra said Google would offer a desktop app incorporating Simplify's technology and would build the receiving technology into a future version of the Android OS.

Gundotra also took the opportunity to show a new version of the Android Market with a link to download songs. Watch out, iTunes.

Google also launched a public version of its online collaboration tool, Google Wave. Wave was first released last year, but only to invitation-only test groups, whose usage Google product managers observed.

Wave is often compared to Microsoft Office 2010 and Hotmail. Google won’t openly challenge Microsoft applications, but the free Wave is meant to be used instead of Microsoft’s communication and collaboration tools.

Critics have said that last year's version was too hard to figure out, and that it was unnerving for users to watch their colleagues type their thoughts and then erase or edit them.

Google also launched the WebM Project, supported by Mozilla, Opera, Adobe and forty other software and hardware vendors. It is a Google-backed project to create an open, royalty-free video format that provides high-quality video compression for use with HTML5 video without the need for Flash.

The current codec with broad acceptance is H.264, which Apple supports. It is a modern codec that is fast and light. The only problem is that it is controlled by the MPEG LA licensing consortium, which plans to charge and enforce royalties in 2015.

Google will give WebM a real push by making YouTube videos support it as well. The only question is: will Apple's Safari and Microsoft's Internet Explorer support it? Apple is pushing H.264 hard and downplaying the need to support Flash. We will see what the future holds.

From Web Developer to SEO Consultant
https://www.bruceclay.com/blog/from-web-developer-to-seo-consultant/ – Wed, 14 Apr 2010

So it all began five weeks ago. I was a web developer for 10 years and now suddenly find myself an SEO consultant at Bruce Clay. I remember, before I started, telling a former colleague in web development about my new role in SEO. I said to him, "Hey, guess who scored a new job in search engine optimisation!" His reply was, "Sweet, should be easy: just fill in the meta data tags and content, do some link building and you're set." Now, over a month into the job, I think those remarks were kind of naive.


Photo by Armando Maynez via Creative Commons

So what’s involved? As a co-worker once said, “The initiation is a baptism of fire.” Training, company methodologies, tight deadlines, custom toolsets, the list goes on. Working here has probably been one of the most focused environments I have ever experienced in IT.

So there's more to SEO than just filling out meta data, writing content and link building. Probably the biggest difference I have experienced between this career and my past role is the myriad of research methodologies involved. So far I have done keyword research, website assessment, competitive analysis and link profiling, employing a comprehensive list of tools I never really used as a web developer.

I understand HTML, CSS, JavaScript and SQL, with a dash of web server administration. Added to this tech vocabulary from SEO come PageRank, backlinks, physical/virtual siloing and universal search, plus racking your brains trying to figure out how Google's search algorithm works. The goals in SEO are different from those in web development. As a web developer I was concerned with clean, accessible code that conformed to W3C guidelines, and I focused on efficient, well-designed programming. Well-validated code is still important in SEO; however, some accessibility habits go out the window, as many of the coding techniques I used in design violate SEO principles or are considered spam, like hiding text with CSS so that it can be seen only by screen readers and not by regular users. Website structure and navigation have to be totally rethought. I have discovered that a lot of sites on the web are just not SEO-friendly.

In the past my main tools as a coder were Dreamweaver, Aptana and Photoshop. Now they are primarily Excel, the Bruce Clay SEOToolSet and a host of other SEO web tools. So, great, I don't have to code anymore, but my focus now is to become an Excel guru, a statistician and a master of SEO principles. Wish me luck as I make my mark in SEO!
