SEO

In the mid-1990s, as the first search engines were cataloging the Internet, webmasters and content providers began optimizing web sites for search engine results. In the beginning, webmasters would simply submit the URL address of a page to the various engines. The search engine would then send out a program called a "spider" or "bot" to "crawl" that page, extract links to other pages from it, and return that information to be indexed. The search engine would download the page and store it on its server, where another program, an indexer, would extract information about the page: the words it contained and their locations, the density of specific words, and all the links the page contained. The page was then placed in a scheduler to be crawled again at some later date.
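
That early crawl-and-index cycle was simple enough to sketch. Below is a toy illustration in Python, not any real engine's code, of the loop described above: pull the links out of a page so they can be scheduled for crawling, and record the words the page contains along with simple word-density counts.

```python
import re
from collections import Counter
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href targets, the way an early spider gathered pages to visit next."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def index_page(html):
    """Extract outbound links, then record simple word counts from the page text."""
    extractor = LinkExtractor()
    extractor.feed(html)
    text = re.sub(r"<[^>]+>", " ", html).lower()   # crude tag stripping
    words = re.findall(r"[a-z']+", text)
    return {
        "links": extractor.links,        # queued for crawling at a later date
        "word_density": Counter(words),  # how often each word appears on the page
    }


page = '<html><body><h1>Beach rentals</h1><a href="/rates.html">Rates and availability</a></body></html>'
print(index_page(page))
```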

It wasn't long before webmasters and site owners realized this was a gold mine waiting to be prospected for clients. Recognizing the value of having their sites rank highly and be visible in search engine results spawned the creation of an entirely new industry, Search Engine Optimization (SEO). Mr. Browning, our founder and CEO, was Director of IT for a major national real estate franchise on South Padre Island. He took on the job of optimizing the company's website and building one of the first in-house, on-line reservation systems on South Padre Island. In about a year, the on-line reservation system went from producing 8% of the company's total reservations to well over 60%, just by getting the site listed in the top 5 results on the search engines of the day, such as Excite, Alta Vista, HotBot, Overture, Yahoo, and Lycos. Google was just a startup at the time.

Early SEO was basically the Wild West, and a whole barrage of unscrupulous tactics was used to gain a listing. The search engines relied heavily on site-provided information such as the keyword Meta tags, or index files in engines like ALIWEB. Soon marketers started marketing companies and convinced site owners to use their schemes to increase site "hits," or visitors. The Meta data they supplied for indexing was often unreliable and a potentially inaccurate representation of a site's actual content, and inaccurate, inconsistent Meta tags caused pages to rank for irrelevant searches. By relying so heavily on factors such as keyword density, which was exclusively within the marketer's control, early search engines suffered from abuse and ranking manipulation.

To provide better results to their users, search engines were forced to adapt so that their search engine result pages (SERPs) showed the most relevant results instead of unrelated pages stuffed with keywords by unscrupulous marketers trying to falsely claim more hits/visitors. Since a search engine's success and popularity are determined by its ability to produce the results most relevant to a given search term, poor-quality or irrelevant results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms that took into account additional factors that were more difficult for marketers to manipulate. By the late 1990s, search engine designers recognized that marketers were working to rank well in their engines, and that some were posing as webmasters and manipulating rankings by stuffing pages with excessive and even irrelevant keywords. Early search engines such as Alta Vista and InfoSeek responded by adjusting their algorithms in an effort to prevent these marketer/webmasters from manipulating rankings.

In today's world, companies that employ overly aggressive techniques can get their client websites banned from the search results. In fact, a company named Traffic Power allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued the blogger behind SEO Book for writing about the ban, and Google later confirmed that it had in fact banned Traffic Power and some of its clients. Having seen this type of behavior first hand, Devrod is committed to using only legitimate, search-engine-suggested procedures for optimizing the sites it builds. That's why we have survived the days of the marketer gunslingers.

Search Engine Optimization

Search Engine Optimization (SEO) is the process of affecting the visibility of a website or a web page in a web search engine's unpaid results, often referred to as "organic" search results. In general, the higher a site is ranked on the search results page and the more frequently it appears in the results list, the more visitors it will receive from the search engine's users, and those visitors can be converted into customers. SEO can target different markets; for a small local business, ranking in the "local" search for its industry-specific terms is usually the best tactic to achieve its goal. As an Internet marketing strategy, Devrod considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by our client's targeted audience. Optimizing a website involves proper content and HTML, along with associated coding, that both increases its relevance to specific keywords and removes barriers to indexing by search engines. Another SEO tactic is to promote a site to increase the number of backlinks, or inbound links. As of 2015, mobile search has finally surpassed desktop search; Google is developing and pushing mobile search as the future in all of its products, many companies are beginning to take a different approach to their Internet strategies, and we strongly encourage our clients to do the same.
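
As a small illustration of the HTML side of on-page optimization, here is a minimal Python sketch, using a hypothetical sample page, of how a crawler reads the two head elements most often tuned for relevance: the title tag and the Meta description.

```python
from html.parser import HTMLParser


class HeadInspector(HTMLParser):
    """Collects the page title and meta description while parsing."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


# Hypothetical page, for illustration only.
sample_page = """
<html><head>
  <title>South Padre Island Beachfront Rentals | Example Rentals</title>
  <meta name="description" content="Book beachfront condos on South Padre Island online.">
</head><body>...</body></html>
"""

inspector = HeadInspector()
inspector.feed(sample_page)
print("Title:", inspector.title.strip())
print("Description:", inspector.description)
```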

Getting Indexed

The leading search engines use "crawlers" to find pages for their algorithmic search results, so it is not necessary to submit a site to them; pages get indexed because the crawlers find them automatically by following links. However, the two major directories, the Yahoo Directory and DMOZ, both require a manual submission and use human editorial review. Because of the human review, directories are sometimes thought to be more accurate. Yahoo charges a pretty hefty fee, while DMOZ is free. Google offers a Search Console where an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Crawlers look at a number of different factors when crawling a site, and not every page is indexed by the search engines; the distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled. You can keep undesirable content that might harm your ranking out of the search index. A standard robots.txt file, placed in the root directory of your domain, can instruct spiders not to crawl certain files or directories. In addition, a page can be explicitly excluded from a search engine's database by using a Meta tag that specifically instructs the bots not to index that page. When a search engine visits a site, the robots text file located in the root/main directory is the first file read; it is parsed, and it tells the bot which pages are not to be crawled. A crawler sometimes keeps a copy of this file. Pages typically prevented from being crawled are login-specific pages such as shopping carts, and user-specific content such as results from an internal search. Google warns webmasters to prevent indexing of internal search results because those pages are considered search "spam" that will devalue your site and harm your ranking.
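
Here is a minimal sketch, using Python's standard urllib.robotparser and a hypothetical set of rules, of how a well-behaved crawler consults robots.txt before fetching a page; the Meta-tag alternative for excluding a single page is noted in a comment.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content a site might serve from its root directory,
# telling all bots to stay out of the shopping cart and internal search results.
robots = RobotFileParser()
robots.parse([
    "User-agent: *",
    "Disallow: /cart/",
    "Disallow: /search/",
])

# A crawler checks each URL against the parsed rules before fetching it.
print(robots.can_fetch("*", "https://www.example.com/rentals/beachfront"))  # True
print(robots.can_fetch("*", "https://www.example.com/search/?q=condo"))     # False

# To exclude a single page rather than a whole directory, the page itself can
# carry a robots Meta tag in its <head>:
#
#   <meta name="robots" content="noindex, nofollow">
```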

Increasing Prominence

A variety of methods can increase the prominence/ranking of a web page within the search engine result pages (SERPs). Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site; this is one benefit of including a blog in your site. Adding relevant keywords to a web page's Meta data, including the title tag and Meta description, may or may not improve the relevancy of the site's search listings and thus increase traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element or 301 redirects, may or may not help make sure that links to different versions of the URL count towards the page's link popularity score. We say "may or may not" because search engines are constantly changing their algorithms in an attempt to foil any marketer that may have figured out their secrets, not to mention that search engines have become marketers themselves and want to keep the advantage in their favor.
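
To make the URL-normalization idea concrete, here is a minimal Python sketch, with hypothetical hostnames and paths, of a server answering requests for non-canonical URLs with a 301 permanent redirect to the canonical version, and serving the canonical link element on the page itself.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "https://www.example.com"

# Non-canonical paths mapped to the one URL we want search engines to credit.
REDIRECTS = {
    "/index.html": "/",
    "/Rentals": "/rentals/",
    "/rentals": "/rentals/",
}


class CanonicalRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 tells crawlers the move is permanent, so link popularity
            # can be consolidated onto the canonical URL.
            self.send_response(301)
            self.send_header("Location", CANONICAL_HOST + target)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            # The canonical link element is the in-page alternative to a redirect.
            self.wfile.write(
                b'<html><head><link rel="canonical" '
                b'href="https://www.example.com/rentals/"></head></html>'
            )


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CanonicalRedirectHandler).serve_forever()
```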

SEO Techniques

SEO techniques can be classified into two broad categories. "White Hat" techniques are what search engines recommend and are part of good design; they are techniques of which the search engines approve. "Black Hat" techniques anticipate that the sites using them may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing, so they are not worth any more discussion here. White Hat techniques tend to produce results that are long term, and Devrod uses only White Hat optimization. We've spent an entire lifetime building a reputation and refuse to put it at risk. An SEO technique is considered White Hat if it conforms to the search engine's guidelines and involves no deception. A search engine's guidelines are not written as a set of A-B-Cs, or "yes you can" and "no you can't," and this is a very important distinction. White Hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the content a user wants to see. White Hat advice is generally summed up as creating content for users, not for the search engine's algorithm, and then making that content easily accessible to the spiders, rather than dishonestly attempting to divert the algorithm from its intended purpose. White Hat SEO is in many ways similar to web development that promotes accessibility, although the two are not exactly the same. And then there are the Grey Hats. In our opinion, Grey Hats are just another shade of Black Hat that attempts to justify deceptive behavior.

Search Engine Submission

Submission is a process in which one submits a website directly to a search engine. While search engine submission is sometimes presented as a way to promote a website, it generally is not necessary; search engines use web crawlers that will eventually find most web sites on the Internet without assistance. There are two reasons to submit a web site or web page to a search engine: to add an entirely new web site without waiting for a search engine to discover it, or to have a web site's record updated after a substantial site update. Some search engine submission software not only submits websites to multiple search engines, but also adds links to those websites from its own pages. This could appear helpful in increasing a website's ranking, because external links are an important factor in determining a website's ranking. However, Google has stated that this "can lead to a tremendous number of unnatural links for your site" with a negative impact on site ranking.
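
For sites that do want to submit rather than wait, the usual route today is an XML Sitemap submitted through Google Search Console. Here is a minimal sketch in Python, using a hypothetical list of page URLs, of generating such a file.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages to announce to the search engine.
pages = [
    "https://www.example.com/",
    "https://www.example.com/rentals/",
    "https://www.example.com/contact/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "changefreq").text = "weekly"  # a hint to crawlers, not a command

# Write the file that would be uploaded or referenced in Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(open("sitemap.xml").read())
```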

As a Marketing Strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies, such as paid advertising through pay-per-click (PPC) campaigns, can be more effective, depending on the site operator's goals. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public, which shows a shift in its focus towards "usefulness" and mobile search.

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day. It is considered wise business practice for website operators to liberate themselves from total dependence on search engine traffic.
