Wednesday, October 2, 2019
How People Interact With Search Engines
A search engine is a web-based tool that enables users to locate information on the World Wide Web. Popular examples of search engines are Google, Yahoo!, and MSN Search. Search engines utilize automated software applications (referred to as robots, bots, or spiders) that travel along the Web, following links from page to page and site to site. The information gathered by the spiders is used to create a searchable index of the Web. When people use the term search engine in relation to the Web, they are usually referring to the actual search forms that search through databases of HTML documents, initially gathered by a robot. There are basically three types of search engines: those that are powered by robots (called crawlers, ants, or spiders), those that are powered by human submissions, and those that are a hybrid of the two.

HOW DO SEARCH ENGINES WORK?

Every search engine uses different complex mathematical formulas to generate search results. The results for a specific query are then displayed on the SERP (search engine results page). Search engine algorithms take the key elements of a web page, including the page title, content, and keyword density, and come up with a ranking for where to place the results on the pages. Each search engine's algorithm is unique, so a top ranking on Yahoo! does not guarantee a prominent ranking on Google, and vice versa. To make things more complicated, the algorithms used by search engines are not only closely guarded secrets, they are also constantly undergoing modification and revision. This means that the criteria to best optimize a site must be surmised through observation, as well as trial and error, and not just once, but continuously.

Gimmicks that less reputable SEO firms tout as the answer to better site rankings may work, at best, for only a short period before the search engines' developers become wise to the tactics and change their algorithm. More likely, sites using these tricks will be labeled as spam by the search engines and their rankings will plummet.

Graphics and animation mean nothing to search engines, but the actual text on your pages does. It is difficult to build a Flash site that is as friendly to search engines; as a result, Flash sites will tend not to rank as high as sites developed with well-coded HTML and CSS (Cascading Style Sheets, a mechanism for adding styles to website pages above and beyond regular HTML). If the terms you want to be found by do not appear in the text of your website, it will be very difficult for your website to yield high placement in the SERPs.

Crawler-based search engines are those that use automated software agents (called crawlers) that visit a Web site, read the information on the actual site, read the site's meta tags, and also follow the links that the site connects to, performing indexing on all linked Web sites as well. The crawler returns all that information to a central depository, where the data is indexed. The crawler will periodically return to the sites to check for any information that has changed; the frequency with which this happens is determined by the administrators of the search engine. Human-powered search engines rely on humans to submit information that is subsequently indexed and catalogued; only information that is submitted is put into the index.

In both cases, when you query a search engine to locate information, you are actually searching through the index that the search engine has created, not the Web itself. These indices are giant databases of information that is collected, stored, and subsequently searched. This explains why a search on a commercial search engine, such as Yahoo! or Google, will sometimes return results that are, in fact, dead links.
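The crawl-then-index process described above can be sketched in a few lines. This is a toy illustration, not how any real search engine is built: the "web" here is just an in-memory dictionary of made-up pages, and the index maps each word to the pages that contain it.

```python
from collections import deque

# A toy "web": each URL maps to (page text, list of outbound links).
# All URLs and page text here are hypothetical, for illustration only.
TOY_WEB = {
    "/home":    ("welcome to our search engine optimization guide", ["/seo", "/contact"]),
    "/seo":     ("search engine optimization tips and keyword research", ["/home"]),
    "/contact": ("contact the team", []),
}

def crawl(start_url, web):
    """Breadth-first crawl: visit a page, index its words, follow its links."""
    index = {}                      # word -> set of URLs containing it
    seen, queue = {start_url}, deque([start_url])
    while queue:
        url = queue.popleft()
        text, links = web[url]
        for word in text.split():   # index every word on the page
            index.setdefault(word, set()).add(url)
        for link in links:          # follow links to pages not yet seen
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("/home", TOY_WEB)
# A query searches the index, not the "web" itself:
print(sorted(index["search"]))      # -> ['/home', '/seo']
```

Note that the final lookup never touches `TOY_WEB`; it reads only the index built earlier, which is exactly why a real engine can return dead links when a page changes after it was crawled.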
Since the search results are based on the index, if the index hasn't been updated since a Web page became invalid, the search engine treats the page as still an active link even though it no longer is. It will remain that way until the index is updated.

So why will the same search on different search engines produce different results? Part of the answer is that not all indices are going to be exactly the same; it depends on what the spiders find or what the humans submitted. But more important, not every search engine uses the same algorithm to search through the indices. The algorithm is what the search engines use to determine the relevance of the information in the index to what the user is searching for.

One of the elements that a search engine algorithm scans for is the frequency and location of keywords on a Web page. Those with higher frequency are typically considered more relevant. But search engine technology is becoming more sophisticated in its attempt to discourage what is known as keyword stuffing, or spamdexing.

Another common element that algorithms analyze is the way that pages link to other pages on the Web. By analyzing how pages link to each other, an engine can both determine what a page is about (if the keywords of the linked pages are similar to the keywords on the original page) and whether that page is considered important and deserving of a boost in ranking. Just as the technology is becoming increasingly sophisticated at ignoring keyword stuffing, it is also becoming more savvy to Web masters who build artificial links into their sites in order to build an artificial ranking.
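The link-analysis idea can be sketched with a simplified, PageRank-style scoring loop: a page is important if important pages link to it. This is a minimal sketch under our own assumptions (the damping factor, iteration count, and link graph are illustrative), not the algorithm any particular engine uses.

```python
def link_scores(links, damping=0.85, iterations=50):
    """Simplified link-analysis scoring: a page is important if
    important pages link to it. `links` maps page -> pages it links to."""
    pages = list(links)
    n = len(pages)
    score = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:                             # split score among out-links
                share = damping * score[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:                                # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * score[page] / n
        score = new
    return score

# Hypothetical link graph: pages A and C both link to B,
# so B should come out as the most important page.
graph = {"A": ["B"], "C": ["B"], "B": ["A"]}
scores = link_scores(graph)
print(max(scores, key=scores.get))  # -> B
```

Building artificial links to game a scheme like this is exactly the tactic the text says engines have grown savvy to; real systems weight links by the quality of the linking site rather than treating every link equally.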
SEARCH ENGINE OPTIMIZATION

Search engine optimization (SEO) is the process of affecting the visibility of a website or a webpage in a search engine's natural or un-paid (organic) search results. In general, the earlier (or higher ranked on the search results page) and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, news search, and industry-specific vertical search engines.

As an internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by the targeted audience. Optimizing a website may involve editing its content, HTML, and associated coding both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of back links, or inbound links, is another SEO tactic.

HOW PEOPLE INTERACT WITH SEARCH ENGINES

We like to say, "Build for users, not search engines." When users have a bad experience at your site, when they can't accomplish a task or find what they were looking for, this often correlates with poor search engine performance. On the other hand, when users are happy with your website, a positive experience is created, both with the search engine and with the site providing the information or result.

What are users looking for? There are three types of search queries users generally perform:

Do (transactional queries): action queries such as "buy a plane ticket" or "listen to a song."
Know (informational queries): when a user seeks information, such as the name of a band or the best restaurant in New York City.
Go (navigational queries): search queries that seek a particular online destination, such as Facebook or the homepage of the NFL.
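The Do/Know/Go taxonomy above can be made concrete with a deliberately naive classifier. The trigger-word lists and example queries here are our own assumptions for illustration; real engines infer intent from far richer signals than word matching.

```python
# Deliberately naive illustration of the Do / Know / Go query taxonomy.
# The trigger-word sets are hypothetical, not drawn from any real engine.
TRANSACTIONAL = {"buy", "order", "download", "listen", "book"}
NAVIGATIONAL = {"facebook", "nfl", "homepage", "login", "youtube"}

def classify_query(query):
    """Guess the intent behind a query from simple word matching."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "Do (transactional)"
    if words & NAVIGATIONAL:
        return "Go (navigational)"
    return "Know (informational)"   # default: the user wants information

print(classify_query("buy a plane ticket"))      # -> Do (transactional)
print(classify_query("facebook"))                # -> Go (navigational)
print(classify_query("best restaurant in nyc"))  # -> Know (informational)
```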
When visitors type a query into a search box and land on your site, will they be satisfied with what they find? This is the primary question search engines try to figure out millions of times per day. The search engines' primary responsibility is to serve relevant results to their users. It all starts with the words typed into a small box.

KEYWORD RESEARCH

It all begins with words typed into a search box. Keyword research is one of the most important, valuable, and high-return activities in the search marketing field. Ranking for the right keywords can make or break your website. Through the detective work of puzzling out your market's keyword demand, you not only learn which terms and phrases to target with SEO, but also learn more about your customers as a whole.

It's not always about getting visitors to your site, but about getting the right kind of visitors. The usefulness of this intelligence cannot be overstated: with keyword research you can predict shifts in demand, respond to changing market conditions, and produce the products, services, and content that web searchers are already actively seeking. In the history of marketing, there has never been such a low barrier to entry in understanding the motivations of consumers in virtually every niche.
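The basic mechanics of sizing up keyword demand can be sketched as counting how often related queries are searched. The query log below is entirely hypothetical; in practice you would use keyword tools with real search-volume data rather than a hand-made list.

```python
from collections import Counter

# Hypothetical search-query log, for illustration only. Real keyword
# research relies on tools that report actual search volumes.
query_log = [
    "running shoes", "buy running shoes", "running shoes for flat feet",
    "trail running shoes", "running shoes", "marathon training plan",
]

def keyword_demand(log, seed):
    """Count how often each query containing the seed phrase was searched."""
    return Counter(q for q in log if seed in q)

demand = keyword_demand(query_log, "running shoes")
for query, count in demand.most_common():
    print(f"{count}x  {query}")
```

Even on toy data, the longer variants ("running shoes for flat feet", "trail running shoes") hint at what the paragraph above calls learning about your customers: the qualifiers people add to a head term reveal what they actually want.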