
This is just a single site in the huge expanse that is the Internet. Yet, rather than sputtering to a slow, painful death, the Internet collects and organizes this information promptly and logically. The primary method developed to categorize the Internet has been the search engine. After doing some research, I found that search engines have actually been around for quite a long time.
I found out that one of the first search engines was called “Archie,” and that it was launched in 1990 by Alan Emtage, a student at McGill University.
In later years, other search engines were launched that are what some may consider regular search engines: AltaVista, WebCrawler, AskJeeves, and Excite. These systems worked very well in creating a portal to the hundreds of thousands of websites you may have been looking for. In the early days of Internet search, there was a very simple formula for cataloging results: search engines would rank web pages based on the frequency of the keyword you searched for on each page.
Therefore, if you entered the keyword “dog,” the search engine would return results ordered so that the page with the highest frequency of the word “dog” showed up first, and so on. This worked pretty well at the beginning and was one of the things that Yahoo! did incredibly well, helping it gain a large share of the search market. However, as the Internet steadily grew, people needed better ways to organize search results so that information could be found more quickly and easily. That is when Google developed its unbelievably brilliant, yet simple, formula. Larry Page and Sergey Brin, the founders of Google, developed an algorithm known as PageRank, aptly named after Larry Page.
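As a rough sketch, the early frequency-based approach amounts to counting keyword occurrences and sorting. The page names and texts below are purely illustrative, not any real engine's data:

```python
# Minimal sketch of early frequency-based ranking: pages are ordered
# by how many times the query keyword appears in their text.

def frequency_rank(pages, keyword):
    """Return page names sorted by keyword frequency, highest first."""
    keyword = keyword.lower()
    scored = []
    for name, text in pages.items():
        count = text.lower().split().count(keyword)
        scored.append((count, name))
    scored.sort(reverse=True)
    return [name for _, name in scored]

pages = {
    "kennel-club": "dog dog dog breeds and dog shows",
    "pet-store": "dog food and cat food",
    "news-site": "local election results",
}

print(frequency_rank(pages, "dog"))  # kennel-club first, news-site last
```

The obvious weakness, which the essay gets at, is that a page can game this ranking simply by repeating a keyword many times.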
I believe in them and you should too.” So, the more websites that reference your website, the more people seem to agree that your information is reliable, and thus the better your rating should be when it comes up in a search engine. This is actually an old concept: it is very similar to the citations used in books and scholarly documents. If a lot of other sources cite a particular source, it is generally accepted as something pertinent to that topic.
Think about physics. Tons of studies cite evidence from Albert Einstein or Stephen Hawking, because these guys discovered information that is at the core of the subject. Google works in a very similar way: it uses the entire Internet to rank its catalog of billions of web pages. But just as Google improved upon a system that people thought was already good enough, I think there is potential for improvement in the future.
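To make the citation analogy concrete, here is a minimal power-iteration sketch of the PageRank idea. The tiny link graph, the damping factor of 0.85, and the iteration count are my own illustrative assumptions, not Google's actual parameters:

```python
# Minimal PageRank sketch: each page spreads its score among the pages
# it links to, so a page with many incoming "citations" accumulates a
# higher score, just like a heavily cited source in scholarship.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # ignore dangling pages in this toy version
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Three sites all "cite" one page by linking to it.
links = {
    "einstein.example": ["blog.example"],
    "blog.example": ["einstein.example"],
    "forum.example": ["einstein.example"],
    "news.example": ["einstein.example"],
}
ranks = pagerank(links)
# The heavily cited page ends up with the highest score.
```

Note that this is the "passive" voting the essay describes: no user ever clicks anything, yet every hyperlink on the web acts as a vote.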
Testing for Realism:
- Is it well accepted by a particular target demographic?
However, in the late 1990s, when AOL was running the Internet, people would have laughed at the idea that in 10 years they would be using something completely different, especially a thing called Google. But the reality is, Google came out with a better product that gave people what they needed faster. Slowly but surely, people started to move away from their older technology and jump into something that truly got the job done. I feel the same can be true of this sort of technology.
It simply makes for a more accurate search engine when you combine the passive and active roles of Internet users. That is one demographic that I can see switching over to this sort of search engine: the masses who want to use something better. That is a tremendous number of people. Yet, consider further that there are on average 353,987 new Internet users per day. This may include people in developing countries using the Internet for the first time, young people who have never used the Internet, or older people who haven’t touched the darn thing their whole lives.
These numbers are courtesy of Google Answers (Source). These people, although they probably know about Google and Yahoo!, are not entrenched in their Internet habits, and if there is a better and more efficient technology, it’s likely that they will opt for it. Slowly but surely, a superior product will gain market share. I don’t expect this to happen overnight, but rather over a couple of years.
- Does it fill a need?
I would say the Internet is almost the personification of human ingenuity, creativity, and improvement. Therefore, I feel as though we are doing the Internet and ourselves a disservice if we don’t continue to adapt and expand our possibilities.
We need to remember that we should always be striving to improve, and just because something works well now, doesn’t mean that it is going to work well indefinitely. We should always be conscious of moving forward and trying new things, because who knows what we can learn from it.
- Can it be set up by an individual, or at most a small group of individuals?
In order to get active “votes” on websites from users, in addition to the passive voting that the ranking algorithm achieves, users will have to install some sort of toolbar into their Internet browser (better yet, the search engine could be its own Internet browser). The voting interface can be set up in a number of different ways; I see three possible designs. The first would be a slider, which users drag to indicate how effective the site was in giving them what they needed. They can move the slider anywhere between 0 and 1,000: if you got exactly what you wanted, give it 1,000, but if you didn’t like the site at all, give it a 0.
These tabulations would then be averaged and combined with the passive ranking algorithm to give a middle ground of sorts. The search engine will also have to remember the keywords that you searched for and make the rating you gave unique to that keyword or string of keywords. For instance, you may search for “dog” and vote 750 on the first site that comes up, but if you search for “dog collars,” that exact same site might be a 100. The search engine will have to remember the phrase you searched for and connect your rating to it.
The second method would be a simple 1-through-10 scale, in which you select a number and submit it. This doesn’t allow for as much variation, however.
The final method would be to ask the person who was searching, “Did this website find what you were looking for?” and let people reply “Yes” or “No.” This allows for the least variation; however, it still provides people with the ability to actively vote on how well a site meets their needs.
- Can it generate income?
Essentially, these numbers provide a snapshot of the current search engine market share:
Google – $16.5 billion, or $245M per 1% of market share
Yahoo! – $7 billion, or $345M per 1% of market share
MSN/Live Search – $1.848 billion, or $462M per 1% of market share
Ask.com – $227 million, or $56M per 1% of market share

By looking at the top four companies in the Internet search engine industry, there is a mean of $277M for every 1% of market share that they capture. However, I don’t accept this as a conservative enough estimate, because as a search engine gains popularity, its advertising space becomes more attractive and thus more expensive. This is well illustrated by the gap between Ask.com and both Yahoo! and Google. It is somewhat contradicted by MSN/Live Search, but I assume that Yahoo! and Google are going through diseconomies of scale (Source), resulting in increased per-unit costs.
After constructing a graph of the billions of dollars of revenue produced by each company against the percent of market share each holds, a linear trend line can be added. The trend line is forced through the intercept (0,0), because we assume that if you don’t enter the market you don’t make any money, and that revenue then grows linearly with market share. The trend line has an equation of y = 0.2536x, which translates to roughly $253.6 million of revenue for every 1% of market share. Therefore, I would make a conservative estimate that this search engine, if able to capture 1% of the Internet search engine market, could produce revenue of $253.6 million per year.
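The trend-line slope can be reproduced with a least-squares fit through the origin (slope = Σxy / Σx²). The market-share percentages below are my own back-calculation from the revenue-per-1% figures listed above, so treat them as approximate; the fit lands very close to the quoted 0.2536:

```python
# Fit y = m*x through the origin for revenue (billions of dollars) vs.
# market share (percent), using least squares: m = sum(x*y) / sum(x*x).

# Shares back-calculated from the revenue-per-1%-of-share figures
# (e.g. Google: 16.5B / 0.245B per point ~= 67%), so approximate.
data = [
    (16.5 / 0.245, 16.5),    # Google
    (7.0 / 0.345, 7.0),      # Yahoo!
    (1.848 / 0.462, 1.848),  # MSN/Live Search
    (0.227 / 0.056, 0.227),  # Ask.com
]

slope = sum(x * y for x, y in data) / sum(x * x for x, _ in data)
print(f"y = {slope:.4f}x  ->  ${slope * 1000:.0f}M per 1% of share")
```

Because Google's data point dominates both sums, the fitted slope mostly reflects Google's revenue-per-share ratio, which is one reason to treat the $253.6M figure as a rough estimate rather than a forecast.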
- Is it marketable?
This marketing is especially effective because it is usually friends or close loved ones who refer you to a specific thing. This is how Google initially got its start, and how Web 2.0 websites like Facebook, Digg, and a slew of others got their start as well. It is a very effective way to get users to start using your product and tell others about it. Word of mouth creates a buzz, and people typically respond very well to it.
Especially when it comes to a new website, people in that demographic who are currently users of Yahoo! and Google will only move if they have explicitly been told to by someone close to them. As for the new users, they will be inspired by the buzz and follow suit.
If the Internet has taught us anything, it is that the world is a constantly changing place that is continually looking for new ways to perform efficiently and effectively. Surely there are Internet search engine giants now that control over 60% of the market (in some cases), but remember the old English adage: “the bigger they are, the harder they fall.”
No one can really predict what will come of the Internet in the next 5, 10, or 20 years, but I guarantee you that if we find a better way to do something, there’s no reason why we would stop ourselves from doing it. I believe Doogle can be the effective combination of passive and active cataloging that will make for the best Internet search experience possible.