Internet Details & Glossary
 

What is the Internet?
The Internet is a vast network that connects many independent networks spanning over 170 countries around the world. It links computers of many different types, sizes, and operating systems, and, of course, the many people in those countries who use the Internet to communicate.

The one thing all these different computers have in common is the use of the Internet Protocol, abbreviated as IP, which allows computers of different types to communicate with each other. You will often see reference to the longer abbreviation, TCP/IP, which stands for Transmission Control Protocol/Internet Protocol. Your own computer uses TCP/IP software to enable it to link to this service.
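
As an aside for the curious, here is a minimal Python sketch of what "using TCP/IP" looks like at the lowest level a program normally sees: opening a TCP connection to a web server and exchanging a few bytes. The host name is a placeholder, not a specific recommendation.

    import socket

    # Open a TCP/IP connection to a web server (port 80), send a
    # minimal HTTP request, then print the first bytes of the reply.
    with socket.create_connection(("www.example.com", 80)) as s:  # placeholder host
        s.sendall(b"GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n")
        print(s.recv(200).decode("ascii", "replace"))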

What can I do on the Internet?

The Internet Protocol makes it possible for you to communicate in various ways, find things that interest you, and exchange information and files. The most common things you can do are:

Get information on almost any subject by searching the web. It takes some skill to search efficiently, and since anyone can publish just about anything, there is lots of misinformation on the web, too. You need to develop some skill in evaluating the accuracy and reliability of the information you find.

Send and receive email, chat, or exchange messages with people all over the world. It is almost as fast as the telephone, there is never a busy signal, and you never play phone tag.

Join discussion groups about a common subject through message boards, newsgroups, and email discussion lists.

Get or exchange software and files with the File Transfer Protocol (FTP).

Explore the World Wide Web, which can use all of the above and adds easy links to other resources and multimedia--graphics, sound, and video.

Publish your own material on the web in blogs, message boards, or your own web pages.

 
 
Who Owns the Internet?
 


No organization, corporation, or government owns or runs the Internet. Instead, many people and organizations voluntarily participate in task forces that meet to develop standards for the many technical needs of running the Internet. Decisions are made by consensus among all who choose to participate, and every point of view is heard in the long process of hashing out decisions and setting new standards.

The equipment--the computers, the cables, the routers, and so on--is owned by government and private organizations and is paid for by taxes and user fees. In the early history of the Internet, the US government paid many of the development and operating costs through government grants. In recent years, the US government has stepped aside, except for the portions that link government organizations, and let private enterprise develop the nets.
What is the World Wide Web?

The World Wide Web is one of the services that runs over the Internet, letting you follow links from page to page across many sites. A page can fill one or many screens as it displays on your monitor.

What are major search engines?

Google (http://www.google.com)
Google is a continually improving search engine indexing billions of pages, created originally by two Stanford PhD students. It has followed Yahoo, Excite, and WhoWhere in moving from a student project to a commercial site. Its relevance ranking uses two factors not generally included in search engine rankings: the number of links to the page from elsewhere and the "importance" of the pages that link to it. Thus, if Yahoo links to a page, that page is important and will rank higher in the list than one linked only from someone's "unimportant" personal page. Other ranking factors are the number of hits on the search words in the title and the text, and the proximity of the search terms to each other. By default, all words are ANDed, but you can now add OR between words. A minus sign is used as a NOT. Common words (stop words) are not included in the search unless preceded by a +. Stemming is not supported; a search for evaluating will not find the word evaluation. They recommend searching for only a few words, not many, and letting the relevance ranking work for you. It works surprisingly well!

Google added indexing of .pdf files, which most search engines cannot reach due to their special formatting. This is a major step forward! Try it!
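
To make that syntax concrete, here are a few illustrative queries of the kind described above (the topics are invented for the example):

    evaluation OR assessment      finds pages containing either word
    "search engine" -directory    exact phrase; the minus sign excludes a term
    +the beatles                  the + forces the stop word "the" into the search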

Fast (http://www.alltheweb.com)
Fast, from Norway, has quickly grown into one of the largest search engines. It supports full Boolean expressions, although they are not mentioned in the documentation. It also supports the + and - symbols, quotes, and parentheses, and offers a form supporting all-of-the-words, any-of-the-words, and exact-phrase searches, as well as searches restricted by language, domain, title, text, and link.

AltaVista (http://www.altavista.com/)
This is another large Internet search engine with powerful advanced features. It searches over 130 million web pages. Because of its size, your search should be carefully crafted, preferably in Boolean terms, or you may get too many hits to look through. The basic search can use + and - prefixes to require or exclude terms from the search, and quotes to ensure adjacency. The Advanced Search is its most powerful feature and allows AND, OR, AND NOT, and NEAR (within 10 words) as Boolean expressions, plus limiting by dates and other fields. Use of parentheses is encouraged to group expressions. It has many unique features, including search by specific language, rudimentary language translation, and sophisticated techniques for refining searches and ranking results. Use its field searches to search by link, url, domain, etc. It has also added a natural language search capability by licensing AskJeeves technology.

HotBot (http://www.hotbot.com)
HotBot, powered by Inktomi, seems to vary in size from 40 to 100 million pages, depending on how many servers are running. It supports full Boolean searching and recognizes AND, OR, and NOT (or the equivalent symbols &, |, !), parentheses, double quotes, + and -, and other advanced features. It supports * and ? as wild cards, too, along with a system of modifying the first-round search to refine it. HotBot and its parent Wired Digital are under agreement to be acquired by Lycos.

HotBot's major strength is the "More Search Options" forms that support searching. You can craft complex searches without knowing Boolean expressions. This, after Google, is the power engine for the novice.

Lycos (http://www.lycos.com)
Lycos is a medium-sized search engine, searching over 50 million web pages as well as gopher and ftp sites. It has some very sophisticated features for controlling the proximity and sequencing of search terms. It searches for graphics or sound files as separate choices. The basic search ORs all terms but gives preference to results with the most hits. The Custom Search allows you to AND all terms, OR all terms, or require at least a selected number of terms. It also allows you to limit or include lower-scoring returns.

Yahoo (http://www.yahoo.com)
Yahoo is the biggest of the subject-organized directories and has been widely imitated, particularly by LookSmart and the Open Directory Project. It is very useful for finding good collections of resources on a topic. It has an advanced search mode, too.

Internet History
The Internet was the result of some visionary thinking by people in the early 1960s who saw great potential value in allowing computers to share information on research and development in scientific and military fields. J.C.R. Licklider of MIT first proposed a global network of computers in 1962, and moved over to the Defense Advanced Research Projects Agency (DARPA) in late 1962 to head the work to develop it. Leonard Kleinrock of MIT and later UCLA developed the theory of packet switching, which was to form the basis of Internet connections. Lawrence Roberts of MIT connected a Massachusetts computer with a California computer in 1965 over dial-up telephone lines. It showed the feasibility of wide-area networking, but also showed that the telephone line's circuit switching was inadequate. Kleinrock's packet switching theory was confirmed. Roberts moved over to DARPA in 1966 and developed his plan for ARPANET. These visionaries and many more left unnamed here are the real founders of the Internet.

The Internet, then known as ARPANET, was brought online in 1969 under a contract let by the renamed Advanced Research Projects Agency (ARPA), which initially connected four major computers at universities in the southwestern US (UCLA, Stanford Research Institute, UCSB, and the University of Utah). The contract was carried out by BBN of Cambridge, MA under Bob Kahn, and the network went online in December 1969. By June 1970, MIT, Harvard, BBN, and Systems Development Corp (SDC) in Santa Monica, CA were added. By January 1971, Stanford, MIT's Lincoln Labs, Carnegie-Mellon, and Case Western Reserve University were added. In the months to come, NASA/Ames, Mitre, Burroughs, RAND, and the University of Illinois plugged in. After that, there were far too many to keep listing here.

The Internet was designed in part to provide a communications network that would work even if some of the sites were destroyed by nuclear attack. If the most direct route was not available, routers would direct traffic around the network via alternate routes.

The early Internet was used by computer experts, engineers, scientists, and librarians. There was nothing friendly about it. There were no home or office personal computers in those days, and anyone who used it, whether a computer professional or an engineer or scientist or librarian, had to learn to use a very complex system.

E-mail was adapted for ARPANET by Ray Tomlinson of BBN in 1972. He picked the @ symbol from the available symbols on his teletype to link the username and address. The telnet protocol, enabling logging on to a remote computer, was published as a Request for Comments (RFC) in 1972. RFCs are a means of sharing developmental work throughout the community. The ftp protocol, enabling file transfers between Internet sites, was published as an RFC in 1973, and from then on RFCs were available electronically to anyone who had use of the ftp protocol.

Libraries began automating and networking their catalogs in the late 1960s, independent from ARPA. The visionary Frederick G. Kilgour of the Ohio College Library Center (now OCLC, Inc.) led networking of Ohio libraries during the '60s and '70s. In the mid-1970s more regional consortia from New England, the Southwest states, the Middle Atlantic states, and others joined with Ohio to form a national, later international, network. Automated catalogs, not very user-friendly at first, became available to the world, first through telnet or the awkward IBM variant TN3270, and only many years later through the web. See The History of OCLC.

The Internet matured in the '70s as a result of the TCP/IP architecture, first proposed by Bob Kahn at BBN and further developed by Kahn and Vint Cerf at Stanford and others throughout the '70s. It was adopted by the Defense Department in 1980, replacing the earlier Network Control Protocol (NCP), and was universally adopted by 1983.

The Unix-to-Unix Copy Protocol (UUCP) was invented in 1978 at Bell Labs. Usenet was started in 1979 based on UUCP. Newsgroups, which are discussion groups focusing on a topic, followed, providing a means of exchanging information throughout the world. While Usenet is not considered part of the Internet, since it does not share the use of TCP/IP, it linked UNIX systems around the world, and many Internet sites took advantage of the availability of newsgroups. It was a significant part of the community building that took place on the networks.

Similarly, BITNET (Because It's Time Network) connected IBM mainframes around the educational community and the world to provide mail services beginning in 1981. Listserv software was developed for this network and later others. Gateways were developed to connect BITNET with the Internet and allowed exchange of e-mail, particularly for e-mail discussion lists. These listservs and other forms of e-mail discussion lists formed another major element in the community building that was taking place.

In 1986, the National Science Foundation funded NSFNet as a cross-country 56 Kbps backbone for the Internet. It maintained its sponsorship for nearly a decade, setting rules for the backbone's non-commercial government and research uses.

As the commands for e-mail, FTP, and telnet were standardized, it became a lot easier for non-technical people to learn to use the nets. It was not easy by today's standards by any means, but it did open up use of the Internet to many more people in universities in particular. Other departments besides the libraries, computer, physics, and engineering departments found ways to make good use of the nets--to communicate with colleagues around the world and to share files and resources.

While the number of sites on the Internet was small, it was fairly easy to keep track of the resources of interest that were available. But as more and more universities and organizations--and their libraries--connected, the Internet became harder and harder to track. There was more and more need for tools to index the resources that were available.

The first effort, other than library catalogs, to index the Internet came in 1989, when Peter Deutsch and his crew at McGill University in Montreal created an archive of ftp sites, which they named Archie. This software would periodically reach out to all known openly available ftp sites, list their files, and build a searchable index of the software. The commands to search Archie were Unix commands, and it took some knowledge of Unix to use it to its full capability.

At about the same time, Brewster Kahle, then at Thinking Machines Corp., developed his Wide Area Information Server (WAIS), which would index the full text of files in a database and allow searches of the files. Several versions with varying degrees of complexity and capability were developed, but the simplest of these were made available to everyone on the nets. At its peak, Thinking Machines maintained pointers to over 600 databases around the world which had been indexed by WAIS. They included such things as the full set of Usenet Frequently Asked Questions files, the full documentation of working papers such as RFCs by those developing the Internet's standards, and much more. Like Archie, its interface was far from intuitive, and it took some effort to learn to use it well.

Peter Scott of the University of Saskatchewan, recognizing the need to bring together information about all the telnet-accessible library catalogs on the web, as well as other telnet resources, brought out his Hytelnet catalog in 1990. It gave a single place to get information about library catalogs and other telnet resources and how to use them. He maintained it for years, and added HyWebCat in 1997 to provide information on web-based catalogs.

In 1991, the first really friendly interface to the Internet was developed at the University of Minnesota. The University wanted to develop a simple menu system to access files and information on campus through their local network. A debate followed between mainframe adherents and those who believed in smaller systems with client-server architecture. The mainframe adherents "won" the debate initially, but since the client-server advocates said they could put up a prototype very quickly, they were given the go-ahead to do a demonstration system. The demonstration system was called a gopher after the U of Minnesota mascot, the golden gopher. The gopher proved to be very prolific, and within a few years there were over 10,000 gophers around the world. It took no knowledge of UNIX or computer architecture to use: in a gopher system, you typed or clicked on a number to select the menu item you wanted.

Gopher's usability was enhanced much more when the University of Nevada at Reno developed the VERONICA searchable index of gopher menus. It was purported to be an acronym for Very Easy Rodent-Oriented Netwide Index to Computerized Archives. A spider crawled gopher menus around the world, collecting links and retrieving them for the index. It was so popular that it was very hard to connect to, even though a number of other VERONICA sites were developed to ease the load. Similar indexing software was developed for single sites, called JUGHEAD (Jonzy's Universal Gopher Hierarchy Excavation and Display).

In 1989 another significant event took place in making the nets easier to use. Tim Berners-Lee and others at the European Laboratory for Particle Physics, more popularly known as CERN, proposed a new protocol for information distribution. This protocol, which became the World Wide Web in 1991, was based on hypertext--a system of embedding links in text that lead to other text, which you use every time you select a text link while reading these pages. Although started before gopher, it was slower to develop.

The development in 1993 of the graphical browser Mosaic by Marc Andreessen and his team at the National Center for Supercomputing Applications (NCSA) gave the protocol its big boost. Later, Andreessen moved on to become the brains behind Netscape Corp., which produced the most successful graphical browser and server until Microsoft declared war and developed its Microsoft Internet Explorer.

Since the Internet was initially funded by the government, it was originally limited to research, education, and government uses. Commercial uses were prohibited unless they directly served the goals of research and education. This policy continued until the early 90's, when independent commercial networks began to grow. It then became possible to route traffic across the country from one commercial site to another without passing through the government funded NSFNet Internet backbone.

Delphi was the first national commercial online service to offer Internet access to its subscribers. It opened up an email connection in July 1992 and full Internet service in November 1992. All pretenses of limitations on commercial use disappeared in May 1995 when the National Science Foundation ended its sponsorship of the Internet backbone, and all traffic relied on commercial networks. AOL, Prodigy, and CompuServe came online. Since commercial usage was so widespread by this time and educational institutions had been paying their own way for some time, the loss of NSF funding had no appreciable effect on costs.

Today, NSF funding has moved beyond supporting the backbone and higher educational institutions to building the K-12 and local public library accesses on the one hand, and the research on the massive high volume connections on the other.
Microsoft's full-scale entry into the browser, server, and Internet Service Provider market completed the major shift over to a commercially based Internet. The release of Windows 98 in June 1998, with the Microsoft browser well integrated into the desktop, shows Bill Gates' determination to capitalize on the enormous growth of the Internet. Microsoft's success over the past few years has brought court challenges to its dominance. We'll leave it up to you whether you think these battles should be played out in the courts or the marketplace.

During this period of enormous growth, businesses entering the Internet arena scrambled to find economic models that work. Free services supported by advertising shifted some of the direct costs away from the consumer--temporarily. Services such as Delphi offered free web pages, chat rooms, and message boards for community building. Online sales have grown rapidly for such products as books, music CDs, and computers, but the profit margins are slim when price comparisons are so easy, and public trust in online security is still shaky. Business models that have worked well are portal sites that try to provide everything for everybody, and live auctions. AOL's acquisition of Time Warner was the largest merger in history when it took place and shows the enormous growth of Internet business! The stock market has had a rocky ride, swooping up and down as the new technology companies, the dotcoms, encountered good news and bad. The decline in advertising income spelled doom for many dotcoms, and a major shakeout and search for better business models took place among the survivors.

A current trend with major implications for the future is the growth of high-speed connections. 56K modems and the providers who supported them spread widely for a while, but that is the low end now. 56K is not fast enough to carry multimedia, such as sound and video, except in low quality. New technologies many times faster, such as cable modems and digital subscriber lines (DSL), are predominant now.

Wireless has grown rapidly in the past few years, and travelers search for the Wi-Fi "hot spots" where they can connect while they are away from the home or office. Many airports, coffee bars, hotels and motels now routinely provide these services, some for a fee and some for free.

The next big growth area is the surge towards universal wireless access, where almost everywhere is a "hot spot". Municipal Wi-Fi or city-wide access, WiMAX offering broader range than Wi-Fi, EV-DO, 4G, and other formats will joust for dominance in the USA in the years ahead. The battle is both economic and political.

Another trend that is rapidly affecting web designers is the growth of smaller devices to connect to the Internet. Small tablets, pocket PCs, smart phones, eBooks, game machines, and even GPS devices are now capable of tapping into the web on the go, and many web pages are not designed to work on that scale.

As the Internet has become ubiquitous, faster, and increasingly accessible to non-technical communities, social networking and collaborative services have grown rapidly, enabling people to communicate and share interests in many more ways. Sites like Facebook, Twitter, LinkedIn, YouTube, Flickr, Second Life, Delicious, blogs, wikis, and many more let people of all ages rapidly share their interests of the moment with others everywhere.

Internet Glossary

Anonymous FTP: A way to obtain publicly accessible files from the Internet. Anonymous means that you don't need a user ID and password to download files. FTP stands for File Transfer Protocol.
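
As an illustration, this is roughly what an anonymous FTP download looks like in Python; the host, directory, and file name are placeholders:

    from ftplib import FTP

    ftp = FTP("ftp.example.com")   # placeholder host
    ftp.login()                    # no user ID/password: logs in as "anonymous"
    ftp.cwd("/pub")                # change to a public directory
    with open("readme.txt", "wb") as f:
        ftp.retrbinary("RETR readme.txt", f.write)   # download the file
    ftp.quit()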

Articles: Messages posted to Usenet newsgroups.

ASCII: An acronym for American Standard Code for Information Interchange. For Internet purposes, ASCII stands for straight text formatting. The opposite of ASCII for file transfer purposes is binary.

Backbone: A system of high-speed connections that routes long-haul Internet traffic by connecting the slower regional and local data paths.

Bandwidth: The amount of data that can flow through a channel. The higher the bandwidth, the more information can flow through it.

BBS: An acronym for Bulletin Board System.

Binary: Not ASCII. For file transfer purposes, any file that is not in an ASCII format is considered a binary file. In newsgroups, a binary is a graphical image (or sometimes audio or video).

Bookmark: In Netscape, a user-maintained index of frequently used URLs. In Internet Explorer these are called favorites.

Boolean Logic: A system that uses operators such as AND, NOT, OR, and NEAR to link key words together to make searching databases more precise, such as: evaluation AND performance, college OR university, internet NOT intranet.

Browser: A computer program that is used to access the World Wide Web. Examples of browsers are Netscape Navigator, Microsoft Internet Explorer and Lynx.

Client: A computer or software program that requests services from another computer, called a server. Eudora is an example of an e-mail client; ws_ftp is an example of an FTP client.

Compress: To make a file smaller, usually to conserve space or to speed up file transfers. Many of the files and programs available on the Internet are compressed. The most common compression format you will encounter is .zip; you need a program such as pkunzip to uncompress these files and make them usable.
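
For instance, a minimal Python sketch of uncompressing a downloaded .zip archive (the file and folder names are placeholders):

    import zipfile

    # Extract every file in the archive into a folder, making the
    # compressed download usable.
    with zipfile.ZipFile("download.zip") as zf:
        zf.extractall("download_files")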

DNS: An acronym for Domain Name Server. The domain name is everything after the @ in an e-mail address; when expressed numerically rather than alphabetically, it is an IP address, and the DNS translates between the two forms.

Domain: An Internet domain is a major subsection of the Internet. It is indicated by the last group of letters in an Internet address. Such Domains include:

.com (commercial, such as icir.com)
.mil (military, as in army.mil)
.org (organization, as in eff.org)
.net (network, as in internic.net)
.gov (government, as in whitehouse.gov)
.edu (educational, as in lonestar.utsa.edu)

The domains listed above are used only in the United States. Other countries use a country code, such as .au for Australia or .ca for Canada, as their domain.

Domain Name Server: see DNS.
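
As a quick illustration, a program can ask the DNS to translate a name into its numeric IP address; this Python sketch uses a placeholder host name:

    import socket

    # Resolve a domain name to the numeric IP address the DNS returns.
    print(socket.gethostbyname("www.example.com"))  # placeholder host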

Download: To transfer a file from a remote computer to your own computer. The opposite of upload. Hint: a rough formula for download time in minutes is file size (bytes) / modem speed (bits per second) / 6; for example, 1,665,780/28,800/6 = 9.6 minutes, the time to download one 3.5" floppy disk's worth of data using a 28.8 kbps modem.
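
In code, that rule of thumb looks like the following Python sketch. Dividing by 6 rather than the exact 7.5 (8 bits per byte divided by 60 seconds per minute) builds in a rough allowance for protocol overhead:

    def download_minutes(file_bytes, modem_bps):
        # file size in bytes / line speed in bits per second / 6
        return file_bytes / modem_bps / 6

    print(download_minutes(1_665_780, 28_800))  # about 9.6 minutes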

Email: Electronic mail or messages sent to other users over the Internet.

Finger: A UNIX command that returns information about an Internet user. To finger someone you must at least know their DNS.

Firewall: Used by some networks to provide security by blocking access to certain services from the rest of the Internet.

Flame: A rude response to an e-mail message or newsgroup article. Flames are usually directed at someone who violates the informal rules of the Internet and are out of proportion to the severity of the "offense."

FAQ: An acronym for Frequently Asked Questions. These are text documents in a question-and-answer format that answer the most common questions posted to Usenet newsgroups.

FTP: An acronym for File Transfer Protocol, a way of downloading files and programs from a remote computer to your own directory or computer, and vice versa.

Gateway: A computer that moves data from one network to another.

GIF: An acronym for the Graphics Interchange Format. This was developed by CompuServe and is a popular way of exchanging pictures over the Internet. JPEG or .JPG (Joint Photographic Experts Group) is the name of another common graphics file format.

Gopher: A protocol and client program that lets users retrieve Internet resources. Gopher is a text-based menu system that serves a similar function to the World Wide Web.

GUI: (sometimes pronounced gooey) An acronym for Graphical User Interface. Windows is a GUI; DOS is a text-based interface.

Helper Application: Mainly used in browsers, such as Netscape. A helper application is a software program that works within another program to help it accomplish its tasks. For example, you can use the helper application WPLANY within Netscape to listen to audio files, or the Adobe Acrobat Reader to view and print .pdf files.

HTML: An acronym for Hypertext Markup Language, the formatting system for World Wide Web documents.

HTTP: An acronym for Hypertext Transfer Protocol, the system used to request documents from the World Wide Web.
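
For example, here is a minimal sketch of an HTTP transaction in Python, the same request/response exchange a browser performs for every page; the host is a placeholder:

    import http.client

    conn = http.client.HTTPConnection("www.example.com")  # placeholder host
    conn.request("GET", "/")                  # ask for the home page
    response = conn.getresponse()
    print(response.status, response.reason)   # e.g. 200 OK
    html = response.read()                    # the raw HTML document
    conn.close()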

Home Page: An individual's or organization's presence on the World Wide Web.

Hypertext: A text document that contains links to other documents. Hypertext is used in the World Wide Web and also in Windows help documents.

Internet: A world-wide network of computer networks using the TCP/IP protocol. It is a three-level hierarchy composed of backbone networks (e.g. ARPAnet, NSFNet, and MILNET), mid-level networks, and stub networks. These include commercial (.com or .co), university (.ac or .edu), other research (.org, .net), and military (.mil) networks, and span many different physical networks around the world with various protocols, including the Internet Protocol.

IP address: An Internet address expressed in numbers rather than letters; see DNS.

IRC: An acronym for Internet Relay Chat, a world-wide live chat system with a text based interface.

ISDN: An acronym for Integrated Services Digital Network, a digital connection service offered by ISPs and local telephone companies. It is most readily available in Australia, France, Japan, and Singapore, with the UK somewhat behind and availability in the USA rather spotty.

ISP: An acronym for Internet Service Provider, a company that provides Internet connections for individuals and businesses.

Lurk: To follow a newsgroup or sit in on an IRC channel without contributing anything. Not necessarily a bad thing!

Lynx: A text-based World Wide Web Browser.

Modem: Short for Modulator-Demodulator, a modem is a computer peripheral that allows transmission of digital information over an analog phone line. Modems are rated by the number of kilobits per second (kbps) that they are able to send and receive.

MIME: An acronym for Multipurpose Internet Mail Extensions. This is a protocol for sending audio, graphics and other binary data as attachments to e-mail messages.
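
For example, a sketch using Python's standard email library to build a MIME message with a binary attachment; the addresses and file name are placeholders:

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "alice@example.com"    # placeholder addresses
    msg["To"] = "bob@example.com"
    msg["Subject"] = "Photo"
    msg.set_content("See the attached picture.")
    with open("photo.gif", "rb") as f:
        # The library encodes the binary data so it can travel
        # inside a text-based e-mail message.
        msg.add_attachment(f.read(), maintype="image",
                           subtype="gif", filename="photo.gif")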

Netiquette: Etiquette on the Internet. Violating netiquette may get you flamed.

Netscape: A GUI browser used to access the World Wide Web.

Newbie: Someone new to the Internet.

Newsgroup: One of more than 20,000 discussion forums carried on the Internet and some other networks, such as Fidonet.

NNTP: An acronym for the Network News Transport Protocol. This is the protocol used to distribute Usenet news.

Online Service: A provider, such as America Online, CompuServe, or Prodigy, that started out providing unique and proprietary content but now also includes a gateway to the Internet.

Packet: The basic unit of information transmitted over the Internet. The TCP/IP protocols break messages down into packets and route them.

Plug-in: A file containing data used to alter, enhance, or extend the operation of a parent application program. World Wide Web browsers support plug-ins that display or interpret a particular file format or protocol, such as Shockwave, RealAudio, Adobe PDF, Corel CMX (vector graphics), Virtual Reality Modeling Language (VRML), etc. Plug-ins can usually be downloaded for free and are stored locally; they come in versions specific to a particular operating system and may even come "preloaded" within a Web browser.

Search Engine: A tool that allows a user to search the Internet by keywords and/or categories (e.g., Yahoo, Excite, HotBot, AltaVista).

Shareware & Freeware: Shareware is software distributed through the Internet whose designers hope to be paid if people use it regularly. Freeware, on the other hand, is distributed for free.

Spam: To post irrelevant or inappropriate messages to one or more Usenet newsgroups or mailing lists in deliberate or accidental violation of netiquette. To indiscriminately send large amounts of unsolicited e-mail meant to promote a product or service. Spam in this sense is sort of like the electronic equivalent of junk mail sent to "Occupant."

TCP/IP: Transmission Control Protocol over Internet Protocol. The de facto standard Ethernet protocols incorporated into 4.2BSD UNIX. TCP/IP was developed by DARPA for internetworking and encompasses both network layer and transport layer protocols. While TCP and IP specify two protocols at specific protocol layers, TCP/IP is often used to refer to the entire DOD protocol suite based upon these, including telnet, FTP and others.

Telnet: The Internet standard protocol for remote login. It runs on top of TCP/IP and is defined in STD 8, RFC 854, and extended with options by many other RFCs. UNIX BSD networking software includes a program, telnet, which uses the protocol and acts as a terminal emulator for the remote login session. Sometimes abbreviated to TN. TOPS-10 had a similar program called IMPCOM.

UNIX: A text-based operating system that supports multitasking: running two or more programs, and performing two or more actions, at the same time.

URL: Uniform Resource Locator, the standard way to give the address of any resource on the Internet that is part of the WWW. A draft standard for specifying the location of an object on the Internet, such as a file or a newsgroup. URLs are used extensively on the World Wide Web, where they specify in HTML documents the target of a hyperlink, which is often another HTML document (possibly stored on another computer).

Usenet: A wide-ranging set of newsgroups hosting discussions of serious and not-so-serious natures; a distributed bulletin board system supported mainly by UNIX machines and the people who post and read articles thereon. Originally implemented in 1979-1980 by Steve Bellovin, Jim Ellis, Tom Truscott, and Steve Daniel at Duke University, it has swiftly grown to become international in scope and is now probably the largest decentralized information utility in existence. Usenet encompasses government agencies, universities, high schools, businesses of all sizes, and home computers of all descriptions. As of early 1993, it hosts well over 1,200 newsgroups ("groups" for short) and an average of 40 megabytes (the equivalent of several thousand paper pages) of new technical articles, news, discussion, chatter, and flamage every day. To join in you need a news reader; several web browsers include news readers, and URLs beginning "news:" refer to Usenet newsgroups. Is this more information than you really needed? :-)
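
As an illustration of URL structure, Python's standard library can split a URL into the parts described above; the address is a placeholder:

    from urllib.parse import urlparse

    parts = urlparse("http://www.example.com/pub/index.html")  # placeholder URL
    print(parts.scheme)   # http            -- the protocol
    print(parts.netloc)   # www.example.com -- the host
    print(parts.path)     # /pub/index.html -- the document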

World Wide Web (WWW): Commonly referred to as "The Web". An Internet client-server hypertext distributed information retrieval system which originated at the CERN High-Energy Physics laboratories in Geneva, Switzerland. The system, or universe, of hypertext servers (HTTP servers) allows text, graphics, video, and sound to be mixed together and used at the same time. It usually requires a WWW browser such as Netscape Navigator or Internet Explorer.

A Very Brief History of the Internet

Late 1960's to early 1970's
Dept. of Defense Advanced Research Projects Agency (ARPA)

- ARPANET served as the basis for early networking research as well as a central backbone during the development of the Internet.

- TCP/IP evolved as the standard networking protocol for exchanging data between computers on the network.

Mid-To-Late 1970's
Basic services were developed that make up the Internet:
- Remote connectivity
- File Transfer
- Electronic mail

1979-80
Usenet systems for newsgroups

1991
Internet Gopher developed; public introduction of the World Wide Web (mostly text based)
- In the early 1990s, the developers at CERN spread word of the Web's capabilities to scientific audiences worldwide.
- By September 1993, the share of Web traffic traversing the NSFNET Internet backbone reached 75 gigabytes per month, about one percent of the total. By July 1994 it was one terabyte per month.

1994
Prior to this time, the WWW was not used for commercial business purposes
- The Internet is one-third research and education network
- Commercial communications begin to take over the majority of Internet traffic