About the WWW

kalishia owens sent the following comment to "anthony", about "about www".
You are reading it:
------------------------------------------------------------
What is the WWW?
When was it started?
Who was the first to use the web?
Who are the current users?
What programming languages are used on the WWW?

The following are just my thoughts and recollections on the WWW. Any and all dates are approximate. I do know it started slowly, and took quite a few years before it became "well known".

The WWW is actually a global name for a collection of information provided by computers across the internet. You could say the internet is the WWW - the World Wide Web.

Before 1990 the only means of downloading and providing information to anyone was anonymous FTP (file transfer protocol). For person-to-person communication there was email and a now-defunct method called MHSNET. With anonymous FTP you connect to a host, log in as "anonymous" or "ftp", and give your email address (for tracking errors) as the password. Almost anything will actually work as the password, but it is usually just something to identify the user. The problem was you could only download files by name, and it was difficult to tell, without downloading them, what the various files contained. Few index lists were provided, and there was no way to link or redirect users to other locations for more or related information.

Around 1990 the "gopher" protocol was designed to let a site provide indexed links to information on that server, and links to other servers. The problem was that it was still basically a "list of links" format. Also, text documents could not refer to other documents.

The HTML file format solved this. The files are basically plain text that can include images and, best of all, provide links to information available on other HTTP servers, or even on FTP, NEWS and GOPHER servers. The idea was that a document on one topic could not only mention something else but also refer to a page with more details about it. It was never actually designed for advertising, link sites, and so on, but for real documentation such as scientific reports and journals.

The HTTP or web server is actually just a file server. In fact, in the early days a lot of HTML documents were provided not by an HTTP server but by a normal FTP server. Yes, it works, and works very well: the HTML pages were downloaded and displayed, and links to other files and documents were usable. An FTP server, however, did not provide a means to run an executable with user-provided arguments, which would allow pages to be generated from other information and passed back to client users.

What the HTTP server did was:
* Use a protocol which was FAST for small files. FTP is designed for checksummed downloads of large archives, and it is still better suited to large files than HTTP.
* Replace the "directory listing" of an FTP server with an "index.html". EG: a document of links instead of just a file list.
* Tell an HTTP client what a file contains, NOT just its filename and size. EG: that this file is of type image/gif and contains an image of a beach ball at 800x600 pixels. That is, the link description and the actual file name did not have to match.
* Provide a means to run programs to generate ANY document on the fly. EG: CGI scripts, PHP server-generated pages, database access programs, and pages based on user preferences and data.

This concept has been expanded in many directions, and is now the primary way of generating pages that can be edited online.
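As a rough illustration of the "file server plus index.html plus content type" idea, here is a minimal sketch using Python's standard library (Python is simply my choice of language here, and the port number 8000 is an arbitrary example, not anything from the original note). It serves files from the current directory, answers a request for a directory with that directory's index.html if one exists, and labels each file with a Content-Type header guessed from its name.

    #!/usr/bin/env python3
    # Minimal static web server sketch: a "file server" in the sense described
    # above.  SimpleHTTPRequestHandler serves index.html for a directory and
    # sends a Content-Type header guessed from each file's name.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    if __name__ == "__main__":
        # Port 8000 is just an illustrative choice.
        HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()

Note that this does nothing more than hand files back. The "generate any document on the fly" part needs an extra mechanism such as CGI, which is touched on again at the end of this note.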
Note: the web was NEVER designed to allow constantly changing pages, animations, or fully graphical pages. Pages were supposed to be mostly unchanging text with just a few images and links. Even frames, or whole books and movies, were not what it was designed for.

Since then HTML has expanded into a more graphical format to provide better web site design. Even so, it is still PLAIN TEXT with special tags. New ideas have also been tested and trialed: Java (and all its sub-sets), PHP, server side includes (SSI), Tomcat, Jakarta, streaming audio/video, GIF animation, Flash. Sites like Yahoo and Google have made a huge impact.

And now we have Web-2 applications, where users never really "log in" to the actual hosting machine but edit the pages via the web interface, with all pages stored with revision history in databases and programmatically recreated for each user as needed. For example: discussion forums (replacing News), wikis (Wikipedia), blogs, YouTube, Flickr, Drupal, etc. etc. etc. This has made the WWW the jungle it is today.

Approximate date line.
1990 Gopher -- shows the need for better services than plain FTP.
1991 HTML/HTTP services trialed, and started to appear.
1992 Mosaic, a decent HTTP client program that allowed you to use a mouse to point and click, was provided for free. This later evolved into Netscape, and later still Mozilla.
1993 HTTP servers became better known (cautious take-up by computer centers).
1995 The web became known to the general public, as even children and computer-illiterate users could "surf" the web.
1996 Talk by politicians of the "Super Information Highway". Commercial enterprises started trying to make money from this new global market.
2003 Web-2 applications started to appear.

Any programming language can be used to run the web. CGI is not a language but a guide to how any language should interact with the web server! For example, I have written CGI shell scripts for very simple tasks (a small sketch of the idea follows at the end of this note). Of course some languages, particularly text- and object-oriented languages with database access capabilities, work better than most others. Scripting languages that are machine independent, such as Perl, PHP and Java, are an even bigger advantage, as you can then make use of any computer environment to run your web server.

Java was devised to allow remote programs to run safely on client machines, allowing local updates and data modification without needing constant communication with the main server. It is still the leading programming language in this field.
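To make the CGI point a little more concrete, here is a toy sketch in Python (again just my choice of language; the "name" query parameter is an invented example, not anything from a real site). The CGI convention is language-neutral: the web server describes the request in environment variables such as REQUEST_METHOD and QUERY_STRING, runs the program, and sends whatever the program prints (headers, a blank line, then the document) back to the client.

    #!/usr/bin/env python3
    # Toy CGI program: read the request from environment variables and
    # write the HTTP headers plus a generated HTML document to stdout.
    import os
    from urllib.parse import parse_qs

    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    name = query.get("name", ["world"])[0]         # e.g. ...?name=anthony

    print("Content-Type: text/html")               # tell the client what follows
    print()                                        # a blank line ends the headers
    print("<html><body><p>Hello, %s.</p></body></html>" % name)

The same contract can be met by a shell script, Perl, PHP, or anything else that can read the environment and print, which is exactly why CGI is a guide rather than a language.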