
Development history and structure of the Internet. The era of the World Wide Web

In 1957, after the Soviet Union launched the first artificial Earth satellite, the US Department of Defense decided that in case of war America needed a reliable information transmission system. The US Defense Advanced Research Projects Agency (DARPA) proposed developing a computer network for this purpose. The development of such a network was entrusted to the University of California, Los Angeles, the Stanford Research Center, the University of Utah, and California State University, Santa Barbara. The computer network was named ARPANET (Advanced Research Projects Agency Network), and in 1969, within the framework of the project, it united these four scientific institutions. All work was funded by the US Department of Defense. The ARPANET then began to grow and develop actively, and scientists from various fields of science began to use it.

The first ARPANET server was installed on September 2, 1969 at the University of California, Los Angeles. The Honeywell DDP-516 computer had 24 KB of RAM.

On October 29, 1969, at 21:00, a communication session was held between the first two nodes of the ARPANET, located 640 km apart - at the University of California, Los Angeles (UCLA) and at the Stanford Research Institute (SRI). Charley Kline was trying to connect remotely from Los Angeles to a computer at Stanford, while his colleague Bill Duvall at Stanford confirmed the successful transmission of each entered character by phone.

At first only the three characters "LOG" were transmitted, after which the network ceased to function. "LOG" was to have been the word LOGIN (the login command). The system was returned to working order by 22:30, and the next attempt was successful. This date can be considered the birthday of the Internet.

By 1971, the first program for sending e-mail over the network had been developed, and it immediately became very popular.

In 1973, the first foreign organizations from Great Britain and Norway were connected to the network via a transatlantic telephone cable, and the network became international.

In the 1970s, the network was used primarily for sending e-mail, and the first mailing lists, newsgroups and message boards appeared. At that time, however, the network could not yet easily interoperate with networks built on other technical standards. By the end of the 1970s, data transfer protocols began to develop rapidly and were standardized in 1982-1983. Jon Postel played an active role in the development and standardization of network protocols. On January 1, 1983, the ARPANET switched from the NCP protocol to TCP/IP, which is still successfully used to interconnect (or, as they say, "layer") networks. It was in 1983 that the name "Internet" became attached to the ARPANET.


In 1984, the Domain Name System (DNS) was developed.

In 1984, the ARPANET faced a formidable rival: the US National Science Foundation (NSF) founded the vast inter-university network NSFNet (National Science Foundation Network), which was composed of smaller networks (including the then-famous Usenet and Bitnet) and had much higher bandwidth than the ARPANET. About 10 thousand computers connected to this network within a year, and the name "Internet" began to pass smoothly to NSFNet.

In 1988, the Internet Relay Chat (IRC) protocol was developed, making real-time communication (chat) possible on the Internet.

In 1989, in Europe, within the walls of the European Council for Nuclear Research (CERN), the concept of the World Wide Web was born. It was proposed by the famous British scientist Tim Berners-Lee, who over the course of two years developed the HTTP protocol, the HTML language, and URIs.

As has long been well known, the history of every great invention rests on a large number of inventions that preceded it. In the case of the World Wide Web (WWW), at least two paths of development and accumulation of knowledge and technology were decisive for the project's success: 1) the history of hypertext-like systems; 2) the Internet protocol, which actually made a worldwide network of computers an observable reality.

In 1990, the ARPANET ceased to exist, having completely lost the competition to NSFNet. In the same year, the first connection to the Internet via an ordinary telephone line (so-called "dial-up" access) was recorded.

In 1991, the World Wide Web became publicly available on the Internet, and in 1993 the famous NCSA Mosaic web browser appeared. The World Wide Web was gaining popularity.

It was the combination of Tim Berners-Lee's web protocol, which enabled communication, and Marc Andreessen's Mosaic browser, which provided a functionally complete user interface, that created the conditions for the observed explosion of interest in the Web. In the first 24 months after the appearance of the Mosaic browser, the Web went from complete obscurity (known to only a few people within a narrow group of scientists and specialists in a single little-known field) to prevalence absolutely everywhere in the world.

In 1995, NSFNet returned to its role as a research network, with network providers, rather than National Science Foundation supercomputers, now routing all Internet traffic.

Also in 1995, the World Wide Web became the main carrier of information on the Internet, overtaking the FTP file transfer protocol in traffic, and the World Wide Web Consortium (W3C) was formed. We can say that the World Wide Web transformed the Internet and created its modern look. Since 1996, the World Wide Web has almost completely replaced the very concept of "the Internet".

In the 1990s, the Internet consolidated most of the then-existing networks (although some, like Fidonet, remained separate). The merger looked attractive thanks to the absence of a single leadership, and also thanks to the openness of the Internet's technical standards, which made networks independent of businesses and specific companies. By 1997, there were about 10 million computers on the Internet and more than 1 million registered domain names. The Internet had become a very popular medium for the exchange of information.

Currently, you can connect to the Internet through communication satellites, radio channels, cable TV, telephone lines, cellular networks, dedicated fiber-optic lines, or power lines. The World Wide Web has become an integral part of life in developed and developing countries.

Introduction

The chosen topic is considered one of the most relevant of our time: the number of Internet users grows along with the population of the Earth, and cheaper technology makes it possible to bring the network to places where it was previously unavailable. As the world's population grows, so does modern society's need for a variety of Internet services. Internet services greatly simplify a person's work; today almost every home has an Internet connection, and almost all organizations, enterprises, and firms use Internet services.

The purpose of this term paper is to examine in detail the place and importance of Internet services in modern society.

Objectives of this work:

  • Study the history of the emergence and development of the Internet;
  • Consider the modern face of the Internet;
  • Analyze the main features of the Internet;
  • Give an overview and characterization of the main Internet services in modern society.

The history of the emergence of the global Internet

In its quest to exchange information, humanity has made a powerful leap forward over the past century. The transmission and exchange of information have at all times been among the most important factors of evolution: without them, new human inventions could not spread. In a closed information space there can be no development, only extinction. For the exchange of information, man first invented writing, which for many centuries was the only reliable means of transmission. This method was far from perfect, since it could provide neither wide coverage of information consumers nor speed of transmission. Then, in the nineteenth century, man invented the telegraph, the telephone, and the radio. These inventions significantly accelerated the exchange of information, which in turn allowed much faster progress in the field of communications; in the twentieth century came the turn of television broadcasting, then the computer, and then such a method of transmitting information as the Internet. Humanity saw tremendous opportunities in this new phenomenon, and all the accumulated experience became available to many in a short period of time. With the development of the computer industry, the Internet has reached the most remote corners of the planet and has become so commonplace that the activities of almost all developed enterprises are unthinkable without it.

It is worth recounting the history of the global Internet in more detail.

During the Cold War with the Soviet Union, the United States of America wanted an information network that could survive even a nuclear war. The US Department of Defense began to think about how to create a reliable communications system that would continue to function normally even if some of its parts were disabled. The telephone networks of the time did not provide adequate resilience, so the Department of Defense turned to the RAND Corporation.

Paul Baran, one of its employees, developed a distributed network design. Because it is very difficult to transmit an undistorted signal over long distances, he suggested transmitting digital data in packets. The Pentagon liked his ideas and turned to the telephone company AT&T, which, in turn, rejected them, stating that such a network could not be built.

On October 4, 1957, the Soviet Union launched the first artificial Earth satellite, thereby gaining an advantage in space. The United States concluded that the money allocated to the Pentagon for scientific research was being wasted. It was therefore decided to create a single scientific organization under the auspices of the Department of Defense - ARPA (Advanced Research Projects Agency) - which would choose the most interesting of the projects offered by universities and scientific organizations and sign contracts for them.

Over time, Larry Roberts, who led ARPA's computer networking program, turned his attention to the networking project. Intrigued by engineer Wesley Clark's idea of a packet data network, he gave a talk at an ACM SIGOPS symposium in 1967. At the same symposium, a similar network that already existed at the British National Physical Laboratory was presented, and its implementation showed that packet switching could be applied in practice. Larry Roberts left the symposium with a clear intention to create a similar network in America.

The first version of the Internet created in the United States was called ARPANET. It began on December 5, 1969, when computers at four universities in different parts of the country were connected into one network and the first packet data transmissions were made; this moment can be considered the official beginning of ARPANET - the network that eventually became the Internet.

Very soon other American universities became interested in this network. By that time, research institutes and universities in the United States had accumulated a great deal of information on their computers, and a single computer network was the best means of exchanging it. With the help of such a network it would be possible to connect many institutions in different parts of the country, as well as in other parts of the world. At first, the network united only a few powerful computers in organizations of the military-industrial complex and in educational and research centers.

ARPANET grew and developed every year, and more and more new users joined it. As a result, when thousands of computers were already connected, it became obvious that the network access mechanism had to be modernized. In 1983, the TCP/IP (Transmission Control Protocol / Internet Protocol) protocol was introduced.

The birth of this protocol made it possible for users to connect to the network easily using an ordinary telephone line. At the same time, ARPANET split up. The Pentagon decided to separate from the shared network for the following reason: its own network, built with serious capital investment, had turned into a courtyard in which assorted civilians were constantly jostling. Even when the number of users was under a thousand, there could be no question of secrecy. It was therefore decided to leave the Pentagon a part of ARPANET called MILNET, and to hand the rest of the network over to the public, hungry for inter-network ("Inter") communications. This is how the global Internet emerged.

Until 1993-1994, the Internet was used mainly in the scientific and university environment, chiefly for transferring e-mail and various information from one user to another. Nevertheless, the network developed by leaps and bounds: over six years, the number of connected users increased more than 100-fold.

In the early 90s there was a revolution in the way information is presented on the Web: "pages" capable of carrying not only text but also graphics, and later sound and video. This was exactly what was needed by users who wanted more from the Internet than plain browsing. The network took on a new life, blossomed in every possible color, and lost its boring look. Thanks to the technology of the World Wide Web (WWW), conceived in 1989, all the resources of the Internet turned into a single hypertext structure.

Users were now not only specialists and scientists: ordinary people poured onto the Internet in a stream. The demand for Internet services grew by leaps and bounds; since the beginning of the 90s, the number of people connected to the Internet has at least doubled annually. In 1995, a real Internet boom began, turning the Web into the largest, most dynamic and accessible global medium of mass communication, one that is still developing today and reaching ever more remote corners of the world.

The Internet has become a meeting place full of people and ideas - a world of communication, information and entertainment in which the concept of "distance" has disappeared. The growth of the Internet has surpassed and continues to surpass all expectations and forecasts; the number of users doubles every year. If in 1980 the Internet united only 25 networks, then 15 years later it connected more than 44 thousand university, government and corporate network systems in 160 countries of the world, linked by high-speed private and public lines.

It is impossible to establish the exact number of Internet users. By December 2015, it was roughly estimated that this figure was close to 3.5 billion people.

The Russian segment of the Internet appeared in the early 1980s in the USSR, when the Kurchatov Institute was the first in our country to gain access to world networks.

The Internet in Russia, as throughout the world, is increasingly becoming an element of society's life, developing actively as the number of users grows. At the moment, more than 80 million people in Russia have access to the Internet - more than half of the country's population. Undoubtedly, the Internet has a huge informational impact on society.

In its development, the Russian Internet generally repeats the stages of development of the worldwide network. The growth rate of the number of servers is close to the best indicators in the world, although it is constrained to some extent by communication problems and the high cost of traffic on domestic networks.

If we delve deeper into the history of the birth of networks and, in particular, the Internet, we can trace progress before the generally accepted date of 1969 and divide it into several main stages, in the following sequence:

  • 1945-1960. The first computers (mainframes) were produced. The first interactive devices and time-sharing computers appeared. Theoretical works on interactive human-machine interaction were written. Multi-terminal systems appeared.
  • 1961-1970. The technical principles of packet switching were developed. Digital telephone networks for packet data were introduced, and in 1969 the ARPANET was launched.
  • 1971-1980. The first personal computers and the first local area networks appeared, the number of ARPANET nodes grew to several dozen, and dedicated cable lines were laid to connect some of the nodes. E-mail began to function, and scientists reported the results of their work at international scientific conferences.
  • 1981-1990. The TCP/IP protocol was adopted, the US Department of Defense separated its own network from the ARPANET (the split into the Internet and MILNET), the Domain Name System (DNS) was introduced, and the number of hosts reached 100,000.
  • 1990-2000. Wireless networks and Wi-Fi signal transmission protocols appeared.
  • 2000 to the present - recent history: the development of network services.

In the recent history of the Internet, discoveries in cyberspace occur every day, multiplying the knowledge bank of this vast area of human activity. Part of our life has moved to the web. Many devote more time to communication through the emerging social media than to live communication. Financial services are actively developing, allowing you to make multimillion-dollar transactions without leaving your home or even being present on the stock exchange. Banks increasingly attract customers with the ability to manage their accounts without standing in lines or visiting branches. From home it has become possible to pay for utilities and other services by opening a personal account in the virtual branch of any bank.

And what can be said of all the kinds of entertainment on the network that have become available to everyone with a computer and Internet access. A huge collection of games, books and entertainment services could actually occupy a person for his entire life, should he so desire. Go to the theater without leaving your home - please. Watch the latest movies - just connect to a virtual cinema, and your favorite characters are already on your screen. Books, magazines, blogs - everything for those who like to read at their leisure or need to find particular information.

You can get an education, improve your qualifications, or study a foreign language - and again there is no need to go or drive anywhere.

I myself am writing this work using the opportunity to study remotely, at any time convenient for me, and drawing on the extensive knowledge base accumulated before me on the Internet. This, I believe, is a unique opportunity and an incredible breakthrough in information exchange.

I keep returning to the idea stated at the beginning of my work and never tire of repeating it: the desire to communicate and exchange information is leading humanity by leaps and bounds toward horizons our ancestors never dreamed of. At the beginning of the twentieth century, no fantasy could have imagined what the scientific thought of the next generations would achieve, and what ideas would be brought to life within just a hundred years.

At the beginning of the last century, when a switchboard girl connected you with the right person, could anyone have imagined that transmitting a voice over a distance - let alone a live image of the other party - would one day require no effort at all beyond a single movement of the hand, or even just a voice command to dial a number?

The rapid development of communication was, of course, facilitated by general progress in science and the ever faster circulation of scientific thought - and, sadly, in part by two world wars, which forced competing governments to direct huge resources toward new methods of processing and delivering information.

In many ways, the modern face of the Internet is a cast of the face of modern, established society.

In the next chapter of this work, we will continue our review of the evolution of networks by looking at the current face of the Internet.

The history of the creation of the Internet

An indirect impetus for the creation of the Internet was the launch of the first artificial satellite by the Soviet Union in 1957. The Cold War had already begun, and the United States, realizing that the satellite was only the tip of the iceberg of Soviet military research, saw the threat of missiles being used to deliver a nuclear strike against it. In the same year, 1957, the Advanced Research Projects Agency (ARPA) was created under the US Department of Defense. One of the directions of the Agency's work was the creation of computer technology for military purposes, in particular for communications.

Scientists were tasked with creating a computer network that the military could use during a nuclear attack on the country. The network was to serve communication between the command posts of the defense system. The main design criterion was the network's invulnerability to partial destruction during a nuclear attack: even if some branches and nodes were destroyed, messages had to reach the addressee. In addition, the secrecy of the information transmitted over the network had to be taken into account. To meet these conditions, a network concept was proposed based on two main ideas:

  • Absence of a central computer - all computers on the network are equal;
  • Packet-based transmission of data over the network. (In 1963, Leonard Kleinrock, a student at the Massachusetts Institute of Technology, described a technology capable of breaking files into pieces and transferring them over the network by various routes, reassembling the pieces into a single file at the end point.)

Another theoretical source for the creation of the network was Joseph Licklider's concept of the "Galactic Network". According to this concept, any person anywhere on Earth could use the network to receive information and exchange files with any other person. Today we can say that this concept has been embodied in the modern Internet.

The development of such a network was entrusted to the University of California, Los Angeles, the Stanford Research Center, the University of Utah, and California State University, Santa Barbara. The computer network was named ARPANET (Advanced Research Projects Agency Network), and in 1969, within the framework of the project, it united these four scientific institutions. All work was funded by the US Department of Defense.

Joseph Licklider was appointed to lead the computer research program.


The first ARPANET server was installed on September 2, 1969 at the University of California, Los Angeles. The Honeywell DDP-516 computer had 24 KB of RAM.

On October 29, 1969, at 21:00, a communication session was held between the first two nodes of the ARPANET, located 640 km apart - at the University of California, Los Angeles and at the Stanford Research Institute. Charley Kline tried to connect remotely to a computer at the Stanford Research Institute.

His colleague Bill Duvall at Stanford confirmed the successful transmission of each entered character by telephone. At first only the three characters "LOG" were transmitted, after which the network ceased to function. "LOG" was to have been the word LOGIN (the login command). The system was returned to working order by 22:30, and the next attempt was successful. This date can be considered the birthday of the Internet.

By January 1971, the ARPANET consisted of 13 computers, and by April 1972 - of 23.

By 1971, the first program for sending e-mail over the network had been developed, and it immediately became very popular.

In 1972, Washington hosted the first International Conference on Computer Communications, attended by scientists from 10 countries. The ARPANET was presented to the conference participants.

In 1971-1974, the ARPANET connected the computers of leading laboratories and research centers in the United States, including non-military ones.

In 1973, the first foreign organizations from Great Britain and Norway were connected to the network via a transatlantic telephone cable, and the network became international.

In the 1970s, the network was primarily used for sending e-mail; the first mailing lists, newsgroups and message boards appeared. However, at that time the network could not yet easily interoperate with networks built on other technical standards.

By the end of the 1970s, data transfer protocols began to develop rapidly and were standardized in 1982-1983. Jon Postel played an active role in their development and standardization. On January 1, 1983, the ARPANET switched from the NCP protocol to TCP/IP, which is still successfully used to interconnect (or, as they say, "layer") networks.

It was in 1983 that the name "Internet" became attached to the ARPANET.

In 1984, the Domain Name System (DNS) was developed.

In 1984, the ARPANET acquired a serious rival: the US National Science Foundation founded the vast inter-university network NSFNet (National Science Foundation Network), which was made up of smaller networks (including the then-famous Usenet and Bitnet) and had much higher bandwidth than the ARPANET.

About 10 thousand computers connected to this network within a year, and the name "Internet" began to pass smoothly to NSFNet.

In 1988, the Internet Relay Chat (IRC) protocol was developed, making real-time communication (chat) possible on the Internet.

In 1989 in Europe, within the walls of the European Council for Nuclear Research, the concept of the World Wide Web was born. It was proposed by the famous British scientist Tim Berners-Lee, who over the course of two years developed the HTTP protocol, HTML language and URIs.

To interconnect IP networks and networks using other protocols, a special inter-network protocol had to be created. Such a protocol, named TCP, was created by Vinton Cerf and Robert Kahn in 1974. After the two protocols TCP and IP were consolidated into the single TCP/IP protocol in 1982, it became the standard protocol of the interconnected network - the Internet. In the same year, Cerf and his colleagues coined the term "Internet". Today Vinton Cerf is called the "Father of the Internet".

In 1990, the ARPANET ceased to exist, having completely lost the competition to NSFNet. In the same year, the first connection to the Internet via a telephone line (so-called "dial-up" access) was recorded.

In 1991, the World Wide Web became publicly available on the Internet, and in 1993 the famous NCSA Mosaic web browser appeared. The World Wide Web was gaining popularity.

In 1995, NSFNet returned to its role as a research network, with network providers, rather than National Science Foundation supercomputers, now routing all Internet traffic.

Also in 1995, the World Wide Web became the main carrier of information on the Internet, overtaking the FTP file transfer protocol in traffic, and the World Wide Web Consortium (W3C) was formed. We can say that the World Wide Web transformed the Internet and created its modern look. Since 1996, the World Wide Web has almost completely replaced the very concept of "the Internet".

In the 1990s, the Internet consolidated most of the then-existing networks (although some, like Fidonet, remained separate). The merger looked attractive thanks to the absence of a single leadership, and also thanks to the openness of the Internet's technical standards, which made networks independent of businesses and specific companies.

By 2013, 1.9 billion users were using the Internet on a regular basis. Currently, you can connect to the Internet through communication satellites, radio channels, cable TV, telephone lines, cellular networks, dedicated fiber-optic lines, or power lines. The World Wide Web has become an integral part of life in developed and developing countries.

Russia first gained access to the Internet in the early 1980s, when access was obtained by the I. V. Kurchatov Institute of Atomic Energy. In 1990, RELCOM, a network of UNIX users, was created.

The history of the creation and development of the global Internet.

In 1957, a separate structure emerged within the US Department of Defense - the Advanced Research Projects Agency (DARPA). The agency's main work was devoted to developing a method for connecting computers to one another. The global Internet began to develop in 1969 on the basis of DARPA's ARPANET (Advanced Research Projects Agency Network).

This network was designed to connect various scientific centers, military institutions and defense enterprises. For its time, ARPANET was an advanced and extremely robust closed system. It was meant to ease communication between the numerous organizations working for the defense industry and to create practically indestructible communication channels. In particular, when ARPANET was created, it was assumed that the system would continue to function in the face of a nuclear attack.

The project was based on three basic ideas:

  • each node in the network is connected to others, so there are several different paths from node to node;
  • all nodes and links are considered unreliable - there are automatically updated packet-forwarding tables;
  • a packet intended for a non-neighboring node is sent, according to the forwarding table, to the node closest to it; if that node is unavailable - to the next one, and so on.

These ideas were meant to ensure that the network kept functioning even if any number of its components were destroyed. In principle, the network could be considered operational even if only two computers remained working. A system built on this principle has no centralized control unit and can therefore easily change its configuration without the slightest damage to itself.
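
To make this concrete, below is a minimal sketch (in Python, with an invented four-node topology) of how per-node forwarding tables can deliver a packet even after a link is destroyed; real routers, of course, maintain far richer, automatically updated tables:

```python
# A toy model of decentralized packet forwarding (invented topology).
# Each node knows only the next hop toward each destination; there is no
# central computer, and a destroyed link simply forces a detour.
forwarding_tables = {
    "A": {"B": "B", "C": "C", "D": "B"},   # from A, packets for D go via B
    "B": {"A": "A", "C": "C", "D": "D"},
    "C": {"A": "A", "B": "B", "D": "D"},   # C offers an alternate path to D
    "D": {"A": "B", "B": "B", "C": "C"},
}

def forward(dest, start, down_links=frozenset()):
    """Hand a packet from node to node until it reaches its destination."""
    node, path = start, [start]
    while node != dest:
        next_hop = forwarding_tables[node][dest]
        if (node, next_hop) in down_links:   # preferred link destroyed: detour
            next_hop = next(n for n in forwarding_tables[node].values()
                            if (node, n) not in down_links and n not in path)
        node = next_hop
        path.append(node)
    return path

print(forward("D", "A"))                     # ['A', 'B', 'D']
print(forward("D", "A", {("B", "D")}))       # ['A', 'B', 'C', 'D'] - rerouted
```

Destroying any single link still leaves a route between the surviving nodes, which is exactly the resilience the designers were after.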

Initially, the network consisted of 17 minicomputers, each with 12 KB of memory. By April 1971, 15 nodes were connected to the network, and in 1975 the ARPANET already included 63 nodes.

In mid-1972, word began to spread among network users that sending a letter over the computer network was much faster and cheaper than by traditional means. This is how e-mail began to emerge - a service without which the Internet is impossible to imagine today.

Soon the UUCP (Unix-to-Unix Copy) program appeared, which led to the creation of the next service - Usenet (network news). This was the original name of a network that allowed a user to log in to the computer where information was posted and select all the materials of interest from there. Even at the initial stage of its development, the number of Usenet users tripled annually.


Soon enough, the architecture and principles of the ARPANET ceased to meet the requirements placed on it, and the need arose to create a universal data transfer protocol.

In 1974, the Internet Network Working Group created by DARPA developed a universal protocol for data transmission and internetworking - the Transmission Control Protocol / Internet Protocol (TCP/IP) - which remains the basis for the functioning of the Internet. In 1983, DARPA made the TCP/IP protocol obligatory on all ARPANET computers, after which the US Department of Defense divided the network into two parts: MILNET, separately for military purposes, and the ARPANET for scientific research.

Initially, the network was oriented only toward sending files and unformatted text. Many users, however, needed an infrastructure for working in a more convenient mode - in particular, for exchanging research results over the Internet in the form of the formatted and illustrated text that scientists are used to, including links to other publications. In 1989, the European Laboratory for Particle Physics (CERN, Geneva, Switzerland) developed the technology of hypertext documents - the World Wide Web - which allows access to any information on the network from computers around the world. This was the beginning of the World Wide Web, which by now has entwined almost the entire computer world with its threads and made the Internet accessible and attractive to millions of users.

In 1990, the ARPAnet ceased to exist, and the Internet emerged in its place.

The main features of the Internet:

  • versatility of the concept, independent of the internal structure of the connected networks and types of hardware and software;
  • maximum reliability of communication even over connections, facilities and equipment of deliberately low quality;
  • the ability to transfer large amounts of information.

The rapid expansion of the network led to a range of problems not envisioned in the original design and forced developers to find technologies for managing large distributed resources.

In the original design, the names and addresses of all computers connected to the Internet were stored in a single file, which was edited manually and then distributed throughout the Internet. But already by the mid-1980s it became clear that a central database was ineffective. First, requests to update the file would soon exceed the capacity of the people who handled them. Second, even if a correct central file existed, there was not enough network bandwidth either for frequent distribution to all locations or for immediate access to it from every location.

New protocols were developed, and a naming system was adopted across the unified Internet that allowed any user to automatically determine the address of a remote machine from its name. Known as the Domain Name System (DNS), this mechanism relies on machines called name servers to answer queries about names. No single machine contains the entire database of names; instead, the data is spread across many machines that use TCP/IP protocols to communicate with one another when answering requests.

Thus, today the Internet is a union of a huge number of different computer networks almost all over the world.

The global network of the Internet. Definition of the Internet

The Internet is a worldwide information computer network - a union of many regional computer networks and computers that exchange information with one another over public telecommunications channels (dedicated analog and digital telephone lines, optical channels, and radio channels, including satellite communication lines).

The Internet is a peer-to-peer network: all computers on the network are essentially equal, and any computer can be connected to any other. Any computer connected to the network can offer its services to any other. But the Internet is not only communication channels; the nodes of this worldwide union are computers that hold various information resources and offer various information and communication services.

Information on the Internet is stored on servers. Servers have their own addresses and are controlled by specialized programs. They allow you to transfer mail and files, search databases, and perform other tasks.

The exchange of information between the servers of the network is carried out via high-speed communication channels. Individual users' access to information resources on the Internet is usually carried out through a provider or corporate network.

A provider - a network service provider - is a person or organization that provides connection services to computer networks; typically an organization with a pool of modems for connecting clients and giving them access to the worldwide network.

There are also computers directly connected to the global network; they are called host computers. A host is any computer that is a permanent part of the Internet, i.e. one connected via the Internet protocol to another host, which in turn is connected to another, and so on.

(Figure: the structure of the global Internet.)

Almost all Internet services are based on the client-server principle.

The transmission of information on the Internet is made possible by the fact that every computer on the network has a unique address (an IP address), while network protocols ensure interaction between computers of different types running different operating systems.

All computers involved in data transmission use a single communication protocol suite, TCP/IP, which consists of two protocols defining different aspects of data transmission on the network:
1. TCP (Transmission Control Protocol) - the transmission control protocol: it "splits" the transmitted information into packets and corrects errors in the packets received by the recipient;
2. IP (Internet Protocol) - the internetworking protocol: it is responsible for addressing and allows a packet to travel across multiple networks on its way to its destination.

Information transfer via the TCP/IP protocols proceeds according to the following scheme: the TCP protocol breaks the information into packets and numbers them; the IP protocol then transmits the packets to the recipient; on arrival, the TCP protocol checks whether all packets have been received; once all packets have been delivered, TCP arranges them in the correct order and assembles them into a single whole.
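
This scheme can be illustrated with a toy simulation (an illustration only: real TCP/IP is implemented by the operating system, and applications never number or reassemble packets by hand):

```python
import random

# A toy model of the TCP/IP delivery scheme described above.
MESSAGE = b"All computers on the Internet speak TCP/IP."
PACKET_SIZE = 8

# "TCP", sender side: split the data into packets and number them.
packets = [(seq, MESSAGE[i:i + PACKET_SIZE])
           for seq, i in enumerate(range(0, len(MESSAGE), PACKET_SIZE))]

# "IP": packets may travel different routes and arrive in any order.
random.shuffle(packets)

# "TCP", receiver side: check completeness, restore order, reassemble.
assert sorted(seq for seq, _ in packets) == list(range(len(packets))), "packet lost!"
reassembled = b"".join(chunk for _, chunk in sorted(packets))

assert reassembled == MESSAGE
print(reassembled.decode())
```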

Any computer connected to the Internet has two unique addresses: a numeric IP address and a symbolic domain address. Addresses are assigned to computers as follows: the Network Information Center issues groups of addresses to the owners of local networks, who distribute those addresses as they see fit. A computer's IP address is 4 bytes long: the 1st and 2nd bytes are the network address, the 3rd byte is the subnet address, and the 4th byte is the address of the computer within the subnet. An IP address is written as four numbers from 0 to 255 separated by dots (for example, 145.37.5.150, where 145.37 is the network address, 5 is the subnet address, and 150 is the computer's address within the subnet). The domain address (from the English "domain" - an area), unlike the IP address, is symbolic and easier for a person to remember. For example, in computer.group.big.by, the domain computer is the name of the actual computer owning the IP address, the domain group is the name of the group that assigned that computer its name, the domain big is the name of the larger group that named the group, and by is the country domain. During data transfer, a domain address is converted into an IP address.
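
As a small illustration of this addressing arithmetic, the sketch below splits the sample address from the text into its four bytes (the network/subnet/host interpretation shown is the historical classful one described above):

```python
import ipaddress

# The sample address from the text, split into its four bytes.
address = "145.37.5.150"
b1, b2, b3, b4 = (int(part) for part in address.split("."))
assert all(0 <= b <= 255 for b in (b1, b2, b3, b4))
print(f"network {b1}.{b2}, subnet {b3}, host {b4}")  # network 145.37, subnet 5, host 150

# The standard library confirms the dotted form is just 4 bytes.
print(ipaddress.ip_address(address).packed)          # b'\x91%\x05\x96'
```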

Thus, the Internet is a global computer system that:
- is logically interconnected by a space of globally unique addresses (every computer connected to the network has its own unique address);
- is able to maintain communications (the exchange of information);
- provides the operation of high-level services, for example the WWW, e-mail, teleconferences, network chats and others.

Concept and types of services

Servers are network nodes designed to service requests from clients - software agents that extract information or transfer it to the network and work under the direct control of users. Clients provide information in an understandable and user-friendly form, while servers perform the service functions of storing, distributing, managing information and issuing it at the request of clients. Each type of service on the Internet is provided by appropriate servers and can be used with the help of appropriate clients.

The most suitable classification of Internet services is the division into interactive services, direct-access services, and deferred-reading services. These groups combine services according to a large number of characteristics. Deferred-reading services are the most widespread, the most versatile, and the least demanding of computer resources and communication lines; e-mail, for example, belongs to this group.

Direct-access services are characterized by the fact that requested information is returned immediately, although an immediate response is not required from the recipient. Services in which the received information is, in effect, a request requiring an immediate reaction belong to the interactive services.

Currently, there are quite a large number of services on the Internet that provide work with the entire spectrum of resources. The most famous among them are:

DNS service
The DNS service, or domain name system, makes it possible to use mnemonic names instead of numeric addresses when addressing network nodes. DNS is a distributed computer system for obtaining information about domains. It is most often used to obtain an IP address from a host name (of a computer or device), or to obtain information about mail routing and about the hosts serving particular protocols in a domain.
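
A minimal illustration of using the DNS service from a program, through the resolver built into the operating system (the queried name is just an example host):

```python
import socket

# Ask the system resolver (and through it, the DNS name servers)
# for the addresses behind a name.
print(socket.gethostbyname("example.com"))      # one IPv4 address

for family, _, _, _, sockaddr in socket.getaddrinfo("example.com", 80):
    print(family.name, sockaddr[0])             # every known address record
```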

Email
Electronic mail (e-mail) is designed for sending information to a specific user of the global network. Each user must have a mailbox - a folder on a server where the user's incoming and outgoing messages are stored. In addition, modern e-mail allows you to send a message to several subscribers at once, forward letters to other addresses, turn on an autoresponder so that a reply is automatically sent to every incoming letter, and create rules for performing certain actions on messages of a given type (for example, deleting advertising messages arriving from certain addresses), and so on. An attachment - any other file - can be added to a letter. For many companies, e-mail is not just mail but the backbone of the entire business process, and many computer applications have built-in e-mail support. E-mail is one of the most widespread services on the Internet; mailing lists also work through e-mail.
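
As an illustration, here is a minimal sketch of sending a letter with Python's standard SMTP library; the server name, port, login and addresses are placeholders rather than a real mail service:

```python
import smtplib
from email.message import EmailMessage

# Compose the letter. Addresses, server and credentials are placeholders.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Greetings over the network"
msg.set_content("E-mail was one of the Internet's first services.")
# msg.add_attachment(...) would add an attached file, as described above.

with smtplib.SMTP("mail.example.com", 587) as server:  # hypothetical server
    server.starttls()               # encrypt the session before logging in
    server.login("alice", "app-password")               # placeholder login
    server.send_message(msg)
```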

Mailing lists
The mailing lists (maillists) are a simple yet very useful service on the Internet. This is practically the only service that does not have its own protocol and client program and works exclusively via e-mail.
The idea behind a mailing list is that there is an e-mail address which is in fact the common address of the many people subscribed to the list. A letter sent to this address (for example, to a list dedicated to discussing the problems of localizing UNIX-class operating systems) is received by every person subscribed to that mailing list.

Usenet Networking News
Teleconferences, or newsgroups (Usenet), which provide collective messaging, are also an Internet service. Whereas e-mail passes messages one-to-one, network news passes them one-to-many. Usenet is a worldwide discussion club. It consists of a set of newsgroups whose names are organized hierarchically according to the topics discussed. Messages ("articles") are sent to these conferences by users with the help of special software; once sent, they are distributed to news servers and made available for reading by other users.

You can send a message and later view the responses to it. Since many people read the same material, reviews accumulate. All messages on one topic form a "thread"; thus, although the responses may have been written at different times and mixed in with other messages, they still form a coherent discussion. You can subscribe to any conference, view the headers of its messages with a news reader, sort messages by topic to make the discussion easier to follow, and add your own messages with comments and questions. Newsreaders such as Netscape News, built into the Netscape Navigator browser, or Microsoft's Internet News, supplied with later versions of Internet Explorer, are used to read and send messages.

FTP service
The FTP service is a file archive system that provides storage and transfer of files of various types, and it is another widespread Internet service. The FTP service gives remote access to a server's file system and thereby to file archives - gigantic volumes of information on the Internet. An FTP server can be configured so that you can connect to it not only under your own name and password but also under the conventional name anonymous; in that case, only a certain set of files on the server - a public file archive - becomes available to you.
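
A small sketch of anonymous access to a public file archive with Python's standard ftplib; the server name and file names are placeholders:

```python
from ftplib import FTP

# Anonymous login to a public file archive (placeholder host and paths).
with FTP("ftp.example.com") as ftp:
    ftp.login()                          # no arguments = user "anonymous"
    ftp.cwd("/pub")                      # public archives often live in /pub
    print(ftp.nlst())                    # list files offered for download
    with open("readme.txt", "wb") as local_file:
        ftp.retrbinary("RETR readme.txt", local_file.write)
```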

IRC service
Service IRC - Internet Relay Chat, designed to support text communication in real time.
There are thousands of Internet Relay Chat (IRC) servers on the Internet providing interactive communication. Any user can connect to such a server and start communicating with one of its visitors or take part in a collective "meeting". Messages are relayed within the server. The simplest form of communication is chat - an exchange of messages typed at the keyboard. If the interlocutors' computers are equipped with a sound card, microphone and speakers, they can exchange audio messages, although a "live" conversation is possible between only two interlocutors at a time. To see each other, that is, to exchange video images, video cameras must be connected to the computers. Organizing interactive communication requires special software (for example, NetMeeting, which was supplied with Windows).
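
Since IRC is a plain-text protocol, a minimal session can be sketched over a raw TCP socket; the server, nickname and channel below are placeholders:

```python
import socket

# IRC commands are plain text lines terminated by CRLF.
with socket.create_connection(("irc.example.net", 6667)) as sock:
    def send(line: str) -> None:
        sock.sendall((line + "\r\n").encode())

    send("NICK student")                    # choose a nickname
    send("USER student 0 * :Term paper")    # register with the server
    send("JOIN #history")                   # enter a collective "meeting"
    send("PRIVMSG #history :Hello from the 1988 protocol!")
    print(sock.recv(4096).decode(errors="replace"))  # the server's reply
    send("QUIT :bye")
```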

Infrastructure services
FTP described above is an example of an Internet infrastructure service, that is, a service based on software tools usually supplied as part of the operating system.

The Telnet service is designed for managing remote computers in terminal mode. It is also used as a means of access to remote information services that operate in text-terminal mode. Telnet is used as part of Internet information services when, on connecting, the user enters not a command interpreter but a specialized program that provides access to information resources.

In this way you can work with the catalogues of some libraries or with the server of an information system, and you can even reach a terminal-mode WWW navigator (text or graphical).

Hypermedia system WWW
The World Wide Web (WWW, W3) is a hypertext (hypermedia) system designed to integrate various network resources into a single information space. It is a distributed system providing access to interlinked documents located on various computers connected to the Internet.

The World Wide Web is formed by hundreds of millions of web servers. Most of its resources are based on hypertext technology, and hypertext documents posted on the World Wide Web are called web pages. Several web pages united by a common theme and design, connected by links and usually located on the same web server, are called a website. Special programs - browsers - are used to download and view web pages.
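
A minimal sketch of what a browser does at the network level - request a URL over HTTP and receive an HTML hypertext document in reply (example.com is a standard demonstration host):

```python
from urllib.request import urlopen

# Request a URL over HTTP and receive a hypertext (HTML) document.
with urlopen("http://example.com/") as response:
    print(response.status)                  # 200 = the page was delivered
    html = response.read().decode("utf-8")

print(html[:60])                            # the page begins with HTML markup
```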

The World Wide Web has caused a real revolution in information technology and an explosion in the development of the Internet. Often, when talking about the Internet, they mean the World Wide Web, but it is important to understand that they are not the same thing.

The services listed above are standard: the principles for building client and server software, as well as the interaction protocols, are formulated as international standards. Consequently, software developers must comply with common technical requirements in their practical implementations.
Along with standard services there are also non-standard ones - original developments of a particular company. Examples include various Instant Messenger systems (a kind of Internet pager - ICQ, AOL, Demos on-line, etc.), Internet telephony systems, radio and video broadcasting, and so on. An important feature of such systems is the absence of international standards, which can lead to technical conflicts with other similar services.

The main stages of the creation and development of the Internet

The predecessor of the modern Internet was the ARPANET of the US Department of Defense. The development of the network was entrusted to the University of California, Los Angeles, the Stanford Research Center, the University of Utah, and California State University, Santa Barbara. The computer network was named ARPANET (Advanced Research Projects Agency Network), and in 1969, within the framework of the project, it united these four scientific institutions. All work was funded by the US Department of Defense. The ARPANET then began to grow and develop actively, and scientists from various fields of science began to use it.

Within five years of its appearance, the Internet reached an audience of over 50 million users.

The first ARPANET server was installed on September 2, 1969 at the University of California, Los Angeles. The Honeywell DDP-516 computer had 24 KB of RAM.

On October 29, 1969, at 21:00, a communication session was held between the first two nodes of the ARPANET, located 640 km apart - at the University of California, Los Angeles (UCLA) and at the Stanford Research Institute (SRI). Charley Kline was trying to connect remotely from Los Angeles to a computer at Stanford, while his colleague Bill Duvall at Stanford confirmed the successful transmission of each entered character by phone. At first only the three characters "LOG" were transmitted, after which the network ceased to function. "LOG" was to have been the word LOGIN (the login command). The system was returned to working order by 22:30, and the next attempt was successful.

By 1971, the first program for sending e-mail over the network had been developed, and it immediately became very popular.
In 1973, the first foreign organizations, from Great Britain and Norway, were connected to the network via a transatlantic telephone cable, and the network became international.

In the 1970s, the network was primarily used for sending e-mail; the first mailing lists, newsgroups and message boards appeared. However, at that time the network could not yet easily interoperate with networks built on other technical standards.

By the end of the 1970s, data transfer protocols began to develop rapidly and were standardized in 1982-1983. Jon Postel played an active role in their development and standardization.

On January 1, 1983, the ARPANET switched from the NCP protocol to TCP/IP, which is still used to interconnect (or, as they say, "layer") networks. It was in 1983 that the name "Internet" became attached to the ARPANET.

In 1984, the Domain Name System (DNS) was developed. In the same year the ARPANET acquired a serious rival: the US National Science Foundation (NSF) founded the vast inter-university network NSFNet (National Science Foundation Network), which was made up of smaller networks (including the then-famous Usenet and Bitnet) and had much higher bandwidth than the ARPANET. About 10 thousand computers connected to this network within a year, and the name "Internet" began to pass smoothly to NSFNet.

In 1988, the Internet Relay Chat (IRC) protocol was developed, making real-time communication (chat) possible on the Internet.

In 1989, in Europe, within the walls of the European Council for Nuclear Research (CERN), the concept of the World Wide Web was born. It was proposed by the famous British scientist Tim Berners-Lee, who over the course of two years developed the HTTP protocol, HTML language and URIs.

In 1990, the ARPANET ceased to exist, having completely lost the competition to NSFNet. In the same year, the first connection to the Internet via a telephone line (so-called dial-up access) was recorded.

In 1991, the World Wide Web became publicly available on the Internet, and in 1993 the famous NCSA Mosaic web browser appeared. The World Wide Web was gaining popularity.

In 1995, NSFNet returned to its role as a research network, with network providers, rather than National Science Foundation supercomputers, now routing all Internet traffic. Also in 1995, the World Wide Web became the main carrier of information on the Internet, overtaking the FTP file transfer protocol in traffic, and the World Wide Web Consortium (W3C) was formed. We can say that the World Wide Web transformed the Internet and created its modern look. Since 1996, the World Wide Web has almost completely replaced the very concept of "the Internet".

In the 1990s, the Internet consolidated most of the then-existing networks (although some, like Fidonet, remained separate). The merger looked attractive thanks to the absence of a single leadership, and also thanks to the openness of the Internet's technical standards, which made networks independent of businesses and specific companies.

By 1997, there were already about 10 million computers on the Internet and more than 1 million registered domain names. The Internet had become a very popular medium for the exchange of information.

Since January 22, 2010, the crew of the International Space Station has had direct access to the Internet.

Web 1.0 and Web 2.0

The Internet boom is usually associated with the steady commercial growth of Internet companies that followed the advent of the World Wide Web, beginning with the first release of the Mosaic web browser in 1993 and continuing throughout the 90s.
The short (by historical standards) life of the WWW service demonstrated its relevance to an ever-increasing number of users. More and more companies reoriented themselves toward Internet business with a large share of advertising rather than Internet services. Between 1995 and 2001, Internet technology was overvalued: the dot-com bubble, which peaked on March 10, 2000, led to a wave of bankruptcies and a loss of confidence in the securities of high-tech firms providing services over the Internet. The recovery that followed in 2002 brought new high-tech Internet companies and the rapid development of Internet services, and became a strong incentive for developing web concepts and technologies that improve the user's experience. The massive adoption of these solutions produced qualitative changes in the World Wide Web - a change of the Web's "version", so to speak. At the moment, Internet analysts distinguish web 1.0 and web 2.0 resources, and the concept of web 3.0 services already exists (it should be noted that this division is conditional and is often criticized).

Web 1.0
Web 1.0 is a retronym for the state of the WWW and for any style of website design used before the appearance of the term Web 2.0. Web 1.0, or the "classic web", consists of static sites - a kind of web library made by the few for the many, in which sites differ mainly in the technology used. Typical examples of Web 1.0 are sites consisting of many linked static web pages whose information is created and modified only by the site's developer. From 1998, guest books and forums were widely used to add interactivity to sites (although these functions had been available earlier). Such sites are sometimes called web 1.5, emphasizing users' ability to communicate, the presence of profiles, and the formation of Internet communities. Even so, the user could not yet create or modify content - that remained the prerogative of the site's administrators.

There were no developed chat rooms; mostly IRC and ICQ were used, and e-mail even more. Few people created proper sites of their own, and many poor-quality sites were created on free hosting.

Versions of sites were created for different encodings and browsers, depending on the users' software. Domain registration and payment for proper paid hosting were out of reach for most; only a small number of people had them. There were no blogs, web services, or wiki projects.

The main characteristics of Web 1.0 are an unchanging site structure, static information, a time-consuming process of updating pages and creating new resources, a one-way flow of information, centralized website content, and a small number of users.
Web 1.0 is a generic term describing the state of the World Wide Web during its first decade. The 1990s were characterized by users' low computer literacy, slow connection types, and a limited range of Internet services. The websites of that time had the following main features:
- static web-page content, created and maintained by the website's developers;
- frame-based and/or table-based layout;
- low-quality markup (content was often plain text borrowed from Usenet newsgroups and similar sources and enclosed in a <pre> tag);
- widespread use of non-standard tags supported only by a specific browser;
- the use of physical markup or inline styles; embedded and, even more so, linked style sheets were rare;
- notices stating the recommended browser version and monitor resolution at which the site's design displayed correctly;
- guest books, forums, or chats as tools for feedback and interactivity;
- the use of graphic and text informers (weather, the dollar exchange rate, etc.) to aggregate information.

In the first decade of the Internet, the Web 1.0 era, the very foundation of the Web was laid, making vast amounts of information accessible to a wide range of Internet users.

The conventional end of the Web 1.0 era is dated to 2001, when the shares of Internet companies collapsed. Existing sites did not disappear, but newly created ones departed further and further from the typical Web 1.0 model.

Web 2.0
Web 2.0 is a set of web technologies focused on the active participation of users in the creation of website content.
The emergence of the name Web 2.0 is usually associated with Tim O'Reilly's article "What Is Web 2.0" dated September 30, 2005. In this article, Tim O'Reilly linked the emergence of a large number of sites, united by some common principles, with the general trend of the development of the Internet community, and called this phenomenon Web 2.0, as opposed to the "old" Web 1.0. Despite the fact that the meaning of this term is still the subject of much debate, those researchers who recognize the existence of Web 2.0 highlight several main aspects of this phenomenon.

The phrase Web 2.0 was first used in 2004 by O'Reilly Media, an information-technology publisher. A little later, the company's founder, Tim O'Reilly, formulated some of the Web 2.0 principles. Over the years, the Web 2.0 space has expanded, displacing the traditional web services that came to be known as Web 1.0.

The features of Web 2.0 are:
- involving the "collective mind" in filling the site with content;
- interaction between sites by means of web services;
- updating web pages without reloading them;
- aggregation and syndication of information;
- combining various services to obtain new functionality;
- design based on style markup, with an emphasis on usability.

The main elements of Web 2.0:
Web services are network applications accessible over the HTTP protocol that use XML-based formats (XML-RPC, SOAP) or the REST approach for communication. As a result, software can use web services instead of implementing the required functionality itself (for example, checking a postal address entered in a form). Tools for working with HTTP and XML exist in every modern programming language, so web services are platform-independent.
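
For illustration, a minimal TypeScript sketch of the address-checking example is given below. It assumes a browser environment; the endpoint URL and the XML request/response shapes are invented for the example:

    // Minimal sketch of calling an XML web service over HTTP.
    // The endpoint and the <valid> element are assumptions for illustration;
    // real services publish their own WSDL or REST documentation.
    async function validatePostalAddress(address: string): Promise<boolean> {
      const requestBody =
        `<?xml version="1.0"?><request><address>${address}</address></request>`;
      const response = await fetch("https://example.com/validate", {
        method: "POST",
        headers: { "Content-Type": "application/xml" },
        body: requestBody,
      });
      const xmlText = await response.text();
      // Parse the XML reply and read the hypothetical <valid> element.
      const doc = new DOMParser().parseFromString(xmlText, "application/xml");
      return doc.querySelector("valid")?.textContent === "true";
    }

Because both sides exchange plain XML over HTTP, the service could just as well be consumed from Java, Python, or any other platform.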

AJAX (Asynchronous JavaScript and XML) is an approach to building web-application user interfaces in which a web page asynchronously loads the data the user needs without reloading. The use of Ajax became especially popular after Google began to use it actively on sites such as Gmail and Google Maps. Ajax is often treated as a synonym for Web 2.0, which is not the case at all: Web 2.0 is not tied to any single technology or set of technologies, and Flash 4 already made it possible to refresh parts of a page asynchronously as early as 1999.
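
A minimal sketch of this pattern in TypeScript, using the XMLHttpRequest object that made the approach famous; the URL /api/news and the page element "news" are placeholders:

    // Classic Ajax: fetch data in the background with XMLHttpRequest and
    // update one fragment of the page instead of reloading the whole page.
    function loadNews(): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/api/news", true); // true = asynchronous request
      xhr.onreadystatechange = () => {
        if (xhr.readyState === XMLHttpRequest.DONE && xhr.status === 200) {
          const target = document.getElementById("news");
          if (target) {
            target.textContent = xhr.responseText; // update only this fragment
          }
        }
      };
      xhr.send();
    }

The page stays visible and interactive while the request runs; only the "news" element changes when the data arrives.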

Web syndication is the simultaneous distribution of information, including audio and video, to various pages or websites, usually by means of the RSS or Atom technologies. The principle is to distribute the titles of materials and links to them (for example, the latest forum posts). The technology was originally used by news resources and blogs, but its scope gradually expanded.
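
The consuming side of syndication can be sketched in TypeScript: the feed is downloaded and the title and link of each item are extracted. The code assumes a browser environment and an RSS 2.0 feed; the feed URL is supplied by the caller:

    // Sketch of reading an RSS 2.0 feed: download the XML and collect the
    // title and link of every <item> element.
    async function readFeed(feedUrl: string): Promise<{ title: string; link: string }[]> {
      const xmlText = await (await fetch(feedUrl)).text();
      const doc = new DOMParser().parseFromString(xmlText, "application/xml");
      return Array.from(doc.querySelectorAll("item")).map((item) => ({
        title: item.querySelector("title")?.textContent ?? "",
        link: item.querySelector("link")?.textContent ?? "",
      }));
    }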

A mash-up (literally, "mixing") is a service that fully or partially uses other services as sources of information, giving the user new functionality. Such a service can in turn become a new source of information for other mash-up services, so a network of mutually dependent, integrated services takes shape. For example, a transport company's site may use the maps of the Google Maps service to track the location of cargo being carried.
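
The cargo example can be sketched in TypeScript as follows. The cargo-tracking endpoint and its JSON response format are hypothetical; the final line builds an ordinary Google Maps link from the received coordinates:

    // Sketch of a mash-up: one service (a hypothetical cargo-tracking API)
    // supplies coordinates, another (Google Maps) displays them.
    interface CargoPosition { lat: number; lng: number; }

    async function cargoMapLink(cargoId: string): Promise<string> {
      // Hypothetical endpoint returning {"lat": ..., "lng": ...} as JSON.
      const response = await fetch(`https://example.com/cargo/${cargoId}/position`);
      const pos: CargoPosition = await response.json();
      // Combine the two services: a maps link for the cargo's position.
      return `https://www.google.com/maps?q=${pos.lat},${pos.lng}`;
    }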

Tags are keywords that describe the object in question or assign it to a category. They are a kind of label attached to an object to determine its place among other objects.
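
A tag index can be sketched as a simple data structure; in this hypothetical TypeScript example, each tag maps to the set of identifiers of the objects labelled with it:

    // Each tag maps to the set of objects labelled with it, so objects
    // can be found by category-like keywords.
    const tagIndex = new Map<string, Set<string>>();

    function addTag(objectId: string, tag: string): void {
      if (!tagIndex.has(tag)) {
        tagIndex.set(tag, new Set());
      }
      tagIndex.get(tag)!.add(objectId);
    }

    function findByTag(tag: string): string[] {
      return Array.from(tagIndex.get(tag) ?? []);
    }

    addTag("photo-42", "sunset");
    addTag("photo-17", "sunset");
    findByTag("sunset"); // ["photo-42", "photo-17"]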

Socialization is the use of technologies that make it possible to build user communities. The concept of a site's socialization also includes individual site settings and a personal zone (personal files, images, videos, blogs), so that users feel their own uniqueness, along with the encouragement of, support for, and trust in "collective intelligence". When a community is forming, a competitive element such as reputation or karma matters greatly: it allows the community to regulate itself and gives users additional goals for staying on the site.
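
The reputation mechanism mentioned above can be sketched as follows; all names and the threshold value in this TypeScript fragment are invented for illustration:

    // Sketch of a karma mechanism of the kind Web 2.0 communities use for
    // self-regulation: votes from other users raise or lower a member's
    // reputation, and privileges can be tied to thresholds.
    const karma = new Map<string, number>();

    function vote(userId: string, delta: 1 | -1): void {
      karma.set(userId, (karma.get(userId) ?? 0) + delta);
    }

    function canModerate(userId: string): boolean {
      // Hypothetical rule: moderation rights require karma of 100 or more.
      return (karma.get(userId) ?? 0) >= 100;
    }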

Design. The Web 2.0 concept is also reflected in design. Rounded corners, imitations of convex surfaces, and imitations of reflections in the manner of the glossy plastic of modern hi-end devices (media players, for example) became the preferred style. On the whole, such sites look more pleasing to the eye, although their graphics take up more space than an ascetic design would. The trend is partly due to the simultaneous release of new operating-system versions that use the same ideas. Lately, however, the monotony of such sites has become obvious, and the classic Web 2.0 graphic look is increasingly considered outdated and uncreative. This shows especially in the current trend toward informative sites in which simplicity, elegance, and usability play the main role. Design should impose no restrictions, yet Web 2.0 instills them.

Disadvantages of Web 2.0
When you use Web 2.0 technologies, you effectively rent a service and/or disk space from a third-party company. This dependence gives the new services a number of disadvantages:
- sites depend on the decisions of third-party companies, and the quality of a service depends on the quality of many other companies' work;
- the current infrastructure is poorly suited to performing complex computational tasks in the browser;
- confidential data stored on third-party servers is vulnerable to intruders (cases of theft of users' personal data and mass hacking of blog accounts are known).

We are now at the end of the Web's second decade; in the Web 2.0 era, a variety of user interfaces have been developed that allow users to manage the content of the Internet themselves and to communicate with one another.

Web 3.0
Web 3.0 is a fundamentally new approach to processing the information presented on the World Wide Web, and above all a different approach to how the user community processes information. The term Web 3.0 is also often identified with the concept of the Semantic Web: "part of the global concept of the development of the Internet whose purpose is to make the information available on the World Wide Web processable by machines. The concept's main emphasis is on working with metadata that unambiguously characterize the properties and content of World Wide Web resources, instead of the textual analysis of documents used today" (Wikipedia). In other words, it is a kind of network over the Network, containing metadata about the resources of the World Wide Web and existing in parallel with them.
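
The triple-based metadata model at the heart of the Semantic Web can be sketched in TypeScript; the resource URIs here are invented, while dc:creator and dc:subject are real Dublin Core properties:

    // Resources are described by subject-predicate-object triples (the RDF
    // model), which machines can query directly instead of analysing the
    // text of documents.
    type Triple = { subject: string; predicate: string; object: string };

    const metadata: Triple[] = [
      { subject: "http://example.com/page1", predicate: "dc:creator", object: "A. Author" },
      { subject: "http://example.com/page1", predicate: "dc:subject", object: "Internet history" },
    ];

    // Find every value of a given property for a given resource.
    function valuesOf(subject: string, predicate: string): string[] {
      return metadata
        .filter((t) => t.subject === subject && t.predicate === predicate)
        .map((t) => t.object);
    }

    valuesOf("http://example.com/page1", "dc:creator"); // ["A. Author"]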

Alternative theory of Web 3.0
Web 3.0 is a concept of Internet-technology development formulated by the head of Netscape.com, Jason Calacanis, as a continuation of Tim O'Reilly's Web 2.0 concept. Its essence is that Web 2.0 is only a technological platform, while Web 3.0 will allow professionals to create high-quality content and services on top of it.
The definition was posted on Calacanis's personal blog on March 10, 2007. Calacanis noted that Web 2.0 makes it possible to use, quickly and almost free of charge, a significant number of powerful Internet services with high consumer value, among which users can choose according to the data that interests them (behavioral factors).

From Web 1.0 to Web 3.0 spans three decades. The comparison below summarizes the differences between Web 2.0 and Web 3.0, the latter also known as the Semantic Web; but before comparing Web 2.0 and Web 3.0, it is useful to compare Web 1.0 with Web 2.0. In each row, the three values refer to Web 1.0, Web 2.0, and Web 3.0 in turn:

- Data granularity: low (HTML page) / medium (XML tag) / high (RDF objects).
- Services provided: search (information can be found, but results are imprecise) / communities (social media, blogs) / search (results are precise and differ from user to user according to preferences).
- User participation: low / medium / high.
- User satisfaction with sites: low / medium / high.
- Data referencing (linking): low (documents) / medium (documents) / high (documents and their individual parts).
- Subjectivity: high / medium (users can choose partners via friend lists or restrict access to data in blogs) / low (anyone can access a resource via its URI).
- Content transclusion: low / medium ("mixing" of data, driven by application code) / high (data-driven mixing).
- What You See Is What You Prefer (WYSIWYP): low / medium / high (customizable descriptions of resource views, targeted search).
- Data availability (open access to data): low / medium (access through server applications acting as "data bins") / high (direct access).
- User identification tools: weak / medium (OpenID) / strong (FOAF+SSL).
- System deployment model: centralized / centralized, with part of the authority delegated to the user (registering a new user automatically creates an environment for them) / distributed, with dedicated centralized functions.
- Data model: logical (hierarchical, DOM-based) / logical (hierarchical, XML-based) / conceptual (RDF graphs).
- User interface: dynamically generated on the server, static on the client / dynamically generated on the server, with partial modification on the client side (XSLT, XQuery/XPath) / fully dynamic, built from self-describing RDF.
- Data query capabilities: full-text search / full-text search / full-text search plus search in graph structures using SPARQL (SPARQL Protocol and RDF Query Language).
- The Web as a medium: represents the opinion of the author or publisher / reflects the opinion of a social group of peer contributors and commentators, where the popularity of information matters / represents the opinion of a social group supported by expert assessments.
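
Since the comparison mentions SPARQL, a minimal TypeScript sketch of querying an RDF store over HTTP is given below; the endpoint URL is a placeholder, while the "query" parameter and the Accept header follow the standard SPARQL protocol:

    // Send a SPARQL query to an endpoint and receive JSON results.
    async function runSparql(endpoint: string, query: string): Promise<unknown> {
      const response = await fetch(endpoint + "?query=" + encodeURIComponent(query), {
        headers: { Accept: "application/sparql-results+json" },
      });
      return response.json();
    }

    // Example query: the titles of up to ten documents, matched as a
    // graph pattern rather than by full-text search.
    const query = `
      SELECT ?title WHERE {
        ?doc <http://purl.org/dc/elements/1.1/title> ?title .
      } LIMIT 10`;

    runSparql("https://example.com/sparql", query);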

Development prospects

It is very difficult to predict the development of such a complex and large-scale phenomenon as the Internet. One thing is certain: network technologies will play a huge role in the life of the information society.

Currently, the Internet is growing exponentially: every one and a half to two years, its main quantitative indicators double. This applies to the number of users, the number of connected computers, the volume of information and traffic, and the number of information resources.
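
The doubling claim corresponds to simple exponential growth, which can be expressed as a small TypeScript calculation; the initial value and the periods below are illustrative only:

    // size(t) = size(0) * 2^(t / T), where T is the doubling period in
    // years (1.5 to 2 according to the text above).
    function projectedSize(initial: number, years: number, doublingPeriod: number): number {
      return initial * Math.pow(2, years / doublingPeriod);
    }

    // Illustrative figures: 10 million users doubling every 2 years
    // grow to 80 million in 6 years.
    projectedSize(10_000_000, 6, 2); // 80,000,000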

The Internet is developing rapidly, and qualitatively as well. The boundaries of its application in human life are constantly expanding, entirely new kinds of network services appear, and telecommunication technologies find their way even into household appliances.

The life of modern society is becoming more and more computerized. Requirements for the efficiency and reliability of information services are growing, and new types of services are appearing. Scientists are already developing fundamentally new forms of global information networks. In the not-too-distant future, many processes of network design, administration, and maintenance will be fully automated.
