At the turn of the 21st century, information, including access to the Internet, became the basis for personal, economic and political progress. A popular name for the Internet is the "information highway": it has become the place one goes to find the latest financial news, to browse library catalogs, to exchange information with colleagues or to take part in a lively political debate. The Internet is the tool that takes you, beyond telephones, faxes and isolated computers, into a rapidly growing network of information without borders.
The Internet complements the traditional tools you use to gather information and graphical data, follow the news and connect with others. The Internet is shrinking the world, bringing information, skills and knowledge on almost every subject imaginable directly to your computer.
The Internet is what we call a meta-network, that is, a network of networks covering the entire world. It is impossible to give an exact count of the networks or users that make up the Internet, but the number of users alone easily runs into the billions (4.57 billion Internet users in the first quarter of 2020, according to the site blogdumoderateur).
The Internet was first designed by the US Department of Defense as a means of protecting government communications systems in the event of a military attack. The original network, named ARPANET (after the Advanced Research Projects Agency that developed it), evolved into a communication channel between the contractors, military personnel and academic researchers who contributed to ARPA projects.
In 1958, the Advanced Research Projects Agency (ARPA) was created in the United States to lead a small number of projects aimed at ensuring American scientific and technical predominance over the Soviet Union. The organization brought together some of the United States' most distinguished scientists.
In 1967, Lawrence G. Roberts, who had recently taken charge of the ARPA computer network project, presented his plan for the ARPANET (Advanced Research Projects Agency NETwork). That same year, Donald Davies and Roger Scantlebury of the National Physical Laboratory (NPL) in the United Kingdom announced the design of a packet-switching network.
In 1969, the ARPANET began to operate, initially linking four university sites. Using this connection, these four facilities were able to transfer data and run lengthy calculations remotely on several computers.
During the 1970s, research laboratories gradually linked up to the ARPANET.
In 1970, the Network Control Protocol (NCP) was used on ARPANET with the aim of linking heterogeneous devices (IBM, Unix, etc.).
In 1983, the NCP was definitively abandoned in favor of the Transmission Control Protocol/Internet Protocol (TCP/IP), which is still in use today and remains the main protocol of the Internet. TCP is responsible for segmenting a message into packets and reassembling them once they are received, while the role of IP is to ensure that the packets pass from one computer to another until they reach their destination.
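To make this division of labor concrete, here is a minimal sketch, written in Python purely for illustration (it is not part of the original text): the application simply writes a message to a TCP connection and reads the reply, while the segmentation into packets and their routing by IP are handled invisibly by the operating system. The host and port values are hypothetical placeholders.

import socket

HOST, PORT = "127.0.0.1", 9000   # hypothetical local endpoint, chosen for illustration

def run_server() -> None:
    # Accept one connection and echo back whatever TCP delivers, in order.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            while True:
                chunk = conn.recv(1024)   # TCP reassembles packets into an ordered byte stream
                if not chunk:
                    break
                conn.sendall(chunk)

def run_client(message: bytes) -> bytes:
    # Send a message and read the echo; the split into IP packets is invisible here.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(message)              # TCP segments the message into packets as needed
        cli.shutdown(socket.SHUT_WR)
        received = b""
        while True:
            chunk = cli.recv(1024)
            if not chunk:
                break
            received += chunk
        return received

Running run_server() in one process and run_client(b"hello") in another returns exactly the bytes that were sent, however IP chose to break them into packets along the way.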
In 1977, TCP/IP had already been used to link several networks to the ARPANET. More than a hundred computers were connected, and from this point on, the number continued to increase year after year.
In March 1989, Tim Berners-Lee, a computer scientist at CERN (the European Council for Nuclear Research), proposed linking the documents held at CERN by hyperlinks, with the aim of helping physicists to find information. The origin of the Web dates back to this proposal.
The early 1990s saw the birth of the Internet as we know it today: the Web was defined as a collection of HTML (HyperText Markup Language) pages combining text, images and links, each reachable via a URL (Uniform Resource Locator) and transferred using HTTP (HyperText Transfer Protocol).
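As a purely illustrative aside (not part of the original text), the short Python sketch below shows how these three building blocks fit together: a URL names a page, HTTP is the protocol used to fetch it, and the document that comes back is HTML. The address used is the generic example.com placeholder domain.

from urllib.parse import urlparse
from urllib.request import urlopen

url = "http://example.com/"   # illustrative address (a reserved example domain)

parts = urlparse(url)
print(parts.scheme)   # "http"        -> the transfer protocol (HTTP)
print(parts.netloc)   # "example.com" -> the server hosting the page
print(parts.path)     # "/"           -> the page being requested

with urlopen(url) as response:        # performs an HTTP GET request for the URL
    html = response.read().decode("utf-8", errors="replace")

# The response body is an HTML page: text plus markup such as <a href="..."> tags,
# whose href attributes are the hyperlinks that weave pages together into the Web.
print(html[:300])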
In 1991, in Geneva, Tim Berners-Lee released the Internet interface known as the World Wide Web (WWW), opening the network up to the general public by making it much easier to consult websites.
In 1991, 300,000 computers were connected, with this figure reaching 1,000,000 by 1992.
In 1992, the first hyperlink giving access to CERN's website was set up on the Fermilab server in the United States: this was the beginning of the weaving of the WWW. The Net continued to expand at an exponential rate during the 1990s under the impetus of the Web.
The year 1993 saw the birth of the first web browser to support both text and images, Mosaic, designed by the team that would later found Netscape. That same year, the National Science Foundation (NSF) set up a company to handle the registration of domain names.
In 1993, there were 600 websites, a figure that exceeded 15,000 by 1995. Today, the WWW has become the most popular service on the Internet.
As of 2008, there were 1.5 billion Internet users worldwide, 1.3 billion email users, 210 billion emails sent daily, 186.7 million websites and 133 million blogs.
E-commerce revenues exceeded $2,300 billion in 2017 and are expected to reach $4,500 billion in 2021.
The World Wide Web, commonly known as the Web, and sometimes as the Net, is a hypertext system running on the Internet. The Web is used to consult, via a browser, the pages accessible on websites. The image of the spiderweb comes from the hyperlinks that interconnect web pages.
Web 1.0 functioned in a strictly linear manner: a producer offered content that was displayed on a website, and Internet users consulted this site. This generation of the Web favored product-oriented sites, on which users had little influence. This period was marked by the birth of the first e-commerce sites. Proprietary programs and software were extremely costly.
Figure 1.1. Web 1.0: diffusion
Web 2.0 offered a completely new outlook. It promoted the sharing and exchange of information and data (text, videos, images and more), and saw the rise of social media, blogs, wikis and smartphones. The Web became more popular and more stimulating. Customers' opinions were constantly sought, and users developed a taste for this virtual community life. However, the proliferation of content of very uneven quality led to an overabundance of information that was difficult to verify.
Figure 1.2. Web 2.0: collaboration
Web 3.0 aimed to organize the vast mass of available information according to the context and requirements of each user: their location, preferences and so on. Websites evolved (and continue to evolve) into online applications that can automatically analyze written and pictorial data, understand, interpret and classify it, and redistribute it to other Internet users.
Figure 1.3. Web 3.0: semantic
It is very difficult to predict what the Web will become. Some believe that the future of Web 3.0 is Web 4.0, the artificial intelligence-based Web. The purpose of this Web is to immerse people in an environment that is consistently remarkable (strong and robust). Nova Spivack, founder of Radar Networks, defines Web 4.0 as "the ability to work with tools online only". Similarly, Joël de Rosnay, consultant to the president of the Cité des sciences et de l'industrie (a science museum whose name means "City of Science and Industry") at La Villette, Paris, indicates that this version of the Web is synonymous with cloud computing.
NOTE.- Wikipedia defines cloud computing as a concept referring to the use of the memory and computing power of computers and servers spread all over the world and interconnected by a worldwide network: the Internet.
EXAMPLE.-
Clothing is now available that incorporates electronic chips with biosensors to collect information about the body. This information can then be sent, via wireless networks such as WiFi, to computers connected to the Internet. It can be used, for example, to identify the wearer of the clothes in the event of an accident: the individual can be located and helped using the information relayed between the different technologies in use (biosensor, WiFi network, Internet and satellites).
Figure 1.4. Connected textiles
Some refrigerators can automatically detect missing ingredients: with a single click, and without the need to open the door, the refrigerator gives you a list of the missing items as well as the nearest supermarkets that sell them.
Figure 1.5. LCD screen of a Samsung smart refrigerator