Telecommunication
Telecommunication is the transmission of information over significant distances in order to communicate. In earlier times, telecommunications involved the use of visual signals, such as beacons, smoke signals, semaphore telegraphs, signal flags, and optical heliographs, or audio messages conveyed by coded drumbeats, lung-blown horns, or loud whistles. In the modern age of electricity and electronics, telecommunications also includes the use of electrical devices such as telegraphs, telephones, and teleprinters, the use of radio and microwave communications, fiber optics and their associated electronics, as well as orbiting satellites and the Internet.
A revolution in wireless telecommunications began in the first decade of the 20th century with pioneering developments in wireless radio communications by Nikola Tesla and Guglielmo Marconi. Marconi won the Nobel Prize in Physics in 1909 for his efforts. Other highly notable pioneering inventors and developers in the field of electrical and electronic telecommunications include Charles Wheatstone and Samuel Morse (telegraph), Alexander Graham Bell (telephone), Edwin Armstrong, and Lee de Forest (radio), as well as John Logie Baird and Philo Farnsworth (television).
The world's effective capacity to exchange information through two-way telecommunication networks grew from 281 petabytes of (optimally compressed) information in 1986, to 471 petabytes in 1993, to 2.2 (optimally compressed) exabytes in 2000, and to 65 (optimally compressed) exabytes in 2007.[1] This is the informational equivalent of 2 newspaper pages per person per day in 1986, and 6 entire newspapers per person per day by 2007.[2] Given this growth, telecommunications play an increasingly important role in the world economy, and the worldwide telecommunication industry's revenue was estimated to be $3.85 trillion in 2008.[3] The service revenue of the global telecommunications industry was estimated to be $1.7 trillion in 2008, and is expected to reach $2.7 trillion by 2013.[3]
History
For more details on this topic, see History of telecommunication.
Ancient systems
Main articles: Hydraulic telegraph and Beacon
Greek hydraulic semaphore systems were used as early as the 4th century BC. The hydraulic semaphores, which worked with water filled vessels and visual signals, functioned as optical telegraphs. However, they could only utilize a very limited range of pre-determined messages, and as with all such optical telegraphs could only be deployed during good visibility conditions.[4]
During the Middle Ages, chains of beacons were commonly used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message such as "the enemy has been sighted" had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London that signaled the arrival of the Spanish warships.[5]
Systems since the Middle Ages
Main article: Semaphore line
In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system (or semaphore line) between Lille and Paris.[6] However, semaphore systems suffered from the need for skilled operators and expensive towers at intervals of 10–30 kilometers (6–20 mi). As a result of competition from the electrical telegraph, Europe's last commercial semaphore line, in Sweden, was abandoned in 1880.[7]
The telegraph and telephone
The first commercial electrical telegraph was constructed by Sir Charles Wheatstone and Sir William Fothergill Cooke, and its use began on April 9, 1839. Both Wheatstone and Cooke viewed their device as "an improvement to the [already-existing, so-called] electromagnetic telegraph" not as a new device.[8]
The businessman Samuel F.B. Morse and the physicist Joseph Henry of the United States developed their own, simpler version of the electrical telegraph, independently. Morse successfully demonstrated this system on September 2, 1837. Morse's most important technical contribution to this telegraph was the rather simple and highly efficient Morse Code, which was an important advance over Wheatstone's complicated and significantly more expensive telegraph system. The communications efficiency of the Morse Code anticipated that of the Huffman code in digital communications by over 100 years, but Morse and his associate Alfred Vail developed the code purely empirically, unlike Huffman, who gave a detailed theoretical explanation of how his method worked.
The first permanent transatlantic telegraph cable was successfully completed on 27 July 1866, allowing transatlantic electrical communication for the first time.[9] An earlier transatlantic cable had operated for a few months in 1859, and among other things, it carried messages of greeting back and forth between President James Buchanan of the United States and Queen Victoria of the United Kingdom.
However, that transatlantic cable failed soon afterwards, and the project to lay a replacement line was delayed for five years by the American Civil War. Also, these transatlantic cables would have been completely incapable of carrying telephone calls even had the telephone already been invented. The first transatlantic telephone cable (which incorporated hundreds of electronic amplifiers) was not operational until 1956.[10]
The conventional telephone now in use worldwide was first patented by Alexander Graham Bell in March 1876.[11] That first patent by Bell was the master patent of the telephone, from which all other patents for electric telephone devices and features flowed. Credit for the invention of the electric telephone has been frequently disputed, and new controversies over the issue have arisen from time to time. As with other great inventions such as radio, television, the light bulb, and the digital computer, there were several inventors who did pioneering experimental work on voice transmission over a wire and then improved on each other's ideas. However, the key innovators were Alexander Graham Bell and Gardiner Greene Hubbard, who created the first telephone company, the Bell Telephone Company in the United States, which later evolved into American Telephone & Telegraph (AT&T).
The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic in the cities of New Haven, Connecticut, and London, England.[12][13]
Radio and television
Main articles: History of radio and History of television
In 1832, James Lindsay gave a classroom demonstration of wireless telegraphy via conductive water to his students. By 1854, he was able to demonstrate a transmission across the Firth of Tay from Dundee, Scotland, to Woodhaven, a distance of about two miles (3 km), again using water as the transmission medium.[14] In December 1901, Guglielmo Marconi established wireless communication between St. John's, Newfoundland and Poldhu, Cornwall (England), earning him the Nobel Prize in Physics for 1909, which he shared with Karl Braun.[15] However, small-scale radio communication had already been demonstrated in 1893 by Nikola Tesla in a presentation before the National Electric Light Association.[16]
On March 25, 1925, John Logie Baird of Scotland was able to demonstrate the transmission of moving pictures at the Selfridge's department store in London, England. Baird's system relied upon the fast-rotating Nipkow disk, and thus it became known as the mechanical television. It formed the basis of experimental broadcasts done by the British Broadcasting Corporation beginning September 30, 1929.[17] However, for most of the 20th century, television systems were designed around the cathode ray tube, invented by Karl Braun. The first version of such an electronic television to show promise was produced by Philo Farnsworth of the United States, and it was demonstrated to his family in Idaho on September 7, 1927.[18]
Computer networks and the Internet
Main articles: Computer networking and History of the Internet
On 11 September 1940, George Stibitz was able to transmit problems using a teleprinter to his Complex Number Calculator in New York and receive the computed results back at Dartmouth College in New Hampshire.[19] This configuration of a centralized computer or mainframe computer with remote "dumb terminals" remained popular throughout the 1950s and into the 1960s. However, it was not until the 1960s that researchers started to investigate packet switching — a technology that allows chunks of data to be sent between different computers without first passing through a centralized mainframe. A four-node network emerged on December 5, 1969. This network soon became the ARPANET, which by 1981 would consist of 213 nodes.[20]
ARPANET's development centred around the Request for Comment process and on 7 April 1969, RFC 1 was published. This process is important because ARPANET would eventually merge with other networks to form the Internet, and many of the communication protocols that the Internet relies upon today were specified through the Request for Comment process. In September 1981, RFC 791 introduced the Internet Protocol version 4 (IPv4) and RFC 793 introduced the Transmission Control Protocol (TCP) — thus creating the TCP/IP protocol suite that much of the Internet relies upon today.
However, not all important developments were made through the Request for Comment process. Two popular link protocols for local area networks (LANs) also appeared in the 1970s. A patent for the token ring protocol was filed by Olof Soderblom on October 29, 1974, and a paper on the Ethernet protocol was published by Robert Metcalfe and David Boggs in the July 1976 issue of Communications of the ACM.[21][22] The Ethernet protocol had been inspired by the ALOHAnet protocol which had been developed by electrical engineering researchers at the University of Hawaii.
Key concepts
A number of key concepts recur throughout the literature on modern telecommunication systems. Some of these concepts are discussed below.
Etymology
The word telecommunication was adapted from the French word télécommunication. It is a compound of the Greek prefix tele- (τηλε-), meaning "far off", and the Latin communicare, meaning "to share".[23] The French word télécommunication was coined in 1904 by the French engineer and novelist Édouard Estaunié.[24]
Basic elements
A basic telecommunication system consists of three primary units that are always present in some form (a minimal end-to-end sketch follows the list):
- A transmitter that takes information and converts it to a signal.
- A transmission medium, also called the "physical channel" that carries the signal. An example of this is the "free space channel".
- A receiver that takes the signal from the channel and converts it back into usable information.
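These three elements can be pictured with a toy end-to-end model in Python. This is a sketch of the idea only, not of any real system: the "transmitter" turns text into bits, the "channel" occasionally corrupts a bit to stand in for noise, and the "receiver" turns whatever arrives back into usable information.

```python
import random

def transmitter(message: str) -> list[int]:
    """Convert information (text) into a signal (here, a list of bits)."""
    return [int(b) for ch in message.encode("utf-8") for b in f"{ch:08b}"]

def channel(bits: list[int], flip_probability: float = 0.001) -> list[int]:
    """A crude physical channel: occasionally corrupts a bit to stand in for noise."""
    return [bit ^ 1 if random.random() < flip_probability else bit for bit in bits]

def receiver(bits: list[int]) -> str:
    """Convert the received signal back into usable information."""
    data = bytes(int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8", errors="replace")

if __name__ == "__main__":
    sent = "the enemy has been sighted"
    print(receiver(channel(transmitter(sent))))
```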
Sometimes, telecommunication systems are "duplex" (two-way systems) with a single box of electronics working as both a transmitter and a receiver, or a transceiver. For example, a cellular telephone is a transceiver.[25] The transmission electronics and the receiver electronics in a transceiver are actually quite independent of each other. This can be readily explained by the fact that radio transmitters contain power amplifiers that operate with electrical powers measured in the watts or kilowatts, but radio receivers deal with radio powers that are measured in the microwatts or nanowatts. Hence, transceivers have to be carefully designed and built to isolate their high-power circuitry and their low-power circuitry from each other.
Telecommunication over telephone lines is called point-to-point communication because it is between one transmitter and one receiver. Telecommunication through radio broadcasts is called broadcast communication because it is between one powerful transmitter and numerous low-power but sensitive radio receivers.[25]
Telecommunications in which multiple transmitters and multiple receivers have been designed to cooperate and to share the same physical channel are called multiplex systems.
Analog versus digital communications
Communications signals can be sent either as analog signals or as digital signals, and accordingly there are analog communication systems and digital communication systems. In an analog signal, the signal is varied continuously with respect to the information. In a digital signal, the information is encoded as a set of discrete values (for example, a set of ones and zeros). During propagation and reception, the information contained in analog signals will inevitably be degraded by undesirable physical noise. (The output of a transmitter is noise-free for all practical purposes.) Commonly, the noise in a communication system can be expressed as adding to or subtracting from the desirable signal in a completely random way. This form of noise is called "additive noise", with the understanding that the noise can be negative or positive at different instants of time. Noise that is not additive is much more difficult to describe or analyze, and these other kinds of noise will be omitted here.
On the other hand, unless the additive noise disturbance exceeds a certain threshold, the information contained in digital signals will remain intact. Their resistance to noise represents a key advantage of digital signals over analog signals.[26]
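A small numerical sketch of this point, assuming simple additive random noise: the analog sample is permanently shifted by whatever noise was added, while the binary levels can be restored exactly as long as the disturbance stays below the decision threshold. The voltage levels and noise spread are arbitrary illustrative choices.

```python
import random

random.seed(1)  # fixed seed so the example is repeatable

def add_noise(level: float, spread: float) -> float:
    """Additive noise: a random positive or negative offset."""
    return level + random.uniform(-spread, spread)

# Analog: the received value *is* the information, so any noise degrades it for good.
analog_sent = 0.634
analog_received = add_noise(analog_sent, spread=0.05)
print(f"analog:  sent {analog_sent:.3f}, received {analog_received:.3f}")

# Digital: 0 is sent as 0.0 V and 1 as 1.0 V; the receiver decides against a
# 0.5 V threshold, so any disturbance smaller than 0.5 V leaves the data intact.
digital_sent = [0, 1, 1, 0]
received_levels = [add_noise(float(bit), spread=0.2) for bit in digital_sent]
digital_received = [1 if level > 0.5 else 0 for level in received_levels]
print("digital: sent", digital_sent, "received", digital_received)
```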
Telecommunication networks
Main article: Telecommunications network
A communications network is a collection of transmitters, receivers, and communications channels that send messages to one another. Some digital communications networks contain one or more routers that work together to transmit information to the correct user. An analog communications network consists of one or more switches that establish a connection between two or more users. For both types of network, repeaters may be necessary to amplify or recreate the signal when it is being transmitted over long distances. This is to combat attenuation that can render the signal indistinguishable from the noise.[27]
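A rough back-of-the-envelope illustration of why repeaters are needed over long distances. The figures are assumptions made up for the example (a loss of roughly 0.2 dB/km is often quoted for modern single-mode fibre, and the 30 dB budget and 6,000 km route are arbitrary); they are not the specification of any real link.

```python
import math

# Assumed, illustrative figures; not the parameters of any real system.
attenuation_db_per_km = 0.2    # loss of the medium per kilometre
power_budget_db = 30.0         # how much total loss the receiver can tolerate
link_length_km = 6_000         # e.g. a transoceanic route

max_unamplified_span_km = power_budget_db / attenuation_db_per_km
repeaters_needed = max(0, math.ceil(link_length_km / max_unamplified_span_km) - 1)

print(f"maximum span without a repeater: {max_unamplified_span_km:.0f} km")
print(f"repeaters needed over {link_length_km} km: {repeaters_needed}")
```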
The term "channel" has two different meanings. In one meaning, a channel is the physical medium that carries a signal between the transmitter and the receiver. Examples of this include the atmosphere for sound communications, glass optical fibers for some kinds of optical communications, coaxial cables for communications by way of the voltages and electric currents in them, and free space for communications using visible light, infrared waves, ultraviolet light, and radio waves. This last channel is called the "free space channel". The sending of radio waves from one place to another has nothing to do with the presence or absence of an atmosphere between the two. Radio waves travel through a perfect vacuum just as easily as they travel through air, fog, clouds, or any other kind of gas besides air.The other meaning of the term "channel" in telecommunications is seen in the phrase communications channel, which is a subdivision of a transmission medium so that it can be used to send multiple streams of information simultaneously. For example, one radio station can broadcast radio waves into free space at frequencies in the neighborhood of 94.5 MHz (megahertz) while another radio station can simultaneously broadcast radio waves at frequencies in the neighborhood of 96.1 MHz. Each radio station would transmit radio waves over a frequency bandwidth of about 180 kHz (kilohertz), centered at frequencies such as the above, which are called the "carrier frequencies". Each station in this example is separated from its adjacent stations by 200 kHz, and the difference between 200 kHz and 180 kHz (20 kHz) is an engineering allowance for the imperfections in the communication system.
In the example above, the "free space channel" has been divided into communications channels according to frequencies, and each channel is assigned a separate frequency bandwidth in which to broadcast radio waves. This system of dividing the medium into channels according to frequency is called "frequency-division multiplexing" (FDM).
Another way of dividing a communications medium into channels is to allocate each sender a recurring segment of time (a "time slot", for example, 20 milliseconds out of each second), and to allow each sender to send messages only within its own time slot. This method of dividing the medium into communication channels is called "time-division multiplexing" (TDM), and is used in optical fiber communication.[27][28] Some radio communication systems use TDM within an allocated FDM channel. Hence, these systems use a hybrid of TDM and FDM.
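The bookkeeping behind the two schemes can be sketched in a few lines. The FDM half reuses the numbers from the radio example above (carriers spaced 200 kHz apart, with made-up station names); the TDM half hands out recurring 20 ms slots round-robin. This is only an illustration of how channels are allocated, not a signal-level simulation.

```python
# FDM: each sender gets its own carrier frequency (here spaced 200 kHz apart).
def fdm_plan(first_carrier_mhz: float, spacing_khz: float, senders: list[str]) -> dict[str, float]:
    return {name: first_carrier_mhz + i * spacing_khz / 1000.0
            for i, name in enumerate(senders)}

# TDM: every sender shares one channel but transmits only in its own recurring time slot.
def tdm_schedule(senders: list[str], slot_ms: int, total_ms: int) -> list[tuple[int, str]]:
    return [(t, senders[(t // slot_ms) % len(senders)])
            for t in range(0, total_ms, slot_ms)]

stations = ["station A", "station B", "station C"]
print("FDM carriers (MHz):", fdm_plan(94.5, 200, stations))
print("TDM slots (start ms, owner):", tdm_schedule(stations, slot_ms=20, total_ms=120))
```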
Modulation
The shaping of a signal to convey information is known as modulation. Modulation can be used to represent a digital message as an analog waveform. This is commonly called "keying" – a term derived from the older use of Morse Code in telecommunications – and several keying techniques exist (these include phase-shift keying, frequency-shift keying, and amplitude-shift keying). The "Bluetooth" system, for example, uses phase-shift keying to exchange information between various devices.[29][30] In addition, there are combinations of phase-shift keying and amplitude-shift keying, called (in the jargon of the field) "quadrature amplitude modulation" (QAM), that are used in high-capacity digital radio communication systems.
Modulation can also be used to transmit the information of low-frequency analog signals at higher frequencies. This is helpful because low-frequency analog signals cannot be effectively transmitted over free space. Hence the information from a low-frequency analog signal must be impressed onto a higher-frequency signal (known as the "carrier wave") before transmission. There are several different modulation schemes available to achieve this, two of the most basic being amplitude modulation (AM) and frequency modulation (FM). An example of this process is a disc jockey's voice being impressed onto a 96 MHz carrier wave using frequency modulation (the voice would then be received on a radio as the channel "96 FM").[31] In addition, modulation has the advantage of being able to use frequency-division multiplexing (FDM).
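A minimal numerical sketch of two of the schemes just described: binary phase-shift keying (each bit sets the carrier's phase) and amplitude modulation (a low-frequency tone shapes the carrier's envelope). The sample rate, carrier frequency and tone frequency are arbitrary illustrative choices, kept far lower than real radio frequencies so the arrays stay small.

```python
import numpy as np

fs = 100_000                      # samples per second (illustrative choice)
t = np.arange(0, 0.002, 1 / fs)   # 2 ms of signal
carrier_hz = 10_000               # illustrative carrier frequency

# Phase-shift keying: each bit selects the carrier phase (0 or 180 degrees).
bits = np.array([1, 0, 1, 1])
samples_per_bit = len(t) // len(bits)
phase = np.repeat(np.where(bits == 1, 0.0, np.pi), samples_per_bit)
psk = np.cos(2 * np.pi * carrier_hz * t[: len(phase)] + phase)

# Amplitude modulation: a 1 kHz "voice" tone varies the carrier's amplitude.
baseband = 0.5 * np.sin(2 * np.pi * 1_000 * t)
am = (1.0 + baseband) * np.cos(2 * np.pi * carrier_hz * t)

print("first PSK samples:", np.round(psk[:4], 3))
print("first AM samples: ", np.round(am[:4], 3))
```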
Society and telecommunication
Telecommunication has a significant social, cultural, and economic impact on modern society. In 2008, estimates placed the telecommunication industry's revenue at $3.85 trillion, or just under 3 percent of the gross world product (official exchange rate).[3] The following sections discuss the impact of telecommunication on society.
Economic impact
Microeconomics
On the microeconomic scale, companies have used telecommunications to help build global business empires. This is self-evident in the case of online retailer Amazon.com but, according to academic Edward Lenert, even the conventional retailer Wal-Mart has benefited from better telecommunication infrastructure compared to its competitors.[32] In cities throughout the world, home owners use their telephones to organize many home services ranging from pizza deliveries to electricians. Even relatively poor communities have been noted to use telecommunication to their advantage. In Bangladesh's Narshingdi district, isolated villagers use cellular phones to speak directly to wholesalers and arrange a better price for their goods. In Côte d'Ivoire, coffee growers share mobile phones to follow hourly variations in coffee prices and sell at the best price.[33]
Macroeconomics
On the macroeconomic scale, Lars-Hendrik Röller and Leonard Waverman suggested a causal link between good telecommunication infrastructure and economic growth.[34] Few dispute the existence of a correlation although some argue it is wrong to view the relationship as causal.[35]
Because of the economic benefits of good telecommunication infrastructure, there is increasing worry about the inequitable access to telecommunication services amongst various countries of the world—this is known as the digital divide. A 2003 survey by the International Telecommunication Union (ITU) revealed that roughly a third of countries have fewer than one mobile subscription for every 20 people and one-third of countries have fewer than one land-line telephone subscription for every 20 people. In terms of Internet access, roughly half of all countries have fewer than one out of 20 people with Internet access. From this information, as well as educational data, the ITU was able to compile an index that measures the overall ability of citizens to access and use information and communication technologies.[36] Using this measure, Sweden, Denmark and Iceland received the highest ranking while the African countries Nigeria, Burkina Faso and Mali received the lowest.[37]
Social impact
Telecommunication has played a significant role in social relationships. Nevertheless, devices like the telephone were originally advertised with an emphasis on the practical dimensions of the device (such as the ability to conduct business or order home services), as opposed to the social dimensions. It was not until the late 1920s and 1930s that the social dimensions of the device became a prominent theme in telephone advertisements. New promotions started appealing to consumers' emotions, stressing the importance of social conversations and staying connected to family and friends.[38]
Since then the role that telecommunications has played in social relations has become increasingly important. In recent years, the popularity of social networking sites has increased dramatically. These sites allow users to communicate with each other as well as post photographs, events and profiles for others to see. The profiles can list a person's age, interests, sexual preference and relationship status. In this way, these sites can play an important role in everything from organising social engagements to courtship.[39]
Prior to social networking sites, technologies like short message service (SMS) and the telephone also had a significant impact on social interactions. In 2000, the market research group Ipsos MORI reported that 81% of 15- to 24-year-old SMS users in the United Kingdom had used the service to coordinate social arrangements and 42% had used it to flirt.[40]
Other impacts
In cultural terms, telecommunication has increased the public's ability to access music and film. With television, people can watch films they have not seen before in their own home without having to travel to the video store or cinema. With radio and the Internet, people can listen to music they have not heard before without having to travel to the music store.
Telecommunication has also transformed the way people receive their news. A survey by the non-profit Pew Internet and American Life Project found that when just over 3,000 people living in the United States were asked where they got their news "yesterday", more people said television or radio than newspapers. The results are summarised in the following table (the percentages add up to more than 100% because people were able to specify more than one source).[41]
| Local TV | National TV | Radio | Local paper | Internet | National paper |
|---|---|---|---|---|---|
| 59% | 47% | 44% | 38% | 23% | 12% |
Telecommunication has had an equally significant impact on advertising. TNS Media Intelligence reported that in 2007, 58% of advertising expenditure in the United States was spent on media that depend upon telecommunication.[42] The results are summarised in the following table.
| | Internet | Radio | Cable TV | Syndicated TV | Spot TV | Network TV | Newspaper | Magazine | Outdoor | Total |
|---|---|---|---|---|---|---|---|---|---|---|
| Percent | 7.6% | 7.2% | 12.1% | 2.8% | 11.3% | 17.1% | 18.9% | 20.4% | 2.7% | 100% |
| Dollars | $11.31 billion | $10.69 billion | $18.02 billion | $4.17 billion | $16.82 billion | $25.42 billion | $28.22 billion | $30.33 billion | $4.02 billion | $149 billion |
Telecommunication and government
Many countries have enacted legislation which conforms to the International Telecommunication Regulations established by the International Telecommunication Union (ITU), which is the "leading UN agency for information and communication technology issues."[43] In 1947, at the Atlantic City Conference, the ITU decided to "afford international protection to all frequencies registered in a new international frequency list and used in conformity with the Radio Regulation." According to the ITU's Radio Regulations adopted in Atlantic City, all frequencies referenced in the International Frequency Registration Board, examined by the board and registered on the International Frequency List "shall have the right to international protection from harmful interference."[44]
From a global perspective, there have been political debates and legislation regarding the management of telecommunication and broadcasting. The history of broadcasting discusses some of the debates over balancing conventional communication such as printing against telecommunication such as radio broadcasting.[45] The onset of World War II brought on the first explosion of international broadcasting propaganda.[45] Countries, their governments, insurgents, terrorists, and militiamen have all used telecommunication and broadcasting techniques to promote propaganda.[45][46] Patriotic propaganda for political movements and colonization started in the mid-1930s. In 1936, the BBC broadcast propaganda to the Arab World, partly to counter similar broadcasts from Italy, which also had colonial interests in North Africa.[45]
Modern insurgents, such as those in the latest Iraq war, often use intimidating telephone calls, SMSs and the distribution of sophisticated videos of an attack on coalition troops within hours of the operation. "The Sunni insurgents even have their own television station, Al-Zawraa, which while banned by the Iraqi government, still broadcasts from Erbil, Iraqi Kurdistan, even as coalition pressure has forced it to switch satellite hosts several times." [46]
Modern telecommunication
For more details on this topic, see Outline of telecommunication.
Telephone
Main article: Telephone
In an analog telephone network, the caller is connected to the person he wants to talk to by switches at various telephone exchanges. The switches form an electrical connection between the two users and the setting of these switches is determined electronically when the caller dials the number. Once the connection is made, the caller's voice is transformed to an electrical signal using a small microphone in the caller's handset. This electrical signal is then sent through the network to the user at the other end where it is transformed back into sound by a small speaker in that person's handset. There is a separate electrical connection that works in reverse, allowing the users to converse.[47][48]
The fixed-line telephones in most residential homes are analog — that is, the speaker's voice directly determines the signal's voltage. Although short-distance calls may be handled from end-to-end as analog signals, increasingly telephone service providers are transparently converting the signals to digital for transmission before converting them back to analog for reception. The advantage of this is that digitized voice data can travel side-by-side with data from the Internet and can be perfectly reproduced in long distance communication (as opposed to analog signals that are inevitably impacted by noise).
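The conversion to digital described above can be sketched as sampling plus quantisation. Classic digital telephony samples the voice 8,000 times per second and uses 8-bit companded encoding (e.g. the G.711 standard); the sketch below uses plain linear 8-bit quantisation purely to keep the idea visible, so its numbers are illustrative rather than standard-conformant.

```python
import math

SAMPLE_RATE = 8_000   # samples per second, as used in classic digital telephony
LEVELS = 256          # 8-bit quantisation (linear here; real systems use companding)

def digitise(analog_samples):
    """Map each voltage in [-1.0, 1.0] to one of 256 discrete levels."""
    return [min(LEVELS - 1, int((v + 1.0) / 2.0 * LEVELS)) for v in analog_samples]

def reconstruct(codes):
    """Map the discrete levels back to voltages for the far-end handset."""
    return [(c + 0.5) / LEVELS * 2.0 - 1.0 for c in codes]

# A 400 Hz test tone standing in for the caller's voice.
voice = [0.8 * math.sin(2 * math.pi * 400 * n / SAMPLE_RATE) for n in range(40)]
restored = reconstruct(digitise(voice))
worst_error = max(abs(a - b) for a, b in zip(voice, restored))
print(f"worst reconstruction error: {worst_error:.4f} (quantisation step is {2 / LEVELS:.4f})")
```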
Mobile phones have had a significant impact on telephone networks. Mobile phone subscriptions now outnumber fixed-line subscriptions in many markets. Sales of mobile phones in 2005 totalled 816.6 million with that figure being almost equally shared amongst the markets of Asia/Pacific (204 m), Western Europe (164 m), CEMEA (Central Europe, the Middle East and Africa) (153.5 m), North America (148 m) and Latin America (102 m).[49] In terms of new subscriptions over the five years from 1999, Africa has outpaced other markets with 58.2% growth.[50] Increasingly these phones are being serviced by systems where the voice content is transmitted digitally, such as GSM or W-CDMA, with many markets choosing to deprecate analog systems such as AMPS.[51]
There have also been dramatic changes in telephone communication behind the scenes. Starting with the operation of TAT-8 in 1988, the 1990s saw the widespread adoption of systems based on optic fibres. The benefit of communicating with optic fibers is that they offer a drastic increase in data capacity. TAT-8 itself was able to carry 10 times as many telephone calls as the last copper cable laid at that time and today's optic fibre cables are able to carry 25 times as many telephone calls as TAT-8.[52] This increase in data capacity is due to several factors: First, optic fibres are physically much smaller than competing technologies. Second, they do not suffer from crosstalk which means several hundred of them can be easily bundled together in a single cable.[53] Lastly, improvements in multiplexing have led to an exponential growth in the data capacity of a single fibre.[54][55]
Assisting communication across many modern optic fibre networks is a protocol known as Asynchronous Transfer Mode (ATM). The ATM protocol allows for the side-by-side data transmission mentioned above. It is suitable for public telephone networks because it establishes a pathway for data through the network and associates a traffic contract with that pathway. The traffic contract is essentially an agreement between the client and the network about how the network is to handle the data; if the network cannot meet the conditions of the traffic contract, it does not accept the connection. This is important because telephone calls can negotiate a contract so as to guarantee themselves a constant bit rate, something that ensures a caller's voice is not delayed in parts or cut off completely.[56] There are competitors to ATM, such as Multiprotocol Label Switching (MPLS), that perform a similar task and are expected to supplant ATM in the future.[57]
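As a rough illustration of how a traffic contract drives admission control, the sketch below models a link that refuses a new constant-bit-rate connection when the requested rate can no longer be guaranteed; the class and field names are invented for illustration and are not taken from the ATM specifications.

```python
from dataclasses import dataclass

@dataclass
class TrafficContract:
    peak_rate_kbps: int      # rate the client asks the network to guarantee

class Link:
    def __init__(self, capacity_kbps: int):
        self.capacity_kbps = capacity_kbps
        self.reserved_kbps = 0

    def admit(self, contract: TrafficContract) -> bool:
        """Accept the connection only if the contract can still be honoured."""
        if self.reserved_kbps + contract.peak_rate_kbps > self.capacity_kbps:
            return False                      # cannot meet the contract: refuse the call
        self.reserved_kbps += contract.peak_rate_kbps
        return True

link = Link(capacity_kbps=1000)
print(link.admit(TrafficContract(peak_rate_kbps=64)))    # True: a voice call fits
print(link.admit(TrafficContract(peak_rate_kbps=2000)))  # False: connection rejected
```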
Radio and television
In a broadcast system, the central high-powered broadcast tower transmits a high-frequency electromagnetic wave to numerous low-powered receivers. The high-frequency wave sent by the tower is modulated with a signal containing visual or audio information. The receiver is then tuned so as to pick up the high-frequency wave, and a demodulator is used to retrieve the signal containing the visual or audio information. The broadcast signal can be either analog (the signal is varied continuously with respect to the information) or digital (the information is encoded as a set of discrete values).[25][58]
The broadcast media industry is at a critical turning point in its development, with many countries moving from analog to digital broadcasts. This move is made possible by the production of cheaper, faster and more capable integrated circuits. The chief advantage of digital broadcasts is that they avoid a number of problems common to traditional analog broadcasts. For television, this includes the elimination of problems such as snowy pictures, ghosting and other distortion. These occur because of the nature of analog transmission, which means that perturbations due to noise will be evident in the final output. Digital transmission overcomes this problem because digital signals are reduced to discrete values upon reception, and hence small perturbations do not affect the final output. In a simplified example, if a binary message 1011 was transmitted with signal amplitudes [1.0 0.0 1.0 1.0] and received with signal amplitudes [0.9 0.2 1.1 0.9], it would still decode to the binary message 1011, a perfect reproduction of what was sent. From this example, a problem with digital transmissions can also be seen: if the noise is great enough, it can significantly alter the decoded message. Using forward error correction a receiver can correct a handful of bit errors in the resulting message, but too much noise will lead to incomprehensible output and hence a breakdown of the transmission.[59][60]
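The amplitude example above can be reproduced in a few lines of code: each received amplitude is thresholded back to the nearest binary value, so moderate noise vanishes while larger noise flips bits; the 0.5 threshold is an assumption for this simple binary scheme.

```python
def decode(received_amplitudes, threshold=0.5):
    """Map each noisy amplitude back to the nearest binary value."""
    return [1 if amplitude >= threshold else 0 for amplitude in received_amplitudes]

sent     = [1.0, 0.0, 1.0, 1.0]        # the binary message 1011
received = [0.9, 0.2, 1.1, 0.9]        # after moderate channel noise
print(decode(received))                # [1, 0, 1, 1] -- a perfect reproduction

# With severe noise the decoded message is corrupted:
print(decode([0.4, 0.2, 1.1, 0.9]))    # [0, 0, 1, 1] -- the first bit has flipped
```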
In digital television broadcasting, there are three competing standards that are likely to be adopted worldwide. These are the ATSC, DVB and ISDB standards; their adoption varies from country to country. All three standards use MPEG-2 for video compression. ATSC uses Dolby Digital AC-3 for audio compression, ISDB uses Advanced Audio Coding (MPEG-2 Part 7) and DVB has no standard for audio compression but typically uses MPEG-1 Part 3 Layer 2.[61][62] The choice of modulation also varies between the schemes. In digital audio broadcasting, standards are much more unified, with practically all countries choosing to adopt the Digital Audio Broadcasting standard (also known as the Eureka 147 standard). The exception is the United States, which has chosen to adopt HD Radio. HD Radio, unlike Eureka 147, is based upon a transmission method known as in-band on-channel transmission that allows digital information to "piggyback" on normal AM or FM analog transmissions.[63]
However, despite the pending switch to digital, analog television is still transmitted in most countries. An exception is the United States, which ended analog television transmission (by all but the very low-power TV stations) on 12 June 2009[64] after twice delaying the switchover deadline. For analog television, there are three standards in use for broadcasting color TV. These are known as PAL (British designed), NTSC (North American designed), and SECAM (French designed). These standards govern how the color information is transmitted and are distinct from the black-and-white television standards, which also vary from country to country. For analog radio, the switch to digital radio is made more difficult by the fact that analog receivers are sold at a small fraction of the price of digital receivers.[65][66] The choice of modulation for analog radio is typically between amplitude modulation (AM) and frequency modulation (FM). To achieve stereo playback, FM broadcasts use an amplitude-modulated subcarrier.
The Internet
Main article: Internet
The Internet is a worldwide network of computers and computer networks that can communicate with each other using the Internet Protocol.[67] Any computer on the Internet has a unique IP address that can be used by other computers to route information to it. Hence, any computer on the Internet can send a message to any other computer using its IP address. These messages carry with them the originating computer's IP address, allowing for two-way communication. The Internet is thus an exchange of messages between computers.[68]
It is estimated that 51% of the information flowing through two-way telecommunications networks in the year 2000 was flowing through the Internet (most of the rest, 42%, through the landline telephone). By 2007 the Internet clearly dominated, carrying 97% of all the information in telecommunication networks (most of the rest, 2%, through mobile phones).[1] As of 2008, an estimated 21.9% of the world population had access to the Internet, with the highest access rates (measured as a percentage of the population) in North America (73.6%), Oceania/Australia (59.5%) and Europe (48.1%).[69] In terms of broadband access, Iceland (26.7%), South Korea (25.4%) and the Netherlands (25.3%) led the world.[70]
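As a minimal sketch of the idea that any computer can send a message to any other computer by addressing its IP address, the fragment below sends a single UDP datagram; the address 192.0.2.10 comes from the documentation-only TEST-NET range and is not a real host.

```python
import socket

# A single datagram addressed to another machine's IP address and port.
# 192.0.2.10 is a placeholder from the TEST-NET-1 documentation range.
destination = ("192.0.2.10", 9999)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"hello from my IP address", destination)
# The packet's IP header carries the sender's own address, which is what
# allows the receiving computer to send a reply back the other way.
sock.close()
```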
The Internet works in part because of protocols that govern how the computers and routers communicate with each other. The nature of computer network communication lends itself to a layered approach in which individual protocols in the protocol stack run more or less independently of the others. This allows lower-level protocols to be customized for the network situation without changing the way higher-level protocols operate. A practical example of why this is important is that it allows a web browser to run the same code regardless of whether the computer it is running on is connected to the Internet through an Ethernet or Wi-Fi connection. Protocols are often discussed in terms of their place in the OSI reference model, which emerged in 1983 as the first step in an unsuccessful attempt to build a universally adopted networking protocol suite.[71]
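A purely conceptual sketch of layering: each layer wraps whatever the layer above hands it with its own header and otherwise ignores the contents, so the link layer can be swapped (Ethernet for Wi-Fi, say) without touching the layers above. The header strings are invented labels and bear no relation to real protocol formats.

```python
def application_layer(message: str) -> bytes:
    return b"HTTP|" + message.encode()

def transport_layer(segment_payload: bytes) -> bytes:
    return b"TCP|" + segment_payload        # would add ports, sequencing, etc.

def network_layer(packet_payload: bytes) -> bytes:
    return b"IP|" + packet_payload          # would add source/destination addresses

def link_layer(frame_payload: bytes) -> bytes:
    return b"ETH|" + frame_payload          # could equally be Wi-Fi framing

# Swapping the link layer changes nothing in the layers above it.
frame = link_layer(network_layer(transport_layer(application_layer("GET /"))))
print(frame)   # b'ETH|IP|TCP|HTTP|GET /'
```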
For the Internet, the physical medium and data link protocol can vary several times as packets traverse the globe. This is because the Internet places no constraints on what physical medium or data link protocol is used. This leads to the adoption of media and protocols that best suit the local network situation. In practice, most intercontinental communication will use the Asynchronous Transfer Mode (ATM) protocol (or a modern equivalent) on top of optic fibre. This is because for most intercontinental communication the Internet shares the same infrastructure as the public switched telephone network.
At the network layer, things become standardized, with the Internet Protocol (IP) being adopted for logical addressing. For the World Wide Web, these "IP addresses" are derived from the human-readable form using the Domain Name System (e.g. 72.14.207.99 is derived from www.google.com). At the moment, the most widely used version of the Internet Protocol is version four, but a move to version six is imminent.[72]
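The name-to-address lookup can be sketched with the standard library resolver; note that the address printed today will generally differ from the 72.14.207.99 example quoted above, since Google's addresses change over time.

```python
import socket

# Ask the system resolver (and ultimately the DNS) for IPv4 addresses of a hostname.
infos = socket.getaddrinfo("www.google.com", 80,
                           family=socket.AF_INET, type=socket.SOCK_STREAM)
for *_, sockaddr in infos:
    print(sockaddr[0])     # one or more current IPv4 addresses for the name
```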
At the transport layer, most communication adopts either the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). TCP is used when it is essential that every message sent is received by the other computer, whereas UDP is used when it is merely desirable. With TCP, packets are retransmitted if they are lost and placed in order before they are presented to higher layers. With UDP, packets are not ordered or retransmitted if lost. Both TCP and UDP packets carry port numbers with them to specify what application or process the packet should be handled by.[73] Because certain application-level protocols use certain ports, network administrators can manipulate traffic to suit particular requirements, for example restricting Internet access by blocking the traffic destined for a particular port, or affecting the performance of certain applications by assigning them priority.
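The difference between the two transports is already visible when the sockets are created; the sketch below sets up one TCP listener and one UDP socket, with the loopback address and the ports 8080 and 5353 chosen purely for illustration.

```python
import socket

# TCP: a connection-oriented, reliable byte stream (lost segments are resent and reordered).
tcp_server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_server.bind(("127.0.0.1", 8080))   # the port number identifies the application
tcp_server.listen()

# UDP: individual datagrams, with no ordering and no retransmission.
udp_server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_server.bind(("127.0.0.1", 5353))

tcp_server.close()
udp_server.close()
```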
Above the transport layer, there are certain protocols that are sometimes used and loosely fit in the session and presentation layers, most notably the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. These protocols ensure that the data transferred between two parties remains confidential, and one or the other is in use when a padlock appears in the address bar of your web browser.[74] Finally, at the application layer sit many of the protocols Internet users would be familiar with, such as HTTP (web browsing), POP3 (e-mail), FTP (file transfer), IRC (Internet chat), BitTorrent (file sharing) and OSCAR (instant messaging).
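A brief sketch of placing TLS underneath an application protocol using Python's standard ssl module; example.com serves only as a placeholder host, and a real browser performs the same kind of handshake before showing the padlock.

```python
import socket
import ssl

context = ssl.create_default_context()   # verifies the server certificate by default

with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        # Everything sent from here on is encrypted by TLS before it reaches TCP.
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls_sock.recv(200))        # beginning of the HTTP response
```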
Voice over Internet Protocol (VoIP) allows data packets to be used for synchronous voice communications. The data packets are marked as voice-type packets and can be prioritised by the network administrators so that the real-time, synchronous conversation is less subject to contention with other types of data traffic that can be delayed (e.g. file transfer or email) or buffered in advance (e.g. audio and video) without detriment. That prioritisation works well when the network has sufficient capacity for all the VoIP calls taking place at the same time and is enabled for prioritisation, as in a private corporate-style network. The public Internet is not generally managed in this way, so there can be a big difference in the quality of VoIP calls over a private network and over the public Internet.[75]
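One common way of marking voice packets for preferential treatment is the DSCP field of the IP header; the sketch below sets the conventional Expedited Forwarding value on a UDP socket, assuming a platform that exposes socket.IP_TOS, and whether routers actually honour the marking depends entirely on how the network is managed. The destination address is again a documentation-range placeholder.

```python
import socket

EF_DSCP = 46                 # Expedited Forwarding, the marking commonly used for voice
TOS_VALUE = EF_DSCP << 2     # DSCP occupies the upper six bits of the former TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Voice frames sent from this socket now carry the EF marking; a private network
# configured for prioritisation can queue them ahead of bulk data traffic.
sock.sendto(b"\x00" * 160, ("192.0.2.20", 5004))   # placeholder address and port
sock.close()
```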
See also: Victorian Internet
Local Area Networks and Wide Area Networks
Main articles: Local Area Network and Wide Area Network
Despite the growth of the Internet, the characteristics of local area networks ("LANs" – computer networks that do not extend beyond a few kilometers) remain distinct. This is because networks on this scale do not require all the features associated with larger networks and are often more cost-effective and efficient without them. When they are not connected to the Internet, they also have the advantages of privacy and security, although deliberately lacking a direct connection to the Internet does not fully protect a LAN from hackers, military forces, or economic powers; such threats exist as long as there is any means of connecting to the LAN remotely.
There are also independent wide area networks ("WANs" – private computer networks that can and do extend for thousands of kilometers). Once again, their advantages include privacy and security, since attackers who cannot reach the network cannot attack it. Prime users of private LANs and WANs include armed forces and intelligence agencies that must keep their information completely secure and secret.
In the mid-1980s, several sets of communication protocols emerged to fill the gaps between the data-link layer and the application layer of the OSI reference model. These included AppleTalk, IPX, and NetBIOS, with the dominant protocol set during the early 1990s being IPX due to its popularity with MS-DOS users. TCP/IP existed at this point, but it was typically only used by large government and research facilities.[76]
As the Internet grew in popularity and a larger percentage of traffic became Internet-related, LANs and WANs gradually moved towards the TCP/IP protocols, and today networks mostly dedicated to TCP/IP traffic are common. The move to TCP/IP was helped by technologies such as DHCP that allowed TCP/IP clients to discover their own network address, a function that came standard with the AppleTalk/IPX/NetBIOS protocol sets.[77]
It is at the data-link layer, though, that most modern LANs diverge from the Internet. Whereas Asynchronous Transfer Mode (ATM) or Multiprotocol Label Switching (MPLS) are typical data-link protocols for larger networks such as WANs, Ethernet and Token Ring are typical data-link protocols for LANs. These LAN protocols differ from the former in that they are simpler (e.g. they omit features such as quality-of-service guarantees) and offer collision handling. Both of these differences allow for more economical systems.[78] Despite the modest popularity of IBM Token Ring in the 1980s and 1990s, virtually all LANs now use either wired or wireless Ethernet. At the physical layer, most wired Ethernet implementations use copper twisted-pair cables (including the common 10BASE-T networks). However, some early implementations used heavier coaxial cables and some recent implementations (especially high-speed ones) use optical fibers.[79] Where optical fibers are used, a distinction must be made between multimode fibers and single-mode fibers. Multimode fibers can be thought of as thicker optical fibers that are cheaper to manufacture devices for, but that suffer from less usable bandwidth and worse attenuation, implying poorer long-distance performance.[80]
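To make the notion of a data-link frame concrete, the sketch below packs a minimal Ethernet II header (destination MAC, source MAC, EtherType) in front of a payload; the addresses are invented, and a real frame would also carry a trailing CRC-32 checksum that is omitted here.

```python
import struct

def ethernet_frame(dst_mac: bytes, src_mac: bytes, ethertype: int, payload: bytes) -> bytes:
    """Build a minimal Ethernet II frame: 6-byte destination, 6-byte source, 2-byte type."""
    header = struct.pack("!6s6sH", dst_mac, src_mac, ethertype)
    return header + payload

dst = bytes.fromhex("ffffffffffff")      # broadcast address (illustrative)
src = bytes.fromhex("020000000001")      # locally administered, made-up address
frame = ethernet_frame(dst, src, 0x0800, b"an IP packet would go here")
print(frame.hex())
```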