Questions 14-17
Reading Passage 2 has six paragraphs, A-F.
Which paragraph contains the following information?
Write the correct letter, A-F, in boxes 14-17 on your answer sheet.
NB You may use any letter more than once.

14   The location of the first test site
15   A way of bringing the power produced on one site back into Britain
16   A reference to a previous attempt by Britain to find an alternative source of energy
17   Mention of the possibility of applying technology from another industry
Questions 18-22
Choose FIVE letters, A-J.
Write the correct letters in boxes 18-22 on your answer sheet.
Which FIVE of the following claims about tidal power are made by the writer?

A   It is a more reliable source of energy than wind power.
B   It would replace all other forms of energy in Britain.
C   Its introduction has come as a result of public pressure.
D   It would cut down on air pollution.
E   It could contribute to the closure of many existing power stations in Britain.
F   It could be a means of increasing national income.
G   It could face a lot of resistance from other fuel industries.
H   It could be sold more cheaply than any other type of fuel.
I   It could compensate for the shortage of inland sites for energy production.
J   It is best produced in the vicinity of coastlines with particular features.


    READING PASSAGE 4
    Information theory - the big idea 
Information theory lies at the heart of everything - from DVD players and the genetic code of DNA to the physics of the universe at its most fundamental. It has been central to the development of the science of communication, which enables data to be sent electronically and has therefore had a major impact on our lives.

    In April 2002 an event took place which demonstrated one of the many applications of information theory. The 
    space probe, Voyager I, launched in 1977, had sent back spectacular images of Jupiter and Saturn and then 
    soared out of the Solar System on a one-way mission to the stars. After 25 years of exposure to the freezing 
    temperatures of deep space, the probe was beginning to show its age. Sensors and circuits were on the brink of 
    failing and NASA experts realised that they had to do something or lose contact with their probe forever. The 
    solution was to get a message to Voyager I to instruct it to use spares to change the failing parts. With the 
    probe 12 billion kilometres from Earth, this was not an easy task. By means of a radio dish belonging to 
    NASA’s Deep Space Network, the message was sent out into the depths of space. Even travelling at the speed 
    of light, it took over 11 hours to reach its target, far beyond the orbit of Pluto. Yet, incredibly, the little probe 
    managed to hear the faint call from its home planet, and successfully made the switchover.
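The timing quoted here is consistent with the distance given. At light speed, the one-way signal delay works out to roughly

$$ t = \frac{d}{c} \approx \frac{12 \times 10^{9}\ \text{km}}{3.0 \times 10^{5}\ \text{km/s}} = 4.0 \times 10^{4}\ \text{s} \approx 11.1\ \text{hours}, $$

matching the 'over 11 hours' in the passage (the rounding here is ours).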

    It was the longest-distance repair job in history, and a triumph for the NASA engineers. But it also highlighted 
    the astonishing power of the techniques developed by American communications engineer Claude Shannon, 
    who had died just a year earlier. Born in 1916 in Petoskey, Michigan, Shannon showed an early talent 
    for maths and for building gadgets, and made breakthroughs in the foundations of computer technology when 
    still a student. While at Bell Laboratories, Shannon developed information theory, but shunned the resulting 
    acclaim. In the 1940s, he single-handedly created an entire science of communication which has 
    since inveigled its way into a host of applications, from DVDs to satellite communications to bar codes - any 
    area, in short, where data has to be conveyed rapidly yet accurately.

    This all seems light years away from the down-to-earth uses Shannon originally had for his work, which began 
    when he was a 22-year-old graduate engineering student at the prestigious Massachusetts Institute of 
    Technology in 1939. He set out with an apparently simple aim: to pin down the precise meaning of the 
    concept of ‘information’. The most basic form of information, Shannon argued, is whether something is true or 
    false - which can be captured in the binary unit, or ‘bit’, of the form 1 or 0. Having identified this fundamental 
    unit, Shannon set about defining otherwise vague ideas about information and how to transmit it from place to 
    place. In the process he discovered something surprising: it is always possible to guarantee information will 
    get through random interference - ‘noise’ - intact.
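As a minimal sketch of the 'bit' idea (our own Python illustration, not Shannon's notation): a single true/false answer is one bit, and k bits can distinguish between 2^k equally likely messages.

```python
import math

# One bit captures a single true/false distinction - Shannon's basic unit.
def to_bit(value: bool) -> int:
    return 1 if value else 0

# k bits distinguish 2**k equally likely messages, so n messages
# need ceil(log2(n)) bits.
def bits_needed(n_messages: int) -> int:
    return math.ceil(math.log2(n_messages))

print(to_bit(True))      # 1
print(bits_needed(2))    # 1 bit: true or false
print(bits_needed(26))   # 5 bits: one letter of the alphabet
```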

    Noise usually means unwanted sounds which interfere with genuine information. Information theory 
    generalises this idea via theorems that capture the effects of noise with mathematical precision. In particular, 
    Shannon showed that noise sets a limit on the rate at which information can pass along communication 
    channels while remaining error-free. This rate depends on the relative strengths of the signal and noise 
    travelling down the communication channel, and on its capacity (its ‘bandwidth’). The resulting limit, given in 
    units of bits per second, is the absolute maximum rate of error-free communication given signal strength and 
    noise level. The trick, Shannon showed, is to find ways of packaging up - ‘coding’ - information to cope with 
    the ravages of noise, while staying within the information-carrying capacity - ‘bandwidth’ - of the 
    communication system being used.
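The passage stops short of the formula, but the limit it describes is the Shannon-Hartley capacity, C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N the ratio of signal power to noise power. A minimal sketch in Python; the channel numbers below are illustrative, not from the passage.

```python
import math

def channel_capacity(bandwidth_hz: float, signal_power: float,
                     noise_power: float) -> float:
    """Shannon-Hartley limit: maximum error-free rate in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Illustrative numbers: a 3 kHz telephone-grade channel with a
# signal-to-noise power ratio of 1000 (30 dB).
print(channel_capacity(3000, 1000, 1))   # ~29,900 bits per second
```

Notice that capacity grows only logarithmically with signal strength: shouting louder down a noisy channel buys surprisingly little, which is why clever coding matters more than raw power.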

    Over the years scientists have devised many such coding methods, and they have proved crucial in many 
    technological feats. The Voyager spacecraft transmitted data using codes which added one extra bit for every 
    single bit of information; the result was an error rate of just one bit in 10,000 - and stunningly clear pictures 
    of the planets. Other codes have become part of everyday life - such as the Universal Product Code, or bar 
    code, which uses a simple error-detecting system that ensures supermarket check-out lasers can read the price 
even on, say, a crumpled bag of crisps. As recently as 1993, engineers made a major breakthrough by discovering so-called turbo codes - which come very close to Shannon’s ultimate limit for the maximum rate that data can be transmitted reliably, and now play a key role in the mobile videophone revolution.
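The bar code’s 'simple error-detecting system' is its published check digit. A sketch of the standard UPC-A rule (the example codes below are ours): digits in odd positions are weighted by 3, and the total including the check digit must be a multiple of 10, so any single misread digit breaks the check.

```python
def upc_is_valid(code: str) -> bool:
    """Verify a 12-digit UPC-A bar code using its check digit."""
    if len(code) != 12 or not code.isdigit():
        return False
    # Weight odd positions (index 0, 2, ...) by 3, even positions by 1;
    # a valid code's weighted sum is divisible by 10.
    total = sum(int(d) * (3 if i % 2 == 0 else 1) for i, d in enumerate(code))
    return total % 10 == 0

print(upc_is_valid("036000291452"))  # True: check digit consistent
print(upc_is_valid("036000291453"))  # False: last digit corrupted
```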

    Shannon also laid the foundations of more efficient ways of storing information, by stripping out superfluous 
    (‘redundant’) bits from data which contributed little real information. As mobile phone text messages like ‘I 
    CN C U’ show, it is often possible to leave out a lot of data without losing much meaning. As with error 
    correction, however, there’s a limit beyond which messages become too ambiguous. Shannon showed how to 
    calculate this limit, opening the way to the design of compression methods that cram maximum information 
    into the minimum space.
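The limit Shannon calculated is the entropy of the message source, H = -Σ p_i log2(p_i) bits per symbol; no lossless compression scheme can do better on average. A minimal sketch, using the passage’s own text-message example:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text: str) -> float:
    """Shannon entropy of the text's character frequencies: the average
    number of bits per character below which no lossless code can go."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Even the passage's abbreviated message has structure: roughly
# 1.78 bits per character, far below the 8 bits of plain ASCII storage.
print(round(entropy_bits_per_symbol("I CN C U"), 2))
```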
