This is yet another in the "Dummies" series of books. The series presents information, in a friendly manner, on just about every topic imaginable. The idea is to take a fresh look at a topic and present it in a way that even dummies could understand. These books are not really for dummies (creating web pages using HTML is not a task for dummies). The title is just a bit of tongue-in-cheek humor suggesting that these topics can at least be presented in a friendly and easier-to-read format (each new section in the book starts with a job-related cartoon). The book begins with an introduction to, in the authors' words, "the wild, wacky, and wonderful possibilities inherent in the World Wide Web." After a not-so-brief internet history lesson, they begin the analysis of how the internet works, describing what is behind the scenes of web pages (they call it "under the hood"). They provide an overview of clients and servers, responses and requests, and the roles of front and back ends. That leads to a discussion of serving up web resources which, of course, leads directly to the role and function of HTML. The HyperText Markup Language (HTML) is described as the way web servers and clients talk to each other. While most books on HTML start with the basics, this book starts with a focus on what the viewer of a web page sees. It takes several chapters to begin the introduction of HTML syntax. While some more experienced programmers might get impatient with this approach, it is an easy-to-digest method of providing the necessary background in web function for the less experienced. The review of the history and structure of the internet makes for interesting reading, no matter what your level of expertise.
HTML is a collection of markup codes that must be recognized by the user's browser. But not every new tag is determined by the standards committee: some special tags are recognized only by certain browsers. And, as the authors point out, the number and kinds of HTML tags continue to grow with each new iteration of the markup language. The book describes all of the currently recognized HTML tags and provides an overview of which tags are supported by which browsers. The CD-ROM that comes with this book contains an excellent collection of ready-made web page designs, along with other useful information. This book provides a good introduction to the internet and how it works. It also provides a very thorough introduction to HTML and does a good job of describing how HTML codes are used. However, when you are ready to begin creating your own web pages, a detailed reference to HTML will also be necessary. We would recommend this book to beginners, with the support of a detailed HTML reference such as Special Edition Using HTML 4, Sixth Edition by Molly E. Holzschlag or Platinum Edition Using HTML 4, XML, and Java by Eric Ladd and Jim O'Donnell.
1. Introduction
Wireless services in India started a little over 100 years ago with the commissioning of a single wireless telegraph link in Calcutta. After many years of slow growth, wireless has in recent years become both an economical and convenient (e.g., mobile) means to deliver voice and data services. Wireless networks in India have expanded dramatically in the past six years to reach over 700 million mobile wireless phone subscribers (March 2011). This makes India the second-largest country by number of wireless subscribers, behind only China. A large population, low wireline telephony penetration, falling tariffs, and rising income levels have made India the fastest-growing wireless nation. Despite the success of its wireless services sector, India largely relies on imported technology for its networks. This article has four sections. In the first, we outline the evolution of global wireless technology and networks and discuss the emerging trends. Next, we trace the history of wireless networks in India, policy evolution, and current trends in services. In the third section we survey wireless R&D and manufacturing in India. Finally, in the fourth section, we conclude with suggestions for reviving the telecom equipment industry.
G. Marconi, the name most associated with mass-market wireless services, demonstrated the feasibility of wireless telegraphy, progressing from small home experiments in 1895 to an experimental transatlantic wireless link in 1901. Although Marconi has received much credit for the commercialization of wireless technology, the underlying scientific discoveries go back at least 50 years earlier. The most prominent of these pioneers were J.C. Bose in India, O. Lodge in the UK, E. Branly in Paris, and N. Tesla in the US. In 1894, Prof. J.C. Bose demonstrated the first millimeter wave radio transmission, using gunpowder to create a burst of radio energy that rang a bell a few feet away. Following Marconi's commercial success, technology improvements came rapidly in the early part of the twentieth century. Until 1980, wireless communications remained a technology used in defense and police services and did not reach mass markets. The roots of today's pervasive mobile wireless technology go back to D.H. Ring and W.R. Young, both at Bell Laboratories, who proposed the fundamentals of cellular frequency reuse in 1947. Another key concept, handover, a technique for transferring calls as the user moves across cells, was proposed by A. Joel in 1970, also at Bell Laboratories. Soon, work began on a full-fledged mobile telephony system involving the key principles of cellular frequency reuse, handover, and multiple access, with technology demonstrations in 1973 in the US. In the eleven decades since those early beginnings in the nineteenth century, wireless technology has transformed our world, from satellite radio and High Definition Television (HDTV) broadcasting, to mobile telephony, and now the emerging mobile broadband. We have over four billion wireless mobile phone users in a total world population of 6.5 billion, and it is in mobile services that wireless has made the greatest impact on our society.
Mobile broadband is still in its early stages and will usher in a new era of wireless services that deliver rich multimedia internet services to a variety of devices, including handheld products. A number of core technologies have underpinned the growth and success of mobile wireless. Some examples are multiple access, multiple-input and multiple-output (MIMO), and adaptive modulation and coding. We discuss these briefly. The origins of multiple access date back to Marconi's proposal of 'tuned circuits', or equivalently, Frequency Division Multiplexing (FDM). This allows multiple links to be established from a single transmitter base to a receiver, with each link using a distinct frequency channel. FDM strictly refers to each link terminating at a different, geographically dispersed user. The connection from the transmitter base to the dispersed receivers is referred to as the downlink. When these dispersed users transmit back to the base, again with distinct frequency channels, it is called the uplink, and this access mode is referred to as Frequency Division Multiple Access (FDMA). In a cell with a base station and its multiple dispersed users, the FDM downlink and FDMA uplink together are referred to as FDM/FDMA. Two other multiple access technologies have been developed: Time Division (TDM/TDMA) and Code Division (CDM/CDMA). TDM/TDMA uses time slots instead of frequencies to distinguish users. In CDM/CDMA, the users are separated by unique spreading codes. Henceforth, we will use CDMA to refer to CDM/CDMA and likewise for TDMA. There was a vigorous debate in the mid-1990s over CDMA versus TDMA and their relative spectral efficiency - roughly the amount of throughput per cell for a fixed amount of spectrum. The key to such efficiency lay in the clever manner in which the challenges of fading, interference, and handover were dealt with.
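The code-separation idea behind CDMA can be sketched in a few lines. The following is a toy illustration only: two users share the channel simultaneously, separated by short orthogonal Walsh codes (real systems use far longer codes and must handle noise, asynchrony, and power control).

```python
# Toy CDMA illustration: two users transmit at the same time on the same
# channel, separated only by orthogonal spreading codes.
code_a = [1, 1, 1, 1]    # Walsh code for user A
code_b = [1, -1, 1, -1]  # Walsh code for user B (orthogonal to code_a)

def spread(bit, code):
    # Map bit {0,1} to {-1,+1} and multiply by the spreading code (chips)
    s = 1 if bit else -1
    return [s * c for c in code]

def despread(signal, code):
    # Correlate the received signal with the user's own code;
    # the sign of the correlation recovers that user's bit.
    corr = sum(x * c for x, c in zip(signal, code))
    return 1 if corr > 0 else 0

# Both users transmit simultaneously; the channel simply adds the chips.
tx_a = spread(1, code_a)
tx_b = spread(0, code_b)
channel = [a + b for a, b in zip(tx_a, tx_b)]

print(despread(channel, code_a))  # user A's bit: 1
print(despread(channel, code_b))  # user B's bit: 0
```

Because the codes are orthogonal, each user's correlation cancels the other user's contribution exactly, which is the essence of code-division separation.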
Although both TDMA and CDMA found equally effective ways to deal with these challenges, there were practical reasons favoring CDMA as the more convenient approach. The arguments for and against CDMA and TDMA were finely balanced, and both technologies became well established, with TDMA-based GSM remaining the dominant technology even today. The 3G standard adopted a 5 MHz channel to support higher bit rates and adopted a CDMA approach, as channel equalization in TDMA became computationally complex at such bandwidths. CDMA used Rake receivers and was simpler to implement. In the late 1990s, as the demand for broadband data further increased, the channel bandwidth had to be increased to 10 or 20 MHz. At these bandwidths CDMA also became inefficient, owing to the large number of multipaths that became resolvable and the resulting high inter-path interference. A new approach called Orthogonal Frequency Division Multiple Access (OFDMA), using a large number of narrow, mutually orthogonal sub-channels, emerged as the preferred access technique for 4G broadband systems. In summary, 1G used FDMA, 2G used both TDMA and CDMA, 3G used CDMA, and 4G adopted OFDMA. Multiple Input Multiple Output (MIMO), another key technology, goes back to the work of Stanford University researchers in 1973, who proposed the use of multiple antennas to transmit and receive and to implement a new concept called spatial multiplexing.
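The orthogonality of OFDMA sub-channels can be checked numerically. This is a minimal sketch (the symbol length N = 64 is an assumed, illustrative value): over one symbol period, two distinct subcarriers correlate to zero, while a subcarrier correlated with itself gives unity, which is why narrow, overlapping sub-channels can carry independent data.

```python
# Sketch of OFDM subcarrier orthogonality over one symbol of N samples.
import cmath

N = 64  # samples per OFDM symbol (illustrative)

def subcarrier(k):
    # One symbol's worth of the complex exponential at subcarrier index k
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def correlate(x, y):
    # Normalized inner product <x, y> over the symbol period
    return sum(a * b.conjugate() for a, b in zip(x, y)) / N

same = correlate(subcarrier(3), subcarrier(3))       # magnitude ~ 1.0
different = correlate(subcarrier(3), subcarrier(7))  # magnitude ~ 0.0
print(abs(same), abs(different))
```

The zero cross-correlation holds exactly (up to floating-point error) whenever the subcarrier spacing is an integer multiple of 1/N, which is the design rule OFDM systems enforce.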
Spatial multiplexing works when the spatial signatures at the receiving antenna array, induced by the different transmit antenna streams, are quasi-orthogonal and hence separable. MIMO spatial multiplexing multiplies the effective channel bandwidth by the number of antenna pairs and has stirred enormous interest. The multiple transmit and receive antennas used in MIMO can also be used for link diversity. Although many of the theoretical fundamentals of MIMO were developed by Bell Laboratories researchers, the first commercial system to adopt MIMO and OFDMA was developed by Iospan Communication Inc. in the US during the late 1990s, and this technology eventually became the basis of 4G wireless standards. WiMAX and 3GPP Long Term Evolution (LTE) have both adopted MIMO-OFDMA, as has the WiFi IEEE 802.11n standard. MIMO spatial multiplexing has been extended to a multi-user format, wherein a base station transmits dedicated streams to different users. On the downlink, multi-user MIMO requires channel state knowledge to orthogonalize transmissions to different users. Other key technologies that have contributed to wireless performance improvements include: (a) Turbo and LDPC channel coding, which have enabled links to operate close to Shannon capacity. (b) Hybrid Automatic Repeat reQuest (HARQ), a physical-layer ARQ scheme that outperforms regular ARQ. (c) Adaptive modulation and coding (AMC), which allows the modulation and coding rate to be chosen to suit the channel SNR. (d) Opportunistic Scheduling (OS), which assigns the most favorable (highest SNR) frequency or time slot to a user, as against random channel assignment. OS can improve spectral efficiency by about 20% in practical networks.
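The separability idea behind spatial multiplexing can be shown with a toy 2x2 example. This is a sketch under strong simplifying assumptions (an illustrative, well-conditioned channel matrix and a noiseless link; real receivers must also estimate the channel and cope with noise): two symbol streams are sent at once, and a zero-forcing receiver recovers them by inverting the channel matrix.

```python
# Toy 2x2 MIMO spatial multiplexing with a zero-forcing receiver.
H = [[0.9, 0.2],
     [0.3, 1.1]]   # illustrative 2x2 channel matrix (quasi-orthogonal columns)
x = [1.0, -1.0]    # two independent symbol streams, sent simultaneously

# Received signal: y = H x (each receive antenna sees a mix of both streams)
y = [H[0][0] * x[0] + H[0][1] * x[1],
     H[1][0] * x[0] + H[1][1] * x[1]]

# Zero-forcing: x_hat = H^{-1} y, using the explicit 2x2 inverse
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
x_hat = [( H[1][1] * y[0] - H[0][1] * y[1]) / det,
         (-H[1][0] * y[0] + H[0][0] * y[1]) / det]

print(x_hat)  # recovers [1.0, -1.0] up to floating-point error
```

The recovery works because the channel matrix is invertible, i.e. the spatial signatures of the two streams are separable; if the columns of H were nearly parallel, the inverse would amplify noise, which is why real systems prefer MMSE or maximum-likelihood detection.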
ITI: Indian Telephone Industries (ITI) was established in 1948, in Bangalore. ITI was the first Public Sector Unit (PSU) in independent India and is indicative of the importance that Mr. Nehru, the Indian Prime Minister then, gave to the development of an indigenous telecom equipment capability. Till the 1990s, ITI manufactured large Strowger and Crossbar exchanges, small local exchanges, and telephone equipment under licensed agreements with western companies. A notable exception was switches based on Centre for Development of Telematics (CDOT) technology (see a little later in the text). Later, ITI entered into a number of joint ventures with US, European, and Chinese companies for a diverse range of transmission products. Since the late 1990s, ITI has run at a loss and was declared a sick unit in 2003. The Indian Government has tried to revive ITI with large cash grants. Attempts to sell or merge ITI have so far not succeeded and its future remains uncertain. Many other central and state PSUs also built telecom equipment such as VSAT terminals and digital STM radios, again mostly based on licensed technology from abroad. They too have faced severe difficulties in recent years and have suffered declines. CDOT: This was a Government of India (GOI) funded R&D unit formed in 1984 under the Chairmanship of Mr. Sam Pitroda, a visionary telecom leader who returned to India after a successful career in the US. CDOT successfully developed and transferred technology for Rural Exchange (RAX) and Private Automatic Branch Exchange (PABX) switches to a number of small/medium manufacturers from 1988 onwards. The MAX switch (up to 50,000-line capacity) was successfully developed by 1995, and the technology was transferred to ITI. Over 1000 MAX switches based on CDOT technology have been installed and have been the mainstay of the DOT voice network. CDOT's success in switch design remains a singular achievement in indigenous telecom technology.
However, CDOT in recent years has not been able to sustain the success of its RAX and MAX developments of the early 1990s. Its inability to develop a mobile switch removed a major opportunity to participate in the exploding mobile wireless market. Private sector: From the 1950s to the 1980s, efforts by the private sector to design and manufacture telecom equipment were severely limited by the restrictive licensing policy of the Indian Government, which was focused on protecting the public sector. Early companies in this segment included ARM (now ICOMM Tele), Himachal Futuristic Communications, and BPL (Telecom). Due to licensing restrictions, these companies never developed the market scale to build a credible R&D capability to compete against the giant Telecom Multi-National Companies (T-MNCs). A particularly promising company is Midas Communications Ltd., which developed an innovative wireless local loop product, known as CorDECT, based on Digital Enhanced Cordless Technology (DECT). Midas has successfully marketed CorDECT to over six countries and has so far sold over two million lines. CorDECT's share of the Indian market remains limited, however, and the company is trying to re-enter the market with GSM technology. VNL has also built low-power, off-grid, solar-powered GSM base stations that show great promise in emerging countries. Shyam Telecom, Terracom, Coral, Pointred, and Matrix are other examples in the private sector with some local capability, but they are no match for the T-MNCs. Engineering Service Companies: India has built a vibrant engineering services industry which carries out mostly software and some hardware development for MNC clients. Wipro, Sasken, Infosys, Mindtree, and others have a sizable business in telecom engineering services. Their revenue from telecom-related engineering services is estimated to be $6 billion in 2009.
However, none of these companies has emerged as a significant telecom equipment provider for the Indian market, which continues to be dominated by T-MNCs. A number of factors played a role in the eclipse of the local telecom equipment industry. First, barriers to entry, including import duties on foreign-manufactured equipment, were rapidly reduced after the National Telecom Policy (NTP)-99. This allowed T-MNCs to successfully outbid ITI with long-term vendor financing, which proved irresistible to DOT. Next, the private sector began to take an increasing share of telecom services (current share about 65%), and this sector opted for the most advanced global technology to remain competitive in quality of service and network economics. Finally, there was no determined effort by the Government to help the local industry, which was forced out of product development and manufacturing. Some of these companies have moved to a trading model, wherein telecom equipment is imported in a fully or partially assembled form, and only the final assembly is undertaken before supply to the end customers. There is usually only minor value addition, if any. Clearly, revival of an indigenous telecom sector has to receive high priority.
5.2 Foreign Companies
T-MNCs: With the growth of the Indian mobile market, Nokia, Motorola, LG, and Samsung have set up cell phone assembly plants to take advantage of tax incentives. The value addition in these plants is estimated to be less than 7% of the end product price. Ericsson and Alcatel-Lucent also do some assembly and test of wireless infrastructure equipment in India. Alcatel-Lucent entered into a joint venture with CDOT for the development of WiMAX products for the Indian market. The net local value addition remains focused on the final assembly and test areas, and hence remains small. No value addition in core technologies such as semiconductor design or manufacturing is undertaken by T-MNCs in India.
The Government has not mandated value addition in core technologies, as is currently happening in China. Venture-Funded Small Companies: In recent years, venture-funded companies in the US have built significant engineering back ends in India. Two notable companies are Beceem Communications and Tejas Networks. Beceem commands 65% of the WiMAX semiconductor market worldwide and has an 80% market share in US 4G semiconductors. Beceem was acquired by Broadcom in November 2010. Tejas is an optical switch company that is almost 100% India-based and has done well in the mid and lower segments of SDH switches, in both the Indian and emerging markets.
6. Concluding Remarks
Wireless services in India have seen dramatic growth in recent years through the expansion of mobile voice networks. With over 700 million users, the cell phone has become the symbol of India's growth and dynamism, and has proven to be a truly transformative technology. The cell phone has put in the hands of even the weakest sections of Indian society a technological marvel that allows them to connect with anyone, anywhere in the country, or indeed the world. Clearly, the next revolution of broadband wireless is now close at hand, and may prove to have an even greater impact on India's economic growth and productivity. The dark cloud in all this good news is the inability of Indian technology companies, with very few exceptions, to participate in the huge opportunities in equipment design and manufacturing offered by India's massive telecom expansion. Telecom technology is very R&D intensive, and it takes large and high-risk investments to build each of the different pieces of this great industry. It will need concerted policy support to help create a globally competitive Indian telecom equipment industry.
Why Analog Computation?
Unclassified
An introduction to analog computation containing a brief description of the analog computer and problems in which it can be advantageously applied. Both analog computers and systems combining analog and digital techniques are discussed in order to show why the Agency's interest in this computation area has increased.
Why analog computation? With the interest in analog computing equipment rapidly increasing in our digitally oriented Agency, this is a question many of us must ask. The preponderance of digital computing equipment in this Agency would preclude analog computation from consideration if the two types of computers performed the same operations equally well; but this is not the case. A comparison of digital and analog computer applications reveals a basic difference in their operation. The digital computer performs numerical operations on discrete signals; in contrast, the analog computer performs algebraic and integro-differential operations upon continuous signals. Therefore certain operations, which are difficult to program on a digital computer, are available inherently on the analog machine. In order to appreciate where an analog computer can be advantageously applied, one must become more familiar with what it is and how it is used. Before discussing problem areas in which the analog computer possesses an advantage, let us briefly consider the fundamentals of its operation. The heart of the computer is the high-gain D.C. amplifier (either vacuum tube or transistor) that, when properly connected with passive components, forms the basic operational element. The schematic representation for an operational amplifier is shown in Fig. 1. If the passive components in both feedback and input arms are entirely resistive, the circuit of Fig. 1 adds the applied voltages in proportion to the ratios of the individual resistors; while if the feedback impedance is capacitive, the circuit integrates the sum of the applied voltages. The schematic diagrams for an amplifier used as a summer (it is called an inverter if it has only one input)
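The summing and integrating behaviors described for Fig. 1 correspond to the standard inverting op-amp relations. These are quoted from general circuit theory, not taken from the original figure: with feedback resistor R_f and input resistors R_1 through R_n, the circuit sums; with a feedback capacitor C, it integrates.

```latex
% Inverting summer (resistive feedback R_f, inputs V_1..V_n through R_1..R_n):
V_{\text{out}} = -\left( \frac{R_f}{R_1} V_1 + \frac{R_f}{R_2} V_2 + \cdots + \frac{R_f}{R_n} V_n \right)

% Integrator (capacitive feedback C):
V_{\text{out}} = -\frac{1}{C} \int \left( \frac{V_1}{R_1} + \frac{V_2}{R_2} + \cdots + \frac{V_n}{R_n} \right) dt
```

The minus sign in both relations reflects the inverting configuration; with a single input and R_f = R_1, the summer reduces to the inverter mentioned above.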
In an article in TJ, young earth creation science theorist Jason Lisle comments on a scientific report that there is a ripple pattern in the clustering of galaxies. He claims that this is a blow to the Big Bang model of creation.1 (The article was also featured as the daily feature on the Creation Ministries International website on 30 August 2006).
This article is a little strange for an article in Technical Journal, as it is not very "technical." Lisle gives some general information about galaxies, and then proceeds to describe the scientists' method of interpretation. He claims that we both have the same data (typical of young earth claims), but that the interpretation of that data is different. This is true. Whereas secular scientists have interpreted the data based solely upon science, Lisle has based his interpretation not on the facts, but upon his preconceived notion that the earth is young, based on his faulty biblical methods. His main point in this article is that the secular scientist "assumes that the big bang is true." This assumption is based upon other scientific facts which confirm the Big Bang. The young earth creation propaganda machine has long said that the Big Bang is in trouble, when in reality, it has only been getting stronger and stronger as a theory.
Of course, we can say the same thing about Lisle. Whereas he claims the scientists "assume" the Big Bang is true, we can say that he "assumes" that the universe is young. He cannot prove this through science, and can only issue weak arguments against the Big Bang, arguments which contain no science. Another one of his main points is this...
The big bang, however, has been refuted on the basis of both Scripture and good science. For example, the big bang is not compatible with the order, timescale and cause of the events of creation as recorded in Genesis. Really, the big bang is a secular opponent of the biblical framework.
Nothing could be further from the truth. Scripture does not indicate clearly that the earth is young. And, after many years of young earth claims, we are still waiting for their "good science" that refutes the Big Bang. The Big Bang is compatible with Genesis, and millions of Christians believe in the Big Bang and the Bible. There is no problem accepting a literal, inerrant interpretation of Genesis, and the Big Bang. Finally, Lisle mentions that the stars and galaxies were created on Day 4 of the creation week. Within the old earth interpretation, the stars, sun, and moon became plainly visible on this day, but were previously existing. One must keep in mind that the creation account is written from the perspective of a person standing on the surface of the earth. From his point of view, they were first observed this day.