Sunday, November 30, 2008

Sustaining the Growth of the Internet

Widespread claims of Internet traffic doubling every three or four months are exaggerated. Actual U.S. backbone traffic appears to be doubling about once a year. "Traffic doubling each year" here covers any growth rate between 70% and 150% per year; the imprecision stems from incomplete statistics. There was a slowdown in growth, but it occurred in 1997. Ever since, growth has been steady and rapid, although not as astronomical as popular mythology holds. Even if the problems related to high-speed fibre networks are solved, there appears to be a limit on how fast traffic is likely to grow, imposed by the many other feedback loops operating on different time scales.
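As a quick sanity check on that 70-150% band, the doubling time implied by a constant annual growth rate r is ln(2)/ln(1+r). A minimal Python sketch (the function name is mine, for illustration only):

import math

def doubling_time_years(annual_growth: float) -> float:
    """Years for traffic to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

# 70% to 150% per year corresponds to doubling roughly every
# 1.31 down to 0.76 years -- i.e. "about once a year".
for rate in (0.70, 1.00, 1.50):
    print(f"{rate:.0%} per year -> doubles every {doubling_time_years(rate):.2f} years")

By contrast, the mythical "doubling every three or four months" would imply eight- to sixteen-fold growth per year.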

In a world-first model of Internet power consumption, University of Melbourne researchers have identified the major contributors to the Internet's power consumption as the take-up of broadband services grows in the coming years.

"It has now become clear that the exponential growth of the Internet is not sustainable, "said Dr Hinton.The result indicates that, even with the improvements in energy efficiency of electronics, the power consumption of the Internet will increase from 0.5% of today's national electricity consumption to 1% by around 2020.

"The growth of the Internet, IT broadband telecommunications have opened up a wide range of new products and services. New home services include Video on Demand, web based real-time gaming, social networking, peer-to-peer networking and more. For the business community, new services may include video conferencing, outsourcing and tele-working. To support these new high-bandwidth services, the capacity of the Internet will need to be significantly increased. If Internet capacity is increased, the energy consumption, and consequently the carbon footprint of the Internet will also increase", Dr Hinton quoted.

Reference:
http://www.sciencedaily.com/releases/2008/11/081125113116.htm

Saturday, July 5, 2008

NFC - The new mobile mantra?

Near Field Communication, or NFC, is a short-range, high-frequency wireless communication technology that enables the exchange of data between devices over a distance of about 10 centimetres (around 4 inches). The technology is a simple extension of the ISO 14443 proximity-card standard (contactless card, RFID) that combines the interface of a smartcard and a reader into a single device.

What is NFC?
Near Field Communication (NFC) is a short-range wireless connectivity technology standard designed for intuitive, simple and safe communication between electronic devices. NFC communication is enabled by bringing two NFC compatible devices within a few centimeters of one another. Applications of NFC technology include contactless transactions such as payment and transit ticketing, simple and fast data transfers including calendar synchronization or electronic business cards and access to online digital content.

A different world!
NFC makes life easier - it's easier to get information, easier to pay for goods and services, easier to use public transport, and easier to share data between devices. You simply bring NFC-compatible devices close to one another, typically less than four centimetres apart. Thanks to NFC technology, we will be able to "pick up" information from our environment. NFC technology allows mobile devices to "read" information stored in "tags" on everyday objects. These can be affixed to physical objects such as posters, bus stop signs, street signs, medicines, certificates, food packaging and much more. You will know where to find the tag by looking for the NFC Forum "Target Mark" on the object.

Specifications
Near Field Communication is based on inductive coupling, where loosely coupled inductive circuits share power and data over a distance of a few centimetres. NFC devices share their basic technology with proximity (13.56 MHz) RFID tags and contactless smartcards, but add a number of key new features. Like ISO 14443, NFC communicates via magnetic field induction, where two loop antennas are located within each other's near field, effectively forming an air-core transformer. It operates within the globally available and unlicensed ISM radio frequency band at 13.56 MHz, with a bandwidth of almost 2 MHz.
* Working distance with compact standard antennas: up to 20 cm
* Supported data rates: 106, 212, or 424 kbit/s
* In reader/writer mode, the NFC device can read NFC Forum-mandated tag types, for example an NFC Smart Poster tag. On the RF interface, reader/writer mode is compliant with the ISO 14443 and FeliCa schemes.
* In Peer-to-Peer mode, two NFC devices can exchange data. For example, they can share Bluetooth or Wi-Fi link set-up parameters, or exchange data such as virtual business cards or digital photos. Peer-to-Peer mode is standardized in ISO/IEC 18092.
* In Card Emulation mode, the NFC device itself acts as an NFC tag, appearing to an external reader much the same as a traditional contactless smart card. This enables contactless payments and e-ticketing, for example.
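To see why centimetre-range operation at 13.56 MHz is genuinely "near field" - inductive coupling rather than radiated waves, as the air-core-transformer description above suggests - compare the working distance with the wavelength. A back-of-the-envelope sketch; the lambda/2*pi boundary is the usual textbook rule of thumb, not anything defined by the NFC specifications:

import math

C = 299_792_458   # speed of light, m/s
F = 13.56e6       # NFC carrier frequency, Hz

wavelength = C / F                                # ~22.1 m
near_field_boundary = wavelength / (2 * math.pi)  # ~3.5 m

print(f"wavelength: {wavelength:.1f} m")
print(f"near-field boundary (lambda/2*pi): {near_field_boundary:.2f} m")
# A 4-20 cm working distance is tiny compared with ~3.5 m, so the two loop
# antennas couple magnetically, like a loosely coupled air-core transformer.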

Coding and data rates
* NFC employs two different coding schemes to transfer data. If an active device transfers data at 106 kbit/s, modified Miller coding with 100% modulation is used. In all other cases, Manchester coding is used with a modulation ratio of 10% (see the sketch after this list).
* NFC devices are able to receive and transmit data at the same time. Thus, they can check the radio frequency field and detect a collision if the received signal does not match with the transmitted signal.
* NFC data rates are measured in kilobits per second (kbit/s). The NFC standard supports varying data rates to ensure interoperability with pre-existing infrastructure. The current data rates are 106, 212 and 424 kbit/s.
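The coding rule in the first bullet is easy to capture in code. A minimal sketch of the selection logic as described above (the function name and return format are mine, purely for illustration):

def select_coding(device_is_active: bool, rate_kbps: int) -> tuple:
    """Return (line coding, modulation) per the NFC rule described above."""
    if rate_kbps not in (106, 212, 424):
        raise ValueError("NFC supports 106, 212 or 424 kbit/s")
    if device_is_active and rate_kbps == 106:
        return ("modified Miller", "100% modulation")
    return ("Manchester", "10% modulation ratio")

print(select_coding(device_is_active=True, rate_kbps=106))   # modified Miller, 100%
print(select_coding(device_is_active=False, rate_kbps=212))  # Manchester, 10%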

Advantages
Acting as a secure gateway to the connected world, tomorrow’s NFC-enabled mobile devices will allow consumers to store and access all kinds of personal data – at home or on the move. Simply bringing two NFC-enabled devices close together causes them to initiate network communications automatically, without requiring the user to configure the set-up. NFC-enhanced consumer devices can easily exchange and store your personal data – messages, pictures, MP3 files and so on. Delivering ease of use, instant intuitive connectivity, zero configuration and smart key access, NFC meets the needs of today’s connected consumer and creates opportunities for new mobile services.

Tuesday, April 22, 2008

International Telecommunication Union Approves WiMAX Technology as New IMT-2000 Standard

WiMAX Technology Inclusion to Expand Operators' Global Access

Portland, October 19, 2007 - The WiMAX Forum is pleased to recognize the decision of the Radiocommunication Sector of the International Telecommunication Union (ITU-R) to include WiMAX technology in the IMT-2000 set of standards. This decision is of global importance to operators, who look to the ITU to endorse technologies before they invest in new infrastructure. The decision to approve the WiMAX Forum's version of IEEE Standard 802.16 as an IMT-2000 technology significantly expands opportunities for global deployment, especially within the 2.5-2.69 GHz band, to deliver Mobile Internet services that satisfy both rural and urban market demand.

"This is a very special and unique milestone for WiMAX technology," said Ron Resnick, president of the WiMAX Forum. "This is the first time that a new air interface has been added to the IMT-2000 set of standards since the original technologies were selected nearly a decade ago. WiMAX technology currently has the potential to reach 2.7 billion people. And today's announcement expands the reach to a significantly larger global population."

From the initial application made at the ITU-R WP 8F meeting in January of this year to this week's meeting of the Radiocommunication Assembly in Geneva, administrations, industry and the ITU have worked together to achieve this groundbreaking decision.

"It gives me great satisfaction to observe that the ITU Radiocommunication Sector continues to be responsive to the most pressing needs of the wireless industry," said Valery Timofeev, Director of the ITU Radiocommunication Bureau.

With WiMAX technology approved as a new IMT-2000 specification, the WiMAX ecosystem will benefit from greater economies of scale, thus reducing the already low cost of delivering broadband wireless services, including VoIP as well as the multiple services expected from wireless broadband Internet access.

Originally created to harmonize 3G mobile systems and to increase opportunities for worldwide interoperability, the IMT-2000 family of standards will now support four different access technologies: OFDMA (which includes WiMAX), FDMA, TDMA and CDMA.

"3G solutions based upon technologies such as W-CDMA, CDMA-2000, and TD-SCDMA technologies were already included in the IMT-2000 set of standards," said Resnick. "With WiMAX technology now included, it places us on equal footing with the legacy-based technologies ITU-R already endorses." The bottom line is that operators across the globe now have the freedom to select the right technology to best meet their business and regional needs."

Source: WiMAX Forum

Wednesday, March 12, 2008

Mobile WiMAX Deployment Alternatives

Traditionally, cellular deployments were based solely on achieving ubiquitous coverage, with little consideration for capacity requirements. Since the only service offered was voice and the market was uncertain, this was a very reasonable approach. Moreover, voice is a low-data-rate application (~10-15 kbps depending on the type of vocoder), enabling traditional cellular networks to achieve wide outdoor and indoor coverage with a low-data-rate network. As the customer base grew and more services were offered, additional base stations were deployed and/or channels were added to existing base stations to meet the growing capacity requirements. With Mobile WiMAX, however, operators will want to offer a wide range of broadband services with Quality-of-Service (QoS) support. To meet customer expectations for these types of services, it will be necessary to predetermine capacity requirements and deploy accordingly at the outset. Careful deployment planning in anticipation of growing customer demand will ensure a quality user experience when the network is at its busiest.

Determining Capacity Requirements
Arriving at an accurate estimate of the capacity requirements for new broadband services is not a simple exercise. One must anticipate how users will make use of the new services being offered and how often they will be actively engaged with the network. Data density, expressed in Mbps per km², is a convenient metric for describing capacity requirements. Determining the required data density for a specific demographic region is a multi-step process. The expected market penetration, or take-up rate, at maturity depends on a number of factors, including the competitive situation and the services that distinguish one service provider from another. The service provider's penetration may also vary within a metropolitan area, since urban and dense urban residents will often have other broadband access alternatives to choose from, as compared to residents in suburban and rural areas.
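As a rough illustration of that multi-step process, the sketch below rolls population density, take-up rate and an assumed busy-hour rate per subscriber into a data-density figure in Mbps/km². All of the input numbers are invented placeholders, not values from this article:

def data_density_mbps_per_km2(pop_density_per_km2: float,
                              take_up_rate: float,
                              busy_hour_mbps_per_sub: float) -> float:
    """Required data density = subscribers per km^2 * busy-hour rate per subscriber."""
    return pop_density_per_km2 * take_up_rate * busy_hour_mbps_per_sub

# Hypothetical urban inputs: 3000 people/km^2, 10% take-up at maturity,
# ~50 kbps sustained per subscriber in the busy hour.
print(data_density_mbps_per_km2(3000, 0.10, 0.05))  # -> 15.0 Mbps/km^2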

Base Station Deployment Alternatives
Mobile WiMAX base station equipment will be available from many different vendors and, although all of it will be WiMAX compliant and meet performance and interoperability requirements, a great many different configurations will be available from which service providers can choose. The availability and timing of optional features adds to the equipment variability. Additionally, there are different frequency bands that can be considered, and varying amounts of spectrum available within these bands. The spectrum choices will, in many cases, affect the frequency reuse factor and the channel bandwidths that can be employed in the access network.

WiMAX solutions with beamforming will generally be architected quite differently from SIMO and MIMO solutions. A typical SIMO or MIMO configuration will have power amplifiers mounted at the base of the tower to facilitate cooling and maintenance. The amplifiers in this case must be sized to compensate for cable losses, which can range from 2 to 4 dB depending on tower height and frequency. Beamforming solutions require good phase and amplitude control between transmitting elements and will often be architected with their power amplifiers integrated with the antenna elements in a tower-mounted array. The larger size and weight of these structures also requires more robust mounting. There are additional signal processing requirements for beamforming solutions, with adaptive beamforming being the most computationally intensive.

The selection of channel bandwidth and duplexing method can also have an economic impact on the various WiMAX deployment alternatives. In addition, the desired "worst case" UL rate will affect the UL link budget and therefore impact the range and coverage of the base station.
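To see how the 2 to 4 dB of feeder loss mentioned above feeds into the link budget, note that dB losses simply add: a tower-base amplifier must deliver that much more power to keep the same level at the antenna. A minimal sketch with an invented 43 dBm target:

def required_amp_power_dbm(target_at_antenna_dbm: float, cable_loss_db: float) -> float:
    """Amplifier output needed to offset feeder loss (dB values add)."""
    return target_at_antenna_dbm + cable_loss_db

target = 43.0  # hypothetical 43 dBm (20 W) at the antenna port
for loss_db in (2.0, 4.0):
    amp_dbm = required_amp_power_dbm(target, loss_db)
    amp_watts = 10 ** (amp_dbm / 10) / 1000  # dBm -> W
    print(f"{loss_db} dB cable loss -> {amp_dbm:.0f} dBm (~{amp_watts:.0f} W) at the amplifier")

This oversizing is one reason beamforming solutions integrate the amplifiers with the antenna elements: with no feeder run, the loss largely disappears.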

Conventional cellular deployments used cell frequency reuse factors as high as seven (7) to mitigate inter-cell co-channel interference (CCI). These deployments assured a minimum spatial separation of about 5:1 between the interfering signal and the desired signal, but required seven times as much spectrum. With technologies such as CDMA, introduced with 3G, and OFDMA, introduced with WiMAX, more aggressive reuse schemes can be employed to improve overall spectrum efficiency.
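The roughly 5:1 separation quoted for a reuse factor of seven follows from the classical hexagonal-cell relation D/R = sqrt(3N), where D is the co-channel reuse distance and R the cell radius. This is the standard cellular-planning formula, not anything WiMAX-specific; a quick check:

import math

def cochannel_separation_ratio(reuse_factor: int) -> float:
    """D/R for hexagonal cells: D = R * sqrt(3 * N)."""
    return math.sqrt(3 * reuse_factor)

print(f"N=7 -> D/R = {cochannel_separation_ratio(7):.2f}")  # ~4.58, i.e. roughly 5:1
print(f"N=1 -> D/R = {cochannel_separation_ratio(1):.2f}")  # ~1.73, aggressive reuse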

Number of Base Stations
The key metric for a quantified comparison is the number of WiMAX base stations required to meet both capacity and coverage requirements in the various demographic regions. The WiMAX base station is a key network element connecting the core network to the end-user: it determines the coverage of the network and defines the end-user experience. If too few base stations are deployed, coverage will not be ubiquitous and end-users may experience dropouts or periods of poor performance due to weak signal levels as they move throughout the coverage area. And since the base station investment tends to be a dominant contributor to total end-to-end network costs, deploying too many base stations can result in unnecessary start-up costs for the operator, leading to a weaker business case.
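The dimensioning logic of this section reduces to taking the larger of a coverage-driven and a capacity-driven site count. A minimal sketch, with invented inputs rather than figures from the text:

import math

def base_stations_needed(area_km2: float,
                         coverage_per_bs_km2: float,
                         demand_mbps_per_km2: float,
                         capacity_per_bs_mbps: float) -> int:
    """Sites needed = max(coverage-limited count, capacity-limited count)."""
    coverage_limited = math.ceil(area_km2 / coverage_per_bs_km2)
    capacity_limited = math.ceil(area_km2 * demand_mbps_per_km2 / capacity_per_bs_mbps)
    return max(coverage_limited, capacity_limited)

# Hypothetical metro: 100 km^2, 2.5 km^2 per range-limited cell,
# 15 Mbps/km^2 demand, 30 Mbps usable per base station.
print(base_stations_needed(100, 2.5, 15.0, 30.0))  # -> max(40, 50) = 50

In this example capacity, not coverage, sets the site count, which is exactly the dense-urban situation the Summing up section addresses with later beamforming upgrades.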

Summing up
In the long term, higher-performance base stations with wideband channels provide a potentially more cost-effective deployment solution, as measured by the number of required base stations. One might conclude that it would be worth waiting for antenna technologies such as beamforming and beamforming + MIMO, and possibly even 20 MHz channels, before deploying a Mobile WiMAX network. This, however, is not the case. In the early years, deployment can begin with range-limited base stations using (1x2) SIMO or (2x2) MIMO configurations to achieve ubiquitous coverage over the entire metropolitan area. These base stations can then be upgraded in the following years with beamforming and beamforming + MIMO as necessary to meet capacity requirements in anticipation of a growing customer base. In most metropolitan-area deployments this will only be necessary in the dense urban and urban areas.

Thursday, February 7, 2008

Web 3.0, What is it?

Web 3.0 is a term used to describe the future of the World Wide Web. Following the introduction of the phrase "Web 2.0" as a description of the recent evolution of the Web, many technologists, journalists, and industry leaders have used the term "Web 3.0" to hypothesize about a future wave of Internet innovation. Views on the next stage of the World Wide Web's evolution vary greatly. Some believe that emerging technologies such as the Semantic Web will transform the way the Web is used, and lead to new possibilities in artificial intelligence. Other visionaries suggest that increases in Internet connection speeds, modular web applications, or advances in computer graphics will play the key role in the evolution of the World Wide Web. The Internet, Web 1.0, is so incredibly powerful that even now, almost 20 years later, we have only begun to explore its potential. Web 2.0, with its YouTube, Facebook, Flickr and blogs galore, is even younger and shows even more potential.

Web 3.0

Web 3.0 is defined as the creation of high-quality content and services produced by gifted individuals using Web 2.0 technology as an enabling platform. Web 2.0 services are now the commoditized platform, not the final product. In a world where a social network, wiki, or social bookmarking service can be built for free and in an instant, what's next?
Web 2.0 services like digg and YouTube evolve into Web 3.0 services with an additional layer of individual excellence and focus. As an example, funnyordie.com leverages all the standard YouTube Web 2.0 feature sets, such as syndication and social networking, while adding a layer of talent and trust to them. A version of digg where experts check the validity of claims, correct errors, and restate headlines to be more accurate would be the Web 3.0 version; however, I'm not sure the digg community will embrace that any time soon. Wikipedia, considered a Web 1.5 service, is experiencing the start of the Web 3.0 movement by locking pages down as they reach completion and (at least in its German version) requiring edits to flow through trusted experts.

Also of note is what Web 3.0 leaves behind. Web 3.0 keeps the "wisdom of the crowds" from turning into the "madness of the mobs" we've seen all too often, by balancing it with a respect for experts. Web 3.0 leaves behind the cowardly anonymous contributors and the selfish black-hat SEOs that have polluted and diminished so many communities. Web 3.0 is a return to what was great about media and technology before Web 2.0: recognizing talent and expertise, the ownership of one's words, and fairness.

The debates over Web 3.0
There is considerable debate as to what the term Web 3.0 means, and what a suitable definition might be.
  • Transforming the Web into a database.
  • An evolutionary path to artificial intelligence.
  • The realization of the Semantic Web and SOA.
  • Evolution towards 3D.
  • Web 3.0 as an "Executable" Web Abstraction Layer.

Everyone is welcome to imagine what the future Web will look like... the W3C has been quiet on this point!