Monday, October 30, 2006

IMS plugfest

Some 500 test engineers have just completed a two-week IP Multimedia Subsystem (IMS) plugfest, hosted by Verizon. Mark Wegleitner, Verizon's senior vice president and CTO, said initial feedback suggests significant progress has been made in proving that IMS is the right platform for next-gen networking equipment and networks.

A collective sigh of relief from an industry betting its future on the technology - and it would have been front-page news had the outcome of the interop tests been anything else.

Telco - a pioneer in corporate blogging

Global Crossing has received praise for its use of blogging to get its message across and interact with customers and partners. "A pioneer among telecommunications companies in corporate blogging," is the assessment of Martin Geddes, chief analyst of STL and author of the Telepocalypse blog.

The Global Crossing site covers key technologies such as VoIP peering, IP video, data and conferencing technologies, broadband and Ethernet access.

Friday, October 27, 2006

3G meets broadband

Further to the recent post on femtocells, see the New Electronics link for a more detailed article (click on the .pdf download).

Thursday, October 26, 2006

Convergence back on track

Telecom Italia has confirmed that fixed mobile convergence is back on its agenda. "[Telecom Italia] confirms that convergence between fixed telephony, mobile telephony, broadband Internet and media content remains its strategic goal," it said in an official statement, with the text deliberately in bold.

Last month Telecom Italia surprised the industry when it announced that it would separate its fixed-line and mobile phone businesses (See fixed mobile divergence). "Telecom Italia has been knocked back by the national regulator, but that was not sufficient reason to turn the corporate ship one hundred and eighty degrees" was Total Telecom's leader comment at the time.

Now Telecom Italia intends to develop a next-generation access network enabling high-definition TV and public services such as tele-medicine. The network will also be separated from Telecom Italia, according to a model to be jointly developed with the regulator.

According to The Times, Vittorio Merloni, head of Indesit and an advisory member of the Telecom Italia board, when asked if the board had discussed spinning off TIM into a separate company, replied: “Not even in your dreams.”

Fibre-to-the-CPU


Intel and the University of California, Santa Barbara last month announced an electrically pumped, hybrid laser, an important component that aids integrated silicon photonics. NGN asked Intel some follow-up questions about the possible application of such technology. Here is the response of Sean Koehl, technology strategist at Intel's tera-scale computing research programme.


NGN: Can Intel explain the two or three bottlenecks that it already sees coming (even if five years off) where such optical interconnect technology will be needed, and where electrical interfaces will no longer do?


SK: Storage Area Networks (SANs) already rely on optical interconnects, with the state of the art currently at 4Gbit/s and increasing. For server rack-to-rack communication in data centers, there is already a mix of optical and electrical interconnects, but this is moving more and more to optical as 10Gbit/s Ethernet becomes prevalent.

Within a server, board-board communications will be an increasing bottleneck and therefore an opportunity for more optical interconnects.

Longer term, as systems enter the tera-scale era (teraflop processors operating on terabytes of data), processor-to-processor and processor-to-memory bandwidth requirements will scale to the point where even the best copper Input/Output (I/O) will have difficulty providing the required bandwidth. This is further out, but it is also the highest volume opportunity for silicon photonics and thus requires significant advantage in the price/performance characteristic for optical I/O.

NGN: The hybrid laser technology looks suited to applications where data rate and distance are issues, but Intel seems to be focused more on high-performance servers with lots of CPU cores and boards. It appears more an issue of interface- and data rate-density rather than data rate and distance.

SK: All three are important benefits of optical: data rate, distance, and density.
  1. Data rate: because copper will have difficulty scaling beyond 10Gbit/s, while optical technology already exists at 40Gbit/s
  2. Distance: though copper is getting faster, the distance these links can span is beginning to shrink significantly. Within a data center, distance is no issue with optical. This could not only improve existing links, but enable new architectures by providing distance independence.
  3. Density: because it is possible to multiplex many 10Gbit/s or 40Gbit/s channels on a single fiber. Tera-byte bandwidths are straightforward to achieve on a single fiber. This is where fiber has the biggest advantage: aggregate bandwidth. Copper requires more and more pins or ports to compete in this respect.
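The density point lends itself to a quick back-of-the-envelope calculation. A minimal Python sketch follows; the 25-wavelength count and the 10Gbit/s-per-copper-lane figure are illustrative assumptions, not Intel's numbers:

```python
# Aggregate fibre bandwidth from wavelength-division multiplexing (WDM),
# versus the number of electrical lanes copper would need to match it.

def wdm_aggregate_gbps(channels: int, rate_per_channel_gbps: int) -> int:
    """Total bandwidth of one fibre carrying `channels` wavelengths."""
    return channels * rate_per_channel_gbps

def copper_lanes_needed(target_gbps: int, rate_per_lane_gbps: int = 10) -> int:
    """Electrical lanes needed to match the same aggregate bandwidth."""
    return -(-target_gbps // rate_per_lane_gbps)  # ceiling division

aggregate = wdm_aggregate_gbps(channels=25, rate_per_channel_gbps=40)
print(aggregate)                       # 1000 Gbit/s, i.e. 1 Tbit/s on one fibre
print(copper_lanes_needed(aggregate))  # 100 electrical lanes at 10 Gbit/s each
```

Twenty-five wavelengths at 40Gbit/s reach a terabit on a single fibre, where copper would need a hundred 10Gbit/s lanes - the pin-count problem Koehl describes.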

NGN: What are the key bottlenecks that will pop up first that need such optical technology? (Is it backplane technology? Is it CPU-to-CPU or CPU-to-memory?) And can Intel add some numbers here - data rates/interface densities where electrical runs out of steam?


SK: There is no exact number, because the distance and speed for electrical are highly related. You can always push an electrical solution farther, but you pay in power, complexity (by adding additional lines) and cost. The issue is the overall price/performance of the current electrical solution versus the proposed optical one.

That said, for chip-chip interconnects, going beyond 20Gbit/s per line looks to be very challenging. We also have leading research on copper-based I/O which is showing great results, but above this speed optical will start to look more attractive.


For networking, 100Gbit/s Ethernet looks to be a key speed for optical. Ethernet is expected to continue the “factor of 10” scaling that it has in the past, and at 100Gbit/s speeds any copper solution would be extremely challenging.

Wednesday, October 25, 2006

A question of standards

The telecom industry has benefited greatly over the years from standards. Carriers can use equipment from several vendors, and with equipment interoperability comes economies of scale and faster adoption. Perhaps the best example of collaboration is the 3GPP/3GPP2, ETSI TISPAN and CableLabs all coming together to support the latest NGN and IMS standards - an industry first.

But despite the collective benefits, politics is never far away. The intellectual property rights of particular companies need to be resolved when a standard is thrashed out. More often than not the standard ends up being more complicated in order to accommodate the factions. Very rarely, the gap can't be bridged and two standards emerge, splintering the market from the start.

There are also regional politics. Europe did particularly well from second generation cellular by making GSM a global standard. Other regions took note and made sure they weren't left behind the next time round. South Korea has been smart in using standards locally early enough to allow it to be adept at addressing markets worldwide. The latest examples of this are WiBRO and WDM-PON.

But what is one to make of China? China has still to issue 3G licenses but it is clear that its own TD-SCDMA standard will play a key role. It is also developing its own mobile-TV standard (does the world, with nearly a dozen mobile-TV standards, really need one more?). China is also considering its own passive optical networking (PON) standard.

China is unique in the size of its home market. And if the percentage of the population that can afford advanced telecom services is still small, it is growing. In turn, developing internal standards ensures Chinese telecom vendors reduce their exposure to foreign-held patents. Internal standards also help them develop the necessary expertise as well as ensure they have a local market where they compete favourably with foreign vendors.

But what is the point of introducing standards if they do not advance the industry? For example, what benefit is there in introducing yet another 3G standard that is several years behind the two existing ones?

TD-SCDMA may yet prove to be a superior 3G standard. But with the momentum and scale of existing 3G standards that is a tall order, especially when no external market has adopted TD-SCDMA. Moreover, Chinese firms such as ZTE are already active 3G handset and networking equipment players. The same is true for PON, where Chinese equipment vendors offer EPON and GPON.

One indicator of China's emergence as a key player on the world's telecom stage will be when its Ministry of Information Industry lets up on scripting local standards across the telecom landscape.

Is China being shrewd with its standards policy given the success of Chinese equipment vendors?

Tuesday, October 24, 2006

WiMAX’s opportunities: few and far between

Fierce competition from two flanks - fixed-line broadband and cellular - will limit the global deployment of WiMAX. So argues a new study from Sound Partners.

The market research firm has developed a business model based on the same assumptions as the WiMAX Forum as to what is needed to build a network. “We wanted to understand the key sensitivities,” says Alastair Brydon, Sound Partners’ CEO, whose WiMAX report is to be published by Analysys. "In reality there are not that many cases [for WiMAX] that offer a good return."

The study looks at deployment scenarios in urban, suburban and rural areas in developed and developing markets. In developed markets, carriers’ digital subscriber line (DSL) deployments of ADSL2+, complemented with fibre-to-the-node/VDSL2 to fill in coverage and enhance link speeds, pose a key competitive threat to WiMAX. Moreover, such DSL services are being offered by well-known brand names (Tesco, 3, Sky) as well as incumbents. “It will be really hard for anyone, using any [broadband] technology, to get 10 to 15 percent market share in five years,” says Brydon.

The same applies in rural areas where DSL coverage is being extended to nearly all exchanges. “BT is constantly expanding its service,” he says.

Nor is this solely the case in markets such as the U.K. China plans to use broadband wireless access to provide services where there is no fixed-line infrastructure but not WiMAX. “There are no plans issued by operators in China for the deployment of WiMAX,” Tian Wenguo, senior vice president of company strategy at ZTE, told NGN.

“Cellular is also rocketing,” says Brydon. “Terminals for cellular are dirt cheap as are DSL modems; WiMAX terminals are not.” Someone must pay for them, and if it is the operators it will prove a big expense. “We are not saying WiMAX won’t happen, just it will not be as big a business as some are arguing,” says Brydon.

WiMAX’s prospects would certainly be enhanced if a major mobile operator embraced the standard. But, if anything, cellular operators are looking to DSL to bolster their broadband offerings. Vodafone and O2 are adopting DSL in certain markets though as yet it is not a global decision.

“They are in the decision process now, which is why this is a critical time for WiMAX’s prospects,” says Brydon.

Is there market hype regarding WiMAX's prospects?

Monday, October 23, 2006

Q&A: The future of mobiles - Part II


The second part of the Q&A with Professor Rudy Lauwereins, vice president of Belgium’s Interuniversity Microelectronic Centre (IMEC), on the future of handsets. For the first part of the Q&A, click here

NGN: Fixed mobile convergence (FMC) is a central policy for carriers at present. How does true FMC integration happen? For now the wireless infrastructure upgrade race just seems to be in parallel with the DSL/Fibre fixed network race. Apart from WiMax there doesn't seem to be a convergence path. Is that really a wrong interpretation, and is IMEC seeing signs of FMC as part of future handset designs?

RL: This is difficult for me to answer as we don't work on fixed. But FMC in terms of making calls in the home from a mobile handset via broadband is happening, as are the technology capabilities for FMC [such as dual mode GSM-Wi-Fi handsets]. But most operators aren't happy. Even if a mobile arm is part of the carrier, it is still a separate entity with its own profit and loss. And then there is the prospect of third party VoIP providers taking business.

It's the same story with cognitive radio: operators don't like it. It's OK if the phone picks and chooses a standard as long as it's a standard the carrier provides: I offer all and I am in control. That is why they are pushing for the basestation to take the decisions, but that doesn't make sense. The decision-making should really be in the terminal, and that is what they are scared of.


NGN: Are the fast-moving wireless market trends making mobile design for 2012 a continual moving target? Here is just one example: Nokia have said there is a working group at 3GPP looking at inter-working between WiMAX and 3G-LTE. Clearly they see the technologies co-existing and think mobile WiMAX will be for non-3G operators.

RL: It is a moving target and it is moving faster and faster. You need to make sure you have a flexible platform in case things change; then it is just software you adapt, not hardware. With software-defined radio, it is not just about multiple standards but the evolution of standards. IMEC is implementing IEEE 802.11n, IEEE 802.16e and 3GPP-LTE [all on the one platform]. The three aren't finalised standards, so we aren't standard-compliant, but we are confident the platform will be able to handle the standards when they are finalized. Companies are now saying let's start earlier - pre-standard - and we will adapt using the same platform.


NGN: Next Generation Networks keeps asking this question: By 2012 the 4G standard will be starting, offering 100Mbit/s (mobile) and 1Gbit/s (static) data rates. Just what will such data rates be used for?


RL: I have been asked this many times already. You only have to look at wired connections. A friend of my son was sharing his music library between their PC hard drives; they were copying over a 100Mbit/s link and it took four hours. This would be less than 30 minutes over a 1Gbit/s link.

What will 1Gbit/s be used for? As a cable replacement and for sending video wirelessly in bursts. Solid-state storage in a video camera will be 256 Gbyte by 2012. Sending it over 100Mbit/s will take forever.

In the home, video is likely to be sent uncompressed from a central hard disk store to displays around the house. Three uncompressed HDTV channels will be greater than 1Gbit/s in total.
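The arithmetic behind these answers is simple enough to sketch. The sketch below assumes the link runs at line rate with no protocol overhead (real throughput would be lower) and uses decimal gigabytes (1 Gbyte = 8,000 Mbit):

```python
# Transfer-time arithmetic for the data rates discussed above.

def transfer_time_seconds(size_gbyte: float, link_mbps: float) -> float:
    """Time to move size_gbyte gigabytes over a link_mbps link at line rate."""
    return size_gbyte * 8 * 1000 / link_mbps  # Gbyte -> Mbit, then divide

# The 256 Gbyte camera store Lauwereins mentions:
hours_at_100meg = transfer_time_seconds(256, 100) / 3600
minutes_at_gig = transfer_time_seconds(256, 1000) / 60
print(round(hours_at_100meg, 1))  # 5.7 - hours at 100 Mbit/s
print(round(minutes_at_gig))      # 34 - minutes at 1 Gbit/s
```

Offloading a full 2012-era camera takes an afternoon at 100Mbit/s but roughly half an hour at 1Gbit/s, which is the "cable replacement" case in a nutshell.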


NGN: Will that be 4G?


RL: No one agrees what 4G will be. We expect the air-interface to be flexible. If used indoors it will be wireless LAN-based while outdoors it will use a different set of radio parameters.


Professor Lauwereins heads one of IMEC's four research divisions. His group develops enabling technologies for consumer and battery-operated devices in the nomadic and mobile arena.

Wednesday, October 18, 2006

Broadband's impact on an economy

Accepted wisdom is that broadband is an important economic enabler. Certainly that is the argument of those promoting FTTH deployment in Europe. But is it true?

The findings of a 2005 study by the Massachusetts Institute of Technology’s (MIT) Center for Technology, Policy and Industrial Development suggest broadband does benefit an economy. But it is extremely difficult to measure, according to one of the work's authors.

Using U.S. national data, the MIT study showed communities experienced more rapid growth in employment and businesses once mass-deployment of broadband occurred between 1998 and 2002.

What the study didn't answer is whether the economic benefits are short lived once neighbouring regions catch up, or whether the benefits of getting broadband earlier compound into the future. Nor did it measure the relative economic impact of particular broadband technologies such as digital subscriber line (DSL) versus FTTH. But new data collected from late 2005, as mandated by U.S. regulator, the Federal Communications Commission (FCC), will enable such analysis in future.

"The big problem with determining the economic cost - benefits foregone, loss of competitive advantage - associated with slower broadband adoption is that it is very difficult to measure,” says William Lehr, research associate in the Center for Technology, Policy and Industrial Development at MIT. “It is difficult to measure the impact of any IT-related input and even more difficult to measure the impact associated with a specific kind of IT.”

"Qualitative: broadband seems to produce benefits," says Lehr. "Quantitative: in our study these were on the order of 1% higher job growth, which is huge, and an increased share of firms in higher-value IT-intensive sectors."

“However, we are not convinced we have adequately controlled for causality. Simply put, this is the question of whether broadband follows economic activity (communities with broadband are ones that had greatest growth prospects) or whether broadband produces economic activity. Our study used the best metrics available to control for causality but this remains an important question.”

Lehr continues:

“With respect to FTTH and who lags whom, there are no good empirical studies. Some will argue that the early pioneers may be the ones with arrows in their backs. That is likely the case with some of the municipal broadband deployments. That said, it seems clear to me that FTTx is a good thing and more bandwidth in the last-mile facilities will help promote growth across the value chain. But I do not have a number to say how important it is.

Also, catching up gets easier as systems become more modular and commoditised. So it is unclear whether communities that fail to adopt broadband early will lose much competitive advantage if they follow more closely.

The real issues are whether a future global economy based on broadband will promote localism or accentuate the scale/scope economies of the leaders. Will Hollywood become even more dominant, or will French local content finally be able to reach a viable audience? Both equilibria are possible. I believe that broadband enhances the opportunity for local content and local economies, but this is unproven.

Another obvious benefit of getting broadband sooner is that you realize the consumption gains sooner. It has been estimated that the cost of delaying mobile telephone services in the US because of regulatory delays was on the order of $40 billion in foregone consumer surplus. These sorts of economic losses are sizable but again are difficult to estimate,” he says.


Does it matter that Europe lags in FTTH deployments?

Are equipment vendors right to raise concerns or should they be investing more effort to quantify the cost to Europe of its non- deployment?

Tuesday, October 17, 2006

Verizon turns heads

Verizon Technology Organization (VTO) allowed analysts into its Waltham, MA labs recently to tour its FiOS test integration and digital home labs. Ovum-RHK, for one, left with an extremely favourable impression of Verizon’s FiOS fibre-to-the-premises (FTTP)-based triple-play programme.

Verizon now has close to 500,000 FiOS subscribers of which 100,000 are FiOS TV subscribers. The service provider also expects its investment in FiOS to earn positive operating income by 2009, with a positive impact on earnings in 2008. From 2004 through 2010, Verizon believes that it will have spent $18 billion passing 18 million households, says Ovum-RHK.
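A back-of-the-envelope derived from those figures (our arithmetic, not a metric Ovum-RHK quotes):

```python
# Verizon's quoted FiOS spend against homes passed, and what it implies
# per home and per current subscriber.

spend_usd = 18e9      # 2004-2010 FiOS capital spend
homes_passed = 18e6   # households the fibre runs past

cost_per_home_passed = spend_usd / homes_passed
print(cost_per_home_passed)     # 1000.0 - dollars per home passed

# Against the ~500,000 subscribers signed up so far, the per-subscriber
# figure is far higher; it falls only as take-up grows.
subscribers = 500_000
print(spend_usd / subscribers)  # 36000.0 - dollars per current subscriber
```

Roughly $1,000 per home passed is the headline economics; the per-subscriber cost is what the 2009 positive-operating-income target depends on.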

“Basically, they were there to tell the analyst community ‘look at us, we are doing the right thing, it’s economically viable, and it will pay off’. They are off to a good start,” says Ken Twist, vice president, technology consulting and broadband networks practices at Ovum-RHK.

Will the increasingly favourable reports on Verizon's progress change the plans of other carriers? A recent report by U.S. investment banker, Cowens & Co., says that AT&T remains committed to its Project Lightspeed fibre-to-the-node (FTTN) deployment through 2009. But with cable operators looking to upgrade their networks with FTTH beyond 2010, the competitive landscape is shifting rapidly. “AT&T may feel compelled to transition from FTTN to FTTH more rapidly,” the report says.

Cowens & Co. makes a further observation that while the financial community concentrates on AT&T’s Lightspeed project, it expects the service provider to “focus on its largest and highest growth potential businesses – wireless and Enterprise”.

These represent 60% of its expected revenues in 2007. Residential, in contrast, accounts for only 23% of revenues, and Cowens & Co. expects it to generate only modest growth.

Fibre is not the sole way for a carrier to bolster its financial performance.

Monday, October 16, 2006

Q&A: The future of mobiles - Part I


Next Generation Networks (NGN) asked Professor Rudy Lauwereins, vice president of Belgium’s Interuniversity Microelectronic Centre (IMEC), for his thoughts on mobile handsets through 2012.

Professor Lauwereins heads one of IMEC's four research divisions. His group develops enabling technologies for consumer and battery-operated devices in the nomadic and mobile arena.

This is the first of a two-part interview.


NGN: Looking at current handsets, they do everything: they are cameras, MP3 and video game players, as well as providing web browsing and email. What will handsets deliver by 2012, and what will be new and novel?

RL: This is a difficult one. Today's phones deliver all you can imagine. But that doesn't mean there won't be new applications. What I can say is that screens will have higher resolution and probably be bigger in size, plus there will be much more storage. Devices will thus support more realistic games, higher (data) throughput and much more compute power. All these will make several applications possible but I have no clear idea.


NGN: What specification will they have and how will they compare to today's phones in terms of processing power, storage, and technologies within the handset? One example you have mentioned in the past is that handsets will have more than one camera to create views from any angle.

RL: I have been tracking the cost of compact flash (non-volatile memory) for the last eight years: what capacity memory you can buy for Euro 50 and what for Euro 1000. It is doubling every year. At the end of 2006, you can buy 16 Gbyte for Euro 1000 and 2 Gbyte for Euro 50. Memory companies tell me they expect this to continue for the next five years. That means handsets - at least high-end ones - will have 64 Gbyte in 2012. For low-end phones it will be 16 Gbyte. What applications will use such cheap storage? Current cameras provide stills and poor-quality video clips. By 2012 you'll have a video camera with you all the time.
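Lauwereins' projection is simple compounding: flash capacity at a fixed price point doubles every year. A quick sketch from the end-2006 figures he quotes; the Euro-1000 extrapolation is our arithmetic from the same rule, not a figure he gave:

```python
# Flash capacity at a fixed price point, doubling once per year.

def flash_capacity_gbyte(base_gbyte: int, doublings: int) -> int:
    """Capacity at a fixed price after a number of annual doublings."""
    return base_gbyte * 2 ** doublings

print(flash_capacity_gbyte(2, 5))   # 64 Gbyte - the Euro-50 class after five doublings
print(flash_capacity_gbyte(16, 5))  # 512 Gbyte - the Euro-1000 class over the same span
```

Five doublings of the 2 Gbyte Euro-50 figure give the 64 Gbyte he expects in a high-end 2012 handset.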

Handset devices in 2012 will also have multiple processing cores. Handsets now have multiple cores but having had discussions with nine companies recently, the expectation is that by 2012 there will be between 16 and 100 cores per handset. That's ten times greater than today. There will also be silicon-scaling benefits [using smaller feature CMOS processes which will further boost each core's processing performance] so the overall processing performance improvement will be greater than ten times. It is hard to say exactly but maybe a factor of 30.

As for the multiple cameras, all nine companies say they don't know whether there will be a suitable business model for it and hence it is not a strong focus. Where there is pressure is to combine video and 3D graphics on one platform. High-end phones have a graphics-processing-unit now. There is a need to find an architecture that deals with video and 3D graphics without needing a dedicated [hardware] unit. The video and graphics will be used mainly for games but also for graphical data such as navigation maps.


NGN: What are the leading technical challenges to be overcome to make such handsets possible? For a start, Nokia and Qualcomm both mention that handsets will need to support eight distinct radio standards. Then there is the issue of growing power consumption while the energy capacities of batteries are not advancing in lock step.

RL: Software-defined radio will be in products in 2009 and the work on that is almost done, so this is almost in the past! What is being finalised now is passive software-defined radio – a flexible platform that can implement a number of radio standards, but they will be ones I select when I want to use them.

The next step, to appear by 2012, is agile radios. This is a much more active form of software-defined radio. Here the handset scans to select the network available based on the user profile, such as what is lowest cost or what offers highest voice quality. But the terminal rather than the operator does the selection.

Then there is cognitive radio where the terminal scans the whole spectrum, finds a frequency it can use and then chooses the most appropriate modulation scheme. But a true cognitive radio is beyond 2012.

Batteries aren’t scaling well but then current phones, with batteries weighing 30g or 40g, last twice as long as previous ones, so we can solve the problem at the architecture and application level.

With 3G handsets it’s a new problem again but there is quite some innovation here in processing technology, algorithms and storage. All are used to increase the compute power for the same energy capacity.

The idea that by 2012 there will be new fuel cells that offer a quantum leap in energy stored is not seen as likely, but current batteries are increasing in energy capacity by 5% to 10% each year. We can certainly live with it.

Companies also point out that even if larger batteries emerge, the instantaneous power consumption will not rise above 3W. You put a phone in your pocket and it cannot go above the 3W mark for temperature reasons.

There is also the issue of programming the software code onto a multi-core architecture. The C [high-level language used to write the software code] compiler needs to be aware of the multi-processor architecture.

Then there is the issue of predictability. One application mapped onto the hardware may run in real-time, and so may a second, but what happens when you run both? The mix of applications changes over time, and guaranteeing that all the combinations will run in real-time is a problem. Ensuring that all the applications work correctly in all combinations is infeasible. There are stories that after mapping applications onto the hardware correctly, it took 100 man-years of effort to solve the predictability issues.

Another issue is scalability. Cell phone makers support some 60 different phone models using several general hardware platforms ranging from low end to high end. Since mapping applications is a huge job, they don't want to spend all this effort on each of the platforms. Rather, they want to develop applications once and get them running quickly on all these platforms.

One last challenge, that IMEC is expert in, is building a reliable silicon platform with predictable performance, when going to smaller CMOS processes results in increasing transistor variability and decreasing reliability.

The second part of the interview covers fixed-mobile convergence, how to ensure designs of 2012 meet the fast-changing requirements of wireless, and what will 4G data rates be used for.

Friday, October 13, 2006

Fixed mobile convergence's second wave

There is something odd about spending a fortune on 3G spectrum licenses only to discover that signal coverage in the home is poor. Did mobile operators always know that after rolling out 3G cells at a macro level, they would need to turn their attention to the home? Or have they been surprised?

Either way, the result is the emergence of the femtocell market. A femtocell is a tiny base station that sits in the home and which is connected to the network via broadband. Femtocells are a mobile operator play: you need to be a spectrum owner to offer them.

They also address 2G as well as 3G, although 3G will enable operators to best exploit good in-door signal coverage. Femtocells also work with existing handsets unlike FMC services being deployed now. The user also avoids trading in their handset for one of only a few dual-mode cellular-Wi-Fi handsets available.

Femtocells can thus be viewed as the next wave in fixed mobile convergence (FMC). “For the cellular carrier, the central notion [of femtocells] is to capture more of the consumer spend,” says Stuart Carlaw, ABI Research’s principal analyst, wireless connectivity.

Operators will start femtocell trials in 2007, and ABI forecasts that by 2011 there will be 32 million femtocells deployed, supporting 102 million users.

But there are challenges. Radio interference is one. 3G mobile networks are planned carefully in terms of the cell frequencies and scrambling codes used. Once femtocells are sold, low-power base stations will start appearing within existing 3G cells. If a user’s phone detects a stronger 3G-macrocell signal, will it switch over to the regular 3G cell? Equally, if the femtocell signal is stronger than the macrocell's, will a passer-by’s phone try to connect?

“No one has done this in anger,” says Dean Bubley, analyst and founder of Disruptive Analysis. “No one knows what it does to the frequency plans when 1000 of these light up in a square kilometre.”

Another challenge is snaring households to adopt a femtocell. Do all a household’s users have 3G handsets? Then there is the issue of connecting yet another box to the home gateway, and the help-desk cost and support needed when there is a problem.

That said, all the makers of femtocells, and associated networking equipment, that NGN has spoken to are bullish about the market's prospects. You only have to talk to the operators to know this will happen, says one.

Thursday, October 12, 2006

What is copper's life sentence?

Two service providers, Telefonica and Bezeq, have joined several vendors as well as academic institutions to form the Dynamic Spectrum Management (DSM) Consortium. The consortium is tasked with developing DSM technology to "increase current subscriber broadband rates beyond DSL technology".

The technology promises "fibre-optic rates" over twisted pair, extending the data rates achieved by ADSL2+ and VDSL2 for a given length of twisted pair cabling. VDSL2 achieves 100 Mbit/s at 500m, and 50 Mbit/s over 1km.

Quite what is meant by "fibre rates" using DSM is not stated but at best it will be a doubling of the data rate/distance performance current DSL technology achieves.

Using DSM to extend the data rates over copper is clearly an attractive option for carriers, especially European incumbents who favour fibre-to-the-node/VDSL2 access schemes rather than fibre-to-the-home. Couple that with improvements in coding algorithms - such as an HDTV channel compressed to 3.5Mbit/s - and copper-based broadband could live years longer than expected.

So just what is the maximum data rate possible over copper? Try 10Gbit/s over 100m. That is what the 10GBase-T standard, approved in June, offers.

Now no one is suggesting 10GBase-T be used for broadband. For a start, 10GBase-T uses special cabling with four twisted pairs and is designed for data centres, not access networks. And if you were going to the expense of laying new cabling to within 100m of the home, it would be fibre, not copper. But it shows what is possible over copper when advanced signal processing techniques are used.

Tuesday, October 10, 2006

The ultimate customer premises equipment

“With teenagers happy to use MySpace (the networking website) and blogs to share details of their private lives, there may be less concern surrounding privacy than for other generations.”

Geraldine Padbury, senior business analyst at The Institute for Grocery Distribution (IGD), the retail think-tank whose survey found almost one in ten teenagers and one in twenty adults are willing to have a microchip implanted to pay shop bills and prevent card or identity fraud and muggings.

However, the IGD believes supermarkets will use biometric techniques first, such as fingerprint and iris recognition. Such methods are more popular than paying by mobile phone because of handset theft concerns. This suggests the industry must solve a key concern if handsets are to be used for e-commerce.

Lastly, the IGD survey found 16 per cent of teenagers and 12 per cent of adults want navigation systems on supermarket trolleys to help them round the store. Such a system is already being trialled in Germany: shoppers connect their loyalty card to a trolley-based computer, which displays goods bought last time as well as special offers and their location.

Add RFID to the goods bought, and armed with your chip implant you could breeze past the checkout. Now that's some service offering.

Monday, October 09, 2006

UltraHD and IMS

It is rare for a commentator to be bowled over by a technology. But that is what happened when Hervé Utheza, a former director of IPTV research at market research company The Diffusion Group, witnessed an NHK-NTT demonstration of Ultra High Definition (UltraHD).

“I saw something that truly excited me - that is, made shivers run up the back of my neck and goose bumps appear, letting me know I was witnessing a truly breakthrough technology, not just some repackaging, re-integration, or re-combination of the existing, but something capable of really changing the digital technology landscape,” said Utheza in an opinion piece for The Diffusion Group.

UltraHD has a resolution of 7,680 by 4,320 pixels, some 16 times that of 1080p HDTV, or 3.75 times that of Digital Cinema 4K.
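Those multiples are easy to verify from the raw pixel counts; a quick sketch (assuming 1080p HDTV is 1,920 by 1,080 and Digital Cinema 4K is 4,096 by 2,160):

```python
# Pixel counts for each format: width x height.
formats = {
    "UltraHD": (7680, 4320),
    "HDTV 1080p": (1920, 1080),
    "Digital Cinema 4K": (4096, 2160),
}

pixels = {name: w * h for name, (w, h) in formats.items()}

# Ratios of total pixel counts, as quoted above.
ratio_hd = pixels["UltraHD"] / pixels["HDTV 1080p"]        # 16.0
ratio_4k = pixels["UltraHD"] / pixels["Digital Cinema 4K"]  # 3.75

print(f"UltraHD is {ratio_hd:g}x 1080p and {ratio_4k:g}x Digital Cinema 4K")
```

At roughly 33 million pixels per frame, the bandwidth needed to carry UltraHD is a challenge in itself, which is presumably why NHK and NTT demonstrated it together.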

Utheza believes UltraHD is so outstanding that the industry should leapfrog straight to the standard and scrap Digital Cinema altogether.

Meanwhile, analyst Dean Bubley attended a two-day brainstorming session on IMS. He told NGN that while the event's aim was to look at how to make money from IMS services, it quickly focused on what bits of IMS are ready and what are not.

“IMS is not ready for mobile operators while for fixed it is technically ready in parts," he said. See his post for more detail.

Friday, October 06, 2006

Soundbite: FTTH in Europe

“No European incumbent has announced a large FTTH deployment because they realise there is no business case... FTTH is very risky and very uncertain, with negative returns just to stay in the game.”

Lars Godell, principal analyst, European telecoms, at Forrester Research, who fears carriers are being pressured to enter the FTTH market in a way reminiscent of 3G spectrum auctions six years ago. Click here for Total Telecom article.

Thursday, October 05, 2006

Seeking the business case for FMC

Today is a big day for Orange, and arguably for Europe, regarding the future of fixed mobile convergence (FMC). Orange’s Unik service goes live in France, enabling users to make calls on their mobile through their home’s broadband connection or a Wi-Fi hotspot.

Orange also plans to launch a Unik-style service in the Netherlands, the UK, Spain and Poland. With Orange serving 60m subscribers across these markets, mobile handset makers should at last be convinced to bring more dual-mode Wi-Fi/GSM handsets to market.

Orange is only the latest of several European operators to deploy such a service. Telecom Italia has just made available its much-anticipated UMA-based Unica FMC service, though Total Telecom has already reported how the service launched with little fanfare. Meanwhile, in August, TeliaSonera launched its Home Free Mobile IP service in Denmark, while Deutsche Telekom started its SIP-based T-One service in Germany.

And all follow trailblazer BT, whose Fusion service launched over a year ago. BT’s Fusion subscriber numbers, however, are discouraging. “The latest figures we have are 30,000 at the time of BT's Q1 results [issued in July],” said a BT spokesperson. One FMC industry observer believes BT’s service package has too many constraints. “Operators that understand how to bring the service to market should be well into the million-plus subscribers in the first year,” he says.

Yet FMC service launches will only gather pace in the coming year. “One year ago the FMC Alliance had a dozen members and BT had launched Fusion. Now it has 26 members and eight have announced or started services,” says Ian Cox, an analyst soon to publish an ABI Research report on FMC. “In mid-2007 I expect all 26 members to have services running.”

Cox notes that UMA is the first standards-based way of enabling handsets to connect via broadband, and it has a two- to three-year lead over a full IMS solution. But there are pre-IMS trials and services up and running based on dual-use SIP handsets from the MobileIGNITE group.

But what concerns him most is the business case for the service: “There is a lot of expense here.” There is the CAPEX for the UMA controllers, which cost several million dollars apiece, “and you need at least two”. Then there is the cost of the handset, as well as the challenge of getting a potential subscriber to give up their existing handset, which is likely to be relatively new.

Cox views the operators’ FMC service launches as a defensive strategy. If carriers don’t do anything they will lose revenue, he says, and with such a service they can offer prices that match the VoIP charges of a Vonage or Skype. “The business case for FMC is difficult. There are lots of things to consider,” he says.

The industry observer is more upbeat about FMC services, though his gaze is towards the US market. T-Mobile is very encouraged by its FMC trials and the consumer feedback, he says. “They are upping their forecasts.”

Are operators harming themselves by offering complicated and inflexible converged service packages to users?

See also the poll in The Register, click here

Wednesday, October 04, 2006

VoIP as an opportunity

VoIP need not always be a threat; indeed, for cellular operators it represents an opportunity. So argues a report from market research firm Sound Partners. "Ten years hence, VoIP won't be the dominant way [voice calls are made] but it will be getting there," says Sound Partners' CEO, Alastair Brydon.

Today's cellular networks cannot deliver mass-market VoIP services: they lack the throughput, and voice quality would be unacceptable due to packet delays. But this is about to change with the launch in the US this year of CDMA2000 1x EV-DO Revision A, which has hooks for VoIP. Europe will have to wait until 2009 and the introduction of the 3G Long Term Evolution (LTE) standard.

Operators will invest in 3G LTE for services such as Internet access and mobile TV, with VoIP being one more offering. "A 3G LTE network using VoIP will be 28% of the cost of today's network," says Brydon.

Other benefits of going to IP include the reduced cost of a single core network for fixed and mobile traffic, compared with running a separate circuit-switched network for voice, and richer services from mixing voice with multimedia content. Since the trend is for traffic to move from fixed to mobile, all the carriers have to do is manage the transition to VoIP correctly, says Brydon.

And the opportunity? "If it is plain voice, everyone is squeezed but VoIP can be offered as a premium service," he says. Such a service would include superior voice quality as well as instant messaging, presence and multimedia.

Sound Partners forecasts that cellular VoIP will account for 49% of all cellular minutes in 2015 and generate revenues of US$70.9 billion. In Western Europe it will be 33% of all cellular minutes and $54.7 billion in revenues.

Tuesday, October 03, 2006

Unlicensed mobile access

A website dedicated to the Unlicensed Mobile Access (UMA) technology. Click here

Monday, October 02, 2006

Incumbents as dinosaurs

The Economist has used recent events at Telecom Italia to survey Europe's incumbents. It highlights Telecom Italia's woes - slow growth in mobile, a decline in fixed-line revenues, competition and huge debts - and suggests these are common with its "dinosaur-like peers" elsewhere in Europe.

Stephen Pentland of Deloitte is quoted as saying that "the economics are not getting better,” and that incumbents can expect fundamentally lower levels of earnings.

In response, carriers have all adopted similar strategies: bundling fixed and wireless services domestically while competing in foreign markets through their mobile and broadband arms.

The way different carriers are looking to attack their rivals by bundling services is fascinating. I have been talking to vendors and analysts about femtocells: tiny 3G basestations, each serving a single home, which mobile operators are eyeing as a way to compete with fixed-line operators while using those operators' broadband networks to carry the traffic. Femtocells are set to appear in the second half of 2007.

The effect of all these strategies is increasing pain for the carriers as they “at last start to compete with each other”. Competition benefits the consumer, argues The Economist, while strengthening the case for carrier consolidation. Such consolidated entities would also be in a better position when negotiating with content providers. But governments will block such moves, it says, hence the "dinosaurs lumber on".

Sunday, October 01, 2006

Bring a (telecom) Book to Work Week

From The Observer newspaper (1/10/2006):
"Thousands of people will take their favourite book to work next week as part of a global initiative by Book Aid International to benefit the world's poorest readers. The idea of Bring a Book to Work Week is that for every book taken in, companies signed up to the scheme will donate £2 to help the charity provide people in some of the world's poorest countries with one of their own."
Would you recommend a telecom book? If so, please say why you found it so useful.