Features

European Communications highlights some of the upcoming events focussing on broadband technology

The Broadband World Forum Europe, 3-6 October, Madrid

The Broadband World Forum Europe comes to Madrid, Spain, 3-6 October, bringing together key executives working in carrier and supplier companies from around the world, to join today's thought-leaders from such organisations as Telefónica, France Telecom, Belgacom, BT, Telecom Italia, KT, NTT, T-Com, and Swiss Telecom.
In all, the Broadband World Forum Europe features 80+ exhibiting companies, 150+ high-level speakers, 40+ workshops and conference sessions, Plenary Panels, Keynote Address, Hot Seat Sessions and more.
Attendees will have the opportunity to learn the business models, deployment strategies, and rollout practices that have proven successful in making mass-market broadband in Europe a reality. The event features the latest perspectives and real-world experiences from leading executives through four days of comprehensive educational programming, complemented by a cutting-edge technology exhibition and high-tech demo pavilion, offering attendees a firsthand look at the latest broadband technologies, equipment, applications, solutions, and services.
New at this year's World Forum, the co-located WiMAX Global ComForum tackles technology and business challenges in implementing WiMAX wireless broadband networks for both fixed wireless access and mobile broadband.
The InfoVision Award programme will also take place at Broadband World Forum Europe 2005. The InfoVision Awards recognise technologies, applications, products, advances, and services. The awards also honour corporations and individuals for innovative contributions and developments that have proven important to society.
Hands-On Technology Pavilions include: a WiMAX Demo Pavilion, CableNet Demo Pavilion, VoIP Public Phone Booth, and a Video Demo Pavilion.
Broadband World Forum Europe conference chairman, Julio Linares Lopez, executive chairman of Telefonica Espana, will welcome attendees during the Monday morning Opening Ceremony Address, and Luis Ezcurra, general manager of marketing and market development at Telefonica Moviles Espana, will deliver the Wednesday keynote address, 'The Challenge of UMTS'.
John Janowiak, senior director of the IEC, notes: "Telefonica's commitment to broadband in the market, as proven in their mass deployment of ADSL, is truly impressive. Their expertise in voice and data systems will provide valuable experience to conference attendees."
Linares Lopez adds: "The topical platform coupled with the level of executive speakers and conferees brings a unique flair to the programme. The shared international knowledge will guarantee great success for this Forum."
Other key speakers at the event include: Edward Deng, Senior Vice President, Head of Global Marketing, Huawei Technologies; Alan Mottram, President, Fixed Solutions Division, Alcatel; Phil Corman, Director, Worldwide Partner Development, Microsoft TV Group; Paul Berriman, Head of Strategic Market Development, PCCW Limited; and Jean-Philippe Vanot, Executive Vice President, Networks, Carriers, and IT, France Telecom.
Details: www.iec.org

The 2nd Broadband Russia and CIS Summit 2005, 31 October - 1 November, Moscow

Supported by Leonid Reiman, Russia's Minister of Information Technology and Telecommunications, the event will take place from 31st October - 1st November 2005 at the Marriott Grand Hotel in Moscow, Russia.
This annual event has established itself as the largest forum for IT and Telecom industries in Russia and its neighbouring countries. Indeed, the previous Broadband Russia and CIS Summit attracted Government ministers and senior representatives from the public and private sectors, along with more than 200 international participants.
It is now established as a unique platform to pursue business opportunities in Russia and CIS countries in the ICT Broadband, Wireless, Satellite, Cellular, Content, Cable and Broadcasting, and 3G Technology markets.
Commenting on the event, Leonid Reiman notes: "The development of broadband communication networks is one of the top priority tasks of the communication and information industry in Russia. We believe that broadband multimedia communication networks will become the basis for next generation networks, accelerate information and Internet development, and create a civilised, investment-attractive market environment for operators, producers and suppliers of equipment.
"I am confident that this Conference will become an important step in the development of prospective communication networks in Russia and outline specific measures for their implementation."
Conference themes reflect a wide spectrum of topic areas, and include:
*  Regional and international overview of key ICT broadband investment opportunities in Russia and the CIS
*  Overview of commercial and residential trends, demand and penetration rates for potential broadband access
*  Introducing commercial and residential broadband wireless networks and technologies to Russia and the CIS
*  Overview of financial, legal and fiscal framework for investment into Russia's broadband sector and the CIS.
Sponsors for the event include Motorola, Rostelecom, New Skies Satellites, and ViaSat.  Supporting Organisations include: CWTA, WiMAX Forum, Georgian National Communications Commission, Global VSAT Forum, ESOA, Enforta, and J'son & Partner.
Details:  vp@ite-exhibitions.com 

Broadband Europe, 12-14 December, Bordeaux

Broadband Europe aims to be the forum that unites academia and industry in discussion and display of the latest (and future) broadband components, products, systems, and services. The event was established in response to the overwhelming demand from the European broadband community for an annual gathering within which to present key developments.
Over the past few years, broadband has seen one of the most rapid rates of growth and adoption ever recorded in electronics or communications -- faster even than mobile communications.
Introduction of broadband entails multi-disciplinary aspects (technology, socio-economic, regulatory, content delivery, security and standardisation). This multi-disciplinary character of broadband is recognised in the BREAD project (Broadband for All in Europe: a multi-disciplinary approach), a Co-ordination Action started on 1 January 2004 within the 'Broadband for All' strategic objective of the European Commission's FP6 framework.
The BREAD project has -- amongst its objectives -- to develop a holistic vision encompassing technical as well as economic and regulatory aspects. Another important aspect is to identify roadblocks at European, national and regional levels, and to share visions and best practices at both national and EU level.
In December 2004, the BREAD-project initiated Broadband Europe, bringing together on an international level all the broadband players, researchers, service providers, content providers, operators, manufacturers, policy makers, standardisation bodies, and professional organisations.
As well as an extensive exhibition, and additional visitor attractions such as the Application Village -- whose theme for 2005 is Broadband Gaming -- Broadband Europe offers an authoritative conference agenda, featuring exclusive scientific presentations, broadband project research results, semi-scientific commercial presentations (including agencies, municipalities on trials and roll-outs on broadband) and commercial presentations from companies on broadband products and services. 
Speakers at the 2005 conference will include representatives from the European Commission, government agencies, standardisation bodies, professional organisations, industry (world-wide), R&D centres and academia.
To ensure the conference addresses the most current issues, with the highest standard of presentations, a committee of specialists has been formed and meets regularly to agree topics and adjudicate papers for inclusion in the programme.
Details: www.broadbandeurope.co.uk

New technology is driving growth in the cable sector. As of late 2004, more than 11 million households and businesses worldwide had signed up for cable telephony services, and new VoIP technologies are offering cable service providers an increasingly convincing economic argument for voice services, says Andrew Beutmueller

The research firm Point Topic recently surveyed voice over broadband providers alongside 'pure play' Internet-only services like Skype. The survey revealed that Yahoo Broadband in Japan controls about 80 per cent of worldwide VoIP subscribers (over 4 million). The other five leading operators surveyed accounted for fewer than 1 million subscribers between them.
Broadband ISPs like Yahoo Japan still dominate the VoIP business in terms of subscriber numbers. But the trend in 2004 was toward VoIP growth as part of a broadband multi-service operator's (MSO) triple-play strategy -- along the lines of France's 'Free', Italy's FastWeb or Cablevision in North America. During the same period, pure-play VoIP providers like Vonage became less important to the overall picture than they were 12 months earlier.
Meanwhile, market research by In-Stat forecasts 14 million new cable VoIP customers by the end of 2005. And, as more cable companies invest in VoIP over the next few years, worldwide cable telephony penetration is projected to exceed 22 million by year-end 2008.

Cable big in Scandinavia

In the past two years, Western Europe has enjoyed particularly remarkable growth in cable penetration. According to Forrester Research, more than 40 per cent of all households in Europe will be broadband cable subscribers by 2010. The biggest adoption rates will be in Holland and across Scandinavia, with Sweden's cable penetration rate the highest at 63.5 per cent across the three dominant Swedish cable operators: Com Hem AB, UPC and Kabelvision.
Com Hem leads the market, offering service to one third of all Swedish households. The company recently decided to further leverage this leading position by adding voice over IP services to round out its triple-play strategy.
The time was right, according to Ole Nygaard, Business Development Manager Fixed Networks Scandinavia for Siemens: "There is a lot of fibre in the ground in Sweden, which allowed a number of service providers like B2 to encroach on Com Hem's leading market share."
In fact, when Com Hem began offering voice services, both new and traditional service providers started offering digital TV services via fibre to the home and xDSL.
 "Com Hem expected this to happen and decided to offer voice services in order to add value to their current offering," says Nygaard. "Part of the package is a subscription to three services, for which subscribers pay only a fixed fee for two services and you get one bill."

The technology

Com Hem's technological strategy required a softswitch supporting a state-of-the-art next generation IP telephony network solution that would be integrated into its existing IP infrastructure, as well as into its billing system. The solution also included a trunk media gateway that will establish the transit link from the IP network to the traditional PSTN.
The voice-over-cable solution comprises the Siemens Surpass hiQ 8000 Softswitch as a call management server, the Surpass hiG 1200 Trunking Gateway, an announcement and voice mail system, billing mediation and a VoIP cable modem. The softswitch, the gateway and the cable modem are all compliant with the EuroPacketCable interoperability standard.
This platform offers Com Hem a streamlined architecture featuring one operating system and standard, modular code that enhances the ability to develop and deploy innovative services more quickly and reliably. It is furthermore designed with separate control elements (signalling, bearer, and call and service) to enable faster, more cost-effective feature creation and deployment -- as well as being built with commercial components, thus simplifying integration into multi-vendor environments.

Hardware independence

Compatible with Solaris and Linux operating systems, the solution that Com Hem is implementing is designed with state-of-the-art software tools to enhance its adaptability. Hence, the softswitch provides multiple integrated functions including Call Management Server (CMS), Media Gateway Controller (MGC) and Signalling Gateway (SG).
The softswitch, at the heart of the installation, enables self-provisioning for subscribers and lower operating costs through Subscriber Self-Care (iSSC) toolkit. With the iSSC, Com Hem can customise and brand web pages that integrate directly into the existing web portals. It also enables subscribers to add, change or discontinue their services. Through an XML/SOAP interface, the iSSC communicates directly with SURPASS hiQ 8000 to enable rapid response to subscribers and streamlined operations.
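As a rough, illustrative sketch of the XML/SOAP pattern described above -- not the documented SURPASS interface -- the following Python snippet posts a hand-built SOAP envelope to a hypothetical self-care endpoint. The host name, namespace and 'AddService' operation are all invented for illustration.

```python
# Hypothetical sketch only: posting a SOAP request to a subscriber
# self-care endpoint. The host, namespace and 'AddService' operation
# are invented; the real SURPASS hiQ 8000 interface is not shown here.
import http.client

ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <AddService xmlns="urn:example:selfcare">
      <SubscriberId>46-8-1234567</SubscriberId>
      <Service>voicemail</Service>
    </AddService>
  </soap:Body>
</soap:Envelope>"""

conn = http.client.HTTPConnection("selfcare.example.net", 8080)
conn.request("POST", "/issc", body=ENVELOPE.encode("utf-8"),
             headers={"Content-Type": "text/xml; charset=utf-8",
                      "SOAPAction": "urn:example:selfcare#AddService"})
reply = conn.getresponse()
print(reply.status, reply.read()[:200])  # provisioning acknowledgement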
This strategy gives Com Hem the ability to deploy an integrated multi-service communications platform at lower OPEX and CAPEX; this in turn allows aggressive service price discounting without sacrificing margins. Moreover, IP supports unique value-added features like integrated voice mail and e-mail messaging and web-based, real-time provisioning of additional services without rewiring or truck rolls.
 "The customer uptake has been above expectation," says Petter Östlund, Business Area Manager Telephony at Com Hem, "and the best indication of this success is the customer's response of experiencing a similar service level as traditional POTS but at much lower prices."

Easy fit

"Most importantly," explains Nygaard, "the solution is designed to support both public and private operating environments with enough flexibility to enable voice over broadband, voice over IP and voice over cable applications. It provides a comprehensive suite of voice features for both residential and business applications on any IP access infrastructure and, in addition to supporting industry-standard services, it also enables next generation or enhanced services and real-time multimedia services. This includes features such as click to dial, personal call management, conferencing, video calling and others that enterprise customers are turning to for enhanced productivity."                      n       

Andrew Beutmueller is a freelance communications writer and consultant

Alan Robinson explains the technical concept of IPTV and why testing a service properly could mean the difference between happy or disgruntled viewers

By now, if you haven't heard of Triple Play or IPTV, there's a fair chance you've been living under a rock. Every major and almost every minor service provider seems to have either announced plans for IPTV deployments or is in the process of drawing them up.
With the average subscriber to a digital satellite service spending over €500 a year, it's easy to see why traditional wireline operators are anxious to earn revenue figures like these from their existing subscriber base. In addition, video-on-demand and pay-per-view services can as much as double these average revenue figures, so the financial rewards are obviously there. High-speed Internet connections, by contrast, appeal mainly to the technically literate, so there is a limit to how many broadband Internet products operators can sell. Television, on the other hand, has almost universal appeal, so the market is far larger.
But before service providers' eyes glaze over and they start drooling thinking about the revenue, there are quite a number of technical difficulties to resolve before the money starts rolling in. And to underline that, there have been a number of widely publicised trials that failed to turn into deployments, including a number of delays in deployment of over twelve months. These have been very costly, not only in terms of loss of face but also in crude financial terms, too.
So what can go wrong? Well, put simply, lots of things. Firstly, the DSL model used by operators is one of high-bandwidth, which is good for IPTV but with the downside of contention (ie, lots of users all using the same bandwidth). Additionally, very little about Quality of Service is mentioned in terms of broadband products and for a good reason; usually, there isn't any. IP is inherently unreliable but TCP helpfully handles problems like mis-ordered or missing packets, while latency and jitter aren't usually obvious to a web, e-mail or P2P user. But for an IPTV user, the result can be detrimental to their viewing enjoyment.
We're also a pretty tolerant bunch when it comes to IP over DSL. If our web page is a little sluggish, we don't really get worked up about it but we have zero tolerance when we're watching TV. With the advent of cable TV systems, digital satellites, DVDs, etc, we expect our viewing to be, well, picture perfect. Small artefacts or frozen frames are simply taboo.
Because MPEG frames (the encoding scheme for most IPTV solutions) are larger than Ethernet frames, losing a single Ethernet packet may mean that none of the MPEG frame can be displayed. It may be part of an advert that isn't displayed but what if it's the split second that decides if the winning goal was onside or offside? You can wear out your viewers' patience pretty quickly in those cases.
And just in case you thought things couldn't get worse, research has shown that most people don't report technical problems with their viewing enjoyment; they simply lose patience and churn to another provider and that's a very expensive event for the operators. They've spent considerable money getting the subscriber added to their service in the first place and now that investment's wasted.
Most analysts would agree that the key thing for operators to do in order to keep revenue high is to keep churn rates down. There will always be a residual number of users who churn but, outside of this group, the quality of experience a user receives is a key deciding factor in staying with a service or migrating to another supplier.

One point that absolutely can't be stressed too much is to ensure that the test methodology is correct. Up to now, telecommunications equipment has typically been tested on the 'blast' basis: fire in wire-rate 64-byte packets and, if they all get through to the far side, you're good to go. This is far too simplistic for the modern network, where users are only interested in how their real-world applications (web, e-mail, IPTV, VoIP) run on the network, and don't care how contrived data sets that only ever occur in labs perform. So make sure you're running real-world data. Don't simulate the traffic; emulate the traffic -- and there's a key difference here.
Another drawback to earlier testing methodologies has been the generalisation problem. Because it's difficult to measure large numbers of statistics from different endpoints simultaneously -- and also to represent these figures visually in a way that makes it easy to see where problems lie -- the temptation has been to take one large set of measurements and report an average. Short of not testing at all, nothing could do more to encourage churn. The problem is that if relatively few data points indicate poor service, they get lost in the overall noise of the good service. The danger, and we've seen this in real deployments, is that there is always a group of users who get poor service. In other words, the lowest ten per cent of users get blocky video, frozen frames or too lengthy a delay when changing channels. These users churn and are replaced by other users, formerly from the 'good' group. The operator's monthly churn rate is high and no one knows why.
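To make the averaging trap concrete, here is a toy calculation (invented numbers) in which 90 viewers see a 0.2-second channel-change delay and 10 viewers see 4 seconds: the mean looks healthy, while the 95th percentile exposes the unhappy tail.

```python
# Toy illustration of the generalisation problem: a respectable average
# can hide a tail of viewers getting poor service. Numbers are invented.
import statistics

delays = [0.2] * 90 + [4.0] * 10   # seconds; 10% of viewers badly served

mean = statistics.mean(delays)
p95 = sorted(delays)[int(0.95 * len(delays))]   # crude 95th percentile

print(f"mean delay:      {mean:.2f} s")  # 0.58 s -- looks fine
print(f"95th percentile: {p95:.2f} s")   # 4.00 s -- the churn risk
```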
In the remainder of this article, we'll take a look at the types of things that can go wrong and how testing can make sure that they stay in the lab and not in the network. But first it's probably worthwhile familiarising ourselves with the type of traffic that we can expect on IPTV-enabled networks.
In most networks, IPTV delivery relies on IGMP (Internet Group Management Protocol), which enables multicasting (point to multi-point). Put very simply, a server (in this case a video server) transmits each separate TV channel as a single multicast stream with a special type of destination IP address. If a viewer wants to receive this stream, their equipment sends an IGMP 'Join' message containing this special destination IP address. The network infrastructure then directs a copy of this stream to the user's Set-Top Box (STB). This is a very efficient use of bandwidth, because it doesn't matter how many 'viewers' of the stream there are; a single copy is sent through the infrastructure and is split into multiple copies only where it needs to be. The network infrastructure effectively also keeps a count of the number of viewers. As viewers issue IGMP 'Leave' messages (ie, they change channels and no longer view this one), the count is decremented. If it reaches zero, then this portion of the IPTV network can elect not to split and transfer the packets, thereby further reducing bandwidth requirements. Compare this with a unicast (point-to-point) mechanism, in which each viewer would need a separate connection to the video server. This simply wouldn't scale.
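The join mechanics are visible even at the socket level. The minimal sketch below, assuming an illustrative group address and port (a real STB learns these from the channel line-up), asks the local IP stack to emit an IGMP membership report and then reads the first stream packet that arrives.

```python
# Minimal sketch of an IGMP 'Join': subscribe a UDP socket to a
# multicast group. Group address and port are illustrative only.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004   # assumed channel address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP makes the stack send the IGMP membership report
# ('Join'); dropping membership (or closing the socket) is the 'Leave'.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                   socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

packet, sender = sock.recvfrom(2048)   # first packet of the video stream
print(f"received {len(packet)} bytes from {sender}")
```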
When a viewer wants to change TV channel, their STB issues an IGMP 'Leave' message for the TV channel it no longer wants to receive and a 'Join' message for the TV channel it has now switched to.
Usually, video is transmitted as an MPEG-2 stream. (More recent advances in compression have resulted in MPEG-4 being more widely deployed, and this trend will likely continue.) An MPEG-2 stream consists of a number of different types of frames. Greatly simplifying: in order to provide compression, each frame is effectively a compressed difference to, or delta of, the previous frame. However, if the initial frame or one of the deltas were lost for any reason, it would be impossible to decode the picture. To overcome this, MPEG-2 calls for an 'I-frame' to be inserted at regular intervals (usually about 15 frames or fewer apart). This is a full picture with no dependencies on any previously received (and potentially lost) frames. If a user 'joins' the stream, they have to wait for the next I-frame before a picture can be rendered on the television. MPEG is usually encoded at between 25 and 30 frames per second, so it could take up to half a second before the TV can display an image. Using a larger number of incremental frames between I-frames reduces the bandwidth required to send a video stream, but at the cost of a potentially longer wait to change channels or, more generally, for the stream to recover its integrity (ie, display a perfect picture) after an error has occurred.
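The worst-case wait for a picture follows directly from the distance between I-frames and the frame rate, as this back-of-the-envelope sketch of the figures quoted above shows.

```python
# Back-of-the-envelope: worst-case wait for the next I-frame after a
# channel change, using the figures quoted in the text.
frames_between_i_frames = 15        # typical GOP length cited above

for fps in (25, 30):
    wait = frames_between_i_frames / fps
    print(f"{fps} fps: up to {wait:.2f} s before a picture appears")
# 30 fps -> 0.50 s ('up to half a second'); 25 fps -> 0.60 s
```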
Because the size of the MPEG frames is much larger than Ethernet packets, a single MPEG frame has to be carried in multiple Ethernet packets. If one of these is lost, then the MPEG frame may not decompress correctly or fully. Subsequent frames depend on this frame until the next I-frame is received, so it's clear that a missing, corrupt or mis-ordered frame can have far-reaching consequences.
Traditional testing involves 'blasting' the system with packets. This isn't sufficient for testing IPTV systems. Packet loss can be measured by these techniques but not the effect of the packet loss, as some packets' loss may be more acutely felt than others. One of the most common reasons for loss in these types of networks is 'interference' from other IP traffic. If the kids are upstairs gaming or downloading MP3s, the detrimental effect on the quality that can be obtained on the television downstairs is difficult to overestimate. Modern test solutions give access to real live traffic (like web, P2P or e-mail) that can be mixed with IPTV to see if any losses occur as the traditional IP traffic's volume is increased.

Zap rate

We've already seen that it takes some time for a valid picture to appear on the television when an IGMP stream is 'joined'. In addition, we've noted that the network infrastructure only has to split and send streams that one or more STBs are watching. So, if there are no current viewers for a stream, the network may have to go a number of hops before it can find a place where the split can take place. The zap rate measures how long it takes to change channels on the TV and have a valid picture to watch. It's surprising how critical a measure it is for the viewing public. With the growth in the number of channels provided in even the most basic packaged offerings, perhaps it's not too difficult to understand why. Zapping through the channels looking for something worth watching is a common enough occurrence in most homes, and the time each change takes dictates how long it takes to find something more interesting to watch.
Modern test systems allow the user to measure the zap rate by joining and leaving channels in a controlled manner and collating the statistics for each individual viewer over long periods of time. Analysing all of this data will allow the tester to determine if any viewer's 'zap rate' went outside of acceptable bounds.
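In outline, such a measurement is just a timed loop around the join. The sketch below shows the shape of it; join_channel and first_frame_decoded are placeholders standing in for a real test tool's primitives, and the sleep merely simulates the wait.

```python
# Shape of a per-viewer zap-rate measurement: time from issuing the
# IGMP Join to the first decodable frame. join_channel() and
# first_frame_decoded() are placeholders for a real test tool's
# primitives; the sleep stands in for the actual wait.
import random
import time

def measure_zap(viewer_id: int, channel: str) -> float:
    start = time.monotonic()
    # join_channel(viewer_id, channel)    # issue the IGMP Join
    # first_frame_decoded(viewer_id)      # block until an I-frame renders
    time.sleep(random.uniform(0.1, 0.8))  # simulated wait for the demo
    return time.monotonic() - start

zap_times = [measure_zap(v, "239.1.1.1") for v in range(20)]
print(f"worst zap time across viewers: {max(zap_times):.2f} s")
```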
Suppose there's a big match on. The half-time whistle blows -- what do we do? Lots of people start changing channel. They may want to see something more interesting than the usual collection of talking heads offering half-time punditry, or catch the scores from another match on another channel. Whatever the reason, the number who switch channels at this point produces a huge spike. This can stress an IPTV network terribly as it goes from a steady state of long-term viewing to a huge series of changes. A modern test system can allow the operator to generate these half-time scenarios, where a steady state is interrupted with large numbers of channel change requests to stress the infrastructure. Again, it's vital to measure the effect on a per-user basis, or the bad service effects can be averaged out and lost.
These are just a few of the major 'must-check' problems that should appear on any operator's checklist before deploying. Unfortunately, there are quite a few more, and we haven't even touched on perceptual video quality or, more worryingly, the whole new raft of security issues and problems that IPTV has created. It's probably fair to say that IGMP was built with little or no thought given to security issues, the potential for fraud or denial-of-service threats. That's a whole other area of testing that is only now beginning to take place, so get testing and don't leave it to the viewers!

Alan Robinson is Chief Executive and co-founder of Shenick Network Systems. He can be contacted via tel: +353-1-236 7002
e-mail: alan.robinson@shenick.com


New technologies are enabling the evolution of 3G mobile networks to support broadband connection speeds. This opens up new opportunities for mobile operators to address the all-important broadband market, not only among their traditional mobile subscribers, but also among fixed-line business and residential customers -- especially in areas not currently served by wireline broadband, explains Andrei Dulski

The Internet revolution has created huge demand for broadband access around the world, and has made it one of the fastest-growing telecom services -- with an estimated 175 million subscriptions worldwide today and an anticipated growth rate of more than 20 per cent per year over the coming years.
The residential broadband market in Western Europe has exploded in recent years, driven by falling prices and a strong supplier push. Forrester Research predicts that 41 per cent of European households (72 million households) will have broadband access by 2010.
This rapid take-up of broadband is good news for operators and online businesses. On average, broadband users spend twice as long on line as dial-up Internet users and, more importantly, they spend more -- being more likely to shop or bank online -- according to Forrester. The most popular activities are e-mailing, researching and booking travel and holidays, and downloading music.
In parallel with the growth in broadband, laptops represent the fastest-growing PC segment -- indicating a strong end-user desire for mobility -- while 3G mobile networks are being rolled out on all continents. There are now 82 WCDMA networks serving customers in 37 countries, according to the Global Mobile Suppliers Association (GSA). By the end of July 2005, there were approximately 31 million WCDMA subscribers, with two million being added each month, according to Informa Telecoms & Media.
Despite these high growth rates, the European Commission (EC) is concerned that many Europeans may miss out on the broadband revolution. In a recent statement, the EC said that as many as five million would-be users could still be without access to broadband services in 2013.
With affordable broadband access critical to the EC i2010 strategy for boosting economic growth and jobs, the Commission is focusing on how to get the technology to rural, remote and other under-served areas. According to the EC, around 15 per cent of the population is currently excluded from broadband deployment in the relatively advanced EU15 countries -- this figure is likely to be higher in the ten new accession states. What's more, broadband service coverage is concentrated in built-up areas: only 62 per cent of the rural population in the EU15 countries has access to broadband, compared with 90 per cent access in urban areas.
Evolved 3G networks could hold the key to bridging this digital divide.

Supercharging 3G

With 3G technology, it is possible to bring broadband connectivity everywhere, for everyone, over wireless networks. Mobile broadband provides fast, always-on Internet access to laptop users wherever they are, at a cost comparable to that of fixed services.
New developments in 3G mobile technology will enable mobile broadband services to be delivered with bit-rates comparable to ADSL (Asymmetric Digital Subscriber Line). Such services will help enterprises increase productivity by making all kinds of information and applications available wherever and whenever mobile workers need them. Consumers will be able to access Internet-based information and entertainment services wherever they desire.
3G already offers data speeds similar to some fixed broadband access service offerings. With evolution towards HSDPA (High Speed Downlink Packet Access) for WCDMA, we shall see peak downstream data speeds of 14Mbit/s, mainly through software upgrades to existing networks. What takes minutes to download today takes seconds with evolved 3G networks. For example, at 2Mbit/s, a one-minute audio file takes four seconds to download, while a 200KB presentation file takes less than two seconds, and a 30KB JPEG image takes just a fraction of a second.
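Those download figures are simple arithmetic -- file size in bits divided by line rate -- as this quick sketch reproduces (assuming the one-minute audio file is roughly 1MB).

```python
# Download time = file size in bits / line rate in bits per second.
# The one-minute audio file is assumed to be roughly 1 MB.
def download_seconds(size_bytes: float, rate_bit_s: float) -> float:
    return size_bytes * 8 / rate_bit_s

RATE = 2_000_000  # 2 Mbit/s
print(download_seconds(1_000_000, RATE))  # ~1 MB audio file: 4.0 s
print(download_seconds(200_000, RATE))    # 200 KB presentation: 0.8 s
print(download_seconds(30_000, RATE))     # 30 KB JPEG: 0.12 s
```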
The next 3G development, HSUPA (High Speed Uplink Packet Access) will make broadband data rates available in the other direction -- enabling fast e-mail and file uploads and high-quality videoconferencing. In May, Ericsson and 3 Scandinavia demonstrated HSUPA for the first time in a live WCDMA network based on commercial products. 
The first commercial mobile broadband services in Europe based on HSDPA are expected in 2006.
Ericsson believes the first user terminals for HSDPA will be PC cards. This is natural, since PC cards are easy to implement and give consumers high-speed Internet access from their laptops wherever they are. Initial HSDPA PC cards, supporting data rates of up to 3.6Mbit/s, will be available in the second half of 2005. Early 2006 will see the introduction of HSDPA into smart-phones, as well as the availability of second-generation PC cards. We are likely to see HSDPA integrated into laptops in 2007.

Extending the reach of mobile services

The significant increases in data rates and system capacity enabled by HSDPA and HSUPA mean mobile operators can expand their service offerings, even to include low-revenue-per-megabyte services. Such broadband services already have proven mass-market appeal in both the enterprise and consumer segments, and offer significant new revenue potential to mobile operators.
In the past, technical limitations have restricted capacity and pushed up costs. This has led mobile operators to focus their attention on a variety of data services that generate high revenue per megabyte of traffic. Evolved 3G networks with their broadband data rates and improved capacity will greatly reduce the cost of delivering data services.
A key attraction of 3G is its ability to mix services of different characteristics -- helping operators ensure that spectrum and network capacity are always being used profitably.
Mobile broadband successfully creates permanent 'background' data traffic that uses operator assets to generate revenue. Prioritisation mechanisms provide operators with the means to allocate capacity instantaneously to high-revenue-per-megabyte services, such as voice, while also ensuring that spare capacity in the network is utilised.
Mobile operators will be able to expand their broadband offerings in the same way as fixed operators have. A range of add-on services that are directly linked to the broadband service can be used to generate additional subscription-based revenue. These services might include e-mail, web hosting, virus protection and network security. For small and medium-sized enterprises, operators can offer hosted and managed solutions, including the security and authentication services needed for accessing corporate networks, for example.
In all likelihood, the fastest growing, most price-tolerant -- and most profitable -- segment for mobile broadband services will be corporate users. Mobile broadband is seen to be an important driver of productivity: IT departments are keen to have access to the higher speeds that will enable workers to access corporate networks wherever and whenever necessary. Ericsson expects to see a wide range of industries adopt mobile broadband for their workforces.
While the corporate segment values mobility, it also welcomes flat-rate pricing schemes that enable costs to be predicted and managed. In a recent study performed in Sweden, it was found that Swedish IT and manufacturing companies are willing to pay more than EUR660 per employee per year for mobile broadband services.
Consumers also want mobility, but their need is not as pronounced as that of enterprises.
The increasing popularity of laptop computers will fuel demand for mobile broadband. There is already considerable interest in the short-range micro-mobility enabled by technologies such as WiFi. Mobile broadband will transform this micro-mobility into true mobility, relying on 3G for connectivity in every situation -- whether at home, on the move, indoors or outdoors.
Consumers are also less willing to pay a significant mobility premium. But some segments -- such as one-person households, students and mobility-oriented consumers -- will appreciate the mobile operator's ability to offer bundled mobile telephony and broadband service packages. Combined offerings of this kind, which address an individual's total communication needs, will also enable mobile operators to seek a bundling premium -- justified by the increased simplicity, ease of use and combined invoices that appeal to consumers.

Fixed wireless broadband over 3G

Evolved 3G networks will not only enable mobile operators to offer enhanced mobile broadband services to enterprise and consumer customers. They will also enable operators to address the 'traditional' residential broadband market using 3G fixed cellular terminals.
As far as the residential customer is concerned, there is no difference between broadband access delivered over copper and broadband access delivered over a cellular network. The fixed wireless broadband access service -- terminated in stationary customer premises equipment (CPE), or a 3G modem -- is comparable to any conventional broadband service based on DSL or cable.
Because they are stationary, and not mobile, fixed 3G modems can deliver high output power and be equipped with high-gain antennas to significantly improve bit-rates and radio efficiency in both the uplink and downlink over wide areas.
Today, many fixed network operators have great difficulty delivering ADSL services in rural areas, where the length of the local loop means DSL services cannot be supported, and the cost of bringing fibre closer to the customer is prohibitive.
By contrast, mobile operators relying on 3G can successfully deliver broadband connectivity services with high bit rates over wide service areas. In rural areas, this approach might actually be the only cost-effective solution to offering broadband services.
A similar approach can be used in urban and suburban areas to deliver broadband to the home or to the small office. High site density makes it easy to handle capacity limitations using additional carriers or increased antenna sectorisation. In this way, operators can provide high bit-rates even when usage and service penetration levels are high. The main benefit over traditional broadband service provision is the avoidance of the existing copper network with its associated high maintenance costs, as reflected by the current local loop unbundling fees and, in some cases, insufficient quality for providing DSL services.
An important area to be considered when evolving 3G networks is the 'second-mile' transmission network. Mobile operators can enhance the business case for mobile broadband services by leveraging existing transmission infrastructure and upgrading it in a targeted, cost-effective way as capacity needs increase. Here, microwave transmission solutions offer operators a great deal of flexibility in meeting their needs for capacity enhancement and traffic aggregation.
Ericsson's view is that mobile broadband services provide 'high-quality' revenue potential, particularly as broadband is an already proven, mass-market service with low associated risk. Moreover, 3G mobile operators are in a unique position of being able to extend the appeal of broadband services for consumers and to create a whole new market among enterprises and residential users -- even in remote and rural areas.

Andrei Dulski is Marketing Manager, Radio Networks, Ericsson, and can be contacted via tel: +46 8 719 0000  www.ericsson.com

Joe Barrett considers some of the critical elements in the mobile broadband access deployment decision making process, and suggests various technological and commercial validation criteria that could be used

Technology innovation goes on unabated as the wireless industry continues to push the boundaries forward into the 21st century. While expectations are high and new growth opportunities are being identified, market hype needs to be tempered with reality. GSM has been a remarkable success story for cellular voice, and 3G is taking mobile service levels into new areas with low cost voice and narrowband data services.
However, there is still a clear service gap between what current technologies can deliver and the market opportunity for new mobile broadband access solutions. Progress is being made towards closing this access service gap, and technologies such as WiMAX are offered as future solutions. Industry opinion is that growth will come from converged service delivery and/or triple play -- the convergence of voice, data and broadcast service delivery built around a single transport technology, IP. Additionally, the level of mobile data consumption is expected to reach Gigabyte proportions in the next few years.
For this reason industry pundits are now looking at a range of technologies -- including OFDM-based wireless and mobile technologies -- to bridge the divide.
With multiple standards bodies and industry forums proposing numerous solutions, it can become difficult to sort the wheat from the chaff. Decision makers need to fully understand the facts and comprehend the key criteria that should be part of the validation process for any proposed mobile broadband technology.
The computing world is different from the voice world. TCP/IP is the dominant internetworking transport protocol and any technology needs to be able to transparently support it and other LAN based transport protocols. If it doesn't, the effective performance of the mobile broadband network will fall short on delivering an optimum customer experience.
Various criteria for the evaluation and viability of different access solutions are regularly proposed as competitive advantages by companies promoting different radio access technologies. Some, it is fair to say, are more valid and helpful than others.
Validation criteria that are regularly used include peak data rate and bits/sec/Hz. While all are individually relevant to enable the delivery of a unique customer experience, there are additional criteria that are equally, if not more important to ensure a high level of mobile service delivery. These are sustainable sector throughput, cell edge data rates, state transition times and latency.

Peak data rate

The headline rate is often quoted by vendors as an indication of a technology's 'throughput' capabilities. Peak data rate is the maximum data rate achievable under ideal radio conditions and is normally only possible close to the centre of the cell site. Further away from the centre of the cell, and in weaker radio coverage areas such as indoors or deep fading situations, data rates will be lower. In addition, it is often not understood that quoted data rates are 'sector' rates and not the actual data rates that will be experienced by multiple users. Since the peak data rate is only actually deliverable in 5-10 per cent of the cell area, for one active subscriber, it is not a justifiable metric for assessing the viability of any technology. Peak data rates are a measure of maximum performance, but do not have the greatest impact on individual or average subscriber experience.
A technology that can deliver a 1Mbps peak data rate using 1MHz of bandwidth has a spectral efficiency of 1bit/sec/Hz. Likewise, if the data rate were increased to 3Mbps, the spectral efficiency would increase to 3bits/sec/Hz. Peak data rate claims are often quoted on the basis of all the sector capacity being delivered to a single user with full buffers under ideal radio conditions. This can create an unrealistic view of how efficiently a technology actually performs under normal load in a commercially deployed network. With some technologies, spectral efficiency actually declines as the network is loaded, because increased interference results in an increase in latency. Another fallacy is to compare bits/sec/Hz between fixed wireless and mobile technology solutions regardless of spectrum bandwidth usage.
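The spectral-efficiency arithmetic above is simply data rate divided by channel bandwidth, as this short sketch confirms.

```python
# Spectral efficiency = data rate / channel bandwidth.
def spectral_efficiency(rate_bit_s: float, bandwidth_hz: float) -> float:
    return rate_bit_s / bandwidth_hz

print(spectral_efficiency(1e6, 1e6))  # 1 Mbps in 1 MHz -> 1.0 bit/sec/Hz
print(spectral_efficiency(3e6, 1e6))  # 3 Mbps in 1 MHz -> 3.0 bits/sec/Hz
```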
Internet access network elements are mostly based on shared capacity resources. A typical fixed ADSL subscriber with a 512kbps data service may actually be one of 200 users sharing a 2Mbps capacity link -- a 50:1 over-subscription ratio. Fixed operators commonly use a 10:1 over-subscription ratio for DSL or cable services. So at busy times, actual user data rates can fall below the capped 512kbps as more users access the shared resource.
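The effect of that contention on the per-user floor is equally easy to make concrete: if all 200 users in the example above are active at once, the shared 2Mbps link leaves each with a small fraction of the headline rate.

```python
# Worst-case per-user share of the contended link from the example:
# 200 subscribers on 512 kbit/s plans sharing a 2 Mbit/s link.
link_rate = 2_000_000            # 2 Mbit/s shared link
users = 200

ratio = users * 512_000 / link_rate
share = link_rate / users
print(f"over-subscription ratio: {ratio:.0f}:1")                 # ~51:1
print(f"per-user floor if all active: {share/1000:.0f} kbit/s")  # 10 kbit/s
```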
Maximum throughput would be available when there is less demand for the bandwidth. It is the same with many radio access networks (e.g. EDGE, FLASH-OFDM, EV-DO). The cell or sector data rate is a shared resource that is used by all active users in the cell at any one time. The difference is that users are often moving, so actual user data rates will vary across the cell area. The closer to the centre of the cell, the higher the data rates, and the further away, at the cell edge, the lower the data rates.
A key to delivering a consistently acceptable customer experience is therefore to ensure that the average Sustainable Sector Capacity is as high as possible.
As already discussed, data rates are higher when subscribers are close in to the cell, and lower at the cell edge. Cell edge situations don't only occur at the cell edge; they also occur indoors and in places of deep fading, especially in dense urban areas. Cell edge situations can typically occur in up to 30 per cent of the cell coverage area. Ensuring high data rates at the cell edge in both the downlink and uplink is therefore a critical element in maximising sustainable sector data rates and delivering a high level of customer experience.
If they are to meet the expectations and user-experience demands of subscribers, mobile broadband access technologies must provide consistently high cell-edge data rates and enable operators to deliver a mobile broadband data solution across the whole network, not just parts of it.

Data usage

Unlike connection-oriented networks, connectionless packet data networks such as the Internet do not dedicate one circuit to one user for the time it is needed. Subscribers share the network resources, and TCP/IP, the underlying transport/networking protocol, manages the end-to-end scheduling and allocation of how much of each 'pipe' each user gets.
Data usage can be a few bytes of e-mail, a download of a web page, a short transaction with a remote server or an FTP (File Transfer Protocol) download of a large file. When evaluating any data access technology, it should be remembered that browsing is still a primary service on the Internet. Supporting web services economically requires the ability to deliver a large number of short HTTP transfers per unit of time to a large subscriber population. Hence, when assessing the viability of a mobile broadband technology for delivering packet data services, it is important not to fall into the trap of simply downloading a large file of some tens or hundreds of Megabytes and measuring the time it takes.
Depending on the type of applications and services being used, transition time can have a significant impact on the user experience. For applications such as FTP, where the connection is required for a sustained period, connection-oriented access technologies such as those based on CDMA are ideal. In contrast, for short, 'bursty' data services such as simple e-mail, messaging or browsing, a short transition time -- the period it takes the system to bring the device out of the Sleep state and into the On (Send) state -- is more important.
The level of customer experience of the operator's mobile broadband data service depends directly on how quickly the system can transition subscribers in and out of the various states. In addition, the faster users can be brought in and out of the network, the more users can be supported per sector and the less battery power each individual device consumes.
One of the key concerns for mobile operators delivering a wireless data network should be network latency -- the RTT (Round Trip Time) for a data packet to travel across and back over a network.
Large network latency (delay) will impact the maximum throughput that can be achieved by a data subscriber. This is one of the key reasons that operators are looking to deploy OFDM based technologies for Mobile Broadband Access. FLASH-OFDM, for example, has a network latency of less than 50ms, and this has been validated in a fully loaded network in numerous technical and commercial operator deployments.
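The reason latency caps throughput is TCP's window mechanism: at most one window of data can be in flight per round trip, so throughput is bounded by window size divided by RTT. A quick sketch, assuming the classic 64KB window without window scaling:

```python
# TCP throughput ceiling = window size / RTT: only one window of data
# can be in flight per round trip. 64 KB is the classic maximum window
# without window scaling, used here as an illustrative assumption.
def max_tcp_throughput_bit_s(window_bytes: int, rtt_s: float) -> float:
    return window_bytes * 8 / rtt_s

WINDOW = 65_535  # bytes
for rtt_ms in (50, 200):
    mbps = max_tcp_throughput_bit_s(WINDOW, rtt_ms / 1000) / 1e6
    print(f"RTT {rtt_ms:3d} ms -> at most {mbps:.1f} Mbit/s per connection")
# 50 ms -> ~10.5 Mbit/s; 200 ms -> ~2.6 Mbit/s
```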

Active subscribers

To validate the network capacity of any radio network, traffic dimensioning and modelling needs to consider multiple applications and their respective quality of service considerations -- not just evaluate FTP downloading under full buffer conditions to a single user. Traffic load needs to be based on varying levels of VoIP demand as well as browsing -- which is the predominant Internet service usage today. This will provide a realistic evaluation for technology capacity validation purposes.
Voice performance is evaluated using Mean Opinion Scores (MOS) measuring voice quality. The 'capacity' of a voice system is then counted in terms of the number of acceptable quality calls that can be served within a certain blocking probability, measured in Erlangs. Browsing performance is measured from the subscriber's perspective, in that both the rate and latency performance at the application level for the typical web subscriber are captured, and web page latency is used as the performance benchmark.
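Blocking probability for a given channel count and offered load is conventionally computed with the Erlang B formula; the standard numerically stable recursion is short enough to sketch here, with an invented example load.

```python
# Erlang B blocking probability via the standard recursion:
# B(0, A) = 1;  B(n, A) = A*B(n-1, A) / (n + A*B(n-1, A))
def erlang_b(channels: int, offered_erlangs: float) -> float:
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_erlangs * b / (n + offered_erlangs * b)
    return b

# e.g. a sector with 30 voice channels offered 20 Erlangs of traffic:
print(f"blocking probability: {erlang_b(30, 20):.4f}")  # under 1 per cent
```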
Peak data rates do not accurately reflect a technology's ability to deliver high user capacity or a high level of customer experience when handling fairly simple transaction requests. Actual performance depends on a radio access technology's ability to support TCP. Issues such as latency, the number of Active and On users, and the transition time between states have a greater impact on performance and capacity than peak data rate.
Operators are faced with a choice between a variety of radio access technologies, some available today and others in various stages of specification work, laboratory testing or technical/market trials. The validity of each technology cannot be defined on a single attribute. Operators should consider each technology's total system-level capability, how efficiently the technology interacts with the Internet's transport protocols, the timing of full commercialised availability, and the total user experience of existing and new services. Peak data rates and bits/sec/Hz, while important, are not sufficient measures for technology validation purposes. At the end of the day, operators will base their technology decisions on economics, capacity and user experience. In short: how many happy subscribers can a system support, and at what price?
The industry has moved on from feeds-and-speeds, and is suspicious (often with some justification) of claims that are born in the lab and do not translate into the real world. Understanding how to evaluate mobile broadband technology, and so understand the services and user experience that can be delivered, must be the foundation stone of any decision-making process. In short, the market for mobile broadband access is happening now, and technologies must be capable of delivering a commercial mobile broadband access solution today.

Joe Barrett is Marketing Director EMEA, Flarion Technologies, and can be contacted via e-mail: J.Barrett@flarion.com

VoIP has finally joined the mainstream and is beginning to have a profound effect on the growth of the communications industry. But providers must have the right strategy in place if they are to clean up in the market, says Michael Fillman

Initially deployed as a free service used by Internet enthusiasts, voice over Internet protocol (VoIP) has reached the mainstream and quickly become the driving force in the new communications revolution. Over 11 million people worldwide were using a retail VoIP service for at least some of their phone calls at the end of March 2005, reports market research firm Point Topic -- a substantial increase from 4 million in mid-2004.
Growth predictions for the next several years indicate that we are clearly on the brink of rapid expansion in the VoIP market, particularly now that it is economically attractive to launch value-added services to a mass audience over an IP-based network. Research firm Infonetics predicts that in North America alone, VoIP will grow to a $20 billion industry by 2009 -- a 1431 per cent increase over 2004 revenues! This growth will not be without its challenges, among them the ability to deliver the new services -- including setup, provisioning and billing -- as cost-effectively as possible.
The emergence of VoIP is blurring the distinction between next-generation carriers, incumbent local exchange carriers (ILECs), competitive local exchange carriers (CLECs), cable providers, Internet Service Providers (ISPs), and broadband service providers. These providers are collectively creating a new marketplace that offers a wide range of IP and packet based telephony services -- including television -- over the same IP connection that offers Internet access, data, content, and video. As the IP telephony market is undergoing this profound convergence, the ability to deliver multiple forms of voice, data and content services over public and private networks is unleashing an unprecedented number of new opportunities for service providers.
VoIP's early adopters focused on the technology itself -- not the value of applying it. This effectively relegated the technology to the realm of a novelty item. Today, the low cost and improved quality of service, coupled with feature-rich applications that are difficult and expensive to deploy over traditional TDM (time division multiplexing) networks, are fuelling unprecedented growth in the communications market and making VoIP a viable alternative to the PSTN. While voice is the predominant service being delivered over IP, other telephony services such as call management, personal desktop productivity, multimedia, collaboration and even television are rapidly emerging as the 'killer apps' that will fuel mass-market adoption of XoIP -- everything over IP. To emphasise this point, we can look to British Telecom, which has launched 21CN -- an aggressive plan to move virtually its entire customer base to an IP-based network, effectively shutting off the PSTN.
Moving forward, most experts agree that VoIP will be part of the new public network where all services run over an all-IP network. As the technology continues to improve, the focus is no longer on VoIP as just an inexpensive alternative to circuit-switched voice, but rather on VoIP as a valuable enabler of emerging services and feature-rich applications, including the integration of VoIP with the desktop and PDA to provide seamless enterprise productivity.
Many service providers are driving towards being the one-stop shop for both the consumer and enterprise markets by providing local, long distance, Internet, and video services. Cable companies are striving for the triple-play and even quadruple-play angle by utilising their existing home co-ax installed base for voice, Internet access, video programming and mobile services. ISPs are continually trying to add value above and beyond access by offering video and voice services. ILECs, CLECs, and IXCs (interexchange carriers) have moved into each other's markets and are now extending their offerings with IP and video services.
In the VoIP space, ILECs are not only competing with CLECs, but with cable companies and ISPs for both consumer and enterprise dollars. Therefore, service providers need to develop competitive VoIP offerings and avoid commoditising the technology as just a cheap PSTN alternative.
The window of opportunity for success in the IP-based public network will not last forever, and service providers that do not address their business challenges today will face increased difficulty in competing in this environment tomorrow.
To win customers and traffic when competing against cable companies, IP, and traditional telephony providers, the service providers must be able to differentiate their offerings by quickly and efficiently delivering new, valuable services to the customer base. They must also find ways to encourage VoIP service sampling, and increase adoption and usage. To accomplish this, service providers should build their business infrastructure on a platform flexible enough to enable innovative business models that support multiple services. The business infrastructure that providers choose must address the challenges they are likely to face, such as exponential growth, cross-border competition, deregulation (or potential regulation in the case of VoIP), revenue leakage and integrating with emerging technologies.
While developing a VoIP strategy, service providers still need to deal with internal pressures around the drive to improve margins, lower costs, and become more efficient with existing systems. Operational complexity is a significant issue for many incumbent operators who have patched together legacy systems over the years. This is also true for providers who have grown through acquisition, inheriting multiple systems. It is not uncommon, for example, for a service provider to have five to 50 different rating and billing applications. Legacy applications often hinder a provider's ability to go to market with new products and services such as VoIP and its applications in a timely and cost-effective manner. With competitive pressures and fickle consumer needs, rapid introduction of VoIP products, services, and pricing plans is essential. Waiting for costly and time-consuming programming changes to billing and other back office systems to launch or modify new services is just not an option in the competitive marketplace.
A solution that supports VoIP should also require minimal capital expenditure compared to a traditional custom-built system handling legacy services. It should be cost-effective and integrate readily into the existing 'web' of applications within the service provider's environment.

Service differentiation is critical

To profitably address this market, providers must be able to create and manage customer accounts; develop, price, and provision IP communications and other services; and analyse customer activity. And all of this must be done in real time.
Additionally, promotions and discounts should be offered to customers in real time to increase loyalty and decrease churn. Call routing also needs to be optimised on the basis of costs from affiliate originating and terminating providers, allowing providers to manage their settlement balances with those affiliates, as the sketch below illustrates.
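As a rough illustration of cost-optimised routing, the Python sketch below picks the cheapest termination provider for a dialled number by longest-prefix match against each provider's rate deck. The carrier names and per-minute rates are entirely hypothetical.

# Minimal least-cost routing sketch: choose the cheapest termination
# provider for a dialled number by longest-prefix match on its rate deck.
# Carrier names and rates are illustrative, not real tariffs.

RATE_DECKS = {
    "carrier_a": {"44": 0.012, "4420": 0.009, "1": 0.008},
    "carrier_b": {"44": 0.011, "1": 0.010},
}

def best_route(dialled: str) -> tuple:
    """Return (provider, per-minute rate) for the cheapest longest-prefix match."""
    candidates = []
    for provider, deck in RATE_DECKS.items():
        matches = [p for p in deck if dialled.startswith(p)]
        if matches:
            longest = max(matches, key=len)          # longest prefix wins per provider
            candidates.append((provider, deck[longest]))
    if not candidates:
        raise LookupError(f"no route for {dialled}")
    return min(candidates, key=lambda c: c[1])       # cheapest rate across providers

print(best_route("442071234567"))                    # ('carrier_a', 0.009)

A production routing engine would weigh quality metrics and settlement balances alongside the raw rate, but the core selection logic is no more complicated than this.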
As IP telephony services gain popularity and widespread acceptance, the key to surviving in this market will be service differentiation and faster time to market. So how will consumer and enterprise customers see value in VoIP? The key is to offer services that users cannot get over traditional PSTN. IP telephony providers will have to offer many advanced services, like collaborative multimedia, facsimile, interactive voice response, integrated conferencing, Internet call waiting, unified messaging, voice processing, and web-enabled call centre services. These are areas where providers can truly differentiate and build value through their VoIP offerings. To support these activities, service providers will need to roll these services out as quickly as possible while being able to rapidly scale to support millions of users.
To maximise revenue per subscriber by effectively charging for multiple new services and bundled offerings, IP telephony providers will require a flexible rating engine that supports an unlimited variety of pricing structures, including subscription-based, transaction-based, multi-tier, and usage-based pricing. This is made feasible by utilising a single database that allows centralised management of subscribers, services, pricing, and usage.
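As an illustration of how a single plan definition can drive all of these pricing structures at once, consider the minimal Python sketch below; the plan layout and all figures are invented for the example.

# Minimal rating sketch: one plan definition drives subscription,
# per-transaction and tiered usage charges. All figures are illustrative.

PLAN = {
    "subscription": 10.00,                 # flat monthly fee
    "per_event": {"sms": 0.05},            # transaction-based pricing
    "usage_tiers": [                       # tiered per-minute voice pricing
        (100, 0.03),                       # first 100 minutes at 0.03
        (float("inf"), 0.02),              # remainder at 0.02
    ],
}

def rate_usage(minutes: float) -> float:
    """Apply the plan's tiered usage pricing to a total of voice minutes."""
    charge, remaining = 0.0, minutes
    for tier_size, price in PLAN["usage_tiers"]:
        used = min(remaining, tier_size)
        charge += used * price
        remaining -= used
        if remaining <= 0:
            break
    return charge

def monthly_bill(minutes: float, events: dict) -> float:
    usage = rate_usage(minutes)
    transactions = sum(PLAN["per_event"][k] * n for k, n in events.items())
    return PLAN["subscription"] + usage + transactions

print(monthly_bill(150, {"sms": 20}))      # 10.00 + (3.00 + 1.00) + 1.00 = 15.00

Because every product, tier and event price lives in one structure, launching a new bundle or promotion becomes a data change rather than a programming change -- which is exactly the agility argument made above.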
To make VoIP more than just a cheap alternative to the PSTN, service providers should avoid pricing, bundling, and billing it as a cheaper flat-rate or 'all you can eat' one-price-fits-all service. They should look beyond simple time and duration rating and billing, and instead bill for bandwidth or application usage, including value-based pricing keyed to what the content is 'worth' to the consumer at that moment, in order to extract the full value and opportunity from the VoIP market. Quality-of-service (QoS) price models, tiered pricing and charge-backs should be offered to enterprise customers for voice, PBX, and VoIP applications, and complex enterprise customer relationships should be leveraged to bundle and offer VoIP applications.
Providers should also attempt to exploit the quality-of-service revenue opportunities available via IP telephony by delivering services through standard public networks, private networks, or ultra-reliable networks with predetermined maximum delay times. This ability allows providers to bill for the customers' desired quality of service.
VoIP has finally become mainstream and is beginning to have a profound effect on the growth of the communications industry. At the same time, VoIP is being recognised as merely the tip of the iceberg for an all-IP based public network, capable of delivering a feature-rich set of applications designed to improve the way that consumers and enterprises communicate -- whether simple voice calls, collaborative video conferencing, multimedia or advanced telephony services. 
To remain competitive and excel in this industry, service providers must address the business challenges of their front and back-office applications in an IP-based world. The ability to differentiate by service, rapidly introduce new products and services, and interoperate with existing applications will be critical to the success of these service providers.

Michael Fillman is Director of Product Marketing, Portal Software, and can be contacted via tel:
+1 408-572-3862; e-mail: mfillman@portal.com

It's been said that document management technology will become as pervasive in the next decade as e-mail. So, what does this mean for organisations? David Jefferies looks at the issues

based documents in a way that could be reasonably easily retrieved when required. In the 1990s, document management elevated itself into 'workflow', where documents and their role in certain business processes became more intelligent, carrying appended information along with them, as well as being subject to business rules that did not allow documents to proceed from one stage to another unless they had been properly completed or verified. This phase of development marked the first stages of 'media-agnosticism' for the document, where paper and electronic versions of documents had to be managed within a single computing application.
Latterly, document management has become part of a bigger idea, labelled 'Enterprise Content Management', or ECM, by Gartner [1]. The research group says: "Content technologies, such as integrated document and Web content management, have coalesced into ECM product suites. And it is precisely the business process of Enterprise Content Management -- making sure the right people have the right information at the right time, controlling that access, and then ensuring the right actions are taken -- that is occupying the thoughts of senior business managers today."
In fact, document management technology will become as pervasive in the next decade as e-mail became in the 1990s, according to Gartner [2]. The analyst states: "document management will reach mass-market status in the next few years, as pressure continues to mount on organisations to store documents in a secure, central repository, and to comply with ever-stricter regulations."
So, in today's business and public sector environment, document management is no longer a simple matter of 'archive and retrieve'. Nowadays, document management is a matter of monitoring, managing and controlling the whole lifecycle of each document, embedded within the business processes that the document helps to support. During its lifecycle, a typical document will pass between several parties, usually vendor, channel and customer. It is the ability to keep track of documents and their associated information (content/customer/channel/product/etc) that enables an organisation to manage customer communications effectively.
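To make the lifecycle idea concrete, here is a minimal, purely hypothetical sketch (not any vendor's product) of a document that may only advance through its workflow states once the business rule for its current state is satisfied -- the same gating described in the workflow discussion above.

# Illustrative document-lifecycle sketch: a document advances through its
# workflow states only when the rule for its current state is satisfied.
# States and rules are hypothetical examples.

LIFECYCLE = ["drafted", "verified", "dispatched", "archived"]

RULES = {
    "drafted": lambda d: bool(d.get("content")),     # must be completed
    "verified": lambda d: bool(d.get("approver")),   # must be signed off
    "dispatched": lambda d: bool(d.get("channel")),  # must have a delivery channel
}

def advance(doc: dict) -> dict:
    """Move a document to its next lifecycle state if the current rule passes."""
    state = doc["state"]
    if state == LIFECYCLE[-1]:
        raise ValueError("document already archived")
    rule = RULES.get(state)
    if rule and not rule(doc):
        raise ValueError(f"cannot leave state '{state}': rule not met")
    doc["state"] = LIFECYCLE[LIFECYCLE.index(state) + 1]
    return doc

doc = {"state": "drafted", "content": "policy renewal letter"}
advance(doc)       # content present, so the document moves to 'verified'
# advance(doc) would now fail until an 'approver' is recorded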

Two main reasons

More specifically, there are two main reasons why business managers are preoccupied with content and document management. First are the millions that have been poured into CRM and other marketing analysis systems without, in the majority of cases, achieving the desired return on investment [3]. The overall business objective that modern document management supports is none other than that of Communications Management, especially Customer Communications Management. Content is just as important as channel when managing customer relationships through targeted communications. Second is the major burden of regulatory compliance imposed on corporates recently, notably in the form of the Sarbanes-Oxley Act, its forthcoming EU equivalent, and the recent changes to International Financial Reporting Standards (IFRS).
With these key drivers in mind, Pitney Bowes conducted research amongst Europe's top 10,000 companies, as well as a further sample of larger public sector organisations, in order to compare and contrast effective document management standards between countries and between broad industry segments.
The proportion of organisations with an effective document management strategy was found to be greatest in the UK (39 per cent) and France (38.3 per cent).  Italy (36.7 per cent), Germany (36.4 per cent) and Netherlands (35.9 per cent) made up the middle ranks, whilst Scandinavia (34.2 per cent) came surprisingly lowest of the countries researched.
Overall, the average result indicates that fewer than two fifths (38 per cent) of larger European companies have an effective document management strategy in place.
Given the earlier argument that document management has now become fundamental to both marketing ROI and regulatory compliance, these research findings would seem to suggest that European companies are storing up serious problems for themselves.
The report also looks at the split in document management standards by industry -- a picture that has two extremities. 
On the one hand, Financial Services (43.9 per cent) stands out as the clear leader. This is no doubt the result of increasing legislation designed to make the selling of financial products fully transparent. 
Telecoms & Utilities (39.3 per cent), Service Industries (38.3 per cent) and Manufacturing Industry (37.2 per cent) form the middle ranks in document management standards. However, the clear laggard in the study was the Public Sector. This is of great concern, as public sector organisations are increasingly coming under similar pressures to those felt by commercial companies. One good example is actual, or impending, Freedom of Information legislation.
Effective document management strategy and infrastructure is fundamental to two key business tasks: communicating effectively with customers to produce enduring commercial growth; and complying with the increasing tide of regulatory standards that are sweeping over organisations across the Western world.
It should therefore be of concern that our research shows just under two thirds of larger European companies do not have an effective document management strategy in place.
Inefficient marketing and CRM activity costs real money. Only by tracking and measuring customer response to different offers and products, can customer value be built. The universe of prospective customers may well diminish sharply over the next few years. All the more reason to look after the customers you have, and for that, an effective document management strategy is essential.
Inability to meet regulatory standards is a more serious issue. Sarbanes-Oxley legislation means that organisations face escalating fees [4] unless they can automate much of the audit process -- that means efficiently finding, retrieving and cross-checking documents and other content. Europe is soon to have its own Sarbanes-Oxley equivalent [5], and so this issue is relevant to every European company, not just those with a US listing. Indeed, there is a raft of other regulatory requirements in Europe, all of which rely on high standards of document management.
We should expect to see this rapidly rectified over the coming decade, whether through in-house systems improvements, or through outsourced service provision. As Gartner has said: "...document management will be to this decade what email was to the last." [6]

1 Gartner, Content Management Hype Cycle, 2004
2 Gartner, Content Management Hype Cycle, 2004
3 Gartner, Content Management Hype Cycle, 2004
4 HandySoft, Time to pull our Soxs up -- a research study into Sarbanes-Oxley compliance amongst US-listed European companies, November 2003
5 The European Commission, Directive on statutory audit of annual accounts and consolidated accounts, http://europa.eu.int/comm/internal_market/auditing/officialdocs_en.htm
6 Gartner, Content Management Hype Cycle, 2004


David Jefferies is Marketing Director, Pitney Bowes
www.pitneybowes.co.uk

10GigE shifts the burden of assuring interoperability from carriers to equipment vendors -- and quality assurance departments must ensure standards are met. Paul Fitzgerald explains

In the traditional SONET/SDH world, carriers engineered high-speed optical link parameters, such as the gain or attenuation, to achieve interoperability among different switch vendors. But 10-Gigabit Ethernet (10GigE) is turning that approach upside down.  Now it's the component and switch vendors and the carrier's QA departments that are responsible for interoperability. This is a major philosophical shift of responsibility as carrier class 10GigE switches and routers begin trying to meet five nines reliability. 
10GigE is meant to be "plug and play" -- plug it in and it should work -- without the need for involving any engineers. The 10GigE IEEE standard was tightly written in a way that assures interoperability between different pieces of Ethernet equipment manufactured by different vendors. The expected outcome from this approach will be lower deployment and system costs than traditional SONET/SDH systems.
To achieve this interoperability, all 10GigE equipment vendors are required to conform to the IEEE 802.3ae standard, especially the stressed receiver sensitivity (SRS) specification. In simplest terms, SRS requires equipment to achieve a bit error rate (BER) better than 10⁻¹² with impaired signals that mimic the worst-case allowable conditions of real-world deployment. This contrasts with SONET/SDH testing, where near-perfect signals are often used to characterise systems and margin is then added by engineers to accommodate possible conditions in the field.
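It is worth pausing on what verifying a 10⁻¹² BER actually entails. Using the standard zero-error confidence bound -- a statistical rule of thumb, not something taken from the 802.3ae text -- an error-free test at the 10GBASE-R line rate of 10.3125Gb/s must run for around five minutes, as the short Python calculation below shows.

import math

# How long must an error-free test run to claim BER < 1e-12 with 95%
# confidence? Zero observed errors in n bits gives confidence
# 1 - (1 - ber)^n ~ 1 - exp(-n * ber), so n >= ln(1/(1 - conf)) / ber.

ber_target = 1e-12
confidence = 0.95
line_rate = 10.3125e9                       # 10GBASE-R bits per second

bits_needed = math.log(1 / (1 - confidence)) / ber_target
print(f"{bits_needed:.2e} bits, i.e. {bits_needed / line_rate:.0f} s")
# -> about 3.00e+12 bits, roughly 290 seconds of error-free traffic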

What is SRS testing?

An SRS test consists of four basic impairments to an otherwise perfect optical signal -- all applied to the signal simultaneously. With the impaired signal applied, the equipment must meet the performance characteristics mandated by the 10GigE standard.
The four impairments are: 1) a poor extinction ratio; 2) horizontal timing jitter; 3) vertical (or amplitude) jitter; and 4) slowed rise and fall times. With these impairments placed on the optical signal, system performance is then judged for acceptable BER performance at low light levels combined with low optical modulation amplitude. By testing each component and system under the same conditions that could exist in the field, the risk of actually deploying a faulty or poorly performing network element is greatly reduced. Additionally, plug-and-play performance is evident and the probability of achieving vendor-blind interoperability is enhanced.
The extinction ratio is a comparison between a laser's power when at the '1' level and at the '0' level. In the SONET/SDH standard, extinction ratios are specified to be relatively high. But in the 10GigE standard, allowances were made for low-cost transmitters that have low extinction ratios for the test signals. By defining the extinction ratio of the test signal, 10GigE has eliminated a major source of test-to-test variability and reduced the need for adding margin to test specifications.
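In formula terms, using the standard optical-transmission definitions (rather than wording from the 802.3ae text itself), with P_1 and P_0 the laser's optical powers at the '1' and '0' levels:

\[
\mathrm{ER} = \frac{P_1}{P_0}, \qquad
\mathrm{ER_{dB}} = 10\log_{10}\frac{P_1}{P_0}, \qquad
\mathrm{OMA} = P_1 - P_0
\]

A low extinction ratio means the two levels sit close together, so for the same average power the receiver sees a smaller optical modulation amplitude -- precisely the combination the SRS test checks at low light levels.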
Horizontal timing jitter refers to the degradation of signals caused by poor synchronisation. The 10GigE standard mandates that switches and routers must continue to operate even when subjected to a wide range of horizontal jitter conditions.
Vertical jitter refers to the effects interfering signals have on the main signal. The SRS test includes an interfering signal that simulates noise that can creep into transmitter circuits and cause poorly designed receivers to have bit errors.
The dispersion effects of fibre and printed wiring board create slower rise and fall times for the signals. The SRS test includes a filter that simulates these effects. 
The 10GigE SRS test was developed to maximise the likelihood that systems will interoperate and be ready for plug-and-play operation.
Testing for SRS compliance begins at the optical component level, but should also be performed by transceiver manufacturers, 10GigE system vendors and carrier quality assurance (QA) engineers. QA engineers can make certain the switches and modules they are deploying into the network will 'plug and play'. Testing up front will not only ensure interoperability today, but also provide assurance that the network will deliver the quality of service customers expect.
Since zero defects is the only acceptable performance standard, "good enough" is simply not acceptable. Even though not every Ethernet switch and router will encounter the worst-case scenario posed by SRS testing, accepting even a few field failures will prevent carriers from reaching the 99.999% reliability level.
QA engineers following the Six Sigma and zero-defects movements understand how important SRS testing is to achieving smooth 10GigE deployment. With all vendors and customers testing the same way -- using worst-case conditions -- 10GigE can ramp up volumes with minimal problems.

SRS prevents future headaches

It is important to pass the SRS test because bit errors on the 10GigE link mean packets are dropped from the 1GE and 10/100 Ethernet tributary links. Those packets will require retransmission and, in the case of voice-over-IP (VoIP) or video signals, the lost packets cannot be re-sent; the end result is low-quality voice or blocks of data missing from a video display.
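The arithmetic behind this is straightforward. Assuming independent bit errors (a simplification), a frame of L bytes is lost with probability 1 - (1 - BER)^(8L), roughly 8L x BER for small error rates; the Python snippet below works through the numbers.

# Rough frame-loss arithmetic: with independent bit errors at rate BER,
# a frame of L bytes is lost with probability 1 - (1 - BER)^(8L) ~ 8*L*BER.

ber = 1e-12                    # the compliant 10GigE bit error rate
frame_bytes = 1500             # a full-size Ethernet frame
per = 1 - (1 - ber) ** (8 * frame_bytes)
print(f"~{per:.1e} per frame") # ~1.2e-08: about one lost frame in a hundred million

At the compliant BER this is negligible, but a link that misses the specification by a few orders of magnitude loses frames proportionally more often, and for voice and video those losses are unrecoverable.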
It's a relatively simple practice to ensure SRS testing from the component level to the carrier -- and it's a practice that will ensure the highest possible levels of flexibility, scalability and interoperability for the network.
Compliance to the 10GigE standard ensures the flexibility and scalability needed for future network expansions and reconfigurations. SRS testing assures interoperability, easier deployment, higher bit error quality, and higher customer satisfaction. There are even indirect benefits, such as lowering operational expenses through a more flexible network and fewer truck rolls due to faulty equipment. In the end, the service provider benefits by providing customers with the quality services they expect.

Paul Fitzgerald is director of marketing at Circadiant Systems Inc. and can be contacted via e-mail: pwfitzgerald@circadiant.com
www.circadiant.com

Competition amongst fixed line operators is still largely dominated by the local incumbents, despite regulators' efforts to free up the market. Jonathan M. Steinberg reports

majority of all domestic telecom activity was in the hands of the government-run operator.
Even when competition was introduced and governments sold some or all of their holdings in their cash-cow operators, these incumbents remained dominant because they owned the infrastructure, giving them control of pricing. They also owned key customer relationships and mind space (in consumer and business markets) and had relatively unlimited access to capital for growth and development. The only significant change that incumbents faced in their fixed-line business was the nominal splitting of their operations into wholesale, which owns the network, and retail, which sells to end users.
However, the definition of what a telecommunications company is and does has recently changed. Most incumbents have expanded their business models from their traditional fixed-line business into the newer, more competitive wireless business. Many telecom operators now also provide media and content-related services acting as Internet Service Providers (ISPs) or Application Service Providers (ASPs). As a result, no longer do operators, particularly incumbents, merely go head-to-head against other operators. They now face competition on three fronts: from mobile carriers, alternative network carriers and ISPs.
In the past, competition had minimal effect on incumbents. While mobile operators cannibalised some fixed-voice traffic, people did not simply migrate from fixed to mobile; they just talked more. Over time, most incumbents became mobile operators themselves, either through organic or inorganic growth. For example, after the Vodafone/Mannesmann acquisition, France Telecom bought Orange to become one of the dominant 'global' mobile players.
Incumbent operators were, and are, facing scrutiny from the regulators who want to ensure a telecoms-rich social service to their respective nations. While regulators have allowed incumbents to expand their business models (albeit with some restrictions), in many geographies, regulators have also driven decreases in voice prices. In some cases, this means that incumbent operators no longer enjoy all of the advantages of a usual monopoly.

Technology driven portions

Until now, the newer technology-driven portions of the expanding telecommunications arena have been filled mainly with smaller players who lack the significant access to capital and the financial security that incumbents enjoy. With the downturn in operational profits and financial stability in the telecom market over the past four to five years, most alternative operators and ISPs were unable to make truly significant dents in incumbents' business models. There have been some individual success stories on a per-company, per-country basis, such as Fastweb in Italy. In no European market, however, have these upstart challengers truly threatened the incumbent operator. But with incumbent operators beginning to challenge other incumbent rivals on their own turf -- often via financial or operational interests in alternative carriers, mobile operators or ISPs -- the competitive threat to the host incumbent operator has intensified.
The challenge for incumbents today and in the near future is to explore and develop new growth opportunities beyond their current traditional domestic voice markets. This is critical to their future success, given that traditional revenue sources are decreasing due to increasingly cheaper voice call rates, increasing costs of network maintenance and the potential need for major -- and costly -- overhauls.
The advancements and proliferation of technology are creating the challenges and business model opportunities that incumbents are increasingly facing. The telecommunications industry's transition from being primarily home-based, voice services operators to international, multi-service players has been a financial and regulatory-driven necessity. Three primary factors have encouraged this phenomenon:
1. DSL expansion and uptake.
2. Local loop unbundling and wholesale line rental.
3. New DSL-enabled service provider business models.
The third factor is effectively driven by the realisation of the first two, which are highly correlated.
With their domestic voice revenues in decline, incumbents need to look at the options for growth. These include developing new products and services and seeking new opportunities abroad.
Many incumbent operators have turned to non-voice services, such as content, to fill the revenue holes created by price decreases in voice services. However, these efforts have achieved poor to mixed results. Visual communications, for example, has been a burgeoning area of interest for operators. However, to date, DSL TV and Video on Demand (VoD) have produced uncertain short-term results, and the IP TV business model is still nebulous. Satellite TV operators are interested in partnering with DSL providers to try to penetrate city centres, which are traditionally a cable TV stronghold. However, bandwidth requirements (for operators with legacy networks) and customer premises equipment costs make the business model unclear, since the cost of layout and installation may not be worthwhile, particularly if uptake is slow.
While IP-enabled VoD has some potential, several key issues have yet to be resolved:
*  User interfaces for identifying and selecting video programmes are not easy for all customers to use.
*  Content owners are unwilling to cannibalise the maturing DVD market.
*  Content owners are also concerned about Digital Rights Management (DRM) issues, particularly data protection, as VoD services are streamed to customers with vulnerable technologies.
Therefore, while IP services-led value destruction is almost a given, there is major uncertainty whether or not content and related services will provide the means to offset traditional voice losses.

Seeking new opportunities abroad

The other sources of growth opportunities are in new markets. With local loop unbundling (LLU) enabling alternative carriers to penetrate previously incumbent-dominated markets, there is an incentive for one country's incumbent to exploit potential DSL opportunities in another country. Becoming a 'challenger incumbent' operator and/or DSL service provider in a different geography can be very attractive while not being particularly risky because:
*  Incumbent operators usually have access to large financial and capital resources for any operational or marketing investment required;
*  They can absorb any initial losses associated with start-up business cycle phases, and can withstand price wars while spending to create market awareness; and
*  They may enjoy a cost and operational advantage over non-incumbent alternative operators, due to scale and expertise in many key processes and technologies.
Many European incumbents are already seeking their fortunes outside their home markets. For example, Wanadoo (France Telecom's ISP operation) is a multi-national operation competing in Spain, the Netherlands and the UK. The ISP arm of Deutsche Telekom -- T-Online -- operates in France, Spain, Austria and Portugal. Both of these players use a combination of large-scale marketing and price/service differentiation to carve out significant (though not massive) market shares in many geographies. They are acting like heavyweights, taking advantage of the scale and capital that their parent organisations can afford, while being more nimble in their chosen consumer and small-medium enterprise markets than many of their competitors. Other incumbents, from BT to Telecom Italia, are beginning to expand their international 'challenger incumbent' models. For instance, BT has recently completed a takeover of Albacom in Italy, providing a foothold in what is Telecom Italia's turf.
Economic models suggest that only three or four broadband service providers will survive in each country, with the host incumbent most likely to be one of them. This is true for many reasons:
*  ISP-only business models are easy to duplicate, making it hard to sustain a competitive advantage;
*  Price wars will drive out some competitors who have minimal or no alternative revenue sources; and
*  Challenger incumbent broadband service providers benefit from the deeper financial resources of their parent companies allowing them to withstand early revenue challenges.
Challenger incumbents also benefit from their larger scale and scope, making them more attractive for content partners.
To win in this new playing field, challenger incumbents are using three competitive techniques:
*  Compete against incumbents in areas other than traditional voice, such as ISP or mobile;
*  Attack specific fixed-line customer segments where the incumbent may be vulnerable, such as multi-national businesses; and
*  Participate indirectly by funding an alternative carrier or other rival to provide necessary capital.
We have seen evidence of the first bullet point in a series of cross-border plays by incumbent operators.  In some cases, these expansions have simply paralleled their home market business. For example, France Telecom's Wanadoo ISP, which operates successfully in its domestic market, is an easily transferable business model that now exists in many European countries.

Attacking host incumbents

In other cases, the incumbent has deployed a more traditional voice business model to attack host incumbents in the countries it is expanding into. BT Global Services uses a variant of this, providing voice and data services to UK-based companies with foreign operations who require telecom services abroad. Other European incumbents, plus global operators such as AT&T, NTT and Telstra, pursue similar strategies, following their home markets' top business clients abroad to gain profitable niches in foreign territories. Usually, however, these cross-border operations aim to acquire and retain only those customers with specific interests in the incumbent's home domestic market, rather than all potential businesses that require services in a foreign market. Another variant of this model involves a financial rather than operational interest in the cross-border entity.
We are now beginning to see cases where the incumbent cross-border entry not only expands its revenue sources abroad via a replicable business model, but also indirectly prevents that new market's host incumbent from counter-attacking effectively in the challenger incumbent's home market. In effect, the challenger incumbent may distract the host incumbent, forcing the latter to concentrate on revenue retention at home rather than seek its fortunes elsewhere with as much resource as it would have had if unencumbered by domestic competition. France Telecom is increasing its presence in markets where it can offer a broader suite of products and services that directly compete against host incumbent services; by doing so, it forces the host incumbent to utilise resources to fend off an FT attack rather than deploying them to challenge FT directly in France.

Tackling the challenges ahead

If companies are to survive in an open and extremely competitive market with tight margins, they will need to embrace the new technologies and develop new business models accordingly.
Incumbents should consider the following strategies:
*  Manage the widespread introduction of VoIP -- particularly once quality improves enough to drive widespread acceptance -- as an alternative to traditional voice, given the increasing proliferation of broadband as well as of competitors;
*  Prepare responses to LLU-enabled competition at the network and services layers to stave off excessive substitution;
*  Identify potential incumbent competitors that are most likely to attack directly, on their own or via strategic partnerships with existing telecom and media players;
*  Manage carefully the internal challenges of operating as both a carrier and services provider, which might be done by separating the services business from the traditional, parent company;
*  Determine if and how to expand operations abroad, and whether to compete against host  incumbents directly or indirectly;
*  Address the past challenges that most incumbents have had in creating and operating partnerships, to ensure access to critical technologies and profitable customers that the incumbent may not achieve on its own;
*  Manage any regulatory issues that may arise from the above three bullet points; and
*  Ensure the company has workable technology and architecture to manage the expansion of products and services as well as geographies.
For alternative carriers, sitting on the sidelines is not an option. They should consider the following actions:
*  Assess which LLU-enabled areas are ripe for aggressive target marketing, and which warrant the costs of the actual unbundling;
*  Select a strategy for addressing the VoIP opportunity/challenge given that VoIP will reduce voice-related revenues as it proliferates;
*  Identify and approach potential incumbents from foreign markets who would benefit from a domestic strategic partnership;
*  Determine which ISPs may be useful as partners, paying close attention to their strategic and financial situations, to mimic the broadband service provider models that incumbents may more easily implement; and
*  Be ready for deep-pocketed, foreign competitors to squeeze alternative carriers, via massive marketing campaigns, very low pricing strategies or other predatory techniques used to drive out smaller players.

The full paper, 'A new era in fixed line telecommunications competition -- surf's up for incumbents', on which this article is based, can be downloaded from the RSM Robson Rhodes Business Consulting website at www.rsmi.co.uk/consulting

Jonathan M. Steinberg is Principal, RSM Robson Rhodes Business Consulting and can be contacted via e-mail: jonathan.steinberg@rsmi.co.uk

Next Generation Networks need Next Generation People

Once upon a time it used to be so simple. People asked you what you did for a living and the mere mention of the word 'telecommunications' was enough to stop them dead in the water. Mental images of telegraph poles, black bakelite phones, long waits for field engineers and the drab efficiency of a utility sector conjured up a grey world of proto-nerds.  Okay, we may have had a brief romance with fame during that dot.com bubble that we'd all rather forget, but by and large, telecommunications has remained an iceberg-shaped industry, with most of our work resting safely below the waterline of the public's consciousness.
2005 however is now looking like the year in which we're going to break surface once again, finally doing what I call 'joined-up' telecoms, using IP as the effective equivalent of the ubiquitous Model-T Ford of the last century -- i.e. strap anything you want to the chassis and start rolling westward in search of the Promised Land. As a result, it's probably more accurate to call it 'the industry previously known as telecommunications', just as that minuscule pop star rebranded himself.
One of the central problems, though, in making this long-expected transition towards integrated content and applications, is whether the kinds of people and skills that we have in the industry are able to make the change themselves. Most of the issues that I see daily in the industry aren't so much technological anymore. Instead, they're cultural and perceptual ones within organisations, or, indeed, in the 'wetware' lurking between people's ears.
I'd argue for a start that we're seeing the equivalent of continental drift in all the convergence work under way around us. The North America of IP is now butting up against the Eurasia of the PSTN, while the Australasia of content and applications moves up from the south and, just like the Indian subcontinent, is pushing up its own Himalayas; we're getting the equivalent of new valleys and mountain ranges appearing in our midst. As a result, we need new species to colonise those new niches. The trouble is that many in the telecoms establishment still have a naïve faith in the integrity of their own particular nation state -- despite the fact that the earth's crust is already shifting under their feet....
It is, however, encouraging that industry debate is now starting to look at these human-centric issues. At a recent New York analyst briefing session from the commercially reborn Telcordia, a number of different speakers highlighted the importance of recognising the stresses and strains involved in making the necessary transitions -- particularly against a backdrop of ever-increasing commercial and technological complexity. One particularly dynamic presentation came from Joe Gensheimer, COO of a new US MVNO called Movida, set up to target the Hispanic marketplace through a hosted MVNO service from Telcordia. His business model revolves around understanding the customer better than his competitors and fulfilling their needs faster and more efficiently -- not in spending billions to build a new network.
Much of the new growth in telecoms is going to come from similarly-minded entrepreneurs, often from outside the industry. And therein lies the problem for the industry once known as telecoms. Many of those now in senior positions in the industry first entered the sector when it was the high-technology wing of a Post Office. Risk was minimal, career progressions were almost set in stone and there was a reliable pension waiting at the end of the day. Things today couldn't be more different -- but have the people and the underlying organisations changed sufficiently to adapt to the new conditions?
The next few years are going to be tricky ones for us. How do we keep those core telecoms values of reliability and integrity in what's fast becoming an over-stocked bazaar of content and applications? Do we sink back below the waterline to become once again the trusted but largely invisible utility -- or do we try and re-invent ourselves as both an industry and as individuals to make the transformation?
It's much easier swapping out legacy infrastructure than it is legacy mindsets...

Alun Lewis is a telecommunications writer and consultant, specialising in what he, tongue in cheek, calls 'post-modernist telecoms'. He can be contacted via: alunlewis@compuserve.com

Despite all the new technologies and services on offer, traditional values of good old customer care have not gone away. And that's where a good managed service can pay dividends, says Andy Peers

With mobile penetration in Western Europe forecast to reach just under 100 per cent by 2009, operators will have to look at new ways to gain revenue from existing subscribers.
One route identified is increased segmentation of the subscriber base, using mobile virtual network operators (MVNOs) to reach new niche sectors. Yankee Group's Mobile Wireless Services report 2004 also points to new MVNOs identifying segments not already addressed by existing wireless offerings, asserting that the more focused the segment, the lower the cost of building a brand in that segment. In fact, Pyramid reported that half of Sprint's net additions in Q1 2004 were generated via segmented MVNOs. These MVNOs generated additional traffic revenue by targeting subscribers that the operator would not otherwise have reached, such as specific enterprise or ethnic segments.
Pyramid warns that as the current staples of MVNO revenue -- namely brand building, pricing/promotion, voice and SMS services -- become commoditised, MVNOs will have to harness the growing demand for content and multimedia and for emerging services such as mobile broadband access. Because they have not invested in the network infrastructure to deliver new wireless and data solutions, MVNOs can be more flexible about the range of services they offer. However, Yankee Group believes that with the rise of wireless adoption there will be a greater demand for customer support. Therefore MVNOs will not be able to add wireless into their customer acquisition and care, device provisioning and billing functions without significant capital expenditure or outsourcing.

Looking after the back office

There is a rapidly growing trend among carriers to offer wholesale network access, opening the door for high brand organisations to launch MVNO services. Companies with strong brands can extend them further by reselling wireless services as an extra service to customers. Managed services are particularly attractive for blue chip enterprises rolling out into new markets and territories using an MVNO model, because services can be set up and configured quickly, without taking focus away from their core business. Minimal initial investment is required to get started and upgrades can be achieved more cost effectively as the business grows.
As an indicator of the impact of new service delivery, Gartner has predicted that managed services in the telecoms sector will grow by 13% in 2005. Organisations that invest in managed services gain economies of scale from shared access to network and systems management platforms that they could never afford to invest in using the in-house model. There is also an attractive opex factor because the managed services provider can manage more elements of the telecoms service, such as provisioning, billing, credit checking, CRM and call centre facilities, at a substantially lower price.
Therefore, outsourcing is a good way of ensuring business continuity and quality of service, which has a positive impact on brand perception. It also ensures that the company is buying in a wealth of knowledge and expertise from a company such as a virtual network enabler (VNE), whose core business is focused on the provision of these services. VNEs can also help big brands that are looking to launch an MVNO presence, by reducing their capital outlay and time to market and enabling them to focus solely on marketing and acquiring customers.
VNEs offer specialist skills and back office infrastructure that have been estimated to reduce MVNO start-up costs by as much as 90 per cent, when compared with building up capability and infrastructure in-house. In general, VNE specialisation includes vital back office functions such as billing solutions, wholesale negotiation and content aggregation, through to full turnkey operations. In addition to the infrastructure investment, the VNE forms the link between the network operator and the MVNO, negotiating airtime in large blocks and thereby enabling further savings. When one considers that purchasing airtime can account for up to 50 per cent of the MVNO's running costs, this demonstrates a major benefit of using the services of a VNE. As MVNO services start to include mobile content and multimedia, the VNE role is predicted to expand to digital rights management and the provision of advanced mobile data applications.

The challenge of new service delivery

MVNOs face particular challenges when rolling out new mobile and wireless services, the major one being customer support provision through dedicated call centres, online self-care and IVR. MVNOs must also rate and bill airtime usage in near real time, particularly where they are offering prepaid services.
Although it resells network minutes from the operators, the MVNO also needs its own billing cycles to avoid the scenario where subscribers have their service suspended because they signed up at the end of a billing cycle and appear not to have paid their monthly fee. By using the services of a VNE, the MVNO can meet these challenges by plugging its customers into carrier-grade back office infrastructure, advanced billing applications and call centres. It can also tap into years of expertise built up by the VNE's experience in providing similar services to traditional network operators.
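Returning to the billing-cycle point: one common remedy is simply to prorate the first invoice. The Python sketch below (hypothetical fee, calendar-month cycle assumed) charges a new subscriber only for the days remaining in the cycle, so the first invoice never looks like a missed payment.

from datetime import date
import calendar

# Proration sketch: a subscriber joining mid-cycle is charged only for the
# days remaining, so the first invoice does not resemble a missed payment.
# The monthly fee and calendar-month cycle are illustrative assumptions.

def prorated_first_fee(signup: date, monthly_fee: float) -> float:
    days_in_month = calendar.monthrange(signup.year, signup.month)[1]
    days_remaining = days_in_month - signup.day + 1    # include the signup day
    return round(monthly_fee * days_remaining / days_in_month, 2)

print(prorated_first_fee(date(2005, 10, 28), 20.00))   # 4 of 31 days -> 2.58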
The Martin Dawes Systems (MDS) managed service includes a network of established interfaces connecting to all major networks, fulfilment houses, credit agencies, print bureaus and banks. This established connectivity to third party suppliers means that newly merged operators or fledgling MVNOs can bring their mobile offerings to market quickly, without worrying about infrastructure problems, or compromising their high standards of customer service.
Catering for all levels of service providers, ranging from smaller independent service providers through to the larger operators, the MDS managed service helps operators bring new services to market quickly, without having to deal with the issues of incompatible billing solutions, unrealistic development timescales, extortionate development costs and limited service support. Managed services also offer the benefit of zero capital expenditure on hardware and software, low implementation costs and an easy upgrade path.
The world over, traditional telecom companies, and new entrants to the space, are demonstrating the need to lower costs and swiftly provide first rate new services to their customers, while still focusing on their core business and maintaining the customer service on which their corporate reputation depends. Once these new services have been delivered, suppliers need to ensure that billing is clear and simple.
Consumers and enterprise customers alike are demanding a single bill for multiple services, and meeting this need will help ensure revenue protection and customer retention. So despite all the new technologies and services on offer, the old issues of attractive pricing, competitive bundling, ease of use, simple billing and responsive service -- in short, good old customer service -- have not gone away. In fact, Pyramid Research asserts that the mobile telecom industry is entering a new era in which customer service has replaced subscription growth as the key factor determining business success.

Year of the MVNO

Analysts have predicted that 2005 will be the year of the MVNO, and the launch of easyMobile and the planned expansion of O2's MVNO agreements with Tesco and Tchibo certainly seem to bear this out. The arrival of 3G, HSDPA, UMTS, and IMS points to new multimedia and content services that MVNOs can personalise and deliver to niche segments to drive up traffic for the operators. The next generation of MVNOs is predicted to broaden out to offer these capabilities in the enterprise and healthcare industries, and machine-to-machine telematics for the automotive industry. This represents a growing opportunity for VNEs to act as the new service enablers: adding expertise, minimising risk and providing continuity of service to customers while new services are rolled out.

Andy Peers is the Managed Services Director at Martin Dawes Systems, and can be contacted via tel: +44(0) 1925 555300; e-mail: mds_info@mdsuk.com

While reports indicate that 3G is shaping up nicely in the revenue stakes, further challenges for the technology in the infrastructure need to be addressed to ensure its ultimate success, says Alan Carr

Flexible. Lighter. Smaller. Things are starting to look more attractive for what will become the most widespread 3G standard, WCDMA or UMTS. There are now well over ten million subscribers worldwide and encouraging reports of a significant increase in revenues per subscriber relative to 2G. As a result, the outlook is now much more attractive for the vendors supplying infrastructure to the 3G industry.
But all is not plain sailing and important challenges remain. A particular issue is the sheer competitiveness of the market. There are currently estimated to be around 20 suppliers of 3G infrastructure around the world, including:
*  Traditional Western suppliers with their proven pedigree and track-record
*  Rapidly expanding and aggressive new entrant Chinese companies
*  Japanese companies with the advantage of the world's most advanced home market
*  Korean companies with their long experience in the underlying CDMA technology.
Experience shows that ultimately there will only be room for a handful of these companies to be successful. But in the meantime, this overly competitive situation will make life difficult for all concerned.
So against this background, what lies ahead for the mobile infrastructure industry worldwide? What will be the factors that decide which of the current crop of 3G infrastructure vendors go on to become world leaders? Here we identify two factors we think will have a significant impact.

Flexible, lightweight solutions

Current 3G infrastructure solutions closely follow the model established with earlier generations of mobile infrastructure. However, there are signs that this will need to change in view of three issues:
*  There will be an ever-growing need for new basestation sites to provide good 3G coverage. In the UK, for example, it is estimated that at least 12,000 new cell sites will be needed and, following recent trends, it is likely that many of these will involve protracted disputes with the communities where they need to be located.
*  Achieving good indoor coverage for 3G is a growing problem. 3G simply does not penetrate well into buildings and this will only get worse as higher data rates are introduced and with new features such as HSDPA.
*  Finally, there is the issue of the dynamic status of 3G standards. Although the basic standards are now fairly stable, there is no sign of standards-making machinery such as 3GPP slowing down. On the contrary, an ambitious programme of continuous enhancement is already under way.
The problem for infrastructure vendors is the need to continually re-design products to keep pace. To address these issues, future infrastructure solutions will need to be much more lightweight and flexible. With a new class of much more compact basestations, it will be possible to avoid the large cell sites that are becoming so unpopular with the public and to move instead to much smaller, less conspicuous equipment that blends into the urban environment. Following the same theme, still smaller systems located indoors will be able to overcome the problem of indoor coverage.
Basestations will also need to be much more flexible to overcome the problem of continual changes in 3G standards. This applies to both small and large basestations, but is especially pertinent where larger numbers of smaller basestations are deployed and where site visits to upgrade functionality will become prohibitively expensive. The technology that will provide the flexibility needed is software defined radio (SDR). With this approach, as much of the basestation functionality as possible is implemented in software, and enhancements and updates can then be loaded remotely in the form of a software update.
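As a purely illustrative sketch -- not any vendor's actual update mechanism -- the Python fragment below captures the essentials of such a remote update: verify the image before accepting it, stage it alongside the running software, and keep the previous image available for rollback. A plain SHA-256 digest stands in for a real cryptographic signature.

import hashlib

# Illustrative SDR update flow: check image integrity, stage it alongside
# the running image, then activate while keeping the old image for rollback.

def apply_update(station: dict, image: bytes, expected_sha256: str) -> None:
    if hashlib.sha256(image).hexdigest() != expected_sha256:
        raise ValueError("image rejected: digest mismatch")
    station["staged"] = image                   # stage next to the running image
    station["previous"] = station.get("running")
    station["running"] = station.pop("staged")  # activate; rollback remains possible

basestation = {"running": b"release-1"}
new_image = b"release-2"
apply_update(basestation, new_image, hashlib.sha256(new_image).hexdigest())
print(basestation["running"])                   # b'release-2'; b'release-1' kept for rollback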

Picocell and microcell base stations

To understand the reason why small basestations will be so important in 3G networks, we first need to consider changes in the type of traffic in a 3G network.
With 2G, most traffic is voice calls and, of course, voice will remain an important element in 3G networks. But the real goal with 3G is to transition to a true mobile communication system rather than just a mobile telephone network. As a result, a growing part of the traffic will be made up of data, much of it at higher rates than is possible in 2G networks. In urban areas where users tend to cluster together in offices, stations, shopping malls and conference centres, this will quickly lead to a situation where the overall density of traffic is substantially higher than in 2G networks. The result is that many more basestations will be needed. But this will be uneconomic and, in many cases unacceptable due to public opposition to the introduction of new sites. The answer is to introduce a layer of smaller basestations in crowded urban areas to carry much of the traffic.
Another big driver towards smaller basestations in 3G is the need to get good indoor coverage.             
The problem starts from the fact that 3G operates at around 2GHz, where losses penetrating buildings are higher than with 2G systems. However, this alone ought not to be a major issue when it is considered that GSM has been made to work perfectly satisfactorily in the nearby band at 1800MHz.
The more significant factor is that 3G users inside buildings will typically want to make use of higher speed services, normally requiring a higher quality radio link than would be needed for voice alone. And so, although it may be possible to achieve reasonable voice coverage inside buildings, it will be much more difficult for the sort of advanced data services that will be needed indoors. Again smaller basestations can address this need.

Some technical factors

There are also some more detailed technical reasons why smaller basestations will be more important in 3G than 2G networks. The first relates to the orthogonality introduced to separate users in the downlink. CDMA systems have been carefully designed to maximise this orthogonality, since it has an important effect on boosting capacity in the downlink. However, in a macrocell it is common for a substantial part of the orthogonality to be lost because of the complex multipath channels that exist between the basestation and the mobile terminal. This results in a corresponding loss in capacity. The effect is much less in microcells or picocells because of the shorter and less complex path between basestation and terminal.
The second technical factor in favour of smaller cells is intercell interference. Adjacent cells in CDMA systems normally operate on the same carrier frequency, and interference from cells bordering the home cell appears as noise when trying to detect the wanted signal. In turn, this noise-like effect depresses system capacity. Pico and microcells are normally located below roof-top level, so there is less scope for interference between cells. On average, capacity loss due to intercell interference is therefore lower. Thus, perhaps surprisingly, smaller cells are more efficient than their conventional larger equivalents and are therefore able to support more user traffic.
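Both effects appear explicitly in the textbook WCDMA downlink load factor -- a standard radio-planning formula rather than anything specific to this article -- where W is the chip rate, R_j the bit rate of user j, α_j the orthogonality factor and i_j the other-to-own-cell interference ratio:

\[
\eta_{\mathrm{DL}} = \sum_{j=1}^{N} \frac{(E_b/N_0)_j \, R_j}{W}\,\bigl[(1-\alpha_j) + i_j\bigr]
\]

Smaller cells push α towards one (a shorter, less dispersive channel) and reduce i (less overlap with neighbouring cells), so each user contributes less to the load and more traffic fits before the cell saturates.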

The cost factor

The final driver to small basestations is simply cost. Macrocell basestations are relatively expensive items of equipment designed for so-called 'carrier class' operation. This typically involves building in redundancy to meet reliability targets as well as incorporating expensive high power amplifiers. Special environmentally protected cabins are also often needed.
By contrast, pico and microcell basestations can take a design approach closer to that of handsets than of conventional basestations. Reliability requirements are typically reduced, output powers are much lower and special environmental protection is not needed. Maintenance costs for pico and microcell basestations can also be lower.
In terms of operational costs, a weak area for smaller basestations is backhaul, because many more cell sites are involved than with macrocells. However, technology is evolving to provide solutions here. For picocell basestations, 3G standards are developing so that IP networks can be used for backhaul. So, for an indoor application such as an office, an attractive backhaul solution is to use the existing Ethernet infrastructure installed within the building.
For microcell basestations that are more likely to be located outdoors, one emerging solution is the 802.16 WiMAX standard which could provide an effective radio backhaul option. There are also other radio alternatives and, in terms of fixed networks, xDSL is potentially a lower cost alternative to conventional leased lines.
To conclude, flexible, lightweight solutions and smaller basestations are just two of the technological shifts that will help infrastructure vendors tackle industry challenges successfully as the rollout of 3G gathers pace.

Alan Carr is a Member of PA Consulting's Management Group. Contact via tel: +44 1763 267492
e-mail: wireless@paconsulting.com
www.paconsulting.com/wireless
