Analysis

Delivering carrier class Ethernet services for business can be tricky, with a number of issues needing to be addressed if Quality of Service is to be assured. Robert Winters provides some guidance

Metro Ethernet service deployments are continuing apace on a global basis with a variety of service offerings and enabling technologies that offer 'real broadband' as an attractive alternative to lower bandwidth DSL and Cable products, high cost leased lines, ATM and Frame Relay. Depending on the region of deployment there are a number of Ethernet technology alternatives and build-out strategies in progress.
For example, in Europe many incumbent service providers are maximising their use of existing SDH transport assets by upgrading equipment to support Ethernet services. With the insertion of new Ethernet line cards, a variety of non-switched pure Ethernet transport implementations, such as GFP (Generic Framing Procedure), and more QoS-oriented switched services, such as VLANs (Virtual LANs), coupled with the value-add of MPLS, can now be offered and new revenue models instituted.
Alongside the transport network there are also deployments using hybrid switching and routing technologies with next generation protocols such as MPLS (Multiprotocol Label Switching) and RPR (Resilient Packet Ring).
Along with the enhancements to existing SDH transport equipment and switched Ethernet networks there also exists a growing number of European state sponsored broadband initiatives in countries such as Sweden and Ireland. These programmes encourage the rollout of dark-fibre thus enabling competitive broadband service providers to build their networks over a ready-made physical layer transport medium. This type of initiative offers a reasonably clean slate approach to building an Ethernet product offering. The competitive service provider can at least focus on a deployment technology of choice, such as Ethernet over MPLS or RPR.
However, nothing is ever that easy. As can be imagined when business class services (as opposed to best effort home consumer type) are being guaranteed on an end-to-end basis between two major metropolitan areas, or indeed within the confines of a particular metro ring, there are challenges where 'Carrier Grade Ethernet QoS' is required. In this situation, service providers are expected to offer not only high bandwidth Ethernet services but also reliability, redundancy and high quality business class applications. Applications are increasingly delay and jitter sensitive, such as multicast video, time sensitive e-commerce web solutions and voice over IP (VoIP). This article focuses on the requirements of carrier grade Ethernet QoS at a layer 2 service and IP application level and assumes other carrier grade issues related to hardware redundancy (for example, MPLS fast reroute guarantees, inherent SDH protection and RPR protection) are addressed.
Capturing the enormous enterprise business market with differentiated Carrier Grade Ethernet QoS products requires an understanding of the capabilities of Ethernet services and the applications being transported over Ethernet. What to look out for when offering Carrier Grade Ethernet QoS:
1. Understand the performance of QoS and CoS (Class of Service)
The IEEE 802.1p/Q standards for Virtual LAN (VLAN) services offer a method for identifying a service stream, setting bandwidth and assigning a priority setting that determines class of service (CoS), rather than a pure QoS parameter of the kind found in network implementations such as ATM, which offers attributes like CBR (constant bit rate) settings. So, in order to benefit from Ethernet's inherent cost effectiveness and high bandwidth, and still offer QoS, additional quality metrics need to be brought into the mix, such as those offered through connection admission control (CAC) for end-to-end bandwidth and through MPLS signalling and traffic engineering capabilities. Ethernet industry-focused organisations, such as the Metro Ethernet Forum (MEF), have defined service types including Ethernet Line and LAN Services for point-to-point and point-to-multipoint/multipoint-to-multipoint services.
CoS identifiers within these services can include specific source and destination MAC addresses and the customer edge VLAN ID/IEEE 802.1p priority. Inspection of these packet headers demands processing power from network devices that may impact performance, and requires verification of performance on a per-service basis. The MEF has defined traffic profiles per CoS identifier that include Committed Information Rate (CIR), Peak Information Rate (PIR) and associated burst sizes. The provider can thus offer a greater number of service options to their customers. For example, a subscriber may connect to a metro Ethernet service at one location through a 10Mbps user-to-network interface and at another location at 100Mbps; the CIR in this case could be 10Mbps. More is to come: with the development of VPLS and loss-less packet transmission in metro Ethernet networks, the number of network options will continue to increase.
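The CIR/PIR bandwidth profile described above is typically enforced at the network edge with a two-rate token-bucket policer. The following Python sketch is illustrative only: the rates, burst sizes and colour names are assumptions rather than MEF-mandated values, but the mechanics are representative — traffic within CIR is guaranteed, traffic between CIR and PIR is deliverable but unprotected, and traffic beyond PIR is dropped.

```python
class TwoRatePolicer:
    """Two-rate, two-bucket policer in the spirit of a MEF bandwidth
    profile: packets within CIR/CBS are 'green' (guaranteed), packets
    within PIR/PBS are 'yellow' (best effort), the rest are dropped."""

    def __init__(self, cir_bps, cbs_bytes, pir_bps, pbs_bytes):
        self.cir = cir_bps / 8.0   # committed rate in bytes/second
        self.pir = pir_bps / 8.0   # peak rate in bytes/second
        self.cbs = cbs_bytes       # committed burst size
        self.pbs = pbs_bytes       # peak burst size
        self.tc = cbs_bytes        # committed token bucket (starts full)
        self.tp = pbs_bytes        # peak token bucket (starts full)
        self.last = 0.0            # timestamp of the previous packet

    def colour(self, size_bytes, now):
        # Refill both buckets for the time elapsed since the last packet.
        elapsed = now - self.last
        self.last = now
        self.tc = min(self.cbs, self.tc + self.cir * elapsed)
        self.tp = min(self.pbs, self.tp + self.pir * elapsed)
        if size_bytes > self.tp:
            return "red"           # exceeds peak profile: drop
        self.tp -= size_bytes
        if size_bytes > self.tc:
            return "yellow"        # within PIR only: not guaranteed
        self.tc -= size_bytes
        return "green"             # within CIR: guaranteed delivery
```

With a 10Mbps CIR behind a 100Mbps interface, back-to-back full-size frames exhaust the committed bucket almost immediately, which is exactly why the burst sizes form part of the service definition alongside the rates.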
2. Check ability to guarantee service stream and IP application flow quality.
In the past it has been difficult to test on a per service and per application flow basis since traditional test methods relied on packet blasting at layer 2 only.
Basically, if the layer 2 service pipe was rated by RFC2544 throughput tests, this was generally viewed as a sufficient guarantee of quality. However, to really guarantee carrier grade Ethernet QoS, a far more granular approach is required. Service providers need to be confident that each service, each user and each IP application flow using that service is thoroughly tested for quality.
Therefore, a pragmatic approach to testing is required whereby corporate Ethernet service and application flow models can be quickly built, then emulated and analysed for quality issues throughout the network under test with varying load and network status conditions. Using this test method, QoS boundaries can be realistically determined for both network services and application layers.

3. Guarantee end-to-end QoS
Ethernet services invariably start out their 'circuit life' as a layer 2 service (e.g. a VLAN) originating at the customer premises into some point of aggregation and transport such as MPLS/RPR. The transport method can be a layer 3 VPN such as MPLS RFC2547, converted out the 'other side' back to the layer 2 VLAN and into the remote customer premises. It is important to test on an end-to-end basis. For example, with the possibility of an MPLS misconfiguration, the number of hops and the propagation time can change, which requires end-to-end testing for different traffic engineered service configurations. It is also important that each CoS priority assignment for 802.1p VLANs maps effectively onto the MPLS EXP bits (the equivalent quality metric) and back again. In situations involving MPLS fast reroute, how long does it really take for an individual end-to-end Ethernet service to get back to normal if a disruption occurs? Another consideration is restoration of service when normal conditions resume.
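Both the 802.1p priority code point (PCP) and the MPLS EXP field are three bits wide, so the VLAN-to-MPLS mapping is often a simple identity table, though it is configurable per service. A small Python sketch of the round-trip check implied above — the identity tables are an illustrative assumption, not a mandated mapping:

```python
# Illustrative per-service mapping tables; real deployments configure
# these explicitly, and the identity mapping here is just an assumption.
PCP_TO_EXP = {pcp: pcp for pcp in range(8)}  # 802.1p PCP -> MPLS EXP
EXP_TO_PCP = {exp: exp for exp in range(8)}  # MPLS EXP -> 802.1p PCP

def pcp_from_tci(tci):
    """The 802.1Q Tag Control Information word packs a 3-bit PCP,
    a 1-bit DEI and a 12-bit VLAN ID; the PCP sits in the top bits."""
    return (tci >> 13) & 0x7

def vid_from_tci(tci):
    """Extract the 12-bit VLAN ID from the bottom of the TCI word."""
    return tci & 0x0FFF

def round_trip_ok(to_exp, to_pcp):
    """Verify that a priority marking survives the VLAN -> MPLS -> VLAN
    path: every ingress PCP must map back to itself at egress."""
    return all(to_pcp[to_exp[pcp]] == pcp for pcp in range(8))
```

A test like `round_trip_ok` is the software analogue of the end-to-end verification described above: any asymmetric mapping shows up as a priority that fails to survive the transit through the MPLS core.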
4. Understand the effects of TCP/IP application flows on 'guaranteed' Ethernet services bandwidth
Yes, we all know that Ethernet is a layer 2 service and you should not care about the IP and application layers above. However, when it comes to offering bandwidth guarantees you need to pay attention. It is extremely important to consider the effect of multiple TCP/IP application traffic flows running over a given layer 2 service and the potential side effects, such as a drop in effective bandwidth. Due to TCP congestion notification schemes, layer 4-7 performance can rapidly degrade, leaving customers bewildered and confused about the service specification and network performance. Rather than facing an irate customer who expects the full bandwidth pipe they paid for, it is worth testing in advance a variety of scenarios with voice, video and data traffic that can cause an excessive amount of dropped packets. In this way a service provider can better understand how and why this occurs, and can also explain to customers why, for example, a 20Mbps service at layer 2 does not necessarily translate into the equivalent 'application bandwidth'. Of course, with full RFC2544 tests, throughput can be guaranteed at layer 2, but add real application TCP flows into the mix and see what happens.
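The gap between layer 2 bandwidth and 'application bandwidth' can be estimated with the well-known Mathis approximation for steady-state TCP throughput, which falls with the square root of the packet loss rate. A rough Python sketch — the MSS, RTT and loss figures below are illustrative assumptions, and the formula is a back-of-the-envelope model, not a guarantee:

```python
import math

def mathis_throughput_bps(mss_bytes, rtt_s, loss_rate):
    """Approximate steady-state throughput of one TCP flow
    (Mathis et al.): rate ~ (MSS / RTT) * sqrt(3/2) / sqrt(p)."""
    return (mss_bytes * 8 / rtt_s) * math.sqrt(1.5) / math.sqrt(loss_rate)
```

With a 1460-byte MSS, a 20 ms round trip and just 1 per cent packet loss, a single flow tops out at roughly 7Mbps — well short of a 20Mbps layer 2 pipe that RFC2544 tests would rate as fully compliant. That arithmetic is often all that is needed to explain the discrepancy to a customer.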
5. Have a method of isolating Ethernet quality service issues from customer application problems
The ability to offer end-to-end carrier grade Ethernet QoS usually assumes the customer has the perfect set of well-behaved applications. Again, as an Ethernet services provider, the last thing needed is blame for a service issue caused by application problems. Aside from bandwidth hogging applications such as peer-to-peer (P2P) transactions that sponge bandwidth at an enormous rate, even standard IP applications such as Web, E-mail, VoIP, Multicast and Streaming applications can contribute to latency and loss of bandwidth. The ability to quickly isolate a problem source - and to prove it - is a key element in customer satisfaction.

Pressure on service providers and the equipment vendors supplying them to provide carrier grade Ethernet quality of service (QoS) guarantees is being heightened by the introduction of an array of value added applications such as Video on Demand, response time critical Web applications and VoIP. In order to improve competitive advantage, there are a number of areas in which QoS issues can be determined and mitigated, with practical quality boundaries worked out within which premium level business class services can be more effectively and confidently guaranteed.

Robert Winters is Chief Marketing Officer and co-founder of Shenick Network Systems Limited, and can be contacted via tel: +353-1-2367002; e-mail: robert.winters@shenick.com [l=www.shenick.com/]http://www.shenick.com/[/l]

Ethernet is emerging as a key component for bridging the gaps from the access network to the customer demarcation, as Troy Larsen explains

As business and residential customers raise the bar on demand for new voice, data, and video services requiring higher bandwidth and faster speeds, Ethernet is poised to become a key technology in the subscriber access network environment. Ethernet in the First Mile (EFM) - or perhaps more appropriately Ethernet in the First Kilometre - is rapidly moving to the forefront of service provider options because of three major advantages: simplicity, scalability, and interoperability.
European service providers, in particular, have been quick to embrace Ethernet over fibre during the past few years as a 'low cost, no nonsense' approach to giving customers expanded services to meet soaring expectations. Europe's demand for metro access solutions rose quickly, perhaps due to the density of customers in most geographical areas putting businesses and consumers in close proximity to embedded fibre.
Today, most service providers, including incumbent local exchange carriers (ILECs) throughout North America, are bullish on Ethernet. With competition heating up and revenue opportunities for fibre to the home and business increasing, carriers sorely need solutions that can bring flexible, low cost bandwidth straight to the demarcation points.
According to a recent report from Current Analysis, a US-based telecom market analysis firm, Ethernet is one area that appears to have solid customer demand and North American ILECs are rolling out aggressive development plans. SBC, Verizon, BellSouth and Qwest have eyed the opportunities and jumped in early to build national Ethernet coverage. MCI more recently launched plans to expand its Ethernet service portfolio and footprint, despite playing catch-up after having to work through WorldCom's Chapter 11 bankruptcy issues.
Ethernet evolution
Ethernet provides several cost-saving benefits for bringing high bandwidth services to customers. First and foremost, Ethernet has been, and will continue to be, the easiest protocol to implement in any type of network topology. It's not only simple to install and maintain, but it is ubiquitous throughout the industry. There are well-defined standards and a worldwide industry supply chain.
Scalability has always been an asset in Ethernet deployment. There is no denying that Ethernet, throughout its long history, has continually increased its bandwidth capacity, as well as its ability to handle larger and larger network topologies. For metro-area backbone networks, 10-Gigabit Ethernet is providing the same scalability advantage.
Interoperability is an issue for any telecom technology, new or old. The standards organisations, as well as interest groups such as the Metro Ethernet Forum, have spent a great deal of time and effort creating standards that make Ethernet the easiest protocol to implement - from the carrier network all the way down to the subscriber network.
Because of these and other advantages, Ethernet provides economical benefits that make it very attractive, particularly in access networks. Compared to asynchronous transfer mode (ATM), synchronous optical network (SONET), synchronous digital hierarchy (SDH), and other protocols, the cost difference is significant for the carrier. Ethernet not only lowers equipment costs, but it costs less to maintain in the network.
Leaping over the hurdles 
Despite its many advantages, Ethernet was plagued in its early implementations by a number of hurdles that prevented it from becoming the protocol of choice for connecting the First Mile. First, carrier customers required the highest (99.999 per cent) reliability and uptime. Although 'best effort' reliability is acceptable in many local or enterprise network situations, it is not tolerated in the telco realm. Because of that, carriers had been reluctant to invest money into deploying Ethernet until standards could provide a more acceptable reliability factor.
Another major hurdle was operation, administration, maintenance, and performance (OAM&P) monitoring. Since carriers were creating these networks to generate revenue, the management issue was not only an extension of reliability in general, but a key component for reducing operational expenses while creating competitive pricing structures. Profitability in the long term is essential.
Restoration capability was also an issue that needed to be addressed. Restoration simply means having a redundant link between a carrier's point-of-presence (PoP) and the customer site in the event of a major physical problem, such as a fibre cut. This is generally provided by having dual links to one PoP, known as 'single homing,' or separate links to two different PoPs, known as 'dual homing.'
To be effective, the switchover between the primary and secondary links must be quick enough to remain transparent. The benchmark for this is the < 50 ms switchover time offered by SONET. This precluded normal Ethernet restoration protocols, such as Spanning Tree, which can lose a significant number of packets during longer reconvergence times - unacceptable in a mission-critical network.
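The cost of a slow switchover is easy to quantify: on a saturated link, every millisecond of reconvergence is a predictable number of lost frames. A back-of-the-envelope Python sketch — the gigabit line rate and minimum frame size used below are illustrative assumptions:

```python
def frames_lost(line_rate_bps, frame_bytes, outage_s):
    """Frames dropped on a saturated link during a restoration outage.
    Each frame occupies its own bytes plus an 8-byte preamble and a
    12-byte inter-frame gap on the wire."""
    wire_bits = (frame_bytes + 8 + 12) * 8
    return int(line_rate_bps / wire_bits * outage_s)
```

At gigabit rates with minimum-size 64-byte frames, even a SONET-class 50 ms outage drops roughly 74,000 frames; a Spanning Tree reconvergence lasting tens of seconds drops tens of millions, which is why the protocol falls short of carrier expectations.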
The answer to these and other obstacles to EFM viability is the IEEE's recently ratified 802.3ah standard. This standard provides a set of management tools with the specific goal of making Ethernet acceptable in the carrier environment for deployment in access networks. The 802.3ah standard calls for the ability to remotely manage a demarcation unit or the customer premises equipment (CPE) with full OAM&P.
Vendors of Ethernet access solutions are extending the 802.3ah standard to include further functionality. Addressing link restoration requirements, some vendors have introduced implementations that meet or even exceed < 50 ms switchover times. Other solutions enable carriers to offer and administer multi-tiered service level agreements (SLAs) using CPEs with built-in rate limiting capabilities.
Long-awaited standards
At its core, the long-awaited 802.3ah standard sets the groundwork for giving carriers the confidence to deploy today's Ethernet. They can now reap the benefits in managing Ethernet services to the customer premise while guaranteeing any level of service. Additionally, operational expenditures are minimised through remote management capabilities that eliminate the expensive truck rolls of the past.
Rarely considered when planning Ethernet access service is the need for a management agent at each network device. For example, a carrier normally manages the central office through a simple network management protocol (SNMP) that requires an IP address and a management agent. Using the same scheme for every CPE device makes management of the IP resources alone a huge burden. Worse, the added complexity this creates in the network results in reduced reliability.
However, with the 802.3ah standard, the need for an IP address at the customer premise is eliminated. This not only simplifies the setup of each device - which today involves a plug-and-play module with auto-discovery features - but greatly simplifies maintenance requirements over the long term. All of this, of course, equates to less cost and more revenue opportunity.
Another overlooked aspect in access networks is packet size. The maximum size for IEEE standard Ethernet frames is 1522 bytes. However, many Ethernet and IP switches/routers make use of extensions to the frame that result in a larger maximum frame size. To enable all the commonly used protocols - as well as the new emerging IP/MPLS/Ethernet protocols - to run undisturbed between physical locations, service providers require access equipment to have the ability to transmit frame sizes from 64 bytes up to a maximum of between 1548 and 9000 bytes.
Any Ethernet demarcation solution with a maximum frame size below 1600 bytes will substantially limit its attractiveness to service providers. Emerging protocols for transparent LAN services accept the fact that in full duplex mode, Ethernet has no practical limit on packet size. Emerging services and new protocols are already requesting mini-jumbo frames (1900 bytes) and in the future may request jumbo size packets that extend to 9000 bytes.
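The attraction of mini-jumbo and jumbo frames can be seen in a simple efficiency calculation: the per-frame overhead (header, FCS, preamble and inter-frame gap) is fixed, so larger payloads spend a greater fraction of wire time carrying data. A Python sketch under standard untagged-Ethernet assumptions:

```python
def ethernet_efficiency(payload_bytes):
    """Fraction of wire time carrying payload for one untagged frame:
    18 bytes of header/FCS plus an 8-byte preamble and a 12-byte
    inter-frame gap of fixed overhead per frame."""
    overhead = 18 + 8 + 12
    return payload_bytes / (payload_bytes + overhead)
```

A standard 1500-byte payload is about 97.5 per cent efficient, a 9000-byte jumbo payload about 99.6 per cent, while minimum-size 46-byte payloads fall below 55 per cent — and encapsulation headers that force fragmentation at a sub-1600-byte demarcation unit push every affected flow toward the inefficient end of that range.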
Finally, to ensure restoration features, carriers should look for an Ethernet services demarcation unit that has built-in 'link-state' redundancy capability. This is the ability to detect a loss of link on a primary interface and instantaneously switch to a redundant link. With the advent of pluggable optical interfaces and intelligent, remotely manageable CPE devices, it is possible today to provide a single solution that incorporates all the intelligence needed to provide a redundant link with transparent restoration if and when the customer requires it.
Ethernet revolution
The two basic service solutions for Ethernet access are Ethernet LAN (E-LAN) services and Ethernet Line (E-Line) services. E-LAN services provide multipoint-to-multipoint solutions over a wide-area network, sometimes referred to as a wide LAN solution. E-Line based services, on the other hand, are point-to-point in nature, and fall into three categories.
Simple point-to-point E-Line services physically connect one location directly to another. More advanced E-Line point-to-point services rely on a network with multi-service solutions, meaning that quality-of-service (QoS), advanced VLAN capabilities, circuit emulation, and possibly even encryption services are available at the demarcation or aggregation point. The third E-Line category is point-to-multipoint services wherein one site is connected to several other sites through the network.
Why Ethernet in the access network? Simply put, it is scalable, simple, and interoperable. Ethernet is the most widely used global protocol, supporting data, voice and video traffic while easily bridging the gap between provider and subscriber networks through transparent, but fully managed demarcation capability. At the end of the day, Ethernet meets every carrier's primary demand: a lower cost solution capable of reaping additional revenue from new and existing access networks.

Troy Larsen is technology marketing manager at MRV Communications
 [l=www.mrv.com/]http://www.mrv.com/[/l]

In an increasingly information driven world, the question of how to protect that information in the name of privacy has risen to the top of the corporate agenda. Lynd Morley talks to Toby Stevens, managing director of EPG, about the privacy issues affecting business today

The issues surrounding the privacy of personal information in business are fast moving up the corporate agenda, as organisations begin to recognise that they are caught in a web of rules and regulations at both national and international levels. Understanding and applying the regulations correctly are now as vital to a company's commercial survival as guaranteeing the security of information systems or adhering to correct accounting procedures have become over the past decade.
"Privacy is recognised as one of the key elements of good corporate governance," explains Toby Stevens, managing director of the Enterprise Privacy Group. "Corporate social responsibility demands that you show respect for personal data."
Stevens, who established EPG with Simon Davies – widely acknowledged as one of the foremost privacy experts in the world, and founder of the watchdog group Privacy International – also points out that every commercial relationship is built on trust. "If you misuse someone's personal information, you can destroy that trust instantly. Your customers and employees may forgive an accidental security failure, but they will not forget an abuse of their personal privacy, regardless of the cause."
The post-Enron emphasis on faultless corporate governance, heightened public awareness of privacy issues, and a growing culture of litigation are all contributing to a very real need for organisations, in both the private and public sectors, to understand and implement the privacy requirements being placed upon them.
In the wake of such developments as the introduction of new anti-terrorist legislation across the world, and – specifically in the UK – the forthcoming introduction of ID cards, images of a Big Brother society are beginning to loom in the public consciousness. As a result, organisations are having to respond to privacy concerns, much as they did to the information security concerns that emerged in force during the 90s. A decade ago, information security was still viewed as a drain on the bottom line by most businesses – an optional, value-added service. With the growth of the Internet, the increased public consciousness of hacking, and some high profile security incidents, most companies realised that they had to start offering security, no longer as an optional extra or differentiator, but as a commodity. Indeed, they recognised that they would lose customers if security were not integrated into every aspect of their products and services.
Stevens points out that over the past few years there has been a similar growth in public awareness of how personal data is managed – prompted, in part, by the introduction of EU legislation on privacy.
"Europe has absolutely led the way in this field, with a very strong cultural concept that your personal data is private, and that you have the right to control who sees it, who handles it, what they do with it," he explains. "In the late 80s that concept was translated into the EU Data Protection Directive, and companies were given the burden of actually having to be accountable for how they handled personal data. Back then, most of them saw it as something of an irritation, and couldn't see any commercial value in compliance. Quite often, data protection was fobbed off onto security departments, or junior management, because it was seen as a purely regulatory and legal compliance issue.  The attitude was: 'We'll do the bare minimum we need to, and then we'll forget about it'. But this is rarely effective: security professionals are worried about hackers or disgruntled employees, but the biggest privacy threat can come from your best customer or most loyal member of staff. The privacy manager requires a different mindset to the security manager."
Stevens believes that organisations are now becoming all too aware of the fact that, not only is there a considerably heightened awareness of privacy issues among the consuming public – who will no longer accept privacy of information simply as an optional extra – but that there is also a move in Europe towards the US model, where the growth of privacy legislation has been driven by litigation. US organisations are obliged to consider the possible litigation arising from any privacy incident, and this has created a culture of respect for privacy, since it directly impacts the organisation's bottom line. The litigation-driven approach has also created a diverse range of laws to address very specific privacy problems, despite the absence of an equivalent to the EU Data Protection Directive.
The US Video Privacy Protection Act, for instance, was passed by Congress in the wake of the controversy that arose when Judge Robert Bork's video rental records were released during hearings into his Supreme Court nomination. The Act forbids a video rental or sales outlet from disclosing information about which tapes a person borrows or buys, or releasing other personally identifiable information without the informed, written consent of the customer. The Act also allows consumers to sue for damages if they are harmed by any violations of the Act.
Another example of the effects of litigation was demonstrated when a US federal and state class action against Internet advertising agency DoubleClick was settled under an agreement that requires the company to give consumers new privacy protection. The lawsuits alleged that DoubleClick violated state and federal laws by tracking and collecting consumers' personal details and combining them with information on their web surfing habits. As part of the settlement, DoubleClick agreed to adhere to a number of practices and policies, including the commitment that the company's privacy policy would display easy-to-read explanations of its online services; the company would undertake a consumer education effort, which included consumer privacy banner ads that invite consumers to learn more about how to protect their on-line privacy; and the company would institute internal policies to ensure the protection and routine purging of data collected online. The legal fees and costs of up to $1.8 million fell to DoubleClick.
But even without possible legal ramifications, Stevens is adamant that, in the information society, proper handling of personal data will become one of the major factors for any client deciding to whom he or she is prepared to divulge personal information.
"We increasingly see people voting with their feet if, for instance, they don't like a web site's privacy policy," he comments. "These effects can be measured – how many virtual shopping baskets don't go through checkout because the individual gets cold feet about handing over personal information?
"Every business handles information, but particularly in the business to consumer environment any company that does not respect personal information will, sooner or later, come unstuck. Not necessarily as a result of legal action, but purely at a commercial level. People simply won't hand over their data."
EPG, whose brief is to understand best practice in privacy management and to help clients implement it successfully, is currently working with a central UK government department, for example, to assess its compliance with data protection legislation. EPG is also working with a leading management and systems consulting firm to consider issues arising from the use of Radio Frequency Identification (RFID) tags on pharmaceutical products.
Understanding the detail
Stevens, whose experience spans over 15 years in the management of corporate security and privacy projects, explains that the problem for business now is in understanding the detail, as well as the principles, of handling personal data.
"This can be any personal data," he stresses. "It's not just your customer database, your marketing list, or your employee information. It is anything that can be linked back to an individual in any way. Even if you strip someone's name away from the data, as long as there's still an identifier such as a telephone number, it's personal data.
"I've worked with a great many large organisations – some of them huge – which had absolutely no central control over privacy or data protection," he continues. "I spoke to a wealth of companies who said that they simply had no idea what they were meant to be doing, or who was responsible for doing it.
"The problem with privacy, from a legal perspective, is that every country's requirements are different. Even within the confines of the EU Data Protection Directive, each country has interpreted the law differently. In Spain, for instance, they define the levels of encryption and the types of password to be used to protect different types of personal data, whereas the UK was recently criticised by the EC for deficiencies in its interpretation of the Directive.
"For an international company trying to operate across borders – and the hardest of those borders is the Atlantic – the challenge is in constantly trying to keep up with the legislation, interpret it and then implement it."
He goes on to point out that, in the US in particular, we are now seeing the emergence of the corporate privacy officer – an individual who is dedicated exclusively to working on privacy issues, and reports to a very senior level of management. Microsoft is one example of an organisation using this approach in Europe, publicly demonstrating their commitment to data protection issues with the appointment of a highly respected privacy specialist as the company's EMEA corporate privacy strategist.
Microsoft's approach is to provide a focal point for privacy issues – a 'champion' – who will both advise the organisation's staff and work with third parties to help them resolve and avoid privacy problems. EPG aims to fulfil a similar role for clients.
"By understanding and establishing best practice, we aim to move privacy management away from being a compliance driven process, and help our members to take control of the issues proactively. They will then no longer have to play catch-up with their obligations in whichever country they are operating," Stevens explains."If we can give them an effective infrastructure, and the skills they need, they will be able to turn privacy into a business enabler."
www.privacygroup.org
Lynd Morley is editor of European Communications

Alun Lewis talks to Gordon L Stitt and Martin van Schooten of Extreme Networks about the current market climate and the company’s strategy in the Ethernet arena

The scale and strategies involved in network investment often provide a pretty sensitive barometer for the general health of our businesses and our wider economies. When times are good, the network is a tool for growth; when times are bad, networks can also help companies and even whole industries compete more effectively with limited resources.

It is with these thoughts in mind that European Communications recently met with two senior members of Extreme Networks, Gordon L. Stitt, President and Chief Executive Officer, and Martin van Schooten, Vice President of Marketing, to discuss their take on the current opportunities for Ethernet in both the enterprise and public service sectors.

AL: Gentlemen, it was probably around two years ago that we last spoke, just as the industry was still headed south into a gathering recession. What's your take on the current situation now?
GLS: For a start - a lot more optimistic! It's important to remember that in the space of only a few years, both business and industrial strategies and the underlying technologies that they use have continued changing, even though the recession was obviously hitting the mainstream telecommunications sector pretty hard.
On a region-by-region basis, Asia is looking very positive in a number of countries thanks to the continued take-up of broadband services by both consumer and business customers. Japan currently has the world's largest metro Ethernet network in Tokyo, with some 200,000 end points, while Ethernet is also being rolled out on a large scale in Korea, supporting a real hunger for bandwidth that's often being driven by domestic applications such as on-line gaming.
The picture is far more mixed in the US, where the post-dotcom crash is still affecting network operators. There's also uncertainty in that market, as operators continue to evaluate different access technologies. Because of the scale of investment involved in re-engineering their access networks, things are moving comparatively slowly there. That said, growth in the enterprise market for Ethernet solutions continues to increase steadily.
One of the interesting drivers for this - which we may see echoed in the EMEA region - is the increased demand for compliance with industry regulations, such as in the finance and healthcare sectors. Companies are finding out that unless they can substantially automate even more of their processes and improve the flow of information around their organisations, they'll both drown in paperwork and fail to meet their legal obligations.
We recently had a good example of this kind of development in a contract we signed earlier this year with Pine Digital Security, who are providing a Lawful Intercept solution to Dutch ISPs following the issue of a number of subpoenas to enforce this. This is an important application area, wherever you look around the globe and, in support of this, Extreme has also recently joined the Trusted Computing Group (TCG), which is an open industry standards organisation that produces specifications designed to protect critical data.
MvS: Generally speaking, Europe's showing a nice mix of opportunities for us. On one hand, many operators, both incumbents and CLECs, are actively deploying metro Ethernet networks, though usually in a rather piecemeal fashion. Their strategies though aren't set in stone yet and there are some interesting opportunities for Extreme emerging there.
In both the UK and other parts of the continent, there are emerging opportunities from Internet exchanges as well as from the ISPs themselves.
There's also a lot of Internet catch-up going on in various other EMEA regions, such as in the 'new' Europe and in parts of the Middle East, such as Dubai, which is transforming itself into a major hub for electronic businesses of all types.
AL: So what's Extreme Networks itself been up to since we last spoke?
GLS: While business growth has been steady - and lately we haven't been hit as badly as some of our competitors - we've been able to take advantage of a relatively quiet period to continue investing in new technology and in the company's organisation, and have been able to stand back for a clear look at where the whole industry - and our customers - are heading.
For a start, we've seen a lot of VoIP start to be deployed in the enterprise space and that naturally puts a strain on the capabilities of the traditional data network. Alongside that, there's the continued convergence of the wired and wireless environments, most significantly in our case with the take-off of WiFi as an access carrier for both voice and data traffic. Voice is extremely intolerant of any delay or degradations in the Quality of Service and requires extremely high levels of availability if a company is to seriously consider moving off a traditional infrastructure - and being able to deliver that is exactly one of Extreme's main selling propositions.
MvS: The public network space is also starting to see the next wave of convergence, though this is obviously focused more on the 'triple play' kinds of offerings that involve video as well as the more familiar voice and data connectivity services. Domestic customers who find their entertainment suddenly cut off because of network problems can be just as unforgiving as the most hypercritical CEO or CTO of a large business, so our QoS focus goes down extremely well in these markets. Supporting this approach, we're also able to enhance services still further through our strengths in policy management, ensuring that the right data arrives in the right place.
AL: And the wider drivers for growth in the business sector?
GLS: I'm afraid it's convergence again, but this time another aspect of it is involved. What we're also seeing in the enterprise space is an accelerating move to interlink communications and IT applications in ways - and at prices - that have never previously been really possible. While we're all familiar with dedicated call centres, where telecommunications and applications come together, these sorts of functionalities are now starting to be rolled out to support other business departments and applications.
That in turn means that the network has to be far more adaptable and intelligent than it ever had to be in the past. Any networking solution has to deliver a balance between all the hardware and software involved as a totality - and that's where the 'smarts' that Extreme Networks can deliver come in.
Hardware's good at doing some things; software is good at doing others. Only by taking a sensibly holistic approach to the entire environment - and that means implicitly understanding the wider business objectives of your customer - can you hope to deliver a solution that is fit for purpose.
A good example of this is a recent European contract that we signed with Trader Media in the UK in August this year. Best known for their Auto Trader series of publications, Trader Media own the UK's busiest automotive website - which processes over one million searches for cars on a busy day through both fixed and mobile services - so for them, the network really is their business. Using our policy management techniques, they're able to ensure that when the NAS (Network Attached Storage) devices that support their two Oracle databases synchronise, there is still ample bandwidth in the network for visitors to their internet sites to access the overlying Web services. These QoS factors also increase the frequency of data synchronisation, meaning that Auto Trader can remain a truly up-to-date source of information for its customers.
Supporting Trader Media is our EpiCenter software suite, which allows staff to manage the network from a single console, taking advantage of the open standards supported by EpiCenter to integrate the network with its existing systems management applications.
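The kind of policy described here can be caricatured as a floor-based bandwidth split. This is a rough sketch under invented assumptions: the figures, names and `allocate` function are illustrative, not Trader Media's or Extreme's actual configuration.

```python
# Rough sketch of the bandwidth-policy idea described above: interactive web
# traffic is guaranteed a floor of link capacity, and bulk NAS synchronisation
# takes whatever is left. All figures and names are illustrative assumptions.

LINK_CAPACITY_MBPS = 1000
WEB_FLOOR_MBPS = 600       # bandwidth guaranteed to site visitors

def allocate(web_demand, sync_demand):
    """Return (web_mbps, sync_mbps) under a simple floor-based policy."""
    # Cap sync so it can never squeeze web traffic below its floor...
    sync = min(sync_demand, LINK_CAPACITY_MBPS - WEB_FLOOR_MBPS)
    web = min(web_demand, LINK_CAPACITY_MBPS - sync)
    # ...then let sync reclaim any capacity web is not actually using.
    sync = min(sync_demand, LINK_CAPACITY_MBPS - web)
    return web, sync

print(allocate(400, 800))   # (400, 600): light web load, sync reclaims spare capacity
print(allocate(700, 800))   # (600, 400): heavy contention, web still gets its floor
```

The point of the two-pass allocation is that the floor is a guarantee, not a reservation: when the website is quiet, the synchronisation traffic is free to use the headroom, which is what makes more frequent data synchronisation affordable.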
MvS: As we said, there's a similar high focus on reliability - irrespective of total network scale - in the carrier market. Here, we've been building on a technology we announced to the world towards the end of last year - EAPS, standing for Ethernet Automatic Protection Switching. Essentially what this does is replicate the kind of protection and survivability that's traditionally been enjoyed by SDH/SONET networks, but on an Ethernet topology.
We had BT Exact, BT's R&D organisation, carry out extensive tests on the solution, and they found that EAPS delivered sub-50-millisecond failover on both copper and fibre interfaces. With that kind of performance, carriers can now deploy a highly dependable - yet inexpensive - fibre-optic ring spanning hundreds of miles, combining EAPS with a redundant design, and with aggregation, edge and premises switching platforms fully integrated into the entire solution.
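The protection behaviour being described can be illustrated with a toy model. This is a rough sketch of the general ring-protection idea only, not Extreme's actual EAPS implementation; the class and node names are invented:

```python
# Toy model of an EAPS-style protected Ethernet ring: in the normal state the
# master node blocks its secondary port so the ring carries no Layer 2 loop;
# when a link fails, the master unblocks that port so traffic can flow round
# the other side of the ring. Illustrative only.

class EapsRing:
    def __init__(self, nodes):
        self.nodes = nodes
        self.links = set()
        for i, a in enumerate(nodes):
            b = nodes[(i + 1) % len(nodes)]
            self.links.update({(a, b), (b, a)})   # ring spans are bidirectional
        self.failed = set()
        self.secondary_blocked = True             # normal state: loop prevented

    def fail_link(self, a, b):
        # On a link-down report the master unblocks its secondary port.
        self.failed.update({(a, b), (b, a)})
        self.secondary_blocked = False

    def connected(self, src, dst):
        blocked = set(self.failed)
        if self.secondary_blocked:
            # Model the blocked secondary port as an unusable span between
            # the master (first node) and its neighbour (last node).
            blocked.update({(self.nodes[0], self.nodes[-1]),
                            (self.nodes[-1], self.nodes[0])})
        seen, stack = set(), [src]
        while stack:                              # simple reachability walk
            n = stack.pop()
            if n == dst:
                return True
            seen.add(n)
            stack.extend(b for (a, b) in self.links
                         if a == n and b not in seen and (a, b) not in blocked)
        return False

ring = EapsRing(["A", "B", "C", "D"])
print(ring.connected("A", "C"))   # True: normal state, traffic runs one way round
ring.fail_link("B", "C")
print(ring.connected("A", "C"))   # True: failover via the unblocked secondary path
```

In EAPS itself the master detects failures via periodic health-check frames and link-down alerts; the sketch only captures the block/unblock logic that gives the ring its SDH-like survivability.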
AL: The old cliché has it that a network is always more than the sum of its parts. What's your strategy for working with other members of the networking community?
GLS: Extreme understands the need for openness and transparency whenever commercially possible, which is one reason why we've made the Application Programming Interfaces to our new operating systems announced last year available to other members of the networking community.
The sad truth is that if you buy from some of our competitors, you'll find yourself locked in to end-to-end, proprietary solutions that may deliver advantage in the short term, but ultimately limit the scope for technological - and hence commercial - freedom and adaptability.
We have a very broad range of partnerships with many leading players across the whole networking ecosystem. Probably one of the most significant of these is with Avaya, with whom we've recently signed an important agreement to support an ever-widening range of integrated applications, using SIP, for example, to link presence and availability information about staff or a mobile engineering force directly with voice and data communications and with IT systems.
To support this sort of environment, the network, naturally, has to do the job. It also has to be manageable, so we design our solutions to make the extraction of traffic management data as easy to use as possible. Again, openness is one of the key criteria essential for success in such a dynamic environment.
MvS: It's also worth mentioning that there's a bit of a paradox here that our customers often face - and it's one that our open strategy is designed to resolve efficiently and cost-effectively.
The network may be recognised as being at the heart of most businesses these days, irrespective of whether they're a national telecommunications service provider, or a medium sized business. The problem is that the network is going to have to change and adapt as the organisation itself adapts to changing market conditions and business opportunities, and as new applications are added or old legacy ones removed from the networking environment. With many historic networking solutions based on a 'one size fits all' model, it's often been a challenge to make the necessary changes without major service disruptions, followed by often extensive periods while the network is fine tuned to cope with the new world.
That's obviously a situation that is, frankly, unsupportable in today's 24/7 culture, and the reason why we've introduced our new network Operating System in a modular format, capable of being deployed in an incremental fashion to fit with new applications and interfaces.
AL: So it's finally looking like there's some light at the end of the networking tunnel?
GLS: Very much so! Service providers need new, value-added Ethernet-based services to generate additional revenues and protect their customer base from new competitors. Enterprises are adopting multi-service networking platforms to gain an early edge from converged IT and communications solutions. And both in their own ways are contributing to the revival of a sector that's been relatively dormant for far too long - though it's a far more pragmatic world than the fevered pace of the late '90s.                                         
Alun Lewis is a telecommunications writer and consultant  alunlewis@compuserve.com
www.extremenetworks.com

Described as a new wave of opportunity, the Central and Eastern European markets could provide a new hunting ground for service providers. Alun Lewis talks to Wolfgang Hetlinger of T-Systems about the company’s strategy in these new markets

While it's an obvious truism that telecommunications knows no boundaries, for much of the last century there was a huge divide between the countries of Western Europe and the former Soviet Bloc. Now, as the EC extends its boundaries eastwards, and Russia and other members of the CIS take their first steps towards creating free markets, a new wave of opportunity in telecommunications is starting to appear. The benefits for both sides are potentially enormous. While both telecom service providers and enterprises in these regions will need access to Western technology and business methods, the West also has much to gain through the energy, enthusiasm and commitment of the emerging commercial cultures in these countries.
European Communications recently visited T-Systems International GmbH, an information and communications (ICT) service provider serving Deutsche Telekom's business customers worldwide – including telecommunications service providers. The basis for the discussion with Wolfgang Hetlinger, Executive Vice President Sales CEE, Telecommunications Industry, T-Systems, was T-Systems' focus on telecommunications service opportunities in Central and Eastern Europe.
AL: While everyone has naturally heard of Deutsche Telekom, fewer may know T-Systems in any depth. Can you give me a brief overview of the company?
WH: Certainly. We're one of the largest information and communications service and solution providers in Europe, with around 40,000 employees and a 2003 turnover of 10.6 billion euros. While our headquarters is in Frankfurt, with other offices scattered around Germany, we also have a growing presence in around 25 other countries. While we've only been in formal commercial existence since 2001, our experience goes back a lot further than that, and our teams of experts have played vital roles in supporting both members of the Deutsche Telekom family and other service providers for many years – as well as an incredibly broad set of industry sectors, such as finance, public sector, healthcare and manufacturing.
The telecommunications market makes up a major part of our business, contributing around a quarter of T-Systems' total revenues. While Deutsche Telekom naturally remains our biggest customer, other important service providers such as KPN/E-Plus, mmO2, AOL Deutschland, Kabel Deutschland and AT&T also rely on our experience and expertise.
AL: And now you're targeting Eastern Europe and the former CIS. How do you see opportunities emerging there?
WH: Very positively – and that's the reason we've set up a specialised team to cover the region. It's important though to understand that each country is very different and opportunities must be approached sensitively! ICT strategies from both the carrier and enterprise perspective are implicitly linked to local realities – whether it's regulatory issues, the role of legacy infrastructure, or just the general cultural ways in which business is done locally.
With organisations like Gartner measuring growth in telecoms services in these regions at around seven or eight per cent a year, these are obviously attractive and often greenfield markets, when compared to the relatively steady state of telecoms investment in much of Western Europe.
Our strategies for entering these markets vary according to local conditions and, where possible, we initially partner with local organisations. In Hungary, for example, we already have quite a high profile through the work that we have been doing with Matav – part of the Deutsche Telekom Group. We're also increasingly active in Poland, Austria, the Czech and Slovak Republics, the former Yugoslavia and Turkey – and even as far away as Siberia. We also have a strategy of following our customers as they expand, so we're also getting involved in new regions by supporting VPN customers from other industry sectors as they enter fresh geographic regions.
Our corporate strapline is, after all, 'Managed Business Flexibility' – and that means that we often help our clients by taking over management of their technologies, while they decide on business strategy. This makes their businesses faster, more manoeuvrable, and far more powerful. We therefore have to practise what we preach to our customers! One size never fits all in ICT.
AL: What do you see as the most popular offerings that you have for these new territories?
WH: For our telco customers, a lot involves sharing our expertise in the Operations Support and Business Support Systems areas. Many of our customers are now rolling out broadband services such as DSL to their own customers who are hungry for bandwidth to support their own business applications. Trying to do that with systems and processes originally designed for circuit switched networks will quickly cripple any innovation, so we find our ability to automate many of the underlying processes is a very attractive proposition.
This can involve designing, building and delivering entire systems – or advising on the best processes to adopt to get the maximum return on network and service investments and their human resources. If our potential customers want evidence of our abilities in this area, they only have to look at Deutsche Telekom's domestic network to see the kind of quality that they will get.
One important aspect of this involves keeping our own integration strategies as open and as flexible as possible. Like an increasing number of service providers, we use the TeleManagement Forum's eTOM model to provide an important level of consistency, while simultaneously adopting a Commercial Off The Shelf (COTS) approach to OSS design, where we often carry out extensive pre-integration work before even making a bid.
We also have a policy of working with other industry leaders in particular technology areas. We already have close business relationships with Telcordia and Micromuse, who are also targeting these regions, for example, giving us access to even more product and expertise.
While each customer has their own particular set of issues, some are commonly shared and many involve getting the most from the network while keeping costs down. As convergence becomes a reality through the shift towards all-packet infrastructures, there's far less focus than there used to be on point solutions that fix a single problem. Our customers recognise the importance of treating the telecommunications value chain as an integrated whole and the holistic perspective that we take on this is much appreciated. Since we already work in all the sectors that a service provider is targeting, we can help them anticipate and understand their own customers' needs and ambitions.
AL: And what are the most important issues confronting your service provider customers?
WH: Ensuring quality of service across that whole value chain is an important aspect of our work. Many business users in these regions – or even Internet cafes – are paying a premium for broadband connectivity over WiFi or DSL and expect a similarly premium level of service. In this context, business processes are often more important than the underlying technologies, and so we often get deeply involved in advising on aspects of engineering workforce management, CRM and billing, as well as providing the supporting systems. Of particular interest to many new operators is our ability to provide an outsourced service for billing, using our sophisticated operations office in Germany.
When it comes to the core network and other central operations of telco customers, we often find ourselves helping them make the transition to new network architectures. To do this successfully, they often have to confront many inherited problems that have accumulated over the years. For instance, operators commonly find problems with their network inventory – even to the extent of having effectively 'lost' up to thirty per cent of their network assets through inaccurate data, incompatible data formats or even just through their engineering experts reaching retirement age or moving to new companies. We can help them recover from this dangerous position and turn their inventory systems into a real business enabler.
We can also help an operator's key executives make the best decisions by ensuring that they're presented with the best management information possible. We can do this by integrating data flows from across all the different multi-vendor, multi-technology subsystems that they have in place and then presenting it coherently and consistently through management dashboard interfaces.
Finally, we're able to offer these services in a variety of different commercial models. Some may be risk-sharing partnerships, where our own revenues depend on the success of the service or network that we're supporting. In others, a customer might outsource the complete management of their infrastructure to us – allowing them to get on with their core business of developing and marketing attractive new services.
AL: Anticipating the future is always tricky in our industry. What do you see coming up on the horizon technologically?
WH: Number portability is an issue as it's becoming mandated by law in a number of countries in our target regions – and it's one that we already have extensive experience of. Content is naturally another hot topic as operators start to evaluate their strategies to deal with the so-called triple play services of voice, video and data. As we've been involved with these concepts since their earliest days, we find we're well positioned to advise on everything from portal design to the technologies involved in streaming video to managing digital copyright issues across different media and technologies.
AL: And your perception of the future of the region?
WH: In global terms, they might look like modest markets at the moment, but they have a huge potential in both directions. We see ourselves as long term partners to the region, helping to enable the free flow of best practice between our different countries and cultures. As one demonstration of our intention to engage, we're currently running a series of road shows, initially in Budapest, to help carry our message, face-to-face, across the region.
Alun Lewis is a telecommunications writer and consultant  alunlewis@compuserve.com
www.t-systems.com

With broadband entertainment now becoming firmly established, the question of which services to offer and how best to deploy them is key for operators, says Murali Nemani

Service providers have long been debating the merits of entering the global broadband entertainment (BBE) space. Declining traditional revenues and aggressive new competitors have them looking for new revenue and business growth opportunities. In 2003, BBE deployments by service providers in many regions demonstrated strong demand for entertainment services. It's no longer about entering the BBE market but rather about which entertainment services to offer and how best to deploy them.
Service providers are asking five fundamental questions when considering the BBE services market:
1. What are our market opportunities and challenges?
2. What are the prospective business models and which are most likely to succeed?
3. What key groups comprise the value chain and what roles do they play?
4. What is the optimal framework for BBE service deployment?
5. Which service providers have performed early trials and how has the market reacted?
Market opportunities and challenges
In Spring 2003, InSites E-Research and Consulting asked European Internet users to choose the most attractive set of advanced services and the price points they'd be willing to pay. Video on demand (VoD) ranked as the most sought-after service among both males and females, followed by interactive TV and online gaming. Another study – the Cahners In-Stat study of broadband television subscribers – predicts that 15.9 million subscribers will take up this service by 2006.
The film industry has come to view online distribution as a welcome tool in their negotiations with the ever-powerful video rental distributors. They have also begun to make content available online to early adopters around the world.
In support of this industry-wide momentum, the consumer electronics industry is developing a large number of broadband-ready devices, ranging from game consoles and set-top boxes to mobile terminals. With all of these devices in the hands of consumers, the need for networks capable of delivering broadband content ready for consumer consumption becomes critical. Consequently, a market enabled by consumer devices complemented by high-bandwidth networks will accelerate adoption rates and open new revenue streams for service providers.
These market changes offer service providers the business opportunities they need to offset declining voice revenues and reduce customer churn. With BBE services, service providers also have the opportunity to increase their average revenue per user (ARPU) while providing a solid defensive strategy against fast moving competitive providers.
For BBE, significant challenges lie in the opportunities themselves. The sheer complexity of managing the BBE value chain, the numerous alliance initiatives, and the mastery of technology integration all require significant effort in business design, customer trials and standardisation. The next few sections will help clarify some of the issues.
Prospective business models
BBE is about selling entertainment services such as video, music and gaming over broadband networks. In the BBE value chain – as in its offline equivalent – content flows from the content owners, through a distribution network, to the content consumers. In this case, the service provider operates the distribution network. How the service provider positions itself in the service delivery process will define its role in the entertainment services value chain. We'll look at three models here.
• The Public Garden model
In this model the service provider is limited to providing a transparent connectivity pipe between the consumers and the content owners. A consumer uses the service provider's network infrastructure to seek out the content owner's website, selects the desired content, and consumes it. This is much like today's Internet.
From a consumer's point of view, the choice in content sources is large. Anyone who has content can put it online. However, it is nearly impossible for the content provider to guarantee a satisfying user experience. For example, when the network connection between the content owner and the consumer is congested, the user experience is sacrificed – especially for video. Also, the wide spectrum of content sources and its fragmented nature often make it a very difficult and frustrating experience for consumers. With independent payment options for each content source, fear of potential fraud makes consumers reluctant to purchase on-line content.
Content providers, for their part, establish no long-term customer relationships, and payment authentication and verification become quite cumbersome. Targeted marketing for their content is difficult, since they have no way to selectively target new consumers.
For service providers, this is an unattractive business model. They generate a flat connection fee from their broadband consumers, independent of bandwidth consumption. Since bandwidth consumption is a cost driver, service providers have no control over their network costs. In addition, content providers are using the service provider's infrastructure to generate business without compensating the service provider proportionally for the associated cost.
• The Walled Garden model
In the walled garden model, the consumer is put in a garden with pre-defined content. All the content is licensed by the service provider from the content providers and offered as a service pack to consumers. The service provider handles all content layout, authentication, billing, and quality of service (QoS).
For the consumer, this model eliminates the risk associated with credit card payments. The end-user experience is also guaranteed since there is only one party (the service provider) involved in the end-to-end network path. The limitation is that consumers are restricted in content choice.
For content providers, this model allows them to focus on their core business of producing content. They can then allocate the distribution of content to the service provider who handles the billing, customer support and QoS.
Service providers are now at the centre point of the value chain, with every penny flowing through their books. The trouble with this model is the service provider's level of exposure, as significant resources will be dedicated to content aggregation, layout, maintenance and support.
• The Gated Garden model
In this model, the service provider establishes a tollgate concept through which many content providers can offer content in exchange for a revenue share with the service provider. Content providers have a vested interest in the success of this service offering and will likely promote the carrier's initiatives. Content owners focus on content creation; the service provider is responsible for the user experience through QoS, authentication, billing, etc. The main enabler for this model is a horizontal network platform that not only provides the features of the walled garden model, but also maintains a business-to-business interface with content parties.
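The revenue flow through the tollgate can be sketched in a few lines. This is a hypothetical illustration: the 30/70 split and the `settle` function are invented for the example, not figures or interfaces from the article.

```python
# Hypothetical sketch of gated-garden settlement: the service provider runs
# the tollgate, keeps an agreed share of each content sale and passes the
# rest to the content provider. The 30% operator share is an illustrative
# assumption, not a figure from the article.

def settle(transactions, operator_share=0.30):
    """Split gross content revenue between the operator and content providers."""
    operator_revenue = 0.0
    provider_totals = {}
    for provider, gross in transactions:
        cut = gross * operator_share          # the operator's tollgate share
        operator_revenue += cut
        provider_totals[provider] = provider_totals.get(provider, 0.0) + (gross - cut)
    return operator_revenue, provider_totals

# Three content sales across two hypothetical studios:
ops, cps = settle([("StudioA", 4.00), ("StudioB", 2.00), ("StudioA", 4.00)])
print(round(ops, 2))                               # operator keeps 3.0 of 10.0 gross
print({k: round(v, 2) for k, v in cps.items()})    # {'StudioA': 5.6, 'StudioB': 1.4}
```

The structural point is that, unlike the public garden, every transaction passes through the operator's books, so bandwidth-hungry content providers compensate the operator in proportion to the business they generate.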
BBE value chain and key contributors
There are three categories that represent the bulk of the supply chain.
• Content Providers: movie studios, music labels, content aggregators, broadcasters and programmers, producing, aggregating and selling consumer content.
• Content Retailers: video distributors, cable and satellite distributors, and newly emerging telecom operators.
• Content Consumers: at the end of the value chain, they desire access to content any time, from anywhere and on any terminal.
Finding the business model that fits the needs of all three may seem straightforward on the surface, but the relationships that exist between these and other market players are complex. Consider:
• the integration of the many components from networking, application and consumer electronics vendors
• the often-overlapping value chain for service delivery with various players and technologies causing friction and affecting customer service
• the dynamic regulatory regime that plays a prominent role in determining the nature of the relationships between the market players
Optimal framework for BBE service deployment
For reasons noted above, the 'gated garden' architecture is emerging as the model of choice for carriers. It offers economies of scale as third-party content providers can easily plug into a pre-determined store-front; strong brand recognition allows for customer ownership and high customer service standards; innovative technology development enables continued service innovation and service differentiation. The optimal framework will include:
• consumer-desired content sources
• business relationships with BBE supply chain players
• a go-to-market service model
• an open network platform for service deployment
Early trials and market momentum
While BBE services are still relatively new, BBE service providers have begun to deliver standardised and scalable services and products. Market pioneers like Kingston Communications (UK) and Aliant (Canada) have played an important role in helping to develop innovative and cost-effective services tailored to the unique needs of end users.
Success stories like Italy's Fastweb – which is delivering voice, data and video services to tens of thousands of users over both fibre and DSL infrastructures – demonstrate the growth potential for BBE. Fastweb uses the walled garden model when it comes to managing their own VoD service via license agreements with movie studios, but has chosen to migrate to the gated garden model for scale and added variety of content. A similar trend is emerging in Japan, where competition from BB Cable TV is actively pushing the incumbent service provider to begin deploying triple play services. Yahoo Japan started with the public garden model for hosting content provided primarily by Yahoo, but has now migrated to the gated garden model, expanding their BBE offering to a wide variety of content providers.
The BBE market presents many new growth opportunities for service providers and content aggregators alike. While it is still too early to determine the clear market leaders, early successes in the space suggest that this market is becoming one of the new broadband battlefields.
While it is clear that telecommunications companies have made the decision to enter into this space, it remains to be seen how aggressive they will be in upgrading their network infrastructure and adopting the right business model for bandwidth-intensive entertainment services.
The opportunity is evident, but to seize it – and stand apart from the ever-growing crowd – requires courage, know-how, and the conviction to find the right partners with the right business model.                       

Murali Nemani is Director of Strategic Marketing for Alcatel's Fixed Communications Group (FCG), and can be contacted via Helen Simpson at e-mail: helen.j.simpson@alcatel.co.uk
A complete white paper on this topic is available from Alcatel's Broadband web site at [l=www.alcatel.com/broadband/]http://www.alcatel.com/broadband/[/l]

Although SMS has provided a lucrative avenue of revenue for mobile service providers, MMS might not be so straightforward. Margrit Sessions explains

Originally built into the GSM specification, Short Messaging Service (SMS) is without question a success story for the wireless industry. End-users have shown they are addicted to sending SMS messages and have even created special languages for communicating with their friends.
SMS has the perception of being cheap, with end users paying on average EURO0.15 for each SMS. However, the cost to network operators of delivering an SMS is but a fraction of that, typically around EURO0.02. Virtually no bandwidth is required, enabling network operators to make a good profit.
Messaging services have grown fast, with 22 billion SMS messages sent worldwide in 2003, up from 16.5 billion in 2002. In Europe, 18.3 million MMS messages were sent in 2003, with a tenfold increase in MMS users over that period and an average of over 4.5 MMS messages sent per user. Some 39 per cent of all new handsets sold in Europe in 2003 were MMS enabled, while 14 per cent were camera phones.
Multimedia Messaging Services (MMS) has been hailed as the next great SMS and is being positioned as a simple evolutionary path from SMS. But whereas SMS was a success story, MMS may not be: the factors which contributed to SMS's success don't necessarily apply.
MMS, first introduced in Europe by Hungarian operator Westel on 18th April 2002, is an entirely new wireless protocol created specifically for GSM GPRS networks and, in the future, 3G UMTS W-CDMA networks. It is designed to support a wide range of content types, including low-resolution images and music. End-users are promised the ability to create content themselves, and to create and send pictures to their friends.
MMS services do not come without a cost: there are costs for network operators and costs for end-users. For starters, network operators must upgrade to GPRS, and MMS will depend on GPRS networks being able to deliver a quality proposition to end-users.
Network operators must also install MMS servers, as legacy SMS servers will not support MMS. These servers will control content delivery, roaming, user profiles, transcoding and device capability negotiation, and will create charging data records.
End-users will need to upgrade their phones. Whereas all GSM phones automatically come with SMS capabilities, this will not be the case for MMS. For MMS to be valuable, both senders and receivers will need to purchase MMS phones, which require more memory and higher resolution displays.
Network operators who have launched GPRS are still wrestling with how to charge for content. MMS will not make this task easier. Whereas SMS uses very little bandwidth and is therefore cheap to end-users, MMS is not necessarily so. A simple MMS picture of 10 KB consists of about 300 to 400 times more data per message than an SMS message. A complex MMS with text and audio clips could be as much as 50 KB, about 1,500 to 2,000 times more data per message than an SMS. Pricing MMS is certainly a challenge, and profiting from it will be equally difficult.
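The data-volume ratios quoted above can be sanity-checked with a short sketch. The ~30-byte average SMS payload used here is an assumption, chosen because it is the figure implied by the article's 300 to 400 times claim for a 10 KB message; it is not stated in the article itself:

```python
# Rough check of the MMS-to-SMS data ratios quoted in the text.
# Assumption (not from the article): an average SMS carries ~30 bytes
# of payload, the figure implied by the 300-400x claim for a 10 KB MMS.
AVG_SMS_BYTES = 30

def mms_to_sms_ratio(mms_kb: float) -> float:
    """How many times more data an MMS of mms_kb carries than an average SMS."""
    return (mms_kb * 1024) / AVG_SMS_BYTES

simple = mms_to_sms_ratio(10)    # simple 10 KB picture message
complex_ = mms_to_sms_ratio(50)  # complex 50 KB text + audio message

print(f"10 KB MMS is about {simple:.0f}x an SMS")    # within the 300-400 range
print(f"50 KB MMS is about {complex_:.0f}x an SMS")  # within the 1,500-2,000 range
```

Under that assumption the 10 KB message works out at roughly 340 times an SMS and the 50 KB message at roughly 1,700 times, consistent with the ranges in the text.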
Tarifica compared MMS prices across operators, and every effort has been made to ensure that these prices are up-to-date and accurate. Prices are in Euros per message, apply to post-paid services and exclude VAT. During the launch phase, many operators offered MMS at no charge, but since launch we have seen some changes in the pricing of MMS. Operators have also started offering MMS bundles, charged at a fixed monthly rate for a set number of messages, which brings the per-message price considerably lower than when messages are charged individually.

Margrit Sessions, Senior Analyst, Tarifica, can be contacted via tel: +44 207 692 5292; e-mail: msessions@tarifica.com  www.tarifica.com

Revenue management can be divided into separate, distinct stages and objectives, all of which are crucial to operators seeking to maximise profit. Alan Laing explains

There's a lot of talk these days in the telecom market of the need for revenue management systems within carriers, and most of the companies offering them are from one or other segment of the billing industry. They may have come from retail or interconnect billing; they may be offering licensed software or a managed service. They all want to say they can offer revenue management, so what is this new holy grail for the sector?
In simplistic terms, you could say revenue management is the billing industry's adaptation to the market downturn in telecoms that started in 2001 and from which there are still only timid signs of emergence even now.
As consumers and businesses around the world have pulled in their horns, reining in spending on all things including communication costs, revenue growth at telcos great and small has slowed in comparison with the glory years of the Internet boom. When your top line is growing conservatively, or not at all, sound business sense dictates you must look to your bottom line, and it is no coincidence that carriers now emphasise profitability over revenue growth, in some cases at the behest of the financial markets.
Leaner and meaner times
Operators have got leaner and meaner, trimming their staff, lowering net debt and concentrating back on their core businesses. Far-flung empires, often comprising a mishmash of minority shareholdings around the globe, have been pared back to the manageable and, wherever possible, profitable – or at least with the prospect of becoming that way. For every international conglomerate like Vodafone there is now a downsized giant like BT, doing what it does best in the countries it feels most confident about doing it, rather than trying to be all things to all people half way round the world.
If my priority has gone from growing my revenue – even if it cost me a fortune to do it – to increasing the profit I derive from my business, I must run a tight ship, and keep a close watch on every phase of it to ensure there is neither waste nor squandered opportunity. This is, in essence, what revenue management seeks to do.
Another aspect of increasing profitability is selling more services to the same customer, and it is for this reason that DSL and CATV providers now want to offer you voice telephony while mobile operators want to offer you data services as well as voice. With multiple services on offer from a single provider and customers picking and mixing them, then paying for them all in different ways, revenue management is vital to achieving a single view of the subscriber and knowing best how to target him or her with future products.
In other words, if it's a teenager that looks at lots of video clips, offer them funky ring tones based on what the clubs are playing, while if it's a business executive using lots of WiFi to connect remotely to office applications, offer loyalty points that can be spent at hotels and restaurants abroad.
The four stages of revenue management
An analysis of the journey revenue makes through an operator led us, at Portal, to coin the phrase Revenue Lifecycle which, like the Ages of Man, can be said to fall into four stages. There is Revenue Generation – which is when a subscriber consumes a service and starts to generate revenue for the carrier. This stage can only begin once processes such as provisioning of the service, activation and authorisation have taken place, so it is in the carrier's interest that these are carried out as quickly as possible after signing up the customer. This will also increase customer satisfaction (ever signed up for a service then waited three weeks for it to start?), a sine qua non of upselling them to other services and growing share of wallet.
Next comes Revenue Capture which, on the face of it, sounds straightforward enough. It's knowing how much of a service has been consumed in a given timeframe in order to bill correctly and promptly. In fact it's considerably more complex these days, as a service (fixed as well as mobile) may be prepaid, in which case the carrier must know in real time how much credit the subscriber has, in order to warn them to top up before it runs out. In the case of content services, this may mean advising them that the next video clip will be the last for which they have funds.
Another layer of complexity in modern telecom services comes from the fact that, increasingly, families may want to have several different numbers, one for each member yet all grouped together on a single bill to the account holder, the pater familias. Equally, if a customer has a find-me service whereby calls to an office phone are rerouted first to a home number and then on to a mobile number, these will need to be billed for correctly, based on the rates for each of those individual services, plus an additional fee for the trouble of re-routing the calls.
Even a single subscriber today is liable to be a multi-faceted one, maybe defining certain calls from his or her phone as billable to an employer, while others are strictly personal, or paying for content downloads by credit card while voice calls are postpaid and e-mailing is prepaid. A parent may wish to stipulate that a teenage child whose number is normally a prepaid one, fed by pocket money, should be able to make postpaid calls to a taxi firm, billed to the parent's account, if credit has run out.
Then there is Revenue Collection, the bit we all love, when we bill someone for some work we've done. In today's world, however, this phase too has grown in complexity, as a carrier may be billing on behalf of other players in the value chain, such as roaming partners or content providers. Their presence in the chain makes it even more important to bill correctly and quickly, as they want their money too.
Revenue Capture and Collection should also give the carrier the information on customer behaviour to be able to develop new products and respond to market trends swiftly, lowering prices on certain services, bundling different products or launching new ones.
Last, but by no means least, is Revenue Assurance, which means making sure that there are no leaks: when, for instance, someone's account has been deactivated yet they continue to receive the service for another week, or when an interconnect partner is being paid too much because you don't have the wherewithal to check what they bill you for and dispute any discrepancies from what you think you owe them.
Billing system is key
An operator's billing system is key to success in all the four stages described above. It needs to interact with provisioning, activation and authorisation for revenue generation to begin. It needs to work with data from the network to know how much revenue a given subscriber has generated for the operator, i.e. it needs to carry out revenue capture. It must generate the bills, or reduce credit levels, and enable settlement with partners during revenue collection, and it must scrutinise and report on network activity to avoid leakage, overpayment for interconnect traffic sent or undercharging for interconnect traffic received, thereby providing revenue assurance.
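Purely as an illustration, the four stages and the gates between them can be sketched as a minimal pipeline. Every name, rate and figure below is a hypothetical assumption for exposition, not Portal's or any vendor's actual architecture:

```python
# Illustrative sketch of the four revenue lifecycle stages.
# All names and figures are hypothetical, chosen for exposition only.
from dataclasses import dataclass

@dataclass
class Subscriber:
    name: str
    provisioned: bool = False   # Revenue Generation cannot start before this
    usage_units: float = 0.0    # consumption recorded by Revenue Capture
    rate_per_unit: float = 0.10

def provision(sub: Subscriber) -> None:
    """Revenue Generation: the service only earns once provisioned/activated."""
    sub.provisioned = True

def capture(sub: Subscriber, units: float) -> None:
    """Revenue Capture: record how much of the service was consumed."""
    if not sub.provisioned:
        raise RuntimeError("cannot capture usage before provisioning")
    sub.usage_units += units

def collect(sub: Subscriber) -> float:
    """Revenue Collection: turn captured usage into an amount to bill."""
    return sub.usage_units * sub.rate_per_unit

def assure(sub: Subscriber, billed: float) -> bool:
    """Revenue Assurance: check the bill matches captured usage (no leakage)."""
    return abs(billed - sub.usage_units * sub.rate_per_unit) < 1e-9

sub = Subscriber("alice")
provision(sub)
capture(sub, 120)      # e.g. 120 minutes of usage
bill = collect(sub)    # roughly 12.0 at 0.10 per unit
assert assure(sub, bill)
```

The point of the sketch is the ordering: capture refuses to run before provisioning, collection only ever bills what capture recorded, and assurance cross-checks the two, which is exactly the leakage test described above.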
The objectives of revenue management
The three pillars around which a billing vendor's revenue management offering should be developed are:
• Optimisation of value, achieved by delivering an integrated suite of revenue management tools for current and future areas of activity
• Maximisation of profitability, by offering a single platform for multiple services, business models and customer types, and
• Promotion of business agility, enabling operators to move towards real-time response to changes in the market.                                                         
The third element means integration with other systems operating inside a carrier, in particular CRM and ERP. CRM requires subscriber behaviour data to help craft better, more personalised services to promote customer satisfaction and up share of wallet. ERP can interact with the billing and revenue management system in place to streamline business processes as well as to oversee invoicing and make financial projections.
Integration with business applications
Some billing vendors have acquired CRM businesses, while we at Portal have preferred to integrate with the market leader, Siebel. We have a similar relationship with that heavyweight of the ERP world, SAP. These pre-tested solutions are designed to speed implementation, again enabling a faster time-to-revenue for the carrier as well as reducing complexity of management.
As top-line growth has slowed, telecom operators are turning their attention to the bottom line, prioritising increased profitability over general subscriber growth. They are focused not only on stripping cost out of existing legacy billing systems, but also on achieving higher revenue per subscriber through next-generation revenue management solutions that operate across offerings, channels and geographies. The benefits of this approach include:
1) the ability to rapidly generate revenue from customers immediately after they have subscribed (Revenue Generation);
2)  knowing in real-time how much to bill customers for, or in prepaid environments, only allowing access to services they have subscribed to and still have enough credit for (Revenue Capture);
3) taking the correct payment for services promptly and settling with third parties such as content or service providers, but only for exactly what the carrier owes them (Revenue Collection) and
4) eliminating revenue leakage through fraud or first generation back-office procedures that enable terminated customers to continue using services, or interconnect partners to charge more than they are owed (Revenue Assurance).
What's more, as today's new revenue management solutions integrate fluidly with other back-office systems such as CRM and ERP, operators will benefit from more sophisticated trend analysis for product/service development, both for end customers (new billing packages, tariff bundles, service offerings) and third-party suppliers (portals on which they can track how their content is received on the network, new ways of advertising its availability).
As carriers begin to recognise the bottom line value of taking a unified approach to Revenue Management, we can expect to see operators more successfully advancing their efforts to build differentiated global brands. Revenue management will become a central strategy to better service their most profitable customers through the launch of innovative new products and services that not only meet, but exceed customer expectations.     

Alan Laing, Vice President and General Manager Europe, Middle East, Africa, Portal Software can be contacted via: alaing@portal.com [l=www.portal.com/]http://www.portal.com/[/l]

Although offered a degree of flexibility by the European Commission during the dark days of recession, should mobile operators now be pressured to meet commercial Location Based Services accuracy targets, asks Jake Saunders

On 11 June 2004, the European Commission (EC) held a consultation on value-added data services to canvass concerns and opinions about the current state and future direction of that sector. By the end of the year, the EC will also have carried out a review of compliance with its emergency location identification E112 initiative. Concise Insight believes that the two areas of mobile communications should be reviewed in light of each other: commercial mass-market LBS and E112 deployment are intertwined.
When the EC framed its E112 mandate for EU cellular operators it took a more flexible line than its US counterparts. Cellular operators were obliged to hand off the location co-ordinates and personal details of cellular users in distress to the emergency services, but no accuracy obligations were placed on the operators.
In the UK, the average level of delivered accuracy is between 1 and 4 km, and the spread of x-y location readings is quite large indeed.
This flexibility was merited in light of the concern that the European cellular operators were feeling the strain from the 3G license fees, paid at a time when the industry was experiencing one of the most significant downturns in company performance in history. It was a prudent course of action. But what should the EC do now?
The tide is turning
2004 is proving to be a much more optimistic year than any of the previous three. Many operators have cleaned up their balance sheets and are reporting positive cashflows. Until now, the commercial LBS market has failed to meet expectations as the anticipated take-off in mobile data services failed to materialise. But that is changing: Vodafone UK reported that 16.9 per cent of its service revenues came from non-voice applications as of March 2004, and 2.6 per cent was content and value-added data service related. That is more than double the March 2003 figure. The contribution from location-enabled data applications, however, is still very small. Across the top 20 markets in Western Europe, commercial LBS revenues represented just US$285 million in 2003.
Distinctive and robust LBS applications are continuing to be developed, but if the EC believes that the current cashflow from LBS applications will spur on European cellular operators to purchase high accuracy positioning determining equipment, they may have to hold their breath for a while yet. As research from Concise Insight's European Location-Based Services 2004 report shows, commercial LBS applications are gaining traction but, at this rate, European cellular operators are unlikely to have high accuracy equipment for mass-market use before the turn of the new decade.
We would like to argue that it may be constructive for the EC to tighten the mandate on E112 to not only require operators to pass on the contact details and the end-user's very approximate location to the emergency services, but also to insist on a requisite level of accuracy. The accuracy target could be set rather flexibly in the early days and reviewed over time. The key to facilitating this process is a finance-raising mechanism.
Changing society
There are a number of challenges facing European society. In most markets, nearly everyone who could own a mobile phone, does own a mobile:
• From its own research, the EC are aware that approximately 40 to 60 per cent of calls made to the emergency services are made on a mobile phone, and that figure is rising;
• As a result, in 2004, 64 million calls will be made to the emergency services in Western Europe, and that figure is rising;
• The volume of intra-EU country visits of tourists and business personnel has reached over 85 million per year;
• The location of every fixed line phone is known; that is not the case for mobile handsets.
The original premise for being flexible on location accuracy is no longer tenable:
• Operators have cleaned up their balance sheets and restructured their operations to be more profitable;
• Applications providers have developed LBS applications but they have little leverage to persuade the operators to install high accuracy equipment.
Therefore, given the current industry trajectory, high location accuracy for the mass market could well not appear until the end of the decade at the earliest. This is acceptable for most commercial applications but, in a world where everyone has a mobile, there are wider social and welfare implications that should be taken into account as well.
If the current bottleneck in the transfer of personal and location data between the operator and the emergency services is not resolved and location accuracy does not improve, ambulance, fire brigade and police response times could well get worse, not better.
Tougher mandate
Therefore, could a tougher mandate be a 'win-win' situation for all parties? Certainly the industry does seem to be caught in a negative triangle of interoperability, capital expenditure, and handset feature-set concerns:
• A lack of interoperability currently prevents international roaming subscribers from taking advantage of LBS, although current standards being finalised through the hard work of the 3GPP, ETSI, the OMA and the GSM Association are slowly but surely rectifying the situation;
• Network-based solutions such as E-OTD and U-TDOA (as opposed to base-station cell ID solutions) face challenges in reaching critical mass, as operators have been hesitant to pick up the tab on the capex;
• A-GPS handsets have not materialised because handset manufacturers have been reluctant to install GPS in a large number of GSM handset models, due to intense competition from other components/applications vying for battery life, space in the handset and overall cost considerations.
Therefore the EC and the member states need to make a clear decision to either:       
1. Remain flexible on the issue of location accuracy and accept that the emergency services could find it increasingly difficult to locate distressed individuals within the shortest time possible;
2. Or impose a high location accuracy mandate on the operators. Certainly there will be grumblings from the operators over political interference and capex but there is a clear-cut application that is desperately in need of accuracy, and that is emergency response.
Knock-on effects
High accuracy position-determining equipment would also have knock-on benefits for value-added data services, allowing operators to spread the cost of location accuracy across a large number of direction-finding, traffic notice, community friend-finder and even gaming applications.
If the EC does opt for a stricter interpretation, then the key consideration must be how the money is raised. The implementation of personal and location data IT transfer systems between the operators and emergency services has been disjointed in most countries, primarily because no clear finance-raising model has been established.
Different countries, national emergency authorities and operators are prioritising at different levels of commitment. There is a danger that fragmentation also affects the deployment of location accuracy. It would perhaps be an unpleasant possibility that your choice of operator not only dictates the quality of coverage you may have in certain parts of the country but also how quickly the emergency services arrive if you are in distress.
The US introduction of E911 has been contentious, and certainly the EU could learn to avoid some of the US's pitfalls, but it does clearly set out a fund-raising model. For Europe, either a fixed fee per subscriber or a percentage levy on ARPU could be put towards the E112 budget. A proportion of that budget could then be set aside for implementation of personal and location data IT transfer systems between the operators and the emergency services, and the remainder could be allocated to the high accuracy solution of the operator's choice. If the operator prefers a network-based solution, they can start to draw the funds together; or if the operator prefers a handset-based solution, the allocated budget could be used as an incentive to handset manufacturers to incorporate GPS into more mainstream handset models to ensure mass adoption.
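The fund-raising arithmetic can be made concrete with a short sketch. Every figure here – subscriber base, ARPU, levy percentage and the split between IT transfer systems and positioning equipment – is a hypothetical assumption for illustration, not a number from the article or any actual EC proposal:

```python
# Illustrative sketch of an ARPU-levy funding model for E112.
# All figures are hypothetical assumptions, not from the article.
def e112_budget(subscribers: int, monthly_arpu: float,
                levy_pct: float, it_transfer_share: float):
    """Split an annual E112 levy between personal/location data IT
    transfer systems and the operator's chosen positioning solution."""
    annual_fund = subscribers * monthly_arpu * 12 * (levy_pct / 100)
    it_transfer = annual_fund * it_transfer_share
    positioning = annual_fund - it_transfer
    return annual_fund, it_transfer, positioning

# e.g. 10m subscribers, EUR 25 monthly ARPU, 0.5% levy,
# with 30% of the fund earmarked for IT transfer systems
fund, it, pos = e112_budget(10_000_000, 25.0, 0.5, 0.30)
print(f"annual fund EUR {fund:,.0f}, IT transfer EUR {it:,.0f}, "
      f"positioning EUR {pos:,.0f}")
```

Under those assumed figures a 0.5 per cent levy yields roughly EUR 15 million a year, of which about EUR 10.5 million would remain for the operator's network- or handset-based positioning solution.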
In most countries, consumers already pay 17 to 22 per cent VAT. You could argue that the EC/national governments could meet the operators half-way, by declaring that x percent of that VAT paid by the end users is retained by the operator. Market driven solutions have, for the most part, been the most cost efficient mechanism to deliver results, but sometimes the wider social context needs to be considered. Mobiles have become the norm in our lives.
Due to the current state of the LBS E112 market-place, there is a distinct possibility that it could be safer to make that emergency distress call from a fixed-line phone. It could even be argued that if a laissez-faire policy on high accuracy is maintained, there is a danger of social discrimination if an end user were to choose a network that had a lower level of location accuracy or could not afford a handset that had A-GPS built into it.
A 'raise the bar' higher accuracy location policy would have a knock-on benefit for value-added data services, as there are a whole host of applications that could benefit. Addressing the higher accuracy funding mechanism would allow operators to install the position-determining equipment of their choice to suit their commercial LBS application needs, as well as to deliver improved response times and location fixes for subscribers in distress.

Jake Saunders, Director, Concise Insight, can be contacted via e-mail: Enquiry@Concise-Insight.com

Hot on the heels of the company’s latest acquisition, Intec’s Mike Frayne and Kevin Adams outline the organisation’s strategy in the OSS market

Telecommunications has had a rough ride over the past two or three years, and while signs of a recovery are coming through loud and clear, it is those companies who held their nerve in difficult market conditions, and continued to invest for the future, that now stand to reap the greatest rewards. Intec recently added to its track record of growth with the acquisition of ADC Telecommunication's Singl.eView retail billing software division. The company's Executive Chairman Mike Frayne, and CEO Kevin Adams, spoke to European Communications about the thinking behind the acquisition and the role they see for Intec in the OSS market.

LM: What was the initial thinking behind your decision to acquire Singl.eView? 
MF: Retail or transactional billing is the biggest part of the OSS market – about 65 per cent of the whole sector – and it has the highest level of senior executive mindshare, because it directly affects both the major revenue stream and the customer relationship. To grow the company in the way we wanted to, and to secure ourselves a position as a truly Tier 1 OSS vendor, we really had to be in retail. Intec already dominates interconnect and mediation, and probably has the largest customer base in real-time charging/active mediation, so retail was also a logical next step from a product portfolio viewpoint.

LM: Had you, in fact, had feedback from customers who wanted you to offer retail/transactional billing?
MF: We have had requests from customers over the years, particularly from some of our most well established customers who like the way we look after them, and also from some of the new IP billing customers from our recent acquisition of the Digiquant business.
But the drive to do this acquisition was primarily internal, as a result of our perception of the need to elevate Intec into the top tier of vendors. We think that a lot of the smaller and niche vendors will have a very difficult time over the next few years, and the lack of newsflow and good financial results from many of them is already evidence of this.

LM: So was it also a matter of better positioning yourselves with a wider offering in a recovering marketplace?
MF: We do see a slow and steady recovery, and now is clearly a better time than in the previous two to three years to be executing on our growth ambitions. But we also have a long-term strategy to grow and develop by acquisition, as well as organically, and this is primarily another step in that plan, albeit a big one!

LM: Did you look at any other contenders, aside from ADC?
MF: We have looked at many, many OSS companies over the past few years, and only acquired a very few of them, as we have strict criteria for acquisitions, in terms of product quality, financial performance, cultural synergy and long-term potential. There are not many major players in retail, and we are clearly not in a position to acquire one of the larger vendors. Singl.eView was well timed for us, as it was affordable, and a good business that we felt comfortable with.

LM: Were there any other factors that led you to choose Singl.eView?
KA: Singl.eView is first and foremost a great product – we think probably the best current tier 1 retail billing and transaction management system on the market. It's modern technology, architected for high volume and real-time processing, and capable of handling any kind of service and payment method. The feedback we had during due diligence from customers and partners was very, very positive, and gave us a lot of confidence to go forward. It's also highly configurable, but without a massive services overhead, so total cost of ownership is low, relatively speaking. It's beaten all the major players in various recent major deals, such as Tele2 and Deutsche Telekom, so we know it's right up there with the rest of the tier 1 solutions in terms of performance and functionality.
Singl.eView also has a solid customer base of well-known customers, in fixed, mobile and 3G. That not only gives us a day one revenue stream, it also brings referenceability in our sales campaigns, which is absolutely crucial today.
Singl.eView has a strong professional services operation – almost twice as large as Intec's – because of the nature of retail projects. We'll have almost 700 PS staff, and around 300 developers in total – that's a big capability to offer the industry.
Culturally we felt it was also a good fit, with a strong management team and very good people in all areas. We already have people from both sides working well together, and the chemistry is good.

LM: Does this acquisition fundamentally change Intec?
MF: Yes and no. Absolutely yes in terms of market position, capability and visibility – it takes us immediately into the top three or four OSS product companies in terms of software revenues, technical capabilities and customer base. You can't double the size of a business and not think it will be a massive change. But, on the other hand, it won't change our fundamentals – strong focus on customer care, good business performance, and the best products. Everyone says these things but it's what we built Intec on, and our growth and success, right through the bad years, is the evidence.

LM: Could you detail exactly what you believe Singl.eView will bring to Intec, and how it fits with your existing offering?
KA: In terms of fit, it is very good. We have about a dozen common customers already, including real innovators like 3, so we know the products work well together, not just in theory but in really demanding production sites. Retail naturally fits right alongside interconnect, and both are fed by mediation, so the fit is obvious. Technologically, the architectures are very compatible, and we are already exploring some very interesting technical synergies, both ways. There is a  minor overlap with the acquired Digiquant (Intec DCP) products in a couple of areas, but they are really targeted on different problems, and it's not an issue in the opportunities we see.

LM: So, how will the acquisition benefit existing Singl.eView customers; your current inter-carrier billing and mediation customers; and possible new customers?
KA: Intec has a big customer base, about 400 customers, while Singl.eView has 70, and the crossover is not huge. So there is clearly a great opportunity to cross-sell products both ways. All telcos need billing, both retail and interconnect, as well as mediation and now, real-time charging, so we feel we have a very strong, logical architecture to offer to both new and existing customers.
Singl.eView has a great reputation, but it's clear that uncertainty over its position and future has held it back. Those issues are gone now, and we have had a lot of encouraging responses already from customers and prospects. People know that Intec is a solid business with good customer care and a real commitment to product investment – something like $30m next year – and they can see that we have bought Singl.eView to take it forward. We've already committed to the existing roadmap, and we'll be looking to extend that going forwards.

LM: Have you had much reaction from your own customers, and existing Singl.eView users, yet? And what has been the feedback from the marketplace in general?
MF: Early days to say too much, but really very encouraging so far. We've been to see most of the major Singl.eView customers, and talked to a lot of Intec customers, and the response has been overwhelmingly positive. People know it's a big deal to take on, but they also know we've done it before, and we keep our promises on commitments for product delivery and support. In general I think there's a lot of demand for a technically strong retail solution that's flexible and doesn't cost a fortune to implement and maintain.

LM: What, in your view, are the advantages and disadvantages to the operators of a 'one-stop-shop'? Is there a danger, for instance, that by creating powerful, large suppliers (who might push the smaller guys out of business), operators will be locked into a narrow choice of solutions?
KA: It's a longstanding argument – best-of-breed or one-stop-shop. I guess Intec has been both of these things, so we are in a pretty good position to answer that one. Having focus on one product – say mediation – is great because you know what you sell, and you have to sell it. Life is straightforward, and you can focus on building a really strong product. We've bought several one-product companies now, and the benefits are typically a lot of skill and a great product. But it is also limiting and a bit risky, as markets sometimes turn down, and you are exposed as a one-product vendor. As a vendor with multiple products, particularly if they are a logical fit together, you have more technical strengths to bring to customers, and more ability to build them a coherent systems architecture without an 'integration tax.' There is some downside to customers if one supplier dominates, but in reality it rarely happens. There is enough innovation in the software business to ensure that there will always be some guy in a backroom building a great product and keeping the big guys honest. In fact, that's probably what keeps the major vendors awake at night! But there is another factor, too – the growing demand for a 'solutions' approach. We have customers saying 'we want to offer this new next-generation service, bundled with these other things, and with various payment options – how do we do it?' Intec can now solve those problems in a complete way, without a lot of multi-vendor hassle and integration worries, and we are seeing some very interesting business come through.

LM: Broadening out to the OSS/BSS market in general, do you believe the marketplace is now in a period of recovery, and if so, how will it compare to its heyday of two or three years ago?
MF: It will be a long time before we see the craziness of 1999 and 2000 again. Vendors were creating completely unsustainable business models based on IPO or VC cash, spending $10 for every $5 they earned, and the carriers were caught up in a race for market share and spectrum at any cost. Life may be harder work now, but at least the good companies are making money and building a long-term future. There are still a lot of damaged businesses out there, and many won't make it. One of our messages to customers is always to look at the underlying financial strength and performance of vendors before committing to them. You need to see a five-year roadmap and business plan, in our view.

LM: Would you say that OSS/BSS systems are now more central to strategic telecoms thinking? 
MF: Absolutely – in the marketplace we all have to face today, operational efficiency, margins and customer service are king. Good OSS can deliver in all those areas and the forward thinking players are looking very hard at their architectures and OSS cost bases. Some companies spend truly scary amounts on legacy OSS, and we think a lot of it could be scaled back with the right new technology choices. But it's a big decision to move away from a system that works, even if it costs an arm and a leg to maintain. Another factor is new revenue streams – things like content, messaging and games. These are where the future growth and margins are going to be, and carriers want a growing share of consumers' disposable income. Working in this space almost inevitably implies new OSS spending, because many legacy systems just aren't up to it, and it's very high on the executive agenda right now.

LM: So, how do you see the future for OSS/BSS, in terms of its role in the broader telecoms picture? What will it have to do to adapt to the inevitable changes in technology, user demand, and a possible shake-down in the operator market?
KA: We are really excited about the future, both for the industry and Intec. There's a lot going on and many new opportunities, particularly in emerging markets and new technologies. We are adapting Intec, and our product portfolio, to address the opportunities we see, and to adapt to the changes that will come. Yes, there may be some operator fall out, and vendors too, but the overall trend is still growth, and you have to take the long view. There's a lot of innovation right now in new services, pricing strategies, and technologies and that makes for a fun industry.

LM: Do you have any plans for further acquisitions? And where do you see Intec in, say, five years time?
MF: We've done eight acquisitions in four years, and they have made a massive contribution both to our growth and to our ability to deliver what customers want, so we can't see any reason to change tack. But we have strict criteria for what we will acquire, and a long-term plan for what we might move into. Our medium term ambition is to be clearly the number one OSS products company, and Singl.eView takes us a long way towards that. We also aim to continue consolidating the market where possible and, of course, you can't control when opportunities arise.
Beyond that, who knows?   [l=www.intecbilling.com/]http://www.intecbilling.com/[/l]

Lynd Morley is editor of European Communications

Making the most of a mobile workforce means implementing a sound mobile workforce management system, says Jennifer Dewar

Analysts at the Probe Group predict that the number of global enterprise wireless data users will exceed 160 million by 2008. At the same time, the European market for mobile devices is growing by 25.6 per cent year on year, according to an IDC report. European telecommunications service providers might be tempted to sit back and relax a little. But service providers cannot afford to become complacent.
Despite politicians' promises of an economic upswing, the current economic climate is not particularly buoyant. Global uncertainty and European Union growing pains, coupled with deregulation and vendor consolidation, have created a competitive and unpredictable environment for service providers. With increasing expectations of empowered customers compounding the situation, companies are struggling to differentiate themselves and remain one step ahead of their competition.
Customers calling the shots
In today's marketplace, customer satisfaction is one of the most visible and crucial business goals of service providers, yet one of the most elusive. With increased choices available, customers have become heady with power and are demanding products and services faster, better, and for less money. A hard-fought battle for customer loyalty is ensuing from the conflict between escalating customer expectations and profit-driven management.
A slick public relations campaign cannot erase the damage caused by poor or inconsistent service. Cancelled appointments or hours spent waiting for a technician to arrive will quickly drive a customer into the arms of a competitor. However, service providers that effectively respond to customer requests, anticipate their needs, and build customer trust will survive the long haul. Indeed, top-notch customer service will buoy the bottom-line and secure customer loyalty, ensuring a consistent revenue stream and reducing costly customer churn.
So, how are service companies striving to optimise their service lifecycle and build lasting relationships with their customer base? Many European service providers, such as Belgian telecommunications giant Belgacom, have implemented mobile workforce management systems to automate their field service workers, significantly increasing operational efficiency and improving customer service.
Service providers are confronted daily with the difficult task of optimally assigning work requests to their field force, dispatching work from the office to the field, monitoring the progress of the work, and responding to changing conditions. In addition, field service organisations must measure workforce performance in order to improve the quality of strategic forecasting and planning efforts. In contrast with time-consuming and inefficient paper-based systems, mobile workforce management systems manage, schedule, and dispatch work for mobile engineers and technicians – all wirelessly and in real-time. Work assigned to technicians is delivered wirelessly to the workforce in the field using laptops or hand-held mobile devices like Pocket PCs; as work progresses, technicians send completed information wirelessly back to the enterprise.
But automating the field force is just the beginning. In order to garner greater efficiencies, improve productivity, and create a truly customer-centric operation, service providers are looking to extend their workforce management solution to the entire enterprise. An enterprise workforce management solution delivers an integrated operational view of the mobile workforce and its workload, whilst leveraging operational efficiencies across departmental boundaries. With enterprise-wide visibility into all operational and departmental areas – customer service, inspections, maintenance, construction, outage, meter service, billing, among others – service providers can reduce operational costs, deliver improved customer care, and witness greater return on their investment. 
Cy Tordiffe, Managing Director of EMEA for MDSI, a leading workforce management software provider, notes: "Service providers need to break free from the silo mentality; managing individual groups separately is no longer a viable option. In order to avoid costly duplication and repetition, leverage economies of scale, and manage the workforce more effectively, companies must adopt an enterprise-wide approach."
As the momentum towards enterprise workforce management builds, field service organisations are searching for solutions to maximise efficiencies across all levels of their organisation. In response to this demand, mobile workforce management software vendors must offer flexible solutions that encompass all functional units of the business and support a wide spectrum of enterprise applications and mobile devices. 
MDSI Mobile Data Solutions Inc is an example of one company that is leading the way. Working with telecom giant Nokia, MDSI is extending Advantex – its enterprise workforce management system – to the Nokia 6600 mobile phone. This development will enable field engineers who typically use laptop PCs, hand-held PCs, and Pocket PCs to access information across the enterprise using just a mobile phone. In addition, this solution will enable companies to bring their enterprise applications to a larger, more diversified workforce and provide greater mobility at a lower cost.
The Advantex Mobile Application running on the Nokia 6600 phone is a Java 2 Micro Edition (J2ME) application that enables dispatchers to communicate with field technicians using XML via HTTP/HTTPS over GPRS networks. This solution is particularly cost-efficient for certain enterprise workers, such as inspection workers, who may not require the robustness of ruggedised laptop PCs mounted in their trucks or the full functionality of certain Advantex tools such as mobile mapping. The nature of their work enables them to use a lightweight mobile phone to send completed inspection forms back to the office and fulfil their duties efficiently. Using mobile phones to communicate with the enterprise is also a very affordable alternative for short-term workers handling emergency situations such as outages, or for contract workers assigned to short-term projects.
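The exchange described above – a completed form serialised as XML and posted back to the enterprise over HTTPS – can be sketched in a few lines. The endpoint URL and form schema below are hypothetical illustrations, not the Advantex protocol, and the real client is a J2ME application on the handset rather than Python:

```python
# Sketch of an inspection form sent back to a dispatch server as XML
# over HTTPS. The URL and XML structure are invented for illustration.
import urllib.request
import xml.etree.ElementTree as ET

def build_inspection_form(job_id: str, status: str, notes: str) -> bytes:
    """Serialise a completed inspection as a small XML document."""
    form = ET.Element("inspection", attrib={"job": job_id})
    ET.SubElement(form, "status").text = status
    ET.SubElement(form, "notes").text = notes
    return ET.tostring(form, encoding="utf-8", xml_declaration=True)

def send_form(payload: bytes,
              url: str = "https://dispatch.example.com/forms"):
    """POST the form to the dispatch server over HTTPS (not run here)."""
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/xml"}
    )
    return urllib.request.urlopen(req)

payload = build_inspection_form("JOB-1042", "complete", "Meter replaced")
```

The same request/response pattern works over any IP bearer; GPRS simply makes it available wherever there is GSM coverage.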
"Companies are consistently striving to unearth cost-cutting measures. As there is often a high capitalisation cost associated with equipping the entire field force with laptops or hand-held PCs, mobile phones are an ideal solution – especially for people who are doing simple inspections. Mobile phones are inexpensive and lightweight and can fulfil the requirements of many mobile workers within the enterprise, explains Warren Cree, VP Marketing and Business Development, MDSI.
With Advantex and Nokia 6600 phones, service providers can extend mobile and enterprise applications to a larger and more diversified workforce in order to increase productivity, heighten customer intimacy, and reap marked cost and time savings. Companies can save the equivalent of approximately 1.5 hours of work time per technician per day, whilst reducing field operational costs by 35 to 40 per cent. Technicians are typically able to complete 10 to 20 per cent more jobs, and have real-time access to relevant customer information. Additionally, improved data quality, automation of time sheets and billing, and a reduction in paper usage translate into significant back office savings.
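Turning those headline figures into money is a simple calculation. The fleet size, working year and hourly labour cost below are illustrative assumptions, not MDSI data; only the 1.5 hours per technician per day comes from the article:

```python
# Back-of-envelope value of 1.5 hours saved per technician per day.
# Assumptions (illustrative): 100 technicians, 220 working days/year,
# EUR 40/hour fully loaded labour cost.
TECHNICIANS = 100
HOURS_SAVED_PER_DAY = 1.5   # article figure
WORKING_DAYS = 220          # assumed
HOURLY_COST = 40.0          # EUR, assumed

hours_saved = TECHNICIANS * HOURS_SAVED_PER_DAY * WORKING_DAYS
labour_saving = hours_saved * HOURLY_COST
print(f"Hours recovered per year: {hours_saved:,.0f}")   # 33,000
print(f"Labour value recovered:   EUR {labour_saving:,.0f}")
```

Even before the quoted 35 to 40 per cent reduction in field operational costs, the recovered working time alone justifies the investment for a fleet of this size.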
No matter how dramatic the savings or how revolutionary the improvements, buy-in from the field force is essential for a successful enterprise workforce management implementation. Field workers expect their mobile devices to be both intuitive and reliable and they want to be able to use new applications with minimal training. What could be more user-friendly than the ubiquitous mobile phone? With the worldwide production of mobile phones reaching a record high in 2003, according to research by Nikkei Market Access, it is likely that many employees already rely on mobiles for personal or business use. With the MDSI/Nokia workforce management solution, the cultural transition will be minimised. And making the lives of the workforce easier and more efficient translates to happy employees, improved customer service, and loyal customers.
In a climate of shrinking profit margins, competitive pressures, and rising customer expectations, service providers must strive to evolve through continuous service innovation. Flexible, visionary solutions from companies like MDSI are helping companies to do just that. 
Jennifer Dewar writes about science, technology and healthcare.   [l=www.mdsi.ca/]http://www.mdsi.ca/[/l]

Internet access for all has been the rallying cry for governments and the UN. But even in Europe the division between the haves and have nots has yet to be fully addressed. Andrew Davies looks at the issues

The Digital Divide is a major issue at a national, European and world-wide level. In simple terms, it is the divide between those who have access to computers and the Internet and those who do not. The divide can be economic, demographic or geographic. It is the geographic dimension that is an issue for telecommunication service providers and the routes to its resolution could be a significant opportunity.
In modern economies, access to computers, the Internet and, increasingly, broadband communications is seen as essential for future prosperity. Consequently, lack of access could be a significant brake on economic growth. On a national basis, in countries such as the UK where penetration of computers and the Internet is high, the provision of access to broadband connection is important enough to be embodied in government policy. This view is held by most Western European countries, the European Union and the United Nations. With the enlargement of the EU, the need to provide equitable access to what is referred to as the Information Society is seen as essential.
New member states
The state of play in Europe varies, particularly in some of the new member states. The penetration of fixed line access is low, which means that significant proportions of the population do not have access to voice telephony, let alone the Internet. In some of these countries, as much as 25 per cent of the population have never even heard of the Internet. With a poor fixed line infrastructure, penetration of broadband is also low. This means that there is much work to do if the geographic digital divide is to be closed.
It is interesting to compare the situation of the new member states with the established members, where the expectation is that the geographic digital divide should already be solved. In the UK, BT recently announced that all households would be in reach of a broadband connection by 2005. Its most recent estimate is that the latest technology will enable it to connect all but 100,000 individual users.
By contrast, a recent workshop in France defined the minimum broadband connection as 2 Mbps by 2007 and estimated that as much as 20 per cent of the French population would be out of reach of terrestrial connection at that rate. Spain and Italy have similar concerns and their national governments are developing programmes to address the issue.
National government and European Union concern about the digital divide in Europe potentially creates an opportunity for providers of telecommunication services and equipment. However, it is useful to answer a few pertinent questions to determine the extent of the geographic digital divide:
1. How real is the demand for Internet and broadband connection?
2. How great is the demand and how much is unmet by current means?
3. How can government institutions help and should it be left to market forces?
A quick look at the situation in the UK answers the first question. From Ofcom figures, personal computer ownership is tending towards 70 per cent of the population by 2007 and Internet connection is tending towards the same figure. If 70 per cent of the population is connected to the Internet by 2007, it is likely that all of them would be potential users of broadband. In the business community, particularly in the Small to Medium Enterprise (SME) sector, penetration of Internet connection is even higher and broadband take-up is already at 34 per cent and rising. As SMEs are the lifeblood of any economy, these figures indicate that Internet access, and increasingly broadband connection, are important to a thriving economy. This is especially true in rural areas where SMEs are often the main employer.
For the new member states, additional factors come into play. A good example is the impact of the Common Agricultural Policy (CAP). This places significant demands on record keeping and form filling for farmers in the new member states. This means that the farmers will need computers and, ideally, Internet connection for e-mail and information services. Additionally, village based communities are more numerous in the new member states and the importance of the rural SME is even greater.
This indicates that there is undoubted demand and economic need for Internet access and, increasingly, broadband connection. This leads to the second, more complex issue: how much of this is unmet by current connection means? BT has indicated that, by 2005, all but 100,000 users will be broadband enabled if they want it. With 70 per cent PC ownership, this implies 70,000 users will want it but will be out of reach. If this figure is increased pro rata across Europe, the unmet demand becomes around 600,000. However, the French figures indicate that as much as 20 per cent of their population will not have access to broadband by 2007, an unmet demand based on PC take-up of around 8.5 million which, scaled up to the enlarged Europe, gives an unmet demand of over 60 million.
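The scaling arithmetic behind those two bounds can be made explicit. The population figures below are round mid-2000s assumptions (UK and France roughly 60 million each, the enlarged EU roughly 455 million), so the outputs are orders of magnitude, not forecasts:

```python
# Two order-of-magnitude estimates of unmet broadband demand in the
# enlarged EU, following the article's reasoning. Populations are
# rough mid-2000s assumptions.
PC_TAKE_UP = 0.70                      # projected PC/Internet penetration
UK_POP, FR_POP, EU_POP = 60e6, 60e6, 455e6

# Optimistic (BT-style) bound: 100,000 UK users out of terrestrial
# reach, discounted by PC take-up, then scaled pro rata to the EU.
uk_unmet = 100_000 * PC_TAKE_UP        # ~70,000
eu_low = uk_unmet * EU_POP / UK_POP    # roughly half a million

# Pessimistic (French-style) bound: 20% of the population out of
# terrestrial reach, again discounted by PC take-up.
fr_unmet = FR_POP * 0.20 * PC_TAKE_UP  # ~8.4 million
eu_high = fr_unmet * EU_POP / FR_POP   # ~64 million

print(f"Low estimate:  {eu_low:,.0f} users")
print(f"High estimate: {eu_high:,.0f} users")
```

The hundredfold gap between the two bounds is the point: the size of the addressable market depends almost entirely on whose reach estimate proves correct.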
This indicates that there could be a large number of households and small businesses that want Internet access and broadband connection but are unable to get it. In some of the new member states there is the additional issue of the ability to pay for such services and this plays into the answer to the last of the three questions, on the role of government.
A number of EU and national initiatives are in progress but there are limits to what government can achieve. The two leading issues are the need to maintain competition and the need for 'technology neutrality'.
Contravening regulations
A national government is unable to simply give money to a service provider to connect remote users. This is seen as a subsidy and contravenes European and world trade regulations. Similarly, as technologies such as satellite already claim to cover most of Europe, albeit at higher cost, providing incentives to satellite service providers to bridge the digital divide is not 'technologically neutral'. However, funding can be made available that falls within the rules but enables rural users to have broadband access. In the UK, Regional Development Agencies provide grants to businesses that may be disadvantaged by the lack of broadband. Because the grant is made on an individual basis, the end user can choose how the connection is delivered, maintaining technology neutrality and competition. If, for example, satellite access is the only means of delivery to the user, he or she may still have a choice of competing service provider, thus staying within the rules.
For new member states, the EU funding available for disadvantaged areas, known as structural funding, can be used if Internet and broadband connectivity can be shown to be necessary for economic development. In Poland, grants will be made available to 100,000 farmers to enable them to comply with the requirements of the CAP. If this money is used to buy computers and Internet connections, it is still within the rules.
In the Nordic countries, tax breaks for users to encourage demand and for suppliers to encourage build out have enabled the fixed line infrastructure to effectively reach out further.
So, the three questions can be answered. We have confirmed that there is a growing demand for Internet and broadband connection across Europe and that a significant proportion of that demand cannot be met economically by fixed line connection in the current market. There are ways institutional funding can be used to underwrite some of the cost of connection. So how should the service provider community respond?
The first point to make is that these 'digitally divided' users in Western Europe are outside the economic reach of fixed line service providers. In many of the new member states, the fixed line infrastructure is very limited outside urban areas. Leaving it to market forces could result in a growing gap between those with Internet and broadband access and those without. However, if institutional intervention takes place to stimulate the market, someone still has to deliver to places that are not currently economically feasible. This situation is exacerbated in some of the new member states for two reasons:
• Average incomes are between a half and a quarter of those in Western Europe, so user subscription levels need to be lower.
• The successful roll out of GSM in some of the countries has meant many users are bypassing the fixed network altogether, discouraging build out of the fixed line network by the incumbents.
This implies that there is an opportunity for wireless connection for broadband, embracing both fixed wireless and satellite. There may also be an opportunity for newer technologies such as powerline delivery through the electricity infrastructure.
Satellite service providers such as Eutelsat and SES Astra are already looking at satellite/WiFi combinations to provide community broadband. The lower cost WiFi for local access combined with the more expensive satellite for trunking looks promising provided issues of scaling are successfully addressed. If so, the model could work anywhere in the satellite footprint. Low cost satellite access models are also being addressed, as the unmet market may be of sufficient scale to provide the volumes needed to drive down user terminal costs. Powerline – the delivery of communications via electric power cabling – is beginning to look more attractive because of the scale of the potential demand. It may even be argued that 3G networks could cover some of the unmet demand, based on the success of 2G networks in new member states.
What is emerging is that the issue of the digital divide could be addressed by the technologies that promised much in the late 1990s but never quite delivered. The government imperative may be the key to unlocking the potential of these technologies, provided service providers are able to integrate them into their service offering. There are many value chain issues to be resolved, particularly in new member states where credit cards and bank accounts are not the norm and where consumer-scale distribution chains are relatively primitive. Innovation in service delivery is required in addition to innovation in technology.
One issue is clear. Western European governments believe that the Internet is an essential tool for the citizen and that broadband connection will form part of the path to future growth and prosperity. If this is true, those outside the reach of these technologies will be seriously disadvantaged, with resultant shifts in the patterns of business away from rural areas and less well connected countries. This is contrary to the aims of the enlarged European Union. However, it is not enough for government institutions to provide state support to service providers to solve the problem. The service providers themselves need to be able to offer innovative solutions that combine with institutional support to address the digital divide.  This could be a great opportunity for some niche technologies to come to the fore over the next five years.

Andrew Davies is Business Development Director at strategic technology consultants ESYS, and can be contacted via: adavies@esys.co.uk

    

 
