In an increasingly information-driven world, the question of how to protect that information in the name of privacy has risen to the top of the corporate agenda. Lynd Morley talks to Toby Stevens, managing director of EPG, about the privacy issues affecting business today

The issues surrounding the privacy of personal information in business are fast moving up the corporate agenda, as organisations begin to recognise that they are caught in a web of rules and regulations at both national and international levels. Understanding and applying the regulations correctly are now as vital to a company's commercial survival as guaranteeing the security of information systems or adhering to correct accounting procedures have become over the past decade.
"Privacy is recognised as one of the key elements of good corporate governance," explains Toby Stevens, managing director of the Enterprise Privacy Group. "Corporate social responsibility demands that you show respect for personal data."
Stevens, who established EPG with Simon Davies – widely acknowledged as one of the foremost privacy experts in the world, and founder of the watchdog group Privacy International – also points out that every commercial relationship is built on trust. "If you misuse someone's personal information, you can destroy that trust instantly. Your customers and employees may forgive an accidental security failure, but they will not forget an abuse of their personal privacy, regardless of the cause."
The post-Enron emphasis on faultless corporate governance, heightened public awareness of privacy issues, and a growing culture of litigation are all contributing to a very real need for organisations, in both the private and public sectors, to understand and implement the privacy requirements being placed upon them.
In the wake of such developments as the introduction of new anti-terrorist legislation across the world, and – specifically in the UK – the forthcoming introduction of ID cards, images of a Big Brother society are beginning to loom in the public consciousness. As a result, organisations are having to respond to privacy concerns, much as they did to the information security concerns that emerged in force during the 90s. A decade ago, information security was still viewed as a drain on the bottom line by most businesses – an optional, value-added service. With the growth of the Internet, the increased public consciousness of hacking, and some high profile security incidents, most companies realised that they had to start offering security, no longer as an optional extra or differentiator, but as a commodity. Indeed, they recognised that they would lose customers if security were not integrated into every aspect of their products and services.
Stevens points out that over the past few years there has been a similar growth in public awareness of how personal data is managed – prompted, in part, by the introduction of EU legislation on privacy.
"Europe has absolutely led the way in this field, with a very strong cultural concept that your personal data is private, and that you have the right to control who sees it, who handles it, what they do with it," he explains. "In the late 80s that concept was translated into the EU Data Protection Directive, and companies were given the burden of actually having to be accountable for how they handled personal data. Back then, most of them saw it as something of an irritation, and couldn't see any commercial value in compliance. Quite often, data protection was fobbed off onto security departments, or junior management, because it was seen as a purely regulatory and legal compliance issue. The attitude was: 'We'll do the bare minimum we need to, and then we'll forget about it'. But this is rarely effective: security professionals are worried about hackers or disgruntled employees, but the biggest privacy threat can come from your best customer or most loyal member of staff. The privacy manager requires a different mindset from that of the security manager."
Stevens believes that organisations are now becoming all too aware of the fact that, not only is there a considerably heightened awareness of privacy issues among the consuming public – who will no longer accept privacy of information simply as an optional extra – but that there is also a move in Europe towards the US model, where the growth of privacy legislation has been driven by litigation. US organisations are obliged to consider the possible litigation arising from any privacy incident, and this has created a culture of respect for privacy, since it directly impacts the organisation's bottom line. The litigation-driven approach has also created a diverse range of laws to address very specific privacy problems, despite the absence of an equivalent to the EU Data Protection Directive.
The US Video Privacy Protection Act, for instance, was passed by Congress in the wake of the controversy that arose when Judge Robert Bork's video rental records were released during hearings into his Supreme Court nomination. The Act forbids a video rental or sales outlet from disclosing information about which tapes a person borrows or buys, or releasing other personally identifiable information without the informed, written consent of the customer. The Act also allows consumers to sue for damages if they are harmed by any violations of the Act.
Another example of the effects of litigation was demonstrated when a US federal and state class action against Internet advertising agency DoubleClick was settled under an agreement that requires the company to give consumers new privacy protection. The lawsuits alleged that DoubleClick violated state and federal laws by tracking and collecting consumers' personal details and combining them with information on their web surfing habits. As part of the settlement, DoubleClick agreed to adhere to a number of practices and policies, including the commitment that the company's privacy policy would display easy-to-read explanations of its online services; the company would undertake a consumer education effort, which included consumer privacy banner ads that invite consumers to learn more about how to protect their on-line privacy; and the company would institute internal policies to ensure the protection and routine purging of data collected online. The legal fees and costs of up to $1.8 million fell to DoubleClick.
But even without possible legal ramifications, Stevens is adamant that, in the information society, proper handling of personal data will become one of the major factors for any client deciding to whom he or she is prepared to divulge personal information.
"We increasingly see people voting with their feet if, for instance, they don't like a web site's privacy policy," he comments. "These effects can be measured – how many virtual shopping baskets don't go through checkout because the individual gets cold feet about handing over personal information?
"Every business handles information, but particularly in the business to consumer environment any company that does not respect personal information will, sooner or later, come unstuck. Not necessarily as a result of legal action, but purely at a commercial level. People simply won't hand over their data."
EPG, whose brief is to understand best practice in privacy management and help its clients implement it successfully, is currently working, for example, with a central UK government department to assess its compliance with data protection legislation. EPG is also working with a leading management and systems consulting firm to consider issues arising from the use of Radio Frequency Identification (RFID) tags on pharmaceutical products.
Understanding the detail
Stevens, whose experience spans over 15 years in the management of corporate security and privacy projects, explains that the problem for business now is in understanding the detail, as well as the principles, of handling personal data.
"This can be any personal data," he stresses. "It's not just your customer database, your marketing list, or your employee information. It is anything that can be linked back to an individual in any way. Even if you strip someone's name away from the data, as long as there's still an identifier such as a telephone number, it's personal data.
"I've worked with a great many large organisations – some of them huge – which had absolutely no central control over privacy or data protection," he continues. "I spoke to a wealth of companies who said that they simply had no idea what they were meant to be doing, or who was responsible for doing it.
"The problem with privacy, from a legal perspective, is that every country's requirements are different. Even within the confines of the EU Data Protection Directive, each country has interpreted the law differently. In Spain, for instance, they define the levels of encryption and the types of password to be used to protect different types of personal data, whereas the UK was recently criticised by the EC for deficiencies in its interpretation of the Directive.
"For an international company trying to operate across borders – and the hardest of those borders is the Atlantic – the challenge is in constantly trying to keep up with the legislation, interpret it and then implement it."
He goes on to point out that, in the US in particular, we are now seeing the emergence of the corporate privacy officer – an individual who is dedicated exclusively to working on privacy issues, and reports to a very senior level of management. Microsoft is one example of an organisation using this approach in Europe, publicly demonstrating their commitment to data protection issues with the appointment of a highly respected privacy specialist as the company's EMEA corporate privacy strategist.
Microsoft's approach is to provide a focal point for privacy issues – a 'champion' – who will both advise the organisation's staff and work with third parties to help them resolve and avoid privacy problems. EPG aims to fulfil a similar role for clients.
"By understanding and establishing best practice, we aim to move privacy management away from being a compliance driven process, and help our members to take control of the issues proactively. They will then no longer have to play catch-up with their obligations in whichever country they are operating," Stevens explains. "If we can give them an effective infrastructure, and the skills they need, they will be able to turn privacy into a business enabler."
Lynd Morley is editor of European Communications

Hot on the heels of the company’s latest acquisition, Intec’s Mike Frayne and Kevin Adams outline the organisation’s strategy in the OSS market

Telecommunications has had a rough ride over the past two or three years, and while signs of a recovery are coming through loud and clear, it is those companies who held their nerve in difficult market conditions, and continued to invest for the future, that now stand to reap the greatest rewards. Intec recently added to its track record of growth with the acquisition of ADC Telecommunications' Singl.eView retail billing software division. The company's Executive Chairman Mike Frayne, and CEO Kevin Adams, spoke to European Communications about the thinking behind the acquisition and the role they see for Intec in the OSS market.

LM: What was the initial thinking behind your decision to acquire Singl.eView? 
MF: Retail or transactional billing is the biggest part of the OSS market – about 65 per cent of the whole sector – and it has the highest level of senior executive mindshare, because it directly affects both the major revenue stream and the customer relationship. To grow the company in the way we wanted to, and to secure ourselves a position as a truly Tier 1 OSS vendor, we really had to be in retail. Intec already dominates interconnect and mediation, and probably has the largest customer base in real-time charging/active mediation, so retail was also a logical next step from a product portfolio viewpoint.

LM: Had you, in fact, had feedback from customers who wanted you to offer retail/transactional billing?
MF: We have had requests from customers over the years, particularly from some of our most well-established customers who like the way we look after them, and also from some of the new IP billing customers from our recent acquisition of the Digiquant business.
But the drive to do this acquisition was primarily internal, as a result of our perception of the need to elevate Intec into the top tier of vendors. We think that a lot of the smaller and niche vendors will have a very difficult time over the next few years, and the lack of newsflow and good financial results from many of them is already evidence of this.

LM: So was it also a matter of better positioning yourselves with a wider offering in a recovering marketplace?
MF: We do see a slow and steady recovery, and now is clearly a better time than in the previous two to three years to be executing on our growth ambitions. But we also have a long-term strategy to grow and develop by acquisition, as well as organically, and this is primarily another step in that plan, albeit a big one!

LM: Did you look at any other contenders, aside from ADC?
MF: We have looked at many, many OSS companies over the past few years, and only acquired a very few of them, as we have strict criteria for acquisitions, in terms of product quality, financial performance, cultural synergy and long-term potential. There are not many major players in retail, and we are clearly not in a position to acquire one of the larger vendors. Singl.eView was well timed for us, as it was affordable, and a good business that we felt comfortable with.

LM: Were there any other factors that led you to choose Singl.eView?
KA: Singl.eView is first and foremost a great product – we think probably the best current tier 1 retail billing and transaction management system on the market. It's modern technology, architected for high volume and real-time processing, and capable of handling any kind of service and payment method. The feedback we had during due diligence from customers and partners was very, very positive, and gave us a lot of confidence to go forward. It's also highly configurable, but without a massive services overhead, so total cost of ownership is low, relatively speaking. It's beaten all the major players in various recent major deals, such as Tele2 and Deutsche Telekom, so we know it's right up there with the rest of the tier 1 solutions in terms of performance and functionality.
Singl.eView also has a solid customer base of well-known customers, in fixed, mobile and 3G. That not only gives us a day one revenue stream, it also brings referenceability in our sales campaigns, which is absolutely crucial today.
Singl.eView has a strong professional services operation – almost twice as large as Intec's – because of the nature of retail projects. We'll have almost 700 PS staff, and around 300 developers in total – that's a big capability to offer the industry.
Culturally we felt it was also a good fit, with a strong management team and very good people in all areas. We already have people from both sides working well together, and the chemistry is good.

LM: Does this acquisition fundamentally change Intec?
MF: Yes and no. Absolutely yes in terms of market position, capability and visibility – it takes us immediately into the top three or four OSS product companies in terms of software revenues, technical capabilities and customer base. You can't double the size of a business and not think it will be a massive change. But, on the other hand, it won't change our fundamentals – strong focus on customer care, good business performance, and the best products. Everyone says these things but it's what we built Intec on, and our growth and success, right through the bad years, is the evidence.

LM: Could you detail exactly what you believe Singl.eView will bring to Intec, and how it fits with your existing offering?
KA: In terms of fit, it is very good. We have about a dozen common customers already, including real innovators like 3, so we know the products work well together, not just in theory but in really demanding production sites. Retail naturally fits right alongside interconnect, and both are fed by mediation, so the fit is obvious. Technologically, the architectures are very compatible, and we are already exploring some very interesting technical synergies, both ways. There is a minor overlap with the acquired Digiquant (Intec DCP) products in a couple of areas, but they are really targeted on different problems, and it's not an issue in the opportunities we see.

LM: So, how will the acquisition benefit existing Singl.eView customers; your current inter-carrier billing and mediation customers; and possible new customers?
KA: Intec has a big customer base, about 400 customers, while Singl.eView has 70, and the crossover is not huge. So there is clearly a great opportunity to cross-sell products both ways. All telcos need billing, both retail and interconnect, as well as mediation and now, real-time charging, so we feel we have a very strong, logical architecture to offer to both new and existing customers.
Singl.eView has a great reputation, but it's clear that uncertainty over its position and future has held it back. Those issues are gone now, and we have had a lot of encouraging responses already from customers and prospects. People know that Intec is a solid business with good customer care and a real commitment to product investment – something like $30m next year – and they can see that we have bought Singl.eView to take it forward. We've already committed to the existing roadmap, and we'll be looking to extend that going forwards.

LM: Have you had much reaction from your own customers, and existing Singl.eView users, yet? And what has been the feedback from the marketplace in general?
MF: Early days to say too much, but really very encouraging so far. We've been to see most of the major Singl.eView customers, and talked to a lot of Intec customers, and the response has been overwhelmingly positive. People know it's a big deal to take on, but they also know we've done it before, and we keep our promises on commitments for product delivery and support. In general I think there's a lot of demand for a technically strong retail solution that's flexible and doesn't cost a fortune to implement and maintain.

LM: What, in your view, are the advantages and disadvantages to the operators of a 'one-stop-shop'? Is there a danger, for instance, that by creating powerful, large suppliers (who might push the smaller guys out of business), operators will be locked into a narrow choice of solutions?
KA: It's a longstanding argument – best of breed or one-stop shop. I guess Intec has been both of these things, so we are in a pretty good position to answer that one. Having focus on one product – say mediation – is great because you know what you sell, and you have to sell it. Life is straightforward, and you can focus on building a really strong product. We've bought several one-product companies now, and the benefits are typically a lot of skill and a great product. But it is also limiting and a bit risky, as markets sometimes turn down, and you are exposed as a one-product vendor. As a vendor with multiple products, particularly if they are a logical fit together, you have more technical strengths to bring to customers, and more ability to build them a coherent systems architecture without an 'integration tax.' There is some downside to customers if one supplier dominates, but in reality it rarely happens. There is enough innovation in the software business to ensure that there will always be some guy in a backroom building a great product and keeping the big guys honest. In fact, that's probably what keeps the major vendors awake at night!
But there is another factor, too – the growing demand for a 'solutions' approach. We have customers saying 'we want to offer this new next-generation service, bundled with these other things, and with various payment options – how do we do it?' Intec can now solve those problems in a complete way, without a lot of multi-vendor hassle and integration worries, and we are seeing some very interesting business come through.

LM: Broadening out to the OSS/BSS market in general, do you believe the marketplace is now in a period of recovery, and if so, how will it compare to its heyday of two or three years ago?
MF: It will be a long time before we see the craziness of 1999 and 2000 again. Vendors were creating completely unsustainable business models based on IPO or VC cash, spending $10 for every $5 they earned, and the carriers were caught up in a race for market share and spectrum at any cost. Life may be harder work now, but at least the good companies are making money and building a long-term future. There are still a lot of damaged businesses out there, and many won't make it. One of our messages to customers is always to look at the underlying financial strength and performance of vendors before committing to them. You need to see a five-year roadmap and business plan in our view.

LM: Would you say that OSS/BSS systems are now more central to strategic telecoms thinking? 
MF: Absolutely – in the marketplace we all have to face today, operational efficiency, margins and customer service are king. Good OSS can deliver in all those areas and the forward thinking players are looking very hard at their architectures and OSS cost bases. Some companies spend truly scary amounts on legacy OSS, and we think a lot of it could be scaled back with the right new technology choices. But it's a big decision to move away from a system that works, even if it costs an arm and a leg to maintain. Another factor is new revenue streams – things like content, messaging and games. These are where the future growth and margins are going to be, and carriers want a growing share of consumers' disposable income. Working in this space almost inevitably implies new OSS spending, because many legacy systems just aren't up to it, and it's very high on the executive agenda right now.

LM: So, how do you see the future for OSS/BSS, in terms of its role in the broader telecoms picture? What will it have to do to adapt to the inevitable changes in technology, user demand, and a possible shake-down in the operator market?
KA: We are really excited about the future, both for the industry and Intec. There's a lot going on and many new opportunities, particularly in emerging markets and new technologies. We are adapting Intec, and our product portfolio, to address the opportunities we see, and to adapt to the changes that will come. Yes, there may be some operator fall out, and vendors too, but the overall trend is still growth, and you have to take the long view. There's a lot of innovation right now in new services, pricing strategies, and technologies and that makes for a fun industry.

LM: Do you have any plans for further acquisitions? And where do you see Intec in, say, five years time?
MF: We've done eight acquisitions in four years, and they have made a massive contribution both to our growth and to our ability to deliver what customers want, so we can't see any reason to change tack. But we have strict criteria for what we will acquire, and a long-term plan for what we might move into. Our medium term ambition is to be clearly the number one OSS products company, and Singl.eView takes us a long way towards that. We also aim to continue consolidating the market where possible and, of course, you can't control when opportunities arise.
Beyond that, who knows?
www.intecbilling.com

Lynd Morley is editor of European Communications

Making the most of a mobile workforce means implementing a sound mobile workforce management system, says Jennifer Dewar

Analysts at the Probe Group predict that the number of global enterprise wireless data users will exceed 160 million by 2008. At the same time, the European market for mobile devices is growing by 25.6 per cent year on year, according to an IDC report. Faced with such figures, European telecommunications service providers might be tempted to sit back and relax a little. But service providers cannot afford to become complacent.
Despite politicians' promises of an economic upswing, the current economic climate is not particularly buoyant. Global uncertainty and European Union growing pains, coupled with deregulation and vendor consolidation, have created a competitive and unpredictable environment for service providers. With increasing expectations of empowered customers compounding the situation, companies are struggling to differentiate themselves and remain one step ahead of their competition.
Customers calling the shots
In today's marketplace, customer satisfaction is one of the most visible and crucial business goals of service providers, yet one of the most elusive. With increased choices available, customers have become heady with power and are demanding products and services faster, better, and for less money. A hard-fought battle for customer loyalty is ensuing from the conflict between escalating customer expectations and profit-driven management.
A slick public relations campaign cannot erase the damage caused by poor or inconsistent service. Cancelled appointments or hours spent waiting for a technician to arrive will quickly drive a customer into the arms of a competitor. However, service providers that effectively respond to customer requests, anticipate their needs, and build customer trust will survive the long haul. Indeed, top-notch customer service will buoy the bottom-line and secure customer loyalty, ensuring a consistent revenue stream and reducing costly customer churn.
So, how are service companies striving to optimise their service lifecycle and build lasting relationships with their customer base? Many European service providers, such as Belgian telecommunications giant Belgacom, have implemented mobile workforce management systems to automate their field service workers, significantly increasing operational efficiency and improving customer service.
Service providers are confronted daily with the difficult task of optimally assigning work requests to their field force, dispatching work from the office to the field, monitoring the progress of the work, and responding to changing conditions. In addition, field service organisations must measure workforce performance in order to improve the quality of strategic forecasting and planning efforts. In contrast with time-consuming and inefficient paper-based systems, mobile workforce management systems manage, schedule, and dispatch work for mobile engineers and technicians – all wirelessly and in real-time. Work assigned to technicians is delivered wirelessly to the workforce in the field using laptops or hand-held mobile devices like Pocket PCs; as work progresses, technicians send completed information wirelessly back to the enterprise.
But automating the field force is just the beginning. In order to garner greater efficiencies, improve productivity, and create a truly customer-centric operation, service providers are looking to extend their workforce management solution to the entire enterprise. An enterprise workforce management solution delivers an integrated operational view of the mobile workforce and its workload, whilst leveraging operational efficiencies across departmental boundaries. With enterprise-wide visibility into all operational and departmental areas – customer service, inspections, maintenance, construction, outage, meter service, billing, among others – service providers can reduce operational costs, deliver improved customer care, and witness greater return on their investment. 
Cy Tordiffe, Managing Director of EMEA for MDSI, a leading workforce management software provider, notes: "Service providers need to break free from the silo mentality; managing individual groups separately is no longer a viable option. In order to avoid costly duplication and repetition, leverage economies of scale, and manage the workforce more effectively, companies must adopt an enterprise-wide approach."
As the momentum towards enterprise workforce management builds, field service organisations are searching for solutions to maximise efficiencies across all levels of their organisation. In response to this demand, mobile workforce management software vendors must offer flexible solutions that encompass all functional units of the business and support a wide spectrum of enterprise applications and mobile devices. 
MDSI Mobile Data Solutions Inc is an example of one company that is leading the way. Working with telecom giant Nokia, MDSI is extending Advantex – its enterprise workforce management system – to the Nokia 6600 mobile phone. This development will enable field engineers who typically use laptop PCs, hand-held PCs, and Pocket PCs to access information across the enterprise using just a mobile phone. In addition, this solution will enable companies to bring their enterprise applications to a larger, more diversified workforce and provide greater mobility at a lower cost.
The Advantex Mobile Application running on the Nokia 6600 phone is a Java 2 Micro Edition (J2ME) application that enables dispatchers to communicate with field technicians using XML via HTTP/HTTPS over GPRS networks. This solution is particularly cost-efficient for certain enterprise workers, such as inspection workers, who may not require the robustness of ruggedised laptop PCs mounted in their trucks or the full functionality of certain Advantex tools such as mobile mapping. The nature of their work enables them to use a lightweight mobile phone to send completed inspection forms back to the office and fulfil their duties efficiently. Using mobile phones to communicate with the enterprise is also a very affordable alternative for short-term workers handling emergency situations such as outages, or contract workers assigned to short-term projects.
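To make the XML-over-HTTP pattern described above concrete, the sketch below builds the kind of small XML inspection report a field technician's handset might send back to a dispatch server. This is a minimal illustration in Python, not MDSI's actual Advantex schema: the element names, field values and endpoint URL are all hypothetical assumptions.

```python
# Sketch of a completed-inspection message of the kind described above.
# All element names and the endpoint URL are illustrative assumptions,
# not the real Advantex message format.
import xml.etree.ElementTree as ET


def build_inspection_report(job_id: str, technician: str, result: str) -> bytes:
    """Serialise a completed inspection form as a small XML document."""
    root = ET.Element("inspectionReport")
    ET.SubElement(root, "jobId").text = job_id
    ET.SubElement(root, "technician").text = technician
    ET.SubElement(root, "result").text = result
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)


payload = build_inspection_report("JOB-1042", "tech-07", "pass")

# On a J2ME handset this payload would be POSTed over GPRS via the
# Generic Connection Framework (Connector.open on an http/https URL);
# in desktop Python the equivalent would be urllib.request. The network
# call is omitted here so the sketch stays self-contained.
print(payload.decode("utf-8"))
```

Because the payload is plain XML over standard HTTP/HTTPS, the same server-side dispatcher can accept reports from phones, Pocket PCs or laptops without caring which device produced them, which is the point of the approach described in the article.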
"Companies are consistently striving to unearth cost-cutting measures. As there is often a high capitalisation cost associated with equipping the entire field force with laptops or hand-held PCs, mobile phones are an ideal solution – especially for people who are doing simple inspections. Mobile phones are inexpensive and lightweight and can fulfil the requirements of many mobile workers within the enterprise," explains Warren Cree, VP Marketing and Business Development, MDSI.
With Advantex and Nokia 6600 phones, service providers can extend mobile and enterprise applications to a larger and more diversified workforce in order to increase productivity, heighten customer intimacy, and reap marked cost and time savings. Companies can save the equivalent of approximately 1.5 hours of work time per technician per day, whilst reducing field operational costs by 35 to 40 per cent. Technicians are typically able to complete 10 to 20 per cent more jobs, and have real-time access to relevant customer information. Additionally, improved data quality, automation of time sheets and billing, and a reduction in paper usage translate into significant back office savings.
No matter how dramatic the savings or how revolutionary the improvements, buy-in from the field force is essential for a successful enterprise workforce management implementation. Field workers expect their mobile devices to be both intuitive and reliable and they want to be able to use new applications with minimal training. What could be more user-friendly than the ubiquitous mobile phone? With the worldwide production of mobile phones reaching a record high in 2003, according to research by Nikkei Market Access, it is likely that many employees already rely on mobiles for personal or business use. With the MDSI/Nokia workforce management solution, the cultural transition will be minimised. And making the lives of the workforce easier and more efficient translates to happy employees, improved customer service, and loyal customers.
In a climate of shrinking profit margins, competitive pressures, and rising customer expectations, service providers must strive to evolve through continuous service innovation. Flexible, visionary solutions from companies like MDSI are helping companies to do just that. 
Jennifer Dewar writes about science, technology and healthcare. http://www.mdsi.ca/

Internet access for all has been the rallying cry for governments and the UN. But even in Europe the division between the haves and the have-nots has yet to be fully addressed. Andrew Davies looks at the issues

The Digital Divide is a major issue at a national, European and world-wide level. In simple terms, it is the divide between those who have access to computers and the Internet and those who do not. The divide can be economic, demographic or geographic. It is the geographic dimension that is an issue for telecommunication service providers and the routes to its resolution could be a significant opportunity.
In modern economies, access to computers, the Internet and, increasingly, broadband communications is seen as essential for future prosperity. Consequently, lack of access could be a significant brake on economic growth. On a national basis, in countries such as the UK where penetration of computers and the Internet is high, the provision of access to broadband connection is important enough to be embodied in government policy. This view is held by most Western European countries, the European Union and the United Nations. With the enlargement of the EU, the need to provide equitable access to what is referred to as the Information Society is seen as essential.
New member states
The state of play in Europe varies, particularly in some of the new member states. The penetration of fixed line access is low, which means that significant proportions of the population do not have access to voice telephony, let alone the Internet. In some of these countries, as much as 25 per cent of the population have never even heard of the Internet. With a poor fixed line infrastructure, penetration of broadband is also low. This means that there is much work to do if the geographic digital divide is to be closed.
It is interesting to compare the situation of the new member states with the established members where the expectation is that the geographic digital divide should already be solved. In the UK, BT recently announced that all households would be in reach of a broadband connection by 2005. Their most recent estimate is that the latest technology will enable them to connect all but 100,000 individual users.
By contrast, a recent workshop in France defined the minimum broadband connection as 2 Mbps by 2007 and estimated that as much as 20 per cent of the French population would be out of reach of terrestrial connection at that rate. Spain and Italy have similar concerns and their national governments are developing programmes to address the issue.
National government and European Union concern about the digital divide in Europe potentially creates an opportunity for providers of telecommunication services and equipment. However, it is useful to answer a few pertinent questions to determine the extent of the geographic digital divide:
1. How real is the demand for Internet and broadband connection?
2. How great is the demand and how much is unmet by current means?
3. How can government institutions help, or should provision be left to market forces?
A quick look at the situation in the UK answers the first question. From Ofcom figures, personal computer ownership is tending towards 70 per cent of the population by 2007 and Internet connection is tending towards the same figure. If 70 per cent of the population is connected to the Internet by 2007, it is likely that all of them would be potential users of broadband. In the business community, particularly in the Small to Medium Enterprise (SME) sector, penetration of Internet connection is even higher and broadband take-up is already at 34 per cent and rising. As SMEs are the lifeblood of any economy, these figures indicate that Internet access, and increasingly broadband connection, are important to a thriving economy. This is especially true in rural areas where SMEs are often the main employer.
For the new member states, additional factors come into play. A good example is the impact of the Common Agricultural Policy (CAP). This places significant demands on record keeping and form filling for farmers in the new member states. This means that the farmers will need computers and, ideally, Internet connection for e-mail and information services. Additionally, village based communities are more numerous in the new member states and the importance of the rural SME is even greater.
This indicates that there is undoubted demand and economic need for Internet access and, increasingly, broadband connection. This leads to the second, more complex issue: how much of this is unmet by current connection means? BT has indicated all but 100,000 users will be broadband enabled if they want it, by 2005. With 70 per cent PC ownership this implies 70,000 users will want it but will be out of reach. If this figure is increased pro rata across Europe, then the unmet demand becomes 600,000. However, the French figures indicate as much as 20 per cent of their population will not have access to broadband by 2007, an unmet demand based on PC take up of around 8.5 million which, scaled up to the enlarged Europe, gives an unmet demand of over 60 million.
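The back-of-envelope scaling in the paragraph above can be reproduced directly. The population figures (UK roughly 59 million, France roughly 60 million, the enlarged EU roughly 450 million) are rounded assumptions for illustration:

```python
# A sketch of the article's back-of-envelope sums. Population figures
# (UK ~59m, France ~60m, enlarged EU ~450m) are rounded assumptions.
PC_TAKE_UP = 0.70

# UK basis: BT's "all but 100,000" users out of terrestrial reach
uk_unmet = 100_000 * PC_TAKE_UP                # ~70,000 would-be users
eu_unmet_uk_basis = uk_unmet * (450e6 / 59e6)  # pro rata across the EU,
                                               # same ballpark as 600,000

# French basis: 20 per cent of ~60m beyond a 2 Mbps terrestrial link
fr_unmet = 60e6 * 0.20 * PC_TAKE_UP            # ~8.4m, near the 8.5m cited
eu_unmet_fr_basis = fr_unmet * (450e6 / 60e6)  # scaled to the enlarged EU

print(f"UK basis: ~{eu_unmet_uk_basis / 1e3:.0f}k; "
      f"French basis: ~{eu_unmet_fr_basis / 1e6:.0f}m")
```

The two orders of magnitude between the estimates reflect the gap between BT's optimism and the French 2 Mbps definition, which is exactly why the choice of minimum rate matters.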
This indicates that there could be a large number of households and small businesses that want Internet access and broadband connection but are unable to get it. In some of the new member states there is the additional issue of the ability to pay for such services and this plays into the answer to the last of the three questions, on the role of government.
A number of EU and national initiatives are in progress but there are limits to what government can achieve. The two leading issues are the need to maintain competition and the need for 'technology neutrality'.
Contravening regulations
A national government is unable to simply give money to a service provider to connect remote users. This is seen as a subsidy and contravenes European and world trade regulations. Similarly, as technologies such as satellite already claim to cover most of Europe, albeit at higher cost, providing incentives to satellite service providers to bridge the digital divide is not 'technologically neutral'. However, funding can be made available that falls within the rules but enables rural users to have broadband access. In the UK, Regional Development Agencies provide grants to businesses that may be disadvantaged by the lack of broadband. Because the grant is made on an individual basis, the end user can choose how the connection is delivered, maintaining technology neutrality and competition. If, for example, satellite access is the only means of delivery to the user, he or she may still have a choice of competing service provider, thus staying within the rules.
For new member states, the EU funding available for disadvantaged areas, known as structural funding, can be used if Internet and broadband connectivity can be shown to be necessary for economic development. In Poland, grants will be made available to 100,000 farmers to enable them to comply with the requirements of the CAP. If this money is used to buy computers and Internet connections, it is still within the rules.
In the Nordic countries, tax breaks for users to encourage demand and for suppliers to encourage build out have enabled the fixed line infrastructure to effectively reach out further.
So, the three questions can be answered. We have confirmed that there is a growing demand for Internet and broadband connection across Europe, and that a significant proportion of that demand cannot be met economically by fixed line connection in the current market. There are ways institutional funding can be used to underwrite some of the cost of connection. So how should the service provider community respond?
The first point to make is that these 'digitally divided' users in Western Europe are outside the economic reach of fixed line service providers. In many of the new member states, the fixed line infrastructure is very limited outside urban areas. Leaving it to market forces could result in a growing gap between those with Internet and broadband access and those without. However, if institutional intervention takes place to stimulate the market, someone still has to deliver to places that are not currently economically feasible. This situation is exacerbated in some of the new member states for two reasons:
• Average incomes are between a half and a quarter of those in Western Europe, so user subscription levels need to be lower.
• The successful roll out of GSM in some of the countries has meant many users are bypassing the fixed network altogether, discouraging build out of the fixed line network by the incumbents.
This implies that there is an opportunity for wireless connection for broadband, embracing both fixed wireless and satellite. There may also be an opportunity for newer technologies such as powerline delivery through the electricity infrastructure.
Satellite service providers such as Eutelsat and SES Astra are already looking at satellite/WiFi combinations to provide community broadband. The lower cost WiFi for local access combined with the more expensive satellite for trunking looks promising provided issues of scaling are successfully addressed. If so, the model could work anywhere in the satellite footprint. Low cost satellite access models are also being addressed, as the unmet market may be of sufficient scale to provide the volumes needed to drive down user terminal costs. Powerline – the delivery of communications via electric power cabling – is beginning to look more attractive because of the scale of the potential demand. It may even be argued that 3G networks could cover some of the unmet demand, based on the success of 2G networks in new member states.
What is emerging is that the issue of the digital divide could be addressed by the technologies that promised much in the late 1990s but never quite delivered. The government imperative may be the key to unlocking the potential of these technologies, provided service providers are able to integrate them into their service offerings. There are many value chain issues to be resolved, particularly in new member states where credit cards and bank accounts are not the norm and where consumer-scale distribution chains are relatively primitive. Innovation in service delivery is required in addition to innovation in technology.
One issue is clear. Western European governments believe that the Internet is an essential tool for the citizen and that broadband connection will form part of the path to future growth and prosperity. If this is true, those outside the reach of these technologies will be seriously disadvantaged, with resultant shifts in the patterns of business away from rural areas and less well connected countries. This is contrary to the aims of the enlarged European Union. However, it is not enough for government institutions to provide state support to service providers to solve the problem. The service providers themselves need to be able to offer innovative solutions that combine with institutional support to address the digital divide.  This could be a great opportunity for some niche technologies to come to the fore over the next five years.

Andrew Davies is Business Development Director at strategic technology consultants ESYS, and can be contacted via: adavies@esys.co.uk

Wireless service providers are now appreciating the vital role that OSS can play in a successful operation, says Kieran Moynihan 

With the exception of areas such as billing, OSS has never enjoyed a high profile. In the eyes of many a CFO, it was just a cost centre, strewn around the back office, and senior management in general didn't really understand what exactly it did day-to-day. Amid the severe opex and capex cuts at wireless service providers over the last two years, senior management have appreciated for the first time the importance of OSS in streamlining the operations of the network and acting as the overall foundation for the business processes employed in the network. As network complexity continues to spiral and there are fewer people to manage the network, OSS has entered a new phase of its evolution as the cornerstone of operating a wireless business.
Wireless service providers are under unrelenting pressure to improve their earnings performance, leverage their existing network infrastructure, improve customer satisfaction and bring new services to market more quickly. They are competing in an environment of intense competition with rapidly evolving and diverse technology challenges. One of the most expensive and painful problems facing wireless service providers today is that their multi-vendor OSS systems do not offer the level of interoperability and flexibility required to efficiently achieve their business goals, while reducing both the cost of ownership of the OSS and the effort needed by the service provider to maintain the OSS infrastructure.
The importance of interoperability
Both the wireless service provider and the OSS community have long understood the business benefits of interoperability. However, in practice, the level of open standards and interoperability achieved has been very disappointing. The OSS vendor community, the system integrator community and the wireless carriers need to share the blame for the lack of progress here. On the vendor side, there has always been a reluctance to work on standards with competitors, coupled with the financial pressure to maximise deployment and integration revenue from deploying 'complex' solutions. System Integrators did what came naturally to them and assumed the responsibility of knitting the disparate OSS systems together in expensive integration projects. Wireless service providers discouraged OSS products companies from conforming to standards by frequently requesting custom solutions to meet their special requirements, thereby creating a spiral of legacy OSS integration which is still in existence today.
The OSS landscape is now undergoing a fundamental change, with a genuine sea-change in attitudes to the importance of OSS standards and interoperability. In the current climate of microscopic focus on opex, service providers cannot continue to sustain the 'OSS integration tax' and an unnecessarily high cost of ownership. Wireless service providers have accordingly increased the emphasis on OSS vendors conforming to TeleManagement Forum (TMF) specifications and emerging standards such as OSS/J. Wireless OSS vendors are reacting to this pressure from the service providers. They are also, interestingly, reacting to increasing pressure from their shareholders, who have recognised that in most cases the most successful OSS products companies, in terms of shareholder value, had high-volume OSS products with particular strengths in ease of deployment and interoperability. Finally, the system integrator community is beginning to shift its revenue focus away from costly integration projects towards high-value business transformation projects, such as the business process transformation associated with introducing a Service Quality and Service Level Agreement management system.
While the basic concepts of service provisioning, activation and service assurance are well understood, the process of integrating these systems together so that they 'talk' to each other has been difficult, requiring service providers to undergo lengthy and costly software customisation projects or to build entirely new applications. The challenge of integrating diverse OSS components is a deep-rooted problem in the evolution of OSS technology. Wireless service providers are realising that it's no longer feasible to develop and maintain a costly customised OSS solution environment, particularly with the increasingly complex infrastructure environments of 2.5 and 3G networks.
Service providers try to use a best-of-breed approach, identifying the OSS vendors that offer the best product or service. However, at some point these disparate OSS solutions must be integrated into a unified OSS system to manage all aspects of their business operations. In order to achieve interoperability, wireless service providers are faced with cost-prohibitive integration costs and a lengthy data migration process that can often stretch into years. Wireless service providers who want to profitably deliver next-generation services must rely on OSS vendor support to meet changing consumer demands in an evolving digital economy.
The need for industry standardisation
A major challenge for wireless service providers is how to manage multi-technology, multi-vendor networks more efficiently. To meet the market demand for standardisation, bodies such as the TeleManagement Forum and initiatives such as OSS/J were created to define standards for OSS application development.
Wireless service providers have been hard pressed to find a standard set of applications from multiple vendors that will work together. Often they have chosen all of their products from a single vendor, resulting in the deployment of proprietary solutions rather than solutions developed on a set of industry standards.
To meet the growing demand from wireless service providers requesting interoperable OSS solutions and industry standard applications, four leading independent providers of OSS solutions joined together to form the Service Management Alliance (SMA). The four companies, Argogroup, Casabyte, NetTest and WatchMark-Comnitel, are working together to promote the advancement, awareness and industry collaboration of service management solutions for the benefit of wireless service providers. The SMA is focused on developing a solution to the traditional problem of interoperability between multi-vendor OSS applications in the service assurance space, which are often complex, proprietary, or require custom application development. Several other vendors in the wireless service assurance space have recognised the overall benefits of this alliance and have formally submitted requests to join the SMA. Early feedback from the wireless service provider community has been extremely positive, acknowledging the proactive approach taken by several of the leading vendors in the wireless service assurance space to promote standards and interoperability and to reduce the cost of ownership.
OSS' changing role
With the tremendous pressure on service providers to improve their service quality levels, reduce customer churn, improve customer satisfaction and increase their profitability, wireless service providers are now taking a different view of the OSS vendor community and recognising the significant role that OSS plays in their business operations.
It's time for wireless service providers to differentiate themselves in a commoditised business environment. They must now rely on value-added services, quality of service, speed to market for new services and a quality customer experience as market differentiators.
There's mounting competition for high-value customers and service providers must be able to address network, service and customer issues and opportunities simultaneously to ensure profitable operations. To achieve acceptable customer retention, service providers must develop new ways of viewing service quality through the eyes of the customer. This is a view that experience tells us is often out of sync with the traditional network-centric view. This transition to a customer-centric approach to managing networks, in a profitable manner, represents a fundamental change for wireless service providers as they roll out 2.5G and 3G services.
Wireless service providers have now recognised that introducing service management in wireless networks acts as a catalyst, bringing together network operations, customer care, corporate account management and sales and marketing. It extends their current levels of cooperation into a powerful new paradigm in which all teams are integrated around a common core objective: delivering consistent service quality to customers in a profitable manner.
By implementing OSS solutions that truly offer interoperability, service providers will benefit from improved earnings performance, a reduction in OSS integration and maintenance costs, faster deployment of new services, improved customer satisfaction and loyalty, and an overall increase in revenue per customer.

Kieran Moynihan, CTO, WatchMark-Comnitel can be contacted via tel: +353 21 730 6002; e-mail: kieran.moynihan@watchmark-comnitel.com

Margin management is a frequently overlooked element of revenue assurance. David Halliday explains why it should have a more prominent role

The downturn in the telecoms industry has seen several operators go out of business, and for many that have survived the emphasis is now on protecting and growing existing revenues. Revenue losses in the telecoms industry globally have been estimated at 13.7 per cent of turnover (Analysys/Azure, 2003), which equates to billions of Euros. Consequently, operators are slowly but surely facing up to these losses and are starting to combat them by implementing revenue assurance programmes.
Revenue assurance is a term that operators have traditionally associated with causes of loss, such as fraud, end-to-end revenue leakage, and over-payment or undercharging for interconnect capacity. However, there is an equally important part of revenue assurance that has tended to be overlooked: margin management. Managing your margins makes perfect business sense in any industry, but it is particularly relevant in telecoms, which is still navigating its way out of a very difficult period. At a time when managing expenditure is seen as key, an effective margin management strategy represents a significant addressable revenue opportunity.
Margin management is complementary to all other aspects of revenue assurance. Taking a simplified view, margin management can be defined as managing the business opportunity – interconnect accounting makes sure payments are managed effectively, whilst fraud management ensures that none of the money is going astray. All these aspects, managed together, will help run a much more complete end-to-end revenue assurance operation.
Optimising revenues is by no means a new concept as is evident with the practice of least-cost routing.  The international telecommunications industry has experienced dramatic changes over recent years with the proliferation of competitive international carriers and capacity resellers who are increasingly offering cheaper rates to deliver traffic to an ever-increasing variety of destinations.  In order to remain competitive within the industry, carriers have needed to be able to negotiate and take advantage of the lowest market rates, whilst minimising disruption from poor quality routings. Failure to make quick decisions on how routing is organised can significantly damage profits.
A business-driven approach can be implemented across a carrier's entire processes and technology base. As with other aspects of revenue assurance, margin management isn't solely a finance or operational function. An organisation's COO will want to ensure that operational efficiency is being achieved, whilst its CFO will want to see the impact on the bottom line. To this end it is important to have the correct processes and technology in place.
Understanding the health of any organisation is critical. Regular third-party health checks of existing systems and procedures are recommended, as even a basic human input error can significantly impact margins. Carriers have finite resources, so it is essential that they ensure margins are being fully optimised. Yet until existing procedures are actually reviewed, it is very difficult to plan how to structure a business to achieve maximum revenues.
Once business objectives have been identified, solutions need to be implemented in order to bring margin management to fruition. Whether for a PTT or a wholesale carrier, systems with a slow response time or insufficient reporting capability will put profitability at risk. Many carriers claim to have effective systems in place, yet many are making do with 'best-effort' in-house solutions. Too many carriers simply try to adapt their existing tools, with limited success. This can also result in information being replicated, which wastes significant time and money and makes the whole process of margin management clumsier.
Carriers need to bite the bullet and move away from their existing legacy systems in order to make it possible for them to maximise margins and improve profitability. Systems need to focus on automating tasks, improving efficiency and productivity by reducing the need for large groups to manage core data. Systems are now available that will enable carriers to:
• highlight arbitrage opportunities at the lowest dial-code level
• substantially improve overall quality
• validate carrier quality
• automatically generate and maintain MML routing changes on a switch, with a full audit trail
• provide custom alarms and alerts based on a wide range of threshold data
• provide user definable custom reporting with drill-down capabilities
All of this will help carriers maximise margins and improve profitability in the complex world of routing choices, and help guarantee quality of service.
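To illustrate the first of these capabilities, the sketch below shows a minimal least-cost routing decision with a margin check at the dial-code level. The carrier names, rates, retail price and quality thresholds are all invented for illustration; real systems would work from live switch and billing data:

```python
# A minimal sketch of least-cost routing with an arbitrage check.
# All carriers, rates and thresholds here are hypothetical examples.
from typing import NamedTuple

class Route(NamedTuple):
    carrier: str
    cost_per_min: float   # what the carrier charges us, per minute
    asr: float            # answer-seizure ratio, a simple quality proxy

def best_route(routes, min_asr=0.40):
    """Cheapest route that still meets the quality floor."""
    viable = [r for r in routes if r.asr >= min_asr]
    return min(viable, key=lambda r: r.cost_per_min) if viable else None

def is_arbitrage(retail_rate, route, target_margin=0.15):
    """True if routing via `route` still clears the target margin."""
    if route is None:
        return False
    return (retail_rate - route.cost_per_min) / retail_rate >= target_margin

# Hypothetical offers for a single dial code:
offers = [Route("CarrierA", 0.012, 0.55),
          Route("CarrierB", 0.009, 0.30),   # cheapest, but poor quality
          Route("CarrierC", 0.010, 0.48)]
r = best_route(offers)
print(r.carrier, is_arbitrage(retail_rate=0.015, route=r))
```

The quality floor is what separates margin management from naive least-cost routing: the cheapest offer is rejected because routing over it would damage quality of service.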
Having the correct processes and technology in place will become increasingly important with the emergence of next-generation technology and services. Multimedia and its content, for example, will bring increasingly complicated supply and value chains, meaning that settlement with partners will need to be handled much more efficiently. Moving forward, margin management will evolve into what can be termed trading management. This environment would be much more opportunity-driven and proactive, with carriers able to settle with each other in real time, helping cash flow and cutting down on manual intervention. In the future, C-level executives will have visualisation tools as part of a revenue assurance dashboard, so that they can see such transactions first hand.
This of course will be achieved much more effectively in an open trading environment, which is not a common characteristic of today's telecoms market. Despite the vast increase in the number of operators, many of the PTTs remain highly influential, and consequently the market still has many cliques, with certain carriers only dealing with certain partners. This can be detrimental to customers, who are not necessarily provided with the best service and whose quality of service cannot be guaranteed.
This, however, may not be an issue in the future, as regulators such as Ofcom are looking at ways to develop and encourage a truly open telecoms market. It is therefore essential that carriers adopt margin management now, so that they are in the best position to take advantage of new services in the future.

David Halliday, Director of Margin Management at Azure, can be contacted via e-mail: info@azuresolutions.com   http://www.azuresolutions.com/

Robert Winters looks at the drivers behind Ethernet Service testing and the ways in which QoS can be maximised

Ethernet services are on the increase, with carriers deploying cost effective, high bandwidth services on a worldwide basis. Some Asian countries, such as Japan and Korea, are well ahead of the curve, with low price 100Mbps fibre-to-the-home Ethernet 'best effort' services already on offer for about €30 per month.
However, there is also great emphasis on premium Ethernet products that offer differentiated quality of service (QoS) business packages that command higher prices in return for guaranteed performance and reliability. Europe and North America are catching up with greater focus, initially, on high value, high quality business-oriented Ethernet services.
Despite the ubiquity, high bandwidth and inherent cost advantages of Ethernet, there is still a requirement to effectively guarantee QoS. The usual network-based Service Level Agreement (SLA) offering throughput and latency guarantees is now increasingly being supplemented with IP QoS guarantees, as an array of delay sensitive value added applications, such as video, is offered.
For Ethernet service providers and the equipment vendors supplying them, competitive advantage can be much improved by offering a combination of traditional network service quality and IP QoS guarantees. However, this requires a more structured approach to testing in order to increase confidence levels in offering such guarantees.
Drivers for Ethernet Services testing
1. The move from best effort to 'Business Class' Broadband Ethernet Services
Fundamental to the whole issue of QoS testing is the underlying movement from a best effort broadband services model to 'Business Class' premium service offerings. To distinguish between SoHo/SME consumers – who will generally accept a best effort service – and the more demanding market segments covering large enterprise, financial services, healthcare, government and so on, guaranteed service quality parameters are offered, including fixed bandwidth, latency and high priority throughput of traffic.
Coupled with the above differentiated services, new revenue models are being derived that include value added business applications such as Multicast Video, Voice over IP (VoIP), time sensitive web services guarantees, storage area networks etc. At the end of the day, it's all about end customer QoS experience and many of these applications are very sensitive to delays. It doesn't matter what the service level claims are, if a customer CEO has a Multicast Video session to fifty branch offices on a Friday afternoon and it is not up to scratch, there'll be trouble.
Therefore, along with traditional service level guarantees like latency and throughput, Ethernet service providers need to give themselves the best chance possible by determining any potential delay sensitive application IP QoS issues.
2. It is not just about packet blast testing anymore
During the late '90s telecom boom, building capacity into the network was key, and test approaches tended to focus on packet blast methods for testing layer 2 services with stateless unidirectional packets. The approach was to fill the service pipe with varying size packets and measure throughput capabilities, latency and so on, according to test standards such as RFC 2544.
However, these days, pure layer 2 test methods need to be supplemented with end-to-end IP QoS testing that reflects what the real end user is expected to experience. Rather than a sum-of-the-parts test view, it is more practical to emulate and analyse the performance of individual layer 2 services, their associated bandwidth and mixed priority settings. The IP flows that run over these services also require verification, using fully stateful real applications that represent the most common Internet mix, such as web, email, multicast and streaming.
For example, the knock-on effects of dropped packets at layer 2 can include a large decrease in effective bandwidth caused by TCP retransmits. However, badly specified application servers can also cause TCP retransmits. The issue is how to test for and effectively isolate the problem source.
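The scale of that knock-on effect can be estimated with the well-known Mathis approximation, which bounds a single TCP flow's throughput at (MSS/RTT) × (C/√p), where p is the packet loss rate. The MSS and RTT values below are illustrative assumptions:

```python
# Mathis et al. approximation: even a fraction of a per cent of layer 2
# packet loss caps TCP throughput far below the line rate. MSS and RTT
# values here are illustrative assumptions.
import math

def tcp_throughput_bps(mss_bytes=1460, rtt_s=0.05, loss_rate=0.001, c=1.22):
    """Upper bound on a single TCP flow's throughput under random loss."""
    return (mss_bytes * 8 / rtt_s) * (c / math.sqrt(loss_rate))

for p in (0.0001, 0.001, 0.01):
    print(f"loss {p:.2%}: ~{tcp_throughput_bps(loss_rate=p) / 1e6:.1f} Mbit/s")
```

On a 100 Mbps Ethernet service with a 50 ms round trip, even 0.1 per cent loss confines a flow to single-digit megabits per second, which is why isolating whether the loss originates in the network or in a misconfigured server matters so much.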
3. Understanding service limitations – testing for QoS boundaries
Strictly speaking, Ethernet inherently offers Class of Service (CoS)-based service guarantees through 802.1p/Q VLAN services with bandwidth and priority traffic settings, as opposed to the specific QoS settings that are more prevalent, for example, in the traditional (and expensive) ATM world. This is an important distinction that necessitates a view of what defines 'carrier grade Ethernet QoS'.
In order to guarantee carrier grade Ethernet QoS, providers need to be confident that each service, each user and each IP application flow using that service is thoroughly tested for quality. Therefore, a pragmatic approach to testing is required, whereby corporate Ethernet service and application flow models can be quickly built, then emulated and analysed for quality issues throughout the network under test, with varying load and network status conditions. Using this test method, QoS boundaries can be realistically determined for both network services and application layers.
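As a rough illustration of the boundary-finding idea, the sketch below sweeps offered load and records the last load at which latency and loss stay within spec. All names are hypothetical, and `measure` stands in for a real traffic emulator/analyser:

```python
def find_qos_boundary(measure, loads_mbps, max_latency_ms=20.0, max_loss=0.001):
    """Return the highest tested load (Mbit/s) at which the service stays
    in-spec, or None if even the lightest load fails."""
    boundary = None
    for load in loads_mbps:
        latency_ms, loss = measure(load)
        if latency_ms <= max_latency_ms and loss <= max_loss:
            boundary = load        # still in-spec at this load
        else:
            break                  # first out-of-spec point: stop the sweep
    return boundary

# Toy stand-in for measurements on a notional 100 Mbit/s service that
# degrades sharply above 90 Mbit/s of offered load:
def fake_measure(load):
    return (5.0 + load / 10, 0.0) if load <= 90 else (60.0, 0.02)

print(find_qos_boundary(fake_measure, [50, 70, 90, 95, 100]))  # 90
```

A real test system would repeat such sweeps per service, per priority class and per application flow model, under varying network status conditions.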
4. Convergence – Network or Application quality issues?
The overall trend for convergence of telecom and IT departments in large enterprise is blurring the distinction between pure network layer testing requirements for Ethernet services and the quality issues of applications utilising those services. 
Ethernet service providers need a structured test environment, not only for pre-turn-up and provisioning testing but also for post-deployment capacity planning and troubleshooting purposes.
For example, if a trouble ticket generated by a large enterprise customer necessitates an on-site visit, then problems need to be narrowed down very quickly, as the time and money clock is ticking for both parties.
Despite the opportunity for Ethernet service providers to offer sophisticated value-added applications, the downside is that poorly configured application servers can have serious network layer side effects. Most of these servers are outside the control of the service provider, and can lead to the provider being unfairly blamed for problems such as decreased bandwidth.
An integrated and pragmatic test approach is required that first determines whether the network is the issue and, if not, can then effectively prove whether the applications feeding into the network are at fault.
Test systems need to provide assessment capabilities as to whether dropped bandwidth is caused by packet loss on a layer 2 Ethernet switch or by a high level of TCP retransmits due to a poorly configured web application server.
In this instance, it is important to be able to decouple the application servers and instead emulate IP flows using a test approach that analyses whether the network is at fault. If not, the test system should then be turned on the actual application servers themselves and potential quality issues identified.
Customer expectations
High-speed Ethernet services testing demands are intensifying as quality of service (QoS) guarantees reach a greater level of sophistication. End customer expectations are heightened by the introduction of an array of delay-sensitive value-added applications such as video and VoIP. To improve competitive advantage, the traditional SLA is being supplemented with IP QoS guarantees. Therefore, Ethernet service providers also need to understand the overall QoS boundaries, in a converged network and application environment, within which premium business class services can be guaranteed. Post-deployment, troubleshooting needs to determine quickly whether an Ethernet service issue is causing a problem or whether the applications using the service are at fault. An integrated network and application test approach provides the necessary test environment to meet these requirements.

Robert Winters, Chief Marketing Officer, Co-founder, Shenick Network Systems Limited can be contacted via tel: +353 12367002; e-mail: robert.winters@shenick.com [l=www.shenick.com/]http://www.shenick.com/[/l]

Lucent Technologies INS President, Janet Davidson tells Priscilla Awde why OSSs are crucial to providing both a network transformation and the best service to customers

Hidden behind the economic downturn of the past several years, a quiet revolution has been going on in the communications industry. It is one driven less by telcos and operators and more by consumers: end users are demanding more independence, flexibility, speed and functionality. Consumers are flexing their muscles and, driven by greater choice and mobility, they are putting pressure on operators to deliver.
People want access to multimedia services in near real time regardless of where they are, what kind of device they are using and whether they are using it for business or pleasure. Indeed – whether terminating on fixed or mobile devices, at home or in offices – convergent, multimedia content and applications are forcing changes both in the type and organisation of service providers, and the communications networks themselves.
Lucent Technologies is committed to helping operators deliver convergent services seamlessly and efficiently over fully integrated IT platforms. Janet Davidson, Lucent's President of Integrated Network Solutions and their OSS software unit, explains: "To support new business processes, service providers need to completely link their business processes and their operations software environments."
"In the old world, dominated by POTS and ISDN, the number and variety of services were dictated and constrained by network technologies and limited supply." In today's brave new communications world, Davidson suggests, it is the networks and operators that must adapt to deliver high quality service to the ever more demanding, better informed and more dynamic needs of their customer base.
Telcos are restructuring and becoming more streamlined, both out of economic necessity, and to serve rapidly changing markets. Lured by the benefits of co-ordinating front and back office functions, integrating disparate systems under a seamless IT umbrella, and using automation to save money and become more efficient, companies are turning to enterprise-wide OSSs. Davidson's goal is to convert telcos into organisations in which customer needs are a priority.
Working together, vendors, operators and content owners are creating new networks, software, systems and solutions designed to give customers what they want and when they want it. Old arguments about whether operators' core business is bandwidth or value added services have largely been won in favour of the latter.
Needing to keep customers and increase their lifetime value, telcos are marrying state-of-the-art networks with a portfolio of appropriately priced applications governed by service level agreements (SLAs). They are reducing churn by bundling voice/data services and creating new convergent applications and differentiated service levels. Managed services, hosted, bundled multimedia applications, value added and premium services are replacing basic connectivity, flat fees and fixed internet access.
Network convergence
"The challenges for service providers are to tailor their business processes to support the new business models and link them to their OSS," says Davidson. "Convergence is, in the end, the dynamic interaction of end users, networks and service providers, enabled by technology in the service of personal empowerment at work, at home and on the move. This dynamic is driving not just the transformation of networks but service provider business models."
Convergent networks, technologies and services create opportunities for new business models that exploit a telco's resident expertise in network provisioning and customer relationship management (CRM). In the world of converged services, the value lies in making things easy for customers. It lies in convenience, simplicity, location, connectivity and the transparent integration and delivery of information services.
"The challenge for service providers is to tune and/or develop new business processes to support these new ventures.  To do that profitably, service providers need to completely link their business processes and operations software environments," Davidson explains.
The new generation of OSS and business solutions software is designed precisely to support the necessary convergence and internal re-organisation. Davidson suggests there is a shift in the enabling technologies to control and measure how networks perform. The emphasis is less on simple connectivity and more on service level agreements: a move from complex backbone network technologies to high-speed, broadband fibre access systems.
Service providers' business models are moving towards measuring how customers are served; towards a customer-oriented view of the network. It is a change that Davidson believes will not happen piecemeal. "Service providers need to adopt a systematic approach to network builds, service layering and service creation and assurance," she says. "In our view, the foundation is an MPLS, core/optical transport network capable of delivering all end-user applications and servicing all end-user devices. The MPLS/optical core unifies all current network infrastructures – wireless/wireline, voice and data, metro optical, and cable. On that foundation, it is relatively easy for service providers to deliver mobile voice and high-speed data, VoIP, and legacy voice and/or even cable services. Our Service Intelligent Architecture gives service providers a systematic way to capitalise on opportunities at each layer of the network. It also includes Next-Gen OSS to support both legacy and new network services and elements as they are deployed."
Barrier free collaboration
"The ultimate goals are simplicity in network architecture and OSS, and flexibility in creating, delivering and managing services."
Davidson defines such 'barrier free' communications as the transparent, seamless exchange of information between service providers, network technology suppliers, content and application suppliers, and end users. The aims are to support consumer choice both in the types of applications, delivery methods and billing, independent of time or place, while simplifying users' experience and gaining maximum service awareness to improve take-up rates.
Success demands seamless, near real time data feedback shared amongst all partners. Increasingly complex relationships between operators, their suppliers and technology partners must be managed in new ways that support diversity and flexibility. The end result for service providers, Davidson suggests, will be new, differentiated revenue streams, richer consumer relationships and better management of capital and operational expenditure.
Telcos realise the benefits of highly automated barrier free communications systems in better flow- through, improved quality procedures, lower costs, shorter product lifecycles and faster times to market. In the process, they develop more flexible, integrated and responsive business systems designed to improve their ability to respond to fast moving competitive markets.
"The operational task is to govern priorities, organisational processes and resources to make sure the company is working on the things that drive business value," Davidson explains.
"Flow-through rates must continue to rise and must also support session-based services. Self-service and selection must be measured and improved with end-to-end process ownership."
Customer and service centric view
"Exploiting converged service opportunities requires that operators assure the quality of these services. Increasingly, providers will need to raise the service level bar to maintain differentiation and offer tiered services to extract the maximum value per segment. The value of managing service quality is immense."
The new service-centric view of the business makes it faster and easier to segment customers and offer differentiated service levels. Yet segmentation depends on operators knowing and understanding user patterns and behaviour. Real time network and session visibility are essential if telcos are to model, poll and predict service quality from customers' perspectives.
"Software needs to help support the free exchange of service quality and business optimisation information, and automatically distribute and monitor actionable events to all critical stakeholders," says Davidson.
Switching a business to a more holistic and service-centric model – and, in the process, making it more agile and proactive – is enabled by tightly integrated IT platforms. Yet measuring and managing operations from a service perspective in complex multi-vendor networks is complicated, and made more so by the move to IP transmission. Telcos are simultaneously introducing new services and access options to an explosive number of feature-rich end user devices. For most operators this is an on-going, expensive and sometimes frustrating exercise, which, in the short term, may slow down new product launches.
Ultimately, believes Davidson: "The goals of agile integration are to reduce this integration tax and speed software service delivery without breaking anything."
Agile integration enhances profitability
It is the task of OSS software vendors to make the integration process as painless and as productive as possible. It is up to vendors to design the enterprise-wide solutions that will help operators manage their enormous and ever changing product portfolio and tariffs.
Davidson's recipe for vendors includes: standards involvement and advancement; service-oriented architecture (SOA) software design; improved software delivery and development lifecycles; and effective content and technology partnerships. All of these are essential in designing the requisite software to allow operators to move from their current situation, in which the pervading 'spaghetti junction' of bolted-on legacy systems inhibits fast reaction to either market changes or customer demand. Integrated IT platforms are essential for operators wanting to succeed in the new communications market. Soon presence-enabled conferencing, personalised ring-back tones, speech activated services, multimedia entertainment and productivity applications will infiltrate all aspects of work and lifestyles.
"Users will be able to find, order and experience content quickly and simply. Users will hear, see, talk, purchase, do and create from a wide range of access means and applications with the aid of converged services," explains Davidson. "In the new network environment, services revolve around users, not users around the services.
"Networks will be fluid, responsive, dynamic. They will adapt to the changing location and preferences of an end user. In this environment, providers need to ensure robust, flexible services by means of progressive security, QoS-enforced IP routes, flexible pricing options and policy-based end user service selections."
Getting to this point depends on operators evolving from the bureaucratic entities many are, into modern, flexible businesses in which units seamlessly interact. Wholesale and retail billing systems must be linked with marketing and revenue assurance amongst others, and be fast enough to allow operators to change tariffs quickly. Information held in disparate databases should be accessible to authorised staff who can interrogate it to discover patterns of customer behaviour and/or the success of different services and change parameters as necessary.
Automating business processes
Successful implementations of OSS depend on close working relationships between vendor and service provider. Working with one incumbent operator, Lucent deployed OSS software that increased provisioning efficiencies by more than 50 per cent in the provider's large multi-vendor, multi-platform network with more than a dozen regional management centres delivering a variety of services to numerous customers.
Using Lucent's software to automate their subscriber management and activation processes to rapidly meet demand, another mobile operator now activates upwards of 600,000 orders (adds and changes) daily with 95 per cent flow-through.
Network traffic patterning products have helped operators improve network operations and revenue leakage analysis as well as reduce disputes with other carriers/partners and significantly increase call completion rates.
"We are helping service providers pioneer dual and triple mode access authorisation for 3G data and WiFi," says Davidson. "We have recently begun field-testing IP configuration software that can automatically analyse the capacity of IP paths and re-route them based on operational needs."
Ultimately, suggests Davidson, operators who understand the market dynamics and offer high quality services for a range of access devices will keep customers and increase ARPU.
In the triple play of home entertainment services (voice, video and data), operators must know and have more control over their networks and their subscribers if they are to increase performance, customer loyalty and take-up rates. Bandwidth hungry business applications depend on fast networks and high application throughput.
Whichever markets they sell into, guaranteeing service levels is as important for all operators as good customer relations and the quantity and quality of applications. 
"Ultimately service providers must anticipate the needs of customers and position themselves as the supplier of 'first convenience'," concludes Davidson. "The value of communications services will be measured by how well they satisfy and empower end users at home, at work and on the move."

Priscilla Awde is a freelance communications writer


Operators must learn how to exploit the technologies that will drive the WLAN phenomenon, suggests Kevin Dorton

Mobile providers worldwide are evaluating public wireless LAN (WLAN) networks as the newest way to provide communications services to consumers on the go. With so many consumers around the world eager to download content services, delivering content through mobile devices has proven to be a viable means of generating profits.
The number of hotspots in Western Europe is expected to grow from 1,500 at the end of 2002 to 32,500 in 2007, generating total revenue of $1.4 billion over a period of five years (figures from analyst firm IDC). Whether it is business users working remotely or individual consumers looking to arrange a night out with friends or seeking information about the latest films before visiting the cinema, demand is pushing the deployment of hotspots across the region.
WLAN service providers are in an excellent position to exploit this positive environment. However, in order for wireless service providers to understand how to bill and collect the revenue for WLAN services, they must first know how to exploit the technologies that will power a successful WLAN programme. When to invest in new or upgraded systems and when to leverage existing solutions are key issues for service providers.
The myth that operators need to replace their existing billing systems to support billing for WLAN is dispelled by the simple fact that WLAN billing is handled in essentially the same way as other mobile data services are rated and billed. Simply put, WLAN is just another way to access data. Operators' systems that can handle billing for mobile data can handle billing for WLAN.
As the core of the billing system should handle billing for basic WLAN services, operators have the choice to invest in complementary software to handle more complex WLAN offerings. Roaming and revenue sharing solutions not only enhance the WLAN experience for consumers, but can also be used as an opportunity to grow service providers' revenues and preserve their brand as users roam from hotspot to hotspot. 
Mobile users are growing more familiar with accessing customer service when and where they choose – including from small PDA devices that are WLAN connected. Extending online self-care applications to the travelling PDA user will enable wireless providers to take advantage of reduced customer service costs and allow them to deliver a timely self-care experience that increases customer satisfaction.
Reducing customer care costs by offering self-care applications has proven to deliver a significant return on investment for service providers. Gartner estimates that it costs an operator on average $5.50 each time a customer contacts a call centre, compared with just 24 cents for electronic customer self-care. In addition, giving consumers the opportunity to update their account information or change services at their own convenience also fuels loyalty to the provider – thus driving customer retention.
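Plugging the Gartner figures into a back-of-the-envelope calculation shows how quickly the saving accumulates; the contact volumes below are purely illustrative:

```python
# Per-contact costs from the Gartner estimate quoted above.
CALL_CENTRE_COST = 5.50   # $ per assisted call-centre contact
SELF_CARE_COST = 0.24     # $ per electronic self-care contact

def monthly_saving(contacts: int, self_care_share: float) -> float:
    """Saving when a share of contacts shifts from the call centre to self-care."""
    shifted = contacts * self_care_share
    return shifted * (CALL_CENTRE_COST - SELF_CARE_COST)

# A hypothetical 100,000 contacts a month with 40% shifted to self-care:
print(round(monthly_saving(100_000, 0.40), 2))  # 210400.0
```

At over $5 saved per shifted contact, even a modest migration to self-care pays for the application quickly.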
Investing for the future
Another way to profit from offering WLAN services is to partner with venues that can offer hotspots and with third parties that have the content consumers want. These types of relationships require revenue sharing agreements so that the parties involved are able to set up clear revenue sharing parameters and split profits among the players. If the service provider uses an automated revenue sharing solution, they control the management of the partnerships and the reconciliation of the revenues, putting the service provider in charge of the revenue flow.
Hot spot venues, aggregators, network operators/ WISPs will all need to be able to share revenues. They also need to be able to make settlements with content, application and merchant partners. The complexities grow as partnerships become more diverse and the number of partners increases, so it pays to be in charge of the revenue flow.
Organic growth
As WLAN services grow in popularity, the organic growth will permeate across all business models and payment methods. New and casual users will opt to prepay for WLAN services if the choice is presented to them, as the prepay payment method has a much lower perceived risk. Since WLAN is a relatively new service, many new users will want to first try the service through several different providers before signing up for a subscription. In addition, the individual WLAN provider typically has limited network coverage – leading consumers to find it easier to pay as they go rather than sign up with multiple vendors in order to ensure they always have WLAN coverage whenever they need it.
However, many legacy prepaid billing systems can't handle service authorisation and session management for multiple users accessing data, content and voice services. Existing billing systems can usually handle the simple 'pay in advance' single-session authorisation used in most hotspots. For WLAN services that complement 3G data and content services, however, real-time multi-session authorisation systems must track simultaneous user sessions and determine when to stop or re-authorise a user session.
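A minimal sketch of such multi-session authorisation logic might look like this. All names and the simple balance-reservation scheme are hypothetical, for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class PrepaidAccount:
    """Illustrative multi-session prepaid authorisation: each active session
    reserves a slice of the balance, and a session is only (re-)authorised
    while unreserved credit remains."""
    balance: float
    reservation: float = 1.0                      # credit reserved per slice
    sessions: dict = field(default_factory=dict)  # session_id -> reserved amount

    def authorise(self, session_id: str) -> bool:
        reserved = sum(self.sessions.values())
        if self.balance - reserved < self.reservation:
            return False                          # stop: no unreserved credit
        self.sessions[session_id] = self.sessions.get(session_id, 0) + self.reservation
        return True

    def close(self, session_id: str, actual_charge: float) -> None:
        """Release the reservation and debit the actual usage charge."""
        self.sessions.pop(session_id, None)
        self.balance -= actual_charge

acct = PrepaidAccount(balance=2.5)
print(acct.authorise("wlan"))   # True  - first session reserves 1.0
print(acct.authorise("3g"))     # True  - second concurrent session reserves 1.0
print(acct.authorise("voip"))   # False - only 0.5 unreserved credit remains
```

The point is that authorisation must consider all concurrent sessions against one balance, which single-session 'pay in advance' systems were never designed to do.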
In a world where staying connected no matter where you go is important – especially for the business traveller – a roaming solution is essential. The user should be able to access the service in a familiar way, and service charges should be added to the regular bill or debited from a pre-pay account. Therefore, as customers use different hotspots linked to different wireless providers, transaction information must be sent between the providers so they can appropriately bill the end user.
The WLAN environment 
Roaming is particularly important in the WLAN environment, as coverage may extend only a few hundred metres. UK consultancy BWCS recently stated that wireless service providers risk losing up to 30 per cent of potential hotspot revenues if there is no roaming agreement in place. So, while the industry awaits a universal standard for WLAN roaming records to be established, accepted and implemented, providers must have WLAN roaming solutions that allow them to export and import usage records for multiple roaming interfaces, some of which are proprietary. The result should be a seamless service and a single bill for consumers, regardless of whether the hotspots used are in a coffee shop, airport or hotel at different ends of the region.
The keys to capturing revenue from WLAN services and cutting customer care costs are in the effective use of billing and customer care systems. As hotspots continue to populate airports, coffee shops and other public arenas, service providers can reap the benefits by re-evaluating current systems for maximum return on recent and future investments.     

Kevin Dorton, CSG Systems, can be contacted via e-mail: Kevin_dorton@csgsystems.com [l=www.csgsystems.com/]http://www.csgsystems.com/[/l]

While operators focus on multimedia and IP-based content as future revenue generators, it is important that they do not forget the quality of their core product. Alun Lewis explains

The last decade has been a hectic time for the telecommunications industry. A combination of new technologies and radical regulatory change – combined with the financial quagmire of recent years – has challenged the ability of operators to stay focused and on track. It's now becoming clear that these distractions have impacted heavily on the core application that our industry has grown rich on: traditional voice services.
Even the familiar acronym POTS – Plain Old Telephone Service – illustrates the relatively low regard shown for voice services when compared with the much-hyped potential of multimedia content and IP-based applications. Despite this perception, the global market for voice – according to international research consultancy Ovum – is expected to continue to grow, from $784 billion last year to around $1,000 billion by 2007, so there's still considerable market share to fight for.
One of the key problems facing operators, irrespective of whether they're running fixed, mobile or IP networks, is making sure that the quality of the voice services that they offer is acceptable to the markets that they're targeting. While voice generally used to be seen as a 'one size fits all' service – sensible enough when everything was being carried pretty much by one operator over one network – the current cat's cradle of complexity involving interconnections between different carrier technologies and service providers creates major headaches for all involved.
A business, for example, may be happy to pay a lower charge for lower quality voice services for internal company use – but more than happy to pay a premium for external calls to customers. Similarly, the youth market for mobile services may be attracted away from SMS communication to speech interaction at slightly higher tariffs – and be happy for a lower grade of service than the majority of other mobile users.
In particular, the impact of VoIP and the steady growth of supporting IP-based access technologies such as DSL and WLAN are already forcing service providers of all sizes and types into a re-examination of the whole voice services market. Additionally, for operators with more traditional networks, it's also essential that they understand exactly where in the infrastructure to make the right investments that will generate ARPU or market differentiators through enhanced or managed voice quality.
But how exactly do you measure voice quality? Unlike the hard technical parameters used to measure the efficiency of data networks – packet loss, latency and so on – perceived voice quality also depends on the hardware and software of that most complex of mechanisms, the human brain and nervous system. The telecoms value chain is complex enough when it's just based on copper and silicon – add in the human mouth and ear and that complexity increases by an order of magnitude.
The good news is that part of the ITU has been working on this problem for a number of years, coordinating research to develop standardised methods that can be applied across multiple networks and different technologies. Their approach is based on a concept known as PESQ – Perceptual Evaluation of Speech Quality – which measures end-to-end voice quality based on a database of subjective listeners' experiences of call quality defined by Mean Opinion Scores (MOS):

Listening quality MOS scale

Score   Quality of the speech
5       Excellent
4       Good
3       Fair
2       Poor
1       Bad
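For automated reporting, a trivial mapping from a continuous score back to these descriptive labels might look like this (an illustrative sketch only, not the PESQ algorithm itself):

```python
def mos_label(score: float) -> str:
    """Map a MOS listening-quality score (1-5) to its descriptive label,
    clamping out-of-range values to the nearest end of the scale."""
    labels = {5: "Excellent", 4: "Good", 3: "Fair", 2: "Poor", 1: "Bad"}
    return labels[max(1, min(5, round(score)))]

print(mos_label(4.3))  # Good - e.g. a typical PSTN line
print(mos_label(2.9))  # Fair - e.g. the lower end of a mobile network's range
```

In practice the PESQ algorithm produces the score itself by comparing a reference and a degraded speech signal; the label is just a human-readable summary.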
While it's obviously possible for an operator to carry out regular customer surveys of voice quality, clearly the optimum solution is to automate the whole process through the use of software algorithms, so that network performance can be monitored on a regular basis. This means that they're effectively 'listening in' to calls and then reporting back to other management systems.

Recommended standards
At the end of 2003, the ITU announced one of the first of these recommended standards, based on a non-intrusive algorithm for use in PSTN and mobile networks, following intensive evaluation of different solutions from a number of different companies.
Iain Wood of BT spin-off Psytechnics, one of the winning solution providers and a specialist in voice quality metrics, takes up the story: "The mobile market in particular is going to have to start addressing the issue of voice quality very seriously. If they're to keep pushing ARPU, it's vital that they have some systems in place to be able to monitor the customer's end experience of voice services. Was a call ended prematurely because of voice quality problems, or is a customer churning because of poor reception? The kinds of data that you get from the usual QoS systems don't take account of this human-centric issue, even though it's vital to understanding the overall customer experience.
"Psytechnics has recently been looking at the performance of a number of mobile networks, both in the UK and abroad, and has found some interesting results. One of the most significant was that there was no discernible difference in speech quality between the most expensive and the cheapest handsets – despite a price differential of hundreds of pounds. Given that for many adult users voice easily remains their primary mobile service, does this mean that operators and handset manufacturers are putting too much effort and investment into unused – and ultimately unprofitable – features and capabilities?
"Similar issues apply when you consider the attempts that some mobile operators around the world are making to compete directly with the fixed-line market through homezone tariffing. With a typical UK PSTN line providing a MOS of 4.3, UK GSM networks typically operate at between 2.9 and 4.1, and this gap in perceived voice quality makes it unlikely that high-spending business users will migrate unless service levels are improved.
"Even if the customer's getting a full signal strength bar, there are a host of other factors that degrade speech quality and an understanding of how voice quality is performing is an invaluable tool in how to best engineer the value chain and all its underlying components.
"This technology also has an equally valuable role to play for the end customer, especially the large corporate that's unsure whether it's receiving value for its money. While it will certainly have strict Service Level Agreements in place to monitor its data networks – along with stringent penalties if these aren't met – both parties are effectively flying in the dark when it comes to measuring speech quality. Because of this, we're currently working on an application that can be downloaded onto the mobile device itself. This can then use SMS to report back to the operator – or another party – to provide MOS feedback, as well as displaying voice quality metrics to the actual user."
Important role to play
While voice quality metrics have an important role to play with mobile service providers, they have an absolutely vital one when it comes to supporting the wide-scale roll out of VoIP services in both public and enterprise networks. The price and performance benefits of moving to an all-IP environment might be well understood, but customers are often rightly sceptical of its suitability for carrying mission critical voice services. The whole industry is full of anecdotes about disastrous VoIP implementations that have failed to deliver and caused nightmares for customer and vendor alike.
For companies like Psytechnics, this presents another market opportunity, as Iain Wood explains.
"IP was designed originally for computer-to-computer interaction and unfortunately humans are a lot more critical than machines when it comes to carrying on conversations. In trying to replicate the behaviour of the PSTN in a packet environment, there are all sorts of subtle perceptual cues that have to be taken into account if we are to provide a truly satisfying speech experience.
"Unfortunately, the legacy management tools that came with the IP world are incapable of supporting voice services – as the industry found to its cost in the past. Fortunately, we now have a range of techniques – similar to those now being deployed in the circuit switched environment – that can be deployed to ensure voice quality remains appropriate for the context that it's used in.
"What's important to remember, however, is that delivering voice quality in an IP environment must be an iterative process. You might start by testing a LAN for its suitability even before a VoIP solution is deployed. That can be done by running simulated VoIP traffic over it and monitoring performance at strategic locations. Then, during commissioning, it's useful to carry out end-to-end testing to fine tune the configurations and get it ready for live use. After thats done, regular monitoring remains essential, particularly given the constantly changing topology of most IT networks and the natural variations in traffic flow that affect any real world environment. In some of these scenarios, it's appropriate to use testing tools; in others, constant monitoring is best.
"We've found two ways into this rapidly growing market. Firstly through supporting system vendors by allowing them to incorporate our technology into their management systems, and secondly through providing a consultancy service for service providers, system integrators and end customers."
With the ITU about to announce standardised approaches for VoIP similar to those it has already defined for mobile and PSTN services, voice quality finally looks like becoming definable – rather than infinitely debatable. Maybe it's time to update the old saying of the billing industry that 'if you can't bill for it, don't offer it' to 'if you can't measure it, don't try to sell it'...

Alun Lewis is a telecommunications writer and consultant. alunlewis@compuserve.com


If providers want broadband customer loyalty, they must prove they can be trusted right from the start, says Kieran Moynihan

The transformation of the world's wireline networks is already well underway in many countries, with the ability to deliver broadband content and entertainment services to a wide variety of non-telephony devices within the domestic environment already centre stage within the strategic planning departments of most fixed line operators.
Even straightforward voice services are about to be reborn in fresh guises, using new protocols like SIP -- along with simple presence and location information -- to widen the depth and breadth of traditional service portfolios.
However, in this headlong rush towards new revenue streams, it's vitally important that service providers don't lose sight of the actual customer sitting at the end of what is going to be an ever-lengthening value chain. It's not going to be any good just pumping more and more bits at them in the hope they're going to pay for this on the basis of sheer volume alone. For the customer -- who's already becoming far more discerning about the range of possibilities on tap to them -- other criteria are also important, particularly the overall quality of the total service experience that they get from their provider. The raw truth is that if you want them to come on the broadband journey with you, you're going to have to prove to them -- right from the start -- that you're a trustworthy partner.
For some fixed service providers, negotiating the service quality management route in the broadband environment could prove a problematic journey. Managing speech quality in the circuit-switched world, while not simple, is at least a well-understood and standardised discipline that's been practised for many decades. By contrast, making the move into the broadband IP world necessarily involves far higher value transactions for content services from the customer's perspective -- along with an accompanying increased exposure to uncertainty and risk for the provider. Before asking the customer to transfer all their communications services -- including broadcast -- onto the shoulders of one single provider, we'd better make sure that we're up to the job.
Fortunately, one part of our industry -- the mobile sector -- has already been here before and there are valuable lessons for the fixed operator to learn from that experience.
Consider the essentials of the wired broadband mix:
*  A variety of different access technologies, each with their own particular behaviours and physical limits
*  A need to support a multiplicity of services in a consistent manner
*  Great variation in the actual terminal devices being used by the customer
*  A potential separation between ownership of the actual physical network infrastructure and the providers of services to run on them
*  Complex and often highly distributed value chains, involving third parties and intricate settlement procedures
*  A fickle and often fragmented customer base, prepared to churn at a moment's notice
*  Customers who will use the network for both business and personal use -- with appropriate levels of security and reliability
Sounds a bit like the mobile environment -- but with one important difference. Customer expectations in the mobile space have largely kept pace with the network's actual ability to deliver on these. In comparison, consumers of broadband services will usually be approaching their relationship with their fixed line service provider with already high expectations, based on years of consistent performance. To them it's irrelevant that something called VoIP is letting them talk cheaply to friends on the other side of the country -- they just expect it to work.
In this context, the main lesson that could be transferred from the wireless community to their fixed line cousins is the importance of appreciating the totality of the customer experience -- and that means taking a far more holistic approach to the systems that support service quality. In the steady-state world of circuit-switched services, the data needed to monitor service quality was pretty basic, as only a handful of performance parameters actually impacted on the user's experience: did the call connect, did it complete, was it successfully billed for, and so on.
By contrast, there are a host of different issues that can impact on the broadband experience and, just like the mobile environment, not all of these will be under the direct control of either the service provider or the network owner. In taking a leaf from the mobile book, fixed line operators are going to have to start seeing things from the customers' -- not the network's -- perspective, and this involves a shift in focus to what in the mobile arena is called customer-centric service management.
That involves pulling together all the currently separate and discrete functions of network operations, customer care, billing, enterprise account and product management, and integrating them in ways that add value to the customer and the whole business, not just to narrow departmental responsibilities. With the right kind of service quality management system in place, faults that will have an immediate impact on customer service can be quickly identified and resolved, while triage can readily be performed to prioritise less urgent problems. Internal Service Level Agreements (SLAs) can also be defined and monitored across the organisation, leading to greater coordination between all the different departments actually involved in creating, deploying, marketing and billing new services.
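An internal SLA monitor of the kind described here can be sketched very simply: a set of per-metric targets checked against live measurements, with breaches surfaced for triage. The metric names and thresholds below are illustrative assumptions, not figures from the article:

```python
from dataclasses import dataclass

@dataclass
class SlaTarget:
    metric: str            # e.g. "mos", "call_setup_ms" (illustrative names)
    threshold: float
    higher_is_better: bool

def sla_breaches(targets, measurements):
    """Return the names of metrics whose measured value violates its target."""
    breaches = []
    for t in targets:
        value = measurements[t.metric]
        met = value >= t.threshold if t.higher_is_better else value <= t.threshold
        if not met:
            breaches.append(t.metric)
    return breaches

targets = [
    SlaTarget("mos", 3.8, True),                   # voice-quality floor
    SlaTarget("call_setup_ms", 3000, False),       # setup-time ceiling
    SlaTarget("billing_accuracy_pct", 99.5, True), # billing-accuracy floor
]
measurements = {"mos": 3.6, "call_setup_ms": 2100, "billing_accuracy_pct": 99.9}
print(sla_breaches(targets, measurements))  # ['mos']
```

The point of such a cross-departmental view is that a breach on any one metric -- whether owned by network operations or by billing -- surfaces in the same place, rather than inside one department's silo.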
Given the critical place we're at in the rollout of broadband services, there's a strong argument to be made for adapting that old adage 'If you can't bill for it -- don't offer it' into 'If you can't guarantee its quality -- don't offer it'. Fortunately, help -- and real world experience -- is at hand.

Kieran Moynihan is General Manager, Service Management Division, Vallent Corporation

Just as was predicted long ago, much of the technology underpinning telecoms networks these days has become a commodity. As a result, the areas that service providers now need to concentrate on instead involve finding the most efficient ways to set up, manage and control the myriad processes and events that take place on top of those enabling technologies.


