Analysis

FTSA, the parent company of the France Telecom group, along with its constituent companies, is experiencing the benefits of introducing e-learning into the development strategies for the group's 200,000 or so employees throughout the world. Bob Little reports

Although France Telecom has been using some elements of computer-based learning since 1993, it took its first tentative steps in e-learning some five years ago. It is only in the last year or so, however, that e-learning within the group has begun to take off.
"In the summer of 2004, we carried out a controlled test on the effectiveness and use of e-learning within France Telecom," explains Yves Scaviner, deputy manager for group training at France Telecom. "We asked 1500 of the company's managers to work through a number of e-learning courses -- delivered through the medium of both French and English -- and over 60 per cent were won over to e-learning as a result of this experience. And now they have become real ambassadors for e-learning and the spearhead for its deployment in operational units throughout France."
The France Telecom (FT) Group comprises five major subsidiaries: TP Group, based in Poland; Orange, which has a presence in 17 countries; Equant, which provides services for the top 3,000 multinational companies in 200 countries across the world; Wanadoo, the Internet connection provider; and FTSA, the parent company of France Telecom.
"It's not easy to change a prevailing corporate training culture and implement e-learning overnight," says Scaviner. "Face-to-face training is the traditionally accepted method of learning which everyone understands -- even if it doesn't suit their individual learning style.
 "The secret of introducing e-learning and gaining rank-and-file acceptance of it as a learning delivery method is not only to have high profile endorsement from senior management but also to convince line managers of the benefits and advantages of e-learning. Moreover, you also have to prove to employees that e-learning is not 'second class learning' simply because it rubs against traditional classroom learning.
 "E-learning offers many benefits and advantages over more traditional methods of training delivery," Scaviner explains. "It is true that developing and using e-learning can result in major cost savings -- especially where training large numbers of employees is concerned. However, while this is a significant reason for France Telecom, it is not the main reason that we are increasing our use of e-learning.
"As a company, our business is in providing the infrastructure to encourage and enhance 'e-transformation' -- so, in embracing e-learning, we are helping to set an example to our clients and suppliers.
"But, for us, the most important benefit of e-learning is that it is a more efficient way of presenting learning than via the classroom," he continued. "Our studies have shown that learners learn faster when they use e-learning, compared with conventional classroom-based teaching methods. Typically, we have seen that what takes some six hours to teach in a classroom can be learnt in four hours in the virtual classroom and three hours if done via distance learning. This makes e-learning a highly efficient as well as cost-effective way to learn."
While FTSA is increasing the emphasis it places on e-learning within the companies in its group, Scaviner is mindful of the dangers of putting all its eggs in one basket, and keen to stress that its human resource development strategy depends on a 'blended approach' -- that is, a mixture of classroom-delivered training with virtual classroom and distance learning, among other methods.
Having realised the cost-effectiveness of e-learning, compared with other learning delivery methods -- especially for companies with widely dispersed workforces, such as Equant -- FTSA is actively pursuing a strategy that will see half of its training/learning activities delivered via e-learning in 2006, up from some 20 per cent today. This e-learning comprises a mixture of custom-built courseware, mainly developed in-house, and generic e-learning courseware from two worldwide suppliers of training software.
"Where transferable skills are concerned, we do not want to produce in-house what is already available in the marketplace," says Scaviner. "That is why we have bought licences for some 3,000 generic e-learning courses.

Learning path

"Although France Telecom's 200,000 employees have theoretical access to each of these courses, in practice FT training staff choose a learning path for each learner based on that person's revealed training needs," Scaviner adds. "Currently, the most accessed programmes cover general management issues; managing meetings; discovering your management style; managing stress, and motivating staff."
According to France Telecom's Odile Demery: "E-learning -- both virtual classroom and distance learning -- can be delivered to learners' desktops, but many France Telecom employees work in open-plan offices where there is a greater chance of being disturbed during their learning time. For this reason, France Telecom has made available some 400 dedicated training booths around the country -- known as 'Espace Clic-n-Learn' -- where individuals can study undisturbed. The booths, time in which can be pre-booked online, offer an ideal solution."
"We are popularising e-learning throughout France © Telecom via a number of initiatives, including posters and mousemats advertising 'Espace Clic-n-Learn'," said Demery.
"Ultimately the popularity of e-learning will depend on the 'me too' factor -- as people see their colleagues visiting the Clic-n-Learn booths and benefiting from their new knowledge and skills."

CASE STUDY 1

Christine Skelhorn, head of training at Orange, passionately believes that e-learning is the way forward for Orange. Along with her training team and e-learning co-ordinator Amanda Yarrow, she is committed to providing all Orange employees with access to innovative, effective and enjoyable learning. At the beginning of 2004, Orange launched its e-learning strategy with a Corporate Induction module, developed with TATA Interactive Systems (TIS).
In its ten years of existence, Orange has grown to some 12,000 employees in the UK. As the company continues to grow so, too, does its requirement to recruit staff -- especially in the customer services field -- and give them induction training. Until the advent of the e-learning materials, the corporate induction programme was a three-hour PowerPoint presentation conducted, as required, by any of the company's available trainers.
"We are delighted at the positive feedback we've received to the induction module," says Amanda Yarrow. "In particular, it's been a real winner with new starters across the business.
"Users range from engineers who have worked at Orange for many years to newly recruited customer service staff. They all seem to like the image we've adopted of a 'fairy godmother'. This virtual entity guides them through the programme and helps to dispel any 'techno' fears that they may have. She also helps to add meaning and significance to the contents of the induction programme.
"Importantly, TIS seems to have hit on a style of presentation which appeals to everyone in the company," Amanda Yarrow adds.
Orange has a number of further e-learning programmes in development for 2005 and beyond.
"It's been an encouraging start with Orange Induction and we really look forward to building on our achievement in the future," concluded Yarrow.

CASE STUDY 2

In less than two years, Equant -- part of the FTSA group -- has provided ongoing e-learning to some 9,500 employees worldwide, saving $6.5m in the process.
Equant operates a worldwide telecommunications network that manages 152,000 user connections across 220 countries for some 3,700 customers. Its employees need ongoing training in a range of topics including IT skills, project management, problem solving and negotiation skills.
With so many of its employees based throughout the world, Equant knew that traditional classroom-based training was time-consuming and costly, so it implemented a blended learning strategy integrating elements of classroom-based training with e-learning. The bulk of its e-learning is provided by SkillSoft's IT and business-related courses; SkillSoft also provides a '24/7' mentoring service to support its IT curriculum.
By the end of 2003, Equant had over 8,500 of its employees using e-learning -- a user rate of over 85 per cent -- at a cost per employee of less than $100 for 24/7 access to 700 courses.
Access to e-learning is:
*  Having a positive impact on levels of staff retention at Equant.
*  Helping employees to become proficient in their jobs more quickly -- thus reducing costs, increasing productivity and revenue.
*  Enabling employees based in more remote locations to feel a closer part of the Equant 'family' and providing more development opportunities than were previously available to them.                               

Bob Little is a freelance communications writer

Providing a consistent level of Ethernet service is vital if carriers are to make the most of its potential, says Fred Ellefson

European carriers have embraced Ethernet services and, according to the Probe Group, are on track to grow the European Ethernet service market at a 40 per cent CAGR to provide almost €4bn worth of services by 2008. With the success of this service, its profitability for carriers is paramount. The capex side of delivering Ethernet services is extremely attractive, with Ethernet port costs approaching one tenth of comparable SDH port costs, according to Network Strategy Partners, LLC.
However, the opex side of delivering Ethernet services over a five year service contract will end up dwarfing the original capex for delivering the service, and is substantially higher than traditional SDH or PDH based services (see Figure 1, right). This is critical because Ethernet service price points are typically much lower per megabit than traditional SDH/PDH based data services. Opex costs can make delivering profitable Ethernet services a challenge.                                                 
The original Ethernet standards were designed for the enterprise LAN environment and do not have the operations, administration and maintenance (OAM) capabilities that carriers require to deliver a WAN service profitably on a wide scale. While manpower can be thrown at the problem when the number of customers is small, this approach will not scale as Ethernet services move down-market to the millions of small and medium enterprise customers necessary to grow the market.
Fortunately, the standards bodies have recognised this problem and have been working on adding the OAM capabilities required to allow Ethernet services to scale. The first OAM standards were ratified in the middle of last year by the IEEE 802.3ah Ethernet in the First Mile (EFM) standards group. While most of the media attention has been focused on the copper transport side of this standard, the bigger impact on carrier bottom lines will come from the OAM advances found in the standard. The ability to remotely monitor and perform maintenance will eliminate expensive truck rolls and dramatically improve the profitability and margin of Ethernet services. According to studies by Covaro, these new standards will reduce truck rolls (and opex) by 47 per cent versus unmanaged Ethernet services and can greatly reduce the total cost of delivering Ethernet services (see Figure 2, opposite).
In addition, the ITU and the Metro Ethernet Forum (MEF) are developing standards for end-to-end service monitoring and testing, while IEEE 802.1ag and the ITU are also working on connectivity standards for multipoint-to-multipoint services. These standards will be ratified in late 2005 or 2006 and, together with IEEE 802.3ah, will provide a layered set of OAM capabilities analogous to those found in SDH and PDH standards.

Ethernet demarcation

With these new OAM standards, the biggest changes and impact will occur at the customer premises, where the service is delivered to the end user. Traditional data services incorporate a demarcation device such as a smartjack or CSU/DSU to provide remote monitoring and test capabilities; such devices have been a critical factor in reducing opex and ensuring the profitability of these services. In the Ethernet world, carriers have had to improvise using Ethernet switches, routers, media converters or even SDH ADMs to perform this function. However, these devices are not up to the challenge: they cannot perform even the simplest of demarcation functions, such as loopbacks or test generation, and are often not particularly reliable. Many enterprise users are shocked to see the same router that they have to reset every week in their own network installed in the telco closet by the carrier for demarcation.
A number of companies are now offering purpose-built Ethernet demarcation devices which incorporate both Network Terminal Equipment (NTE) functionality and User Network Interface (UNI) functionality. The NTE performs the OAM capabilities defined in 802.3ah, including remote/local loopbacks, remote failure indication, fault isolation, performance monitoring with threshold alarms, status monitoring and discovery. The UNI function is aligned with MEF recommendations for Ethernet service definitions, including Committed Information Rate (CIR), Excess Information Rate (EIR) and burst size, on both a port basis and an Ethernet virtual circuit (EVC) basis for VLAN-based services. These two functions are key to both defining an Ethernet service and maintaining it profitably.
The NTE function provides a full suite of RMON Etherstats plus extensions, enabling carriers to monitor SLA conformance on both sides of the demarcation point and to analyse performance trends over time. Performance data can be collected and stored in 15-minute intervals, just like performance data from traditional carrier services. This valuable data provides a performance log for billing and SLA purposes, and gives advance indication of performance degradation before an outage actually occurs. Carriers can make this data available to customers through a web portal to facilitate customer network management, as is often done with private line or frame relay services.
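To make the NTE's performance-logging role concrete, here is a minimal Python sketch of 15-minute binning with a threshold alarm. The counter values and the one-in-10,000 error threshold are illustrative assumptions, not taken from any real device or standard.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PerfBin:
    """One 15-minute performance record, as kept for SLA and billing logs."""
    start: datetime
    frames_in: int = 0
    frames_out: int = 0
    errored_frames: int = 0

    @property
    def error_ratio(self) -> float:
        total = self.frames_in + self.frames_out
        return self.errored_frames / total if total else 0.0

def bin_start(now: datetime) -> datetime:
    """Align a timestamp to the enclosing 15-minute boundary."""
    return now.replace(minute=(now.minute // 15) * 15, second=0, microsecond=0)

# Accumulate counter deltas (hard-coded here; a real system would poll the
# device) into the current bin and raise a threshold alarm on degradation.
current = PerfBin(start=bin_start(datetime.now(timezone.utc)))
current.frames_in += 120_000
current.frames_out += 118_500
current.errored_frames += 42
if current.error_ratio > 1e-4:
    print(f"{current.start}: error ratio {current.error_ratio:.2e} above threshold")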
Should a service outage occur, the NTE provides remote visibility and control of the demarcation device, reducing or eliminating the need for a truck roll. The carrier can employ remote Ethernet loopbacks along with pattern generation/detection for remote testing, and can remotely determine whether the CAT5 cable connected to the CPE is open circuited, short circuited or properly terminated. Open or short circuits can be located to the nearest metre for precise diagnosis of cabling problems, which can then typically be corrected by the customer. The EFM dying gasp message even provides the carrier with a remote indication that power has been lost -- a fault the customer can often resolve on site, again without a truck roll.
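The loopback-and-pattern-test workflow might look something like the following sketch, which mocks the device's management interface. A real NTE would be driven via SNMP, a CLI or an EMS; the class and method names here are invented for illustration.

import random

class NteClient:
    """Mock management-channel client for a demarcation NTE.
    Stands in for whatever EMS/CLI/SNMP interface a real device exposes."""
    def set_loopback(self, enabled: bool) -> None:
        self.loopback = enabled
    def send_test_pattern(self, frames: int) -> int:
        # In loopback, frames are turned around at the NTE; simulate a
        # link that occasionally drops a frame.
        if not getattr(self, "loopback", False):
            return 0
        return sum(random.random() > 0.001 for _ in range(frames))

def remote_loopback_test(nte: NteClient, frames: int = 10_000) -> float:
    """Run a pattern test through a remote loopback and return frame loss."""
    nte.set_loopback(True)
    try:
        returned = nte.send_test_pattern(frames)
    finally:
        nte.set_loopback(False)   # never leave a live circuit looped
    return 1.0 - returned / frames

loss = remote_loopback_test(NteClient())
print(f"frame loss: {loss:.4%}")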
The Ethernet demarcation device's UNI provides the CIR, EIR and burst parameters needed to define the QoS/CoS of Ethernet services on a port or EVC (VLAN) basis. The use of VLANs enables multiple Ethernet services, such as dedicated Internet access and/or Ethernet private line services, to be carried in a single Ethernet connection to the customer. This function is extremely visible to the end user because it defines the look, feel and personality of the Ethernet service. 
Locating the service UNI at the customer premises enables service definition and prioritisation at the point where full-rate Ethernet is rate-limited down to the lower bandwidth and price points of services targeted at small to medium businesses. Performing this function at the rate-limiting point is essential to the proper prioritisation of latency-sensitive services such as VoIP and video. The customer premises location of the UNI also enables remote additions and changes to the service definition, eliminating truck rolls for service upgrades as well.
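One well-known way to enforce CIR, EIR and burst size at such a UNI is a two-rate token-bucket policer, in the spirit of a two-rate three-colour marker. The sketch below is a deliberate simplification (a single burst size shared by both buckets, illustrative rates), not a standards-conformant implementation.

import time

class TwoRatePolicer:
    """Simplified two-rate token-bucket policer: frames within CIR are
    'green', frames within EIR are 'yellow' (discard-eligible), the
    remainder 'red' (dropped)."""
    def __init__(self, cir_bps: float, eir_bps: float, burst_bytes: int):
        self.cir = cir_bps / 8          # token fill rates in bytes/second
        self.eir = eir_bps / 8
        self.burst = burst_bytes
        self.c_tokens = self.e_tokens = float(burst_bytes)
        self.last = time.monotonic()

    def colour(self, frame_bytes: int) -> str:
        now = time.monotonic()
        dt, self.last = now - self.last, now
        self.c_tokens = min(self.burst, self.c_tokens + self.cir * dt)
        self.e_tokens = min(self.burst, self.e_tokens + self.eir * dt)
        if frame_bytes <= self.c_tokens:
            self.c_tokens -= frame_bytes
            return "green"
        if frame_bytes <= self.e_tokens:
            self.e_tokens -= frame_bytes
            return "yellow"
        return "red"

# Illustrative: a 10 Mbit/s CIR / 20 Mbit/s EIR service on one EVC.
policer = TwoRatePolicer(cir_bps=10e6, eir_bps=20e6, burst_bytes=64_000)
print(policer.colour(1500))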

Consistent look and feel

In summary, in addition to providing a dramatic improvement in opex, demarcation devices give carriers a consistent service personality or SLA that can be hard to achieve when delivering Ethernet services over a mixture of fibre, copper, SDH and PDH technologies. Multi-site enterprise customers are often disappointed that their service SLA and monitoring capability is limited at sites served via early-generation SDH or PDH equipment. Installing a demarcation device behind an SDH ADM can provide VLAN and OAM support that the ADM cannot, enabling those sites to receive the same rich set of service capabilities as the native Ethernet sites. Providing a consistent, ubiquitous Ethernet service is key to providing a differentiated Ethernet service, increasing market share and ensuring the profitability of an Ethernet service offering.

Fred Ellefson is Vice President of Marketing, Covaro Networks, and can be reached via e-mail: fred.ellefson@covaro.com   www.covaro.com

As a mobile Internet protocol, i-mode could provide operators with a means of differentiating their services in the mobile data market, reckons Kevin Buckley

This year, the 3GSM World Congress in Cannes found the GSM world had well and truly embarked on 3G, with at least one, and usually several, operators having launched services in all the major markets. And not a moment too soon, as voice revenue everywhere is under pressure from competitors and, in the case of interconnect rates, from regulators. Data (beyond just SMS) is therefore charged with offsetting the declining growth in voice revenues.
In general terms, GSM world operators can be divided into two groups for the purpose of analysing their mobile Internet strategies -- the leaders or, frequently, the top two players in any given market. They will usually have far more subscribers than the rest of the competition, forming a de facto duopoly and vying between themselves for the leadership position, quarter by quarter.
These operators' main challenge is to migrate their huge customer bases smoothly from 2G through 2.5G to 3G. Having learnt from their mistakes with Wireless Application Protocol (WAP) phones, which came to market in 1999-2000 before an ecosystem of well-designed, well-conceived sites existed, they concentrate on building services rather than emphasising the technology. They sell camera phones and music downloads rather than GPRS or UMTS.
As such, they introduce their consumer-oriented mobile Internet offerings as content portals on their GPRS networks, signing up subscribers so that 3G can subsequently be marketed as a speed upgrade.
The other group comprises operators that, for one reason or another, need to differentiate their offering from the rest. Some are new entrants, i.e. groups that have no 2G customer base because they came in at the 3G licensing stage and therefore need to wow potential customers into leaving their existing provider in favour of them. Others already have a 2G business but aspire to become a market-leader and so need to raise their profile as a sexy option for mobile phone users wanting content offerings along with voice.
One way this group can seek to differentiate itself is by promoting the fact that it is offering 3G telephony, running ad campaigns that emphasise the new things that can be done with the more advanced phones in terms of content acquisition, m-commerce transactions or location-based services.
At the same time, the more efficient spectrum utilisation of 3G, compared to earlier generations of technology, means that more voice calls can be carried in the same amount of spectrum, a fact the new entrants are exploiting to offer cheap voice services. These are designed to bring customers to their networks, after which it is an easier task to persuade them to start using the mobile Internet function and to acquire content.

i-mode as a differentiator

Another way to stand out from the crowd, and one we are seeing in an increasing number of markets in Europe, is with i-mode. Like WAP, this mobile Internet protocol is an overlay on the network and can operate wherever an IP layer has been deployed, i.e. from 2.5G onwards.
The question all operators now face is: how can they make money from data services? It's all very well for a mobile carrier to say it has diversified beyond voice and into data, but mobile Internet access alone is not enough to bolster revenue. It too is being commoditised as operators start to offer flat rate "all you can eat" services to attract subscribers away from competitors who don't. Remember the cautionary tale of Internet service providers in the wireline world, whose initial promise was blighted as the flat rate, always-on environment grew, forcing them to move to value-added services or die.
Let's begin by separating the provision of such services to business/enterprise customers, who want secure mobile access to key applications running on their corporate networks such as ERP, CRM or SFA, from the marketing of non-voice functions to consumers. The latter represent a mass market that, aside from mobile e-mail and text messaging, essentially boils down to the sale of content. It is the provision of data services to consumers that I want to discuss here.

DoCoMo led the way in Japan

That business is, of course, in its infancy, but there are interesting lessons from a market we at NEC know very well, namely Japan. NTT DoCoMo is universally acknowledged to have been ahead of its time when it came to content with its development of i-mode, the proprietary technology which, from 2.5G onwards, has successfully built both a large subscriber base (some 44 million in Japan today, or 92 per cent of all DoCoMo subscribers) and a huge pool of vendor sites (about 84,500 right now, of which just under 4,400 are 'official' sites, i.e. ones that pay a 9 per cent commission on sales to DoCoMo, and 4,600 have 3G content). In financial terms, i-mode contributed 25 per cent of DoCoMo's total revenue last year, which is not bad for only its fourth full year in operation.
The model not only works for DoCoMo at home in Japan: it has also licensed the technology to operators in 12 other countries, in one case (KPN in Holland) a carrier in which it holds equity.
Eight of the 12 are in Western Europe: KPN and its subsidiaries E-Plus in Germany and BASE in Belgium, Bouygues in France, Telefonica in Spain, Wind in Italy, mmO2 in the UK, Ireland and Germany and CosmOTE in Greece. O2 plans to deploy i-mode in the UK and Ireland this year and Germany (under a different brand) in 2006.
In other parts of the world, Far EasTone in Taiwan launched in 2002 and Australia's Telstra followed suit last year, while both CellCom in Israel and MTS of Russia have signed up with DoCoMo to launch services.

WAP vs i-mode

Meanwhile other industry heavyweights, such as Vodafone, Orange and T-Mobile, are building services based on WAP gateways, with Vodafone live! the furthest advanced in terms of both countries covered (16 at the end of September '04) and subscribers (11.5 million at that same time). Like i-mode, WAP sits above, and is thus independent of, the radio access layer; provided the network is IP-enabled (i.e. 2.5G and beyond), both can deliver a mobile Internet experience.
Before we go any further, let me address the fact that WAP is an open standard while i-mode is proprietary and must therefore be licensed from DoCoMo.
All true, but let us not forget that if, for instance, a games developer wants its game to be playable on the Vodafone live! service, it must write to the operator's proprietary API, called VFX, for the purpose.
In the i-mode world, the main pull for ISVs to write to DoCoMo's API has to date been the carrier's commanding share of the Japanese market. Now that other licensees are coming online there is a buddy group forming which, by virtue of its collective subscriber base, again makes it worthwhile for the software developers and handset manufacturers to work to the i-mode spec. By its size and geographical reach, the group begins to rival the clout of Vodafone.

Street market vs. shopping mall

The difference between the two mobile Internet technologies -- and herein lies the secret of i-mode's success -- is that i-mode was developed after the way its market would work had been defined, whereas WAP debuted as, to paraphrase Pirandello, a technology in search of a business model.
The i-mode business model can be likened to that of a street market. If a vendor wants to set up a stall (i.e. a site), they agree to pay a percentage of the takings to the local council (i.e. DoCoMo). It is therefore in the operating council's interest to have hundreds of stalls, or indeed thousands, since the Internet does not have the physical restrictions of city streets.
Since the business model was thought out beforehand, from the outset DoCoMo recognised that it was in its interest to promote the take-up of i-mode, and to this end has always allowed so-called unofficial sites to proliferate, i.e. the ones that don't pay the 9 per cent fee on their sales and for whom it does not carry out the billing and revenue collection.
It still makes money on them, however, charging for the traffic generated by their m-commerce activities. Indeed, some 80,000 of the total 84,500 sites are unofficial, and what they pay to communicate with customers across the DoCoMo network makes up 50 per cent of the carrier's non-voice revenue.
Another major difference between the i-mode model and those of the operators basing their mobile Internet services on WAP gateways is that, in the former, all content acquisition is the result of Internet browsing and all content is delivered via DoCoMo's i-mode platform. In the WAP-based world, there are far fewer sites and the bulk of content is acquired via SMS. The operators derive revenue from the SMS traffic, of course, but considerably less than they would if their subscribers used the mobile Internet to do their m-commerce, particularly now that SMS and MMS are being bundled into cut-price packages along with voice minutes as competition heats up.
The revenue from the browse-to-buy model comes not, primarily, from the time subscribers are online -- particularly now that many operators are going over to a flat-rate model for Internet access -- but from charging the content providers to be the delivery mechanism for their ringtones, weather forecasts, horoscope updates or whatever, and from making it easy for thousands of providers to put up sites.
If i-mode is like a street market then the WAP-based mobile Internet model is like a shopping mall. The number of stores (i.e. sites) is small and the commission the mall owners (i.e. the operators) earn is a lot higher, anywhere from 40 to 60 per cent of vendors' revenue in fact. And since most content is bought by SMS rather than on the Web, one could continue the analogy by saying that most shoppers aren't even entering the mall.
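A back-of-envelope comparison makes the two models' volume bet explicit. Only the 9 per cent and 40-60 per cent commission figures come from the text; the per-sale framing is illustrative.

# Operator take per unit of content sold under the two models.
sale = 1.00                                     # one unit of content revenue
street_market = sale * 0.09                     # i-mode official site: 9% commission
mall_low, mall_high = sale * 0.40, sale * 0.60  # WAP portal: 40-60% commission

# The street-market model accepts a far smaller cut per sale and bets on
# many more stalls and transactions -- plus the browsing traffic revenue
# that every visit generates.
print(f"per-sale commission: i-mode {street_market:.2f}, "
      f"portal {mall_low:.2f}-{mall_high:.2f}")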
The knock-on effect here is obvious. If I receive an SMS inviting me to buy a snazzy ringtone and I text back to buy one, that's the end of the transaction. If, on the other hand, I get an e-mail with a link to a site where I can download the ringtone, the vendor has far more sell-on or sell-up potential while I'm on the site.
There is also a greater opportunity to create a recurring revenue stream by signing me up to a regular service of, say, a new ringtone every week or month, or indeed of multiple ringtones so that I can differentiate between calls from my boss (mental note to self: answer swiftly) and from my mother-in-law (mental note to self: let it go to voicemail). NEC believes that more content brings more users, and more users bring more content.
What's interesting in the Japanese market is that, since i-mode was the first and most successful service there, it has created the country's mobile Internet culture, such that DoCoMo's competitors seek to emulate its business model, even though they're not deploying i-mode.
So while Japanese consumers, thanks to i-mode, browse to buy, their European counterparts text to buy. In the second scenario, the content will often not even go through the operator's servers en route to the subscriber, being stored instead on a content server on the provider's network and delivered by SMS, with the operator deriving revenue only for transporting the text message.

Which strategy will win?

There will probably not be a single winner, as the success of any mobile Internet strategy will depend not just on the technology, or even on the business model alone, but also on the market clout of the operator adopting it, the acumen of the executives running it, and the business environment in which they are operating.
As manufacturers, we at NEC play an important part, as we provide the means to make it all happen! We see a place for both models, depending on where a particular operator aspires to be. Our challenge is to stay one step ahead -- developing the roadmaps for more cost-effective, high-performance platforms and the technology on which either model, or indeed any future model, can fit.

Kevin Buckley, General Manager NEC (UK),  Mobile Network Solutions Division, can be contacted via e-mail: kevin.buckley@uk.neceur.com

Inventory management is a vital ingredient in the feast known as VoIP services, as Julie Wingerter explains

It's like a hamburger and fries: it's just better together. And in the case of VoIP, IPTV and other IP-based services, inventory management really does make a difference to service providers' overall ability to roll out these services profitably and efficiently. VoIP services offer operators significant revenue upside, but they also come with a set of deployment and operational challenges. That's where a robust inventory management solution comes into play.
Because IP services such as VoIP and IPTV run over a combination of shared multi-service transport environments, there are more devices to provision and maintain, more network topologies to keep straight, and more bandwidth and traffic issues requiring prioritisation than in a typical POTS scenario.
For VoIP to work, all of these network-related activities, equipment and designs have to be monitored and managed in real time. To do this, a powerful network inventory management system is required. Such a system provides an accurate view of the network and serves as the core data repository, supporting the automation of routine functions and providing the vital information that allows billing, order management, service provisioning, outside plant and purchasing to run efficiently.

Carrier success

How will carriers be successful in rolling out new VoIP services? From a network perspective, their multi-service transport/broadband IP environments must meet some pretty high standards: tough enough to facilitate millions of phone calls; as reliable as legacy phone services, with a sound network architecture and POTS interconnection strategy; high quality (e.g. jitter-free, static-free); and designed to rapidly process customers' orders and provision services.
What lies ahead in rolling out new IP based services?  Let's look at some of the specific challenges carriers face when introducing VoIP. 
If you don't know what is in your pantry, you may not have all of the ingredients necessary to create an appetising dinner. Similarly, VoIP requires carriers to maintain an accurate picture of their network inventory so they can determine which services are available and when, and so they can plan ahead to avoid any 'shortages'. This is particularly important, as IP generally requires many more network devices and configuration parameters than POTS services, including: additional Customer Premises Equipment (CPE) and home networking components, with their associated MAC IDs, IP addresses, customer services data, pricing, etc.; access technologies -- HFC/cable, DSL, PON (FTTH); and PSTN off-net and other off-net connectivity components, such as media gateways, SS7, etc.
There is also a need to integrate with additional management applications that use inventory information such as billing systems, network management applications, etc., which may be using the information differently in order to introduce usage mediation or real-time trouble-shooting.
In short, VoIP services are more complex than standard POTS services because of the additional products and numbers/addresses that are used. Consequently, service providers need an inventory solution that provides: a mechanism to capture all of the physical and logical assets of the network; an integrated inside and outside plant inventory that supports an end-to-end view of the network; a means to keep the data current; and a mechanism to integrate with service provisioning processes.
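As a rough illustration of what such a solution has to model, the sketch below separates physical from logical assets and resolves one to the other for an end-to-end view. All class and field names are hypothetical; a real inventory schema is far richer.

from dataclasses import dataclass, field

@dataclass
class PhysicalAsset:
    asset_id: str
    kind: str            # e.g. "media gateway", "DSLAM port", "CPE"
    location: str        # inside or outside plant site

@dataclass
class LogicalAsset:
    asset_id: str
    kind: str            # e.g. "IP address", "MAC ID", "VLAN"
    rides_on: str        # id of the physical asset carrying this resource

@dataclass
class Inventory:
    physical: dict[str, PhysicalAsset] = field(default_factory=dict)
    logical: dict[str, LogicalAsset] = field(default_factory=dict)

    def path_view(self, logical_id: str) -> tuple[LogicalAsset, PhysicalAsset]:
        """End-to-end view: resolve a logical resource to its physical bearer."""
        logical = self.logical[logical_id]
        return logical, self.physical[logical.rides_on]

inv = Inventory()
inv.physical["gw-1"] = PhysicalAsset("gw-1", "media gateway", "central office A")
inv.logical["ip-10.0.0.5"] = LogicalAsset("ip-10.0.0.5", "IP address", "gw-1")
print(inv.path_view("ip-10.0.0.5"))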

Oversubscription management

Network congestion increases as thousands of new customers are added to an IP network. Oversubscription management looks at usage patterns and analyses traffic and network capacity in real time. However, for this information to be accurate, carriers must have a view into their whole network. With up-to-date network data from inventory management, this is achievable. With the ability to view the entire current network, service providers can adjust oversubscription management based on observed performance and quality, to ensure that service does not suffer.
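In its simplest form, the oversubscription check is arithmetic on inventory data: bandwidth sold against bandwidth provisioned, cross-checked against measured utilisation. The figures and the 10:1 policy threshold below are invented for illustration.

def oversubscription_ratio(sold_bps: float, capacity_bps: float) -> float:
    """Ratio of bandwidth sold to bandwidth actually provisioned on a link."""
    return sold_bps / capacity_bps

# Hypothetical numbers: 4,000 subscribers sold 3 Mbit/s each on a 1 Gbit/s uplink.
ratio = oversubscription_ratio(sold_bps=4_000 * 3e6, capacity_bps=1e9)
peak_utilisation = 0.72   # measured, from the inventory/monitoring feed
if ratio > 10 or peak_utilisation > 0.8:
    print(f"link oversubscribed {ratio:.0f}:1 -- consider adding capacity")
else:
    print(f"oversubscription {ratio:.0f}:1 within policy")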

Traffic prioritisation

For most consumers, static-filled calls are not acceptable. Therefore, to ensure QoS, a carrier needs to effectively monitor and prioritise the types of packets running through the network. Voice needs bandwidth. If there is not enough bandwidth available, voice quality deteriorates, and 'static', 'jitter' and dropped calls become inherent problems.
A carrier needs to implement traffic prioritisation policies to ensure that voice and video are prioritised over data transmission activities. Enforcing and adjusting these policies is paramount, especially when offering triple play services that all vie for bandwidth on the same network. All of this revolves around accurate network inventory data to maintain bandwidth at levels that match network activity levels.
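A strict-priority queue is the simplest illustration of such a policy. Real networks would classify on DSCP or 802.1p markings and use more sophisticated schedulers, so treat the following as a sketch of the principle only.

import heapq

# Illustrative priority map (lower number = served first).
PRIORITY = {"voice": 0, "video": 1, "data": 2}

queue: list[tuple[int, int, str]] = []
seq = 0

def enqueue(packet_kind: str) -> None:
    """Queue a packet; seq preserves arrival order within a priority class."""
    global seq
    heapq.heappush(queue, (PRIORITY[packet_kind], seq, packet_kind))
    seq += 1

for kind in ["data", "voice", "data", "video", "voice"]:
    enqueue(kind)
while queue:
    _, _, kind = heapq.heappop(queue)
    print("transmit:", kind)   # voice drains first, then video, then data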

Self-service feature management

Creating individual, made-to-order bundles of VoIP services is required to stay competitive, and is heavily dependent upon accurate network inventory information. Through web-based customer sites or by phone, users should be able to change their VoIP features in real time. For example, customers may go online and adjust their call forwarding, call waiting or voice mail parameters. Importantly, customers don't expect to have to wait for these changes to be applied -- they want them to be instantaneous. This capability requires accurate customer information and the associated network details.
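Conceptually, a real-time feature change is a validated write against subscriber and inventory data, pushed straight to the network. The feature names and data shapes below are assumptions for illustration.

# Hypothetical feature catalogue and subscriber record.
FEATURES = {"call_forwarding", "call_waiting", "voice_mail"}

subscriber = {
    "number": "+15551234567",
    "features": {"call_waiting": True, "call_forwarding": False},
}

def set_feature(sub: dict, feature: str, value) -> None:
    """Apply a self-service feature change immediately."""
    if feature not in FEATURES:
        raise ValueError(f"unknown feature: {feature}")
    # This is where accurate inventory matters: the change must be pushed
    # to the right softswitch/application server, identified from inventory,
    # without manual intervention.
    sub["features"][feature] = value
    print(f"{sub['number']}: {feature} -> {value} (applied instantly)")

set_feature(subscriber, "call_forwarding", "+15559876543")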
Another unique VoIP service feature is virtual phone numbers. It is possible to have a VoIP number reflect where a person or company would like to have virtual offices or presence in other countries. For instance someone from the United States might want to have a virtual office in the United Kingdom. Using VoIP, they can now have a number that matches UK phone numbering conventions even while all calls are routed back to the original US number. These unique enhanced service features require a network inventory that is flexible and integrates with other processes. VoIP brings to the table all these new service features that didn't exist in the traditional POTS environment.
IPTV deployment challenges mirror those of VoIP, only they tend to be intensified in scope. IPTV requires a bevy of new supporting equipment, from encoders and video-on-demand servers to video compression and IP encapsulation. For many carriers today, IPTV is the next big service they intend to roll out, and VoIP is their introduction to IP services.
Accurate inventory, traffic prioritisation, oversubscription management and self-service feature management are just a few of the areas that must be managed well to generate profits from VoIP and IPTV services.  An inventory-based OSS delivers the capabilities that allow carriers to see their whole network, add services and capacity to maximise their resources, and analyse results for future planning.
NetCracker Technology's OSS Solution is improving how carriers roll out next-gen services such as VoIP, IPTV, Fibre-to-the-Home and others. The Solution includes industry-proven inventory-based OSS software and the professional services delivery expertise to make it happen. NetCracker customers include Telstra, Australia's largest carrier; Telus, Canada's second largest carrier; MGTS, one of the largest wireline providers in Europe; and Covad, a leading North American broadband service provider, among others.

A moveable feast

VoIP and IPTV do have their unique challenges. However, combined with a solid inventory-management OSS, service providers can generate higher revenues, increase market share, maximise network resources and remain competitive.
In a multi-service transport environment, IP's additional equipment and design parameters need to be kept in sync using inventory management.

Julie Wingerter, Vice President of Strategy, NetCracker Technology   www.netcracker.com

Can MMS -- as predicted by many in the industry -- produce the knockout blow that will fuel greater revenues and opportunities for all, or will it be marginalised as a stepping stone to 3G services? Terry Ernest-Jones investigates

The MMS market is forecast to be worth $161.3bn in 2009, by which time it will be well established as a day-to-day feature of the mobile mass market. Yet so far, MMS has failed to deliver on its promise for many of its users, and its potential to enrich mobile communications has not been realised. It has recently passed through the early adopter phase, and the MMS industry continues to wrestle with problems such as handset compatibility, digital rights management and pricing models. However, the major opportunity -- receiving and sending multimedia messages on the move as easily as with SMS -- gets closer by the month.
MMS brings multimedia features such as photos, sound, video, rich text or interactive applications to mobile messaging. This can take the form of a message sent between mobile phone users ('peer-to-peer') or, of equal importance, a message sent from a third party to a user ('server-to-mobile'). An MMS message can be compared with a scaled-down PowerPoint presentation, able to contain a variety of media. By contrast, the Short Message Service (SMS) -- which has paved the way for MMS -- only allows basic text messaging of up to 160 characters.
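The 'scaled-down PowerPoint' analogy can be made concrete: an MMS message is essentially a set of media parts plus a presentation part describing their layout and timing, whereas SMS is a single capped text field. The sketch below is a simplified model, not the actual MMS encapsulation format.

from dataclasses import dataclass, field

@dataclass
class SmsMessage:
    text: str   # plain text, capped at 160 characters
    def __post_init__(self):
        assert len(self.text) <= 160, "SMS payload limit"

@dataclass
class MmsMessage:
    # Like a slide deck: an ordered set of media parts plus a presentation
    # part (SMIL in real MMS) describing layout and timing.
    parts: list[tuple[str, str]] = field(default_factory=list)  # (MIME type, name)
    presentation: str = "slideshow.smil"

mms = MmsMessage(parts=[("image/jpeg", "photo.jpg"),
                        ("audio/amr", "greeting.amr"),
                        ("text/plain", "caption.txt")])
print(len(mms.parts), "parts, laid out by", mms.presentation)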
Technically speaking, MMS originates from mobile messaging standards defined by the Third Generation Partnership Project (3GPP) and the WAP Forum (which has since merged into the Open Mobile Alliance). It has only really been up and running for about three years. Amongst the countries where uptake is strongest are Japan, South Korea, Germany and the Nordic region. The USA is less developed, reflecting the generally lower mobile handset penetration in the region. In between are countries such as the UK, displaying roughly average MMS usage levels: by the end of the first quarter of 2004, out of a total of 47.5 million subscribers to the four main UK mobile networks, 11.3 million MMS-active devices were registered, according to the Mobile Data Association. This gives an MMS penetration rate of 24 per cent.
MMS requires special handsets with colour screens and, usually, built-in cameras. When the first Juniper Research report on MMS appeared in 2002, there were just two MMS phone models. Now there are hundreds. They have already brought in significant revenues for the leading MMS handset suppliers such as NEC, Nokia, Samsung, Sony-Ericsson, Panasonic and LG Electronics. But for operators urgently looking for new revenue sources, and other suppliers in the value chain, MMS offers the chance to build on new consumer behaviour in messaging -- for example linking audience participation into digital TV programmes -- driving new data revenues and raising ARPU, as well as handset replacement rates.

Lifting the barriers

Whilst MMS offers a leap in mobile phone usage and appeal, it must be emphasised that until now it has also been a frustration for large numbers of users, even for basic functions such as exchanging photos between mobiles. However, many of the compatibility and interoperability obstacles that have dogged MMS will be ironed out over the next two years, allowing a freer flow of multimedia messages, approaching the level of today's SMS. A major advantage for MMS is that, following in the wake of SMS, it can slot into customers' existing mobile usage habits. The downside is that user expectations have been set to require the same standard of service, and smooth operation, as they get from SMS.
Fortunately for the MMS industry, users do expect to pay for mobile services. (This is not by any means the case with, say, the Internet.) MMS presents a large revenue opportunity, not only in providing enhanced peer-to-peer messaging, but also content and application-based services. As yet few suppliers are making any money out of it. But MMS provides opportunities to sell a range of enabling technologies. This potential will not be realised until operators and all other suppliers in the industry a) understand the dynamics of the value-chain, b) adhere to industry standards for interoperability, and c) fine-tune the infrastructure needed to exploit MMS.

The MMS value-chain

As MMS carries multimedia into the mobile telecommunications market, a new range of suppliers have been drawn to the mobile channel to reach a wider client base. Just as the web has linked the IT and media markets, MMS is bringing content providers and application developers in touch with mobile operators. There are several instances where these players are working in close co-operation to create a multimedia user experience, deliver attractive services, and provide end-to-end solutions that stimulate the market and create new needs.
Ease of use and intuitive interfaces are fundamental requirements for end-users -- neither of which is as yet properly addressed by the MMS industry. If, say, a young teenager is in a clothes store and wants to get the OK from a parent to buy an article of clothing via an MMS picture, the operation must be transparent and quick. Minutes spent overcoming the handset's technical hurdles in a busy clothes store will defeat the purpose of this and other useful applications.
From the mobile end-user's perspective, MMS involves a radical shift: in effect moving the focus of their attention from the ear to the eye. As so often with technology innovations, end-users are neglected as the key element in the value-chain, whilst the industry races ahead with 'push'. So far as MMS is concerned, the youth market is the main driver -- in fact some studies say that most MMS buying activity is seen amongst 15 to 17 year olds. But each age group needs to be taken into account -- there is a large mobile population of people aged 60+, and messages to them of, for example, grandchildren's birthdays should not be ignored under the misconception that MMS is 'youth only'.
A feature of young people's lives today which is in MMS' favour is that they record the details of their lives online a lot, and both sexes have a stronger urge than in previous times to share their experiences with friends.
Taking the UK as an average, around 25 per cent of handsets are MMS-enabled. Yet most young users of MMS phones today regard it -- initially at least -- as a disappointment. Typically, initial efforts to send photos to friends fail. Tales abound of handset makers blaming operators and vice-versa when frustrated users contact help centres to overcome problems. MMS represents an investment for end-users, who buy an MMS-capable handset -- costing up to $500 plus services. The mass market will only be willing to make this investment if users can derive real value out of using MMS. There are several compelling value propositions linked to MMS at its most basic level:
*  MMS handsets in themselves add kudos with their basic features of colour screens and polyphonic ringtones.
*  MMS offers the possibility to enhance the popular peer-to-peer messaging with the addition of photos and sound. More importantly, pictures and sound allow the user to add personality and emotional content to the messaging experience, and share this with friends.
*  MMS information and entertainment services promise unrivalled personalisation possibilities, and will enable users to access content anytime, anywhere.

Operators

Operators play a pivotal role in the value-chain. As such, they must focus closely on the end-user's needs and guide their business partners to help them develop offers that will satisfy end-users. (Presently, this is often the other way round.) This requires more than simply supplying 'jazzed-up' SMS messages; operators need to reach into new models such as T-Mobile's for the Euro 2004 football tournament. This scheme included MMS picture messages sent at intervals during each of a team's matches -- the whole package costing $4.50 per team for picture, or $10 for video, updates. (T-Mobile was contacted for this report to state the result of the programme, but declined to give figures. Instead it commented that it was "very pleased with both the level of uptake and also the technical performance of the delivery of the alert services.")

Wide range of services

To make their MMS offer complete, operators are having to provide a wide range of services, including the ability to create and store messages, and a wide choice of content and applications. In addition to these core services, operators are also continuing to add infrastructure to ensure that the value-chain functions properly. It's still early days though: Virgin Mobile, for example, only launched picture messaging in July 2004. The industry is still wrestling hard with how to adapt end-user billing systems to MMS messages, not to mention the task of ensuring that MMS content flows seamlessly from the original provider to the end-user. Upstream, it is also the operators' role to ensure that third-party providers collect revenue, as it is the operators who manage end-user billing.
The rewards that operators can reap from successful implementation of MMS are huge. MMS provides a much-needed boost in ARPU, as well as a justification for investments in General Packet Radio Service (GPRS) and Third Generation (3G) networks, and fosters strong customer loyalty. (3G cost operators around $120 billion in licences alone.) Efforts are underway to solve interoperability problems, but there is still a long way to go. "Operators have worked hard in previous months [on interoperability]," says Sandy Ryrie, messaging chief with operator O2. "We want to bring the same level of confidence to MMS as there is in SMS."
Generally today users are introduced to MMS services, whether they have specified it or not, simply by upgrading their handset. A wider range of MMS-capable devices is beginning to appear, catering to the needs of different market segments, from the prepaid youth segment to the high-end business user. For manufacturers, MMS presents a major opportunity.
Megapixel cameras are raising the quality of images, but the resulting files generally cannot as yet be handled adequately by operators. Meanwhile, the handset is turning into something of a Swiss Army knife, and some executives, such as the former technical director of Symbian, Simon East, have branched out to focus on the photo printing and image quality of MMS phones. His company Cognima aims to provide a 'single key press' for printing from phones.
Much of the effectiveness of MMS messaging today depends on a) how new a handset is and b) whether the vendor has followed internationally agreed standards to ensure interworking. Meanwhile, following the classic pattern for new technology devices, the emphasis is on launching handsets with dazzling features -- video cameras, megapixel resolution, 180-degree swivel lenses, 4x zoom, 265,000-colour screens, image editing -- in a wide variety of inventive, slick designs, weighing from around 85g.

Infrastructure vendors

Telecommunications software vendors provide the all-important MMS Centre (MMSC): the server that manages all MMS message flows within an operator's network, and handles addressing, filtering and temporary message storage. (WAP then carries the MMS message from the MMSC to the mobile.) Operators are continuing to invest significantly in infrastructure for MMS, and a growing industry has built up around supplying it, with companies such as LogicaCMG, Comverse, Nokia, Sema, Ericsson, Motorola and Alcatel. Managing the sheer volume of information that flows through is already a challenge but, more importantly, MMSCs need to be able to deal with a host of different formats and profiles. Content usually needs to be altered before presentation on the receiver's phone.
One of the MMSC's most useful functions today, once a message has been received, is to discover the configuration of the handset to which it is being sent, and adapt the format of the message so it can be accepted by that equipment. Even video messages can be adapted so they can be received when the recipient doesn't have a video handset.
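This adaptation step amounts to looking up the recipient handset's capabilities and transcoding or substituting parts it cannot render. The capability fields and handset database in the sketch below are hypothetical.

# Hypothetical handset capability database, as an MMSC might consult.
HANDSETS = {
    "basic-2004": {"video": False, "max_image_px": (160, 120)},
    "video-3g":   {"video": True,  "max_image_px": (320, 240)},
}

def adapt_message(parts: list[dict], handset: str) -> list[dict]:
    """Rewrite message parts to fit the receiving handset's capabilities."""
    caps = HANDSETS[handset]
    adapted = []
    for part in parts:
        if part["type"] == "video" and not caps["video"]:
            # Substitute a still frame so the message is still deliverable.
            adapted.append({"type": "image", "size": caps["max_image_px"]})
        elif part["type"] == "image":
            w, h = part["size"]
            mw, mh = caps["max_image_px"]
            adapted.append({"type": "image", "size": (min(w, mw), min(h, mh))})
        else:
            adapted.append(part)
    return adapted

msg = [{"type": "video", "size": None}, {"type": "image", "size": (640, 480)}]
print(adapt_message(msg, "basic-2004"))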
Certain infrastructure components are required to manage the store and forward functions of MMS. MMSCs have to connect into other network components, and a network must also be WAP-enabled and at least capable of handling GPRS. Apart from the MMSC, there are other infrastructure elements that must be implemented in order to offer effective MMS services:
*  End-user billing adapted to the nature of MMS messages.
*  Inter-operator billing adapted to the nature of MMS to deal with cross-networks messages and roaming.
*  Revenue sharing mechanisms to allow the automated redistribution of revenues across the content and applications value-chain.
*  Digital Rights Management solutions that can identify copyright-protected content in Peer-to-Peer messages.
*  Security to ensure that valuable content is protected.
*  Application and content gateways to allow third parties to link into the MMSC.
There is a further industry supplying technology around MMSCs. TCS (TeleCommunication Systems), for instance, provides messaging services for global operators. It doesn't sell MMSCs but has enabling technologies to enhance them -- such as providing a single domain for MMS.
MMSCs are not used only by large operators. In June 2004, for instance, Comverse announced its 'Compact MMSC', an entry-level solution for smaller wireless operators, which has already been deployed by two operators in Asia.

Content providers

MMS represents a new channel to market for information and content providers. Unlike the Internet or WAP, MMS provides a clear revenue opportunity for media suppliers, who can justify claiming a share of the traffic revenues that operators collect from end-users. As the market matures and billing structures evolve, they will be able to provide their services directly to the market, potentially using operators only as distributors, for example for a 'joke of the day' service.
A wide range of content types can prove effective over the MMS bearer, but content must be created specifically for this new channel in order to be successful -- taking into account the effect of new user behaviour as usage evolves. MMS as a distribution channel is superior to traditional channels in two ways: it offers the possibility to create finely personalised content, and can be accessed by people on the move.
MMS is also a compelling advertising channel. It enables the building of highly targeted campaigns, and the communication of clear messages directly into the hands of the intended recipient. MMS will also be an extremely effective conveyor for 'viral' marketing campaigns.
It is quite possible that the use of MMS to supply third-party content to the phone, rather than peer-to-peer messaging, will eventually steer the market. Many involved with the MMS industry believe that MMS could ultimately be driven by major events -- sporting or otherwise. It is, after all, the ideal platform for instant updates and alerts on events of special interest, which can then be forwarded to friends.

Applications developers

MMS enables interaction between mobile handsets and networked servers, and there are numerous ways to translate these interactions into concrete applications that provide value to the market. In the consumer market, MMS can support TV quizzes, polls and eventually video games. Operators will be eager to offer such applications, as they generate repeat traffic and create customer satisfaction. Content providers and advertisers are also interested in providing such applications as marketing tools. On the business side, corporate applications like ERP or CRM systems can use MMS to develop a mobile extension and reach remote and mobile workers, providing them with a permanent link to their company and customer databases or e-mail.
Attractive and compelling content, rather than exotic handset designs and features, will be the factor that ultimately drives MMS, though. SMS was the mobile success story of the 1990s, and the jury is out on whether MMS will turn out to be the success story of the present decade. Most likely the main role of MMS will be as a stepping-stone for the multimedia applications and services that will drive 3G. Either way, video is the next step -- a natural progression to what MMS does best: 'sharing the moment'.
The main focus is on consumers, who undoubtedly will drive the MMS market in the near future. There is more to MMS, though, than multimedia 'infotainment' for consumers. There is also the possibility of developing interactive mobile business communications and applications. Handset manufacturers are well aware of this: the Siemens CX65 business user mobile launched in the summer of 2004, for instance, comes not only with a digital stills camera but also video, taking clips of up to 15 seconds. Indeed, business use of MMS technologies will become a $64.1bn industry by 2009.

Terry Ernest-Jones is an Associate Consultant with Juniper Research   www.juniperresearch.com

Reducing revenue leakage and maximising profits is the nirvana of every operator in the market. So, what are they doing to achieve this goal? John Maclean investigates

According to the experts the future is starting to look rosy again for the global and European telecoms markets. Gartner reports that revenue decline is slowing for the top North American telcos, while in Western Europe, the EITO (European Information Technology Observatory) expects the industry's rate of growth to increase from 2004 to 2005.
Although the picture is certainly looking less bleak, Gartner also reports that worldwide, the BSS and OSS markets will only see incremental and uneven recovery until telcos resume sustainable profitability. But surely, the BSS and OSS markets should be recovering ahead of telco profitability? As the market eases, telcos are looking to pick up the pace in launching new services and deploying new technologies. But a complete inversion of investment priorities, focusing on new services rather than back office processes, could in the long term produce precisely the results that they are trying to avoid. Profitability will come by maintaining the focus on the back office to give a strong set of customer facing processes, upon which new services and offerings can be delivered -- quickly, effectively and profitably.
In order to achieve profitability, telcos need, principally, to ensure that they minimise the time and cost associated with turning customer orders into cash. The less it costs a telco to provision and service a customer effectively, the greater chance it has of keeping valued customers and of making a profit. For example, as a Western European mobile operator recently commented: "Our financial figures are now in the top five of the operators in Europe. We automated our order management process within the past two and a half years and this has totally helped the system compared with the manual processes of the past."
This quote came from a recent research study by Analysys, which highlighted that despite many recent improvements in order-to-cash processes, most telcos acknowledge there is still significant work to do to fully optimise the order-to-cash lifecycle.
Analysys surveyed over 40 telecoms operators in Europe and North America; a staggering 80 per cent of respondents admitted that further order management improvements are essential if they are to achieve better financial results and competitive differentiation. Fifty per cent of telecoms operators, and within this, all bar one of the large operators, are without a single common process for order taking across all lines of business.
One in three of the operators that had met obstacles in improving order to cash processes had to stop or delay critical product launches -- a highly detrimental factor in the quest for profitability in a competitive marketplace. Operators looking to jump start their market expansion plans need to ensure faster time-to-market, especially as the pressure intensifies to introduce new services and tariffs.
With profitability as the ultimate target and order management the means of getting there, there are also a number of goals to be achieved along the way. Namely, in the Analysys research, half of the operators surveyed cited improved cash flow, better service, increased customer satisfaction and revenue growth as benefits made possible when the obstacles to improved order management are removed.

Hurdles to clear

Analysys outlines two main hurdles to achieving these goals. These are the sheer complexity of the order management processes that need to be overhauled, and the impact of seemingly irreplaceable legacy systems that still remain the backbone of telcos' IT infrastructures.
The issue with the order management business processes themselves is that they are becoming increasingly complex as telecoms businesses become more diverse, nimble and affected by rapid market change. Issues like constant development of new services to keep competitive edge and a high rate of market consolidation, mean that the market landscape for telcos is constantly evolving. Despite their increasing complexity, processes need to be effective, resilient and efficient in order to deliver profitability.
The complexity of IT systems and the difficulty of customising legacy IT systems is blatantly clear from the survey. 29 per cent of telcos are trying to hard-wire the critical systems needed across order management to create an integrated process. While this may serve today's needs, hard-wiring usually means that there is little option for change. How will these telcos fare in the future, when their order management processes, and probably the systems needed to support them, have changed beyond all recognition?
Only one in five operators has invested in integration tools with standard technology interfaces, despite the fact that these systems offer a more future-proof route to profitability because they are based on open standards that allow any combination of systems to be integrated.

Lip service to industry standards?

The Analysys survey shows an increase in awareness of industry standards such as the TeleManagement Forum's (TMF) Enhanced Telecom Operations Map (eTOM), which provides a blueprint for successful business processes like order management. But while 60 per cent of large and medium sized operators are tracking and analysing these standards for ways to create value in order management, only five per cent are actually developing order management solutions using the industry standards. Over the next few years, the market needs to follow these early adopters who have successfully made order management improvements using standards-based solutions.
One early standards adopter is R (formerly R Cable), one of the fastest-growing telecommunications companies in Spain. In early 2004, R decided to take the standards based approach to order management by implementing Vitria:OrderAccelerator, a solution based on the TMF's eTOM framework, which streamlines and automates manual, order-related business process flows across OSS and BSS systems.
The decision by R to standardise on an order management platform was based on the desire to run the entire process of customer subscriptions and services provisioning from one place. It is integrating information flow between the web, its customer relationship management platform, service activation systems, on-site workforce systems, network inventory system, its Interactive Voice Response facility, and billing system in a single environment.
"The strong competition in the telecommunications sector requires a sustained effort to provide the best levels of operating efficiency and service," comments Antonio Gómez, Systems, Organisation and Quality Manager at R. "Vitria addresses specific issues that have a direct impact on efficiency and quality through the automation of processes and the integration of information flows inside the company. By offering visibility of customer details and services across the organisation, Vitria also helps us reduce costs and improve service."

The revenue impact

An earlier study by Analysys, in February 2004, showed that, with revenues overall growing more slowly, management of existing revenues has become increasingly important. Under the global title of 'revenue assurance', this has become an absolutely essential project for operators. The study showed that of the top six causes of revenue loss, three were in the area of order management and processing, namely 'Poor Processes and Procedures', 'Poor Systems Integration' and 'Applying New Products and Services'.
Even more surprising was the gap between assumed revenue loss and likely real revenue loss. In Western and Central/Eastern Europe the loss level deemed acceptable was around 1 per cent of revenue; the research suggested, however, that the actual loss level was probably well over 7 per cent. For an operator with, say, 10 billion euros in annual revenues, that is the difference between 100 million and more than 700 million euros a year. This was unlikely to be due to complacency, but rather the result of vertical organisational structures, and a resultant inability to recognise the real causes of revenue loss and, more importantly, to put in place effective measures to reduce them.
In today's climate, lost revenue is close to being a crime and it is vital that operators focus on eliminating it before getting carried away with the front-end technology. It will be of no long-term value to have launched the most leading-edge services if, at the same time, an operator is leaking more revenue at the back.

A two-pronged attack

It is clear that although the improved overall climate for operators is permitting some renewed opportunity to launch new technology-based services such as 3G in mobile and VoIP in fixed, the biggest risk is that they lose focus on the back office. A strong and determined effort to 'sort out' order management, and its associated processes and procedures, will have the triple benefits of reducing costs, stemming some key sources of revenue loss and, most importantly, allowing the more effective and timely launch of new and exciting service offerings.

John Maclean is Telecoms Marketing Director EMEA at Vitria, and can be contacted via tel: +44 (0)1628 421852

As UMTS finally takes off across Europe, network planners are exploring how best to meet projected 3G subscriber growth. The technology at the base station RF interface holds many of the answers, writes Joerg Springer

In the latter half of 2004, the third-generation (3G)/universal mobile telecommunications system found its feet in Europe. This technology should provide answers to many of the business challenges facing the region's aging second-generation (2G)/global system for mobile communications networks. While Western Europe's GSM networks still enjoy steady subscriber growth, many are over a decade old, and face serious capacity limitations.
UMTS is founded on wideband code division multiple access (W-CDMA) technology, and promises relief from the current capacity headache. W-CDMA offers a capacity-per-MHz far greater than that of time division multiple access (TDMA)-based GSM, plus reduced OPEX. It also promises more established and sustained growth in average revenue per user (ARPU) -- a powerful driver in an industry climbing its way out of a three-year ARPU slump. The earliest experiences with 2.5G data services suggest that the more advanced 3G will be an important factor in industry growth.

Unique challenges

Although the European '3G beast' is now flying, there are unique challenges ahead from a network expansion and RF perspective. Today's 2G-to-3G transition is a different scenario to that of the mid-nineties leap from first generation analogue services to digital.
In just over a decade, Europe's cellular services have matured dramatically, with penetrations at around 85 per cent. The downside of this is that all the prime base station sites are occupied. The environmental requirements regarding site location and visibility have also 'matured' to become some of the world's most demanding.
The transition from TDMA-based GSM to W-CDMA-based UMTS technology also influences network planning. Where TDMA planning strategies are based on minimising co-channel interference by re-using a select number of channels over a group of cells, W-CDMA uses the full frequency band in each cell. Moreover, W-CDMA cells are said to 'breathe' -- the size of the cell varies with the number of callers within the cell, the transferred data rate and so on. The resulting co-channel interference that can occur in the W-CDMA network increases the noise floor, and progressively depletes the capacity of the network. It presents a notably tougher network planning challenge than GSM, particularly in addressing the interference resulting from adjacent cells.
Perhaps most challenging of all is subscriber expectation with regards to quality of service (QoS). No longer are subscribers willing to condone dropped calls and fades -- Western Europe has arguably the highest cellular QoS in the world. The new UMTS services have much to live up to.

Rooftop realities

The majority of Europe's urban cell sites are rooftop-based. Given the tough site acquisition conditions, the easiest 3G roll-out option (and the one largely chosen to-date by Europe's UMTS operators) is co-siting.
The situation on European rooftops today has much in common with a crowded early morning commuter train -- no-one enjoys the congestion, there are established long-term disputes and rivalries between some 'passengers', but on the whole, the system works. To accommodate UMTS spectrum, new antennas are required, so the 'train' needs to be reorganised. The most popular strategy adopted to-date is the multi-band antenna solution. This is manifesting in strong demand across Europe for dual- and tri-band antenna solutions supporting combinations of UMTS 2100 MHz, GSM 900 MHz and GSM 1800 MHz.
A further challenge is co-siting interference. When antennas operating at different frequencies are located in close proximity, there is potential for RF interference, caused by intermodulation products or spurious emissions, which can in turn lead to BTS or Node B blocking. The most extreme cases occur when the core band spectral separation is narrow (a pairing of UMTS 2100 MHz and GSM 1800 MHz services is the most obvious case) and the antennas are physically close. As a result, UMTS/GSM co-location isn't always straightforward. In some cases, it simply isn't practical, and the new UMTS operator is forced to opt for a site that is nearby, but 'sub-optimal'. The RF challenge is to make the best of such a bad situation, and to optimise the RF footprint to suit the alternative location.

RF flexibility

The upshot of this highly constrained site location scenario -- coupled with the exacting requirements of W-CDMA network planning -- is that Europe's 3G operators are demanding higher levels of base station RF precision and flexibility. First and foremost is the issue of antenna performance: new-generation 'precision footprint' antennas feature diminished side and rear lobe radiation levels, improved null fill, and increased front-to-back ratios.
'Flexibility' is being sought on a number of RF technology fronts -- specifically in the control of cell footprint size, shape, direction and power. To compensate for CDMA-style cell breathing and less-than-optimal site locations, variable electrical tilt is de rigueur. Increasingly, this is accompanied by remote tilt control functionality, linked back to the network management centre via industry standard communication protocols.
There is also demand for tower-mounted amplifiers (TMAs) across the majority of Europe's 3G sites. These provide amplification of the uplink signal from the terminal, which overcomes losses in feeder and co-siting components, decreases the system noise and increases the potential cell size. The need again is for flexibility -- a broad choice of amplification levels, dual and multiband configurations, and a wide selection of antenna gains.
In addition, the challenge of W-CDMA adjacent cell interference has created a demand for alternative apertures. Where the 65-degree tri-sector is the norm in 2G/GSM networks, antenna apertures of 90-, 45- and even 33-degrees permit the W-CDMA network planner to 'break the symmetry' of the final cell pattern, and thus minimise cell-to-cell interference.
The not-too-distant future holds even more RF challenges -- significantly, the evolution from a coverage-driven to a capacity-driven strategy. In the very short term, we'll see UMTS operators continue to expand and enhance their 3G coverage across the major city and urban centres. These are the areas that present the greatest revenue earning potential, and are also the most voice capacity-challenged.

Longer term challenge

More challenging, though, is the longer term. Analysts predict a 50-fold increase in Europe's UMTS subscriber levels between end-2004 and end-2009. In essence, this suggests Western Europe will see 3G subscriber levels rise to almost equal those of the region's current GSM count, in a time frame just over half that afforded to the evolution of GSM. This represents extraordinary subscriber growth, and presents unique challenges to 3G network planners and RF technology providers.
To meet the fast-growing capacity requirements as GSM subscribers migrate to UMTS, we can expect to see even more advances in base station RF technology. These will almost certainly be founded on two key elements: advanced 'hybrid' (a mix of active and passive components) antenna solutions, plus greater intelligence and control functionality built into the antennas.
While today's antennas are purely passive, tomorrow's antennas will need to integrate active RF conditioning components, such as low-noise amplifiers, multiplexers and filters, with even greater levels of control. Similarly, by providing more onboard intelligence within the antenna, an increasingly broad range of antenna pattern parameters might be adjusted and controlled. This will provide superior levels of flexibility to the network planner.
Over time, we'll also see a vast improvement in W-CDMA network simulation tools. This should result in more dynamic and intelligent network management strategies, and possibly lead to the realisation of the so-called 'dynamic antenna performance control'. Here, the adjustable parameters of the antenna components -- both active and passive -- could be corrected in response to the simulation tools, perhaps even in a closed-loop real-time configuration. It is these types of super-flexible base station RF solutions that will play a significant role in the establishment of UMTS in the longer term.     

Joerg Springer is the Chief Marketing Officer, Radio Frequency Systems (RFS). He can be contacted via: tel +49 511 676 2516; e-mail: joerg.springer@rfsworld.com
www.rfsworld.com

Andrew Beutmueller and Phil Haddock examine the effectiveness of an automated rights management system

Recently, Assicurazioni Generali, the third largest European life insurance company, controlling 626 companies worldwide, made the strategic decision to implement an Identity and Access Management solution to automate rights management across a range of diverse systems. Beginning at its Swiss division, Generali Group Switzerland, the new system has helped to streamline increasingly complex IT environments, and to cut operating costs and errors.
A series of mergers and acquisitions brought a number of formerly independent Swiss insurance companies under one roof, resulting in rapid growth in the number of IT users requiring access to a broad range of services, data and applications running on a variety of platforms across the Generali Group Switzerland.
Despite the increased work load resulting from the merger, resource provisioning was still performed manually using printed paper forms to assign each user a set of access rights to a specific configuration of systems and resources. This system of 'manual rights assignment' worked well enough in the past, but with the merger and today's increasingly complex IT environments, manual rights assigning has become costly, time-consuming and inaccurate.

Role-based provisioning

The objective was, of course, to replace the labour-intensive, system-specific assignment of user rights and permissions then in place at Generali with "a strategic, centrally administered directory service (role concept) based on standards (X.500, LDAP)," according to Jürgen Lorek, Internal Auditor of Generali (Schweiz). The approach referenced an emerging standard known as Role-Based Access Control (RBAC).
After some research, it was discovered that the best option would be to implement a Meta Directory solution that was built around a role-based approach to identity and access management.
Role-based provisioning is based on the RBAC standard developed by the National Institute of Standards and Technology (NIST). The solution deployed at Generali Group Switzerland enables cross-platform provisioning at a level that closely mirrors the organisational structure of the enterprise. The definition of roles, role hierarchies, relationships, and constraints reflects the levels of responsibility and specific operations to be executed by persons in particular jobs. Each role is assigned one or more permissions containing bundles of access rights, and each employee is assigned one or more roles. The specialised Meta Directory solution enables access rights to be granted, refused, withdrawn and monitored dynamically, independently of the platforms and applications used. Once a role-based framework has been put in place for an organisation, the principal administrative actions are the user-to-role (user-to-job) assignments.
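As a rough illustration of the RBAC model described above -- permissions bundled into roles, roles assigned to users -- the following minimal Python sketch uses invented role and permission names, not anything drawn from the DirX product itself:

    # Minimal RBAC sketch: users -> roles -> permissions (names are invented)

    class Role:
        def __init__(self, name, permissions):
            self.name = name
            self.permissions = set(permissions)  # bundled access rights

    class User:
        def __init__(self, name):
            self.name = name
            self.roles = set()

        def assign_role(self, role):
            # The principal administrative action: user-to-role assignment
            self.roles.add(role)

        def effective_permissions(self):
            # Union of the permissions carried by all assigned roles
            perms = set()
            for role in self.roles:
                perms |= role.permissions
            return perms

    # Example: a hypothetical claims-handler role in a legal insurance unit
    claims_handler = Role("claims_handler", {
        "oracle:damage_assessment:read",
        "oracle:contract_mgmt:write",
        "exchange:mailbox",
    })
    user = User("jdoe")
    user.assign_role(claims_handler)
    assert "exchange:mailbox" in user.effective_permissions()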

The DirX product

The Identity and Access Management solution implemented at Generali Group Switzerland is based on the following components of the Siemens DirX product suite: 1) DirX V6.0D10, a high-performance LDAPv3/X.500 directory server that stores employee data, the DirXmetahub configuration and all roles, groups and accounts; and 2) Siemens HiPath SIcurity DirXmetahub V6.5B10, a Meta Directory engine supporting HiPath SIcurity DirXmetaRole V2.0B00 for user provisioning and access management.
Master identity data is maintained consistently in the DirX directory server and made available centrally. The Meta Directory engine DirXmetahub ensures the automatic synchronization of data between all connected systems. HiPath SIcurity DirXmetaRole enables the provisioning.
Three separate systems for development, test and production of the DirX-based solution were put in place. Siemens provided the software components and licenses along with project consulting and a standard set of maintenance services.

Putting theory into practice

The Generali Group Switzerland's Legal Protection division offers legal insurance -- it was this part of the company that was targeted for the initial implementation of the solution. It was a prime candidate, as some 55 users required differentiated access to diverse resources and services: sales administration, contract management, and damage-assessment systems (all running under Oracle), and Microsoft ADS/Exchange.
A great deal of preparatory work went into the description of specific job functions and their definition in terms of roles, and a key requirement was to enable HR personnel to perform a 'user-to-role' assignment for each employee. Once the name, address, salary, contract start, effective date of entry, etc. were registered, the HR manager was to assign a role corresponding to a specific job.
At that point, the automatic procedures would take over: the entire user-to-role assignment would be read from the HR database and passed back to HiPath SIcurity DirXmetaRole, where the corresponding assignment of access rights was to take place.
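A hedged sketch, in Python, of what such an automated hand-over might look like: the HR-assigned roles are compared against the directory's current state to compute the grants and revocations each connected system must apply. All names and data structures are invented for illustration; the real DirXmetahub connectors are proprietary:

    # Hypothetical HR-driven provisioning sync (illustrative; not the DirX API)

    def sync_assignments(hr_roles, directory_roles):
        """Compare HR-assigned roles against the directory and return the
        grants and revocations each connected system must apply."""
        grants, revocations = {}, {}
        for user, desired in hr_roles.items():
            current = directory_roles.get(user, set())
            if desired - current:
                grants[user] = desired - current       # newly assigned roles
            if current - desired:
                revocations[user] = current - desired  # withdrawn roles
        return grants, revocations

    # Example: HR assigns a role to a new starter and removes one from a leaver
    hr = {"jdoe": {"claims_handler"}, "asmith": set()}
    directory = {"asmith": {"sales_admin"}}
    grants, revocations = sync_assignments(hr, directory)
    assert grants == {"jdoe": {"claims_handler"}}
    assert revocations == {"asmith": {"sales_admin"}}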

Problem solving in real time

The objective of achieving flawless synchronization between the meta-directory and the HR management system is simple enough in theory, but making it work in real time was a challenge. Fortunately, the solution is flexible enough to take into account the possibility that customer-specific extensions of the DirXmetaRole might be necessary for effective integration in legacy environments.
Moreover, synchronization between the directory and the Generali application systems was critical. Once the role-based rights configurations had been integrated in the DirX meta-directory, they had to be mapped back into the different application systems and resources accurately.
"Our core business is built on the Oracle platform and is structured very similarly across the different applications, so this backwards compatibility with the current database applications in production was absolutely critical," explains Lorek.
Now that the solution is in place, when a new employee enters the system or when the role assignments of existing employees change, all that is required from Generali's HR manager is to check a box on screen. The new software reads the status of all user-to-role assignments -- enabled, added or deleted -- and the result is automatically synchronised with the entire legacy system. As such, "the whole complexity of the paper-based, manual process in place prior to the project has been reduced to a single mouse-click," says Lorek.

Success at hand

The Identity and Access Management solution has been in production in the Legal Protection division for almost a year now. Under the previous manual system, correcting an error in rights assignment took three or four hours and involved several different people in HR or IT administration -- a highly cost-intensive exercise. And considering the potential damage from misuse of obsolete or erroneous access rights, the risks were obvious.
"We underestimated the amount of work involved, but we also underestimated the impact our success would have on the organization," said Lorek. "The solution improved the security of our systems; it reduced the risk of misuse and potential damage to the business."
Furthermore, as remaining non-core applications continue to be integrated into the central system, the benefits become clearer, especially in terms of significantly lower administration costs; near-perfect accuracy in terms of moves, adds and changes; and greater confidence in the security of the IT environment and the company's ability to satisfy evolving regulatory requirements.


Jürgen Lorek talks about Identity and Access Management at Generali Group Switzerland

Q: Although you have a background as an IT professional, you also brought a business point of view to the challenge of resource provisioning. Can you give us a little more detail?
Lorek: I was lucky in the sense that I could take on both the IT and the business analysis and bring them together in a form of interdisciplinary thinking.
I studied this problem and, in doing so, added complementary business and financial analysis skills to my professional IT expertise. As such I was able to couch the resource provisioning issue in terms that non-IT colleagues and external solution providers could understand; I chose an object-oriented analysis and design tool known as UML (Unified Modeling Language). With this I was able to analyse and model Generali Group Switzerland in terms of its organisational structure and business processes. The UML model was a real milestone. It was clear that if we could find a product capable of implementing it, we would have a solution.
We began with classical organisational theory and looked at Generali's business in terms of how it was organised and in terms of business processes, as an operational organisation. We evaluated all of the resources available in the company under these two points of view. Although functions often cross departmental divisions, it's very difficult to change the way people traditionally think and work, so we're proceeding on a two-track path.
As far as provisioning is concerned, we've introduced an operational structure. We have Siemens HiPath SIcurity DirXmetaRole set up to map functional responsibilities in terms of roles and corresponding privileges. We've modelled everything that a person does within the enterprise, the day-to-day business in a functional organisation as well as project-oriented business, and together that provides the summa summarum of all rights and privileges for that person.
Q: What sort of impact did this approach have on the project?
Lorek: I think it's one of the main reasons the project has been successful -- the fact that we were able to convince management to look at it as a business issue rather than as a purely technical challenge. It was a business and organisational challenge that could be resolved using IT tools.

Negotiating telecoms contracts can be a daunting prospect but, as David Warren explains, there's no gain without pain

Having squeezed out 'low hanging' inefficiencies, organisations are now turning to review their telecommunications cost structure on a European scale. The money enterprises spend on telecommunications services has long been considered a cost of doing business.
Yet, in today's business environment, no area of spend is sacred -- especially with expenditure on telecommunications amounting to one to two per cent of sales. A good telecommunications review could reduce the investment by 20-25 per cent. Repeated every one to two years, those savings could continue at 10-15 per cent.
In recent years, national players in telecommunications are being challenged by global carriers who, in turn, are being challenged by pan-European suppliers. Business models for pricing are being rewritten. We are juggling with 'postalisation' of rates, voice over IP, escalation of mobile usage, a switch to the receiver paying (through freephone numbers), and volume -- not distance -- pricing as data becomes the prominent form of traffic.

Corporate lethargy

But there remains a corporate lethargy for action. National contracts, in many cases, remain untouched. Business users, although dictating and escalating the expenditure, are not taking responsibility for it. Telecoms suppliers are making the most of the profitability window.
Given the complexity, inaction is understandable. But it is better to instigate a review than to have a review instigated on you.
Having decided the time is right -- how do you get started? Renegotiation of a telecommunications contract or contracts is similar to any other legal agreement and just as serious. To embark on a review the telecoms manager(s) responsible must begin with a phase of Discovery. The Discovery Phase may need to take place within each of your operating territories. Each territory will need to work together to ensure that all parties involved are fully aware of the parameters and ambitions of the review. This first step is critical to a successful contract, as it informs you of the complete telecoms expenditure of your organisation. The fact is that most users don't know what they have, what they use and what they are paying for.
It is worth noting that decisions defining mobile needs are different to those for traditional voice services, and very nationally focused. International mobile communications are on the increase, charges for these services are high, and international roaming makes them very difficult to control. The actual charges are very dependent upon the various supplier partnerships in place in each country and the network provider used for the home network. Mobile on a European basis therefore warrants a different, but linked, evaluation and negotiation.
The Discovery Phase is followed by a renegotiation. Renegotiation should begin with a sharing of the discovery findings with the incumbent supplier(s), along with your strategic plans and projections for the future. The incumbent should be given the opportunity to respond. One time out of five the vendor team responds successfully. Often the vendor escalates the review within their organisation: if this happens, 50 per cent of the escalations provide a favourable response.
Risk is a prerequisite to an effective telecom services agreement. To obtain favourable rates and concessions from vendors, telecom customers must convince vendors that they are willing to change service providers. Put simply: clients considered to be 'at risk' get the best prices.
If successful the review process can end there. But, if not, it will move on to the time consuming process of Request for Proposal (RFP), a second renegotiation, then if necessary wider circulation of the RFP, final selection, negotiation, award and conversion. One way to ease the pain of this process is to use proxy bids based on market data to test the proposal of your incumbent organisation. The results of similar discoveries conducted at other customer organisations provide a comparative context, allowing negotiations to be based on actual market rates.

Priorities in the RFP

If you get to RFP, three priorities for inclusion are: pricing linked to market rates, annual reviews and demanding quality of service parameters.
1) Firstly, with regard to pricing, the RFP should insist that vendors commit to coming within 15 per cent of current market prices, with pricing that is explicit and not tied to controlled price lists or confidential contracts.
2) The second 'must' is to include a contractual provision for annual renegotiations against market rates, to allow for adjustments in response to new market conditions and changing business requirements. Flexibility can also be enhanced through a low minimum annual commitment, which should be less than 66 per cent of expected annual spend.
3) And finally, to ensure quality of service, the terms must include non-linear credits for service outages. What I mean by this is that telecom agreements are typically structured so that, in the event of a service outage, the vendor provides 'credits' for additional service, rather than a monetary payment. If the penalty isn't sufficiently painful, the vendor may prefer to dole out credits rather than fix an underlying problem causing outages. An effective contract must therefore escalate penalties after each outage, as the sketch below illustrates. An escalated penalty structure provides the telecom vendor with a clear incentive to ensure that the cause of the initial outage is investigated and addressed, in order to prevent future problems. For the customer, meanwhile, quality of service is ensured.
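To see why escalation matters, here is an illustrative Python sketch comparing a flat credit schedule with one that doubles after each outage; the figures and multiplier are invented for the example, not drawn from any real contract:

    # Illustrative comparison of flat vs escalating outage credits
    # (base figure and multiplier invented; real schedules are negotiated)

    def flat_credit(outages, credit_per_outage=1000.0):
        return outages * credit_per_outage

    def escalating_credit(outages, base=1000.0, multiplier=2.0):
        # Each successive outage doubles the penalty: 1000, 2000, 4000, ...
        return sum(base * multiplier**i for i in range(outages))

    for n in (1, 3, 5):
        print(n, flat_credit(n), escalating_credit(n))
    # After five outages the escalating schedule costs the vendor 31,000
    # against 5,000 flat -- a much sharper incentive to fix the root cause.
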
After the RFP is prepared, give your supplier the opportunity for a second renegotiation. If this fails to achieve the concessions you require, publish the RFP to the wider competition. This does not necessarily mean a change in vendor; indeed most often the incumbent retains the business.
RFP response assessment will most likely uncover the two best responses, with which the final negotiations take place. Both organisations should discuss the strategic view of their businesses and ensure a mutual fit in ambition over the coming years. If a new carrier is chosen, conversion must be managed as smoothly and amicably as possible. The vendor's ability to execute this conversion transparently from the end-user point of view will set the tone for the life of the contract.
This process needs to be repeated for each carrier in each territory. Phasing is important, as there may be opportunities for one supplier to offer a service across several other territories, or for one supplier to operate across all territories, in the style of an international managed service. More and more businesses, in our experience, are taking this approach.
So you have concluded negotiations and agreed on your carrier(s) of choice. But it's not over.

Carrier pre-select

Carrier pre-select is another feature that is driving down telecommunication rates, as companies can use different carriers for different purposes: taking advantage of time-of-day discounts from one carrier, and favourable international rates from another, while using mainstream carriers for the majority of their telecommunications requirements.
A bit like deciding whether to take the train, bus or car for a particular journey, these decisions can be made on the fly. There is, however, a limit to the complexity that is appropriate to build in. For the most part, large companies are continuing with single contracts with mainstream carriers, but increasingly medium sized companies are using carrier pre-select. To compete, traditional carriers must respond with pricing favourable enough to outweigh the incentive to use pre-select.
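As an illustration of such an on-the-fly decision, the Python sketch below picks the cheapest pre-selected carrier for a call given its destination class and the time of day. Carrier names and rates are invented:

    # Least-cost carrier pre-select sketch (carrier names and rates invented)

    RATES = {
        # (carrier, destination_class): pence per minute, by peak/off-peak
        ("CarrierA", "national"):      {"peak": 3.0, "offpeak": 1.0},
        ("CarrierB", "national"):      {"peak": 2.5, "offpeak": 1.5},
        ("CarrierA", "international"): {"peak": 12.0, "offpeak": 9.0},
        ("CarrierB", "international"): {"peak": 8.0, "offpeak": 8.0},
    }

    def select_carrier(destination_class, hour):
        period = "peak" if 8 <= hour < 18 else "offpeak"
        candidates = {c: tariff[period]
                      for (c, d), tariff in RATES.items()
                      if d == destination_class}
        return min(candidates, key=candidates.get)  # cheapest carrier wins

    assert select_carrier("national", 10) == "CarrierB"       # peak: 2.5 < 3.0
    assert select_carrier("national", 22) == "CarrierA"       # off-peak: 1.0 < 1.5
    assert select_carrier("international", 10) == "CarrierB"  # peak: 8.0 < 12.0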

Get out and be proactive

As customers demand more, in the coming months and years, telecoms suppliers must get out and be proactive: once a customer has started the review process the supplier is on the back foot. Carriers should take new pricing models to their customers proactively and demonstrate how they can save the customer money while keeping a profitable, loyal customer for themselves.
Similarly, telecoms managers must remember that suppliers are running profitable businesses: squeezing prices into oblivion will not do you or them any favours. Understand the true market pricing and be realistic. Forcing short-term cost cutting as far as possible doesn't work either.
Both parties want a strong ongoing relationship: for the supplier it means more business; for the customer it means simplicity. It can still be a 'customer for life' approach, this time based on open relationship building on both sides.
In summary, telecoms costs are typically one to two per cent of a company's sales. Yet most users don't know what they have, what they're using, or what they're paying for. It's not uncommon for enterprises to have multiple telecoms contracts covering the same services, and to pay several times for the same service. Effective negotiations and management of telecom contracts can have a measurable impact on a business' financial performance. Telecoms managers can drive this change and ensure a profitable relationship, on both sides.

David Warren is a managing consultant specialising in telecommunications at Compass Management Consulting, and can be contacted via tel: +44 1483 514500; e-mail: info@compassmc.co.uk

As potential breaches of IT security become reality, the cost of doing nothing could be severe. And, as Craig Pollard explains, it's not just data that needs tightened security...

In today's business environment, IT network security is vitally important, with security breaches across voice and data networks growing by the day. Emotive terms such as 'cyber attack' and 'cyber-terrorism' are always certain to generate plenty of media excitement, with science-fiction visions of malevolent hackers creating vicious computer viruses to rampage through cyberspace, doing unseen and untold damage to the infrastructures that support our way of life. However, while the reality of IT security is far more mundane than such science-fiction ideas, the threat to a network from malicious attack remains real and the consequences just as frightening. Every business is dependent upon information technology, which brings with it inevitable vulnerability.
Dark rumours of underground hacker networks and conferences give rise to the belief in a vast and growing number of aggressive, deliberately destructive hackers. Significantly, the methods these hackers adopt to gain unauthorised access to corporate resources are now also extending to embrace telecommunications systems.

The terrorist threat

The hacker phenomenon has a serious and far-reaching influence. Were communications on two continents ever disrupted by moving telecommunications satellites? Have computing resources belonging to government agencies ever been hacked? Have environmental controls in a shopping centre ever been altered via a modem? The answer to all of these questions is yes. But, unlike other crime groups who receive high-profile coverage in the media, the individuals responsible for these incidents are rarely caught.
As if that were not enough, unauthorised use of telecommunications facilities is the preferred methodology for people who sympathise with or support terrorist organisations, and want their activities to remain invisible.
The French authorities studying the Madrid train bombings in March 2004, for example, are investigating whether the bombers hacked into the telephone exchange of a bank near Paris as they were planning their attack. The telephone calls involved were made by phreaking -- a practice similar to hacking that bypasses the charging system.

Combating telephony fraud

The PBX is among the areas most susceptible to telecommunications fraud. Typical methods of fraudulent abuse involve the misuse of common PBX functions such as DISA (Direct Inward System Access), looping, call forwarding, voicemail and auto attendant features.
Another frequent target for fraudulent exploitation is the maintenance port of the PBX. Such ports usually have a dial-up modem attached to support remote maintenance activities, and hackers often exploit these modems to gain entry. When a PBX is linked to an organisation's IT network -- as is increasingly the case with call centres, for instance -- a poorly protected maintenance port can offer hackers an open and undefended 'back door' into such critical assets as customer databases and business applications.

When things go wrong

It is clearly important to balance the cost of securing your voice infrastructure from attack against the cost of doing nothing. The consequences of inaction can include:
*  Direct financial loss through fraudulent call misuse (internal or external)
*  Missed cost-saving opportunities through identification of surplus circuits
*  Adverse publicity, damage to reputation and loss of customer confidence
*  Litigation and consequential financial loss
*  Loss of service and inability to discharge contractual obligations
*  Regulatory fines or increased regulatory supervision
As is the trend with hacking data networks, the threat to PBXs comes primarily from within. For example, an employee, a contractor, or even a cleaner could forward an extension in a seldom-used meeting room to an overseas number and make international calls by calling a local rate number in the office.
The perpetrator could likewise be the beneficiary of a premium rate telephone number in this country or abroad, leaving phones off the hook or on a re-direct to that number and netting thousands of pounds in illicit gains over a weekend.
And, of course, let's not forget about the new telecommunications technologies based around open communications via the Internet. These include IP-driven PBXs supported by all the adjunct devices, the deployment of CTS (Computerised Telephone Systems), CTI (Computer Telephony Integration) and Voice over IP. The introduction of these technologies means IT and telecoms managers now need to be even more alert, as new and existing threats typically associated with data networks begin to impact upon voice networks. Without diligent attention, telecoms systems are in grave danger of becoming the weak link in the network and utterly defenceless against targeted attacks by hackers.

Practical measures

So, what practical measures can telecom or IT managers take to help prevent becoming a victim of telecom fraud?
One of the most effective approaches to improving the security of telephony systems includes conducting regular audits of:
*  Station privileges and restrictions
*  Voice and data calling patterns
*  Public and private network routing access
*  Automatic route selection
*  Software defined networks
*  Private switched and tandem networks
*  System management and maintenance capabilities
*  Auto attendant and voicemail
*  Direct inward system access (DISA)
*  Call centre services (ACD)
*  Station message detail reporting
*  Adjunct system privileges
*  Remote maintenance protection
*  Primary cable terminations and physical security of the site and equipment rooms
Other measures include reviewing the configuration of your PBX against known hacking techniques, comparing configuration details against best practice and any regulatory requirements that may pertain to your industry sector.
Ensure default voicemail and maintenance passwords are changed and introduce a policy to prevent easily guessable passwords being used. Make sure that the policy demands regular password changes and take steps to ensure the policy is enforced.
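As a rough illustration of what enforcing such a policy might involve, the following Python sketch rejects vendor defaults and trivially guessable voicemail passwords; the rules shown are common examples rather than a recommendation from any particular vendor:

    # Illustrative password policy check for voicemail/maintenance access
    # (rules are common examples, not drawn from any specific product)

    COMMON_DEFAULTS = {"0000", "1234", "123456", "admin", "password"}

    def is_acceptable(password, extension=""):
        if password.lower() in COMMON_DEFAULTS:
            return False                 # vendor default or trivially common
        if len(set(password)) == 1:
            return False                 # e.g. '1111'
        if extension and extension in password:
            return False                 # password derived from the extension
        return len(password) >= 6        # minimum length

    assert not is_acceptable("1234")
    assert not is_acceptable("2222")
    assert not is_acceptable("4017x", extension="4017")
    assert is_acceptable("g7#Kp2")
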
Installing a call logging solution, to provide notification of suspicious activity on your PBX, is a useful measure and one that can often give valuable early warning of an attack. In addition, review existing PBX control functions that might be at risk or which could allow errors to occur.
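A minimal Python sketch of the kind of screening a call logging solution might perform, flagging the fraud patterns described above -- after-hours international calls and long calls to premium rate numbers. Thresholds, number prefixes and field names are invented for illustration:

    # Illustrative call-log screen: flag patterns typical of PBX fraud
    # (thresholds, prefixes and field names invented for the sketch)

    from datetime import datetime

    def suspicious(call):
        """call: dict with 'extension', 'destination', 'start', 'minutes'."""
        hour = call["start"].hour
        after_hours = hour < 7 or hour >= 20
        international = call["destination"].startswith("00")
        premium = call["destination"].startswith("09")
        long_call = call["minutes"] > 60
        return (after_hours and (international or premium)) or (premium and long_call)

    calls = [
        {"extension": "4017", "destination": "0041227310000",
         "start": datetime(2005, 3, 12, 2, 30), "minutes": 45},
        {"extension": "2001", "destination": "02079460000",
         "start": datetime(2005, 3, 12, 11, 0), "minutes": 5},
    ]
    alerts = [c for c in calls if suspicious(c)]
    assert len(alerts) == 1 and alerts[0]["extension"] == "4017"
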
Be aware that many voice systems now have an IP address and are therefore connected to your data network. Therefore, you must assess what provisions you have to segment both networks. Security exposures can also result from the way multiple PBX platforms are connected across a corporate network or from interconnectivity with existing applications.
Research and investigate operating system weaknesses, including analytical findings, manufacturer recommendations, prioritisation and mitigation or closure needs -- and implement a regular schedule of reviewing server service packs, patches, hot-fixes and anti-virus software.
Finally, formalise and instigate a regular testing plan that includes prioritisation of the elements and components to be assessed, and supplement this by conducting a series of probing exercises to confirm the effectiveness of the security controls used.
To achieve this level of security on a voice network requires an advanced level of expertise. Insight and Siemens are drawing on their combined skills and experience in information security and telephony solutions to introduce a new portfolio of voice security services that provide a comprehensive approach to mitigating the threats that voice networks face.
These services include security audits, vulnerability assessments, incident response, forensic investigation as well as telecom policy review and development. All services will be compatible with voice equipment from Avaya, Cisco, Ericsson, Nortel, Mitel, Siemens and others.                                                   

Insight Consulting, a division of Siemens plc, are exhibiting at Infosecurity Europe on 26th-28th April 2005 in the Grand Hall, Olympia.
www.infosec.co.uk

Craig Pollard, Head of Security Solutions, Siemens Communications

This year's TeleManagement World looks set to provide a platform for the most pressing telecoms issues of the day

This year's TeleManagement World event (Nice, France, May 16 - 19) is expected to be the TM Forum's biggest and most successful event ever, as the telecom industry finally returns to something approaching normality. After three years of hunkering down and trimming costs, telecom operators are now gearing up to invest in next generation services. And the evidence will be at TeleManagement World's Catalyst Showcase.
TM Forum Catalysts Projects are where OSS/BSS suppliers combine to produce integrated solutions by following the TM Forum's specifications and guidelines. The objective is to produce a prototype solution, which can then be demonstrated to potential customers. TM Forum calls its Catalyst Program a 'Living Lab' because of its pragmatic, customer-driven approach -- and it claims that the level of participation in the Catalyst Showcase can be taken as a bellwether for the health of the sector.
This year TeleManagement World is set to host the greatest number of Catalyst Projects in its Catalyst Showcase feature since 2001, at the very peak of the boom. The Catalyst Showcase is an area at TeleManagement World devoted to multi-vendor demonstrator projects that have been developed to TM Forum specifications, such as eTOM (enhanced Telecom Operations Map) and SID, the shared information and data model.
"This time around we've had a complete resurgence," says Debbie Burkett, TM Forum's director of market collaboration. "We have 10 Catalysts and, better yet, the scope is wider than ever before."
According to Burkett the fact that all the Catalyst Projects are sponsored by at least one service provider (sometimes several) proves that there is real procurement intent behind the activity. "In a healthy industry the Catalysts are viewed as a way of developing quick prototypes," she says.
And the range of Catalysts is broader this year. As well as Catalyst Projects exploring aspects of the TM Forum's core NGOSS (new generation operations systems and software), Burkett points out that there are Catalysts targeting topics such as process management, content billing, and revenue assurance. "And most of them are new subjects, rather than being second or third phases of earlier projects," she points out.
While the increase in Catalyst Projects and sponsors is a sign that service providers are preparing to spend on OSS/BSS, the specific topics chosen provide insight into the new directions that the spending might take.
A key unifying concept within the TM Forum is the idea of 'lean operations': a catalogue of best practice for service providers facing a highly competitive 21st century market.

Pre-requisite for survival

According to the TM Forum, lean operations aren't a feel-good aspiration -- they're a pre-requisite for survival as prices spiral downwards and liberalisation and consolidation intensify competition. Lean service providers find ways to reduce costs and, at the same time, generate more revenue by getting to market with value-laden services to beat the competition. So lean doesn't mean chopping operations to the bone. It means making them both more efficient and more responsive and agile.
The work of the TM Forum is about promoting this 'lean' idea: to assist service providers in completing the required metamorphosis from slightly paunchy, sometimes overly bureaucratic organisations that often arrange their procedures to suit themselves rather than their customers, into customer-facing organisations that prize continuous change and improvement.
One place to start with a makeover on this scale is with the very DNA of any service provider: its business processes. All organisations have processes -- Business Process Management (BPM) is a way of codifying and then managing them in a structured way, and the technique is an up-and-coming one within telecoms. A Catalyst demonstration at TeleManagement World will show how the technology can be used to drive speedier change in service provider business processes such as Service Fulfilment and Supplier Partner Management.
Another Catalyst will demonstrate how processes can be measured and monitored. The Business Activity Monitoring Catalyst will show a system tapping into the information that flows between applications, in real time, to show business performance. This treats a telecom business as if it were a network requiring management -- not only can instant action be taken if the equivalent of an 'alert' appears (say, customer fulfilment not being processed efficiently for the last two days) but, as with network management, trends can be monitored and improvements made to the way the business is actually organised.
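As a hedged illustration, a monitor of this kind might compute a key indicator from inter-application events and raise the 'alert' only when it stays outside a threshold for a sustained window, as in this Python sketch (names, threshold and window size are invented):

    # Illustrative business-activity monitor: alarm when a KPI computed
    # from inter-application events stays below threshold for a whole window

    from collections import deque

    class FulfilmentMonitor:
        def __init__(self, threshold=0.95, window=48):
            self.threshold = threshold
            self.samples = deque(maxlen=window)  # rolling window of KPI samples

        def record(self, orders_completed, orders_due):
            self.samples.append(orders_completed / orders_due)

        def alert(self):
            # Raise the network-management-style alarm only if the whole
            # window is below threshold, not on a single bad sample
            return (len(self.samples) == self.samples.maxlen
                    and all(s < self.threshold for s in self.samples))

    mon = FulfilmentMonitor(threshold=0.95, window=4)
    for completed, due in [(90, 100)] * 4:   # four consecutive poor samples
        mon.record(completed, due)
    assert mon.alert()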

A live issue

Revenue Assurance (techniques to actually bank the money you've earned) is always a live issue in telecoms. The TM Forum's Revenue Assurance team will present a Catalyst to demonstrate how revenue assurance systems can plug revenue leakages by checking customer bills against provisioned services, making sure that the right customer is using the right service (and paying for it).
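The reconciliation check at the heart of such a system can be sketched simply: compare provisioned services against billed services per customer, flagging leakage (delivered but never billed) and its mirror image. The Python below is illustrative only, with invented data:

    # Revenue assurance reconciliation sketch: billed vs provisioned services
    # (customers, service codes and data invented for illustration)

    def reconcile(provisioned, billed):
        """Both arguments: dict mapping customer -> set of service codes."""
        leakage, overbilling = {}, {}
        for customer in provisioned.keys() | billed.keys():
            p = provisioned.get(customer, set())
            b = billed.get(customer, set())
            if p - b:
                leakage[customer] = p - b      # delivered but never billed
            if b - p:
                overbilling[customer] = b - p  # billed but never provisioned
        return leakage, overbilling

    provisioned = {"C100": {"adsl", "voip"}, "C200": {"gsm"}}
    billed      = {"C100": {"adsl"},         "C200": {"gsm", "sms_pack"}}
    leakage, overbilling = reconcile(provisioned, billed)
    assert leakage == {"C100": {"voip"}}
    assert overbilling == {"C200": {"sms_pack"}}
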
Data and Content Charging will become a big issue, especially in the mobile sector as it moves rapidly towards third generation services, which will rely more heavily on data and content to turn a profit. The business relationships required to deliver profitable content are complex and will also change as the services evolve.
The TM Forum's Data and Content Charging team will use the Catalyst Showcase to demonstrate a flexible architecture designed to spread across today's borders of fixed and mobile environments. The architecture will involve third party arrangements as well as customer self-generated content and a wide variety of business relationships will be catered for.
Pragmatism may be the watchword for many of the demonstrations, but more strategic issues will also be aired in the Catalyst Showcase. An MDA (model-driven architecture) Catalyst will demonstrate how the TM Forum's NGOSS framework could be implemented using MDA-based tools to integrate OSS components.
Another strategic issue will be aired by the Open OSS Catalyst, the first ever Open OSS demonstration. Open OSS may prove to be a very important strand in the TM Forum's technical work. The idea is to provide open source, free components which can be extended to provide TM Forum members with a permanent and evolvable OSS test bed and base of reference software.
The intention is not to create carrier class open source software, says the team developing the Catalyst, but to exploit the open source development approach to stimulate collaboration between different organisations.
These Catalyst demonstrations will have an appreciative audience at Nice and the OSS/BSS industry is expecting a steady up-tick in interest and orders from the service providers who visit TeleManagement World. However, unlike in the boom days of the late 1990s, there will be no spending gold rush. Instead, service providers will be looking to carefully choose systems which will help them bring services to market more quickly and, most important, help them to reduce operations costs.
This is where the TM Forum's Catalyst Program shows its worth, says Burkett, since it showcases suppliers in combination, focusing on their ability to produce integrated OSS/BSS solutions that solve real problems, rather than just push individual components. The increased popularity of the Catalysts is also an indication that suppliers understand the value of working together.
 "I would say about 50 per cent of the benefit for the vendors participating in the catalysts is technical -- it's an ideal way to develop prototype systems.
"The other 50 per cent is about building relationships with other suppliers," she claims. "What starts as a tactical, short term relationship to accomplish a particular prototype solution, can end up as a strategic long-term relationship."                                             

www.tmforum.org

Some major challenges face network operators seeking to provide customers with technologies which are both resilient and offer high performance. Chris Hamilton explains

In the emerging multi-service access network environment, there are two major challenges to network providers. First, there is an ever-growing list of advanced IP-based services that operators will have to support on multi-service access node equipment at the network edge. The second challenge is that many of the most profitable service flows will require resiliency.
Network providers must now maintain a variety of boxes to support multiple services. Some, like e-mail, file transfer protocol (FTP) and traditional web access, have low QoS and resiliency requirements (see Figure 1), while others -- such as carrier grade VoIP and multi-media services -- require both high quality of service and high levels of delivery reliability.
The problem is that many of these new services -- which require real-time latencies in the milli- to micro-second range -- must be delivered over the non-real time 'best effort'-based Internet, with its variable queuing delays on network routers, dropped packets, and lengthy re-routing restoration mechanisms that are on the order of seconds to tens of seconds. The delay/jitter problem in IP/MPLS transport networks that are 'private', i.e. non-Internet based, is still a substantial issue to be addressed but is not quite as severe as in Internet-based systems.
As they currently stand, most pre-existing efforts to deliver the necessary QoS and service resiliency have limitations that prevent their application to the broader problem: they are too focused on a particular network topology, are specific to particular services, or are too slow.
But one serious drawback they all share is that rather than protect service data flow, they instead focus on protecting the network links or equipment nodes. As a result, they are all-or-nothing solutions with regard to their ability to protect a given path or node, much less the content that is being sent.
Because these alternative approaches to service resiliency can only guarantee either total protection or none at all, they lack the flexibility and the service identification specificity to address the resiliency needs of any particular service request and are wasteful of bandwidth, equipment, and financial resources.

Flow optimised application service resiliency

Now working its way through the Next Generation Network and International Telecommunication Union standards process is a proposal for a universal flow-optimised application service resiliency (ASR) specification -- a fundamental requirement of the next generation telecom network infrastructure that turns traditional approaches to service resiliency on their head.
The primary purpose of the new ASR proposals is to enhance and/or complement current approaches to application service resiliency and to do so by addressing several characteristics of this new network environment that traditional methods have problems satisfying.
Proposed and/or supported by Agere, AT&T, British Telecom, Cisco, Lucent, Nortel, and Sprint, the essential idea behind ASR is redundancy -- not of hardware, but of paths and data -- and the management of both mission-critical and less critical data such that traffic arrives successfully when needed and in the form necessary.
A networking environment that implements ASR will benefit immediately, even with legacy equipment. For example, suppose that 10 per cent of the total bandwidth of a particular path is protected and the primary and secondary paths are of equal bandwidth.
The primary and secondary paths can each carry the 10 per cent of duplicated protected traffic plus 90 per cent unprotected, best-effort traffic, which translates into 95 per cent of the combined capacity carrying distinct traffic. Compare this to the 50 per cent achieved by present either/or techniques, which cannot discriminate at the traffic service level and require 100 per cent of the traffic to be protected.
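The arithmetic behind those figures can be captured in a few lines of Python, with the protected share of traffic as the only variable:

    # Bandwidth efficiency for duplicated protected traffic over two equal paths

    def useful_share(protected_pct):
        # Primary path: protected + best-effort fill = 100 per cent utilised
        # Secondary path: duplicate of protected + its own best-effort fill
        distinct = 100 + (100 - protected_pct)  # distinct traffic, both paths
        return distinct / 2                     # share of combined capacity

    assert useful_share(10) == 95.0    # the 95 per cent case in the text
    assert useful_share(100) == 50.0   # all-or-nothing protection: 50 per cent
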
Of course, best results would be achieved with hardware and traffic management network processors optimised for the task. However, if properly implemented, even existing systems with minimal or no fast restoration capability could be retrofitted to perform ASR on an incremental, pay-as-you-go basis.
Such a flow-optimised ASR network architecture would work independently of the packet-transport protocol (IP, Ethernet, ATM, Multi-Protocol Label Switching [MPLS], etc.), or physical transport topology (ring, mesh, star, etc.). More importantly, it would work independently of, or in conjunction with, existing network resiliency mechanisms such as MPLS reroute.

How flow-optimised ASR works

The simplest implementation of the ASR concept is between the two end points of a protected flow. In this scenario, it is assumed the data moving in both directions behaves in the same way. In this case, all subscriber services such as voice, video, and Internet access are concentrated through a home or business gateway device. Consolidated data is sent or received by the gateway over a single broadband link connected to a multi-service access node (MSAN).
If the network has underlying mechanisms in place for fully or partially separate primary and secondary paths, and allows network policy managers to notify the MSAN's control plane processor which flows are to be protected, then protection can be provisioned statically or configured dynamically using SIP session establishment requests.
When the aggregate of all service flows arrives from the subscriber at the MSAN, protected flows are identified and replicated. They are then sent over both the primary path and a secondary path that is physically and spatially independent of it.
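As a rough illustration of that ingress behaviour, consider the following Python sketch. The flow key, policy table and send() interface are hypothetical stand-ins for what would, in practice, be programmed into the NPU data path.

from typing import Tuple

FlowKey = Tuple[str, str, int, int, str]  # src IP, dst IP, src port, dst port, protocol

# Flows the network policy manager has marked as protected, provisioned
# statically or dynamically (for example at SIP session establishment).
protected_flows = {
    ("10.0.0.2", "192.0.2.10", 5060, 5060, "udp"),  # e.g. a voice session
}

def forward(packet, key, primary, secondary):
    """Send every packet on the primary path; duplicate protected flows
    onto the physically independent secondary path as well."""
    primary.send(packet)
    if key in protected_flows:
        secondary.send(packet)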
Under normal circumstances, at the termination end of the protected flow, the router accepts traffic from the primary channel and discards traffic from the secondary one. In the event of a network failure, however, the router can decide how to respond depending on the degree of control needed: it can detect the disruption on the primary and rapidly switch to the secondary, or data from both paths can be retained, with the network processor unit (NPU) deciding on a packet-by-packet basis which copy to discard.
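In sketch form, the termination-end logic might look as follows; the primary_up flag is a hypothetical stand-in for whatever outage detection the NPU actually performs.

class ProtectedFlowReceiver:
    """Keeps exactly one copy of each protected packet: the primary's
    while the primary path is healthy, the secondary's otherwise."""

    def __init__(self):
        self.primary_up = True  # updated by the fault-detection mechanism

    def accept(self, packet, arrived_on_primary):
        if arrived_on_primary == self.primary_up:
            return packet  # deliver this copy
        return None        # discard the duplicate from the other path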

The role of the NPU

At the MSAN, data would flow into a line card where an NPU then handles data path operations such as protocol encapsulation, forwarding, etc., while a general purpose CPU in the control plane performs corresponding functions on the control path.
For ASR to work effectively, the NPU must take on several critical tasks in the MSAN. Most importantly, it must classify the incoming subscriber data to determine whether a flow is protected, by inspecting the bits in the packet header that uniquely identify the packet flow.
Once a protected packet or flow is identified, the NPU must assign it a proper priority and buffer it to be scheduled for transmission to both primary and secondary paths. This prioritisation is essential because it gives protected packets precedence over unprotected ones.
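The precedence itself amounts to strict-priority scheduling, as the sketch below shows; real NPU traffic managers implement this in hardware queues with far more sophisticated policies.

from collections import deque

protected_q = deque()    # packets belonging to protected flows
best_effort_q = deque()  # everything else

def dequeue():
    """Protected packets are always transmitted ahead of unprotected ones."""
    if protected_q:
        return protected_q.popleft()
    if best_effort_q:
        return best_effort_q.popleft()
    return None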
Because NPU classification engines are in most cases programmable, the specific classification criteria can be extremely flexible. The packet classification subroutine invoked on the NPU (see Figure 2, right) first obtains packet classification information such as the physical port number and Ethernet MAC address.
The packet classifier then classifies the incoming packet based on one or more techniques such as exact matching, longest prefix matching, or range checking. The result determines whether the packet should be protected and the corresponding results are returned to the calling process.
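A toy classifier combining the three techniques might look like the Python sketch below, assuming IPv4 and hypothetical rule tables; a real classification engine expresses the same logic in microcode or hardware lookup tables.

import ipaddress

exact_rules = {("192.0.2.10", 5060)}                      # exact matching
prefix_rules = [ipaddress.ip_network("198.51.100.0/24")]  # prefix matching (real tables use longest-prefix lookups)
port_ranges = [(16384, 32767)]                            # range checking (e.g. RTP media ports)

def is_protected(dst_ip, dst_port):
    """Return True if any rule marks the packet's flow as protected."""
    if (dst_ip, dst_port) in exact_rules:
        return True
    addr = ipaddress.ip_address(dst_ip)
    if any(addr in net for net in prefix_rules):
        return True
    return any(lo <= dst_port <= hi for lo, hi in port_ranges)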
At the termination end of the two paths of a protected flow, another NPU must classify and identify the protected flows, keeping only the primary flows while the network is operating normally. But if the NPU detects a network outage on the primary flow, it switches over to the secondary one, keeping all data that arrives there and discarding data on the primary flow.

No trade-off for the designer

The multicast nature of the protected packets requires an NPU architecture designed with efficient multicasting in mind, so the designer does not have to trade off network resiliency for performance.
In addition, the NPU allows buffer management discard/tag decisions to be executed independently on each multicast branch, so congestion of the secondary path will not impact the QoS of the primary path. It is also important that the NPU have sufficient bandwidth and memory resources to handle situations in which both the primary and secondary paths must be retained.
In such situations, the NPU will have to make decisions within milliseconds or even microseconds as to which flow to discard, based on criteria such as sequence numbers, timestamps and checksum integrity data. Because the switchover is decided on a packet-by-packet basis, it should be possible to perform the equivalent of a fault-tolerant 'hitless switchover'.
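Packet-by-packet selection of this kind reduces to 'deliver the first good copy of each packet and discard the rest', as in this sketch, which assumes each protected packet carries a sequence number the NPU can read.

delivered = set()  # sequence numbers already delivered (a real NPU would
                   # use a bounded window rather than an unbounded set)

def select(seq, packet, checksum_ok):
    """Deliver whichever copy of a packet arrives first and passes its
    integrity check; the late duplicate is silently discarded."""
    if not checksum_ok or seq in delivered:
        return None
    delivered.add(seq)
    return packet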
The programmable nature of most NPUs means it will ultimately be possible to employ more than one fault detection approach in the same NPU, if such capability has practical application to network operators.

Chris Hamilton, Senior Manager, Agere Systems, can be contacted via tel: +1 610 712 7827; e-mail: cwh1@agere.com

    

 
