While the problems created by years of designing systems with a silo mentality cannot be cured overnight, product portfolio management software can help to utilise telcos' existing assets more effectively, says Paul Hollingsworth

If, like me, you have recently had the experience of taking out a new mobile phone contract, then you may be wondering whether your operator's support systems are sufficiently integrated yet. Apparently simple requests are "impossible" or can only be handled by closing the account and being penalised with a hefty termination fee, even if the ultimate goal is to retain the same service with the same operator and with the same device. My request may have been uncommon but, given the level of complaints received by the media relating to mobile phone companies, it's clear that I am not alone.
Now I will say that the staff at the phone company were exceedingly courteous, if ultimately unhelpful. But it wasn't their fault. The issue clearly lay in the fact that their support systems are not joined up.
This is hardly surprising if one thinks for a moment about how we have traditionally built systems: define a domain-specific data model and then build functionality around the data. To be fair, this approach was an advance on building the data model around the required function, but it still leaves operators with an integration headache. Each operator has tens, hundreds or even thousands of systems that have been designed around a domain-specific database with little or no pre-defined understanding of the data that the system will be integrated with.
It should therefore be no surprise to anyone that the domain integration software built to deploy a new telco product is costly, error-prone and frequently insufficient.
A major challenge when launching a new product is in cross-domain business rule development. For example, to define product code mappings, product compatibility, pricing rules, product options and error condition handling. These rules are hidden in code either around the customer management/order handling applications or often within the integration layer. Since the rules are coded they are costly to build and maintain. Also frequently there is little re-use of this code and so similar rules are developed time and time again. 
The result is ever increasing spaghetti logic that takes longer and longer to maintain and extend. New products increase the ongoing OSS/BSS support costs and reduce the accuracy of cross-domain data. Revenue and profit margin are being lost.

Breaking out of the spiral

To break out of this spiral of cost and data inaccuracy, operators need to encapsulate this business logic within an application where the rules can be configured in data, in a similar way to other enterprise applications' information. Rules are then quickly and efficiently created and maintained and are easily searched for, thus increasing re-use. This ensures that integration becomes less costly and ultimately better.
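To make the idea concrete, here is a minimal sketch of what "rules configured in data" might look like, as opposed to rules buried in code. The rule definitions, product names and evaluation logic are purely illustrative, not drawn from any particular product portfolio management system:

```python
# Business rules held as data: each rule can be created, searched and
# re-used without writing new code. Rule names and products are hypothetical.
RULES = [
    {"name": "compat_broadband_voip",
     "when": {"has_product": "broadband"},
     "allow": ["voip_addon"]},
    {"name": "block_prepay_roaming",
     "when": {"has_product": "prepay"},
     "deny": ["intl_roaming"]},
]

def allowed_products(current_products):
    """Evaluate the data-driven rules to decide which add-ons a customer
    holding the given products may order."""
    allowed, denied = set(), set()
    for rule in RULES:
        if rule["when"]["has_product"] in current_products:
            allowed.update(rule.get("allow", []))
            denied.update(rule.get("deny", []))
    # A deny rule always overrides an allow rule.
    return allowed - denied
```

Launching a new product then means adding a row of data rather than modifying and re-testing integration code, which is the cost saving the article describes.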
Most IT managers, however, will initially pour cold water on this suggestion. They argue that the last thing they need is another business application to implement and a new "rule" database to manage. Another common response is that the current strategy of building custom code around existing apps or in the integration tier may not be great, but at least the cost and risk are clearly understood.
This view shows a failure to understand the long term impact of this strategy on the effectiveness of the business and the cost of operation. For example, business rule code often depends on data mastered in different places. A decision about product offer compatibility and network availability requires knowledge about network management, pricing and account hierarchy if not more. Billing start time and equipment shipping dates will require knowledge of stock in the fulfilment house and the planned and completed delivery.
The common approach to resolving these issues is either to copy the necessary data into a local store that is accessible to the rules, or to build a layer around the requested application with new API logic to manage the query. In each of these approaches there is inefficiency built in. In the first there are errors due to data update synchronisation, and in the second we have created a need to develop new transaction-specific code that must be built or modified each time a new product is launched, leading to more custom code, more cost and slower time to market.
What is needed is for the new business rule application, mentioned above, to be able to federate data from any application in the OSS/BSS where the data is physically mastered. In that way the data will always be current and new logic is able to simply re-use existing data views and business logic. Federated data applications are able to manage the data relationships between the rules and the underlying OSS/BSS business data by treating information not as a new data silo, as traditional system designs do, but as views into real business information. They don't introduce a new database to the architecture -- just a common view of the existing information.
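The federation idea above can be sketched in a few lines: instead of copying data into a rules database, each attribute lookup is delegated live to the system that masters it. The source-system classes, fields and sample data below are hypothetical stand-ins for real OSS/BSS applications:

```python
# Illustrative stand-ins for systems that master their own data.
class CRMSystem:
    def account_tier(self, customer_id):
        return {"c1": "gold"}.get(customer_id, "standard")

class InventorySystem:
    def in_stock(self, product_code):
        return {"handset-x": True}.get(product_code, False)

class FederatedCustomerView:
    """A single view over data physically mastered in separate systems.
    No new database is introduced -- only a common view of existing data."""
    def __init__(self, crm, inventory):
        self.crm = crm
        self.inventory = inventory

    def can_offer(self, customer_id, product_code):
        # Both answers are fetched live from the mastering systems,
        # so the view is always current and nothing can drift out of sync.
        return (self.crm.account_tier(customer_id) == "gold"
                and self.inventory.in_stock(product_code))
```

Because the view holds no copies, the synchronisation errors of the local-store approach simply cannot arise.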

New breed

A new breed of application that provides this capability is available, termed Product Portfolio Management. Whilst no new technology can solve all the problems created by years of designing systems with a silo mentality, it is time to start utilising the telco's existing assets more effectively. This can be done by introducing product portfolio management software to provide a single view of cross-application information so that, in the future, customers will not be turned away simply because multiple systems can't be made to talk to each other.

Paul Hollingsworth is Director of Product Marketing at Celona Technologies  www.celona.com

Scott G. Silk, CEO of Action Engine, explains to Priscilla Awde why he believes the company's 'browserless' mobile application platform is leagues ahead of the 'browsered' competition

Thinking outside the box

Optimistic, enthusiastic, energetic -- these are somewhat rusty epithets when applied to what has been the ailing telecoms sector. However, they are fundamental to Scott G. Silk, president and CEO of American software company Action Engine Corporation, who sees the market as both extremely dynamic, and as attracting investors. As operators are beginning to understand their customers, and brands are beginning to understand this powerful medium, Silk expects the mobile applications and content market to grow exponentially.
Backed by New York venture capitalists Baker Capital, the company is a relative newcomer to the industry albeit one on which many are keeping a close eye. Founded in 2000, Action Engine spent the first four years in intensive R&D developing a software platform for mobile operators, enterprises and application developers which, it believes, is several years ahead of what Silk refers to as the 'big boys'.
Employing all the advantages of a small, entrepreneurial company, the Action Engine founders never accepted the prevailing wisdom that operating systems for mobile devices should or could be similar to those developed for the computing environment. While others made what he believes were predictable responses to the whole idea of accessing mobile data services, Silk believes his company had the vision to 'think outside the box', investing over 100 man-years of development, and raising over £25 million of funding to date. The result is the Action Engine Mobile Application Platform, the company's one-touch, client-server solution which eliminates 80 per cent of keystrokes and decreases network response times to make all applications 20 times faster than existing systems.
 "We didn't want to replicate a PC-like experience but to create a new paradigm," he explains. "Early on we knew that phones would inevitably follow Moore's Law, becoming more powerful and including more available computing horsepower which we wanted to exploit to make the mobile experience faster and applications easier to use.
 "We have shattered the mobile usability barrier: people can now execute rich computing applications on their phones at the same speed as they do on their laptops. But we have done it with a client server platform rather than using browser technology optimised for the computing environment."
The fundamental advantage of the system is that most of the choices are made offline on the handset using resident software either previously downloaded from the operator or embedded into the ROM of the device at manufacture. Users quickly navigate their way through a series of drop down menus on the phone to refine their search. Unlike most existing systems which involve numerous exchanges between the phone and the network before users finally get what they want, Action Engine saves time by only accessing the network with a specific request once the user has specifically defined all parts of the request and chooses to send it to the network. The result is that because users find it considerably faster and easier, they use mobile data applications more.
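The client-side principle described above can be sketched simply: selections accumulate offline on the handset, and the network is touched exactly once, when the request is fully specified. This is an illustrative model of the idea only, not Action Engine's actual API; all names below are hypothetical:

```python
# Hypothetical sketch: refine a request entirely on the handset,
# then make a single network call instead of one round trip per menu.
class OfflineRequestBuilder:
    def __init__(self):
        self.params = {}

    def choose(self, field, value):
        # Each drop-down selection is recorded locally; no network traffic yet.
        self.params[field] = value
        return self

    def submit(self, send):
        # Only now does the fully specified request touch the network.
        return send(self.params)

def fake_network_send(params):
    # Stand-in for the real network; returns a canned response.
    return f"results for {params['category']}/{params['genre']}"

reply = (OfflineRequestBuilder()
         .choose("category", "music")
         .choose("genre", "jazz")
         .submit(fake_network_send))
```

Whatever the real implementation looks like, the saving comes from the same place: n menu interactions cost zero round trips instead of n.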
Business customers will now be more willing to use their phones to access corporate databases and applications such as CRM, diaries and corporate directories, whilst sales staff can quickly update files and retrieve specific and relevant up-to-the-minute information.
All of the popular data services (travel, news, financial data, sport and film clips plus fashion information), including location based applications (people, place and facility finder), are available faster and more conveniently, which drives up usage. Downloading a one minute music clip or full song for example takes 30 seconds compared to the more common eight minutes.
Action Engine has significantly reduced what was originally a relatively big software package so that it can run on all of the mid and high tier phones in the market. The architecture has now been optimised to run on Java handsets and this next generation is expected to be available in September.
The system is expected to have a knock-on effect as application developers will have more of an incentive to create innovative content and products to sell in to the operators.
Offering more inventive applications makes it easier for operators to differentiate themselves from competitors, whilst speeding up the user experience may increase average revenues per user and therefore overall revenues. Although third generation networks supporting higher speeds are rolling out, take up of 3G services has been slower than anticipated. Although successful in Asia, in Europe, all but the early adopters still need to be convinced of the value of mobile data and video -- even they may be deterred because access is difficult and time consuming.
Feature rich phones are becoming more popular, but many people use only a fraction of the available functions, which has a negative effect on the success of the mobile data market. Estimates are that 76 per cent of people do not use the data services capabilities of their phones because they are too complex to operate. Neither do people use their phones as much as they might which, considering that operators are under pressure to increase revenues, does little to contribute to overall profits.
"Mobile devices have the computing power to execute many applications but usability is hampered by slow speeds, small screens and keypads. Based on drop down menus, our system is designed for the mobile form and leverages the power of the handset," explains Silk.
 "Because all the applications are integrated on a common platform, data is shared between them and the software learns patterns of use. The result is a personalised phone experience, as, with use, the device gets smarter and remembers preferences. People have regular patterns to their lives: generally they travel to the same few places, on the same airlines or trains and eat similar foods wherever they are. Drop down menus reflect these choices making it still faster to navigate through the system."
However, even the most innovative content and services will not drive mobile data growth if people cannot use them quickly and conveniently because response times are slow and the number of keystrokes and transactions is intensive. This is illustrated by the fact that although location based services and other sophisticated applications have been available for some time, take up is relatively slow. Silk suggests that what is missing and inhibiting rapid adoption of mobile data is the user experience -- the time it takes to download anything -- a situation he suggests is quickly remedied.
From his perspective, all operators have to do is buy and install a server and roll out the Action Engine Mobile Application Platform which is licensed to them on a server and per user basis. To make deployment even faster, Action Engine also offers a turnkey pack of pre-developed applications called Brand-n-Go. With Brand-n-Go, Action Engine has sourced the content, built the applications, and absorbed the hosting costs -- operators only have to choose from the Action Engine application catalogue, place their "brand" on the pack and "go" to market. Since the system logs and reports the number of users by application, operators can judge the success of each package making it easier to segment and target their customers and drive up revenues. The ultimate goal is to add value, and therefore profit, to mobile phone and data sales.
Action Engine is currently running trials in North America, Asia/Pacific and Europe. Two large American operators are planning to introduce a phone with pre-integrated software later this year. An Asian telco is trialling the mobile application client to deploy and test a suite of branded services and, if successful, plans to roll them out to its wider customer base. In Europe, Action Engine is working closely with a number of key operators and distributors and is developing specific trials for a Tier 1 and a Tier 2 operator, as well as working with a European mobile phone distributor to create a tailored application pack which can be sold direct to operators and consumers.
SMART Communications in the Philippines has launched its own version of the system and used the Action Engine Mobile Application Platform to create a suite of applications sold as Amazing Explorer. As well as successfully differentiating itself from competitors, repeat usage of data service on high-end devices has risen by nearly ten per cent. SMART can update or add new applications by seamlessly sending an over-the-air command to customers' phones and users can switch between phones to access the range of services.
The system was deployed in stages, the first of which involved installing the server in the data centre and pre-installing client applications on a ROM. Location based services have been integrated into the system and users are able to control when they want to access the network. Seven new services were developed -- movies, music, travel, celebrities, personal, sports and fashion/beauty -- and branded to target particular customers.
Amazing Explorer is not only driving data usage but also allows SMART to expand its products beyond SMS. Customers have on or off-line access to information and, through tight integration with the phone's dialling, PIM and messaging capabilities, users can make calls, save and send information with one click.
Silk believes that the SMART experience is just the start of what is a global market of operators waiting for the tools to help increase their data revenues. Silk is positioning the company for growth and spent the last year investing in sales, marketing, and business development, and making the transition from a purely technology company into a marketing led entity. However, the company plans to continue investing heavily in technology to expand what Silk calls its "unfair advantage" in the market.
Estimating they are at least two years ahead of their competitors, and recognising that operators and end users are ready to deploy and consume more sophisticated applications, Silk is expanding Action Engine in Europe and appointed Mike Kent as his European VP. Offices are now open in Britain and France with plans to expand into more countries. The goal is to demonstrate the system, to sell in to the top 20 telcos and target handset manufacturers, distributors, content providers, and enterprises.
 "Europe is ahead of America in mobile data services and it is a particularly attractive market since there are a small number of relatively large players as well as a large number of smaller operators. Although others may eventually get into the game, we have at least a two year lead and, with all the advantages of being a small company, we are very nimble and entrepreneurial and can therefore move fast. Lots of technology is developed by smaller companies which tend to execute products two or three years ahead of bigger players," explains Silk who was brought in to add value and take Action Engine into a public offering.
 "We are in a position of one: there is no real competition today although we expect it to happen because the realisation of the mobile dream is to put PC applications on phones."
Having appointed what he believes is an aggressive team of pre-sales, sales and support staff, Silk says the company aims quickly to gain as much market share in Europe, Asia/Pacific and North America as possible before the big players start coming into the market.
When it was first proposed, many people responded to this 'browserless', client server system with disbelief, scepticism and even contempt but, suggests Silk, they may well have to seriously revise their opinions as they try to play a catch up game with Action Engine's early gains in what he believes will be a significant market.

Priscilla Awde is a freelance communications writer


Cerillion CEO, Louis Hall, explains to Lynd Morley why he believes the company is on the fast track to success in the highly competitive convergent billing market

Agility is the key

The concept of the lean, responsive organisation, flexible and adaptive to client requirements, may not be entirely new, but it is certainly experiencing a strong re-birth in the current telecommunications marketplace. It is not always an easy brief for an industry rooted in the conservative 24/7 service provisioning ethos to fulfil, but it is certainly one that, according to its customers, convergent billing specialist, Cerillion Technologies, is delivering.
Louis Hall, Cerillion's CEO, is undoubtedly keen to foster the image of an agile organisation, able to answer the complex and urgent needs of operators, but it is the testimonials of satisfied customers that ultimately give gravitas to any marketing claims. Chris Hall, Managing Director of Manx Telecom -- a long-standing Cerillion customer -- for instance, notes: "We telcos are being dragged into becoming much more customer centric and marketing focussed. That's why it's imperative that we have flexible systems. It's no good having a great marketing idea to stimulate sales if it's going to take two years to execute it.
"That's where Cerillion score so highly. They listen to their customers. They are invariably very constructive, very positive and, probably more importantly, they deliver, going from concept to execution very quickly."
Formed in 1999, following the management buyout of the in-house billing and customer care division of Logica, Cerillion provides carrier grade billing, CRM, interconnect and mediation solutions to fixed, mobile, IP and convergent operators worldwide. Having grown from an organisation of some 18 people, with a $4.5 million turnover, to a staff of over 140, and figures for the financial year ending September 2004 showing a 46 per cent increase in turnover to $17.5 million, Cerillion and Louis Hall are feeling, understandably, bullish.
"Certainly if an operator is looking for a small, flexible, responsive, customer focussed organisation that has a credible carrier-grade solution, we're pretty much all there is in that space -- most of our serious competitors having been merged into large corporations."
To date, a wide variety of companies, from Cable and Wireless, Jersey Telecom, Tele2, Manx Telecom and Tele Greenland, to Caudwell Communications, Maltacom, Go Mobile, Redstone Telecom and MobiCom Corporation have made exactly that choice.
In terms of its solution, Hall explains, Cerillion is very much focussed on breaking down the traditional paradigm between end-to-end and best of breed, and concentrating on what he terms a 'bundled components' approach.
He stresses that the Cerillion system supports any deployment model. The end-to-end model, for instance, offers workflow embedded as an integral part of the system, while the systems modules are all pre-integrated encompassing everything from rating, CRM and billing, through to mediation, point of sale and directory management.
"But we're equally comfortable with a mixed deployment project," Hall explains. "Our modules can be easily replaced by third party packages where desired to meet specific project needs, allowing for seamless evolution with no restrictions on an organisation's future growth. For Bulldog Communications Ltd, a subsidiary of Cable and Wireless in the UK, for instance, we installed our rating and billing modules alongside their own in-house Order Management System."
Add to these solutions, best-of-breed, managed service solutions, or indeed, bespoke developments, and the view of Cerillion as a truly flexible provider appears to hold water.
"I think that what we offer tells quite a different story to what the rest of the market is doing -- and that's partly why we're being so successful," Hall notes. He goes on to underline the fact that the success is based on a number of new contract wins (as opposed to upgrades of existing systems) -- including Cable and Wireless and Caudwell Communications in the UK, BTC in Bulgaria, and Gamtel and Gamcell in Gambia, the latter reflecting Cerillion's experience in both wireline and wireless markets as it undertakes to migrate more than 200,000 fixed and mobile subscribers from two separate legacy systems into the new convergent solution.
Based in Central London, with offices in the US and -- more recently -- Singapore, the company operates in both mature and slightly more 'emerging' markets, answering a wide variety of requirements.  But 'mature' Hall says, does not necessarily denote 'sophisticated'.
"In mature markets we are, in many ways, seeing simplification rather than sophistication," he comments.  "Many operators are really focussed on price and a fairly commoditised service. This is obviously partly a result of having been hit by the telecoms downturn, but it is also because there's a limit to the appetite among users for ever more sophisticated features."
He goes on to stress that, going back over, say three years, much of the hype generated by industry analysts and the press, stressing the promise of ever increasing sophistication of services through 3G, has not yet fully materialised.
"3G has, arguably, still not found a market," he comments. "A lot of the services available in 3G that are actually selling, are also available in 2.5G. Users are tired of hearing the 'next great thing' being continuously trumpeted, so the operators' focus is now much more on price and delivering the services customers actually want and believe they can get."
He points out, for example, that the success of the CPS (carrier preselect) and MVNO (mobile virtual network operator) models is based on their targeting of very specific segments of the end user base.
"So, the CPS provider offers SME businesses, for instance, really neat packages that suit them perfectly.  The SME is not confused by the choice of some 40 different offerings, but can concentrate on what really works for him. Now that is getting close to the customer!"
The focus on price, of course, has led operators to pay considerable attention to their BACC systems, not only in terms of function, but also with regard to the cost of building, installing and running such systems. One solution, of course, is to outsource the whole procedure, taking up the offer of managed service billing operations from an outside provider.
Tele2 Ireland recently joined Tele2 UK in using Cerillion's managed service offering. The system is located in London's Docklands and is managed entirely by Cerillion staff. Bill Butler, CEO of Tele2 UK notes that by using this approach, the company has been able to launch the new Ireland service both quickly and efficiently.
Outsourcing the billing function is not, however, particularly widespread in Europe, despite the fact that it may seem an obvious solution for operators determined to concentrate on their core competencies, get closer to the customer, and keep a tight hold on the bottom line.
"There is a much greater trend towards outsourcing in the US," Hall comments, "particularly as billing is regarded very much as a commoditised service there. 
"The problem elsewhere is that the billing system is seen as the revenue generation engine, and if something goes wrong with that, it impacts directly on the bottom line. So there's an issue about control here.
"The other issue, of course, is security of data -- many European players still don't feel comfortable about giving up that information."
Certainly, given that many senior managers in telcos across Europe grew up in PTTs where so much of the data they held was government information, which they were charged with guarding, it is hardly surprising that they are still instinctively highly protective of their data. In addition, data protection legislation in a number of countries across Europe specifically forbids the moving of certain data outside their borders -- so any managed service could only be run from within those countries.
Hall notes, however, that he believes there will be a lot more movement in the managed service space over the next three to five years.
"One of the great advantages that running the managed service gives us is that we have the same experience as our non-managed service customers, dealing with the day-to-day operation. And that means that we're closely involved in how the system operates in a live environment. As a result of that first-hand experience, we can improve it to everyone's benefit."
This notion of using a specific development or experience to work to the benefit of all its customers, is a central plank of Cerillion's reputation among those customers. Manx Telecom's Chris Hall notes:
"I view Cerillion as a partner, not a supplier. I know a lot of people aspire to partnership with their clients, but for me, of all my suppliers, Cerillion are among the few that have actually achieved it.
"They do have an excellent product, but one of their great strengths is their people. They put real effort into understanding what we're trying to achieve, and they're very responsive. We have loads of good ideas, and we want to get them out to the market quickly, and Cerillion will find a way of doing it.
"Indeed, the resulting work will often go into their general release, so the whole club of Cerillion users is effectively working together, putting new innovations into the system, from which we all profit."
Having established a sure footing, both in terms of turnover and customer base, Hall sees his company moving increasingly centre stage to work with the larger operators. "In today's market," he comments, "productised solutions increasingly make sense for telcos. In cost-based organisations -- which telcos are now undoubtedly becoming -- there really is no option. This procurement model, which was always used by the smaller companies, is now moving up to the larger operators."
To date, the company has traditionally been focussed on Europe, but now has a growing customer base in the Americas, and is looking to the Asian market with the opening of its Singapore office.
"I believe we'll see a great deal of activity in the Americas, and particularly in the Caribbean, over the next couple of years," says Hall. "There are a lot of legacy systems out there, and a lot of billing vendors that have gone under, and that's left many operators high and dry. Certainly it's a highly competitive marketplace, with a lot of operators such as AT&T and Cable and Wireless establishing themselves in the region.
"At the same time, the growth in the holiday industry has meant that the corresponding growth in the roaming traffic is massive. There are operators running services in the Caribbean just for roaming -- because, obviously that's where the money is. We're currently in discussions to start work with three operators who are doing just that."
Setting up the Singapore office is a longer-term investment. Hall believes that while the region does have a lot of home grown suppliers, it is nonetheless a very brand conscious market, and the fact that Cerillion is building a solid brand in Europe is helping the company in Asia.  It is, of course, a potentially huge market, and Hall comments that while the 19th Century could be viewed as the 'English' century, the 20th as the 'American' century, the 21st will certainly be the 'Asian' century.
In the meantime -- back in the old country, or, at least, the old region -- Hall notes that differentiation in the telecommunications market will increasingly be based on marketing and branding, given that everyone is offering much the same service. He points to the emergence of the re-seller market through the CPS providers on the fixed networks, and the MVNOs on the mobile side. While this does not necessarily represent a power shift -- as all these providers have to sit on somebody's network anyway -- the resultant competitive pressure means that all these players must increasingly look to managing their relationships with customers, package their products and services effectively and, of course, bill for them.
"There are something like around thirty CPS providers in the UK alone, and the trend is moving into the European market. These companies are coming in with little experience of billing and customer care in the telecommunications environment, which provides us with a tremendous opportunity.
"At the same time, the growth in MVNOs is also great for us. These companies know exactly what they're doing in terms of establishing their brand, but they will look to outsource the network, billing, customer care and debt collection. So what you have at the end of the day is a marketing organisation. And that's commoditisation."
It will doubtless not have escaped Hall's attention that in a commoditised market, it tends to be the agile, flexible and responsive companies that win the day.

Lynd Morley is editor of European Communications

Some major challenges face network operators seeking to provide customers with technologies which are both resilient and offer high performance. Chris Hamilton explains

In the emerging multi-service access network environment, there are two major challenges to network providers. First, there is an ever-growing list of advanced IP-based services that operators will have to support on multi-service access node equipment at the network edge. The second challenge is that many of the most profitable service flows will require resiliency.
Network providers must now maintain a variety of boxes to support multiple services. Some, like e-mail, file transfer protocol (FTP) and traditional web access have low QoS (see Figure 1, right) and resiliency requirements, while others -- such as carrier grade VoIP and multi-media services -- require both high quality of service and high levels of delivery reliability.
The problem is that many of these new services -- which require real-time latencies in the milli- to micro-second range -- must be delivered over the non-real time 'best effort'-based Internet, with its variable queuing delays on network routers, dropped packets, and lengthy re-routing restoration mechanisms that are on the order of seconds to tens of seconds. The delay/jitter problem in IP/MPLS transport networks that are 'private', i.e. non-Internet based, is still a substantial issue to be addressed but is not quite as severe as in Internet-based systems.
As they currently stand, most pre-existing efforts to deliver the necessary QoS and service resiliency suffer from limitations that prevent their application to the broader problem: they are too focused on a particular network topology, are specific to particular services, or are too slow.
But one serious drawback they all share is that rather than protect service data flow, they instead focus on protecting the network links or equipment nodes. As a result, they are all-or-nothing solutions with regard to their ability to protect a given path or node, much less the content that is being sent.
Because these alternative approaches to service resiliency can only guarantee either total protection or none at all, they lack the flexibility and the service identification specificity to address the resiliency needs of any particular service request and are wasteful of bandwidth, equipment, and financial resources.

Flow optimised application service resiliency

Now working its way through the Next Generation Network and International Telecommunication Union standards process is a proposal for a universal flow-optimised application service resiliency (ASR) specification, positioned as a fundamental requirement of next generation telecom network infrastructure, that turns traditional approaches to service resiliency on their head.
The primary purpose of the new ASR proposals is to enhance and/or complement current approaches to application service resiliency and to do so by addressing several characteristics of this new network environment that traditional methods have problems satisfying.
Proposed and/or supported by Agere, AT&T, British Telecom, Cisco, Lucent, Nortel, and Sprint, the essential idea behind ASR is redundancy, not of hardware, but of multiple paths and data, and management of both mission-critical and less critical data such that traffic arrives successfully when needed and in the form necessary.
A networking environment that implements ASR will benefit immediately, even with legacy equipment. For example, suppose that 10 per cent of the total bandwidth of a particular path is protected and the primary and secondary paths are of equal bandwidth.
The primary and secondary paths each can carry 10 per cent of duplicate protected traffic and 90 per cent of unprotected, best effort traffic. This translates into a total bandwidth use of 95 per cent. Compare this to the 50 per cent for present either/or techniques that cannot discriminate at the traffic service level and require 100 per cent of the traffic to be protected.
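The arithmetic above can be checked with a short script. The 10 per cent protected fraction and equal-bandwidth paths are the assumptions stated in the text:

```python
# Bandwidth efficiency of ASR versus all-or-nothing path protection.
# Assumes two equal-bandwidth paths; only the duplicate copy of the
# protected traffic is 'wasted' capacity.

def asr_efficiency(protected_fraction: float) -> float:
    """Fraction of total (primary + secondary) capacity carrying unique traffic."""
    total_capacity = 2.0                    # two equal paths
    duplicated = protected_fraction         # second copy of the protected flows
    return (total_capacity - duplicated) / total_capacity

print(round(asr_efficiency(0.10), 4))  # 0.95 -> 95 per cent, as in the text
print(round(asr_efficiency(1.00), 4))  # 0.5  -> 50 per cent for either/or schemes
```

The either/or case falls out of the same formula: when the whole of the traffic must be duplicated, half of all deployed capacity carries copies.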
Of course, best results would be achieved with hardware and traffic management network processors optimised for the task. However, if properly implemented, even existing systems with minimal or no fast restoration capability could be retrofitted to perform ASR on an incremental, pay-as-you-go basis.
Such a flow-optimised ASR network architecture would work independently of the packet-transport protocol (IP, Ethernet, ATM, Multi-Protocol Label Switching [MPLS], etc.), or physical transport topology (ring, mesh, star, etc.). More importantly, it would work independently of, or in conjunction with, existing network resiliency mechanisms such as MPLS reroute.

How flow-optimised ASR works

The simplest implementation of the ASR concept is between the two end points of a protected flow. In this scenario, it is assumed that data moving in both directions behaves in the same way. All subscriber services, such as voice, video and Internet access, are concentrated through a home or business gateway device. Consolidated data is sent or received by the gateway over a single broadband link connected to a multi-service access node (MSAN).
If the network has underlying mechanisms in place for fully or partially separate primary and secondary paths, and allows network policy managers to notify the MSAN's control plane processor which flows are to be protected, provisioning can be configured statically or dynamically using SIP session establishment requests.
When the total aggregate of all service flows arrives from the subscriber at the MSAN, protected flows are identified and replicated. They are then sent over both the primary path and a secondary path that is physically and spatially independent of the primary.
Under normal circumstances, at the termination end of the protected flow, the router would accept traffic from the primary channel and discard traffic from the secondary one. But in the event of a network failure, the router can make a decision on how to handle it depending on the degree of control needed. The router can detect the disruption on the primary and rapidly switch to the secondary, or data from both paths can be retained, with the NPU making decisions on a packet-by-packet basis as to which flow to discard.

The role of the NPU

At the MSAN, data would flow into a line card where an NPU then handles data path operations such as protocol encapsulation, forwarding, etc., while a general purpose CPU in the control plane performs the corresponding functions on the control path.
For ASR to work effectively, the NPU must take on several critical tasks in the MSAN. Most importantly, it must classify the incoming subscriber data to determine whether the flow is protected, by inspecting the bits in the packet header that uniquely identify a packet flow.
Once a protected packet or flow is identified, the NPU must assign it a proper priority and buffer it to be scheduled for transmission to both primary and secondary paths. This prioritisation is essential because it gives protected packets precedence over unprotected ones.
Because in most cases NPU classification engines are programmable, the specific classification criteria can be extremely flexible. The packet classification subroutine that is invoked on NPU (see Figure 2, right) initially obtains packet classification information such as physical port number and Ethernet MAC address.
The packet classifier then classifies the incoming packet based on one or more techniques such as exact matching, longest prefix matching, or range checking. The result determines whether the packet should be protected and the corresponding results are returned to the calling process.
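As a hedged sketch of this classification step (the field names, rule tables and values below are all hypothetical, and Python stands in for what would be NPU microcode):

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class Packet:
    port: int       # physical port number
    src_mac: str    # Ethernet MAC address
    dst_ip: str     # destination IP address
    dscp: int       # DiffServ code point from the IP header

# Illustrative protection rules (all values hypothetical)
EXACT_MACS = {"00:1a:2b:3c:4d:5e"}                  # exact matching
PROTECTED_PREFIXES = [ip_network("10.1.0.0/16"),    # prefix matching
                      ip_network("10.1.2.0/24")]
DSCP_RANGE = range(40, 48)                          # range checking (e.g. voice)

def is_protected(pkt: Packet) -> bool:
    """Classify a packet: True means replicate it onto the secondary path."""
    if pkt.src_mac in EXACT_MACS:
        return True
    # A real NPU would do a longest-prefix match and act on the most specific
    # entry; for a simple protect/don't-protect decision any match suffices.
    if any(ip_address(pkt.dst_ip) in net for net in PROTECTED_PREFIXES):
        return True
    return pkt.dscp in DSCP_RANGE
```

For example, `is_protected(Packet(1, "aa:bb:cc:dd:ee:ff", "10.1.2.5", 0))` returns `True` via the prefix rule, while a packet matching none of the three rule types is left as unprotected best-effort traffic.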
At the termination end of the two paths of protected flow, another NPU must classify and identify the protected flows, keeping only the primary flows if the network is operating normally. But if the NPU detects a network outage on the primary flow, it switches over to the secondary one, keeping all data that arrives there and discarding data on the primary flow.

No trade off for designer

The multicast nature of the protected packets requires an NPU architecture designed with efficient multicasting in mind, so the designer does not have to trade off network resiliency for performance.
In addition, the NPU allows buffer management discard/tag decisions to be executed independently on each multicast branch, so congestion of the secondary path will not impact the QoS of the primary path. It is also important that the NPU be able to have sufficient bandwidth and memory resources to handle situations when both the primary and secondary paths must be retained.
In such situations, the NPU will have to make timely milli- and micro-second decisions as to which flow to discard based on criteria such as sequence numbers, timestamps and checksum integrity data. In such cases, it should be possible to perform the equivalent of a fault tolerant 'hitless switchover' since switchover is being decided on a packet-by-packet basis.
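That packet-by-packet selection can be sketched as a sequence-number de-duplication over the merged arrivals from both paths. This is a simplification: a real NPU would also bound its state and weigh timestamps and checksum integrity, as noted above.

```python
def select_packets(interleaved):
    """interleaved: iterable of (path, seq, payload) tuples from both paths.

    Keep the first copy of each sequence number, whichever path it
    arrives on; discard the duplicate from the other path.
    """
    delivered = []
    seen = set()
    for path, seq, payload in interleaved:
        if seq in seen:
            continue          # duplicate copy from the other path
        seen.add(seq)
        delivered.append((seq, payload))
    return delivered

# Primary loses packet 2; its copy survives on the secondary path,
# so the merged stream is complete -- a 'hitless' switchover.
arrivals = [("pri", 1, "a"), ("sec", 1, "a"),
            ("sec", 2, "b"),                    # primary copy of 2 was lost
            ("pri", 3, "c"), ("sec", 3, "c")]
print(select_packets(arrivals))   # [(1, 'a'), (2, 'b'), (3, 'c')]
```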
The programmable nature of most NPUs means it will ultimately be possible to employ more than one fault detection approach in the same NPU, if such capability has practical application to network operators.

Chris Hamilton, Senior Manager, Agere Systems, can be contacted via tel: +1 610 712 7827; e-mail: cwh1@agere.com

Mike Hill explains why it is now more critical than ever for organisations to store, and be able to retrieve, electronic data

Compliance is the buzz-word of the moment. For many companies in highly regulated industries such as healthcare and pharmaceuticals, it's been on their priority list for a while. But since the high-profile corporate scandals at Enron, WorldCom, Parmalat and others it's now moved onto the agenda of every organisation.
Gartner Group estimates that some large and mid-sized enterprises will spend as much as $2m each during the current year to become compliant with legislation such as Sarbanes-Oxley. Will they spend this money wisely, and how can they best benefit from the investment?
Up to now, many compliance requirements have been partially addressed by separate IT products that provide multiple separate solutions in different areas, including backup and archiving. As a result, organisations have tools that, to a greater or lesser extent, allow them to store and archive data and comply with data retention requirements, but the data is usually in multiple locations and in diverse formats. Frequently the organisation has little idea of what data is actually being retained, and what it contains.
Banks and other financial institutions are becoming increasingly regulated and are now required to store all electronic communications related to their business, and retain them for several years. This requirement may also apply to any organisation that accepts orders, changes to orders, invoices, credit notes or any similar accounting documentation electronically. Archive systems are all well and good, but some require the whole archive to be restored before a single specific item can be retrieved.
The rub often comes when an organisation is asked to retrieve a particular detail, an e-mail message for example. In a recently reported case in the USA, Perot Systems claimed that it was going to cost $4.7m to retrieve some specific e-mails requested for a court case.
To make matters more complex, some legal requirements also impose a time constraint: the Data Protection Act requires that you produce requested information within 40 days of the request. The Freedom of Information Act also applies a time limit for compliance with a request. This may be 20 or 60 working days depending on the type of organisation and some other parameters.
It's of no real benefit having the world's most comprehensive archiving and storage system if you can't retrieve selected items from it quickly and easily without disrupting your normal business. Your data retention policy should specify what kinds of data are stored and what kinds are discarded. Wouldn't it make sense for your archiving and storage system to implement this automatically? That way you could ensure that you're storing only the information that is both relevant to your business and necessary to achieve legal and regulatory compliance.

'Intelligent' need

There's a need for 'intelligent' storage of data. The archiving and storage system needs to understand both the type and the content of the information being stored. Then you could avoid storing unnecessary information: storage is cheap but there's no point in paying to use it if you don't have to. Furthermore you could identify business data and communications that do not need to be retained to achieve compliance; so you don't need to store it. Then, you can't be required to retrieve it: you can only retrieve information you have stored, after all.
However if you talk to your backup specialist, or your IT department, they will undoubtedly confirm that backing-up and archiving material is difficult: users tend to keep things on their local machines and it's hard to back up laptops and hand-held devices because they're never in one place long enough. And then there are those home-based telecommuters who rarely visit the office.
Perhaps the best place to monitor and record both business information and electronic communications is the network itself. Eventually almost every document, memo, spreadsheet, invoice, work order and press release your organisation generates or receives will pass, in electronic form, over your network. There are suitable systems available on the market today that simply plug into your network.
If you are going to implement one of these systems you could also use the same capability to monitor the entire organisation's electronic communications: e-mail, instant messenger (IM), web-mail, Internet downloads and so on.
IM is a terrifically useful tool. It can enable both one-to-one and group communications within your business. Unlike e-mail, you know immediately that the other party has received your message. In stockbroking, some clients have found it useful for issuing instructions to their broker in real time. But sadly many organisations, particularly in financial services, believe that it is impossible to record and retain this form of communication. Because allowing unrecorded IM would likely be interpreted as a breach of statutory duty, they have blocked use of IM on their systems entirely.

Head in the sand

This may be one way of achieving compliance, but in my view it's a head-in-the-sand attitude to a technology that has the capability of truly changing the way many of us communicate. Better to enable it, but to record and store it in a form that both achieves compliance and enables IM conversations to be retrieved quickly and easily. Not only are there systems available that do this, but some allow all forms of electronic communications to be monitored, recorded and searched upon in the same place irrespective of protocol, application or file type. This means that if the organisation needs to retrieve some specific records, it doesn't need to search multiple archives in multiple applications and try to string them together chronologically: it can search one system for all relevant communications whatever the protocol or application used.
If the system understands the content of all this data as it's being stored then it could even highlight, and alert upon, particular items at the moment they are stored. It's only one small step further to configure such a system to alert if the traffic appears to breach internal policies such as your internet acceptable use policy.
Why might you want to do that? Well, there are a number of things, loosely categorised as risks, which may be discovered, and acted upon, within the content of archives or backups. These include legal liabilities such as: employee harassment by e-mail; defamation by someone within your organisation for which your organisation may be held responsible; transmission of viruses or worms, which may be construed as negligence; and copyright infringement. Perhaps someone in your organisation is using your network to download copyright music, or pornographic material. In some jurisdictions the organisation may be liable for failing to take action to prevent the individual committing the act.
Other things you might also be able to identify include security breaches, such as transmission of confidential material from inside your organisation, and illicit or illegal activities, such as money laundering by a client or by a member of staff. It is not unknown for criminals to conduct their activities at their place of work and use their employer's computer systems to do it. There were several cases last year of employees downloading paedophilic material onto their work computer systems. In one widely-reported instance the employer didn't know until their employee's girlfriend told them.

Inappropriate use

And finally you could detect inappropriate use of your computer systems. Are your employees always working for you, or do they book their holidays or gamble at online casinos during working hours? Would you know if they were?
Because all activity is recorded you would have documentary evidence should you need to take any matter further. This might include reporting suspicious transactions to the money laundering authorities, or taking disciplinary action against an employee.
But are you permitted to do this? In most instances you are allowed, and even required, to do this for business purposes, but what if your employees also send private e-mail? Regulations such as the Regulation of Investigatory Powers Act, the Human Rights Act and the Data Protection Act appear to limit, or restrict, the right of an organisation to monitor the electronic communications of its staff. Under the Regulation of Investigatory Powers Act 2000, monitoring and storing employees' private e-mails (if you allow them reasonable private use of business systems, as most organisations do) is a breach of statutory duty unless you have their consent and the consent of their correspondents.
The answer here is to monitor and record, but also to inform your employees that you are doing so; you must include this in your communications policy and state that their first use of business systems for private use will be their deemed consent to the monitoring. This allows them to make an informed decision about whether or not they want to send and receive private e-mails at work. This procedure is relatively easy for your employees, staff or students but how do you get the consent of external correspondents? Look at what the international and city firms of solicitors are doing. They put a statement at the end of all their e-mails warning that they will monitor and record e-mails and that continued e-mail correspondence with their employees in any capacity will be deemed consent to the monitoring by both parties.
The same general principle holds good in cases of your employees visiting unacceptable Internet sites.
Updating your communications policy, your Internet acceptable use policy and your employees' terms and conditions of employment may be necessary to ensure that you comply with the legislation that protects your employees' rights while you implement systems to comply with the legislation affecting your business; and that you accrue the greatest business benefit from doing so.

Mike Hill, Vice President, Marketing, Chronicle Solutions (UK), can be contacted via tel: +44 7775 923 910 or +44 1494 672 999; e-mail: mike.hill@chroniclesolutions.com

FTSA, the parent company of the France Telecom group, along with its constituent companies, is experiencing the benefits of introducing e-learning into the development strategies for its 200,000 or so employees throughout the world. Bob Little reports

Although France Telecom has been using some elements of computer based learning since 1993, it took its first tentative steps in e-learning some five years ago. It is only in the last year or so, however, that e-learning within the group has begun to take off.
"In the summer of 2004, we carried out a controlled test on the effectiveness and use of e-learning within France Telecom," explains Yves Scaviner, deputy manager for group training at France Telecom. "We asked 1500 of the company's managers to work through a number of e-learning courses -- delivered through the medium of both French and English -- and over 60 per cent were won over to e-learning as a result of this experience. And now they have become real ambassadors for e-learning and the spearhead for its deployment in operational units throughout France."
The France Telecom (FT) Group comprises five major subsidiaries: TP Group, based in Poland; Orange, which has a presence in 17 countries; Equant, which provides services for the top 3,000 multinational companies in 200 countries across the world; Wanadoo, the Internet connection provider; and FTSA, the parent company of France Telecom.
"It's not easy to change a prevailing corporate training culture and implement e-learning overnight," says Scaviner. "Face-to-face training is the traditionally accepted method of learning which everyone understands -- even if it doesn't suit their individual learning style.
 "The secret of introducing e-learning and gaining rank-and-file acceptance of it as a learning delivery method is not only to have high profile endorsement from senior management but also to convince line managers of the benefits and advantages of e-learning. Moreover, you also have to prove to employees that e-learning is not 'second class learning' simply because it rubs against traditional classroom learning.
 "E-learning offers many benefits and advantages over more traditional methods of training delivery," Scaviner explains. "It is true that developing and using e-learning can result in major cost savings -- especially where training large numbers of employees is concerned. However, while this is a significant reason for France Telecom, it is not the main reason that we are increasing our use of e-learning.
"As a company, our business is in providing the infrastructure to encourage and enhance 'e-transformation' -- so, in embracing e-learning, we are helping to set an example to our clients and suppliers.
"But, for us, the most important benefit of e-learning is that it is a more efficient way of presenting learning than via the classroom," he continued. "Our studies have shown that learners learn faster when they use e-learning, compared with conventional classroom-based teaching methods. Typically, we have seen that what takes some six hours to teach in a classroom can be learnt in four hours in the virtual classroom and three hours if done via distance learning. This makes e-learning a highly efficient as well as cost-effective way to learn."
Mindful of the dangers of putting all its eggs in one basket, while FTSA is increasing the emphasis it places on e-learning within the companies in its group, Scaviner is also keen to stress that its human resource development strategy is dependent on a 'blended approach' -- that is, a mixture of classroom-delivered training with virtual classroom and distance learning inter alia.
Having realised the cost-effectiveness of e-learning, compared with other learning delivery methods -- especially for companies with widely dispersed workforces, such as Equant -- FTSA is actively pursuing a strategy that will result in half of its training/learning activities being delivered via e-learning in 2006. The current volume of training/learning delivered via e-learning within the group is some 20 per cent. This e-learning comprises a mixture of custom built e-learning, mainly developed in-house, and generic e-learning courseware from two worldwide suppliers of training software.
"Where transferable skills are concerned, we do not want to produce in-house what is already available in the marketplace," says Scaviner. "That is why we have bought licences for some 3,000 generic e-learning courses.

Learning path

"Although France Telecom's 200,000 employees have theoretical access to each of these courses, in practice FT training staff choose a learning path for each learner based on that person's revealed training needs," Scaviner adds. "Currently, the most accessed programmes cover general management issues; managing meetings; discovering your management style; managing stress, and motivating staff."
According to France Telecom's Odile Demery: "E-learning -- both virtual classroom and distance learning -- can be delivered to learners' desk-tops but many France Telecom employees work in open space and there is a greater chance of them being disturbed during their learning time. For this reason, France Telecom has made available some 400 dedicated training booths around the country -- known as 'Espace Clic-n-learn' -- where individuals can study undisturbed. So the booths -- time in which can be pre-booked online -- offer an ideal solution."
"We are popularising e-learning throughout France Telecom via a number of initiatives, including posters and mousemats advertising 'Espace Clic-n-Learn'," said Demery.
"Ultimately the popularity of e-learning will depend on the 'me too' factor -- as people see their colleagues visiting the Clic-n-Learn booths and benefiting from their new knowledge and skills."


Christine Skelhorn, head of training at Orange, passionately believes that e-learning is the way forward for Orange. Along with her training team and e-learning co-ordinator Amanda Yarrow, she is committed to providing all Orange employees with access to innovative, effective and enjoyable learning. At the beginning of 2004, Orange launched its e-learning strategy with a Corporate Induction module, developed with TATA Interactive Systems (TIS).
In its ten years of existence, Orange has grown to some 12,000 employees in the UK. As the company continues to grow so, too, does its requirement to recruit staff -- especially in the customer services field -- and give them induction training. Until the advent of the e-learning materials, the corporate induction programme was a three-hour PowerPoint presentation conducted, as required, by any of the company's available trainers.
"We are delighted at the positive feedback we've received to the induction module," says Amanda Yarrow. "In particular, it's been a real winner with new starters across the business.
"Users range from engineers who have worked at Orange for many years to newly recruited customer service staff. They all seem to like the image we've adopted of a 'fairy godmother'. This virtual entity guides them through the programme and helps to dispel any 'techno' fears that they may have. She also helps to add meaning and significance to the contents of the induction programme.
"Importantly, TIS seems to have hit on a style of presentation which appeals to everyone in the company," Amanda Yarrow adds.
Orange has a number of further e-learning programmes in development for 2005 and beyond.
"It's been an encouraging start with Orange Induction and we really look forward to building on our achievement in the future," concluded Yarrow.


In less than two years, Equant -- part of the FTSA group -- has provided ongoing e-learning to some 9,500 employees worldwide, saving $6.5m in the process.
Equant operates a worldwide telecommunications network that manages 152,000 user connections across 220 countries for some 3,700 customers. Its employees need ongoing training in a range of topics including IT skills, project management, problem solving and negotiation skills.
With so many of its employees based throughout the world, Equant knew that traditional classroom-based training was time-consuming and costly, so it implemented a blended learning strategy integrating elements of classroom-based training with e-learning. The bulk of its e-learning programmes are provided by SkillSoft's IT and business related courses. SkillSoft also provides a '24/7' mentoring service to support its IT curriculum.
By the end of 2003, Equant had over 8,500 of its employees using e-learning -- a user rate of over 85 per cent -- at a cost per employee of less than $100 for 24/7 access to 700 courses.
Access to e-learning is:
*  Having a positive impact on levels of staff retention at Equant.
*  Helping employees to become proficient in their jobs more quickly - thus reducing costs, increasing productivity and revenue.
*  Enabling employees based in more remote locations to feel a closer part of the Equant 'family' and providing more development opportunities than were previously available to them.                               

Bob Little is a freelance communications writer

Providing a consistent level of Ethernet service is vital if carriers are to make the most of its potential, says Fred Ellefson

European carriers have embraced Ethernet services and, according to the Probe Group, are on track to grow the European Ethernet service market at a 40 per cent CAGR to provide almost €4 billion worth of services by 2008. With the success of this service, its profitability for carriers is paramount. The capex side of delivering Ethernet services is extremely attractive, with Ethernet port costs approaching one tenth of comparable SDH port costs, according to Network Strategy Partners, LLC.
However, the opex side of delivering Ethernet services over a five year service contract will end up dwarfing the original capex for delivering the service, and is substantially higher than traditional SDH or PDH based services (see Figure 1, right). This is critical because Ethernet service price points are typically much lower per megabit than traditional SDH/PDH based data services. Opex costs can make delivering profitable Ethernet services a challenge.                                                 
The original Ethernet standards were designed for the enterprise LAN environment and do not have the operations administration and maintenance (OAM) capabilities that carriers require to deliver a WAN service profitably on a wide scale. While manpower can be thrown at the problem when the number of customers is small, this approach will not scale as Ethernet services move down market to the millions of small and medium enterprise customers necessary to grow the market. 
Fortunately, the standards bodies have recognised this problem and have been working on adding the OAM capabilities required to allow Ethernet services to scale. The first OAM standards were ratified in the middle of last year by the IEEE 802.3ah Ethernet in the First Mile (EFM) standards group. While most of the media attention has been focused on the copper transport side of this standard, the bigger impact on carrier bottom lines will come from the OAM advances found in the standard. The ability to remotely monitor and perform maintenance will eliminate expensive truck rolls and dramatically improve the profitability and margin of Ethernet services. According to studies by Covaro, these new standards will reduce truck rolls (and opex) by 47 per cent versus un-managed Ethernet services and can greatly reduce the total cost of delivering Ethernet services (see Figure 2, opposite).
In addition, ITU and Metro Ethernet Forum (MEF) standards bodies are developing standards for end-to-end service monitoring/testing, while IEEE 802.1ag/ITU are also working on connectivity standards for multi-point to multi-point services. These standards will be ratified in late 2005 or 2006, and together with IEEE 802.3ah, will provide a layered set of OAM capabilities analogous to OAM capabilities found in SDH and PDH standards. 

Ethernet demarcation

With these new OAM standards, the biggest changes and impact will occur at the customer premises, where the service is delivered to the end user. Traditional data services incorporate a demarcation device, such as a smartjack or CSU/DSU, to provide remote monitoring and test capabilities; such devices have been a critical factor in reducing opex costs and ensuring the profitability of these services. In the Ethernet world, carriers have had to improvise using Ethernet switches, routers, media converters or even SDH ADMs to perform this function. However, these devices are not up to the challenge: they cannot perform even the simplest of demarcation functions, such as loopbacks or test generation, and often are not particularly reliable. Many enterprise users are shocked to see the same router that they have to reset every week in their own network installed in the telco closet by the carrier for demarcation.
A number of companies are now offering purpose-built Ethernet demarcation devices which incorporate both Network Terminal Equipment (NTE) and User Network Interface (UNI) functionality. The NTE performs the OAM capabilities defined in 802.3ah, including remote/local loopbacks, remote failure indication, fault isolation, performance monitoring with threshold alarms, status monitoring and discovery. The UNI function is aligned with MEF recommendations for Ethernet service definitions, including Committed Information Rate (CIR), Excess Information Rate (EIR), and burst size, on both a port basis and an Ethernet virtual circuit (EVC) basis for VLAN based services. These two functions are key both to defining an Ethernet service and to maintaining it profitably.
The NTE function provides a full suite of RMON Etherstats plus extensions, enabling carriers to monitor SLA conformance on both sides of the demarcation point and to analyze performance trends over time. Performance data can be collected and stored in 15 minute intervals just like performance data from traditional carrier services. This valuable data provides a performance log for billing and SLA purposes, and provides advance indication of performance degradation before an outage actually occurs. Carriers can make this data available to customers through a web portal to facilitate customer network management, as is often done with private line or frame relay services. 
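To illustrate the kind of processing those 15-minute performance bins enable, the sketch below turns raw octet counters from one interval into a utilisation figure and a threshold alarm. All names, rates and thresholds here are invented for illustration; no vendor API or specific RMON MIB is implied.

```python
# Hypothetical sketch: convert RMON-style octet counters collected at the
# start and end of a standard 15-minute performance bin into utilisation
# against the provisioned rate, and raise a threshold alarm.

INTERVAL_SECONDS = 15 * 60          # the 15-minute bin the article describes
LINE_RATE_BPS = 10_000_000          # assumed 10 Mbit/s rate-limited service

def utilisation(octets_start: int, octets_end: int) -> float:
    """Per-interval utilisation as a fraction of the provisioned rate."""
    bits = (octets_end - octets_start) * 8
    return bits / (LINE_RATE_BPS * INTERVAL_SECONDS)

def check_interval(octets_start: int, octets_end: int, threshold: float = 0.8):
    """Return (utilisation, alarm) for one 15-minute bin."""
    u = utilisation(octets_start, octets_end)
    return u, u >= threshold

# One bin: 900 MB transferred in 15 minutes on a 10 Mbit/s service
u, alarm = check_interval(0, 900_000_000)
```

Logged over time, exactly this sort of per-bin figure provides the SLA-conformance trail and early warning of degradation that the article describes.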
Should a service outage occur, the NTE provides remote visibility and control of the demarcation device, reducing or eliminating the need for a truck roll. The carrier can employ remote Ethernet loopbacks along with pattern generation/detection for remote testing, and can remotely determine whether the CAT5 cable connected to the CPE is open circuited, short circuited or properly terminated. Open or short circuits can be located to the nearest metre for precise diagnosis of cabling problems, which can then typically be corrected by the customer. The EFM dying gasp message even gives the carrier a remote indication that power has been lost, which the customer can then often resolve on site.
The Ethernet demarcation device's UNI provides the CIR, EIR and burst parameters needed to define the QoS/CoS of Ethernet services on a port or EVC (VLAN) basis. The use of VLANs enables multiple Ethernet services, such as dedicated Internet access and/or Ethernet private line services, to be carried in a single Ethernet connection to the customer. This function is extremely visible to the end user because it defines the look, feel and personality of the Ethernet service. 
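The CIR/EIR/burst-size profile described above can be sketched as a pair of token buckets, loosely following the MEF bandwidth-profile idea of committed and excess buckets. The class name, parameter values and three-colour labels below are illustrative assumptions, not any device's actual configuration.

```python
# A minimal two-bucket policer sketch for a UNI bandwidth profile:
# frames within the committed bucket are "green", frames within the
# excess bucket are "yellow", everything else is "red" (discard).

class BandwidthProfile:
    def __init__(self, cir_bps, eir_bps, cbs_bytes, ebs_bytes):
        self.cir = cir_bps / 8        # committed rate, bytes/sec
        self.eir = eir_bps / 8        # excess rate, bytes/sec
        self.cbs, self.ebs = cbs_bytes, ebs_bytes
        self.c_tokens = cbs_bytes     # buckets start full
        self.e_tokens = ebs_bytes

    def refill(self, dt_seconds):
        """Replenish both buckets for elapsed time, capped at burst size."""
        self.c_tokens = min(self.cbs, self.c_tokens + self.cir * dt_seconds)
        self.e_tokens = min(self.ebs, self.e_tokens + self.eir * dt_seconds)

    def mark(self, frame_bytes):
        """Classify one frame against the committed, then excess, bucket."""
        if frame_bytes <= self.c_tokens:
            self.c_tokens -= frame_bytes
            return "green"
        if frame_bytes <= self.e_tokens:
            self.e_tokens -= frame_bytes
            return "yellow"
        return "red"

# Back-to-back full-size frames against a small assumed profile:
uni = BandwidthProfile(cir_bps=2_000_000, eir_bps=1_000_000,
                       cbs_bytes=1500, ebs_bytes=1500)
colours = [uni.mark(1500), uni.mark(1500), uni.mark(1500)]
# the first frame fits the committed bucket, the second the excess
# bucket, and the third exceeds both
```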
Locating the service UNI at the customer premises enables service definition and prioritisation at the point where full-rate Ethernet is rate-limited to the lower bandwidth transport and price points of services targeted at small to medium businesses. Performing this function at the rate-limiting point is essential to the proper prioritisation of latency-sensitive services such as VoIP and video. The customer premises location of the UNI also enables remote additions and changes to the service definition, eliminating truck rolls for service upgrades as well.

Consistent look and feel

In summary, in addition to providing a dramatic improvement in opex, the demarcation devices provide carriers with a consistent service personality or SLA that can be hard to achieve when delivering Ethernet services over a mixture of fibre, copper, SDH and PDH technologies. Multi-site enterprise customers are often disappointed that their service SLA and monitoring capability is limited at sites that are served via early-generation SDH or PDH equipment. Installing a demarcation device behind an SDH ADM can provide VLAN and OAM support that the ADM cannot provide, and enables those sites to receive the same rich set of service capabilities that the native Ethernet sites are receiving. Providing a consistent, ubiquitous Ethernet service is key to differentiation, to increasing market share and to ensuring the profitability of an Ethernet service offering.

Fred Ellefson is Vice President of Marketing, Covaro Networks, and can be reached via e-mail: fred.ellefson@covaro.com   www.covaro.com

Lynd Morley looks back at this year's 3GSM World Congress

It was cold in Cannes this year. Even the determined visiting joggers, who pound along the Croisette in the early hours, were swathed in woolly hats and gloves. There was something of a chill in the air from the locals as well, which might, of course, have been in some way connected to the fact that after ten years, the 3GSM World Congress was paying its last visit to Cannes before decamping to Barcelona next year. Given the significant boost the show delivers to the local economy, it is probably not surprising that, viewing all connected with the Congress as in some way treasonous, the waiters, bartenders and some shopkeepers surpassed even their usually accepted levels of arrogance, indifference and downright rudeness.
But inside the Palais des Festivals all was warm and glowing -- that is, once you'd recovered from the frostbite contracted while queuing to register. The atmosphere in all five of the exhibition halls was bullish, determined and positively radiant. Indeed the much discussed (and prayed for) recovery in the telecommunications industry could not have been better illustrated than by the sheer numbers at 3GSM this year.  A record 34,000 participants meant that attendance was up by some 20 per cent on last year. Indeed, the number of delegates, exhibitors and visitors swelled the population of Cannes to such an extent that it was near on impossible to find table space to grab a café au lait in any of the cafes around the Palais.
All grist to the mill, of course, as far as the GSM Association and the Informa Group were concerned, underlining -- as such figures do -- the importance of the event in the mobile calendar.
Both organisers and participants were, understandably, keen to stress the growing momentum of 3G, emphasising its continuing development through IMS and HSDPA -- whose imminent arrival has precipitated a rash of roll-out plans from the likes of Motorola, Siemens, Nortel and, of course, Ericsson, which claims to have set a new HSDPA data transfer world record of 11 Mbit/s. Questions about the availability of appropriate handsets remain, however, in the face of a rather unnerving reticence from handset vendors on the subject (resurrecting the spectre of last year's handset shortages for 3G roll-out), but pundits believe the industry has learned its lesson -- a sentiment echoed by Sony Ericsson vice president Jan Wareby, who stated that his company will be making products available for trial this year, with commercial volumes during the first half of 2006.
While product launches, partnership plans, and development announcements -- covering every conceivable aspect of the 3G world -- abounded during the show, the buzz was particularly audible around such topics as TV to mobile devices (buzz volume increases with the assertion from Orange that some 60 per cent of its users in France watched live TV on their mobiles); and music (several more decibels on the buzzometer with the announcement of a link-up between Microsoft and Nokia on delivering music to mobile phones).
Reflecting the incredibly wide range of nationalities present at the show -- which drew visitors from the full spectrum of market maturity across different countries around the world -- the 3G community is looking to its future in such countries as China, India, Brazil and Russia, which will, according to Bharti Chairman Sunil Mittal, provide the next billion GSM customers within three years. The GSM Association and Motorola will help this process along, of course, with their promise to deliver a sub-$40 handset this year, with the aim of reducing that to sub-$30 in the future. It should be stressed that such prices are essential to the further spread of GSM, given the GSM Association's own figures, which show that although some 80 per cent of the world's population has access to wireless coverage, only around 25 per cent can actually afford it.
3GSM 2005 was, by any measure, a considerable success. Significant announcements were made, networking flourished, and deals were done; exhibitors seemed more than satisfied with attendance levels; the corporate parties were lavish and exuberant; and even those professional whingers, the assembled press, complained considerably less about the media centre facilities.
A fitting farewell, perhaps, to the Cote D'Azur. Next year Barcelona -- and hopefully a better chance of getting that cup of coffee.

Just across the road from the Palais des Festivals, the Telecom Valley Gallery set up shop again this year in the famous La Potinieres du Palais.  Boasting much-welcomed heaters in the restaurant's canopy area, the Telecom Gallery offered comparative calm, mixed with that essentially French sense of stylish purposefulness.

Organised by the Telecom Valley Association, in partnership with Cote d'Azur Development (CAD) and Initiative Riviera Technologies (IRT), the Gallery is a showcase of the latest 'made in the Cote d'Azur' wireless technologies. The companies discussing their products and services over an excellent glass of wine (not to mention a superb menu), included Aequalis, Altix-EDS, Atos Origin, Devnet, ETSI, Istar/EADS, NCR Teradata, OrangeFrance, Philips Semiconductors, Smartcom, Temex, Texas Instruments, Trendium, and W3C.
Jean-Marc Dijan, President of the Telecom Valley Association, notes that these companies demonstrate that a real telecom value-chain has taken root and grown to maturity in the region, covering standardisation institutes, electronic design centres, consulting firms, software creation and integration firms, support services, research laboratories, engineering schools and so forth.
With the clear intention of attracting more companies into the area, Jean-Pierre Mascarelli, President of Cote d'Azur Development, adds that the local high tech community has been able to take advantage of the fact that the CAD provides international businesses with personal and confidential contacts to the area's business, economic and administrative networks, at no charge.
The Telecom Valley companies, like their colleagues in the main exhibition halls, came well equipped with company information and announcements, timed perfectly to coincide with the 3GSM World Congress.  Teradata, for instance (who also had a booth in one of the main halls), unveiled its data warehousing solution, Warehouse 8.0, aiming to provide businesses with breakthrough business intelligence to solve the problems of how to increase revenue, reduce expenses and identify new growth opportunities. Chris Parsons, Teradata's EMEA Industry Marketing Director, noted that for operators to achieve their declared aim of 'getting closer to the customer' (a recurring theme at 3GSM 2005) effective business intelligence is crucial. He explained that the stronger, more robust Teradata data warehouse enables customers to gain a competitive edge with a new level of business intelligence, and stressed that the company continues to enhance the warehouse suite to make it easier for businesses to integrate Teradata into their overall enterprise.
The IT services company, Atos Origin, which provides business consulting, systems integration and managed operations, was also able to highlight its capabilities at 3GSM hot on the heels of an announcement -- the company having recently implemented the LHS rating package 1.2 at T-Mobile Austria, within just seven months. The solution enables the billing of both post- and pre-paid customers via one system. A core element of the solution is that it will also be used for future 3G services, such as mobile voice and data communications via UMTS, GPRS or WLAN.
The companies gathered in the Telecom Valley Gallery all had their own particular success stories to relate, of course. But while there's no doubt that this particular venue will be missed in Barcelona, it is to be hoped that they will continue to contribute to the 3GSM gathering -- on foreign soil.

Lynd Morley is editor of European Communications

As a mobile Internet protocol, i-mode could provide
operators with a means of differentiating their services
in the mobile data market, reckons Kevin Buckley

This year, the 3GSM World Congress in Cannes found the GSM world had well and truly embarked on 3G, with at least one, and usually several, operators having launched services in all the major markets. And not a moment too soon, as voice revenue, everywhere, is under pressure from competitors and, in the case of interconnect rates, from regulators. Data (beyond just SMS) is therefore charged with offsetting the decline in voice revenue growth.
In general terms, GSM world operators can be divided into two groups for the purpose of analysing their mobile Internet strategies. The first comprises the leaders -- frequently, the top two players in any given market. They will usually have far more subscribers than the rest of the competition, forming a de facto duopoly and vying between themselves for the leadership position, quarter by quarter.
These operators' main challenge is to migrate their huge customer bases smoothly from 2G through 2.5G to 3G. Having learnt from their mistakes with Wireless Access Protocol (WAP) phones, which came to market in 1999-2000 before an ecosystem of well-designed, well-conceived sites existed, they concentrate on building services rather than emphasising the technology. They sell camera phones and music downloads rather than GPRS or UMTS.
As such, they introduce their consumer-oriented mobile Internet offerings as content portals on their GPRS networks, signing up subscribers so that 3G can subsequently be marketed as a speed upgrade.
The other group comprises operators that, for one reason or another, need to differentiate their offering from the rest. Some are new entrants, i.e. groups that have no 2G customer base because they came in at the 3G licensing stage and therefore need to wow potential customers into leaving their existing provider in favour of them. Others already have a 2G business but aspire to become a market-leader and so need to raise their profile as a sexy option for mobile phone users wanting content offerings along with voice.
One way this group can seek to differentiate itself is by promoting the fact that it is offering 3G telephony, running ad campaigns that emphasise the new things that can be done with the more advanced phones in terms of content acquisition, m-commerce transactions or location-based services.
At the same time, the more efficient spectrum utilisation of 3G, compared to earlier generations of technology, means that more voice calls can be put on the same wavelength, a fact the new entrants are exploiting to offer cheap voice services. These are designed to bring customers to their networks, after which it is an easier task to persuade them to start using the mobile Internet function and to acquire content.

i-mode as a differentiator

Another way to stand out from the crowd, and one we are seeing in an increasing number of markets in Europe, is with i-mode. Like WAP, this mobile Internet protocol is an overlay on the network and can operate wherever an IP layer has been deployed, i.e. from 2.5G onwards.
The question all operators now face is: how can they make money from data services? It's all very well for a mobile carrier to say it has diversified beyond voice and into data, but mobile Internet access alone is not enough to bolster revenue. It too is being commoditised as operators start to offer flat rate "all you can eat" services to attract subscribers away from competitors who don't. Remember the cautionary tale of Internet service providers in the wireline world, whose initial promise was blighted as the flat rate, always-on environment grew, forcing them to move to value-added services or die.
Let's begin by separating the provision of such services to business/enterprise customers, who want secure mobile access to key applications running on their corporate networks such as ERP, CRM or SFA, from the marketing of non-voice functions to consumers. The latter represent a mass market that, aside from mobile e-mail and text messaging, essentially boils down to the sale of content. It is the provision of data services to consumers that I want to discuss here.

DoCoMo led the way in Japan

That business is, of course, in its infancy, but there are interesting lessons from a market we at NEC know very well, namely Japan. NTT DoCoMo is universally acknowledged to have been ahead of its time when it came to content with its development of i-mode, the proprietary technology which, from 2.5G onwards, has successfully built both a large subscriber base (some 44 million in Japan today, or 92 per cent of all DoCoMo subscribers) and a huge pool of vendor sites (about 84,500 right now, of which just under 4,400 are 'official' sites, i.e. ones that pay a 9 per cent commission on sales to DoCoMo, and 4,600 have 3G content). In financial terms, i-mode contributed 25 per cent of DoCoMo's total revenue last year, which is not bad for only its fourth full year in operation. Nor does the model work only for DoCoMo at home in Japan: the company has also licensed the technology to operators in 12 other countries, in one case (KPN in Holland) to a carrier in which it holds equity.
Eight of the 12 are in Western Europe: KPN and its subsidiaries E-Plus in Germany and BASE in Belgium, Bouygues in France, Telefonica in Spain, Wind in Italy, mmO2 in the UK, Ireland and Germany and CosmOTE in Greece. O2 plans to deploy i-mode in the UK and Ireland this year and Germany (under a different brand) in 2006.
In other parts of the world, Far EasTone in Taiwan launched in 2002 and Australia's Telstra followed suit last year, while both CellCom in Israel and MTS of Russia have signed up with DoCoMo to launch services.

WAP vs i-mode

Meanwhile other industry heavyweights, such as Vodafone, Orange and T-Mobile, are building services based on WAP gateways, with Vodafone live! the furthest advanced in number of countries (16 at the end of September '04) and subscribers (11.5 million at that same time). Like i-mode, WAP sits above, and is thus independent of, the radio access layer; provided the network is IP-enabled (i.e. 2.5G and beyond), either can deliver a mobile Internet experience.
Before we go any further, let me address the fact that WAP is an open standard while i-mode is proprietary and must therefore be licensed from DoCoMo.
All true, but let us not forget that if, for instance, a games developer wants its game to be playable on the Vodafone live! service, it must write to the operator's proprietary API, called VFX, for the purpose.
In the i-mode world, the main pull for ISVs to write to DoCoMo's API has to date been the carrier's commanding share of the Japanese market. Now that other licensees are coming online there is a buddy group forming which, by virtue of its collective subscriber base, again makes it worthwhile for the software developers and handset manufacturers to work to the i-mode spec. By its size and geographical reach, the group begins to rival the clout of Vodafone.

Street market vs. shopping mall

The difference between the two mobile Internet technologies -- and herein lies the secret of i-mode's success -- is that the latter was developed after the way the market for it would work had been defined, whereas WAP debuted as, to paraphrase Pirandello, a technology in search of a business model.
The i-mode business model can be likened to that of a street market. If a vendor wants to set up a stall (i.e. a site), they agree to pay a percentage of the takings to the local council (i.e. DoCoMo). It is therefore in the operating council's interest to have hundreds of stalls, or indeed thousands, since the Internet does not have the physical restrictions of city streets.
Since the business model was thought out beforehand, from the outset DoCoMo recognised that it was in its interest to promote the take-up of i-mode, and to this end has always allowed so-called unofficial sites to proliferate, i.e. the ones that don't pay the 9 per cent fee on their sales and for whom it does not carry out the billing and revenue collection.
It still makes money on them, however, charging for the traffic generated by their m-commerce activities. Indeed, some 80,000 of the total 84,500 sites are unofficial, and what they pay to communicate with customers across the DoCoMo network makes up 50 per cent of the carrier's non-voice revenue.
Another major difference between the i-mode model and those of the operators basing their mobile Internet services on WAP gateways is that, in the former, all content acquisition is the result of Internet browsing and all content is delivered via DoCoMo's i-mode platform. In the WAP-based world, there are far fewer sites and the bulk of content is acquired via SMS. The operators derive revenue from the SMS traffic, of course, but considerably less than they would if their subscribers used the mobile Internet to do their m-commerce, particularly now that SMS and MMS are being bundled into cut-price packages along with voice minutes as competition heats up.
The revenue from the browse-to-buy model comes not primarily from the time subscribers spend online, particularly now that many operators are moving to a flat-rate model for Internet access, but from charging the content providers to be the delivery mechanism for their ringtones, weather forecasts, horoscope updates or whatever, and from making it easy for thousands of providers to put up sites.
If i-mode is like a street market then the WAP-based mobile Internet model is like a shopping mall. The number of stores (i.e. sites) is small and the commission the mall owners (i.e. the operators) earn is a lot higher, anywhere from 40 to 60 per cent of vendors' revenue in fact. And since most content is bought by SMS rather than on the Web, one could continue the analogy by saying that most shoppers aren't even entering the mall.
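The commission arithmetic behind the two analogies is easy to check. The figures below are illustrative only (the 9 per cent and 50 per cent rates come from the article; the sales volume is invented), but they show why the street-market model depends on sheer volume of stalls:

```python
# Illustrative comparison of the two revenue-share models: i-mode's
# ~9 per cent "street market" commission versus a mid-range 50 per cent
# "shopping mall" cut. Sales figures are invented for illustration.

def operator_take(content_sales, commission):
    """Operator's cut of a given volume of content sales."""
    return content_sales * commission

street_market = operator_take(1_000_000, 0.09)   # i-mode style
shopping_mall = operator_take(1_000_000, 0.50)   # WAP-portal style

# How much more gross content must flow through the street market
# before the operator earns as much as the mall owner does:
parity_multiple = 0.50 / 0.09
```

On these assumptions the street market needs roughly five and a half times the gross sales to match the mall's commission income, which is exactly why the low-commission model only works with thousands of stalls.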
The knock-on effect here is obvious. If I receive an SMS inviting me to buy a snazzy ringtone and I text back to buy one, that's the end of the transaction. If, on the other hand, I get an e-mail with a link to a site where I can download the ringtone, the vendor has far more sell-on or sell-up potential while I'm on the site.
There is also a greater opportunity to create a recurring revenue stream by signing me up to a regular service of, say, a new ringtone every week or month, or indeed of multiple ringtones so that I can differentiate between calls from my boss (mental note to self: answer swiftly) and from my mother-in-law (mental note to self: let it go to voicemail). NEC believes that more content brings more users, and more users bring more content.
What's interesting in the Japanese market is that, since i-mode was the first and most successful service there, it has created the country's mobile Internet culture, such that DoCoMo's competitors seek to emulate its business model, even though they're not deploying i-mode.
So while Japanese consumers, thanks to i-mode, browse to buy, their European counterparts text to buy. In the second scenario, the content will often not even go through the operator's servers en route to the subscriber, being stored instead on a content server on the provider's network and delivered by SMS, with the operator deriving revenue only for transporting the text message.

Which strategy will win?

There will probably not be a single winner, as the success of any mobile Internet strategy will not just depend on the technology, or even on the business model alone, but also on the market clout of the operator adopting it and the acumen of the executives running the business environment in which they are operating.
As manufacturers, we at NEC play an important part, as we provide the means to make it all happen! We see a place for both models, depending on where a particular operator aspires to be. Our challenge is one of continually being one step ahead -- developing the roadmaps for more cost-effective, high-performance platforms and the technology on which either model, or indeed any future model, can run.

Kevin Buckley, General Manager NEC (UK),  Mobile Network Solutions Division, can be contacted via e-mail: kevin.buckley@uk.neceur.com

Inventory management is a vital ingredient in the feast known as VoIP services, as Julie Wingerter explains

It's like a hamburger and fries: it's just better together. And in the case of VoIP, IPTV, and other IP based services, inventory management really does make a difference in the overall ability of service providers to roll out these services profitably and efficiently. VoIP services offer operators significant revenue upside, but they also come with a set of deployment and operational challenges. That's where a robust inventory management solution comes into play.
Because IP services such as VoIP and IPTV are executed over a combination of shared multi-service transport environments, there are more devices to provision and maintain, more network topologies to keep straight, and more bandwidth and traffic issues requiring prioritisation than in a typical POTS scenario.
For VoIP to work, all of these network-related activities, equipment and designs have to be monitored and managed in real time. To do this, a powerful network inventory management system is required. Such a system provides an accurate view of the network and serves as the core data repository supporting the automation of routine functions, providing the vital information that allows billing, order management, service provisioning, outside plant and purchasing to run efficiently.

Carrier success

How will carriers be successful in rolling out new VoIP services? From a network perspective their multi-service transport/broadband IP environments must meet some pretty high standards: robust enough to carry millions of phone calls; as reliable as legacy phone services, with a sound network architecture and POTS interconnection strategy; high quality (e.g. jitter-free, static-free); and designed to rapidly process customers' orders and provision services.
What lies ahead in rolling out new IP based services?  Let's look at some of the specific challenges carriers face when introducing VoIP. 
If you don't know what is in your pantry you may not have all of the ingredients necessary to create an appetising dinner. Similarly, VoIP requires carriers to maintain an accurate picture of their network inventory so they can determine which services are available and when, and so they can plan ahead to avoid any 'shortages.' This is particularly important, as IP generally requires many more network devices and configuration parameters than POTS services, including: additional Customer Premises Equipment (CPE) and home networking components, with their associated MAC IDs, IP numbers, customer service data, pricing, etc.; access technologies -- HFC/cable, DSL, PON (FTTH); and PSTN off-net and other off-net connectivity components such as media gateways, SS7, etc.
There is also a need to integrate with additional management applications that use inventory information such as billing systems, network management applications, etc., which may be using the information differently in order to introduce usage mediation or real-time trouble-shooting.
In short, VoIP services are more complex than standard POTS services because of the additional products and numbers/addresses that are used. Consequently, service providers need an inventory solution that provides: a mechanism to capture all of the physical and logical assets of the network; an integrated inside and outside plant inventory that supports an end-to-end view of the network; a means to keep the data current, and; a mechanism to integrate with service provisioning processes.
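To make the 'physical plus logical assets' requirement concrete, here is a minimal, invented sketch of linked inventory records in which a logical service resolves to the physical devices it rides on. Real inventory systems model far richer schemas; every name here is an assumption for illustration.

```python
# Toy inventory: physical devices and logical services, linked so that
# a service can be resolved end-to-end to its supporting hardware.

from dataclasses import dataclass, field

@dataclass
class Device:                       # physical asset (e.g. CPE, gateway)
    device_id: str
    mac: str
    site: str

@dataclass
class Service:                      # logical asset riding on devices
    service_id: str
    kind: str                       # "voip", "iptv", ...
    devices: list = field(default_factory=list)

inventory = {"devices": {}, "services": {}}

def add_device(d: Device):
    inventory["devices"][d.device_id] = d

def add_service(s: Service):
    inventory["services"][s.service_id] = s

def devices_for_service(service_id: str):
    """The end-to-end view: resolve a service to its physical devices."""
    svc = inventory["services"][service_id]
    return [inventory["devices"][dev_id] for dev_id in svc.devices]

add_device(Device("cpe-001", "00:11:22:33:44:55", "site-A"))
add_service(Service("voip-1001", "voip", devices=["cpe-001"]))
```

The point of the sketch is the link itself: provisioning, billing and trouble-shooting all start from exactly this service-to-device resolution.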

Oversubscription Management

Network congestion increases as thousands of new customers are added to an IP network. Oversubscription management looks at usage patterns and analyses traffic and network capacity in real time. However, for this information to be accurate, carriers must have a view into their whole network. With up-to-date network data from inventory management this is achievable. With the ability to view the entire current network, service providers can adjust oversubscription management based on experienced performance and quality to ensure that performance does not suffer.
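A toy version of the oversubscription check, with assumed figures, might look like this:

```python
# Illustrative oversubscription calculation of the kind an inventory-fed
# management tool might run: compare the bandwidth sold to subscribers
# against the physical capacity of the shared uplink. All figures assumed.

def oversubscription_ratio(subscribers, rate_per_sub_bps, uplink_bps):
    """Sold bandwidth divided by physical capacity on the shared link."""
    return (subscribers * rate_per_sub_bps) / uplink_bps

# 2,000 subscribers sold 10 Mbit/s each over a 1 Gbit/s uplink
ratio = oversubscription_ratio(2_000, 10_000_000, 1_000_000_000)
# a 20:1 ratio may be acceptable for best-effort data, but is risky once
# VoIP traffic must be guaranteed bandwidth on the same link
```

The calculation is trivial; what makes it useful, per the article, is that the subscriber counts, sold rates and uplink capacities feeding it are accurate, and that accuracy is exactly what the inventory system supplies.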

Traffic Prioritisation

For most consumers, static-filled calls are not acceptable. Therefore, to ensure QoS, a carrier needs to effectively monitor and prioritise the types of packets running through the network. Voice needs bandwidth. If there is not enough bandwidth available, voice quality deteriorates into 'static' or 'jitter', and dropped calls become an inherent problem.
A carrier needs to enforce traffic prioritisation policies to ensure that voice and video are prioritised over data transmission activities. Enforcing and adjusting these policies is paramount, especially when offering triple-play services that all vie for bandwidth on the same network. All of this revolves around accurate network inventory data to maintain bandwidth at levels that match network activity levels.
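A strict-priority scheduler of the kind implied above can be sketched in a few lines. The three-class scheme and the class names are illustrative assumptions; real carrier equipment implements this in hardware with more nuanced queueing disciplines.

```python
# Toy strict-priority scheduler: voice frames always leave before video,
# and video before data, with FIFO order preserved within each class.

import heapq

PRIORITY = {"voice": 0, "video": 1, "data": 2}  # lower number wins

class PriorityScheduler:
    def __init__(self):
        self._q = []
        self._seq = 0   # tie-breaker keeps FIFO order within a class

    def enqueue(self, traffic_class, frame):
        heapq.heappush(self._q, (PRIORITY[traffic_class], self._seq, frame))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._q)[2]

sched = PriorityScheduler()
sched.enqueue("data", "d1")
sched.enqueue("voice", "v1")
sched.enqueue("video", "tv1")
order = [sched.dequeue() for _ in range(3)]
# voice leaves first, then video, then data -- regardless of arrival order
```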

Self-Service Feature Management

Creating individual, made-to-order bundles of VoIP services is required to stay competitive and is heavily dependent upon accurate network inventory information. Through web-based customer sites or by phone, users should be able to change their VoIP features in real time. For example, customers may go online and adjust their call forwarding, call waiting, or voice mail parameters. Importantly, customers don't expect to have to wait for these changes to be applied; they want them to be instantaneous. This capability requires accurate customer information and the associated network details.
Another unique VoIP service feature is virtual phone numbers. It is possible to have a VoIP number reflect where a person or company would like to have virtual offices or presence in other countries. For instance someone from the United States might want to have a virtual office in the United Kingdom. Using VoIP, they can now have a number that matches UK phone numbering conventions even while all calls are routed back to the original US number. These unique enhanced service features require a network inventory that is flexible and integrates with other processes. VoIP brings to the table all these new service features that didn't exist in the traditional POTS environment.
IPTV deployment challenges mirror those of VoIP, only they tend to be intensified in scope. IPTV requires a bevy of new supporting equipment, from encoders and video-on-demand servers to video compression and IP encapsulation. For many carriers today, IPTV is the next big service they intend to roll out, and VoIP is their introduction to IP services.
Accurate inventory, traffic prioritisation, oversubscription management and self-service feature management are just a few of the areas that must be managed well to generate profits from VoIP and IPTV services.  An inventory-based OSS delivers the capabilities that allow carriers to see their whole network, add services and capacity to maximise their resources, and analyse results for future planning.
NetCracker Technology's OSS Solution is improving how carriers roll out next-gen services such as VoIP, IPTV, Fibre-to-the-Home and others. The Solution includes industry-proven inventory-based OSS software and the professional services delivery expertise to make it happen. NetCracker customers include Telstra, Australia's largest carrier; Telus, Canada's second largest carrier; MGTS, one of the largest wireline providers in Europe; and Covad, a leading North American broadband service provider, among others.

A moveable feast

VoIP and IPTV do have their unique challenges. However, combined with a solid inventory-management OSS, service providers can generate higher revenues, increase market share, maximise network resources and remain competitive. In a multi-service transport environment, IP's additional equipment and design parameters need to be kept in sync using inventory management.

Julie Wingerter, Vice President of Strategy, NetCracker Technology   www.netcracker.com

Can MMS -- as predicted by many in the industry -- produce the knockout blow that will fuel greater revenues and
opportunities for all, or will it be marginalised as a stepping stone to 3G services? Terry Ernest-Jones investigates

The MMS market is forecast to be worth $161.3bn in 2009, by which time it will be well-established as a day-to-day feature of the mobile mass market. Yet so far, MMS has failed to deliver on its promise for many of its users, and its potential to enrich mobile communications has not been realised. It has recently passed through the early adopter phase, and the MMS industry continues to wrestle with problems such as handset compatibility, digital rights management, and pricing models. However, the major opportunity, of receiving and sending multimedia messages on the move -- as easily as with SMS -- gets closer by the month.
MMS brings multimedia features such as photos, sound, video, rich text or interactive applications to mobile messaging. This can take the form of a message sent between mobile phone users ('peer-to-peer') or, of equal importance, a message sent from a third party to a user ('server-to-mobile'). An MMS message can be compared with a scaled-down PowerPoint presentation, able to contain a variety of media. By contrast, Short Messaging Service (SMS) -- which has paved the way for MMS -- only allows basic text-messaging of up to 160 characters.
Technically speaking, MMS originates from mobile messaging standards defined by the Third Generation Partnership Project (3GPP) and the WAP Forum (which has since merged into the 'Open Mobile Alliance'). It has only really been up and running for about three years. Amongst the countries where uptake is strongest are Japan, South Korea, Germany and the Nordic region. The USA is less developed, reflecting the generally lower mobile handset penetration in the region. In between are countries such as the UK, displaying roughly average MMS usage levels: by the end of the first quarter of 2004, out of a total of 47.5 million subscribers to the four main UK mobile networks, 11.3 million MMS-active devices were registered, according to the Mobile Data Association. This gives an MMS penetration rate of 24 per cent.
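The penetration figure quoted follows directly from the Mobile Data Association numbers:

```python
# UK MMS penetration, Q1 2004 (figures as quoted in the text)
subscribers = 47.5e6   # total subscribers, four main UK networks
mms_devices = 11.3e6   # registered MMS-active devices

penetration = mms_devices / subscribers * 100
print(f"{penetration:.0f} per cent")  # rounds to 24 per cent
```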
MMS requires special handsets with colour screens and, usually, built-in cameras. When the first Juniper Research report on MMS appeared in 2002, there were just two MMS phone models. Now there are hundreds. They have already brought in significant revenues for the leading MMS handset suppliers such as NEC, Nokia, Samsung, Sony-Ericsson, Panasonic and LG Electronics. But for operators urgently looking for new revenue sources, and other suppliers in the value chain, MMS offers the chance to build on new consumer behaviour in messaging -- for example linking audience participation into digital TV programmes -- driving new data revenues and raising ARPU, as well as handset replacement rates.

Lifting the barriers

Whilst MMS offers a leap in mobile phone usage and appeal, it must be emphasised that until now it has also been a frustration for large numbers of users, even for basic functions such as exchanging photos between mobiles. However, many of the compatibility and interoperability obstacles that have beset MMS will be ironed out over the next two years, allowing a freer flow of multimedia messages, approaching the level of today's SMS. A major advantage for MMS is that, following in the wake of SMS, it can slot into customers' existing mobile usage habits. The downside is that user expectations have been set to require the same standard of service, and smooth operation, as they get from SMS.
Fortunately for the MMS industry, users do however expect to pay for mobile services. (This is not by any means the case with, say, the Internet.) MMS presents a large revenue opportunity, not only in providing enhanced peer-to-peer messaging, but also content and application-based services. As yet few suppliers are making any money out of it. But MMS provides opportunities to sell a range of enabling technologies. This will not be realised until operators and all other suppliers in the industry a) understand the dynamics of the value-chain, b) adhere to industry standards for interoperability, and c) fine-tune the infrastructure needed to exploit MMS.

The MMS value-chain

As MMS carries multimedia into the mobile telecommunications market, a new range of suppliers have been drawn to the mobile channel to reach a wider client base. Just as the web has linked the IT and media markets, MMS is bringing content providers and application developers in touch with mobile operators. There are several instances where these players are working in close co-operation to create a multimedia user experience, deliver attractive services, and provide end-to-end solutions that stimulate the market and create new needs.
Ease of use and intuitive interfaces are fundamental requirements for end-users -- neither of which are as yet properly addressed by the MMS industry. If, say, a young teenager is in a clothes store and wants to get the OK from his/her parent to buy an article of clothing via an MMS picture, the operation must be transparent and quick. Minutes spent overcoming the handset's technical hurdles in a busy clothes store will defeat the purpose of this and other useful applications.
From the mobile end-user's perspective, MMS involves a radical shift: in effect moving the focus of their attention from the ear to the eye. As so often with technology innovations, end-users are neglected as the key element in the value-chain, whilst the industry races ahead with 'push'. So far as MMS is concerned, the youth market is the main driver -- indeed some studies say that most MMS buying activity is seen amongst 15 to 17 year olds. Each age group needs to be taken into account, however: there is a large mobile population of people aged 60+. Messages sent to them, of grandchildren's birthdays for example, should not be ignored under the misconception that MMS is 'youth only'.
A feature of young people's lives today which works in MMS' favour is that they increasingly record the details of their lives online, and both sexes have a stronger urge than in previous times to share their experiences with friends.
Taking the UK as an average, around 25 per cent of handsets are MMS-enabled. Yet most young users of MMS phones today regard it -- initially at least -- as a disappointment. Typically, initial efforts to send photos to friends fail. Tales abound of handset makers blaming operators and vice-versa when frustrated users contact help centres to overcome problems. MMS represents an investment for end-users, who buy an MMS-capable handset -- costing up to $500 plus services. The mass market will only be willing to make this investment if users can derive real value out of using MMS. There are several compelling value propositions linked to MMS at its most basic level:
*  MMS handsets in themselves add kudos with their basic features of colour screens and polyphonic ringtones.
*  MMS offers the possibility to enhance the popular peer-to-peer messaging with the addition of photos and sound. More importantly, pictures and sound allow the user to add personality and emotional content to the messaging experience, and share this with friends.
*  MMS information and entertainment services promise unrivalled personalisation possibilities, and will enable users to access content anytime, anywhere.


Operators play a pivotal role in the value-chain. As such, they must focus closely on the end-user's needs and guide their business partners to help them develop offers that will satisfy end-users. (Presently, this is often the other way round.) This requires more than simply supplying 'jazzed-up' SMS messages; operators need to reach into new models such as T-Mobile's for the Euro 2004 football tournament. This scheme included MMS picture messages sent at intervals during each of a team's matches -- the whole package costing $4.50 per team for picture, or $10 for video, updates. (T-Mobile was contacted for this report to state the result of the programme, but declined to give figures. Instead it commented that it was "very pleased with both the level of uptake and also the technical performance of the delivery of the alert services.")

Wide range of services

To make their MMS offer complete, operators are having to provide a wide range of services, including the potential to create and store messages, and a wide choice of content and applications. In addition to these core services, operators are also continuing to add infrastructure to ensure that the value-chain functions properly. It's still early days though. For example Virgin Mobile only launched picture messaging in July 2004. The industry is still wrestling hard with how to adapt end-user billing systems to MMS messages, not to mention the task of ensuring that MMS content flows seamlessly from the original provider to the end-user. Upstream, it is also the operators' role to ensure that third-party providers collect their revenue, since it is the operators who manage end-user billing.
The rewards that operators can reap from successful implementation of MMS are huge. MMS provides a much needed boost in ARPU and a justification for investments in General Packet Radio Service (GPRS) and Third Generation (3G) networks, and fosters strong customer loyalty. (3G cost operators around $120 billion in licences alone.) Efforts are underway to solve interoperability problems, but there is still a long way to go. "Operators have worked hard in previous months [on interoperability]," says Sandy Ryrie, messaging chief with operator O2. "We want to bring the same level of confidence to MMS as there is in SMS."
Generally today users are introduced to MMS services, whether they have specified it or not, simply by upgrading their handset. A wider range of MMS-capable devices is beginning to appear, catering to the needs of different market segments, from the prepaid youth segment to the high-end business user. For manufacturers, MMS presents a major opportunity.
Megapixel cameras are raising the quality of images, but generally cannot as yet be handled adequately by operators. Meanwhile, the handset is turning into something of a Swiss Army Knife, and some executives, such as the former technical director for Symbian, Simon East, have branched out to focus on the photo printing and image quality of MMS phones. His company Cognima aims to provide a 'single key press' for printing from phones.
Much of the effectiveness of MMS messaging today depends a) on how new a handset is and b) on whether the vendor has followed internationally agreed standards to ensure interworking. Meanwhile, following the classic pattern for new technology devices, the emphasis is on launching handsets with dazzling features such as video cameras, megapixel resolution, 180-degree swivel lenses, 4x zoom, 265,000-colour screens and image-editing features -- in a wide variety of inventive, slick designs, and weighing from around 85g.

Infrastructure vendors

Telecommunications software vendors provide the all-important MMS Centre (MMSC): the server that manages all MMS message flows within an operator's network, and handles addressing, filtering, and temporary message storage. (WAP then carries the MMS message from the MMSC to the mobile.) Operators are continuing to invest significantly in infrastructure for MMS, and there is a growing industry supplying it, with companies such as LogicaCMG, Comverse, Nokia, Sema, Ericsson, Motorola and Alcatel. Managing the sheer volume of information that flows through is already a challenge, but more importantly, MMSCs need to be able to deal with a host of different formats and profiles. Content usually needs to be altered before presentation on the receiver's phone.
One of the MMSC's most useful functions today, once a message has been received, is to discover the configuration of the handset to which it is being sent, and adapt the format of the message so it can be accepted by that equipment. Even video messages can be adapted so they can be received if the recipient doesn't have a video handset.
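This adaptation step can be pictured as a device-profile lookup followed by a format decision. The sketch below is illustrative only: the profiles and rules are hypothetical and do not reflect any vendor's actual MMSC logic.

```python
# Illustrative sketch of MMSC content adaptation: look up the recipient
# handset's capabilities and downgrade the message format accordingly.
# Profile names and capabilities are hypothetical.

HANDSET_PROFILES = {
    "basic-mms": {"video": False, "max_image_px": 160},
    "video-mms": {"video": True, "max_image_px": 640},
}

def adapt_message(content_type: str, handset: str) -> str:
    """Return the content type actually delivered to the handset."""
    profile = HANDSET_PROFILES.get(handset, {"video": False, "max_image_px": 0})
    if content_type == "video" and not profile["video"]:
        # e.g. extract a still frame so the message can still be delivered
        return "image"
    return content_type

print(adapt_message("video", "basic-mms"))  # delivered as an image
print(adapt_message("video", "video-mms"))  # delivered as video
```

Real MMSCs obtain these capabilities from device-profile databases and perform actual transcoding; the decision logic above only shows the shape of the problem.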
Certain infrastructure components are required to manage the store and forward functions of MMS. MMSCs have to connect into other network components, and a network must also be WAP-enabled and at least capable of handling GPRS. Apart from the MMSC, there are other infrastructure elements that must be implemented in order to offer effective MMS services:
*  End-user billing adapted to the nature of MMS messages.
*  Inter-operator billing adapted to the nature of MMS to deal with cross-networks messages and roaming.
*  Revenue sharing mechanisms to allow the automated redistribution of revenues across the content and applications value-chain.
*  Digital Rights Management solutions that can identify copyright-protected content in Peer-to-Peer messages.
*  Security to ensure that valuable content is protected.
*  Application and content gateways to allow third parties to link into the MMSC.
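The revenue-sharing mechanism in the list above is, in essence, an automated split of each end-user charge across the value-chain. A minimal sketch, with entirely hypothetical shares (real splits are commercially negotiated between operators, content providers and intermediaries):

```python
# Hypothetical revenue-sharing split of an end-user MMS content charge.
# The percentages are illustrative, not any operator's actual terms.

SHARES = {"operator": 0.50, "content_provider": 0.40, "aggregator": 0.10}

def split_revenue(charge: float) -> dict:
    """Distribute a single end-user charge across the value-chain."""
    return {party: round(charge * share, 2) for party, share in SHARES.items()}

print(split_revenue(4.50))  # e.g. the $4.50 picture package mentioned earlier
```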
There is a further industry supplying technologies around MMSCs. TCS (TeleCommunication Systems) for instance provides messaging services for global operators. It doesn't sell MMSCs but has enabling technologies to enhance them -- such as providing a single domain for MMS.
MMSCs are not used only by large operators. For instance, in June 2004 Comverse announced its 'Compact MMSC', an entry-level solution for smaller wireless operators. Compact MMSC has recently been deployed by two operators in Asia.

Content providers

MMS represents a new channel to market for information and content providers. Unlike the Internet or WAP, MMS provides a clear revenue opportunity for media suppliers, who can justify claiming a share of the traffic revenues that operators collect from end-users. As the market matures and billing structures evolve, they will be able to provide their services directly to the market, potentially using operators only as distributors, for example for a 'joke of the day' service.
A wide range of content types can prove effective over the MMS bearer, but content must be created specifically for this new channel in order to be successful -- taking into account the effect of new user behaviour as usage evolves. MMS as a distribution channel is superior to traditional channels in two ways: it offers the possibility to create finely personalised content, and can be accessed by people on the move.
MMS is also a compelling advertising channel. It enables the building of highly targeted campaigns, and the communication of clear messages directly into the hands of the intended recipient. MMS will also be an extremely effective conveyor for 'viral' marketing campaigns.
It is quite possible that the usage of MMS to supply third party content to the phone, rather than peer-to-peer messaging, will eventually steer the market. Many involved with the MMS industry believe that MMS could ultimately be driven by major events -- sporting or otherwise. Moreover, it is the ideal platform for instant updates and alerts on events of special interest, which can then be forwarded to friends.

Applications developers

MMS enables interaction between mobile handsets and networked servers. There are numerous ways to translate these interactions into concrete applications that provide value to the market. In the consumer market, MMS can support TV quizzes, polls and eventually video games. Operators will be eager to offer such applications, as they generate recurring traffic and create customer satisfaction. Content providers and advertisers are also interested in providing such applications as marketing tools. On the business side, corporate applications like ERP or CRM systems can use MMS to develop a mobile extension and reach remote and mobile workers, providing them with a permanent link to their company and customer databases or e-mail.
Attractive and compelling content, rather than exotic handset designs and features, will be the factor that ultimately drives MMS, though. SMS was the mobile success story of the 1990s, and the jury is out on whether MMS will turn out to be the success story of the present decade. Most likely the main role of MMS will be as a stepping-stone for the multimedia applications and services that will drive 3G. Either way, video is the next step -- a natural progression from what MMS does best: 'sharing the moment'.
The main focus is on consumers, who will undoubtedly drive the MMS market in the near future. There is more to MMS, though, than multimedia 'infotainment' for consumers. There is also the possibility of developing interactive mobile business communications and applications. Handset manufacturers are well aware of this, so that, for instance, the Siemens CX65 business-user mobile launched in the summer of 2004 comes not only with a digital stills camera, but also video, taking clips of up to 15 seconds. Indeed, business use of MMS technologies is forecast to become a $64.1bn industry by 2009.

Terry Ernest-Jones is an Associate Consultant with Juniper Research   www.juniperresearch.com


