Analysis

Mike Hill explains why it is now more critical than ever for organisations to store and be able to retrieve electronic data

Compliance is the buzz-word of the moment. For many companies in highly regulated industries such as healthcare and pharmaceuticals, it's been on their priority list for a while. But since the high-profile corporate scandals at Enron, WorldCom, Parmalat and others, it's now moved onto the agenda of every organisation.
Gartner Group estimates that some large and mid-sized enterprises will spend as much as $2m each during the current year to become compliant with legislation such as Sarbanes-Oxley. Will they spend this money wisely, and how can they best benefit from the investment?
Up to now, many compliance requirements have been partially addressed by separate IT products providing point solutions in different areas, including backup and archiving. As a result, organisations have tools that, to a greater or lesser extent, allow them to store and archive data and comply with data retention requirements, but the data is usually in multiple locations and in diverse formats. Frequently the organisation has little idea of what data is actually being retained, or what it contains.
Banks and other financial institutions are becoming increasingly regulated and are now required to store all electronic communications related to their business, and retain them for several years. This requirement may also apply to any organisation that accepts orders, changes to orders, invoices, credit notes or any similar accounting documentation electronically. Archive systems are all well and good, but some require the whole archive to be restored before a single specific item can be retrieved.
The rub often comes when an organisation is asked to retrieve a particular detail, an e-mail message for example. In a recently reported case in the USA, Perot Systems claimed that it was going to cost $4.7m to retrieve some specific e-mails requested for a court case.
To make matters more complex, some legal requirements also impose a time constraint: the Data Protection Act requires that you produce requested information within 40 days of the request. The Freedom of Information Act also applies a time limit for compliance with a request. This may be 20 or 60 working days depending on the type of organisation and some other parameters.
It's of no real benefit having the world's most comprehensive archiving and storage system if you can't retrieve selected items from it quickly and easily without disrupting your normal business. Your data retention policy should specify what kinds of data are stored and what kinds are discarded. Wouldn't it make sense for your archiving and storage system to implement this automatically? That way you could ensure that you're storing only the information that is both relevant to your business and necessary to achieve legal and regulatory compliance.

'Intelligent' need

There's a need for 'intelligent' storage of data. The archiving and storage system needs to understand both the type and the content of the information being stored. Then you could avoid storing unnecessary information: storage is cheap, but there's no point in paying to use it if you don't have to. Furthermore, you could identify business data and communications that do not need to be retained for compliance, and simply not store them. Then you can't be required to retrieve them: you can only retrieve information you have stored, after all.
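As a rough illustration, such a content-aware retention rule might be sketched as below; the categories and retention periods are purely assumed for the example, not drawn from any particular regulation.

    from datetime import timedelta

    # Illustrative retention rules: category -> how long to keep, or None to discard.
    # The categories and periods are assumptions, not drawn from any regulation.
    RETENTION_RULES = {
        "order": timedelta(days=6 * 365),       # accounting records: keep for years
        "invoice": timedelta(days=6 * 365),
        "internal_chat": None,                  # no compliance value: never archived
        "press_release": timedelta(days=365),
    }

    def retention_period(category):
        """How long an item must be kept, or None if it need not be stored at all."""
        return RETENTION_RULES.get(category, timedelta(days=365))  # default: one year

    def should_archive(category):
        return retention_period(category) is not None

    print(should_archive("invoice"))        # True: stored, and so retrievable
    print(should_archive("internal_chat"))  # False: never stored, never retrievable

The point of the sketch is the last line: what the policy says need not be stored can never be demanded back.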
However, if you talk to your backup specialist, or your IT department, they will undoubtedly confirm that backing up and archiving material is difficult: users tend to keep things on their local machines, and it's hard to back up laptops and hand-held devices because they're never in one place long enough. And then there are those home-based telecommuters who rarely visit the office.
Perhaps the best place to monitor and record both business information and electronic communications is the network itself. Eventually almost every document, memo, spreadsheet, invoice, work order and press release your organisation generates or receives will pass, in electronic form, over your network. There are suitable systems available on the market today that simply plug into your network.
If you are going to implement one of these systems you could also use the same capability to monitor the entire organisation's electronic communications: e-mail, instant messenger (IM), web-mail, Internet downloads and so on.
IM is a terrifically useful tool. It can enable both one-to-one communications and group communications within your business. Unlike e-mail, you know immediately that the other party has received your message. In stockbroking, some clients have found it useful for issuing instructions to their broker in real time. But sadly many organisations, particularly in financial services, believe that it's impossible to record and retain this form of communication. Because failing to retain it would likely be interpreted as a breach of statutory duty, they have blocked use of IM on their systems entirely.

Head in the sand

This may be one way of achieving compliance, but in my view it's a head-in-the-sand attitude to a technology that has the capability of truly changing the way many of us communicate. Better to enable it, but to record and store it in a form that both achieves compliance and enables IM conversations to be retrieved quickly and easily. Not only are there systems available that do this, but some allow all forms of electronic communications to be monitored, recorded and searched in one place, irrespective of protocol, application or file type. This means that if the organisation needs to retrieve some specific records, it doesn't need to search multiple archives in multiple applications and try to string them together chronologically: it can search one system for all relevant communications, whatever the protocol or application used.
If the system understands the content of all this data as it's being stored then it could even highlight, and alert upon, items of interest as they are stored. It's only one small step further to configure such a system to alert if the traffic appears to breach internal policies such as your internet acceptable use policy.
Why might you want to do that? Well, there are a number of things, loosely categorised as risks, which may be discovered, and acted upon, within the content of archives or backups. These include legal liabilities such as: employee harassment by e-mail; defamation by someone within your organisation for which your organisation may be held responsible; transmission of viruses or worms, which may be construed as negligence; and copyright infringement. Perhaps someone in your organisation is using your network to download copyright music, or pornographic material. In some jurisdictions the organisation may be liable for failing to take action to prevent the individual committing the act.
Other things you might also be able to identify include security breaches, such as transmission of confidential material from inside your organisation, and illicit or illegal activities, such as money laundering by a client or by a member of staff. It is not unknown for criminals to conduct their activities at their place of work and use their employer's computer systems to do it. There were several cases last year of employees downloading paedophilic material onto their work computer systems. In one widely reported instance the employer was unaware until the employee's girlfriend told them.

Inappropriate use

And finally you could detect inappropriate use of your computer systems. Are your employees always working for you, or do they book their holidays or gamble at online casinos during working hours? Would you know if they were?
Because all activity is recorded you would have documentary evidence should you need to take any matter further. This might include reporting suspicious transactions to the money laundering authorities, or taking disciplinary action against an employee.
But are you permitted to do this? In most instances you are allowed, and even required, to do this for business purposes, but what if your employees also send private e-mail? Legislation such as the Regulation of Investigatory Powers Act, the Human Rights Act and the Data Protection Act appears to limit, or restrict, the right of an organisation to monitor the electronic communications of its staff. Under the Regulation of Investigatory Powers Act 2000, monitoring and storing employees' private e-mails (if you allow them reasonable private use of business systems, as most organisations do) is a breach of statutory duty unless you have their consent and the consent of their correspondent.
The answer here is to monitor and record, but also to inform your employees that you are doing so; you must include this in your communications policy and state that their first private use of business systems will be their deemed consent to the monitoring. This allows them to make an informed decision about whether or not they want to send and receive private e-mails at work. This procedure is relatively easy for your employees, staff or students, but how do you get the consent of external correspondents? Look at what the international and city firms of solicitors are doing. They put a statement at the end of all their e-mails warning that they will monitor and record e-mails, and that continued e-mail correspondence with their employees in any capacity will be deemed consent to the monitoring by both parties.
The same general principle holds good in cases of your employees visiting unacceptable Internet sites.
Updating your communications policy, your Internet acceptable use policy and your employees' terms and conditions of employment may be necessary to ensure that you comply with the legislation that protects your employees' rights while you implement systems to comply with the legislation affecting your business -- and to accrue the greatest business benefit from doing so.

Mike Hill, Vice President, Marketing, Chronicle Solutions (UK), can be contacted via tel: +44 7775 923 910 or +44 1494 672 999; e-mail: mike.hill@chroniclesolutions.com
www.chroniclesolutions.com/uk

FTSA, the parent company of the France Telecom group, along with its constituent companies, is experiencing the benefits of introducing e-learning into the development strategies for its 200,000 or so employees throughout the world. Bob Little reports

Although France Telecom has been using some elements of computer-based learning since 1993, it took its first tentative steps in e-learning some five years ago. It is only in the last year or so, however, that e-learning within the group has begun to take off.
"In the summer of 2004, we carried out a controlled test on the effectiveness and use of e-learning within France Telecom," explains Yves Scaviner, deputy manager for group training at France Telecom. "We asked 1500 of the company's managers to work through a number of e-learning courses -- delivered through the medium of both French and English -- and over 60 per cent were won over to e-learning as a result of this experience. And now they have become real ambassadors for e-learning and the spearhead for its deployment in operational units throughout France."
The France Telecom (FT) Group comprises five major subsidiaries: TP Group, based in Poland; Orange, which has a presence in 17 countries; Equant, which provides services for the top 3,000 multinational companies in 200 countries across the world; Wanadoo, the Internet connection provider; and FTSA, the parent company of France Telecom.
"It's not easy to change a prevailing corporate training culture and implement e-learning overnight," says Scaviner. "Face-to-face training is the traditionally accepted method of learning which everyone understands -- even if it doesn't suit their individual learning style.
 "The secret of introducing e-learning and gaining rank-and-file acceptance of it as a learning delivery method is not only to have high profile endorsement from senior management but also to convince line managers of the benefits and advantages of e-learning. Moreover, you also have to prove to employees that e-learning is not 'second class learning' simply because it rubs against traditional classroom learning.
 "E-learning offers many benefits and advantages over more traditional methods of training delivery," Scaviner explains. "It is true that developing and using e-learning can result in major cost savings -- especially where training large numbers of employees is concerned. However, while this is a significant reason for France Telecom, it is not the main reason that we are increasing our use of e-learning.
"As a company, our business is in providing the infrastructure to encourage and enhance 'e-transformation' -- so, in embracing e-learning, we are helping to set an example to our clients and suppliers.
"But, for us, the most important benefit of e-learning is that it is a more efficient way of presenting learning than via the classroom," he continued. "Our studies have shown that learners learn faster when they use e-learning, compared with conventional classroom-based teaching methods. Typically, we have seen that what takes some six hours to teach in a classroom can be learnt in four hours in the virtual classroom and three hours if done via distance learning. This makes e-learning a highly efficient as well as cost-effective way to learn."
Mindful of the dangers of putting all its eggs in one basket, while FTSA is increasing the emphasis it places on e-learning within the companies in its group, Scaviner is also keen to stress that its human resource development strategy is dependent on a 'blended approach' -- that is, a mixture of classroom-delivered training with virtual classroom and distance learning inter alia.
Having realised the cost-effectiveness of e-learning, compared with other learning delivery methods -- especially for companies with widely dispersed workforces, such as Equant -- FTSA is actively pursuing a strategy that will result in half of its training/learning activities being delivered via e-learning in 2006. The current volume of training/learning delivered via e-learning within the group is some 20 per cent. This e-learning comprises a mixture of custom-built e-learning, mainly developed in-house, and generic e-learning courseware from two worldwide suppliers of training software.
"Where transferable skills are concerned, we do not want to produce in-house what is already available in the marketplace," says Scaviner. "That is why we have bought licences for some 3,000 generic e-learning courses.

Learning path

"Although France Telecom's 200,000 employees have theoretical access to each of these courses, in practice FT training staff choose a learning path for each learner based on that person's revealed training needs," Scaviner adds. "Currently, the most accessed programmes cover general management issues; managing meetings; discovering your management style; managing stress, and motivating staff."
According to France Telecom's Odile Demery: "E-learning -- both virtual classroom and distance learning -- can be delivered to learners' desktops, but many France Telecom employees work in open-plan spaces where there is a greater chance of being disturbed during their learning time. For this reason, France Telecom has made available some 400 dedicated training booths around the country -- known as 'Espace Clic-n-Learn' -- where individuals can study undisturbed. So the booths, which can be pre-booked online, offer an ideal solution."
"We are popularising e-learning throughout France © Telecom via a number of initiatives, including posters and mousemats advertising 'Espace Clic-n-Learn'," said Demery.
"Ultimately the popularity of e-learning will depend on the 'me too' factor -- as people see their colleagues visiting the Clic-n-Learn booths and benefiting from their new knowledge and skills."

CASE STUDY 1

Christine Skelhorn, head of training at Orange, passionately believes that e-learning is the way forward for Orange. Along with her training team and e-learning co-ordinator Amanda Yarrow, she is committed to providing all Orange employees with access to innovative, effective and enjoyable learning. At the beginning of 2004, Orange launched its e-learning strategy with a Corporate Induction module, developed with TATA Interactive Systems (TIS).
In its ten years of existence, Orange has grown to some 12,000 employees in the UK. As the company continues to grow so, too, does its requirement to recruit staff -- especially in the customer services field -- and give them induction training. Until the advent of the e-learning materials, the corporate induction programme was a three-hour PowerPoint presentation conducted, as required, by any of the company's available trainers.
"We are delighted at the positive feedback we've received to the induction module," says Amanda Yarrow. "In particular, it's been a real winner with new starters across the business.
"Users range from engineers who have worked at Orange for many years to newly recruited customer service staff. They all seem to like the image we've adopted of a 'fairy godmother'. This virtual entity guides them through the programme and helps to dispel any 'techno' fears that they may have. She also helps to add meaning and significance to the contents of the induction programme.
"Importantly, TIS seems to have hit on a style of presentation which appeals to everyone in the company," Amanda Yarrow adds.
Orange has a number of further e-learning programmes in development for 2005 and beyond.
"It's been an encouraging start with Orange Induction and we really look forward to building on our achievement in the future," concluded Yarrow.

CASE STUDY 2

In less than two years, Equant -- part of the FTSA group -- has provided ongoing e-learning to some 9,500 employees worldwide, saving $6.5m in the process.
Equant operates a worldwide telecommunications network that manages 152,000 user connections across 220 countries for some 3,700 customers. Its employees need ongoing training in a range of topics including IT skills, project management, problem solving and negotiation skills.
With so many of its employees based throughout the world, Equant knew that traditional classroom-based training was time-consuming and costly, so it implemented a blended learning strategy integrating elements of classroom-based training with e-learning. The bulk of its e-learning programmes are provided by SkillSoft's IT and business related courses. SkillSoft also provides a '24/7' mentoring service to support its IT curriculum.
By the end of 2003, Equant had over 8,500 of its employees using e-learning -- a user rate of over 85 per cent -- at a cost per employee of less than $100 for 24/7 access to 700 courses.
Access to e-learning is:
*  Having a positive impact on levels of staff retention at Equant.
*  Helping employees to become proficient in their jobs more quickly -- thus reducing costs, increasing productivity and revenue.
*  Enabling employees based in more remote locations to feel a closer part of the Equant 'family' and providing more development opportunities than were previously available to them.                               

Bob Little is a freelance communications writer

Providing a consistent level of Ethernet service is vital if carriers are to make the most of its potential, says Fred Ellefson

European carriers have embraced Ethernet services and, according to the Probe Group, are on track to grow the European Ethernet service market at a 40 per cent CAGR to provide almost €4bn worth of services by 2008. With the success of this service, its profitability for carriers is paramount. The capex side of delivering Ethernet services is extremely attractive, with Ethernet port costs approaching one tenth of comparable SDH port costs, according to Network Strategy Partners, LLC.
However, the opex side of delivering Ethernet services over a five-year service contract will end up dwarfing the original capex for delivering the service, and is substantially higher than for traditional SDH or PDH based services (see Figure 1, right). This is critical because Ethernet service price points are typically much lower per megabit than traditional SDH/PDH based data services. Opex costs can make delivering profitable Ethernet services a challenge.
The original Ethernet standards were designed for the enterprise LAN environment and do not have the operations, administration and maintenance (OAM) capabilities that carriers require to deliver a WAN service profitably on a wide scale. While manpower can be thrown at the problem when the number of customers is small, this approach will not scale as Ethernet services move down market to the millions of small and medium enterprise customers necessary to grow the market.
Fortunately, the standards bodies have recognised this problem and have been working on adding the OAM capabilities required to allow Ethernet services to scale. The first OAM standards were ratified in the middle of last year by the IEEE 802.3ah Ethernet in the First Mile (EFM) standards group. While most of the media attention has been focused on the copper transport side of this standard, the bigger impact on carrier bottom lines will come from the OAM advances found in the standard. The ability to remotely monitor and perform maintenance will eliminate expensive truck rolls and dramatically improve the profitability and margin of Ethernet services. According to studies by Covaro, these new standards will reduce truck rolls (and opex) by 47 per cent versus unmanaged Ethernet services and can greatly reduce the total cost of delivering Ethernet services (see Figure 2, opposite).
In addition, ITU and Metro Ethernet Forum (MEF) standards bodies are developing standards for end-to-end service monitoring/testing, while IEEE 802.1ag/ITU are also working on connectivity standards for multi-point to multi-point services. These standards will be ratified in late 2005 or 2006, and together with IEEE 802.3ah, will provide a layered set of OAM capabilities analogous to OAM capabilities found in SDH and PDH standards. 

Ethernet demarcation

With these new OAM standards, the biggest changes and impact will occur at the customer premises, where the service is delivered to the end user. Traditional data services incorporate a demarcation device, such as a smartjack or CSU/DSU, to provide remote monitoring and test capabilities; these devices have been a critical factor in reducing opex and ensuring the profitability of those services. In the Ethernet world, carriers have had to improvise using Ethernet switches, routers, media converters or even SDH ADMs to perform this function. However, these devices are not up to the challenge: they cannot perform even the simplest of demarcation functions, such as loopbacks or test generation, and often are not particularly reliable. Many enterprise users are shocked to see the same router that they have to reset every week in their own network installed in the telco closet by the carrier for demarcation.
A number of companies are now offering purpose-built Ethernet demarcation devices which incorporate both Network Terminal Equipment (NTE) and User Network Interface (UNI) functionality. The NTE performs the OAM capabilities defined in 802.3ah, including remote/local loopbacks, remote failure indication, fault isolation, performance monitoring with threshold alarms, status monitoring and discovery. The UNI function is aligned with MEF recommendations for Ethernet service definitions, including Committed Information Rate (CIR), Excess Information Rate (EIR), and burst size on both a port basis and an Ethernet virtual circuit (EVC) basis for VLAN-based services. These two functions are key to both defining an Ethernet service and maintaining it profitably.
The NTE function provides a full suite of RMON Etherstats plus extensions, enabling carriers to monitor SLA conformance on both sides of the demarcation point and to analyze performance trends over time. Performance data can be collected and stored in 15 minute intervals just like performance data from traditional carrier services. This valuable data provides a performance log for billing and SLA purposes, and provides advance indication of performance degradation before an outage actually occurs. Carriers can make this data available to customers through a web portal to facilitate customer network management, as is often done with private line or frame relay services. 
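As a rough sketch of the principle, counters might be rolled up into those 15-minute bins and checked against an SLA threshold along the following lines; the polling stub and the threshold value are assumptions for illustration, not any vendor's implementation.

    import time

    INTERVAL = 15 * 60    # performance data is binned in 15-minute intervals
    THRESHOLD = 1e-6      # illustrative errored-frame ratio that triggers an alarm
    performance_log = []  # one (timestamp, frames, errored_frames) entry per bin

    def poll_counters():
        """Placeholder: read RMON-style counters from the demarcation device
        (e.g. over SNMP) and return (frames, errored_frames) since the last poll."""
        return 1_000_000, 0  # stubbed values for the sketch

    def run_one_interval():
        frames = errors = 0
        deadline = time.time() + INTERVAL
        while time.time() < deadline:
            f, e = poll_counters()
            frames, errors = frames + f, errors + e
            time.sleep(10)                    # sample every ten seconds
        performance_log.append((time.time(), frames, errors))
        ratio = errors / frames if frames else 0.0
        if ratio > THRESHOLD:
            print(f"ALARM: errored-frame ratio {ratio:.2e} exceeds {THRESHOLD:.0e}")

The interval log doubles as the billing and SLA evidence described above, while the threshold check gives the early warning of degradation.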
Should a service outage occur, the NTE provides remote visibility and control of the demarcation device, reducing or eliminating the need for a truck roll. The carrier can employ remote Ethernet loopbacks along with pattern generation/detection for remote testing, and can remotely determine whether the CAT5 cable connected to the CPE is open circuited, short circuited or properly terminated. Open or short circuits can be located to the nearest metre for precise diagnosis of cabling problems, which can then typically be corrected by the customer. The EFM dying gasp message even provides the carrier with a remote indication that power has been lost -- a problem the customer can often put right without a site visit from the carrier.
The Ethernet demarcation device's UNI provides the CIR, EIR and burst parameters needed to define the QoS/CoS of Ethernet services on a port or EVC (VLAN) basis. The use of VLANs enables multiple Ethernet services, such as dedicated Internet access and/or Ethernet private line services, to be carried in a single Ethernet connection to the customer. This function is extremely visible to the end user because it defines the look, feel and personality of the Ethernet service. 
Locating the service UNI at the customer premises enables service definition and prioritisation at the point where full-rate Ethernet is rate-limited for the lower bandwidth transport and price points of services targeted at small to medium businesses. Performing this function at the rate-limiting point is essential to the proper prioritisation of such latency-sensitive services as VoIP and video. The customer premises location of the UNI also enables remote additions and changes to the service definition, eliminating truck rolls for service upgrades as well.
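For a flavour of what the UNI's bandwidth profile involves, here is a minimal, colour-blind sketch of a two-rate policer in the spirit of the MEF CIR/EIR definitions; the rates and bucket sizes in the example are arbitrary.

    import time

    class TwoRatePolicer:
        """Colour-blind sketch of a two-rate policer: frames within CIR are
        'green' (guaranteed), frames within EIR are 'yellow' (delivered if
        capacity allows), anything beyond is 'red' (dropped)."""

        def __init__(self, cir_bps, eir_bps, cbs_bytes, ebs_bytes):
            self.cir = cir_bps / 8.0          # committed rate, bytes/second
            self.eir = eir_bps / 8.0          # excess rate, bytes/second
            self.cbs, self.ebs = cbs_bytes, ebs_bytes
            self.c_tokens, self.e_tokens = cbs_bytes, ebs_bytes
            self.last = time.monotonic()

        def colour(self, frame_bytes):
            now = time.monotonic()
            elapsed, self.last = now - self.last, now
            self.c_tokens = min(self.cbs, self.c_tokens + elapsed * self.cir)
            self.e_tokens = min(self.ebs, self.e_tokens + elapsed * self.eir)
            if frame_bytes <= self.c_tokens:
                self.c_tokens -= frame_bytes
                return "green"
            if frame_bytes <= self.e_tokens:
                self.e_tokens -= frame_bytes
                return "yellow"
            return "red"

    policer = TwoRatePolicer(cir_bps=2_000_000, eir_bps=2_000_000,
                             cbs_bytes=16_000, ebs_bytes=16_000)
    print(policer.colour(1_500))  # 'green' while within the committed rate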

Consistent look and feel

In summary, in addition to delivering a dramatic improvement in opex, demarcation devices give carriers a consistent service personality, or SLA, that can be hard to achieve when delivering Ethernet services over a mixture of fibre, copper, SDH and PDH technologies. Multi-site enterprise customers are often disappointed that their service SLA and monitoring capability is limited at sites served via early generation SDH or PDH equipment. Installing a demarcation device behind an SDH ADM can provide VLAN and OAM support that the ADM cannot, enabling those sites to receive the same rich set of service capabilities as the native Ethernet sites. Providing a consistent, ubiquitous Ethernet service is key to differentiating an Ethernet service offering, increasing market share and ensuring its profitability.

Fred Ellefson is Vice President of Marketing, Covaro Networks, and can be reached via e-mail: fred.ellefson@covaro.com   www.covaro.com

Lynd Morley looks back at this year's 3GSM World Congress

It was cold in Cannes this year. Even the determined visiting joggers, who pound along the Croisette in the early hours, were swathed in woolly hats and gloves. There was something of a chill in the air from the locals as well, which might, of course, have been in some way connected to the fact that after ten years, the 3GSM World Congress was paying its last visit to Cannes before decamping to Barcelona next year. Given the significant boost the show delivers to the local economy, it is probably not surprising that, viewing all connected with the Congress as in some way treasonous, the waiters, bartenders and some shopkeepers surpassed even their usually accepted levels of arrogance, indifference and downright rudeness.
But inside the Palais des Festivals all was warm and glowing -- that is, once you'd recovered from the frostbite contracted while queuing to register. The atmosphere in all five of the exhibition halls was bullish, determined and positively radiant. Indeed the much discussed (and prayed for) recovery in the telecommunications industry could not have been better illustrated than by the sheer numbers at 3GSM this year. A record 34,000 participants meant that attendance was up by some 20 per cent on last year. In fact, the number of delegates, exhibitors and visitors swelled the population of Cannes to such an extent that it was nigh on impossible to find table space to grab a café au lait in any of the cafés around the Palais.
All grist to the mill, of course, as far as the GSM Association and the Informa Group were concerned, underlining -- as such figures do -- the importance of the event in the mobile calendar.
Both organisers and participants were, understandably, keen to stress the growing momentum of 3G, emphasising its continuing development through IMS and HSDPA -- whose imminent arrival has precipitated a rash of roll-out plans from the likes of Motorola, Siemens, Nortel and, of course, Ericsson, which claims to have set a new HSDPA data transfer world record of 11 Mbit/s. Questions about the availability of appropriate handsets remain, however, in the face of a rather unnerving reticence from handset vendors on the subject (resurrecting the spectre of the handset shortages that dogged 3G roll-out last year). But pundits believe the industry has learned its lesson -- a sentiment echoed by Sony Ericsson vice president Jan Wareby, who said his company will be making products available for trial this year, with commercial volumes during the first half of 2006.
While product launches, partnership plans, and development announcements -- covering every conceivable aspect of the 3G world -- abounded during the show, the buzz was particularly audible around such topics as TV to mobile devices (buzz volume increases with the assertion from Orange that some 60 per cent of its users in France watched live TV on their mobiles); and music (several more decibels on the buzzometer with the announcement of a link-up between Microsoft and Nokia on delivering music to mobile phones).
Reflecting the incredibly wide range of nationalities present at the show -- which drew visitors from the full spectrum of market maturity across different countries around the world -- the 3G community is looking to its future in such countries as China, India, Brazil and Russia, which will, according to Bharti Chairman Sunil Mittal, provide the next billion GSM customers within three years. The GSM Association and Motorola will help this process along, of course, with their promise to deliver a sub-$40 handset this year, with the aim of reducing to sub-$30 in the future. Such prices are essential to the further spread of GSM, given the GSM Association's own figures, which show that although some 80 per cent of the world's population has access to wireless coverage, only around 25 per cent can actually afford to use it.
3GSM 2005 was, by any measure, a considerable success. Significant announcements were made, networking flourished, and deals were done; exhibitors seemed more than satisfied with attendance levels; the corporate parties were lavish and exuberant; and even those professional whingers, the assembled press, complained considerably less about the media centre facilities.
A fitting farewell, perhaps, to the Cote d'Azur. Next year Barcelona -- and hopefully a better chance of getting that cup of coffee.

Just across the road from the Palais des Festivals, the Telecom Valley Gallery set up shop again this year in the famous La Potinieres du Palais.  Boasting much-welcomed heaters in the restaurant's canopy area, the Telecom Gallery offered comparative calm, mixed with that essentially French sense of stylish purposefulness.

Organised by the Telecom Valley Association, in partnership with Cote d'Azur Development (CAD) and Initiative Riviera Technologies (IRT), the Gallery is a showcase of the latest 'made in the Cote d'Azur' wireless technologies. The companies discussing their products and services over an excellent glass of wine (not to mention a superb menu), included Aequalis, Altix-EDS, Atos Origin, Devnet, ETSI, Istar/EADS, NCR Teradata, OrangeFrance, Philips Semiconductors, Smartcom, Temex, Texas Instruments, Trendium, and W3C.
Jean-Marc Dijan, President of the Telecom Valley Association, notes that these companies demonstrate that a real telecom value-chain has taken root and grown to maturity in the region, covering standardisation institutes, electronic design centres, consulting firms, software creation and integration firms, support services, research laboratories, engineering schools and so forth.
With the clear intention of attracting more companies into the area, Jean-Pierre Mascarelli, President of Cote d'Azur Development, adds that the local high tech community has been able to take advantage of the fact that the CAD provides international businesses with personal and confidential contacts to the area's business, economic and administrative networks, at no charge.
The Telecom Valley companies, like their colleagues in the main exhibition halls, came well equipped with company information and announcements, timed perfectly to coincide with the 3GSM World Congress.  Teradata, for instance (who also had a booth in one of the main halls), unveiled its data warehousing solution, Warehouse 8.0, aiming to provide businesses with breakthrough business intelligence to solve the problems of how to increase revenue, reduce expenses and identify new growth opportunities. Chris Parsons, Teradata's EMEA Industry Marketing Director, noted that for operators to achieve their declared aim of 'getting closer to the customer' (a recurring theme at 3GSM 2005) effective business intelligence is crucial. He explained that the stronger, more robust Teradata data warehouse enables customers to gain a competitive edge with a new level of business intelligence, and stressed that the company continues to enhance the warehouse suite to make it easier for businesses to integrate Teradata into their overall enterprise.
The IT services company, Atos Origin, which provides business consulting, systems integration and managed operations, was also able to highlight its capabilities at 3GSM hot on the heels of an announcement -- the company having recently implemented the LHS rating package 1.2 at T-Mobile Austria, within just seven months. The solution enables the billing of both post- and pre-paid customers via one system. A core element of the solution is that it will also be used for future 3G services, such as mobile voice and data communications via UMTS, GPRS or WLAN.
The companies gathered in the Telecoms Valley Gallery all had their own particular success stories to relate, of course. But while there's no doubt that this particular venue will be missed in Barcelona, it is to be hoped that they will continue to contribute to the 3GSM gathering -- on foreign soil.

Lynd Morley is editor of European Communications

As a mobile Internet protocol, i-mode could provide operators with a means of differentiating their services in the mobile data market, reckons Kevin Buckley

This year, the 3GSM World Congress in Cannes found the GSM world had well and truly embarked on 3G, with at least one operator -- and usually several -- having launched services in all the major markets. And not a moment too soon, as voice revenue everywhere is under pressure from competitors and, in the case of interconnect rates, from regulators. Data (beyond just SMS) is therefore charged with offsetting the decline in voice revenue growth.
In general terms, GSM world operators can be divided into two groups for the purpose of analysing their mobile Internet strategies. The first comprises the leaders -- frequently the top two players in any given market. They will usually have far more subscribers than the rest of the competition, forming a de facto duopoly and vying between themselves for the leadership position, quarter by quarter.
These operators' main challenge is to migrate their huge customer bases smoothly from 2G through 2.5G to 3G. Having learnt from their mistakes with Wireless Application Protocol (WAP) phones, which came to market in 1999-2000 before an ecosystem of well-designed, well-conceived sites existed, they concentrate on building services rather than emphasising the technology. They sell camera phones and music downloads rather than GPRS or UMTS.
As such, they introduce their consumer-oriented mobile Internet offerings as content portals on their GPRS networks, signing up subscribers so that 3G can subsequently be marketed as a speed upgrade.
The other group comprises operators that, for one reason or another, need to differentiate their offering from the rest. Some are new entrants, i.e. groups that have no 2G customer base because they came in at the 3G licensing stage and therefore need to wow potential customers into leaving their existing provider in favour of them. Others already have a 2G business but aspire to become a market-leader and so need to raise their profile as a sexy option for mobile phone users wanting content offerings along with voice.
One way this group can seek to differentiate itself is by promoting the fact that it is offering 3G telephony, running ad campaigns that emphasise the new things that can be done with the more advanced phones in terms of content acquisition, m-commerce transactions or location-based services.
At the same time, the more efficient spectrum utilisation of 3G, compared to earlier generations of technology, means that more voice calls can be carried in the same amount of spectrum, a fact the new entrants are exploiting to offer cheap voice services. These are designed to bring customers to their networks, after which it is an easier task to persuade them to start using the mobile Internet function and to acquire content.

i-mode as a differentiator

Another way to stand out from the crowd, and one we are seeing in an increasing number of markets in Europe, is with i-mode. Like WAP, this mobile Internet protocol is an overlay on the network and can operate wherever an IP layer has been deployed, i.e. from 2.5G onwards.
The question all operators now face is: how can they make money from data services? It's all very well for a mobile carrier to say it has diversified beyond voice and into data, but mobile Internet access alone is not enough to bolster revenue. It too is being commoditised as operators start to offer flat rate "all you can eat" services to attract subscribers away from competitors who don't. Remember the cautionary tale of Internet service providers in the wireline world, whose initial promise was blighted as the flat rate, always-on environment grew, forcing them to move to value-added services or die.
Let's begin by separating the provision of such services to business/enterprise customers, who want secure mobile access to key applications running on their corporate networks such as ERP, CRM or SFA, from the marketing of non-voice functions to consumers. The latter represent a mass market that, aside from mobile e-mail and text messaging, essentially boils down to the sale of content. It is the provision of data services to consumers that I want to discuss here.

DoCoMo led the way in Japan

That business is, of course, in its infancy, but there are interesting lessons from a market we at NEC know very well, namely Japan. NTT DoCoMo is universally acknowledged to have been ahead of its time when it came to content with its development of i-mode, the proprietary technology which, from 2.5G onwards, has successfully built both a large subscriber base (some 44 million in Japan today, or 92 per cent of all DoCoMo subscribers) and a huge pool of vendor sites (about 84,500 right now, of which just under 4,400 are 'official' sites, i.e. ones that pay a 9 per cent commission on sales to DoCoMo, and 4,600 have 3G content). In financial terms, i-mode contributed 25 per cent of DoCoMo's total revenue last year, which is not bad for only its fourth full year in operation.
The model not only works for DoCoMo at home in Japan: it has also licensed the technology to operators in 12 other countries, in one case (KPN in Holland) a carrier in which it holds equity.
Eight of the 12 are in Western Europe: KPN and its subsidiaries E-Plus in Germany and BASE in Belgium, Bouygues in France, Telefonica in Spain, Wind in Italy, mmO2 in the UK, Ireland and Germany and CosmOTE in Greece. O2 plans to deploy i-mode in the UK and Ireland this year and Germany (under a different brand) in 2006.
In other parts of the world, Far EasTone in Taiwan launched in 2002 and Australia's Telstra followed suit last year, while both CellCom in Israel and MTS of Russia have signed up with DoCoMo to launch services.

WAP vs i-mode

Meanwhile other industry heavyweights, such as Vodafone, Orange and T-Mobile, are building services based on WAP gateways, with Vodafone live! the furthest advanced in number of countries (16 at the end of September '04) and subscribers (11.5 million at the same time). Like i-mode, WAP sits above, and is thus independent of, the radio access layer; provided the network is IP-enabled (i.e. 2.5G and beyond), both can function to enable a mobile Internet experience.
Before we go any further, let me address the fact that WAP is an open standard while i-mode is proprietary and must therefore be licensed from DoCoMo.
All true, but let us not forget that if, for instance, a games developer wants its game to be playable on the Vodafone live! service, it must write to the operator's proprietary API, called VFX, for the purpose.
In the i-mode world, the main pull for ISVs to write to DoCoMo's API has to date been the carrier's commanding share of the Japanese market. Now that other licensees are coming online there is a buddy group forming which, by virtue of its collective subscriber base, again makes it worthwhile for the software developers and handset manufacturers to work to the i-mode spec. By its size and geographical reach, the group begins to rival the clout of Vodafone.

Street market vs. shopping mall

The difference between the two mobile Internet technologies -- and herein lies the secret of i-mode's success -- is that i-mode was developed after the way the market for it would work had been defined, whereas WAP debuted as, to paraphrase Pirandello, a technology in search of a business model.
The i-mode business model can be likened to that of a street market. If a vendor wants to set up a stall (i.e. a site), they agree to pay a percentage of the takings to the local council (i.e. DoCoMo). It is therefore in the operating council's interest to have hundreds of stalls, or indeed thousands, since the Internet does not have the physical restrictions of city streets.
Since the business model was thought out beforehand, from the outset DoCoMo recognised that it was in its interest to promote the take-up of i-mode, and to this end has always allowed so-called unofficial sites to proliferate, i.e. the ones that don't pay the 9 per cent fee on their sales and for whom it does not carry out the billing and revenue collection.
It still makes money on them, however, charging for the traffic generated by their m-commerce activities. Indeed, some 80,000 of the total 84,500 sites are unofficial, and what they pay to communicate with customers across the DoCoMo network makes up 50 per cent of the carrier's non-voice revenue.
Another major difference between the i-mode model and those of the operators basing their mobile Internet services on WAP gateways is that, in the former, all content acquisition is the result of Internet browsing and all content is delivered via DoCoMo's i-mode platform. In the WAP-based world, there are far fewer sites and the bulk of content is acquired via SMS. The operators derive revenue from the SMS traffic, of course, but considerably less than they would if their subscribers used the mobile Internet for their m-commerce, particularly now that SMS and MMS are being bundled into cut-price packages along with voice minutes as competition heats up.
The revenue from the browse-to-buy model comes not, primarily, from the time subscribers spend online -- particularly now that many operators are moving to a flat-rate model for Internet access -- but from charging the content providers to be the delivery mechanism for their ringtones, weather forecasts, horoscope updates or whatever, and from making it easy for thousands of providers to put up sites.
If i-mode is like a street market then the WAP-based mobile Internet model is like a shopping mall. The number of stores (i.e. sites) is small and the commission the mall owners (i.e. the operators) earn is a lot higher, anywhere from 40 to 60 per cent of vendors' revenue in fact. And since most content is bought by SMS rather than on the Web, one could continue the analogy by saying that most shoppers aren't even entering the mall.
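To put rough numbers on the analogy: using the commission figures above, and a couple of openly assumed prices, the operator's take per sale might compare as follows.

    # Back-of-the-envelope comparison of the operator's take per item of content
    # sold. The 9 per cent and 40-60 per cent commissions come from the article;
    # the item price and per-download traffic charge are assumed figures.
    ITEM_PRICE = 2.00      # assumed price of a ringtone, in euros
    TRAFFIC_CHARGE = 0.10  # assumed data charge per download, in euros

    official_i_mode = 0.09 * ITEM_PRICE + TRAFFIC_CHARGE  # commission plus traffic
    unofficial_i_mode = TRAFFIC_CHARGE                    # traffic revenue only
    wap_mall = 0.50 * ITEM_PRICE                          # mid-point of 40-60%

    print(f"official i-mode site:   {official_i_mode:.2f}")    # 0.28
    print(f"unofficial i-mode site: {unofficial_i_mode:.2f}")  # 0.10
    print(f"WAP 'mall' store:       {wap_mall:.2f}")           # 1.00

Per transaction the mall wins comfortably; the street market's case rests on its tens of thousands of stalls, the traffic they generate and the sell-on potential of browsing.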
The knock-on effect here is obvious. If I receive an SMS inviting me to buy a snazzy ringtone and I text back to buy one, that's the end of the transaction. If, on the other hand, I get an e-mail with a link to a site where I can download the ringtone, the vendor has far more sell-on or sell-up potential while I'm on the site.
There is also a greater opportunity to create a recurring revenue stream by signing me up to a regular service of, say, a new ringtone every week or month -- or indeed of multiple ringtones, so that I can differentiate between calls from my boss (mental note to self: answer swiftly) and from my mother-in-law (mental note to self: let it go to voicemail). NEC believes that more content brings more users, and more users bring more content.
What's interesting in the Japanese market is that, since i-mode was the first and most successful service there, it has created the country's mobile Internet culture, such that DoCoMo's competitors seek to emulate its business model, even though they're not deploying i-mode.
So while Japanese consumers, thanks to i-mode, browse to buy, their European counterparts text to buy. In the second scenario, the content will often not even go through the operator's servers en route to the subscriber, being stored instead on a content server on the provider's network and delivered by SMS, with the operator deriving revenue only for transporting the text message.

Which strategy will win?

There will probably not be a single winner, as the success of any mobile Internet strategy will not just depend on the technology, or even on the business model alone, but also on the market clout of the operator adopting it, the acumen of the executives running it, and the business environment in which they are operating.
As manufacturers, we at NEC play an important part, as we provide the means to make it all happen! We see a place for both models depending on where a particular operator aspires to be. Our challenge is one of continually being one step ahead -- developing the roadmaps for more cost-effective, high-performance platforms and the technology on which either model, or indeed any future model, can fit.

Kevin Buckley, General Manager NEC (UK),  Mobile Network Solutions Division, can be contacted via e-mail: kevin.buckley@uk.neceur.com

Inventory management is a vital ingredient in the feast known as VoIP services, as Julie Wingerter explains

It's like a hamburger and fries: it's just better together. And in the case of VoIP, IPTV and other IP-based services, inventory management really does make a difference in the overall ability of service providers to roll out these services profitably and efficiently. VoIP services offer operators significant revenue upside, but they also come with a set of deployment and operational challenges. That's where a robust inventory management solution comes into play.
Because IP services such as VoIP and IPTV are delivered over a combination of shared multi-service transport environments, there are more devices to provision and maintain, more network topologies to keep straight, and more bandwidth and traffic issues requiring prioritisation than in a typical POTS scenario.
For VoIP to work, all of these network-related activities, equipment and designs have to be monitored and managed in real time. To do this, a powerful network inventory management system is required. Such a system provides an accurate view of the network and serves as the core data repository, supporting the automation of routine functions and providing the vital information that allows billing, order management, service provisioning, outside plant and purchasing to run efficiently.

Carrier success

How will carriers be successful in rolling out new VoIP services? From a network perspective, their multi-service transport/broadband IP environments must meet some pretty high standards: i.e. tough enough to facilitate millions of phone calls; as reliable as legacy phone services, with a sound network architecture and POTS interconnection strategy; high quality (e.g. jitter-free, static-free); and designed to rapidly process customers' orders and provision services.
What lies ahead in rolling out new IP-based services? Let's look at some of the specific challenges carriers face when introducing VoIP.
If you don't know what is in your pantry, you may not have all of the ingredients necessary to create an appetising dinner. Similarly, VoIP requires carriers to maintain an accurate picture of their network inventory so they can determine which services are available and when, and so they can plan ahead to avoid any 'shortages'. This is particularly important, as IP generally requires many more network devices and configuration parameters than POTS services, including additional Customer Premises Equipment (CPE) and home networking components with their associated MAC IDs, IP numbers, customer services data, pricing, etc; access technologies such as HFC/cable, DSL and PON (FTTH); and PSTN and other off-net connectivity components like media gateways, SS7, etc.
There is also a need to integrate with additional management applications that use inventory information, such as billing systems and network management applications, which may use the information differently, for example to introduce usage mediation or real-time trouble-shooting.
In short, VoIP services are more complex than standard POTS services because of the additional products and numbers/addresses involved. Consequently, service providers need an inventory solution that provides: a mechanism to capture all of the physical and logical assets of the network; an integrated inside and outside plant inventory that supports an end-to-end view of the network; a means to keep the data current; and a mechanism to integrate with service provisioning processes.
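As a sketch of what 'physical and logical assets with an end-to-end view' might mean in data terms -- the field names here are assumptions, not any vendor's schema -- consider:

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Illustrative records only; the field names are assumptions, not the schema
    # of any particular OSS product.

    @dataclass
    class PhysicalAsset:
        asset_id: str
        kind: str                       # e.g. "CPE", "media_gateway", "DSLAM_port"
        location: str                   # inside or outside plant

    @dataclass
    class LogicalAsset:
        asset_id: str
        kind: str                       # e.g. "IP_address", "MAC_ID", "VLAN"
        value: str
        bound_to: Optional[str] = None  # id of the physical asset carrying it

    @dataclass
    class ServicePath:
        """The end-to-end view: the ordered chain of assets a service rides on."""
        customer_id: str
        hops: List[str] = field(default_factory=list)  # asset ids, CPE to gateway

Keeping the logical resources bound to the physical ones is what lets billing, provisioning and trouble-shooting all work from the same picture.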

Oversubscription Management

Network congestion increases as thousands of new customers are added to an IP network. Oversubscription management looks at usage patterns and analyses traffic and network capacity in real time. However, for this information to be accurate, carriers must have a view into their whole network. With up-to-date network data from inventory management this is achievable. With the ability to view the entire current network, service providers can adjust oversubscription based on observed performance and quality, to ensure that performance does not suffer.
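In its simplest form the check is a ratio of demand to capacity; a toy sketch, with an assumed 20:1 policy figure, might read:

    # Minimal sketch: flag links whose provisioned demand exceeds capacity by
    # more than a policy ratio. The 20:1 figure is an assumption for the
    # example; a real system would read both numbers from live inventory data.
    MAX_RATIO = 20.0

    def within_policy(provisioned_mbps, capacity_mbps):
        return provisioned_mbps / capacity_mbps <= MAX_RATIO

    # 1,000 subscribers sold 8 Mbps each on a 1 Gbps link is 8:1 -- fine
    print(within_policy(1000 * 8, 1000))  # True
    # the same subscribers on a 100 Mbps link would be 80:1 -- flagged
    print(within_policy(1000 * 8, 100))   # False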

Traffic Prioritisation

For most consumers, static-filled calls are not acceptable. Therefore, to ensure QoS, a carrier needs to effectively monitor and prioritise the types of packets running through the network. Voice needs bandwidth. If there is not enough bandwidth available, voice quality deteriorates as 'static' or 'jitter', and dropped calls become an inherent problem.
A carrier needs to facilitate traffic prioritisation policies to ensure that voice and video are prioritised over data transmission activities. Enforcing and adjusting these policies is paramount, especially when offering triple play services that all vie for bandwidth on the same network. All of this revolves around accurate network inventory data to maintain bandwidth at levels that match network activity levels.
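One simple way to realise such a policy is strict-priority queueing; the sketch below (the three-class mapping is an assumption) always transmits queued voice ahead of video, and video ahead of data.

    import heapq

    PRIORITY = {"voice": 0, "video": 1, "data": 2}  # assumed policy: voice first

    class PriorityScheduler:
        """Strict-priority transmit queue: a queued voice packet always goes
        out before any video packet, and video before data; packets of the
        same class keep their arrival order."""
        def __init__(self):
            self._queue, self._seq = [], 0
        def enqueue(self, traffic_class, packet):
            self._seq += 1  # sequence number preserves FIFO order within a class
            heapq.heappush(self._queue, (PRIORITY[traffic_class], self._seq, packet))
        def dequeue(self):
            return heapq.heappop(self._queue)[2] if self._queue else None

    sched = PriorityScheduler()
    sched.enqueue("data", b"bulk transfer")
    sched.enqueue("voice", b"RTP frame")
    print(sched.dequeue())  # b'RTP frame' -- voice jumps ahead of the data packet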

Self-Service Feature Management

Creating individual, made-to-order bundles of VoIP services is required to stay competitive, and it is heavily dependent upon accurate network inventory information. Through web-based customer sites or by phone, users should be able to change their VoIP features in real time. For example, customers may go online and adjust their call forwarding, call waiting or voice mail parameters. Importantly, customers don't expect to have to wait for these changes to be applied -- they want them to be instantaneous. This capability requires accurate customer information and the associated network details.
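A toy sketch of such a change -- the feature names and the in-memory records are illustrative assumptions -- might look like this:

    # Sketch of a real-time, self-service feature change validated against a
    # (purely illustrative) in-memory inventory of customer features.
    SUPPORTED_FEATURES = {"call_forwarding", "call_waiting", "voice_mail"}
    customer_features = {"cust-42": {"call_waiting": True}}

    def set_feature(customer_id, feature, value):
        if feature not in SUPPORTED_FEATURES:
            raise ValueError(f"unknown feature: {feature}")
        customer_features.setdefault(customer_id, {})[feature] = value
        # A real OSS would also push the change to the switching layer here,
        # so the customer sees it take effect instantaneously.

    set_feature("cust-42", "call_forwarding", "+44 20 7946 0123")
    print(customer_features["cust-42"])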
Another unique VoIP service feature is virtual phone numbers. A VoIP number can reflect where a person or company would like to have a virtual office or presence in another country. For instance, someone from the United States might want to have a virtual office in the United Kingdom. Using VoIP, they can now have a number that matches UK phone numbering conventions even while all calls are routed back to the original US number. These enhanced service features require a network inventory that is flexible and integrates with other processes. VoIP brings to the table new service features that didn't exist in the traditional POTS environment.
IPTV deployment challenges mirror those of VoIP, only they tend to be intensified in scope. IPTV requires a bevy of new supporting equipment, from encoders and video-on-demand servers to video compression and IP encapsulation. For many carriers today, IPTV is the next big service they intend to roll out, and VoIP is their introduction to IP services.
Accurate inventory, traffic prioritisation, oversubscription management and self-service feature management are just a few of the areas that must be managed well to generate profits from VoIP and IPTV services.  An inventory-based OSS delivers the capabilities that allow carriers to see their whole network, add services and capacity to maximise their resources, and analyse results for future planning.
NetCracker Technology's OSS solution is improving how carriers roll out next-gen services such as VoIP, IPTV, Fibre-to-the-Home and others. The solution includes industry-proven inventory-based OSS software and the professional services delivery expertise to make it happen. NetCracker customers include Telstra, Australia's largest carrier; Telus, Canada's second largest carrier; MGTS, one of the largest wireline providers in Europe; and Covad, a leading North American broadband service provider, among others.

A moveable feast

VoIP and IPTV do have their unique challenges. However, combined with a solid inventory-management OSS, service providers can generate higher revenues, increase market share, maximise network resources and remain competitive.
In a multi-service transport environment, IP's additional equipment and design parameters need to be kept in sync using inventory management.

Julie Wingerter, Vice President of Strategy, NetCracker Technology   www.netcracker.com

Can MMS -- as predicted by many in the industry -- produce the knockout blow that will fuel greater revenues and opportunities for all, or will it be marginalised as a stepping stone to 3G services? Terry Ernest-Jones investigates

MMS is forecast to generate $161.3bn in 2009, by which time it will be well established as a day-to-day feature of the mobile mass market. Yet so far, MMS has failed to deliver on its promise for many of its users, and its potential to enrich mobile communications has not been realised. It has recently passed through the early adopter phase, and the MMS industry continues to wrestle with problems such as handset compatibility, digital rights management and pricing models. However, the major opportunity -- receiving and sending multimedia messages on the move, as easily as with SMS -- gets closer by the month.
MMS brings multimedia features such as photos, sound, video, rich text or interactive applications to mobile messaging. This can take the form of a message sent between mobile phone users ('peer-to-peer') or, of equal importance, a message sent from a third party to a user ('server-to-mobile'). An MMS message can be compared to a scaled-down PowerPoint presentation, able to contain a variety of media. By contrast, the Short Message Service (SMS) -- which has paved the way for MMS -- only allows basic text messaging of up to 160 characters.
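As a loose illustration of that 'scaled-down PowerPoint' analogy, the sketch below uses Python's standard email library to assemble a multipart container holding a small presentation script alongside its media parts. This is an analogy for the structure of an MMS message, not the actual MMS encapsulation format.

```python
# Illustrative analogy only: an MMS bundles a presentation description
# with media parts, much like a MIME multipart/related message. This is
# NOT the real over-the-air MMS encapsulation.

from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

message = MIMEMultipart("related")
message["Subject"] = "Holiday snap"

# A tiny SMIL-style presentation part describing layout and timing,
# comparable to the slides of a scaled-down PowerPoint deck.
smil = """<smil><body><par dur="5s">
  <img src="photo.jpg"/><text src="caption.txt"/>
</par></body></smil>"""
message.attach(MIMEText(smil, "plain"))  # a real MMS would mark this application/smil

# A media part referenced by the presentation:
message.attach(MIMEText("Wish you were here!", "plain"))  # the caption.txt part

print(message.as_string()[:300])  # inspect the assembled structure
```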
Technically speaking, MMS originates from mobile messaging standards defined by the Third Generation Partnership Project (3GPP) and the WAP Forum (which has since merged into the Open Mobile Alliance). It has only really been up and running for about three years. Amongst the countries where uptake is strongest are Japan, South Korea, Germany and the Nordic region. The USA is less developed, reflecting the generally lower mobile handset penetration in the region. In between are countries such as the UK, displaying roughly average MMS usage levels: by the end of the first quarter of 2004, out of a total of 47.5 million subscribers to the four main UK mobile networks, 11.3 million MMS-active devices were registered, according to the Mobile Data Association. This gives an MMS penetration rate of 24 per cent.
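The arithmetic behind that penetration figure is simply:

```latex
\frac{11.3\ \text{million MMS-active devices}}{47.5\ \text{million subscribers}} \approx 0.238 \approx 24\ \text{per cent}
```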
MMS requires special handsets with colour screens and, usually, built-in cameras. When the first Juniper Research report on MMS appeared in 2002, there were just two MMS phone models. Now there are hundreds. They have already brought in significant revenues for the leading MMS handset suppliers such as NEC, Nokia, Samsung, Sony-Ericsson, Panasonic and LG Electronics. But for operators urgently looking for new revenue sources, and other suppliers in the value chain, MMS offers the chance to build on new consumer behaviour in messaging -- for example linking audience participation into digital TV programmes -- driving new data revenues and raising ARPU, as well as handset replacement rates.

Lifting the barriers

Whilst it offers a leap in mobile phone usage and appeal, MMS has until now also been a frustration for large numbers of users, even for basic functions such as exchanging photos between mobiles. However, many of the compatibility and interoperability obstacles that have dogged MMS will be ironed out over the next two years, allowing a freer flow of multimedia messages, approaching the level of today's SMS. A major advantage for MMS is that, following in the wake of SMS, it can slot into customers' existing mobile usage habits. The downside is that user expectations have been set to require the same standard of service, and smooth operation, as they get from SMS.
Fortunately for the MMS industry, users do expect to pay for mobile services. (This is not by any means the case with, say, the Internet.) MMS presents a large revenue opportunity, not only in providing enhanced peer-to-peer messaging, but also content and application-based services. As yet few suppliers are making any money out of it. But MMS provides opportunities to sell a range of enabling technologies. This potential will not be realised until operators and all other suppliers in the industry a) understand the dynamics of the value-chain, b) adhere to industry standards for interoperability, and c) fine-tune the infrastructure needed to exploit MMS.

The MMS value-chain

As MMS carries multimedia into the mobile telecommunications market, a new range of suppliers have been drawn to the mobile channel to reach a wider client base. Just as the web has linked the IT and media markets, MMS is bringing content providers and application developers in touch with mobile operators. There are several instances where these players are working in close co-operation to create a multimedia user experience, deliver attractive services, and provide end-to-end solutions that stimulate the market and create new needs.
Ease of use and intuitive interfaces are fundamental requirements for end-users -- neither of which are as yet properly addressed by the MMS industry. If, say, a young teenager is in a clothes store and wants to get the OK from his/her parent to buy an article of clothing via an MMS picture, the operation must be transparent and quick. Minutes spent overcoming the handset's technical hurdles in a busy clothes store will defeat the purpose of this and other useful applications.
From the mobile end-user's perspective, MMS involves a radical shift: in effect moving the focus of their attention from the ear to the eye. As so often with technology innovations, end-users are neglected as the key element in the value-chain, whilst the industry races ahead with 'push'. So far as MMS is concerned, the youth market is the main driver -- in fact some studies say that most MMS buying activity is seen amongst 15 to 17 year olds. Yet each age group needs to be taken into account: there is a large mobile population of people aged 60+, and messages sent to them -- of grandchildren's birthdays, for example -- should not be ignored under the misconception that MMS is 'youth only'.
A feature of young people's lives today that works in MMS's favour is how extensively they record the details of their lives online, and both sexes show a stronger urge than previous generations to share their experiences with friends.
Taking the UK as an average, around 25 per cent of handsets are MMS-enabled. Yet most young users of MMS phones today regard it -- initially at least -- as a disappointment. Typically, initial efforts to send photos to friends fail. Tales abound of handset makers blaming operators and vice-versa when frustrated users contact help centres to overcome problems. MMS represents an investment for end-users, who buy an MMS-capable handset -- costing up to $500 plus services. The mass market will only be willing to make this investment if users can derive real value out of using MMS. There are several compelling value propositions linked to MMS at its most basic level:
*  MMS handsets in themselves add kudos with their basic features of colour screens and polyphonic ringtones.
*  MMS offers the possibility to enhance the popular peer-to-peer messaging with the addition of photos and sound. More importantly, pictures and sound allow the user to add personality and emotional content to the messaging experience, and share this with friends.
*  MMS information and entertainment services promise unrivalled personalisation possibilities, and will enable users to access content anytime, anywhere.

Operators

Operators play a pivotal role in the value-chain. As such, they must focus closely on the end-user's needs and guide their business partners to help them develop offers that will satisfy end-users. (Presently, this is often the other way round.) This requires more than simply supplying 'jazzed-up' SMS messages; operators need to reach into new models such as T-Mobile's for the Euro 2004 football tournament. This scheme included MMS picture messages sent at intervals during each of a team's matches -- the whole package costing $4.50 per team for picture, or $10 for video, updates. (T-Mobile was contacted for this report to state the result of the programme, but declined to give figures. Instead it commented that it was "very pleased with both the level of uptake and also the technical performance of the delivery of the alert services.")

Wide range of services

To make their MMS offer complete, operators are having to provide a wide range of services, including the ability to create and store messages, and a wide choice of content and applications. In addition to these core services, operators are also continuing to add infrastructure to ensure that the value-chain functions properly. It's still early days though: Virgin Mobile, for example, only launched picture messaging in July 2004. The industry is still wrestling hard with how to adapt end-user billing systems to MMS messages, not to mention the task of ensuring that MMS content flows seamlessly from the original provider to the end-user. Upstream, it is also the operators' role to ensure that third-party providers collect revenue, as it is the operators who manage end-user billing.
The rewards that operators can reap from successful implementation of MMS are huge. MMS provides a much-needed boost in ARPU, as well as a justification for investments in General Packet Radio Service (GPRS) and Third Generation (3G) networks, and fosters strong customer loyalty. (3G cost operators around $120 billion in licences alone.) Efforts are underway to solve interoperability problems, but there is still a long way to go. "Operators have worked hard in previous months [on interoperability]," says Sandy Ryrie, messaging chief with operator O2. "We want to bring the same level of confidence to MMS as there is in SMS."
Generally today users are introduced to MMS services, whether they have specified it or not, simply by upgrading their handset. A wider range of MMS-capable devices is beginning to appear, catering to the needs of different market segments, from the prepaid youth segment to the high-end business user. For manufacturers, MMS presents a major opportunity.
Megapixel cameras are raising the quality of images, but the resulting files generally cannot as yet be handled adequately by operators. Meanwhile, the handset is turning into something of a Swiss Army Knife, and some executives, such as the former technical director for Symbian, Simon East, have branched out to focus on the photo printing and image quality of MMS phones. His company Cognima aims to provide a 'single key press' for printing from phones.
Much of the effectiveness of MMS messaging today depends a) on how new a handset is, and b) on whether the vendor has followed internationally agreed standards to ensure interworking. Meanwhile, following the classic pattern for new technology devices, the emphasis is on launching handsets with dazzling features -- video cameras, megapixel sensors, 180-degree swivel lenses, 4x zoom, 265,000-colour screens and image-editing features -- in a wide variety of inventive, slick designs, weighing from around 85g.

Infrastructure vendors

Telecommunications software vendors provide the all-important MMS Centre (MMSC): the server that manages all MMS message flows within an operator's network, and handles addressing, filtering, and temporary message storage. (WAP then carries the MMS message from the MMSC to the mobile.) Operators are continuing to invest significantly in MMS infrastructure, and a growing industry has formed around supplying it, with companies such as LogicaCMG, Comverse, Nokia, Sema, Ericsson, Motorola and Alcatel all active. Managing the sheer volume of information that flows through is already a challenge, but more importantly, MMSCs need to be able to deal with a host of different formats and profiles. Content usually needs to be altered before presentation on the receiver's phone.
One of the MMSC's most useful functions today, once a message has been received, is to discover the configuration of the handset to which it is being sent, and adapt the format of the message so it can be accepted by that equipment. Even video messages can be adapted so they can be received if the recipient doesn't have a video handset.
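A simplified sketch of this kind of capability-based adaptation follows. The handset profiles, field names and downgrade rules are invented for illustration and do not represent any vendor's MMSC.

```python
# Hypothetical sketch of MMSC-style content adaptation: look up the
# recipient handset's declared capabilities and transform the message
# so it can be rendered. The capability table and names are illustrative.

HANDSET_PROFILES = {
    "model-a": {"video": True,  "max_image_px": 640},
    "model-b": {"video": False, "max_image_px": 128},
}


def adapt_message(message, handset_model):
    profile = HANDSET_PROFILES.get(handset_model,
                                   {"video": False, "max_image_px": 96})
    adapted = dict(message)

    # A video clip sent to a non-video handset is downgraded,
    # for example to a still frame extracted from the clip.
    if message.get("video") and not profile["video"]:
        adapted["video"] = None
        adapted["image"] = "extracted_still_frame.jpg"

    # Images are scaled down to the screen the recipient actually has.
    if message.get("image_px", 0) > profile["max_image_px"]:
        adapted["image_px"] = profile["max_image_px"]

    return adapted


incoming = {"video": "clip.3gp", "image_px": 640}
print(adapt_message(incoming, "model-b"))
```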
Certain infrastructure components are required to manage the store-and-forward functions of MMS. MMSCs have to connect into other network components, and a network must also be WAP-enabled and at least capable of handling GPRS. Apart from the MMSC, there are other infrastructure elements that must be implemented in order to offer effective MMS services:
*  End-user billing adapted to the nature of MMS messages.
*  Inter-operator billing adapted to the nature of MMS to deal with cross-network messages and roaming.
*  Revenue sharing mechanisms to allow the automated redistribution of revenues across the content and applications value-chain.
*  Digital Rights Management solutions that can identify copyright-protected content in Peer-to-Peer messages.
*  Security to ensure that valuable content is protected.
*  Application and content gateways to allow third parties to link into the MMSC.
There is a further industry supplying technologies around the MMSC. TCS (TeleCommunication Systems), for instance, provides messaging services for global operators. It doesn't sell MMSCs but has enabling technologies to enhance them -- such as providing a single domain for MMS.
MMSCs are not used only by large operators. In June 2004, for instance, Comverse announced its 'Compact MMSC', an entry-level solution for smaller wireless operators, which had recently been deployed by two operators in Asia.

Content providers

MMS represents a new channel to market for information and content providers. Unlike the Internet or WAP, MMS provides a clear revenue opportunity for media suppliers, who can justify claiming a share of the traffic revenues that operators collect from end-users. As the market matures and billing structures evolve, they will be able to provide their services directly to the market, potentially using operators only as distributors, for example for a 'joke of the day' service.
A wide range of content types can prove effective over the MMS bearer, but content must be created specifically for this new channel in order to be successful -- taking into account the effect of new user behaviour as usage evolves. MMS as a distribution channel is superior to traditional channels in two ways: it offers the possibility to create finely personalised content, and can be accessed by people on the move.
MMS is also a compelling advertising channel. It enables the building of highly targeted campaigns, and the communication of clear messages directly into the hands of the intended recipient. MMS will also be an extremely effective conveyor for 'viral' marketing campaigns.
It is quite possible that the usage of MMS to supply third-party content to the phone, rather than peer-to-peer messaging, will eventually steer the market. Many involved with the MMS industry believe that MMS could ultimately be driven by major events -- sporting or otherwise. It is, after all, the ideal platform for instant updates and alerts on events of special interest, which can then be forwarded to friends.

Applications developers

MMS enables interaction between mobile handsets and networked servers. There are numerous ways to translate these interactions into concrete applications that provide value to the market. In the consumer market, MMS can support TV quizzes, polls and eventually video games. Operators will be eager to offer such applications, as they generate repeat traffic and create customer satisfaction. Content providers and advertisers are also interested in providing such applications as marketing tools. On the business side, corporate applications like ERP or CRM systems can use MMS to develop a mobile extension and reach remote and mobile workers, providing them with a permanent link to their company and customer databases or e-mail.
Attractive and compelling content, rather than exotic handset designs and features, will be the factor that ultimately drives MMS, though. SMS was the mobile success story of the 1990s, and the jury is out on whether MMS will turn out to be the success story of the present decade. Most likely the main role of MMS will be as a stepping-stone for the multimedia applications and services that will drive 3G. Either way, video is the next step -- a natural progression of what MMS does best: 'sharing the moment'.
The main focus is on consumers, who will undoubtedly drive the MMS market in the near future. There is more to MMS, though, than multimedia 'infotainment' for consumers. There is also the possibility of developing interactive mobile business communications and applications. Handset manufacturers are well aware of this: the Siemens CX65 business user mobile launched in the summer of 2004, for instance, comes not only with a digital stills camera, but also video, taking clips of up to 15 seconds. Indeed, business use of MMS technologies is forecast to become a $64.1bn industry by 2009.

Terry Ernest-Jones is an Associate Consultant with Juniper Research   www.juniperresearch.com

Reducing revenue leakage and maximising profits is the nirvana of every operator in the market. So, what are they doing to achieve this goal? John Maclean investigates

According to the experts the future is starting to look rosy again for the global and European telecoms markets. Gartner reports that revenue decline is slowing for the top North American telcos, while in Western Europe, the EITO (European Information Technology Observatory) expects the industry's rate of growth to increase from 2004 to 2005.
Although the picture is certainly looking less bleak, Gartner also reports that worldwide, the BSS and OSS markets will only see incremental and uneven recovery until telcos resume sustainable profitability. But surely, the BSS and OSS markets should be recovering ahead of telco profitability? As the market eases, telcos are looking to pick up the pace in launching new services and deploying new technologies. But a complete inversion of investment priorities, focusing on new services rather than back office processes, could in the long term produce precisely the results that they are trying to avoid. Profitability will come by maintaining the focus on the back office to give a strong set of customer facing processes, upon which new services and offerings can be delivered -- quickly, effectively and profitably.
In order to achieve profitability, telcos need, principally, to ensure that they minimise the time and cost associated with turning customer orders into cash. The less it costs a telco to provision and service a customer effectively, the greater chance it has of keeping valued customers and of making a profit. For example, as a Western European mobile operator recently commented: "Our financial figures are now in the top five of the operators in Europe. We automated our order management process within the past two and a half years and this has totally helped the system compared with the manual processes of the past."
This quote came from a recent research study by Analysys, which highlighted that despite many recent improvements in order-to-cash processes, most telcos acknowledge there is still significant work to do to fully optimise the order to cash lifecycle.
Analysys surveyed over 40 telecoms operators in Europe and North America; a staggering 80 per cent of respondents admitted that further order management improvements are essential if they are to achieve better financial results and competitive differentiation. Fifty per cent of telecoms operators, and within this, all bar one of the large operators, are without a single common process for order taking across all lines of business.
One in three of the operators that had met obstacles in improving order to cash processes had to stop or delay critical product launches -- a highly detrimental factor in the quest for profitability in a competitive marketplace. Operators looking to jump start their market expansion plans need to ensure faster time-to-market, especially as the pressure intensifies to introduce new services and tariffs.
With profitability as the ultimate target and order management the means of getting there, there are also a number of goals to be achieved along the way. Namely, in the Analysys research, half of the operators surveyed cited improved cash flow, better service, increased customer satisfaction and revenue growth as benefits made possible when the obstacles to improved order management are removed.

Hurdles to clear

Analysys outlines two main hurdles to achieving these goals. These are the sheer complexity of the order management processes that need to be overhauled, and the impact of seemingly irreplaceable legacy systems that still remain the backbone of telcos' IT infrastructures.
The issue with the order management business processes themselves is that they are becoming increasingly complex as telecoms businesses become more diverse, nimble and affected by rapid market change. Factors like the constant development of new services to maintain competitive edge, and a high rate of market consolidation, mean that the market landscape for telcos is constantly evolving. Despite their increasing complexity, processes need to be effective, resilient and efficient in order to deliver profitability.
The complexity of IT systems and the difficulty of customising legacy IT systems are blatantly clear from the survey. Twenty-nine per cent of telcos are trying to hard-wire the critical systems needed across order management to create an integrated process. While this may serve today's needs, hard-wiring usually means that there is little option for change. How will these telcos fare in the future, when their order management processes, and probably the systems needed to support them, have changed beyond all recognition?
Only one in five operators has invested in integration tools with standard technology interfaces, despite the fact that these systems offer a more future-proof route to profitability because they are based on open standards that allow any combination of systems to be integrated.
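The difference the survey points to can be illustrated in the abstract: code written against a standard interface survives the replacement of the underlying system, whereas hard-wired calls to one vendor's API do not. The sketch below is hypothetical and names no real product.

```python
# Abstract illustration of integrating against a standard interface
# rather than hard-wiring one vendor's API. All names are hypothetical.

from abc import ABC, abstractmethod


class OrderManagementInterface(ABC):
    """A standard interface any order-management system can implement."""

    @abstractmethod
    def submit_order(self, order):
        ...


class VendorASystem(OrderManagementInterface):
    def submit_order(self, order):
        return f"vendor-a-ref-{hash(str(order)) % 10000}"


class VendorBSystem(OrderManagementInterface):
    def submit_order(self, order):
        return f"vendor-b-ref-{hash(str(order)) % 10000}"


def place_order(oms, order):
    # The calling process never changes when the underlying system is swapped.
    return oms.submit_order(order)


print(place_order(VendorASystem(), {"product": "broadband-50"}))
print(place_order(VendorBSystem(), {"product": "broadband-50"}))
```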

Lip service to industry standards?

The Analysys survey shows an increase in awareness of industry standards such as the TeleManagement Forum's (TMF) Enhanced Telecom Operations Map (eTOM), which provides a blueprint for successful business processes like order management. But while 60 per cent of large and medium-sized operators are tracking and analysing these standards for ways to create value in order management, only five per cent are actually developing order management solutions using the industry standards. Over the next few years, the market needs to follow these early adopters who have successfully made order management improvements using standards-based solutions.
One early standards adopter is R (formerly R Cable), one of the fastest-growing telecommunications companies in Spain. In early 2004, R decided to take the standards based approach to order management by implementing Vitria:OrderAccelerator, a solution based on the TMF's eTOM framework, which streamlines and automates manual, order-related business process flows across OSS and BSS systems.
The decision by R to standardise on an order management platform was based on the desire to run the entire process of customer subscriptions and services provisioning from one place. It is integrating information flow between the web, its customer relationship management platform, service activation systems, on-site workforce systems, network inventory system, its Interactive Voice Response facility, and billing system in a single environment.
"The strong competition in the telecommunications sector requires a sustained effort to provide the best levels of operating efficiency and service," comments Antonio Gómez, Systems, Organisation and Quality Manager at R. "Vitria addresses specific issues that have a direct impact on efficiency and quality through the automation of processes and the integration of information flows inside the company. By offering visibility of customer details and services across the organisation, Vitria also helps us reduce costs and improve service."

The revenue impact

An earlier study by Analysys, in February 2004, showed that, with revenues overall growing more slowly, management of existing revenues has become increasingly important. Under the global title of 'revenue assurance', this has become an absolutely essential project for operators. The study showed that of the top six causes of revenue loss, three were in the area of order management and processing, namely 'Poor Processes and Procedures', 'Poor Systems Integration' and 'Applying New Products and Services'.
Even more surprising was the gap between assumed revenue loss and likely real revenue loss. In Western and Central/Eastern Europe an acceptable loss level was held to be around 1 per cent of revenue. However, the research suggested that the actual loss level was probably well over 7 per cent. This was unlikely to be due to complacency, but rather the result of vertical organisational structures and a resulting inability to recognise the real causes of revenue loss and, more importantly, to put in place effective measures to reduce them.
In today's climate, lost revenue is close to being a crime and it is vital that operators focus on eliminating it before getting carried away with the front-end technology. It will be of no long-term value to have launched the most leading-edge services if, at the same time, an operator is leaking more revenue at the back.

A two-pronged attack

It is clear that although the improved overall climate for operators is permitting some renewed opportunity to launch new technology-based services, such as 3G in mobile and VoIP in fixed, the biggest risk is that they lose focus on the back office. A strong and determined effort to 'sort out' order management, and its associated processes and procedures, will have the triple benefits of reducing costs, stemming some key sources of revenue loss and, most importantly, allowing the more effective and timely launch of new and exciting service offerings.

John Maclean is Telecoms Marketing Director EMEA at Vitria, and can be contacted via tel: +44 (0)1628 421852;

Louis Meyer argues that the drive for a more advanced
OSS should begin at a board level

As 2005 kicks off, it would appear, finally, that telecoms carriers are beginning to recognise the strategic importance of driving OSS investment from the boardroom. UMTS shows a great deal of promise for the future, and many carriers are realising the necessity of a highly proactive approach to developing and delivering innovative new services that tempt subscribers to spend more. Certainly MMS is starting to take off, 'push-to-talk' is proving increasingly popular, and SMS-based voting services are an intriguing possibility.
The mainstream use of 3G services is firmly on the horizon. But as the industry readies itself to gauge the market's reaction to these ambitious new services, the benefits of installing new infrastructures to manage their efficient delivery are being overlooked by many carriers. A lack of OSS/BSS that support business value is emerging as a potential source of seriously numbing headaches in the coming years. Carriers will need to equip themselves fully for the strenuous demands that the introduction of new services will place on their delivery infrastructure in the near future, or risk being left behind by their competitors as the pace of change quickens.
The automation of network management is an ongoing process, with significant advances already having been made and many improvements and refinements in the pipeline. The widespread adoption of these automated infrastructures must be made to appear increasingly necessary and desirable at board level to executives who are looking to improve efficiency in practical and economic terms, maximise operating profits, and ensure that their networks will continue to deliver value and cope under the strain.
OSS has taken a prominent role in recent industry conventions and, as expected, billing and billing mediation solutions have received plenty of attention due to the convergence of pre-pay, pay-now and post-pay services, with an array of new applications under evaluation by operators. The prevalence of GIS systems vendors has demonstrated a high level of interest in services with a location-based flavour. Analysts have been generally surprised at the strong presence of pure OSS vendors at the latest showcase events.
At TeleManagement World 2004, communication service providers readily admitted that their business processes could be made more efficient, but tight budgets -- in combination with the perennial misalignment between the requirements of IT and the board -- were often blamed as the reason for the lack of a systematic approach to the enhancement of network management.
It seems the problems most commonly encountered in the telecoms industry are repeat calls from customers, the inconsistent distribution of work, and an over-dependence on an experienced workforce -- instead of highly automated processes or systems -- to complete the myriad tasks in hand. However, it is encouraging that telecom professionals are at least aware of these inefficiencies and of their causes, even if they feel they have their hands tied when it comes to implementing solutions. This highlights the importance of aligning IT and business goals if operational efficiency improvements are to be realised, and that the drive for a more advanced OSS should begin at a board level.
There can be no doubt that OSS, and the attitudes surrounding its use, currently stand at a critical crossroads. The market-led approach to mobile communications -- selling a product before considering the suitability of the infrastructure to support the roll-out of new services -- is sure to lead to some major problems in delivering a satisfactory level of service quality and customer satisfaction.
There are two possible outcomes of this approach. Firstly, there is the possibility that the industry will intelligently fine tune investments in both network infrastructure and OSS/BSS to match market growth and tackle the changing demands as they arise. The second outcome is that there is a significant mismatch in growth and delivery capability, creating very public service quality problems, which could negatively impact on and perhaps irreversibly damage a firm's brand image.
The outstanding challenge for most carriers will be managing the huge volume of change ahead, both in terms of network growth and growth of the customer base. In some cases, the number of additions to the network could reach many hundreds each day. Relying on the human programming of diagnostic rules, it is simply not possible to adapt to this rate of change. As far as improving the level of service in network operations centres (NOCs) is concerned, the telecoms industry is now undergoing a sea-change in which it is possible to largely automate network management and fault resolution, cutting human resource costs and the time taken to analyse and fix faults.
Next-generation OSS infrastructure utilises artificial intelligence (AI), originally developed for use in the nuclear industry, where diagnostic decisions have a critical impact, to analyse and discover the root causes of network problems and reduce the number of events that need human intervention. This delivers greater OSS productivity without increased hardware investment; improved SLA compliance and service levels, reducing the likelihood of incurring SLA penalty costs; and improved network reliability, leading to significantly reduced churn -- which, in board-speak, means reduced opex and lowered TCO with a very attractive ROI.
Carriers, and by extension their end users, can benefit greatly from task-based management and reason-led systems that allow for more rapid deployment of closed-loop control. Some carriers have reported a 98 per cent success rate when applying these AI solutions to identify and resolve errors that arise in the day-to-day running of their networks, which can mean up to two million individual problems fixed automatically every day.
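While vendors' actual AI engines are proprietary, the underlying idea of automated root-cause analysis can be sketched simply: correlate a flood of alarms against a model of element dependencies so that only the probable root cause is surfaced for attention. The dependency table and element names below are invented for illustration.

```python
# Simplified illustration of automated root-cause analysis: collapse
# correlated alarms to the underlying fault using a dependency graph.
# A sketch only -- not any vendor's actual AI engine.

# Which network element each element depends on (child -> parent).
DEPENDS_ON = {
    "port-7": "card-2",
    "card-2": "router-1",
    "link-9": "router-1",
}


def find_root_causes(alarmed_elements):
    roots = set()
    for element in alarmed_elements:
        # Walk up the dependency chain while the parent is also alarmed;
        # the highest alarmed element is the probable root cause.
        current = element
        while DEPENDS_ON.get(current) in alarmed_elements:
            current = DEPENDS_ON[current]
        roots.add(current)
    return roots


alarms = {"port-7", "card-2", "router-1", "link-9"}
print(find_root_causes(alarms))  # {'router-1'} -- one event instead of four
```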
Executives must recognise that their firms' reputations are at stake here, and put in the extra effort to secure the necessary investment in high-efficiency, hands-off automated OSS -- an investment which will make a tangible difference to the bottom line and may ultimately prove crucial to carriers' survival in the coming years.

Louis Meyer is CEO of Pivetal, and can be contacted via e-mail: louism@pivetal.com   www.pivetal.com

As UMTS finally takes off across Europe, network planners are exploring how best to meet projected 3G subscriber growth. The technology at the base station RF interface holds many of the answers, writes Joerg Springer

In the latter half of 2004, the third generation (3G)/universal mobile telecommunications system (UMTS) found its feet in Europe. This technology should provide answers to many of the business challenges facing the region's ageing second generation (2G)/global system for mobile communications (GSM) networks. While Western Europe's GSM networks still enjoy steady subscriber growth, many are over a decade old and face serious capacity limitations.
UMTS is founded on wideband code division multiple access (W-CDMA) technology, and promises relief from the current capacity headache. W-CDMA offers a capacity-per-MHz far greater than that of time division multiple access (TDMA)-based GSM, plus reduced opex. It also promises more established and sustained growth in average revenue per user (ARPU) -- a powerful driver in an industry climbing its way out of a three-year ARPU slump. The earliest experiences with 2.5G data services suggest that the more advanced 3G will be an important factor in industry growth.

Unique challenges

Although the European '3G beast' is now flying, there are unique challenges ahead from a network expansion and RF perspective. Today's 2G-to-3G transition is a different scenario to that of the mid-nineties leap from first generation analogue services to digital.
In just over a decade, Europe's cellular services have matured dramatically, with penetration at around 85 per cent. The downside of this is that all the prime base station sites are occupied. The environmental requirements regarding site location and visibility have also 'matured' to become some of the world's most demanding.
The transition from TDMA-based GSM to W-CDMA-based UMTS technology also influences network planning. Where TDMA planning strategies are based on minimising co-channel interference by re-using a select number of channels over a group of cells, W-CDMA uses the full frequency band in each cell. Moreover, W-CDMA cells are said to 'breathe': the size of the cell varies with the number of callers within the cell, the transferred data rate and so on. The resulting co-channel interference that can occur in the W-CDMA network raises the noise floor, and progressively depletes the capacity of the network. It presents a notoriously tougher network planning challenge than GSM, particularly in addressing the interference from adjacent cells.
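Cell breathing can be made concrete with the standard W-CDMA uplink noise-rise approximation (a textbook relation, not specific to any planning tool): as the cell's uplink load factor approaches one, the noise rise grows without bound and the effective cell shrinks.

```latex
\text{Noise rise (dB)} = -10 \log_{10}\!\left(1 - \eta_{\mathrm{UL}}\right),
\qquad 0 \le \eta_{\mathrm{UL}} < 1
```

A 50 per cent loaded cell thus sees roughly a 3 dB noise rise; at 90 per cent load the rise is already 10 dB.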
Perhaps most challenging of all is subscriber expectation with regards to quality of service (QoS). No longer are subscribers willing to condone dropped calls and fades -- Western Europe has arguably the highest cellular QoS in the world. The new UMTS services have much to live up to.

Rooftop realities

The majority of Europe's urban cell sites are rooftop-based. Given the tough site acquisition conditions, the easiest 3G roll-out option (and the one largely chosen to-date by Europe's UMTS operators) is co-siting.
The situation on European rooftops today has much in common with a crowded early morning commuter train -- no-one enjoys the congestion, there are established long-term disputes and rivalries between some 'passengers', but on the whole, the system works. To accommodate UMTS spectrum, new antennas are required, so the 'train' needs to be reorganised. The most popular strategy adopted to-date is the multi-band antenna solution. This is manifesting in strong demand across Europe for dual- and tri-band antenna solutions supporting combinations of UMTS 2100 MHz, GSM 900 MHz and GSM 1800 MHz.
A further challenge is co-siting interference. When antennas operating at different frequencies are located in close proximity, there is potential for RF interference, caused by intermodulation products or spurious emissions, which can in turn lead to BTS or Node B blocking. The most extreme cases occur when the core band spectral separation is narrow (a pair of UMTS 2100 MHz and GSM 1800 MHz services is the most obvious case) and the antennas are physically close. As a result, UMTS/GSM co-location isn't always straightforward. In some cases, it simply isn't practical, and the new UMTS operator is forced to opt for a site that is nearby, but 'sub-optimal'. The RF challenge is to make the best of such a situation, and to optimise the RF footprint to suit the alternative location.

RF flexibility

The upshot of this highly constrained site location scenario -- coupled with the exacting requirements of W-CDMA network planning -- is that Europe's 3G operators are demanding higher levels of base station RF precision and flexibility. First and foremost is the issue of antenna performance: new-generation 'precision footprint' antennas feature diminished side and rear lobe radiation levels, improved null fill, and increased front-to-back ratios.
'Flexibility' is being sought on a number of RF technology fronts -- specifically in the control of cell footprint size, shape, direction and power. To compensate for CDMA-style cell breathing and less-than-optimal site locations, variable electrical tilt is de rigueur. Increasingly, this is accompanied by remote tilt control functionality, linked back to the network management centre via industry standard communication protocols.
There is also demand for tower-mounted amplifiers (TMAs) across the majority of Europe's 3G sites. These amplify the uplink signal from the terminal, overcoming losses in feeder and co-siting components, decreasing system noise and increasing the potential cell size. The need again is for flexibility: a broad choice of amplification levels, dual- and multi-band configurations, and a wide selection of antenna gains.
In addition, the challenge of W-CDMA adjacent cell interference has created a demand for alternative apertures. Where the 65-degree tri-sector is the norm in 2G/GSM networks, antenna apertures of 90-, 45- and even 33-degrees permit the W-CDMA network planner to 'break the symmetry' of the final cell pattern, and thus minimise cell-to-cell interference.
The not-too-distant future holds even more RF challenges -- significantly, the evolution from a coverage-driven to a capacity-driven strategy. In the very short term, we'll see UMTS operators continue to expand and enhance their 3G coverage across the major city and urban centres. These are the areas that present the greatest revenue earning potential, and are also the most voice capacity-challenged.

Longer term challenge

More challenging, though, is the longer term. Analysts predict a 50-fold increase in Europe's UMTS subscriber levels between end-2004 and end-2009. In essence, this suggests Western Europe will see 3G subscriber levels rise to almost equal those of the region's current GSM count, in a time frame just over half that afforded to the evolution of GSM. This represents extraordinary subscriber growth, and presents unique challenges to 3G network planners and RF technology providers.
To meet the fast-growing capacity requirements as GSM subscribers migrate to UMTS, we can expect to see even more advances in base station RF technology. These will almost certainly be founded on two key elements: advanced 'hybrid' (a mix of active and passive components) antenna solutions, plus greater intelligence and control functionality built into the antennas.
While today's antennas are purely passive, tomorrow's antennas will need to integrate active RF conditioning components, such as low-noise amplifiers, multiplexers and filters, with even greater levels of control. Similarly, by providing more onboard intelligence within the antenna, an increasingly broad range of antenna pattern parameters might be adjusted and controlled. This will provide superior levels of flexibility to the network planner.
Over time, we'll also see a vast improvement in W-CDMA network simulation tools. This should result in more dynamic and intelligent network management strategies, and possibly lead to the realisation of the so-called 'dynamic antenna performance control'. Here, the adjustable parameters of the antenna components -- both active and passive -- could be corrected in response to the simulation tools, perhaps even in a closed-loop real-time configuration. It is these types of super-flexible base station RF solutions that will play a significant role in the establishment of UMTS in the longer term.     

Joerg Springer is the Chief Marketing Officer, Radio Frequency Systems (RFS). He can be contacted via: tel +49 511 676 2516; e-mail: joerg.springer@rfsworld.com
www.rfsworld.com

Andrew Beutmueller and Phil Haddock examine the
effectiveness of an automated rights management system

Recently, Assicurazioni Generali, the third largest European life insurance company, controlling 626 companies worldwide, made the strategic decision to implement an Identity and Access Management solution to automate rights management across a range of diverse systems. Beginning at its Swiss division, Generali Group Switzerland, the new system has helped to streamline an increasingly complex IT environment, and to cut operating costs and errors.
A series of mergers and acquisitions brought a number of formerly independent Swiss insurance companies under one roof, resulting in rapid growth in the number of IT users requiring access to a broad range of services, data and applications running on a variety of platforms across the Generali Group Switzerland.
Despite the increased workload resulting from the merger, resource provisioning was still performed manually, using printed paper forms to assign each user a set of access rights to a specific configuration of systems and resources. This system of 'manual rights assignment' worked well enough in the past, but with the merger and today's increasingly complex IT environments, manual rights assignment has become costly, time-consuming and inaccurate.

Role-based provisioning

The objective was, of course, to replace the labour-intensive, system-specific assignment of user rights and permissions then in place at Generali with "a strategic, centrally administered directory service (role concept) based on standards (X.500, LDAP)," according to Jürgen Lorek, Internal Auditor of Generali (Schweiz). The approach referenced an emerging standard known as Role-Based Access Control (RBAC).
After some research, it was discovered that the best option would be to implement a Meta Directory solution that was built around a role-based approach to identity and access management.
Role-based provisioning is based on the RBAC standard developed by the National Institute of Standards and Technology (NIST). The solution deployed at Generali Group Switzerland enables cross-platform provisioning at a level that closely mirrors the organisational structure of the enterprise. The definition of roles, role hierarchies, relationships, and constraints reflects the levels of responsibility and specific operations to be executed by persons in particular jobs. Each role is assigned one or more permissions containing bundles of access rights, and each employee is assigned one or more roles. The specialised Meta Directory solution enables access rights to be granted, refused, withdrawn and monitored dynamically, independently of the platforms and applications used. Once a role-based framework has been put in place for an organisation, the principal administrative actions are the user-to-role (user-to-job) assignments.
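A minimal sketch of these core RBAC relations -- users assigned to roles, roles assigned to permissions, and an access check that walks both -- is shown below in Python. The role and permission names are illustrative and do not reflect the DirX data model.

```python
# Minimal sketch of NIST-style core RBAC: users are assigned roles,
# roles are assigned permissions, and an access check walks both
# relations. Illustrative only -- not the DirX data model.

ROLE_PERMISSIONS = {
    "claims-handler": {"read:contracts", "write:damage-assessments"},
    "sales-admin":    {"read:contracts", "write:orders"},
}

USER_ROLES = {
    "jdoe":   {"claims-handler"},
    "asmith": {"claims-handler", "sales-admin"},
}


def has_permission(user, permission):
    # The principal administrative action is the user-to-role assignment;
    # access rights follow from the role, never from the user directly.
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )


print(has_permission("jdoe", "write:orders"))    # False
print(has_permission("asmith", "write:orders"))  # True
```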

The DirX product

The Identity and Access Management solution implemented at Generali Group Switzerland is based on the following components of the Siemens DirX product suite: 1) DirX V6.0D10, a high-performance LDAPv3/X.500 directory server that stores employee data, the DirXmetahub configuration and all roles, groups and accounts; and 2) Siemens HiPath SIcurity DirXmetahub V6.5B10, a Meta Directory engine supporting HiPath SIcurity DirXmetaRole Version V2.0B00 for provisioning user and access management.
Master identity data is maintained consistently in the DirX directory server and made available centrally. The Meta Directory engine DirXmetahub ensures the automatic synchronisation of data between all connected systems. HiPath SIcurity DirXmetaRole enables the provisioning.
Three separate systems for development, test and production of the DirX-based solution were put in place. Siemens provided the software components and licenses along with project consulting and a standard set of maintenance services.

Putting theory into practice

The Generali Group Switzerland's Legal Protection division offers legal insurance -- it was this part of the company that was targeted for the initial implementation of the solution. It was a prime candidate, as some 55 users required differentiated access to diverse resources and services: sales administration, contract management, and damage-assessment systems (all running under Oracle), and Microsoft ADS/Exchange.
A great deal of preparatory work went into the description of specific job functions and their definition in terms of roles, and a key requirement was to enable HR personnel to perform a 'user-to-role' assignment for each employee. Once the name, address, salary, contract start, effective date of entry, etc. were registered, the HR manager was to assign a role corresponding to a specific job.
At that point, the automatic procedures would take over: the entire user-to-role assignment would be read from the HR database and passed back to HiPath SIcurity DirXmetaRole, where the corresponding assignment of access rights was to take place.

Problem solving in real time

The objective of achieving flawless synchronisation between the meta-directory and the HR management system is simple enough in theory, but making it work in real time was a challenge. Fortunately, the solution is flexible enough to take into account the possibility that customer-specific extensions of DirXmetaRole might be necessary for effective integration in legacy environments.
Moreover, synchronization between the directory and the Generali application systems was critical. Once the role-based rights configurations had been integrated in the DirX meta-directory, they had to be mapped back into the different application systems and resources accurately.
"Our core business is built on the Oracle platform and is structured very similarly across the different applications, so this backwards compatibility with the current database applications in production was absolutely critical," explains Lorek.
Now that the solution is in place, when a new employee enters the system or when the role assignments of existing employees change, all that is required from Generali's HR manager is to check a box on screen. The new software reads the status of all user-to-role assignments -- enabled, added or deleted -- and the result is automatically synchronised with the entire legacy system. As such, "the whole complexity of the paper-based, manual process in place prior to the project has been reduced to a single mouse-click," says Lorek.
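Conceptually, that single mouse-click triggers a fan-out of the changed assignment to every connected system, along these lines. The connector functions here are hypothetical stand-ins for the DirXmetahub synchronisation into the Oracle applications and ADS/Exchange.

```python
# Sketch of the synchronisation step: a single user-to-role change is
# fanned out to every connected target system. Connector names are
# hypothetical stand-ins for the real DirXmetahub synchronisation.

def propagate_assignment_change(user, role, action, connectors):
    """action is one of 'enabled', 'added' or 'deleted'."""
    for connector in connectors:
        connector(user, role, action)


def oracle_connector(user, role, action):
    print(f"[Oracle apps] {action}: {user} -> {role}")


def ads_exchange_connector(user, role, action):
    print(f"[ADS/Exchange] {action}: {user} -> {role}")


# The HR manager's single mouse-click ends up here:
propagate_assignment_change("jdoe", "claims-handler", "added",
                            [oracle_connector, ads_exchange_connector])
```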

Success at hand

The Identity and Access Management solution has been in production in the Legal Protection division for almost a year now. Under the previous manual system, correcting an error in rights assignment cost three or four hours and involved several different people in HR or IT administration, making it highly cost-intensive. And considering the potential damage from misuse of obsolete or erroneous access rights, the risks were equally clear.
"We underestimated the amount of work involved, but we also underestimated the impact our success would have on the organization," said Lorek. "The solution improved the security of our systems; it reduced the risk of misuse and potential damage to the business."
Furthermore, as remaining non-core applications continue to be integrated into the central system, the benefits become clearer, especially in terms of significantly lower administration costs; near-perfect accuracy in terms of moves, adds and changes; and greater confidence in the security of the IT environment and the company's ability to satisfy evolving regulatory requirements.


Jürgen Lorek talks about Identity and Access Management at Generali Group Switzerland

Q: Although you have a background as an IT professional, you also brought a business point of view to the challenge of resource provisioning. Can you give us a little more detail?
Lorek: I was lucky in the sense that I could take on both the IT and the business analysis and bring them together in a form of interdisciplinary thinking.
I studied this problem and, in doing so, added complementary business and financial analysis skills to my professional IT expertise. As such I was able to couch the resource provisioning issue in terms that non-IT colleagues and external solution providers could understand; I chose an object-oriented analysis and design tool known as UML (Unified Modeling Language). With this I was able to analyse and model Generali Group Switzerland in terms of its organisational structure and business processes. The UML model was a real milestone. It was clear that if we could find a product capable of implementing it, we would have a solution.
We began with classical organisational theory and looked at Generali's business in terms of how it was organised and in terms of business processes, as an operational organisation. We evaluated all of the resources available in the company under these two points of view. Although functions often cross departmental divisions, it's very difficult to change the way people traditionally think and work, so we're proceeding on a two-track path.
As far as provisioning is concerned, we've introduced an operational structure. We have Siemens HiPath SIcurity DirXmetaRole set up to map functional responsibilities in terms of roles and corresponding privileges. We've modelled everything that a person does within the enterprise, the day-to-day business in a functional organisation as well as project-oriented business, and together that provides the summa summarum of all rights and privileges for that person.
Q: What sort of impact did this approach have on the project?
Lorek: I think it's one of the main reasons the project has been successful -- the fact that we were able to convince management to look at it as a business issue rather than as a purely technical challenge. It was a business and organisational challenge that could be resolved using IT tools.

    

 
