Analysis

Global Innovation: Building the ICT Future
Conference: 27-30 March 2006; Exhibition: 28-29 March 2006
Business Design Centre, London, England

The 21st Century Communications World Forum conference and exhibition provides a global venue in which service providers, enterprise end-users, and industry analysts will come together to discuss issues, challenges, and opportunities raised by the emergence of next-generation ICT services and applications – as well as IP-based network architectures and technologies.

Produced by the International Engineering Consortium (IEC), this global assembly of telecom leaders will examine the impact of emerging information and communications technology in shaping the evolving digital networked economy.
Special emphasis will be placed on bringing together the three key constituencies shaping the future of information communication technology:
• Network operators, service providers, and application developers
• Chief technology officers, enterprise chief information officers, and end-user professionals
• Corporate strategists, industry analysts, system integrators, and technology thought leaders
As the host-sponsor, BT will open the conference with a keynote address presented by Matt Bross, group CTO, that will outline BT’s visionary plan to create its 21st Century network. This multi-billion dollar project is designed to provide the UK with an IP-based multi-service network that will deliver a wide range of fully converged services to businesses and consumers, including broadband voice, video, data and mobility.
Additional featured keynote addresses include Thomas Ganswindt, member of the corporate executive committee, Siemens; and Terry Matthews, chairman of Mitel.
As part of the executive speaker line-up, a special Plenary Panel on ‘Next-Generation Network Rollout Plans’ will convey the ‘Carrier’s View.’ Speakers include: Mick Reeve (Chairperson) Group Chief Technology Officer, BT; J. Trevor Anderson, Senior VP, Technology, Bell Canada; Massimo Coronaro, Technology Officer, Telecom Italia; Tadanobu Okada, Executive Director, Information Sharing Laboratory Group, NTT; and Berit Svendsen, Executive Vice President, Technology, and Chief Technology Officer, Telenor ASA.
Complementing the educational program is the technology exhibition, featuring more than 70 exhibiting companies offering a wide range of service delivery platforms and innovative solutions on the expansive show floor. Attendees can spend their time walking the exhibit hall, or maximise it by also attending sessions within the educational program.
A new element introduced at this year's program is the IMS Global ComForum, revealing the latest IMS research and developments, applications and implementations from leading industry experts. Delegates will examine the promise of IMS -- myth vs reality -- through a series of dedicated sessions over the four-day program.
The inaugural 21st Century Communications World Forum 2005 drew more than 2,000 registrants to London, eager to discuss the latest advances behind the move toward customer-centric information and communications technologies.                             

For more information on the 21st Century Communications World Forum 2006, visit: www.iec.org

Billing is now widely considered to be strategic -- a key element in the struggle for better customer service and cash flow management. But it has also split into two parts, says Alex Leslie

The cynics amongst us -- and by that I mean those of us who spend too much time at telecoms conferences -- often become philosophical in bars at airports. 'The problem,' we say, after a sip of alleged Chardonnay, 'is that nothing changes.' We then nod, and take another sip, and wonder when the plane will arrive to take us home.
I believe we think this because we are suffering from PowerPoint poisoning, and simply do not recognise the symptoms. We have come to believe that there is nothing new because the slides look the same as they did several years ago. I have been guilty of this. I used to show a slide at billing conferences. It said that the next generation of billing system needs to be scalable and flexible, have far better reporting capabilities, and be properly integrated with customer care. True, but possibly boring -- until I flipped to the next slide, which said that this list of 'requirements' was presented at a conference in 1994.
It is when you rewind to 1994 itself that you see the awesome changes that have actually occurred. A plethora of new services, true competition, almost universal mobile phones, 'free' voice -- all of which would have been greeted, in fact was greeted, by conference audiences back in 1994 as pie in the sky fantasy.
The fact that the slides have not changed much actually means that we got the fundamental 'to do' list right. We will always need greater flexibility and scalability and, to an extent, speed, and certainly greater integration. Those truths are, as Mr Jefferson said, self-evident. It also means that, depending on where you are in the world, and where you are in the development of the telecoms market, you will be somewhere on a 'line' of maturity, where the focus changes depending on whether there is huge subscriber growth, or a more sophisticated, customer-centric, mature marketplace.

Consider billing

Here in Europe, I think billing is coming of age. Billing managers now have regular meetings with CFOs, which is a major breakthrough. Billing, by which I mean the whole revenue management process, not just the system in the middle of the process, is widely considered strategic -- a key element in the struggle for better customer service and cash flow management. It is now a process that is regularly measured, whereas before it was not. There are now teams -- that, as often as not, spring from billing -- that roam the corridors looking for revenue leakage. Revenue assurance is becoming a way of life, not an audit.
Billing has changed in other ways. It has split into two parts. One part is responsible for the process, the whole process. The primary goal of this job is to make the process completely independent of people, who are generally the things that change, mess things up, and make the midnight pizza delivery guys rich. These process people do not care, except in a high-brow intellectual way, about 3G and VoIP and Triple Play, and all the things that we go to conferences to watch slides about. When a new service is launched they want the CDRs, or whatever event record is used, to go through the process smoothly, produce a bill and thus produce money.
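For readers who like to see the plumbing, here is a toy sketch of that 'CDRs in, money out' view of the process. It is not any operator's actual stack; the services, tariffs and record fields are invented purely for illustration.

```python
from dataclasses import dataclass

# Toy tariff table: pence per minute by service type (invented figures).
TARIFF_PENCE_PER_MIN = {"voice": 5.0, "video": 15.0}

@dataclass
class CDR:
    """A minimal call detail record: who used what, and for how long."""
    account_id: str
    service: str
    duration_sec: int

def rate(cdr: CDR) -> float:
    """Price one event; unknown services are flagged, not silently dropped."""
    try:
        rate_per_min = TARIFF_PENCE_PER_MIN[cdr.service]
    except KeyError:
        # A new service reached the pipeline before the tariff table did.
        raise ValueError(f"unrated service: {cdr.service!r}")
    return rate_per_min * cdr.duration_sec / 60

def bill(cdrs: list[CDR]) -> dict[str, float]:
    """Aggregate rated events into a per-account total (pence)."""
    totals: dict[str, float] = {}
    for cdr in cdrs:
        totals[cdr.account_id] = totals.get(cdr.account_id, 0.0) + rate(cdr)
    return totals

print(bill([CDR("A-1", "voice", 300), CDR("A-1", "video", 60)]))
# {'A-1': 40.0}
```

The unrated-service branch is the process person's nightmare in miniature: a service launched without a tariff entry stops the money, not just the slides.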
In this part of billing, this new maturity was hard won. The Telecoms Troubles gave them no capital, fewer people and more responsibility. The days of buying a new system to launch a new service disappeared. The process became king. Many vendors reinvented themselves as revenue assurance specialists. Many operators at this point joined the ranks of the cynical.
The other part of billing is responsible for figuring out whether the process is capable of supporting the new services that marketing wants to launch on the world. In some operators, this role is actually now part of product management or strategy. In one or two operators, this role has the right of veto over a new product that they cannot bill for. This is the person that vendors take out to dinner. The process person would probably join them, but he is too busy shouting at the network people who, once again, upgraded the switches without telling him. More midnight pizzas were delivered.
Whilst this maturity was hard won, it was, at least, won. Looking at the billing industry now, it is a mature industry -- and 10 years ago it was certainly not that.

Can I talk about convergence, please?

I hate to add another favourite from slide packs, but the evolution of the billing process is, to a great extent, being driven by convergence. I cannot remember when I first saw a slide with convergence written on it, but it was certainly back in the days when conference speakers had to take their word processed text to a graphic design shop, wait a week, and then collect a box of 35mm slides. I miss those days -- there was less writing on the slides.
The funny thing is that we are now realising that convergence is about the customer. The headline-grabbing projects involving hundreds of millions of euros, converting businesses to IP, are about offering customers a range of services, cheaper, faster and better than the competition.
For the billing process, this actually means less emphasis on the billing system itself and more of a focus on order processing (being automated at a telco near you), service provisioning and CRM, and integrating these into the whole process, better than before. The goal is to provide a single view of the customer.
The mature billing world, in Europe, is becoming a world where the process is stable and independent of people, and integrated. And because of all this, the customer experience is becoming better.
It also means that if a new system is needed, mature and tough negotiations take place, and this has ramifications which I am not too happy about, but fully understand. The downward pressure on the cost of billing systems means that the resources being ploughed into R&D and new functionality are under pressure. It also means that some vendors decided to provide their own professional consultancy and integration services -- now taken for granted among the operator community -- which left the systems integration community exposed, just when it was already under threat from the fashion for offshoring.

Around the world

I have to confess that once I had decided to provide a round-up of the major regions and where they sit in the 'maturity matrix' of billing processes, I found that my knowledge was not as up to date as it should be, and so, faced with the £64,000 question, I decided to use a lifeline and phone a friend -- well, several, actually.
First I phoned Andreas at Orga Systems to help me out with what is going on in Latin America. In the mobile market, which completely dominates, massive growth is the theme. In Brazil, net subscriber additions for 2004 were just under 20 million. In Argentina the growth rate is 75 per cent; in Colombia, just under 60 per cent. The vast majority of the market is prepaid. Competition is fierce, and thus pressure on ARPU is intense.
The result is that the underlying issues in the region are not too different from the ones we know from the recent past in Europe, although they are happening faster, and all at once: in Europe we had the luxury of seeing the fastest growth period happen during a mainly 'voice' period, whereas in Latin America it is all happening at the same time. Billing processes are therefore still relatively unstable, as one might expect; only the first attempts at measuring and controlling them are emerging, and the keys for success or survival are real-time systems and scalability.
Then I phoned Mike at Portal Software in Cupertino, and asked about the state of the market in North America. He was upbeat. The telecoms market in the US, generally speaking, is improving and is about the three 'C's'. Consolidation is ongoing, on a scale that is awesome, and, as we thought several years ago, is shaking out into the dominance of a very few players. The challenges that consolidation brings in terms of the billing process and the systems that support the process are huge, and generally take time to sort out. Consolidation, as many of you will know, is a real enemy of a stable process!
The second 'C' driving the market in North America is our friend Convergence. It is happening, IPTV is in the wings, and is not only a huge opportunity but a huge challenge. In fact, it is a completely new business. Everything over IP is, as I have said, about the customer, and delivering services better, faster and cheaper than the competition. It also enables innovation in pricing, which brings with it sophistication and the potential for differentiation.
The third 'C' is content. Content is becoming king in North America, driven as much by the fabulous popularity of iPods and games as by anything else. The US, in particular, is now warming to prepaid. It was, as we know, slow to take off, but is now forecast to be the biggest growth area in mobile, helped along by the emergence of MVNOs.
The market in Africa would require a separate article to do it justice, it is simply too complex. However, it would be fair to say that, again, mobile is not only the driving force, but in some cases is driving the economy. The constraints are the lack of capital and the challenges of supporting billing implementations.
I phoned the GBA's Asia team to get an up-to-date view of that huge and varied market. They surprised me by being less upbeat. New contracts for billing vendors are few and far between, and those few are hard won. Content is one area where there is light, but it is not providing many opportunities for innovative and sophisticated pricing and billing. Indeed, the emphasis seems to be on the content provider supplying priced records to the billing system, whether or not the content provider is the dominant partner. In fact, we are now seeing content providers offering value chain pricing as part of their services. Perhaps Asia is once again breaking the mould and bending other people's innovations around its own processes.

In summary

The trends in billing must be broken into two parts. The trends in the process part will be towards more and more stable processes, independent of people and reorganisations. They will also be towards quicker processes -- in mature markets there is now an emphasis on shortening the 'time to cash' as a constant goal. Part of this is integrating the 'front end' of the process better. In emerging markets, stable processes may seem a dream at the moment, but the journey is already starting, and will follow a well-trodden and rocky path.
In terms of billing development, the focus is on convergence, and now, not just on slides, but in the real world. IP will deliver services better, faster and cheaper, and if the process is mature, then the customer experience will genuinely be enhanced.
There is also an opportunity in the mid-market section of our community. Over the past few years the larger billing vendors concentrated on the very top end of the market, while the smaller ones served niche players. That leaves room in the middle, both here in Europe and in North America, where the tried and trusted drivers -- time to market and flexibility -- will create these opportunities.
The winds of change are blowing, and blowing at different speeds around the world. In Latin America, they are blowing hard and fast; in more mature markets such as Europe, they may be easing off, but they still seem to have some surprising eddies in them.

Alex Leslie is CEO, Global Billing Association, and can be contacted via: alex@globalbilling.org

Does your organisation really know everything it needs to know about its assets in general and its fixed assets in particular? Equally to the point, is it accounting for them properly? Nicola Byers investigates

A major new legislative climate in the United States has been ushered in by the Sarbanes-Oxley Act of 2002. This Act has revolutionised attitudes to corporate governance in the States. The Act has immediate consequences for UK organisations that have US parent corporations, but it also has pressing implications even for organisations with no particular US connection. As the Financial Times stated in April 2005:
 "It seems that the more obvious demands imposed by Sarbanes-Oxley in financial accounting -- the expense, the time investment, the extra audits -- are just the tip of the iceberg. The required mix of 'proper' business controls and personal liability is causing a chain reaction that affects Boards, organisational structures, professional advisers and the daily efficiencies of all public companies -- and many private ones, even though technically they are not covered by the Act."
Beyond question, Sarbanes-Oxley places a major new focus on corporate procedures. There are important consequences here for all aspects of an organisation's accounting procedures, and especially for how it accounts for its assets, which are such a major part of any organisation's fundamental structure.
The days when major organisations could confidently expect to be able to handle the accounting of their assets using simplistic in-house systems or databases appear to be coming to an end. In their place, a climate is developing where organisations must be able to bring a new, highly flexible, interactive approach to asset accounting, based around using a specialist asset management solution. The good news is that organisations that adopt this will gain a handle on their assets, and a knowledge of them, that will allow them to work their assets even harder in the future.
Assets are a significant part of an organisation's accumulated wealth and a fundamental resource used to generate its profit. The careful monitoring of assets is both a commercial and a statutory requirement. But how should an organisation best monitor and account for its assets? If you are involved with the management of your organisation's assets, do you know what assets your business holds, where they are located, and what they are currently worth? Do you have the facility to furnish this information for any particular moment in the past that might come under scrutiny? Are you able quickly and efficiently to know what your asset status was a year ago? Does the information you have about your assets come complete with a detailed audit trail? Can you relate the physical asset to a financial record?
Historically, the term 'fixed assets' has been used when discussing asset registers. This was because a 'fixed asset' referred to a purchased item that would be of benefit to an organisation for a fixed term of greater than one year. Before the advent of information technology in the accounting arena, the fixed asset ledger contained a schedule of all major capital purchases made by an organisation. As these would have to be manually depreciated, they generally consisted of large non-portable items such as vehicles, engineering equipment, land and buildings. The management and depreciation of these fixed assets was obviously a resource-intensive process and, as such, the ledger entries were often in summary form and sacrificed detail.
The impact of IT improved this situation. New accounting software products, in replicating the old ledgers, were able to automate some of the processes and reduce the resource issue, but even they had their limitations. And the way assets are accounted for has moved on. Organisations now need to account for (and manage) assets that no longer fit the old 'fixed asset' definition. Today's fixed assets can be small, portable and also intangible. There is a requirement to manage the asset register so as to record all purchase details and accounting history, track movements, audit all events and actions relating to the individual asset and, finally, relate multiple physical assets to a single fixed asset entry in the accounts. This detailed micro-management requirement was often not possible with old accounting software products, and alternatives have had to be found.
For small businesses with a relatively low number of fixed assets or non-depreciating assets, the accounting requirements can be managed by in-house systems or databases. For practical purposes, an organisation turning over less than £5 million annually, or one with fewer than 500 assets, will probably be able to make do with such a system. The proviso should, however, be made that if the organisation is in a strong growth cycle and likely to accumulate a good number of new assets over, say, the next six months, it should think hard about whether a simple accounting record will really meet its asset accounting needs.
Any asset may go missing or be lost. We live in an age when relatively small assets -- laptops, for example -- are extremely valuable. The physical tracking of such assets is a crucial aspect of the quality of an organisation's management information, and can be overlooked due to time constraints or poor monitoring systems. If these requirements for asset monitoring and tracking are not adequately met, the consequences can be extremely serious. The issue of corporate governance is certainly putting pressure on many organisations that have previously put the issue of asset management on the back burner.
Research conducted by Real Asset Management suggests that about 40 per cent of senior managers in major UK organisations are not confident that their registers of fixed assets are up to date. Similarly, a rather alarming 35 per cent believe they have assets missing from their register, while 30 per cent have no facility to track their assets via some form of electronic identification. In practice, such senior managers will typically have a simple accounting record of the assets. But the trouble is, large organisations cannot expect to 'make do' with simple systems for complex asset situations, any more than you can run a major business today using technology no more sophisticated than an abacus.
Ideally, any organisation with more than £5 million annual turnover, or an asset register with more than 500 records, needs to have an asset management resource in place that provides rapid and effective answers to the following key questions:
*  Is the accounting resource linked to the actual asset or assets by means of a tracking system?
*  If the asset is disposed of, how does the finance team know?
*  If the asset is moved, is the finance department told where its new location is and who manages this resource?
*  Does the accounting resource facilitate a physical audit trail of its assets?
In practice, many large UK organisations could probably not answer all these questions with a resounding affirmative. But UK organisations are far from alone in this: fixed asset measurement, management and overall control are often deficient in the US, too. Indeed, the whole matter of the monitoring and management of assets -- whether fixed assets or non-depreciating assets -- is one important area that is being improved in the United States following the passage into law of the new legislation mentioned above, the Sarbanes-Oxley Act.
The Act -- popularly known among regulators as SOX -- is one of the most sweeping and influential pieces of legislation ever brought to law in the US. In a legislative environment where many passionate initiatives to change the law wind up diluted and compromised, SOX received widespread consensual backing, doubtless because of the scars that the Enron and WorldCom scandals left on the US corporate mindset. Among the many vigorous provisions of SOX is a requirement for corporations to monitor much more closely how they record purchases of capital assets. Many US corporations that previously did not have a dedicated system for recording such purchases are now working to address the situation.
Under SOX, every transaction related to capital expenditure must be available for analysis and reporting. This provision has implications for all types of transactions in which an organisation engages and it also has especially important implications for fixed assets, which tend to have a high value.
Fortunately for financial directors, senior managers and anybody else who has a professional requirement for information about their organisation's assets in general, the power of specialist asset management solutions gives financial departments an important -- and in many cases essential -- tool for measuring, managing and monitoring their assets. Without such systems, they would be obliged to try to monitor and account for all their assets very much by the seat of their pants, or by using rules of thumb that do not, by definition, have very much that is scientific about them.
The benefit of a specialist asset management solution is that it offers tracking and inventory control, so that financial records can be related to physical items. Specialist systems also provide an audit trail facility and a centralised asset register so that when items are disposed of or moved, the accounting records are updated. Without such a resource it is difficult to see how a financial department will know when an asset is disposed of or moved. This is rarely information that other departments automatically pass on. It is easy to see how, under these circumstances, records can very easily get out of date.
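To make the idea of relating financial records to physical items concrete, here is a minimal sketch of what a register entry in such a system might hold. The field names and workflow below are hypothetical, not those of any particular product:

```python
from dataclasses import dataclass, field
import datetime

@dataclass
class AssetRecord:
    """Hypothetical register entry linking a physical item (via its tag)
    to its financial record, with location and audit history."""
    asset_tag: str                  # barcode/RFID label on the physical item
    ledger_code: str                # the fixed asset entry in the accounts
    serial_number: str
    responsible_user: str
    location: str
    location_history: list = field(default_factory=list)
    audit_trail: list = field(default_factory=list)

    def move(self, new_location: str, moved_by: str) -> None:
        """Record a move, so finance always knows where the asset is."""
        self.location_history.append((datetime.date.today(), self.location))
        self.location = new_location
        self.audit_trail.append(f"moved to {new_location} by {moved_by}")

    def dispose(self, authorised_by: str) -> None:
        """Disposal updates the register, so the books don't drift."""
        self.audit_trail.append(f"disposed, authorised by {authorised_by}")

laptop = AssetRecord("TAG-0042", "FA-1107", "SN-998877", "j.smith", "London HQ")
laptop.move("Leeds office", moved_by="facilities")
laptop.dispose(authorised_by="finance-director")
```

The point is simply that moves and disposals update the same record the finance team reports from, so the books and the physical estate cannot silently drift apart.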
Overall, organisations simply have to engineer themselves into a position where they can be confident that they do not belong to the 30 per cent or so of UK organisations whose financial directors, according to our research, do not have a good knowledge of their assets. In order to prevent all the problems that go hand-in-hand with inadequate asset registers, organisations ideally need to know all of the following points relating to their assets:
*  Physical location
*  Location history
*  Details of the actual user or manager responsible for the asset or assets
*  The serial number of assets
*  The actual value of the asset or assets
SOX requires companies to prove that the formulas in their in-house systems/databases comply with US GAAP (Generally Accepted Accounting Principles) rules. This can be difficult for in-house systems/databases, as GAAP rules have been known to change frequently. In-house systems/databases for asset accounting also bring some of the following problems:
*  They can only operate as single-user systems, whereas a multiple-user system is likely to be required
*  They provide no cost effective way of building an audit history
*  In most cases they have no data security facility or, if they do, it can be very complex to configure
*  They are, generally, prone to errors
*  In-house systems rarely have any external support
*  They are difficult to maintain and development can be time-consuming
*  It may be difficult to manage large quantities of data using this approach
*  They often demand considerable maintenance time from senior members of the finance department who have other responsibilities.
*  They offer no facility to relate financial records to physical items; that is, they offer no asset tracking.
An additional problem with in-house systems/databases is that the author or controller needs to be available if the system is to be used properly. This obviously causes difficulties if he or she has left the organisation or moved to another role outside the department.
For any organisation, the ideal situation is that it has a detailed, comprehensive and powerful knowledge of its assets. Essentially, an organisation needs a specialist asset management solution that allows it to keep precise records of every significant fact about an asset or assets.
The precise solution you choose for ensuring that your organisation has a top-quality tool in place to measure and manage your fixed assets is obviously up to you. However, ideally it should be able to do all the following:
*  Provide for both the physical and financial control of assets
*  Be able to hold movement history
*  Offer a full audit trail facility
*  Guarantee data security
*  Be multi-company, multi-currency or multi-book if needed
*  Be capable of calculating depreciation via different methods: straight line, reducing balance, etc (the two named methods are sketched after this list)
*  Provide high-quality reports and have a facility to produce user reports
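Since the two depreciation methods named above are just formulas, a short sketch shows the difference. The asset cost, residual value, rate and useful life here are invented:

```python
def straight_line(cost: float, residual: float, years: int) -> list[float]:
    """Equal charge each year: (cost - residual) / useful life."""
    charge = (cost - residual) / years
    return [round(charge, 2)] * years

def reducing_balance(cost: float, rate: float, years: int) -> list[float]:
    """A fixed percentage of the remaining book value each year."""
    charges, book_value = [], cost
    for _ in range(years):
        charge = book_value * rate
        charges.append(round(charge, 2))
        book_value -= charge
    return charges

# A 10,000 asset over four years (invented figures):
print(straight_line(10_000, 2_000, 4))    # [2000.0, 2000.0, 2000.0, 2000.0]
print(reducing_balance(10_000, 0.33, 4))  # [3300.0, 2211.0, 1481.37, 992.52]
```

Straight line spreads the charge evenly over the asset's useful life; reducing balance front-loads it by charging a fixed percentage of the remaining book value, which is why the annual charges shrink.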
Conversely, if organisations don't relate accounting records to physical assets, they can:
*  Wind up with inaccurate data
*  Lose substantial sums of money by inefficiently managing assets
*  Fail crucial audits
*  Fail to meet external requirements such as UK Government guidelines or the provisions of SOX
*  Fail to qualify for additional funding
*  Lose shareholder value
*  Fail to identify redundant assets
The asset climate is changing, and organisations need asset management resources that meet the more stringent requirements of this new climate. The bad news is that some investment is likely to be required to introduce a new accounting system that meets the varied and complex needs of your asset register. The good news is that the new accounting system will prove a truly powerful resource, one that deals with a potentially thorny area of an organisation's activities and will also help catapult your business to greater success.
 
Nicola Byers is Marketing Manager of Real Asset Management, and can be contacted via tel: +44 1689 892 100; e-mail: nbyers@ramplc.com

Louise Penson describes how the TDC Group in Denmark put together a plan to control customer exposure

How many telecommunication operators -- or any organisation for that matter -- can tell you at any point in time what their customer exposure is? A pipe dream perhaps? Not according to TDC, Denmark's leading supplier of telecommunication services, who are coming to the end of an ambitious project designed to give them a comprehensive view of the financial risk posed by their customers.
Established in 1990, the TDC (formerly Tele Danmark) Group comprises a range of business lines including landline telephony, data communications, Internet services, mobile and cable, as well as interests in a number of other European telcos. Towards the end of 2003, TDC recognised the need to improve their credit and fraud management processes, which at the time were based upon traditional monitoring of individual customer entities, and limited to selected parts of the group's major business areas.
TDC established a project team with the participation of key employees from TDC Solutions, TDC Mobile, IT and shared service functions. Because the project evolved to be cross-organisational, it was anchored in TDC's headquarters.
The scope of the project was extensive, comprising:
*  Network surveillance and fraud control
*  Bad debt
*  Procedures and governance rules
The main aim of the project team was to attain a complete and comprehensive view of TDC's customer exposure at any point in time -- as viewed from the following standpoints:
*  From individual telephone numbers, to the customer as a whole
*  From individual customers, to the customer base as a whole
*  From each individual business unit, to the TDC group as a whole [1]
The project team realised early on that efficient monitoring of customer exposure demanded organisational restructuring of the fraud and credit management functions. Previously, fraud management was handled by the network unit, and credit management was handled by the business units. The former decentralised handling of customer exposure was deemed to be incongruous. 
The project team suggested the following as the ideal organisational structure:

[Figure: the project team's suggested organisational structure for the fraud and credit management functions]

It was believed that this structure would facilitate fast and efficient interaction between the various departments, supporting efficient monitoring of customer exposure. The next step comprised the consolidation and streamlining of the credit and fraud functions, standardisation of processes and procedures within those areas, education of key employees and the identification of requisite IT-support.
TDC put together specification documents that were initially aimed at finding two separate IT solutions, one for credit and one for fraud management. Unable to find a credit management solution that met their requirements, they commissioned risk management solution provider Neural Technologies to work with them to transform the requirement specification used for the RFQ process into an operational requirement specification -- bearing in mind that the solution needed to be a combined fraud and credit management system or, as TDC prefer to call it, a 'Customer Exposure System', to support the revised organisational structure.

The Customer Exposure System

TDC's specification for the Customer Exposure System was extensive, comprising:
*  The ability to detect new types of fraud (for which they stipulated a neural network system)
*  Improved credit rating on new activations
*  Advanced warning of bad debt and prioritisation of alerts/cases. 
Crucially, TDC wanted to ensure that the acceptance of any large exposure was based on approved business procedures. The functionality of the new system will put an end to traditional credit monitoring, which is based on fixed credit lines for each type of customer and often, in turn, on external parameters. Traditional credit monitoring typically acts on alerts triggered by high usage, ignoring information about the customer's previous usage and payment behaviour.
The system allows TDC to set an individual exposure limit for each customer based on that customer's behaviour. The exposure limits ensure that alerts are raised when customers change usage patterns or payment behaviour. Furthermore, this functionality shapes the prioritisation of alerts, and the number of alerts that have to be investigated, which will be determined on the basis of actual customer exposure. Finally, neural functionality is expected to support the identification of customers' usual usage and payment patterns.
These exposure limits will be calculated on a daily basis for each customer (person or legal entity) and automatically adjusted over time based on the customer's usage, provided that the customer meets certain criteria (i.e. no adverse credit history with TDC, invoices paid on time, etc). Adjusting exposure limits based upon usage and payment behaviour will enable TDC to achieve more efficient monitoring of customer exposure, which may even increase customer satisfaction.
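TDC's actual model is not public, so the following is purely a hypothetical illustration of the mechanism described above: a per-customer limit, recalculated from behaviour, with billed and unbilled usage both counted against it. All names, multipliers and figures are invented:

```python
from dataclasses import dataclass

@dataclass
class CustomerHistory:
    avg_monthly_usage: float   # average billed usage, e.g. in euros
    on_time_payments: int      # consecutive invoices paid on time
    adverse_credit: bool       # any adverse credit history with the operator

def exposure_limit(h: CustomerHistory) -> float:
    """Hypothetical daily recalculation: start from recent usage and
    loosen the limit for customers with a clean payment record."""
    headroom = 1.5                               # base multiplier (invented)
    if not h.adverse_credit:
        headroom += min(h.on_time_payments, 12) * 0.05
    return h.avg_monthly_usage * headroom

def check_exposure(billed: float, unbilled: float, h: CustomerHistory) -> bool:
    """True means raise an alert for investigation. Note that unbilled
    usage counts towards exposure, as TDC's specification requires."""
    return (billed + unbilled) > exposure_limit(h)

h = CustomerHistory(avg_monthly_usage=80.0, on_time_payments=10,
                    adverse_credit=False)
print(exposure_limit(h))               # 160.0
print(check_exposure(120.0, 55.0, h))  # True -> raise an alert
```

The design point is that the limit tracks each customer individually, so a well-behaved heavy user is not flooded with alerts, while a sudden change in pattern is caught early.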
TDC also set out to use this prior knowledge of customer exposure and behaviour to protect customers from unintended increases in usage -- which often result in hefty invoices and possible default on payment. TDC stipulated that the solution should provide full and complete segregation of customer data between the different business lines, in order to comply with Danish law [1]. Neural Technologies were able to create individual user profiles and segment these profiles, along with their associated customer data, to meet this challenge.
TDC Solutions and TDC Mobile have a large number of systems containing the customer information needed to calculate customer exposure. Telecommunication companies often monitor customer exposure on the basis of billed traffic alone; TDC includes unbilled usage and more to get a total overview of customer exposure. The new customer exposure system aims to bring all that data together from the disparate systems, giving TDC a clear view of the exposure posed by any customer.
TDC expect the project to be an immense success. One of the major factors behind this is the resounding support from TDC's executive management, and the strong leadership that was able to get 'buy-in' and consensus from the various business units.
TDC's Senior Audit Manager, Marianne Holmbjerg, who was a key proponent and major driving force behind the project, notes: "At the start of the project the different business units had disparate working methods due to decentralisation. Now the business units are working with one common goal. We are expecting to reduce duplication of effort and to optimise and standardise our processes for customer response and monitoring of customer exposure. We now have the basis for an increased awareness of risk and have improved the sharing of knowledge between various functions -- Revenue Assurance, Fraud Management and Credit Management -- in each legal entity.
"Our customer exposure monitoring will cover all categories of customers. In addition, we will be able to monitor and assess fraud and credit risk on new services and price plans. Overall we have enhanced data visibility leading to a vast improvement in decision making".
Paul Bowler, Deployment Manager for Neural Technologies, adds: "The fact that fraud and credit risk is now managed within a single system gives TDC a holistic view of any issues, and means that potential losses are more likely to be identified -- particularly given that there is often crossover between the two functions. For example, what starts as a fraud can end up as a credit alert if the expected exposure is exceeded due to high usage. Furthermore, the solution ensures TDC fulfil their accounting principles by providing clear fraud and bad debt definitions."
The new system is supported by TDC's existing processes, which comprise, for example, credit vetting on new activations, monitoring of accounts receivables, dunning processes and monitoring of network performance. At the moment the system is in a testing phase on live data and is expected to be fully implemented by the end of the year. The user acceptance tests of the "Customer Exposure System" on synthetic data gave rise to almost no corrections. 
The Customer Exposure System initially covers the business units of TDC Solutions and TDC Mobile. It is envisaged that TDC's other business units will be incorporated over time.

[1] This ensures that the organisation works within the constraints of current Danish law, which prohibits the exchange of customer information between TDC's separate legal entities.

Louise Penson, Neural Technologies, can be contacted via tel: +44 1730 260256;
e-mail: louise.penson@neuralt.com

High-tech industrial espionage is on the increase, but there are ways of preventing the 'baddies' from penetrating IT systems, as Calum Macleod explains

Those of you who follow the news may have come across the recent story of spy software discovered at some of Israel's leading companies, which reads just like the spy stories we've been fascinated by for years. Indeed, if it weren't for the sacrifice of the likes of Roger Moore and Pierce Brosnan, who knows where we would be today.
But that would be to miss the point completely. Firstly, the imagined villains are in fact the victims. More importantly, the prevalence of spy software in Israeli companies came to light as a result of one of the most comprehensive investigations of computer crime ever undertaken. The Trojan had been introduced by providing companies with contaminated files, or by sending contaminated e-mail messages to them. Worryingly, these methods evaded all the security measures in place at the infected companies.
Today, our businesses depend on the exchange of electronic information with our business partners, but many of the mechanisms that are used still rely too much on the goodwill of those business partners, or the integrity of the systems that they use.
Two of the most commonly used security measures, FTP and Digitally Signed e-mails, using technologies such as PGP, are not really equipped to deal with this type of situation. In Israel, e-mails were being received from trusted business partners, so digital signatures on the e-mail would possibly be considered trustworthy. In the case of files being shared using systems such as FTP, the presence of malware detectors was unable to identify anything inappropriate.
The bottom line is that we frequently depend too much on the trustworthiness of those we deal with, and unless we take appropriate measures to handle the type of eventualities described above, we are leaving ourselves vulnerable.
So are there measures we can take?
Here are a number of suggestions that might help:
1. Do not expose your internal networks to external parties. The process of transferring files in and out of the enterprise must be carried out without exposing and risking the internal network. No type of direct or indirect communication can be allowed between the partner and the enterprise network.
2. Ensure that the repository for files being moved back and forth is secure. While information is waiting to be retrieved by the enterprise or sent to the business partner, it must reside in a secure location. This is especially critical when the intermediary storage is located on an insecure network, such as the enterprise's DMZ, outsourced site, or even the Internet. Additionally you should take steps to define what format files will have, and to ensure that they can only be deposited if they are virus free.
3. The environment for exchanging data should be a sterile environment. Encryption and other security mechanisms are not helpful if the security layers where the data is being stored can be circumvented. Encryption is good for confidentiality, but does not protect data from intentional deletion or accidental modification. In order to build multi-layered security, a sterile environment must exist to accommodate and protect the security infrastructure. Creating such a sterile environment requires the creation of a single data access channel to the machine, and ensuring that only a strict protocol that prohibits code from entering is available to remote users. Many file exchange technologies do not run in sterile environments. For example, FTP servers -- a common method -- are frequently nothing more than applications running on insecure platforms.
4. Protect your data when it is at rest. The cornerstone of protecting stored data is encryption, which ensures that the data is not readable and thus maintains its confidentiality. But encryption that is burdensome to manage is ineffective. A common approach for many organisations is a public/private key model, but this is generally considered ineffective here because of the enormous effort needed to maintain such a system. A symmetric encryption model gives a manageable and effective method of securing the data (points 4 and 8 are sketched in the code example after this list).
5. Data must be protected from deletion and tampering. The protection of data by encryption is simply one part of the problem. Files may be accidentally or intentionally deleted or changed. Additionally you need to ensure that data cannot be tampered with.
6. Ensure that you are able to audit and monitor all activities. Comprehensive auditing and monitoring capabilities are essential for security for several reasons. Firstly, it allows the enterprise to ensure that its policy is being carried out. Secondly, it provides the owner of the information with the ability to track the usage of its data. Thirdly, it is a major deterrent for potential abusers, knowing that tamper-proof auditing and monitoring can help in identification. Finally, it provides the security administrator with tools to examine the security infrastructure, verify its correct implementation and expose inadequate or unauthorised usage.
7. End-to-end network protection. Security must also be maintained while the data is being transported over the public network. The process of transferring data must be, in itself, secure, and there are several factors that influence the overall security of data transmission. As data transfer is an essential part of a larger business process, it is critical to be able to validate that this step in the process was executed correctly. This requires the solution to provide auditing features, data integrity verification and guaranteed delivery options. Transmitted data should be automatically digitally signed, ensuring that delivery is complete and un-tampered.
8. Performance is a major issue in many networks, especially when using the Internet, where service levels are difficult to guarantee. When there are large volumes of data and the number of recipients is high, it is critical to ensure that performance is optimised. Compression should be deployed to reduce file size and, since network availability and reliability may disrupt the transfer process, automatic resume from the last successful checkpoint should also be a standard feature.
9. Ease of Integration with existing business processes. File transfer is usually part of a larger business process and needs to integrate seamlessly. This demands the ability to automate the file transfer process and thus integrate it with the existing business processes. In order to make this integration as simple and as seamless as possible, the file transfer solution must have an extremely flexible and diverse interface, providing transparent integration. This also minimises the amount of human intervention and, as a result, can improve overall security by reducing the possibility of tampering with your data.
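As an illustration of points 4 and 8 -- symmetric encryption at rest and checkpointed resume -- here is a minimal sketch in Python. It uses the widely available `cryptography` package's Fernet recipe as one possible symmetric cipher, copies between local files rather than over a network, and the file names are invented; it is a sketch of the idea, not a hardened product:

```python
import os
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_at_rest(path: str, key: bytes) -> str:
    """Point 4: symmetrically encrypt a file while it sits in the
    intermediary repository (Fernet is one off-the-shelf choice)."""
    with open(path, "rb") as f:
        token = Fernet(key).encrypt(f.read())
    out = path + ".enc"
    with open(out, "wb") as f:
        f.write(token)
    return out

def transfer_with_resume(src: str, dst: str, chunk: int = 64 * 1024) -> None:
    """Point 8: chunked copy that resumes from the last checkpoint.
    (A real solution would send chunks over the network instead.)"""
    ckpt = dst + ".ckpt"
    done = int(open(ckpt).read()) if os.path.exists(ckpt) else 0
    with open(src, "rb") as s, open(dst, "r+b" if done else "wb") as d:
        s.seek(done); d.seek(done)
        while data := s.read(chunk):
            d.write(data)
            done += len(data)
            with open(ckpt, "w") as c:   # record progress after each chunk
                c.write(str(done))
    os.remove(ckpt)                      # transfer complete: clear checkpoint

# Demo: create a sample file, encrypt it, then 'transfer' it locally.
os.makedirs("outbox", exist_ok=True)
with open("payroll.csv", "wb") as f:
    f.write(b"employee,amount\n")
key = Fernet.generate_key()              # must itself be stored securely
transfer_with_resume(encrypt_at_rest("payroll.csv", key),
                     "outbox/payroll.csv.enc")
```

A real deployment would add the other points on the list -- tamper-evident audit records of each transfer, integrity verification of the received file, secure storage of the key -- but the shape stays the same: the file is unreadable while it waits in the repository, and an interrupted transfer picks up where it left off rather than starting over.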
But of course, no one is interested enough in what your business is doing to waste a few minutes planting some spyware in your company. After all, it only happens in the movies!

Calum Macleod is the European Director for Cyber-Ark and can be contacted via tel: +31 40 2567 132;  e-mail: calum.macleod@cyber-ark.com 
www.cyber-ark.com

Organisations are coming under increasing regulatory pressure to ensure stricter policy towards corporate governance. Michael Burling looks at what companies can expect

In the wake of the Enron and WorldCom accounting scandals, the regulations an enterprise implements to ensure its integrity are coming under increasing scrutiny. This has given rise to a growing number of initiatives -- such as Basel II, the Sarbanes-Oxley Act and the new Companies Act -- all designed to ensure that high standards of corporate governance become part of day-to-day business culture.
Basel II, the forthcoming protocol for the financial sector, is designed to replace the 1988 Capital Accord. It recognises that managing and controlling financial risk and operational risk, such as IT, is an integral part of corporate governance and, as such, obligates companies to assess their vulnerability and make it public.
Basel II is based on three main pillars that allow effective evaluation of the risks financial institutions face: minimum capital requirements; supervisory review of an institution's capital adequacy and internal assessment process; and market discipline through effective disclosure, to encourage safe and sound banking practices.
Financial organisations that do not provide appropriate details must set aside 20 per cent of their revenue in order to cover losses or risk being prevented from trading. The first phase of Basel II will come into effect at the end of 2006, with the more advanced elements planned for implementation at the end of 2007.
The furthest-reaching of these regulations is the Sarbanes-Oxley Act, which requires companies to comply with challenging new standards for the accuracy, completeness and timeliness of financial reporting, while increasing penalties for misleading investors. The Act, which applies to all companies (and their subsidiaries) on the US public markets, protects the interests of investors and serves the wider public interest by outlawing practices that have proved damaging, such as overly close relationships between auditors and managers. The law includes stiff penalties for executives of non-compliant companies, including fines of up to $5 million and up to 20 years in prison per violation.
The forthcoming Companies (Audit, Investigations and Community Enterprise) Act is designed to help UK firms avoid the much-publicised accounting and auditing problems experienced by companies such as Enron, WorldCom and Parmalat. The Bill, which was mentioned in this year's Queen's Speech and will be debated in this session of Parliament so that it can come into force early next year, will impose new measures to ensure that data relating to trades, transactions and accounting throughout an organisation is fully auditable.
With reference to the Companies Act, Department for Trade and Industry minister Jacqui Smith has said: "We want the UK to have the best system of corporate governance in the world. There is no denying that financial markets around the world have been badly shaken by the corporate failures of the last few years.
 "This Bill completes a comprehensive package of measures aimed at restoring investor confidence in corporate governance, company accounting and auditing practices here in Britain. Its aim is to raise corporate performance across the board and beyond.
 "The Bill tightens the independent regulation of the audit profession and strengthens the enforcement of company accounting, both concerns highlighted by the Enron and Worldcom scandals. It gives auditors greater powers to get the information they need to do a proper job, and increases company investigators' powers to uncover misconduct." 

Network security

Basel II, the Sarbanes-Oxley Act and the Companies Bill all highlight the fact that board directors and executive management have a duty to protect the information resources of their organisations. As such, network security -- preventing unauthorised access to information and data -- is of the utmost importance, and the most effective way of achieving it is to deploy a provisioning solution that allows the enterprise to determine who has access to which applications, and when.
However, implementing an identity and access management programme that ensures the correct level of security and internal controls over key information and data can be a difficult task for many large organisations.
Often, the systems and access policies in use today were developed many years ago, when security was not necessarily the highest priority. Not only are these legacy systems now unsuitable for use, but many of the policies associated with them have not been reviewed since being implemented, and access is granted either manually or by way of 'home-grown' development.
Furthermore, many of the systems were not developed to cater for temporary changes, such as the provisioning and de-provisioning of contract workers, or to account for a member of staff on leave. Adding to the problem is the fact that companies often have myriad systems and access policies, which may have been merged with another organisation's policies, systems and architectures.
These issues are now major problems that need to be addressed urgently. As well as the need to comply with corporate governance regulations, the situation has also given rise to an increased security threat; a fact highlighted by the Financial Services Authority's Financial Crime Sector Report: Countering Financial Crime Risks in Information Security. 

Secure enterprise provisioning

The latest enterprise provisioning technology allows organisations to alleviate these problems through centralised management of IT systems and applications, and the users who access them. Enterprise provisioning solutions, which automate the granting, managing and revoking of user-access rights and privileges, solve the problems created by complex user bases and IT infrastructures by enforcing policies that govern what users are allowed to access and then creating access for those users on the appropriate systems and applications.
The solution can execute provisioning transactions dynamically, based on the nature of the request, and then initiate the approval workflows defined by the relevant policy. It will also provide robust reporting that enables the IT department to better manage user access rights from a global view. For example, systems administrators can view who has access to particular systems, or the status of any individual access request (add, move, change, delete), in real time.
The best of the new breed of provisioning systems enforce organisational policies designed to ensure that financial enterprises comply with regulatory requirements by governing who can access particular systems and the information they contain. Reporting and auditing capabilities enable the organisation to demonstrate compliance by listing who has access to protected systems and reporting on how the access was granted and that appropriate approvals were obtained, thus demonstrating that proper policies designed to comply with regulations are being followed. The software can also demonstrate that users who have left the organisation have had access revoked from all the systems to which they were previously authorised.
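The mechanics described above -- policy-governed grants, auditable approvals and leaver revocation -- can be illustrated with a small sketch. This is a generic, hypothetical illustration of the pattern, not Thor Technologies' product or API; the roles, systems and log format are invented:

```python
import datetime

# Hypothetical role-based policy: which roles may access which systems.
POLICY = {
    "trader":  {"order-entry", "market-data"},
    "auditor": {"general-ledger", "audit-reports"},
}

audit_log: list[dict] = []          # in a real system: a tamper-evident store
access: dict[str, set[str]] = {}    # current entitlements per user

def grant(user: str, role: str, system: str, approver: str) -> bool:
    """Grant access only if the role's policy allows it; log either way."""
    allowed = system in POLICY.get(role, set())
    audit_log.append({
        "when": datetime.datetime.utcnow().isoformat(),
        "action": "grant", "user": user, "system": system,
        "approver": approver, "result": "ok" if allowed else "denied",
    })
    if allowed:
        access.setdefault(user, set()).add(system)
    return allowed

def deprovision(user: str) -> None:
    """Leaver process: revoke everything, with an audit record per system."""
    for system in access.pop(user, set()):
        audit_log.append({"action": "revoke", "user": user, "system": system})

grant("jdoe", "trader", "order-entry", approver="line-manager")
grant("jdoe", "trader", "general-ledger", approver="line-manager")  # denied
deprovision("jdoe")
```

The audit log is what makes the compliance story work: every grant, denial and revocation is recorded along with who approved it, so the reports regulators ask for fall out of data the system already keeps.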
These capabilities not only make regulatory compliance straightforward and easy to manage, but ensure increased productivity. Users can be connected to the resources they need to be productive in a fraction of the time, cost and effort previously required. Enterprises can compress the user set-up process from weeks to minutes and application integration from months to just days. In addition, the IT department's own productivity will increase dramatically as resources are freed up from the time-consuming tasks of managing user access and building integrations to managed systems and applications.
By ensuring regulatory compliance and at the same time reducing IT costs, secure enterprise provisioning solutions are sure to evolve from the great opportunity they currently present into a critical element of the IT infrastructure of successful businesses.

Michael Burling is EMEA managing director of Thor Technologies and can be contacted via tel: +44 1932 268 456; e-mail: michael.burling@thortech.com
www.thortech.com

The telecoms industry has been working to develop an architecture that could bring together the increasingly complex elements within the network. And finally, with IMS, it may well be succeeding, claims Grant Lenehan

For the last twenty years or so, the creation of an infrastructure that could support a 'network of networks' has been the long-term vision of many industry bodies, service providers and vendors. The aim has long been to create an agnostic environment that allows users to interact with services, whenever and wherever they are. Initiatives such as the International Telecommunication Union's (ITU's) IMT-2000 directive pointed the way, although the 2G and ISDN technologies it was designed to use now look rather dated in the face of the overwhelming success of IP.
While technologies, regulatory conditions, and operators' business models have changed dramatically, overall industry objectives have not. Confronted by continually increasing complexity in devices, protocols and applications -- and by the need to inter-work across multiple network boundaries -- the telecoms community has been working hard to develop an architecture that could bring these together in the simplest way possible. And finally, with the IP Multimedia Sub-system (IMS), it may well be succeeding.

Mobile first

The mobile community has got there first with the IP Multimedia Sub-system, mainly as a result of the economic downturn that limited fixed-network investment at the start of this decade. 3GPP, the main global co-ordinating body for 3G network development, initiated the work on IMS, which would act as the standard for the converged core network of the future.
With many fixed network operators now starting to confront similar issues, IMS-based solutions are also becoming attractive to them as they face a future based on offering the 'triple play' of integrated voice, data and content services. In this context, IMS is expected to play a major role in driving the continued convergence between the fixed, mobile and wireless sectors, and greatly simplifying usability from the end user's perspective.
IMS is rapidly gathering pace, with some systems planned to go live in late 2005. One of the significant drivers for IMS adoption is its potential for rationalising the heterogeneous network and service infrastructures that have been inherited by multinational mobile operators as a result of the industry consolidation of recent years.
Particularly acute is the issue of continued inter-working with legacy PSTN and cellular infrastructures. Investment around the world in both access and switching equipment has been enormous over the last century, and it will be impossible to replace it completely for many decades. In fact, significant growth is still underway in circuit-switched cellular networks. For this reason, inter-working between the two domains will remain an important issue for the foreseeable future.
If truly global brands are to be established, they must offer consistent services across networks and global boundaries. In reality, this consistency must extend to the methods by which services are created, delivered and managed. The use of a single and coherent -- yet highly distributed -- architecture is desirable if systems integration and legacy support costs are not to become unmanageable.
The previously 'flat' structure of traditional telecoms networks is being replaced by an open, layered model that allows the delivery of richer multimedia services to a variety of devices from a variety of sources. Future services are being built around IP as the transport protocol, supported by the Session Initiation Protocol (SIP) to control VoIP and multimedia sessions, and Diameter for handling customer authentication and billing procedures. Building on these, IMS has been designed to support other relevant protocols such as HTTP, Web services and Parlay, while also incorporating the work being done by the Open Mobile Alliance (OMA) in the applications layer.
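
For readers unfamiliar with the protocols named above, the following is a minimal sketch, in Python, of the shape of the SIP INVITE request that sets up a session in this layered model. All addresses, tags and header values here are hypothetical, and a real IMS INVITE carries many additional headers and an SDP body:

def build_invite(caller: str, callee: str, call_id: str) -> str:
    # SDP body omitted for brevity, hence Content-Length: 0.
    return (
        f"INVITE sip:{callee} SIP/2.0\r\n"
        "Via: SIP/2.0/UDP client.example.net;branch=z9hG4bKhypothetical\r\n"
        f"From: <sip:{caller}>;tag=12345\r\n"
        f"To: <sip:{callee}>\r\n"
        f"Call-ID: {call_id}\r\n"
        "CSeq: 1 INVITE\r\n"
        "Contact: <sip:client.example.net>\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )

print(build_invite("alice@example.net", "bob@example.org", "a84b4c76e66710"))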

All-packet core

At the heart of the IMS is an all-packet core that fully supports the ever-growing diversity of access technologies including 2G, 3G, WiFi and WiMax -- as well as the still largely open concept of '4G'. Supporting this, a number of other standards bodies are also developing appropriate extensions to allow IMS to inter-work with other access technologies such as xDSL, PacketCable (DOCSIS) and fibre in the local loop.
The primary purposes behind IMS are to enable a richer set of services, as well as facilitate the seamless convergence of all the communications services that we presently use -- but which are currently partitioned by the nature of the networks that they run on. While we've become used to using the fixed Internet for some transactions, our mobile handsets for others and so on, this silo concept is increasingly inefficient and expensive for both user and service provider.

Moving up the value chain

With traditional voice revenues under constant erosion, it's essential that service providers of all types are able to move up the value chain, away from basic connectivity and towards more advanced communications services that include multimedia, messaging, business and lifestyle applications. IMS has been specifically designed to allow this type of rich interaction between services, allowing users to set up voice or multimedia sessions on the fly, exchange content and messages in highly flexible ways, direct fixed line voicemails to mobile in-boxes, or use presence and availability information to direct calls to the most appropriate person within an enterprise.
IMS also supports the core next generation network objective of openness and transparency. On one hand, the presence of standards ensures that multi-vendor purchasing strategies can be pursued without an accompanying rise in integration overheads. On the other, commercial relationships with content and application owners and aggregators can be protected within a secure framework for financial transactions, supported by underlying techniques to guarantee quality of service across multiple network and operational domains.
The changing role of industry standards within telecoms is particularly important here. In direct contrast to those of the IT industry, communications standards have generally emerged through consensus, facilitated by the workings of national and international industry bodies like the ITU. As the world's networks shift towards becoming open platforms, new technologies must be integrated at an ever-faster rate to achieve continued competitive advantage, placing a strain on increasingly fragile and multi-sector standards processes.
By providing what is, in effect, a common and open applications platform for both service providers and third parties to use, IMS goes a long way towards helping the telecommunications industry take its first steps towards becoming more commercial and less utility-based.
IMS will have an enormous impact on how the communications industry actually makes money in the future and, just as importantly, on how it will protect its traditional revenues from attack. While mobile service providers in particular have always been sensitive to the loss of revenues to third parties, they must reinvent themselves to add more value to transactions in a variety of ways to help both themselves and their business partners.

Opening up the value chain

IMS has a unique ability to open up the value chain while simultaneously allowing the network operator to retain control of certain essential value-added functions. Telcordia, in fact, is concentrating much of its development on these 'value added' functions, often referred to by OMA as 'service enablers.' Since Telcordia believes in open ecosystems, and that a holistic approach is the best way to further the objectives of IMS, we are actively contributing to the development of these specifications within OMA and 3GPP.
IMS supports a significant number of value-added functions both within the network and also within the business models that are the justification for the current interest in IMS. These functions all share the attributes of being common across most applications, and being far easier to implement within the network than hundreds or thousands of times within each application.
Firstly, there is presence and availability information; with the network knowing whether users are available for calls and what device they are using at that particular moment, it becomes possible to offer premium services to both private and business customers that ensure that calls or transactions always get through to the appropriate person or device.
Then there is location information; if the network knows where the user is, a broad portfolio of location-specific services and applications can be offered to the customer, extending to special promotions in shopping areas or traffic and weather alerts. Here, IMS can be combined with the availability of GPS and other positioning technologies to finally make location-based services a commercial reality.
Security and risk management are also both important to operators; sensitivity to security vulnerabilities is finally becoming a serious issue for both business and private users, as viruses that target mobile devices and the impact of Denial of Service (DoS) attacks hit the headlines. For end users, the emergence of IMS means they can use the network in the knowledge that their data is secure from prying eyes.
Shared (or 'common') user data and profiles represent a valuable opportunity to simplify the development and usability of new services; service providers can extend and leverage the increasingly wide range of customer information that will be needed to support advanced services. It also opens the network to end users, letting them express their own service preferences, order their own subscriptions and appear on specific group lists.
Then there's flexible charging. The highly diverse, multimedia nature of NGN services is about to revolutionise billing models, with service providers requiring almost infinite flexibility in how they package and price their services to meet ever smaller market niches and support far more complex relationships with third parties. IMS, with its well-defined session model and value-added functions, gives operators both ways to add value and flexible methods to charge for services. Charging may, in fact, be the most significant difference between IMS networks and other, 'dumber' IP networks. IMS provides a framework to simplify all of these procedures, making it easier to associate particular service quality parameters with specific customers -- to create gold and silver service grades, for example -- or to rapidly create cross-charging and payment relationships with partners, such as TV programmes for televoting, or the original owners of brands and content, such as film studios.
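
By way of illustration only -- the tariff table, service grades and partner shares below are invented, not any particular operator's model -- a session-rating function of the kind described might look like this in Python:

# Hypothetical tariffs: per-minute prices for sessions, flat fees for events.
TARIFF = {
    ("video_call", "gold"):   0.10,
    ("video_call", "silver"): 0.15,
    ("televote",   "gold"):   0.50,
    ("televote",   "silver"): 0.50,
}
PARTNER_SHARE = {"televote": 0.40}  # share settled with the content partner

def rate_session(service: str, grade: str, minutes: float = 0.0) -> dict:
    price = TARIFF[(service, grade)]
    charge = price * minutes if service == "video_call" else price
    partner = charge * PARTNER_SHARE.get(service, 0.0)
    return {"charge": round(charge, 2), "partner_settlement": round(partner, 2)}

print(rate_session("video_call", "gold", minutes=12))  # charged per minute
print(rate_session("televote", "silver"))              # 40% settled with partner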

Monetising every transaction

It's arguable that IMS could be interpreted as the 'IP Metering System' (IMS!), given its ability to track and charge for every conceivable transaction that takes place, irrespective of whether this is via standard credit or prepaid systems, or through service-specific micro-payments. This helps communications service providers and network owners to retain their market dominance and opens up bandwidth for the flow of money, as well as data, once the services themselves have been created. Of course, the next-generation charging systems must take advantage of IMS' inherent flexibility -- legacy systems in IMS will simply become barriers to innovation.
In fact, service creation was often a problematic area in the days of Intelligent Networks, but IMS is allowing the drag-and-drop creation of new, multi-network and multi-protocol services and applications by non-technical staff, driving the rapid, low-cost prototyping and introduction of new services. In turn, the revenue opportunities created by IMS are readily apparent to the operator.
That is because, with IMS at the heart of both the network and the service environment, data can be readily gathered from a multitude of network elements, end devices and third parties to produce clarity in billing and associated reconciliation procedures.
If much of the IMS flight path remains clouded in commercial confidentiality, it is becoming clear that there are two areas where it could have a major impact: in fixed-mobile-wireless convergence, and in discovering more about how subscribers use services.
Firstly, in the area of convergence, major operators such as BT are examining the role of IMS as a tool to offer truly 'joined up' services to customers, allowing them to roam freely between fixed, WiFi and cellular networks both at home and in public. One important issue here lies in allowing customers to connect in the most appropriate way for the service required at the optimum cost -- but through a single account and customer profile.
Secondly, in terms of discovering more about service usage, and adding to work done by the OMA, the part of IMS dedicated to supporting customer details and preferences might be accelerated to enhance data, messaging and virtual operator services by providing a much richer, more personalised experience. This in turn, can help drive take up of advanced services by online communities, increasing both revenues and brand loyalty.

The ultimate implementation

However IMS is ultimately implemented by each individual service provider, it's clear that its impact is going to be truly transformational in the business process and operations areas. It is, however, going to bring a need for reassessment in several key areas.
With quality of service, it will no longer be possible to take a simple, deterministic view of service quality based purely on a few connectivity-based, network-centric parameters. This prescriptive approach will have to be replaced by far more flexible and dynamic methods that can aggregate multiple sources of quality of service data, based more on the customer's actual experience of a transaction.
Quality of service issues are magnified by the fact that communications will increasingly take place across different commercial and technological domains, only a few of which may be actually owned by the primary service provider. Protecting both the integrity of the service and of the service provider, without any direct control over the entire length of the value chain, will present challenges for technologists, business development specialists and lawyers.
Now that the long awaited 'network of networks' looks like it's finally emerging from the complex cat's cradle of co-existing and often competing technologies and protocols that have grown up in recent decades, it's important to remember that IMS is there as a true business enabler.
The invention of money revolutionised entire economies and social structures, replacing the inevitable time and space limitations imposed by bartering. By comparison, IMS is set to open up the communications environment to new ways of doing business. In the process, new value -- and new wealth -- will be created.

Grant Lenahan is Vice President, Wireless Mobility, Telcordia, and can be contacted via tel: +1 732 699 4894; e-mail: glenahan@telcordia.com 

All mobile operators now offer their subscribers a massive choice of content. The only thing that's missing is the users. The key to success is to push interactive content direct to the idle screens of subscribers, says Yossi Wellingstein

"Build it and they'll come" is a risky way to run a business -- and one that simply hasn't worked in the mobile content world. Today content discovery is one of the biggest problems mobile operators face, following the massive effort to build-up huge content catalogues. Put simply, the fact that users don't know that so much content is waiting for them, and more importantly, don't know how to find it, explains why mobile data services still account for such a small proportion of operators' revenue, despite their best efforts to increase their share.
Operators around the world have realised that they can't wait for users any more. They have to bring content to customers -- and then make sure that it is easy to access. Anything too complicated, or requiring pro-active behaviour from the customer, just hasn't, doesn't and won't work. 
A new approach is needed to let subscribers know that content is available -- and to encourage them to click through. Not only to build data traffic so mobile operators can start to generate decent revenues to justify their infrastructure investments, but also, in this competitive market, to provide services that add value and differentiate one network from another.
One answer to the content discovery problem is incredibly obvious and already being used by a number of operators around the world.
The solution: push content teasers to the screen of a mobile phone when the phone is turned on but not being used. Make these teasers free to the customer. Ensure that they vanish after a few minutes and are replaced with new headlines. Make the content relevant to that user. And finally minimise the number of clicks needed to access additional information linked to a teaser. Research has proven over and over again: clicks kill revenue. The more customers have to navigate, the less likely they are to bother.
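
The logic is simple enough to sketch. The Python fragment below -- with invented teasers and timings -- shows the rotation loop; in a real deployment the teasers arrive over a broadcast channel and the rotation runs on the handset:

import itertools
import time

TEASERS = [
    ("news",    "Election result due at 9pm -- press OK for video"),
    ("sport",   "Late winner in the derby -- press OK for report"),
    ("weather", "Storm warning tonight -- press OK for your area"),
]

def rotate(phone_is_idle, display, dwell_seconds=180):
    # Cycle through teasers forever; each one vanishes after a few
    # minutes and is replaced by the next. Nothing is shown (and
    # nothing is charged) unless the phone is idle.
    for channel, headline in itertools.cycle(TEASERS):
        if phone_is_idle():
            display(channel, headline)
        time.sleep(dwell_seconds)

# rotate(lambda: True, lambda ch, h: print(f"[{ch}] {h}"))
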
A mobile phone screen with no content on it is a major missed opportunity. And a dead screen seems acceptable to customers -- until they see what they could be getting -- up-to-date information headlines for free with an easy way to access additional information and services. Operators need to make the idle screen a dynamic palette of intriguing content, news, games, offers, polls and real-time updates.

Commercially proven


Pushing Active Content to the idle screen is an approach that has proved itself commercially. According to operators already using it, over 90 per cent of users keep the service once it's introduced and over 30 per cent use it regularly -- an unprecedented level of use for content services.
VimpelCom, which operates as BeeLine GSM in Russia, faced a problem common to all mobile operators -- it had spent considerable sums building its data networks, and interesting content was available to its customers thanks to a series of partnerships with content providers. However, customers just weren't accessing content from their phones.
VimpelCom's research showed that users simply didn't know how to find the content -- and if they did it required far too much effort. And so, VimpelCom decided it needed to take a proactive approach. In April this year it launched its "Chameleon" service, using technology from Celltick, to broadcast live content directly to the phone screens of its subscribers.
The Chameleon service sends streams of free news headlines, sport reports, weather updates, music stories, gossip and games directly to mobile phone screens. Just like a screen saver, the messages appear silently only when the phone is not in use. 
Chameleon is very easy for customers to use and does not require any action on the part of the customer to start receiving the service. When they see a message that interests them and want to know more, they simply click on the OK button. A menu opens and presents various options. For example, a video news report or an automatic link to a WAP site or web page, or an SMS message with more information. A second click launches the desired service. And then the subscriber starts paying.
VimpelCom has invested in interesting and credible content, using brands such as MTV, Cosmopolitan and Playboy. It has created a countrywide team to run the broadcast operation, effectively working like a TV editorial team.
Chameleon targets all types of audiences, and can broadcast both to the entire subscriber base and to specific segments and locations, so content can be customised and localised to ensure that it is relevant and of interest to customers.
The results are staggering. During the first three months, more than seven million data transactions took place, with 50 per cent of enabled users reacting to the content on a regular basis.
Victor Markelov, Products Director at VimpelCom, said: "Our customers love Chameleon. We've finally found a way to provide them with an opportunity to use our data services actively. The easier the access to information and data services, the more often our customers use them".
Currently Chameleon is available to almost 3 million users, and VimpelCom plans to expand it to more than 10 million by the end of the year. 
From a business perspective, the key to a successful Active Content system is sending free teasers to a vast number of users, while keeping the cost of these teasers low. It's a numbers game. The operator relies on a certain percentage of users clicking through, and therefore needs to send teasers to as many users as possible. But how to send these teasers cost-effectively? Say there are one million users and the operator wants to send 100 daily teaser messages to each of them. Doing it with any kind of point-to-point delivery will bring the network's SMSCs to their knees or eat up the GPRS capacity. Since these messages are not paid for by customers, this is certainly not a good use of network resources.
There is a simple answer to this challenge -- enable mobile networks to send one message to many users simultaneously; in other words, broadcast. Using broadcast, teasers can reach millions of users in real time without clogging the network and without using any commercial bandwidth at all. Since the broadcast capacity of the network is minuscule to begin with and allocated in advance, it makes the content teasers virtually free. This makes Active Content particularly attractive, since the variable costs associated with its delivery to idle screens are close to zero. When users click on a teaser, the terms change: a paid-for session begins and the operator starts seeing revenue.
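
The arithmetic behind that claim is worth spelling out. Using the figures above -- one million users, 100 teasers a day each -- point-to-point delivery generates a hundred million messages a day, while broadcast needs only one transmission per teaser, whatever the audience size:

users, teasers_per_day = 1_000_000, 100

point_to_point = users * teasers_per_day  # 100,000,000 messages/day
broadcast = teasers_per_day               # one transmission per teaser

print(f"point-to-point messages/day: {point_to_point:,}")
print(f"broadcast transmissions/day: {broadcast:,}")
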
Users love having content pushed to their screen because it provides free information, entertainment and timely updates with zero intrusion.
Operators love it because it increases traffic on data networks by solving the content discovery problem  -- and in today's competitive market is an effective brand differentiator.
So why are you still hiding your valuable content and services deep inside your portal? It's time to bring it to the front -- by activating the idle screen.

Yossi Wellingstein is CEO of Celltick and can be contacted via www.celltick.com

Internet Protocol Television (IPTV) looks set to revolutionise the world of entertainment. However, like any new technology, there are roadblocks to negotiate before profits can be realised. And one of the biggest challenges to IPTV is the flexibility of back office applications such as OSS and billing systems. Simon Gleave looks at the issues

Just when the telecoms industry has come to terms with the billing challenges of next generation networks, along comes a new technology that presents equally tough obstacles. This innovation is IPTV, the television service that provides real-time interactive experience for the viewer. 
Downloading a favourite film, music video or sitcom whenever you want it will become possible with this entertainment system, not to mention bringing other interactive delights such as home shopping, television gambling, and video games on demand into the living room.
In short, television will evolve into something that its inventor, John Logie Baird, never dreamt of. It will offer the variety of on-demand content that is currently associated with the Internet, providing consumers with the opportunity to watch whatever they want to, whenever they want to.

Technology vs. delivery

Technically, IPTV is very different from any service that telcos have previously deployed in their networks -- for a number of reasons.
Firstly, the volume of consumer data created by the IPTV network is immense, which presents both a challenge and an opportunity for the CSP. It requires an advanced mediation system to capture and correlate the disparate records being fed into it but, once this is done, it supplies the CSP with detailed information on its user base -- what they watched, what adverts they clicked on or skipped -- providing valuable information with which to attract advertising revenues and upsell promotions for the CSP's own products.
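
As a minimal sketch of what such a mediation step might do -- the record formats and field names here are entirely hypothetical -- the Python fragment below correlates raw events from the set-top box and ad server into a per-subscriber profile:

from collections import defaultdict

events = [
    {"sub": "0442", "src": "stb",    "type": "watch",    "asset": "film-17", "mins": 96},
    {"sub": "0442", "src": "ad-srv", "type": "ad_click", "asset": "ad-203"},
    {"sub": "0442", "src": "ad-srv", "type": "ad_skip",  "asset": "ad-117"},
]

def correlate(events):
    # Fold disparate raw records into one view of each subscriber.
    profiles = defaultdict(lambda: {"watched": [], "ads_clicked": [], "ads_skipped": []})
    for e in events:
        p = profiles[e["sub"]]
        if e["type"] == "watch":
            p["watched"].append((e["asset"], e["mins"]))
        elif e["type"] == "ad_click":
            p["ads_clicked"].append(e["asset"])
        elif e["type"] == "ad_skip":
            p["ads_skipped"].append(e["asset"])
    return dict(profiles)

print(correlate(events))
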
Second, IPTV is enabled by large amounts of sophisticated software, as opposed to the hardware that underpins traditional voice services. This provides a higher level of abstraction for the service, which enables the CSP's products to interact more intelligently with the network and thereby create more advanced services.
Third and most importantly, IPTV differs from satellite and cable television in one very important way -- it's not a unidirectional broadcast technology where everyone receives the same signal. Although it does use IP multicast for standard channels, the video head-ends are in direct contact with the viewer's set-top box and can react instantaneously to requests. This gives the user much more control over what is viewed, when and how. For example, a video-on-demand session would start instantly, instead of being scheduled at regular times. Enabling these facilities, however, requires a BSS/OSS system that can cope with this change to on-demand services.
Viewers can also access various types of media options by using the television remote to send control commands to the set-top box. 
At the same time, there is a need to negotiate franchising and content rights to ease the way into the film, game and television markets and, most importantly, to ensure the flexibility of the OSS and billing systems of the telecoms operators scrambling to secure key market share.

Franchising woes

Much like 3G, IPTV involves everyone in the entertainment and communications sector, including television, gaming, music and movie studios, cable companies and telecoms operators. With so much at stake, it is not surprising that there is a great deal of interest in this new technology. In Europe and the US -- two markets where IPTV is expected to attract over 20 million users by 2008 -- the telecoms industry is sharpening its knives to get a piece of the action. The time is almost upon us, they believe, when compression technologies and broadband penetration will allow the interactive TV (iTV) promises of the 1990s to be finally realised. And they are prepared to do battle with entrenched cable and satellite companies to win over the consumer to this brave new media world.
One thing is clear -- it won't be easy. Who gets the rights to broadcast certain content is still in the balance.  In the US alone, there are more than 30,000 franchise areas for IPTV, each of which can take close to a year to negotiate entry. At this rate, it could take years before all licenses are procured, and there is likely to be a bidding war in the meantime.
Another major problem is exactly who delivers 'must have' content. The biggest consumer attraction of IPTV is the ability to watch content on demand. Traditionally, aggregators of must-watch content made their money by showing the content, and financed this through ads or licence fees. But what is worrying many a television channel executive today is that content producers are exploring options for delivering their content independently. They could achieve this by setting up IPTV delivery sites of their own, enabling consumers to go 'straight to the source' for content and making the old networks effectively redundant. Accordingly, many of the Hollywood studios are reluctant to sign up to any favourable distribution deal for their films on IPTV. And there is a danger that companies hoping to get ahead in the IPTV world will bid over the odds for the content required to attract consumers.

OSS and billing challenges

Franchising and content rights are one IPTV headache.  Another is billing. This is one of the biggest challenges faced by any player entering the IPTV market. At the heart of any successful media roll-out is a flexible OSS back office system, and IPTV is no exception. 
Billing for cable has always been a straightforward process in the past, but the addition of telecoms operators venturing into the IPTV space means the billing process suddenly gets more complicated. It must now contend with the requirements of 'triple-play' billing with all the tracking and reporting challenges that brings.
With IPTV, telecoms operators will need to acquire the means to identify the users of the service and all the vital statistics associated with billing -- such as what programmes the viewer watched and for how long, what their viewing patterns are in terms of hours watched and when, and the content accessed -- which can then help drive targeted upsell marketing campaigns. In a country with a franchising model, the payment requirements for each franchise or piece of content in a given area also need to be supported.
Another billing challenge is trying to find ways to deal with the endless number of partnership agreements that are created to provide the content necessary for IPTV and to manage the thousands of franchises that go with it. Settlement between partners is a complex process when just a handful are involved -- with the potential for thousands, it becomes a problem on an entirely different scale, and IPTV providers will need to address these issues with new partner management and settlement systems.
Rating engines have to be adaptable to support different fees and local taxes, in addition to the various channel line-ups in the service bundles. One household, for example, might be given free access to a movie network, while another could be offered free children's programme viewing. You could also have pre-paid wallets (for VoD or mobile top-up purchases) operating alongside a standard post-paid monthly flat fee.
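
A minimal sketch of such bundle-aware rating -- all bundles, prices and balances below are invented for illustration -- might look like this in Python:

BUNDLES = {
    "household_a": {"free_channels": {"movies"}, "flat_fee": 15.00},
    "household_b": {"free_channels": {"kids"},   "flat_fee": 12.00},
}

def charge_view(household: str, channel: str, price: float, wallet: dict) -> float:
    # Free channels cost nothing; paid content draws on the pre-paid
    # wallet first, with any remainder going onto the monthly bill.
    plan = BUNDLES[household]
    if channel in plan["free_channels"]:
        return 0.0
    draw = min(wallet["balance"], price)
    wallet["balance"] -= draw
    return price - draw

wallet = {"balance": 3.00}
print(charge_view("household_a", "movies", 4.00, wallet))  # 0.0 -- bundled free
print(charge_view("household_a", "vod",    4.00, wallet))  # 1.0 billed after wallet
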
The variety of these issues now means that telcos are having to start talking to people they usually don't deal with, such as Hollywood studios and advertising agencies. IPTV providers must be able to cater to these players and their individual requirements and demands.
Billing systems should, therefore, provide specific rates for specific classifications. Flexibility is key, so that every IPTV service is charged fairly and in real-time, according to where and what service is enabled or downloaded.

New opportunities for OSS players

IPTV provides tremendous opportunities for billing providers, especially those that can offer an adaptable BSS/OSS solution -- tying an intuitive customer interface to real-time rating, charging and activation-on-demand capabilities. Content partner management software can be installed to support the different rating requirements of the various partnership agreements presented by IPTV. Meanwhile, a multi-service mediation system can be installed to manage the abundant levels of customer data produced by the IPTV network.
The opportunity for the BSS/OSS players lies in four main areas:
*  differentiating IPTV from current TV services
*  supporting on demand services
*  stimulating consumer spend
*  stimulating advertising spend
Achieving these aims means investing in the right back office systems and, in the process, making OpEx savings by helping to automate key customer support processes  -- e.g. buying a new service via the TV rather than the contact centre.
Like 3G, the opportunities to make money from IPTV are numerous, but without the right billing systems in place, IPTV cannot realise its maximum potential as the biggest money-spinner in the history of television.

Simon Gleave is IPTV Market Manager at Intec.
www.intecbilling.com

If someone tells you that they know what's happening in the telecommunications industry these days, the chances are that they deserve a sharp poke with a pointy stick*. The world's networks were already being described in the 1970s as being the most complex machine that man had ever built, and the last few years have only increased that complexity by a few orders of magnitude.

Imagine a speeded-up film of the earth from space. A single, solitary telegraph line first appeared in 1844, stretching all of forty miles from Washington to Baltimore. Over the following decades, that fragile link grew into a web of cables and repeater stations, reaching across continents and oceans. Fast forward a century and a half, and the earth is now covered by a dense mesh of fibre, radio, satellite and copper networks. Regions once dark and silent are being rapidly filled in as the world goes mobile, adding a second billion to the souls already glued to their handsets and screens.
But geographic reach alone only tells one part of the story. As we shift towards an all-IP world, drawing content and applications into the traditionally two-dimensional signalling framework of telecommunications, we're starting to approach biological levels of complexity -- and that's going to demand some serious changes in the ways that we perceive and plan our industry.
Chatting recently with Digitalk, one of the companies now bringing SIP -- that increasingly ubiquitous protocol -- firmly into the PSTN and voice services world, I started getting flashbacks to biology classes at school and all those wonderfully intricate diagrams of metabolic pathways.
Now, for an individual who did a degree in zoology and psychology before falling into networking, the fact that I should look for biological metaphors shouldn't really be surprising. What is perhaps surprising is that respectable academia is also now taking this route, but under the catchy title of 'Complex Adaptive Systems'.
Partly growing out of chaos theory, that pop science favourite of a while ago, a growing number of researchers around the world are looking at the subject of network behaviours -- but across a multitude of different areas. Google the Santa Fe Institute in New Mexico and you'll find such apparently unrelated topics as economies, weather, embryological development and telecommunications coming together in fascinating ways.
All very interesting perhaps, but what's this got to do with profits in the increasingly Darwinian playground of a deregulated telecommunications industry?
It's simply that nature's been doing for billions of years what we've now been doing for only a hundred or so -- and there should be some good tricks we can pick up. I sometimes deliberately annoy engineer friends by asking them if an individual bee knows that it's actually building a larger hive as it works on its own individual little cell -- each a masterpiece of optimised design. After I've picked myself up off the floor, they do, however, usually get what I mean, especially when I explain why the use of the word 'ecosystem' in all those endless marketing presentations is inappropriate. Anything that kicks marketing departments usually goes down well with engineers, I've found...
In an ecosystem, everything eats everything else. In reality, the world of networks is becoming much more like a super colony of very simple organisms -- but with each starting to exchange the equivalent of genetic material with the others.
Consider the fraught relationship between content owners and network owners. Each needs the other -- but they're increasingly fighting the other for dominance of that growing space. What started off as an apparently symbiotic relationship now looks like moving to become parasitical for one of the parties.
Continuing the HR theme of my last column, we now need people who can rise above the raw silicon of the network and spot these shifts in power as they emerge. When I talked recently with Daniel Osmer of telecom recruitment specialist Spectrum-EHCS, he made the interesting comment that "some of the best sales directors we've seen hold degrees in Psychology -- it's their understanding of human behaviour and decision making processes. The industry though is still dominated by very traditional, risk averse hiring".

*Unless they're Keith Willetts, of course.

Alun Lewis is a telecommunications writer and consultant. He can be contacted via:
alunlewis@compuserve.com

A new report published by the Information Security Forum (ISF) warns that the cost of complying with the Sarbanes-Oxley legislation is diverting spending away from addressing other security threats. The global not-for-profit organisation says that many of its members expect to spend more than $10m on information security controls for Sarbanes-Oxley. The business imperative to comply also means that, in many cases, the true cost of compliance is unknown.

With increasing concerns about compliance, the new ISF report provides a high-level overview of the Sarbanes-Oxley Act 2002 and examines how information security is affected by the requirement to comply. The report provides practical guidance to address problematic areas in the compliance process. According to the ISF, these problem areas include poor documentation, informal controls and use of spreadsheets, lack of clarity when dealing with outsource providers, and insufficient understanding of the internal workings of large business applications.
What's more, the Act ignores security areas that are extremely important when dealing with risks to information, such as business continuity and disaster recovery. This makes it vital to integrate compliance into an overall IT security and corporate governance strategy.
"In the wake of financial scandals like Enron and WorldCom, the Sarbanes-Oxley Act was designed to improve corporate governance and accountability but has proved difficult to interpret for information security professionals," says Andy Jones, ISF Consultant. "As neither the legislation nor the official guidance specifically mentions the words 'information security', the impact on security policy and the security controls that need to be put into place must be determined by each individual organisation in the context of their business.
 "It is important that Sarbanes-Oxley does not push organisations into following a compliance-based approach rather than a risk-based approach that may compromise information security.  The ISF report helps companies to achieve compliance while also ensuring that they have the appropriate security controls in place."
The full Sarbanes-Oxley report is one of the latest additions to the ISF library of over 200 research reports that are available free of charge to ISF Members.
Details: June Chambers, ISF, tel: +44 20 7213 2867; e-mail: june.chambers@securityforum.org
www.securityforum.org

"An economic power shift from the US to Europe is now gaining steam and promises to have a far-reaching effect on the world technology sector," asserts Cutter Consortium Fellow Tom DeMarco -- a point vigorously debated by his colleagues on the Cutter Consortium Business Technology Trends Council.

The European Union's transition from marketplace arrangement to world superpower, and a possible new age for European IT, characterised by relatively frictionless commerce and exciting growth, is debated by the Cutter Business Technology Council along with Cutter Consortium Senior Consultants Tom Welsh (UK) and Borys Stokalski (Poland) in the latest issue of Cutter Consortium's Business Technology Trends and Impacts Opinion.
According to DeMarco: "After decades of looking over its shoulder at Japan and the East, the US economy is fast being overtaken by someone else entirely: Europe. The colossus that has been assembled in what used to be called the Common Market has emerged as an economic superpower."
Cutter Consortium Fellow Lou Mazzucchelli counters: "The European Union is taking hold, albeit with fits and starts as evidenced by the results of recent constitutional referenda in France and the Netherlands. But I see little evidence of a massive shift of economic power from the US to the EU. Perhaps it is because the gap between them is small, relative to the gap between either the US or the EU and China or India or South America. The latter have so much more to gain, potentially at our expense." 
He continues: "It is unarguable that changes in Europe have an impact on the US and the world, but of all the forces working to shift economic power from the US, Europe may be the least threatening. Europe may have gotten a second wind as an economic power, but it seems unable to run a distance race against India and China."
Details: www.cutter.com
