Next Generation Networks need Next Generation People

Once upon a time it used to be so simple. People asked you what you did for a living and the mere mention of the word 'telecommunications' was enough to stop them dead in the water. Mental images of telegraph poles, black bakelite phones, long waits for field engineers and the drab efficiency of a utility sector conjured up a grey world of proto-nerds.  Okay, we may have had a brief romance with fame during that dot.com bubble that we'd all rather forget, but by and large, telecommunications has remained an iceberg-shaped industry, with most of our work resting safely below the waterline of the public's consciousness.
2005, however, is now looking like the year in which we're going to break surface once again, finally doing what I call 'joined-up' telecoms, using IP as the effective equivalent of the ubiquitous Model-T Ford of the last century -- i.e. strap anything you want to the chassis and start rolling westward in search of the Promised Land. As a result, it's probably more accurate to call it 'the industry previously known as telecommunications', just as that minuscule pop star rebranded himself.
One of the central problems, though, in making this long-expected transition towards integrated content and applications, is whether the kinds of people and skills that we have in the industry are able to make the change themselves. Most of the issues that I see daily in the industry aren't so much technological anymore. Instead, they're cultural and perceptual ones within organisations, or, indeed, in the 'wetware' lurking between people's ears.
I'd argue for a start that we're seeing the equivalent of continental drift in all the convergence work under way around us. The North America of IP is now butting up against the Eurasia of the PSTN, while the Australasia of content and applications moves up from the South and, just like the Indian subcontinent pushing up the Himalayas, is throwing up the equivalent of new valleys and mountain ranges in our midst. As a result, we need new species to colonise those new niches. The trouble is that many in the telecoms establishment still have a naïve faith in the integrity of their own particular nation state -- despite the fact that the earth's crust is already shifting under their feet...
It is, however, encouraging that industry debate is now starting to look at these human-centric issues. At a recent New York analyst briefing session from the commercially reborn Telcordia, a number of different speakers highlighted the importance of recognising the stresses and strains involved in making the necessary transitions -- particularly against a backdrop of ever-increasing commercial and technological complexity. One particularly dynamic presentation came from Joe Gensheimer, COO of a new US MVNO called Movida, set up to target the Hispanic marketplace through a hosted MVNO service from Telcordia. His business model revolves around understanding the customer better than his competitors and fulfilling their needs faster and more efficiently -- not in spending billions to build a new network.
Much of the new growth in telecoms is going to come from similarly-minded entrepreneurs, often from outside the industry. And therein lies the problem for the industry once known as telecoms. Many of those now in senior positions in the industry first entered the sector when it was the high-technology wing of a Post Office. Risk was minimal, career progressions were almost set in stone and there was a reliable pension waiting at the end of the day. Things today couldn't be more different -- but have the people and the underlying organisations changed sufficiently to adapt to the new conditions?
The next few years are going to be tricky ones for us. How do we keep those core telecoms values of reliability and integrity in what's fast becoming an over-stocked bazaar of content and applications? Do we sink back below the waterline to become once again the trusted but largely invisible utility -- or do we try and re-invent ourselves as both an industry and as individuals to make the transformation?
It's much easier swapping out legacy infrastructure than it is legacy mindsets...

Alun Lewis is a telecommunications writer and consultant, specialising in what he, tongue in cheek, calls 'post-modernist telecoms'. He can be contacted via: alunlewis@compuserve.com

While reports indicate that 3G is shaping up nicely in the revenue stakes, further infrastructure challenges need to be addressed to ensure the technology's ultimate success, says Alan Carr

Flexible. Lighter. Smaller. Things are starting to look more attractive for what will become the most widespread 3G standard, WCDMA or UMTS. There are now well over ten million subscribers worldwide and encouraging reports of a significant increase in revenues per subscriber relative to 2G. As a result, the outlook is now much more attractive for the vendors supplying infrastructure to the 3G industry.
But all is not plain sailing and important challenges remain. A particular issue is the sheer competitiveness of the market. There are currently estimated to be around 20 suppliers of 3G infrastructure around the world including:
*  Traditional Western suppliers with their proven pedigree and track-record
*  Rapidly expanding and aggressive new entrant Chinese companies
*  Japanese companies with the advantage of the world's most advanced home market
*  Korean companies with their long experience in the underlying CDMA technology.
Experience shows that ultimately there will only be room for a handful of these companies to be successful. But in the meantime, this overly competitive situation will make life difficult for all concerned.
So against this background, what lies ahead for the mobile infrastructure industry worldwide? What will be the factors that decide which of the current crop of 3G infrastructure vendors go on to become world leaders? Here we identify two factors we think will have a significant impact.

Flexible, lightweight solutions

Current 3G infrastructure solutions closely follow the model established with earlier generations of mobile infrastructure. However, there are signs that this will need to change in view of several issues:
*  There will be an ever-growing need for new basestation sites to provide good 3G coverage. In the UK, for example, it is estimated that at least 12,000 new cell sites will be needed and, following recent trends, it is likely that many of these will involve protracted disputes with the communities where they need to be located.
*  Achieving good indoor coverage for 3G is a growing problem. 3G simply does not penetrate well into buildings, and this will only get worse as higher data rates and new features such as HSDPA are introduced.
*  Finally, there is the issue of the dynamic status of 3G standards. Although the basic standards are now fairly stable, there is no sign of standards-making bodies such as 3GPP slowing down. On the contrary, an ambitious programme of continuous enhancement is already underway.
The problem for infrastructure vendors is the need continually to re-design products to keep pace. To address these issues, future infrastructure solutions will need to be much more lightweight and flexible. With a new class of much more compact basestations, it will be possible to avoid the large cell sites that are becoming so unpopular with the public and to move instead to much smaller, less conspicuous equipment that blends into the urban environment. Following the same theme, still smaller systems located indoors will be able to overcome the problem of indoor coverage.
Basestations will also need to be much more flexible to overcome the problem of continual changes in 3G standards. This applies to both small and large basestations, but is especially pertinent where larger numbers of smaller basestations are deployed and where site visits to upgrade functionality will become prohibitively expensive. The technology that will provide the flexibility needed is software defined radio (SDR). With this approach, as much of the basestation functionality as possible is implemented in software, and enhancements and updates can then be loaded remotely in the form of a software update.
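To make the SDR idea concrete, here is a minimal Python sketch of a basestation whose modulation behaviour lives in a replaceable software module, so a remotely pushed update changes functionality without a site visit. All names and the toy bit-to-symbol mappings are illustrative, not drawn from any real product.

```python
# Minimal sketch of the SDR idea behind remotely upgradeable basestations.
# All class and function names are illustrative, not from any real product.

class SoftwareDefinedBasestation:
    """Basestation whose signal-processing chain is pure software."""

    def __init__(self, waveform):
        self.waveform = waveform          # currently loaded modulation module

    def transmit(self, bits):
        return self.waveform(bits)        # delegate to the loaded software

    def remote_update(self, new_waveform):
        """Load a new release over the backhaul -- no site visit needed."""
        self.waveform = new_waveform


# Early-release waveform: a QPSK-like 2-bits-per-symbol mapping
def release_a(bits):
    return [bits[i:i + 2] for i in range(0, len(bits), 2)]

# Later HSDPA-style enhancement: a 16QAM-like 4-bits-per-symbol mapping
def release_b(bits):
    return [bits[i:i + 4] for i in range(0, len(bits), 4)]


bs = SoftwareDefinedBasestation(release_a)
print(len(bs.transmit("10110010")))   # 4 symbols at 2 bits/symbol
bs.remote_update(release_b)           # pushed remotely as a software update
print(len(bs.transmit("10110010")))   # 2 symbols at 4 bits/symbol
```

The point of the sketch is simply that the same hardware carries double the bits per symbol after nothing more than a software load, which is exactly why remote upgradeability matters once thousands of small cells are in the field.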

Picocell and microcell base stations

To understand the reason why small basestations will be so important in 3G networks, we first need to consider changes in the type of traffic in a 3G network.
With 2G, most traffic is voice calls and, of course, voice will remain an important element in 3G networks. But the real goal with 3G is to transition to a true mobile communication system rather than just a mobile telephone network. As a result, a growing part of the traffic will be made up of data, much of it at higher rates than is possible in 2G networks. In urban areas where users tend to cluster together in offices, stations, shopping malls and conference centres, this will quickly lead to a situation where the overall density of traffic is substantially higher than in 2G networks. The result is that many more basestations will be needed. But adding conventional macrocell sites on this scale would be uneconomic and, in many cases, unacceptable given public opposition to new sites. The answer is to introduce a layer of smaller basestations in crowded urban areas to carry much of the traffic.
Another big driver towards smaller basestations in 3G is the need to get good indoor coverage.             
The problem starts from the fact that 3G operates at around 2GHz, where losses penetrating buildings are higher than with 2G systems. However, this alone ought not to be a major issue when it is considered that GSM has been made to work perfectly satisfactorily in the nearby band at 1800MHz.
The more significant factor is that 3G users inside buildings will typically want to make use of higher speed services, normally requiring a higher quality radio link than would be needed for voice alone. And so, although it may be possible to achieve reasonable voice coverage inside buildings, it will be much more difficult for the sort of advanced data services that will be needed indoors. Again smaller basestations can address this need.

Some technical factors

There are also some more detailed technical reasons why smaller basestations will be more important in 3G than 2G networks. The first relates to the orthogonality introduced to separate users in the downlink. CDMA systems have been carefully designed to maximise this orthogonality since it has an important effect on boosting capacity in the downlink. However in a macrocell, it is common for a substantial part of the orthogonality to be lost because of the complex multipath channels that exist between the basestation and the mobile terminal. This results in a corresponding loss in capacity. The effect is much less in microcells or picocells because of the shorter and less complex path between basestation and terminal.
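The orthogonality loss can be demonstrated in a few lines of Python: two Walsh spreading codes correlate to exactly zero over a flat channel, but a simple two-ray multipath channel (the tap values below are illustrative, not a measured channel) leaks energy from one code into the other code's despreader.

```python
# Toy illustration (pure Python) of how multipath destroys downlink
# code orthogonality in CDMA.

def dot(a, b):
    """Correlate two chip sequences."""
    return sum(x * y for x, y in zip(a, b))

# Two length-4 Walsh spreading codes -- exactly orthogonal by construction
w1 = [1, 1, 1, 1]
w2 = [1, -1, 1, -1]
print(dot(w1, w2))   # 0: a w2 despreader sees nothing of a w1 transmission

# A simple two-ray channel: direct ray plus a half-strength echo
# delayed by one chip (illustrative values)
def two_ray_channel(chips):
    delayed = [0] + chips[:-1]
    return [c + 0.5 * d for c, d in zip(chips, delayed)]

rx = two_ray_channel(w1)
print(dot(rx, w2))   # non-zero: the echo leaks w1 energy into the w2 branch
```

In a picocell the echo is far weaker and the delay spread shorter, so the leaked term shrinks towards zero and the downlink keeps more of its designed capacity.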
The second technical factor in favour of smaller cells is intercell interference. Adjacent cells in CDMA systems normally operate on the same carrier frequency, and interference from cells bordering the home cell appears as noise when trying to detect the wanted signal. In turn, this noise-like effect depresses system capacity. Pico and microcells are normally located below roof-top level, so there is less scope for interference between cells. On average, capacity loss due to intercell interference is therefore also lower. Thus, perhaps surprisingly, smaller cells are more efficient than their conventional larger equivalents and are therefore able to support more user traffic.
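A back-of-envelope calculation shows how a lower other-cell interference factor translates into capacity, using the standard CDMA pole-capacity relation N ~ (W/R) / (Eb/No) / (1 + f). The Eb/No target and the two interference factors below are assumed, illustrative values, not field measurements.

```python
# Back-of-envelope CDMA cell capacity versus other-cell interference,
# using the textbook pole-capacity relation N ~ (W/R) / (Eb/No) / (1 + f),
# where f is the other-cell interference factor. Numbers are illustrative.

def users_supported(pg, eb_no, f):
    """pg: processing gain W/R; eb_no: required Eb/No (linear); f: other-cell factor."""
    return int(pg / eb_no / (1.0 + f))

pg = 3_840_000 / 12_200          # WCDMA chip rate over a 12.2 kbit/s voice service
eb_no = 10 ** (5.0 / 10)         # assume a 5 dB required Eb/No

print(users_supported(pg, eb_no, 0.55))  # typical macrocell interference factor
print(users_supported(pg, eb_no, 0.10))  # below-rooftop microcell, far less leakage
```

With these assumed numbers the microcell supports roughly 40 per cent more voice users per carrier than the macrocell, purely because less of the noise floor is wasted on neighbouring cells' transmissions.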

The cost factor

The final driver towards small basestations is simply cost. Macrocell basestations are relatively expensive items of equipment designed for so-called 'carrier class' operation. This typically involves building in redundancy to meet reliability targets as well as incorporating expensive high power amplifiers. Special environmentally protected cabins are also often needed.
By contrast, the format of pico and microcell basestations is such that a design approach closer to that of handsets than of conventional basestations can be used. Reliability requirements are typically reduced, output powers are much lower and special environmental protection is not needed. Maintenance costs for pico and microcell basestations can also be lower.
In terms of operational costs, a weak area for smaller basestations is backhaul, because many more cell sites are involved than with macrocells. However, technology is evolving to provide solutions here. For picocell basestations, 3G standards are developing so that IP networks can be used for backhaul. So for an indoor application such as an office, an attractive backhaul solution is to use the existing Ethernet infrastructure installed within the building.
For microcell basestations that are more likely to be located outdoors, one emerging solution is the 802.16 WiMAX standard which could provide an effective radio backhaul option. There are also other radio alternatives and, in terms of fixed networks, xDSL is potentially a lower cost alternative to conventional leased lines.
To conclude, the emergence of flexible, lightweight solutions and smaller base stations are just two of the technological shifts that will help infrastructure vendors to tackle industry challenges successfully as the roll out of 3G gathers pace.

Alan Carr is a Member of PA Consulting's Management Group. Contact via tel: +44 1763 267492
e-mail: wireless@paconsulting.com

Online convergent charging solutions have become a key weapon for operators in maximising revenue potential from users, says Giles Newcombe

Even the most applauded innovations and new developments take time to penetrate the mainstream, irrespective of industry or market segment. When they do, however, users wonder how they ever managed without them. Convergent online charging solutions are a case in point.
It is true that some online solutions have attracted negative media coverage and have failed to adequately satisfy criteria such as availability, scalability and functionality. It is equally true, however, that the marriage of highly-available IN technologies and advanced rating engines is re-shaping the online landscape.
Such solutions are enabling pioneering operators like PT Telekomunikasi Selular (Telkomsel) to realise the true potential of their business and optimise service delivery and income generation. Fundamentally, online convergent charging can drive such benefits because it eradicates the traditional distinction between subscriber types based on payment method. Once rid of traditional BSS characterised by stove-piped and separate systems for prepaid and postpaid subscribers, operators are empowered to maximise the revenue potential from all subscribers and derive efficiency and cost savings.

Business drivers

Many operators compete in mature markets characterised by few 'new' subscribers. Competition thus revolves around luring subscribers from competitor networks. Increasing ARPU rather than subscriber numbers is a significant driver of revenue growth, meaning that operators have to tap into the potential of both prepaid and postpaid subscribers. A focus on optimising the value of existing customers requires a re-think of BSS requirements, and a shift from stovepipe legacy environments to convergent online charging systems that can accommodate both prepaid and postpaid accounts and offer the flexibility to combine a variety of charging methods is essential.
A fully integrated approach to prepaid and postpaid accounts -- one that provides a single view of a customer's accounts -- is essential to enable segmentation of the customer base and support next-generation applications and services. The billing system needs to afford all users access to the full range of 2.5G and 3G services and promote credit management across services. Increasing ARPU amongst prepaid customers relies upon those subscribers having access to the same services, tariffs, marketing offers, incentives, customer care and roaming facilities that are available to postpaid subscribers. Operators can unify offerings to both types of account, which simplifies and reduces the cost of marketing, and usage data can drive more tailored offers. This helps to boost loyalty, encourages prepaid users to migrate to contracts and cuts the costs linked with anti-churn campaigns. The ability to identify, handle and monitor prepaid and postpaid accounts from a central point serves to reduce operational costs, minimise churn and enable the management of credit/risk. Vendors are addressing this need by combining Intelligent Network platforms with fully convergent rating and billing engines, to support online accounts (prepaid), offline accounts (postpaid) and hybrid accounts (prepaid and/or postpaid).
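As an illustration of this 'single view' idea, the sketch below models one hybrid account that settles a charge online from a prepaid balance when it can, and otherwise accrues it to a postpaid bill up to a credit limit. The class and its rules are hypothetical, not taken from any vendor's BSS.

```python
# Sketch of the 'single view' hybrid account: one customer record that can
# settle charges online (prepaid balance) or offline (postpaid bill), so the
# same services and offers reach both subscriber types.
# All names and rules are illustrative, not from any real BSS product.

class HybridAccount:
    def __init__(self, prepaid_balance=0.0, credit_limit=0.0):
        self.prepaid_balance = prepaid_balance   # online (prepaid) funds
        self.postpaid_charges = 0.0              # offline charges accrued to the bill
        self.credit_limit = credit_limit         # cap on postpaid exposure

    def charge(self, amount):
        """Try online (prepaid) settlement first, then fall back to the bill."""
        if self.prepaid_balance >= amount:
            self.prepaid_balance -= amount
            return "online"
        if self.postpaid_charges + amount <= self.credit_limit:
            self.postpaid_charges += amount
            return "offline"
        raise ValueError("charge refused: no balance and credit limit reached")


acct = HybridAccount(prepaid_balance=5.0, credit_limit=20.0)
print(acct.charge(4.0))   # online  -- covered by the prepaid balance
print(acct.charge(3.0))   # offline -- balance too low, goes on the bill
```

Because both payment paths hang off one record, credit/risk management, segmentation and targeted offers can all work from a single point, which is precisely the operational saving the convergent approach promises.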

Flexible rating and pricing

Whereas traditional voice-based engines rate against a fixed set of parameters (time, distance and duration), online content transactions will need event-based rating capable of processing any type of service against whatever measures are appropriate. The nuances of pricing for new services are many: streamed music purchases could be priced by transaction; advertising measured via responses generated; a videoconferencing session by duration; and an interactive game by the number of levels accessed. This requires a rating engine that is flexible enough to handle all existing pricing criteria and support whatever services may emerge in the future. Operators will also need the ability to:
*  Apply discounts across products and services: Cross-product discounts can provide incentives to subscribers and reward particular usage patterns. This approach, which can be especially useful in support of new promotions, packages and targeted offers, is equally relevant to corporate as well as private accounts, allied to the potentially lucrative interplay between the two. In addition to flexibility, both the rating engine and network need to possess the high availability and quality required to support an operator's business over the long term.
*  Deploy point-of-service pricing: Customers accessing on-line services require instant pricing information and transaction authorisation, which in turn requires that information flows between the rating engine and the network in real time. Postpaid systems are simply incapable of supporting this, so operators must deploy systems that facilitate on-line charging, which occurs within the lifetime of a call or event. Operators can then quickly verify usage limits and the status of account balances in real time, and customers can make informed decisions about their usage and expenditures.
When a customer downloads a video clip via a next-generation handset, for instance, they need to be pre-advised of the total price they will pay, and the amount authorised against their account balance in real time. Subsequently, if the content fails to download correctly, the customer's account needs to be topped up again in real time. Such functionality also enables operators to better manage their risk of revenue exposure. Operators need to closely monitor activities and the services they offer, not least in relation to high-value, low-margin services, which are potentially risky.
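The event-based rating and real-time authorise/refund flow described above might be sketched as follows. The tariff rules and prices are invented for illustration, and amounts are kept in integer cents to avoid floating-point rounding in money arithmetic.

```python
# Sketch of an event-based rating engine with per-service pricing rules,
# plus the online authorise/refund flow. All tariffs are invented;
# amounts are integer cents to avoid floating-point rounding.

TARIFF = {
    "music_download": lambda ev: 99,                  # priced per transaction
    "video_call":     lambda ev: 5 * ev["minutes"],   # priced by duration
    "game":           lambda ev: 20 * ev["levels"],   # priced by levels accessed
}

def rate(event):
    """Rate any event type against whatever measure suits that service."""
    return TARIFF[event["service"]](event)

class OnlineAccount:
    def __init__(self, balance_cents):
        self.balance = balance_cents

    def authorise(self, event):
        """Price the event and reserve the full amount before delivery starts."""
        price = rate(event)
        if price > self.balance:
            raise ValueError("insufficient balance")
        self.balance -= price
        return price

    def refund(self, cents):
        """Re-credit the account in real time, e.g. after a failed delivery."""
        self.balance += cents


acct = OnlineAccount(balance_cents=1000)
held = acct.authorise({"service": "video_call", "minutes": 20})
print(held, acct.balance)   # 100 cents reserved, 900 left
acct.refund(held)           # the session failed to set up
print(acct.balance)         # 1000 -- the customer is made whole in real time
```

Adding a new service is just another entry in the rule table, which is the flexibility the text argues a rating engine must have; the reserve-then-confirm-or-refund shape is what keeps the operator's revenue exposure bounded in real time.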

Sophisticated service scenarios

Convergence across all lines of business offers a major opportunity for operators to develop innovative packages that combine a range of services/products and discounts. These represent a chance to increase customer 'stickiness' and raise ARPU. Operators can devise sophisticated and lucrative convergent service scenarios around family-based usage or business and private accounts, etc. A user could, for example, operate two accounts from a single handset using one SIM card: one account for business and one for private usage. This opens up numerous permutations for cross-service discounts and innovative marketing.
In addition, convergent online charging enables operators to:
*  Combine the online and offline world to create new hybrid service offers. These can help both to rejuvenate older services and drive the penetration of new services. Packaged adeptly, with choice use of cross-service discounts, bonus and loyalty schemes, such hybrid services represent a significant revenue and churn-reducing opportunity for mobile operators.
*  Use the single view of all customers to drive joint product launches for prepaid and postpaid subscribers and better targeting of offers. Operators can analyse the data on usage patterns that is generated by today's advanced, convergent BSS to improve the targeting of services and bundled offerings. Ultimately, this approach helps to take the guesswork out of marketing, whilst also driving innovations in product and service strategy that might otherwise be lost.

Better fit between IT and engineering

Translating theory into practice always throws up challenges, particularly in highly complex communications organisations that have a tradition of distinct and separate IT and engineering functions. Whereas networks, and thus prepay and IN, have tended to be the preserve of engineering, rating and related areas such as packaging and discounting have largely been the remit of IT. Achieving true convergence, however, requires a closer linkage between these two areas and a coming together of BSS and OSS-related activities and functions. Failure to secure the commitment of both functions can seriously undermine the implementation, timescales, and operational effectiveness of a convergent solution. In short, convergence is as much about eroding the silo mentality within organisations as it is about technology. Coupling advanced technologies with improved flow of information and co-operation between departments, moreover, helps to strengthen and feed the marketing function.

Telkomsel: a case in point

Telkomsel, Indonesia's leading cellular operator since 1998, is ahead of the market through its investment in an advanced solution that simultaneously supports online charging and offline charging with a common rating application. Telkomsel identified a need to introduce charging systems that could handle large, complex volume processing and provide the flexibility required to tailor product offerings to individual customer segments, irrespective of payment method. With the chosen solution, Telkomsel benefits from a single view of its entire prepaid and postpaid subscriber base and a single point of entry for all product and pricing data. The solution is designed to enable the same services, discounts and bundles to be offered to prepaid and postpaid clients, and provide the company with the ability to adapt rapidly in response to changing market conditions and future developments. Equally important, the solution maintains existing network availability, successfully marrying today's network engineering standards with support for tomorrow's convergent services.
The adoption by operators of flexible convergent online charging systems that can seamlessly support combinations of both online and offline services will be key to them extracting the revenue potential from all subscribers. Such solutions will also help operators to develop innovative services and bundles, reduce costs and improve operational efficiencies. It is such qualities that will enable operators to be competitive and profitable over the long-term.

Giles Newcombe is Senior Director, International OEM Sales, Convergys EMEA, and can be contacted via tel: +44 1223 705000; e-mail: Europe@convergys.com

The quest for end-to-end telecommunications solutions is now achievable because of the widespread adoption of standards, says William F. Wilbert

Seems like everyone is searching for end-to-end telecommunications solutions that can be quickly and cost-effectively implemented. You'd think that business -- or at least offering a host of new services -- depended on it. And you'd be right.
The race to deliver new products ahead of the competition is putting enormous pressure on all carriers to move away from a proprietary world of inflexible, monolithic systems toward a much more open world of containers and components. Single vendor solutions have given way to multi-vendor solutions that include products from ISVs, Network Equipment Providers (NEPs), system integrators and middleware vendors.
The more compatible these products are, the easier they are to integrate and the better off everyone will be. The challenge is to build end-to-end solutions that support the full spectrum of OSS requirements. These solutions must also support increasingly heavy burdens on network management. Scalability, availability, reliability, and agility will remain top concerns. 

Trapped between the old and the new

Though everyone agrees that legacy infrastructures are technology dinosaurs that will eventually be replaced, most carriers still depend on a tangle of processes, systems and networks -- a patchwork of legacy OSS/BSS and individually integrated applications, either homegrown or commercial off-the-shelf (COTS) applications.
Many believe that this uneasy co-existence will continue for some time. After all, OSS/BSS can't be unplugged without halting business, and opportunities for green field implementations are increasingly rare. But the pressure for change is mounting. Within the last few years, the complexity of OSS/BSS has grown exponentially -- by a factor of 10 some say, including Telstra, a leading Australian telecommunications and information services company. 
According to Michael Lawrey, Head of Network Services Infrastructure Services at Telstra, the best way to deal with complexity is to step back and look at the problem from the 30,000 foot level. Lawrey, who also sits on the board of the TeleManagement Forum, strongly believes that telcos must start with business strategy, then look for standard integration platforms.
 "If you want to solve the OSS/BSS problem, you've got to have an overall business model first before you can understand what the infrastructure building blocks should look like," he says. "Telstra is a legacy telco. It's grown up in its siloes. Within my business area I'm running a massive integration program. I've got all the fixed networks, the mobile networks, all the cable networks, all the online broadband networks -- all under my one business unit and they have to be able to function together."
To achieve this, Lawrey is striving for what he describes as "a fully integrated business model."
"You've got to start with a business model point of view and a process point of view," he says. "My real challenge is trying to get our systems to come online or drop off as needed. To succeed, I want company-wide standards, I want a framework. The big hurdle for us and for other telcos is to agree on a company-wide business model. Once we do this we will need some glue to hold the framework together. This will allow us to move from a siloed world into a new world of fully integrated services."

Standards are the strategy

To break the bonds of technology and process siloes and find the Holy Grail of end-to-end solutions, carriers are thinking long and hard about standards.
 "Standards are necessarily strategic," says Peter Mottishaw, Strategic Marketing Manager, OSS Business Unit, Agilent Technologies, a major provider of service and network assurance OSS software. "You may get a more immediate benefit from a proprietary solution, but over the long haul you are likely to discover hidden costs if you go down that route. In fact, the hidden costs can be huge. This is why carriers are looking for solutions that plug and play together and can also be integrated into the OSS infrastructure fairly easily."
On the technical side, much of this boils down to integration issues. Tomorrow's infrastructures are being built now and depend on new ways to link existing systems and networks with new ones. After years of learning through painful trial-and-error experiences, the telecommunications industry has come to the collective conclusion that the best way to bring order to the middleware/application interface chaos is to adhere to standards.
Along with a host of other carriers across the globe, Telstra is now taking a close look at OSS/J, the industry's first implementation of the TeleManagement Forum's New Generation Operations System and Software (NGOSS) framework.
"OSS/J is the glue that holds system components together," Telstra's Lawrey explains. "I want OSS/J to put the glue in place to get all our OSS building blocks to talk to each other, to get my end-to-end process flow right and to get my end-to-end service management layer right."
The goal of the OSS through Java Initiative is to create a carrier-grade, industry-standard integration platform. An industry-wide consortium of leading industry players -- equipment providers, OSS suppliers, applications developers, and consulting services providers -- the group develops, tests and publishes inter-OSS application program interfaces (APIs) that are available free of charge. (For a full list of current members, APIs and certified products, go to www.ossj.org.)
OSS/J APIs are based on the J2EE (Java 2 Platform, Enterprise Edition) integrated application framework. J2EE was chosen because members agree that it is "the simplest and most reliable" means of embracing a multitier architecture based on re-usable components and container technology, one that can address both tightly coupled integration with Java and loosely coupled integration with XML and Web Services. OSS/J deliverables are backed by best practices and design patterns implemented in each API. OSS/J's careful mapping of its APIs to the TeleManagement Forum's widely adopted enhanced Telecom Operations Map (eTOM) means that providers and vendors can apply common process integration rules to the integration tasks that comprise business process engineering and re-engineering.
They can also use common, freely available implementation tools to put those rules into practice. Endorsed by a large majority of the world's leading service providers, eTOM defines a common terminology and map for communications industry business processes, and a reference point for internal process reengineering needs, partnerships, alliances and general working agreements with other providers.
While OSS/J implements the fundamental architectural patterns common to all players within the telco industry, it has characteristics that set it apart:
*  OSS/J is the only open standard for OSS that is available with reference implementations, compatibility test suites, a portfolio of certified products, and an open ecosystem of tools, adapters and extensions.
*  OSS/J is the only open standard that builds upon a middleware that is itself an open standard: J2EE. Compatible implementations of J2EE standard are available from a large choice of suppliers.
*  OSS/J implements telecommunication application open standards and runs them on open standard middleware.
"OSS/J adds OSS functionality to the standard enterprise Java and Web Services platform called J2EE," says Philippe Lalande of Sun Microsystems, head of the OSS through Java Initiative. "A lot of enterprise software experience has been engineered into J2EE, and telecommunications is now able to build service-oriented architectures upon standardised OSS/J interfaces and mainstream enterprise platforms for their next generation OSS systems. Because OSS/J certified products can be assembled in a plug-and-play manner, OSS/J has become the implementation standard of choice for the successful and rapid deployment of next generation OSS solutions."
Agilent's Mottishaw agrees with this assertion. "It's clear from our perspective that the J2EE and OSS/J environment have become well established in the industry," he says. "Within our customer base, J2EE is increasingly being used as the platform for integration. J2EE is the preferred integration environment and OSS/J APIs are a critical part of this. So I think the role of OSS/J within the industry is pretty well assured. There's really no other option right now."
Independent industry analysts confirm the fact of widespread OSS/J adoption. "The battle over which standard will dominate is essentially over," says Dan Baker, director of OSS research at Dittberner & Associates. "The J2EE environment is the clear winner and the new OSS/J tools basically let you write from an expert outline instead of from a blank piece of paper."
J2EE (together with its telecommunications extension OSS/J) has proven that it can facilitate actual end-to-end software solutions that support the full spectrum of OSS requirements.
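The plug-and-play claim above rests on a simple architectural idea: clients program against a standardised interface, so any certified implementation can be swapped in without touching client code. The sketch below illustrates that principle in plain Java; the `TroubleTicketSession` interface and its methods are hypothetical stand-ins for illustration, not the actual OSS/J API.

```java
// Hypothetical sketch of the plug-and-play principle behind OSS/J:
// clients depend only on a standard interface, so certified products
// from different vendors are interchangeable. The names below are
// illustrative, not the real OSS/J (JSR 91) signatures.

import java.util.ArrayList;
import java.util.List;

// A vendor-neutral trouble-ticket contract, standing in for a standardised API.
interface TroubleTicketSession {
    String createTicket(String description); // returns a ticket key
    List<String> openTickets();              // keys of unresolved tickets
    void closeTicket(String key);
}

// One vendor's implementation; another vendor's certified product could be
// dropped in without changing any client code.
class SimpleTicketSession implements TroubleTicketSession {
    private final List<String> open = new ArrayList<>();
    private int nextId = 1;

    public String createTicket(String description) {
        String key = "TT-" + nextId++;
        open.add(key);
        return key;
    }

    public List<String> openTickets() {
        return new ArrayList<>(open);
    }

    public void closeTicket(String key) {
        open.remove(key);
    }
}

public class PlugAndPlayDemo {
    public static void main(String[] args) {
        // The client sees only the standard interface, never the vendor class.
        TroubleTicketSession session = new SimpleTicketSession();
        String key = session.createTicket("DSL line down");
        System.out.println("Opened " + key + ", open tickets: " + session.openTickets().size());
        session.closeTicket(key);
        System.out.println("Open tickets after close: " + session.openTickets().size());
    }
}
```

In a real OSS/J deployment the interface role is played by the standardised APIs and the J2EE container provides lookup, transactions and messaging; the decoupling shown here is what lets certified components be assembled rather than custom-integrated.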

OSS/J and Open Source

There are multiple indicators showing that the pace of OSS/J adoption has increased markedly over the last year. Among the service providers who are adopting OSS/J, BT, Covad Communications, Vodafone and QinetiQ have decided to play an active role in the OSS/J Initiative and now sit on the Initiative's Advisory Council. 
What's more, the TeleManagement Forum has acknowledged OSS/J APIs as the first -- and, to date, only -- implementation of NGOSS. Indeed, OSS/J technologies were an integral part of four major TMF Catalyst projects on display at this year's TeleManagement World conference held in Nice, France, May 16-19:
*  The OpenOSS Catalyst -- a testbed of freely available OpenOSS Initiative open source components.
*  Business Agility Implementing the SID Catalyst -- use of the SID model to simplify and accelerate integration-related development in a service activation scenario defined by France Telecom.
*  Business Activity Monitoring (BAM) Catalyst -- providing BAM over the OSS/J Service Activation, Trouble Ticket and Quality of Service APIs.
*  NGOSS/MDA: Realising NGOSS as a Model-Driven Approach -- using a model-driven problem-solving methodology and leveraging state-of-the-art standards and tools.
Agilent's Mottishaw has been an active participant in the OpenOSS Catalyst Project. "We wanted to enable participants from across the industry to collaborate on the implementation issues that we face in delivering OSS standards-based solutions," he says. "Proprietary software makes this difficult because nobody wants to expose the implementation issues they face. Open source software solves this because the source code is available to everyone. OpenOSS is not attempting to compete with proprietary solutions, but will help the industry address some of the integration problems that limit the value of current products." In fact, OSS/J APIs were a critical component of the project.
 "OSS/J's QoS and Trouble Ticket interfaces play a key role," Mottishaw says. "We successfully used OSS/J to integrate open source OSS software and COTS OSS products in a complementary way to create an initial sandbox for further development projects."
The OpenOSS Catalyst is especially significant because it is backed by a consortium of industry leaders -- BT, COLT Telecom Group, Covad, NTT, and QinetiQ -- as well as Britain's University of Southampton and Hungary's Budapest University of Technology and Economics. The purpose of the OpenOSS Project is to:
*  Embrace the TMF NGOSS approach to develop a 'sandbox' of open source software
*  Make software and associated documentation freely available to the industry
*  Provide feedback on implementation experience into OSS standards activities
*  Provide a vehicle for research to engage with  realistic service provider problems
(More information on the OpenOSS initiative, which delivered the TMF Catalyst, can be found at http://www.openossinitiative.org/.)
Craig Gallen, who leads the University of Southampton's OpenOSS Project, agrees, adding that OSS/J and NGOSS are reshaping the market.
"OSS/J has grounded and realised the work of the TeleManagement Forum and I think that's very important," he says. "The Initiative has proven that low-cost components from the enterprise world can be easily integrated into the OSS/BSS environment. This will lower the barrier to entry for non-incumbent component vendors -- effectively commoditising parts of the OSS market while allowing incumbent ISVs to concentrate on delivering high-value solutions. And, finally, OSS/J provides a framework for sharing service provider investments in software development."

William F. Wilbert has written for technology publications for over 15 years. He is currently Communications Manager for the OSS through Java Initiative, and can be contacted via e-mail: Wilbertz1@comcast.net

According to recent research from Siemens, non-bank sources of finance are fuelling investment in technology by medium-sized companies. Kari Kupila reports

Technology investment rates provide a sensitive barometer of the corporate sector's real rate of recovery from the recent difficult economic period. In its recent forecast for 2005, the International Monetary Fund projects GDP growth rates of 3.6 per cent in the US and 2.6 per cent in the UK. France is predicted to grow at 2.0 per cent -- a slower rate than last year. Germany brings up the rear at 0.8 per cent. Despite these modest 2005 growth rates, new research from Siemens Financial Services -- which investigates and analyses the current state of broad technology financing in key European economies and in the US -- nevertheless shows a confident increase in technology investment, indicating a steady return to economic health.
In entering into a discussion of the detailed findings of this research, however, we must also ask how any upturn in technology investment is being financed, especially since banks throughout Europe are reported to be reducing the amount of credit extended to SMEs. The availability of finance for technology investments -- especially in a European business community that is much more highly indebted than it was ten years ago -- is also a critical factor in the barometer of economic health. So what is happening with technology spending, and what does it tell us about business confidence amongst the medium-sized firms that make up the 'engine room' of the national economy? More specifically, what do the study's findings mean for the IT and telecoms sector?
The Siemens Financial Services research, completed in March 2005, addressed an audience of companies with a turnover between €50m and €300m, this being the study's definition of larger medium-sized companies. 400 FDs or senior financial managers were interviewed by telephone -- 100 in each of the UK, Germany, France and the US. The study looked at the broad picture of technology finance availability and usage, defining 'technology' as anything from industrial plant, to office equipment, to information and communications technology.
Overall, the research indicated that non-bank sources of finance are boosting technology investment by European medium-sized companies. Companies invested more in technology in 2004 and intend to continue to do so in 2005. A net balance of companies increased their spending on technology during 2004. The US is the clear leader, with 62 per cent of medium-sized companies saying they spent more in 2004 (the UK is the clear European leader, with 47 per cent reporting spending increases).
In Europe, those companies spending more in 2005 intend to do so at a higher rate than those who increased spending in 2004. A net balance of companies intend to continue spending growth into 2005, with average increases amongst those expecting further growth of: UK 52.3 per cent; Germany 34.7 per cent; France 39.2 per cent; US 22 per cent. This growth comes from a low base following the post-millennial downturn -- for instance, in the currently most robust European economy (the UK), IT investment fell 21 per cent between 2001 and 2003.
Technology finance is readily available for these mid-sized companies and is boosting rates of technology uptake across IT, business equipment, plant and vehicles, fuelling recovery growth. It is most easily available in France (7.74 on a scale of 1-10), with the UK (7.71) and the US (7.61) not far behind, and less so in Germany (6.82). Although Germany lags the UK and France in this respect, all countries are showing positive signs in this important proxy for general economic recovery.
The readily obtainable finance which is driving technology investment, however, is not primarily available through traditional banking lines of credit. Lease and lease-loan packages are helping mid-sized companies escape the bank credit squeeze on unsecured lending by accessing non-bank finance. Aside from cash purchases, leasing is easily the most favoured technology financing option in the UK, Germany and, particularly, France. Growth in the overall leasing market in all three countries is therefore fundamental to companies' growing capital equipment spending plans. Notably, the overwhelming majority of medium-sized firms in Europe and the US see technology investment as critical to improving their competitive position and accelerating growth, despite a small proportion of companies that do not see technology as a competitive advantage.
In the post-millennial years, European IT investment went through a major downturn. The bubble of dotcom spending burst, and many suppliers of web-based software solutions and telecoms technology disappeared or were bought for a pittance. The general economic effect suppressed IT spending by all types of company, whether in countries where consumer spending cushioned the recessionary threat (the UK) or in those which suffered the full blast of economic downturn (Germany and, to a lesser extent, France).

Respected analyses

Perhaps one of the most respected analyses comes from the European IT Observatory, and it is worth briefly reviewing this body's latest figures.
Western Europe as a whole has firmly passed its IT market nadir of 2003 (minor or negative growth) to record a healthy 2.7 per cent upturn in 2004, with 4.3 per cent increases projected for each of 2005 and 2006. Of the three major economies in the region, the UK remains the strongest, recording 4.1 per cent growth in 2004, compared with 3 per cent in France and just 1.8 per cent in Germany. All three countries are projected to continue this generally upward trend in 2005 and 2006, but share the prediction of a very slight fall-off in 2006. If the market is widened to also include telecoms technologies -- a logical step given the convergence between the two areas -- the same overall pattern emerges. We suspect that this phenomenon marks the end of the surge in IT and telecoms investment that resulted from three post-millennium years of under-investment, with the market showing a slight correction as it settles into a new self-sustaining equilibrium for the rest of the decade.
Through the recent difficult economic period, the IT and telecoms industries were forced to react and find ways to help their businesses survive the drought. Margins came under the most severe pressure, leaving a legacy where like-for-like pricing has been permanently discounted. Moreover, newer industries have come to the fore that help organisations spread the cost of acquiring IT and manage their cash flow better. These range from a substantially increased take-up of outsourcing to software rental, ASP and web-based service provision.
The financing of software, we have seen, lags behind other more tangible IT elements, but is expected to grow substantially over the next few years. Software finance has suffered from difficulties concerning capital tax allowances for software assets in leasing arrangements. This is now being overcome, for instance in the UK, by an increasingly accommodating attitude from the tax authorities. However, possible prolongation of the tax-deductible write-off period for software in Germany may produce new uncertainties in this largest of European economies. One must also point to the criticality of software finance availability for the telecoms sector. As telecoms and datacoms convergence proceeds apace, the software element of telecoms solutions is increasing, making it imperative for medium-sized firms to be able to access suitable financial options to manage their investments in this area.
In summary, our research concludes that the growing availability of non-bank finance, in particular leasing, is helping to maintain growth in technology investment amongst medium-sized European companies. Whilst different technology types vary in their ability to access suitable finance, the general picture is highly positive. It will, however, be important for non-bank finance to become available for software in the telecoms sector to support investment in convergence technologies. Germany is expected to provide a technology investment and finance picture going into 2006 which mirrors the developments that are already discernible in the UK, France and the US.

Kari Kupila is Head of Equipment and Sales Financing, Siemens Financial Services

Lynd Morley visits Telcordia and finds a company moving beyond its redoubtable technical expertise to a new approach in product development strategies

Telcordia Technologies is not unlike the telecoms industry itself. The industry, rooted in the principles of guaranteed provision of service, 24 hours a day, seven days a week, usually supplied by state-owned monopolies, has had to learn the fast responses of a lean, mean machine in a competitive marketplace. It has had to change from the equivalent of a giant oil tanker, taking days to turn around, into the sleek, high-speed sailing yachts of the America's Cup. Equally, Telcordia's gravitas and stately pace as a highly respected software developer, focussed on technology, has had to adapt to the new order of customer-centric cut and thrust.
There is no doubt that Telcordia is a telecoms heavyweight when judged on technical expertise, innovative development, and, indeed, sheer brain power. It holds more than 800 telecom patents, and its track record includes the development of the MIME standard, the invention of Advanced Intelligent Network technology, helping to define ADSL technology, and the invention of SONET, among many others.
The question for the US-based player -- which grew out of Bell Labs and earned its development spurs in an, almost, academic environment -- is whether it can now operate successfully in the far less rarefied atmosphere of a highly competitive commercial market.
Indeed, at Telcordia's recent media and analyst conference, "E3: The Elements of Transformation", Ken Kharbanda, who joined the company in 2003 from a venture capital and investment background, noted:  "We have tremendous resources here -- great brains and great technology -- but in the past we haven't had a way of systematically commercialising our ideas, and turning our expertise and technology into products".
All that is now changing -- not only according to Kharbanda, who as Vice President for Corporate and Business Development, is responsible for strategy initiatives and developing new growth areas for the company -- but also according to the wide range of senior Telcordia executives, partners and customers lending their voices to the discussions and debates over the two days of the conference. Indeed, 'Transformation' was the theme of the event, reflecting not only the company's view of its own path, but also the future direction of the industry where the convergence of networks, technologies and services demands the transformation of business models to ensure long-term profitability.
Matt Desch, Telcordia's CEO, has been instrumental in the changes taking place in the company. Within months of joining Telcordia in 2002, he restructured the operation, brought in new blood to key management positions, and stressed the vital need to focus on customers and products rather than purely on technology. He is clearly a man who is not afraid of change, though during the E3 conference his emphasis was on 'transformation', which, apparently, has the advantage over simple 'change' of carrying far more positive connotations. Desch admitted that while transformation could be a hard pill to swallow, it was essential in the current global marketplace, illustrating his point by quoting the words of such industry stalwarts as Intel's Andy Grove, who said: "There is at least one point in the history of any company when you have to change dramatically to rise to the next level of performance. Miss that moment and you start to decline". Desch is clearly determined not to miss his moment.
Telcordia's Elementive strategy, launched in 2003, emphasised the move from purely proprietary systems to a new product platform that was standards-based, open, flexible and configurable. Desch stressed that these qualities did not only apply to the products, but also to the company's new approach. "To Telcordians," he noted, "'Elementive' means that we work, and think, and act differently."
This different approach has, more recently, been facilitated by the degree of independence Telcordia has achieved following its acquisition by Warburg Pincus and Providence Equity Partners, freeing the company from some of the restrictions which were an inevitable consequence of being owned by the heavily Pentagon-involved Science Applications International Corporation (SAIC).
Larry Bettino of Warburg Pincus noted that, following the recent shakeout in the telecoms industry, he believed it was now a good time to invest in communications, and that Telcordia was particularly attractive, given that it is among "the most experienced and knowledgeable providers of telecommunications in the world."  He went on to stress that the company is also one of the largest and broadest players in the OSS market, offering stability, reliability and service to customers, which, he believed, was not duplicated anywhere else, and undoubtedly also had the experience to partner successfully with network builders.  "After all," he commented, "they have been living and breathing telecoms for decades."

Strengthening relationships

At the E3 conference, Telcordia was understandably keen to stress that it was aiming to consolidate on its new position as an independent company by strengthening customer and partner relationships, broadening its product and service portfolio, and expanding its global presence. Desch underlined the company's more partner-friendly approach, pointing to Telcordia's emphasis on providing 'service transformation' through partnerships with the likes of IBM and Accenture, to help CSPs with the complicated task of network change necessary to the deployment of next generation services.
At the same time, the company is aiming to expand its global reach, establishing a presence in Europe, Asia-Pacific, and the Caribbean and Latin America (CALA), where, for example, it recently established a Brazilian base with offices in Rio de Janeiro and Sao Paulo -- a shrewd move given that the telecommunications market in the CALA region is predicted to grow some 40 per cent by 2008.
Telcordia will partner with local systems integrators in these various markets, pursuing its strategy to establish a solid presence outside its traditional North American base, and reflecting the essentially global character of any successful telecom player in the current market. A few weeks after the E3 conference, for example, the company announced that Liaoning Mobile Communications, a subsidiary of China Mobile, had selected Telcordia Granite Inventory Management, part of the Granite Service Resource Management Portfolio, which will enable China Mobile to intelligently manage network and service-related assets across the organisation in a consolidated, flexible and service-centric manner. Commenting on the selection, Liu Cheng of China Mobile emphasised the quality of the technology in the solution, and added: "Telcordia is also providing superior business and technical assistance to China Mobile Liaoning, helping us transform our operations to anticipate and address changing market forces. We are working together with Telcordia on implementing activities across the whole province."
This emphasis on partnership and global spread underlines the company's determination to shed its somewhat 'ivory tower' image -- an aim highlighted by the dialogue Telcordia clearly intended to establish at E3 through the presence of a wide range of guest speakers. These included MCI-Europe's Andy MacLeod, Fari Ebrahimi of Verizon, and Telecom Argentina's Edmundo Poggio who participated in the panel discussion Seizing the Power of Operations Transformation; and Dan Elron from Accenture, Vern Kennedy of Broadview Networks and Joe Gensheimer of Movida, who were the star attractions on the Services Transformation: Converging on a Common Future panel.
Despite the emphasis from Telcordia on becoming more product focussed, in his presentation Desch had commented that while products are undoubtedly the tools of transformation, the process is really about people -- a position enthusiastically supported by the speakers on the Seizing Power panel. Andy MacLeod, for instance, noted that: "Transformation is only a little bit to do with technology, it's mostly to do with people" and added that the focus really has to shift to the customer. He commented, however: "As an industry, we don't do customer service very well. We really have to learn to create an enhanced service experience." Edmundo Poggio added that you have to begin with employees when making sure that people understand the benefits of change, but Fari Ebrahimi pointed out that it is human nature to fear change, so both employees and customers within the industry might well be scared of transformation. He went on to stress, therefore, that the whole process of transformation has to entail education to overcome inevitable concerns, as well as a multidirectional communications process that not only helps customers understand your offering, but also helps the provider understand where the customer is coming from. Indeed, the overall message from the panel was that none of this transformation can be achieved in isolation -- it must include input from all parties, not least of which is the marketplace itself.
Not unnaturally, the talk at E3 focused to a very large extent on "getting closer to the customer" -- a sentiment rather frequently expressed in the industry of late.  Desch stressed, however, that in this context the phrase was intended to emphasise the quality of the relationship -- really listening to customer requirements, and making sure that product and engineering services were located geographically close to the customer to ensure true fulfilment of local needs.

Hot potatoes

For all the touchy-feely emphasis of much of the E3 conference, Telcordia did not avoid tackling some of the industry's hot potatoes, such as outsourcing.  Desch explained that the company was offshoring around 15 per cent of software development to such countries as India and Russia, but emphasised that while that figure might go to around 20 per cent, it would not go any higher because the knowledge and expertise built up within Telcordia was vital to its product development. "We're not interested in simply having programmers developing software in a factory-like scenario," he stressed.
At the same time such thorny issues as security and privacy (which were identified as undoubtedly providing a strong differentiator in product offerings) were also tackled, not only in the main sessions, but also in the particularly effective break-out sessions where Telcordia executives, partners, analysts and media installed themselves in small, casually furnished rooms for less structured discussion of the issues -- a further proof of Telcordia's determination to shed its more exclusive and formal image, and even allow many of its prized shibboleths to be challenged in the lion's den.

Lynd Morley is Editor of European Communications

Selecting the right OSS/BSS vendor is crucial both to the maintenance of happy customers and future profitability, says Andrew Rodaway

Walk down any high street and it's pretty clear that telecoms services -- even the most technically sophisticated ones -- are becoming commoditised. As I write this you can buy a new 3G prepaid phone, complete with usage credit, for less than £50. And if you opt for a monthly contract, the differences between tariffs are often minimal, because business is so competitive. Is it any wonder that customer retention is a growing problem for the networks?
The solution, obviously, is high service quality and good customer care. Happy customers don't walk, as one carrier executive put it to me. But with most networks, both fixed and wireless, offering seamless coverage and high technical call quality, service quality brings the issue of Operations and Business Support Systems (OSS/BSS) performance into sharper focus. If your OSS/BSS aren't actively helping you build customer loyalty through the capabilities they give you to deliver great service, you're going to be at a competitive disadvantage.
For example, many operators want a unified approach to billing, where multiple services can be converged onto a single presentation to the customer. Even when that is achieved, the technical reality behind it may be many different billing systems. Suppose there is a bill problem, and the customer calls to get it fixed. Your OSS/BSS can make this a seamless experience, or it can be a customer service nightmare.
There is no one-size-fits-all solution, given the wide range of services, customer volumes and technical environments. But with growing competition for customers, the last thing you want is a failed OSS/BSS project, or multiple attempts to solve the same problem, because the vendor or its technology didn't live up to expectations.
So, how do you make the right choice of vendor? Note that I say vendor, not systems, because there has been a noticeable shift in the last couple of years in the way that networks procure critical OSS/BSS. Not so long ago it was primarily a technical issue -- how well did the product on offer meet the RFP functionality requirements? But after the last few, tough years in telecoms, the focus is now much more on the suitability of vendors as suppliers of complex, customer care-critical systems.
Of course, product capability still matters, but even the best product can be a liability if the vendor can't afford to develop it, or hasn't got the staff to implement it, or to support it in production well into the future. Worse still, what if your chosen vendor gets acquired, goes broke, or changes product strategy? All these things have happened regularly to high-profile OSS/BSS businesses in the past couple of years. In many cases their customers were left high and dry.
It has to be said that there are no foolproof ways to assure yourself that a vendor is going to be a suitable, long-term partner for the supply of OSS/BSS technology and services. But there's still a strong case for appropriate due diligence on any supplier, however well established. Here are some questions to ask -- and these are developed from real-world experience, as someone who has been involved in the acquisition of numerous OSS/BSS vendors in the past few years.
Firstly, take the most obvious case -- financial stability. Many vendors are finding it hard to build a sustainable business model. Even today, when the industry seems to be doing a lot better, judging by the reported results of many carriers, there are still OSS/BSS vendors bleeding cash. Some are propped up by VC or IPO funds, but nothing lasts for ever, particularly investor patience. Take a long, hard look at your shortlisted vendors' financial statements, and ask yourself where they will be in 3-5 years' time.
Think also about product investment. Modern OSS/BSS are complex, and they need a lot of development spend to be truly reliable in production environments. Ask your vendors about quality processes, testing, benchmarking and long-term roadmaps as well as functional capabilities.
OSS/BSS aren't just software. The professional services that go alongside are vital. Major OSS/BSS projects need in-depth skills in project management, sizing, business analysis, architectural design, implementation, configuration, data transfer, integration, tuning, communications, database performance and many other areas. Does your vendor or its SI partner have these skills, in sufficient numbers, where and when you need them?
Long-term support and customer care is another consideration. If a vendor has stability in its customer base, it probably isn't just a coincidence. Just as with carriers, OSS/BSS vendors that work hard to keep customers happy tend to develop long relationships. Site visits to recent implementations may be impressive, but what about the customers sold five years ago?
These may be straightforward principles, but good customer care isn't black magic -- for either carriers or their suppliers.

Intec was an exhibitor at Billing Systems in London. For further information see www.iir.co.uk/billing

Andrew Rodaway is Director of Marketing, Intec

Tom Uhart explains why it is vital for the industry to get its sums right when it comes to settlement procedures

Through 'revenue-share' business models, operators and content providers have found a way to share incentive and reward, and this has resulted in the recent explosive growth of mobile content. 
As a result, operators are finding themselves at the centre of mobile content value-chains, with the need to calculate revenue splits based on data transactions. This is not always trivial; the simple retail models involved in mobile content purchase do not translate into basic wholesale models between operators and third parties. Even a straightforward ringtone download scenario can involve the operator settling with both a content provider and a digital rights management entity.
As the market moves on apace, operators will begin to enter ever more complex value-chains. Free in France, for example, is now offering 20Mb ADSL that will certainly result in video and music on-demand offerings based around deals with content providers. In this context, settlement is set to become a key component of the business support systems architecture.
Traditional retail billing platforms offer low levels of assistance for settlement, typically supporting simple models such as sponsorship or fixed percentage revenue-split per transaction. In order to manage more complex settlement models, operators are often resorting to manual or semi-manual calculations based on data warehouse extracts and Excel spreadsheets. These often involve the operator's IT, marketing and finance departments, all of whom need to communicate with one another. The results are far from audit proof and the method raises a number of key issues.
Disputes between parties -- operators, content providers and media companies, for example -- are commonplace, and notoriously difficult to reconcile. Each organisation holds its own records, which it produces as evidence during such disputes, and reaching agreement can entail significant human effort and service downtime.
With increasing complexity in the behind-the-scenes B2B business models, manual settlement no longer presents a viable solution. Now is the time for organisations involved in complex billing scenarios to take action. Opting for the low levels of support offered by traditional billing platforms, and continuing to fight ongoing payment disputes should both become a thing of the past.
The time is ripe for operators involved in revenue-sharing models to consider installing a 'deal management and settlement' platform. This will enable them to offer the types of agreements needed to attract and retain the best partners -- a key competitive advantage in offering compelling data products and services.
Today, the core business of operators is not to create content, but to enable its distribution. As key players in new value-chains, operators must be able to stay on top of their revenues to ensure a healthy bottom line. And they must aim to stay one step ahead in terms of back-office support.
A deal management system enables specific settlement rules and tariffs to be defined, managing individual events and simplifying new revenue streams. Such a solution has the ability to settle disputes on a level playing field. The parties involved in the transaction can then monitor the success of the content, and feed accurate findings into their marketing planning.
The market for content is booming, and the time is ripe for operators, content providers, media companies and billing companies alike to capitalise on its exponential growth.
Exciting times lie ahead -- but only for those who get their settlement pitch right.

Nimbus Systems were an exhibitor at Billing Systems in London.  For further information see www.iir.co.uk/billing

Tom Uhart is co-founder and managing partner of Nimbus Systems   www.nimbussys.com

There were good vibes aplenty at this year's TeleManagement World in Nice...

Delegates and exhibitors at this year's TeleManagement World gave the distinct impression that they were taking an upbeat and optimistic view of their particular world. Certainly, in terms of sheer numbers it was the biggest and best TeleManagement World to date, with over 2300 delegates and exhibitors in attendance, including over 80 vendor exhibitors and eight Catalyst showcase exhibits -- the TMF's "living labs" showing multi-vendor demonstrator projects, developed to TM Forum specifications, such as eTOM (enhanced Telecommunications Operations Map) and SID, the data and integration model.
Keith Willetts, TM Forum Chairman, set the optimistic tone when he commented: "This is by far the biggest TeleManagement World yet, on just about every measure. The industry has been in the deep freeze for a few years, but now the ice is really melting."
His upbeat stance was taken up by many of the conference speakers, including BT's CTO Matt Bross, who made no secret of the fun he was having, announcing to the audience that being CTO of BT at this point in time is "the best job on the planet". He went on to describe BT's plans for the £10-billion build-out of its 21st Century Network -- an all-IP network -- naming Fujitsu, Alcatel, Siemens, Ericsson and Lucent, among others, as key equipment suppliers in the scheme. At the same time, he emphasised that BT aims to reduce its cost structure by £1bn a year, as of 2009.
Tapping into the much-vaunted concept of the "tipping point", Bross believes that this BT project is tipping the traditional telephony service into something he calls "networked IT", and underlined the changes underway with those words -- so beloved of software vendors over the past few years -- convergence and transformation. But he stressed that convergence simply won't happen if the underlying infrastructure to deliver and support it is not put in place -- and that is the role which transformation has to play.
Indeed 'transformation' was a recurring theme for the conference speakers. Not surprisingly, Matt Desch, CEO of Telcordia Technologies, and a standard-bearer of transformation (see story on page 48), stressed that transformation, which is being driven by consolidation and change, was as much about non-technical issues as about technical ones. He warned that companies are likely to have to tackle strong resistance to change -- a fairly consistent human reaction -- but noted that effective commercial survival might well depend on the success of the process.
While not exactly bucking the keynote speeches' trend -- and its attendant enthusiasm for transformation -- Telecom Italia's Stefano Pileri appeared to be taking a more measured approach, emphasising that the company's migration to NGN will be market driven, promoting new services such as IPTV, and only making infrastructure investment when necessary.
Outside the conference rooms, the exhibition floors were busy, noisy and clearly reflected the general feeling that telecoms was, at last, moving out of the doldrums. Most exhibitors said that they were more than satisfied with the level of attendance (a comparatively rare sentiment at industry exhibitions), and many underlined the general view that business was 'on the move'. Many exhibitors timed a whole range of announcements to coincide with TMW, heightening the sense of an active, go-getting industry, as well as taking advantage of the considerable press attendance at the event. CSG, for instance, announced that BT will deploy the latest version of its Kenan Billing Platform. The two companies will work together to upgrade six deployments across the BT group, supporting services including fixed line local and long distance, IP and broadband. On the customer care front, Nawras, the Oman-based mobile operator, opted for Microsoft's Customer Care Framework, while Atos Origin also took the Microsoft route, announcing the launch of a Microsoft-based solution which includes pre- and postpaid billing, CRM and order management, for MVNOs and enablers.
The more sombre topic of fraud and revenue leakage was highlighted by Azure when the company announced that research it had commissioned from analysts Analysys shows that telecom operators have lost around $115 billion in revenue since the last TeleManagement World event. The major sources of revenue loss were identified as being fraud, poor processes and procedures, applying new products and prices, and incomplete or incorrect CDRs. At the same time, Azure announced the launch of its interconnect version 9.0, the latest manifestation of its interconnect system and bureau service, which allows operators to analyse call data, bill more accurately and verify interconnect traffic between themselves and other network partners. 
Taking up the TMF's much repeated advice on lean business transformation, Denmark's TDC is automating its back office processes using Cramer inventory-powered automation software to enable the rapid and cost-effective introduction of next generation services and facilitate the phased retirement of legacy systems. Quality monitoring announcements were also in evidence with the likes of Tele.ring opting for NetTest for service quality monitoring of its UMTS network, while Proximus chose Datamat's wireless end-to-end service quality monitoring solution, which will perform quality assessments on such services as voicemail, SMS, MMS and WAP.
There were, of course, many more announcements, and almost certainly several deals struck that did not quite reach the announcement stage in Nice. Exhibitors did not, of course, confine themselves purely to the exhibition floors, and many were also clearly keen to participate in -- as well as present -- a wide range of conference sessions, which were divided into five tracks: Lean Business Transformation; Implementing NGOSS; Next Generation Network Operations; Customer-Centric Services; and Billing and Revenue Management. They were also -- as was everyone else attending TMW -- much in evidence at the many social functions, where that essential element of any successful event -- networking -- is eased and oiled by copious amounts of champagne, canapés or, indeed, cordon-bleu, several-course meals. Certainly judging by the chatter volume and very evident bonhomie at these events, things must be looking up for the telecoms industry.


Seven years on from its inception, Bluetooth has developed into the must-have handset technology. Glenn Collinson looks at how it has moved forward

At its inception in 1998, Bluetooth was intended, in its simplest form, as a wire replacement technology using short-range radio connections to transfer voice and data quickly. Isolated pundits scoffed at the technology, predicting that it would be forgotten in a few years. However, seven years later, Bluetooth is going from strength to strength and is recognised as an essential feature for new mobile handset models.
This global recognition cannot simply be explained by commitment from the handset designers and silicon providers to the technology -- consumer demand for Bluetooth technology has proved insatiable. Indeed, in many equipment reviews, the lack of Bluetooth often has a significant negative impact on a product's final score. This trend has been further demonstrated by the recent adoption of Bluetooth by companies such as Samsung, whose first Bluetooth-enabled GSM handset was so popular that extra supplies had to be found from other parts of the globe in order to satisfy demand.
This article will explain why Bluetooth has enjoyed such popularity and why this popularity will continue to grow. It will examine the evolution of the Bluetooth standard and highlight the different enhancements brought by newer versions, along with details of new and existing applications for this technology.

The technology

A Bluetooth wireless link sends data to another Bluetooth radio in the form of data packets which are sent at a rate of 1Mbit per second in the case of standard rate Bluetooth, and 3Mbits per second in the case of enhanced data rate Bluetooth. Voice packets (in synchronous connection oriented, or SCO, channels) are given priority over data, thus maintaining a basic level of audio quality. More recent iterations of the Bluetooth standard, combined with new software providing an extra level of echo cancellation, have contributed to the huge uptake of Bluetooth headsets, especially in countries where driving safety legislation has banned the use of a handset when at the wheel of a car.
Class 1 Bluetooth devices, which offer a range of up to 100m, require more power and therefore tend to be featured in products with a ready and fixed power supply, such as a PC or printer. Smaller, battery-dependent devices like mobile phones, headsets or PDAs tend to be kept within a short distance of each other within the personal area network (PAN), and therefore often feature a Class 2 device with a range of approximately 10m.
Profiles included in the Bluetooth protocol stack determine the applications for which the device can be used. Common profiles include a handsfree profile for use with Bluetooth headsets or in-car handsfree systems, or dial-up networking, which allows a Bluetooth-enabled mobile phone to be used as a modem to connect a laptop or PDA to the Internet.
Bluetooth technology has moved on a lot in the past 18 months. In mid-2003, the Bluetooth Special Interest Group (SIG), which defines the Bluetooth standard, launched a major revision of the specification. The revision, named v1.2, brought with it an even more robust connection, as well as enhanced coexistence with other wireless standards, especially those in the 2.4GHz band -- such as Wi-Fi (802.11b/g).
In addition, in late 2004, v2.0+EDR was introduced bringing designers the chance to develop new Bluetooth-enabled products with faster data rates, even lower power consumption, and more robust voice and data connections. The evolution of the specification introduced many improvements to the technology. Some of the key improvements are detailed below with an explanation of the benefits they have brought in terms of additional applications that would otherwise have been unthinkable.

Adaptive Frequency Hopping

Adaptive Frequency Hopping (AFH) is one of these enhancements. AFH allows Bluetooth devices to monitor the link quality and then determine if there are poor channels (interference from other devices) present on specific frequencies. In such a case, the Bluetooth devices adjust their hopping sequence to avoid the bad channels, therefore improving data throughput.
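The AFH idea can be illustrated with a toy model: channels whose measured error rate crosses a threshold are marked bad, and the hop sequence is remapped onto the remaining good channels. The threshold and the error-rate source here are illustrative assumptions; real Bluetooth AFH operates over 79 channels with a specified minimum set of usable channels.

```python
# Simplified illustration of Adaptive Frequency Hopping: channels with a
# high measured packet-error rate (e.g. due to Wi-Fi interference) are
# excluded, and nominal hop channels are remapped onto the good set.
# The 0.5 threshold is an assumed value for illustration only.

NUM_CHANNELS = 79
BAD_THRESHOLD = 0.5  # assumed packet-error-rate cutoff

def good_channel_map(error_rates):
    """Return the list of channels still considered usable."""
    return [ch for ch in range(NUM_CHANNELS)
            if error_rates.get(ch, 0.0) <= BAD_THRESHOLD]

def remap_hop(channel, good_channels):
    """Map a nominally chosen hop channel onto the good-channel set."""
    return good_channels[channel % len(good_channels)]

# Suppose channels 0-10 clash with a Wi-Fi network and show high error rates:
errors = {ch: 0.9 for ch in range(11)}
good = good_channel_map(errors)   # channels 11..78 remain usable
hop = remap_hop(37, good)         # nominal hop 37 lands on a good channel
```

The net effect is exactly what the paragraph above describes: throughput improves because hops that would have landed on interfered channels are redirected to clean ones.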

Extended Synchronous Connection Oriented links

The addition of the extended Synchronous Connection Oriented (eSCO) link type is another valuable addition to the Bluetooth specification. These links allow the checking and retransmission of voice packets to enhance the quality of the audio link -- an extremely important feature for consumers who choose to invest in a Bluetooth headset.

Enhanced Data Rate

For any communications technology, faster is almost always better, and Bluetooth technology is no exception. Enhanced Data Rate (EDR) was announced by the Bluetooth SIG in June 2004 and delivers raw data rates up to three times faster than basic rate Bluetooth. EDR is a subset of the latest v2.0 specification of Bluetooth, called v2.0+EDR.
However, EDR also brings benefits in multiple-connectivity. In isolation, standard Bluetooth-based applications do not generally demand more than the current 1 Mbps data rate. But as Bluetooth technology grows in popularity, users will increasingly run multiple Bluetooth links at the same time. This is particularly true for PCs, where it is easy to imagine a scenario where a user is operating a Bluetooth mouse and keyboard and at the same time listening to stereo audio over a pair of Bluetooth headphones. EDR gives Bluetooth the extra capacity it needs to handle links to multiple devices at data rates that users will find acceptable. In other words, users will not be tolerant of the need for the protocol to retransmit data packets if this means waiting for the mouse cursor to respond.
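A back-of-envelope calculation makes the multiple-link point concrete. The per-device bit rates below are rough illustrative figures, not official profile numbers, and protocol overhead and retransmissions are ignored for simplicity.

```python
# Back-of-envelope sketch: aggregate demand of several simultaneous
# Bluetooth links versus basic-rate and EDR raw capacity. The per-device
# bit rates are rough illustrative assumptions.

BASIC_RATE_BPS = 1_000_000   # basic rate raw capacity, 1 Mbit/s
EDR_BPS = 3_000_000          # enhanced data rate raw capacity, 3 Mbit/s

links_bps = {
    "mouse": 20_000,          # assumed: low-rate HID traffic
    "keyboard": 20_000,       # assumed: low-rate HID traffic
    "stereo_audio": 350_000,  # assumed: compressed stereo stream
}

def headroom(capacity_bps, links):
    """Fraction of raw capacity left after serving the listed links."""
    return 1 - sum(links.values()) / capacity_bps

basic_headroom = headroom(BASIC_RATE_BPS, links_bps)  # ~0.61
edr_headroom = headroom(EDR_BPS, links_bps)           # ~0.87
```

Under these assumed figures a basic-rate link already serves the three devices, but with far less slack; EDR's extra headroom is what absorbs retransmissions and bursts without the mouse cursor stalling.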

New areas for Bluetooth technology

The evolutionary changes to the Bluetooth specification detailed above give developers an expanded purview for application areas. The characteristics of AFH, eSCO, and EDR make Bluetooth technology appropriate for new and diverse tasks.
EDR and eSCO have made possible some interesting user scenarios. One of them has to do with the newly approved Audio Visual profiles. EDR enables stereo headsets to be used over a Bluetooth link at the same time as other Bluetooth connections. An example application would be a pair of Bluetooth stereo headphones that can be used for a phone call as well as for listening to music. The user listens via Bluetooth stereo headphones to an MP3 stream from a PC. When a phone call comes in, the phone notifies the application on the PC via a Bluetooth link, and the user is asked whether they wish to answer the call. If the user chooses to take the call, the PC pauses the MP3 stream and tells the phone to proceed. Once the conversation is complete, the phone hangs up and tells the PC to resume the MP3 music.
Another related PC/phone/headset scenario is the "e-mail on the fly" situation. This scenario puts the user in, for instance, a busy airport. An e-mail is delivered to the PC via the Bluetooth link with a mobile handset. Software on the PC in the user's briefcase scans the e-mail and determines that it is important.  The application then performs a text-to-voice translation and "reads" the e-mail to the user via a Bluetooth headset. The user is then able to use the EDR link from the headset to the PC to send a high quality reply stream that is processed by a voice-to-text utility that composes an e-mail to be sent back to the original sender. The receipt, processing, and reply of e-mail can all happen 'on the fly' without the user having to stop and type e-mails.

Ultra-fast Bluetooth

The Bluetooth SIG recently announced that it intends to work towards developing a UWB Bluetooth standard which could potentially offer data transfer speeds of up to 480Mbits per second (compared to the top rate of 3Mbits per second for EDR Bluetooth). This proposal is still in the early stages of development but once the physical (PHY) layer is finalised from the competing standards currently fighting for dominance, the potential future applications for this new form of Bluetooth are vast. Its main attraction however will come from the ability to transfer huge media files such as high-definition TV programmes, films and music, instantly without the need for slower streaming data transfer.
Bluetooth has enjoyed global success in the consumer electronics market thanks to a number of different factors. The technology has evolved quickly to respond to potential coexistence or link quality issues. The rigorous testing and qualification process demanded by the Bluetooth SIG also ensures optimum levels of compatibility between all Bluetooth products available to consumers. This evolution has also introduced applications that would otherwise have been unthinkable (for example, the highly advanced automotive telematics systems in high-end Japanese and European car models, using mobile phones connected over Bluetooth to update the car unit and utilise handsfree calling). Further applications are already expected from the anticipated UWB-based ultra-fast Bluetooth, which may remove even more cabling from the home entertainment system.
In short, Bluetooth has proved itself a robust and secure technology which has delivered on past promises and offers more exciting future developments sure to capture consumer interest.

Glenn Collinson is co-founder of CSR

With more and more people in the UK eschewing urban life for a pastoral alternative, broadband technology is providing the means for workers to stay connected to corporate networks. And Wales is a prime example of telecommuting in action, says Michael Eaton

In 1975, UK TV audiences were first introduced to Tom Good and his wife Barbara. Kitted out in dungarees and galoshes, the pair had quit the rat race in pursuit of The Good Life, transforming their Surbiton home into an urban farm. Not only a BBC hit, The Good Life may also have presaged a trend today's researchers are calling "greenshifting", as more than one in four United Kingdom urbanites move into the countryside to carve out their place in rural paradise.
Last year the Countryside Agency (CA) published its State of the Countryside Review 2004 and found that in the past four years alone, an additional 352,000 people have made the move from urban to rural.
The trend is due in some part to soaring property prices which are pushing suburb boundaries into the countryside, but it is also due to a genuine desire from not just families, but single people too, for a better quality of life -- green space offering room to move and clean air to breathe. Hot on the list of desired locations is Wales, which, more and more people are fast realising, is a remarkably networked country -- in fact the Office for National Statistics found that, in 2003, 60,000 newcomers moved to Wales.

Clocking into the virtual office

Adoption of a rural life doesn't have to mean a total abandonment of one's professional career, thanks to the advent of telecommuting, otherwise known as remote working. In short, this is the process by which employees maintain day-to-day business operations from their homes instead of a centralised office, using remote communication services to transmit and receive business communications, documents and data. It is predicted that, by 2010, 10 million UK employees will have adopted telecommuting.
For employees it's the ultimate way to achieve that elusive work/life balance we all desire, with the added benefits of reduced commuting time, fuel expenses, auto depreciation, parking expenses and stress; cost savings on wardrobe and meals; proximity to family; autonomy and control over work conditions and schedules; and an improved quality of life.
Advances in communications technology mean that there is no longer any physical reason for many kinds of office work to be done in one location rather than another, and as such work is becoming something you do, rather than a place you go to. The present trend of moving away from expensive city centre offices and adopting telecommuting is proving to be a major cost saving for businesses. Employers benefit from enhanced productivity, lower labour costs, coverage for difficult shifts, enhanced employee retention, reduced absenteeism, reduced sick leave and healthcare costs, reduced relocation costs, and improved motivation of employees. What many people don't realise is that they can achieve all of these things using a standard 512kbps broadband connection at home. It can provide fast, direct access to corporate networks and files equivalent to being in the office. Users can keep calling and faxing as normal while connected, and clients won't be driven away by annoying engaged signals.

Welsh broadband in action

We partnered with research agency ORC International earlier this year to survey 5,500 Welsh residents on their broadband habits and opinions and found that overall broadband take-up in Welsh homes has reached 25 per cent, and that take-up of speeds of 512kbps or higher has jumped from 11 per cent to 17 per cent in less than a year. These excellent results are thanks to rising awareness levels, which have reached a staggering 93 per cent, pushing Welsh broadband take-up above the UK average. Research has also shown that Cardiff is one of the most broadband connected cities in the UK.

A couple of success stories

Reynoldston is a small village in Gower (outside of Swansea) and is widely known not only as the home of King Arthur's stone but also for its spectacular coastal views. For architectural consultant, David Clarke, Reynoldston was the ultimate retirement destination from his busy Oxford practice. However his retirement plans hinged on one factor -- the availability of broadband, so that he could continue work as a specialist architectural consultant. Just like hundreds of other people moving to Wales for the first time, David couldn't fathom that such a remote community could have access to broadband. But it does, thanks to the Reynoldston Community Wireless Network, a village initiative that provides a local wireless network to homes in the village.
In a similar way, Inet Experts' Andrea Jones is running her business on a satellite connection from the top of a mountain in Llanelli, proving that work/life balance is achievable. The business IT and training aspect of the company was set up specifically so Andrea could relocate back to rural Wales. She says without satellite "the business could not function" as dial-up doesn't provide the power to quickly manage large files, or to provide the professional polish expected from an IT company. Moreover, the "real time" aspect of broadband helps Andrea to show customers the work she is doing without them needing to visit her home.

Feeling insecure?

So what are the telecommuting issues that are keeping managers awake at night, and what steps can be taken to combat them?
When a computer is detached from the core network, antivirus software updates can lapse. Therefore users, particularly those using a home PC, need to be diligent and ensure their computer is adequately protected, otherwise they might find themselves in the awkward position of passing on infections to their customers and co-workers. Users should set up their systems so that antivirus and intrusion-detection software is automatically updated each time the corporate network is accessed.
In addition, an NCC Group survey recently found that one in six remote PCs didn't have adequate protection against hackers, who each day are conceiving more elaborate ways of accessing corporate networks through the "back door". For example, spyware can covertly gather data, including keystrokes and passwords, and deliver it straight into the competition's hands. Firewalls are another essential tool for telecommuters.
Finally, if we're being honest, we're often our own worst enemies. As a tip -- get a physical lock, and ensure access to the core network is routed through a virtual private network that requires a pin and a password.
To ensure that staff adhere to vital security protocols, some employers are including them in their remote working security policy and linking them to staff appraisals. Certainly food for thought...
The Countryside Agency has said "the increase in rural population witnessed in recent decades shows no signs of abating as more and more people choose to move from urban to rural districts." There's no denying that it's time to exit the fast lane, pick up the pitchfork and plug in the laptop. Owning a few green acres in Wales is not just for farmers anymore, it's about people like you and me chasing the elusive work/life balance.

Michael Eaton is Director for the Welsh Assembly Government's Broadband Wales Programme
Details: broadband@wales.gsi.gov

Getting consumers interested in 3G services will require the right strategy from operators if the technology is going to bring home the bacon, argues Matt Hooper

The advertising and marketing blitz that preceded Christmas 2004 made it clear that consumer 3G services have well and truly arrived in the UK. And if the messaging was anything to go by, network operators are pinning their hopes on streaming content capturing the imagination of a subscriber base until now reluctant to embrace next generation mobile technology.
'Full track music downloads -- live it!' screamed 20ft posters in cities all over the country. 'Watch music videos on your mobile -- all the way through!' chanted faux-Japanese animated characters in peak rate TV advertising slots throughout December. The point they wanted to get across was this: with 3G, you can use your phone to consume content in ways not possible on 2.5G networks, whether it's listening to music or watching video on your handset. More bandwidth transforms your phone into a lifestyle accessory. 
ARC Group estimate that the mobile video market will generate worldwide revenues of $5.4 billion by 2008 -- but 2008 is a long way off. Are current 3G services really going to usher in a brave new world where rich content such as video is king and operators can clean up by fulfilling endless consumer demand for streaming music and video services?
In short -- of course not, at least certainly not in the short term. The plain fact of the matter is that operators are a lot more excited about 3G than consumers are; behind the multi-million pound advertising lies a relatively slow subscription rate that betrays the true extent of consumer apathy.
Consumers buy into the services, not the technology, and the services currently available do not yet take full advantage of the opportunities offered by third generation networks. Furthermore, there is often little thought given to the promotion of these streaming services; for many subscribers, streaming is a new and unfamiliar technology, something to be wary of, and not the indispensable lifestyle accessory the glossy ads make it out to be. To really drive these rich content services, operators need to look at both their delivery infrastructure and marketing strategies.
At the moment, most operators are starting to employ dedicated streaming servers to deliver music and video content over the air to subscribers. For the most part, these servers are workable, standalone solutions that do the job they're intended for -- in some cases, even over the 2.5G network. However, a standalone streaming server is not a sound basis for an ongoing content revenue strategy. In fact, maintaining a standalone solution that isn't integrated into existing content management and delivery systems is simply another cost for operators to bear.
A standalone streaming server is essentially a 'store and forward' mechanism, which makes it very difficult to develop video content into a targeted offering. To do this, integration points are required into the content management, CRM and, importantly, content delivery infrastructure; this allows all commercial aspects to be controlled -- such as pricing, catalogue and portal management, promotion and the discovery/delivery experience. When delivering video, marketing and education is key; with a standalone server, there is a reliance on the user at the device level to ensure that a media player client is installed and able to handle the relevant content. Put simply, an integrated and unified content catalogue and delivery platform is required.
With the appropriate management and delivery infrastructure in place, the next key consideration is the way new services are marketed to subscribers. Operators need to proactively stimulate demand for rich content; passive services will not find users.
Firstly, operators need to think in terms of segmenting their subscriber base to ensure their marketing activities are targeted. For example, adult content will certainly be one of the hot areas likely to drive the uptake of streaming video, but promotional campaigns need to be very specifically targeted, based on subscriber age and registered preferences. Beyond the adult and sport arena, streaming music services -- which will almost certainly be the other category that sparks consumer excitement -- are more likely to find an audience if tailored promotional campaigns can be delivered to an identifiable subscriber sector.
Secondly, operators need to think more specifically about how they can integrate 3G services into their existing content portfolio. Bundling content into relevant, themed packages, for example, will help        introduce the subscriber base to the possibilities of streaming. Offering video clips alongside familiar items such as games and ringtones will help drive discovery, boost uptake and help establish an appetite for richer media.
For example, operators could bundle branded content from a blockbuster movie that includes stills from the film, a JAVA game, a ringtone -- and a streaming trailer for the movie itself. Subscribers then receive a catalogue of mostly familiar items relevant to their interests, increasing the likelihood that they will trial the video service. By pursuing this 'soft introduction' technique, operators can gradually build user familiarity with 3G services and develop consumer demand.
Once we step beyond the hype, what does the next year really hold for the 3G content market? To a certain extent, it's a chicken and egg situation. Video and audio streaming won't be the cash cow everyone hopes for until more subscribers can be convinced to start using these types of service, and device penetration increases. Simultaneously, without operators investing in the appropriate infrastructures and taking a more holistic view of the content marketing function, the content on offer is unlikely to engage consumers and build the desired levels of demand. As the ARC Group have noted, the purchasing behaviour of consumers is the hardest variable in the rich content market to predict.
In a nutshell, 3G is ready, but the advanced services that can provide the revenues necessary to wipe out the burden of 3G license fees are in the early stages. The opportunities are there, but until operators can get their pricing, delivery and marketing strategies right to bring customers on board, streaming content will remain a niche sector for some time to come.

Matt Hooper is Vice President, Marketing & Strategic Alliances, elata Ltd and can be contacted via tel: +44 (0)1202 207407   www.elata.com


