Q&A

Cisco UK’s CTO and technical director Ian Foddering discusses all things cloud.

Eurocomms.com: Last week, Cisco estimated global cloud computing traffic will grow to 1.6 zettabytes annually by 2015 – a 12-fold increase. What is the single most important thing that operators must do to ensure they are ready for this?

Ian Foddering: One of the most important things is download and upload speeds, as well as latencies. These are vital measures for assessing a network's cloud readiness and should be one of the big focuses for operators.

Nick Webb, head of solutions at Vodafone Global Enterprise, discusses the mobile workforce, unified comms and the company’s new R&D facility in Silicon Valley.

Eurocomms.com: What’s the biggest challenge facing you and your enterprise customers currently?

Nick Webb: One of the biggest challenges is how to deal with the implications of a rapidly growing mobile workforce. Five years ago, there were typically only a handful of senior executives and sales people working away from the office. Companies across all sectors now have increasing numbers of staff throughout the business working flexibly or away from the office. This change is rapid and relentless. For example, one of our customers, a major multinational electronics business, has seen more than 70 percent of its workforce adopt some level of mobile working in just two years.

BT’s latest financial figures showed that the company is performing adequately amid the difficult economic conditions.

Profit was up three percent to €3.3 billion in the last six months boosted by a 46 percent rise in new retail broadband customers.

However, revenues were down three percent at €11.2 billion over the same period with the company’s wholesale division the worst hit – revenues there fell seven percent due to mobile termination rate reductions.

BT Group CIO Clive Selley reflected the topsy-turvy nature of the current business climate in an exclusive interview with European Communications.

Martin Péronnet, CEO of Monaco Telecom, discusses the firm’s LTE trial

Eurocomms.com: You are currently trialling LTE – how many people are taking part and what devices is it available on?

Martin Péronnet: We have been running an LTE trial since last April on Ericsson equipment. The network is available through any Monaco Telecom SIM card and the devices we are using to date are dongles. LTE is still in its infancy so vendors have not released any commercial equipment yet. We should start to see some next year.

Geoff Unwin, the new chairman of network software provider OpenCloud and former CEO of Cap Gemini Ernst & Young, shares his thoughts on the state of the telecoms sector.

Accanto Systems has come a long way from its roots as a provider of protocol analysers. Its focus is now on helping service providers to overcome the challenges of offering mobile data and VoIP services while also managing legacy technologies

While IP appears to have simplified telecoms, Christoph Kupper, Executive Vice President of Marketing at Nexus Telecom, tells Lynd Morley that the added complexity of monitoring the network - due largely to exploding data rates - has led to a new concept providing both improved performance and valuable marketing information

Nexus Telecom is, in many ways, the antithesis of the now predominant imperative in most industries - and certainly in the telecoms industry - which requires wholesale commoditisation of services; an almost exclusive focus on speed to market; and a fast response to instant gratification.

Where the ruling mantra is in danger of becoming "quantity not quality" in a headlong rush to ever greater profitability (or possibly, mere survival), Nexus Telecom calls something of a halt, focussing the spotlight on the vital importance of high quality, dependable service that not only ensures the business reputation of the provider, but also leads to happy - and therefore loyal - customers.

Based in Zurich, Nexus Telecom is a performance and service assurance specialist, providing data collection, passive monitoring and network service investigation systems.  The company's philosophy centres around the recognition that the business consequences of any of the network's elements falling over are enormous - and only made worse if the problem takes time to identify and fix.  Even in hard economic times, the investment in reliability is vital.

The depressing economic climate does not, at the moment, appear to be hitting Nexus Telecom too directly. "Despite the downturn, we had a very good year last year," comments Christoph Kupper, Executive Vice President of Marketing at Nexus Telecom. "And so far, this year, I don't see any real change in operator behaviour. There may be some investment problems while the banks remain hesitant about extending credit, but on the whole, telecom is one of the solid businesses, with a good customer base, and revenues that are holding up well."

The biggest challenge for Nexus Telecom is not so much the economy, but more one of perception and expectation, with some operators questioning the value and cost of the OSS tools - which, relative to the total cost of the network, has increased over the years. In the past few years the price of network infrastructure has come down by a huge amount, while network capacity has risen. But while the topological architecture of the network is simplifying matters - everything running over big IP pipes - the network's operating complexity is vastly increasing. So the operator sees the capital cost of the network being massively reduced, but that reduction isn't being mirrored by similarly falling costs in the support systems. Indeed, because of the increased complexity, the costs of the support systems are going up.

Complexity is not, of course, always a comfortable environment to operate in.  Kupper sees some of the culture clash that arises whenever telecom meets IT, affecting the ways in which the operators are tackling these new complexities.

"In my experience, most telecom operators come from the telco side of the road, with a telecom heritage of everything being very detailed and specified, with very clear procedures and every aspect well defined," he says.

"Now they're entering an IP world where the approach is a bit looser, with more of a ‘lets give it a try' attitude, which is, of course, an absolute horror to most telcos."

Indeed, there may well be a danger that network technology is becoming so complex that it is now getting ahead of some CTOs and telecom engineers.

"There can be something of a ‘fear factor' for the engineers, if ever they have an issue with the network," Kupper says.  "And there are plenty of issues, given that these new switching devices can be configured in so many ways that even experienced engineers have trouble doing it right.

"Once the technical officers become fully aware of these issues, the attraction of a system such as ours, which gives them better visibility - especially independent visibility across the different network domains - is enormous.

"It only takes one moment in a CTO's life when he loses control of the network, to make our sale to him very much easier."

The sales message, however, depends on the recognition that increased complexity in the network requires more not less monitoring, and that tools which may be seen as desirable but not absolutely essential (after all, the really important thing is to get the actual network out there - and quickly) are, in fact, vital to business success. Not always an easy message to get across to those whose background in engineering means they do not always think in terms of business risk.

Kupper recognises that the message is not as well established as it might be. "We're not there yet," he says.  "We still need to teach and preach quite a lot, especially because the attraction of the ‘more for less' promise of the new technology elements hides the fact that operational expenditure on the management of a network with vastly increased traffic and complexity, is likely to rise."

The easiest sales are to those technical officers who have a vision, and who are looking for the tools to fulfil it.  "They want to have control of their networks," says Kupper. "They want to see their capacity, be able to localise it, and see who's affected."

And once Nexus Telecom's systems are actually installed, he stresses, no one ever questions their necessity. 

"The asset and value of these systems is hard to prove - you can't just put it on the table. It's a more complicated qualitative argument that speaks to abstract concepts of Y resulting from the possible failure of X, but with no exact mathematical way to calculate what benefits your derive from specific OSS investment."

So the tougher sales are to those who don't grasp these concepts, or who remain convinced that any network failure is the responsibility of the network vendors, who must therefore provide the remedy - without taking into account how long that might take, the subsequent impact on client satisfaction and, ultimately, business success.

These concepts, of course, are relevant to the full range of suppliers, from wireline and cable operators to the new mobile kids on the block.  Indeed, Kupper stresses that with the advent of true mobile data broadband availability, following the change to IP, and the introduction of flat rates to allow users to make unlimited use of the technology, the cellular operator has positioned himself as a true contender against traditional wireline and cable operators.

Kupper notes: "For years in telecommunications, voice was the data bearer that did not need monitoring - if the call didn't work, the user would hang up and redial - a clearly visible activity in terms of signalling procedure analysis.

"But with mobile broadband data, the picture has changed completely.  It is the bearer that needs analysis, because only the bearer enables information to be gleaned on the services that the mobile broadband user is accessing.  The network surveillance tools, therefore, must not only analyse the signalling procedure but also, and most importantly, the data payload.  It is in the payload that we see if, for example, Internet browsing is used, which URL is accessed, which application is used, and so forth. And it is only the payload, for which the subscriber pays!"

He points out that as a consequence of the introduction of flat rates and the availability of 3G, data rates have exploded.

"It is now barely possible to economically monitor such networks by means of traditional surveillance tools.  A new approach is needed, and that approach is what we call ‘Intelligent Network Monitoring'. At Nexus Telecom we have been working on the Intelligent Network Monitoring concept for about two years now, and have included that functionality with every release we have shipped to customers over that period.  Any vendor's monitoring systems that do not include developments incorporating the concepts of mass data processing will soon drown in the data streams of  telecom data networks."

Basically, he explains, the monitoring agents on the network must have the ability to interpret the information obtained from scanning the network ‘on the fly'.  "The network surveillance tools need a staged intelligence in order to process the vast amount of data; from capturing to processing, forwarding and storing the data, the system must, for instance, be able to summarise, aggregate and discard data while keeping the essence of subscriber information and its KPI to hand - because, at the end of the day, only the subscriber experience best describes the network performance. And this is why Nexus Telecom surveillance systems provide the means always to drill down in real-time to subscriber information via the one indicator that everyone knows - the subscriber's cell phone number."
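
To make the idea of "staged intelligence" concrete, here is a minimal sketch - not Nexus Telecom code, and with record fields and KPI choices that are purely illustrative assumptions - of how raw captures might be aggregated into per-subscriber KPIs and then queried by the subscriber's phone number:

```python
# Conceptual sketch only: capture -> aggregate -> discard raw detail, keep per-subscriber KPIs.
# The record layout (msisdn, service, bytes, latency_ms, failed) is an assumption for illustration.
from collections import defaultdict

raw_captures = [
    ("41790000001", "web",   12000, 80,  False),
    ("41790000001", "video", 500000, 120, True),
    ("41790000002", "web",   3000,  60,  False),
]

def aggregate(records):
    """Stage 2: reduce raw captures to per-subscriber KPIs so the raw stream can be discarded."""
    kpis = defaultdict(lambda: {"sessions": 0, "bytes": 0, "failures": 0, "latency_sum": 0})
    for msisdn, service, nbytes, latency_ms, failed in records:
        k = kpis[msisdn]
        k["sessions"] += 1
        k["bytes"] += nbytes
        k["failures"] += int(failed)
        k["latency_sum"] += latency_ms
    return kpis

def drill_down(kpis, msisdn):
    """Real-time lookup by the one key everyone knows: the subscriber's phone number."""
    k = kpis.get(msisdn)
    if not k:
        return None
    return {
        "sessions": k["sessions"],
        "avg_latency_ms": k["latency_sum"] / k["sessions"],
        "failure_rate": k["failures"] / k["sessions"],
    }

kpis = aggregate(raw_captures)
print(drill_down(kpis, "41790000001"))
```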

All this monitoring and surveillance obviously plays a vital role in providing visibility into complicated, multi-faceted next generation systems behaviour, facilitating fast mitigation of current and potential network and service problems to ensure a continuous and flawless end-customer experience.  But it also supplies a wealth of information that enables operators to better develop and tailor their systems to meet their customers' needs.  In other words, a tremendously powerful marketing tool.

"Certainly,' Kupper confirms, "the systems have two broad elements - one of identifying problems and healing them, and the other a more statistical, pro-active evaluation element.  Today, if you want to invest in such a system, you need both sides.  You need the operations team to make the network as efficient as possible, and you also need marketing - the service guys who can offer innovative services based on all the information that can be amassed using such tools."

Kupper points out that drawing in other departments and disciplines may, in fact, be essential in amassing sufficient budget to cover the system. The old days when the operations manager could simply say ‘I need this type of tool - give it to me' are long gone, and anyway their budgets, these days, are nothing like big enough to cover such systems. Equally, however, the needs of many different disciplines and departments for the kind of information Nexus Telecom systems can provide are increasing as the highly competitive marketplace makes responding to customer requirements and preferences absolutely vital. Thus the systems can prove to be of enormous value to the billing guys, the revenue assurance and fraud operations, not to mention the service development teams. "Once the system is in place," he notes, "you have information on every single subscriber regarding exactly which devices and services he most uses, and therefore his current, and likely future, preferences. And all this information is real-time."

Despite the apparent complexity of the sales message, Nexus Telecom is in buoyant mood, with good penetration in South East Asia and the Middle East, as well as Europe.  These markets vary considerably in terms of maturity of course, and Kupper points out that OSS penetration is very much a lifecycle issue.  "When the market is very new, you just push out the lines," he comments.  "As long as the growth is there - say the subscriber growth rate is bigger than ten per cent a year - you're probably not too concerned about the quality of service or of the customer experience. 

"The investment in monitoring only really registers when there are at least three networks in a country and the focus is on retaining customers - because the cost of gaining new customers is so much higher than that of hanging on to the existing ones.

"Monitoring systems enable you to re-act quickly to problems.  And that's not just about ensuring against the revenue you might lose, but also the reputation you'll lose.  And today, that's an absolutely critical factor."

The future of OSS is, of course, intrinsically linked to the future of the telcos themselves.  Kupper notes that the discussion - which has been ongoing for some years now - around whether telcos will become mere dumb pipe providers, or will arm themselves against a variety of other players with content and tailored packages, has yet to be resolved.  In the meantime, however, he is confident that Nexus Telecom is going in the right direction.

"I believe our strategy is right.  We currently have one of the best concepts of how to capture traffic and deal with broadband data.

"The challenge over the next couple of years will be the ability to deal with all the payload traffic that mobile subscribers generate.  We need to be able to provide the statistics that show which applications, services and devices subscribers are using, and where development will most benefit the customer - and, of course, ultimately the operator."

Lynd Morley is editor of European Communications

As users become increasingly intolerant of poor network quality, Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks, tells Priscilla Awde that, in order to meet the huge demand for speed and efficiency, the whole industry is heading in the same direction - creating an all IP Ethernet core using MPLS to prioritise packets regardless of content

Speed, capacity, bandwidth, multimedia applications and reliable any time, anywhere availability from any device - tall orders all, but these are the major issues facing every operator whether fixed or mobile. Meeting these needs is imperative given the global telecoms environment in which providing consistently high quality service levels to all subscribers is a competitive differentiator. There is added pressure to create innovative multimedia services and deliver them to the right people, at the right time, to the right device but to do so efficiently and cost effectively.

Operators are moving into a world in which they must differentiate themselves by the speed and quality of their reactions to rapid and global changes. Networks must become faster, cheaper to run and more efficient, to serve customers increasingly intolerant of poor quality or delays. It is a world in which demand for fixed and mobile bandwidth hungry IPTV, VoD and multimedia data services is growing at exponential rates leaving operators staring at a real capacity crunch.

To help operators transform their entire networks and react faster to demand for capacity and greater flexibility, Ericsson has created a Full Service Broadband initiative which marries its considerable mobile capabilities with similar expertise in fixed broadband technologies. With the launch of its Carrier Ethernet portfolio, Ericsson is leveraging the strength of the Redback acquisition to develop packet backbone network solutions that deliver converged applications using standards based IP MPLS (Multi Protocol Label Switching), and Carrier Ethernet technologies.

Committed to creating a single end-to-end solution from network to consumer, Ericsson bought Redback Networks in 2007, thereby establishing the foundation of Ericsson IP technology but most importantly acquiring its own router and IP platform on which to build up its next generation converged solution.

In the early days of broadband deployment, subscriber information and support was centralised, the amount of bandwidth used by any individual was very low and most were happy with best effort delivery. All that changed with growth in bandwidth hungry data and video applications, internet browsing and consumer demand for multimedia access from any device. The emphasis is now on providing better service to customers and faster, more reliable, more efficient delivery. For better control, bandwidth and subscriber management plus content are moving closer to customers at the network edge.

However, capacity demand is such that legacy systems are pushed to the limit both in handling current applications, let alone future services, and guaranteeing quality of service. Existing legacy systems are inefficient, expensive to run and maintain compared to the next generation technologies that transmit all traffic over one intelligent IP network. Neither do they support the business agility or subscriber management systems that allow operators to react fast to changing markets and user expectations.

Despite tight budgets, operators must invest to deliver and ultimately to save on opex. They must reduce networking costs and simplify existing architectures and operations to make adding capacity where it is needed faster and more cost effective.

The questions are: which are the best technologies, architectures and platforms and, given the current economic climate, how can service providers transform their operations cost effectively? The answers lie in creating a single, end-to-end intelligent IP network capable of efficiently delivering all traffic regardless of content and access devices. In the new IP world, distinctions between fixed and mobile networks, voice, video and data traffic and applications are collapsing. Infonetics estimates the market for consolidating fixed and mobile networks will be worth over $14 billion by 2011 and Ericsson, with Redback's expertise, is uniquely positioned to exploit this market opportunity.

Most operators are currently transforming their operations and, as part of the solution, are considering standards based Carrier Ethernet as the broadband agnostic technology platform. Ethernet has expanded beyond early deployments in enterprise and Metro networks: Carrier Ethernet allows operators to guarantee end-to-end service quality across their entire network infrastructure, enforce service level agreements, manage traffic flows and, importantly, scale networks.

With roots in the IT world where it was commonly deployed in LANs, Ethernet is fast becoming the de facto standard for transport in fixed and mobile telecoms networks. Optimised for core and access networks, Carrier Ethernet supports very high speeds and is a considerably more cost effective method of connecting nodes than leased lines. Carrier Ethernet has reached the point of maturity where operators can quickly scale networks to demand; manage traffic and subscribers and enforce quality of service and reliability.

"For the first time in the telecoms sector we now have a single unifying technology, in the form of IP, capable of transmitting all content to any device over any network," explains Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks, an Ericsson company. "The whole industry is heading in the same direction: creating an all IP Ethernet core using MPLS to prioritise packets regardless of content.

"In the future, all operators will want to migrate their customers to fixed/mobile convergent and full service broadband networks delivering any service to any device anytime, but there are a number of regulatory and standards issues which must be resolved. Although standards are coming together, there are still slightly different interpretations of what constitutes carrier Ethernet and discussions about specific details of how certain components will be implemented," explains Williams.

Despite debates about different deployment methods, Carrier Ethernet and MPLS-ready solutions are being integrated into current networks, and Redback has developed one future proof box capable of working with any existing platform.

An expert in creating distributed intelligence and subscriber management systems for fixed operators and now for mobile carriers, Redback offers solutions that are both backward and forward compatible and can support any existing platform, including ATM, Sonet, SDH or frame relay. Redback is applying its experience in broadband fixed architectures to solving the capacity, speed and delivery problems faced by mobile operators. As the amount of bandwidth per user rises, the management of mobile subscribers and data is being distributed in similar ways to what happened in the fixed sector.

Redback has developed SmartEdge routers and solutions to address packet core problems and operators' needs to deliver more bandwidth reliably. SmartEdge routers deliver data, voice or video traffic to any connected device via a single box connected to either fixed or mobile networks. Redback's solutions are designed to give operators a gradual migration path to a single converged network which is more efficient and cost effective to manage and run.

In SmartEdge networks with built-in distributed intelligence and subscriber management functionality, operators can deliver the particular quality of service, speed, bandwidth and applications appropriate to individual subscribers.

Working under the Ericsson umbrella and with access to considerable R&D budgets, Redback is expanding beyond multiservice edge equipment into creating metroE solutions, mobile backhaul and packet LAN applications. Its new SM 480 Metro Service Transport is a carrier class platform which can be deployed in fixed and mobile backhaul and transport networks; Metro Ethernet infrastructure and to aggregate access traffic. Supporting fixed/mobile convergence, the SM 480 is a cost effective means of replacing legacy transport networks and migrating to IP MPLS Carrier Ethernet platforms. The system can be used to build packet based metro and access aggregation networks using any combination of IP, Ethernet or MPLS technologies.

Needing to design and deliver innovative converged applications quickly to stay competitive, operators must build next generation networks. Despite the pressures on the bottom line, most operators see the long-term economic advantages of building a single network architecture. Moving to IP MPLS packet based transmission and carrier Ethernet creates a content and device agnostic platform over which traffic is delivered faster and over a future proof network. Operators realise the cost and efficiency benefits of running one network in which distinctions between fixed and mobile applications are eliminated.

Although true convergence of networks, applications and devices may be a few years away, service providers are deploying the necessary equipment and technologies. IP MPLS and carrier Ethernet support both operators' needs for speed, flexibility and agility and end user demand for quality of service, reliability and anywhere, anytime, any device access.

"Ultimately however, there should be less focus on technology and more on giving service providers and their customers the flexibility to do what they want," believes Williams. "All operators are different but all need to protect their investments as they move forward and implement the new technologies, platforms and networks. Transformation is not only about technology but is all about insurance and investment protection for operators ensuring that solutions address current and future needs."

Priscilla Awde is a freelance communications journalist

As financial turmoil rampages across the world's markets, Professor Janusz Filipiak, founder and chief executive of OSS/BSS provider Comarch, tells George Malim that he sees great opportunity as carriers seek to streamline their operations and get to grips with new business models, services and the complex new telecoms value chain

Comarch, the Polish IT solutions provider has been developing OSS/BSS systems for telecoms since 1993 and now provides a portfolio of systems and managed services to incumbent, broadband, triple play operators as well as MVNOs/MVNEs and start-ups. With a turnover of €170m, more than 3,000 employees and a customer roster that includes T-Mobile Germany and Austria, Bouygues Telecom France, O2 Germany and Polkomtel and PTC in Poland, the company has enjoyed a 33 per cent increase in turnover during the last five years. As the general economic crisis deepens, founder and chief executive, Professor Janusz Filipiak, thinks vendors will have to chase harder and act more cleverly to win deployments.
"Now all companies have to be mean and lean in the recession" he says. "We are very cost minded and every bit that is not needed is removed. You can't come to carriers with a higher price than your competitors. IT engineers are now a global resource and want the same payment in China or the UK, for example, so we are in the same position as all vendors. We can't compete on price so we can only be more intelligent and more effective than others. In spite of the recession we must now continue to invest in developing new products."

Current financial market woes aside, Comarch is heavily focused on the mobile market and recognises the challenges faced by operators. "In today's world of telecommunications, mobile operators are faced with the challenges resulting from market saturation in the majority of countries," adds Filipiak. "Innovative product offerings and enhanced service levels are indispensable in order to gain new customers and prevent churn. Operators are searching for the Holy Grail of telco that will prevent ARPU from decreasing. As voice is still the ‘killer application', we see data and value added services as a fast growing market. Other trends are still ahead of us such as seeing strong market competition from global corporate customers seeking the best deals from global mobile groups."

Filipiak also sees great potential in currently non-mobile operators. "Keeping in mind that everything eventually goes mobile, we haven't forgotten the great potential of fixed broadband operators, cable TV providers and triple and quad play operators" he says. "We target different segments of the market while not focusing exclusively on a single one."

Pre-paid billing has been one of the major functions carriers have sought during the life of Comarch but, as bundled and flat-rate packages become more popular, Filipiak sees its emphasis waning. "Today there are not too many content services available but they will come," he says. "Video streaming will put new requirements on bandwidth and devices. It will be very resource consuming and will be charged via pay-per-use. The experience won't be very different to paying for bandwidth or connection time with voice. Pre-paid is a method of payment which is still the most popular for the youngest segments of users, but pre-paid is becoming less related to cheap prices - because those are achievable in post-paid models as well - than to a philosophy of ‘no contract, no obligation'."

Flat-rate offers will be harder to make business sense of. "Flat rate is only viable in a world with unlimited capacity" adds Filipiak. "Flat-rate packages make a difference in the final price of services but the introduction of real flat-rate, where everything is included and mobile access is a commodity like internet or electricity or gas, will lead to a weakening of pre-paid which will favour post-paid."

Filipiak sees the market moving in this direction. "We can see that many players are moving towards a mix of post-paid with a significant amount of free minutes, SMS and MMS in a bundle," he adds. "This offer is really close to an actual flat rate and assures stable revenue for providers as well as strong customer loyalty and a resulting decrease in churn. My mantra in telecoms is that customers now expect everything to be easy."

The emergence of mobile content and the move to data services put obvious pressure on carriers' systems and the telecoms revenue chain has become much more complex. Comarch has long been prepared for this shift, as Filipiak attests: "The revenue chain is more complex and an operator is now not the only one that provides the services delivered. Service ‘sponsoring', third-party service providers, resellers and service dealers introduce the need for multi-party billing and put more pressure on monitoring quality of offerings," he says. "Our solutions also address and deal with the complexity that content and data services bring in wholesale, next generation TV, content distribution, service creation and control. We address these needs through our InterPartner Billing solution. On the OSS side, we provide service level management and service level inventory, our flagship OSS products, which enable service modelling of resources and services provided by different parties along with pro-active quality monitoring and management."
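
To illustrate what multi-party billing means in practice, here is a toy sketch of splitting one retail charge across the parties in the delivery chain. The parties and percentages are invented for illustration; this does not describe the logic of Comarch's InterPartner Billing product.

```python
# Toy illustration of multi-party billing: one retail charge split across the value chain.
RETAIL_PRICE = 4.00  # e.g. one content purchase, in EUR

REVENUE_SHARES = {
    "content_provider": 0.50,
    "reseller": 0.10,
    "operator": 0.40,   # shares must sum to 1.0
}

def settle(price, shares):
    """Return the amount owed to each party for a single charged event."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 100%"
    return {party: round(price * share, 2) for party, share in shares.items()}

print(settle(RETAIL_PRICE, REVENUE_SHARES))
# {'content_provider': 2.0, 'reseller': 0.4, 'operator': 1.6}
```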

Comarch has grown from its eastern European roots and now has operations in 30 countries and addresses operators of all sizes and types, as Filipiak explains: "The Comarch brand is recognised in the telecoms world," he says. "We've been in the industry for 15 years and time is now working for us. Our biggest customers for specialised OSS solutions are Tier 1 operators. Large operators with 10 million subscribers are customers for our InterPartner Billing and, when it comes to independent operators, we have about 30 per cent of the local market as clients for integrated BSS and OSS/BSS solutions. We also target the largest CATV and broadband operators offering convergent services. Our strategy also addresses global players where we can offer the best value, give good prices, still be flexible and deliver enterprise level services."

In spite of the general economic downturn, Filipiak still sees great opportunities emerging. One area is that of next generation mobile networks and self-optimising networks. "Such concepts will invite carriers to look for solutions outside the long established segments of OSS, such as inventory management, configuration management and network planning," he says. "It will not be sufficient to cover one area in the future; instead co-operation of the planning and operations areas will be needed - and that is where we see an opportunity for us. In addition, carriers are now more oriented towards a loose coupling of functional modules and standard interfaces that make it easier for smaller players, like us."

New means of delivering solutions are also critical. "With our future proof architecture of solutions, we can address modern modularity concepts and tendencies that now exist in the market," adds Filipiak. "The openness and standard interfaces in high level OSS products is the key and customers can choose the best modules for their operations. This provides a possibility to reduce opex by utilising new business models for our customers, network virtualisation, distribution and outsourcing of operations and hosting solutions."

Regardless of the current economic gloom, Filipiak believes a new investment wave must come to the telecoms market. "Investment must happen because there will be greater demand" he says. "Physical travel will be a high cost so there will be more load on existing networks."

Carriers face massive challenges in spite of the increased demand for their network capacity and services. "In the international mobile groups, unification and co-operation issues are still of key importance in order to gain competitive advantage on the global market" he says. "Outsourcing of operations has become very popular but unsurprisingly it has turned out not to be a remedy for everything. Carriers still need to adapt their business processes and way of thinking to this new model. On the other hand, the need to reduce capex is forcing carriers to introduce scenarios of sharing physical resources, such as radio masts."

Filipiak also identifies churn prevention, automatic client profiling and concentrated web-based marketing campaigns as additional issues carriers will need to address.

Winning business from the large carrier groups against this backdrop is, without doubt, difficult.

"International groups are certainly challenging customers" admits Filipiak. "National companies differ in software environments, processes and levels of maturity as well as corporate and national culture. They therefore require a flexible approach in implementation strategy and software functionality and look for a common architecture for their network as well as their IT systems. Such carriers pay a lot of attention to building up corporate standards at the services level and business process levels in order to achieve a common view."

Good products, knowledge and proven experience are the ways to win this type of business. "No PowerPoint slide solutions can be sold anymore," adds Filipiak. "It takes a lot of time and money but these are the only ways to win contracts with groups."

However, winning such business is never achieved on a static battlefield. Carrier consolidation continues and that can be both a threat and an opportunity for solutions vendors. "On one hand, it is difficult because some groups will enforce product choices at the global level, and it may be more difficult for Comarch to gain a global recommendation in a large group since we have to fight for our portion of the market with much stronger players," says Filipiak. "On the other hand, consolidation forces carriers to unify their OSS/BSS landscapes and this is a good opportunity to change long-established solutions for something new and fresh. Heterogeneous environments of global groups with plenty of flavours in different countries require a great level of flexibility that Comarch can provide. We already have positive experience with such projects, for example our experience with T-Mobile, which enables us to be optimistic for the future."

"Ultimately, we must live with the situation" adds Filipiak. "We're a service company and it's not our job to comment or expect specific customers to behave in any particular way. The level of consolidation is already very high so we may not see much more, in any case."

It's not only carrier consolidation that presents challenges to vendors, though. Carriers are at different stages of their business and that places a development burden on all vendors as they seek to develop systems applicable to individual carrier needs.

"Comarch builds its solutions for different segments of the telco market," says Filipiak. "We offer both pre-integrated solutions for small business, such as an integrated BSS/OSS solutions for an MVNO, and complex solutions tailored specifically for the needs of large players. We have frameworks and modules of software but we've never sold it without adaptation. In the end, it is always a construction job. You have modules but ultimately you must put them together in different ways."

Comarch has grown organically since its inception in 1993 and has shunned much of the mergers and acquisition activity that has occurred in the OSS/BSS sector in recent years. "Our product portfolio follows unified design principles and is not the result of an acquisition of missing parts," explains Filipiak. "This gives us the possibility to offer seamlessly integrated solutions and products that complement each other while at the same time not being redundant in functionality."

Inevitably, for all rules there are exceptions, and Comarch has recently announced an agreement to acquire 50.15 per cent of Frankfurt-listed company SoftM Software und Beratung AG in a transaction that could exceed €22m. The German software producer and systems integrator employs 420 personnel and supplies more than 4,000 customers.

Filipiak is open to further moves although they will be well considered. "Acquisition, yes but only in a way that we can handle along with continued organic growth. There will be no miracle from us, just steady organic growth."

Filipiak also rejects any notion of selling the company. "The company isn't for sale. My family has a controlling stake and I'm not going to sell now. The company's value is increasing and the scope of the business grows every day."

George Malim is a freelance communications journalist

ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde

Mobile operators everywhere are facing something of a conundrum which goes like this: in saturated markets they must increase revenues from high margin data services but these are typically bandwidth hungry applications resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings at work and at home. As people exploit the benefits of the big new touch screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.

However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks, which poses big questions for operators: how can they get broadband 3G networks inside to provide reliable, cost effective in-building coverage? How can they do it fast, without significant and expensive investment in macro networks, and give customers access to the applications they want at prices they are willing to pay?

Fortunately ip.access has the answers since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.

Picocells and femtocells plug directly into existing fixed broadband networks be they DSL, cable, satellite or even WiMax. Acting as mini-base stations, both can be quickly installed anywhere in buildings or outside to bring networks to where the demand is.

These plug and play units have advantages for everyone. For users, both professionals and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high margin value added services.

For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes, ships and other remote locations where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users and companies can dot them around the organisation to provide connections where needed.

Solving the problem for residential users, the Oyster3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, family members can get seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells will be up and running by spring 2009.

"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."
The difference between the two is one of size, power, capacity, functionality, price and target audience. However both allow operators to add capacity cost effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated and people previously unable to get mobile signals can be connected to high-speed networks.

"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.

"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."

Plugging mini-base stations into the existing broadband connections takes indoor data sessions off the macro network so raising the number of users each site can support and increasing its capacity beyond the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidized femtocells to 20% of its subscribers' homes. Similarly, research firm Analysys-Mason calculates the annual cost saving per customer for a large operator deploying 3G femtocells is between $6 - $12.

Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case even if they are fully subsidised. Even ignoring the cost savings, it takes only a €11 per month increase in ARPU, spread over one household, to cover the cost of fully subsidising a femtocell.
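
As a rough sanity check on those figures, the back-of-envelope calculation below uses the numbers quoted in the article (the €500 million saving, 10 million subscribers, 20 per cent take-up and the €11 per month uplift); the per-unit subsidy cost is a hypothetical assumption, since the article does not state it.

```python
# Back-of-envelope check of the figures quoted above; subsidy cost per unit is an assumption.
subscribers = 10_000_000
take_up = 0.20
claimed_saving_eur = 500_000_000                   # over four years (ip.access figure)

femto_homes = int(subscribers * take_up)           # 2,000,000 subsidised households
saving_per_home = claimed_saving_eur / femto_homes / 4        # ~62.50 EUR per home per year
saving_per_subscriber = claimed_saving_eur / subscribers / 4  # ~12.50 EUR per subscriber per year,
                                                              # same order as Analysys-Mason's $6-$12

assumed_subsidy_eur = 130                          # hypothetical fully subsidised unit cost
months_to_recover = assumed_subsidy_eur / 11       # ~11.8 months at 11 EUR/month extra ARPU

print(saving_per_home, saving_per_subscriber, round(months_to_recover, 1))
```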

Operators are seeing an explosion in mobile data usage (in the UK, 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.

According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011 and IDC expects consumer spend on femtocell enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage is at home and the ‘digital generation' either does, or wants to watch mobile television at home.

As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."

Operators are urgently addressing the capacity problems such increases in data usage imply. Some are capping monthly unlimited data plans while others encourage content developers to limit application bandwidth. Neither approach is likely to be popular with users, and both may increase churn - which only strengthens the case for deploying picocells and 3G femtocells.

While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise and operators can introduce new, high margin services to the handsets people already have.

They can encourage both usage and fixed mobile substitution by offering FemtoZone services which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be automatically sent to absent parents to notify them children are back; podcasts, videos or images can be loaded to phones or targeted advertising sent to interested users.
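
A minimal sketch of how such a presence-triggered "FemtoZone" service might look in code follows; every identifier here is hypothetical and purely illustrative of the idea that actions fire when a known handset attaches to the home femtocell, and none of it comes from ip.access or any standard API.

```python
# Purely illustrative: hypothetical handlers for a FemtoZone-style trigger.

def send_sms(number, text):
    print(f"SMS to {number}: {text}")          # stand-in for an operator SMS gateway

def push_content(handset, item):
    print(f"Pushing {item} to {handset}")      # stand-in for a content download

# Actions configured per handset, fired when it registers on the home femtocell
HOME_ACTIONS = {
    "handset-child": [("sms", "+44700000001", "Your child is home"),
                      ("push", None, "daily_podcast")],
}

def on_femtocell_attach(handset_id):
    """Assumed callback invoked when a phone comes within range of the femtocell."""
    for kind, target, payload in HOME_ACTIONS.get(handset_id, []):
        if kind == "sms":
            send_sms(target, payload)
        elif kind == "push":
            push_content(handset_id, payload)

on_femtocell_attach("handset-child")
```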

"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.

"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."

The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.

There are plans to include the new Iuh standard in release 8 of the 3GPP standard due out in December. It will replace the numerous different ways in which femtocells currently connect to networks and proprietary systems, and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors and so can choose customer premises equipment (CPE) separately from the gateway.

Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.

Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems. 

Over the five years they have been used in enterprises, the advantages of the nanoGSM are well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. With an indoor range up to 200 metres, traffic is backhauled through existing IP networks and it supports fast data rates over GPRS and EDGE to devices such as Blackberries. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.

Like its bigger cousin but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the 3G femtocell ease of use, price and installation challenges necessary to meet consumer needs, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells to both large enterprises and to SMEs through their existing channels.

"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."
It's a truism that everything is going mobile and operators are already feeling the capacity pinch. Pico and femtocells give them a cost effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.

Today's smart phones are as powerful and feature rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.

Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles; share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."

There's a stark dynamic at work in the telecoms Operations Support Systems (OSS) market. Until recently networks were expensive while the price tags for the OSS systems used to assure the services running across them were, by comparison, puny. Today that's all changed - not because OSS systems have become significantly more costly, but because network components are a fraction of the capital cost they were 15 years ago. The result is an apparent cost disparity that may be causing some operators to swallow hard and think about putting off their OSS investments, Thomas Sutter, CEO of Nexus Telecom, tells Ian Scales. That would be a huge mistake, he says, because next generation networks actually need more OSS handholding than their predecessors, not less

Naturally, Thomas has an interest. Nexus Telecom specializes in data collection, passive monitoring and network and service investigation systems and, while Nexus Telecom's own sales are still on a healthy upswing (the company is growing in double figures), he's growing increasingly alarmed at some of the questions and observations he's hearing back from the market.

"There is a whole raft of issues that need exploring around the introduction of IP and what that can and can't do," he says. "And we need to understand those issues in the light of the fundamental dynamics of computer technology. I think what's happening in our little area of OSS is the same as what tends to happen right across the high technology field. As the underlying hardware becomes ten times more powerful and ten times as cheap, it changes the points of difference and value within competing product sets."

If you go back and look at the PC market, says Thomas, as you got more powerful hardware, the computers became cheaper but more standard and the real value and product differentiation was, and still is, to be found in the software.

"And if you look at the way the PC system itself has changed, you see that when microcomputers were still fairly primitive in the early 1980s all the processor power and memory tended to be dedicated to the actual application task - you know, adding up figures in a spreadsheet, or shuffling words about in a word processor. But as PC power grew, the excess processing cycles were put to work at the real system bottleneck: the user interface. Today my instincts tell me that 90 per cent of the PC's energy is spent on generating the graphical user interface. Well I think it's very similar in our field. In other words, the network infrastructure has become hugely more efficient and cost effective and that's enabled the industry to concentrate on the software. And the industry's equivalent of the user interface, from the telco point of view at least, is arguably the OSS.

"You could even argue that the relative rise in the cost of OSS is a sign that the telecoms market as a whole is maturing."

That makes sense, but if that's the case what are these other issues that make the transformation to IP and commodity network hardware so problematical from an OSS point of view?

"There's a big problem over perceptions and expectations. As the networks transform and we go to 'everything over IP', the scene starts to look different and people start to doubt whether the current or old concepts of service assurance are still valid. "So for example, people come to our booth and ask, 'Do you think passive probe monitoring is still needed?  Or even, is it still feasible?  Can it still do the job?' After all, as the number of interfaces decrease in this large but simplified network, if you plug into an interface you're not going to detect immediately any direct relationships between different network elements doing a telecom job like before, all you'll see is a huge IP pipe with one stream of IP packets including traffic from many different network elements and what good is that? "And following on from that perception, many customers hope that the new, big bandwidth networks are somehow self-healing and that they are in less danger of getting into trouble. Well they aren't.  If anything, while the topological architecture of the network is simplifying things (big IP pipes with everything running over them), the network's operating complexity is actually increasing." As Thomas explains, whenever a new technology comes along it seems in its initial phases to have solved all the problems associated with the last, but it's also inevitably created new inefficiencies. "If you take the concept of using IP as a transport layer for everything, then the single network element of the equation does have the effect of making the network simpler and more converged and cost effective. But the by-product of that is that the network elements tend to be highly specialized engines for passing through the data  - no single network element has to care about the network-wide service." So instead of a top-down, authoritarian hierarchy that controls network functions, you effectively end up with 'networking by committee'. And as anyone who has served on a committee knows, there is always a huge, time-consuming flow of information between committee members before anything gets decided.  So a 'flat' IP communications network requires an avalanche of communications in the form of signaling messages if all the distributed functions are to co-ordinate their activities. But does that really make a huge difference; just how much extra complexity is there? "Let's take LTE [Long Term Evolution], the next generation of wireless technology after 3G. On the surface it naturally looks simpler because everything goes over IP. But guess what? When you look under the bonnet at the signaling it's actually much more complicated for the voice application than anything we've had before. "We thought it had reached a remarkable level of complexity when GSM was introduced. Back then, to establish a call we needed about 11 or 12 standard signaling messages, which we thought was scary. Then, when we went into GPRS, the number of messages required to set up a session was close to 50.  When we went to 3G the number of messages for a handover increased to around 100 to set up a standard call. Now we run 3GPP Release 4 networks (over IP) where in certain cases you need several hundred signaling messages (standard circuit switching signaling protocol) to perform handovers or other functions; and these messages are flowing between many different logical network element types or different logical network functions. 
"So yes of course, when you plug in with passive monitoring you're probably looking at a single IP flow and it all looks very simple, but when you drill down and look at the actual signaling and try to work out who is talking to who, it becomes a nightmare. Maybe you want to try to draw a picture to show all this with arrows - well, it's going to be a very complex picture with hundreds of signaling messages flying about for every call established. "And if you think that sort of complexity isn't going to give you problems:  one of my customers - before he had one of our solutions I hasten to add - took  three weeks using a protocol analyzer to compile a flow chart of signaling events across his network. You simply can't operate like that - literally. And by the way, keep in mind that even after GSM networks became very mature, all the major operators went into SS7 passive monitoring to finally get the last 20 per cent of network optimization and health keeping done. So if this was needed in the very mature environment of GSM, what is the driver of doubting it for less mature but far more complex new technologies? ''
Underpinning a lot of the questions about OSS from operators is the cost disparity between the OSS and the network it serves, says Thomas. "Today our customers are buying new packet-switched network infrastructure, and to build a big network you're probably talking about 10 to 20 million dollars. Ten or 15 years ago they were talking about 300 to 400 million, so in ten years the price of network infrastructure has come down by a huge amount while network capacity has actually risen. That's an extraordinary change.
"But here's the big problem from our point of view.  Ten years ago when you spent $200 million on the network you might spend $3 million on passive probe monitoring.  Today it's $10 million on the network and $3 million on the passive probing solution. Today, also, the IP networks are being introduced into a hybrid, multiple technology network environment so during this transition the service assurance solution is getting even more complex. "So our customers are saying, ‘Hey!  Today we have to pay a third of the entire network budget on service assurance and the management is asking me, 'What the hell's going on?' How can it be that just to get some quality I need to invest a third of the money into service assurance?' "You can see why those sorts of conversations are at the root of all the doubts about whether they'll now need the OSS - they're asking: 'why isn't there a magic vendor who can deliver me a self-healing network so that I don't have to spend all this money?" Competitive pressures don't help either. "Today, time-to-market must be fast and done at low cost," says Thomas, "so if I'm a shareholder in a network equipment manufacturing company and they have the technology to do the job of delivering a communication service from one end to the other, I want them to go out to the market.  I don't want them to say, 'OK, we now have the basic functionality but please don't make us go to the market, first can we build self-healing capabilities, or built-in service assurance functionality or built-in end-to-end service monitoring systems - then go to the market?'  This won't happen." The great thing about the 'simple' IP network was the way it has commoditized the underlying hardware costs, says Thomas. "As I've illustrated, the 'cost' of this simplicity is that the complexity has been moved on rather than eliminated - it now resides in the signaling chatter generated by the ad hoc 'committees' of elements formed to run the flat, non-hierarchical IP network. "From the network operator's point of view there's an expectation problem: the capital cost of the network itself is being vastly reduced, but that reduction isn't being mirrored by similar cost reductions in the support systems.  If anything, because of the increased complexity the costs of the support systems are going up. "And it's always been difficult to sell service assurance because it's not strictly quantitative. The guy investing in the network elements has an easy job getting the money - he tells the board if there's no network element there's no calls and there's no money. But with service assurance much more complicated qualitative arguments must be deployed. You've got to say, 'If we don't do this, the probability is that 'x' number of customers may be lost. And there is still no exact mathematical way to calculate what benefits you derive from a lot of OSS investment."
The problem, says Thomas, is as it has always been: building the cloud of network elements - the raw capability, if you like - is always the priority, and ensuring there's a way of fixing the network when something goes wrong is always secondary.

"When you buy, you buy on functionality. And to be fair, it's the same with us when we're developing our own products. We ask ourselves, what should we build first? Should we build new functionality for our product, or should we concentrate on availability, stability, and ease of installation and configuration? If I do too much of the second, I'll have fewer features to sell and I'll lose the competitive battle.

"The OSS guy within the operator's organization knows that there's still a big requirement for investment, but for the people in the layer above it's very difficult to decide - especially when they've been sold the dream of the less complex architecture. It's understandable that they ask: 'Why does it need all this investment in service assurance systems when it was supposed to be a complexity-buster?'"

So on each new iteration of technology, even though they've been here before, service providers have a glimmer of hope that 'this time' the technology will look after itself. We need to look back at our history within telecoms and take on board what actually happens.

    
