Geoff Unwin, the new chairman of network software provider OpenCloud and former CEO of Cap Gemini Ernst & Young, shares his thoughts on the state of the telecoms sector.
Accanto Systems has come a long way from its roots as a provider of protocol analysers. Its focus is now on helping service providers to overcome the challenges of offering mobile data and VoIP services while also managing legacy technologies
While IP appears to have simplified telecoms, Christoph Kupper, Executive Vice President of Marketing at Nexus Telecom, tells Lynd Morley that the added complexity of monitoring the network - due largely to exploding data rates - has led to a new concept providing both improved performance and valuable marketing information
Nexus Telecom is, in many ways, the antithesis of the now predominant imperative in most industries - and certainly in the telecoms industry - which requires wholesale commoditisation of services; an almost exclusive focus on speed to market; and a fast response to instant gratification.
Where the ruling mantra is in danger of becoming "quantity not quality" in a headlong rush to ever greater profitability (or possibly, mere survival), Nexus Telecom calls something of a halt, focussing the spotlight on the vital importance of high quality, dependable service that not only ensures the business reputation of the provider, but also leads to happy - and therefore loyal - customers.
Based in Zurich, Nexus Telecom is a performance and service assurance specialist, providing data collection, passive monitoring and network service investigation systems. The company's philosophy centres around the recognition that the business consequences of any of the network's elements falling over are enormous - and only made worse if the problem takes time to identify and fix. Even in hard economic times, the investment in reliability is vital.
The depressing economic climate does not, at the moment, appear to be hitting Nexus Telecom too directly. "Despite the downturn, we had a very good year last year," comments Christoph Kupper, Executive Vice President of Marketing at Nexus Telecom. "And so far, this year, I don't see any real change in operator behaviour. There may be some investment problems while the banks remain hesitant about extending credit, but on the whole, telecom is one of the solid businesses, with a good customer base, and revenues that are holding up well."
The biggest challenge for Nexus Telecom is not so much the economy, but more one of perception and expectation, with some operators questioning the value and cost of the OSS tools - which, relative to the total cost of the network, has increased over the years. In the past few years the price of network infrastructure has come down by a huge amount, while network capacity has risen. But while the topological architecture of the network is simplifying matters - everything running over big IP pipes - the network's operating complexity is vastly increasing. So the operator sees the capital cost of the network being massively reduced, but that reduction isn't being mirrored by similarly falling costs in the support systems. Indeed, because of the increased complexity, the costs of the support systems are going up.
Complexity is not, of course, always a comfortable environment to operate in. Kupper sees some of the culture clash that arises whenever telecom meets IT, affecting the ways in which the operators are tackling these new complexities.
"In my experience, most telecom operators come from the telco side of the road, with a telecom heritage of everything being very detailed and specified, with very clear procedures and every aspect well defined," he says.
"Now they're entering an IP world where the approach is a bit looser, with more of a ‘let's give it a try' attitude, which is, of course, an absolute horror to most telcos."
Indeed, there may well be a danger that network technology is becoming so complex that it is now getting ahead of some CTOs and telecom engineers.
"There can be something of a ‘fear factor' for the engineers, if ever they have an issue with the network," Kupper says. "And there are plenty of issues, given that these new switching devices can be configured in so many ways that even experienced engineers have trouble doing it right.
"Once the technical officers become fully aware of these issues, the attraction of a system such as ours, which gives them better visibility - especially independent visibility across the different network domains - is enormous.
"It only takes one moment in a CTO's life when he loses control of the network, to make our sale to him very much easier."
The sales message, however, depends on the recognition that increased complexity in the network requires more not less monitoring, and that tools which may be seen as desirable but not absolutely essential (after all, the really important thing is to get the actual network out there - and quickly) are in fact, vital to business success. Not always an easy message to get across to those whose background in engineering means they do not always think in terms of business risk.
Kupper recognises that the message is not as well established as it might be. "We're not there yet," he says. "We still need to teach and preach quite a lot, especially because the attraction of the ‘more for less' promise of the new technology elements hides the fact that operational expenditure on the management of a network with vastly increased traffic and complexity, is likely to rise."
The easiest sales are to those technical officers who have a vision, and who are looking for the tools to fulfil it. "They want to have control of their networks," says Kupper. "They want to see their capacity, be able to localise it, and see who's affected."
And once Nexus Telecom's systems are actually installed, he stresses, no one ever questions their necessity.
"The asset and value of these systems is hard to prove - you can't just put it on the table. It's a more complicated qualitative argument that speaks to abstract concepts of Y resulting from the possible failure of X, but with no exact mathematical way to calculate what benefits you derive from specific OSS investment."
So the tougher sales are to the guys who don't grasp these concepts, or who remain convinced that any network failure is the responsibility of the network vendors who must therefore provide the remedy, without taking into account how long that might take, and the subsequent impact on client satisfaction, and therefore, ultimately business success.
These concepts, of course, are relevant to the full range of suppliers, from wireline and cable operators to the new mobile kids on the block. Indeed, Kupper stresses that with the advent of true mobile data broadband availability, following the change to IP, and the introduction of flat rates to allow users to make unlimited use of the technology, the cellular operator has positioned himself as a true contender against traditional wireline and cable operators.
Kupper notes: "For years in telecommunications, voice was the data bearer that did not need monitoring - if the call didn't work, the user would hang up and redial - a clearly visible activity in terms of signalling procedure analysis.
"But with mobile broadband data, the picture has changed completely. It is the bearer that needs analysis, because only the bearer enables information to be gleaned on the services that the mobile broadband user is accessing. The network surveillance tools, therefore, must not only analyse the signalling procedure but also, and most importantly, the data payload. It is in the payload that we see if, for example, Internet browsing is used, which URL is accessed, which application is used, and so forth. And it is only the payload for which the subscriber pays!"
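Kupper's point that the payload, not just the signalling, must be inspected can be illustrated with a toy classifier. The sketch below is hypothetical, not Nexus Telecom code: it pulls the accessed URL out of a captured plain-HTTP request payload, and treats anything it cannot decode as unclassifiable traffic.

```python
# Toy illustration of payload-level analysis: given the raw bytes of an
# HTTP request captured from the bearer, recover which URL the mobile
# broadband subscriber accessed. Hypothetical sketch, not Nexus Telecom code.

def classify_http_payload(payload: bytes) -> dict:
    """Extract method, host and path from a plain HTTP request payload."""
    try:
        head = payload.split(b"\r\n\r\n", 1)[0].decode("ascii")
    except UnicodeDecodeError:
        return {"service": "unknown"}           # encrypted or non-HTTP traffic
    lines = head.split("\r\n")
    method, path, _ = lines[0].split(" ", 2)    # e.g. "GET /news HTTP/1.1"
    headers = dict(line.split(": ", 1) for line in lines[1:] if ": " in line)
    return {
        "service": "web-browsing",
        "method": method,
        "url": f"http://{headers.get('Host', '?')}{path}",
    }

payload = b"GET /news HTTP/1.1\r\nHost: example.com\r\nUser-Agent: demo\r\n\r\n"
print(classify_http_payload(payload)["url"])    # http://example.com/news
```

A production probe would of course handle far more protocols and malformed traffic; the point here is only that the service and URL live in the payload, not in the signalling.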
He points out that as a consequence of the introduction of flat rates and the availability of 3G, data rates have exploded.
"It is now barely possible to economically monitor such networks by means of traditional surveillance tools. A new approach is needed, and that approach is what we call ‘Intelligent Network Monitoring'. At Nexus Telecom we have been working on the Intelligent Network Monitoring concept for about two years now, and have included that functionality with every release we have shipped to customers over that period. Any vendor's monitoring systems that do not include developments incorporating the concepts of mass data processing will soon drown in the data streams of telecom data networks."
Basically, he explains, the monitoring agents on the network must have the ability to interpret the information obtained from scanning the network ‘on the fly'. "The network surveillance tools need a staged intelligence in order to process the vast amount of data; from capturing to processing, forwarding and storing the data, the system must, for instance, be able to summarise, aggregate and discard data while keeping the essence of subscriber information and its KPI to hand - because, at the end of the day, only the subscriber experience best describes the network performance. And this is why Nexus Telecom surveillance systems provide the means always to drill down in real-time to subscriber information via the one indicator that everyone knows - the subscriber's cell phone number."
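The staged "capture, aggregate, keep the essence" approach Kupper describes can be sketched in a few lines. This is a minimal illustration under invented assumptions (the record fields `msisdn`, `bytes` and `latency_ms`, and the chosen KPIs, are not from Nexus Telecom): raw per-packet records are reduced to per-subscriber KPIs keyed by phone number, so the bulk data can be discarded while drill-down by subscriber number remains possible.

```python
# Minimal sketch of staged monitoring: raw per-packet records are
# aggregated into per-subscriber KPIs keyed by phone number (MSISDN), so
# the raw stream can be discarded while subscriber drill-down stays possible.
# Field and KPI names are illustrative assumptions, not a real schema.
from collections import defaultdict

def aggregate(records):
    kpis = defaultdict(lambda: {"sessions": 0, "bytes": 0, "worst_latency_ms": 0})
    for r in records:                       # stage 1: capture and process
        k = kpis[r["msisdn"]]               # stage 2: aggregate per subscriber
        k["sessions"] += 1
        k["bytes"] += r["bytes"]
        k["worst_latency_ms"] = max(k["worst_latency_ms"], r["latency_ms"])
    return dict(kpis)                       # stage 3: store only the essence

stream = [
    {"msisdn": "+41790000001", "bytes": 1200, "latency_ms": 40},
    {"msisdn": "+41790000001", "bytes": 800,  "latency_ms": 95},
    {"msisdn": "+41790000002", "bytes": 300,  "latency_ms": 20},
]
print(aggregate(stream)["+41790000001"])
# {'sessions': 2, 'bytes': 2000, 'worst_latency_ms': 95}
```

Keying everything by the cell phone number is what makes the real-time drill-down Kupper mentions cheap: the summarised state is already indexed by the one identifier everyone knows.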
All this monitoring and surveillance obviously plays a vital role in providing visibility into complicated, multi-faceted next generation systems behaviour, facilitating fast mitigation of current and potential network and service problems to ensure a continuous and flawless end-customer experience. But it also supplies a wealth of information that enables operators to better develop and tailor their systems to meet their customers' needs. In other words, a tremendously powerful marketing tool.
"Certainly," Kupper confirms, "the systems have two broad elements - one of identifying problems and healing them, and the other a more statistical, pro-active evaluation element. Today, if you want to invest in such a system, you need both sides. You need the operations team to make the network as efficient as possible, and you also need marketing - the service guys who can offer innovative services based on all the information that can be amassed using such tools."
Kupper points out that drawing in other departments and disciplines may, in fact, be essential in amassing sufficient budget to cover the system. The old days when the operations manager could simply say ‘I need this type of tool - give it to me' are long gone, and anyway their budgets, these days, are nothing like big enough to cover such systems. Equally, however, the needs of many different disciplines and departments for the kind of information Nexus Telecom systems can provide are increasing as the highly competitive marketplace makes responding to customer requirements and preferences absolutely vital. Thus the systems can prove to be of enormous value to the billing guys, the revenue assurance and fraud operations, not to mention the service development teams. "Once the system is in place," Kupper points out, "you have information on every single subscriber regarding exactly which devices and services he most uses, and therefore his current, and likely future, preferences. And all this information is real-time."
Despite the apparent complexity of the sales message, Nexus Telecom is in buoyant mood, with good penetration in South East Asia and the Middle East, as well as Europe. These markets vary considerably in terms of maturity of course, and Kupper points out that OSS penetration is very much a lifecycle issue. "When the market is very new, you just push out the lines," he comments. "As long as the growth is there - say the subscriber growth rate is bigger than ten per cent a year - you're probably not too concerned about the quality of service or of the customer experience.
"The investment in monitoring only really registers when there are at least three networks in a country and the focus is on retaining customers - because the cost of gaining new customers is so much higher than that of hanging on to the existing ones.
"Monitoring systems enable you to react quickly to problems. And that's not just about the revenue you might lose, but also the reputation you'll lose. And today, that's an absolutely critical factor."
The future of OSS is, of course, intrinsically linked to the future of the telcos themselves. Kupper notes that the discussion - which has been ongoing for some years now - around whether telcos will become mere dumb pipe providers, or will arm themselves against a variety of other players with content and tailored packages, has yet to be resolved. In the meantime, however, he is confident that Nexus Telecom is going in the right direction.
"I believe our strategy is right. We currently have one of the best concepts of how to capture traffic and deal with broadband data.
"The challenge over the next couple of years will be the ability to deal with all the payload traffic that mobile subscribers generate. We need to be able to provide the statistics that show which applications, services and devices subscribers are using, and where development will most benefit the customer - and, of course, ultimately the operator."
Lynd Morley is editor of European Communications
As users become increasingly intolerant of poor network quality, Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks tells Priscilla Awde that, in order to meet the huge demand for speed and efficiency, the whole industry is heading in the same direction - creating an all IP Ethernet core using MPLS to prioritise packets regardless of content
Speed, capacity, bandwidth, multimedia applications and reliable any time, anywhere availability from any device - tall orders all, but these are the major issues facing every operator whether fixed or mobile. Meeting these needs is imperative given the global telecoms environment in which providing consistently high quality service levels to all subscribers is a competitive differentiator. There is added pressure to create innovative multimedia services and deliver them to the right people, at the right time, to the right device but to do so efficiently and cost effectively.
Operators are moving into a world in which they must differentiate themselves by the speed and quality of their reactions to rapid and global changes. Networks must become faster, cheaper to run and more efficient, to serve customers increasingly intolerant of poor quality or delays. It is a world in which demand for fixed and mobile bandwidth hungry IPTV, VoD and multimedia data services is growing at exponential rates leaving operators staring at a real capacity crunch.
To help operators transform their entire networks and react faster to demand for capacity and greater flexibility, Ericsson has created a Full Service Broadband initiative which marries its considerable mobile capabilities with similar expertise in fixed broadband technologies. With the launch of its Carrier Ethernet portfolio, Ericsson is leveraging the strength of the Redback acquisition to develop packet backbone network solutions that deliver converged applications using standards based IP MPLS (Multi Protocol Label Switching), and Carrier Ethernet technologies.
Committed to creating a single end-to-end solution from network to consumer, Ericsson bought Redback Networks in 2007, thereby establishing the foundation of Ericsson IP technology but most importantly acquiring its own router and IP platform on which to build up its next generation converged solution.
In the early days of broadband deployment, subscriber information and support was centralised, the amount of bandwidth used by any individual was very low and most were happy with best effort delivery. All that changed with growth in bandwidth hungry data and video applications, internet browsing and consumer demand for multimedia access from any device. The emphasis is now on providing better service to customers and faster, more reliable, more efficient delivery. For better control, bandwidth and subscriber management plus content are moving closer to customers at the network edge.
However, capacity demand is such that legacy systems are pushed to the limit in handling current applications, let alone future services, and in guaranteeing quality of service. Existing legacy systems are inefficient and expensive to run and maintain compared to the next generation technologies that transmit all traffic over one intelligent IP network. Nor do they support the business agility or subscriber management systems that allow operators to react fast to changing markets and user expectations.
Despite tight budgets, operators must invest to deliver and ultimately to save on opex. They must reduce networking costs and simplify existing architectures and operations to make adding capacity where it is needed faster and more cost effective.
The questions are: which are the best technologies, architectures and platforms and, given the current economic climate, how can service providers transform their operations cost effectively? The answers lie in creating a single, end-to-end intelligent IP network capable of efficiently delivering all traffic regardless of content and access devices. In the new IP world, distinctions between fixed and mobile networks, voice, video and data traffic and applications are collapsing. Infonetics estimates the market for consolidating fixed and mobile networks will be worth over $14 billion by 2011 and Ericsson, with Redback's expertise, is uniquely positioned to exploit this market opportunity.
Most operators are currently transforming their operations and, as part of the solution, are considering standards based Carrier Ethernet as the broadband agnostic technology platform. Ethernet has expanded beyond early deployments in enterprise and Metro networks: Carrier Ethernet allows operators to guarantee end-to-end service quality across their entire network infrastructure, enforce service level agreements, manage traffic flows and, importantly, scale networks.
With roots in the IT world where it was commonly deployed in LANs, Ethernet is fast becoming the de facto standard for transport in fixed and mobile telecoms networks. Optimised for core and access networks, Carrier Ethernet supports very high speeds and is a considerably more cost effective method of connecting nodes than leased lines. Carrier Ethernet has reached the point of maturity where operators can quickly scale networks to demand; manage traffic and subscribers and enforce quality of service and reliability.
"For the first time in the telecoms sector we now have a single unifying technology, in the form of IP, capable of transmitting all content to any device over any network," explains Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks, an Ericsson company. "The whole industry is heading in the same direction: creating an all IP Ethernet core using MPLS to prioritise packets regardless of content.
"In the future, all operators will want to migrate their customers to fixed/mobile convergent and full service broadband networks delivering any service to any device anytime, but there are a number of regulatory and standards issues which must be resolved. Although standards are coming together, there are still slightly different interpretations of what constitutes carrier Ethernet and discussions about specific details of how certain components will be implemented," explains Williams.
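The "prioritise packets regardless of content" idea Williams describes can be pictured with a toy strict-priority scheduler: each packet carries a traffic-class marking (as an MPLS EXP or DSCP field would encode it), and the queue drains by priority alone, never by inspecting the payload. This is a hypothetical sketch; the class names and priority values are invented for illustration.

```python
import heapq

# Toy strict-priority scheduler: packets are queued by their traffic-class
# priority (as an MPLS EXP / DSCP marking would encode it) and dequeued
# highest-priority first, regardless of what the payload contains.
# Class names and priority values are illustrative assumptions.
PRIORITY = {"voice": 0, "video": 1, "data": 2}   # lower number = served first

class PriorityScheduler:
    def __init__(self):
        self._heap, self._seq = [], 0

    def enqueue(self, traffic_class, payload):
        # _seq preserves FIFO order among packets of equal priority
        heapq.heappush(self._heap, (PRIORITY[traffic_class], self._seq, payload))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.enqueue("data", "email chunk")
sched.enqueue("voice", "RTP frame")
sched.enqueue("video", "IPTV frame")
print(sched.dequeue())   # RTP frame  (voice drains first)
```

Real routers use per-class queues with weighted scheduling rather than a single heap, but the principle is the same: the marking, not the content, decides who goes first.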
Despite debates about different deployment methods, Carrier Ethernet and MPLS-ready solutions are being integrated into current networks, and Redback has developed one future-proof box capable of working with any existing platform.
Experts in creating distributed intelligence and subscriber management systems for fixed operators and now for mobile carriers, Redback's solutions are both backward and forward compatible and can support any existing platform, including ATM, Sonet, SDH or frame relay. Redback is applying its experience in broadband fixed architectures to solving the capacity, speed and delivery problems faced by mobile operators. As the amount of bandwidth per user rises, the management of mobile subscribers and data is being distributed in much the same way as happened in the fixed sector.
Redback has developed SmartEdge routers and solutions to address packet core problems and operators' needs to deliver more bandwidth reliably. SmartEdge routers deliver data, voice or video traffic to any connected device via a single box connected to either fixed or mobile networks. Redback's solutions are designed to give operators a gradual migration path to a single converged network which is more efficient and cost effective to manage and run.
In SmartEdge networks with built-in distributed intelligence and subscriber management functionality, operators can deliver the particular quality of service, speed, bandwidth and applications appropriate to individual subscribers.
Working under the Ericsson umbrella and with access to considerable R&D budgets, Redback is expanding beyond multiservice edge equipment into creating metroE solutions, mobile backhaul and packet LAN applications. Its new SM 480 Metro Service Transport is a carrier class platform which can be deployed in fixed and mobile backhaul and transport networks; Metro Ethernet infrastructure and to aggregate access traffic. Supporting fixed/mobile convergence, the SM 480 is a cost effective means of replacing legacy transport networks and migrating to IP MPLS Carrier Ethernet platforms. The system can be used to build packet based metro and access aggregation networks using any combination of IP, Ethernet or MPLS technologies.
Needing to design and deliver innovative converged applications quickly to stay competitive, operators must build next generation networks. Despite the pressures on the bottom line, most operators see the long-term economic advantages of building a single network architecture. Moving to IP MPLS packet based transmission and carrier Ethernet creates a content and device agnostic platform over which traffic is delivered faster and over a future proof network. Operators realise the cost and efficiency benefits of running one network in which distinctions between fixed and mobile applications are eliminated.
Although true convergence of networks, applications and devices may be a few years away, service providers are deploying the necessary equipment and technologies. IP MPLS and carrier Ethernet support both operators' needs for speed, flexibility and agility and end user demand for quality of service, reliability and anywhere, anytime, any device access.
"Ultimately however, there should be less focus on technology and more on giving service providers and their customers the flexibility to do what they want," believes Williams. "All operators are different but all need to protect their investments as they move forward and implement the new technologies, platforms and networks. Transformation is not only about technology but is all about insurance and investment protection for operators ensuring that solutions address current and future needs."
Priscilla Awde is a freelance communications journalist
As financial turmoil rampages across the world's markets, Professor Janusz Filipiak, founder and chief executive of OSS/BSS provider Comarch, tells George Malim that he sees great opportunity as carriers seek to streamline their operations and get to grips with new business models, services and the complex new telecoms value chain
Comarch, the Polish IT solutions provider, has been developing OSS/BSS systems for telecoms since 1993 and now provides a portfolio of systems and managed services to incumbent, broadband and triple play operators, as well as MVNOs/MVNEs and start-ups. With a turnover of €170m, more than 3,000 employees and a customer roster that includes T-Mobile Germany and Austria, Bouygues Telecom France, O2 Germany and Polkomtel and PTC in Poland, the company has enjoyed a 33 per cent increase in turnover during the last five years. As the general economic crisis deepens, founder and chief executive, Professor Janusz Filipiak, thinks vendors will have to chase harder and act more cleverly to win deployments.
"Now all companies have to be mean and lean in the recession," he says. "We are very cost minded and every bit that is not needed is removed. You can't come to carriers with a higher price than your competitors. IT engineers are now a global resource and want the same payment in China or the UK, for example, so we are in the same position as all vendors. We can't compete on price so we can only be more intelligent and more effective than others. In spite of the recession we must now continue to invest in developing new products."
Current financial market woes aside, Comarch is heavily focused on the mobile market and recognises the challenges faced by operators. "In today's world of telecommunications, mobile operators are faced with the challenges resulting from market saturation in the majority of countries," adds Filipiak. "Innovative product offerings and enhanced service levels are indispensable in order to gain new customers and prevent churn. Operators are searching for the Holy Grail of telco that will prevent ARPU from decreasing. As voice is still the ‘killer application', we see data and value added services as a fast growing market. Other trends are still ahead of us, such as strong market competition from global corporate customers seeking the best deals from global mobile groups."
Filipiak also sees great potential in currently non-mobile operators. "Keeping in mind that everything eventually goes mobile, we haven't forgotten the great potential of fixed broadband operators, cable TV providers and triple and quad play operators," he says. "We target different segments of the market while not focusing exclusively on a single one."
Pre-paid billing has been one of the major functions carriers have sought during the life of Comarch but, as bundled and flat-rate packages become more popular, Filipiak sees its emphasis waning. "Today there are not too many content services available but they will come," he says. "Video streaming will put new requirements on bandwidth and devices. It will be very resource consuming and will be charged via pay-per-use. The experience won't be very different to paying for bandwidth or connection time with voice. Pre-paid is a method of payment which is still the most popular for the youngest segments of users, but pre-paid is becoming less related to cheap prices - because those are achievable in post-paid models as well - than to a philosophy of ‘no contract, no obligation'."
Flat-rate offers will be harder to make business sense of. "Flat rate is only viable in a world with unlimited capacity," adds Filipiak. "Flat-rate packages make a difference in the final price of services but the introduction of real flat-rate, where everything is included and mobile access is a commodity like internet or electricity or gas, will lead to a weakening of pre-paid which will favour post-paid."
Filipiak sees the market moving in this direction. "We can see that many players are moving towards a mix of post-paid with a significant amount of free minutes, SMS and MMS in a bundle," he adds. "This offer is really close to an actual flat rate and assures stable revenue for providers as well as strong customer loyalty and a resulting decrease in churn. My mantra in telecoms is that customers now expect everything to be easy."
The emergence of mobile content and the move to data services put obvious pressure on carriers' systems and the telecoms revenue chain has become much more complex. Comarch has long been prepared for this shift, as Filipiak attests: "The revenue chain is more complex and an operator is now not the only one that provides the services delivered. Service ‘sponsoring', third-party service providers, resellers and service dealers introduce the need for multi-party billing and put more pressure on monitoring quality of offerings," he says. "Our solutions also address and deal with the complexity that content and data services bring in wholesale, next generation TV, content distribution, service creation and control. We address these needs through our InterPartner Billing solution. On the OSS side, we provide service level management and service level inventory, our flagship OSS products, which enable service modelling of resources and services provided by different parties along with pro-active quality monitoring and management."
Comarch has grown from its eastern European roots and now has operations in 30 countries and addresses operators of all sizes and types, as Filipiak explains: "The Comarch brand is recognised in the telecoms world," he says. "We've been in the industry for 15 years and time is now working for us. Our biggest customers for specialised OSS solutions are Tier 1 operators. Large operators with 10 million subscribers are customers for our InterPartner Billing and, when it comes to independent operators, we have about 30 per cent of the local market as clients for integrated BSS and OSS/BSS solutions. We also target the largest CATV and broadband operators offering convergent services. Our strategy also addresses global players where we can offer the best value, give good prices, still be flexible and deliver enterprise level services."
In spite of the general economic downturn, Filipiak still sees great opportunities emerging. One area is that of next generation mobile networks and self-optimising networks. "Such concepts will invite carriers to look for solutions outside the long established segments of OSS, such as Inventory Management, Configuration Management and Network Planning," he says. "It will not be sufficient to cover one area in the future; instead co-operation of the planning and operations areas will be needed, where we see an opportunity for us. In addition, carriers are now more oriented towards a loose coupling of functional modules and standard interfaces that make it easier for smaller players, like us."
New means of delivering solutions are also critical. "With our future proof architecture of solutions, we can address modern modularity concepts and tendencies that now exist in the market," adds Filipiak. "The openness and standard interfaces in high level OSS products are the key, and customers can choose the best modules for their operations. This provides a possibility to reduce opex by utilising new business models for our customers, network virtualisation, distribution and outsourcing of operations and hosting solutions."
Regardless of the current economic gloom, Filipiak believes a new investment wave must come to the telecoms market. "Investment must happen because there will be greater demand," he says. "Physical travel will be a high cost so there will be more load on existing networks."
Carriers face massive challenges in spite of the increased demand for their network capacity and services. "In the international mobile groups, unification and co-operation issues are still of key importance in order to gain competitive advantage on the global market," he says. "Outsourcing of operations has become very popular but unsurprisingly it has turned out not to be a remedy for everything. Carriers still need to adapt their business processes and way of thinking to this new model. On the other hand, the need to reduce capex is forcing carriers to introduce scenarios of sharing physical resources, such as radio masts."
Filipiak also identifies additional challenges such as churn prevention, automatic client profiling and concentrated web-based marketing campaigns, as issues carriers will need to address.
Winning business from the large carrier groups against this backdrop is, without doubt, difficult.
"International groups are certainly challenging customers," admits Filipiak. "National companies differ in software environments, processes and levels of maturity, as well as corporate and national culture. They therefore require a flexible approach to implementation strategy and software functionality, and look for a common architecture for their network as well as their IT systems. Such carriers pay a lot of attention to building up corporate standards at the services and business process levels in order to achieve a common view."
Good products, knowledge and proven experience are the ways to win this type of business. "No PowerPoint-slide solutions can be sold anymore," adds Filipiak. "It takes a lot of time and money, but these are the only ways to win contracts with groups."
However, winning such business is never achieved on a static battlefield. Carrier consolidation continues, and that can be both a threat and an opportunity for solutions vendors. "On one hand, it is difficult because some groups will enforce product choices at the global level, and it may be more difficult for Comarch to gain a global recommendation in a large group since we have to fight for our portion of the market with much stronger players," says Filipiak. "On the other hand, consolidation forces carriers to unify their OSS/BSS landscapes, and this is a good opportunity to change long-established solutions for something new and fresh. The heterogeneous environments of global groups, with plenty of flavours in different countries, require a great level of flexibility that Comarch can provide. We already have positive experience with such projects - our work with T-Mobile, for example - which makes us optimistic for the future."
"Ultimately, we must live with the situation," adds Filipiak. "We're a service company and it's not our job to comment or to expect specific customers to behave in any particular way. The level of consolidation is already very high, so we may not see much more, in any case."
It's not only carrier consolidation that presents challenges to vendors, though. Carriers are at different stages of their business, and that places a development burden on all vendors as they seek to develop systems applicable to individual carrier needs.
"Comarch builds its solutions for different segments of the telco market," says Filipiak. "We offer both pre-integrated solutions for small businesses, such as an integrated BSS/OSS solution for an MVNO, and complex solutions tailored specifically to the needs of large players. We have frameworks and modules of software, but we've never sold them without adaptation. In the end, it is always a construction job. You have modules, but ultimately you must put them together in different ways."
Comarch has grown organically since its inception in 1993 and has shunned much of the mergers and acquisitions activity that has occurred in the OSS/BSS sector in recent years. "Our product portfolio follows unified design principles and is not the result of an acquisition of missing parts," explains Filipiak. "This gives us the possibility to offer seamlessly integrated solutions and products that complement one another while avoiding redundant functionality."
Inevitably, for all rules there are exceptions, and Comarch has recently announced an agreement to acquire 50.15 per cent of the Frankfurt-listed SoftM Software und Beratung AG in a transaction that could exceed €22m. The German software producer and systems integrator employs 420 people and supplies more than 4,000 customers.
Filipiak is open to further moves although they will be well considered. "Acquisition, yes but only in a way that we can handle along with continued organic growth. There will be no miracle from us, just steady organic growth."
Filipiak also rejects any notion of selling the company. "The company isn't for sale. My family has a controlling stake and I'm not going to sell now. The company's value is increasing and the scope of the business grows every day."
George Malim is a freelance communications journalist
ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde
Mobile operators everywhere are facing something of a conundrum which goes like this: in saturated markets they must increase revenues from high margin data services but these are typically bandwidth hungry applications resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings at work and at home. As people exploit the benefits of the big new touch screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.
However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks, which poses big questions for operators: how can they bring broadband 3G networks indoors to provide reliable, cost-effective in-building coverage? And how can they do it fast, without significant and expensive investment in macro networks, while giving customers access to the applications they want at prices they are willing to pay?
Fortunately ip.access has the answers since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.
Picocells and femtocells plug directly into existing fixed broadband networks, be they DSL, cable, satellite or even WiMAX. Acting as mini-base stations, both can be quickly installed anywhere in buildings, or outside, to bring networks to where the demand is.
These plug and play units have advantages for everyone. For users, both professionals and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high margin value added services.
For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes, ships and in other remote locations, where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users, and companies can dot them around the organisation to provide connections where needed.
Solving the problem for residential users, the Oyster3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, family members can get seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells will be up and running by spring 2009.
"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."
The difference between the two is one of size, power, capacity, functionality, price and target audience. However both allow operators to add capacity cost effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated and people previously unable to get mobile signals can be connected to high-speed networks.
"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.
"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."
Plugging mini-base stations into existing broadband connections takes indoor data sessions off the macro network, thereby raising the number of users each site can support and increasing its capacity beyond the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidised femtocells to 20% of its subscribers' homes. Similarly, research firm Analysys Mason calculates that the annual cost saving per customer for a large operator deploying 3G femtocells is between $6 and $12.
Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case, even if they are fully subsidised. Even ignoring the cost savings, it takes an increase in ARPU of only €11 per month per household to cover the cost of fully subsidising a femtocell.
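The payback arithmetic implied by those figures can be sketched quickly. The unit cost below is an assumption for illustration only; the article's own figures are the €11 monthly ARPU uplift and the €500 million saving for a 10-million-subscriber operator at 20 per cent penetration.

```python
# Back-of-envelope femtocell economics using the figures quoted in the
# article. UNIT_COST is an assumption for illustration; the article only
# quotes the EUR 11 monthly ARPU uplift and the EUR 500m macro saving.

def payback_months(unit_cost_eur: float, monthly_uplift_eur: float) -> float:
    """Months of ARPU uplift needed to recover a full femtocell subsidy."""
    return unit_cost_eur / monthly_uplift_eur

UNIT_COST = 120.0    # assumed cost of a fully subsidised femtocell, EUR
ARPU_UPLIFT = 11.0   # per-household monthly uplift cited in the article

print(f"Subsidy recovered in ~{payback_months(UNIT_COST, ARPU_UPLIFT):.1f} months")

# The macro-network saving quoted: EUR 500m over four years for a
# 10m-subscriber operator deploying femtocells to 20% of homes.
subscribers = 10_000_000
femtocells = int(subscribers * 0.20)          # 2,000,000 units
saving_per_cell = 500_000_000 / femtocells    # EUR per deployed femtocell
print(f"Implied macro-network saving per femtocell: EUR {saving_per_cell:.0f}")
```

On those assumptions a fully subsidised unit pays for itself in under a year from the ARPU uplift alone, before counting the macro-network saving.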
Operators are seeing an explosion in mobile data usage (in the UK, 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.
According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011, and IDC expects consumer spend on femtocell-enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage is at home, and that the 'digital generation' either does, or wants to, watch mobile television at home.
As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."
Operators are urgently addressing the capacity problems such increases in data usage imply. Some are capping monthly unlimited data plans, while others encourage content developers to limit application bandwidth. Neither approach is likely to be popular with users, and both may increase churn: both of which strengthen the consumer proposition for deploying picocells and 3G femtocells.
While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise and operators can introduce new, high margin services to the handsets people already have.
They can encourage both usage and fixed mobile substitution by offering FemtoZone services, which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be sent automatically to absent parents to let them know the children are back; podcasts, videos or images can be loaded to phones, or targeted advertising sent to interested users.
"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.
"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."
The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.
There are plans to include the new Iuh standard in Release 8 of the 3GPP standard, due out in December. It will replace the numerous different ways in which femtocells currently connect to networks and proprietary systems, and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors, and so can choose customer premises equipment (CPE) separately from the gateway.
Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.
Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems.
Over the five years it has been used in enterprises, the advantages of the nanoGSM have been well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. With an indoor range of up to 200 metres, it backhauls traffic through existing IP networks and supports fast data rates over GPRS and EDGE to devices such as BlackBerrys. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.
Like its bigger cousin but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the 3G femtocell ease of use, price and installation challenges necessary to meet consumer needs, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells to both large enterprises and to SMEs through their existing channels.
"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."
It's a truism that everything is going mobile and operators are already feeling the capacity pinch. Pico and femtocells give them a cost effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.
Today's smartphones are as powerful and feature-rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.
Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles, and share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."
There's a stark dynamic at work in the telecoms Operations Support Systems (OSS) market. Until recently, networks were expensive while the price tags for the OSS systems used to assure the services running across them were, by comparison, puny. Today that's all changed - not because OSS systems have become significantly more costly, but because network components are a fraction of the capital cost they were 15 years ago. The result is an apparent cost disparity that may be causing some operators to swallow hard and think about putting off their OSS investments, Thomas Sutter, CEO of Nexus Telecom, tells Ian Scales. That would be a huge mistake, he says, because next generation networks actually need more OSS handholding than their predecessors, not less
Naturally, Thomas has an interest. Nexus Telecom specializes in data collection, passive monitoring and network and service investigation systems and, while Nexus Telecom's own sales are still on a healthy upswing (the company is growing in double figures), he's growing increasingly alarmed at some of the questions and observations he's hearing back from the market. "There is a whole raft of issues that need exploring around the introduction of IP and what that can and can't do," he says. "And we need to understand those issues in the light of the fundamental dynamics of computer technology. I think what's happening in our little area of OSS is the same as what tends to happen right across the high technology field. As the underlying hardware becomes ten times more powerful and ten times as cheap, it changes the points of difference and value within competing product sets."

If you go back and look at the PC market, says Thomas, as you got more powerful hardware, the computers became cheaper but more standard, and the real value and product differentiation was, and still is, to be found in the software.

"And if you look at the way the PC system itself has changed, you see that when microcomputers were still fairly primitive in the early 1980s, all the processor power and memory tended to be dedicated to the actual application task - you know, adding up figures in a spreadsheet, or shuffling words about in a word processor. But as PC power grew, the excess processing cycles were put to work at the real system bottleneck: the user interface. Today my instincts tell me that 90 per cent of the PC's energy is spent on generating the graphical user interface. Well, I think it's very similar in our field. In other words, the network infrastructure has become hugely more efficient and cost effective and that's enabled the industry to concentrate on the software. And the industry's equivalent of the user interface, from the telco point of view at least, is arguably the OSS."
"You could even argue that the relative rise in the cost of OSS is a sign that the telecoms market as a whole is maturing." That makes sense, but if that's the case, what are the other issues that make the transformation to IP and commodity network hardware so problematic from an OSS point of view?
"There's a big problem over perceptions and expectations. As the networks transform and we go to 'everything over IP', the scene starts to look different and people start to doubt whether the current or old concepts of service assurance are still valid.

"So, for example, people come to our booth and ask, 'Do you think passive probe monitoring is still needed? Or even, is it still feasible? Can it still do the job?' After all, as the number of interfaces decreases in this large but simplified network, if you plug into an interface you're not going to immediately detect any direct relationships between different network elements doing a telecom job like before; all you'll see is a huge IP pipe with one stream of IP packets including traffic from many different network elements - and what good is that?

"And following on from that perception, many customers hope that the new, big bandwidth networks are somehow self-healing and that they are in less danger of getting into trouble. Well, they aren't. If anything, while the topological architecture of the network is simplifying things (big IP pipes with everything running over them), the network's operating complexity is actually increasing."

As Thomas explains, whenever a new technology comes along it seems in its initial phases to have solved all the problems associated with the last, but it also inevitably creates new inefficiencies. "If you take the concept of using IP as a transport layer for everything, then the single network element of the equation does have the effect of making the network simpler and more converged and cost effective. But the by-product of that is that the network elements tend to be highly specialized engines for passing through the data - no single network element has to care about the network-wide service." So instead of a top-down, authoritarian hierarchy that controls network functions, you effectively end up with 'networking by committee'.
And as anyone who has served on a committee knows, there is always a huge, time-consuming flow of information between committee members before anything gets decided. So a 'flat' IP communications network requires an avalanche of communications in the form of signaling messages if all the distributed functions are to co-ordinate their activities. But does that really make a huge difference; just how much extra complexity is there?

"Let's take LTE [Long Term Evolution], the next generation of wireless technology after 3G. On the surface it naturally looks simpler because everything goes over IP. But guess what? When you look under the bonnet at the signaling, it's actually much more complicated for the voice application than anything we've had before.

"We thought signaling had reached a remarkable level of complexity when GSM was introduced. Back then, to establish a call we needed about 11 or 12 standard signaling messages, which we thought was scary. Then, when we went into GPRS, the number of messages required to set up a session was close to 50. When we went to 3G, the number of messages increased to around 100 for a standard call set-up or handover. Now we run 3GPP Release 4 networks (over IP) where in certain cases you need several hundred signaling messages (standard circuit switching signaling protocol) to perform handovers or other functions; and these messages are flowing between many different logical network element types or different logical network functions.

"So yes, of course, when you plug in with passive monitoring you're probably looking at a single IP flow and it all looks very simple, but when you drill down and look at the actual signaling and try to work out who is talking to whom, it becomes a nightmare. Maybe you want to try to draw a picture to show all this with arrows - well, it's going to be a very complex picture with hundreds of signaling messages flying about for every call established.
"And if you think that sort of complexity isn't going to give you problems: one of my customers - before he had one of our solutions, I hasten to add - took three weeks using a protocol analyzer to compile a flow chart of signaling events across his network. You simply can't operate like that - literally. And by the way, keep in mind that even after GSM networks became very mature, all the major operators went into SS7 passive monitoring to finally get the last 20 per cent of network optimization and health keeping done. So if this was needed in the very mature environment of GSM, why doubt that it will be needed for less mature but far more complex new technologies?"
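The escalation Sutter describes can be tabulated from the figures he quotes. A minimal sketch, with one caveat: the value of 300 for 3GPP Release 4 stands in for his "several hundred" and is an assumption, not a figure from the interview.

```python
# Illustrative tabulation of the approximate signaling-message counts
# quoted above for each network generation, with growth relative to GSM.
# The 300 for 3GPP Release 4 is an assumed stand-in for "several hundred".

signaling_messages = {
    "GSM call setup": 12,
    "GPRS session setup": 50,
    "3G call setup / handover": 100,
    "3GPP R4 handover (over IP)": 300,
}

baseline = signaling_messages["GSM call setup"]
for scenario, count in signaling_messages.items():
    growth = count / baseline
    print(f"{scenario:28s} ~{count:4d} messages ({growth:.0f}x GSM)")
```

Even on these rough numbers, per-call signaling volume has grown by well over an order of magnitude across four technology generations, which is the nightmare the passive-monitoring probes are there to untangle.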
Underpinning a lot of the questions about OSS from operators is the cost disparity between the OSS and the network it serves, says Thomas. "Today our customers are buying new packet switched network infrastructure and to build a big network today you're probably talking about 10 to 20 million dollars. Ten or 15 years ago they were talking about 300 to 400 million, so in ten years the price of network infrastructure has come down by a huge amount while network capacity has actually risen. That's an extraordinary change.
"But here's the big problem from our point of view. Ten years ago, when you spent $200 million on the network you might spend $3 million on passive probe monitoring. Today it's $10 million on the network and $3 million on the passive probing solution. Today, also, the IP networks are being introduced into a hybrid, multiple technology network environment, so during this transition the service assurance solution is getting even more complex.

"So our customers are saying, 'Hey! Today we have to pay a third of the entire network budget on service assurance, and the management is asking me: what the hell's going on? How can it be that just to get some quality I need to invest a third of the money into service assurance?'

"You can see why those sorts of conversations are at the root of all the doubts about whether they'll now need the OSS - they're asking: 'why isn't there a magic vendor who can deliver me a self-healing network so that I don't have to spend all this money?'"

Competitive pressures don't help either. "Today, time-to-market must be fast and done at low cost," says Thomas, "so if I'm a shareholder in a network equipment manufacturing company and they have the technology to do the job of delivering a communication service from one end to the other, I want them to go out to the market. I don't want them to say, 'OK, we now have the basic functionality, but please don't make us go to the market; first can we build self-healing capabilities, or built-in service assurance functionality, or built-in end-to-end service monitoring systems - then go to the market?' This won't happen."

The great thing about the 'simple' IP network was the way it has commoditized the underlying hardware costs, says Thomas. "As I've illustrated, the 'cost' of this simplicity is that the complexity has been moved on rather than eliminated - it now resides in the signaling chatter generated by the ad hoc 'committees' of elements formed to run the flat, non-hierarchical IP network.
"From the network operator's point of view there's an expectation problem: the capital cost of the network itself is being vastly reduced, but that reduction isn't being mirrored by similar cost reductions in the support systems. If anything, because of the increased complexity, the costs of the support systems are going up.

"And it's always been difficult to sell service assurance because it's not strictly quantitative. The guy investing in the network elements has an easy job getting the money - he tells the board that if there's no network element, there's no calls and there's no money. But with service assurance, much more complicated qualitative arguments must be deployed. You've got to say, 'If we don't do this, the probability is that x number of customers may be lost.' And there is still no exact mathematical way to calculate what benefits you derive from a lot of OSS investment."
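The cost-ratio shift behind those boardroom conversations is easy to make concrete with the figures Sutter quotes: monitoring spend has stayed roughly flat while network capex has collapsed, so monitoring's share of the budget balloons even though its absolute cost is unchanged. A minimal sketch:

```python
# Monitoring spend as a share of network capex, then and now, using the
# figures quoted above. The absolute monitoring cost is unchanged, but
# its share of the budget jumps from under 2% to roughly a third.

def monitoring_share(network_musd: float, monitoring_musd: float) -> float:
    """Fraction of total network spend going to passive monitoring."""
    return monitoring_musd / network_musd

then = monitoring_share(200.0, 3.0)  # ten years ago: $200m network, $3m probes
now = monitoring_share(10.0, 3.0)    # today: $10m network, same $3m probes

print(f"Then: monitoring was {then:.1%} of network spend")
print(f"Now:  monitoring is {now:.1%} of network spend")
```

The apparent twenty-fold rise in relative OSS cost is an artefact of the denominator shrinking, not of service assurance getting more expensive.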
The problem, says Thomas, is as it's always been. That is, that building the cloud of network elements - the raw capability, if you like - is always the priority, and what you do about ensuring there's a way of fixing the network when something goes wrong is always secondary.

"When you buy, you buy on functionality. And to be fair, it's the same with us when we're developing our own products. We ask ourselves, what should we build first? Should we build new functionality for our product, or should we concentrate on availability, stability, and ease of installation and configuration? If I do too much of the second, I'll have fewer features to sell and I'll lose the competitive battle.

"The OSS guy within the operator's organization knows that there's still a big requirement for investment, but for the people in the layer above it's very difficult to decide - especially when they've been sold the dream of the less complex architecture. It's understandable that they ask: 'why does it need all this investment in service assurance systems when it was supposed to be a complexity-buster?'"

So on each new iteration of technology, even though they've been here before, service providers have a glimmer of hope that 'this time' the technology will look after itself. We need to look back at our history within telecoms and take on board what actually happens.
The promise of IPTV is fraught with dangers - from outages to poor quality pictures - but effective systems test and measurement could save the day. Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde
Imagine this: virtually the whole nation settling in to watch the rugby world cup or European soccer final, and the television picture goes down for thirty minutes, or freezes just as the home side is about to score. It may flicker at critical moments, the sound may be unsynchronised, or users may be unable to change channels quickly and efficiently. Perhaps the latest film released on Video on Demand (VOD) can be paid for but not downloaded, or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, has no need to imagine. Robert Winters, Chief Marketing Officer at Shenick Network Systems, predicts a riot should any one of these all too feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of outages and expect picture-perfect television. Guaranteeing quality of service and of individual subscribers' experiences is, however, a major and often underestimated challenge for all service providers, but especially in the IPTV environment, where lost packets, jitter and latency, combined with poor network architecture and an inability to scale, will all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications, and are scalable. The cable and satellite operators have also negotiated long-standing agreements with major studios and other content owners.
Alan Robinson CEO at Shenick suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video/television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best effort approach to service levels is no longer good enough in the IPTV world, where packet losses have become more serious than ever. User expectations have risen with exposure to digital video consumer electronics and DVDs, which provide high quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators, and they have already delayed the roll out of some early commercial IPTV launches. Others involve more technical issues, including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
"It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployments of DSL broadband, in which quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss of or interference with IPTV signals, but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the best and right equipment, and carrying out pre- and post-deployment and real-time network monitoring and testing, are essential," explains Winters. "Operators must continually test the quality of subscribers' experience and monitor service assurance to deliver the best possible results."
Among the familiar but significant factors affecting service levels is the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will, at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture will be able to support and demonstrate how the existing network will react to application loads both at launch and going forward.
The constant challenges of transmitting next generation services over legacy architecture are the ability to scale and, ultimately, performance - problems that must be addressed at the earliest stages of IPTV launches.
"Prior to deployment operators must decide which vendor to use for IPTV; which set top boxes; DSLAM equipment; network components; routers; switches; core transport and encoders, among others, they will use," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues; the performance limitations? Will the network need to be re-architected to provide more bandwidth or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system, diversifEye 4.0, gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions such as network surges, so operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
"We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
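The channel-change surge Robinson describes can be sketched as a simple queueing exercise: thousands of simultaneous multicast joins arrive at an aggregation device that can only process a fixed number per tick. The figures below are illustrative assumptions, not measured DSLAM capacities.

```python
# Hypothetical sketch of a channel-change ("zapping") surge: all requests
# arrive at once and the aggregation device clears a fixed number of
# multicast joins per 100 ms tick. Capacities are invented for illustration.
def simulate_zap_surge(requests, joins_per_tick):
    """Return the backlog after each tick until the queue drains."""
    queue = requests
    depths = []
    while queue > 0:
        queue = max(0, queue - joins_per_tick)
        depths.append(queue)
    return depths

# 5,000 subscribers zap simultaneously; the device handles 800 joins/tick.
backlog = simulate_zap_surge(5000, 800)
print(len(backlog))  # ticks until the backlog clears
```

Even this toy model shows why pre-deployment load testing matters: the last viewers in the queue wait many ticks for their channel to appear, which on screen looks like a frozen or black picture.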
Testing early and often is a mantra operators ignore at their peril since it is difficult to debug problems in live IPTV deployments. Consistent low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
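The statistical scoring mentioned above can be illustrated with a toy model that maps measured impairments onto a 1-to-5 quality scale. Real deployments use standardised metrics such as MOS or the Media Delivery Index; the weights and thresholds below are invented purely for demonstration.

```python
# Illustrative only: a toy quality score in the spirit of the statistical
# models described above. The coefficients are assumptions, not a standard.
def toy_video_score(loss_pct, jitter_ms, latency_ms):
    """Map impairments onto a 1 (bad) to 5 (excellent) scale."""
    score = 5.0
    score -= 2.0 * min(loss_pct, 2.0)        # packet loss hurts most
    score -= 0.02 * min(jitter_ms, 50.0)     # jitter degrades smoothness
    score -= 0.005 * min(latency_ms, 200.0)  # latency matters least
    return max(1.0, score)

print(toy_video_score(0.0, 5, 40))    # near-pristine stream scores high
print(toy_video_score(1.5, 30, 100))  # visibly degraded stream bottoms out
```

The point of such scores is that an operations centre can watch one number per channel per subscriber region in real time, rather than raw packet counters.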
Changing camera angles, choosing what to watch, when, or having on-screen ‘chats' with friends are big drivers for IPTV but most are bandwidth intensive. Equally the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video/audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
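A back-of-envelope check of the household bandwidth budget Winters describes can be sketched as follows. The per-service rates are illustrative assumptions (actual codec rates vary widely); the idea is simply that the sum of the flows on each virtual LAN must fit within the roughly 30Mbps access line.

```python
# Hypothetical household bandwidth budget: per-service flows (rates are
# illustrative assumptions) summed against a 30 Mbps access line, with
# each service type carried on its own virtual LAN.
FLOWS_MBPS = {
    "hdtv_1": 8.0, "hdtv_2": 8.0,   # two HD channels
    "sdtv": 3.5,                    # one SD channel
    "vod": 6.0,                     # a VoD download in progress
    "voip": 0.1, "internet": 3.0,   # telephony and web browsing
}

def headroom(flows, line_rate_mbps=30.0):
    """Return remaining capacity; negative means the line is oversubscribed."""
    return line_rate_mbps - sum(flows.values())

print(headroom(FLOWS_MBPS))  # positive -> this triple-play mix still fits
```

The exercise also shows why television "must take priority": switch one SD channel to HD and the headroom vanishes, so without per-VLAN policing the overflow would eat into the VoIP and data flows.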
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers but operators are likely to see performance issues when they scale up to millions of customers. It is then they may have to extend fibre deeper into the network but fibre to the home/curb/node (FTTH/C/N), architectures are becoming cheaper and increasingly feasible especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively operators may turn to WiMax as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea for instance access speeds of 100Mbps are commonly available and not expensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VOD, which may raise more contention/capacity problems especially if one person in the house is downloading a video whilst another is watching broadcast television.
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels or generate viruses making it important to test firewalls and simulate the effects of such attacks, in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.
Priscilla Awde is a freelance communications journalist
End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist, Nexus Telecom tells Ian Scales
Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.
It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems and to verify service levels. Essentially, says CEO, Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors. What's changing, though, is the importance of such end-to-end transaction data.
Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly. Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access. It's an idea whose time may have come. According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.
As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself.
There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way. Sutter says some are, but some are still grappling with the concepts.
"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."
This misses the point in a number of ways, claims Sutter.
"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple - in fact it's rather the other way about. The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work."
And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.
"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that. Telcos need to harness network data - I call them 'transactions' - to develop their businesses."
Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.
"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."
So end-to-end transaction data is important and will grow in importance. How does Nexus Telecom see itself developing with the market?
"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis. This tells you what's happening so you can plan network capacity and so on. But these systems never, ever go to layer 7 and tell you about transaction details - we can.
"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems. Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."
So given that telcos are recognising the need to gather and store, what's the future OSS transaction record architecture going to look like?
"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers. So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation. Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies - the marketing people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."
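The collect-once, consume-many architecture Sutter outlines can be sketched in a few lines: a collection layer ingests transaction records into one open store, and the different presentation-layer constituencies query the same data at whatever granularity they need. The class and field names below are invented for illustration.

```python
# Minimal sketch of the three-layer transaction-record architecture
# described above: collected once, stored openly, queried by any consumer.
# Names are hypothetical, not from any Nexus Telecom product.
class TransactionStore:
    """Open repository sitting between collection and presentation layers."""
    def __init__(self):
        self._records = []

    def ingest(self, record):    # the collection layer hands off here
        self._records.append(record)

    def query(self, **filters):  # marketing, billing, ops all query here
        return [r for r in self._records
                if all(r.get(k) == v for k, v in filters.items())]

store = TransactionStore()
store.ingest({"service": "voip", "msisdn": "41790000001", "status": "ok"})
store.ingest({"service": "voip", "msisdn": "41790000002", "status": "failed"})
store.ingest({"service": "sms",  "msisdn": "41790000001", "status": "ok"})

# Two departments ask different questions of the same single collection:
print(len(store.query(service="voip")))   # 2 - ops: all VoIP transactions
print(len(store.query(status="failed")))  # 1 - assurance: failures only
```

The design choice this illustrates is exactly the one Sutter argues for: because the store exposes an open query interface, replacing one reporting application does not mean re-collecting or losing the underlying data.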
Does he see his company going 'up the stack' to tackle some of these applications in the future?
"It is more important to have open interfaces around this layering. We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."
Sutter thinks the supplier market is already evolving in a way that makes sense for this model.
"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses. We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network. Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."
So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers.
"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution. "You would come along and ask, 'Hey guys, can you give me access to your database?', and the response would be 'Woah, don't touch my database. If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get to this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces. One piece will collect the data, correlate and aggregate it; the second will run the application and the presentation to the user. At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join, so that data may be stored in an open repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best-in-class products for collecting the data and presenting it, and this will be a good thing for the entire industry. After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always lose the whole data collection and repository for that application at the same time?"
With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?
"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."
But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like 'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."
So where can Nexus Telecom go from here? Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?
"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring. But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”
Ian Scales is a freelance communications journalist.
Organisations like the TeleManagement Forum have a dilemma when it comes to anniversaries. TM Forum’s 20th birthday in 2009 will naturally be a cause for celebration: you don’t get such a long run in this business unless you’re doing something right. But there’s a nagging worry too. Can a successful first 20 years as a thought leader; framework and standards setter – mostly for telecoms operational and business support systems - serve as a basis for another 20 years setting frameworks for an industry that is turning rapidly into something else, as new players muster at its borders? Does the heritage help or hinder when it comes to refining a role in the rapidly converging telecom, media and Internet industries, where the new tends to be seen as ‘good’ and anything else is consigned as ‘legacy’?
For Keith Willetts, the TM Forum's original co-founder and current Chairman, it's a question that soon answers itself, once you apply a little deep thought to the matter.
“What's become really clear, over the past year or so especially, is that convergence is here – it's for real and we're really at the start of the process,” he says. “What you've got is three trillion dollar industries - media, Internet and telecom - all coming together. You just have to pick up a newspaper, listen to the news or, of course, surf the web to see that it's happening. Who's Virgin bought? What services is Skype offering now? All that sort of thing. And over the coming years we're going to see far more of this mixing and matching – where a company strong in one field takes over or forms an alliance with a company that's strong in another.”
For Willetts it's a process that brings opportunities as well as threats. One apparent threat for many in the telecom industry is that telecom becomes sidelined in many markets as a new breed of player moves in and takes over. This extreme scenario might involve powerful, highly capitalised Internet companies, such as eBay with IP telephony company Skype (which it bought in 2005) completely disrupting the traditional telephony market.
“I use Skype and I'm amazed at just how good the service is. I wonder to myself, 'why would you need anything else?' But,” admits Willetts, “the more likely scenario is that we'll end up with a real mix of companies which take elements from all three sectors.”
There lies the opportunity. Willetts thinks the TMF can provide the frameworks that integrate the players, just as it has hitherto provided frameworks to integrate telcos' disparate back-office systems. The challenge is to apply its expertise in a new way.
“What we've been good at is helping our members develop a lean end-to-end process environment – a set of frameworks and standards encapsulated in our NGOSS (New Generation Operations Software and Systems) initiative that enables them to build flow-through business processes that cross the old internal demarcation lines that were, and often still are, such a feature in traditional telcos. Using NGOSS they can begin to join the dots between things like inventory, provisioning, service assurance and so on.”
What's clearly required in the new converged telecom-media-Internet world, he points out, is a similar set of guidelines at the inter-company level. “We are going to need standards and frameworks that reach beyond the company and the sector to automate things like content delivery, digital rights management and things we haven't even thought of yet.
“Of course, it's a huge area and there are a number of unresolved questions,” he says. “For example one specific conversation we've recently had within TMF has been around the possibility of defining a value chain. And we came to the conclusion that such a question presupposes we know who is going to be where in the chain. In fact, all we can actually say is that there will be value chains and there will be different people at different positions within them. What we're looking at is the development of something more two- or even three-dimensional than a simple chain – it's probably better to think of these relationships forming something like a 'value web', where companies might sit in any one of several positions. They might be undertaking one set of commercial roles in one territory and a different set in another.”
In fact, says Willetts, TMF as an organisation is keen to develop a role as an independent business and technical facilitator rather than being seen as the advocate of a particular, sector-specific, approach. The reason is simple – the telecom industry itself won't exist as we know it five to ten years from now. It's transforming, and as web and media companies are moving onto some of its traditional turf, telecoms itself is branching out into many new areas.
“It's important we aren't seen to be in the business of promoting any particular outcome,” claims Willetts. “We want to be part of an environment where there can be a range of outcomes, shapes and services. The important thing is that user companies and providers can actually put the pieces together and have them work. It's a case of 'may the best man win'.”
So where exactly is the TM Forum running to?
First, TMF is inviting thought leaders from media and cable companies to join its Board in order to get a 360-degree view of emerging needs. Second, it's rapidly broadening its business and software vision to encompass the needs of information and content-based services and the myriad of virtual providers and value chain players. Third, collaboration with other bodies will be important and ongoing. For example, TMF recently struck a landmark deal with the Broadband Services Forum (BSF), a formal partnership under which relevant work is shared. The members of each organisation will have influence over related technical work on telecom-web convergence issues, and the first fruits of the collaboration will show up in a new TMF document entitled “Telecom Media Convergence Industry Challenges and Impact on the Value Chain”. The relationship will also contribute to more multimedia focused panels at TM Forum events, and future development of process standards for content management and convergent media-telecom operations.
“One of the most exciting and fundamental things we're going to do is to develop what we're calling a ' super-catalyst', and we'll be kicking that project off at Nice this year.”
The TMF Catalysts are joint projects undertaken by members and sponsored by service providers, usually to demonstrate leading edge thinking on how to solve problems in integrating the back office, using approaches based on TM Forum standards and guidelines. The results of these projects are demonstrated at TMF's TeleManagement World conferences in Nice and Dallas each year.
“The super-catalyst, which we're likely to call the Converged Services Showcases, will be really major events, involving media companies, device companies, cable TV, IPTV and mobile TV,” says Willetts. “The idea is to show a whole set of advanced service scenarios, but unlike what you'd see at a trade show - where you typically just see the thing working - with the super-catalyst you'll be able to walk around the back of this and be shown how it's actually being operated and controlled using standards and the various OSS and BSS systems involved.
“It's at an early stage of development, but the general idea is that you go to the show floor and you see the equivalent of a town with houses and retail establishments and so on. And you see all these services that you're getting and then you walk around the back to the network operations centre and you can see how it's all being managed. It's a big leap.”
“We're working on, not just a demonstration, but a real catalyst designed to flush out problems and what standards you need, and what bits you need to invent that you haven't thought of. The fact is that we don't know what the standards requirements are in some cases in the converged world yet, and that's why this super-catalyst is going to be a great vehicle for developing the whole area. It's going to be a major undertaking.”
The plan is for the first super-catalysts to debut later this year at the TMF's Dallas TMW.
Nice TMW will be the start of the major change. “What we want to show is that convergent services are here. So we have a very strong convergence message and a very strong illustration that TMF is responding. There will be discussion about managing content-based and entertainment-based services and more involvement from media companies. For instance, for a meeting at Nice we've invited executives from Disney, Time-Warner and Virgin Mobile to join the table. The fact is that it's just as relevant for a senior executive at BT, say, to sit down with a Disney executive, as it is for the Disney guy to get to understand how the company can exploit the telecom space.”
“For some of these players convergence will result in a partners' love-fest and for others it will be 'daggers drawn', as they realise they're going to be contesting the same space, but in the long run nobody knows who will be in which role at any given point in time. TMF's role isn't to try and predict that.”
What about the core frameworks and standards generated by the TMF? Will these have to change markedly to accommodate the broader remit and the entry of new types of player into the value web?
“Yes, no doubt there will be changes as we go forward. One area that we're probably going to have to address in all our output is outsourcing. While our current guidelines intrinsically help players to define and manage all their processes, so that outsourcing, where required, will be simpler to accomplish, it's also true to say that outsourcing isn't often specifically allowed for. I've just been to India to speak at a TMF event there, and what I heard was really eye-opening in terms of the way outsourcing is being used to reduce costs.
“At Bharti Airtel, one of the big mobile operators with 80 to 90 million subscribers, all the IT is outsourced and they operate at a cost level that a European mobile operator, for instance, can't even come close to.”
Willetts says he thinks that outsourcing and partnering arrangements are bound to become more complex and must be catered for in the back office in a fundamental way.
“For example BT might run an IPTV service in the UK using its infrastructure, and in Germany it might run a service on someone else's because it doesn't own any infrastructure there. But it will probably want to run the same brand and service. The back office systems need to support that sort of thing.”
But the big question has to be asked. Isn't there a danger in all this for TMF? This is a member-driven organisation and it is energised by a core of highly motivated, mostly telecoms-oriented individuals who give, not just their companies' time, but often their own time and effort as well. Doesn't TMF run a big risk in realigning itself so radically?
Willetts is adamant: “What people sometimes don't understand is that it's not a question of: 'If you go and chase all these converging media and web companies, will you desert your core telecom membership in the process?' That question forgets the fact that telecom companies are, themselves, becoming multi-media companies. So, the reality is, to be of maximum use to our core constituency, we need to run with them, not away from them.”
Ian Scales is a freelance communications journalist.
Technology companies come and go, but some are blessed with the foresight to help drive the technological developments that permeate all our lives. One such company is Micron, whose COO, Mark Durcan, tells Lynd Morley why it has been so successful
Future gazers abound in our industry, and we’re being promised a near-future of sensor networks and RFID tags that will control or facilitate everything from ordering the groceries, to personalised news projected into our homes or from our mobile phones. This stuff of science fiction, fast becoming science fact, is the visible, sexy end-result of the technology, but what about the guys working at the coal-face, actually producing the tools that enable the dreams to come true?
Micron Technology is one of the prime forces at that leading edge. Among the world’s leading providers of advanced semiconductor solutions, Micron manufactures and markets DRAMs, NAND Flash memory, and CMOS image sensors, among other semiconductor components and memory modules for use in computing, consumer, networking and mobile products. And Mark Durcan, Micron’s Chief Operating Officer, is confident that the company has been instrumental in helping the gradual realisation of the future gazers’ predictions.
“I do think that we are, in many ways, creating the trends, because we’ve created the technology which enables them,” he comments. “I can give you two prime examples. The first is in the imaging space where, for many decades, charge-coupled devices (CCDs) were the technology of choice for capturing electronic images – mostly because the image quality associated with CCDs was much better than that of CMOS imagers, which are what Micron builds today.
“Nonetheless, we were strong believers that we could marry very advanced process technology, device design and circuit design techniques with the CMOS imager technology, and really create a platform that enabled a whole new range of applications.
“I think we did that successfully,” he continues, “and the types of applications that were then enabled are really quite stunning. For instance, with CCDs you have to read all the bits out serially, so you can’t capture images very quickly. With CMOS imagers you can catch thousands of images per second, which then opens the door to a whole new swathe of applications for the imagers – from very high speed cameras, to electronic shutters that allow you to capture a lot of images, and, by the way, you can do it using far less power. We have already made a major impact in providing image sensors to the notoriously power hungry cameraphone and mobile device based marketplaces, and in the space of two years have become the leading supplier of imaging solutions there. One in three cameraphones now has our sensors, and in only two years we have become the largest manufacturer of image sensors in unit terms worldwide. So now, for instance, the technology enables all sorts of security, medical, notebook and automotive applications – you can tune the imagers for a very high dynamic range, low light and low noise at high temperatures, which then enables them to operate in a wide variety of environments that CCDs can’t function in.
“As a result, you can put imaging into a multitude of applications that were never possible before, and I think we really created that movement by creating the high quality sensors that drive those applications.”
The second example Durcan quotes is in the NAND memory arena. “What we’ve done is probably not apparent to everyone just yet, but, actually, I believe that we’ve broken Moore’s law.
“We are now scaling in the NAND arena much faster than is assumed under Moore’s law, and that has really changed the rate at which incremental memory can be used in different and new ways. As a result, I believe it will also pretty quickly change the way computers are architected with respect to memory distribution. So we’re going to start seeing changes in what types of memory are used, and where they sit in the memory system, and it’s all being driven by a huge productivity growth associated with NAND flash and the rate at which we’re scaling it. We are scaling it faster than anyone else in the world now, and we are also well tuned to the increasingly pushy demands of mobile communications, computing and image capture devices.”
The productivity growth Durcan alludes to has been particularly sharp for Micron over the past year. The formation of IM Flash – a joint venture with Intel – in January 2006 has seen the companies bringing online a state-of-the-art 300mm NAND fabrication facility in Virginia, while another 300mm facility in Utah is on track to be in production early next year. The venture also produces NAND through existing capacity at Micron’s Idaho fabrication facility. And just to keep things even busier, the partners introduced last July the industry’s first NAND flash memory samples built on 50 nanometre process technology. Both companies are now sampling 4 gigabit 50nm devices, with plans to produce a range of products, including multi-level cell NAND technology, starting next year. At the same time, Intel and Micron announced in November 2006 their intention to form a new joint venture in Singapore (where Micron has a long history of conducting business) that will add a fourth fabrication facility to their NAND manufacturing capability.
In June 2006, Micron also announced the completion of a merger transaction with memory card maker Lexar Media, a move that helped Micron expand from its existing business base into consumer products aimed at digital cameras, mobile computing and MP3 or portable video playing devices.
“Our merger with Lexar is interesting for a number of different reasons,” Durcan comments. “Certainly it brings us closer to the consumer, as, historically, our products tended to be sold through OEMs. But, in addition, it provides the ability to build much more of a memory system, as opposed to stand-alone products, given that Lexar delivers not only NAND memory, but also a NAND controller that manipulates the data in different ways and puts it in the right format for the system that you’re entering. Working closely with Lexar, we want to ensure that this controller functionality is tied to the new technologies we want to adopt on the NAND front, making sure that they work well together, thus enabling more rapid introduction of new technologies and getting them to market more quickly.”
The considerable activity of the past twelve months clearly reflects Micron’s view of itself as a company that is in the business of capturing, moving and storing data, and aiming for the top of the tree in each section. On the ‘capturing’ front, for instance, Durcan notes: “We’ve been very successful from a technology development perspective, and I think we’re pretty much the unquestioned leader in the image quality and imaging technology arena. As mentioned, we also happen to be the world’s biggest imaging company now – it happened more quickly than any of us thought it would, but it was driven by great technology. So we have plenty of challenges now in making sure that we optimise the opportunity we’ve created to develop new and more diversified applications.”
Certainly, the company is willing to put its developments to the most stringent of tests. All of Micron’s senior executives, including Durcan, recently drove four Micron off-road vehicles in an exceptionally rugged all-terrain race in California, the Baja 1000, digitally capturing and storing more than 140 hours of video from the race, using Micron’s DigitalClarity image sensors and Lexar Professional CompactFlash memory cards specially outfitted for its vehicles. All the technology performed remarkably well, as did Micron’s CEO Steve Appleton, who won the contest’s Wide Open Baja Challenge class some 30 minutes ahead of the next closest competitor.
Appleton’s energetic and non-risk-averse approach to both the Baja 1000 (in some ways the American version of the Paris Dakar Rally) and to life in general (he is reputed to have once crashed a plane during a stunt flight, but still proceeded with a keynote speech just a few days later) is reflected in an undoubted lack of stuffiness within Micron.
Certainly, the company has taken a certain level of risk in pioneering technology developments. RFID is a case in point. “Sometimes,” Durcan explains, “the technology was there, but the market was slow to develop. RFID is a good example. Today, Micron has the largest RFID patent portfolio in the world. We certainly developed a lot of the technology that is now incorporated in global RFID standards, but when we first developed it, the threat of terrorism, for instance, was less obvious, so we simply couldn’t get these tags going that are now absolutely commonplace. I suppose you could say we’ve been a little ahead of our time.”
The company is also managed by a comparatively young executive team, with a very non-hierarchical approach to business. “I do believe that we have a certain mindset that keeps us pretty flexible,” Durcan explains, “and one of our strongest cards is that we have some really great people, with a great work ethic. At the same time, we drive a lot of decisions down into the company. We’re probably less structured in our decision making than a lot of companies.
“So, we try to get the right people in the room (not necessarily in the room actually, but on the same phone line!) to make a decision about what is the right space to operate in, then we can turn it over to people who can work the details.
“We try to get to that right space, at a high level, through good communication and then drive it down. It is the opposite of what I believe can happen when companies grow, become compartmentalised, and tend to get more and more siloed.
“There is also very strong synergy between the different activities within Micron,” he continues. “In each case we’re really leveraging advanced process technology, advanced testing technology, and large capital investments in large markets. There are a lot of things that are similar and they do all play closely with each other.”
Micron’s people are, in fact, a truly international bunch, recruited globally, and bringing a great diversity of skills and approaches to the company. “I think that we are one of the most global semiconductor companies in the world,” Durcan says, “despite being a relatively young company. We recently started manufacturing our sensors in Italy and have design centres in Europe, both in the UK and Norway, which are expanding their operations. In fact we are now manufacturing on most continents – except Africa and Antarctica – and we have design teams right around the world who work on a continuous 24hr cycle handing designs from site to site. We’ve tried to grow a team that is very diverse, and leverage the whole globe as a source of locating the best talent we can.”
So, does all this talent produce its own crop of future gazers? Durcan believes they have their fair share. “There certainly are people at Micron who are very good at seeing future applications. My personal capabilities are much more at the technology front end. I can see it in terms of ‘we can take this crummy technology and really make it great’. Then I go out and talk to other people in the company who say ‘that’s fantastic, if we can do that, then we can...’. It really does take a marriage of the whole company, and a lot of intellectual horsepower.”
That horsepower has resulted in a remarkable number of patents for Micron. Durcan comments: “The volume and quality of new, innovative technology that Micron has been creating is captured by our patent portfolio. It’s an amazing story, and something I’m really proud of. The point is, Micron is a pretty good-sized company, but we’re not large by global standards – we’re roughly 23,500 employees worldwide. Yet we are consistently in the top five patent issuers in the US.
“I feel the more important part of the patent story, however, is that when people go out and look at the quality of patent portfolios, they typically rank Micron as the highest quality patent portfolio in the world – bar none. I think that’s pretty impressive and speaks volumes about the quality our customers benefit from.”
Lynd Morley is editor of European Communications
Tony Wilson, COO, Martin Dawes Systems and Warren Buckley, director of portfolio convergence at BT, describe the relationship between the two companies and how MDS is enabling BT to become more responsive and agile
As every operator working in the highly competitive global telecoms industry knows, success depends on business agility, innovative, easy to use services and putting customers first. This is especially true in the emerging market for converged services where end users increasingly want anywhere, anytime connections over any device.
Converged services need converged companies to supply them: companies that have both telecoms and IT expertise and experience, and the next generation solutions and networks to deliver. Just such a company is Martin Dawes Systems, which has over 20 years' experience as a virtual mobile operator running next generation networks and creating converged software solutions for the market. Acting as a Mobile Virtual Network Enabler (MVNE), it offers a suite of specialist subscriber management systems, processes and end-to-end managed services and platforms to its virtual mobile telecoms clients. “We work in partnership with and as part of our customer's operation, becoming almost an internal department helping them to get products to market fast,” explains Tony Wilson, COO at Martin Dawes Systems.
“Since we understand both the telecoms and IT sides of the business and can draw on our history as a Mobile Virtual Network Operator (MVNO), we can help operators become more flexible, customer centric and responsive.”
Demanding markets and competitive challenges mean operators must react fast, but few have the unified network architecture, supporting technologies or internal organisation needed for rapid response to shifting customer demands.
Typical of most former incumbents and major telcos, BT has a complex legacy environment with over 4,000 systems, hundreds of networks, over 20 million customers and several thousand products. Launching converged services efficiently requires a single unified platform. BT's answer was to choose a combination of in-house core technical innovation and partnerships with expert, trusted third party suppliers. “A big challenge for BT was in deciding how much to do for ourselves and how much to outsource, to get through a partnership or buy in,” says Warren Buckley, director of portfolio convergence at BT, which has created a close working partnership with Martin Dawes Systems to help deliver its fixed mobile convergence (FMC) service BT Fusion to SMEs.
“We partnered with Martin Dawes Systems because of their huge mobile experience combined with custom built products for SMEs. We have a unique, fully managed billing and CRM service which enables us to deliver speed to market plus the level and complexity customers demand.
“The requirements of business users in the SME space are more complex than those of consumers, especially from the mobile point of view, and represent a divergence from traditional PSTN services. Mobility entails the flexibility to offer bundled minutes, handle tariff changes and an on-going hierarchy of relationships within and between businesses. FMC needs to offer the best of both worlds and we therefore needed to converge,” explains Buckley. “We made the big decision to go to a third party for an array of different services including CRM, billing, revenue assurance and tariff set up and control.”
Explaining the relationship further, Wilson says his company acts as a department within BT, fully understanding its requirements and providing a stable group of experts who help develop new ideas for meeting tight time schedules and fostering business agility.
Traditional telco response to launching new products is to build a separate systems stack for each one. The result is a legacy of proprietary and largely manual systems; disconnected islands of automation and an environment in which valuable business information and customer data are made largely inaccessible because they are stored and duplicated in numerous separate silos. All of which adds up to inefficiency, slow response to market demand and an architecture unable properly to support convergence, business agility or customers – a situation which can be ameliorated by creating third party partnerships.
Facing other challenges
New entrants, many of which are retail brand management companies or content rights owners with little or no telecoms experience, face other challenges. For them the priority is to select a network provider plus an expert partner to whom they can outsource end-to-end service delivery. With no legacy, these MVNOs are, however, agile competitive companies used not only to anticipating customer needs but also to delivering high quality services. They are introducing retail business practices into a sector which has historically been dominated by incumbent operators with little competitive pressure to change.
Whether existing or new, all operators must cut the cost of doing business whilst simultaneously introducing a raft of innovative multimedia services terminating on different devices.
The old vertically structured point-to-point architecture is too cumbersome to support this fast moving world in which products may be quickly set up and torn down; where customers demand different billing models and on-line account management across the services they use.
“In today's market operators are trying to re-position themselves and focus on convergence,” says Wilson. “The convergent model is now all about offering quadruple play services which combine wireless and wireline voice with broadband and video in one bundled package.
“The industry is re-establishing itself. Operators are re-building and reforming the MVNO model to generate new business. They are outsourcing to third party MVNE suppliers and have a choice between a pure IT organisation or one which is a telecoms aware IT provider.”
Moving to a partnership model meant BT did not have to build/add more infrastructure or undertake the difficult, time consuming and complex task of integrating new systems into the existing estate. With the benefit of a partner that understands its aims and supplies experienced staff, plus a suite of unique product options, this relationship has helped BT launch Fusion into the SME market very fast. Buckley estimates that working with the managed services model shaved nine months off the launch time. “It would have taken us between nine months and a year to launch working with our legacy environment, but working with Martin Dawes Systems' OSS/BSS and CRM solutions we did it in three months,” he says.
Having a strong brand image is not the only ingredient of success; it is also a matter of being able to react fast, differentiate services from competitors, offer choice and value added services which can be made personal. Operators must capitalise on market opportunities – a difficult task both for established telcos and new, often inexperienced entrants. Outsourcing or partnering with an MVNE is often the quickest and most reliable route to market since it avoids the need to build, run and manage systems and adds OSS/BSS solutions and expertise.
Acting as an experienced intermediary between the network provider and the MVNO, Martin Dawes Systems manages and runs the required operational systems to react quickly and deliver multimedia products. Depending on the contract, it also handles relationships with all the third party suppliers including the content providers so essential in converged services.
Exploiting all the flexibility of its next generation network, Martin Dawes Systems supports MVNOs in testing, launching or tearing down new converged services fast. Tariffs can be changed equally quickly and discounts applied as appropriate, with different billing models offered to selected customers. “As we have provided a convergent architecture for years, we can deliver services very fast and connect to wireless and wireline networks seamlessly,” says Wilson. “We deliver complex services to corporate and SME customers and, by employing the managed services model, act as an IT department.”
Martin Dawes Systems reduces risks and, by helping MVNOs to maximise capital investments and compete effectively, speeds up the Return on Investment (ROI). MVNOs are therefore free to concentrate on core competencies, brand management and adding value in the form of innovative converged services, customised and targeted at specific users.
According to Yankee Group statistics, the MVNO market will generate service revenues of $10.7 billion by 2010 and British regulator Ofcom estimates that already they account for 5.5 million UK phone contracts.
Running and managing multimedia services depends on linking both the network OSS and customer facing BSS systems, which is a challenge best met by automation and a move from proprietary to open standard OSS/BSS estates.
While it is a slow, expensive process, the big operators are creating service oriented architectures and building next generation networks based on open standards. They are restructuring internal processes/systems and putting customers at the heart of their businesses.
However, for smaller operators, new entrants or even for incumbents wanting to move fast into new markets, rather than building their own systems or buying off the shelf and then customising, perhaps the quickest, most cost effective option is to outsource to an MVNE.
“From an IT perspective, operators need managed, secure data which is easy to manipulate and made available to customers easily. They need service oriented architectures, Java and open standard OSS/BSS systems. They need marketing, customer relationship management systems and everything in between,” explains Wilson.
“Convergence sits between two camps – in both the OSS and BSS environments,” he says. “From an OSS perspective it is delivered over intelligent networks. In the BSS space, operators must be able to bill for converged services, and contact centre agents must respond fast and accurately to customers' questions.”
Among other processes, convergence affects billing systems. Traditionally, mobile pre and post-paid customers were managed on separate, independent platforms with the former handled in the BSS environment and post-paid billing supported by IT departments. In converged networks both are handled on the same billing platform managed by staff with both IT and telecoms expertise.
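The converged model described above, with pre- and post-paid subscribers rated on a single platform, can be sketched as one shared rating path that differs only in how the charge is settled. The sketch below is purely illustrative: the names, tariff and structure are hypothetical, not part of Martin Dawes Systems' actual dise3G API.

```python
# Illustrative sketch of a converged rating engine: one rating path,
# two settlement modes. All names and the flat tariff are hypothetical.
from dataclasses import dataclass, field

RATE_PER_MIN = 0.10  # flat illustrative tariff, currency units per minute

@dataclass
class Account:
    msisdn: str
    prepaid: bool
    balance: float = 0.0  # real-time balance, used for prepaid accounts
    invoice_items: list = field(default_factory=list)  # accrued postpaid charges

def rate_call(account: Account, minutes: float) -> bool:
    """Rate a call on the shared platform, then settle per account type."""
    charge = minutes * RATE_PER_MIN
    if account.prepaid:
        if account.balance < charge:
            return False  # insufficient balance: reject in real time
        account.balance -= charge  # decrement balance immediately
    else:
        account.invoice_items.append(charge)  # accrue for the periodic bill
    return True
```

The point of the single `rate_call` path is the one the article makes: the tariff logic is shared, so a tariff change or a new bundled service is configured once rather than separately in a prepaid IN platform and a postpaid IT billing stack.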
At the heart of Martin Dawes Systems' product offering is the dise3G pre-integrated end-to-end billing and CRM solution that handles multi-service, multi-subscription pre and post-paid billing on the same platform, making it quick and easy for operators to launch multimedia services. Operators not only use the system to manage all aspects of the customer relationship fast and economically via self care features, but also to run critical business processes including sales, marketing, order management, rating and revenue assurance. The open standard CPP billing and CRM solution provides telcos with all the flexibility and control needed to support converged services, different billing models and customised solutions.
Unconcerned about underlying technologies or the considerable complexities, costs and challenges of moving to next generation networks and services, end users are most interested in price, convenience and the quality of service. The point is to deliver better products and make them easy for people to use – convergence is also about simpler, better, end user experience.
“Convergence is the future,” believes Buckley. “Business users are starting to see real benefits as we move away from simple connectivity into bundled minutes delivered to any device over any network. Connectivity will be fundamental, but services will become increasingly important and we will work with Martin Dawes Systems for billing, billing analysis and related solutions.
“One of the most positive aspects of the relationship between the two companies,” he notes, “is that as BT realises its long term plan of moving all products, services and customers onto its 21st century network, it has a supportive, expert and flexible partner.”
That the relationship is strong and mutually beneficial is evidenced by the fact that Martin Dawes Systems won the top prize for best billing and OSS implementation at this year's World Billing Awards for its work on the SME version of BT Fusion. Similarly BT has won a number of industry accolades.
In addition to its high profile work with BT, Martin Dawes Systems works with telcos large and small, new or old, fixed or mobile, and with ISPs running circuit switched or IP networks. For those MVNOs new to the business it has developed its 'telco in a box' solution, which has all the OSS/BSS and other systems required to switch on, deliver and bill services. “We offer an end-to-end platform from account activation through to customer care and billing for convergent products,” explains Wilson, who believes the goal is to support operators' changing requirements and give them an efficient, flexible platform to move forward into converged services.
Choosing an MVNE is a decision to create a close and long term partnership based on trust – it is about sharing risks, increasing customer numbers and reducing churn.
Priscilla Awde is a freelance communications writer