Q&A

As users become increasingly intolerant of poor network quality, Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks, tells Priscilla Awde that, in order to meet the huge demand for speed and efficiency, the whole industry is heading in the same direction - creating an all IP Ethernet core using MPLS to prioritise packets regardless of content

Speed, capacity, bandwidth, multimedia applications and reliable any time, anywhere availability from any device - tall orders all, but these are the major issues facing every operator whether fixed or mobile. Meeting these needs is imperative given the global telecoms environment in which providing consistently high quality service levels to all subscribers is a competitive differentiator. There is added pressure to create innovative multimedia services and deliver them to the right people, at the right time, to the right device but to do so efficiently and cost effectively.

Operators are moving into a world in which they must differentiate themselves by the speed and quality of their reactions to rapid and global changes. Networks must become faster, cheaper to run and more efficient, to serve customers increasingly intolerant of poor quality or delays. It is a world in which demand for fixed and mobile bandwidth hungry IPTV, VoD and multimedia data services is growing at exponential rates leaving operators staring at a real capacity crunch.

To help operators transform their entire networks and react faster to demand for capacity and greater flexibility, Ericsson has created a Full Service Broadband initiative which marries its considerable mobile capabilities with similar expertise in fixed broadband technologies. With the launch of its Carrier Ethernet portfolio, Ericsson is leveraging the strength of the Redback acquisition to develop packet backbone network solutions that deliver converged applications using standards based IP MPLS (Multi Protocol Label Switching), and Carrier Ethernet technologies.

Committed to creating a single end-to-end solution from network to consumer, Ericsson bought Redback Networks in 2007, thereby establishing the foundation of Ericsson's IP technology and, most importantly, acquiring its own router and IP platform on which to build its next generation converged solution.

In the early days of broadband deployment, subscriber information and support were centralised, the amount of bandwidth used by any individual was very low and most users were happy with best effort delivery. All that changed with growth in bandwidth hungry data and video applications, internet browsing and consumer demand for multimedia access from any device. The emphasis is now on providing better service to customers and faster, more reliable, more efficient delivery. For better control, bandwidth and subscriber management plus content are moving closer to customers at the network edge.

However, capacity demand is such that legacy systems are pushed to the limit handling current applications, let alone future services, while still guaranteeing quality of service. Existing legacy systems are inefficient and expensive to run and maintain compared to the next generation technologies that transmit all traffic over one intelligent IP network. Nor do they support the business agility or subscriber management systems that allow operators to react fast to changing markets and user expectations.

Despite tight budgets, operators must invest to deliver and ultimately to save on opex. They must reduce networking costs and simplify existing architectures and operations to make adding capacity where it is needed faster and more cost effective.

The questions are: which are the best technologies, architectures and platforms and, given the current economic climate, how can service providers transform their operations cost effectively? The answers lie in creating a single, end-to-end intelligent IP network capable of efficiently delivering all traffic regardless of content and access devices. In the new IP world, distinctions between fixed and mobile networks, voice, video and data traffic and applications are collapsing. Infonetics estimates the market for consolidating fixed and mobile networks will be worth over $14 billion by 2011 and Ericsson, with Redback's expertise, is uniquely positioned to exploit this market opportunity.

Most operators are currently transforming their operations and, as part of the solution, are considering standards based Carrier Ethernet as the broadband agnostic technology platform. Ethernet has expanded beyond early deployments in enterprise and Metro networks: Carrier Ethernet allows operators to guarantee end-to-end service quality across their entire network infrastructure, enforce service level agreements, manage traffic flows and, importantly, scale networks.

With roots in the IT world where it was commonly deployed in LANs, Ethernet is fast becoming the de facto standard for transport in fixed and mobile telecoms networks. Optimised for core and access networks, Carrier Ethernet supports very high speeds and is a considerably more cost effective method of connecting nodes than leased lines. Carrier Ethernet has reached the point of maturity where operators can quickly scale networks to demand; manage traffic and subscribers and enforce quality of service and reliability.
 

"For the first time in the telecoms sector we now have a single unifying technology, in the form of IP, capable of transmitting all content to any device over any network," explains Simon Williams, Senior VP Product Marketing and Strategy at Redback Networks, an Ericsson company. "The whole industry is heading in the same direction: creating an all IP Ethernet core using MPLS to prioritise packets regardless of content.
 

"In the future, all operators will want to migrate their customers to fixed/mobile convergent and full service broadband networks delivering any service to any device anytime, but there are a number of regulatory and standards issues which must be resolved. Although standards are coming together, there are still slightly different interpretations of what constitutes carrier Ethernet and discussions about specific details of how certain components will be implemented," explains Williams.

Despite debates about different deployment methods, Carrier Ethernet and MPLS-ready solutions are being integrated into current networks, and Redback has developed one future-proof box capable of working with any existing platform.

Expert in creating distributed intelligence and subscriber management systems for fixed operators, and now for mobile carriers, Redback builds solutions that are both backward and forward compatible and can support any existing platform, including ATM, Sonet, SDH or frame relay. Redback is applying its experience in broadband fixed architectures to solving the capacity, speed and delivery problems faced by mobile operators. As the amount of bandwidth per user rises, the management of mobile subscribers and data is being distributed, much as happened in the fixed sector.

Redback has developed SmartEdge routers and solutions to address packet core problems and operators' needs to deliver more bandwidth reliably. SmartEdge routers deliver data, voice or video traffic to any connected device via a single box connected to either fixed or mobile networks. Redback's solutions are designed to give operators a gradual migration path to a single converged network which is more efficient and cost effective to manage and run.

In SmartEdge networks with built-in distributed intelligence and subscriber management functionality, operators can deliver the particular quality of service, speed, bandwidth and applications appropriate to individual subscribers.

Working under the Ericsson umbrella and with access to considerable R&D budgets, Redback is expanding beyond multiservice edge equipment into creating metroE solutions, mobile backhaul and packet LAN applications. Its new SM 480 Metro Service Transport is a carrier class platform which can be deployed in fixed and mobile backhaul and transport networks and in Metro Ethernet infrastructure, and used to aggregate access traffic. Supporting fixed/mobile convergence, the SM 480 is a cost effective means of replacing legacy transport networks and migrating to IP MPLS Carrier Ethernet platforms. The system can be used to build packet based metro and access aggregation networks using any combination of IP, Ethernet or MPLS technologies.

Needing to design and deliver innovative converged applications quickly to stay competitive, operators must build next generation networks. Despite the pressures on the bottom line, most operators see the long-term economic advantages of building a single network architecture. Moving to IP MPLS packet based transmission and Carrier Ethernet creates a content and device agnostic platform over which traffic is delivered faster, across a future-proof network. Operators realise the cost and efficiency benefits of running one network in which distinctions between fixed and mobile applications are eliminated.

Although true convergence of networks, applications and devices may be a few years away, service providers are deploying the necessary equipment and technologies. IP MPLS and carrier Ethernet support both operators' needs for speed, flexibility and agility and end user demand for quality of service, reliability and anywhere, anytime, any device access.
 

"Ultimately however, there should be less focus on technology and more on giving service providers and their customers the flexibility to do what they want," believes Williams. "All operators are different but all need to protect their investments as they move forward and implement the new technologies, platforms and networks. Transformation is not only about technology but is all about insurance and investment protection for operators ensuring that solutions address current and future needs."

Priscilla Awde is a freelance communications journalist

As financial turmoil rampages across the world's markets, Professor Janusz Filipiak, founder and chief executive of OSS/BSS provider Comarch, tells George Malim that he sees great opportunity as carriers seek to streamline their operations and get to grips with new business models, services and the complex new telecoms value chain

Comarch, the Polish IT solutions provider, has been developing OSS/BSS systems for telecoms since 1993 and now provides a portfolio of systems and managed services to incumbent, broadband and triple play operators as well as MVNOs/MVNEs and start-ups. With a turnover of €170m, more than 3,000 employees and a customer roster that includes T-Mobile Germany and Austria, Bouygues Telecom France, O2 Germany and Polkomtel and PTC in Poland, the company has enjoyed a 33 per cent increase in turnover during the last five years. As the general economic crisis deepens, founder and chief executive, Professor Janusz Filipiak, thinks vendors will have to chase harder and act more cleverly to win deployments.
"Now all companies have to be mean and lean in the recession" he says. "We are very cost minded and every bit that is not needed is removed. You can't come to carriers with a higher price than your competitors. IT engineers are now a global resource and want the same payment in China or the UK, for example, so we are in the same position as all vendors. We can't compete on price so we can only be more intelligent and more effective than others. In spite of the recession we must now continue to invest in developing new products."
Current financial market woes aside, Comarch is heavily focused on the mobile market and recognises the challenges faced by operators. "In today's world of telecommunications, mobile operators are faced with the challenges resulting from market saturation in the majority of countries," adds Filipiak. "Innovative product offerings and enhanced service levels are indispensable in order to gain new customers and prevent churn. Operators are searching for the Holy Grail of telco that will prevent ARPU from decreasing. As voice is still the ‘killer application', we see data and value added services as a fast growing market. Other trends are still ahead of us, such as strong market competition from global corporate customers seeking the best deals from global mobile groups."

Filipiak also sees great potential in currently non-mobile operators. "Keeping in mind that everything eventually goes mobile, we haven't forgotten the great potential of fixed broadband operators, cable TV providers and triple and quad play operators," he says. "We target different segments of the market while not focusing exclusively on a single one."
Pre-paid billing has been one of the major functions carriers have sought during the life of Comarch but, as bundled and flat-rate packages become more popular, Filipiak sees its emphasis waning. "Today there are not too many content services available but they will come," he says. "Video streaming will put new requirements on bandwidth and devices. It will be very resource consuming and will be charged via pay-per-use. The experience won't be very different to paying for bandwidth or connection time with voice. Pre-paid is a method of payment which is still the most popular for the youngest segments of users, but pre-paid is becoming less related to cheap prices - because those are achievable in post-paid models as well - than to a philosophy of ‘no contract, no obligation'."

Flat-rate offers will be harder to make business sense of. "Flat rate is only viable in a world with unlimited capacity," adds Filipiak. "Flat-rate packages make a difference in the final price of services but the introduction of real flat-rate, where everything is included and mobile access is a commodity like internet or electricity or gas, will lead to a weakening of pre-paid which will favour post-paid."

Filipiak sees the market moving in this direction. "We can see that many players are moving towards a mix of post-paid with a significant amount of free minutes, SMS and MMS in a bundle," he adds. "This offer is really close to an actual flat rate and assures stable revenue for providers as well as strong customer loyalty and a resulting decrease in churn. My mantra in telecoms is that customers now expect everything to be easy."

The emergence of mobile content and the move to data services put obvious pressure on carriers' systems and the telecoms revenue chain has become much more complex. Comarch has long been prepared for this shift, as Filipiak attests: "The revenue chain is more complex and an operator is now not the only one that provides the services delivered. Service ‘sponsoring', third-party service providers, resellers and service dealers introduce the need for multi-party billing and put more pressure on monitoring quality of offerings," he says. "Our solutions also address and deal with the complexity that content and data services bring in wholesale, next generation TV, content distribution, service creation and control. We address these needs through our InterPartner Billing solution. On the OSS side, we provide service level management and service level inventory, our flagship OSS products, which enable service modelling of resources and services provided by different parties along with pro-active quality monitoring and management."

Comarch has grown from its eastern European roots and now has operations in 30 countries and addresses operators of all sizes and types, as Filipiak explains: "The Comarch brand is recognised in the telecoms world," he says. "We've been in the industry for 15 years and time is now working for us. Our biggest customers for specialised OSS solutions are Tier 1 operators. Large operators with 10 million subscribers are customers for our InterPartner Billing and, when it comes to independent operators, we have about 30 per cent of the local market as clients for integrated BSS and OSS/BSS solutions. We also target the largest CATV and broadband operators offering convergent services. Our strategy also addresses global players where we can offer the best value, give good prices, still be flexible and deliver enterprise level services."

In spite of the general economic downturn, Filipiak still sees great opportunities emerging. One area is that of next generation mobile networks and self-optimising networks. "Such concepts will invite carriers to look for solutions outside the long established segments of OSS, such as Inventory Management, Configuration Management and Network Planning," he says. "It will not be sufficient to cover one area in the future; instead co-operation of the planning and operations areas will be needed, where we see an opportunity for us. In addition, carriers are now more oriented towards a loose coupling of functional modules and standard interfaces that make it easier for smaller players, like us."

New means of delivering solutions are also critical. "With our future proof architecture of solutions, we can address modern modularity concepts and tendencies that now exist in the market," adds Filipiak. "Openness and standard interfaces in high level OSS products are key, and customers can choose the best modules for their operations. This provides a possibility to reduce opex by utilising new business models for our customers, network virtualisation, distribution and outsourcing of operations and hosting solutions."

Regardless of the current economic gloom, Filipiak believes a new investment wave must come to the telecoms market. "Investment must happen because there will be greater demand," he says. "Physical travel will be a high cost so there will be more load on existing networks."

Carriers face massive challenges in spite of the increased demand for their network capacity and services. "In the international mobile groups, unification and co-operation issues are still of key importance in order to gain competitive advantage on the global market," he says. "Outsourcing of operations has become very popular but unsurprisingly it has turned out not to be a remedy for everything. Carriers still need to adapt their business processes and way of thinking to this new model. On the other hand, the need to reduce capex is forcing carriers to introduce scenarios of sharing physical resources, such as radio masts."
Filipiak also identifies additional challenges such as churn prevention, automatic client profiling and concentrated web-based marketing campaigns, as issues carriers will need to address.

Winning business from the large carrier groups against this backdrop is, without doubt, difficult.

"International groups are certainly challenging customers" admits Filipiak. "National companies differ in software environments, processes and levels of maturity as well as corporate and national culture. They therefore require a flexible approach in implementation strategy and software functionality and look for a common architecture for their network as well as their IT systems. Such carriers pay a lot of attention to building up corporate standards at the services level and business process levels in order to achieve a common view."

Good products, knowledge and proven experience are the ways to win this type of business. "No PowerPoint slide solutions can be sold anymore," adds Filipiak. "It takes a lot of time and money but these are the only ways to win contracts with groups."

However, winning such business is never achieved on a static battlefield. Carrier consolidation continues and that can be both a threat and an opportunity for solutions vendors. "On one hand, it is difficult because some groups will enforce product choices at the global level, and it may be more difficult for Comarch to gain a global recommendation in a large group since we have to fight for our portion of the market with much stronger players," says Filipiak. "On the other hand, consolidation forces carriers to unify their OSS/BSS landscapes and this is a good opportunity to change long-established solutions for something new and fresh. Heterogeneous environments of global groups with plenty of flavours in different countries require a great level of flexibility that Comarch can provide. We already have positive experience with such projects, for example with T-Mobile, which enables us to be optimistic for the future."
 

"Ultimately, we must live with the situation" adds Filipiak. "We're a service company and it's not our job to comment or expect specific customers to behave in any particular way. The level of consolidation is already very high so we may not see much more, in any case."
It's not only carrier consolidation that presents challenges to vendors, though. Carriers are at different stages of their business and that places a development burden on all vendors as they seek to develop systems applicable to individual carrier needs.

"Comarch builds its solutions for different segments of the telco market," says Filipiak. "We offer both pre-integrated solutions for small business, such as an integrated BSS/OSS solutions for an MVNO, and complex solutions tailored specifically for the needs of large players. We have frameworks and modules of software but we've never sold it without adaptation. In the end, it is always a construction job. You have modules but ultimately you must put them together in different ways."

Comarch has grown organically since its inception in 1993 and has shunned much of the mergers and acquisition activity that has occurred in the OSS/BSS sector in recent years. "Our product portfolio follows unified design principles and is not the result of an acquisition of missing parts," explains Filipiak. "This gives us the possibility to offer seamlessly integrated solutions and products that complement each other while not being redundant in functionality."

Inevitably, for all rules there are exceptions, and Comarch has recently announced an agreement to acquire 50.15 per cent of the Frankfurt listed SoftM Software und Beratung AG in a transaction that could exceed €22m. The German software producer and systems integrator employs 420 personnel and supplies more than 4,000 customers.

Filipiak is open to further moves although they will be well considered. "Acquisition, yes, but only in a way that we can handle along with continued organic growth. There will be no miracle from us, just steady organic growth."

Filipiak also rejects any notion of selling the company. "The company isn't for sale. My family has a controlling stake and I'm not going to sell now. The company's value is increasing and the scope of the business grows every day."

George Malim is a freelance communications journalist

ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde

Mobile operators everywhere are facing something of a conundrum which goes like this: in saturated markets they must increase revenues from high margin data services, but these are typically bandwidth hungry applications, resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings at work and at home. As people exploit the benefits of the big new touch screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.

However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks, which poses big questions for operators: how can they get broadband 3G networks inside to provide reliable, cost effective in-building coverage? How can they do it fast, without significant and expensive investment in macro networks, and give customers access to the applications they want at prices they are willing to pay?
Fortunately ip.access has the answers since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.

Picocells and femtocells plug directly into existing fixed broadband networks be they DSL, cable, satellite or even WiMax. Acting as mini-base stations, both can be quickly installed anywhere in buildings or outside to bring networks to where the demand is.

These plug and play units have advantages for everyone. For users, both professional and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high margin value added services.

For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes, ships and other remote locations where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users and companies can dot them around the organisation to provide connections where needed.

Solving the problem for residential users, the Oyster3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, family members can get seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells will be up and running by spring 2009.

"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."
The difference between the two is one of size, power, capacity, functionality, price and target audience. However both allow operators to add capacity cost effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated and people previously unable to get mobile signals can be connected to high-speed networks.

"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.

"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."

Plugging mini-base stations into the existing broadband connections takes indoor data sessions off the macro network, raising the number of users each site can support and increasing its capacity beyond the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidised femtocells to 20% of its subscribers' homes. Similarly, research firm Analysys-Mason calculates the annual cost saving per customer for a large operator deploying 3G femtocells is between $6 and $12.

Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case even if they are fully subsidised. Even ignoring the cost savings, it takes only an €11 per month increase in ARPU across one household to cover the cost of fully subsidising a femtocell.
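Those figures are easy to sanity-check. The back-of-envelope sketch below reruns the arithmetic; only the €100 unit cost is an assumption for illustration, while every other number is quoted above.

```python
# Back-of-envelope femtocell economics using the figures quoted above.
# Only the €100 unit cost is assumed for illustration.

subscribers = 10_000_000          # operator size quoted by ip.access
femto_share = 0.20                # femtocells deployed to 20% of homes
total_saving_eur = 500_000_000    # claimed saving over four years
years = 4

homes = subscribers * femto_share
saving_per_home = total_saving_eur / homes                        # €250
saving_per_sub_per_year = total_saving_eur / subscribers / years  # €12.50

assumed_unit_cost_eur = 100       # hypothetical fully subsidised CPE cost
arpu_uplift_per_month = 11        # the €11/month uplift quoted above
payback_months = assumed_unit_cost_eur / arpu_uplift_per_month    # ~9

print(f"Saving per femtocell home over four years: €{saving_per_home:.0f}")
print(f"Saving per subscriber per year: €{saving_per_sub_per_year:.2f}")
print(f"Months of €11 uplift to cover one unit: {payback_months:.1f}")
```

The implied saving of €12.50 per subscriber per year lands in the same ballpark as Analysys-Mason's $6 to $12 estimate, and an €11 monthly uplift covers a fully subsidised unit in well under a year.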

Operators are seeing an explosion in mobile data usage (in the UK, 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.

According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011 and IDC expects consumer spend on femtocell-enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage is at home and the ‘digital generation' either does, or wants to, watch mobile television at home.

As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."

Operators are urgently addressing the capacity problems such increases in data usage imply. Some are capping monthly unlimited data plans while others encourage content developers to limit application bandwidth. Neither measure is likely to be popular with users and both may increase churn - which only strengthens the consumer proposition for deploying picocells and 3G femtocells.

While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise and operators can introduce new, high margin services to the handsets people already have.

They can encourage both usage and fixed mobile substitution by offering FemtoZone services which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be automatically sent to absent parents to notify them children are back; podcasts, videos or images can be loaded to phones or targeted advertising sent to interested users.

"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.

"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."

The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.

There are plans to include the new Iuh standard in Release 8 of the 3GPP specifications, due out in December. It will replace the numerous different ways in which femtocells currently connect to networks and proprietary systems, and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors and so can choose customer premises equipment (CPE) separately from the gateway.
Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.

Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems. 

Over the five years they have been used in enterprises, the advantages of the nanoGSM are well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. With an indoor range of up to 200 metres, it backhauls traffic through existing IP networks and supports fast data rates over GPRS and EDGE to devices such as Blackberries. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.

Like its bigger cousin but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the 3G femtocell ease of use, price and installation challenges necessary to meet consumer needs, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells to both large enterprises and to SMEs through their existing channels.

"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."
It's a truism that everything is going mobile and operators are already feeling the capacity pinch. Pico and femtocells give them a cost effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.

Today's smart phones are as powerful and feature rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.

Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles and share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."

There's a stark dynamic at work in the telecoms Operations Support Systems (OSS) market. Until recently networks were expensive while the price tags for the OSS systems used to assure the services running across them were, by comparison, puny. Today that's all changed - not because OSS systems have become significantly more costly, but because network components are a fraction of the capital cost they were 15 years ago. The result is an apparent cost disparity that may be causing some operators to swallow hard and think about putting off their OSS investments, Thomas Sutter, CEO of Nexus Telecom, tells Ian Scales. That would be a huge mistake, he says, because next generation networks actually need more OSS handholding than their predecessors, not less

Naturally, Thomas has an interest. Nexus Telecom specializes in data collection, passive monitoring and network and service investigation systems and, while Nexus Telecom's own sales are still on a healthy upswing (the company is growing in double figures), he's growing increasingly alarmed at some of the questions and observations he's hearing back from the market.

"There is a whole raft of issues that need exploring around the introduction of IP and what that can and can't do," he says. "And we need to understand those issues in the light of the fundamental dynamics of computer technology. I think what's happening in our little area of OSS is the same as what tends to happen right across the high technology field. As the underlying hardware becomes ten times more powerful and ten times as cheap, it changes the points of difference and value within competing product sets."

If you go back and look at the PC market, says Thomas, as you got more powerful hardware, the computers became cheaper but more standard and the real value and product differentiation was, and still is, to be found in the software.

"And if you look at the way the PC system itself has changed, you see that when microcomputers were still fairly primitive in the early 1980s all the processor power and memory tended to be dedicated to the actual application task - you know, adding up figures in a spreadsheet, or shuffling words about in a word processor. But as PC power grew, the excess processing cycles were put to work at the real system bottleneck: the user interface. Today my instincts tell me that 90 per cent of the PC's energy is spent on generating the graphical user interface.

"Well, I think it's very similar in our field. In other words, the network infrastructure has become hugely more efficient and cost effective and that's enabled the industry to concentrate on the software. And the industry's equivalent of the user interface, from the telco point of view at least, is arguably the OSS. You could even argue that the relative rise in the cost of OSS is a sign that the telecoms market as a whole is maturing."

That makes sense, but if that's the case, what are these other issues that make the transformation to IP and commodity network hardware so problematical from an OSS point of view?
"There's a big problem over perceptions and expectations. As the networks transform and we go to 'everything over IP', the scene starts to look different and people start to doubt whether the current or old concepts of service assurance are still valid. "So for example, people come to our booth and ask, 'Do you think passive probe monitoring is still needed?  Or even, is it still feasible?  Can it still do the job?' After all, as the number of interfaces decrease in this large but simplified network, if you plug into an interface you're not going to detect immediately any direct relationships between different network elements doing a telecom job like before, all you'll see is a huge IP pipe with one stream of IP packets including traffic from many different network elements and what good is that? "And following on from that perception, many customers hope that the new, big bandwidth networks are somehow self-healing and that they are in less danger of getting into trouble. Well they aren't.  If anything, while the topological architecture of the network is simplifying things (big IP pipes with everything running over them), the network's operating complexity is actually increasing." As Thomas explains, whenever a new technology comes along it seems in its initial phases to have solved all the problems associated with the last, but it's also inevitably created new inefficiencies. "If you take the concept of using IP as a transport layer for everything, then the single network element of the equation does have the effect of making the network simpler and more converged and cost effective. But the by-product of that is that the network elements tend to be highly specialized engines for passing through the data  - no single network element has to care about the network-wide service." So instead of a top-down, authoritarian hierarchy that controls network functions, you effectively end up with 'networking by committee'. And as anyone who has served on a committee knows, there is always a huge, time-consuming flow of information between committee members before anything gets decided.  So a 'flat' IP communications network requires an avalanche of communications in the form of signaling messages if all the distributed functions are to co-ordinate their activities. But does that really make a huge difference; just how much extra complexity is there? "Let's take LTE [Long Term Evolution], the next generation of wireless technology after 3G. On the surface it naturally looks simpler because everything goes over IP. But guess what? When you look under the bonnet at the signaling it's actually much more complicated for the voice application than anything we've had before. "We thought it had reached a remarkable level of complexity when GSM was introduced. Back then, to establish a call we needed about 11 or 12 standard signaling messages, which we thought was scary. Then, when we went into GPRS, the number of messages required to set up a session was close to 50.  When we went to 3G the number of messages for a handover increased to around 100 to set up a standard call. Now we run 3GPP Release 4 networks (over IP) where in certain cases you need several hundred signaling messages (standard circuit switching signaling protocol) to perform handovers or other functions; and these messages are flowing between many different logical network element types or different logical network functions. 
"So yes of course, when you plug in with passive monitoring you're probably looking at a single IP flow and it all looks very simple, but when you drill down and look at the actual signaling and try to work out who is talking to who, it becomes a nightmare. Maybe you want to try to draw a picture to show all this with arrows - well, it's going to be a very complex picture with hundreds of signaling messages flying about for every call established. "And if you think that sort of complexity isn't going to give you problems:  one of my customers - before he had one of our solutions I hasten to add - took  three weeks using a protocol analyzer to compile a flow chart of signaling events across his network. You simply can't operate like that - literally. And by the way, keep in mind that even after GSM networks became very mature, all the major operators went into SS7 passive monitoring to finally get the last 20 per cent of network optimization and health keeping done. So if this was needed in the very mature environment of GSM, what is the driver of doubting it for less mature but far more complex new technologies? ''
Underpinning a lot of the questions about OSS from operators is the cost disparity between the OSS and the network it serves, says Thomas. "Today our customers are buying new packet switched network infrastructure and to build a big network today you're probably talking about 10 to 20 million dollars. Ten or 15 years ago they were talking about 300 to 400 million, so in ten years the price of network infrastructure has come down by a huge amount while network capacity has actually risen. That's an extraordinary change. 
"But here's the big problem from our point of view.  Ten years ago when you spent $200 million on the network you might spend $3 million on passive probe monitoring.  Today it's $10 million on the network and $3 million on the passive probing solution. Today, also, the IP networks are being introduced into a hybrid, multiple technology network environment so during this transition the service assurance solution is getting even more complex. "So our customers are saying, ‘Hey!  Today we have to pay a third of the entire network budget on service assurance and the management is asking me, 'What the hell's going on?' How can it be that just to get some quality I need to invest a third of the money into service assurance?' "You can see why those sorts of conversations are at the root of all the doubts about whether they'll now need the OSS - they're asking: 'why isn't there a magic vendor who can deliver me a self-healing network so that I don't have to spend all this money?" Competitive pressures don't help either. "Today, time-to-market must be fast and done at low cost," says Thomas, "so if I'm a shareholder in a network equipment manufacturing company and they have the technology to do the job of delivering a communication service from one end to the other, I want them to go out to the market.  I don't want them to say, 'OK, we now have the basic functionality but please don't make us go to the market, first can we build self-healing capabilities, or built-in service assurance functionality or built-in end-to-end service monitoring systems - then go to the market?'  This won't happen." The great thing about the 'simple' IP network was the way it has commoditized the underlying hardware costs, says Thomas. "As I've illustrated, the 'cost' of this simplicity is that the complexity has been moved on rather than eliminated - it now resides in the signaling chatter generated by the ad hoc 'committees' of elements formed to run the flat, non-hierarchical IP network. "From the network operator's point of view there's an expectation problem: the capital cost of the network itself is being vastly reduced, but that reduction isn't being mirrored by similar cost reductions in the support systems.  If anything, because of the increased complexity the costs of the support systems are going up. "And it's always been difficult to sell service assurance because it's not strictly quantitative. The guy investing in the network elements has an easy job getting the money - he tells the board if there's no network element there's no calls and there's no money. But with service assurance much more complicated qualitative arguments must be deployed. You've got to say, 'If we don't do this, the probability is that 'x' number of customers may be lost. And there is still no exact mathematical way to calculate what benefits you derive from a lot of OSS investment."
The problem, says Thomas, is as it's always been: building the cloud of network elements - the raw capability, if you like - is always the priority, and what you do about ensuring there's a way of fixing the network when something goes wrong is always secondary.

"When you buy, you buy on functionality. And to be fair it's the same with us when we're developing our own products. We ask ourselves, what should we build first? Should we build new functionality for our product or should we concentrate on availability, stability, ease of installation and configuration? If I do too much of the second I'll have fewer features to sell and I'll lose the competitive battle.

"The OSS guy within the operator's organization knows that there's still a big requirement for investment, but for the people in the layer above it's very difficult to decide - especially when they've been sold the dream of the less complex architecture. It's understandable that they ask: 'Why does it need all this investment in service assurance systems when it was supposed to be a complexity-buster?'"

So on each new iteration of technology, even though they've been here before, service providers have a glimmer of hope that 'this time' the technology will look after itself. We need to look back at our history within telecoms and take on board what actually happens.

The promise of IPTV is fraught with dangers - from outages to poor quality pictures - but effective systems test and measurement could save the day.  Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde

Imagine this: virtually the whole nation settling in to watch the rugby world cup or European soccer final and the television picture goes down for thirty minutes or freezes just as the home side is about to score. The picture may flicker at critical times, the sound may be unsynchronised, or users may be unable to change channels quickly and efficiently. Perhaps the latest film released on Video on Demand (VOD) can be paid for but not downloaded, or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, has no need to imagine. Robert Winters, Chief Marketing Officer at Shenick Network Systems, predicts a riot should any one of these all too feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of any outages and expect picture perfect television. Guaranteeing quality of service and of individual subscribers' experiences is, however, a major and often underestimated challenge for all service providers, but especially in the IPTV environment, where lost packets, jitter and latency, combined with poor network architecture and an inability to scale, will all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications and are scalable. The cable and satellite operators have also negotiated long standing agreements with major studios and other content owners.
Alan Robinson, CEO at Shenick, suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video/television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best effort approach to service levels is no longer good enough in the IPTV world, where packet losses have become more serious than ever. User expectations have risen with exposure to digital video consumer electronic equipment and DVDs, which provide high quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators, and they have already delayed the roll out of some early commercial IPTV launches. Others involve more technical issues, including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
 "It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployment of DSL broadband applications, in which the quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss or interference with IPTV signals but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the best and right equipment, carrying out pre and post deployment and real-time network monitoring and testing are essential," explains Winters. "Operators must continually test the quality of subscribers' experience and monitor service assurance to deliver the best possible results."
Among the old but significant factors affecting service levels are the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will, at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture will be able to support and demonstrate how the existing network will react to application loads both at launch and going forward.
The constant challenges of transmitting next generation services over legacy architecture are the ability to scale and, ultimately, performance - problems that must be addressed at the earliest stages of IPTV launches.
 "Prior to deployment operators must decide which vendor to use for IPTV; which set top boxes; DSLAM equipment; network components; routers; switches; core transport and encoders, among others, they will use," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues; the performance limitations? Will the network need to be re-architected to provide more bandwidth or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system, diversifEye 4.0, gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions, such as network surges, so operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
 "We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
Testing early and often is a mantra operators ignore at their peril, since it is difficult to debug problems in live IPTV deployments. Consistently low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
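The scores themselves are typically derived from transport-layer measurements. One widely used example - not necessarily what any particular operator's monitoring centre runs - is the Media Delivery Index of RFC 4445, which combines a Delay Factor (how many milliseconds of buffer a decoder needs to absorb jitter) with a Media Loss Rate (lost packets per second). The following simplified sketch, with made-up example numbers, shows the calculation:

def media_delivery_index(arrivals_s, packet_bits, stream_rate_bps, lost, interval_s):
    """arrivals_s: packet arrival times (seconds) within one measurement interval."""
    vb = 0.0                     # virtual decoder buffer level, in bits
    vb_min = vb_max = 0.0
    last = arrivals_s[0]
    for t in arrivals_s:
        vb -= (t - last) * stream_rate_bps   # buffer drains at the nominal rate
        vb_min = min(vb_min, vb)
        vb += packet_bits                    # each arriving packet refills it
        vb_max = max(vb_max, vb)
        last = t
    delay_factor_ms = 1000.0 * (vb_max - vb_min) / stream_rate_bps
    media_loss_rate = lost / interval_s
    return delay_factor_ms, media_loss_rate

# Example: a 3.75 Mbps SD stream, ~1356-byte payloads, mild jitter, 2 losses.
times = [i * 0.00289 + (0.0005 if i % 7 == 0 else 0.0) for i in range(300)]
df, mlr = media_delivery_index(times, 1356 * 8, 3_750_000, lost=2, interval_s=1.0)
print(f"MDI = {df:.1f}:{mlr:.0f}")   # reported in the conventional DF:MLR form

Real probes compute this continuously, per channel and per subscriber, feeding exactly the kind of dashboard a dedicated monitoring centre watches.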
Changing camera angles, choosing what and when to watch, or having on-screen 'chats' with friends are big drivers for IPTV, but most are bandwidth intensive. Equally, the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video/audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers, but operators are likely to see performance issues when they scale up to millions of customers. It is then that they may have to extend fibre deeper into the network, although fibre to the home/curb/node (FTTH/C/N) architectures are becoming cheaper and increasingly feasible, especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively, operators may turn to WiMAX as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea, for instance, access speeds of 100Mbps are commonly available and inexpensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system, better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VoD, which may raise more contention/capacity problems, especially if one person in the house is downloading a video whilst another is watching broadcast television.
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security, but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels, or generate viruses, making it important to test firewalls and simulate the effects of such attacks in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.

Priscilla Awde is a freelance communications journalist

End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist, Nexus Telecom tells Ian Scales

Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.

It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems and to verify service levels. Essentially, says CEO, Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors. What's changing, though, is the importance of such end-to-end transaction data.

Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly. Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access. It's an idea whose time may have come. According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.

As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself.

There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way. Sutter says some are, but some are still grappling with the concepts.

"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."

This misses the point in a number of ways, claims Sutter.

"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple - in fact it's rather the other way about. The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work."

And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.

"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that. Telcos need to harness network data - I call them 'transactions' - to develop their businesses."

Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.

"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."

So end-to-end transaction data is important and will grow in importance. How does Nexus Telecom see itself developing with the market?

"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis. This tells you what's happening so you can plan network capacity and so on. But these systems never, ever go to layer 7 and tell you about transaction details - we can.

"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems. Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."

So given that telcos are recognising the need to gather and store, what's the future OSS transaction record architecture going to look like?

"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers. So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation. Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies - the marketing people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."

Does he see his company going 'up the stack' to tackle some of these applications in the future?

"It is more important to have open interfaces around this layering. We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."

Sutter thinks the supplier market is already evolving in a way that makes sense for this model.

"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses. We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network. Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."

So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers.

"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution. "You would come along and ask, 'Hey guys, can you give me access to your database?', the response would be 'Woah, don't touch my database. If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces. One piece will collect the data, correlate and aggregate it, the second will run the application and the presentation to the user. At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join so that data may be stored in an open data to the repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best in class products for collecting the data and presenting it and this will be a good thing for the entire industry. After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always loose the whole data collection and repository for that application at the same time?"

With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?

"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."

But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like 'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."

So where can Nexus Telecom go from here? Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?

"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring. But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”

Ian Scales is a freelance communications journalist.

Organisations like the TeleManagement Forum have a dilemma when it comes to anniversaries. TM Forum’s 20th birthday in 2009 will naturally be a cause for celebration: you don’t get such a long run in this business unless you’re doing something right. But there’s a nagging worry too. Can a successful first 20 years as a thought leader; framework and standards setter – mostly for telecoms operational and business support systems - serve as a basis for another 20 years setting frameworks for an industry that is turning rapidly into something else, as new players muster at its borders? Does the heritage help or hinder when it comes to refining a role in the rapidly converging telecom, media and Internet industries, where the new tends to be seen as ‘good’ and anything else is consigned as ‘legacy’?

LEAD INTERVIEW - Brave New World

For Keith Willetts, the TM Forum's original co-founder and current Chairman, it's a question that soon answers itself, once you apply a little deep thought to the matter.
“What's become really clear, over the past year or so especially, is that convergence is here – it's for real and we're really at the start of the process,” he says. “What you've got is three trillion dollar industries - media, Internet and telecom - all coming together. You just have to pick up a newspaper, listen to the news or, of course, surf the web to see that it's happening. Who's Virgin bought?  What services is Skype offering now? All that sort of thing. And over the coming years we're going to see far more of this mixing and matching – where a company strong in one field takes over or forms an alliance with a company that's strong in another.”
For Willetts it's a process that brings opportunities as well as threats.  One apparent threat for many in the telecom industry is that telecom becomes sidelined in many markets as a new breed of player moves in and takes over. This extreme scenario might involve powerful, highly capitalised Internet companies, such as eBay with IP telephony company Skype (which it bought in 2005) completely disrupting the traditional telephony market.
“I use Skype and I'm amazed at just how good the service is. I wonder to myself, 'why would you need anything else?' But,” admits Willetts, “the more likely scenario is that we'll end up with a real mix of companies which take elements from all three sectors.”
There lies the opportunity. Willetts thinks the TMF can provide the frameworks that integrate the players, just as it has hitherto provided frameworks to integrate telcos' disparate back-office systems. The challenge is to apply its expertise in a new way.
“What we've been good at is helping our members develop a lean end-to-end process environment – a set of frameworks and standards encapsulated in our NGOSS  (New Generation Operations Software and Systems) initiative that enables them to build flow-through business processes that cross the old internal demarcation lines that were, and often still are, such a feature in traditional telcos. Using NGOSS they can begin to join the dots between things like inventory, provisioning, service assurance and so on.”
What's clearly required in the new converged telecom-media-Internet world, he points out, is a similar set of guidelines at the inter-company level. “We are going to need standards and frameworks that reach beyond the company and the sector to automate things like content delivery, digital rights management and things we haven't even thought of yet.
“Of course, it's a huge area and there are a number of unresolved questions,” he says. “For example one specific conversation we've recently had within TMF has been around the possibility of defining a value chain. And we came to the conclusion that such a question presupposes we know who is going to be where in the chain. In fact, all we can actually say is that there will be value chains and there will be different people at different positions within them.  What we're looking at is the development of something more two- or even three-dimensional than a simple chain – it's probably better to think of these relationships forming something like a  'value web', where companies might sit in any one of several positions.  They might be undertaking one set of commercial roles in one territory and a different set in another.”
In fact, says Willetts, TMF as an organisation is keen to develop a role as an independent business and technical facilitator rather than being seen as the advocate of a particular, sector-specific, approach.  The reason is simple – the telecom industry itself won't exist as we know it five to ten years from now.  It's transforming, and as web and media companies are moving onto some of its traditional turf, telecoms itself is branching out into many new areas. 
 “It's important we aren't seen to be in the business of promoting any particular outcome,” claims Willetts.  “We want to be part of an environment where there can be a range of outcomes, shapes and services. The important thing is that user companies and providers can actually put the pieces together and have them work. It's a case of  'may the best man win'.”
So where exactly is the TM Forum running to?
First, TMF is inviting thought leaders from media and cable companies to join its Board in order to get a 360-degree view of emerging needs. Second, it's rapidly broadening its business and software vision to encompass the needs of information and content-based services and the myriad of virtual providers and value chain players. Third, collaboration with other bodies will be important and ongoing. For example, TMF recently struck a landmark deal with the Broadband Services Forum (BSF): a formal partnership under which relevant work is shared. In fact the members of each organisation will have influence over related technical work in the area of telecom-web convergence issues, and the first fruits of the collaboration will show up in a new TMF document entitled “Telecom Media Convergence Industry Challenges and Impact on the Value Chain”. The relationship will also contribute to more multimedia focused panels at TM Forum events, and future development of process standards for content management and convergent media-telecom operations.
 “One of the most exciting and fundamental things we're going to do is to develop what we're calling a ' super-catalyst', and we'll be kicking that project off at Nice this year.”
The TMF Catalysts are joint projects undertaken by members and sponsored by service providers, usually to demonstrate leading edge thinking on how to solve problems in integrating the back office, using approaches based on  TM Forum standards and guidelines. The results of these projects are demonstrated at TMF's TeleManagement World conferences in Nice and Dallas each year.
“The super-catalysts, which we're likely to call the Converged Services Showcases, will be really major events, involving media companies, device companies, cable TV, IPTV and mobile TV,” says Willetts. “The idea is to show a whole set of advanced service scenarios, but unlike what you'd see at a trade show - where you typically just see the thing working - with the super-catalyst you'll be able to walk around the back of this and be shown how it's actually being operated and controlled using standards and the various OSS and BSS systems involved.
 “It's at an early stage of development, but the general idea is that you go to the show floor and you see the equivalent of a town with houses and retail establishments and so on.  And you see all these services that you're getting and then you walk around the back to the network operations centre and you can see how it's all being managed.  It's a big leap.”
“We're working on not just a demonstration, but a real catalyst designed to flush out problems, what standards you need, and what bits you need to invent that you haven't thought of. The fact is that we don't know what the standards requirements are in some cases in the converged world yet, and that's why this super-catalyst is going to be a great vehicle for developing the whole area. It's going to be a major undertaking.”
The plan is for the first super-catalysts to debut later this year at the TMF's Dallas TMW. 
Nice TMW will be the start of the major change.  “What we want to show is that convergent services are here. So we have a very strong convergence message and a very strong illustration that TMF is responding.  There will be discussion about managing content-based and entertainment-based services and more involvement from media companies. For instance, for a meeting at Nice we've invited executives from Disney, Time-Warner and Virgin Mobile to join the table. The fact is that it's just as relevant for a senior executive at BT, say, to sit down with a Disney executive, as it is for the Disney guy to get to understand how the company can exploit the telecom space.”
 “For some of these players convergence will result in a partners' love-fest and for others it will be 'daggers drawn', as they realise they're going to be contesting the same space, but in the long run nobody knows who will be in which role at any one given point in time. TMF's role isn't to try and predict that.”
What about the core frameworks and standards generated by the TMF? Will these have to change markedly to accommodate the broader remit and the entry of new types of player into the value web?
“Yes, no doubt there will be changes as we go forward. One area that we're probably going to have to address in all our output is outsourcing. While our current guidelines intrinsically assist players to define and manage all their processes, so that outsourcing, where required, will be simpler to accomplish, it's also true to say that outsourcing isn't often specifically allowed for. I've just been to India to speak at a TMF event there, and what I heard there was really eye opening in terms of the way outsourcing is being used to reduce costs.
“At Bharti Airtel, one of the big mobile operators with 80 to 90 million subscribers, all the IT is outsourced and they operate at a cost level that a European mobile operator, for instance, can't even come close to.”
Willetts says he thinks that outsourcing and partnering arrangements are bound to become more complex and must be catered for in the back office in a fundamental way.
“For example BT might run an IPTV service in the UK using its infrastructure, and in Germany it might run a service on someone else's because it doesn't own any infrastructure there. But it will probably want to run the same brand and service.  The back office systems need to support that sort of thing.”
But the big question has to be asked. Isn't there a danger in all this for TMF? This is a member-driven organisation and it is energised by a core of highly motivated, mostly telecoms-oriented individuals who give, not just their companies' time, but often their own time and effort as well. Doesn't TMF run a big risk in realigning itself so radically?
Willetts is adamant: “What people sometimes don't understand is that it's not a question of: 'If you go and chase all these converging media and web companies, will you desert your core telecom membership in the process?'  That question forgets the fact that  telecom companies are, themselves, becoming multi-media companies.  So, the reality is, to be of maximum use to our core constituency, we need to run with them, not away from them.”

Ian Scales is a freelance communications journalist.

Technology companies come and go, but some are blessed with the foresight to help drive the technological developments that permeate all our lives. One such company is Micron, whose COO, Mark Durcan, tells Lynd Morley why it has been so successful

Lead interview – It's a vision thing

Future gazers abound in our industry, and we’re being promised a near-future of sensor networks and RFID tags that will control or facilitate everything from ordering the groceries, to personalised news projected into our homes or from our mobile phones. This stuff of science fiction, fast becoming science fact, is the visible, sexy end-result of the technology, but what about the guys working at the coal-face, actually producing the tools that enable the dreams to come true?
Micron Technology is one of the prime forces at that leading edge. Among the world’s leading providers of advanced semiconductor solutions, Micron manufactures and markets DRAMs, NAND Flash memory, and CMOS image sensors, among other semiconductor components and memory modules for use in computing, consumer, networking and mobile products. And Mark Durcan, Micron’s Chief Operating Officer, is confident that the company has been instrumental in helping the gradual realisation of the future gazers’ predictions.
“I do think that we are, in many ways, creating the trends, because we’ve created the technology which enables them,” he comments. “I can give you two prime examples. The first is in the imaging space where, for many decades, charge-coupled devices (CCDs) were the technology of choice for capturing electronic images – mostly because the image quality associated with CCDs was much better than that of CMOS images, which is what Micron builds today. 
“Nonetheless, we were strong believers that we could marry very advanced process technology, device design and circuit design techniques with the CMOS imager technology, and really create a platform that enabled a whole new range of applications. 
“I think we did that successfully,” he continues, “and the types of applications that were then enabled are really quite stunning. For instance, with CCDs you have to read all the bits out serially, so you can’t capture images very quickly. With CMOS imagers you can catch thousands of images per second, which then opens the door to a whole new swathe of applications for the imagers – from very high speed cameras, to electronic shutters that allow you to capture a lot of images, and, by the way, you can do it using far less power. We have already made a major impact in providing image sensors to the notoriously power hungry cameraphone and mobile device based marketplaces, and in the space of two years have become the leading supplier of imaging solutions there. One in three cameraphones now have our sensors and in only two years we have become the largest manufacturer of image sensors in unit terms worldwide. So now, for instance, the technology enables all sorts of security, medical, notebook and automotive applications – you can tune the imagers for a very high dynamic range, low light and low noise at high temperatures which then enables them to operate in a wide variety of environments that CCDs can’t function in.
“As a result, you can put imaging into a multitude of applications that were never possible before, and I think we really created that movement by creating the high quality sensors that drive those applications.”
The second example Durcan quotes is in the NAND memory arena. “What we’ve done is probably not apparent to everyone just yet, but, actually, I believe that we’ve broken Moore’s law.
“We are now scaling in the NAND arena much faster than is assumed under Moore’s law, and that has really changed the rate at which incremental memory can be used in different and new ways. As a result, I believe it will also pretty quickly change the way computers are architected with respect to memory distribution. So we’re going to start seeing changes in what types of memory are used, and where they sit in the memory system, and it’s all being driven by huge productivity growth associated with NAND flash and the rate at which we’re scaling it. We are scaling it faster than anyone else in the world now and we are also well tuned to the increasingly pushy demands of mobile communications, computing and image capture devices.”
The productivity growth Durcan alludes to has been particularly sharp for Micron over the past year. The formation of IM Flash – a joint venture with Intel – in January 2006 has seen the companies bringing online a state-of-the-art 300mm NAND fabrication facility in Virginia, while another 300mm facility in Utah is on track to be in production early next year. The venture also produces NAND through existing capacity at Micron’s Idaho fabrication facility. And just to keep things even busier, last July the partners introduced the industry’s first NAND flash memory samples built on 50 nanometre process technology. Both companies are now sampling 4 gigabit 50nm devices, with plans to produce a range of products, including multi-level cell NAND technology, starting next year. At the same time, Intel and Micron announced in November 2006 their intention to form a new joint venture in Singapore (where Micron has a long history of conducting business) that will add a fourth fabrication facility to their NAND manufacturing capability.
In June 2006, Micron also announced the completion of a merger transaction with memory card maker Lexar Media, a move that helped Micron expand from its existing business base into consumer products aimed at digital cameras, mobile computing and MP3 or portable video playing devices.
“Our merger with Lexar is interesting for a number of different reasons,” Durcan comments. “Certainly it brings us closer to the consumer, as, historically, our products tended to be sold through OEMs. But, in addition, it provides the ability to build much more of a memory system, as opposed to stand-alone products, given that Lexar delivers not only NAND memory, but also a NAND controller that manipulates the data in different ways and puts it in the right format for the system that you’re entering. Working closely with Lexar, we want to ensure that this controller functionality is tied to the new technologies we want to adopt on the NAND front, making sure that they work well together, thus enabling more rapid introduction of new technologies and getting them to market more quickly.”
The considerable activity of the past twelve months clearly reflects Micron’s view of itself as a company that is in the business of capturing, moving and storing data, and aiming for the top of the tree in each area. On the ‘capturing’ front, for instance, Durcan notes: “We’ve been very successful from a technology development perspective, and I think we’re pretty much the unquestioned leader in the image quality and imaging technology arena. As mentioned, we also happen to be the world’s biggest imaging company now – it happened more quickly than any of us thought it would, but it was driven by great technology. So we have plenty of challenges now in making sure that we optimise the opportunity we’ve created to develop new and more diversified applications.”

Stringent tests
Certainly, the company is willing to put its developments to the most stringent of tests. All of Micron’s senior executives, including Durcan, recently drove four Micron off-road vehicles in the Baja 1000, an exceptionally rugged all-terrain race in Baja California, digitally capturing and storing more than 140 hours of video from the race, using Micron’s DigitalClarity image sensors and Lexar Professional CompactFlash memory cards specially outfitted for its vehicles. All the technology performed remarkably well, as did Micron’s CEO Steve Appleton, who won the contest’s Wide Open Baja Challenge class some 30 minutes ahead of the next closest competitor.
Appleton’s energetic and non-risk-averse approach to both the Baja 1000 (in some ways the American version of the Paris Dakar Rally) and to life in general (he is reputed to have once crashed a plane during a stunt flight, but still proceeded with a keynote speech just a few days later) is reflected in an undoubted lack of stuffiness within Micron.
Certainly, the company has taken a certain level of risk in pioneering technology developments. RFID is a case in point. “Sometimes,” Durcan explains, “the technology was there, but the market was slow to develop. RFID is a good example. Today, Micron has the largest RFID patent portfolio in the world. We certainly developed a lot of the technology that is now incorporated in global RFID standards, but when we first developed it, the threat of terrorism, for instance, was less obvious, so we simply couldn’t get these tags going that are now absolutely commonplace. I suppose you could say we’ve been a little ahead of our time.”
The company is also managed by a comparatively young executive team, with a very non-hierarchical approach to business. “I do believe that we have a certain mindset that keeps us pretty flexible,” Durcan explains, “and one of our strongest cards is that we have some really great people, with a great work ethic. At the same time, we drive a lot of decisions down into the company. We’re probably less structured in our decision making than a lot of companies.
“So, we try to get the right people in the room (not necessarily in the room actually, but on the same phone line!) to make a decision about what is the right space to operate in, then we can turn it over to people who can work the details.
“We try to get to that right space, at a high level, through good communication and then drive it down. It is the opposite of what I believe can happen when companies grow, become compartmentalised, and tend to get more and more siloed.
“There is also very strong synergy between the different activities within Micron,” he continues. “In each case we’re really leveraging advanced process technology, advanced testing technology, and large capital investments in large markets. There are a lot of things that are similar and they do all play closely with each other.”

International bunch
Micron’s people are, in fact, a truly international bunch, recruited globally, and bringing a great diversity of skills and approaches to the company. “I think that we are one of the most global semiconductor companies in the world,” Durcan says, “despite being a relatively young company. We recently started manufacturing our sensors in Italy and have design centres in Europe, both in the UK and Norway, which are expanding their operations. In fact we are now manufacturing on most continents – except Africa and Antarctica – and we have design teams right around the world who work on a continuous 24hr cycle, handing designs from site to site. We’ve tried to grow a team that is very diverse, and leverage the whole globe as a source of locating the best talent we can.”
So, does all this talent produce its own crop of future gazers? Durcan believes they have their fair share.  “There certainly are people at Micron who are very good at seeing future applications. My personal capabilities are much more at the technology front end. I can see it in terms of ‘we can take this crummy technology and really make it great’. Then I go out and talk to other people in the company who say ‘that’s fantastic, if we can do that, then we can...’. It really does take a marriage of the whole company, and a lot of intellectual horsepower.”
That horsepower has resulted in a remarkable number of patents for Micron. Durcan comments: “The volume and quality of new, innovative technology that Micron has been creating is captured by our patent portfolio.  It’s an amazing story, and something I’m really proud of.  The point is, Micron is a pretty good-sized company, but we’re not large by global standards – we’re roughly 23,500 employees worldwide. Yet we are consistently in the top five patent issuers in the US.
“I feel the more important part of the patent story, however, is that when people go out and look at the quality of patent portfolios, they typically rank Micron as the highest quality patent portfolio in the world – bar none. I think that’s pretty impressive and speaks volumes about the quality our customers benefit from.”

Lynd Morley is editor of European Communications

Tony Wilson, COO, Martin Dawes Systems and Warren Buckley, director of portfolio convergence at BT, describe the relationship between the two companies and how MDS is enabling BT to become more responsive and agile

Lead interview - The right partnership

As every operator working in the highly competitive global telecoms industry knows, success depends on business agility, innovative, easy to use services and putting customers first. This is especially true in the emerging market for converged services where end users increasingly want anywhere, anytime connections over any device.
Converged services need converged companies to supply them: companies that have both telecoms and IT expertise and experience, and the next generation solutions and networks to deliver. Just such a company is Martin Dawes Systems, which has over 20 years' experience as a virtual mobile operator, running next generation networks and creating converged software solutions for the market. Acting as a Mobile Virtual Network Enabler (MVNE), it offers a suite of specialist subscriber management systems, processes and end-to-end managed services and platforms to its virtual mobile telecoms clients. “We work in partnership with and as part of our customer's operation, becoming almost an internal department helping them to get products to market fast,” explains Tony Wilson, COO at Martin Dawes Systems.
 “Since we understand both the telecoms and IT sides of the business and can draw on our history as a Mobile Virtual Network Operator (MVNO), we can help operators become more flexible, customer centric and responsive.” 
Demanding markets and competitive challenges mean operators must react fast, but few have the unified network architecture, supporting technologies or internal organisation needed for rapid response to shifting customer demands.
Typical of most former incumbents and major telcos, BT has a complex legacy environment with over 4,000 systems, hundreds of networks, over 20 million customers and several thousand products. Launching converged services efficiently requires a single unified platform. BT's answer was to choose a combination of in-house core technical innovation and partnerships with expert, trusted third party suppliers. “A big challenge for BT was in deciding how much to do for ourselves and how much to outsource, to get through a partnership or buy in,” says Warren Buckley, director of portfolio convergence at BT, which has created a close working partnership with Martin Dawes Systems to help deliver its fixed mobile convergence (FMC) service BT Fusion to SMEs.
 “We partnered with Martin Dawes Systems because of their huge mobile experience combined with custom built products for SMEs. We have a unique, fully managed billing and CRM service which enables us to deliver speed to market plus the level and complexity customers demand.
 “The requirements of business users in the SME space are more complex than those of consumers, especially from the mobile point of view, and represent a divergence from traditional PSTN services. Mobility entails the flexibility to offer bundled minutes, handle tariff changes and an on-going hierarchy of relationships within and between businesses. FMC needs to offer the best of both worlds and we therefore needed to converge,” explains Buckley. “We made the big decision to go to a third party for an array of different services including CRM, billing, revenue assurance and tariff set up and control.”
Explaining the relationship further, Wilson says his company acts as a department within BT, fully understanding its requirements and providing a stable group of experts who help develop new ideas for meeting tight time schedules and fostering business agility.
Traditional telco response to launching new products is to build a separate systems stack for each one. The result is a legacy of proprietary and largely manual systems; disconnected islands of automation and an environment in which valuable business information and customer data are made largely inaccessible because they are stored and duplicated in numerous separate silos. All of which adds up to inefficiency, slow response to market demand and an architecture unable properly to support convergence, business agility or customers – a situation which can be ameliorated by creating third party partnerships.

Facing other challenges
New entrants, many of which are retail brand management companies or content rights owners with little or no telecoms experience, face other challenges. For them the priority is to select a network provider plus an expert partner to whom they can outsource end-to-end service delivery. With no legacy, these MVNOs are, however, agile competitive companies used not only to anticipating customer needs but also to delivering high quality services. They are introducing retail business practices into a sector which has historically been dominated by incumbent operators with little competitive pressure to change.
Whether existing or new, all operators must cut the cost of doing business whilst simultaneously introducing a raft of innovative multimedia services terminating on different devices.
The old vertically structured point-to-point architecture is too cumbersome to support this fast moving world in which products may be quickly set up and torn down; where customers demand different billing models and on-line account management across the services they use.
 “In today's market operators are trying to re-position themselves and focus on convergence,” says Wilson. “The convergent model is now all about offering quadruple play services which combine wireless and wireline voice with broadband and video in one bundled package.
 “The industry is re-establishing itself. Operators are re-building and reforming the MVNO model to generate new business. They are outsourcing to third party MVNE suppliers and have a choice between a pure IT organisation or one which is a telecoms aware IT provider.”
Moving to a partnership model meant BT did not have to build/add more infrastructure or undertake the difficult, time consuming and complex task of integrating new systems into the existing estate. With the benefit of a partner that understands its aims and supplies experienced staff, plus a suite of unique product options, this relationship has helped BT launch Fusion into the SME market very fast. Buckley estimates that working with the managed services model shaved nine months off the launch time. “It would have taken us between nine months and a year to launch working with our legacy environment, but working with Martin Dawes Systems' OSS/BSS and CRM solutions we did it in three months,” he says.
Having a strong brand image is not the only ingredient of success; it is also a matter of being able to react fast, differentiate services from competitors, offer choice and value added services which can be made personal. Operators must capitalise on market opportunities – a difficult task both for established telcos and new, often inexperienced entrants. Outsourcing or partnering with an MVNE is often the quickest and most reliable route to market since it avoids the need to build, run and manage systems and adds OSS/BSS solutions and expertise.

Experienced intermediary
Acting as an experienced intermediary between the network provider and the MVNO, Martin Dawes Systems manages and runs the required operational systems to react quickly and deliver multimedia products. Depending on the contract, it also handles relationships with all the third party suppliers including the content providers so essential in converged services.
Exploiting all the flexibility of its next generation network, Martin Dawes Systems supports MVNOs in testing, launching or tearing down new converged services fast. Tariffs can be changed equally quickly and discounts applied as appropriate, with different billing models offered to selected customers. “As we have provided a convergent architecture for years, we can deliver services very fast and connect to wireless and wireline networks seamlessly,” says Wilson. “We deliver complex services to corporate and SME customers and, by employing the managed services model, act as an IT department.”
Martin Dawes Systems reduces risks and, by helping MVNOs maximise capital investments and compete effectively, speeds up the return on investment (ROI). MVNOs are therefore free to concentrate on core competencies, brand management and adding value in the form of innovative converged services, customised and targeted at specific users.
According to Yankee Group statistics, the MVNO market will generate service revenues of $10.7 billion by 2010, and British regulator Ofcom estimates that MVNOs already account for 5.5 million UK phone contracts.
Running and managing multimedia services depends on linking both the network OSS and customer facing BSS systems, which is a challenge best met by automation and a move from proprietary to open standard OSS/BSS estates.
While it is a slow, expensive process, the big operators are creating service oriented architectures and building next generation networks based on open standards. They are restructuring internal processes/systems and putting customers at the heart of their businesses.
However, for smaller operators, new entrants or even for incumbents wanting to move fast into new markets, rather than building their own systems or buying off the shelf and then customising, perhaps the quickest, most cost effective option is to outsource to an MVNE.
 “From an IT perspective, operators need managed, secure data which is easy to manipulate and made available to customers easily. They need service oriented architectures, Java and open standard OSS/BSS systems. They need marketing, customer relationship management systems and everything in between,” explains Wilson.
 “Convergence sits between two camps – in both the OSS and BSS environments,” he says. “From an OSS perspective it is delivered over intelligent networks. In the BSS space, operators must be able to bill for converged services, and contact centre agents must respond fast and accurately to customers' questions.”

Separate platforms
Among other processes, convergence affects billing systems. Traditionally, mobile pre and post-paid customers were managed on separate, independent platforms, with pre-paid handled by real-time platforms in the network and post-paid billing supported by IT departments. In converged networks both are handled on the same billing platform, managed by staff with both IT and telecoms expertise.
At the heart of Martin Dawes Systems' product offering is the dise3G pre-integrated end-to-end billing and CRM solution that handles multi-service, multi-subscription pre and post-paid billing on the same platform, making it quick and easy for operators to launch multimedia services. Operators not only use the system to manage all aspects of the customer relationship fast and economically via self care features, but also to run critical business processes including sales, marketing, order management, rating and revenue assurance. The open standard CPP billing and CRM solution provides telcos with all the flexibility and control needed to support converged services, different billing models and customised solutions.
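What convergent rating means in practice can be suggested with a small sketch - a hypothetical model for illustration, not the actual dise3G design - in which a single rating path serves every subscriber, decrementing a live balance for pre-paid users and accruing charges to the next invoice for post-paid ones:

from dataclasses import dataclass

@dataclass
class Subscriber:
    msisdn: str
    prepaid: bool
    balance: float = 0.0          # pre-paid credit
    invoice: float = 0.0          # post-paid accrual

def rate_event(sub: Subscriber, service: str, units: float, tariff: dict) -> float:
    """Rate a usage event on the shared platform, whatever the payment model."""
    charge = units * tariff[service]
    if sub.prepaid:
        if charge > sub.balance:
            raise RuntimeError("insufficient credit: deny or degrade service")
        sub.balance -= charge     # decremented in real time
    else:
        sub.invoice += charge     # billed at the end of the cycle
    return charge

tariff = {"voice_min": 0.10, "video_mb": 0.02}   # hypothetical rates
alice = Subscriber("447700900001", prepaid=True, balance=5.0)
bob = Subscriber("447700900002", prepaid=False)
rate_event(alice, "voice_min", 12, tariff)
rate_event(bob, "video_mb", 300, tariff)
print(alice.balance, bob.invoice)

Because both payment models share one rating path and one tariff definition, a new converged service needs to be configured only once, for every customer type.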
Unconcerned about underlying technologies or the considerable complexities, costs and challenges of moving to next generation networks and services, end users are most interested in price, convenience and the quality of service. The point is to deliver better products and make them easy for people to use – convergence is also about simpler, better, end user experience.
“Convergence is the future,” believes Buckley. “Business users are starting to see real benefits as we move away from simple connectivity into bundled minutes delivered to any device over any network. Connectivity will be fundamental, but services will become increasingly important and we will work with Martin Dawes Systems for billing, billing analysis and related solutions.
“One of the most positive aspects of the relationship between the two companies,” he notes, “is that as BT realises its long term plan of moving all products, services and customers onto its 21st Century network, it has a supportive, expert and flexible partner.”
That the relationship is strong and mutually beneficial is evidenced by the fact that Martin Dawes Systems won the top prize for best billing and OSS implementation at this year's World Billing Awards for its work on the SME version of BT Fusion. Similarly BT has won a number of industry accolades.
In addition to its high profile work with BT, Martin Dawes Systems works with telcos large and small, new, old, fixed, mobile or with ISPs running circuit switched or IP networks. For those MVNOs new to the business it has developed its 'telco in a box' solution which has all the OSS/BSS and other systems required to switch on, deliver and bill services. “We offer an end-to-end platform from account activation through to customer care and billing for convergent products,” explains Wilson, who believes the goal is to support operators' changing requirements and give them an efficient, flexible platform to move forward into converged services.
Choosing an MVNE is a decision to create a close and long term partnership based on trust – it is about sharing risks, increasing customer numbers and reducing churn.                                                     

Priscilla Awde is a freelance communications writer

Convergence, consolidation, amalgamation – these are the current watchwords of the telecoms industry, as buy-out and merger speculation (and reality) abounds, and the Tier One players jostle for power and position. But as these growing entities position themselves to tackle an equally expanding global market, their requirements for operations systems support are becoming commensurately greater and more complex, as they respond to the need to deliver continued customer loyalty, cost-effective operations and profitable revenue growth.

Lead interview – Building on success

Central to the success of any business, of course, is its revenue, and, certainly, in the telecoms industry, proving the business case for revenue assurance and fraud prevention programmes should be an easy task, especially given that telco revenue leakage never falls much below 10 per cent. For the large Tier One providers, an OSS supplier with the stature and reputation within its own field that can match that of the telco – not to mention the products and services to help solve the problems – is becoming vital. 
Subash Menon, head of the newly merged Subex Azure Limited, comments: “We recognised that to provide the large telcos with the revenue maximisation facilities they need, you really have to be a large powerhouse that can deliver the breadth and depth of service they require.”
To that end, Subex Systems, leaders in the supply of fraud management systems, and Azure Solutions, the number one revenue assurance systems provider, joined forces with the aim of helping telecoms operators establish Revenue Operations Centres (ROC) through an integrated suite of solutions. Subash Menon, Chairman, President and CEO of Subex Systems, will have the same role within the new company, while John Cronin, president and CEO of Azure Solutions, will oversee the integration of the combined entity, ensuring a smooth transition of services and products from both the customer and employee points of view.
Commenting on the merger, Menon notes: “Clearly, there is a great deal of consolidation underway among operators, and we believe that it is critical for OSS vendors to also consolidate in order to take advantage of the opportunities going forward. We believe that the large operators are now seriously looking at the revenue maximisation space, and will need a significant and reliable partner with whom they can tackle the issues. This is not about buying or licensing a point solution; it is about partnering for the very long term. We aim to fill that space.”
Subex is acquiring Azure for a little over $140 million in an all-share deal. Subex is the smaller company in revenue terms – at around $25 million – compared to Azure, whose revenues last fiscal year were $31 million. In the merged entity, venture funding companies including Doughty Hanson, Intel Capital and New Venture Partners, which are investors in Azure Solutions, will hold a 34.5 per cent stake, while the current shareholders of Subex Systems – including Subash Menon and a variety of funds – will hold 65.5 per cent.

Issues to be addressed
There are certainly plenty of issues to be addressed in the revenue maximisation space. Operators across the globe have to deal with a whole range of threats to their revenues, including internal and external fraud, invoicing system errors, poor systems integration and processes, rating errors and credit management, to name a few. The downturn in the telecom industry's fortunes during the early part of the decade understandably generated a surge of interest in both revenue assurance and fraud prevention – the old adage of 'every penny counts' holding sway. But as we begin to move back into a more entrepreneurial business environment, it is vital that operators do not lose sight of the fact that wealth creation in telecoms continues to be determined by the ability to manage and extract business value from the highly complex services they offer.
Subex Azure's John Cronin comments: “More and more, the big operators are integrating the various elements of revenue maximisation. They now need to move into the next stage, which is the Revenue Operations Centre, offering a centralised, integrated operations infrastructure that monitors, controls, and ensures integrity of the revenue chain through continual automated tracking of performance indicators.
“This involves the provision of tools to ensure proactive revenue chain optimisation and expedited error correction,” he explains. “The ROC concept is similar to that of the Network Operations Centre which allows the telco CTO to monitor the technical health of the network 24/7. In the case of the ROC, of course, it would be the CFO, for instance, who could monitor revenue generation and leakage across the network 24/7.”
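The mechanics behind that 24/7 tracking are simple to picture. As a purely illustrative sketch – not Subex Azure's product, and with every metric name, threshold and data source invented for the example – a ROC-style KPI check might look like this in Python:

```python
# Illustrative sketch of a Revenue Operations Centre-style KPI monitor.
# The metrics and thresholds below are hypothetical examples, not
# Subex Azure's actual product or data model.

THRESHOLDS = {
    "unbilled_cdr_pct":    2.0,  # % of call records that never reach a bill
    "rating_error_pct":    0.5,  # % of events rated at the wrong tariff
    "suspense_aging_days": 5.0,  # how long unmatched records sit unresolved
}

def check_kpis(snapshot: dict) -> list[str]:
    """Compare a periodic KPI snapshot against thresholds; return alerts."""
    alerts = []
    for kpi, limit in THRESHOLDS.items():
        value = snapshot.get(kpi)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {kpi} = {value} (limit {limit})")
    return alerts

# A CFO-facing dashboard would poll snapshots like this around the clock:
print(check_kpis({"unbilled_cdr_pct": 3.2, "rating_error_pct": 0.1}))
# -> ['ALERT: unbilled_cdr_pct = 3.2 (limit 2.0)']
```

In practice the hard work lies in feeding such checks with reconciled data from mediation, rating and billing systems; the alerting logic itself is the easy part.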
Subex and Azure bring a wealth of different experiences and expertise to their joint table that will facilitate the establishment of the ROC concept in the telco psyche. Subex, founded in 1992, is based in Bangalore, India, and has a global presence across North America, Europe and Asia. The company went public with an IPO in 1999, and its shares are listed in India on the Mumbai, Bangalore and National Stock Exchanges, while its GDRs are listed on the Luxembourg Stock Exchange. Azure – spun out from British Telecom in April 2003 – is headquartered in London, has its R&D centre in Ipswich, UK, and has staff based in the USA, the Far East, Australia, South America and across Europe. The newly merged Subex Azure's global and Asia-Pacific headquarters will be in Bangalore, while the EMEA and American operations will be based in London, UK and Westminster, Colorado respectively. The new company counts some 23 of the world's 40 largest operators among its customers, with a customer base of around 150 installations in over 60 countries. Tier One customers include BT, Telenor, Vodafone, Orange, O2, Cable & Wireless, TeliaSonera, T-Mobile, Verizon, Bharti Televentures and AT&T.
The merger continues both companies' recent strategy of market consolidation. Acquisitions by Subex include the fraud management assets and technology of Mantas in March 2006, Lightbridge and Alcatel in October 2004, and Magardi in May 2001. Azure acquired US-based cost and revenue assurance company Connexn Technologies and UK-based telecoms software company Anite Calculus in November 2004, and it also acquired route optimisation company Monnet in January 2004.
Against this background, Subex and Azure began discussions some 12 months ago. “There was obviously a 'get to know each other' period, during which time we were able to look at and discuss technical capabilities, product sets and the specific relationships we each had with our customers,” Cronin explains. “At the end of the day, we clearly felt that this is a perfect match, and must stress, of course, that all existing customer and partner agreements will be fully supported by the new company.”
Menon adds: “We both looked at other options during that one year period, but obviously came to the conclusion that this is the right marriage. Our customer overlap was minimal – almost nil – and our different product sets combined perfectly to address the needs of telcos around the world.”
The advantages of being able to more easily reach each other's markets also played a part in the decision to merge the two companies. Cronin comments: “Looking specifically at Asia, we felt that the only way we would be able to establish ourselves quickly and – most importantly – effectively, was by having local people, working within the local culture, to service customers. We felt that we couldn't continue to ship people from the UK and North America into Asia in order to service our customer base in those countries, and we recognised that going through organic growth in any region that presented as big an opportunity as Asia does, would simply take too long. Clearly through this merger, we now have an acceleration into Asia.”
For Subex, equally, there was the attraction of adding more Tier One companies to its client list, and the recognition that the merger would enhance its geographic presence and credibility, especially in Europe. But Menon is keen to stress that it is the complete package of the two companies – their product offering, their market experience, their customer relationships and their geographic spread – that provides the strengths to be leveraged in the marketplace.
“In order to create a successful and sustainable software product organisation in our space, I think there are three or four basic elements that must be in place,” he explains. “In no particular order of importance, I would say that one is the ability to create profits – the whole sustainability and execution part of being in business.  Certainly I believe that Subex has proven itself in that area. We are an extremely profitable organisation. We have grown from the ground up in six years to reach some $25 million revenue last year, providing profits of around $8 million. 
“Second,” he continues, “is having a great product offering, and clearly we believe that we're bringing a fantastic solution to the marketplace, via our ROC concept, including the well-proven elements of revenue assurance and fraud control, originally from the two separate companies.
“Third is stature – the company's image and credibility in the marketplace. Now while I certainly believe that Subex has built up excellent credibility in the telecoms market, Azure has an even better image and has built up an exceptionally strong brand.
“And fourth is the most important element of customers – who they are, and the nature of your relationship with them. Obviously, Azure has a brilliant relationship with such companies as BT, and between us we serve well over 150 customers worldwide. 
“To my mind,” he concludes, “it is these four elements which basically make or break an organisation in the long run. Clearly, on each of these fronts, the combined company of Subex Azure is extremely well placed.”
As is generally the case when two well established companies combine their operations, the management of both Azure and Subex recognise that such a development is not merely a question of merging products and services, but must also take account of the different and specific company cultures that will have grown up. Subash Menon is keen to emphasise that while some issues of corporate culture may need to be addressed, he does not believe that the different national cultures of the two companies are in any way relevant to the current marketplace.
“I really don't think it's valid to look at a company in terms of national culture anymore. Both Subex and Azure are international organisations, with several national cultures working within each company.
“As to the corporate culture – that, between the two companies, is fairly similar in that we both have a pretty aggressive approach in going after new business, and are both particularly supportive of our customers.
“Certainly, I don't believe there are any significant differences between us, although there are bound to be some gaps – no two organisations are ever identical.”
He goes on to note: “When looking at an integration of this nature, there are certain fundamental things that we need to hold sacred. First and foremost is the fact that any commitment made to customers by either of the separate companies will be met. That is absolutely vital.
“Second, we must recognise the impact on our own people. We are determined to ensure that any such impact will be minimal, and that we will conduct ourselves in as professional a manner as possible to ensure a smooth transition.
“These are two absolutely cardinal principles,” he stresses. “Once we have these set in stone, and are guided by them, the rest falls into place pretty simply.”                                                     

• Lynd Morley is editor of European Communications

For most people in the telecommunications industry, the complexity of the silicon chips that lie at the heart of their services, devices and profits remains largely – and safely – in the background. While the principles of Moore's Law have continued to hold good, a couple of generations of both engineers and business executives have grown used to expecting constant falls in hardware prices and radical increases in processing power.

Lead interview – Image is everything

Now, however, with network operators of all shapes and sizes trying to come to terms with a new universe of services based on content, openness and real consumer choice, the role of that core silicon in driving entirely new business models is coming under renewed scrutiny. In particular, as mobile phones become the world's most popular consumer electronics device, the power and functionality of that phone – whether represented by its image capturing capabilities or its storage capacity for multimedia – will form a key building block for the revenue chains of the coming years.
It was with these thoughts in mind that European Communications recently caught up with some of the senior executives of Micron Technology Inc., one of the world's most innovative semiconductor developers and a leading force in both memory and imaging technologies.
EC's Alun Lewis started by asking Bob Gove, Micron's VP of Imaging, what changes we could expect to see as the world's telecommunications companies responded to advances in imaging and memory technologies:
BG: At the risk of sounding like a Hollywood screenwriter pitching a horror story, we're starting to see the Internet – in both its fixed and wireless forms – grow eyes! That may initially come across as an overly melodramatic statement but, when you consider that cameraphones now account for two out of every three handsets sold today – out of an annual market of around 500 million devices, and growing – you get an idea of the scale involved in the mobile world alone.
Add onto that the potential for imaging in other applications – from security and surveillance to medical diagnostics and monitoring – and you soon see that the creation of low-cost, but powerful, imaging devices has a huge potential to drive near-exponential change in the ways that we work, live and communicate.
We're currently at something of a cusp when it comes to recognising how important these developments are going to be for the traditional telecommunications industry. Not everyone 'gets it' yet, but those that do – and many new business models and intermediaries in the imaging sector are still emerging – will have access to entirely new and original streams of revenue.
What's more – and just like the staggering and totally unexpected success of SMS – it's largely going to be down to the customers themselves as the driving force behind this new wave of services and money. What we, as an industry, have to do is give them tools that are up to the job and that fulfil their own particular needs.
AL: You mention SMS as an example of where the industry failed to appreciate the real dynamics of their customer base. How do you translate customer requirements in such a new area into designing and engineering real products?
BG: There's an old saying in the engineering fraternity that when it comes to making a product faster, cheaper and better, you can only ever have any two out of those three at any one time. The core characteristics of silicon – as represented by Moore's Law – combined with Micron's own specific expertise in both design and manufacturing, mean that we can now get three out of three in both imaging and memory components.
What's important, though, is understanding the requirements of the end user. If we don't make products fit for each customer's usage – whether they're an enthusiastic mobile blogger, someone seeking to send images or video in to a TV show, or a grandmother wanting to capture images of her grandchildren – then we and our own customers will miss the particular points on the cost and performance curves that address those markets.
Traditionally, it's usually been the gadget-minded male who's the first to get into digital photography. The merging of imaging and the mobile phone, and the greatly improved usability of these devices, has opened this up to a female market with huge potential. It is, after all, the wives and mothers who usually become the family archivists, capturing precious family memories and passing them down the generations.
That's one reason why Micron has concentrated on delivering very high quality image sensors that can readily operate in the great variety of situations where the amateur's going to want to capture an image without having to think about it. This is particularly acute in low-light environments, and we do have a saying that if your cameraphone still works under a table, then it's a Micron image sensor! We put a lot of effort into ensuring that things like colour sensitivity and accuracy get taken for granted by the end user, without them having to negotiate complex image processing software to get a decent, fit-for-purpose picture or video.
AL: There are huge changes afoot in the wider telecommunications industry though – the entry of WiFi and WiMax into the mass market, cross-industry activity between content owners and telecoms carriers such as TV voting and reality programming, plus the development of real convergence between the mobile and fixed worlds through projects like BT's Fusion service, to name just a few. How are these impacting on Micron's business and technology strategies?
BG: In essence, the value chain for service providers is getting a lot longer and much more complicated, and it's crucial for their business plans that the customer is more than just a passive consumer of bandwidth – they have to generate their own content now, and imaging naturally forms a key part of that new value. Some people are just going to want basic point and shoot functionality, while others are going to go for a top range device – we have to meet both needs, and indeed people may cycle between different devices during a single day, depending on their requirements.
Network operators themselves are realising that Quality of Service is a crucial attribute for increasingly sophisticated customers, and Micron can help them in achieving that by ensuring that the visual content from the customer's cameraphone is excellent to begin with.
The domestic networking market, in particular, is also looking very interesting, with WiFi and WiMax driving the true integration of consumer electronics devices like TVs and HiFis with PCs and mobile handsets. Today's youth, for example, doesn't distinguish between different voice and data networks and services in the way that their parents do – to them it's all communications and media.
This development also plays to our strengths in the memory field. Our technology and manufacturing investments in CMOS for imaging, and in DRAM, DDRAM and NAND technologies, allow us to bring high-yield, high-performance products to market quickly and reliably – essential for our own customers, who have their own tight product schedules to meet.
In the wired/wireless home, users are far more likely to download multimedia content onto their mobile devices over DSL links that are effectively free, than they are to pay the traditionally high tariffs demanded by mobile data services. As content such as films and TV programmes become distributed over the Internet and Digital Rights Management issues become clarified, users will want to take this content with them on the move and that requires a lot of compact storage in the device itself.
In August this year, for example, we announced that we'd started shipping 8 Gigabit and 4 Gigabit NAND Flash memory products.
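To put those densities in perspective, here is a back-of-the-envelope calculation – the video bitrate is an assumed, illustrative figure, not one from Micron:

```python
# Rough capacity arithmetic for an 8 Gigabit NAND flash part.
# The video bitrate below is an assumed figure for illustration.
GIGABIT = 1_000_000_000            # bits
capacity_bits = 8 * GIGABIT        # an 8 Gb device holds one Gigabyte
video_bitrate = 500_000            # bits per second, assumed mobile stream

seconds = capacity_bits / video_bitrate
print(f"{seconds / 3600:.1f} hours of video")   # -> 4.4 hours of video
```

On those assumptions a single 8 Gb part carries several hours of mobile-quality video, which is what makes on-device multimedia storage plausible at all.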
AL: Your earlier comment about the Internet growing eyes certainly has a particularly apt resonance in today's security conscious world. How do you see imaging developing in this area?
BG: It's far too simplistic to just see things through 1984-coloured spectacles, and the ability to monitor many different things visually can bring incredible benefits to us as both individuals and as societies.
For personal health applications, Micron's gone beyond the cameraphone to develop the camerapill, capable of capturing images of a patient's gut during its journey from mouth onwards, and then sending these to a radio receiver on the patient's waist. This can dramatically reduce the need for invasive surgery or disruption to the patient's life.
Additional imaging applications that we see include 2D barcodes that can be read by mobile phones, allowing customers to carry out instant in-store checks on product information and prices, while another area of focus is imaging for fingerprint readers in mobile devices. As mobile devices become increasingly important as highly personalised tools that allow us to interact with ever-richer sets of online services and applications, their own implicit security also increases in importance. It may be your device, but how can you guarantee that it's only used by you?
Remote monitoring of home and family while people are away is another important application area. Whether it's a working parent being able to see their child at nursery school via a webcam, or just checking up on the house while away from home, remote imaging can bring peace of mind, as well as new revenue opportunities for service providers.
Continued improvements in how we package devices are also allowing us to introduce them to more hostile environments, such as cars and lorries. On one hand there's legislation coming in around the world to introduce both high and low speed crash avoidance systems, and optical sensors are now ideal for this, replacing a profusion of mirrors. On the other, they can also control airbag deployment to make sure that a small child doesn't get squashed.
AL: And any final thoughts for the mobile industry?
BG: Keep watching this space – even if you're not directly using image processors or involved in cameraphones! Most of the current, new wave of growth in telecommunications is riding on content – with much of it produced by users themselves. If you fail to spot what's happening at that end of the food chain, you'll miss out on opportunities further up the line. From almost a standing start three years ago, 20 per cent of our revenues now come from imaging and supporting specialist memory – a proportion certain to increase in coming years as users move from transmitting just their voices and thoughts to include images of the world around them.

Alun Lewis is a freelance telecommunications writer and consultant, alunlewis@compuserve.com


Manufacturing for an ever more complex value chain
Alun Lewis was also able to talk briefly to Dr David Burrows, director of Micron's UK Design Centre, about the company's design and manufacturing strategy
AL: David, semiconductor manufacturing can be a pretty hairy place to be at times, given the speed the industry moves at. How has Micron approached these issues?
DB: It might sound like a cliché, but it's all about understanding what the customer wants – sometimes even before they're fully aware of it themselves, and then designing products and prices to meet that need. For example, we anticipated the demand for cameraphones and similar devices quite early on and pioneered the use of CMOS technology in manufacturing these, allowing them to be made using standard methods employed for DRAM production.
We're also able to place a great deal of functionality and power within the actual chip itself, drawing on Micron's in-house image processing and manipulation expertise to add considerable value to the basic image capture system. This means that we're able to supply both individual components and complete camera systems-on-a-chip that include features such as colour recovery and correction, and auto-exposure, making things simpler for both the device manufacturer and the end user.
These integrated features are not the only thing, of course: our experience in designing high precision versions of our sensors – manufactured in ceramic packaging, with industrial temperature capabilities, global shutter and other features – also allows us to supply additional markets, including automotive safety and driving assistance, medical, defence, security surveillance and aviation related systems. In addition, we create very high frames-per-second sensors that are already being used for Hollywood film special effects.
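One of the on-chip features Burrows mentions, auto-exposure, gives a flavour of what that integration involves. The sketch below is only a toy illustration of the principle – measure scene brightness, then nudge the gain towards a target – and bears no relation to Micron's actual pipeline; the target level and gain range are assumed values:

```python
# Toy auto-exposure loop: adjust analogue gain so the average pixel
# level approaches a mid-grey target. Illustrative only; real sensor
# pipelines are far more sophisticated.

TARGET = 128                       # desired mean brightness (0-255 scale)
MIN_GAIN, MAX_GAIN = 1.0, 16.0     # assumed plausible gain range

def adjust_gain(pixels: list[int], gain: float) -> float:
    """Return an updated gain value for the next frame."""
    mean = sum(pixels) / len(pixels)
    factor = TARGET / mean if mean > 0 else 2.0   # boost hard if all-dark
    return max(MIN_GAIN, min(gain * factor, MAX_GAIN))

# Example: a dim frame (mean brightness 32) pushes the gain up 4x.
print(adjust_gain([30, 32, 34], gain=2.0))        # -> 8.0
```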
AL: So how do you see these factors impacting on price and performance into the future?
DB: We'll still be riding on Moore's Law for quite a while yet!
The cameraphone market is starting to settle out, and demand is so dynamic that, in due course, we should even begin manufacturing our sensors here in Europe, at our semiconductor plant in Avezzano, Italy. Another emphasis is on improving the performance of mid-range products such as our newly available 3.1 megapixel camera chip, which can still turn out a high quality A4 image in low light conditions. The top of the range continues to expand as well, hence our just-announced launch of a 5 megapixel digital still camera quality device. Our research and development in CMOS imager efficiency continues to advance, of course, and we recently gave the world's first demonstration of images produced by 1.7 micron pixel technology. This will allow us to offer even higher megapixel devices targeted at cameraphones in the not too distant future.

External Links

Micron

Sheer size alone is no longer the asset it once was in the ICT sector. That said, some industry giants do retain a dominant position – even as their operations stretch across three different centuries.

Interview – Joined-up thinking

One such organisation is NEC, the $42 billion market leader from Japan. While everyone knows its name – and has a broad idea of its size and scope – many Europeans may still not be aware of all its areas of expertise.
A solid global brand to be sure, with strong credentials in IT, communications, display technologies and identity management solutions, NEC has been an active player in Europe for over 30 years.
European Communications recently met the newly appointed Managing Director, NEC UK, David Payette, and Kevin Buckley, Director of Mobile Network Solutions Division, NEC UK, to ask them what future they saw for the European communications sector and, more specifically, what a changing NEC had to offer the region.

EC: David – to start off on a somewhat personal note, you took Asian studies and Japanese as a discipline at university. What attracted you to this subject, and did it affect your decision to join NEC?
DP: I felt early on that the interplay between Asia and the rest of the world would be a stimulating environment to work in, so after university I went to Japan in the early '90s and joined a Tokyo-based systems integrator. In a series of increasingly senior roles, I moved to EDS in Asia Pacific, then Lucent in Australia and five years ago ended up here in Europe with Avaya, before being headhunted to this post only a few months ago. These international experiences have been both challenging and culturally enriching, and yes, the fact that NEC offers me the opportunity to put my Japan experience to use in this part of the world, makes this new role a particularly attractive fit for me.

EC: And what are the specific benefits you feel that NEC brings to the market?
DP: For a start, we have a long history of bringing innovation and service excellence to Europe, and with that comes an emphasis on continuity and trust with customers and partners that can sometimes be hard to find in today's industry. This focus on long-term credibility is well received in the market, right down through the value chain to the end user. 
As an innovator, we're proud of the many market firsts we've achieved here in areas such as 3G networks and devices, and more recently with i-Mode, to name a few examples. We believe these experiences provide a platform that will help us go from strength to strength and deliver ever increasing value to our customers.

EC: It is widely recognised that much of Asia Pacific is well ahead of the West when it comes to innovation and deployment of advanced technologies. What cultural factors do you think influence this?
DP: When you talk about high standards of manufacturing and quality control, I suspect values and cultural factors are significant, particularly in countries such as Japan. But where innovation is concerned, I believe this is more a matter of basic economics. For example, it was only natural that the high population density you see in Asia's developed markets brought a high mobile subscriber base or 'teledensity' more rapidly there than in other parts of the world. Once this stage is reached, market forces relentlessly drive innovation through intensified competition and increasingly insatiable consumer appetites for more functionality. The game changes from simply laying the infrastructure and acquiring the subscriber base to building loyalty by delivering powerful data services to the handset and fortifying the network with higher technologies like HSDPA and IMS, which drive the necessary throughput and convergence.
We are seeing the same thing happening right now in the UK and throughout Western Europe. And, because NEC has always been a major player at the forefront of Asia's development, we're well suited to bring that experience and technology to Europe in a time tested and proven manner. i-Mode, for instance, is one of the more recent examples of this.

EC: For certain, the NEC portfolio is extremely broad. Is it possible that it's too broad, or that you are trying to cover too much?
DP: We don't think so at all. Highly specialised companies and individuals will always be essential – and we partner with many of them – but I'd argue that there's also a growing requirement for corporations to take a wider scope, given the perpetually increasing 'joined-up' nature of the systems that support our societies, businesses and lifestyles.
The future of telecommunications is about breaking down boundaries and making the customer experience seamless. NEC sees the big picture, and our joined-up thinking makes us the first choice for service providers of all types around the world. The future for ourselves, our customers and our customers' customers will be about enabling communications, information and commerce anywhere, anytime. For me, the ability to provide broad based expertise across areas as seemingly diverse as supercomputing, identity management and wireless – areas which increasingly complement the communications arena – combined with our ability to join disparate components together to achieve higher value, will play an important role in the industry, and that's good news for us and our customers.

EC: Kevin, how do you see this 'joined-up' thinking coming to market, specifically in the European mobile sector?
KB: I think what David just intimated about continuity and breadth is very important. If you look at the forces that now characterise the European communications sector as a whole, then openness, interoperability and adaptability are key drivers. The network's no longer closed and, in the case, say, of a service provider offering a converged fixed-mobile portfolio of services, they not only need access to 2G, 2.5G, 3G, HSDPA, i-Mode, WiFi, WiMax, DECT and Bluetooth expertise, but also to particular applications such as mobile TV – an NEC speciality – as well as enabling technologies such as identity and security management.

EC: Investing in R&D in such a fast changing industry is like trying to hit a moving target while aiming through a fogged-up mirror. How does NEC plan its investment?
KB: NEC itself has an interesting balancing act to perform as it tries to help the whole industry move forward. On one hand, we do possess what might be recognised as traditional strengths. We invest around £4 million each and every day in R&D; we play a major part in all the relevant standards bodies around the world; and we also place a lot of emphasis on local partnering with both academia and business.
On the other hand we have the ability to move very quickly when conditions demand. That necessarily involves being as close to the customer as possible and eliminating the unnecessary formality that can slow down the all-important 'time-to-market' factor.

EC: What joint R&D programmes do you run in EMEA with commercial or academic partners?
KB: One good example of this joined-up NEC character lies in some of our 3G work in Europe. In the background, we have Mobisphere, a long-term R&D partnership with Siemens aimed at driving 3G development work onwards. Out in the 'real world', we worked closely in the field with Siemens, deploying 3's UK and Irish networks in record time. At one point, that involved installing 140 cell sites each week.
DP: Particularly with a complex technology like 3G, the kind of end-to-end thinking from NEC that Kevin's just highlighted becomes highly relevant. NEC has been involved in nearly every first-stage roll out of 3G systems worldwide – we developed the world's very first 3G handsets, and we were the first to bring both networks and handsets to Europe. But we also know that great technology in the network is only part of the answer. To maximise ROI, you also have to understand the wider behaviours of the customer, and the whole service and applications environment as well.

EC: David, you mentioned i-Mode, which you are currently involved in bringing to Europe. What are the particular benefits it brings?
DP: i-Mode is a mobile experience that gives subscribers incredibly fast Internet access, among other things, starting in the 2.5G environment, with a rich and friendly user interface, and content designed specifically with mobility in mind. This winning combination has delivered proven results in user take rates, which is key for service providers, who also appreciate the time tested success and technological depth of the solution. 
Our long history with i-Mode in Japan has enabled us to help service providers across Europe differentiate themselves and grow data services revenues by enriching the mobile experience for their subscribers. And, for the content and applications provider – irrespective of whether they're Disney or a developer creating ring tones – the i-Mode handset is a compelling route to market, and anything that improves the user experience is good for both them and the actual network owner. We've already installed i-Mode systems in Russia, Greece, France, Italy and Spain and have just completed supplying the core infrastructure and terminals for O2's i-Mode launches in the UK and Ireland.

EC: Kevin, you mentioned Mobile TV as one specific application that NEC is involved in – along with the higher speed broadband radio technologies like HSDPA. What's your take on how the industry can deploy these new technologies to best advantage?
KB: There's a lot of talk out there about how the content owners, or retail-oriented third parties like Google, Amazon and eBay, might reduce the traditional telecommunications operators to utility status as lowest possible cost, bit-pipe carriers.
Yes, the industry is transforming itself and yes, competition is now coming from a variety of new entrants who have radically different business models from the traditional telecommunications mindset. That said, the content and applications are still only part of the total offering.
If you don't have the appropriate handsets and devices for each emerging market niche, or network technologies that guarantee seamless delivery across multiple network platforms, then all the creativity of the content developers will count for nothing.
Consumers are becoming increasingly sophisticated buyers and, while parts of the communications industry will always reflect fast-changing lifestyle and gadget choices, fashion can also be a very fickle master to follow. Customers have long memories, with many buying decisions increasingly being made on word of mouth disseminated around online communities. At the risk of repeating ourselves, this is where NEC's corporate characteristics of continuity and extremely high reliability play out very well, amongst both consumers and our network owner customers.

EC: What are your strategic priorities?
DP: Our strategy is three-fold: combining innovation and service excellence in unique and creative ways; exposing our major customers to more of NEC's diverse portfolio; and providing service providers with not just the right technology based solutions, but evolving value-add offers – such as professional services and end-to-end mobility offerings – that enhance their competitiveness and help them promote attractive data services and devices to their customers.
With that strategy comes an appreciation from NEC of the consumer experience and the cultural characteristics of local markets. You referred to Mobile TV, which is another good example of this. We are a leading vendor of digital TV transmission networks in Europe, and with the mobile standard (DVB-H) now ready, we can offer both terrestrial and mobile distribution via the same network.
The same is true for our experience as a leader in microwave radio, ensuring that all the new high bandwidth variants of existing technologies – such as HSDPA, HSUPA and WiMax – perform out in the field in Europe with the same consistency and quality of services that they've shown in the lab and in the market in Japan.

EC: Do you think that traditional telecommunications business cultures are up to promoting content and applications as well as basic connectivity – especially when it comes to CRM?
DP: Yes, generally speaking, I do. I think we are really starting to see a sea change in this respect. But what I find interesting is that while content, networks and applications all help drive customer loyalty, it's 3G and WiFi which have truly kicked the door open when it comes to opening up the world's markets to the next generation of data-based services, and it will be enhancers like HSDPA and IMS, which are major elements of our strategy, that help bring these services to fruition and create new opportunities for all.

EC: David, you've obviously got your own ideas about which directions NEC should be heading in over the coming months and years. Would you like to talk us through some of the changes that you intend making to the ways that NEC does business with its customers and partners?
DP: Sure. NEC intends to be more than just a high technology provider. We are well positioned to bundle our technologies, put more value-add services on top, and provide quantifiable business advantages to our customers in the B to B world from the CXO level on down and out into the consumer arena where the most important decisions are ultimately made. 
This will involve promoting our brand identity more proactively, building further operational synergies across our business units, and driving a greater level of cross-product training internally. Our people are already highly motivated and skilled professionals in their respective areas and they will increasingly promote more of the overall technology continuum in support of our customers.
I'll also be looking to build on the good feedback routes that we already have in place with our R&D operations in Japan and across the globe.
NEC can actively help the wider industry achieve its now inevitable transformation by continuing to lead in many of the vital background areas that underpin it. Supercomputing, client server and display technologies, as well as identity management solutions, all complement the communications environment and form part of a broader continuum. We believe companies that are adept at joining up these disparate elements and translating them into tangible business value will have the edge in the future. That, fortunately, has always been part of NEC's broader strategic vision.

External Links

NEC UK

    

