A new report published by the Information Security Forum (ISF) warns that the cost of complying with the Sarbanes-Oxley legislation is diverting spending away from addressing other security threats. The global not-for-profit organisation says that many of its members expect to spend more than $10m on information security controls for Sarbanes-Oxley. The business imperative to comply also means that, in many cases, the true cost of compliance is unknown.

With increasing concerns about compliance, the new ISF report provides a high-level overview of the Sarbanes-Oxley Act 2002 and examines how information security is affected by the requirement to comply. The report provides practical guidance to address problematic areas in the compliance process. According to the ISF, these problem areas include poor documentation, informal controls and use of spreadsheets, lack of clarity when dealing with outsource providers, and insufficient understanding of the internal workings of large business applications.
What's more, the Act ignores security areas that are extremely important when dealing with risks to information, such as business continuity and disaster recovery. This makes it vital to integrate compliance into an overall IT security and corporate governance strategy.
"In the wake of financial scandals like Enron and WorldCom, the Sarbanes-Oxley Act was designed to improve corporate governance and accountability but has proved difficult to interpret for information security professionals," says Andy Jones, ISF Consultant. "As neither the legislation nor the official guidance specifically mentions the words 'information security', the impact on security policy and the security controls that need to be put into place must be determined by each individual organisation in the context of their business.
"It is important that Sarbanes-Oxley does not push organisations into a compliance-based approach, rather than a risk-based approach, in a way that could compromise information security. The ISF report helps companies to achieve compliance while also ensuring that they have the appropriate security controls in place."
The full Sarbanes-Oxley report is one of the latest additions to the ISF library of over 200 research reports that are available free of charge to ISF Members.
Details: June Chambers, ISF, tel: + 44 20 7213 2867; e-mail:

High-tech industrial espionage is on the increase, but there are ways of preventing the 'baddies' from penetrating IT systems, as Calum Macleod explains

If you follow the news, you may have come across the recent story of spy software discovered at some of Israel's leading companies, which reads just like the spy stories we've been fascinated by for years. Indeed, if it weren't for the sacrifices of the likes of Roger Moore and Pierce Brosnan, who knows where we would be today.
But that would be to miss the point completely. Firstly, the imagined villains are in fact the victims. More importantly, the prevalence of spy software in Israeli companies came to light as a result of one of the most comprehensive computer crime investigations ever undertaken. The Trojan had been introduced by supplying companies with contaminated files, or by sending contaminated e-mail messages to them. Worryingly, these methods evaded all the security measures in place at the infected companies.
Today, our businesses depend on the exchange of electronic information with our business partners, but many of the mechanisms that are used still rely too much on the goodwill of those business partners, or the integrity of the systems that they use.
Two of the most commonly used security measures -- FTP and digitally signed e-mail, using technologies such as PGP -- are not really equipped to deal with this type of situation. In Israel, the e-mails were received from trusted business partners, so digital signatures on them would probably have been considered trustworthy. And where files were shared using systems such as FTP, the malware detectors in place were unable to identify anything inappropriate.
The bottom line is that we frequently depend too much on the trustworthiness of those we deal with, and unless we take appropriate measures to handle the type of eventualities described above, we are leaving ourselves vulnerable.
So are there measures we can take?
Here are a number of suggestions that might help:
1. Do not expose your internal networks to external parties. The process of transferring files in and out of the enterprise must be carried out without exposing and risking the internal network. No type of direct or indirect communication can be allowed between the partner and the enterprise network.
2. Ensure that the repository for files being moved back and forth is secure. While information is waiting to be retrieved by the enterprise or sent to the business partner, it must reside in a secure location. This is especially critical when the intermediary storage is located on an insecure network, such as the enterprise's DMZ, outsourced site, or even the Internet. Additionally you should take steps to define what format files will have, and to ensure that they can only be deposited if they are virus free.
3. The environment for exchanging data should be sterile. Encryption and other security mechanisms are not helpful if the security layers where the data is stored can be circumvented. Encryption is good for confidentiality, but does not protect data from intentional deletion or accidental modification. To build multi-layered security, a sterile environment must exist to accommodate and protect the security infrastructure. Creating one requires a single data access channel to the machine, with only a strict protocol that prohibits code from entering made available to remote users. Many file exchange technologies do not run in sterile environments: FTP servers, a common method, are frequently nothing more than applications running on insecure platforms.
4. Protect your data when it is at rest. The cornerstone of protecting stored data is encryption, which renders the data unreadable and thus maintains its confidentiality. But encryption that is onerous to manage is ineffective. Many organisations use a public/private key approach, yet this is generally considered impractical here because of the enormous effort needed to maintain such a system. A symmetric encryption model offers a manageable and effective way to secure the data.
5. Data must be protected from deletion and tampering. The protection of data by encryption is simply one part of the problem. Files may be accidentally or intentionally deleted or changed. Additionally you need to ensure that data cannot be tampered with.
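The tamper-detection idea in points 4 and 5 can be sketched with a keyed hash. Below is a minimal illustration in Python using the standard library's hmac module; the key and payload are hypothetical, and a real deployment would pair this with proper key management rather than a hard-coded secret:

```python
import hmac
import hashlib

SECRET_KEY = b"example-shared-key"  # hypothetical; in practice, held in a key store

def seal(data: bytes) -> bytes:
    """Return an HMAC tag that later proves the data was not altered."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(seal(data), tag)

payload = b"invoice batch 2005-07"
tag = seal(payload)

print(verify(payload, tag))          # True: file is untouched
print(verify(payload + b"x", tag))   # False: tampering detected
```

Any change to the stored file, accidental or malicious, invalidates the tag, which is exactly the property point 5 asks for.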
6. Ensure that you are able to audit and monitor all activities. Comprehensive auditing and monitoring capabilities are essential for security for several reasons. Firstly, it allows the enterprise to ensure that its policy is being carried out. Secondly, it provides the owner of the information with the ability to track the usage of its data. Thirdly, it is a major deterrent for potential abusers, knowing that tamper-proof auditing and monitoring can help in identification. Finally, it provides the security administrator with tools to examine the security infrastructure, verify its correct implementation and expose inadequate or unauthorised usage.
7. End-to-end network protection. Security must also be maintained while the data is being transported over the public network. The transfer process must itself be secure, and several factors influence the overall security of data transmission. As data transfer is an essential part of a larger business process, it is critical to be able to validate that this step was executed correctly. This requires the solution to provide auditing features, data integrity verification and guaranteed delivery options. Transmitted data should be automatically digitally signed, ensuring that delivery is complete and untampered with.
8. Performance is a major issue in many networks, especially when using the Internet where Service Levels are difficult to guarantee. When there are large volumes of data and the number of recipients is high, it is critical to ensure that performance is optimised. Compression should be deployed to reduce file size, and since network availability and reliability may disrupt the transfer process, automatic resume from the last successful checkpoint should also be a standard feature.
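The compression and checkpoint-resume behaviour described in point 8 can be sketched in a few lines of Python with the standard zlib module. The chunk size and the simulated drop point are arbitrary illustrations, not a real transfer protocol:

```python
import zlib

def compress_payload(data: bytes) -> bytes:
    """Compress before transfer to reduce volume on slow or unreliable links."""
    return zlib.compress(data, level=9)

def resume_transfer(blob: bytes, already_sent: int, chunk_size: int = 64):
    """Yield the remaining chunks, starting from the last successful checkpoint."""
    for offset in range(already_sent, len(blob), chunk_size):
        yield blob[offset:offset + chunk_size]

original = b"weather report " * 100
compressed = compress_payload(original)

# Suppose the link dropped after 10 bytes: resume from byte 10, not from byte 0.
remaining = b"".join(resume_transfer(compressed, already_sent=10))
received = compressed[:10] + remaining

print(zlib.decompress(received) == original)  # True: nothing lost across the resume
```

Highly repetitive business data compresses well, so both the bandwidth saving and the resume-from-checkpoint behaviour fall out of very little code.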
9. Ease of Integration with existing business processes. File transfer is usually part of a larger business process and needs to integrate seamlessly. This demands the ability to automate the file transfer process and thus integrate it with the existing business processes. In order to make this integration as simple and as seamless as possible, the file transfer solution must have an extremely flexible and diverse interface, providing transparent integration. This also minimises the amount of human intervention and, as a result, can improve overall security by reducing the possibility of tampering with your data.
But of course no one is interested enough in what your business is doing to waste a few minutes planting some spyware in your company. After all, it only happens in the movies!

Calum Macleod is the European Director for Cyber-Ark and can be contacted via tel: +31 40 2567 132;  e-mail:

All mobile operators now offer their subscribers a massive choice of content. The only thing that's missing is the users. The key to success is to push interactive content direct to the idle screens of subscribers, says Yossi Wellingstein

"Build it and they'll come" is a risky way to run a business -- and one that simply hasn't worked in the mobile content world. Today content discovery is one of the biggest problems mobile operators face, following the massive effort to build up huge content catalogues. Put simply, users don't know that so much content is waiting for them and, more importantly, don't know how to find it -- which explains why mobile data services still account for such a small proportion of operators' revenue, despite their best efforts to increase their share.
Operators around the world have realised that they can't wait for users any more. They have to bring content to customers -- and then make sure that it is easy to access. Anything too complicated, or requiring pro-active behaviour from the customer, just hasn't, doesn't and won't work. 
A new approach is needed to let subscribers know that content is available -- and to encourage them to click through. Not only to build data traffic so mobile operators can start to generate decent revenues to justify their infrastructure investments -- but also in this competitive market, to provide services that add value and differentiate one network from another.
One answer to the content discovery problem is incredibly obvious and already being used by a number of operators around the world.
The solution: push content teasers to the screen of a mobile phone when the phone is turned on but not being used. Make these teasers free to the customer. Ensure that they vanish after a few minutes and are replaced with new headlines. Make the content relevant to that user. And finally minimise the number of clicks needed to access additional information linked to a teaser. Research has proven over and over again: clicks kill revenue. The more customers have to navigate, the less likely they are to bother.
A mobile phone screen with no content on it is a major missed opportunity. And a dead screen seems acceptable to customers -- until they see what they could be getting: up-to-date information headlines for free, with an easy way to access additional information and services. Operators need to make the idle screen a dynamic palette of intriguing content, news, games, offers, polls and real-time updates.

Commercially proven

Active Content on the idle screen is a technology that has proved itself commercially. According to operators already using it, over 90 per cent of users keep the service once it is introduced, and over 30 per cent use it regularly -- an unprecedented take-up for content services.
VimpelCom, which operates as BeeLine GSM in Russia, faced a problem common to all mobile operators -- it had spent considerable sums building its data networks, and interesting content was available to its customers thanks to a series of partnerships with content providers. However, customers just weren't accessing content from their phones.
VimpelCom's research showed that users simply didn't know how to find the content -- and if they did it required far too much effort. And so, VimpelCom decided it needed to take a proactive approach. In April this year it launched its "Chameleon" service, using technology from Celltick, to broadcast live content directly to the phone screens of its subscribers.
The Chameleon service sends streams of free news headlines, sport reports, weather updates, music stories, gossip and games directly to mobile phone screens. Just like a screen saver, the messages appear silently only when the phone is not in use. 
Chameleon is very easy for customers to use and does not require any action on the part of the customer to start receiving the service. When they see a message that interests them and want to know more, they simply click on the OK button. A menu opens and presents various options. For example, a video news report or an automatic link to a WAP site or web page, or an SMS message with more information. A second click launches the desired service. And then the subscriber starts paying.
VimpelCom has invested in interesting and credible content, using brands such as MTV, Cosmopolitan and Playboy. It has created a countrywide team to run the broadcast operation, effectively working like a TV editorial team.
Chameleon targets all types of audiences, and can broadcast both to the entire subscriber base and to specific segments and locations, so content can be customised and localised to ensure that it is relevant and of interest to customers.
The results are staggering: in the first three months, more than seven million data transactions took place, with 50 per cent of enabled users reacting to the content on a regular basis.
Victor Markelov, Products Director at VimpelCom, said: "Our customers love Chameleon. We've finally found a way to provide them with an opportunity to use our data services actively. The easier the access to information and data services, the more often our customers use them".
Currently Chameleon is available to almost 3 million users, and VimpelCom plans to expand it to more than 10 million by the end of the year. 
From a business perspective, the key to a successful Active Content system is sending free teasers to a vast number of users, while keeping the cost of these teasers low. It's a numbers game. The operator relies on a certain percentage of users clicking through, and therefore needs to send teasers to as many users as possible. But how to send these teasers cost-effectively? Say there are one million users and the operator wants to send 100 daily teaser messages to each of them. Doing that with any kind of point-to-point delivery will bring the network's SMSCs to their knees or eat up the GPRS capacity. Since these messages are not paid for by customers, this is certainly not a good use of network resources.
There is a simple answer to this challenge -- enable mobile networks to send one message to many users simultaneously, or in other words, broadcast. Using broadcast, teasers can reach millions of users in real time without clogging the network and without using any commercial bandwidth at all. Since the broadcast capacity of the network is minuscule to begin with and allocated in advance, content teasers become virtually free. This makes Active Content particularly attractive, since the variable costs associated with delivering it to idle screens are close to zero. When users click on a teaser, the terms change, a paid-for session begins and the operator starts seeing revenue.
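The economics of broadcast versus point-to-point delivery are easy to check with the article's own illustrative figures (one million users, 100 teasers a day -- example numbers, not operator data):

```python
users = 1_000_000
teasers_per_user_per_day = 100

# Point-to-point: every teaser is a separate message through the SMSC,
# so the load scales with the size of the audience.
point_to_point_messages = users * teasers_per_user_per_day

# Broadcast: one transmission reaches every listening handset at once,
# so the message count is independent of the audience size.
broadcast_messages = teasers_per_user_per_day

print(point_to_point_messages)  # 100000000 -- a hundred million messages a day
print(broadcast_messages)       # 100 -- a hundred broadcasts a day
```

A million-fold difference in message volume is why the variable cost of idle-screen teasers can approach zero.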
Users love having content pushed to their screen because it provides free information, entertainment and timely updates with zero intrusion.
Operators love it because it increases traffic on data networks by solving the content discovery problem  -- and in today's competitive market is an effective brand differentiator.
So why are you still hiding your valuable content and services deep inside your portal? It's time to bring them to the front -- by activating the idle screen.

Yossi Wellingstein is CEO of Celltick and can be contacted via

If someone tells you that they know what's happening in the telecommunications industry these days, the chances are that they deserve a sharp poke with a pointy stick*. The world's networks were already being described in the 1970s as being the most complex machine that man had ever built, and the last few years have only increased that complexity by a few orders of magnitude.

Imagine a speeded-up film of the earth from space. One single solitary telegraph line first appeared in 1843, stretching all of forty miles from Washington to Baltimore. Over the following decades, that fragile link grew into a web of cables and repeater stations, reaching across continents and oceans. Fast forward a century and a half, and the earth is now covered by a dense mesh of fibre, radio, satellite and copper networks. Regions, once dark and silent, are being rapidly filled in as the world goes mobile, adding a second billion onto those souls already glued to their handsets and screens.
But geographic reach alone only tells one part of the story. As we shift towards an all-IP world, drawing content and applications into the traditionally two-dimensional signalling framework of telecommunications, we're starting to approach biological levels of complexity -- and that's going to demand some serious changes in the ways that we perceive and plan our industry.
Chatting recently with Digitalk, one of the companies now bringing SIP -- that increasingly ubiquitous protocol -- firmly into the PSTN and voice services world, I started getting flashbacks to biology classes at school and all those wonderfully intricate diagrams of metabolic pathways.
Now, for an individual who did a degree in zoology and psychology before falling into networking, the fact that I should look for biological metaphors shouldn't really be surprising. What is perhaps surprising is that respectable academia is also now taking this route, but under the catchy title of 'Complex Adaptive Systems'.
Partly growing out of chaos theory, that pop science favourite of a while ago, a growing number of researchers around the world are looking at the subject of network behaviours -- but across a multitude of different areas. Google on the Santa Fe Institute in New Mexico, and you'll find such apparently unrelated topics as economies, weather, embryological development and telecommunications coming together in fascinating ways.
All very interesting perhaps, but what's this got to do with profits in the increasingly Darwinian playground of a deregulated telecommunications industry?
It's simply that nature's been doing for billions of years what we've now been doing for only a hundred or so -- and there should be some good tricks we can pick up. I sometimes deliberately annoy engineer friends by asking them if an individual bee knows that it's actually building a larger hive as it works on its own individual little cell -- each a masterpiece of optimised design. After I've picked myself up off the floor, they do, however, usually get what I mean, especially when I explain why the use of the word 'ecosystem' in all those endless marketing presentations is inappropriate. Anything that kicks marketing departments usually goes down well with engineers, I've found...
In an ecosystem, everything eats everything else. In reality, the world of networks is becoming much more like a super colony of very simple organisms -- but with each starting to exchange the equivalent of genetic material with the others.
Consider the fraught relationship between content owners and network owners. Each needs the other -- but they're increasingly fighting the other for dominance of that growing space. What started off as an apparently symbiotic relationship now looks like moving to become parasitical for one of the parties.
Continuing the HR theme of my last column, we now need people who can rise above the raw silicon of the network and spot these shifts in power as they emerge. Talking recently with Daniel Osmer of telecom recruitment specialist Spectrum-EHCS, he made the interesting comment that "some of the best sales directors we've seen hold degrees in Psychology -- it's their understanding of human behaviour and decision making processes. The industry though is still dominated by very traditional, risk-averse hiring".

*Unless they're Keith Willetts, of course.

Alun Lewis is a telecommunications writer and consultant. He can be contacted via:

"An economic power shift from the US to Europe is now gaining steam and promises to have a far-reaching effect on the world technology sector," asserts Cutter Consortium Fellow Tom DeMarco -- a point vigorously debated by his colleagues on the Cutter Consortium Business Technology Trends Council.

The European Union's transition from marketplace arrangement to world superpower, and a possible new age for European IT, characterised by relatively frictionless commerce and exciting growth, is debated by the Cutter Business Technology Council along with Cutter Consortium Senior Consultants Tom Welsh (UK) and Borys Stokalski  (Poland) in the latest issue of Cutter Consortium's Business Technology Trends and Impacts Opinion.
According to DeMarco: "After decades of looking over its shoulder at Japan and the East, the US economy is fast being overtaken by someone else entirely: Europe. The colossus that has been assembled in what used to be called the Common Market has emerged as an economic superpower."
Cutter Consortium Fellow Lou Mazzucchelli counters: "The European Union is taking hold, albeit with fits and starts as evidenced by the results of recent constitutional referenda in France and the Netherlands. But I see little evidence of a massive shift of economic power from the US to the EU. Perhaps it is because the gap between them is small, relative to the gap between either the US or the EU and China or India or South America. The latter have so much more to gain, potentially at our expense." 
He continues: "It is unarguable that changes in Europe have an impact on the US and the world, but of all the forces working to shift economic power from the US, Europe may be the least threatening. Europe may have gotten a second wind as an economic power, but it seems unable to run a distance race against India and China."

Telecom operators across Western Europe are launching IPTV services in an effort to increase revenues and improve customer satisfaction for their broadband services. In a new study on IPTV services in Western Europe, IDC has found that the potential for success with IPTV services varies widely from country to country, depending on the penetration of existing pay TV services, the level of broadband competition, and the commitment of incumbent and leading competitive operators to investing in the network upgrades and content necessary for high-quality IPTV services.

IDC estimates that the market for IPTV services in Western Europe was worth $62 million in 2004, with less than one per cent of households subscribing to IPTV services. The market will boom over the next five years, growing from $262 million in 2005 to $2.5 billion in 2009, by which time six per cent of Western European households will subscribe to IPTV services. By then, IDC expects all European incumbents and a large portion of the major alternative tier 2 providers to offer IPTV services. DSL will be the most widely used platform, though a minority of households in a few countries will receive IPTV services over metro Ethernet connections.
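As a rough sanity check on IDC's forecast, the implied compound annual growth rate from $262 million in 2005 to $2.5 billion in 2009 can be computed directly; this is simple arithmetic on the report's published figures, not additional IDC data:

```python
start_value = 262    # $ million, 2005
end_value = 2_500    # $ million, 2009
years = 4            # 2005 to 2009

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1

print(f"Implied CAGR: {cagr:.0%}")  # roughly 76% a year
```

Sustaining growth of around 76 per cent a year for four years is what "the market will boom" amounts to in numbers.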
IDC's study Western European IPTV Forecast 2004-2009 says that in order to be successful, broadband operators will need to differentiate their service bundles from the video services already available in the market. The area where they will be able to do this is high-quality content underpinned with interactivity.

Worldwide mobile phone sales totalled 190.5 million units in the second quarter of 2005, up 21.6 per cent on the same period last year, according to Gartner, Inc. It was the second strongest quarter on record for the mobile phone market (in the fourth quarter of 2004 worldwide sales surpassed 195.3 million units).

Nokia and Motorola have strengthened their position in the marketplace, as the two companies accounted for 49.8 per cent of worldwide mobile phone sales in the second quarter of 2005. Nokia's market share grew 2.3 percentage points in the second quarter of 2005 to reach 31.9 per cent. "Nokia regained its top position in Latin America and stepped up to the third position in North America, benefiting from its successful launch with Virgin Mobile, which helped its lagging code division multiple access (CDMA) sales," says Hugues de la Vergne, principal analyst for mobile terminals research at Gartner, based in Dallas, Texas.
Motorola was the second best-selling vendor in Western Europe, a significant improvement on the same period last year, when it finished as the No. 5 vendor in the market. In North America, Motorola was the market leader with a 33.5 per cent share, while it was the No. 2 vendor in Latin America with 31.9 per cent of sales in the region.

According to a recent study by Evalueserve, The Impact of Skype on Telecom Operators, the European Telecom market is expected to be hit the hardest due to the fast and accelerating uptake of Skype, which is by far the most successful P2P (peer-to-peer) VoIP solution available around the world today.

Skype has revolutionised VoIP telephony by offering high quality voice transmission, by reducing the cost of Skype-to-Skype calls to zero, and by cutting Skype-to-fixed/mobile calls to a fraction of current long-distance rates. European operators are much more exposed because of the characteristics of European telecom markets, where calling and roaming rates, as well as the share of roaming calls, are higher, and where local calls are charged by the minute rather than via a flat monthly fee as in the US. Worldwide, the number of regular retail Skype users is likely to be between 140 and 245 million by 2008, the study reports.
The Evalueserve study further projects that incumbent telecom operators who combine fixed and mobile networks are likely to face a significant risk of permanent reduction in overall profitability by at least 22-26 per cent and reductions in revenue by 5-10 per cent as a direct impact of Skype by 2008.
At present, Skype has two million users in the US and 13 million users worldwide and the company claims 80,000 new subscribers daily.

A taste of TeleManagement World in Texas -- which takes place from 7-10 November

Telecoms is undergoing enormous and rapid change. Managing this change could mean the difference between capturing market share for new and innovative services or being left behind.
The TeleManagement Forum's US conference and expo, TeleManagement World, taking place November 7-10 in Dallas, Texas, will provide an in-depth analysis of the changing telecoms industry. It will address new entrants, new technologies and ideas from revolutionary thought leaders, readying operations and business support system professionals for the challenges that come with the innovation underway in the telecoms marketplace.
"Telecom competition is actually no longer about new technologies or even new services. It's about who can most effectively manage themselves, their networks and their customers, and which operators can become leanest, fastest," says TeleManagement Forum President Jim Warner.
According to Warner, the US Telecom market has not seen this much change in more than 20 years. Warner points out that there is massive consolidation taking place -- with Verizon buying MCI, SBC buying AT&T, and Sprint merging with Nextel.
"But there's also a lot of new players coming into the market, often with new technologies and capabilities," Warner says. "The fact is that if you can't manage your customers and give them the services they want, they're going to go elsewhere now that they've got choices."
TeleManagement World Dallas brings together operators, suppliers and industry experts to discuss, share examples and provide solutions for how to meet the challenges of aggressive time-to-market pressures and how to improve the customer experience to gain a competitive edge.
TeleManagement World is designed to be a "cram course" to help operators truly understand how to manage next-generation services and the networks and back office systems that will deliver them.
The conference will include a Telecom Innovation Summit on November 8 that will delve into some of the industry's most pressing issues. Industry panels will focus on managing the next wave of innovation and other hot topics such as new devices and services, municipal WiMAX networks and how operators can be 'lean' in light of Sarbanes-Oxley.
Six tracks being held November 9-10 will prepare operators and suppliers for their next move in the telecom industry, focusing on: business transformation and market readiness; how to optimize the next-generation network's potential; service delivery and content to provide the anytime, anywhere customer experience; revenue assurance and billing; NGOSS software and technology; and a deeper look at cable, IMS and IPTV.
Seven different Catalyst programs will serve as a living lab of innovation at TeleManagement World Dallas and will include two operator-driven initiatives, demonstrating how to solve complex operation and business support system issues.
The Catalyst programs will create scenarios that employ NGOSS concepts to manage mobile multimedia service management processes, simplify Sarbanes-Oxley compliance and manage Internet Protocol (IP) services, including voice over IP. Catalyst projects will also explore how to integrate OSS products from multiple vendors in a triple play scenario and will kick-start a new NGOSS Web Service Enablement initiative.
Fourteen different full or half-day training courses will be offered on Nov. 7 and Nov. 10 on topics such as NGOSS, the SID and the eTOM, as well as using OSSJ to implement NGOSS. Courses will also cover a BSS perspective of VoIP, billing and charging for data and content, and revenue assurance and billing in a convergent world. Attendees of the Monday courses will receive a copy of NGOSS Distilled, a book written by John Reilly and Martin Creaner.
TeleManagement World provides a networking, educational, and demonstration forum that will help attendees apply state-of-the-art advancements in telecom to their business to manage the innovation of the future.

For more details or to register visit:

With current technological forces driving the industry down the road of convergence, companies that are unable to respond quickly will suffer most, says Marc Kawam

Convergence, or the transition to Internet Protocol (IP), is what many people are calling a 'stealth revolution'. Slowly and invisibly to the majority of phone users, Europeans are changing the way they communicate with one another. Following on from the popularity of IP telephony in the US, where there are already 3 million residential users, we are now moving away from traditional circuit-routed networks to Internet-based voice calls. We are even seeing the emergence of triple-play services as providers like FastWeb and Homechoice bundle together voice, data, video and wireless to deliver a one-stop shop for all communications and entertainment needs via IP.

Convergence going mainstream

In the UK, BT's recent announcement to transform its telecommunications infrastructure into a pure IP-based network by 2009 is a clear indication of the commitment that large providers are making. And there are good reasons for it: as well as paving the way for a string of new hi-tech applications, its 21st century network (21CN) is expected to deliver cash savings of £1bn a year to BT by 2009. But are established providers up to the challenge of migrating to IP and can they really compete with innovative companies like Skype and Vonage that are rewriting the rules and being heavily backed by deep-pocketed investors?
In the fast-paced telecoms sector, responsiveness and speed-to-market are all-important. Whether it is speed to respond to competition, speed to innovate, or speed to respond to customer requests, responsiveness is the major challenge companies face today. Take almost any CSP offering and track it over the years: it quickly becomes evident that what was considered a rapid service roll-out a few years back would be considered exceptionally slow by today's standards.
As deregulation and falling barriers to entry open the market up to a new breed of innovative companies fighting for a slice of this evolving market, customers are being freed from the tyranny of the incumbent. Free to choose from an increasing array of communications services providers (CSPs), customers are increasingly demanding more for less. Not only do they want advanced IP-enabled services, they want to pay less as well. If that wasn't enough, CSPs also have to accommodate the expectations of the 'Internet Generation', who are used to getting what they want when they want it. Some CSPs believe their customers are now shifting to a real-time provision mind-frame where they can instantly obtain the services they desire.

Agility is key

Stuck with hundreds of IT legacy systems dating from the days of circuit telephony, many companies fight with a myriad of disjointed IT systems, and lack a clear vision of their future technological needs. Even some of the established players lack the internal systems and related flexibility to handle demand for next generation services where the customer is in control. According to Insead's agility index, which measures the ability to roll out and provision innovative IP-based services (such as broadband, VoIP and IP-TV) rapidly, efficiently and to a high quality to new market segments, CSPs in Europe have, on average, reached only 60 per cent of their potential agility. In addition, whereas CSPs are now, on average, able to bring services to market 2.5 times faster than five years ago, the time customers are prepared to wait for new services to be delivered has shrunk 2.8-fold.
Clearly there is a gap between the speed and flexibility required to compete in the fast-paced telecoms industry, and the IT readiness of these organisations to respond to this need. With competition in Europe being fierce, agility becomes an imperative for business success. So what can they do? How can they move to the nirvana of real-time service provisioning, as their customers often demand? How can this be achieved for the complex bundles of IP-based products/services (including content services) that are delivered off multiple vendor platforms, and more often than not owned by diverse business entities?
Insead's research report identifies self-provisioning/self-management and integration with partners' systems for more seamless order mediation as key areas for improvement. The latter becomes increasingly important as service providers begin offering aggregated services, where a subset of the services is actually supplied by a partner entity. Two-way communication of service order information, such as transmitting service requests, receiving acknowledgements, and confirming service availability between partner service provider entities, will increasingly need to be automated. Unstructured manual processes for mediating orders will not scale and will not allow CSPs to meet the demands of their consumers.
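The automated two-way order mediation described above can be sketched as a small state machine. The states, message names and transitions below are purely hypothetical for illustration, not taken from any real OSS interface:

```python
# Hypothetical sketch of automated order mediation between a CSP and a
# partner provider: a service order moves through a fixed lifecycle as
# structured messages arrive, replacing unstructured manual hand-offs.
VALID_TRANSITIONS = {
    ("submitted", "ack"): "acknowledged",
    ("acknowledged", "availability_confirmed"): "confirmed",
    ("acknowledged", "availability_rejected"): "rejected",
}

def advance(state, message):
    """Advance a partner service order, rejecting out-of-sequence messages."""
    try:
        return VALID_TRANSITIONS[(state, message)]
    except KeyError:
        raise ValueError(f"message {message!r} not valid in state {state!r}")

state = "submitted"
state = advance(state, "ack")                     # partner acknowledges the request
state = advance(state, "availability_confirmed")  # partner confirms availability
print(state)                                      # confirmed
```

Because every transition is explicit, an out-of-sequence message fails loudly instead of being silently mis-handled, which is the property a manual process cannot guarantee at scale.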

How to cope with change

Service providers need to bridge the 'automation gap' between customer-facing software applications and the service delivery partners for a variety of rapidly changing and evolving services. Next Generation Operations Support Systems (NGOSS) can bridge the gap. They allow CSPs not only to handle the vast array of new services that customers are demanding today, but also to support the future IP-based services that service providers will need to roll out to retain and grow their customer base. These solutions are often built on open technologies such as J2EE, EJB, JMS and XML. Standards such as the TeleManagement Forum's SID and OSS Through Java Initiative (OSS/J) APIs promise to facilitate the development of component-based OSS, where CSPs can mix and match systems as needed to respond quickly to customer demands or new service needs.

Crucial part of OSS business

Standards are a crucial part of the OSS business: there are currently over 2,000 commercially available OSS applications on the market, and combined with applications that service providers have created themselves, the number rises to almost 5,000. Most of these solutions, however, do not provide all the components needed for widespread adoption, and the integration challenge for these applications is immense. Interfaces between systems developed using standards can hide the complexities of each individual system's functionality and allow, for example, flow-through service provisioning without the inefficiencies of manual processes or manual data transfer. This is one key and necessary part of the equation: the technological piece.
Beyond that it requires management's understanding of the IT needs and priorities, and an appreciation of the impact that low IT agility can have on the success of the organisation. Responsiveness, speed to market and quality of service are the key ingredients to crack the European market. It is a long way to the top, and most CSPs have a fair way to go yet, but one thing is clear: those who manage to climb up the agility index ladder will have the technological ability to keep up with the accelerating pace of the CSP industry; the rest may have to pay the price for being too slow.                 

Marc Kawam is Managing Director, Europe, CEON Corporation. tel: +33 153 536766; e-mail:

European Communications highlights some of the upcoming events focussing on broadband technology

The Broadband World Forum Europe,

The Broadband World Forum Europe comes to Madrid, Spain, 3 - 6 October, bringing together key executives working in carrier and supplier companies from around the world, to join today's thought-leaders from such organisations as Telefónica, France Telecom, Belgacom, BT, Telecom Italia, KT, NTT, T-Com, and Swiss Telecom.
In all, the Broadband World Forum Europe features 80+ exhibiting companies, 150+ high-level speakers, 40+ workshops and conference sessions, Plenary Panels, Keynote Address, Hot Seat Sessions and more.
Attendees will have the opportunity to learn the business models, deployment strategies, and rollout practices that have proven successful in making mass-market broadband in Europe a reality. The event features the latest perspectives and real-world experiences from leading executives through four days of comprehensive educational programming, complemented by a cutting-edge technology exhibition and high-tech demo pavilion, offering attendees a firsthand look at the latest broadband technologies, equipment, applications, solutions, and services.
New at this year's World Forum, the co-located WiMAX Global ComForum tackles technology and business challenges in implementing WiMAX wireless broadband networks for both fixed and wireless access and mobile broadband.
The InfoVision Award program will also take place at Broadband World Forum Europe 2005. The InfoVision Awards recognize technologies, applications, products, advances, and services. The awards also honour corporations and individuals for innovative contributions and developments that have proven important to society.
Hands-On Technology Pavilions include: WiMAX Demo Pavilion, CableNet Demo Pavilion, VoIP Public Phone Booth, and a Video Demo Pavilion.
Broadband World Forum Europe conference chairman, Julio Linares Lopez, executive chairman of Telefonica Espana, will welcome attendees during the Monday morning Opening Ceremony Address, and Luis Ezcurra, general manager of marketing and market development at Telefonica Moviles Espana, will deliver the Wednesday keynote address, The Challenge of UMTS.
John Janowiak, senior director of the IEC, notes: "Telefonica's commitment to broadband in the market, as proven in their mass deployment of ADSL, is truly impressive. Their expertise in voice and data systems will provide valuable experience to conference attendees."
Linares Lopez adds: "The topical platform coupled with the level of executive speakers and conferees brings a unique flair to the programme. The shared international knowledge will guarantee great success for this Forum."
Other key speakers at the event include: Edward Deng, Senior Vice President, Head of Global Marketing, Huawei Technologies; Alan Mottram, President, Fixed Solutions Division, Alcatel; Phil Corman, Director, Worldwide Partner Development, Microsoft TV Group; Paul Berriman, Head of Strategic Market Development, PCCW Limited; and Jean-Philippe Vanot, Executive Vice President, Networks, Carriers, and IT, France Telecom.

The 2nd Broadband Russia and CIS Summit 2005, 31 October - 1 November, Moscow

Supported by Leonid Reiman, Russia's Minister of Information Technology and Telecommunications, the event will take place from 31 October to 1 November 2005 at the Marriott Grand Hotel in Moscow, Russia.
This annual event has established itself as the largest forum for IT and Telecom industries in Russia and its neighbouring countries. Indeed, the previous Broadband Russia and CIS Summit attracted Government ministers and senior representatives from the public and private sectors, along with more than 200 international participants.
It is now established as a unique platform to pursue business opportunities in Russia and CIS countries in the ICT Broadband, Wireless, Satellite, Cellular, Content, Cable and Broadcasting, and 3G Technology markets.
Commenting on the event, Leonid Reiman notes: "The development of broadband communication networks is one of the top priority tasks of the communication and information industry in Russia. We believe that broadband multimedia communication networks will become the basis for next generation networks, accelerate information and Internet development, and create a civilised, investment-attractive market environment for operators, producers and suppliers of equipment.
"I am confident that this Conference will become an important step in the development of prospective communication networks in Russia and outline specific measures for their implementation."
Conference themes reflect a wide spectrum of topic areas, and include:
*  Regional and international overview of key ICT broadband investment opportunities in Russia and the CIS
*  Overview of commercial and residential trends, demand and penetration rates for potential broadband access
*  Introducing commercial and residential broadband wireless networks and technologies to Russia and the CIS
*  Overview of financial, legal and fiscal framework for investment into Russia's broadband sector and the CIS.
Sponsors for the event include Motorola, Rostelecom, New Skies Satellites, and ViaSat.  Supporting Organisations include: CWTA, WiMAX Forum, Georgian National Communications Commission, Global VSAT Forum, ESOA, Enforta, and J'son & Partner.

Broadband Europe, 12-14 December, Bordeaux

Broadband Europe aims to be the forum that unites academia and industry in discussion and display of the latest (and future) broadband components, products, systems, and services. The event was established in response to the overwhelming demand from the European broadband community for an annual gathering within which to present key developments.
Broadband has, over the past few years, enjoyed one of the most rapid rates of growth and adoption ever seen in electronics or communications, even faster than mobile communications.
Introduction of broadband entails multi-disciplinary aspects (technology, socio-economic, regulatory, content delivery, security and standardisation). This multi-disciplinary character of broadband is recognised in the BREAD project (Broadband for All in Europe: a multi-disciplinary approach), a Co-ordination Action started on 01/01/04 within the "Broadband for All" strategic objective of the European Commission's FP6 framework.
The BREAD project has, amongst its objectives, to develop a holistic vision encompassing technical as well as economic and regulatory aspects. Another important aim is to identify roadblocks at European, national and regional levels, and to share visions and best practices at both national and EU level.
In December 2004, the BREAD-project initiated Broadband Europe, bringing together on an international level all the broadband players, researchers, service providers, content providers, operators, manufacturers, policy makers, standardisation bodies, and professional organisations.
As well as an extensive exhibition, and additional visitor attractions such as the Application Village -- whose theme for 2005 is Broadband Gaming -- Broadband Europe offers an authoritative conference agenda, featuring exclusive scientific presentations, broadband project research results, semi-scientific commercial presentations (including agencies, municipalities on trials and roll-outs on broadband) and commercial presentations from companies on broadband products and services. 
Speakers at the 2005 conference will include representatives from the European Commission, government agencies, standardisation bodies, professional organisations, industry (world-wide), R&D centres and academia.
To ensure the conference addresses the most current issues, with the highest standard of presentations, a committee of specialists has been formed and meets regularly to agree topics and adjudicate papers for inclusion in the programme.

Alan Robinson explains the technical concept of IPTV and why testing a service properly could mean the difference between happy and disgruntled viewers

By now, if you haven't heard of Triple Play or IPTV, there's a fair chance you've been living under a rock. Every major and almost every minor service provider seems to have either announced plans for IPTV deployments or is in the process of drawing them up.
With the average subscriber to a digital satellite service spending over €500 a year, it's easy to see why the traditional wireline operators are anxious to get revenue figures like these from their existing subscriber base. In addition, video-on-demand and pay-per-view services can as much as double these average revenue figures. So the financial rewards are obviously there. High-speed Internet connections, by contrast, have an appeal largely limited to the technically literate, so there is a limit to how many broadband Internet products operators can sell. Television, on the other hand, has a far wider, almost universal appeal, so the market is much larger.
But before service providers' eyes glaze over and they start drooling thinking about the revenue, there are quite a number of technical difficulties to resolve before the money starts rolling in. And to underline that, there have been a number of widely publicised trials that failed to turn into deployments, as well as deployment delays of over twelve months. These have been very costly, not only in terms of loss of face but also in crude financial terms.
So what can go wrong? Well, put simply, lots of things. Firstly, the DSL model used by operators is one of high-bandwidth, which is good for IPTV but with the downside of contention (ie, lots of users all using the same bandwidth). Additionally, very little about Quality of Service is mentioned in terms of broadband products and for a good reason; usually, there isn't any. IP is inherently unreliable but TCP helpfully handles problems like mis-ordered or missing packets, while latency and jitter aren't usually obvious to a web, e-mail or P2P user. But for an IPTV user, the result can be detrimental to their viewing enjoyment.
We're also a pretty tolerant bunch when it comes to IP over DSL. If our web page is a little sluggish, we don't really get worked up about it but we have zero tolerance when we're watching TV. With the advent of cable TV systems, digital satellites, DVDs, etc, we expect our viewing to be, well, picture perfect. Small artefacts or frozen frames are simply taboo.
Because MPEG frames (the encoding scheme for most IPTV solutions) are larger than Ethernet frames, losing a single Ethernet packet may mean that none of the MPEG frame can be displayed. It may be part of an advert that isn't displayed but what if it's the split second that decides if the winning goal was onside or offside? You can wear out your viewers' patience pretty quickly in those cases.
And just in case you thought things couldn't get worse, research has shown that most people don't report technical problems with their viewing enjoyment; they simply lose patience and churn to another provider and that's a very expensive event for the operators. They've spent considerable money getting the subscriber added to their service in the first place and now that investment's wasted.
Most analysts would agree that the key thing for operators to do in order to keep revenue high is to keep churn rates down. There will always be a residual number of users that churn but, outside of this group, the quality of experience a user receives is a key deciding factor in staying with the service or migrating to another supplier. One point that absolutely can't be stressed too much is to ensure that the test methodology is correct. Up to now, telecommunications equipment has typically been tested on the blast basis: fire in wire-rate 64-byte packets and if they all get through to the far side, then you're good to go. This is far too simplistic for the modern network, where users are only interested in how their real-world applications (web, e-mail, IPTV, VoIP) run on the network, and don't care how well contrived data sets that only ever occur in labs perform on it. So make sure you're running real-world data. Don't simulate the traffic; emulate the traffic, and there's a key difference here.
Another drawback to earlier testing methodologies has been the generalisation problem. Because it's difficult to measure large numbers of statistics from different endpoints simultaneously -- and also to visually represent these figures in such a way that it's easy to see where problems lie -- the temptation has been to take one large set of measurements and report an average. Other than not testing at all, nothing could carry a greater risk of encouraging churn. The problem is that if there are relatively few data points that indicate poor service, then these get lost in the overall noise of the good service. The danger is, and we've seen this in 'real' deployments, that there is always a group of users that get poor service. In other words, the lowest ten per cent of users get blocky video, frozen frames or too lengthy a delay when changing channels. These users churn and are replaced by other users formerly from the 'good' group. The operator's monthly churn rate is high and no one knows why.
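That averaging trap is easy to demonstrate with synthetic numbers. The figures below are purely illustrative: ninety viewers with a snappy 0.3-second channel change and ten with a sluggish three seconds.

```python
# Illustrative (synthetic) per-viewer channel-change times, in seconds:
# 90% of viewers are well served, 10% are not.
zap_times = [0.3] * 90 + [3.0] * 10

mean = sum(zap_times) / len(zap_times)                  # looks acceptable
worst_decile = sorted(zap_times)[int(len(zap_times) * 0.9):]

print(f"mean zap time: {mean:.2f}s")                    # 0.57s
print(f"worst decile:  {worst_decile[0]:.1f}s")         # 3.0s
```

The mean of 0.57 seconds would pass most acceptance thresholds, while the per-viewer view shows a tenth of the audience waiting three seconds on every zap -- exactly the group most likely to churn.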
In the remainder of this article, we'll take a look at the types of things that can go wrong and how testing can make sure that they stay in the lab and not in the network. But first it's probably worthwhile familiarising ourselves with the type of traffic that we can expect on IPTV-enabled networks.
IPTV is sent over IGMP in most networks. IGMP (Internet Group Management Protocol) allows for multicasting (point to multi-point). Put very simply, a server (in this case a Video Server) transmits each separate TV channel as a single multi-cast stream with a special type of destination IP address. If a viewer wants to receive this stream, it sends an IGMP 'Join' message containing this special destination IP address. The network infrastructure then directs a copy of this stream to the user's Set-Top Box (STB). This is a very efficient use of bandwidth, because it doesn't matter how many 'viewers' of this stream there are; a single copy is sent through the infrastructure and this is split only where it needs to be split into multiple copies. The network infrastructure effectively also keeps a count of the number of viewers. As viewers issue IGMP 'leave' messages (ie, they change channels and no longer view this one), the count is decremented. If it reaches zero, then this portion of the IPTV network can elect not to split and transfer the packets, thereby further reducing bandwidth requirements. Compare this with a unicast (point to point) mechanism. For each viewer, there would be a separate connection to the Video Server. This simply wouldn't scale.
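The viewer-count bookkeeping described above can be sketched in a few lines. This is a toy model of one network element's forwarding decision, not a real IGMP implementation; the class name and group addresses are illustrative.

```python
# Toy model of how a network element tracks viewers per multicast
# group and replicates a stream only while someone is watching.
class MulticastPort:
    def __init__(self):
        self.viewers = {}                # group address -> viewer count

    def join(self, group):               # IGMP 'Join' received
        self.viewers[group] = self.viewers.get(group, 0) + 1

    def leave(self, group):              # IGMP 'Leave' received
        if self.viewers.get(group, 0) > 0:
            self.viewers[group] -= 1

    def forwards(self, group):           # split the stream only if watched
        return self.viewers.get(group, 0) > 0

port = MulticastPort()
port.join("239.1.1.1")                   # first viewer tunes in
port.join("239.1.1.1")                   # second viewer, same channel
port.leave("239.1.1.1")
print(port.forwards("239.1.1.1"))        # True -- one viewer remains
port.leave("239.1.1.1")
print(port.forwards("239.1.1.1"))        # False -- count hit zero, stop forwarding
```

Note that the stream cost is the same whether one viewer or one thousand are downstream of this port, which is the scaling advantage over unicast the article describes.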
When a viewer wants to change TV channel, their STB issues an IGMP 'Leave' message for the TV channel it no longer wants to receive and a 'Join' message for the TV channel it has now switched to.
Usually, video is transmitted as an MPEG-2 stream. (More recent advances in compression have resulted in MPEG-4 being more widely deployed and this trend will likely continue.) An MPEG-2 stream consists of a number of different types of frames. Greatly simplifying, in order to provide compression, each frame is effectively a compressed difference to, or delta of, the previous frame. However, if the initial frame or one of the deltas was lost for any reason, it would be impossible to decode the picture. In order to overcome this, MPEG-2 calls for an 'I-frame' to be inserted at regular intervals (usually about 15 frames or less apart). This is a full picture with no dependencies on any previously received (and potentially lost) frames. If a user 'joins' the stream, then he has to wait for the next I-frame before a picture can be rendered on the television. MPEG usually is encoded at between 25 and 30 frames per second, so it could take up to half a second before the TV can display an image. Using a larger number of incremental frames between I-frames reduces the bandwidth required to send a video stream, but this has the disadvantage that it also implies the potentially longer time it can take to change channels or more generally, for the stream to recover its integrity (ie, display a perfect picture) when an error has occurred.
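The channel-change arithmetic above is simple enough to capture in a helper. In the worst case, a viewer joins the stream just after an I-frame has gone by and must wait a full group of pictures for the next one:

```python
def worst_case_iframe_wait(gop_frames, fps):
    """Worst-case wait (seconds) for a decodable picture after joining:
    the viewer just missed an I-frame and must sit through a whole
    group of pictures before the next one arrives."""
    return gop_frames / fps

# The article's figures: an I-frame every 15 frames at 30 fps
print(worst_case_iframe_wait(15, 30))    # 0.5 seconds
# Stretching the GOP saves bandwidth but lengthens channel changes:
print(worst_case_iframe_wait(30, 25))    # 1.2 seconds
```

This makes the bandwidth/zap-time trade-off concrete: doubling the I-frame spacing roughly doubles the worst-case channel-change delay at the same frame rate.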
Because the size of the MPEG frames is much larger than Ethernet packets, a single MPEG frame has to be carried in multiple Ethernet packets. If one of these is lost, then the MPEG frame may not decompress correctly or fully. Subsequent frames depend on this frame until the next I-frame is received, so it's clear that a missing, corrupt or mis-ordered frame can have far-reaching consequences.
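The knock-on effect of a single lost packet can be estimated in the same spirit. Under the simplified GOP model above, where every delta frame depends on its predecessor until the next I-frame resets the picture:

```python
def frames_affected_by_loss(damaged_frame, gop_frames):
    """Frames that cannot decode cleanly when one frame in the GOP is
    damaged: the damaged frame itself plus every dependent delta frame
    up to (but not including) the next I-frame. Frame 0 is the I-frame."""
    return gop_frames - damaged_frame

# Lose a packet in the frame right after the I-frame (15-frame GOP):
print(frames_affected_by_loss(1, 15))    # 14 frames -- roughly half a second
# Lose one just before the next I-frame:
print(frames_affected_by_loss(14, 15))   # 1 frame -- a barely visible glitch
```

The same single-packet loss thus ranges from an invisible blip to half a second of corrupted video depending on where in the GOP it lands, which is why measuring raw packet loss alone says little about the viewer's experience.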
Traditional testing involves 'blasting' the system with packets. This isn't sufficient for testing IPTV systems. Packet loss can be measured by these techniques but not the effect of the packet loss, as some packets' loss may be more acutely felt than others. One of the most common reasons for loss in these types of networks is 'interference' from other IP traffic. If the kids are upstairs gaming or downloading MP3s, the detrimental effect on the quality that can be obtained on the television downstairs is difficult to overestimate. Modern test solutions give access to real live traffic (like web, P2P or e-mail) that can be mixed with IPTV to see if any losses occur as the traditional IP traffic's volume is increased.

Zap rate

We've already seen that it takes some time for a valid picture to be seen on the television when an IGMP stream is 'joined'. In addition to this, we've talked about the network infrastructure only having to split and send streams that one or more STBs are watching. So, if there are no current viewers for that stream, then potentially the network may have to go a number of hops before it can find a place where the split can take place. The zap rate measures how long it takes to change channels on the TV and have a valid picture to watch. It's surprising how critical a measure it is for the viewing public. With the growth in the number of channels that are provided in even the most basic of packaged offerings, perhaps it's not too difficult to understand why. Zapping through the channels looking for something worth watching is a common enough occurrence in most homes, and the longer it takes for each channel to change dictates how long it takes for us to find something more interesting to watch.
Modern test systems allow the user to measure the zap rate by joining and leaving channels in a controlled manner and collating the statistics for each individual viewer over long periods of time. Analysing all of this data will allow the tester to determine if any viewer's 'zap rate' went outside of acceptable bounds.
Suppose there's a big match on. The half-time whistle blows -- what do we do? Lots of people start changing channel. They may want to see something more interesting than the usual collection of talking heads offering half-time punditry, or catch the scores from another match on another channel. Whatever the reason, the number who switch channels at this point equates to a huge spike. This can stress an IPTV network terribly as it goes from a steady state of long-term viewing to a huge series of changes. A modern test system can allow the operator to generate these half-time scenarios, where a steady state is interrupted with large numbers of channel request changes to stress the infrastructure. Again, it's vital to measure the effect on a per-user basis or the bad service effects can be averaged out and lost.
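The size of a half-time spike is worth putting rough numbers on. The figures below are assumptions for illustration (10,000 subscribers, two per cent zapping per minute in steady state, 40 per cent switching within ten seconds of the whistle), not measurements:

```python
# Back-of-the-envelope sizing of a 'half-time' channel-change spike.
# All input figures are illustrative assumptions, not measured data.
viewers = 10_000
steady_zaps_per_min = viewers * 0.02      # assume ~2% change channel per minute
halftime_fraction = 0.40                  # assume 40% switch within ~10 seconds

steady_rate = steady_zaps_per_min / 60    # leave/join pairs per second
spike_rate = viewers * halftime_fraction / 10

print(f"steady state: ~{steady_rate:.1f} channel changes/s")
print(f"half-time:    ~{spike_rate:.0f} channel changes/s")
```

Even with these modest assumptions the signalling load jumps by roughly two orders of magnitude for a few seconds, which is exactly the burst a test system needs to reproduce while still recording results per viewer.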
These are just a few of the major 'must-check' problems that should appear on any operator's checklist before deploying. Unfortunately, there are quite a few more, and we haven't even touched on perceptual video quality or, more worryingly, the whole new raft of security issues and problems that have been created by IPTV. It's probably fair to say that IGMP was built with little or no thought put into security issues, the potential for fraud or Denial of Service threats. That's a whole other area of testing that only now is beginning to take place, so get testing and don't leave it to the viewers!

Alan Robinson is Chief Executive and co-founder of Shenick Network Systems. He can be contacted via tel: +353-1-236 7002
