Features

While RFID is one of those technologies with a potentially sinister side, it has been proven in the field as an important tool for companies that need to track their assets and reduce costs. Malavika Srinath looks at how the technology is developing

Every organisation in existence is a victim of technology. To be 'high-tech', however, requires a lot of positive thinking. It is an expensive label to earn, and few are willing to take the risk of investing in a technology that hasn't yet proved its worth. Yet how can worth be proved unless the technology is put to use? This question has remained unchanged for decades.
And one more victim of the market's 'chicken-and-egg' view on technology is RFID.
So far, big retail giants like Wal-Mart have spear-headed the pro-RFID movement by issuing mandates to their suppliers, resulting in half-hearted attempts by companies to introduce this technology into their operations. But with new packages on the market introduced by ERP software vendors like SAP and Sun Microsystems, RFID has become infinitely more attractive.
Why? Simply because now, thousands of customers the world over can adopt these packages almost seamlessly. Therefore, a technology that was previously "too expensive" now doesn't seem such a waste of money and resources after all.

What is all this fuss about RFID?

The technical process of RFID mystifies many. But in reality, the principle behind RFID reminds me of biology lessons at school. In essence, the RFID scanner is like a bat, sending out radio waves to track the location of a truck or pallet. The tag embedded in the object sends a message back to the 'bat', detailing its location. This message zips back to the system, to be read by operators and line managers.
Simplistic as this may sound, RFID can save a company millions of pounds in wastage and loss of goods. The technology was initially used to track livestock on farms, but for the companies that have been brave enough to extend this system to their business operations, the results have been phenomenal. Tighter inventory control, time saving and increased transparency in operations along the supply chain are just a few examples of the benefits that they have seen with the use of this technology.
Given the nature of the RFID mandates, the earliest adopters of RFID were in the retail supply chain. It is estimated that spending to track retail products alone will grow from $91 million in 2004 to $1.3 billion by 2008. Within retail, perhaps the sector that benefits most from the use of RFID is food retail, especially in cases of food and drug safety and product re-calls.
Undoubtedly, the market is growing more positive about the technology but companies still need convincing that they can generate tangible business value through integrating RFID in their business operations. No one wants to be first to act and the atmosphere is one of "let's wait and watch".
The apprehension behind the adoption of RFID is not unjustified. Costs are high now, and benefits are not fully visible. The possibilities of misuse and breach of security seem more plausible at this point, especially to a company that does not have funds to invest in hi-tech systems.
In addition, as with any technology, there are always drawbacks: it is expensive, it can be inaccurate due to external elements, and scanners can only read tags from particular distances.
Also, once companies have seen the benefit of this technology, there is a possibility of it being used in areas outside supply chain management. It is scary to think that these scanners could make their way into daily life, intruding on privacy and scanning private details -- for example, the contents of one's handbag when inside a store.
But realistically, given the cost of a single tag, RFID is most likely to remain in the back room well into the future. It needs to be viewed, therefore, merely as an opportunity to make operations faster and more efficient -- and certainly a way to keep customers happy.
It is therefore critical that software providers are attuned to changes in market requirements.
For most vendor companies, whose yearly business objective includes the introduction of new functionality and packages, software innovation can make or break yearly revenues.
The question really is: what do customers want?

The next logical step

Many customers who are already on enterprise resource planning (ERP) software packages have been interested in RFID for some years now, either because they are proactive or merely because of compliance. Software vendors have simply tapped into this burgeoning need for faster, more efficient systems.
The result: new software packages, encapsulating the RFID functionality. Easy usage, effective results.
Several attempts were made in the past to simplify the integration of data collected via RFID into other software applications. With these new packages, one of the biggest hurdles -- managing the large amounts of data generated by tags -- can be done away with.
Packages were initially offered to pilot-test customers, then opened to the entire market in mid-2004, and the early adopters are already seeing results.
Focussing on unique industry demands, vendors encourage customers to capitalise on the competitive advantages offered by their respective packages. For example, SAP even helps its customers implement RFID in a step-by-step approach, making the whole process less daunting for the customer. So far, things are going right and the industry seems to have delivered to its eager customers.
So what's the down-side?
Customers across the various ERP systems run more than 90,000 installations between them. Will they all be able to integrate RFID into their operations with equal ease? Naturally not. This means that customers will have to invest further in their systems in order to run these new RFID packages. Although the value of adopting RFID cannot be denied, it could be years before companies see tangible benefits.
In addition, there are other systems on the market, like RFID-enabled warehouse management and asset tracking, that could offer very similar, if less elaborate, RFID functionality. These are even likely to be marginally cheaper, given that more and more software providers are joining the RFID race.

Heavyweight vendors

With so many software vendors climbing into the sand-pit, the tug of war for the top spot continues as heavyweight vendors like SAP and Sun Microsystems release RFID packages within weeks of each other.
Despite being in a fiercely competitive market, the bigger players in the industry seem confident that they won't be losing customers because of their prices. In the end, the belief is that the benefits of using RFID will outweigh all the initial costs of implementation.
Interestingly, companies like Kimberly-Clark (an early adopter of RFID due to the Wal-Mart mandate) are starting to play major roles in helping to push the concept of RFID further, while cutting back on their own costs. This is reflected in Kimberly-Clark's suggestion that SAP and OAT work together to create a productised interface to ensure that companies would not be required to develop their own custom software, thereby reducing costs.
The market is highly likely to see more and more of this kind of initiative, as customers in industries such as high-tech, retail and pharmaceutical become increasingly convinced that RFID is the way forward.

A market set to grow

It is not at all surprising that, despite tag prices currently averaging 50 cents, the market for RFID is set to grow. Eventually, the costs saved through RFID will have an impact on the price of tags, and by 2007 businesses could well be shelling out over $260 million for the integration of RFID into their systems.
With the introduction of their new RFID packages, bigger vendors like Oracle, SAP and Sun Microsystems have already bagged themselves a fair share of this future revenue. It is unfortunate that smaller software providers like RedPrairie and HID Corporation, which began deploying RFID solutions several years ago, may have missed out on this pot of gold. The key factor governing the success of technologies like RFID is the timing of their introduction: the market is unlikely to be receptive to technology when it is not ready for it. In biding their time, the bigger players have scored.
Increasingly, it is starting to look as if RFID is the key to winning the race for technology.

Frost & Sullivan is currently carrying out strategic research in the RFID market with a specific focus on logistics. For more information or to give us feedback on this article, please contact the author at malavika.srinath@frost.com

Malavika Srinath is a Research Analyst, Logistics and Supply Chain Management, Frost & Sullivan

Network security is one of the key issues in the IT industry, and the challenge for operators is making sure that their network elements are protected. Costa Constantakis explains

Security has become a hot-button issue in IT and carrier network operations. While 'securing the perimeter' appears to have captured most of the available spotlight, a related and emerging security challenge for carriers is that of administering secure access to their network elements (NEs) -- both to differentiate themselves and, by extension, to ensure the security and availability of their customers' traffic.
Unfortunately, the reality today does not reflect the security measures one might reasonably expect to find in service provider networks. Many NEs, particularly legacy elements, have limited security features designed into their software. Tools to centralise and automate security administration are lacking, at best. Due to these challenges, combined with a lack of resources and time, security administrators are often forced to overlook basic procedures like password strength enforcement, with the original, default password set by the manufacturer often left unchanged on NEs.
Consider also the true cost of security administration. According to industry estimates, companies will typically spend between 25 and 40 euros ($30 and $50) in operating expenditures (OPEX) for every password change they administer. With substantial numbers of passwords on a network being changed many times per year, the administration of credential changes can amount to many millions of euros in annual expenditures.
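To see how those per-change figures compound, consider a hypothetical mid-sized network; every number in the sketch below is an illustrative assumption, not an industry figure.

```python
# Back-of-envelope OPEX calculation; all inputs are illustrative assumptions.
accounts_on_network = 10_000   # NE accounts across the network (assumed)
changes_per_year = 12          # monthly password rotation (assumed)
cost_per_change_eur = 30       # low end of the 25-40 euro estimate above

annual_opex = accounts_on_network * changes_per_year * cost_per_change_eur
print(f"Annual password-change OPEX: EUR {annual_opex:,}")  # EUR 3,600,000
```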
Beyond the direct costs of security administration, there are the associated opportunity costs, which include service level agreement (SLA) credits, customer dissatisfaction and churn, and network outages that can be attributed to two sources: (a) the inadvertent misuse of commands by innocent network operators dealing with fast-paced, complex network operations; and (b) increasingly sophisticated hackers gaining unauthorised access to a group of NEs within a network.

A daunting challenge

In spite of the increasing importance of securing the management plane within their multi-vendor network environments, the reality is that service providers face daunting challenges in administering security. They are routinely forced to use rudimentary tools, like scripts, to carry out password changes across their networks (with failure rates often exceeding 50 per cent on a first pass). They are forced to allow shared passwords among network operators because of the inherent limitations in the number of accounts that can exist on a network element (eg, a hundred users requiring access through a limit of only five or six accounts offered on a network element).
This is in spite of the fact that shared passwords compromise security altogether and make tracking network activity difficult, at best. Central logging of activity is also a challenge because different vendor systems log activity differently (some on the NE, some on the element management system (EMS) that accompanies the NE, etc).
Perhaps equal to the risk of hacker intrusions, in terms of potential network outages, is the innocent misuse of commands from within. The inability of security administrators to restrict a network operator from using specific commands beyond the scope of his responsibilities is a primary reason for this. The end result is that a network operator is often granted more privileges than would otherwise be given by the security administrator, which in turn leads to the possibility of inadvertent or accidental human errors that can take down parts of the network.

A better approach

Conceptually, a solution to the challenges described above entails breaking from the traditional approach of giving users direct access to NE accounts. An alternative approach is to create a new layer of 'user' security that is distinct from the network element security layer.  This is achieved by deploying a distributed Security Proxy Server that behaves as a layer of abstraction between users logged onto the system and NE accounts. The Security Proxy Server's job is to authenticate users and determine their command privileges before providing them access to a network element account. Furthermore, it filters all of their commands, allowing only those that an individual has been authorised to use. The solution described is compliant with the emerging ANSI (ATIS T1.276-2003) and ITU-T (M.3016) standard specifications, being driven by service providers and governments in a quest to secure their infrastructure without markedly increasing their operating costs.
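To make the concept concrete, here is a minimal sketch of the proxy idea in Python. It is an illustration only, not any vendor's implementation: the user store, commands and credentials are all invented, and a real deployment would integrate with centralised authentication infrastructure rather than hard-coded tables.

```python
# Minimal sketch of a command-filtering security proxy; all names invented.

# user -> (password, set of permitted commands)
USERS = {
    "alice": ("s3cret!", {"show-alarms", "show-config"}),
    "bob":   ("pa55wd!", {"show-alarms", "restart-card"}),
}

# NE account credentials live only in the proxy, hidden from end users.
NE_ACCOUNTS = {"core-switch-7": ("admin", "no-longer-the-vendor-default")}

def authenticate(user: str, password: str) -> bool:
    """Authenticate the operator against the proxy's own user store."""
    entry = USERS.get(user)
    return entry is not None and entry[0] == password

def audit_log(user: str, ne: str, command: str) -> None:
    """Central logging: every attempt is recorded in one place."""
    print(f"AUDIT {user} -> {ne}: {command}")

def execute(user: str, password: str, ne: str, command: str) -> str:
    """Forward a command to the NE only if this user is allowed to run it."""
    if not authenticate(user, password):
        return "DENIED: authentication failed"
    audit_log(user, ne, command)
    if command not in USERS[user][1]:
        # Granular privilege control: block commands outside the user's role.
        return f"DENIED: '{command}' not permitted for {user}"
    account, _secret = NE_ACCOUNTS[ne]   # the user never sees these
    return f"forwarded '{command}' to {ne} as {account}"

print(execute("alice", "s3cret!", "core-switch-7", "show-alarms"))
print(execute("alice", "s3cret!", "core-switch-7", "restart-card"))
```

Note how several of the benefits listed below fall out of the same small indirection: unique per-user credentials, hidden NE passwords, a single audit trail and per-command privilege checks.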
In taking this overall approach, several benefits arise, as follows:
*  Unique Passwords for All Network Operators
First, passwords are no longer shared. Every network operator is assigned a unique user ID, password, and set of command privileges. All are strictly enforced by the Security Proxy Server serving as a 'gatekeeper' and controlling all access to NE accounts. A further benefit is that NE account passwords now remain hidden from all but a few security administrators, owing to the layer of abstraction between users and the NEs.
*  Secure Centralised Dissemination and Control of Passwords
Second, security administrators avoid the headache of 'swivel chair' management by moving to one, centralised system for managing security on all NEs across the multi-vendor network. Such centralisation can pave the way for true automation of the ongoing requirement for user ID and password changes, in turn saving the service provider millions of euros per year in security administration effort.
*  Overcoming Security Feature Limitations
A third benefit of the layer of abstraction between users and the NEs is that common security policies can be imposed on user privileges right across the entire multi-vendor network, regardless of the number of NE types. This is done by implementing the policy logic at the new user security layer in the Security Proxy Server, instead of hoping to find common security features on each vendor's NE. The use of this user security layer further streamlines element  management by providing each user with "single sign-on", versus requiring network operators to log on to each NE in series.
*  Centralised Logging of All User Activities
A tiered architecture that makes use of the additional security layer as described allows the tracking and logging of user activity to move out of individual NEs and into a central repository. Central logging and auditing of all user activity suddenly becomes possible, with all user commands being funnelled through the Security Proxy Server. Effort, time and money are spared by security administrators, who gain the ability to access consolidated activity records from a single source and in a consistent format, which in turn proves invaluable in the event of a network crisis or outage.
*  Granular Security Privilege Control
Not every field operator is trained and qualified to enter every possible command on every NE -- nor should they be if the economics of network operations are to remain favourable. The Security Proxy Server 'brokers' sessions by checking the authorisations granted to an individual as he attempts to issue a command to the NE. This level of granular control allows security administrators to minimise the potential for human error to occur. In essence, an inexperienced field operator cannot accidentally issue a fatal command if he is blocked from issuing it.

Optional or imperative?

Faced with many of the time-consuming challenges related to management plane security, administrators have more often than not been forced to accept the risk of a breach by cutting corners on standard security management practices. To date, this shortcutting has been viewed as realistic and a necessary compromise in ensuring that vital network operations proceed in a timely fashion.
As next generation carrier networks become more complex and susceptible to outages, and as secure network operations become a differentiator for service providers in the eyes of customers, such corner-cutting will become too costly to overlook. It is inevitable that, driven by the instinct for survival and their customers' insistence, all service providers will seek out more robust, cost-effective approaches to securing the management planes of their multi-vendor networks in the years to come.

Costa Constantakis is Director of Marketing at Nakina Systems, and can be contacted via tel: +1 613 254 7351 ext. 307; e-mail: costa@nakinasystems.com

A new report published by the Information Security Forum (ISF) warns that the cost of complying with the Sarbanes-Oxley legislation is diverting spending away from addressing other security threats. The global not-for-profit organisation says that many of its members expect to spend more than $10m on information security controls for Sarbanes-Oxley. The business imperative to comply also means that, in many cases, the true cost of compliance is unknown.

With increasing concerns about compliance, the new ISF report provides a high-level overview of the Sarbanes-Oxley Act 2002 and examines how information security is affected by the requirement to comply. The report provides practical guidance to address problematic areas in the compliance process. According to the ISF, these problem areas include poor documentation, informal controls and use of spreadsheets, lack of clarity when dealing with outsource providers, and insufficient understanding of the internal workings of large business applications.
What's more, the Act ignores security areas that are extremely important when dealing with risks to information, such as business continuity and disaster recovery. This makes it vital to integrate compliance into an overall IT security and corporate governance strategy.
"In the wake of financial scandals like Enron and WorldCom, the Sarbanes-Oxley Act was designed to improve corporate governance and accountability but has proved difficult to interpret for information security professionals," says Andy Jones, ISF Consultant. "As neither the legislation nor the official guidance specifically mentions the words 'information security', the impact on security policy and the security controls that need to be put into place must be determined by each individual organisation in the context of their business.
 "It is important that Sarbanes-Oxley does not push organisations into following a compliance-based approach rather than a risk-based approach that may compromise information security.  The ISF report helps companies to achieve compliance while also ensuring that they have the appropriate security controls in place."
The full Sarbanes-Oxley report is one of the latest additions to the ISF library of over 200 research reports that are available free of charge to ISF Members.
Details: June Chambers, ISF, tel: +44 20 7213 2867; e-mail: june.chambers@securityforum.org
www.securityforum.org

High-tech industrial espionage is on the increase, but there are ways of preventing the 'baddies' from penetrating IT systems, as Calum Macleod explains

For those of you who follow the news, you may have come across the recent story of spy software discovered at some of Israel's leading companies, which reads just like the spy stories we've been fascinated by for years. Indeed, if it weren't for the sacrifice of the likes of Roger Moore and Pierce Brosnan, who knows where we would be today.
But that would be to miss the point completely. Firstly, the imagined villains are in fact the victims. More importantly, the problem of spy software being prevalent in Israeli companies came to light as a result of one of the most comprehensive investigations of computer crime ever undertaken. The Trojan had been introduced by providing companies with contaminated files, or by sending contaminated e-mail messages to the companies. More worrying still, these methods evaded all the security measures in place at the infected companies.
Today, our businesses depend on the exchange of electronic information with our business partners, but many of the mechanisms that are used still rely too much on the goodwill of those business partners, or the integrity of the systems that they use.
Two of the most commonly used security measures -- FTP and digitally signed e-mail, using technologies such as PGP -- are not really equipped to deal with this type of situation. In Israel, e-mails were being received from trusted business partners, so digital signatures on the e-mail would possibly be considered trustworthy. In the case of files being shared using systems such as FTP, malware detectors were unable to identify anything inappropriate.
The bottom line is that we frequently depend too much on the trustworthiness of those we deal with, and unless we take appropriate measures to handle the type of eventualities described above, we are leaving ourselves vulnerable.
So are there measures we can take?
Here are a number of suggestions that might help:
1. Do not expose your internal networks to external parties. The process of transferring files in and out of the enterprise must be carried out without exposing and risking the internal network. No type of direct or indirect communication can be allowed between the partner and the enterprise network.
2. Ensure that the repository for files being moved back and forth is secure. While information is waiting to be retrieved by the enterprise or sent to the business partner, it must reside in a secure location. This is especially critical when the intermediary storage is located on an insecure network, such as the enterprise's DMZ, outsourced site, or even the Internet. Additionally you should take steps to define what format files will have, and to ensure that they can only be deposited if they are virus free.
3. The environment for exchanging data should be a sterile environment. Encryption and other security mechanisms are not helpful if the security layers where the data is being stored can be circumvented. Encryption is good for confidentiality, but does not protect data from intentional deletion or accidental modification. In order to build multi-layered security, a sterile environment must exist to accommodate and protect the security infrastructure. Creating such a sterile environment requires the creation of a single data access channel to the machine, ensuring that only a strict protocol that prohibits code from entering is available to remote users. Many file exchange technologies do not run in sterile environments. For example, FTP servers, a common method, are frequently nothing more than applications running on insecure platforms.
4. Protect your data when it is at rest. The cornerstone of protecting storage at rest is encryption. Encryption ensures that the data is not readable and thus maintains its confidentiality. But encryption that is onerous to manage is ineffective. A common approach for many organisations is a public/private key model, but this is generally considered ineffective because of the enormous effort needed to maintain such a system. A symmetric encryption model provides a manageable and effective way to secure the data (see the sketch after this list).
5. Data must be protected from deletion and tampering. The protection of data by encryption is simply one part of the problem. Files may be accidentally or intentionally deleted or changed. Additionally you need to ensure that data cannot be tampered with.
6. Ensure that you are able to audit and monitor all activities. Comprehensive auditing and monitoring capabilities are essential for security for several reasons. Firstly, it allows the enterprise to ensure that its policy is being carried out. Secondly, it provides the owner of the information with the ability to track the usage of its data. Thirdly, it is a major deterrent for potential abusers, knowing that tamper-proof auditing and monitoring can help in identification. Finally, it provides the security administrator with tools to examine the security infrastructure, verify its correct implementation and expose inadequate or unauthorised usage.
7. End-to-end network protection. Security must also be maintained while the data is being transported over the public network. The process of transferring data must be, in itself, secure, and there are several factors that influence the overall security of data transmission. As data transfer is an essential part of a larger business process, it is critical to be able to validate that this step in the process was executed correctly. This requires the solution to provide auditing features, data integrity verification and guaranteed delivery options. Transmitted data should be automatically digitally signed, ensuring that delivery is complete and the data untampered with (also illustrated in the sketch after this list).
8. Performance is a major issue in many networks, especially when using the Internet where Service Levels are difficult to guarantee. When there are large volumes of data and the number of recipients is high, it is critical to ensure that performance is optimised. Compression should be deployed to reduce file size, and since network availability and reliability may disrupt the transfer process, automatic resume from the last successful checkpoint should also be a standard feature.
9. Ease of Integration with existing business processes. File transfer is usually part of a larger business process and needs to integrate seamlessly. This demands the ability to automate the file transfer process and thus integrate it with the existing business processes. In order to make this integration as simple and as seamless as possible, the file transfer solution must have an extremely flexible and diverse interface, providing transparent integration. This also minimises the amount of human intervention and, as a result, can improve overall security by reducing the possibility of tampering with your data.
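Points 4 and 7 lend themselves to a short worked example. The sketch below is illustrative only and uses the third-party Python 'cryptography' package rather than describing any particular product: a single symmetric key encrypts the payload for storage, and a digital signature over the ciphertext lets the recipient detect tampering on retrieval.

```python
# Illustrative sketch of points 4 and 7: symmetric encryption at rest plus a
# digital signature over the ciphertext. Requires the third-party
# 'cryptography' package (pip install cryptography); all data is invented.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Sender side ---
storage_key = Fernet.generate_key()          # one symmetric key to manage
signing_key = Ed25519PrivateKey.generate()   # sender's signing identity

payload = b"invoice batch, week 38"
ciphertext = Fernet(storage_key).encrypt(payload)  # confidentiality at rest
signature = signing_key.sign(ciphertext)           # tamper evidence

# --- Recipient side (holds storage_key and the sender's public key) ---
public_key = signing_key.public_key()
try:
    public_key.verify(signature, ciphertext)         # raises if modified
    print(Fernet(storage_key).decrypt(ciphertext))   # b'invoice batch, week 38'
except InvalidSignature:
    print("Rejected: file was modified at rest or in transit")
```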
But of course, no one is interested enough in what your business is doing to waste a few minutes planting some spyware in your company. After all, it only happens in the movies!

Calum Macleod is the European Director for Cyber-Ark and can be contacted via tel: +31 40 2567 132;  e-mail: calum.macleod@cyber-ark.com 
www.cyber-ark.com

All mobile operators now offer their subscribers a massive choice of content. The only thing that's missing is the users. The key to success is to push interactive content direct to the idle screens of subscribers, says Yossi Wellingstein

"Build it and they'll come" is a risky way to run a business -- and one that simply hasn't worked in the mobile content world. Today content discovery is one of the biggest problems mobile operators face, following the massive effort to build-up huge content catalogues. Put simply, the fact that users don't know that so much content is waiting for them, and more importantly, don't know how to find it, explains why mobile data services still account for such a small proportion of operators' revenue, despite their best efforts to increase their share.
Operators around the world have realised that they can't wait for users any more. They have to bring content to customers -- and then make sure that it is easy to access. Anything too complicated, or requiring pro-active behaviour from the customer, just hasn't, doesn't and won't work. 
A new approach is needed to let subscribers know that content is available -- and to encourage them to click through. Not only to build data traffic so mobile operators can start to generate decent revenues to justify their infrastructure investments -- but also in this competitive market, to provide services that add value and differentiate one network from another.
One answer to the content discovery problem is incredibly obvious and already being used by a number of operators around the world.
The solution: push content teasers to the screen of a mobile phone when the phone is turned on but not being used. Make these teasers free to the customer. Ensure that they vanish after a few minutes and are replaced with new headlines. Make the content relevant to that user. And finally minimise the number of clicks needed to access additional information linked to a teaser. Research has proven over and over again: clicks kill revenue. The more customers have to navigate, the less likely they are to bother.
A mobile phone screen with no content on it is a major missed opportunity. And a dead screen seems acceptable to customers -- until they see what they could be getting -- up-to-date information headlines for free with an easy way to access additional information and services. Operators need to make the idle screen a dynamic palette of intriguing content, news, games, offers, polls and real-time updates.

Commercially proven


Active Content to the idle screen is a technology that has proved itself commercially. According to operators already using it, over 90 per cent of users keep the service once it's introduced and over 30 per cent use it regularly, an unprecedented use of content services.
VimpelCom, which operates as BeeLine GSM in Russia, faced a problem common to all mobile operators -- it had spent considerable sums building its data networks, and interesting content was available to its customers thanks to a series of partnerships with content providers. However, customers just weren't accessing content from their phones.
VimpelCom's research showed that users simply didn't know how to find the content -- and if they did it required far too much effort. And so, VimpelCom decided it needed to take a proactive approach. In April this year it launched its "Chameleon" service, using technology from Celltick, to broadcast live content directly to the phone screens of its subscribers.
The Chameleon service sends streams of free news headlines, sport reports, weather updates, music stories, gossip and games directly to mobile phone screens. Just like a screen saver, the messages appear silently only when the phone is not in use. 
Chameleon is very easy for customers to use and does not require any action on the part of the customer to start receiving the service. When they see a message that interests them and want to know more, they simply click on the OK button. A menu opens and presents various options. For example, a video news report or an automatic link to a WAP site or web page, or an SMS message with more information. A second click launches the desired service. And then the subscriber starts paying.
VimpelCom has invested in interesting and credible content, using brands such as MTV, Cosmopolitan and Playboy. It has created a countrywide team to run the broadcast operation, effectively working like a TV editorial team.
Chameleon targets all types of audiences, and can broadcast both to the entire subscriber base and to specific segments and locations, so content can be customised and localised to ensure that it is relevant and of interest to customers.
The results are staggering. During the first three months more than seven million data transactions have taken place with 50 per cent of enabled users reacting to the content on a regular basis.
Victor Markelov, Products Director at VimpelCom, said: "Our customers love Chameleon. We've finally found a way to provide them with an opportunity to use our data services actively. The easier the access to information and data services, the more often our customers use them".
Currently Chameleon is available to almost 3 million users, and VimpelCom plans to expand it to more than 10 million by the end of the year. 
From a business perspective, the key to a successful Active Content system is sending free teasers to a vast number of users, while keeping the cost of these teasers low. It's a numbers game. The operator relies on a certain percentage of users clicking through, and therefore needs to send teasers to as many users as possible. But how to send these teasers cost-effectively? Say there are one million users and the operator wants to send 100 daily teaser messages to each of them. Doing it with any kind of point-to-point delivery will bring the network's SMSCs to their knees or eat up the GPRS capacity. Since these messages are not paid for by customers, this is certainly not a good use of the network resources.
There is a simple answer to this challenge -- enable mobile networks to send one message to many users simultaneously, or in other words, broadcast. Using broadcast, teasers can reach millions of users in real time without clogging the network and without using any commercial bandwidth at all. Since the broadcast capacity of the network is minuscule to begin with and allocated in advance, it makes the content teasers virtually free. This makes Active Content particularly attractive, since the variable costs associated with its delivery to idle screens are close to zero. When users click on a teaser, the terms change: a paid-for session begins and the operator starts seeing revenue.
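The arithmetic behind that claim is easy to check against the figures above. The comparison below is purely illustrative and ignores per-cell repetition of broadcasts.

```python
# Illustrative comparison of point-to-point delivery versus broadcast,
# using the article's figures: one million users, 100 teasers per day each.
users = 1_000_000
teasers_per_user_per_day = 100

point_to_point = users * teasers_per_user_per_day  # one message per user per teaser
broadcast = teasers_per_user_per_day               # one transmission reaches all idle screens

print(f"Point-to-point: {point_to_point:,} messages/day")  # 100,000,000
print(f"Broadcast:      {broadcast:,} transmissions/day")
```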
Users love having content pushed to their screen because it provides free information, entertainment and timely updates with zero intrusion.
Operators love it because it increases traffic on data networks by solving the content discovery problem  -- and in today's competitive market is an effective brand differentiator.
So why are you still hiding your valuable content and services deep inside your portal? It's time to bring them to the front -- by activating the idle screen.

Yossi Wellingstein is CEO of Celltick and can be contacted via www.celltick.com

If someone tells you that they know what's happening in the telecommunications industry these days, the chances are that they deserve a sharp poke with a pointy stick*. The world's networks were already being described in the 1970s as being the most complex machine that man had ever built, and the last few years have only increased that complexity by a few orders of magnitude.

Imagine a speeded-up film of the earth from space. One single solitary telegraph line first appeared in 1843, stretching all of forty miles from Washington to Baltimore. Over the following decades, that fragile link grew into a web of cables and repeater stations, reaching across continents and oceans. Fast forward a century and a half, and the earth is now covered by a dense mesh of fibre, radio, satellite and copper networks. Regions, once dark and silent, are being rapidly filled in as the world goes mobile, adding a second billion onto those souls already glued to their handsets and screens.
But geographic reach alone only tells one part of the story. As we shift towards an all-IP world, drawing content and applications into the traditionally two-dimensional signalling framework of telecommunications, we're starting to approach biological levels of complexity -- and that's going to demand some serious changes in the ways that we perceive and plan our industry.
Chatting recently with Digitalk, one of the companies now bringing SIP -- that increasingly ubiquitous protocol -- firmly into the PSTN and voice services world, I started getting flashbacks to biology classes at school and all those wonderfully intricate diagrams of metabolic pathways.
Now, for an individual who did a degree in zoology and psychology before falling into networking, the fact that I should look for biological metaphors shouldn't really be surprising. What is perhaps surprising is that respectable academia is also now taking this route, but under the catchy title of 'Complex Adaptive Systems'.
Partly growing out of chaos theory, that pop science favourite of a while ago, a growing number of researchers around the world are looking at the subject of network behaviours -- but across a multitude of different areas. Google on the Santa Fe Institute in New Mexico, and you'll find such apparently unrelated topics as economies, weather, embryological development and telecommunications coming together in fascinating ways.
All very interesting perhaps, but what's this got to do with profits in the increasingly Darwinian playground of a deregulated telecommunications industry?
It's simply that nature's been doing for billions of years what we've now been doing for only a hundred or so -- and there should be some good tricks we can pick up. I sometimes deliberately annoy engineer friends by asking them if an individual bee knows that it's actually building a larger hive as it works on its own individual little cell -- each a masterpiece of optimised design. After I've picked myself up off the floor, they do, however, usually get what I mean, especially when I explain why the use of the word 'ecosystem' in all those endless marketing presentations is inappropriate. Anything that kicks marketing departments usually goes down well with engineers, I've found...
In an ecosystem, everything eats everything else. In reality, the world of networks is becoming much more like a super colony of very simple organisms -- but with each starting to exchange the equivalent of genetic material with the others.
Consider the fraught relationship between content owners and network owners. Each needs the other -- but they're increasingly fighting the other for dominance of that growing space. What started off as an apparently symbiotic relationship now looks like moving to become parasitical for one of the parties.
Continuing the HR theme of my last column, we now need people who can rise above the raw silicon of the network and spot these shifts in power as they emerge. Talking recently with Daniel Osmer of telecom recruitment specialist Spectrum-EHCS, he made the interesting comment that "some of the best sales directors we've seen hold degrees in Psychology -- it's their understanding of human behaviour and decision-making processes. The industry, though, is still dominated by very traditional, risk-averse hiring".

*Unless they're Keith Willetts, of course.

Alun Lewis is a telecommunications writer and consultant. He can be contacted via:
alunlewis@compuserve.com

"An economic power shift from the US to Europe is now gaining steam and promises to have a far-reaching effect on the world technology sector," asserts Cutter Consortium Fellow Tom DeMarco -- a point vigorously debated by his colleagues on the Cutter Consortium Business Technology Trends Council.

The European Union's transition from marketplace arrangement to world superpower, and a possible new age for European IT, characterised by relatively frictionless commerce and exciting growth, is debated by the Cutter Business Technology Council along with Cutter Consortium Senior Consultants Tom Welsh (UK) and Borys Stokalski  (Poland) in the latest issue of Cutter Consortium's Business Technology Trends and Impacts Opinion.
According to DeMarco: "After decades of looking over its shoulder at Japan and the East, the US economy is fast being overtaken by someone else entirely: Europe. The colossus that has been assembled in what used to be called the Common Market has emerged as an economic superpower."
Cutter Consortium Fellow Lou Mazzucchelli counters: "The European Union is taking hold, albeit with fits and starts as evidenced by the results of recent constitutional referenda in France and the Netherlands. But I see little evidence of a massive shift of economic power from the US to the EU. Perhaps it is because the gap between them is small, relative to the gap between either the US or the EU and China or India or South America. The latter have so much more to gain, potentially at our expense." 
He continues: "It is unarguable that changes in Europe have an impact on the US and the world, but of all the forces working to shift economic power from the US, Europe may be the least threatening. Europe may have gotten a second wind as an economic power, but it seems unable to run a distance race against India and China."
Details: www.cutter.com

Telecom operators across Western Europe are launching IPTV services in an effort to increase revenues and improve customer satisfaction for their broadband services. In a new study on IPTV services in Western Europe, IDC has found that the potential for success with IPTV services varies widely from country to country, depending on the penetration of existing pay TV services, the level of broadband competition, and the commitment of incumbent and leading competitive operators to investing in the network upgrades and content necessary for high-quality IPTV services.

IDC estimates that the market for IPTV services in Western Europe was worth $62 million in 2004, with less than one per cent of households subscribing to IPTV services. The market will boom over the next five years, growing from $262 million in 2005 to $2.5 billion in 2009. By that year, six per cent of Western European households will subscribe to IPTV services. By 2009, IDC expects that all European incumbents and a large portion of the major alternative tier 2 providers will offer IPTV services. DSL will be the most widely used platform for the service, though a minority of households in a few countries will receive IPTV services over metro Ethernet connections.
IDC's study, Western European IPTV Forecast 2004-2009, says that in order to be successful, broadband operators will need to differentiate their service bundles from the video services already available in the market. The area where they will be able to do this is high-quality content underpinned by interactivity.
Details: www.idc.com

Worldwide mobile phone sales totalled 190.5 million units in the second quarter of 2005, up 21.6 per cent from the same period last year, according to Gartner, Inc. The second quarter of 2005 was the second strongest quarter on record for the mobile phone market (in the fourth quarter of 2004, worldwide sales surpassed 195.3 million units).

Nokia and Motorola have strengthened their position in the marketplace, as the two companies accounted for 49.8 per cent of worldwide mobile phone sales in the second quarter of 2005. Nokia's market share grew 2.3 percentage points in the second quarter of 2005 to reach 31.9 per cent. "Nokia regained its top position in Latin America and stepped up to the third position in North America, benefiting from its successful launch with Virgin Mobile, which helped its lagging code division multiple access (CDMA) sales," says Hugues de la Vergne, principal analyst for mobile terminals research at Gartner, based in Dallas, Texas.
Motorola was the second best-selling vendor in Western Europe, a significant improvement compared to the same time last year when Motorola finished as the No. 5 vendor in the market. In North America, Motorola was the market leader with its share reaching 33.5 per cent, while it was the No. 2 vendor in Latin America with 31.9 per cent of sales in the region.
Details:  www.gartner.com

According to a recent study by Evalueserve, The Impact of Skype on Telecom Operators, the European Telecom market is expected to be hit the hardest due to the fast and accelerating uptake of Skype, which is by far the most successful P2P (peer-to-peer) VoIP solution available around the world today.

Skype has revolutionised VoIP telephony by offering high-quality voice transmission, and by reducing the cost to zero for Skype-to-Skype calls, and to a fraction of current long-distance rates for Skype-to-fixed/mobile network calls. European operators are much more exposed due to the characteristics of European telecom markets, where calling and roaming rates, as well as the share of roaming calls, are higher, and local calls are charged by the minute, as opposed to a flat monthly fee in the US. Worldwide, the number of regular retail Skype users is likely to be between 140 and 245 million by 2008, the study reports.
The Evalueserve study further projects that incumbent telecom operators who combine fixed and mobile networks are likely to face a significant risk of a permanent reduction in overall profitability of at least 22-26 per cent and a reduction in revenue of 5-10 per cent as a direct impact of Skype by 2008.
At present, Skype has two million users in the US and 13 million users worldwide and the company claims 80,000 new subscribers daily.
Details: www.evalueserve.com

A taste of TeleManagement World in Texas -- which takes place from 7-10 November

Telecoms is undergoing enormous and rapid change. Managing this change could mean the difference between capturing market share for new and innovative services or being left behind.
The TeleManagement Forum's U.S. conference and expo TeleManagement World, taking place November 7 - 10 in Dallas, Texas, will provide an in-depth analysis of the changing telecoms industry and will address new entrants, new technologies and ideas from revolutionary thought leaders that will ready operations and business support system professionals for the challenges that come with the innovation underway in the telecoms marketplace.
"Telecom competition is actually no longer about new technologies or even new services. It's about who can most effectively manage themselves, their networks and their customers, and which operators can become leanest, fastest," says TeleManagement Forum President Jim Warner.
According to Warner, the US Telecom market has not seen this much change in more than 20 years. Warner points out that there is massive consolidation taking place -- with Verizon buying MCI, SBC buying AT&T, and Sprint merging with Nextel.
"But there's also a lot of new players coming into the market, often with new technologies and capabilities," Warner says. "The fact is that if you can't manage your customers and give them the services they want, they're going to go elsewhere now that they've got choices."
TeleManagement World Dallas brings together operators, suppliers and industry experts to discuss, share examples and provide solutions for how to meet the challenges of aggressive time-to-market pressures and how to improve the customer experience to gain a competitive edge.
TeleManagement World is designed to be a "cram course" to help operators truly understand how to manage next-generation services and the networks and back office systems that will deliver them.
The conference will include a Telecom Innovation Summit on November 8 that will delve into some of the industry's most pressing issues. Industry panels will focus on managing the next wave of innovation and other hot topics such as new devices and services, municipal Wi-Max networks and how operators can be 'lean' in light of Sarbanes-Oxley.
Six tracks being held November 9 - 10 will prepare operators and suppliers for their next move in the telecom industry by focusing on business transformation and market readiness; how to optimize the next-generation network's potential; service delivery and content to provide the anytime, anywhere customer experience; revenue assurance and billing; NGOSS software and technology and a deeper look at cable, IMS and IPTV.
Seven different Catalyst programs will serve as a living lab of innovation at TeleManagement World Dallas and will include two operator-driven initiatives, demonstrating how to solve complex operation and business support system issues.
The Catalyst programs will create scenarios that employ NGOSS concepts to manage mobile multimedia service management processes, simplify Sarbanes-Oxley compliance and manage Internet Protocol (IP) services, including voice over IP. Catalyst projects will also explore how to integrate OSS products from multiple vendors in a triple play scenario and will kick-start a new NGOSS Web Service Enablement initiative.
Fourteen different full or half-day training courses will be offered on November 7 and 10 on topics such as NGOSS, the SID and the eTOM, as well as using OSS/J to implement NGOSS. Courses will also cover a BSS perspective of VoIP, billing and charging for data and content, and revenue assurance and billing in a convergent world. Attendees of the Monday courses will receive a copy of NGOSS Distilled, a book written by John Reilly and Martin Creaner.
TeleManagement World provides a networking, educational, and demonstration forum that will help attendees apply state-of-the-art advancements in telecom to their business to manage the innovation of the future.

For more details or to register visit:
www.telemanagementworld.com

With current technological forces driving the industry down the road of convergence, companies that are unable to respond quickly will suffer most, says Marc Kawam

Convergence, or the transition to Internet Protocol (IP), is what many people are calling a 'stealth revolution'. Slowly and invisibly to the majority of phone users, Europeans are changing the way they communicate with one another. Following on from the popularity of IP telephony in the US, where there are already 3 million residential users, we are now moving away from traditional circuit-routed networks to Internet-based voice calls. We are even seeing the emergence of triple-play services as service providers like FastWeb and Homechoice bundle together voice, data, video and wireless to deliver a one-stop shop for all communications and entertainment needs via IP.

Convergence going mainstream

In the UK, BT's recent announcement that it will transform its telecommunications infrastructure into a pure IP-based network by 2009 is a clear indication of the commitment that large providers are making. And there are good reasons for it: as well as paving the way for a string of new hi-tech applications, its 21st century network (21CN) is expected to deliver cash savings of £1bn a year to BT by 2009. But are established providers up to the challenge of migrating to IP, and can they really compete with innovative companies like Skype and Vonage, which are rewriting the rules and are heavily backed by deep-pocketed investors?
In the fast-paced telecoms sector, responsiveness and speed-to-market are all-important. Whether speed to respond to competition, speed to innovate, or speed to respond to customer requests, responsiveness is the major challenge companies face today. Take almost any CSP offering and track it over the years: it becomes quickly evident that what was considered a rapid service roll-out a few years back would be considered exceptionally slow by today's standards.
As deregulation and a fall in barriers to entry open the market up to a new breed of innovative companies fighting for a slice of this evolving market, customers are being freed from the tyranny of the incumbent. Free to choose from an increasing array of communications services providers (CSPs), customers are increasingly demanding more for less. Not only do they want advanced IP-enabled services, they want to pay less as well. If that wasn't enough, CSPs also have to accommodate the expectations of the 'Internet Generation', who are used to getting what they want when they want it. Some CSPs believe their customers are now shifting to a real-time provision mind-frame where they can instantly obtain the services they desire.

Agility is key

Stuck with hundreds of legacy IT systems dating from the days of circuit telephony, many companies fight with a myriad of disjointed IT systems and lack a clear vision of their future technological needs. Even some of the established players lack the internal systems and related flexibility to handle demand for next generation services where the customer is in control. According to Insead's agility index, which measures the ability to roll out and provision innovative IP-based services (such as broadband, VoIP and IP-TV) rapidly, efficiently and to a high quality to new market segments, CSPs in Europe have, on average, reached only 60 per cent of their potential agility. In addition, whereas CSPs are now, on average, able to bring services to market 2.5 times faster than five years ago, customers are prepared to wait only about a third as long -- 2.8 times less -- for new services to be delivered to them.
Clearly there is a gap between the speed and flexibility required to compete in the fast-paced telecoms industry and the IT readiness of these organisations to respond to this need. With competition in Europe fierce, agility becomes an imperative for business success. So what can they do? How can they move to the nirvana of real-time service provisioning, as their customers often demand? How can this be achieved for the complex bundles of IP-based products and services (including content services) that are delivered off multiple vendor platforms, and more often than not owned by diverse business entities?
Insead's research report identifies self-provisioning/self-management and integration with partners' systems for more seamless order mediation as key areas for improvement. The latter becomes increasingly important as service providers begin offering aggregated services, where a subset of the services is actually supplied by a partner entity. Two-way communication of service order information, such as transmitting service requests, receiving acknowledgements, and confirming service availability between partner service provider entities, will increasingly need to be automated. Unstructured manual processes for mediating orders will not scale and will not allow CSPs to meet the demands of their consumers.

How to cope with change

Service providers need to bridge the 'automation gap' between customer-facing software applications and the service delivery partners for a variety of rapidly changing and evolving services. Next Generation Operations Support Systems (NGOSS) can bridge the gap. They allow CSPs not only to handle the vast array of new services that customers are demanding today, but also to support future IP-based services that service providers will need to roll out to retain and grow their customer base. These solutions are often built on open technologies such as J2EE, EJB, JMS and XML. Standards, such as the TeleManagement Forum's SID and OSS Through Java Initiative (OSS/J) APIs, promise to facilitate the development of component-based OSS where CSPs can mix and match systems as needed to respond quickly to customer demands or new service needs.
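As a toy illustration of that mix-and-match idea (the interface and vendor classes below are invented; the real OSS/J interfaces are Java APIs and far richer), a common provisioning interface lets a CSP swap vendor components without touching the order flow:

```python
# Toy sketch of component-based OSS behind a common interface; all names
# are invented and stand in for far richer, Java-based OSS/J-style APIs.
from abc import ABC, abstractmethod

class ProvisioningComponent(ABC):
    """Standard interface that hides each vendor system's internals."""
    @abstractmethod
    def provision(self, customer: str, service: str) -> str: ...

class VendorAVoIP(ProvisioningComponent):
    def provision(self, customer: str, service: str) -> str:
        return f"VendorA activated {service} for {customer}"

class VendorBIPTV(ProvisioningComponent):
    def provision(self, customer: str, service: str) -> str:
        return f"VendorB configured {service} head-end for {customer}"

def fulfil_order(customer: str, bundle: dict[str, ProvisioningComponent]):
    """Flow-through provisioning: one order fans out with no manual hand-offs."""
    return [comp.provision(customer, svc) for svc, comp in bundle.items()]

print(fulfil_order("subscriber-42",
                   {"voip": VendorAVoIP(), "iptv": VendorBIPTV()}))
```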

Crucial part of OSS business

Standards are a crucial part of the OSS business: there are currently over 2,000 commercially available OSS applications on the market, and combined with applications that service providers have created themselves, the number rises to almost 5,000. Most of these solutions, however, do not provide all the components needed for widespread adoption, and the integration challenge for these applications is immense. Interfaces between systems developed using standards can hide the complexities of each individual system's functionality and allow, for example, flow-through service provisioning without any inefficiencies due to manual processes or manual data transfer. This is one key and necessary part of the equation: the technological piece.
Beyond that, it requires management's understanding of IT needs and priorities, and an appreciation of the impact that low IT agility can have on the success of the organisation. Responsiveness, speed to market and quality of service are the key ingredients needed to crack the European market. It is a long way to the top, and most CSPs have a fair way to go yet, but one thing is clear: those who manage to climb the agility index ladder will have the technological ability to keep up with the accelerating pace of the CSP industry; the rest may have to pay the price for being too slow.

Marc Kawam is Managing Director, Europe, CEON Corporation. tel: +33 153 536766; e-mail: MKawam@Ceon.com   www.ceon.com