Features

CWDM technology provides the answer to bridging the demand for bandwidth, both quickly and cost effectively, says Francis Nedvidek

FTTx (Fibre To The Home, Business, etc.) is gradually gaining momentum in Europe. Projects realised to date have tended to be modest, but there is broad agreement that FTTx has entered the mainstream. Network operators have plans on drawing boards in the Netherlands, Norway, Sweden, Denmark, France, Italy, Slovenia, the UK and Germany, among others. Certainly, regulatory and legal frameworks concerning the use of legacy infrastructure and of newly installed fibre; the jurisdiction of regional and city carriers vs telecom vs CATV / HFC (Cable Television / Hybrid Fibre Coax) operators; and access to multiple-dwelling buildings still need resolution. In addition, the technical debates concerning PON vs point-to-point architectures evolve as broadband demand, telecommunications legislation and network technology advance.
As networks expand in terms of the number of subscribers, the offer and take-up of services and the extension of geographic footprints, Coarse Wavelength Division Multiplexing (CWDM) has emerged as the preferred method for increasing the link capacities of these optical access networks quickly, simply and at low cost. Passive CWDM requires no electrical power at all, and the technology has proven itself sufficiently robust and reliable for installation in the most demanding environmental conditions.
Modern CWDM technology enables network capacity upgrades in the form of install-and-forget hardware, allowing network operators to multiply the bandwidth of their presently overloaded fibre spans. A CWDM technology platform permits enhanced flexibility in network planning and installation without sacrificing scalability to far higher transmission volumes as bandwidth needs inevitably grow. CWDM is inherently transparent to protocol, coding and bit rate and is therefore ideally suited to aggregating fibre bandwidth. Capacity increases of 4X to 8X, or even up to 18X, are routinely achieved at a fraction of the cost of laying new cable in trenches or drawing additional fibre strands through conduits. Operators implement network functionality upgrades literally within hours, while continuing to operate ATM, TDM/TDMA, SDH/SONET or whatever topology their legacy or new architectures embrace. Furthermore, CWDM bandwidth augmentations are network transparent and fully operable with Broadband PON (BPON, ITU-T G.983.x), Gigabit-capable PON (GPON, ITU-T G.984.x), Ethernet PON (EPON, IEEE 802.3ah) or various versions of DOCSIS. Even 1310 nm and 1550/1490 nm analogue modulation combined with full digital overlays may be accommodated.
The goal of the network operator is to provide ever more subscribers with service while containing the cost to reach each additional customer. Reaching more subscribers with higher bandwidths attains higher penetration densities and consequently greater revenue generation potential. Increasing the bandwidth of existing fibre lines promotes higher degrees of network utilisation by permitting the price of each router port and laser transceiver to be shared across many connection drop points. Increasingly, attracting new subscribers also means providing the bandwidth that customers need for the services and the programming that they are signing up to enjoy. In all, CWDM is a very attractive means for network operators to achieve their objectives.
At the very edge of the network, FTTx architectures traditionally exploit an optical platform to carry downstream traffic to approximately 16 to 32 residential drop points or subscribers and upstream traffic back in the opposite direction. FTTx deployments, whether telecom-centric or HFC-centric, ultimately require extending sufficient optical bandwidth from the central offices and headends all of the way to these subscribers.
In its simplest form, a CWDM multiplexer aggregates additional wavelengths, or in other words additional data channels, onto an optical fibre where previously only one wavelength or channel had been transmitted. Upon arrival at the opposite end of the fibre, a CWDM demultiplexer discriminates and physically separates the different wavelengths so that each is rendered once again as an individual communications channel. In practice, passive CWDM may be deployed in simple ring or protected ring distribution, point-to-point setups and PON configurations, via bidirectional or unidirectional arrangements, or utilised to carry analogue signals simultaneously with bidirectional digital overlays. CWDM equipment may be packaged to fit 19-inch telecom central office installations, or into splice cassettes for mounting in street cabinets, hand-holes or CATV-pedestal closures. The most advanced CWDM components work over temperatures spanning the Telcordia GR standards for outside plant operating conditions and are small enough for convenient insertion or retrofit into existing fibre splice cassettes. In practical terms, upgrading network capacity becomes a task of modifying outside plant fibre connectivity rather than procuring and installing new inside plant equipment.
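To make the arithmetic concrete, the short Python sketch below (purely illustrative, and not drawn from any vendor's product) maps a handful of services onto wavelengths of the ITU-T G.694.2 CWDM grid and totals the resulting link capacity; the 2.5 Gbit/s per-channel rate and the service names are assumed example values.

# Illustrative sketch: how a passive CWDM mux/demux multiplies the capacity of
# a single fibre. Wavelengths follow the ITU-T G.694.2 CWDM grid (18 channels,
# 1271-1611 nm, 20 nm spacing); the per-channel bit rate is an assumed example.
CWDM_GRID_NM = [1271 + 20 * i for i in range(18)]   # 1271, 1291, ..., 1611 nm

def plan_upgrade(services, per_channel_gbps=2.5):
    """Assign each service (protocol-agnostic data stream) its own wavelength."""
    if len(services) > len(CWDM_GRID_NM):
        raise ValueError("more services than CWDM channels available on one fibre")
    channel_plan = dict(zip(CWDM_GRID_NM, services))
    return channel_plan, per_channel_gbps * len(channel_plan)

# Example: a fibre that carried a single channel upgraded to eight channels.
services = ["GPON downstream", "GPON upstream", "EPON", "SDH STM-16",
            "Gigabit Ethernet", "CATV overlay", "enterprise leased line", "spare"]
plan, capacity = plan_upgrade(services)
for wavelength, service in plan.items():
    print(f"{wavelength} nm -> {service}")
print(f"Aggregate capacity: {capacity} Gbit/s, an 8X multiplication of the original link")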
Network operators are increasingly taking advantage of CWDM-enhanced architectures and their accompanying low CAPEX, minimal OPEX, and simple and straightforward planning and implementation. Decisions to adopt CWDM typically revolve around the following priorities:
Low and predictable equipment and operating cost - CWDM network upgrade approaches require significantly lower CAPEX and offer much more economical OPEX scenarios than any active equipment deployment. Especially attractive is the quicker return on investment. We often encounter network operators who use the cash flow generated by newly acquired CWDM subscribers and enterprise service contracts to finance their next access network expansion.
Ability to upgrade portions or the entire network quickly and efficiently - Agility has a major impact on launch strategy and timing. Rapid response is key to pre-emptively or defensively capturing and holding market share. Our experience over the past three years with numerous European network operators exploiting CWDM building blocks involving many thousands of nodes clearly confirms that four or eight channel upgrades may be installed and fully operational within days or less.
Simplicity of specification, simplicity of deployment and simplicity of upgrade / reconfiguration - An inherent attraction of passive CWDM-based solutions is that the technical expertise required to design, manage and upgrade or otherwise adapt the existing or new network is well within the capabilities of virtually any network operator. The risks and burdens of complex network design and planning may be minimised without sacrificing options to further scale the bandwidth or network configuration. Deployment means plug-and-play installation with no need for additional power supplies or software updates.
Solutions that facilitate rather than constrain future expansions - Network operators strive to add subscribers, extend geographical reach and transport ever more data traffic. CWDM is a low cost and low risk tactic that complements other capacity enhancements whether future expansion strategies incorporate further passive or active equipment or even a complete change of operating philosophy. Roll out may be planned to ensure that technical improvements and the financial resources associated with upgrade scenarios remain decomposable into predictable and non-cost-prohibitive phases. Network Operators preserve the freedom to roll out capacity, coverage and services as the changing demand and competitive landscape require and as cash flows dictate.
Freedom from becoming locked into proprietary schemes - A CWDM approach tied to the established open standards typically operates unconstrained with any of the routers, switches, DSLAMs and even the WDM systems offered by major Telecom / CATV / HFC / Datacomm vendors. As a passive element, CWDM modules are functionally agnostic to all data transmission protocols and are equally immune to the incompatibility problems often encountered when connecting disparate equipment or accessories supplied by different vendors. Risks of becoming captive to any particular proprietary approach or attendant service agreement are eliminated.

Dr. Francis Nedvidek is CEO, Cube Optics AG, and can be contacted via tel: +49 (0) 162 263 8032; e-mail: nedvidek@cubeoptics.com

In telecommunications, as in many other industries, success usually comes from careful planning. Danny Berko and Ron Levin explain that planning now for the deployment of effective and proven deep fibre platforms will help meet the demands of the future wave of IPTV and other new customer services

The rapid growth of new broadband services such as IPTV will soon stretch the local loop or access network to its limits in terms of bandwidth delivery capability.  Even existing Internet services are becoming thirstier for higher download speeds as they cram their site pages with customer-compelling pictures and graphical content. 
Additionally, a growing customer segment - SOHOs and home workers - is looking for higher upload speeds to support their needs to send ever larger files to central office locations and facilitate increasing numbers of peer-to-peer sessions.  Many telcos and LLU operators have successfully sought to address these demands by exploiting the latest advances in Digital Subscriber Line technology - xDSL - to carry these higher speeds over a predominantly copper local loop network originally designed to carry analogue voice services.
Much has been achieved in this respect and it is estimated that a large proportion of customers (over 90 per cent in Western Europe) now have access to broadband speeds of over 2 Mbit/s with some (nearer 10 per cent) enjoying 10 Mbit/s or greater. 
Characteristically, however, xDSL speeds reduce with copper loop delivery distance, and the laws of physics are beginning to limit the further speed improvements that can be made over existing copper loops to serve higher speed customer services.  So consideration must now be given to how these distances can be reduced to meet this next wave of bandwidth demands, which look to be of the order of 25-50 Mbit/s.
This is particularly important for operators as they plan to satisfy growing demand and retain/build their revenue streams.  Deeper fibre into the access network - to shorten the copper loop distances and bring the high launch speeds of xDSL sources closer to the customer - is the principal approach to tackling the situation.
This entails the deployment of robust and reliable xDSL broadband platforms at the end of the fibre in often environmentally harsh and less accessible parts of the access network, such as cabinets, building basements and underground enclosures. 
The challenges for operators and their suppliers are not insignificant. Such platform investments must support a positive business case and they need to complement operators' current network convergence strategies, as well as longer-term plans for the access network as a whole.
The rewards are nevertheless significant in terms of order of magnitude improvements in bandwidth speeds (factor of 10+) and the overall potential future services that can be offered to customers and the community in general.  Cable operators who have copper pairs incorporated within their coax distribution (Siamese pairs) may also find such platforms attractive in terms of the premium service potential they can offer over and above their normal cable modem services.

Why more bandwidth?
The underlying trend within the developed world continues to be for more content and associated higher delivery speed in broadband services, be it within the standard Internet services portfolio or specific new planned services such as IPTV and video streaming/conferencing services. 
The limit is difficult to ascertain and is analogous to the processing speed/memory capacity trends within the PC industry.  While much has been achieved in improving the transport efficiency of such services, including advanced compression and coding techniques (MPEG, etc), the net trend still translates into ever-higher bandwidth capability required of the access transport. In cumulative terms this can move anticipated customer bandwidth demand to between 25 and 50 Mbit/s. 
Currently, most Western European operators appear to be looking at around the 25 Mbit/s figure, whereas in the USA, where the attention to HD TV appears to be greater - together with a demand for more simultaneous sessions - the figure approaches the 50 Mbit/s mark.
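As a rough illustration of how such figures accumulate, the Python sketch below adds up assumed per-session bandwidths for two hypothetical household profiles; the individual figures are illustrative assumptions, not measurements.

# Illustrative per-session figures (assumptions, not measurements), in Mbit/s.
european_profile = {
    "HD IPTV stream": 10, "SD IPTV stream": 4, "web browsing": 6,
    "video conferencing": 2, "online gaming": 2, "VoIP": 1,
}
us_profile = {
    "HD IPTV stream 1": 10, "HD IPTV stream 2": 10, "HD IPTV stream 3": 10,
    "web browsing": 6, "peer-to-peer / uploads": 10, "video conferencing": 2, "VoIP": 1,
}

for name, profile in (("European-style household", european_profile),
                      ("US-style household", us_profile)):
    print(f"{name}: {sum(profile.values())} Mbit/s of simultaneous demand")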
This expectation of future bandwidth growth needs to be addressed by operators if they plan to meet and stay ahead of future demand. Current deployment of xDSL technology at CO sites has achieved much in servicing the initial growth of broadband (principally Internet services) over the last decade and typically, within Western Europe, it is estimated that around 90 per cent of broadband customers connected to incumbent telcos' networks now enjoy 2 Mbit/s plus.
However, probably only around 10 per cent enjoy 10 Mbit/s or more. This is because the high launch speeds deliverable by xDSL technologies diminish with copper loop distance (or reach) from the CO, with the result that the speeds needed for future services become progressively less attainable to customers beyond 2-3 km reach. 
So the distribution of customers in relation to copper distance from exchange is currently a defining factor. This distribution is not too different across Western Europe incumbent operators, although there is a noticeable difference with the USA, by nature of its demographics. With these existing copper distance distributions, it is clear that the bandwidth identified for future service growth could only be delivered to around 10-20 per cent of Western European customers at best, and even less in the USA.
Clearly, in order to capture the bulk of customers within a much wider future services bandwidth footprint, something radical needs to be done to shorten copper transport distances within the access network.

Deeper fibre provides the solution
The means to achieving the shorter copper distances needed involves the deeper penetration of fibre into the access network.  As a result, operators, along with their suppliers, need to develop optimum strategies for achieving this against valid business cases. 
Most operators already deploy fibre all the way to large, and to a significant proportion of small, business customers, thus removing the copper bottle-neck altogether. However, to provide a similar major fibre overbuild to the remaining bulk of customers, i.e. fibre to the premises/home (FTTP/H), is currently a prohibitively costly investment for most operators.
Business cases are beginning to emerge for FTTP/H deployment in new build (greenfield) scenarios, but these barely amount to more than 1-2 per cent per annum of an operator's total network.  Therefore, in order to address the future bandwidth challenge effectively, lower partial fibre investment solutions need to be considered. Normally referred to as ‘deep fibre' platforms, they involve deploying fibre from the CO to appropriate points deeper in the access network, terminating on xDSL platforms which then connect with the remaining (shorter) copper distribution pairs.
As a consequence, the higher launch speeds of xDSL can be exploited to provide the much greater bandwidth anticipated to meet future needs.  The main governing factors determining the points in the access network where such fibre terminates are the location of appropriate and accessible copper cross-connection points where the fibre/copper transition can practically take place and the fibre count necessary to achieve an economic customer footprint deployment. 
The deployment scenarios adopted by most operators are either fibre to the node (FTTN), normally coincident with the first primary cross-connect point (PCP-external cabinet), or deeper to the curb (FTTC), normally coincident with a secondary cross-connect (SCP) or street distribution point (DP).
In the case of conurbations made up of large blocks of flats or multi-dwelling units (MDUs) a fibre to the basement/building (FTTB) deployment may also be appropriate.  In each case, this shortening of the copper loop enables the much higher xDSL launch speeds to be delivered to a significantly larger proportion of the population, typically 25-50 Mbit/s+.
This has been improved upon further with the latest VDSL2 chipsets (potential speeds up to 100 Mbit/s). The VDSL2 ETSI standard has been optimised for deployment at such points close to the customer and has the advantage that it can be configured for symmetric as well as asymmetric delivery.  Additionally, both Ethernet over DSL - Ethernet in the First Mile (EFM) - and traditional ATM over DSL are configurable with this standard.
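The relationship between loop length and attainable rate can be sketched very simply. The Python fragment below uses assumed, rounded breakpoint figures (not measured data or any standard's limits) to illustrate why a CO-fed ADSL2+ loop struggles beyond 2-3 km while a short VDSL2 loop from a cabinet or curb can reach the 25-100 Mbit/s range.

# Assumed, rounded downstream-rate breakpoints (loop length in km, rate in Mbit/s).
ADSL2PLUS = [(0.5, 24), (1.0, 20), (2.0, 12), (3.0, 6), (4.0, 3), (5.0, 1)]
VDSL2     = [(0.3, 100), (0.5, 50), (1.0, 25), (1.5, 15), (2.0, 8)]

def attainable_rate(loop_km, breakpoints):
    """Return a rough downstream rate for a given copper loop length."""
    for max_km, rate in breakpoints:
        if loop_km <= max_km:
            return rate
    return 0   # beyond the last breakpoint the service is effectively unusable

for loop in (0.3, 1.0, 2.5, 4.0):
    print(f"{loop:>4} km loop: ADSL2+ ~{attainable_rate(loop, ADSL2PLUS)} Mbit/s, "
          f"VDSL2 ~{attainable_rate(loop, VDSL2)} Mbit/s")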

Proven deep fibre deployments
Major operators around the world are now deploying deep fibre solutions, either in trials or in actual deployments across major segments of their networks. This is enabling them to prove the technology, develop the experience and create the processes and procedures necessary to build the high-capacity access infrastructure that their customers and future service opportunities will demand.
The key for platform suppliers is to be at the forefront of many of these deployments and be able to share and understand operator needs and requirements while being able to demonstrate the capabilities of the underlying technology. 
A major recent example of deep fibre platform deployment has been Deutsche Telekom's High Speed Interface (HSI) project where high-speed services are being rolled out in ten major cities in Germany and several thousand deep fibre platforms are being deployed delivering bandwidths of between 25 Mbit/s and 50 Mbit/s.
Examples of underground (UG) platforms are already in widespread deployment around the world, including at Kingston Communications in the UK.
The thirst for more customer bandwidth is beginning to grow and will soon outstrip the capabilities of operators to deliver potential future services using only CO-based DSLAM architectures, particularly if a broad customer service footprint is to be maintained.
Enriching the access network with more fibre by the deployment of appropriate deep fibre platforms will address this need successfully. Such platforms are becoming available and are proving themselves to be economic deployment solutions now, which have the capability to accommodate future needs. 
The key to successful deployments depends on a thorough understanding of the access network, its distributive characteristics, the harshness of its environment and the associated principal factors that will drive service improvements and OPEX reductions.  This needs to be accompanied by full cognisance of the complementary service delivery management aspects required in the backhaul network. 
Leading suppliers are now working closely with operators to ensure that a partnership approach is achieved in meeting new customer demands and potential revenue growth.  In telecommunications, as with many other industries, success usually only comes when careful planning meets with opportunity.
Planning now for the deployment of effective and proven deep fibre platforms will ensure that the opportunities of this future wave of IPTV and new customer services can be grasped by operators successfully and with confidence.

Danny Berko is Director of Product Marketing, and Ron Levin is Director of Product Marketing, Broadband Access Division, ECI Telecom.

Eastern European telecoms is poised to enter a new phase. The big mobile players that dominate the region’s telecoms markets are eager to transform their business models.  With their networks mostly built-out and their subscriber numbers built-up to saturation point, they’re turning their attention to developing the subscribers they have - by introducing new services and more refined customer and financial management. And they’re looking to expand their operations into new territories. It may sound like a straightforward next step, but it requires major technological and human re-engineering and it will drive the market in the region.

To assist this transition, not just for the region’s big three operators but for all the players, the TM Forum and telecom and technology consultants Ernst & Young are staging their latest Tele|Evo (Telecoms Evolution) event in Moscow (October 8 - 11, 2007). The underlying theme of the conference is telecom business transformation. Attendees will gain an understanding of evolving telecom business models: why they’re changing and how all players - operators, vendors, specialist service providers, integrators - can best adapt and meet the management challenges that will be thrown up by the process. This TeleEvo event couldn’t have come at a better time.

Mobile has been a huge success in Eastern Europe (Russia, the CIS and the former Warsaw Pact bloc) - so much so that in many territories across the region the rapid growth phase of that market is well and truly over. With mobile penetration rates now at over 100 per cent in some markets, mobile companies have to find new ways to drive revenue growth. That means that instead of finding new subscribers, operators are concentrating on keeping the customers they have in increasingly competitive conditions, and on generating more revenue from them by offering new services and packages.

“In the past the operators have been very network-centric,” says Ilya Kuznetsov, Telecom Advisory Director at consultants Ernst & Young and Regional Business Development Manager for Russia, the CIS and Eastern Europe for the TeleManagement Forum. “They have been competing for licenses, developing their networks over very large territories, and generally marketing their brands. So up to now their approach has generally been to say: ‘OK, we have a network, we’re going to invest in new technologies such as EDGE, then we’re going to understand what we can do with it from a services perspective, then we’re going to try and find a way of targeting it at customers’. That approach won’t work any more.”

Instead, says Kuznetsov, players understand that in a saturated market they have to take a customer-centric, rather than network- or technology-centric, view of their businesses. “This requires changes to their traditional business models,” he adds. “The market landscape in Eastern Europe has different sub-markets, in contrast to the European market, which now seems much more integrated. This environment should now be focused on new offerings: converged offerings, content-based offerings. It’s about getting more revenues from the existing customer base.”

One of the most remarkable features of the Central and Eastern European telecom markets has been the sustained growth of GSM mobile. In Russia, for instance, the ‘big three’ GSM mobile operators - MTS, VimpelCom and MegaFon, which together account for over 132 million subscribers - have helped Russia achieve a 108 per cent penetration rate in mobile.

“The ‘big three’ are positioning themselves right now as CIS-wide operators and they are covering around 85 to 90 per cent of the CIS territory,” says Kuznetsov. “These companies are very strong players in the field: they are generating good cash flow and they have enthusiastic, aggressive investors and holding companies behind them. They are constantly looking to expand their business operations - not necessarily within Eastern Europe - but more into markets in the Mediterranean region, the Middle East or in South and South-East Asia. These are markets that are not yet at saturation point, unlike countries such as the Czech Republic or Poland.
“The other focus for development is on generating more revenue from new services in the markets they have already developed. What needs to be addressed now is that gap in the business model between the Network Layer and the Brand Layer,” points out Kuznetsov. “That’s the OSS and BSS area: do this right and you get a deeper understanding of your customers.”

Gaining that understanding is key to keeping and profiting from customers in maturing markets. “Our regional operators cover huge populations and large geographies and within that they are serving a huge number of different segments. These segments have never, so far, been understood enough by the companies to enable smooth transformation of their old business model to the new one,” he says. “Over the next 12 to 18 months there is potential for a wave of mergers and acquisitions in telecom-related fields such as fixed/mobile convergence, content/service provisioning and the telecom/media business. But before they make these big business decisions they need to define very precisely the customer segments they are going to target and what customer propositions will be developed for each of their different territories. As things currently stand, not everybody has a way of defining and reaching these sub-segments.” So one of the important roles of the TeleEvo conference, says Kuznetsov, is to challenge the region’s operators. “We will be asking them: ‘Who are your customers and what is your business?’”
TeleEvo features a slate of top speakers to help them define answers to these questions. Martin Creaner, TM Forum President, will outline the TM Forum’s vision of industry transformation and the role of its standards and frameworks in the BSS/OSS area and representatives from major players in the region, such as Alexey Nichiporenko, First Deputy CEO at  mobile operator MegaFon, will also give their perspectives.   Important subjects covered at the event include revenue assurance and management, telecom-oriented IT governance and compliance and interconnect billing. The importance of the customer and services experience is also explored, as are IT and operations topics such as managing multi-play products and services, and service delivery frameworks.
As TeleEvo illustrates, telecoms is no longer purely about building technology, but is much more about building strands of business.
“There is no lack of investment resources in the region,” explains Kuznetsov, “but there is currently a lack of sound ideas about how to use that investment properly and within the target profit levels in the next phase of business development from a customer service point of view.” TeleEvo will provide an ideal forum to explore these issues.  “The great advantage we have in our region is that we can look at how these things have been done in different markets and the sorts of business models that have been deployed,” enthuses Kuznetsov. “We have to examine and understand which models might be applicable and justifiable from both a revenue and customer loyalty perspective. So the TeleEvo conference is trying to bring people in - not just from Europe and North America - but from Asia and other territories, to demonstrate all the different ways of transforming the traditional telecom operator into the new customer- and service-centric operator.”

 

Telcos are good at ‘factory’ service provision but it’s an ethos that doesn’t fit with enterprise demands for highly complex ICT outsourcing. To tap that market, telcos will have to up their game and invest, says Leo McCloskey

The number of enterprises looking to outsource converged information and communications technology (ICT) solutions is growing rapidly and should be providing a handy new market for European telcos as other parts of the wireline network business continue to be squeezed.  But the problem is that telcos need to equip themselves properly if they’re to tap this increasingly competitive market profitably, or they’ll be relegated to the dreaded role of providing low-value ‘factory’ service components to other players - such as Virtual Network Operators or IT outsourcers - who manage to tap it first. This means equipping themselves with the right tools to efficiently meet enterprise needs in what is a highly competitive market for complex, multi-provider, converged services, because the old way of ad hoc, manually intensive processes is simply not profitable.

Selling services to enterprises isn’t what it used to be. Back when enterprise technologies tended to be managed in separate silos - separate networks and separate IT domains for voice, corporate data and desktop LANs, for instance - telcos were apt to look forward to a more rational, profitable time when their infrastructure and intellectual resources could meet all those disparate technology and application requirements. That would enable the enterprise to outsource the ICT environment to a single entity, reducing cost, hiding complexity and enabling new applications. As the natural provider of the network glue, telcos judged they could and should be considered viable providers of those outsourced services involving both communications services and IT. ‘Bring on the revolution’, they chorused.

Indeed the converged network eventually happened, albeit based on IP rather than the ISDN and then ATM that telcos initially envisaged. In addition, as the converged enterprise network concept has gathered pace, the conventional business wisdom around business ownership and control has changed too. Whereas 10 or 20 years ago many enterprises (obviously depending on their sector) would have seen communications solution ownership and control as strategic and their communications performance as crucial to competitiveness, today many enterprises regard ICT as ‘non-core’ and therefore a prime candidate for outsourcing, with the emphasis on performance increasingly focused on the end-to-end user experience.  So, by rights, telcos should be in the hot seat ready to intercept a brave new lucrative business market involving network convergence and outsourcing - playing straight to their strengths.

Not quite. A single enterprise IP network infrastructure certainly irons out communications infrastructure issues. But, of course, it’s not the monolithic network it appears, as it is constructed of multiple service components, each bringing their own complexity problem. Previously, enterprises managed ICT islands with a single source (or just a few sources) of technology - such as might formerly have been found in an intranet or a voice network. This island approach may have been expensive, certainly required multiple management systems, and inevitably meant that it was difficult to build features or applications that crossed networks. But, by having the environment broken up into homogenous chunks of technology, it at least kept things simple and enabled radical change (say the replacement of a voice switch) to take place without impacting the rest of the technology estate.

Today, however, enterprises are building large complex and integrated service environments that must be able to integrate multiple applications and service components. Converged enterprise services such as LANs, storage, mobility and voice-over-IP - which make contrary performance demands of the network – are highly distributed and must operate consistently throughout the enterprise. That makes these ICT environments hugely more complex to design, integrate, and, above all, to manage on an ongoing basis.  So there has been a slight change to the script - the converged enterprise network has actually brought with it, not an across-the-board reduction in complexity, but a complexity migration - in effect, the problems have scurried up the protocol stack.  The problem for telcos is that serving this market segment by building a powerful single managed service provider capability  - pulling together IP-based ICT services in the traditional telco way  -  is no longer an option.

The complexity of each individual ICT project is such that service delivery doesn’t scale.  The germ of the problem lies in the telcos’ ‘factory’ approach to the service delivery. For most of their history, and indeed for most of their activities today, the factory approach has worked well. It’s about defining, managing and monitoring tight processes and procedures to suit the delivery of high numbers of standard product/service combinations, all within their own service delivery environment. A good example today is ADSL-based broadband service deployment where OSS standards and vendor solutions concentrate on automating as much of the ordering/provisioning/fulfilment process as possible. The objective here is to get service delivery ‘right first time’, because market competition will not permit extra costs in terms of telephone help desk time or even truck rolls to sort out problems. 

Now consider the requirements of complex ICT service delivery in the enterprise market. In complete contrast to ADSL delivery, where the service is uniform, complex ICT deployments are highly bespoke collections of products and services delivered from different service providers, with each component requiring modification to suit the enterprise ICT requirement.  An ICT enterprise solution needs its constituent elements ‘decomposed’ into specific requirements and then ordered as components from the right provider and delivered in the right sequence. As things currently stand it is typical practice for providers to devote large numbers of staff - complex bid and project teams - to designing and then stitching together these solutions. Being human and dealing with high degrees of interdependency and complexity, they make mistakes. The inevitable upshot is botched configurations, missing components and delay as the stricken solution is troubleshot, project profitability plummets, and the enterprise that depends on it becomes increasingly frustrated.

But the problems don’t stop there. Once a hand-crafted solution is successfully running there will be ongoing changes, sometimes as high as five per cent per month. Since the initial design was produced by groups of people using ad hoc processes and no centralised tooling, all adds and changes are implemented on an ad hoc basis too. In these circumstances, service problems are very difficult to isolate, leading to yet lower profits and higher customer frustration.  Ironically it’s been the nirvana of the homogenous, converged network - the development that was supposed to enable telcos to move up the value chain to offer network outsourcing services to enterprises - which has exposed the limitations of the repeatable ‘factory’ approach to complex ICT solution delivery.

So what to do? What’s required is centralised tooling which acts as an abstraction layer above the OSS layer to organise and correctly sequence the planning and implementation activities for complex projects that are sourced from multiple OSS ‘factory’ stacks. Such a centralised approach must complement the information received from the existing OSS that manages each ‘factory’ service component, enabling a comprehensive end-to-end managed solution.

Nexagent has implemented such an approach within its Nexagent System, which guides the provider through the complete solution lifecycle. It provides a centralised solution design and modelling capability, enabling the MSP to efficiently capture requirements from a potential customer and to model and validate a network design based on the ICT requirements. It then takes the model and generates an implementation procedure to take the solution through to fulfilment. Once the solution is up and running, it complements the ‘factory’ OSS by monitoring the actual end-to-end user experience across all network service components to ensure that the solution is delivering against the design requirements.  This replaces the hand-tooling that currently takes place to create the enterprise solution. It standardises design and integration processes, greatly reducing the time and number of people required to design, transition and operate the solution.
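The lifecycle described above can be pictured with a small sketch. The Python below is hypothetical (the class and method names are invented for illustration and are not Nexagent’s actual API); it simply shows the pattern of capturing requirements, validating a multi-provider design and deriving an ordered implementation sequence from component dependencies.

# A minimal sketch of a centralised solution lifecycle: capture requirements,
# model and validate the design, then generate an ordered implementation plan
# from multiple providers' 'factory' components.
from dataclasses import dataclass, field

@dataclass
class ServiceComponent:
    name: str                     # e.g. "MPLS VPN", "managed LAN", "hosted VoIP"
    provider: str                 # the 'factory' that delivers it
    depends_on: list = field(default_factory=list)

@dataclass
class EnterpriseSolution:
    requirements: dict
    components: list

    def validate(self):
        """Check that every dependency is itself part of the design."""
        names = {c.name for c in self.components}
        missing = [d for c in self.components for d in c.depends_on if d not in names]
        if missing:
            raise ValueError(f"design references missing components: {missing}")

    def implementation_sequence(self):
        """Order components so that dependencies are delivered first."""
        ordered, done, remaining = [], set(), list(self.components)
        while remaining:
            ready = [c for c in remaining if set(c.depends_on) <= done]
            if not ready:
                raise ValueError("circular dependency in solution design")
            for c in ready:
                ordered.append(c)
                done.add(c.name)
                remaining.remove(c)
        return ordered

solution = EnterpriseSolution(
    requirements={"sites": 40, "voice": True, "target_latency_ms": 80},
    components=[
        ServiceComponent("MPLS VPN", "Carrier A"),
        ServiceComponent("managed LAN", "Integrator B", depends_on=["MPLS VPN"]),
        ServiceComponent("hosted VoIP", "Provider C", depends_on=["MPLS VPN", "managed LAN"]),
    ],
)
solution.validate()
for step, comp in enumerate(solution.implementation_sequence(), 1):
    print(f"{step}. order '{comp.name}' from {comp.provider}")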

Obviously there are other benefits too: revenue is accelerated, the time taken to undertake the project is reduced and the number of mistakes in implementation is reduced  - each leading to enhanced solution profitability.  In effect the Nexagent System creates a standard, automated process for integrating and sequencing ICT services from multiple service providers’ ‘factories’. The Nexagent System is aligned with the ITIL Service Lifecycle as well as conforming with the TeleManagement Forum’s eTOM (Enhanced Telecommunications Operations Map) process model.  As the telco business model changes radically - with traditional voice call charge revenue contracting and even, in some territories, voice lines shrinking as subscribers turn to mobile replacement - telcos are keen to tap new revenues, especially in high growth, high profit areas.

Leveraging their network expertise to win combined communications/IT converged network and applications outsourcing business has always been on the radar.  But if telcos are to make good in this area, there must be a recognition that custom enterprise ICT managed solutions are no longer winnable using ad hoc hand stitching. Telcos must apply the same focus on process automation to complex ICT solutions as they currently do to the ‘factory’ parts of their business. If they don’t, there are plenty of competitors in the IT and outsourcing spaces ready to step in and take the business. Telcos need to invest now in the right tools to secure their competitive edge in this growing market.

Leo McCloskey is VP Marketing and Business Development at Nexagent

European Communications previews the TM Forum’s take on the converging industries of telecom, cable, content, media and the Internet at TMW Americas

With the slogan “Whatever the future holds – if you can’t manage it, you can’t monetise it”, the TMForum’s Management World Americas event is underlining the crucial role that effective systems management must play if the brave new world of services, from the increasingly converged strands of the telecoms, web, media, entertainment and information services industries, is to prove a profitable venture for the many and various players.
As the TM Forum celebrates its 10th anniversary, the organisation is stressing the management credentials it has built up over the years with the back room boys of telecom, to take it into a future of converged services, and a central role in the systems management of those services.
Scheduled for November 4th – 8th in Dallas, Texas, TMW Americas is created, backed and supported by the TMForum, now widely recognised as the industry’s largest independent trade association for management and operational matters and standards.  The theme of the show is based around the TMForum’s conviction that, with services converging onto single networks and increased usage of multi-service platforms to reduce cost and new service delivery times and increase reliability, operational systems and standards have never been more critical.
Spreading its appeal net far and wide, the TMForum claims that whether you are a service provider or supplier, operating a fixed or mobile network, an incumbent or challenger, a business executive or technologist, in finance or operations, TMW Americas will have something for you.  The organisers stress that technology is no longer the key differentiator in this new, converged marketplace.  It is how well assets are deployed, speed of reaction to market opportunities, and anticipation of customer needs that will count.  In other words, ‘managing operations for converging services is the key to success’.  To this end, TMW Americas aims to offer a 360º view, ranging from strategic business issues to deep-level operational and technical topics, and is lining up an impressive range of keynote speakers and a packed programme of summits, seminars and training courses.
Among the keynote speakers are Mike LaJoie, Executive Vice President and Chief Technology Officer of Time Warner Cable; Robin Bienfait, CIO overseeing BlackBerry operations and corporate IT at Research In Motion; and Kevin Salvadori, CIO, Telus.  The Executive Summit on Business Transformation offers highly interactive sessions covering a range of topics, from investment trends in wireless technology to examination of such initiatives as BT Group’s use of its 21st Century global platform to enable the delivery of more interconnected, more logical and more intuitive services.  TM Forum Technical Initiatives in the real world offers in-depth technical insight into the TM Forum Collaboration Program, through research and case studies, including such subjects as unlocking the potential of SOA as a foundation for management systems, or optimising business processes throughout the product and service lifecycle.
Other conference tracks include Managing and Delivering Content-Based Services; Managing and Optimising Customer Experience; and Operational Challenges in a Converged Market.
TM Forum’s Catalyst Showcases will also be much in evidence at TMW Americas.  The Catalyst programme is the Forum’s proving ground for pragmatic solutions – enabling service providers, system integrators, and hardware/software vendors to work together to solve common, critical industry challenges – and always proves to be a great draw at TM Forum events.
No event worth its salt, these days, misses the opportunity of giving vendors a platform to show their wares.  TMW is no exception, and the Converged Services Expo will give vendors the chance to lay out their stalls, and show the wide range of products and solutions from the many different strands of this increasingly convergent industry.

The challenge for carriers today is to establish a global and standardised network operating system that ties together both networks and applications. Verizon and Nakina teamed up to use the TM Forum's NGOSS eTOM and SID solutions and models to design and implement the fundamental building blocks of a multi-vendor, multi-technology Common Element Management System.   William F. Wilbert provides key insights on the implementation process, its challenges, and the outcomes of creating an efficient, effective, scalable, standards-based solution for managing one of the largest and most complex multi-vendor networks in the world

It held the potential for disaster. But fortunately they saw it coming, says Robert Ormsby, Director, Network Management Laboratory, at Verizon Telecom, remembering the days before Verizon and MCI merged. “We realized that unless we took action, our infrastructure could not continue to support the new services we wanted to provide. In fact, the burden of managing the network was threatening to undermine our business model.”
A top North American inter-exchange carrier (IXC), Verizon operates one of the largest communications networks in the world with an optical networking infrastructure that spans close to 100,000 miles and more than 4,000 Points of Presence (POPs). Verizon’s network currently offers high availability, plus a wide array of next-generation features. The company’s network is now a key differentiator.  But this was not always the case.
Prior to the merger with MCI, the carrier found that the costs of maintaining legacy systems – hardware, element management systems (EMSs), licensing, testing and training – were steadily rising and eating away at profits. For example, the cost of introducing a single EMS exceeded $1 million (for hardware, training, and updating methods of procedure) and also required hefty licensing, integration and ongoing maintenance fees.

OSS/BSS: a fine mess
The technology challenges facing the telecommunications industry can be summed up in a single word: complexity. Carriers operate their businesses on a complicated mix of systems and networks known as operations and business support systems (OSS/BSS). Everything runs on hardware from a variety of network equipment providers (NEPs) whose boxes come with their own proprietary systems and communications protocols. Some, but scarcely all, boxes come with element management systems (EMSs).
With varying amounts of integration work, EMSs help pull multi-vendor devices into one functioning network. EMSs tend not to be open, secure, or scalable, so a lot of extra work has to be done to make them function. “OSS/BSS integration can be a nightmare because everything has to be interfaced to everything else,” Ormsby says. “In most cases there is no standard information model to help build a searchable database, and this causes a huge stumbling block to network efficiency and reliability.”
Since the EMS layer does little to hide network complexity, many manual processes are required to fill in the gaps. To provide a single new service across multi-vendor, multi-domain (optical, Ethernet, IP/MPLS) systems requires stitching together a tangled web of networks and applications – literally spanning hundreds or thousands of network nodes utilizing many different software interfaces. Often, each device has to be manually interfaced into the network. Adding a new device type or application to the mix typically requires upgrading both hardware and software across the entire telecom network system – not an easy task considering that many of today’s new services and networks are built with a complex mix of products.
“Telecom operators deploying equipment from many different vendors face the challenges of integrating multiple EMSs into their OSS systems and training staff in each different EMS,” says Peter Mottishaw, senior analyst with independent analyst firm OSS Observer. “Many Equipment Providers deliver capable element management systems, but despite the efforts of telecom standards bodies there is still patchy support for standard northbound interfaces. This drives up integration costs. A further issue is that some equipment vendors do not deliver EMSs that meet the full set of operator requirements. EMS development is costly and complex and equipment vendors focused on getting new products to market sometimes under-invest in this area.”

Untangling the web: keep it simple
Almost all telecom service providers say they want to reduce the number of OSS vendors and products they have to manage. As they see it, their vendors – independent software vendors (ISVs) and network equipment providers (NEPs) alike – should simplify their product portfolios and move toward standardized offerings that have the potential to support plug-and-play environments that minimize network and application integration. Many are hoping that standardized Service Delivery Platforms (SDPs) will allow them to rapidly roll out new services on increasingly converged next generation networks, with high reliability and significantly lower costs.
SDPs rely on OSS/BSS systems underneath them, which in turn interface to the network elements (NEs) through element management systems (EMSs). Yet a fundamental disconnect persists between the NE and OSS layers of the network – a problem that has become costly and complex because of the continuous and inexorable technology changes taking place at both levels.
As NEs change, so does the OSS/BSS interface layer, which may have a ripple effect on the overlying services. If carriers wish to implement new OSS/BSS applications, these interfaces must be rebuilt and tested against both the SDP and the underlying EMS layer. An architecture that considers all these elements is critical if a sustained advantage is to be held in the market. Rapid innovations in network element technologies have to be constantly linked with new operational support systems (OSS) above them. The problem only worsens as the number of OSS and network technologies grows.
Thinking there had to be a technical solution for a technical problem, Verizon/MCI teamed with Nakina Systems of Ottawa, Canada to help develop and implement a new kind of integration solution called a Network Operating System (NOS). A whole new animal, a NOS essentially combines the capabilities of an EMS and an NMS (Network Management System) for multi-vendor, multi-technology networks, with a carrier-grade, scalable and secure architecture built using open, standards-based interfaces.
 “Our vision for a NOS is to provide a single management solution that discovers, secures, configures and manages any vendor’s networking products,” says Nakina Systems Chairman and CEO, Marco Pagani. “In essence, it is a universal network adaptation, abstraction and mediation layer that provides a single point of integration between the actual network elements themselves and higher-level management, OSS and BSS applications.”
The Nakina solution provides carrier-grade performance and scalability that spans both legacy and next-generation equipment across multiple domains and OSI layers – including SONET, SDH and WDM optical equipment, Ethernet switches, IP/MPLS routers, video switches, service routers and wireless equipment, among others. Unlike a traditional EMS, which essentially provides an interface or API for a specific network element type, a NOS provides a stable environment and single point of integration that abstracts the network complexity and disengages it from direct integration with the upper layer OSS/BSS, in the same way that the operating system on a PC separates the hardware from its applications.
This mediation and abstraction function allows higher-level applications or services to be developed independently while new network equipment is introduced and updated beneath it – all while ensuring that the entire system still functions together like clockwork. A universal mediation layer helps service providers roll out their next-generation services and network infrastructure much faster in multi-vendor build-outs such as residential broadband, triple play, IPTV, Carrier Ethernet and wireless backhaul applications.
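The pattern is easiest to see in miniature. The Python sketch below is illustrative only (the interfaces and vendor adapters are hypothetical, not Nakina’s actual SDK): OSS/BSS applications see one uniform interface, while per-vendor adapters, registered at run time, translate to each box’s native management protocol.

# Illustrative sketch of a mediation/abstraction layer: one uniform northbound
# interface, with vendor-specific adapters plugged in beneath it.
from abc import ABC, abstractmethod

class ElementAdapter(ABC):
    """Uniform interface that every vendor-specific adapter implements."""
    @abstractmethod
    def discover(self) -> dict: ...
    @abstractmethod
    def configure(self, params: dict) -> None: ...
    @abstractmethod
    def collect_alarms(self) -> list: ...

class VendorXOpticalAdapter(ElementAdapter):
    """Translates the uniform interface to a (hypothetical) TL1 dialect."""
    def discover(self):
        return {"type": "SONET ADM", "software": "4.2"}
    def configure(self, params):
        print(f"TL1> ED-EQPT::{params}")
    def collect_alarms(self):
        return []

class VendorYRouterAdapter(ElementAdapter):
    """Translates the uniform interface to a (hypothetical) CLI/NETCONF dialect."""
    def discover(self):
        return {"type": "IP/MPLS router", "software": "12.1"}
    def configure(self, params):
        print(f"netconf edit-config: {params}")
    def collect_alarms(self):
        return ["link-down ge-0/0/1"]

class NetworkOperatingSystem:
    """Single point of integration: higher-level OSS/BSS only ever see this class."""
    def __init__(self):
        self.adapters = {}
    def register(self, element_id: str, adapter: ElementAdapter):
        # Adapters can be registered at run time ("hot deployed") as new
        # equipment types are introduced beneath the abstraction layer.
        self.adapters[element_id] = adapter
    def inventory(self):
        return {eid: a.discover() for eid, a in self.adapters.items()}

nos = NetworkOperatingSystem()
nos.register("optical-ring-7", VendorXOpticalAdapter())
nos.register("core-router-3", VendorYRouterAdapter())
print(nos.inventory())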

NOS in Action: the Verizon CEMS solution
In 2005, Verizon Business (formerly MCI) set out to build a state-of-the-art ultra-long-haul transport network and a converged packet access network initially comprised of over twenty different types of equipment from ten different vendors.  With new technologies being introduced routinely and an already overloaded operations staff, Verizon was looking for what they termed a “Common Element Management System” (CEMS) as a solution to manage both these new networks.  The goal was that CEMS would reduce operating expenses by limiting the growth of single-vendor EMSs, providing centralized operations control and simplifying the integration of new devices into their existing back office systems.
Verizon’s initial pilot project for the production CEMS solution was to upgrade functionality across a 40-node segment of the optical network without compromising network availability. The roll-out, which spanned two major metropolitan cities in the southwestern U.S. and covered approximately 650 miles (1,045 kilometers), would not only deliver new features and products to customers, it would also result in a more self-sustaining and secure network with fewer outages.
Verizon worked closely with Nakina Systems, using the company’s network OS product as the CEMS solution. The entire audit and software upgrade process was accomplished remotely in less than two hours.  To the surprise and satisfaction of the operations personnel, this was a dramatic improvement over the 40-hour, week-long, on-site effort that Verizon had originally anticipated. Verizon was also able to leverage the new features available in the new software load a full week ahead of schedule, reducing its time-to-new-service revenue. As it turned out, the CEMS solution held significant implications for the future in its potential to reduce costs and manage thousands of nodes across both networks.
Based on the success of the pilot project, Verizon Business has since expanded the use of Nakina across many of its vendors, with payback being achieved simply on cost avoidance of single-vendor EMSs. Overall, approximately $1M to $1.5M savings have been achieved per EMS in addition to real dollar savings that have been made in OPEX due to consistent, reliable operations. 
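A back-of-the-envelope calculation, using the per-EMS figure quoted above and the ten vendors mentioned earlier, gives a sense of the scale of that avoidance (the sketch below is illustrative arithmetic, not Verizon’s accounting).

# Rough cost-avoidance arithmetic based on the figures quoted in the article:
# roughly $1M to $1.5M saved per single-vendor EMS displaced.
PER_EMS_SAVING_USD = (1_000_000, 1_500_000)   # low and high estimates per EMS

def cems_cost_avoidance(emss_avoided: int):
    low, high = PER_EMS_SAVING_USD
    return emss_avoided * low, emss_avoided * high

low, high = cems_cost_avoidance(10)   # e.g. one EMS per each of the ten vendors
print(f"Estimated cost avoidance: ${low:,} to ${high:,}")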
The CEMS solution has now been in production for over two years providing Verizon the benefits of a single point of integration and a consistent set of procedures and interfaces for all their network element types. In addition, the service provider has also noted the following substantial CEMS benefits that help drive new revenues while making the network more efficient:
• Simplifies and accelerates the introduction of new services
• Lowers the cost and enables rapid integration of new systems into higher level OSS/BSS systems
• Provides one set of methods, applications, and interfaces that apply to all vendors
• Enables cost-efficient training, due to common procedures

Ace in the Hole: industry standards
Until recently, the idea of a Network Operating System was looked upon with some skepticism. Who, for example, would assume the burden of building and continuously updating “adapters” (or device drivers) so that the NOS could continue to talk to each box on the network?
“When we first started talking about the idea of a network OS, there were more than a few doubters, but that has turned around,” says Mr. Pagani. “Now many people think that a universal mediation layer will emerge naturally as more and more carriers put pressure on equipment vendors to provide standard interfaces and adapters as an integrated part of the product, just as PC peripherals manufacturers provide drivers as a standard part of their products. Nakina frequently gets requests from NEPs for our Software Development Kit (SDK) so they can build their own adapters.”
As the most effective way of transcending a mix of proprietary products from NEPs, the Nakina Network OS solution was designed from the beginning to have an open and modular software architecture. It adheres to the TeleManagement Forum’s New Generation OSS (NGOSS) standards. With an open architecture approach, “adapters” can be implemented at run time. These adapters are hot-deployable and their development time is measured in days or weeks rather than the months or even years required to build a carrier-grade EMS.
Nakina is also forging partnerships and alliances with equipment vendors such as ANDA Networks and LSI, and System Integrators (SIs) such as HP. “These companies realize that there is a tremendous benefit to creating a standards-based common environment that makes it easy to manage and upgrade all devices on the network,” says Mary O’Neill, Vice President of Market Development at Nakina Systems. “A multi-vendor ecosystem will equally benefit carriers, NEPs, ISVs and SIs.”
With a NOS in place, a service provider is far better equipped to retain and build a differentiated service offering in the market and will be ready for their move beyond Quad Play, keeping services affordable over time.
“Nakina’s Network OS solution is a comprehensive EMS platform that has been proven in deployments with tier-1 customers,” says OSS Observer’s Peter Mottishaw. “It provides the scalability and security required of a carrier-class EMS. Operators struggling with multi-vendor equipment environments and deficiencies in existing EMSs should consider the platform as an alternative to purchasing EMSs from equipment manufacturers. Network equipment manufacturers who cannot support large-scale EMS platform development should also consider the Nakina Network OS as a potential common platform.”

William F. Wilbert has written for technology publications for more than 15 years
For more information, visit
www.nakinasystems.com

The move to newer generation networks has brought the realisation that a horizontal layered structure of OSS/BSS that is common across service layouts is more viable.  Anita Gupta explains that in such architectures, the middleware layer acts as the glue that binds together the OSS and BSS components and makes them truly network independent

In recent years, the telecom market has seen a proliferation of packet-based IP networks, predominantly wireless networks. As communications companies compete with each other and with an entirely new breed of competitors such as the cable operators, the race is on to differentiate their offerings in the minds of consumers.
Service architectures such as IMS and SOA have started to gain acceptance, though there is no clear winner yet. The network operators have also been deliberating long enough on the potential of migrating from one access network to another keeping in view their long-term viability and growth objectives.
Although it is impossible to predict market trends with certainty, it is widely accepted that IP networks are indeed the way forward. Operators have been cautiously tracking and adopting the transition from legacy networks like the PSTN to newer generation networks such as 3G, 4G and WiMAX. Amidst this discussion is the realisation that the approach of a separate vertical OSS/BSS for each service can no longer be relied on. A horizontal layered structure of OSS/BSS that is common across service layouts is more viable. In such architectures, the middleware layer acts as the glue that binds together the OSS and BSS components and makes them truly network independent. In short, this layer of component-based frameworks abstracts the underlying network from the management systems above, and provides ease of migration from one access network to another (say from 2G to 3G).

Where are the wireless networks headed?
Network operators are carefully considering their next moves on access network deployments. Some GSM (2G) network operators are looking at migrating to 3G networks, and some are even thinking of migrating to the evolving fourth generation (4G) networks, where user services are meant to be offered on an "Anytime, Anywhere" basis and at very high data rates. Whether companies should bypass 3G and leapfrog straight to 4G networks is a matter of intense debate. It is also argued that 3G and 4G technologies are not mutually exclusive but complementary. Fourth generation networks are said to offer air interface data rates around ten times those of 3G networks. The 4G technologies are still in their infancy and international standards do not yet exist. CDMA-based operators (for example Sprint Nextel in the US) are also looking at fourth generation services, primarily on the back of mobile WiMAX as the enabling technology. WiMAX, in fact, is being widely adopted to offer several broadband services. On its own, WiMAX provides a means of increasing bandwidth for a variety of data-intensive applications. A variant, mobile WiMAX, delivers data at a speed comparable to that of conventional third-generation (3G) networks, but it promises to be cheaper to implement because it uses newer, more efficient technology. Then there are other operators who follow a seemingly straightforward path of transition from 2.5G or 3G to 4G deployments. The 2.5G systems (like GPRS), an interim step between 2G and 3G, provide enhanced channel capacity, higher data rates and throughput, and optimised packet-data transmission, enhancing Internet access from different wireless devices.
Figure 1: Network migration
The new generation radio access networks will be subjected to operational challenges such as the need to handle erratic traffic patterns based on individual demands of different multimedia services, higher throughput requirements, high costs associated with deployment and operation, and the heterogeneity of different radio air interfaces.
In order to offer ‘anywhere-anytime' wireless access, the inter-working and harmonization among heterogeneous networks are the most important requirements.
Increasing and changing service demands and limited radio resources are the main challenges in planning the future network. However, more flexible network architecture, advanced radio resource management and spectrum management schemes all contribute to increased spectrum efficiency. The network planning procedure should consider not only the features of radio elements, but also traffic demand, resource and traffic scalability, the interoperability of heterogeneous sub-networks and the spectrum management schemes. Such performance enhancement measures will effectively help to reduce CAPEX and OPEX.

Abstracting your network to gain an advantage
Global telecom market trends suggest that OSS/BSS architectures are evolving to a strategic level. They play a major role in the inter-working of several heterogeneous networks. The challenge in this new paradigm of service environments is to homogenize these networks, fostering interoperability between systems and rationalizing OPEX.
The traditional approach of having a separate OSS/BSS for each application, which has been a hallmark of the telecom industry, needs to be abandoned in favor of a horizontal structure if operators are to make any significant market gains in the years to come.
The telecommunication back-office is facing severe reintegration issues. It must redefine its architecture in order to understand how to migrate its heterogeneous services to a homogeneous architecture, and offer the differentiated and converged services the end-user is looking for.
Today, there is a segmentation of the elements between the network and the service areas. However, as service providers move towards a more integrated business model, we are witnessing the emergence of OSS middleware that provides a layer of abstraction - a mediation layer - that can ensure the interoperability of functions and ensure the scalability of services.
Figure 2: Mediation layer of abstraction for evolving networks

Figure 2 illustrates the concept of abstraction proposed by the OSS middleware layer of mediation. This well-defined layer of applications shields the network/service provider from the changes of configuration, administration, QoS and supervision that every new technology introduces. The network operator experiences a seamless transition from the incumbent to the new network, without having to make any drastic changes to the existing management applications. Such an abstraction layer allows the overall OSS and BSS to be independent of the underlying network elements, helping service providers reduce dependencies on their network vendors' roadmaps and product priorities.
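To make this abstraction concrete, here is a minimal sketch in Python - purely illustrative, with invented class and method names that do not represent any vendor's product - of how a mediation layer might expose one network-independent interface to the OSS/BSS applications above it, while technology-specific adapters handle the 2G or 3G elements below.

```python
from abc import ABC, abstractmethod


class NetworkAdapter(ABC):
    """Technology-specific adapter: hides how a given access network is configured and supervised."""

    @abstractmethod
    def provision_subscriber(self, subscriber_id: str, service: str) -> None:
        ...

    @abstractmethod
    def get_qos_metrics(self, subscriber_id: str) -> dict:
        ...


class Gsm2GAdapter(NetworkAdapter):
    def provision_subscriber(self, subscriber_id, service):
        print(f"[2G] provisioning {service} for {subscriber_id} via the legacy EMS")

    def get_qos_metrics(self, subscriber_id):
        return {"technology": "2G", "throughput_kbps": 40}


class Umts3GAdapter(NetworkAdapter):
    def provision_subscriber(self, subscriber_id, service):
        print(f"[3G] provisioning {service} for {subscriber_id} via the RNC interface")

    def get_qos_metrics(self, subscriber_id):
        return {"technology": "3G", "throughput_kbps": 384}


class MediationLayer:
    """The 'glue': OSS/BSS applications call only this class, so swapping the
    access technology underneath does not change them."""

    def __init__(self, adapter: NetworkAdapter):
        self._adapter = adapter

    def migrate_to(self, adapter: NetworkAdapter) -> None:
        self._adapter = adapter  # seamless transition for the layers above

    def provision(self, subscriber_id: str, service: str) -> None:
        self._adapter.provision_subscriber(subscriber_id, service)

    def qos(self, subscriber_id: str) -> dict:
        return self._adapter.get_qos_metrics(subscriber_id)


# The OSS application code is identical before and after the 2G-to-3G migration.
oss = MediationLayer(Gsm2GAdapter())
oss.provision("447900000001", "mobile-data")
oss.migrate_to(Umts3GAdapter())
oss.provision("447900000001", "mobile-data")
print(oss.qos("447900000001"))
```

The point of the sketch is simply that the management applications above the layer never see the adapter swap.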
OSS is a strategic area for a service provider and there is no place for trial-and-error experimentation. This least-risk approach means IMS and SOA are yet to have a major impact at this stage in the market. There seems to be a cautious approach by telecom companies to moving into either SOA with IMS or any other emerging standard. Most are adopting a wait-and-see approach, instead turning to OSS mediation middleware to converge their systems while they see what happens in the wider market.
The telecom industry should soon see a trend towards outsourcing OSS. Increasingly, it is becoming clear that OSS won't be captive to one operator but will instead shift to becoming a hosted service. Rather than worrying about the maintenance and deployment of their networks, operators want to focus their energies on their core competences of managing and growing their businesses.

Bringing business and networks together
The need of the hour is to migrate critical applications in all the important areas in order to evolve a homogenized architecture that is future ready and supports current requirements equally well. These areas are:

  • Customer facing applications
  • Operations/Network facing applications
  • Common maintenance and support applications

Proper integration of the varied middleware components is of utmost importance, as it influences the critical manageability aspects of all the networks. It ensures a smooth link-up between BSS and OSS, enabling rapid development and delivery of the operator's services roadmap. This calls for an investment in building a joint business model for identified solutions and services. Telecom service providers are becoming decisively more inclined to build strategic partnerships with experienced OSS/BSS partners and vendors who have strong telecom domain experience and have invested in building the skills and expertise to address all critical success factors.
The true potential of the networks cannot be realized until the businesses that they create are compelling enough for the end customer to embrace. For this to happen, network manageability has to be as responsive to business needs as businesses themselves are to their customers. It is hard to predict the future, but there is no doubt that wireless networks are at the technological forefront; their proximity to businesses may yet define the winners in telecom.

Anita Gupta is Strategic Business Unit Head, Telecom Service Provider business at Aricent

In an era of commoditisation, competition and churn, it is becoming increasingly difficult for telecommunications providers to find meaningful market differentiators. Superior customer service offers a solution, but, to be truly effective, it requires superior customer information, explains James Wilkinson

Where most fixed-line and wireless services now offer broadly similar products, services and pricing structures, and customers can switch with increasing ease to providers with the latest incentive packages, the competitive advantage is often with those who offer a higher quality, more personalised customer service.  This also provides churn-reducing loyalty and, as every marketer knows, retaining an existing customer is exponentially more cost-efficient than recruiting a new one. Existing customers also represent a potentially rich source of cross-product and upselling opportunities - providing, of course, one knows what they want.
Knowing what the customer wants depends on knowing who they are. This, in turn, requires an IT system with the capacity to capture relevant information from customer-facing sources, combine this data with other enterprise systems and then maintain the information in a format that is easily accessible, reliably accurate and always up-to-date. Creating a master customer index with these attributes is no trivial task. To really do the job requires a combination of advanced integration, identity matching and data management technology.
Until now telecoms providers have been investing heavily in applications such as CRM to create their customer profiles. Increasingly, however, CRM is being viewed as an entry-level solution. While CRM is capable of capturing a plethora of data and loading it into one place, it typically lacks the ability to integrate customer-facing and back office systems such as customer accounts, enterprise resource planning (ERP) or web portals. Nor are CRM systems generally able to arrange this information into a meaningful, prioritised view of the customer or business entity.
Having the ability to capture every front-end transaction and then be able to selectively transform this data into a coherent view of the customer that is both current and operational remains an elusive quest with CRM.  To accomplish this level of sophistication, today's solution of choice is Customer Data Integration (CDI) - a subset of Master Data Management (MDM), an advanced suite of data integration and identity matching tools that enables providers to ‘know their customer' with a next generation level of depth, detail and accuracy. By offering a complete, front-to-back office system of tools for real-time customer visibility, CDI picks up where CRM leaves off.  

A 360° customer view
A CDI-based customer profile, or master record, is far richer than a single-faceted view based on something like the monthly bill. It brings together strands of information from all touch points to enable the telecoms provider to see the full set of relationships it has with each customer, whether corporate or consumer. In the case of a household, this may involve consolidating a number of different accounts either held in separate data silos (accounts payable, marketing, CRM, ERP, etc.), or belonging to different family members, all with their own mobile phone accounts. For some purposes, such as payments receivable, these accounts may need to be viewed together. For functions like direct marketing, however, customer files may need to be analysed individually. Adding to this complexity is the proliferation of new services coming to market such as wireless Internet, TV-over-mobile and other content-based services, all of which makes the task of obtaining a single unified view of the customer an even greater challenge.
In the case of corporate clients, the consolidated customer record may include a variety of other information on the company's supply chain partners, different departments, lines of business or global network of offices. Using CDI technology, this data can be segmented according to any set of pre-defined parameters in order to identify such things as call volumes within a specific cost centre or to get a breakdown of personal vs. business mobile phone usage for individual staff members. This layered, or hierarchical, view of the enterprise not only provides the telecoms operator with an enriched level of insight into the high-margin business customer, it constitutes a potentially new value-added data management service for which corporates may be prepared to pay a premium.
Traditionally, the call centre is one of the service provider's costliest overheads. Given the right tools, however, the customer support representative (CSR) can not only be a problem-buster and retention builder, but can also become a dynamic sales agent, transforming the hitherto cost centre into an effective profit centre.  When a customer calls in (usually with a complaint), the CSR must have instant, at-a-glance access to the customer's complete master record - time lost searching for files, asking repetitive identifier questions or transferring the caller to other departments inevitably compounds the customer's initial irritation. By contrast, having a complete overview of the customer from the start and being able to sort out the problem quickly creates a positive, personalised customer experience. This in turn prepares the ground for a selling opportunity.
Armed with this master customer record, a CSR can quickly spot service usage trends and offer better tariff rates, lock in new call packages or sell a mobile contract to a fixed or broadband customer. This interaction at the same time enables agents to update demographic details such as recent change of address, add new staff members to a corporate account, or make note of a family member who has reached school age and may need a mobile phone.  Once captured within the CDI hub, this continually updated record becomes the most accurate, trusted source of customer profiling information, available in real-time at any security-cleared service or operational touch point.
Using CDI technology, creating the customer record is speedy, requires no costly data transformation professionals and is non-invasive to existing systems because the hub sits between the existing systems that gather data, and the enterprise systems that want to consume these data. Like a spider's web, it is linked to legacy back-office data sources and customer-facing applications via a system of re-usable, object-oriented components enabling master record data to be gathered and held in either a central, or a federated repository. Because this indexing database is set up to capture information from anywhere across the enterprise on a real-time basis, it becomes the most accurate, up-to-date and trusted source of information concerning the customer, business or any other entity.

Intelligent data matching
One of the main challenges of integrating data from disparate sources into a central repository is that the resulting database is frequently cluttered with duplicate files and fractured, incomplete information. This is often the result of misspelling and other errors: for example, having one file under ‘Smythe' and another under ‘Smith', or just the typical variation associated with collecting data (using Bill sometimes and William other times). Cleaning up the customer's master record, and then keeping it current on an ongoing basis, is one of today's biggest IT headaches. To tackle the job, Initiate Identity Hub software, one of the industry's leading CDI/MDM solutions, employs a system of highly accurate probabilistic matching and linking algorithms to identify and resolve these anomalies. Mechanisms are also in place to ensure that changes subsequently made by a CSR or customer-facing application are reflected in the master record.
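By way of illustration only - this is not Initiate's actual algorithm - the sketch below shows the general principle behind probabilistic record matching: candidate pairs are scored across several attributes, fuzzy name comparison is combined with a small nickname table, and records are linked once the weighted score crosses a threshold. The fields, weights and threshold are all assumptions.

```python
from difflib import SequenceMatcher

NICKNAMES = {"bill": "william", "liz": "elizabeth"}  # tiny illustrative alias table


def canonical(name: str) -> str:
    """Lower-case the name and expand a known nickname to its formal form."""
    name = name.strip().lower()
    return NICKNAMES.get(name, name)


def similarity(a: str, b: str) -> float:
    """Fuzzy string similarity between 0 and 1."""
    return SequenceMatcher(None, canonical(a), canonical(b)).ratio()


def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted, probabilistic-style score across several attributes."""
    score = 0.0
    score += 0.4 * similarity(rec_a["surname"], rec_b["surname"])
    score += 0.3 * similarity(rec_a["first_name"], rec_b["first_name"])
    score += 0.3 * (1.0 if rec_a["postcode"] == rec_b["postcode"] else 0.0)
    return score


LINK_THRESHOLD = 0.85  # above this, the two records are linked into one master record

a = {"first_name": "Bill", "surname": "Smythe", "postcode": "SW1A 1AA"}
b = {"first_name": "William", "surname": "Smith", "postcode": "SW1A 1AA"}

score = match_score(a, b)
print(f"score={score:.2f}", "-> link" if score >= LINK_THRESHOLD else "-> manual review")
```

In practice such engines weigh far more attributes and calibrate the weights statistically, but the link-or-review decision follows the same shape.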
In creating a master customer index, telecoms providers have an advantage over most other industries: a unique customer identifier called the telephone number. A wealth of profiling data can - in theory at least - be collected around the phone number. However, this can also lead to much confusion. An individual may have several different numbers (work, home, mobile etc.), or conversely a group of people (office or family members) may share a common number. Another common scenario is that a customer has switched to a different provider and taken their number with them. This is a particular challenge for suppliers like the Carphone Warehouse, which represents multiple providers.  In such a case, CDI's intelligent matching technology can link up customers with their mobile numbers and their current supplier to eliminate duplicate or obsolete files and then group this corrected information into a new master customer record.
So far, we have focused on the customer-centric application of CDI. However, because this technology provides a fast, cost-efficient means of integrating silo information, it also has major cost and efficiency implications at an operational level. Indeed, one of the outcomes of the recent spate of merger and acquisition activity within the telecoms industry is a need to rationalise increased amounts of legacy data, frequently at a global level. Once gathered into the CDI hub, customer lists from the merged entities can be compared and cleaned, and supplier data can be compared for negotiating leverage, notably in cases where the parent company may have pre-existing partner contracts, or where the combined volume of business suggests a pricing discount is appropriate.  Whatever the application, CDI/MDM's contribution - whether customer-facing or operational - is revolutionising the data management landscape.

James Wilkinson is EMEA Services Director,
Initiate Systems
www.initiatesystems.com

Customer Experience Management (CEM) is a crucial process for mobile network operators and communication service providers in general. It places the customer at the centre of a converging communications environment, finally recognising that it is the end-users’ response to their experience of the network and its services that will ultimately drive the success or failure of any network brand.  James Doyle discusses the emergence of CEM as a crucial application, and why it is now an essential complementary part of CRM/BSS/OSS processes

Is Customer Experience Management (CEM) just another three-letter acronym added to a growing list, or does it represent a new approach to delivering value within the OSS/BSS landscape? This is a common question passing across the lips of those who bump up against CEM for the first time.
CEM has traditionally been viewed by management consultants and analysts as a business and marketing philosophy rather than a real technology or application delivering tangible value. However, CEM implementations through real software solutions across the globe are on the increase and are producing real results in new revenue generation, lower churn rates and greater customer satisfaction amongst many MNOs. CEM systems are able to build top-down models that define customer experience and to measure experience indicators in transaction-orientated networks against those models in near real time. This allows a customer’s experience of service consumption to be viewed by the CSP; through such measurement, experience ‘gaps’ can be identified and customers proactively managed as a result. This, in turn, reveals customer satisfaction, churn potential, lost and potential revenues, brand damage, negative and positive advocacy, and a customer-centric experience view of your business.

CEM’s differentiating qualities
So, how different is CEM to Customer Relationship Management (CRM) or Service Quality Management (SQM)? The difference is subtle but very significant and requires approaching the problem from the customer’s point of view. Firstly, CEM delivers a customer experience view of every point of interaction the customer has with a CSP’s business and operational platforms, e.g. service usage, billing, customer care. We call these the customer touchpoints; secondly, it delivers a complete view of a customer experience and their problems direct to the desktop of the relevant stakeholder or workflow process within the business, so immediate action or feedback can be taken to improve that experience, e.g. closing the loop.
CEM is, therefore, more of a horizontal, iterative way of providing experience feedback to the entire business, whereas  the other approaches are vertical methods of understanding the operation of assets within each silo; SQM delivers a service view rather than a customer or business view, and CRM data limitations mean a real-time, actual customer-experience view cannot be extracted. 
Using the customer as the central reference point of your value delivery and service experience provides a more holistic and better way of managing and running a business, especially in terms of technology and service integration (e.g. convergence). Why rely solely on standard OSS and BSS silos, with systems to measure each silo (sometimes per technology), when you can use the customer to pull this data together to provide a more complete picture?
Such views are supported by independent research from Allen, Reichheld and Hamilton in the October 2005 Harvard Business Newsletter Management Update. Their paper, entitled Tuning Into The Voice of Your Customer, demonstrates that simple measurement of the positive and negative advocacy of a company's customers provides a clear indication of that business's growth potential. Full CEM systems act as a key feedback mechanism to deliver against this theory.

Closing the gap
In terms of aiding speed of response to customer needs, CEM is one of the most efficient and proactive ways an operator has of closing the gap between measuring a poor performance and enacting change to solve the problem. Other OSS and BSS systems just do not ’see‘ the actual customers behaviour so clearly. CEM can also be applied to traditional BSS areas like customer care and CRM, which is a key part of Arantech’s solution strategy and what the company, in turn, believes should become an overall industry strategy. The company sees three CEM elements as crucial: First CEM – that is providing rich, customer-centric data early to the company, especially key stakeholders like executives, customer care and account teams; Proactive CEM – this is about taking action before the customer has raised a problem and can be seen as managing against the positive and negative advocacy of your customer base; and Next Best Action CEM – this is about applying business logic rules to a combined CEM and CRM data set to provide a simple set of next best actions for the operator to improve the business process.
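As a purely hypothetical sketch of the ‘Next Best Action’ idea - not Arantech’s actual product logic - the snippet below applies a few ordered business rules to a combined CEM/CRM view of one subscriber and returns the first action that fires. All field names, thresholds and actions are invented.

```python
def next_best_action(profile: dict) -> str:
    """Apply ordered business rules to a combined CEM + CRM view of a subscriber.
    The first matching rule wins; the fallback is 'no action required'."""
    rules = [
        # (condition, recommended next best action)
        (lambda p: p["dropped_calls_7d"] > 5 and p["segment"] == "high_value",
         "priority callback from the care team"),
        (lambda p: p["failed_data_sessions_7d"] > 10,
         "push a device/APN reconfiguration over the air"),
        (lambda p: p["months_to_contract_end"] <= 2 and p["experience_score"] < 0.6,
         "proactive retention offer"),
    ]
    for condition, action in rules:
        if condition(profile):
            return action
    return "no action required"


subscriber = {
    "segment": "high_value",
    "dropped_calls_7d": 7,
    "failed_data_sessions_7d": 3,
    "months_to_contract_end": 6,
    "experience_score": 0.55,
}
print(next_best_action(subscriber))  # -> priority callback from the care team
```

A real deployment would feed such rules from continuously measured touchpoint data rather than a static dictionary, but the closing-the-loop principle is the same.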
A proactive customer-management strategy executed through, say, customer care activities can deliver a much needed proactive first line of care, rather than a reactive, cost-based process. By being more proactive, an operator’s process will become more revenue focused and less cost focused. And if one considers that most customer experience problems don’t even get reported to the customer care department, this proactive approach makes total sense in revenue and experience terms.
In the end, the only way of getting a customer-focused view is by going top down, customer-to-asset, and not bottom-up. This requires measuring all customers all of the time, in real time, enabling a CSP to look at how these customers are using and consuming both services and network basics.
In a network management or service management system, tens of thousands of elements are measured and managed, whereas in a customer-centric solution tens of millions of customers and their experiences are managed; this is a key differentiator of CEM systems.
Currently, a significant number of operators and service providers are already applying the Arantech touchpoint CEM to their networks and are now able to monitor service delivery and usage through the eyes of their subscribers.

The future
The TM Forum has spent significant time defining service management and turning it into deployable technology. Taking a pure customer-centric approach and turning that into a deployable technology has, until now, only been pursued comprehensively by one company, Arantech, currently the thought leader and pioneer of this space. However, over the last 18 months CEM has started to become a real market opportunity, which is attracting more vendors to the CEM approach.
CEM, as both a business process and a best practice discipline for any operator, is beginning to both catalyse and enable a cultural change process across the mobile industry. It is helping CSPs review the whole way in which they do business and has already led a number of MNOs to change their workflows and processes to reap competitive and revenue advantages.
So, are the days of just measuring the performance of a network or service rapidly drawing to a close? Maybe the answer to that will revolve around whether current systems in use will ever be able to positively answer several key questions, in both real-time and historically, across all touchpoints at which the subscriber experiences the network and its services. These questions are:
• Can you segment, group and manage the service experience of the entire customer base dynamically?
• Can you identify who is trying to spend money but can’t?
• Can you identify which customers are having a good or bad experience across all operational and business platforms?
• Can you identify what services are being consumed by which customers or device?
• Can you identify which of your network assets are the most effective at delivering business value and profit?
• Can you continually determine the positive and negative high-margin customers dynamically to ensure an action plan for high company growth?
If the system in use is a comprehensive CEM implementation, such questions will be answered in the affirmative on each count and, more importantly, the answers delivered to the desktop of the key stakeholder or process responsible.

James Doyle is VP Marketing and Product Management with Arantech

The telecoms market is renowned for its rapid billing cycles, but service creation has always been expensive, time-consuming and operator-led. Antoine Guy explains that a new approach promises to revolutionise this situation by addressing the very core of the problem: subscriber management

Taking their lead from other sectors, such as IT and manufacturing, telecoms vendors are now beginning to embrace a new discipline - Product Lifecycle Management (PLM) - which originates from Computer Aided Design (CAD). PLM applications unify the marketing, service management and network resources of the operator to provide end-to-end product creation, re-using data to shrink the time and cost associated with taking services to market. PLM is gaining interest, with the TM Forum recently forming the Product and Service Assembly Forum to standardise the discipline, and it has already begun to pave the way for personalised provisioning.
An example of PLM in action is a new breed of Subscriber Management Platform (SMP) capable of uniting traffic management systems with the operator's dedicated subscriber management system, RADIUS and DHCP servers. Drawing upon the data from each of these, the SMP is able to correlate the user's network activity against the services he or she takes and those at the operator's disposal. In effect, it enables a broadband network to become ‘subscriber-aware', mapping the connection between individual subscribers and the services they consume.
The first step to subscriber-led servicing is to understand subscriber usage by mapping applications on the network. Over today's IP infrastructure, a multitude of applications compete for capacity unchecked, and many operators are unaware of how capacity is being used. Deep packet inspection (DPI) technology can provide the operator with visibility into network usage, identifying content in the packet's header and payload and cross referencing this with a library of applications to determine the nature of the traffic. DPI technology segments packets by protocol, application type and patterns within data payloads, offering detailed visibility into traffic usage and trends. This level of visibility - understanding the users, protocols, and applications on the network and how they are behaving - is the first step in controlling traffic and usage down to the subscriber level.
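A drastically simplified sketch of that idea follows: packets are matched against a small ‘application library' of payload patterns, with the destination port kept only as a fallback hint. Real DPI engines are far more sophisticated; the signatures and application names below are illustrative only.

```python
import re

# Tiny illustrative "application library": payload patterns plus a port hint.
SIGNATURES = [
    {"app": "HTTP",       "port": 80,   "payload": re.compile(rb"^(GET|POST|HEAD) ")},
    {"app": "BitTorrent", "port": None, "payload": re.compile(rb"\x13BitTorrent protocol")},
    {"app": "SIP/VoIP",   "port": 5060, "payload": re.compile(rb"^(INVITE|REGISTER) sip:")},
]


def classify(dst_port: int, payload: bytes) -> str:
    """Payload patterns take priority; the destination port is only a fallback hint."""
    for sig in SIGNATURES:
        if sig["payload"].search(payload):
            return sig["app"]
    for sig in SIGNATURES:
        if sig["port"] == dst_port:
            return sig["app"]
    return "unknown"


print(classify(80,   b"GET /index.html HTTP/1.1\r\n"))            # -> HTTP
print(classify(6881, b"\x13BitTorrent protocol..."))               # -> BitTorrent
print(classify(5060, b"INVITE sip:alice@example.com SIP/2.0"))     # -> SIP/VoIP
```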
With this baseline in place, application control can begin. This allows the operator to allocate sufficient resources to promote or demote an application. The operator can classify traffic and assign actions to each class to create network rules, or policies. For example, if peer-to-peer traffic is hindering network performance, a pre-set application control policy can segment this traffic to a portion of the network, opening up more bandwidth for critical applications, such as third party VoIP or online gaming. Application types can also be set to maintain a constant maximum level of bandwidth. And the level of detailed information provided by DPI means that network deficiencies can be spotted and corrected. As a result, the operator can provide best customer service when trouble arises, limiting complaints or speeding up response times.
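Continuing the sketch, the snippet below shows one hypothetical way such application-control policies could be expressed once traffic has been classified: each traffic class maps to a priority, an optional bandwidth cap and a guaranteed floor. The class names and limits are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class PolicyAction:
    priority: int                  # lower number = scheduled first
    max_kbps: Optional[int]        # bandwidth cap (None = uncapped)
    guaranteed_kbps: int = 0       # floor reserved for this traffic class


# Illustrative policy table keyed by the DPI classification result.
POLICY = {
    "SIP/VoIP":   PolicyAction(priority=0, max_kbps=None, guaranteed_kbps=128),
    "HTTP":       PolicyAction(priority=1, max_kbps=None),
    "BitTorrent": PolicyAction(priority=3, max_kbps=512),    # segment P2P to a slice of the pipe
    "unknown":    PolicyAction(priority=2, max_kbps=2048),
}


def apply_policy(app: str, requested_kbps: int) -> int:
    """Return the bandwidth granted to a flow of class `app` under the policy table."""
    action = POLICY.get(app, POLICY["unknown"])
    granted = requested_kbps if action.max_kbps is None else min(requested_kbps, action.max_kbps)
    return max(granted, action.guaranteed_kbps)


print(apply_policy("BitTorrent", 4000))  # capped at 512 kbps
print(apply_policy("SIP/VoIP", 64))      # lifted to its 128 kbps guaranteed floor
```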
In a PLM context, DPI provides the means to assess which subscribers are utilising which services. Armed with this information, it's possible to begin honing service offerings. And we're not just talking about tiering services into gold, silver and bronze, but providing individual subscribers with tailor-made offerings on a mass scale. Subscriber managed services can be used to target specific groups of subscribers, such as voice-over-IP (VoIP) users, gamers, businesses, high-bandwidth users, casual users, peer-to-peer (P2P) users, P2P-free users and others.
Yet subscriber management can go even further, providing the operator with opportunities to upsell. Thanks to its ability to meter usage, enforce quality of service and generate usage statistics and accounting records, DPI traffic management can enable the operator to deploy joint revenue models with alternate providers or non-provisioned services such as gaming and VoIP applications. Symbiotic business models will begin to emerge. Network operators may provide a well-known service with free carriage and even prioritise the application in order to benefit by association. So we may see small ISPs advertising five star QoS for Skype to attract new subscribers, for example. Alternatively, a start-up may contact a large-scale network operator requesting that its new innovative application be prioritised. In this scenario, the content provider would pay the operator to gain access to a guaranteed large subscriber base.
Using an individual example, the operator might find that Joe Bloggs is a keen user of a third party VoIP service such as Skype or Vonage, which can be sensitive to delay and jitter. Rather than throttling the capacity for this service, which could damage brand perception, the service provider has several options. It can offer the user the option of converting to its own VoIP service, perhaps at a lower rate for the calls to the US that Joe regularly makes. It could suggest he pay an additional £5 per month to prioritise the third party VoIP service he is already using, to ensure its quality. Or the operator could collaborate with the service provider and provide QoS in exchange for royalties from the application. This proactive approach enables the service operator not only to capitalise on a subscriber's existing usage but even to gain revenue from services it doesn't control.
Mapping the connection between individual subscribers and the services they consume creates the ability to mass customise. Through real-time and long-term monitoring and reports on subscriber data, operators can identify and develop the most profitable services or tiered service plans. Services can be developed and brought to market in a matter of days rather than months. Information is sent back to those OSS systems that interface with the Subscriber Management Platform, conveying data usage statistics for instantaneous charging and provisioning.
With the number of services and applications expected to increase exponentially over the next few years, issues over bandwidth consumption and quality of experience (QoE) will arise. The race to the top spot will no longer simply be about access to services. The battle for the consumer will be fought over the quality of content and services. Operators will need to balance QoE for individual users while retaining control of network usage and cost: both require strict service management.
A subscriber-aware network is the answer: a network capable of intelligently identifying its subscribers based on their unique subscriber ID, regardless of the subscriber's dynamic IP or inconsistent connectivity. Operators that can identify individual or groups of subscribers and dynamically apply policies or rules based on usage or other network patterns will be ahead of the crowd. By monitoring trends, the operator who can offer subscriber service options, service level agreements, packages and prices that are constantly refined to meet demand will be able to reap the benefits of rising ARPU and customer satisfaction.

Antoine Guy is Marketing Director, EMEA, Allot Communications, and can be contacted via tel: +33 (4) 9300 1167; e-mail: aguy@allot.com www.allot.com

IMS has gained substantial momentum as the standardised architecture for the convergence of communications service delivery. But many early adopters have taken a network-centric approach to the benefits of IMS deployment rather than taking a holistic view of the strategic opportunities, says Simon Dadswell

IMS is set to be a big numbers game if industry forecasts are correct. Global industry analyst Stratecast estimates in its report Next Generation OSS for IMS and Next Generation Networks – Now! (December 2006) that capital expenditure on IMS infrastructure will rise from US$179 million in 2006 to over US$10 billion in 2010. Informa takes a more conservative stance, forecasting worldwide capital spending on IMS infrastructure to reach US$4.5 billion in the same period, according to its report IMS Opportunities and Challenges: Fixed and wireless market outlook (March 2006). Others believe there will be a delay in market acceptance until the success of early adopters becomes visible, followed by a significant ramp up in 2009 or 2010. Regardless of the accuracy of these forecasts, most industry commentators accept that virtually every major telecom service provider is now including IMS in one form or another in its strategic planning. The industry has for some time been working towards the adoption of ‘Triple Play’, ‘Quadruple Play’ and ‘Fixed-Mobile Convergence’ (FMC) - all of which are enabled by the potential of IMS. IMS helps to standardise protocols to eliminate proprietary technology and enable interoperability for all IP-based services, and provides the opportunity for ubiquitous access for users regardless of service, terminal type or location.
Telecoms service providers are undergoing a period of dynamic change in the provision of services, so they need to have strategies to react to change effectively – and adoption of IMS is a key component. A concern is that current approaches to IMS may end up missing significant opportunities for the future. That is why it’s important to highlight the key characteristics that IMS must possess to secure the right business results for the telecoms sector.
The first consideration is that IMS is usually only linked with the short term issues of deploying new services cost-effectively. Many service providers take the approach that IMS is purely a network engineering issue and their IMS adoption business case is only based on OPEX reduction for network and service management.
The challenge is broader than that. If cost reduction is the basis for the business case then the adoption of IMS and resulting competitive strategies would proceed incrementally at the pace of normal equipment replacement cycles. While network consolidation and saving on operating expenses is an admirable objective for a service provider, new competitive threats from the Internet based world are fast emerging. The battle for the consumer will increasingly be fought over content, services and brand equity, not over access or network capacity. Capacity demand is unlikely to significantly ramp up in the short term and the major ‘New Age’ telcos present a clear and present threat to future revenue as they begin to redefine the direction of telecom and broadband services. For established service providers IMS implementation has to be more about the introduction of new multimedia services, P2P services like video sharing, and innovative service bundling.
The second consideration is that IMS needs to be recognised as a horizontal technology offering potential benefits across the service provider’s entire portfolio, and linking the world of the Network Management department and the IT Operations department for both wireline and wireless services. This adds complexity to the process of creating a compelling business case for the entire organisation and can result in departmental battles between IT and network engineering. If a Network Equipment Provider bundles the key IMS Charging technology with its next generation equipment to enable the network engineering department to retain ownership of IMS mediation projects, then how does this affect organisational strategy? Service providers should consider a pragmatic approach to IMS adoption by exploiting their existing technology investments and gradually replacing legacy network and OSS/BSS elements. In addition, online and offline charging have very distinct business requirements. Service providers should consider the deployment of proven IMS Charging technologies with the flexibility to handle rapidly evolving business models without compromising service capabilities at the expense of the network.
The third consideration is the evolution of triple-play and quad-play services and new revenue generating opportunities. IMS deployment helps service providers integrate discrete fixed and mobile network technology to deliver fixed-mobile convergence (FMC) and reduce time to market for new services. However many service providers need to do more research to determine what sort of services customers want to receive. The concept of convergence is finally becoming a reality but there are several different options to reach this particular goal. It is essential that service providers focus on the revenue generating potential of new services as a key driver to FMC by exploiting IMS as a means to obtain a larger share of the customer’s wallet. Furthermore, the adoption of IP based communications and value added service strategies will open-up new wholesale business opportunities for service providers, including integrated transport solutions, multiple platforms for customised solutions and a wider services portfolio.
Currently Voice-over-IP (VoIP) appears to be the service most frequently targeted for initial IMS deployment and is sometimes enhanced with IMS-related application services like personalisation, presence or special routing. A typical example of these features in combination would be a laptop-housed soft phone application operating as an alternative to a mobile phone. When the user is logged in to the laptop, incoming calls could be routed to it instead of the mobile handset, subject to the subscriber-controlled presence settings, similar to those found on Instant Messaging services. Outbound calls could be made, and rated, as if they originated from the subscriber’s home service location, regardless of the current location of the laptop. The laptop phone’s address book would be automatically synchronised with the mobile handset. In addition, IMS services, as experienced by the user, can be created from a palette of network and application service components including voice, presence, personalisation, content delivery, content sharing and blogging, various forms of messaging (chat, push-to-talk, e-mail, calendaring), synchronisation of subscriber-owned data across devices; and seamless delivery of all these components across traditional network boundaries – the options and complexity ensure that to follow a proprietary network based charging approach could risk a service provider being locked into a very limited set of service options.
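As a rough sketch of the presence-controlled routing behaviour described above - assumed logic for illustration, not a standards-defined algorithm - the snippet below picks a delivery endpoint for an incoming call from the subscriber's registered endpoints, presence state and preferences. All field names are invented.

```python
def route_incoming_call(subscriber: dict) -> str:
    """Pick a delivery endpoint from presence state and subscriber-controlled preferences."""
    presence = subscriber["presence"]       # e.g. {'laptop_softphone': 'online', 'mobile': 'online'}
    prefs = subscriber["preferences"]       # ordered endpoint preference when online

    for endpoint in prefs:
        if presence.get(endpoint) == "online":
            return endpoint
    return "voicemail"                      # nothing registered and online


alice = {
    "presence":    {"laptop_softphone": "online", "mobile": "online"},
    "preferences": ["laptop_softphone", "mobile"],  # prefer the soft phone while logged in
}
print(route_incoming_call(alice))  # -> laptop_softphone
```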
The fourth consideration is one of differentiation. IMS is about offering new opportunities in an open and standards based framework - enabling rapid and cost effective ‘plug and play’ capabilities for multimedia services, regardless of service delivery platform types and end-user devices. However equipment and application vendors are attempting to differentiate services outside that framework and are enforcing their own extension to provisioning and charging standards. For IMS to work properly and deliver to its true potential it is important to ensure interoperability and facilitate the progress of open interfaces and standards in an all-IP world. Differentiation comes from a compelling mix of services, flexible charging options and consistently high quality, multi-channel customer care – not from the deployment of proprietary technology.
Service providers are starting to invest in IMS.  They are taking into account how they can benefit from IMS in the short term to streamline network architectures and to reduce operational costs. While this may help to reduce OPEX the strategic opportunity is to enable new competitive revenue opportunities associated with the growth in multimedia, content and FMC services.
If service providers are to make the shift from network-centric to service-oriented organisations, it is paramount that they invest in highly adaptive infrastructures to exploit the full potential of IMS and next generation BSS/OSS. By so doing they will be able to charge, bill and manage a wider range of complex services – services that will have increasingly shorter lifecycles, intense competition, increasing price pressures and will require back office systems to manage them profitably and efficiently.

Simon Dadswell is Advanced Solutions Marketing Manager, Intec Telecom Systems

Mobile operators need to adopt an approach that allows them to target specific users with relevant advertisements, while, at the same time, managing the overall ad-related user experience, including user privacy and charging aspects, explains Danny Kalish

For many years, advertising has been the driving financial force behind all successful media such as TV, Internet and newspapers. Nowadays, with falling subscriber voice revenues and fierce competition, there is a growing interest in mobile advertising among mobile operators as a potential new revenue source that does not originate from their subscriber base.
The mobile device is potentially the most effective advertising medium ever developed. It is personal, always-on and enables precise targeting of advertising based on contextual information such as location, user profile and usage history. A recent report from Strategy Analytics predicts incredible potential for this market and suggests mobile advertising could account for up to 25 per cent of worldwide Internet advertising expenditure by 2011. Mobile advertising as a revenue stream is rapidly increasing, with the same report predicting that mobile Internet advertising will grow from $1.4 billion this year to $14.4 billion by 2011.
Will mobile advertising be another over-hyped industry segment, or can it actually deliver genuine value to consumers?
Mobile advertising is one of the few effective, user-centric advertising media where user experience is a key success factor that should be maintained across all delivery channels. Operators need to ensure they develop a viable and sustainable business model that does not alienate consumers and damage both their own reputation and that of the brands using the service.
When TV viewers are faced with a poorly targeted advertisement during their favourite programme it is unlikely that it will damage their perception of the TV service provider. However, if mobile users continually receive annoying advertisements on their device, they are likely to place the blame on the operator.
Deciding when and which subscribers should be exposed to advertising is complex.  It depends on many factors, such as the subscriber’s price plan, usage context, opt-in procedures and regulations. An agreement needs to be reached between customers and their service provider to reflect the operator’s obligation to protect users’ privacy in exchange for the subscribers’ consent to being exposed to targeted advertisements. Ensuring customer satisfaction can be a complex process; a negative experience may reflect badly on the service provider and possibly cause the consumer to refuse advertising services.
In addition to guaranteeing the best user experience, mobile operators need to make sure advertising becomes a win-win situation. This can be achieved by selectively choosing which subscribers will receive advertising and serving them with advertisements that genuinely bring them value. Operators should offer subscribers full opt-in and opt-out options, and grant them various financial benefits in exchange for their acceptance of receiving advertisements. After all, the willingness of subscribers to receive ads on their devices is essential.
Targeting the advertisements to each and every individual user is important in maintaining a satisfying experience, but there’s more to it than just receiving a suitable advertisement. The advertisements also need to be adapted to the mobile device and access channel so it will be clearly presented without damaging the user’s experience and interests. Providing intuitive advertising services with interactive actions, such as context-aware ‘click to call’ or ‘click for info’ options, will further support a positive experience.
If an advertisement is non-intrusive, delivers value, protects the subscribers’ privacy, stimulates interest and is relevant to the consumer, there will be a higher propensity for service adoption.
To achieve this, mobile operators need to be 100 per cent clear how many ‘real’ customers they have and ensure the quality of the information held on each user. Names and addresses alone simply aren’t adequate. Operators need to be well acquainted with user interests and preferences to ensure they understand the user in every way possible and can therefore customise every single interaction. Subscriber profile information should be collected from all existing sources (e.g. billing, CRM, etc). Building a dynamic user profile, which is based on the user’s behavioural information, including content consumption and reaction to past campaigns, will further ensure that subscribers will be exposed only to relevant advertisements that offer them real value.
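A minimal sketch of how such a dynamic profile might be built up from behavioural events and matched against advertisement metadata is given below; the categories, weights and simple overlap scoring are assumptions made purely for illustration, and the opt-in check reflects the consent point made earlier.

```python
def update_profile(profile: dict, event: dict) -> dict:
    """Fold one behavioural event (content viewed, ad clicked, etc.) into
    a per-category interest weight on the subscriber's dynamic profile."""
    interests = profile.setdefault("interests", {})
    weight = 2.0 if event["action"] == "clicked_ad" else 1.0
    interests[event["category"]] = interests.get(event["category"], 0.0) + weight
    return profile


def relevance(profile: dict, ad: dict) -> float:
    """Score an ad by how much its target categories overlap the profile,
    returning 0 for subscribers who have not opted in."""
    if not profile.get("opted_in", False):
        return 0.0
    interests = profile.get("interests", {})
    return sum(interests.get(cat, 0.0) for cat in ad["categories"])


profile = {"msisdn": "447900000001", "opted_in": True}
for event in [{"action": "viewed_content", "category": "football"},
              {"action": "clicked_ad",     "category": "football"},
              {"action": "viewed_content", "category": "travel"}]:
    update_profile(profile, event)

ads = [{"name": "match tickets", "categories": ["football"]},
       {"name": "city breaks",   "categories": ["travel"]}]
best = max(ads, key=lambda ad: relevance(profile, ad))
print(best["name"])  # -> match tickets
```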
The current mobile content shopping experience is challenging for teens and young adults, and frequently inadequate when compared to similar services on the PC. However, there is huge potential for advertisers to exploit young mobile customers who are used to being targeted with advertising messages. This segment is more likely to embrace personalised advertisements and awareness campaigns, as long as the content is relevant and they are offered targeted service benefits for viewing these adverts on their mobile devices. If this is done successfully, there are some inherent benefits for the operator, as it is an excellent way to build loyalty and “stickiness”. Service benefits are also an effective way to take the usage of mobile services to the next level by subsidising premium content and lowering the service consumption cost barrier.
For example, the number of users willing to pay for mobile TV services is a small fraction of the total number of users who want to use the service for free.  Subscribers should also be offered benefits for accepting advertisements in the form of discounts or free services. Using this approach, operators may increase their users’ satisfaction and eventually turn mobile advertising into a win-win situation.
In addition to creating a positive user experience and targeted campaigns, operators need to take a mobile-centric, holistic approach. Whether using a partner responsible for media selling and campaign management, or centralising the core service system and integration efforts internally, operators still need a centralised advertising system that allows them to protect their valuable assets and maximise the mobile advertising potential. Only a centralised system that handles the operator-side business logic, communicates with the operator’s IT systems (subscriber data, context information and access channels), and at the same time manages the rich context, user experience and targeting of the advertisements, will be able to achieve this goal.
Mobile advertising presents a huge opportunity for mobile service providers. By leveraging their current assets and positioning themselves as a key player in the advertising value chain, the mobile advertising business opportunity can be maximised while substantially reducing failure risks. To fully exploit its potential for revenue growth, operators need to realise that user experience and flexibility are the key success factors. With a high level of personalisation, targeting capabilities and openness, operators will realise the rich potential of mobile advertising and ensure its success.

Danny Kalish is CTO, Unipier
www.unipier.com

Oren Glanz looks at the future for mobile music, and what operators need to consider to ensure they don’t get left behind

Music fans have always been proud of their collections and where, in the past, they used to compare their massive vinyl collections in metres, gigabytes are now a better means of comparison. Statistics from the International Federation of the Phonographic Industry's 2007 Digital Music Report demonstrate the growth of digital music, showing an estimated $2 billion worth of music being sold online or through mobile phones in 2006, accounting for 10 per cent of music sales. A Strategy Analytics forecast for Western Europe shows that in 2006 there were almost a million active mobile music users, and predicts that by 2010 this figure will have risen to 2 million users - which has made mobile operators take note and place even more faith in the mobile delivery channel.
Usage and interest in mobile music is clearly exploding so how can operators ensure their subscribers become addicted to downloading music?
With so many devices on the market there is the potential for mobile music users to be quick to churn when a service or device doesn't live up to their expectations - very often without contacting their operator.  These customers do not contribute to the operator's top line, and can tarnish the reputation of mobile music amongst their peers and fellow subscribers.  This is where the challenge becomes difficult, as most service providers do not have a clear view of their customers' preferences or of the barriers to adoption they face when trying to use a service.  They do, however, now accept that customers will not make contact to highlight any barriers to service.
It is important to note that even when a subscriber has successfully overcome any barriers to initially adopting a service, there are still issues in turning them from an occasional user into an active user.  It is vital for mobile operators to understand that identifying problems with delivery and usage is only part of the issue; encouraging new users and increasing their consumption is imperative and is key to a successful mobile music service.
Subscribers trying to access mobile music services can encounter various technical, training, value and compatibility obstacles, which can reduce the likelihood of their becoming a habitual user.  These include:
Digital rights management (DRM) - creates major barriers within a music download service, as the mechanism for DRM can vary from the complete separation of the music content from the DRM file to a lock on forwarding tracks.  Another major issue is variance between handset models - certified vs non-certified operator handsets.
Navigation - concerns tracking all aspects of a subscriber's journey through a music download service to identifying valued content.  Experience shows that navigation paths are not always clear; subscribers cannot find the content they are after, causing them to quickly lose interest and abandon a service.
Barriers to usage - also common, relating to both handset and music service usage.  Customer behaviour is very intuitive, with most individuals happy to try new services, but the majority will give up once they encounter a barrier to usage.  The sheer complexity of many new mobile services is often the cause.
Functions such as streaming and downloading can often be confusing.  Subscribers tend to struggle with how to search, locate, select and replay after they have downloaded a track, especially the mass market, non-technical and inexperienced users.
This leads us to far simpler barriers, but ones that are ultimately the largest barriers to subscribers becoming addicted to mobile music on their handset.  Usability, training, interest and other user experience barriers are just as damaging as any of the previously mentioned issues.  Mobile operators need to look at how they can ensure such customer problems are solved before they impact on the user experience.
So with so many barriers to the success of mobile music, why use a mobile phone as a music delivery method and player when compared with other standalone players?  The obvious argument is the need to only carry one device, and the advantage mobile operators hold here is that consumers are already addicted to their devices and other services such as SMS.  But how can operators change this device addiction into a mobile music addiction?
The answer is simple - mobile operators need to improve their subscribers' user experience when using music services, which, in turn, will drive satisfaction.
Understanding users and delivering exceptional customer service is just as important a part of the mobile experience as the latest technology and the size of the marketing budget. It can be the key differentiator for a business.  Too much time and money is invested in getting products to market quickly rather than getting products to market efficiently.  The objective is to provide the best mobile experience for each individual user.
To do this, mobile operators need to build an end-to-end view of their music service from all contributing systems and content.  This is not as easy as it might seem, as although common elements are contained within most music services, many aspects of delivery, billing and content are unique to each operator - which affects subscriber behaviour, usage patterns and other adoption barriers.  The simplest way to achieve an end-to-end view is by approaching the problem with Service Adoption Management (SAM) tools.
SAM treats adoption and usage from a holistic point of view, taking into account all technical and non-technical issues which affect adoption and usage levels.  It provides an operator with a clear insight into the groups enabled to use mobile music and presents the operator with the information to enable them to encourage a user to download songs on their phone.  For example, SAM tools allow an operator to understand which handset a user is using, what they are interested in and at which stage of adoption they are.  If a user can be seen to search for a song but stops, and does not download it, then pricing could be the barrier and the mobile operator can review this.  If a user is only downloading one song a month, then the operator should try to encourage greater usage by offering promotions.
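To illustrate the kind of rule a SAM tool might apply - a hypothetical sketch rather than any vendor's actual logic - the snippet below inspects one subscriber's recent music-service events and flags a likely adoption barrier together with a suggested follow-up. The event types, thresholds and actions are invented.

```python
def detect_barrier(events: list) -> tuple:
    """Very small rule set over a subscriber's recent music-service events."""
    searches  = sum(1 for e in events if e["type"] == "search")
    downloads = sum(1 for e in events if e["type"] == "download")
    drm_fails = sum(1 for e in events if e["type"] == "drm_error")

    if drm_fails > 0:
        return ("DRM/handset compatibility issue", "proactively contact with a settings fix")
    if searches >= 3 and downloads == 0:
        return ("likely price barrier", "offer discounted or bundled tracks")
    if 0 < downloads <= 1:
        return ("occasional user", "send a targeted promotion to grow usage")
    return ("no barrier detected", "no action")


recent_events = [{"type": "search"}, {"type": "search"}, {"type": "search"}]
print(detect_barrier(recent_events))  # -> ('likely price barrier', 'offer discounted or bundled tracks')
```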
By using these principles and analysing subscriber information (such as billing, CRM, customer care and network or platform services) it should also be possible to track other key adoption traits such as social networking, campaign effectiveness and early service launch measurement.
Armed with this highly granular insight into virtually any aspect of usage, and the ability to identify and remedy usage barriers, even those which have never arisen before, managers in charge of mobile music marketing and content can make more informed decisions and better target offerings and campaigns. By providing deep, real-time visibility into subscribers' interactions with mobile music, it is possible to stimulate customer loyalty, increase mobile music usage and generate meaningful profits.
Operators can immediately see how mobile music services perform, how campaigns are working, and the experience of both individual users and larger groups. This detailed real-time information can then be used to make mobile music more effective and attractive, and to proactively contact customers who need training or an upgrade to use services successfully.
Customers enjoying a positive first experience of new services will be more likely to use them again.  If services do not work or perform properly, operators can proactively contact these customers to solve the problem, offer assistance or offer refunds.  Customers treated positively in this way are more likely to use the service again and stay loyal.
Success will be measurable in the increased number of music tracks downloaded, the growth in active users, and reductions in terminated streaming sessions, DRM-related problems and music-related calls to customer care.
The mobile industry needs to be less focused on the technology and should instead proactively explore the real needs of users.  When setting their objectives, service providers should also consider the impact of customer satisfaction on their brand and image.  Providing a higher quality customer experience will also help service providers to differentiate themselves from their competitors, increase loyalty and attract more customers.   
Understanding and improving the user experience, not just for customers who have become addicted users of mobile music, but more importantly for those who never get past their first failed usage attempt, is essential.  SAM tools revolutionise the whole business process of value added services such as mobile music and allow the operator to understand what is actually happening to the user.  By focusing on removing some of the frustration inherent in today's user experience, the road should be clear for mobile operators to succeed in reaching the mass market with mobile music.

Oren Glan is CEO of Olista

Andrea Casini explains how operators can look to grow their existing operations in the face of a tough market environment

It’s common knowledge that the mobile phone market in developed countries is highly competitive, and opportunities for revenue growth are not as clear-cut as they once were. Mobile operators need to grow the potential of their network in order to grow the business and survive, rather than simply rely on finding additional subscribers. Evolution, it seems, is the only way forward. Having accrued huge customer bases, operators are under pressure to keep pricing low in order to keep churn to a minimum.
However, new technologies such as mobile VoIP are poised to change the established income model. To be seen as a mobile phone company alone is no longer good enough. In the consumer market, applications like the mobile Internet and content such as ringtones are big business; services such as mobile TV have yet to take off but are also potential revenue generators.
Meanwhile, enterprises are increasingly offering mobility as an option to their workforce. There is now a plethora of mobile technologies in, or entering, the marketplace - all vying for enterprise business. A mobile operator looking to take a slice of the lucrative enterprise mobility market needs to compete with fixed-line technologies such as VoIP over Wi-Fi or WiMAX. Enterprises that opt for mobile services to support their workforce will expect unprecedented reliability.

A crowded market
We’ve reached a stage of mobile phone ubiquity. There is an expectation, from both leisure and business consumers, that their handset will be able to find a signal wherever they are. Currently, places that are densely populated or experience large spikes in usage, such as city centres or football stadiums, can suffer a drop in service with patchy coverage and call drop out. As a high number of users all vie for a signal in one small area, reception quality will inevitably suffer.
The challenge of delivering consistent service in enclosed public spaces is compounded by the fact that leisure customers tend to use entertainment applications such as mobile TV and gaming when they are sitting indoors, making in-building penetration necessary. Business users will also spend a significant proportion of their time accessing e-mails and transferring data whilst indoors. The boundary between private and business users is becoming increasingly blurred, with both groups expecting similar services and data rates.
Analysys predicts that half of all telephone calls made across Western Europe in 2008 will be made on a mobile phone. A tall order, admittedly, and one that will only be possible if mobile operators evolve to embrace the need for in-building technology.
The very nature of 3G radio signals makes it difficult to penetrate areas such as office blocks and tunnels. In addition, high-rise buildings will require additional capacity and bandwidth due to the higher number of users. Resolving the problem of poor reception in built-up areas and at large events, by embracing in-building wireless coverage and capacity, has become an important service differentiator for wireless carriers.
In-building solutions vary in power, from high-capacity blanket coverage in shopping malls through to limited home capacity boosters. Products such as distributed antenna systems and RF repeaters are designed to guarantee wireless coverage in highly populated areas and provide cost-effective, common infrastructures. This infrastructure will support all of the various standards, while providing a high level of omnipresent coverage to mobile phone users. If there is not a seamless transition to 3G, its adoption could be threatened by alternative fixed-line services.

The future of in-building
In the consumer and enterprise market, femtocells and picocells could, in the future, help with coverage problems at home and at work. These products provide local mobile coverage, backhauling traffic over a LAN or broadband connection, and have a more limited range than large-scale in-building products. Picocells are a higher-power indoor basestation alternative to femtocells.
In the UK, Ofcom recently auctioned off 3.3MHz of the wireless spectrum, equating to a series of twelve low-power licenses allowing both fixed-line and mobile operators to compete for the option to install picocells in buildings and public spaces. Deployment has been slow, as operators appear reluctant to embrace the potential revenue opportunities. However, O2 is due to launch picocells later this year for roughly €100 each.
While femtocells show promise for solving some indoor wireless coverage issues, the technology is still in the early stages of development. Deploying femtocells for individual buildings could, in the long term, reduce operator costs and allow operators to offer smart tariffs. The first femtocells have gone on sale in the UK and are currently being promoted as a way to boost mobile reception at home, but the market is still in its infancy.  
Dual tariffs, enabled by femtocells, will mean users can get the best deal whether using their phone at home or out and about.  New handsets could automatically hand over from a basestation to a femtocell as users come into range at home.  Operators may also offer femtocells as part of a subscriber package.  Vodafone is already trialling the technology, one step on from the current BT Fusion system, which relies on Wi-Fi.  Femtocells could even see mobile operators grab a share of the home phone market by offering blanket coverage.
Femtocells and picocells could effectively remove the need for WLAN and other fixed/wireless offerings. Currently, technical issues such as standardisation and integration with existing networks need to be resolved before these products gain wide acceptance. However, by offering devices that boost coverage, operators could eventually look forward to increased revenue, as their services become more widespread and readily available. This could also prevent churn.
The in-building theory is not new. Andrew Corporation has already deployed many in-building solutions to provide indoor wireless coverage. The company deployed an ION™-B fibre distributed antenna system (DAS) at Dallas/Fort Worth International Airport Terminal D, its parking structure and the nearby Grand Hyatt Hotel to extend wireless coverage to customers. The ION-B utilises head-end equipment that interfaces with multiple, co-located operator base stations.
Providing blanket mobile coverage and a set of competitive tariffs to match will mean mobile operators will finally be in a position to claim glory. Users who can make important calls and access data wherever they need to are less likely to look to another operator. In the UK, O2’s churn for contract customers was 23 per cent for the year ending December 2006. Coupled with a £2 lower average revenue per user (ARPU) over 2006, the financial cost of losing customers and paying to retain them in an increasingly competitive market is high.

Location based services
Growing the potential of a network through in-building is only one side of the story. Location Based Services (LBS) are already offered by some mobile phone networks and are a way to generate significant revenue for operators. From child-finder services to tracking enterprise equipment and location-targeted advertising pushes, there is money to be made from LBS.
In the UK, Ofcom has stipulated that by early 2008, any VoIP service allowing users to make calls to ordinary phone numbers must also be able to make calls to the emergency services. Research by the communications watchdog has revealed that 78 per cent of VoIP users who cannot use their service to call 999 either thought they could, or did not know whether they could. This development means that geo-location will be provided on a “push” basis (where information is sent to the user without them explicitly requesting it) and it will be done at the operators’ expense.
The cost of forced geo-location can be offset in the long term with other LBS options, however. Currently, the solution is primarily used as a way to send custom advertising and other information to mobile phone subscribers based on their current location. Major operators, focusing on recouping the cost of 3G licenses and falling voice revenues, have yet to exploit the full potential of LBS. Yet this is set to change with a wave of new lower cost devices, including the Nokia N95, that offer LBS capabilities. In fact, ABI Research predicts that by 2011, the total population of GPS-enabled location-based services subscribers will reach 315 million, up from 12 million in 2006. This represents a rise from less than 0.5 per cent of total wireless subscribers today to more than 9 per cent worldwide at the end of the study's five-year forecast period.
The business case, as demonstrated by Nextel in the US, which currently has a 53 per cent share of the LBS market, is there.  Operators, with an influx of hardware, should look at LBS in a new light.  It has been over-hyped previously, but it is another way to generate revenue with a relatively low outlay.

The growth potential
The mobile phone market is changing. Operators have succeeded in establishing huge customer bases, to the point where phones now outnumber people. Yet, in the mature markets of Europe and the US, there is a need to look beyond voice and data services in order to prevent churn and drive revenue.
In-building, in its simplicity, overcomes one of the most common reasons for a customer changing operator: lack of signal. Meanwhile, the market for LBS and geo-location has been over-hyped, but with an influx of new devices and increasing operator confidence it will be possible to overcome cynicism and make a positive business case to consumers and enterprises alike.
After all, we are in an age of convenience, and in-building solutions and LBS are about to make everything that little bit easier.

Environmental and radiation issues
Electromagnetic compatibility and the health hazards of exposure to non-ionising radiation have always been hot topics, and recently became a burning issue following an article about interference from mobile phones to medical equipment in hospitals.
As every RF engineer knows very well - unlike many generalist reporters and the general public - 2G and 3G mobile terminals are power-controlled by the network, so as to ensure that networks function properly, interference is minimised, and battery life is prolonged for more air-time and greater subscriber convenience.
Power control operates over a range of more than 30dB (a factor of 1,000) for GSM, and considerably more (around 70dB, or a factor of 10 million) for UMTS terminals. This means higher transmit power is requested on the uplink (up to 1 or 2 Watts) when radio coverage and visibility are poor - high path loss or high penetration loss, for example inside buildings served from outdoor cells - while very low power (down to 1 mW in GSM, or much less for UMTS) is needed with good coverage and antenna visibility, i.e. with an efficient in-building system.
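To make the arithmetic behind these figures concrete, the short Python sketch below (illustrative only; the dB ranges and example power levels are simply those quoted above) converts a power-control dynamic range in dB to a linear factor, and a transmit power in dBm to milliwatts:

def db_to_ratio(db):
    # Convert a level difference in dB to a linear power ratio
    return 10 ** (db / 10)

def dbm_to_mw(dbm):
    # Convert a power level in dBm to milliwatts (0 dBm = 1 mW)
    return 10 ** (dbm / 10)

# Uplink power-control dynamic ranges quoted in the text
print(db_to_ratio(30))   # 1000.0      -> GSM: a factor of one thousand
print(db_to_ratio(70))   # 10000000.0  -> UMTS: a factor of ten million

# Example transmit powers: roughly 2 W (33 dBm) at the coverage edge
# versus 1 mW (0 dBm) under good in-building coverage
print(dbm_to_mw(33))     # ~1995 mW, i.e. roughly 2 W
print(dbm_to_mw(0))      # 1.0 mW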
Moreover, specific parameters can be set at certain locations so that uplink power never exceeds a pre-set value, ensuring it cannot cause uncontrolled interference to the operator's own network or to other services.
Furthermore, a well-designed in-building system uses several distributed low-power antennas (a DAS), with powers in the milliwatt range and radiation levels well below the most stringent EU country-specific limits (0.5 V/m).
Actual results of implementing in-building coverage show local RF power density decreasing by three orders of magnitude, with previous interference issues effectively eliminated.
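As a rough illustration of what these figures mean (a sketch only, assuming far-field plane-wave conditions where power density S = E^2 / 377 and free-space spreading; the 0.5 V/m limit is taken from the text, while the handset powers and one-metre distance are hypothetical examples), the following converts a field-strength limit to a power density and shows how a 30dB drop in transmit power corresponds to three orders of magnitude lower local power density:

import math

FREE_SPACE_IMPEDANCE = 377.0  # ohms, far-field plane-wave approximation

def field_to_power_density(e_v_per_m):
    # Power density (W/m^2) of a plane wave with RMS field strength E: S = E^2 / Z0
    return e_v_per_m ** 2 / FREE_SPACE_IMPEDANCE

def power_density_at(p_tx_w, distance_m):
    # Free-space power density (W/m^2) at a distance from an isotropic radiator: S = P / (4*pi*r^2)
    return p_tx_w / (4 * math.pi * distance_m ** 2)

# The 0.5 V/m limit quoted above corresponds to roughly 0.66 mW/m^2
print(field_to_power_density(0.5) * 1e3)   # ~0.66 mW/m^2

# Hypothetical example: a handset transmitting 1 W one metre away, versus 1 mW
# under efficient in-building coverage - a drop of three orders of magnitude
print(power_density_at(1.0, 1.0))          # ~0.0796 W/m^2
print(power_density_at(0.001, 1.0))        # ~0.0000796 W/m^2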
In-building coverage is therefore the only solution that ensures, on top of seamless communication and proper service for the network and its subscribers, the highest degree of EM and radiation compatibility - both with delicate equipment such as navigation instruments in aircraft or medical devices in hospitals, and with human beings.

Andrea Casini is VP EMEA Sales & Marketing,
Andrew Corporation

    
