YOC was tasked with creating a mobile campaign to promote the launch of Kraft Foods' new instant coffee, Jacobs 3in1/2in1. It was to be integrated with traditional media to provide consumers with an uncomplicated way to order samples via mobile, whilst minimising sampling wastage levels and associated costs through accurate targeting and tester self-selection.

The main objective of the mobile campaign was to place product samples amongst early adopters, who are considered to be innovation savvy and opinion leaders. It was also important that the campaign maximised the reach of the target group and increased conversion rate of sample requests, while at the same time decreasing product distribution costs. Measurement needed to be simple and transparent. Kraft also wanted to gather data to develop a customer database.

YOC created a mobile advertising and sampling campaign, integrating a mobile call to action with online and traditional TV and print media. Shortcodes and keywords were promoted across traditional print and TV ads, inviting consumers to send an SMS to the campaign shortcode. Consumers could directly respond and request a product sample by sending a one word text message. Participants were then sent a WAP push link to the mobile sampling portal where users could enter personal details to receive a sample. Alongside traditional media promotion, banner ads for the sampling campaign were placed on the Vodafone portal and on the Nokia, Sat1, Pro7 Mobil, MTV, Viva and YOC.mobi sites. 

Following MMA guidelines, the campaign placed choice and control in the hands of the consumer. Promoting the mobile campaign through traditional media gave users an opportunity to interact with the brand and request a sample if they were interested in the product.

The ‘3in1 / 2in1' campaign enabled almost 500,000 samples to be placed directly amongst the target group. Over 450,000 customers registered with Jacobs during the promotion, making the mobile sampling campaign one of the largest and most successful ever. More than 80,000 users registered their details to be used for future permission based marketing. 

Some 0.4 per cent of users who saw the television advert ordered a product sample via mobile. The campaign saw high responses from users of mobile portals who were led directly from mobile banner advertising to the mobile registration portal. Almost 650 banners were placed on selected portals relevant to the target group and these achieved a click-through rate of more than 3 per cent. 250,000 text messages were sent to the opted-in YOC Community members who were part of the defined target group. Thanks to the detailed profiling and selection of community members, 10.6 per cent responded and YOC distributed more than 26,000 samples to these respondents.

NetCracker / TELUS
Telus, the largest telco in western Canada and the country's second largest overall, underwent a five-year program to transform its back office systems and OSS processes. The company focused on retiring its old systems and constructing a new, integrated approach to customer and service management with new systems and software as replacements.
Telus merged its mobile and fixed arms in 2005 to reduce costs and, eventually, to provide fixed/mobile converged services. Telus wanted to roll out triple play services (voice, DSL broadband and IPTV) aimed at the residential market. To cope with the dramatic rise of new and converging services it was apparent that the operator needed a more sophisticated OSS system. And, as many telcos have discovered, without efficient provisioning and assurance it can be difficult to deploy and support complex but low-margin services profitably. 

Telus decided it was more economical to go for outright replacement of its OSS environment, since integration and optimisation would be more expensive.

Part of that bold approach was to sign a deal with OSS specialist NetCracker Technology to supply and support its inventory and fulfilment systems. The new technology replaced Telus' legacy systems and integrated with the provider's service activation, network management and billing applications.

For managing legacy and next generation network resources, NetCracker provides resource inventory with outside plant, discovery and reconciliation, and design and planning modules. Its modules include asset management, which focuses on resources and equipment lifecycle management, and order management, for receiving and processing of customer orders.
Telus uses NetCracker software in both its enterprise and mass market businesses as it introduces new, advanced services:  advanced Layer 3 services such as MPLS and VPLS on the enterprise side, and triple play on the mass market side.  The NetCracker architecture involves one central database handling everything with different processing ‘instances' developed for different services.

Instead of installing technology first and designing processes to control it, Telus has taken a process-led approach by developing a forward-looking set of processes and bending the technology to work with the processes, rather than the other way around.
The key was to focus on their inventory systems first so they had a single view of the customer for a growing list of products from the service representation right down to the physical network. 

Ultimately, the development of the service layer through its OSS transformation has helped the company migrate to its next generation network.

Gavitec AG / Spanair
Spanair wanted to find a cost effective and efficient means to expedite its passengers' experience when travelling. They turned to Mobile Marketing Association member Gavitec AG, a subsidiary of NeoMedia Technologies, to help them to create a completely paper-free ticketing process via a system of mobile boarding passes that passengers could access from their mobile handsets during check-in and boarding.

The campaign set out to achieve two key objectives. The first was to reduce costs for check-in and boarding procedures. The second was to expand and streamline Spanair's customer service by providing customers with a flexible, convenient and innovative way to check-in for their flights.

Taking into account the day-to-day convenience that mobile provides to users, Gavitec sought to create an entirely paperless ticketing service. The service was enabled by EXIO scanners and followed the IATA publication of a mobile boarding standard based on 2D barcodes. Instead of a paper boarding pass, passengers received a 2D barcode directly to their handset via text message (MMS, EMS or SMS). The message then functioned as both a ticket and a boarding pass. Prior to travel, the 2D barcode was scanned at the airport check-in desks and security points and validated via an online database, reducing waiting time for passengers.
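As a rough illustration of the validation step described above, the sketch below models an issue-and-scan cycle: a token that would be encoded in the 2D barcode is looked up in an online database and accepted exactly once. All names and the token scheme here are invented for illustration; the real Gavitec/Spanair system follows the IATA standard and its own back-end.

```python
import hashlib

# Hypothetical in-memory stand-in for the airline's online validation database.
issued_passes = {}

def issue_mobile_pass(passenger: str, flight: str, seat: str) -> str:
    """Issue a boarding token of the kind that would be encoded into the
    2D barcode sent to the passenger's handset."""
    payload = f"{passenger}|{flight}|{seat}"
    token = hashlib.sha256(payload.encode()).hexdigest()[:16]
    issued_passes[token] = {"passenger": passenger, "flight": flight,
                            "seat": seat, "used": False}
    return token

def validate_at_gate(token: str) -> bool:
    """Simulate the scanner look-up: accept each issued token exactly once,
    so a re-scanned or forged barcode is rejected."""
    record = issued_passes.get(token)
    if record is None or record["used"]:
        return False
    record["used"] = True
    return True
```

The single-use flag is what lets one text message serve as both ticket and boarding pass without the risk of the same barcode being presented twice.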

Spanair is the first airline in Spain to offer its passengers a mobile check-in facility. The mobile system implemented by Gavitec reduced distribution and operational costs and positioned the airline as an environmentally friendly carrier. The system increased customer satisfaction, allowing passengers to reclaim time that would usually be spent waiting around by shortening check-in queues and providing a convenient and efficient means to board their flights. The mobile ticketing system also allowed Spanair to improve its customer relationship management methods and develop a number of customer-centric strategies through data collection on passenger flying habits, providing a new channel for one-to-one direct marketing.

Using the EXIO scanners, the system is now running in eleven Spanish airports, initiating a service that could save the airline industry up to $500 million each year. With the widespread installation of the streamlined and convenient service across Spanish airports, Spanair estimates that some 10 per cent, or 800,000, of its passengers will make use of the mobile boarding pass facility during 2009.

Has the pace of change in communications outstripped the ability of organisations'   business management processes to keep up? Michael Coppack takes a look

The telecommunications industry, particularly on the cellular side of the fence, prides itself on being at the cutting edge of the latest technological innovations. Fortunately for the cellcos, this perception is not only held among the industry's practitioners, but also by the end users, the subscribers.

In some respects, these beliefs are well deserved. The onset of the information age has changed the way businesses and individuals operate and communicate out of all recognition in the space of a single generation. Not since the Industrial Revolution has the world changed so rapidly.

The pace of change in the communications world has been so fast and absolute that it all too often masks some fundamental problems, particularly when it comes to business management processes. When the ‘old' world of business collides with the ‘new', a certain amount of upheaval is inevitable.

Business leaders are right to be wary of new technologies that promise the earth and deliver nothing. For every solution that makes a difference, there are a whole host of others that fall short of the mark. For all its advancements, the communications industry bubble burst in the most spectacular of fashions at the turn of this century, and attention became focused on steady, real business practices rather than over-inflated stock market valuations.

Fast forward less than a decade and we once again find ourselves in an economic climate that, following some catastrophic mismanagement, advocates enterprise wide risk management with safe and efficient business practices. This time around, though, the communications industry finds itself at an advantage. Communications companies are not completely immune to the effects of the credit crunch; however, they're very well placed to ride out the crash. The comms industry is growing, globally, and will surely come out fitter and stronger on the other side of the current economic malaise.

The finger of blame for the most recent crash is being pointed fairly and squarely at the world of finance. Yet the world of finance could well end up supplying a solution that the communications industry will use to iron out some of the hidden inefficiencies from which it suffers. Ironically, these operational inefficiencies lie at the heart of one of the oldest, steadiest of departments: accounts.

It would be fair comment to say that accountancy suffers from certain unfortunate image problems. Bookkeeping is not generally perceived as an exciting discipline. Indeed, account reconciliation is a laborious, monotonous, process that requires meticulous attention to detail. It is precisely the type of process that is ripe for automation.

Yet account reconciliation retains a relatively low-tech reputation. Accounting within complex organisations, such as network carriers, is often carried out using completely different methods across the wide range of business units in place. Growth through acquisition, mergers and de-mergers, and the introduction of new technologies and new pricing models have tied the supply chains and accounting departments of communications firms in bureaucratic, fragmented, knots.

It seems odd, insulting almost, that an industry that is actively promoting unified communications and convergence technologies has not embraced automation within its own divisions.

Even at firms that embraced business process management solutions, such as those supplied by Oracle and SAP, account reconciliation is more often than not carried out by large teams of staff putting data into spreadsheets using semi-manual or completely manual processes. In today's tough economy it is essential for organisations to take control of their cash management and financial accounts to increase financial control and reduce operational risk.

The good news for shareholders and customers alike is that communications firms are increasingly recognising that manual financial transaction processing does not deliver the control and efficiency required to alleviate the challenges of balance sheet reconciliation. In many communications companies, the responsibility of account reconciliation falls to the individual departments who frequently use their own manual method to reconcile their sub and general ledger accounts.

At many firms, account reconciliation sees spreadsheets being populated manually with cash, cheque and credit card transaction data; this is then matched by an individual against bank statements. A typical organisation will have teams of employees permanently checking and re-checking accounts.

Generally speaking, manual processes provide little to no visibility, and producing an audit trail and tracking transactions can be a time-consuming, onerous and complex task that is highly prone to errors. Resolving discrepancies also requires lengthy searches before business decisions can be made. This batchwise approach to account reconciliation is a major source of irritation and cost at the majority of communications firms.
A small number of vendors exist in the market that can provide automated account reconciliation. That number diminishes further when looking for a supplier of proven software to large, complex, multinationals.

Financial directors are under an enormous amount of pressure, particularly in the current economic climate. And most of that pressure is being applied with the aim of cutting costs. Making an investment in technology requires an almost immediate, measurable, return.
There are seven basic areas that financial directors should consider when looking into automated account reconciliation solutions.
Does the solution:

  • Enhance control?
  • Improve timeliness?
  • Assure accuracy?
  • Reduce cost and risk?
  • Prove easy to use?
  • Prove easy to integrate? and
  • Port across multiple processes?

Users need to be in total control of the reconciliation process from input to output. They must have real-time visibility into the status of the reconciliation process, giving the assurance that appropriate and effective controls are in place and used to best effect.

Manual reconciliation takes time. Effective systems must save considerable time and enable users to feel confident that the reconciliation is 100 per cent correct. The time saving alone makes automated account reconciliation solutions a sound investment, enabling users to focus on more useful and rewarding tasks.

With manual reconciliation it is easy to make mistakes. A reconciliation system needs to provide controls to help prevent common errors, and functions to make it easy to rectify them if they are made.

Automated account reconciliation solutions will often provide a full return on investment within the first year of implementation. As a result of the automatic matching of the vast majority of transactions, time is freed up to deal with exception management and risk reduction. Some leading systems will often automate up to 95 per cent of the transactions in the reconciliation process.
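To make the automatic matching step concrete, here is a minimal sketch of the kind of rule-based matcher such systems apply: pair ledger entries with bank-statement lines on amount plus reference, fall back to amount plus a nearby date, and surface whatever remains as exceptions for a human. The rules and field names are illustrative assumptions, not any vendor's actual algorithm.

```python
from datetime import date

def auto_reconcile(ledger, bank, date_tolerance_days=2):
    """Match ledger entries to bank-statement lines on amount and reference,
    falling back to amount plus a nearby date; anything left over becomes
    an exception for manual investigation."""
    matched = []
    unmatched_ledger = list(ledger)
    remaining_bank = list(bank)
    for entry in ledger:
        for line in remaining_bank:
            same_amount = entry["amount"] == line["amount"]
            same_ref = entry.get("ref") and entry["ref"] == line.get("ref")
            close_date = abs((entry["date"] - line["date"]).days) <= date_tolerance_days
            if same_amount and (same_ref or close_date):
                matched.append((entry, line))
                unmatched_ledger.remove(entry)
                remaining_bank.remove(line)
                break
    return matched, unmatched_ledger, remaining_bank
```

In practice the automated pass clears the bulk of transactions, which is how staff time gets redirected to the exceptions rather than the matches.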

The technology needs to be easy to use, preferably developed by accountants, for accountants. A good rule of thumb suggests that a system should be so easy to use that it requires less than one day of training for effective use.

The technology also needs to be flexible enough to handle input files from all banks and financial systems that can export account information in a file. This should be a standard module in almost all systems. The implementation and configuration process should take a minimum of time.

Finally, if you're looking to invest in an automatic account reconciliation solution you should look for a proven supplier that offers solutions across a range of markets and industries, that is not only used for bank reconciliation, but is also used for reconciliations of technical systems, internal account reconciliation, and for the reconciliation of inter-company accounts.
Firms need total control of their reconciliation process from input to output. They need to have real-time visibility into the status of the reconciliation process, giving the assurance that appropriate and effective controls are in place and used to best effect. With these systems in place, communications firms can iron out hidden operational inefficiencies and play a leading role taking the world out of recession.

Michael Coppack is Managing Director UK, Adra Match

The BSS Summit offers a comprehensive conference with an operator-only speaker panel, interactive tutorials and discussion sessions, networking events and exhibition of BSS suppliers

The BSS Summit, which runs on June 8th-11th 2009 in Amsterdam is billed as the event that will help telco businesses optimise their BSS infrastructure to drive efficiency and sustain revenues. 

Through a mixture of executive keynotes, service provider case studies, real time interactive polling, panel discussions and expert masterclasses, the BSS Summit aims to address a range of thorny questions, including: will your current BSS strategy allow you to protect your existing revenue streams while exploiting new opportunities; optimise ROI from your existing billing infrastructure; effectively manage the customer experience and develop the right business model to develop and maintain your position as a market leader? Also, are you prepared to do more with less? How will you manage your business processes to maintain the flexibility, efficiency and effectiveness of your BSS platform during this period of economic uncertainty?

The intention, according to organisers IIR Telecoms, is to help delegates meet the challenges of delivering world-class BSS performance within tighter budgets and a highly competitive market environment, advising on the best approaches to:

  • Design, plan and prepare for BSS projects more intelligently and realistically
  • Deliver BSS projects within restricted cost and time frameworks, supporting organisation-wide efforts towards greater efficiency and leaner use of resources
  • Demonstrate practical, visible improvements within shorter timeframes

Details: www.bsssummit.com

Some telecom CEOs seem to believe that in recession ‘branding' is less important than  delivering on fundamentals.  However, while functional issues are important, brand image is increasingly becoming a key success factor both internally and externally.  In fact, mobile telephony has become one of the great battlegrounds of consumer branding.

The sector is complex and becoming more so by the day.  There is a choice of network standards (GSM or CDMA), distributors (Carphone Warehouse or Phones4U), network operators (Vodafone or Orange), tariffs (Dolphin or Passport), devices (Blackberry or Sony Ericsson), applications (mobile money transfer or mobile Internet) and content (Sky or Bloomberg). 

Consumers are confused.  The speed of change, technology convergence and a complex plethora of products and services are all making the telecoms market more difficult for consumers to navigate. 

This is exactly why strong brands are so important.  Brands were pioneered in the US fast moving consumer and durable goods markets to help consumers make decisions more quickly and simply.  Brands simplify choice.

They are now doing just that in the telecoms sector and some brands are becoming increasingly dominant.

Vodafone is currently the most valuable and strongest brand in the world, boasting a AAA BrandBeta Rating and a value of $24.6 billion.  This reflects the fact that Vodafone is now present in over 60 countries worldwide with over 250 million subscribers (Source: BrandFinance).

When Vodafone went on its M&A spree in 2000 the strategy was based on creating the first truly global mobile brand.  As over 40 of the countries where Vodafone now operates are licensed ‘partner' markets rather than owned subsidiaries, it becomes clear how powerful the brand has become.  Behind its mantra of being ‘Red, Rock Solid and Restless' Vodafone has built an internal brand culture which has been communicated externally.  Football and F1 sponsorships, heavy advertising, exciting product PR and the ubiquitous red trade dress have lifted spontaneous consumer awareness from 10 per cent to 90 per cent globally. 
Strong brand positioning, awareness and emotion make new market entries easier.  Look at the hugely successful re-branding of Hutch in India, no mean feat considering what a great brand Hutch was before the transition.  Strong branding also enhances acquisition rates, lowers lapse rates and increases price premium and ARPU levels.

There is no doubt that Vodafone is rapidly reaching the dominant position of a VISA or Coca Cola.  This creates a huge advantage and puts other operators on the back foot.  In response local brands are smartening up their act and some are trying to emulate Vodafone's success.  Look at Zain and Etisalat in the Middle East or O2 and Orange in Europe.

The Blackberry brand is another example of power branding.  Research In Motion (RIM) has boomed on the back of its technology and Apple has cleaned up with the iPhone.  In both cases it is debatable whether they would have become so popular, with businessmen and consumers respectively, if they had not developed such strong brands to complement their technology.  Meanwhile Nokia continues to dominate the mainstream handset market and commands a AAA- BrandBeta Rating and a brand value of $19.9 billion (Source: BrandFinance).
The reason brands become so valuable is because when they are cleverly devised, well managed and consistently invested in, they secure demand and leverage all the other intangible assets of a business, increasing the life of patents and technology.  They become reservoirs of value or counter balances that maintain momentum even in tough times.
But brands need strong leadership, a clear point of view and consistency. Take Virgin.  In 1968, Richard Branson, the idealistic college dropout, developed an enduring brand promise. In his words:

"The Virgin brand promise is based on five key factors: value for money, quality, reliability, innovation and an indefinable, but nonetheless palpable, sense of fun.

"At Virgin, we know what the brand name means, and when we put our brand name on something, we're making a promise. It's a promise we've always kept and always will."
Virgin has a strong ‘positioning', allowing it to excel as an MVNO, amassing customers and fortunes along the way.

But telecoms CEOs can build brands too. Hans Snook, Peter Erskine, Jorma Ollila and Arun Sarin all followed the Branson lead.  Every telecom CEO needs to embrace branding to get through the recession in good shape.

David Haigh is Founder and CEO of Brand Finance.
Brand Finance produces the BrandFinance500 annual survey of the world's strongest global brands. 

Ray Adensamer argues that Voice Quality Enhancement can help deliver the standard  required of VOIP conferencing systems

Audio conferencing services based on circuit switched networks and audio bridging equipment have provided hosted conferencing users with a benchmark for pricing and quality in voice communications. While next generation networks based on VoIP technology introduce economic benefits with new feature capabilities for conferencing service providers (CSPs), they also present new technical challenges in maintaining acceptable voice quality. Delivering good voice quality is an important requirement in any VoIP conferencing system, as poor voice quality will increase the costs associated with customer churn, while impacting the bottom line by reducing revenue growth prospects.

Voice Quality Enhancement (VQE) encompasses an integrated set of features designed to overcome common audio quality problems in VoIP conferencing services, including noise, packet loss and echo. A comprehensive VQE solution also measures VoIP quality metrics, which are used in ongoing voice quality measurement associated with service level agreements.

Many features inherent in a VQE solution require sophisticated digital signal processing algorithms. The rapid, scalable execution of these algorithms dictates a product specifically designed for real-time IP packet processing. Fortunately, in a next-generation VoIP audio conferencing architecture, a network element already exists with carrier-class real-time IP packet processing power. And that network element is the IP media server.

The three most common sources of VoIP audio quality problems in a VoIP or IMS network are noise, packet loss and echo. This section discusses each of these VoIP audio quality challenges and describes the conceptual solutions to overcome quality problems.

Audio noise
Gone are the days when people were confined to quiet office and residential environments. Today, with mobile phones and the Internet, people are calling from their cars, airports and from just about anywhere, and these environments are flooding the mouthpiece with all kinds of unwanted sounds that ultimately get onto the call. Making matters worse, callers using laptops and mobile phones are typically saddled with marginal equipment such as low cost earphones and microphones.

This section describes a combination of mechanisms that reduce and help manage the disturbing effect of audio noise: noise gating, noisy line detection and noise reduction.

Noise gating
Noise gating is a simple yet effective mechanism to reduce background noise.
When no speech is detected on a line, its signal is attenuated (ie its level is reduced), which prevents unnecessary noise from being inserted into a VoIP recording or conference mix. Noise attenuation is configurable, so the conferencing application can avoid making the signal unnaturally quiet when the noise gate is applied to an audio signal.
Key benefits of noise gating:

  • Reduces background noise using a simple yet effective mechanism
  • Supports configurable attenuation
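The gating logic above can be sketched in a few lines: frames whose energy falls below a speech threshold are attenuated by a configurable factor rather than hard-muted. The threshold and attenuation values are illustrative assumptions, not figures from any particular product.

```python
import math

def frame_rms(frame):
    """Root-mean-square energy of one audio frame (a list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def noise_gate(frames, speech_threshold=0.02, attenuation=0.2):
    """Attenuate frames classified as noise (energy below the speech
    threshold). The attenuation is configurable and never a full mute,
    so the gated line does not sound unnaturally quiet."""
    out = []
    for frame in frames:
        gain = 1.0 if frame_rms(frame) >= speech_threshold else attenuation
        out.append([s * gain for s in frame])
    return out
```

Speech frames pass through at unity gain; only the quiet, noise-only frames are pulled down before they reach the conference mix.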

Noisy line detection
There are times on a conference call when some lines are very noisy and disrupt the productivity of the entire call. Noisy line detection measures the noise on audio ports and sends a noisy line notification message to the VoIP application server if a predefined threshold is exceeded, as shown in Figure 2. A second message is sent if the noise subsequently falls below the threshold.

Key benefits of noisy line detection:

  • Notifies the application server of noisy line conditions, initiating possible corrective action
  • Enables quick remedial action by the application server or the operator (eg mute line)
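The notification behaviour described above, one message when noise crosses above the threshold and a second when it falls back below, amounts to simple edge detection on the measured noise level. This is an illustrative sketch; the threshold value and event names are assumptions.

```python
def monitor_line(noise_levels, threshold=-45.0):
    """Scan a sequence of per-interval noise measurements (eg in dBm0)
    and emit one 'noisy' event when the level first exceeds the
    threshold, and one 'clear' event when it falls back below it."""
    events = []
    noisy = False
    for i, level in enumerate(noise_levels):
        if not noisy and level > threshold:
            events.append((i, "noisy"))   # would notify the application server
            noisy = True
        elif noisy and level <= threshold:
            events.append((i, "clear"))   # second message: line recovered
            noisy = False
    return events
```

Because only crossings generate messages, the application server is not flooded with notifications while a line remains noisy.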

Noise reduction
While a noise gating function described earlier provides a relatively simple solution to eliminating noise when no speech is detected, noise reduction goes a step further by using digital processing techniques to remove the noise and leave the important speech signal intact. This provides benefits in many VoIP applications, such as removing noise from VoIP audio recordings or noisy caller lines in a conference mix.
Key benefits of noise reduction:

  • Filters out noise without impacting the speaker's signal
  • Reduces noise continuously, whether speech is detected or not
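One common family of digital processing techniques for this is spectral subtraction: estimate the noise's magnitude spectrum during silence, then subtract it from each frame's spectrum, leaving the speech largely intact. The sketch below shows just the subtraction step on plain magnitude lists, with a spectral floor to limit the 'musical noise' artefacts real implementations guard against; it is a conceptual illustration, not the algorithm any specific VQE product uses.

```python
def spectral_subtract(frame_mag, noise_mag, floor=0.05):
    """One step of spectral subtraction: remove an estimated noise
    magnitude spectrum from a frame's magnitude spectrum, clamping each
    band to a small fraction of its original value so the result never
    goes negative or fully silent."""
    return [max(m - n, floor * m) for m, n in zip(frame_mag, noise_mag)]
```

Unlike the noise gate, this runs on every frame, which is why noise reduction works whether or not speech is present.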

Dropped packets
The Internet is an amazing network of interconnected computers, but it's not perfect. The network employs the IP protocol, which does not guarantee packet delivery. Hence, when IP networks get busy or congested, packets can get lost or delayed. While lost packets are not critical for many data applications, packet loss in real-time VoIP services can cause significant audio quality problems. Without special technology to compensate for dropped packets, the result is an abnormal audio signal that might sound ‘choppy.'

Packet loss concealment
Packet loss concealment is a technique for replacing audio from lost or unacceptably delayed packets with a prediction based on previously received audio.
Whereas any voice repair technology would have difficulty recovering from extreme packet loss in abnormal conditions, packet loss concealment is designed to perform intelligent restoration of lost or delayed packets for a large majority of congested network scenarios.
Key benefits of packet loss concealment:

  • Softens any breaks in the voice signal
  • Reduces the occurrences of choppy audio
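A very simple form of the prediction described above repeats the last good frame with decaying gain, so a short gap fades gently instead of producing a hard break. Production concealment algorithms (eg pitch-based waveform substitution) are far more sophisticated; this sketch and its decay factor are illustrative only.

```python
def conceal_losses(frames, decay=0.5):
    """Fill gaps in a stream of audio frames. A lost frame (None) is
    replaced with a progressively attenuated copy of the last good frame;
    each received frame resets the gain to unity."""
    out, last, gain = [], None, 1.0
    for frame in frames:
        if frame is not None:
            out.append(frame)
            last, gain = frame, 1.0
        elif last is not None:
            gain *= decay                      # fade during a run of losses
            out.append([s * gain for s in last])
        else:
            out.append([0.0])                  # nothing received yet: silence
    return out
```

The decaying repeat is what "softens" breaks: a one- or two-packet loss is barely audible, while a long outage fades to silence rather than looping the same sound.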

Acoustic echo
An acoustic echo is created when sound emanating from the receiver's speaker (eg handset or speakerphone) is transmitted back by the receiver's microphone. This is depicted in Figure 3, where the Sender (on the left) transmits a signal to the Receiver, and an acoustic echo is created when some speech energy ‘bounces back.' In a VoIP conferencing application, all participants will hear an echo except for the guilty party with the device causing the echo. Since nobody can quickly answer the basic question, "Who's causing the echo?" troubleshooting echo issues in a VoIP conference call can be difficult and frustrating.

Acoustic echo cancellation
Acoustic echo cancellation (AEC) technology is designed to detect and remove the sender's transmit (Tx) audio signal that bounces back through the receive (Rx) path. By removing the echo from the signal, overall speech intelligibility and voice quality are improved.

AEC in a VoIP network is particularly challenging. In a traditional voice network, once a voice circuit is established through the PSTN, the round-trip echo delay is constant. However, in a VoIP network, packet delay is a variable, hence the echo delay is also a variable for the duration of the call, which makes the echo cancellation algorithms in any VoIP quality improvement product more complex and processor-intensive than an equivalent echo cancellation solution in a circuit-switched network.
Key benefits of acoustic echo cancellation:

  • Removes a sender's audio echo from the receive path
  • Addresses variable packet delay inherent in IP networks
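Echo cancellers are typically built around an adaptive filter that models the echo path from the Tx signal and subtracts its prediction from the Rx signal. The sketch below uses normalised LMS (NLMS), a standard adaptive filtering algorithm, as a stand-in for the proprietary algorithms real AEC products use; the tap count and step size are illustrative.

```python
def nlms_echo_cancel(tx, rx, taps=8, mu=0.5, eps=1e-6):
    """Normalised LMS adaptive filter: learn the echo path from the
    sender's Tx samples, predict the echo present in each Rx sample and
    subtract it, returning the echo-free residual signal."""
    w = [0.0] * taps          # adaptive model of the echo path
    history = [0.0] * taps    # most recent Tx samples, newest first
    cleaned = []
    for x, d in zip(tx, rx):
        history = [x] + history[:-1]
        echo_estimate = sum(wi * hi for wi, hi in zip(w, history))
        e = d - echo_estimate                   # residual after echo removal
        norm = sum(h * h for h in history) + eps
        w = [wi + mu * e * hi / norm for wi, hi in zip(w, history)]
        cleaned.append(e)
    return cleaned
```

Because the filter keeps adapting on every sample, it can track an echo delay that drifts during the call, which is exactly the variable-delay problem that makes AEC harder in VoIP than in circuit-switched networks.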

Voice quality metrics
Technology to remove audio quality impairments in a VoIP network is an important part of any solution. But along with the functions to improve VoIP quality, service providers also need a standard, objective way to measure voice quality in order to accurately monitor performance levels and uphold service level agreements (SLAs) with customers.
Voice quality metrics can be divided into three groups: packet, audio and acoustic echo cancellation (AEC). All statistics are captured for each call leg of a conference call to help with granular troubleshooting of audio quality problems and performance measurement. Packet statistics measure performance with respect to packet throughput, loss and delay, while audio statistics measure speech and noise power levels. AEC statistics measure echo delay and echo cancellation performance.
Key benefits of voice quality metrics:

  • Provides objective measurements for administering service level agreements (SLAs)
  • Facilitates the troubleshooting of audio quality issues in the network
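As a small example of the packet group of metrics, loss can be derived from gaps in the packet sequence numbers and jitter from the spread of inter-arrival times around the nominal packetisation interval. The 20 ms interval and the exact jitter definition here are simplifying assumptions (RTP, for instance, specifies a smoothed interarrival jitter estimator).

```python
def packet_stats(seq_numbers, arrival_times, interval=0.02):
    """Per-leg packet statistics: loss rate from sequence-number gaps,
    and mean deviation of inter-arrival times from the nominal
    packetisation interval as a simple jitter figure."""
    expected = seq_numbers[-1] - seq_numbers[0] + 1
    loss_rate = 1.0 - len(seq_numbers) / expected
    gaps = [t2 - t1 for t1, t2 in zip(arrival_times, arrival_times[1:])]
    jitter = sum(abs(g - interval) for g in gaps) / len(gaps)
    return {"loss_rate": loss_rate, "jitter": jitter}
```

Computed per call leg, figures like these give the objective baseline needed to administer SLAs and to pinpoint which participant's network path is degrading a conference.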

Voice quality enhancement
Voice quality enhancement (VQE) encompasses an integrated set of features designed to improve VoIP quality and generate statistics needed for ongoing performance monitoring. This requires sophisticated digital signal processing algorithms that perform rapid real-time IP packet processing, a key component in next-generation VoIP audio conferencing architecture. As such, VQE can be deployed in an existing IP media server, which provides the requisite carrier-class real-time IP packet processing power.

IP media servers, also known as the Multimedia Resource Function (MRF) in an IMS architecture, are specifically designed to deliver real-time IP media processing as a common, shared resource for a broad range of VoIP and IMS applications in a next-generation network.

They also deliver real-time processing of codec algorithms, transcoding of codecs and sophisticated audio mixing for conferencing applications. Since media server and VQE tasks are interrelated and require the rapid execution of IP packet processing algorithms, it makes sense to integrate the functions of both into a single network element.

Ray Adensamer is Senior Product Marketing Manager, RadiSys

As budgets are reprioritised, initiatives that promise the most value should be given first priority, with service layer transformation somewhere near the top of the heap, says Rick Mallon

Many service providers are in the midst of long-term IT transformation programs. Most often these aim to launch new services, improve speed to market, achieve customer-centricity, and simplify operations. In the current economic climate, however, resources for many CSPs are far more constrained than was anticipated when these programs were conceived. Consequently, many CSPs face tough decisions regarding their IT budgets. Those that reduce budgets sharply risk stranding previous investments made in ongoing programs. A newly established set of IT systems intended to enable a major migration could become yet another expensive operations silo, thus undermining the program's original goals.

Service layer vs BSS transformation
Transformation initiatives often start with BSS transformation. In post-merger environments where consolidation and customer-centricity are the business drivers, there's a definite logic to this. For one thing, service providers want the billing process to continue without a hitch. Trouble is, regardless of the technologies you use, the conversion of disparate billing and CRM environments to a centralised architecture will necessarily involve some risks and variable costs.

The biggest risk may be in putting the wrong foot forward. Some opt at the beginning for a traditional big-bang BSS transformation, because that path is a known route. Others may prefer to begin with a service layer transformation initiative, because it keeps future options open. Ultimately, the best, as well as least costly and most flexible, approach is to start with service layer transformation. Service layer transformation won't automatically save providers from repeating some of the mistakes of the past, but it will enable them to more easily add new services in the future and create a common view of each subscriber.
BSS conversions are likely to affect a large number of business stakeholders, touch thousands of users, and impact millions of customers. As care channels change, or redesigned bills are introduced, there are always significant costs relating to possible customer disruptions during the transition period. Internally, new users need to be trained and established processes need to be re-engineered. The likelihood of battling organisational resistance is magnified.

Even if a CSP manages its way through the pitfalls, BSS transformations often struggle to deliver their intended value when they intersect still-fragmented service layers. Put simply, if the service layer continues to consist of distinct product-facing silos, it will be difficult to accelerate time to market, launch cross-domain services, and drive personalisation and customer intelligence.

Merging or federating these silos through service layer transformation has the potential to deliver significant benefits for less cost, effort and scope than BSS transformation. Fewer customer touch points are involved; fewer stakeholders are directly affected. Transformation of the service layer removes the silos, simplifies operations, and benefits marketing, service innovation, and opex and capex budgeting.

Because networks continue to experience significant advancement, vendor churn has become a consistent challenge for network organisations. Massive multi-vendor environments are also a common end result of large scale mergers and acquisitions. Service layer transformation is necessary to manage all of that kit under one layer and fulfil services in a technology-agnostic way. Even if a BSS transformation succeeds, time to market and operational-simplification goals can be derailed by overly complex networks broken out into distinct vendor domains.

While improving care, service personalisation, and customer intelligence is often at the core of BSS transformation's business case, it's difficult to achieve without a centralised view of a customer from a product and service perspective. Service layer transformation enables a single, correlated view of each customer and all their subscribed services. This view can, in turn, fuel more intelligent targeted marketing and personalised up-sales of value-added applications or additional services. Customers are tired of receiving promotional offers for services they already use at a higher price. If the service layer isn't transformed, this kind of churn-driving marketing continues while undermining revenue goals for new, add-on and niche-oriented offerings. CSPs have an opportunity to leverage significant standards work. Specs like SCTE 130 out of North America enable personalised services and advertising, while various efforts that leverage the TM Forum's business process framework and OSS/J teams provide models, process flows, and interface specifications for reducing the risk, time, and effort related to service layer transformations.

ZON Multimedia
ZON Multimedia, based in Oeiras, Portugal, is a good example of a CSP that has drawn significant benefits from an effective service layer initiative. ZON announced that by the end of 2008 more than 53 per cent of its customers were subscribed to multiple services. The company added 144,000 net new subscribers in 2008 and credits much of this success to a bundle of video, broadband and IP-based voice services it launched in May last year. ZON is also launching mobile services, using a MVNO model, which will soon give it a true quad-play offering.

One of the unique aspects of ZON's offerings is that they are delivered in a network technology-agnostic fashion. Due to its OSS service layer transformation, ZON can determine the best and most profitable way to re-use and deliver services to any customer location. For example, ZON offers TV through on-network cable and also a satellite resale model. It offers PacketCable-based VoIP services over its cable network and SIP VoIP over resold ADSL lines. ZON has the ability to plug any new service offerings into its common service layer platform and deliver them through any available and appropriate networks and CPE devices. In addition to its MVNO offering, ZON offers HSD services for cable and WiFi, and DSL (resold model) from its common, transformed service layer.

For business or commercial market offerings, service layer automation can be more challenging than in the consumer sector. Enterprises vary significantly in their particular requirements, making end-to-end automation difficult. However, mass-market offerings for small and medium businesses are well within reach of the same service layer transformation that empowers consumer offerings. Services that combine, for example, a metro-Ethernet pipe with SIP-based VoIP, video for board rooms or waiting rooms, and advantageous mobile packages are ideal for this marketplace. In particular, preparing for the coming wave of SIP devices and applications is an important part of any CSP's growth strategy.

The multi-screen opportunity
One concept that is central to SIP, and increasingly important in video and broadband offerings, is service portability. Put simply, consumers want access to their video content wherever and whenever they choose. Many CSPs have yet to embrace this concept, leaving the opportunity open to Internet-based players. Though IPTV has the potential to provide flexible access to more content, it's only a technology and is often applied in the same framework as traditional cable TV. Customers, however, have a concept of content ownership. If they've purchased movies or TV shows for their iPod, mobile phone, or PC, they'd like to be able to access them from any other device, including their TVs at home.
Service layer transformation allows a CSP to seize this trend as a growth opportunity. It can enable service providers to tie services to individual subscribers, rather than to locations or devices. An intelligent service layer will understand to what devices a user has access and deliver content and services as appropriate and seamlessly. With a service layer empowering that kind of portability, new options for promotions and offerings open up, including personalised bundles and targeted advertising.

In today's disparate service layer environments, CSPs are scrambling to some extent because their TV offerings are being bypassed thanks to Hulu, iTunes, YouTube and other outlets. Delivering content to the channel the consumer wants at any given time requires service layer transformation. It enables customer-centricity across all services and breaks down the barriers between technology domains. With these capabilities in place, marketers would have a world of options open to them that they can only wish for today. Too often they instead face IT and network organisations that have to say: "We can't do that yet."

From a return on investment perspective, service layer transformation is a winner. It can be a catalyst for moving service offerings and customer intelligence forward into entirely new areas of opportunity. It is simpler, less costly, and less risky than massive BSS conversions, yet sets the stage to make those inevitable BSS-layer transformations simpler and more beneficial. In a resource-constrained world, where budgets are tightening and business cases are more scrutinised, service layer transformation is a priority that empowers both revenue growth and operational improvement.

Rick Mallon, Sigma Systems

People are communicating more things to more people than ever before, and not just by phone anymore. Internet-enabled communication models are gaining audience, attention and market share at the expense of traditional telecommunication providers. Can telcos fight back and find new growth opportunities in this rapidly changing ecosystem? The challenge lies not just in understanding the technology, but also the unfolding fundamental shifts in human communication behaviour, say Chris Pearson and Rob van den Dam

A growing number of people are visiting social networks. According to Comscore, approximately two-thirds of the worldwide Internet audience regularly visit social networks. This trend is universal. In South Korea, considered by many to be the world's most developed social networking nation, more than 90 per cent of teens and almost half of the entire population are members of Cyworld. In the United States, 80 per cent of young adults, 60 per cent of teens and 30 per cent of adults use social networks. And in the UK, 90 per cent of teens spend time on these sites.

Social networks have become a primary destination for a rapidly expanding universe of online users managing and enriching a digital lifestyle. They provide the ability to communicate, to develop identities, to build networks of relationships, to find information, to share experiences and self-generated content, to buy products, and more.
With numerous communication tools at their disposal, social networks are becoming integrated communication hubs. The integration of MySpace and Skype, for example, illustrates how social networks and communication applications can converge to benefit users. With more than 118 million active MySpace users and over 370 million registered Skype users around the world, this partnership connects two of the most popular communication platforms on the Internet to create the world's largest online voice network.
A number of telcos are already responding to the challenges and opportunities of social networking. Many have initiatives underway that range from simply enabling online social networking sites to extending their offerings to the mobile communication environment, to even building their own, proprietary social networks. For example, telecom operators such as Sprint, AT&T and Vodafone enable MySpace and Facebook members to access their profiles from their cell phones. And Vodafone recently launched Connect to Friends, a Facebook application that enables Vodafone and non-Vodafone subscribers to communicate with each other from either a PC or a mobile phone.

The widespread social networking phenomenon is a reflection of shifts in two long-term communication trends:
1. Communication patterns are changing from point-to-point, two-way conversations to many-to-many, collaborative communications, augmented with links, videos, photos and multimedia content that substantially enrich the user experience.
2. Control of communications is shifting away from the proprietary domain of telecom providers to open Internet platform service providers.

The so-called Net Generation - the digital natives who have grown up in a technology-enabled and Internet-connected environment - is at the forefront of shifting social communication patterns. Their preference is for staying connected, sharing, creating content, multitasking, assembling random information into patterns, and using technology in new ways. They are the wisdom-of-crowds generation that grew up rating peers, physical attributes, products and services, etc.

They are native speakers of technology, fluent in the digital languages of computers, the Internet, video games and the mobile phone, and often living in a state of continuous partial attention. For many of them, social networking is supplanting email, and even voice, as the preferred method of communication. But the shift away from traditional communications to social networking is not limited to this generation. A growing number of adults now use social networking to get what they want from each other, instead of from traditional media and institutions.

The second trend, the shifting control of communication media from the domain of telcos toward a more open communication platform, is the result of widespread availability and affordability of connectivity and communication tools/devices. With better, cheaper technologies and greater use of broadband, the Internet, and wireless networks, open platforms such as social networking sites, are becoming ever-more viable platforms for communication services - and consumers are responding eagerly.

The combination of shifts in communication control and patterns is redefining the competitive landscape, giving rise to new business models. In contrast with traditional communication models, emerging models are based on open platforms that support many-to-many and/or collaborative communication patterns.

Traditional communication
The traditional model, characterised by two-way point-to-point communication, is the domain of traditional telco providers. It is the largest segment in terms of revenue and subscribers, but it is showing signs of slow growth as other models take hold. Wireline revenue is declining and although, according to Gartner, global mobile services revenue is forecast to grow 7.6 per cent from 2007-2012, the mobile subscriber base has reached saturation in key developed markets.

Open and Free
This model offers alternatives to traditional point-to-point communication services on open Internet platforms. Companies in this domain provide basic communication services such as VoIP for free or at very low cost. Many of these services threaten profitable traditional services such as long distance calling and mobile roaming.

Providers in this space include VoIP provider Skype, Google with GoogleTalk and Microsoft with Windows Live Messenger, which offer PC-to-PC voice services along with instant messaging and chat. With over 370 million registered users worldwide, Skype has, in a matter of five years, come close to creating a truly global telecom service.

Many of the players in the "open and free" space, such as Microsoft and Google, have considerable resources and leave little room for a commercially viable response from telcos beyond repackaging existing services into "convenience bundles." Some telcos, however, seemingly have embraced the model and are partnering with disruptive new entrants. As an example, the mobile operator 3 in Australia and the United Kingdom has partnered with Skype to launch the 3 Skypephone, to attract and retain customers.

Gated Communities
This model focuses on group communication and collaboration within the telco's ‘walled garden' and will appeal to users and enterprises with a preference for the more secure and reliable communications environment traditionally provided by telecoms.

The most obvious opportunity here is the extension of social networking to the mobile environment, where operators continue to retain some exclusivity. Research companies such as Informa and Juniper estimate that by 2012, mobile social networking will represent a market opportunity of between US$22.5 billion and US$52 billion, and telcos should be able to seize a share of that.
Recent studies have revealed that more than 40 per cent of iPhone users in the United States, Germany, France and the United Kingdom are visiting social networking sites. In South Korea, a mobile user visits Cyworld on average eleven times per day. The mobile service of the Japanese social network Mixi, which started as an online site, has turned out to be hugely successful with mobile page views already outnumbering online page views.
Telcos also have an opportunity to play a role in the delivery of fully integrated collaborative services to enterprises and organisations that value carrier-grade capabilities in a secure and reliable environment. According to Forrester, enterprise spending on Web 2.0 collaboration technologies is forecast to grow to US$4.6 billion globally by 2013, with social networking as the top spending category.

Shared Social Spaces
Shared social spaces facilitate collaboration on the open Internet. The main providers in this space are over-the-top (OTT) applications such as MySpace, Bebo, YouTube and Facebook; virtual worlds such as Second Life also belong to this domain. Players such as Microsoft, Google and Nokia, with its OVI/Share platform, have entered the fray as well.

As many of these players integrate telephony services, they have the potential to become fully integrated, end-to-end communication platforms. Though the revenue model remains unproven, they are drawing attention away from traditional communication service providers and are contributing to their slowing growth.

In addition, these types of applications put additional strain on already-burdened network infrastructure, particularly with the rapid increase in video content sharing and distribution. Cisco forecasts that by 2012, the sum of all forms of online video, including TV, VoD, Internet and peer-to-peer (P2P), will account for nearly 90 per cent of all consumer Internet traffic, a large portion of which will flow through OTT applications. According to Ofcom, these types of OTT services will impose an additional £830m (US$1.4 billion) in bandwidth costs on UK Internet service providers, without a corresponding revenue model.

To deal with over-burdening OTT traffic, telecom operators have options that include filtering or blocking OTT traffic, but this is unsustainable in many jurisdictions as it violates net-neutrality principles. Instead of protesting about the OTT bandwidth demand, telcos should embrace it. Network and computing infrastructure optimisation techniques such as traffic shaping and content delivery network (CDN) technology can reduce the cost of delivering high-bandwidth content and can potentially lead to new business models that capture value from this increasing OTT traffic.
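Traffic shaping itself rests on a simple primitive: the token bucket, sketched below. Tokens accrue at a sustained rate and packets are admitted only while tokens remain, allowing short bursts up to the bucket's capacity. The class and parameter names here are illustrative, not drawn from any particular product.

```python
class TokenBucket:
    """Token-bucket shaper: admits traffic at a sustained `rate`
    (bytes/second) while permitting bursts up to `capacity` bytes."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity   # start with a full bucket
        self.last = 0.0          # timestamp of the last decision

    def allow(self, packet_bytes: int, now: float) -> bool:
        # refill tokens for the elapsed interval, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True           # admit the packet
        return False              # shape (queue or drop) the packet
```

With a rate of 1,000 bytes per second and a 1,500-byte bucket, for instance, a full-size packet is admitted immediately, an immediate second one is refused, and traffic becomes admissible again once enough tokens have accrued.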

The social networking phenomenon arose from significant shifts in communication, driven by the widespread growth of Internet connectivity and the emergence of interactive online communication tools. These shifts have been redefining a century-old industry, with the result that the advantages enjoyed by traditional communication service providers are beginning to wane. Telcos can, however, remain relevant in the face of changing user sentiments and demands if they take bold steps to adapt to this evolving marketplace.
Telcos can begin by taking advantage of the window of opportunity in mobile social networking, and also bolster their capabilities to serve the evolving, broader communication needs of enterprises. They should partner with, or acquire, existing players, to proactively develop the capabilities required for success. We already see some examples of such moves. In October 2008, Telefonica signed a global agreement with Facebook allowing it to integrate access to Facebook's mobile service and applications from all of Telefonica's mobile Portals. And in May of that year Vodafone acquired Zyb, a Danish mobile social network, with an eye on extending its social networking capabilities.

Telcos should also focus on enabling other participants in the value chain to benefit from distinctive telecom capabilities such as location, presence, text/multimedia messaging services and conference calling, in this way also generating revenue for themselves. Vodafone's Connect to Friends application in Facebook is an example of such an approach.
In addition, telcos should work more closely with OTT and/or CDN (such as Akamai and Limelight) providers to reduce the cost of delivering high bandwidth content (e.g. video, music) in response to increasing demands for such services. This can be achieved through network and computing infrastructure optimization techniques such as traffic shaping and caching of content close to the edge of the network using CDNs. Such approaches can potentially lead to new business models that capture more value from this increasing OTT traffic.

Over the long term, telcos should broaden the scope of their traditional voice business to more actively encompass both point-to-point and many-to-many collaborative models, which include voice, internet-based communications and content, and align the organisation and industry partnerships accordingly.

The fundamental change in the way we are now communicating is driving the evolution of a new communication services ecosystem that will force significant and bold changes by existing providers if they wish to remain an integral part of the communications landscape. The journey will not be without risks, but the option of doing nothing is a luxury many providers cannot afford. Revenue from traditional services will continue to decline and highly resourceful Internet information providers and IT companies continue to grow in the communications space to claim a larger share of communication time.

Chris Pearson is Global Telecom Industry Leader, IBM Global Business Services.
Rob van den Dam is EMEA Telecommunications Leader of the IBM Institute for Business Value and can be contacted via: rob_vandendam@nl.ibm.com


Most of Western Europe is now thoroughly gripped by recession, with even the suggestion of a depression skulking on the horizon. These are uncertain times - but one thing that is certain during these dark days, says Michael Callahan, is that companies will need to look at ways to reduce overheads, be smart with their diminishing budgets and seek solutions that provide value for money

Recent months have seen a number of high-profile organisations fighting for survival, from redundancies in the financial sector and downtime on manufacturing production lines, to major retailers slashing costs. All organisations, across all sectors - from small businesses to international conglomerates - are being affected by today's economic climate. Their continued existence will depend on them reducing their overheads and tightening their belts effectively.

When a company needs to limit its spending, the first area to be examined, and habitually slashed, is its IT budget, often with the security element considered non-essential. While many businesses overwhelmingly recognise that security has the power to determine whether they live or die commercially, many remain frustrated by the strain it places on finances and human resources. The reality is that growing regulatory requirements demand that enterprises protect data, making such a cost-saving strategy risky and potentially damaging.
Many an organisation has fallen foul when, having taken the decision to deploy technology, it has then inadequately scoped the investment, restricting it to what it considers the bare minimum and failing to anticipate the implications of its deployments. No matter what type of software or device is chosen, security should be an important consideration to lock down both the device and the data contained within it, to avoid ‘hidden' expense. Taking a mobile device as an example, the questions that should be asked are: what types of information will it be able to access and carry, and how easily could the device be lost or stolen? The answers will have a great impact on security concerns and risks, and will dictate the type and amount of security needed for the device. Simple, cost-effective solutions, like boot-up passwords, two-factor authentication and encryption, can all play a role.
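Two-factor authentication is cheaper than it may sound: time-based one-time passwords, standardised as TOTP in RFC 6238 (building on HOTP, RFC 4226), can be generated with nothing beyond an HMAC. The sketch below is for illustration only; it derives a short-lived code from a shared secret and the current 30-second time step.

```python
import hashlib
import hmac
import struct

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(key: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP driven by the current time step."""
    return hotp(key, unix_time // step, digits)
```

As a check against the RFC 6238 test vectors, the 20-byte ASCII secret "12345678901234567890" at Unix time 59 yields the 8-digit code 94287082.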

One publicised example of inadequate, or even short-sighted, investment was a Marks & Spencer-owned laptop containing a database of 26,000 employees' details, which was stolen from a third party. Having taken the decision to invest in laptops, the retailer had opted not to take the precaution of sufficiently protecting those, like this one, with sensitive data stored on them. The Information Commissioner's Office found Marks & Spencer in breach of the Data Protection Act, leaving the retailer not only with its reputation tarnished, but also with an enforcement notice to ensure that all laptop hard drives were encrypted - a modest investment which, in hindsight, would have saved its blushes, not to mention the costs involved in handling the breach.

Many leading companies and organisations have already looked to decrease their overheads by reducing their property spend and energy expenses in downsizing to smaller, cost-effective premises. Redundancies are inevitable as workforces are slimmed down, with remote working and hot-desking practices feasible alternatives.

As department numbers decrease, the resultant increased workload for those that remain may force diligent employees to take work home with them to avoid falling behind or missing deadlines. Hot-desking could become widespread as companies strive to maximise their use of resources and cut costs by providing limited desks for their workforce, if at all - a drastic option could be to cut the cost of a central office altogether in favour of a ‘virtual' office. Another solution may be to utilise external resources, such as contracted labour, consultants, and possibly entire departments - IT support, HR and payroll are just a few examples.

While many companies try to weather the storm, data security must still be paramount. Privacy laws, along with corporate governance and industry-specific regulations, have become prevalent over recent years and neither ignorance, nor lack of funds, will be deemed as adequate defence. If organisations decide to lower their fortifications to allow flexible working practices, it is important that they do so securely and in a controlled manner. Here are a few ways for companies to examine what they currently have in their arsenal, and those that they really shouldn't be without:

Mobile computing allows people to use IT without being tied to a single location. Any business with staff who work, or will work, away from the office can benefit from using it. Devices - from laptops and personal organisers to "third generation" (3G) phones - can help staff keep in touch and make the most productive use of their time. They can change the way you do business and lead to new ways of working, even new products and services that can be offered to customers, bringing new business opportunities. Increasingly, networking "hot spots" are being provided in offices where multiple employees access the same machine and network. While this increases productivity and can reduce costs, it must be done securely. Data security advice from the Information Commissioner's Office is to encrypt any personal information held electronically if its loss or theft would cause damage or distress, and to provide data access only to approved personnel.

With new technologies, it's not only easier but more secure than it once was to let workers log onto the company network from home. Having fewer people working at the office could save money on energy bills - this could be taken further by shutting down the office completely one day per week and having everyone work from home, with further savings realised by shutting down the heating or air conditioning system. However, it is still imperative to secure the data as it leaves the office and travels home on the tube.

Replace dedicated WAN links with site-to-site VPN. If your business has multiple physical locations and you have dedicated leased lines connecting them, it might be time to think about ditching the expensive dedicated links and replacing them with site-to-site VPN connections instead. Midsize and large businesses may be able to save thousands on monthly fees by doing this.

Software application management (SAM) identifies installed applications and then monitors their usage (or lack thereof) to determine compliance with software licences and adherence to corporate usage and security policies. SAM is often perceived as a compliance exercise, yet the truth is organisations tend to underutilise licences - typically wasting 10-20 per cent of spend on dead, outdated or unnecessary application licences. Reducing underutilised software has security benefits as well: fewer applications mean fewer opportunities for compromises and configuration errors. Also, the process of inventorying and auditing software usage often paves the way for additional control disciplines that cut costs and boost asset productivity.
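The reclamation pass at the heart of SAM can be sketched in a few lines. The inventory and last-use structures below are hypothetical stand-ins for what an audit agent would collect; seats idle past a threshold are flagged as candidates for reharvesting.

```python
def licence_report(owned, last_used_days, stale_after=90):
    """For each application, count installed seats whose last recorded
    launch is older than `stale_after` days - candidates for reclamation.

    owned:          {app_name: seats_purchased}
    last_used_days: {app_name: {hostname: days_since_last_launch}}
    """
    report = {}
    for app, seats in owned.items():
        idle = sorted(host
                      for host, days in last_used_days.get(app, {}).items()
                      if days > stale_after)
        report[app] = {"owned": seats,
                       "reclaimable": len(idle),
                       "idle_hosts": idle}
    return report
```

Run over a real inventory, a report like this is what turns the 10-20 per cent of dead licences from an estimate into a named list of seats to reclaim.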
Outsourcing is a sensitive subject, often conjuring images of personnel cuts. Yet the reality is judicious outsourcing can allow you to better utilise existing personnel.

Make good use of existing investments - advice PA Consulting should have heeded when an employee decided to circumvent existing security procedures, transferring data unencrypted to a memory stick, in breach of the company's contract and its own security policies. The memory stick, containing a Home Office database of 84,000 prisoners, was subsequently lost and, as a result, PA Consulting has had its three-year contract worth £1.5 million terminated, with the Home Office further reviewing its other contracts, worth £8 million a year. Everyone within an organisation must understand his or her responsibility for keeping sensitive information secure, and how to use the available technology, such as encryption software, to do so.

Fundamentally, effective security means doing more with less - it is about people, processes and technology. There are plenty of interesting technologies available, although they're all useless if they're inappropriately deployed, managed and maintained. Allowing devices to operate in your enterprise without any rules or policies is truly the biggest risk. Complicated policies that regular users can't grasp are futile; instead they should be simple, precise and basic common sense. Often, if people understand why they need to do something, then they'll do it. If all else fails, look for something that can be enforced, often unseen, that takes the onus away from them.

In difficult economic times it is important to remember that the evidence of past downturns shows that those who make smart use of innovative technology will be the ones who live to fight another day.

Michael Callahan is Vice President Global Marketing, Credant

As operators begin to position themselves as multimedia companies, Chris Yeadon examines the essential billing systems capabilities now required to support an effective content-based strategy

Over the last few years many industry observers have stated the importance of mobile operators becoming media companies, if they are to avoid becoming merely ‘dumb bit pipes' as they approach the so called ‘IP Jungle'.  Many operators have taken notice. An increasing number are now positioning themselves as multimedia companies, offering new fixed-line, DSL, IPTV and mobile TV services in addition to their core mobile offerings. Content is also playing an increasing role, especially in the mobile channel. Indeed, according to Portfolio Research, the mobile content market was already worth $24 billion in 2008 and is forecast to rise to $47 billion in 2013.

The challenge for many operators is how to support the new array of partners and especially content partners who will play an increasingly important role in the value chain of their offerings. In this regard the billing system will be pivotal in the support of next generation strategies.

Mobile content covers a huge area in terms of genre and format, ranging from stock price information to games and SMS alerts to mobile TV and advertising. Therefore, before we examine the billing system requirements for mobile content it is necessary to make some assumptions. For the purposes of this article mobile content is:

  • Both produced and owned by a third party
  • Obtained or consumed over the network
  • Marketed and sold by the operator (‘on-deck')
  • Always chargeable - but not necessarily to the end user.

Mobile content provides operators with the means to increase customer focus by targeting customer segments with relevant content. This could include offering financial news to the business segment or music downloads to the youth segment. Depending on the ambition of an operator's content strategy, they may develop a content partner ecosystem consisting of numerous companies each with different financial and contractual models to be supported. Therefore, in much the same way that an operator requires efficient access to contractual and product data to manage its relationship with its customers, it must have the ability to efficiently manage its partner relationships. This includes contract definition flexibility to support the different types of partner business models.

Unless the content partnership is based on a ‘buy-out' or ‘blanket fee' arrangement, the business terms of the partnership will typically be based on a percentage of revenue or rate per use royalty. Therefore by definition, the billing system will need to support multiple chargeable parties. Firstly, the subscriber usage charges (including any applicable offers or discounts) must be calculated and made available for subsequent billing. Then, in relation to each transaction, additional charges payable to the content partner must also be calculated by the billing system.

Whilst this may seem obvious, for certain types of content the reality can be much more complex. Music content is a good example. In some cases an operator's billing system may have to support three or more additional chargeable parties including a record company, music publishers and various copyright protection societies.

In order to support a rich content partner ecosystem, the billing system must have the flexibility to generate different types of invoices and statements for each type of transaction in addition to those produced for the subscriber. Taking the example of music content; a record company partner may have control of the content platform from which downloads are made and therefore is able to issue an invoice to the operator at the end of each billing period.  The billing system may then need to produce a reconciliation statement in order to reach settlement with the partner. On the other hand a copyright society, with no connection to the content delivery process, will rely on the operator to provide a different kind of credit note or ‘negative invoice' at the end of the billing period.

For an operator's billing system, partner agreement flexibility is the ability to support the business terms of all its partner agreements. In particular this means having the tariff flexibility to support the calculation of all the appropriate charges payable to a content provider and, at the billing level, the application of bulk usage discounts and incentives that may also be written into the partner agreements.

The following example shows the schedule of royalties payable to the Mechanical Copyright Protection Society (MCPS) for full-track music downloads. MCPS is a UK copyright society representing music publishers, from which businesses recording music must obtain a licence; similar organisations exist in many countries across the world. The schedule illustrates a potentially complex charging scenario that must be supported.

No. tracks in bundle    Royalty Rate          Min. Royalty Per Track
1-7                     8% gross revenue      4p
8-12                    8% gross revenue      3.5p
13-17                   8% gross revenue      3p
18-29                   8% gross revenue      2.5p
30+                     8% gross revenue      2p
MCPS (UK) Royalty Schedule for Full Track Downloads

In this example the basic royalty charge payable by the operator is eight per cent of gross revenue from the download of the music file. However, this is subject to a minimum royalty charge of four pence. Furthermore, this minimum royalty is variable, depending on the number of tracks included in a bundle. Therefore, within the charging component of the billing system, an operator may have to set up a dependent tariff of eight per cent of gross revenue as well as a series of alternative tariffs based on the various potential bundle scenarios. During the rating process, the charges based on both the dependent tariff and the applicable alternative tariff would have to be calculated; subsequently, during the billing process, a ‘best option' rule would be applied, selecting the most favourable charge for the partner (MCPS).

The support of the above scenario also assumes the number of tracks contained in a bundle can be supported as a field in the content usage record, and that the rating engine can support the bundled track numbers as a unit of measurement.
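As an illustration, the ‘best option' rule for the MCPS schedule above can be sketched in a few lines of Python. This is a hypothetical sketch, not vendor code: the function name, the pence-based units and the tier table layout are all assumptions made for the example.

```python
# Hypothetical sketch of the MCPS 'best option' royalty rule described above:
# 8% of gross revenue, subject to a per-track minimum that falls as the
# bundle size grows.

# (upper bound on tracks in bundle, minimum royalty in pence per track)
MIN_ROYALTY_TIERS = [(7, 4.0), (12, 3.5), (17, 3.0), (29, 2.5), (float("inf"), 2.0)]

def mcps_royalty(gross_revenue_pence: float, tracks_in_bundle: int) -> float:
    """Return the royalty payable (in pence) for one bundle download."""
    rate_charge = 0.08 * gross_revenue_pence        # dependent tariff
    for upper, min_per_track in MIN_ROYALTY_TIERS:  # alternative tariffs
        if tracks_in_bundle <= upper:
            minimum = tracks_in_bundle * min_per_track
            break
    return max(rate_charge, minimum)                # most favourable to MCPS
```

For a single track sold at 79p, eight per cent of gross (6.32p) exceeds the 4p minimum, so the percentage tariff applies; for a cheaply priced ten-track bundle, the 3.5p-per-track minimum would win instead.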

Historically, mobile content has involved higher levels of risk resulting from fraud and accidental overspending. The risk is heightened by the fact that operators are exposed to third party content charges that are usually dependent on usage or revenues due, rather than revenues actually paid.

Therefore content services should be chargeable in real-time for both prepaid and postpaid subscribers. This would enable operators to perform balance checks on the subscriber's account. In the case of a prepaid subscriber, this is to ensure that sufficient credit remains to cover the transaction. For a postpaid subscriber, it is to check against a threshold set by the operator, as a precaution against possible fraud or overspending, or one set by the subscriber as a voluntary spending control measure. Such checks should result in an advice of charge message being sent to the subscriber in advance of the transaction.
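The real-time check described above can be sketched very simply. The following Python fragment is purely illustrative - the class, its field names and the authorise function are assumptions for the sketch, not part of any real billing product.

```python
from dataclasses import dataclass

@dataclass
class Subscriber:
    prepaid: bool
    balance: float = 0.0      # remaining credit (prepaid)
    month_spend: float = 0.0  # charges accumulated this period (postpaid)
    spend_cap: float = 0.0    # operator- or subscriber-set threshold (postpaid)

def authorise(sub: Subscriber, charge: float) -> bool:
    """Real-time balance check: True if the content transaction may proceed.
    A real system would also send an advice of charge message first."""
    if sub.prepaid:
        return sub.balance >= charge                  # sufficient credit remains?
    return sub.month_spend + charge <= sub.spend_cap  # under the spending cap?
```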

The benefits of real-time charging to the operator are clear. It reduces the degree of credit risk, provides price transparency and a feeling of control to the subscriber and provides the possibility to offer the subscriber, in real-time, promotions and discounts.

One possible way to achieve this would be to utilise the real-time capabilities of a postpaid billing system to charge all content transactions for all subscribers, prepaid and postpaid. In addition to the important real-time controls, it would mean that all content services can be made available to all subscribers and that costly content tariff configuration duplication is reduced.

In the future as the telecommunications industry converges with the media and entertainment industries it is clear that content partnerships will play an increasingly significant role in an operator's success. It is therefore essential that an operator is equipped with a billing system that is designed to support content partner models, has the flexibility to sustain the varied financial models of a large partner ecosystem and, in addition, has the tariff and marketing features, including real-time charging, to satisfy increasingly demanding customers.

Chris Yeadon is Director, Product Marketing, LHS

Over the past 30 years or so, consumers' quality expectations have been set high by their experience of legacy voice systems. When carriers introduced VoIP as an alternative delivery system for traditional telephony, these quality expectations represented a new challenge as providers attempted to deliver voice over the IP infrastructure. Today, Bruce Perlmutter explains, service providers face the further challenge of defining and delivering multiplay services (voice, data, video, etc) that meet customer expectations, not least because there are no pre-defined metrics for delivering multiplay services

IP networks differ from the legacy networks they're replacing. The fundamentals of IP network construction and the dynamic nature of today's commercial infrastructure cause inevitable congestion. Congestion is the source of many quality problems in multiplay networks. Network operators can partition network resources based on policy goals by implementing quality of service (QoS) mechanisms, but good QoS at the network level does not ensure good Quality of Experience (QoE).

As service providers try to meet customers' high QoE expectations, critical tools and standards definitions need to be implemented. A far cry from traditional network monitoring, a new breed of service verification tools that actively monitor end-user QoE are now being introduced.  These tools are becoming so sophisticated that they can actively test individual subscriber paths - down to the set-top box in the home - rather than passively monitoring and reporting on general network conditions.

The Telecommunication Standardization Sector (ITU-T) coordinates standards for telecommunications on behalf of the International Telecommunication Union (ITU) and is based in Geneva, Switzerland. Since its inception in 1865, the Union has been brokering industry consensus on the technologies and services that form the backbone of the world's largest, most interconnected man-made system. The two standards most relevant to this article are the QoE and QoS standards.

ITU standard P.10/G100 provides the following definition of QoE: "The overall acceptability of an application or service, as perceived subjectively by the end-user."

Since QoE includes the subjective user experience, the ITU defined a test methodology based on mean opinion scores (MOS). A MOS is measured through carefully controlled subjective tests laid out in ITU-R BT.500-11 and ITU-T P.800. In these tests, subjects watch video or listen to telephony samples and rate them on a scale from 1 to 5. The average of the ratings from each test case yields the MOS.
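The arithmetic behind a MOS is simply the mean of the panel's ratings; a minimal illustration in Python:

```python
def mean_opinion_score(ratings: list[int]) -> float:
    """Average a panel's subjective ratings on the ITU 1-5 scale."""
    assert all(1 <= r <= 5 for r in ratings), "ratings must be on the 1-5 scale"
    return sum(ratings) / len(ratings)
```

So a test case rated 4, 4, 5, 3 and 4 by five subjects yields a MOS of 4.0.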

QoS is a concept closely related to QoE though the two are often confused. The ITU's E.800 standard gives the definition for QoS as: "The collective effect of performance which determines the degree of satisfaction of a user of the service." In telecommunications, QoS measures a network's actual performance using two factors: QoS mechanisms and QoS metrics.

QoS mechanisms such as traffic schedulers, shapers and admission control techniques help smooth out traffic flows. These mechanisms ensure that no one user or application gets an unfair share of the available bandwidth. In IPTV networks, loss concealment protocols such as forward error correction (FEC) or automatic repeat request (ARQ) are commonly used QoS mechanisms. More common QoS metrics and measurements such as packet loss, network latency, and jitter also help determine the underlying quality of the transmission network.

Network operators predict user voice call quality over the network by using QoS metrics. Unfortunately, video streams are more difficult to manage; making predictions about the effect of a given set of QoS metrics on the service can be very difficult.
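As a concrete example of one such metric, interarrival jitter is commonly estimated with the smoothed formula from RFC 3550 (the RTP specification), in which each new packet's transit-time variation nudges the running estimate by one sixteenth:

```python
def update_jitter(jitter: float, transit_prev: float, transit_curr: float) -> float:
    """RFC 3550 interarrival jitter estimator (units follow the inputs, e.g. ms).
    J(i) = J(i-1) + (|D(i-1, i)| - J(i-1)) / 16
    """
    d = abs(transit_curr - transit_prev)  # change in one-way transit time
    return jitter + (d - jitter) / 16.0   # exponentially smoothed average
```

Feeding it successive packet transit times yields a running jitter figure that can be compared against the operator's QoS targets.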

ITU's standard G.1080, "Quality of Experience Requirements for IPTV Services," explains the elements of QoE for IPTV. Figure 1 shows the framework described in the document.

Content providers who transport digitised and packetised video signals over an IP network have a myriad of options. The selected compression algorithm parameters drive the underlying network's QoS metrics requirements, as recommended by the ITU in the G.1080 standard. Encoder input comes from different source video stream variants such as PAL and NTSC, and signal coding can use a variety of algorithms, such as those defined by MPEG. The management of control settings often defines the competitive advantage of one encoding system over another.

ITU's G.1080 standard provides recommendations for the underlying network QoS metrics for a given type of encoding and bit rate.

Multiplay service networks have come a long way. Operators partition network resources using sophisticated QoS mechanisms so that streaming video and peer-to-peer file sharing applications do not adversely impact voice services. The main impediments to high subscriber QoE are:

  • The inherent structure of IP networks and their building blocks means network congestion will always occur.
  • It is prohibitively expensive to engineer an IP infrastructure that handles the large ratios between peak and mean traffic.
  • Deviations from a network's planned implementation are inevitable.

Exceptionally high traffic exacerbates these issues. Building out server and network infrastructure to support short traffic bursts is usually too expensive for service providers. Congestion must therefore be handled gracefully: servers and network infrastructure shouldn't malfunction under the load, and any subscribers admitted to the service should receive a reasonable service despite the overload.

The large variety of applications supported on IP-based multiplay networks contributes to the network's dynamic nature as subscribers adopt new services. Increasing amounts of HDTV and user-generated video content can quickly change network requirements, for example.

The majority of systems currently deployed exclusively use passive monitoring technologies, which often employ expensive probes at numerous points in the network. Exclusive use of passive monitoring can give a malfunctioning network a clean bill of health, even while support personnel busily help large numbers of disgruntled customers. Passive network management systems can overlook an issue's root cause, even as network management tools light up with hundreds of alarms.

Recently, Ixia, based in California, developed a patented methodology in cooperation with tier 1 network operators, to provide subscribers with a high level of QoE from an actively tested network infrastructure. Ixia's IxRave runs on actual subscriber network paths using pre-designed tests from a centralised "test head." The solution monitors the network infrastructure from typical locations, and enables tier 1 service personnel to run active tests on an individual subscriber's service path-from the network core all the way to the set-top box sitting at the subscriber's home. 

This solution tests the entire network infrastructure and verifies that it meets the ITU-defined goals for QoS and QoE by using both passive monitoring and active testing. By sending different permutations of multiplay traffic, such as voice and video, under different network load conditions, it measures how the network performs along subscriber paths during peak activity times and checks that they meet QoS standards. It then makes network health assessments based on the QoE and QoS metrics defined by the network provider.

Service providers can now test the transport, network and service layers of the network, as well as the more common QoS metrics such as delay, jitter and packet loss. Customers' QoE can be measured using standard ITU-defined MOS scores, so that the network provider gets a true end-to-end picture of the network and "sees what the customer sees."

Bruce Perlmutter, Ixia, can be contacted via:

Dominant incumbents hobble market
Broadband connections across the EU rose by 20 per cent over the year, to a total of 110.5m connections, representing 22.5 per cent of Europe's population, according to ECTA's latest twice-yearly EU Broadband Scorecard. But the pro-competition body warns that in countries such as Spain, where the incumbent operator, Telefonica, continues to increase its control of the market with more than 57 per cent of all retail broadband connections, the market has stagnated. As a result, Spain is languishing below the OECD and EU averages, with a broadband penetration rate of only 20 per cent and low growth rates.

On the eve of the tripartite meeting of the European Commission, the European Parliament and the Council of Ministers, which is aiming to achieve consensus on the way forward for next generation access networks, ECTA has cited Spain as an example of why firm regulation is needed to ensure investment and take-up in next generation networks, as well as to guarantee competition and consumer choice.

Innocenzo Genna, Chairman of ECTA, says: "Light touch regulation has not worked in the banking sector and there is no reason to assume it will work to consumers' benefit in telecoms. Financial results from incumbents, such as Telefonica last week, show that they are primarily focused on increasing profitability at the expense of vital infrastructure investment. What is particularly disingenuous is that, at the same time, they are threatening governments and European politicians that they will not invest in next generation access unless there is a relaxation of competition rules that allow rivals to offer services over these networks.

"Despite Spanish regulator CMT granting Telefonica a regulatory moratorium for next generation fibre networks, ostensibly to support a €1bln fibre investment programme, there is little or no evidence that it will prioritise infrastructure investment in future. Instead it is committed to ‘preserving the company's strong cash flow generation', to the benefit of shareholders."

Telefonica Spain posted an 8.9 per cent increase in profit (EBITDA) and cash flows up 14 per cent to €8bln, but a reduction of 7.2 per cent in investment (capex) in 2008. In addition, results from Deutsche Telekom, which has also been demanding regulatory forbearance from EU policy-makers, show that it outperformed financial expectations and ‘expanded its leading position in the German DSL market'. The annual report also confirmed the regulatory holiday it has enjoyed in access to its ‘fibre to the node' network, a situation that has in all likelihood helped reverse competitive progress in Germany.

Genna concludes: "We have no problem with companies prioritising profitability, making healthy profits and benefiting from their own investment and innovation. However, we do have a problem with healthy companies using the recession as an excuse to blackmail policy-makers into relaxing regulation, with the aim of strengthening their own dominant position further at the expense of competitors and consumers. For companies such as Telefonica and Deutsche Telekom, investment in next generation fibre networks should be part of their normal upgrade strategy to replace decades-old copper networks, which have been fully paid for, and not a licence for stifling competition."

Other more encouraging results from the industry benchmark scorecard show leaders in broadband - Denmark, Finland, Sweden and the Netherlands - all have penetration rates exceeding 30 per cent with the UK not far behind. Common to all these top ranked countries is strong competition from both cable and regulated unbundling of the local loop. In some of these countries, incumbents have also publicly committed to open access policies, in contrast with those of incumbents in Germany and Spain.

Mobile ups and downs
The mobile entertainment market will reach nearly $36bn in 2010 according to Juniper Research, but the analysts also warn that this is a best case forecast, and revenue growth will be markedly lower if the global recession fails to bottom-out over the next twelve months.

Using a scenario-based approach to assess the impact of the recession on the mobile entertainment industry, the recent Mobile Entertainment in a Recession report found that average annual growth over the next two years declines from nearly 19 per cent under the best case scenario to less than seven per cent in the worst case, with mobile TV, user-generated content and music amongst those sectors which are particularly exposed.

Lower discretionary spend on services and handsets was amongst the major contributory factors in the decline in top-line entertainment service revenues, although the report found that other attendant factors - such as a lack of available funding to finance the development of new applications, and faster migration to ad-funded services - would also impact revenue growth over the forecast period.

According to the report author Dr Windsor Holden: "Some entertainment services appear to be highly susceptible to the downturn. Furthermore, given that operators will perceive that consumers will be increasingly reluctant - or unable - to purchase content, they may in turn be less likely to roll out expensive, higher risk services: a dedicated mobile broadcast TV network is a prime example."

However, the Juniper report found that some sectors, such as adult services and gambling, were less exposed than others: it noted that for mobile gambling, there was a possible upside in the migration of wagers from physical to remote sites with consumers going out less and instead placing bets via the mobile or PC.
Details: www.juniperresearch.com

Digital proximity marketing
Although digital proximity marketing is still at an early stage of development, it is spreading across Europe, with the UK leading the way. Currently more than 35 providers throughout Europe have helped to showcase these new technologies in the marketing realm. Perhaps the most impressive capability of digital proximity marketing is the ability to reach consumers with information relevant to their interests at a specific time and place. This strategic positioning makes this new form of marketing exciting, and a hot topic today.

"The broad diffusion of short-range wireless technologies embedded in mobile phones has enabled interaction between mobile users and the surrounding environments. Systems of sensors can detect mobile phones in the short distance and send them information and data that could be useful to the final users," says Saverio Romeo, Frost & Sullivan Industry Analyst. "Digital proximity marketing uses this idea to enable digital marketing and advertising campaigns in places where the actual users are."

Digital proximity marketing transports information over the Internet from a content management system, which manages the marketing and advertising campaigns, to a content server, which stores the content. Then, through cellular networks, the information is delivered to access points, which connect to end users' mobile phones through short-range wireless technologies - mainly Bluetooth, but also Infrared, WiFi, GPS, cell towers and RFID. The system is based on an opt-in model, so users receive information only if they have chosen to.

Business leaders demand video on the move
A consumer behaviour study by Ericsson and CNN has revealed that the international business elite are increasingly accessing the Internet while on the move. The growing need for flexible viewing options to fit with changing lifestyle habits means that top executives are increasingly viewing TV content on laptops, desktop computers and mobile devices. The survey, carried out amongst CNN's online audience, also showed that more business leaders than ever are sharing user-generated video content.

  • 56 per cent of respondents with mobile internet access online content whilst on the move, for example via a mobile device or wireless LAN. This trend reflects the increasing number of upscale consumers with internet access outside of the home or office environment.
  • Three quarters (73 per cent) of CNN's online audience of global citizens share user-generated video content. In fact, 66 per cent of those over 45 share user-generated video content, debunking the myth that it is just an activity for the young. Almost a third (29 per cent) of those surveyed record video clips on their mobile phone. In a nod to the sharp rise of citizen journalism, and perhaps in response to the growing number of social platforms enabling video sharing, 16 per cent of respondents who share user-generated video content do so with other digital community members.
Details: www.ericsson.com


On the eve of the TM Forum's Management World, Keith Willetts notes that the imminent arrival of a true digital economy represents a massive opportunity for expanded communications services. The key question, however, is whether it also opens up a whole new set of revenue streams for the service providers

Not sure I've seen any fireworks lighting up the sky, but it's now a full 25 years since the first telecom deregulation. In that time just about every market in the world has gone down the path to competitive communications. So what, fundamentally, has happened in that time? Competition and regulatory pressures have transformed prices, but as the communications world discovered the laws of market elasticity, rising volumes and the phenomenal growth of mobile have meant that revenues have continued to rise. In reality, the business model for communication services hasn't changed much in that time - we've simply sharpened up the marketing of the old one.

But just as the financial markets found out, all good things come to an end! According to IDATE, the global communications market grew by only 4.2 per cent in 2008, to $1.37 trillion, and most of that growth came from still-expanding markets like India and China. In mature markets any volume growth was more than cancelled out by price declines on mobile and broadband, and poor old fixed line revenues fell by 5 per cent. Prices for everything are declining as we go not only into a recession but maybe a deflationary period as well - I can't imagine a scenario where communications prices will go up; indeed they are likely to follow a form of Moore's law.

In Europe, mobile penetration now exceeds 100 per cent, with no more market left to trawl. So how do you continue to grow your business? The stock answer from CEOs is an exciting story of new mobile broadband; mobile TV; IPTV; unlimited music; online books - you name it, they will claim it. But that question has been asked, and similar answers given, for a long time now, and there is little evidence to show that the service providers can realistically generate new, innovative revenues.

Remember when location based services would make us all rich - well the market took so long defining standards for exposing the location data that the handset guys have just got around it by putting GPS chips in their phones. Same for MMS - too hard, too slow and too user unfriendly to get a mass market going. The only truly new services, like iTunes, have come to market from ‘over the top' players, not the communications companies.

So the question has to be asked - can service providers realistically generate sufficient new revenue from the services they sell to their current customers to replace the falls in price on traditional services as markets saturate? And if the answer to that is maybe not, what are they going to do for an encore? Until recently you could point to diversified services like outsourcing of corporate communications networks as a ray of sunshine - that was until one major carrier started posting profits warnings and admitting over-stating profitability of that business.

Clearly, service providers are quickly coming to a fork in the road when it comes to their core business model - just who are their customers and their competitors; what services should they be selling and how are they going to monetize them?

Pioneering services like Amazon, Google, iTunes, and Hulu have shown that entire markets can be shifted to a digital economy model at much lower cost, but where everyone can still make money - apart from bricks and mortar stores, of course. We are seeing a similar thing in publishing: more and more publications are going online and eschewing expensive printing and shipping. Books and newspapers may well follow music and videos online through products like Amazon's Kindle.

In fact, the global recession will push almost every business on the planet to look at what cheaper and better online approaches it can exploit. Thanks to advances in communications - fibre, 4G wireless and femtocells (putting cell sites within the home) - the market for digitally enabled services may well explode across a myriad of consumer devices, from net-enabled TVs to online gas and electric meters, fridges and cars.

This mushrooming of devices and a true digital economy represents a huge array of opportunities for expanded communications services.  The key question is - does it also open up a whole new set of revenue streams for the service providers? Do they get commoditized into bit pipe players? Would that matter?

Almost as long ago as deregulation, Michael Porter (Key competencies, 1985) outlined the concept of companies maximizing their core competencies and minimizing any reliance on what they are not good at. So what are communications companies good, and not so good, at? How many wildly successful new services have been introduced in the past 10 years? Apart from DSL (Alexander Graham Bell with knobs on) you really have to scratch your head to come up with anything - most are basically variations on a theme: voice minutes in all-you-can-eat packages with texting thrown in, and different bundles with broadband.

As for mass market, innovative, successful services - Google, Facebook, iTunes, Kindle, Hulu, and so on - none of them has come out of a communications company. All of them could have been invented by a communications player - they certainly have the brains - but their business models get in the way: their DNA is just not geared to taking risks, moving quickly, or launching anything that might damage current lines of business.

But on the other hand, none of these new services could exist without the innovations of the communications industry. What the service providers are good at is being a great enabler of other people's services - after all, for 100 years phone companies have enabled us to talk to other people; they didn't do the talking!

Being a service enabler presents a new business model, or at least significantly extends an old one. Providing a range of enabling capabilities can unlock a different charging model, such as taking a percentage of the revenues of the services that are enabled. This gives much more scalable revenues than, say, flat bandwidth charging approaches. It opens up new revenue streams by opening up the software and process infrastructure of a comms company - transport, obviously (maybe at various qualities of service), plus capabilities like billing, settlements, authentication, cloud computing, user information and so on: in other words, a super-wholesale enabler. But to open your mind up to that, you have to accept that someone else is going to be the provider of service to the end user. And it's tough to pursue both a provider model and an enabler business model in the same company, because they are usually in conflict. You can just imagine the schizophrenia that can result.

At TM Forum's Management World Nice this May, we're hosting a session on exactly this subject. Werner Vogels, CTO of Amazon, will talk about how his company has successfully played both sides of the fence: providing services to its own end users but also providing a lot of capability to enable third parties to sell through Amazon.

This business model is starting to be better understood and taken more seriously by communications companies, but you'd have to say the jury is still out on which fork in the road providers are going to take. Will it be the model of trying to develop innovative new services for individual end users and businesses, or will it be more the role of a behind-the-scenes enabler?

I think the next two to three years will be crucial to answering this question. A "do nothing" approach probably means service providers getting backed further and further into the role of a commodity bit carrier. Being the 'Intel Inside' of numerous new and exciting services is a much better place to be than a bystander watching the action from the sidelines. Enabling other people's services is something that communications companies can do to leverage their real core competencies.

Let's put a traffic camera by that fork and watch which way the punters go.

For more information on TM Forum and Management World in Nice please visit
Keith Willetts is Chairman and CEO, TM Forum

Mobile network operators often ask 'how can we leverage the social networking phenomenon?' A better question, says Jouko Ahvenainen, is 'how can we mine the social network we already have - that is, the network of our subscribers?'

Mobile phones generate better and more useful information about consumers than any other technology, including the web. How people use their phone is a very personal thing. Who people call, text and save in their phones is more closely related to their 'real' network of contacts than the people they connect with on Facebook, Twitter or MySpace. In the book I helped write, Social Media Marketing, one passage reads:

"The mobile device is a key element of the digital footprint since it is oriented towards capturing information (which is a driver of the digital footprint). The real question for the telecoms and mobile industry is what can they do with all this information? More importantly, what could they do in future with all this information?"

By accepting this advantage and looking at how this data can be utilised, while also maintaining strict standards of trust and privacy with subscribers, mobile phone operators can understand the best asset they already have - the 'goldmine' of subscriber data.

This data can open exciting new revenue channels and give deep customer insight, so operators can offer better products and services. They can also create much more effective anti-churn campaigns. In addition to behavioural and demographic information, subscriber social network and influence information can be uncovered. Who a subscriber has influence over is a critical point when it comes to churn: if person A churns and brings persons B, C and D along, the problem has been quadrupled. By identifying that person A might churn, and that he or she will bring three other leavers along, mobile operators can make churn-busting campaigns far more economical, with far greater impact. Operators can also become a source of market research data in the future, when they can collect more data than traditional market research firms can.
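The influence idea can be sketched in a few lines of code. This is a hypothetical illustration, not Xtract's actual algorithm or data: it builds a contact graph from call records and uses the size of a subscriber's immediate network as a crude proxy for influence, so retention spend can be weighted towards high-influence subscribers.

```python
# Hypothetical sketch: score subscriber influence from a call graph,
# so a churn campaign can prioritise people whose leaving would
# likely pull others along.
from collections import defaultdict

calls = [  # (caller, callee) pairs, e.g. derived from CDRs
    ("A", "B"), ("A", "C"), ("A", "D"), ("B", "A"), ("E", "A"),
]

contacts = defaultdict(set)
for caller, callee in calls:
    contacts[caller].add(callee)
    contacts[callee].add(caller)  # treat each tie as mutual

def influence(subscriber: str) -> int:
    """Crude proxy: the size of a subscriber's immediate network."""
    return len(contacts[subscriber])

# If A churns, up to influence("A") others are at elevated risk,
# so A merits more retention spend than a low-degree subscriber.
at_risk = sorted(contacts, key=influence, reverse=True)
print(at_risk[0], influence(at_risk[0]))  # → A 4
```

Real systems would weight ties by call frequency, recency and direction rather than raw degree, but the ranking principle is the same.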

Of course, maintaining subscriber privacy is critical in all this, and it is certainly achievable. But it requires a shift in culture. There is a stark difference between trying to 'know' your customer and trying to 'own' your customer. Web companies continue to 'know' their customers better, while many phone operators stick to an outdated notion that they 'own' their customers. The longer the telecoms industry delays knowing the customer better, the longer the industry will lose out. Embracing the power of social networking information is key to this transformation.

Many operators hope that call data records (CDRs) give enough information to gain improved customer insight. They are a good starting point for making the operator's own marketing activities more effective, but not enough to be a platform for advertising and social services in mobile. We need to know more about how the subscriber interacts and uses service elements. There are three generally accepted things that social media requires in order to know a customer better:
1. You must incentivise a customer to share more personal information about him/herself. This assumes that all privacy / confidentiality standards are adhered to.
2. An open ecosystem / platform is needed. At present, Facebook and Google Android are the best examples of where third parties interact on open playing fields, and as such, do the work to grow the ecosystem. The best part about this is it can be exploited without direct monetary rewards.
3. We need touch points: places where the operator can interact directly with the customer and generate two-way communication. Broadcasting information at customers without interaction (a one-way street) is an outdated approach, out of step with today's Twitter age. Advertisements are the best option (the alternative, call centres, is too expensive), and social networking can make these advertisements personal, intuitive and relevant - rather than annoying to the subscriber. Open systems are the best place for advertisements to appear.

With these as the first steps, operators should start by thinking about marketing and services in a new way. Marketing can no longer be a one-way broadcast of messages to customers; it must be a much more interactive relationship that also supports word of mouth. And it is the same with services: operators cannot make or select all services for the subscribers; users must be able to create their own services and choose which ones they want to use. Web 2.0 truly is coming to mobile, and it offers a platform where people can do what they want to do, not a place to push selected ideas and models at them. And web 2.0 is not the only evolution: there is also CRM 2.0, where subscribers can utilise their own data and data analytics for their own benefit. For example, subscribers can see their own social network and manage their own connections. It becomes a way to motivate people to share their data when they get benefits from doing so.

Following this, a measurement system must be created and agreed upon. Operators cannot act until there is a way to manage social media programmes. One approach has been to track user behaviour at the network level. An example of this is the mechanism Phorm's service is based upon. Phorm has become a pressing issue in the UK because, even if you did get permission to undertake this level of tracking, people do not understand networks and don't trust what they don't understand. People are normally okay to share small amounts of information if they get something back, but they don't like one-way spying models.

By working at higher levels of the stack than networking level (usually, the application level) and by making 'knowing the customer better' the goal rather than 'owning the customer,' there are huge marketing gains to be had as well as a new level of trust to be engendered with subscribers. The only way this can be achieved is through aggregated data, by not working with specific individuals or specific transactions but rather with aggregates and data patterns derived from the data.
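A minimal sketch of that principle (the record layout and segment names are hypothetical): individual usage rows are rolled up into segment-level aggregates, subscriber identifiers are discarded, and only the aggregate patterns are kept for marketing analysis.

```python
# Hypothetical sketch: work with aggregates and data patterns,
# never with specific individuals or transactions.
from collections import defaultdict

usage = [  # (subscriber_id, segment, minutes) — illustrative data
    (1, "urban-young", 320), (2, "urban-young", 280),
    (3, "rural", 95), (4, "rural", 120), (5, "rural", 110),
]

totals, counts = defaultdict(int), defaultdict(int)
for _sub_id, segment, minutes in usage:
    totals[segment] += minutes   # the subscriber id is never retained
    counts[segment] += 1

# Only segment-level averages leave this step; no individual
# subscriber's behaviour is exposed to the marketing side.
segment_avg = {s: totals[s] / counts[s] for s in totals}
print(segment_avg)
```

Production systems would add minimum-segment-size thresholds so that a "segment" of one cannot re-identify an individual, but the shape of the approach is the same.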

Data aggregates for marketing purposes also present a solid business case for the converged operator. If an operator owns customer touchpoints via not only the mobile phone but also broadband, TV, landline and so on, the data presents a richer picture of the customer, and it becomes easier to engender trust when the customer only has to share information once, with one brand they trust.

As mentioned before, by understanding people's behaviours in the context of this network, operators can pull out and define a member's measure of influence, and use it for clever new mobile and online marketing techniques. London-based Flirtomatic, the web and mobile social network for flirting for the over-18s, is using this superior customer insight to engage in more targeted marketing and services for its influential members - those who have word-of-mouth impact on other members. Flirtomatic applies 'social intelligence' to create more compelling services for each customer segment, and targeted, relevant and personal marketing and promotions, via web and mobile. Flirtomatic is focused on generating viral take-up throughout its community via word-of-mouth marketing. This approach works two-fold, because influential members have both direct pull over purchasing decisions (by recommending products to their friends) and indirect pull (through friends' desire to imitate their purchases). Flirtomatic is using Xtract's Social Links product for this insight.

This works because studies show that social influence is more important than any other factor in consumers' purchasing decisions. One study of car-buying showed that 71 per cent of car buyers were influenced by what their friends said, whereas only 17 per cent were influenced by TV ads.

This insight can be applied by the operator to its own marketing, such as churn campaigns, where improvements in campaign effectiveness of as much as 20 per cent have been reached, or to generating new revenue through third-party advertising schemes. And this market is growing fast: eMarketer has predicted that spending on behavioural targeting will reach $3.8 billion by 2011.

Flirtomatic's CEO Mark Curtis recently said: "The early results from customer segmentation are very insightful and exciting. We can now see considerable potential, as the business scales, to directly improve our revenues through a sophisticated view of our customers, their behaviour and the pattern of their relationships. The tool hands us an effective, powerful CRM solution."

Jouko Ahvenainen is co-founder and VP at Xtract and author of Social Media Marketing, and can be contacted via Jouko.Ahvenainen@xtract.com

The growth in the African telecommunications market over the past five years has been nothing less than phenomenal. Although growth rates are expected to slow, Julia Lamberth and Serge Thiemelé explain that Africa should continue to be the fastest growing market in the world for the next five years

The growth in the African telecoms market has turned the telephone from a luxury item into a basic necessity in many countries. However, the expansion has not been universal across the continent. Some countries, such as South Africa and Libya, have already passed the 100 per cent mobile penetration rate, while others, such as Ethiopia and Eritrea, still have penetration rates under 10 per cent.

According to Ernst & Young's recently released Africa Connected survey, the growth up until now has been driven almost entirely by GSM voice. While voice should continue to be the largest component of the market for the foreseeable future, it is expected that data is going to be an ever-increasing component of operator revenues in the future.

Internet penetration on the continent is still substantially lower than any other part of the world, with only nine countries on the continent having penetration rates above one per cent. It is expected that the construction of submarine cable systems, the first of which should be operational by the middle of the year, is likely to be the catalyst for accelerated growth in African Internet penetration. Alongside the construction of the submarine cable systems, which should, to a large extent, address the problems posed by inadequate international connectivity, there has been significant investment in terrestrial fixed line infrastructure.

This investment has been made by both private operators, especially in countries such as Nigeria and South Africa, as well as by governments, in countries such as Angola, Malawi, Botswana and the Democratic Republic of the Congo. While the impact of this investment will not be felt immediately by consumers in many countries, it will provide the basis for cheaper and more reliable telecommunications in the next few years, particularly in rural areas. It is likely that the vast majority of the next wave of African Internet users will not connect to the Web through services that rely on fixed networks, but instead use the infrastructure provided by mobile and fixed wireless service providers.

One of the reasons for the slow pace of telecommunications growth on the continent in the past has been the historical lack of basic infrastructure. Poor infrastructure was one of the areas identified by operators as a key challenge to the development of telecommunications. This weakness has manifested itself in a number of ways, including limited access to core telecommunications infrastructure, as well as the lack of a reliable electricity network needed to keep networks up and running. This weakness in basic services hampers operators' ability to deploy their own infrastructure rapidly, and making contingencies for it carries a significant cost. Safaricom in Kenya, for example, reportedly spends more than a million euros a month on diesel to power the generators it needs to keep its network running.

This situation has resulted in operators exploring alternative sources of energy such as wind or solar power to supplement other power generation options.

Operators identified attracting and retaining talent as the largest operational issue. This applies to both technical and management skills, with operators struggling to fill vacancies across the spectrum. While they acknowledged the importance of training, the issue of staff being poached by rivals was identified as an ongoing challenge.

The increased vigilance of regulators on the continent has heightened the need to ensure network reliability, as regulators are taking a more active consumer protection role. Examples of this include operators being barred from marketing their services in Nigeria until the quality of service reached an acceptable level.

In addition, operators interviewed voiced concern over the perceived political interference in the regulatory process. It is this lack of consistency that creates difficulties for operators, as they are unsure of how changes in the local regulatory framework will impact their businesses, especially if these changes are being driven by a political, rather than a pure regulatory agenda.

Operators highlighted the high rates of taxation they are subjected to, with the average across the continent coming in at over 30 per cent. Governments across the continent have chosen to place a heavy tax burden on mobile operators by taxing profits at a higher rate, instituting mobile-specific taxes or raising license fees. Operators also raised the issue of excise taxes on imported handsets, which make them less affordable to consumers and hamper the ability of companies to reach potential customers in lower income brackets.

Scale is considered one of the key elements of future success in the African market, and competition for new licenses and existing operations is keen. It is likely that we are going to see significant consolidation in the next few years, as smaller operators feel the effects of increased competition.

The global economic crisis is not likely to leave the African market unscathed, as many operators may find it more difficult to raise the funding needed to continue the level of investment needed to remain competitive. Especially in key markets such as Nigeria where the multinational operators are investing heavily in network expansion, the smaller operators may find it difficult to keep pace with either the network coverage or the technological innovation of the large regional and multi-national operators.

The issue of infrastructure sharing and outsourcing of parts of the business is one way for operators to cut the costs of doing business. However, operators surveyed were resistant to this, preferring rather to have control over the infrastructure and services that they consider their competitive advantage. More recently, however, some operators have said that they are looking to cooperate with competitors wherever possible to bring down the cost of deploying new infrastructure.

Operators, specifically in more developed markets, are also starting to look at broadening their set of services to include targeting the wider ICT market. This has seen operators acquiring companies in the information and communications technologies sector. This broader focus is setting the stage for a divide in the market between operators that choose to create a converged services offering and those that focus on offering voice and basic data services at a lower cost.

The rollout of these converged services, which include fixed and mobile services as well as offerings that have traditionally been the reserve of the Internet service providers, such as hosting and business continuity, will further drive the development of the African ICT market. While the initial focus of these services will be in the developed markets, it is expected that these services will rapidly be driven out to corporate customers in all the territories in which these companies operate in Africa.

This investment by operators, as well as the infrastructure that is being deployed will set the stage for the rapid adoption of more data-focused services for both governments and corporates across the continent. While these types of organisations are likely to access these new networks via new fixed-line networks, consumers should benefit from the deployment of high-speed wireless services with the attendant increase in bandwidth and broadband.

It is anticipated that networks based on 3G will dominate the market for broadband wireless access with CDMA EVDO and WiMax offering some competition as well as providing access where the GSM-based service is not suitable.

We expect the next five years to see a continuation of the growth in African telecommunications, with increased Internet and broadband penetration across the continent. At the same time the market is likely to undergo a period of considerable consolidation, with existing African operators continuing to expand their reach across the continent. It is our view that operators who do not already have an African presence could have a difficult time challenging both the strong regional and global operators (MTN, Vodafone, France Telecom and Zain, for example) and a plethora of new licensees (more than 40 per cent of the market is still in one per cent market share slices). Operators launching as the fourth or fifth license holder in a country may face challenges in generating profits, especially where one operator already has a dominant position. Countries such as Angola and Ethiopia, where none of the large regional players have established a presence, are viewed as real opportunities for future expansion.
Julia Lamberth and Serge Thiemelé are co-leaders of the Ernst & Young Global Telecommunications Center - Africa.

The Africa connected survey was compiled from interviews conducted with operators from across Africa. For further information please visit www.ey.com/telecommunications or contact globaltelecommunicationscenter@uk.ey.com


