Features

Ray Adensamer argues that Voice Quality Enhancement can help deliver the standard required of VoIP conferencing systems

Audio conferencing services based on circuit switched networks and audio bridging equipment have provided hosted conferencing users with a benchmark for pricing and quality in voice communications. While next generation networks based on VoIP technology introduce economic benefits with new feature capabilities for conferencing service providers (CSPs), they also present new technical challenges in maintaining acceptable voice quality. Delivering good voice quality is an important requirement in any VoIP conferencing system, as poor voice quality will increase the costs associated with customer churn, while impacting the bottom line by reducing revenue growth prospects.

Voice Quality Enhancement (VQE) encompasses an integrated set of features designed to overcome common audio quality problems in VoIP conferencing services, including noise, packet loss and echo. A comprehensive VQE solution also measures VoIP quality metrics, which are used in ongoing voice quality measurement associated with service level agreements.

Many features inherent in a VQE solution require sophisticated digital signal processing algorithms. The rapid, scalable execution of these algorithms dictates a product specifically designed for real-time IP packet processing. Fortunately, a network element with carrier-class real-time IP packet processing power already exists in a next-generation VoIP audio conferencing architecture: the IP media server.

The three most common sources of audio quality problems in a VoIP or IMS network are noise, packet loss and echo. This section discusses each of these challenges and describes the conceptual solutions used to overcome them.

Audio noise
Gone are the days when people were confined to quiet office and residential environments. Today, with mobile phones and the Internet, people are calling from their cars, airports and from just about anywhere, and these environments are flooding the mouthpiece with all kinds of unwanted sounds that ultimately get onto the call. Making matters worse, callers using laptops and mobile phones are typically saddled with marginal equipment such as low cost earphones and microphones.

This section describes a combination of mechanisms that reduce and help manage the disturbing effect of audio noise: noise gating, noisy line detection and noise reduction.

Noise gating
Noise gating is a simple yet effective mechanism to reduce background noise.
When no speech is detected on a line, its signal is attenuated (ie its level is reduced), which prevents unnecessary noise from being inserted into a VoIP recording or conference mix. The attenuation is configurable, so the conferencing application can avoid making the signal sound unnaturally quiet when the gate is applied.
Key benefits of noise gating:

  • Reduces background noise using a simple yet effective mechanism
  • Supports configurable attenuation
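
To make the mechanism concrete, the sketch below shows how a noise gate might operate on short frames of 16-bit PCM audio. It is an illustration only, not vendor code: the threshold and attenuation values are invented, and a production gate would use a proper voice activity detector rather than the bare energy threshold used here.

    import math

    def noise_gate(frames, threshold_db=-45.0, attenuation_db=-12.0):
        """Attenuate frames whose energy suggests no active speech.

        frames: iterable of lists of 16-bit PCM samples (one short frame each).
        threshold_db, attenuation_db: illustrative, configurable values.
        """
        gain = 10.0 ** (attenuation_db / 20.0)
        for frame in frames:
            # RMS level of the frame relative to full scale (dBFS)
            rms = math.sqrt(sum(s * s for s in frame) / len(frame)) or 1e-9
            level_db = 20.0 * math.log10(rms / 32768.0)
            if level_db < threshold_db:
                # No speech detected: attenuate rather than hard-mute, so the
                # line does not sound unnaturally quiet in the conference mix
                frame = [int(s * gain) for s in frame]
            yield frame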

Noisy line detection
There are times on a conference call when some lines are very noisy and disrupt the productivity of the entire call. Noisy line detection measures the noise on audio ports and sends a noisy line notification message to the VoIP application server if a predefined threshold is exceeded, as shown in Figure 2. A second message is sent if the noise subsequently falls below the threshold.

Key benefits of noisy line detection:

  • Notifies the application server of noisy line conditions, initiating possible corrective action
  • Enables quick remedial action by the application server or the operator (eg mute line)
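
The detection logic amounts to threshold crossing with a pair of notifications. Below is a toy sketch under assumed interfaces: per-interval noise measurements in dB for one audio port, and a notify callable standing in for the message sent to the application server.

    def monitor_noise(levels_db, threshold_db, notify):
        """Send one notification when measured noise exceeds the threshold,
        and a second when it subsequently falls back below it."""
        noisy = False
        for level in levels_db:
            if not noisy and level > threshold_db:
                noisy = True
                notify("noisy-line")    # server may then mute the line, etc.
            elif noisy and level <= threshold_db:
                noisy = False
                notify("line-quiet")    # second message: noise has subsided

    # Example: monitor_noise([-60, -30, -28, -55], threshold_db=-40, notify=print)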

Noise reduction
While the noise gating function described earlier provides a relatively simple way of eliminating noise when no speech is detected, noise reduction goes a step further, using digital signal processing techniques to remove the noise while leaving the important speech signal intact. This benefits many VoIP applications, such as removing noise from VoIP audio recordings or from noisy caller lines in a conference mix.
Key benefits of noise reduction:

  • Filters out noise without impacting the speaker's signal
  • Reduces noise continuously, whether speech is detected or not
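
The article does not name an algorithm, but spectral subtraction is one classic technique of this kind: estimate the noise spectrum during speech-free periods, then subtract it from every frame while preserving phase, so the speech remains intact whether or not anyone is talking. A minimal numpy sketch, with an invented spectral floor to limit 'musical noise' artefacts:

    import numpy as np

    def spectral_subtraction(frame, noise_mag, floor=0.05):
        """Subtract an estimated noise magnitude spectrum from one windowed
        frame, keeping the original phase. noise_mag must have the same
        shape as np.fft.rfft(frame), e.g. averaged over silent frames."""
        spectrum = np.fft.rfft(frame)
        mag, phase = np.abs(spectrum), np.angle(spectrum)
        cleaned = np.maximum(mag - noise_mag, floor * mag)   # spectral floor
        return np.fft.irfft(cleaned * np.exp(1j * phase), n=len(frame))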

Dropped packets
The Internet is an amazing network of interconnected computers, but it's not perfect. The network employs the IP protocol, which does not guarantee packet delivery. Hence, when IP networks get busy or congested, packets can get lost or delayed. While lost packets are not critical for many data applications, packet loss in real-time VoIP services can cause significant audio quality problems. Without special technology to compensate for dropped packets, the result is an abnormal audio signal that might sound ‘choppy.'

Packet loss concealment
Packet loss concealment is a technique for replacing audio from lost or unacceptably delayed packets with a prediction based on previously received audio.
Whereas any voice repair technology would have difficulty recovering from extreme packet loss in abnormal conditions, packet loss concealment is designed to perform intelligent restoration of lost or delayed packets for a large majority of congested network scenarios.
Key benefits of packet loss concealment:

  • Softens any breaks in the voice signal
  • Reduces the occurrences of choppy audio
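
A very simple concealment strategy, sketched below purely as an illustration, replays the last good frame with a decaying gain so that longer gaps fade towards silence instead of sounding like a stuck loop; the codec-integrated concealers used in practice perform smarter, pitch-aware prediction of the missing audio.

    def conceal(last_good_frame, n_lost, decay=0.5):
        """Generate replacement frames for n_lost missing packets by
        repeating the last good frame with a progressively falling gain."""
        frames, gain = [], decay
        for _ in range(n_lost):
            frames.append([int(s * gain) for s in last_good_frame])
            gain *= decay   # each successive replacement frame is quieter
        return frames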

Acoustic echo
An acoustic echo is created when sound emanating from the receiver's speaker (eg handset or speakerphone) is transmitted back by the receiver's microphone. This is depicted in Figure 3, where the Sender (on the left) transmits a signal to the Receiver, and an acoustic echo is created when some speech energy ‘bounces back.' In a VoIP conferencing application, all participants will hear an echo except for the guilty party with the device causing the echo. Since nobody can quickly answer the basic question, "Who's causing the echo?" troubleshooting echo issues in a VoIP conference call can be difficult and frustrating.

Acoustic echo cancellation
Acoustic echo cancellation (AEC) technology is designed to detect and remove the sender's transmit (Tx) audio signal that bounces back through the receive (Rx) path. By removing the echo from the signal, overall speech intelligibility and voice quality are improved.

AEC in a VoIP network is particularly challenging. In a traditional voice network, once a voice circuit is established through the PSTN, the round-trip echo delay is constant. In a VoIP network, however, packet delay varies, so the echo delay also varies over the duration of the call. This makes the echo cancellation algorithms in a VoIP quality improvement product more complex and processor-intensive than an equivalent echo cancellation solution in a circuit-switched network.
Key benefits of acoustic echo cancellation:

  • Removes a sender's audio echo from the receive path
  • Addresses variable packet delay inherent in IP networks
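
Adaptive filtering is the standard foundation for echo cancellation, and the normalised LMS (NLMS) algorithm below is the textbook example. This is a toy sketch, not a product implementation: a VoIP-grade canceller would add double-talk detection and track the variable echo delay discussed above.

    import numpy as np

    def nlms_echo_cancel(far_end, mic, taps=128, mu=0.5, eps=1e-6):
        """Adaptively model the echo path from the far-end (Tx) signal and
        subtract the predicted echo from the microphone (Rx) signal.
        far_end and mic are equal-length 1-D numpy arrays."""
        w = np.zeros(taps)                     # echo-path estimate
        out = np.zeros(len(mic))
        for n in range(taps, len(mic)):
            x = far_end[n - taps:n][::-1]      # most recent far-end samples
            e = mic[n] - w @ x                 # error = mic minus predicted echo
            out[n] = e
            w += (mu / (eps + x @ x)) * e * x  # normalised LMS weight update
        return out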

Voice quality metrics
Technology to remove audio quality impairments in a VoIP network is an important part of any solution. But along with the functions to improve VoIP quality, service providers also need a standard, objective way to measure voice quality in order to accurately monitor performance levels and uphold service level agreements (SLAs) with customers.
Voice quality metrics can be divided into three groups: packet, audio and acoustic echo cancellation (AEC). All statistics are captured for each call leg of a conference call to help with granular troubleshooting of audio quality problems and performance measurement. Packet statistics measure performance with respect to packet throughput, loss and delay, while audio statistics measure speech and noise power levels. AEC statistics measure echo delay and echo cancellation performance.
Key benefits of voice quality metrics:

  • Provides objective measurements for administering service level agreements (SLAs)
  • Facilitates the troubleshooting of audio quality issues in the network
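
As a sketch of how such per-leg records might be organised, the structure below groups the three metric families the text describes; the field names are assumptions for illustration, not any vendor's actual schema.

    from dataclasses import dataclass

    @dataclass
    class CallLegStats:
        leg_id: str
        # packet statistics: throughput, loss and delay
        packets_received: int = 0
        packets_lost: int = 0
        mean_delay_ms: float = 0.0
        jitter_ms: float = 0.0
        # audio statistics: speech and noise power levels
        speech_level_db: float = 0.0
        noise_level_db: float = 0.0
        # AEC statistics: echo delay and cancellation performance
        echo_delay_ms: float = 0.0
        echo_cancelled_db: float = 0.0

        def loss_percent(self) -> float:
            total = self.packets_received + self.packets_lost
            return 100.0 * self.packets_lost / total if total else 0.0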

Voice quality enhancement
Voice quality enhancement (VQE) encompasses an integrated set of features designed to improve VoIP quality and generate statistics needed for ongoing performance monitoring. This requires sophisticated digital signal processing algorithms that perform rapid real-time IP packet processing, a key component in next-generation VoIP audio conferencing architecture. As such, VQE can be deployed in an existing IP media server, which provides the requisite carrier-class real-time IP packet processing power.

IP media servers, also known as the Multimedia Resource Function (MRF) in an IMS architecture, are specifically designed to deliver real-time IP media processing as a common, shared resource for a broad range of VoIP and IMS applications in a next-generation network.

They also deliver real-time processing of codec algorithms, transcoding between codecs and sophisticated audio mixing for conferencing applications. Since media server and VQE tasks are interrelated and require the rapid execution of IP packet processing algorithms, it makes sense to integrate the functions of both into a single network element.

Ray Adensamer is Senior Product Marketing Manager, RadiSys
ray.adensamer@radisys.com
www.radisys.com

As budgets are reprioritised, initiatives that promise the most value should be given first priority, with service layer transformation somewhere near the top of the heap, says Rick Mallon

Many service providers are in the midst of long-term IT transformation programs. Most often these aim to launch new services, improve speed to market, achieve customer-centricity, and simplify operations. In the current economic climate, however, resources for many CSPs are far more constrained than was anticipated when these programs were conceived. Consequently, many CSPs face tough decisions regarding their IT budgets. Those that reduce budgets sharply risk stranding previous investments made in ongoing programs. A newly established set of IT systems intended to enable a major migration could become yet another expensive operations silo, thus undermining the program's original goals.

Service layer vs BSS transformation
Transformation initiatives often start with BSS transformation. In post-merger environments where consolidation and customer-centricity are the business drivers, there's a definite logic to this. For one thing, service providers want the billing process to continue without a hitch. Trouble is, regardless of the technologies you use, the conversion of disparate billing and CRM environments to a centralised architecture will necessarily involve some risks and variable costs.

The biggest risk may be in putting the wrong foot forward. Some opt at the beginning for a traditional big-bang BSS transformation, because that path is a known route. Others may prefer to begin with a service layer transformation initiative, because it keeps future options open. Ultimately, the best approach, as well as the least costly and most flexible, is to start with service layer transformation. Service layer transformation won't automatically save providers from recreating some of the mistakes of the past, but it will enable them to more easily add on new services in the future and create a common view of each subscriber.

BSS conversions are likely to affect a large number of business stakeholders, touch thousands of users, and impact millions of customers. As care channels change, or redesigned bills are introduced, there are always significant costs relating to possible customer disruptions during the transition period. Internally, new users need to be trained and established processes need to be re-engineered. The likelihood of battling organisational resistance is magnified.

Even if a CSP manages its way through the pitfalls, BSS transformations often struggle to deliver their intended value when they intersect still-fragmented service layers. Put simply, if the service layer continues to consist of distinct product-facing silos, it will be difficult to accelerate time to market, launch cross-domain services, and drive personalisation and customer intelligence.

Merging or federating these silos through service layer transformation has the potential to deliver significant benefits for less cost, effort and scope than BSS transformation. Fewer customer touch points are involved; fewer stakeholders are directly affected. Transformation of the service layer removes the silos, simplifies operations, and benefits marketing, service innovation, and opex and capex budgeting.

Because networks continue to experience significant advancement, vendor churn has become a consistent challenge for network organisations. Massive multi-vendor environments are also a common end result of large scale mergers and acquisitions. Service layer transformation is necessary to manage all of that kit under one layer and fulfil services in a technology-agnostic way. Even if a BSS transformation succeeds, time to market and operational-simplification goals can be derailed by overly complex networks broken out into distinct vendor domains.

While improving care, service personalisation, and customer intelligence is often at the core of BSS transformation's business case, it's difficult to achieve without a centralised view of a customer from a product and service perspective. Service layer transformation enables a single, correlated view of each customer and all their subscribed services. This view can, in turn, fuel more intelligent targeted marketing and personalised up-sales of value-added applications or additional services. Customers are tired of receiving promotional offers for services they already use at a higher price. If the service layer isn't transformed, this kind of churn-driving marketing continues while undermining revenue goals for new, add-on and niche-oriented offerings. CSPs have an opportunity to leverage significant standards work. Specs like SCTE 130 out of North America enable personalised services and advertising, while various efforts that leverage the TM Forum's business process framework and OSS/J teams provide models, process flows, and interface specifications for reducing the risk, time, and effort related to service layer transformations.

ZON Multimedia
ZON Multimedia, based in Oeiras, Portugal, is a good example of a CSP that has drawn significant benefits from an effective service layer initiative. ZON announced that by the end of 2008 more than 53 per cent of its customers subscribed to multiple services. The company added 144,000 net new subscribers in 2008 and credits much of this success to a bundle of video, broadband and IP-based voice services it launched in May last year. ZON is also launching mobile services, using an MVNO model, which will soon give it a true quad-play offering.

One of the unique aspects of ZON's offerings is that they are delivered in a network technology-agnostic fashion. Due to its OSS service layer transformation, ZON can determine the best and most profitable way to re-use and deliver services to any customer location. For example, ZON offers TV through on-network cable and also a satellite resale model. It offers PacketCable-based VoIP services over its cable network and SIP VoIP over resold ADSL lines. ZON has the ability to plug any new service offerings into its common service layer platform and deliver them through any available and appropriate networks and CPE devices. In addition to its MVNO offering, ZON offers HSD services for cable and WiFi, and DSL (resold model) from its common, transformed service layer.

For business or commercial market offerings, service layer automation can be more challenging than in the consumer sector. Enterprises vary significantly in their particular requirements, making end-to-end automation challenging. However, mass-market offerings for small and medium businesses are well within reach for the same service layer transformation that empowers consumer offerings. Services that combine, for example, a metro-Ethernet pipe with SIP-based VoIP, video for board rooms or waiting rooms, and advantageous mobile packages are ideal for this marketplace. In particular, preparing for the wave of SIP devices and applications to come is an important part of any CSP's growth strategy.

The multi-screen opportunity
One concept that is central to SIP, and increasingly important in video and broadband offerings, is service portability. Put simply, consumers want access to their video content wherever and whenever they choose. Many CSPs have yet to embrace this concept, leaving the opportunity open to Internet-based players. Though IPTV has the potential to provide flexible access to more content, it's only a technology and is often applied in the same framework as traditional cable TV. Customers, however, have a concept of content ownership. If they've purchased movies or TV shows for their iPod, mobile phone, or PC, they'd like to be able to access them from any other device, including their TVs at home.

Service layer transformation allows a CSP to seize this trend as a growth opportunity. It can enable service providers to tie services to individual subscribers, rather than to locations or devices. An intelligent service layer will understand which devices a user has access to and deliver content and services appropriately and seamlessly. With a service layer empowering that kind of portability, new options for promotions and offerings open up, including personalised bundles and targeted advertising.

In today's disparate service layer environments, CSPs are scrambling to some extent because their TV offerings are being bypassed thanks to Hulu, iTunes, YouTube and other outlets. Delivering content to the channel the consumer wants at any given time requires service layer transformation. It enables customer-centricity across all services and breaks down the barriers between technology domains. With these capabilities in place, marketers gain a world of options that today they can only wish for. Too often they instead face IT and network organisations that have to say: "We can't do that yet."

From a return on investment perspective, service layer transformation is a winner.  It can be a catalyst for moving service offerings and customer intelligence forward into entirely new areas of opportunity. It is simpler, less costly, and less risky than massive BSS conversions, yet sets the stage to make those inevitable BSS-layer transformations simpler and more beneficial. In a resource constrained world, where budgets are tightening and business cases are more scrutinised, service layer transformation is a priority that empowers both revenue growth and operational improvement.

Rick Mallon, Sigma Systems
www.sigma-systems.com

People are communicating more things to more people than ever before, and not just by phone anymore. Internet-enabled communication models are gaining audience, attention and market share at the expense of traditional telecommunication providers. Can telcos fight back and find new growth opportunities in this rapidly changing ecosystem? The challenge is not just in understanding the technology, but also the unfolding fundamental shifts in human communication behaviour, say Chris Pearson and Rob van den Dam

A growing number of people are visiting social networks. According to Comscore, approximately two-thirds of the worldwide Internet audience regularly visit social networks. This trend is universal. In South Korea, considered by many to be the world's most developed social network nation, more than 90 per cent of teens and almost half of the entire population are members of Cyworld. In the United States, 80 per cent of young adults, 60 per cent of teens and 30 per cent of adults use social networks. And in the UK, 90 per cent of teens spend time on these sites.

Social networks have become a primary destination for a rapidly expanding universe of online users managing and enriching a digital lifestyle. They give users the ability to communicate, to develop their identities, to build a network of relationships, to find information, to share experiences and self-generated content, to buy products, and more.

With numerous communication tools at their disposal, social networks are becoming integrated communication hubs. The integration of MySpace and Skype, for example, illustrates how social networks and communication applications can converge to benefit users. With more than 118 million active MySpace users and over 370 million registered Skype users around the world, this partnership connects two of the most popular communication platforms on the Internet to create the world's largest online voice network.

A number of telcos are already responding to the challenges and opportunities of social networking. Many have initiatives underway that range from simply enabling online social networking sites, to extending their offerings to the mobile communication environment, to even building their own, proprietary social networks. For example, telecom operators such as Sprint, AT&T and Vodafone enable MySpace and Facebook members to access their profiles from their cell phones. And Vodafone recently launched Connect to Friends, a Facebook application that enables Vodafone and non-Vodafone subscribers to communicate with each other from either a PC or a mobile phone.

The widespread social networking phenomenon is a reflection of shifts in two long-term communication trends:
1. communication patterns are changing from point-to-point and two-way conversations, to many-to-many, collaborative communications, augmented with links, videos, photos and multimedia content that substantially enrich the user experience.
2. control of communications is shifting away from the proprietary domain of telecom providers to open Internet platform service providers.

The so-called Net Generation - the digital natives who have grown up in a technology-enabled and Internet-connected environment - is at the forefront of shifting social communication patterns. Their preference is for staying connected, sharing, creating content, multitasking, assembling random information into patterns, and using technology in new ways. They are the wisdom-of-crowds generation that grew up rating peers, physical attributes, products and services, etc.

They are native speakers of technology, fluent in the digital languages of computers, the Internet, video games and the mobile phone, and often living in a state of continuous partial attention. For many of them, social networking is supplanting email, and even voice, as the preferred method of communication. But the shift away from traditional communications to social networking is not limited to this generation. A growing number of adults now use social networking to get what they want from each other, instead of from traditional media and institutions.

The second trend, the shifting control of communication media from the domain of telcos toward a more open communication platform, is the result of widespread availability and affordability of connectivity and communication tools/devices. With better, cheaper technologies and greater use of broadband, the Internet, and wireless networks, open platforms such as social networking sites, are becoming ever-more viable platforms for communication services - and consumers are responding eagerly.

The combination of shifts in communication control and patterns is redefining the competitive landscape, giving rise to new business models. In contrast with traditional communication models, emerging models are based on open platforms that support many-to-many and/or collaborative communication patterns.

Traditional communication
The traditional model, characterised by two-way point-to-point communication, is the domain of traditional telco providers. It is the largest segment in terms of revenue and subscribers, but it is showing signs of slow growth as other models take hold. Wireline revenue is declining and although, according to Gartner, global mobile services revenue is forecast to grow 7.6 per cent from 2007-2012, the mobile subscriber base has reached saturation in key developed markets.

Open and Free
This model offers alternatives to traditional point-to-point communication services on open Internet platforms. Companies in this domain provide basic communication services such as VoIP for free or at very low cost. Many of these services threaten profitable traditional services such as long distance calling and mobile roaming.

Providers in this space include VoIP provider Skype, Google with GoogleTalk and Microsoft with Windows Live Messenger, which offer PC-to-PC voice services along with instant messaging and chat. With over 370 million registered users worldwide, Skype has, in a matter of five years, come close to creating a truly global telecom service.

Many of the players in the "open and free" space, such as Microsoft and Google, have considerable resources and leave little room for a commercially viable response from telcos beyond repackaging existing services into "convenience bundles." Some telcos, however, seemingly have embraced the model and are partnering with disruptive new entrants. As an example, the mobile operator 3 in Australia and the United Kingdom has partnered with Skype to launch the 3 Skypephone, to attract and retain customers.

Gated Communities
This model focuses on group communication and collaboration in the telco's "walled-garden" and will appeal to users and enterprises with a preference for the more secure and reliable communications environment traditionally provided by telecoms.

The most obvious opportunity here is the extension of social networking to the mobile environment, where operators continue to retain some exclusivity. Research companies such as Informa and Juniper estimate that by 2012, mobile social networking will represent a market opportunity of between US$22.5 billion and US$52 billion, and telcos should be able to seize a share of that.

Recent studies have revealed that more than 40 per cent of iPhone users in the United States, Germany, France and the United Kingdom visit social networking sites. In South Korea, a mobile user visits Cyworld on average eleven times per day. The mobile service of the Japanese social network Mixi, which started as an online site, has turned out to be hugely successful, with mobile page views already outnumbering online page views.

Telcos also have an opportunity to play a role in the delivery of fully integrated collaborative services to enterprises and organisations that value carrier-grade capabilities in a secure and reliable environment. According to Forrester, enterprise spending on Web 2.0 collaboration technologies is forecast to grow to US$4.6 billion globally by 2013, with social networking as the top spending category.

Shared Social Spaces
Shared Social Spaces facilitate collaboration on the open Internet. The main providers in this space are Over The Top (OTT) applications such as MySpace, Bebo, YouTube and Facebook, and virtual worlds such as Second Life also belong to this domain. Parties such as Microsoft, Google and Nokia, with its OVI/Share platform, have entered the fray as well.

As many of these players integrate telephony services, they have the potential to become fully integrated, end-to-end communication platforms. Though the revenue model remains unproven, they are drawing attention away from traditional communication service providers and are contributing to their slowing growth.

In addition, these types of applications put additional strain on already-burdened network infrastructure, particularly with the rapid increase in video content sharing and distribution. Cisco forecasts that by 2012, the sum of all forms of online video, including TV, VoD, Internet and peer-to-peer (P2P), will account for nearly 90 per cent of all consumer Internet traffic, a large portion of which will flow through OTT applications. According to Ofcom, these types of OTT services will impose an additional £830m (US$1.4 billion) in bandwidth costs on UK Internet service providers, without a corresponding revenue model.

To deal with over-burdening OTT traffic, telecom operators have options that include filtering or blocking OTT traffic, but this is unsustainable in many jurisdictions as it violates net-neutrality principles. Instead of protesting about the OTT bandwidth demand, telcos should embrace it. Network and computing infrastructure optimisation techniques such as traffic shaping and content delivery network (CDN) technology can reduce the cost of delivering high bandwidth content and potentially lead to new business models that can even capture value from this increasing OTT traffic.

The social networking phenomenon arose from significant shifts in communication, driven by the widespread growth of Internet connectivity and the emergence of interactive online communication tools. These shifts have been redefining a century-old industry, with the result that the advantages enjoyed by traditional communication service providers are beginning to wane. Telcos can, however, remain relevant in the face of changing user sentiments and demands if they take bold steps to adapt to this evolving marketplace.

Telcos can begin by taking advantage of the window of opportunity in mobile social networking, and also bolster their capabilities to serve the evolving, broader communication needs of enterprises. They should partner with, or acquire, existing players to proactively develop the capabilities required for success. We already see some examples of such moves. In October 2008, Telefonica signed a global agreement with Facebook allowing it to integrate access to Facebook's mobile service and applications from all of Telefonica's mobile portals. And in May of that year Vodafone acquired Zyb, a Danish mobile social network, with an eye to extending its social networking capabilities.

Telcos should also focus on enabling other participants in the value chain to benefit from distinctive telecom capabilities such as location, presence, text/multimedia messaging services and conference calling, in this way also generating revenue for themselves. Vodafone's Connect to Friends application in Facebook is an example of such an approach.
In addition, telcos should work more closely with OTT and/or CDN (such as Akamai and Limelight) providers to reduce the cost of delivering high bandwidth content (e.g. video, music) in response to increasing demands for such services. This can be achieved through network and computing infrastructure optimization techniques such as traffic shaping and caching of content close to the edge of the network using CDNs. Such approaches can potentially lead to new business models that capture more value from this increasing OTT traffic.

Over the long term, telcos should broaden the scope of their traditional voice business to more actively encompass both point-to-point and many-to-many collaborative models, which include voice, internet-based communications and content, and align the organisation and industry partnerships accordingly.

The fundamental change in the way we are now communicating is driving the evolution of a new communication services ecosystem that will force significant and bold changes by existing providers if they wish to remain an integral part of the communications landscape. The journey will not be without risks, but the option of doing nothing is a luxury many providers cannot afford. Revenue from traditional services will continue to decline and highly resourceful Internet information providers and IT companies continue to grow in the communications space to claim a larger share of communication time.

Chris Pearson is Global Telecom Industry Leader, IBM Global Business Services.
Rob van den Dam is EMEA Telecommunications Leader of the IBM Institute for Business Value and can be contacted via: rob_vandendam@nl.ibm.com

 

Most of Western Europe is now thoroughly gripped by recession, with even the suggestion of a depression skulking on the horizon. These are uncertain times - but one thing that is certain during these dark days, says Michael Callahan, is that companies will need to look at ways to reduce overheads, be smart with their diminishing budgets and seek solutions that provide value for money

Recent months have seen a number of high profile organisations fighting for survival, from redundancies within the financial sector, to downtime on production lines in manufacturing, to major retailers slashing costs. All organisations, across all sectors - from small businesses to international conglomerates - are being affected by today's economic climate. Their continued existence will depend on cutting costs and tightening their belts effectively.

When a company needs to limit its spending, the first area to be examined, and habitually slashed, is its IT budget, often with the security element considered non-essential. While many businesses recognise that security has the power to determine whether they live or die commercially, many remain frustrated by the strain it places on finances and human resources. The reality is that growing regulatory requirements demand that enterprises protect data, making such a cost saving strategy risky and potentially damaging.

Many an organisation has come unstuck when, having taken the decision to deploy technology, it has inadequately scoped the investment, restricting it to what it considers the bare minimum and failing to anticipate the implications of its deployments. No matter what type of software or device is chosen, security should be an important consideration, to lock down both the device and the data contained within it and so avoid ‘hidden' expense. Taking a mobile device as an example, the questions to ask are: what types of information will it be able to access and carry, and how easily could the device be lost or stolen? The answers will have a great impact on security concerns and risks, and will dictate the type and amount of security needed for the device. Simple, cost effective solutions, like boot-up passwords, two factor authentication and encryption, can all play a role.

One publicised example of inadequate, or even short-sighted, investment was a Marks & Spencer-owned laptop containing a database of 26,000 employees' details that was stolen from a third party. Having taken the decision to invest in laptops, the retailer had opted not to take the precaution of sufficiently protecting those, like this one, with sensitive data stored on them. The Information Commissioner's Office found Marks & Spencer in breach of the Data Protection Act, leaving the retailer not only with its reputation tarnished, but also with an enforcement notice to ensure that all laptop hard drives were encrypted - in hindsight, a modest investment which would have saved its blushes, not to mention the costs involved in handling the breach.

Many leading companies and organisations have already looked to decrease their overheads by reducing their property spend and energy expenses, downsizing to smaller, cost-effective premises. Redundancies are inevitable as workforces are slimmed down, with remote working and hot-desking practices offering feasible alternatives.

As department numbers decrease, the resultant increased workload for those that remain may force diligent employees to take work home with them to avoid falling behind or missing deadlines. Hot-desking could become widespread as companies strive to maximise their use of resources and cut costs by providing limited desks for their workforce, if at all - a drastic option could be to cut the cost of a central office altogether in favour of a ‘virtual' office. Another solution may be to utilise external resources, such as contracted labour, consultants, and possibly entire departments - IT support, HR and payroll are just a few examples.

While many companies try to weather the storm, data security must still be paramount. Privacy laws, along with corporate governance and industry-specific regulations, have become prevalent over recent years and neither ignorance, nor lack of funds, will be deemed as adequate defence. If organisations decide to lower their fortifications to allow flexible working practices, it is important that they do so securely and in a controlled manner. Here are a few ways for companies to examine what they currently have in their arsenal, and those that they really shouldn't be without:

Mobile computing allows people to use IT without being tied to a single location. Any business with staff that work, or will work, away from the office can benefit from using it. Devices - from laptops and personal organisers to "third generation" (3G) phones - can help staff keep in touch and make the most productive use of their time. They can change the way you do business and lead to new ways of working, even new products and services that can be offered to customers, bringing new business opportunities. Increasingly, networking "hot spots" are being provided in offices where multiple employees access the same machine and network. While this increases productivity and can reduce costs, it must be done securely. The Information Commissioner's Office advises encrypting any personal information held electronically whose loss or theft would cause damage or distress, and providing data access only to approved personnel.

With new technologies, it's not only easier but more secure than it once was to let workers log onto the company network from home. Having fewer people working at the office could save money on energy bills - this could be taken further by shutting the office completely one day per week and having everyone work from home, with further savings realised by shutting down the heating or air conditioning system. However, it is still imperative to secure the data as it leaves the office and travels home on the tube.

Replace dedicated WAN links with site-to-site VPN. If your business has multiple physical locations and you have dedicated leased lines connecting them, it might be time to think about ditching the expensive dedicated links and replacing them with site-to-site VPN connections instead. Midsize and large businesses may be able to save thousands on monthly fees by doing this.

Software application management (SAM) identifies installed applications, and then monitors their usage (or lack of it) to determine compliance with software licences and adherence to corporate usage and security policies. SAM is often perceived as a compliance exercise, yet the truth is organisations tend to underutilise licences, typically wasting 10-20 per cent of spend on dead, outdated, or unnecessary application licences. Reducing underutilised software has security benefits as well. Fewer applications mean fewer opportunities for compromises and configuration errors. Also, the process of inventorying and auditing software usage often paves the way for additional control disciplines that cut costs and boost asset productivity.

Outsourcing is a sensitive subject, often conjuring images of personnel cuts. Yet the reality is that judicious outsourcing can allow you to better utilise existing personnel.

Make good use of existing investments - advice PA Consulting should have heeded when an employee decided to circumvent existing security procedures, transferring data unencrypted to a memory stick in breach of the company's contract and its own security policies. The memory stick, containing a Home Office database of 84,000 prisoners, was subsequently lost and, as a result, the firm has had its three-year contract worth £1.5 million terminated, with the Home Office further reviewing its other contracts, worth £8 million a year. Everyone within an organisation must understand his or her responsibility for keeping sensitive information secure and how to use the available technology, such as encryption software, to do so.

Fundamentally, effective security means doing more with less - it is about people, processes and technology. There are plenty of interesting technologies available although they're all useless if they're inappropriately deployed, managed and maintained. Allowing devices to operate in your enterprise without any rules or policies is truly the biggest risk. Complicated policies that regular users can't grasp are futile; instead they should be simple, precise and basic common sense. Often if people understand why they need to do something, then they'll do it. If all else fails look for something that can be enforced, often unseen, that takes the onus away from them.

In difficult economic times it is important to remember that the evidence of past downturns shows that those who make smart use of innovative technology will be the ones who live to fight another day.

Michael Callahan is Vice President Global Marketing, Credant
www.credant.com

As operators begin to position themselves as multimedia companies, Chris Yeadon examines the essential billing system capabilities now required to support an effective content based strategy

Over the last few years many industry observers have stressed the importance of mobile operators becoming media companies, if they are to avoid becoming merely ‘dumb bit pipes' as they approach the so-called ‘IP Jungle'. Many operators have taken notice. An increasing number are now positioning themselves as multimedia companies, offering new fixed-line, DSL, IPTV and mobile TV services in addition to their core mobile offerings. Content is also playing an increasing role, especially in the mobile channel. Indeed, according to Portfolio Research, the mobile content market was already worth $24 billion in 2008 and is forecast to rise to $47 billion in 2013.

The challenge for many operators is how to support the new array of partners and especially content partners who will play an increasingly important role in the value chain of their offerings. In this regard the billing system will be pivotal in the support of next generation strategies.

Mobile content covers a huge area in terms of genre and format, ranging from stock price information to games and SMS alerts to mobile TV and advertising. Therefore, before we examine the billing system requirements for mobile content it is necessary to make some assumptions. For the purposes of this article mobile content is:

  • Both produced and owned by a third party
  • Obtained or consumed over the network
  • Marketed and sold by the operator (‘on-deck')
  • Always chargeable - but not necessarily to the end user.

Mobile content provides operators with the means to increase customer focus by targeting customer segments with relevant content. This could include offering financial news to the business segment or music downloads to the youth segment. Depending on the ambition of an operator's content strategy, they may develop a content partner ecosystem consisting of numerous companies each with different financial and contractual models to be supported. Therefore, in much the same way that an operator requires efficient access to contractual and product data to manage its relationship with its customers, it must have the ability to efficiently manage its partner relationships. This includes contract definition flexibility to support the different types of partner business models.

Unless the content partnership is based on a ‘buy-out' or ‘blanket fee' arrangement, the business terms of the partnership will typically be based on a percentage of revenue or rate per use royalty. Therefore by definition, the billing system will need to support multiple chargeable parties. Firstly, the subscriber usage charges (including any applicable offers or discounts) must be calculated and made available for subsequent billing. Then, in relation to each transaction, additional charges payable to the content partner must also be calculated by the billing system.

Whilst this may seem obvious, for certain types of content the reality can be much more complex. Music content is a good example. In some cases an operator's billing system may have to support three or more additional chargeable parties including a record company, music publishers and various copyright protection societies.

In order to support a rich content partner ecosystem, the billing system must have the flexibility to generate different types of invoices and statements for each type of transaction, in addition to those produced for the subscriber. Taking the example of music content: a record company partner may have control of the content platform from which downloads are made, and is therefore able to issue an invoice to the operator at the end of each billing period. The billing system may then need to produce a reconciliation statement in order to reach settlement with the partner. On the other hand a copyright society, with no connection to the content delivery process, will rely on the operator to provide a different kind of credit note or ‘negative invoice' at the end of the billing period.

For an operator's billing system, partner agreement flexibility is the ability to support the business terms of all its partner agreements. In particular this means having the tariff flexibility to support the calculation of all the appropriate charges payable to a content provider as well as, at the billing level, the application of bulk usage discounts and incentives that may also be written into the partner agreements.

The following example shows the schedule of royalties payable to the Mechanical Copyright Protection Society (MCPS) for full-track music downloads. MCPS is a copyright society in the UK, representing music publishers, from which businesses recording music must obtain a licence. Similar organisations exist in many countries across the world. The schedule illustrates a potentially complex charging scenario that must be supported.

No. tracks in bundle    Royalty Rate           Min. Royalty Per Track
1-7                     8% gross revenue       4p
8-12                    8% gross revenue       3.5p
13-17                   8% gross revenue       3p
18-29                   8% gross revenue       2.5p
30+                     8% gross revenue       2p

MCPS (UK) Royalty Schedule for Full Track Downloads

In this example the basic royalty charge payable by the operator is eight per cent of gross revenue from the download of the music file. However, this is subject to a minimum royalty charge of four pence. Furthermore, this minimum royalty is variable, depending on the number of tracks included in a bundle. Therefore, within the charging component of the billing system, an operator may have to set up a dependent tariff of eight per cent of gross revenue as well as a series of alternative tariffs based on the various potential bundle scenarios. During the rating process, the charges based on both the dependent tariff and the applicable alternative tariff would have to be calculated; then, during the billing process, a ‘best option' rule would be applied, selecting the most favourable charge for the partner (MCPS).

The support of the above scenario also assumes the number of tracks contained in a bundle can be supported as a field in the content usage record, and that the rating engine can support the bundled track numbers as a unit of measurement.
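
A worked sketch of the schedule and the 'best option' rule: calculate both the eight per cent dependent tariff and the bundle-size minimum, then pay whichever is more favourable to the partner. Amounts are in pence and the function is illustrative, not billing-system code.

    def mcps_royalty(gross_revenue_pence, tracks_in_bundle):
        """Royalty payable to MCPS for one bundle, per the schedule above."""
        minima = [(1, 4.0), (8, 3.5), (13, 3.0), (18, 2.5), (30, 2.0)]
        per_track = next(m for lo, m in reversed(minima) if tracks_in_bundle >= lo)
        revenue_share = 0.08 * gross_revenue_pence     # dependent tariff
        minimum = per_track * tracks_in_bundle         # alternative tariff
        return max(revenue_share, minimum)             # 'best option' rule

    # A 10-track bundle sold for 399p: 8% is 31.92p, but the minimum is
    # 10 x 3.5p = 35p, so the royalty payable to MCPS is 35p.
    assert mcps_royalty(399, 10) == 35.0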

Historically, mobile content has involved higher levels of risk resulting from fraud and accidental overspending. The risk is heightened by the fact that operators are exposed to third party content charges that are usually dependent on usage or revenues due, rather than revenues paid.

Therefore content services should be chargeable in real-time for both prepaid and postpaid subscribers. This would enable operators to perform balance checks on the subscriber's account. In the case of a prepaid subscriber, this is to ensure that sufficient credit remains to cover the transaction. For a postpaid subscriber, this is to check against a threshold set by the operator as a precaution against possible fraud or overspending, or one set by the subscriber as a voluntary spending control measure. Such checks should result in an advice of charge message being sent to the subscriber in advance of the transaction.
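
The control flow just described might look something like the following sketch; the subscriber fields (is_prepaid, balance, spend_this_period, spend_threshold) and the advice-of-charge hook are assumptions standing in for the real charging system's interfaces.

    def authorise_content_charge(subscriber, charge, send_advice_of_charge):
        """Real-time balance/threshold check before a content transaction."""
        if subscriber.is_prepaid:
            # prepaid: sufficient credit must remain to cover the transaction
            allowed = subscriber.balance >= charge
        else:
            # postpaid: check against an operator- or subscriber-set threshold
            allowed = (subscriber.spend_this_period + charge
                       <= subscriber.spend_threshold)
        if allowed:
            # advice of charge goes out before the transaction completes
            send_advice_of_charge(subscriber, charge)
        return allowed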

The benefits of real-time charging to the operator are clear. It reduces the degree of credit risk, provides price transparency and a feeling of control to the subscriber and provides the possibility to offer the subscriber, in real-time, promotions and discounts.

One possible way to achieve this would be to utilise the real-time capabilities of a postpaid billing system to charge all content transactions for all subscribers, prepaid and postpaid. In addition to the important real-time controls, it would mean that all content services can be made available to all subscribers and that costly content tariff configuration duplication is reduced.

In the future as the telecommunications industry converges with the media and entertainment industries it is clear that content partnerships will play an increasingly significant role in an operator's success. It is therefore essential that an operator is equipped with a billing system that is designed to support content partner models, has the flexibility to sustain the varied financial models of a large partner ecosystem and, in addition, has the tariff and marketing features, including real-time charging, to satisfy increasingly demanding customers.

Chris Yeadon is Director, Product Marketing, LHS

Over the past 30 years or so, consumers' quality expectations have been set high by their experience of legacy voice systems. When carriers introduced VoIP as an alternative delivery system for traditional telephony, these quality expectations represented a new challenge for service providers as they attempted to deliver voice over the IP infrastructure. Today, Bruce Perlmutter explains, service providers face the new challenge of defining and delivering multiplay services (voice, data, video, etc) that meet customer expectations, not least because there were no pre-defined metrics for delivering multiplay services

IP networks differ from the legacy networks they're replacing. The fundamentals of IP network construction and the dynamic nature of today's commercial infrastructure cause inevitable congestion. Congestion is the source of many quality problems in multiplay networks. Network operators can partition network resources based on policy goals by implementing quality of service (QoS) mechanisms, but good QoS at the network level does not ensure good Quality of Experience (QoE).

As service providers try to meet customers' high QoE expectations, critical tools and standards definitions need to be implemented. A far cry from traditional network monitoring, a new breed of service verification tools that actively monitor end-user QoE is now being introduced. These tools are becoming so sophisticated that they can actively test individual subscriber paths - down to the set-top box in the home - rather than passively monitoring and reporting on general network conditions.

The Telecommunication Standardization Sector (ITU-T) coordinates standards for telecommunications on behalf of the International Telecommunication Union (ITU) and is based in Geneva, Switzerland. Since its inception in 1865, the Union has been brokering industry consensus on the technologies and services that form the backbone of the world's largest, most interconnected man-made system. The two standards most relevant to this article are the QoE and QoS standards.

ITU standard P.10/G.100 provides the following definition of QoE: "The overall acceptability of an application or service, as perceived subjectively by the end-user."

Since QoE includes the subjective user experience, the ITU defined a test methodology based on mean opinion scores (MOS). A MOS is measured by carefully controlled subjective tests laid out in ITU-R BT.500-11 and ITU-T P.800. In these tests, subjects view video or listen to telephony samples and rate them on a scale from 1 to 5. The average of the ratings for each test case yields the MOS.
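
The arithmetic is straightforward: for one test case, the MOS is the mean of the subjects' ratings, and methodologies such as BT.500 report a confidence interval alongside it. A small sketch:

    from statistics import mean, stdev

    def mos(ratings):
        """Mean opinion score for one test case, with an approximate 95%
        confidence interval (ratings are integers from 1 to 5)."""
        m = mean(ratings)
        ci = 1.96 * stdev(ratings) / (len(ratings) ** 0.5)
        return m, ci

    # mos([4, 5, 3, 4, 4, 5, 3, 4]) -> (4.0, ~0.52)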

QoS is a concept closely related to QoE though the two are often confused. The ITU's E.800 standard gives the definition for QoS as: "The collective effect of performance which determines the degree of satisfaction of a user of the service." In telecommunications, QoS measures a network's actual performance using two factors: QoS mechanisms and QoS metrics.

QoS mechanisms such as traffic schedulers, shapers and admission control techniques help smooth out traffic flows. These mechanisms ensure that no one user or application gets an unfair share of the available bandwidth. In IPTV networks, loss concealment protocols such as forward error correction (FEC) or automatic repeat request (ARQ) are commonly used QoS mechanisms. More common QoS metrics and measurements such as packet loss, network latency, and jitter also help determine the underlying quality of the transmission network.

Network operators predict user voice call quality over the network by using QoS metrics. Unfortunately, video streams are more difficult to manage: predicting the effect of a given set of QoS metrics on the service can be very difficult.
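
Of these metrics, jitter is the least obvious to compute. For RTP traffic it is typically estimated as specified in RFC 3550: the absolute difference between successive packets' transit times, smoothed with a 1/16 gain.

    def update_jitter(jitter, transit_prev, transit_now):
        """One RFC 3550 jitter update. A packet's transit time is its
        arrival time minus its RTP timestamp, in timestamp units."""
        d = abs(transit_now - transit_prev)
        return jitter + (d - jitter) / 16.0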

ITU's standard G.1080, "Quality of Experience Requirements for IPTV Services," explains the elements of QoE for IPTV. Figure 1 shows the framework described in the document.

Content providers who transport digitised and packetised video signals over an IP network have a myriad of options. The selected compression algorithm parameters drive the underlying network's QoS metric requirements, as recommended by the ITU in the G.1080 standard. Encoder input comes from different source video stream variants such as PAL and NTSC. Signal coding comes with a variety of algorithms, such as those defined by MPEG. Control setting management often defines the competitive advantage of one encoding system versus another.

ITU's G.1080 standard provides recommendations for the underlying network QoS metrics for a given type of encoding and bit rate.

Multiplay service networks have come a long way. Operators partition network resources using sophisticated QoS mechanisms so that streaming video and peer-to-peer file sharing applications do not adversely impact voice services. The main impediments to high subscriber QoE are:

  • The inherent structure of IP networks and their building blocks means that network congestion will always occur.
  • It is prohibitively expensive to engineer an IP infrastructure that handles the large ratios between peak and mean traffic.
  • Deviations from a network's planned implementation are inevitable.

Exceptionally high traffic exacerbates these issues. Providing support for short traffic bursts by building out server and network infrastructure is usually too expensive for service providers. Congestion must be handled gracefully: servers and network infrastructure shouldn't malfunction under the load, and any subscribers admitted to the service should receive a reasonable service despite the overload.

The large variety of applications supported on IP-based multiplay networks contributes to the network's dynamic nature as subscribers adopt new services. Increasing amounts of HDTV and user generated video content can quickly change network requirements, for example.

The majority of systems currently deployed exclusively use passive monitoring technologies, which often employ expensive probes at numerous points in the network. Exclusive use of passive monitoring can give a malfunctioning network a clean bill of health, even while support personnel busily help large numbers of disgruntled customers. Passive network management systems can overlook an issue's root cause, even as network management tools light up with hundreds of alarms.

Recently, Ixia, based in California, developed a patented methodology in cooperation with tier 1 network operators to provide subscribers with a high level of QoE from an actively tested network infrastructure. Ixia's IxRave runs pre-designed tests on actual subscriber network paths from a centralised "test head." The solution monitors the network infrastructure from typical locations, and enables tier 1 service personnel to run active tests on an individual subscriber's service path - from the network core all the way to the set-top box sitting at the subscriber's home.

This solution tests the entire network infrastructure and verifies that it meets the ITU-defined goals for QoS and QoE by using both passive monitoring and active testing. By sending different permutations of multiplay traffic, such as voice and video, under different network load conditions, it measures how the network performs along subscriber paths during peak activity times and checks that they meet QoS standards. It then makes network health assessments based on the QoE and QoS metrics set by the network provider.
Service providers can now test the transport, network and service layers of the network, as well as more common QoS metrics such as delay, jitter and packet loss. Customers' QoE can be measured using standard ITU-defined MOS scores, so that the network provider gets a true end-to-end picture of the network and "sees what the customer sees".
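To make the MOS link concrete: one widely used way to derive a MOS estimate from network measurements is the ITU-T G.107 E-model, which condenses impairments such as loss, delay and echo into a single R-factor and maps it onto the MOS scale. Ixia's exact method isn't described here; the sketch below shows only that standard mapping, with a hypothetical R value.

def r_to_mos(r):
    """ITU-T G.107 E-model mapping from R-factor to estimated MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

print(r_to_mos(80))   # ~4.0, generally regarded as "good" quality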

Bruce Perlmutter, Ixia, can be contacted via:
contact_us@ixiacom.com

Dominant incumbents hobble market
Broadband connections across the EU rose by 20 per cent over the year to a total of 110.5m, representing 22.5 per cent of Europe's population, according to ECTA's latest twice-yearly EU Broadband Scorecard. But the pro-competition body warns that in countries such as Spain, where the incumbent operator, Telefonica, continues to increase its control of the market with more than 57 per cent of all retail broadband connections, the market has stagnated. As a result, Spain is languishing below the OECD and EU averages, with a broadband penetration rate of only 20 per cent and low growth rates.

On the eve of the tripartite meeting of the European Commission, the European Parliament and the Council of Ministers, which is aiming to achieve consensus on the way forward for next generation access networks, ECTA has cited Spain as an example of why firm regulation is needed to ensure investment and take-up in next generation networks, as well as to guarantee competition and consumer choice.

Innocenzo Genna, Chairman of ECTA, says: "Light touch regulation has not worked in the banking sector and there is no reason to assume it will work to consumers' benefit in telecoms. Financial results from incumbents, such as Telefonica last week, show that they are primarily focused on increasing profitability at the expense of vital infrastructure investment. What is particularly disingenuous is that, at the same time, they are threatening governments and European politicians that they will not invest in next generation access unless there is a relaxation of competition rules that allow rivals to offer services over these networks.

"Despite Spanish regulator CMT granting Telefonica a regulatory moratorium for next generation fibre networks, ostensibly to support €1bln fibre investment programme, there is little or no evidence that it will prioritise infrastructure investment in future. Instead it is committed to ‘preserving the company's strong cash flow generation', to the benefit of shareholders."

Telefonica Spain posted an 8.9 per cent increase in profit (EBITDA) and cash flows up 14 per cent to €8bln, but a reduction of 7.2 per cent in investment (capex) in 2008. In addition, results from Deutsche Telekom, which has also been demanding regulatory forbearance from EU policy-makers, show that it outperformed financial expectations and ‘expanded its leading position in the German DSL market'. The annual report also confirmed the regulatory holiday it has enjoyed in access to its ‘fibre to the node' network, a situation that has in all likelihood helped reverse competitive progress in Germany.

Genna concludes: "We have no problem with companies prioritising profitability, making healthy profits and benefiting from their own investment and innovation. However, we do have a problem with healthy companies using the recession as an excuse to blackmail policy-makers into relaxing regulation, with the aim of strengthening their own dominant position further at the expense of competitors and consumers. For companies such as Telefonica and Deutsche Telekom, investment in next generation fibre networks should be part of their normal upgrade strategy to replace decades-old copper networks, which have been fully paid for, and not a licence for stifling competition."

Other more encouraging results from the industry benchmark scorecard show leaders in broadband - Denmark, Finland, Sweden and the Netherlands - all have penetration rates exceeding 30 per cent with the UK not far behind. Common to all these top ranked countries is strong competition from both cable and regulated unbundling of the local loop. In some of these countries, incumbents have also publicly committed to open access policies, in contrast with those of incumbents in Germany and Spain.

Mobile ups and downs
The mobile entertainment market will reach nearly $36bn in 2010 according to Juniper Research, but the analysts also warn that this is a best case forecast, and revenue growth will be markedly lower if the global recession fails to bottom out over the next twelve months.

Using a scenario-based approach to assess the impact of the recession on the mobile entertainment industry, the recent Mobile Entertainment in a Recession report found that average annual growth over the next two years declines from nearly 19 per cent under the best case scenario to less than seven per cent in the worst case, with mobile TV, user-generated content and music amongst those sectors which are particularly exposed.
Lower discretionary spend on services and handsets was amongst the major contributory factors to the decline in top-line entertainment service revenues, although the report found that other attendant factors - such as a lack of available funding to finance the development of new applications, and faster migration to ad-funded services - would also impact on revenue growth over the forecast period.

According to the report author Dr Windsor Holden: "Some entertainment services appear to be highly susceptible to the downturn. Furthermore, given that operators will perceive that consumers will be increasingly reluctant - or unable - to purchase content, they may in turn be less likely to roll out expensive, higher risk services: a dedicated mobile broadcast TV network is a prime example."

However, the Juniper report found that some sectors, such as adult services and gambling, were less exposed than others: it noted that for mobile gambling, there was a possible upside in the migration of wagers from physical to remote sites with consumers going out less and instead placing bets via the mobile or PC.
Details: www.juniperresearch.com

Digital proximity marketing
Although digital proximity marketing is still at an early stage of development, it is spreading across Europe, with the UK leading the way.  Currently there are more than 35 providers throughout Europe who have helped to showcase these new technologies in the marketing realm.  Perhaps the most impressive utility of digital proximity marketing is the ability to reach consumers with information matched to their interests at a specific time and place.  This strategic positioning makes this new form of marketing exciting and a hot topic today.

"The broad diffusion of short-range wireless technologies embedded in mobile phones has enabled interaction between mobile users and the surrounding environments. Systems of sensors can detect mobile phones in the short distance and send them information and data that could be useful to the final users," says Saverio Romeo, Frost & Sullivan Industry Analyst. "Digital proximity marketing uses this idea to enable digital marketing and advertising campaigns in places where the actual users are. 

Digital proximity marketing transports information over the Internet from a content management system, which manages the whole marketing and advertising campaign, to a content server, which stores everything.  Then, through cellular networks, the information is delivered to access points, which connect to the end users' mobile phones through short-range wireless technologies - mainly Bluetooth, but also Infrared, WiFi, GPS, cell towers and RFID. The system is based on an opt-in model, so users receive information only if they want it.
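The opt-in rule is the crux of the architecture just described. The toy sketch below shows the delivery decision an access point has to make; the device names and campaign are entirely hypothetical.

opted_in = {"device-17", "device-42"}    # devices whose users have said yes
in_range = ["device-03", "device-17", "device-99", "device-42"]

campaign = {"id": "spring-sale", "payload": "20% off today only"}

for device in in_range:
    if device in opted_in:    # opt-in model: never push unsolicited content
        print(f"push {campaign['id']} to {device} via Bluetooth")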
Details: www.frost.com

Business leaders demand video on the move
A consumer behaviour study by Ericsson and CNN has revealed that the international business elite are increasingly accessing the Internet while on the move. The growing need for flexible viewing options to fit with changing lifestyle habits means that top executives are increasingly viewing TV content on laptops, desktop computers and mobile devices. The survey, carried out amongst CNN's online audience, also showed that more business leaders than ever are sharing user-generated video content.

  • 56 per cent of respondents with mobile internet access online content whilst on the move, for example via a mobile device or wireless LAN. This trend speaks to the increasing number of upscale consumers with internet access outside of the home or office environment.
  • Three quarters (73 per cent) of CNN's online audience of global citizens share user-generated video content. In fact, 66 per cent of those over 45 share user-generated video content, debunking the myth that it is just an activity for the young. Almost a third (29 per cent) of those surveyed record video clips on their mobile phone. In a nod to the sharp rise of citizen journalism, and perhaps in response to the growing number of social platforms enabling video sharing, 16 per cent of respondents who share user-generated video content are doing so with other digital community members.
Details: www.ericsson.com


On the eve of the TM Forum's Management World, Keith Willetts notes that the imminent arrival of a true digital economy represents a massive opportunity for expanded communications services.  The key question, however, is whether it also opens up a whole new set of revenue streams for the service providers

Not sure I've seen any fireworks lighting up the sky, but it's now a full 25 years since the first telecom deregulation. In that time just about every market in the world has gone down the path to competitive communications. So what, fundamentally, has happened in that time? Competition and regulatory pressures have transformed prices, but as the communications world discovered the laws of market elasticity, rising volumes and the phenomenal growth of mobile have meant that revenues have continued to rise. In reality, the business model for communication services hasn't changed much in that time - we've just sharpened up the marketing of the old one.

But just as the financial markets found out, all good things come to an end! According to IDate, the global communications market grew by only 4.2% in 2008, to $1.37 trillion, and most of that growth came from still-expanding markets like India and China. In mature markets any volume growth was more than cancelled out by price declines on mobile and broadband. Poor old fixed line revenues fell by 5%. Prices for everything are declining as we go not only into a recession but maybe a deflationary period as well - I can't imagine a scenario where communications prices will go up; indeed they are likely to follow a form of Moore's law.
In Europe, mobile penetration now exceeds 100 per cent, with no more market left to trawl. So how do you continue to grow your business? The stock answer from CEOs is an exciting story of new mobile broadband; mobile TV; IPTV; unlimited music; online books - you name it, they will claim it. But that question has been asked, and similar answers given, for a long time now, and there is little evidence to show that the service providers can realistically generate new, innovative revenues.

Remember when location based services would make us all rich - well the market took so long defining standards for exposing the location data that the handset guys have just got around it by putting GPS chips in their phones. Same for MMS - too hard, too slow and too user unfriendly to get a mass market going. The only truly new services, like iTunes, have come to market from ‘over the top' players, not the communications companies.

So the question has to be asked - can service providers realistically generate sufficient new revenue from the services they sell to their current customers to replace the falls in price on traditional services as markets saturate? And if the answer to that is maybe not, what are they going to do for an encore? Until recently you could point to diversified services like outsourcing of corporate communications networks as a ray of sunshine - that was until one major carrier started posting profits warnings and admitting over-stating profitability of that business.

Clearly, service providers are quickly coming to a fork in the road when it comes to their core business model - just who are their customers and their competitors; what services should they be selling and how are they going to monetize them?

Pioneering services like Amazon, Google, iTunes, and Hulu have shown that entire markets can be shifted to a digital economy model at much less cost, but where everyone can still make money. Apart from bricks and mortar stores of course. We are seeing a similar thing in publishing - more and more publications are going online and eschewing expensive printing and shipping. Books and newspapers may well follow music and videos in going online through products like Amazon's Kindle.

In fact, the global recession will push almost every business on the planet to look at what cheaper and better online approaches they can exploit. Thanks to advances in communications - fibre, 4G wireless and femtocells (putting cell sites within the home) - the market for digitally enabled services may well explode on a myriad of consumer devices, from net-enabled TVs through online gas and electric meters, fridges and cars.

This mushrooming of devices and a true digital economy represents a huge array of opportunities for expanded communications services.  The key question is - does it also open up a whole new set of revenue streams for the service providers? Do they get commoditized into bit pipe players? Would that matter?

Almost as long ago as deregulation, Michael Porter (Key competencies, 1985) outlined the concept of companies maximizing their core competencies and minimizing any reliance on what they are not good at. So what is it that communications companies are good, and not so good, at? How many wildly successful new services have been introduced in the past 10 years? Apart from DSL (Alexander Graham Bell with knobs on) you really have to scratch your head to come up with anything - most are basically variations on a theme: voice minutes in all-you-can-eat packages with texting thrown in, and different bundles with broadband.

As for mass market, innovative, successful services - Google, Facebook, iTunes, Kindle, Hulu, and so on - none of them came out of a communications company. All of them could have been invented by a communications player - they certainly have the brains - but their business models get in the way: their DNA is just not geared to taking risk, moving quickly and launching anything that might damage current lines of business.

But on the other hand none of these new services could exist without the innovations of the communications industry. What the service providers are good at is being a great enabler of other people's services - after all, for a 100 years phone companies have enabled us to talk to other people - they didn't do the talking!

Being a service enabler presents a new business model or, at least, significantly extends an old one. Providing a range of enabling capabilities can unlock a different charging model, such as taking a percentage of the revenues of the services that are enabled. This gives much more scalable revenues than, say, flat bandwidth charging approaches. It opens up new revenue streams by opening up the software and process infrastructure of a comms company - transport obviously (but maybe at various qualities of service) plus capabilities like billing; settlements; authentication; cloud computing; user information and so on: in other words a super-wholesale enabler. But to open your mind up to that, you have to get your head around the fact that you are accepting that someone else is going to be the provider of service to the end user. And it's tough to pursue both a provider model and an enabler business model in the same company because they are usually in conflict. You can just imagine the schizophrenia that can result.

At TM Forum's Management World Nice this May, we're hosting a session on exactly this subject. Werner Vogels, CTO of Amazon, will talk about how his company has successfully played both sides of the fence: providing services to its own end users but also providing a lot of capability to enable third parties to sell through Amazon.

This business model is starting to be better understood and taken more seriously by communications companies, but you'd have to say the jury is still out on which fork in the road providers are going to take. Will it be the model of trying to develop innovative new services for individual end users and businesses, or will it be more the role of a behind-the-scenes enabler?

I think the next two to three years will be crucial to answering this question. A "do nothing" approach probably means service providers getting backed further and further into the role of a commodity bit carrier.  Being the ‘Intel Inside' of numerous new and exciting services is a much better place to be than a bystander watching the action from the sidelines. Enabling other people's services is something that communications companies can do to leverage their real core competencies.

Let's put a traffic camera by that fork and watch which way the punters go.

For more information on TM Forum and Management World in Nice please visit
www.tmforum.org/mw09
Keith Willetts is Chairman and CEO, TM Forum
www.tmforum.org

Mobile network operators often ask 'how can we leverage the social networking phenomenon?' A better question, says Jouko Ahvenainen, is 'how can we mine the social network we already have - that is, the network of our subscribers?'

Mobile phones generate better and more useful information about consumers than any other technology, including the web. How people use their phone is a very personal thing. Who people call, text and save in their phones is more closely related to their 'real' network of contacts than the people they connect with on Facebook, Twitter or MySpace. In the book I helped write, Social Media Marketing, one passage reads:

"The mobile device is a key element of the digital footprint since it is oriented towards capturing information (which is a driver of the digital footprint). The real question for the telecoms and mobile industry is what can they do with all this information? More importantly, what could they do in future with all this information?"

By accepting this advantage and looking at how this data can be utilised, while also maintaining strict standards of trust and privacy with subscribers, mobile phone operators can understand the best asset they already have - the 'goldmine' of subscriber data.

This data can open exciting new revenue channels and give deep customer insight so operators can offer better products and services. They can also create much more effective anti-churn campaigns. In addition to behaviour and demographic information, subscriber social network and influence information can be uncovered. Who a subscriber has influence over is a critical point when it comes to churn. If person A churns and brings persons B, C and D along, the problem has quadrupled. By identifying that person A might churn, and that he or she will bring three other leavers along, mobile operators can make churn-busting campaigns far more economical, with far greater impact. Operators can also be a source of market research data in the future, when they can collect more data than traditional market research firms can.
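To illustrate the arithmetic of influence-aware churn scoring, here is a deliberately simple sketch. The graph, churn probabilities and "follow factor" are all hypothetical; real products use far richer models, but the ranking principle is the same.

influence = {                  # who each subscriber is believed to influence
    "A": ["B", "C", "D"],
    "B": [],
    "C": ["E"],
}
p_churn = {"A": 0.6, "B": 0.1, "C": 0.3}   # modelled churn probabilities

def expected_loss(subscriber, follow_factor=0.5):
    """Expected number of subscribers lost if this one churns unchecked."""
    p = p_churn.get(subscriber, 0.0)
    followers = influence.get(subscriber, [])
    # assumption: each follower leaves with probability p * follow_factor
    return p * (1 + follow_factor * len(followers))

for s in ("A", "B", "C"):
    print(s, round(expected_loss(s), 2))   # A scores 1.5, B 0.1, C 0.45

On these numbers a retention offer aimed at subscriber A is worth fifteen times one aimed at B, even though A is only six times as likely to churn - which is exactly the economics the article describes.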

Of course, maintaining subscriber privacy is critical in all this, and it is certainly achievable. But it requires a shift in culture. There is a stark difference between trying to 'know' your customer and trying to 'own' your customer. Web companies continue to 'know' their customers better, while many phone operators stick to an outdated notion that they 'own' their customers. The longer the telecoms industry delays knowing the customer better, the longer the industry will lose out. Embracing the power of social networking information is key to this transformation.

Many operators hope that call data records (CDRs) give enough information to gain improved customer insight. They are a good starting point for making the operator's own marketing activities more effective, but not enough to be a platform for advertising and social services in mobile. We need to know more about how the subscriber interacts and uses service elements. There are three generally accepted things that social media requires to know a customer better:
1. You must incentivise a customer to share more personal information about him/herself. This assumes that all privacy / confidentiality standards are adhered to.
2. An open ecosystem / platform is needed. At present, Facebook and Google Android are the best examples of where third parties interact on open playing fields, and as such, do the work to grow the ecosystem. The best part about this is it can be exploited without direct monetary rewards.
3. We need touch points: places where the operator can interact directly with the customer and generate two-way communication. Broadcasting information at customers without interaction (a one-way street) is an outdated approach that no longer works in today's Twitter age. Advertisements are the best option (the alternative, call centres, is too expensive). And social networking can make these advertisements personal, intuitive and relevant - rather than annoying to the subscriber. Open systems are the best location for the advertisement to be placed.

These are the first steps for operators, and the starting point is to think about marketing and services in a new way. Marketing can no longer be a one-way broadcast of messages to customers; it must be a much more interactive relationship that also supports word-of-mouth. And it is the same with services: operators cannot make or select all services for the subscribers; users must be able to create their own services and choose which ones they want to use. Web 2.0 truly is coming to mobile, and it offers a platform where people can do what they want to do, not a place to push selected ideas and models at them. And web 2.0 is not the only evolution: CRM 2.0 is coming too, where subscribers can utilise their own data and data analytics for their own benefit. For example, subscribers can see their own social network and manage their own connections. This becomes a way to motivate people to share their data, when they can get benefits from doing so.

Following this, a measurement system must be created and agreed upon. Operators cannot act until there is a way to manage social media programmes. One approach has been to track user behaviour at the network level. An example of this is the mechanism Phorm's service is based upon. Phorm has become a pressing issue in the UK because, even if you did get permission to undertake this level of tracking, people do not understand networks and don't trust what they don't understand. People are normally okay to share small amounts of information if they get something back, but they don't like one-way spying models.

By working at higher levels of the stack than networking level (usually, the application level) and by making 'knowing the customer better' the goal rather than 'owning the customer,' there are huge marketing gains to be had as well as a new level of trust to be engendered with subscribers. The only way this can be achieved is through aggregated data, by not working with specific individuals or specific transactions but rather with aggregates and data patterns derived from the data.
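A minimal sketch of that aggregates-only principle, loosely in the spirit of k-anonymity: individual records go in, but only patterns shared by a sufficiently large group come out. The records and the threshold below are hypothetical.

from collections import Counter

records = [                                  # hypothetical raw usage records
    {"user": "u1", "segment": "18-24", "service": "music"},
    {"user": "u2", "segment": "18-24", "service": "music"},
    {"user": "u3", "segment": "18-24", "service": "music"},
    {"user": "u4", "segment": "18-24", "service": "video"},
    {"user": "u5", "segment": "45-54", "service": "music"},
]

K_MIN = 3   # minimum group size before a pattern may leave the operator

counts = Counter((r["segment"], r["service"]) for r in records)
safe = {key: n for key, n in counts.items() if n >= K_MIN}
print(safe)   # {('18-24', 'music'): 3} - patterns from small groups are suppressed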

Using data aggregates for marketing purposes also presents a solid business case for the converged operator. If an operator owns customer touchpoints via not only the mobile phone but also broadband, TV, landline and so on, the data presents a richer picture of the customer, and it becomes easier to engender trust when the customer only has to share information once, with one brand they trust.

As mentioned before, by understanding people's behaviours in the context of this network, operators can pull out and define a member's measure of influence, and use this for clever new mobile and online marketing techniques. London-based Flirtomatic, the web and mobile social network for flirting for ages 18+, is using this superior customer insight to engage in more targeted marketing and services for its influential members - those who have word-of-mouth impact on other members. Flirtomatic applies 'social intelligence' to create more compelling services for each customer segment, and targeted, relevant and personal marketing and promotions, via web and mobile. Flirtomatic is focused on generating viral take-up throughout its community via word-of-mouth marketing. This approach works two-fold, because influential members have both direct pull over purchasing decisions (by recommending products to their friends) and indirect pull (through friends' desire to imitate or mimic their purchases). Flirtomatic is using Xtract's Social Links product for this insight.

This works because studies show that social influence is more important than any other factor in consumers' purchasing decisions. One study of car-buying showed that 71 per cent of car buyers were influenced by what their friends said, whereas only 17 per cent were influenced by TV ads.

This insight can be applied by the operator to its own marketing, such as churn campaigns, where improvements in campaign effectiveness of as much as 20 per cent have been reached, or to generating new revenue through third-party advertising schemes. And this market is growing fast; eMarketer has predicted that spending on behavioural targeting will reach $3.8 billion by 2011.

Flirtomatic's CEO Mark Curtis recently said: "The early results from customer segmentation are very insightful and exciting. We can now see considerable potential, as the business scales, to directly improve our revenues through a sophisticated view of our customers, their behaviour and the pattern of their relationships. The tool hands us an effective, powerful CRM solution."

Jouko Ahvenainen is co-founder and VP at Xtract, and author of Social Media Marketing. He can be contacted via Jouko.Ahvenainen@xtract.com
www.xtract.com

The growth in the African telecommunications market over the past five years has been nothing less than phenomenal. Although growth rates are expected to slow, Julia Lamberth and Serge Thiemelé explain that Africa should continue to be the fastest growing market in the world for the next five years

The growth in the African telecoms market has turned the telephone from a luxury item into a basic necessity in many countries. However, the expansion has not been universal across the continent. Some countries, such as South Africa and Libya, have already passed the 100 per cent mobile penetration rate, while others, such as Ethiopia and Eritrea, still have penetration rates under 10 per cent.

According to Ernst & Young's recently released Africa Connected survey, the growth up until now has been driven almost entirely by GSM voice. While voice should continue to be the largest component of the market for the foreseeable future, it is expected that data is going to be an ever-increasing component of operator revenues in the future.

Internet penetration on the continent is still substantially lower than any other part of the world, with only nine countries on the continent having penetration rates above one per cent. It is expected that the construction of submarine cable systems, the first of which should be operational by the middle of the year, is likely to be the catalyst for accelerated growth in African Internet penetration. Alongside the construction of the submarine cable systems, which should, to a large extent, address the problems posed by inadequate international connectivity, there has been significant investment in terrestrial fixed line infrastructure.

This investment has been made by both private operators, especially in countries such as Nigeria and South Africa, as well as by governments, in countries such as Angola, Malawi, Botswana and the Democratic Republic of the Congo. While the impact of this investment will not be felt immediately by consumers in many countries, it will provide the basis for cheaper and more reliable telecommunications in the next few years, particularly in rural areas. It is likely that the vast majority of the next wave of African Internet users will not connect to the Web through services that rely on fixed networks, but instead use the infrastructure provided by mobile and fixed wireless service providers.

One of the reasons for the slow pace of telecommunications growth on the continent in the past has been the historical lack of basic infrastructure. Poor infrastructure was one of the areas identified by operators as a key challenge to the development of telecommunications. This weakness has manifested itself in a number of ways, including limited access to core telecommunications infrastructure, as well as a lack of the reliable electricity network needed to keep networks up and running. This weakness in basic services has a negative impact on the ability of operators to rapidly deploy their own infrastructure, and making contingencies for it carries a significant cost. Safaricom in Kenya, for example, reportedly spends more than a million euros a month on diesel to power the generators it needs to keep its network running.

This situation has resulted in operators exploring alternative sources of energy such as wind or solar power to supplement other power generation options.

Operators identified attracting and retaining talent as the largest operational issue. This applies to both technical and management skills, with operators struggling to fill vacancies across the spectrum. While they acknowledged the importance of training, the issue of staff being poached by rivals was identified as an ongoing challenge.
The increased vigilance of regulators on the continent has heightened the need to ensure network reliability, as regulators take a more active consumer protection role. Examples of this include operators being barred from marketing their services in Nigeria until quality of service reached an acceptable level.

In addition, operators interviewed voiced concern over the perceived political interference in the regulatory process. It is this lack of consistency that creates difficulties for operators, as they are unsure of how changes in the local regulatory framework will impact their businesses, especially if these changes are being driven by a political, rather than a pure regulatory agenda.

Operators highlighted the high rates of taxation they are subjected to, with the average across the continent coming in at over 30 per cent. Governments across the continent have chosen to place a heavy tax burden on mobile operators by taxing profits at a higher rate, instituting mobile specific taxes or raising license fees. Operators also raised the issue of excise taxes on imported handsets, making them less affordable to consumers and hampering the ability of companies to reach potential customers in lower income brackets.
Scale is considered one of the key elements of future success in the African market, and competition for new licences and existing operations is keen. It is likely that we are going to see significant consolidation in the next few years, as smaller operators feel the effects of increased competition.

The global economic crisis is not likely to leave the African market unscathed, as many operators may find it more difficult to raise the funding needed to continue the level of investment needed to remain competitive. Especially in key markets such as Nigeria where the multinational operators are investing heavily in network expansion, the smaller operators may find it difficult to keep pace with either the network coverage or the technological innovation of the large regional and multi-national operators.

The issue of infrastructure sharing and outsourcing of parts of the business is one way for operators to cut the costs of doing business. However, operators surveyed were resistant to this, preferring rather to have control over the infrastructure and services that they consider their competitive advantage. More recently, however, some operators have said that they are looking to cooperate with competitors wherever possible to bring down the cost of deploying new infrastructure.

Operators, specifically in more developed markets, are also starting to look at broadening their set of services to include targeting the wider ICT market. This has seen operators acquiring companies in the information and communications technologies sector. This broader focus is setting the stage for a divide in the market between operators that choose to create a converged services offering and those that focus on offering voice and basic data services at a lower cost.

The rollout of these converged services, which include fixed and mobile services as well as offerings that have traditionally been the reserve of the Internet service providers, such as hosting and business continuity, will further drive the development of the African ICT market. While the initial focus of these services will be in the developed markets, it is expected that these services will rapidly be driven out to corporate customers in all the territories in which these companies operate in Africa.

This investment by operators, as well as the infrastructure that is being deployed, will set the stage for the rapid adoption of more data-focused services by both governments and corporates across the continent. While these types of organisations are likely to access the new networks via new fixed-line connections, consumers should benefit from the deployment of high-speed wireless services, with the attendant increases in bandwidth and broadband availability.

It is anticipated that networks based on 3G will dominate the market for broadband wireless access with CDMA EVDO and WiMax offering some competition as well as providing access where the GSM-based service is not suitable.

We expect the next five years to see a continuation of the growth in African telecommunications, with increased Internet and broadband penetration across the continent. At the same time, the market is likely to undergo a period of considerable consolidation, with the existing African operators continuing to expand their reach across the continent. It is our view that operators who do not already have an African presence could have a difficult time challenging both the strong regional and global operators (MTN, Vodafone, France Telecom and Zain, for example) and a plethora of new licensees (more than 40 per cent of the market is still in one per cent market share slices). Operators launching as the fourth or fifth licence holder in a country may face challenges in generating profits, especially where one operator already has a dominant position. Countries such as Angola and Ethiopia, where none of the large regional players have established a presence, are viewed as real opportunities for future expansion.
Julia Lamberth and Serge Thiemelé are co-leaders of the Ernst & Young Global Telecommunications Center - Africa.

The Africa Connected survey was compiled from interviews conducted with operators from across Africa. For further information please visit www.ey.com/telecommunications or contact globaltelecommunicationscenter@uk.ey.com

With mobile operators keen to implement impending network upgrades in the most effective manner, Colin Garrett explores how they can limit network planning costs in the face of the economic downturn

Mobile operators are under increasing pressure to provide the best service to their customers at the most competitive rates.  The next 12 months will see operators across Europe struggling to strike a sensible balance between the need to roll out the latest network upgrades and avoiding passing additional costs on to the end user.  With the difficult economic situation affecting industries across Europe, all eyes are on reducing costs across the board, and for mobile operators this means reviewing spend involved in the initial planning stages of the network through to the training of customer-facing staff.

With the rise in popularity of the smartphone during 2008, consumers and business users are demanding improved mobile data speeds to access more content via mobile.  The race is on for mobile operators to boost data speeds by rolling out HSPA and LTE networks as soon as possible.  The first step in upgrading existing mobile networks is to gather sufficient network data to identify areas of high mobile penetration and to expose any areas lacking in coverage and capacity, before choosing which areas of the network require the most urgent upgrade work.

A common cost-effective approach to test the network is to seed drive test tools in business van fleets.  Drive test systems enable wireless operators to view their own and their competitors' wireless voice and data services from the perspective of the subscriber by providing critical quality-of-service (QoS) measurements. Network designers can then use portable test transmitters to verify optimal antenna positioning and as a low power source for testing the design and functionality of RF repeaters and base stations. This allows operators to limit infrastructure costs by identifying the correct products for network upgrades.
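As a flavour of what happens to the data those seeded tools collect, the sketch below bins hypothetical drive test samples into a coarse geographic grid and flags cells whose average signal falls below a threshold. The grid size, the -100 dBm threshold and the samples are illustrative assumptions, not a description of any vendor's tool.

from collections import defaultdict

samples = [                 # (lat, lon, signal strength in dBm) from a log
    (51.5012, -0.1241, -71), (51.5013, -0.1259, -95),
    (51.5101, -0.1198, -103), (51.5102, -0.1201, -108),
]

GRID = 0.01     # bin size in degrees (roughly 1 km at this latitude)
WEAK = -100     # dBm threshold marking poor coverage

cells = defaultdict(list)
for lat, lon, dbm in samples:
    cells[(round(lat / GRID), round(lon / GRID))].append(dbm)

for cell, readings in sorted(cells.items()):
    avg = sum(readings) / len(readings)
    flag = "upgrade candidate" if avg < WEAK else "ok"
    print(cell, f"avg {avg:.0f} dBm", flag)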

It has become widely accepted in the ICT industry that the correct method for analysing the cost of a vendor's products or services is to do a Total Cost of Ownership (TCO) analysis. Rather than focusing solely on price, buyers of ICT products and services must consider the additional, often hidden, costs of training, operating, managing and upgrading their purchases. Addressing the purchase price alone does not give the full picture.

TCO is more than the original cost of purchasing the system. We have found that more than 70 per cent of the TCO is involved in non-purchasing activities. A full analysis must include all direct and indirect costs associated with mobile network data gathering and drive test systems. Drive test systems have a typical life span of five years. At some institutions this life span may be more like ten years, but in both cases the older units are removed and abandoned as redundant because they cannot be used to test and measure the latest network infrastructure upgrades.

There are many factors and elements that make up the TCO for mobile network data gathering with drive test systems. Over the last ten years, the TCO for drive test tools has continued to increase due to technological advancements, drive test product limitations and increased Mobile Network Operator (MNO) competition. Those institutions that have already addressed and developed strategies and programming to reduce the cost of ownership of mobile network data gathering systems are now seeing benefits. Institutions that have not yet addressed this issue are probably not seeing a cost reduction. In fact, institutions and companies that have not addressed TCO are continuing to experience out-of-control cost increases for mobile network data gathering and drive test systems.

Sweeping changes and improvements in technology continue to challenge the mobile industry to reshape and redefine how best to deploy mobile network data gathering systems. Individual organisations will find that it can prove expensive to stay current unless they have a handle on what it takes to acquire, implement, and support drive test tools. By addressing the components that make up the TCO, an institution will be in a position to take full advantage of the latest innovations in mobile network data gathering techniques. It will become very difficult, even impossible, to implement an institutional mobile network data gathering and drive test methodology aimed at including HSPA results if an enterprise is using a bespoke, technology- and frequency-limited system.

Introducing the wrong drive test systems to your network can be very costly. Being aware of the TCO components is the first step in lowering your mobile network data gathering cost. We have found that limiting choices and setting standards are the best methods for starting to get control of your drive test systems cost. Whereas ensuring that all parties use a single type of system is usually the fastest way to bring mobile network data gathering and drive test systems costs under control, it is not always easy to implement when both individuals and group networks have developed enough expertise and knowledge to be able to specify and utilise their own drive test systems.

The implementation of "soft standards" - bringing significant economies of scale, simplified purchasing procedures and centralised training support - will work best in bringing the entire enterprise to accept a standard, limited choice. Nevertheless, that limited choice should still offer enough variety to cover the end user's requirements, including engineering (optimization and integration), special coverage groups (in-building and special coverage projects), marketing (benchmarking) and management (key network performance indices).
As already established, institutional TCO consists of more than simply the original purchase of hardware and software. We have defined seven different base elements that make up the cost components for drive test systems:

  • purchase price for all hardware and software
  • staff training costs
  • installation and implementation costs
  • support services and update costs
  • cost of required functional upgrades
  • technology upgrade costs
  • interoperability costs

Each of these base elements includes several types of expenditure.
The purchase price includes all direct and indirect purchases for a drive test system, namely the drive test tool hardware, software, supported data collection devices, and log file (output) manipulation. The price should also include warranties, extended warranties and maintenance agreements.

Training costs will include all direct and indirect expenditures for training activity required to effectively run the drive test system. Formal and informal training usually occurs with the installation of the drive test system. Costs and methods vary according to vendor.

Installation and implementation costs include all direct and indirect expenditures involved in ensuring that the system is installed correctly and meets an institution's standard operating procedures. This may vary from tools needed for hardware installation to server configuration to accommodate the storage and access of log files.

Support services costs include all staff costs incurred in providing adequate personnel support to the drive test system. This includes on-site technical support, as well as remote support via telephone, e-mail and the Internet. Installers, troubleshooters and skilled support staff are all involved in maintaining the system.

Functional change upgrade costs comprise both direct and indirect expenditures necessary to make ongoing changes to the drive test systems operation. This will allow the institution to increase its drive test efficiencies, including the deployment of the latest software updates, the addition of extra parameters, and the improvement of data displays.
Technology upgrade costs should take into account both the direct and indirect costs involved in acquiring new tools or upgrading the current system to be compatible with new mobile devices as well as the latest mobile technologies, e.g. CDMA 1x to EVDO or HSPA to LTE.
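Pulling the base elements together, TCO is in the end a disciplined sum. The sketch below uses entirely made-up figures, chosen only so that the non-purchase share lands near the "more than 70 per cent" observation made earlier; it is a rough cost model, not vendor pricing.

tco_elements = {                    # all figures hypothetical, in euros
    "purchase (hw/sw, warranties)":  40_000,
    "staff training":                12_000,
    "installation/implementation":    8_000,
    "support services and updates":  25_000,
    "functional upgrades":           15_000,
    "technology upgrades":           30_000,
    "interoperability":              10_000,
}

LIFE_SPAN_YEARS = 5                 # typical drive test system life span

total = sum(tco_elements.values())
purchase = tco_elements["purchase (hw/sw, warranties)"]
print(f"total TCO: EUR {total:,}")                        # EUR 140,000
print(f"non-purchase share: {1 - purchase / total:.0%}")  # ~71%
print(f"annualised cost: EUR {total / LIFE_SPAN_YEARS:,.0f} per year")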

Through a careful step-by-step consideration of each of the elements that constitute the TCO for a vendor's drive test system, mobile operators can reach an informed decision as to the cost effectiveness of a vendor's tool set.  Although wireless network data gathering comprises only one aspect of the network planning process, an accurate TCO evaluation for drive test systems is a great place to start in order to ensure maximum cost and performance efficiency across an institution's entire remit.  At a time when businesses need to evaluate every area of their spend in order to retain the highest possible competitive advantage in a saturated market, mobile operators cannot afford to base buying decisions solely on purchasing price, but must instead consider all aspects of TCO across their wireless networks.

Colin Garrett is Product Manager, Test and Measurement Systems, Andrew
www.andrew.com

This is the first in a series of columns focusing on issues surrounding the management of today's communications business models. For this debut effort, I thought I would talk about voice over IP and its impact on communications, or perhaps I should say the lack of it.

I read an interesting article recently that said voice over IP (VoIP) was stalling, even though not so many years ago it looked like it would sweep the board. In fact, VoIP usage appears to be declining; a recent report by the independent British communications regulator Ofcom says that only 14 per cent of broadband subscribers are even using the technology.
Adding fuel to the fire is the rumor floating around that eBay is looking to sell VoIP provider Skype, which it purchased in 2005 for over $2 billion. The article even quotes Skype's CEO as saying it's a great standalone business. Surely that's a big hint at what may be to come.
So this is quite an interesting turn of events we have on our hands. The reason I'm focusing on this bit of news that VoIP uptake appears to be waning is that back about 15 years ago when I was working with BT, I saw my first demo of the technology. I remember one of BT's board members being rather panicked and saying VoIP would kill off their business, and the world was coming to an end.

Obviously that never happened. But what has happened is pricing on traditional circuit-switched calls has become lower and lower in the past 15 years. Nowadays, most people have some sort of flat-rate fixed-line or mobile calling plan that's priced very aggressively. Sure, Skype-to-Skype calls are free, but today's consumer is interested in a lot more than just a free lunch.

Also, the convenience of VoIP just isn't there. With PC-to-PC calling, as with Skype, you're anchored to your PC and stuck at your desk. If all parties are using the same service, the call usually works as intended, but if you're on a raw IP connection, or someone is using the conventional phone network, all bets are off.

And contrary to the common perception that if it's free you can't complain about it, consumers are much more savvy, and demand that every form of communications they touch lives up to the high standards of the traditional PSTN.

Back when mobile phones were brand new, and the novelty of being able to call from the middle of a field or on the top of a hill still had a shine on it, people didn't really care if calls dropped or quality was poor. But after a while that novelty started to wane, and today you can get mobile service in tunnels, on trains and just about anywhere else with high call quality.
So we have lower priced traditional voice calls and customers who are demanding - and getting - higher quality of service. And that is exactly what the Internet has not been able to achieve in terms of voice.

It exposes the myth that people don't care about quality if something is free. And nowhere is voice call quality more of an issue than in the corporate world. Can you imagine Fortune 500 companies using a VoIP configuration that runs over the general Internet, where there is no packet priority and jitter and delay are common? The Internet is great for email, downloading video and anything else where it's not a huge deal if packets are sent and received out of order or with latency. But the inconvenience of having a VoIP call dropped or sounding like static just isn't cutting it in the corporate world.

I'm the furthest thing from a Luddite, but the call quality, the inconvenience of being stuck making calls from your PC and other factors are hindering VoIP's potential to be a voice communications game-changer.

Keith Willetts is Chairman and CEO, TM Forum
kwilletts@tmforum.org

A wide range of factors is driving mobile broadband demand as our lifestyles become increasingly digital. Howard Wilcox asks whether LTE is the natural future standard of choice

LTE is a global mobile broadband standard that is the natural development route for GSM/HSPA network operators and is also the next generation mobile broadband system for many CDMA operators. The overall aim of LTE is to improve capacity to cope with ever-increasing volumes of data traffic in the longer term. The key LTE objectives include:

  • Significantly increased peak data rates - up to 100 Mbps in the downlink and up to 50 Mbps in the uplink (see the worked example after this list)
  • Faster cell edge performance and reduced latency for better user experience
  • Reduced capex/opex via simple architecture, re-use of existing sites and multi-vendor sourcing
  • Wide range of terminals - in addition to mobile phones and laptops, many further devices, such as ultra-mobile PCs, gaming devices and cameras, will employ LTE embedded modules.
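
As a back-of-envelope worked example of those headline rates: the sketch below assumes peak, uncontended throughput, which real cells rarely sustain, and uses HSPA category 10 (14.4 Mbps) purely as a comparison point.

FILE_MB = 700                       # hypothetical video file
RATES_MBPS = {"LTE (peak)": 100, "HSPA cat 10 (peak)": 14.4}

for name, rate in RATES_MBPS.items():
    seconds = FILE_MB * 8 / rate    # megabits divided by megabits per second
    print(f"{name}: {seconds:.0f} s to fetch {FILE_MB} MB")

# LTE: ~56 s; HSPA: ~389 s - the order-of-magnitude gap the list above implies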

3GPP's core network has been undergoing SAE (System Architecture Evolution), optimising it for packet mode and for IMS (IP Multimedia Subsystem), which supports all access technologies. SAE is therefore the name given by 3GPP to the new all-IP packet core network required to support the LTE evolved radio access network (RAN): it has a flat network architecture based on evolution of the existing GSM/WCDMA core network. LTE and SAE together constitute 3GPP Release 8 and have been designed from the beginning to enable mass usage of any service that can be delivered over IP. The LTE RAN specification was completed at the end of 2008, with further work required to complete SAE by March 2009: this work is on track for completion of the full Release 8 standard at that time.

Beyond LTE to 4G
LTE is often quoted as a 4G mobile technology. However, at this point there is no agreed global definition of what is included in 4G: the ITU is establishing criteria for 4G (also known as IMT-Advanced) and will be assessing technologies for inclusion. The two next generation technology candidates are mobile WiMAX 802.16m (WiMAX Release 2) and LTE Advanced. Both these products will meet the IMT Advanced specification with, for example, up to 1 Gbit/s on the downlink at low mobility.

There is a wide range of factors driving mobile broadband demand as our lifestyles become increasingly digital.

Personal connectivity:  "Always On"
Anytime, anywhere connectivity as an overall concept is becoming a clear user expectation. The increase in connectivity is seen to be driving applications, user preferences and broadband demand, which in turn drives the demand for access. The demand for increased access is leading to bigger investments in mobile and broadband networks, in turn making access cheaper and supporting higher bandwidths and ubiquitous connectivity. As available bandwidth grows, so does the variety and sophistication of devices. As the volume of devices increases, prices become more attractive, so driving user demand. This completes a cycle of demand.

However, as shown below, each demand driver can equally impact any of the others, for example smarter devices clearly drive more sophisticated applications and services, whilst the knowledge that increased bandwidth is available means that more users are likely to demand services:

Economic stimulus
Fixed broadband already plays a vital part in developing the economy, connecting the population at large, businesses, and governments, and enabling commerce. Mobile broadband is also being driven by the need to provide broadband where it is not possible to easily, quickly and economically deliver fixed broadband, particularly in developing countries, but also in underserved or rural areas in developed countries.

Emerging mobile youth generation
The younger generation (particularly the under 18s but also the 18 to 30 age group) are the future employees and workers, as well as being the momentum behind popular applications such as social networking, gaming and music and the earliest adopters of ICT devices. They are also amongst the most skilled, innovative and fastest learning users of technology. These skills and expectations as users are derived not only from their mobile phones but from the increasing ubiquity of broadband at home, and the teen generation is highly likely to carry forward this level of expectation (and more) into adulthood.

New applications and services 
New applications and services (some of which may well be unknown now) are going to be key drivers of mobile broadband and faster and faster data rates. Aspects include: 

  • Growth of mobile commerce

Over the past 12 to 18 months there has been significant activity and growth in mobile payments (particularly digital and physical goods purchases) and mobile banking. In addition, these services and applications, along with contactless NFC, mobile money transfer, ticketing and coupons, are forecast to grow rapidly over the next five years.

  • Mobile web 2.0

Before long, anything you can do at your desktop, you will be able to do on the road with a laptop or other mobile device. Users want the same capabilities wherever they are located and however they are connected - as fixed, mobile or nomadic subscribers.
This means that mobile broadband will provide personalised, interactive applications and services, such as multiplayer gaming, social networking, and other video/multimedia applications: anytime and anywhere.  The meteoric rise of social networking sites and user-generated content has rekindled users' interest in accessing web-based services on the move.

The difference between current 3G applications and mobile broadband at the speeds envisaged is that LTE mobile broadband will enable greater user-generated content and uploading/downloading, along with person-to person connectivity.

  •  Portable video revolution

One application that is crucial to driving demand for mobile broadband is video. A variety of video applications can be offered, including video calling, streaming of video clips, live mobile TV, and video clip uploads and downloads (especially for sites such as YouTube, MySpace etc). Video clip downloads in particular are extremely popular. The demand to watch videos on the go has been ignited by the emergence of the video iPod, with similar devices following from other vendors.

  • Impact on network traffic growth

In January 2009 Cisco forecast that, globally, mobile data traffic will roughly double every year, increasing 66 times between 2008 and 2013. Mobile data traffic will grow at a CAGR of 131 per cent between 2008 and 2013, reaching over 2 exabytes per month by 2013. Confirming the paragraphs above, Cisco said that almost 64 per cent of the world's mobile traffic will be video by 2013. Mobile video will grow at a CAGR of 150 per cent between 2008 and 2013.
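The two Cisco figures are consistent with each other, as a quick compound-growth check shows:

cagr = 1.31                      # 131 per cent per year
years = 5                        # 2008 to 2013
growth = (1 + cagr) ** years
print(f"growth factor: {growth:.0f}x")     # ~66x

implied = 66 ** (1 / years) - 1            # CAGR implied by 66x growth
print(f"implied CAGR: {implied:.0%}")      # ~131%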

  • The need for mobility

Worldwide mobile subscribers have grown more than 15-fold over the last ten years, and actually surpassed the worldwide fixed line base in 2001-2002. Mobile subscriber density has shown strong growth ever since, while fixed line density has experienced low or no growth. Over the same period, the number of PCs has grown by a factor of nearly three, while the number of Internet users has grown more than 11-fold. Fixed lines are very much the poor relation, and in the last couple of years their number has begun to decline.

LTE market opportunity
There will be considerable change to the global mobile technology base over the next five years:

  • Subscribers in developed nations and regions will migrate upwards from 3G to existing mobile broadband such as HSPA
  • A limited number of high-end enterprise and consumer subscribers in developed nations and regions will then migrate further upwards to LTE
  • Developing nations and regions will see considerable growth in 2G and 2.5G as people and businesses seek first-time connectivity ahead of more sophisticated services, and sometimes instead of acquiring fixed network access
  • A limited number of high-end subscribers in developing nations will migrate towards newer generation technologies

Juniper Research forecasts that the LTE service revenue opportunity for mobile network operators will exceed $70bn pa by 2014, with the main regional markets in North America, Western Europe and the Far East & China.

This article is based on Juniper Research's report: LTE: The Future of Mobile Broadband 2009 - 2014.
Howard Wilcox is a Senior Analyst with Juniper Research.
www.juniperresearch.com

 

The recent focus on privacy issues surrounding behavioural advertising is only the tip of the iceberg, says Lynd Morley

European Telecoms Commissioner Viviane Reding has been placing the issue of privacy firmly on the communications agenda of late, and the subject has - particularly in the UK - been causing quite a stir. Even the British national press has been exercised about it - something of an unusual occurrence, given its usual propensity to fill pages with scandals more accessible and simpler to understand than the complexities of the gradual erosion of privacy now taking place.

The current fuss is largely due to the fact that the European Commission could pursue legal action against the UK Government, which has paid little attention to the Commission's concerns about the use of Phorm software to monitor users' Internet browsing habits without their consent.

The Phorm system - used, for instance, in a number of trials carried out by BT over its broadband network - offers a behavioural advertising facility, targeting adverts at users based on the types of sites they have visited. The catch, as far as the Commission is concerned, is that neither BT nor Phorm asked users' permission to gather and use this information.

The EU directive on privacy and electronic communications says, in essence, that member states must ensure the confidentiality of communications and related traffic data by prohibiting unlawful interception and surveillance unless the users concerned have consented to such activity.

Reding reinforces the sentiment in a recent statement, noting: "Europeans must have the right to control how their personal information is used. European privacy rules are crystal clear - your information can only be used with your prior consent."

Clearly, there should be considerable cause for concern in the UK - not only among its citizens whose rights to privacy under European directives are being ignored, but also in Government, which now risks legal action by the EU.

But while the Phorm affair has served to raise the profile (if only en passant) of privacy issues, it is by no means the only privacy concern to which Europe should be turning its attention. Reding has certainly pointed to other areas within communications technology that warrant close observation, including the significant amounts of data that social networking sites hold on their users, and the increasing use of RFID chips in a wide range of products.

And while the UK Government might fairly be accused of a certain laxity in its attitude to privacy issues, the country's Information Commissioner's Office (ICO) has been focussing attention on the sometimes complex requirements central to establishing effective information privacy practices. At the end of last year, for instance, the ICO issued an in-depth document on the subject - Privacy by design. Prepared by the Enterprise Privacy Group, the report is intended as a first step in the privacy by design programme, which aims to encourage public authorities and private organisations to ensure that, as information systems that hold personal information are developed, privacy concerns are identified and addressed from first principles.

The ICO noted in its introduction to the report: "The capacity of organisations to acquire and use our personal details has increased dramatically since our data protection laws were first passed. There is an ever-increasing amount of personal information collected and held about us as we go about our daily lives. Although we have seen a dramatic change in the capability of organisations to exploit modern technology that uses our information to deliver services, this has not been accompanied by a similar drive to develop new effective technical and procedural privacy safeguards."

Toby Stevens, Director of the Enterprise Privacy Group, notes that the barriers to successful adoption of privacy safeguards include not only an ongoing lack of awareness of privacy needs at executive management level within organisations - often driven by uncertainty about the potential commercial benefits of privacy-friendly practices - but also the fundamental conflict between privacy needs and the pressure to share personal information within and outside organisations.

"Addressing privacy issues at the start of systems development," he explains, "can have significant business benefits, and in some circumstances ensure that new ventures do not run into privacy problems that can severely delay time to market."
www.ico.gov.uk
www.privacygroup.org

    
