Features

There is a natural symbiosis between IPTV and advertising, argues Tony Hart.  He looks at what it might mean for telcos

When it comes to IPTV, two of the biggest challenges facing service providers are a) how to generate revenue and secure some kind of return on investment; and b) how to differentiate IPTV in markets where it is competing against other TV delivery platforms.  This is why more and more industry players are turning the spotlight on the role of advertising across IPTV networks.  In return, the flexible nature of IPTV promises to breathe new life into tired old TV advertising formats and help to halt declining TV ad revenue in some Western markets.

Targeted advertising... addressable advertising... personalised advertising... call it what you will, but this new approach to TV advertising could help to give consumers a more relevant experience, while at the same time helping to attract advertisers to this new delivery medium and generating revenue for the service providers involved.  Nor is this just hot air: in the past year, operators in Europe have already conducted addressable advertising campaigns across terrestrial TV channels, with further campaigns planned.

Annelise Berendt, of industry analyst firm Ovum, has previously gone on record saying:  "The IPTV platform offers advertisers the best of both worlds.  It offers the immersive and proven impact of traditional television with the added benefits of being able to enhance it with interactivity.  It also offers the addressability and accountability of advertising in the Internet world, enabling the targeting of individual homes, personalized advertising and measurement of an advertisement's impact."

Before we delve further into what addressable advertising is all about, let's be clear about the definition of IPTV being used here.  In this article, we are talking about IPTV in the sense of TV delivered across a private IP network to a subscriber's broadband access point, to then be viewed on a television set.  This kind of IPTV - already being delivered by the likes of BT, Orange (France Telecom) and Telefonica - is not web TV (or Internet TV, as some people call it), which, unlike IPTV, cannot guarantee the quality of service that viewers associate with TV.

There will be some overlap between IPTV and web TV (such as being able to access web pages associated with a specific TV programme from the TV screen).  Furthermore, the growth of web TV has changed the way we consume video content for ever, putting the viewer in charge of ‘when and where'. It is even possible to interact with content via social websites.

At the moment, many consumers in early markets typically receive IPTV services as part of a package, rather than actively demanding the technology.  After all, consumers are generally only interested in the content, not the delivery mechanism.  However, this is not enough, particularly for operators competing with traditional terrestrial, cable and satellite platforms.  As Ashley Highfield, Director, Future Media & Technology, BBC, has said on the corporation's web site: "The winners will be the IPTV aggregators who offer truly complementary, differentiated services to those which people can find on their TVs...what IP-delivered TV should be about are the things that traditional television struggles at: amplification, global distribution, rediscovery, engagement, collaboration, innovation and navigation." With IPTV, service providers have a massive opportunity to do just that: integrated services, more personalised content and user-generated content are all possibilities.

In addition, advertising on IPTV has the potential to be dramatically different from the ‘traditional' TV experience.  IPTV enables content to be targeted according to different factors, the first of which is geography. Because of the bi-directional nature of IP, it also becomes possible to discover viewing behaviour in ‘real time' without the consumer having to give away any potentially sensitive personal data. In this way the service provider can tell whether the household behind a particular IP address is consistently watching a certain genre of programmes.  This information can be used not only to offer certain kinds of content, but also to help service providers offer advertisers a means of delivering more relevant advertising.

For instance, a household that is clearly a regular consumer of holiday programmes - but never watches any children's TV - could be targeted to receive ads about holidays but de-selected from ads aimed at families.  Furthermore, if consumers also ‘opt in' to provide additional information themselves, then the profiling of ads could become even more detailed.
This tailored approach has advantages all round. It removes the danger of advertisers falling into a ‘spray and pray' approach to TV advertising.  ‘Frequency capping' can be used to ensure that a viewer only sees an ad a certain number of times, or to create serialised ads where viewers see ‘episodes' in sequence.  The same brand ad could have different sequences depending on the viewer (for instance, a city hatchback car with one message for young people, then a different message and visuals for an older audience).  In this way, viewers are less likely to skip ads, even when using PVRs.  Research from Nielsen has shown that while viewers do skip or fast-forward ads, ad-skipping habits vary according to the show or whether the programme is watched live or later.  When watching Survivor: China, just under 20 per cent of the 5.16 million ‘live' viewers ad-skipped, while of the 6.51 million who recorded and watched the programme up to three days later, 5.23 million did not ad-skip.
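
To make the mechanics concrete, here is a minimal, hypothetical sketch (in Python) of how frequency capping and profile-based ad selection might be expressed; the household profile, ad inventory, genre names and cap values are invented for illustration and do not describe any particular vendor's system.

```python
# Hypothetical sketch: choosing an ad for a household from an anonymous
# viewing profile, with frequency capping and serialised creatives.
from dataclasses import dataclass, field

@dataclass
class Ad:
    brand: str
    target_genres: set        # genres the ad is aimed at
    excluded_genres: set      # households watching these are de-selected
    episodes: list            # serialised creatives, shown in order
    frequency_cap: int        # maximum impressions per household

@dataclass
class Household:
    ip_address: str
    genre_counts: dict                               # genre -> programmes watched
    impressions: dict = field(default_factory=dict)  # brand -> impressions served

def eligible(ad: Ad, hh: Household) -> bool:
    watched = {g for g, n in hh.genre_counts.items() if n > 0}
    if watched & ad.excluded_genres:        # e.g. never show family ads here
        return False
    if not (watched & ad.target_genres):    # must match at least one target genre
        return False
    return hh.impressions.get(ad.brand, 0) < ad.frequency_cap

def next_creative(ad: Ad, hh: Household) -> str:
    """Return the next 'episode' of a serialised ad for this household."""
    shown = hh.impressions.get(ad.brand, 0)
    hh.impressions[ad.brand] = shown + 1
    return ad.episodes[min(shown, len(ad.episodes) - 1)]

# A household that regularly watches holiday programmes but no children's TV.
holiday_ad = Ad("SunTours", {"travel"}, {"children"}, ["ep1", "ep2", "ep3"], 3)
hh = Household("10.0.0.7", {"travel": 12, "children": 0})
if eligible(holiday_ad, hh):
    print(next_creative(holiday_ad, hh))    # -> ep1
```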

As far as IPTV operators are concerned, these new, more engaging formats can help to attract advertisers who are becoming increasingly disillusioned with the return on investment from TV advertising.  Addressable advertising means less wastage, which, apart from being more cost-effective for big brand advertisers, also brings TV promotion within the grasp of a whole new pool of businesses that would previously have found TV too unfocused and expensive.  Finally, the data on the viewing habits of different IP addresses can be used to create the basis for more sophisticated measurement tools.  Moreover, the same ad avail (or timeslot) can be targeted to different viewers and, most importantly, sold to different advertisers, making this a highly attractive business model for broadcasters.

Of course, the viewers themselves also benefit from seeing more relevant and hopefully more enjoyable advertising.  If consumers are presented with more relevant advertising then they are more likely to accept its presence outside of traditional TV.  This in turn makes advertising a more valuable revenue stream for service providers. For example, many mobile TV services are expected to be loss leaders and offered to consumers for free.  If the operators behind mobile TV can hope to recoup some of their investment through advertising, then their business cases are in better shape.

Some IPTV operators, broadcasters and advertising agencies have already been exploring addressable advertising opportunities.  The world's first targeted TV advertising campaign over IPTV took place in late 2007, involving UK broadcaster Channel 4, the UK IPTV network Inuk, media agency Mediacom and Packet Vision.  Using an ad from an existing financial sector client, the campaign ran daily on Channel 4 for two weeks.  It was specifically targeted at university students across the UK, so that during the same 40 seconds in which the ad spot ran, students saw an ad from a different brand to the rest of the general viewing population.  The targeting was made possible by installing Packet Vision's IPTV advertising solution, the PV1000 (which comprises services, software and hardware, including a single rack-mounted unit in the telco's network providing splicing, routing, ad insertion and management features), within the Inuk Freewire service, an IPTV network that provides triple play services to universities across the UK.

Rhys McLachlan, Head of Broadcasting Implementation at Mediacom, said at the time: "We've delivered a pure targeted campaign for a client through television advertising on terrestrial broadcasting for the first time.  Packet Vision offers an opportunity for advertisers wanting to reach a specific demographic without screening copy to viewers who fall outside of the intended audience.  We also see advertisers with restricted budgets using this service on a regional basis for the delivery of cost-effective campaigns."

Another example, in the summer of 2008, involved Channel 4 via Inuk again, but this time with Dollond & Aitchison, one of the UK's leading eyewear providers, and its agency Arena BLM.  Arena BLM was keen to exploit the targeting potential of IPTV for its client.  It booked a campaign, featuring a D&A lifeguard saving a woman drowning in a sea of glasses, to run on Channel 4 and to be seen only by students.  Caroline Binfield, Business Director of Television at Arena BLM, said: "This innovative technology allows our client to target niche and highly relevant audiences, which will drive improved efficiency of the advertising campaign."

These early experiences are just the beginning.  To earn its place, IPTV cannot just be yet another ‘me too' delivery vehicle: it has to offer something different, as well as make money for its stakeholders.  Addressable advertising could be the key to making IPTV a truly profitable medium.

Tony Hart is Business Development Manager with Packet Vision
www.packetvision.com

How can operators best manage and monetise the capacity demands of Internet TV?  Jonathon Gordon takes a look

There has been much controversy over who should foot the bill for over-the-top Internet TV services such as the BBC iPlayer, ITV's catch-up TV and Channel 4oD, as well as non-broadcaster user-generated content channels such as YouTube and Joost. The rise in popularity of these bandwidth-intensive services seems to know no bounds. According to Ofcom, online viewing has doubled over the past year from 1.57 million to 2.96 million, with one in nine UK homes, for instance, now tuning in via their PC. Our viewing habits are changing and we now demand that footage be available to view, perhaps repeatedly, anytime, anywhere. This trend is even seeing PVRs (personal video recorders) and home media centres equipped with Internet connectivity. In such numbers, video services consume vast amounts of broadband capacity - so much so that there is a real danger of their popularity threatening the deployment of next generation networks.

Why? Because the way these services are monetised means there is little payback for the operator. Research from Telco 2.0 looking at the impact of the BBC's iPlayer on UK ISP Plusnet found that costs have gone up 200 per cent, from 6.1p to 18.3p per user, because the ISP needs to buy more capacity but sees no additional revenue. And it's not just the operator who gets the short end of the stick. Users are also being short-changed, given that three hours' viewing of Joost, a P2P video service devised by the same founders as Skype, would use up the 1GB monthly allowance allotted to most subscribers. Content providers too have a vested interest in how this content is delivered. Poor delivery means their service may alienate viewers. Internet TV content is highly susceptible to latency, delay and jitter caused by fluctuating contention rates, and the emergence of high definition footage, which can consume up to 75MB per minute when streaming at 10Mbps, is likely to further exacerbate the problem.
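
As a rough illustration of the arithmetic behind these figures, the short calculation below (in Python) converts a streaming bit rate into data consumed per minute and hours of viewing per allowance; the bit rates used are indicative assumptions, not measurements of any particular service.

```python
# Back-of-the-envelope data consumption for streamed video.
# Bit rates here are illustrative assumptions, not measured values.

def megabytes_per_minute(bitrate_mbps: float) -> float:
    """Convert a bit rate in Mbit/s into megabytes consumed per minute."""
    return bitrate_mbps * 60 / 8

def hours_until_cap(bitrate_mbps: float, cap_gb: float) -> float:
    """Hours of continuous viewing before a monthly allowance is used up."""
    mb_per_hour = megabytes_per_minute(bitrate_mbps) * 60
    return cap_gb * 1000 / mb_per_hour

print(megabytes_per_minute(10))    # HD stream at 10 Mbps -> 75.0 MB per minute
print(hours_until_cap(0.75, 1.0))  # ~0.75 Mbps SD stream -> roughly 3 hours per 1 GB cap
```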

So should the service provider, content provider or end user foot the bill? At the moment, the jury is out on what new business models will emerge. Traditionally, a linear model has seen consumers pay for access to content and distributors paying the content provider with additional revenue generated from advertising. The operator is left out of the equation based on the assumption that the fee users pay for their broadband connection will cover consumption. In reality there is little correlation between content and the cost of delivery. iTunes, for example, charges a dollar per MP3 download and around five dollars for a movie even though the latter is 100 times greater in size. Clearly operators can't charge a corresponding amount for Internet video yet at the other extreme, user-generated content is available for free. There has to be a middle ground.

Internet TV consumption monopolises resources and, should the current situation continue unabated, some argue that the operator will be unable to cope with the operational expenditure, ruling out network upgrades and stymying technological advancement. New business models, therefore, have to emerge. Perhaps the ISP will partner with legitimate content providers, or it may decide to steer clear of any content-related activity altogether, choosing to act as an access-only conduit but with payment reflecting the level of access. Regardless of the model that evolves, there is an imperative for the content chain and access provider to work more closely together to ensure Quality of Experience (QoE) for the user.

Content owners use Content Delivery Networks (CDNs), which sometimes use P2P file sharing technologies, to distribute content. P2P is valuable in improving QoE because this distributed storage and retrieval mechanism improves speed, minimises the load on servers and is a cheap and scalable means of distribution. These systems can be used to help operators prevent congestion. In return, the operator can assist the content owner by using caching technology to prevent service degradation.

Caching provides an ideal opportunity for the operator to add value. It works by storing popular content in close proximity to the user, allowing the operator to ensure content is delivered effectively while also meeting the needs of many users at once. But caching should only be seen as part of the solution because it tends to focus on specific traffic types rather than addressing traffic volume as a whole. What's more, as more content is made available, viewing is likely to fragment, making it more difficult to store the most popular content on the cache and improve the QoE.

The operator needs to be able to factor in off-net issues and the total bandwidth available, requiring caching to be supplemented by another technology capable of traffic management. Deep Packet Inspection (DPI) service optimisation is the ideal partner, as this technology is capable of peering into the traffic stream to determine the signature of applications whilst also monitoring bandwidth consumption. It's a versatile technology, as the operator can use it to passively monitor the network and capacity consumption or to take more assertive action. For instance, the operator could decide to prioritise video applications across the network, allowing the operator to guarantee Quality of Service (QoS) on this type of traffic. It's easy to see that this type of guaranteed service would appeal to Internet viewers, allowing the operator to market it or simply use it as a differentiator.

Network operators who favour the proactive approach can use DPI to establish network congestion policies. These allocate bandwidth according to the traffic category, either boosting or limiting the capacity. Traffic can be prioritised according to its content and congestion managed according to these categories. As a consequence, the operator can better utilise network resources, conserving bandwidth and postponing the need for frequent network upgrades. If the DPI device is subscriber aware, it can take into account any subscriber SLAs and compare these with the application categories dictated in the traffic management policy. DPI devices that are both content and subscriber aware can inform the creation of tiered service packages and be used to tweak these should usage patterns alter.
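
The sketch below illustrates, in simplified form, the kind of subscriber-aware congestion policy described above: traffic categories mapped to priorities and bandwidth shares, adjusted by the subscriber's service tier. The category names, tiers and limits are assumptions made for illustration and are not drawn from any specific DPI product.

```python
# Simplified, hypothetical subscriber-aware congestion policy.

CATEGORY_POLICY = {
    # category   (priority, share of link capacity under congestion)
    "video":     (1, 0.50),
    "voip":      (1, 0.10),
    "web":       (2, 0.25),
    "p2p":       (3, 0.15),
}

TIER_MULTIPLIER = {"premium": 1.5, "standard": 1.0, "basic": 0.5}

def allowed_rate_kbps(category: str, tier: str,
                      link_capacity_kbps: int, congested: bool) -> float:
    """Rate allowed for one traffic category, given the subscriber's tier."""
    _priority, share = CATEGORY_POLICY.get(category, (3, 0.05))
    if not congested:
        return float(link_capacity_kbps)   # no shaping while the link is idle
    return link_capacity_kbps * share * TIER_MULTIPLIER.get(tier, 1.0)

# Video for a premium subscriber on a congested 10 Mbps link.
print(allowed_rate_kbps("video", "premium", 10_000, congested=True))  # 7500.0
```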

When used in league with caching, DPI traffic management can prioritise the passage of specified traffic across the network, reducing the delays associated with multimedia content buffering. When the subscriber requests content, this communication is recognised in real time by a blade housed on the DPI device and the request is redirected to the caching engine. If the content is already housed on the cache, it is streamed directly to the subscriber from the cache via the DPI device. Alternatively, the cache can retrieve the content over the Internet, whether it is housed on CDN servers or P2P nodes, a request again routed through the DPI device. Regardless of the source, all content is managed by the DPI traffic management system to prevent congestion. DPI is also capable of prioritising this traffic and, if used with service gateway functionality, can also subject it to filtering. The Service Gateway essentially allows the DPI device to interface with value-added systems that provide security control or URL filtering in order to carry out different rule sets. A Service Gateway DPI device unifies the operator's billing, subscriber management and provisioning systems, acting as a central point of management and a one-stop shop that combines data to assess service utilisation. In an Internet TV context, it can carry out pre-processing or post-processing of the content flow, allowing it to perform harmful content filtering, for example.
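
The fragment below is a minimal sketch of the request flow just described, assuming entirely hypothetical interfaces for the DPI blade, the cache and the upstream CDN/P2P fetch; it is meant only to show the order of decisions (filter, cache lookup or fill, prioritise), not any vendor's actual API.

```python
# Hypothetical request flow: a DPI blade redirects a content request to a cache.

class Cache:
    def __init__(self):
        self.store = {}                        # content_id -> bytes

    def get_or_fetch(self, content_id, fetch_upstream):
        if content_id not in self.store:       # cache miss: pull from CDN or P2P
            self.store[content_id] = fetch_upstream(content_id)
        return self.store[content_id]

def fetch_from_internet(content_id):
    """Stand-in for retrieving content from CDN servers or P2P nodes."""
    return b"video-bytes-for-" + content_id.encode()

def handle_request(content_id, subscriber, cache, prioritise, filter_ok):
    if not filter_ok(content_id, subscriber):  # optional service-gateway filtering
        return None
    data = cache.get_or_fetch(content_id, fetch_from_internet)
    prioritise(subscriber, "video")            # DPI marks the flow for priority
    return data

cache = Cache()
payload = handle_request("ep42", "subscriber-17", cache,
                         prioritise=lambda sub, cat: None,
                         filter_ok=lambda cid, sub: True)
print(len(payload))                            # served from the cache on repeat requests
```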

As well as governing the network, a DPI traffic management solution can also be used as a customer-facing tool. It is able to set quotas for individual subscribers without showing a bias towards any one particular content provider. Quota Management can be used to differentiate between video traffic, VoIP traffic and web traffic, using volume usage quotas to provision and enforce customised service plans. The DPI device collects information and allocates usage per subscriber, meters actual service consumption, and adjusts QoS according to content, volume and time elapsed, or any combination of these parameters. When the allocated quota is reached, the operator can choose to redirect the customer to a portal where they can "refuel" their quota or change their service plan. Quota Management ensures the user's access to high quality content is protected and allows the operator to manage bandwidth resources.
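
To illustrate the quota mechanism, here is a small hypothetical sketch: per-subscriber volume counters kept per traffic class, with a redirect to a top-up portal once a quota is exhausted. The quota sizes and class names are assumptions chosen for illustration.

```python
# Hypothetical per-subscriber quota metering with a top-up redirect.

QUOTAS_MB = {"video": 2048, "voip": 512, "web": 1024}   # illustrative service plan

class QuotaManager:
    def __init__(self, quotas=QUOTAS_MB):
        self.quotas = dict(quotas)
        self.used = {cls: 0 for cls in quotas}

    def record(self, traffic_class: str, megabytes: int) -> str:
        """Meter consumption; return 'ok' or 'redirect-to-portal'."""
        self.used[traffic_class] += megabytes
        if self.used[traffic_class] >= self.quotas[traffic_class]:
            return "redirect-to-portal"        # subscriber can refuel or change plan
        return "ok"

qm = QuotaManager()
print(qm.record("video", 1500))  # ok
print(qm.record("video", 600))   # redirect-to-portal (2100 MB >= 2048 MB quota)
```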

In essence, DPI traffic management finally makes the operator part of the Internet TV value chain. A unified, open platform DPI traffic management device with Service Gateway and video caching capabilities optimises network resources and prioritises traffic according to the nature of the content. As a consequence, Internet TV traffic is retrieved and delivered in as timely and bandwidth-efficient a way as possible, while also allowing new subscriber-led service plans to be developed.

For the viewer, timing is everything. Uninterrupted access will become as necessary to the survival and proliferation of Internet TV as always-on broadband has been to the web. So operators need to turn on, tune in or drop out.

Jonathon Gordon is Director of Marketing, Allot Communications, and can be contacted via
jgordon@allot.com

Only a few years ago startups such as Skype and VoiceBuster gave the telecom industry a real scare by allowing people to make telephone calls over the Internet free of charge. They forced telecom operators to cannibalize their traditional voice revenues with VoIP services - one of the reasons the telcos started investing in IPTV.  Now there is a new threat to the telcos in the form of Internet TV. Will new players again traumatize the telecom industry by undermining the telcos' nascent IPTV services, asks Rob van den Dam

Many telecom operators are investing in digital content in the hope of offsetting the fall in fixed-voice revenues. They focus primarily on offering television and video services, in particular IPTV; many of them see this as a necessity to combat the trend of losing subscribers to cable companies, which are increasingly offering VoIP as part of triple-play bundles.

But new Internet developments again pose a threat to telecom operators. The Internet has already caused a transformation in the telecom industry in the domain of communication services, where new players such as Skype have forced telecom companies to offer VoIP at substantially lower prices than they previously charged for traditional voice services over the fixed network. And now the Internet is enabling Internet TV start-ups to threaten the telcos' nascent IPTV services. Today Internet video is still delivered in rather low quality via sites like YouTube. In spite of grainy images and the small window format, however, these sites have been successful in attracting millions of viewers on a regular basis. And as broadband becomes faster and available to a broader public, they will be able to offer professional video services with continually improving image quality, in this way providing an alternative to IPTV.

IPTV is a system in which video content is transmitted in the form of IP data packets over a closed, secure network. The infrastructure is configured such that viewers can only receive the IPTV provider's own TV channels. IPTV focuses primarily on the TV set in the living room, generally a wide-screen TV with high image quality. A set-top box (STB) is required to receive the signal. IPTV telecom operators are uniquely placed to enhance the television experience:

  • They can augment their IPTV offerings with a wide variety of voice and data services.
  • They are well placed to combine IPTV on the TV with the other screens: the PC and the mobile.
  • They also have a lot of information about the viewer that they can use to deliver personalised content and advertising.
  • And last, but certainly not least, they are able to guarantee a high-quality end-to-end television experience.

Currently, most IPTV services are based on subscriptions and video-on-demand charges.
Up to now, many IPTV operators have focused on offering the same services - the same TV channels and type of content - as their competitors, usually the cable companies. But some telcos have taken it further. For example, Belgacom in Belgium is competing primarily on exclusive sports content. Other operators compete on offering ease of use, for example by offering an Electronic Programme Guide (EPG) that allows individual users in the household to set up their own TV guide with their own favourite programs and settings. These operators are taking optimum advantage of the possibilities that IPTV offers with regard to personalization and interactivity. There are numerous ways to compete, and each IPTV provider has its own strategy.

Internet TV has the "look and feel" of IPTV but is delivered over the open public Internet, "over the top" (OTT) of existing networks - effectively getting a free ride. Internet TV is usually delivered to the PC or another device connected to the Internet, using peer-to-peer technology. Internet TV offers the OTT providers the following advantages:

  • They do not have to invest in distribution networks because they use the telecom and cable companies' networks.
  • They offer the same type of interactivity and viewing capabilities as IPTV.
  • They have global coverage.

However in contrast to IPTV:

  • There are still issues with the video quality, though it is continually improving.
  • Users really need some technical know-how to use it properly.

Internet TV is not a controlled environment. There are no guarantees regarding accessibility, availability and reliability. There is no control over who is allowed to watch which programs and under what circumstances, such as those related to distribution rights in different countries.
Internet TV providers offer programs for free; revenue is primarily based on advertising. Obviously, Internet TV is still in the embryonic phase. There are a number of players attempting to create a market for themselves. Joost is the most well known. Joost comes from the developers of the music-sharing program Kazaa and the VoIP service Skype, developments which severely traumatized the music recording industry and the telecom sector, respectively. Joost only distributes professionally made content, and shares advertising revenue with the content providers. While Joost is focusing on a large public, Babelgum focuses on specific target groups by offering niche content via a large selection of theme channels. Hulu, the online video project from News Corp and NBC Universal, began its offering to the US public in early 2008. Other providers include Narrowstep and JumpTV.
Variations on Internet TV include the BBC's iPlayer and Apple TV. iPlayer is an on-demand TV service enabling users to view BBC programs via the Internet. Apple TV uses an STB that makes it possible to stream digital content from any computer with iTunes to a widescreen high-definition TV set. This enables viewers to send videos, TV programs, music, YouTube videos and other Internet material from the computer to their TV set, or to save them on the STB hard disk. And if Microsoft has its way, users will soon be able to connect their TV to a Windows Media Centre PC or an Xbox 360 using Microsoft's Media Centre Services to get their daily diet of TV programs.

Altogether, this is more than enough of a threat for the telcos, who have spent large amounts of money building and launching their own IPTV services. They are understandably worried that OTT providers will ultimately capture all the value that video-over-IP promises. In that case, they would be left with nothing to offer but the so-called ‘dumb pipe'.
Clashes appear to be unavoidable. IPTV and OTT providers will certainly be confronting one another in the domains of distribution and advertising.

In terms of the first point, the OTT providers shift the distribution problem to the owners of the networks. IPTV providers invest heavily in upgrading their networks for their own IPTV services; now they must handle the OTT traffic as well, which means additional investments. In fact, incumbent telecom companies face a unique dilemma: as they increase their broadband capacity, they make it easier for OTT providers to deliver the quality of service that is required for professional TV broadcasting. Of course, that will not be acceptable to the telecom companies. They can respond in different ways:

  • Filter the OTT traffic, possibly block specific traffic, and offer higher distribution priority and quality to parties who are willing to pay (more). However, throttling OTT traffic is controversial because it violates so-called net-neutrality principles, i.e. blocking other parties' traffic to give the operator's own services precedence. This could lead to intervention by government regulators.
  • Find a way to insert themselves into the relationships between the OTT providers and their customers, and make agreements regarding the charge-through of the distribution costs, either to the OTT provider or its customers.
  • Open their IPTV platforms to OTT content by making services from OTT suppliers available as separate IPTV channels. This would allow the operators to bring in extra revenue.

In terms of the second point, advertising, ultimately it all revolves around advertising relationships and the possibilities the Internet offers for more efficient targeting. Internet television is currently funded entirely by advertising. IPTV advertising will also become increasingly important for telcos to fund their content, as customers do not expect to pay for all content. In 2007, IBM's Institute for Business Value conducted a consumer survey to evaluate changes in consumers' media behaviour. A number of questions related to advertising; the results indicated that in all the countries involved the majority of those surveyed were willing to view advertising before or after a good quality, free video broadcast.
 
In the battle for advertising funds, both parties offer good possibilities for efficient and effective advertising, better than the traditional TV providers. But the telcos seem to have the best assets - assets that advertisers really value. First of all, with their network capabilities telcos are better able to control where the ads go and to track advertising effectiveness. Telcos collect vast quantities of customer data, which they can use to develop profiles of their subscribers, including viewing patterns and perhaps shopping habits. They can combine these customer insights with their ability to identify the location of individual users and offer highly targeted, localised promotions. Integrated telcos can even combine data collected from fixed, wireless and other networks. They are well placed to enable the advertising experience practically anywhere, on any device and at any time.
Over the short term, Internet TV does not represent a real threat to the IPTV providers. IPTV has a clear opportunity to establish a strong position in this market before the problems regarding image quality and the ease of use of Internet TV are resolved. But after that, the situation may change. In particular, once Internet TV moves to the TV screen, it could pose a bigger threat to IPTV.

It is all about getting Internet video onto the TV screen. The Apple TV initiative mentioned earlier illustrates this. More and more consumer electronics manufacturers are working on equipping TV sets with built-in Internet access. Sony, for example, is working on rolling out a network adapter for showing web clips on its HDTVs. It is only a question of time before Internet access is a standard feature built into the TV set - an essential milestone from the consumer's perspective, as it makes things a lot easier.
On the other hand, a partnership between IPTV and OTT suppliers is not unlikely:

  • Telcos can make OTT content available as part of their IPTV services.
  • OTT providers can profit from the IPTV providers' "walled garden", which gives them a better guarantee in terms of quality, control over the distribution, and feedback with regard to volumes, viewing times and viewer behaviour.
  • Telcos can use the OTT channel to collect additional customer data regarding consumers' viewing habits, improving targeted advertising.

In fact, we are already seeing this type of initiative. Some telecom companies are bringing the Internet TV players into their own IPTV environment, such as Verizon with YouTube and BT with Podshow. They offer, as it were, an extension of their closed IPTV environment. Many providers will offer their own Internet TV in parallel to this, possibly geared to other customer segments, optimally utilizing brand recognition, relationships and distribution of content across both channels. BT Vision, with its IPTV platform and a web portal offering a download archive of on-demand content and the purchase of physical DVDs, is one example of this.

OTT Internet TV is currently seen as a marginal threat to the IPTV providers. But as bandwidth and QoS become less of an issue, the OTT providers will increasingly develop into mature TV suppliers of online live HD programming. Joost, Hulu, Babelgum and others are most likely just the tip of the iceberg. More of these types of companies will emerge. They will get funding and then fight for customers and advertisers. In the end, it will come down to finding a solid business model. At the same time, IPTV will mature, finding the right ways and approaches to be successful. There is probably room for both IPTV and Internet TV, each addressing a particular consumer segment, and the possibility of some sort of partnership is certainly there.

Rob van den Dam is European telecom leader of the IBM Institute for Business Value and can be contacted via: rob_vandendam@nl.ibm.com

At heart all healthy businesses are trying to do the same thing, says David Ollerhead

Linguists today think that all languages have the same purpose and deep structure. Basically linguists believe that all languages are at heart doing the same thing. This appears to be true of healthy businesses too.

All healthy businesses have the same purpose: to grow and maximise profitability within the markets in which they are operating. There's plenty of practical empirical evidence to suggest that healthy businesses also have a great deal in common in their structures and the way they organise their activities.

Management skills, after all, are widely regarded as transferable between different vertical sectors. Senior executives tend to be recruited (or appointed to Boards) based on their success in roles where it is their positive impact on a particular organisation that matters rather than the sector in which the organisation operates. This suggests that healthy businesses have in common organic things which good managers can consistently nurture and develop, whatever the nature of the vertical sector where the business operates.
Similarly, university and business school courses focus on management skills in a general sense. ‘Serial entrepreneurs' are, by definition, fabled for their expertise at forming, growing and then selling businesses in a wide variety of sectors. Indeed, the very existence of management consultants who are geared to consulting in any sector where managers need assistance or guidance is perhaps the most decisive evidence of all that ‘management skill' is a tangible, discrete and specific thing which is basically sector-independent.

Further evidence that healthy businesses are all doing much the same thing is found in how brands operate. Major brands positively exult in their ability to win a presence in markets that on the face of it are disparate but in practice tend to become linked when a brand successfully establishes a loyal, enthusiastic, customer base.

Taking two examples, the Virgin brand (including music, travel, publishing, communications, financial services and soft drinks) has come to be associated with fun, youthfulness, value for money and Richard Branson, while the Saga brand (including travel, publishing, financial services) is seen by many adherents as signifying reliability, good quality, and a square deal for the over-50s. Brand-loyal customers willing to buy from more than one and very possibly all the different businesses under one particular brand obviously feel that the brand is more important than what's being sold.

The science of linguistics, which originated the idea that, deep down, all languages are the same, is fascinating, but ultimately it is simply an academic pursuit. Business, on the other hand, powers the world's wealth and is the source for most people of their income and economic security. Big-picture conclusions about business and how it works consequently have massive implications for all of us.

The route to growing and maximising profit is to sell more products or services to more customers, given that neither the business nor its customers will want there to be any negative changes in the quality of the products or services being delivered. Equally importantly, in the case of a service, the business will not want customers to be over-serviced, which will increase the quality of what is being supplied but make supplying it much less profitable. The organisation will also want to sell more things to more customers without disproportionately increasing the time taken to supply what is being sold.

For healthy businesses, a melodious and useful mantra is: ‘Revenue is vanity, profit is sanity, cash-flow is key'. Chasing revenue for its own sake makes no sense if the revenue does not come accompanied by a healthy profit and a correspondingly healthy and positive cash-flow.  Above all, it makes no sense for a business to succeed in its aim of selling more products or services to more customers unless the business can do so without disproportionately increasing the cost of supplying what is being sold. Similarly, the business will want to avoid disproportionately reducing the prices of what is being sold. Selling more things to more customers by slashing the price (such as through a ‘buy one get one free' offer) can easily reduce profit and so be self-defeating.

Within the constraints of these qualifications a healthy business's aims are clear. All healthy businesses are trying to sell more things to more customers without:

  • compromising the need for the business to supply products and services to the required (rather than excessive) level of quality
  • incurring costs that make supplying the products and services unprofitable
  • reducing prices to a level where supplying the product or service becomes unprofitable.

So, how does a healthy business achieve these vital objectives?  Ultimately, the very nature of what a healthy business actually is suggests there can only be one answer to this question. The only way for a business to sell more products and services to more customers is to have a total focus on its customers. The fact that this answer, baldly stated, sounds straightforward does not make it any easier to achieve, or lessen its importance.
The first challenge in achieving this vital customer focus is knowing who your customers are, which includes your existing customers (i.e. the ones you've won already) and also your potential customers (i.e. the ones you could win).

The second challenge is knowing what your existing and potential customers need, at least in the context of what you are able to sell to them. This challenge may well be more difficult than knowing who your existing and potential customers actually are, but mastering this second challenge is vital to your success, because until you truly understand what your customers need, it is always possible that:

  • you might be offering customers things that they don't actually want, or that not enough customers want
  • you might be focusing on irrelevant issues (e.g. cost-discounting things customers don't really want) instead of getting to grips with finding out what customers do want
  • you might start improving areas of your business that have no ultimate effect on customers and the improvement of which will therefore not lead to you selling more things to more customers.

The third challenge, once you know what your customers do want from you, is to work out how you can meet these needs by profitably producing goods and services as efficiently as possible.

The fourth challenge is the need to commit yourself to ensuring that your responses to the first three challenges are subjected to a continual state of interrogation that involves making sure your responses are undergoing a continual state of improvement.
The four challenges are fairly easily stated but by no means easy to meet. They involve, above all, establishing and maintaining a focus on your customers rather than on internal matters at the business or on your own personal concerns. But businesses that really do rise to the challenges - businesses that become, in effect, experts at focusing on customer needs - can enjoy prodigious success.

Once you do know who your customers are and what they want from you, one particularly potent way to ensure that your business is really focused around their needs and meeting those needs with maximum efficiency, is to look hard at your business's processes.
In business, a process is a series of steps that produces a specified deliverable to meet a customer need.  This definition is precise: the steps of the activity must actually meet customer needs (or, for organisations that have several processes, the needs of different customers) successfully. A series of steps that doesn't meet customer needs can't properly be regarded as a process, or at least not an effective one.

Whatever the precise nature of the process or processes a business carries out, the very fact that process is actually defined in terms of delivering a benefit to customers leaves no doubt that a business's process or processes lie not only at the heart of the business but are the heart of the business.

And make no mistake: all good businesses will have a healthy heart whose pumping creates maximum profit for you, and maximum satisfaction for your customers.

David Ollerhead is head of consulting within the Professional Services Group at Airwave Solutions Limited, and can be contacted at david.ollerhead@airwavesolutions.co.uk
www.airwavesolutions.co.uk


In this issue, guest commentator Axel Pawlik, CEO of Regional Internet Registry RIPE NCC, discusses the vital role of IPv6 in the continued development of the Internet

The Internet industry is running out of IPv4 addresses. At some point, probably about three years from now, IANA (the Internet Assigned Numbers Authority), the body responsible for the top-level distribution of IP addresses, will hand out the last unallocated IPv4 addresses. The exact ramifications of this are currently the subject of much discussion and debate, but it is clear that IPv6, the new generation of IP protocol, is vital to the continued growth and development of the Internet. Ensuring that IPv6 is efficiently and effectively deployed is therefore the major challenge facing the Internet today.

As the IPv4 exhaustion date approaches, IP addressing is a global concern. Already, 180 of the 256 "/8" IPv4 address blocks have been allocated, and of the remaining 76, 35 are reserved for the Internet Engineering Task Force (IETF). The remaining 41 blocks are held by the IANA for future allocation to the Regional Internet Registries (RIRs). The RIRs, in turn, distribute addresses to ISPs and other users in their respective regions.

The original plan was that as the Internet grew and the IPv4 address pool was depleted, the new protocol, IPv6, would be deployed. According to this plan, the deployment of IPv6 would be complete long before the last IPv4 address was allocated.

This has not happened, and IPv6 deployment activity has, up to this point, been minimal. This is now a real problem, because the transition to IPv6 will now have to take place in that grey area beyond the exhaustion of the IPv4 address pool. Therefore, as the Internet continues to grow, network operators will have to "dual stack", or run both IPv4 and IPv6 simultaneously.
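
In practice, "dual stack" simply means that a host or service is reachable over both protocols at once. The fragment below is a minimal Python illustration of opening listening sockets on IPv4 and IPv6 side by side; it is a toy example rather than operator-grade configuration.

```python
# Minimal illustration of dual stack: listen on IPv4 and IPv6 simultaneously.
import socket

def dual_stack_listeners(port: int):
    v4 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    v4.bind(("0.0.0.0", port))
    v4.listen()

    v6 = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    # Keep this socket IPv6-only; the IPv4 socket above handles IPv4 clients.
    v6.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
    v6.bind(("::", port))
    v6.listen()
    return v4, v6

if __name__ == "__main__":
    sockets = dual_stack_listeners(8080)
    print([s.family.name for s in sockets])  # ['AF_INET', 'AF_INET6']
```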

The pace of Internet growth, however, may pose problems for this plan. The RIRs currently allocate approximately 268 million unique addresses every year. This is even as the use of NAT (Network Address Translation, a technology that allows many devices to use the same address) increases. If there are no IPv4 addresses for new users, how will new networks be able to implement dual stack?

Looking at the data collected by the five RIRs, there are some encouraging signs. But when we compare the amount of IPv6 that is actually being routed on the Internet to the amount of routed IPv4 address space, that optimism begins to seem a little misplaced. There are simply not many IPv6 addresses currently on the Internet.

During 2007 and the early part of 2008, however, there has been a significant rise in IPv6 uptake. This is certainly a cause for increased optimism, but the fact remains that, with only three years until the exhaustion of the IPv4 address pool, a dual stack Internet is inevitable, and IT Directors are going to have to be creative.

Ultimately, the RIRs urge that the widespread deployment of IPv6 be made a high priority by all stakeholders.

Governments are key players in Internet growth and we urge them to play their part in the deployment of IPv6, and in particular to lead by example in making content available over the IPv6 Internet.

When business leaders make firm decisions to deploy IPv6, the process is fairly straightforward. Staff must be trained, management tools need to be enhanced, routers and operating systems need to be updated, and IPv6-enabled versions of applications need to be deployed. All of these steps, however, will take time and money.

The RIRs have well established, open and widely supported mechanisms for Internet resource management and we are confident that our Policy Development Process meets and will continue to meet the needs of all Internet stakeholders through the period of IPv4 exhaustion and IPv6 deployment. The immediate challenge lies in making content available via IPv6, and in using the processes and mechanisms already available to ensure that service providers and content providers build adequate experience and expertise to continue to grow and develop the Internet.
www.ripe.net

The telecoms industry varies wildly in its predictions for FMC uptake in the next four years. What all the forecasters agree on, though, is that users will be numerous and that it will save companies money. Pierre-Alexandre Fuhrmann explains how businesses can get to grips with this emerging technology

Industry watchers' estimates vary wildly as to the penetration of fixed-mobile convergence (FMC) in the coming years, but they all agree - FMC is set to grow. Fast. One of the main drivers in 2008 is the emergence of dual-mode handsets and the widespread availability of wireless (WiFi) networks. There is also consensus that a converged telephony environment will save businesses money. So, when faced with this new emerging technology, how can IT managers and CFOs alike make sense of fixed-mobile convergence?
According to some estimates, there will be some 435 million mobile Session Initiation Protocol (SIP) users in 2012, while other analysts say there will be 18 million FMC subscribers by 2011. When it comes to call savings, Aberdeen Group estimates business users will save an average of $150 per user per year - for businesses of all sizes this is a significant amount.

As flexible, mobile and remote working begins to take a cultural hold on our society, IT and telecoms managers need to work out how FMC can give their businesses further competitive edge.

The UK enjoys some of the best wireless Internet coverage in Europe, and for companies looking to reduce the cost of international calls, the opportunity for travelling business executives to make calls by connecting to wireless networks is an attractive prospect. Wireless networks are now commonplace in airports, hotels, coffee shops and, increasingly, public areas.

Fixed Mobile Convergence means different things to different people so let's take a step back and break it down. Fixed communications comprise companies' telephone systems or private branch exchanges (PBXs), including analogue or digital ISDN exchange lines, and increasingly SIP trunks, and internal extensions.
At the same time, mobile communications can comprise your mobile handsets and their various calling plans.

Fixed Mobile Convergence focuses on taking the benefits of both of these communications streams and using the best bits to provide the most workable and cost effective system.
FMC solutions integrate fixed and mobile networks, providing communications services to mobile workers regardless of their location, access technology and device, increasing employee productivity and decreasing cost in an open, standards-based environment.
FMC works in much the same way that a hybrid car runs on electricity in low-speed urban areas but switches to petrol power on the motorway. When within range of a wireless network, dual-mode phones can switch to WiFi and use IP telephony, moving back onto cellular access when out of WiFi coverage.
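
As a simple illustration of this "hybrid car" behaviour, the sketch below shows hypothetical decision logic for a dual-mode handset: use SIP over WiFi when a known network is in range with adequate signal, otherwise fall back to the cellular network. The threshold value and function names are assumptions, not taken from any handset or PBX vendor.

```python
# Hypothetical bearer selection for a dual-mode (GSM/3G + WiFi/SIP) handset.

WIFI_SIGNAL_THRESHOLD_DBM = -70   # illustrative usability threshold

def choose_bearer(visible_networks, known_ssids):
    """Return ('wifi', ssid) if a trusted WiFi network is usable, else ('cellular', None)."""
    for ssid, signal_dbm in visible_networks:
        if ssid in known_ssids and signal_dbm >= WIFI_SIGNAL_THRESHOLD_DBM:
            return ("wifi", ssid)      # route calls via SIP over WiFi to the IP PBX
    return ("cellular", None)          # out of WiFi coverage: stay on GSM/3G

# Office WLAN is in range and strong enough, so calls go via the PBX.
print(choose_bearer([("OfficeWLAN", -55), ("CoffeeShop", -80)], {"OfficeWLAN"}))
```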

One of the major features at Mobile World Congress this February was FMC, and in particular visitors felt that wireless providers could communicate their FMC offerings more effectively. But it's not just the mobile operators that IT managers can turn to when implementing FMC.
It is not difficult to FMC-enable a traditional telephony system, and even less so one which already uses an internet protocol (IP) private branch exchange (PBX). The IP PBX can incorporate the new wave of mobiles that combine GSM/3G, WiFi and SIP.
SIP is the de facto open standard allowing products from different manufacturers to be connected together, so a mobile device like the Nokia E Series can be connected to an IP PBX, such as the Aastra IntelliGate, MX-ONE and NeXspan.

Using WiFi, which is ubiquitous in business today, and Voice over IP (VoIP), the mobile device can switch from the GSM/3G network to the IP PBX's fixed network for incoming and, most importantly, outgoing calls.

Recent surveys show that most employees with a mobile use it as their primary phone, even when in the office, so enabling the IP PBX to ‘take over' the calls could dramatically reduce communication overheads.

The best thing IT managers can do is to read up about FMC, talk to their telephony supplier or service provider and take it from there. It is not a complex process to migrate to a converged environment, so don't believe any scare stories that say otherwise. It is the natural next step in business communications.

SIP, the open standard for IP telephony, will further enable the development of FMC, and we at Aastra believe that SIP will prevent businesses from being marooned on an island of IP. Growing demand for, and availability of, innovative solutions based on SIP will see the transition to the ‘IP Telephony 2.0' phase of Internet telephony, enabling businesses to choose SIP-based universal terminals and SIP trunks to manage their external calls at a lower cost.
We strongly believe that innovation comes increasingly from the consumer markets, because of the volume effect, and especially in the mobility area. So it is important to be as open as possible to integrate these new upcoming innovations.
Moving away from proprietary solutions, companies will be able to reap the benefits of open standards technologies for enterprise communications.
Your rivals are looking at FMC - if they've not got a system in place already. They will benefit from reduced communications costs, improved staff productivity as they can now deal with their emails on the road, and even better staff morale.

Convergence is happening and it's going to provide incredible efficiency and cost benefits to your organisation. According to ABI Research, within the next two years mobile networks will emerge with an all-IP architecture and will deliver multimedia services as well as VoIP.
So come on and be a part of it: the technology is ready, so there's no better time to migrate to fixed-mobile convergence and steal a march on your competitors.

Case Study - Procter & Gamble
While Aastra provides fixed-mobile converged technologies, this just forms part of a wider mobile working portfolio. Many blue chip customers, such as Procter & Gamble, are using Aastra's mobile technology to operate more efficiently using remote working.
Procter & Gamble is a recognised leader in the consumer products industry, with more than 135,000 employees in 80 countries worldwide. Its brand portfolio includes the likes of Duracell, Gillette, Pantene, Ariel and Lenor.

The company's Swedish operation identified the need for a more efficient mobilised communications system, especially to enable more remote working. Many employees had three different contact numbers, depending on their location and, although 60 per cent of employees had a company handset, many had to carry two or more phones.
Added to this, Procter & Gamble staff's phones were not integrated into the telephony system, often making them difficult to reach and placing a burden on the switchboard.
The challenge was to increase mobility and simplify the system, while maintaining security. Procter & Gamble's IT service provider Hewlett-Packard recommended Aastra's One Phone framework. This system includes a mobile phone for each member of staff connected to Procter & Gamble's communications infrastructure and converges fixed and mobile lines, enabling employees to be accessible anywhere and at anytime, using a single number.
The company saw immediate results. Since implementation, a significant amount of time has been saved each day, call quality has increased and users find it much easier to maintain contact. The fact that each member of staff has one phone and a single contact number vastly simplifies communications while improving productivity and cost-efficiency.
Håkan Berggren, Information and Decision Solutions Manager for Procter & Gamble Nordic, says: "The mobility solution has enabled us to save 20 minutes per day, when you consider this is per employee it represents significant savings, both in time and costs."

Pierre-Alexandre Fuhrmann is VP Products & Solutions for Aastra
www.aastra.com

GSM operators today are faced with an increasingly daunting task: how to manage the financial and data clearing settlement of multiple roaming agreements. In addition to having to deal with an estimated 200 to 300 roaming partners each, they must all comply with complicated industry-wide standards and regulations for invoicing, data record format, settlement and more.  Eugene Bergen Henegouwen proposes a solution

According to the GSM Association website, its 700 mobile operator members across 218 countries and territories now serve more than 2.5 billion customers. This means an almost unfathomable amount of mobile traffic must be documented and accounted for each day, hour and minute. And with about 500,000 new GSM connections being made globally every hour, this complexity is only going to intensify for the world's operators. Now add more market growth factors, including the proliferation of non-voice services, particularly SMS and MMS, and price reductions for users. All will lead to increased traffic.
The implications to operators are staggering.

Fortunately, they do not have to go it alone, as there are a number of clearing house specialists who have the expertise, experience and knowledge of the GSM wireless and banking industries and their multifaceted rules. This experience is especially important when it comes to financial clearing, an area where billions in Euros and other currencies are at stake.

Financial settlement of multiple roaming agreements is a complex and costly process for any operator to undertake autonomously. As more and more subscribers roam outside their home country borders to a wider range of countries, operators must collect and settle communications-related charges in a wide variety of currencies. Also, as can be imagined, the cost of the highly qualified personnel and the systems required to manage this process is great, and the need for accuracy is imperative.

The best alternative is to make use of a high-quality, fully automated financial clearing house (FCH) to manage receivables and payables. On one side of the house, the FCH proactively collects customers' receivables. Obviously, the sooner this happens, the better, and this should be executed within about 40 days. On the other side are payables. An FCH manages foreign exchange and all bank costs, ensuring that this part of the process is timely and accurate.

Revenue assurance, profitability and cost savings - all key components in today's competitive environment - also result from utilizing an FCH. First, the need for expensive personnel and an expensive in-house clearing system is eliminated. Second, a fully automated FCH provides operational efficiency while eliminating the risk of human error. A good FCH also improves debt management through proactive debt collection, allocation of funds and a clear audit trail.

A variety of FCHs are available to operators today, and operators must be sure the one chosen is fully functional. For example, the system should be able to take care of all types of traffic that may cross an operator's network, including GSM voice traffic, messaging (SMS and MMS, including interworking) and data (3G, GPRS and WLAN) transactions. Having a system that supports CDMA/TDMA clearing is important as well, given that subscribers worldwide do not want to be constrained in the network they use on their travels. The FCH should also integrate into the operator's accounting system for easier reconciliation and management of ledgers, and it should eliminate banking fees while ensuring competitive foreign exchange rates. Just as importantly, an operator must have access to its information via online reporting 24 hours a day.

At the end of the day, GSM operators most need help with reducing risk while improving their return, via a service that provides clearing and settlement for all of a GSM operator's international traffic. The invoicing and collection services available through an FCH should aim to simplify an operator's settlement processes with each of its roaming partners for all billing records exchanged during an invoicing period. This includes receivables from all partners who sent roamers into the operator's network, and payables to all partners whose networks the operator's subscribers use. All settlement functions are consolidated, with the FCH acting on the operator's behalf. The operator simply funds its account, and the FCH settles with all of its roaming partners - removing the complexity of the current bilateral settlement process, which is becoming prohibitively costly and far too time consuming as the number of partners increases.
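
To show the principle of consolidated settlement in miniature, the sketch below nets receivables against payables per roaming partner so that one position per partner is settled from a single funded account; the partner names and amounts are invented, and currency conversion and bank charges are ignored for simplicity.

```python
# Hypothetical netting of roaming receivables and payables per partner.

def net_positions(receivables: dict, payables: dict) -> dict:
    """Positive = partner owes the operator; negative = operator owes the partner."""
    partners = set(receivables) | set(payables)
    return {p: receivables.get(p, 0.0) - payables.get(p, 0.0) for p in partners}

receivables = {"OperatorA": 120_000.0, "OperatorB": 45_000.0}  # inbound roamers
payables    = {"OperatorA":  80_000.0, "OperatorC": 30_000.0}  # own subscribers abroad

positions = net_positions(receivables, payables)
funding_needed = -sum(v for v in positions.values() if v < 0)  # amount the operator must fund

for partner in sorted(positions):
    print(partner, positions[partner])   # OperatorA 40000.0, OperatorB 45000.0, OperatorC -30000.0
print(funding_needed)                    # 30000.0
```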

Roaming, especially international roaming, is a significant part of operator revenues. However, before any exchange of money can even take place via an FCH, an operator must first have billing data detailing its roaming voice and data sessions. This data is essential for accurate invoicing and settlement. For an operator who has hundreds of roaming partners, the process of exchanging billing data is extremely complicated and tremendously expensive to undertake on its own.

Using a third-party data clearing house (DCH) is typically the option of choice for operators who want to ensure this critical part of their business is handled correctly. A DCH handles the exchange of billing data with an operator's roaming partners. Whether for the visited or the home operator, the DCH provides a single point of contact for the exchange of roamer billing records. This results in streamlined processing of roamer billing data and a single source of information regarding roaming customers, as well as a way for the operator to calculate its financial position with roaming partners. And in an ideal world, the DCH should provide the operator with a number of features, including the ability to process vital and sensitive operator interactions regardless of technology type, billing format or signaling standard.
The advantages of using an independent DCH are much the same as for an FCH: revenue assurance, compliance with industry standards regardless of roaming partner, cost savings and operational efficiency. With all these benefits, it makes perfect sense for an operator to look outside itself for a DCH provider.

As the wireless industry continues to mature, operators are showing a preference for clearing services from a single source that can provide both data and financial solutions in the global marketplace. The key advantages for an operator of having a single provider for its DCH and FCH are both operational and practical. On the operational side, the operator can be certain that all billing data and supporting reporting is fully integrated between the DCH and FCH, eliminating risks of data leakage and ensuring full transparency across the entire clearing process from file processing through to final settlement. On the practical side, the operator benefits from one account manager, one contract and an end-to-end commitment from the clearing house to the customer.

Ultimately, operators are provided with the convenience and cost savings associated with bundled services, and they benefit from the critical relationships needed to ensure the data and reporting integrity required for financial clearing and settlement. To give you some idea of the level of processing we are already seeing, I can reveal that the Syniverse FCH generates and receives more than 18,000 invoices and settles over 8,000 roaming partner positions. As this trend continues, FCH providers will benefit from an upturn in business from operators who have decided to outsource these requirements rather than trying to run this side of their operations in-house, as has traditionally been the case.

It's a dynamic industry at the moment, and this is causing suppliers to augment their capabilities to keep pace with demand. At the end of 2007, for example, Syniverse acquired the wireless data and financial clearing business of Billing Services Group Limited (BSG), extending to our current and future customers the bundled data and financial clearing services described above.

Also, during any period of change and consolidation, it's important that independence is maintained when it comes to verifying and validating the source data provided by the DCH. That data should still be subject to the same stringent quality checks applied to any other DCH, even where the DCH is run by the same supplier. Suppliers should also be fully able to provide financial clearing services for operators that have already established relationships with other data clearing houses.

I think it's this transparent approach and attention to detail that will show operators that the time is right for them to consider outsourcing their FCH and DCH operation in order to achieve cost benefits and efficiencies. If they do, it will benefit them and also the suppliers who work hard to provide a trusted source for their important - and sensitive - clearing operations.

Eugene Bergen Henegouwen is Executive Vice President, Syniverse, EMEA
www.syniverse.com

Mobile operators must tackle the transition to next-generation Ethernet if they are to successfully meet consumer demands for new services, argues Vinay Rathore

Mobile operators are in a period of fundamental transition, as consumers increasingly demand access to personal high-bandwidth services while on the move. The growth of 3G data services, mobile broadband and the availability of powerful new mobile devices such as the Apple iPhone are placing significant strain on mobile networks and operators are investing in additional capacity to support this bandwidth explosion.

However, the operational costs associated with traditional mobile backhaul (defined as the access portion of the network transporting traffic between the mobile base station and gateways to the packet network and voice switched network) are increasing faster than the revenue generated by new data services.  Until recently, most network operators have been trying to add backhaul capacity primarily by leasing additional TDM-based E1 circuits at a high cost. Worldwide, TDM backhaul accounts for 20 to 40 per cent of mobile network Operating Expense (OPEX). All this is untenable in a competitive market of shrinking average revenue per user (ARPU).

The challenge facing mobile operators is how to increase network bandwidth, both in terms of capacity and speed, while reducing the total cost of running the network and growing top line services revenue.

In the UK, for example, demand for mobile Internet services is increasing while the adoption of mobile broadband is heating up competition among fixed and mobile operators. Approximately one in eight UK consumers have either replaced their fixed-line Internet connection with a mobile alternative or chosen a mobile broadband service from the outset, according to research published by YouGov. Similarly, Ofcom's latest Communications Market Report (August 08) found that two million people have already used mobile broadband via a dongle, 3G datacard or similar device, with sales rising from 69,000 to 133,000 between February and June this year alone.

In addition, a recent report from Nielsen Mobile highlighted that the UK has the second highest number of active mobile Internet users in the world (12.9 per cent), second only to the US. Furthermore, the availability of powerful new mobile devices such as the 3G iPhone is driving mobile Internet usage by promising consumers easier access to mobile business, personal and entertainment services.  For example, 37 per cent of iPhone users watch video, 82 per cent access the Internet, 17 per cent stream music and 76 per cent send email on their phones, according to Nielsen Mobile.

Operators have been quick to capitalise on people's desire to access and download content from the Internet on their mobile phones, launching a suite of new services. A recent independent survey (June 2008) conducted on behalf of Quickplay found that two in five people in the UK had already watched TV and video content on their mobile phone, with many now regularly using such services: 18 per cent of those who had tried a mobile TV and video service watch on a weekly basis.

However, while mobile operators are continuously upgrading their wireless networks to support this bandwidth explosion, it's not just about adding capacity. In fact, network quality is the most important driver of satisfaction with the mobile Internet, accounting for 79 per cent of overall satisfaction according to Nielsen Mobile.

These trends highlight the need for mobile operators to invest in next-generation network infrastructure to accommodate increasing bandwidth demand and deliver a high quality user-experience while maintaining profitability.

Current networks were designed to transport voice traffic over Time Division Multiplex (TDM) networks, with E1 circuits providing backhaul transport from the base station to the network controller, and over SONET/SDH networks for voice traffic from the controller to a Mobile Switching Center (MSC). With the advent of 2.5G mobile networks and the data services they enabled, the backhaul network has evolved to accommodate increased data traffic by including Frame Relay, ATM and IP, but in large part this data still travels over TDM circuits utilizing ATM/IMA.

The current TDM-based backhaul network is being overwhelmed by the rapid increase in bandwidth demand with the introduction of 3G (HSPA, EV-DO) and 4G (LTE, UMB and WiMAX) data services. For example, to ensure all users have access to mobile TV services, the network must be scalable enough to support thousands of multicast video streams, such as broadcast TV, as well as unicast streams like YouTube.

As a result, mobile operators must reduce the cost-per-bit of data transport in the backhaul network while continuing to ensure voice quality, maintain carrier-grade Operations, Administration and Maintenance (OAM), and provide circuit-like resilience.

Carriers can take advantage of advanced Ethernet technology as a solution to challenges in the mobile backhaul network, reducing their dependency on E1 leased lines and expensive SDH infrastructure. Carrier Ethernet is far more economical, as it lowers the cost-per-bit and operational expenses while offering carrier-class management and Quality of Service with the right attributes. High-performance Carrier Ethernet solutions offer larger pipes - essentially, more bandwidth - to meet end-user bandwidth requirements while lowering the overall infrastructure cost and ensuring high quality of service. Using Ethernet, operators can scale the network more easily to meet the demands of mobile services and applications without scaling costs. By building an Ethernet mobile backhaul, operators can burst up to the full speed of the link and use the same circuit to carry different types of traffic. While there is still some concern about carrying time-sensitive traffic such as voice over Ethernet, the industry is working to resolve this issue in multiple ways, including developing TDM over Ethernet standards.

However, while advanced Ethernet technology provides a solution to backhaul problems, the backhaul network does not operate in isolation. To work efficiently and realise the Operational Expense (OPEX) advantages, the mobile core network must also evolve as the access network migrates to Ethernet. Building a Carrier Ethernet network infrastructure using standards defined by the Metro Ethernet Forum provides operators with a long-term, low-cost strategy to replace their existing SDH infrastructure while maintaining carrier-class reliability.
However, as carriers have invested heavily in their current mobile networks, they cannot afford simply to tear out and replace legacy radio infrastructure. It is crucial that their mobile backhaul and core network strategy still supports legacy traffic and services while allowing them to transition gradually to next-generation infrastructure that is more scalable and economical.

There is no ‘one size fits all' approach to building out an Ethernet backhaul network.  Each tower has different requirements based on available infrastructure, bandwidth requirements, and geography.  Most Ethernet backhaul networks will be a hybrid of fibre, microwave and copper.  In addition, it is likely that operators will lease portions of their network and own portions in order to balance CAPEX and OPEX budgets.  Additionally, operators will need to support TDM, ATM and Ethernet networks during the transition phase.  With all of these varied requirements, operators must seek out vendors that supply a comprehensive Ethernet portfolio that can be gradually applied as the network demands evolve.
Fixed telecom operators are already benefiting from the migration from TDM-centric to next generation Ethernet centric networks. Mobile operators must manage this aggressive transition to next-generation Ethernet to maximise the investment in existing mobile and network infrastructures while maintaining a quality of service that minimises subscriber churn.

Vinay Rathore is Director of Product Marketing, EMEA, Ciena
www.ciena.com

Francois Mazoudier argues that fixed-mobile convergence is not the be all and end all in the evolution of telecoms

The term Fixed Mobile Convergence (FMC) has captured the minds of businesses across the telecoms industry, with many players believing that its growing presence is causing a communications revolution. To date, the term has attracted a great deal of attention across the world, with anybody who is anybody in the telecoms space talking about FMC. However, in reality, is FMC all just ‘hype' and, if so, what are the alternative communications solutions out there?

By converging fixed and mobile communication, FMC provides a synergistic combination of technologies, enabling all-in-one communication systems that allow voice to switch between networks on an ad-hoc basis using a single mobile device. However, as these solutions start to impact the telecommunications ecosystem, do FMC players really believe it is the best answer for businesses that are looking to adopt the latest communications models? Or are the telecoms vendors and operators making a last ditch attempt to breathe life back into their ever-decreasing profit margins by stamping a larger footprint into the office environment in the name of FMC?

Until now, mobility has been designed as an extension to the office-based hardware telephony system, perceived as a luxury too expensive to handle all business calls. With the universal adoption of mobile phones, it is now the fixed-to-mobile element that is complex and expensive, with calls to mobile phones overtaking traditional fixed-line handsets as the main form of voice communication. So why not go all mobile?

In today's busy offices the majority of workers still have to juggle a desk-based phone and a mobile device, a situation that is not only inconvenient but also costly for businesses that support and cover the costs of both phones. As mobility increases, and more staff work outside the traditional office environment, employees are forced to give out and use multiple phone numbers. As FMC gives users a single dual-mode handset that can be used anywhere at any time, it seems like the perfect solution to this issue. However, away from the hype, integrating mobile and fixed-line networks is a complex matter and there is a far simpler single-handset solution already on the market - the mobile phone itself!

It does not have to be expensive to introduce an all mobile telephony solution into a business environment: all the necessary hardware is already supplied and paid for, as employees already have mobile devices - no other hardware is needed. Calls made to the office number are handled exactly as they would be on an ordinary telephone system, directed to the user's mobile phone over a GSM network of choice, rather than over a complicated office fixed-line telephony network to a desk-based phone. This makes them reliable and cost effective, particularly compared to an FMC solution.

True mobility enables employees to be contacted on just one number whether they are in the office or not. All mobile solutions enable this unprecedented access without compromising on the features users require from their desk phone. This new level of access allows business workers to stay in contact with colleagues and customers with ease; they never miss an important call again.

The simple-to-use system can be provisioned and activated instantly via the web and takes ten minutes to set up. Put bluntly, an all mobile solution eradicates all the complexity of a fixed-line, and therefore an FMC solution, as there is no need for hardware, techies, long-term contracts and expensive upgrades. Handover between fixed and mobile is inherent - because it is all mobile.

All mobile phone solutions have the potential to dismiss the FMC ‘hype' in the same way that network-enabled voicemail ended the office hardware business when it was launched back in the 1990s.  Back then each office had a tape recorder at reception, and this hardware was eventually superseded by network-enabled voicemail. Today there is an all mobile system that can supersede the office PBX, turning the business telephony industry (selling mostly hardware) into a service industry.

In addition, an all mobile system solves the problems found by all companies today when buying/changing phone systems: complexity, high upfront costs, hidden ongoing costs, high dependency on technical specialists, costly ongoing upgrades and, most importantly, expensive monthly bills as their employees are increasingly mobile and incoming calls are redirected to them at full mobile rates. 

Despite the fact that FMC has existed as a concept for over ten years, its penetration is likely to be as little as 8.8 per cent of the total business subscriber base by 2012 (according to Informa), and once you look behind the complex scenes of FMC it remains to be seen whether businesses will want to replace existing infrastructure. It is far easier to achieve mobility without such a high level of infrastructure investment.

I am not saying that FMC won't take off on a large scale or even change the way we communicate. It probably will, but by using an all mobile phone solution we can change the very nature of office communication without the need for the fixed-line.

FMC is really about the move to mobile where everyone's phone is wireless. Therefore let's stop talking about FMC and instead talk about accelerating such a move.
Francois Mazoudier is CEO at GoHello
www.GoHello.com

We've been hearing a great deal about ‘converged', ‘21st Century' and ‘next generation' networks, and what they will mean for business.  But what does it all actually mean in terms of technology? Peter Thompson takes a look

Next generation networks, while promising great strides for business, entail - in terms of technology - a radical shift from circuit switching, where fixed resources are allocated to a session (such as a telephone call) for as long as it lasts - regardless of whether they are actually being used at any particular moment - to packet switching, which allocates transmission resources for only as long as it takes to forward the next packet. This is more efficient, since most sources of packets only generate them occasionally (though sometimes in bursts). Equally important is the inherent flexibility of packet switching to cope with variations in demand, and hence to support a wide range of different applications and services. While several packet switching standards have been used, the clear favorite is IP (Internet Protocol), which is the basis of an ever-expanding web of enterprise and service provider networks that link together to form the Internet.

For the enterprise, shifting to a converged all-IP network translates into immediate productivity gains through integration of different functions - now available in easy-to-use Unified Communications packages - and medium-term cost savings from toll bypass and consolidation of network infrastructure.

If this all sounds a bit too good to be true, then it should. Allowing streams of packets from different applications to share resources in a free-for-all makes the network simple and cheap but causes the service each application gets to be extremely variable. Whenever packets turn up faster than a network link can forward them, queues form (called congestion), causing packets to be delayed, and buffers may overflow causing some packets to be lost. Traditional data applications such as email transfer don't mind this too much, but new real-time services such as IP telephony are very intolerant of such behavior.
The upshot of all this is that despite all its benefits, a converged packet network can't be considered a reliable substitute for a circuit-switched one without having something in place to ensure that it provides an appropriate quality of service (QoS) for all critical (and particularly real-time) applications. This means giving each application enough bandwidth, and keeping packet loss and end-to-end delay within bounds. Loss and delay can only get worse as a stream of packets crosses a network, so it makes sense to think in terms of allocating an end-to-end budget for these parameters across different network segments. Different parts of the network can then attempt to meet their budgets using a variety of methods.
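To make the idea of an end-to-end budget concrete, here is a minimal sketch, not taken from the article, in which a one-way delay and loss allowance is split across LAN, WAN access and core segments. All per-segment allocations and the VoIP-style targets are illustrative assumptions; the point is simply that delays add up while losses compound.

```python
# A minimal sketch of end-to-end QoS budgeting across network segments.
# All per-segment allocations and targets below are illustrative assumptions.
SEGMENTS = {
    "LAN":        {"delay_ms": 1.0,  "loss": 0.0001},
    "WAN access": {"delay_ms": 20.0, "loss": 0.0020},
    "core":       {"delay_ms": 10.0, "loss": 0.0005},
}

end_to_end_delay = sum(s["delay_ms"] for s in SEGMENTS.values())   # delays add up

survival = 1.0
for s in SEGMENTS.values():
    survival *= (1.0 - s["loss"])      # probability a packet survives every segment
end_to_end_loss = 1.0 - survival

# Compared against targets of the kind used for VoIP planning (assumed here):
DELAY_TARGET_MS, LOSS_TARGET = 150.0, 0.01
print(f"One-way delay used: {end_to_end_delay:.1f} ms of a {DELAY_TARGET_MS:.0f} ms budget")
print(f"End-to-end loss:    {end_to_end_loss:.3%} against a {LOSS_TARGET:.0%} budget")
```

A segment that overspends its share of the budget (typically the WAN access link, as discussed below) leaves correspondingly less headroom for every other part of the path.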

A technique used in the high-bandwidth and high-connectivity core of a network is to control the routes that streams of packets take so as to avoid congestion almost entirely.
MPLS, with its traffic engineering extensions, is a standardized way to do this, but there are also proprietary mechanisms that some of the IXCs use that work well enough for them to carry billions of call minutes annually over converged IP networks using VoIP.
Move towards the edge of the network, however, and the number of alternate routes diminishes. The capacity of the individual links also goes down, making occasional congestion much harder to avoid. At the level of an individual WAN access link it becomes almost inevitable, so packets will often be queued up to cross it. Delivering QoS then becomes a matter of managing this queuing process to assure service for critical packet flows even when the link is saturated. This can be very tricky when several different applications are all ‘critical' but have wildly different throughput requirements and sensitivities to packet loss and delay. This problem is a major drag on the uptake of converged networks, causing them to be widely regarded as ‘complicated' and ‘difficult', when they ought to be making life easier.

One reason that the available QoS mechanisms don't help as much as they might is that they fail to take account of the intrinsic interaction between the different QoS parameters, or rather between the resource competitions that affect them. At a congestion point, packet streams compete for the outgoing link bandwidth, and since having more traffic than capacity is the definition of congestion in the first place, a lot of ‘QoS' implementations focus on managing this one competition, i.e. they provide a way to allocate bandwidth. However this isn't the only limited resource, as queued packets have to be stored somewhere, and any buffer can only hold so many; consequently there is another competition between the streams, for access to this buffer, which determines their packet loss rate.

Finally there is the limitation that, however fast the network link, it only sends one packet at a time, and so there is a third competition to be selected for transmission from the buffer, which determines queuing delay. These three competitions are interlinked; for example increasing the amount of buffering to reduce packet loss results in more packets being queued up to send and hence increases average delay. Even assuming a series of QoS mechanisms can be combined to manage all three of these competitions, the behind-the-scenes interactions between them will sabotage every attempt to deliver precise and reproducible QoS. In practice, the effect of this is that reasonable QoS can only be achieved by leaving substantial headroom, resulting in very inefficient use of the link, which can become a high price to pay for a solution that was supposed to save money!
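The coupling between the buffer, loss and delay competitions can be seen in a toy single-queue simulation. The sketch below is purely illustrative and not from the article: a fixed-rate link, random arrivals at 90 per cent load and a tail-drop buffer, with every parameter an assumption. Growing the buffer cuts packet loss but pushes up average delay, which is exactly the interaction described above.

```python
# Toy FIFO queue: one fixed-rate link, bursty (Poisson) arrivals, tail-drop buffer.
# Illustrative assumptions only; shows the buffer/loss/delay trade-off.
import random

def simulate(buffer_pkts, load=0.9, link_rate_pps=1000, n_pkts=200_000, seed=1):
    random.seed(seed)
    service = 1.0 / link_rate_pps           # seconds to transmit one packet
    mean_gap = service / load               # mean inter-arrival time
    t, lost, delays = 0.0, 0, []
    backlog = []                            # departure times of queued packets, FIFO order
    for _ in range(n_pkts):
        t += random.expovariate(1.0 / mean_gap)
        while backlog and backlog[0] <= t:  # drop entries already transmitted by time t
            backlog.pop(0)
        if len(backlog) >= buffer_pkts:
            lost += 1                       # buffer full: tail drop
            continue
        start = backlog[-1] if backlog else t
        finish = start + service
        backlog.append(finish)
        delays.append(finish - t)           # queuing + transmission delay
    return lost / n_pkts, 1000.0 * sum(delays) / len(delays)

for buf in (10, 50, 200):
    loss, delay_ms = simulate(buf)
    print(f"buffer={buf:4d} pkts  loss={loss:.3%}  mean delay={delay_ms:.2f} ms")
```

Running it shows the smallest buffer losing a visible fraction of packets at low delay, and the largest buffer losing almost nothing while the average delay climbs - there is no setting that wins on both counts at once.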

Predictable multi-service QoS
Fortunately a new generation of QoS solutions is emerging that manages the key resource competitions at a network contention point using a single, general mechanism rather than a handful of special-purpose ones. This not only controls the intrinsic interactions but even allows trade-offs between different packet streams, for example giving a voice stream lower delay and a control stream lower loss within the same overall bandwidth. By starting from a multi-service perspective, multiple critical applications can all be prioritized appropriately without any risk of one dominating and destroying the performance of the others. Embracing the inherently statistical nature of packet-based communications makes the resulting QoS both predictable, eliminating surprises when the network device is configured, and efficient: up to 90 per cent of a link's capacity can be used for packet streams requiring QoS, with the rest filled with best-effort traffic.

Applying this technology at severe contention points, such as the WAN access link, enables the biggest potential losses of QoS for critical applications to be controlled. This makes the QoS ‘budget' for the rest of the network achievable using established techniques such as route control and bandwidth over-provisioning.

For the business, this QoS technology is most useful for managing the WAN access link to the rest of the network. Combining it with session awareness, NAT/Firewall/router functions and the ability to convert legacy applications such as conventional telephony to converged applications such as SIP VoIP produces a new class of device called a Multi-service Business Gateway (MSBG). Such a device can either be managed by a service provider delivering managed services or by a business buying simple connectivity services from a provider. It also provides a convenient point to provide QoS assurance such as VoIP quality measurement to ensure that SLAs are not breached. Overall it is an enabler for reliable, converged, packet-based services, allowing the full potential of 21st Century networks to be realized. We are only just beginning to see the changes this will bring to both business processes and everyday life.

Peter Thompson is Chief Scientist at U4EA Technologies and can be contacted at peter.thompson@u4eatech.com
www.u4eatech.com

Next generation access ("NGA") networks are slowly being rolled out in a range of European, American and Asia-Pacific countries. Even BT, which had until recently been reluctant to commit to NGA investments, has announced a five-year £1.5 billion plan to roll out fibre-based NGA infrastructure to replace parts of the legacy copper network. The target architecture chosen will initially deliver services with speeds of between 40 Mb/s (for existing lines) and 100 Mb/s (some new builds), with the potential for speeds of up to 1,000 Mb/s in the future. It is anticipated that the NGA network will be rolled out to 10 million UK households by 2012. Similar plans have been announced in a number of other European countries.

Most operators, however, have stated, like BT, that such investment plans were conditional on a "supportive and enduring regulatory environment". What does this mean? What are the regulatory options available? To what extent will these impact the competitive dynamics of the market and end users?

The regulation of NGA investment raises a number of important regulatory issues. While not regulating wholesale access to these new infrastructures may have a positive impact on investment, it would also create significant barriers to entry for third-party providers and may therefore result in a less competitive environment. Existing operators using unbundled copper loops may, for example, see their investments stranded as new fibre-based networks are rolled out. This could translate into less choice and higher prices for end users. At the other end of the spectrum, an aggressive regulatory regime mandating cost-based wholesale access for all may stifle investment and result in suboptimal outcomes for all stakeholders.

Operators, regulators and Governments worldwide are currently grappling with these questions. Regulatory measures that are being considered include:

Permanent and temporary forbearance - this entails placing no regulatory requirement on NGA operators, either for a period of time or permanently. The US approach is to forbear from regulation of fibre access and this seems to have stimulated NGA investment from operators such as Verizon. The German Government proposed a regulatory holiday for Deutsche Telekom along the same lines as the US, but was subsequently criticised by the EC and had to withdraw its proposals following the threat of legal action.

Cost based access - the regulator determines access prices for NGA based on the cost of providing access. A number of cost modelling approaches could be used for this including long run incremental cost ("LRIC") commonly used in the telecommunication sector or the Regulatory Asset Base ("RAB") approach often used for the regulation of utilities.
Retail minus - the regulator determines the access price on the basis of the retail price charged by the incumbent operator less the costs avoided by not having to retail the service. This can be thought of as a "no margin squeeze" rule, preventing too small a gap between the wholesale access charge the integrated company levies on competing service providers and the retail broadband price it charges its own customers (a short worked example follows this list).

Anchor product regulation - in addition to providing access to all wholesale products on a retail minus and equivalent basis, the wholesale operator also provides an "anchor" product, a service they already provide, which they then continue to supply at the same price. For example, if the current copper network is capable of providing a 5 Mb/s broadband service, then the anchor NGA product would also be a 5 Mb/s broadband service and the integrated company would be required to provide it at the current service price.
Geographic market regulation - this is a variant on the forbearance approach. The regulator may forbear from regulating access where there are competing NGA operators.
Each approach has its own advantages and disadvantages that will need to be considered carefully in the context of domestic market characteristics.
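As promised above, here is a short worked example of the retail-minus idea. Every figure is an invented assumption used only to illustrate the "no margin squeeze" arithmetic, not a number from any regulator or operator.

```python
# Illustrative retail-minus arithmetic (all figures are assumptions):
# the regulated access price is the incumbent's retail price less the
# retail costs it avoids by selling wholesale rather than retail.
retail_price = 30.00          # incumbent's monthly retail broadband price
avoided_retail_costs = 8.00   # billing, marketing and support costs avoided

access_price = retail_price - avoided_retail_costs           # 22.00

# "No margin squeeze" check for a competing retailer buying that access:
entrant_retail_costs = 7.50
margin = retail_price - access_price - entrant_retail_costs
verdict = "no squeeze" if margin >= 0 else "margin squeeze"
print(f"Regulated access price: {access_price:.2f}")
print(f"Entrant margin:         {margin:.2f} ({verdict})")
```

If the incumbent's avoided costs are understated, or its retail price falls without the access price following, the entrant's margin turns negative and the rule is breached.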

Another major difficulty, from a regulatory viewpoint, is the specification of the NGA products that should be regulated. Traditional solutions such as the leasing of the copper pair to third parties (a process known as "unbundling") are often difficult to implement - for both technical and economic reasons - with fibre networks.  Regulators may therefore want to regulate other products, such as access to unused fibre or even ducts, to ensure that competition is not harmed.

The challenge for regulators will be to develop a regulatory approach that provides incentives for efficient and timely investment in NGA as well as regulatory visibility for stakeholders.

Benoit Reillier is a Director and European head of the telecommunications and media practice of global economics advisory firm LECG.  The views expressed in this column are his own.
breillier@lecg.com

Could VPLS offer the answer to the seemingly inevitable future bandwidth crunch?  Chris Werpy explores the options

With any communications network, the most common demand from multinational enterprises is for a reliable, secure and cost-effective communication channel between their globally dispersed offices, which requires guaranteed end-to-end bandwidth performance. However, bandwidth is under growing pressure from the increasing popularity of multimedia communications and converged voice, video and data applications, such as VoIP and video conferencing. As a result, the threat of traffic bottlenecks occurring between LANs is looming and corporations are looking for safe and guaranteed LAN to LAN connectivity that is scalable to meet whatever future bandwidth they may require.


Although carriers and service providers have been offering VPN services based on traditional TDM, Frame Relay, and ATM for some time now, the cost of operating separate networks to provide these services, coupled with the greater bandwidth consumption pressures, is forcing them to move to more cost-effective technologies: namely IP and MPLS.
Enter global virtual private LAN service (VPLS) into the networking spotlight. VPLS is a point-to-multipoint Ethernet-based transport service that allows businesses to securely extend their LAN throughout the entire WAN. VPLS benefits from the scalability and reliability of an MPLS core, with no legacy Frame Relay or ATM networks to integrate, and access to the existing network infrastructure and equipment. It scales well to national or international domains while preserving the quality of service (QoS) guarantees, with the added privacy and reliability of a Layer 2, carrier-class service.


The question is: are there any alternative solutions available that can compete with VPLS, such as Private IP, point-to-point solutions (such as Virtual Leased Line), Ethernet-in-Ethernet, L2TP and Border Gateway Protocol (BGP)/MPLS VPNs? The simple answer is "No". Simplicity and transparency are the name of the game for VPLS.


VPLS lets customers maintain control of their networks while allowing them to order on-the-fly bandwidth increments for multiple sites, instead of being constrained by traditional legacy services. Configuration is also very straightforward - only the peer PE routers for a VPLS instance need to be specified. VPLS uses edge routers that can learn, bridge and replicate on a VPN basis. These routers can be connected by a full mesh of tunnels, enabling any-to-any connectivity. Customers can use either routers or switches with a VPLS solution, as opposed to Private IP. VPLS always offers an Ethernet port handoff (customer demarcation) between the carrier and the customer router or a simple LAN switch, allowing higher bandwidth service at a lower cost of deployment. Unlike IP VPN, where the customer hand-off can range from Ethernet to frame relay or IP over TDM, with VPLS the customer hand-off to the WAN is always Ethernet. VPLS is also access technology-agnostic. The list of advantages is substantial.


BGP/MPLS VPNs (also known as 2547bis VPNs), on the other hand, require in-depth knowledge of routing protocols. As the number of instances increases, service provisioning systems are often recommended in both cases to ease the burden on the administrator, particularly for Layer 3 VPNs. Layer 2 VPNs also enjoy a clear separation between the customer's network and the provider's network - a fact that has contributed heavily to their increasing popularity. Each customer is still free to run any routing protocol it chooses, and that choice is transparent to the provider. Layer 3 VPNs are geared towards transport of IP traffic only. Although IP is nearly ubiquitous, there could be niche applications that require IPX, AppleTalk or other non-IP protocols. VPLS solutions support both IP and non-IP traffic. One significant security and performance advantage is that there is no IP interaction at the connection between the provider edge and the customers' devices.


Another differentiator is that VPLS offers greater flexibility and cost reductions, by putting the customer in control of the network and removing the need for equipment upgrades. End users have the flexibility to allocate different bandwidths at different sites, with bandwidth varying from as little as 1 Mbps at one site (for example, a low traffic-generating sales office) to Gig-E at another (which could be needed for the company's headquarters and/or data centre). Furthermore, as customers increase bandwidth, there is no need to buy new cards for the existing CPE. I estimate that customers with a 50-site network can save up to 20 per cent in networking costs by moving over to VPLS.
VPLS solutions also score highly in the areas of compatibility and scalability. They are transparent to higher layer protocols, so that any type of traffic can be transported and tunneled seamlessly. VPLS auto-discovery and service provisioning simplifies the addition of new sites, without requiring reconfiguration at existing sites.


The most effective VPLS offerings are delivered using Ethernet connectivity, in the form of VLANs. These VLANs can be provisioned across TDM connections (E1, T1, E3, T3, etc) when native Ethernet is not available.


Dan O'Connell, research director for Gartner is a staunch supporter of VPLS, stating recently that "VPLS is a major new growth area for Ethernet. Customers are already very familiar with Ethernet in their local area networks. Extending to the wide area network is a natural progression, especially for those business and government customers seeking a clear IP migration path to enable convergence of their multiple legacy networks."


So with VPLS displaying so many powerful capabilities, it is hard to imagine any circumstances in which VPLS would be less than optimal. However, there is one such application, and it is called multicast. Unlike Ethernet networks, where there is native support for multicast traffic, VPLS requires the replication of such packets to each PE over each pseudo-wire in order for multicast packets to reach all PE routers in that VPLS instance. The problem is further exacerbated in metro networks, where ring-based physical topologies are often deployed. Clearly, this replication is expensive, wastes bandwidth and is applicable at best when multicast traffic is expected to be a small proportion of overall traffic. Alternative solutions that the industry is researching include the establishment of shared trees within the VPLS domain, but this research has a long way to go.
Despite this caveat, VPLS is gaining momentum. Maria Zeppetella, Senior Analyst in Business Communications Services at Frost & Sullivan, agrees with this trend, but makes the point that most carriers are not planning to shut down legacy networks, as they still obtain a steady, albeit shrinking, revenue stream from them. Sprint is one exception, however: it has announced it will shut down its legacy networks in 2009 and will have fully migrated to IP by then.


As for what the future holds for VPLS, I believe it will become the biggest network solution adoption of 2008 and 2009 for globally dispersed enterprises. Service providers will look to enhance their offerings for the early adopters and will introduce more customer network control applications and features. End users are always looking to simplify their network connections, while optimising transport effectiveness and keeping costs low. For businesses that are globally distributed and want to extend the benefits of the lower costs and simplicity of Ethernet technology throughout their entire network, global VPLS based on IP/MPLS is the solution of choice.

Chris Werpy is Director of Sales Engineering at Masergy, and can be contacted via tel: +1 (866) 588-5885; e-mail: chris.werpy@masergy.com
www.masergy.com

It's no secret that voice traffic is still expensive to carry in many regions of the world. And even though the related transmission costs are decreasing with the introduction of network entities such as 2G-3G media gateways, many operators are still struggling to deliver consistent OPEX reductions while evolving their networks to support ever-increasing traffic levels that now consist of voice, data and video content.

But even while multimedia traffic continues to grow, the fact is that voice services are still the core revenue stream for the majority of operators worldwide. Whether deploying cellular or satellite infrastructures or some combination of both, operators must optimize voice trunks in order to offer price-competitive services including pre-paid calling cards, private business lines and call centers.  And, with the telecom bubble of the late 1990s long gone, brute-force bandwidth provisioning is no longer a viable option. Now more than ever operators must plan the evolution of the transmission network pragmatically and look at enhancements that are available at low cost and deliver immediate ROI.

Want to know a secret?

Adding bandwidth capacity is not the only solution to relieving network congestion or increasing service capacity.  Used in telephony networks for many years, DCME (Digital Circuit Multiplication Equipment) solutions have earned a solid reputation for providing advanced PCM (Pulse Code Modulation) voice compression over transmission media such as satellite and microwave links.

Driven by standardization advancements in the ITU-T, DCME technologies have continued to evolve and now achieve impressive compression ratios, allowing operators to provide extra bandwidth without provisioning additional capacity. Adopted and valued by thousands of operators worldwide, DCME optimization and compression technologies offer time-tested, field-proven results that drive more bandwidth from existing assets while sustaining - or in some cases even improving - voice quality in networks where media gateways are deployed, resulting in substantial OPEX savings and improved profitability.

The secret to cost savings without sacrificing voice quality

Many operators are still carrying voice across plain PCM [G.711] 64 kbit/s channels or Adaptive Differential Pulse Code Modulation (ADPCM) [G.726] 32 kbit/s channels over satellite links (even though the link costs are substantial), simply because they believe more aggressive voice codecs will cause a degradation in voice quality. But advances in today's DCME technologies have resulted in codecs that offer up to 16:1 bandwidth reduction on voice trunks while preserving quality of service.

For example, consider a link consisting of 8 x E1s, carrying 240 voice channels and 8 x SS7 signaling channels. Assuming a conservative 35% silence ratio, today's DCME solutions will reduce the bandwidth required on the satellite link from 4,300 kbit/s to less than 1,000 kbit/s. This translates into substantial yearly savings with a payback period of only a few months.
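As a rough, hedged sanity check of those figures, the sketch below applies the 16:1 ratio quoted earlier to the same 8 x E1 example. The 4,300 kbit/s starting point and the "less than 1,000 kbit/s" result are the author's figures; the only rate assumed here is the standard 64 kbit/s PCM timeslot.

```python
# Back-of-envelope check of the DCME example above (illustrative only).
E1_TIMESLOT_KBPS = 64          # one standard PCM timeslot
VOICE_CHANNELS = 240
SS7_CHANNELS = 8

# Payload if every channel were carried as plain 64 kbit/s PCM:
pcm_payload = (VOICE_CHANNELS + SS7_CHANNELS) * E1_TIMESLOT_KBPS   # 15,872 kbit/s

# The 16:1 compression ratio quoted earlier in the article, applied to that payload:
dcme_link = pcm_payload / 16                                       # ~992 kbit/s

saving = 4_300 - dcme_link      # versus the article's 4,300 kbit/s baseline
print(f"PCM payload:            {pcm_payload:,.0f} kbit/s")
print(f"After 16:1 compression: {dcme_link:,.0f} kbit/s (consistent with 'less than 1,000')")
print(f"Saving vs 4,300 kbit/s: {saving:,.0f} kbit/s")
```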

A similar scenario can be repeated for leased line backhaul or congested PDH (E3) microwave links. The above example would significantly reduce the backhaul capacity from 8 x E1s (or a single E3) to a single E1.

In an Ater link configuration, voice traffic is carried between the BSC and the MSC in a compressed format (usually 16kbit/s per voice channel). But as traffic increases it typically becomes necessary to migrate voice traffic from the Ater link to an A link where it is not compressed, requiring at least four times more transmission capacity and significantly increasing network OPEX.

However, by equipping the A link with a DCME solution, transmission bandwidth requirements are reduced by up to 4:1, thus delivering significant OPEX savings and liberating existing bandwidth assets to support additional backhaul capacity for future growth and new services.

The secret to cost-effective, secure disaster recovery

Mobile networks are being incorporated more and more into public disaster response plans, further emphasizing network availability as a critical component of any network planning.  Network outages - which can account for 30% to 50% of all network faults and hundreds of thousands of dollars in lost revenue - are particularly sensitive for operators using third-party leased lines or unprotected fiber links.

DCME technologies offer a cost-effective, reliable A/E link back-up solution that uses satellite backhaul on an as-needed basis to tightly control OPEX budgets without sacrificing reliability and security requirements.

DCME - the secret is out

Word is spreading that DCME solutions have come a long way from their humble beginnings as PCM voice compression solutions.  Today's advanced solutions offer a vast range of interfaces and varied processing capacity, allowing connectivity in diverse environments - from E1 links up to large STM-1 trunks, and even IP/Ethernet connections to an MPLS core over Fast Ethernet or Gigabit Ethernet interfaces (electrical or optical).

But while there is a vast array of solutions available, operators must research vendors carefully to ensure the solution supports key criteria such as bandwidth management, a crucial component in maintaining revenue continuity and carrier-grade voice quality in traffic congestion situations. Other criteria can include 16:1 bandwidth compression (20:1 for telephony), 8:1 SS7 optimization, high-quality mobile codecs, voice and data aggregation, backbone protocol independence, integrated traffic monitoring and versatile connection capability. The checklist might seem exhaustive, but it is only with this complete feature set that a DCME solution can deliver the bandwidth efficiency, OPEX savings, exceptional voice quality and network reliability demanded by today's mobile operators.

What do mobile operators do now that many of their services are reaching saturation point?  How do they continue to develop new and innovative ways for people to communicate that are as universally embraced as voice and text? Allen Scott contends that Mobile Instant Messaging will become the third key operator communications channel in the future

Mobile messaging services have never been so popular, with SMS still the hands-down winner. According to recent forecasts by Gartner (December 2007), 2.3 trillion messages will be sent across major markets worldwide this year. That is almost a 20 per cent increase from the 2007 total. Whilst the growth in traffic has been (and is predicted to remain) nothing short of phenomenal, for most operators the growth in volumes will not be anywhere near matched by a growth in revenues. Many are already seeing the flattening of messaging revenues.


Operator margins on messaging services are going to become ever slimmer as competition and market saturation bite deeper. Gartner estimates that the compound annual growth rate (CAGR) for SMS revenues is expected to fall by almost 20 per cent over the next four years to just 9.9 per cent. Rather than trading blows in a competitive game of one-upmanship over highly reduced bundled text tariffs - or even giving text away for free - forward-thinking mobile operators are beginning to realise that they should be planning right now for a future where current messaging services are fully commoditised and the margins greatly reduced.


The challenge is to replace the revenue. There is a need to develop new services that drive additional revenues, increase customer loyalty and develop new streams of income for operators.  The challenge has not changed in the last five years. Yet a host of services - from MMS to WAP and from Mobile Television to video calling - have failed to capture the imagination of the public.


Perhaps it is time to go back to basics.  So far, the most successful operator services have all been channels of communication rather than specific services.  What do I mean?  Voice and text are both simple to understand and use.  Most importantly, both provide channels with which to communicate.  No one tells users what to say or write in voice and text. What is provided is a simple communications channel for the user to use as he or she sees fit.
Operator services that have failed to capture users' imaginations have tended to provide very specific services rather than communications channels - or they have tried to recreate a PC experience in a way that is not suitable for the mobile. WAP, for example, is a specific service providing browsing on a mobile phone. However, it is slower than browsing on a PC, and the results are usually difficult to read and to navigate. Similarly, MMS is a specific service offering the user an opportunity to send photographs from mobile to mobile. Ultimately, volumes of photo messaging are likely to stall (probably in the next year or two) as mobile subscribers increasingly share photos through mobile communities and social network portals rather than sending them directly to one another.


So what are the new communications channels?  The most obvious one today is mobile Instant Messaging.  Mobile IM offers operators a new opportunity because it provides a comprehensive new communications channel and not a specific service. Though mobile IM's uptake today compared to SMS is still relatively small, operators are taking the service seriously.  Indeed, early indications are that mobile IM has a growth pattern that matches, and in some cases exceeds, that of SMS at the same time in its development.
IM has at times been unfairly seen as little more than a service extension to the PC. But the reality offers so much more, with operators increasingly recognising the opportunity to create a global partnership to truly take mobile to the Internet, and vice versa. The combination of messaging, presence, and conversations, plus the ability to attach links or pictures, provides an incredibly vibrant solution for the mobile environment. It's about much more than driving additional active subscribers.


IM is actually ‘text talk.' It is differentiated by two key points: presence and interactivity. Presence means subscribers can tell their connections what they want to do.  Are they busy? Are they free to communicate? With SMS, there is a restricted level of interactivity and uncertainty as to whether a text message recipient is ready or able to communicate. With Mobile IM, interactivity delivers a text conversation in its truest form - and offers an unmatched user experience.  IM allows that conversation to become more real-time, more intuitive, and more content-driven.


Instant messaging can, and already does, have numerous different forms. There are multiple applications (business, social, educational) and multiple channels (via ISP or mobile operator) for delivery.


A lot has been intimated about the risk to SMS revenues due to ‘cannibalisation' by mobile IM.  This has not been the experience within the operator community.  Turkcell has already announced that it has seen an increase in revenue from mobile IM users, who are now using more voice and SMS.  The nature of the medium is conversational, chatty, and public.  Turkcell users were starting conversations in mobile IM and then breaking off to call or text one of the participants whilst the conversation continued.


So far, there have been more than 60 mobile IM launches across the globe.  Operators as diverse as Vodafone, Vimpelcom, Turkcell, 3 UK, TIM, and Tele2 have launched mobile IM services.  Some are market leaders and others are new entrants.  Some have launched ISP branded services, whilst others have launched their own branded services.  Some have even launched both!  The common bond amongst them is that they see benefit and revenues from the mobile IM opportunity.  NeuStar's operator customers alone have an end user base in excess of a third of a billion mobile subscribers.


Most important, though, is the potential mobile IM has to transform the delivery of information and data to mobile across the Internet. Browsing on a mobile is a slow and frustrating activity. Through the use of IM ‘information buddies', mobile IM services can provide users with access to information like news and sport headlines, train timetables, weather information, and so forth. This is relevant, useful information delivered in a way that is suitable for the mobile environment - short, sharp bursts of relevant information delivered quickly and in an easily digestible form.


Furthermore, the ability to share is a fundamental cornerstone of mobile IM.  Today it is sharing messages, but tomorrow it could be any number of things: bookmarks, photos, video, phone numbers, or location-based information.


To sustain growth over the next few years, operators are likely to look to IM and social networking applications to drive traffic, working either with popular established ISPs and social networking sites or creating new communities in which people can gather and communicate.


The growth of mobile IM has been impressive.  A number of operators went public with announcements in the latter half of 2007 and the early part of 2008 regarding service success so far.  These included 3 UK, who announced more than one billion mobile IMs sent in less than a year.  Vodafone Portugal announced an early success milestone of 100 million messages, and Vimpelcom commented favourably on the launch of its Beeline service.
The challenge for the industry now is to seize the moment.  SMS became a ubiquitous service when it opened its doors to interoperability.  Before this, users needed to know which network someone was on to message them.  Mobile IM is even more complex, with interoperability issues between different ISPs as well as different operators.  But consumers do not care about this.  They simply want to communicate.  The opportunity is there to be seized.  If operators engage with the ISPs and with other operators, then there are real achievements to be secured. We, as an industry, must keep looking forward to the bigger picture. The apparent danger is in emulating the growth of the Internet, which took ten years to deliver real value - and a lot of money was lost along the way.


This may all sound ambitious, but as long as operators embrace the vision and address the demands of interoperability, IM has the propensity to drive new revenues in a host of ways.  No one expected SMS to be the roaring success that it has been since it embraced interoperability.  This year, mobile operators will have 2.3 trillion reasons to be thankful it has been.  Mobile IM has as much, if not more, potential to succeed.  The technology is in place - it just needs to be well executed. 

Allen Scott is General Manager of NeuStar NGM www.neustar.biz/ngm/

    
