Features

The mobile advertising sector will be worth $18.5 billion by 2010, largely because advertisers want to take advantage of the most exciting channel for delivering targeted messaging in the history of advertising, but also because operators want to supplement their traditional business with an additional revenue stream, says Cathal O'Toole

Mobile advertising presents a huge opportunity for both the relatively young mobile community and the well-established advertising and media industry. For both, there is the chance to be in at the start of something big. For mobile, it is expected to be one of the most important revenue-generating opportunities presented by the technology. For the advertisers, this exciting means of mass-audience targeting and message delivery will, as Telefonica O2's CEO Peter Erskine predicted in 2007, grow even faster than Internet advertising, which has already surpassed radio. He added: "It seems inevitable that the mobile screen - just as cinema, TV and PCs before - will be used for advertising, and when you consider that there are a lot more mobiles than any other device, the rise and rise of mobile advertising is unstoppable."


In the context of the wider world, mobile advertising is something that will soon positively affect everyone who has a mobile phone, so it is important that both the advertising and the mobile telecoms industries get its design right from the start. Through a co-operative, considered approach, they can create an optimum mobile advertising ecosystem from the beginning.


There are, however, a number of difficulties that must be overcome in order to take advantage of this opportunity. Firstly, the lack of mutual understanding between the two sectors must be addressed through co-operation. Historically, advertising and mobile communications have evolved at different times and in very different ways, resulting in very different industrial cultures, languages and approaches to doing business. For the mobile operators to take advantage of the rise of mobile advertising, they will need to implement solutions that speak the language of the advertising community - Cost per Impression, Cost per Click, Cost per Acquisition - rather than Messages per Second, Transactions per Second, and so on. And these solutions, once implemented, must allow the easy placement of advertisements by media and advertising agencies, using interfaces similar to those they are used to for the Internet or, indeed, for traditional media.
Secondly, the technical fragmentation and complexity of the mobile channel presents a potentially confusing variety of options as to where to place an advert: text message, multi-media message, ringtone, during an interactive voice menu, during browsing or downloading, or in on-line chat sessions. In addition, there is a bewildering breadth of terms to describe these options - SMS, MMS, RBT, IVR, IM, etc - and technical restrictions on the nature of the advert that can be delivered via each of them, in terms of such variables as size, timing and ease of response. A text message, for example, offers a very basic text experience but is compatible with all handsets, while a multi-media message offers a much richer media option but is not compatible with every mobile device. Each of these technical options offers different possibilities for the delivery of advertising campaigns, and each will suit some types of campaign better than others.


As a result of this complicated picture, it will be crucial for the mobile industry to guide the advertising world through this maze in order to achieve optimum results. Operators must work with the advertising industry to simplify the use of these technologies. Organisations like the GSMA and MMA are already working to set guidelines and advisory statements to guide the development of this business, and they will be instrumental in designing the best way forward for mobile advertising. In 2007, the GSMA announced its ‘Mobile Advertising Programme' to ensure the establishment of guidelines and standards in support of this new sector of the industry. The operators must continue to support this work and to develop an environment that encourages advertisers and agencies to deliver campaigns over mobile networks.


Thirdly, there is the challenge of commercial unfamiliarity on both sides. Continuous debate is taking place about how to price an advertisement on a mobile and how to adjust this price based on the degree of relevance, the timing, the media content and the ability to respond. For their part, the operators are afraid of losing out through inexperience of the market and uncertainty about what their assets are worth to the advertising community. The advertisers, on the other hand, fear that the ‘unproven' advertising channel offered by mobile communications will be ineffective, and so they are hesitant about committing large percentages of their marketing budgets to this embryonic vehicle.


Overcoming these obstacles will only happen with time and experience in the form, perhaps, of some early trial agreements. But it is even more important that all parties enter into a co-operative atmosphere conducive to learning what will or will not work for all involved. If the above three difficulties are addressed then the advertising industry and the operators can quickly take a strong position.


Mobile advertising can take many forms, each of which has its own characteristics that make it suitable for specific campaigns. Broadly, the advertising media can be broken down into three categories: advertising over messaging, advertising during browsing, and advertising using media or VAS applications.


Advertising over messaging is where advertisements are sent using SMS, MMS, Instant Messaging, or other messaging media. Mobile subscribers have been experiencing push advertising over SMS for a number of years but this has taken the form of unsophisticated, outbound, and untargeted marketing messages. Still, SMS remains an extremely powerful vehicle for mobile advertising delivery. Indeed, SMS delivery alone is projected by some industry sources to account for US$9 billion of mobile advertising revenues by 2011.
For SMS and MMS options to work in the delivery of a mobile advert, the operator's mobile advertising platform, or advertising engine, needs to communicate with the network nodes. For peer-to-peer messages, the SMSC checks whether the sender is an advertising subscriber and, if so, alerts the operator's advertising engine. The engine selects the appropriate targeted advert for the subscriber based on specific user profile criteria - for example time, sender profile and receiver profile - inserts the advert into the SMS or MMS, and returns the message to the SMSC (or MMSC), which then completes delivery to the recipient.
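To make that flow concrete, here is a minimal sketch of the SMSC/advertising-engine interaction described above. All names, data structures and selection criteria are invented for illustration; a real platform would integrate with the operator's subscriber database and campaign-management systems.

```python
# Hypothetical sketch of the SMSC-side advertising hook described above.
# AD_SUBSCRIBERS and CAMPAIGNS stand in for the operator's real systems.

AD_SUBSCRIBERS = {"+353861234567"}    # senders opted in to ad-funded messaging
CAMPAIGNS = [
    {"text": "Try BrandX today!", "hours": range(9, 18), "segment": "youth"},
    {"text": "Weekend offer from BrandY", "hours": range(0, 24), "segment": "any"},
]

def select_advert(hour, receiver_segment):
    """Advertising engine: pick the first campaign matching time and profile."""
    for c in CAMPAIGNS:
        if hour in c["hours"] and c["segment"] in ("any", receiver_segment):
            return c["text"]
    return None

def handle_p2p_message(sender, body, hour, receiver_segment):
    """SMSC hook: append a targeted advert to messages from opted-in senders."""
    if sender in AD_SUBSCRIBERS:
        advert = select_advert(hour, receiver_segment)
        if advert:
            body = f"{body}\n--\n{advert}"    # insert the advert before delivery
    return body                                # hand back to the SMSC for delivery

print(handle_p2p_message("+353861234567", "See you at 8", 10, "youth"))
```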
The specific campaign being run by the advertiser will determine the exact content of the messages. Consumers targeted by the ad message might, for example, be asked to send a text to a 5-digit short code promoted through existing TV, radio, online or print media, in order to receive the offered product, service or other brand information. In this way, traditional standalone forms of advertising, such as outdoor billboard and TV, are being turned into interactive media through the power of text messaging, enabling the target audience, for example, to text in for free samples of any number of products, services and consumables.


In addition, some of the larger mobile operator brands are eliminating the use of direct mail and using multimedia messaging instead. This enables the marketing teams to create graphically rich messages incorporating animations, audio, images and streaming video. Not only will such campaigns be more cost-effective without print and mailing charges, but they will also be more rapid in their execution and more effective in eliciting a response from the audience than other means. With phone in hand, the willing recipient can respond immediately to an advertising offer with calls to call centres (or by requesting in-bound calls to the handset) or click-throughs to mobile Internet sites.


Secondly, there is advertising via an Internet browser offering a similar experience, though in miniature, to advertising on the Internet. With a huge surge in the number of mobile Internet sites available to advertisers, typically as companion sites to traditional web pages, combined with the continuing rise in numbers and sophistication of mobile audiences, it is becoming more and more viable for advertisers to consider the positioning of display-type ads on such mobile sites.


As with traditional online banner ads, mobile Internet ads consist of text and/or graphics, and offer the target consumer a variety of response options. A simple click-through, for instance, may reach a product registration page, or there may be a click-to-call option initiating an outbound call to a call centre. A click-to-buy option is also one possible route, with a mobile Internet purchase appearing on the consumer's normal phone bill. There may also be a simple click requesting a text message reply for further product or service information. At present, mobile Internet display advertising has been shown to be up to ten times more effective than Internet banner ads in terms of response rates.
Thirdly, there is Media/VAS-related advertising and handset/content-related advertising. Media/VAS-related advertising is where an advert is inserted into a service experience such as Ringback Tones or Interactive Voice Response (IVR), on mobile TV, or using idle screen time on a handset. An IVR prompt, for example, may say to the caller, "Before entering your PIN to retrieve your messages, did you know that ‘Brand Name' is on offer...", or deliver similar promotional messages.


The methodology behind this option is similar to messaging-domain advertising, although the advertising message is received via some form of application, such as a Voice Messaging or Ringback Tone service, provided by a third-party content provider outside the network.
Although the media/VAS domain, as a vehicle for mobile advertising, is the newest option, one major brand has already utilised this method using idle screen time to great effect in the Far East last year. Targeting subscribers on the AIS network in Thailand, an interactive content campaign was run on behalf of the Honda motor company, with messages broadcast twice daily appearing on millions of users' phone screens but only when they were idle. Offering tips about motorcycle safety and fuel efficiency, the campaign's main aim was to promote the brand and encourage user response through the incentive of a click-through prize-draw.


Within three weeks, more than three million unique impressions, targeted at subscribers in the Bangkok area, were generated, and more than 100,000 users clicked to participate in the prize draw - and receive more information from Honda. These results show the ability of the mobile phone to be a truly mass-market advertising vehicle.


It should be noted, however, that any mobile advertising campaign may draw on a number of the above possibilities, combining, say, idle-screen with SMS, mobile TV and non-mobile advertising media. Each module of a campaign may also invite interaction through another form, such as offering short codes to TV viewers so that they can seek out, via their mobile, the next leg of the ad campaign journey.


It is important that the mobile operator moves quickly to discuss the mobile channel with a solution provider that has experience building advertising solutions, and with a media agency that has early experience or understanding in the delivery of advertisements on the mobile channel.


It will then be essential to set up trial advertising campaigns in order for the advertising side to begin building experience and understanding of the operator organisation. Telefonica O2 ran trials last year with "encouraging signs of customer receptivity" and "no negative impact on overall customer experience or brand perception".

Once all operators take part in such evaluation, they will find that the opportunities for driving advertising revenues will grow, and revenue will start to be realised from previously unprofitable traffic on the network.

Cathal O'Toole is Product Manager, Jinny Software  www.jinny.ie

Test and monitoring has transitioned from being a ‘necessary evil' to maintain the network, into a key business enabler, according to Michele Campriani, allowing operators to rapidly expand their service portfolios and at the same time reduce operating expenses

The race is on as mobile operators across the globe accelerate their migration path to mobile data services.  This rush stems from explosive customer demand for web-based services, the opportunity to capture increased average revenue per user (ARPU) while reducing operating expenses, as well as the desire to stay one step ahead of the competition. Most industry insiders agree - in the not-so-distant future, it will be rich mobile services such as videoconferencing, mobile gaming and presence that will set mobile operators apart. 
As the technical early adopters have experienced, however, the transition to converged networks supporting mobile data services presents complicated challenges.  At the heart of the problem lies the fact that most of the time, IP and PSTN-based services are operating in parallel during the convergence phase. While the operator may well understand how to measure service quality and SLAs on legacy traffic, new IP-based services are much different, and require vastly different monitoring techniques and service quality measurements. 
Hence this new world of mobile data services has greatly expanded the role of network troubleshooting and monitoring tools. In essence, they have transitioned from being a ‘necessary evil' to maintain the network, to a key business enabler, allowing operators to rapidly expand their service portfolios and at the same time reduce operating expenses.  As the nature of networks and services has evolved, so have these tools.  In the sections below, we take a closer look at how a mobile operator can most effectively utilize protocol analysis and network monitoring tools to successfully launch and manage mobile data services.


New access technologies such as 3G/UMTS and 3.5G/HSPA have finally provided the cost-effective ‘edge' bandwidth required to offer mobile data services. Once these services are offered, the operator will likely experience a burst in network usage, driven by adoption of the new services as well as by the porting of legacy traffic onto the new infrastructure.
At this point, the core of the mobile network is one of the most vulnerable areas. This is because the core network is typically tailored to access network bandwidth, so as the operator migrates from GSM to UMTS to HSPA, for example, weaknesses in the core infrastructure are amplified.

   
It is critical at this point that the operator be able to correlate and monitor all activity occurring over network elements interconnected through various protocols and interfaces. The operator must simultaneously correlate information exchanged by each device involved in transactions, including those on the internal signaling network as well as external connections for calls made to/from subscribers of other operators.


The complexity lies in the fact that there are many signaling exchanges and interfaces involved during data transactions, each giving a different level of visibility into network and service issues.  For example, in a UMTS network, the Gn interface is the most crucial in providing overall visibility into the network (e.g. for ‘macro' problems and issues related to authentication with external networks), while the Gi interface provides information on the quality of IP traffic and services.  So unless the Gn and Gi interfaces are correlated, there is no way to test the interconnection between the operator data network and the external data network (e.g. the Internet).


Thus managing each of these interfaces, as well as all of the traffic traversing them, becomes critically important for high-quality mobile data services.  This type of monitoring is readily provided by a new breed of distributed monitoring system.  Here are example interface correlations and the key information they provide (a simple sketch of the correlation idea follows the list):

  • Gn/Gi - to check the interconnection between the operator data network and the external data network, for a snapshot of the state of services. After the Gn/Gi correlation, monitoring either the Gb or the IuPS interface helps triangulate the location of a problem
  • Iu/Gn/Gi - comprehensive view of both the core and access network signaling messages of a 3G / 3.5G data session (PDP context)
  • Gb/Gr - to decrypt signaling messages of a data session over the Gb
  • Iu/Gr - to analyze the data session activation and authentication phases
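To make the correlation idea concrete, the sketch below joins hypothetical Gn records (PDP context activations) with Gi records (per-flow IP service data) on the PDP address assigned to the subscriber. The record fields are assumptions for illustration; commercial probes expose far richer data per interface.

```python
# Illustrative Gn/Gi correlation: tie a subscriber's signalling (Gn) to the
# quality of the IP services they used (Gi) via the assigned PDP address.

gn_records = [  # PDP context activations seen on the Gn interface
    {"imsi": "272011234567890", "pdp_ip": "10.0.0.7", "apn": "internet"},
]
gi_records = [  # per-flow service records seen on the Gi interface
    {"ip": "10.0.0.7", "proto": "HTTP", "bytes": 51200, "errors": 0},
    {"ip": "10.0.0.7", "proto": "DNS",  "bytes": 312,   "errors": 2},
]

sessions_by_ip = {r["pdp_ip"]: r for r in gn_records}
for flow in gi_records:
    session = sessions_by_ip.get(flow["ip"])
    if session:                     # this flow belongs to a known PDP context
        print(session["imsi"], flow["proto"],
              f'{flow["bytes"]} bytes, {flow["errors"]} errors')
```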

Compared with voice services, mobile data services require vastly different monitoring and troubleshooting techniques. For legacy voice, it is generally assumed that if the network has high QoS, service quality is good. For data services, this is no longer true. This is due to the fact that most mobile data services are UDP/IP and TCP/IP-based, with many of them being high-bandwidth and interactive, hence highly affected by packet loss, TCP resets/latency and application-layer issues such as DNS and HTTP anomalies.


Thus for data services, understanding the service actually experienced by the end user (or ‘quality of experience', QoE) now becomes the important metric. Unfortunately, QoE cannot be provided by the usual network node element metrics and tests.  Here are basic measurement guidelines for the major mobile data service types.

  • Background services such as web, FTP and e-mail are not time-sensitive, so delay and jitter are not a big issue. More important for these services are throughput per call, traffic per call and packet loss.
  • Streaming services such as webcasting and video viewing are much more real-time sensitive. For these services, jitter and delay are the most important measurements.
  • Conversational services such as video calls and mobile gaming must be assessed on all of the above: throughput, packet loss, jitter and delay.

Thus the operator must now integrate more measurement types, and know how each affects the others.

As an example of these new measurements, the screenshot in Figure 2 shows a few key TCP user-plane statistics that can be used either at a summary level or at a drill-down level to assess how TCP resets and TCP delay times are affecting services. TCP is a particularly important protocol to monitor for mobile data services, as it most directly affects service responsiveness once a service link has been established.
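As a rough illustration of such statistics, the following sketch computes a per-server TCP reset rate and mean connection set-up delay from hypothetical per-connection records of the kind a monitoring probe might export; the field names are assumptions.

```python
# Summary TCP user-plane statistics from hypothetical per-connection records.

from collections import defaultdict
from statistics import mean

connections = [
    {"server": "198.51.100.10", "reset": False, "setup_ms": 180},
    {"server": "198.51.100.10", "reset": True,  "setup_ms": 950},
    {"server": "203.0.113.5",   "reset": False, "setup_ms": 210},
]

stats = defaultdict(lambda: {"total": 0, "resets": 0, "delays": []})
for c in connections:
    s = stats[c["server"]]
    s["total"] += 1
    s["resets"] += c["reset"]          # True counts as 1
    s["delays"].append(c["setup_ms"])

for server, s in stats.items():
    print(f"{server}: {s['resets'] / s['total']:.0%} resets, "
          f"mean set-up delay {mean(s['delays']):.0f} ms")
```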


Perhaps the most important element of service monitoring and troubleshooting lies in understanding what is happening at the application layer. After all, it is protocols such as DNS and HTTP that ultimately determine service availability. In Figure 3, we see a summary-level view of DNS response codes over a selected set of records. In this example, we see clearly that we must drill down further and investigate the cause of the DNS name failures to achieve high service availability. Other DNS-related information worth investigating includes average DNS response times, top DNS addresses and occurrences, and DNS query types (host, mail, domain name, etc). This measurement and the one above are only a small sampling of the new measurements that must be learned.
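As a simple illustration of this kind of summary view, the sketch below aggregates DNS response codes over a set of hypothetical records and flags the name failures that would merit drill-down.

```python
# Summary-level DNS view: count response codes and list failing names.

from collections import Counter

dns_records = [
    {"qname": "wap.example.com",  "rcode": "NOERROR",  "rtt_ms": 40},
    {"qname": "bad.example.com",  "rcode": "NXDOMAIN", "rtt_ms": 35},
    {"qname": "mail.example.com", "rcode": "NOERROR",  "rtt_ms": 55},
]

print("response codes:", dict(Counter(r["rcode"] for r in dns_records)))

failures = [r["qname"] for r in dns_records if r["rcode"] != "NOERROR"]
print("names to investigate:", failures)      # candidates for drill-down

avg_rtt = sum(r["rtt_ms"] for r in dns_records) / len(dns_records)
print(f"average DNS response time: {avg_rtt:.0f} ms")
```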

  
In summary, while the lure of competitive differentiation, OPEX reduction and increased ARPU are driving operators towards mobile data services, there are significant challenges that must be overcome for successful introduction. At the most basic level, the operator must integrate a multitude of new measurements and network monitoring techniques that provide insights into the IP-based services.  The good news is that significant advancements have been made in capabilities of network monitoring and protocol analysis systems. Mastery of these new tools is critical for the operator hoping to introduce and manage multimedia mobile data services. 

Michele Campriani is General Manager of Sunrise Telecom's Protocol Product Group. www.sunrisetelecom.com

It has now been proven that access to digital communications has a direct and measurable impact on economic growth. Yet despite this, Janne Hazell explains, huge numbers of emerging market communities, often located at the very source of the natural resources which fuel the economy, still remain cut off from basic telecommunications. Herein lies the paradox of the Digital Divide: providing communications to drive economic development is, in itself, cost prohibitive. Though several key barriers have been overcome, one still remains: the cost of transmission to, from and between these people

Significant advances in wireless technologies, together with the economies of scale resulting from hugely successful global initiatives such as GSM, have combined to provide much of the world's population with cost-effective wireless communications. Competition between equipment suppliers, increased government subsidies and important initiatives, such as the GSM Association's ultra low-cost handset initiative, have all helped to drive down network costs. Mobile network operators (MNOs) and, ultimately, the wireless users themselves have benefited from this evolution, and we can now experience cost-effective broadband communications. Many MNOs are now focusing almost exclusively on operational overheads, looking to outsource the operation of their networks and drive down costs further. While the drive to reduce the operational cost of running networks has taken centre stage globally, within emerging markets, where arpu is at sub-US$10 per month levels, operational overhead costs can often be a bridge too far, leaving communities cut off from basic communications.


While costs vary depending on local factors, one key cost stands out consistently across the world's mobile networks - transmission costs, and in particular transmission costs to and from base stations. Many alternative technologies exist today. Within the reach of the terrestrial telecommunications grid, optical fibre or copper are the dominant technologies, and short-range microwave links are also common. Cost-effective microwave technology is also dominant for longer distances, although the cost of installing and managing towers has made this an increasingly expensive option. As a result, the use of satellite as a backhaul technology has accelerated, yet this too brings with it operational cost barriers.
Therefore, the focus has now shifted to the reduction in transmission costs to and from wireless base stations and, with the majority of communications (particularly within emerging markets) being local within the communities themselves, the development of technology to address this local communications requirement.


While the price of network equipment and mobile terminals has fallen sharply since the introduction of global digital mobile networks in 1991, the same deflationary trend has not been visible for transmission costs. Transmission costs have been growing steadily, to the point where it is now estimated that anything from 15 to 80 per cent of the total cost of ownership of a BTS relates to backhaul transmission. While it is clear that the need for cellular backhaul will never be eliminated, it is also clear that for any area with costly backhaul it is a waste of resources to carry local communications back and forth over the backhaul link. That is to say, when two subscribers are located in the same area, there is no reason why their intercommunications must travel across the transmission network just to return again to the same area. Industry estimates for such local traffic are as high as 60 to 80 per cent. MNOs are losing significant profit margins because technology has not been introduced to address the wasted cost of local traffic being carried back and forth across the network instead of being switched locally. By switching calls locally, a very significant share of the operational overhead can be avoided, making remote rural wireless service more economical for the service providers.


Another cost driver for cellular backhaul is the signaling specifications to which suppliers must adhere. The GSM specification calls for one or more 2Mbit/s backhaul links (1.5Mbit/s in North America) for any base station, depending on capacity. In reality, most remote rural sites only require a fraction of a 2Mbit/s link. Any excess capacity is wasted and adds operational overhead to sites that are a financial challenge to start with. A solution to the problem is to change the structure of the backhaul link. The trend is to move from PCM-coded TDM backhaul links to IP, as IP lends itself particularly well to dynamic link utilization and manipulation. While niche suppliers have introduced stand-alone equipment that converts the backhaul link to IP and removes all unnecessary transmission and silence, Ericsson, the leading telecom supplier, offers built-in support for optimizing the cellular backhaul link over IP. The integrated functionality ensures the best possible performance, even under severe conditions.


Not only does the high cost of backhaul in remote areas put a strain on the business case for wireless coverage, but the infrastructure and operating expenses must also be shared by fewer users than at urban sites. This leads to requirements for lower-cost sites and lower-cost deployments. With fewer subscribers active on the site, idle load on the backhaul link also comes into play. A 3 kbit/s idle load corresponds to a 12-hour voice call every 24 hours, or roughly 20,000 minutes per month. This is the same volume as the expected voice traffic from a 100-subscriber community. From a backhaul perspective, idle load thus doubles the backhaul cost per call.
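A quick back-of-envelope check of that equivalence, assuming the roughly 6 kbit/s per optimised satellite voice channel quoted later in this article:

```python
# Idle-load arithmetic: how much voice traffic does a 3 kbit/s idle load equal?

idle_load_kbps = 3.0
voice_channel_kbps = 6.0     # assumed optimised voice channel rate (see below)

call_hours_per_day = 24 * idle_load_kbps / voice_channel_kbps
minutes_per_month = call_hours_per_day * 60 * 30

print(f"{call_hours_per_day:.0f} hours of voice per day")   # ~12 hours
print(f"{minutes_per_month:,.0f} minutes per month")        # ~21,600, i.e. ~20,000
```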


Based on the above, the ideal solution for small remote communities is a base station with sophisticated link optimisation for long-distance traffic and local connectivity for local calls, which eliminates the idle load when no revenue-generating traffic is ongoing. The base station should be easy to deploy and maintain, require minimal space and work under generous environmental specifications.


We believe that Altobridge's Remote Community solution meets the above criteria. Not only does it deliver cost-effective wireless services to communities with 50 to 500 subscribers in remote or hard to reach areas, it also provides operators with the opportunity to switch locally in any network scenario, e.g. downtown Paris. The technology reduces the backhaul cost to levels below that of sites in the macro network.


The company has addressed the provision of wireless services to communities in the most hard-to-reach areas of the globe. Focusing on the use of legacy satellite technology for backhaul, Altobridge's Split BSC made GSM on commercial jets an attractive proposition for AeroMobile, with Emirates being the first airline to launch the service commercially. That same technology made GSM on merchant ships with as few as 21 crew members a commercially interesting opportunity for Blue Ocean Wireless, with the key to success in both those cases being the ability to minimise backhaul bandwidth utilisation, using existing low bandwidth satellite channels. And now, the same technology is being used to make the delivery of mobile communications to remote communities an interesting proposition for MNOs.


The core of the Remote Community solution is the Split BSC. In simple terms, the part of the BSC handling Radio Resource Management and all other communication with the BTS and the mobiles has been moved out to the BTS site, while the part handling communication with the MSC/Media Gateway is kept centrally. This split allows the BSC to manage the BTS and mobiles without any signaling going back and forth over the backhaul link. Similarly, the MSC ‘sees' all the traffic channels without having actual contact with them. Once a call is set up, the two halves of the Split BSC establish a connection over the satellite link and the call can be completed.


As the Split BSC resides on either side of the satellite backhaul link, it is in full control of how and what is transferred over the link. In addition to just optimising the payload by removing padding and silent frames, the Split BSC transcodes the signal. The transcoded signal requires 5-8 kbit/s per active voice channel, compared to 17-25 kbit/s required by competing solutions. In many areas of the world, the difference represents 4-5 US cents per minute, effectively the entire profit margin. Add to that the idle load and it becomes clear that the Remote Community solution is ideal for addressing small remote communities, profitably.
To further improve both the business case and the user experience, the Remote Community solution includes the Local Connectivity feature, which handles all local calls locally. With 50 per cent local calls, the backhaul cost drops by a further 50 per cent. The company's patented Local Connectivity functionality is unique in that call control, charging, supplementary service management and O&M remain in the control of the MSC. This transparency ensures that the investment in optimising the central core network and service layer is protected, and no expensive and complex distributed architecture is introduced. As Local Connectivity eliminates double satellite hops for local calls, users will also experience improved network quality. This usually leads to longer call holding times.
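Conceptually, the routing decision works as in the sketch below: a call between two subscribers camped on the same remote site is switched at the site, while call-control and charging events still flow to the central MSC. The function and data names are purely illustrative and do not represent Altobridge's actual interfaces.

```python
# Illustrative Local Connectivity decision at a remote site.

LOCAL_SUBSCRIBERS = {"+87012345601", "+87012345602"}   # attached at this site

def route_call(caller, callee, notify_msc):
    notify_msc("setup", caller, callee)   # MSC retains call control and charging
    if caller in LOCAL_SUBSCRIBERS and callee in LOCAL_SUBSCRIBERS:
        return "switch locally: no satellite hop for the voice path"
    return "carry the voice path over the satellite backhaul"

print(route_call("+87012345601", "+87012345602",
                 lambda *event: print("MSC event:", event)))
```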


For larger subscriber groups beyond the populations addressed by the Remote Community solution, the Local Connectivity feature is also available through Ericsson on their entire range of GSM base stations (see boxed text).


Using such a solution, operators have an attractive opportunity to provide communication to an untapped market - remote communities - that was previously considered too small or too costly to address. The universal service obligation now becomes a profit opportunity rather than a regulatory liability, and first movers will be able to lock in new subscriber groups if they recognise that the technology now exists to do so.
The digital divide has shrunk considerably!


Janne Hazell is Altobridge General Manager, Remote Community Communications

To enable a successful Web 2.0, we also need Internet 2.0, says Richard Lowe

Ten years ago the Internet had capacity to spare and the applications that it supported consumed relatively few network resources. The advent of Web 2.0 has changed all of that. We have seen, and continue to see, a proliferation of complex applications demanding ever-more bandwidth. This has led many to wonder exactly who will foot the bill for the necessary upgrades to the network.


Web 2.0 applications are very much a success story - services like Wikipedia, Facebook, YouTube and the wide variety of text and video blogs all seem to defy demographic boundaries and continue to experience stratospheric growth in users. However, there are genuine fears that the demands these place on bandwidth resources may ultimately overload the network and cause Internet meltdown. To enable a successful Web 2.0, we also need Internet 2.0.


The problem is that the Internet was never designed to deal with the increasing demands that are being placed on it. In this respect it bears a close resemblance to modern motorway infrastructure. In the past, no one predicted the number of cars that would eventually be on our roads; as a result, commuters are faced with chronic gridlock, especially during rush hour. Similarly, no one could have predicted the popularity of next-generation Internet applications. Interactive and video-rich Web 2.0 applications demand a great deal of bandwidth, which clogs the networks carrying the information and degrades overall performance. Furthermore, the IP traffic generated by Web 2.0 applications does not follow the one-to-many, top-down approach of most original web applications. As one senior network architect of my acquaintance frequently explains, "traffic can come from strange directions".


Though it may sound counter intuitive, the solution is not as simple as merely building larger networks; in much the same way as a motorway, extra capacity is soon consumed. It is a vicious circle: the more capacity that is provisioned the more innovative bandwidth-hungry applications are developed to exploit it. Of course, new capacity has to be built to ensure continued innovation on the web, but what we also need to do is to enable the intelligent management of network resources to support valuable services.


This is not about creating a two-tier Internet. Some people may express the value they see through increased payments to their Service Provider. Others may choose to prioritise their access line resources towards gaming at no extra cost, and sponsors or advertisers may choose to fund incremental network capacity for services like IPTV. We have learned over the last few years not to second-guess what business models might emerge. After all, who would have believed a few years ago that one of the world's most valuable companies would offer a free consumer search service funded by contextual advertising?


Internet operators already have to contend with the challenge of managing network resources for multi-play packages that include ‘walled-garden' services such as VoIP and IPTV. However, the problem is exacerbated when ‘over-the-top', web-based services - such as YouTube or the BBC's iPlayer - seek to exploit greater network capacity for substitute services without bearing the network costs. Real-time services such as video can be severely impaired, or fail altogether, if insufficient network resources are available to them.


This places operators in a difficult position. They cannot tolerate a decline in the overall quality of their network, but nor can they turn their back on third party services which drive broadband adoption and are highly valued by customers. In the same way as content providers need to monetise their content, network providers need to monetise their networks.


Operators can no longer expect to make sufficient profits through the sale of voice lines. Nor is Internet service delivery the salvation it once appeared to be. Most of the important global markets are open to alternative network providers and this has resulted in fierce competition - eroding prices and eating into the profitability of broadband provision. Many telcos are facing the possibility of becoming little more than a bit pipe for the provision of over-the-top services by third party suppliers.

  
This can only be countered by the provision of compelling value-added services that exploit the unique capabilities offered by ownership of the access and aggregation network, while ensuring that these premium services enjoy the quality of service required to differentiate them from over-the-top providers. Clearly there needs to be a proven economic case for the allocation of network resources to these services, rather than allowing all services to wrangle for resources in an ‘unallocated pipe.'


If telcos are to retain autonomy over the services they provide, they need to move into the sphere of rich media and web application provision, and leverage the best asset they have to compete effectively - the network itself. In order to do so, however, they will need fresh network management tools and a new business model.


The traditional approach to managing network resources for a particular service is partitioning. This carves off bandwidth resources specifically for VoIP, IPTV et cetera, and only admits the number of concurrent sessions that the resource can support in the access network - or what my friend, the network architect, calls "sterilizing bandwidth". Service quality is therefore only guaranteed when the network is ‘over-provisioned', an untenable and inherently risky approach in the Web 2.0 era.


For one thing, partitioning is a backward-looking approach, because the basic goal of migrating to all-IP networks is to have a common shared resource that is service and application agnostic. Partitioning the network only results in higher capital and operational expenditures because it is an inherently wasteful process: in looking to ensure quality, it leaves highly in-demand network assets ‘stranded' and idle. In the early days of IP networks, when the dominant traffic was voice with very little video, over-provisioning and partitioning were still possible, albeit inefficient, because voice consumes far less bandwidth than video and its growth and peak traffic patterns are more predictable. Video traffic, by contrast, is very bandwidth hungry and subject to large peaks - a bit like a motorway changing from nearly empty, with good flow and speed, to overloaded with vehicles in less than a minute, causing endless delays and stoppages with no apparent reason or warning. Trying to over-provision and partition for such demands will be economically impossible for Service Providers. Actual usage patterns may not match the capacity plan, leaving customers unable to access a service or application when, in fact, sufficient capacity exists in another partition or ‘silo'. Not only is this an inefficient way of managing current services, but every time a new service is launched, a brand new capacity plan has to be launched alongside it. This leads to extended time-to-market, repetition of work and expense, and an inflexibility that is a disadvantage in the competitive converged media markets.


A modern approach needs to be agile - there is not unlimited bandwidth to justify wasting resources. The solution lies in technologies that allow the carrier to treat the network as a holistic resource available to all applications.


The ETSI standards-based Resource and Admission Control Subsystems (RACS) permit available resources in the access network to be allocated dynamically rather than being pre-provisioned, thus ensuring that they are exploited in the most efficient way. Operax has enhanced the basic standards by proposing that the functionality operators require is "dynamic Resource and Admission Control" (dRAC). This brings dynamic topology awareness into the admission control and policy enforcement process - thus ensuring that services and sessions are truly guaranteed QoS on the basis of resources that are really available.
The functionality sits between the application layer and the network, a position from which it can act as the single point of contact through which applications request bandwidth - effectively isolating the service from the network resources. It is then able to enforce subscriber and service policies, allocating resources on a real-time, per-session basis and removing any need for applications to understand the underlying topology of the network.
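In outline, per-session admission control of this kind might look like the sketch below, which tracks the free bandwidth at each contention point along a subscriber's access path and admits a session only if every hop can carry it. The topology and figures are invented for illustration and do not represent Operax's implementation.

```python
# Minimal per-session admission control over a shared access path.

links = {"dslam-42": 80_000, "access-line-7": 6_000}   # free kbit/s per hop

def admit(session_kbps, path):
    """Admit a session only if every hop on the path has capacity for it."""
    if all(links[hop] >= session_kbps for hop in path):
        for hop in path:
            links[hop] -= session_kbps    # reserve capacity on every hop
        return True
    return False                # admitting would degrade existing sessions

print(admit(4_000, ["dslam-42", "access-line-7"]))   # first HD stream: admitted
print(admit(4_000, ["dslam-42", "access-line-7"]))   # second HD stream: refused
```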


In the same way, dynamic Resource and Admission Control is able to intelligently manage applications, services, subscribers and network resources according to the carrier's business policies.  All the different points of bandwidth contention are identified and are automatically processed before a session is set up. dRAC tracks the available bandwidth into a consumer's home and can ensure that a session is not set up if the necessary bandwidth is unavailable.
At present, applications are competing for bandwidth on a best-effort network. Automated management of bandwidth will not only ensure that service quality can be guaranteed for premium real-time services such as VoIP and IPTV, but can also ensure that over-the-top services have reasonably free access to resources. Quality can be guaranteed in the premium tiers of the network while still leaving room for innovation in web-based services.


More than merely saving operational expenditure by providing the most efficient technical support for services, this method of automated management may also allow operators to open new revenue streams and present new business opportunities. By treating bandwidth as a commodity that can be allocated dynamically, quality of service can itself become a monetising strategy. For example, if an individual customer wishes to subscribe to a ‘gold' standard of quality for a service, such as high-definition (HD) for IPTV, the RACS can monitor the capacity and automatically inform the customer of the available levels of quality. If there is only capacity for a ‘bronze' standard-definition (SD) class of service, the customer could be alerted before payment and charged appropriately if they choose to proceed. Alternatively, they can be offered a discount and priority if they prefer to access the ‘gold' session through a network digital video recorder (DVR) at a later time.
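A sketch of that negotiation logic, with invented tier bandwidths and a hypothetical network-DVR fallback, might look like this:

```python
# Offer the best quality tier the network can currently deliver.

TIERS = {"gold-HD": 8_000, "bronze-SD": 2_000}   # kbit/s per tier (assumed)

def offer(requested, available_kbps):
    if available_kbps >= TIERS[requested]:
        return f"deliver {requested} now"
    for tier, rate in sorted(TIERS.items(), key=lambda t: -t[1]):
        if available_kbps >= rate:        # alert the customer before payment
            return f"only {tier} available now; offer {requested} via network DVR later"
    return "no capacity: offer network-DVR delivery only"

print(offer("gold-HD", available_kbps=3_000))
```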


There is plainly a middle ground to be drawn between the current Internet model, which allows free access to all services, and a controlled, tiered Internet. Operators rightly want to see a return on investment in network technologies, but not at the risk of the competitive market. Personalisation is very much a buzzword of the Web 2.0 era: rather than the unknown quantity of provisioning through network partitions, automated resource and admission control can allow the operator to tailor its service levels to individual subscribers, ensuring guaranteed quality for tiered services while still allowing capacity for innovation in over-the-top services.

Richard Lowe is CEO, Operax

The ‘battle for the home' is a key development currently taking place within the telecoms industry, as mobile operators, fixed operators and VoIP providers fight for what was once the sole territory of fixed operators. Although low-cost VoIP and increased coverage continue to be key benefits associated with fixed-mobile convergence, the focus has now shifted beyond voice, as mobile operators realise that by taking control of the connected home they can also open the doors to new revenue-generating services and applications.  Steve Shaw takes a look inside

Increasing usage of mobile data has led subscribers' homes to become the next telecom battleground. With multiple providers struggling to increase share-of-voice and gain their cut of the available revenue, it has become a strategic imperative for mobile operators to own the home.


Analyst house Infonetics predicts that the FMC market worldwide will be worth $46.3 billion by 2010, so there is no doubt that the future will be a connected home. But the battle to own that space has only just begun, as operators across Europe develop and launch homezone services based on dual-mode handsets or femtocells. All eyes will be on the industry to see who rises to the challenge and how the market will develop.


Indoor voice and data usage represents one of the largest growth opportunities for mobile operators today. European operators are investing in ‘homezones' to attract new subscribers and increase customer loyalty. A Home Zone 2.0 (HZ2.0) service enables carriers to deliver mobile voice and data over the IP network, rather than the expensive outdoor macro network, when the consumer is located within the home or office zone. This can have huge financial benefits for the carrier and also opens the doors for high-bandwidth services, such as downloading/uploading pictures.


The first HZ2.0 services are already live across Europe, and include dual-mode GSM/WiFi offerings from Orange and TeliaSonera. And with leading mobile operators, including T-Mobile and Telefonica O2, announcing femtocell trials across the continent, we will soon see the launch of femtocell-based homezones.


Orange's Unik service is a good example of the potential of the HZ2.0 concept. Since its deployment in France in September 2006, the service has delivered a 10 per cent increase in average revenue per user (ARPU), 15 per cent of its subscribers are new to Orange mobile, and subscribers with a Unik service churn three times less than standard Orange mobile subscribers.


Since HZ2.0 services usually offer low-cost or free calls from within the home, cost is an obvious benefit from the consumer's point of view, but one that VoIP providers and even fixed operators can also offer. With rising consumer demand for mobile data services, operators can make their HZ2.0 services work harder for them. By capitalising on this growing demand, operators can provide a compelling mobile data experience at home at a vastly reduced cost, thus forming the foundations for their ownership of the connected home.
In the home of the future, all devices will be connected (TVs, DVRs, cameras, games consoles, etc). This picture has long been discussed, but what has never been clear is quite how it will be enabled. For the mobile operator to create that network and truly own the home, it needs to go beyond local data offload and improved coverage, and make the mobile handset the central device in managing and maintaining the connected home.
With the rise of a new generation of mobile handsets designed not only for voice services, but also data, such as the iPhone and Blackberry, the handset is fast becoming the primary access mode for e-mails, basic web browsing and social networking.
The vision of a connected home is an important part of the strategy of key players in the telecoms market, including Orange and Apple. Orange, for example, has recently announced the ‘Soft at Home' initiative, a joint venture with Sagem and Thomson that aims to facilitate the deployment and interoperability of digital equipment in the home.
Apple, a newcomer to the telecoms space, is connecting the Apple AirPort WiFi router to its range of computers and laptops, which in turn accesses its iTunes service, synchronises with the new iTV server, as well as a WiFi enabled iTouch. This vision consolidates around the iPhone - a central device that could bring the connected home together.
In the battle for the home, mobile operators will not only fight with fixed operators and VoIP providers, but also other home-service providers (such as Virgin and Sky for example) and even device manufacturers. And with new players like Apple and Google entering the telecoms space, the battle has been further intensified.


The key challenge facing mobile operators is to position the handset as the central device in the home, expanding the way consumers use and experience their phones and shifting the focus from voice to data services. The handset is without any doubt a key player in this game, as it needs to operate as the link between all the different devices that form the home.


More importantly, operators have a vital advantage over their rivals: they manage both the macro network and the ‘homezone', and so can provide a seamless experience across both networks. For mobile operators, FMC becomes the vehicle enabling them to own and add value to the connected home.


By understanding the service expectations of customers, operators can create a home network and deliver valuable and wanted services to a multitude of connected devices - thereby creating, owning and using the network within the home.
From a consumer's point of view, the convenience of the connected home is a natural next step that will enable the customer not only to have access to better coverage and lower-cost services, but also to personalise the handset and the services it controls according to his or her preferences and lifestyle.


We have already seen the shift towards data services, with consumers using handsets to access e-mail, download music and videos, browse the web and use social networks. In the future, this trend will only intensify, with consumers further personalising their phones and the content they access.


The launch of HZ2.0 services in key European markets and upcoming femtocell launches give mobile operators the opportunity to move beyond the proposition of low-cost voice. With new rivals entering the mobile space and service providers refining their strategies, the battle for the home has become extremely competitive.


By taking ownership of the mobile handset within the home and delivering high-quality, unique services, mobile operators can begin to build the case for their control over the connected home. From here, they can create, own and use the home network, since they understand the service expectations of consumers. By proving value through home service delivery to the mobile handset, operators can craft a connected home network and deliver valuable and wanted services to a multitude of connected devices, with the mobile handset at the heart of this new connected environment.


The homezone vision enables carriers to stay one step ahead of the competition and develop a long term connected home that cements the mobile operator as the service provider of choice for in-building communications and puts the mobile phone firmly at the centre of the next-generation in-home network.

Steve Shaw is Associate Vice President Marketing, Kineto Wireless 

The International Broadcasting Convention - IBC 2008 - will once again take over Amsterdam's RAI convention centre from 11th to 16th September this year. Ian Volans takes a look at aspects of particular interest to the communications marketplace, and points out that the event provides a unique opportunity to build bridges and understanding between industries with very different cultures

IBC, the International Broadcast Convention, is to the broadcast and video content creation sectors what the Mobile World Congress is to mobile.

Run by the industry, for the industry, the event celebrated its 40th anniversary in 2007 - quite an achievement for a technology-focussed conference and exhibition.  This year, the IBC Conference programme will run from Thursday 11th to Monday 15th September, while the exhibition, showcasing the latest technologies and services relevant to the creation, management and delivery of content, will be open from Friday 12th September to Tuesday 16th.


A key element of IBC's sustained success is its determination to stay abreast of new technological and business trends within the broadcast sector.  For example, HD TV was first discussed at IBC some 20 years ago, while two years ago NHK of Japan stunned the IBC audience of hardened TV professionals with a spectacular demonstration of Ultra High Definition TV.  Sessions on the potentially wide-ranging impact of Digital Cinema on the content creation industry have been attracting large audiences for the last few years, and filled the RAI's largest auditorium for one of the first public viewings of excerpts from U2 in 3D last year.


Historically, IBC has probably not been high on the list of events for the telecommunications and mobile sectors, with the possible exception of carriers supplying fixed links to broadcast networks or OB units.  However recent years have seen a steady rise in attendance from strategists and those responsible for defining operators' content offerings, as the potential of IPTV and mobile multimedia revenues to offset stagnant or declining voice arpu has risen up the agendas of telco and mobile operators. 


Anticipating this convergence and seeking to satisfy the thirst for knowledge within the content creation community, IBC established a dedicated Mobile Zone in 2005 and added an IPTV Zone in 2007.  Last year's overflowing seminar on Digital Signage - the use of IP-networked flat screens to distribute information, advertising, multimedia and TV content in retail environments, transport hubs and stadia - was the catalyst for a new Digital Signage Zone in 2008.


If the ‘delivery' part of IBC's remit to cover the ‘world of content creation, management and delivery' was originally understood to mean broadcast, today mobile, IPTV, the web and digital signage are all recognised as part of the distribution mix of the future.  IBC therefore provides a unique opportunity to build bridges and understanding between industries with very different cultures.  (IBC's unusual timetable spanning a weekend is, in itself, a reflection of one of the cultural differences - freelance directors, producers, camera operators and editors can attend IBC without impinging on their revenue generating working week!). 
In 2008, around 30 companies encompassing all aspects of the mobile TV and video value chains, including Qualcomm MediaFLO, MobiTV, Nagravision and Rubberduck Media, will attend. The IPTV Zone has expanded dramatically and will embrace nearly 50 companies.  Exhibitors around the specialist Zones and across the show floor addressing Mobile, IPTV and Digital Signage can be identified via the IBC product locator.
Mobile, IPTV and Digital Signage will also be recurring themes throughout the conference, seminar and business briefing programme.

The conference
The conference is once again built around five key theme days - several of specific relevance to the telecoms and mobile sector.

  • Thursday 11 September

Theme Day: Content Access via the Web explores how the significant trend towards consuming content across a multiplicity of web-connected devices - including mobiles, PCs, multimedia players and games consoles - will impact the traditional broadcast value chain.
Session 1 - Quality issues in IPTV - part one (IEEE Tutorial)
Session 2 - Quality issues in IPTV - part two (IEEE Tutorial)
Session 3 - Content over the web - case studies
Session 4 - The future outlook for content over the web

  • Saturday 13 September

Theme Day: The Digital Dividend: HD, mobile, broadband or new media? will examine the options and issues that arise from competing demands for the spectrum that will be released with the transition from analogue to digital terrestrial TV transmission.
Session 24 - The great spectrum land rush
Session 25 - Which services will we want enough to be willing to pay for?
Session 26 - Great idea for the digital dividend - but how are we going to pay for it?
Session 27 - Content mania - who will feed the digital dividend with programmes?
Elsewhere in the Conference Programme, Thursday's Technical paper sessions (sessions 9 and 10) will take an in-depth look at technical innovations in the area of IPTV and the networked home.


Two free sessions in Friday's conference stream on Content Production Trends will be of particular interest to the mobile and telecoms sectors.


Session 19 will be led by mobile veteran Ken Blakeslee, Chairman of WebMobility Ventures. Entertaining the Mobile Audience - Games and Rich Content Production and Distribution to an Interactive Audience of One will look at the dynamics of conceiving, designing, monetising and delivering entertaining content to the fourth screen.   Session 20 will explore User generated content - social networking and the democratisation of broadcasting.
Friday's Theme Day - Future Broadcast business - addresses many of the macro-economic issues facing the broadcast sector which will inevitably touch on the growing presence of the telecoms sector in a world of convergent services.


What Caught My Eye is one of the most popular features of the IBC Conference: an expert reports back on the most exciting innovations to be found on the exhibition floor.  Mobile takes centre stage in Sunday's What Caught My Eye - session 44 - when Mike Short, VP R&D at O2 and President of the Mobile Data Association will unveil the most exciting mobile-specific innovations to be found at IBC2008.


Sunday's theme day - Content production: technology, creativity and business in an era of headlong change - will provide an invaluable insight into culture and current drivers - both creative and commercial - within the broadcast sector.
Sessions 49 and 50 on Monday morning are taken up with a pair of tutorials from DVB & ETSI on Open Standards, Technology & Implementation.  These will look at the evolution of the Digital Video Broadcast standard to accommodate a number of distribution scenarios including DVB-H for handhelds.


The IET hosts another tutorial in Session 54 on Digital delivery and getting content to the consumer.
Monday's Theme Day - New Dimensions for the Big Screen - switches to the RAI auditorium and will provide insight into, and spectacular demonstrations of, the latest developments in the fields of digital and three dimensional cinema.


Conceived to complement the peer-reviewed IBC Conference, the free-admission Business Briefings provide a platform for content providers, application developers and technology companies to share their experiences of the impact that mobile, broadband and IP worlds are having on the creation, management and delivery of content.  Confirmed participants to date include Qualcomm MediaFLO and Nagravision on Mobile; Edgeware, Miniweb Interactive, the Open IPTV Forum and Dolby on IPTV; and Sony on Digital Signage.
 
Ian Volans is a mobile analyst and produces the Mobile, IPTV and Digital Signage Business Briefing programme at IBC

 

With the aim of giving users greater choice and flexibility, a number of European countries are looking to create a more competitive telecoms landscape, along the lines of the UK model. Dominic Smith looks at the best approaches to ensure success

‘Functional separation' is currently a hot topic of discussion in the telecoms world. The approach requires incumbent operators to separate their retail division from their network infrastructure and enable competing service providers to access the infrastructure on equal terms.


The UK went through this process in 2006, when BT created Openreach, a new wholesale business unit, which is required to give the same level of access to the local network to third-party service providers as it does to BT Retail. The UK is now widely regarded as a benchmark of just how successful functional separation can be, and EU telecoms commissioner, Viviane Reding, has cited it as a model for the rest of Europe. In line with this, Poland, Italy, Ireland, Sweden and the Netherlands are all now considering the introduction of functional separation.


The primary objective is to create a more competitive landscape, leading to more choice and flexibility for the end-user, greater service innovation and ultimately growth in the marketplace as a whole. There remains a risk, however, that the achievement of these goals will be jeopardised by the greater complexity of the back-office infrastructure.
The success of functional separation will largely depend on the circumstances surrounding its implementation. If it is imposed by the regulator, the incumbent telco will most likely become resentful and may even deliberately obstruct the process by making it difficult for other companies to integrate with its network.


By contrast, if the incumbent believes that functional separation is inevitable, it may choose to pre-empt any regulatory move and implement many of the key elements proactively. In this latter situation, operators will be more open to moving beyond the role of wholesale network provider and taking the opportunity to launch a range of complementary value-added services themselves. 


Whatever the scenario, as and when they are faced with functional separation all operators will have to manage the complex process of dividing their infrastructure in two. Many existing automated procedures such as provisioning and order handling will need to be split. Rather than having one fully integrated end-to-end system, operators will now have to offer integration with multiple retailers' systems and will therefore have to manage a ‘break in the chain.'


This becomes even more difficult because of the intrinsic differences between the types of support systems required to conduct wholesale and retail activities. While the retailer will invariably be dealing with a large number of individual subscribers, the wholesaler will typically be interacting with a much smaller number of customers, each with a wide range of connections. This will often have a significant impact on the underlying rating and billing systems architecture and on how CRM is managed.


One of the major challenges wholesale providers face in delivering equal access is in creating a suitable interface for their new retail partners for ordering and service provisioning.
Interacting with the underlying network is inherently complex. However, for true ‘equivalence', the wholesaler must provide equal access to all retail partners. If this new interface has a complex ‘telco' configuration, the operator will effectively be making it difficult for any non-telco retailer to interact with the wholesale network.
It will be harder and more expensive, for example, for a supermarket chain offering telecom services to interact with this type of wholesale interface than for a retailer with a telecoms background.
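
To make the point concrete, here is a minimal sketch - in Python, with every field name, profile and value invented for illustration; no real operator API is implied - of the kind of adapter a wholesaler might expose so that a retailer submits only business-level orders while the telco detail stays hidden:

# Hypothetical wholesaler-side ordering adapter: the retailer supplies a
# simple, business-level order; the wholesale system maps it onto
# telco-internal provisioning parameters.

def translate_retail_order(order: dict) -> dict:
    """Map a retailer-friendly order onto internal provisioning fields."""
    speed_profiles = {
        "broadband-basic": {"dslam_profile": "ADSL2+_8M", "qos_class": 3},
        "broadband-fast":  {"dslam_profile": "VDSL2_40M", "qos_class": 2},
    }
    profile = speed_profiles[order["product"]]
    return {
        "line_id": order["phone_number"],    # wholesaler resolves this to a physical line
        "dslam_profile": profile["dslam_profile"],
        "qos_class": profile["qos_class"],
        "requested_date": order["activation_date"],
    }

# A supermarket-style retailer need only supply what it understands:
retail_order = {
    "product": "broadband-fast",
    "phone_number": "+441632960961",         # fictional number
    "activation_date": "2008-09-01",
}
print(translate_retail_order(retail_order))

The simpler the order format the retailer sees, the lower the barrier to entry - which is precisely the ‘equivalence' problem described above.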


Smaller retailers tend to suffer most in this respect. Large organisations with a business model that projects high subscriber numbers may not be unduly concerned that there is a significant cost to set up and maintain that interface. By contrast, smaller companies with fewer customers, who may be looking to add new services to an already diverse portfolio, are likely to be discouraged by the prospect of incurring high costs in supporting such integration.


Bridging the gap between the wholesale and retail functions of a telco operation is clearly a challenge in itself. One potential solution is that the wholesale telco offers a ‘white label' service, incorporating everything from ‘bill on behalf of' to order management, so that the retailer no longer has to develop its own support systems infrastructure.
Another objective of functional separation is the delivery of service innovation. If this simply means repackaging existing services, then this can easily be carried out by the retailer, who can develop suitable pricing schemes for bundles of fixed, mobile, broadband and TV offerings.  The retailer might also want to bring in pricing per event, cross-service incentives or flat-rate fees, for instance, and these can all be achieved when equipped with the appropriate business support systems.
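
As a rough illustration of how such repackaging works in practice, the sketch below - with prices and rules invented purely for illustration - shows a retail rating step that combines a flat-rate bundle fee, per-event charges and a simple cross-service discount:

# Hypothetical rating sketch: flat monthly bundle fee, per-event charges,
# and a cross-service discount. All prices and rules are invented.

FLAT_RATE = 25.00                      # monthly bundle fee (fixed line + broadband + TV)
EVENT_PRICES = {"vod_rental": 3.50, "premium_sms": 0.50}
CROSS_SERVICE_DISCOUNT = 0.10          # 10% off events if the customer also takes mobile

def monthly_bill(events, has_mobile):
    event_total = sum(EVENT_PRICES[name] * count for name, count in events)
    if has_mobile:                     # cross-service incentive
        event_total *= 1 - CROSS_SERVICE_DISCOUNT
    return round(FLAT_RATE + event_total, 2)

print(monthly_bill([("vod_rental", 2), ("premium_sms", 4)], has_mobile=True))  # 33.1

All of this sits comfortably in the retailer's BSS; none of it touches the wholesale network.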


However, when it comes to new service innovation there is an inevitable dependence on the underlying network. If you want to support a 20Mb/s broadband connection, for example, you need to have a network in place that can provide this service.


To become more innovative, therefore, the retailer will need to rely on the capability of the wholesaler's network. If the wholesaler is reluctant to launch new services - for example if the investment required is outweighed by the returns expected - then the whole process may stall.


This gives agile Internet companies an opportunity to bypass the telco by making effective use of IP networks and by launching dynamic new services ‘over the top'. In other words, if the wholesaler is not innovative in creating network capability that the retailer can use, both risk being compromised by third parties.


Of course, service innovation is of little value if it does not benefit the customer. The main objective must always be to provide the best possible service.
In the traditional telecoms model, the customer should never experience an extensive time-lag between ordering a new service and receiving it. However, when the retail and wholesale functions are separated, the process can become significantly more complicated. Service level agreements will typically need to be put in place and the time taken to activate the customer's account and services can increase significantly.


Customers may experience anything up to a 30-day window in which the set-up of service takes place. While the regulator may dictate such a time period, from a systems and order processing perspective it is desirable that orders that cross the retail-wholesale divide are processed much more quickly than this. In short, installed solutions need to deliver straightforward customer service and ensure that both wholesale and retail parties generate revenue quickly and efficiently.


As a result, operators will increasingly want to work with business support systems (BSS) providers that not only have expertise in both wholesale and retail areas but can also provide a service that spans both types of systems and have experience of working in an environment where this sort of process has taken place.


After all, if the end goal of functional separation is to provide enhanced choice for the user and improvements in customer service, then the provision of equal access to the wholesale network and effective services to subscribers must be simple to manage.


Operators should look to partner with BSS providers who can deliver high-quality, pre-integrated solutions which can be easily linked to the surrounding wholesale and retail infrastructure. By offering this level of functionality, the solutions provider can reassure the operator that no data will be lost between systems, that processes spanning both sides of the functional divide will be carried out seamlessly, and that the end customer will benefit from streamlined customer service.


Incumbent operators may be less than enthusiastic about the drive for functional separation and claim that it is not in the best interests of end customers, but with EU commissioner Reding actively looking at this issue, the decision may well be out of their hands.
They need to be fully prepared not only to face the consequences of functional separation but also to ensure that they maximise the benefits both for themselves and for the customers they serve.

Text messages could raise thousands for charities while celebrating the life of a great man

To celebrate the 90th birthday of Nelson Mandela, his charitable organisations have come together to launch an international premium SMS service, allowing millions to text their own birthday message to the former South African president.


A mixture of shared and dedicated PSMS codes has been set up in over 20 countries around the world including the US, UK, South Africa, Australia, Spain and Germany, as well as many African nations, making this one of the biggest premium SMS fundraising initiatives ever staged.  Well-wishers text their own message to the short code in their country and receive a return thank-you message complete with a unique PIN, allowing them to view their message securely online at www.happybirthdaymandela.com.  The specially created birthday site also contains a raft of celebrity messages, both written and filmed, from Mr Mandela's supporters around the world.
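
The mechanics of the flow are straightforward. The sketch below - with hypothetical function names, and a dictionary standing in for a real datastore - shows the shape of the inbound handling just described: store the message, mint a unique PIN, and reply with a thank-you text:

# Sketch of the inbound premium-SMS flow: store the message, issue a unique
# PIN, reply with a thank-you. Real aggregator APIs differ; names are invented.
import secrets

MESSAGES = {}                        # PIN -> (sender, text); stand-in for a database

def handle_inbound_sms(sender: str, text: str) -> str:
    pin = secrets.token_hex(4).upper()          # unique PIN for secure web viewing
    MESSAGES[pin] = (sender, text)
    return ("Thank you for your birthday wish! View it at "
            f"www.happybirthdaymandela.com with PIN {pin}")

print(handle_inbound_sms("+27821234567", "Happy 90th birthday, Madiba!"))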


Launched on June 16th in the majority of territories, the Birthday Wishes Campaign coincides with Mr Mandela's 90th birthday concert in London's Hyde Park on June 27th.  The concert is broadcast internationally on television, radio and online and sees numerous stars echoing the call to action to ‘Text Nelson Mandela Happy Birthday'.  The campaign runs through until July 18th, Mr Mandela's actual 90th birthday, culminating in a card being presented to him in South Africa noting the number of messages received.

Managed by Nelson Mandela's HIV/AIDS global awareness campaign 46664, the service is strategically live for 46,664 minutes, and all profits received from text charges will go to the Nelson Mandela Legacy Trust and other charitable organisations supported by Mr Mandela.  These include The Nelson Mandela Children's Fund, The Nelson Mandela Foundation and the Mandela Rhodes Foundation.

46664 International Director Tim Massey says: "We are delighted to be launching this campaign allowing so many people around the world the chance to wish this extraordinary man happy birthday.  A text based service allows more people to feel part of the celebrations in a very simple way and we hope will raise significant revenues to support the ongoing work of the charities that bear Mr Mandela's name and 46664."

Telecoms consultant Tim Williams and Naqada have provided specific telecoms advice to the charity in setting up the complex international campaign.   Running on a system built and managed by UK-based Mediadeck, the campaign has been enabled by SMS aggregators Sybase 365, mBlox and Digital Services Group in South Africa.   46664 concert sponsor Zain has also assisted in switching on the campaign across its African territories.  A customer care service has been provided by Mediadeck with support from Elephant Talk, Mediatel and North West Nevada Telecom. With web design and build by V&H Sistemas Informáticos in Argentina, WAP site design by Adfrap in the UK and hosting by Rackspace, the Birthday Wishes Campaign is one of the first truly global charity PSMS services ever launched.
Details: www.46664.com

Customer experience will only improve when customers are viewed as individuals, not account numbers says Giovanni Pellegrini

In the business environment waste and inefficiency are quite rightly abhorred. Conversely, efforts are poured into increasing productivity and efficiency. Recent attempts in the field are ever more focused on achieving this objective via improved customer satisfaction and strengthened customer loyalty.

In the highly fragmented media environment, however, consumers are empowered to switch allegiance with the ease of a mouse click. Suppliers are becoming ever more aware that any action that is perceived as a slight by the consumer, such as the still widespread phenomenon of misaddressed mail, will be met with defection.

Recent research by Pitney Bowes Group 1 Software indicates that increasingly fierce competition has not been met by the implementation of successful retention strategies.
"The Dynamics of Defection" report found in fact that customer churn is on the rise throughout Europe having reached almost 19 per cent across key consumer industries in 2007. The mobile telecoms industry was found to be particularly affected by defections with just over one in five consumers (21 per cent) switching mobile telecoms provider in continental Europe in 2007.  Consequently European mobile telecoms providers are urgently shifting their focus from acquisition to customer retention.

In order to increase loyalty and improve retention, the telecoms industry needs to depart from an impersonal 'account' approach to campaign management - where elements of the communication cycle are handled remotely and/or disparately - and opt to create a two-way, business-to-individual-back-to-business closed-loop process.

Organisations have invested large amounts into implementing complex Customer Relationship Management (CRM) systems, but general opinion holds that they have failed to take off due to a lack of change in corporate mentality: customer experience cannot improve if customers are still viewed as account numbers and not individuals.

Sets of activities that lie at the heart of CRM and come under the heading "customer communications" have in the past been largely overlooked. These activities include everything from data management and address data quality, through personalised document generation, electronic bill presentment and payment (EBPP) and document management, to call centre operations.

To ensure these customer communications fulfil their purpose and truly engage the customer, it is necessary to integrate them with the appropriate business processes they connect with, however disparate they may appear. The analysis drawn from these integrated information streams equips businesses to reach out to their customers intelligently. This integration of key business processes and their related information streams into CRM defines and drives Customer Communications Management (CCM).
While CRM is fundamentally customer facing and outwardly focused, CCM, by capturing the external customer information and linking it to internal business processes such as those drawn from the marketing, sales and other departments, can create a comprehensive Single Customer View (SCV).

There are seven equally viable points of entry to a comprehensive enterprise CCM solution. This solution should at all times be fully scalable and entirely compatible with line of business legacy systems.

Data access and integration
The first step to more effective customer communications is gaining an all-round view of the customer as an individual.  To this end the CCM data access and integration tools give instant, seamless access to customer information wherever in the business it is stored.
Companies can then consolidate and integrate this data across the systems in which it resides to finally obtain an SCV. These tools also give marketers the ability to generate business intelligence reports, marketing campaign analyses, customer segmentations and audits.  Most importantly, however, managers are empowered to make more timely and informed business decisions.
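
A minimal sketch of that consolidation step - assuming, purely for illustration, two line-of-business systems keyed by a shared customer identifier - might look like this:

# Consolidating customer records from separate systems into a Single
# Customer View (SCV). System contents and field names are illustrative.

billing_system = {"C1001": {"name": "A. Jones", "balance": 42.50}}
crm_system     = {"C1001": {"name": "Alice Jones", "segment": "gold",
                            "last_contact": "2008-05-14"}}

def single_customer_view(customer_id: str) -> dict:
    view = {"customer_id": customer_id}
    for source in (billing_system, crm_system):
        view.update(source.get(customer_id, {}))   # later sources refine earlier ones
    return view

print(single_customer_view("C1001"))

In reality the hard part is matching identifiers across systems that were never designed to share them; the merge itself is the easy bit.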

Data manipulation
Data manipulation tools perform address cleansing and mail coding tasks to avoid duplication and reduce print and mail costs, ensure prompt delivery and increase response rates.
More sophisticated data manipulation tools are able to target offers based on specific business geographies and create customer profiles defined by household demographics.  As a result, companies can accurately predict response rates for a range of offers and identify up-sell and cross-sell opportunities on the fly.
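
The core of address cleansing is normalisation: trivially different spellings of the same address should collapse to one mailing record. A toy sketch, with invented abbreviation rules:

# Hypothetical address-cleansing sketch: normalise, then deduplicate.

def normalise(address: str) -> str:
    canon = address.lower().replace(".", "").replace(",", "")
    for word, abbrev in (("street", "st"), ("road", "rd"), ("avenue", "ave")):
        canon = canon.replace(word, abbrev)
    return " ".join(canon.split())

mailing_list = [
    "12 High Street, Springfield",
    "12 High St. Springfield",      # same household, different spelling
    "7 Elm Road, Springfield",
]
deduplicated = {normalise(a): a for a in mailing_list}
print(list(deduplicated.values()))  # two records survive, not three

Production tools add postal reference files and fuzzy matching on top, but the principle - canonicalise before you compare - is the same.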

Document creation
Document creation tools provide a single, easy-to-use approach to creating one-to-one, multi-channel communications for both high-volume and interactive letter production. These tools are designed to efficiently create different documents: contracts, complex billing, insurance policies, bank account statements, and even packages containing multiple documents like travel booklets.

Document creation tools can also help speed the document development process: once created, a design - business rules, templates, text and other content - can be re-used across applications and multiple delivery channels, and distributed via the web, SMS, fax, e-mail and print.
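
A sketch of that reuse, assuming trivially simple channel renderers (real products drive this from business rules and template libraries):

# One document design, several delivery channels. The template, channels and
# length limit below are illustrative assumptions.

TEMPLATE = "Dear {name}, your {product} statement for {month} is ready."

def render(channel: str, data: dict) -> str:
    body = TEMPLATE.format(**data)
    if channel == "sms":
        return body[:160]                        # single-SMS length restriction
    if channel == "print":
        return f"<letterhead>\n{body}\n<signature block>"
    return body                                  # web and e-mail use the body as-is

data = {"name": "A. Jones", "product": "broadband", "month": "June"}
for channel in ("sms", "print", "web"):
    print(render(channel, data))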

Production / Distribution
Production/distribution tools streamline both high-volume and on-demand production of all forms of customer communication.  In addition, they allow users to proof and distribute documents over the web prior to giving final authorisation. 

Data vault
The data vault places all customer data into a single, secure yet accessible electronic environment.  The vault will need to be able to integrate both print and digital files at a much lower cost than the expensive PDF- and HTML-based solutions.  The modular architecture of a CCM data vault should be able to rapidly deploy call centre, customer self-service, and EBPP applications. 

Customer & call centre support
With the availability of a centralised CCM data vault, call centres can drastically reduce call handling and conflict resolution times by instantly retrieving exact replicas of all customer documents.  

Customer support can also be extended to provide 24/7 web self-service for the individual customer's account, enabling them to autonomously retrieve information and make online payments.

Replenishment
Replenishment tools create closed transaction loops by providing automated updates and connecting all communications back to the related business processes.  For instance, they can link to accounts receivable for round-trip processing, mine data from dynamic documents and continuously refine business intelligence. With replenishment tools it is possible to reduce remittance processing errors and costs and generate accurate time-sensitive financial reporting.  

Giovanni Pellegrini is Sales Director Southern Europe, Pitney Bowes Group 1 Software

The wide variety of technology formats that promise consumers access to premium content any time, any place, anywhere is putting conditional access systems under the spotlight.  Lynd Morley takes a look

The application of conditional access (CA) used to be fairly simple to grasp: the protection of content - most commonly sent to digital television systems via cable or satellite - requiring certain criteria to be met before granting access to said content.  OK, so install the Sky box near your TV; insert the smart card; and you're away. 


But the increasing digitisation of content and the plethora of new distribution methodologies and business models are rather complicating the picture.  Consumers can now access content via a wide range of devices, and at a time and place of their choice.  And while content distributors and service providers are clearly expanding the boundaries and enhancing the reach of their services to the consumer, such premium content is, inevitably, increasingly exposed to the risks of piracy and theft.  Indeed, the increasing prevalence of broadband networks and the ease with which digital media can be cloned and distributed, combine great opportunity and great risk for the owners of premium content, raising security to a high priority.


Doubtless that is one of the reasons that market analysts remain fairly bullish about the conditional access sector, which is set to generate revenues approaching $1.4 billion this year, according to ABI Research, who also note that telcos will be taking an increasingly large slice of the pie at the expense of cable and satellite industries.  Not that cable and satellite players are about to disappear, but industry analyst Zippy Aima notes: "The options now offered by new deployments of mobile and IP TV - including interactive and on-demand content, time-shifting and place-shifting - are generating a buzz that drives demand for their premium content to a wider audience."


Certainly conditional access players are addressing a fast-changing, challenging market, where all participants, whether content providers and distributors or security solutions vendors, continue to jockey for position and battle for market share, all with an eye to the technology changes and developments that will impact their success.  Indeed, the marketplace has recently seen not only hard competitive selling, but the unedifying spectacle of court action over allegations that one solutions vendor cracked a rival's smart card encryption - and passed the results on to pirates - in order to gain competitive advantage.  This may raise the question of who should actually be described as a ‘pirate'.
For the most part, however, CA solutions vendors, including such leading lights as Irdeto, Viaccess, Conax, Nagravision, and Verimatrix, take a more conventional competitive stance in arguing the advantages of their particular approach or products. And faced with a fast developing, and clearly highly competitive market, they are also finding different ways to present competitive advantage.  Irdeto, for example, is now offering what it describes as a full range of Content and Business Model Protection.  Having established its name in content security, with more than 400 CA and Digital Rights Management (DRM) customers worldwide, the company recently acquired business support systems specialist IBS Interprit, set-top box (STB) solutions provider Idway and software and data centre security firm Cloakware.  The idea is to enable operators and broadcasters to launch and grow their digital TV businesses profitably and securely, and while each acquired product will continue to be sold individually under the Irdeto brand, they will also be sold bundled as an end-to-end solution called Irdeto SmartStart for digital TV operators.


"We believe that this approach gives us a lot of strength in being able to offer clients the option of an end-to-end solution," explains Christopher Schouten, Director Global Product Marketing with Irdeto.  "We've seen a real demand for this type of solution in the marketplace, particularly in developing economies such as India, where the SmartStart concept is very attractive to companies whose - often home-grown - legacy systems are beginning to buckle under the demands of providing multi-play services."


Conax is also building a solid customer base in many of the emerging economies, having established a presence in India, China and Brazil back in 2003.  Celebrating the five-year anniversary of its presence in India on June 27th of this year, the company announced that it had deployed a total of five million smart cards to the Indian market during that time. But Conax's philosophy is to concentrate on its security products - its core competency - as Geir Bjorndal, COO and Sales & Marketing Director explains.  "We are very focussed on our security products, and I believe that having that focus means we can be very reactive to market requirements."


Certainly, the company has a number of security laurels to - if not rest on - at least point to.  It developed one of the first pay-TV smart cards in the world in 1990, and by 2006 Conax CA was in operation in over 180 installations in more than 60 countries around the world.  The latest figures show an expansion into over 70 countries globally. 
Bjorndal believes that broadcast solutions are still the ‘bread and butter' of CA offerings, noting: "These linear solutions are going to be needed for some time to come.  However," he adds, "developing technologies, and changes to infrastructure and distribution mean we have to stay awake!"   And to that end, he adds: "We are actively building relations both with new and existing customers wishing to upgrade to IPTV, and with major integration partners, to deliver total solutions."  Conax is also, according to Bjorndal, closely following the development within Mobile TV and is maintaining close contact with several partners offering solutions in the area.


Irdeto's Christopher Schouten agrees that changing markets require a sharp awareness of the need for different solutions. "Our focus on software-only solutions, for instance, is certainly increasing," he comments, "but it's really a matter of ‘horses for courses' - we need to ensure that the security that is provided for any environment is appropriate to that environment."


The software route has proved pretty successful for the comparatively ‘new kid on the block' Verimatrix, whose software-based content security offering has brought plaudits from the likes of the Multimedia Research Group (MRG), which ranks the company as a ‘global leader of IPTV content security' in its bi-annual IPTV Market Leader Report.  Stephen Christian, Verimatrix VP Marketing, notes: "The whole notion of security in pay-TV systems has been very limited in scope for a long while, and very firmly centred around the notion that it's the smart card that represents the secure capability.


"We're coming from the background of a different kind of distribution system - not cable and satellite, but IPTV.  And what we're seeing is that the general principles we've established for security in the IPTV world are actually going to be the norm for future distribution systems of all types - mobile video, satellite video, cable video and so forth.  Everything is heading towards IP technologies, and we need security regimes that are built on IP foundations."


For any CA system to succeed in the marketplace, the exceptionally powerful studios - providing the all-important content - need to be comfortable with the solution.  Verimatrix has been careful to ensure that their brand is known to the studios.  "We've never had a pay-TV operator refused content because of their software-based security regime," Christian explains. "We're pro-active with the studios - making sure the relevant decision makers inside studios and broadcast companies are well aware of what we can bring to the party, and how we're able to protect their interests.  So licensing deals go as smoothly for us as for the legacy players."

Keeping the studios informed - and aware of your brand - is at least as important as the technical merits of any particular solution according to Christian.
He goes on to point out that there's no tougher testing environment for ensuring robust security than the Internet, where every solution must run a gauntlet of professional and underground hackers on a continuous basis.  "That's why software-based IP security technologies have emerged as the gold standard for securing everything from web-based banking and financial transactions to high-value video in broadband and IPTV service applications," he comments.  "Clearly, standards-based, high integrity security can be applied to media just as much as to, say, banking transactions.  All the dominos are in place to make this happen - chip sets in set-top boxes are that much more powerful; TCP/IP protocols are widely available; broadband access is increasingly pervasive.  All the constituent components are in place - let's take advantage of it, and make this leap forward."

NBC/Digital Rapids
NBC has selected Digital Rapids to provide media encoding, transcoding and streaming systems for the network's Internet coverage of the 2008 Olympic Games from Beijing. Digital Rapids' DRC-Stream encoding and streaming solutions will enable NBC Olympics' live and on-demand online coverage.  Some 2200 hours of video will be streamed live on the Internet at NBCOlympics.com, primarily encoded from video feeds into web-friendly streams through the DRC-Stream systems. Streams will be encoded in the VC-1 compression format for a viewing experience powered by Microsoft Silverlight technology. The encoded live streams will also be archived for viewers to watch on-demand. Digital Rapids Transcode Manager will be used to convert affiliate-provided content between compression and file formats for US domestic distribution.

"We're thrilled to continue our relationship with NBC by supplying our solutions for coverage of this year's paramount event, the Beijing Olympics," says Brick Eksten, President of Digital Rapids. "The nearly unlimited scope of Internet-based video lends itself perfectly to coverage of an event of this scale, and our solutions are renowned for bringing video to the web with exceptional quality and reliability. We're pleased that NBC has again placed their trust in our technology and expertise for their ground-breaking online coverage."

Rab Mukraj, Director of Digital Media Delivery at NBC Universal, adds: "Delivering an unparalleled online experience is a vital component of our unprecedented multi-platform coverage of the Beijing Olympics.  The Digital Rapids encoding systems will enable an outstanding viewing experience for our online audience through superior encoded video quality and robust reliability, while providing us the workflow efficiencies needed for coverage of this magnitude."


Digital Rapids' DRC-Stream encoding solutions combine powerful hardware for video and audio capture and pre-processing with the intuitive Stream software interface, delivering reliable, high-quality, multi-format media encoding and streaming for professional applications such as high-end Internet TV and IPTV. The advanced, hardware-based video processing features enable superior quality and the most efficient use of bandwidth in the compressed result. Digital Rapids Transcode Manager provides automated, distributed transcoding with centralized management and exceptional load balancing intelligence for high-volume, multi-format workflows, increasing production volume while reducing operational costs. Details: www.digital-rapids.com
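
The 'load balancing intelligence' mentioned above follows a familiar pattern, whatever the vendor. The sketch below is a generic least-loaded dispatcher - not Digital Rapids' actual algorithm - with job names and costs invented for illustration:

# Generic least-loaded job dispatch for a transcode farm. Longest jobs are
# assigned first, each to the node with the least queued work.
import heapq

def dispatch(jobs, nodes):
    heap = [(0, node) for node in nodes]          # (queued minutes, node)
    heapq.heapify(heap)
    assignments = {node: [] for node in nodes}
    for job, cost in sorted(jobs, key=lambda j: -j[1]):
        load, node = heapq.heappop(heap)
        assignments[node].append(job)
        heapq.heappush(heap, (load + cost, node))
    return assignments

jobs = [("affiliate_feed_1.mxf", 40), ("highlights.mov", 10), ("ceremony.mxf", 90)]
print(dispatch(jobs, ["encoder-a", "encoder-b"]))
# {'encoder-a': ['ceremony.mxf'], 'encoder-b': ['affiliate_feed_1.mxf', 'highlights.mov']}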

MTN/Colibria
The MTN Group is the leading provider of communication services in Africa and the Middle East with over 61m subscribers.
Innovation is paramount to MTN's brand values.  With today's consumers searching for new, exciting and interactive ways to communicate, MTN was quick to recognise the opportunity for brand differentiation by launching an enhanced mobile messaging service.
In December 2007, MTN deployed a new mobile IM (MIM) service in South Africa, called ‘noknok', powered by Colibria.
The market:

  • MTN is the second largest operator in South Africa, with a 36 per cent market share and 14.8m subscribers
  • Mobile phone penetration is over 80 per cent, while Internet penetration is approximately 10 per cent, making this an ideal market for an enhanced mobile messaging service
  • The market is technically challenging as many of the mobile phones in circulation are not recent models
  • Third-party MIM services are already available, making this a well-educated, yet highly competitive, market
The service:
Compatible with a wide range of handsets at launch, noknok is a feature-rich MIM service that offers a truly community-based mobile experience.
Operator benefits:
  • A complementary revenue-generating service alongside voice and existing messaging technologies
  • An enabling technology that enhances the functionality and usability of existing applications and services
  • The technical infrastructure is modular, therefore new revenue-generating services and applications can easily be introduced
User benefits:
  • Simple to download and install, ensuring the user experience is intuitive and compelling
  • Users can impulsively share experiences with friends and groups at the click of a button
  • Fully interoperable, so users can chat with friends on either the MTN network or the Vodacom network in South Africa
  • Users can add anyone to their contact list. Those who don't have noknok on their mobile will receive messages as either an SMS or an MMS, supporting SMS continuity as advocated by the GSMA's PIM Initiative (see the sketch after this list)
  • Incorporates Presence, enabling users to see their contacts' availability, status picture, status text and mood details
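
A sketch of the fallback logic behind that SMS/MMS continuity bullet - all names are hypothetical, and a real MIM server naturally involves far more state:

# Delivery fallback: try the MIM client first, degrade to MMS for rich
# content, otherwise to SMS. Field names are invented for illustration.

def deliver(recipient: dict, message: dict) -> str:
    if recipient["has_noknok_client"] and recipient["online"]:
        return "delivered_via_mim"
    if message.get("media"):             # rich content degrades to MMS
        return "delivered_via_mms"
    return "delivered_via_sms"           # plain text degrades to SMS

friend = {"has_noknok_client": False, "online": False}
print(deliver(friend, {"text": "hello!", "media": None}))   # delivered_via_sms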

Noknok launched with a range of clients, including a PC client, a WAP client and MIM clients for Symbian and Java handsets.  In addition, a Java ‘Lite' client has been produced, specifically targeted at basic and low-cost handsets. 


Noknok is about much more than just delivering messages instantly - noknok is also about establishing identities and promoting personalities.  The next evolution of noknok will include content bots and non-P2P chat services to further grow and enhance the user experience. 
Details: www.colibria.com

Turkcell/NeuStar
Back in 2004 Turkcell realised that mobile messaging solutions were due to become a mass-market, globally ubiquitous service and recognised that the timing was right to launch its own Mobile Instant Messaging (IM) solution. Turkcell turned to NeuStar for the technology needed to launch a service that was uniquely Turkcell's, and not just an extension to other fixed service provider offerings.


Turkcell needed not only to make the most of this new market opportunity, but also to consider its existing customers and the potential threat the new offering might pose to its SMS revenues. The service had to clearly demonstrate customer benefits, with functionality like Presence being at the heart of the user experience.


In February 2005, TurkcellMessenger was launched to both prepaid and postpaid customers. The service could be used through a mobile application downloaded to the handset or via PC, web and WAP clients, enabling IM.


TurkcellMessenger allowed subscribers to communicate in context for the first time, with a Presence-enabled contacts list detailing their buddies' status (online/away/busy etc) and enabled them to control their own experience by activating their own status.
Two years into the service, TurkcellMessenger's chat rooms had become one of the most popular features of Turkcell's service with almost 50 per cent of the total IM traffic generated in chat rooms.


 "We launched Turkcell Messenger on a flat rate monthly charge with a pay-per- message alternative and unlimited data usage for both, and used viral marketing techniques to promote the service", says Leylim Erenel- Product Manager. "Our chat rooms have become one of the most popular features, so we will look to extend these social network type applications for our customers".


Turkcell enjoyed an increase in the ARPU of users who subscribed to TurkcellMessenger. More surprisingly, helping to debunk the myth that mobile IM cannibalises text usage, Turkcell has also enjoyed a rise in the SMS usage of IM users. A recent analysis carried out by Turkcell showed that the SMS traffic created by users who subscribed to TurkcellMessenger at the beginning of Q4 2006 actually increased by 5.8 per cent during the same quarter.

On the back of this success, Turkcell extended the service this year and launched Turkcell Windows Live Messenger, allowing subscribers to log on to Windows Live Messenger on their mobile through a download application.  In the first three months of deployment 1.1 million subscribers signed up to the service, following an effective marketing campaign. The service produced record-breaking statistics with subscribers logging in 15 million times and exchanging 800 million messages in just three months.
Details: www.neustar.biz

When Kireeti Kompella and David Noguer Bau ask the service provider community about the future of transport networks, there is general agreement that the future is in Ethernet. So what are the wider implications of this position?

Driven by the reduced cost per bit, Ethernet is becoming the standard interface in the telecommunications industry; we can find Ethernet ports on everything from DSLAMs to mobile base stations. At the same time, Ethernet VPNs are gaining popularity to provide connectivity between enterprise branches.
This change in the industry is driving the requirement for an efficient transport model. The limitations in extending Ethernet into the MAN and WAN are well known (scalability, resiliency, lack of OAM...), so its growing importance is pushing for optimized transport mechanisms:

  • T-MPLS: A ‘profile' of MPLS that meets transport requirements (only)
  • PBB-TE: aims to make Ethernet carrier-grade (or transport-grade)
  • MSP: Multiservice Provisioning Platform that adds functionalities to SDH nodes (trying to extend the life of SDH).
  • MPLS: A true Multiservice transport (IP + Ethernet + legacy)

However, before jumping to quick fixes for Ethernet limitations, let's look at a brief history of the transition from TDM-centric networks to packet-centric networks; hopefully, in doing so, we will gain better perspective on why things are the way they are, and what really needs to be changed.

A bit of history
Fine-grained Time Division Multiplexing (TDM) networks were designed primarily for voice services and adapted reasonably successfully for leased line circuits as data requirements became more important.
A decade ago, with the incipient demand for data services, the network was still able to accommodate it.
The transport requirements for TDM were clear, making SDH a magic layer providing the required features for data:

  • Frequency synchronization
  • Deep Channelization: down to DS0
  • Framing
  • Integrated OAM model
  • Redundancy with Fast Restoration (around 50ms)
  • Traffic Engineering for path and capacity management

So the combined SDH + DWDM model was emerging as a universal transport, common to all services and mainly voice- and circuit-centric. The transport department was in charge of providing the right requirements (bandwidth, resiliency, framing...) and all the services ran across the top. We'll define the separation between the two departments, Services and Transport, as ‘the Purple Line'.


This model is still implemented in most service provider organizations today; however, the idea here is to get a sense of the value of TDM networks, what the issues are, and how this should evolve around the growing dominance of Ethernet.

The next generation
The massive demand for best-effort Internet services, the migration of voice services to IP, the quick replacement of leased lines by Ethernet and IP VPNs, as well as the growing importance of IPTV, are challenging this model. The requirements for the transport layer are new, and Ethernet appears well positioned.


This transition towards Ethernet consequently forces re-allocation of the missing functions: OAM, Traffic Engineering, Synchronization and Fast Restoration should move into the new ‘magic layer'. Today, the industry is struggling to find the best technology to fulfill the magic layer requirements, placing at the heart of this debate technologies such as T-MPLS, PBB-TE and MSP, all designed to complete and optimize the transport of Ethernet.


The ‘Purple Line' made sense 20 years ago, when several independent services rode over the transport network. The Purple Line drew a demarcation between ‘infrastructure' and ‘services'. A particular service failure would typically affect just that service while an outage in the infrastructure would affect all services. Keeping infrastructure separate enabled a very stable network, over which each service could be managed on its own.


Today, with the NGN (Next Generation Network) model, there is essentially just a single ‘traditional service' over transport, namely IP/MPLS. Replacing SDH with an enhanced Ethernet technology is not going to change it.  All the real services will still be sitting at a higher layer. Since IP/MPLS carries all the services, it must have the same stability and resilience as the ‘infrastructure' below the Purple Line. The natural consequence of this is that IP/MPLS must be part of the transport infrastructure, i.e., the Purple Line must be redrawn ....

Placing the line
Where should the new Purple Line be placed? In other words, is ‘IP/MPLS' really a service? Having a transport-specialized MPLS and keeping IP as part of the services would separate IP and MPLS into different departments, therefore negating the tight synergy between IP and MPLS.


The right model is having IP/MPLS as part of the transport side of the Purple Line and all the real applications and control services sitting on top of it. This model shows a good partition between infrastructure and services maintaining the synergy between MPLS and IP. Also note that we can now finally fill in the "magic layer": a thin layer of Ethernet (for framing) and G.709 (for optical OAM/FEC).
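
Purely as an illustration of the re-drawn line, the data-only sketch below encodes the before-and-after stacks; layer names follow the article's model, and this is a description rather than an implementation:

# Before and after: IP/MPLS crosses the Purple Line from 'services' into
# 'transport', leaving only real applications above the line.

legacy_stack = {
    "services":  ["voice", "leased lines", "Internet (IP/MPLS)"],
    "transport": ["SDH", "DWDM"],                        # the old magic layer
}
ngn_stack = {
    "services":  ["voice over IP", "IPTV", "VPNs", "Internet applications"],
    "transport": ["IP/MPLS",
                  "thin Ethernet (framing)",             # the new magic layer
                  "G.709 (optical OAM/FEC)",
                  "DWDM"],
}

for name, stack in (("Legacy", legacy_stack), ("NGN", ngn_stack)):
    print(f"{name:6} services:  {stack['services']}")
    print(f"{name:6} transport: {stack['transport']}")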


This model is the only way for networks to take a giant step forward and become packet-centric rather than optimized for TDM circuits. Keeping IP/MPLS separated from transport introduces inefficiencies and duplications as two different departments have to deal with the same issues: resiliency, traffic engineering, capacity. This integration will also help equipment vendors to find new synergies between IP/MPLS and optical transport.  As we begin the process of moving the Purple Line, a long list of opportunities for improving the overall network will arise.

Consequences
Moving the Purple Line is not at all easy, as 20 years is a long time for habits and attitudes to take hold. This particular future has consequences for many groups: vendors, service providers, regulators, unions.  How quickly and effectively these groups respond to the challenge will determine how fast we can move to the new paradigm of packet-centric networks.

New platforms have to be built to meet the new requirements. New architectures and new management paradigms are needed to best use these new platforms. New regulations may be needed to say which platforms can be deployed, where and how. The labour force may need to be reorganized to address the new opportunities.

The Purple Line served a very useful purpose, but has become stagnant over time, and now finds itself out of place.  However, the idea of separating "services" and "infrastructure" is still valid and should be preserved. Redrawing the Purple Line must be the first priority in designing a packet-centric Next Generation Network in order to truly optimize it for cost and efficiency within the new communication paradigms (point-to-point, any-to-any, multicast ...) and this may be challenging for many.

In this new context, the way packet and optical switches are built, deployed and managed has to be rethought. The good news is the validation from both the packet and the transport worlds - IP control and data plane infrastructure is effective, robust and future-proof, service-enabling and scalable.

Leaders will define the future, followers will live in it.

Kireeti Kompella is Distinguished Engineer and Fellow at Juniper Networks, and David Noguer Bau is Head of Carrier Ethernet and Multiplay Marketing for EMEA at Juniper Networks

The 1990s brought us Business Process Re-engineering.  Now the talk is all about Business Transformation. Hugh Roberts contrasts and compares the approaches - and the results

‘BPR' (business process re-engineering) was the hot topic in the 90's when it came to change management. This time around, we've moved from BPR to BPM, but our new hot button is business transformation. (Still so new, it doesn't yet have a proper abbreviation!) Ostensibly, there isn't much difference between the two approaches, but the closer one looks, the clearer the differences in business prioritisation and market drivers become.
The one thing that has remained the same, however, is the high failure rate of transformation projects, often accompanied by the regular repopulation of executives at board level. After all, someone has to carry the can for all of those apparently ‘unfit-for-purpose' systems...

Towards the end of the last century as software capabilities improved, technology was first and foremost positioned as a means of enabling automation - seen as a key element in cost management programmes aimed at downsizing personnel and overcoming the restrictive work practices endemic in formerly monopolistic incumbent telcos. As a consequence of reduced staff numbers, existing hierarchies crumbled in an almost fetishistic rush to delayer organisations and establish everyone still employed in the organisation from top to bottom as a ‘Process Owner'. Similarly, whilst lip service was paid to the establishment of customer-centricity at all levels of the business, the real focus of culture change was to identify means of managing and motivating staff in a working environment that had regressed from one of high stability to one of high volatility, and where job satisfaction and staff churn were moving in opposite and unhelpful directions.

One might think that the new entrant operators would have been protected from the worst ravages of BPR, but of course - as demanded by their shareholders - skilled and experienced labour was required, and where better to get it from than the large and now freely available pool of ex-incumbent employees? To quote Brendan Logan, who heads up Patni's Telecommunications Consulting and Advisory practice: "It took a new entrant about three years to create the same levels of inefficiency and dysfunctionality in its operations environment that it used to take the PTTs eighty years to achieve."

In these new flatter organisations consisting of tens, hundreds, and in some cases thousands of Process Owners, the real problem was that very few people knew how the process that they owned actually fitted into the overall value generation mechanisms of the business, or how their processes related to those in other business units. However, they did at least know that they owned them. In the 21st century version of BPR - not least because of Sox 404, plus initiatives such as the eTOM - knowledge of process flows and inter-relationships has become significantly better. Unfortunately, the emerging convergence ecosystem has required us to maintain a much more fluid view of process ownership, so rather than declining, turf wars and inter-departmental politics are on the increase as we attempt to transform our organisations into lean, mean and agile enterprises.
Make no mistake; change is here to stay. Time-to-market constraints are typically no longer determined by technology development cycles (IMS and related notwithstanding!), so the strategic planning process must needs remain in a state of flux.

There are any number of obvious fiscal and housekeeping challenges in the finance domain raised by business transformation, amongst them the management of CapEx and OpEx, the sweating of legacy assets, the maintenance of good governance, corporate security and so on. One way or another, all of these are centred on risk management, which will undoubtedly supplant business transformation in the foreseeable future as the next ‘unifying' business focus of choice. In addressing these issues, the communications industry is probably no better or worse than most other industries. However, there are quite a number of ‘telco-specific' challenges posed by business transformation, most of which are a direct reflection of our uniquely intimate relationship with interconnected technologies and our relative lack of competitive and regulatory maturity on a global scale.
Here are four of the more insidious that we now need to face up to.

1. Recognising that ‘best practice' is, although useful to be aware of, an outmoded concept to use as a guiding principle. As the reality of globe-spanning operations and ownership bites, it is quite clear that local cultural, political, regulatory and socio-demographic factors will continue to maintain high levels of market diversity, and however mature we become as an industry this isn't going to change any time soon. Clearly, these factors must be respected. Although at the network layer and up into the bottom end of our OSS - anywhere, in fact, that functional activities could and should be totally transparent to the end user - standardisation is a given, in the BSS domain we will continue to waste an awful lot of money implementing applications and platforms that turn out not to be ‘fit for purpose' under local operating conditions. We have to acknowledge that ‘best fit' is going to be far more critical to our profitability and competitiveness, and that the determination of this may lead to conflicts with group directives and economies of scale.

2. Almost everything to do with ‘the customer'.
Telecoms must be the only industry on the planet that can't agree on a single and unifying definition of what a customer is. Quite apart from the competitive sensitivity of maintaining definitions that maximise our apparent market penetration levels, it remains common for many of the sub-systems within a single network operator's operating environment to maintain different customer data models. The challenge of developing and maintaining a single view of the customer is therefore quite daunting; never mind the challenges of doing so on an international basis or of extending the reach of telecoms into new market areas under the auspices of convergence. However much we believe we are being truly customer-centric... we're not. On a path well trodden by every other industry, CRM, CEM and BI are all steps in the right direction, but that's all they are: steps. We do, however, have some remarkable capabilities with the capture and management of high volumes of usage data. Once we learn to co-operate rather than compete with our ‘other customers' - the other players in the value chain - with regard to customer ownership, we may be able to fully leverage these skills to our advantage.

3. What to do about the information architecture.
Every business unit and function feeds off the central data backbone that we often (and somewhat erroneously) call ‘billing'. Somehow we need to find a way to take the politics out of the movement of data as we monetise the process of turning information into knowledge. Moreover, as the range and complexity of the services we offer increases, so do the number of relevant sources of knowledge about the customer's experience and perceptions of quality. We can no longer rely on the network to provide metrics that determine the value of the customer value proposition, nor can we rely on the traditional parameters we have used in the past to determine the value of the customers' attention and actions to our third party supply chain partners. We need to embrace the new methodologies entering the industry alongside the ‘X-factor' players - the network has the ability to generate knowledge but in the new generation it is certainly neither the owner nor the arbiter of ultimate (and bankable) truth.

4. Determining who your friends are.
Perhaps the greatest challenge of business transformation is that it doesn't lend itself to ‘projectisation'. Whilst elements of a transformation programme can be instigated and undertaken as projects, the reality is that transformation must be treated holistically if meaningful and lasting success is to be achieved. The impact must be felt on the systems, processes and skills deployed across the entire organisation. Unfortunately - and however much the pressures for rapid time-to-market wish to direct otherwise - the timescales required for the transformation of these three key elements of business operations are not synchronous. It is hopelessly unrealistic to expect new business processes and staff re-skilling to be in place at the point where platforms and applications have been upgraded, and vice versa. As a consequence, and even if the ‘plug and play' of COTS products were a credible reality, traditional RFP-based methodologies for the selection of products, vendors and integrators are almost certain to lead to failure. The selection of supply-side partners has now become the most critical of all transformation issues, and we have as yet no established framework in place for determining how to proceed. We need to learn how to ‘buy into' ongoing and flexible framework relationships with our suppliers for mutual - not exploitative - long term benefit.

In keeping with green operations, much of the slideware generated by BPR remains recyclable in the current climate of business transformation. But this doesn't mean that - even if it was meaningful the first time around - we should allow ourselves the luxury of feeling we've ‘been there before'. In the 90's we were competing for customer revenues by delivering a range of familiar services, albeit exploiting new technologies to deliver approximately the same services better, faster and cheaper, and we were competing between ourselves. At the moment, even the most basic of business questions remain open-ended: who we are competing with; what we are selling; who we are selling it to; and what it might be worth to them. The only answer that remains constant is the need for change, in response to change.

Hugh Roberts is Senior Strategist for Patni Telecoms Consulting and can be contacted via: hugh@hughroberts.com

Everyone agrees that backhaul is expensive, but is there an ideal one-size-fits-all solution, asks Lance Hiley

The headlines are clear for everyone to see: backhaul is one of the biggest issues and expenses facing mobile operators today. There isn't much consensus within the industry on what to do about it, but one thing that everyone does agree on is that the cost of backhaul represents 30 per cent of the capital and operational expenditure of the average operator each year. This could represent nearly $20 billion this year, and the figure has grown over the last few years with more data being consumed. Indeed, figures from Yankee Group indicate that transmission costs as part of operational expenditure (opex) can be as little as 10 to 20 per cent in 2G networks, but rise to 30 to 40 per cent in existing 3G networks. Global expenditure is predicted to reach $23 billion by 2013.

Carrying data is clearly expensive, and unless this cost is brought down, it will continue to increase with the problem being exacerbated as mobile networks are built and upgraded to support new mobile data services and standards such as HSPA, WiMAX and LTE. If operators are to roll out the next generation of data services and importantly, realise significant profits, both opex and capex need to be reduced - doing ‘more of the same' is no longer an option.
Whilst doing more of the same is no longer enough, to solve the backhaul issues facing operators around the world and equip them for the future, we need to recognise that the majority of operators will have legacies of leased line and point-to-point backhaul infrastructure already in place. As such, we cannot simply recommend discarding the past and beginning with a clean slate. Particularly in Western Europe, with the predominance of point-to-point microwave links connecting cellular base stations, recommending that each link is replaced is simply not a realistic option for operators.


Achieving higher backhaul capacity is not just a matter of adding bandwidth; it also involves increasing the efficiency of traffic handling. As the industry evolves to a full packet environment, microwave must be able to support Ethernet IP protocols in addition to legacy 2/3G interfaces, such as Time Division Multiplexing (TDM) / Asynchronous Transfer Mode (ATM) and Synchronous Optical Networking (SONET) / Synchronous Digital Hierarchy (SDH). If we look briefly at the options available to operators for backhaul - leased line, point-to-point, point-to-multipoint - we begin to see that only two technologies effectively extend the benefits of IP/Ethernet principles to the network edge - which is where it needs to be to get the necessary bandwidth at reduced opex.

 

Technology | Deployment time | Capex | Opex | Capacity | Reliability/operator control
Leased lines | Leased from third party | Low | High | Low (T1, multiple T1s) | Unpredictable; dependent upon suppliers
Fibre | Slow | High; increases with distance but diminishes over time | Dependent upon regulatory market in local environment | High | Medium; subject to civil works and environmental damage
Microwave | Fast | Medium to low; flexible, not related to distance | Low | Medium to high | High; users have total control

The table above is a useful comparison, prepared by Yankee Group, of the different backhaul technologies and approaches available. It is clear that there are several trade-offs to be understood when deciding on a backhaul strategy.


Leased lines and fibre tend to be seen as a panacea for the industry, but clearly there are disadvantages. Older leased-line technologies such as T1 and E1 cannot be dimensioned easily to cope with the unpredictable traffic demands of mobile data networks, and a network planner has to make quality-of-service decisions such as dimensioning a link for the peak or mean traffic coming from a particular cell site. The ratio of peak to mean traffic coming from a cell site can be as much as 10:1. Designing for the mean will result in customers being limited in performance (and experience) at busy periods. Designing for the peak will result in underutilised resources for most of the operating day - a waste of operating capital.
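
To make the trade-off concrete, here is a minimal sketch in Python comparing a link dimensioned for the mean against one dimensioned for the peak, using the 10:1 ratio above. All traffic figures are assumed for illustration, not measured data.

```python
# Illustrative only: traffic figures are hypothetical, not measured data.
PEAK_TO_MEAN = 10.0            # peak-to-mean ratio quoted for a bursty cell site
mean_load_mbps = 2.0           # assumed mean traffic from one cell site
peak_load_mbps = mean_load_mbps * PEAK_TO_MEAN

# Option 1: dimension the link for the mean load.
# Cheap, but at the busy hour only a tenth of the offered peak gets through.
link_for_mean = mean_load_mbps
served_at_peak = min(peak_load_mbps, link_for_mean) / peak_load_mbps
print(f"Dimension for mean: {link_for_mean:.0f} Mb/s link, "
      f"{served_at_peak:.0%} of busy-hour demand served")

# Option 2: dimension the link for the peak load.
# Customers are happy, but the link idles at 10% average utilisation.
link_for_peak = peak_load_mbps
avg_utilisation = mean_load_mbps / link_for_peak
print(f"Dimension for peak: {link_for_peak:.0f} Mb/s link, "
      f"{avg_utilisation:.0%} average utilisation")
```

Either way, a fixed-capacity link leaves the planner paying for performance or paying for idle capacity.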


Fibre avoids some of the issues of leased lines, in that the peak-versus-mean dimensioning problem does not arise (unless an operator is paying for the fibre on a per-Mb/s basis), but the cost is higher and in many cases the wait is longer. Fibre is well suited to very high traffic cells in a dense urban environment, where fibre access is likely to be good. Outside of this environment, other options, such as microwave, seem to be a better choice.
There are two other factors to be considered when choosing between leased lines, fibre and microwave: cost and reliability. As the name implies, leased lines are an ongoing cost (lease) for an asset that an operator may not own or control. This is an important consideration when assessing the strategic aspects of building a backhaul network. Leased lines are great - provided that an operator trusts the independence and business model of its supplier. Leasing capacity from a competitor is always a risk, regardless of the strength of the local regulator - if there is one!


Reliability of leased lines is not a topic we hear about often, but it needs to be considered. Availability of lines in Europe tends to be very good, especially in markets where, by and large, they are buried underground. However, in markets like North America, where many of the lines are still strung between poles, reliability can be an issue and data integrity may be compromised.


Globally, an increasing percentage of new backhaul investment is in microwave. The business case for microwave rests on ease of deployment, and greater range, performance and flexibility. With no dependence on renting or leasing wired lines, both overall expenditure and running costs are reduced; systems can also be installed quickly and are not prone to cable cuts, increasing overall reliability. Already, according to Yankee Group, microwave represents 50 per cent of all backhaul globally, and outside North America microwave penetration is more than 60 per cent.


There are two flavours of microwave: point-to-point and point-to-multipoint. Point-to-point (PTP) is best suited to longer-range links, rural areas and short, very high capacity links. PTP microwave links generally require a spectrum licence for each link and are designed to provide a fixed-capacity bandwidth link. PTP microwave operates over a range of frequencies, some of which are affected by atmospheric conditions. To deal with this, systems sometimes employ adaptive modulation to step down capacity for a short period while an atmospheric event - like a snow storm - passes through the region where the microwave links are operating.
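
As a rough illustration of how such step-down logic works, the following Python sketch picks a modulation scheme from the link's signal-to-noise ratio. The modulation profiles and SNR thresholds are assumptions made for illustration, not any vendor's actual figures.

```python
# Minimal sketch of adaptive modulation on a PTP microwave link.
# Profiles and SNR thresholds below are assumed for illustration only.
PROFILES = [
    # (minimum link SNR in dB, modulation scheme, fraction of nominal capacity)
    (28.0, "256QAM", 1.00),
    (21.0, "64QAM",  0.75),
    (14.0, "16QAM",  0.50),
    (7.0,  "QPSK",   0.25),
]

def select_profile(snr_db):
    """Pick the highest-order modulation the current link SNR supports."""
    for min_snr, modulation, capacity in PROFILES:
        if snr_db >= min_snr:
            return modulation, capacity
    return "link down", 0.0

# Clear sky versus a snow storm attenuating the signal.
for label, snr in (("clear sky", 31.0), ("snow storm", 16.0)):
    modulation, capacity = select_profile(snr)
    print(f"{label}: SNR {snr} dB -> {modulation}, "
          f"{capacity:.0%} of nominal capacity")
```

The key point is that the link stays up through the weather event, but at a reduced, not fixed, capacity.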


Because they are point-to-point links operating at a fixed frequency and capacity, PTP microwave links are very much like leased lines: an operator has to dimension the network for peak loads, and so may find itself spending capex on spectrum for links that run at only 10 per cent of their capacity most of the time. This is frequently called the ‘fat-pipe' approach to backhaul, and it is clearly a poor use of valuable spectrum resources.


Point-to-multipoint (PMP) microwave uses a different architecture to address the backhaul issue. Rather than individual point-to-point links, point-to-multipoint architectures bring a number of cell-site links back to a single aggregation point or hub. Immediately, this reduces the number of radios and antennae, making the network less expensive to build. Moreover, because the spectrum for the system is licensed across a number of radios, the resource is in effect shared, making utilisation of the spectrum more efficient. PMP microwave systems also lend themselves to a more IP-like approach to packet data management.
Beginning with an already impressive raw data rate of over 150 Mbps gross throughput in a sector, PMP solutions use data optimisation and statistical multiplexing, together with advanced on-air bandwidth control and interference management, to provide an ‘efficiency gain factor' of up to 4x.
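
The source of that gain is easiest to see numerically: cell sites rarely peak at the same moment, so the hub only has to carry the peak of the aggregate rather than the sum of the individual peaks. The Python sketch below simulates this with randomly phased bursty cells; all figures are assumed, and this simple model shows only the raw statistical multiplexing component of the gain, before data optimisation is added.

```python
# Sketch of the statistical multiplexing gain at a PMP hub.
# All traffic figures are assumptions for illustration.
import random

random.seed(1)
N_CELLS, SLOTS = 10, 1000
MEAN_MBPS, PEAK_MBPS = 2.0, 20.0      # 10:1 peak-to-mean per cell site
BURST_PROB = MEAN_MBPS / PEAK_MBPS    # fraction of slots a cell bursts in

# Each cell site either bursts or is quiet in a given time slot.
traffic = [[PEAK_MBPS if random.random() < BURST_PROB else 0.0
            for _ in range(SLOTS)]
           for _ in range(N_CELLS)]

# PTP-style provisioning: every link sized for its own peak.
sum_of_peaks = N_CELLS * PEAK_MBPS

# PMP-style provisioning: the shared sector carries the aggregate,
# so it only needs to be sized for the aggregate's peak.
peak_of_aggregate = max(sum(cell[t] for cell in traffic)
                        for t in range(SLOTS))

print(f"Sum of per-link peaks: {sum_of_peaks:.0f} Mb/s")
print(f"Peak of the aggregate: {peak_of_aggregate:.0f} Mb/s")
print(f"Multiplexing gain:     {sum_of_peaks / peak_of_aggregate:.1f}x")
```

Even this crude model shows the shared sector needing far less provisioned capacity than the same cells served by fixed fat pipes.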


The question is how to ensure a smooth migration to new backhaul networks that reduce costs and improve customer experience. Operators need to invest in a backhaul solution that takes into account the realities of their current network infrastructure as well as the vision of their future network. With its inherently traffic-neutral, flexible, innovative architecture, PMP microwave will in most cases fit the bill.


Not only is it easier to deploy than other backhaul technologies, it also offers increased capacity at a much lower ‘cost per bit', with cost estimates for a typical Western European operator running at savings of up to 44 per cent of capex and in excess of 58 per cent of opex compared with point-to-point links.


Clearly there is no ‘one size fits all' for every operator but, when weighed up against competing options, microwave - and in particular PMP microwave - offers a compelling argument based on the four main metrics that matter to operators: capacity, quality of service, capex and opex.

Lance Hiley is VP Market Strategy, Cambridge Broadband Networks, and can be contacted via tel: +44 1223 703000; e-mail: LHiley@cbnl.com

www.cambridgebroadband.com
