Features

As commercial mobile TV and IPTV services are rolled out around the world, it is becoming clear that telecom operators hold some winning cards: in particular, the ability to deliver truly interactive and personalised TV services without the need to invest in brand new networks. Per Nordlöf and Anders Kälvemark fill in the picture

As TV enters fixed and mobile telecom networks, user behaviour is evolving from a ‘lean back’ approach to a ‘lean forward’ one in which consumers want greater control over what to watch, how to watch, and when to watch it. Consumers are getting used to personalising, controlling and interacting with content, services and brands. At the same time, there is greater availability of high-capacity fixed and mobile broadband connections, and content is increasingly available in digital format – making it easier to store and distribute over fixed and mobile IP-based networks.

MOBILE TV - A unique revenue opportunity

The generations that are growing up with sites like YouTube, Google Video and other user-generated content sites will want to do more than just watch traditional 'linear' broadcast TV. In the USA, for example, 20 per cent of TV viewers now surf the web at the same time as they watch TV.
TV of the future will be about giving users access to their favourite programmes and content, wherever they are, whenever they want and whatever device they are using.
As TV moves from traditional one-way (broadcast) distribution towards digital two-way networks, it is undergoing a fundamental change. The addition of a return channel makes the TV experience more interactive and personal; less passive and more active.

Mobile TV over cellular sets the pace
The good news for telecom operators is that mobile TV and IPTV services can already benefit from the network's built-in return channel. This enables users to enjoy interactive, personalised TV services wherever they go – whether mobile TV on the move or IPTV in the home.
Out of the 120 mobile TV services that have launched worldwide to date, more than 100 are distributed over existing cellular networks. And there are already more than 10 million regular viewers of mobile TV in developed markets – around four per cent of mobile subscribers.
A recent study conducted by Ericsson ConsumerLab among more than 700 mobile TV users in six countries (France, Italy, Japan, Korea, the UK and the USA) shows that there is strong and growing interest in the services. These people claim to watch an average of around 100 minutes of TV on their mobiles every week. Around two-fifths of those surveyed say they watch mobile TV every day, with viewing situations spread fairly evenly between commuting, during breaks, waiting for someone while out, at home and anywhere when there is a big event on (such as a major sporting event). The most popular time for mobile TV viewing is between 6pm and 10pm – today's prime time for normal TV.
More than half of consumers in the study pay a monthly subscription for mobile TV, or have it as a part of their monthly subscription, and around one-third do not pay anything for mobile TV. On average, consumers pay 14 euros per month for mobile TV, although there were large variations between markets. Interestingly, the payment option does not seem to have any effect on usage – perhaps indicating that users are more willing to pay for worthwhile content.
The study found that mobile users value different types of mobile TV content in different situations. Commuters looking to fill some time want 'light' content that allows for frequent disturbances – to enable them to get on and off public transport, negotiate crowds of people, listen to station announcements, and so on. Such disturbances should not cause the user to miss too much plot or content.
People typically tend to relax at certain times of the day, and mobile TV content should be designed to fit into this pattern of behaviour. For example, the Ericsson ConsumerLab study found that people will watch weightier content early on in the day – on the way to work, for example – when they may want to check the day's business and technology news. This is also true of people who commute long distances and so have time to concentrate on more demanding content.
The enthusiasm of those taking part in the study is echoed in real-world trials of interactive mobile TV services conducted by Ericsson and Norwegian state broadcaster NRK since 2005. Two-fifths of those who downloaded the mobile TV client used it every day – around four times a day on average. One of the most interesting findings is that average session times for users who have access to interactive features are typically double those of viewers watching standard programming.
One new aspect of the NRK–Ericsson collaboration is the world's first trial of interactive, personalised advertising, started in December 2006. The advertisements are interactive, customised to ensure their relevance to individual users, and tailored to the user's age, gender, location and personal interests. Without any marketing, around 200 consumers downloaded an easy-to-use client that enables them to receive personalised mobile TV advertising in return for free TV shows, radio channels and other content on their mobiles (users still pay for the connection). As at January 2007, the trial has shown that clicking on ads increases average session time from around two-and-a-half minutes to over six minutes, and the average click-through rate is 16 per cent. The most popular driver of click-throughs is the offer of free ringtones.
The results of such studies and trials are encouraging, but the key question now is: what is the best way forward for operators to meet the demands of a mobile TV mass market?

More interactive, more personal
Mobile TV is much more than traditional TV delivered to a small screen. In addition to being able to view content wherever they go – with a combination of linear TV, on-demand TV and Podcast TV – users have access to easy-to-use programme guides, fast channel switching, interactivity and personalisation features.
Users have greater freedom in where and when to watch. They can use personalisation features to get a notification when their favourite team has scored and then get the video clip and goal summary pushed to their device. They can interact with friends and other communities, for example through commenting on or voting for their favourite contestant in a TV talent show. Mobile TV clients are available as downloadable client software, which offers consumers a convenient way to access the service, change programmes using the keypad or a menu, and get a programme overview through an electronic programme guide.
For operators, interactivity opens up new revenue stream possibilities – for example, premium SMS charges when viewers vote.
Personalisation is valued by users. With mobile TV delivered by cellular networks, consumers can personalise their own TV programme portfolio according to their preferences. Based on their video-on-demand consumption history, a recommendation engine can offer a tailored programme schedule. Content can also be provided based on location or TV advertising region.
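The recommendation idea above can be made concrete with a minimal sketch: rank unseen programmes by how often their genre appears in the subscriber's video-on-demand history. All function names and data here are invented for illustration; production engines are far more sophisticated.

```python
from collections import Counter

def recommend(history, catalogue, top_n=3):
    """history and catalogue are lists of (title, genre) pairs."""
    genre_affinity = Counter(genre for _, genre in history)
    seen = {title for title, _ in history}
    unseen = [(t, g) for t, g in catalogue if t not in seen]
    # Score each unseen programme by the viewer's affinity for its genre.
    ranked = sorted(unseen, key=lambda pair: genre_affinity[pair[1]], reverse=True)
    return [title for title, _ in ranked[:top_n]]

history = [("Goal Round-up", "sport"), ("Match Replay", "sport"), ("Evening News", "news")]
catalogue = [("Cup Highlights", "sport"), ("Cookery Bites", "lifestyle"), ("Morning News", "news")]
print(recommend(history, catalogue))  # ['Cup Highlights', 'Morning News', 'Cookery Bites']
```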
Ericsson ConsumerLab's study found consumers have clear views about where and when advertising is acceptable. Ideally, ads should be placed before or preferably after a programme – not included as a break in the middle. The length of the ads also needs to be balanced: users have a limited amount of time available when watching mobile TV, and they do not want this time eaten into by long ads. The length of the ad must be in proportion to the length of the programme. For example, after a weather forecast, viewers would be happy to see a short sponsor's message, while during a longer TV serial episode they would accept a slightly longer ad before, and maybe one after.
What is clear is that a well-functioning and robust service, with great choice of content, is a must for continued mobile TV usage. The study found that features that enable users to take control over their mobile TV viewing would be very popular. For example, the ability to pause and play, rewind and fast forward, and record content would be well appreciated. Consumers are also interested in making their mobile TV watching more flexible and would like to be able to use the mobile phone as a portable media controller.

Converged vision
Mobile TV services via cellular network open up new business models and revenue streams for the operator – for example, through targeted advertising and add-on sales. The vision is one of networked media delivered over three different screen types – TV, computer and mobile phone – enabled by IP Multimedia Subsystem (IMS). 
To enable such seamless, converged service offerings for a mass market, removal of usage restrictions is key. If we want to enable any TV to interconnect with any mobile device or any computer – to be able to view content in the most appropriate way at any given time (TV screen in the living-room; mobile phone when waiting for a bus) – we need to have industry-wide agreement on how these different devices will share content.
Mobile TV and IPTV solutions must meet very high requirements on scalability and performance. The telecoms industry is used to talking about five nines availability: TV services must meet at least the same requirements – just imagine the consequences of large events, perhaps sponsored by major international brands, being interrupted by technical difficulties.
The telecoms industry, with its well-established commitment to global standards and high quality of service, is well placed to address these requirements. Interactive, personalised TV should be part and parcel of fixed–mobile convergence and full-service broadband, underpinned by IMS and other open international standards.
Mobile cellular networks already have, by default, both down- and up-link communication abilities in the network, and so are ready to offer interactive, personalised mobile TV services. More to the point, existing mobile networks already have more than 2.5 billion customers and global coverage in place, and their capacity is being given a tremendous boost through High Speed Packet Access (HSPA) and, in the future, Long Term Evolution (LTE).
Today, mobile TV services are delivered over cellular networks using unicast streaming technology. Data packets are transmitted from a single source to a single destination, for example from a content server to a mobile device.
There is more than enough capacity in 3G networks to scale up for mass-market mobile TV services, especially if the operator has deployed HSPA. HSPA provides several capacity increase steps, enabling more users to be served with a greater diversity and higher quality of mobile TV content. LTE moves mobile capacity up to another level: Ericsson recently demonstrated speeds of 144Mbit/s in a live network using LTE.
Multimedia Broadcast Multicast Service (MBMS) will enable broadcasting over 3G networks by allowing traffic channels to be shared by all users that are simultaneously watching the same programme in the same area. MBMS complements HSPA to support higher loads in dense areas and ensure efficient network utilisation (as shown in Figure 1).
Figure 1. Mobile cellular networks can meet mobile TV capacity needs today and tomorrow

By using a combination of unicast and broadcast, network capacity and investments can be optimised. Broadcast bearers can be used for the most popular programmes, while an unlimited number of additional programmes and on-demand content can continue to be delivered efficiently using unicast. In the combined unicast–broadcast scenario, the user will not notice any difference in how content is delivered: there is a single user interface (TV client) in the terminal to access all content. This combination of unicast and broadcast provides the best way to meet both personalisation needs and mass-market demand.
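As a hedged sketch of that trade-off, an operator might move a channel onto a shared MBMS broadcast bearer once enough viewers in a cell are watching it simultaneously, and serve the long tail by unicast. The threshold and demand figures below are illustrative, not from any real deployment.

```python
BROADCAST_THRESHOLD = 3  # concurrent viewers per cell at which broadcast pays off

def assign_bearers(viewers_per_channel):
    """viewers_per_channel: dict of channel -> concurrent viewers in one cell."""
    plan = {}
    for channel, viewers in viewers_per_channel.items():
        plan[channel] = "broadcast (MBMS)" if viewers >= BROADCAST_THRESHOLD else "unicast"
    return plan

demand = {"News 24": 12, "Premier Football": 45, "Niche Drama": 1, "Weather": 2}
for channel, bearer in assign_bearers(demand).items():
    print(f"{channel}: {bearer}")
```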
One glimpse of how converged TV services will work is being provided by a joint project between Endemol, Ericsson Netherlands and Triple-IT. The companies are creating a service that enables subscribers to interact with TV shows – for example, by sending in live news reports or comments from their mobile phones – in real-time, even from overseas.
The opportunity for telecom operators to create unique interactive, personalised mobile TV services is there – all that's needed now is for the right technology choices to be made.

Per Nordlöf is Product Strategy Director at Ericsson, and Anders Kälvemark is Senior Consultant at Ericsson ConsumerLab

Two associations, TIA and USTelecom, have joined together to create a single, much-needed industry venue, say NXTcomm organisers

NXTcomm will bring together the unique strengths of the two trade associations - TIA's representation of supplier and technology companies, and USTelecom's representation of the world's leading communications and entertainment companies.

EVENT PREVIEW - NXTcomm

The event will showcase the leading telecommunications companies in the new converged ICET industry, which are merging communications and entertainment via innovations fuelled by broadband deployment.
"NXTcomm will be about doing business - about advancing an industry of central importance to the modern information economy, and showcasing what's next across the colliding worlds of information, communications and entertainment," says USTelecom President and CEO Walter B. McCormick, Jr.
The show will bring together top executives from every segment of the global industry to exhibit, explore new business opportunities and buy the latest technologies driving the converged communications and entertainment industry. Showcasing hundreds of new and innovative products, the NXTcomm exhibit floor will reflect the dramatic changes in the sector. The show's programming will feature keynotes from the top CEOs in the communications, entertainment and technology sectors. From enterprise users and service providers to technology suppliers and content providers - the forces that drive communication and the solutions to harness it converge here. With 20,000+ attendees and 450+ exhibitors, NXTcomm aims to be at the center of the expanding ICET universe.
Keynotes
The event's keynote speakers include AT&T Chairman and CEO Ed Whitacre; Bell Canada President and CEO Michael Sabia; Cisco Chairman and CEO John Chambers; GE Vice Chairman and Executive Officer and NBC Universal Chairman Bob Wright; Motorola Chairman and CEO Ed Zander; Verizon Communications Chairman and CEO Ivan Seidenberg; and Federal Communications Commission Chairman Kevin Martin.
More speakers will be announced in the coming months.
 “These CEOs are directly responsible for charting the course for the converging communications, entertainment and information industries,” says Wayne Crawford, Executive Director of NXTcomm.
The show will also feature a broad range of educational programming.

International Attendees
International attendees who plan to come as buyers qualify for special benefits, including:
•    The International Trade Center (ITC), exclusively for the use of international buyers
•    US Department of Commerce assistance in finding U.S. suppliers
•    Interpreters (in the International Trade Center only)
•    Transportation between official hotels and McCormick Place

NXTcomm 2007 - Conferences: June 18 – 21, 2007; Exhibits: June 19 - 21, 2007
McCormick Place, 2301 S. Lake Shore Drive, Chicago, IL, USA www.nxtcommshow.com

Network resilience promotes peace of mind for businesses of any shape or size, according to Piers Daniell

With businesses becoming increasingly reliant on telecom communication it is surprising how many companies rely on a single voice and Internet solution without a backup in place. Consolidation has been a buzz word in the IT industry for the past few years, but a consolidated communications network creates a big risk for business in the form of a huge single point of failure.

CONSOLIDATION - Another string to your bow

Historically, network service levels have been managed by availability, but today poor performance can be just as damaging as no service at all, or more so. To remain competitive, networked application performance must meet the needs of the business and its end-users.  It is now widely accepted that service provision and receipt should be governed by an agreement – the Service Level Agreement (SLA). This is essential to define the parameters of the service, for the benefit of both the provider and the recipient, and it is crucial in setting the expectations of all parties as to what service can be relied upon and what the results of failure will be.  Of course, once the expectations and the agreement have been set, it is essential that the terms are adhered to by all parties. This is not always easy to achieve.
 
With BT currently suffering widespread and well-documented resource issues, it is regularly failing to meet its SLAs.  This can mean that businesses are putting their future at risk by losing their Internet or voice services for sometimes extended periods of time.  It is critical that they evaluate the costs to the business of such service losses and seek to protect those services to a relevant and appropriate level.  Obviously, if the business relies totally on the availability of its telecoms infrastructure then greater weight will be given to this equipment and the steps taken to protect those services will be rigorous. 
However, few businesses have, in reality, looked in detail at the response they would get from their ISP should the worst happen. And they need to ask the crucial question: "what compensation will we actually receive?". No telecom service provider would compensate for the true loss of business; the majority would only look to reimburse the customer for a percentage of their monthly service rental. Even for the smallest of companies, this equates to a pitiful amount when compared to the business lost if e-mail, Internet or voice services go down.
The answer may well lie in a philosophy that has saved many a shrewd businessman from disaster – the idea that spreading the risk reduces the exposure. In this instance, this means putting additional measures in place to limit the impact of a single source problem, and companies should consider this to protect against service outages.  By purchasing, for example, a smaller Internet connection from a different provider, they protect against a technical problem with the main connection.  For larger businesses this is a common solution but for the smaller companies the cost implications or the requirement for additional hardware and network configuration demands can often prevent them from making this provision.  This is where the cost equation comes into play and should be carefully considered before further infrastructure investment is declined. 
Outages can be caused by so many factors that no matter how reliable your connectivity, single points of failure are just too risky and need to be eliminated. Increased resilience and backup will already be in place on the ISP's core network. However, that is only a small part of the overall solution when an Internet connection is provided. The copper in the ground for the service line is most likely owned and operated by BT, which runs to a local telephone exchange where voice and Internet traffic is routed to the major data centres across the country. This 'last mile' suffers from a multitude of potential problems that could take out the vital link for a business and its customers.  And these problems may be as simple as road workers cutting through cables as they dig up a road to lay new pipes.  Although this is an easy mistake to make and one that occurs on a fairly regular basis, it can cause immense problems and take a good deal of time to rectify.  The simplicity of the mistake is no consolation to the companies that are affected by such an outage. 
Then there are the more technology-driven problems.  For example, LLU SDSL providers are reporting an increasing problem with users being unceremoniously disconnected by BT engineers unfamiliar with the technology. SDSL operates over a data-only circuit with a true symmetrical dataflow. However, this means the copper cannot be used for voice, as it can with ADSL, and so there is no dial tone on the line. The dial tone is used by engineers to tell if a line has been properly connected and is in use. Without a dial tone, engineers might accidentally reuse the copper when cabling a new circuit in the local exchange.
Aside from careless engineers, major outages can cause longer periods of downtime no matter how much care is taken or how much engineering resource is available. From rats chewing through cables and letting water onto the copper, to lightning strikes or power surges, faults are difficult to troubleshoot and time-consuming to fix. Businesses should therefore ensure proper procedures are in place and alternative connectivity has been sourced to maintain communications.
Network resilience and connection redundancy are essential. At the very least, businesses should look to purchase two different kinds of Internet connection from their supplier so that, should one line or technology have an issue, the business can continue to operate using the other. By taking this secondary service from the same ISP, it is also possible to request that both the primary and secondary circuits use the same network settings, making it easier to switch connections as no further network configuration is required. There are also systems on the market from companies such as Cisco that can include two routers in one chassis, making the switch-over from primary to secondary lines pretty much seamless. Many customers who do consider backup stick with the industry standard of ISDN; however, with per-second billing the service can prove costly if used heavily. ISDN also offers only 64Kb/s of data transfer – a very small amount compared to the 20,000Kb/s offered by some ISPs' ADSL services. But with the advent of DSL technology in the UK there has been considerable investment in the technology, and businesses now have a number of options when considering backup.
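The dual-connection idea can be illustrated with a small sketch: probe a reliable host over the primary line and shift traffic to the secondary when checks fail repeatedly. The probe address and interface names are hypothetical placeholders, and real dual-WAN routers handle this natively; this is only a minimal model of the behaviour.

```python
import subprocess
import time

PROBE_HOST = "198.51.100.1"   # placeholder address of a host to ping
FAIL_LIMIT = 3                # consecutive failures before switching lines

def line_is_up(interface):
    """Return True if a single ping over the given interface succeeds (Linux ping)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-I", interface, PROBE_HOST],
        capture_output=True,
    )
    return result.returncode == 0

failures = 0
active, standby = "dsl0", "dsl1"  # hypothetical primary and backup interfaces
while True:
    failures = 0 if line_is_up(active) else failures + 1
    if failures >= FAIL_LIMIT:
        print(f"Primary {active} down; switching traffic to {standby}")
        active, standby = standby, active
        failures = 0
    time.sleep(10)  # re-check every ten seconds
```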
SDSL, which offers symmetrical data connections, has proved popular as a primary connection in the SME market, but also as a secondary backup to a larger company's leased line service. Moving down the scale, ADSL offers a good backup for businesses invested in SDSL and can be used in the aforementioned Cisco routers. Backing up an ADSL circuit is best achieved by choosing a second ADSL circuit, with two further provisions. First, ensure that the second ADSL line is activated over a totally different phone line within the building: normal phone cable carries three phone lines, so a single cable fault can affect a number of lines. Second, try to get the second line activated over a different ADSL technology, which will use a different part of the exchange. Over the past few years telecom carriers have been upgrading BT's exchanges with their own ADSL technology – a process known as Local Loop Unbundling (LLU) – providing alternative connectivity options.
Although the concept of bonding, or aggregation, has been with us for some years now, it is only recently that some service providers have developed state-of-the-art aggregation technology that makes it possible to provide customers with a connection bonded from multiple lines from different providers, even using different technologies. The potential of such a service is huge, as it increases the level of resilience many times over. It can be presented to the customer as a single hand-off, avoiding many of the downsides of a simple backup connection: there is no need to use different hardware or reconfigure the internal network, and no bandwidth is wasted, as the lines are completely aggregated into one virtual service at all times.
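A toy illustration of that aggregation idea, with invented figures: spread outbound traffic across several lines in proportion to their capacity, so they behave as one virtual connection.

```python
# Hypothetical line capacities in Kb/s; real schedulers work per-packet or per-flow.
lines = {"adsl_provider_a": 8000, "adsl_provider_b": 6000, "sdsl_provider_c": 2000}
total = sum(lines.values())
for name, capacity in lines.items():
    print(f"{name}: {capacity / total:.0%} of outbound traffic")
print(f"Aggregate virtual capacity: {total} Kb/s")
```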
Looking at voice protection, standard services such as call answering can prove invaluable when all office communications are severed, ensuring client enquiries continue to be dealt with. Other solutions, especially those based on new technologies such as VoIP, empower businesses to forward calls straight to other landline phones or mobiles in the event of losing connectivity.
Whichever way businesses choose to protect themselves, the message is clear – with the Internet and telecommunications becoming key facilitators in day-to-day business activities across a wide range of industries, forward planning and backup solutions are essential.

Piers Daniell is Director of Fluidata
www.fluidata.co.uk

If you think telecom-media-Internet convergence means a few years’ turbulence and the captain switching on the seatbelt sign to make sure nobody gets injured, think again. Keith Willetts notes that it's clear that all three sectors are going to fuse, but also thinks that telecom can play an important role in the rebirth

The imminence of so-called ‘convergence’ has been a hot topic in communications and networking for at least 15 years, probably longer. ‘Will telecom converge with broadcast media?’ we were always being asked.  ‘Will fixed and mobile telephony come together in a single service?’.  And so on. 

Well not any more. Convergence is no longer just something rumored to be around the corner. Like that other imminent happening, climate change, it's already here and is making itself felt.
With the telecom, media and Internet sectors now so clearly banging up against each other it's not surprising that this year's TeleManagement World in Nice (20-24 May) is putting convergence and its opportunities front and center.
With an overall theme of Managing Telecom-Media-Internet Convergence:  Leadership Strategies for the World of Communications, Information & Entertainment, we kick off on Day One with an executive summit 'The Big Bang: Telecom-Media-Internet Collision & Rebirth'.
That's what we think we're looking at now.  Not a gentle coming together of sectors where one group of businesses makes a few incursions into another group's traditional territory, but a full-scale fusion and a subsequent re-birth a little further down the track.
EBay's purchase of Skype is an obvious example of the process in action, but there are plenty more. We see traditional telcos trying to turn themselves into Communication and Entertainment Service Providers by adding IPTV to their service offerings and looking to buy and deliver the content themselves. We see cable operators and even satellite broadcasters also selling telephony; mobile operators selling broadband; and just about every other crossover permutation you can think of.
The most urgent question being asked by people working in all these sectors is, naturally enough, who is going to win out?  Will traditional telecom service providers be killed off by voice over the Internet on one side and mobile on the other?  Will the emerging online publishing industry, enabled by the rise of the Internet, knock out traditional paper-based publishing?
There may be relative winners and losers over time, but a more realistic scenario is that we are entering a technical convergence phase where different types of player will partner to move content over different networks, share revenue, ensure security and so on. 
We think that's where the TM Forum is going to make an invaluable contribution. Up to now our mission has been to provide a framework to enable different pieces in the telco back office jigsaw puzzle to fit together in an efficient, easy-to-integrate way. So we developed NGOSS (New Generation Operations Systems and Software), which could be thought of as a blueprint for the internal convergence of the fiefdoms and information silos that have developed within our service provider members.
Now we're facing the same sort of convergence problem one level up, between different sorts of companies.  Because we think we know what we're doing – after all, we've had the practice – we want the TM Forum to be a vehicle for bringing together the new converged telecom-media-Internet environment.  We're not just seeing a fusion of technologies but a fusion of different competencies, and one of the characteristics of the telecom sector, as exemplified in the work of the TM Forum, is a certain methodical way of stitching together complex, overlapping technologies – it's what we've been doing for about 100 years and we're good at it. 
So come to Nice in May and hear more about how we're going to be an important midwife at the rebirth of the new Telecom-Media-Internet sector. 

Keith Willetts is Chairman, TM Forum

Ever-increasing sophistication among cyber criminals is putting pressure on organisations that must also meet the requirements of compliance legislation.  Peter Woollacott looks at ways to fight back

Cybercrime has come a long way since 1988, when the Morris worm - considered by many to be the first piece of malware and certainly the first to gain widespread media attention - hit the Internet.  The worm was written by a student at Cornell University, Robert Tappan Morris, and launched on November 2, 1988.

CYBERCRIME - Safety first

The replicating and 'clogging' concept of the worm, which effectively brought the Internet to a grinding halt, has since been copied many times, although no-one could have foreseen the developments in the world of malware and cyber-attacks that would ensue in the years to come.
If we fast-forward just over 18 years to January 2007, we see the Nordea Bank in Sweden reporting a loss of $1.1 million to Russian organised criminals over a three-month period, with a key-logging Trojan at the heart of the scam.
According to BBC news reports, the bank lost its money in relatively small amounts over the three months, with debits spread across the accounts of around 250 business and consumer (retail) customers.  Reports note that the Russian criminals developed their own custom Trojan, which was sent to the bank's customers disguised as an anti-spam application.  Because the Trojan was custom-made and only sent to a small number of Internet users, it fell below the radar of conventional IT security software.
Once the bank's customers downloaded the application, they were infected by a modified version of the haxdoor.ki Trojan, which triggered key logging when users accessed their Nordea bank accounts online. These details were then relayed to a group of servers in Russia, where an automated routine started siphoning money from the customers' accounts.
The bank has borne the costs of reimbursement to all the affected customers and is seeking ways of preventing further attacks of this nature.
Unfortunately for customers worldwide, this type of low-value, multi-account fraud is extremely difficult to counter, unless the bank concerned has both heuristic and holistic IT security technologies to protect its IT resources.
The Nordea bank incident illustrates the fact that modern cybercrime has "come of age", driven at least in part by the arrival of organised criminals using sophisticated techniques to extract significant amounts of money from organisations both large and small without detection.
Most modern organisations have installed multi-vector security technology, including perimeter security systems, to protect their IT resources against almost every conceivable form of external attack, whether it is an e-mail-borne virus, hybrid malware, or a sustained brute force attack on their EFTPOS/financial systems.
But this is only part of the security equation. Today there is also the very real issue of internal attacks, originated by anyone from a disgruntled employee to a WiFi-wielding cracker who gains access to the company's internal network through a wireless backdoor, courtesy of a new Centrino-driven notebook sitting on the marketing director's desk.
Employing user privilege-based control systems on the IT network, as well as installing event monitoring/response technology that can block any unauthorised and/or unusual activity on the IT resource, can protect against loss through internal attacks of this type - as well as sophisticated hybrid attacks from the Russian criminals involved in the Nordea Bank scam.
Unfortunately for hard-pressed IT managers the world over, some of the best IT monitoring/control systems can be a relatively expensive option to install and operate, meaning that a compromise between security and cost might seem the order of the day. This could prove to be a false economy and impact good governance.
Modern legislation, like the Sarbanes-Oxley Act in the US, the Companies Act in the UK and other equivalent laws around the world, imposes a duty of care on senior officers within organisations to install an auditable IT security system that protects against all known and unknown security threats that might impact their organisation.
Perhaps worse, these new laws do not take account of the fact that hacker techniques - as clearly illustrated in the Nordea Bank attack - are becoming more sophisticated and specifically designed to evade existing detection methodologies.
Much of the forensic accounting and data auditing software seen in the last decade is, in fact, now significantly out of date against a backdrop of the increasing levels of authorised misuse, unwitting internal participation and fraud that are starting to appear in many major organisations. Authorised misuse is a grey area that many IT security managers overlook at their peril. If, for example, an office worker starts downloading the entire company customer base, it may be that a legitimate back-up is in progress, or it might be the beginnings of a major contravention of local data protection legislation. But which is it?
An effective monitoring system capable of alerting IT management staff to such an event and taking pre-defined lock-down action, as appropriate, goes a long way towards protecting against loss, keeping the auditors at bay, and, perhaps more importantly, keeping the management on the right side of the law.
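A hedged sketch of the kind of monitoring rule described above: flag a bulk download of customer records unless it matches an authorised, scheduled backup. The threshold, user names and event fields are invented for illustration; real behavioural systems weigh far more signals.

```python
AUTHORISED_BACKUP_USERS = {"backup_svc"}
BULK_THRESHOLD = 50_000  # records read in one session that counts as a bulk export

def assess(event):
    """event: dict with 'user', 'records_read' and 'during_backup_window' keys."""
    if event["records_read"] < BULK_THRESHOLD:
        return "allow"
    if event["user"] in AUTHORISED_BACKUP_USERS and event["during_backup_window"]:
        return "allow"           # looks like the legitimate back-up case
    return "alert-and-lockdown"  # pre-defined action pending investigation

print(assess({"user": "office_worker", "records_read": 120_000,
              "during_backup_window": False}))  # alert-and-lockdown
```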
This is because a failure to address such increasingly prevalent internal security matters is a breach of a growing body of compliance legislation, such as Sarbanes-Oxley in the US and the Companies Act in the UK.
All is not lost, however, as a new generation of monitoring systems, capable of using real-time heuristic and holistic analysis techniques - alongside more conventional auditing and IT security software - can help IT managers meet the demands of increasingly complex risk environments set against increasingly draconian compliance legislation.
Our observation here at Tier 3, where we have developed a behavioural intelligence approach to IT resource protection and control, is that an increasing number of major organisations around the world that do business with their US counterparts are now adhering to the provisions of the Sarbanes-Oxley Act.  This leads us to the conclusion that most US companies will soon include Sarbanes-Oxley or similar compliance requirements in their commercial trading terms and conditions with other parties.
Improved governance is good business practice and so even those non-US organisations not forced into a 'comply or die' situation with international legislation will, we believe, find it advantageous to move to this best practice approach on IT security.  For this reason, organisations should consider moving from a point-solution based IT security system to an integrated approach, with multi-faceted security technology installed, at all technology levels, across the organisation under the control of a fully automated and auditable database-driven ICT Threat Management system.
In addition to this, if an organisation takes steps to perform a continually updated research and risk analysis on its IT systems and resources, then it is well on its way to ensuring relevant regulatory compliance - as well as protecting against organised criminal gangs using customised Trojans to extract money from the organisation's bank accounts.

Peter Woollacott is CEO, Tier 3
www.tier-3.com

If operators are to build profitable content-based service businesses, they will need to address unacceptably high levels of avoidable revenue loss, says Geoff Ibbett

Year on year telecom operators lose about 12 per cent of their revenue to avoidable leakages. Clearly, operators have a great opportunity to show immediate improvement in their top and bottom line performance if they can successfully tackle this revenue leakage.

REVENUE MANAGEMENT - Stopping the leaks

And there is some good news too. Revenue maximisation programmes managed by CFOs deliver best results in containing revenue leakage. Undoubtedly, telecom operators can show dramatically improved results if they implement an enterprise wide revenue management programme effectively managed by, and reporting to, the CFO.
Unfortunately, over the years, operators have deployed BSS/OSS systems with an eye on the immediate needs of the business without necessarily analysing the impact on existing systems within the chain. This has often had the effect of fragmenting the operations chain into seemingly impregnable silos. An executive can access a lot of data but very little actionable information.
In addition, telecom operators are stepping into the exciting world of content-driven services. These new services will help telecom operators combat the problem of rapidly commoditising voice-based services, which suffer from high rates of subscriber churn and falling ARPUs.
These next-generation services will have an even more complex revenue distribution and settlement chain associated with them, involving partners and resellers. Telecom operators will find themselves cast as trusted partners for product delivery and related payment receipt. This new role will sharply bring into focus the impact of revenue leakages. In the conventional voice-based services environment, operators could treat revenue leakage as opportunity loss. In the content-driven service environment, however, operators will suffer real loss, because an operator is liable to pay the content provider even if it does not or cannot collect matching payment from subscribers.
The greatest challenge for a telecom operator is to establish a strategic framework to foster sustained profitable growth. This is easier said than done. The industry is fiercely competitive and demands rapid response from operators to ever-changing business and technology environments, yet offers little leeway to experiment, let alone make mistakes. This is where the next generation of revenue management platforms comes in.
Revenue management in its broader context though, is much more than just assuring revenue, reducing fraud and managing credit risk.  It should provide a mechanism for actively managing the performance of an operator's business. 
Of course it should monitor, control and ensure revenue integrity within all of the various revenue chains, but it should also provide the ability to manage the cost base associated with service delivery, allowing profitability and product margin to be managed rather than just revenues alone.  This is because not all revenue is good revenue; at least not if it costs an operator more to deliver the service than it receives from its customers.  Often this information is simply not available to the business manager.
But the holy grail of revenue management is to provide a single, consolidated, real-time view of the overall performance of the business that supports business managers in their day-to-day decisions, making it a role that directly impacts the performance of their business.
Next-generation revenue management moves beyond just managing leakages: it needs to address profitability and even track subscriber behaviour so that the assets of the business are put to optimal use.
One of the biggest hurdles to overcome in achieving this is in bringing information together, from the traditionally separate systems that exist today, and providing a visual representation of this information from a business perspective.
The concept of the Revenue Operations Centre (ROC) is to do just that, presenting the information in a manner that enables issues affecting business performance to be easily identified, investigated, diagnosed and corrected.
Modelled on the Network Operations Centre (NOC), it is intended to provide an equivalent view to the financial community of the operational effectiveness of a telecom operator's revenue network, as the NOC itself does for network operations.
A Revenue Operations Centre, though, is much more than just another dashboard; it should be underpinned by an integrated suite of revenue management solutions, providing multiple levels of drilldown to support the day to day activities of different levels of business management.
To support the goal of assessing and quantifying business performance and revenue integrity, the Revenue Operations Centre also needs to provide comparative analysis of revenue operations.  The full power of the ROC can be realised when business performance can be tracked at key stages of revenue realisation.
Six such stages have been identified for monitoring by a ROC:
•    Forecast Revenue, based on revenue targets usually derived from a company's business plan.
•    Predicted Revenue, based on revenue projections of the current subscriber base together with estimated ARPU and AMPU.
•    Expected Revenue, based on the provision of service within the network and service usage recorded within the network.
•    Invoiced Revenue, based on actual billed revenues.
•    Collected Revenue, based on the revenue actually received by the company.
•    Reported Revenue, based on how those collected revenues are reported in the accounts and summarised in the company's annual report.
In an ideal world all of these revenue stages would give the same value, but of course they never do.  For example, the difference between Expected Revenue and Invoiced Revenue can be accounted for by revenue assurance losses and internal fraud, and the difference between Invoiced Revenue and Collected Revenue can be accounted for by external fraud and bad debt.
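The gap analysis is simple arithmetic over the six stages. The sketch below uses invented figures (in millions) purely to illustrate how each successive gap points to a different class of leakage.

```python
# Hypothetical revenue figures per stage, in millions; order follows the six
# stages listed above. Each successive gap isolates a class of leakage.
stages = {
    "Forecast": 105.0,
    "Predicted": 102.0,
    "Expected": 100.0,
    "Invoiced": 96.5,   # Expected - Invoiced: revenue assurance losses, internal fraud
    "Collected": 93.0,  # Invoiced - Collected: external fraud, bad debt
    "Reported": 93.0,
}
names = list(stages)
for earlier, later in zip(names, names[1:]):
    print(f"{earlier} -> {later}: gap of {stages[earlier] - stages[later]:.1f}m")
```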
By comparing these key revenue perspectives, the operational effectiveness of a business can be determined and, by combining information from a telecom operator's various monitoring systems, gaps between the revenue stages can be quantified, so the business can understand whether any gaps remain unexplained. 
The process of investigating these gaps will reveal hitherto unknown issues such as revenue leakages, stranded and under-utilised assets, inflated operating costs and inefficient systems and processes, amongst other things.
It is the Revenue Operations Centre that will become a key business solution, enabling a business to manage its four levers of profitability effectively: price, cost of service delivery, product portfolio and customer targeting.  Those businesses that achieve this will be able to maximise profit growth within an increasingly competitive and complex industry.

Geoff Ibbett is Director, Product Management, Subex Azure

European Communications presents its regular round-up of the latest developments in the world of telecommunications

ITU goes West
ITU Secretary-General Hamadoun I. Touré recently conferred with some of the leading lights of Silicon Valley, aiming to cement ties with the private sector and promote the use of state-of-the-art ICT to bridge the digital divide.

Among the participants were executives from communications, hardware, Internet, software and venture capital firms, including Intel, Cisco Systems, Nokia Siemens Networks, Hewlett Packard, Google, IBM Venture Capital Group, Visa International, Microsoft, as well as Stanford University and the University of California, Berkeley.
Speaking at the opening of the "UN Meets Silicon Valley" event, Dr Touré focused on three main trends that appear to be influencing the ICT industry: innovation and cybersecurity; changing business models; and the development of new markets. "Innovation is a key source of new products, added value and fresh growth in revenues," Dr Touré told participants. "I want to challenge you to think beyond the borders of Silicon Valley, beyond even the borders of the United States, to the emerging markets in the rest of the world."
He said that closing the digital divide should not be seen as charity, but as a sound business model attractive to industry.
Describing the ITU as a unique intergovernmental organisation, which also has strong relations with business, Dr Touré added: "The Union has a noble mission: to provide access to the benefits of ICT to all the world's inhabitants.  To achieve that goal, we need to work in partnership with governments, the private sector and civil society, and to exploit the dynamism of regions like Silicon Valley."
A road map to connect the unconnected by 2015 was set out by the World Summit on the Information Society that was organised by the ITU in 2003 and 2005. With world leaders recognising the potential of ICT as an enabler for development, Dr Touré said the moment is ripe to harness the culture of innovation and competition in Silicon Valley to connect the world. The ITU has been charged with building the infrastructure required and ensuring security in cyberspace, as well as bringing together all stakeholders to meet the goals of the Summit.
Details: www.itu.int

Entertaining  potential
The mobile entertainment market is set for a new era of rapid growth as 3G environments become more commonplace, applications built for mobile predominate, and more users in the mass market exploit the mobile phone as a multifunctional communications and entertainment device, says Juniper Research.
The value of the mobile entertainment market, including music, games, TV, sports and infotainment, gambling and adult content is forecast to increase from $17.3 billion in 2006 to nearly $77 billion by 2011, driven by mobile TV, video rich applications and a buoyant Asian market. This is rapid growth, but for the potential to be realised, there are still a number of barriers to be overcome.
Principal author of the Juniper Research Mobile Entertainment Series, Bruce Gibson, comments: “The face of mobile entertainment is expected to change significantly over the next five years as next generation mobile services continue to be rolled out around the globe and take up steadily increases. As 3G services become commonplace, sophisticated mobile entertainment products and services can reach the mass market and provide the sort of anywhere/anytime entertainment that has been predicted for some time, but not really delivered.” However, he adds a note of caution: “Whilst the potential to generate dramatically increased revenues is certainly there, many uncertainties affecting sections of the market still exist and could put a brake on growth - the development of legislative environments for mobile gambling and adult content, and the success of broadcast mobile TV trials currently underway or planned, are just two examples.”
Dramatic changes in service delivery are forecast, but some aspects of market structure will not change. The Asia Pacific region currently provides the largest market for mobile entertainment services and contributes over 40 per cent of global revenues. Despite more rapid growth in North America and in developing markets, the Asia Pacific region is forecast to retain its leadership through to 2011, when it will still contribute 37 per cent of global revenues.
Details: www.juniperresearch.com

Internet freedom
The Internet industry must do more to fight governments' attempts to repress Internet users around the world, Amnesty International UK Campaigns Director Tim Hancock noted at the Internet Services Providers' Association (ISPA) annual awards ceremony.
"The Internet has revolutionised free speech and gives a voice to millions. But we must be on our guard against those who want to limit access to information and take that free speech away," he said.
"The Internet is the new front in the battle between those who want to speak out, and those who want to stop them.  Businesses whose operations impact on freedom of speech bear no less responsibility for upholding human rights standards than other industries."
He went on to stress that web users and service providers alike have a responsibility to keep alive the things that have made the Internet great - its democracy, its freedom and the way it gives people access to knowledge and the opportunity to participate and be heard.
Over 60,000 people have joined Amnesty International's irrepressible.info campaign, highlighting the repression of Internet users around the world, and the collusion of major Internet companies with governments such as China to restrict access to information over the Internet.
The human rights organisation recently announced that it was joining multi-stakeholder discussions with companies including Google, Microsoft and Yahoo!, together with other NGOs, experts and investors, to establish principles for  safeguarding human rights on the Internet.
Details: http://amnesty.org

EC gets it right
The socioeconomic profitability of the eCall system, proposed by the European Commission, has been independently verified by a new research report from the analyst firm Berg Insight.
The eCall system is intended to automatically initiate an emergency call to 112 from a vehicle and transmit satellite positioning data to the operator in the event of a road accident. By reducing the reaction time for the emergency services, the system is expected to save thousands of lives annually when fully implemented. Exactly how many lives would actually be saved is, however, the subject of a debate between proponents and sceptics who believe the cost exceeds the benefits. According to the findings of the Berg study, there will be a net socioeconomic benefit for the EU if road fatalities and severe injuries are reduced by 3 per cent or more.
 “The eCall project is based on the well known Golden Hour principle of accident medicine, saying that the chance of surviving a severe injury decreases from 26 per cent to 5 per cent in the first hour,” explains Tobias Ryberg, Senior Analyst, Berg Insight. “Literally, every minute counts when it comes to saving lives, not to mention preventing severe injuries which are a heavy burden on public finances.”
Berg Insight estimates eCall could save 1,400–2,800 lives and prevent 8,600–17,100 severe injuries annually in the EU when fully implemented. Long-term savings would be in the range of €5–10 billion, whereas the long-term cost is projected at €4 billion. Ryberg believes that segments of the automotive industry exaggerate the cost of integrating an eCall device in every new vehicle, as would be required for the system to work.
“Worldwide production of mobile phones now exceeds 1 billion units, and in five years a majority of those will have integrated GPS,” he says. “I am convinced that the cost of producing another 15 million units - without displays, digital cameras and music playback capabilities - will be marginal once the automotive purchasing departments have done their job.”
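A back-of-envelope check of the figures quoted above (all taken from the article) shows why Berg Insight sees a positive case: savings of €5–10 billion against a projected cost of around €4 billion leave a net benefit across the whole quoted range.

```python
# All figures are from the Berg Insight estimates quoted in the article,
# in billions of euros; this only restates their arithmetic.
cost = 4.0
savings_low, savings_high = 5.0, 10.0
print(f"Net benefit: {savings_low - cost:.0f} to {savings_high - cost:.0f} billion euros")
# The study's stated break-even condition: a 3 per cent or greater reduction
# in road fatalities and severe injuries.
```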
Details: www.berginsight.com

Future competitive differentiation lies in the quality of the customer relationship and the ability to meet individual expectations.  Mikko Hietanen explains the importance of providing a personalised customer experience to secure loyalty and increase lifetime values

We are living in a world of high churn rates, but should we sit back and simply accept it?   A main contributing factor to this phenomenon is that users are expressing growing dissatisfaction with the quality of service delivery and customer care they receive from their communication service providers.  They quite rightly expect high levels of service and support tailored to their own requirements, but are normally disappointed and unimpressed with the way it is provided.   

CUSTOMER LIFECYCLE MANAGEMENT - The personal touch

Operators are struggling to deliver a significant improvement in the customer experience.  The use of analytics, better segmentation and outbound campaigns has overcome some of these issues, but it's clearly not enough.  There is a definite disconnect between marketing's requirement to build lifetime relationships and the customer-facing systems designed to achieve this goal, which lack co-ordination and connectivity. 
Creating an improved customer experience requires less time and effort than communication service providers may think. The emphasis is to stay in tune with the customer and develop marketing plans to address them on a one-to-one basis by utilising and enhancing existing IT investments.  The pursuit of this essential business requirement is known as Customer Lifecycle Management (CLM), and is fast becoming the single most efficient method of retaining profitable customers.

Unlocking customer data
All service providers share one key asset – customer data.  Ensuring every piece of that data is attainable and delivering its full value is the foundation on which to build an improved customer experience.  Having the capability to build in-depth profiles made up of all historical and contextual data and continually adding to it as more interactions are initiated is the way to really get to know your customers on an individual basis.
However, collating and co-ordinating this data presents its own set of challenges.   Access is often hampered because the many different customer-facing systems are incompatible, leaving vital data locked away in separate systems such as e-mail, direct mail, SMS, IVR, web portals, CRM and campaign management tools.  To be effective, every single piece of data needs to be unlocked and integrated to work as a comprehensive whole. 
Opening up this data is like opening a treasure chest.  Enriched profiles can be constructed as you start to monitor exactly how each customer interacts with you, why and when.
For example, a customer may be in dispute with customer services over a recent bill.  It is important during this period that the customer is not contacted with other offers until the situation has been resolved.  However, with a lack of co-ordination between systems this is hard to prevent.  If the very same customer meets the criteria of a segmented group for a new service, a campaign management tool will automatically include them in the campaign, oblivious to the fact that the timing is not right.  A non-response from an unhappy customer will then automatically trigger a reminder for a service they may have no interest in, and before too long the customer feels frustrated, increasing the probability of churn. 
This scenario can be avoided if all inbound and outbound campaign data is collated and integrated from one system.
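A minimal sketch of that co-ordination rule: before any campaign tool contacts a customer, consult one consolidated view of recent interactions and suppress offers while a dispute is open. The field names and statuses below are hypothetical.

```python
def eligible_for_campaign(customer):
    """customer: dict holding a consolidated view of recent interactions."""
    if customer.get("open_dispute"):
        return False  # wrong moment: wait until customer services resolve it
    if customer.get("recent_complaints", 0) > 0:
        return False  # allow a cool-off period after a complaint
    return True

customer = {"id": 42, "open_dispute": True, "recent_complaints": 1}
print(eligible_for_campaign(customer))  # False: exclude from this campaign run
```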

Personal attention
Adding the personal touch makes every customer feel special and delivers a fantastic brand experience.  Initiating truly personalised dialogues and responding in context enriches each and every interaction.  With the level of customer data available there is no need to simply push offers to segmented groups via campaign management tools. 
The customer can initiate the start point of any dialogue by approaching their provider with a specific need.  This need can then be addressed by positioning offers or other marketing-driven content in the context of the interaction.  In some cases the customer may trigger an additional sale opportunity or possibly an educational tip regarding a new service.  The real difference here is the communication is personalised to the user's own relationship and the exposure of the message is driven by the customer behaviour. 
To personalise transactions you need to understand what is needed from each system, and what each system needs to know and when, for it to play its part in fulfilling customer requests.  By evaluating the responses you can benefit from knowing where a customer is in their lifecycle.  Value-risk assessments can be made in real time, determining the potential risk of churn and deciding on the right incentive for that particular customer to take action.
It is key that marketing has the ability to design and control the rules that steer dialogues in the required business direction, so that it can plan, create, monitor and manage the dialogues and associated initiatives with little reliance on the IT department.
To achieve the best results, the personalised approach has to be consistent across all available communication points. Operators offer a wide choice of communication methods to give their customers the utmost convenience, but it is a far from seamless experience. It is all well and good to offer a personalised service within the confines of a single communication medium, but if a customer chooses to use more than one method, there is normally a disconnect in the service received. A customer is oblivious to the technical challenges, and quite rightly expects the same dialogue to continue whichever medium is chosen.
For example, when a customer receives an SMS with an incentive and a call to action, this needs to be automatically reflected on the web page. The content needs to reflect the offer exactly, without the customer having to search different pages to locate it. If, at the same time, the customer decides to contact the call centre, the customer service representative should be provided with information on the specific offer introduced and how the customer responded. This information will allow the CSR to confidently reinforce the offer based on factual information and increase the probability of up- and cross-sales.
This can only be achieved if all customer interactions are integrated and co-ordinated across all the channels.  The result - continuous and relevant dialogues.
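
As an illustration of the idea, the sketch below keeps every channel's events in one shared interaction record, so the SMS gateway, the web portal and the CSR desktop all see the same live offer. The store and its field names are invented for the example, not a reference to any particular product.

    interaction_store = {}  # customer_id -> list of channel events

    def record_event(customer_id, channel, event, offer_id):
        interaction_store.setdefault(customer_id, []).append(
            {"channel": channel, "event": event, "offer": offer_id}
        )

    def current_offer(customer_id):
        """Any channel can ask what the live offer is and how the customer
        has responded so far, keeping the dialogue continuous."""
        events = interaction_store.get(customer_id, [])
        return events[-1] if events else None

    record_event("C-5501", "sms", "offer_sent", "OFFER-77")
    record_event("C-5501", "web", "offer_viewed", "OFFER-77")

    # When the customer phones in, the CSR desktop shows the same offer and
    # the fact that it has already been viewed on the web.
    print(current_offer("C-5501"))
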
Personalised marketing campaigns and initiatives will often consist of hundreds of different incentives aligned to the business strategy. With multiple offers and incentives going out to customers simultaneously, successfully fulfilling those offers is important to the overall customer experience: upgrades to new handsets, or the redemption of cinema tickets or discount vouchers, for example. Delivering these items, organising demonstrations of how to use them, sending user guides, and even attaching the correct set of additional services, such as insurance, can prove a logistical nightmare.
A problem at any point in the fulfilment process triggers an immediate negative perception in the customer. The end result is customer apathy, manifesting as a continuing strong resistance to offers and to the take-up of new services.

Customer Lifecycle Management
CLM is a new and unique approach that focuses on all the crucial steps required to develop strong customer relationships.  From one central system it manages and co-ordinates every piece of customer data, across all the communication touch points, personalising the content of campaigns and fulfilling all associated initiatives.   
With CLM there is no need to change or stop using the existing stack of IT systems. It works alongside CRM, IVR, web and mobile portals, campaign management systems and data warehouses, orchestrating these systems in real time according to pre-defined business rules and accessing the data already stored in them. Because it utilises existing systems, CLM is a fast and low-risk implementation that needs minimal resources to deliver immediate business benefits. It comes with proven, predefined business processes, all the necessary applications, management and integration tools, and a complete set of communication gateways.
Customer Lifecycle Management is the answer to enhancing the customer experience and achieving the ultimate segment of one. Nurturing each and every relationship makes customers feel special, resulting in high levels of trust and increased loyalty. Isn't it time for you to embrace the power of personalisation to capture your customers' attention?

Mikko Hietanen is CEO of Agillic, www.agillic.com

Product success or disaster ultimately comes down to how compelling the experience is for the end user. Some of the most significant successes in the ICT industry, like mobile services and more recently iTunes and the iPod, combine innovative new consumer electronics with the value of being connected to a network. Is IPTV the next service to fuel growth for service providers? Is the offer compelling enough to drive migration from satellite and cable? Per Lembre takes a look

The drive to unify video entertainment, voice, broadband and mobile has already had significant market implications. Recent M&A activity, such as Tele2 and Versatel, BSkyB and Easynet, and Telenor and Bredbandsbolaget, was motivated by gaining access to broader customer bases and leveraging a wider set of services to attract and retain those customers. In parallel, new technologies are emerging to increase capacity and provide greater functionality in support of a converged service offer. The rationale is to share resources between services, simplify operations and improve the end-user experience.

Whilst there are many advantages in delivering multiplay services, service providers still need to look carefully at the video offer itself and consider how differentiating, and thus how successful, it may be. Video over broadband is finally growing rapidly in Western Europe. Point Topic reported almost 3 million paying IPTV subscribers worldwide as of June 2006, with half of those users resident in Europe. This is in line with some projections, but lower than many forecast just two years ago.
The European market for IPTV is fragmented to the extent that it may even be misleading to call it one market. Rather, every country is a market in its own right, with its own specific attributes. There are several factors that service providers will need to look into when deciding their IPTV strategy. What is the broadband penetration, and what does bandwidth competition look like? Are pay-TV services already popular? What platforms do people use to receive their TV signals? The introduction of digital terrestrial TV in countries like Germany, Norway and Sweden forces long-time terrestrial users to change from analogue to a digital solution. This technology shift constitutes a window of opportunity for IPTV broadband providers; however, the window is closing rapidly as people invest in digital set-top boxes to continue using their legacy antenna solution.

Content not enough
Some of the early IPTV pioneers, like Fastweb in Italy, have successfully secured exclusive content, in particular rights to the national football leagues. By carefully selecting high value content, service providers may drive initial penetration levels for IPTV in a similar fashion to how cable and satellite providers attracted subscribers to their pay-TV content some 20 years ago. The challenge here of course is that content rights are already distributed in all developed countries, so what content can possibly be out there that is attractive enough to drive mass adoption of IPTV?
Maybe that is the wrong question to ask. Over time, most premium content will tend to be available on all distribution platforms, simply because content owners will make more money that way. Instead, let's look into the unique capabilities of IPTV. What can the platform provide that traditional broadcasting can't?
First, IP networks are far better suited to delivering unicast traffic, sending data from one source to an individual consumer. This is perfect for the distribution of video on demand (VOD), and allows for a more personal user experience. Adams Media Research recently forecast consumer spending on video downloads at $4bn by 2011.
Second, IP allows for greater measurability when compared to broadcast technologies. IP networks can provide information about what users want, when they want it, and what additional services they may be interested in. This has great implications for the multi-billion dollar advertising market. Targeted advertisements are worth two to ten times as much as broadcast advertisements, and when the big brands start to push new, innovative advertising on IPTV platforms to get interactivity and better measurement, the advertising market will embark on a new journey.
Third, and probably most important to the consumer, IP networks allow users to take part themselves. Few consumers use the service provider's home page as their starting page on the Internet, so why would they go to a single service provider portal for all video content? The concept of active users, exploring and even producing and sharing content with others, actually plays to the traditional strength of service providers: it is about personal communication. Let's embrace it.

Consumers or producers?
User Generated Content (UGC) was one of the hottest trends in 2006 and gained a lot of business interest when Google acquired YouTube for $1.65 billion in October. Building strong communities and allowing users to produce, share, view and contribute to content creation has already made an impact on the media industry.
UGC is another example of how different innovations together form a critical mass to allow a new service to succeed. UGC wouldn't have been possible without video consumer electronics that you can carry around in your pocket. Nor would it have taken off without inexpensive PC-based publishing software. Or broadband and community portals like Break, YouTube and national news portals allowing upload of video content from citizen journalists.
When Internet users in the UK, Germany and France were asked whether they had shared any video content over the Internet, on average 8.7 per cent claimed they had, with French users scoring highest at 11.7 per cent. This corresponds to almost 3 million broadband users sharing videos over the Internet. Given the early phase of UGC, this is a very significant number. Subscribers to IPTV services may not only want to look for the hottest releases from Hollywood; they may want to take part in some of the production itself.
When consumption of video content on the Internet was studied, UGC came out as the most attractive type of content, cited by 47.1 per cent of viewers (1).

Telco TV providers have a unique opportunity to blend UGC with broadcast content. Service providers can potentially play a significant role in adding capabilities such as encoding quality levels for UGC suitable for large screens, infrastructure for micro payments, and the concept of 'family channels', allowing users to broadcast themselves. As the IPTV market unfolds, these capabilities help differentiate IPTV against legacy TV distribution platforms.

Understanding consumer preferences
The European IPTV market is still in its infancy and it is hard to foresee how it will evolve over the next few years. Broadcasters have started to make a limited set of content available for online streaming. Peer-to-peer technologies are evolving from file sharing and voice applications into distribution platforms for television and on-demand streaming media. To add to the complexity, illegal distribution of TV channels over the Internet puts ever greater pressure on the safeguarding of content rights.
The secret lies in understanding consumer preferences. Over time, consumers tend to get what they want. The early video-over-broadband market indicates that consumers are moving on from being passive viewers of broadcast TV content. They participate themselves: they vote, they produce and share, they put an alternative ending to the latest story online, and they brutally rank what they see. IPTV may put an end to zapping, it may bring a far more personal entertainment experience, and it may swing the advertising market around. To succeed, IPTV providers need to break away from me-too services and leverage the inherent personal nature of IP networking.

(1) UGC and news preferred over sports when users are asked what video content they currently download and watch on the Internet.
Source: Juniper consumer survey, Nov 2006

Per Lembre is Head of Multiplay Marketing, Juniper Networks EMEA, per@juniper.net

The telecoms industry appears, finally, to be giving identity management the attention it deserves.  Lynd Morley looks at the most recent initiative 

Identity management is fast establishing itself as one of the telecoms industry’s current major buzzes, yet not so long ago, an article on the subject would have been considered distinctly left-of-field in a telecoms publication.  True, groups like the Liberty Alliance, formed back in 2001 by some 30 organisations to establish open standards, guidelines and best practices for federated identity management, have been attempting to engage the industry for some time now.  But while the industry could not really be accused of turning a deaf ear, it did seem to be distinctly hard of hearing.

But then, the gradual recognition that, in an information economy, trust is the necessary foundation for secure interoperability, and central to the successful realisation of what might be possible on the web, brought identity management front and centre for telecoms players.
Most recently, the ITU has announced its own Focus Group on Identity Management (IdM), noting that the use of multiple usernames and passwords represents a boon for hacking, identity theft and other forms of cybercrime, and is causing substantial financial loss amounting to billions of US dollars.  The ITU initiative, according to the organisation, is poised to offer a technology- and platform-independent solution.
The world's key players in IdM have taken the first steps towards a globally harmonised approach to IdM, says the ITU. Developers, software vendors, standards forums, manufacturers, telcos, solutions providers and academia from around the world have come together in the Focus Group to share their knowledge and co-ordinate their IdM efforts. The aim is to bring interoperability among solutions, by providing an open mechanism that will allow different IdM solutions to communicate even as each one continues to evolve.
Such a "trust-metric" system has not existed until now. Experts concur that interoperability between existing IdM solutions will provide significant benefits, including increased trust by users of online services, improved cybersecurity, reduction of spam, and seamless "nomadic" roaming between services worldwide.
Abbie Barbir, chairman of the ITU Focus Group, and Nortel standards adviser, explains: "Our main focus is on how to achieve the common goals of the telecommunication and IdM communities. Nobody can go it alone in this space; an IdM system must have global acceptance. There is now a common understanding that we can achieve this goal."
IdM promises to reduce the need for multiple user names and passwords for each service used, while maintaining privacy of personal information. A global IdM solution should also help diminish identity theft and fraud. Further, the ITU stresses, IdM is one of the key enablers for a simplified and secure interaction between customers and services such as e-commerce.
From now to July, the Focus Group will conduct an analysis of what IdM is used for, as well as analyse the gap between existing IdM frameworks now being developed by industry forums and consortiums. These gaps will need to be addressed before interworking and interoperability between the various solutions can be achieved. A framework based on this work is expected to be conveyed to relevant standards bodies including ITU standards-setting groups. The document will include details on the requirements for the additional functionality needed within next-generation networks (NGN).
Identity management, it seems, is at last finding its place in centre field.

The telecoms sector is one of the most highly competitive industries, with one third of its customers churning every year. So how can service providers differentiate themselves from the competition and minimise the impact of churn on the bottom line?  Ofer Yourvexhal explains how major organisations are meeting challenges head on through an integrated desktop approach that ensures consistent high-quality customer service

Customer churn is one of the greatest challenges for any organisation, and nowhere more so than in the telecoms industry. The mobile phone market reached saturation point some time ago, and the plethora of pricing options, service plans and new phone models available causes customers to swap suppliers on a regular basis.

The old adage that 'service sells' has never been more applicable and good customer service can help telecoms providers hold on to their customers and provide a value added service. Take, for example, the news that UK cable giant NTL is acquiring Virgin Mobile. The Virgin name was favoured for the new company as it offered instant consumer brand appeal, but more importantly because of its renowned excellent customer service. So how can other service providers meet the growing challenge of Virgin's customer service offering? Answer? Through enhancing their own customer service.
High quality customer service
Traditionally, contact centres have been referred to as 'cost centres' and a necessary business expense. Used correctly, however, they can have a significant impact on customer retention, adding real market value and proving to be a successful business tool.
The key to keeping customers happy and ensuring that they stay with your company, is to provide them with the information they need, when they need it. This is often easier said than done. Agents frequently have to access many different databases to find all of the relevant information in order to satisfy a customer enquiry. The answer lies in an integrated desktop, which enables agents to access all the information they need through a single screen, including product details and availability as well as customer histories. This means that they do not need to separately navigate the many silos of data that may already exist within your business – freeing up agents' time to provide truly personalised service.
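
A simple way to picture the integrated desktop is as one call that fans out to each silo and merges the results into a single view. The Python sketch below illustrates the principle; the three back-end lookups are stubs standing in for real billing, CRM and order management systems.

    def fetch_billing(customer_id):      # stub for the billing system
        return {"balance": 23.50, "last_invoice": "2007-01-31"}

    def fetch_crm(customer_id):          # stub for the CRM database
        return {"name": "A. Example", "plan": "Talk 200"}

    def fetch_orders(customer_id):       # stub for order management
        return {"open_orders": 0}

    def unified_view(customer_id):
        """Aggregate every silo into the single view the agent sees,
        so no separate navigation between systems is needed."""
        view = {"customer_id": customer_id}
        for fetch in (fetch_crm, fetch_billing, fetch_orders):
            view.update(fetch(customer_id))
        return view

    print(unified_view("C-5501"))
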
In addition to customer churn, agent retention is also a great obstacle for the telecoms industry to overcome. Research from global market research firm Gartner has shown that anything from wages to working environment can be responsible for staff turnover. With staff churn rates reaching a high of 60 per cent, and with the contact centre industry set to expand to over 37,000 by the end of the year, this churn figure will undoubtedly rise.
Agents are the most significant cost within a contact centre, accounting for up to 80 per cent of the overall budget. They are also its most valuable asset, and possibly the only contact that customers have with your business, so it's essential they are effective and reflect an accurate image of the organisation. Agents have a significant value and can generate additional revenue for your contact centre, but to do this, they need support.
The best way to help agents become more productive is to simplify the processes they carry out on every call, every day. Again thanks to an integrated desktop that provides a single view of the customer, agents are able to have all the relevant information at the right time. This enables them to optimise the critical part of the call when they actually engage with the customer.
Streamlining the first third of a call – which is needed for identification and verification, and so is not productive – also helps the agent to be effective. This should include ensuring that any information gathered in an IVR is translated to the desktop and put in front of the agent. When the agent actually takes over the call, for the final two thirds of it, they will be supported with that information so they can deliver a quality experience for the customer.
How should agents be measured?
In order to get the most out of their agents, businesses need to assess exactly what it is they are looking for from them. Every contact centre is different and will value certain metrics over others, so you must first define what areas of productivity are key for your agents. If cross- and up-sales are KPIs, then measure those; but if agents have no involvement with sales, it is senseless to measure them on sales. Instead, define what is important for each agent to achieve, whether it is call volumes, average handle times or first-time resolution. The processes that are put in place can then reflect these targets and make them more easily achievable.
Cross- and up-sales are a good metric to monitor in order to gain an insight into the productivity of your agents. It's obvious that the bottom line will improve significantly if every agent achieves an increase in these areas. The best way to enable agents to do this is to empower them with all the information that they need to do their jobs well. If an agent can access customer information – including purchase history – product information and special offers, all on the same screen, then they have all the knowledge that they need to complete these sales.
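
By way of illustration, the sketch below scores agents only on the KPIs defined for their role, in the spirit of the argument above. The roles, call records and KPI names are all invented for the example.

    CALLS = [
        {"agent": "A1", "handle_time": 240, "resolved_first_time": True,  "upsell": True},
        {"agent": "A1", "handle_time": 310, "resolved_first_time": False, "upsell": False},
        {"agent": "A2", "handle_time": 180, "resolved_first_time": True,  "upsell": False},
    ]

    # KPIs per role: sales agents are measured on up-sales; service agents are not.
    ROLE_KPIS = {"sales": ("avg_handle_time", "upsell_rate"),
                 "service": ("avg_handle_time", "first_time_resolution")}

    def agent_metrics(agent, role):
        calls = [c for c in CALLS if c["agent"] == agent]
        all_metrics = {
            "avg_handle_time": sum(c["handle_time"] for c in calls) / len(calls),
            "first_time_resolution": sum(c["resolved_first_time"] for c in calls) / len(calls),
            "upsell_rate": sum(c["upsell"] for c in calls) / len(calls),
        }
        # Report only the KPIs defined for this agent's role.
        return {k: all_metrics[k] for k in ROLE_KPIS[role]}

    print(agent_metrics("A1", "sales"))    # handle time and up-sell rate only
    print(agent_metrics("A2", "service"))  # handle time and first-time resolution
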
The important thing to remember is that agent and customer churn happens and will continue to do so, especially in today's competitive market. However, by optimising agent processes and giving staff the right equipment to do their jobs well, businesses can see churn levels – for both agents and customers – drop.

Ofer Yourvexhal is Senior Vice President of International Sales, Jacada

Moving to VoIP and converged networking can be challenging. Organisations must take care to choose the right systems, processes and partners if they want to realise the full cost and flexibility benefits of these new communications technologies, explains Guy Clark

A decade ago, life was very different.  The typical enterprise had separate voice, data and video conferencing networks, which were inflexible, expensive and complex to manage.  The growth of frame relay and ATM networks – both evolving point-to-point services – led to inefficient management of the networks and their components.

IP-BASED INFRASTRUCTURE - Gateway to simplicity

Multiple hardware and software vendors with multiple support and maintenance contracts added to the cost and complexity.  Few, if any, of the networks “talked” to each other and separate access was required for each network, causing costs to escalate. 
Although demands on data were generally much less onerous than they are today – before e-mail and more aggressive time-critical applications became business-critical – if an individual application was slow to perform it was difficult to pinpoint the cause, because there was limited visibility of, or control over, activity in the network or bandwidth utilisation.
Enter simplicity
Fast forward to today.  Businesses that have fully embraced IP convergence and Voice over Internet Protocol (VoIP) have resolved most, if not all, of these issues.  Indeed, as companies look to focus more on their core competencies in the face of ever-tougher competitive and regulatory pressures, many now outsource the management of their newly unified networks and on-premise routers to an experienced solutions provider. 
Enterprises typically pursue two outsourcing arrangements.  Some customers prefer to manage the routers themselves and select a wires-only service.  This gives them a broader choice of vendors and a wider range of costs to choose from for customer premises equipment.  It also gives them more control over how and when they configure parts of their network.
Of course, this arrangement assumes that the customer is bearing the burden of all router maintenance and vendor interaction.  As a result, many end-users opt for a fully-managed service.  In this instance, the provider installs routers on the customer's premises in addition to providing ongoing maintenance, support and management – in effect, a true one-stop IP VPN shop.
A fully-managed solution clearly provides businesses with more control and flexibility over their network quality of service. This is especially important at times of potential congestion when it is essential to preserve a premium class of service for voice and video applications. 
One example of this may be when the CEO needs maximum bandwidth for a worldwide video conference.  Bandwidth can then be reallocated to the videoconference, while less important applications, such as e-mail and Internet access, receive a lower priority.
In a converged and unpredictable world, usage-based charging for bandwidth may be a viable alternative.  This arrangement can help businesses optimise the cost versus benefit equation.  Customers only pay for what they absolutely need on a fixed-monthly-fee basis and are charged for bursts of capacity over and above this as needs dictate.
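
As a rough illustration of such an arrangement, the sketch below bills a fixed monthly fee for a committed rate and charges for peak usage above it. The commitment, tariffs and traffic samples are invented; real providers may bill bursts quite differently, for instance on a percentile basis.

    COMMITTED_MBPS = 10         # bandwidth covered by the fixed monthly fee
    FIXED_FEE = 500.0           # monthly fee for the committed rate
    BURST_RATE_PER_MBPS = 25.0  # charge per Mbps of peak usage above commitment

    def monthly_bill(samples_mbps):
        """samples_mbps: bandwidth utilisation samples for the month."""
        peak = max(samples_mbps)
        burst = max(0.0, peak - COMMITTED_MBPS)
        return FIXED_FEE + burst * BURST_RATE_PER_MBPS

    # A quiet month stays at the fixed fee; a month with a 14 Mbps peak
    # (say, the CEO's worldwide video conference) pays for the 4 Mbps burst.
    print(monthly_bill([6.2, 8.9, 9.5]))    # 500.0
    print(monthly_bill([6.2, 14.0, 9.5]))   # 600.0
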
The greater inherent simplicity of an IP converged network also should lead to significant cost savings.  This is true despite the growing emergence of e-commerce and overseas call centres, for example, which have increased network complexity with the introduction of new components into the network.  Although these additional components tend to increase network costs, it's widely acknowledged that businesses need to do more to be competitive today.
Having said that, there remains a clear cost advantage to migrating to converged technologies.

Application performance monitoring
Similarly, a single, converged IP service simplifies maintenance and, using a managed service, provides customers with a single point of contact.  Customers gain direct visibility into their network, so they can take control when they need to, and can provision, troubleshoot, control, monitor, and manage their provider's services and network.  For example, it provides access to bandwidth utilisation, creates and tracks trouble tickets for fault management and provides billing and other reports to enable detailed business analysis.   
This critical capability saves time, improves transaction speed, reduces cost and improves productivity, allowing end-users to focus on their core business. 
At the same time, service providers can use Application Performance Monitoring (APM) to help ensure that optimum performance is maintained across their own and customers' networks. 
Essentially, this means placing probes (on a temporary basis for professional services activities or on a permanent basis for constant management and optimisation) on both the customer premises and in the provider's network.  These probes collect valuable information from the packet headers and the packets themselves, and use it to provide intelligence on how each application is performing.  Packets are assessed in terms of bandwidth utilisation, jitter, latency and packet loss between various points in the network.
Each application has its own unique “signature” at the packet and header level, so it is easy to identify.  Reports can then be generated based on individual applications or application types, such as e-mail, web-browsing, SAP, or CRM.  Using a special algorithm, the APM software can even provide mean opinion scores to indicate the quality of VoIP traffic.
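
As an indication of how such an algorithm can work, the sketch below estimates a mean opinion score from measured delay, jitter and loss using a simplified E-model (in the style of ITU-T G.107, with loss constants often quoted for the G.711 codec). Commercial APM software uses more elaborate, vendor-specific versions of this calculation.

    import math

    def mos_estimate(latency_ms, jitter_ms, loss_pct):
        """Estimate a voice MOS from packet metrics via a simplified E-model."""
        # Effective one-way delay: treat jitter as extra buffering delay.
        d = latency_ms + 2 * jitter_ms
        delay_impairment = 0.024 * d + (0.11 * (d - 177.3) if d > 177.3 else 0.0)
        # Packet-loss impairment, a common approximation for G.711.
        loss_impairment = 30 * math.log(1 + 15 * loss_pct / 100.0)
        r = max(0.0, min(100.0, 93.2 - delay_impairment - loss_impairment))
        # Map the R-factor to a mean opinion score (1.0 to about 4.5).
        return 1.0 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

    print(round(mos_estimate(latency_ms=80, jitter_ms=10, loss_pct=0.5), 2))   # ~4.3, good
    print(round(mos_estimate(latency_ms=250, jitter_ms=40, loss_pct=3.0), 2))  # ~3.0, poor
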
So, for example, if a customer experiences a delay remotely accessing a particular application screen that provides account details or experiences a decline in voice quality, it's possible to request a report - in near real-time - that shows how a particular application performed against key metrics during the period of sub-standard operation. 
Normally, 'bursting' among the lower classes of service wouldn't affect performance, but examining the relevant statistics may reveal that there was unexpected bandwidth utilisation by another application which may have been caused by an unauthorised transfer of an extremely large amount of data.  This could prompt the customer to respond by preventing similar future transactions during business hours or increasing bandwidth to enable the transfer to occur.
In the case of premium-class service (used for real time applications, such as voice or video), “bursting” is not possible if more bandwidth is required.  In this case, existing bandwidth needs to be reallocated among the various classes of service, or the customer needs to request larger circuits.  Another approach could be to request more bandwidth in the network core (if the access circuits already are over-sized) to increase the total bandwidth available to this premium class.
Clearly, APM is an invaluable way to identify activity in a converged network, yet it can also form an important part of a service provider's broader professional services offering.  The ability of APM to assess total bandwidth and performance requirements against existing capacity within a LAN and WAN provides a meaningful gauge to businesses considering implementing an IP VPN or IP converged solution. 
These are just some of the reasons why APM helps enterprises make informed decisions about how best to manage future business growth.  By analysing and projecting industry bandwidth utilisation trends, it's possible to determine when a business may need to increase its overall bandwidth or reallocate among different classes of service. 
Alternatively, by highlighting a dramatic growth in one application, such as email usage, use of APM metrics may prompt a decision to tackle the issue in another way, such as by introducing a policy of zipping large file attachments or encouraging staff to use shared folders that reside on a server as opposed to emailing files to each other.
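
To illustrate the projection idea, the sketch below fits a simple linear trend to monthly peak utilisation and estimates how many months remain before a circuit reaches capacity. The traffic history and capacity figure are invented, and real capacity planning would of course use richer models.

    def months_until_exhaustion(utilisation_mbps, capacity_mbps):
        """Least-squares linear fit over monthly peaks; returns how many
        months remain until the trend line reaches circuit capacity."""
        n = len(utilisation_mbps)
        xs = range(n)
        mean_x = sum(xs) / n
        mean_y = sum(utilisation_mbps) / n
        slope = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(xs, utilisation_mbps)) / sum((x - mean_x) ** 2 for x in xs)
        if slope <= 0:
            return None  # flat or falling demand: no upgrade needed on this trend
        intercept = mean_y - slope * mean_x
        month_at_capacity = (capacity_mbps - intercept) / slope
        return max(0.0, month_at_capacity - (n - 1))

    # Six months of peak utilisation on a 100 Mbps access circuit.
    history = [52, 55, 61, 64, 70, 74]
    print(round(months_until_exhaustion(history, 100), 1))  # ~5.8 months of headroom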

Planning change     
It's no secret that good timing is essential to the success of any plan and implementing a VoIP migration is no exception to the rule.  The move to VoIP may be appropriate, for example, if the business has a tired, legacy PBX that needs a major upgrade to continue serving the business at current levels.  What if the business is relocating, opening new offices or rolling out a call-centre application?
In some cases, the objective may be to contain or reduce maintenance and management costs.  For other businesses, the overriding issue may be the need for additional functionality, such as unified messaging or the availability of presence-based information.
Whatever the objective, it's essential to view the problem holistically to ensure the solution meets the broader objectives of the business.  That's why it's important to ask these questions:
•    How do staff interact with each other on a daily basis?
•    Are they office-based or do they work remotely?
•    Does the business operate multiple offices across multiple geographies?
•    What forms of communication are used to interact with customers?
•    Do staff members typically face specific logistical issues, such as parcels or parts tracking?               
Having established these parameters, the next step is to design a network that would support the applications necessary to not only bolster the business today, but also help it move forward into tomorrow. 
Essential to this exercise is finding a solutions provider with the expertise and flexibility to understand the business issues, to help design a solution that maintains or reduces costs, and to provide the scope of functionality necessary for the business to achieve its corporate goals.

Open dialogue
Finding the right solutions provider for your business can be a time-consuming task.  Many businesses issue requests for proposals in order to assess the range of options available in the marketplace, what individual suppliers can offer and at what cost.
Unfortunately, this process can take up to a year or more to complete – precious time that many businesses can ill afford in today's highly volatile and competitive markets.
Engaging an experienced solutions provider can lead to an open dialogue about a business's existing network infrastructure, a discussion of end-user requirements and assessment of potential challenges the business may face during the transition. 
Enterprises also should consider other requirements before inviting a solutions provider to become such a fundamental part of their day-to-day business.  Is the systems integrator or communications network provider financially stable?  Does its network offer sufficient geographic coverage?  Has it developed a forward-looking product roadmap that provides flexible IP solutions?  Are those IP solutions supported by a comprehensive portfolio of services that address the business's needs? 
Put simply, does the solutions provider offer a truly 'one-stop shop' for today's and future networking needs? 
The past two years have seen a major shift towards IP VPN adoption.  And as the adoption rate of VoIP-based solutions accelerates, the market is gaining greater awareness and confidence in the benefits of this new technology.  Industry research confirms this growing trend, predicting a marked increase in the implementation of VoIP solutions over the next five years.
Clearly, in a fast-paced commercial environment in which agility and responsiveness define the fine line between success and failure, the network visibility and control provided by today's best-in-class solutions providers may be key to tipping the balance in favour of the business.
          
Guy Clark is Acting Marketing Vice President, Europe, Global Crossing

Convergence is the plat du jour in service provider thinking but, like service charges in the small print, is divergence back on the menu, asks Paul Gainham

There are many who believe that Ethernet should not have followed the example of a certain famous rock ‘n roll star in ‘leaving the building’; that as a technology, its home was the enterprise building LAN and that is where it should stay.

CARRIER ETHERNET - Anyone for Divergence?

But, following in Elvis' footsteps, Ethernet did leave the building, on a promise of lower cost, ubiquitous service transport in the eyes of many service providers. Certainly the early promise, on the surface, was appealing. Take predominantly enterprise-focused Ethernet switching products, add hardware redundancy and deploy in Service Provider Metro network areas to deliver 10/100/1000 Ethernet services at a lower cost point than other technologies of the day.
It is not being unkind to say that the first experiences of Ethernet in carrier networks were as appealing as a trip to a mediaeval dentist.  Quite simply, the basic technology was not geared to the levels of scale or reliability that carrier networks demanded.
Unbowed, the Ethernet vendor industry forged ahead, focusing its efforts on trying to correct the underlying issues by gradually introducing a raft of new protocols in an attempt to address Ethernet's fundamental issues of scale, reliability and performance.
Whilst these enhancements improved matters in certain areas, similar to a circus clown's car, these networks were still prone to unpredictable behaviour.
The key lesson from this period for the future is that 'good enough' is not good enough for service provider networks being asked to carry ever more critical, highly demanding voice and video services.  The service delivery benchmarks of the PSTN and television distribution networks are both set at an established, extremely high level in the eyes of end customers and this is the level service providers need to meet and exceed in the new “packetised” environment.
Exploring this point further, the demands on the packet network are changing rapidly from a predominantly data focused to a real time centric service environment.  Suddenly, the requirements for zero packet loss in a flow increase rapidly as there is no re-transmit option in voice or video flows. This drives two fundamental changes within the network.  Firstly there is the need to look at the network in its end-to-end entirety from a QOS perspective, instead of a hop-by-hop basis, to ensure real time flows can be introduced on to the network and handled consistently for their duration. This drives the need for an effective, scalable policy and control system end-to-end. Secondly, the routing and switching products at the transport layer itself must be capable of supporting real time flows (both unicast and multicast) at scale and be able to react rapidly to network changes or disturbances with minimal service impact to the real time flow.
So what's the answer?
MPLS (I hear the screams of the 'too' crowd – too expensive, too complex, too core centric, too MPLS!) has proven itself at the heart of many large, very complex and highly scalable service provider networks worldwide. Most importantly, it is well understood operationally in terms of service deployment and ongoing network management and provisioning.
A number of vendors are now beginning to see and promote the effectiveness of MPLS as a metro Ethernet network protocol, in both the data and control planes.  The reasoning is simple: a common control plane end-to-end that has all the resiliency, reliability and QOS functions necessary to provide a true carrier-class network, capable of hitting the PSTN and TV service benchmarks mentioned previously.
Picking a vendor that can deliver MPLS deployment expertise, combined with technologies and products capable of meeting those benchmarks, is still a decision that service providers need to weigh up carefully as they migrate further towards an 'all packet' environment.
Let's be clear, this is not a battle between Ethernet technologies old or new and MPLS, as some Ethernet vendors would forcibly suggest. The technologies are mutually beneficial in the metro network: the control plane resilience and service plane richness of MPLS, combined with the flexibility and cost effectiveness of Ethernet, makes for a very powerful combination.
In a highly competitive and aggressive market, most service providers are driving towards greater operational convergence, not just network convergence.  Effectively they are looking to reduce the costs of operation associated with the network and the service delivery riding over it, not add additional complexity and costs.
A very sobering thought is that based on Juniper Networks estimates, typically, Service Providers receive on average one fault call every 14 years for PSTN customers, yet during the early deployments of IPTV services, this has been in the order of one every three months.  Whilst not all are attributable to the underlying network, quite clearly, this is not a sustainable business model and again points to the dual needs of reducing the operational complexity of the network whilst demanding the most stable and proven combination of hardware and software from network vendors.
At a time when many are beginning to see the benefits of combining the strengths of MPLS with Ethernet towards that goal of operational convergence, the vendor industry in some quarters decides it's going to re-introduce divergence, in the shape of new protocols which claim to offer the usual nirvana of a fix-all solution.
Have we not been here before?  Enterprise Ethernet vendors with minimal large-scale service provider network design and deployment expertise trying to diverge from the basics, in an attempt to confuse and stall the market at a time when service providers need all the help they can get to simplify and operationally converge?
As always, the market will decide, but with MPLS technologies becoming ever more cost competitive, with huge advances in operational management combined with an established knowledge base, is there really anyone for divergence?

Paul Gainham is Director of Service Provider Marketing, Juniper Networks EMEA

As leading telecom operators take significant, albeit cautious, strides towards global 3G rollout, they are confronted by threats posed by innovative business models, increased customer demand and ever-intensifying competition. The emergence of niche players specialising in next generation service provision, and the entry of cross-sector operators into the communications arena has taken competition to dizzying heights. Amidst such a dynamic telecommunications environment, Siddharth Puri explains, customer care and billing (CCB) solutions have transformed from mere back-end support systems for service providers to strategic tools in customer retention and management

Over the past year, convergence across the telecommunication industry has become apparent. Carriers, cable operators and wireline service providers, increasingly competing for the same customers, are evolving to create a new breed of communications companies. Some will be pure-play voice or just data providers, while others will emerge as truly integrated communications companies offering varied combinations of mobile broadband voice, video, data and broadcast services. Service providers see the potential of generating new revenue streams by becoming a ‘one-stop-shop’ for all the communication needs of a customer, and are in the process of constantly refining their service offerings.

NEXT GENERATION BILLING - One-stop hybrid

With operators migrating to IP-based next generation networks, the convergence of voice, video and data – termed triple play – has received a major boost. In the early stages of deployment, triple play services were introduced purely as a mechanism to reduce customer churn. With a bundled service offering comprising voice, video and Internet services, the service provider's brand grew stronger, as it became harder for customers to switch between operators while maintaining all of their services. From the end-user's point of view, the benefits of taking all services from a single operator were twofold: the overall convenience derived from such an arrangement, and the price discounts that usually came with such service bundles. Next generation triple/quadruple play, however, refers to much more than just tying these basic services together.
The communications industry has graduated from a rather technology-centric ideology to a user-centric value creation model. Service providers recognise that migration to a convergent environment would not only help retain existing customers and defend their current revenue, but also generate additional revenues by introducing more sophisticated services. Moreover, operators can deliver innovative customer-focussed services by offering new functionality in the areas where all three components of voice, video and data converge.
Challenges for billing vendors
An important factor restricting the deployment of a truly convergent environment is the inability of legacy CCB systems to handle the entire gamut of services that a single operator can now offer its customers. The in-house systems used by the erstwhile telecom operators providing simple voice services worked well for what they were designed for: charging customers a flat rate based on time and distance. In the context of next generation services, however, billing becomes far more complex. Three major challenges facing vendors today are:
•    Multi-level convergence
•    Evolving value chain
•    Flexibility and scalability
Multi-level Convergence
Service providers have achieved convergence at the network, service, device and application levels. This allows them to strive for complete customer ownership, as a single operator can offer the whole spectrum of communication services to its customers. Now that operators possess the technology to provide such a multitude of services, the challenge is in attracting customers with the right service mix, and therein lies the gap. Several operators still run independent billing modules for the different sets of services that they offer. To tap the full potential of a truly convergent infrastructure, what is needed is a unified customer care and billing platform: a system that can handle the complexities of a convergent environment and, at the same time, provide the operator with dynamic rating and billing capabilities such as cross-service packaging and discounting. Moreover, the implementation and maintenance costs involved in running a unified billing platform would be considerably lower. In this context, the role of mediation and correlation engines has shifted from the mere collation of usage records to a more strategic function of information gathering, validation and intelligence creation.
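
The following sketch illustrates what cross-service packaging and discounting can mean in practice: because a unified rater sees all of a customer's services, it can apply a bundle discount that separate per-service billing modules never could. The tariffs and discount are invented for the example.

    TARIFFS = {"voice": 0.10, "video": 2.00, "data": 0.02}  # per min / title / MB
    BUNDLE_DISCOUNT = 0.15  # 15% off when a customer uses all three services

    def rate_invoice(usage):
        """usage: {'voice': minutes, 'video': titles, 'data': MB}"""
        lines = {svc: qty * TARIFFS[svc] for svc, qty in usage.items()}
        total = sum(lines.values())
        # The cross-service discount is only visible to a unified rater.
        if set(usage) >= set(TARIFFS):
            total *= 1 - BUNDLE_DISCOUNT
        return lines, round(total, 2)

    print(rate_invoice({"voice": 120, "data": 500}))              # no discount
    print(rate_invoice({"voice": 120, "video": 3, "data": 500}))  # bundled
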
With prepaid-postpaid integration becoming a reality, operators are able to offer different payment options to customers on a service-by-service basis, rather than at the customer level. Service providers can realise increased ARPU through service differentiation and service innovation strategies, as they are able to offer all services to all customers, irrespective of the payment method. In order to achieve prepaid-postpaid integration, some postpaid billing vendors have opted for alliances with established players in the prepaid arena, rather than developing a prepaid solution of their own. This strategy allows them to combine their respective skill sets, and also saves on time and capital investment.
Evolving Value Chain
Emerging business models around a multi-level convergent environment have introduced numerous entities into the value chain: network operators, service providers, content developers and aggregators, and application providers. The value chain is extending in both horizontal and vertical directions, and the complexity of value chain management is on the rise. Revenue realisation and settlement become issues of grave concern when multiple players stake a claim on the revenues generated from a customer. The number of leakage points to be monitored by the operator is also higher in such a scenario.
In an era where telecom service providers depend heavily on third-party content providers, multiple-party billing capability is critical to any service provider's business. Real-time revenue sharing, coupled with cross-service discounting capability, is becoming the de facto standard for billing vendors. Furthermore, as content services catch on with consumers, revenue leakage issues become even more pervasive. Next generation billing solutions should be able to identify leakage points across the entire value chain, and enable the operator to plug them.
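
As a simple illustration of multiple-party billing, the sketch below settles a single content transaction across the value chain and checks that nothing leaks between the settlement accounts. The split percentages are assumptions, not an industry standard.

    REVENUE_SPLIT = {"content_owner": 0.50, "aggregator": 0.10, "operator": 0.40}

    def settle(transaction_price):
        """Split one content purchase across the parties; assert the shares
        add up so no revenue leaks between the settlement accounts."""
        shares = {party: round(transaction_price * pct, 4)
                  for party, pct in REVENUE_SPLIT.items()}
        leak = transaction_price - sum(shares.values())
        assert abs(leak) < 0.01, f"revenue leakage detected: {leak}"
        return shares

    print(settle(4.99))  # e.g. a premium video download
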
Flexibility and Scalability
Service providers were stifled in their approach to global 3G adoption by the inflexibility of existing infrastructure. Legacy billing systems could neither scale up technologically to manage the wide expanse of services that next generation networks promised, nor match up to the growing customer base expected of a multi-service operator.
Service providers and billing vendors alike realise that the CCB system needs to be as dynamic and interactive as the network it supports. Estimates show that nearly fifty per cent of all billing systems become obsolete, on average, every four to six years, primarily due to their inflexibility. Solutions therefore need to have scalability ingrained deep in their development philosophy.

A strategic outlook
Service providers, who are in the process of rolling out triple play, 3G services and beyond, realise that as the technological divide between operators closes, it will be service differentiation in terms of total customer experience that holds the key to long-term success. Operators understand that each customer is unique, and deserves to be treated differently. In order to achieve this level of differentiation, marketing teams need to design innovative customer-specific service bundles, taking into account factors such as the services taken, prior usage patterns and customer loyalty. There is a need for dynamic and interactive pricing solutions with the ability to analyse historic customer data and come up with best-value plans based on individual customer preferences.
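
The sketch below illustrates the best-value-plan idea: score each available plan against a customer's historic usage and recommend the cheapest fit. The plan structures and prices are invented for the example.

    PLANS = {
        "Talk 200":  {"fee": 15.0, "incl_min": 200, "extra_per_min": 0.20},
        "Talk 600":  {"fee": 30.0, "incl_min": 600, "extra_per_min": 0.15},
        "Unlimited": {"fee": 45.0, "incl_min": float("inf"), "extra_per_min": 0.0},
    }

    def best_plan(monthly_minutes_history):
        avg_min = sum(monthly_minutes_history) / len(monthly_minutes_history)
        def cost(plan):
            extra = max(0.0, avg_min - plan["incl_min"])
            return plan["fee"] + extra * plan["extra_per_min"]
        return min(PLANS, key=lambda name: cost(PLANS[name]))

    # A customer averaging ~350 minutes is cheaper on Talk 600 than on
    # Talk 200 plus overage charges.
    print(best_plan([320, 360, 380]))  # 'Talk 600'
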
From a vendor point of view, the evolution to a truly convergent communication environment presents a great opportunity. Billing systems of yesteryear clearly do not possess the capability to handle such a technologically rich and functionally dynamic setup. However, owing to financial constraints and existing license agreements with vendors, it is unlikely that operators would opt for a complete replacement of their billing infrastructure. Rather, an alternative approach being adopted is a gradual move to an integrated billing infrastructure - a phased replacement driven by the introduction of next generation services. The key for vendor success lies in developing a highly modular system that is flexible enough to incorporate an ever-evolving, complex and innovative service portfolio, and is future-proof in terms of its scalability.

Siddharth Puri is Product Manager, GTL Infrastructure Limited
