What OSS technologies will be in demand by carriers in the coming year and beyond? Clarissa Jacobson believes that the way investors and industry experts answer this question influences decisions about venture capital investment and mergers and acquisitions

In this world of advancing technology where wrong bets on the future can result in major failures, telecoms executives and investors are constantly on the watch for what might be termed the "Next Big Thing."  Since our industry is so replete with acronyms, we shorten this to "NBT."  In this article we explore current thinking about NBTs via a survey we conducted of a number of top executives and venture capitalists who are intimately familiar with the telecom/OSS space.  We also review some recent merger and acquisition trends.
Both the survey and the M&A review point strongly to a tidal shift in carrier requirements.  The emphasis during the last five to ten years has been on rolling out new services and adapting systems to IP architecture.  While this continues to be of importance, our study indicates that carriers are increasingly turning their focus to making existing systems more cost-efficient and customer-friendly, and doing so within a real-time environment.  Sophisticated OSS applications that elegantly address these needs are the OSS world's NBTs.
Respondents to the survey, asked about what was most likely to capture venture capital investment, cited several areas over and over again: business and network intelligence, customer self-care and product lifecycle management (PLM).  One respondent articulated it very clearly:  "Any technology that supports automation of customer processes and reduced administration will garner VC interest.  Carrier focus is definitely on operating expense reduction."

Companies that can demonstrate good return on investment and a lower overall total cost of operation for their solution set make ideal targets, according to Nick Stanley, VP Networks of Brilliant Cities, a designer, builder and operator of regional broadband telecom networks.
The survey results came from a wide range of OSS business owners/executives/board members, consultants, venture capitalists and carrier executives.  The majority were from North America and Europe, with a smattering from ANZO, Asia and Africa. 

From 1999 to 2005 the major trend was adapting to the explosion of new services and deploying IP platforms.  Companies hustled to deliver the vast array of new products, and software that could guarantee seamless capabilities was all the buzz.  Fast forward to today.  Now that most carriers have next-gen service capabilities in place, their primary concern returns to managing costs and improving customer experience.  At the risk of oversimplifying what the future has to offer, applications that help increase ARPU, reduce churn and lower costs are where the deals will happen - and are happening.  Several big acquisitions in the past year confirm this.   

In September of 2007 Cisco Systems (NASDAQ:CSCO) bought web business intelligence and analytics reporting company Latigent, whose product distils call centre data into reports to improve customer service and analyse customer behaviour.   "By acquiring Latigent, Cisco is signalling a commitment to increase the value of customer investments in our customer interaction solutions, by providing appealing, robust and dynamic tools to enable increased visibility and efficiency, resulting in improved customer experiences," says Laurent Philonenko, Vice President and General Manager of the Customer Contact Business Unit, Cisco.  

In one of the biggest deals announced in 2007 and closed in 2008, SAP (NYSE:SAP) acquired Business Objects for 4.8 billion Euros.   Business Objects, a French company, makes software that helps companies analyse data to detect market trends.  SAP had been comfortable making smaller, targeted acquisitions, but in an effort to compete with Oracle - which has been aggressively acquiring business application companies over the past three years - SAP took the leap at the end of October with its decision to acquire Business Objects.  Oracle has spent more than $20 billion on companies offering software that manages human resources, supply chains and customer relations, and previously acquired SAP's competitor, Hyperion.
At practically the same time as the SAP activity, NetScout Systems (NASDAQ:NTCT) announced its intent to acquire data mining and network analysis company Network General for $213 million.  The acquisition closed on January 14, 2008.  NetScout said the combined company would focus on reducing Mean Time to Resolution for enterprises, wireless providers and government agencies.  NetScout President and CEO Anil Singhal said: "Today, we are bringing together two established companies with complementary technologies to form a new, stronger organisation that will have the scale, technology and mindshare to meet some of the greatest challenges associated with virtualisation, convergence, SOA and highly distributed network-centric operations."   By integrating the two companies, NetScout expects to achieve numerous cost synergies, including $30 million in expense reductions. 

In December of 2007, Motricity, a provider of mobile content services and solutions, announced the completion of its acquisition of InfoSpace's Mobile Services Business for $135 million.   The deal expands Motricity's customer base and allows it to offer a full range of services on an end-to-end platform. Ryan Wuerch, Chairman and CEO of Motricity, says: "Perhaps the biggest differentiator of the combined company is that we offer unmatched insight into the mobile consumer.  This insight is invaluable for our partners."

Finally - but by no means the last of the business intelligence deals we expect to see in the coming year - was Nokia Siemens Networks' announcement of its intention to buy Apertio for 140 million Euros, a deal expected to close in May.  Apertio is a provider of mobile subscriber data platforms and applications.  Key to the acquisition is that Apertio gives Nokia Siemens Networks an added edge in helping customers simplify their networks and manage their subscriber data.   
Jurgen Walter, head of Nokia Siemens Networks, notes: "The race is on to deliver seamless and highly targeted services to end-users across various access devices and this requires a unified approach to subscriber data.  Enabling access to this information in real-time means you can profile subscribers and deliver new services and advertising appropriately." 
Paul Magelli, Apertio CEO puts into a nutshell exactly the reason business intelligence deals have been so prevalent: "With Internet services, communications services, and entertainment services now converging, operators must simplify their networks and focus on subscriber intelligence to stay competitive." 

One area that several of the survey respondents mentioned, but that did not show up in the merger and acquisition deals of recent months, was Customer Self-Care.   No longer does this mean a simple web portal where customers can review a bill or get information.  The next generation of self-care automates the entire add/move/change workflow, from customer entry through to provisioning and activation.  Providing customers with the ability to help themselves is extremely beneficial to a carrier's business, as it reduces operational costs and improves customer experience.    With churn rates averaging 1-3 per cent monthly and the typical carrier spending $300-$600 to gain one new customer, it is obvious why this is a hot topic. 
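To see why that arithmetic is so compelling, consider a rough, illustrative payback calculation.  The model below uses the industry averages quoted above; the margin figure and the function itself are invented for illustration, not drawn from any specific carrier:

```python
# Rough payback model for customer acquisition (illustrative only).
# Uses the churn and acquisition-cost averages quoted in the text;
# the monthly margin figure is an invented assumption.

def payback_months(acquisition_cost, monthly_margin, monthly_churn):
    """Months of margin needed to recoup the acquisition cost,
    discounting each month by the probability that the customer
    is still subscribed (i.e. has not churned)."""
    recovered, month = 0.0, 0
    survival = 1.0
    while recovered < acquisition_cost:
        month += 1
        survival *= (1 - monthly_churn)
        recovered += monthly_margin * survival
        if month > 600:          # cost can never be recouped
            return None
    return month

# A $450 acquisition cost, $40/month gross margin, 2% monthly churn:
print(payback_months(450, 40.0, 0.02))
```

Even with these generous assumptions the carrier waits roughly a year before a new customer turns profitable, which is why self-care technologies that cut the cost of serving existing customers attract such interest.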

Lane Nordquist, President of Information Solutions, a subsidiary of diversified communications company HickoryTech Corporation, states: "Customer Self-Care through the web or mobile devices is becoming increasingly pervasive as customers/prospective customers take advantage of their ability to execute consumer choices without interference.  Any technology that seamlessly links customer self care to automated provisioning of services should attract venture capital investment." 

One can only deduce that the capital has not yet been put forth because the technology to make Customer Self-Care truly seamless is not quite there.   

The number one area cited by survey respondents can be lumped together under what today is loosely referred to as Product Lifecycle Management (PLM).  Conceptually, PLM allows a carrier to build, introduce and deliver services and consumer choices much faster.  This requires managing and coordinating many divergent systems and databases.  A true PLM system integrates network and business intelligence with back office functionality.
A few survey responses suggested that emerging WiMAX and mobile communities - and the OSS software for managing these areas - are up-and-coming NBTs.  With high-profile deals involving companies like Facebook and MySpace, some see this area as ripe for investment as online communities extend into the mobile and wireline environment.

Carriers are faced with fickle customers who have increased demands and little patience.  Competition is cut-throat, and the carriers that are able to streamline costs while delivering a better customer experience are the companies that will succeed in the years to come.

Clarissa Jacobson, is Director, Research and Administration with Peter A. Sokoloff & Co - an investment banking firm that specialises in mergers and acquisitions of companies in the Telecom and Security industries. She can be contacted via: cjacobson@sokoloffco.com

Dante Iacovoni discusses how operators can benefit from the lessons of Web 2.0 success

The phenomenal success of Web 2.0 companies is there for all to see. They have, in a relatively short space of time, gone from a new and emerging technology to a worldwide phenomenon led by companies like MySpace, Facebook and Google.  The uptake and popularity of these companies and services has been based almost universally on personal, free-of-charge content, subsidised by the very effective gathering of advertising revenues. 

The primary key to this success is the information that Web 2.0 companies have on their consumers and the way in which they use this information to their benefit.  Each company has a detailed level of insight into the behaviour of the people using its service, making them desirable to advertisers.  But what can the telco world learn from this? 

Telco operators are, for the most part, used to basing their revenues on subscriptions and usage fees.  As a result, telco portfolios are often very similar and it can be difficult for consumers to differentiate between them. In today's environment, consumers switch service providers simply according to which operator is offering the best price for what is likely to be a very similar service. To go beyond this fleeting loyalty and really build a relationship with the customer, operators will first and foremost need to offer distinct and compelling services beyond the triple or even quad-play bundles that are becoming the norm in some markets.
Although triple and quad play have initially succeeded in reducing churn, telcos will soon find that they need to provide more distinctive services to maintain their customers' loyalty.
Competition amongst telcos ultimately comes down to who "owns" the consumer. To stand the test of this competition, operators will need to learn to better understand their end users, to incentivise them and to generate loyalty that goes beyond "call minutes". This will enable them to differentiate themselves amidst such fierce competition and to gain a deeper level of understanding of the users they're servicing. Once they know what services their users want - considering not only their subscribers but each person in the household who uses the service - they will be able to identify the most effective ways to monetise them.
Distinct and compelling services will be the primary catalyst in acquiring and keeping customers. One of the strongest weapons in the fight to develop these services is trust, which is essential if users are to provide the kind of information needed to create tailored and personalised services.  Operators have a distinct advantage in that they already occupy a position of trust with their users, but they have yet to leverage the full potential of these existing relationships and convert them into advertising revenues. In the meantime, savvy Internet companies have used their insight into consumer behaviour to leverage ad revenues and grow - in some cases exponentially - as a result. Their distinct and compelling services, offered free of charge, have helped them create a sizeable user base - an opportunity that operators, despite their position of trust, have not yet begun to exploit, even though they could do so with relative ease.

Operators that learn not to categorise users by network access stand to gain even more. They will be able to target advertising based on consolidated user behaviour, reaching users with messages based not only on their interests but also their location. The best advertising can also be that which users do not think of as advertising - take Google search as an example.

But the links between the two worlds can extend beyond simply learning from the 2.0 success stories. As an operator you are providing access not only to your own services but also to those existing in "the Internet cloud" - the likes of Joost, Facebook and so forth. Today these exist entirely independently of each other, but in the future there may be value in finding synergies between the two, and perhaps even striking agreements between an operator and the individual 2.0 companies. There are, for example, many opportunities for shared revenue; it is just a question of working out the right format.

The opportunities for telephony, triple- and quad-play will eventually be pushed to the limit. All consumers see is that they are getting essentially the same service they would anywhere, nothing revolutionary or overly exciting. And they have a point: many of today's IPTV services are a simple carbon copy of cable. 

The key now is to upscale the value of the broadband network and leverage the opportunities it offers. To do this, operators will need to build intelligent service nodes into the home that service-connect the end users. New functionality will need to be added to both set-top boxes (STBs) and home gateways (HGWs) to meet this demand: customer premises equipment (CPE) needs to be fully upgradable, providing new functionality in line with new service offerings as well as advances in technology and standards.

To date operators have been very technology focused, concentrating on improving the way they do business rather than re-evaluating their business models. There is a need for them to focus less on technology and more on services than they have before, and by doing so, to acquire revenue from more sources than they do today: call minutes alone are no longer enough.

One viable opportunity lies in third party cooperation and in helping third parties to develop their own applications. Some operators like France Telecom and BT are already starting to push this and will eventually open up a broader portfolio of services as a result.

There is a real opportunity for operators to capitalise on the lessons of the digital boom, but to do so successfully they will need to broaden their view of who their customers are and to understand that they are individuals with individual wants. It is about moving from Average Revenue Per User (ARPU) to Gross Revenue Per User (GRPU). By making this effort they will be able not only to enhance their own service portfolios, but also to sell on to advertisers who are interested in gaining that knowledge, creating a strong and sustainable revenue stream and, vitally, gaining a long-term foothold in the "ownership" of the customer.
We are on the threshold of changing times.  The future is about new services and applications as well as new business models. For operators, the possibilities to develop their business in new directions are huge, and if they can acquire revenues from advertisers, they will be able to offer new services at a lower price.

They have a chance to understand and build a relationship with the customer and through that to develop a power position; without this, they will have a clear risk of ending up as a bit pipe for 2.0 companies.  Get to know your consumer and you can create compelling services that you know will appeal to them; and once you've achieved that goal, you can leverage your knowledge to create valid and sustainable revenue streams. 

Dante Iacovoni is Marketing Director, Tilgin

The growth in demand for data services is great news for the industry in general, but it does change the dynamics of the market. Doug Dorrat outlines the implications for mobile operators in an environment dominated by the need for unique customer profiling

It's been talked about in markets around the world for a long time - the conversion from a voice-focused to a data-focused telecommunications market - but there's only about 18 months left before traditionalist telcos hit a critical point.

Why? Because across the world voice revenues are predicted to drop so steeply that conventional voice services can no longer be the ‘bread and butter' revenue source that all operators have enjoyed.

And there are clever CEOs of content-focused virtual operators, such as Finland's Blyk, that are changing the rules of the way both data and voice services are marketed and provided.
Blyk, for instance, is offering free airtime to the vital 16-24 year-old market segment, who are prepared to be advertised to - Blyk makes its revenue from the advertisers. The advertisers reach a highly targeted audience, and the kids phone, text and connect to the Internet from their mobile devices free of charge.

Everyone's happy - and the message to telcos could not be clearer: understand that your old market is disappearing and get to grips with your new market in fine detail.
The old market is gone because there is no longer a mass market to which you can supply bulk access and charge for the time customers use on your network.  As the Blyk example shows, the generation leaving school now expects to connect free of charge. And, because of push technologies, the mass market is turning into millions of individuals, each of whom wants access to connectivity in a unique way that has massive revenue-bearing potential for telcos.

But, to tap into that potential you need business intelligence (BI) - and you need it at the right level.

By the right level, I mean the kind of industrial strength BI solution that not only enables you to link appropriately to your audience individually, but also enables data service and content providers to link to you and your audience.

To achieve that you need to be able to collect, analyse, report on, and share terabytes of data - because you need to track the behaviour of each of your millions of customers in extreme detail. When do they use your services, what are their work demands on your network, what are their lifestyle preferences, what kind of advertisers are going to want to reach which of your customers? And so on.
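As a sketch of what tracking each customer's behaviour might mean in practice, the fragment below aggregates usage records into simple per-customer profiles.  The record layout, field names and figures are all invented for illustration; a real carrier-grade BI system would of course operate on terabytes of call detail records, not an in-memory list:

```python
# Minimal sketch of per-customer behavioural profiling from usage
# records. Record layout and figures are invented for illustration.
from collections import defaultdict

records = [
    # (customer_id, service, hour_of_day, minutes_or_megabytes)
    ("c1", "voice", 9, 12),
    ("c1", "data", 21, 150),
    ("c2", "data", 13, 40),
    ("c1", "data", 22, 90),
]

profiles = defaultdict(lambda: defaultdict(float))
for customer, service, hour, volume in records:
    profiles[customer][service] += volume
    # Bucket usage into day (08:00-19:59) vs evening, a crude proxy
    # for the lifestyle preferences an advertiser might target.
    period = "day" if 8 <= hour < 20 else "evening"
    profiles[customer][period] += volume

print(dict(profiles["c1"]))
```

Even this toy version shows the shape of the output an advertiser cares about: customer c1 emerges as an evening data user, a very different target from a daytime voice caller.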

Take the example of someone flying into, say, Heathrow airport and wanting a taxi to take him to his meeting in town. Location-based services automatically advertise taxi services to him, and the operator provides an accessible, easy-to-use application to order the taxi, send his chosen company his destination details, get a quote, and book and pay for the trip on the device. In-built GPS guides the taxi directly to the customer.
The operator wins through advertising revenue, a share of the transaction and the fee charged to the taxi firms for hosting the application. With this model the operator makes significantly more than from a voice call ordering a taxi. How about that for boosting ARPU?  And if your organisation can't provide him with that capability, then he'll go to one that can.
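A back-of-the-envelope comparison makes the point.  Every figure below is an invented assumption - the ad rate, fare share and hosting fee are purely illustrative - but they show how the revenue stack in this scenario dwarfs a single billed call:

```python
# Invented figures comparing operator revenue from the taxi
# application scenario with a simple voice call ordering a taxi.
ad_revenue = 0.50                  # paid by the taxi firm for the targeted ad
transaction_share = 0.04 * 25.00   # 4% share of a 25.00 fare
hosting_fee = 0.10                 # per-booking fee for hosting the application

app_revenue = ad_revenue + transaction_share + hosting_fee
voice_call_revenue = 0.30          # a short call billed at per-minute rates

print(app_revenue, voice_call_revenue)
```

Under these assumptions the application transaction earns the operator several times the revenue of the call it replaces - and, unlike the call, every component of that stack scales with third-party participation rather than with subscriber tariffs.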
Which brings us to the question of churn and customer loyalty - both of which are dependent entirely on your ability to differentiate yourself in the market. Right now, using the old bulk network approach, there's no way you can differentiate yourself. The telecoms market is very nearly saturated.

Most of the people who want telecoms access have it in some form or another. Also, the tariffs for individual telco services are confusing for the average consumer, so you can't build loyalty by offering the cheapest service (if that is what the customer is looking for). Besides, you already know that you're running a very expensive infrastructure for a largely unprofitable group of users. Or do you?

Conventional telco wisdom has it that a tiny pocket of users is profitable. In theory, they're the ones who use your network a lot to make lengthy calls. Conventional wisdom also says that on average each new user you take on takes a year to 18 months to become profitable when you consider all the elements of cost the customer incurs individually. So, in effect, you're funding your customers.

Funding your customers in this way is not only a difficult way to make money; it's also a risky model considering the new competitive threats.

This problem has prompted operators to put Customer Value and Profitability at the very top of their strategic agendas as a means of maintaining competitiveness, maintaining loyalty and finding new ways of growing their businesses. In fact, Bain & Co recently found that an increase of one per cent in customer loyalty can improve profitability by 25 per cent.

BI is now proving that the customers you used to think were unprofitable are actually among the ones you need most. Take, for example, a housewife who makes very few calls from her mobile device but receives a large number from friends and family. In other words, she costs you very little but brings in significant revenue from other networks. There are thousands like her, and they are the ones you should be marketing to - in terms of content offerings.
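The housewife example can be made concrete with a toy calculation.  The rates and the flat-rate subscription assumption below are invented for illustration - real interconnect termination rates and tariff structures vary widely - but the asymmetry they expose is exactly what customer-level BI reveals:

```python
# Toy per-customer profitability model showing why a low-usage
# customer can be highly profitable once incoming termination
# revenue from other networks is counted. All rates are invented.

OUTGOING_COST = 0.05         # network cost per outgoing minute
TERMINATION_REVENUE = 0.08   # revenue per incoming minute, paid by other networks

def net_value(outgoing_minutes, incoming_minutes, subscription=10.0):
    """Monthly net value: flat subscription plus termination revenue,
    minus the cost of carrying the customer's own outgoing calls."""
    return (subscription
            + incoming_minutes * TERMINATION_REVENUE
            - outgoing_minutes * OUTGOING_COST)

heavy_caller = net_value(outgoing_minutes=600, incoming_minutes=100)
quiet_receiver = net_value(outgoing_minutes=30, incoming_minutes=400)
print(heavy_caller, quiet_receiver)
```

On these figures the "quiet" customer is worth far more per month than the heavy caller on a flat-rate bundle - a ranking that is invisible unless profitability is computed at the level of the individual customer.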

But unless you have BI, you can't know which of your customers are profitable and in which ways. Without effective customer-level BI, you're essentially running your business on a hunch.

Certainly, you're not going to be able to compete in what is an entirely new market that has absolutely nothing to do with paying for calls. Your shareholders should be very anxious - the market is not going to be tolerant of outdated services for more than about another 18 months.

The process of implementing the right kind of BI, however, is going to take you at least that long - if you start planning now. Remember, it's not just a question of installing the solution. You have to adjust your business to use the information it gives you. Information not just about your customers but about the cost of servicing and marketing to those customers. You're going to need to build a profitability model based on how you choose to differentiate yourself.

It takes time, but it's not particularly difficult. British Telecom, for instance, saw the writing on the wall in the late nineties and, this year, will make more money from providing services other than traditional telephony to consumers.

Doug Dorrat is Microstrategy Director Industry Services - Telecoms and Media, Europe, Middle East and Africa

Ajit Jaokar examines the synergies between Mobile Web 2.0 and IMS, defining the terms, and exploring how these two concepts complement each other

At first glance, Mobile Web 2.0 and IMS have no synergies. After all, they operate at different layers of the stack - Mobile Web 2.0 is at the Web/services layer, and IMS is a networking paradigm.

However, market forces have conspired to bring these two ideas together because many IMS services can be implemented on the web (often for free). In a nutshell, the telecoms industry cannot ignore the web. It must instead think of how it can add value to the web and identify elements that can be uniquely implemented in the core of the network (and not the edge).

Web 2.0 and Mobile Web 2.0
Since Mobile Web 2.0 extends the meme of Web 2.0, it is necessary to understand Web 2.0 before we explore Mobile Web 2.0. In spite of all the hype, the distinguishing characteristic of Web 2.0 lies in its use of the web as a platform. If we extend this idea to mobile devices, then at a minimum a Mobile Web 2.0 service must use the web as a backbone.
On first impressions, Mobile Web 2.0 is simple enough.  However, its implications are profound, as we shall see below.

The first implication is that the web is the dominant player, not telecoms. This is not a comforting thought for many in the telecoms industry. Yet we, as users, accept these ideas. Even the young today are spending more time on the web and less on mobile devices (with applications such as Facebook, for instance). In addition, the web is 'free' - which immediately arouses suspicion on the telecoms side.
Secondly - in a Mobile Web 2.0 scenario, the device and the service become more important than the network itself. This is a natural by-product of the intelligence shifting to the edge of the network.

In addition, we have the 'deep blue sea problem'. If we end up capturing content from a phone and uploading it on the 'deep blue sea' of sites like Flickr - then the unique mobile advantage is lost (ie once the content is on the web, it can be treated as any other piece of content). Hence there is a need to consider the question of 'uniqueness of mobile' when it comes to interacting with the web.

It is against this backdrop that we explore IMS - ie we are exploring what IMS can add to a service that can be uniquely performed by the network.

IMS brings IP (Internet Protocol) to the telecoms world. A complete definition of IMS is outside the scope of this article - however the Wikipedia entry on IMS gives a good introduction.  IP traditionally implies dumb pipes and smart nodes (aka net neutrality principles - all packets are created equal and intelligence shifts to the edge of the network). However, although IMS is IP based, it is philosophically opposite to the principles of net neutrality since it seeks to make the network intelligent.
On one hand, thinking of IMS applications is a bit like thinking of 3G applications. Every application will be a 3G application but in most cases, the bearer does not matter. Consequently, if you flip this argument, then an ‘IMS application' needs to be an application that will make use of the (bearer) telecoms network itself.

So can such applications be possible?
In theory - yes.
In itself, making the network intelligent is not such a big issue. Consider delay tolerant networks - which are used in military and space applications. In that case, all packets are not created equal especially when operating in hostile environments.
The real question is - are all packets created commercially equal?
Hence, the question spans more than the technical remit and is directly tied to business models and can be reframed as: Will people pay for applications with differential charging and differential QOS?

If such applications may be found and/or they add value uniquely from the network core - then they would be 'IMS' applications in the true sense of the word (otherwise they are likely to be implemented by the web/application layer itself and are likely to be free).
The context within which IMS operates cannot be ignored either. The Internet and the web are dominant, and they are options for most IMS applications. The Internet and the web are global, and they are free. That does not help the case for IMS applications.
So, IMS applications must:
a) Uniquely leverage the network
b) For an operator - and let's face it, IMS is mainly driven by operators - be chargeable to the end user and
c) Take the Internet into account - ie recognise that any such application is competing against the Internet, else it will not work.
One key observation is: The web is global. IMS is national at best - and in most cases sub-national in coverage (where there is more than one operator within a country). Also, end-to-end IMS connectivity issues are still not solved - and that hampers many IMS applications.
IMS applications

Is there an example of an IMS application?
Consider the case of ‘Mobile Multimedia Twitter'. Twitter is a popular microblogging service, and according to Wikipedia:
"Twitter is a free social networking and micro-blogging service that allows users to send ‘updates' (or ‘tweets'- text-based posts, up to 140 characters long) to the Twitter website, via short message service, instant messaging, or a third-party application such as Twitterrific. Updates are displayed on the user's profile page and instantly delivered to other users who have signed up to receive them. The sender can restrict delivery to those in his or her circle of friends (delivery to everyone is the default). Users can receive updates via the Twitter website, instant messaging, SMS, RSS, e-mail or through an application."
The idea of a media rich twitter is not new and, indeed, there are some services already in existence, and, of course, Twitter itself is already 'mobile' in the sense that you can get updates via SMS.

However, taking the idea of video twitter to mobile devices would be a complex proposition, and would need optimisation of the network (hence an IMS application).
The idea of mobile video twitter could combine a number of different ideas - most of which we know already:
a) Twitter itself, ie short updates
b) Video
c) Maybe presence
d) Maybe location
e) Maybe push to talk
f) Client side optimisation

However, most importantly, it will need the mobile network to be optimised. Push to Talk (PTT) has been around for a long time - its biggest proponent being Nextel. However, PTT has not taken off in most places in the world, partly because it needs the network to be optimised - and in most places you end up delivering voice over a non-optimised GPRS network, which is not really feasible from a performance and user experience standpoint, as we can see from the experience of Orange, which attempted to launch PTT back in 2004 without much success.

However, the networks themselves have come a long way since that time, and indeed, one of the most common questions we see today is  ‘Where are the IMS applications?'  - which translates to ‘Where are applications that can uniquely use the network?' The service will need client side optimisation as well as network side optimisation if it is to be truly useful and friction free to the end user. From an end user perspective, we can view it almost like ‘video push to talk'. I have been sceptical of the idea of end to end (person to person) IMS, and I don't think person-to-person mobile video twitter will work (yet). However, a web terminated service can certainly work.

Interestingly, it is one of the very few services I have seen where an operator can have a competitive advantage over a similar web application (because the service needs both device-side optimisation and network-side optimisation).

Many IMS services can be implemented by Web 2.0 (often for free). However, as we have seen above - not all IMS services can be implemented by Web 2.0. To identify truly unique IMS services, it is necessary to leverage those tasks that can be uniquely performed by the network.

Ajit Jaokar is the founder and CEO of the publishing company futuretext. He believes in a pragmatic but open mobile data industry - a vision which he fosters through his blog OpenGardens. Ajit is the co-author of the book 'Mobile Web 2.0'. He chairs Oxford University's Next Generation Mobile Applications Panel (ForumOxford) and conducts courses on Web 2.0 and User Generated Content at Oxford University. He is currently doing a PhD on Identity and Reputation systems at UCL in London

Telecoms providers challenged with the need to transform their technology to meet next generation service requirements are looking to IT benchmarking for the roadmap ahead, says Paul Michaels

A recent study by Ovum, entitled IT Governance for Telcos, reports that: "IT for the telecoms vertical is currently going through an exciting period of change as telecoms operators gear up for the long haul of business transformation - from a traditional vertically-integrated telco into a competitive service provider based on a next-generation, all-IP network."

Yes, the European communications industry has entered cyberspace and the future is advancing at warp velocity. Whatever else the next ten years may bring, one thing is certain: telecoms providers will continue to face fierce competition, especially from new players ‘born' in the Internet era, and will be forced to cope with unrelenting pressures to deliver services ever faster, better, cheaper.  According to Ovum: "Transformation into an ‘infotainment' or ICT company - to name just two examples - requires intelligent, responsive infrastructures and running costs that are more in line with today's competitive business environment."

To be among this new breed of telecoms provider, organisations need access to enabling technology that can drive next generation IP networks, content and value-added customer support services. However, technology itself is only part of the equation - to be fully optimised it must be supported by a progressive corporate culture.  To be in the vanguard of the next generation communications industry, organisations must be committed to reducing operational costs, making continual performance improvements and bringing to market new services along with best practice customer support.

Transforming current telecoms technology and operational support systems (OSS) is no trivial task, especially when faced with the need to juggle such opposing pressures as cost reduction on the one hand and investment in new services on the other. So where should one start? Any journey towards change must begin with a clear picture of one's current situation - a frequently far-from-straightforward task, particularly in the case of large organisations burdened with multiple, often duplicated, legacy systems and broadly dispersed infrastructures. However, without this initial clarity, many fundamental business decisions cannot be made.

Consider, for example, the question of whether it is more cost efficient to support, say, the customer billing service or the enterprise desktop environment through the in-house ICT department, or instead to turn these applications over to an outsourcing provider. This issue can only be addressed effectively when management has a full set of detailed, current baseline data - such as costs and key performance indicators (KPIs) - on all relevant IT components and OSS methodologies. Without this type of granular metric, it is very difficult for management to evaluate trends over time in cost management and/or performance levels. And it is virtually impossible to make an accurate ‘apples-to-apples' comparison between in-house and outsourcing costs.

Because of this increased appetite for business information, benchmarking - both in the back office and at the customer-facing end of the operation - has become an increasingly popular way to achieve best practice and thereby win competitive edge. Whether it is analysing the cost, quality and performance measurements of IP networking infrastructures, client-server and help desk support, or making cost-vs-quality comparisons between supporting fixed line, 3G and global m-banking services, the benchmarking parameters are potentially vast.

Benchmarking provides the analytical data upon which management and business consultants base their advice. Its aim is to measure an organisation's own operational methodologies, pricing structures, service levels, technology infrastructures and customer service levels, and compare these with the competition (peer group) and against best practice within the industry as a whole. Whether analysing IT, service quality or any other element of the business process, benchmarking has in the past been viewed as a somewhat mundane back-office activity. These days it is coming into the boardroom, as telecoms leaders realise that without these metrics it is hard to see where they stand in a fast-moving industry, or what they must do to stay ahead of the curve.

Generating a set of cost and performance metrics that provides the launching platform for transformational change is not always easily achieved from inside the organisation - this can be for several reasons. Stakeholders do not always feel incentivised to upset the status quo. Even where there is enterprise-wide buy-in (as in the majority of cases), it can still be difficult to achieve the objectivity needed to assess one's own strengths and weaknesses, or to obtain a 360 degree view of the operation.  It's like the blind men attempting to describe an elephant, each one focused on a different part of its body. To one, it's a tree trunk, to another a sail flapping in the wind, to the other a swinging rope: not one of them is able to perceive the complete entity.  In a similar way, a company looking to benchmark itself may see the wisdom of employing outside help to gain an impartial view of the company's situation.

It's frequently easier for an external consultant to sidestep a company's internal politics and enlist staff participation. Perhaps most importantly, consultants offer a unique level of access to comparative peer-group metrics on cost, productivity, service quality and so on, because they tend to work with many companies in the same industry. These specialists also have ‘insider' data on service pricing for local, near-shore and off-shore outsourcing providers, and a wealth of other independent market information resources. They often also act as intermediaries in the negotiation of service provider contracts, helping to clarify the deliverables and make cost structures more transparent.

Many established organisations - and telecoms providers are no exception - suffer from an accretion of legacy hardware, applications, databases, desktop and network systems glued together with complex links that need ever greater levels of maintenance to function. This situation is further compounded by a variety of disjointed workflow methodologies that impede a company's end-to-end efficiency. 

Identifying and benchmarking those load points in the system that are causing higher-than-necessary costs and reducing performance can result in significant savings, and lead to streamlined workflows that mean a more nimble service to customers. This is equally true whether a service is run in-house or outsourced. Often external provider costs are inflated because the provider is forced to support a client's overly complex systems. These costs are then passed on to the customer, often without the causes of the surcharge being clear. This fact alone can account for much misunderstanding between clients and their outsourcing partners.

Organisations with an eye to transformational change are beginning to take a broad-based view of benchmarking. Instead of viewing benchmarking as a one-off, crisis-driven expense, it is increasingly being implemented as a strategic tool for generating key business intelligence data. 

In this broader role, benchmarking moves beyond cost-only considerations to examine, among other things, the balance between a technology or service's cost and quality, or cost and performance. As anyone knows from the high street, the lowest cost does not necessarily represent the best value. The value of a particular system, whether it is a sales or finance system or a corporate tool such as e-mail, is arrived at by looking at the balance between running cost and service quality, complexity and productivity. Being intelligent and responsive to the future, and leveraging the disruptive technologies that are driving change, depends upon access to good business metrics. The more organisations get forensic - introducing IT cost and KPI measurement as part of their ‘good housekeeping' procedures, and getting into the habit of regularly comparing the quality of their service levels against best-practice models - the better placed the telcos of tomorrow will be to ‘benchmark their way to success'.

Paul Michaels is Director of Consulting at Metri Measurement Consulting, and can be contacted via paul.michaels@metri-mc.com

Benoit Reillier examines the economics of bandwidth  

The relationship between infrastructure investment and economic growth has been established by many studies. In our ‘knowledge based' economies, investment in new communications infrastructures, in particular, is seen as increasingly critical to long-term economic growth.

In much of the developed world, most of the fixed line telephony and internet services used today are based on old copper pairs that were designed for voice telephony in the early 1900s. While clever technological innovations (such as DSL technologies) have recently allowed operators to breathe new life into these old local loops, this is not quite the gift of immortality that some may have hoped for.

But replicating or upgrading the existing local loop with new, fibre based technology to facilitate the availability of high bandwidth services is a difficult and costly exercise.
And while some argue that demand may not yet exist for very high bandwidth, it is worth keeping in mind that Moore's law (which roughly states that the processing power of computer chips doubles every two years) has been holding remarkably true since the 1960s. There is no reason to assume that this trend will stop overnight, and therefore every reason to believe that tomorrow's digital cameras, TVs and computers will have a higher resolution than those of today. The processing power of our computers and the storage space required will also increase accordingly, and so will bandwidth requirements. If the infrastructures cannot cope, they will increasingly represent a bottleneck.
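
The compounding implied here is easy to underestimate. As a back-of-the-envelope illustration (the 10 Mbit/s starting point, and the assumption that bandwidth demand tracks a Moore's-law doubling, are our own working assumptions rather than figures from the studies cited), a short Python sketch:

```python
# Illustrative only: compound growth from a Moore's-law-style doubling.
# Doubling every two years is equivalent to roughly 41% growth per year.

def projected_capacity(base: float, years: int, doubling_period: float = 2.0) -> float:
    """Capacity needed after `years`, if demand doubles every `doubling_period` years."""
    return base * 2 ** (years / doubling_period)

# If a household needs 10 Mbit/s today and demand keeps doubling every
# two years, the implied requirement grows more than 30-fold in a decade:
for years in (2, 6, 10):
    print(f"{years:>2} years: {projected_capacity(10, years):.0f} Mbit/s")
```

Ten years of such growth turns 10 Mbit/s into 320 Mbit/s - comfortably beyond what a long copper loop can deliver, which is the bottleneck argument in a nutshell.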

Operators are therefore increasingly considering the roll-out of so-called Next Generation Networks (NGNs). Given that copper does not carry signals well over long distances, there is a direct relationship between how much copper is used and the speed and quality of services provided over it. The key question, therefore, is: how close to our homes will these new networks be rolled out?

Many operators talk about NGN investment in the context of using fibre optics in the core of their network. While helpful overall, from an economic viewpoint this is not the kind of transformational investment plan that would drastically enhance the experience of users, as it would leave the ancient copper loop infrastructure intact - and not getting any younger. Some talk about Next Generation Access (NGA) networks, and these too can have different characteristics that may or may not provide users with the full next generation experience. Fibre to the Home (FTTH) is more costly than the alternative Fibre to the Cabinet (or Curb) solution; however, it offers higher speeds and, perhaps more importantly, greater reliability, as the network is no longer dependent on a copper line to the home. In the US, Verizon is rolling out FTTH to around 19 million households, while other US telcos are following an FTTC strategy. It is not yet clear which strategy will be the most effective. In both cases, NGA requires very significant investment that operators (and their shareholders) are often reluctant to commit to in light of the uncertainties associated with the financial returns available.

The New Regulatory Framework being negotiated in Brussels at the moment will have a significant impact on many of the underlying economic drivers that operators are considering, and it is likely that large-scale NGA investment plans will be somewhat delayed in a number of countries, at least until more regulatory visibility is provided, probably at the end of the year. The new framework also requires regulators to focus on how regulation impacts investment decisions, in the short and long term, rather than simply transferring wealth from suppliers to consumers.
Of course other technologies, often wireless based, like the much-hyped WiMAX standard, are possible substitutes for local loop investment. Users are, in fact, technology agnostic and couldn't care less about the underlying delivery mechanisms as long as the quality, reliability and features expected are made available at a reasonable price. One thing is sure though: our century-old copper pairs are unlikely to provide the communications capabilities that our economies' future growth will require.

Benoit Reillier is a Director and European head of the telecommunications and media practice of global economics advisory firm LECG.  He can be contacted via: breillier@lecg.com
The views expressed in this column are his own.



A new services development paradigm is driving the communications industry says John Janowiak

What exactly is Web 2.0?  How relevant is it to the service provider business model going forward? The short answer: very relevant.  Carriers who remain focused on traditional voice services and video are missing the larger transformational drift of the communications industry. It's no longer just about service providers inventing services and then selling them to customers; it's about platforms on which customers share communications and entertainment experiences with one another, building ever-larger communities of friends, colleagues, and customers.

If anything characterizes the Web 2.0 world - and, by extension, the new soft service provider world - it is openness. In order to interact richly with colleagues, friends, customers, and business partners, end users are pushing a model in which 1) they have a hand in developing and defining the services they themselves want, and 2) interacting with the network itself is easy and efficient. This is the end game of the network-as-software model: one in which software and applications live on the network, are accessed by the network, and indeed are created via the open-access network.

For example, at SOFNET 08 - a new conference produced by the International Engineering Consortium in April - Microsoft will discuss its Connected Services Sandbox as an example of this new paradigm. Through Sandbox, operators can open their networks to next-generation Web 2.0 applications that can be mashed together with traditional services to create new connected services. The goal is to facilitate the rapid development and market deployment of new service offerings, creating new opportunities for participants and delivering new options for consumers.

"In the new soft service provider environment, operators will be able to offer hundreds, if not thousands, of new services that enable them to target specific customer segments, reduce ‘churn' and drive new revenues," says Michael O'Hara, general manager for the Communications Sector at Microsoft. "By embracing the principles of Web 2.0 and leveraging the significant customer relationships and assets they already have in place, operators have the opportunity to redefine the models for doing business."

Matt Bross, Group CTO for BT, in an interview recently with Light Reading, noted: "The innovation genie is out of the bottle.  We need to do more mash-ups, and we need to connect together for innovation. There are major innovation possibilities by opening up collaboration opportunities. We're moving towards a real-time global innovation model, and moving from a closed to an open model. It's a big challenge."

Getting a handle on these mash-ups (that is, creating a new service by putting together two existing ones) as well as opening the network to third-party innovators, is the course forward according to Bross, who will serve as overall conference chair at SOFNET 08.
"We need to change our mindsets and focus on how we can enhance the quality of people's lives and how they do business," he said. "We need to innovate at the speed of life."

John R. Janowiak, President, International Engineering Consortium

SOFNET 08 runs from April 28th to May 1st 2008, at the Olympia National Hall, London.

The increasing complexity of service provision is creating new revenue leakage risks says Adam Boone

New services, innovative service bundling, and emerging content-distribution business models open the door to a host of potential new risks for the typical telecommunications service provider.  Increasingly complex new content partnerships and revenue sharing arrangements create potential for new forms of revenue leakage.  Smarter end-user devices, content-based services and converged network environments create new potential for fraud. 

In short, the new world of next-generation services means new risks, and these emerging problems are foremost in the mind of service providers around the world as they seek to roll out new service offerings and adopt more competitive business practices.
In mid 2007, telecoms industry research specialists Analysys undertook the fifth annual global survey of service providers' attitudes to revenue management, touching on topics like fraud, revenue assurance, and other sources of revenue leakage.   The survey, which was underwritten by Subex Limited, identified a continued increase in revenue leakage across the globe and provided insight into the main causes.  The study examined regional differences in revenue leakage and the approaches to combat it, showing some intriguing differences in how European operators address the problem when compared with operators elsewhere.

Globally, Analysys reported, the overall average level of revenue leakage from all causes stood at 13.6 per cent of total revenues. The Middle East/Africa region experienced revenue losses of more than 20 per cent, with Asia close behind at just below 20 per cent and Central and Latin America at more than 15 per cent. Western Europe ranked lowest of all regions at about 7 per cent, followed by Central and Eastern Europe at 8 per cent, with North America close to the 13 per cent average.

When breaking losses down by operator type, mobile operators continue to lose the most, at nearly 14 per cent, while by size it is mid-sized operators - those with between 100,000 and one million subscribers - that rack up the heaviest losses, at more than 18 per cent. For comparison, the largest operators are losing only 6 per cent of revenue per year.
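
To put these percentages into absolute terms, a rough sketch (the revenue figure is hypothetical; the loss rates are the survey's regional averages as quoted above):

```python
# Regional leakage rates as reported in the Analysys survey (approximate).
leakage_rates = {
    "Middle East/Africa": 0.20,
    "Asia": 0.195,
    "Central/Latin America": 0.15,
    "North America": 0.13,
    "Central/Eastern Europe": 0.08,
    "Western Europe": 0.07,
}

def annual_loss(revenue: float, rate: float) -> float:
    """Absolute annual leakage for a given revenue and loss rate."""
    return revenue * rate

# A hypothetical operator with 500m in annual revenue, leaking at the
# 13.6% global average, is losing around 68m a year:
print(f"{annual_loss(500, 0.136):.1f}m")
```

Even Western Europe's comparatively low 7 per cent would cost the same hypothetical operator 35m a year, which is why revenue assurance increasingly attracts board-level attention.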

Significant growth in fraud losses and revenue assurance problems related to the launching of new products and pricing has driven the overall increase in losses.  In addition to fraud, the three primary sources of revenue leakage cited by respondents are poor processes and procedures, poor systems integration, and problems associated with applying new products and pricing schemes.

The level of revenue loss that operators find ‘acceptable' has risen this year to 1.8 per cent, from 1.1 per cent in 2006. This is the largest single increase since the survey was started five years ago. Major incumbents were least tolerant (1.2 per cent) and fixed line alternate operators most tolerant (2 per cent). Operators in Central and Latin America reported the highest level of ‘acceptable' loss at 2 per cent, with operators in the Middle East and Africa accepting lower levels of loss (1.4 per cent) than the average.

At the planning stage for new products, most operators take into account most causes of loss as part of their preparation for new service launch.  However, 32 per cent of operators do not use any third party help to address revenue leakage issues. Yet the findings show that the operators who use third-party solution providers for revenue assurance lose 30 per cent less compared to those who use no external help.

The survey found that managers responsible for revenue assurance and fraud management feel a great deal of uncertainty as they look to the future and consider next-generation networks and converged services.

The findings showed that dramatically more revenue assurance and fraud managers are concerned about the impact of next-generation networks and services on revenue management than in previous years. In fact, around half of the survey respondents reported that addressing revenue management issues for these new technologies will be a chief concern in the next three years. Much of the anxiety may stem from the unknown. New products like IPTV have yet to reach mass-market subscriber penetration, and unanticipated revenue leakage issues may emerge as these products reach peak subscriber growth and market uptake.

The new converged, IP network represents new opportunities for fraud, especially as these services incorporate content that must be delivered and billed for across a converged infrastructure. Further, end user devices are increasingly intelligent, opening the door for new hacking techniques and mobile malware. These are significantly different from the risks associated with a traditional fixed-line telephone network and therefore present a higher degree of vulnerability for operators.  This risk is heightened as operators are under growing pressure to deliver these new services rapidly in order to stay ahead of the competition.  Compounding these challenges, the competitive environment facing most operators means they must achieve faster time-to-market for new offerings and shorter product lifecycles.  As a result, billing processes must be able to adapt to this accelerated pace of change, or what has been described as the need for greater service agility and operational dexterity.
Another area to take into consideration is the implication of delivering new content-based services that involve third parties. In the Analysys research, 30 per cent of respondents cited interconnect/partner payment errors as one of the main causes of revenue leakage across the business. If an operator intends to offer content-based services, it may no longer be responsible solely for a service's connectivity, but also for the delivered content, whether provided on its own platform or by a third party. This creates additional complexity at the accounting stage due to payment handling, tariff management and revenue sharing. With more parties involved in the delivery of the service to the end user - from the operator to the content owner and the content host - there is potentially greater opportunity for interconnect or invoicing system errors, which need to be assessed.

An emerging key strategy for addressing many facets of this transformation, and for maximising the benefit of revenue management efforts, is to establish a Revenue Operations Centre (ROC): a consolidated collection of systems that monitors the health of the revenue chain and the impact on costs. Just as a Network Operations Centre (NOC) enables the tracking of service quality and network health, so the ROC is a centralised monitoring and control infrastructure that integrates an operator's individual revenue assurance, fraud management, cost management and risk management solutions to better monitor revenues and costs. This end-to-end approach takes into consideration all the processes involved in delivering the service to the subscriber, and helps the service provider to understand the impact of operational processes and outcomes on profit.

A ROC allows operators to monitor financial performance (eg total revenue, ARPU, subscriber growth), revenue performance (eg revenue/cost by category, revenue/fraud loss) and operational performance (eg revenue/fraud/bad debt loss by root cause) across their networks. It also enables an operator to track the costs associated with delivering services, and to arrive at an understanding of the profitability of different service types, subscribers, market segments and other relevant business metrics.
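
At its core, much of this monitoring reduces to reconciliation: comparing the revenue implied by mediated usage records against the revenue actually billed, and flagging any gap above a tolerance. The sketch below illustrates that idea only; all names, figures and the 2 per cent threshold are illustrative assumptions, not a description of any vendor's product:

```python
from dataclasses import dataclass

@dataclass
class RevenueSnapshot:
    service: str
    expected: float  # revenue implied by mediated usage records
    billed: float    # revenue actually invoiced

def leakage_alerts(snapshots, threshold=0.02):
    """Return (service, leakage ratio) pairs where the gap exceeds the threshold."""
    alerts = []
    for s in snapshots:
        gap = (s.expected - s.billed) / s.expected
        if gap > threshold:
            alerts.append((s.service, round(gap, 3)))
    return alerts

snapshots = [
    RevenueSnapshot("voice", expected=1_000_000, billed=990_000),  # 1% gap: tolerated
    RevenueSnapshot("content", expected=200_000, billed=180_000),  # 10% gap: flagged
]
print(leakage_alerts(snapshots))  # [('content', 0.1)]
```

A production ROC would of course correlate far more sources (fraud scores, interconnect settlements, cost data), but the root-cause drill-down it offers is essentially this comparison applied end to end across the revenue chain.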

For operators offering next-generation wireless and wire-line services, implementing an end-to-end approach to monitoring and protecting revenues and managing down costs will become an even greater requirement, as services become more complex.  An approach like a ROC enables operators to compare and consolidate information from across network, operations and business systems to monitor revenue chain integrity, detect cost overruns and, hence, achieve sustainable profitability.

Adam Boone is VP Strategic Marketing, Subex Limited



Jonathan Bell explains how operators can make ageing Intelligent Network infrastructure flexible for both the markets of today and of tomorrow

All industries are future-focused and the telecoms industry is no exception. Next generation services such as IPTV and mobile VoIP have been the talk of the industry for a number of years. However, it is important that service providers do not get distracted from their core revenue drivers. In Western Europe these are undoubtedly voice and messaging. In 2007, 95 per cent of mobile telephony revenues (€29.5 of a total ARPU of €30.4) came from person-to-person (P2P) voice and messaging. Although most analysts predict that messaging and voice revenues will decrease, the latest estimates predict that revenue share will stay above 80 per cent for the next five years, still representing substantial revenues for the operator.
In order to make the most of revenue opportunities, a major focus of the industry should be on the best ways to innovate within existing person-to-person communication services. Introducing a well-targeted variation of an existing service generally leads to much higher acceptance and adoption than launching completely new, unproven services.

However, this innovation is only happening to a limited extent. Despite the technology being available to enhance these services, the telecoms model that has stood for the last 20 years has not really changed. Globally, operators are still providing standardised, homogenised, utilitarian voice and messaging services, effectively providing access and connectivity only.
With increased competition and regulatory changes - particularly in roaming - telephony prices are falling and margins are shrinking. The spectre of telecoms companies becoming bit-pipe providers in a commodity market is already here. If operators do not take action now, established markets will begin to slip away and the opportunity to develop an existing, semi-captive market will be missed.

There is a great deal of opportunity for operators to extend their services beyond the same limited voice and messaging product set, and focusing innovation on these services makes sound business sense. Value-added features such as presence or visual voicemail extend already popular services and do not require the conceptual leap that something like mobile TV does, making them easier to sell.

But if the technology is available and the market is ready, why have operators not explored these avenues more comprehensively? The reason lies in the construction of the Intelligent Network (IN) platforms that mobile networks use.

The IN authorises and controls connectivity, metering and charging for calls and data sessions.  This is an exacting and complex task requiring low latency as the IN sits in the call signalling path.  In order to achieve this with exceptional reliability and with an enormous volume of concurrent activities, IN platforms were engineered - 10 to 20 years ago - as tightly integrated stacks of software and hardware.  For the same reasons, the telecommunications services that they host are streamlined, so that they are relatively simple and standardised, utilitarian services.

Today's IN platforms were conceived and designed in a different era, one where there were significantly fewer telecommunications services and networking technologies.  Each line of IN code has to be crafted and the service logic and interdependencies tested by a small number of highly skilled IN engineering staff.  Software engineering has also moved on in the interim.  Modern software engineering approaches design systems as separate, horizontal layers that provide services to the other layers, which are decoupled from each other.  This helps to provide a "safe" runtime environment that isolates the behaviour of the application code from the platform and other applications or services. 

IN platforms are expensive in themselves and represent a substantial Opex investment for the operator. Excessive ‘spare' capacity is therefore undesirable - it translates into dead weight in terms of assets - and most operators organise their IN capacity to limit this. The result is that many operators are working at full capacity, further limiting their ability to roll out new services.

Further exacerbating the situation is the evolutionary rather than revolutionary approach that operators have taken to their networks.  As new technologies have become available, mobile telephony services have been added to and modified, resulting in a disparate array of equipment and IN platforms that do not readily communicate with each other.
As a consequence, the level of expertise needed to adapt the IN is very high and such engineering is expensive. This acts as a real barrier to service innovation - operators need to be almost 100 per cent certain that a new service will deliver before they can even embark upon a trial, a level of certainty which is rarely possible and means that valuable opportunities are often missed. Indeed, the high cost of service creation and the long lead times associated with getting a service developed often invalidate the tentative business case for new services before they can be explored and trialled.

The IMS (IP multi-media subsystem) architecture, the future blueprint of mobile networks, is expected to solve a lot of these issues. The network will have an all-IP core, which is intended to reduce costs and enable the creation of new services on a uniform platform. In particular, the IMS Session Initiation Protocol (SIP) Application Server concept is designed to facilitate service innovation and eliminate the rigidities of today's IN platforms. 
However, IMS is a costly infrastructure investment and rollout, so operators are reluctant to make a full migration without immediate service requirements. Today, nearly all mobile subscribers are on the SS7 circuit-switched TDM network and all the applications and services they value are based upon this technology.  Making the business case for service innovation based on a strategy of IMS rollout and mass subscriber migration to IMS is extremely challenging.

The problem remains that the core business areas of person-to-person communication - chiefly voice and messaging currently - are under fierce price pressure, and all operators are providing their customers with the same, standardised, utilitarian services. If everyone is selling the same thing, then the only way you can differentiate is on price or customer service. Premier customer service and low price are in direct conflict.
There is therefore a strong case to be made for service innovation in core person-to-person communications: targeted services designed to meet the specific needs of a segmented customer base, rather than "one-size-fits-all, pile them high, sell them cheap".
At the same time of course, operators need to progress towards the long-term goal of creating a core mobile network based on the 3GPP IMS architecture. The modern operator is stuck between a rock and a hard place. Innovating existing services is a painful process and abandoning their existing infrastructure would be too great a loss of assets.
There is a way through the woods, however. Just as IN was added to augment the telecommunications switch, allowing extra capabilities to be added to the network without requiring significant changes to the switch, so it is possible to augment the capabilities of the IN platform on which telecoms operators are so dependent, and which ironically is also at the heart of their inability to innovate in their core services. IN augmentation, rather than replacement, maintains the IN's benefits while overcoming its inflexibilities.

Chaining in the service layer enables the operator to configure subscriber-specific service logic for session or call routing and new service integration, effectively providing the flexibility promised by IMS, but on today's TDM networks.  IN augmentation results in fast and cost-effective service introduction, providing the ability to launch, refine, enhance and (if appropriate) withdraw services and variations, including integration with an online charging system for real-time pricing.
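The service-chaining idea can be illustrated with a minimal sketch. The trigger logic, service names and data structures below are hypothetical, not drawn from any real IN or IMS product; the point is only that per-subscriber chains of service logic can be applied to a call in sequence.

```python
# Minimal sketch of subscriber-specific service-logic chaining for call
# routing. All names and triggers here are illustrative, not a real API.

def prepay_check(session):
    # Illustrative trigger: reject the call when the balance is exhausted.
    if session["balance"] <= 0:
        session["action"] = "reject"
    return session

def vpn_short_dial(session):
    # Illustrative trigger: expand a corporate short code to a full number.
    short_codes = {"1234": "+442071234567"}
    session["called"] = short_codes.get(session["called"], session["called"])
    return session

def run_chain(session, chain):
    # Apply each service in order; stop as soon as one rejects the call.
    for service in chain:
        session = service(session)
        if session.get("action") == "reject":
            break
    return session

# Each subscriber can be configured with a different chain of service logic.
subscriber_chains = {
    "alice": [prepay_check, vpn_short_dial],
    "bob": [vpn_short_dial],
}

call = {"caller": "alice", "called": "1234", "balance": 5, "action": "route"}
result = run_chain(call, subscriber_chains["alice"])
print(result["called"], result["action"])  # +442071234567 route
```

New services or variations can then be launched or withdrawn by editing a subscriber's chain, without touching the underlying switch logic.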

The inter-network multi-protocol gateway capabilities bring further benefits.  By providing cross-network access, operators can migrate subscribers to IMS without the need to have a full service portfolio available in the IMS domain.  An additional benefit is that the vast majority of their subscribers can access the new services provided on the IMS network.
Operators need to respond to market pressures now, using the strength of their two highest revenue-generating services (voice and messaging) to uncover new revenue generating opportunities. This will enable them to differentiate now, providing services that are sticky and reduce subscriber churn, and to charge a premium for services that meet the specific needs of individual customers.  For an industry that is characterised by long term investment and comparatively slow ROI, IN augmentation holds a great deal of appeal. With promiscuous customers and competition from outside the sector, can operators really afford not to explore this?

Jonathan Bell is VP Product Marketing for OpenCloud

As communications, media and entertainment services converge and competition increases, the billing system is pivotal in determining operators' ability to embrace or adapt to the potential of next generation business models, says Wolfgang Kroh

The evolution of the telecommunications industry has reached a crossroads. Emerging technologies and radical new business models have the potential to cause a fundamental change in direction for the industry. Now, more than ever before, communications providers look to billing systems vendors to equip them with the tools to manage this uncertainty and provide the agility required to meet the demands of a rapidly changing business landscape.
In fixed line markets, VoIP services are making huge dents into the subscriber bases of the incumbent operators. In 2006, European fixed line operators lost over 10 million subscribers. At the same time new VoIP entrants attracted over 14 million new subscriptions, with just 3 million of those being won by traditional fixed line operators, attempting to win back part of the VoIP market share.

In many mobile markets, deregulation and competition has driven down call rates and the proliferation of MVNOs, targeting niche market segments with highly competitive lifestyle, brand or language based offerings is leading to a near-commoditisation of mobile voice services. Furthermore Wi-Fi and Wi-Max technologies have the potential to make a serious impact on traditional mobile services.

Increasing competition and pressure on traditional business models are not phenomena solely facing mature markets. Emerging markets such as South Asia, the Middle East and Africa are growing at a rapid pace. In 2007, mobile subscriber numbers in India grew by 7 million per month, and the recently launched Etisalat Egypt built a subscriber base of 3.5 million in just 10 months. Whilst such markets have comparatively low levels of mobile penetration, competition is intensifying, and new operators are already deploying state-of-the-art convergent billing architectures and launching innovative value-added content services in order to differentiate themselves from existing providers and the increasing competition.

Whilst there may be uncertainty over the direction in which the communications industry is heading, it is clear that the current climate of increased competition, market penetration and emerging technologies is giving rise to innovation, in terms of new services, new applications but perhaps more significantly, new business models and new participants in the value chain.

The challenge now facing communications providers is to ensure that they are as market responsive as possible. This requires high levels of business flexibility to rapidly deploy innovative value-added services and applications, and work with a variety of new business partners under non-traditional business conditions. The billing system is therefore pivotal in determining their ability to either embrace or adapt to the potential of these next generation business models.

Communications providers, and in particular mobile operators, threatened with becoming mere ‘bit pipes’, have been keen to acquire compelling content to offer the value added services that, as little as two years ago, were viewed merely as marketing or customer retention tools, but are now considered by many operators as strategic differentiators and, moreover, critical revenue generators. In 2006 the global mobile content market was valued at around $89 billion but, with increased cooperation with the entertainment and media industries and the increased speed of HSDPA, this is forecast to exceed $150 billion by 2011.
However, acquiring content is not necessarily a simple linear transaction. For example, music content distributed over mobile networks requires the licensing of multiple rights, including the right to copy and transmit both the musical composition and the sound recording. Depending on the country, an operator may therefore have to work with multiple partners, (including copyright institutions offering standard, non-negotiable licensing schemes), each requiring reporting, settlements and payment.

The communication provider’s billing systems and processes must therefore have the adaptability to support these potentially complex, non-negotiable partner licence agreements, whether they are based on revenue share, rate per use or combinations of these payment models and their associated reporting requirements.  A particular challenge can be the obligation to a rights owner to calculate the correct proportion of advertising and sponsorship revenues that may have been sold across the communication provider’s entire portal.
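As a rough illustration of how such mixed licence terms translate into a settlement figure, here is a minimal sketch. The revenue-share percentage, per-use rate and revenue numbers are invented for the example; real agreements would add reporting and currency handling.

```python
# Toy partner settlement mixing revenue-share and rate-per-use licence
# terms. All rates and figures below are invented for illustration.

def settle(events, revenue_share, rate_per_use, partner_revenue):
    # Pay the rights owner a share of attributable revenue, plus a
    # per-use fee for each licensed download or transmission.
    return round(partner_revenue * revenue_share
                 + len(events) * rate_per_use, 2)

downloads = ["track-1", "track-2", "track-3"]
owed = settle(downloads, revenue_share=0.30, rate_per_use=0.05,
              partner_revenue=100.0)
print(owed)  # 30.15
```

A billing system would run a calculation like this per rights owner, per reporting period, alongside the associated settlement reports.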

Music is an example of the many new services being introduced involving multiple parties in the value chain. In order to promote mobile music services, real-time charging and balance management become increasingly important to the revenue management process. With the diversification of services, many subscribers are still unfamiliar with new content-based services and require reassurance over costs. Billing systems must therefore support real-time charging and balance management to enable real-time advice-of-charge messages to ensure that subscribers are comfortable in using the service. At the same time, they must allow the communications provider to ensure the credit-worthiness of the subscriber with real-time balance authorisation and reservation capabilities. Bad debt resulting from high levels of content consumption, including music, carries the added exposure to third party content licence costs in addition to lost service revenue.
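The authorise/reserve/charge pattern described above can be sketched as follows. The class and method names are hypothetical; the point is the flow of advice of charge, reservation against available balance, and commit of the actual amount.

```python
# Sketch of the reserve-then-commit pattern behind real-time balance
# management. Names and amounts are illustrative only.

class BalanceManager:
    def __init__(self, balance):
        self.balance = balance   # available funds
        self.reserved = {}       # session_id -> reserved amount

    def advise_of_charge(self, price):
        # Real-time advice of charge shown before the purchase completes.
        return f"This download will cost {price:.2f}. Reply YES to confirm."

    def reserve(self, session_id, amount):
        # Authorise only if the unreserved balance covers the amount,
        # protecting against bad debt and third-party licence exposure.
        if self.balance - sum(self.reserved.values()) < amount:
            return False
        self.reserved[session_id] = amount
        return True

    def commit(self, session_id, actual):
        # Charge the amount actually consumed and free the reservation.
        reserved = self.reserved.pop(session_id)
        self.balance -= min(actual, reserved)

mgr = BalanceManager(balance=10.0)
print(mgr.advise_of_charge(2.5))
if mgr.reserve("s1", 2.5):
    mgr.commit("s1", 2.5)
print(mgr.balance)  # 7.5
```

The reservation step is what lets the provider confirm credit-worthiness before content (and its licence cost) is actually delivered.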

Multi-play services are increasingly being deployed as a means to differentiate and gain market share. Traditionally this has been a strategy of the broadband cable
operators aggressively moving into the telecommunications space. However it is now also being deployed by new market entrants, seeking to rapidly gain market share in highly saturated markets, who have the advantage of state-of-the art billing architectures rather than the ‘siloed’ legacy systems often operated by incumbent providers.
One such example is EITC, in the UAE, operating under the brand name ‘du’. At the time of launch, in February 2007, mobile penetration in the UAE already exceeded 120 per cent but after just 10 months of operations du had attracted over 1.9 million subscribers, accounting for some 30 per cent of market share.

Central to du’s strategy to enter the UAE market was the premise of offering subscribers simplicity and convenience. With du’s ‘At Home’ package, subscribers were offered innovative voice, data, video and content packages that are not only simple to use but also easy to purchase, pay for and obtain support for.

du’s state-of-the-art, ‘any-play’ billing architecture was fundamental to this strategy. It enabled all network technologies and all services to be supported within one system. As a result, du is able to offer a triple-play package of fixed-line, Internet and pay-TV, with all elements of the service covered by a single monthly bill. In addition, the system facilitates fully integrated customer care, whereby a subscriber can receive support for all services through a single point.

The convergent capabilities of du’s billing system were also a key enabler of the highly targeted cross-service campaigns and promotions that were another key feature of its aggressive launch into the UAE market. They enabled du to offer compelling, tailored packages and marketing promotions to targeted segments of its subscriber base, across a range of communications media, in both the consumer and business sectors. One such promotion is du’s ‘Free Time’, a cross-service promotion whereby subscribers earn credits for every second of every international call made.  The credit accumulates and is displayed on the bill each month. It can then be redeemed against any kind of usage, monthly fees or value-added services.
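The mechanics of a per-second accrual like ‘Free Time’ can be sketched in a few lines. The one-credit-per-second rate and the call records below are invented for illustration; du's actual rates are not public in the text.

```python
# Toy per-second credit accrual for a cross-service promotion.
# The rate and the call records are invented for illustration.

CREDIT_PER_SECOND = 1  # hypothetical: one credit earned per second

def accrue_credits(call_records):
    # Only international calls earn credits, one per second of talk time.
    seconds = sum(c["duration_s"] for c in call_records
                  if c["international"])
    return seconds * CREDIT_PER_SECOND

calls = [
    {"duration_s": 120, "international": True},
    {"duration_s": 300, "international": False},  # domestic: no credits
    {"duration_s": 60, "international": True},
]

credit = accrue_credits(calls)
print(credit)  # 180 credits, shown on the bill and redeemable later
```

The convergent-billing point is that the accrued figure can then be redeemed against any service (usage, monthly fees or VAS), because all of them live in the same system.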

Perhaps the greatest potential change to the mobile marketplace is the arrival of advertising-supported services. 2007 saw the launch of Blyk, the UK’s first advertising-based mobile service provider, offering subscribers a certain number of free calls and texts in exchange for their agreeing to receive targeted advertising messages on their mobile phones. Historically, mobile operators have been able to charge a premium on mobile call rates for the intrinsic value of mobility; however, with the advent of what are, at least in part, free mobile services, it could be that we are witnessing the beginning of a paradigm shift in the mobile communications business model.

In addition, with growing interest in mobile IPTV, but with no clear business models emerging, it seems that advertising supported services are likely to play a major part in the evolution of telecommunications, and in particular, mobile business models. Mobile operators could soon find themselves competing with providers who are offering equivalent services together with compelling content, free of charge.

Whilst advertisement supported mobile service business models are still emerging, it is clear they will play a major role in the development of the telecommunication business landscape. It is also another example of the business uncertainty that drives the billing system requirements of today’s communication provider.
With the uncertainty over the direction of the communications industry, providers are facing some difficult decisions: Which technologies to embrace? Which business models to adopt? Which partners to work with?

However, what is clear is the necessity to invest in a multi-technology-ready billing system that provides the convergent, business-adaptive billing environment to support sophisticated charging scenarios, including advertising rate plans and complex partner settlements. This should also include an open, service-oriented architecture, providing the ability to upgrade easily and quickly in line with new technologies such as IMS, rapidly launch services and integrate third-party applications.
Only those players that have made the necessary preparations to their billing environment and have geared up for innovation will be best placed to maximise the potential of these next generation business models.

Wolfgang Kroh is CEO at LHS and can be contacted via info@lhsgroup.com www.lhsgroup.com

With the latest buzzword ‘transformation' ringing in everybody's ears, European Communications takes a look at what will be on offer at the TM Forum's Management World

The great and the good (and sundry others) from the OSS/BSS world will be descending en masse on Nice again this May, to learn, debate, observe and participate in the on-going,
fast-paced, and - some might say - disturbing developments that are affecting their everyday working lives.

The TM Forum's Management World - re-titled to reflect the organisation's expansion from a purely telecoms brief into the broader (and more complicated) world of communications, information, entertainment and media - runs from May 18th - 22nd at the Acropolis Convention Centre.  With these separate but increasingly converging industries still going through what Martin Creaner, TM Forum's President and Chief Technical Officer, describes as "a total sea change", the Forum's role in bringing together the movers and shakers from - staying with the sea analogy - the octopus' various legs, is a crucial element in the transformation which many players are now having to undergo. 

Sometimes viewed - it might be said unfairly, given the dull and bureaucratic implications - as essentially a standards organisation, the TM Forum aspires to be, and often is, much more than that.  This is in no small part due to the drive and vision of its CEO, Keith Willetts, who was proselytising the theories of lean and agile corporations when telecoms, as a whole, was still trying to dislodge its boots from the sticky monopoly mud.

Willetts is still banging the drum for optimizing business processes and automating them end-to-end through integrated systems, noting that while it is certainly important to be highly efficient, fast to market and delivering great services, it is no longer enough.  The industry buzzword now is ‘transformation', which Willetts describes as being as much about acquiring new skills and competencies, as it is about putting new kit into central office buildings.  "It's about changing the way companies think and act, as much as it is about new service ideas," he says. "The watchwords are: innovation, partnering, exploiting assets, and taking risks."

Management World in Nice is intended to reflect all these aspects, and give those attending the opportunity to share information as they navigate transformation.  The areas to be tackled at the event, therefore, include ‘Business Transformation Strategies', which looks at the proposition that service providers are continually adapting to market dynamics through transformation strategies - and that only by investing in technology for managing the service lifecycle will they compete in the 21st century; and ‘Technology Transformation Strategies', which will argue that the right use of technology, coupled with effectively managed systems migration, will enhance both operations and business support systems.  ‘Business Enablers and Managing the Content Lifecycle', meanwhile, discusses the fact that systems and processes that worked for more traditional telecom services may no longer be up to the challenge of delivering content-based services.  The next generation of services will be more sophisticated - delivering a mix of content and media to a diverse and increasingly mobile subscriber base.

Other conference sessions will include SOA for Next Generation Operations; the TM Forum's Prosspero and NGOSS in the Real World; Strategies for Optimising the Customer Experience; Revenue Management and Assurance; and Delivering Innovative Services to Cable Customers.  A specific Focus on China will look at the proposition that understanding what it takes to succeed in China as a service provider, integrator or vendor can be difficult in such a large and high-stakes market - but that the cost of failing to enter that market could be significant.  Speakers from China Mobile, China Telecom, China Unicom and Guoxin Lucent will bring to the session their experience and knowledge of living and working in the region.
Reflecting the industry-breadth of its 650-plus members, TM Forum is fielding a number of keynote speakers from different parts of the converging communications industry, including Sol Trujillo, CEO, Telstra; Alan Bell, Executive Vice President and Chief Technology Officer, Paramount Pictures; Paul Reynolds, CEO, Telecom New Zealand; and Stephan Scholz, CTO, Nokia Siemens Networks.

The ever-popular Catalyst Showcases will also, of course, feature at the event. The Catalyst program directly supports the TM Forum's objectives to provide practical solutions, in order to improve the management and operation of information and communication services.  It also aims to provide an environment where service providers can pose real-world challenges and directly influence the system integrators, hardware, software, and middleware providers to define, develop, and demonstrate solutions. Projects within the program are delivered in a very short timeframe, typically six to nine months, and the results are presented at the Showcase during the event.  This year's crop includes, among the ten featured showcases: End-to-End B/OSS Framework; Delivering an Industry Information Infrastructure; Zero Touch Deployment; and Operator User Management.

Alongside the Expo, where vendors, equipment manufacturers, system integrators and service providers get the chance to show their wares, check each other out, and compare notes, the networking events at Management World always prove to be a considerable draw.  As well as the expo cocktail reception, and the networking event party, this year will also see the second Excellence Awards Ceremony and gala dinner, where awards covering such areas as Best New Management Product; Most Innovative Integrated Marketing Campaign; and Best Practices - Service Supplier will be handed out to the winners, among the glitz and glamour of the Palais de la Mediterranee. 

Management World 2008, 18th - 22nd May, The Acropolis Convention Centre, Nice.

While service providers are still tempted to use price as a competitive weapon, Tony Amato argues that they should be investing capital in enhanced revenue generating VAS applications - and ensuring that they are tested across every element of the network

"I want to be able to download my music and also be online with my friends anywhere-anytime, among other things, and I need all of this very economically", asserts the technology-savvy end-customer. The underlying message translates simply to a convenience-at-your-fingertips idiom with a pocket-friendly ulterior motive. If you are a service provider or a network operator, you might already be overwhelmed with such a paradoxical sentiment. On one hand lies the enormity of improving average revenue per user (arpu) and profitability each successive ‘accounting period'. On the other is the seemingly perilous choice of selection and rollout of new Value Added Services (VAS) that appeal to the imagination of the masses. Now consider the following stark realities:

  • Fixed-line revenues are dwindling owing to an increased fixed-to-mobile substitution
  • In geographies that are witnessing positive subscriber growth, arpu figures are either flat or seem headed downwards, although profitability figures have shown improvement in some cases
  • In highly saturated markets arpu growth has shown a direct impact on profitability
  • New and innovative VAS such as multimedia messaging, presence, gaming, mobile commerce (m-Commerce), mobile office and location-based services are starting to contribute significantly to the data component of total arpu (i.e. voice + data)
  • The data component of total arpu is growing, but not fast enough to offset the decline in the voice component of total arpu

Market studies have established VAS to be the prime driver for arpu growth. From an average current revenue share of 8-12 per cent worldwide, VAS implementations are poised to account for at least 15-20 per cent of the top lines of service providers in the next couple of years. However, VAS implementations have to be operationally supported by the deployed network and systems. For them to be operationally efficient, service providers and operators are realising the importance of effective utilisation of their network infrastructure. The need of the hour is for new service rollouts to provide ample revenue improving opportunities, while also dealing with a shortened time-to-market cycle. For them to succeed, all innovative marketing techniques for the new services have to be pillared on a solid foundation of a harmonised network configuration.

Service providers (both fixed and wireless) deciding to offer bundled/converged (used inter-changeably with VAS) services often find it difficult to deal with intricacies at key stages of the product lifecycle. At the very outset, the high-level business consulting process needs to be focused on assessing the existing operations and management systems to discover potential gaps and recommend solutions (in the order of priority). This forms an inherent part of a future-state transition plan that has strategic as well as tactical ramifications. The overriding motive is to use Business Process Modelling (BPM) to evolve to an operationally efficient state that delivers optimal resource utilisation, improves productivity and reduces the possibility of a substantial overhaul. This improved operations efficiency will streamline processes that work to further enable future VAS.  To help enable a flexible service delivery environment, this stage should also consider prevailing market trends and preferences. The planning process has a bearing on the eventual returns on investment (ROI) and arpu. Service providers have increasingly started to rely on the services of their partners and specialist vendors to chalk-out strategic roadmaps for optimising their networks and service rollouts.

VAS implementation and integration are also fraught with numerous challenges. While content acquisition, its management, and spectrum regulations (for video/data applications) pose a common threat to all providers, the actual implementation and integration effort provides the differentiation from competition. This stage attempts to convert the optimised functional models (suggested during the consulting phase) into action. This may require replacement/retirement of legacy components and introduction of new COTS systems that seamlessly plug into the network. Once the final selection of components is made, their seamless integration into the network follows. Effective customer relationship management and network management are the desired outcomes of this phase. The success of this phase determines the ease of deployment of current and new services, as well as their financial viability, through reduced opex as the result of integrated, end-to-end systems in support of services.

SLA-based managed testing is another interesting trend in the communications space. By removing testing silos and adopting a single testing strategy, service providers and network operators tend to dramatically reduce the operational costs associated with managing a multi-vendor environment across all their networks, devices and applications. As operators focus on the launch of VAS, they are also striving to reduce the manpower and maintenance overheads of their product line. The managed testing partner brings an in-depth knowledge of technology development and testing to support end-customer SLAs and Key Performance Indicators (KPIs).

Extreme competitive pressures are forcing operators to reduce R&D costs, while simultaneously ensuring that VAS are tested across every element of the network.
The touch points at the various interfaces of the network core, the various OSS/BSS elements, as well as the main application and network components, also need managed testing.  This ensures a thorough verification of all features, functionalities, performance and quality metrics prior to and after service launch. It also improves the predictability and visibility of the costs the operator may need to spend on testing, year on year. Prominent types of managed testing services include test engineering and consulting, end-to-end integration testing, test automation, user acceptance, and multi-vendor interoperability testing services.
The managed testing vendors also have the capability to conduct end-to-end testing scenarios in a controlled network environment. They assume responsibility for the entire service lifecycle. This may include a lab setup to emulate the entire network deployment architecture to conduct various testing scenarios in heterogeneous access networks, with multi-protocol implementations of converged services. Such labs also allow communications service providers to address and monitor critical issues such as performance, latency, voice quality, retransmission, security, QoS and policy, enabling a smooth launch of services, maybe even ahead of the competition.

VAS is leading the way in driving arpu growth and improving profitability. But, this path must be trodden carefully, backed up by the capability of a ‘fine-tuned' network and the associated management systems. Operators cannot afford inefficiency and poor management of their own systems and hope to be competitive at the same time.
Service providers often use price as a competitive weapon when the services market faces extreme competitive pressures. They find it simply easier to offer better pricing for a longer-term contract commitment with early termination fees to suppress churn, than to invest capital in enhanced revenue generating VAS applications. This temptation has to be curbed in favour of optimising their networks to achieve long-term sustainable arpu growth. Strategic partnerships with specialist telecom vendors are enabling them to achieve operational efficiencies to make their networks ready for new service rollouts. This also helps them rationalise their operational expenses.

Rather than worry about the maintenance and deployments of their networks, innovative operators are focusing their energies on managing and growing their businesses through VAS. After all, the end-customer will continue to request more innovative services regardless of any operational challenges a service provider might be facing.

Tony Amato, AVP Network Services Solutions, Aricent, can be contacted via tel: +1 516 795 0082,  e-mail: anthony.amato@aricent.com

Cable operators must streamline their networks for faster service rollout if they are to guard against hungry telcos says Bill Bondy

As telcos race to roll out IPTV along with Internet access, VoIP, e-mail, messaging and security services, cable operators cannot rest on their laurels by relying on their strongholds in the entertainment and broadband industries. Despite cable's solid brand recognition and established customer loyalty, telcos could gain considerable ground on cable turf by boasting "on-demand" TV capabilities and personalisation of "blended lifestyle services" in their quad plays.

If IPTV subscriptions grow to 36.8 million by 2009, as predicted by Multimedia Research Group, this personalisation will be a significant differentiator.

To stay ahead, MSOs must recognise the many identities of a person as he or she moves among personal, professional and leisure profiles. A subscriber can be a wife, a mum, an office manager, a tennis player, an antiques collector or a dancer at different times in the same day. The fact that a subscriber could opt to change service settings according to the time of day, location or situation could be leveraged to open the door to increased loyalty through improved perception of service quality.

The problem is that embracing the customer and the seamless hand-offs among TV, fixed telephony, broadband and cellular networks will take substantial engineering feats. Of paramount importance will be the ability to instantly access information about bandwidth requirements, QoS, permissions, pricing plans, credit balances, locations and device types.
To achieve this, there needs to be a one-stop shop for data, and an understanding of how dynamic services fit into rigid legacy networks with silo data storage structures.
While service management, control and security can be greatly simplified by the unification of subscriber-specific data, the fact remains that a multitude of protocols and access methods spans many components (ie RADIUS, AAA, session accounting, policy management and HSS). That makes consolidation a very daunting task.

With so many different types of databases to manage - each with its own protocols and access methods - there is often a duplication rate of up to 35 per cent. More often than not, manual processes and forklift migrations are the status quo for re-synchronising databases with networks in order to support, and keep up with, increasingly rapid service changes.
The new-world view of data centralisation is more dynamic, as it focuses on real-time capabilities and on-the-fly transactions. These capabilities require a move away from historical, report-oriented strategies that sat at the core of monstrous data warehousing initiatives and did not have rigorous latency and response time requirements. Monolithic libraries of information now have to give way to intelligent databases that "grip" data for deeper personalisation of services and performance at increasingly higher levels.
To do so, cable companies have to break away from reliance on "transform layers" or "federation layers" that sit on top of multiple databases as an ad hoc "glue". While these layers help applications and clients to better understand the nature of queries, they cease being real-time responsive when dealing with, say, 50 databases. Because each data repository possesses its own access interfaces and protocols, the glue will no longer be enough when cross-database access within the network is required. Core network service and application performance lags are a major liability.

A centralised view will instead depend on the creation of one logical database to house all subscriber data with a discoverable, published common subscriber profile, as well as one single set of interfaces for managing that data (ie LDAP, TelNet, SNMP, etc). The single logical database will co-exist with data federation to allow a gradual, step by step, migration of data on a silo by silo basis until the operator has consolidated all required subscriber data to the degree that is possible.

Subscriber data is at the heart of control for the user experience and quality across networks. By consolidating customer data, MSOs enable provisioning and maintenance from one centralised location. A one-step process for adding all data for subscribers and services to a single database would give cable companies a huge opportunity to activate complex services within seconds of customer orders, rather than in some cases hours or days.
Instant access to synchronised data will greatly improve the customer experience, as well as create tremendous opex and capex savings. Potentially, miles of racks and servers could be eliminated if terabytes of data were moved to pizza-box sized hardware rather than complicated SANs and larger servers.
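A toy sketch of the single-logical-database idea, using hypothetical names: one provisioning write places all service data in one profile, and every application reads that same profile, so no re-synchronisation between silos is needed.

```python
# Toy consolidated subscriber store: one write provisions every service,
# one read path serves every application. Names are illustrative only.

class SubscriberStore:
    def __init__(self):
        self.profiles = {}

    def provision(self, sub_id, services):
        # A single one-step add: all service data lands in one profile,
        # activating complex services in one operation.
        self.profiles[sub_id] = {"services": dict(services)}

    def lookup(self, sub_id, service):
        # Every application reads the same profile - no silo duplication.
        return self.profiles[sub_id]["services"].get(service)

store = SubscriberStore()
store.provision("sub-42", {
    "voip": {"number": "+15551234"},
    "iptv": {"package": "basic"},
    "broadband": {"speed_mbps": 20},
})
print(store.lookup("sub-42", "iptv"))  # {'package': 'basic'}
```

The contrast with the siloed model is that the same order would otherwise require separate writes to the VoIP, IPTV and broadband systems, each with its own re-synchronisation step.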

To realise capex and opex benefits, certain components are crucial to centralising subscriber data among different network layers: a hierarchical, extensible database; real-time performance; massive linear scalability; continuous availability; standard, open interfaces; and a common information model. To help prepare for the day when IMS becomes a reality, it will also be important to leave room for a software upgrade to a full-blown HSS.

As cable operators integrate to PacketCable 2.0 environments, building and maintaining a subscriber-centric architecture will be key to services that require very fast, reliable and resilient repositories that concurrently serve multiple applications. After all, latency is not tolerated in pre-IMS networks today, which could spell doom for quad plays that don't build on a consolidated subscriber centric architecture.

A network directory server (NDS) is the first step in freeing customer data from silos, because an NDS puts a directory at the heart of the network. With a centralised repository, service logic can be separated from subscriber data, enabling a cable operator to run VoIP and associated services over WiFi, since the subscriber data can be reused across access networks (ie VoIP on cable, CDMA or GSM).

Additionally, the application-independent and hierarchical nature of an NDS makes it extremely flexible and extensible, and far better suited than embedded relational databases to hosting data for multiple applications and multiple access networks. A proper NDS directory structure suits the disparate data prevalent in converged networks, with their dynamic, real-time relationships. An NDS directory is object-oriented in nature, with a data model that is published, enforced and maintained by the directory itself.
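To make the idea concrete, here is a minimal sketch (entry names, attributes and the lookup helper are all hypothetical, not a real NDS API) of a hierarchical directory in which one subscriber entry serves applications on several access networks, while service logic stays in the applications:

```python
# Hypothetical sketch of a hierarchical subscriber directory: one tree
# of entries, keyed by LDAP-style distinguished names, read by VoIP,
# WiFi and mobile applications alike. All names are illustrative.
directory = {
    "uid=alice,ou=subscribers,o=mso": {
        "msisdn": "+15551234",
        "sip-uri": "sip:alice@example.net",
        "services": ["voip", "wifi"],   # reused across access networks
    }
}

def lookup(dn):
    # Any application, on any access network, resolves the same entry;
    # the service logic lives in the application, the data in the tree.
    return directory[dn]
```

The point of the sketch is the separation of concerns: adding a new access network means adding a client of `lookup`, not a new copy of the subscriber data.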

For a network directory server to provide these capabilities in the core of the MSO network, it is critical that it be highly performant, massively scalable, and geographically resilient.
Typical disk-based databases and legacy directories do not offer the read/write speed operators need to consolidate data in a live core network. Average latencies of three milliseconds for a query and under five milliseconds for an update are needed to meet customer performance expectations. Update performance is critical, and highly distributed, memory-resident directory databases can scale update (as well as query) transaction rates at the point of access.

As critical as performance is availability: a consolidated single logical database must always be up, because downtime is lost business. The network directory must provide continuous availability even in the event of multiple points of failure across the network, which is ideal for geographically dispersed deployments and for business continuity. NDS technology can be scaled massively, using data partitioning and distribution to host virtually unlimited quantities of data. Transactions and resilience are scaled by replicating data in real time over multiple local and geographically distributed servers.
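The two mechanisms named above, partitioning and real-time replication, can be sketched as follows. This is an illustrative toy, not any vendor's implementation: hash-based placement picks a set of owner servers for each entry, every write goes to all of them, and a read survives the loss of any single replica:

```python
# Toy sketch of partitioning plus replication for an in-memory
# directory. Server names and the placement scheme are illustrative.
import hashlib

class PartitionedDirectory:
    def __init__(self, servers, replicas=2):
        self.servers = servers              # eg ["lon-1", "nyc-1", "fra-1"]
        self.replicas = replicas            # copies kept of each entry
        self.stores = {s: {} for s in servers}

    def _owners(self, key):
        # Deterministically choose `replicas` servers for this key.
        h = int(hashlib.md5(key.encode()).hexdigest(), 16)
        start = h % len(self.servers)
        return [self.servers[(start + i) % len(self.servers)]
                for i in range(self.replicas)]

    def write(self, key, value):
        # Replicate the entry to every owner at write time.
        for server in self._owners(key):
            self.stores[server][key] = value

    def read(self, key):
        # Any surviving replica can answer the query.
        for server in self._owners(key):
            if key in self.stores[server]:
                return self.stores[server][key]
        raise KeyError(key)
```

Because placement is deterministic, adding servers spreads the keyspace across more partitions, which is the linear-scaling property the article attributes to an NDS.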

To make this scalability cost-effective, the hardware must be compact, inexpensive and non-proprietary, and the NDS software must scale linearly with it. In fact, the hardware needed for high transaction rates at these latencies is actually very small: a small network directory system can deliver 10,000 transactions per second against a couple of million subscriber profiles on a handful of dual-core servers running Linux.

That is a big difference from relational systems, which rely on expensive and complex hardware to scale to high transaction rates and directory sizes. Relational systems often struggle to scale capacity beyond a single server or operating system footprint, forcing much more expensive hardware into the network and driving up both opex and capex. Relational databases do have their place; they are better suited to batch-mode, complex billing- and CRM-type operations. But for voice, SMS and Internet services, distributed in-memory directories are more adept at handling real-time use, serving the data when and where it is needed.

Directories also help to simplify integration by supporting access through common IT technologies and protocols, such as LDAP, XML/SPML and SOAP. Using mainstream IT protocols broadens the pool of qualified professionals who can support such a system. This translates into substantial cost savings, as operators can implement open interfaces on off-the-shelf hardware and operating systems. Keeping network components adaptable to a wide range of equipment brings down support and maintenance costs.
Furthermore, to realise all the benefits of an NDS, it is critical that forethought be put into designing a common information model (CIM). This is the foundation for a useful, extensible data model that encourages data re-use while allowing applications to co-exist peacefully in a multi-application, single logical database environment. The CIM arranges subscriber, network and application data into several categories: subscriber identities, common shared global data, application-specific shared data, and private data.
Unfortunately, no standard model exists; every operator has its own information model and its own methodology for migrating and consolidating applications. However, most MSOs can build a common data repository within their network using an evolutionary approach. Starting with a single application that fulfils an emerging need of the MSO (eg presence, IM), the CIM data model framework can be established. This provides the foundation upon which other application data can be integrated and built. From then on, new applications (eg WiFi, AAA or policy management) build on the existing model. The key is to establish the proper foundation first and then add to it incrementally.
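The four CIM categories and the incremental registration of applications can be sketched as a simple data structure. The class and field names here are invented for illustration; no standard model exists, as noted above:

```python
# Minimal sketch of the four CIM data categories described in the text:
# subscriber identities, common shared global data, application-specific
# shared data, and private data. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class CimProfile:
    identities: dict = field(default_factory=dict)     # eg SIP URI, MSISDN
    shared_global: dict = field(default_factory=dict)  # reused by every app
    app_shared: dict = field(default_factory=dict)     # per-app, shareable
    private: dict = field(default_factory=dict)        # per-app, not shared

    def register_app(self, app, shared=None, private=None):
        # New applications extend the existing model incrementally,
        # rather than spawning a new data silo.
        self.app_shared[app] = shared or {}
        self.private[app] = private or {}

# Start with one application (eg presence), then build on the model.
profile = CimProfile(identities={"sip": "sip:alice@example.net"})
profile.register_app("presence", shared={"status": "online"})
profile.register_app("wifi_aaa", private={"session_key": "k-1029"})
```

The design choice being illustrated is that each later application reuses `identities` and `shared_global` instead of storing its own copy of the subscriber.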

The CIM allows cable and telco operators to share data in a single logical database, as it houses re-usable data that can serve new applications and services. As new applications are added and existing ones evolve, data models are analysed and changes are often required. Where data is part of the common model, changes can be applied to existing application data models using virtualisation techniques. So-called virtualisation is the ability to present application clients with different views of the common data based on the identity of the accessing agent. This allows the common data model to be filtered, re-organised or enhanced to fit each application client's requirements, while keeping the core data model intact and untangled from any specific application.
As data is "virtualised", objects can be viewed according to different characteristics: for example, attributes specific to a particular application, or object distinguished names that vary with the accessing application or user. Data is thus implemented once and managed as a single instance, yet can be viewed as different objects over and over again.
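A minimal sketch of this virtualisation idea, with invented attribute names and view rules: one stored instance of the subscriber record, filtered differently for each accessing application.

```python
# Hedged sketch of per-application "views" of common data. The record,
# the applications and their entitlements are all hypothetical.
COMMON = {
    "uid": "sub-42",
    "msisdn": "+15551234",
    "sip_uri": "sip:alice@example.net",
    "voicemail_pin": "9876",
}

VIEWS = {
    # Each accessing application sees only the attributes it needs.
    "voip":    ["uid", "sip_uri", "voicemail_pin"],
    "billing": ["uid", "msisdn"],
}

def view(data, app):
    # Filter the single managed instance for this client. The core
    # record is never copied or modified, only projected.
    return {k: data[k] for k in VIEWS[app] if k in data}
```

Here the data is implemented once (`COMMON`) but presented as a different object to each client, which is exactly the "managed as one instance, viewed many ways" property the text describes.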

As the CIM evolves, cable companies will need to find the synergies that let applications share common data. Once objects are shared, the process continues: designing schemas for new applications and merging them into the common model.

As operators consolidate their subscriber data, the platform they choose must offer a seamless migration path to supporting IMS data via an HSS. This prevents the operator from deploying yet another silo if and when it decides to deploy IMS. An HSS can also source its data from the NDS, storing it as part of the CIM, thereby allowing IMS applications, as well as non-IMS applications, to source their data from the NDS. This gives non-IMS and IMS applications a way to share common data and services across different access planes. The HSS essentially sits on top of the NDS, continuing the evolution of consolidation: it enhances the CIM with the operator's IMS subscribers, the characteristics of their connected devices, and their service preferences.

For cable operators to guard their markets against hungry telcos charging toward IPTV, Internet service, VoIP and other traditionally 'cable' services, they must start planning how to streamline their networks for faster service rollout. To achieve a quad-play set of offerings, consolidating subscriber data into unified views of customer profiles across multiple services is essential.

Bill Bondy is CTO Americas for Apertio, and can be contacted via: bill.bondy@apertio.com


