European Communications

Features

European Communications discusses the latest telecom trends with telco executives, analysts and topic experts via insightful analysis, Q&As and opinion pieces.

Telcos v. ISPs

Hugh Roberts looks at how service providers are developing their offerings and the consequences this may have for OSS/BSS systems that are already in place

Depending on your viewpoint, 2006 has either seen the marketplace for consumers wishing to purchase communications services get substantially simpler – or quite possibly the opposite. Historically, companies were set up to exploit discrete customer needs: a fixed wire telephone company for your home phone; a cellular operator for your mobility; an ISP for accessing the internet (with or without the availability of premium content and hosted services); and a satellite or cable provider if you wanted to increase your broadcast choice.
 Although there was some crossover – with offerings such as cable telephony and video on demand – it wasn’t too difficult to work out which company to go to for which service. Perhaps more importantly, it was a fairly easy matter to identify how much you were likely to pay for each, and how you were going to get charged and billed for each.
Then overlaps and virtual offerings began to appear. In the UK, for example, cable providers extended their telephony offering to include broadband, and in some markets the choice of alternative voice providers multiplied into the hundreds. Existing high street brands and ISPs started to offer virtual fixed and/or mobile services, and the mobile companies themselves – both network operators and MVNOs – now offer fixed, broadband and entertainment services. Even the satellite players are getting in on the multi-play act.
In the new world order of communications, aggregated branded multi-service bundles and M&A are now radically reducing the number of companies that customers can buy services from. While ‘voice’ is consequently getting cheaper and this is being presented as ‘a good deal’ for the customer, even with all-you-can-eat pricing being applied across the board it is surprisingly difficult for subscribers to work out exactly what they are going to pay, and whether it will actually be a less expensive and better quality overall level of service than was available before. In a number of cases customer rebellions have been highlighted in the media as these companies struggle to overcome the challenges of rolling out network and OSS infrastructure to keep up with demand, and BSS systems to integrate their customer information management and provide seamless and trouble-free customer services with unified billing.
In any multi-play bundled offering it seems to have become a golden rule that at least one of the key elements should be positioned as ‘free’, but most consumers have learned the hard way that there is no such thing as a free lunch. In many cases the small print, thresholds and cross contingencies on the usage of the other services for full qualification are extremely complicated and difficult to work out.
And all this before customers have to make sense of the choice of ‘delivery channel’ for converged services such as broadband Internet access or streamed video to mobile. Curiously, early usage indications show that the majority of IPTV use is in the home and not in ‘a mobile context’, and that a large proportion of this usage occurs in the same room as a TV set capable of accessing the same content! The sophistication of games on mobile phones has certainly increased, but the range of communications services now available on games consoles, and even within gaming environments themselves, puts the majority of handset enhancements to shame.
Bigger (and ‘wider’) is apparently better in the opinion of the new aggregated operators, but is this merely about the ability to leverage economies of scale? The primary driver – at least in markets where the penetration levels of one or more of the core service offerings are high – is to do everything possible to retain customers, increase their loyalty, and hopefully also capture and increase revenues from them. Clearly, the larger the number of services a customer has from one provider, the less likely he is to churn, not least because of the increased lifestyle disruption caused. However this is only part of the answer. There is a critical need for the services themselves to be ‘sticky’, or the whole value proposition may be compromised. Customers are getting smarter: it is no longer enough to have a brand, or premium content, or low (and understandable) pricing, or good quality customer service – all of these are now required to be competitive.
So, is it possible for the small and niche players to be winners? At the backbone layer, the answer would appear to be a resounding ‘no’. The satellite and cable industries have already consolidated down to a handful of players, and the number of pure-play ISPs is rapidly reaching the same point. No one is entirely certain exactly how many mobile network operators are viable in a given marketplace, but the evidence from the US suggests that in most developed countries the number is probably too high. There are some niches evolving in the infrastructure domain as new technologies such as WiFi/WiMax develop, but even these niches are rapidly being squeezed between the dominant operators and the growing positioning of ‘social information infrastructure’, such as state or city owned metropolitan networks.
Notwithstanding some brand and service differentiation (reflecting different operators’ market segmentation strategies and consequent capture of premium content offerings), it is getting harder to tell the difference between the companies that remain in the marketplace, irrespective of where and how they originated. There is a growing customer expectation in highly penetrated and competitive markets that any CSP should be capable of offering a full (virtual or real) multi-play package and that substantial price reductions or cross-product discounts should accrue.
This would be the end of the story, apart from the fact that three factors are coming together to turn the existing communications industry structure on its head:
The mechanisms for revenue generation within the telecoms value chain are changing
The so-called ‘X-Factor’ companies such as Google and eBay/Skype are changing the value proposition for CSPs from one based solely on subscriber revenues to one which sees customers as both source of income and key resource in the chase for advertising and service sponsorship money. Revenue splitting with subscribers for ‘customer generated content’ is emerging as a prime source of income and loyalty for service providers – an ecosystem based on ‘trading relationships with everyone’ is replacing the ‘uni-directional revenue flow’ model.
As advertising-derived income erodes customer content revenues even further (moving from ‘all-you-can-eat’ to ‘free’ to the subscriber), eCommerce and financial transaction processing will take their place as critical revenue drivers (although content will remain the primary support for brand positioning). Prepaid (real-time) micro-transaction expertise is one area in which the telecoms industry can claim to be significantly more advanced than any other, and the scale of repatriation of funds to family and friends from the northern to the southern hemisphere via mobile phone is testament to this.
The industry is going to restructure along horizontal rather than vertical lines under both regulatory and commercial pressures
The EU’s support for structural separation – the splitting of the infrastructure and services divisions – is well documented. Citing the break-up of AT&T into the Baby Bells in the US in the 1980s, and Oftel’s (now Ofcom) requirement for BT to establish ‘operationally separate’ business units in the 1990s as precedents, the Commission seeks to guarantee fair access and to promote competition and investment across the region.
Commercially too, this makes sense. Building a business around the ‘bit pipe’ can be extremely profitable, but only if it is aggregated to achieve economies of scale; if multiple technology channels and platforms are owned, to ensure best cost/QoS delivery and avoid ‘disruptive technology’ pitfalls; and, perhaps most importantly of all, if trade is confined to a B2B context, as the cost of maintaining a full consumer brand presence would be unsustainable.
Affinity groups are rapidly growing in importance as the primary mechanism for delivering high value service to niche customer groups
The secret of affinity groups (as opposed to communities of interest) is that they are driven by customer pull rather than service provider push. These have existed on the Internet for some time, but it is with the inclusion of mobility that their true value will be realised. ISP expertise and experience in handling closed user groups may prove to be their most important and lasting legacy to the communications ecosystem.
If an affinity group has access, content and mobility included within its remit, how does this differ from a multi-play MVNO? The structural answer is ‘not a lot’! The marketing answer is, of course, ‘quite a lot!’, and comprehending this will be the secret to understanding the ‘4G’ environment as it evolves. The niche players of the future will be the Customer Lifestyle Providers (CLPs) supported by a combination of Virtual Service Enablers (VSEs) and Network Service Providers (NSPs).
So, turf wars? For the operators that have successful subscriber and partner relationships, the transition may not be too uncomfortable. Spare a thought, though, for the larger ISV and SI players in the OSS/BSS supply chain, whose product sets, deployment models and value propositions may be based on architectures and business unit inter-relationships that no longer match the requirements of their existing customers as they evolve from one-dimensional caterpillars into four-plus-dimensional butterflies.

Hugh Roberts is Associate Strategist - Logan Orviss International

Smart phone complexity

Doug Overton investigates the causes behind the mobile ‘No Fault Found’ phenomenon and how the industry can solve this vastly expensive problem

Incredibly, one in seven mobile phones is returned by subscribers as faulty within the first year of purchase, according to research by Which? This statistic will doubtless raise eyebrows and drive speculation over product design flaws and build quality standards. However, further analysis of the nature of these returns reveals the even more disturbing statistic that 63 per cent of the devices returned have no fault at all.
This figure, unearthed as part of a study into mobile device returns trends in the UK, places mobile phone 'No Fault Found' returns at a level 13 per cent above the average within the consumer electronics sector.
With operators, manufacturers and retailers collectively covering administration, shipping and refurbishment costs approaching GBP35 per device, this equates to a potential cost to the UK mobile industry of GBP54,016,200 and, more significantly, a global industry cost of GBP2,274,399,029 (US$4,499,898,480).
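To see how figures on this scale arise, the chain of percentages can be laid out explicitly. The sketch below is illustrative only: the annual UK handset volume is an assumption back-calculated from the article's own totals rather than a figure taken from the Which? research.

```python
# Illustrative back-calculation of the NFF cost figures quoted above.
# The annual handset volume is an assumption inferred from those totals,
# not a number reported by the Which? research itself.

uk_handsets_per_year = 17_148_000   # assumed annual UK handset sales (back-calculated)
return_rate = 1 / 7                 # one in seven handsets returned in the first year
nff_share = 0.63                    # 63 per cent of returns show no fault
cost_per_return_gbp = 35            # admin, shipping and refurbishment per device

nff_returns = uk_handsets_per_year * return_rate * nff_share
uk_cost_gbp = nff_returns * cost_per_return_gbp

print(f"NFF returns per year: {nff_returns:,.0f}")   # ~1,543,320 devices
print(f"UK industry cost: GBP{uk_cost_gbp:,.0f}")    # ~GBP54 million
```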
So why are so many devices returned without fault? WDSGlobal is working alongside a leading UK mobile retailer to implement mechanisms and services to significantly reduce the impact of the NFF phenomenon. Analysis of the more than 15,000 monthly calls arriving at the specialist retail returns/diagnostics line provides valuable insight into the causes behind the trend.
In 24 per cent of these cases the user had resorted to abandoning the device after a lengthy and frustrating battle with the usability of functions or applications. This is a clear indication that manufacturers still need to invest considerable time and effort in the user-centred design and modelling of device software.
A recent study in the Netherlands reported that users will, on average, persevere for 20 minutes with a service before abandoning it. Alarm bells should be ringing for the industry when the manual set-up of an e-mail service on a device alone takes a minimum of 20 minutes, even before the user attempts to understand how the program works.
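A rough indication of why manual e-mail set-up alone can consume that 20-minute window is the sheer number of parameters a mid-2000s handset asked the user to key in by hand. The sketch below lists typical fields; every server name and value is a hypothetical example rather than any particular operator's settings.

```python
# Hypothetical illustration of the settings a user had to enter manually to get
# e-mail working on a mid-2000s handset; all values here are invented examples.
manual_email_setup = {
    "data_access_point": {            # the operator's packet-data bearer
        "apn": "internet.example-operator.co.uk",
        "username": "web",
        "password": "web",
    },
    "incoming_mail": {
        "protocol": "POP3",           # or IMAP4 on more capable devices
        "server": "pop.example-isp.com",
        "port": 110,
        "use_ssl": False,
    },
    "outgoing_mail": {
        "protocol": "SMTP",
        "server": "smtp.example-isp.com",
        "port": 25,
        "requires_auth": True,
    },
    "account": {
        "email_address": "user@example-isp.com",
        "display_name": "A. Subscriber",
    },
}

# A dozen or more fields, several of them meaningless to a non-technical user,
# spread across nested menus - all before the first send/receive can be tested.
print(sum(len(section) for section in manual_email_setup.values()), "fields to enter")
```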
Some 8 per cent of users were simply attempting to return a device on the basis that it did not fulfil the purpose for which it was sold. This may be a consequence of inadequate marketing on the part of the manufacturer, but is more often than not attributable to a knowledge deficit at the point of sale. The majority of mobile retailers are not equipped with the expertise to provide informed advice on the more complex features of mobile devices. Many high-end mobile phones are now differentiated through data communications technologies including GPRS, EDGE, UMTS and Bluetooth, which are complemented by an equally confusing array of applications such as WAP, MMS, e-mail and streaming media.
If the mobile retailer is anything short of fully briefed on the benefits and application of these technologies, they are highly likely to be furnishing the customer with inaccurate or insubstantial advice. A recent mystery shopper survey carried out by WDSGlobal identified that only 20 per cent of retail staff could provide even a moderate description of what 'BlackBerry' functionality could provide within a device, despite its prevalence as a powerful differentiating business function.
For some retailers, the provision of inappropriately positioned devices to customers is also a reflection of store policies to ship specific models based on margins, stock levels or promotions rather than matching requirements to solutions. The same retail survey alarmingly identified that only 60 per cent of leading high street retailers adopted an impartial, customer-focussed approach to sales based upon listening to the requirements of the customer.
It is little wonder that angry and frustrated customers try to return devices as 'faulty' when they were ill advised at the point of sale. The more astute retailers are already embracing in-store kiosks and other knowledge based point of sale platforms in an effort to prevent this happening.
The most significant contributor to the 'No Fault Found' problem derives from users who – quite understandably – diagnose lack of connectivity to services such as WAP or e-mail as a fault. The reality, however, is that many devices are purchased in an unconfigured state for use with these services. While many operators attempt to set up services for immediate 'out of the box' usage, the more popular applications such as e-mail will be left to the user's own 'DIY' devices.
Subscribers who swap networks while keeping their equipment, and those who purchase imported, second-hand or SIM-free equipment, make up a growing number of users who will be in an 'unconfigured' state for all services.
Of the 300,000 calls received by WDSGlobal's specialist tier-two support environment in Q2 of this year, 47 per cent of the issues related to problems with mobile service configuration. This in itself is a major concern, even more so when it represents a 2 per cent increase on the same statistic drawn in 2000. The problem is not going away, and as device and service complexity continues to develop at an unprecedented rate the user experience only stands to worsen.

The true cost of the problem
Working within the industry it is often hard to empathise with the pain experienced by mobile subscribers. Assumptions that users will embrace complex device configuration menus or self-serve portals often fail to recognise that most users simply expect devices to work without understanding the underlying complexity. This is not difficult to understand when all other consumer electronic devices including MP3 players, portable gaming units and digital cameras simply work when they are taken from their boxes.
A user spending GBP500 on a sophisticated mobile device is doing so on the understanding that it will improve their personal productivity or simplify part of their lifestyle. However when this benefit comes at the expense of time spent elsewhere engaging with customer service agents or ploughing through convoluted instructional guides, the exercise becomes counterproductive.
The result is that either the device is abandoned (returned without fault) or the service itself is abandoned, relegating a potentially powerful communications tool to the status of an expensive personal organiser; neither situation is healthy for the consumer or the industry.
The $4.5 billion quantitative cost of this problem is easily recognised and, for mobile operators and manufacturers, can be easily absorbed into the growing overhead associated with launching new products, or in many cases simply hidden within an inflated consumer price tag.
What is of greater concern are the less tangible qualitative issues at stake. Mobile subscribers are becoming increasingly despondent about mobile technologies, and a frustrating user experience has sadly become the rule rather than the exception. Brand loyalty and subscriber churn once again come under fire as mobile users migrate between device vendor and mobile operator brands in a seemingly eternal quest for an optimal user experience.
The mobile phone has become the poor relation of its consumer electronics cousins, and while many parallel sectors receive recognition and accolade for innovation and design, the mobile industry continues to draw bad publicity.
In an age of rapid innovation, where industry prophets foretell entirely converged consumer electronic devices in the near future, it is ironic that the mobile phone appears to be at the hub of that convergence. If mobile manufacturers and operators truly wish to form the vanguard of convergence innovation, there is still much to learn from their consumer electronics counterparts.

Mitigating the 'No Fault Found' risk
The NFF problem is not going to be solved overnight. It is a problem that has developed and festered over many years in conjunction with rapid industry growth and technology innovation.
The root causes however are not shrouded in mystery – they can be catalogued and analysed systematically with a view to preventing them in future product launches. Most operator support centres will carry detailed call records, explicitly capturing the frustrations faced by subscribers at the coal-face; similarly most reverse logistics organisations or departments will accurately log the inherent drivers behind NFF returns.
It is this data that provides the market intelligence for mobile industry Product Managers to mitigate problems with the launch of future products. Every problem can be traced back to a deficiency in the device design or the channels, processes and mechanisms that surround its launch and in-life support. Most of these issues may be addressed through stringent device testing and usability modelling prior to the launch of the mobile device. Furthermore, the effective empowerment of sales channels and support centres with specialist knowledge will help to assuage much of the NFF problem.
It is a simple and logical process which, if pursued on an ongoing basis, should realistically show a reversal in the trends of high-volume customer support calls and 'No Fault Found' returns inside two years.
 
Doug Overton is Head of Communications at WDSGlobal   www.wdsglobal.com

Regulatory framework

Malcolm Dowden charts developments in the regulatory framework designed to encompass the European communications market

The electronic communications sector has come under increasingly close scrutiny from the European Commission, and 2007 looks set to be a particularly active year for the industry's representatives in Brussels and beyond. The Commission has stressed that it is ready to revisit any aspect of the Regulatory Framework, provided that this contributes to the attainment of the Lisbon 2010 Agenda objective of making the EU the most dynamic and successful knowledge-based economy by the end of this decade.
The current Regulatory Framework dates back only to 2003, and transposition into national law was only completed in 2006. Indeed, some member states have not yet adopted the necessary secondary legislation or completed their market reviews. Nevertheless, a review has been initiated to ensure that the legislative and regulatory regimes can take account both of the rapid pace of technological development and of the competitive landscape of the industry in the EU's range of established and emerging markets.
The Commission is examining all of the key directives on which the Regulatory Framework is based, together with Article 8 of the Electronic Communications Competition Directive. The Commission's principal objective is to remove any obstacles to the provision of faster, more innovative and competitive services. Further, the Commission has made it clear that the exercise will extend to the regulation of next generation networks and the liberalisation of radio spectrum.

Extended powers
In one of the main 'on-off' stories of 2006, the Commission has also been considering whether its powers under the Regulatory Framework should be extended to create a single regulator for the EU's electronic communications sector, and to include an ability to veto remedies imposed by national regulatory authorities. The idea of a single regulator was floated in June 2006 by Viviane Reding, the Information Society Commissioner. If introduced, it would resemble the European System of Central Banks in structure, with local regulators responsible for analysing local market conditions and reporting back to the EU's regulator to ensure that European law is applied equally across the continent. Introducing the concept, the Commissioner observed that the current lack of harmony gives some countries an advantage over others, which is “unacceptable” and “an obstacle to the internal market and to effective competition”.
The Commission intends to table draft legislative proposals amending the Regulatory Framework before the end of 2006 or early 2007. These proposals will then be transmitted to the European Parliament and Council for adoption under the co-decision procedure. The Commission's target is implementation and transposition into national laws by 2009-2010.
In parallel, a revision of the Recommendation on relevant product and service markets within the electronic communications sector is underway. The Recommendation lists a number of wholesale and retail markets susceptible to ex ante regulation by the member states. The Commission is proposing to reduce the number of markets from 18 to 12. The only remaining retail market covered by the Recommendation would be access to the public telephone network at a fixed location. 
There is also a root-and-branch review of the relationship between regulation and the application of competition law. In its 2005 Discussion Paper on the reform of Article 82 the Commission laid down some important policy markers – and in particular the view that competition law enforcement should be effects-based and focus on protecting consumer welfare. This would represent a significant shift in approach, with enforcement action no longer depending on the form a business practice takes, but on its effects.

Competition problems
What effects? Introducing the consultation in 2005, the Competition Commissioner Neelie Kroes made it clear that Article 82 enforcement should focus on real and empirically demonstrable competition problems. In other words, “behaviour that has actual or likely restrictive effects on the market, which harm consumers.”
In the telecoms sector and, particularly in the case of emerging markets, it will be very important to clarify this issue. Incumbent operators often argue that they should be granted a 'regulatory holiday' when they plan to upgrade bottleneck access infrastructure (e.g. from narrowband to broadband). However, as the infrastructure is not readily replicable (due to economies of scale and scope and legacy infrastructure), there is a risk that such holidays might result in retail markets being foreclosed to competition – this did in fact occur in the provision of broadband via ADSL technology in some Member States. 
In its response to the Discussion Paper the European Competitive Telecommunications Association (ECTA) urged the Commission to recognise the relationship between competition law and sectoral regulation and follow up with a more in-depth analysis in sector-specific documents e.g. through an update of (i) the Notice on the application of competition rules for the telecoms sector; and (ii) the Commission's guidelines on ex-ante market definition and assessment of significant market power.
ECTA also called upon the Commission to highlight how rules can most appropriately be applied in sectors characterised by economies of scale, vertical integration, historic monopolies and former state funding. This is particularly relevant to guidance on 'emerging markets', 'leverage', 'efficiencies' and 'refusal to supply'.
Critically, ECTA asked the Commission to clarify what is meant by “capability to foreclose competition”. In the telecoms sector, where behaviours such as margin squeeze can cause considerable and lasting damage, it is important that a case can be brought before “actual foreclosure” has occurred. It would be little comfort to a market entrant that its failure might subsequently be used as evidence of anti-competitive behaviour on the part of an incumbent. 
There are indications that Commissioner Kroes is considering a sector-wide inquiry in telecoms in 2007. Having recently set about investigating the energy and financial services sectors, it is thought the Commission may examine competition and the state of liberalisation of the telecoms sector as early as next year.
Sector inquiries typically begin with extensive questionnaires being sent by the Commission to industry players. They are organised and carried out by DG Competition in conjunction with the other relevant services of the European Commission. For telecoms, that would be the services of Viviane Reding, the Commissioner for Information Society & Media.
There is no doubt that 2007 looks set to be an interesting and critically important year for the electronic communications sector.                                       

Malcolm Dowden is an Associate with Charles Russell LLP

Project management

Projects fail - some spectacularly - and, looking at project management statistics, if you were a betting man you wouldn’t lay money on their successful delivery. But is failure inevitable? Are project managers doomed to spend their careers in a never-ending groundhog day of disaster? Brendan Loughrey looks at the best ingredients to prevent the bitter taste of failure

Project failure is not discriminatory – it affects pretty much all sectors and all countries in equal measure, whether it's an embarrassingly late delivery by a Department of Defence that has spent millions on helicopters that aren't safe to fly, or a contractor, featured in a recent scathing report, that had completed only six of the 150 health centres it was supposed to be building in Iraq despite having already spent 75 per cent of the USD186 million budget.
Louise Cottam, a senior project manager who works in the food, nutritionals, pharmaceuticals and medical devices industries, comments: “I've seen the same failure modes across all the industries I've worked in – in fact, some of the worst projects I've experienced were within consulting or project management divisions. Although the goals and products might be different, failed projects share many common characteristics.”
And it's not like we don't know why projects fail. After all, hundreds of thousands of words have been written to document and analyse the causes of project failure. There are also statistics galore, showing just how very bad some are at delivering projects on time and to budget; while many, many more words have been written to describe features of successful projects and to promote a variety of methodologies that are claimed to improve success rates. Yet still the Holy Grail of successful project delivery seems to slip from our grasp.
Neither is this an inconsequential issue. The consequences of project failure can be catastrophic – failed projects bankrupt businesses, generate humiliating publicity which devastates investor confidence and slashes the stock price, help to bring down governments and soak up huge sums of money that could be better spent elsewhere. In the worst cases they kill people.

Grim...and getting worse
IT and telecoms industry stats are particularly grim and are expected to become worse. The daddy of all IT project failure studies is the Chaos Report by the Standish Group. It's Standish that is behind the often-quoted 70 per cent failure rate for IT projects, which is even more worrying when you consider that its definition of failure is a 100 per cent overrun on time or budget. Scarily, in the ten years since the initial study, little seems to have changed. Indeed, some industry observers comment that the figures may even have got worse. Standish stats show that 31 per cent of projects are cancelled before completion and 88 per cent exceed deadline, budget or both, while average cost overruns are 189 per cent and average schedule overruns are 222 per cent.
With global spending on software, hardware and services to support tomorrow's networked, computerised world a multi-trillion dollar industry, financial losses from overruns and failed projects are staggering. When you consider that these projects have knock on financial effects – such as lost business, lost stock value and the requirement to compensate customers – the numbers quickly become unimaginable. The problem is also increasing exponentially due to the number of IT megaprojects as large governmental organisations automate processes and connect disparate systems. One example of a current megaproject is the UK's National Health Service project to automate health records, which it is estimated will cost at least USD65 billion.

Root cause
Everything is not as it first appears; critics of the Standish statistics and other surveys point to the fact that the root cause of overruns lies in poor scoping and requirements capture at the start of the project. Peter Bowen, a senior consultant in the telecoms industry, explains: “Many companies chronically underestimate the cost of running a project or how long realistically it will take to deliver. Scope creep adds to this problem, with companies failing to understand how much extra time and money adding that nice-to-have feature will cost them. Of the hundreds of projects I have reviewed, the most common problem is the failure to meet expectations. I've lost count of the times I've heard: 'this is not what I want' (customer); 'but this is what you asked for' (supplier); 'but this is not what I need'.”
In fact executive pressure to bring projects in as cheaply as possible often means there is little incentive to identify the true cost or realistic timescales at the start. Doing so would mean that many projects would simply never get off the ground.
So is project failure an inevitable fact of life? Well, it isn't at Comunica. Our business is based on our ability to deliver and we have a 0 per cent failure rate. Quick double take. Did he say 0 per cent? Surely not? At Comunica we're proud that we've never let a client down on a date, and we offer a fixed price upfront, so we never overrun the budget. If we go over, it's our problem, not the client's. This is one of our differentiators: few companies are brave enough to offer fixed prices and our willingness to do so demonstrates our confidence in our ability to deliver.

Ultimately it's about people
Why are we so successful at delivering projects? Well ultimately it's all about people. The technology we implement is pretty well tested, with absolutely minimal failure rates. The key to our success lies with human issues such as good communication, realism and truthfulness, and creating trust with your employees, with other project staff and with the customer. Our differentiator is the quality of our people.
Formal methods are useful, but they simply do not work out-of-the-box in the real world. You have to be prepared to adapt to the environment and culture, taking what you need from methodologies. You also have to be prepared to review and learn from your mistakes – constantly evolving your approach to make it more effective. Getting the approach right is also important. While there has to be accountability in projects, a finger-wagging blame culture is counterproductive. Things go wrong and people make mistakes. Success comes from being fair but firm, and earning trust so that people will tell you there is a problem. If your staff trust you enough to tell you they're worried about something then that gives you the chance to fix it. Blaming is easy: being solution-oriented and constructive is much, much harder.

Brendan Loughrey is Comunica Limited's Project Director   www.comunica.co.uk

NGN application servers

Paul Bassa picks his way through some of the ‘knowns’ and ‘unknowns’ of Next Generation Networks and suggests that service providers will need to employ insight and wisdom if they are to reap the rewards of IP migration

“As we know, there are known knowns. There are things we know we know. We also know there are known unknowns. That is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don't know we don't know.”
  Donald Rumsfeld, US Secretary of Defense, February 2002

Donald Rumsfeld may no longer be at the US Department of Defense but his 'Known Unknowns' theory – either a work of genius or pure nonsense, depending on your point of view – remains a lasting legacy. Thankfully, the migration of communications networks and services from 'legacy' or 'traditional' to 'next generation' networks (NGNs) has very little in common with the 'war on terror', but Rumsfeld's famous aphorism can be a useful way to describe just how much we truly know about the potential costs and benefits involved as we step into the brave new world of the NGN.

The 'known knowns'
The fundamental 'known known' of an NGN is that it supports a single signalling and transport architecture for multiple multimedia services, including voice and video, messaging and presence, integrated communications and IT applications. This is achieved through SIP for session control and RTP for media transport. Also known is the ability to overlay that network with value-added applications that can be rapidly introduced (and removed if required) across any type of network – fixed, wireless or mobile.
These are known capabilities because IP networks and services are available now and because the standards and technology to support these additional features are already being developed and deployed. There may still be technical issues to resolve – either at the design or at the implementation and integration level – but the fundamentals are already in place.
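For readers less familiar with the underlying protocols, the sketch below shows the shape of a SIP INVITE carrying an SDP body that negotiates an RTP audio stream – the pairing described above as the NGN's common session-control and media layer. All addresses, tags and identifiers are placeholder values, not taken from any particular deployment.

```python
# Minimal illustration of SIP session control carrying an SDP offer for an
# RTP media stream. All hosts, tags and identifiers are placeholder values.
sdp_offer = (
    "v=0\r\n"
    "o=alice 2890844526 2890844526 IN IP4 192.0.2.10\r\n"
    "s=-\r\n"
    "c=IN IP4 192.0.2.10\r\n"
    "t=0 0\r\n"
    "m=audio 49170 RTP/AVP 0\r\n"      # RTP audio on port 49170, payload type 0
    "a=rtpmap:0 PCMU/8000\r\n"         # G.711 mu-law at 8 kHz
)

sip_invite = (
    "INVITE sip:bob@example.net SIP/2.0\r\n"
    "Via: SIP/2.0/UDP 192.0.2.10:5060;branch=z9hG4bK776asdhds\r\n"
    "Max-Forwards: 70\r\n"
    "From: Alice <sip:alice@example.com>;tag=1928301774\r\n"
    "To: Bob <sip:bob@example.net>\r\n"
    "Call-ID: a84b4c76e66710@192.0.2.10\r\n"
    "CSeq: 314159 INVITE\r\n"
    "Contact: <sip:alice@192.0.2.10>\r\n"
    "Content-Type: application/sdp\r\n"
    f"Content-Length: {len(sdp_offer)}\r\n"
    "\r\n"
    + sdp_offer
)

print(sip_invite)
```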
The specific technical choices may vary depending on the network operator or service provider in question. Large European incumbents like BT, France Telecom, Telecom Italia and Norway's Telenor, for example, have committed to large-scale IMS network architectures. As the dominant fixed-line operators in their respective markets they face a number of challenges: reducing infrastructure costs, competing with the influx of new and agile competitors, countering fixed-mobile substitution with fixed-mobile convergence, and re-creating themselves as content players, either as aggregators or ultimately as owners. The aim of the game here is long-term viability, not short-term returns.
Mid-tier operators, meanwhile, are likely to be more cautious on their commitment to major NGN and IMS investments, though equally keen to exploit the cost and revenue benefits of IP.
Local market conditions will determine their particular requirements, but a common factor behind mid-tier operators migrating to NGN is the declining revenue from voice minutes, which are quickly becoming a commodity thanks to VoIP. This threat, spearheaded by VoIP providers such as Skype, may once have been limited to fixed-line but is quickly having a similar effect in mobile too.
Mobile operators are already tied to an existing separation between their circuit-switched and packet data access networks. The principal migration path is to follow the 3GPP route from R4 circuit-switched and packet data access, to R5 IMS, and ultimately to the all-IP R6 network. Mobile operators are trialling IMS networks, but there remains a short-term investment challenge between rolling out the new infrastructure and launching new self-contained services, such as mobile TV.
The smaller service providers will focus on their specific markets, such as prepaid or business services, and will look for immediate cost minimisation and immediate revenue returns. A single multiservice platform supporting PSTN and IP carrier interconnection with least-cost routing, service access, validation and control, real-time rating and charging, service creation and web-based customer self-care, provides the ideal basis on which to enter a market. Such a solution supports the SIP signalling and multimedia capabilities of NGNs, but also directly addresses the integration overheads of multi-vendor solutions.
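As a flavour of what 'least-cost routing with real-time rating and charging' means in practice, the sketch below picks the cheapest interconnect route by longest-prefix match and rates a call as it would be charged. It is a minimal illustration only; the carriers, prefixes and per-minute rates are invented.

```python
# Illustrative least-cost routing and rating step for a multiservice platform.
# Carriers, prefixes and per-minute rates are invented example data.

ROUTE_TABLE = {
    # dialled-number prefix -> list of (carrier, rate per minute in pence)
    "44":   [("CarrierA", 1.2), ("CarrierB", 1.5)],
    "4420": [("CarrierA", 0.9), ("CarrierC", 0.8)],   # London: more specific prefix
    "33":   [("CarrierB", 1.4), ("CarrierC", 1.6)],
}

def least_cost_route(dialled: str):
    """Longest-prefix match, then the cheapest carrier for that prefix."""
    for length in range(len(dialled), 0, -1):
        routes = ROUTE_TABLE.get(dialled[:length])
        if routes:
            return min(routes, key=lambda r: r[1])
    raise ValueError("no route for " + dialled)

def rate_call(dialled: str, duration_seconds: int) -> float:
    """Return the charge in pence, billed per started minute."""
    carrier, rate = least_cost_route(dialled)
    billed_minutes = -(-duration_seconds // 60)      # round up to a whole minute
    return billed_minutes * rate

carrier, rate = least_cost_route("442079460000")
print(carrier, rate)                                  # CarrierC 0.8
print(f"{rate_call('442079460000', 175):.1f} pence")  # 3 billed minutes -> 2.4 pence
```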

The 'known unknowns'
While there is plenty that is understood about NGNs, there remains – in Rumsfeld-speak – a number of 'known unknowns.' These are mainly concerned with the uncertainty over precisely how the new technology will disrupt the status quo. There is a myriad of unanswered questions. Which services, applications and features will connect with users and generate the biggest returns? Which stakeholders in the value-chain stand to reap the greatest benefits? Will customers opt for service bundles or pick 'n' mix solutions? Which access technologies will come to dominate – cable, DSL or fibre, mobile or wireless? Only time will answer such questions, but it is useful to look closely at two of them: the uncertainties over service demand and the new service value chain.
Service demand is about delivering value. In the consumer market, for example, value is both personal and social: to teenagers, it's about being new and cool; to their parents it's about being simple and reliable. To deliver true value, therefore, you have to know your customers intimately and respond to their demands instantly.
The first requirement is pure marketing; the second requires the platform capability and business processes that can support, for example, a new feature success rate as low as 10 per cent. The costs of rolling out new services and features will include enabling the network capabilities, integrating operations and business support, promoting the services and acquiring the customers. By minimising the service implementation costs, including operations and business support, the operator is freed to do more than repackage existing services, and can directly address the uncertain demand for new services.
As regards the value chain, different models may apply depending on the relative power of the content and network providers and on the accessibility of the target market to the content provider: a strong content provider may command premium value and so may opt to sell exclusive rights to a major service provider that wants to attract the maximum number of subscribers; other content providers may look to provide services directly; and others will want to buy into multiple channels (e.g. via a mobile content aggregator). Similarly, network services may be offered directly, via VNOs, or through sales and services channels. For example, a hosted business communications service, such as IP Centrex, may be offered via local partners that provide a range of IT and communications services to their customers. To maximise value from the relationship, those services partners may want to offer an own-branded service with their own pricing models. The network operator, therefore, must offer not only the most appropriate service features for the end-users but also the management and charging flexibility for the channels.

The 'unknown unknowns'
There is almost nothing that can be said about 'unknown unknowns', other than the fact that their very existence suggests the expectation of change. It is likely that the recent revolution in communications and media technology will consolidate into a small number of new business models over the next few years. Operators expecting to exploit this change – at whatever level and position in the market – must be ready with a platform that enables rapid, low cost service rollout with well-targeted new features and services encompassing all types of network and multimedia. Whether this requires an IMS solution, an NGN or softswitch architecture, or a standalone IP platform, will depend on their role in the value chain and the range of services they intend to offer.
To cope with these 'unknown unknowns' service providers will need to demonstrate great responsiveness and the right combination of intelligent insights and wise actions if they are to realise the true benefits of migrating to IP. It will not be as complex or exhausting as the 'war on terror', but it might seem like it at times.

Paul Bassa is Product Manager, Digitalk

Newspeak

European Communications presents its regular round-up of the latest developments in the world of telecommunications

Upwardly mobile
European firms now spend 32 per cent of their telecom and networking budgets on mobility, according to Forrester Research. In its report The State Of European Enterprise Mobility In 2006, Forrester sets out how the opportunity for greater productivity, cost savings, and boosted employee morale has put mobility at the core of strategic agendas.
Today, more than 70 per cent of enterprises are using some type of mobile application. A third of businesses say that setting mobile and wireless strategy and policy is a priority in 2006; for another 16 per cent it's a critical priority. Why are these companies investing in mobility? Mobility provides the enabling technology for business practices like flexible working, with quantifiable benefits.
However, Forrester also reports that mobility projects raise cost and reliability concerns, especially as businesses start to enhance their enterprise applications – like sales force and field force applications – with mobility. According to Forrester, enterprises need to stay focused on identifying their needs while taking a step-by-step approach to realise the full benefits.
Forrester analyst, Jenny Lau, says: “It's good news that enterprises are strategically planning for mobility and benefiting from it. However, there is a risk that mobility will remain a strategic agenda point that doesn't get realised. To turn strategy into reality, firms must devise a strategy and plan pilots with users, set handset policies early, and have contingency plans in place. In addition, it's best to start small. Firms should see how mobility could give existing infrastructure a boost rather than planning for an ERP overhaul at the outset.”
Details: www.forrester.com

The 'unstoppable force'
Ofcom, the independent regulator and competition authority for the UK communications industries, has published a series of essays by academics, politicians and regulators that examine the effect of convergence on the global communications sector. The book was launched at Ofcom's 2006 international conference in late 2006.
Both book and conference examine the impact of changes in technology, media consumption and market behaviour and assess the implications for regulation in the future. The essays identify a number of trends, which, it is argued, will render obsolete many established policy and regulatory frameworks and create new challenges for those responsible for protecting, and delivering benefits to, consumers.
Ofcom's international conference hosted more than 200 delegates from around the world. Keynote speakers and panellists included senior business leaders from broadcasting, Internet and telecommunications as well as regulators, politicians and academics.
In his address to the conference, Ofcom CEO Ed Richards examined a changing marketplace in which:
• established business models are under threat from disruptive technologies such as Voice over IP and Internet advertising;
• businesses and regulators must reflect and anticipate consumer demand for greater control, mobility and participation in media and communications; and
• new public policy debates arise around the scope of public service broadcasting and universal service obligations.
“Convergence is an unstoppable force which, in a few short years, will transform the conventions of commercial and regulatory practice,” said Richards. “They will pose questions and demand answers for every company, every regulatory agency and every government over the next decade.”
Communications – the next decade can be bought from The Stationery Office for £25 and will be available for download at: www.ofcom.org.uk

Can't buy me innovation
A select group of the world's 1000 largest corporate R&D spenders perform significantly better than their competitors over a sustained period while spending less on R&D than their industry rivals, according to management consulting firm Booz Allen Hamilton's second annual global innovation study.
The study found that although R&D spending of these 1000 companies rose last year by more than $20 billion, money simply can't buy effective innovation.
A group of 94 'high-leverage innovators' including Toyota, Apple, Christian Dior, Google and Caterpillar spend less than their competitors on research and development, yet consistently outperform their industry rivals across a broad set of performance measures.
Booz Allen analysed the world's top 1000 corporate research and development spenders – the Booz Allen Global Innovation 1000 – in what continues to be the most comprehensive effort to date to assess the influence of R&D on corporate performance. The study identified the linkages between spending on innovation and corporate performance, and uncovered insights into how organisations can get the greatest return on their innovation investment.
Details: www.boozallen.com

Shaping the Web
The Massachusetts Institute of Technology and the University of Southampton have launched a long-term research collaboration that aims to produce the fundamental scientific advances necessary to guide the future design and use of the World Wide Web.
The Web Science Research Initiative (WSRI) will generate a research agenda for understanding the scientific, technical and social challenges underlying the growth of the Web. Of particular interest is the volume of information on the Web that documents more and more aspects of human activity and knowledge. WSRI research projects will weigh such questions as: how do we access information and assess its reliability? By what means may we ensure its use complies with social and legal rules? How will we preserve the Web over time?
 Commenting on the new initiative, Tim Berners-Lee, inventor of the World Wide Web and a founding director of WSRI, said: “As the Web celebrates its first decade of widespread use, we still know surprisingly little about how it evolved, and we have only scratched the surface of what could be realised with deeper scientific investigation into its design, operation and impact on society.
 “The Web Science Research Initiative will allow researchers to take the Web seriously as an object of scientific inquiry, with the goal of helping to foster the Web's growth and fulfil its great potential as a powerful tool for humanity.”
The joint MIT-Southampton initiative will provide a global forum for scientists and scholars to collaborate on the first multidisciplinary scientific research effort specifically designed to study the Web at all scales of size and complexity, and to develop a new discipline of Web science for future generations of researchers.
Professor Wendy Hall, head of school at Southampton University School of Electronics and Computer Science and also a founding director of WSRI, said: “As the Web continues to evolve, it is becoming increasingly clear that a new type of graduate will be required to meet the needs of science and industry. Already we are seeing evidence of this, with major Internet companies and research institutions lamenting the fact that there are simply not enough people with the right mix of skills to meet current and future employment demands. In launching WSRI, one of our ultimate aims is to address this issue.”
WSRI will be headquartered at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT and at the School of Electronics and Computer Science (ECS) at the University of Southampton. Initial plans will focus on joint research projects, workshops and student/faculty exchanges between the two institutions.
The initiative will have four founding directors: Tim Berners-Lee, director of the World Wide Web Consortium, senior research scientist at MIT and professor at the University of Southampton; Wendy Hall, professor of computer science and head of the School of Electronics and Computer Science at the University of Southampton; Nigel Shadbolt, professor of artificial intelligence at the University of Southampton and director of the Advanced Knowledge Technologies Interdisciplinary Research Collaboration;  and Daniel J. Weitzner, Technology and Society Domain leader of the World Wide Web Consortium and principal research scientist at MIT.
www.webscience.org

MVNOs

Tony Wilson looks at the developing role of the MVNO and its latest incarnation in the form of the MVNE

It doesn't take a lot to realise that the mobile industry is firmly on the up again; and that it is changing beyond recognition. MVNOs, or Mobile Virtual Network Operators, have once again injected new hope into an industry battling with ever-increasing saturation levels and growing price pressure. What's more, there is renewed expectation that many more major brands will become associated with new converged communications services. 
However, the truth is that not every brand has got the technical know-how or the marketing power it takes to succeed. Figures published earlier this year by industry regulator Ofcom revealed that MVNOs now account for 5.5 million phone contracts out of a total 62.5 million in the UK alone. Nonetheless, by 2010, the market for mobile virtual network operators is forecast to generate service revenue of $10.7 billion, according to the Yankee Group. And some very strong brands like Disney have recently announced UK MVNO plans.
So, the question is, what has changed since the first wave of MVNOs to fuel the industry's confidence this time?

Next-generation MVNOs
Three essential factors have emerged since companies such as Tesco successfully started to compete with traditional operators. While the initial business model of creating new revenue streams without actually having to be an expert in the wireless industry still stands, today's MVNOs are far from being a licence to print money. In a saturated market like the UK it is not enough just to resell wireless voice or short messaging services under a new brand name.
Firstly, successful MVNOs will have to differentiate and offset these threats with new value-added services orientated around customer choice and a personalised customer experience. In order to achieve this, Next-Generation MVNOs, such as content providers and broadband companies, have realised that they need a unique value proposition and a strong brand image in order to remain competitive. Many MVNOs have accomplished this by choosing a particular niche market, such as business customers, and offering specific mobile services including data, content, gaming or other service innovations that appeal to their targeted customer base.
For example, before launching the service, BT Mobile (part of BT Retail) recognised the market for mobile services was extremely fast-moving and competitive, with corporate users having very different requirements from consumers. It was therefore essential that BT Mobile could tailor its service for business customers and bring to market the new service as quickly and effectively as possible.
Secondly, the recent merger between NTL and Virgin Mobile demonstrates that convergence has become the new driving force behind the new, more sophisticated generation of MVNOs. NTL's bold move to acquire Virgin Mobile promises to create a powerhouse in the rapidly converging telecom and media industries. For the first time, a company is able to offer the much talked-about quadruple play, combining mobile and fixed line telephone services, broadband and TV. In a similar vein, the success of the service for its mobile business customers led BT Retail to launch BT Fusion, its pioneering fixed-mobile convergence service, in February 2006.
Last but not least, the concept of the MVNE, or Mobile Virtual Network Enabler, has come of age. MVNEs act as intermediaries between the MVNO and the network infrastructure provider. An MVNE can provide the skills and systems to manage both the back office and the relationship with the network provider.
The intermediary function can go much further. An MVNE can also absorb the burden of interfacing with all of the other organisations including credit agencies, outsourced call centres and banks that are involved in delivering a service. This requirement is likely to increase as MVNOs consider converged services that depend on third party content providers.
Consequently the MVNO is free to focus on where it can add value to the service, whilst the MVNE absorbs all of the operational complexities of running a mobile virtual network business. 
Smooth operation of the MVNO ultimately also benefits the operator who is looking for a secure additional revenue stream. They are looking for an MVNO to present a compelling business case that serves a group of customers that the operators don't aggressively market to. The presence of an MVNE with the right skills and experience gives significant reassurance to an operator about how the MVNO can fulfil this potential and maximise its capital investments.

MVNE or not
One of the most important business decisions an MVNO has to make, therefore, is whether or not to use an MVNE. Many companies that are interested in becoming an MVNO lack the time, money, or expertise it takes to run the day-to-day back office processes.  Furthermore, many MVNOs are finding that it is difficult to manage all of the third-party relationships needed to successfully run its business. 
MVNEs like Martin Dawes Systems, itself a former MVNO, help organise and manage these relationships while also reducing risk and accelerating return on investment (ROI). In the case of BT, which chose Martin Dawes Systems for its business customer mobile service, the company needed to find a partner that could provide the complete span of billing and customer relationship management systems required, as well as bring to the operation the relevant experience and expertise of serving business customers, to ensure the smooth launch and ongoing operation of the service.
The other important decision any potential MVNO has to make is the business model – prepaid/pay as you go or postpaid. Early MVNOs were characterised by pre-paid, low-end customers, often not offering more than voice and SMS services. One of the best examples of this is Virgin Mobile UK which introduced itself with a pre-pay service – a service the mobile network operators were trying to move away from – and returned better revenue per user than most of the operators' subscribers. 
However, many of today's Next Generation of MVNOs (e.g. BT Mobile) are targeting more refined niche segments and vertical markets. Convergence offers opportunities to target such customers with innovative bundles that combine free or flat rate broadband with a mobile and fixed line service. This customer base expects and values a paper bill, live customer service, and a wide choice of pricing plans – areas where end-to-end MVNEs can leverage their distribution assets to help their MVNO clients acquire and service customers.
In this respect, MVNEs play a critical role in delivering a high quality customer experience in line with the MVNO's brand values and proposition. An MVNE with the right background in billing and rating processes can crack the underlying complexities of getting this task right, especially when the MVNO is offering special discounts and deals on a service. The consequences of not getting it right – late or inaccurate billing, or delays in implementing additional services on a customer account – quickly undermine an MVNO's whole market proposition.
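To make this concrete, below is a minimal sketch (in Python, purely illustrative) of the kind of rating step an MVNE has to get right: drawing down a bundle allowance before charging, then applying a promotional discount to the out-of-bundle portion. The tariff, allowance and discount figures are invented and do not describe any particular MVNE's system.

# Illustrative only: rate a call against a bundle allowance, then apply a promotion.
from dataclasses import dataclass

@dataclass
class Bundle:
    inclusive_minutes: int   # minutes included each month
    overage_rate: float      # price per minute once the allowance is exhausted
    promo_discount: float    # e.g. 0.10 = 10 per cent off out-of-bundle charges

def rate_call(bundle: Bundle, minutes_used_so_far: int, call_minutes: int) -> float:
    """Return the charge for one call, drawing down the allowance first."""
    remaining = max(bundle.inclusive_minutes - minutes_used_so_far, 0)
    billable = max(call_minutes - remaining, 0)
    return round(billable * bundle.overage_rate * (1 - bundle.promo_discount), 2)

# Example: 500 inclusive minutes, 10p per extra minute, 10 per cent promotion.
tariff = Bundle(inclusive_minutes=500, overage_rate=0.10, promo_discount=0.10)
print(rate_call(tariff, minutes_used_so_far=490, call_minutes=25))  # 15 billable minutes -> 1.35

Getting exactly this kind of arithmetic wrong – across millions of events and dozens of overlapping promotions – is what produces the late or inaccurate bills described above.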
Choosing an MVNE partner is a long-term strategic decision and requires clear business goals for the future. Experience of the first MVNOs shows they often start with a very simple business model based on prepaid. This is the foundation from which the MVNO then evolves to postpaid and, increasingly, to more complex offerings. It is therefore desirable that the MVNE can take the MVNO through these stages, as well as through other changes such as moving between network service providers.

The future of MVNOs
Quadruple play, as seen with NTL and Virgin, and initiatives such as BT Fusion are promising even more revenue-generating opportunities for MVNOs. Fixed-mobile convergence opens up the market to the numerous potential MVNO models that emerged previously but never reached their potential.
The requirement to enter markets quickly and effectively is putting pressure on MVNOs that MVNEs are ideally placed to relieve. Outsourcing the complex business of billing and customer care for converged services to an MVNE creates real flexibility for the MVNO. It allows a new service to be tested and rolled out rapidly, whilst always giving the MVNO the option to either curtail or extend the service based on how it performs in the field. Indeed the very best MVNEs are those that give their MVNO clients the opportunity to fine tune their service portfolio and be most responsive to changing market conditions.

Maturing eco-system
MVNEs are becoming part of a maturing eco-system in the converged communications industry. This already includes network providers and the branded service businesses that have a direct relationship with the customer. As these branded businesses proliferate and seek competitive advantage, the role of the MVNE will grow.
The potential agility gained by partnering with an MVNE could be applied in newer fields too. For example, it is realistic to expect that MVNEs will play a key role following the recent auction of 'mini-GSM' licences. New licence holders who want to bring to market new niche concepts like business-class campus networks are likely to turn to MVNEs with resources, capacity and skills to realise services rapidly.
Naturally, there is a rush of businesses positioning themselves as MVNEs, as well as brands evaluating the MVNO model. The criteria for success in the MVNE field must obviously include proven capability in running a network service business. But, as the MVNE pioneers are demonstrating, close collaboration between all players is absolutely essential to success. Specialist skills, experience and systems are of little use unless the MVNE has worked out how to share risks and responsibilities with its MVNO client and the other third parties involved. An open relationship with clean demarcations is critical.
Managed carefully and integrated successfully – quite often with the support of an MVNE – next-generation MVNOs certainly have the potential to generate even more excitement than they already do.

Tony Wilson is CTO, Martin Dawes Systems

Mobile entertainment

Service providers must address the opportunities and requirements of mobile entertainment throughout the entire content value chain, says Dr Graham Carey

The idea of delivering content to our mobile handsets is not new. For the last five years, 'power users' have downloaded text-based weather, travel and news information from mobile Internet or WAP sites. Those of us brave enough to try these services were either 'in the business' or classic technology enthusiasts. Users had to be patient and persistent, subscribe to 2G data services, and have the latest WAP-enabled handset. They also had to be forgiving of the seemingly ever-present technical glitches.
Today, things have become far more advanced. As the industry moves from a technology-led to a consumer-driven market, these early adopters are being joined by mainstream customers looking for meaningful services that are instantly available and competitively priced. The desire for relevant services has replaced the technology itself as the primary motivator for adoption. While handsets remain a significant fashion driver, for both consumers and service providers what now matters most is the range of services being delivered, how they are bundled and priced, and the value they offer. As this content market matures, new commercial realities are kicking in and users' expectations are rising. Customers are increasingly fickle, loyalty to service providers is low and value is at a premium. Furthermore, as consumers become more sophisticated, they know where and how to find a good deal.
Behind these services, the content asset management system must be able to deal with ingestion, cataloguing, search and discovery, whilst the content delivery eco-system must address critical issues including content repurposing and rights management. Rights management is not just about digital rights but also about knowing the legal rights to distribute and deliver content to different devices and in different geographies. Added to this, we must also know what royalties are payable, and to whom, for any consumption.
On top of these complications we are also seeing the emergence of more services that require real-time transactions and payments. With the rise of online gambling, for example, it is important that the consumer's e-wallet is managed in real time. This e-wallet could be a separate account, or it could in fact be an entertainment balance within a wider portfolio of balances that the user can access via the service provider. Interactive gaming is another example where real-time transactions and balance updates will be important: resources used in a game, which may not always be money-based, need to be updated and managed for the user in real time.
Whatever the combination of balances and bundles, it is clear the consumer has multiple options, and it is critical that we are able to track and bill for all of them. In summary, to make money in these extended and more complex value chains, all the revenues must be tracked and monitored as they happen.
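As a rough illustration of the real-time balance management described above – with the caveat that the balance names, priority order and policy are invented assumptions, not any vendor's design – a wallet holding several balances might authorise and debit a charge like this:

# Illustrative sketch only: a multi-balance wallet debited in real time.
# The balance names and the draw-down order are assumptions for illustration.
class Wallet:
    def __init__(self, main: float, entertainment: float):
        self.balances = {"entertainment": entertainment, "main": main}

    def debit(self, amount: float) -> bool:
        """Authorise and debit a charge across balances at the moment of consumption."""
        if sum(self.balances.values()) < amount:
            return False  # decline in real time: insufficient funds overall
        for name in ("entertainment", "main"):  # draw down in priority order
            take = min(self.balances[name], amount)
            self.balances[name] -= take
            amount -= take
        return True

wallet = Wallet(main=10.00, entertainment=2.50)
print(wallet.debit(3.00), wallet.balances)  # True {'entertainment': 0.0, 'main': 9.5}

The same pattern extends to non-monetary balances, such as game resources, which simply become further entries in the portfolio.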
Monetising content and value added services – and making sure each member of the value chain is compensated – is critical. To grow services and deliver value to the consumer, service providers must focus both on the content delivery and on the comprehensive order-to-cash process while proactively managing these new revenue streams. This requires more than simply charging or billing for a single piece of content. It is all about handling the complete revenue management lifecycle.
Billing and Revenue Management systems must address bundles of content from various sources, price content individually and/or as part of a wider bundle, instantly offer deals and promotions at the time of service consumption, and allow the user multiple and convenient ways to pay. As the service, content or bundle is delivered, the user must be charged and the service contributors and partners appropriately compensated through automatic settlement and remittance.  The mobile content arena will also have to address sponsorships and advertising with real-time deals and promotions. 
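To illustrate the settlement step in the simplest possible terms – the partner names and revenue shares below are invented examples, not real commercial terms – each charged amount has to be split among the value-chain contributors so that they can be remitted automatically:

# Illustrative sketch only: split a charged amount among value-chain partners.
def settle(retail_price: float, shares: dict) -> dict:
    """Compute each partner's remittance from an agreed percentage share."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 100 per cent"
    return {partner: round(retail_price * pct, 2) for partner, pct in shares.items()}

# Example: a 1.99 content download split between operator, content owner and aggregator.
print(settle(1.99, {"operator": 0.40, "content_owner": 0.45, "aggregator": 0.15}))
# {'operator': 0.8, 'content_owner': 0.9, 'aggregator': 0.3}

In practice the shares themselves may depend on the bundle, the promotion and even the moment of consumption, which is why settlement has to sit inside the same real-time revenue management lifecycle.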
Managing revenues associated with the delivery of new media content is not an entirely new concept. Forward-thinking service providers including Orange UK, Vodafone Live!, Telenor and Swisscom, as well as some ISPs, media companies and broadcast networks, have been more sophisticated in how they view the entire revenue management lifecycle for mobile content. Companies such as these are able to offer innovative and exciting pricing models and, as a result, are no longer limited to selling a mobile or Internet game based simply on the file size or some other physical parameter. Instead, they have the ability to charge in real time for a service based on other considerations, such as the number of transactions or usage.
For example, in an interactive gaming session the winner could be allowed to play the next level of the game for free. Similarly, free music downloads provided by a concert promoter could be made available prior to a concert and, by integrating the downloads with rights permissioning, playback after the event could be unlocked only if the user purchases the rights or attends the concert. The model could be extended to encourage further purchases by offering discounts on future downloads or service consumption based on today's purchase.
Peer-to-peer marketing and super distribution represent another potential revenue opportunity in the content value chain. Because the latest DRM standards allow super distribution, users who act in an 'originator' capacity can instantly claim affinity points or monetary discounts on their accounts when sharing and locally distributing content, in effect becoming a new channel partner for revenues.
As mobile entertainment moves to more sophisticated market-driven users, service providers must address the opportunities and requirements throughout the entire content value chain in an easy-to-understand manner. Properly handling the revenue management lifecycle in real-time is absolutely key to delivering these mission critical, and highly strategic, revenue generating services.                             

Dr Graham Carey is Director, Industry Solutions, Communications Global Business Unit, Oracle

Mobile content management

While adult mobile content presents a potentially lucrative avenue of revenue for operators, regulatory and marketing pressures have so far prevented them from engaging with the market fully, as Tom Erskine explains

With the explosive growth of mobile content, operators expect their revenues also to grow, as the subscriber appetite for content increases. Gaming, search, personal navigation and music markets are booming, and SMS and MMS messaging are now commonplace. Mobile operators now face increased competition as MVNOs enter a saturated marketplace, trying to attract subscribers with their specialised content offerings. 
There is one area with which many mobile operators have been reluctant to associate themselves, yet it remains one of the fastest growing areas for revenue: adult mobile content. Content providers are poised and ready to offer this content, but have found markets (especially the United States) tough to penetrate, as mobile operators have worked hard to distance themselves from 'racy' offerings.
The market for mobile adult content is estimated to grow to $90 million in the United States and $1 billion globally by 2008, according to Yankee Group, a Boston-based independent technology research and consulting firm. And recent figures from Jupiter Research predict that mobile adult video is expected to outpace text messaging by 2009, generating nearly $2.7 billion annually.
With such high financial projections, mobile operators can no longer ignore this revenue stream. Rather, they need to consider offering solutions and services to capitalise on the demand for this content without damaging their brand equity.
Historically, adult content on mobiles has carried a stigma arising from various outside pressures. There is the social and cultural stigma of downloading and viewing this type of content, and there are regulatory bodies placing controls on everything from radio to TV. Now, mobile is no exception.
This content may be more pervasive than originally thought. The latest US government-commissioned study found that about one per cent of web sites indexed by Google and Microsoft are sexually explicit. The study also found that the strictest filters were able to block 91 per cent of sexually explicit web sites in the indexes maintained by Google and Microsoft's MSN.
With the evolution of adult mobile content comes increased pressure from outside stakeholders, like government and concerned parents, to control and restrict access. Mobile operators must offer tools for improved access management for the ever-growing mobile teen market (12- to 17-year-olds) and the 'tween' market of 8- to 12-year-olds.
Inbound or outbound, the threat among these age groups is mounting.
• Outbound: Harmful contact and content are available through free browsing and third-party access to illegal or inappropriate sites, including adult content, adult gaming and gambling web sites.
• Inbound: Inappropriate blogging, solicitation, spamming and adult-oriented marketing are pushed to subscribers.
Despite the fact that the access is at the subscriber's discretion, it is usually the network that is deemed the responsible party. The question is no longer when but how to deliver effective, easy-to-use access and content controls that satisfy subscriber and consumer concerns, as well as operators' needs to grow revenue.
Internationally, government bodies have been regulating mobile content at various levels of restriction, with some success:
• The United Kingdom has a co-operative 'Code of Conduct', introduced in January 2004, for the categorisation and filtering of content.
• The Independent Mobile Classification Body (IMCB) was introduced later in 2004 to set a classification framework for commercial mobile picture-based content.
• Italy also has a 'Code of Conduct' – although entirely voluntary – covering premium services and the protection of children, which was signed by the major mobile operators in February 2005.
• Israel has gone a step further, with the Communications Ministry mandating the filtering of mobile content to protect local youths.
• In the United States, the Federal Communications Commission (FCC) is requiring solutions for mobile content under an industry security rating initiative, and the Cellular Telecommunications Industry Association (CTIA) developed a set of voluntary guidelines in 2005 around offering mobile adult content. Individual state governments are driving further legislation.
For mobile operators, it's not just a matter of filtering this content. Many filter solutions on the market today aren't dynamic enough to keep up with the emergence of new adult sites. The technology is available, but operators need to choose their solution provider wisely. Things to consider (see the sketch after this list):
Verification: Age verification and age registration are neither easily monitored nor controlled on a prepaid phone. In addition, mobile operators are struggling with the administration of a blacklisting service to provide better filtering.
Support: Support is needed across multiple touch points for each subscriber or group of subscribers. How does the solution handle browsing filtering, SMS, MMS, international gateways, and internal and third-party content servers all at the same time?
Service Quality: Does the solution impede the subscriber experience, slowing down the performance or interaction and preventing subscriber usage, thus decreasing potential mobile operator revenue?
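As a simplified sketch of the access-control decision these questions point to – the categories, profile flags and policy below are invented for illustration and are not drawn from any vendor's product – the core check combines a content category lookup with the subscriber's verified age status:

# Illustrative sketch only: combine a category blacklist with age verification.
BLOCKED_WITHOUT_VERIFICATION = {"adult", "adult_gaming", "gambling"}

def allow_request(url_category: str, age_verified_adult: bool) -> bool:
    """Return True if this subscriber profile may access content in this category."""
    if url_category in BLOCKED_WITHOUT_VERIFICATION and not age_verified_adult:
        return False  # block: restricted category and no verified adult status
    return True

print(allow_request("gambling", age_verified_adult=False))  # False
print(allow_request("news", age_verified_adult=False))      # True

The hard part, as the list above suggests, is keeping the category data current and applying the check across browsing, SMS, MMS and gateway traffic without degrading the subscriber experience.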

Striking a balance
The answer is to strike a balance among preserving the subscriber experience, minimising operational concerns and adhering to inevitable regulatory controls of government bodies.
Mobile operators have an opportunity to take the lead in addressing concerns about adult mobile content by providing subscribers with content controls as a value-added service, supporting subscriber growth and building loyalty.
Access management creates new market opportunities that can drive subscriber acquisition and revenue growth. Subscribers are able to configure interfaces depending on their preferences for management and reporting purposes. What's more, the marketing opportunities are endless.
How can mobile operators capitalise on such revenues without soiling their reputations with more explicit offerings? Forward-thinking mobile operators and MVNOs are taking advantage of vendor-provided solutions that address this nascent challenge. Meanwhile, content providers looking to profit from adult content continue to rely on subscribers to get it the old-fashioned way: by digging out their credit cards for impulse purchases. This approach leaves operators safely on the sidelines – and out of the revenue loop.

Tom Erskine is Vice President of Product Development and Marketing, bcgi.  www.bcgi.net