Growth drives long-term value, but what drives growth? The answer could make the difference between thriving and barely surviving in Western Europe's telecom industry. Asmus Komm and Sven Smit look at telecom growth opportunities, and explore eight megatrends and microtrends that can open new revenue pools to industry players.

McKinsey's research, detailed in ‘The Granularity of Growth' by Mehrdad Baghai, Sven Smit, and Patrick Viguerie, clearly illustrates how growth plays a crucial role in long-term profitability and survival.

Companies that grow at above-GDP rates are six times less likely, on average, to go bankrupt or be acquired. Furthermore, company growth that exceeds GDP expansion corresponds to a 28 per cent greater long-term total return to shareholders (TRS). Even ‘cash cows' deliver inferior long-term TRS on average, and are more likely to ‘die', if their growth is slow.

Unfortunately, analysts expect most large Western European telecom players to grow at rates below GDP, planting them firmly in the high-death rate category. This represents a dramatic turnaround from the industry's pre-2001 performance, when most of Western Europe's telco players enjoyed strong double-digit growth, driven primarily by the expansion of the mobile and broadband markets. Even during the transitional period from 2001 to 2005, most players grew in the two to four per cent annual range, largely matching GDP levels.

Telco incumbents that focus primarily on their core (home) markets will find it difficult to achieve growth rates that align with or exceed GDP. While nominal GDP in Western Europe is forecast to grow at 3.7 per cent per year through 2009, the telco core market, made up of fixed and mobile voice and basic data, will only grow at about 1.6 per cent annually. In addition, the average incumbent player still retains a high share of slow- or no-growth fixed voice revenues, which could limit its core telecom portfolio to an annual growth rate of only about one per cent.

In line with this general picture, capital markets do not predict much growth for Western Europe's telco incumbents, since current performance explains most of their entity value. Furthermore, the share that reflects expected performance improvement continues to decline, and now represents less than 10 per cent of total value.

So should telcos in Western Europe forget about growth, focus on the bottom line, and return cash to shareholders through dividends or share buy-backs? The answer is: it depends. As for many utilities, returning high dividends and pursuing share buy-backs can be a very viable strategy for creating value. On the other hand, expansive ‘growers', for example those pursuing emerging markets, have succeeded in finding valuable growth. For those that do consider growth, we have identified eight trends shaping the future telecom market that can support growth strategies.

Eight trends shape future telecom markets
Our research indicates eight telco megatrends will shape the West European telecom market through the end of the decade. These trends will threaten most incumbent business models, but will also give rise to substantial new growth pools. We will examine each of the eight trends in some detail.
Trend I: Convergence.  Convergence is much talked about, both in terms of fixed/mobile convergence and content/infrastructure convergence.  Some evidence suggests that fixed/mobile convergence could accelerate as technical and usage barriers disappear. Driven by Internet protocol (IP) proliferation, data and voice traffic will converge. While a huge potential for new products and business models will emerge, few additional revenue streams will be created directly for the ‘infrastructure business'. Telco players are likely to be challenged to monetise the additional customer value resident in the newly converged offers, as convergence can lead to more competition.
Trend II: The Commoditisation of Traffic. Competitive pressure on usage-based voice and data pricing will accelerate the shift towards flat-rate offers.  Incumbents will increasingly compete as access ‘pipes', and will likely find themselves unable to fully rebalance their declining traditional traffic revenues with higher access revenues unless markets consolidate far more than they have to date.
Trend III: Broadband Proliferation. Spurred by continuing price declines, fixed-line broadband penetration will continue to rise, but the additional revenue potential will be limited (without value-added services/content). Rising broadband penetration and usage also create an opportunity in nomadic and mobile data applications, leading to more revenues from WiFi hot spots and 3G data networks despite likely pressure on price levels.
Trend IV: Value-Added Services (VAS) and Content-Driven Traffic. Fixed and mobile broadband networks will enable a multitude of value-added services and new forms of content. These services represent substantial revenue potential beyond the traditional telecom business; examples include eHealth applications, gaming and gambling, and telematics. For telcos the challenge, as with the emergence of the Internet, is to capture revenues beyond higher prices for bandwidth.  In a sense, VAS is a diversification opportunity for telcos, though its success has limited proof to date.
Trend V: Reshaping the Value Chain. Both regulation and technological progress increasingly enable attackers to break up the existing integrated incumbent value chain, to compete on their favourite parts (eg, city/local access or call origination/termination via the Internet) and thereby put pressure on the most valuable pockets.  Investing in attackers abroad is the growth opportunity for telcos, as ‘at home' this trend challenges revenues.
Trend VI: Consolidation in Western Europe. Incumbents face limited organic growth opportunities in their core home markets. Large players will increasingly seek the opportunity to grow inorganically and to form global players by acquiring small and medium sized players in other markets. The development of the US market is a case in point.
Trend VII: Regulatory Focus on Wholesale Favouring Attackers. Regulators will continue to shift focus from retail to wholesale prices in order to foster competition. With mobile penetration approaching saturation levels, mobile operators with significant market power could face the same rigid regulatory pressure as fixed-line incumbents.
Trend VIII: Growth in Emerging Markets. Rapid economic growth in emerging markets (eg, in Eastern Europe, the Middle East, and Asia), driven by mobile voice and broadband, will remain a major revenue pool for telecom players following a geographical diversification strategy. The challenge is to find assets in this space that are not already fully valued.

From megatrends to growth pools
Based on the above eight telco megatrends, we identified over 20 growth pools with a collective growth potential of USD 63 billion from 2006 to 2009. The identified growth pools will grow from USD 121 billion in 2006 to USD 184 billion in 2009 (Figure 3).
Players should prioritise and select a growth portfolio based on these revenue pools.  To be effective, they should make this assessment based on the specific profile, positioning, and capabilities of a given company, and consider it in terms of three different dimensions:

Players need to assess growth opportunities in terms of barriers to entry and familiarity from their own individual perspective.  Barriers to entry include legal, regulatory or technological aspects, while familiarity reflects the extent to which a player already operates in or near a particular market segment. The most favourable growth pools combine a sufficient level of familiarity with some substantial barriers to entry that limit competitive pressure.

Growth opportunities differ widely in their size, expected EBITDA margins and required capex.  By ranking growth pools along these dimensions, players can identify growth ‘nuggets' with high margins at relatively low capex and, more generally, select the most favourable trade-offs of capex demand and likely operating margins.

The development of growth pools typically follows an S-curve characterised by a moderate start, a rapid uptake phase, and a moderate maturation phase. Incumbents should ideally move when the pool enters the uptake phase in order to leverage scale-up capabilities and to avoid over-paying.
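The S-curve described above can be sketched with a simple logistic function; the parameters below are arbitrary and purely illustrative of the shape, not drawn from the article's data:

```python
import math

def pool_revenue(t, cap=100.0, midpoint=5.0, steepness=1.0):
    """Logistic (S-curve) revenue for a growth pool at time t.

    'cap' is the mature market size; growth is slow at first, fastest
    around the midpoint (the uptake phase), then tapers off at maturity.
    """
    return cap / (1.0 + math.exp(-steepness * (t - midpoint)))

# Slow start, rapid uptake around year 5, maturation by year 10.
for year in (0, 3, 5, 7, 10):
    print(year, round(pool_revenue(year), 1))
```

The uptake phase, where year-on-year increments are largest, is where the article suggests incumbents should enter.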

These three dimensions should be applied as filters for potential growth initiatives to identify the most promising portfolio of growth pools based on a player's needs, capabilities, and assets.

Several industry players have seen the ‘writing on the wall' and are already moving into selected growth pools.

If players aggressively leverage the identified growth pools in their core businesses and especially in adjacent markets, total annual growth rates of 8 to 9 per cent appear within reach.

Total revenues can be fuelled by three sources: inorganic growth (ie, acquisitions), market growth, and organic share gains. Inorganic growth and organic share gains have historically contributed three to four per cent and one per cent of growth, respectively, and will likely remain at that level for the best-performing players. Therefore, in order to achieve growth rates significantly above GDP, players need to tap market growth pools, and our research indicates that determined moves into a well-selected portfolio of growth opportunities can deliver an additional four per cent, for an overall growth rate of eight to nine per cent.
Reaching these growth levels will not be easy, as they often represent strategic shifts for the companies involved. Players will have to weigh these possibilities against the alternative of dividends and share buy-backs, and against their own capabilities.

Asmus Komm is a principal and Sven Smit is a director in McKinsey & Company's Hamburg and Amsterdam offices, respectively. Sven Smit is also co-author of ‘The Granularity of Growth: Making choices that drive enduring company performance' (with Patrick Viguerie and Mehrdad Baghai) published by Marshall Cavendish and Cyan Books.

Successful data migration is vital to the effective transformation of telcos into lean and agile competitors in the communications marketplace.  Celona Technologies' Charles Andrews, CEO (left) and MD, and Tony Sceales, CTO, talk to Lynd Morley about overcoming companies' fear of failure, and the best way to achieve a winning migration

A data migration project that works, comes in on time, within budget, and without causing major disruption to the business may sound like an extraordinary piece of wishful thinking to those experienced in the pitfalls of the exercise (after all, Bloor Research puts the failure or overrun figure at 80 per cent among Forbes 2000 companies), but it is exactly what Celona Technologies CEO, Charles Andrews, stresses can now be delivered.

Andrews believes that at the heart of any successful data migration project is the clear recognition that migration is a business issue.  "Keeping the business aware and in control of the migration is the first and biggest challenge - but absolutely critical to getting it right," he says. "Simply put, most of the decisions that need to be made during the process require business rather than technical knowledge.  Sure, the analysts understand about data formats and interface requirements, but they can often only guess at what the data they are processing means to the business."

Successful data migration must surely be a central plank of any telco's plans to tackle the business transformation now so essential to survival in the highly competitive communications market.  Along with innovation and business process optimisation, transformation is certainly the dominant theme in the industry at the moment, and Andrews agrees that innovation is seen as a key differentiator for many businesses.
"Innovation is not a fad - it is here to stay," he says.  "It is a mantra that drives businesses, and will continue to move them forward.

"In the 1990s businesses became adept at sales and marketing, branding, re-branding and growth through merger and acquisition. With the support of the Internet, businesses opened up new global markets and the barriers to setting up a business lowered. This provided a host of new opportunities, but it also introduced a range of new threats - not least that increased numbers of competitors made differentiation harder, and premiums for particular products and services more difficult to maintain.

"Today, each innovation is scrutinised, copied and the advantage negated that much quicker - thanks to the power of the Internet-supported global market. GE's Jeffrey Immelt, for example, explains that now ‘constant re-invention is the central necessity...we're all just a moment away from commodity hell'.

"The ability to respond to change, to continually innovate and to get product to market quickly and reliably are the new hallmarks of business success. Or, in Rupert Murdoch's words: ‘big will not beat small anymore. It will be the fast beating the slow'."
Andrews who, before joining Celona at the beginning of this year, had worked with both IBM and Sun Microsystems, clearly has his eye firmly on the business issues, but is more than well grounded in IT.  He stresses that IT is a central player in a business' search for both innovation and differentiation.  "IT's critical role in supporting an organisation's innovation fitness was underlined in a recent survey conducted by Capgemini Consulting. The survey revealed that two-thirds of CIOs believe that IT is critical to business innovation, but only 25 per cent feel their IT function is actually driving business innovation. Capgemini's Eric Monnoyer, BIS Global Leader, comments that the requirement to balance operation and innovation is ‘a constant challenge' for CIOs, although the survey indicates that 60 per cent of CIOs believe it's possible to do both," Andrews says.

"Seemingly it's the old, old problem of how to have your cake and eat it," he continues. "CIOs are being asked to ensure that IT is continuing to function efficiently, to comply with legislation and regulation, and to be secure against an ever-wider range of threats. They're also expected to perform the usual upgrades, renewals and maintenance on legacy infrastructures, and to ‘manage' (as in maintain or reduce) IT budgets. But as if doing all of this were not enough, IT is now required to ‘innovate' to support businesses that are being fundamentally re-engineered for the new economy. All of which has far reaching effects on IT infrastructures, budgets and goals.

"So why aren't more IT departments supporting business innovation effectively? Well to some extent we have already answered this question. Many CIOs and IT departments are busy just keeping IT running and measuring performance against vital key performance indicators. Often IT is seen as a cost centre that needs to be measured, optimised and controlled, rather than as the powerhouse of business innovation. And CIOs may have little time or budget to innovate, due to the fact that such a large chunk of existing IT budgets, resources and staff are committed simply to keeping legacy infrastructure running. The scale of this problem was revealed in a recent white paper by Erudine's Dr Toby Sucharov and Philip Rice who noted that: ‘The cost of legacy systems [from industry polls] suggest that as much as sixty to ninety per cent of IT budget is used for legacy system operation and maintenance'."

Andrews believes that IT underpins the business process, whether it is the customer relationship management systems, the billing system, the provisioning system or whatever. IT can either be an enabler or an inhibitor. Frequently, he explains, different people in the business see the same IT system as both.

"This is a tough place for a CIO to be. If you want to re-align your IT to business needs there are two options: to tactically manage the issue (for example, by extending systems or by partial replacement of infrastructure) or to strategically redesign your infrastructure. While the second approach will yield the most benefits in the long run, in practice the first approach is taken by most companies. The migration of mission-critical applications and their associated data carries such risk, such a degree of difficulty, and such a poor track record of being delivered on time or to budget, that businesses shy away from this approach. The compounded effect of using a tactical approach to solve legacy IT problems over a number of years is the unbelievable complexity that is now responsible for sucking IT budgets dry.
"We now have a seemingly intractable ‘chicken or egg' conundrum of innovation versus operation," Andrews notes, but stresses that a solution to this problem is offered by the new generation of application migration technology that is coming to market.
"So-called ‘third generation' migration solutions are very different from preceding generations of migration technology. Notably, they are highly adept at dealing with the thorny problem of business logic held in legacy systems and are flexible enough to enable ‘business-driven' migrations. CIOs that have employed this technology have achieved business-driven application migration and consolidation projects on time and to budget. They are benefiting both from a lower legacy infrastructure cost and the ability to offer new products and services to their customers - supporting innovation and opening up new revenue streams.

"Take early adopter BT, for example, who wanted to migrate the legacy billing system that supported its Featurenet customers to Convergys's Geneva system, but who also desired a ‘completely seamless transition' to the new system. It achieved a successful migration in just six months (a full 13 months ahead of schedule) using the Evolve tool from Celona Technologies. BT Retail has since credited the successful project with creating more than £148 million in new revenues, thanks to its ability to launch innovative new services to Featurenet customers.

"Third-generation migration technology could be the CIO's best friend - the key to unlocking the budget and resources trapped in legacy systems, by enabling effective, low-risk application migration and consolidation. And, by significantly reducing both the risk and cost of consolidating and renewing legacy infrastructure, it allows more resources and effort to be targeted at innovation."

Keenly aware of the trends now fashioning service delivery in the telecoms sector, Andrews highlights the importance of successful data migration to the effectiveness of the new delivery platforms.

"Two main trends are clearly emerging," he says. "The first is the standardisation of components in the SDP (Service Delivery Platform), as opposed to bespoke development, and the second is that the key adoption drivers are now commercial rather than technological. 

"Business is demanding that technology should not inhibit change. SDPs promise vital competitive advantage, enabling service providers to roll out new services, faster and cheaper than before. However, realising all the benefits offered by SDPs also requires service providers to have the key application data in the right place. The move to a standardised set of applications, with more re-use of functionality, means that the application data will need to be moved into the new applications.

"The traditional way of moving this data involves either people-based techniques or primitive data migration using extract-transform-load (ETL) techniques," Andrews explains. "Unfortunately, the downsides of these approaches - such as the inability to scale or to respond to changing business requirements, as well as high cost - are diametrically opposed to the reasons for implementing an SDP.  SDPs put the business in control rather than the technology, which means that data must be where and when the business needs it to be, rather than something that the technology controls.

"The key to delivering the vital benefits provided by SDPs is, therefore, the ability to move critical data on time, without loss of service and without spiralling costs and budgets. A survey we conducted amongst IT management revealed that 60 per cent of respondents thought a principal cause of failed migrations is that data complexity and cleanliness are poorly understood; 36 per cent said that they did not think they would be able to get some or all of their data across.

"These challenges cannot be ignored: data migration needs to move into the era of SDPs and SOA (Service Oriented Architecture) - with re-usable standard components, and with the business directing the use of the technology."

Celona, Andrews believes, can answer both the concerns of the surveyed IT managers, and the vital needs of service delivery. "Data migration is our core competence, and we have gone back to basics, beginning with a definition of the types of migration. There are five possible approaches that can be applied to a migration - 1) Don't migrate; 2) Event based; 3) Incremental; 4) Bulk-load; 5) Big-bang. No single approach fits every project's requirements: any programme of transformation must ensure that a range of approaches can be delivered. Celona is able to deliver each approach and can adapt and change between approaches depending on requirements.

"Even a single project may move through a number of approaches over time or even combine approaches, in parallel," he continues. "For example, to get a new customer service up and running, without delay, an enterprise might decide to go with a ‘Don't Migrate' approach initially. Some information may be synchronised with the existing systems, eg revenues written back to the old accounts receivable system. Following the launch and trial, with new customers, existing customers who take up the new service are migrated with their old service information on an event-by-event basis. After the new systems have stabilised, then incremental or even bulk-load strategies might be added, whilst continuing to migrate individual customers as each new service order is received."
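The phased combination of approaches Andrews describes can be sketched as a simple plan structure; the names and layout below are hypothetical illustration, not Celona's actual tooling:

```python
from enum import Enum

class Approach(Enum):
    """The five migration approaches listed above."""
    DONT_MIGRATE = "don't migrate"
    EVENT_BASED = "event based"
    INCREMENTAL = "incremental"
    BULK_LOAD = "bulk load"
    BIG_BANG = "big bang"

# The example programme: launch the new service without migrating (syncing
# some data back to the old systems), then migrate customers event-by-event,
# then layer in incremental or bulk-load strategies once stable.
plan = [
    ("launch and trial", [Approach.DONT_MIGRATE]),
    ("early adoption", [Approach.EVENT_BASED]),
    ("steady state", [Approach.EVENT_BASED, Approach.INCREMENTAL,
                      Approach.BULK_LOAD]),
]

for phase, approaches in plan:
    print(phase, "->", ", ".join(a.value for a in approaches))
```

The point of the sketch is that the approach is a property of a phase, not of the whole project, and several approaches can run in parallel within one phase.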
Conscious of the uphill battle Celona might well have in persuading companies that data migration need not be the agonising (and ultimately unsuccessful) undertaking many imagine, the company has refined what it describes as the ‘Four golden rules of data migration' - describing the common characteristics shared by proven, successful data migrations. 

Tony Sceales, Celona's CTO, explains: "The first two rules stress that data migration is a business issue, and that business knows best.  Putting business in the driving seat means that before we ask ‘how do we migrate data?' we first answer a series of important related questions that help to frame and scope the project.  These are: ‘Why are we migrating data?'; ‘What data should be migrated?'; and ‘When should it be migrated?'.  These questions cannot be answered by technicians, but only by business managers."
Sceales goes on to stress that ensuring the business makes the decisions and drives the project also frees up IT to do what it does best - the technical aspects of moving the data.
"At the same time," he adds, "the second rule stresses that business drivers, not technical ones should take precedence.  It is critically important that business goals should define the solution and approach selected, and not the other way around.  To be successful, the chief business stakeholders must not only define their requirements, but must also take responsibility for driving the project."

The third ‘golden rule' states that no one needs, wants or will pay for perfect data.  Sceales explains: "While enhancing data quality is a worthwhile goal, it's really important not to go off on a tangent mid-project in the quest for perfect data quality.  Data owners and users need to determine the level of quality they require at the start of a project so that the technologists have an appropriate goal to aim at."

The fourth rule also addresses data quality, noting that ‘if you can't count it, it doesn't count'.  Again, Sceales explains: "The challenge is how to measure data quality in order to assess the state of your legacy data and determine the level of quality your business users require.  To make matters worse, data quality is not static, but erodes and improves over time.  It's really important that the measures used make sense to business users and not just to technologists.  This allows deliverables to be measured, gap analyses to be performed, and ongoing data quality to be monitored and improved."
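A minimal, hypothetical illustration of the fourth rule: a metric business users can count, here a simple completeness score over invented records (the fields and data are assumptions for illustration only):

```python
# Invented legacy records; None or "" marks a missing value.
records = [
    {"part_no": "A-100", "vendor": "Acme", "location": "LDN-01"},
    {"part_no": "A-100", "vendor": None,   "location": "LDN-01"},
    {"part_no": "",      "vendor": "Acme", "location": None},
]

def completeness(records, required_fields):
    """Fraction of records in which every required field is populated."""
    complete = sum(
        all(r.get(f) not in (None, "") for f in required_fields)
        for r in records
    )
    return complete / len(records)

score = completeness(records, ["part_no", "vendor", "location"])
print(f"completeness: {score:.0%}")  # only 1 of 3 records is fully populated
```

Measured this way, a quality target ("95 per cent of records complete") becomes a deliverable that can be tracked over the life of the migration, rather than a vague aspiration.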

Celona is a small company, very much at the forefront of solving a big problem - a position that Charles Andrews is clearly very proud of.  "Data migration," he says, "is all about getting the data in the right place at the right time, and we are solely focused on this.
"We have built a platform, a method and experience/best practice which can deliver the promise by managing the detail and allowing the business to decide on the speed of the migration.  We are calling it progressive migration - it could be called migrating at the speed that the business needs to be able to drive innovation and new products and services into the market."

What's in a name and why the need for a Common Language? Well, the difference between profit and loss, for a start, says Allen Seidman

The educated layman might wonder what semantics - essentially the study of the meaning of words - has to do with our industry's business of moving speech, content and data around the world in ever faster and cheaper ways.

The answer lies in the fact that unless we can employ standardised methods to define and describe the many components, attributes and functions that make up the world's networks, then we'll be left floundering in an unmanageable sea of proprietary definitions and descriptions that will make the Tower of Babel sound simple by comparison.
This certainly isn't a new problem in human history. While many creation myths contain a common tale of the first men naming plants and animals, neither the physical nor life sciences would have progressed without standardised systems of measurement or nomenclature. Given that the future success of our industry relies on combining ever-larger numbers of devices, servers and network elements into a single functional entity, the need for a rational common language that can describe all these assets in consistent and meaningful ways is essential, not optional.

This issue, however, isn't just of concern to those focused on the back office or on engineering operations. Without the ability to clearly define and communicate about equipment, network connections, locations and services, we'll be completely unable to implement the wider shared ‘Information Infrastructure' concepts that cross-boundary services based on Web 2.0 rely on. How can different applications, content and network owners share information and hardware assets to create meshed and merged services without common ways of defining and communicating about those assets? Or even know which geographies they reside in? Integration costs even within a single company can be an onerous overhead, and the absence of standard terms makes inter-company cooperation even more complex and time wasting.

Service providers have certainly already spent many hundreds of millions of dollars over the last decade on trying to rationalise their network inventory systems and find better ways of extracting the maximum value from their fixed assets. Anecdotal evidence suggests that some operators had, in the past, even managed to ‘mislay' significant chunks of their physical networks through poor record keeping, incompatible data formats and the departure or retirement of experienced engineers. Additional surveys from organisations like the Yankee Group show that ‘dirty' and inaccurate data will collectively cost US and European operators a total of $6.3 billion each year by 2010, with knock-on effects that impact heavily on provisioning, fault finding and network engineering. All too often, expensive human resources are wasted trying to resolve problems caused by misleading data records.
Significant too in this drive for a clearer understanding of the cost/performance issues of the infrastructure itself have been the various TM Forum initiatives such as the enhanced Telecom Operations Map (eTOM) and the Shared Information/Data (SID) model. Although also heavily focused on the semantics of our industry, these have increasingly been adopted by both service providers and vendors to solve real-world problems and remove unnecessary costs from both business relationships and operations engineering.   While extremely valuable, these approaches only go part of the way to solving the overall problem and are insufficiently granular in depth and detail to drill down to the kinds of elemental component details that are really required. Ultimately, poor network equipment information impacts in numerous ways across almost every aspect of service providers' operations - and on their relationships with their vendors, customers and partners.

But how exactly does - or should - a common language for describing components work in the telecoms industry? Can we quantify the benefits to service providers of using a standardised naming system for the multiplicity of different elements - inevitably from different manufacturers with inevitably different naming conventions - that have to be combined to create a network and deliver a service?

Consider one of the most basic, bottom-up building blocks of any communications infrastructure - the humble plug-in card or blade. Each will inevitably have multiple markings such as part numbers, revision numbers, com codes and stencilling - but might not even carry the name of the actual manufacturer or vendor. To add to the potential for confusion and hassle, most boards have to be removed from their racks before their provenance can be properly identified, with all the knock-on impact that has on interrupted services.
Even where a local ‘asset tag' is created by an individual service provider, in an attempt to standardise their own inventory management systems, these often result in multiple representations of the same device type, adding still further to the overall confusion and waste. If service providers are unable to correctly identify equipment, then they leave themselves open to confusion about their available inventory, their assets can become effectively ‘stranded' and they will often have to invoke costly data synchronisation, reconciliation and stocktaking measures to resolve these problems.

That's a concept that extends not just to the equipment or its location, but also to how it is connected to other equipment, what the significance or ‘context' of that connection is, and what might happen to it if there's a requirement to swap out equipment at either end. Clearly, if we take into consideration equipment, its location and the connections between them, then we're talking about the need for the entire ‘Information Infrastructure' I mentioned earlier. And that's far-reaching stuff, especially when you take into account that analyst firms such as Stratecast are already referring to Information Infrastructure as ‘the next OSS battleground'.

For example, Arun Dharbal, SVP Communications Industry Solutions, SAP, has said to me that SAP continues to see and enable an increased focus on the management of information across the enterprise. Its Master Data Management solution embraces this trend with the capability to synchronise, and unlock the value of, information across a spectrum of systems and data management areas, transcending the product, customer, asset and service domains.
However, the wider Information Infrastructure issue can also manifest itself in the naming problem I mentioned earlier, impacting heavily on the vendor community. Although many service providers use the proprietary naming conventions of each of their vendors, this approach creates its own kinds of confusion. Because manufacturers have their own multiple drivers for identifying equipment - such as tracking manufacturing changes, marketing, ordering and so on - there is very rarely a strict one-to-one relationship enforced between equipment type and part number. For example, suppliers might use multiple different part numbers for a single equipment type sold across different geographical markets. Conversely, multiple equipment types can be represented by the same part number and revision codes - making it impossible to rely on vendor-supplied part numbers and revisions to uniquely identify equipment types both within and between suppliers.

There is huge variation amongst vendors in how they record their own equipment. As a result, there are no clear guidelines on how revision and interchangeability information should be interpreted. The greatly differing formats also make it impossible to create a normalised equipment identification mechanism based solely on a vendor-assigned part number and/or revision codes. This concept of a normalised equipment identifier is central to any sensible model for equipment tracking, as it ensures that different equipment types can still be tracked, and interchanged where appropriate, as equipment revisions occur.
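To make the idea concrete, here is a minimal sketch of what a normalised identifier mapping does. The vendor names, part numbers and equipment identifiers are all invented for illustration; a real registry would hold millions of entries and carry revision and interchangeability data alongside each mapping.

```python
# Illustrative sketch: resolving vendor part numbers to one normalised
# equipment identifier. All names and numbers here are hypothetical.

# Many-to-one and one-to-many at once: two part numbers can denote the
# same equipment type, while the same part number string can mean
# different things at different vendors.
NORMALISATION_TABLE = {
    ("VendorA", "PN-1001-EU"): "EQ-MUX-16",
    ("VendorA", "PN-1001-US"): "EQ-MUX-16",   # same card, different market
    ("VendorB", "PN-1001-EU"): "EQ-AMP-04",   # same number, different vendor
}

def normalise(vendor: str, part_number: str) -> str:
    """Return the normalised equipment identifier for a vendor part."""
    try:
        return NORMALISATION_TABLE[(vendor, part_number)]
    except KeyError:
        raise ValueError(f"Unmapped part: {vendor}/{part_number}")

# The two market variants resolve to a single trackable equipment type.
assert normalise("VendorA", "PN-1001-EU") == normalise("VendorA", "PN-1001-US")
```

The essential point is that the key is the (vendor, part number) pair, never the part number alone: only the combination is unambiguous, which is exactly why a shared registry is needed rather than per-vendor conventions.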
Adding to this complexity is the issue of how equipment information is actually managed and distributed by the vendors. Numerous non-standardised formats are inevitably used, with important documents stored in Microsoft Word and Adobe PDF formats, distributed on CDs and hard-copy paper manuals, or distributed through web links. With each supplier providing different sets of attributes and attribute values, it becomes difficult to model the information with any consistency, often requiring complex mapping algorithms.
For an industry that's betting its future on moving towards ‘just-in-time' service creation and provisioning principles, many service providers are still stuck managing their inventories with very 20th century technologies. While we now have a range of tools available to help with data capture and warehouse management - such as linear bar code labels, standard 2-D symbology labels, RFID tags and autodiscovery - these will only work efficiently if they are supported by standardised equipment information formats.

Although some good work has already been done in this area by various industry bodies, the actual implementation of these standards has to be carried out by the individual service providers, network owners and vendors. It's here that the realities on the ground often interfere with good intentions from above, resulting in multiple standards-based proprietary identification mechanisms with heterogeneous part number and revision variances.
On top of this, both vendors and operators often end up wasting valuable time and energy by forming cross-business teams of engineers and procurement staff in attempts to create internal naming and identifier conventions. With each service provider and vendor around the world attempting to deal with this complexity in their own way, the management overheads involved place a very heavy burden on an industry already trying to streamline its operations as much as possible.

Tony Gladden, Director of Products and Technology at SITACORP, has told me that data problems aren't new, but that as the global marketplace demands more and more automation across corporate boundaries, data issues become more visible, manifesting themselves in exception processing that creates higher costs and lost revenue in areas like invoice reconciliation, order fulfilment, shipping and receiving. So, clearly, this is an issue that lies at the heart of operators' ability to make progress.

To resolve these problems, Telcordia developed its Common Language® Information Services initiative, which has grown in recent years to become the industry's default centralised information registry and clearinghouse, providing the structure, format, unique and meaningful identifiers, syntax and common language registries needed to reduce operational and capital expenses. With nearly 100 service providers and 1,000 equipment vendors now using Common Language, there is hard evidence of service providers reducing their master data administration and maintenance costs by as much as 90 per cent for equipment, location, connection and service master data. Significant savings are also made in supporting areas such as spares inventories, systems integration and network utilisation.

These benefits extend beyond global operators to the firms that they serve. SAP's Arun Dharbal says that enterprises are keen to exploit synergies between organisations. In fact, SAP's work with key industry solutions such as Common Language gives its customers a single view of their assets across the corporation. Ultimately, this challenge needs to be addressed on a broader basis than the individual enterprise - the industry needs to adopt a strategy for managing information across corporate boundaries, as the information processes of each operator are intertwined with a broader ecosystem of trading partners and equipment vendors.

Although we are all at the beginning of solving the Information Infrastructure issue, and its size dictates that a Common Language won't be implemented overnight, there are operators and enterprises that are making rapid progress. Daniele Fracasso, Common Language Director at Telecom Italia, for example, says that Common Language has helped Telecom Italia achieve up to 95 per cent flow-through in its operations, reducing its cost of systems integration.

Colin Orviss, Senior VP at Patni Telecoms Consulting, emphasises the benefits to the entire industry - operators, systems integrators and vendors - of Common Language. He says that it takes standards to a whole new level by providing a unique global implementation that no individual systems integrator or internal IT department can achieve. And that's pretty powerful.

The last couple of decades have seen an explosion in the complexity of an already complex industry. What was once a largely closed community of national operators and vendors now includes members from the broadcasting, IT and consumer electronics sectors - each with their own specific ways of identifying and managing component equipment. If the industry ‘previously known' as telecoms doesn't put its house in order soon, a lot of the power of new technologies and business strategies will remain mired in the complexities of managing increasingly heterogeneous networks and systems, creating its own internal Tower of Babel that does little for its customers' own needs to communicate.

Allen Seidman is Vice President, Business Development and Marketing, Telcordia, and can be contacted via: aseidman@telcordia.com

Ethernet just keeps on going from strength to strength, but with the growing demand for services and deployments, service providers must focus on effectively managing an increasingly complex network, says Chris Chartrand

Ethernet has continued to evolve and overcome new challenges since the dawn of the age of networked computers. What started in the LAN as a simple, high-speed (10Mbps) broadcast technology for interconnecting computers has evolved from its original copper wire format to cover optical fibre and wireless physical media at speeds up to 10Gbps, with 40Gbps and 100Gbps already on the horizon. More recently, an industrial-strength version of the technology called Carrier Ethernet has emerged as a reliable and cost-effective way for service providers to offer enterprise WAN data connectivity, wireless backhaul and other telecommunications transport services. The next big challenge facing both Ethernet technology and service providers is how to cost-effectively manage these new Carrier Ethernet networks and services.

By remaining true to its heritage of being simple, high-speed and highly standardised yet flexible, Ethernet has continued to win the battle against competing technologies in the LAN. To become carrier-grade, Ethernet has evolved to add service scalability, quality of service (QoS) and operations, administrations and management (OAM) capabilities and standards to the protocol. Today, Carrier Ethernet is fast supplanting the traditional data services delivered to enterprises and other large organisations. In fact, Ethernet is starting to replace ATM, Frame Relay, and TDM-based T1/E1 and T3/E3 circuits in the enterprise data connectivity services market.

Recent industry analyst reports support this view. Infonetics says that service providers reported 90-100 per cent increases in their Ethernet traffic in 2006 and 2007 and projects worldwide Ethernet services revenues to grow to $69.2 billion by 2009.  Heavy Reading, a research firm that closely tracks the equipment market for the gear used to build the networks to support these services, predicts the global market for Carrier Ethernet Switch/Routers (CESR) will grow at a 25 per cent CAGR over 5 years to $3.76 billion in 2010.
The case for Carrier Ethernet is easy to make: it is scalable, reliable, fast to deploy and offers compelling cost savings. The pervasiveness of Ethernet technology has led to high volume components, which has delivered an extremely low cost per bit. The emergence of Carrier Ethernet is happening at a time when companies are experiencing an explosion in the volume of data they generate. For some, content is now measured in terabytes rather than gigabytes. And the volume of enterprise data is expected to grow exponentially, driving demand for ever-increasing amounts of bandwidth to move large volumes of information around a corporate campus or across the globe.

These next-generation Carrier Ethernet infrastructures will provide scalable bandwidth, support competitive new services and reduce capex and opex for service providers. To accomplish this, they are deploying Ethernet devices from multiple vendors in their access, metro and core networks using a variety of technologies that include Ethernet over copper, HFC, optical, TDM, IP/MPLS and wireless. And this is where the next challenge lies: while there is growing demand for services, and deployments are underway, how can service providers effectively manage an end-to-end service that traverses an increasingly complex network comprised of so many different pieces of equipment and technologies?
To offer broad coverage of Carrier Ethernet services, service providers often have to revamp and reconfigure their infrastructures to ensure end-to-end integration. Delivering even the most basic service fulfilment function requires orchestrating a complex web of multi-vendor networks transporting data across multiple domains (optical, Ethernet, IP/MPLS). Wherever possible, service providers must reduce back-office OSS/BSS and data transport complexity, optimise overall network management and "sweat existing assets."

Although carriers talk about ‘the network' as if their infrastructures are one monolithic system, in reality most service providers run their business and their Carrier Ethernet services on a hybrid mix of legacy (TDM, SONET/SDH) and next-generation (IP/MPLS, Ethernet, WDM, WiMAX) technologies and equipment from a wide variety of network equipment providers (NEPs). Often, each device has to be manually interfaced into the network. Adding a new device type or application to the mix typically requires upgrading both hardware and software across the entire telecom network system - not an easy task considering that many of today's new services and networks, including Carrier Ethernet and wireless backhaul (see box), are built with a complex mix of products.

The challenge of managing these multi-vendor networks can be huge. Thousands of network devices from multiple vendors using multiple technologies must be orchestrated to function together smoothly. Traditional element and network management systems (EMSs / NMSs) are often inadequate to the task. Indeed, EMSs are really only a partial solution.  They tend not to be open, secure, or scalable, and they do little to minimise network complexity. To provide a single new Ethernet service across multi-vendor, multi-domain systems requires stitching together a tangled web of networks and applications - potentially spanning hundreds or thousands of network nodes using many different software interfaces.
"Today, the real challenge is how to fit all these neat technology pieces together - and do it in a reliable, scalable, and cost effective way," says Dan Baker of Dittberner Associates.  "The industry consensus is that cobbling together management software in the traditional way is no longer effective." (Dittberner recently issued a report on this topic, "The Telecom Integration Middleware Market: Network & Element Management, Semantics, SOA & Interconnect Solutions" which can be found at http://www.technology-research.com/mid.htm.)

For service providers, the traditional approach to the problem has been to use a mix of management software point solutions.  First, an EMS is deployed from each of the different equipment vendors in their network. Since these are vendor-specific, typically a configuration system is also deployed to help fill in their weaknesses and to scale up the network infrastructure build process across hundreds or thousands of devices. 
Next, a service activation system is deployed to achieve automation and flow-through provisioning. Finally, all these systems must be integrated into the inventory system. Since inventory is typically the only centralised view of the network, carriers often cram in as much detail as possible, resulting in overly complex inventory models that slow the integration process. Every time new devices are added to the network, interfaces must be rebuilt and tested against both the service and the underlying EMS layer - a process that can easily take 18 months to complete.

Increasingly, the answer to this problem will be pluggable EMS/NMS frameworks that solve most of the multi-vendor, multi-domain issues up front. "Carriers need a common mediation framework that allows solutions to plug in and be configured more easily," Baker says.
Next-generation Carrier Ethernet and wireless backhaul networks, among other methods of transport, require a new type of network and element management system that can provide a universal mediation function that hides the complexity of the different technologies and vendors' equipment below it and offers a single interface to higher layer back office OSS/BSS systems. 

This universal mediation layer allows higher-level applications or services to be developed independently while simultaneously enabling new network equipment to be introduced and updated beneath it. This type of architecture helps service providers roll out their next-generation services and network infrastructure much faster in multi-vendor build-outs such as in Carrier Ethernet, wireless backhaul, residential broadband, triple play and IPTV networks.
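The shape of such a mediation layer can be sketched as a classic adapter pattern. The class and method names below are hypothetical, not taken from any vendor's product: each vendor's native interface is wrapped behind one common model, so OSS/BSS applications never talk to a device directly.

```python
# Sketch (hypothetical interfaces): a mediation layer exposing one common
# provisioning call over per-vendor adapters.
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """Per-vendor adapter: translates the common model to native commands."""
    @abstractmethod
    def provision_port(self, port: str, vlan: int) -> str: ...

class VendorXAdapter(DeviceAdapter):
    def provision_port(self, port, vlan):
        return f"vendorX-cli: set {port} vlan {vlan}"       # CLI-style device

class VendorYAdapter(DeviceAdapter):
    def provision_port(self, port, vlan):
        return f"<vendorY><port id='{port}' vlan='{vlan}'/>"  # XML-style device

class MediationLayer:
    def __init__(self):
        self._adapters = {}

    def register(self, vendor, adapter):
        self._adapters[vendor] = adapter

    def provision(self, vendor, port, vlan):
        # One call for the OSS, whatever the device speaks underneath.
        return self._adapters[vendor].provision_port(port, vlan)

mediator = MediationLayer()
mediator.register("X", VendorXAdapter())
mediator.register("Y", VendorYAdapter())
```

Adding a new vendor then means writing and registering one adapter, rather than re-integrating every higher-layer system against a new device interface.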

"Service providers are clearly looking at multi-vendor EMS platforms," Dittberner's Dan Baker says. "Operators would love to take some of their heavy integration costs out of the equation.  It's a matter of taking the data model native to each network element and mapping that to a common model, thereby improving the overall ability to provision and monitor the network. By definition, then, a multi-vendor management platform must be good at bridging the network platform and OSS/BSS worlds."

If it contains a meta-data model that understands relationships across multiple NEs and OSS functions, an abstraction and mediation framework can also deliver a larger and highly valuable network management view that transcends all equipment and technologies. "As it is, network managers often find themselves flying in the dark," Baker says. "If you want to optimise your network resources, you have to be able to see what you are dealing with. There may be substantial network assets that are not being put to the best use."
For Carrier Ethernet to truly deliver on its promise, service providers must address the challenge of trying to scale their management solutions to deploy, remotely manage and provision their Carrier Ethernet networks, while also facilitating the rapid integration of new devices into their existing OSS/BSS.

Fortunately, an entirely new approach to managing disparate networks is emerging, one that combines the capabilities of an EMS and an NMS for multi-vendor, multi-technology networks, with a carrier grade scalable and secure architecture built using open, standards-based interfaces. An effective vendor-neutral mediation and abstraction layer helps by providing a hierarchical view that abstracts away the details of the network elements - limiting what the inventory system needs to know about the network.

Under this scenario, when a carrier wants to add a new Ethernet supplier, the carrier arranges for the supplier to integrate with the vendor-neutral architecture framework, a step which greatly simplifies back office integration. Nakina Systems is a leading independent software vendor that offers this type of solution with its Nakina Network OS product. In real service provider deployments of the solution, the roll-out of Ethernet services has been completed 50 per cent faster than with the traditional approach.

Chris Chartrand is Director of Marketing, Nakina Systems. www.nakinasystems.com

The price of Internet-connected video devices (webcams) is falling rapidly while their overall features and reliability are rising. As the demand for video surveillance increases, many enterprises are looking at ways to use webcams for that purpose. Justin Bewick explains why webcams are a suitable solution for modern surveillance needs, and argues that deploying 802.3af Power over Ethernet (PoE) in combination with high-quality structured cabling can optimise their use

With the cost of employing security personnel increasing, video surveillance is becoming an ever more vital tool in maintaining security of business premises and people or material within them. Video surveillance is also an important tool for intruder detection and identification of unauthorized activities by employees, contractors and visitors. In combination with other security measures such as intercoms and electronic door locks, it can provide secure access to restricted areas, ensuring that only authorized personnel are admitted.

Video surveillance is not just useful from a security point of view. It can also be used as part of the environmental management of the building, switching on (and off) lighting and heating and verifying that doors, windows, etc. are shut. Likewise, it may be used for safety reasons, detecting, for example, whether lifts are occupied or not.

The key to all these diverse video surveillance applications is price and ease of deployment. Webcams - that is to say, video devices connected to the enterprise's IP data network - are ideal in this respect, as they are generally significantly cheaper than traditional CCTV and easier to deploy and move.

Their main drawback - one often shared by traditional CCTV devices - is that they require a separate power supply in addition to the data cable, which in turn means that positioning depends on the ability to connect to the building's power. That problem is made worse by the fact that many webcams have a separate mini transformer that must be plugged in.
Traditional CCTV systems require connection to a dedicated cable, generally coax. Not only does the requirement for a dedicated wiring network conflict with standard cable management practices such as structured cabling, but coaxial cable is no longer in common use and is typically some 30-40 per cent more expensive.

On the other hand, a webcam just needs a standard Ethernet cable - whether UTP or screened for environmental protection - something that is usually readily available or need merely be pulled to the nearest wiring closet. A webcam deployment is a non-proprietary, open solution: the installation is not dependent on one vendor, and it is simple to mix and match equipment from different vendors. Replacing a failed camera whose supplier is no longer in business, or no longer supports it, is a straightforward matter.
Coax may seem to have an advantage over UTP in exterior locations and other harsh environments, but in today's market there are ways to make Ethernet networks just as environmentally resistant. Reichle & De-Massari offers an industrial cabling solution with Insulation Displacement Contact (IDC) technology, which includes RJ45 connectors and connection modules that are sealed, gas- and water-tight and vibration resistant.
With these considerations in mind, it is clear that proper network planning and use of standards-compliant structured cabling must be applied. The optimal approach is an identifiable separate network for security. Ideally this will use entirely separate active equipment and wiring, both copper and fibre; however, the same level of security may be reached by implementing a separate VLAN on an existing network infrastructure.

Power over Ethernet
The issue of power supply is relevant to all video surveillance equipment. No matter what the system, every device needs to have both electrical power and some sort of data feed. Supplying power to video equipment can be difficult because cameras are frequently placed well away from the main power runs and sockets.
However, where webcams use Power over Ethernet, they can receive both data and power from a single cable. Power over Ethernet (PoE) has been standardized as IEEE 802.3af. The standard is robust and failsafe, enabling a number of possible approaches to adding power to the standard UTP or shielded (SF-UTP) Ethernet cable run, and specifying that devices drawing power from the cable should support all of them.
The power provided under 802.3af is a maximum of 350mA at a nominal 48V, which equates to 15.4W at the switch port; after worst-case cable losses are taken into consideration, the powered device can draw up to 12.95W. This is not enough to power a personal computer, but is more than enough for an IP phone or a webcam.
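The arithmetic behind that budget is straightforward. As a sketch using the 802.3af standard's nominal figures (the standard specifies a minimum source voltage of 44V, which is where the familiar port figure comes from):

```python
# Rough arithmetic behind the 802.3af power budget (nominal standard figures).
voltage_min = 44.0    # minimum PSE output voltage under 802.3af, volts
current_max = 0.350   # maximum continuous current per the standard, amps

# Power guaranteed at the switch port (the power sourcing equipment, PSE).
pse_power = voltage_min * current_max
assert round(pse_power, 1) == 15.4

# After worst-case resistive losses over 100m of Cat 5 cabling, the
# standard guarantees 12.95W at the powered device (PD): ample for a
# webcam drawing a few watts, but not for a personal computer.
pd_power = 12.95
cable_loss_budget = pse_power - pd_power
```

A camera designer therefore works to the 12.95W device-side figure, not the port-side one, since the cable loss budget belongs to the link, not the load.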

PoE also neatly solves the need for security surveillance to be operational even if the building has lost power. Unless the supply of electricity to the entire building is backed up by UPS, traditional CCTV will stop working. A PoE solution, by contrast, relies on an infrastructure that in most enterprises already has UPS, namely the corporate data network.
Ethernet data networks using structured cabling and distribution schemes gain hugely from the flexibility and manageability these techniques offer. In the case of video surveillance, structured cabling greatly eases network alterations. However, this flexibility comes at a price: it makes it easy for people to tamper with the system, whether accidentally or with intent.

R&M's Patch Guard system, for instance, offers lockable patch cables and panels. It is ideal for server rooms and also very suitable for webcam security, making accidental tampering less likely, and malicious tampering both harder and significantly more obvious.
Another potential benefit of structured wiring is that it makes it comparatively easy to create a resilient network that can survive equipment outages or cable cuts. Clearly an individual webcam will be lost if the cable to it fails, but between the wiring closets and the security control centre the use of redundant paths to ensure reliability is entirely standard. This means that the overall reliability of a PoE webcam solution is likely to be greater than that of a traditional solution.
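The arithmetic behind that reliability claim is simple. Assuming, purely for illustration, two independent paths between wiring closet and control centre, each available 99 per cent of the time:

```python
# Hypothetical numbers: availability of a link served by redundant paths.
# Two independent paths fail together only when both are down, so the
# unavailabilities multiply.
path_availability = 0.99

single = path_availability
redundant = 1 - (1 - path_availability) ** 2

print(f"Single path: {single:.2%}")        # 99.00%
print(f"Redundant pair: {redundant:.4%}")  # 99.9900%
```

Downtime drops from roughly 1 per cent to 0.01 per cent of the time, which is why redundant paths between wiring closets are standard practice even when individual camera drops remain unprotected.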

Future proof
To be future proof, your surveillance solution should be easy to install and manage, reliable, and cost effective. The answer to this is a combination of webcams, PoE and high quality structured cabling.

This combination is naturally future proof because the open standards that underpin IP video ensure that any IP video surveillance solution will be interoperable and functional no matter what technology is developed in the future. And simply put, Power over Ethernet permits greater reliability of power supply than any other solution.

Justin Bewick is UK Sales Director of Reichle & De-Massari (R&M), and can be contacted via: justin.bewick@rdm.com

Companies wanting to succeed in mobile TV provision can no longer ignore the wider ecosystem and rely purely on the technology to drive demand, explains Markus Hochenbleicher

Is mobile TV going to be another technology with huge potential destined for failure? The adage ‘build it and they will come' no longer applies to new technology services, and if this technology is going to be successful then we need to step back and look at the wider ecosystem that makes up the market. For example, why has football been one of the major success stories in encouraging mobile TV take-up? 3 Italia gained 400,000 subscriptions in ten months, equivalent to six per cent of its base, when it launched its mobile TV services in time for the World Cup. It has since successfully leveraged this take-up and continued this growth beyond the World Cup. The answer is simple: football supporters don't care about mobile TV technology. What they care about is receiving football on a socially acceptable handset when they are not able to watch it through traditional media, with packages that are geared around their needs and wallet.

Success is dependent on the whole ecosystem working together to deliver against the value chain. Companies have often been guilty of releasing technology for technology's sake, handing marketing a product that does not match the market and expecting them to make a success of it. What the value chain shows is how the different work streams need to work together to target the market on its own level.

Making users pay
Creating combinations of the value chain will lead to mobile TV revenue generation, but finance departments need to be aware that the time to revenue will be long, as many success stories rely on initial free services and subsidised handsets.

What operators and service providers can expect to charge
In 2005/2006 O2 and Arqiva conducted an extensive trial in Oxford, UK with several hundred users. Oxford O2 participants watched up to four hours per week; 85 per cent of the users were satisfied with the service and 72 per cent indicated that they would take up the service within 12 months at an acceptable price. The stumbling block was current data tariffs, which could cost over £400 per month, whereas data from recently launched mobile TV services in Korea, the US and Italy suggest a price point of £5 to £10 per month.
Low prices alone do not guarantee success for mobile TV

In the UK, Virgin Media, with BT Movio, had a working technical delivery platform that achieved 10,000 subscribers, yet the service was turned off in mid-2007. They offered a free mobile TV service in a bundle with voice minutes, one type of handset and five channels. Similarly, the US broadcaster Crown Castle turned off its Modeo mobile TV network in New York in mid-2007.

Creating demand depends on the combination of three key elements: themed content, appealing handsets and attractive prices
A clear demonstration of this combination working is in Korea. TU Media launched a mobile TV service based on a satellite infrastructure in 2005. The service includes 15 TV channels, 19 radio channels, real-time traffic information and mobile shopping. Since the monthly price was reduced below $11, take-up has accelerated: in mid-March 2007 TU had 1.15 million customers and is expecting to sign up another million during 2007.

3 Italia points out that thematic channels, primarily sports and news, are the main drivers for adoption: 70 per cent of viewers indicated that they preferred those channels. Usage peaks through lunch and before dinner, and appears to be mainly during outdoor activities. After all, who would not like to spend the lunch break with a sandwich in a sunny piazza watching the news, or drinking a glass of Italian wine in the garden before dinner checking the goals of the day? 3 Italia offers various payment options, starting at €3 per day and rising to a full service package including voice for €49 per month.

With three competing mobile operators, TIM, 3 Italia and Vodafone, offering alternative services, Italy is currently the most competitive mobile TV market in Europe. Italy however also exemplifies how difficult the position of mobile network operators is. Content distribution, the natural home of network operators, is highly contested. Broadcast mobile TV does not necessarily require an operator's wireless network. Standards such as DVB-H allow market participants to build TV broadcast networks similar to today's digital TV networks, for example Freeview in the UK. This enables content aggregators to set up their own networks and distribute content to portable digital TV receivers, plug-ins for existing media players (iPod, PSP, etc.), or mobile TV ready mobile phones.

Mediaset, the Italian TV and media company owned by Silvio Berlusconi, has acquired broadcast frequencies and set up a mobile TV network. TIM and Vodafone Italy are both using services from Mediaset in order to supply their subscribers with mobile TV services. The role of those operators is limited to subsidising the handset and billing the end customer for the service. They do not control the distribution of the content, nor what content is provided, and can hardly use it for differentiation.

The danger for mobile operators is that they lose control over handsets, which they have so far used to tie customers in to their own add-on services and content.

Becoming media companies
For media distribution, 3 Italia has gone a different way from the other mobile operators. They have built their own broadcasting network covering 75 per cent of the country. This gives 3 Italia the freedom to decide on the type of content for their customers. 3 Italia still use a wholesale content aggregation service from Mediaset but have also started to produce their own content. In particular, they have developed free news and sports channels as well as recently inaugurating their own TV studios. Potentially, 3 Italia will be able to source or produce content that allows them to differentiate their offer towards end customers.
The downside is that the costs of such a strategy are high. 3 Italia has spent €80 million on the network alone, and further investments in media rights, TV studios and so on are required. So far, 3 Italia appears to rely primarily on end customer tariff packages to finance this expansion, but there are further revenue-earning possibilities. One of these could be advertising, particularly if it can be targeted at defined user groups. In the UK, the new MVNO Blyk has started to subsidise mobile telephony for young users who receive advertising SMS and MMS messages.
In order to take advantage of this €2 billion market, companies can no longer ignore the wider ecosystem and rely on the technology to drive demand. The perfect ecosystem is attained from the right mix of the value chain, shaping the supporting organisation around it. The key rules are:

  • The road to making money and achieving the business case is long
  • Thematic channels for specific user groups are a main driver of adoption
  • Share the burden, you don't need to own the whole value chain, just exploit it
  • Differentiation and positioning are the most difficult elements to achieve

Above all, the wave of successes and failures within the mobile TV market has taught us one important lesson: technology is only important to engineers.

Markus Hochenbleicher is Consultant in Communications, Media and Entertainment at PA Consulting Group, and can be contacted via tel: +44 207 881 3365;  e-mail: innovation@paconsulting.com 

The battle is on to deliver the next generation of very high speed data services to consumers in both fixed and mobile locations. Nigel Wright looks at the relative strengths and weaknesses of the two leading technologies, WiMAX and LTE

WiMAX and 3GPP Long-Term Evolution (LTE) are two different (but not necessarily competing) technologies that will eventually be used to achieve data speeds of up to 100 Mbps - speeds fast enough to potentially replace wired broadband connections with wireless, and to enable services such as HDTV on mobiles and TVs without the need for a fixed line or dish in the home, as well as a host of other exciting services currently seen as too bandwidth-hungry to be delivered using existing mobile technologies.

WiMAX and LTE are at different stages of development. WiMAX is widely recognised as the first that will be brought to market; the world's first large-scale mobile WiMAX deployment is due in the US in spring 2008 under the Xohm brand by Sprint Nextel. However, although LTE may on paper be some years off, it will bring with it many advantages, not least the fact that operators will be able to evolve their existing infrastructure and base station real estate to deliver it.

Realistic expectations
There is much expected of WiMAX, and it's probably fair to say that some of this can be classified as ‘hype', yet there is much to be excited about, provided we set realistic expectations for early-stage deployments. It's also important to remember that WiMAX comes in two distinctly different flavours - mobile WiMAX (standardised as 802.16e) and fixed (802.16d). There are significant differences between the two, not least the fact that it's technically much easier to deliver high bandwidth to a stationary external antenna, as with fixed WiMAX, than to one on a mobile device in someone's pocket or handbag.

This means that whilst symmetrical speeds of 10 Mbps may be technically possible at a range of 10km today, in practice this is likely to be achieved only using fixed WiMAX and is reliant on other variables for its success, such as a high quality external antenna with line-of-sight to the base station. Given this situation is far from common and that buildings get in the way and degrade WiMAX signals, it will be more likely that mobile WiMAX users will only see half that data rate at much shorter distances from the base station - at least until techniques such as MIMO (multiple input multiple output) and beamforming are perfected to counter, and even take advantage of the multipath effects from physical obstructions.

The spectrum issue
One of the biggest obstacles to widespread WiMAX deployment is the lack of available high-quality spectrum. In the US, Sprint benefits greatly from its 2.5 GHz spectrum holdings. This relatively low-frequency band allows greater coverage per base station, since signals travel much further than at higher frequencies. Fewer base stations are therefore needed, making WiMAX cheaper to deploy in the US than in markets that don't have access to the same spectrum. Even with 2.5 GHz spectrum, Sprint's network will require more than 60,000 base stations to provide nationwide coverage across the US.
In Europe, spectrum below 2.5 GHz is scarce and mostly occupied by analogue TV and current GSM mobile signals. Until now, therefore, most European WiMAX trials and licences have been limited to the 3.5 GHz or even 5 GHz bands, with often disappointing results, which is why we haven't seen anywhere near as much WiMAX traction in Europe as in the US. It may not be until analogue broadcast signals are switched off across Europe (with the UK scheduled for 2012) that sub-2.5 GHz spectrum becomes available and we start to see large-scale European WiMAX deployments.
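The frequency argument above can be made concrete with a simple free-space path-loss sketch. The 130 dB link budget is purely illustrative, and free-space propagation is an idealisation (real network planning uses empirical models and allows for building penetration), but the scaling it shows is the reason 2.5 GHz is so much cheaper to deploy than 3.5 GHz:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def max_range_km(link_budget_db: float, freq_mhz: float) -> float:
    """Largest cell radius at which path loss stays within the link budget."""
    return 10 ** ((link_budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

budget = 130.0  # illustrative link budget in dB, not an operator figure
r25 = max_range_km(budget, 2500)  # cell radius at 2.5 GHz
r35 = max_range_km(budget, 3500)  # cell radius at 3.5 GHz

print(f"2.5 GHz range: {r25:.1f} km, 3.5 GHz range: {r35:.1f} km")
# Range scales inversely with frequency, so cell *area* - and hence the
# number of base stations needed to cover a region - scales with f^2.
print(f"Relative base-station count at 3.5 GHz: x{(r25 / r35) ** 2:.2f}")
```

Whatever budget is assumed, the ratio of ranges is always 3500/2500 = 1.4, so a 3.5 GHz network needs roughly twice as many sites as a 2.5 GHz one for the same coverage.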

An open door for LTE?
An alternative high speed mobile technology that could be used instead of, or to run alongside, WiMAX is LTE. The crucial difference is that, unlike WiMAX, which requires a new network to be built, LTE runs on an evolution of the existing UMTS infrastructure already used by over 80 per cent of mobile subscribers globally. This means that even though development and deployment of the LTE standard may lag Mobile WiMAX, it has a crucial incumbent advantage.

There is also no doubt that the advent of WiMAX has injected a new sense of urgency to the LTE standardisation effort. This may help provide operators keen to control investment with the confidence to wait for LTE technology to reach maturity before upgrading their existing infrastructure, rather than invest in a brand new WiMAX network.

Even prior to the arrival of LTE, speeds of up to 7.2 Mbps are currently being reached by existing HSPA technology, which is being used by more than five million subscribers worldwide. And speeds are getting faster - within the next two years it is expected that downlink speeds will reach up to 14 Mbps, which will in turn support high quality TV and other bandwidth-hungry applications. Although HSDPA is designed for high-speed data in the downlink only, HSUPA is also beginning to arrive in some markets now, increasing uplink rates to a maximum of 5.8 Mbps. This will enable additional services that benefit from symmetrical data rates, such as online gaming and videoconferencing.

At these speeds, HSDPA becomes a realistic candidate to replace wired Internet access with wireless in some cases. It is therefore possible that even with WiMAX and LTE developments, HSDPA will be enough to satisfy the needs of many users or will be a satisfactory interim solution prior to LTE's arrival.

A combination of the two?
So which technology will ultimately prevail? It is arguable that LTE is more ‘risk-free' than WiMAX because it will run on an evolution of existing mobile infrastructure. Mobile operators will also be able to use their experience from current 3G and HSDPA networks to carry out the incremental fine-tuning necessary to ensure that the rollout of LTE delivers on user expectations. In Europe, LTE has the further advantage of being unaffected by the lack of available spectrum.

However, the recognition of WiMAX as an IMT-2000 technology by the ITU in October 2007 is a significant step, that in the future may help WiMAX to gain a foothold in today's UMTS spectrum and so close the spectrum availability gap, but the full impact of this move has yet to unfold.

Nevertheless, LTE is still perhaps three to four years from being ready, whereas mobile WiMAX equipment is entering the final testing phase now. Some operators, far from seeing LTE as the lower-risk option, may take the view that waiting for it means forgoing an early-mover advantage in ultra high speed mobile broadband - and with it subscribers that might be won by moving to WiMAX now.

Also, LTE will start to come to the forefront at the same time as analogue TV signals are switched off in Europe, making spectrum availability largely irrelevant to the WiMAX versus LTE argument. This is of course provided national governments release spectrum for WiMAX and it's available at a price that operators deem worth paying.

Interestingly many operators have already stated their interest in both camps. In August of this year, Vodafone, a key advocate of LTE, declared itself ‘technology neutral' and joined the WiMAX Forum. This pragmatic approach is perhaps a sign that for now many operators will adopt a ‘wait and see' approach and learn from the experiences of early pioneers such as Sprint Nextel before deciding whether to choose WiMAX or LTE.

Ultimately the decision may be to use both. As Sprint Nextel is showing in the US, the real estate occupied by an operator's current base stations can also be used to site new WiMAX base stations. The strategy could then be to use LTE to support mobile broadband users and WiMAX to support fixed or lower-mobility broadband users. Alternatively, operators could use LTE for macro cellular coverage and WiMAX for micro cell coverage.
In all likelihood many devices of the future will ship with both LTE and WiMAX capability, meaning full compatibility across both technologies. Consumers will probably not even know which particular technology is delivering high speed data to them and they're hardly likely to care, so long as it works to their satisfaction, and the content provided is engaging and available at the right price.

Nigel Wright is Vice President of Product Marketing, Spirent Communications

Simon Vye, CEO of Telstra Europe talks to Lynd Morley about the trends and opportunities in the international telecoms wholesale market

LM How has the wholesale market performed in general over the last few years, and what longer term trends do you see?

SV Switched voice traffic has long been a commodity, and performance in the wholesale market has been driven by demand for data circuits.

Over the last three years, growth in Internet traffic has outstripped growth in capacity (leading to higher network utilisation rates), increasing by an average of 60 per cent year-on-year. This has more than offset the decline in margins during the same period.
Indeed, downward pressure on pricing has generally stabilised in most mature markets globally, and wholesale bandwidth providers are still able to achieve premium tariffs in emerging markets in Africa, Asia and the Middle East. The only fly in the ointment has been South America, where exceptionally fierce competition between international carriers has led to sharper-than-average transit price falls of around 35 per cent since mid-2005.
Consequently, providers have, on the whole, enjoyed a period of modest growth in profit. Whilst this is likely to continue in the short term, it may, however, prove to be merely the bounce after the telecoms crash at the beginning of the decade, and corrections in the longer-term trend could see profits from wholesale bandwidth slip back into decline.

Buyers are purchasing higher capacity ports to cope with end-customer demand, which carry a lower price per Mbps than lower speed ports - making the squeeze on profit growth more acute even if traffic volumes continue to grow steadily. In the US, for example, the price per Mbps for a GigE (1,000 Mbps) is more than 20 per cent lower than that for an STM-4 (622 Mbps). In addition, many carriers upgraded their networks in 2007 and capacity growth accelerated to 68 per cent - exceeding the growth rate in traffic for the first time in many years.
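The squeeze described above is simple arithmetic. The port prices below are hypothetical, chosen only to reproduce the roughly 20 per cent per-Mbps gap cited for the US; the utilisation calculation uses the 60 per cent traffic and 68 per cent capacity growth figures from the text:

```python
# Hypothetical monthly port prices in USD - the article gives only the
# ~20 per cent per-Mbps gap, not the absolute figures.
ports = {
    "STM-4 (622 Mbps)": (622, 45_000),
    "GigE (1,000 Mbps)": (1000, 58_000),
}

for name, (mbps, price) in ports.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")

stm4_per_mbps = 45_000 / 622
gige_per_mbps = 58_000 / 1000
discount = 1 - gige_per_mbps / stm4_per_mbps
print(f"GigE per-Mbps discount vs STM-4: {discount:.0%}")

# If traffic grows 60% in a year while capacity grows 68%, average
# utilisation falls by the ratio of the two growth factors.
util_change = 1.60 / 1.68
print(f"Utilisation after one year: {util_change:.1%} of today's level")
```

So even with traffic volumes rising steadily, revenue per Mbps and network utilisation can both drift downwards at the same time, which is exactly the profit squeeze the interview describes.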

There are some hot spots in the market that will buck the trend for the foreseeable future, however. Growth in VoIP handoff and offnet services and mobile interconnect, especially in territories such as China, India, parts of North Africa and Eurasia, will continue to be strong. We will also probably see accelerated growth in non-traditional markets such as the Middle East and South Africa, where wholesale providers who partner and invest should see attractive sales and margin performance in the coming three years. In addition, as global IP MPLS services head towards becoming commoditised, there is a premium opportunity for managed switched SDH services for network restoration, especially for latency-intolerant network applications such as VoIP or video.

But that all said, most wholesale providers are looking to ‘value-added' IP network services as critical to their longer term performance.

LM So what will be the key ‘value-added' network services that we can expect to see in wholesale providers' portfolios?

SV The greatest demand will be for fully managed IP services in the network cloud delivered over MPLS. This will enable communications providers to accelerate time-to-market of new end-customer offerings at a lower cost and reduced level of risk.

In the business market, VoIP is relatively mature, so the greatest opportunity in voice communications will undoubtedly come from SIP gateway services. This will enable far greater and easier support for end-customer unified communications solutions accessed via SIP clients, SIP trunks and traditional interconnects such as PBXs.

Another likely winner is Ethernet VPNs, offering flexible ‘any-to-any' connectivity. Pre-integration in the network removes traditional barriers associated with having to integrate multiple circuits and enables service providers to offer much more scalable and cost-effective solutions to their end-customers. For wholesale providers with a global client-base, various derivatives of International Ethernet will deliver robust performance in their services portfolio. And we see strong potential for virtualised data centre services and applications such as Storage Area Networks.

Videoconferencing is a long-established technology that has not entered the mainstream, partly due to poor delivery against customer expectations on quality of service and experience. However, next-generation network-based videoconferencing services offer much higher specifications at a lower price point, and are capable of delivering high definition connections to both the boardroom and desktop. At the same time, demand is being fuelled by growing business imperatives to reduce travel, costs and carbon footprints, so videoconferencing will undoubtedly be part of the bigger picture.

On the consumer side, Internet-based multimedia services such as IPTV and video distribution will undoubtedly be hugely successful. And with low barriers to market entry, there will be a significant opportunity for IP service delivery platforms with QoS wraps designed specifically for emerging ‘new media' content players.

Fixed-mobile convergence is also a potentially interesting area, especially in providing core network services to manage seamless handoff between access networks for multimedia IP sessions.

LM Do you see any major regional opportunities?

SV The trend for globalisation of supply chains and skills markets is a key driver for growth, especially in hot business process outsourcing markets such as India, China, Eastern Europe, South Africa and, increasingly, the Middle East. The Asia-Pacific market generally is enjoying incredibly strong economic growth (according to the World Bank, it has had the strongest economic performance of any developing market over the last decade), and we're seeing rapid expansion of US and European firms in the region. Consequently, the wholesale market for circuits into Asia-Pacific is a real sweet-spot - especially for providers like Telstra.
In mature broadband markets, such as Western Europe, the main opportunity is the growth in demand for advanced IP-based services, such as IPTV. Conversely, in developing countries, there is potential for explosive growth in bandwidth demand. According to the latest figures from the United Nations, penetration of Internet subscribers in developing nations is estimated to be just 8.5 per cent. With the development of the $100 laptop and plans for rapid deployment of mobile broadband in regions such as Africa, penetration levels in the next few years could grow exponentially.

LM So what will characterise successful wholesale providers in the future?

SV On an international scale, wholesale providers will need a footprint in the key growth regions outlined previously. But expanding in some of these countries can be extremely difficult because of the challenges associated with being awarded state-controlled operating licenses. Therefore, many of the smaller players will need to partner with larger providers who already have licenses or have the financial muscle and scale to negotiate with Governments.

The most successful wholesale providers will also have global IP MPLS backbones, enabling them to deliver IP services seamlessly for intra-region and intra-continental end-customers. Competition is fiercest on a domestic level, so having access to premium transit routes will be a must.

Success will also be massively influenced by the financial positions wholesale providers are in today.

There are a number of providers who have built significant revenues on the back of voice. Ten years ago, they would have enjoyed margins of around 30 per cent, but the return today has shrunk to between two and five per cent. So those who have predicated their business on voice face a tough ride in turning the super-tanker around and developing a more balanced revenue portfolio from data services. With dwindling revenues, they will have to borrow to invest, and the current global credit crunch has shown the risks and difficulties associated with relying on readily available cash at decent rates. Consequently, wholesale customers are likely to apply increasingly tough due diligence in scrutinising the financial position of providers to ensure the long-term viability and stability of partnerships, and this could lead to a shake-out in the market.

Many wholesale providers will also need to restructure their costs to ensure long term success. But they will absolutely need to avoid the temptation of making a quick fix by stripping away resource from the back-office and reducing headcount in customer service, operations and provisioning teams, and restricting sales expenditure to a minimum. In the short term, this will have a damaging effect on morale and motivation, and will almost certainly lead to lower levels of customer satisfaction and increased churn. Instead, the focus needs to be on the lifetime value of customers and segmenting investment correspondingly.

Lastly, we're likely to see strong performance from more agile players who can outmanoeuvre the market gorillas. Speed to market will be critical in the world of wholesale managed IP services provision in the cloud, as development cycles for service providers' end-customers in IP/web-based applications continue to shorten.

Lynd Morley is Editor of European Communications

Thomas Breuer explores consumer power in a convergent communications marketplace whose motto should now be ‘supplier beware'

As history often shows, when technological developments hit the market there is always a struggle for market share as disparate players make their bid for ownership of customers and future revenue streams.

The converging telco and media marketplace is no different. Telecoms operators, broadcasters and content owners alike are setting out their strategies, building new partnerships and working out revenue models as demand for consumer and business markets for fixed, mobile, telecoms and broadcast services gathers pace.

This article explores the theme of consumer power - this market is not one that can be dictated to by a group of controlling suppliers, as companies such as AOL and TimeWarner found to their cost. Consumers will seek out or, as we have seen with the rise of social networking sites such as MySpace and Facebook, even go out and set up their own environments - leaving supplier-led portals trailing behind in their wake. For perhaps the first time in the history of IT, this market is not so much a case of ‘buyer beware' as ‘supplier beware' - get it wrong and fail; use your judgement to get it right and succeed where others crash and burn.

The biggest development in the converged telco and media market is the delivery of basic and premium services to consumers and business users via their channel of choice, 24 hours a day, wherever they may happen to be.
The three most important questions that the industry is facing today are:

  • Which services will consumers want to use?
  • Who should be responsible for delivering those services? and
  • How should those services be charged for?

There are two sets of commercial protagonists for whom the answers to these questions matter most: telecoms operators who typically own the ‘plumbing' and delivery channels and media groups who typically own the content that users want. In general, it is the telecoms operators who own the relationship with customers, having spent many years (and many euros) setting up systems to enable them to provide an individual (billable) service to home and business users.

Media groups are approaching the converged marketplace from a different direction and are more used to broadcasting to the masses than narrowcasting to individuals using multiple devices. They are less likely to know their customers and typically charge per view or service rather than according to individual preferences and usage.

This is beginning to change, and more recently there have been attempts by organisations to build hybrid models, where either an operator partners with a media group or a media group extends beyond television to encompass computer and mobile phone platforms in its services. This is confirmed by consolidation among major players in the market, such as NTL's transformation into Virgin Media in the UK.

It is encouraging to see this development, but not quite so encouraging to follow some of the efforts by both sides of the equation to ‘land grab' and attempt to stand against the tide of customer preference. As happened with so-called ‘walled gardens' established in the earlier days of web browsers, where users could access only limited services provided in partnership with the host, consumers will bypass these arrangements and find a new way to get what they want.

The key point to remember is that nobody outside the telco industry cares how content and services are delivered from a technical standpoint - they just want easy access.
To understand the context for this, it's worth thinking about the retail sector and how it has had to change over time. A hundred years ago retail was driven by local producers who sold only through small shops.

This changed dramatically as retail became a national and then international business, with retailers effectively charging suppliers a fee for providing space to sell their products. They continue to sell a range of their own brand goods in order to boost the lower margin business of running the retail ‘infrastructure' but know that they have to provide what customers want alongside this range - otherwise the customer will go elsewhere.
The same kind of ecosystem is emerging in the telco and media space. Telco operators are beginning to understand that they will not succeed by providing their own portals with limited access to own brand services (or even access to one or two partners). The Internet has shown how easily accessible content is, and the genie cannot be put back into the bottle.

Our view is that while telcos own the plumbing - and the billing relationship - with customers it is the media groups who hold most of the cards in the convergent marketplace. With few notable exceptions, there tends to be an engineering culture in telcos that means they have ‘mechanical' mindsets that concentrate on high quality delivery rather than creative and innovative approaches to how customers wish to experience different media.
As a result, we have seen some spectacular failures in Europe as telcos invest in new channels that don't deliver the content that users want, and experience extremely low levels of take-up and return on investment (ROI).

The future lies in the ability to host and deliver services seamlessly, and at extremely high quality, across multiple devices. Just thinking about three of the most popular consumer devices - the BlackBerry, the iPod and the personal video recorder (PVR) - helps to explain where the market is heading and the roles that organisations will need to adopt.
1. The BlackBerry
Research in Motion's BlackBerry has revolutionised e-mail usage by businesses. Users can send, or respond to, e-mails wherever they are and whenever they want to - and are always connected. Users are now beginning to experience the benefits of accessing web based information services from wherever they are using this handheld device, although most of us would rather do this via a web browser on a desktop computer or a laptop.
2. The iPod
Apple's iPod is the consumer sensation of the early 21st century. The ability to store an entire CD collection on a small MP3 player and listen to a selection of tracks without having to carry around physical media is hugely convenient and exciting. The potential danger of Apple's ‘closed loop' iTunes service has, so far, not been a barrier, while the recent launch of the iPhone takes Apple even further into consumer territory.
3. The PVR
Personal video recorders such as Tivo and SkyPlus are changing consumer behaviour in the home and are, again, hugely liberating. Consumers can record and keep programmes and watch at their leisure rather than be dictated to by the broadcast schedule (and the accompanying advertising). Programmes can be stopped, rewound and even burned onto DVD at the consumer's convenience.

The benefits of these technologies cannot be argued with, and it's clear that their vendors are providing products with customer convenience in mind. However, there are still technology gaps that make working with these products difficult. For example, users access e-mails and web links from a BlackBerry through different interfaces, while patchy mobile phone coverage means that access to 3G services is not always reliable or of good quality. The overall feeling among consumers and businesses is that they still need to carry around multiple devices for multiple functions, and the interfaces between them are still fairly ‘clunky'.

But looking ahead, it's not difficult to build a vision of the future in which all content in the home is stored once in digital format and accessed when and where it is needed; in which a single mobile unit can act as MP3 player, phone and work access device, such as the recently launched iPhone; and in which the currently separate regimes of TV and PC are integrated. Progress towards the digital home is slow, but there is clear demand for a more integrated approach.

And while technology suppliers continue to develop proprietary formats such as Blu-ray and HD DVD, consumers are demanding increased accessibility. Vendors such as LG have responded by developing a DVD player compatible with both formats. We believe that consumers will drive suppliers to deliver this integrated vision. Consumers already have their own preferred set of services on the Internet, based to a large extent on their social circles. They will increasingly expect easy access to these services from their mobile phones, TV devices and PCs, and will increasingly see the operator as an access provider - just as they view their ISP today.

The mass introduction of high speed, high bandwidth communications for all users will enable these services, but operators and media groups need to work together to guarantee a joined-up approach. The organisations that can provide users with the capability to create a personal service environment across devices and access content and services (such as home pages, games, information and personal content) at will are those who will succeed.


Thomas Breuer is Managing Director, Telecoms and Media International Line of Business at LogicaCMG

Telcos ignore data protection at their peril, says Lynd Morley


In the wake of the recent fiasco of a UK Government department managing to lose the personal details of 25 million individuals - described as the country's worst ever data protection breach - it is worth noting that, for some time now, there have been many warning voices about the fragile state of data protection, and the risks attendant to data loss, not least of which is the massive potential for identity theft and fraud.
But the subjects of data protection, security, privacy and identity management just don't stand a chance of being "sexy" in the fast moving, competitive, high-octane industry that telecoms now believes itself to be. I was astonished, at a recent conference, when discussing the use of personal information gathered about network users for marketing purposes, to be told by a pretty senior telecoms player that "customers don't really care about what we do with their data".

Yet there are those, within the industry, who are clearly concerned - and have been for some time.  Witness, for instance, a report earlier this year from law firm Linklaters' Technology, Media & Telecommunications Group.  Entitled Data Protected, it stresses that in an EU market of half a billion people, it is increasingly important that businesses address compliance with data protection legislation in a systematic way.

"There is a risk that any such compliance programme will take its impetus from the more exotic and media-friendly issues such as the passenger name records spat between the EU and US, and the dispute over the disclosure of banking payments to the US Department of the Treasury," notes Christopher Millard, Partner at Linklaters.  "However, in reality, it is no more likely that the EU's 27 national data protection regulators will make any serious attempt to close down the global banking system than it was that they would try to stop planes flying across the Atlantic and, although ad hoc enforcement action by individual regulators can't be ruled out, the only practical solution is for a deal at an inter-governmental level."

Millard goes on to comment that, with the above in mind, organisations should concentrate on the issues they can control and really should be doing something about - if they aren't already buttoned down.  Among his suggestions for attention, is the need to get people to take information security seriously.  Millard explains that, by forcing organisations to regulate themselves by sending warning notices to individuals who might be at risk of identity theft following security breaches, US State legislators appear to have stolen a march on the EU with its often bureaucratic approach to regulation.  The European Commission has consulted on whether breach notification rules should be introduced in the EU, starting with telcos and ISPs.

Millard further suggests that priority might be placed on "stopping people doing stupid things with e-mail."  He notes that despite all the publicity surrounding ‘smoking gun' e-mails, many organisations still seem to have a cavalier attitude to e-mail and, worse still, instant messaging.  Many don't bother to deploy appropriate policies, training, software tools or disciplinary procedures.

However, judging by the furore following the UK Government's lapse, there are signs that enterprises and their customers are becoming more aware of the issues, and more concerned about privacy and identity theft.  The Enterprise Privacy Group, for example, has noted growing interest in the concept of ‘Information Brokers' to help users control their personal data.  As customers do gain understanding of the issues, it would be an unwise company that didn't address privacy and identity concerns.


Mobile TV is engendering strongly held - and opposing - positions on the chances of its successful adoption.  Dr Windsor Holden examines the detail behind the posturing

Much of the literature that has been written about mobile TV falls roughly into one of two camps. In the blue corner are the evangelists for whom it is the killer app to which the populace will subscribe in their millions; weaving and ducking in the red corner are the nay-sayers, amongst whom the general consensus appears to be that they would not touch mobile TV, no Sirree, not even with a barge-pole, and anyone that does will be turned into a pillar of salt, or at the least see their enterprise go belly-up due to lack of customers.
Now then. Mobile TV clearly represents a tremendous opportunity for the various participants in the value chain. People will pay a premium for mobile TV. So far so good. Now for the caveats. It will generate significant revenues, if... (You should sit back and make yourself comfortable at this point, for there are quite a few "ifs".) If it is properly marketed; if prices of handsets and services are not prohibitive; if operators are allowed a free hand to pick their standard of choice; if it is delivered via a cost-effective but robust solution; if the service can be received nationwide; if the service quality is acceptable at high speed; if service providers can obtain the mobile rights to key content; if prime spectrum is made available; if chipsets are fitted in mass-market handsets. And so on.
Read on, and we'll go through a few of those "ifs" in a little more detail...

Regulatory constraints
One of the most pressing concerns for adherents of the various standards is that regulatory bodies will mandate one of their competitors, effectively ending their interest in a given market. Chinese regulator SARFT moved down this path in 2006 when it mandated STiMi as the de jure standard in the country, but a similar (if not yet so emphatic) stance has been taken by the European Commission (EC). The Commission report, Strengthening the Internal Market for Mobile TV, argued that DVB-H should be introduced as a common standard for the following reasons:

  • DVB-H is the most widely-used standard in Europe and is also becoming popular worldwide; and,
  • DVB-H is fully backwards compatible with DVB-T, a core advantage as DVB-T is used for terrestrial digital transmission everywhere in Europe, and network operators have experience in building and operating DVB-T networks.

While the EC's report makes some valid points, most notably that a single standard delivers economies of scale, the effective imposition of a standard both flies in the face of policy elsewhere (the EC calls, for example, for a "level playing field" in terms of content obligations for mobile TV) and hamstrings players who feel that MediaFLO or another standard might be better suited to their particular circumstances.

Spectrum availability
The availability of spectrum is a key factor in determining the success of mobile TV. This is especially true in the case of DVB-H, which plans to operate in the UHF spectrum band currently occupied (depending on the country/region) by analogue TV. However, the availability of UHF spectrum varies widely across Europe: this is partly a reflection of the relative popularity of analogue terrestrial services in the member states. Thus, in the Netherlands and Luxembourg, for example, where less than two per cent of customers used terrestrial TV services, relatively little disruption was caused to consumers when analogue signals were switched off. By contrast, the UK has traditionally relied on terrestrial broadcast TV services, with comparatively low adoption of cable; even though the country has the highest level of digitalization in the EC - 80.5 per cent of households were digital at the end of March 2007 - that still left more than 4.6m households dependent upon analogue terrestrial transmissions for their television.

A consultation paper released by Ofcom in December 2006 on the Digital Dividend Review (DDR) suggested that L-Band spectrum (scheduled to be auctioned in the UK by the end of 2007) might be suitable for mobile TV. However, UHF spectrum is perceived as ideal for DVB-H because signals can travel comparatively long distances (thus minimising the required number of repeaters), and can penetrate walls without noticeable degradation. By contrast, signals in the L-Band (1452-1492MHz) travel shorter distances (more repeaters, therefore a more expensive network) and are less effective at penetrating buildings. Furthermore, if the UK unilaterally decided to utilise L-Band spectrum, this would immediately put it out of kilter with neighbouring countries committed to using UHF spectrum for DVB-H. Thus, in the UK at least, there appears to be an impasse: Ofcom will not release the optimal spectrum; L-Band is perceived as too expensive.

Network costs
The start-up costs for a mobile TV network are not insignificant, regardless of the broadcast standard employed. In the first instance, operators may be obliged to pay a fee for spectrum over which the service will operate. Secondly, there are the network infrastructure (construction and implementation) costs. Thirdly, the backhaul costs of distributing the mobile TV content to the broadcast stations will be substantial. The cost of the spectrum will vary significantly by market and the relative desirability of the spectrum in question. In the US, Crown Castle acquired nationwide spectrum in the L-Band for $13m; the UHF (channel 55) spectrum acquired by MediaFLO cost $38m. In the UK, Arqiva recently acquired four lots of 4MHz spectrum at 400MHz for a total of £1.5m ($3.0m).
However, these costs are minimal when compared with the network rollout costs. The La 3 (Hutchison Italia) DVB-H network cost around $280m, while Mediaset's DVB-H network cost around $320m. The MediaFLO network in the US was estimated to cost in the region of $800m, while Modeo would have been obliged to spend in excess of $1bn. Costs also vary depending upon the spectrum utilized: as noted above, the different propagation attributes of the spectrum bands mean that L-Band services require a greater number of terrestrial repeaters. Even for relatively small countries, certain topographical characteristics can ramp up costs.

The sheer scale of these rollout costs makes it imperative for operators to share infrastructure. Let us assume a single network rollout in a medium-sized country (e.g. France), costing around $400m (DVB-H in UHF spectrum) or $600m (L-Band). In either case, it will take an operator five years until cumulative service revenues exceed these rollout costs alone (regardless of other expenses such as backhaul costs, programming costs, etc). The construction of two sets of competing infrastructure would therefore be uneconomic, suggesting that the best model would be for competing service operators to share network infrastructure to minimize costs.
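The five-year payback claim can be sanity-checked with a little arithmetic. The sketch below is a minimal Python model; the linear ramp to 1.2m subscribers and the $11 effective monthly spend are illustrative assumptions (not figures from the white papers cited here), chosen to be broadly consistent with the service pricing discussed later in the article:

```python
# Illustrative payback model for the infrastructure-sharing argument above.
# Subscriber ramp and monthly spend are assumptions, not article figures.
def years_to_payback(rollout_cost, monthly_spend, final_subs,
                     ramp_years=5, horizon=15):
    """Return the first year in which cumulative revenue covers rollout cost."""
    cumulative = 0.0
    for year in range(1, horizon + 1):
        subs = final_subs * min(year, ramp_years) / ramp_years  # linear ramp
        cumulative += subs * monthly_spend * 12
        if cumulative >= rollout_cost:
            return year
    return None

# $400m UHF build vs $600m L-Band build, ~$11/month effective spend,
# ramping linearly to 1.2m subscribers over five years
print(years_to_payback(400e6, 11, 1_200_000))  # -> 5
print(years_to_payback(600e6, 11, 1_200_000))  # -> 6
```

Under these assumptions the UHF build breaks even in year five and the L-Band build in year six, which tallies with the estimate above; halve the subscriber base between two competing networks and the payback period roughly doubles, underlining the case for sharing.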

Network coverage
Clearly, the costs outlined above depend upon the breadth and intensity of network coverage involved. One of the key factors that can ramp up the cost is providing an acceptable quality of in-building coverage. This has been an issue for most cellular networks, and with higher 3G frequencies and greater degradation of the signal through thick walls and glass, an increasing number of cellular operators and building owners are resorting to in-building solutions such as distributed antenna systems, pico cells and repeaters. The same issue of indoor coverage applies to broadcast mobile TV technologies like DVB-H, DAB/DMB, ISDB-T and MediaFLO. Although it is too early to say how severe the problem will be, or which technology will eventually face an indoor coverage issue, we believe that the success of broadcast mobile TV is very much dependent on the user experience. As the various trials of mobile TV have illustrated, the majority of mobile TV viewing is expected to happen at the office, at home, or while in a bus or train. This means that the user will, most of the time, be inside an enclosed space (whether stationary or at speed), making indoor coverage very important from a network design point of view.

However, the comparative analysis of mobile TV standards is very much a game of claim and counter-claim, with the adherents of each particular standard citing research that purports to show that their technology is superior to that of the competition. Regardless of the veracity of these various claims, the fact remains that mobile TV will not achieve its full potential unless the selected technology offers an acceptable audio and video quality over the majority of a country's territory and to subscribers watching its service within buildings and moving vehicles. Ultimately, customers are paying for the universal availability of a mobile service; the premium they will pay is to receive that service anywhere, anytime. This is a problem that has yet to be fully addressed by 3G networks, where streamed services are regularly interrupted when users move out of areas with good coverage. It is an issue that must be addressed at the outset when establishing a dedicated mobile TV network. To put it another way: customers of a DTH or cable pay TV service would not renew their subscriptions if such services regularly and repeatedly broke down.

Service pricing
Key to the mass adoption of mobile TV is finding an acceptable price point for services: too high and users will be disinclined to subscribe; too low and service providers run the risk of failing to extract maximum value from their service. In January 2007, the Mobile DTV Alliance released a white paper claiming that:
"When prospective subscribers are asked if they are willing to pay $20/month for the service, before experiencing the service first-hand, only 10 per cent on average respond positively. However, once holding phones in their hands with real, live, high-quality broadcast services available, this number should change, and more than 50 per cent will be willing to pay for the service, as worldwide commercial trials (Italy, Finland, UK) have shown. What will the mobile broadcast TV revenue of US operators be if 25 per cent of their subscribers are willing to pay $20/month? With close to 200m US subscribers, the annual revenue will be $12bn. Even if we assume that 50 per cent of that revenue will go directly to the content owners, and the entire investment in infrastructure (estimated at anywhere between $500m and $2bn) is to be amortized over one year - there is plenty of profit to share."
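The Alliance's headline figure is easily reproduced; the snippet below simply restates the quoted arithmetic (the 25 per cent take-up, the $20 fee and the $2bn capex ceiling are all the Alliance's assumptions, not established fact):

```python
# Reproducing the Mobile DTV Alliance arithmetic quoted above.
us_subscribers = 200e6       # "close to 200m US subscribers"
willing_share = 0.25         # assumed 25 per cent willing to pay
monthly_fee = 20.0           # $20 per month
annual_revenue = us_subscribers * willing_share * monthly_fee * 12
print(annual_revenue / 1e9)  # -> 12.0, i.e. the quoted $12bn per year

# Even on the Alliance's own worst case: half the revenue to content
# owners, and the full $2bn infrastructure bill amortized in year one.
operator_share = annual_revenue * 0.5
surplus = operator_share - 2e9
print(surplus / 1e9)         # -> 4.0, i.e. $4bn left over
```

The multiplication is sound; as the next paragraph argues, it is the 25 per cent take-up input, not the arithmetic, that deserves scrutiny.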

This argument is flawed, firstly, because it places too much faith in comparatively small survey groups and, secondly, because in reality adoption of such technologies is almost always much lower once the reality (I must now pay $20) takes over from the hypothetical (I will pay $20 in the future). It is true that, in Italy, a monthly subscription to the service retails at €19 ($24.77), which on the surface would suggest that the Mobile DTV Alliance case holds water. However, this price also includes 1GB of bundled data, and - most pertinently - more customers are tempted by offers which bundle mobile TV with low-cost telephony, or else (seeing that they have made the investment to buy a broadcast TV-enabled set in the first instance) purchase a three-month subscription, at which point the effective monthly spend falls to €9.67 ($12.61). Bearing in mind that there are already some casual (daily/weekly) users of the service in the market, and taking 3 Italia's own figures, the effective spend attributable to broadcast mobile TV is around €8.30 ($10.8) per user per month.
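As a quick sanity check, the dollar figures in the passage above are mutually consistent: they all follow from the single exchange rate implied by the €19/$24.77 headline price (treating that implied rate as fixed is an assumption of this sketch, since the article does not state a rate):

```python
# Derive the implied EUR/USD rate from the headline price, then re-derive
# the other dollar figures quoted in the passage above.
eur_usd = 24.77 / 19.0                       # implied rate, roughly 1.30
for eur in (19.00, 9.67, 8.30):
    print(f"EUR {eur:.2f} -> USD {eur * eur_usd:.2f}")
```

This reproduces $24.77 and $12.61 exactly; the last line gives $10.82, consistent with the rounded $10.8 cited above.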

Certainly, within Western Europe at least, customers seem unlikely to baulk at a monthly subscription of €8-10 ($10.43-13.04), provided that the service offers good coverage and service quality. Furthermore, operators seeking to retain customers could emulate 3 Italia and bundle mobile TV in as part of a wider content/data bundle, although one caveat is that in so doing they should ensure that they do not undervalue the mobile TV element of the bundle (particularly given that in most cases they will have been required to invest significant capital in any mobile TV venture).

Given the high cost of handsets, it is likely that most initial customers will be postpaid, with operators in a position to offer further incentives for customers to renew subscriptions by offering the high-end handsets either free of charge or (more likely in the first instance) at a significant discount as part of the subscription package. In addition, operators should make mobile TV available on a one-off basis to encourage revenues from casual postpaid users, who might just want to watch occasional sporting events or might be tempted to "dip their toes in the water" with one or two viewings before signing up to a monthly subscription. In such cases it makes sense to offer a varied tariff, with certain programmes (e.g. the aforementioned sporting events) priced at a premium to news and soap operas.

Conversely, for prepaid mobile users, it makes sense for operators to emulate postpaid subscriptions by offering subscription-type tariffs, with customers paying in advance (à la 3 Italia) for access to the service for 24 hours, a week, or a month.
One additional factor to bear in mind is that for simulcasts of pay TV services, the premium charged to existing subscribers of those services (i.e. those who already receive them via DTH, cable or DTT) will need to be lower than that charged to new customers: i.e., the premium they pay will be a mobility premium rather than a content premium. Thus, customers who already pay $50-80 per month for a Comcast, Viacom or BSkyB package will be reluctant to pay an additional $15 per month purely for the benefit of receiving that content over a mobile.

However, operators and service providers should instead see this as an opportunity, ultimately allowing them to bundle services as part of a triple- or quad-play package by locking customers in to a pay TV package across different media. Thus, for example, in the UK Sky Mobile offers existing Sky TV subscribers the opportunity to purchase themed bundles of Sky channels at £5 ($10) per bundle, thereby reinforcing its relationship with those customers within the mobile environment. While such bundles are not currently offered to non-Sky TV subscribers, it is possible that they might be in the future, albeit at a considerable premium.

The above "ifs" are not insignificant, and there are others, equally problematic, which are, unfortunately, not going to disappear overnight. That said, if and when operators and service providers have addressed those "ifs", have ticked them off on their "to do" lists, then the opportunity afforded by the small screen of mobile TV appears large indeed.

Dr Windsor Holden is Principal Analyst with Juniper Research, and can be contacted via: windsor.holden@juniperresearch.com

Today, most major carriers and service providers have well-advanced strategies to deploy all-IP Next Generation Networks. But how does one secure the delivery of a high-quality user experience over IP infrastructures? As with legacy networks and services, this will also be an issue for tomorrow's converged networks.
Few things in life are trivial, and providing high service quality to end users across a converged network is certainly not one of them. Although IP will in many cases simplify operations, convergence in the access layer and across services adds real complexity. Ever since the launch of UMTS, wireless operators have struggled with inter-technology handovers between their GSM and UMTS radio access networks. Once extended into a fixed-mobile converged (FMC) environment, handovers need to be facilitated between cellular access and both trusted and non-trusted local WiFi and WiMAX access, to enable mobility and Voice Call Continuity.
One of the enablers for FMC is the IP Multimedia Subsystem (IMS). Destined to provide a wide set of seamless services through service delivery platforms and application servers, IMS will be an important means of service expansion and hence revenue. Yet even though IMS is widely accepted, few believe it will be the only service environment. This means that operators need to manage multiple service domains on top of their multiple access networks and their core backbone network. Added to this, most operators will continue using their legacy infrastructure for the foreseeable future.
Much of the richness in tomorrow's services comes from the continued evolution of user terminals. Open Java and SIP smartphones enable a completely new set of services, and form the very foundation for operators' future revenue growth. The increased complexity of a service architecture with terminal-based client applications may, however, hold unpleasant surprises for operators. Most users who have attempted to use WAP and GPRS services have probably experienced problems - often due to trivial profile or parameter settings, but fatal nonetheless for the end user, and fatal to the success of the service. Open terminals can easily turn into a true nightmare for operators trying to guarantee seamless operation and roaming.
Service quality is probably one of the most important sources of differentiation. Having end-to-end visibility across the NGN network and services will be paramount, not only for understanding the customer's experience, but also for efficiently managing increasingly complex value chains, in which application and service partners play an ever more important role - rendering SLA management a must. Only through proper monitoring will operators be able to tell what their customers experience, and whether they are getting through loud and clear!

7 layers
More and more industry segments are beginning to integrate wireless modules, such as GSM/GPRS, UMTS/HSDPA, W-LAN and Bluetooth, into their products, considerably increasing the attractiveness and usability of those products. The range of businesses for which wireless technologies can bring considerable improvements is extremely wide: the continuously growing healthcare sector, the traffic and automotive sector, and the metering sector are only a few examples of industries that benefit from the integration of wireless modules into their products.
However, for manufacturers from outside the telecommunications sector, integrating wireless modules is quite a challenge. First of all, they have to keep an eye on the interaction between the various modules and the other hardware and software components within their products. And secondly, interoperability between products from different manufacturers is a must if products are to succeed on the global market.
But that is not all. Not only do manufacturers have to fulfill their own quality standards, they also have to fulfill country-specific regulatory and type approval demands, which increase considerably when integrating wireless modules. On top of this, module integrators have to make sure their products meet the requirements of qualification and certification regimes such as the Bluetooth SIG, the GSMA and PTCRB. Depending on how wireless modules have been integrated, and possibly altered, during the development process, it may be necessary to go through a fairly complex, time-consuming test and certification process. Having a partner who thoroughly understands the ins and outs of these processes can be a great help.
7 layers is one of the world's leading test and service centres for the wireless industry, supporting some of the largest mobile phone manufacturers with testing and certification of their products. In addition, it has a thorough understanding of a large number of successful modules, chip-sets and reference designs that have been tested in its laboratories. This is the experience that manufacturers from outside the telecommunications sector can build on when tackling highly interesting but demanding new business fields by integrating wireless modules into their products.

Service providers are actively looking to interactive, personalised IPTV services to differentiate their triple- and quad-play solutions. As always, the search for top-line revenue growth has to be balanced with a firm eye on bottom-line expenses. Although most operators understand how critical IPTV middleware is to delivering a superior quality of experience, few are aware of the economic impact of this choice. Recently, Espial, a leading IPTV middleware provider, released a white paper studying the economics of middleware. Its analysis considered several cost areas: set-top boxes (STBs); the IPTV head-end and ecosystem components; IPTV middleware; service innovation; and, finally, network infrastructure. The conclusion? IPTV middleware significantly impacts overall deployment costs on a $600M investment.
The fictitious operator in the white paper builds to one million subscribers over five years. The projected savings from judicious middleware selection were 33 per cent, or US$195M. Let's quickly explore the impact of middleware selection on two of the cost areas.
First, set-top boxes are a major cost line item - up to 50 per cent of overall spending. These costs also vary considerably depending on the STB features, truck-roll costs and the STB lifespan. For example, an SD STB costs in the order of $90-150, while an HD unit with DVR will run around $450. Well-architected middleware - with an efficient code base and data architecture - reduces memory and CPU requirements, extends STB lifespan and reduces truck rolls. This can substantially impact TCO, as noted in the chart above.
Second, middleware can affect IPTV head-end ecosystem costs, including procuring and integrating video systems with operations/business support systems (OSS/BSS). This covers the equipment to receive, encode, store and distribute the IPTV service to the set-top box, as well as OSS/BSS systems such as billing. Wise selection of middleware can reduce this cost area by 10-40 per cent. These savings are attributable to three areas: an open integration environment, a scalable architecture and multi-domain management. An open environment dramatically lowers system integration costs across the entire IPTV head-end ecosystem. A scalable architecture supports linear growth and ideally can support 100,000+ subscribers per application server. Finally, a multi-domain capability will support separate regional channel line-ups, UI skins and applications, allowing a single system to serve the needs of many communities and avoiding duplicative spending.
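The white paper's headline saving can be illustrated with a toy model. In the sketch below, the split of the $600M across cost areas and the per-area savings rates are our own illustrative assumptions, not Espial's actual model; the point is simply to show how area-level savings in the 10-40 per cent range can compound to roughly the quoted $195M:

```python
# Toy model of the middleware TCO argument above. The cost split across
# areas and the per-area savings rates are illustrative assumptions only.
baseline = {                 # $600m five-year deployment, by cost area
    "stb":     300e6,        # "up to 50 per cent" of overall spending
    "headend": 150e6,        # head-end / OSS-BSS integration (assumed)
    "other":   150e6,        # service innovation, network, etc (assumed)
}
savings_rate = {"stb": 0.40, "headend": 0.40, "other": 0.10}  # assumed

saved = sum(cost * savings_rate[area] for area, cost in baseline.items())
total = sum(baseline.values())
print(f"${saved/1e6:.0f}m saved on a ${total/1e6:.0f}m deployment")
```

With these particular inputs the model lands on $195M (roughly a third of the total), and it makes the mechanism clear: because STB and head-end costs dominate the deployment, even moderate percentage savings in those two areas drive most of the overall TCO reduction.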
To wrap up - prudent middleware selection can dramatically affect an IPTV service TCO.
Kirk Edwardson, Director of Product Marketing

