It's 8 a.m. and Lucy's mobile email has stopped working.  She's nervous, out of time, and out of patience.  Arriving at the office, she manages to reach a live person after what seems like an eternity on hold, only to then be led through a confusing set of menus and email settings. A half-hour later, the problem is solved, but is she really happy?  What was the impact on her loyalty, and on the mobile operator's operational expenses?  How could this have played out differently? David Ginsburg looks at one type of technology that provides the mobile operator - for the first time - with direct over-the-air access to the phone when the subscriber calls for help, thus avoiding the error-prone and inefficient interplay between the frustrated subscriber and the frontline CSR that has been the norm since the birth of the industry.

Everyone agrees: mobile network operators are facing challenges in delivering quality customer care, especially in light of the explosive growth of smartphones. Indeed, in the next few years, smartphones are expected to account for more than 75 per cent of new devices shipped.  These phones, now entering the mass market, are often difficult or counterintuitive to use and expensive to support. And operators, in a rush to deliver the latest and greatest device in a brutal and unforgiving market, have less control over the stability of the software on these phones.  They face a sea change from a simple world where the handset either worked or was physically broken, to a more sophisticated, more complex world where it is easy to misconfigure advanced services and settings.  These factors all add up to additional support costs, service abandonment and subscriber churn.  Operators have two options: either hire more frontline help, at considerable cost, or hold the line on expenses and risk reducing customer satisfaction and loyalty.  So how does Mobile Device Management (MDM) offer a way out?  The factors contributing to the customer care dilemma fall into three areas - handset recalls, handset returns due to usability, and configuration calls - and MDM can address all three.

Handset recalls
Handset recalls occur when the operator, working with the handset vendor, realizes that the handset, due to a hardware or software bug, is broken in some significant way.  Traditionally, the operator would issue a recall, forcing subscribers to bring their phones in to the store to be replaced or re-flashed.  This results in high per-device costs and does nothing to engender subscriber satisfaction.  Annual exposure amongst Tier-1 operators is upwards of $1.4 billion.  With FOTA (firmware over the air), MDM can address more than $500 million of that $1.4 billion, a figure that will grow over time with increasing FOTA client penetration and MDM server rollouts.  By 2013, MDM will be able to address a projected 75 per cent or more of an expected $1.9 billion in handset recall exposure.  These savings, along with the positive impact on the subscriber experience, are compelling arguments in favor of FOTA. Add the time-to-market advantage that the ability to update devices after they have left the factory gives operators, and the FOTA value proposition becomes clear.

Handset returns
Handset returns occur when a subscriber just can't seem to configure the phone properly. In fact, one in seven phones in North America, for example, is returned for this very reason, and in most of these returned phones no fault is found.  The global exposure for mobile operators from returns in 2009 will be $2.5 billion. MDM can initially address almost $400 million of this, growing to more than $1 billion in potential savings by 2013, as better control of configurations means subscribers can actually use their phones and the shiny new billable features on them.

Configuration issues
The biggest support challenge mobile network operators face is configuration, with more than 30 per cent of all calls being configuration related. Tier-1 operators field tens of millions of these calls every year.  Typical reasons for calls include "My phone doesn't ring anymore," or "I cannot receive SMS messages." The subscriber may leap to the conclusion that the device is broken or that the issue resides in the network, but in most cases the phone has simply been set to vibrate or silent. Text messages failing to come in or go out are often due to something as simple as a full SMS inbox.

In addition to problems like these, new device and service launches create their own challenges. For example, navigation services result in an entirely new set of questions, including "Does it work with my phone?" or "I've loaded it, but it is not working."  And of course, the care organization must be trained in addressing these complaints.
The ability to significantly reduce configuration call times is perhaps the greatest benefit MDM brings to the table.  In fact, configuration issues alone present mobile operators with a staggering $21 billion bill each year, a figure forecast to grow rapidly with the adoption of the smartphone. But there is a light at the end of the tunnel.  As mentioned earlier, device management opens a real-time channel to the device, allowing the CSR to see into the device and, when needed, reach out and fix the phone. Gone are the days of walking confused and frustrated subscribers through a twisty little maze of menu choices, all alike. Instead, that frustration and wasted time can be replaced with a "wow" experience where the subscriber is surprised and delighted by how quickly and how completely his or her problem has been addressed.  The figure below illustrates just how dramatic an impact MDM can have on a typical call.

The bottom line
Ultimately, MDM may save operators globally a total of $3 billion in 2009 across the three areas described above - recalls, returns and configuration calls.  This will grow to $23 billion in 2013 due to increasing OMA-DM device penetration and operator familiarity with the technology.  Mapping this to a typical Tier-1 operator with 50 million subscribers yields $80 million in potential savings in 2009 - more than enough validation for an MDM investment.  These numbers have recently been validated by the analyst firm Stratecast, providing the first third-party analysis of the positive impact of MDM on frontline care and customer satisfaction.

The call revisited
It's 8 a.m. and Lucy's mobile email has stopped working.  She's nervous, out of time, and out of patience.  Arriving at the office, she manages to reach a live person, and is greeted with a very different dialogue.  While she was on hold, the system had already polled the phone for its hardware and software status, and determined whether an update is recommended.  The agent then asks if she'd like her email settings checked against the operator's reference settings.  Of course, Lucy says yes.  The settings are retrieved, compared, and corrected in a matter of minutes, and Lucy is on her way.  Mobile Device Management, or MDM, is one technology that makes this all possible.
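
To make the mechanics concrete, here is a minimal sketch of the settings check described above. It assumes the operator keeps a known-good reference profile per device model and can read and write named settings over the air, as OMA-DM style device management permits; the setting names and values here are hypothetical.

```python
# Illustrative sketch only: in a real deployment the reads and writes below
# would travel over an OMA-DM session to the handset, not touch a local dict.

REFERENCE_EMAIL_SETTINGS = {          # operator's known-good profile (hypothetical)
    "incoming_server": "imap.operator.example",
    "incoming_port": 993,
    "use_ssl": True,
}

def check_and_fix(device_settings: dict, reference: dict) -> dict:
    """Compare device settings to the reference; return {setting: (was, now)}."""
    corrections = {}
    for key, expected in reference.items():
        actual = device_settings.get(key)
        if actual != expected:
            corrections[key] = (actual, expected)
            device_settings[key] = expected   # pushed over the air in practice
    return corrections

# Lucy's misconfigured handset: wrong port, SSL disabled.
device = {"incoming_server": "imap.operator.example",
          "incoming_port": 143,
          "use_ssl": False}

for key, (was, now) in check_and_fix(device, REFERENCE_EMAIL_SETTINGS).items():
    print(f"corrected {key}: {was!r} -> {now!r}")
```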

David Ginsburg is Vice President of Marketing and Product Management at InnoPath Software. He can be reached at dginsburg@innopath.com

The most credible revenue opportunities are focused around the metro rather than the core: residential multiplay, business managed services, next generation mobile, software as a service, etc. Service providers should now take a deep look at the metro networks they build, say David Noguer Bau and Jean-Marc Uzé, and evaluate their level of optimisation and their potential to evolve in step with the demands of new services

In recent years a large portion of service providers' infrastructure budgets has gone to access and metro networks. These investments have been driven by the convergence of services to all-IP, but unfortunately this has frequently translated into multiple networks, purpose-built for each service and application; paradoxically, it was during the push for convergence that multiple new networks were built, creating a network-divergent world.

It's now time to transform these networks into true convergence, driven by cost and simplification and, more importantly, by the potential for their future monetisation.
A few years back, the architects of core networks were facing a similar situation; with declining demand for TDM (Time Division Multiplexing) services and the emergence of Ethernet as the standard interface, the core required a major transformation to achieve true convergence. The issue was to build a simplified network without unnecessary layers, enabling the efficient coexistence of packets and circuits. The idea of layering, and of separating "services" and "infrastructure", was justified in order to allow for a very stable network over which each service could be managed on its own, so that a particular service failure would typically affect just that service. Today, however, with the success of a foundation multi-service layer based on IP/MPLS and deployed by the IP services department, there are very few services built directly on top of the transport layer.

With the majority of revenues coming from the higher layers of the network, most service providers have decided to go for a more pragmatic solution in the core: the integration of the transport and IP departments into a single group, to achieve better visibility of the layers required to build an efficient network and so simplify operations. New technologies such as PBB-TE or MPLS-TP emerged, promising future integration with the transmission elements and, ultimately, an optimised model for L0 to L2 services. The transformed core networks, however, are built with IP/MPLS over DWDM, leveraging the advantages of the two worlds and providing the flexibility and efficiency required by revenue-generating services, while avoiding the limitations of circuits-only or basic packet functionality. Moreover, this allows the same model, architecture and operational processes to be used in the metro Ethernet as have been deployed in backbones over the last ten years.

Today's metro networks are already Ethernet-centric, most of them with IP capabilities, but a large number of service providers still keep separate networks for business, residential and mobile operations. By keeping them separate, service providers are losing a valuable opportunity to capitalise on the synergies between them.

The main goals for convergence are simplification, lower operating costs and flexibility. To achieve true convergence the service provider must look at the specific requirements of each service and application in order to integrate all of them successfully. Equally important is building greater resiliency and scalability across all services and applications. MPLS seems to provide all the ingredients for this mix.

The deployment of MPLS metro networks is not new; however, due to the lack of scalability of some early implementations, many service providers decided to split the metro into the multiple networks they have today. MPLS now offers service providers the scalability and tools required for a converged metro, and Juniper Networks can provide them in a cost-effective way: LDP/BGP MPLS interworking, point-to-multipoint LSPs, unparalleled MAC address table capacity, and L2 and L3 integration.

Purpose-built networks have resulted in faster service deployments in line with the required SLAs, but also in expensive, monolithic and rigid infrastructures unable to evolve with services and new demands. With the converged metro there is a risk of recreating dedicated service delivery points at the edge of the network - bringing back most of the issues experienced in the past. A service-specific edge element can't evolve with new applications, and imposes a one-size-fits-all model that is not applicable to every service.
Service providers are seeing increasing value in deploying services in a more distributed fashion: location-based services, local advertisement insertion, distributed caching of high-demand services (video, P2P). Still, some services should remain centralised. As services evolve, the most efficient placement of their delivery points may change, in order to scale with user demand while optimising operational expenses.

The solution should be based on adding an intelligent service plane so that the metro nodes can run a variety of services. The service provider has to be able to decide which services to deliver and where to enable them, matching the architecture each individual service requires. This model virtually converts each metro node into a true "Intelligent Services Edge". With this model the service delivery point for each service can be anywhere, providing the required flexibility and ultimately translating into the expected service velocity and the agility to innovate with creative services, retain existing subscribers and win new ones.

A converged MPLS metro network with built-in flexible service deployment brings the service provider a significant competitive advantage, with the richest available set of tools: L2 VPNs, IP VPNs, Intrusion Detection and Prevention (IDP), broadband subscriber management, and Session Border Control (SBC).

We've seen how a uniform MPLS infrastructure provides seamless service continuity between core and metro so the services can be placed wherever they are most effective.
If the access, metro and core were all based on different technologies, moving a service around would be considerably harder, and may involve shifting boundaries or re-architecting the entire network.

Service providers look for service ubiquity across access technologies (xDSL, FTTx, 2G, 3G, 4G) and this requires a scalable resilient network. The Broadband Forum (BBF) is already debating  "MPLS to the Access Node" for such applications.

With end-to-end MPLS, from the moment a customer packet enters the network until it exits, it experiences no breaks and no discontinuities, whether the customer is residential or business, fixed or mobile, commodity bit-pipe or deeply service-oriented, Layer 2 or Layer 3 or even Layer 7. The implementation of MPLS to the access node obviously provides convergence, since the network is uniform. Furthermore, it provides true service flexibility to deploy services when and where needed, as the access LSP typically becomes the access virtual link of any given service instance, hosted in the appropriate intelligent services edge. It also allows new services to be brought up quickly and easily, and moved as their requirements evolve.

Building a single MPLS network as described entails very large scaling requirements: from fewer than 1,000 nodes today to 10,000-100,000 nodes in a single all-encompassing MPLS network spanning access, metro and core. It also requires robust protocols, devices and OAM, along with the low latency and resiliency levels needed to provide 50-millisecond service restoration. The network architecture required to achieve these requirements must not constrain services in any way.
MPLS technology inherits a hierarchical approach and inter-domain signalling (BGP) that make a scalable end-to-end model possible. The architecture to achieve these requirements divides the network into "regions" and establishes the demanded connectivity within and between them. Simplicity is achieved with a single IP connectivity model for the control plane and MPLS connectivity for all customer packets.
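
To make the hierarchy concrete, the sketch below models the label stack such a region-based design implies: an intra-region transport label on the outside, a BGP-signalled label providing inter-region reachability beneath it, and the service label innermost. This is the commonly described "seamless MPLS" arrangement rather than anything mandated by the article, and the label values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LabelStackEntry:
    label: int   # 20-bit MPLS label value
    role: str    # what this label represents in the hierarchy

def build_stack(igp_label: int, bgp_label: int, service_label: int) -> list:
    """Return the stack outermost-first, as it would appear on the wire."""
    return [
        LabelStackEntry(igp_label, "intra-region transport (LDP/RSVP to the region border)"),
        LabelStackEntry(bgp_label, "inter-region reachability (BGP labelled unicast)"),
        LabelStackEntry(service_label, "service instance (e.g. pseudowire or VPN)"),
    ]

# Hypothetical label values, for illustration only.
for entry in build_stack(igp_label=299776, bgp_label=24005, service_label=100001):
    print(f"label {entry.label:>6}: {entry.role}")
```

The design point is that each region only has to scale its own transport labels, while BGP carries end-to-end reachability - which is what keeps a network of tens of thousands of nodes tractable.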

The network exists to enable services but, unfortunately, too often the network architecture dictates which services can be offered. It should be the services that determine connectivity paradigms, quality of experience and resiliency requirements.

Building a converged metro network and adding service delivery flexibility creates a significant competitive advantage for service providers who are focused on high-performance network infrastructure. Extending MPLS from the core to the metro, and perhaps to the access nodes, provides seamless service continuity with better control of the subscriber experience and will ultimately bring the desired monetisation of the network.

An intelligent services edge, coupled with MPLS in the access, provides the ultimate flexibility for service providers to offer any service, at appropriate scale, while minimising the cost of managing them. Moreover, it allows service providers to innovate by creating new services, ultimately enabling the kind of non-disruptive trial-and-error approach that has, so far, generated the most profitable applications on the internet.

Services are the reason subscribers will stay on your network.

David Noguer Bau and Jean-Marc Uzé, Juniper Networks

With peak data rates of 100Mbps+ downlink and 50Mbps+ uplink already being promoted by operators - a seven-fold increase on today's 3G High Speed Packet Access (HSPA) services - Long Term Evolution (LTE) will, for the first time, move wireless networks to an all-IP domain. Mike Coward and Manish Singh look at the role that emerging technologies like femtocells and deep packet inspection (DPI) will play as voice and data networks converge

Most of the industry is united in the common demand for mobile broadband, and there are certainly enough applications and content types to use up the 100Mbps downlink data rates of LTE. But the fact remains that subscribers will always demand more for less. Even if LTE delivers a seven-fold increase in data speeds, end users certainly won't pay any more for it, let alone seven times more for that privilege. Thankfully, LTE's spectral efficiency - four times that of 3G - can deliver some cost savings, but there needs to be a great deal more optimization before operators can make significant cuts to the cost per bit.
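
A rough back-of-envelope calculation illustrates the squeeze. Only the ratios come from the discussion above (roughly seven-fold traffic growth, roughly four-fold spectral efficiency, flat revenue); everything else is a simplifying assumption.

```python
# Normalised to today's levels (1.0 = today). Illustrative assumptions only.
traffic_growth = 7.0      # subscribers consume ~7x more bits on LTE
efficiency_gain = 4.0     # the same spectrum and sites carry ~4x more bits
revenue_growth = 1.0      # users won't pay more for the extra speed

cost_per_bit = 1.0 / efficiency_gain               # falls to ~0.25x today
revenue_per_bit = revenue_growth / traffic_growth  # falls to ~0.14x today

print(f"cost per bit:    {cost_per_bit:.2f}x today")
print(f"revenue per bit: {revenue_per_bit:.2f}x today")
```

Under these assumptions revenue per bit falls faster than cost per bit, which is precisely why spectral efficiency alone is not enough and further optimization is needed.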

Before signing off on their LTE strategies, carriers need to answer some fundamental questions. How should capacity be increased to meet end-user demand? Where should this capacity be built into the network? How can operators slash the cost/bit without massive capital expenditure? These are questions that require an appreciation of subscriber behaviour. For example, one insight is that nearly 60 per cent of today's voice calls start and finish inside a building. So what does this mean for data usage?
We submit that operators should consider the lessons learned from 3G roll-outs; recall that widespread 3G adoption took five years longer than the industry initially predicted. This fact, combined with the sky-high cost of obtaining 3G wireless spectrum, meant that return on investment (ROI) for 3G services was also delayed by five years. Meanwhile, the 2.1GHz spectrum wasn't effective at delivering indoor coverage, causing a real customer satisfaction issue. And let's not forget the big upfront capital outlays required to build these national 3G networks, which didn't provide material cash flow until there was widespread adoption. Learning from these 3G lessons, it's clear that wireless operators need to find ways to ease into LTE deployment in a cost-effective, scalable manner that minimizes upfront investment and risk.
Fortunately, there is already a technology solution to address these basic yet important issues of increased coverage, more capacity, and reduced churn. Femtocells are small wireless base stations which sit in consumers' homes to provide a 3G air interface, thereby enabling 3G handsets to work much better indoors. By utilizing the consumer's IP broadband connection - such as DSL or cable modem - a femtocell is able to connect to the operator's 3G core network.
We believe mobile operators must leverage LTE femtocells, also known as Home eNodeBs (HeNBs), as part of their LTE network rollout strategy. By enabling carriers to build their LTE networks one household at a time, one femtocell at a time, operators can avoid huge upfront capital expenditure in building citywide and nationwide LTE networks. In other words, femtocells empower operators to augment capacity where it is needed the most - inside homes, offices, airports, etc. - while leveraging their existing 3G networks to provide widespread coverage.
Indeed, there will come a time when it will make economic sense for operators to build citywide macrocell LTE networks. Until that day, LTE femtocells offer operators the ability to expand networks in line with market demand and investment plans.
Thanks to minimal costs related to site acquisition, power, cooling, and backhaul, femtocells are the cheapest type of cell site an operator can deploy, all the while increasing capacity and driving down the cost/bit significantly. Because femtocell devices reside in consumers' homes, utilizing their existing electrical and broadband IP connections, most of the cost is passed to consumers.

It is important to remember that the 100Mbps+ downlink data rates which LTE femtocells will deliver cannot be supported by existing DSL or cable modems, which currently achieve a maximum of 7-10Mbps. However, by 2010, when we expect LTE networks to begin rolling out, FTTx is likely to provide the baseline residential backhaul infrastructure, and early FTTx adopters are likely to also be the early LTE adopters.

LTE also faces other challenges as it reaches broader market rollout. For example, the proliferation of increasingly intelligent handsets and high wireless bandwidth - giving subscribers network connections equal or superior to personal computers on wireline broadband - mean LTE networks are likely to quickly become swamped with peer-to-peer (P2P) traffic and susceptible to the same type of aggressive network security attacks that afflict wireline networks.

Deep packet inspection (DPI) is one technology that is likely to take centre stage in ensuring LTE networks deliver the high-speed data rates that have been promised. DPI broadly refers to systems that inspect the contents of packets, normally for the purpose of identifying the application creating the traffic, such as Voice over IP (VoIP), P2P, e-mail, or web page downloads. DPI systems take this information and trigger appropriate actions such as traffic shaping, traffic management, lawful intercept, caching, and blocking. DPI has already emerged as a key technology in managing the growth of data traffic in wireline networks.
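
As a toy illustration of that inspect-classify-act loop, the sketch below matches payload bytes against a couple of well-known application signatures and maps the result to a policy action. The signatures and the policy table are illustrative assumptions; production DPI engines rely on flow state, behavioural analysis and thousands of signatures rather than a handful of regular expressions.

```python
import re

# (application, payload signature) pairs - a tiny, illustrative subset
SIGNATURES = [
    ("bittorrent", re.compile(rb"\x13BitTorrent protocol")),   # BitTorrent handshake
    ("sip_voip",   re.compile(rb"^(INVITE|REGISTER) sip:")),   # SIP signalling
    ("http",       re.compile(rb"^(GET|POST) ")),              # web traffic
]

# Per-application action, as an operator policy might define it
POLICY = {
    "bittorrent": "shape",       # throttle P2P to protect other traffic
    "sip_voip":   "prioritise",  # keep voice quality high
    "http":       "allow",
    "unknown":    "allow",
}

def classify(payload: bytes) -> str:
    for app, pattern in SIGNATURES:
        if pattern.search(payload):
            return app
    return "unknown"

def action_for(payload: bytes) -> str:
    return POLICY[classify(payload)]

print(action_for(b"GET /index.html HTTP/1.1"))    # -> allow
print(action_for(b"\x13BitTorrent protocol"))     # -> shape
```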
P2P blocking and traffic shaping have been typical of the highest-profile DPI deployments in the wireline market to date. While U.S. cable provider Comcast's blocking policy received much criticism at the time, today the industry is coming to realize that some shaping of subscriber traffic is required. Furthermore, delivery of premium services (such as prioritized bandwidth) and the fulfilment of service level agreements (SLAs) can both be implemented with DPI. While initial DPI deployments have been focused in the fixed broadband arena, mobile DPI deployments are on the increase as wireless data traffic explodes, and analyst predictions suggest that mobile DPI revenues will exceed fixed DPI revenues by 2011.

In fact, DPI is arguably the most effective tool for re-taking control of the network. When used to implement network-based security, DPI can block an attack in the network - closer to the source - before it ever reaches subscriber handsets, protecting thousands or even millions of subscribers at once. DPI can also control P2P traffic by throttling it back in order to protect more valuable (and revenue-generating) web, e-mail, or mobile video traffic.

Another interesting perspective to consider is that consumers have come to expect flat-rate plans for their home broadband connection, and this paradigm is now being replicated in the mobile broadband arena - the opposite of what wireless carriers had hoped for. Most major US carriers have already moved to a monthly flat rate for calling, short message service (SMS), and wireless data. Such a flat-rate pricing model leaves carriers searching for new and advanced services with a premium price tag that would empower them to recoup their investment in LTE technology.

DPI systems can provide the basis for delivering such innovative new services, helping operators differentiate and generate additional revenue. For instance, operators might want to offer different bandwidth levels for different price plans, speed boosts when connecting to affiliated network sites, or application-optimized packages that prioritize gaming, VoIP, and video conferencing traffic. DPI platforms also offer a market intelligence benefit, helping carriers gather information on where and how data is used in order to plan new service packages or create targeted mobile advertising.

When it comes to deciding where the DPI technology should sit in a wireless network, there is some debate. We believe the optimum approach is to include DPI functionality in the LTE network nodes themselves, particularly if those nodes are based on standards-based, bladed architectures like ATCA. Combining the functionality in this way reduces the number of separate "boxes" in the network and therefore removes some of the complexity and administration required. Packet latency through the system is also improved by reducing the number of hops, which is critical in maintaining good voice call performance in an all-IP network. A bladed environment where DPI and other functions can be mixed and matched also reduces rack footprint and lowers system cooling and management costs while giving carriers the option to upgrade DPI functionality as new threats and defences emerge.
While these are still very early days for LTE, femtocells provide a compelling alternative for how operators can build out their networks; operators can launch new services and higher data rates more quickly without the front-loaded capital expenditures normally required for building citywide and nationwide networks. Likewise, DPI provides solutions to the technical and security challenges posed by high-bandwidth LTE or 3G connections to increasingly open, intelligent, and sometimes vulnerable handsets. Together, femtocells and DPI provide carriers with a new set of business tools to increase average revenue per user (ARPU) while delivering flexibility, customer satisfaction, and return on investment.

Mike Coward is CTO, Continuous Computing, and can be contacted via: mikec@ccpu.com
Manish Singh is VP Product Line Management, Continuous Computing, and can be contacted via: manish@ccpu.com

As financial turmoil rampages across the world's markets, Professor Janusz Filipiak, founder and chief executive of OSS/BSS provider Comarch, tells George Malim that he sees great opportunity as carriers seek to streamline their operations and get to grips with new business models, services and the complex new telecoms value chain

Comarch, the Polish IT solutions provider, has been developing OSS/BSS systems for telecoms since 1993 and now provides a portfolio of systems and managed services to incumbent, broadband and triple-play operators as well as MVNOs/MVNEs and start-ups. With a turnover of €170m, more than 3,000 employees and a customer roster that includes T-Mobile Germany and Austria, Bouygues Telecom France, O2 Germany and Polkomtel and PTC in Poland, the company has enjoyed a 33 per cent increase in turnover over the last five years. As the general economic crisis deepens, founder and chief executive, Professor Janusz Filipiak, thinks vendors will have to chase harder and act more cleverly to win deployments.
"Now all companies have to be mean and lean in the recession" he says. "We are very cost minded and every bit that is not needed is removed. You can't come to carriers with a higher price than your competitors. IT engineers are now a global resource and want the same payment in China or the UK, for example, so we are in the same position as all vendors. We can't compete on price so we can only be more intelligent and more effective than others. In spite of the recession we must now continue to invest in developing new products."
Current financial market woes aside, Comarch is heavily focused on the mobile market and recognises the challenges faced by operators. "In today's world of telecommunications, mobile operators are faced with the challenges resulting from market saturation in the majority of countries," adds Filipiak. "Innovative product offerings and enhanced service levels are indispensable in order to gain new customers and prevent churn. Operators are searching for the Holy Grail of telco that will prevent ARPU from decreasing. As voice is still the ‘killer application', we see data and value added services as a fast growing market. Other trends are still ahead of us, such as strong market competition from global corporate customers seeking the best deals from global mobile groups."

Filipiak also sees great potential in currently non-mobile operators. "Keeping in mind that everything eventually goes mobile, we haven't forgotten the great potential of fixed broadband operators, cable TV providers and triple and quad play operators," he says. "We target different segments of the market while not focusing exclusively on a single one."
Pre-paid billing has been one of the major functions carriers have sought during the life of Comarch but, as bundled and flat-rate packages become more popular, Filipiak sees its emphasis waning. "Today there are not too many content services available but they will come," he says. "Video streaming will put new requirements on bandwidth and devices. It will be very resource consuming and will be charged via pay-per-use. The experience won't be very different to paying for bandwidth or connection time with voice. Pre-paid is a method of payment which is still the most popular for the youngest segments of users, but pre-paid is becoming less related to cheap prices - because those are achievable in post-paid models as well - than to a philosophy of ‘no contract, no obligation'."

Flat-rate offers will be harder to make business sense of. "Flat rate is only viable in a world with unlimited capacity," adds Filipiak. "Flat-rate packages make a difference in the final price of services but the introduction of real flat-rate, where everything is included and mobile access is a commodity like internet or electricity or gas, will lead to a weakening of pre-paid which will favour post-paid."

Filipiak sees the market moving in this direction. "We can see that many players are moving towards a mix of post-paid with a significant amount of free minutes, SMS and MMS in a bundle," he adds. "This offer is really close to an actual flat rate and assures stable revenue for providers as well as strong customer loyalty and a resulting decrease in churn. My mantra in telecoms is that customers now expect everything to be easy."

The emergence of mobile content and the move to data services put obvious pressure on carriers' systems and the telecoms revenue chain has become much more complex. Comarch has long been prepared for this shift, as Filipiak attests: "The revenue chain is more complex and an operator is now not the only one that provides the services delivered. Service ‘sponsoring', third-party service providers, resellers and service dealers introduce the need for multi-party billing and put more pressure on monitoring quality of offerings," he says. "Our solutions also address and deal with the complexity that content and data services bring in wholesale, next generation TV, content distribution, service creation and control. We address these needs through our InterPartner Billing solution. On the OSS side, we provide service level management and service level inventory, our flagship OSS products, which enable service modelling of resources and services provided by different parties along with pro-active quality monitoring and management."

Comarch has grown from its eastern European roots and now has operations in 30 countries and addresses operators of all sizes and types, as Filipiak explains: "The Comarch brand is recognised in the telecoms world," he says. "We've been in the industry for 15 years and time is now working for us. Our biggest customers for specialised OSS solutions are Tier 1 operators. Large operators with 10 million subscribers are customers for our InterPartner Billing and, when it comes to independent operators, we have about 30 per cent of the local market as clients for integrated BSS and OSS/BSS solutions. We also target the largest CATV and broadband operators offering convergent services. Our strategy also addresses global players where we can offer the best value, give good prices, still be flexible and deliver enterprise level services."

In spite of the general economic downturn, Filipiak still sees great opportunities emerging. One area is that of next generation mobile networks and self-optimising networks. "Such concepts will invite carriers to look for solutions outside the long established segments of OSS, such as inventory management, configuration management and network planning," he says. "It will not be sufficient to cover one area in the future; instead, co-operation of the planning and operations areas will be needed, where we see an opportunity for us. In addition, carriers are now more oriented towards a loose coupling of functional modules and standard interfaces, which makes it easier for smaller players, like us."

New means of delivering solutions are also critical. "With our future-proof architecture of solutions, we can address modern modularity concepts and tendencies that now exist in the market," adds Filipiak. "Openness and standard interfaces in high-level OSS products are the key, and customers can choose the best modules for their operations. This provides a possibility to reduce opex by utilising new business models for our customers: network virtualisation, distribution and outsourcing of operations, and hosting solutions."

Regardless of the current economic gloom, Filipiak believes a new investment wave must come to the telecoms market. "Investment must happen because there will be greater demand," he says. "Physical travel will be a high cost so there will be more load on existing networks."

Carriers face massive challenges in spite of the increased demand for their network capacity and services. "In the international mobile groups, unification and co-operation issues are still of key importance in order to gain competitive advantage on the global market," he says. "Outsourcing of operations has become very popular but unsurprisingly it has turned out not to be a remedy for everything. Carriers still need to adapt their business processes and way of thinking to this new model. On the other hand, the need to reduce capex is forcing carriers to introduce scenarios of sharing physical resources, such as radio masts."
Filipiak also identifies churn prevention, automatic client profiling and concentrated web-based marketing campaigns as further issues carriers will need to address.

Winning business from the large carrier groups against this backdrop is, without doubt, difficult.

"International groups are certainly challenging customers" admits Filipiak. "National companies differ in software environments, processes and levels of maturity as well as corporate and national culture. They therefore require a flexible approach in implementation strategy and software functionality and look for a common architecture for their network as well as their IT systems. Such carriers pay a lot of attention to building up corporate standards at the services level and business process levels in order to achieve a common view."

Good products, knowledge and proven experience are the ways to win this type of business. "No PowerPoint-slide solutions can be sold anymore," adds Filipiak. "It takes a lot of time and money but these are the only ways to win contracts with groups."

However, winning such business is never achieved on a static battlefield. Carrier consolidation continues and that can be both a threat and an opportunity for solutions vendors. "On one hand, it is difficult because some groups will enforce product choices at the global level, and it may be more difficult for Comarch to gain a global recommendation in a large group since we have to fight for our portion of the market with much stronger players," says Filipiak. "On the other hand, consolidation forces carriers to unify their OSS/BSS landscapes and this is a good opportunity to change long-established solutions for something new and fresh. Heterogeneous environments of global groups, with plenty of flavours in different countries, require a great level of flexibility that Comarch can provide. We already have positive experience with such projects - for example with T-Mobile - that enables us to be optimistic for the future."

"Ultimately, we must live with the situation" adds Filipiak. "We're a service company and it's not our job to comment or expect specific customers to behave in any particular way. The level of consolidation is already very high so we may not see much more, in any case."
It's not only carrier consolidation that presents challenges to vendors, though. Carriers are at different stages of their business, and that places a development burden on all vendors as they seek to develop systems applicable to individual carrier needs.

"Comarch builds its solutions for different segments of the telco market," says Filipiak. "We offer both pre-integrated solutions for small business, such as an integrated BSS/OSS solutions for an MVNO, and complex solutions tailored specifically for the needs of large players. We have frameworks and modules of software but we've never sold it without adaptation. In the end, it is always a construction job. You have modules but ultimately you must put them together in different ways."

Comarch has grown organically since its inception in 1993 and has shunned much of the merger and acquisition activity that has occurred in the OSS/BSS sector in recent years. "Our product portfolio follows unified design principles and is not the result of an acquisition of missing parts," explains Filipiak. "This gives us the possibility to offer seamlessly integrated solutions and products that complement each other while at the same time not being redundant in functionality."

Inevitably, for all rules there are exceptions, and Comarch has recently announced an agreement to acquire 50.15 per cent of the Frankfurt-listed SoftM Software und Beratung AG in a transaction that could exceed €22m. The German software producer and systems integrator employs 420 personnel and supplies more than 4,000 customers.

Filipiak is open to further moves, although they will be well considered. "Acquisition, yes, but only in a way that we can handle along with continued organic growth. There will be no miracle from us, just steady organic growth."

Filipiak also rejects any notion of selling the company. "The company isn't for sale. My family has a controlling stake and I'm not going to sell now. The company's value is increasing and the scope of the business grows every day."

George Malim is a freelance communications journalist

Keith Willetts uses his considerable experience of the communications industry to fuel a journey into the future and finds a comparatively optimistic landscape - provided regulators and governments recognise their vital role in encouraging investment

So the holiday is over; it's snowing; the power just went off; another retailer just went bankrupt and the media are keeping up their unrelenting glee at how awful the coming year is going to be. So not a great backdrop to an article on what the future might have in store for the communications industry!

But looking at life through such a gloomy lens clouds the reality that the communications industry has fantastic prospects. Recessions are like forest fires - sure, they do a lot of damage, but they also foster renewal and allow new plants to thrive. There isn't a business on the planet right now that isn't looking at how to work more efficiently and market more effectively - and that's an incredible opportunity for our industry, which enables both these things to occur. Economies could save themselves trillions through much more widespread net-enabled home working, swapping the ever-burgeoning physical transport of commuting for communications. TV stations are falling over themselves to launch online replay services to catch those programs you missed while out partying, to play on the record numbers of large-screen TVs sold this Christmas; Nokia launches phones with ‘all you can eat' music included; Nintendo is launching an online TV service to exploit the millions of net-connected Wiis out there... The list goes on and on - all drivers for growth of the communications industry.
But not the industry we've known in the past. The high-cost, slow-moving and innovation-weak communications industry of the past will be consigned to history by this recession. Those companies that have not embarked on moving to a lean, agile and innovative business model - or those that have, but are deciding they can't afford the costs of change - will suffer badly in the coming months. Trying to compete in the second decade of the 21st century with business models, processes and systems suited to the 1990s will simply not wash. In the past, service growth masked the need for fundamental and serious change, but that growth runs out of steam with market saturation or recession.

So the old adage of ‘when the going gets tough, the tough get going' could not be truer in today's communications markets. Winners will be those companies who spot the opportunities that the recession creates and transform their business delivery to an ultra-low-cost, highly automated and integrated approach. Losers will be those players who batten down the hatches, cut investment and hope for better times - they simply won't come.

At its core, the communications business is a transport business. Instead of physical goods, our transport business shifts digital information in ever increasing quantities but at prices that aren't going to keep pace with that growth. So what's new?  PC processors, memory, in fact most consumer goods, survive in a market where the capability goes up continuously while the price comes down. Companies like Intel and Dell have built business models that can thrive on that approach and communications providers can too. Out with multiple networks, one for each service; out with hundreds or even thousands of disjointed and fragmented back office systems that are replicated for each service yet can't communicate with each other or provide even a consolidated bill to a customer. Out with months of battling with different barons who own resources to get a new service launched. In with one, multi-service network infrastructure; in with one highly integrated and automated suite of business processes designed around the customer, not the network.

Market change has been happening gradually for a number of years but the recession will sharply accelerate the need for change and new investment, not slow it down. At the TM Forum, we work with over 700 member companies from 75 countries to develop and deliver industry roadmaps, guidebooks, best practices, training, conferences, benchmarks and industry research on how to make that investment journey as painless and as low risk as possible. The question for CEOs and Boards to consider right now is this: "What is worse - the costs and risks of changing to meet the coming challenges or not changing gears and not being able to compete in a transformed marketplace?"

But service providers can't do this alone. Their markets are not unfettered models of capitalism where investment can flow to fuel opportunities - they are distorted by regulation. Regulators must play their part by making some fundamental changes in how they govern the industry, if we are going to get the kind of innovative, high quality and low cost services that we need. For too long regulators have been preoccupied with stopping rather than enabling - curbing rather than liberating. And while we undoubtedly have a much more open marketplace, we also have an industry that is not investing enough to meet growing market needs because of business uncertainties created by regulation. 

Take, for example, access networks. Demand for access network bandwidth is growing along Moore's-law lines - yet regulators and governments continue to bleed the industry dry through spectrum auctions, or to create major uncertainties over investment returns for upgrades to fibre.  Despite marketing hype, there isn't a fixed-line player in the world investing in fibre access at a rate that is likely to keep up with demand. With innovations like the TV merging with the PC, and HD and 3D TV coming on stream, bandwidth demand will grow at a much faster rate than fibre can be installed.

ADSL has a theoretical maximum speed of about 25M, but in practice 8M is much more realistic, and many users (like me) have to put up with 0.5M if they are lucky. Just as in the early days of mobile, we are prepared to put up with poor service because the gain outweighs the pain. Even if network operators started to invest in major fibre rollouts tomorrow, we are talking five to ten years to complete the task in many countries. What do we think the average home will be using online services for by 2015?  Yet almost no access fibre is rolling out, because regulators have been so exercised about opening up access networks that they have forgotten that people only invest if there is a return at the end of it.

Nicholas Negroponte once described high-priced 3G bandwidth auctions as ‘condemning our children to an information poor society'. I'd include the lack of action by regulators to create a sensible investment climate for infrastructure change as part of that condemnation. Fibre technology has been around for a couple of decades and, in the beginning, service providers could not get excited about replacing copper phone lines with fibre ones. But the ability of today's information society to fill networks as exponentially as we fill hard disks means that there will be no shortage of applications to use that capability (I just saw interactive 3D TV in Japan!).  But where will the networks be to fuel that growth? Still on the drawing board, unless governments stop using the rear-view mirror as a navigation aid.

The recession raises the stakes for the communications industry - there will be winners but also many losers. Service providers must take huge gambles and invest in renewing their infrastructure, their processes and their systems. Many won't take that risk and in five years' time probably won't be around, but regulators and legislators owe it to the people who pay their wages to do everything they can to encourage that investment and in turn benefit their economies.

Let's look back five years from now and see this recession as the turning point that delivered us a 21st century communications infrastructure and a set of lean and agile 21st century providers - not as the period that set us back a decade because we failed to make the transformational leap we have to perform.

Keith Willetts is Chairman and CEO, TM Forum www.tmforum.org

Lynd Morley searches for light in the economic gloom

As we begin the new year, the term ‘economy', presented in any word association game, would undoubtedly elicit such well-worn  (some might say overworked) responses as ‘credit-crunch', ‘downturn', ‘recession', even ‘depression' - reflections of the continued deeply negative mood in the financial capitals of the world, the various industrial heartlands, and the international media.

Yet there are some positive lights flickering in the almost impenetrable gloom - not just at the end of the tunnel, but fairly and squarely at the point in the tunnel where we now find ourselves.  One of those lights is the information and communications technology industry, and while the telecoms market is not immune to the present economic tribulation, opportunities do loom, as real-time communication becomes more imperative in business and remains an essential part of our social existence. Certainly, according to Frost & Sullivan's principal analyst, Sharifah Amirah, telecoms is one of the few industries which has a "strong leg to stand on and is likely to gain from the downturn" in the economy.

Amirah is not alone in injecting a slightly more positive note into the discussion - specifically where telecoms is concerned (see also Keith Willetts' comments in this issue of European Communications, on pages 14-16).  Amirah notes that in the face of economic adversity, enterprises will be looking to minimise risk and improve operational efficiency, and this focus on core competencies and reducing operational costs will open doors for IT and telecommunication service providers.

At the same time, rising unemployment rates and falling GDP growth are forcing end users to spend less money on entertainment and digital communication.  In light of this reduced consumer spending, Amirah sees pricing as a short-term priority for service providers. In the mid-term, value-added services and innovative distribution models will be key to growth.  And while the focus will be very much on surviving the next couple of years, a clear sight of further horizons should still be retained.

In the longer term three key themes will prevail - mobility, content and bandwidth.
Amirah sees the industry moving towards divestment, consolidation, collaboration and greater investments in research and development. Service providers with strong fundamentals will survive the storm and transform into more streamlined entities. Sharing business risk and securing more immediate returns on investments will offer a basis for both vendors and service providers to capitalise on new opportunities.

In terms of the public sector, several governments have already channelled resources towards digital infrastructure as a reboot mechanism for the economy. This would provide a stimulus not only for the ICT industry but could also improve digital public services such as e-health and e-procurement.

On the whole, for telecoms at least, the outlook is far from gloomy.  Indeed the flickering lights threaten to illuminate the much vaunted but almost universally ignored need to resurrect confidence in the economy in order to begin some return to financial health.

The MPLS & Ethernet World Congress 2009 conference agenda will pay particular attention to Ethernet and MPLS transport standards and mechanisms.  Video transport, services and mobile backhaul will also be covered in detail.

Meanwhile, the traditional debate will, this year, address access and transport, two of the main issues for both MPLS and Carrier Ethernet infrastructures.  As in previous years, the panel will offer what the organisers describe as "a fruitful confrontation" between equipment vendors and service providers.

During the event, the third edition of the Carrier Ethernet Workshop will discuss technological and implementation issues in parallel with the traditional MPLS tutorial delivered by the MFA.  MEF ambassadors will address the workshop with a review of the standardisation process.  Other sessions will welcome vendors to describe business models and solutions, and carriers to report on deployments.

The Congress, organisers Upper Side are happy to point out, owes much of its success to the interoperability platform.  The major manufacturers all participate in this platform, showcasing service and product interoperability.

The European Advanced Networking Test Centre (EANTC), in collaboration with Upper Side, will invite vendors to a multi-vendor MPLS & Ethernet interoperability test in January 2009.
EANTC will evaluate state-of-the-art MPLS & Ethernet architectures and applications in a detailed technical hot-staging.

The test results will be demonstrated in a public showcase during the event.  Service providers will support the preparation of the test plan and participate in the hot-staging and public event.

Participation is open to all vendors of MPLS routers and switches, traffic emulators and analysers, and provisioning and fault-management systems.

During the 2008 event, EANTC and the participating vendors showcased a complete mobile backhaul transport network supporting MEF and IP/MPLS Forum defined services.
The interop platform demonstrated devices' capabilities in constructing mobile backhaul networks and the ability to interoperate with other leading vendors in the industry.
Mobile application vendors (3GPP, 4G, WiMAX) attached their equipment to the backhaul network to demonstrate service interoperability and to enable mobile application demonstrations at the public showcases.

MPLS & Ethernet World Congress 2009, 10-13 February, Marriott Rive Gauche, Paris.

With the development of new Carrier Ethernet technology and service definitions - including E-Tree, E-Line and E-LAN - and the growing awareness among business customers of the availability of L2VPN services offering comparatively low-cost P2P and multipoint connectivity, data/Ethernet product managers are in a good position to generate increasing revenues from this attractive suite of new services.

At the same time, however, competition in the Metro Ethernet P2P service marketplace is driving down prices, and there is a question mark over the sustainability of a business focusing purely on business customers and offering only basic, interface-limited P2P services.  A new generation of Ethernet services, provided by ambitious, lean alternative carriers, threatens to seriously undermine existing customer relationships.

The organiser of Ethernet Services Product Evolution, IIR, points out, therefore, that this is hardly the time for hesitation, stressing that the event will enable attendees to gather information and make contacts that will help them develop a strategy to compete in the increasingly competitive market for business and wholesale Ethernet services. 
Whether attendees are targeting business customers looking for data centre connectivity/managed services, internal customers looking for converged access solutions or wholesale customers looking for Ethernet access solutions, IIR claims the event will help them make the right choices in this rapidly evolving business area.
The event is intended to help in a number of areas, including:

  • Finding out how Product Managers from service providers in similar and competitive positions are developing Ethernet service portfolios
  • Developing a clear understanding of the dynamics of the legacy, L2 and IPVPN markets and understanding how to maximise the value of each within their portfolios
  • Gaining a detailed understanding of the options for differentiation against competitors' services
  • Developing a comprehensive understanding of the evolution options from simple P2P Ethernet services with dedicated bandwidth to a full range of QoS enabled Ethernet services
Ethernet Services Product Evolution, 2-4 February, Brussels Marriott Hotel, Brussels.

Taking a highly targeted, strategic approach to market entry and operation is the key to success in emerging markets, says Simon Vye

For service providers considering entering an emerging market, there are far more challenges than may at first be realised; cultural differences and idiosyncrasies are important and need to be taken into account, but it's the political hurdles that may present the biggest challenge.

Before considering the point further, we must establish exactly what we mean by the term "emerging markets". Officially, an emerging market is one with a relatively low per capita income, which, more importantly, has a high potential for economic growth. However, there are different stages of emergence to consider. Economies will start as "developing", offering little in the way of infrastructural, regulatory or political support for foreign investors. As these economies evolve, communications infrastructure is one of the first critical foundations laid which allow business to thrive and the economy to grow.

Hot spots such as China, Vietnam and India are popular emerging markets in Asia, while areas of the Middle East, such as the United Arab Emirates, and of Eastern Europe, such as Hungary or Poland, are also rising in popularity. Analyst house IDC highlighted Pakistan as the biggest spender on telecommunications services in early 2008, but also predicted that Vietnam would have overtaken that market by the end of the year. As a whole, the Asia-Pacific telecommunications market was set to grow 11 per cent in 2008, providing a wealth of opportunity for telecommunications providers around the world.

The global economic landscape is constantly changing. In spite of recent economic turmoil, the opportunities offered by emerging economies provide new avenues for savvy telecoms strategists. The old cliché of the ‘global village' is now firmly established as the status quo and we are living in a truly connected world. Multinational organisations must constantly seek new ways to mitigate the risks, whilst drawing maximum benefits, from entering new markets. Businesses serious about competing on the global stage expect to be able to communicate instantaneously, efficiently and cost-effectively no matter where in the world they, or their customers, may be.

These expectations in turn create opportunities for the telecommunications industry to provide consistent quality and depth of service across geographically dispersed sites. Core services such as Ethernet private lines (EPL), IP-VPN and MPLS networks offer international companies peace of mind that their communications infrastructure is robust, allowing them to concentrate on their core objective of growing the bottom line. New communications capabilities are also allowing organisations to cut business travel costs through video or web conferencing and unified communications strategies, providing new avenues for cost reduction as well as revenue growth.

In order to attract increased foreign investment, governments in many emerging markets have reviewed their protectionist policies, setting up special economic zones (SEZs) and other incentives to attract the multinational leaders to their shores. In May 2008, the Chinese government announced a plan to restructure its telecommunications industry in order to make it more competitive. This move, which aims particularly to create more competition for China Mobile, the world's largest carrier by subscriber base, has received a mixed reaction from both Chinese and global industry players. In general, it should be considered a positive move for a traditionally sheltered market.

In addition, emerging markets often turn to the developed western economies for best practice in industries such as professional and financial services. However, unlike their western counterparts, who have to take account of legacy hardware, infrastructure and techniques, emerging markets often adopt a big bang approach to their telecommunications and IT infrastructure, leapfrogging the benchmark to set a new global standard.
A perfect example of this is the deployment of fibre to the home (FTTH) in South Korea, providing South Korean consumers with ultra-fast broadband internet access for a fraction of the price of the slower copper-based UK network.

While the benefits of entering new markets are clear - access to a new workforce, an expanded customer base, new business opportunities - companies must be aware of the potential pitfalls which stand in the way of success.

The first issue, which organisations are increasingly investing serious capital to address, is the often gaping cultural divide between the new host market and the organisation's home market. This challenge is nothing new, but it can cause serious problems when ignored or handled clumsily. Japanese etiquette, for example, with its observation of hierarchy and relationship-centric business communication, is the stuff of international management legend, a host of embarrassing anecdotes and libraries full of guides to ‘doing business in Asia'. But should organisations apply the same rules and approaches to all Asian markets - to Vietnam, for instance? And what about other emerging markets in Europe, or even Africa?

In spite of emerging markets' increasingly proactive approaches to attracting foreign investment, many markets remain, from a telecommunications point of view, highly restricted. Deregulated markets such as Japan, Hong Kong and Australia are open to new entrants who are able to compete effectively to the benefit of customers. However, markets such as China and Cambodia are much more restricted, preventing telecoms service providers from acquiring or controlling national operators.

Red tape is often the cause of unexpected frustration for companies new to a market, and can act as a disincentive for doing business. The World Bank estimates that Indian senior managers, for example, spend 15 per cent of their time dealing with regulatory issues, far more than their Chinese counterparts. Without extensive research, companies moving into these markets can find their progress blocked by confusing regulations and compliance demands. Compliance can complicate matters significantly. For customers in the banking and financial services industry, for example, service providers must be aware of the compliance requirements of the host market as well as those of the home market. Often, global organisations are governed and regulated according to the standards of more developed markets. In the case of companies operating from the US, for example, failure of an overseas branch of an international bank to comply with US legislation can have legal ramifications and damage its brand value and reputation.

So, how can a service provider in an emerging market drive top-line growth and improve profitability quickly? The key to success lies in taking a highly targeted, strategic approach to market entry and operation. The growth potential of emerging economies is such that they are often highly competitive environments, with companies trying to secure first-mover advantage without truly understanding the individual challenges of the market. They may also fail to fully understand the needs of the potential customer base, whether already operating within that market or wishing to enter it.

As these markets consolidate through competition, the organisations that have developed strong and lasting relationships, as well as a deep understanding of the local market forces, will be the ones to thrive. There are a number of ways to ensure success in emerging markets.

The first is to take the time to really understand your target audience. Many international companies based in Europe are now turning to emerging markets in the Middle East as well as Asia to expand their market presence and develop new revenue streams. These organisations need the same quality of network security and performance that they expect in their home markets. They may also wish to take advantage of similar products, such as managed hosting and IT services, which may not be as advanced in certain markets.

Secondly, service providers should implement a simple, flexible network architecture that allows customers to accelerate the roll-out of new services.  Partnership is a crucial element of this approach as laying your own cable may simply not be an option in heavily regulated markets. Telstra International lays approximately one kilometre of new cable each week but also partners extensively with tier 1 carriers in emerging markets to ensure maximum coverage.

When choosing a partner for emerging markets, service providers should look to the local providers that have a strong network footprint in the key growth regions. Their networks should also be based on an advanced MPLS-IP backbone, and they should have the capability to offer managed IP services. Often, as markets slowly open up to deregulation, new carriers created within the country have greater opportunity to take on the previous national incumbents, well before foreign carriers are allowed to enter the market. These providers tend to be more agile than the larger market leaders, allowing them to bring new services to market much faster on behalf of their customers.

In essence, moving into emerging markets is a long-term process which requires both careful planning and rapid deployment of services in order to capitalise on these growing economies. In the current economic climate, it is more important than ever to choose partners carefully in order to offer multinational customers a tier 1 carrier service across the board, no matter where in the world they may operate.

Simon Vye is CEO of Telstra International EMEA.

The New Regulatory Framework (NRF) for the telecoms sector that was proposed in November 2007 by the Commission is now entering the last phase of what have been tedious and sometimes acrimonious negotiations. The Council of Ministers examined a number of key regulatory proposals at the end of November and a consensus is now emerging. The EC is expected to adopt the framework in early 2009.

The backdrop of this process has of course been the current financial crisis and, more recently, talks of recession in Europe. So what regulatory measures are likely to emerge from these negotiations and will the new framework be appropriate for recession times?
The Commission's proposals have been significantly watered down by both the Council of Ministers and the EU Parliament. As a result the current ‘consensus' version of the NRF, that will shape regulation in Europe over the next five years, is less ambitious than the initial draft on a number of key points.

As far as spectrum regulation is concerned, the ambitious market-centric plans initially proposed by the Commission have largely been rejected. The resulting status quo caters for the political and social concerns of member states and will probably be easier to manage in a difficult economic climate, but may need to be revisited in a few years' time.

The idea of a "super EU regulator", originally proposed by the Commission, is being replaced by a new entity (tentatively called the Group of European Regulators in Telecoms, or GERT) that will be an independent body as opposed to a new EU agency. This means less power than envisaged for the Commission and more power for national regulators.

The controversial proposals aimed at giving national regulatory authorities the power to mandate functional separation of dominant operators have been kept, but are now considered a "last resort" measure to be applied in "extraordinary" circumstances. As a result, the burden of proof required for national regulators to select and implement separation is likely to be very high. The economic climate is not conducive to separation plans in the short term, as these often involve significant one-off costs and can in some cases trigger the need to renegotiate operators' debt. The new text is, however, only a partial victory for a number of incumbent operators, as separation could still be mandated in the future.

Access to fibre-based new generation networks is likely to be mandated in most cases, but the prices charged by incumbents to new entrants will have to reflect the appropriate investment risk in order to preserve investment incentives. The exact methodologies for assessing the returns allowed on these new investments have yet to be agreed, though, so regulatory visibility is only partial. The explicit recognition of the need to incentivise investment is nevertheless good news for operators in a difficult economic environment.
Meanwhile, market evidence suggests that many operators and equipment manufacturers in Europe are suffering from the current economic climate. First, the share price of most telecoms firms has tumbled, and this has raised fears of aggressive takeovers and further consolidation. Second, both business and residential markets are softening, with anecdotal evidence of reduced call volumes and shifts from expensive price plans and packages to cheaper ones when existing contract terms expire. Third, the switch to VoIP-based solutions seems to be accelerating, and while a number of players will benefit from this trend, the net result will be an overall decline in call revenues for the industry. Fourth, some of the investment-intensive strategies that were under consideration are being reviewed and postponed. For example, a number of investment plans in next generation networks are likely to be affected.

The telecoms sector, however, went through a significant wave of structural rationalisation and consolidation following the dot-com bust, and is likely to be more resilient as a result. Also, communications services are not as cyclical as some other industries, such as luxury goods or retail. Lastly, many customers are still under "fixed fee" contracts (fixed or mobile) and revenues will only be impacted when these expire.

While it would be wrong to design a regulatory framework with short term economic considerations in mind (the text will only be made into national laws in 2010/2011 after all) it is clear that some of the concerns of key market players have influenced the text of the current draft New Regulatory Framework.

Benoit Reillier is a Director and European head of the telecommunications and media practice of global economics advisory firm LECG.  The views expressed in this column are his own.

An explosion in the variety of distribution channels now available to entertainment content owners is driving new approaches to security explains Priscilla Awde

While pirates of the high seas are causing some very severe headaches to maritime trade off the Horn of Africa, their quieter cousins are engaged in potentially equally devastating attacks against global networks and the traffic they carry. These networks are the arteries on which all commerce depends, and defending them is as critically important as protecting cargoes on the high seas.

The same can be said for entertainment content, on which owners spend millions and stand to lose millions more through piracy, illegal copying and distribution. Securing entertainment programming from abuse is big business. Major studios, content owners and vendors use technology to prevent abuse and to prosecute pirates when they find them.

Security is complicated by the rise in fixed and wireless broadband connections; the explosion of digital content; consumer demand for access anywhere, over any network or device; and a growing predilection for mixing broadcast, internet and IP traffic.
While Digital Rights Management (DRM) controls usage and distribution within households, Conditional Access (CA) secures content transmitted to households through cable, satellite, broadcast and telecoms networks. According to ABI Research, the global market for CA alone is expected to have generated revenues of around $1.4 billion in 2008.

Traditional one-way broadcast networks mostly rely on tried and tested smart cards using strong security algorithms to protect encrypted, standards based DVB video streams. However, explains Cesar Bachelet, senior analyst at Analysys Mason: "In any security breach, smart cards must be replaced, which is very expensive. IPTV operators tend to go for software solutions because they have no legacy systems.

"There is a trend towards cross-platform content delivery and major access vendors have solutions for on-line video and internet television. The biggest consequence of the move to digital is the explosion of available content for download."

Switching content between devices has implications for CA and DRM. Operators must be able to enforce pre-defined parameters governing content usage on each device: its storage, duration of use, and where, if and by whom it can be accessed or copied. This is increasingly done through handshake routines between CA and DRM systems, which allow content to be moved from televisions to other devices and released into the computing domain. "Bridging technologies allow transfer to third party devices with embedded DRM facilities," explains Daniel Thunberg, Senior Director, Market Development, Irdeto. "Content is re-scrambled into another format within set-top boxes and handed over seamlessly along with transfer rights and certificates embedded in the headers."
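
The handover Thunberg describes can be pictured in a few lines of code. The sketch below is purely illustrative: the XOR "scrambler" and every name in it are invented stand-ins, not any vendor's actual system, and real bridges use standardised scramblers and certified key ladders inside the set-top box. It shows only the shape of the operation: descramble under CA control, then re-scramble into the target DRM format with the usage rights carried in a header.

  # Illustrative only: a toy CA-to-DRM bridge; all names are hypothetical.
  import json
  from dataclasses import dataclass

  @dataclass
  class Rights:
      device_id: str
      copy_allowed: bool
      expires_utc: str                     # viewing window for the copy

  def xor_stream(data: bytes, key: bytes) -> bytes:
      # Toy cipher standing in for the real (dis)scrambler.
      return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

  def bridge(ca_payload: bytes, ca_key: bytes, drm_key: bytes,
             rights: Rights) -> bytes:
      clear = xor_stream(ca_payload, ca_key)          # CA descramble
      header = json.dumps(rights.__dict__).encode()   # rights ride in header
      body = xor_stream(clear, drm_key)               # re-scramble for DRM
      return len(header).to_bytes(4, "big") + header + body

  copy = bridge(b"...video...", b"ca-secret", b"drm-secret",
                Rights("tablet-42", False, "2009-01-31T00:00Z"))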

New software based systems are the next big CA development but, while some believe they make preventing, detecting and shutting down piracy faster and more efficient, others suggest counterfeit copying and distribution are easier.

Not so, says Stephen Christian, VP, Marketing for Verimatrix. As a relative newcomer, Verimatrix is: "Riding the wave of internet cryptology, IP technology and applying them to media. Software solutions offer greater ability to track what's going on in set-top boxes and can be downloaded as part of the content distribution mechanism at approximately zero cost and to a variety of devices. We are using the power of chips and the prevalence of broadband to develop software solutions mainly for greenfield IPTV providers but also for all Pay-TV operators. The world of CA is moving to IP which has turned it upside down," continues Christian.

IP transmission brings both opportunities and threats: computer hackers have spent years perfecting their nefarious trade in the computer world and can now deploy their dubious skills against Pay-TV content. Although IP networks may be easier to attack, they can support fast and sophisticated security applications.

While all security solutions are state-of-the-art at launch, most are eventually hacked, making it important to anticipate and assess risks and react fast to renew security.
Intelligent IP networks give CA vendors more opportunities, providing new ways to keep content safe, believes Francois Moreau de Saint, CEO, Viaccess. "Lots can be done in networks to manage and secure different devices and manage rights centrally. Operators want to deliver different content over very different devices with different technologies, so we must deliver the solutions to manage heterogeneity, enabling conversion features that allow content to be converted between formats so operators can ‘talk' to each device in the language it understands.

"Although the world is changing considerably, there is continuity in what's at stake: the whole point of CA is to fight piracy. Now there are more and more distribution channels and more opportunities to deliver more content and therefore more risks. We need to deploy the highest level of security technology and be prepared to take legal action against hackers."
Faster connections make illegal downloads easier, and pirates are increasingly hiding behind peer-to-peer and file-sharing networks that make them difficult to trace, but the industry has some innovative solutions. Watermarking is a new, effective and increasingly popular tool in CA vendors' tool kits. "Mixing television with the internet frightens many broadcasters and content owners," says Geir Bjorndal, Sales/Marketing director for Conax. "Watermarking inserts an invisible pattern into the video stream, and equipment records the unique identity of which set-top box received a copy. This makes illegally distributed content traceable to individual subscribers."
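
A toy example makes the principle concrete. The sketch below is an assumption-laden illustration, not any vendor's scheme: it hides a set-top box identity in the least significant bits of video samples and reads it back from a leaked copy. Production watermarks are far more robust, surviving compression, cropping and re-encoding.

  # Illustrative only: embed a set-top box ID in the least significant
  # bits of video samples, then recover it from a pirated copy.
  def embed_watermark(frame: bytearray, stb_id: int, bits: int = 32) -> bytearray:
      marked = bytearray(frame)
      for i in range(bits):
          marked[i] = (marked[i] & 0xFE) | ((stb_id >> i) & 1)
      return marked

  def extract_watermark(frame: bytes, bits: int = 32) -> int:
      return sum((frame[i] & 1) << i for i in range(bits))

  frame = bytearray(range(256))                  # stand-in for luma samples
  leaked = embed_watermark(frame, stb_id=0xCAFE42)
  assert extract_watermark(leaked) == 0xCAFE42   # traceable to one box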

Next generation smart cards detect unusual behaviour and usage patterns (typical in card sharing), making it faster to shut down unauthorised access. CA vendors have developed robust systems for the growing, if nascent, mobile TV market: security embedded in SIM cards can be activated, managed and updated from the network.
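
The behavioural checks mentioned above amount to simple rate and concurrency heuristics. The following sketch is hypothetical (thresholds and function names are invented, not a real CA system's API) but illustrates the idea: a card requesting keys for more simultaneous channels than one household could plausibly watch is flagged as suspect.

  # Hypothetical card-sharing heuristic: count distinct channels a card
  # requests keys for inside a short window; too many implies sharing.
  from collections import defaultdict

  WINDOW_SECONDS, MAX_CHANNELS = 10, 2
  recent = defaultdict(list)                 # card_id -> [(time, channel)]

  def on_key_request(card_id: str, ts: float, channel: int) -> bool:
      events = [e for e in recent[card_id] if ts - e[0] <= WINDOW_SECONDS]
      events.append((ts, channel))
      recent[card_id] = events
      return len({ch for _, ch in events}) > MAX_CHANNELS  # True = suspect

  on_key_request("card-1", 0.0, 101)
  on_key_request("card-1", 2.0, 205)
  print(on_key_request("card-1", 4.0, 310))  # True: three channels in 4s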

New CA systems are enabling new Pay-TV models. In a revenue-sharing agreement with content owners, Orange recently launched the five-channel, subscription-based, multi-platform Cinema Series in France. Films, television, and traditional and time-shifted broadcasting are available on demand and can be watched on any screen at any time. Many big content owners are making their libraries available, and internet Pay-TV will become more popular.
As always, flexibility, speed and real-time reaction are essential in communications systems, a need the trend towards software CA solutions should meet. The risk is that, however necessary, security may become a stumbling block to seamless and fast content exchange between devices.

Priscilla Awde is a freelance communications writer

Emerging markets, without a doubt, are squarely in the sights of mobile operators looking to gain new subscribers in these tumultuous times.  As many people struggle to afford even the most basic mobile services in these regions, there are significant challenges for operators that seek to tap into this new subscriber base while maintaining operating efficiency.  Customer service has traditionally been a major cost for operators, so to maintain profitability in these emerging markets, operators must look to new and innovative customer service solutions says Mikael Berner

In their search for subscriber growth, operators are looking to the world's emerging markets, such as China, India, Pakistan and Latin America, where some of the statistics are impressive.  For instance, revenue for India's big four telecom companies alone (Bharti, Idea, Reliance and Spice) grew collectively by 50 per cent in 2007 over 2006 to $12.05 billion. Capital spending simultaneously grew at an astonishing rate, underscoring massive infrastructure investments, as operators are eager to expand their networks.

Despite the impressive growth opportunities in these markets, operators are faced with the challenge of keeping average revenue per user (arpu) at a profitable level.  Even as these emerging economies blossom, disposable income remains low for most potential subscribers.  Often, consumers in these developing regions can afford mobile phones only as groups, or during periods when they have employment - sometimes going completely silent on mobile phone usage when jobs are scarce.

In addition to low arpu, other key risk factors have emerged for operators in these volatile regions, increasing churn and causing serious pain points for the operator.  For instance:

  • The handset market consists of many low-end phones;
  • These regions have predominantly pre-paid plans, where customers are simply replacing the SIM card when they have the funds to get a mobile phone again; and,
  • Operators either charge for or don't provide customer care for pre-paid customers in these regions.

Mobile operators are facing a new challenge as they scramble to access these markets and avoid costly blunders.  The operating costs in these areas remain the same as in developed markets, requiring operators to shift their focus from customer service to rapid expansion, with the intention of getting service up and running for as many subscribers as quickly as possible. From a business perspective, this situation sounds familiar and indeed hits very close to home.  The low-cost European airlines, for example, are interested in flying as many customers as quickly as possible to predictable, tried and tested destinations - at least those with low airport fees.  They've become infamous for providing as little hassle (and customer service) as they can get away with.  This "no frills" approach has earned the airlines a reputation for poor customer service and, ultimately, a damaged brand image.

How low can mobile operators go in their quest for profitable expansion into the wireless frontier without good customer service? As more companies vie for market share, the level of competition will likely increase and push the price of telecom services down even further. Costs will need to continue dropping if operations in rural areas are to expand profitably. And, while many low-overhead business models of mobile operators have been highly successful, there are some aspects of mobile service that simply cannot be taken out of the picture.

Operators still face the tremendous challenge of financing customer service.  With the introduction of new technologies and services, inevitable user questions arise that have typically been directed to expensive call centers.  It is crucial that operators provide tools to meet their customers' needs without incurring high support costs.

With the influx of pre-paid plans in new markets, mobile operators have addressed support issues by charging customers on their pre-paid account when they call customer support, instead of providing the service for free as with post-paid customers. Hidden fees such as these are not likely to be popular with subscribers.  And increased competition in emerging markets means that if a customer isn't happy, another wireless provider will be waiting with open arms to take their business.

This cutthroat landscape is hostile to subscriber retention unless a company can achieve the perfect balance of low costs and high customer satisfaction. Some operators have attempted to tackle customer support issues by encouraging subscribers to solve problems themselves. This self-help policy is problematic if companies don't offer customers the tools they need to solve handset issues.

The most promising approach to low-cost service options is device-based solutions.  By bringing the service experience to the handset, operators are able to deliver highly functional experiences on basic handsets at virtually no variable cost.  By resolving issues at the point of experience, operators improve the customer experience while avoiding the costs of connecting callers to IVRs, agent queues and the like.  With device-based solutions, subscribers can resolve more than 75 per cent of their issues quickly and easily on the device itself.

Two primary areas operators should focus on when it comes to self-service on the handset are:
1) Remote support services; and,
2) Account management services.
Remote support services are critical because the primary reason subscribers defect from their operator is device problems, service issues, or just plain confusion.  It is vital that operators find a way to resolve these issues as proactively as possible, even solving them before the subscriber knows they have an issue.  Account management services consist of much simpler tasks, like topping up or paying a bill. These types of requests occur so frequently that having a solution in place to deal with them is imperative if operators are to keep their service delivery costs in line.
In addition to encouraging the use of revenue-saving applications, access to interactive tutorials will also teach the user about their mobile device, thus decreasing the number of calls to customer service. If operators can provide a workable, cost-effective solution to benefit the user, call center traffic will decrease significantly and reduce overall operational costs.
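
By way of illustration, the sketch below shows what such an on-device flow might look like. Everything in it is an assumption - the Account class and its methods are invented stand-ins for an operator's billing interface - but it captures the point: routine requests complete on the handset with at most one network transaction and no call-centre agent.

  # Minimal sketch of on-device self-service; Account is a hypothetical
  # stand-in for the operator's billing interface.
  class Account:
      def __init__(self, balance: int):
          self.balance = balance
      def request_topup(self, amount: int) -> None:
          self.balance += amount           # in reality, one billing call

  def self_care(choice: str, account: Account) -> str:
      if choice == "balance":
          return f"Balance: {account.balance} credits"
      if choice == "topup":
          account.request_topup(10)        # no IVR, no agent queue
          return "Top-up applied; confirmation will arrive by SMS."
      if choice == "tutorial":
          return "Opening the interactive tutorial stored on the device..."
      return "Unknown option"

  print(self_care("topup", Account(balance=2)))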

Mobile operators need to answer the challenges found in emerging markets by providing services and handsets at an affordable, competitive price point. Additionally, they must make products and services easy to find and simple to use to build and maintain customer loyalty. As operators provide more value, customers will become more loyal and less likely to switch SIM cards and operators so frequently. The secret to maximising growth opportunities in emerging markets lies in outsmarting the competition by utilising new technologies that strike the elusive balance between low costs and satisfied subscribers.

Mikael Berner is Senior Vice President and GM, Enterprise, Nuance Communications

Much is being made of rival broadband access technologies and their prospects for making radical impacts on the telecoms market place. Phil Irvine and James Bennett explain that their analysis suggests that in a market where broadband to the home is currently dominated by fixed access, wireless can play a role - but only in limited areas. In more mature markets they see the dominance of incumbents' DSL offerings continuing, and believe these operators will be best placed to meet emerging demand for higher speeds with fibre services. In developing markets, however, the absence or poor state of fixed infrastructure, and regulatory policy, can make the relative deployment cost of wireless broadband very favourable. They urge prospective investors, suppliers and operators to proceed carefully: many crucial choices need to be made, such as which territories, services and customers should be targeted.

The introduction of broadband access has been a huge driver of the growth of the telecoms sector. The services enabled by broadband have had a profoundly beneficial impact on people and businesses by changing the way they interact with each other, access information and entertainment, and conduct business. Around the world, demand continues to grow for higher access speeds and wider availability.

The telecoms industry faces significant uncertainty on how best to meet this demand. Key questions for operators and investors are whether fixed broadband access can be displaced by wireless access or emerging technologies, maybe including non-mainstream options such as Broadband over Powerlines (BPL).

These technology choices are characterised by the disruptive potential each could have on the industry structure. Investment in the wrong technology could be catastrophic for investors, operators and economies. On the other hand, getting it right could shake up the industry.

Which technology will dominate the broadband access market, and where it will be deployed within a country, will differ from market to market. It will be determined by the state of user demand, technology maturity and economics, geographic coverage, and regulatory policies towards infrastructure investment. Our view of which technologies will win out, and where, is summarised in the table below.

In developing markets, wireless broadband access technologies can play a major role so long as the regulatory environment is designed to encourage their development. In particular, wireless broadband can be seen as a viable solution for serving currently underserved areas. There is also the potential for new access technologies such as BPL to play a role, depending on whether technical limitations can be overcome.

By contrast in more mature markets, given the emerging regulatory focus on ‘access bottlenecks', broadband technologies will be dominated by fixed rather than wireless systems. In this respect there will be limited scope for new infrastructure-based operators to compete effectively and the success of wireless broadband will depend on the utility arising from mobility, not fixed access.

However, many questions remain for suppliers, operators and investors. Which territories and customers should you target? What services will succeed? How should you deploy your network? What partners do you need?

The rise and rise of broadband continues - but will it reach a point where the highest access speeds can only be met by fixed fibre?

Broadband access has been a key growth service for fixed telecoms operators around the world, and its importance is heightened by the decline of traditional telephony. Demand in mature markets is characterised by the continuing growth of data rates to access more and faster services. Ten years ago the typical access rate was a dial-up line at 56kbit/s; today's typical service in mature markets is 2Mbit/s to 10Mbit/s.

These speeds are enabled either through DSL technology or by cable modems, from incumbent telecoms and cable TV operators respectively. Both technologies have limitations that restrict the type and speed of services that can be delivered across them - high definition video, for example. Only fibre can support the high data rates, of say 50Mbit/s and upwards, that such service portfolios demand. Deployments are already starting to take place, most notably in Taiwan, Japan and Hong Kong. In Europe a number of operators have launched Next Generation Networks (NGN), which involve fibre deployments, often to a distribution cabinet rather than the home. There is currently no foreseen role for wireless technology in this.

Fundamental economics favour DSL over wireless broadband - but only where fixed infrastructure exists.

For lower data rates, where wireless broadband speeds can compete with DSL, the underlying economics strongly favour DSL, as shown below. The cost of broadband deployment is dominated by access costs, which account for nearly two thirds of all operating costs over the first five years. The economics of deployment also strongly favour existing operators, where the scale efficiencies from widespread asset ownership mean the incremental costs are far lower than for a greenfield new entrant. As such, wireless broadband as an access service is a viable solution only where fixed infrastructure is not deployed.
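
A rough worked example shows why this arithmetic favours incumbents. Every figure below is invented purely for illustration; only the "nearly two thirds" access share comes from the analysis above.

  # Illustrative arithmetic only: all numbers are assumptions.
  five_year_opex_per_sub = 600.0           # hypothetical, any currency
  access_share = 2 / 3                     # "nearly two thirds" of opex
  access_cost = five_year_opex_per_sub * access_share   # = 400.0

  # An incumbent reuses ducts, exchanges and backhaul; a greenfield
  # entrant must build them. The 0.4 reuse factor is an assumption.
  incumbent_incremental = access_cost * 0.4             # = 160.0
  greenfield_entrant = access_cost + 150.0              # new-build overhead

  print(incumbent_incremental, greenfield_entrant)      # 160.0 vs 550.0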

This lack of opportunity for new technologies, and hence new operators, to enhance fixed access competition puts a clear focus on the role of regulation. Unfortunately there seems to be no consistency in policy among regulators around the world. Some regulators, such as Ofcom in the UK, have been active in suggesting a series of principles for regulating the ‘access bottleneck'. Others, such as the FCC, have applied a policy of ‘forbearance', effectively relieving operators of the obligation for interconnection. The risk is that, by setting a favourable investment climate, regulators are allowing operators to develop and possibly abuse a position of dominance.

The absence of DSL in developing markets presents an opportunity for wireless broadband operators especially in rural or underserved areas.

In less mature markets, the market development route might be quite different. The deployment of fixed physical infrastructure is often far less widespread than in mature markets. Moreover, the success of mobile services in recent years has drawn traffic away from fixed services, further reducing the funds available for upgrading and extending fixed infrastructure. In Saudi Arabia, for example, where only 70 per cent of households have fixed access, demand in currently underserved areas is for any form of access. Typical access speeds are accordingly lower, and so the demand for higher speeds is quite different from that in mature markets.

A consistent feature of emerging markets is a regulatory policy aimed at encouraging the development of infrastructure by preventing resale and encouraging access in under-served areas. Unbundled local loop DSL is therefore often unavailable, and market prices for wholesale broadband are held higher than they would be if a resale market existed. This presents an opportunity for wireless broadband to play a more significant role in urban and suburban areas, particularly where incumbents are slow to respond to the threat of a new entrant. It also creates a paradox in regulatory regimes: the perception that infrastructure competition is an essential feature of a competitive market works against the regulatory aims of reducing prices and increasing broadband penetration.

Our analysis of the costs of deployment in developing markets suggests that regulatory impediments to unbundling and the high cost of wholesale DSL create an opportunity for wireless broadband. In the long run wireless broadband could dominate, as its presence should inhibit further deployment of fixed infrastructure.

Broadband access is a fundamental service in the telecoms portfolio of all operators. Fixed access will continue to be dominated by incumbents' DSL and fibre services in developed markets; new opportunities will mainly exist for wireless as a nomadic and mobile service. In developing markets, wireless broadband can play a dominant role as an access service - but only in certain areas, subject to there being sufficient demand in those areas - and its prospects are strongly influenced by the role the regulator plays in encouraging the development of infrastructure competition.

Phil Irvine and James Bennett, PA Consulting Group

In today's financial environment, blended services may be the key to survival for telco operators.  With shareholder and investor audiences becoming increasingly difficult to please, fixed line and mobile operators need to identify new revenue streams - and one way of doing this is by looking within.  Many operators have excellent applications and service environments, but they are split in two - one for their next generation networks and a second, older environment for their legacy networks.  If a link existed that could sit seamlessly between the two and share applications across both - blending services - operators would be able to get the most out of the applications that they currently have says Mike Jones

Service blending is the practice of taking more than one service and combining them to make something new.  Think of making a fruit smoothie - mixing bananas, strawberries and oranges together isn't something that naturally occurs in the wild, but when blended they make a delicious treat.  Considered separately, these fruits are all delicious in their own right; blending them, however, has created a business where there was not one previously.  The act of blending is considered a value-add above the fruits alone, and can therefore command a higher price.

Also consider that this business was created with ingredients that were already lying around the kitchen.  There are people who are content with buying the fruits individually, but there will always be some who are tired of the same old fruit and willing to try something new, if for no other reason than to break the monotony.  This smoothie market was created when buyers were presented with something that they had not thought of or seen before.  They expected to eat regular fruit; when presented with something new, however, they were delighted at the prospect of experiencing a new sensation.  This new sensation is what attracted their attention, and their money.

The last point to make with this analogy is that the key to unlocking this market was the tool - the blender!  The tool is the enabler for this new market.  Without it, the process of making the smoothie might have proven too expensive, or blended the ingredients ineffectively.  The right choice of tool is crucial to blending the fruit into a new, delicious and refreshing beverage that opens new opportunities and revenue streams.
Blending telecommunications services has a great deal in common with blending fruit.  Both need:

  • Ingredients: Preferably ones that are already being used and therefore are readily available
  • Innovation: Thought leaders that can see an opportunity for a new product or service
  • Tool: An efficient enabler that provides the link between the idea and the product or service.

Telcos have the first two items, but are missing the right tool that can easily and cost effectively turn their ideas into reality.

Services are currently deployed as discrete functions within a network, which are akin to the individual fruits in a smoothie.  SMS, voice mail, automated outbound calling, and pre-paid are all examples of discrete functions that are present in most service provider networks today.  These services are discrete due to the complexity of interworking the application with the network, and this complexity is a leading cause of inefficiency in telco networks.
What if these services could be unlocked and opened up for consumption by application developers?  Applications are rarely universally adopted by every subscriber in a network, so there are opportunities to repackage one or more of these discrete functions into a new service that will be consumable by a new user.  For instance, a mobile subscriber may only think of automated dialling as something a telemarketer would use.  However, that same subscriber may view an automated wake-up call service as a useful feature.  This is just one simple example of how the same discrete network function can be blended with, for example, SMS to create a new service using piece parts already in the network.
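
A short sketch illustrates the wake-up call example. The two network-facing functions below are hypothetical wrappers - stand-ins for whatever IN or NGN interfaces the operator actually exposes - but the blend itself is exactly the repackaging described above: an existing automated-dialling function combined with existing SMS.

  # Sketch of blending two existing discrete functions into a new service.
  import datetime

  def place_automated_call(msisdn: str, audio: str) -> bool:
      print(f"[call] {msisdn}: playing '{audio}'")
      return False                          # pretend the subscriber slept on

  def send_sms(msisdn: str, text: str) -> None:
      print(f"[sms]  {msisdn}: {text}")

  def wake_up_service(msisdn: str, when: datetime.time) -> None:
      answered = place_automated_call(msisdn, "wakeup-greeting.wav")
      if not answered:                      # blend: fall back to SMS
          send_sms(msisdn, f"You missed your {when} wake-up call.")

  wake_up_service("+447700900123", datetime.time(6, 30))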

As the market continues to take shape, existing enhanced services are prime candidates for incremental innovation and arpu enhancement.  By leveraging existing enhanced services and creating innovation "on top" of them, service providers complement an understood user experience while at the same time enabling an ecosystem that reinforces the first social network application: voice services.

Innovation comes into play for blending old with new.  Using the smoothie example, consider the fruit smoothie discussed earlier as being the "old" technology.  Now consider a "new" technology, protein powder, which is being used by fitness enthusiasts.  The blending of the "old" and the "new" in this case has enabled the smoothie vendor to start selling protein powder to a different audience, effectively creating a new market of fitness beverages.
This analogy once again carries into the Telco domain when compared to the legacy network and NGN.  There are new services being created for NGN all the time, but how well do these applications work with the mainstay applications in the legacy network?  Based on the fact that most NGN services duplicate the core functions of the legacy network, it's safe to assume that the new and the old interact very little or not at all.  Which will be more profitable?  Repackaging an existing service to address a new market, which is aimed at revenue growth, or duplicating a service to the same market for some nominal cost savings?
An example of service innovation in the telco market is the blending of "new" IT policy enforcement capabilities, such as web browser parental controls, with the "old" pre-paid application.  This policy enforcement could be extended to control who, when, and where phone calls can be made or received.  The blending of these technologies is another clear example of how two disparate technologies can be brought together to create a product that is marketed to people from two separate demographics - voice and IT security. 
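
Sketched in code, that blend might look like the snippet below. The policy fields and thresholds are invented for illustration; the point is that a single check combines the "old" pre-paid balance test with "new" IT-style who and when rules.

  # Hypothetical blend of pre-paid balance checks with IT-style policy.
  from datetime import datetime

  POLICY = {
      "blocked_prefixes": ("+4490",),      # e.g. premium-rate numbers
      "allowed_hours": range(7, 21),       # calls permitted 07:00-20:59
  }

  def call_allowed(callee: str, now: datetime, balance: float) -> bool:
      if balance <= 0:
          return False                     # the "old" pre-paid check
      if now.hour not in POLICY["allowed_hours"]:
          return False                     # "when" rule, parental style
      if callee.startswith(POLICY["blocked_prefixes"]):
          return False                     # "who" rule
      return True

  print(call_allowed("+447700900123", datetime(2009, 1, 5, 9, 0), 2.5))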
The key that unlocked the smoothie market was the blender.  The right tool made the process of making the smoothie quick, efficient and cost effective.  The telcos also need a tool like the blender that will unlock their services for the purpose of creating something new.  This tool will:

  • Protect the telco by ensuring that their services operate independently from the underlying network
  • Prevent vendor lock-in by opening up the core network services as building blocks for new applications
  • Support telco and IT technologies such as IN and web services for the cultivation of multiple ecosystems
  • Create service building blocks from the old and new networks for rapid creation of innovative services
  • Support the reliability and scalability required by large-scale services
  • Unlock trapped arpu

What service providers must do is protect the value and innovation potential of their legacy network by ensuring that their services can be offered independent of the underlying networks and, more importantly, independent of the vendors enabling those networks. Service providers who choose to open up their applications for mass consumption have the potential to open up new markets. By doing this, telcos can avoid falling into the vendor lock-in trap, and ensure that their investments in new technologies achieve maximum ROI.

Mike Jones is Sales Engineer with AppTrigger


