Features

Taking a highly targeted, strategic approach to market entry and operation is the key to success in emerging markets, says Simon Vye

For service providers considering entering an emerging market, there are far more challenges than may first be realised; cultural differences and idiosyncrasies are important and need to be taken into account, but it's the political hurdles that may present the biggest challenge.

Before considering the point further, we must establish exactly what we mean by the term "emerging markets". Officially, an emerging market is one with a relatively low per capita income which, more importantly, has a high potential for economic growth. However, there are different stages of emergence to consider. Economies start as "developing", offering little in the way of infrastructural, regulatory or political support for foreign investors. As these economies evolve, communications infrastructure is one of the first critical foundations laid, allowing business to thrive and the economy to grow.

Hot spots such as China, Vietnam and India are popular emerging markets in Asia, while areas of the Middle East, such as the United Arab Emirates, and Eastern Europe, such as Hungary and Poland, are also rising in popularity. Analyst house IDC highlighted Pakistan as the biggest spender on telecommunications services in early 2008, but also predicted that Vietnam would have overtaken that market by the end of the year. As a whole, the Asia-Pacific telecommunications market was set to grow 11 per cent in 2008, providing a wealth of opportunity for telecommunications providers around the world.

The global economic landscape is constantly changing. In spite of recent economic turmoil, the opportunities offered by emerging economies provide new avenues for savvy telecoms strategists. The old cliché of the ‘global village' is now firmly established as the status quo and we are living in a truly connected world. Multinational organisations must constantly seek new ways to mitigate the risks, whilst drawing maximum benefits, from entering new markets. Businesses serious about competing on the global stage expect to be able to communicate instantaneously, efficiently and cost-effectively no matter where in the world they, or their customers, may be.

These expectations in turn create opportunities for the telecommunications industry to provide consistent quality and depth of service across geographically dispersed sites. Core services such as Ethernet private lines (EPL), IP-VPN and MPLS networks offer international companies peace of mind that their communications infrastructure is robust, allowing them to concentrate on their core objective of growing the bottom line. New communications capabilities are also allowing organisations to cut business travel costs through video or web conferencing and unified communications strategies, providing new avenues for cost reduction as well as revenue growth.

In order to attract increased foreign investment, governments in many emerging markets have reviewed their protectionist policies, setting up special economic zones (SEZs) and other incentives to attract the multinational leaders to their shores. In May 2008, the Chinese government announced a plan to restructure its telecommunications industry in order to make it more competitive. This move, which aims particularly to create more competition for China Mobile, the world's largest carrier by subscriber base, has received a mixed reaction from both Chinese and global industry players. In general, it should be considered a positive move for a traditionally sheltered market.

In addition, emerging markets often turn to the developed western economies for best practice in industries such as professional and financial services. However, unlike their western counterparts, who have to take account of legacy hardware, infrastructure and techniques, emerging markets often adopt a big bang approach to their telecommunications and IT infrastructure, leapfrogging the benchmark to set a new global standard.
A perfect example of this is the deployment of fibre to the home (FTTH) in South Korea, providing South Korean consumers with ultra-fast broadband internet access for a fraction of the price of the slower copper-based UK network.

While the benefits of entering new markets are clear (access to a new workforce, an expanded customer base and new business opportunities), companies must be aware of the potential pitfalls that stand in the way of success.

The first issue, which organisations are increasingly investing serious capital to address, is the often gaping cultural divide between the new host market and the organisation's home market. This challenge is nothing new, but it can cause serious problems when ignored or handled clumsily. Japanese etiquette, for example, with its observation of hierarchy and relationship-centric business communication, is the stuff of international management legend, a host of embarrassing anecdotes and libraries full of guides to ‘doing business in Asia'. But should organisations apply the same rules and approaches to all Asian markets, such as Vietnam? And what about other emerging markets in Europe or even Africa?

In spite of emerging markets' increasingly proactive approaches to attract foreign investment, from a telecommunications point of view, many markets remain highly restricted. Deregulated markets such as Japan, Hong Kong and Australia are open to new market entrants who are able to compete effectively to the benefit of customers. However, markets such as China and Cambodia are much more restricted, preventing telecoms service providers from acquiring or controlling national operators.

Red tape is often the cause of unexpected frustration for companies new to a market, and can act as a disincentive for doing business. The World Bank estimates that Indian senior managers, for example, spend 15 per cent of their time dealing with regulatory issues, far more than their Chinese counterparts. Without extensive research, companies moving into these markets can find their progress blocked by confusing regulations and compliance demands. Compliance is also an issue that can complicate matters significantly. For customers in the banking and financial services industry, for example, service providers must be aware of the compliance requirements of the host market, as well as those of the home market. Often, global organisations are governed and regulated according to the standards of more developed markets. In the case of companies operating from the US, for example, failure of an overseas branch of an international bank to comply with US legislation can have legal ramifications and damage the bank's brand value and reputation.

So, how can a service provider in an emerging market drive top-line growth and improve profitability quickly? The key to success lies in taking a highly targeted, strategic approach to market entry and operation. The growth potential of emerging economies is such that they are often highly competitive environments, with companies trying to secure first mover advantage without truly understanding the individual challenges of the market. They may also fail to fully understand the needs of the potential customer base, whether already within that market or wishing to enter it.

As these markets consolidate through competition, the organisations that have developed strong and lasting relationships, as well as a deep understanding of the local market forces, will be the ones to thrive. There are a number of ways to ensure success in emerging markets.

The first is to take the time to really understand your target audience. Many international companies based in Europe are now turning to emerging markets in the Middle East as well as Asia to expand their market presence and develop new revenue streams. These organisations need the same quality network security and performance that they expect in their home markets. They may also wish to take advantage of similar products such as managed hosting and IT services, which may not necessarily be as advanced in certain markets.

Secondly, service providers should implement a simple, flexible network architecture that allows customers to accelerate the roll-out of new services.  Partnership is a crucial element of this approach as laying your own cable may simply not be an option in heavily regulated markets. Telstra International lays approximately one kilometre of new cable each week but also partners extensively with tier 1 carriers in emerging markets to ensure maximum coverage.

When choosing a partner for emerging markets, service providers should look to the local providers that have a strong network footprint in the key growth regions. Their networks should also be based on an advanced MPLS-IP backbone and they should have the capability to offer managed IP services. Often, as the markets slowly open up to deregulation, new carriers created within the country have greater opportunity to take on the previous national incumbents, well before foreign carriers are allowed to enter the market. These providers tend to be more agile than the larger market leaders, allowing them to bring new services to market much faster on behalf of their customers.

In essence, moving into emerging markets is a long-term process which requires both careful planning and rapid deployment of services in order to capitalise on these growing economies. In the current economic climate, it is more important than ever to choose partners carefully in order to offer multinational customers a tier 1 carrier service across the board, no matter where in the world they may operate.

Simon Vye is CEO Telstra International EMEA.
www.telstra.com

The New Regulatory Framework (NRF) for the telecoms sector that was proposed in November 2007 by the Commission is now entering the last phase of what have been tedious and sometimes acrimonious negotiations. The Council of Ministers examined a number of key regulatory proposals at the end of November and a consensus is now emerging. The EC is expected to adopt the framework in early 2009.

The backdrop of this process has of course been the current financial crisis and, more recently, talks of recession in Europe. So what regulatory measures are likely to emerge from these negotiations and will the new framework be appropriate for recession times?

The Commission's proposals have been significantly watered down by both the Council of Ministers and the EU Parliament. As a result, the current ‘consensus' version of the NRF, which will shape regulation in Europe over the next five years, is less ambitious than the initial draft on a number of key points.

As far as spectrum regulation is concerned, the ambitious market-centric plans initially proposed by the Commission have largely been rejected. The resulting status quo caters for the political and social concerns of member states and will probably be easier to manage in a difficult economic climate, but may need to be revisited in a few years' time.

The idea of a "super EU regulator", originally proposed by the Commission, is being replaced by a new entity (tentatively called the Group of European Regulators in Telecoms, or GERT) that will be an independent body as opposed to a new EU agency. This means less power than envisaged for the Commission and more power for national regulators.

The controversial proposals aimed at giving national regulatory authorities the power to mandate functional separation of dominant operators have been kept, but are now considered a "last resort" measure to be applied in "extraordinary" circumstances. As a result, the burden of proof required for national regulators to select and implement separation is likely to be very high. The economic climate is not conducive to separation plans in the short term, as these often involve significant one-off costs and can in some cases trigger the need for renegotiation of the operators' debt. The new text is, however, only a partial victory for a number of incumbent operators, as separation could still be mandated in the future.

Access to fibre based new generation networks is likely to be mandated in most cases but the prices charged by incumbents to new entrants will have to reflect the appropriate investment risk in order to preserve investment incentives. The exact methodologies to assess the returns allowed on these new investments have yet to be agreed though so the regulatory visibility is only partial. The explicit recognition of the need to incentivise investment is however good news for operators in a difficult economic environment.

Meanwhile, market evidence suggests that many operators and equipment manufacturers in Europe are suffering from the current economic climate. First, the share price of most telecoms firms has tumbled, raising fears of aggressive takeovers and further consolidation. Second, both business and residential markets are softening, with anecdotal evidence of reduced call volumes and shifts from expensive price plans and packages to cheaper ones when existing contract terms expire. Third, the switch to VoIP-based solutions seems to be accelerating and, while a number of players will benefit from this trend, the net result will be an overall decline in call revenues for the industry. Fourth, some of the investment-intensive strategies that were under consideration are being reviewed and postponed. For example, a number of investment plans in New Generation Networks are likely to be impacted.

The telecoms sector, however, went through a significant wave of structural rationalisation and consolidation following the dotcom bust, and is likely to be more resilient as a result. Also, communications services are not as cyclical as some other industries, such as luxury goods or retail. Lastly, many customers are still under "fixed fee" contracts (fixed or mobile) and revenues will only be impacted when these expire.

While it would be wrong to design a regulatory framework with short-term economic considerations in mind (the text will only be transposed into national law in 2010/2011, after all), it is clear that some of the concerns of key market players have influenced the text of the current draft New Regulatory Framework.

Benoit Reillier is a Director and European head of the telecommunications and media practice of global economics advisory firm LECG.  The views expressed in this column are his own.
breillier@lecg.com

An explosion in the variety of distribution channels now available to entertainment content owners is forcing new approaches to security, explains Priscilla Awde

While pirates of the high seas are causing some very severe headaches to maritime trade off the Horn of Africa, their quieter cousins are engaged in potentially equally devastating attacks against global networks and the traffic they carry. As the arteries on which all commerce depends, these networks are as critically important to defend as cargoes on the high seas.

The same can be said for entertainment content, on which owners spend millions and stand to lose millions more through piracy, illegal copying and distribution. Securing entertainment programming from abuse is big business. Major studios, content owners and vendors use technology to prevent abuse and prosecute pirates when they find them.

Security is complicated by the rise in broadband fixed and wireless connections; the explosion of digital content; consumer demand for anywhere over any network/device access and a growing predilection for mixing broadcast, internet and IP traffic.

While Digital Rights Management (DRM) controls usage and distribution within households, Conditional Access (CA) secures content transmitted to households through cable, satellite, broadcast and telecoms networks. According to ABI Research, the global market for CA alone is expected to have generated revenues of around $1.4 billion in 2008.

Traditional one-way broadcast networks mostly rely on tried and tested smart cards using strong security algorithms to protect encrypted, standards-based DVB video streams. However, explains Cesar Bachelet, senior analyst at Analysys Mason: "In any security breach, smart cards must be replaced, which is very expensive. IPTV operators tend to go for software solutions because they have no legacy systems.

"There is a trend towards cross-platform content delivery and major access vendors have solutions for on-line video and internet television. The biggest consequence of the move to digital is the explosion of available content for download."

Switching content between devices has implications for CA and DRM. Operators must be able to enforce pre-defined parameters governing content usage on each device, its storage, duration of use, where, if and by whom it can be accessed or copied. This is increasingly done through handshake routines between CA and DRM systems which allow content to be moved from televisions to other devices and released into the computing domain. "Bridging technologies allow transfer to third party devices with embedded DRM facilities," explains Daniel Thunberg, Senior Director, Market Development, Irdeto. "Content is re-scrambled into another format within set-top boxes and handed over seamlessly along with transfer rights and certificates embedded in the headers."
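
The hand-off Thunberg describes can be sketched as a toy pipeline: content leaves the CA domain, is re-scrambled for the target device's DRM, and the transfer rights travel with it in a header. Everything here is an invented stand-in (the XOR "cipher", the JSON header layout), not any real CA or DRM scheme.

```python
# Toy bridging sketch: CA descramble -> attach rights header -> DRM re-scramble.
# The XOR cipher and header format are illustrative stand-ins only.
import json

def scramble(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: XOR with a repeating key (scramble == descramble)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def bridge(ca_payload: bytes, ca_key: bytes, drm_key: bytes, rights: dict) -> bytes:
    clear = scramble(ca_payload, ca_key)          # leave the CA domain
    header = json.dumps(rights).encode() + b"\n"  # rights and certificates in the header
    return header + scramble(clear, drm_key)      # enter the device's DRM domain

ca_key, drm_key = b"ca-secret", b"drm-secret"
payload = scramble(b"frame-data", ca_key)         # what arrives over the CA network
out = bridge(payload, ca_key, drm_key, {"copies": 1, "expires": "48h"})
header, body = out.split(b"\n", 1)
assert json.loads(header) == {"copies": 1, "expires": "48h"}
assert scramble(body, drm_key) == b"frame-data"   # target device can descramble
```

The point of the sketch is only the shape of the hand-over: the content is never exposed outside the set-top box, and the usage rules accompany it into the computing domain.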

New software based systems are the next big CA development but, while some believe they make preventing, detecting and shutting down piracy faster and more efficient, others suggest counterfeit copying and distribution are easier.

Not so, says Stephen Christian, VP, Marketing for Verimatrix. As a relative newcomer, Verimatrix is: "Riding the wave of internet cryptology and IP technology and applying them to media. Software solutions offer greater ability to track what's going on in set-top boxes and can be downloaded as part of the content distribution mechanism at approximately zero cost and to a variety of devices. We are using the power of chips and the prevalence of broadband to develop software solutions mainly for greenfield IPTV providers but also for all Pay-TV operators. The world of CA is moving to IP, which has turned it upside down," continues Christian.

IP transmission brings both opportunities and threats: computer hackers have spent years perfecting their nefarious trade in the computer world and can now deploy their dubious skills against Pay-TV content. Although IP networks may be easier to attack, they can support fast and sophisticated security applications.

While all security solutions are state-of-the-art at launch, most are eventually hacked, making it important to anticipate and assess risks and react fast to renew security.

Intelligent IP networks give CA vendors more opportunities, providing new ways to keep content safe, believes Francois Moreau de Saint, CEO, Viaccess. "Lots can be done in networks to manage and secure different devices and manage rights centrally. Operators want to deliver different content over very different devices with different technologies, so we must deliver the solutions to manage that heterogeneity, to enable conversion features allowing content to be converted between formats so operators can ‘talk' to each device in the language it understands.

"Although the world is changing considerably, there is continuity in what's at stake: the whole point of CA is to fight piracy. Now there are more and more distribution channels and more opportunities to deliver more content and therefore more risks. We need to deploy the highest level of security technology and be prepared to take legal action against hackers."

Faster connections make illegal downloads easier, and pirates are increasingly hiding behind peer-to-peer and file sharing, making them difficult to trace, but the industry has some innovative solutions. Watermarking is a new, effective and increasingly popular tool in CA vendors' kits. "Mixing television with the internet frightens many broadcasters and content owners," says Geir Bjorndal, Sales and Marketing Director for Conax. "Watermarking inserts an invisible pattern into the video stream and equipment records the unique identity of which set-top box received a copy. This makes illegally distributed content traceable to individual subscribers."
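
The idea Bjorndal describes can be illustrated with a deliberately simple sketch: a hypothetical set-top box writes its subscriber ID into the least-significant bits of frame bytes, leaving the picture visually unchanged but traceable. Real forensic watermarking is far more robust than this; the functions below are illustrative only.

```python
# Illustrative least-significant-bit watermark: embed and recover a 32-bit
# subscriber ID. Not any vendor's actual scheme.

def embed_id(frame: bytearray, subscriber_id: int, bits: int = 32) -> bytearray:
    marked = bytearray(frame)
    for i in range(bits):
        bit = (subscriber_id >> i) & 1
        marked[i] = (marked[i] & 0xFE) | bit  # overwrite only the LSB of each byte
    return marked

def extract_id(frame: bytes, bits: int = 32) -> int:
    """Read the ID back out of a (possibly redistributed) copy."""
    return sum((frame[i] & 1) << i for i in range(bits))

frame = bytearray(range(64))              # stand-in for one frame of video
marked = embed_id(frame, 0xCAFE1234)
assert extract_id(marked) == 0xCAFE1234   # the leaked copy identifies its source
```

Because each subscriber's copy carries a different pattern, an illegally redistributed stream points straight back at the box that received it, which is exactly the deterrent described above.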

Next generation smart cards detect unusual behaviour and usage patterns (typical in card sharing), making it faster to shut down unauthorised access. CA vendors have developed robust systems for the growing, if nascent, mobile TV market: security embedded in SIM cards can be activated, managed and updated from the network.
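
A minimal sketch of the kind of behavioural check described, assuming a simple rate-based heuristic: a card whose entitlement requests arrive faster than one household could plausibly generate is flagged as shared. The thresholds and interface are invented for illustration.

```python
# Hypothetical card-sharing detector: flag a smart card that issues more
# entitlement requests in a sliding window than a single household would.
from collections import deque

class CardMonitor:
    def __init__(self, max_requests: int = 5, window_seconds: float = 60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.events = deque()  # timestamps of recent entitlement requests

    def record(self, timestamp: float) -> bool:
        """Record a request; return True if usage now looks like card sharing."""
        self.events.append(timestamp)
        while self.events and timestamp - self.events[0] > self.window:
            self.events.popleft()  # drop requests outside the sliding window
        return len(self.events) > self.max_requests

monitor = CardMonitor()
# one request every 2 seconds: 31 requests inside a 60-second window
flagged = [monitor.record(t) for t in range(0, 62, 2)]
assert flagged[0] is False and flagged[-1] is True
```

A production system would combine several such signals, but the principle is the same: anomalous usage patterns let the operator shut down unauthorised access far faster than waiting for a card swap.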

New CA systems are enabling new Pay-TV models. In a revenue sharing agreement with content owners, Orange recently launched the five channel subscription based, multi-platform Cinema Series in France. Films, television, traditional and time shifted broadcasting are available on-demand and can be watched on any screen any time. Many big content owners are making their libraries available and internet Pay-TV will become more popular.

As always, flexibility, speed and real-time reactions are essential for all communications systems, requirements the trend towards software CA solutions should fulfil. The risk is that, however necessary, security may become a stumbling block for seamless and fast content exchange between devices.

Priscilla Awde is a freelance communications writer

Emerging markets, without a doubt, are squarely in the sights of mobile operators looking to gain new subscribers in these tumultuous times. As many people struggle to afford even the most basic mobile services in these regions, there are significant challenges for operators that seek to tap into this new subscriber base and maintain operating efficiency. Customer service has traditionally been a major cost deterrent for operators, so to maintain profitability in these emerging markets, operators must look to new and innovative customer service solutions, says Mikael Berner

In their search for subscriber growth, operators are looking to the world's emerging markets, such as China, India, Pakistan and Latin America, where some of the statistics are impressive.  For instance, revenue for India's big four telecom companies alone (Bharti, Idea, Reliance and Spice) grew collectively by 50 per cent in 2007 over 2006 to $12.05 billion. Capital spending simultaneously grew at an astonishing rate, underscoring massive infrastructure investments, as operators are eager to expand their networks.

Despite the impressive growth opportunities in these markets, operators are faced with the challenge of keeping average revenue per user (arpu) at a profitable level.  Even as these emerging economies blossom, disposable income remains low for most potential subscribers.  Often, consumers in these developing regions can afford mobile phones only as groups, or during periods when they have employment - sometimes going completely silent on mobile phone usage when jobs are scarce.

In addition to low arpu, other key risk factors have emerged for operators in these volatile regions, increasing churn and causing serious pain points for the operator.  For instance:

  • The handset market consists of many low-end phones;
  • These regions have predominantly pre-paid plans, where customers are simply replacing the SIM card when they have the funds to get a mobile phone again; and,
  • Operators either charge for or don't provide customer care for pre-paid customers in these regions.

Mobile operators are facing a new challenge as they scramble to access these markets and avoid costly blunders. The operating costs in these areas remain much the same as in developed markets, pushing operators to shift their focus from customer service to rapid expansion, with the aim of getting service up and running for as many subscribers as quickly as possible. From a business perspective, this situation sounds familiar. The low-cost European airlines, for example, are interested in flying as many customers as quickly as possible to predictable, tried and tested destinations - at least those with low airport fees. They have become infamous for providing as little hassle (and customer service) as they can get away with. This "no frills" approach has earned the airlines a reputation for poor customer service and, ultimately, a damaged brand image.

How low can mobile operators go in their quest for profitable expansion into the wireless frontier without good customer service? As more companies vie for market share, the level of competition will likely increase and push the price of telecom services down even further. Costs will need to continue dropping if operations in rural areas are to expand profitably. And, while many low-overhead business models of mobile operators have been highly successful, there are some aspects of mobile service that simply cannot be taken out of the picture.

Operators still face the tremendous challenge of financing customer service. With the introduction of new technologies and services, inevitable user questions arise that have typically been directed to expensive call centres. It is crucial that operators provide tools to meet their customers' needs without incurring high support costs.

With the influx of pre-paid plans in new markets, mobile operators have addressed support issues by charging customers on their pre-paid account when they call customer support, instead of providing the service for free as with post-paid customers. Hidden fees such as these are not likely to be popular with subscribers.  And increased competition in emerging markets means that if a customer isn't happy, another wireless provider will be waiting with open arms to take their business.

This cutthroat landscape is hostile to subscriber retention unless a company can achieve the perfect balance of low costs and high customer satisfaction. Some operators have attempted to tackle customer support issues by encouraging subscribers to solve problems themselves. This self-help policy is problematic if companies don't offer customers the tools they need to solve handset issues.

The most promising approach to low-cost service options is device-based solutions. By bringing the service experience to the handset, operators can deliver highly functional experiences on basic handsets at virtually no variable cost. By resolving issues at the point of experience, operators improve the customer experience while avoiding the costs of connecting callers to IVRs, agent queues and the like. With device-based solutions, subscribers can resolve more than 75 per cent of their issues quickly and easily on the device.

Two primary areas operators should focus on when it comes to self-service on the handset are:
1) Remote support services; and,
2) Account management services.

Remote support services are critical because the primary reason subscribers defect from their operator is device problems, service issues or plain confusion. It is vital that operators find a way to resolve these issues as proactively as possible, ideally before the subscriber even knows they have one. Account management services consist of much simpler tasks, such as topping up or paying a bill. These requests occur so frequently that having a solution in place to deal with them is imperative if operators are to keep their service delivery costs in line.
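
As a sketch of how that split might look on the device, the high-frequency account-management requests are answered locally and only genuine support problems take the costly route to an agent. The menu commands and account structure here are invented for illustration.

```python
# Hypothetical on-device dispatcher: handle routine account requests locally,
# escalate only real support problems to the (expensive) call centre.

def handle_request(request: str, account: dict) -> str:
    if request == "balance":
        return f"Balance: {account['balance']} credits"   # answered on device
    if request.startswith("topup:"):
        account["balance"] += int(request.split(":")[1])  # answered on device
        return f"Balance: {account['balance']} credits"
    return "escalate_to_call_centre"                      # the costly path

account = {"balance": 10}
assert handle_request("balance", account) == "Balance: 10 credits"
assert handle_request("topup:5", account) == "Balance: 15 credits"
assert handle_request("handset will not boot", account) == "escalate_to_call_centre"
```

The design point is simply that every request resolved in the first two branches never generates call-centre cost, which is where the per-subscriber economics of pre-paid markets are won or lost.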

In addition to encouraging the use of revenue-saving applications, access to interactive tutorials will teach users about their mobile devices, decreasing the number of calls to customer service. If operators can provide a workable, cost-effective solution that benefits the user, call centre traffic will decrease significantly, reducing overall operational costs.

Mobile operators need to answer the challenges found in emerging markets by providing services and handsets at an affordable, competitive price point. Additionally, they must make products and services easy to find and simple to use in order to build and maintain customer loyalty. As operators provide more value, customers will become more loyal and less likely to switch SIM cards and operators so frequently. The secret to maximising growth opportunities in emerging markets lies in outsmarting the competition by utilising new technologies that strike the elusive balance between low costs and satisfied subscribers.

Mikael Berner is Senior Vice President and GM, Enterprise, Nuance Communications

Much is being made of rival broadband access technologies and their prospects for making radical impacts on the telecoms market place. Phil Irvine and James Bennett explain that their analysis suggests that in a market where broadband to the home is currently dominated by fixed access, wireless can play a role - but only in limited areas. In more mature markets they see the dominance of incumbents' DSL offerings continuing, and believe these operators will be best placed to meet emerging demand for higher speeds with fibre services. In developing markets, however, the absence or poor state of fixed infrastructure, and regulatory policy, can make the relative deployment cost of wireless broadband very favourable. They urge prospective investors, suppliers and operators to proceed carefully: many crucial choices need to be made, such as which territories, services and customers to target.

The introduction of broadband access has been a huge driver of the growth of the telecoms sector. The services enabled by broadband have had a profoundly beneficial impact on people and businesses by changing the way they interact with each other, access information and entertainment, and conduct business. Around the world, demand continues to grow for higher access speeds and wider availability.

The telecoms industry faces significant uncertainty on how best to meet this demand. Key questions for operators and investors are whether fixed broadband access can be displaced by wireless access or emerging technologies, maybe including non-mainstream options such as Broadband over Powerlines (BPL).

These technology choices are characterised by the disruptive potential each could have on the industry structure. Investment in the wrong technology could be catastrophic for investors, operators and economies. On the other hand, getting it right could shake-up the industry.

Which technology dominates the broadband access market will differ from country to country, and will depend on where it is deployed within each country. It will be determined by the state of user demand, technology maturity and economics, geographic coverage and regulatory policies towards infrastructure investment. Our view of which technologies will win out, and where, is summarised in the table below.

In developing markets, wireless broadband access technologies can play a major role so long as the regulatory environment is designed to encourage their development. In particular, wireless broadband can be seen as a viable solution for serving currently underserved areas. There is also the potential for new access technologies such as BPL to play a role, depending on whether technical limitations can be overcome.

By contrast in more mature markets, given the emerging regulatory focus on ‘access bottlenecks', broadband technologies will be dominated by fixed rather than wireless systems. In this respect there will be limited scope for new infrastructure-based operators to compete effectively and the success of wireless broadband will depend on the utility arising from mobility, not fixed access.

However, many questions remain for suppliers, operators and investors. Which territories and customers should you target? What services will succeed? How should you deploy your network? What partners do you need?

The rise and rise of broadband continues - but will it reach a point where the highest access speeds can only be met by fixed fibre?

Broadband access has been a key growth service for fixed telecoms operators around the world, and its importance is made even more significant by the decline of traditional telephony. Demand in mature markets is characterised by the continuing growth of data rates to access more and faster services. Ten years ago the typical access rate was a dial-up line at 56kbit/s; today's typical service in mature markets is 2Mbit/s to 10Mbit/s.

These speeds are enabled either through DSL technology or by cable modems, from incumbent telecoms and cable TV operators respectively. Both technologies have limitations that restrict the type and speed of services that can be delivered across them - for example, high-definition video. Only fibre can support the high data rates, of say 50Mbit/s and upwards, that such service portfolios demand. Deployments are already starting to take place, most notably in Taiwan, Japan and Hong Kong. In Europe a number of operators have launched Next Generation Networks (NGN), which involve fibre deployments, often to a distribution cabinet rather than the home. There is currently no foreseen role for wireless technology here.
Fundamental economics favour DSL over wireless broadband - but only where fixed infrastructure exists.

For lower data rates, where wireless broadband speeds can compete with DSL, the underlying economics strongly favour DSL, as shown below. The cost of broadband deployment is dominated by access costs, which account for nearly two thirds of all operating costs over the first five years. The economics of deployment also strongly favour existing operators, where the scale efficiencies from widespread asset ownership mean the incremental costs are far lower than for a greenfield new entrant. As such, wireless broadband as an access service is a viable solution only where fixed infrastructure is not deployed.
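The asymmetry described here can be sketched in a back-of-envelope calculation. All figures below are invented for illustration - the article's point is only that access dominates operating cost and that an incumbent pays incremental rather than full cost:

```python
# Back-of-envelope sketch of the cost asymmetry between an incumbent and a
# greenfield entrant. Every figure here is a hypothetical assumption, not
# data from the article.

subscribers = 100_000
access_cost_share = 2 / 3            # access ~ two thirds of 5-year opex

greenfield_opex_per_sub = 300.0      # hypothetical 5-year opex per subscriber
incumbent_increment = 0.4            # incumbent's incremental share (~40%)

greenfield_total = subscribers * greenfield_opex_per_sub   # 30,000,000
incumbent_total = greenfield_total * incumbent_increment   # 12,000,000
access_cost = greenfield_total * access_cost_share         # ~20,000,000
```

On these assumed numbers the entrant must recover two and a half times the incumbent's cost base, which is why wireless access is only viable where no fixed plant exists.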

This lack of opportunity for new technologies - and hence new operators - to enhance fixed access competition puts a clear focus on the role of regulation. Unfortunately there seems to be no consistency in policy among regulators around the world. Some regulators, such as Ofcom in the UK, have been active in suggesting a series of principles for regulating the ‘access bottleneck'. Others, such as the FCC, have applied a policy of ‘forbearance', effectively relieving operators of the obligation for interconnection. The risk is that in setting a favourable investment climate, regulators are allowing operators to develop, and possibly abuse, a position of dominance.

The absence of DSL in developing markets presents an opportunity for wireless broadband operators especially in rural or underserved areas.

In less mature markets, the development route might be quite different. The deployment of fixed physical infrastructure is often far less widespread than in more mature markets. Further, the success of mobile services in recent years has attracted traffic away from fixed services, reducing the means for upgrading and extending fixed infrastructure. So, for example, in Saudi Arabia, where only 70 per cent of households have fixed access, demand in currently underserved areas is for any form of access. Typical access speeds are accordingly lower, and so the demand for higher speeds is quite different from mature markets.

A consistent feature of emerging markets is a regulatory policy aimed at encouraging the development of infrastructure by preventing resale and encouraging access in under-served areas. Unbundled local-loop DSL is therefore often unavailable, and market prices for wholesale broadband are held higher than they would be if a resale market existed. This presents an opportunity for wireless broadband to play a more significant role in urban and suburban areas, particularly where incumbents are slow to respond to the threat of a new entrant. It also creates a paradox in regulatory regimes: the perception that infrastructure competition is an essential feature of a competitive market works against the regulatory aims of reducing prices and increasing broadband penetration.

Our analysis of the costs of deployment in developing markets suggests that regulatory impediments to unbundling and the high cost of wholesale DSL create an opportunity for wireless broadband. In the long run wireless broadband could dominate, as its presence should inhibit further deployment of fixed infrastructure.

Broadband access is a fundamental service in the telecoms portfolio of all operators. Fixed access will continue to be dominated by incumbents' DSL and fibre services in developed markets; new opportunities will mainly exist for wireless as a nomadic and mobile service. In developing markets, wireless broadband can play a dominant role as an access service - but only in certain areas, subject to there being sufficient demand in those areas - and its prospects are strongly influenced by the role the regulator plays in encouraging the development of infrastructure competition.

Phil Irvine and James Bennett, PA Consulting Group
www.paconsulting.com

In today's financial environment, blended services may be the key to survival for telco operators.  With shareholder and investor audiences becoming increasingly difficult to please, fixed line and mobile operators need to identify new revenue streams - and one way of doing this is by looking within.  Many operators have excellent applications and service environments, but they are split in two - one for their next generation networks and a second, older environment for their legacy networks.  If a link existed that could seamlessly sit between the two and share applications across - or blend services - operators would be able to get the most out of the applications that they currently have, says Mike Jones

Service blending is the practice of taking more than one service and combining them to make something new.  Think of making a fruit smoothie - mixing bananas, strawberries, and oranges together isn't something that occurs naturally in the wild, but when blended they make a delicious treat.  Considered separately these fruits are all delicious in their own right; however, blending them has created a business where there was not one previously - the act of blending is considered a value-add above the fruits alone, and can therefore command a higher price.

Also consider that this business was created with ingredients that were already lying around the kitchen.  There are people who are content with buying the fruits individually, but there will always be some people who are tired of the same old fruit, and will be willing to try something new if for no other reason than to break the monotony.  This smoothie market was created when buyers were presented with something that they had not thought of or seen before.  They expected to eat regular fruit, however when presented with something new, they were delighted with the prospect of experiencing a new sensation.  This new sensation is what attracted their attention and money.

The last point to make with this analogy is that the key to unlocking this market was the tool - the blender!  The tool is the enabler for this new market.  Without the tool, the process of making the smoothie might have proven to be too expensive or ineffectively blended the ingredients.  The right choice of tool is crucial to blending the fruit into a new, delicious, and refreshing beverage that opens new opportunities and revenue streams.
Blending telecommunications services has a great deal in common with blending fruit.  Both need:

  • Ingredients: Preferably ones that are already being used and therefore are readily available
  • Innovation: Thought leaders that can see an opportunity for a new product or service
  • Tool: An efficient enabler that provides the link between the idea and the product or service.

Telcos have the first two items, but are missing the right tool that can easily and cost effectively turn their ideas into reality.

Services are currently deployed as discrete functions within a network, which are akin to the individual fruits in a smoothie.  SMS, voice mail, automated outbound calling, and pre-paid are all examples of discrete functions that are present in most service provider networks today.  These services are discrete due to the complexity of interworking the application with the network, and this complexity is a leading cause of inefficiency in telco networks.
What if these services could be unlocked and offered for free consumption to application developers?  Applications are rarely universally adopted by every subscriber in a network, so there are opportunities to repackage one or more of these discrete functions into a new service that will be consumable by a new user.  For instance, a mobile subscriber may only think of automated dialing as something a telemarketer would use.  However, that same mobile subscriber may view an automated wake-up call service as a useful feature.  This is just one simple example of how the same discrete network function can be blended with, for example, SMS to create a new service using piece parts already in a network.
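The wake-up call example above can be sketched in a few lines: two discrete network functions, exposed as callable building blocks, are composed into a new service. The function and class names here are illustrative assumptions, not a real vendor API:

```python
# Illustrative sketch of service blending: two discrete network functions
# (automated outbound calling and SMS) composed into a wake-up call service.
# All names here are hypothetical stand-ins, not a real operator API.

from dataclasses import dataclass
from datetime import datetime


@dataclass
class WakeUpRequest:
    subscriber: str      # MSISDN of the subscriber
    wake_at: datetime    # requested wake-up time


def place_automated_call(msisdn: str) -> bool:
    """Stand-in for the network's discrete outbound-calling function."""
    print(f"Dialling {msisdn} ...")
    return True  # pretend the call was answered


def send_sms(msisdn: str, text: str) -> None:
    """Stand-in for the network's discrete SMS function."""
    print(f"SMS to {msisdn}: {text}")


def deliver_wake_up(req: WakeUpRequest) -> None:
    """Blend the two functions: call first, fall back to SMS if missed."""
    answered = place_automated_call(req.subscriber)
    if not answered:
        send_sms(req.subscriber, f"Wake-up call missed at {req.wake_at:%H:%M}")


deliver_wake_up(WakeUpRequest("+15551234567", datetime(2008, 6, 1, 7, 0)))
```

The value lies not in either function, both of which already exist in the network, but in the few lines of composition logic between them.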

As the market continues to take shape, existing enhanced services are prime candidates for incremental innovation and ARPU enhancement.  By leveraging existing enhanced services and creating innovation "on top" of them, service providers complement an understood user experience while enabling an ecosystem that reinforces the first social network application: voice services.

Innovation comes into play for blending old with new.  Using the smoothie example, consider the fruit smoothie discussed earlier as being the "old" technology.  Now consider a "new" technology, protein powder, which is being used by fitness enthusiasts.  The blending of the "old" and the "new" in this case has enabled the smoothie vendor to start selling protein powder to a different audience, effectively creating a new market of fitness beverages.
This analogy once again carries into the Telco domain when compared to the legacy network and NGN.  There are new services being created for NGN all the time, but how well do these applications work with the mainstay applications in the legacy network?  Based on the fact that most NGN services duplicate the core functions of the legacy network, it's safe to assume that the new and the old interact very little or not at all.  Which will be more profitable?  Repackaging an existing service to address a new market, which is aimed at revenue growth, or duplicating a service to the same market for some nominal cost savings?
An example of service innovation in the telco market is the blending of "new" IT policy enforcement capabilities, such as web browser parental controls, with the "old" pre-paid application.  This policy enforcement could be extended to control who, when, and where phone calls can be made or received.  The blending of these technologies is another clear example of how two disparate technologies can be brought together to create a product that is marketed to people from two separate demographics - voice and IT security. 
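A minimal sketch of that policy blending, assuming a simple rule set (the policy shape and numbers below are invented for illustration):

```python
# Sketch of blending "new" IT-style policy enforcement with the "old"
# voice call flow: a who/when check applied before a call is allowed.
# The policy structure and phone numbers are hypothetical examples.

from datetime import time

policy = {
    "blocked_numbers": {"+1900555000"},            # premium-rate number
    "allowed_hours": (time(7, 0), time(21, 0)),    # calls only 07:00-21:00
}


def call_permitted(callee: str, at: time) -> bool:
    """Apply the parental-control style policy to an outgoing call."""
    start, end = policy["allowed_hours"]
    if callee in policy["blocked_numbers"]:
        return False
    return start <= at <= end


# call_permitted("+15551234567", time(8, 30)) -> True
# call_permitted("+1900555000", time(8, 30)) -> False
```

The same check could sit in front of the prepaid application's call-authorisation step, which is exactly the kind of composition the article describes.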
The key that unlocked the smoothie market was the blender.  The right tool made the process of making the smoothie quick, efficient and cost effective.  The telcos also need a tool like the blender that will unlock their services for the purpose of creating something new.  This tool will:

  • Protect the telco by ensuring that their services operate independently from the underlying network
  • Prevent vendor lock-in by opening up the core network services as building blocks for new applications
  • Support telco and IT technologies such as IN and web services for the cultivation of multiple ecosystems
  • Create service building blocks from the old and new networks for rapid creation of innovative services
  • Support the reliability and scalability required by large scale services
  • Unlock trapped ARPU

What service providers must do is protect the value and innovation potential of their legacy network by ensuring that their services can be offered independent of the underlying networks and, more importantly, independent of the vendors enabling those networks. Service providers who choose to open up their applications for mass consumption have the potential to open up new markets. By doing this, telcos can avoid falling into the vendor lock-in trap, and ensure that their investments in new technologies achieve maximum ROI.

Mike Jones is Sales Engineer with AppTrigger
www.apptrigger.com

The launch of Next Generation Networks (NGN) has brought about a paradigm change in the telecommunications market. NGNs pose a particular challenge for charging and billing systems that are required both to meet customers' desire for simple tariffs and the growing complexity of products. After all, say Thomas Jaekel and Lothar Reith, successful business models depend increasingly on a combination of attractive content and flexible service delivery and charging options

The charging and billing system plays a key role, since it serves as a link between technological innovation and new business models. It must have the capacity to adjust swiftly to changed market conditions, services, products and network platforms. Similarly, it is essential to map flexible pricing models that enable both simple mass-market-oriented services and specialised complex value-added services. Meanwhile, product development cycles are growing drastically shorter.

The charging and billing system must also differentiate between content and services. The NGN operator not only provides a transport service, but increasingly participates in the content creation and distribution value chain. Since the operator controls access to the NGN, it owns the charging relationship, which is based on trust. Moreover, it owns the metering point where chargeable service units can be measured and priced. The most formidable challenge when charging and billing transport and content services is to provide quality-differentiated transport services where the quality depends on the transported content. Examples of such requirements are:

  • Differentiated, specific bandwidth categories on demand as guaranteed service qualities or quality-differentiated transport services provided on demand and charged to the end user or the content provider.
  • Virtualisation of resources to enable wholesale charging and billing to own subsidiaries or business units or to external business partners.

New business models and services
New business models, such as the trend to divide market participants into NetCo, ServCo and SalesCo structures for the NGN, have been clearly visible for some time now. Current developments in European regulation (e.g. the EU Commission's wholesale initiative for the broadband market and Ofcom's current hearing on EALA (Ethernet Active Line Access)) support this direction. Further business model trends are:

Flat rate offerings. However, these often include only a basic offer without value-added services such as service numbers or international calls. Non-included products still have to be billed according to use, in conformity with existing legal requirements such as consumer and customer protection regulations.

Offerings financed by advertising, which include both access rights and a transport service for access to content.

Mash-up products using Web 2.0 that are put together flexibly from existing services. This places particularly high demands on the flexibility of billing systems.

The content provider pays - a business model proposed by large network operators whereby content providers are meant to pay network operators for delivering their content in assured quality.

Alongside these, traditional products such as value-added telephone services still have to be billed and charged and upgraded in the usual way in order to fulfil more complex postpaid and above all prepaid requirements as regards charging to the split second, advice of charge, consumption and call history and billing information.

Increased flexibility leads to particular consequences for new business models, customer and partner relations. This enables new types of cooperation along the new value chain, for example with content providers for quality-assured content delivery. It also enables new bundling options across multiple network and content platforms, such as quality delivery-assured content brokering.

Business agility is a central challenge
For speedy implementation of these new business models, business agility plays a dominant role. The NGN charging and billing system must be convergent and support real time charging for prepaid services, as well as offline charging for postpaid billing.
So far, conventional billing systems have been optimized so that offline charging supports postpaid billing. Online charging for real time balance management of a prepaid account has been implemented primarily on proprietary IN platforms. This fragmentation in dedicated systems for offline and online charging is a root cause of poor business agility. As a prerequisite for the necessary business agility NGN charging and billing systems must support multiple business models simultaneously, including for retail and for wholesale, covering both content and transport service delivery in an integrated, quality-differentiated way.

Yet for charging and billing system producers, the need to take a holistic view initially means greater complexity. Relevant existing bodies and new standards such as TMF, ITIL, GBA, 3GPP, IMS and ETSI TISPAN are developing very dynamically and must necessarily be taken into account. The process-related standards are already taking a holistic end-to-end approach with great success. Relevant standardisation bodies such as the Global Billing Association (GBA), which has now merged with the TMF, are following this trend. In this case it was possible to design and further develop the billing chain in the TMF's overall NGOSS environment, taking the surrounding processes into account.

Requirements for charging systems
The demand for business agility and for holistic unification results in a number of key requirements for successful solutions. The functions of a NGN charging and billing solution may be broken down into preceding (upstream) charging functions, central charging and billing functions and downstream charging and billing functions. Convergent solutions may be classified as either pre-delivery upstream real time charging or holistic post-delivery (upstream, central and downstream) real time charging and billing solutions.
Preceding (upstream) charging and billing functions include:

  • Real-time billing management as a starting point of the real time billing chain with interfaces to network elements of the service delivery platform, quota management with advice of charge, mediation to provide charging data to rating bodies and a real-time rating component for real-time calculation of unit costs
  • Real-time accounts receivable management that provides information in real time mode on account status/credit
  • Usage data mediation as a starting point of the offline process chain with interfaces to the network platform, raw data capture (CDR), normalisation and enhancement
  • Service instance rating and discounting management for the purpose of combined tariff and discount plan management for real time and offline rating (high impact on time to market).
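The upstream chain above - real-time rating, quota checking and advice of charge against a prepaid balance - can be sketched as follows. Tariffs, balances and numbers are invented examples, not real operator data:

```python
# Minimal sketch of upstream real-time charging: rate a usage event,
# check the prepaid balance, and return an advice-of-charge decision.
# All tariffs and balances are hypothetical illustrations.

RATES = {"voice_sec": 0.002, "sms": 0.10}   # price per chargeable unit

balances = {"+491701234567": 5.00}          # prepaid account balances


def rate(service: str, units: int) -> float:
    """Real-time rating: chargeable units -> price."""
    return round(RATES[service] * units, 4)


def authorise(msisdn: str, service: str, units: int):
    """Quota check with advice of charge. Returns (granted, charge)."""
    charge = rate(service, units)
    if balances.get(msisdn, 0.0) >= charge:
        balances[msisdn] -= charge       # decrement the balance in real time
        return True, charge
    return False, charge


granted, charge = authorise("+491701234567", "voice_sec", 120)
# 120 seconds at 0.002/sec -> charge of 0.24, granted from a 5.00 balance
```

A production system would reserve quota rather than decrement immediately, but the shape of the chain - mediation, rating, balance management - is the same.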

The customer/partner billing component (communication bus, customer/partner database, etc.) is meant to ensure the use of central functions of the billing platform and therefore end-to-end billing management. Such central functions enable the unification of customer and partner billing with upstream and downstream charging and billing for both retail and wholesale business models.

The downstream charging and billing functions include the following components, which have been grouped as ‘billing domain' in 3GPP/IMS standards:

  • Invoice calculation and aggregation: Periodic or on demand invoice calculation on the basis of priced, accumulated use data, one-time or recurring fixed invoice items.
  • Invoice formatting: Transfers the calculated invoice items to suitable billing formats
  • Accounts receivable management: Administration of account credits, providing account information and producing reports (on outstanding debts, for instance)
  • Payment collection and dunning: Collection of payments from various incoming payment systems (banks, etc.) and issuing (or at least initiating) reminders for outstanding payments
  • Operation monitoring: Assuring billing quality by monitoring operating parameters (KPI) and revenue control reports.
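The first downstream component, invoice calculation and aggregation, amounts to rolling priced usage records and recurring fees up into invoice lines. A sketch, with an assumed record layout and invented amounts:

```python
# Hedged sketch of invoice calculation and aggregation: priced usage
# records from the rating stage plus recurring fees become invoice lines.
# The record layout and all amounts are illustrative assumptions.

from collections import defaultdict

usage = [  # (subscriber, service, priced amount)
    ("A", "voice", 3.20), ("A", "sms", 0.90),
    ("A", "voice", 1.10), ("B", "data", 7.50),
]
recurring = {"A": 9.99, "B": 19.99}  # monthly fixed fees


def build_invoice(subscriber: str) -> dict:
    """Aggregate priced usage per service and add the recurring fee."""
    lines = defaultdict(float)
    for sub, service, amount in usage:
        if sub == subscriber:
            lines[service] += amount
    lines["monthly fee"] = recurring[subscriber]
    return {"subscriber": subscriber,
            "lines": dict(lines),
            "total": round(sum(lines.values()), 2)}


invoice = build_invoice("A")  # voice 4.30, sms 0.90, fee 9.99 -> total 15.19
```

Invoice formatting, the next component in the list, would then transform this structure into the customer-facing billing format.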
Along with function-oriented criteria, key technical and commercial criteria are critical for realising successful billing solutions. Billing solutions should therefore be capable of deploying components with differing availability levels and physical distribution.
For components whose requirements for response-time behaviour, system availability and consequences of breakdown are not critical, centralised locations such as IT computing centres with standard availabilities suffice.
Failures of components such as offline rating, payment gateways and postpaid billing mediation have noticeably adverse effects on service if the outage persists (annoyed customers, loss of income, etc.). Critical components should therefore be deployed fully redundantly and at service delivery platform locations.

Innovation prospects
There is a steady flow of innovations in the field of charging and billing systems, leading to more flexibility, scalability and increased business agility. Examples are convergent billing systems, which support both prepaid and post-paid in one system. Other examples cover the convergence of previously disparate domains, such as fixed and mobile. In the future, more innovation can be expected, making it possible to charge for quality-differentiated service delivery of content access and transport services in a way that users can understand. Flexibly composed value networks may arrive, exceeding today's functionality, for wholesale, retail and partner billing. Charging for quality-differentiated delivery of content access and transport services must also support roaming in foreign networks without having to force-route all traffic via the home network. Convergence of SLA penalty management with billing is another area where innovations may arrive.

In general, one can expect NGN charging and billing systems to operate independently of the direction of monetary flow. This could support business agility for new business models such as advertising-financed and quality-assured content access and transport service delivery.

Thomas Jaekel is a Senior Consultant at Detecon International GmbH, where he is responsible for consulting services focusing mainly on billing and NGOSS.  He can be contacted via: Thomas.Jaekel@detecon.com

Lothar Reith is a Senior Consultant at Detecon International GmbH. He focuses mainly on NGN business models, NGN network architecture and NGN charging architecture, and can be contacted via: Lothar.Reith@detecon.com

Financial transactions are increasingly being conducted on the go. Bohdan Zabawskyj looks at why subscribers, dealers, operators, banks and money transfer agencies are embracing mobile money service opportunities

Money, m-transactions, micro-payments, mobile banking and mobile commerce - no matter how you refer to it, mobile money services are on the rise. They provide an unparalleled level of flexibility and convenience to a growing number of subscribers worldwide, in both emerging and developed markets. In the years to come, financial transactions, which are typically made today using ordinary financial instruments such as banks, ATM cards and cheques, will dwindle in popularity as subscribers take advantage of the convenience of mobile money.

Financial transactions are increasingly being conducted on the go. Subscribers are learning to transfer funds to a friend's mobile account, withdraw funds from a bank account or receive a remittance from an overseas relative, using a number of mobile devices. Increasingly, individuals with mobile money on their phones can both monitor their finances and purchase anything from taxi services to a candy bar at the corner convenience store.

In developed markets, bank account holders appreciate the immediacy and convenience of using their mobile device as their wallet. In emerging markets, mobile money forms the vital missing commercial link between ‘unbanked' individuals, companies and the societies they live in.

Emerging markets are benefiting most from the adoption of mobile money, especially those in which financial infrastructure is not readily accessible. The ability to transfer funds via a mobile phone in ‘under-banked' regions means that people can avoid many hours of travel between remote villages in order to pay bills or collect wages. Also, workers in many countries use their mobile phone to stay in touch with the current market price for their goods and therefore the phone is also a tool that facilitates profitable commerce and allows them to immediately capitalize on the latest prices.

Mobile money services will be driven primarily by the operators, who can charge service fees to complement existing SMS and voice revenues while simultaneously increasing customer loyalty and the number of transactions on the network.

Within the emerging markets, there are ample opportunities for operators to make the most out of mobile money services and this suggests that the mobile money phenomenon is here to stay. According to the GSM Association, fewer than 1 billion of the 6.5 billion people worldwide have bank accounts. At the same time, nearly 85 per cent of the next billion mobile subscribers are expected to come from areas such as Africa, Latin America and East Asia.

Short message service (SMS) and unstructured supplementary service data (USSD) are expected to remain the technologies of choice for mobile payments in emerging markets until 2011. This is largely because these technologies are ubiquitous and well-proven in even the most basic mobile devices and networks, and because SMS is the intuitive messaging vehicle of choice. For now, keeping handsets and access mechanisms simple and affordable is paramount in driving the uptake of mobile money services, at least until improved handsets support more complex, Internet-centric fund-transfer capabilities.

Mobile money services currently implemented in emerging markets fall into four major service types: international remittances, airtime reselling, mobile wallet and roaming recharge.

International remittances are transfers of money by foreign workers to their home countries. They are generally international person-to-person fund transfers of a relatively low value, normally sub-US$200. Generally, the greatest flow of remittance traffic is from the developed countries to adjacent developing regions, for example, from the Middle East to Bangladesh and Pakistan or from the US to Central or South America.

Airtime reselling extends the dealer network of the operator to smaller population centres by allowing any subscriber to become an airtime reseller and effectively act as an agent for the operator. An airtime reseller purchases airtime from the operator distribution network at a discounted price via SMS on the mobile device. It is then sold, once again via SMS, to end subscribers at the full price - with the agent keeping the mark-up and thereby earning an income. In addition to creating an entrepreneurial framework, the operator benefits from reduced overhead and distribution costs, as well as the elimination of the theft and fraud write-offs associated with distributing physical airtime vouchers.
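The reseller economics described above reduce to a simple margin calculation. The discount rate and face value below are hypothetical, chosen only to make the arithmetic concrete:

```python
# Worked example of airtime reselling margins. The face value and the
# wholesale discount are invented assumptions for illustration.

face_value = 10.00        # airtime sold to the end subscriber, via SMS
discount = 0.08           # operator's wholesale discount to the reseller

wholesale_cost = face_value * (1 - discount)   # what the reseller pays
reseller_margin = face_value - wholesale_cost  # what the reseller keeps
# reseller buys at 9.20, sells at 10.00, keeps 0.80 per voucher sold
```

At any realistic discount rate the margin per voucher is small, which is why the model works as a high-volume, low-overhead extension of the dealer network.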

A mobile wallet provides the equivalent of a bank account to the "unbanked", and allows cash deposits and withdrawals. The mobile wallet is accessed via the mobile network and enables the subscriber to check the status of the account, make micropayments to a given merchant for goods or services, and even receive his or her weekly wages via the mobile wallet. In the future, mobile wallets will increase in capability as emerging markets develop more formal linkages with financial institutions.

Roaming recharge offers mobile top-ups and transfers of minutes between subscribers of an alliance of operators. Subscriber benefits include the convenience of topping up while roaming as well as the ability to conveniently transfer funds between subscribers of different operators. Roaming recharge services enable increased roaming revenues for prepaid subscribers as well as incidental revenues from any applied service charges.
Developed markets, such as those in Western Europe and North America, are also a valuable source of revenue through mobile money services. Mobile revenue from international money transfers in North America is expected to grow from $27 million in 2008 to $1.4 billion by 2012, whereas revenues from national transfers will only reach $17.5 million in the same time frame.

Although the mobile remittance industry is growing, the primary focus thus far on mobile money services in mature markets has been associated with an increasing need for real-time access to account information - coined ‘nano-economics'. In the case of these developed, mature markets, mobile banking services offer subscribers real-time access to account balances, the ability to transfer funds and make payments, or validate transactions. Security issues and standards are the largest inhibitors of mobile banking adoption, but these challenges are being overcome over time with the improved ratification and adoption of mobile security standards and tools.

Another form of mobile money is payment using Near Field Communications (NFC). When the phone is placed close to a point-of-sale terminal supporting the same technology - say within 4cm - the subscriber can make purchases, authorised by a PIN code, from money stored on the SIM card. Many operators are working on enabling NFC technologies, and commercial GSM handsets supporting NFC are expected to hit the market this year. Revenue generation would likely follow the bank card model, with the operator getting a share of the transaction fee due to the key role it plays.

Collectively, the forecast increase in mobile money services - such as the growth in global mobile banking transactions from 2.7 billion in 2007 to 37 billion by 2011 - will contribute close to $8 billion in incremental revenue to mobile operators by 2012.
Subscribers, dealers, operators, banks and money transfer agencies are embracing mobile money service opportunities and creating value in the process. Even if analyst predictions are generous, the global economy is creating ample mobile money opportunities which cannot be ignored, and which will benefit subscribers and their mobile operators alike.

Bohdan Zabawskyj is CTO, Redknee
www.redknee.com

With each day, telecommunication operators' market offerings grow in scope and complexity. It is therefore vital to present individual offers to end customers in an attractive, simple and understandable manner. Together with meeting target profits and other financial measures, this is the principal goal of the Marketing Departments of all communication service providers.

Within the OSS/BSS environment, forming clear and understandable Market Offerings is as important to the business as the factors described above. There is a huge difference between maintaining all key information about Market Offerings through various GUIs and different applications, and having it instantly at your fingertips in an organized manner. The latter option saves time and reduces the probability of human error, which makes a significant difference to both time-to-market and the accuracy of the offering, ordering and charging processes experienced by the end customer.

What is a Market Offering?

Market Offerings have the following principal aspects that are usually defined during the offer design process:

  • General idea (defining the scope of the offer)
  • Target market segment
  • Selection of applicable sales channels
  • Definition of services and their packaging
  • Definition of pricing
  • Definition of ordering specifics
  • Definition of the order fulfilment process
  • Marketing Communication (from the first advertising campaign to communication at points of sale or scripts prepared for call centre agents)
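
The aspects listed above can be sketched as a simple data model. The class and field names below are illustrative assumptions, not taken from any real Product Catalogue schema:

```python
from dataclasses import dataclass, field

# Illustrative data model for a Market Offering; all names are assumptions.
@dataclass
class MarketOffering:
    general_idea: str              # scope of the offer
    target_segment: str            # target market segment
    sales_channels: list[str]      # applicable sales channels
    products: list[str]            # productized services and their packaging
    price_model: str               # reference to a pricing definition
    ordering_template: str         # ordering specifics
    fulfilment_hints: list[str] = field(default_factory=list)
    marcom_campaigns: list[str] = field(default_factory=list)

offer = MarketOffering(
    general_idea="Flat-rate voice bundle for students",
    target_segment="youth",
    sales_channels=["web", "retail"],
    products=["voice_flat", "sms_pack"],
    price_model="flat_rate_v1",
    ordering_template="simple_order",
)
print(offer.target_segment)  # youth
```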

It is apparent that Market Offerings are not static objects at all; on the contrary, they are very dynamic entities, and most of a communication provider's OSS/BSS departments have some stake in their success.

This leads directly to the key question: "Which environment can support a Market Offering and enable unified and cooperative access to it by appropriate teams during the proper phases of its lifecycle?"

If it is to exist in practice, the environment that addresses all of the above-mentioned aspects must be materialized in the form of an information system or application.

Putting Clarity into Practice

The closest match to the requirements described above is an OSS/BSS building block called Product Catalogue.  

Product Catalogue is usually represented by the following three aspects:

  • A unified GUI that enables all key operations for managing a Market Offering during its lifecycle
  • Back-end business logic and a configuration repository
  • Integration with key OSS/BSS systems

Product Catalogue supports, with one exception, all aspects of the total Market Offering:

General idea - This enables the capture of the general idea, the keystone of the arch to be built.

Target market segment - This enables rule-based definitions of the target market segment that the Market Offering shall address. It should enable changes or further refinements to this segment in conjunction with the information system used for market segmentation purposes.

Selection of applicable sales channels - Together with the Ordering system or systems, Product Catalogue should enable the specification of eligible sales channels for the Market Offering.

Other eligibility rules - Beyond the first two forms of eligibility (segmentation and channel affinity), additional rules should ideally be definable.
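
Segment, channel and additional eligibility rules could, for illustration, be expressed as predicates evaluated against a customer context. This is a hypothetical sketch, not a real rule engine:

```python
# Hypothetical rule-based eligibility check for a Market Offering.
def is_eligible(customer, offering_rules):
    """All rules must pass for the offering to be presented."""
    return all(rule(customer) for rule in offering_rules)

rules = [
    lambda c: c["segment"] == "youth",           # target market segment
    lambda c: c["channel"] in {"web", "retail"}, # applicable sales channel
    lambda c: c["age"] < 26,                     # additional eligibility rule
]
print(is_eligible({"segment": "youth", "channel": "web", "age": 22}, rules))
# True
```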

Definition of services and their packaging - As the name ‘Product Catalogue' suggests, products (in principle, productized services) are its central elements. Products are packaged or bundled using Product Catalogue together with other parts of the usual Market Offering, such as Allowances (free units, etc.), Friends & Family or Closed User Group settings, VPN settings (for the corporate segment), etc.

Definition of pricing - Another key function of Product Catalogue is defining price models related to the Market Offering in general or to its parts. Price models can be quite complex and require well-defined/productised underlying services if they are to be applied with a certain level of simplicity and convenience.
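
As an illustration of why price models need well-defined underlying products, here is a minimal tiered-usage pricing sketch. The tier boundaries and rates are invented for the example:

```python
# Minimal tiered price model -- tier boundaries and rates are illustrative.
def tiered_price(units, tiers=((100, 0.10), (400, 0.07), (float("inf"), 0.05))):
    """Charge each consumed unit at the rate of the tier it falls into."""
    total, remaining, lower = 0.0, units, 0
    for upper, rate in tiers:
        in_tier = min(remaining, upper - lower)
        total += in_tier * rate
        remaining -= in_tier
        lower = upper
        if remaining <= 0:
            break
    return total

print(tiered_price(250))  # 100 units at 0.10 plus 150 units at 0.07
```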

Definition of ordering specifics - In the individual screens of an Ordering application's GUI, Offer/Order Templates are usually defined using an ‘Ordering Catalogue', which may or may not be part of the overall Product Catalogue. There are pros and cons to having an integrated or separate Ordering Catalogue, but these are outside the scope of this article; in any case, the basic offer structure and its parameters, with applicable/default values, should come from Product Catalogue by design.

Definition of order fulfilment process - As with Ordering, Product Catalogue is not usually the place where detailed Order Fulfilment and the subsequent Provisioning processes are defined. There is a variety of specialized systems for this on the market, each with its own unique configuration, so it is impossible for a single Product Catalogue application to cover all options. On the other hand, Product Catalogue should enable the storage of key ‘hints' that give these systems a general method of determining what shall be done when the Market Offering is ordered. This should be materialized in the form of ICT environment configuration.

Marketing Communication - This is the only function clearly outside the scope of Product Catalogue. There are already plenty of specialized Campaign Management tools and applications on the market, designed from the bottom up specifically to support MARCOM operations.

An Aspect of Integration

Functions supported by an ideal Product Catalogue also define OSS/BSS systems that should be integrated with it, namely: Market Segmentation System (could be some BI or Analytical CRM), Ordering, Order Fulfilment, Provisioning, Charging & Billing and CRM. All these systems should either provide some data to Product Catalogue or use it as the master source of the data related to Market Offerings.

The necessity of integration in general is unquestionable; the only remaining issue is determining how the integration will be done and what will be the overall cost. Deciding which type of integration will take place depends on a number of factors, discussed below.    

The Principal Dilemma

There are three principal options for positioning Product Catalogue within the OSS/BSS environment. Product Catalogue can be deployed:

  • As a standalone application
  • As part of a CRM system
  • As part of a Charging & Billing system

Product Catalogue as a Standalone Application

This option appears tempting at first: who could have a better Product Catalogue than a company specializing exclusively in its development? However, many unseen factors tend to surface later on, regardless of the shining chain of GUI screens that is often presented.

Does the telecommunications operator really have intelligent Charging & Billing processes in place, or smart customizations built on top of them? If a standalone Product Catalogue is deployed, the operator can forget about utilizing these special differentiating features unless it is willing to start a never-ending investment in customizations without clear TCO (total cost of ownership) or ROI (return on investment). It would also be unusual for a Charging & Billing vendor to be willing to provide detailed information about its price model definitions and other mechanisms to a third-party vendor, as these are often the key selling points of its Charging & Billing product.

Another disadvantage of this approach is that there is no single fixed point of integration for a standalone Product Catalogue. No vendor of the surrounding OSS/BSS systems will guarantee compatibility with it. Once again, a never-ending integration project is a risky disadvantage of this choice.

In the end, a standalone Product Catalogue may be a state-of-the-art application, but it will not provide the telecommunications operator with anything useful without extensive integration. Even assuming that this integration succeeds and results in a few months of perfect operation, shortly afterward a new set of features - vital for the telecommunication service provider's survival on the market - will certainly require implementation. This will probably affect the Charging & Billing side (the most common case) and/or the CRM & Ordering side. It may also turn out that the most charming features of a standalone Product Catalogue cannot be used at all, for lack of support in the surrounding OSS/BSS systems.

Product Catalogue as part of a CRM system

This is without a doubt a better option than the first choice, because at least one side of the integration is guaranteed; if Ordering is part of the overall CRM system, then two sides are in the safe zone.

The main disadvantage of this approach is that the pricing logic of a CRM system's Product Catalogue is quite poor, if present at all. Consequently, there is no principal gain in implementing a unified Product Catalogue as long as the definition of the price model and some additional key settings remain on the Charging & Billing side. Such a setup is quite far from the ‘Unified Environment' described at the beginning of this article.

Product Catalogue as part of a Charging & Billing system

Service/Product bundling is usually tightly coupled with price model definition logic, and this flexibility is in many cases, if not all, one of the cornerstones of the telecommunication operator's differentiating market offer.

Complex price modelling is the "holy grail" of profitability in every price-sensitive market. Even when an inexpensive, almost flat rate is applied to basic communication services (as in Austria, for example), there is also the richness of value-added services (some of which can be priced using quite challenging logic), which raises the profit of telecommunications operators.

Another point of view relates to the effort necessary to implement complex Market Offerings. Implementation on the Charging & Billing side is quite often the most challenging, compared with Ordering or CRM, for example. Order Fulfilment can also be quite a challenge, especially when introducing complex, fixed-mobile convergent packages for the corporate segment; however, Product Catalogue itself has no major effect on simplifying it. We can say that out-of-the-box compatibility between Product Catalogue and Charging & Billing significantly decreases a service provider's OPEX and markedly shortens time-to-market for introducing new Market Offerings and modifying existing ones.

It should be said that most of the top Charging & Billing systems provide Product Catalogue either as part of their latest releases or as an optional extension. Regardless of labels such as ‘Unified' or ‘Enterprise', the functional areas covered are quite similar, differing only in the degree of support for the individual aspects of the Market Offering described above. This level of support naturally increases with each new release of the component, so changing the Billing system for the sake of a better Product Catalogue component is an investment with quite uncertain returns. Overall functional richness and, most importantly, flexibility in the areas of pricing and convergence are the real key features of Charging & Billing systems nowadays.

Product Catalogue as a financial asset in the general OSS/BSS environment

Each of the three possible approaches described above would very likely lead to different results for CAPEX and OPEX.  Regardless of the option selected, the implementation of Product Catalogue should be justifiable by a clear gain on the service provider's side.

Business Benefits Coming from the Introduction of Product Catalogue

There is a variety of direct and indirect benefits linked to the implementation of Product Catalogue in the OSS/BSS environment. All of them relate to three qualities that accompany any successful introduction of Product Catalogue - clarity, accessibility and systematization.

Clarity

Product Catalogue is designed to support the management of Market Offering lifecycles. This gives all involved parties within the telecommunication operator a better understanding of the subjects concerned, the level of their involvement and their role within the process. It reduces the confusion that is experienced again and again, regardless of how well the processes are described on paper.

Accessibility

All Market Offerings are accessible and visible within a single environment, including the history of their changes and the Market Offering's sub-elements. Anyone, according to their access rights, can view the sections of Product Catalogue applicable to their role.

There is no risk of discrepancies between Market Offering related data in various systems provided that the Product Catalogue repository is the master data source as stated above. Accessibility to correct data is an important aspect of information accessibility in general.

Systematization

Product Catalogue not only enforces a certain level of systematization in Market Offering creation and maintenance processes, but also stores and presents all related business entities in a systematic manner, by default taking into account their integrity as enforced by business logic.

Measurable benefits

All three qualities - clarity, accessibility and systematization - can be translated into two key terms - time and money. A successful implementation of Product Catalogue brings significant savings on the telecommunication operator's side as well as guarantees a considerable shortening of time-to-market for introducing new Market Offerings. If these two goals are not accomplished by implementing Product Catalogue, such a project must be considered a failure.

SITRONICS Telecom Solutions is a leading vendor of convergent charging and billing solutions in Central & Eastern Europe, Russia and the CIS with a growing footprint in Africa and Asia.  www.sitronicsts.com

See Directory for further company information: http://www.eurocomms.com/directory

ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde

Mobile operators everywhere are facing something of a conundrum which goes like this: in saturated markets they must increase revenues from high margin data services but these are typically bandwidth hungry applications resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings at work and at home. As people exploit the benefits of the big new touch screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.

However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks. Which poses big questions for operators: how can they get broadband 3G networks inside to provide reliable, cost effective in-building coverage? How can they do it fast, without significant and expensive investment in macro networks and give customers access to the applications they want at prices they are willing to pay?

Fortunately, ip.access has the answers, since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.

Picocells and femtocells plug directly into existing fixed broadband networks be they DSL, cable, satellite or even WiMax. Acting as mini-base stations, both can be quickly installed anywhere in buildings or outside to bring networks to where the demand is.

These plug and play units have advantages for everyone. For users, both professional and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high margin value added services.

For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes, ships and other remote locations where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users and companies can dot them around the organisation to provide connections where needed.

Solving the problem for residential users, the Oyster3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, family members can get seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells will be up and running by spring 2009.

"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."

The difference between the two is one of size, power, capacity, functionality, price and target audience. However, both allow operators to add capacity cost effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated and people previously unable to get mobile signals can be connected to high-speed networks.

"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.

"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."

Plugging mini-base stations into existing broadband connections takes indoor data sessions off the macro network, raising the number of users each site can support and increasing its capacity beyond the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidized femtocells to 20 per cent of its subscribers' homes. Similarly, research firm Analysys-Mason calculates that the annual cost saving per customer for a large operator deploying 3G femtocells is between $6 and $12.
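
The two savings figures quoted above can be cross-checked against each other with simple arithmetic:

```python
# Cross-check of the quoted femtocell savings figures.
total_saving_eur = 500e6   # ip.access: 500m euros saved over four years
subscribers = 10e6         # for an operator with 10 million subscribers
years = 4
per_subscriber_per_year = total_saving_eur / subscribers / years
print(per_subscriber_per_year)  # 12.5 euros per subscriber per year,
                                # broadly consistent with the $6-$12
                                # Analysys-Mason figure
```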

Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case even if they are fully subsidised. Even ignoring the cost savings, it takes only an €11 per month increase in ARPU per household to cover the cost of fully subsidising a femtocell.
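
Taking the sub-$100 unit price anticipated elsewhere in this article as a rough cost ceiling (an assumed figure used here only for illustration), the €11 monthly ARPU uplift recoups a fully subsidised unit within the first year:

```python
import math

# Hypothetical payback period for a fully subsidised femtocell.
unit_cost = 100.0     # assumed subsidy per unit (illustrative ceiling)
arpu_uplift = 11.0    # euros per household per month (from the article)
months_to_payback = math.ceil(unit_cost / arpu_uplift)
print(months_to_payback)  # 10 months
```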

Operators are seeing an explosion in mobile data usage (in the UK, 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.

According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011 and IDC expects consumer spend on femtocell enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage is at home and the ‘digital generation' either does, or wants to watch mobile television at home.

As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."

Operators are urgently addressing the capacity problems such increases in data usage imply.  Some are capping monthly unlimited data plans while others encourage content developers to limit application bandwidth.  Neither measure is likely to be popular with users, and both may increase churn - which only strengthens the consumer proposition for deploying picocells and 3G femtocells.

While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise and operators can introduce new, high margin services to the handsets people already have.

They can encourage both usage and fixed mobile substitution by offering FemtoZone services which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be automatically sent to absent parents to notify them that children are back; podcasts, videos or images can be loaded to phones, or targeted advertising sent to interested users.
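
The location-triggered FemtoZone behaviour described above could be sketched as a simple event handler. The function and service names here are hypothetical, not part of any real femtocell API:

```python
# Hypothetical sketch of a FemtoZone trigger: when a known handset attaches
# to the home femtocell, location-tied services fire automatically.
def on_handset_attached(handset_id, subscribed_services, notify):
    """Run each subscribed FemtoZone service for the attaching handset."""
    triggered = []
    for service in subscribed_services:
        if service == "arrival_sms":
            notify(f"{handset_id} is home")  # e.g. text absent parents
        triggered.append(service)
    return triggered

sent = []
result = on_handset_attached("kids-phone",
                             ["arrival_sms", "podcast_sync"],
                             sent.append)
print(result)  # ['arrival_sms', 'podcast_sync']
```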

"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.

"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."

The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.

There are plans to include the new Iuh standard in Release 8 of the 3GPP standard, due out in December. It will replace the numerous different ways in which femtocells currently connect to networks via proprietary systems, and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors, and so can choose customer premises equipment (CPE) separately from the gateway.

Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.

Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems. 

Over the five years they have been used in enterprises, the advantages of the nanoGSM are well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. With an indoor range of up to 200 metres, traffic is backhauled through existing IP networks and it supports fast data rates over GPRS and EDGE to devices such as Blackberries. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.

Like its bigger cousin but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the 3G femtocell ease of use, price and installation challenges necessary to meet consumer needs, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells to both large enterprises and to SMEs through their existing channels.

"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."

It's a truism that everything is going mobile and operators are already feeling the capacity pinch. Pico and femtocells give them a cost effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.

Today's smart phones are as powerful and feature rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.

Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles; share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."

Do femtocells and picocells hold the key to the lucrative SME market?  Mark Keenan takes a look

Network operators have long tried to address that potentially very lucrative but hard-to-reach customer segment: the SME (small to medium-sized enterprise).  Over the years, operators have experienced differing degrees of success, but this is a market sector that has long been viewed as one of the biggest challenges the industry faces. However, an area of mobile communications that many analysts predict will soon be popular among consumers is increasingly being viewed as key to unlocking the SME revenue stream for all kinds of operators.

The technology in question centres around indoor base stations, also referred to as femtocells and picocells, predicted by ABI Research to account for 102 million users worldwide by 2011.  In essence, these are small indoor access points - think of a slim paperback novel - that are designed to provide dedicated mobile network coverage within a limited area, such as a house or office.  Unlike larger macro cells, these units are relatively low-power devices and manufacturers are designing them to be as ‘plug and play' as broadband modems have become.  Another key difference between traditional base stations and these scaled-down versions is that they link back to the service provider via a broadband line (usually xDSL) to provide network backhaul, rather than using a leased line or microwave link.

Picocells and femtocells have much in common and employ the same base technology, but they differ in that picocells are higher capacity and provide extra features for the business market, such as the ability to support larger numbers of simultaneous users, to chain together picocells to create a network, and to integrate with existing IT environments.  Femtocells, on the other hand, are lower capacity and have less inbuilt ‘intelligence', but are cheaper and designed for the consumer mass market.

Indoor base stations address a very real market issue, namely: the problem of achieving high quality indoor coverage.  Many mobile networks - particularly in busy city and town centres - are already overloaded, with too many subscribers placing demands on the network at any one time.  Furthermore, the nature of radio based systems means that there will inevitably be weak spots in network coverage caused by a variety of obstructions, ranging from trees and hills through to buildings and walls.  Even thick modern double glazing can create a problem.

A recent research-based report from analyst firm Quocirca revealed that approximately one third of SMEs had experienced problems with indoor coverage at work, with the figure rising to 45 per cent when those same users were at home (as is often the case with SME executives).  Yet despite the fact that buildings are not ideal for mobile communications, more than half of all mobile calls are believed to be made within buildings, and our reliance on mobile devices as a business tool continues to increase.  Think of the number of people who live on their PDAs, whether at work, in a meeting or working from home.

Does it matter?  Well, as the fight to attract and retain subscribers becomes harder and harder, the emphasis on service quality inevitably increases.  Indeed, a US study carried out by Telephia indicated that over a fifth of customer churn was a result of poor network coverage.  Research firm InStat has stated that the biggest challenge facing mobile subscribers is the lack of indoor coverage of 3G signals and warns operators that their success with 3G services will be limited unless they address this issue.

This is the operators' dilemma.  While they are banking on a return on investment from their 3G networks, 3G signals by their very nature find it even harder to penetrate buildings than 2G.  At the same time, the kind of services that 3G lends itself to so well - mobile data and TV - place greater demands on the network than ever before.  Yet building whole new landscapes of macro cells is not an option, both in terms of cost and environmental restrictions. This is why so many players in the industry - not just analysts, but vendors and operators - believe that indoor base stations are the solution for overcoming the network traffic logjam.  Furthermore, they could help to enable new operators to enter the mobile market.

It would be wrong to think of indoor base stations in terms of 3G alone.  For some time now, a couple of vendors (including RadioFrame) have been deploying 2G units to network operators in Europe.  In RadioFrame's case, this includes providing business customers of Orange with picocells that enable the operator to improve service quality where needed.

Quocirca's research underlined the fact that while growth of mobile data is happening, voice services are still business users' primary focus and where they have concerns about service quality and cost.  And let's not forget that most of these business users are still on 2G. They are also receptive to fixed mobile convergence, if presented attractively and cost-effectively.

While femtocells may not hit the mass market for a couple of years yet, indoor base stations could well prove the solution to maintain customer satisfaction among the SME community.  Looking ahead, these ‘mini cells' can also be used to achieve fixed mobile substitution, by enabling users to reduce expensive mobile call costs by using mobile broadband IP connections.

Ultimately, picocells could be used to enhance PBX services.  Mobiles could be integrated with the PBX to support call transfer, hunt groups and virtual fixed lines, for instance.  Potentially, picocells could replace the fixed PBX completely with a wireless PBX solution, or even supplant traditional WANs and LANs, although as these are so well-embedded in IT culture, that is certainly not going to happen overnight.  It's interesting to note, however, that the technology is pretty much there to achieve this.

Where are we now?  Apart from the deployment of picocells to business users, femtocells - both 2G and 3G - are being developed and trialled around the world, with a number of products and services from a variety of vendors and operators expected to launch at the end of 2008 or early 2009.  Some markets are more developed than others: while Europe is expected to be one of the fastest-growing pico/femto markets, Sprint in the US announced its own femtocell solution in August 2008. 

No new market sector is without potential pitfalls.  Mass-market roll-out has been cited as a barrier, and this certainly needs addressing soon.  Most mobile and fixed operators are used to supporting the deployment of voice-centric mobile phones, but as some of them have found, mobile data demands far more technical support.  Furthermore, there is a big difference between distributing mobile phones and PDAs - whether via shops or courier delivery - and rolling out thousands, and ultimately millions, of indoor base stations.

This is why it is crucial that pico and femtocells are ‘plug and play' and involve ‘zero touch' deployment. In other words: devices that can be installed by the customer; remotely activated by the operator; and (in the case of RadioFrame's own product line) even remotely updated, all without a truck roll.
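To make the idea concrete, here is a minimal Python sketch of such a zero-touch flow. The class and method names are purely illustrative, not a real vendor API; the point is simply that the unit registers itself over the customer's broadband and the operator activates it remotely, with no engineer visit.

```python
class IndoorBaseStation:
    """Hypothetical pico/femtocell unit; names are illustrative only."""
    def __init__(self, serial: str):
        self.serial = serial
        self.state = "factory"      # shipped pre-configured, not yet live

    def power_on(self):
        # Customer self-install: the unit phones home over broadband.
        self.state = "registering"


class OperatorOSS:
    """Stand-in for the operator's remote provisioning system."""
    def __init__(self, authorised: set):
        self.authorised = authorised

    def activate(self, unit: IndoorBaseStation) -> bool:
        # Remote activation: no truck roll required. Only units the
        # operator has authorised are brought into service.
        if unit.state == "registering" and unit.serial in self.authorised:
            unit.state = "active"
            return True
        return False


unit = IndoorBaseStation("RF-0001")
unit.power_on()
oss = OperatorOSS(authorised={"RF-0001"})
print(oss.activate(unit), unit.state)   # True active
```

A remote-update step would follow the same pattern: the operator pushes new firmware to units already in the "active" state, again without a site visit.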

In 2007, the Femto Forum was created by a number of vendors and operators to agree jointly on a way forward regarding industry standards.  It is rare for a new technology not to provoke some dissent between players, but the development of universally accepted standards is happening at a relatively steady pace.  Certainly, standards should not be viewed as a total barrier to indoor base station deployment - after all, picocells are already in commercial operation - though of course, interoperability is very desirable for the market's future.

Another danger is the tendency to ‘over-hype' picos and femtos.  Realistically, do we really think that every business and consumer will have one within the next 12 months?  I would say not.  So let's not raise expectations to a ridiculous point, or develop business plans based on hope rather than common sense.

That said, the benefits of picos and femtos - to operators and users alike - are very clear.  While predictions on timescales, volumes and market expectations may vary, the general consensus would seem to be that indoor base stations are central to the future of the mobile industry, from 2G to 3G and beyond, not just for consumers but to support business users too.

Mark Keenan is General Manager for Europe, Middle East and Africa, RadioFrame Networks Inc. 
www.RadioFramenetworks.com

By Carsten Storbeck, director of product management with ADC KRONE

Fibre-to-the-home (FTTH) is certain to happen. In some countries it is well advanced, with customers enjoying data speeds of 100Mbit/s into their homes. In other territories, carriers are trying to squeeze the last few years out of their ageing copper networks but the best they can achieve is around 50Mbit/s. And this simply will not satisfy consumer demand in the coming years. The process of replacing copper cables with fibre is undoubtedly expensive but it must nevertheless happen sooner or later. Otherwise the telecomms companies will lose their broadband business to the cable TV operators.

From a technical point of view, laying fibre as far as every home is not difficult. However, installing fibre cables inside customer buildings can be a far less simple operation. This is particularly the case in continental Europe, where more than 70 per cent of people live in flats, apartments, terraces or town-houses, termed collectively as multi-dwelling units or MDUs.
 
Installing ‘traditional' copper cable in these buildings for telephony was easy. This new task is not. Network providers need to deliver broadband at 50 or 100Mbit/s (perhaps 1Gbit/s) to each dwelling. They could use Category 5e/6 copper cable, but this has a distance limit of 90 metres from the external fibre termination and requires electric power, and probably an uninterruptible power supply, in addition.

A far better alternative is to extend the fibre direct to each and every dwelling within the MDU, but until now this has been a difficult and costly process. Every fibre route must be measured with extreme accuracy and individual fibre cables manufactured to the correct lengths. This is an expensive and time-consuming process and three or four weeks may pass before the cables are delivered to site for installation.

Alternatively, the site technician may be able to install fibre in the cable risers from the basement to the fibre distribution points on each floor and then provide smaller fibre cables to each dwelling. Very great care is necessary, because standard fibre cables cannot tolerate the rough treatment, crude fixing methods and sharp-radius bends that are normal with copper cables. Fibre cables are simply not compatible with technicians' current working methods.

A skilled (and therefore expensive) fibre-splicing technician must either perform the whole job or else visit the site after the cable-laying is complete in order to splice all the fibre cables. This can frequently involve a hundred or more splice joints, making it a lengthy and expensive process, the more so because every splice must be tested afterwards.

In short, cabling a multi-dwelling unit for fibre has been an expensive process until now. This was before the recent launch by ADC KRONE of a fully-modular, ‘plug-and-play' fibre installation system.

The new breed of fibre system for MDU applications includes fibre cable developed using military experience.  It can be stapled to all kinds of architectural fittings without damage and bent around every type of right-angle found in buildings (on average every horizontal run needs to pass around 15 right-angles). It can even be crushed repeatedly without either damage or degradation of signal.

Accompanying this rugged cable are highly ingenious fibre distribution points that include a concealed cable-reel pre-loaded with 30, 60 or 90 metres of pre-terminated fibre cable that connects back to the previous distribution point.

In fact there are only four components in this system and, even with the different fibre-length variants, only nine component variants, all of which can be stocked and held in the technician's vehicle. With these nine variants in his van, the wireman can arrive and start work immediately. There is no need for a site survey, nor a four-week wait while custom fibres are manufactured.
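As an illustration of how simple the stocking logic becomes, choosing a cable reduces to picking the shortest pre-terminated reel that covers the measured run back to the previous distribution point. This is a sketch assuming the 30, 60 and 90 metre variants described above; the function itself is hypothetical, not part of the ADC KRONE product.

```python
# Pre-terminated reel lengths described above, in metres.
REEL_LENGTHS_M = (30, 60, 90)

def pick_reel(run_length_m: float):
    """Return the shortest stocked reel covering the run, or None."""
    for length in REEL_LENGTHS_M:
        if run_length_m <= length:
            return length
    # Beyond 90 m, a custom-length cable would still be required.
    return None

print(pick_reel(42))   # 60
print(pick_reel(95))   # None
```

Because every run maps onto one of a handful of stocked parts, the four-week wait for made-to-measure cables disappears.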

Because the fibre cables are all pre-tested at the factory, the only testing required on site is to check the signal levels in each customer dwelling. With this novel approach, the whole process is just as simple as installing old-fashioned copper cable. The components are simple, durable and long-lived, and far less expensive than existing MDU fibre distribution. Major telecomms carriers in the USA, where the equipment has been proven in both central office and field environments, have cut installation costs by 60 per cent in this way.

Taking place from 29 September to 2 October, the International Engineering Consortium's (IEC) Broadband World Forum Europe 2008 will once again gather the world's top ICT thought leaders at the Brussels Expo in Brussels, Belgium.
Co-located with host sponsor Belgacom's 35th Annual ICT Symposium, the Broadband World Forum Europe 2008 will represent the entire ICT value chain and provide winning solutions to those eager to maximize the promise of broadband.

"In collaboration with the co-located ICT Symposium, this year's event will combine telecommunications solutions with innovations for enterprise," comments IEC President John Janowiak. "The exhibition and workshops will cover technology, business, strategic and operational issues on topics such as mobility, collaboration, security, and risk management, and address the most promising alternatives for moving forward in the ICT industry."

Themed "Delivering the Promise," the Broadband World Forum Europe 2008 will present key industry leaders at the event including World Forum Chair Scott Alcott, executive vice president of the service delivery engine at Belgacom; and Keynoters Didier Bellens, chief executive officer of Belgacom; Pat Russo, chief executive officer of Alcatel-Lucent; John McMahon, president and managing director of the European department at Sony Pictures Television International; Carl-Henric Svanberg, president and chief executive officer of Ericsson; and Julio Linares Lopez, chief operating officer of Telefonica de España.

The event provides "a great opportunity to see what is happening, what is on the future time path, and great interaction with all of the professionals around the world," declared World Forum Chair Scott Alcott.

Industry professionals will have the opportunity to learn from more than 250 global leaders at the forefront of broadband service, applications, and technologies and an opportunity to learn first-hand from some of the world's top product experts displaying the latest cutting-edge advancements from more than 90 key players exhibiting on the floor. The IEC will present its renowned world-class education in more than 50 keynote addresses, plenary panels, workshops, and sessions.

The World Forum has a history of drawing the world's ICT decision makers: 87 per cent of last year's attendees were at manager level or above.  In addition, more than 100 service provider companies travelled from around the globe to conduct business at the conference and exhibition. 

The Broadband World Forum's InfoVision Awards will also take place, honouring the most innovative and beneficial technologies, applications, products, advances and services in the industry.

Key event sponsors of the Broadband World Forum Europe 2008 include Official Host Sponsor Belgacom, as well as Nokia Siemens Networks, Alcatel Lucent, Ericsson, Huawei, NEC, Thomson, ZTE, Italtel, ADVA, Allied Telesis, Astra, AVM, Actelis, ADC, Soapstone Networks, and Cisco.

The IEC welcomes all ICT industry professionals to attend, especially those involved with Carriers/Service Providers, Enterprise, Software Providers, Integrators/Aggregators, Content, Over-the-Top-Carriers, Residential, Equipment Manufacturers, and Semiconductor Manufacturers.
www.iec.org/events/2008/bbwf/register/

Pay by mobile
A new analysis of the global mobile payments opportunity forecasts that 2.1bn mobile subscribers will "pay by mobile" for digital goods downloaded to their mobile phones by 2013. Juniper Research defines digital goods as music (ringtones and full tracks), tickets, TV, user-generated content, infotainment and games - in fact any content bought by phone and delivered to the phone.

A region-by-region analysis by Juniper Research found that there is a significant growth opportunity not only for mobile payment systems, software, support and consultancy services vendors, but also for mobile operators to increase their arpu as transaction frequencies accelerate.

Report author Howard Wilcox notes: "Many digital content goods and services are becoming basic ‘must haves' - particularly in the sub 35 age group.  Devices like the iPhone - even in its 3G incarnation - are undoubtedly contributing to consumer awareness and usage of mobile music services. People who are 15 to 20 today will expect to buy directly with their phones and will drive this market over the next few years."
Highlights from the report include:

  • Users are forecast to make at least two payment transactions per month for digital goods by 2013
  • Nearly half of all mobile phone users will have bought digital goods at least once with their phones by 2013
  • The two leading regions (Western Europe and Far East & China) will account for over 50 per cent of the total digital goods gross transaction market value by 2013.

Howard Wilcox continues: "Even though typical transaction sizes will remain in the $3-$5 bracket, a sufficient number of users will be using their mobiles to buy music, games, tickets, infotainment and the other digital goods sufficiently often to see gross transaction value grow nearly seven-fold by 2013."
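The scale implied by these figures can be sanity-checked with simple arithmetic. Taking the 2.1bn forecast users, the forecast minimum of two transactions per month, and the midpoint of the quoted $3-$5 bracket (the $4 average is an assumption for illustration, not a Juniper figure):

```python
subscribers = 2.1e9     # 'pay by mobile' users forecast for 2013
tx_per_month = 2        # forecast minimum transactions per user per month
avg_tx_usd = 4.0        # assumed midpoint of the quoted $3-$5 bracket

# Annual gross transaction value implied by those inputs.
annual_gross_value = subscribers * tx_per_month * 12 * avg_tx_usd
print(f"${annual_gross_value / 1e9:.0f}bn per year")   # roughly $200bn
```

Even on these conservative inputs the implied gross transaction value runs to hundreds of billions of dollars a year, which is why the report flags the opportunity for payment vendors and operators alike.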

The report also focuses on purchases of physical goods - ranging from gifts to household goods to electronics - via the mobile web. It provides six-year regional forecasts of mobile payments for digital and physical goods, providing data on subscriber take-up, transaction sizes and volumes as well as detailed case studies from companies pioneering in this market. Juniper Research interviewed 37 senior executives across a wide range of vendors and operators.

Whitepapers and further details of the study, 'Mobile Payment Markets: Digital & Physical Goods 2008 - 2013' can be downloaded from www.juniperresearch.com

Wireless trends
The Wireless Technology Trends Report fills the need for a truly comprehensive wireless analysis, designed to serve companies that need to understand the position of individual technologies in the context of the overall market. The report is written for companies involved in, or potentially entering, the wireless market that want a general overview accompanied by detailed forecasts.

"Over the last 12 months, many of the emerging wireless technologies have begun to exploit market sectors ranging from home automation to industrial and consumer electronics," says Dr Kirsten West, Principal Analyst of WTRS. "The adoption of wireless as a pervasive technology is not a matter of ‘if', but when. The consumer desire for increasingly unfettered wireless connectivity is clear."

The report's key findings include:  

  • Ultra wideband has shifted from a nascent technology to a solidly emerging protocol underlying the certified wireless USB and Bluetooth "high speed" implementations.
  • Certified wireless USB will enable wireless transfer of photographs and other media from consumer electronics devices to personal computing and output devices.
  • Bluetooth has undergone an expansion campaign over the last 18 months to incorporate newly-emerging protocols such as near-field communications (NFC), Wibree, and Ultra Wideband.
  • ZigBee is poised to make great strides in the next year and many new products are expected to be released to market. It is very possible that the momentum of ZigBee will finally surge in 2008.
  • Wireless delivery of high definition video content is an area that has very much emerged over the last 12 months. The market for wireless delivery of high definition video promises to be large, with decisive consumer adoption once the technology has been proven in end products.
  • WiMAX has become a significant competitor to alternative technologies in emerging markets.

www.researchandmarkets.com/product/641244/wireless_technology_trends_report_2008 
 
Research push
On 10 September, the European Commission officially launched 14 research projects, which are part of the FIRE initiative for Future Internet Research and Experimentation.

The launch event in the Paris City Hall was opened by Jean-Louis Missika, Deputy Mayor of Paris in charge of Innovation and Education, and Gilles Bloch, Director General of Research and Innovation at the French Ministry of Education, Higher Education and Research. More than 165 people attended, including researchers, innovation managers and European experts from national and European research projects, as well as representatives of European and national research funding institutions.

The event, which took place in the context of the French EU Presidency, was organised by the Directorate General for Information Society and Media of the European Commission in cooperation with the EC-funded projects FIREworks and OneLab2.

The 14 FIRE projects contributing to the event are funded by the EC under the 7th Framework Programme for Research (FP7) and cover a wide range of research topics in the area of Future Internet Research and Experimentation, including advanced networking approaches to architectures and protocols, coupled with their validation in large-scale testing environments, as well as interconnection of test beds that enable experimentation on a large scale. The total planned budget of these projects is more than 58.5 million euro, of which the EC funds about 40 million euro.
www.ict-fireworks.eu

Premium content
The mobile communication market in Europe has reached a saturated, mature phase. Mobile penetration exceeds 100 per cent in many western and eastern European countries, and many others are rapidly approaching full penetration. It is clear that the mobile industry in Europe must invest in, and commit to, other services and applications in order to grow effectively. Mobile premium content services and applications represent a potential source of significant revenues for the industry.

New analysis from Frost & Sullivan, European Mobile Premium Content Markets, finds that the market (including revenues from mobile music, mobile games, mobile video/TV and mobile graphics) was worth €2.68 billion in 2007 and is estimated to reach €11.0 billion in 2012.
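Those two figures imply compound growth of roughly a third per year over the five-year forecast period, as a quick check shows:

```python
# Implied compound annual growth rate of the Frost & Sullivan forecast:
# €2.68bn (2007) growing to €11.0bn (2012), i.e. over five years.
start_eur_bn, end_eur_bn, years = 2.68, 11.0, 5
cagr = (end_eur_bn / start_eur_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # about 33 per cent a year
```

Growth of that order underlines why content is being treated as the industry's replacement for declining voice and SMS revenues.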

"Content is the new horizon for the European mobile industry," notes Frost & Sullivan Research Analyst Saverio Romeo. "During the last three years, mobile operators have been observing a slow, but continuous decline in the average revenue per user (arpu) due to the decrease of voice and SMS arpu. New sources of revenues are needed: content is an excellent candidate."

Content types such as music, video/TV and games are leading content growth. However, new services and applications such as mobile social networking, mobile search and location-based services are gaining momentum. All these services, which can be defined as content tools, allow users to personalise, search and share content with other users. Business models are also shifting towards advertising-based approaches.

"In order to exploit the variety of revenue-generating business opportunities, the industry has to face some critical challenges," cautions Romeo. "Consumers will use content on mobile devices if the industry is able to offer high-quality content with an excellent user experience at affordable prices."
www.frost.com

    

