Much is being made of rival broadband access technologies and their prospects for making radical impacts on the telecoms marketplace. Phil Irvine and James Bennett explain that their analysis suggests that in a market where broadband to the home is currently dominated by fixed access, wireless can play a role - but only in limited areas. In more mature markets they see the dominance of incumbents' DSL offerings continuing, and believe these operators will be best placed to meet emerging demand for higher speeds with fibre services. In developing markets, however, the absence or poor state of fixed infrastructure, together with regulatory policy, can make the relative deployment cost of wireless broadband very favourable. They urge prospective investors, suppliers and operators to proceed carefully: many crucial choices need to be made, such as which territories, services and customers should be targeted.

The introduction of broadband access has been a huge driver of the growth of the telecoms sector. The services enabled by broadband have had a profoundly beneficial impact on people and businesses by changing the way they interact with each other, access information and entertainment, and conduct business. Around the world, demand continues to grow for higher access speeds and wider availability.

The telecoms industry faces significant uncertainty on how best to meet this demand. Key questions for operators and investors are whether fixed broadband access can be displaced by wireless access or emerging technologies, maybe including non-mainstream options such as Broadband over Powerlines (BPL).

These technology choices are characterised by the disruptive potential each could have on the industry structure. Investment in the wrong technology could be catastrophic for investors, operators and economies. On the other hand, getting it right could shake up the industry.

Which technology will dominate the broadband access market will differ from country to country, as will where it is deployed within each country. The outcome will be determined by the state of user demand, technology maturity and economics, geographic coverage, and regulatory policies towards infrastructure investment. Our view of which technologies will win out, and where, is summarised in the table below.

In developing markets, wireless broadband access technologies can play a major role so long as the regulatory environment is designed to encourage their development. In particular, wireless broadband can be seen as a viable solution for serving currently underserved areas. There is also the potential for new access technologies such as BPL to play a role, depending on whether technical limitations can be overcome.

By contrast in more mature markets, given the emerging regulatory focus on ‘access bottlenecks', broadband technologies will be dominated by fixed rather than wireless systems. In this respect there will be limited scope for new infrastructure-based operators to compete effectively and the success of wireless broadband will depend on the utility arising from mobility, not fixed access.

However, many questions remain for suppliers, operators and investors. Which territories and customers should you target? What services will succeed? How should you deploy your network? What partners do you need?

The rise and rise of broadband continues - but will it reach a point where the highest access speeds can only be met by fixed fibre?

Broadband access has been a key growth service for fixed telecoms operators around the world, and its importance is heightened by the decline of traditional telephony. Demand in mature markets is characterised by continuing growth in the data rates needed to access more, and faster, services. Ten years ago the typical access rate was a dial-up line at 56kbit/s; today's typical service in mature markets is 2Mbit/s to 10Mbit/s.

These speeds are enabled either through DSL technology or by cable modems, from incumbent telecoms and cable TV operators respectively. Both technologies have limitations that restrict the type and speed of services that can be delivered across them - high-definition video, for example. Only fibre can support the data rates of, say, 50Mbit/s and upwards that such service portfolios require. Deployments are already starting to take place, most notably in Taiwan, Japan and Hong Kong. In Europe a number of operators have launched Next Generation Networks (NGN), which involve fibre deployments, often to a distribution cabinet rather than the home. There is currently no foreseen role in this for wireless technology.

Fundamental economics favour DSL over wireless broadband - but only where fixed infrastructure exists.

For lower data rates, where wireless broadband speeds can compete with DSL, the underlying economics strongly favour DSL, as shown below. The cost of broadband deployment is dominated by access costs, which account for nearly two thirds of all operating costs over the first five years. The economics of deployment also strongly favour existing operators, whose scale efficiencies from widespread asset ownership mean the incremental costs are far lower than for a greenfield new entrant. As such, wireless broadband as an access service is a viable solution only where fixed infrastructure is not deployed.
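The shape of this argument can be sketched numerically. The figures below are entirely hypothetical, per-subscriber illustrations (only the roughly two-thirds access share is taken from the analysis above); they simply show how reused fixed assets depress an incumbent's incremental cost relative to a greenfield build:

```python
# Illustrative five-year, per-subscriber cost model comparing an incumbent's
# incremental DSL deployment with a greenfield wireless build.
# All monetary figures are assumed values for illustration only.

def five_year_cost(capex_per_sub, annual_opex_per_sub, access_share):
    """Return total five-year cost and the portion attributable to access."""
    total = capex_per_sub + 5 * annual_opex_per_sub
    return total, total * access_share

# Incumbent DSL: reuses existing copper, so incremental capex is low.
dsl_total, dsl_access = five_year_cost(capex_per_sub=80,
                                       annual_opex_per_sub=60,
                                       access_share=0.65)

# Greenfield wireless: must build the radio access network from scratch.
wl_total, wl_access = five_year_cost(capex_per_sub=250,
                                     annual_opex_per_sub=70,
                                     access_share=0.65)

print(f"DSL five-year cost: {dsl_total}, of which access: {dsl_access:.0f}")
print(f"Wireless five-year cost: {wl_total}, of which access: {wl_access:.0f}")
```

Under these assumed inputs the greenfield entrant carries a markedly higher total, with access costs dominating both cases - which is the structural point, whatever the actual local figures turn out to be.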

This lack of opportunity for new technologies - and hence new operators - to enhance fixed access competition puts a clear focus on the role of regulation. Unfortunately there seems to be no consistency in policy among regulators around the world. Some, such as Ofcom in the UK, have been active in suggesting a series of principles for regulating the 'access bottleneck'. Others, such as the FCC, have applied a policy of 'forbearance', effectively relieving operators of the obligation to provide interconnection. The risk is that in setting a favourable investment climate, regulators are allowing operators to develop, and possibly abuse, a position of dominance.

The absence of DSL in developing markets presents an opportunity for wireless broadband operators especially in rural or underserved areas.

In less mature markets, the development route may be quite different. Fixed physical infrastructure is often far less widespread than in mature markets. Further, the success of mobile services in recent years has drawn traffic away from fixed services, reducing the means to upgrade and extend fixed infrastructure. So in Saudi Arabia, for example, where only 70 per cent of households have fixed access, demand in currently underserved areas is for any form of access. Typical access speeds are accordingly lower, and so the demand for higher speeds is quite different from that in mature markets.

A consistent feature of emerging markets is a regulatory policy aimed at encouraging the development of infrastructure by preventing resale and encouraging access in under-served areas. Unbundled local loop DSL is therefore often unavailable, and market prices for wholesale broadband are held higher than they would be if a resale market were available. This presents an opportunity for wireless broadband to play a more significant role in urban and suburban areas, particularly where incumbents are slow to respond to the threat of a new entrant. It also creates a paradox in regulatory regimes: the perception that infrastructure competition is an essential feature of a competitive market works against the regulatory aims of reducing prices and increasing broadband penetration.

Our analysis of the costs of deployment in developing markets suggests that regulatory impediments to unbundling and the high cost of wholesale DSL create an opportunity for wireless broadband. In the long run wireless broadband could dominate, as its presence should inhibit further deployment of fixed infrastructure.

Broadband access is a fundamental service in the telecoms portfolio of all operators. Fixed access will continue to be dominated by incumbents' DSL and fibre services in developed markets; new opportunities will mainly exist for wireless as a nomadic and mobile service. In developing markets, wireless broadband can play a dominant role as an access service - but only in certain areas, subject to there being sufficient demand in those areas - and its prospects are strongly influenced by the role the regulator plays in encouraging the development of infrastructure competition.

Phil Irvine and James Bennett, PA Consulting Group

In today's financial environment, blended services may be the key to survival for telco operators. With shareholder and investor audiences becoming increasingly difficult to please, fixed line and mobile operators need to identify new revenue streams - and one way of doing this is by looking within. Many operators have excellent applications and service environments, but they are split in two - one for their next generation networks and a second, older environment for their legacy networks. If a link existed that could sit seamlessly between the two and share applications across them - or blend services - operators would be able to get the most out of the applications they currently have, says Mike Jones.

Service blending is the practice of taking more than one service and combining them to make something new. Think of making a fruit smoothie - mixing bananas, strawberries and oranges together isn't something that naturally occurs in the wild, but blended they make a delicious treat. Considered separately these fruits are all delicious in their own right; blended, however, they have created a business where there was not one previously - the act of blending is considered a value-add above the fruits alone, and can therefore command a higher price.

Also consider that this business was created with ingredients that were already lying around the kitchen.  There are people who are content with buying the fruits individually, but there will always be some people who are tired of the same old fruit, and will be willing to try something new if for no other reason than to break the monotony.  This smoothie market was created when buyers were presented with something that they had not thought of or seen before.  They expected to eat regular fruit, however when presented with something new, they were delighted with the prospect of experiencing a new sensation.  This new sensation is what attracted their attention and money.

The last point to make with this analogy is that the key to unlocking this market was the tool - the blender!  The tool is the enabler for this new market.  Without the tool, the process of making the smoothie might have proven to be too expensive or ineffectively blended the ingredients.  The right choice of tool is crucial to blending the fruit into a new, delicious, and refreshing beverage that opens new opportunities and revenue streams.
Blending telecommunications services has a great deal in common with blending fruit.  Both need:

  • Ingredients: Preferably ones that are already being used and therefore are readily available
  • Innovation: Thought leaders that can see an opportunity for a new product or service
  • Tool: An efficient enabler that provides the link between the idea and the product or service.

Telcos have the first two items, but are missing the right tool that can easily and cost effectively turn their ideas into reality.

Services are currently deployed as discrete functions within a network, which are akin to the individual fruits in a smoothie.  SMS, voice mail, automated outbound calling, and pre-paid are all examples of discrete functions that are present in most service provider networks today.  These services are discrete due to the complexity of interworking the application with the network, and this complexity is a leading cause of inefficiency in telco networks.
What if these services could be unlocked and offered freely for consumption by application developers? Applications are rarely adopted universally by every subscriber in a network, so there are opportunities to repackage one or more of these discrete functions into a new service that a new user will consume. For instance, a mobile subscriber may think of automated dialing only as something a telemarketer would use. However, that same subscriber may view an automated wake-up call service as a useful feature. This is just one simple example of how the same discrete network function can be blended with, for example, SMS to create a new service using piece parts already in the network.
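The wake-up call example can be sketched in code. Everything here is hypothetical - the class and method names stand in for whatever interfaces a real service-exposure layer would present - but it shows the composition pattern: two discrete network functions wrapped behind simple interfaces and combined into a new service:

```python
# A minimal sketch of service blending: two discrete network functions
# composed into a new blended service. All names are hypothetical
# stand-ins, not any vendor's actual API.

class AutoDialer:
    """Stand-in for the network's automated outbound calling function."""
    def place_call(self, number, message):
        return f"Called {number}: {message}"

class SmsGateway:
    """Stand-in for the network's SMS function."""
    def send(self, number, text):
        return f"SMS to {number}: {text}"

class WakeUpService:
    """New blended service built from the two existing building blocks."""
    def __init__(self, dialer, sms):
        self.dialer = dialer
        self.sms = sms

    def wake(self, number):
        # Blend: ring the subscriber, then confirm delivery by SMS.
        call = self.dialer.place_call(number, "This is your wake-up call")
        confirmation = self.sms.send(number, "Wake-up call delivered")
        return [call, confirmation]

service = WakeUpService(AutoDialer(), SmsGateway())
events = service.wake("+441234567890")
```

The point of the pattern is that `WakeUpService` adds no new network capability at all - the value comes purely from recombining functions the operator already owns.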

As the market continues to take shape, existing enhanced services are prime candidates for incremental innovation and ARPU enhancement. By leveraging existing enhanced services and creating innovation 'on top' of them, service providers complement an understood user experience while at the same time enabling an ecosystem that reinforces the first social network application: voice services.

Innovation comes into play for blending old with new.  Using the smoothie example, consider the fruit smoothie discussed earlier as being the "old" technology.  Now consider a "new" technology, protein powder, which is being used by fitness enthusiasts.  The blending of the "old" and the "new" in this case has enabled the smoothie vendor to start selling protein powder to a different audience, effectively creating a new market of fitness beverages.
This analogy once again carries into the telco domain when applied to the legacy network and the NGN. There are new services being created for the NGN all the time, but how well do these applications work with the mainstay applications in the legacy network? Given that most NGN services duplicate the core functions of the legacy network, it's safe to assume that the new and the old interact very little or not at all. Which will be more profitable: repackaging an existing service to address a new market, aimed at revenue growth, or duplicating a service to the same market for some nominal cost savings?
An example of service innovation in the telco market is the blending of "new" IT policy enforcement capabilities, such as web browser parental controls, with the "old" pre-paid application.  This policy enforcement could be extended to control who, when, and where phone calls can be made or received.  The blending of these technologies is another clear example of how two disparate technologies can be brought together to create a product that is marketed to people from two separate demographics - voice and IT security. 
The key that unlocked the smoothie market was the blender.  The right tool made the process of making the smoothie quick, efficient and cost effective.  The telcos also need a tool like the blender that will unlock their services for the purpose of creating something new.  This tool will:

  • Protect the telco by ensuring that their services operate independently from the underlying network
  • Prevent vendor lock-in by opening up the core network services as building blocks for new applications
  • Support telco and IT technologies such as IN and web services for the cultivation of multiple ecosystems
  • Create service building blocks from the old and new networks for rapid creation of innovative services
  • Support the reliability and scalability required by large scale services
  • Unlock trapped ARPU

What service providers must do is protect the value and innovation potential of their legacy network by ensuring that their services can be offered independent of the underlying networks and, more importantly, independent of the vendors enabling those networks. Service providers who choose to open up their applications for mass consumption have the potential to open up new markets. By doing this, telcos can avoid falling into the vendor lock-in trap, and ensure that their investments in new technologies achieve maximum ROI.

Mike Jones is a Sales Engineer with AppTrigger

The launch of Next Generation Networks (NGN) has brought about a paradigm change in the telecommunications market. NGNs pose a particular challenge for charging and billing systems, which are required both to meet customers' desire for simple tariffs and to handle the growing complexity of products. After all, say Thomas Jaekel and Lothar Reith, successful business models depend increasingly on a combination of attractive content and flexible service delivery and charging options

The charging and billing system plays a key role, since it serves as the link between technological innovation and new business models. It must have the capacity to adjust swiftly to changed market conditions, services, products and network platforms. It must also be able to map flexible pricing models that enable both simple, mass-market-oriented services and specialised, complex value-added services. Meanwhile, product development cycles are growing drastically shorter.

The charging and billing system must also differentiate between content and services. The NGN operator not only provides a transport service, but increasingly participates in the content creation and distribution value chain. Since the operator controls access to the NGN, it owns the charging relationship, which is based on trust. Moreover, it owns the metering point where chargeable service units can be measured and priced. The most formidable challenge when charging and billing transport and content services is to provide quality-differentiated transport services where the quality depends on the transported content. Examples of such requirements are:

  • Differentiated, specific bandwidth categories on demand as guaranteed service qualities or quality-differentiated transport services provided on demand and charged to the end user or the content provider.
  • Virtualisation of resources to enable wholesale charging and billing to own subsidiaries or business units or to external business partners.

New business models and services
New business models, such as the trend to divide market participants into NetCo, ServCo and SalesCo structures for the NGN, have been clearly visible for some time now. Current developments in European regulation (e.g. the EU Commission's wholesale initiative for the broadband market and Ofcom's current hearing on EALA - Ethernet Active Line Access) support this direction. Further business model trends are:

Flat rate offerings. However, these often include only a basic offer without value added services such as service numbers or international calls. Non-included products still have to be billed according to use, and billed in conformity with existing legal requirements such as consumer and customer protection regulations.

Offerings financed by advertising, which include both access rights and a transport service for access to content.

Mash-up products using Web 2.0 that are put together flexibly from existing services. This places particularly high demands on the flexibility of billing systems.

The content provider pays - a business model proposed by large network operators whereby content providers are meant to pay network operators for delivering their content in assured quality.

Alongside these, traditional products such as value-added telephone services still have to be charged, billed and upgraded in the usual way in order to fulfil more complex postpaid and, above all, prepaid requirements as regards charging to the split second, advice of charge, consumption and call history, and billing information.

Increased flexibility leads to particular consequences for new business models, customer and partner relations. This enables new types of cooperation along the new value chain, for example with content providers for quality-assured content delivery. It also enables new bundling options across multiple network and content platforms, such as quality delivery-assured content brokering.

Business agility is a central challenge
For speedy implementation of these new business models, business agility plays a dominant role. The NGN charging and billing system must be convergent and support real time charging for prepaid services, as well as offline charging for postpaid billing.
So far, conventional billing systems have been optimized so that offline charging supports postpaid billing. Online charging for real time balance management of a prepaid account has been implemented primarily on proprietary IN platforms. This fragmentation in dedicated systems for offline and online charging is a root cause of poor business agility. As a prerequisite for the necessary business agility NGN charging and billing systems must support multiple business models simultaneously, including for retail and for wholesale, covering both content and transport service delivery in an integrated, quality-differentiated way.

Yet for charging and billing system producers, the need to take a holistic view initially means greater complexity. Relevant existing bodies and new standards such as the TMF, ITIL, GBA, 3GPP, IMS and ETSI TISPAN are developing very dynamically and must be taken into account. The process-related standards are already taking a holistic, end-to-end approach with great success. Relevant standardisation bodies such as the Global Billing Association (GBA), which has now merged with the TMF, are following this trend. In this case it was possible to design and further develop the billing chain within the TMF's overall NGOSS environment, taking the surrounding processes into account.

Requirements for charging systems
The demand for business agility and for holistic unification results in a number of key requirements for successful solutions. The functions of a NGN charging and billing solution may be broken down into preceding (upstream) charging functions, central charging and billing functions and downstream charging and billing functions. Convergent solutions may be classified as either pre-delivery upstream real time charging or holistic post-delivery (upstream, central and downstream) real time charging and billing solutions.
Preceding (upstream) charging and billing functions include:

  • Real-time billing management as a starting point of the real time billing chain with interfaces to network elements of the service delivery platform, quota management with advice of charge, mediation to provide charging data to rating bodies and a real-time rating component for real-time calculation of unit costs
  • Real-time accounts receivable management that provides information in real time mode on account status/credit
  • Usage data mediation as a starting point of the offline process chain with interfaces to the network platform, raw data capture (CDR), normalisation and enhancement
  • Service instance rating and discounting management for the purpose of combined tariff and discount plan management for real time and offline rating (high impact on time to market).
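The heart of the upstream chain - real-time rating against a prepaid balance - can be reduced to a toy sketch. The class names and tariff values below are hypothetical illustrations, not the interfaces of any real online-charging system (which in practice follow 3GPP online-charging standards), but they show why this step must run in real time: the balance check gates service delivery itself:

```python
# Toy sketch of upstream real-time rating: chargeable units are rated and
# the prepaid balance decremented before the service is granted.
# Class names and tariff values are hypothetical, for illustration only.

class PrepaidAccount:
    def __init__(self, balance):
        self.balance = balance  # remaining credit in monetary units

class RealTimeRater:
    def __init__(self, price_per_unit):
        self.price_per_unit = price_per_unit

    def charge(self, account, units):
        """Rate the units in real time; refuse if credit is insufficient."""
        cost = units * self.price_per_unit
        if cost > account.balance:
            return False  # would trigger advice of charge / service denial
        account.balance -= cost
        return True

acct = PrepaidAccount(balance=10.0)
rater = RealTimeRater(price_per_unit=0.05)
ok = rater.charge(acct, units=60)       # 3.0 deducted, service granted
denied = rater.charge(acct, units=500)  # 25.0 exceeds remaining credit
```

Offline (postpaid) rating applies the same tariff logic, but after delivery and without the gating decision - which is exactly the fragmentation between online and offline paths that convergent systems aim to remove.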

The customer/partner billing component (communication bus, customer/partner database, etc.) is meant to ensure the use of central functions of the billing platform and therefore end-to-end billing management. Such central functions enable the unification of customer and partner billing with upstream and downstream charging and billing for both retail and wholesale business models.

The downstream charging and billing functions include the following components, which have been grouped as ‘billing domain' in 3GPP/IMS standards:

  • Invoice calculation and aggregation: Periodic or on demand invoice calculation on the basis of priced, accumulated use data, one-time or recurring fixed invoice items.
  • Invoice formatting: Transfers the calculated invoice items to suitable billing formats
  • Accounts receivable management: Administration of account credits, providing account information and producing reports (on outstanding debts, for instance)
  • Payment collection and dunning: Collects payments from various incoming payment systems (banks, etc.) and issues (or at least initiates) reminders of outstanding payments
  • Operation monitoring: Assuring billing quality by monitoring operating parameters (KPI) and revenue control reports.
Along with function-oriented criteria, key technical and commercial criteria are critical for realising successful billing solutions. Thus billing solutions should be capable of realising components that are diversified as regards availabilities and physical distribution.
In the case of components for which requirements as regards response time behaviour, system availability and consequences of breakdown tend not to be critical, centralised locations such as in IT computing centres with standard availabilities suffice.
By contrast, breakdowns of real-time components, and medium-term breakdowns of components such as offline rating, payment gateways and postpaid billing mediation, have noticeably adverse effects on service (annoyed customers, loss of income, etc.). Critical components should therefore be realised fully redundantly, and at service delivery platform locations.

Innovation prospects
There is a steady flow of innovations in the field of charging and billing systems, leading to more flexibility, scalability and increased business agility. Examples are convergent billing systems, which support both prepaid and post-paid in one system. Other examples cover the convergence of previously disparate domains, such as fixed and mobile. In the future, more innovation can be expected, making it possible to charge for quality-differentiated service delivery of content access and transport services in a way that users can understand. Flexibly composed value networks may arrive, exceeding today's functionality, for wholesale, retail and partner billing. Charging for quality-differentiated delivery of content access and transport services must also support roaming in foreign networks without having to force-route all traffic via the home network. Convergence of SLA penalty management with billing is another area where innovations may arrive.

In general, one can expect NGN charging and billing systems to operate independently of the direction of monetary flow. This could support business agility for new business models such as advertising-financed and quality-assured content access and transport service delivery.

Thomas Jaekel is a Senior Consultant at Detecon International GmbH, where he is responsible for consulting services focusing mainly on billing and NGOSS.  He can be contacted via: Thomas.Jaekel@detecon.com

Lothar Reith is a Senior Consultant at Detecon International GmbH. He focuses mainly on NGN business models, NGN network architecture and NGN charging architecture, and can be contacted via: Lothar.Reith@detecon.com

Financial transactions are increasingly being conducted on the go. Bohdan Zabawskyj looks at why subscribers, dealers, operators, banks and money transfer agencies are embracing mobile money service opportunities

Mobile money, m-transactions, micro-payments, mobile banking, mobile commerce - no matter how you refer to it, mobile money services are on the rise. They provide an unparalleled level of flexibility and convenience to a growing number of subscribers worldwide, in both emerging and developed markets. In the years to come, financial transactions that are typically made today using ordinary financial instruments such as banks, ATM cards and cheques will dwindle in popularity as subscribers take advantage of the convenience of mobile money.

Financial transactions are increasingly being conducted on the go. Subscribers are learning to transfer funds to a friend's mobile account, withdraw funds from a bank account or receive a remittance from an overseas relative, using a number of mobile devices. Increasingly, individuals with mobile money on their phones can both monitor their finances and purchase anything from taxi services to a candy bar at the corner convenience store.

In developed markets, bank account holders appreciate the immediacy and convenience of using their mobile device as their wallet. In emerging markets, mobile money forms the vital missing commercial link between ‘unbanked' individuals, companies and the societies they live in.

Emerging markets are benefiting most from the adoption of mobile money, especially those in which financial infrastructure is not readily accessible. The ability to transfer funds via a mobile phone in 'under-banked' regions means that people can avoid many hours of travel between remote villages to pay bills or collect wages. Workers in many countries also use their mobile phones to stay in touch with the current market price for their goods, so the phone becomes a tool that facilitates profitable commerce and allows them to capitalize immediately on the latest prices.

Mobile money services will be driven primarily by the operators, who can charge service fees to complement existing SMS and voice revenues while simultaneously increasing customer loyalty and the number of transactions on the network.

Within the emerging markets, there are ample opportunities for operators to make the most out of mobile money services and this suggests that the mobile money phenomenon is here to stay. According to the GSM Association, fewer than 1 billion of the 6.5 billion people worldwide have bank accounts. At the same time, nearly 85 per cent of the next billion mobile subscribers are expected to come from areas such as Africa, Latin America and East Asia.

Short message service (SMS) and unstructured supplementary service data (USSD) are expected to remain the technologies of choice for mobile payments in emerging markets until 2011. This is largely because these technologies are ubiquitous and well proven in even the most basic mobile devices and networks, and because SMS is the intuitive messaging vehicle of choice. For now, keeping handsets and access mechanisms simple and affordable is paramount in driving the uptake of mobile money services, until handsets improve enough to support more complex, Internet-centric alternatives for fund transfers.

Mobile money services currently implemented in emerging markets fall into four major service types: international remittances, airtime reselling, mobile wallet and roaming recharge.

International remittances are transfers of money by foreign workers to their home countries. They are generally international person-to-person fund transfers of a relatively low value, normally sub-US$200. Generally, the greatest flow of remittance traffic is from the developed countries to adjacent developing regions, for example, from the Middle East to Bangladesh and Pakistan or from the US to Central or South America.

Airtime reselling extends the dealer network of the operator to smaller population centres by allowing any subscriber to become an airtime reseller and effectively act as an agent for the operator. An airtime reseller purchases airtime from the operator distribution network at a discounted price via SMS on the mobile device. It is then sold, once again via SMS, to end subscribers at the full price - with the agent keeping the mark-up and thereby earning an income. In addition to creating an entrepreneurial framework, the operator benefits from reduced overhead and distribution costs, as well as the elimination of the theft and fraud write-offs associated with distributing physical airtime vouchers.
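The reseller's economics described above are simple margin arithmetic. The discount, face value and volume below are assumed for illustration - the article does not quote figures - but they show how the model turns a handset into an income stream:

```python
# Illustrative arithmetic for the airtime-reselling model: the reseller
# buys airtime below face value via SMS and keeps the mark-up on resale.
# Discount, face value and daily volume are assumed values.

wholesale_discount = 0.10      # reseller buys airtime 10% below face value
face_value = 5.00              # price the end subscriber pays per top-up
topups_per_day = 20

cost = face_value * (1 - wholesale_discount)   # reseller's purchase price
margin_per_topup = face_value - cost
daily_income = margin_per_topup * topups_per_day

print(f"Margin per top-up: {margin_per_topup:.2f}")
print(f"Daily reseller income: {daily_income:.2f}")
```

Since stock is purely electronic, this margin comes with none of the voucher-printing, distribution or theft costs the operator would otherwise bear.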

A mobile wallet provides the equivalent of a bank account to the "unbanked", and allows cash deposits and withdrawals. The mobile wallet is accessed via the mobile network and enables the subscriber to check the status of the account, make micropayments to a given merchant for goods or services, and even receive his or her weekly wages via the mobile wallet. In the future, mobile wallets will increase in capability as emerging markets develop more formal linkages with financial institutions.

Roaming recharge offers mobile top-ups and transfers of minutes between subscribers of an alliance of operators. Subscriber benefits include the convenience of topping up while roaming as well as the ability to conveniently transfer funds between subscribers of different operators. For operators, roaming recharge services increase roaming revenues from prepaid subscribers and generate incidental revenues from any applied service charges.
Developed markets, such as those in Western Europe and North America, are also a valuable source of revenue through mobile money services. Mobile revenue from international money transfers in North America is expected to grow from $27 million in 2008 to $1.4 billion by 2012, whereas revenues from national transfers will only reach $17.5 million in the same time frame.

Although the mobile remittance industry is growing, the primary focus thus far on mobile money services in mature markets has been associated with an increasing need for real-time access to account information - coined ‘nano-economics'. In these developed, mature markets, mobile banking services offer subscribers real-time access to account balances and the ability to transfer funds, make payments and validate transactions. Security issues and standards are the largest inhibitors of mobile banking adoption, but these challenges are gradually being overcome as mobile security standards and tools are ratified and adopted.

Another form of mobile money is payment using Near Field Communications (NFC). When the phone is placed within about 4cm of a point-of-sale terminal supporting the same technology, the subscriber can make purchases, authorised by a PIN code, from money stored on the SIM card. Many operators are working on enabling NFC technologies, and commercial GSM handsets supporting NFC are expected to hit the market this year. Revenue generation would likely follow the bank card model, with the operator taking a share of the transaction fee due to the key role it plays.

Collectively, the forecast growth in mobile money services - such as the increase in global mobile banking transactions from 2.7 billion in 2007 to 37 billion by 2011 - will contribute close to $8 billion in incremental revenue to mobile operators by 2012.
Subscribers, dealers, operators, banks and money transfer agencies are embracing mobile money service opportunities and creating value in the process. Even if analyst predictions are generous, the global economy is creating ample mobile money opportunities which cannot be ignored, and which will benefit subscribers and their mobile operators alike.

Bohdan Zabawskyj is CTO, Redknee

With each day, the complexity of telecommunication operators' market offerings grows in scope. It is therefore vital to present the individual offers to end customers in an attractive, simple and understandable manner. Together with meeting target profits and other financial measures, this is the principal goal of Marketing Departments for all communication service providers.

Within the OSS/BSS environment, forming clear and understandable Market Offerings is equally important for business as the factors described above. There is a huge difference between maintaining all key information about Market Offerings through various GUIs and different applications, and having it instantly at your fingertips in an organized manner. The latter option saves time and reduces the probability of human error, which makes a significant difference in both the length of time-to-market and the accuracy of the offering, ordering and charging processes experienced by the end customer.

What is a Market Offering?

Market Offerings have the following principal aspects that are usually defined during the offer design process:

  • General idea (defining the scope of the offer)
  • Target market segment
  • Selection of applicable sales channels
  • Definition of services and their packaging
  • Definition of pricing
  • Definition of ordering specifics
  • Definition of the order fulfilment process
  • Marketing Communication (from the first advertising campaign to communication at points of sale or scripts prepared for call centre agents)
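The aspects above can be captured as a simple data model. The sketch below is illustrative only - the field names are assumptions for the example, not a standard TM Forum or vendor schema.

```python
# Illustrative data model for a Market Offering (field names are assumptions).
from dataclasses import dataclass, field


@dataclass
class MarketOffering:
    general_idea: str                                       # scope of the offer
    target_segment: str                                     # e.g. a segmentation rule
    sales_channels: list = field(default_factory=list)      # eligible channels
    services: list = field(default_factory=list)            # productized services/bundles
    price_model: str = ""                                   # reference to pricing definition
    ordering_template: str = ""                             # ordering specifics
    fulfilment_hints: list = field(default_factory=list)    # hints for fulfilment systems
    marcom_campaigns: list = field(default_factory=list)    # out of catalogue scope


offer = MarketOffering(
    general_idea="Flat-rate data bundle for students",
    target_segment="prepaid youth",
    sales_channels=["web", "retail"],
    services=["5GB data", "unlimited SMS"],
    price_model="monthly-flat-9.90",
)
print(offer.target_segment)
```

Holding all of these aspects on one entity is precisely the "at your fingertips" organization the article argues for.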

It is apparent that Market Offerings are not static objects at all; on the contrary, they are very dynamic entities, and most of a communication provider's OSS/BSS departments have some stake in their success.

This leads directly to the key question: "Which environment can support a Market Offering and enable unified and cooperative access to it by appropriate teams during the proper phases of its lifecycle?"

To exist in practice, an environment that addresses all of the above-mentioned aspects must take the form of an information system or application.

Putting Clarity into Practice

The closest match to the requirements described above is an OSS/BSS building block called Product Catalogue.  

Product Catalogue is usually represented by the following three aspects:

  • A unified GUI that enables all key operations for managing a Market Offering during its lifecycle
  • Back-end business logic and a configuration repository
  • Integration with key OSS/BSS systems

Product Catalogue supports, with one exception, all aspects of the total Market Offering:

General idea - This enables the capture of the general idea, the keystone of the arch to be built.

Target market segment - This enables rule-based definition of the target market segment to be addressed by the Market Offering. It should allow changes or further refinements to this segment in conjunction with the information system used for market segmentation purposes.

Selection of applicable sales channels - Together with the Ordering system or systems, Product Catalogue should enable the specification of eligible sales channels for the Market Offering.

Other eligibility rules - Beyond the first two forms of eligibility (segmentation and channel affinity), additional rules should ideally be definable.

Definition of services and their packaging - As the name ‘Product Catalogue' suggests, products, which are productized services in principle, are the central elements of it. Products are packaged or bundled using Product Catalogue together with other parts of the usual Market Offering, such as Allowances (free units, etc.), Friends & Family or Closed User Group settings, VPN settings (for corporate segment), etc.

Definition of pricing - Another key function of Product Catalogue is defining price models related to the Market Offering in general or to its parts. Price models can be quite complex and require well-defined/productised underlying services, if they should be applied with a certain level of simplicity and convenience.
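To illustrate why price models benefit from well-productised underlying services, a simple tiered usage model can be expressed as data plus one generic charging routine. The tier boundaries and rates below are invented for the example.

```python
# Illustrative tiered price model (tier boundaries and rates are invented).

# Each tier: (upper bound in MB, price per MB). None means no upper bound.
TIERS = [(100, 0.10), (500, 0.05), (None, 0.02)]


def usage_charge(megabytes: float) -> float:
    """Charge usage across tiers: cheaper rates apply to higher volumes."""
    charge, lower = 0.0, 0
    for upper, rate in TIERS:
        if upper is None or megabytes <= upper:
            charge += (megabytes - lower) * rate
            break
        charge += (upper - lower) * rate
        lower = upper
    return round(charge, 2)


# 250 MB: 100 MB at 0.10 plus 150 MB at 0.05 = 17.50
print(usage_charge(250))
```

Because the model is pure configuration over a well-defined service, changing a Market Offering's pricing means editing the tier table, not the charging code - the simplicity and convenience the text refers to.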

Definition of ordering specifics - In the individual screens of an Ordering application's GUI, Offer/Order Templates are usually defined using an ‘Ordering Catalogue', which may or may not be part of the overall Product Catalogue. There are pros and cons to having an integrated or separate Ordering Catalogue, but these are outside the scope of this article; by design, the basic offer structure and its parameters, with applicable/default values, should come from Product Catalogue.

Definition of order fulfilment process - Similar to Ordering, Product Catalogue is not usually the place where detailed Order Fulfilment and the subsequent Provisioning processes are defined. There is a variety of specialized systems for this on the market, each with its own unique configuration, so it is impossible for a single Product Catalogue application to cover all options. On the other hand, Product Catalogue should enable the storage of key ‘hints' that give these systems a general method of determining what shall be done when the Market Offering is ordered. These hints should take the form of ICT environment configuration.

Marketing Communication - This is the only function clearly out of the scope of Product Catalogue. There are already enough specialized Campaign Management tools and applications on the market, designed from the bottom up specifically to support MARCOM operations.

An Aspect of Integration

Functions supported by an ideal Product Catalogue also define OSS/BSS systems that should be integrated with it, namely: Market Segmentation System (could be some BI or Analytical CRM), Ordering, Order Fulfilment, Provisioning, Charging & Billing and CRM. All these systems should either provide some data to Product Catalogue or use it as the master source of the data related to Market Offerings.

The necessity of integration in general is unquestionable; the only remaining issue is determining how the integration will be done and what will be the overall cost. Deciding which type of integration will take place depends on a number of factors, discussed below.    

The Principle Dilemma

There are three principal options for positioning Product Catalogue within the OSS/BSS environment. Product Catalogue can be deployed:

  • As a standalone application
  • As part of a CRM system
  • As part of a Charging & Billing system

Product Catalogue as a Standalone Application

This option appears tempting at first: who could have a better Product Catalogue than a company exclusively specializing in its development? However, many unseen factors tend to surface later on, regardless of the polished chain of GUI screens that are often presented.

Does the telecommunications operator really have intelligent Charging & Billing processes in place, or smart customizations built on top of them? If a standalone Product Catalogue is deployed, the operator can forget about utilizing these special differentiating features unless it is willing to embark on never-ending investment in customizations with no clear total cost of ownership (TCO) or return on investment (ROI). It would also be unusual for a Charging & Billing vendor to be willing to provide detailed information about its price-model definitions and other mechanisms to a third-party vendor, as these are often the key selling points of its Charging & Billing product.

Another disadvantage of this approach is that there is no fixed point of integration for a standalone Product Catalogue. No vendor of the surrounding OSS/BSS systems will guarantee compatibility with it. Once again, a never-ending integration project is a risky disadvantage of this choice.

In the end, a standalone Product Catalogue may be a state-of-the-art application, but it will not provide the telecommunications operator with anything useful without extensive integration. Even assuming that this integration succeeds and results in a few months of perfect operation, shortly afterward a new set of features - vital for the telecommunication service provider's survival on the market - will certainly require implementation. This will probably affect the Charging & Billing side (the most common case) and/or the CRM & Ordering side. It may also turn out that the most attractive features of a standalone Product Catalogue cannot be used at all, for lack of support in the surrounding OSS/BSS systems.

Product Catalogue as part of a CRM system

This is without a doubt a better option than the first choice because at least one side of the integration is guaranteed - if Ordering is part of the overall CRM system, then two sides are in the safe zone.

The only disadvantage of such an approach is that the pricing logic of a CRM system's Product Catalogue is quite limited, if present at all. Consequently, there is no real gain in implementing a unified Product Catalogue as long as the definition of the price model and some additional key settings remain on the Charging & Billing side. Such a setup is quite far from the ‘Unified Environment' described at the beginning of this article.

Product Catalogue as part of a Charging & Billing system

Service/Product bundling is usually tightly coupled with price model definition logic, and the level of flexibility here is in many cases, if not all, one of the cornerstones of the telecommunication operator's differentiating market offer.

Complex price modelling is the "holy grail" of profitability in every price-sensitive market. Even when there is an inexpensive almost flat rate applied to basic communication services (e.g. as in Austria), there is also the richness of value-added services (some of which can be priced using quite challenging logic), which raises the profit of telecommunications operators.

Another point of view is related to the effort necessary to implement complex Market Offerings.  Implementation on the side of Charging & Billing is quite often the most challenging when compared to Ordering or CRM, for example. Order Fulfilment can also be quite a challenge, especially when considering the example of introducing complex, fixed-mobile convergent packages for the corporate segment; however, Product Catalogue itself has no major effect on its simplification. We can say that out-of-the box compatibility between Product Catalogue and Charging & Billing significantly decreases the OPEX of a service provider as well as markedly shortens time-to-market for the introduction of new Market Offerings and the modification of existing ones.

It should be said that most top Charging & Billing systems provide Product Catalogue either as part of their latest releases or as an optional extension. Regardless of labels like ‘Unified' or ‘Enterprise', the functional areas covered are quite similar, differing only in the degree of support for the individual aspects of the Market Offering described above. This level of support naturally increases with each new release of the component, so changing the Billing system for the sake of a better Product Catalogue component is an investment with quite uncertain returns. After all, overall functional richness and, most importantly, flexibility in the areas of pricing and convergence are the key features of Charging & Billing systems today.

Product Catalogue as a financial asset in the general OSS/BSS environment

Each of the three approaches described above would very likely lead to different CAPEX and OPEX outcomes. Whichever option is selected, the implementation of Product Catalogue should be justified by a clear gain on the service provider's side.

Business Benefits Coming from the Introduction of Product Catalogue

There is a variety of direct and indirect benefits linked to the implementation of Product Catalogue within the OSS/BSS environment. All of them relate to three qualities that accompany any successful introduction of Product Catalogue - clarity, accessibility and systematization.


Clarity

Managing Market Offering lifecycles is supported by Product Catalogue's design. This gives all involved parties within the telecommunication operator a better understanding of the related subjects, their level of involvement and their role within the process. This reduces the confusion that is experienced again and again, regardless of how well the processes are described on paper.


Accessibility

All Market Offerings are accessible and visible within a single environment, including the history of their changes and the Market Offering's sub-elements. Anyone, according to their access rights, can view the sections of Product Catalogue applicable to their role.

There is no risk of discrepancies between Market Offering related data in various systems provided that the Product Catalogue repository is the master data source as stated above. Accessibility to correct data is an important aspect of information accessibility in general.


Systematization

Product Catalogue not only enforces a certain level of systematization in Market Offering creation and maintenance processes, but also stores and presents all related business entities in a systematic manner, by default taking into account their integrity as enforced by business logic.

Measurable benefits

All three qualities - clarity, accessibility and systematization - can be translated into two key terms - time and money. A successful implementation of Product Catalogue brings significant savings on the telecommunication operator's side as well as guarantees a considerable shortening of time-to-market for introducing new Market Offerings. If these two goals are not accomplished by implementing Product Catalogue, such a project must be considered a failure.

SITRONICS Telecom Solutions is a leading vendor of convergent charging and billing solutions in Central & Eastern Europe, Russia and the CIS with a growing footprint in Africa and Asia.  www.sitronicsts.com

See Directory for further company information: http://www.eurocomms.com/directory

ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde

Mobile operators everywhere are facing something of a conundrum which goes like this: in saturated markets they must increase revenues from high margin data services but these are typically bandwidth hungry applications resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings at work and at home. As people exploit the benefits of the big new touch screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.

However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks. This poses big questions for operators: how can they bring broadband 3G networks indoors to provide reliable, cost-effective in-building coverage? How can they do it fast, without significant and expensive investment in macro networks, while giving customers access to the applications they want at prices they are willing to pay?
Fortunately ip.access has the answers since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.

Picocells and femtocells plug directly into existing fixed broadband networks be they DSL, cable, satellite or even WiMax. Acting as mini-base stations, both can be quickly installed anywhere in buildings or outside to bring networks to where the demand is.

These plug and play units have advantages for everyone. For users, both professional and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high margin value added services.

For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes, ships and other remote locations where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users and companies can dot them around the organisation to provide connections where needed.

Solving the problem for residential users, the Oyster3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, family members can get seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells will be up and running by spring 2009.

"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."
The difference between the two is one of size, power, capacity, functionality, price and target audience. However both allow operators to add capacity cost effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated and people previously unable to get mobile signals can be connected to high-speed networks.

"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.

"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."

Plugging mini-base stations into the existing broadband connections takes indoor data sessions off the macro network, raising the number of users each site can support and increasing its capacity beyond the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidized femtocells to 20% of its subscribers' homes. Similarly, research firm Analysys Mason calculates the annual cost saving per customer for a large operator deploying 3G femtocells is between $6 and $12.

Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case even if they are fully subsidised. Even ignoring the cost savings, it takes only a €11 per month increase in ARPU per household to cover the cost of fully subsidising a femtocell.
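The subsidy arithmetic can be made concrete with a back-of-the-envelope payback calculation. The €11/month ARPU uplift comes from the article; the unit cost is an illustrative assumption, not a figure from ip.access.

```python
# Back-of-the-envelope femtocell subsidy payback (unit cost is an assumption).

ARPU_UPLIFT_PER_MONTH = 11.0   # €11/month per household, per the article
ASSUMED_UNIT_COST = 120.0      # illustrative fully subsidised femtocell cost in euros

months_to_payback = ASSUMED_UNIT_COST / ARPU_UPLIFT_PER_MONTH
print(f"payback in about {months_to_payback:.0f} months")
```

At anything near the sub-$100 unit prices the industry anticipates, the subsidy would pay back well within a typical contract term, before counting the macro-network savings.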

Operators are seeing an explosion in mobile data usage (in the UK, 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.

According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011 and IDC expects consumer spend on femtocell enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage is at home and the ‘digital generation' either does, or wants to watch mobile television at home.

As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."

Operators are urgently addressing the capacity problems such increases in data usage imply. Some are capping monthly unlimited data plans, while others encourage content developers to limit application bandwidth. Neither approach is likely to be popular with users, and both may increase churn - which only strengthens the case for deploying picocells and 3G femtocells.

While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise and operators can introduce new, high margin services to the handsets people already have.

They can encourage both usage and fixed mobile substitution by offering FemtoZone services which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be automatically sent to absent parents to notify them children are back; podcasts, videos or images can be loaded to phones or targeted advertising sent to interested users.
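FemtoZone services of this kind are essentially presence-triggered events: a handset attaches to the home femtocell and registered actions fire. A minimal sketch of the pattern is below; the class and service names are invented for illustration.

```python
# Minimal presence-trigger sketch for FemtoZone-style services (names invented).
from typing import Callable, List


class Femtocell:
    def __init__(self) -> None:
        self._on_attach: List[Callable[[str], None]] = []

    def register(self, action: Callable[[str], None]) -> None:
        """Register a service to run when a handset attaches to this cell."""
        self._on_attach.append(action)

    def handset_attached(self, msisdn: str) -> None:
        """Called when a phone comes within range of the femtocell."""
        for action in self._on_attach:
            action(msisdn)


events = []
home_cell = Femtocell()
home_cell.register(lambda m: events.append(f"SMS to parents: {m} is home"))
home_cell.register(lambda m: events.append(f"push podcast queue to {m}"))

home_cell.handset_attached("+447700900123")
print(events)
```

In a real deployment the triggers would run in the operator's service layer rather than on the cell itself, but the location-bound event model is the same.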

"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.

"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."

The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.

There are plans to include the new Iuh standard in Release 8 of the 3GPP standard, due out in December. It will replace the numerous proprietary ways in which femtocells currently connect to networks and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors, and can choose consumer premise equipment (CPE) separately from the gateway.
Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.

Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems. 

Over the five years they have been used in enterprises, the advantages of the nanoGSM are well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. With an indoor range of up to 200 metres, traffic is backhauled through existing IP networks and it supports fast data rates over GPRS and EDGE to devices such as Blackberries. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.

Like its bigger cousin but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the 3G femtocell ease of use, price and installation challenges necessary to meet consumer needs, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells to both large enterprises and to SMEs through their existing channels.

"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."
It's a truism that everything is going mobile and operators are already feeling the capacity pinch. Pico and femtocells give them a cost effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.

Today's smart phones are as powerful and feature rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.

Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles; share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."

Do femtocells and picocells hold the key to the lucrative SME market?  Mark Keenan takes a look

Network operators have long tried to address that potentially very lucrative but hard-to-reach customer segment: the SME (small and medium enterprise).  Over the years, operators have experienced differing degrees of success, but this is a market sector that has long been viewed as one of the biggest challenges the industry faces. However, an area of mobile communications that many analysts predict will soon be popular among consumers is increasingly being viewed as a key for unlocking the SME revenue stream for all kinds of operators.

The technology in question centres around indoor base stations, also referred to as femtocells and picocells, predicted by ABI Research to account for 102 million users worldwide by 2011.  In essence, these are small indoor access points - think of a slim paperback novel - that are designed to provide dedicated mobile network coverage within a limited area, such as a house or office.  Unlike larger macro cells, these units are relatively low-power devices and manufacturers are designing them to be as ‘plug and play' as broadband modems have become.  Another key difference between traditional base stations and these scaled-down versions is that they link back to the service provider via a broadband line (usually xDSL) to provide network backhaul, rather than using a leased line or microwave link.

Picocells and femtocells have much in common and employ the same base technology, but they differ in that picocells are higher capacity and provide extra features for the business market, such as support for larger numbers of simultaneous users, the ability to chain picocells together to create a network, and integration with existing IT environments.  Femtocells, on the other hand, are lower capacity and have less inbuilt ‘intelligence', but are cheaper and designed for the mass consumer market.

Indoor base stations address a very real market issue, namely the problem of achieving high-quality indoor coverage.  Many mobile networks - particularly in busy city and town centres - are already overloaded, with too many subscribers placing demands on the network at any one time.  Furthermore, the nature of radio-based systems means that there will inevitably be weak spots in network coverage caused by a variety of obstructions, ranging from trees and hills through to buildings and walls.  Even thick modern double glazing can create a problem.

A recent research-based report from analyst firm Quocirca revealed that approximately one third of SMEs had experienced problems with indoor coverage at work, with the figure rising to 45 per cent when those same users were at home (as is often the case with SME executives).  Yet despite the fact that buildings are not ideal for mobile communications, more than half of all mobile calls are believed to be made within buildings, and our reliance on mobile devices as a business tool continues to increase.  Think of the number of people who live on their PDAs, whether at work, in a meeting or working from home.

Does it matter?  Well, as the fight to attract and retain subscribers becomes harder and harder, the emphasis on service quality increases.  Indeed, a US study carried out by Telephia indicated that over a fifth of customer churn was the result of poor network coverage.  Research firm InStat has stated that the biggest challenge facing mobile subscribers is the lack of indoor coverage for 3G signals, and warns operators that their success with 3G services will be limited unless they address this issue.

This is the operators' dilemma.  While they are banking on a return on investment from their 3G networks, 3G signals find it even harder to penetrate buildings than 2G.  At the same time, the kind of services that 3G lends itself to so well - mobile data and TV - place greater demands on the network than ever before.  Yet building whole new landscapes of macro cells is not an option, on grounds of both cost and environmental restrictions. This is why so many players in the industry - not just analysts, but vendors and operators - believe that indoor base stations are the solution for overcoming the network traffic logjam.  Furthermore, they could help to enable new operators to enter the mobile market.

It would be wrong to think of indoor base stations in terms of 3G alone.  For some time now, a couple of vendors (including RadioFrame) have been deploying 2G units to network operators in Europe.  In RadioFrame's case, this includes providing business customers of Orange with picocells that enable the operator to improve service quality where needed.

Quocirca's research underlined the fact that while the growth of mobile data is happening, voice services are still business users' primary focus and the area where they have concerns about service quality and cost.  And let's not forget that most of these business users are still on 2G. They are also receptive to fixed mobile convergence, if it is presented attractively and cost-effectively.

While femtocells may not hit the mass market for a couple of years yet, indoor base stations could well prove the solution to maintaining customer satisfaction among the SME community.  Looking ahead, these ‘mini cells' can also be used to achieve fixed mobile substitution, enabling users to cut expensive mobile call costs by routing calls over broadband IP connections.

Ultimately, picocells could be used to enhance PBX services.  Mobiles could be integrated with the PBX to support call transfer, hunt groups and virtual fixed lines, for instance.  Potentially, picocells could replace the fixed PBX completely with a wireless PBX solution, or even supplant traditional WANs and LANs, although as these are so well-embedded in IT culture, that is certainly not going to happen overnight.  It's interesting to note, however, that the technology is pretty much there to achieve this.

Where are we now?  Apart from deployment of picocells to business users, femtocells - both on 2G and 3G - are being developed and trialled around the world, with a number of product and services from a variety of vendors and operators expected to be launched at the end of 2008 or early 2009.  Some markets are more developed than others and while Europe is expected to be one of the fastest growing pico/femto markets, Sprint in the US announced its own femtocell solution in August 2008. 

No new market sector is without some potential pitfalls.  Mass-market roll-out has been cited as a barrier, and this certainly is something that needs addressing very soon.   Most mobile and fixed operators are used to supporting the deployment of voice-centric mobile phones but, as some of them have found, as soon as you move into mobile data support, far more technical support tends to be needed.  Furthermore, there is a big difference between distributing mobile phones and PDAs - whether via shops or courier delivery - and rolling out thousands - ultimately millions - of indoor base stations.

This is why it is crucial that pico and femtocells are ‘plug and play' and involve ‘zero touch' deployment. In other words: devices that can be installed by the customer; remotely activated by the operator; and (in the case of RadioFrame's own product line) even remotely updated, all without a truck roll.

In 2007, the Femto Forum was created by a number of vendors and operators to jointly agree a way forward regarding industry standards.  A new technology that does not experience some dissent between players is rare, but the development of universally-accepted standards is happening at a relatively steady pace.  Certainly, standards should not be viewed as a total barrier to indoor base station deployment - after all, picocells are already in commercial operation - though of course, interoperability is very desirable for the market's future.

Another danger is the tendency to ‘over-hype' picos and femtos.  Realistically, do we really think that every business and consumer will have one within the next 12 months?  I would say not.   So let's not raise expectations to a ridiculous point, or develop business plans that are based on hope rather than common sense.

That said, the benefits of picos and femtos - to operators and users alike - are very clear.  While predictions on timescales, volumes and market expectations may vary, the general consensus would seem to be that indoor base stations are central to the future of the mobile industry, from 2G to 3G and beyond, not just for consumers but to support business users too.

Mark Keenan is General Manager for Europe, Middle East and Africa, RadioFrame Networks Inc. 

By Carsten Storbeck, director of product management with ADC KRONE

Fibre-to-the-home (FTTH) is certain to happen. In some countries it is well advanced, with customers enjoying data speeds of 100Mbit/s into their homes. In other territories, carriers are trying to squeeze the last few years out of their ageing copper networks, but the best they can achieve is around 50Mbit/s - and this simply will not satisfy consumer demand in the coming years. The process of replacing copper cables with fibre is undoubtedly expensive but it must nevertheless happen sooner or later. Otherwise the telecoms companies will lose their broadband business to the cable TV operators.

From a technical point of view, laying fibre as far as every home is not difficult. However, installing fibre cables inside customer buildings can be a far less simple operation. This is particularly the case in continental Europe, where more than 70 per cent of people live in flats, apartments, terraces or town-houses, termed collectively multi-dwelling units or MDUs.

Installing ‘traditional' copper cable in these buildings for telephony was easy. This new task is not. Network providers need to deliver broadband at 50 or 100Mbit/s (perhaps 1Gbit/s) to each dwelling. They could use Category 5e/6 copper cable, but this has a distance limit of 90 metres from the external fibre termination and also requires electric power, probably backed by an uninterruptible power supply.

A far better alternative is to extend the fibre direct to each and every dwelling within the MDU, but until now this has been a difficult and costly process. Every fibre route must be measured with extreme accuracy and individual fibre cables manufactured to the correct lengths. This is an expensive and time-consuming process and three or four weeks may pass before the cables are delivered to site for installation.

Alternatively, the site technician may be able to install fibre in the cable risers from the basement to the fibre distribution points on each floor and then provide smaller fibre cables to each dwelling. Very great care is necessary, because standard fibre cables cannot tolerate the rough treatment, crude fixing methods and sharp-radius bends that are normal with copper cables. Fibre cables are simply not compatible with technicians' current working methods.

A skilled (and therefore expensive) fibre-splicing technician must either perform the whole job or else visit the site after the cable-laying is complete in order to splice all the fibre cables. This can frequently involve a hundred or more splice joints, making it a lengthy and expensive process, the more so because every splice must be tested afterwards.

In short, cabling a multi-dwelling unit for fibre has been an expensive process until now. This was before the recent launch by ADC KRONE of a fully-modular, ‘plug-and-play' fibre installation system.

The new breed of fibre system for MDU applications includes fibre cable developed using military experience.  It can be stapled to all kinds of architectural fittings without damage and bent around every type of right angle found in buildings (on average, every horizontal run needs to negotiate around 15 right-angle bends). It can even be crushed repeatedly without either damage or degradation of signal.

Accompanying this rugged cable are highly ingenious fibre distribution points that include a concealed cable-reel pre-loaded with 30, 60 or 90 metres of pre-terminated fibre cable that connects back to the previous distribution point.

In fact there are only four components in this system and, even with the different fibre-length variants, only nine component variants, all of which can be stocked and held in the technician's vehicle. With these nine variants in his van, the technician can arrive and start work immediately. There's no need for a site survey, nor a four-week wait while custom fibres are manufactured.

Because the fibre cables are all pre-tested at the factory, the only testing required on-site is to check the signal levels in each customer dwelling. With this novel approach, the whole process is just as simple as installing old-fashioned copper cable. The components are simple, durable and long-life, and far less expensive than existing MDU fibre distribution. In this way, installation costs have been reduced by 60 per cent by major telecoms carriers in the USA, where the equipment has been proven in both central office and field environments.

Taking place 29 September through 2 October, the International Engineering Consortium's (IEC) Broadband World Forum Europe 2008 will once again gather the world's top ICT thought leaders at the Brussels Expo in Brussels, Belgium.
Co-located with host sponsor Belgacom's 35th Annual ICT Symposium, the Broadband World Forum Europe 2008 will represent the entire ICT value chain and provide winning solutions to those seeking to maximise the promise of broadband.

"In collaboration with the co-located ICT Symposium, this year's event will combine telecommunications solutions with innovations for enterprise," comments IEC President John Janowiak. "The exhibition and workshops will cover technology, business, strategic and operational issues on topics such as mobility, collaboration, security, and risk management, and address the most promising alternatives for moving forward in the ICT industry."

Themed "Delivering the Promise," the Broadband World Forum Europe 2008 will present key industry leaders at the event including World Forum Chair Scott Alcott, executive vice president of the service delivery engine at Belgacom; and Keynoters Didier Bellens, chief executive officer of Belgacom; Pat Russo, chief executive officer of Alcatel-Lucent; John McMahon, president and managing director of the European department at Sony Pictures Television International; Carl-Henric Svanberg, president and chief executive officer of Ericsson; and Julio Linares Lopez, chief operating officer of Telefonica de España.

The event provides "a great opportunity to see what is happening, what is on the future time path, and great interaction with all of the professionals around the world," declared World Forum Chair Scott Alcott.

Industry professionals will have the opportunity to learn from more than 250 global leaders at the forefront of broadband service, applications, and technologies and an opportunity to learn first-hand from some of the world's top product experts displaying the latest cutting-edge advancements from more than 90 key players exhibiting on the floor. The IEC will present its renowned world-class education in more than 50 keynote addresses, plenary panels, workshops, and sessions.

The World Forum has a history of drawing the world's ICT decision makers: 87 per cent of last year's attendees were manager-level and above.  In addition, more than 100 service provider companies travelled from around the globe to conduct business at the conference and exhibition.

The Broadband World Forum's InfoVision Awards will also take place, honouring the most innovative and beneficial technologies, applications, products, advances and services in the industry.

Key event sponsors of the Broadband World Forum Europe 2008 include Official Host Sponsor Belgacom, as well as Nokia Siemens Networks, Alcatel Lucent, Ericsson, Huawei, NEC, Thomson, ZTE, Italtel, ADVA, Allied Telesis, Astra, AVM, Actelis, ADC, Soapstone Networks, and Cisco.

The IEC welcomes all ICT industry professionals to attend, especially those involved with Carriers/Service Providers, Enterprise, Software Providers, Integrators/Aggregators, Content, Over-the-Top-Carriers, Residential, Equipment Manufacturers, and Semiconductor Manufacturers.

Pay by mobile
A new analysis of the global mobile payments opportunity forecasts that 2.1bn mobile subscribers will "pay by mobile" for digital goods downloaded to their mobile phones by 2013. Juniper Research defines digital goods as music (ringtones and full tracks), tickets, TV, user-generated content, infotainment and games - in fact any content bought by phone and delivered to the phone.

A region-by-region analysis by Juniper Research found that there is a significant growth opportunity not only for mobile payment systems, software, support and consultancy services vendors, but also for mobile operators to increase their arpu as transaction frequencies accelerate.

Report author Howard Wilcox notes: "Many digital content goods and services are becoming basic ‘must haves' - particularly in the sub 35 age group.  Devices like the iPhone - even in its 3G incarnation - are undoubtedly contributing to consumer awareness and usage of mobile music services. People who are 15 to 20 today will expect to buy directly with their phones and will drive this market over the next few years."
Highlights from the report include:

  • Users are forecast to make at least two payment transactions per month for digital goods by 2013
  • Nearly half of all mobile phone users will have bought digital goods at least once with their phones by 2013
  • The two leading regions (Western Europe and Far East & China) will account for over 50 per cent of the total digital goods gross transaction market value by 2013.

Howard Wilcox continues: "Even though typical transaction sizes will remain in the $3-$5 bracket, enough users will be buying music, games, tickets, infotainment and the other digital goods often enough to see gross transaction value grow nearly seven-fold by 2013."
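The scale implied by these forecasts is easy to sketch. The calculation below simply combines the report's headline figures (2.1 billion payers, at least two transactions a month, the low end of the $3-$5 bracket); it is a back-of-envelope illustration, not Juniper's own model:

```python
# Back-of-envelope gross transaction value from the report's headline figures.
# This is an illustrative calculation, not Juniper Research's forecasting model.

payers = 2.1e9        # subscribers forecast to "pay by mobile" by 2013
tx_per_month = 2      # at least two digital-goods purchases per month
avg_tx_usd = 3.0      # low end of the quoted $3-$5 transaction bracket

annual_gross_usd = payers * tx_per_month * 12 * avg_tx_usd
print(f"Implied annual gross value: ${annual_gross_usd / 1e9:.0f}bn")
# -> Implied annual gross value: $151bn
```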

The report also focuses on purchases of physical goods - ranging from gifts to household goods to electronics - via the mobile web. It provides six-year regional forecasts of mobile payments for digital and physical goods, providing data on subscriber take-up, transaction sizes and volumes as well as detailed case studies from companies pioneering in this market. Juniper Research interviewed 37 senior executives across a wide range of vendors and operators.

Whitepapers and further details of the study, 'Mobile Payment Markets: Digital & Physical Goods 2008 - 2013' can be downloaded from www.juniperresearch.com

Wireless trends
The Wireless Technology Trends Report fills the need for a truly comprehensive wireless analysis designed to serve companies in need of understanding the position of individual technologies in the context of the overall market. This report is written for companies involved in, or potentially entering, the wireless market that are interested in a general overview accompanied by detailed forecasts.

"Over the last 12 months, many of the emerging wireless technologies have begun to exploit market sectors ranging from home automation to industrial and consumer electronics," says Dr Kirsten West, Principal Analyst of WTRS.  "The adoption of wireless as a pervasive technology is not a matter of ‘if', but ‘when'. The consumer desire for increasingly unfettered wireless connectivity is clear."

The report's key findings include:  

  • Ultra wideband has shifted from a nascent technology to a solidly emerging protocol underlying the certified wireless USB and Bluetooth "high speed" implementations.
  • Certified wireless USB will enable wireless transfer of photographs and other media from consumer electronics devices to personal computing and output devices.
  • Bluetooth has undergone an expansion campaign over the last 18 months to incorporate newly-emerging protocols such as near-field communications (NFC), Wibree, and Ultra Wideband.
  • ZigBee is poised to make great strides in the next year and many new products are expected to be released to market. It is very possible that the momentum of ZigBee will finally surge in 2008.
  • Wireless delivery of high definition video content is an area that has very much emerged over the last 12 months. The market for wireless delivery of high definition video promises to be large, with decisive consumer adoption once the technology has been proven in end products.
  • WiMAX has become a significant competitor to alternative technologies in emerging markets.

Research push
On 10 September, the European Commission officially launched 14 research projects, which are part of the FIRE initiative for Future Internet Research and Experimentation.

The launch event in the Paris City Hall was opened by Jean-Louis Missika, Deputy Mayor of Paris in charge of Innovation and Education, and Gilles Bloch, Director General of Research and Innovation at the French Ministry of Education, Higher Education and Research. More than 165 researchers and innovation managers, including experts from national and European research projects, as well as representatives of European and national research funding institutions, attended.

The event, which took place in the context of the French EU Presidency, was organised by the Directorate General for Information Society and Media of the European Commission in cooperation with the EC-funded projects FIREworks and OneLab2.

The 14 FIRE projects contributing to the event are funded by the EC under the 7th Framework Programme for Research (FP7) and cover a wide range of research topics in the area of Future Internet Research and Experimentation, including advanced networking approaches to architectures and protocols, coupled with their validation in large-scale testing environments, as well as interconnection of test beds that enable experimentation on a large scale. The total planned budget of these projects is more than 58.5 million euro, of which the EC funds about 40 million euro.

Premium content
The mobile communication market in Europe has reached a saturated and mature phase. Mobile penetration exceeds 100 per cent in many western and eastern countries, and many other countries are rapidly reaching full penetration. It is clear that the mobile industry in Europe needs to invest in and commit to other services and applications in order to grow effectively. Mobile premium content services and applications represent a potential source of significant revenues for the mobile industry.

New analysis from Frost & Sullivan, European Mobile Premium Content Markets, finds that the market (including revenues from mobile music, mobile games, mobile video/TV and mobile graphics) was worth €2.68 billion in 2007 and is estimated to reach €11.0 billion in 2012.
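The growth rate these figures imply can be checked directly. The calculation below is a simple illustration using the quoted market sizes, not part of Frost & Sullivan's own analysis:

```python
# Implied compound annual growth rate (CAGR) from the quoted market sizes.
# An illustrative check, not part of Frost & Sullivan's analysis.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# EUR 2.68bn in 2007 growing to an estimated EUR 11.0bn in 2012
growth = cagr(2.68, 11.0, 2012 - 2007)
print(f"Implied CAGR: {growth:.1%}")  # roughly 33 per cent a year
```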

"Content is the new horizon for the European mobile industry," notes Frost & Sullivan Research Analyst Saverio Romeo. "During the last three years, mobile operators have been observing a slow, but continuous decline in the average revenue per user (arpu) due to the decrease of voice and SMS arpu. New sources of revenues are needed: content is an excellent candidate."

Content types such as music, video/TV and games are leading the content growth. However, new services and applications such as mobile social networking, mobile searching and location-based services are gaining momentum. All these services, which can be defined as content tools, allow users to personalise, search and share content with other users. Business models are also shifting towards ad-based models.

"In order to exploit the variety of revenue-generating business opportunities, the industry has to face some critical challenges," cautions Romeo. "Consumers will use content on mobile devices if the industry is able to offer high-quality content with an excellent user experience at affordable prices."

The unification of all communication devices inside a single platform offers huge advantages for businesses looking to streamline their operations, yet requires careful planning and management if it is to deliver the benefits it promises. Driven in large part by the rise in mobile working and the growing need for more flexibility across different communications devices, demand for unified communications has grown in recent years, with legal firms and government at the forefront of adopting the platforms, explains Martin Anwyll

The growth of the mobile workforce is a key factor in the growing demand for unified communications. In fact, a recent Forrester report revealed that 64 per cent of the 2,187 US and European companies surveyed listed mobility support for employees as a ‘priority' in 2008, and nearly one in five as a ‘critical priority'. Analyst firm Gartner has also estimated that 46.6 million people will be spending at least one day working at home by 2011, as businesses shift towards a decentralised workforce to reduce costly office space.

The impetus to ‘go green' has moved up nearly every business's agenda in these energy-conscious times, and with ever-increasing fuel costs, the use of online and virtual meetings can significantly reduce business travel costs and lower an organisation's carbon footprint.

However, the unified communications network is an inherently more sophisticated and complex environment, making high quality service difficult to deliver. What are the key issues that managers need to consider in order to quickly identify and troubleshoot any issues and avoid costly downtime?

Unified communications comprises many interdependent and heterogeneous parts, making it more difficult to ensure an acceptable and consistent quality of service. Poor quality of service can put the goals and business benefits of unified communications in jeopardy; for example, poor-quality communications are likely to deter a mobile workforce from using real-time communications, reducing any potential benefits. Managers must be able to quickly identify degradation in quality of service or, if any service fails, take the appropriate action. Without monitoring, time to resolution can be significantly extended, resulting in a poor user experience, reduced productivity and, ultimately, loss of business.

Unified communications can offer an organisation new flexibility and manageability, delivering unprecedented levels of connection between a distributed workforce. Not only does unified communications help to unravel bottlenecks, it supports closer collaboration across the business and provides a competitive edge, enabling employees to contact each other more quickly and eliminating delays caused by the inability to reach key decision-makers.

There are usually several factors that can prevent a manager from delivering a high quality of service, ranging from IP networks not being ready for real-time communications to organisations employing VoIP technology from more than one vendor. Real-time communication solutions such as instant messaging and VoIP rely upon consistent, stable and low-latency network connections. However, most existing networks are built upon technologies that were not originally designed to provide these properties, and the new communication technologies are much less tolerant of their absence and so more prone to service issues.
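The sensitivity of real-time voice to network conditions can be made concrete with a rough quality estimate. The sketch below uses a common simplification of the ITU-T G.107 E-model with default constants (a G.711 codec, random packet loss, no echo); it illustrates the principle and is not a production monitoring tool:

```python
# Rough voice-quality estimate using a simplified form of the ITU-T G.107
# E-model. Constants are common defaults (G.711, random loss); production
# monitoring tools also model codecs, jitter buffers and echo.

def estimate_mos(delay_ms: float, loss_pct: float) -> float:
    """Estimate a Mean Opinion Score from one-way delay and packet loss."""
    # Delay impairment: mild below ~177 ms one-way, much steeper above it
    i_delay = 0.024 * delay_ms
    if delay_ms > 177.3:
        i_delay += 0.11 * (delay_ms - 177.3)
    # Packet-loss impairment for G.711 (loss robustness factor Bpl ~ 4.3)
    i_loss = 95 * loss_pct / (loss_pct + 4.3)
    r = max(0.0, min(100.0, 93.2 - i_delay - i_loss))  # transmission rating
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

print(round(estimate_mos(20, 0.0), 2))   # healthy LAN: around 4.4
print(round(estimate_mos(300, 2.0), 2))  # congested WAN path: around 2.2
```

A monitor built on this idea needs only per-call delay and loss measurements to flag degrading quality before users complain - exactly the proactive identification argued for above.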

Unified communications is usually multi-vendor by nature: mergers, acquisitions and fragmented purchasing processes mean that an organisation will often deploy VoIP technology from more than one of the major VoIP vendors, such as Cisco, Nortel and Avaya. There are very few management vendors that support multiple VoIP technologies, and of those that do, it is rare that they are able to manage other unified communications solutions such as Microsoft Exchange, BlackBerry or other business applications.

Each different technology or application that supports unified communications tends to bring its own native management tool. These native management tools generally support a specific piece of the organisation's infrastructure and not the service as a whole. This usually results in holes in unified communications management, or in a patchwork of disparate tools to address the management challenge.

The use of an enterprise real-time communications server, for example Microsoft OCS, gives the unified communications infrastructure instant messaging, presence, audio-video conferencing and web conferencing functionality, but administrators should be aware that intermittent network problems can cause issues - for example, when network congestion causes dropped connections in conference calls, or when poor call quality is detected from an OCS client to a non-OCS client.

Organisations that use an OCS platform in conjunction with another voice system must have a solution in place that allows them to have visibility of all major components so that they can effectively manage and monitor the unified communications environment. It is important that organisations are able to proactively anticipate potential problems before they have an impact on their core business.

Security also needs to be a priority in the buying equation.  With technologies in place that only provide protection from existing and impending threats, the likelihood of a major and successful attack on unified communications systems is growing for one simple reason: end-user failure to implement security techniques properly. For example, traditional firewalls do not protect VoIP calls, as voice packets must be encrypted and must traverse a firewall without undue latency.   Any network that ends with an IP address is vulnerable to unauthorised calls, spammers, information theft and other malicious activity by hackers, as well as DoS (Denial of Service) attacks that can, at best, adversely impact call quality. In a worst-case scenario, the entire network can be at risk during a VoIP security breach.

In order to be effective from a security perspective, the unified communications management system must provide an automated security layer that monitors the entire unified communications environment in real time to increase protection levels and ensure layered defences. It should be capable of correlating security events, alerting on security breaches, and performing analysis and forensics - all in real time.

Organisations that are looking to deploy unified communications need visibility into the health of the entire communications platform and more importantly need to manage the service that it delivers. By using a comprehensive lifecycle management approach an organisation can successfully ensure that deployment, operation and continued roll-out of unified communication services is delivered at a high standard.

Unifying communications helps streamline business processes and improved connectivity has a direct influence on information sharing, productivity and efficiency. The task of unifying communication applications is actually the opposite of shifting to a single communications platform. Multiple, often unrelated systems must be linked together to appear seamless to the end user. The one unifying element is the underlying data network.

The process of planning, managing and improving begins before deployment and this means that prior to any new communications tools or technologies being used, the capacity of the network must be assessed to see whether it can cope with the anticipated communications traffic.

Once deployment has taken place, organisations must constantly monitor and manage the unified communications system and services. Monitoring should take place at the element level and from the perspective of the user. Taking a proactive approach can save an organisation time and money by identifying problems before they have an impact on end users, freeing up time for managers to apply a more cost-effective solution. It is essential that organisations have the ability to quickly and easily troubleshoot issues; this requires tools and knowledge that are specific to unified communications technologies, especially those providing real-time communications.

Proactive assessment and monitoring tools provide organisations with the ability to generate comprehensive reports that detail the usage and performance of the various elements of the unified communications infrastructure. The ability to generate reports enables managers to adjust elements such as tracking calls, dial plans, gateway utilisation and external links to improve the service delivery.

Managing a unified communications system requires extensive visibility into the organisation's converged voice and data environment. Vendors offer management capabilities for fundamental unified communications technologies including VoIP, Microsoft Exchange Server, Active Directory and networking equipment. The ability to monitor, troubleshoot, report, diagnose and resolve events with one management solution enables an organisation to correlate events and take corrective or preventative action, effectively minimising the resources and time needed to resolve issues.

The unified communications management market is young and immature. Many organisations are defining their own approach to unified communications and how it meets their specific needs. Organisations need to find a unified communications management solution that fits comfortably, with the flexibility to be tailored to the company's needs.

Business has always relied heavily on communications, and organisations that have integrated business applications into the unified communications platform have found that they can resolve customer issues faster while maintaining a higher-quality communication experience.

As communication platforms become more complex and integrated in nature, organisations require tools to assess, monitor, troubleshoot, and secure voice and data transactions. Organisations cannot afford to function in today's economy without the assurance of their communication systems' performance.

Martin Anwyll is Product Line Specialist, VoIP Solutions (EMEA) for NetIQ.

While we are still at an early point in the evolution of 4G network technologies, Rob Dalgety looks forward to greater clarity on how the different technologies will fare and how the market will shake out

Vodafone, T-Mobile and France Telecom have all announced plans to deploy LTE-based 4G networks. Some of these service providers are also planning to support WiMAX, another 4G technology, in addition to LTE. Live deployments of WiMAX are already underway in different parts of the world, while other service providers are getting behind UMB, WiMAX, LTE or some mix of the three. It feels as if we had not really finished discussing 3G technologies before moving on to 4G, and a dazzling array of old and new technology acronyms - WiMAX, LTE, UMB, OFDMA, 3GPP, 3GPP2, IEEE - now forms part of the 4G lexicon.

There are three primary network technologies that support '4G'. The first is WiMAX: Worldwide Interoperability for Microwave Access. WiMAX was initially developed to support 'last-mile' wireless broadband as an alternative to wired technologies, and has since been extended to support truly mobile use cases (non-line-of-sight communications and mobility via the IEEE 802.16e standard). It is currently the most mature of the 4G technologies, with stabilised standards and a number of live deployments around the world.

The second is LTE: Long Term Evolution.  This technology is an evolution of many of the currently deployed ‘GSM Family' of cellular networks (3G/UMTS and 2G/GSM networks) for which the standards are currently being finalised. This technology has the support of a large number of mobile operators and major equipment vendors, as it is intended to provide a relatively straightforward upgrade path from the current network infrastructures.  The first live deployments of LTE are expected in 2010 and beyond.

The final primary network technology in question is UMB: Ultra Mobile Broadband. This is the OFDMA-based 4G extension to the CDMA-2000 3G standard driven by Qualcomm.  This is primarily a route to 4G for service providers who currently use CDMA-2000-based networks. We are still at an early point in the evolution of 4G network technologies. As time passes, it will become clearer how each of these different technologies will fare and how the market will shake out.  Different network technologies will be deployed in different regions and territories depending on a variety of factors, including the commercial and regulatory environment, the availability of spectrum, and the applicability of different network technologies to the aspirations and goals of the service providers.

Moving to these 4G technologies will provide technical and economic opportunities for different service providers. These include the cost efficiencies gained by ensuring IP support in both the core and radio-access networks, as well as significant performance improvements, which can range from improved data throughput to reduced latency and increased capacity (subject to spectrum).  Evaluating these technical and economic factors is a significant part of the process of deciding when and how to move to a 4G technology.

Also key to the process of deciding when and how to move to a 4G technology are the new service opportunities that 4G enables: the new mobile services that can be presented to end users. These can be distilled into four main areas:

  • Fixed Services: 4G technologies such as WiMAX and LTE provide the ability to support broadband and voice services that have, until now, been delivered by wireline technologies. There are opportunities to:
    - substitute for wireline technologies, for example in rural areas, where the economics of wireline deployment are undermined by the lack of subscriber density; and/or
    - cannibalise fixed services, using the wireless network to form part of the proposition where the wireless service offering is more compelling across the marketing mix than the fixed alternatives.

  • Mobile Computing: Enabling truly mobile computing (laptops, modems, dongles, tablets and ultra-mobile PCs) is another service opportunity for 4G technologies. The higher bandwidth and lower latency of 4G networks fit well with the services used by these more data-centric devices, whether accessing a corporate network and enterprise applications or using other data-heavy applications.
  • Mobile (2.0) Services: Beyond data-centric devices like computers, 4G networks will deliver the underlying network connectivity and bandwidth to unleash more advanced next-generation data services on traditional mobile devices.
  • Mobile Consumer Electronic Devices: 4G networks will also offer the opportunity to support connectivity for a wider range of consumer electronic devices - from MP3 players and cameras to personal media players. The mobile phone will be just one of many end-user devices connected to 4G networks. All of these different devices can benefit from this connectivity, which will enable support for over-the-air content and data updates as well as ongoing management of the devices themselves.

In general, timing of the rollout and uptake of these services should follow the order outlined above.  However, there will be many regional differences in Europe and around the world.  For example, territories that have extensive and well-developed wireline infrastructures may see more focus on the mobile services opportunities (points 2-4 above) than on the fixed services opportunities.  The commercial strategies of service providers will also significantly impact service mixes and rollout plans. Some service providers may decide to innovate, creating new service opportunities (such as providing connectivity for mobile consumer electronics devices). Others may go after known market opportunities that are currently serviced by wireline technologies, thereby cannibalising the ‘wired services'.

Regardless of which 4G network is under discussion, it is the new wireless network technologies themselves that currently dominate the conversation. This focus on network technologies is not a new phenomenon; we saw it when 3G networks rolled out and we are seeing it again as the 4G market evolves. The network itself will form an important part of the discussion as we move into the 4G world. However, based on the quantity of conference papers, standards-body activity, press coverage and news, you might think that once the 4G network issues are resolved and the networks deployed, any issues to do with delivering services over those networks will be resolved as well.

The reality is that this is far from the case.  There is a range of other enabling-technology elements that are essential to delivering 4G services. Critical components that need to be considered include the devices and services that will use the network. In building a truly functional and vibrant ecosystem, we not only need to deal with network, device, application and service issues, but we also need to deliver the key advanced management capabilities and tools that are essential for a truly seamless, high-quality end-user experience.

A critical lesson learned from the cellular world is that as the service environment becomes more complex, embedding advanced visibility and manageability into devices and services is critical to optimising new services and delivering a great end-user experience. These capabilities will be even more critical as we move into the 4G world with all its inherent complexity.  For example, in a 4G ecosystem, the device might be anything from a mobile phone to a computer to an MP3 player or a vending machine.  Being able to detect the device, recognise it for what it is, activate it and configure the relevant services on it without the need for human intervention constitutes the first step to an excellent user experience.   After all, there would be no point activating and configuring a vending machine for voice services!

The technical ecosystem is critical, but it is compelling service propositions delivering value to consumers or enterprises that will ultimately drive end-user demand for 4G services. These propositions will rely on the inherent capabilities of the network as well as on other technical components. But the litmus test will be the full value proposition service providers deliver to their users - the '7 Ps' of marketing. For a truly compelling value proposition, service providers must consider everything: from the pricing and promotional offers, to the service functions (product), distribution channels (place) and physical presence, especially of devices (from traditional mobile devices to other connected consumer electronics), to the provisioning and setting up of services (process) and all of the delivery and support processes (people) that accompany the offerings.

There are significant service opportunities in the 4G environment.  The network is a core part of the equation - the oxygen that will support successful 4G services.  The network is also a significant component of the business case for investment and deserves attention for that reason as well.  However, there are other enabling technologies that will be critical to the success of 4G, including devices, services and management capabilities. These will all be important technical components in ensuring a full-functioning 4G technology ecosystem.  The ability to grow the usage of services - and reap the revenues that increased usage will bring - will also depend on the value proposition service providers deliver to end users. Offer compelling value propositions delivered over a superior technology infrastructure, and 4G services will fly.

Rob Dalgety is Commercial Director, Mformation Technologies

The early hype for mobile TV may now have died away, but Kamil Grajski is certain that successful mobile broadcast is on its way

The mobile TV hype is over. Aggressive projections for consumer adoption and revenue (whether via advertising or subscription) have not come to pass. Chief among the more creative aspirations was that by the opening ceremony of the 2008 Summer Olympics, aided by a European Commission-sponsored DVB-H technology mandate, the European Union would be well along the road to rapid adoption of mobile TV and mobile broadcast services. Worse, news has emerged in recent days that, due in part to a lack of mobile network operator engagement, the Mobile 3.0 consortium that had won a licence in Germany with plans to use DVB-H for its mobile TV service may be on the verge of collapse. In parallel, it was announced last week that Norwegian broadcasters NRK, TV 2 and MTG had banded together to launch a free-to-air T-DMB-based mobile TV service.

All of this is not to pick exclusively on DVB-H.  Not at all.

The reality is that there are challenges aplenty across the whole mobile TV technology spectrum. For example, operators in Korea have launched services using the T-DMB standard, a market that now sports more than sixty device types and more than 8 million subscribers. But even with an audience of that size, advertising revenue is running behind expectations. In the USA, where AT&T and Verizon's FLO-powered services recently aired a live, dedicated mobile TV Olympics coverage channel, uptake of mobile TV has been slower than many expected. Qualcomm's CEO Paul Jacobs, whose company supports the FLO broadcast standard as well as other technologies, has publicly expressed a desire to see greater mobile network operator marketing and promotion of mobile TV services.

But despite this seemingly steady march of bad news, the underlying trends for mobile TV still register strong positives. In short, the hype has merely been replaced by the harsh realities of establishing a new global consumer mass-market medium. Although TV is seen as a mature medium, transferring the broadcast model of cable, satellite and terrestrial networks to mobile is a massive undertaking. Despite some early teething troubles, mobile broadcast TV has launched commercially in a number of closely followed markets around the world, including Japan, Korea, Europe (Italy) and the United States. Other on-air or planned launches include Austria, Finland and the Netherlands, among several others. So the commitment from operators and the industry is there.

So what has been learned so far?  For those operators that launched with a free-to-air model (and lots of devices), such as in Japan and Korea, initial consumer adoption is not the major issue - the services are in fact proving extremely popular.  What remains unclear is revenue and profitability growth, as operators try to move consumers over to the more lucrative subscription channels that feature premium content.  Conversely, operators that launched with only a pay-TV model have seen adoption as the greater issue. And even then, long term revenue and profitability growth are yet to be fully tested.

In Europe, 3 Italy, which was first to market with a DVB-H-powered pay-TV service, recently announced the addition of a free-to-air bouquet as a way of boosting adoption. Similarly, MediaFLO USA has added a free-to-air promotional channel aimed at giving consumers an easy way to sample and subscribe. Thus, in response to market realities, the launched mobile broadcast market is implementing and testing the proposition that a hybrid free-to-air and pay-TV model may be optimal to drive adoption and fuel business growth.

Looking at forthcoming deployments, the medium- and long-term trends remain positive. In the medium term, the realisation that capacity (spectral efficiency) is pivotal to broadcast TV's success has prompted a review of the different technologies on offer. Capacity is important for two reasons: first, the ability to deliver more channels over the same service means that operators can deliver a greater mix of paid and free content as they seek to drive uptake; second, capacity affects the amount of spectrum needed to deliver TV services. This is of critical importance in Europe, where the pressure on available spectrum varies greatly from country to country. This is one of the advantages of the FLO air interface, which supports roughly 1.5 to 2 times as many video channels as any other mobile broadcast technology. Such capacity gives excellent flexibility in adjusting bandwidth between free-to-air and subscription-based programming.

For example, while it is too early to call it a widespread trend, we have observed regulators (in France and the UAE, among others) impose must-carry free-to-air requirements on mobile broadcast licensees. The number of such free-to-air channels varies, but can be as high as five to ten. MediaFLO operating in an 8MHz channel can deliver 30+ streaming video channels (25+ frames per second; QVGA; AAC+ stereo audio). Thus a ten-channel free-to-air requirement still leaves up to 20 channels to power a subscription-based service, or to be allocated to services such as multimedia file delivery (Clipcast), IP datacast services and mobile interactive TV.
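The channel budget described above can be illustrated with a trivial calculation. The 30-channel capacity and ten-channel must-carry figure are taken from the text; how an operator splits the remainder between subscription TV and datacast services is a commercial decision, shown here purely as a sketch:

```python
# Channel-budget arithmetic for a hypothetical MediaFLO deployment.
# Figures from the text: ~30 streaming QVGA channels in an 8 MHz
# allocation, and a must-carry requirement of up to 10 free-to-air
# channels imposed by the regulator.

TOTAL_CHANNELS = 30          # approximate capacity of one 8 MHz FLO channel
FREE_TO_AIR_REQUIRED = 10    # upper end of the must-carry range cited above

remaining = TOTAL_CHANNELS - FREE_TO_AIR_REQUIRED
print(f"Channels left for subscription/datacast services: {remaining}")
# → Channels left for subscription/datacast services: 20
```

The remaining 20 channels are the pool the operator can divide between premium subscription programming, Clipcast-style file delivery and IP datacast, re-balancing the mix as demand becomes clearer.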

Also in the medium term we can expect action to result from recently concluded spectrum auctions, including those in the United States and the United Kingdom.

Two key long-term trends signal continued positive momentum.  First, mobile broadcast related regulatory consultations have recently concluded or are in progress in India, Singapore, Taiwan, Ireland and the United Arab Emirates.  In Europe there is steady progress relating to the Digital Dividend and spectrum harmonization primarily via the CEPT ECC Task Group 4 operating under mandate from the European Commission.  Second, 3G-based mobile TV continues to build momentum.  In key high-growth markets, such as throughout the Middle East and North Africa, 3G-based mobile TV has created spirited competition between operators.

Ultimately, as 3G-based mobile TV and related services drive adoption and simultaneous use grows, the low-cost bearer economics of mobile broadcast in general and capacity performance of MediaFLO in particular will drive adoption.

Back to today.  While the mobile industry still works out the details of how best to create a robust mobile TV business, consumers are still getting to grips with even basic multimedia content. But there is optimism in the air: as people get used to browsing the web and consuming video and music on their mobile devices, they set the stage for TV and other richer content in the future. So despite the death of all the early hype, it's certainly a question of ‘when' and not ‘if' for mobile broadcast TV. Nobody said that changing habits was easy, but done right the effect can be amazing. Just ask Apple; not for what it has achieved for itself, but rather for the catalytic effect the iPhone has had on the whole industry. The work we do today prepares the ground for the services we enjoy tomorrow.

Kamil Grajski is President of the FLO Forum

In the face of continued threats to mobile communications from message-borne spyware, malicious messages and invasive spam texts, education is the next key step in the fight for mobile security, explains Jay Seaton

In May this year, IMS Research released data predicting that by 2012, 900 million users will be accessing banking and payment services through their mobile phones. This enormous figure reflects the evolution of consumer activity from the high street bank, through the PC and on to the smartphone, and yet there is little to no education for users on the risks involved in accessing banking data through their mobile phone. So while the last few years have focused on educating the public about banking, shopping and online activity through their PCs, the mobile phone is a new arena where operators need to step up and educate users on protecting their personal data on the go.

Many consumers assume that operators pre-load security functionality onto handsets; when purchasing a mobile phone, consumers are offered insurance against loss or theft of the handset, but nothing relating to mobile security. Indeed, McAfee's 2008 Mobile Security report found that 72 per cent of mobile users were concerned about the level of security services for their mobile phones, which shows that the topic, while less prominent than PC security, is just as pressing. With the advancement of the mobile phone comes the additional task of protecting the information stored on and accessed via the device.

While the meaning of "security" hasn't changed much over time, its context has evolved at a frightening pace, with more and more risks present in consumers' everyday lives. There have already been several prominently reported stories this year of confidential information being accessed and taken out of the workplace through employees' devices. Indeed, a recent survey by Decipher Inc found that "70 per cent (of those questioned) said they access what they consider to be sensitive data on their smartphone in order to work outside the office." While the mobile phone is not a new arena for security threats, it is still hugely overshadowed by the traditional threats to home, work and personal computers. With record volumes of spam hitting consumers' mobiles on an hourly basis, it's time to shift the focus from the online world to the personal realm of mobile communication.

For consumers, the mobile phone has opened up a world of new possibilities. Mobile subscribers can now use their phones for a host of activities: paying for goods or accessing bank details, purchasing cinema and concert tickets, travelling around cities or accessing social networking sites such as Facebook. All these tools are aimed at making it easier for consumers to live and work on the move. However, for all the progress that has been made, there is still very little information available to help the mobile user guard against the same dangers that would be second nature to them whilst working on a PC.

So what areas are mobile users most at risk from? 

Unwanted SMS
One of the most prevalent, and now universally recognised, security risks for mobile users is unwanted SMS - most users would recognise unwanted advertising or, in worse scenarios, unwanted and malicious messages. With an estimated 72 per cent of all mobile phone subscribers worldwide being active users of SMS, each is at risk from several forms of SMS abuse, from unwanted advertising and denial-of-service (DoS) attacks in the form of SMS flooding, to scam messages encouraging subscribers to make premium-rate calls.

Spam Text
Global SMS spam levels continue to rise at a frightening pace. In March this year, China saw an unprecedented influx of spam messages, with 200 million China Mobile subscribers hit. What makes life easy for spammers is the low price of SMS, meaning users can be targeted from outside their own country. Meanwhile, with mobile marketing becoming ever more popular, Application-to-Person (A2P) SMS is gradually becoming more common as advertisers aim to reach audiences through different channels.

Messaging-borne spyware and malware 
Spyware and malware are among the most malicious forms of mobile threat. Users can be targeted through a message containing a URL or web link which, once clicked, downloads a virus or an application with a hidden piece of code to the handset. The most common strain of Trojan targets the user's address book, sending the same virus to all contacts, and the pattern repeats itself. Small businesses in particular are at risk, as they are more likely to be using smartphones with unsecured email clients. Additionally, the originators of these Trojans use fake mobile accounts that cannot be billed to a single operator, meaning there is a huge volume of messaging traffic that no one is paying for, which makes them very hard to trace.

MMS threats
Despite currently being less prominent than SMS threats, malicious MMS messages are a danger for users with Bluetooth enabled. Once installed on a device, the virus can replicate itself via Bluetooth and again through the phone's address book. The user is then charged for the huge volume of messaging that has taken place without their knowledge, and the phone's battery is drained as well. However, MMS threats can be controlled relatively easily by disabling the Bluetooth function when it is not in use.

So what can consumers do to prevent these threats? First, more education is required from the operators and network providers: 55 per cent of users expect mobile operators to preload mobile security functionality onto all handsets, and while guidance on PC security is readily available, the equivalent information for mobile users is not, so users are left to make assumptions. With planning already well under way for London's 2012 Olympic Games, it's important to ensure security is in place on all levels - mobile security needs to be a key part of this.

Second, the mobile operators need to heed this advice and deploy mobile security tools and services to ensure their subscribers are protected. For under-18s and other vulnerable users, a mobile operator can empower parents to control who can contact their children and the types of content they are willing to receive. This can be done through content controls which allow parents to prevent children from accessing inappropriate web and WAP sites; receiving unwanted and unsolicited messages such as phishing attempts, bullying and harassment, or pornographic images by MMS; or subscribing to unwanted premium-rate messaging services.

For corporate organisations, operators can not only provide subscribers with the means to enforce corporate usage policies (ensuring Mobile Data compliance to existing LAN Acceptable Use Policies) but can also extend this capability from Internet access to embrace messaging and safeguard users from spam, phishing and virus attacks, while also protecting the operator's network.

However, it is not only end users who need protecting. Mobile operators' networks are also affected by SMS fraud, leading to revenue loss between operators. Studies of operator traffic show that typically one to two per cent of all traffic carried may be spoofed or faked, which, given the large messaging volumes involved, results in significant direct costs.
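The scale of that loss follows directly from the percentages cited above. The one to two per cent spoof rate comes from the operator studies mentioned in the text; the monthly traffic volume and inter-operator termination fee below are invented figures, used purely to illustrate the arithmetic:

```python
# Rough, hypothetical estimate of the direct cost of spoofed/faked SMS
# traffic. The 1-2% spoof rate is from the operator studies cited above;
# the volume and fee are ILLUSTRATIVE ASSUMPTIONS, not real operator data.

monthly_messages = 1_000_000_000   # assumed: 1 billion messages per month
termination_fee = 0.01             # assumed: 1 euro cent owed per message

for spoof_rate in (0.01, 0.02):    # the 1-2% range from operator studies
    monthly_loss = monthly_messages * spoof_rate * termination_fee
    print(f"At {spoof_rate:.0%} spoofed traffic: EUR {monthly_loss:,.0f}/month")
# → At 1% spoofed traffic: EUR 100,000/month
# → At 2% spoofed traffic: EUR 200,000/month
```

Even at these modest assumed rates the losses compound month on month, which is why inter-operator fraud detection is worth the investment for carriers of any size.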

Growing mobile messaging and data revenues depends upon the growth of accessible mobile content. However, without controls, users are potentially subject to harassment, unsolicited messaging, inappropriate content and fraud. Unless addressed, these concerns will inhibit the growth of mobile phone penetration in new segments and the usage of messaging and data. Without the ability to preserve privacy by managing content and access, a user has only one choice: suffer, or switch off the service.

Jay Seaton is Chief Marketing Officer at Airwide Solutions
