Features

In an era of commoditisation, competition and churn, it is becoming increasingly difficult for telecommunications providers to find meaningful market differentiators. Superior customer service offers a solution, but, to be truly effective, it requires superior customer information, explains James Wilkinson

With most fixed-line and wireless providers now offering broadly similar products, services and pricing structures, and customers able to switch with increasing ease to whichever provider has the latest incentive package, the competitive advantage often lies with those who offer a higher quality, more personalised customer service. This also builds churn-reducing loyalty and, as every marketer knows, retaining an existing customer is far more cost-efficient than recruiting a new one. Existing customers also represent a potentially rich source of cross-product and upselling opportunities - providing, of course, one knows what they want.
Knowing what the customer wants depends on knowing who they are. This, in turn, requires an IT system with the capacity to capture relevant information from customer-facing sources, combine this data with other enterprise systems and then maintain the information in a format that is easily accessible, reliably accurate and always up-to-date. Creating a master customer index with these attributes is no trivial task. To really do the job requires a combination of advanced integration, identity matching and data management technology.
Until now telecoms providers have been investing heavily in applications such as CRM to create their customer profiles. Increasingly, however, CRM is being viewed as an entry-level solution. While CRM is capable of capturing a plethora of data and loading it into one place, it typically lacks the ability to integrate customer-facing and back office systems such as customer accounts, enterprise resource planning (ERP) or web portals. Nor are CRM systems generally able to arrange this information into a meaningful, prioritised view of the customer or business entity.
Having the ability to capture every front-end transaction and then selectively transform this data into a coherent view of the customer that is both current and operational remains an elusive quest with CRM. To accomplish this level of sophistication, today's solution of choice is Customer Data Integration (CDI) - a subset of Master Data Management (MDM), an advanced suite of data integration and identity matching tools that enables providers to 'know their customer' with a next-generation level of depth, detail and accuracy. By offering a complete, front-to-back office system of tools for real-time customer visibility, CDI picks up where CRM leaves off.

A 360° customer view
A CDI-based customer profile, or master record, is far richer than a single-faceted view based on something like the monthly bill. It brings together strands of information from all touch points to enable the telecoms provider to see the full set of relationships it has with each customer, whether corporate or consumer. In the case of a household, this may involve consolidating a number of different accounts either held in separate data silos (accounts payable, marketing, CRM, ERP, etc.), or belonging to different family members, all with their own mobile phone accounts. For some purposes, such as accounts receivable, these accounts may need to be viewed together. For functions like direct marketing, however, customer files may need to be analysed individually. Adding to this complexity is the proliferation of new services coming to market such as wireless Internet, TV-over-mobile and other content-based services, all of which makes the task of obtaining a single unified view of the customer an even greater challenge.
In the case of corporate clients, the consolidated customer record may include a variety of other information on the company's supply chain partners, different departments, lines of business or global network of offices. Using CDI technology, this data can be segmented according to any set of pre-defined parameters in order to identify such things as call volumes within a specific cost centre or to get a breakdown of personal vs. business mobile phone usage for individual staff members. This layered, or hierarchical, view of the enterprise not only provides the telecoms operator with an enriched level of insight into the high-margin business customer, it constitutes a potentially new value-added data management service for which corporates may be prepared to pay a premium.
Traditionally, the call centre is one of the service provider's costliest overheads. Given the right tools, however, the customer support representative (CSR) can not only be a problem-buster and retention builder, but can also become a dynamic sales agent, transforming the hitherto cost centre into an effective profit centre.  When a customer calls in (usually with a complaint), the CSR must have instant, at-a-glance access to the customer's complete master record - time lost searching for files, asking repetitive identifier questions or transferring the caller to other departments inevitably compounds the customer's initial irritation. By contrast, having a complete overview of the customer from the start and being able to sort out the problem quickly creates a positive, personalised customer experience. This in turn prepares the ground for a selling opportunity.
Armed with this master customer record, a CSR can quickly spot service usage trends and offer better tariff rates, lock in new call packages or sell a mobile contract to a fixed or broadband customer. This interaction at the same time enables agents to update demographic details such as recent change of address, add new staff members to a corporate account, or make note of a family member who has reached school age and may need a mobile phone.  Once captured within the CDI hub, this continually updated record becomes the most accurate, trusted source of customer profiling information, available in real-time at any security-cleared service or operational touch point.
Using CDI technology, creating the customer record is speedy, requires no costly data transformation professionals and is non-invasive to existing systems because the hub sits between the existing systems that gather data, and the enterprise systems that want to consume these data. Like a spider's web, it is linked to legacy back-office data sources and customer-facing applications via a system of re-usable, object-oriented components enabling master record data to be gathered and held in either a central, or a federated repository. Because this indexing database is set up to capture information from anywhere across the enterprise on a real-time basis, it becomes the most accurate, up-to-date and trusted source of information concerning the customer, business or any other entity.
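To make the hub pattern concrete, here is a minimal sketch in Python: source systems register adapters, and the hub assembles a consolidated record on demand. Every class, field and source name is invented for illustration - this is not Initiate Systems' actual API.

```python
# Minimal sketch of a CDI-style hub sitting between data-gathering systems
# and data-consuming systems. All names are hypothetical illustrations.
from typing import Callable, Dict, List


class CustomerHub:
    def __init__(self) -> None:
        # Each source adapter returns the raw records it holds for a customer key.
        self.sources: Dict[str, Callable[[str], List[dict]]] = {}

    def register_source(self, name: str, fetch: Callable[[str], List[dict]]) -> None:
        """Link a legacy or customer-facing system into the hub, non-invasively."""
        self.sources[name] = fetch

    def master_record(self, customer_key: str) -> dict:
        """Assemble one consolidated, up-to-date view from every registered source."""
        record: dict = {"customer_key": customer_key}
        for _name, fetch in self.sources.items():
            for raw in fetch(customer_key):
                for field, value in raw.items():
                    # First trusted value wins; later sources fill the gaps.
                    record.setdefault(field, value)
        return record


# Usage: wire in CRM and billing, then serve a single 360-degree view.
hub = CustomerHub()
hub.register_source("crm", lambda key: [{"name": "J. Smith", "segment": "consumer"}])
hub.register_source("billing", lambda key: [{"name": "John Smith", "balance": 42.50}])
print(hub.master_record("cust-001"))
```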

Intelligent data matching
One of the main challenges of integrating data from disparate sources into a central repository is that the resulting database is frequently cluttered with duplicate files and fractured, incomplete information. This is often the result of misspelling and other errors: for example, having one file under 'Smythe' and another under 'Smith', or just the typical variation associated with collecting data (using Bill sometimes and William other times). Cleaning up the customer's master record, and then keeping it current on an ongoing basis, is one of today's biggest IT headaches. To tackle the job, Initiate Identity Hub software, one of the industry's leading CDI/MDM solutions, employs a system of highly accurate probabilistic matching and linking algorithms to identify and resolve these anomalies. Mechanisms are in place to ensure that changes subsequently made by a CSR or customer-facing application are in turn reflected in the master record.
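The sketch below illustrates the general principle of probabilistic matching: weighted, fuzzy comparisons across several fields yield a score that decides whether two records describe the same customer. The weights, threshold and nickname table are invented for the example; production algorithms such as Initiate's are considerably more sophisticated.

```python
# Illustrative probabilistic record matching; weights and threshold are
# invented for the example, not taken from any vendor's product.
from difflib import SequenceMatcher

# How strongly agreement on each field suggests two records are the same entity.
FIELD_WEIGHTS = {"surname": 4.0, "forename": 2.0, "postcode": 3.0, "phone": 5.0}
MATCH_THRESHOLD = 8.0
NICKNAMES = {"bill": "william", "will": "william"}  # tiny sample table


def normalise(field: str, value: str) -> str:
    value = value.strip().lower()
    return NICKNAMES.get(value, value) if field == "forename" else value


def match_score(a: dict, b: dict) -> float:
    """Sum weighted similarity across fields; fuzzy comparison absorbs
    misspellings such as 'Smythe' vs 'Smith'."""
    score = 0.0
    for field, weight in FIELD_WEIGHTS.items():
        va = normalise(field, a.get(field, ""))
        vb = normalise(field, b.get(field, ""))
        if va and vb:
            score += weight * SequenceMatcher(None, va, vb).ratio()
    return score


rec1 = {"surname": "Smythe", "forename": "Bill", "postcode": "SW1A 1AA", "phone": "07700900123"}
rec2 = {"surname": "Smith", "forename": "William", "postcode": "SW1A 1AA", "phone": "07700900123"}
score = match_score(rec1, rec2)
print(round(score, 2), "-> same customer" if score >= MATCH_THRESHOLD else "-> distinct")
```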
In creating a master customer index, telecoms providers have an advantage over most other industries: a unique customer identifier called the telephone number. A wealth of profiling data can - in theory at least - be collected around the phone number. However, this can also lead to much confusion. An individual may have several different numbers (work, home, mobile etc.), or conversely a group of people (office or family members) may share a common number. Another common scenario is that a customer has switched to a different provider and taken their number with them. This is a particular challenge for suppliers like the Carphone Warehouse, which represents multiple providers.  In such a case, CDI's intelligent matching technology can link up customers with their mobile numbers and their current supplier to eliminate duplicate or obsolete files and then group this corrected information into a new master customer record.
So far, we have focused on the customer-centric application of CDI. However, because this technology provides a fast, cost-efficient means of integrating silo information, it also has major cost and efficiency implications at an operational level. Indeed, one of the outcomes of the recent spate of merger and acquisition activity within the telecoms industry is a need to rationalise increased amounts of legacy data, frequently at a global level. Once gathered into the CDI hub, customer lists from the merged entities can be compared and cleaned, and supplier data can be compared for negotiating leverage, notably in cases where the parent company may have pre-existing partner contracts, or where the combined volume of business suggests a pricing discount is appropriate.  Whatever the application, CDI/MDM's contribution - whether customer-facing or operational - is revolutionising the data management landscape.

James Wilkinson is EMEA Services Director,
Initiate Systems
www.initiatesystems.com

Customer Experience Management (CEM) is a crucial process for mobile network operators and communication service providers in general. It places the customer at the centre of a converging communications environment, finally recognising that it is the end-users’ response to their experience of the network and its services that will ultimately drive the success or failure of any network brand.  James Doyle discusses the emergence of CEM as a crucial application, and why it is now an essential complementary part of CRM/BSS/OSS processes

Is Customer Experience Management (CEM) just another three-letter acronym added to a growing list, or does it represent a new approach to delivering value within the OSS/BSS landscape? This is a common question passing across the lips of those who bump up against CEM for the first time.
CEM has traditionally been viewed by management consultants and analysts more as a business and marketing philosophy than as a real technology or application delivering tangible value. However, CEM implementations through real software solutions across the globe are on the increase and are producing real results in new revenue generation, lower churn rates and greater customer satisfaction amongst many MNOs. CEM systems are able to build top-down models to define customer experience and, in transaction-orientated networks, measure experience indicators against those models in near real time. This allows the CSP to view any customer's experience of service consumption; through such measurement, experience 'gaps' can be identified and customers proactively managed as a result. This, in turn, reveals customer satisfaction, churn potential, lost and potential revenues, brand damage, negative and positive advocacy, and a customer-centric experience view of your business.
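As a minimal sketch of that top-down idea, assume target indicators are defined per touchpoint and each customer's measured values are scored against them; the touchpoints and thresholds below are invented, not Arantech's actual model.

```python
# Sketch: define per-touchpoint experience targets, then score a customer's
# measured indicators against them to surface experience 'gaps'.
from dataclasses import dataclass


@dataclass
class ExperienceModel:
    touchpoint: str          # e.g. service usage, billing, customer care
    indicator: str           # the KPI measured for this touchpoint
    target: float            # value the model says a good experience needs
    higher_is_better: bool


MODELS = [
    ExperienceModel("service_usage", "call_setup_success_rate", 0.98, True),
    ExperienceModel("service_usage", "data_session_drop_rate", 0.02, False),
    ExperienceModel("customer_care", "first_call_resolution", 0.80, True),
]


def experience_gaps(measured: dict) -> list:
    """Compare measured indicators with the model; return failing touchpoints."""
    gaps = []
    for m in MODELS:
        value = measured.get(m.indicator)
        if value is None:
            continue
        failing = value < m.target if m.higher_is_better else value > m.target
        if failing:
            gaps.append((m.touchpoint, m.indicator, value, m.target))
    return gaps


# A customer whose data sessions keep dropping shows up as a gap to act on.
print(experience_gaps({"call_setup_success_rate": 0.99,
                       "data_session_drop_rate": 0.07}))
```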

CEM’s differentiating qualities
So, how different is CEM from Customer Relationship Management (CRM) or Service Quality Management (SQM)? The difference is subtle but very significant, and requires approaching the problem from the customer's point of view. Firstly, CEM delivers a customer experience view of every point of interaction the customer has with a CSP's business and operational platforms - service usage, billing, customer care and so on; we call these the customer touchpoints. Secondly, it delivers a complete view of a customer's experience and their problems direct to the desktop of the relevant stakeholder or workflow process within the business, so immediate action or feedback can be taken to improve that experience - closing the loop.
CEM is, therefore, more of a horizontal, iterative way of providing experience feedback to the entire business, whereas the other approaches are vertical methods of understanding the operation of assets within each silo; SQM delivers a service view rather than a customer or business view, and CRM data limitations mean a real-time, actual customer-experience view cannot be extracted.
Using the customer as the central reference point of your value delivery and service experience provides a more holistic and better way of managing and running a business, especially in terms of technology and service integration (e.g. convergence). Why rely solely on standard OSS and BSS silos, with systems to measure each silo (sometimes per technology), when you can use the customer to pull this data together to provide a more complete picture?
Such views are supported by independent research from Allen, Reichheld and Hamilton in the October 2005 Harvard Management Update newsletter. Their paper, entitled Tuning Into The Voice of Your Customer, demonstrates that simple measurement of the positive and negative advocacy of a company's customers provides a clear indication of that business's growth potential. Full CEM systems act as a key feedback mechanism to deliver against this theory.

Closing the gap
In terms of aiding speed of response to customer needs, CEM is one of the most efficient and proactive ways an operator has of closing the gap between measuring a poor performance and enacting change to solve the problem. Other OSS and BSS systems just do not 'see' the actual customer's behaviour so clearly. CEM can also be applied to traditional BSS areas like customer care and CRM, which is a key part of Arantech's solution strategy and what the company, in turn, believes should become an overall industry strategy. The company sees three CEM elements as crucial: First CEM - providing rich, customer-centric data early to the company, especially key stakeholders like executives, customer care and account teams; Proactive CEM - taking action before the customer has raised a problem, which can be seen as managing against the positive and negative advocacy of your customer base; and Next Best Action CEM - applying business logic rules to a combined CEM and CRM data set to provide a simple set of next best actions for the operator to improve the business process.
A proactive customer-management strategy executed through, say, customer care activities can deliver a much-needed proactive first line of care, rather than a reactive, cost-based process. By being more proactive, an operator's process will become more revenue focused and less cost focused. And if one considers that most customer experience problems don't even get reported to the customer care department, this proactive approach makes total sense in revenue and experience terms.
In the end, the only way of getting a customer-focused view is by going top down, customer-to-asset, and not bottom-up. This requires measuring all customers all of the time, in real time, enabling a CSP to look at how these customers are using and consuming both services and network basics.
In a network management or service management system, tens of thousands of elements are being measured and managed, whereas in a customer-centric solution tens of millions of customers and their experiences are being managed; this scale is a key differentiator of CEM systems.
Currently, a significant number of operators and service providers are already applying the Arantech touchpoint CEM to their networks and are now able to monitor service delivery and usage through the eyes of their subscribers.

The future
The TM Forum has spent significant time defining service management and turning that into a deployable technology. Taking a pure customer-centric approach and turning that into a deployable technology has, until now, only been pursued comprehensively by one company, Arantech, currently the thought leader and pioneer of this space. However, over the last 18 months CEM has started to become a real market opportunity, which is attracting more vendors to the CEM approach.
CEM, as both a business process and a best practice discipline for any operator, is beginning to both catalyse and enable a cultural change process across the mobile industry. It is helping CSPs review the whole way in which they do business and has already led a number of MNOs to change their workflows and processes to reap competitive and revenue advantages.
So, are the days of just measuring the performance of a network or service rapidly drawing to a close? Maybe the answer to that will revolve around whether current systems in use will ever be able to positively answer several key questions, in both real-time and historically, across all touchpoints at which the subscriber experiences the network and its services. These questions are:
• Can you segment, group and manage the service experience of the entire customer base dynamically?
• Can you identify who is trying to spend money but can’t?
• Can you identify which customers are having a good or bad experience across all operational and business platforms?
• Can you identify what services are being consumed by which customers or device?
• Can you identify which of your network assets are the most effective at delivering business value and profit?
• Can you continually determine the positive and negative high-margin customers dynamically to ensure an action plan for high company growth?
If the system in use is a comprehensive CEM implementation, such questions will be answered in the affirmative on each count and, more importantly, the answers delivered to the desktop of the key stakeholder or process responsible.

James Doyle is VP Marketing and Product Management with Arantech

The telecoms market is renowned for its rapid billing cycles, but service creation has always been expensive, time-consuming and operator-led. Antoine Guy explains that a new approach promises to revolutionise this situation by addressing the very core of the problem: subscriber management

Taking its lead from other sectors, such as IT and manufacturing, telecoms vendors are now beginning to embrace a new discipline - Product Lifecycle Management (PLM) - which originates from Computer Aided Design (CAD). PLM applications unify the marketing, service management and network resources of the operator to provide end-to-end product creation, re-using data to shrink the time and cost associated with taking services to market.  PLM is gaining interest, with the TM Forum recently forming the Product and Service Assembly Forum to standardise the discipline, and it's already begun to pave the way for personalised provisioning.
An example of PLM in action is a new breed of Subscriber Management Platform (SMP) capable of uniting traffic management systems with the operator's dedicated subscriber management system, RADIUS and DHCP servers. Drawing upon the data from each of these, the SMP is able to correlate the user's network activity against the services he or she takes and those at the operator's disposal. In effect, it enables a broadband network to become 'subscriber-aware', mapping the connection between individual subscribers and the services they consume.
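A minimal sketch of what 'subscriber-aware' can mean in practice: RADIUS accounting events bind a subscriber ID to a dynamic IP address, so traffic flows keyed by IP can be attributed to the right account. The event fields follow common RADIUS attribute names, but the code is an assumption-laden illustration, not Allot's implementation.

```python
# Sketch: maintain a live IP-to-subscriber mapping from RADIUS accounting,
# then attribute traffic flows to subscribers. Illustrative only.
from typing import Dict, Optional

ip_to_subscriber: Dict[str, str] = {}


def on_radius_accounting(event: dict) -> None:
    """Process Start/Stop accounting records to keep the mapping current."""
    if event["Acct-Status-Type"] == "Start":
        ip_to_subscriber[event["Framed-IP-Address"]] = event["User-Name"]
    elif event["Acct-Status-Type"] == "Stop":
        ip_to_subscriber.pop(event["Framed-IP-Address"], None)


def attribute_flow(flow: dict) -> Optional[str]:
    """Map a traffic flow, keyed by source IP, to the subscriber using it."""
    return ip_to_subscriber.get(flow["src_ip"])


on_radius_accounting({"Acct-Status-Type": "Start",
                      "Framed-IP-Address": "10.0.0.7",
                      "User-Name": "subscriber-4711"})
print(attribute_flow({"src_ip": "10.0.0.7", "app": "voip"}))  # subscriber-4711
```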
The first step to subscriber-led servicing is to understand subscriber usage by mapping applications on the network. Over today's IP infrastructure, a multitude of applications compete for capacity unchecked, and many operators are unaware of how capacity is being used. Deep packet inspection (DPI) technology can provide the operator with visibility into network usage, identifying content in the packet's header and payload and cross referencing this with a library of applications to determine the nature of the traffic. DPI technology segments packets by protocol, application type and patterns within data payloads, offering detailed visibility into traffic usage and trends. This level of visibility - understanding the users, protocols, and applications on the network and how they are behaving - is the first step in controlling traffic and usage down to the subscriber level.
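The toy classifier below captures the essence of signature-based inspection: each flow is tested against port and payload patterns drawn from an application library. Real DPI engines track far richer state and pattern sets; these signatures are deliberately simplified assumptions.

```python
# Toy signature-based classifier in the spirit of DPI: inspect ports and
# payload patterns against a small application library.
APP_SIGNATURES = [
    # (application, matching rule over a decoded flow)
    ("http", lambda f: f["dst_port"] == 80 or f["payload"].startswith(b"GET ")),
    ("bittorrent", lambda f: b"BitTorrent protocol" in f["payload"]),
    ("sip_voip", lambda f: f["dst_port"] == 5060 or f["payload"].startswith(b"INVITE sip:")),
]


def classify(flow: dict) -> str:
    """Return the first application whose signature matches the flow."""
    for app, rule in APP_SIGNATURES:
        if rule(flow):
            return app
    return "unknown"


print(classify({"dst_port": 6881, "payload": b"\x13BitTorrent protocol..."}))
```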
With this baseline in place, application control can begin. This allows the operator to allocate sufficient resources to promote or demote an application. The operator can classify traffic and assign actions to each class to create network rules, or policies. For example, if peer-to-peer traffic is hindering network performance, a pre-set application control policy can segment this traffic to a portion of the network, opening up more bandwidth for critical applications, such as third party VoIP or online gaming. Application types can also be set to maintain a constant maximum level of bandwidth. And the level of detailed information provided by DPI means that network deficiencies can be spotted and corrected. As a result, the operator can provide the best customer service when trouble arises, limiting complaints or speeding up response times.
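Continuing the sketch, application control can be expressed as a policy table that maps each classified traffic class to an action; the classes reuse the labels from the classifier above and the figures are illustrative only.

```python
# Sketch of classify-then-act: each traffic class maps to a policy action
# such as a bandwidth cap or a priority queue. Numbers are illustrative.
POLICIES = {
    "bittorrent": {"action": "rate_limit", "max_kbps": 512},   # segment P2P
    "sip_voip":   {"action": "prioritise", "queue": "expedited"},
    "gaming":     {"action": "prioritise", "queue": "expedited"},
    "default":    {"action": "best_effort"},
}


def apply_policy(app: str) -> dict:
    """Look up the network rule for a classified application."""
    return POLICIES.get(app, POLICIES["default"])


# P2P gets segmented to a slice of capacity; VoIP gets the expedited queue.
for app in ("bittorrent", "sip_voip", "webmail"):
    print(app, "->", apply_policy(app))
```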
In a PLM context, DPI provides the means to assess which subscribers are utilising which services. Armed with this information, it's possible to begin honing service offerings. And we're not just talking about tiering services into gold, silver and bronze, but providing individual subscribers with tailor-made offerings on a mass scale. Subscriber-managed services can be used to target specific groups of subscribers, such as voice-over-IP (VoIP) users, gamers, businesses, high-bandwidth users, casual users, peer-to-peer (P2P) users, P2P-free users and others.
Yet subscriber management can go even further, providing the operator with opportunities to upsell. Thanks to its ability to meter usage, enforce quality of service and generate usage statistics and accounting records, DPI traffic management can enable the operator to deploy joint revenue models with alternative providers or non-provisioned services such as gaming and VoIP applications. Symbiotic business models will begin to emerge. Network operators may provide a well-known service with free carriage and even prioritise the application in order to benefit by association. So we may see small ISPs advertising five-star QoS for Skype to attract new subscribers, for example. Alternatively, a start-up may contact a large-scale network operator requesting that its new innovative application be prioritised. In this scenario, the content provider would pay the operator to gain access to a guaranteed large subscriber base.
Using an individual example, the operator might find that Joe Bloggs is a keen user of a third party VoIP service such as Skype or Vonage, which can be sensitive to delay and jitter. Rather than throttling the capacity for this service, which could damage brand perception, the service provider has several options. It can offer the user the option of converting to its own VoIP service, perhaps at a lower rate for the calls to the US that Joe regularly makes. It could suggest he pay an additional £5 per month to prioritise the third party VoIP service he is already using, to ensure the quality of the service. Or the operator could collaborate with the service provider and provide QoS in exchange for royalties from the application. This proactive approach enables the service operator not only to capitalise on a subscriber's existing usage but even to gain revenue from services it doesn't control.
Mapping the connection between individual subscribers and the services they consume creates the ability to mass customise. Through real-time and long-term monitoring and reports on subscriber data, operators can identify and develop the most profitable services or tiered service plans. Services can be developed and brought to market in a matter of days rather than months. Information is sent back to those OSS systems that interface with the Subscriber Management Platform, conveying data usage statistics for instantaneous charging and provisioning.
With the number of services and applications expected to increase exponentially over the next few years, issues over bandwidth consumption and quality of experience (QoE) will arise. The race to the top spot will no longer simply be about access to services. The battle for the consumer will be fought over the quality of content and services. Operators will need to balance QoE for individual users while retaining control of network usage and cost: both require strict service management.
A subscriber-aware network is the answer: a network capable of intelligently identifying its subscribers based on their unique subscriber ID, regardless of the subscriber's dynamic IP or inconsistent connectivity. Operators that can identify individual or groups of subscribers and dynamically apply policies or rules based on use or other network patterns will be ahead of the crowd. By monitoring trends, the operator who can offer subscriber service options, service level agreements, packages and prices refined constantly to meet demand will be able to reap the benefits of rising ARPU and customer satisfaction.

Antoine Guy is Marketing Director, EMEA, Allot Communications, and can be contacted via tel: +33 (4) 9300 1167; e-mail: aguy@allot.com www.allot.com

IMS has gained substantial momentum as the standardised architecture for the convergence of communications service delivery. But many early adopters have taken a network-centric approach to the benefits of IMS deployment rather than taking a holistic view of the strategic opportunities, says Simon Dadswell

IMS is set to be a big numbers game if industry forecasts are correct. Global industry analyst Stratecast estimates in its report Next Generation OSS for IMS and Next Generation Networks - Now! (December 2006) that capital expenditure on IMS infrastructure will rise from US$179 million in 2006 to over US$10 billion in 2010. Informa takes a more conservative stance, forecasting worldwide capital spending on IMS infrastructure to reach US$4.5 billion in the same period, according to its report IMS Opportunities and Challenges: Fixed and wireless market outlook (March 2006). Others believe there will be a delay in market acceptance until the success of early adopters becomes visible, followed by a significant ramp up in 2009 or 2010. Regardless of the accuracy of these forecasts, most industry commentators accept that virtually every major telecom service provider is now including IMS in one form or another in strategic planning. The industry has for some time been working towards the adoption of 'Triple Play', 'Quadruple Play' and 'Fixed-Mobile Convergence' (FMC) - all of which are enabled by the potential of IMS. IMS helps to standardise protocols to eliminate proprietary technology and enable interoperability for all IP-based services, and provides the opportunity for ubiquitous access for users regardless of service, terminal type or location.
Telecoms service providers are undergoing a period of dynamic change in the provision of services, so they need to have strategies to react to change effectively – and adoption of IMS is a key component. A concern is that current approaches to IMS may end up missing significant opportunities for the future. That is why it’s important to highlight the key characteristics that IMS must possess to secure the right business results for the telecoms sector.
The first consideration is that IMS is usually linked only with the short-term issues of deploying new services cost-effectively. Many service providers take the approach that IMS is purely a network engineering issue, and their IMS adoption business case is based only on OPEX reduction for network and service management.
The challenge is broader than that. If cost reduction is the basis for the business case then the adoption of IMS and resulting competitive strategies would proceed incrementally at the pace of normal equipment replacement cycles. While network consolidation and saving on operating expenses is an admirable objective for a service provider, new competitive threats from the Internet based world are fast emerging. The battle for the consumer will increasingly be fought over content, services and brand equity, not over access or network capacity. Capacity demand is unlikely to significantly ramp up in the short term and the major ‘New Age’ telcos present a clear and present threat to future revenue as they begin to redefine the direction of telecom and broadband services. For established service providers IMS implementation has to be more about the introduction of new multimedia services, P2P services like video sharing, and innovative service bundling.
The second consideration is that IMS needs to be recognised as a horizontal technology offering potential benefits across the service provider's entire portfolio, and linking the world of the Network Management department and the IT Operations department for both wireline and wireless services. This adds complexity to the process of creating a compelling business case for the entire organisation and can result in departmental battles between IT and network engineering. If a Network Equipment Provider bundles the key IMS Charging technology with its next generation equipment to enable the network engineering department to retain ownership of IMS mediation projects, then how does this affect organisational strategy? Service providers should consider a pragmatic approach to IMS adoption by exploiting their existing technology investments and gradually replacing legacy network and OSS/BSS elements. In addition, online and offline charging have very distinct business requirements. Service providers should consider the deployment of proven IMS Charging technologies with the flexibility to handle rapidly evolving business models without compromising service capabilities at the expense of the network.
The third consideration is the evolution of triple-play and quad-play services and new revenue generating opportunities. IMS deployment helps service providers integrate discrete fixed and mobile network technology to deliver fixed-mobile convergence (FMC) and reduce time to market for new services. However, many service providers need to do more research to determine what sort of services customers want to receive. The concept of convergence is finally becoming a reality but there are several different options to reach this particular goal. It is essential that service providers focus on the revenue generating potential of new services as a key driver to FMC by exploiting IMS as a means to obtain a larger share of the customer's wallet. Furthermore, the adoption of IP based communications and value added service strategies will open up new wholesale business opportunities for service providers, including integrated transport solutions, multiple platforms for customised solutions and a wider services portfolio.
Currently Voice-over-IP (VoIP) appears to be the service most frequently targeted for initial IMS deployment and is sometimes enhanced with IMS-related application services like personalisation, presence or special routing. A typical example of these features in combination would be a laptop-housed soft phone application operating as an alternative to a mobile phone. When the user is logged in to the laptop, incoming calls could be routed to it instead of the mobile handset, subject to the subscriber-controlled presence settings, similar to those found on Instant Messaging services. Outbound calls could be made, and rated, as if they originated from the subscriber’s home service location, regardless of the current location of the laptop. The laptop phone’s address book would be automatically synchronised with the mobile handset. In addition, IMS services, as experienced by the user, can be created from a palette of network and application service components including voice, presence, personalisation, content delivery, content sharing and blogging, various forms of messaging (chat, push-to-talk, e-mail, calendaring), synchronisation of subscriber-owned data across devices; and seamless delivery of all these components across traditional network boundaries – the options and complexity ensure that to follow a proprietary network based charging approach could risk a service provider being locked into a very limited set of service options.
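The presence-based routing in that scenario reduces to a simple decision, sketched below with invented presence states and endpoint names.

```python
# Sketch: route an incoming call to the laptop soft phone or the mobile
# handset according to subscriber-controlled presence settings.
def route_incoming_call(subscriber: dict) -> str:
    """Pick the delivery endpoint for an incoming call."""
    presence = subscriber["presence"]           # set by the user, IM-style
    if presence == "do_not_disturb":
        return "voicemail"
    if subscriber["laptop_logged_in"] and presence == "at_desk":
        return "laptop_softphone"
    return "mobile_handset"


print(route_incoming_call({"presence": "at_desk", "laptop_logged_in": True}))
print(route_incoming_call({"presence": "mobile", "laptop_logged_in": False}))
```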
The fourth consideration is one of differentiation. IMS is about offering new opportunities in an open, standards-based framework - enabling rapid and cost-effective 'plug and play' capabilities for multimedia services, regardless of service delivery platform types and end-user devices. However, equipment and application vendors are attempting to differentiate services outside that framework and are enforcing their own extensions to provisioning and charging standards. For IMS to work properly and deliver to its true potential it is important to ensure interoperability and facilitate the progress of open interfaces and standards in an all-IP world. Differentiation comes from a compelling mix of services, flexible charging options and consistently high quality, multi-channel customer care - not from the deployment of proprietary technology.
Service providers are starting to invest in IMS. They are taking into account how they can benefit from IMS in the short term to streamline network architectures and to reduce operational costs. While this may help to reduce OPEX, the strategic opportunity is to enable new competitive revenue opportunities associated with the growth in multimedia, content and FMC services.
If service providers are to make the shift from network-centric to service-oriented organisations, it is paramount that they invest in highly adaptive infrastructures to exploit the full potential of IMS and next generation BSS/OSS. By so doing they will be able to charge, bill and manage a wider range of complex services – services that will have increasingly shorter lifecycles, intense competition, increasing price pressures and will require back office systems to manage them profitably and efficiently.

Simon Dadswell is Advanced Solutions Marketing Manager, Intec Telecom Systems

Mobile operators need to adopt an approach that allows them to target specific users with relevant advertisements, while, at the same time, managing the overall ad-related user experience, including user privacy and charging aspects, explains Danny Kalish

For many years, advertising has been the driving financial force behind all successful media such as TV, Internet and newspapers. Nowadays, with falling subscriber voice revenues and fierce competition, there is a growing interest in mobile advertising among mobile operators as a potential new revenue source that does not originate from their subscriber base.
The mobile device is potentially the most effective advertising medium ever developed. It is personal, always-on and enables precise targeting of advertising based on contextual information such as location, user profile and usage history. A recent report from Strategy Analytics predicts an incredible potential for this market and suggests mobile advertising could account for up to 25 per cent of worldwide Internet advertising expenditure by 2011. Mobile advertising as a revenue stream is rapidly increasing, with the same report predicting that mobile Internet advertising will grow from $1.4 billion this year to $14.4 billion by 2011.
Will mobile advertising be another over-hyped industry segment or can it actually deliver genuine value to consumers?
Mobile advertising is one of the few effective, user-centric advertising media where user experience is a key success factor that should be maintained across all delivery channels. Operators need to ensure they develop a viable and sustainable business model that does not alienate consumers and damage both their own reputation and that of the brands using the service.
When TV viewers are faced with a poorly targeted advertisement during their favourite programme it is unlikely that it will damage their perception of the TV service provider. However, if mobile users continually receive annoying advertisements on their device, they are likely to place the blame on the operator.
Deciding when and which subscribers should be exposed to advertising is complex.  It depends on many factors, such as the subscriber’s price plan, usage context, opt-in procedures and regulations. An agreement needs to be reached between customers and their service provider to reflect the operator’s obligation to protect users’ privacy in exchange for the subscribers’ consent to being exposed to targeted advertisements. Ensuring customer satisfaction can be a complex process; a negative experience may reflect badly on the service provider and possibly cause the consumer to refuse advertising services.
In addition to guaranteeing the best user experience, mobile operators need to make sure advertising becomes a win-win situation. This can be achieved by selectively choosing which subscribers will receive advertising and serving them with advertisements that genuinely bring them value. Operators should offer subscribers full opt-in and opt-out options, and grant them various financial benefits in exchange for their acceptance of receiving advertisements. After all, the willingness of subscribers to receive ads on their devices is essential.
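In code, such a consent-first gate might look like the following sketch; every field name and plan type is an invented assumption.

```python
# Sketch of an eligibility gate: only subscribers who have explicitly opted
# in, and whose plan and local regulation allow it, are ever served ads.
def eligible_for_ads(subscriber: dict) -> bool:
    """Apply opt-in, price-plan and regulatory checks before any targeting."""
    return (subscriber.get("opted_in", False)            # explicit consent
            and not subscriber.get("opted_out", False)   # opt-out always wins
            and subscriber["price_plan"] in {"ad_funded", "discounted"}
            and subscriber["age"] >= subscriber["regulatory_min_age"])


sub = {"opted_in": True, "price_plan": "ad_funded",
       "age": 24, "regulatory_min_age": 18}
print(eligible_for_ads(sub))  # True: consented and within the rules
```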
Targeting the advertisements to each and every individual user is important in maintaining a satisfying experience, but there's more to it than just receiving a suitable advertisement. The advertisements also need to be adapted to the mobile device and access channel so that they are clearly presented without damaging the user's experience and interests. Providing intuitive advertising services with interactive actions, such as context-aware 'click to call' or 'click for info' options, will further support a positive experience.
If an advertisement is non-intrusive, delivers value, protects the subscribers’ privacy, stimulates interest and is relevant to the consumer, there will be a higher propensity for service adoption.
To achieve this, mobile operators need to be 100 per cent clear how many 'real' customers they have and ensure the quality of the information held on each user. Names and addresses alone simply aren't adequate. Operators need to be well acquainted with user interests and preferences to ensure they understand the user in every way possible and can therefore customise every single interaction. Subscriber profile information should be collected from all existing sources (e.g. billing, CRM, etc.). Building a dynamic user profile, based on the user's behavioural information, including content consumption and reaction to past campaigns, will further ensure that subscribers are exposed only to relevant advertisements that offer them real value.
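A sketch of the dynamic-profile idea follows: behavioural events update interest scores, which then drive a simple relevance check before an advertisement is served. The weights and categories are invented for illustration.

```python
# Sketch: fold behavioural events into a profile, then score ad relevance.
def update_profile(profile: dict, event: dict) -> None:
    """Fold a behavioural event (content viewed, ad clicked) into the profile."""
    interests = profile.setdefault("interest_scores", {})
    weight = 2.0 if event["type"] == "ad_clicked" else 1.0
    interests[event["category"]] = interests.get(event["category"], 0.0) + weight


def ad_relevance(profile: dict, ad: dict) -> float:
    """Higher score = more relevant; only sufficiently relevant ads are served."""
    return profile.get("interest_scores", {}).get(ad["category"], 0.0)


profile = {"sources": ["billing", "crm"]}  # static data seeds the profile
update_profile(profile, {"type": "content_viewed", "category": "football"})
update_profile(profile, {"type": "ad_clicked", "category": "football"})
print(ad_relevance(profile, {"category": "football"}))  # 3.0
print(ad_relevance(profile, {"category": "opera"}))     # 0.0
```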
The current mobile content shopping experience is challenging for teens and young adults, but it is frequently inadequate when compared to similar services on the PC. However, there is huge potential for advertisers to exploit young mobile customers who are used to being targeted with advertising messages. This segment is more likely to embrace personalised advertisements and awareness campaigns, as long as the content is relevant, and they are offered targeted service benefits for viewing these adverts on their mobile devices. If this is done successfully, there are some inherent benefits for the operator as it is an excellent way to build loyalty and “stickiness”. Service benefits are also an effective way to take the usage of mobile services to the next level by subsidising premium content and lowering the service consumption cost barrier.
For example, the number of users willing to pay for mobile TV services is a small fraction of the total number of users who want to use the service for free.  Subscribers should also be offered benefits for accepting advertisements in the form of discounts or free services. Using this approach, operators may increase their users’ satisfaction and eventually turn mobile advertising into a win-win situation.
In addition to creating a positive user experience and targeted campaigns, operators need to take a mobile-centric, holistic approach. Whether using a partner responsible for media selling and campaign management, or centralising the core service system and integration efforts internally, operators would still need a centralised advertising system that allows them to protect their valuable assets and maximise the mobile advertising potential. Only a centralised system that handles the operator-side business logic, communicates with the operator's IT systems (subscriber data, context information and access channels), and at the same time manages the rich context, user experience and targeting of the advertisements, would be able to achieve this goal.
Mobile advertising presents a huge opportunity for mobile service providers. By leveraging their current assets and positioning themselves as a key player in the advertising value chain, the mobile advertising business opportunity can be maximised while substantially reducing failure risks. To fully exploit its potential for revenue growth, operators need to realise that user experience and flexibility are the key success factors. With a high level of personalisation, targeting capabilities and openness, operators will realise the rich potential of mobile advertising and ensure its success.

Danny Kalish is CTO, Unipier
www.unipier.com

Oren Glanz looks at the future for mobile music, and what operators need to consider to ensure they don’t get left behind

Music fans have always been proud of their collections and where, in the past, they used to compare their massive vinyl collections in metres, gigabytes are now a better means of comparison. Statistics from the International Federation of the Phonographic Industry's 2007 Digital Music Report demonstrate the growth of digital music, showing an estimated $2 billion worth of music being sold online or through mobile phones in 2006, accounting for 10 per cent of music sales. A Strategy Analytics forecast for Western Europe shows that in 2006 there were almost a million active mobile music users, and predicts that by 2010 this figure will have risen to 2 million users - which has made mobile operators take note and place even more faith in the mobile delivery channel.
Usage of, and interest in, mobile music is clearly exploding, so how can operators ensure their subscribers become addicted to downloading music?
With so many devices on the market there is the potential for mobile music users to be quick to churn when a service or device doesn't live up to their expectations - very often without contacting their operator. These customers do not contribute to the operator's top line, and can tarnish the reputation of mobile music amongst their peers and fellow subscribers. This is where the challenge becomes difficult, as most service providers do not have a clear view of their customers' preferences or the barriers to adoption they face when trying to use a service. They do, however, now accept that customers will not make contact to highlight any barriers to service.
It is important to note that, even when a subscriber has successfully overcome the initial barriers to adopting a service, there are still obstacles to turning them from an occasional user into an active one. It is vital for mobile operators to understand that identifying problems with delivery and usage is only part of the issue; encouraging new users and increasing their consumption is imperative, and is key to a successful mobile music service.
Subscribers trying to access mobile music services can encounter various technical, training, value and compatibility obstacles, which can reduce the likelihood of their becoming a habitual user.  These include:
Digital rights management (DRM) - creates major barriers within a music download service, as the mechanism for DRM can vary from the complete separation of the music content from the DRM file to a lock on forwarding tracks. Another major issue is variance between handset models - certified vs non-certified operator handsets.
Navigation - concerns tracking all aspects of a subscriber's journey through a music service, from entry through to identifying valued content. Experience shows that navigation paths are not always clear; subscribers cannot find the content they are after, causing them to quickly lose interest and abandon a service.
Barriers to usage - these are also common, relating to both handset and music service usage. Customer behaviour is very intuitive, with most individuals happy to try new services, but the majority will give up once they encounter a barrier to usage. The sheer complexity of many new mobile services is sometimes the cause.
Functions such as streaming and downloading can often be confusing. Subscribers tend to struggle with how to search, locate, select and replay after they have downloaded a track - especially mass-market, non-technical and inexperienced users.
This leads us to far simpler barriers, but ones that are ultimately the largest barriers to subscribers becoming addicted to mobile music on their handset.  Usability, training, interest and other user experience barriers are just as damaging as any of the previously mentioned issues.  Mobile operators need to look at how they can ensure such customer problems are solved before they impact on the user experience.
So, with so many barriers to the success of mobile music, why use a mobile phone as a music delivery method and player rather than a standalone player? The obvious argument is the need to carry only one device, and the advantage mobile operators hold here is that consumers are already addicted to their devices and to other services such as SMS. But how can operators change this device addiction into a mobile music addiction?
The answer is simple - mobile operators need to improve their subscribers' user experience when using music services, which, in turn, will drive satisfaction.
Understanding users and delivering exceptional customer service is just as important a part of the mobile experience as the latest technology and the size of the marketing budget. It can be the key differentiator for a business.  Too much time and money is invested in getting products to market quickly rather than getting products to market efficiently.  The objective is to provide the best mobile experience for each individual user.
To do this, mobile operators need to build an end-to-end view of their music service from all contributing systems and content.  This is not as easy as it might seem, as although common elements are contained within most music services, many aspects of delivery, billing and content are unique to each operator - which affects subscriber behaviour, usage patterns and other adoption barriers.  The simplest way to achieve an end-to-end view is by approaching the problem with Service Adoption Management (SAM) tools.
SAM treats adoption and usage from a holistic point of view, taking into account all technical and non-technical issues which affect adoption and usage levels.  It provides an operator with a clear insight into the groups enabled to use mobile music and presents the operator with the information to enable them to encourage a user to download songs on their phone.  For example, SAM tools allow an operator to understand which handset a user is using, what they are interested in and at which stage of adoption they are.  If a user can be seen to search for a song but stops, and does not download it, then pricing could be the barrier and the mobile operator can review this.  If a user is only downloading one song a month, then the operator should try to encourage greater usage by offering promotions.
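Expressed as code, such rules are straightforward; the sketch below uses invented thresholds and is not Olista's actual product logic.

```python
# Sketch: infer a likely adoption barrier from a user's observed
# music-service events, following the rules described above.
def diagnose_barrier(user: dict) -> str:
    """Map observed behaviour to a probable barrier and a suggested action."""
    if user["searches"] > 0 and user["downloads"] == 0:
        return "possible price barrier -> review tariff or offer a promotion"
    if user["drm_errors"] > 0:
        return "DRM/handset compatibility issue -> proactive care contact"
    if 0 < user["downloads_per_month"] <= 1:
        return "occasional user -> targeted promotion to build the habit"
    return "no barrier detected"


print(diagnose_barrier({"searches": 5, "downloads": 0,
                        "drm_errors": 0, "downloads_per_month": 0}))
```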
By using these principles and analysing subscriber information (such as billing, CRM, customer care and network or platform services) it should also be possible to track other key adoption traits such as social networking, campaign effectiveness and early service launch measurement.
Armed with this highly granular insight into virtually any aspect of usage, and the ability to identify and remedy usage barriers, even those which have never before arisen, managers in charge of mobile music marketing and content can make more informed decisions and better target offerings and campaigns. By providing deep real-time visibility into subscribers' interactions with mobile music, it is possible to stimulate customer loyalty, increase mobile music usage and generate meaningful profits.
Operators can immediately see how mobile music services perform, how campaigns are working, and the experience of both individual and/or larger groups of users. This detailed real-time information can then be used to make mobile music more effective and attractive, and to proactively contact customers who need training or an upgrade to use services successfully.
Customers enjoying a positive first experience of new services will be more likely to use them again.  If services do not work or perform properly, operators can proactively contact these customers to solve the problem, offer assistance or offer refunds.  By treating customers positively in this way, users are more likely to use the service again and stay loyal.
This will be measurable by the increase in number of music tracks downloaded, the growth of active users, reduction of streaming terminated sessions, reduction of DRM related problems or in the number of music related calls received by customer care.
The mobile industry needs to be less focused on the technology and should instead proactively explore the real needs of users.  When setting their objectives, service providers should also consider the impact of customer satisfaction on their brand and image.  Providing a higher quality customer experience will also help service providers to differentiate themselves from their competitors, increase loyalty and attract more customers.   
Understanding and improving the user experience, not just for customers who have become addicted users of mobile music, but more importantly, for those who never get past their first failed usage attempt, is essential. SAM tools revolutionise the whole business process of value added services such as mobile music and allow the operator to understand what is actually happening to the user. By focusing on removing some of the frustration that is inherent in today's users' experiences, the road should be clear for mobile operators to succeed in reaching the mass market with mobile music.

Oren Glanz is CEO of Olista

Andrea Casini explains how operators can look to grow their existing operations in the face of a tough market environment

It's common knowledge that the mobile phone market in developed countries is highly competitive, and opportunities for revenue growth are not as clear-cut as they once were. Mobile operators need to grow the potential of their network in order to grow the business and survive, rather than simply rely on finding additional subscribers. Evolution, it seems, is the only way forward. Having accrued huge customer bases, operators are under pressure to keep pricing low in order to keep churn to a minimum.
However, new technologies such as mobile VoIP are poised to change the established income model. To be seen as a mobile phone company alone is no longer good enough. In the consumer market, applications like the mobile Internet and content such as ringtones are big business; services such as mobile TV have yet to take off but are also potential revenue generators.
Meanwhile, enterprises are increasingly offering mobility as an option to their workforce. There is now a plethora of mobile technologies in, or entering, the marketplace - all vying for enterprise business. Mobile operators looking to take a slice of the lucrative enterprise mobility market need to compete with fixed line technologies such as VoIP using Wi-Fi or WiMAX. Enterprises that opt for mobile services to support their workforce will expect unprecedented reliability.

A crowded market
We’ve reached a stage of mobile phone ubiquity. There is an expectation, from both leisure and business consumers, that their handset will be able to find a signal wherever they are. Currently, places that are densely populated or experience large spikes in usage, such as city centres or football stadiums, can suffer a drop in service with patchy coverage and call drop out. As a high number of users all vie for a signal in one small area, reception quality will inevitably suffer.
Delivering consistent service in enclosed public spaces is further complicated by the fact that leisure customers will tend to use entertainment applications such as mobile TV and gaming when they're sitting indoors, making in-building penetration necessary. Business users will also spend a significant proportion of their time accessing e-mails and transferring data whilst indoors. The border between private and business users is becoming increasingly blurred, with people trying to use similar services and data rates.
Analysys predicts that half of all telephone calls made across Western Europe in 2008 will be made on a mobile phone. A tall order, admittedly, and one that will only be possible if mobile operators evolve to embrace the need for in-building technology.
The very nature of 3G signalling makes it difficult to penetrate areas such as office blocks and tunnels. In addition, high-rise buildings will require additional capacity and bandwidth due to the higher number of users. Resolving the problem of poor reception in built-up areas and at large events, by embracing in-building wireless coverage and capacity, has become an important service differentiator for wireless carriers.
In-building solutions can vary in power, from high capacity blanket coverage in shopping malls through to limited home capacity boosters. Products such as distributed antenna systems and RF repeaters are designed to guarantee wireless coverage in highly populated areas and provide cost-effective, common infrastructures. This infrastructure will support all of the various standards, while providing a high level of omnipresent coverage to mobile phone users. If there is not a seamless transition to 3G, its adoption could be threatened by alternative fixed-line services.

The future of in-building
In the consumer and enterprise market, femtocells and picocells could, in the future, help with coverage problems at home and at work. These products transmit mobile signals over a LAN and have a more limited range than large-scale in-building products. Picocells are a higher power indoor basestation alternative to femtocells.
In the UK, Ofcom recently auctioned off 3.3MHz of the wireless spectrum, equating to a series of twelve low power licenses allowing both fixed line and mobile operators to compete for the option to install picocells in buildings and public spaces. Deployment has been slow, as operators appear reluctant to embrace the potential revenue opportunities. However, O2 is due to launch picocells later this year for roughly €100 each.
While femtocells show promise for solving some indoor wireless coverage issues, the technology is still in the early stages of development. Deploying femtocells for individual buildings could, in the long term, reduce operator costs and allow operators to offer smart tariffs. The first femtocells have gone on sale in the UK and are currently being promoted as a way to boost mobile reception at home, but the market is still in its infancy.  
The idea of dual tariffs, using femtocells, will mean users can get the best deal when using their phone at home or out and about. New handsets could automatically hand over from a basestation to a femtocell as users come into range at home. Operators may also offer femtocells as part of a subscriber package. Vodafone is already trialling the technology and this is one step on from the current BT Fusion system, which is reliant on Wi-Fi. Femtocells could even see mobile operators grab a share of the home phone market by offering blanket coverage.
Femtocells and picocells will effectively remove the need for WLAN and other fixed/wireless offerings. Currently, technical issues such as standardisation and integration with existing networks need to be resolved before these products gain wide acceptance. However, by offering devices that boost coverage, operators could eventually look forward to increased revenue, as their services become more widespread and readily available. This could also prevent churn.
The in-building concept is not new. Andrew Corporation has already deployed many in-building solutions to provide indoor wireless coverage. The company deployed an ION™-B fibre distributed antenna system (DAS) at Dallas Fort-Worth International Airport Terminal D, its parking structure and the nearby Grand Hyatt Hotel to extend wireless coverage to customers. The ION-B uses head-end equipment that interfaces with multiple, co-located operator base stations.
Providing blanket mobile coverage and a set of competitive tariffs to match will mean mobile operators will finally be in a position to claim glory. Users who can make important calls and access data wherever they need to are less likely to look to another operator. In the UK, O2’s churn for contract customers was 23 per cent for the year ending December 2006. Coupled with a £2 lower average revenue per user (ARPU) over 2006, the financial cost of losing customers and paying to retain them in an increasingly competitive market is high.

Location based services
Growing the potential of a network through in-building is only one side of the story. Location Based Services (LBS) are already offered by some mobile phone networks and are a way to generate significant revenue for operators. From child-finder services to tracking enterprise equipment and location-targeted advertising pushes, there is money to be made from LBS.
In the UK, Ofcom has stipulated that by early 2008, any VoIP service allowing users to make calls to ordinary phone numbers must also be able to make calls to the emergency services. Research by the communications watchdog has revealed that 78 per cent of VoIP users who cannot use their service to call 999 either thought they could, or did not know whether they could. This development means that geo-location will be provided on a “push” basis (where information is sent to the user without them explicitly requesting it) and it will be done at the operators’ expense.
The cost of forced geo-location can be offset in the long term with other LBS options, however. Currently, the solution is primarily used as a way to send custom advertising and other information to mobile phone subscribers based on their current location. Major operators, focusing on recouping the cost of 3G licenses and falling voice revenues, have yet to exploit the full potential of LBS. Yet this is set to change with a wave of new lower cost devices, including the Nokia N95, that offer LBS capabilities. In fact, ABI Research predicts that by 2011, the total population of GPS-enabled location-based services subscribers will reach 315 million, up from 12 million in 2006. This represents a rise from less than 0.5 per cent of total wireless subscribers today to more than 9 per cent worldwide at the end of the study's five-year forecast period.
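As a back-of-the-envelope check on the ABI figures, and assuming the quoted percentages refer to the worldwide wireless subscriber base, the implied totals are easy to reproduce:

```python
# Implied worldwide wireless subscriber base behind the ABI Research forecast.
# Assumes the quoted percentages apply to total wireless subscribers.
lbs_2006, share_2006 = 12e6, 0.005   # 12 million users, just under 0.5 per cent
lbs_2011, share_2011 = 315e6, 0.09   # 315 million users, just over 9 per cent

print(f"Implied 2006 subscriber base: {lbs_2006 / share_2006 / 1e9:.1f} billion")  # ~2.4
print(f"Implied 2011 subscriber base: {lbs_2011 / share_2011 / 1e9:.1f} billion")  # ~3.5
```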
The business case, as demonstrated by Nextel in the US, which currently holds a 53 per cent share of the LBS market, is there. Operators, with an influx of capable hardware, should look at LBS in a new light. It has been over-hyped in the past, but it is another way to generate revenue with a relatively low outlay.

The growth potential
The mobile phone market is changing. Operators have succeeded in establishing huge customer bases, to the point where phones now outnumber people. Yet, in the mature markets of Europe and the US, there is a need to look beyond voice and data services in order to prevent churn and drive revenue.
In-building, in its simplicity, overcomes one of the most common reasons for a customer changing operator: lack of signal. Meanwhile, the market for LBS and geo-location has been over-hyped, but with an influx of new devices and increasing operator confidence it will be possible to overcome cynicism and make a positive business case to consumers and enterprises alike.
After all, we are in an age of convenience, and in-building solutions and LBS are about to make everything that little bit easier.

Environmental and radiation issues
Electromagnetic compatibility and the health hazards of exposure to non-ionising radiation have always been hot topics, and the issue has flared up again following a recent article about interference from mobile phones with medical equipment in hospitals.
As every RF engineer knows well - though many generalist reporters and much of the general public do not - 2G and 3G mobile terminals are power-controlled by the network, so as to ensure that networks function properly, interference is minimised, and battery life is prolonged for more air-time and subscriber convenience.
The dynamic range of power control extends over 30dB, a factor of 1,000, for GSM, and considerably further - around 70dB, a factor of 10 million - for UMTS terminals. This means more transmit power is requested on the uplink (up to 1 or 2 Watts) when radio coverage and visibility are poor (high path loss, high penetration loss, as in buildings served from outdoor cells), while very low power (down to 1mW in GSM, and much less in UMTS) suffices where coverage and antenna visibility are good - for instance with an efficient in-building system.
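The decibel figures translate into linear power ratios as follows; a minimal sketch of the arithmetic:

```python
# Convert a power-control dynamic range in dB to a linear power ratio.
def db_to_ratio(db: float) -> float:
    return 10 ** (db / 10)

print(f"GSM  30 dB range -> factor of {db_to_ratio(30):,.0f}")  # 1,000
print(f"UMTS 70 dB range -> factor of {db_to_ratio(70):,.0f}")  # 10,000,000

# With an efficient in-building system a handset can back well off its
# 2 W maximum: a 30 dB reduction alone takes 2 W down to 2 mW.
print(f"2 W reduced by 30 dB: {2 / db_to_ratio(30) * 1000:.0f} mW")
```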
Moreover, specific parameters can be set at particular locations so that uplink power never exceeds a pre-set value, preventing out-of-control interference with the operator's own network or with other services.
And a well-designed in-building system will have several distributed low-power antennas (a DAS), with power in the milliwatt range and radiation levels well below even the most stringent EU country-specific limits (0.5 V/m).
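For context, a field-strength limit of that kind converts to a power density via the impedance of free space - a standard far-field approximation, sketched here:

```python
import math

# Far-field power density from an electric field strength:
# S = E^2 / Z0, where Z0 = 120*pi ohms is the impedance of free space.
Z0 = 120 * math.pi  # ~377 ohms

def power_density_w_per_m2(e_field_v_per_m: float) -> float:
    """Power density in W/m^2 for a given field strength in V/m."""
    return e_field_v_per_m ** 2 / Z0

limit = 0.5  # V/m, the stringent country-specific limit cited above
print(f"{limit} V/m corresponds to {power_density_w_per_m2(limit) * 1e3:.2f} mW/m^2")
# -> 0.66 mW/m^2
```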
Actual results from implementing in-building coverage show local RF power density decreasing by three orders of magnitude, with previous interference issues eliminated.
In-building coverage is therefore the only solution that ensures, on top of seamless communication and proper service for the network and its subscribers, the highest degree of EM and radiation compatibility - both with delicate equipment, such as navigation instruments in aircraft and medical equipment in hospitals, and with human beings.

Andrea Casini is VP EMEA Sales & Marketing,
Andrew Corporation

Will the huge success of SMS be eclipsed by later generation services?  Not exactly, says Priscilla Awde

Virtually everybody does it every day because it’s a cheap, fast and an easy way to keep in touch and do business. No, we are not talking about making phone calls but sending SMS messages, which, after telephony, is the most successful and ubiquitous communications service ever. The big question is: is it threatened by next generation mobile instant messaging (IM), e-mail or multimedia messaging (MMS)? The consensus is that it is not, at least not for several years.
SMS may even have a negative impact on the success of advanced messaging services simply because it is integrated into all phones, and is known, liked and used by millions who see little advantage in switching to more expensive, if media-rich, alternatives. Mobile phones are automatically set up for SMS; it works without a data connection, is interoperable across networks and devices, and is international and reliable.
In Western Europe alone, Gartner estimates that over 180 billion SMS messages were sent in 2006 and expects continued growth, to 210 billion messages in 2011. In developing countries, where low incomes and high poverty levels favour cheap, easy-to-use devices and services, SMS is booming. In the last year India has added six million mobile subscribers each month, and growth in China is similar. In Africa roughly three per cent of the population have bank accounts, but around 20 per cent have mobile phones.
SMS traffic is used for secure micro-payments and to share information in all sectors, from agribusiness to health care and from fishing to financial services, as well as for voting in political and entertainment polls. Expats use voice SMS for cheap international connections, and it overcomes literacy and keypad barriers by providing a 'talk/listen' alternative to 'type/read'.
Deepak Natarajan, Head of Marketing, Roamware predicts that over 2.3 trillion text messages will be sent in 2010: “SMS is alive, kicking and thriving and the vast majority of the global population is interested in it. Operators have a huge incentive to continue supporting SMS because many organisations use it as a communications channel. New sticky applications/features make people reluctant to change.
“Low cost handsets have produced a revolution in communications, but the next step needs cheap, available, easy to use next generation devices. Within ten years there will be a big growth in MMS and mobile IM but prices are not being lowered sufficiently across the board to motivate users to move away from SMS.”
The situation is not helped by innovative value-added SMS features (storage, divert, print and filtering), which encourage usage, increase margins and reduce churn. "The original SMS infrastructure was store-and-forward, but now it is direct delivery; texts can be used for transactional services," explains Jeff Wilson, Chairman, Telsis.
Growth in advanced messaging has been inhibited by complicated set-up, relatively expensive smart phones and tariffs, and slow 3G roll out. Flat-rate pricing and bundling accelerate take-up, but the choice of messaging service depends on circumstances - each is applicable to different situations. "Price, interoperability, ubiquity and user familiarity always inhibit new services, but different messaging services will co-habit - growth in one increases traffic in others," says Jay Seaton, Chief Marketing Officer at Airwide Solutions. "Operators' revenues from SMS exceed all other messaging types combined. Carriers can add sticky enhanced SMS services over existing infrastructure quickly and cost-effectively, and are increasing SMS revenues from personalised services."
Andrew Moran, Head of Enterprise, Symbian agrees that take up is slowed by the lack of compelling consumer price plans: “If operators get the pricing right users will flock to new services driving up ARPU. Applications are available and the magic ingredients are there for a whole suite of messaging services if operators can hit on the right model.”
Mobile IM is perhaps the natural successor to SMS, and some analysts estimate the market will be worth $1 billion in 2011. It may find applications in customer service and could be useful for cross-selling and up-selling.
“Although it is a different type of communication, an on-going chat, personal IM complements SMS although it won’t be a substitute,” believes Gabriel Solomon, Senior VP at the GSM Association. “SMS will co-habit with new services – the top 20 per cent of consumers using multimedia services still want to talk to the rest of the population via SMS. One shot, fast SMS messaging will continue to be used for more and more applications as a secure system where pre-paid accounts limit fraud.”
Extending computer IM and e-mail functionality into the mobile environment is a natural progression suggests Doug Brackbill, CMO at Visto. “The desktop and mobile environments will merge to allow seamless communications: people will move between systems. Messaging services are not competitive but meet different needs.”
Although MMS is mainly used to send informal and impromptu photographs, it may merge with SMS as graphics or ‘emoticons’ are automatically added by servers.
MMS is important for mobile advertising where subsidies will change the whole messaging market. Although mobile spam is less acceptable than on PCs, tariffs will drop for people willing to accept non-intrusive, relevant, useful, multimedia adverts. The subsequent lower costs will likely drive advanced messaging take-up. SMS will however still play a big part in advertising because of the huge addressable audience - some companies see it as the most economical way of reaching millions.
"MMS has been disappointing, but we are starting to see more recent growth, especially for providers delivering content rather than personal messages," says Stephanie Pittet, Principal Analyst at Gartner. "Mobile IM and e-mail will become more important, especially in mature countries with high-end devices. It's a slow start, but operators are striking deals with IM providers, and flat-fee e-mail will be important for consumers. All are niche markets now. MMS won't ever reach the levels of SMS, which has high margins and which users expect to be central to all offerings."
Mike Short, VP R&D for the O2 Group, believes text-based SMS will remain popular until 2010 because of its convenience, familiarity, ubiquity, easy availability and the number of promotions relying on it. After that, he expects other capabilities to be integrated into handsets, especially e-mail, which makes the work environment mobile. "This is an evolution not a revolution: there will be higher adoption in some places than others. Convenience and reach will drive new services," says Short. "Customers do what's most applicable. We need to make it easy and give them the widest range of services possible."
Hilmar Gunnarsson, VP corporate development at Oz asks why operators launched MMS instead of just adding a picture facility to SMS. “Rather than marketing new services, carriers should evolve the underlying architecture to enhance and add new capabilities, giving users richer messaging experiences under the SMS umbrella. We need to think of things from users’ perspectives, keep things simple and easy to use. E-mail should be embedded into all phones and applications pre-integrated.”
The market is moving towards a situation in which people will not care which messaging system they use but will choose the most convenient and appropriate. Mobile unified messaging will give users a single in-box, address book and number, accessible via any device. "When 4G networks become available in 2015/16, IP will be the prime network connection and there will no longer be a clear split between MMS, IM or SMS - messaging will be converged," believes Aram Krol, market development manager, Acision.

Priscilla Awde is a freelance communications journalist

The promise of IPTV is fraught with dangers - from outages to poor quality pictures - but effective systems test and measurement could save the day.  Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde

Imagine this: virtually the whole nation settling in to watch the rugby world cup or European soccer final, and the television picture goes down for thirty minutes or freezes just as the home side is about to score. It may flicker at critical times, the sound may be unsynchronised, or users may be unable to change channels quickly. Perhaps the latest film released on Video on Demand (VoD) can be paid for but not downloaded, or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, needs no such imagining. Robert Winters, Chief Marketing Officer at Shenick Network Systems, predicts a riot should any one of these all-too-feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of outages and expect picture-perfect television. Guaranteeing the quality of service, and of each individual subscriber's experience, is, however, a major and often underestimated challenge for all service providers - especially in the IPTV environment, where lost packets, jitter and latency, combined with poor network architecture and an inability to scale, will all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications and are scaleable. The cable and satellite operators have also negotiated long standing agreements with major studios and other content owners.
Alan Robinson, CEO at Shenick, suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video and television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best-effort approach to service levels is no longer good enough in the IPTV world, where packet losses have become more serious than ever. User expectations have risen with exposure to digital video consumer electronics and DVDs, which provide high-quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators, and they have already delayed some early commercial IPTV launches. Others involve more technical issues, including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
 "It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployment of DSL broadband applications in which the quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss or interference with IPTV signals but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the best and right equipment, carrying out pre and post deployment and real-time network monitoring and testing are essential," explains Winters. "Operators must continually test the quality of subscriber's experience and monitor service assurance to deliver the best possible results."
Among the old but significant factors affecting service levels is the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these issues become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will, at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture can support and demonstrate how the existing network will react to application loads, both at launch and going forward.
Transmitting next generation services over legacy architecture raises constant challenges of scalability and, ultimately, performance - all problems that must be addressed at the earliest stages of an IPTV launch.
 "Prior to deployment operators must decide which vendor to use for IPTV; which set top boxes; DSLAM equipment; network components; routers; switches; core transport and encoders, among others, they will use," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues; the performance limitations? Will the network need to be re-architected to provide more bandwidth or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system diversifEye 4.0 gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions such as network surges. So operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
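Comparing the received, decoded stream against a reference copy is the basis of full-reference quality metrics. As an illustration of the arithmetic involved (not a description of Shenick's own scoring), peak signal-to-noise ratio can be computed per frame:

```python
import numpy as np

def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
    """Full-reference PSNR between two 8-bit video frames."""
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * np.log10(peak ** 2 / mse)

# Toy example: a frame and a lightly corrupted copy of it.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
noisy = np.clip(frame + rng.normal(0, 2, frame.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(frame, noisy):.1f} dB")  # broadcast-quality video is typically >35 dB
```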
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
 "We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
Testing early and often is a mantra operators ignore at their peril since it is difficult to debug problems in live IPTV deployments. Consistent low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
Changing camera angles, choosing what to watch, when, or having on-screen ‘chats' with friends are big drivers for IPTV but most are bandwidth intensive. Equally the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video/audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers, but operators are likely to see performance issues when they scale up to millions of customers. It is then that they may have to extend fibre deeper into the network, though fibre-to-the-home/curb/node (FTTH/C/N) architectures are becoming cheaper and increasingly feasible, especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively, operators may turn to WiMax as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea for instance access speeds of 100Mbps are commonly available and not expensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VOD, which may raise more contention/capacity problems especially if one person in the house is downloading a video whilst another is watching broadcast television.
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels or generate viruses making it important to test firewalls and simulate the effects of such attacks, in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.

Priscilla Awde is a freelance communications journalist

Laura Marriott looks at the way operators are able to marry new technologies, such as mobile TV, with mobile marketing initiatives in order to deliver the services subscribers want, in a personalised and convenient form

There are several key stakeholder groups in the mobile marketing value chain. Each has a different set of objectives and requirements. For the advertisers it is about their ability to implement highly-targeted, measurable marketing initiatives to a wide audience. For the operators it is about rewarding customer loyalty (or preventing churn), increasing average revenue per user (ARPU) and developing service differentiation. Consumers, meanwhile, demand compelling content delivered to them in a personalised and convenient manner.
All these players will benefit from the development of a prosperous mobile advertising industry.
The opportunity is certainly a large one. According to ABI Research, worldwide mobile marketing and ad spending will reach $19 billion by 2011, up from $3 billion in 2007.
For mobile operators, mobile marketing and advertising represents a relatively rare opportunity to create value from a non-subscriber revenue source. As most operators around the world still struggle to ramp-up subscriber usage of lucrative new data services, creating new revenue streams becomes ever more vital, especially as core voice revenues continue to decline due to price pressures.
There are five main types of mobile marketing content delivery: voice (e.g. IVR), mobile internet (e.g. WAP), messaging (e.g. SMS/MMS), video and television, and downloadable content (Java/Brew, etc.). Traditionally, the operator has used such delivery mechanisms to deliver "on-deck" content, which has required the content provider - or mobile marketer - to have a direct relationship with the operator. It was considered vital that the operator "owned" the customer and that this relationship was not compromised by third parties - other brands, advertisers or billing providers that could dilute the operator's relationship with its customer base. The downside of this model was that the operator shouldered the burden of marketing the new services, content was often limited or its potential unrealised, and it was difficult for third-party brands to truly leverage their value.
As a consequence, many of these early content-based initiatives were not compelling as far as the end-consumer was concerned, and a few failed to deliver the desired uplift in operator revenue.
Thanks to the rise of mobile marketing and advertising, this model is evolving. As operators become increasingly able to generate revenue from marketing initiatives, their mobile web sites can begin to operate more like a portal than a mere storefront. This means that off-deck and on-deck content can be combined to provide the consumer with an almost unlimited range of content and services.
These emerging business models are also being shaped by new mobile technologies, most notably by the rise of mobile broadcasting. Today the mobile TV market is a small one in terms of revenue and consumer adoption, but the industry predicts strong growth. The first nationwide mobile TV services went live in the US and Europe in 2007 after years of trials and customer usage studies.  Nevertheless, the technology is in its infancy as far as consumers are concerned: of the estimated 233 million handsets in the US, only around 10 million are able to access mobile TV. (Sources: m:metrics, eMarketer, Forrester, Yankee Group).    
As the two industries - mobile marketing and mobile TV - emerge, an opportunity has arisen for them to complement each other and stimulate further growth. A recent study by m:metrics, commissioned by the Mobile Marketing Association (MMA), found that among US subscribers interested in mobile TV services, 41 per cent said they would watch mobile adverts in order to access mobile TV services free of charge. Another 20 per cent said they would accept mobile advertising if it meant accessing services for a reduced fee.
Advertising within mobile TV environments can take many forms. As with traditional TV, advertising can occur before (pre-roll), after (post-roll) or during programming (including product integration or branded entertainment) - all formats most consumers will be familiar with. In addition, a number of mobile-centric advertising elements can be incorporated into mobile programming, including real-time text overlays, real-time image overlays (watermarks), premium-rate SMS (PSMS) links to encourage participation TV, and avatar branding alerts.
The FIFA World Cup in Germany in 2006 proved a fertile test bed for many of these emerging business models. Mobile Dreams Factory, a specialist in delivering mobile marketing campaigns and an MMA member, was the company behind one of the most successful: Vodafone's "Videogoals 3D", which delivered animated, 3D videos of goals scored during the matches to mobile devices within minutes of the real-time experience.
3D models of all the teams and players were created. When a goal was scored, the play was reconstructed and animated within seven minutes of the real-time activity. The video was converted to a mobile format that was sent to Vodafone clients and to Marca.com, a leading Spanish media group, for publication on the Internet.
Videogoals 3D was the most visited of Vodafone services during the World Cup and the most visited of Marca.com's internet services, with more than nine million videos viewed. Not only did Vodafone associate its brand with football via this high-appeal, unique product, but the company was also able to promote its Vodafone Live 3G content-based service, including driving MMS revenues. The service was recognised at the MMA's annual mobile marketing awards 2006 for its ability to integrate multiple cross-media elements to communicate and deliver on the service availability.
For advertisers, the World Cup represented an opportunity to reach a huge audience but there were two main problems: traditional advertising was often prohibitively expensive for smaller companies and the event itself was so saturated with advertising that some less well-known brands could easily get lost in the noise. Mobile TV provided a solution to both problems.
CEPSA, a leading gas and oil company in Spain, worked with the Mobile Dreams Factory to create a World Cup-related real-time videoblog. The company sent two reporters to Germany to provide additional information on the activities and training sessions of the Spanish team and to share other anecdotes.
This was considered the first time that a real-time video service, utilising 3G mobile devices, captured and simultaneously transmitted a signal to thousands of users during a major event such as the World Cup. Using 3G mobile devices, the service was able to capture and simultaneously transmit a video signal via the Internet while audio files were transmitted over the operator's network. The audio and video subsequently came together in a platform in Madrid. The system recorded all the live connections so viewers could view them later as time-delayed videos.
The real-time videoblog was innovative because it involved the convergence of digital media, using the mobile device both as a communication tool and as the medium itself. Integrating 3G technology with both mobile and internet services provided the consumer with an exceptional mobile experience, while extending the brand for CEPSA and connecting it with the FIFA World Cup. The service was given the Innovation Award for Creativity in Technology at the MMA's mobile marketing awards 2006.
The success of such pioneering services during the World Cup has seen them subsequently taken up elsewhere, to similar effect. The Videogoals 3D service, for example, has since been rolled out by Telefónica's Movistar in Spain, which successfully integrated Coca-Cola as the campaign advertiser for the service. The model satisfies every member of the value chain: Coca-Cola is able to reach Movistar subscribers; Movistar is able to offer free, exclusive content that differentiates it from competitors; and the user receives exclusive, free premium content.
The portal carries the advertiser's corporate image, and banners featuring the Coca-Cola logo are placed around the site. These can be static or animated, and the system automatically detects the user's device in order to serve appropriately adapted media.
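Device detection of this kind is typically done by inspecting the handset's User-Agent header and mapping it to known capabilities. A minimal sketch, with an invented device table rather than the actual Movistar implementation:

```python
# Minimal User-Agent based adaptation, in the spirit of the portal described
# above. The device table entries are invented for illustration.
DEVICE_PROFILES = {
    "NokiaN95": {"width": 240, "format": "gif_animated"},
    "SonyEricssonK750i": {"width": 176, "format": "gif_static"},
}
DEFAULT_PROFILE = {"width": 128, "format": "gif_static"}

def banner_for(user_agent: str) -> dict:
    """Pick the banner variant matching the requesting handset."""
    for model, profile in DEVICE_PROFILES.items():
        if model in user_agent:
            return profile
    return DEFAULT_PROFILE

print(banner_for("Mozilla/5.0 (SymbianOS/9.2; U; NokiaN95/10.0)"))
# -> {'width': 240, 'format': 'gif_animated'}
```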
The service saw some astonishing results: in the first three months of operation, more than 10,000 users registered, and the campaign generated around one million banner impressions and downloads.
It's still early days, but such case studies demonstrate how operators are able to marry new technologies such as mobile TV with mobile marketing elements to deliver compelling services to subscribers. Hopefully, 2007 will see many more examples and plenty of unique innovations.

Laura Marriott is President of the Mobile Marketing Association

Is Apple really stealing a march on the mobile industry? 
Lynd Morley takes a look

Apple's iPhone may be over-hyped, over-priced and (so far) not over here, but despite that it's been a bit of a cage-rattler for the mobile industry - both vendors and service providers. And so it should. Yes, we know all the nerdy objections to the thing are valid: most of the gadget's features are already available on existing phones from other vendors. And while the touch-sensitive screen is very clever and the 'visual voicemail' feature breaks new ground (for the first time the device vendor seems to be dictating network-supported features to the service provider), these are hardly innovations to set the industry quaking in its boots.
What the industry should really worry about with the iPhone only became blindingly obvious in early September, when Apple launched its latest iPod (the music devices with the white earplugs that young persons tend to wear). It looked strangely familiar... In fact the new iPod Touch is really the iPhone without the phone. It's a WiFi-enabled music player with the same case, the same touch screen and, most important, the same icon-driven interface. What Apple has now delivered is two identical 'p's in a pod - a phone and a player. Importantly, there is certainly more to come. This is the concrete expression of Apple's iLife framework for integrating the consumer's digital world. Functions such as satellite navigation, camera and storage can all be spun out as stand-alone products, or spun in, in various combinations, to share functions on a convergence device.
Using it, the Apple brand can be spread like a viral infection across multiple existing categories of electronic device, and even a few we've yet to think of - most of all across mobile phones. In fact what Apple is selling with its iLife is not the same old gadgetry but an electronic wardrobe of matching accessories. It may be brilliantly trendy and a marketing triumph, but the approach is also brilliantly necessary. That's because the natural brake on the utility of digital media devices of all types has always been their complexity at the user interface - the 'fiddlingaboutness' generally required to do things with portable digital content means most of us under-use the facilities already available. There's a much better chance of us all doing complicated things like synching devices and transferring files and songs to a central library if the processes are easy to follow and execute (and aren't completely different on each device you come to).
So that's why Apple might just succeed - and the nagging worry for the mobile phone incumbents is that maybe they, or perhaps a combination of players, should have grasped this nettle more deftly themselves and moved on from the 'button-driven' gadget environment to a personal device systems market first.

Mobile TV debacle
The FLO Forum, a global body of more than 80 companies, today reacted to the European Commission’s Communication on “Strengthening the Internal Market for Mobile TV”.
The FLO Forum applauds the Commission’s efforts to advance the high potential mobile TV opportunity in Europe, including the focus on spectrum, harmonisation, standardisation and regulation. However, the FLO Forum believes that the Communication’s intention of favouring any one particular mobile TV technology for Europe could stall the advancement of a healthy European mobile TV eco-system.
Dr. Kamil A. Grajski, President, FLO Forum said of the Commission’s Communication: “The FLO Forum supports the principle of technology neutrality, which the major European industry groups have been calling on the Commission to respect[1]. There is a reason why the principle of technology neutrality exists and that is to ensure that the market can choose which technology delivers the most attractive solution for the consumer. Each country has its own unique market conditions and each mobile broadcasting technology standard has very different performance characteristics. Locking the European market into one technology model is potentially harmful to the growth of mobile broadcasting in Europe and will hinder the development of innovative technologies.”
“Despite its youth, the mobile TV marketplace already offers multi-standard and multi-technology products and services - from chipsets to broadcast network transmission equipment. It is now cost-effective and routine to consider multi-modal mobile TV handsets. These developments should allow for the take up of attractive broadcasting services that will enable economies of scale. Technology is not the problem, but restriction of choice will be,” added Grajski.
“The Commission’s support for DVB-H for mobile broadcasting in Europe is based, in part, on the suggestion that a mobile TV technology mandate, like the GSM mandate, is necessary to achieve economies of scale and to position European companies globally at the technology forefront. But the analogy is contrary to the market reality today,” said Grajski. “The mobile TV industry is still in its early stages, but the GSM mandate came after GSM had launched with wide commercial success. Technology mandation for mobile TV in Europe is not supported by the facts,” Grajski continues.
Regarding FLO technology, Grajski notes that "recent independent trials of FLO technology in the UK, involving several EU-based FLO Forum members, highlighted significant technical advantages which lead to savings on infrastructure spending. FLO offers twice the capacity of DVB-H or, alternatively, the same capacity with a significantly cheaper network build-out. This can translate into millions of euros difference in capital and operating expenditures for a network."
Concluding, Grajski notes that “Technology mandation is not an appropriate regulatory tool in innovative and dynamic markets such as mobile TV, especially where the market remains undecided and where the technology continues to evolve rapidly.”
Details:  www.floforum.org
 
Enterprise mobility
Indoor base stations, cellular radio enhancements and IP Multimedia Subsystem (IMS) will give mobile operators crucial new capabilities as they battle with WLAN vendors to exploit the enterprise mobility market, according to a new report, Seizing the Opportunities from Enterprise Mobility, published by Analysys, the global advisers on telecoms, IT and media.
"The limited coverage and throughput and the relatively high prices of indoor cellular services make it difficult for mobile operators to satisfy enterprises' requirements for mobility", according to report co-author, Dr Mark Heath.
"However, the combination of three major technological developments could radically enhance the capabilities available to mobile operators, enabling them to make substantial improvements to their enterprise mobility solutions, and to fend off competing solutions from the WLAN community."
Key findings from the new report include:
• Indoor base stations will significantly improve the coverage and performance of indoor cellular service, allowing mobile operators to devise different charging strategies for indoor services, including low-cost or free internal calls
• Cellular radio enhancements, such as HSPA+ and CDMA2000 1x EV-DO Revision B, will increase the throughput and capacity of cellular systems to match those of WLANs, particularly with indoor base stations
• IMS will give mobile operators the functionality they need to integrate and control their indoor base stations and to deliver the flexibility, sophistication and interworking between services that enterprises will expect.

"Armed with these new technologies, mobile operators are well placed to attack the mobile enterprise market", according to co-author Dr Alastair Brydon.
"One approach would be to use the technologies to integrate enterprises' existing systems and applications with their cellular networks, although this would need substantial support from systems integrators. An alternative, albeit more radical, tactic would be to aim for pervasive cellular mobility, whereby the same cellular network solution is used to deliver all of an enterprise's services and applications in every environment. "For some enterprises, the simplicity and uniformity of a common cellular service in all environments will have major benefits", says Brydon.
Details: http://research.analysys.com

Cost savings
To realise the true potential offered by mobile working, organisations must move towards delivering secure access to real-time, line-of-business applications. With a plethora of new devices, software and innovations coming to market, the mobile workspace is constantly evolving. However, mobile and wireless technologies are notoriously resistant to standardisation - users consistently experience technology fragmentation, interoperability issues and rapid obsolescence.
IDC predicts that in 2008 businesses will be focused on regaining control over mobility developments - having a clear vision as to how solutions should evolve to achieve flexibility, ease of use and cost savings. "Although a mobile enterprise deployment will require some up-front investment, most companies expect that, over time, the benefits will outweigh expenses, and eventually cost savings will be realised," said Stacy Sudan, research analyst, Mobile Enterprise Software.
A survey conducted at IDC's 2006 Mobility Conference found that the combined IT spend of delegates was in excess of £300 million, with an average expenditure of £81 million. Ninety-five per cent of delegates surveyed indicated that their mobile budgets would increase in 2007, by an average of 40.3 per cent. Beyond providing email access, respondents' key priorities were customer relationship management, sales force and field force automation, and the implementation of additional security measures such as authentication and digital signatures.
Details: www.idc.com
 
Co-ordinating anti-spam
IronPort has announced the results of an anti-spam pilot project conducted by a coalition of leading European telcos and security organisations. The seven-month project delivered improved spam catch rates, while also revealing the need for a cooperation framework with more partners to standardise reporting formats, share information and adopt a common set of anti-spam best practices.
In January 2007, IronPort joined forces with ETIS – The Global IT Association for Telecommunications – and with TeliaSonera, KPN, Belgacom, Telenor and Dutch research organisation TNO. The coalition’s goal is to eliminate the majority of spam at the European network level, before it even reaches the mail servers.
Throughout the pilot, IronPort provided insight into spam trends and methods, along with best-practice tips acquired through working with other large ISPs around the world. The company also provided technology to all pilot members to show how IronPort’s reputation filtering and anti-spam technology eliminates spam.
The project trialled a combination of different technologies and cooperation procedures, with positive results: one of the participating ISPs remedied close to 16,000 spam incidents in less than a day during the pilot period.
The group found that active information exchange among ISPs, especially regarding spam traffic flows received from one another, is a successful approach to reducing spam. Even within the limited time frame of the pilot, this level of information sharing dramatically reduced spam rates, as well as customer complaints, at the participating ISPs. The trust established among partners under the ETIS umbrella allowed the process of resolving spam incidents to be automated to a great extent.
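As an illustration of what such standardised, automated exchange might look like, the minimal sketch below (in Python) invents a simple JSON report format and a per-source reputation counter. The schema, field names and threshold are assumptions for illustration only - they are not the pilot's actual reporting format, nor IronPort's technology.

import json
from collections import defaultdict
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative sketch only: the pilot's real reporting format is not
# published here, so this invents a minimal schema to show how a
# standardised spam report could be exchanged and acted on automatically.
@dataclass
class SpamReport:
    reporting_isp: str   # who observed the traffic
    source_ip: str       # host emitting spam towards the reporter
    message_count: int   # spam messages seen in the window
    first_seen: str      # ISO 8601 timestamps
    last_seen: str

def encode_report(report: SpamReport) -> str:
    # Serialise a report for exchange between partner ISPs.
    return json.dumps(asdict(report))

# Each receiving ISP keeps a running score per source IP and blocks
# senders that exceed an agreed threshold - the kind of automation
# the pilot found that trust between partners made possible.
reputation = defaultdict(int)
BLOCK_THRESHOLD = 1000

def ingest_report(raw: str) -> None:
    report = SpamReport(**json.loads(raw))
    reputation[report.source_ip] += report.message_count
    if reputation[report.source_ip] >= BLOCK_THRESHOLD:
        print(f"blocking {report.source_ip} (reported by {report.reporting_isp})")

now = datetime.now(timezone.utc).isoformat()
ingest_report(encode_report(SpamReport("isp-a.example", "192.0.2.10", 1200, now, now)))

In practice a scheme like this works only because, as the group notes, the partners trust one another: an automated blocklist is only as good as the reports feeding it.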
The Anti-Spam Pilot Project members will propose a road map for expanding the project to the major European telcos at the next ETIS Information Security Working Group meeting, to be held in Delft, the Netherlands, on September 27.
Details: http://www.ironport.com/

    
