David Sharpley looks at how the mobile industry can manage the current mobile data explosion, prevent mobile ‘bill shock' and impress its customers in the process

The days when the mobile phone was only used for voice calls are over. Data services have become just as essential as voice for many mobile customers and are a major factor in their choice of operator, tariff plans and device. This is good news for mobile operators, but it also creates some challenges. While billing for traditional voice and text services on a time and distance basis is well understood by customers, things get a little more problematic when it comes to offering high bandwidth services oriented around content. It's fast becoming clear that mobile operators need far more flexibility and control when it comes to these types of service if they're to make the most of them.

Customers will always prefer an operator that offers a transparent, consistent relationship with a high degree of personalisation and control. Today - especially for a mobile sector facing explosive demand for content, social networking and entertainment services - a much more intelligent, adaptive and flexible approach to balancing customers and resources is urgently needed. Without the ability to manage access to services, content and applications at ever-finer levels of detail, customers risk experiencing the mobile data equivalent of electricity ‘brownouts' - or the unpleasant surprise of suddenly receiving unexpectedly huge bills for content downloads.

At the heart of the problem lie the increasing demands on network and air-path capacity being made around the world by users of many different types as they take advantage of the new generations of data-centric devices and services now available. It's generally estimated that a smartphone uses around thirty times the bandwidth of an ordinary handset, while a laptop pushes this to a massive 450 times. In the longer term, it's expected that mobile data traffic will roughly double each year through to 2012, while general industry sources also indicate that revenues from mobile data in the US and China are growing at around 50% per year. In Europe alone, revenues from the current mobile data market are reckoned to add up to around $36 billion, with growth forecast to continue at 21% a year to 2011. By contrast, revenues from voice services remain static, pushing mobile data centre-stage in any mobile operator's plans for growth and innovation.

While mobile operators can respond to this by increasing network capacity, this not only takes time and money - the latter currently being in short supply - but also runs up against some inconvenient laws of physics in terms of wireless spectrum capacity. If they impose blanket bandwidth and download limits, or just offer ‘best-effort' connectivity, they run the risk of alienating some of their potentially most profitable and high-spending customers with the brownout scenarios highlighted above. Alternatively, if they try to throttle back traffic through high tariffs and roaming charges, they'll alienate customers even more, as the newspaper stories already circulating about ‘bill shock' and roaming fees totalling tens of thousands of Euros highlight. For customer communities already hit by the global recession and watching every penny, such strategies will not win operators any loyalty or encourage experimentation with new services.

Indeed, the European Union has recently passed legislation that not only limits the fees that operators can apply for data roaming, but also makes it a legal requirement for them to alert customers as soon as their transactions reach a pre-set financial limit.

So, something must be done - but what?

The first thing is to realise that the crucial balancing point for matching services, network availability and customer demand lies in the policies applied to each customer's account. While policy control in its broadest sense has always been an integral part of any mobile service, the tools that have often been applied in the past have been far too heavy-handed for operators to really turn them to either their own - or their customers' - true advantage. ‘Best effort' or throttled-back service is no longer sufficient, especially where the experience of enterprise applications or high-value content is impacted. Alternatively, the use of blunt generic pricing policies or caps on service access can have a similarly brutal effect on the customer's trust and their future spending patterns.

It is Bridgewater's argument that what is really needed is an ability to apply a ‘Smart' policy approach to handle this increasingly complex and sensitive relationship with the required levels of personalisation. But what precisely do we mean by ‘Smart'?

There are essentially three components to applying such a strategy:

Firstly, there must be ‘Smart Controls' in place. This means giving customers access to self-service portals or automated alerts that allow them to send top-up or bandwidth boost requests, change their service packages, get information on their usage and calling patterns, and set individual limits on roaming or downloads. In practice, this translates into a far higher degree of personalisation for the user, reduces customer support overheads, encourages experimentation and take-up of additional services and, very importantly, prevents bill shock.
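The logic behind such alerts can be sketched simply. The field names, thresholds and the `check_alerts` helper below are illustrative assumptions for this sketch, not any particular operator's implementation:

```python
from dataclasses import dataclass

@dataclass
class Account:
    usage_mb: float          # data used so far in the billing period
    monthly_cap_mb: float    # customer-set download limit
    spend_eur: float         # charges accrued so far
    spend_limit_eur: float   # pre-set financial limit on the account

def check_alerts(acct: Account, warn_at: float = 0.8) -> list[str]:
    """Return the alerts a self-service portal would push to the customer."""
    alerts = []
    if acct.spend_eur >= acct.spend_limit_eur:
        alerts.append("spend-limit-reached")       # block, or require explicit opt-in
    elif acct.spend_eur >= warn_at * acct.spend_limit_eur:
        alerts.append("spend-limit-approaching")   # offer a top-up or plan change
    if acct.usage_mb >= acct.monthly_cap_mb:
        alerts.append("data-cap-reached")
    elif acct.usage_mb >= warn_at * acct.monthly_cap_mb:
        alerts.append("data-cap-approaching")
    return alerts
```

The point of the sketch is that the customer, not the operator, sets `monthly_cap_mb` and `spend_limit_eur` - which is precisely what turns a blunt cap into a personalised control.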

Secondly, ‘Smart Apps' describes the increased control of the reality of the customer experience that can be achieved through a far finer granularity of detail. Both the customer and the operator can start to tailor policies to support specific applications - video and music downloads, web browsing, email, etc. - while also taking into account the customer's own real-time behaviours and preferences as well as the wider network conditions at different times or in different places.

Finally, ‘Smart Caps' can provide a more flexible, real-time and customer-friendly approach to the issue of setting bandwidth caps by allowing operators to act appropriately depending on whether they're facing heavy or abusive users - or customers who inadvertently exceed a set limit by downloading a movie. By optimising and rationalising the allocation of bandwidth across customers - and prioritising those prepared and able to pay for it - the user gets a consistent, fair experience while the operator avoids the need for expensive network upgrades to relieve network congestion.
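One way to picture a Smart Caps decision is as a small policy function that weighs the customer's long-run behaviour against current network conditions before picking a response. The thresholds and action names here are purely illustrative assumptions:

```python
def cap_decision(avg_daily_gb: float, today_gb: float, cap_gb: float,
                 congested: bool) -> str:
    """Pick a proportionate response when a customer approaches a bandwidth cap.

    avg_daily_gb: the customer's typical daily usage (long-run average)
    today_gb:     usage so far today
    cap_gb:       the daily cap on the customer's plan
    congested:    whether the local cell is currently congested
    """
    if today_gb <= cap_gb:
        return "allow"
    if avg_daily_gb > cap_gb:          # persistently heavy or abusive user
        return "throttle"
    if congested:                      # one-off overage, but the cell is busy
        return "deprioritise"          # degrade gracefully rather than cut off
    return "notify-and-allow"          # e.g. a customer who downloaded one movie
```

The key design point is the branch on `avg_daily_gb`: the same overage gets a different response depending on whether it is habitual or a one-off, which is exactly the distinction a blanket cap cannot make.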

In practice, a smart policy control strategy can add an unprecedented richness and flexibility to each mobile operator's palette of service and billing relationships - and keep the customer feeling that they're in control of that relationship and the money that they are spending. For example, by monitoring a postpaid customer's behaviour over long periods in real time, an operator can readily offer personalised bandwidth limits - or alternative tariffs - along with appropriate warnings to ensure fair usage or alerts to new service plans. Alternatively, prepaid customers can be monitored and offered relevant opportunities to upgrade their contracts through a portal as their circumstances change. On top of this, the inherent flexibility of the concept - and its easy integration with standard mobile network architectures - makes managing the charges and service quality parameters for accessing third-party content and applications much easier and more transparent.

The complexity of the relationships between individuals and enterprises and the ever-lengthening content and applications value chain isn't going away and can only increase. Customers demand to be recognised as individuals, and only if operators have the systems and processes in place to support that individuality will they keep their loyalty and their revenues coming in. Policy control that combines the Smart Controls, Smart Apps and Smart Caps approaches is the most powerful tool operators have in their armoury to achieve that end.

David Sharpley is Senior Vice President, Marketing and Product Management, Bridgewater Systems

Mobile banking has been mooted for some time as a mass market application and it is undoubtedly on the rise; Juniper Research predicts that the number of consumers accessing banking products and services via their mobile phones will reach 816 million by 2011, and management consultancy Arthur D. Little expects total m-payment transactions to grow by 68% per annum to reach almost $250 billion in 2012. Rodrigue Ullens feels that a number of factors have converged to enable the mobile operators to take the initiative and become our banks for everyday transactions

Around the world, customers' banking behaviour is changing - there is an ongoing decrease in branch use while internet and telephone banking continues to rise. The demand for increasingly sophisticated mobile devices and the success of mobile app stores from the likes of Nokia, Apple and RIM have also highlighted how customer expectations of what they should be able to do with their mobile phone are changing.

Additionally, following the worldwide economic downturn, it would be fair to say that the traditional banking system is in disarray and that trust in financial institutions is at an all time low. All of these factors have converged to create an environment in which mobile banking can develop as customer demand coincides with a need for financial institutions to improve their image, attract new customers and generate revenue growth.

However, if the banks want to move beyond the simple text-message based updates they currently offer and provide the comprehensive ‘mobile banking 2.0' experience that customers crave, a number of large obstacles still need to be overcome. The banks have to quell consumers' fears regarding security and lack of trust whilst also establishing an appropriate billing relationship with their customers.

It is actually the mobile operators who are best placed to provide the mobile banking 2.0 experience as they already have convincing answers to these security and billing questions. Security is a much lower concern as the operators retain those functions they have traditionally controlled (financial institutions would have to rely on a third party) and all of their customers are provided with a dedicated device which holds a secure chip (the SIM card). Mobile has continued to be a growth sector in the current climate, which suggests that the operators are becoming more trusted than our banks. And of course, they already have the necessary pre-existing billing relationship in place with their customers.

Add to the above the fact that we carry our mobile phones with us at all times and that our mobile number acts as a unique identifier in the same way as our existing bank account number and sort code, and it becomes clear that the operators are well positioned to become our banks for everyday transactions. Schemes such as the GSMA's Mobile Money Transfer Initiative are already under way to make this possible and I'm sure we will see mobile operators partnering with companies that hold a European e-money licence, such as Tunz, in order to make this a widespread reality in the coming years.

Mobile banking opens up a range of new transactional opportunities such as contactless payments at point of sale, international money transfers and transfers over the mobile internet, all of which have already been successfully trialled somewhere in the world. For example, in Africa money transfers on a mobile phone are becoming the continent's first widespread cashless payment system, enabling cost-effective and secure transactions amongst citizens with no access to a bank account in the traditional sense of the term.
In terms of developed markets, contactless payments via a mobile phone are already widespread throughout Japan, and some European operators, including Belgium's Belgacom ICS, have been selected to provide this service in European countries over the next six months. Furthermore, a recent report from Arthur D. Little has revealed that in China - the country with the most internet users anywhere in the world - 29% of new users do not have a PC at all and instead access the internet solely through their mobile phone.

These examples demonstrate both the potential and the enthusiasm for banking services on the mobile phone. However, the key issue that remains to be addressed is cross-border interoperability and standardisation. As the world's population becomes ever more mobile, the ability to take account details across national borders will become increasingly desirable to customers.

Last year, the International Telecommunication Union (ITU) created the new country code +883. These +883 numbers (or iNums) have effectively created ‘an area code for Earth', a telephone number that follows you wherever you are in the world. Aligning mobile banking services with the iNum as an international account identifier and sort code will be a huge step forward in addressing the global interoperability issue, even more so once every fixed and mobile operator makes iNums reachable from their networks.
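As a rough illustration, treating an iNum as an account identifier largely comes down to recognising the +883 prefix after normalising whatever the customer typed. The digit-length range and the `inum_account_id` helper below are assumptions made for this sketch, not the ITU's actual allocation rules:

```python
import re

# iNums sit under the ITU country code +883; this sketch assumes the E.164
# form "+883" followed by 8-12 further digits (the real plan varies by allocation).
INUM_RE = re.compile(r"^\+883\d{8,12}$")

def inum_account_id(number: str):
    """Normalise a dialled string and, if it is an iNum, return it as a
    globally unique account identifier; otherwise return None."""
    normalised = re.sub(r"[\s\-().]", "", number)   # strip common formatting
    if normalised.startswith("00"):                 # international prefix -> '+'
        normalised = "+" + normalised[2:]
    return normalised if INUM_RE.match(normalised) else None
```

Because the normalised number is globally unique and operator-independent, it can play the role the article describes: an account number and sort code that travels with the customer.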

While the mobile operators will not replace our financial institutions for larger transactions, a number of factors have converged to put them in pole position to become our ‘everyday banks'. The current economic climate, overwhelming public demand for mobile banking 2.0 and the existing relationship that the mobile operators have with their customers all suggest a new era of mobile payments may be on the horizon.

There have already been a great many initiatives within both emerging and developed markets that have demonstrated mobile banking's potential to deliver innovative new services to customers, but the next step will be to take this activity global. People are becoming ever more mobile and the ability to take account details across national boundaries will become increasingly attractive to consumers in the next few years.

Rodrigue Ullens is CEO and co-founder of Voxbone


You can't have a discussion these days with anyone in the IT industry without the word ‘cloud' coming up: yes, cloud computing really is the biggest thing since sliced bread! Why have your own servers running, on average, at 5% loading when a cloud-based service lets you pay only for the processing you actually use?

Ever faster networks are making physical locations (sort of) irrelevant, opening up the market for all sorts of capabilities to be accessible online - from applications (software-as-a-service) through infrastructure services like computing, storage and networking to online enabling services like billing, CRM, authentication etc.

If you're an end user, cloud services open up a huge range of services without the expense and expertise needed to have your own infrastructure. For communications players searching for new business models and services that will replace stagnating voice revenues, cloud services offer some very interesting opportunities. Do you take the cost reduction route of cutting your computing costs by moving to being a cloud computing buyer, or do you go the opposite way and expand your data centres to monetize them by selling services to others?

Another angle is to open up some of your internal services and capabilities for others to purchase and use for a price, such as call centres, CRM and billing, all of which service providers have in large measure.

Now all this sounds great in theory, but with not a lot of tangible examples out there of large cloud computing services that are in full deployment, it's hard to guess at how the market will develop and who will win.

One thing is for sure, though: no matter how big telecom companies think their IT base is, it's small both geographically and in horsepower terms compared to Amazon, Google and Microsoft. Economies of scale will definitely play a big part in who can make money, as a price war will be played out not only between cloud providers but between cloud and the alternative ‘do-it-yourself' market of owning your own infrastructure.

Although there are numerous applications of a cloud approach, it's possible that cloud services might take off in two key areas. One is video content distribution, because to do this you need the ability to deploy significant network, computing and storage horsepower on demand.

The second area is in providing integrated packages of communications services, computing power and applications to small and medium enterprises. The cloud will make it possible to weave together packages of these services where all the data is stored in the cloud and all the applications are run in the cloud, but businesses get an integrated package where all of that is transparent to them.

I recently saw a demo by a large, multinational mobile operator of just this sort of innovative packaging of services. If they get it off the ground they could find customers beating down their doors.

A recent article in The Economist reviewed the cloud approach and warned that cloud may simply be a way for the computing industry to shield itself from the growing tide of openness. For example, how easy will it be for customers to switch cloud service providers - once your data is in the cloud, can you ever get it out again and move to a different supplier? Can you compare apples-to-apples on price and performance if every provider has different services measured in different ways?

You might expect buyers to be in favour of openness and suppliers to be hostile, but the reality is that lack of openness and transparency might hold back the entire market, so some level of standardization may be supported by both sides. Standards are needed for application interfaces and data structures, as well as a common model for cloud ‘piece parts' - what industry calls Stock Keeping Units, or SKUs - and benchmarks, so that buyers can easily make purchasing choices.
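A toy example shows why common SKUs and benchmarks matter: until prices are normalised into a shared unit, providers quoting in different units simply cannot be compared. The catalogue below is invented purely for illustration:

```python
# A hypothetical catalogue: each provider quotes compute in its own units.
offers = [
    {"provider": "A", "price": 0.10,   "unit": "cpu-hour"},
    {"provider": "B", "price": 2.16,   "unit": "cpu-day"},
    {"provider": "C", "price": 0.0017, "unit": "cpu-minute"},
]

# Conversion factors into the agreed common SKU: one CPU-hour.
HOURS_PER_UNIT = {"cpu-hour": 1.0, "cpu-day": 24.0, "cpu-minute": 1.0 / 60.0}

def price_per_cpu_hour(offer: dict) -> float:
    """Convert a quoted price into cost per CPU-hour, the common SKU."""
    return offer["price"] / HOURS_PER_UNIT[offer["unit"]]

# Only after normalisation can a buyer meaningfully pick the cheapest offer.
cheapest = min(offers, key=price_per_cpu_hour)
```

Without the shared `HOURS_PER_UNIT` mapping - the standardised piece the article argues for - provider B's headline price looks twenty times dearer than A's, when it is in fact the cheapest per CPU-hour.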

All of these are areas in which TM Forum has a lot of expertise, so we're actively working on facilitating some of the industry aspects of cloud-based services, and we're working with all sides of industry to understand what's needed.

Cloud is a game changer in that it is yet another step along the way of allowing anyone to enter new markets. It's a changing world and one that the communications industry, if it's smart, will grab with both hands.

Keith Willetts is Chairman and CEO, TM Forum

Governments and communications companies can work together to give urban developments a global competitive advantage while offering new revenue streams to CME companies, say Peter Elliot and James Bennet

Vibrant and creative cities drive economic, social and cultural development. Governments focus on renovating existing cities and cultivating new ones to act as a catalyst to development. This is now taking place on an unprecedented scale. The integration of ICT with development projects can change the urban landscape by developing the Smart City. The Smart City can enhance the lives of citizens, encourage business to invest, and create a sustainable urban environment. City governments have an opportunity to ensure ICT infrastructure is integrated with the changing city, and communications companies can support this vision to mutual advantage.

The Smart City requires a ubiquitous ultra-high speed network infrastructure, fixed and wireless, that allows people, business and government to connect with each other and the systems that manage the infrastructure and services of the city. The network provides valuable real-time data about the urban environment to Smart City applications and service providers.

Importantly, this information is:

  • context sensitive
  • location sensitive
  • personal
  • secure

This information can enable myriad services:

  • Transport - traffic flow management, speed control, congestion charging, information systems, vehicle tracking, onboard safety
  • Public safety and security - access control systems, alarm monitoring, emergency warning and situation management
  • Transactions - electronic funds transfer
  • Public services - remote patient monitoring, patient records management, education/ learning networks
  • Identity - biometric/smart card systems
  • Utilities - facilities management, climate control, meter monitoring, energy generation and storage management, leak detection and network management
  • Environment - data collection and monitoring.

Many of these services are envisioned today and some have been implemented in isolation with limited capability. However, the intelligence and integration of the Smart City infrastructure can:

  • Enhance the social environment - by integrating community services such as health, education, identity and payment systems
  • Manage the built environment more efficiently - homes, buildings, public spaces, transport systems and utilities - to predict and respond to changing conditions, enhancing quality of life and minimising impact on the natural environment
  • Offer consumers innovative services - the fastest internet access and innovative services available from any location, indoors and outdoors
  • Provide world-leading services to businesses - zero latency data connectivity, secure networks, low cost hosting and storage that make the city an attractive place to do business.

The Smart City infrastructure can therefore change the way the city is managed, improving delivery of public services and enhancing the lives of citizens. It can make a city a more attractive place to do business, not only by providing the infrastructure and services required by businesses but also by making a city a more attractive and prestigious place to live and work.

This is a key concern for multinationals that want to attract talent from a global talent pool. ICT can also help a city move to a sustainable future. Operational efficiency, reducing commuting times for example, can help this, but changing the way people live and work will also drive the development of the 21st century urban environment.

Competitive markets for provision of ICT infrastructure and services have developed in most countries around the world. However commercial service providers alone are unlikely to be able to deliver the ubiquitous high-speed access and shared platforms that will enable governments, business and citizens to realise the benefits of the Smart City. Local monopoly operators or duopolies have no incentive to create an open access network infrastructure that disturbs a potentially lucrative status quo. Competitive operators may have difficulty making the business case stack up without intervention from the public sector interested in the broader benefits that a Smart City can deliver. There may therefore be a role for the public sector in directing investment.

City governments can respond in different ways, reflecting different degrees of confidence in the ability of a competitive market to deliver appropriate services to end users. Examples of the implementation of open access fibre networks show how approaches can differ:

  • In Stockholm a subsidiary of the municipality manages a city-wide passive fibre network that is open for use by all service providers to provide ultra-high speed services to homes and businesses
  • In Amsterdam the city government is working with commercial investors to build the passive fibre infrastructure. The municipality has selected a commercial operator to install equipment and provide capacity to service providers on an open access basis
  • In Singapore the government is promoting a similar model to Amsterdam, but offering subsidies to two commercial operators to build a passive fibre network and provide capacity to service providers. The government has also split the island into three areas and offered ISPs concessions to provide blanket WiFi coverage of public spaces such as parks and shopping centres.

The implementation of Smart City infrastructure creates opportunities for commercial service providers to build new revenues, either by providing the supporting infrastructure for Smart City applications, or by using this infrastructure to offer services to government, business or consumer markets. Governments and businesses can therefore work together to develop the ICT sector at the same time as offering urban developments the opportunity to build a competitive advantage in a crowded global market for cities.

Peter Elliot and James Bennet, PA Consulting

As IBC opens its doors at the RAI Centre in Amsterdam, from September 10th to 15th, it is becoming increasingly clear that everyone in the communications industry is going to have to collaborate to turn the idea of ‘content anytime, anywhere' into a reality

For anyone who has been in the communications industry at any time in the last 30 years, the word "convergence" will be familiar. With broadband availability moving towards universality and mobile networks increasingly being regarded as data highways not just for voice, there is finally a chance that this convergence will happen.

While there will be multiple applications running over this data bandwidth, it is clear that media will drive it. Audiences love the concept of being able to see the content they want, when they want it, where they want it, and on the device they want.

It is not just a consumer issue. At first, digital signage was simply a poster on a plasma screen. Today it is seen as a highly targeted, highly productive means of reaching audiences over what is effectively a private interactive television network. Retailers love the idea of being able to talk to potential purchasers at the instant they are making a purchasing decision - put a recipe for a chicken casserole above the chicken display in the supermarket.
So it is that telcos, mobile network operators (MNOs) and ISPs are now very much in the media business. They have the carrier infrastructure, and they have to be sure they are getting their fair share of the revenues from what will be a booming business, particularly at a time when voice ARPU is falling.

On the other hand, broadcasters and production companies have the content that will be the real driver for audiences. Clearly, everyone has to collaborate to make the content anywhere become a reality.

At a time when the landscape is moving fast, it is wise to invest in knowledge; to understand what the commercial challenges are as well as the technology issues, and to see where the established players need to collaborate and where they still compete.

IBC has been an established date in the broadcaster's diary for more than 40 years; over the last decade it has become vital for telcos, MNOs and the data industry too. It is the leading event for anyone involved in content creation, management and delivery, attracting visitors from more than 130 countries.

There are multiple levels to the event. First, it is an authoritative conference, bringing experts from around the world to debate the key issues of the day.

This year the IBC conference has been reorganised into three distinct streams: technology advances; content creation and innovation; and the business of broadcasting. Each contains a series of technical papers, commercial and operational debates and masterclasses from leading creative professionals.

Most important, each crosses the boundaries between traditional broadcasting and the new creation and distribution of media. The content creation and innovation strand, for example, includes a full day on IPTV and mobile media. In addition, to counteract the inevitable concern that conference debates are always led by senior executives who may not be perfectly in touch with the consumer, a session is given over to listening to a panel of students from Japan, Russia, UK and USA talking about how they consume media and their expectations for the media of the future.

Every minute of every day, some 13 hours of content is uploaded to YouTube. Is this just vanity publishing that no one watches, or is user-generated content the future for the media industry? The answer to that question has far-reaching implications for the business and technology plans of data network operators.

Alongside the formal conference there is another opportunity for debate and exploration. There is a series of business briefings: presentations by people who have been there, done it, and are willing to share their experiences (good and bad). There are business briefings on mobile media, on IPTV and online video, and on digital signage.

These presentations are linked to specific zones within the exhibition, again dedicated to mobile, IPTV and digital signage. The zones are an initiative to attract innovators of emerging media into the IBC community. They give exhibitors, and particularly start-up businesses, a simple and low-cost route into the exhibition, and they present visitors with a rapid overview of the latest technology in a particular field without them having to walk all around the show floor.

The full exhibition is large and comprehensive, filling the exhibition space - 11 halls - in the Amsterdam RAI Centre with around 1000 exhibitors. The layout of the exhibition is designed to make it easy for the visitor to find the solutions they need. If you are looking for the latest in communications technologies - from compression and multiplexing to set-top boxes and home hubs - then all the key players are grouped together in one area.

Equally, if your business plan requires you to move into content production or channel management, then you will find what you need in other halls. One hall includes the IBC Production Village, which combines a huge camera demonstration area and hands-on experiences of the latest equipment with a number of active presentations, including the production base for IBC TV News - the event's own on-air and online programme.

Another free presentation strand, "What caught my eye", is helpful for newcomers to a subject, but is always popular with regular visitors too. Experienced practitioners tour the exhibition looking for the latest and best in their particular fields, and in a short and lively workshop talk about what is new and exciting. It also gives tips to the audience on where to find a given highlight and the right questions to ask of the vendors.

There is a third element to IBC alongside the conference and exhibition: a whole range of offerings that add value to your time spent at the event. The business briefings, which are free to all visitors, are a good example. Others range from training opportunities to the chance to network with your peers. And most evenings there are free screenings of some fine television productions and movies in the state-of-the-art digital cinema created by IBC.

The IBC conference runs from September 10 to 14 and the exhibition from September 11 to 15, at the RAI Centre in Amsterdam. Exhibition-only registration is free and includes all the added-value opportunities, including the business briefings, screenings and networking. A range of conference packages is available for IBC2009: more information - and online registration - can be found at www.ibc.org.

The assumption among consumers is that video will be an integral part of any communication in future: if they can shoot something on a mobile phone and upload it to YouTube, then they will expect to be able to see it on their televisions at home too.

If you have any interest in the technology to make this happen, or the business cases that will realise a fair return on this investment, then IBC is the place to join the debate.


Environmental issues are impacting product development across a range of telecoms offerings says Alix Morley

Environmental concerns are quite clearly becoming increasingly widespread, both among consumers and across a considerable variety of industries. Telecoms is no exception, with the industry determined to prove its green credentials through a range of product and application developments.

Certainly, the trend is making itself felt throughout the cell phone industry, with major producers competing to heighten their green credentials. Sony Ericsson's GreenHeart range is testament to these concerns, with both the C901 GreenHeart and the soon-to-be-released Naite boosting the company's green image. These phones will be sold in smaller packages, thus reducing waste, while Sony Ericsson also advertises the C901 as including "an in-phone manual replacing the standard paper version; while recycled plastics, an energy efficient display and waterborne paint mean that the overall CO2 emissions of the phone are decreased by 15%." This is certainly in line with Sony Ericsson's stated aim of cutting its carbon dioxide emissions by 20% by 2015.

In a recent interview with Reuters, Mats Pellback-Scharp, Head of Sustainability at Sony Ericsson, highlighted that the company is "not aiming at a niche segment. We are taking this to mainstream and to a bigger audience... Naite is the lowest cost 3G phone in Sony Ericsson's portfolio and we expect it to be one of the biggest volume drivers in end-2009 to early 2010."

Sony Ericsson is certainly not alone in venturing into the eco-friendly marketplace. Indeed, while Samsung Electronics and ZTE have recently developed solar-powered phones, Nokia's 3110 Evolve is also attempting to harness the eco concerns of the consumer with increased green credentials. Nokia states that: "Innovative bio-materials have been used in creating this handset" while, "even the sales box is made from 60% recycled materials." Indeed, the company has taken its eco thinking further to include what it terms ‘intelligent charging' - a device which saves energy by only using electricity while actually charging.

The notion of applying green practices to the charging of phones has recently been developed further. With an estimated 51,000 tonnes of redundant chargers believed to lie unused, leading manufacturers throughout the EU reached a landmark agreement to produce a universal charger, financially benefiting the consumer and potentially cutting the thousands of tonnes of waste created each year when old chargers are thrown away as phones are replaced.

In marketing terms, the benefits of a good ‘eco image' are undeniable. Indeed, ABI Research has suggested that up to half of the US market takes environmental factors into account when making a purchase. It may, of course, be argued that models such as the Naite and C901 GreenHeart are merely tapping into an already growing phenomenon, produced just in time to catch the eco-friendly wave. Certainly it creates a positive brand image, which, in an age of increasing public concern over climate change and environmental policies, is likely to positively impact sales.

And while the establishment of green credentials may lead to increased sales profits, the implementation of environmental practices can also significantly reduce costs within the individual company. A recent Ovum report highlighted that the research and establishment of such strategies and policies may initially be costly, but the long-term operation of eco-friendly practices - so the report suggests - is certainly beneficial.

At the same time, membership organisations such as ATIS, which create "solutions that support the rollout of new products and services into the information, entertainment and communications marketplace", have certainly seen the benefits of boosting green credentials. Indeed, in its Green Mission Statement, the organisation asserts that "ATIS and its members are committed to providing global leadership for the development of environmentally sustainable solutions for the information, entertainment, and communications industry. The development of these innovative end-to-end solutions will: promote energy efficiencies; reduce greenhouse gas emissions; promote ‘reduce, reuse, recycle'; promote eco-aware sustainability and support the potential for societal benefits."

The use of energy is at the heart of any eco discussion, while energy costs, of course, are also a significant part of a provider's operating expenses. So, for example, Ericsson's recent announcement that its network IP edge and metro platforms deliver the industry's lowest energy consumption, based on new metrics that measure how efficiently they can provide subscriber services such as residential triple-play, suggests not only economic solutions but ecological ones too.

Ericsson stresses that because energy usage accounts for up to 50% of operating expenses, and energy costs continue to rise while the demand for bandwidth-intensive services increases day-by-day, service providers need greater insight into how specific platforms use energy.

Based on the functions performed by each platform, Ericsson's new metrics provide practical views into energy efficiency, mapping energy usage granularly by subscriber and circuit. The new metrics are the latest sustainability effort from Ericsson, which in 2008 committed to providing up to a 40% reduction in carbon emissions per subscriber across its product portfolio within five years. In addition to the focus on the IP edge and metro Ethernet, Ericsson has reported significant reductions in energy usage for its WCDMA radio base stations (an 80% improvement in energy efficiency from 2001 to 2008), for its mobile softswitch solution (60% more efficiency per subscriber), and for site power management.
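A per-subscriber energy metric of the kind described above can be illustrated with a back-of-envelope calculation. This is only a sketch: the article does not publish Ericsson's actual formula, so the platform wattage and subscriber count below are invented figures.

```python
# Illustrative sketch of a per-subscriber energy-efficiency metric.
# The wattage and subscriber figures are hypothetical examples only.

def energy_per_subscriber(platform_watts, subscribers):
    """Watts of platform power attributed to each active subscriber."""
    if subscribers <= 0:
        raise ValueError("subscriber count must be positive")
    return platform_watts / subscribers

# A hypothetical metro/edge platform drawing 4,500 W for 30,000
# triple-play subscribers works out at 0.15 W per subscriber.
baseline = energy_per_subscriber(4500, 30000)

# An 80% efficiency improvement (the figure reported for WCDMA base
# stations between 2001 and 2008) would cut that to one fifth.
improved = baseline * (1 - 0.80)

print(f"{baseline:.3f} W/sub -> {improved:.3f} W/sub")
```

Mapping usage this granularly is what lets an operator compare platforms on service delivered per watt rather than raw consumption.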

Climate change, it may be fairly confidently argued, is now inevitable, so companies, as much as private consumers, would do well to take measures to reduce their carbon footprint.  Indeed, Ovum claims that: "Estimates suggest that telcos can achieve a 1-2% reduction in global carbon emissions by implementing green initiatives within their operations.  However, the telecommunications industry is expected to enable other businesses to reduce emissions by up to five times this amount, highlighting that telecoms has a major role to play in enabling a green economy."  Certainly, it appears that this industry is in an ideal position not merely to jump on the bandwagon but to lead the way to an improved and green global economy.

Alix Morley is a freelance writer.

In the face of an economic downturn and cautious consumer spending, Peter Hauser takes a look at the types of incentives that will guarantee mobile customers' loyalty

Despite well-publicised hopes to the contrary, the telecoms industry is not immune to the effects of the global economic downturn. Handset manufacturers expect unit sales to fall by at least ten per cent this year - fuelled by cautious consumer spending - while industry giants such as Vodafone and Nokia rang in the New Year by announcing redundancies. Some industry commentators argued that the mobile industry was essentially ‘recession proof', thanks in part to the fact that global mobile penetration is still rising fast. However, consumers in wealthier, saturated markets are likely to re-evaluate their spending under the current economic circumstances. This year, we'll see people shopping around for cheaper deals and swapping out of contract. That is, of course, unless they're getting discernible value-add in return for their loyalty.

Customer churn is nothing new of course. Indeed, for as long as mobile operators have offered post paid contracts, they've been under constant pressure to retain customers, or reduce churn. And this knowledge, that building loyalty is one of the most critical challenges in the mobile telecoms service industry, will go some way to steer the industry through the challenges of operating in a recession.

We've seen lots of industry experts provide advice to operators who are working to increase customer loyalty. But the inevitable difficulty with any strategy to reduce churn is that it relies on anticipating consumer desires, whims and even fashions. No easy feat.

It is a widely held belief that mobile applications can help to increase customer loyalty or so called ‘stickiness'. Indeed, taking a recent high profile example, for all the work Apple has done to make the iPhone a success over the past year, many say that its future lies in the hands of application developers. Over the past 12 months, we've seen speedy growth in iPhone applications, which have turned the handsets into the most sought after devices, not just in mobile technology - but in personal computing too. The rest of the industry appears to have woken up to this challenge; and virtually every other major handset vendor is setting up its own application store.

But choosing to deploy or develop an application also relies on anticipating customer behaviour and desire. Operators want to find the ‘next big thing', but don't want to be the victim of a fleeting fad. Indeed, the industry has concluded that there is no one killer application for mobiles - so how do operators attract and, more importantly, retain their subscribers?

We believe that the key to success is simplicity. Operators are missing a trick by overestimating the technical know-how of their customers. While there will always be a group of early adopters keen to embrace the latest gadgets and applications, they will be outnumbered by a substantial majority of mobile subscribers who only ever use their handset for making voice calls and sending texts.

With this in mind, we believe that the ‘stickiness' that operators crave comes from creating an environment on the handset that makes it easy for people to use their mobile for more, without being intimidated by the technology, and without entering a world of unknown costs. In short, simple and intuitive applications, which enable users to do more on the mobile and through the network, are ultimately the tools operators need in order to reduce churn.

If consumers will be less likely to try out new services, then applications must be highly compelling, viral and likely to generate word-of-mouth interest. The growth of the ‘casual gaming' market, for example, reflects this need. Last year, the casual games industry was estimated to be worth $2.25bn (£1.17bn) - unsurprising perhaps given that casual games appeal to consumers who are used to, and enjoy, playing games on their PCs.

Classic arcade games, which can be picked up easily and played with circles of friends, look likely to be the order of the day for operators. The most popular mobile games are Solitaire and Tetris - old school, basic games, which are addictive and easy to pick up. Savvy operators are opting to introduce ‘micropayments' where a basic game is promoted for free, and then add-ons (like characters or functions) are offered at an extra cost. The more levels and add-ons there are, the more likely it is that a consumer will play for the long term - and therefore remain a loyal customer for the operator.

Voice activated applications also look poised for success - as they provide a comfortable, ‘easy' way of interacting with a device in order to drive new services. Our own customer research has shown that voice services offer a good transitional way to engage new users and drive adoption. Corporate giants like Google have been quick to capitalise on this trend - with applications like a voice activated search application for the iPhone. Google's service makes searching the web easy and accessible, particularly for use on the move. Location Based Services (LBS) have long been touted as providing consumers with useful information on the move, and combining LBS with voice recognition tools seems to be a route to quick success.

For any voice recognition service, though, quality of service will be key to driving adoption. Developers, or operators themselves, should be prepared to work with the industry's best partners to ensure adequate quality of service - particularly as operators look to penetrate new markets, which naturally call for support for other languages. Companies like Nuance and Qpointer are leading the way to ensure good quality voice recognition. Indeed, Nuance says that its Dragon NaturallySpeaking product turns voice into text three times faster than most people type - with up to 99 per cent accuracy.

But as well as quality, operators must ensure that applications are priced fairly and competitively. Just this week, guidelines for application developers were released on pricing applications for BlackBerry manufacturer Research In Motion (RIM). These were set significantly higher than iPhone applications - a development which is bound to disgruntle some BlackBerry owners.

But developing, and rolling out, more costly applications is an understandable move. After all, more expensive applications are a sure-fire way to create revenue and keep both operators and application developers in business. Business related applications, or premium games services, naturally command a higher price - and if users can see the benefit, or need, for these services, they'll continue to invest. But a balance needs to be struck. Transparency - and one off payments - are the order of the day. Helping consumers understand their bills and what they'll be paying each month will help generate a level of trust and loyalty between consumer and operator. Flat rate data tariffs will also be key to driving web-based applications - vital as the smartphone or mobile computer market blossoms.

Finally, marketing will continue to be of vital importance for operators wanting to attract new customers and retain current subscribers. Adopting better customer relationship management procedures and employing targeted, personalised marketing will bring an operator closer to its customer - and help build an understanding of services and applications which are needed and relevant. It's important for operators to focus on the wants and needs of users, more so now than ever before. No longer will consumers have disposable cash to ‘experiment with', so it's key to focus on what will work, what will appeal - and importantly what is easy to use. In short, 2009 is going to be a tough year for mobile operators - but the ones who keep their services simple, targeted and fair are poised to ride out the global recession, and ultimately succeed.

Peter Hauser is CEO, me2me

Alon Werber looks at how operators can best harness customer data to deliver the right product or service at the right time

The hugely competitive and ever evolving battle to win the hearts and minds of the mobile user is bringing marketers within this sector new and escalating challenges, but also huge opportunities. In addition to rapid churn rates and high customer acquisition costs, marketers are now also faced with an economic downturn, during which it is likely they will see, if they have not already, a reduction in consumption levels. Therefore it is essential, now more than ever, that marketers use all the tools at their disposal to maintain customer loyalty and retention, as well as ensure ARPU is maximised.

The key to building long-lasting relationships is to continuously present the consumer with services, offers, merchandise and bundles which meet their requirements at exactly the right time. In this sense the key is to always think of ‘what's next?' for the customer. Instead of trying to predict what users want, operators need to adopt a system which acknowledges a customer's response, or lack of response, immediately and keeps the dialogue open based on their decisions, whilst executing the offers in real-time. It's an essential part of the Customer Relationship Management mix and will help to deliver optimum results, yet it's not something that can be achieved through ‘traditional' campaign management systems. This means operators cannot maximise tried and tested techniques such as up-sell and cross-sell if they do not address the customer at just the right time, based on their actual behaviour.

Gartner forecasts that within the next three to five years the techniques of up-sell, offering the customer a higher priced or better version of the product they are purchasing; and cross-sell, offering a related product, will soar. These are important techniques in optimising the lifetime value of customer relationships, so how can marketers best implement them in order to ensure maximum return on investment?

Get your facts straight. Whilst it sounds like stating the obvious, it is something that many operators fail to do because they don't have the right tools at their disposal. Current and past behaviour patterns are the best predictors of future action; to be able to up-sell and cross-sell effectively, operators must monitor usage of services or products in real-time to enable behavioural-based targeting - an approach that, until now, has been hard and costly to adopt. Yet with the development of tools, like our own Marketing Delivery Platform, which is able to monitor and harness data on usage and behaviour patterns and execute marketing offers in real-time, this can be addressed. Tools such as these will leave operators better equipped to launch targeted, personalised and perfectly timed offers delivered through the right channel, increasing the chances of users purchasing the additional offering.

Many operators may claim to understand the importance of personalisation; however they often fail to deliver. It is all very well offering deals such as buy-one-get-one-free or 3-for-2, but does the offer address what the user wants? Operators often send marketing offers to customers in a non-targeted way that is irrelevant to the user. Users won't purchase content or subscribe to a new service if it is not relevant and delivered at the right time. It is ineffective to offer new, more expensive services to a customer who has, for example, not paid their bill. Equally, it is pointless to offer content relating to Heavy Metal bands if previous purchases made by the user have been related to R&B - it will simply not work. What's more, if users are bombarded with multiple, irrelevant offers, there's a possibility it will be seen as spam, not a term you want associated with your brand. To prevent this, marketers should adopt a personalised, multi-step online marketing plan rather than generic, one-trick-pony plans. In this sense, operators need to follow the example set by online service providers who have maximised the potential of personalisation with "Amazon-like" recommendations linked to the users' preferences and real time behaviour.

This personalised approach has been proven to deliver compelling results. For example, one cable operator recognised its key competitive advantage was the delivery of video on demand (VoD), yet subscribers weren't purchasing this premium content. Through targeted marketing offers to different consumer categories the operator was able to convert users that had never previously bought content, and by analysing relationships between content categories was able to design cross-sell offers that leveraged purchases from one category to promote purchases in another relevant category. This not only drove VoD subscribers to pay for premium movies but also introduced customers to new categories, increasing overall sales in the promoted categories by 42 per cent over a nine month period. This is the strategy that operators will need to embrace if they want to out-perform the competition and improve levels of customer retention and profitability.

Timing is a critical component within the equation. All too often an opportunity can be missed if an operator does not react in time to retain relationships with users - meaning lost revenue and the potential of alienating the user by offering a mis-targeted, mis-timed offer. If a customer is buying a ringtone, it is best to offer a related cross-sell before the purchase is completed and not two weeks later as the user is less likely to want to spend extra on additional products or services. This is an opportunity which operators will miss if they don't have a full profile of the customer or have the ability to implement offers in real-time. So often it is only when you address the customer ‘here and now' - when their mindset is open to marketing offers and they are about to purchase, that you will achieve a high success rate and customer satisfaction. The chances of cross-selling or up-selling will have dramatically dropped if there is a lapse between time of purchase and offer. Furthermore by offering a deal at the point of purchase it will make the user feel as though they have a personalised service and, as long as it is something they will like, and there is a saving, the chances of the user following through with the offer increases. In this sense, it is essential operators react to the user's responses in real-time; keeping the communication with the user open to allow the execution of offers based on requirements. Even if the customer is not in the process of purchasing a service, this does not mean that they could not be tempted by a marketing offer. For example, with pre-paid mobile customers, operators should automatically offer a silent customer, with a low balance, a special top-up benefit. Through this the operator can improve response rates and drive ARPU from cross-sell and up-sell offers whilst user satisfaction increases.
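The ‘silent, low-balance' prepaid rule described above can be sketched as a simple real-time trigger. This is a minimal illustration, not any operator's actual logic: the thresholds, field names and offer text are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class PrepaidCustomer:
    msisdn: str
    balance: float              # remaining credit, in currency units
    days_since_last_topup: int  # our "silence" indicator

# Hypothetical thresholds -- a real operator would tune these per market.
LOW_BALANCE = 2.00
SILENT_DAYS = 30

def top_up_offer(c: PrepaidCustomer):
    """Return a top-up offer for silent, low-balance customers,
    or None when no offer should fire."""
    if c.balance < LOW_BALANCE and c.days_since_last_topup >= SILENT_DAYS:
        return "Top up today and receive 20% extra credit"
    return None

# A silent customer with low credit triggers the offer...
assert top_up_offer(PrepaidCustomer("447700900001", 1.50, 45)) is not None
# ...while a customer with a healthy balance is left alone.
assert top_up_offer(PrepaidCustomer("447700900002", 8.00, 45)) is None
```

The point of evaluating such rules in real time, rather than in a batch campaign run, is that the offer reaches the customer while their balance is actually low.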

One of the key advantages with cross-selling is the ability to promote products that are not necessarily one of the big sellers but something consumers would be interested in if marketed correctly. Initially when a new product is launched it can be a bestseller, but over time interest wanes and items can seem like one-hit wonders; this immediate surge of revenue is known as the Short Head. Mobile service providers play to this model by refreshing on-deck content to keep people engaged and encouraging them to make fresh purchases. Yet it is the less popular products that can deliver consistent revenue when offered alongside the latest content; this is known as the Long Tail and gives operators the unprecedented opportunity to cross-sell by promoting products being left behind. To do it effectively, service providers need a flexible platform to launch a diverse range of products in real-time, whilst exploiting the limited window of opportunity for content-related offerings. So when a customer is purchasing the latest remix of a particular song that is topping the charts, it is also the perfect opportunity to cross-sell the original version of the song. By doing this, the operator is selling an item which is not particularly popular but is still available to purchase, increasing ARPU. The Long Tail theory is that the Long Tail, over a period of time, will equal the Short Head in size; that collectively many products account for as many sales as the few bestsellers. With cross-platform selling exploiting the Long Tail and Short Head simultaneously, operators need to be able to capture and analyse customer preferences and lifestyle characteristics to support this one-to-one marketing. By using these factors to create packages and bundles - operators can turn on-the-fly promotions into optimal marketing offers.

So the bottom line is that to be able to run effective marketing offers you need to have current statistics of what is going on with all your products - what's popular, what's not. Through implementing effective marketing tactics, operators will be able to analyse in real-time the results they are achieving and measure any level of reduction in consumption; giving them ample time to rectify problems and turn them into a positive outcome.

Central to the effectiveness of these methods is having a full view of a customer's profile; that is, having full visibility of their preferences, behaviour and buying patterns in real time, so offers can be personalised to suit individual needs and delivered at the time, and in the manner, most appropriate for the customer. This is the key to ensuring that cross-sell and up-sell - the ‘what's next?' part of the CRM mix - delivers optimum results: driving revenue, improving customer loyalty and reducing churn.

Alon Werber is VP Marketing and Business Development at Pontis

A number of EU countries are now concerned about the reach and speed of their broadband infrastructure. Recent reports (such as Digital Britain by Lord Carter in the UK) highlight a range of public policy initiatives aimed at encouraging the delivery of both quasi-ubiquitous access and super fast broadband services.

Should broadband become part of the Universal Service? And if yes, what kind of broadband? Provided by whom? And who should pick up the tab?

The concept of Universal Service is not new and the term is thought to have been originally coined by Theodore Vail, the then CEO of AT&T in 1907. While at the time the term Universal Service referred to the need to connect different telephone networks to ensure the maximum reach, it is now generally associated with the provision of a basic level of service to all and at a subsidised price in "uneconomic areas".

A number of criteria have been used by regulators in the past to decide what should be within the scope of USO and what should be excluded. The EU criteria are whether: a) the service is used by a majority of users; b) use of the service is conducive to social inclusion; and c) quasi-ubiquitous availability of the service generates "general net benefits to all consumers".

While, in the past, broadband failed to pass the first of these three criteria, many countries have now reached broadband penetration levels beyond 50 per cent. The other two criteria have a less quantitative answer but it is now generally accepted that broadband access does contribute to social inclusion and that ubiquitous connectivity generates net benefits.

While plans for the roll out of New Generation Networks are still being developed it is becoming clear that the commercial development of these super fast networks (e.g. 50Mb/s and above) will be limited to high density areas. Many EU governments have decided that it is too early for them to commit to significant investment programmes (with taxpayers' money) and are opting for a "wait and see" attitude instead.

The development of next generation access networks suggests that countries will experience an increasing divergence between the speeds experienced by different user groups. Households in new urban developments with Fibre to the Home (FTTH) may enjoy speeds of greater than 100Mb/s, while users in less densely populated areas with legacy exchanges may get less than 2Mb/s.

This raises the question of whether Governments should step in to ensure that households are connected to the digital society and at what speed.

Historically, incumbent operators, given their former monopoly status and the quasi ubiquitous footprint of their networks, were deemed to be natural USO providers. More recently, however, alternative provision mechanisms that would, for example, allow different operators to bid for the right to provide USO in a given geography (and win if their bid is the lowest cost) have been discussed. Also, mobile operators that have been subject to coverage obligations and have developed innovative tariff packages (such as Pay As You Go) promoting social inclusion, have not received USO subsidies. With 3G now and LTE on the horizon, mobile and wireless operators may well seek to be part of the USO club and play a significant role in the delivery of ubiquitous broadband access in the future.

The question of funding of the USO is and has always been a very political and sensitive issue. Currently largely financed by the telecommunications industry itself (or in many cases the incumbent), the cost of USO should arguably be more widely distributed. In fact, many argue that governments should directly finance this public policy initiative.

Including broadband as part of the USO would be a way to provide a safety net under the growing divergence in broadband access, with very high broadband speeds for some while others remain excluded with little prospect of higher speeds. The criteria used to define the scope of USO are now consistent with such an addition. The details of the funding underpinning such an initiative are, however, yet to be resolved... and may significantly delay or even derail the laudable ambition of a broadband service for all.

Benoit Reillier is a Director and head of the European telecoms practice at global economics advisory firm LECG. The views expressed in this column are his own.

YOC was tasked with creating a mobile campaign to promote the launch of Kraft Foods' new instant coffee, Jacobs 3in1/2in1. It was to be integrated with traditional media to provide consumers with an uncomplicated way to order samples via mobile, whilst minimising sampling wastage levels and associated costs through accurate targeting and tester self-selection.

The main objective of the mobile campaign was to place product samples amongst early adopters, who are considered to be innovation savvy and opinion leaders. It was also important that the campaign maximised the reach of the target group and increased conversion rate of sample requests, while at the same time decreasing product distribution costs. Measurement needed to be simple and transparent. Kraft also wanted to gather data to develop a customer database.

YOC created a mobile advertising and sampling campaign, integrating a mobile call to action with online and traditional TV and print media. Shortcodes and key words were promoted across traditional print and TV ads, inviting consumers to send an SMS to the campaign shortcode. Consumers could directly respond and request a product sample by sending a one word text message. Participants were then sent a WAP push link to the mobile sampling portal where users could enter personal details to receive a sample. Alongside traditional media promotion, banner ads for the sampling campaign were placed on the Vodafone portal on the Nokia, Sat1, Pro7 Mobil MTV, Viva and YOC.mobi sites. 

Following MMA guidelines, the campaign placed choice and control in the hands of the consumer. Promoting the mobile campaign through traditional media gave users an opportunity to interact with the brand and request a sample if they were interested in the product.

The ‘3in1 / 2in1' campaign enabled almost 500,000 samples to be placed directly amongst the target group. Over 450,000 customers registered with Jacobs during the promotion, making the mobile sampling campaign one of the largest and most successful ever. More than 80,000 users registered their details to be used for future permission based marketing. 

Some 0.4 per cent of users who saw the television advert ordered a product sample via their mobile. The campaign saw high responses from users of mobile portals who were led directly from mobile banner advertising to the mobile registration portal. Almost 650 banners were placed on selected portals relevant to the target group, and these achieved a click-through rate of more than 3 per cent. 250,000 text messages were sent to the opted-in YOC Community members who were part of the defined target group. Thanks to the detailed profiling and selection of community members, 10.6 per cent responded and YOC distributed more than 26,000 samples to these respondents.
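As a quick sanity check, the SMS figures quoted above hang together: 10.6 per cent of 250,000 messages is 26,500 responses, consistent with the "more than 26,000 samples" distributed (the article rounds the figure down).

```python
# Arithmetic check of the SMS leg of the campaign figures quoted above.
texts_sent = 250_000
response_rate = 0.106            # 10.6 per cent of opted-in members

responders = texts_sent * response_rate
print(f"{responders:,.0f} responders")

# Banner side: ~650 banners achieved a click-through rate above 3 per
# cent, but the article gives no impression counts, so conversions from
# banners cannot be reconstructed here.
```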

NetCracker / Telus
Telus, the largest telco in western Canada and the country's second largest overall, underwent a five-year program to transform its back office systems and OSS processes. The company focused on retiring its old systems and constructing a new, integrated approach to customer and service management with new systems and software as replacements.

Telus merged its mobile and fixed arms in 2005 to reduce costs and, eventually, to provide fixed/mobile converged services. Telus wanted to roll out triple play services (voice, DSL broadband and IPTV) aimed at the residential market. To cope with the dramatic rise of new and converging services it was apparent that the operator needed a more sophisticated OSS system. And, as many telcos have discovered, without efficient provisioning and assurance it can be difficult to deploy and support complex but low-margin services profitably.

Telus decided it was more economical to go for outright replacement of its OSS environment, since integration and optimisation would be more expensive.

Part of that bold approach was to sign a deal with OSS specialist NetCracker Technology to supply and support its inventory and fulfilment systems. The new technology replaced Telus' legacy systems and integrated with the provider's service activation, network management and billing applications.

For managing legacy and next generation network resources, NetCracker provides resource inventory with outside plant, discovery and reconciliation, and design and planning modules. Its modules include asset management, which focuses on resources and equipment lifecycle management, and order management, for receiving and processing of customer orders.
Telus uses NetCracker software in both its enterprise and mass market businesses as it introduces new, advanced services: Layer 3 services such as MPLS and VPLS on the enterprise side, and triple play on the mass market side. The NetCracker architecture involves one central database handling everything, with different processing ‘instances' developed for different services.

Instead of installing technology first and designing processes to control it, Telus has taken a process-led approach: developing a forward-looking set of processes and bending the technology to work with them, rather than the other way around.
The key was to focus on its inventory systems first, so that it had a single view of the customer for a growing list of products, from the service representation right down to the physical network.

Ultimately, the development of the service layer through its OSS transformation has helped the company migrate to its next generation network.

Gavitec AG / Spanair
Spanair wanted to find a cost effective and efficient means to expedite its passengers' experience when travelling. They turned to Mobile Marketing Association member Gavitec AG, a subsidiary of NeoMedia Technologies, to help them to create a completely paper-free ticketing process via a system of mobile boarding passes that passengers could access from their mobile handsets during check-in and boarding.

The campaign set out to achieve two key objectives. The first was to reduce costs for check-in and boarding procedures. The second was to expand and streamline Spanair's customer service by providing customers with a flexible, convenient and innovative way to check-in for their flights.

Taking into account the day-to-day convenience that mobile provides to users, Gavitec sought to create an entirely paperless ticketing service. The service was enabled by EXIO scanners and followed the IATA publication of a mobile boarding standard based on 2D barcodes. Instead of a paper boarding pass, passengers received a 2D barcode directly to their handset via text message (MMS, EMS or SMS). The message then functioned as both a ticket and a boarding pass. Prior to travel the 2D bar code was scanned at the airport check-in desks and security points and validated via an online database, reducing waiting time for passengers.
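The issue-then-validate flow described above can be sketched as a minimal service. This is an illustrative sketch only: the payload format, class and method names are hypothetical, and the real system follows the IATA 2D barcode standard with a production online database rather than in-memory sets.

```python
import hashlib

class BoardingPassService:
    """Hypothetical sketch of the mobile boarding-pass flow: a barcode
    payload is issued per booking, then validated once against an online
    database at check-in. Not the IATA BCBP format."""

    def __init__(self):
        self._issued = {}   # payload -> booking reference
        self._used = set()  # payloads already scanned at the gate

    def issue(self, booking_ref: str, flight: str) -> str:
        """Create the string that would be encoded as the 2D barcode."""
        payload = hashlib.sha256(f"{booking_ref}|{flight}".encode()).hexdigest()[:16]
        self._issued[payload] = booking_ref
        return payload

    def validate(self, payload: str) -> bool:
        """Scan at check-in or security: accept only known, unused payloads."""
        if payload not in self._issued or payload in self._used:
            return False
        self._used.add(payload)
        return True
```

Marking each payload as used on first scan is what lets a purely electronic pass replace paper: a screenshot forwarded to a second phone is rejected on the second scan.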

Spanair is the first airline in Spain to offer its passengers a mobile check-in facility. The mobile system implemented by Gavitec reduced distribution and operational costs and positioned the airline as an environmentally friendly carrier. The system increased customer satisfaction, allowing passengers to reclaim time that would usually be spent waiting around by shortening check-in queues and providing a convenient and efficient means to board their flights. The mobile ticketing system also allowed Spanair to improve its customer relationship management methods and develop a number of customer-centric strategies through data collection on passenger flying habits, providing a new channel for one-to-one direct marketing.

Using the EXIO scanners, the system is now running in eleven Spanish airports, initiating a service that could save the airline industry up to $500 million each year. With the widespread installation of the streamlined and convenient service across Spanish airports, Spanair estimates that some 10 per cent, or 800,000, of its passengers will make use of the mobile boarding pass facility during 2009.

Has the pace of change in communications outstripped the ability of organisations' business management processes to keep up? Michael Coppack takes a look

The telecommunications industry, particularly on the cellular side of the fence, prides itself on being at the cutting edge of the latest technological innovations. Fortunately, for the cellcos, this perception is not only held among the industry's practitioners, but also by the end users, the subscribers.

In some respects, these beliefs are well deserved. The onset of the information age has changed the way businesses and individuals operate and communicate out of all recognition in the space of a single generation. Not since the Industrial Revolution has the world changed so rapidly.

The pace of change in the communications world has been so fast and absolute that it all too often masks over some fundamental problems, particularly when it comes to business management processes. When the ‘old' world of business collides with the ‘new', a certain amount of upheaval is inevitable.

Business leaders are right to be wary of new technologies that promise the earth and deliver nothing. For every solution that makes a difference, there are a whole host of others that fall short of the mark. For all its advancements, the communications industry bubble burst in the most spectacular of fashions at the turn of this century, and attention became focused on steady, real business practices rather than over-inflated stock market valuations.

Fast forward less than a decade and we once again find ourselves in an economic climate that, following some catastrophic mismanagement, advocates enterprise-wide risk management with safe and efficient business practices. This time around, though, the communications industry finds itself at an advantage. Communications companies are not completely immune to the effects of the credit crunch, but they are very well placed to ride out the crash. The comms industry is growing, globally, and will surely come out fitter and stronger on the other side of the current economic malaise.

The finger of blame for the most recent crash is being pointed fairly and squarely at the world of finance. Yet the world of finance could well end up supplying a solution that the communications industry will use to iron out some of the hidden inefficiencies from which it suffers. Ironically, these operational inefficiencies lie at the heart of one of the oldest, steadiest of departments: accounts.

It would be fair comment to say that accountancy suffers from certain unfortunate image problems. Bookkeeping is not generally perceived as an exciting discipline. Indeed, account reconciliation is a laborious, monotonous, process that requires meticulous attention to detail. It is precisely the type of process that is ripe for automation.

Yet account reconciliation retains a relatively low-tech reputation. Accounting within complex organisations, such as network carriers, is often carried out using completely different methods across the wide range of business units in place. Growth through acquisition, mergers and de-mergers, and the introduction of new technologies and new pricing models have tied the supply chains and accounting departments of communications firms in bureaucratic, fragmented, knots.

It seems odd, insulting almost, that an industry which is actively promoting unified communications and convergence technologies has not embraced automation within its own divisions.

Even at firms that embraced business process management solutions, such as those supplied by Oracle and SAP, account reconciliation is more often than not carried out by large teams of staff putting data into spreadsheets using semi-manual or completely manual processes. In today's tough economy it is essential for organisations to take control of their cash management and financial accounts to increase financial control and reduce operational risk.

The good news for shareholders and customers alike is that communications firms are increasingly recognising that manual financial transaction processing does not deliver the control and efficiency required to alleviate the challenges of balance sheet reconciliation. In many communications companies, the responsibility of account reconciliation falls to the individual departments who frequently use their own manual method to reconcile their sub and general ledger accounts.

At many firms, account reconciliation sees spreadsheets being populated manually with cash, cheque and credit card transaction data, which is then matched by an individual against bank statements. A typical organisation will have teams of employees permanently checking and re-checking accounts.

Generally speaking, manual processes provide little to no visibility, and proving an audit trail or tracking transactions can be a time-consuming, onerous and complex task that is highly prone to error. Resolving discrepancies also requires lengthy searches before business decisions can be made. This batchwise approach to account reconciliation is a major source of irritation and cost at the majority of communications firms.

A small number of vendors exist in the market that can provide automated account reconciliation. That number diminishes further when looking for a supplier of proven software to large, complex multinationals.

Financial directors are under an enormous amount of pressure, particularly in the current economic climate. And most of that pressure is being applied with the aim of cutting costs. Making an investment in technology requires an almost immediate, measurable, return.
There are seven basic areas that financial directors should consider when looking into automated account reconciliation solutions.
Does the solution:

  • Enhance control?
  • Improve timeliness?
  • Assure accuracy?
  • Reduce cost and risk?
  • Prove easy to use?
  • Prove easy to integrate? - and
  • Port across multiple processes?

Users need to be in total control of the reconciliation process from input to output. They must have real-time visibility into the status of the reconciliation process, giving the assurance that appropriate and effective controls are in place and used to best effect.

Manual reconciliation takes time. Effective systems must save considerable time and enable users to feel confident that the reconciliation is 100 per cent correct. The time saving alone makes automated account reconciliation solutions a sound investment, enabling users to focus on more useful and rewarding tasks.

With manual reconciliation it is easy to make mistakes. A reconciliation system needs to provide controls to help prevent common errors, and functions to make it easy to rectify them if they are made.

Automated account reconciliation solutions will often provide a full return on investment within the first year of implementation. As a result of the automatic matching of the vast majority of transactions, time is freed up to deal with exception management and risk reduction. Some leading systems will often automate up to 95 per cent of the transactions in the reconciliation process.
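As a rough illustration of the automatic matching step, a rule-based matcher might pair ledger entries with bank-statement lines on reference and amount, routing everything left over to exception management. The function and tuple format here are hypothetical; commercial products add fuzzy, many-to-one and date-window matching rules on top of this.

```python
from collections import defaultdict

def reconcile(ledger, statement):
    """Match ledger entries against bank-statement lines on (reference, amount).
    Each entry is a (reference, amount) tuple. Returns (matched, exceptions)."""
    # Index the statement so duplicate lines are matched at most once each.
    pool = defaultdict(int)
    for ref, amount in statement:
        pool[(ref, amount)] += 1

    matched, exceptions = [], []
    for ref, amount in ledger:
        if pool[(ref, amount)] > 0:
            pool[(ref, amount)] -= 1
            matched.append((ref, amount))
        else:
            exceptions.append((ref, amount))  # ledger entry with no bank line

    # Statement lines with no ledger counterpart also need investigation.
    unmatched_bank = [key for key, count in pool.items() for _ in range(count)]
    return matched, exceptions + unmatched_bank
```

The point of the design is that the high-volume, exact matches clear automatically, so staff time is spent only on the short exceptions list.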

The technology needs to be easy to use, preferably developed by accountants, for accountants. A good rule of thumb suggests that a system should be so easy to use that it requires less than one day of training for effective use.

The technology also needs to be flexible enough to handle input files from all banks and financial systems that can export account information in a file. This should be a standard module in almost all systems. The implementation and configuration process should take a minimum of time.

Finally, if you're looking to invest in an automatic account reconciliation solution you should look for a proven supplier that offers solutions across a range of markets and industries, that is not only used for bank reconciliation, but is also used for reconciliations of technical systems, internal account reconciliation, and for the reconciliation of inter-company accounts.
Firms need total control of their reconciliation process from input to output, with real-time visibility into its status. With these systems in place, communications firms can iron out hidden operational inefficiencies and play a leading role in taking the world out of recession.

Michael Coppack is Managing Director UK, Adra Match

The BSS Summit offers a comprehensive conference with an operator-only speaker panel, interactive tutorials and discussion sessions, networking events and exhibition of BSS suppliers

The BSS Summit, which runs on June 8th-11th 2009 in Amsterdam is billed as the event that will help telco businesses optimise their BSS infrastructure to drive efficiency and sustain revenues. 

Through a mixture of executive keynotes, service provider case studies, real time interactive polling, panel discussions and expert masterclasses, the BSS Summit aims to address a range of thorny questions, including: will your current BSS strategy allow you to protect your existing revenue streams while exploiting new opportunities; optimise ROI from your existing billing infrastructure; effectively manage the customer experience and develop the right business model to develop and maintain your position as a market leader? Also, are you prepared to do more with less? How will you manage your business processes to maintain the flexibility, efficiency and effectiveness of your BSS platform during this period of economic uncertainty?

The intention, according to organisers IIR Telecoms, is to help delegates meet the challenges of delivering world-class BSS performance within tighter budgets and a highly competitive market environment, advising on the best approaches to:

  • Design, plan and prepare for BSS projects more intelligently and realistically
  • Deliver BSS projects within restricted cost and time frameworks, supporting organisation-wide efforts towards greater efficiency and leaner use of resources
  • Demonstrate practical, visible improvements within shorter timeframes

Details: www.bsssummit.com

Some telecom CEOs seem to believe that in recession ‘branding' is less important than delivering on fundamentals.  However, while functional issues are important, brand image is increasingly becoming a key success factor both internally and externally.  In fact, mobile telephony has become one of the great battlegrounds of consumer branding.

The sector is complex and becoming more so by the day.  There is choice of network standards (GSM or CDMA), distributors (Carphone Warehouse or Phones4U), network operators (Vodafone or Orange), tariffs (Dolphin or Passport), devices (Blackberry or Sony Ericsson), applications (mobile money transfer or mobile Internet) and content (Sky or Bloomberg). 

Consumers are confused.  Speed of change, technology convergence, a plethora of products and services, and sheer complexity are all making it more difficult for consumers to choose in the telecoms market. 

This is exactly why strong brands are so important.  Brands were pioneered in the US fast moving consumer and durable goods markets to help consumers make decisions more quickly and simply.  Brands simplify choice.

They are now doing just that in the telecoms sector and some brands are becoming increasingly dominant.

Vodafone is currently the most valuable and strongest brand in the world, boasting an AAA BrandBeta rating and a value of $24.6 billion.  This reflects the fact that Vodafone is now present in over 60 countries worldwide with over 250 million subscribers (Source: BrandFinance).

When Vodafone went on its M&A spree in 2000 the strategy was based on creating the first truly global mobile brand.  As over 40 of the countries where Vodafone now operates are licensed ‘partner' markets rather than owned subsidiaries, it becomes clear how powerful the brand has become.  Behind its mantra of being ‘Red, Rock Solid and Restless' Vodafone has built an internal brand culture which has been communicated externally.  Football and F1 sponsorships, heavy advertising, exciting product PR and the ubiquitous red trade dress have lifted spontaneous consumer awareness from 10 per cent to 90 per cent globally. 
Strong brand positioning, awareness and emotion make new market entries easier.  Look at the hugely successful re-branding of Hutch in India - no mean feat considering what a great brand Hutch was before the transition.  Strong branding also enhances acquisition rates, lowers lapse rates, and increases price premiums and ARPU levels.

There is no doubt that Vodafone is rapidly reaching the dominant position of a VISA or Coca Cola.  This creates a huge advantage and puts other operators on the back foot.  In response local brands are smartening up their act and some are trying to emulate Vodafone's success.  Look at Zain and Etisalat in the Middle East or O2 and Orange in Europe.

The Blackberry brand is another example of power branding.  Research In Motion (RIM) has boomed on the back of its technology and Apple has cleaned up with the iPhone.  In both cases it is debatable whether they would have become so popular, with businessmen and consumers respectively, if they had not developed such strong brands to complement their technology.  Meanwhile Nokia continues to dominate the mainstream handset market and commands an AAA- BrandBeta rating and a brand value of $19.9 billion (Source: BrandFinance).
The reason brands become so valuable is because when they are cleverly devised, well managed and consistently invested in, they secure demand and leverage all the other intangible assets of a business, increasing the life of patents and technology.  They become reservoirs of value or counter balances that maintain momentum even in tough times.
But brands need strong leadership, a clear point of view and consistency. Take Virgin.  In 1968, Richard Branson, the idealistic college dropout, developed an enduring brand promise. In his words:

"The Virgin brand promise is based on five key factors: value for money, quality, reliability, innovation and an indefinable, but nonetheless palpable, sense of fun.

"At Virgin, we know what the brand name means, and when we put our brand name on something, we're making a promise. It's a promise we've always kept and always will."
Virgin has a strong ‘positioning', allowing it to excel as an MVNO, amassing customers and fortunes along the way.

But telecoms CEOs can build brands too. Hans Snook, Peter Erskine, Jorma Ollila and Arun Sarin all followed the Branson lead.  Every telecoms CEO needs to embrace branding to get through the recession in good shape.

David Haigh is Founder and CEO of Brand Finance.
Brand Finance produces the BrandFinance500 annual survey of the world's strongest global brands. 

Ray Adensamer argues that Voice Quality Enhancement can help deliver the standard required of VoIP conferencing systems

Audio conferencing services based on circuit switched networks and audio bridging equipment have provided hosted conferencing users with a benchmark for pricing and quality in voice communications. While next generation networks based on VoIP technology introduce economic benefits with new feature capabilities for conferencing service providers (CSPs), they also present new technical challenges in maintaining acceptable voice quality. Delivering good voice quality is an important requirement in any VoIP conferencing system, as poor voice quality will increase the costs associated with customer churn, while impacting the bottom line by reducing revenue growth prospects.

Voice Quality Enhancement (VQE) encompasses an integrated set of features designed to overcome common audio quality problems in VoIP conferencing services, including noise, packet loss and echo. A comprehensive VQE solution also measures VoIP quality metrics, which are used in ongoing voice quality measurement associated with service level agreements.

Many features inherent in a VQE solution require sophisticated digital signal processing algorithms. The rapid, scalable execution of these algorithms dictates a product specifically designed for real-time IP packet processing. Fortunately, in a next-generation VoIP audio conferencing architecture, a network element already exists with carrier-class real-time IP packet processing power. And that network element is the IP media server.

The three most common sources of VoIP audio quality problems in a VoIP or IMS network are noise, packet loss and echo. This section discusses each of these VoIP audio quality challenges and describes the conceptual solutions to overcome quality problems.

Audio noise
Gone are the days when people were confined to quiet office and residential environments. Today, with mobile phones and the Internet, people are calling from their cars, airports and from just about anywhere, and these environments are flooding the mouthpiece with all kinds of unwanted sounds that ultimately get onto the call. Making matters worse, callers using laptops and mobile phones are typically saddled with marginal equipment such as low cost earphones and microphones.

This section describes a combination of mechanisms that reduce and help manage the disturbing effect of audio noise: noise gating, noisy line detection and noise reduction.

Noise gating
Noise gating is a simple yet effective mechanism to reduce background noise.
When no speech is detected on a line, its signal is attenuated (ie its level is reduced), which prevents unnecessary noise from being inserted into a VoIP recording or conference mix. The attenuation is configurable, so the conferencing application can avoid making the signal unnaturally quiet when the noise gate is applied.
Key benefits of noise gating:

  • Reduces background noise using a simple yet effective mechanism
  • Supports configurable attenuation
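A minimal sketch of the gating step, assuming frames arrive as lists of float samples and a separate voice activity detector supplies the speech flag (function and parameter names are illustrative):

```python
def noise_gate(frame, speech_detected, attenuation_db=12.0):
    """Attenuate a frame when no speech is detected on the line.

    attenuation_db is the configurable amount of attenuation, so the gated
    signal is merely quieter, not unnaturally silent."""
    if speech_detected:
        return list(frame)          # speech passes through unchanged
    gain = 10 ** (-attenuation_db / 20.0)  # e.g. 12 dB -> gain of about 0.25
    return [s * gain for s in frame]
```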

Noisy line detection
There are times on a conference call when some lines are very noisy and disrupt the productivity of the entire call. Noisy line detection measures the noise on audio ports and sends a noisy line notification message to the VoIP application server if a predefined threshold is exceeded, as shown in Figure 2. A second message is sent if the noise subsequently falls below the threshold.

Key benefits of noisy line detection:

  • Notifies the application server of noisy line conditions, initiating possible corrective action
  • Enables quick remedial action by the application server or the operator (eg mute line)
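The threshold-crossing behaviour can be sketched as follows, with hypothetical names and a callback standing in for the notification message sent to the application server:

```python
import math

class NoisyLineDetector:
    """Measures per-frame noise power on a port and notifies only when the
    line crosses the configured threshold, in either direction."""

    def __init__(self, threshold_dbfs=-45.0, notify=print):
        self.threshold = threshold_dbfs
        self.notify = notify   # stand-in for the message to the app server
        self.noisy = False

    def process_frame(self, port, samples):
        power = sum(s * s for s in samples) / max(len(samples), 1)
        level = 10 * math.log10(power) if power > 0 else -120.0
        if level > self.threshold and not self.noisy:
            self.noisy = True
            self.notify(f"port {port}: noisy line detected")
        elif level <= self.threshold and self.noisy:
            self.noisy = False
            self.notify(f"port {port}: noise cleared")
```

Sending one message per crossing, rather than per frame, is what keeps the application server from being flooded while a line stays noisy.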

Noise reduction
While a noise gating function described earlier provides a relatively simple solution to eliminating noise when no speech is detected, noise reduction goes a step further by using digital processing techniques to remove the noise and leave the important speech signal intact. This provides benefits in many VoIP applications, such as removing noise from VoIP audio recordings or noisy caller lines in a conference mix.
Key benefits of noise reduction:

  • Filters out noise without impacting the speaker's signal
  • Reduces noise continuously, whether speech is detected or not
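A heavily simplified sketch of the idea: track the noise power during non-speech frames, then apply a Wiener-style gain that suppresses roughly that much power from every frame. Real implementations work per frequency band; this time-domain version, with hypothetical names throughout, only illustrates the principle.

```python
class NoiseReducer:
    """Crude power-subtraction sketch: noise power is estimated on
    noise-only frames, then subtracted from every frame, speech or not."""

    def __init__(self, floor=0.1, alpha=0.9):
        self.noise_power = 0.0
        self.floor = floor   # minimum gain, avoids gating to total silence
        self.alpha = alpha   # smoothing factor for the noise estimate

    def process(self, frame, speech_detected):
        power = sum(s * s for s in frame) / max(len(frame), 1)
        if not speech_detected:  # update the noise estimate on noise-only frames
            self.noise_power = self.alpha * self.noise_power + (1 - self.alpha) * power
        if power <= 0:
            return list(frame)
        # Keep the fraction of the power that is not noise; sqrt gives amplitude gain.
        gain = max(self.floor, (power - self.noise_power) / power) ** 0.5
        return [s * gain for s in frame]
```

Unlike the noise gate, this runs continuously: noise-only frames are suppressed hard, while frames dominated by speech get a gain near one and pass through largely intact.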

Dropped packets
The Internet is an amazing network of interconnected computers, but it's not perfect. The network employs the IP protocol, which does not guarantee packet delivery. Hence, when IP networks get busy or congested, packets can get lost or delayed. While lost packets are not critical for many data applications, packet loss in real-time VoIP services can cause significant audio quality problems. Without special technology to compensate for dropped packets, the result is an abnormal audio signal that might sound ‘choppy.'

Packet loss concealment
Packet loss concealment is a technique for replacing audio from lost or unacceptably delayed packets with a prediction based on previously received audio. Whereas any voice repair technology would have difficulty recovering from extreme packet loss in abnormal conditions, packet loss concealment is designed to perform intelligent restoration of lost or delayed packets in the large majority of congested network scenarios.
Key benefits of packet loss concealment:

  • Softens any breaks in the voice signal
  • Reduces the occurrences of choppy audio
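A toy version of the prediction step simply replays the last good frame with a decaying gain, so extended outages fade to silence instead of buzzing. Frames are assumed to be lists of float samples and the class is hypothetical; production concealers use pitch-based waveform extension rather than plain repetition.

```python
class PacketLossConcealer:
    """Replace a lost frame with the previously received audio, faded a
    little more on each consecutive loss."""

    def __init__(self, fade=0.5):
        self.last_frame = []
        self.fade = fade
        self.consecutive_losses = 0

    def receive(self, frame):
        """A packet arrived on time: remember it and reset the fade."""
        self.last_frame = list(frame)
        self.consecutive_losses = 0
        return frame

    def conceal(self):
        """A packet was lost or too late: synthesise a replacement frame."""
        self.consecutive_losses += 1
        gain = self.fade ** self.consecutive_losses
        return [s * gain for s in self.last_frame]
```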

Acoustic echo
An acoustic echo is created when sound emanating from the receiver's speaker (eg handset or speakerphone) is transmitted back by the receiver's microphone. This is depicted in Figure 3, where the Sender (on the left) transmits a signal to the Receiver, and an acoustic echo is created when some speech energy ‘bounces back.' In a VoIP conferencing application, all participants will hear an echo except for the guilty party with the device causing the echo. Since nobody can quickly answer the basic question, "Who's causing the echo?" troubleshooting echo issues in a VoIP conference call can be difficult and frustrating.

Acoustic echo cancellation
Acoustic echo cancellation (AEC) technology is designed to detect and remove the sender's transmit (Tx) audio signal that bounces back through the receive (Rx) path. By removing the echo from the signal, overall speech intelligibility and voice quality is improved.

AEC in a VoIP network is particularly challenging. In a traditional voice network, once a voice circuit is established through the PSTN, the round-trip echo delay is constant. However, in a VoIP network, packet delay is a variable, hence the echo delay is also a variable for the duration of the call, which makes the echo cancellation algorithms in any VoIP quality improvement product more complex and processor-intensive than an equivalent echo cancellation solution in a circuit-switched network.
Key benefits of acoustic echo cancellation:

  • Removes a sender's audio echo from the receive path
  • Addresses variable packet delay inherent in IP networks
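At the core of AEC is an adaptive filter that learns an estimate of the echo path and subtracts the estimated echo from the receive path. A minimal normalised-LMS sketch follows; it ignores the variable-delay tracking discussed above and assumes the echo path fits within the filter length. All names are illustrative.

```python
class EchoCanceller:
    """Normalised LMS sketch: estimate the echo of the far-end (Tx) signal
    present in the near-end (Rx) signal and subtract it."""

    def __init__(self, taps=16, mu=0.5):
        self.w = [0.0] * taps   # adaptive estimate of the echo path
        self.x = [0.0] * taps   # recent far-end samples, newest first
        self.mu = mu            # adaptation step size, 0 < mu < 2

    def process(self, far_sample, near_sample):
        self.x = [far_sample] + self.x[:-1]
        echo_estimate = sum(wi * xi for wi, xi in zip(self.w, self.x))
        err = near_sample - echo_estimate   # echo-free residual sent onward
        # Normalised update: step scaled by the far-end signal energy.
        norm = sum(xi * xi for xi in self.x) + 1e-6
        self.w = [wi + self.mu * err * xi / norm
                  for wi, xi in zip(self.w, self.x)]
        return err
```

Because the filter adapts continuously, it can follow a slowly drifting echo path; the extra difficulty in VoIP is that packet-delay variation moves the echo around faster than a fixed-length filter naturally tolerates.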

Voice quality metrics
Technology to remove audio quality impairments in a VoIP network is an important part of any solution. But along with the functions to improve VoIP quality, service providers also need a standard, objective way to measure voice quality in order to accurately monitor performance levels and uphold service level agreements (SLAs) with customers.
Voice quality metrics can be divided into three groups: packet, audio and acoustic echo cancellation (AEC). All statistics are captured for each call leg of a conference call to help with granular troubleshooting of audio quality problems and performance measurement. Packet statistics measure performance with respect to packet throughput, loss and delay, while audio statistics measure speech and noise power levels. AEC statistics measure echo delay and echo cancellation performance.
Key benefits of voice quality metrics:

  • Provides objective measurements for administering service level agreements (SLAs)
  • Facilitates the troubleshooting of audio quality issues in the network
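A sketch of the packet-statistics group, loosely following the receiver-report calculations in RFC 3550 (class and field names here are hypothetical, and the loss figure assumes sequence numbers start at zero without wrapping):

```python
class CallLegStats:
    """Per-call-leg packet statistics: loss derived from RTP-style sequence
    numbers, plus a smoothed interarrival jitter estimate."""

    def __init__(self):
        self.received = 0
        self.highest_seq = None
        self.jitter = 0.0
        self._last_transit = None

    def on_packet(self, seq, send_ts, recv_ts):
        self.received += 1
        if self.highest_seq is None or seq > self.highest_seq:
            self.highest_seq = seq
        transit = recv_ts - send_ts
        if self._last_transit is not None:
            d = abs(transit - self._last_transit)
            self.jitter += (d - self.jitter) / 16.0  # RFC 3550-style smoothing
        self._last_transit = transit

    def loss_percent(self):
        if self.highest_seq is None:
            return 0.0
        expected = self.highest_seq + 1  # sequence assumed to start at 0
        return 100.0 * (expected - self.received) / expected
```

Kept per call leg, these figures let an operator see which participant's network is dropping or delaying packets, and provide the objective numbers needed to administer an SLA.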

Voice quality enhancement
Voice quality enhancement (VQE) encompasses an integrated set of features designed to improve VoIP quality and generate statistics needed for ongoing performance monitoring. This requires sophisticated digital signal processing algorithms that perform rapid real-time IP packet processing, a key component in next-generation VoIP audio conferencing architecture. As such, VQE can be deployed in an existing IP media server, which provides the requisite carrier-class real-time IP packet processing power.

IP media servers, also known as the Multimedia Resource Function (MRF) in an IMS architecture, are specifically designed to deliver real-time IP media processing as a common, shared resource for a broad range of VoIP and IMS applications in a next-generation network.

They also deliver real-time processing of codec algorithms, transcoding of codecs and sophisticated audio mixing for conferencing applications. Since media server and VQE tasks are interrelated and require the rapid execution of IP packet processing algorithms, it makes sense to integrate the functions of both into a single network element.

Ray Adensamer is Senior Product Marketing Manager, RadiSys


