Features

Originally, telecom providers built and offered a limited, controlled range of services from which customers could choose. When competition was introduced into the market, along with the internet, consumer awareness of choice grew with it. In the future, Phil Kingsland contends, consumer demand will drive the development of new and specific services, and a key enabler of these could well be Public ENUM

Convergent communications isn't a new concept, but it's the topic that continues to dominate the telecommunications industry.  As the new and traditional technologies continue to converge, the number and types of products and services available will grow and evolve with suppliers offering a combination of services, tools and applications for users to communicate with.

As the technologies converge, so do the telecommunications and internet industries. One of the challenges for the two industries is their very different speeds of development and innovation, especially between fixed-line telecoms and packet-switched internet telephony.

The telecommunications industry is 140 years old and provides trusted, regulated services - with the reputation that regulated industries carry for the speed at which they innovate and develop new technologies.

In contrast, the new kid on the block for the last decade or so has been the internet, which runs on a bottom-up, self-regulated, multi-stakeholder model that has delivered a fluid environment of constant change and innovation. The internet's model enables services to be developed, tried, and adopted or rejected faster, without the large-scale investment required to launch a regulated telecommunications product. This has allowed the industry to introduce a number of new and innovative ways to communicate that we might otherwise not have seen.

The combination of two very different industries is having an enormous impact on the telecommunications industry. As convergence evolves, the classification of services becomes blurred. Customers can now get a multitude of services from many providers, and those services are steadily being consolidated, increasing competition in an already fierce market.

A new issue introduced by this convergence is the inability to reach one VoIP telephone system from another using the associated telephone number. If the VoIP address of the call's recipient is not explicitly known, the call must be routed via the public switched telephone network (PSTN) to identify the called party. ENUM was designed to address this issue: it maps telephone numbers into domains stored in the internet domain name system (DNS). The owner of the domain can record both the PSTN telephone number and the VoIP address against the ENUM domain. This allows people to use traditional telephone numbers to connect VoIP phones, without needing the PSTN to find the corresponding phones.
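The transformation from dialled number to ENUM domain is mechanical, and worth seeing concretely. Under the Public ENUM scheme (RFC 6116), the E.164 number is reduced to its digits, the digits are reversed and dot-separated, and the result is placed under the e164.arpa apex. A minimal sketch in Python, using a fictitious UK number:

```python
def e164_to_enum_domain(number: str) -> str:
    """Convert an E.164 telephone number to its Public ENUM domain name.

    Per RFC 6116: keep only the digits, reverse them, separate them
    with dots, and append the e164.arpa apex.
    """
    digits = [ch for ch in number if ch.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

# A fictitious UK number, +44 20 7946 0123:
print(e164_to_enum_domain("+44 20 7946 0123"))
# → 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```

The caller's resolver then queries the DNS for records at that name; if none exist, the call falls back to the PSTN as described above.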

The implication is that users with an ENUM-aware VoIP phone can reach any registered user over the internet without using the traditional PSTN. Users simply dial a telephone number in the usual way - it is then transformed into an ENUM domain name. A lookup is carried out and the call is routed according to the preferences the called party has set. If the number called is not in the ENUM database, the call proceeds to the person's non-VoIP telephone and is charged in the normal way.

Another feature of the ENUM protocol is that users can register multiple resource addresses in their ENUM domain - VoIP servers, mobiles, email, websites and so on. This makes it possible to converge multiple types of communication on a single telephone number, and for new services to be created to exploit this.

For example, a person may choose a VoIP option from a returned ENUM query to reduce call costs, or present a caller who queries their ENUM domain with the type of communication they are available on at different times of day - for example, providing telephone numbers and email addresses in business hours and only an email address out of hours.
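That time-of-day policy amounts to a simple filter over the contact URIs registered in the domain. A hypothetical sketch (the service names follow the E2U enumservice convention; the addresses and the 9:00-17:30 business window are invented for illustration):

```python
from datetime import time

# Contact URIs a user might publish as records in their ENUM domain.
CONTACTS = {
    "E2U+sip":   "sip:alice@example.net",
    "E2U+tel":   "tel:+442079460123",
    "E2U+email": "mailto:alice@example.net",
}

def contacts_offered(now: time) -> list:
    """Offer everything during business hours; email only outside them."""
    in_business_hours = time(9, 0) <= now < time(17, 30)
    if in_business_hours:
        return sorted(CONTACTS.values())
    return [CONTACTS["E2U+email"]]

print(contacts_offered(time(11, 0)))  # all three URIs
print(contacts_offered(time(22, 0)))  # → ['mailto:alice@example.net']
```

In a real deployment the filtering would be expressed in the records themselves or in the registrant's provisioning service, not in the caller's client; the sketch only shows the policy.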

For consumers, these types of services will become invaluable as we begin to use IP communications in all our devices and accumulate more addresses for each contact in our communications portfolios. A service provider that offers subscribers effective and cost-efficient management of their communications will build loyalty by providing tangible benefits, and will be able to charge for that value.

This is a significant move away from the traditional business model of communications providers, as it places more emphasis on these value services than on call charges. Some service providers argue against the use of Public ENUM for fear of the control it hands to users. These providers may choose instead to exploit the benefits of ENUM via a private ENUM registry, which offers them the opportunity to protect existing telecoms business models and other commercial information.

However, it is yet to be proved whether users want to manage their own service, or whether they are prepared to pay a service provider for services that help them control their communications via their Public ENUM. Certainly the argument over call charges is diminishing as more minutes are added to inclusive deals: Ofcom in the UK reports that in 2008 a mere 14% of pay-monthly mobile subscribers claimed to usually exceed their inclusive minutes.

Another example of the dilemma that convergence presents is the current influx of mobile VoIP applications, which has put many mobile providers in fear of losing revenue and caused a number of carriers to block VoIP calls over 3G. In April this year, this led the European Union to consider a ban on carrier VoIP filtering. Should this proposal be passed by the commission, users will have widespread access to a variety of free calling tools, making IP communications technology more commonly understood and used. At that point, having an ENUM-enabled mobile phone would become a very powerful business tool, and would open the door to broad consumer use.

It is clear that an IP connectivity technology such as ENUM is central to the continued development and convergence of telecommunications and internet technologies. 

For the full advantages of Public ENUM to be realised, a sizeable group of users needs to have registered their numbers and be able to perform ENUM lookups. There is some debate about how and when this will happen. It may take an application or service that really taps into business drivers to propel widespread adoption. This could be realised via any number of tools that bring ENUM into consumers' consciousness, in the same way that certain VoIP products in the internet space have made IP telephony accessible for everyday users.
In contrast to public ENUM, private ENUM tips the balance in favour of the provider.  It gives suppliers the ability to manage the service and therefore the customer and also retain more control over revenue streams. 

With consumers now more aware of choice than ever, it is vital that telecommunications providers offer user centric products to retain existing customers and win new ones.  At a time when all business is more competitive than ever before, these issues should have a considerable influence when deciding what product set to offer customers.

Consumers are no longer happy to just accept traditional service offerings that provide the best benefits to the supplier.  They have learnt from competition and the internet that there is another way.  Customers are concerned not just with the current offering, but also how that impacts on the future development of products, services and applications.  A communications strategy is central to the success of any business, so being at the forefront of technology and having the ability to adapt to future developments is vital to continued success.

Opening up IP connectivity, using Public ENUM, supports the continued innovation and evolution of the telecommunications market.  It establishes opportunities for applications to be developed that will benefit users in new ways and create openings in the market for new business models.

Public ENUM has the advantage of being readily available, cheap to provide and already deployed. It sits comfortably alongside existing communications services, enabling suppliers to offer truly converged communications that adopt the best features of both the telecommunications network and the internet.

There are a number of forces in both the telecommunications and internet markets that will decide the future direction of ENUM's role in converging communications.  Undoubtedly one of these will be consumer demand.  If business users realise the potential of Public ENUM and demand a service that gives them control of their communications, the type of service they've learnt to expect from the internet, suppliers will need to meet this need to retain business.  What's clear is that Public ENUM presents the possibility of a myriad of solutions and applications that suppliers may not even have begun to realise.

Phil Kingsland is Director of Marketing and Communications, Nominet UK

The future of pay-TV is hybrid says Francois Pogodalla. What hybrid means is the  combination of DVB reception techniques for receiving broadcast digital video, with IP capability to receive video or other multimedia content over Ethernet. The opportunities for new services enabled by hybrid products are immense

The benefits and drivers for the hybrid strategy apply to both the wide area services of switched digital video, video-on-demand, over-the-top and other interactive applications, and the local area services, that is, the home network.

This is how we at ADB see convergence: not device-centric, but content-centric. Hence the goal is to facilitate access to content, whatever its shape or origin.

The home network is a key element of true content convergence, with set-top boxes interacting with the devices in the home, including personal computers, games consoles, portable media players and mobile phones. This means that personal content can be shared throughout the home, and played on the TV or the home theatre system. Utilizing the home network, these products also allow for new applications such as multi-room PVR and tuner sharing.

Connecting the home network with the content provider's network and the internet allows consumers to access their content whenever and wherever they want. They can even record content from wherever they happen to be. The benefits of the hybrid strategy for wide and local area applications, although different, can cross over to some extent. Take a satellite broadcast network with an installed base of PVRs: having an IP connection can allow other applications, such as uploading locally stored content to the network or to other boxes in the home, with an opportunity for new VOD services that include not only the programming the operator has pre-selected, but also the content users themselves have recorded.

Adding hybrid technology onto the DVB platform opens the door to these types of services. Customers demand plug and play. Bringing the world of the PC and the Internet to the television means making the system easy to install and use for TV viewers.

What lies behind simplifying the customer experience is the knowledge we have accumulated over the years in managing the complex software involved in providing many different kinds of services through the set-top box. Hybrid technology is complex from the software management perspective. It requires implementing the full IP software stack in parallel with all the other software that additional hybrid services demand. There is the MHP middleware that takes care of terrestrial digital video reception, with the IP software stack sitting next to it for receiving video over the IP connection. Then you add the local area applications, the DLNA implementation for home networking, plus DRM (security) and multi-room PVR services. This combination of software can put a high level of stress on the efficiency of the software stack. Building a hybrid product by simply bolting different software modules together can mean a product that is unstable, difficult to use and slow.

It is very important to be able to stabilize the complex solutions involved, and there is tremendous value in the vertical control of the entire software stack. ADB is one of the few companies that can write the complete set of software, from the low-level operating system up to the high-end applications, in a hybrid environment. We are leading the industry in software integration: the first set-top box company to be certified to use the new DLNA home networking standards, and the first in the world to deploy the new MPEG-4 video codec back in 2005.

The best approach to hybrid technology is to deploy proprietary implementations of open standards.  Using open standards, such as DLNA, is the best way of guaranteeing interoperability and ensuring the industry remains dynamic, with devices being able to interact with one another. Using a proprietary implementation of an open standard also ensures that all elements are fine-tuned for optimal performance.

When we designed the hybrid box for customers back in 2005, there were no chips performing MPEG-4/H.264 decoding, so we implemented our own MPEG-4 decoding software, based on the standard, using an off-the-shelf digital processor.

We have developed our own implementation of the MHP standard, which is merged with the customer's IP stack. We have now gone even further, adding local area applications - home networking - through implementing the DLNA standards that support picture and music exchange and multi-room PVR throughout the home. ADB is also implementing technology from partners such as Stream Group, whose product Solocoo opens up the wider world of Internet videos.

Making it easy for the TV viewer to access a multitude of new services means providing these services as part of the TV experience, not a PC experience. It's important to retain the simplicity of the TV experience in this complicated environment. ADB is constantly striving to make improvements in order to present the additional choices to the consumer in as straightforward a way as possible. For example, to assist viewers in navigating the additional content, all services provided through the box are made available as channels. Another innovation is the use of 3D graphics in the user interface, so that navigating the services available becomes a more pleasant and intuitive experience for the end-user.

For the operator, the use of hybrid techniques means the extra services provided can increase subscriber retention, hence reduce churn, and offer opportunities to increase revenue. There is definite interest from operators in hybrid technology, and in making better use of their assets.

The addition of IP to the DVB digital video set-top box opens up endless possibilities for added-value content - VOD and access to user-specific content, whether from the Internet or the user's own recordings. There's no doubt that hybrid solutions offer outstanding tools to limit churn - for instance, adding access to the user's own content makes the box a personal device. IP also enables the operator to free up broadcast bandwidth, thus limiting future investment in infrastructure.

Different markets worldwide may be going at different speeds, but they're all moving in the direction of a hybrid strategy - bringing the Internet to the TV and adding local area applications will happen everywhere. Digital will do for television what it's done for telecoms, which is to open up endless possibilities. The industry can now be really creative in exploring all the opportunities that a truly converged TV experience brings.

Francois Pogodalla is CEO at ADB SA

Rahm Emanuel, Chief of Staff in the Obama White House, said recently: "Rule one:  never allow a crisis to go to waste. They are opportunities to do big things". Like introducing transformational tools that may just keep your business afloat, says Aaron McCormack

It's not just the financial services industry that has suffered in the current economic downturn. Many other sectors have been affected in a crisis that, to date, has been marked by large reversals in financial results, mass layoffs, bankruptcies and the disappearance of long-standing companies. And while the question of how we got here could occupy analysts, politicians and commentators for years to come, the more immediate concern for business leaders is to find ways to respond to the circumstances that now surround them.

The issue was to the fore when the Forum of Young Global Leaders, of which I am a member, met at the World Economic Forum's annual meeting in Davos early this year. The question we asked was: "Can we create a system where we value genuine value creation beyond the quarterly results?"

If so, this is exactly the right time to make the changes needed. As London Business School's Donald Sull said in a recent article in the Financial Times: "Every downturn opens a window of opportunity to adjust the status quo and astute managers push through necessary changes while the window is open". This raises two wider questions. The first: what will the post-crisis order look like? The second: what can organisations do to prepare for it, especially at a time when budgets are tight and hard choices often have to be made?

As we all know, crystal balls are a notoriously unreliable way of finding answers. But one thing is certain - like it or not, we are all heading for a low carbon future, and we need to get there fast. Certainly, this seems to be where political leaders are placing their bets, with calls for ‘new green deals' coming from every part of the globe. UN Secretary-General Ban Ki-Moon is just one of a number of prominent leaders who believes we are being presented with a unique opportunity to address two crises at the same time: climate change and the economic downturn. Business leaders also seem to be waking up to what Lord Stern of Brentford, the author of the 2006 review that laid out the economic case for fighting global warming, is calling ‘green stimulus'.

Based on BT's experience, introducing conferencing, and using it to its full potential, is going to be an important next step for many firms. Analyst house IDC is in agreement. In 2008, the Unified Communications (UC) market in Europe was worth $2.6 billion. By 2013, IDC expects this to increase at a Compound Annual Growth Rate (CAGR) of 39% to a value of $13.5 billion, making it one of the brightest spots in a very tough technology market. "In such a challenging market, where spending is plummeting, there is a strong opportunity for solutions that can reduce expenses such as travel in the short-term," said Chris Barnard, research director at IDC. "This means that UC, which includes video and audio conferencing and collaboration solutions, is one of the few technology areas well placed to grow during the recession."

The good news is that, in most businesses, the infrastructure to support audio and web conferencing is already in place. All that is required is a phone, a computer and an internet connection - facilities that already exist on almost every desk. It is true that newer telepresence systems - a new generation of video collaboration services with high-definition cameras and huge projection screens - require new investment. However, with their ability to convey physical characteristics and cues in great detail, they offer commensurately greater returns.

With this technology, it is possible to create virtual meeting environments where eye contact, body language and other conversational cues come as standard, akin to face-to-face meetings. It is possible to run the most crucial of business meetings - for example, delicate negotiations and executive recruitment interviews. And, while it will probably never completely replace face-to-face meetings, it does significantly accelerate a firm's ability to get things done.

Europe, in particular, is very much behind the curve when it comes to using existing conferencing technology to make businesses more productive, fleet of foot and efficient - exactly what is needed right now.

So what real difference does conferencing make when you adopt it whole-heartedly? To begin with, it encourages people to interact and work together to solve problems. It takes time to set up face-to-face meetings. Diaries have to be checked, rooms booked and travel plans made, and that can be a barrier to effective collaboration and teamwork. Conferencing is much simpler, and much more immediate. If it's integrated with a system that allows people to see each other's diaries - Microsoft Outlook, for example - all employees have to do is check people's availability, send an appointment and make the call. People can get together in minutes or hours, not days or weeks.

Saving time in business, product and sales operations gives sales individuals more time to sell, increasing the bottom line. Enterprises with large, dispersed field organisations, including high-tech manufacturing companies like Agilent and professional services firms like Accenture, are setting the bar for others to follow by expanding their use of web conferencing to drive online teamwork and effective delivery of sales presentations. Levels of meeting ‘attendance' are often higher when organisations hold briefings, conferences and training sessions using web conferencing and other services.

Productivity is also improved. The hours employees would have spent travelling can be put to better use. Jobs get done more quickly, which benefits both the employer and the employee. If people can meet overseas clients and suppliers without having to fly to meet them, they might be able to close four to five deals in a week, as opposed to just one. And if people aren't away from home as often, their work-life balance is improved. The need to work evenings or weekends to catch up on time lost on the road is reduced.

Essentially, introducing this kind of technology changes business processes and the very way in which people work. Many of BT's people now work in virtual teams whose members are distributed around the world. When they need to get together to discuss something, more often than not they'll use our conferencing services to do so. It's much more convenient, much more flexible and it quickly engenders both a team spirit and trust.

And the bottom line benefits? At UK supermarket Tesco, staff who use audio conferencing services save an average of £300 a meeting on travel expenses and free up 3.97 hours of ‘on the road' time to do more productive work. ‘Attending' an average of 3.55 conference calls a week, staff save a total of 584,824 hours per year by not having to travel to meetings.
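As a rough cross-check of those Tesco figures (reading 3.97 hours as the travel time freed per call, which the article leaves implicit), the per-person weekly saving and the annual total are consistent with a user base of roughly 800 staff:

```python
calls_per_week = 3.55          # average conference calls 'attended' per person
hours_per_call = 3.97          # 'on the road' hours freed per call (assumed reading)
total_hours_per_year = 584_824 # annual total quoted above

# Annual hours saved per person, then the headcount the total implies.
hours_per_person_per_year = calls_per_week * hours_per_call * 52
implied_users = total_hours_per_year / hours_per_person_per_year
print(round(implied_users))    # → 798
```

The headcount is inferred, not stated in the source, but the three published numbers fit together almost exactly under this reading.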

The returns on investment are significant and quickly feed through to the bottom line. BT asked independent researchers from the University of Bradford and SustainIT to assess the benefits it obtained from conferencing in 2006-07. Based on employee surveys, they estimated that our people used audio and video conferencing services to hold around 850,000 meetings that would otherwise have required some or all of the participants to travel to a chosen venue. What they described as "conservative calculations" suggested that, by eliminating about 2.6 million return journeys, time worth more than £100 million had been made available for more productive use. More than 73% of BT conferencing users believed they had saved at least three hours of travel time, and 46% of the trips would have been made by car. The study estimated that BT saves at least £128 million a year by using conferencing technology, and that each physical meeting replaced by videoconferencing saves £432.

When it comes to building a low carbon economy, the environmental benefits are equally significant. The researchers found that 97,000 tonnes of CO2 emissions had been avoided through reduced travel. What's more, these benefits weren't a one-off. Year in, year out, money will be saved, our people will work more productively and CO2 emissions will be reduced.

So yes - we are in a period of constraint at the moment. But that doesn't mean companies can't use this time to streamline their operations and shape themselves for the future. Sure, they may need all the help they can get. But they may also find that a downturn in business, far from being a bad time to think about change, may well be exactly the right time to transform the ways they work. Perhaps it's time for more organisations to heed conferencing's call.

Aaron McCormack is CEO at BT Conferencing

John Konczal examines how today's telecom providers must accommodate the explosive demand for new digital content through business alliances

Consumer demand for digital content and value-added services is transforming telecom service providers into the new super-enablers of the digital economy.  While revenue opportunities from these new products and services are sizable and promising, introducing them to the telecommunication service provider's current business platform is not without its operational challenges. 

If the telecom service providers are going to grasp the new revenue opportunities of the digital economy they will need to have secure, flexible business processes and systems to collaborate and connect with new value-chain partners and drive new offers, such as personalised digital content, to consumers.

To offer new products and services to consumers, a new market of converged sectors - communications, media and entertainment - has had to emerge: something that can be termed the mediacosm.

This mediacosm is now forcing telecom service providers to restructure the business models and processes they rely on to source, market, sell and deliver products and services. In fact, to compete effectively in the new market, telecom service providers' business strategies are beginning to mimic those of world-class retailers rather than those of manufacturers.

Telecom service providers have traditionally relied on manufacturing-driven business models using a "flat" business flow and revenue model. This means that the communications service provider buys equipment and services from suppliers, integrates them into the service provider's network, and bills the end-user for products and services delivered.  In such single-dimension models, there is very little, if any, development collaboration with business partners on what products, services and content are delivered to customers.

However, significant changes in service provider strategy, backed by new open network technology such as service delivery platforms, are driving telecom service providers to embrace a multi-dimensional business collaboration model - much like the one world-class retailers use - as the core enabler of the mediacosm. World-class retailers, whose end products are an amalgam of intermediate goods and end products from partners and suppliers, rely heavily on business collaboration to drive their revenue stream.

For telecom service providers to become the super enablers of the digital economy involves thinking, operating and measuring progress like a retailer. For example, an electronics superstore may collaborate with a personal computer manufacturer to develop a laptop with specific features, branding, and supply-chain integration for the store to market and sell the laptop to consumers.  In this case, the value-added product provider (the computer company) and the direct-to-consumer retailer (the electronic superstore) connect, communicate, and collaborate throughout the product introduction process in order to bring the custom laptop to market.

Realisation of the mediacosm involves having in place business collaboration platforms where third-party organisations, such as digital media companies, can plug their products and processes into the telecommunication service provider's business platform. This will enable seamless integration and the delivery of new offers to consumers over the telecommunication service provider's network. A telecommunication service provider's transformation into a super-enabler - and into one that generates increased revenue - will depend on its ability to diversify its product portfolio by building a broad and deep business collaboration network of digital content and value-added service providers.

To generate these new revenue streams and to offer new services to consumers, many telecom service providers will embrace new business relationships and partnerships with content and service vendors, partners, and suppliers in order to bring a diversified set of products and services to market.

As a by-product of this strategy, a complicated mix of inter-relationships will evolve where both simultaneous collaboration and competition between telecom service providers and vendors, content owners and application/software providers will emerge.

Expertly managing this rich and complex ecosystem of partnerships will be an essential objective for telecom service providers. Those that will be successful in capitalising on the opportunity of the mediacosm will be the ones that deploy and employ technology and business capabilities that allow a telecom service provider to seamlessly integrate with multiple enterprises and enable automated collaboration between value chain partners. This will enable them to source content and value-added service from multiple points and manage business relationships in a flexible manner.

To become super-enablers, telecom service providers like AT&T and British Telecom are focusing greater attention on multi-enterprise integration - building an extended community of business partners and suppliers to bring new products and services to market and to enhance how products and services are sold and distributed.

‘The best companies are the best collaborators. In the flat world, more and more business will be done through collaborations within and between companies, for a very simple reason: the next layers of value creation - whether in technology, marketing, biomedicine or manufacturing - are becoming so complex that no single firm or department is going to be able to master them alone.'
The World is Flat, Thomas Friedman, 2005

Thomas Friedman's observation above is an accurate description of the environment in which a successful telecom service provider operates in the age of the mediacosm. No business works as a single unit. Each is composed of partners and suppliers with which it connects, communicates and collaborates to drive positive business results for all involved.

This business collaboration produces greater efficiencies throughout the organisation and, more importantly, allows innovation to emerge to a degree that would never be possible under traditional organisational structures. It is the secure, reliable and seamless integration of people, processes and technology - and the vast potential this integration holds for a business to optimise existing resources throughout its value chain - that gives a company the power to reshape its strategy and remain competitive.

Not only has technology levelled the playing field by making the exchange of information a universal capability, but this capability empowers companies to work together in ways that were unimaginable a generation ago. Telecom service providers now rely more and more on business collaboration networks to connect, communicate and collaborate with their partners and suppliers to bring new content and value-added service to market. This means that every participant in the network - not just the organisation at the hub - reaps the benefits of orchestrated business collaboration that allows them access to expertise and information from beyond their own corporate walls. It also enables innovation that drives and optimises customer experience and ultimately, revenue growth.

In summary, a proper multi-enterprise integration solution should allow service providers to react to market dynamics quickly and with very little effort. As a gateway, it must be able to talk to any system a provider has in place today and any communication method a provider's business partners might support. As a process enabler, it should allow a service provider to quickly assemble offers to meet the needs of fickle consumers. As a visibility and collaboration tool, it should enable more efficient ways of giving a service provider and the provider's partners better insight into their business operations. As a means to governance, it should track and record every transaction end to end, at any granularity needed to support the business. And, finally, as a security tool, it should protect against fraud, theft, revenue leakages, and liability.

"Multi-enterprise integration isn't the goal, but it supports the goal... Automating such business activities helps drive bottom-line revenue via reduced errors, reduced cost of operations and faster process execution... It also drives top-line revenue via lowering barriers to automation, improving customer and external business partner satisfaction, and increased 'stickiness' once automated processes and data exchanges are implemented."
Gartner, Inc., "Key Issues for Multienterprise B2B Integration," February 2009.

The emergence of the mediacosm is transforming how telecom service providers are structured and how they operate. In the near future, most telecom service providers are likely to be more virtual than physical. Companies may be composed of alliances among many different providers that come together to offer products and services, rather than doing most things in-house.  As the pace of change continues to accelerate, one thing is certain for business in the 21st century: successful telecom service providers will be those that have productive business collaboration, inside and outside their enterprise, to deliver the right products and services to their customers the way they want to receive them.
John Konczal is Global Industry Executive, Communications & Media, Sterling Commerce

Empirix/Manx Telecom
Manx Telecom, part of the Telefónica S.A. group and the Isle of Man's largest telecommunications and Internet provider, has deployed Empirix Hammer XMS to provide service quality assurance for its new IMS network. Over the next 18 months Manx Telecom will transition from its existing PSTN and ISDN infrastructure to an IMS network. During that timeframe it will use Empirix's Hammer XMS system to provide ongoing service monitoring throughout the network.

"We realized early on that we would need an independent monitoring system to manage the complexities that IMS introduces into our network," says Jon Huyton, technical officer, who is leading Manx Telecom's migration to IMS. "We were impressed with Hammer XMS because it addressed all of our needs in terms of functionality, working virtually 'out of the box', and ease of use. Most importantly, Hammer XMS provided detailed, end-to-end reports on calls as they traversed from the PSTN on to the IMS network, which was a task we were expecting to have to compile manually."

Manx Telecom selected Empirix's Hammer XMS following extensive system tests on its live IMS network. The deciding factors included Empirix's ability to deliver a comprehensive view of both TDM and IMS network operations, real-time monitoring of call flows, as well as fast set-up and flexibility to customize monitoring rules and reports.

Hammer XMS enables network operations and quality engineers to drill down from high level views of the network to granular details of individual call paths. In addition, Hammer XMS creates snapshots of normal network activity that quality engineers can reference when errors occur. These capabilities will enable Manx Telecom to identify and rectify errors, before they cause customer issues.  These monitoring capabilities will also help Manx Telecom ensure that it continues to meet quality targets set out in Service Level Agreements (SLAs), which are very important as a large proportion of the operator's revenue comes from businesses, including many financial services companies.

Andy Belcher, Empirix's managing director of Europe, the Middle East and Africa comments: "Known as an industry innovator, Manx Telecom's decision to focus on service quality assurance early on in their IMS rollout is proof that leading operators realize that proactively ensuring service quality is critical to their competitive differentiation and commitment to their customers."
www.empirix.com

Transmode/3 Scandinavia
Mobile services provider 3 Scandinavia needed to build its network capacity to ensure no bottlenecks were created as applications become more data-hungry and more widely used. The company manages its own wireless backhaul infrastructure across Denmark and Sweden, comprising several interconnected optical rings.

Mobile data traffic is undoubtedly growing phenomenally but the older SDH/ATM networks have inherent technical limitations that prevent cost-efficient capacity increases. IP/Ethernet is often the best solution here, as it provides high-capacity upgrades at a relatively low cost.

After evaluating various options, 3 Scandinavia decided to deploy Transmode's iWDM solution. Over the past year, Transmode's Multiservice Muxponder (MS-MXP) has been deployed across its Danish and Swedish networks. The MS-MXP has allowed 3 Scandinavia to deploy multiple services over a single 4 Gbit/s CWDM or DWDM wavelength.

Transmode's Multi-Service Backhaul Solution, including its iWDM capabilities offers transparent synchronization support for native TDM and Ethernet traffic. And both the native TDM and Ethernet traffic can be transported using only one single wavelength.

Håkan Snis, 3 Scandinavia's Transmission Engineering Manager, explained the significance of the company's improved mobile backhaul capabilities.

"Our decision to deploy Transmode equipment was not only based on cost, although there were obvious cost benefits. 3 Scandinavia had previously enjoyed several years' successful co-operation with Transmode in the previous phases of our network development and we were very pleased with the reliability of these systems."

"Since January 2008, which marked the start of the latest phases of 3 Scandinavia's development program, our requirements have been for significantly increased capacity as well as the technical means to cater for a smooth and cost-efficient migration from TDM-based transport to a future-proof IP transport model."

Sten Nordell, CTO at Transmode, comments: "Our Multi-Service Muxponder is especially designed to fit the various traffic formats available in the mobile network. Ethernet, TDM and ATM traffic can be aggregated and delivered over a single wavelength which ensures capacity can grow to match future demand without changing the optical infrastructure."
www.transmode.com

The whole new telecoms regulatory framework is now in the doldrums following a disagreement between the Council of Ministers and the EU Parliament. You may be forgiven for thinking that a major disagreement regarding key aspects of the package must be at the origin of the current acrimonious standoff between such venerable European institutions... but you would be wrong. The quarrel that is threatening to derail more than eighteen months of tough negotiations is about a recently introduced anti-piracy provision allowing the disconnection of internet offenders "without prior ruling by the judicial authorities". This seemingly innocuous provision, introduced by the French Government at the last minute and subsequently accepted by the Council of Ministers but rejected by the Parliament, is the origin of the stalemate.

With the recent European elections, and the failed negotiations before the summer, it is unclear whether any agreement on this issue will be reached in the near future. This situation is particularly awkward for the French Government given that its very own Conseil Constitutionnel, France's highest constitutional authority, ruled against the very law that the French wanted to export. The so-called "three strikes" provision, which would have allowed an independent body to disconnect the internet access of individuals after three warnings, is therefore back to the drawing board. This is likely to embolden the EU Parliament in this institutional impasse and lead it to wait for the Council of Ministers to back down on their version of the text. In the meantime all the measures in the package - including improved number portability provisions, the set-up of a new pan-European regulatory group, the introduction of functional separation as a remedy, enhanced radio spectrum management and better access to emergency services - are on hold.

While understandably irritated by these latest developments, and unable to make progress on its main text, the Commission decided to revisit another area of regulatory concern and took the unusual step of re-issuing a consultation document about its regulatory proposals for NGAs.

A wide consensus is now emerging about the need for next generation access networks to be deployed throughout Europe. Demand uncertainty (what exactly will people buy? for how much?) and technology uncertainty (will it work? what is the optimal topology?) are, however, significant enough for many operators to phase their investment carefully. A "build it and they will come" business case is unlikely to persuade the boards of the major operators to sign off on what is thought to require €300bn of investment in Europe. The regulatory uncertainty surrounding the treatment of such investments has also been blamed for the delay in rolling out fibre. The first consultation on the issue, launched at the end of last year, apparently attracted so much criticism that, rather than coming up with a final recommendation as initially expected, the Commission is re-issuing a new consultation. Hence the new draft recommendations and guidelines on NGA, intended to ensure that a consistent framework is adopted across member states, probably at the end of the year.

The core of the issue is the balance between investment incentives and the risk of (re)monopolisation of the local loop. Some fear that, if these new fibre-based networks are not regulated (e.g. if access to them is not mandated), competition will not emerge in the future. On the other hand, some argue that, if drastic wholesale access obligations are imposed, operators are not going to invest.

The Commission is clear about the need to regulate access to NGA but doesn't want to be accused of stifling investment. To achieve this dual objective the Commission is trying to promote co-investment in infrastructures (so that market participants can pool resources) as well as the roll-out of multiple-fibre lines to ensure that future competitors will be able to enter the market by using available fibres (rather than leasing an existing link as is currently the case with unbundling in copper networks). The Commission also recognises that investment risk may need to be reflected in the access price of third parties but warns that this should not be used as a way to squeeze the margins of access seekers.

Even with regulatory visibility, the business case for rolling out these new generation networks is only just emerging, even in large cities.  It is therefore likely that some form of government intervention will be required to make these networks available in more rural areas in the future.

Benoit Reillier is a Director and co-head of the European telecoms practice at global economics advisory firm LECG. The views expressed in this column are his own.  Breillier@lecg.com

Baseline connectivity VPNs are becoming commodity products.  Many carriers have leveraged their network assets and introduced service-aware VPNs.  Application assurance is the next evolutionary step, and Bob Emmerson argues that this development addresses the application requirements of enterprises for better, consistent, visible, end-to-end performance

Enterprises employ a wide range of applications and services: email, VoIP, IM, file transfer, video conferencing, business IPTV services, storage backup and recovery, etc. Each application has unique requirements for bandwidth as well as timing or delay sensitivity. VoIP can suffer from poor quality and dropped calls. Streaming video may break up or take a long time to start playing.

There is a popular perception that a high QoS equates to a high QoE (Quality of Experience) and that any quality problems can be accommodated by adding more bandwidth. That perception is wrong but it persists.

In order to optimize the QoE, which is the quality parameter that matters the most, service providers need to manage the quality of the various applications - the QoA (Quality of Application) - all the way to the end user.  This means that the optimum QoE can only come from a combination of QoS and QoA.

Many service providers have progressed from the basic VPN model, which was connectivity centric, to today's service-aware model that supports the convergence of IP voice, data and video and that has the performance and resiliency necessary to run latency-sensitive applications. 

This model has enjoyed considerable success.  But - and it is a very big but - it's a QoS centric approach. A high QoS assures the performance of the network, but it does not recognize the applications.  What's needed is a model that builds on the success of service-aware VPNs and goes on to assure the performance of the applications, ie a model that is QoA centric. Add this parameter and you can realize the requisite QoE for all application types.

Enterprises rely on their business applications for day-to-day operations, but the majority have little or no visibility into how those applications are performing over the wide area network services they employ.  It's a serious issue and its impact is growing for a number of interrelated reasons:

  • Most apps were designed for the LAN, not the WAN
  • More and more apps are being centralized at data centres
  • Real-time voice, multimedia and business-critical data applications are converging
  • Availability and performance must be optimized across multiple locations

And the issue is compounded by the fact that many if not most IT departments have limited resources.

The primary responsibility of service-aware VPNs is to ensure that the operator's network and service performance objectives are met. There is limited focus on applications: it is assumed the application performance is acceptable if the service performance objectives are met.  But QoS does not equate to QoE. 

This model assigns different classes of service (CoS). The CoS defines the service pipe into which applications will be classified by trusted CPE devices.  Thus, the CoS determines the priority rating of the applications. However, this does not help address the enterprise's key performance issue, which is to have per-application visibility and control, without having to implement a costly CPE appliance.

Application-aware VPN services offer a new way for service providers to approach their customers. Application awareness goes to the heart of what matters most to an enterprise: the predictable performance of its voice and data applications.  That's a deliverable that will allow service providers to become a strategic partner ahead of the time when commoditization of the QoS model starts and prices erode. 

SPs that make the transition from service awareness to application assurance will be able to deliver predictable performance and generate additional revenue streams.  Their offer will be distinctively different and market research indicates that it will be welcomed.

An Ovum study conducted in Europe and the US indicated that 30% of the enterprises would pay extra for an improved QoS that guaranteed the performance of mission-critical applications.  And 20% said they would be prepared to pay for consultancy services that helped them with application performance monitoring and reporting.  A similar study conducted by IDC showed that 51% of 368 enterprises would use a managed WAN optimization service from an operator.

An application-assured VPN ensures per-application performance objectives are met through application recognition and optimization. This is enabled through a network-based approach that provides per-application classification and end-to-end assurance from both trusted and untrusted CPE devices.  So, how is it done? 

Alcatel-Lucent, for example, has designed a solution that enables service providers to deliver application assurance as well as performance monitoring and reporting.  And while the functionality is very advanced, implementing the solution is relatively simple. 

In a nutshell, the operator simply hot-inserts a hardware module into the chassis of an existing 7450 ESS (Ethernet Service Switch) or 7750 SR (Service Router).  This can be done without disrupting the services that are running at that time.

The module is known as the Application Assurance Integrated Services Adapter (AA-ISA). The baseline function is to identify the various applications in order to enable dynamic per-service, per-site and per-application QoS policy control. The term per-application QoS equates to QoA, which was introduced at the beginning of this article.

The applications that enterprises run over their wide area network are numerous and varied.  When traffic is directed to the AA-ISA, traffic flows are identified and subjected to Application QoS Policy rules that determine the requisite treatment.
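
In outline, that classification step behaves like an ordered rule table matched against identified flows. The sketch below is purely illustrative - the rule fields, application names and treatments are assumptions for the sake of the example, not Alcatel-Lucent's actual policy model:

```python
# Illustrative sketch of per-application QoS policy matching, loosely
# modelled on the classification step described above. All rule fields,
# application names and treatments here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Flow:
    application: str   # e.g. as identified by deep packet analysis
    site: str

# Ordered policy rules: first match wins; "*" acts as a wildcard.
POLICY_RULES = [
    ({"application": "voip", "site": "*"},   {"priority": 1, "action": "expedite"}),
    ({"application": "video", "site": "hq"}, {"priority": 2, "action": "guarantee"}),
    ({"application": "*", "site": "*"},      {"priority": 4, "action": "best-effort"}),
]

def classify(flow: Flow) -> dict:
    """Return the treatment attached to the first rule matching this flow."""
    for match, treatment in POLICY_RULES:
        if all(v in ("*", getattr(flow, k)) for k, v in match.items()):
            return treatment
    raise LookupError("no matching rule")  # unreachable with a wildcard default

print(classify(Flow("voip", "branch-3")))  # {'priority': 1, 'action': 'expedite'}
```

First-match semantics keep the table predictable: specific rules sit above the wildcard default, so an unrecognised application still receives a defined (best-effort) treatment.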

Each AA-ISA module provides up to 10 Gbps of deep packet analysis, a figure that enables up to 3M traffic flows. Up to seven active AA-ISA modules can be deployed per chassis. In this case the analysis capacity can scale up to 70 Gbps: this figure enables up to 21M simultaneous flows.  Application assurance solutions that are CPE-based only scale to 1 Gbps.
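
The chassis-level figures quoted here follow directly from the per-module numbers. As a quick sanity check, using a hypothetical helper with the capacity constants taken from the text:

```python
# Sanity check of the AA-ISA scaling figures quoted in the article.
# Per-module capacities are taken from the text; the helper is illustrative.
GBPS_PER_MODULE = 10          # deep packet analysis throughput per AA-ISA
FLOWS_PER_MODULE = 3_000_000  # simultaneous traffic flows per AA-ISA
MAX_MODULES = 7               # active AA-ISA modules per chassis

def chassis_capacity(modules: int) -> tuple:
    """Return (aggregate Gbps, aggregate simultaneous flows) for a chassis."""
    if not 1 <= modules <= MAX_MODULES:
        raise ValueError(f"1 to {MAX_MODULES} modules supported per chassis")
    return modules * GBPS_PER_MODULE, modules * FLOWS_PER_MODULE

print(chassis_capacity(7))  # (70, 21000000) - matches the figures in the text
```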

Application assurance enables per-application refinements that can either optimize the performance over the WAN or prioritize one application over another.  In Alcatel-Lucent parlance it enables an application-level QoS, which is arguably a more meaningful term than my QoA.  

In addition, the AA-ISA provides the data that enables visibility of applications and their performance behavior over the VPN. This data is subsequently processed by the reporting and analysis manager that provides application identification plus application monitoring and reporting. 

Alcatel-Lucent's solution enables operators to provide enterprise customers with a Web-based service portal that is used to monitor applications on a per-VPN or per-site basis. IT management can view near-real-time reports as well as archived reports and also request or change application treatment.

These reports are critically important for enterprises as they are faced with operational challenges due to limited resources as well as increasing cost constraints. Without an application reporting capability, they are effectively running blind.

Scalable application assurance has the look of a compelling business proposition, one that meets the market need for enhanced application performance over the VPN.  The concept is also a logical evolutionary step for service providers.  An application-assured VPN is a differentiated service offer that heads off commoditization and, by delivering predictable performance that can be monitored, turns the business relationship into that of an ICT partner rather than a connectivity supplier.

Bob Emmerson is a freelance writer who lives in The Netherlands. He can be contacted via: bob.emmerson@melisgs.nl
www.electric-words.org

Unlocking the service layer will encourage a new market of innovation and competition, both from application developers and the operators, says Jonathan Bell.  But how can it be done?

Superficially, today's fixed and mobile telephone networks are not too different from those of thirty or more years ago.  You dial a number - it could be a special short number, or an 800 number, the principle is the same - signalling takes place to connect your call to the other party.  Sure, the numbers you dial look different now.  And they've added some nice features that make things more convenient - like a built-in answer-phone service and caller identification so you can filter out calls, find out who called and when etc.  For business users, there's even a little more like calling circles, hunt groups and multi-party calling for example.

We sense as service users that mobile telecoms provide a more personal service with greater utility.  Call the number and you connect to the person, not the location.  Wherever they are, whatever the time.  People of all ages have adopted text messaging enthusiastically as an additional, highly valued communication option.  Increasingly, mobile email and high-speed data are also becoming more commonly used, blurring the boundary between person-to-person telecommunications and "the web".  People are nomadic, and now that connectivity and devices are mobile too, it is clear that extra attributes such as location and presence can be utilised to create services that are more "user aware" and therefore more useful to the user.

To adapt the rigid A to B model that we started with, telecommunications engineers adopted a layered model and ensured that the signalling aspects of a service were separated from the actual communication channel.  It's called Signalling System 7 (SS7 for short), and the same principles have been re-used in SIP - the IP-based equivalent - that will, in all likelihood, eventually replace SS7.  The core network provides the signalling, switching and channels to deliver the service.  The layer above is the Service Layer.  It is in this layer where intelligence is added to the signalling and basic switching function. 

The Service Layer is implemented by one or more Service Control Points (SCP).  SCPs are also commonly referred to as an Intelligent Network Platform or "IN" platform.  When the network switch receives signalling for any kind of telephone number other than a standard, geographic number (which it will route directly), it passes control of the call to a designated SCP.  The SCP figures out what to do.  This might be to perform a number translation, check and authorise the call depending upon a prepaid tariff and balance, try several numbers in sequence or parallel, or look-up additional information such as subscriber ID, location, personal calling rules etc.  Throughout this process, the SCP is in charge and controls the call.
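
Schematically, the SCP's role can be pictured as a dispatch on the dialled number plus subscriber state. The sketch below is a loose illustration only - the number prefixes, translation table and actions are invented for the example, not taken from any real IN call model:

```python
# Illustrative sketch of SCP-style call control. The switch passes control
# to this logic, which applies service intelligence before routing. All
# number ranges, table entries and actions are invented examples.
FREEPHONE_MAP = {"800123456": "+442079460123"}  # hypothetical translation table

def handle_call(dialled, prepaid_balance=None):
    """Return a routing decision for a non-geographic dialled number."""
    if prepaid_balance is not None and prepaid_balance <= 0.0:
        return "reject:insufficient-balance"   # authorise against tariff first
    if dialled.startswith("800"):
        # Number translation: map the logical 800 number to a real line.
        target = FREEPHONE_MAP.get(dialled)
        return f"route:{target}" if target else "reject:unallocated"
    if dialled.startswith("*"):
        return "service:personal-rules"        # e.g. find-me/follow-me lookup
    return f"route:{dialled}"                  # default: route as dialled

print(handle_call("800123456"))  # route:+442079460123
```

Throughout, the SCP stays in charge of the call, exactly as described above; the switch merely executes whichever routing decision comes back.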

So far, so good.  So what's the problem?  Essentially, it is that SCPs are only available as a complete, vertically integrated hardware and software system.  In other words, a ‘box' that you connect into the network to perform predefined functions.  They were designed in the 1990s or earlier to provide the limited range of standard telephony services at that time and to comply with the ITU and 3GPP standards.  As the SCP is controlling phone calls, it is engineered to meet the exacting requirements for network equipment (NE) - "five nines" availability:  resilience to failure, upgrade with no down-time, hot swapping of components etc.  The deployment requirements, the restricted ambition in terms of the range of services when the SCPs were designed, the tight vertical integration and severe structural rigidities mean that the end-user services are essentially pre-baked into the SCP.  Adaptation of services has to be done by the SCP provider and is extremely expensive, often with very long lead times.  It means that the Communication Service Provider (CSP) can only sell and market a limited range of standard, utilitarian services.  They cannot experiment or innovate.  They have only one supplier for any changes that they require, the business case for which often fails due to the high costs of SCP adaptation.  Uniquely in a highly competitive marketplace, CSPs are handicapped in their ability to compete by differentiating their offer in terms of its capabilities.

Traditional SCPs are characterised by high prices, inflexibilities, single source for changes, slow evolution and enhancement.  As a CSP, once you have procured your SCP, you are a hostage to fortune.  Well, at least everyone is in the same boat.  But meanwhile voice minute price-points are in decline and all CSPs are under tremendous price pressure.  And the insurgent VoIP-based, price-focussed competitors are chasing hard. 

It used to be like this in Enterprise IT.  Enterprises bought a complete, vertically integrated stack of hardware, system software and applications from a single vendor.  This has all changed now.  There are commodity hardware providers, system software providers and application software providers.  There is competition within each layer.  The competition has driven the price-points down, platform performance and flexibility up and application innovation up.  There is something intrinsic at the heart of all this:  open systems and architectures promote competition and result in lower prices and innovation.  It's the basis of all free markets.

Until now, this option simply hasn't been available to the SCP.

What is needed is a solution that enables service agility in the telecoms network through an open, flexible platform that utilises commodity server hardware... in other words, a modern "IT" system designed explicitly for the telecoms network that unlocks the value of the telecoms service layer.

With this approach, new services can be created and delivered at a fraction of the cost and timescales normally associated with the telecoms industry.  In turn, this provides operators with the advantage needed to compete with the plethora of new device oriented applications currently hitting the market.

Opening up their SCP platforms to the burgeoning Java developer community will see increased levels of innovation in person-to-person communications - which account for 95% of all operator revenues today - sustaining operators' market lead in these services.

Jonathan Bell is VP of Product Marketing at OpenCloud
www.opencloud.com

Jean-Noël de Galzain explains how developer communities, business leaders and  policy makers will attend Open World Forum 2009 to ensure open software plays an active role in the digital recovery

Free Libre Open Source Software (FLOSS) has been around for a number of years, but only recently has it become a mainstream alternative to licensed software. The raison d'être of FLOSS is to give developers a "community" style approach to the design, development and distribution of software, whereby everyone has free access to a program's source code. This approach to sharing software gives developers greater flexibility than more popular licensed programs, which restrict use and keep users busy trying to avoid any violation of intellectual property rights.

The people behind the FLOSS movement are helping to make things happen for open source. FLOSS is an ecosystem driven by software editors along with enterprises, service providers, distributors and integrators. It forges breakthroughs in networks, embedded systems, Web 2.0 technologies, Web services, application development, critical information systems and security. FLOSS supports the development of cloud computing, SaaS and other emerging technologies to help revolutionise the world of IT.

The benefits of FLOSS are hard to ignore, and more and more firms across Europe are embracing the concept - especially now, when companies are tightening budgets and trying to cut costs without impacting services.  According to a study by UNU-MERIT, Open Source will represent 30% of the market for software and services by 2012.

Open source is revolutionising information technology and is no longer limited to basic software such as Linux or Apache. Fledgling open source firms are finding opportunities in various business applications, including databases, customer relationship management and business intelligence.

Without a doubt, open source creates tremendous opportunities and challenges for those in the space but what will be the impact on innovation, governance, public policy, and ultimately, IT careers? How will FLOSS really revolutionise the local landscape of information technology and what technological advances in commercial and Free Software can be expected for the future?

The Open World Forum in Paris on October 1-2, 2009, will play host to thousands of participants from 20 countries, to share their thoughts, foster innovation and competitiveness, and address these questions. The event will explore innovations and future trends in FLOSS, and how the world can help build the future of FLOSS and make it a crucial component of the digital recovery.

The second annual Open World Forum is targeting open source developers to ensure the event's success in Europe. The event is open to the community at large and not targeted specifically at a specialised sector or the research community. It is aimed at all stakeholders: communities, IT managers, architects, developers, researchers, academics, industrialists, investors, etc.

The event will be held at Eurosites George V in Paris and the agenda will include global IT players (Alcatel-Lucent, Atos, Bull, Cap Gemini, Google, IBM, HP, Siemens, Sun, Thales), as well as the main communities (ASF, Eclipse, Linux Foundation, Qualipso, Mozilla, OW2, OSA Europe), large research and competitiveness clusters (System@tic Paris Region, Cap Digital, INRIA, Fraunhofer FOKUS, UNU-MERIT) and a wide network of SMEs from around the world.

The Open World Forum addresses issues raised by policy makers, professionals, users and contributors, customers, as well as the undecided, on the new information technology and communication strategies available using Open Source. The agenda will include seminars and plenary sessions involving well-known personalities in the industry.

The plenary sessions will explore how Open Source can contribute to renewed economic growth, as a major driver for competitiveness and innovation as well as a force for social improvement and national sovereignty.

Delegates are invited to attend the plenary sessions on key economic and societal issues such as employment, innovation, cloud computing, IT governance, the role of the communities, and FLOSS as a source of social and economic leverage in public policies.

The event will end with the delivery of the roadmap by the president of the Program committee, Jean-Pierre Laisné, CEO of OW2 & Chief Open Source strategist at Bull, and the directors of the 7 major themes addressed during the Open World Forum, namely public policy, innovation, ecosystems, technological revolutions and economic governance, careers and BRIC.  I hope that you can join us as well.

Jean-Noël de Galzain, CEO Wallix
www.openworldforum.org

Matt Bancroft argues that an open wireless marketplace drives huge benefits for all participants. First among these is that the marketplace as a whole becomes larger; the availability of a broader and richer set of devices and services drives more mobile activity from more types of mobile users. Not only do service providers benefit from this overall market growth, but a host of opportunities are created for partners and third-party vendors as well. End users benefit from a broader set of compelling, competitively priced services

Discussions about openness in the mobile arena have, for the most part, centred on the operators. How open are the major mobile carriers like Vodafone, Orange, T-Mobile and others today? When will they become more open? How open will they get?

There are three other key elements to openness in the mobile marketplace:

  • Open devices - devices that can be obtained through multiple channels, will run on multiple networks, and can be personalised with services, features and applications that are added after the device is in use.
  • Open services/applications - services, features and applications that can be easily obtained and enabled after the device is in use, and that can come from either the primary service provider or from third-party providers (eg off-deck/off-portal services).
  • A wide set of more "open" contractual relationships - options to enter into longer- or shorter-term relationships with service providers, depending on the needs of an increasingly varied customer base.

Open devices

On the device side, mobile devices can have open or closed operating systems. They can be locked to one network operator or be usable on multiple networks, and may or may not allow access to additional services, features and applications once in the hands of users.

The iPhone, for example, could be considered a completely closed mobile environment. The device is normally only available through one operator in each market and comes with a service contract lasting at least 18 months.  When the iPhone first launched, the only services and applications you could use on it came with the device; you could have any flavour of iPhone you wanted, as long as it was vanilla. However, with the successful launch of the App Store, services are now available from third parties.  The phone and service plans for the iPhone are still mainly only available through one operator in each market and usually require a long-term contract. But even though new applications are only available via one source, the App Store, they do now come from a wider range of third-party application developers, who are free to determine the features and prices of their offerings.

Google's Android and the Open Handset Alliance (OHA) project comprise an initiative that is meant to specifically address the openness of mobile devices. The goal of this initiative is to develop a completely open mobile operating platform on which vendors will be free to develop applications and services that will work on all of the devices employing the Android platform. Leading this charge on the operator front was T-Mobile, which was first to field a mobile phone using the Android platform at the end of last year. Many other operators - like Vodafone - are launching Android-based phones as well.

Interestingly, the vendors involved in the Android and OHA initiatives are touting Android devices as "iPhones for the masses". Because the platform will be on a number of different devices from different handset vendors, the vendors claim that it (and the services and applications that run on it) will be able to capture a much larger market share than Apple could possibly do with a single (relatively expensive) device.

Open services/applications

In terms of mobile services and applications, most mobile devices are sold in Europe with a limited set of services. These are preloaded on the device. A more open system would allow more - and more diverse - services, which could be tailored for mobile devices and accessed directly by the subscribers. Some services would be delivered directly by the operator, while others would be supplied via a range of models, from partnership, co-branding and joint delivery with the operators to wholesale models where the operator facilitates access for the mobile subscribers. This more open model would broaden the available services and drive up overall mobile activity as a consequence.

Open contractual relationships

In Europe there is a mix of customer types. Many are contract customers in longer-term relationships where they are locked into one- or two-year contracts with the service provider. By contrast, pre-paid customers have shorter-term relationships, literally buying "airtime" periodically to support their mobile needs.

In the contract model there are conditions to lock in the customer, including financial penalties for breaking these contracts. Customers are unlikely to switch operators in the middle of a contract, regardless of their support or service experience. An open marketplace has a broad mix of relationship types to meet the differing needs of an increasingly diverse set of enterprise and consumer subscribers.

Macro trends driving toward openness

Network operators will execute a number of strategies that will continue to open the European wireless marketplace further. The fact is that the mobile market is becoming saturated - many European markets are running at well over 100% penetration - putting enormous pressure on the operator to look at new strategies to keep stimulating revenue growth. These strategies include: driving increased usage, particularly advanced data usage; going after new customer segments, including lower-ARPU customers and customers from competitors (who may want to bring their phones with them); and finding new ways of managing acquisition costs and a changing blend of customers.

These strategies will lead toward more open business models. Operators are already broadening the set of services that can be delivered either through them or with partners. They will also promote and differentiate new contracts that are tailored to meet the needs of different segments of the customer base. These will necessarily include a mix of subsidised and unsubsidised devices, with explicit policies for devices sourced independently from the operator.

As next-generation networks such as LTE and WiMAX roll out, we expect to see fundamentally open access models where there is a wider set of devices using the network - not just mobile phones but also computers and netbooks, mobile Internet devices, and other consumer electronic devices. Some service providers will expand the pay-as-you-go category to include the more nomadic type of user who may want access to the network for a specific period of time.

Mobile customers are demanding more flexibility. Enterprises want to be able to directly control and manage the devices used by their employees - and the applications on these devices - regardless of where they come from or what network they run on. And consumers want an "Internet-like" experience on their mobile devices wherever they are. As mobile devices become less like phones and more like portable computers, it is difficult for many users to understand why they should not be able to use their mobile devices on whatever network is available to them. Nor do they understand why they should not be able to access the best and most relevant applications and services out there, wherever they are or whomever they come from.

Openness - opportunity or challenge?

We believe that openness will support a larger wireless marketplace - period. And there is an inevitable trend towards more openness in the European mobile market. This openness is coming in a variety of flavours - more open network access, more open device platforms, more open service provisioning and more options for contractual relationships with service providers. Forward-looking operators will manage this change and find ways to capitalise on more open commercial models that can deliver higher revenues and margins. Delivering devices, services and experiences that are tailored to subscriber needs will naturally lead to improvements in ARPU, customer satisfaction and ultimately, customer retention.

Matt Bancroft is Vice President, Mformation
www.mformation.com

Mobile Marketing

The number of European users of mobile location-based services will grow from 20 million users in 2008 at a compound annual growth rate of nearly 37% to reach 130 million users in 2014, according to a new report from Berg Insight. Local search, navigation services and social networking are believed to become the top applications in terms of number of users.

According to senior analyst André Malm: "The key enablers for LBS are rapidly falling into place. On-device application stores allow easier access to mobile services for a broader audience at the same time as flat-rate data plans make pricing more transparent. In conjunction with more operators opening their location platforms to third parties, location aggregators have started to provide common APIs for accessing location data from multiple operators. This together with ever growing GPS handset sales will allow more application developers to create location-enabled mobile applications."
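As a quick sanity check on the forecast arithmetic - an illustrative sketch of our own, not taken from the Berg Insight report itself - growing from 20 million users in 2008 to 130 million in 2014 does indeed imply a compound annual growth rate of close to 37%:

```python
# Illustrative check of the implied compound annual growth rate (CAGR);
# the figures are from the article, the calculation is our own.
start_users = 20_000_000   # European LBS users in 2008
end_users = 130_000_000    # forecast users in 2014
years = 2014 - 2008        # six-year span

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_users / start_users) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # close to the report's "nearly 37%"
```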

Berg Insight estimates that more than 20% of mobile handsets shipped in 2009 will feature GPS and that the installed base in Europe will surpass 50% of total handsets in 2013.

Many of the most popular services will primarily rely on advertising revenues. However, revenues may not grow at the same rate as usage because the mobile marketing and advertising ecosystem is highly fragmented and immature. It will take several years before a successful model has been established that can reach out to a critical mass of active users. At the same time, service categories such as navigation and tracking can be expected to remain largely premium services in the future.

"Besides monthly subscriptions and per-use fees, service providers will increasingly offer one-time fees, service bundles or device bundles to match consumer expectations," says Malm.
www.berginsight.com

 

Telecom execs online

Telecom executives are spending an extra 11 hours a month online sharing their professional experiences and learning from their peers, according to a study run by online business networking service www.MeettheBoss.com.

MeettheBoss surveyed 15,000 of its 200,000 executive members to understand more about what effect Web 2.0 has had on the business world.

The survey asked its executive users what direct value they gained from spending time running blogs, writing tweets and connecting with other executives on sites such as Linked-In, Xing, MeettheBoss, Ryze, Facebook and Twitter.

The answers revealed that the telecom executives proficient with online activities are spending an average of 11 hours more per month online than this time last year - and over 90% of respondents said they felt their time online was ‘very valuable' to their daily role. Professionals are going to the web for immediate answers to their most pressing questions.

"I find online business hugely beneficial," says MeettheBoss member Jacob Lee, Project Director at Vodafone. "It allows me to create or join active discussions and get immediate answers and experiences from colleagues and like-minded professionals on a range of issues."

Finding answers for your business is one thing, but most executives have spotted more ‘individual' advantages to this new phenomenon.

"Everyone knows that building contacts in the industry can strengthen your career prospects, but sharing your knowledge with the world is a very effective personal branding exercise," says Adam Burns, Editor-in-Chief of online business channel, MeettheBoss.TV.

Many thought leaders agree with him. Ed Candy, Group CTO for Hutchison 3G, and Dr Felix Wunderer, SVP Fixed Mobile Convergence for Deutsche Telekom, have both shared their expertise on the site.

Though most of these executives agree there is no substitute for face-to-face networking, using social media tools correctly can bring substantial return on your time investment. You can become a thought leader in your niche environment overnight and even pull business to your door. The challenge is finding the best one for you.
www.MeettheBoss.com

 

Transforming business

The latest in a series of books describing how to adopt and apply the TM Forum's Solution Frameworks as part of cost-base transformation for communications service providers is now available.

Business Transformation with TM Forum Solution Frameworks and SOA is co-authored by Iwan Gramatikoff and Serge Garcia of Edelweiss Consulting and John Wilmes, Chief Technical Architect, Communications Sector, Progress Software and co-chair of the TM Forum's Information Framework Collaboration Program.  The authors assert that to move towards the "next generation" business model supported by the TM Forum Solution Frameworks, enterprises must first undergo a business transformation. With that transformation they will acquire the ability to use business processes more effectively to attain strategic goals.

Business transformation requires more than just introducing a new IT and process architecture. To move forward, service providers need to transform their organizations from technology-driven to market-driven. The purpose of the new book is to guide and assist service provider transformation towards this next generation world.

Recognizing and overcoming the challenges involved can result in:

  • Effective business processes to help reach strategic enterprise goals
  • Agile resource management that can evolve as needed
  • Efficient resource allocation to meet expected results

Service-Oriented Architectures (SOA) and TM Forum Solution Frameworks offer complementary approaches for managing complexity and ensuring success. This book is intended to enable organizations to find the best path to follow for their own successful business transformation.
www.tmforum.org

 

LTE growth

The number of LTE next generation networks is set to grow significantly, with the number of subscribers exceeding 100 million by 2014, according to a new report from Juniper Research.

The LTE report found that these market numbers will be buoyed by the embedding of broadband capabilities within consumer electronics devices such as MP3 players, netbooks and digital cameras.

Juniper Research forecasts that whilst subscribers will largely access LTE via handsets such as smartphones and laptops, consumers will also be motivated to connect devices in the home via LTE.

Report author Howard Wilcox comments: "There is intense activity in the LTE market right now, with in excess of 30 network operator commitments. Operators and vendors alike are moving rapidly to jump on the road to LTE, attracted by the connectivity-based opportunities that the technology offers. Sony, for example, announced that network connectivity is one of three top priority actions."

However, the report determined that there are still several open issues that need addressing before the market takes off. One of these is the issue of device convergence: what will a smart phone look like, and be capable of, in three years' time?

Further report findings include:

  • There will be multiple millions of LTE subscribers as early as 2011.
  • Embedded LTE chipsets will become the second most popular means of access behind SIM cards by 2014.
  • Whilst early LTE adopters will be enterprise subscribers, consumers will begin to take up LTE based services towards 2012/2013.

The report provides a balanced assessment of the opportunity represented by the rapidly developing and very topical LTE mobile broadband technology. It includes a comprehensive six-year forecasting suite of critical figures, data and analysis on enterprise and consumer subscriber take-up, devices, network access via dongles, cards and embedded capability, chipset shipments, ARPUs and service revenues. The report also evaluates the key mobile operators and vendors pioneering in this developing market.

LTE Mobile Broadband Strategies: Consumer & Enterprise Markets; Devices & Chipsets 2009-2014 can be freely downloaded from the Juniper Research website.
www.juniperresearch.com  

 

For several years now, the mobile TV industry has been searching for the elusive business model that will drive mass adoption. Analysts continue to paint a rosy picture of mobile TV in the future, with Juniper Research predicting that the global user base for mobile broadcast TV services is likely to exceed 330 million by the end of 2013. But despite the optimistic outlook, Diana Jovin notes that it remains unclear how we will reach these future heights when compared to where we are today

In most countries today mobile TV in its current form represents a complex interaction between content owners, network operators and standards organisations. Broadcasters own the relationships with the content owners, mobile operators own the consumer relationships, spectrum and network infrastructure, and there are a myriad of competing standards being pushed by a variety of commercial organisations and regulators. Amidst the discussions of content rights, standards, spectrum and infrastructure - that is how to deliver mobile TV - it is little surprise that the industry has struggled with a more important question - what does the consumer want from mobile TV?

In the last year a clear answer to the consumer question has emerged, and it is one that simplifies the complicated relationship between broadcasters, content owners and operators. Consumers have indicated a strong preference for a free-to-air model, which provides them with the same terrestrial over-the-air programming that they know and watch on their TV at home, but simply received on their mobile phone rather than their living room television set. This gives the consumer content using existing broadcast infrastructure, without requiring considerable capital outlay from the operator. From a commercial perspective, it has the potential to stimulate a variety of related service and premium markets.

Following on the heels of successful free-to-air mobile TV rollouts in Japan and Korea, frequently cited as the industry's leading mobile TV success stories, more than 20 million consumers in analogue broadcast TV markets chose to purchase TV handsets that receive free-to-air terrestrial broadcast signals - all in a brief eighteen-month period between mid-2007 and the end of 2008. In response to this burgeoning consumer interest, several of the leading operators in operator-controlled markets, such as Latin America, have now included these handsets in their mobile phone portfolios. Compared with the more sluggish adoption of subscription mobile TV services in other parts of the world over the same period, this is a clear indicator that the free-to-air mobile TV model works and can appeal to both the consumer and the operator.

In Europe, the mobile TV industry has been preoccupied with decisions regarding which standard should be the dominant one to drive mobile TV adoption forward, citing fragmentation of standards as impeding progress towards viable consumer solutions. Early in 2008 Viviane Reding announced that the European Commission would be endorsing DVB-H as the preferred technology for terrestrial mobile broadcasting. Yet by late 2008 LG had announced the launch of its latest touch-screen device into the European market equipped with a DVB-T mobile TV tuner, putting it in direct conflict with the EU's decision to support DVB-H. With the industry focused on standards discussions, an important question remains in the background: what do these standards mean for the consumer? Looking more closely, mobile TV based on DVB-H or another similar mobile broadcast standard provides content that is different from content traditionally received on a TV set at home - and is usually accompanied by a recurring service fee. In contrast, DVB-T transforms a mobile phone into a portable television set, delivering the same free and familiar content that one would view on a conventional set at home.

To date, mobile TV has been trialed and deployed in various markets in several forms. In many markets, mobile TV is offered as a service, which includes a custom content package offered by an operator for a monthly service fee. These are typically pay TV services delivered via digital broadcast standards such as DVB-H or FLO, or streaming services delivered via WiFi or 3G and an accompanying data plan. The challenge with these services is that they require the consumer to become familiar with a content package that they're unaccustomed to and be willing to pay for it in the process. In contrast, free-to-air mobile TV piques consumer interest by simply place-shifting the same programming that they are used to viewing on their TV set at home. Consumers are able to transfer their TV viewing habits to their mobile phone, watching their favourite programmes at the time they expect them to be broadcast.

A question that springs to mind is whether consumers want to make their television mobile, or would they prefer to subscribe to a tailored mobile TV service? The adoption figures to date begin to point at the answer to this question. Even so, for free-to-air mobile TV to be truly successful, it must ultimately be embraced by operators in markets where they control handset portfolios and purchases.  Towards the end of 2008, the industry saw operators such as Telefónica, Telcel, Claro and others adding free-to-air TV phones to their mobile handset portfolios. So what brought about this shift in strategic thinking from operators, and how does free-to-air mobile TV deliver value?

The pay TV business model seems to be the obvious one for operators - it delivers direct monetisation of a new service that can be upsold to existing subscribers.  However, one needs to assess a variety of factors to understand the ultimate impact on an operator's bottom line. Here are some of those factors, with a comparison between a pay subscription and a free-to-air business model:

  • Cost of entry: The development of new mobile TV infrastructure is quite expensive. Given the recent economic downturn and the corresponding tightening of capital budgets, operators are giving the ROI on subscription mobile TV tight scrutiny, given the size of the required investment. In contrast, free-to-air mobile TV uses existing broadcast TV infrastructure installed by the broadcasters, enabling operators to provide the feature to consumers without having to bankroll costly spectrum acquisition, infrastructure deployment and content licensing.
  • Subscriber stability and growth: Most of the operators who have deployed free-to-air mobile TV in Latin America, for example, are leaders in their market. They view free-to-air TV handsets as a way to solidify a leadership position, attract new subscribers, reduce subscriber churn in the existing customer base, and grow service revenue over time by providing a high value service to consumers. In developing markets, where many consumers cannot afford the data plans and service fees associated with pay TV models, and where some countries prohibit the bundling of TV services with operator networks, free-to-air TV provides a vehicle for operators to increase the value they provide consumers without a significant investment.
  • Related and premium services: Operators are now recognising that free-to-air TV can be used to attract consumers and then generate revenue from related or premium services. Arima, a Taiwan original design manufacturer (ODM) of handsets, recently announced an SMS-TV design that allows viewers to send and receive text messages while viewing free-to-air TV. Also, once consumers have become accustomed to watching TV on their handset, they will be more receptive to considering complementary subscription offerings around premium programming.

When talking of mass adoption of mobile TV, it is important to think about consumer uptake globally and not just in those parts of the world where digital mobile TV trials have led to deployments. Interestingly, despite the significant focus and discussion on digital standards, by 2012 more than 85% of the world's population will still be receiving analogue broadcast TV signals. This suggests that a comprehensive global strategy must encompass both analogue and digital free-to-air mobile TV.

The second consideration is that, despite posting the fastest subscriber growth figures, the majority of people living in developing markets do not have data plans that accommodate streaming services, nor can they afford recurring monthly service fees. Being able to provide free-to-air mobile TV - whether analogue or digital - in entry-level priced handsets opens up opportunities for adoption by a large, global population. This is a particularly important consideration ahead of the World Cup in South Africa next year, which is certain to hold appeal for soccer fans around the world.

Offering TV on handsets for free at first glance seems counterintuitive as a business model. However, by providing consumers with the content they want, and without requiring costly and time-consuming investment by the operators, free-to-air mobile TV presents a compelling business case, especially as it has the potential to stimulate service revenue and reduce customer churn. Consumer adoption figures underscore the seal of approval consumers have placed on this market approach, proving that free-to-air mobile TV is a viable strategy for stimulating the global adoption needed to deliver on industry analysts' market predictions.

Diana Jovin is VP, Corporate Marketing, Telegent Systems

Despite a marked reticence among many brands to take the mobile marketing plunge,  Daryn Wober believes its potential is considerable, and take-up imminent

The Internet Advertising Bureau and PricewaterhouseCoopers recently revealed research showing that "...mobile ad spend bucked all market trends, doubling in size on a like for like basis in 2008, increasing by 99.2% year on year." That's encouraging, but at £28.6 million - a figure that excludes mobile marketing expenditure - it's still a tiny element of a total market worth around £17 billion, and you'd be forgiven for not getting too excited. But in 1998, £28.6 million was roughly what the internet advertising market, at a similar stage of development, was worth. Internet advertising grew exponentially from that point and is now worth £3.35 billion, so if mobile heads the same way, things are about to get interesting.
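To put that comparison in compound-growth terms - an illustrative back-of-envelope calculation of our own, assuming the roughly eleven years between 1998 and the 2009 figures quoted - the internet advertising market's climb from £28.6 million to £3.35 billion implies sustained annual growth of over 50%:

```python
# Illustrative back-of-envelope calculation; the market figures are from
# the article, the eleven-year span (1998 to 2009) is our assumption.
start_m = 28.6      # UK internet ad market in 1998, in £ millions
end_m = 3350.0      # UK internet ad market "now", in £ millions
years = 11          # assumed span, 1998-2009

# compound annual growth rate over the assumed span
growth = (end_m / start_m) ** (1 / years) - 1
print(f"Implied annual growth: {growth:.0%}")  # well over 50% a year
```

This is the trajectory the author suggests mobile may follow from a similar starting point.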

Given the almost universal access to mobile devices in the market it seems to make sense to target consumers on the platform, but it's clear that few brands have taken the plunge to date - many brands rate mobile as a ‘would like to have' rather than a necessity in their marketing campaigns. But for brands to market themselves effectively they need to ‘follow the consumer', targeting customers on their platform of choice, whatever that may be. Like the early days of the internet, mobile advertising can be a daunting prospect, but in reality it need be no more complex than the online platform. What's required is specialist partners that know the specific requirements and have the infrastructure to provision, deliver, manage and track campaigns. With that expertise in place, mobile is a viable and valuable channel that can be exploited alone, or as part of an integrated campaign.

It's important to look at how consumers are using their mobiles right now so that we can see how the market might develop in the future. At the moment, a significant proportion of mobile users stick with voice and text services, shunning any of the added value applications and services available. That may be because of their demographic profile, the handset they own or their pricing package. For those consumers, the marketing and advertising options are limited and well established, but fundamental changes are afoot and relate to three primary factors:

Firstly, the latest generation of mobile phones, typified by the iPhone, are now much more capable devices - they have high resolution colour screens, lots of processing power and broadband connectivity. Secondly, ‘all you can eat' data packages are now widely available and finally, we have a significant generation of consumers that have grown up with mobile and are comfortable interacting with their chosen devices. Those factors have created an entirely new universe of opportunity, comparable in many ways to the impact broadband services had on internet browsing via the PC.

Some brands are responding to the mobile opportunity, but for the most part they remain wedded to ‘lowest common denominator' formats like SMS voting and competitions utilising shortcodes. These types of campaigns can certainly produce great results, but there is a lot more that could be done and opportunities are being missed - everything from WAP portals to music and video downloads and brand-specific user generated content. What's missing for brands is the ability to target those campaigns effectively, and that's where mobile network operators become most important.

On the surface mobile ought to be an advertising nirvana. The handset is the most personal device in comparison to other platforms. Mobile operators know an awful lot about their subscribers, including where they are and who they like to call. With that information to hand it's possible to build a highly accurate profile that could be used to target marketing and advertising messages accurately and appropriately. Blyk, the so-called ‘free mobile network', has created an entire business using this approach (although their overall business model is yet to be proven), offering 16-24 year olds free calls and text messages in return for personal data. Subscribers are then targeted by brands via text and MMS with adverts and offers that are relevant to their profile. There are plenty of critics of Blyk's model, but the company has certainly proven that consumers are accepting of advertising on the platform if there is also a benefit accruing to them, in this case, free calls and texts.

Operators use third parties to enable mobile advertising across their networks but handicap the effectiveness of these campaigns by not sharing specific information on subscriber habits. This data would add considerable value from a brand and agency perspective and help build more targeted and therefore more successful campaigns. Good news for all. Advertising on Google.com owes much of its success to the harvested data on individual search preferences Google shares with advertisers. Operators need to follow suit and share information within strict legal parameters. Additionally, brands need to see that they are getting a return on their investment and in many cases the tracking and measurement systems aren't available. Those factors, combined with the technical complexity of delivering mobile campaigns have held the sector back.

At the moment brands are primarily working within their comfort zone; the established media channels. They know those channels work (even if returns are diminishing) but their knowledge of the available opportunities on mobile is limited. If they have a forward thinking agency, mobile could be brainstormed and potentially added as an additional element, but even then it is likely to be on a limited and ad-hoc basis rather than as part of an ongoing integrated campaign. The inevitable result is that brands are missing out on a key outlet for reaching consumers, especially in today's connected society where people have their mobile phone with them 24/7.

It can be very difficult to assess what is possible on mobile and we shouldn't blame brands or agencies for the current situation. The typical mobile campaign planning experience today involves speaking to a wide range of companies that can deliver only a single element of the solution. The result may be a complex supply chain that is difficult to manage and difficult to extract value from. Brands and their agencies work within a fast paced environment - they need to know what is available and what the value is and most of all they need to know that it ‘just works'.

The good news is that there are specialist managed services platforms available with the technological capabilities to address the key issues for brands, agencies, and operators. With a managed service platform in place, taking control of the planning, execution and tracking of mobile campaigns across multiple channels becomes far easier for the brand. The operator benefits from working with a trusted provider that has the required technological expertise to plug into their network effectively. Managed service platforms can host and deploy mobile adverts to specific demographics, making the most of the specialist technology and expertise.

For brands, a managed service platform means access to core campaign elements that can be switched on and off very quickly and incorporated into a wider campaign, with the confidence that what they plan will work. For example, Lynx, the male grooming brand, recently implemented a multi-platform marketing strategy to coincide with the launch of its ‘Snake Peel' product. Lynx ran a number of competitions for 16-24 year olds, giving them the chance to win various prizes. One competition offered men the opportunity to demonstrate their dating prowess by sending in videos and texts of chat-up lines, with the best one winning a trip to Miami. Alongside this competition, the brand ran a series of banner adverts on a number of operator portals, together with traditional physical promotion in the form of branded ‘man-wash' booths at summer festivals (where men could be washed by a model and then text in to enter further competitions). The campaign succeeded because the brand and its agency understood their target consumers and the ways in which they interact, and because the core campaign elements were available ‘off the shelf'. As a result, Lynx interacted with its target market effectively through the right channels, including mobile, at the right time. Good strategic planning and effective brand placement in appropriate channels always produce results.

Consumers are becoming increasingly accustomed to accessing multimedia and interactive services on their mobile phones, and with that change in behaviour comes a corresponding increase in the opportunity for brands and mobile network operators: where there are eyeballs, there is an opportunity to advertise. Despite that, the approach to exploiting the opportunity is all too often haphazard. Mobile today has much in common with online marketing ten years ago, but if the current success of the online platform is to be replicated on mobile, brands and operators need to work together with specialist managed services platforms that have the required experience and technology. These platforms allow campaigns to be planned, booked, implemented and managed across multiple networks and off-portal channels. Just as importantly, they allow for verified reports, tracking and analysis from a single trusted source. Operators must also become more active players in the value chain: if mobile advertising is to realise its potential, they can no longer remain silent. They are the crucial gatekeepers. Subscriber information must be shared in a carefully controlled and legal manner with trusted specialist players. With this information, campaigns will become better targeted, richer for the consumer and more rewarding for all parties involved.

Daryn Wober is Vice-President of Business Development at IMImobile Europe
