Features

ip.access CEO, Stephen Mallinson, discusses the impact of pico and femtocells with Priscilla Awde

Mobile operators everywhere are facing something of a conundrum: in saturated markets they must increase revenues from high-margin data services, but these are typically bandwidth-hungry applications, resulting in a network capacity crunch. Additionally, recent research shows that around 60 per cent of customers use their mobiles inside buildings, at work and at home. As people exploit the benefits of the big new touch-screen smartphones, they will expect networks to be fast enough to provide the necessary capacity reliably and everywhere. These are growing trends.

However, delivering the promise of mobile multimedia applications means delivering high-speed indoor mobile networks, which poses big questions for operators: how can they get broadband 3G networks inside to provide reliable, cost-effective in-building coverage? How can they do it fast, without significant and expensive investment in macro networks, and give customers access to the applications they want at prices they are willing to pay?

Fortunately, ip.access has the answers, since bringing high-speed wireless networks inside is its raison d'être. Building on its long experience in developing IP communications solutions, ip.access designs and manufactures picocells for business users and femtocells for the domestic market.

Picocells and femtocells plug directly into existing fixed broadband networks, be they DSL, cable, satellite or even WiMAX. Acting as mini-base stations, both can be quickly installed anywhere in buildings or outside to bring networks to where the demand is.

These plug and play units have advantages for everyone. For users, both professionals and consumers, they make the mobile phone a truly broadband device which can reliably connect to high-speed networks anywhere. For operators, pico and femtocells take traffic off the macro wireless network, add capacity and improve performance. They also give telcos the competitive advantage they need to sell into new sectors and offer a range of high-margin value added services.

For years ip.access has successfully deployed nanoGSM picocells in enterprises, business parks, skyscrapers, underground and public buildings. They are even installed on planes and ships, and in other remote locations where they are connected to broadband satellite backhaul networks. Depending on their size, picocells can support up to 100 users, and companies can dot them around the organisation to provide connections where needed.

Solving the problem for residential users, the Oyster 3G femtocell allows people to use their existing mobiles to access broadband applications at home. Supporting up to four simultaneous connections, it gives family members seamless high-speed access as they move about inside the house. ip.access expects commercial deployments of plug and play 3G femtocells to be up and running by spring 2009.

"There are two legs to our business," explains Stephen Mallinson, CEO at ip.access. "We design end-to-end pico and femtocell solutions so operators can deliver robust solid networks for business and residential users inside any building, ship or aircraft."

The difference between the two is one of size, power, capacity, functionality, price and target audience. However, both allow operators to add capacity cost-effectively, divert traffic from the macro network and thereby improve performance for all users connected to a cell site. Network black spots in cities and rural areas can be eliminated, and people previously unable to get mobile signals can be connected to high-speed networks.

"Operators can use pico and femtocells to put broadband wireless networks precisely where there is demand be that indoors or outside," explains Mallinson. "They can do this without either the expense or controversy of installing new masts and avoid adding equipment to existing base stations. The advantages extend beyond capacity issues: operators can introduce and support new, high margin services and offer home zone tariffs to drive up data usage inside and on the move.

"There are QOS advantages: although people may tolerate occasional dropped voice calls they will be less forgiving if essential business communications or video content are interrupted. These mini-base stations ensure connections are maintained as people move around inside buildings."

Plugging mini-base stations into existing broadband connections takes indoor data sessions off the macro network, raising the number of users each site can support and increasing its capacity by more than the number of users removed. Operators therefore do not have to invest either in backhaul or in increasing base station capacity. According to ip.access, instead of upgrading the macro network to meet the capacity demands of increased data usage, an operator with 10 million subscribers could save €500 million over four years by deploying fully subsidised femtocells to 20 per cent of its subscribers' homes. Similarly, research firm Analysys Mason calculates the annual cost saving per customer for a large operator deploying 3G femtocells at between $6 and $12.

Setting aside revenue advantages, increases in service and performance levels and churn reduction, the added capacity achieved by deploying femtocells more than makes the business case, even if they are fully subsidised. Even ignoring the cost savings, it takes only an €11 per month increase in ARPU across one household to cover the cost of fully subsidising a femtocell.
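
To make the arithmetic concrete, here is a back-of-envelope sketch in Python using the figures quoted above; the unit cost is purely an assumption for illustration (the Femto Forum, cited later, expects unit prices to fall below $100), not an ip.access figure.

    # Back-of-envelope femtocell business case. Subscriber base, penetration
    # and network saving are the figures quoted in the article; the unit
    # cost is an assumed value for illustration only.
    subscribers = 10_000_000       # operator size quoted above
    penetration = 0.20             # femtocells in 20 per cent of homes
    network_saving = 500_000_000   # EUR saved over four years (quoted)

    homes = subscribers * penetration
    print(f"Saving per femtocell home: EUR {network_saving / homes:.0f} over four years")
    # -> EUR 250, roughly EUR 5 per month of avoided macro-network cost

    unit_cost = 130.0              # EUR, assumed hardware-plus-logistics cost
    arpu_uplift = 11.0             # EUR/month uplift quoted in the article
    print(f"Months of ARPU uplift to recover the subsidy: {unit_cost / arpu_uplift:.1f}")
    # -> about 11.8 months under these assumptions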

Operators are seeing an explosion in mobile data usage (in the UK 3 saw a 700% increase in data traffic throughput between September 2007 and March 2008), and are looking to picocells and femtocells to solve both network capacity and indoor high-speed access problems. Demand for high bandwidth multimedia mobile applications is rising fast. In the consumer market, usage growth can be attributed to the popularity of social networking sites; uploading and sharing multimedia data; mobile advertising and the personal experience enabled by mobile TV. Following the launch of the iPhone, operators reported an immediate and continuing surge in data usage.

According to Informa, 60% of mobile data traffic will be generated at home by 2013. Ovum anticipates 17 million femtocells will be deployed throughout Western Europe by 2011, and IDC expects consumer spend on femtocell-enabled services to grow to $900 million by the same year. Other surveys suggest nearly half of smartphone data usage happens at home, and that the ‘digital generation' either watches, or wants to watch, mobile television at home.

As distinctions between professional and consumer applications and use blur, employees at all levels are taking popular mobile services into the workspace and combining them with mobile access to multimedia corporate applications. Mobiles are an essential part of corporate life: many business applications formerly limited to fixed devices have migrated onto wireless platforms. "Picocells support reliable connectivity to network services," continues Mallinson. "Enterprises can now support the flexibility and device independent access employees need, delivering reliable and consistent mobile high-speed access everywhere."

Operators are urgently addressing the capacity problems such increases in data usage imply. Some are capping monthly unlimited data plans while others encourage content developers to limit application bandwidth. Neither approach is likely to be popular with users, and both may increase churn: all of which strengthens the consumer proposition for deploying picocells and 3G femtocells.

While adding what could be millions of mini-base stations to a network, integrating them into existing infrastructure and systems, and managing them is a significant task for operators, the rewards are potentially equally significant. The cost of delivering calls drops; service levels, speed and reliability rise; and operators can introduce new, high-margin services to the handsets people already have.

They can encourage both usage and fixed mobile substitution by offering FemtoZone services, which are tied to a particular location and automatically activated when phones are within range of the femtocell. When people get home, texts could be sent automatically to absent parents to notify them that children are back; podcasts, videos or images can be loaded to phones, or targeted advertising sent to interested users.

"Femtocells are a cost effective technology and real commercial proposition for the residential market," explains Mallinson. "Most people in Europe have access to broadband networks at home and, by rolling out 3G networks, carriers are stimulating demand for mobile data. However, many users are frustrated since they cannot fully exploit the benefits of 3G phones or get the quality of service or application access they want.

"Most people use phones for data indoors where, without pico or femtocells, 3G coverage is often not reliable or signals not even available. Femtocells give consumers a better experience and faster downloads so they can really use all the features and functions 3G handsets and networks support while inside."

The Femto Forum industry body, of which ip.access is a founding board member, now includes more than 90 companies, including 36 operators covering 914 million subscribers. The Forum is encouraging the development of open standards which will lead to economies of scale - unit prices are expected to drop below $100.

There are plans to include the new Iuh interface in Release 8 of the 3GPP standard, due out in December. It will replace the numerous proprietary ways in which femtocells currently connect to networks and define how they can be integrated into core networks. By standardising communications between femtocells and core network gateways, operators will no longer be locked into proprietary interfaces or particular vendors, and so can choose customer premises equipment (CPE) separately from the gateway.

Concerns about managing the multitudes of new units within a network are also being addressed by the industry. Currently available for DSL equipment, the TR-069 standard allows operators to remotely manage devices, diagnose and solve problems and download software upgrades. The standard is being extended to support the management of femtocells.
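
As a rough illustration of what TR-069 management looks like on the wire, the Python sketch below renders the skeleton of a SetParameterValues remote procedure call; the parameter path comes from the published DSL gateway data model, but treat the whole thing as indicative rather than a working management server.

    # Skeleton of a TR-069 (CWMP) SetParameterValues RPC body. A real ACS
    # adds session headers and value types; this shows only the envelope shape.
    TEMPLATE = """<soapenv:Envelope
        xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:cwmp="urn:dslforum-org:cwmp-1-0">
      <soapenv:Body>
        <cwmp:SetParameterValues>
          <ParameterList>
            <ParameterValueStruct>
              <Name>{name}</Name>
              <Value>{value}</Value>
            </ParameterValueStruct>
          </ParameterList>
        </cwmp:SetParameterValues>
      </soapenv:Body>
    </soapenv:Envelope>"""

    def build_set_request(name: str, value: str) -> str:
        """Render a single-parameter SetParameterValues request."""
        return TEMPLATE.format(name=name, value=value)

    # Example: change how often the device reports in to the operator.
    print(build_set_request(
        "InternetGatewayDevice.ManagementServer.PeriodicInformInterval", "3600"))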

Based on open standard interfaces, the nanoGSM picocell and Oyster 3G femtocell products are total end-to-end solutions which include the requisite controllers and management systems. 

Over the five years they have been used in enterprises, the advantages of the nanoGSM are well documented. Fast and easy to install, it increases mobile voice and data usage and reduces operator costs. It has an indoor range of up to 200 metres, backhauls traffic through existing IP networks, and supports fast data rates over GPRS and EDGE to devices such as BlackBerrys. The nanoGSM picocell can be hung on a wall and, once the Ethernet connection is plugged into the box, it is up and running, providing guaranteed mobile capacity and service quality indoors.

Like its bigger cousin, but less powerful and with a smaller range, the Oyster 3G architecture creates a complete indoor broadband access network for the residential market. Using the same underlying technical platform as the Oyster 3G, ip.access is developing next generation picocells. Having solved many of the ease-of-use, price and installation challenges necessary to meet consumer needs with the 3G femtocell, ip.access believes these solutions can be incorporated into picocells. In future, the company expects to offer self-install 3G picocells both to large enterprises and to SMEs through their existing channels.

"These are very exciting times," says Mallinson. "We are building on our experience to produce next generation picocells designed for businesses of all sizes. SMEs need plug and play, easy to use, cost effective units which can be self installed and remotely managed. It makes commercial sense for companies large and small to deploy picocells. It also makes commercial sense for operators, giving them the edge over competitors and a new value proposition for smaller companies which historically have been something of a closed shop."

It's a truism that everything is going mobile, and operators are already feeling the capacity pinch. Pico and femtocells give them a cost-effective means of meeting the expected upsurge in demand and delivering the network performance capable of supporting next generation multimedia applications.

Today's smartphones are as powerful and feature-rich as the PCs of only a few years ago and look set to become the principal controller of all domestic electronic equipment. Operators are now able to deliver the ubiquitous high-speed networks consumers of all kinds expect.

Mallinson looks forward to the day when content is automatically and seamlessly transferred between devices over femtocell platforms: "Users will be able to control televisions remotely from their mobiles, and share content between phones and other devices quickly and automatically so all are updated. In the new converged IP world, audio, video, text and photographs will be seamlessly shared between devices."

Do femtocells and picocells hold the key to the lucrative SME market?  Mark Keenan takes a look

Network operators have long tried to address that potentially very lucrative but hard-to-reach customer segment: the SME (small-to-medium enterprise).  Over the years, operators have experienced differing degrees of success, but this is a market sector that has long been viewed as one of the biggest challenges the industry faces. However, an area of mobile communications that many analysts predict will soon be popular among consumers is increasingly being viewed as a key to unlocking the SME revenue stream for all kinds of operators.

The technology in question centres around indoor base stations, also referred to as femtocells and picocells, predicted by ABI Research to account for 102 million users worldwide by 2011.  In essence, these are small indoor access points - think of a slim paperback novel - that are designed to provide dedicated mobile network coverage within a limited area, such as a house or office.  Unlike larger macro cells, these units are relatively low-power devices and manufacturers are designing them to be as ‘plug and play' as broadband modems have become.  Another key difference between traditional base stations and these scaled-down versions is that they link back to the service provider via a broadband line (usually xDSL) to provide network backhaul, rather than using a leased line or microwave link.

Picocells and femtocells have much in common and employ the same base technology, but they differ in that picocells are higher capacity and provide extra features for the business market, such as the ability to support larger numbers of simultaneous users, to chain together picocells to create a network, and to integrate with existing IT environments.  Femtocells, on the other hand, are lower capacity and have less inbuilt ‘intelligence' but are cheaper and designed for the mass consumer market.

Indoor base stations address a very real market issue, namely the problem of achieving high quality indoor coverage.  Many mobile networks - particularly in busy city and town centres - are already overloaded, with too many subscribers placing demands on the network at any one time.  Furthermore, the nature of radio-based systems means that there will inevitably be weak spots in network coverage caused by a variety of obstructions, ranging from trees and hills through to buildings and walls.  Even thick modern double glazing can create a problem.

A recent research-based report from analyst firm Quocirca revealed that approximately one third of SMEs had experienced problems with indoor coverage at work, with the figure rising to 45 per cent when those same users were at home (as is often the case with SME executives).  Yet despite the fact that buildings are not ideal for mobile communications, more than half of all mobile calls are believed to be made within buildings, and our reliance on mobile devices as a business tool continues to increase.  Think of the number of people who live on their PDAs, whether at work, in a meeting or working from home.

Does it matter?  Well, as the fight to attract and retain subscribers becomes harder and harder, the emphasis on service quality increases.  Indeed, a US study carried out by Telephia indicated that over a fifth of customer churn was the result of poor network coverage.  Research firm In-Stat has stated that the biggest challenge facing mobile subscribers is the lack of indoor coverage of 3G signals, and warns operators that their success with 3G services will be limited unless they address this issue.

This is the operators' dilemma.  While they are banking on a return on investment from their 3G networks, the very nature of 3G means its signals penetrate buildings even less readily than 2G.  At the same time, the kind of services that 3G lends itself to so well - mobile data and TV - place greater demands on the network than ever before.  Yet building whole new landscapes of macro cells is not an option, on grounds of both cost and environmental restrictions. This is why so many players in the industry - not just analysts, but vendors and operators - believe that indoor base stations are the solution for overcoming the network traffic logjam.  Furthermore, they could help enable new operators to enter the mobile market.

It would be wrong to think of indoor base stations in terms of 3G alone.  For some time now, a couple of vendors (including RadioFrame) have been deploying 2G units to network operators in Europe.  In RadioFrame's case, this includes providing business customers of Orange with picocells that enable the operator to improve service quality where needed.

Quocirca's research underlined the fact that while growth of mobile data is happening, voice services are still business users' primary focus, and the area where they have concerns about service quality and cost.  And let's not forget that most of these business users are still on 2G. They are also receptive to fixed mobile convergence, if it is presented attractively and cost-effectively.

While femtocells may not hit the mass market for a couple of years yet, indoor base stations could well prove the solution to maintaining customer satisfaction among the SME community.  Looking ahead, these ‘mini cells' can also be used to achieve fixed mobile substitution, enabling users to cut expensive mobile call costs by routing calls over broadband IP connections.

Ultimately, picocells could be used to enhance PBX services.  Mobiles could be integrated with the PBX to support call transfer, hunt groups and virtual fixed lines, for instance.  Potentially, picocells could replace the fixed PBX completely with a wireless PBX solution, or even supplant traditional WANs and LANs, although as these are so well-embedded in IT culture, that is certainly not going to happen overnight.  It's interesting to note, however, that the technology is pretty much there to achieve this.

Where are we now?  Apart from deployment of picocells to business users, femtocells - both 2G and 3G - are being developed and trialled around the world, with a number of products and services from a variety of vendors and operators expected to be launched at the end of 2008 or early 2009.  Some markets are more developed than others and, while Europe is expected to be one of the fastest growing pico/femto markets, Sprint in the US announced its own femtocell solution in August 2008.

No new market sector is without some potential pitfalls.  Mass-market roll-out has been cited as a barrier and this certainly is something that needs addressing very soon.   Most mobile and fixed operators are used to supporting deployment of voice-centric mobile phones, but as some of them have found, as soon as you move into mobile data support, then far more technical support tends to be needed.  Furthermore, there is a big difference between distributing mobile phones and PDAs - whether via shops or courier delivery - and rolling out thousands - ultimately millions - of indoor base stations.

This is why it is crucial that pico and femtocells are ‘plug and play' and involve ‘zero touch' deployment. In other words: devices that can be installed by the customer; remotely activated by the operator; and (in the case of RadioFrame's own product line) even remotely updated, all without a truck roll.

In 2007, the Femto Forum was created by a number of vendors and operators to jointly agree a way forward regarding industry standards.  A new technology that does not experience some dissent between players is rare, but the development of universally-accepted standards is happening at a relatively steady pace.  Certainly, standards should not be viewed as a total barrier to indoor base station deployment - after all, picocells are already in commercial operation - though of course, interoperability is very desirable for the market's future.

Another danger is the tendency to ‘over-hype'  picos and femtos.  Realistically, do we really think that every business and consumer will have one within the next 12 months?  I would say not.   So let's not raise expectations to a ridiculous point, or develop business plans that are based on hope rather than common-sense.

That said, the benefits of picos and femtos - to operators and users alike - are very clear.  While predictions on timescales, volumes and market expectations may vary, the general consensus would seem to be that indoor base stations are central to the future of the mobile industry, from 2G to 3G and beyond, not just for consumers but to support business users too.

Mark Keenan is General Manager for Europe, Middle East and Africa, RadioFrame Networks Inc. 
www.RadioFramenetworks.com

By Carsten Storbeck, director of product management with ADC KRONE

Fibre-to-the-home (FTTH) is certain to happen. In some countries it is well advanced, with customers enjoying data speeds of 100Mbit/s into their homes. In other territories, carriers are trying to squeeze the last few years out of their ageing copper networks, but the best they can achieve is around 50Mbit/s. And this simply will not satisfy consumer demand in the coming years. The process of replacing copper cables with fibre is undoubtedly expensive but it must nevertheless happen sooner or later. Otherwise the telecoms companies will lose their broadband business to the cable TV operators.

From a technical point of view, laying fibre as far as every home is not difficult. However, installing fibre cables inside customer buildings can be a far less simple operation. This is particularly the case in continental Europe, where more than 70 per cent of people live in flats, apartments, terraces or town-houses, collectively termed multi-dwelling units or MDUs.
 
Installing ‘traditional' copper cable in these buildings for telephony was easy. This new task is not. Network providers need to deliver broadband at 50 or 100 Mbit/s (perhaps 1Gbit/s) to each dwelling. They could use Category 5e/6 copper cable, but this has a distance limit of 90 metres from the external fibre termination and requires electric power, and probably an uninterruptible power supply, in addition.

A far better alternative is to extend the fibre direct to each and every dwelling within the MDU, but until now this has been a difficult and costly process. Every fibre route must be measured with extreme accuracy and individual fibre cables manufactured to the correct lengths. This is an expensive and time-consuming process and three or four weeks may pass before the cables are delivered to site for installation.

Alternatively, the site technician may be able to install fibre in the cable risers from the basement to the fibre distribution points on each floor and then provide smaller fibre cables to each dwelling. Very great care is necessary, because standard fibre cables cannot tolerate the rough treatment, crude fixing methods and sharp-radius bends that are normal with copper cables. Fibre cables are simply not compatible with technicians' current working methods.

A skilled (and therefore expensive) fibre-splicing technician must either perform the whole job or else visit the site after the cable-laying is complete in order to splice all the fibre cables. This can frequently involve a hundred or more splice joints, making it a lengthy and expensive process, the more so because every splice must be tested afterwards.

In short, cabling a multi-dwelling unit for fibre has been an expensive process until now. This was before the recent launch by ADC KRONE of a fully-modular, ‘plug-and-play' fibre installation system.

The new breed of fibre system for MDU applications includes fibre cable developed using military experience.  It can be stapled to all kinds of architectural fittings without damage and bent around every type of right-angle found in buildings (on average every horizontal run needs to pass around 15 right-angles). It can even be crushed repeatedly without either damage or degradation of signal.

Accompanying this rugged cable are highly ingenious fibre distribution points that include a concealed cable-reel pre-loaded with 30, 60 or 90 metres of pre-terminated fibre cable that connects back to the previous distribution point.

In fact there are only four components in this system and, even with the different fibre-length variants, only nine component variants, all of which can be stocked and held in the technician's vehicle. With a stock of these nine variants in his van, the wireman can arrive and start work immediately. There's no need for a site survey, nor a four-week wait while custom fibres are manufactured.

Because the fibre cables are all pre-tested at the factory, the only testing required on-site is to check the signal levels in each customer dwelling. With this novel approach, the whole process is just as simple as installing old-fashioned copper cable. The components are simple, durable, long-life and far less expensive than existing MDU fibre distribution. In this way installation costs have been reduced by 60 per cent by major telecoms carriers in the USA, where the equipment has been proven in both central office and field environments.

Taking place 29 September through 2 October, the International Engineering Consortium's (IEC) Broadband World Forum Europe 2008 will once again gather the world's top ICT thought leaders at the Brussels Expo in Brussels, Belgium.

Co-located with host sponsor Belgacom's 35th Annual ICT Symposium, the Broadband World Forum Europe 2008 will represent the entire ICT value chain and provide winning solutions to those aching to maximize the promise of broadband.

"In collaboration with the co-located ICT Symposium, this year's event will combine telecommunications solutions with innovations for enterprise," comments IEC President John Janowiak. "The exhibition and workshops will cover technology, business, strategic and operational issues on topics such as mobility, collaboration, security, and risk management, and address the most promising alternatives for moving forward in the ICT industry."

Themed "Delivering the Promise," the Broadband World Forum Europe 2008 will present key industry leaders at the event including World Forum Chair Scott Alcott, executive vice president of the service delivery engine at Belgacom; and Keynoters Didier Bellens, chief executive officer of Belgacom; Pat Russo, chief executive officer of Alcatel-Lucent; John McMahon, president and managing director of the European department at Sony Pictures Television International; Carl-Henric Svanberg, president and chief executive officer of Ericsson; and Julio Linares Lopez, chief operating officer of Telefonica de España.

The event provides "a great opportunity to see what is happening, what is on the future time path, and great interaction with all of the professionals around the world," declared World Forum Chair Scott Alcott.

Industry professionals will have the opportunity to learn from more than 250 global leaders at the forefront of broadband services, applications, and technologies, and to hear first-hand from some of the world's top product experts as more than 90 key players exhibit the latest cutting-edge advancements on the floor. The IEC will present its renowned world-class education in more than 50 keynote addresses, plenary panels, workshops, and sessions.

The World Forum has a history of drawing the world's ICT decision makers: 87 per cent of last year's attendees were manager-level and above.  In addition, more than 100 service provider companies traveled from around the globe to conduct business at the conference and exhibition.

The Broadband World Forum's InfoVision Awards will also take place, honoring the most unique and beneficial technologies, applications, products, advances and services in the industry.

Key event sponsors of the Broadband World Forum Europe 2008 include Official Host Sponsor Belgacom, as well as Nokia Siemens Networks, Alcatel Lucent, Ericsson, Huawei, NEC, Thomson, ZTE, Italtel, ADVA, Allied Telesis, Astra, AVM, Actelis, ADC, Soapstone Networks, and Cisco.

The IEC welcomes all ICT industry professionals to attend, especially those involved with Carriers/Service Providers, Enterprise, Software Providers, Integrators/Aggregators, Content, Over-the-Top-Carriers, Residential, Equipment Manufacturers, and Semiconductor Manufacturers.
www.iec.org/events/2008/bbwf/register/

Pay by mobile
A new analysis of the global mobile payments opportunity forecasts that 2.1bn mobile subscribers will "pay by mobile" for digital goods downloaded to their mobile phones by 2013. Juniper Research defines digital goods as music (ringtones and full tracks), tickets, TV, user-generated content, infotainment and games - in fact any content bought by phone and delivered to the phone.

A region-by-region analysis by Juniper Research found that there is a significant growth opportunity not only for mobile payment systems, software, support and consultancy services vendors, but also for mobile operators to increase their arpu as transaction frequencies accelerate.

Report author Howard Wilcox notes: "Many digital content goods and services are becoming basic ‘must haves' - particularly in the sub 35 age group.  Devices like the iPhone - even in its 3G incarnation - are undoubtedly contributing to consumer awareness and usage of mobile music services. People who are 15 to 20 today will expect to buy directly with their phones and will drive this market over the next few years."

Highlights from the report include:

  • Users are forecast to make at least two payment transactions per month for digital goods by 2013
  • Nearly half of all mobile phone users will have bought digital goods at least once with their phones by 2013
  • The two leading regions (Western Europe and Far East & China) will account for over 50 per cent of the total digital goods gross transaction market value by 2013.

Howard Wilcox continues: "Even though typical transaction sizes will remain in the $3-$5 bracket, a sufficient number of users will be using their mobiles to buy music, games, tickets, infotainment and the other digital goods sufficiently often to see gross transaction value grow nearly seven-fold by 2013."

The report also focuses on purchases of physical goods - ranging from gifts to household goods to electronics - via the mobile web. It provides six-year regional forecasts of mobile payments for digital and physical goods, providing data on subscriber take-up, transaction sizes and volumes as well as detailed case studies from companies pioneering in this market. Juniper Research interviewed 37 senior executives across a wide range of vendors and operators.

Whitepapers and further details of the study, 'Mobile Payment Markets: Digital & Physical Goods 2008 - 2013' can be downloaded from www.juniperresearch.com

Wireless trends
The Wireless Technology Trends Report fills the need for a comprehensive wireless analysis, serving companies that need to understand where individual technologies stand in the context of the overall market. The report is written for companies involved in, or potentially entering, the wireless market that are interested in a general overview accompanied by detailed forecasts.

"Over the last 12 months, many of the emerging wireless technologies have begun to exploit market sectors ranging from home automation to industrial and consumer electronics," says Dr Kirsten West, Principal Analyst of WTRS.   "The adoption of wireless as a pervasive technology is not a matter of "if", but when. The consumer desire for increasingly unfettered wireless connectivity is clear."

The report's key findings include:  

  • Ultra wideband has shifted from a nascent technology to a solidly emerging protocol underlying the certified wireless USB and Bluetooth "high speed" implementations.
  • Certified wireless USB will enable wireless transfer of photographs and other media from consumer electronics devices to personal computing and output devices.
  • Bluetooth has undergone an expansion campaign over the last 18 months to incorporate newly-emerging protocols such as near-field communications (NFC), Wibree, and Ultra Wideband.
  • ZigBee is poised to make great strides in the next year and many new products are expected to be released to market. It is very possible that the momentum of ZigBee will finally surge in 2008.
  • Wireless delivery of high definition video content is an area that has very much emerged over the last 12 months. The market for wireless delivery of high definition video promises to be large, with decisive consumer adoption once the technology has been proven in end products.
  • WiMAX has become a significant competitor to alternative technologies in emerging markets.

www.researchandmarkets.com/product/641244/wireless_technology_trends_report_2008 
 
Research push
On 10 September, the European Commission officially launched 14 research projects, which are part of the FIRE initiative for Future Internet Research and Experimentation.

The launch event in the Paris City Hall was opened by Jean-Louis Missika, Deputy Mayor of Paris in charge of Innovation and Education, and Gilles Bloch, Director General of Research and Innovation at the French Ministry of Education, Higher Education and Research. More than 165 researchers and innovation managers, including European experts from national and European research projects, as well as representatives of European and national research funding institutions, attended.

The event, which took place in the context of the French EU Presidency, was organised by the Directorate General for Information Society and Media of the European Commission in cooperation with the EC-funded projects FIREworks and OneLab2.

The 14 FIRE projects contributing to the event are funded by the EC under the 7th Framework Programme for Research (FP7) and cover a wide range of research topics in the area of Future Internet Research and Experimentation, including advanced networking approaches to architectures and protocols, coupled with their validation in large-scale testing environments, as well as interconnection of test beds that enable experimentation on a large scale. The total planned budget of these projects is more than 58.5 million euro, of which the EC funds about 40 million euro.
www.ict-fireworks.eu

Premium content
The mobile communication market in Europe has reached a saturated and mature phase. Mobile penetration is more than 100 per cent in many western and eastern countries, even as many other countries rapidly reach full penetration. It is clear that the mobile industry in Europe needs to invest in and commit to other services and applications in order to keep growing. Mobile premium content services and applications represent a potential source of significant revenues for the mobile industry.

New analysis from Frost & Sullivan, European Mobile Premium Content Markets, finds that the market (including revenues from mobile music, mobile games, mobile video/TV and mobile graphics) was worth €2.68 billion in 2007 and is estimated to reach €11.0 billion in 2012.

"Content is the new horizon for the European mobile industry," notes Frost & Sullivan Research Analyst Saverio Romeo. "During the last three years, mobile operators have been observing a slow, but continuous decline in the average revenue per user (arpu) due to the decrease of voice and SMS arpu. New sources of revenues are needed: content is an excellent candidate."

Content types such as music, video/TV and games are leading the content growth. However, new services and applications such as mobile social networking, mobile searching and location-based services are gaining momentum. All these services, which can be defined as content tools, allow users to personalise, search and share content with other users. Business models are also shifting towards ad-based models.

"In order to exploit the variety of revenue-generated business opportunities, the industry has to face some critical challenges," cautions Romeo. "Consumers will use content on mobile devices if the industry is able to offer high-quality content with an excellent user experience at affordable prices."
www.frost.com

The unification of all communication devices inside a single platform offers huge advantages for businesses looking to streamline their operations, yet requires careful planning and management if it is to deliver the benefits it promises. Driven in large part by the rise in mobile working and the growing need for more flexibility across different communications devices, demand for unified communications has grown in recent years, with legal firms and government at the forefront of adopting the platforms, explains Martin Anwyll

The growth of the mobile workforce is a key factor in the growing demand for unified communications: a recent Forrester report revealed that 64 per cent of the 2,187 US and European companies surveyed listed mobility support for employees as a ‘priority', and nearly one in five as a ‘critical priority', in 2008. Analyst firm Gartner, meanwhile, has estimated that 46.6 million people will be spending at least one day working at home by 2011. Businesses are shifting towards a decentralised workforce to reduce costly office space.

The impetus to ‘go green' in these energy-conscious times has moved up nearly every business's agenda over the past few months, and with ever-increasing fuel costs, the use of online and virtual meetings can significantly reduce business travel costs and lower an organisation's carbon footprint.

However, the unified communications network is an inherently more sophisticated and complex environment, making high quality service difficult to deliver. What are the key issues that managers need to consider in order to quickly identify and troubleshoot any issues and avoid costly downtime?

Unified communications comprises many interdependent and heterogeneous parts, making it more difficult to ensure an acceptable and consistent quality of service. Poor quality of service can put the goals and business benefits of unified communications in jeopardy; for example, poor quality of communications is likely to deter a mobile workforce from using real-time communications, reducing any potential benefits. Managers must be able to quickly identify degradation in quality of service or, if any service falls over, take the appropriate action. Without monitoring, time to resolution can be significantly impacted, resulting in a poor user experience, reduced productivity and, ultimately, loss of business.

Unified communications can offer an organisation new flexibility and manageability for employees, delivering unprecedented levels of connection across the distributed workforce. Not only does unified communications help with unravelling bottlenecks, it supports closer collaboration across the business and provides a competitive edge, enabling employees to contact each other more quickly and eliminating delays caused by the inability to reach key decision-makers.

There are usually several factors that contribute to a manager being unable to deliver high quality of service, ranging from IP networks not being ready for real-time communications to organisations employing VoIP technology from more than one vendor. Real-time communication solutions such as email, instant messaging and VoIP rely upon consistent, stable and low-latency network connections. However, most existing networks are built upon technologies that were not originally designed to deliver this, and the new communication technologies are much less tolerant and so prone to service issues.
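
To ground this, the short Python sketch below samples round-trip latency and jitter the way a basic monitoring probe might, by timing TCP connection setups; the target host and port are placeholders rather than a real service, and a production tool would measure live RTP media streams rather than connection times.

    # Minimal latency/jitter probe: time repeated TCP connection setups
    # (roughly one round trip each) to a host you control, then report the spread.
    import socket
    import statistics
    import time

    def sample_rtt_ms(host, port, samples=10):
        rtts = []
        for _ in range(samples):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=2):
                pass                        # connected = one round trip completed
            rtts.append((time.perf_counter() - start) * 1000.0)
            time.sleep(0.2)                 # pace the probes
        return rtts

    rtts = sample_rtt_ms("sip-gateway.example.net", 5060)   # hypothetical gateway
    print(f"mean RTT {statistics.mean(rtts):.1f} ms, "
          f"jitter (stdev) {statistics.stdev(rtts):.1f} ms")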

Unified communications is usually multi-vendor by nature: mergers, acquisitions and fragmented purchasing processes mean that an organisation will often deploy VoIP technology from more than one of the major VoIP vendors, such as Cisco, Nortel and Avaya. There are very few management vendors that support multiple VoIP technologies. Of those vendors that do, it is rare that they are able to manage other unified communications solutions such as Microsoft Exchange, BlackBerry or other business applications.

Each different technology or application that supports unified communications tends to bring its own native management tool. These native management tools generally support a specific piece of the organisation's infrastructure and not the service as a whole. This usually results in holes in unified communications management, or a patchwork of disparate tools assembled to address the management challenge.

The use of an enterprise real-time communications server, for example Microsoft OCS, brings instant messaging, presence, audio-video conferencing and web conferencing into the unified communications infrastructure, but administrators should be aware that intermittent network problems can cause issues - for example, when network congestion causes dropped connections in conference calls, or when poor call quality is detected from an OCS client to a non-OCS client.

Organisations that use an OCS platform in conjunction with another voice system must have a solution in place that allows them to have visibility of all major components so that they can effectively manage and monitor the unified communications environment. It is important that organisations are able to proactively anticipate potential problems before they have an impact on their core business.

Security also needs to be a priority in the buying equation.  With technologies in place that only provide protection from existing and impending threats, the likelihood of a major and successful attack on unified communications systems is growing for one simple reason: end-user failure to implement security techniques properly. For example, traditional firewalls do not protect VoIP calls, as voice packets must be encrypted and traverse a firewall without undue latency.  Any network that ends with an IP address is vulnerable to unauthorised calls, spammers, information theft and other malicious activity by hackers, as well as DoS (Denial of Service) attacks that can, at best, adversely impact call quality. In a worst-case scenario, the entire network can be at risk during a VoIP security breach.

In order to be effective from a security perspective, the unified communications management system must provide an automated security layer that monitors the entire unified communications environment in real time to increase protection levels and ensure layered defences. It should be capable of correlating security events, alerting on security breaches and performing analysis and forensics - all in real time.

Organisations that are looking to deploy unified communications need visibility into the health of the entire communications platform and more importantly need to manage the service that it delivers. By using a comprehensive lifecycle management approach an organisation can successfully ensure that deployment, operation and continued roll-out of unified communication services is delivered at a high standard.

Unifying communications helps streamline business processes and improved connectivity has a direct influence on information sharing, productivity and efficiency. The task of unifying communication applications is actually the opposite of shifting to a single communications platform. Multiple, often unrelated systems must be linked together to appear seamless to the end user. The one unifying element is the underlying data network.

The process of planning, managing and improving begins before deployment and this means that prior to any new communications tools or technologies being used, the capacity of the network must be assessed to see whether it can cope with the anticipated communications traffic.
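
As a simple illustration of that pre-deployment assessment, the Python sketch below estimates the bandwidth consumed by concurrent VoIP calls; the codec and per-packet overhead figures are the standard ones for G.711 over Ethernet, while the call count and link size are invented inputs.

    # Rough capacity check: bandwidth needed for N concurrent G.711 calls.
    # Per-packet overhead: IP(20) + UDP(8) + RTP(12) + Ethernet(18) bytes.
    def voip_bandwidth_kbps(calls, codec_kbps=64.0, packet_ms=20, overhead_bytes=58):
        packets_per_sec = 1000 / packet_ms                   # 50 pps at 20 ms
        overhead_kbps = packets_per_sec * overhead_bytes * 8 / 1000
        return calls * (codec_kbps + overhead_kbps)          # per direction

    calls = 30                                               # assumed busy-hour peak
    need = voip_bandwidth_kbps(calls)                        # ~2,616 kbit/s
    link_kbps = 10_000                                       # assumed 10 Mbit/s uplink
    print(f"{calls} calls need ~{need:.0f} kbit/s per direction "
          f"({100 * need / link_kbps:.0f}% of the link)")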

Once deployment has taken place, organisations must constantly monitor and manage the unified communications system and services. Monitoring should take place at the element level and from the perspective of the user. Taking a proactive approach can save an organisation time and money by identifying problems before they have an impact on end users, freeing up time for managers to apply a more cost-effective solution. It is essential that organisations have the ability to quickly and easily troubleshoot issues; this requires tools and knowledge that are specific to unified communications technologies, especially those providing real-time communications.

Proactive assessment and monitoring tools provide organisations with the ability to generate comprehensive reports that detail the usage and performance of the various elements of the unified communications infrastructure. The ability to generate reports enables managers to adjust elements such as call tracking, dial plans, gateway utilisation and external links to improve service delivery.

Managing a unified communications system needs extensive visibility into the organisation's converged voice and data environment. Vendors offer management capabilities for fundamental unified communications technologies including VoIP, Microsoft Exchange Server, Active Directory and networking equipment. The ability to monitor, troubleshoot, report, diagnose and resolve events with one management solution enables an organisation to correlate events and take corrective or preventative action, effectively minimising resources and time to resolve issues.

The unified communications management market is young and immature. Many organisations are defining their own approach to unified communications and how it meets their specific needs. Organisations need to find a unified communications management solution that fits comfortably, with the flexibility to be tailored to the company's needs.

Business has always relied heavily on communications, and organisations that have integrated business applications into the unified communications platform have found that they were able to resolve customer issues faster whilst maintaining a higher quality communication experience.

As communication platforms become more complex and integrated in nature, organisations require tools to assess, monitor, troubleshoot, and secure voice and data transactions. Organisations cannot afford to function in today's economy without the assurance of their communication systems' performance.

Martin Anwyll is Product Line Specialist, VoIP Solutions (EMEA) for NetIQ.

While we are still at an early point in the evolution of 4G network technologies, Rob Dalgety looks forward to greater clarity on how the different technologies will fare and how the market will shake out

Vodafone, T-Mobile and France Telecom have all announced plans to deploy LTE-based 4G networks. Some of these service providers are also planning to support WiMAX, another 4G technology, in addition to LTE. Live deployments of WiMAX are already underway in different parts of the world. Still other service providers are getting behind UMB, WiMAX, LTE or some mix of the three. It feels like we had not really finished discussing 3G technologies before moving on to 4G, and a dazzling array of old and new technology acronyms - WiMAX, LTE, UMB, OFDMA, 3GPP, 3GPP2, IEEE - now forms part of the 4G lexicon.

There are three primary network technologies that support ‘4G'. The first is WiMAX:  Worldwide Interoperability for Microwave Access.  WiMAX was initially developed to support ‘last-mile' wireless broadband as an alternative to wired technologies, and has now been extended to support use cases that are truly mobile (non-line-of-sight communications, mobility via the IEEE 802.16e standard, etc.). 

It is currently the most mature of the 4G technologies, with stabilised standards and a number of live deployments around the world.

The second is LTE: Long Term Evolution.  This technology is an evolution of many of the currently deployed ‘GSM Family' of cellular networks (3G/UMTS and 2G/GSM networks) for which the standards are currently being finalised. This technology has the support of a large number of mobile operators and major equipment vendors, as it is intended to provide a relatively straightforward upgrade path from the current network infrastructures.  The first live deployments of LTE are expected in 2010 and beyond.

The final primary network technology in question is UMB: Ultra Mobile Broadband. This is the OFDMA-based 4G extension to the CDMA-2000 3G standard driven by Qualcomm.  This is primarily a route to 4G for service providers who currently use CDMA-2000-based networks. We are still at an early point in the evolution of 4G network technologies. As time passes, it will become clearer how each of these different technologies will fare and how the market will shake out.  Different network technologies will be deployed in different regions and territories depending on a variety of factors, including the commercial and regulatory environment, the availability of spectrum, and the applicability of different network technologies to the aspirations and goals of the service providers.

Moving to these 4G technologies will provide technical and economic opportunities for different service providers. These include the cost efficiencies gained by ensuring IP support in both the core and radio-access networks, as well as significant performance improvements, which can range from improved data throughput to reduced latency and increased capacity (subject to spectrum).  Evaluating these technical and economic factors is a significant part of the process of deciding when and how to move to a 4G technology.
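
To illustrate why capacity is "subject to spectrum", the Python sketch below applies the Shannon limit, C = B log2(1 + SNR): channel bandwidth caps achievable throughput whichever 4G radio technology is chosen. The carrier bandwidths and SNR used here are illustrative, not operator figures.

    # Shannon capacity ceiling for a few typical carrier bandwidths.
    import math

    def shannon_capacity_mbps(bandwidth_mhz, snr_db):
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz x bit/s/Hz = Mbit/s

    for mhz in (5, 10, 20):
        print(f"{mhz:>2} MHz at 20 dB SNR -> "
              f"{shannon_capacity_mbps(mhz, 20.0):.0f} Mbit/s ceiling")
    # Doubling spectrum doubles the ceiling; extra SNR helps only logarithmically.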

Also key to the process of deciding when and how to move to a 4G technology are the new service opportunities that 4G enables: the new mobile services that can be presented to end users. These can be distilled into four main areas:

  • Fixed Services: 4G technologies such as WiMAX and LTE provide the ability to support broadband and voice services that have, until now, been delivered by wireline technologies. There are opportunities to:

    - Substitute for wireline technologies, for example in rural areas, where the economics of wireline deployment are undermined by low subscriber density; and/or
    - Cannibalise fixed services, by using the wireless network to form part of the proposition where the wireless service offering is more compelling across the marketing mix than the fixed alternatives.

  • Mobile Computing: Enabling truly mobile computing (laptops, modems, dongles, tablets and ultra-mobile PCs) is another service opportunity for 4G technologies. The higher bandwidth and lower latency of 4G networks fit well with the services used by these more data-centric devices - whether the services are accessing a corporate network and enterprise applications, or using other data-heavy applications.
  • Mobile (2.0) Services: Beyond data-centric devices like computers, 4G networks will deliver the underlying network connectivity and bandwidth to unleash more advanced next-generation data services on traditional mobile devices.
  • Mobile Consumer Electronic Devices: 4G networks will also offer the opportunity to support connectivity for a wider range of consumer electronic devices - from MP3 players and cameras to personal media players. The mobile phone will be just one of many end-user devices connected to 4G networks. All of these different devices can benefit from this connectivity, which will enable support for over-the-air content and data updates as well as ongoing management of the devices themselves.

In general, timing of the rollout and uptake of these services should follow the order outlined above.  However, there will be many regional differences in Europe and around the world.  For example, territories that have extensive and well-developed wireline infrastructures may see more focus on the mobile services opportunities (points 2-4 above) than on the fixed services opportunities.  The commercial strategies of service providers will also significantly impact service mixes and rollout plans. Some service providers may decide to innovate, creating new service opportunities (such as providing connectivity for mobile consumer electronics devices). Others may go after known market opportunities that are currently serviced by wireline technologies, thereby cannibalising the ‘wired services'.

It is the new wireless network technologies themselves that currently dominate the discussion of 4G. This focus on network technologies is not a new phenomenon; we saw it when 3G networks rolled out, and we are seeing it now as the 4G market evolves.  The network itself will form an important part of the discussion as we move into the 4G world.  However, based on the quantity of conference papers, standards body activity, press coverage and news, you might think that once the 4G network issues are resolved and networks deployed, any issues having to do with delivering services over those networks will be resolved as well.

The reality is that this is far from the case.  There is a range of other enabling-technology elements that are essential to delivering 4G services. Critical components that need to be considered include the devices and services that will use the network. In building a truly functional and vibrant ecosystem, we not only need to deal with network, device, application and service issues, but we also need to deliver the key advanced management capabilities and tools that are essential for a truly seamless, high-quality end-user experience.

A critical lesson learned from the cellular world is that as the service environment becomes more complex, embedding advanced visibility and manageability into devices and services is critical to optimising new services and delivering a great end-user experience. These capabilities will be even more critical as we move into the 4G world with all its inherent complexity.  For example, in a 4G ecosystem, the device might be anything from a mobile phone to a computer to an MP3 player or a vending machine.  Being able to detect the device, recognise it for what it is, activate it and configure the relevant services on it without the need for human intervention constitutes the first step to an excellent user experience.   After all, there would be no point activating and configuring a vending machine for voice services!
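
As a toy sketch of that zero-touch idea, the Python below maps a detected device type to the services to provision; the device types and profiles are invented for illustration, and a real system would key off device-management data models rather than simple labels.

    # Map detected device types to the service profiles to auto-provision.
    SERVICE_PROFILES = {
        "handset":         {"voice", "messaging", "mobile_tv"},
        "laptop_modem":    {"broadband_data", "vpn"},
        "mp3_player":      {"content_sync"},
        "vending_machine": {"telemetry"},     # no voice services here!
    }

    def provision(device_type):
        """Return the set of services to activate for a detected device."""
        try:
            return SERVICE_PROFILES[device_type]
        except KeyError:
            raise ValueError(f"unknown device type: {device_type!r}")

    print(provision("vending_machine"))       # -> {'telemetry'}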

The technical ecosystem is critical, but it is compelling service propositions delivering value to consumers or enterprises that will ultimately drive end-user demand for 4G services.  These propositions will rely on the inherent capabilities of the network as well as other technical components. But the litmus test will be the full value proposition service providers deliver to their users - the '7 Ps' of marketing. For a truly compelling value proposition, service providers must consider it all - from the pricing and promotional offers, to the service functions (product), distribution channels (place) and physical presence - especially of devices (from traditional mobile devices to other connected consumer electronic devices) - to the provisioning and setting up of services (process), and all of the delivery and support processes (people) that accompany the offerings.

There are significant service opportunities in the 4G environment. The network is a core part of the equation - the oxygen that will support successful 4G services. It is also a significant component of the business case for investment, and deserves attention for that reason as well. However, there are other enabling technologies that will be critical to the success of 4G, including devices, services and management capabilities. These will all be important technical components in ensuring a fully functioning 4G technology ecosystem. The ability to grow the usage of services - and reap the revenues that increased usage will bring - will also depend on the value proposition service providers deliver to end users. Offer compelling value propositions delivered over a superior technology infrastructure, and 4G services will fly.

Rob Dalgety is Commercial Director, Mformation Technologies
www.mformation.com

The early hype for mobile TV may now have died away, but Kamil Grajski is certain that successful mobile broadcast is on its way

The mobile TV hype is over. Aggressive projections regarding consumer adoption and revenue (whether via advertising or subscription) have not come to pass. Chief among the more creative aspirations was that by the opening ceremony of the 2008 Summer Olympics, aided by a European Commission sponsored DVB-H technology mandate, the European Union would be well along the road to rapid adoption of mobile TV and mobile broadcast services. Worse, news has emerged in recent days that, due in part to a lack of mobile network operator engagement, the Mobile 3.0 consortium that had won a license in Germany with plans to use DVB-H for its mobile TV service may be on the verge of collapse. In parallel, it was announced last week that Norwegian broadcasters NRK, TV 2 and MTG had banded together to launch a free-to-air T-DMB-based mobile TV service.

All of this is not to pick exclusively on DVB-H.  Not at all.

The reality is that there are challenges aplenty across the whole mobile TV technology spectrum. For example, operators in Korea have launched services using the T-DMB standard, which boasts more than sixty device types and more than 8M subscribers. But even with an audience of that size, advertising revenue is running behind expectations. In the USA, which recently managed to air a live, dedicated mobile TV Olympics coverage channel via FLO-powered AT&T and Verizon services, uptake of mobile TV has been slower than many expected. Qualcomm's CEO Paul Jacobs, whose company supports the FLO broadcast standard as well as other technologies, has publicly expressed a desire to see greater mobile network operator marketing and promotion of mobile TV services.

But despite this seemingly relentless march of bad news, the underlying trends for mobile TV still register strong positives. In short, the hype is gone, merely to be replaced by the harsh realities of establishing a new global consumer mass-market medium. Although TV is seen as a mature medium, transferring the broadcast model of cable, satellite and terrestrial networks to mobile is a massive undertaking. Despite some early teething troubles, mobile broadcast TV has launched commercially in a number of closely followed markets around the world, including Japan, Korea, Europe (Italy) and the United States. Other on-air or planned launches include Austria, Finland and the Netherlands, among others. So the commitment from operators and the industry is there.

So what has been learned so far?  For those operators that launched with a free-to-air model (and lots of devices), such as in Japan and Korea, initial consumer adoption is not the major issue - the services are in fact proving extremely popular.  What remains unclear is revenue and profitability growth, as operators try to move consumers over to the more lucrative subscription channels that feature premium content.  Conversely, operators that launched with only a pay-TV model have seen adoption as the greater issue. And even then, long term revenue and profitability growth are yet to be fully tested.

In Europe, 3 Italy, which was first to market with a DVB-H-powered mobile pay-TV service, recently announced the addition of a free-to-air bouquet as a way of boosting adoption. Similarly, MediaFLO USA has added a free-to-air promotional channel aimed at giving consumers an easy way to sample and subscribe. Thus, in response to market realities, the launched mobile broadcast market is implementing and testing the proposition that a hybrid free-to-air and pay-TV model may be optimal to drive adoption and fuel business growth.

Looking at forthcoming deployments, the medium- and long-term trends remain positive. In the medium term, the realization that capacity (spectral efficiency) is pivotal to broadcast TV's success has prompted a review of the different technologies on offer. Capacity is important for two reasons. Firstly, the ability to deliver more channels within the same spectrum means that operators can offer a greater mix of paid and free content as they seek to drive uptake. Secondly, capacity determines the amount of spectrum needed to deliver TV services. This is of critical importance in Europe, where the pressure on available spectrum varies greatly from country to country. This is one of the advantages of the FLO air interface, which supports roughly 1.5 to 2 times as many video channels as any other mobile broadcast technology. Such capacity gives excellent flexibility in adjusting bandwidth between free-to-air and subscription-based programming.

For example, while it is too early to call it a widespread trend, we've observed regulators (France and the UAE among others) implement must-carry free-to-air requirements on mobile broadcast licensees. The number of such free-to-air channels varies, but can be as high as 5-10. MediaFLO operating in an 8MHz channel can deliver 30+ streaming video channels (25+ frames per second; QVGA; AAC+ stereo audio). Thus a 10-channel free-to-air requirement still leaves up to 20 channels to power a subscription-based service, or to allocate to such services as multimedia file delivery (Clipcast), IP datacast services and mobile interactive TV.

Also in the medium term we can expect action to result from recently concluded spectrum auctions, including those in the United States and the United Kingdom.

Two key long-term trends signal continued positive momentum.  First, mobile broadcast related regulatory consultations have recently concluded or are in progress in India, Singapore, Taiwan, Ireland and the United Arab Emirates.  In Europe there is steady progress relating to the Digital Dividend and spectrum harmonization primarily via the CEPT ECC Task Group 4 operating under mandate from the European Commission.  Second, 3G-based mobile TV continues to build momentum.  In key high-growth markets, such as throughout the Middle East and North Africa, 3G-based mobile TV has created spirited competition between operators.

Ultimately, as 3G-based mobile TV and related services drive adoption and simultaneous use grows, the low-cost bearer economics of mobile broadcast in general, and the capacity performance of MediaFLO in particular, will come into their own.

Back to today.  While the mobile industry still works out the details of how best to create a robust mobile TV business, consumers are still getting to grips with even basic multimedia content. But there is optimism in the air: as people get used to browsing the web and consuming video and music on their mobile devices, they set the stage for TV and other richer content in the future. So despite the death of all the early hype, it's certainly a question of ‘when' and not ‘if' for mobile broadcast TV. Nobody said that changing habits was easy, but done right the effect can be amazing. Just ask Apple; not for what it has achieved for itself, but rather for the catalytic effect the iPhone has had on the whole industry. The work we do today prepares the ground for the services we enjoy tomorrow.

Kamil Grajski is President of the FLO Forum
www.floforum.org

In the face of continued threats to mobile communications from such factors as message-borne spyware, malicious messages and invasive spam texts, education is the next key step in the fight for mobile security, explains Jay Seaton

In May this year, IMS Research released data predicting that by 2012, 900 million users will be accessing banking and payment services through their mobile phones. This enormous figure reflects the evolution of consumer activity from the high street bank, through the PC, and on to the smartphone - and yet there is little to no education for users on the risks involved in accessing banking data through their mobile phones. So while the last few years have focused on educating the public about banking, shopping and online activity on their PCs, the mobile phone is a new arena where operators need to step up and educate users on protecting their personal data on the go.

Many consumers assume that operators pre-load security functionality on to handsets. When purchasing a mobile phone, consumers are offered insurance against loss or theft of the handset, but nothing relating to mobile security. Indeed, McAfee's 2008 Mobile Security report identified that 72 per cent of mobile users were concerned about the level of security services for their mobile phones - showing that mobile security remains far less prominent than PC security, even though the risks are just as real. With the advancement of the mobile phone come the additional tasks of protecting information stored and accessed via the device.

While the meaning of "security" hasn't changed much over time, its context has evolved at a frightening pace, with more and more risks becoming ever present in consumers' everyday lives. There have already been several prominently reported stories this year of confidential information being accessed and taken out of the workplace through employees' devices. Indeed, a recent survey by Decipher Inc found that "70 per cent (of those questioned) said they access what they consider to be sensitive data on their smartphone in order to work outside the office." While the mobile phone is not a new arena for security threats, it is still hugely overshadowed by the traditional areas of threat to home, work and personal computers. With record volumes of spam hitting consumers' mobiles on an hourly basis, it's time to shift the focus from the online world to the personal realm of mobile communication.

For consumers the mobile phone has opened up a world of new possibilities. Mobile subscribers can now use their phones for a host of activities - be it paying for goods, accessing bank details, purchasing cinema and concert tickets, travelling around cities or visiting social networking sites such as Facebook. All these tools are aimed at making it easier for consumers to live and work on the move. However, for all the progress that has been made, there is still very little information available to mobile users about protecting against the same dangers that would be second nature to them whilst working on a PC.

So what areas are mobile users most at risk from? 

Unwanted SMS
One of the most prevalent security risks for mobile users - and now one of the most widely recognised - is unwanted SMS: most users would recognise unwanted advertising or, in worse scenarios, unwanted and malicious messages. With an estimated 72 per cent of all mobile phone subscribers worldwide being active users of SMS, each is at risk from several forms of SMS abuse, from unwanted advertising and denial-of-service (DoS) attacks in the form of SMS flooding, to scam messages encouraging subscribers to make premium rate calls.

Spam Text
Global SMS spam levels continue to rise at a frightening pace. In March this year, China saw an unprecedented influx of spam messages, with 200 million China Mobile subscribers hit. What makes life easy for spammers is the low price of SMS, which means users can be targeted from outside their own country. Meanwhile, with mobile marketing becoming ever more popular, Application-to-Person (A2P) SMS is gradually becoming more common as advertisers aim to reach audiences through different channels.

Messaging-borne spyware and malware 
Spyware and malware are among the most malicious forms of mobile threat. Users can be targeted through a message containing a URL or web link which, once clicked, downloads a virus or an application with a hidden piece of code to the handset. The most common strain of Trojan targets the user's address book, infecting all contacts with the same virus, and the pattern repeats itself. Small businesses in particular are at risk, as they are more likely to be using smartphones with unsecured email clients. Additionally, the perpetrators behind the Trojans use fake mobile accounts that cannot be billed to a single operator, meaning there is a huge volume of messaging traffic that no one is paying for, which makes them very hard to trace.

MMS threats
Despite currently being less prominent than SMS as a vector for attacking mobile users, malicious MMS messages are a threat to users with Bluetooth enabled. The virus, once installed on a device, can replicate itself via the Bluetooth application as well as through the phone's address book. The user is then charged for the huge volume of messaging that has taken place without their knowledge, while the phone's battery is drained in the process. MMS threats can, however, be controlled more easily, simply by disabling the Bluetooth function when not in use.

So what can consumers do to prevent these threats? First, more education is required from the operators and network providers - 55 per cent of users expect mobile operators to preload mobile security functionality on to all handsets, and with PC security readily available, the lack of comparable information for mobile users means they are left to make assumptions. With planning already well under way for London's 2012 Olympic Games, it's important to ensure security on all levels is in place - and mobile security needs to be a key part of this.

Second, the mobile operators need to heed this advice and deploy mobile security tools and services to ensure their subscribers are protected. For under-18s and other vulnerable users, a mobile operator can empower parents to control who can contact their children, and the types of content they are willing to receive. This can be done through content controls which allow parents to prevent children from accessing inappropriate web and WAP sites; receiving unwanted and unsolicited messages such as phishing attempts, bullying and harassment, or pornographic images by MMS; or subscribing to unwanted premium rate messaging services.

For corporate organisations, operators can not only provide subscribers with the means to enforce corporate usage policies (ensuring mobile data compliance with existing LAN Acceptable Use Policies) but can also extend this capability from Internet access to embrace messaging, safeguarding users from spam, phishing and virus attacks while also protecting the operator's network.

However, it is not only the end users who need protecting. The mobile operators' networks are also affected by SMS fraud, leading to revenue loss between operators. Studies of operator traffic show that typically one to two per cent of all traffic carried may be spoofed or faked, which, given the large messaging volumes carried, results in significant direct costs.

Growing mobile messaging and data revenues depends upon the growth of accessible mobile content. However without controls, users are potentially subject to harassment, unsolicited messaging, inappropriate content and fraud. Unless addressed, these concerns will inhibit the growth of mobile phone penetration in new segments, and the usage of messaging and data. Without the ability to preserve privacy through managing content and access, a user has one choice - suffer or switch off the service.

Jay Seaton is Chief Marketing Officer at Airwide Solutions

Martin Creaner sets the scene for the TM Forum's Management World Orlando, against the background of the massive changes taking place within the communications, entertainment and information marketplace

We've all heard the saying "May you live in interesting times." I think that adage applies very nicely to the telecommunications industry. If you've blinked at all in the past 20 years, you'll have missed some striking changes that have seen the shift from traditional telcos leading the way, to the importance of cable providers, and now increasingly to the rise of Internet and new media companies.

These are certainly interesting times to be a telecom observer, but if you're entrenched in the industry it can be downright head-spinning to keep up with who's in the market, what kinds of services are being offered, how they are being billed and if customers are truly happy.

Sometimes it may feel as if you're operating in a vacuum, but that mentality is going by the wayside as communications providers and others in the industry are realizing that to hang onto the market share they have, or to build their customer base, they can't go it alone anymore.

Established telcos are feeling the heat as they struggle to get their costs down, get their revenues up and keep their customers happy. And they are expected to do all of this while fending off the ever-encroaching competition.

At TM Forum, our core membership has been these traditional telecom operators, so we're experiencing the massive industry changes right alongside these established players. Historically, our organization was really the central meeting point for the telcos, and soon thereafter the vendors that supply products to these companies started joining up. Then year after year, we kept growing and adding new areas of focus, including targeting cable providers and their suppliers as members.

But in today's new world order of entertainment and information coming at you 24 hours a day, 7 days a week, these providers know full well they cannot be all things to all people all of the time. They are going to have to partner with content providers, content aggregators and other new entrants to the telecom value chain or risk falling by the wayside and being looked at as simply commodity bit carriers.

The nice linear value chain that we've known and loved for years in telecom has developed twists and turns along the way as more of these new entrants to the market insert themselves into the service delivery process.

From our perspective at TM Forum, we have to think of these new players not only as potential members of our organization but also as viable contributors when it comes to developing standards and processes for creating and delivering next-generation services.

This paradigm shift - from telcos as the communications power brokers, to the rise of cable providers over the past 25 years, and now the entrance of new players on the scene - means a lot of questions, confusion and general scrutiny of the entire communications service delivery value chain.

TM Forum's flagship event this November is all about addressing not only the broadening and expansion of the telecom value chain, but also what needs to happen behind the scenes to ensure services that customers really want are being created, that these new services are being delivered to customers with the agreed quality of service and availability, and that the services are actually providing new sources of revenue for providers. In that vein, for the first time we're offering a tri-summit format that places equal emphasis on what we see as the cornerstones for the telecom industry today. Rather than having a single, monolithic conference, we're going to shine the spotlight on the areas we think will be important over the next several years.

The Transformation Summit will address the challenges facing the mainstream telcos and cable players who have been working diligently to drive costs out of their back office and are now looking at instituting a bigger-picture transformation to ensure their future success. So rather than just focusing on the importance of cost-cutting, we'll also look at opening up opportunities for revenue enhancement across these organizations.

This summit will feature presentations on our core frameworks, including TM Forum's Business Process Framework (eTOM), our Information Framework (SID) and our Application Framework (TAM). These are the central building blocks for anyone who wants to travel down the transformation route.

The Revenue Management and Customer Experience Summit will showcase what TM Forum has been doing in this space in the past few years, which includes the absorption of the Global Billing Association (now our Revenue Management Initiative) and IPDR.org (now TM Forum's Cable Initiative). Instead of being just the organization for the standardization of OSS, we've expanded our horizon with a focus on BSS as well, including hot topics such as billing, charging, rating, revenue assurance and how to proactively manage the customer experience.

The Digital Commerce and Advertising Summit is really all about convergence and represents the latest broadening of scope within TM Forum. We see digital commerce as where revenue growth opportunities will come from in the next five to ten years.

This summit will focus on how telecom, cable, advertising and media/entertainment are coming together in new and interesting ways. We'll try to answer questions such as how advertising will facilitate new revenue models in the digital world, and how to create service delivery capabilities that deliver this new content effectively and efficiently.

In addition to these ground-breaking summits, Management World Orlando will also feature world-class keynote addresses by executives from Rogers Communications, Credit Suisse and Deutsche Telekom; 18 introductory and intermediate training courses on topics such as NGOSS, Service Delivery Framework, Next-Generation Billing and Monetizing Content; and of course an expo featuring the leading companies in the industry.

We'll also be highlighting a number of real-world demonstrations of proven and implemented technical frameworks and business solutions in Forumville, your one-stop shop to see market-ready deployments, to experience the convergence of communications, media and entertainment, cable and other sectors, and to talk to visionary TM Forum members and their partners.

Forumville includes Content Encounter, which will feature more than 20 companies showcasing end-to-end solutions for the digital marketplace. You'll see everything from the content lifecycle, advertising-based business models, advanced services creation and delivery, revenue assurance and much more.

Within Forumville you'll also find our Catalyst Showcase, which is TM Forum's technology proving ground for innovative projects. These projects will have a broad technical appeal to show attendees but will also highlight how they can deliver real business benefits and positively impact the bottom line.

Catalyst Projects that will be featured at Management World Orlando include Global Information Mobility, Harmony Phase 3, Enhanced eBonding and Targeted Advertising through Enriched Subscriber Information. In addition, we'll be looking closely at several focus areas, including E2E Service Quality Management.

Management World is really a reflection of where TM Forum has come from, where we are today and where we are going in the future. We want to take our telco base, build on that without leaving anything or anyone behind and get a few steps ahead to help everyone be more competitive in the new marketplace.
We hope you'll come along for the journey to Orlando!

Management World Orlando, 17-20 November 2008, Orlando, Florida
Martin Creaner is President of the TM Forum
www.tmforum.org

There is a natural symbiosis between IPTV and advertising, argues Tony Hart.  He looks at what it might mean for telcos

When it comes to IPTV, two of the biggest challenges facing service providers are: a) how to generate revenue and secure some kind of return on investment; and b) how to differentiate IPTV in markets where it is competing against other TV delivery platforms. This is why more and more industry players are turning the spotlight on the role of advertising across IPTV networks. In return, the flexible nature of IPTV promises to breathe new life into tired old TV advertising formats and help to halt the declining TV ad revenue seen in some Western markets.

Targeted advertising... addressable advertising... personalised advertising... call it what you will, but this new approach to TV advertising could help to give consumers a more relevant experience, while at the same time helping to attract advertisers to this new delivery medium and generating revenue for the service providers involved. Nor is this just hot air: in the past year, operators in Europe have already conducted addressable advertising campaigns across terrestrial TV channels, with further campaigns planned.

Annelise Berendt, of industry analyst firm Ovum, has previously gone on record saying:  "The IPTV platform offers advertisers the best of both worlds.  It offers the immersive and proven impact of traditional television with the added benefits of being able to enhance it with interactivity.  It also offers the addressability and accountability of advertising in the Internet world, enabling the targeting of individual homes, personalized advertising and measurement of an advertisement's impact."

Before we delve further into what addressable advertising is all about, let's be clear about the definition of IPTV being used here. In this article, we are talking about IPTV in the sense of TV delivered across a private IP network to a subscriber's broadband access point, to be viewed on a television set. This kind of IPTV - already being delivered by the likes of BT, Orange France Telecom and Telefonica - is not to be confused with web TV (or Internet TV, as some people call it), which, unlike IPTV, cannot guarantee the quality of service that viewers associate with TV.

There will be some overlap between IPTV and web TV (such as being able to access web pages associated with a specific TV programme from the TV screen). Furthermore, the growth of web TV has changed the way we consume video-based content for ever, putting the viewer in charge of 'when and where'. It is even possible to interact with content via social websites.

At the moment, many consumers in early markets typically receive IPTV services as part of a package, rather than actively demanding the technology. After all, consumers are generally only interested in the content, not the delivery mechanism. However, this is not enough, particularly for operators competing with traditional terrestrial, cable and satellite. As Ashley Highfield, Director, Future Media & Technology, BBC, has said on the corporation's web site: "The winners will be the IPTV aggregators who offer truly complementary, differentiated services to those which people can find on their TVs...what IP-delivered TV should be about are the things that traditional television struggles at: amplification, global distribution, rediscovery, engagement, collaboration, innovation and navigation." With IPTV, service providers have a massive opportunity to provide integrated services; more personalised content and user-generated content are all possibilities.

In addition, advertising on IPTV has the potential to be dramatically different to the ‘traditional' TV experience.  IPTV enables content to be targeted according to different factors, the first of which is geography. Because of the bi-directional nature of IP, it also becomes possible to discover viewing behaviour in ‘real time' without the consumer having to give away any potentially sensitive personal data. In this way the service provider can tell whether the IP address associated with a particular device is consistently watching a genre of programmes.  This information can be used not only to offer certain kinds of content, but also to help service providers to offer advertisers a means through which to deliver more relevant advertising.

For instance, a household that is clearly a regular consumer of holiday programmes - but never watches any children's TV - could be targeted to receive ads about holidays but de-selected to see any ads aimed at families.  Furthermore, if consumers also ‘opt in' to provide additional information themselves, then profiling of ads could become even more detailed. 

This tailored approach has advantages all round. It removes the danger of advertisers falling back on a 'spray and pray' approach to TV advertising. 'Frequency capping' can be used to ensure that a viewer only sees an ad a certain number of times, or to create serialised ads where viewers see 'episodes' in sequence. The same brand ad could have different sequences depending on the viewer (for instance, a city hatchback car with one message for young people, then a different message and visuals for an older audience). In this way, viewers are less likely to skip ads, even when using PVRs. Research from Nielsen has shown that while viewers do skip or fast-forward ads, ad-skipping habits vary according to the show and whether the programme is watched live or later. When watching Survival: China, just under 20 per cent of the 5.16 million 'live' viewers ad-skipped, while of the 6.51 million who recorded and watched the programme up to three days later, 5.23 million did not ad-skip.
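As a rough illustration of the mechanics discussed above, the sketch below combines genre-based targeting with a frequency cap; the genre labels, cap value and data structures are assumptions made for the example, not a description of any deployed ad-decision system.

    from collections import defaultdict

    MAX_IMPRESSIONS = 3  # assumed frequency cap per ad, per household

    # Impressions served so far, keyed by (household IP, ad id);
    # a real system would persist this rather than hold it in memory.
    impressions = defaultdict(int)

    def select_ad(ip_address, watched_genres, candidate_ads):
        """Return the first ad matching the household's viewing profile
        that has not yet hit its frequency cap, else None."""
        for ad in candidate_ads:
            targeted = ad["target_genre"] in watched_genres
            excluded = ad.get("exclude_genre") in watched_genres
            under_cap = impressions[(ip_address, ad["id"])] < MAX_IMPRESSIONS
            if targeted and not excluded and under_cap:
                impressions[(ip_address, ad["id"])] += 1
                return ad
        return None  # fall back to an untargeted ad

    ads = [
        {"id": "hols-01", "target_genre": "holidays", "exclude_genre": "childrens"},
        {"id": "fam-02", "target_genre": "childrens"},
    ]
    # A household that regularly watches holiday shows but never children's TV:
    print(select_ad("10.0.0.7", {"holidays", "travel"}, ads))  # -> hols-01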

As far as IPTV operators are concerned, these new, more engaging formats can help to attract advertisers who are becoming increasingly disillusioned with the return on investment from TV advertising. Addressable advertising means less wastage, which, apart from being more cost-effective for big brand advertisers, also brings TV promotion within the grasp of a whole new pool of businesses who would previously have found TV too unfocused and expensive. Finally, data on the viewing habits of different IP addresses can be used to create the basis for more sophisticated measurement tools. Moreover, the same ad avail (or timeslot) can be targeted to different viewers and, most importantly, sold to different advertisers, making this a highly attractive business model for broadcasters.

Of course, the viewers themselves also benefit from seeing more relevant and, hopefully, more enjoyable advertising. If consumers are presented with more relevant advertising, they are more likely to accept its presence outside of traditional TV. This in turn makes advertising a more valuable revenue stream for service providers. For example, many mobile TV services are expected to be loss leaders and offered to consumers for free. If the operators behind mobile TV can recoup some of their investment through advertising, their business cases will be in far better shape.

Some IPTV operators, broadcasters and advertising agencies have already been exploring addressable advertising opportunities. The world's first targeted TV advertising campaign over IPTV took place in late 2007, involving UK broadcaster Channel 4, the UK IPTV network Inuk, media agency Mediacom and Packet Vision. Using an ad from an existing financial sector client, the campaign ran daily on Channel 4 for two weeks. It was specifically targeted at university students across the UK, so that during the same 40 seconds in which the ad spot ran, students saw an ad from a different brand to the rest of the general viewing population. The targeting was made possible by installing Packet Vision's IPTV advertising solution, the PV1000 (which comprises services, software and hardware, including a single rack-mounted unit in the telco's network providing splicing, routing, ad insertion and management features), within the Inuk Freewire service, an IPTV network that provides triple play services to universities across the UK.

Rhys Mclachlan, Head of Broadcasting Implementation at Mediacom, said at the time: "We've delivered a pure targeted campaign for a client through television advertising on terrestrial broadcasting for the first time. Packet Vision offers an opportunity for advertisers wanting to reach a specific demographic without screening copy to viewers who fall outside of the intended audience. We also see advertisers with restricted budgets using this service on a regional basis for the delivery of cost-effective campaigns."

Another example, in the summer of 2008, involved Channel 4 via Inuk again, but this time with Dollond & Aitchison, one of the UK's leading eyewear providers, and its agency Arena BLM. Arena BLM was keen to exploit the targeting potential of IPTV for its client. It booked a campaign featuring a D&A lifeguard saving a woman drowning in a sea of glasses, to run on Channel 4 and to be seen only by students. Caroline Binfield, Business Director of Television at Arena BLM, said: "This innovative technology allows our client to target niche and highly relevant audiences, which will drive improved efficiency of the advertising campaign."

These early experiences are just the beginning.  To earn its place, IPTV cannot just be yet another ‘me too' delivery vehicle: it has to offer something different, as well as make money for its stakeholders.  Addressable advertising could be the key to making IPTV a truly profitable medium.

Tony Hart is Business Development Manager with Packet Vision
www.packetvision.com

How can operators best manage and monetise the capacity demands of Internet TV?  Jonathon Gordon takes a look

There has been much controversy over who should foot the bill for over-the-top Internet TV services such as the BBC iPlayer, ITV's catch-up TV and Channel 4oD, as well as non-broadcaster user-generated content channels such as YouTube and Joost. The rise in popularity of these bandwidth-intensive services seems to know no bounds. According to Ofcom, online viewing has doubled over the past year, from 1.57 million to 2.96 million viewers, with one in nine UK homes now tuning in via their PC. Our viewing habits are changing: we now demand that footage be available to view, perhaps repeatedly, anytime, anywhere. This trend is even seeing PVRs (personal video recorders) and home media centres equipped with Internet connectivity. In such numbers, video services consume vast amounts of broadband capacity - so much so that there is a real danger of their popularity threatening the deployment of next generation networks.

Why? Because the way these services are monetised means there is little payback for the operator. Research from Telco 2.0 looking at the impact of the BBC's iPlayer on UK ISP Plusnet found that costs have gone up 200 per cent, from 6.1p to 18.3p per user, because the ISP needs to buy more capacity but sees no additional revenue. And it's not just the operator who gets the short end of the stick. Users are also being short-changed, given that three hours' viewing of Joost, a P2P video service devised by the same founders as Skype, would use up the 1GB monthly allowance granted to most subscribers. Content providers too have a vested interest in how this content is delivered: poor delivery means their service may alienate viewers. Internet TV content is highly susceptible to latency, delay and jitter caused by fluctuating contention rates, and the emergence of high definition footage, which can consume up to 75MB per minute when streaming at 10Mbps, is likely to exacerbate the problem further.
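The arithmetic behind those figures is worth making explicit. A back-of-the-envelope sketch in Python (the implied Joost bit rate is inferred here from the 1GB/three-hour figure, so treat it as an estimate rather than a quoted specification):

    # 10 megabits per second of HD streaming, converted to megabytes per minute:
    mbps = 10
    mb_per_minute = mbps * 60 / 8            # 75 MB for every minute viewed

    # If three hours of Joost exhausts a 1GB monthly allowance, the implied
    # average bit rate is roughly:
    gigabytes, hours = 1, 3
    implied_kbps = gigabytes * 8 * 1024 * 1024 / (hours * 3600)   # ~777 kbps

    print(mb_per_minute, round(implied_kbps))  # 75.0 777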

So should the service provider, content provider or end user foot the bill? At the moment, the jury is out on what new business models will emerge. Traditionally, a linear model has seen consumers pay for access to content, and distributors pay the content provider, with additional revenue generated from advertising. The operator is left out of the equation on the assumption that the fee users pay for their broadband connection will cover consumption. In reality there is little correlation between content and the cost of delivery. iTunes, for example, charges a dollar per MP3 download and around five dollars for a movie, even though the latter is 100 times greater in size. Clearly operators can't charge a corresponding amount for Internet video; yet at the other extreme, user-generated content is available for free. There has to be a middle ground.

Internet TV consumption monopolises resources, and some argue that, should the current situation continue unabated, the operator will be unable to cope with the operational expenditure, ruling out network upgrades and stymieing technological advancement. New business models, therefore, have to emerge. Perhaps the ISP will partner with legitimate content providers, or it may steer clear of any content-related activity altogether, choosing to act as an access-only conduit but with payment reflecting the level of access. Regardless of the model that evolves, there is an imperative for the content chain and access provider to work more closely together to ensure Quality of Experience (QoE) for the user.

Content owners use Content Delivery Networks (CDNs), which sometimes use P2P file sharing technologies, to distribute content. P2P is valuable in improving QoE because this distributed storage and retrieval mechanism improves speed, minimises the load on servers and is a cheap and scalable means of distribution. These systems can be used to help operators prevent congestion. In return, the operator can assist the content owner by using caching technology to prevent service degradation.

Caching provides an ideal opportunity for the operator to add value. It works by storing popular content in close proximity to the user, allowing the operator to ensure content is delivered effectively while also meeting the needs of many users at once. But caching should only be seen as part of the solution because it tends to focus on specific traffic types rather than addressing traffic volume as a whole. What's more, as more content is made available, viewing is likely to fragment, making it more difficult to store the most popular content on the cache and improve the QoE.

The operator needs to be able to factor in off-net issues and the total bandwidth available, which requires caching to be supplemented by another technology capable of traffic management. Deep Packet Inspection (DPI) service optimisation is the ideal partner, as this technology is capable of peering into the traffic stream to determine the signature of applications whilst also monitoring bandwidth consumption. It is a versatile technology: the operator can use it to passively monitor the network and capacity consumption, or to take more assertive action. For instance, the operator could decide to prioritise video applications across the network, allowing it to guarantee Quality of Service (QoS) for this type of traffic. It's easy to see that this kind of guaranteed service would appeal to Internet viewers, allowing the operator to market it as a premium offering or simply use it as a differentiator.

Network operators who favour the proactive approach can use DPI to establish network congestion policies. These allocate bandwidth according to traffic category, either boosting or limiting capacity. Traffic can be prioritised according to its content, and congestion managed according to these categories. As a consequence, the operator can better utilise network resources, conserving bandwidth and postponing the need for frequent network upgrades. If the DPI device is subscriber aware, it can take into account any subscriber SLAs and compare these with the application categories dictated in the traffic management policy. DPI devices that are both content and subscriber aware can inform the creation of tiered service packages, and can be used to tweak these should usage patterns alter.
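A toy illustration of such a category-based congestion policy follows; the categories, priorities and share caps are invented for the example (real DPI platforms express policies in their own configuration languages), but priority-ordered allocation is the essence of the approach.

    # Assumed link capacity and per-category policy: (category, priority, max share)
    LINK_CAPACITY_MBPS = 100
    POLICY = [
        ("video", 1, 0.50),   # boosted: guaranteed up to half the link
        ("voip",  2, 0.20),
        ("web",   3, 0.20),
        ("p2p",   4, 0.10),   # limited under congestion
    ]

    def allocate(demand_mbps):
        """Grant bandwidth in priority order, capping each category's share."""
        remaining = LINK_CAPACITY_MBPS
        grants = {}
        for category, _priority, share in sorted(POLICY, key=lambda p: p[1]):
            cap = LINK_CAPACITY_MBPS * share
            grant = min(demand_mbps.get(category, 0), cap, remaining)
            grants[category] = grant
            remaining -= grant
        return grants

    print(allocate({"video": 60, "voip": 5, "web": 30, "p2p": 40}))
    # {'video': 50.0, 'voip': 5, 'web': 20.0, 'p2p': 10.0}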

When used in league with caching, DPI traffic management can prioritise the passage of specified traffic across the network, reducing the delays associated with multimedia content buffering. When the subscriber requests content, the request is recognised in real time by a blade housed on the DPI device and redirected to the caching engine. If the content is already housed on the cache, it is streamed directly to the subscriber from the cache via the DPI device. Alternatively, the cache can retrieve the content over the Internet, whether it is housed on CDN servers or P2P nodes, with the request again routed through the DPI device. Regardless of the source, all content is managed by the DPI traffic management system to prevent congestion. DPI is also capable of prioritising this traffic and, if used with service gateway functionality, can subject it to filtering.

The Service Gateway essentially allows the DPI device to interface with value added systems that provide security control or URL filtering in order to apply different rule sets. A Service Gateway DPI device unifies the operator's billing, subscriber management and provisioning systems, acting as a central point of management and a one-stop shop that combines data to assess service utilisation. In an Internet TV context, it can carry out pre-processing or post-processing of the content flow, allowing it to perform harmful content filtering, for example.
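Reduced to a pseudocode-style sketch, the request flow just described might look something like this; function names such as fetch_from_origin are placeholders rather than any vendor's API, and the real redirection happens in network hardware, not Python.

    CACHE = {}  # content_id -> content bytes, held close to the subscriber

    def fetch_from_origin(content_id):
        """Placeholder for retrieval over the Internet (CDN servers or P2P nodes)."""
        return b"video bytes for " + content_id.encode()

    def handle_request(subscriber_id, content_id):
        # 1. A blade on the DPI device recognises the content request in real
        #    time and redirects it to the caching engine.
        if content_id in CACHE:
            # 2a. Cache hit: stream straight to the subscriber via the DPI device.
            content = CACHE[content_id]
        else:
            # 2b. Cache miss: the cache retrieves the content over the Internet,
            #     the request again routed through the DPI device.
            content = fetch_from_origin(content_id)
            CACHE[content_id] = content
        # 3. All delivery passes back through DPI traffic management, where
        #    prioritisation and (via the Service Gateway) filtering can apply.
        return content

    handle_request("sub-1", "film-42")   # miss: fetched and cached
    handle_request("sub-2", "film-42")   # hit: served from the cache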

As well as governing the network, a DPI traffic management solution can also be used as a customer-facing tool. It is able to set quotas for individual subscribers without showing a bias towards any one particular content provider. Quota Management can be used to differentiate between video traffic, VoIP traffic and web traffic, using volume usage quotas to provision and enforce customised service plans. The DPI device collects information and allocates usage per subscriber, meters actual service consumption, and adjusts QoS according to content, volume and time elapsed, or any combination of these parameters. When the allocated quota is reached, the operator can choose to redirect the customer to a portal where they can "refuel" their quota or change their service plan. Quota Management ensures the user's access to high quality content is protected while allowing the operator to manage bandwidth resources.
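A minimal sketch of that quota mechanism follows, with invented plan parameters; the traffic classes, quota sizes and return values are assumptions for illustration only.

    # Assumed per-subscriber volume quotas by traffic class, in GB per month
    PLANS = {"basic": {"video": 1.0, "voip": 0.5, "web": 2.0}}

    usage = {}  # (subscriber_id, traffic_class) -> GB consumed so far

    def meter(subscriber_id, plan, traffic_class, gb):
        """Meter consumption against the plan; redirect to the portal on exhaustion."""
        key = (subscriber_id, traffic_class)
        usage[key] = usage.get(key, 0.0) + gb
        quota = PLANS[plan].get(traffic_class, 0.0)
        if usage[key] >= quota:
            # Operator's choice: send the customer to a self-care portal
            # to "refuel" the quota or upgrade the service plan.
            return "redirect_to_portal"
        return "deliver"

    print(meter("sub-1", "basic", "video", 0.4))  # deliver
    print(meter("sub-1", "basic", "video", 0.7))  # redirect_to_portal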

In essence, DPI traffic management finally makes the operator part of the Internet TV value chain. A unified, open-platform DPI traffic management device with Service Gateway and video caching capabilities optimises network resources and prioritises traffic according to the nature of the content. As a consequence, Internet TV traffic is retrieved and delivered in as timely and bandwidth-efficient a manner as possible, while also allowing new subscriber-led service plans to be developed.

For the viewer, timing is everything. Uninterrupted access will become as necessary to the survival and proliferation of Internet TV as always-on broadband has been to the web. So operators need to turn on, tune in or drop out.

Jonathon Gordon is Director of Marketing, Allot Communications, and can be contacted via
jgordon@allot.com

Only a few years ago startups such as Skype and VoiceBuster gave the telecom industry a real scare by allowing people to make telephone calls over the Internet free of charge. They forced telecom operators to cannibalise their traditional voice revenues with VoIP services - one of the reasons the telcos started investing in IPTV. Now there is a new threat to the telcos in the form of Internet TV. Will new players again traumatise the telecom industry by undermining the telcos' nascent IPTV services, asks Rob van den Dam

Many telecom operators are investing in digital content in the hope of offsetting the fall in fixed-voice revenues. They focus primarily on offering television and video services, in particular IPTV; many of them see this as a necessity to combat the trend of losing subscribers to cable companies, which are increasingly offering VoIP as part of triple-play bundles.

But new Internet developments again pose a threat to telecom operators. The Internet has already transformed the telecom industry in the domain of communication services, where new players such as Skype have forced telecom companies to offer VoIP at substantially lower prices than they previously charged for traditional voice services over the fixed network. And now the Internet is enabling Internet TV start-ups to threaten the telcos' nascent IPTV services. Today Internet video is still delivered in rather low quality via sites like YouTube. In spite of grainy images and the small window format, however, these sites have been successful in attracting millions of viewers on a regular basis. And as broadband becomes faster and available to a broader public, they will be able to offer professional video services with continually improving image quality, in this way providing an alternative to IPTV.

IPTV is a system where video content is transmitted in the form of IP data packets over a closed, secure network. The infrastructure is configured such that viewers can only receive the IPTV provider's own TV channels. IPTV focuses primarily on the TV set in the living room, generally a wide-screen TV with high image quality. A set-top box (STB) is required to receive the signal. IPTV telecom operators are uniquely placed to enhance the television experience:

  • They can augment their IPTV-offerings with a wide variety of voice and data services.
  • They are well placed to combine IPTV on the TV with the other screens: the PC and the mobile.
  • They also have a lot of information about the viewer that they can use to deliver personalised content and advertising.
  • And last, but certainly not least, they are able to guarantee a high-quality end-to-end television experience.

Currently most IPTV services are based on subscriptions and a Video-on-Demand charge.
Up to now, many IPTV operators have focused on offering the same services - the same TV channels and type of content - that their competitors, usually the cable companies, offer. But some telcos have taken it further. Belgacom in Belgium, for example, competes primarily on exclusive sports content. Other operators compete on ease of use - for example, by offering an Electronic Programme Guide (EPG) that allows individual users in the household to set up their own TV guide with their own favourite programmes and settings. These operators are taking optimum advantage of the possibilities that IPTV offers with regard to personalisation and interactivity. There are numerous ways to compete, and each IPTV provider has its own strategy.

Internet TV has the "look and feel" of IPTV, but travels over the open public Internet, delivered "over the top" (OTT) of existing networks - effectively getting a free ride. Internet TV is usually delivered to the PC or another device connected to the Internet, using peer-to-peer technology. Internet TV offers the OTT providers the following advantages:

  • They do not have to invest in distribution networks because they use the telecom and cable companies' networks.
  • They offer the same type of interactivity and viewing capabilities as IPTV.
  • They have global coverage.

However in contrast to IPTV:

  • There are still issues with the video quality, though it is continually improving.
  • Users really need some technical know-how to use it properly.

Internet TV is not a controlled environment. There are no guarantees regarding accessibility, availability and reliability. There is no control over who is allowed to watch which programmes and under what circumstances, such as those related to distribution rights in different countries.

Internet TV providers offer programmes for free; revenue is primarily based on advertising. Obviously, Internet TV is still in the embryonic phase, with a number of players attempting to create a market for themselves. Joost is the best known. Joost comes from the developers of the music-sharing program Kazaa and the VoIP service Skype - developments which severely traumatised the music recording industry and the telecom sector respectively. Joost distributes only professionally made content, and shares advertising revenue with the content providers. While Joost is focusing on a large public, Babelgum focuses on specific target groups by offering niche content via a large selection of theme channels. Hulu, the online video project from Newscorp and NBC/Universal, began its offering to the US public in early 2008. Other providers include Narrowstep and JumpTV.

Variations on Internet TV include the BBC's iPlayer and Apple TV. iPlayer is an on-demand TV service enabling users to view BBC programmes via the Internet. Apple TV uses an STB that makes it possible to stream digital content from any computer with iTunes to a widescreen high-definition TV set. This enables viewers to send videos, TV programmes, music, YouTube clips and other Internet material from the computer to their TV set, or to save them on the STB hard disk. And if Microsoft has its way, users will soon be able to connect their TV to a Windows Media Centre PC or an Xbox 360 using Microsoft's Media Centre Services to get their daily diet of TV programmes.

Altogether, this is more than enough of a threat for the telcos, who have spent large amounts of money building and launching their own IPTV services. They are understandably worried that OTT providers will ultimately capture all the value that video-over-IP promises. In that case, they would be left with nothing to offer but the so-called 'dumb pipe'.

Clashes appear to be unavoidable. IPTV and OTT providers will certainly confront one another in the domains of distribution and advertising.

In terms of the first, distribution, the OTT providers shift the problem to the owners of the networks. IPTV providers invest heavily in upgrading their networks for their own IPTV services; now they must handle the OTT traffic as well, which means additional investment. In fact, incumbent telecom companies face a unique dilemma: as they increase their broadband capacity, they make it easier for OTT providers to deliver the quality of service required for professional TV broadcasting. Of course, that will not be acceptable to the telecom companies. They can respond in different ways:

  • Filter the OTT traffic, possibly block specific traffic, and offer higher distribution priority and quality to parties who are willing to pay (more). However, throttling OTT traffic risks violating the so-called net-neutrality principle, since it means blocking other parties' traffic to give their own services precedence. This could lead to intervention by government regulators.
  • Find a way to insert themselves into the relationship between the OTT providers and their customers, and make agreements regarding the charge-through of distribution costs, either to the OTT provider or to its customers.
  • Open their IPTV platforms to OTT content by making services from OTT suppliers available as separate IPTV channels. This would allow the operators to bring in extra revenue.

In terms of the second point, advertising, ultimately it all revolves around advertising relationships and the possibilities the Internet offers for more efficient targeting. Internet television is currently funded entirely by advertising. IPTV advertising will also become increasingly important for telcos in funding their content, as customers do not expect to pay for all of it. In 2007, IBM's Institute for Business Value conducted a consumer survey to evaluate changes in consumers' media behaviour. A number of questions related to advertising; the results indicated that, in all the countries involved, the majority of those surveyed were willing to view advertising before or after a good quality, free video broadcast.
 
In the battle for advertising funds, both parties offer good possibilities for efficient and effective advertising - better than the traditional TV providers. But the telcos seem to have the best assets, and assets that advertisers really value. First of all, with their network capabilities telcos are better able to control where the ads go and to track advertising effectiveness. Telcos collect vast quantities of customer data, which they can use to develop profiles of their subscribers, including viewing patterns and perhaps shopping habits. They can combine these customer insights with their ability to identify the location of individual users and offer highly targeted, localised promotions. Integrated telcos can even combine data collected from fixed, wireless and other networks. They are well placed to enable the advertising experience practically anywhere, on any device and at any time.

Over the short term, Internet TV does not represent a real threat to the IPTV providers. IPTV has a clear opportunity to establish a strong position in this market before the problems regarding image quality and the ease of use of Internet TV are resolved. But after that, the situation may change. In particular, once Internet TV moves to the TV screen, it will pose a far bigger threat to IPTV.

It is all about getting Internet video onto the TV screen. The Apple TV initiative mentioned earlier illustrates this. More and more consumer electronics manufacturers are working on equipping TV sets with built-in Internet access. Sony, for example, is rolling out a network adapter for showing web clips on its HDTVs. It is only a question of time before Internet access is a standard feature built into the TV set - an essential milestone that, from the consumer's perspective, makes things a lot easier.

On the other hand, a partnership between IPTV and OTT suppliers is not unlikely:

  • Telcos can make OTT content available as part of their IPTV services.
  • OTT providers can profit from the IPTV providers' "walled garden" that gives them a better guarantee in terms of quality, control over the distribution, and feedback with regard to volumes, viewing times and viewer behaviour.
  • Telcos can use the OTT-channel to collect additional customer data regarding consumers' viewing habits for improving targeted advertising.

In fact, we are already seeing this type of initiative. Some telecom companies are bringing the Internet TV players into their own IPTV environments, such as Verizon with YouTube and BT with Podshow, offering, as it were, an extension of their closed IPTV environment. Many providers will offer their own Internet TV in parallel, possibly geared to other customer segments, optimally utilising brand recognition, customer relationships and the distribution of content across both channels. BT Vision - with its IPTV platform and a web portal offering a download archive for on-demand content and the purchase of physical DVDs - is one example of this.

OTT Internet TV is currently seen as a marginal threat to the IPTV providers. But as bandwidth and QoS become less of an issue, the OTT providers will increasingly develop into mature suppliers of online live HD programming. Joost, Hulu, Babelgum and the others are most likely just the tip of the iceberg: more of these types of company will emerge, get funding and then fight for customers and advertisers. In the end, it will come down to finding a solid business model. At the same time, IPTV will mature, finding the right ways and approaches to be successful. There is probably room for both IPTV and Internet TV, each addressing a particular consumer segment, and the possibility of some sort of partnership is certainly there.

Rob van den Dam is European telecom leader of the IBM Institute for Business Value and can be contacted via: rob_vandendam@nl.ibm.com

At heart all healthy businesses are trying to do the same thing, says David Ollerhead

Linguists today think that all languages have the same purpose and deep structure. Basically, linguists believe that all languages are at heart doing the same thing. This appears to be true of healthy businesses too.

All healthy businesses have the same purpose: to grow and maximise profitability within the markets in which they are operating. There's plenty of practical empirical evidence to suggest that healthy businesses also have a great deal in common in their structures and the way they organise their activities.

Management skills, after all, are widely regarded as transferable between vertical sectors. Senior executives tend to be recruited (or appointed to Boards) on the strength of their positive impact on a particular organisation rather than the sector in which that organisation operates. This suggests that healthy businesses share organic qualities which good managers can consistently nurture and develop, whatever the vertical sector in which the business operates.

Similarly, university and business school courses focus on management skills in a general sense. ‘Serial entrepreneurs' are, by definition, expert at forming, growing and then selling businesses in a wide variety of sectors. Indeed, the very existence of management consultants who can consult in any sector where managers need assistance or guidance is perhaps the most decisive evidence of all that ‘management skill' is a tangible, discrete and specific thing which is essentially sector-independent.

Further evidence that healthy businesses are all doing much the same thing is found in how brands operate. Major brands positively exult in their ability to win a presence in markets that on the face of it are disparate, but which in practice tend to become linked once a brand establishes a loyal, enthusiastic customer base.

Taking two examples, the Virgin brand (spanning music, travel, publishing, communications, financial services and soft drinks) has come to be associated with fun, youthfulness, value for money and Richard Branson, while the Saga brand (spanning travel, publishing and financial services) is seen by many adherents as signifying reliability, good quality and a square deal for the over-50s. Customers willing to buy from more than one, and very possibly all, of the different businesses under a single brand clearly feel that the brand matters more than what is being sold.

The science of linguistics, which originated the idea that all languages are, deep down, the same, is fascinating but ultimately an academic pursuit. Business, on the other hand, powers the world's wealth and is the source of most people's income and economic security. Big-picture conclusions about business and how it works consequently have massive implications for all of us.

The route to growing and maximising profit is to sell more products or services to more customers, given that neither the business nor its customers will want the quality of what is delivered to decline. Equally importantly, in the case of a service, the business will not want customers to be over-serviced: that increases the quality of what is supplied but makes supplying it much less profitable. The organisation will also want to sell more things to more customers without a disproportionate increase in the time taken to supply what is being sold.

For healthy businesses, a melodious and useful mantra is: ‘Revenue is vanity, profit is sanity, cash-flow is key'. Chasing revenue for its own sake makes no sense if the revenue is not accompanied by a healthy profit and a correspondingly healthy, positive cash-flow. Above all, it makes no sense for a business to succeed in selling more products or services to more customers unless it can do so without disproportionately increasing the cost of supplying what is being sold. Similarly, the business will want to avoid disproportionately reducing the prices of what is being sold. Selling more things to more customers by slashing the price (such as through a ‘buy one get one free' offer) can easily reduce profit and so be self-defeating: if an item sells for £10 and costs £7 to supply, a ‘buy one get one free' offer brings in £10 against £14 of cost, turning a £3 profit into a £4 loss.

Within these constraints, a healthy business's aims are clear. All healthy businesses are trying to sell more things to more customers without:

  • compromising the need for the business to supply products and services to the required (rather than excessive) level of quality
  • incurring costs that make supplying the products and services unprofitable
  • reducing prices to a level where supplying the product or service becomes unprofitable.

So, how does a healthy business achieve these vital objectives? Ultimately, the very nature of what a healthy business actually is suggests there can only be one answer: the only way for a business to sell more products and services to more customers is to have a total focus on its customers. The fact that this answer, baldly stated, sounds straightforward does not make it any easier to achieve, or lessen its importance.

The first challenge in achieving this vital customer focus is knowing who your customers are: both your existing customers (i.e. the ones you've won already) and your potential customers (i.e. the ones you could win).

The second challenge is knowing what your existing and potential customers need, at least in the context of what you are able to sell to them. This challenge may well be more difficult than knowing who your existing and potential customers actually are, but mastering this second challenge is vital to your success, because until you truly understand what your customers need, it is always possible that:

  • you might be offering customers things that they don't actually want, or that not enough customers want
  • you might be focusing on irrelevant issues (e.g. discounting things customers don't really want) instead of getting to grips with finding out what customers do want
  • you might start improving areas of your business that have no ultimate effect on customers and the improvement of which will therefore not lead to you selling more things to more customers.

The third challenge, once you know what your customers do want from you, is to work out how you can meet these needs by profitably producing goods and services as efficiently as possible.

The fourth challenge is to commit to keeping your responses to the first three challenges under continual interrogation, making sure those responses are continually improving.

The four challenges are fairly easily stated but by no means easy to meet. They involve, above all, establishing and maintaining a focus on your customers rather than on internal matters at the business or on your own personal concerns. But businesses that really do rise to the challenges - businesses that become, in effect, experts at focusing on customer needs - can enjoy prodigious success.

Once you do know who your customers are and what they want from you, one particularly potent way to ensure that your business is really focused on their needs, and meets those needs with maximum efficiency, is to look hard at your business's processes.

In business, a process is a series of steps that produces a specified deliverable to meet a customer need. This definition is precise: the steps of the activity must actually meet customer needs (or, for organisations with several processes, the needs of different customers) successfully. A series of steps that doesn't meet customer needs can't properly be regarded as a process, or at least not an effective one.

Whatever the precise nature of the process or processes a business carries out, the very fact that a process is defined in terms of delivering a benefit to customers leaves no doubt that a business's processes not only lie at the heart of the business but are the heart of the business.

And make no mistake: all good businesses will have a healthy heart whose pumping creates maximum profit for you, and maximum satisfaction for your customers.

David Ollerhead is head of consulting within the Professional Services Group at Airwave Solutions Limited, and can be contacted at david.ollerhead@airwavesolutions.co.uk
www.airwavesolutions.co.uk
