Features

As the deadline for EU Data Regulation compliance passes, and many member states opt to postpone, Ross Brewer warns telcos not to underestimate the amount of work involved in addressing the regulations

The original deadline for the European Union Data Regulations - namely September 2007 - has been and gone.  Instead of telecommunications companies across Europe retaining data to support the crime fighting efforts of regional security forces, most member states, including the UK, have chosen to postpone its application.  It is expected that the laws in the large member states will start to be implemented from the beginning of 2008.  If this is the case, large enterprises will need to have solutions in place around the middle of 2008. 
While extending the deadline may buy additional time in which telecommunications companies can get their data retention house in order, the reality is that too many organisations are dragging their heels in addressing the regulation and don't have an appreciation of the sheer amount of work needed to ensure compliance.  Industry estimates put achieving Data Directive compliance at up to 18 months' work.  With this in mind, if organisations are to be fully compliant in time for the new deadline, they can't delay a moment longer.

The European Union (EU) formally adopted Directive 2006/24/EC on 15 March 2006.  The directive related to the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks.  In other words, communications providers need to retain - for a period of between six months and two years - all data that will allow relevant local authorities to trace and source communications to investigate, detect and prosecute serious crime. 
The directive - which applies to fixed telephony, mobile telephony, Internet access, Internet mail and Internet telephony - covers every single aspect of a communication including its source; destination; date, time and duration; the type of communication; the type of communication device and the location of the mobile communication.

In putting off compliance until the last minute, organisations risk facing additional pressure in a bid to fast-track compliance within the imposed deadline - if they can achieve compliance at all. There is also the added risk that the crime fighting abilities of regional security forces will be detrimentally affected, as the EU Data Regulations have been designed to help the security services in the fight against crime. 
With over 41 billion text messages alone sent last year, not to mention the millions of calls made every minute of every day, telcos will face enormous challenges.  These don't just relate to storing the data securely, but to being able to locate and retrieve it as quickly as possible, should an investigation be required, so as not to hamper proceedings. 
The amount of audit and logging of information required by the Data Directive has the potential to overload the storage capabilities of even the largest and most technically savvy organisations. But in an ever more competitive industry, what can telcos do to put a necessary compliance solution in place that will not detrimentally impact the day-to-day operation of the business in any way?  

The good news is that the information that needs to be retained under the Data Directive already exists within the organisation in the form of log data.  This log data provides a fingerprint overview of every action that occurs across the enterprise and, importantly, across a telco's network.  However, as it is generated at a rate of millions of bytes per second, being able to capture it and provide forensic reports across the enterprise in a timely fashion, as mandated in Article 8 of the Data Directive, will be a challenge. 

As such, when looking for a solution to manage this problem, organisations need to find a product that has its roots in scalability and can run a single search across all devices across the organisation to help minimise the impact of locating the data in the first place. 
This is where log management solutions can step in and provide a means of searching log data and producing reports.
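As a purely illustrative sketch of that single-search capability - the device names and log lines below are hypothetical, not drawn from any particular product - the core idea is one query fanned out across every device's log store:

```python
# Hypothetical per-device log stores; a real deployment would pull from
# collectors attached to each network element, not in-memory lists.
device_logs = {
    "msc-01":  ["2007-09-14T10:02:11 CALL src=+447700900001 dst=+447700900002 dur=120"],
    "smsc-01": ["2007-09-14T10:03:40 SMS src=+447700900003 dst=+447700900004"],
    "ggsn-01": ["2007-09-14T10:05:02 DATA imsi=234150000000001 bytes=5242880"],
}

def search_all_devices(term):
    """Run a single search across every device's logs, as a log
    management platform would, yielding (device, line) matches."""
    for device, lines in device_logs.items():
        for line in lines:
            if term in line:
                yield device, line

# One query locates the relevant record wherever it was generated.
matches = list(search_all_devices("+447700900002"))
```

At telco scale the same pattern runs against indexed, distributed archives rather than a simple scan, but the principle - one search across all sources - is what makes timely forensic retrieval feasible.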

Steps to compliance
Installing an off-the-shelf compliance solution a couple of weeks before the deadline will not be sufficient.  Any solution will need to be tailored to meet the specific challenges of each organisation and go through a rigorous testing procedure to ensure that it is robust enough and capable of storing and retrieving information within the recommended guidelines.
Getting started with any enterprise-wide strategy for compliance requires an understanding of the requirements particular to each industry and business. Policies should then be put in place for collecting, alerting, reporting on, storing, searching and sharing data from all systems, applications and network elements.  This creates a closed-loop process that governs the life-cycle of enterprise data and ensures the compliance programme is successful.

It sounds an obvious first step, but without taking time to understand clearly the specifics of the EU Data Retention Directive, there is a risk that some controls or requirements may fall through the net, which will have serious implications further down the line.  For example, minimum requirements for the Directive include that the solution must be enterprise scalable and distributed; be fault tolerant, ensuring no loss of data; be able to prove that the logs are immutable, so that they can be used in a court of law; and be able to produce forensic reports across the enterprise in a timely fashion.

Once the specifics of the Directive have been clearly understood, the next step is to put in place the IT controls and frameworks to help govern compliance tasks and keep the business on track for complying with the mandate.  

Goals should then be defined, and the key tasks for successful compliance identified and agreed.  Specific tasks relating to each goal can then be set.  Once these tasks are complete, configuration of network elements, systems and applications can be addressed. 
Alerting mechanisms and scheduled reporting will advise IT personnel when any part of the solution isn't complying with the policies.  Early reporting of any problems will ensure that they can be addressed in a timely fashion with minimal impact on the rest of the operation.  Alerts and schedules can also demonstrate compliance to auditors.

Alerting and reporting on logs must be substantiated with immutable log archives.  It's therefore critical to store logs centrally with a long-term archival solution that preserves the integrity of the data, as required by the EU Data Directive.  Immutable logs require time stamps, digital signature, encryption and other precautions to prevent tampering - both during transit of the data from the logging device to the storage device, as well as during archival - if they are to stand up as evidence in any legal proceedings.
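One common way to make an archive tamper-evident - shown here as a minimal sketch, not any vendor's implementation - is to chain entries together, with each entry's signature covering the previous one, so altering any record invalidates everything after it. The signing key below is a stand-in for a properly managed key or PKI:

```python
import hashlib
import hmac
import json

SECRET = b"archive-signing-key"  # illustrative; real systems use managed keys

def append_entry(chain, message, ts):
    """Append a time-stamped entry whose signature covers the message,
    the timestamp and the previous entry's signature."""
    prev_sig = chain[-1]["sig"] if chain else ""
    record = {"ts": ts, "msg": message, "prev": prev_sig}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    chain.append(record)

def verify_chain(chain):
    """Re-derive every signature; returns False if any entry was altered."""
    prev_sig = ""
    for record in chain:
        payload = json.dumps(
            {"ts": record["ts"], "msg": record["msg"], "prev": prev_sig},
            sort_keys=True).encode()
        expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(record["sig"], expected):
            return False
        prev_sig = record["sig"]
    return True

log = []
append_entry(log, "CALL src=A dst=B dur=120", ts=1.0)
append_entry(log, "SMS src=C dst=D", ts=2.0)
ok_before = verify_chain(log)
log[0]["msg"] = "CALL src=A dst=X dur=120"   # simulate tampering
ok_after = verify_chain(log)
```

A production archive would also sign records in transit and write them to write-once storage, but the chained-signature idea is what allows an operator to demonstrate immutability to a court.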

It's easy to view the EU Data Directive as yet another piece of Brussels bureaucracy, but unlike many other regulations, this Directive isn't about preventing or identifying financial irregularities within big business.  Instead, the EU Data Directive has been designed to help the ever more complex fight against crime - at an individual country, European and global level.
Telcos shouldn't feel daunted at the compliance task that lies ahead of them, after all, they already hold all of the data that is needed by the EU Directive.  Instead of viewing compliance as an isolated IT project, they should instead look upon it as a business issue that requires a cross-functional approach, involving people, processes and technology across the enterprise.  Taking the steps necessary to understand, define and implement the appropriate IT controls and frameworks for the business will simplify compliance and reduce the costs and resources involved in completing compliance related tasks in line with the Directive's deadline.

Ross Brewer is Vice President and Managing Director, EMEA, LogLogic

The GSMA's Mobile World Congress 2008 runs from 11th to 14th February in Barcelona, aiming to - once again - prove its value as the leading event in the mobile industry

With the growing convergence of media, communications, entertainment and information very much on the minds of all those involved in the currently separate industries (though, in the words of Keith Willetts, TM Forum Chairman, each is in the process of eyeing up the other's lunch), the GSMA's Mobile World Congress is clearly intending to hit all the right buttons, with all the right buzz words. 

Noting that mobile is now about much more than simple voice calls - it is a social and economic phenomenon, providing new channels and opportunities for information, entertainment and the Internet generation - the GSMA has declared the four major themes of the conference to be ubiquitous networks, the over-the-top service challenge, mobile's social and economic value, and the mobile digital content revolution.  In doing so, it clearly aims to ensure that Mobile World Congress 2008 safeguards its reputation as the largest and most significant global mobile event.  The organisers further declare: "The structure of the upcoming Mobile World Congress is being developed to reflect the ongoing changes in the mobile value chain.  This programme identifies the risks that will be taken and highlights the rewards that can be reaped in bringing these changes to all geographic markets and mobile services."

Under this umbrella, the event will tackle a variety of the issues currently dominating telco thinking and concerns.  The convergence of fixed and mobile communications, for instance, may well have a certain natural evolution, but is it inevitable, real or even desirable?  The MWC session will look to unearth what FMC means for leading operators and define the role it will play in their business development.

While the ‘Society on the Move' session will explore the way in which mobile has integrated into modern culture, recognising the growing importance of mobile across society, the commercial aspects of the market are very much to the fore.  Within the Mobile Entertainment Summit, for example, one session will address a divided debate: some believe that mobile advertising is essential to unlock the potential of mobile entertainment, and that advertisers will jump at the chance to spend billions in the mobile arena, while others believe that users will simply turn off.  The session aims to put the opportunity into context and uncover what it will take to deliver on the mobile advertising promise.  Under the same Summit, the conference will also address the view that the youth demographic is the darling of the mobile world.  Quick to take up new services and open to new technologies, the youthful user is every operator's target - but is this reputation justified?

On the technology front, the conference will look at such areas as HSPA, under the title ‘The story continues with HSUPA and HSPA Evolved', noting that while HSDPA is already proving a huge commercial success on a global basis, what is the evolutionary path for this mobile broadband technology?  The session will aim to provide operator insight into deploying HSUPA and the move to HSPA Evolved.  Staying in the technology arena, ‘LTE - defining the future' recognises that while standards are quietly being finalised, speculation and rumour pervade.  The session will provide insight into the reality of LTE and its prospect against alternative future mobile broadband technologies.

Noting that mobile penetration has reached over 100 per cent in 38 countries to date, and many more are high in the 90s, the focus of the keynote session on building ubiquitous networks centres on the recognition that while there may be a mobile phone in everyone's pocket, mobile isn't everywhere and the challenge is to take high speed, high capacity mobile services everywhere.  It will examine the development and rollout of ever-more capable networks and their convergence with fixed, to deliver ubiquitous services.
And while MWC recognises that the mobile industry is unrivalled in its technical excellence (well, up to a point, Lord Copper), it all too often forgets that the success of its achievements rests on the experience users get when they first switch on.  The ‘It's the user experience stupid!' session highlights the importance of putting the user first and the developments that can set consumer imagination racing.

The highly topical subjects of VoIP and Mobile TV must also, of course, have a slot at the conference.  In the case of the former, the ‘VoIP - coming ready or not!' session works on the basis that VoIP has the promise to revolutionise the cost base of voice communications, but will it, and how will VoIP impact the business of mobile operators?  In the case of Mobile TV, it is noted that the application has had a stuttering start, but that, as it finally begins to reach commercial deployment, the race to find the killer business model is replacing the technology debate.  The ‘Mobile TV - taking it to the masses' session looks at the realities of rollout and the business impact of technology decisions.

The above, of course, is a mere snapshot of all that Mobile World Congress 2008 aims to be.  The event also offers both visual and hands-on back up to theories and discussions in the shape of the many products and services on display at the concurrent exhibition.  And, most importantly, it is the king and queen of networking opportunities - of the business and social kind, of course.

A new breed of service providers are helping to reverse the fortunes of some of Europe's most influential mobile companies - by helping them manage their customer lifecycles far more effectively, explains Georgia Hannias

Traditional marketing is no longer a viable communications tool in the mobile telecoms world.  Glossy adverts in the papers, mass-marketing direct mail and telemarketing are not winning customers' loyalty - especially when every operator seems to be offering the same thing.  In this confusing climate of choice, even a great product or service isn't making the real difference to business growth.  What customers are looking for is an operator who can understand their preferences and ultimately tailor their services to give them what they want, when they want it.

Understanding the needs of the customer is what Customer Lifecycle Management (CLM) is all about. This communication trend focuses on managing customer behaviour by finding a way to create a bond between the operator and the customer throughout the customer's lifecycle. Once a one-to-one communication is created, consumer behaviour can be monitored, so that companies know exactly what their customers want and can subsequently provide a tailored service to meet individual needs.  This is usually the first step forward in successfully delivering targeted services and, most importantly, building a strong, long term relationship between the service provider and the customer.
"Customer lifecycle management has a future in mobile telephony - especially in Western Europe where there is a blurring of fixed and mobile services," explains Rob Bamforth, Principal Analyst at Quocirca. "This convergence is eating away at revenue generated by traditional services such as voice calls. To find new ways of earning income and to keep customers happy, operators have to differentiate themselves from their competitors - and to do so in a way that makes a subscriber feel valued and not just a number. CLM can help achieve this by enabling an operator to target an individual person."

Marketing departments are all too aware of this demand for more personalised services and how customer lifecycle management can help achieve this goal. More importantly, they can usually produce the creative ideas that are needed to develop the campaigns that will engage customers. The main problem, however, is that the internal complexities involved in executing new programmes usually prevent them from becoming a reality. IT departments tend to place more emphasis on the OSS and BSS systems and the call centre side of a business, which means that marketing's requirements almost always take a back seat internally.

Consequently, campaigns that should only take weeks to complete and launch can take months, and many compromises are usually made to get a programme out the door - leaving marketing departments frustrated by the fact that they haven't quite delivered the programme they wanted.

Fortunately, there is a new way of simplifying the execution and enhancing the flexibility of personalised marketing campaigns known as Marketing on Demand. Simply put, Marketing on Demand is a software-as-a-service model that removes the internal complexities involved in launching new campaigns.  It enables marketing departments to be more focused on the programmes they want to deliver, leaving the technical execution to a team of external specialists. 

With the availability of marketing on demand solutions, marketing departments can choose to implement customer retention programmes quickly and easily without the need for a long-drawn-out IT project.  This brings a whole new dimension to the way marketing departments can control and execute their marketing campaigns.  It provides the flexibility to capitalise on existing customer data, regardless of where it is stored within the organisation, as it is fed to a hosted solution capable of automating all outbound and inbound interactions in real time.  This enables operators to manage customers on a one-to-one basis so that a relevant and consistent communication journey begins and continues throughout the customer's lifecycle.  Best of all, marketing on demand means that any number of programmes can be launched in one go, thereby allowing operators to execute and manage a combination of programmes without any delay or restrictions. 

The software-as-a-service (SaaS) delivery model is already driving major growth in the customer relationship management applications market. According to analyst firm Gartner, mobile operators and other enterprises continued to invest in front-office applications last year, with worldwide CRM software revenue exceeding $7.4 billion (£3.7 billion) in 2007, up 14 per cent from $6.5 billion (£3.2 billion) in 2006.
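The two Gartner figures are internally consistent, as a quick check of the arithmetic shows:

```python
revenue_2006 = 6.5           # worldwide CRM software revenue, $bn, 2006
growth = 0.14                # 14 per cent year-on-year growth
revenue_2007 = revenue_2006 * (1 + growth)
# 6.5 * 1.14 = 7.41, matching the reported figure of over $7.4 billion for 2007
```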

By year-end 2007, SaaS represented more than $1 billion in CRM software revenue. "The sustained performance of major on-demand solutions providers is driving the growth in the SaaS segment," says Sharon Mertz, Gartner research director.  Mertz also explained this method of delivery was likely to become the dominant one in this market by 2011, as companies updated existing business automation systems to ensure they could meet expected targets for business and revenue growth.
 

"Marketing on demand deployment models are helping communications providers use their CRM systems as a strategic tool to gain competitive advantage, maximise loyalty, reduce churn and increase ARPU throughout the customer lifecycle," says Mikko Hietanen, CEO of Agillic.  "Customer Lifecycle Management solutions take the CRM SaaS model one step further, by providing communications service providers with an application that enables real-time, multi-channel, cross channel dialogues driven by the customer.  What this helps to achieve is the delivery of a true and consistent understanding of how an individual interacts with their service provider."

Software systems like those offered by Agillic also help communications service providers and their marketing agencies to work even more efficiently, as they are programmed to execute automatically the business rules required to reach customers across all touch points - from printed direct mail to digital media and customer services.

"Agillic is one of the first companies I have heard of that focuses exclusively on customer lifecycle management," says Joel Cooper, Senior Analyst, Europe at Pyramid Research. "In highly saturated telecoms markets, what Agillic offers is the kind of thing that operators would be advised to adopt. As customer growth dries up, customer lifecycle management is a good starting point in terms of better understanding exactly what the customer wants."
Agillic has many years of experience of deploying marketing on demand business models for its customers.   The general principle is that marketing on demand must keep the co-ordination and collaboration of any project focused on the business value rather than the technology. Achieving this requires the assurance that promotional campaigns are set up correctly with the right messages - which is usually the biggest and most costly challenge for most mobile operators today.

Communication service providers can choose to implement a choice of best practice marketing concepts that are already successfully deployed and generating positive results.  Consequently, marketing does not have to reinvent the wheel but can instead benefit from a quick route to market by simply adapting proven programmes with their own branding and business rules.  As programmes develop, the marketing department can easily monitor their positive and negative effects and make immediate alterations in real time, with no time lost to code changes.  They also benefit from having access to advisers who can guide marketing on the programmes that will best match their most pressing business objectives.

"Most operators already possess existing data about customer preferences and usage patterns, which can be uploaded to the CLM solution to execute meaningful, customised and timely marketing programmes across all customer touch points," says Mikko Hietanen.  "More importantly, these communications are only received through channels that are agreed to by the customer, which keeps them happy and strengthens relationships even more. This is something that CRM could never provide as the human dimension does not exist."

Georgia Hannias

Operators are increasingly turning to picocells to improve network quality and control infrastructure costs says Chris Cox

Operators have been struggling with coverage problems since mobile communications were invented.  But massive recent investment in 3G licenses and infrastructure has changed the capex and opex cost equation, affecting the performance of the companies that sell equipment to operators. As this article was being written (in Autumn 2007), for example, Ericsson, Nokia Siemens Networks and Motorola made a series of anaemic financial announcements and all three blamed a slowdown in operator infrastructure spending.
While much of the heat is generated by the spiralling cost of implementing 3G, operators are also increasingly cautious about their 2G commitments.

To cope with sustained customer demand for GSM services, operators are looking for ways to keep further investment in additional expensive GSM infrastructure under control. Every operator is actively looking for smarter, more cost-effective methods to upgrade existing networks and yet still deliver additional coverage and capacity to their customers, exactly where it's needed.

Introducing picocells
Many are increasingly turning to picocells.  Picocells are very small base stations that deliver a mobile signal in areas of high mobile traffic (such as in offices or busy public locations) or poor network coverage (such as indoors) and use an existing broadband connection for backhaul.  They are proving to be very much more cost-effective than the alternative of upgrading the macro network.

Operators originally deployed picocells as a "band aid" for coverage black spots. However, as the technology has matured, operators are increasingly seeing picocells as an important mechanism for improving their macro network capacity and performance. And, perhaps most interestingly, they're beginning to be used in one of the most competitive battlegrounds of all: picocells enable operators to use the promise of great service quality to attract (and win) lucrative business customers from incumbent rivals. 

Every network has black spots where coverage is marginal or non-existent. In areas with marginal coverage, service quality inside buildings can drop off sharply, resulting in dropped calls, ‘network busy' signals, slow data rates and poor voice quality.
The first major application of picocells was to address this issue of signal quality in buildings.
For enterprises, coverage in the workplace is a major driver of dissatisfaction with operators - this is a consistent finding in all global regions. And where coverage is an issue, it's the dominant issue and is a key reason for churn (along with price and handset choice). Critically, coverage is often the primary differentiator between operators - and the decisive factor in awarding contracts.

The traditional solution to in-building coverage problems has been the repeater. But today, planners aren't so quick to turn to repeaters to fill black spots or penetrate buildings.
Repeaters are to network planning what the rock was to the caveman.  Simple and ubiquitous, but not the most sophisticated solution around. But until picocells arrived, network planners generally accepted that the alternative always looked more expensive and difficult to deploy.

While repeaters extend coverage, they act to drain capacity from the cell in which they operate. Picocells, by contrast, add both coverage and capacity, as well as the ability to enhance data rates. In addition, repeaters can distort the macro cell, causing interference and handover issues and creating severe radio planning problems. Picocells integrate seamlessly into the macro network.

Repeaters can be difficult and time consuming to install and they're also problematic to manage (they don't offer automated fault reporting, for instance). Picocells can be installed in a few hours and offer integrated fault and performance monitoring.
At the beginning, some operators with difficult coverage challenges shied away from picocells because they felt a little uncomfortable with the need to use IP for backhaul, an unfamiliar protocol for network planners used to more established ways of doing things. As IP has become ubiquitous, however, that resistance has melted away.

Providing good service means always having sufficient capacity available. But avoiding ‘network busy' errors in commercial centres and large cities is becoming more difficult as usage levels increase, driven by competitive voice tariffs and attractive new data services.
Subscribers are spending more time on the network doing new, more bandwidth-intensive things. That may affect the quality of service operators provide to premium customers, like BlackBerry and other PDA users, who expect to be able to access services whenever they want. Providing the right level of capacity is tough in densely populated areas and it's limited by the spectrum available to an operator.

Simply adding new macro cells - even if they're micro base stations - is expensive and time consuming. And public opposition to the introduction of more and more radio masts is increasing around the world as well, even if good sites can be found.
An operator's lack of capacity is not only a churn driver but also a brake on new service uptake.

Picocells offer the possibility of highly targeted deployment, rapid rollout and limited impact on the macro network in terms of interference and the requirement for network planning. Each base station is inexpensive and a single base station controller can handle a hundred base stations.

ip.access undertook some business case research recently (see The Case for Picocells: Operator Business Cases at www.ipaccess.com). In one scenario, where the goal was to offload 60 per cent of the indoor mobile usage of 7000 users over a two square mile area, using picocells was 53 per cent cheaper than upgrading the macro network (each of those users was estimated to use 800 voice minutes and 5MB of data per month).
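The traffic side of that scenario is simple arithmetic (the 53 per cent cost saving itself depends on equipment and backhaul prices not given here):

```python
users = 7000
offload_share = 0.60          # 60 per cent of indoor usage moved to picocells
voice_min_per_user = 800      # voice minutes per user per month
data_mb_per_user = 5          # MB of data per user per month

offloaded_minutes = users * voice_min_per_user * offload_share
offloaded_data_mb = users * data_mb_per_user * offload_share
# 3,360,000 voice minutes and 21,000 MB a month shifted off the macro network
```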
Enterprise market capture

As picocells become more and more ubiquitous as the solution to difficult coverage and capacity challenges, more far-sighted operators are beginning to see how to use the technology to attract and retain lucrative business customers.
In Sweden, for example, Spring Mobil is the fourth GSM operator, having won a license in 2003. It aims to replace fixed telephony in the office using nanoGSM and has recruited over 500 enterprise customers, replacing fixed lines.
Spring can provide fast, low-cost coverage and capacity where enterprise customers need it most. They can sell more minutes while supporting their best customers with the most modern services. The solution reduces churn and drives traffic from fixed lines to mobile networks.
Picocells are an emerging GSM infrastructure play for many operators. They are being used to help operators:

  • Differentiate from commodity networks
  • Increase revenues from voice and data
  • Offer competitive in-office tariffs
  • Decrease churn

Operators have spotted that, with picocells, they can sell new services while improving macro cell performance, without the need to over-spend on infrastructure, because the approach changes to ‘Pinpoint Provision' - adding coverage exactly where it's needed.
Picocells are a proven, end-to-end solution carrying billions of minutes of traffic every year, in dozens of operators around the world.
The future looks bright for picocells.

Chris Cox is Marketing Manager, ip.access

Roaming fraud is now of such concern to operators that the GSM Association has developed a new set of fraud detection standards for its operator members. Eugene Bergen Henegouwen details the development, and urges operators to implement compliant solutions today

Some roaming fraud is a fact of life for many mobile operators and, at its worst, has the potential to cause great harm to the bottom line. This type of illegal behavior seems to affect operators most acutely where the borders of rich countries meet those of poorer ones. In one form of fraud, SIM cards are cloned by well-organized and technically savvy criminals, who then sell them to consumers who most likely have no knowledge that the cards they're using are illegal. Eventually, the cards are deactivated by the operator, but by then the fraud has been in place for some time and money has been lost.

The criminals who participate in this type of fraud are successful because fraud detection and resolution is hindered by the length of time it can take for a visited operator to notify the home operator of the roaming activity that has been taking place. Currently, data records that track subscribers' activities as they roam are exchanged with the home operator within about 36 hours. This leaves a big window of opportunity for criminals who often target their activity during weekends and holiday times when operators are most vulnerable to fraud.
Roaming fraud - when it does hit - hits hard. It is the equivalent of leaving for a weekend trip and returning to find the whole house flooded even though you only had a dripping tap when you left. Any loopholes are potentially exploited by criminals who will strike decisively and "flash flood" what they can in a short timeframe because they know the extent of the capabilities - and limitations - of the current reporting standard. Clearly, now is the time to form a more watertight seal to combat this fraud.

Roaming fraud has become such a concern for operators that the GSM Association (GSMA) has developed and mandated a new set of fraud detection standards for its operator members. The Near Real Time Roaming Data Exchange (NRTRDE) initiative has been developed in specific response to International Revenue Share Fraud (IRSF), which thrives on the clone-and-sell technique.

The new NRTRDE standard will dramatically reduce the record exchange time and make fraud easier to spot and stamp out. It's not that operators don't have systems in place to fight the risk of roaming fraud themselves, but the current High Usage Report (HUR) system is outmoded. NRTRDE has been developed in its place as the next generation of roaming fraud protection in an attempt to effectively cut fraudsters out of the picture and restore the level of security that operators need to operate successfully and - most importantly - profitably.
According to a GSMA survey of 37 operators some months ago, roaming fraud losses affect networks of all sizes and in all regions. In one instance, a single operator is reported to have suffered losses of €11.1 million in just under two years - one more piece of evidence that IRSF has grown to be a costly concern. This may not sound like a huge figure, given that operator turnover can often run into the multimillions or even billions. But when you take into account that it is a loss taken straight from profit rather than turnover - and can never be invoiced - the percentage grows to an uncomfortable level.

In the view of many analysts, this form of fraud has grown to the point where it is not merely a minor irritant. Martina Kurth, principal research analyst for Gartner Group, recently acknowledged that "roaming fraud is a very real and present danger for operators the world over, and it is impacting their bottom lines. Any initiative that enables operators to minimize roaming fraud is, therefore, a strategic business issue on which operators must act if there is not to be further erosion into their profitability."

The GSMA NRTRDE initiative aims to replace HUR to keep operators ahead of the growing sophistication employed by the fraudsters. With NRTRDE, the visited network is required to forward Call Data Records (CDRs) to the subscriber's home operator within four hours of the call end time. If the visited operator is unable to get this information to the home operator in time, the visited operator is held liable for any fraud associated with those calls, and so there is a greater degree of motivation for all parties to make the measure successful.
This approach toward fraud is particularly important in garnering the support of the whole industry of operators because without a shift in responsibility to the visited operator, the motivation to adopt the new standard would be limited. NRTRDE is making operators more accountable for the behaviour of visiting subscribers, and where once it was hard to enforce anti-fraud protection, operators now are much more motivated to work together.
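The liability shift at the heart of NRTRDE can be captured very simply. The sketch below is illustrative only (the function and field names are mine, not part of the TD.35 specification): it applies the rule described above, where a roaming CDR delivered to the home operator more than four hours after call end shifts fraud liability to the visited operator.

```python
from datetime import datetime, timedelta

# The four-hour delivery window mandated by NRTRDE, as described above.
NRTRDE_WINDOW = timedelta(hours=4)

def fraud_liability(call_end: datetime, cdr_delivered: datetime) -> str:
    """Return which operator bears fraud liability for one roaming CDR."""
    if cdr_delivered - call_end <= NRTRDE_WINDOW:
        return "home"     # record arrived in time: home operator carries the risk
    return "visited"      # window missed: liability shifts to the visited operator

# Example: a record delivered five hours after call end misses the window.
late = fraud_liability(datetime(2008, 1, 1, 12, 0), datetime(2008, 1, 1, 17, 0))
```

Because the rule is mechanical, both parties can audit delivery timestamps and agree on who carries the loss, which is precisely what makes the incentive to exchange records quickly enforceable.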
Once the home operator has received the CDR, it can detect fraud via a fraud management system. There is a record format for exchanging these near real-time records, which has been defined in the GSMA's Definition of Transferred Account Data Interchange Group (TADIG) standards document as "TD.35." Syniverse has played the lead role with the GSMA in the development of standards which support NRTRDE, serving both as chair of TADIG and as the official author and editor of TD.35, the fundamental building block of anti-roaming fraud resolution.

The adoption of NRTRDE is expected to reduce the incidence of roaming fraud by up to 90 per cent by closing the roaming fraud window that remains open on operator networks using HUR. Just as importantly, NRTRDE offers operators a far more accurate and timely view of how their networks are performing against fraud.
Rather than waiting until October 2008, the date at which GSMA members are required to implement the new standard, both the GSMA and Syniverse are urging operators to implement compliant solutions today. We believe that as the adoption process gains momentum, operators who continue to rely on HUR will eventually become disadvantaged and may be prone to more attacks.

Syniverse has been taking part in GSMA trials to ensure that our solution, as well as those of other providers, has the interoperability needed for a global industry. All of the trial work undertaken thus far ensures operators can begin rolling out solutions ahead of the deadline and, in many cases, form a watertight boundary that excludes fraudsters and enables the operator to experience the cost benefits sooner rather than later.

The countdown to the GSMA's October 2008 NRTRDE implementation target date is moving closer by the day, and adoption is gaining momentum in the industry. Moreover, with the cost of roaming dropping in Europe for subscribers, it's reasonable to expect that the amount of roaming traffic will increase, making fraud patterns harder to spot and heightening the risk for those relying on HUR. In partnership with the GSMA, we believe the time to start forming a more watertight seal to combat sophisticated fraud is clearly now as the industry moves to protect not only the operators' revenues but also the future of roaming for subscribers worldwide.

Eugene Bergen Henegouwen is Executive Vice President, EMEA, for Syniverse Technologies

WiMAX is finally making its way into the mainstream telecoms market as WiMAX operators around the world begin to roll out their services. Previous obstacles to launching their ventures, such as obtaining the necessary spectrum licenses, deploying mobile WiMAX infrastructure, or selecting the right vendor, have either been overcome or are about to be resolved. Finally, WiMAX is no longer a hyped-up term tossed around in the telecoms industry. It is now a reality, and operators are testing their WiMAX capabilities in real-life environments, with paying subscribers, on loaded networks. David Knox assesses the different types of entrants in the WiMAX market, and reveals the many charging challenges the new innovation faces and the ways in which operators can overcome them

According to a recent report from the Gartner Group, revenue from sales of WiMAX equipment will grow to more than $6.2 billion by 2011, and global connections will reach 85 million in the same year. The current scramble to roll out WiMAX around the world provides enough evidence to back up this forecast, as operators get ready to launch the next big thing in global communications.  Best of all, WiMAX is being embraced by both developed and developing nations around the world - creating endless business opportunities for vastly different economies.

Paving the way for WiMAX's entry into the mainstream is a new breed of dedicated WiMAX service providers that are starting to successfully target high-density, high-usage metropolitan areas in various parts of the world, as well as rural communities where there is no access to fixed line broadband or 3G.

In the developing world, telecoms provider Wateen Pakistan has successfully deployed its WiMAX network in 17 major cities across Pakistan, including Islamabad, Karachi and Lahore. The success of the project demonstrates how the cost effectiveness and speed of deployment offered by WiMAX is allowing competitive carriers to quickly build a wireless broadband network. It also shows how a developing economy can immediately embrace a new, innovative next-generation technology, and smoothly deploy a cutting edge communications infrastructure.

Closer to home, UK companies like The Cloud, Europe's largest Wi-Fi operator, are making positive inroads into the WiMAX space. The Cloud has become the first to offer the service in the financial district known as the City of London. Heralded as Europe's most advanced WiMAX roll out, the venture has given 350,000 workers and thousands of visitors the chance to get broadband access anywhere within "the square mile".

The success of this project has led to more successes for The Cloud, including its hotspot access deal with McDonald's, which recently rolled out free high speed wireless Internet access across almost 1,200 restaurants in the UK, making it the UK's biggest provider of free wireless Internet access. The Cloud also signed a major new deal with the BBC, which became the first UK broadcaster to have all its online content made available for free via Wi-Fi. This latest venture enables the public to access all bbc.co.uk content for free through the UK's largest network of hotspots, operated by The Cloud.  The 7,500 hotspots are located at various locations across the UK, including McDonald's, Coffee Republic, BAA airports (Heathrow, Gatwick and Stansted) as well as a number of outdoor locations including Canary Wharf and the City of London.

Dedicated WiMAX providers are not the only entrants in this burgeoning market. Established fixed line telecommunications providers with no mobile arm - such as the UK's BT - are also trying to get into the WiMAX space to complement their fixed line broadband offerings and to compete with the likes of 3G for high-speed data access. Established mobile service providers are also trying to muscle their way into the market, especially in regions where there is strong demand for high speed data but where 3G is not a feasible option, due to higher infrastructure deployment costs or geographical difficulties. A good example is the Caribbean's leading GSM operator Digicel, which recently rolled out WiMAX in the Cayman Islands. The existing broadband offerings in the country rely heavily on fixed lines, making broadband expensive and limiting choice for many consumers and businesses. Digicel therefore used WiMAX as an opportunity to create competition by offering a better solution to consumers at a lower cost.

Each of these aforementioned types of WiMAX market entrants will have many challenges to overcome before they can achieve profitability - and each one will also have different requirements for WiMAX charging, depending on their infrastructure and what BSS/OSS systems they already have in place.

One of the main challenges presented by WiMAX in terms of charging will be finding a means to perform user authentication in real-time for home and roaming subscribers. Another will be enabling subscribers to roam on other WiMAX networks while still using the same authentication mechanism and balance information from their home network. In other words, operators will need to find a way to offer customers just one account with their "home" WiMAX provider and not require them to worry about multiple sign-ons or topping up their balance with multiple providers.
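The single-account model described above amounts to the visited network delegating the credential check and balance lookup to the subscriber's home provider. The sketch below is purely hypothetical - the user IDs, the in-memory account table, and the function names are mine, standing in for a real AAA (e.g. RADIUS-style) exchange between the two networks.

```python
# Hypothetical home-provider account store; in practice this lives behind
# the home network's AAA infrastructure, not in the visited network.
HOME_ACCOUNTS = {
    "user@home-wimax.example": {"password": "s3cret", "balance": 12.50},
}

def authenticate_roamer(user_id: str, password: str):
    """Visited network asks the home network to vouch for a subscriber.

    Returns a session dict carrying the home balance, or None on rejection,
    so the roamer never needs a second sign-on or a separate top-up.
    """
    account = HOME_ACCOUNTS.get(user_id)   # stands in for the AAA query
    if account is None or account["password"] != password:
        return None                        # unknown user or bad credentials
    return {"user": user_id, "balance": account["balance"]}

session = authenticate_roamer("user@home-wimax.example", "s3cret")
```

The key design point is that the visited network only ever sees the answer to "is this subscriber good for the service?"; the credentials and balance stay with the home operator.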

Having real-time access to customer, pricing and product information which may be stored "off-board" in existing / legacy systems will be another essential requirement for WiMAX providers, so that they can have a real-time view of the customer and therefore be able to charge and control the service in real-time and provide a positive user experience.
Being able to enforce post-paid credit limits in real-time in order to reduce windows for fraud and exposure to bad debt will also be crucial for WiMAX charging. This is a very important issue to address, since fraud and bad debt are now considered the largest areas of revenue leakage for telecom operators. According to a new survey published by UK research firm Analysys, average fraud losses resulting from all types of fraud, including external fraud, internal fraud and fraud by other operators, have grown from 2.9 per cent of operators' total revenue last year to 4.5 per cent this year, and this is expected to increase with the rise in popularity of data and Internet services. WiMAX providers must therefore find a way to protect themselves from this kind of revenue leakage if they want to manage short term as well as long term profitability.

The flexibility to offer pre-paid charging will be another challenge for WiMAX. Initially many WiMAX users will be corporate clients, so they will expect to be on post-paid deals, but they will also need the reassurance of not having to worry about exceeding spending limits. A real-time convergent charging solution can significantly enhance the flexibility of pricing plans and allow users to be automatically switched over from a post-paid to a pre-paid payment mechanism if these user-definable spending limits are exceeded. Instead of being an impediment to new service rollouts, the right charging solution will be able to drive change with the fast launch of new pricing strategies - regardless of what type of WiMAX contract a user is on.
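The switch-over behaviour described above can be sketched in a few lines. This is an illustrative model only (the class and its fields are my own, not any vendor's API): each charge is checked against a user-defined spending limit in real time, and crossing the limit flips the account from a post-paid to a pre-paid mechanism, capping the operator's bad-debt exposure.

```python
class Account:
    """Minimal sketch of a convergent-charging account with a spend cap."""

    def __init__(self, spending_limit: float):
        self.mode = "post-paid"
        self.spent = 0.0
        self.spending_limit = spending_limit

    def charge(self, amount: float) -> str:
        """Apply a usage charge and return the payment mode now in force."""
        self.spent += amount
        # Real-time check: exceeding the limit triggers the switch, so any
        # further usage must be funded from a pre-paid balance.
        if self.mode == "post-paid" and self.spent > self.spending_limit:
            self.mode = "pre-paid"
        return self.mode

acct = Account(spending_limit=50.0)
acct.charge(30.0)          # within the limit: account stays post-paid
mode = acct.charge(25.0)   # limit exceeded: account switches to "pre-paid"
```

In a real deployment the same check would sit in the online charging path, so the decision is made per session or per event rather than on a billing cycle.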

Fortunately for WiMAX service providers, the market already boasts the right technology to ease its numerous rating and charging challenges.  Among the rating and charging experts for this new innovation is VoluBill, which offers a comprehensive range of WiMAX solution capabilities - whether it is simply WiMAX charging, or a complete WiMAX solution including integrated customer care, web self-care, billing and voucher management.
We also offer flexible deployment options for the solutions, making it possible to start with a limited scope and functional footprint and to expand the scope of the solution as business requirements demand.

All eyes are on WiMAX as it brings the reality of truly portable high-speed data access to the mainstream public. According to a recent report from Informa, revenues from mobile broadband services will generate more than US $400 billion globally by 2012, giving WiMAX the potential to be one of the most profitable innovations in telecoms history.

The next big step that WiMAX providers must take is to invest in solutions that offer flexible charging and control capabilities. This will ensure that operators will be able to maximise their financial and business potential while bringing a service that is not only enjoyable to the customer but affordable as well.

David Knox is Product Marketing Director at VoluBill

Opting for a managed services solution provides telcos with enhanced agility in a highly competitive marketplace claims Dominic Smith

Being able to adapt to market pressures and respond quickly is obviously key to success in today's highly competitive telecoms landscape. And yet many telcos are weighed down by the sheer weight and complexity of their technological and service infrastructures.
In the mobile domain, operators' portfolios typically include 2G GSM, SMS, MMS, GPRS, 3G and HSPA basic services, not to mention the range of value-added services, content and applications accessible on top. In addition, in all market sectors, operators have to maintain and bill for a broad array of legacy services. And they need to be able to tailor their offerings to meet the specific needs of a wide variety of market segments - from large enterprises to individual consumers.

Today, some operators are making the mistake of trying to be "all things to all people". They are looking to provide customers with a complete portfolio of converged services including broadband, mobile and fixed communications solutions. The problem is that, by doing so, it becomes increasingly difficult for these operators to meet the needs of all of their customers.

To compete effectively, they need to be agile, able to focus on customer requirements and efficiently deliver the solutions that their customers will actually benefit from. However, with the often onerous requirement to manage and maintain an intricate network of products, services and applications, agility can seem a highly elusive quality.

In this context, it is hardly surprising that telecoms operators are increasingly interested in exploring the possibility of outsourcing their CRM and billing systems to third party solutions providers and, by so doing, freeing themselves up to focus on their core business. 

Steady market growth
Cambridge-based research firm Analysys expects the Western European market for outsourcing technology and customer services by telecoms operators to show six per cent annual growth between 2005 and 2010, rising from €5.9 billion to €8.0 billion. Our own experience at Cerillion indicates that the appetite of operators to outsource business support systems for customer management, order management and billing is on the increase.

Cerillion's show-floor survey carried out at Barcelona's 3GSM World Congress in February found that 50 per cent of respondents thought operators were more open-minded about outsourced billing than a year before. Just 15 per cent said they were less so.
To underline this positive mood, major new contracts are regularly reported in the media. In recent times, one of the most notable was the March 2007 announcement by IBM Global Services that it had won a 10-year deal with Indian operator, Idea Cellular. Under the terms of the contract, IBM is helping to handle services like billing, revenue assurance, credit collection and subscriber management.

A diverse market
One of the most important advantages of the managed service approach for CRM and billing systems is that it can benefit a wide range of operators, working on a broad array of projects. An operator undergoing a large-scale business transformation project, for example, may benefit from a managed service approach to ensure it remains competitive and retains sufficient agility to be able to launch new products and services for the project duration. 
A telco looking to establish itself in an emerging market may seek to put a managed service into operation while it is focused on bringing new people on board and training them up, before ultimately transitioning to an in-house managed solution.

Alternatively, an operator may take a long-term strategic decision not to manage its own CRM and billing systems but to hand that role over to a provider with expertise in the field, leaving the operator itself free to focus on delivering a high quality customer experience. Again, the ultimate goal is enhanced business agility.      

Putting the customer first
This focus on the customer is important. After all, it is customers that will ultimately have to pay to allow operators the luxury of owning and managing their own business support systems. It is often overlooked, but perhaps the most important single benefit operators can achieve from outsourcing their systems is the cost saving that can be passed onto customers.

When purchasing systems, operators typically incur significant upfront capital expenses before they begin to reap benefits. With a managed service model, the entry barrier is much lower. While the operator still has operational costs to take into account, those costs will usually be lower and more predictable than with a traditional licensed implementation.
There is also a risk that telcos who manage their billing and CRM systems in-house end up concentrating more on the technology than on their customers. Although the situation has undoubtedly improved over recent years, the telecoms industry still has an unfortunate tendency to focus more on system functionality than real business drivers. Too many misguided decisions have been made by IT directors intent on purchasing the latest state-of-the-art systems rather than investing in a planned strategy of business improvement and enhanced customer service provision.

In a competitive market, operators looking to achieve enhanced agility should always put the customer first. While acknowledging that technology is important, Professor Robert East, expert in customer behaviour at Kingston Business School, comments: "Customer-facing technology needs to become more sophisticated to deal with recurrent issues more quickly and solve problems more efficiently. Businesses need to focus on technology that actually delivers satisfaction to people."

Reaping the rewards
But it is not only customers who have to foot the bill for operators indulging in the luxury of managing their own systems. On top of the obvious capex and opex charges, operators may be missing out on a range of other business improvements that the managed service model can offer.

One key benefit they could achieve by migrating to a managed service model is the ability to commit their technology provider to a service level agreement (SLA) with agreed turnaround times for implementing new products and incident resolution, for example.    
Such contracts formalise the way that billing and CRM systems are run and, by so doing, enable operators to gauge how quickly system changes and additions can be implemented. Again, this provides them with enhanced control over their systems environment and the ability to react more quickly to external pressures.  In contrast, the IT department within a large telco business will typically have no specific SLAs in place with any other part of the organisation and often no fixed review process either.

Another important advantage of the managed service approach is that it supports improved time to market for new services. This is because the managed service provider, typically with the benefit of extensive experience of a broad range of different customer installations, will usually understand the procedures and processes around those systems much more clearly than the operator does.

Operators working in emerging markets can also benefit by obtaining access to scarce skilled resource directly, rather than facing the headache of trying to recruit people locally with the requisite skills. Telco start-ups in all regions can often also benefit in a similar way.

Positive prospects
The future for outsourcing of CRM and billing systems is looking increasingly positive. As Simon Sherrington, author of a recent Analysys report on outsourcing, points out: "Outsourcing has become an important weapon in a telecoms operator's strategic arsenal. An effective and well-managed outsourcing scheme can deliver flexibility, reduce time to market for new services, and help to deliver profit growth for shareholders."
There is also clear evidence that outsourcing can help operators to achieve significant cost savings. However, in today's highly competitive telecoms environment the most important benefit of a managed service approach is the enhanced agility it brings, freeing telcos to focus on their core business of delivering a high-quality service to their customers.

Dominic Smith, Marketing Director, Cerillion Technologies

Technology innovations tend to capture the imagination - and the headlines - but, says Lance Spencer, there's little point in providing bells and whistles if all a company wants to do is simply communicate first and foremost

Is it just me or is this decade just flying by?  With 2008 just around the corner, it only seems like yesterday that companies were busying themselves with notions that the Millennium bug would wreak havoc on their computers and telecommunications systems.  And God help anyone who was flying at the very turn of the century!

In between times, the subject of advances in voice telephony and Voice over IP (VoIP) has raised more crackles and jitters than anyone in our business could have expected.  It's been nearly 120 years since the innovative American undertaker Almon Strowger invented the automatic telephone exchange, and just over a century later we are still finding more ways to fiddle around with his original invention.

Although technological improvements are being made for the benefit of customers, sometimes it's plain old telephony ("ah, POTS", I hear the telco veterans recall) that matters most.  Yes, companies like to have telephones that work, and furthermore they like telephones that offer the same experience they've had for decades.  If their first experience of 'new' technology is not a happy one, then you can bet your bottom dollar that they will shy away in future.

With a plethora of telecoms providers claiming they will help to improve business communications whilst cutting costs, it's a confusing picture for the end user. Poor old Strowger will be turning in his grave!

On a more positive note, the clever bit is that new and emerging communications technologies enable telecoms providers to offer end users innovative services that provide exactly what customers want with a whole lot else besides.

However, there's no point in providing bells and whistles if all a company wants to do is simply communicate first and foremost.  This is particularly the case with small to medium-sized businesses, which now have the potential to get their hands on 'grown-up' telecoms technology but fear being bamboozled by all the complexity that comes with it.
Over the past two years Tiscali and other companies have been unbundling the local loop with a view to providing innovative services that end users understand and want.
For our part, we are on target to reach just over 800 exchanges by the end of 2007 and have a further target of 200 in the first quarter of 2008 to exceed 1,000.  Beyond this the amount of additional coverage gained for business purposes is relatively minimal.
By deploying equipment in BT exchanges, it's possible to introduce services that provide the ‘basics' that end users need, e.g. a telephone line, but in a way which opens up a whole new world of telecoms potential if that is what they want.  And in the majority of cases, once end users have confidence in the core technology, that's precisely what they ask for.
For example, Tiscali recently introduced an 'all in one' service designed to enable resellers to deliver tailor made, competitively priced line rental, voice and data services over a single connection.  WLVA - which stands for Wholesale Line, Voice and ADSL - is a wholesale, fully unbundled service based on the unbundling capability in BT exchanges.   Resellers are provided with the line, full traditional voice services and an ADSL based data connection in one product.

It's the equivalent of taking the Wholesale Line Rental (WLR) service, Carrier Pre Select (CPS) and an ADSL connection all in one package.  Resellers do not need a relationship with BT for this or any other part of the WLVA service.

Coming back to our head scratching end user, the key aspect here is that the user experience is exactly the same as with the services they've taken in the past.  A degree in telecommunications network infrastructure isn't required and the commercial benefits are plainly laid out and understandable.

It's then up to the reseller to be as daring and as creative as they like, offering differentiated products to customers by expanding and diversifying their product portfolio.  In fact, our experience is that the two go hand in hand.  Once an end user is comfortable that the basics are in place, it's then a case that the end user and reseller can jointly discuss added value services such as IPCentrex.

Depending on the type of product offering, end users can then make their own choices about the level of sophistication and service required.  For example, the ADSL component of the WLVA service is initially available in Standard and Business grades of service with the option of enhanced care.  Product speeds range from 512Kbps to Max grade services.  Voice is offered with a range of telephony features suitable for businesses and consumers.  The line is provisioned with virtually all features enabled, allowing resellers to develop their own product bundles tailored to their customers. 

And as well as creating their own product range out of such offerings, many features come free of charge to resellers.  Combined with very competitive call rates available from wholesale providers, this provides resellers with the potential to offer competitively priced products while still increasing their margins.

Services such as WLVA are based on the latest Dense Wavelength Division Multiplexing (DWDM) technology.  This provides the ability to carry the next generation of data products, for example IPTV and telco grade voice services.

In the future, companies can expect to see the introduction of ADSL2+ services offering speeds of up to 24Mbps as they become available.  There will also be investment in Wholesale Broadband Connect (WBC) as BT rolls out its 21st Century Network.

Another example of simplicity at the core is "Voice Ready" Broadband.  Having introduced a 'four-line' version for smaller businesses earlier in 2007, we thought it would make sense to make the service accessible to larger SMEs, so we introduced a 'ten-line' version towards the end of 2007.  It's the first service to offer guaranteed voice quality over broadband with enhanced care and SLAs to support carrier class voice services.

Such guarantees in this service offering and others are important because smaller businesses (and particularly those at the 'S' end of the SME spectrum) have often thought of themselves as being at the lower end of the telecoms pecking order.  If as an industry we are to remove barriers to adoption then, as well as innovation, we must be prepared to deliver quality across everything we do.

Voice Ready is designed specifically for resellers wishing to provide guaranteed quality services to support real time applications such as Voice over ADSL, Video Streaming and Video Conferencing.  ADSL Connectivity is ideal for service providers with an existing network infrastructure who wish to overlay additional network capabilities such as Local Loop Unbundled services.

Voice Ready Broadband is the first service of its type to offer Service Level Agreement (SLA) backed, guaranteed voice quality over broadband with enhanced care to support carrier class voice services.  The SLAs are on latency, jitter and packet loss, all of which are crucial to the quality of a voice call over a single broadband line. These guarantees mean that resellers can offer cost effective alternatives to traditional PSTN and ISDN based services.
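A monitoring probe verifying such an SLA needs a defined way to turn per-packet measurements into the jitter figure the contract is written against. The sketch below is illustrative, not Tiscali's implementation: it uses the smoothed interarrival-jitter estimator from RFC 3550 (the RTP specification), and the 30 ms ceiling is an invented example threshold.

```python
def interarrival_jitter(transit_times):
    """Smoothed jitter estimate from per-packet transit times (seconds).

    Implements the RFC 3550 (section 6.4.1) estimator: each new
    interarrival difference moves the running estimate by 1/16 of the gap.
    """
    jitter = 0.0
    for prev, cur in zip(transit_times, transit_times[1:]):
        d = abs(cur - prev)              # change in one-way transit time
        jitter += (d - jitter) / 16.0    # exponential smoothing, gain 1/16
    return jitter

# Example: transit times varying by a few milliseconds around 50 ms.
j = interarrival_jitter([0.050, 0.052, 0.049, 0.051])
sla_ok = j < 0.030  # hypothetical 30 ms jitter ceiling in the SLA
```

Latency and packet loss are simpler to measure (a mean and a ratio respectively), but jitter needs an agreed estimator like this one, otherwise the reseller and the wholesale provider can measure the same line and reach different numbers.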
Other benefits to resellers include low cost interconnection with network infrastructure at 100Mb, 1Gb and 10Gb, industry-leading product development of new technologies such as ADSL2+ and the ability to provide differentiated services in the marketplace.
As we edge towards the end of the 'naughties', reflections on this past decade will show that despite criticism of the telecoms industry, on the whole we're actually pretty good at delivering innovation.  OK, it can take us a while to get there, especially if products such as Voice over IP are over-hyped, but successful change doesn't always happen overnight, especially where a monopoly continues to be untangled.

Companies of all shapes and sizes are benefiting from telecoms unbundling, product innovation, service guarantees, and perhaps most importantly the ability to present new products in a way that end users understand them.  Now that must be putting a smile back onto Mr Strowger's face.

Lance Spencer is Tiscali Business Services Product and Marketing Director, and can be contacted via 
e-mail: lance.spencer@uk.tiscali.com
www.tiscali-business.co.uk

Matthew Finnie looks at the ever growing bandwidth requirements in Europe, and details what can be done to meet this increased demand

For those of us who have been working for a decade or more, the speed and reliability of our Internet connection is now assumed. We want access to be ubiquitous and limitless. We no longer hunker down and dial into the Internet or marvel at a T3 transatlantic interconnect which speeds it all up.

Europe - the new Internet superpower
The Internet was a North American invention and for a long time America was the Internet. Now those days have gone. Europe is now the largest Internet presence in the world, and the fastest growing market.  Recent data from telecoms analyst group Point Topic's Global Broadband Statistics service suggested that Eastern Europe continues to show growth, and was the only region to record more than 10 per cent growth in Q2 of this year. In March 2007 Romania passed the one million subscriber mark, making it the third Eastern European country among the 10 fastest growing countries worldwide.

Western Europe is also setting a fast pace when it comes to broadband growth. Greece was the top grower in percentage terms in Q2, expanding by 27 per cent, and the biggest mover in the top 10 countries by number of subscribers was France, achieving the highest percentage growth rate of 9.36 per cent in the quarter.

As well as being the fastest growing territory, Europe also has the highest number of broadband subscribers. A recent audit by Internet World Stats revealed America had 64,614,000 subscribers and China had 48,500,000. The total number of subscribers in the nine largest European markets is 77,706,870, so Europe is clearly a long way ahead, even without counting the rest of the continent.
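The comparison above is simple arithmetic; a quick sketch using the figures quoted (a snapshot from the period, as reported by Internet World Stats) confirms the gap:

```python
# Broadband subscriber figures as quoted from the Internet World Stats audit.
subscribers = {
    "USA": 64_614_000,
    "China": 48_500_000,
    "Top nine European countries": 77_706_870,
}

europe = subscribers["Top nine European countries"]
usa = subscribers["USA"]

# Europe's nine largest markets alone exceed the US total.
lead = europe - usa
print(f"Europe leads the US by {lead:,} subscribers")  # 13,092,870
```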

The increase in broadband subscription is being driven not by businesses but by consumers and their relentless experimenting with evolving applications. The sharing of videos, photos, music and more across sites such as YouTube and Facebook places enormous demands on bandwidth. The challenge for the DSL providers giving consumers access is that they aren't sharing in the valuations the content providers are seeing. Given this, how do they maintain spend to keep delivering service while access prices in real terms are declining on a per-megabit basis?  Perhaps part of the problem is that many of these providers have still not embraced a connectionless NGN world, where access to a customer is not a guarantee of all service revenue.

New applications are now being developed that assume the broadband service is available. While the provider looks to include TV, there are whole sections of the industry with no network of their own who see the penetration of broadband as the meal ticket for their latest venture. The most bizarre turn in this trend is mobile operators freeing up capacity in their own wireless networks by placing femtocells on the consumer premises, in some cases using the existing DSL as wireless backhaul.

Bandwidth, and its availability, has hit a tipping point where people expect to see ever-increasing levels of service. One note of caution: most consumer broadband networks are horribly asymmetrical - while bandwidth is going up, there is still a bias toward download, making them unsuitable for many business applications. But the global business community is also demanding ever-increasing volumes of bandwidth as more and more business-critical applications sit on the Internet.

Business 2.0 
Web 2.0 is a phrase coined to capture the collaborative nature of a network, and now people are talking about Business 2.0 in the hope that some of this social networking and agile application delivery will rub off on the corporate sector. Demands on corporate bandwidth are growing as a result.

The practical needs of a corporate are less glamorous but by no means less network intensive. The IT Director no longer has the luxury of time or resources to embark on a grand IT plan spending millions for a promised brighter future in two years. IT Directors are talking about embracing a "service aware" approach to developing IT applications that borrows much from the experimental evolutions seen in the consumer Internet experience. The challenge for many is that invariably this approach is network centric (it has to be) but requires a more iterative and agile approach to application development.
And the on-going move towards collaboration and unified communications is a key factor in demands on corporate networks too. Unified communications - the embedding of tools such as IM, presence, conferencing and voice into one platform - is going to have as big an impact on business communications now as e-mail did in the 90s. Earlier this year we launched our own integrated communications platform, Interoute One. This service enables corporate customers to manage voice calls as easily and cost effectively as e-mail, and without the need for complex integration or upgrades to their existing telephony infrastructure.
And with its recent launch of Office Communications Server (OCS), even Microsoft is looking to get into the unified communications market. Yet despite its strengths, OCS is only as strong as the network that carries it. Without a network provider able to route calls, OCS doesn't achieve its ambition.
So with the demand brought about by unified communications, combined with Web 2.0 and businesses running more and more applications over their networks, a clear picture emerges - there is an incredible increase in demand for bandwidth to ensure the highest quality end-user experience.
This demand can only be met by more bandwidth, yet for service providers, what is the best way to meet this demand?

Buy or build?
The demand for bandwidth will stretch many existing service provider infrastructures to breaking point, so what are the options available? DSL has traditionally been the preferred technology to deliver broadband services in Europe. It uses existing copper access networks to deliver broadband and is well entrenched in Europe, but is struggling to cope with bandwidth demand now, and in the future will struggle massively as demand grows yet further.

Consequently, service providers have to start looking at deploying fibre deeper into the network, even to the home or building, to meet future bandwidth requirements. Fibre is as future-proof a communications technology as is possible to get, and several service providers have already made a commitment to deploy fibre-to-the-node or fibre-to-the-home networks in the next three to five years. But is building new fibre networks a viable option?
It's a paradoxical situation. The main problem with fibre is that it is just so expensive to build from scratch; it's basically real estate. The industry has had well-documented problems with a number of carriers who built large fibre networks in the 90s, and then watched as the telco market dipped and their businesses suffered massively. Now the telco market is booming again and demand for bandwidth is high, yet the best (and arguably only) way to meet that demand requires fibre. So for service providers looking to stay in the wholesale provider space, the only way to sensibly achieve this is through fibre ownership.
This means that in the wholesale space, if you are without the physical assets, you are shortly going to run out of capacity. Even if you have fibre but only a leased pair, you will shortly be buying more. This leaves a small but battle-hardened minority of carriers with multiple fibre backbones that will be the dominant suppliers in the market place. But even owning fibre is not quite enough - you need an operation that understands how to deal with Europe's different laws and regions. For example, the Interoute network connects 85 cities in 22 countries across 54,000 cable kilometres of fibre. Up to 48 fibre pairs have been deployed throughout the network, which means it has the capacity to carry over a petabit (a billion megabits per second) of traffic. The backbone is complemented by 19 deep city fibre networks that interconnect with the necessary diversity of access methods required to deliver 21st century telecoms.
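The petabit figure quoted above is a unit conversion; a minimal sketch of the arithmetic (the per-pair capacity is not given in the text, so only the unit relationship is shown):

```python
# Unit relationships: 1 petabit = 10**15 bits; 1 megabit = 10**6 bits.
PETABIT = 10**15
MEGABIT = 10**6

# A petabit per second is therefore a billion megabits per second,
# matching the parenthetical in the text.
ratio = PETABIT // MEGABIT
print(f"{ratio:,} Mbps in one Pbps")  # 1,000,000,000 Mbps in one Pbps
```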

Fibre is vital to business communications in 2007 and beyond - it is the ultimate delivery mechanism - and the harsh reality is that a service provider without physical fibre in the ground will not have the raw material necessary to satisfy the huge demand for bandwidth. The die is now cast: fibre is the bandwidth raw material of choice for the Internet, but building fibre networks is now simply too expensive, so the only real option is to buy bandwidth from someone who has it.  There is a tipping point with all technology, a point at which the user understands it and operates accordingly. Bandwidth and the Internet have hit that point, and fibre is the only viable way to meet the demand.

Matthew Finnie is Chief Technology Officer at Interoute

At twenty-six, Ethernet is something of a ‘Grand Old Man' of networking technologies.  But as Mark Bennett points out, Ethernet is still one of the most agile and reliable networking standards available, despite its relatively advanced years.  Like a good wine, it just gets better with age

In the early 1980s, technology was offering the world new ways to live and work.  If you chose to, you could drive to the office in the ‘revolutionary' Sinclair C5, unwind with a Betamax recording of ET or even play a few games on the latest Amstrad.  Fortunately, while these high-profile ‘flash-in-the-pans' were hogging the limelight, our predecessors in telecoms engineering were putting in place some more enduring technologies.  The first Ethernet products hit the market in 1981, and over the past 26 years the standard has established itself as the de facto choice for LAN and MAN connectivity.  Such longevity is rare in the technology space and bears witness to the versatility of the standard.

Ethernet has been such a huge success because it meets the criteria essential for the mass adoption of any product: it is inexpensive, flexible and simple, and it delivers an elegant solution to a potentially very complex problem, so it should be no surprise that it is now ubiquitous.  Any technology, though, can only last if it can change to meet the demands of users.  It is this Darwinian ability to continually evolve that marks out the true survivors of the technology space.  Ethernet has proved that it can do this and is now moving beyond its traditional spheres of LAN and MAN to provide a much more comprehensive approach to networking.  It is becoming increasingly clear that, aged 26, Ethernet is going from strength to strength and emerging as the standard of choice for long-haul access technology.
The reasons for this can be traced to the changing needs of businesses, especially in specific sectors.  Many of these organisations are demanding ever-increasing amounts of bandwidth in their networks, and Ethernet is emerging as the best means for providing this.  This market includes a wide range of organisations in sectors as diverse as utilities, finance, public sector and smaller business.  The key markets here are companies which can be termed ‘DIY': businesses which have their own in-house IT managers and want to retain management of their own IP-based networks.  Such organisations are typical of utilities, media and finance companies.  The public sector also fits the DIY mould, and we are seeing high levels of demand from local and central government, as well as all areas of schools and higher education. Demand from the indirect markets of mobile, national and international carriers, as well as from IT services companies, is also growing fast.

So what, exactly, are these larger businesses and public sector bodies using Ethernet for?  Ethernet has a number of properties that appeal to these organisations.  Its speed and comparatively low cost make the technology ideal for inter-site connectivity, and we are seeing a great deal of demand for this.  As LAN speeds increase - with 100Mbps being standard and Gigabit (1,000Mbps) increasingly common - it makes sense to ensure that the wider network is not a bottleneck, so 100Mbps and 1Gbps inter-site networks are in regular deployment.  Such organisations are also increasingly looking to leverage the benefits of data, voice and applications convergence, so Ethernet is establishing itself as the ideal method of access to converged next-generation networks (such as MPLS-based core networks).  Convergence brings with it the need to handle ever-increasing amounts of data traffic as more and more applications are placed on a single network - everything from rich content e-business applications to services and IT centralisation applications.  Ethernet has the capacity to handle vast amounts of data without putting undue strain on the network.  So it seems that service convergence is breathing new life into Ethernet networking.

It's not just large enterprises that are driving demand for Ethernet connections.  Although Ethernet has typically been associated with larger enterprises, demand from small and medium sized businesses (SMBs) is also growing. SMBs are looking for increased bandwidth to deploy next-generation services, and Ethernet is an appealing option for them because of its simplicity and familiarity from its use in the LAN. Ethernet is enabling SMBs to roll out the advanced applications that help them compete with bigger companies, such as VoIP, distributed WANs, IP-based video conferencing and real-time collaboration.

We are seeing demand from SMBs, which have traditionally used low bandwidth leased lines, grow particularly fast. These businesses will often have more than one site and be too large for DSL, especially given the limitations of the upstream bandwidth that ADSL offers when used for a VPN.  They require permanent, dedicated, always-on bandwidth to support applications critical to their business. This sector is increasingly turning to Ethernet - for the security, reliability and quality of service of a private network, which is not currently available on DSL services - as a solution to its increasing bandwidth needs.  Ethernet is attractive as it offers more bandwidth at a lower price per megabit, a critical consideration for SMBs. Solutions that can be scaled to deliver multiple services are particularly attractive, as these can deliver additional business benefits through applications such as VoIP.

The indirect market for Ethernet is also seeing considerable growth.  This market is made up of service providers investing in the technology in order to deliver advanced services to their customers or to extend their networks.  National and international carriers, as well as mobile operators, business ISPs and IT services companies, are all using Ethernet to enhance the services they deliver, both for high speed Internet access and to provide hosting and connectivity to their customers.

Mobile operators in particular are trying to leverage Ethernet to reduce the operating costs of their access networks, as well as to provide the higher levels of bandwidth required as mobile data begins to be adopted more widely.  Mobile operators are trying to increase ARPU to recoup their investment in 3G by offering data-heavy applications such as mobile Internet and TV.  The high-speed 3G and WiMAX networks supporting these services are rapidly increasing backhaul bandwidth requirements.  As traditional backhaul technologies and architectures struggle to support growing demands, operators are turning to packet transport technologies, such as Ethernet, as a cost effective solution.  Ethernet is therefore increasingly supporting new, revenue-generating services for mobile operators without allowing backhaul costs to spiral, helping to maintain mobile operators' profitability.
International carriers, on the other hand, are moving to Ethernet to support the increasing demand for bandwidth from their large multi-national customers.  Traditionally, leased lines would have been used to cater for such demand, but Ethernet has proven it can offer a much more scalable alternative at a lower price per megabit.  International carriers can use Ethernet tails to provide their customers with a range of applications and services, including MPLS IP VPN, Ethernet connectivity, Internet and VoIP.  Because Ethernet is a layer two protocol, it allows international carriers to retain full control of layer three IP routing and their own IP class-of-service SLAs.

National operators are driving demand for Ethernet along similar lines.  In the UK, Ethernet backhaul provided by altnets is proving particularly popular as a cost saving option, as their wholesale offering can be much less expensive than the incumbent's.

A number of organisations want to centralise the hosting and management of servers, data and storage systems, and other IT assets in one location.  To do this they need dedicated, very high bandwidth, high quality connectivity to that centralised facility. This centralisation of IT assets is designed to reduce their operating and capital costs, as well as enable delivery of advanced new services. 

Historically, however, the business case for centralising IT assets may not have been viable due to the high cost of bandwidth required to connect remote sites to central data centres.  Ethernet offers more bandwidth at a lower price per megabit than the technologies traditionally deployed, helping to drive the business case for centralisation. Ethernet together with MPLS IPVPNs, is ideal for connecting remote sites to central locations with Gigabit Ethernet services providing the ideal, very high bandwidth connections between data centres. In addition, the cost savings can then be diverted into other applications that fit with Ethernet, enabling these organisations to layer multiple services and applications on to the same network.

With Ethernet, bandwidth can be increased or decreased at very short notice.  This means that companies need only pay for the bandwidth they actually use, increasing capacity easily and quickly as and when required, either to handle a short-term spike in demand or a longer term increase in traffic. This flexibility is very attractive to organisations of all kinds, and the scalable nature of Ethernet is one of its key selling points.
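The economics of that flexibility can be sketched with a toy cost model (the tariff and demand figures below are invented purely for illustration, not real operator pricing):

```python
# Hypothetical tariff: a flat charge per provisioned Mbps per month.
PRICE_PER_MBPS = 10.0  # invented figure for illustration

def monthly_cost(bandwidth_by_month):
    """Total cost over a period, given the Mbps provisioned each month."""
    return sum(mbps * PRICE_PER_MBPS for mbps in bandwidth_by_month)

# Demand with a short-term spike in one month.
demand = [100, 100, 400, 100]

# Scalable Ethernet: resize capacity each month to match actual demand.
flexible = monthly_cost(demand)
# Fixed provisioning: pay for the peak all year round.
fixed = monthly_cost([max(demand)] * len(demand))

print(flexible, fixed)  # 7000.0 16000.0
```

The saving comes entirely from not paying for peak capacity in the months when it is idle, which is the "pay only for the bandwidth they actually use" point above.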

Ethernet, therefore, has proved itself in the face of a fast changing telecoms market.  Change usually forces technology to sink or swim, and the recent move towards convergence and bandwidth-hungry applications has shown how resilient Ethernet is.  Indeed, the future for Ethernet is looking good.  Carrier Ethernet is emerging to offer operators a flavour of Ethernet with the same characteristics as leased lines, while the IEEE has launched a new set of OAM standards for Ethernet in order to better manage and maintain the technology.

At 26, therefore, the Ethernet story is far from over.  Ethernet looks set to be the dominant access standard for the foreseeable future, driven by the demands of business and ideally placed to meet those demands.  The versatility and simplicity of the standard that led to its dominance in LAN and MAN is extending to the WAN, delivering access to the next generation of applications and services.  It goes to show that in the world of telecoms you really can teach an old dog new tricks - tricks that are increasingly appealing to business.

Mark Bennett is Head of Data Portfolio at THUS plc
www.thus.net

by Dominic Smith, marketing director, Cerillion Technologies

 

In line with the current emphasis on conserving resources, reducing wastage and cutting carbon emissions, telecommunications services are at the forefront of a revolution in green thinking, which is affecting every business sector today. And telecoms operators themselves are under ever-greater pressure to adopt environmentally friendly strategies.

Telcos have long been pioneers in helping businesses from other industries pursue a green agenda. The deployment of robust global wide area networks and connectivity for video-conferencing applications, for example, have both played an important role in reducing the need for business travel and unnecessary face-to-face meetings.

However, operators increasingly need to see the provision of tools to help other businesses become greener as just one element of their overall environmental strategy. Today, they need to be focused on making a more direct contribution towards the future well-being of the planet.

How Electronic Billing Helps Conserve Resources 

Telecoms operators have already made a start in the direction of greater environmental responsibility, with several key initiatives already well advanced. One of the most significant is the work they are doing to promote electronic billing. In the past, potential cost savings were the major incentive for operators and end customers alike.

And electronic billing does offer clear business benefits, being comparatively inexpensive for both the operator and the end customer. In contrast, hard-copy itemised bills typically entail significant print costs and wastage of resources.

Yet, despite operators actively pushing the benefits of electronic billing, customer take-up has been slow in general. One of the likely reasons is that the environmental benefits of the service have until recently not been highlighted by operators.

BT recently began to ‘push' a green angle to electronic billing by encouraging its major customers to convert to its OneBill service in preference to hard-copy paper invoices. Further underlining its green credentials, BT's efforts to convert customers to paper-free billing have resulted in more than 500,000 trees being planted in the UK so far.

The landmark has been achieved thanks to BT's partnership with the Woodland Trust, which guarantees that every time a customer signs up for paper-free billing, BT pays for a native broadleaf sapling to be planted in a UK woodland creation site.

BT's approach is just one sign that, in the future, telcos are likely to give the issue of ‘green billing' a higher priority. Indeed, most operators today are actively looking at new ways of selling the service. With the public better informed about environmental issues than ever, the ‘green' approach is likely to have great resonance with customers in the future.

Going forward, software providers will increasingly join forces with their telco customers to promote the reductions in environmental waste and operational cost that can potentially be achieved by pursuing this methodology. Over time, greater numbers of businesses are likely to elect to become part of what has been described as ‘the paperless electronic billing and payment manifesto.' 

Operators often see electronic billing as just one element of a larger integrated self-service strategy. The concept of self-care for customers is in itself an environmentally friendly one.

Customers can be encouraged to carry out key tasks easily themselves online - such as updating personal details or ordering new services - without drawing too heavily on the resources of the operator, either in terms of systems or of people. Self-care typically results in reduced customer dependency on large call centres and, by association, radically reduces the amount of physical equipment required to service customers.

Pre-integration and the Managed Service Model

Operators can also make a more direct contribution towards protecting the environment through the systems model they choose to implement. Opting for a pre-integrated set of components when updating billing and CRM systems or IT systems architectures, is a good initial step.

Adopting such an approach enables operators to significantly reduce the time projects take, together with the amount of travel undertaken and resource usage required, especially when compared with the traditional best-of-breed approach, which can involve large integration teams visiting the operator's site every day for months on end.

Operators can take the benefits achieved from pre-integration one step further by choosing to follow a managed service model, thereby reducing the expense of integration teams flying to different locations to install and maintain new systems. Instead, existing infrastructure at existing managed service centres and existing hardware and software can be used - a more environmentally sustainable approach.

The benefits achieved will tend to accrue over time with the cumulative efficiencies of re-use. Once an operator has commissioned a service partner to put in place the relevant people, hardware and infrastructure in one of these centres, there is no need to re-invest every time the centre is used. And of course the wide area network connectivity provided by operators enables them to achieve all the benefits associated with hosting services in distant locations and managing them remotely.

In the complete managed and hosted model, this is taken one stage further still, with the management of the system typically carried out by a third party from a remote location. The managed service approach does of course reduce, if not completely eliminate, the need for teams of consultants to travel to and from site.

Looking into the Green Future

Those telecoms operators, like BT, that have already deployed electronic billing and online self-service capabilities for their customers are likely to promote these areas much more in the future. And those that haven't already deployed this functionality are likely to begin doing so very soon. They should also look at how they provide those services, and consider the benefits of both the pre-integrated solution and the managed service approach.

Renewable energy providers are often able to sell their services at a premium because they are selling to the ‘green aware' consumer.  In the age of telecoms commoditisation, one way operators can justify maintaining their prices is through investment in green initiatives and having green systems in place. In the future, they may even have a legitimate argument for charging a premium when selling electronic billing and self-care to a green audience.

Another possibility is the emergence of niche operators focused entirely on targeting the green consumer. In recent years, the market has seen the arrival of virtual operators who target specific industry segments, encompassing everything from students to sports enthusiasts and from children to coffee-shop users.

Maybe we are not far away from seeing the first telecoms reseller, which sets up, brands and positions its products purely in the green segment and exclusively targets the green consumer? With the growing business focus on environmental protection, this is likely to become a reality sooner rather than later.

How accurate is the position that WiMAX and cellular technologies stand in opposite corners in the development of wide area wireless standards? Robert Syputa takes a look

The debate surrounding the new entrant into wide area wireless standards development has tended to be framed as WiMAX versus cellular technologies and market development.  Industry shaping debates should start with a clear understanding of the framing of the premise.  Judging from recent white papers, panel discussions, articles and interviews, WiMAX is being portrayed as an anti-cellular effort rather than an alternative development that fits into cellular mobile and the broader context of fixed-mobile convergence.  While WiMAX appeals to alternative service providers because it can be used in spectrum that is designated specifically for wireless broadband, both it and the cellular industry at large have changed significantly over the past few years, rendering this a Swiss cheese argument:

  • WiMAX and LTE are developing according to goals for evolution to the 4G multi-service wireless broadband platform that addresses both mobility and high bandwidth applications.
  • While WiMAX has broadened to become more mobile and capable of being used for media services, 3G cellular has become increasingly broadband, resulting in practical convergence between these fields of development. What's more, both are driven to use the same core sets of technologies - authentication and handoff, network management, dissimilar network roaming - which align goals for network operation and user experience.
  • Multi-mode SoC and device designs are increasingly capable of delivering a user experience that disregards the differences between WiMAX and cellular. If the user can make use of services that transit from WiMAX to cellular networks, the argument in favour of control of the huge market share currently held by cellular becomes moot.
  • The argument that the huge investment in development of the cellular industry sets it apart from WiMAX also breaks down in light of the fact that many of cellular's most prominent contributors are also contributing their technology, design, production and marketing capabilities to WiMAX. Operators may convert or cross-sell their cellular customers to WiMAX to gain additional revenues.
  • Mainstream regulatory organisations, including ITU, are setting the requirements for next generation wireless systems to which both WiMAX and 3GPP/3GPP2 aspire.
  • Next generation wireless will be based on OFDMA, which causes a similar discontinuity of air interfaces from 3G for both LTE and WiMAX.

There remain practical differences in technical implementations, market momentum, regulation of spectrum, and corporate support between WiMAX and LTE but the gap continues to shrink such that it looks increasingly like the gap between one generation of mainstream cellular system and the next.  As with every cellular system, operator decisions regarding adoption of WiMAX depend on detailed business case analysis that takes into account all known factors for successful deployment and business development. 
We contend that WiMAX is another variant of cellular, which faces the same hurdles for adoption as any major new system development, such as LTE, cast within the traditional cellular standards development groups. While WiMAX and 3GPP/3GPP2 started out with different sets of objectives - fixed-nomadic data versus mobile voice - the technologies and market demand have evolved over the past seven years to become very similar.
What's more, the evolutionary path directs both fields of development toward the same basic goals and sets of technologies, making it impractical to argue that they are distinct:

  • WiMAX 802.16e-2005 looks very likely to be accepted as a member of IMT-2000 cellular.
  • WiMAX 802.16m/j will be proposed for IMT-Advanced.
  • LTE, the next generation of systems from the 3G camp, will similarly use OFDMA and be proposed for IMT-Advanced.
  • Major goals for IMT-Advanced include an evolutionary framework upon which multiple classes of services and scale of operation can be developed. The goals of ITU for 4G look very similar to what WiMAX has become over the past three years as major telecommunications companies and operators have influenced development.

Ericsson, the world's largest supplier of cellular infrastructure, has dismissed the importance of WiMAX. Earlier this year, Ericsson announced that it was pulling the plug on development of WiMAX and would devote its B3G efforts to developing LTE.  Ericsson has resold WiMAX equipment from Airspan but has never committed significant effort to developing WiMAX internally.  Strategically, it has never made sense for Ericsson to push WiMAX, or any alternative, that might dilute its own market position.
Ericsson executive vice president Bert Nordberg contends in the June 18th issue of the Globes online magazine: "We have nothing against WiMAX, but I have to say that it has no business model. This, at least, is Ericsson's conclusion about the matter. Therefore we're not investing in this area at all. What is supposed to work on WiMAX already works on cellular 3G."
Counterpoint: One thing that is unarguable is that the cellular industry has evolved and managed to survive adoption of new wireless interfaces that were not directly backward compatible.  This has been driven by the need to deliver better levels of voice and higher bandwidth data service.  Operators would prefer to see systems evolve on the same technology platform long enough to enjoy profits, but have been driven to adopt new cellular systems that provide a commercial advantage despite the need to commit large capital expenditure to displace or deploy next generation systems into new spectrum.  The business case for 3G deployment is clearly demonstrated, but as the need for bandwidth continues to grow, so the case for a shift to B3G (beyond 3G) systems based on OFDMA technology is progressively strengthened.
Nordberg continues: "They talk about WiMAX having 30 million customers in 2010. But by that time, cellular broadband will have 500 million subscribers. These are completely different orders of size. If we have learned anything from the history of technology adoption in the telecommunications market, it is that standardisation has huge power, and cellular is the standard."
Two points. 1) The wireless industry has broadened and matured to focus on multiple classes of service. The vision for IMT-Advanced and 4G is of highly scalable, multi-service evolutionary platforms. While mobile applications are likely to dominate in sheer numbers, the trend is towards more diverse and specialised services, and the majority of future profits will likely come from extended services, not from basic voice or data connections. 2) Ethernet is the predominant standard for wired data communications, and its momentum extends more directly to WiMAX. Open use of Internet communications and applications is part of the converged landscape of fixed-mobile technology and markets. It is myopic to treat cellular mobile market momentum as a sole defensible position, particularly since it is translatable via multi-mode devices to new service networks. Wireless communications has been defined within various standards development groups and sets of companies, each with its own technology and commercial agendas. WiMAX is definitely a cellular technology, for the most part indistinct from established cellular given the increasingly overlapping development road maps. If a cellular operator deploys WiMAX in multi-mode alongside its existing network, its customers hardly need to know. WiMAX also appeals to alternative service providers and to classes of service distinct from mainstream mobile cellular, which can nevertheless often take advantage of the cost dynamics achieved in mobile markets. Standardisation does have huge power in driving down costs and accelerating market adoption. Convergence between IT/networking, the Internet, radio, music and TV media, new interactive peer-to-peer viral video, and mobile and fixed communications draws multiple participants together to influence overall product and market development.
While mobile cellular dominates in terms of volume, it does not dominate in terms of applications, content, revenue, or openness of development and user participation. The WiMAX standard arrives at a time when it is practical to open up many classes of service to the benefits of standardisation.
Several more arguments can be made for a shift to B3G platforms that take better advantage of evolving trends in smart antennas and granularly deployed smart wireless broadband networks. The cellular approach can be criticised in its entirety as too constrained to pursue the coming generation of wireless development: without a major rewrite that will make LTE more similar to WiMAX than to 3G, it is incapable of being granularly organised and deployed in open IP use scenarios. The ITU's goals for IMT-Advanced are bold: a multi-service platform capable of providing per-user bandwidths of 1 Gbps fixed-nomadic and 100 Mbps mobile. Ask Ericsson how it plans to achieve 4G performance in LTE or beyond, and the response is very close to the development path WiMAX is already well on the way to achieving. That flips the debate about continuity of technology development, placing LTE as the follower rather than the leader of the dominant emerging mandates. And the inevitable reorganisation of wireless business models along lines of open rather than prescribed content and applications shifts the debate to a matter of when, not if, new operator revenue models will emerge.
The gains in performance needed to deliver 4G will not come from advances in either CDMA or core OFDM interface technologies, but from how networks are organised and deployed to make multiple use of available spectrum and to source content and application resources within the distributed network. Delivering the performance gains has more to do with building smart networks that incorporate wireless than with wireless itself; 4G is a wireless broadband network, with everything that implies. OFDMA is the core link technology for both WiMAX and LTE 4G, but the performance gains must be built through an evolution that is more about how networks are organised than about the air interface. The impact of this evolutionary shift, taking advantage of the 'spatial' and architectural domains of wireless development, will be to greatly increase bandwidth density while reducing costs. Suffice it to say that the shift is to a new evolutionary platform with all that this implies: an additional dimension of development that will deliver a 3x-10x improvement in total network throughput over cellular wireless. What may provoke protests against WiMAX the most is the recognition that it is rapidly evolving to deliver on a frontier of new developments that have only just started to unfold.
Is the debate about WiMAX being a development outside the mainstream of cellular, or is the entire field of wireless converging, bringing additional industry participants and markets into play? Put directly: who owns wireless broadband? Is it a select group of mobile companies, or a broadened field of development that increasingly includes networking, IT and media interests? We think the momentum is shifting to allow a new contender: both WiMAX and LTE will battle in the ring for the 4G crown.
This may appear to add to the problems of harmonisation, but systems are increasingly harmonised at higher levels of functionality and converged via multi-mode at the user device level. Spectrum bands are also increasingly harmonised through device integration. An enlightening example of this trend is the incorporation of Qualcomm's FLO/MediaFLO into 3G devices in Europe and the United States: the dissimilar technologies are converged at the chip-set and device level and integrated at higher levels. The decision to use MediaFLO becomes the operator's commercial decision rather than a standards debate. Likewise, we expect decisions regarding WiMAX to resolve on practical concerns, and discussions about what is or is not cellular to become meaningless.

Robert Syputa is Senior Analyst, Maravedis
www.maravedis-bwa.com

In the first of a regular column for European Communications, Benoit Reillier looks at the role played by regulators and politicians in the rapidly changing telecoms arena

Regulation, public policy, competition law… it would be tempting to discount these notions in the belief that the communications sector has more to do with technological change, innovation and marketing than with politics and regulators.
All market participants, however, are playing a global game whose rules are being decided by Governments, regulators and competition authorities. In fact, a closer look suggests that the regulatory and competition framework, within which communications firms operate, may be more important than how well they ‘play the game’.
This is especially true in the telecommunications sector where incumbent operators were often deemed too powerful and subjected to heavy sector specific regulations. These rules were set by Governments and regulators in order to ensure the development of a competitive market while protecting consumers’ interests.
Enlightened strategy departments in many communications firms have been aware of the importance of regulatory affairs for quite some time. As a result, many heads of strategy have taken on these added responsibilities (historically often left to legal or communications departments).
Indeed, shaping the debate and contributing to the development of a fairer and more efficient regulatory model is becoming paramount for operators' futures. In a market where investment time horizons can span decades, regulatory visibility is critical. This is especially the case when existing infrastructures, such as the fixed copper networks found in most countries, are showing signs of obsolescence. When legacy infrastructures become a bottleneck, new multi-billion-dollar investments are required to upgrade them, as illustrated by the early roll-out plans for new generation fibre networks in several countries.
The EU Commission has played a critical role in shaping the regulatory environment over the past decade. The Commission gives guidelines and tools (a framework) to National Regulatory Authorities (so called NRAs) as well as homework (market reviews to be carried out) and deadlines. It also reviews progress of its NRAs every year (implementation reports) to see if they have been good students or not. Those who have worked well are praised while other countries are named and shamed or encouraged to do better. Very much like a teacher, the Commission is also asking for new powers to be able to better discipline those who do not follow the rules.
Viviane Reding, the vocal EU Commissioner for the Information Society, is currently reviewing the telecoms regulatory framework that will apply to all member states over the next few years. If the proposals are approved by the EU Parliament, the NRAs in each country will have to implement the new policies. Some of these, such as the addition of mandated functional separation to regulators' toolbox, could result in operators having to split their operations so that the retail side of the business (which sells services) becomes separate from the network infrastructure. Mandating such drastic measures would, of course, have far-reaching consequences for the development of the market.
While these debates may seem remote, they will have a profound impact on the way in which the market develops; on the type of competition that emerges as well as on the levels of investments in infrastructure and services. Given the strong relationship that has been established by many economists between investment in communications infrastructures and economic growth, the stakes are high. Indeed it is not just about telecommunications but also about the overall productivity gains enabled by these new services.
So equipment manufacturers, consumers and operators alike all have a lot to gain from contributing to the regulatory process and shaping the debate. Best regulatory practice involves a period of consultation with stakeholders, and this opportunity to review the arguments and contribute to the debate shouldn’t be missed.
The challenges behind policies such as mandated structural separation, increased pan-European regulation and new generation networks are some of the critical topics to be addressed over the next few years, and each represents an opportunity to shape the debate.

Benoit Reillier is a London based Director of the telecoms practice of global economics advisory firm LECG. He can be reached at breillier@lecg.com.
The views expressed in this column are his own.

European Communications takes a look at the important issues up for discussion at the Broadband World Forum Europe

 

The success of broadband penetration into the access network is ushering in a new era of content for residential consumers and enterprise end-users alike. Just what we mean by "content" is also undergoing transformation, as consumer devices and ubiquitous broadband service enable new kinds of entertainment beyond traditional TV, ones that include user-generated content, information, and video applications anytime, anywhere. As carriers around the world continue with wireline and wireless broadband deployment, they must begin to turn equal attention to how broadband usage of key applications, such as IPTV, will shape the future of the industry.
Many cutting-edge developments are taking place in the European broadband marketplace, such as advancements in IPTV. Many of the benefits of these advancements as well as accompanying issues and challenges will be discussed by top industry leaders and experts at the Broadband World Forum Europe 2007 in Berlin this October, hosted by Deutsche Telekom and organized by the International Engineering Consortium (IEC).

IPTV: On a global roll
Indeed, IPTV will be among the foremost topics on delegates' minds. Recent research from the Multimedia Research Group (MRG) indicates that there are approximately 15 million IPTV households worldwide, and that 576 service providers are presently active in the IPTV sector. According to Helmut Leopold, Chairman of the Broadband Services Forum (BSF), "Television on the basis of Internet protocol (IPTV) is on the threshold of the mass market."
According to the BSF, global IPTV growth will be driven considerably by the Asia-Pacific region, with the emerging markets of China and India and with Australia, whose IPTV offerings are entering the commercial phase. Worldwide growth will also be propelled by North America, where AT&T and Verizon are preparing for countrywide IPTV roll-outs. For 2010, the MRG forecasts approximately 50 million IPTV households, 21.3 million of them in Europe alone.
"IPTV is red hot," says John Janowiak, President of the International Engineering Consortium (IEC).  "This is the kind of application we're going to look closely at in Berlin." 

Innovative applications
Several service providers have already demonstrated the flexibility, individuality, and diversity of IPTV applications, and many of them will be presenting and participating in the World Forum. Such deployments have built upon the end-user addressability enabled by IPTV, as well as advances in fixed/mobile convergence (FMC).
One such case study involves individualized content for young children who are hospitalized for long periods of time. Telekom Austria met the needs of these kids for individualized programming by using RFID chips implanted within stuffed animals, each of which transmits the child's age, language, background, illness, and treatment program to a set-top box.  This then yields content and applications appropriate for the individual patient.
"This kind of innovation is at the heart of the emerging broadband world," says Janowiak, "and it's the kind of forward thinking that will be characteristic of the World Forum."
Indeed, as innovative applications and platforms for IPTV proliferate, attention is turning to the future of IP-based information and entertainment services across multiple consumer devices. The three main devices at present, and for the foreseeable future, are the TV, the PC, and the mobile handset. Service providers looking to remain competitive and profitable will need to understand how to deliver content across these platforms in a coordinated, effective, and controlled manner, and a profitable one at that.
"The world of broadband is converging toward an anytime, anywhere model," says Janowiak. "The IEC seeks to bring together industry players to squarely face and analyze these sorts of critical issues. In this sense, the Broadband World Forum Europe will play a central role in moving toward the future."

www.iec.org
