In the first of a regular column for European Communications, Ian Scales looks at why the debates surrounding network neutrality produce so much more passion in the US than in Europe

Just when we thought the ‘Internet neutrality' debate had finally exhausted itself, it reared up unexpectedly in late summer. US cable company Comcast's dabs were apparently found on some disrupted BitTorrent (a popular P2P application) file exchanges, and then someone extracted a ‘confession' from a slightly confused executive to the effect that the back-room boys might have been doing some traffic-shaping. Bah! spat the blogosphere. This was clearly a deliberate attempt to degrade a competitor (BitTorrent is effectively another way to distribute video, the cable companies' core offering) and a foretaste of things to come unless there is Internet neutrality regulation. The ensuing argument was predictably measured and sober. The ‘Nazis' were implicated: "First they came for the BitTorrent users," intoned one blogger, "but I said nothing because I didn't use BitTorrent, then they came for the... (and so on)."

There's been lots of that sort of thing from the US but in Europe, while there have been raised voices, there's never been the same fury and hyperbole around Internet neutrality issues. Cultural and political differences play a big part - Internet neutrality is often linked to freedom of speech in the US and there's a long tradition of grassroots, anti-monopoly sentiment (absent in Europe). But even so there's a noticeable difference in atmosphere, and I think telecom competition (or the lack of it) is at the bottom of it. In Europe, with glaring exceptions, competition seems to be at least going in the right direction, especially in the UK. In the US, after decades of liberalisation and competition, the general opinion is that the market is in reverse gear; there is much talk of the old monopolies re-establishing themselves and a growing feeling of revolution betrayed. Ever since early 2006, when the then AT&T chairman and chief executive, Ed Whitacre, started talking about the need for extra payments from Microsoft or Google for high quality content delivery, the more excitable end of the pro-Internet neutrality brigade have been on red alert - the Comcast incident looked like the first signs of a shakedown.

As to the Internet neutrality argument itself, it's notoriously difficult to pin down. Some of it goes around in circles and some of it is just stupid. And if the ‘pro' brigade are capable of a little hyperbole, the ‘antis' are even worse, blatantly misrepresenting the whole concept of neutrality as a communist plot and garnishing their arguments with predictions of an imminent Internet collapse and/or the drying up of network investment (both of which have always been about to happen since about 1993, but never do).

On the other side of the argument, Internet history is actually littered with players shelling out to gain a performance advantage for their applications or users, and on each occasion there has been a grumbling chorus. The introduction of ‘private' peering - where, instead of exchanging traffic at delay-prone public peering points, players simply peered ‘privately' - was one such. It was the same with content delivery networks that offered ‘better than best effort'. Wedges with thin ends were grimly forecast. What's different now is that the backstop of broadband ISP choice is felt to be lacking in the US - many users claim they have just one possible provider, two at most, and therefore market forces alone aren't enough to keep the big ISPs honest. And they probably have a point.

In the UK, BT's decision to rearrange itself into retail and wholesale arms with the establishment of Openreach has helped foster an atmosphere of grown-up retail competition. From a truly awful total of a few tens of thousands of lines just three years ago, BT's competitors have now unbundled well over 3 million lines. Rightly or wrongly that's generated the perception of real broadband choice, and there seems to be a lack of angst amongst UK Internet users as a result. If my broadband ISP starts to exhibit Nazi impulses, I can probably go to another (unless I'm somewhere really remote). So for the time being, and thanks mostly to a clued-up Dutchman, the UK seems to have cracked the regulatory conundrum. Now we'll see if Viv Reding can sell the concept to the rest of Europe, and maybe even the US.

Ian Scales is a freelance communications journalist, editor and author.
He can be contacted via: i.scales@ntlworld.com

Mike Hawkes examines the aspects of mobile phone security that seem to be hidden from plain sight

A few weeks ago I was talking with some very well-versed individuals from a highly respected anti-virus and security firm about mobile security, and was intrigued to note that most of the conversation was around virus protection for mobile phones rather than simple data protection. Perhaps this isn't surprising, bearing in mind the consumer's innate fear of computer viruses and how they can steal your life away. These security companies make serious money combating the destructive and criminal activities of malware distributors, and it makes sense to apply this knowledge directly to mobile phones.
There is, however, a rather important aspect of mobile communication security that is missed in this level of conversation: that of actual data security.

Viruses have become known as the means of stealing personal data from individuals, which is then sold on for all sorts of fraudulent purposes. From the hackers' perspective, this has become a necessity for computers, both because of the relatively trustworthy nature of PKI for secure Internet communications and because of the need to access PCs remotely through ‘invisible' Trojan-horse applications.

Mobiles are a different matter.  As m-commerce takes off, an increasing number of services invite businesses and consumers to send and receive sensitive information on the mobile phone. And this trend is only going to grow. 

Yet, measures to tackle security issues that work for PCs cannot be directly applied to mobile phones. Firstly, mobile phones get lost a lot more often than PCs or even laptops. It is reported that as many as 10,000 phones are left in the back of taxis in London alone each month. What of all the other taxis in other cities, the buses, trains and bars, and, of course, the phones that are physically stolen? Anything sensitive left in the inbox or sent items can be readily extracted from the phone.

Data on a mobile phone does not necessarily need to be sensitive for it to be of value to a non-owner either.  Increasingly, items of value are being sent to the mobile, often in barcode format over MMS. There are a number of security risks around this too. For example, with no audit trails, fraudsters can claim not to have received the message and repudiate the payment. Tickets can be bought on stolen credit cards and forwarded for cash.
Possibly more important than the issue of data on handsets is that of data interception. Why? Primarily because the radio link used in mobile phone communications is inherently insecure. To quote a US security expert: "If it has an antenna, it is not secure, period." Additionally, many telecoms businesses are not really aware of what this insecurity entails, let alone of the risks to customers.

It is true that cell interception remains a low-level threat while the pickings are poor, but as localised concentrations of personal data sent by phone grow, so the incentive for fraudsters to begin cell interception increases.

A recent example of this can be seen in Westminster, London, where the City Council invites drivers to send their credit card and other personal details via SMS to pay for parking. Other councils around the country are likely to follow suit and introduce similar schemes, creating more honey pots for fraudsters. As cell interception technology is readily and cheaply available on the black market - one can even find DIY instructions on the Internet - cell interception poses a real threat to mobile users.

So, there are two clear dimensions of risk here; data that can be taken off the device itself, and data that can be intercepted over the air.  Most interestingly, neither of these risks to personal data is even slightly related to the propagation of viruses between handsets. So where is the opportunity? 

By integrating tools that make phone content available only to the owner of the phone - through on-handset encryption activated by a PIN code, for example - data on a lost or stolen phone becomes unusable to anyone else. Combined with secure over-the-air encryption, the nature of mobile phone communication, particularly SMS and MMS, has the potential to change dramatically.
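To make the idea concrete, here is a minimal sketch of PIN-activated on-handset encryption. It is written in Python for readability rather than for any particular handset platform, and the key-derivation parameters and storage format are illustrative assumptions, not Broca's implementation.

```python
# Minimal sketch: protecting stored message content with a key derived
# from the owner's PIN. Illustrative only - a real handset would use
# platform key storage and hardware-backed keys where available.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(pin: str, salt: bytes) -> bytes:
    # Stretch the short, low-entropy PIN into a 256-bit key; the high
    # iteration count slows down brute-force guessing of the PIN.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(pin.encode())

def encrypt_message(pin: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(12)
    ciphertext = AESGCM(derive_key(pin, salt)).encrypt(nonce, plaintext, None)
    return salt + nonce + ciphertext  # stored in place of the plain message

def decrypt_message(pin: str, blob: bytes) -> bytes:
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    # Raises InvalidTag if the PIN (and hence the key) is wrong.
    return AESGCM(derive_key(pin, salt)).decrypt(nonce, ciphertext, None)

# Without the PIN, a lost or stolen phone yields only unreadable blobs.
stored = encrypt_message("4711", b"Ticket barcode: 9-780306-406157")
assert decrypt_message("4711", stored).startswith(b"Ticket")
```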

Mike Hawkes is CTO of Broca Communications

In a commercial world of increasingly numerous and, some would say, meaningless acronyms, Software as a Service (SaaS) stands out as offering real value to a wide range of businesses, says Jerona Noonan

As the world becomes more virtual, so communications technologies enabling collaboration can make the difference between success and failure.  And, in turn, deploying such tools in the best manner possible is critical in ensuring their full value enterprise-wide.
In supporting this, the best Software as a Service (SaaS) solutions can lay fair claim to being the most reliable and cost-effective delivery model currently available, offering a broad range of benefits to both the enterprise and individual users. Quick and easy to deploy, they ensure you only pay for what you use; in addition, the IT department does not have to manage capacity, performance or maintenance, and it is easy to extend use to partners, suppliers, customers and other users outside the organisation's firewall.

In its latest report, Leveraging the Value of Software as a Service: Key Benefits & Best Practices, industry analyst Frost & Sullivan also points to other benefits ensuring a substantial return on investment.

Users can avoid getting locked into a single vendor or solution, for example; they are able to test applications without the upfront commitment to a long-term implementation and, as a result of being located remotely, SaaS solutions aid disaster recovery planning in the event of business interruptions. 

More for less
Today, the SaaS concept has assumed a high media profile as a cost-effective way for large enterprises to meet the broader business imperative for IT investment to achieve ‘more for less'. As organisations face growing commercial, regulatory and environmental pressures in ever more global markets, there is an unprecedented need for IT to drive greater internal efficiencies and better customer service within a reduced carbon footprint. Yet the budgets available to achieve this remain consistently tight, putting pressure on solutions providers to be more creative, both in terms of the supporting technologies and their implementation and ongoing management.

In response, across all areas of IT, vendors are developing SaaS-based solutions designed to achieve lower-cost service delivery. And, as ever in a market where the latest technology ‘buzzword' appears to offer highly attractive returns, their ability to deliver the benefits both promised and desired is likely to vary significantly, depending on the quality of the individual vendor's offering and the particular sector they are looking to support.

Cost-effective deployment
Having said that, analysts by common consent point to multi-media conferencing as an area of especially strong potential for SaaS.

The reasons are not hard to find.  It is, for example, cost-prohibitive for a large enterprise to build out its infrastructure to support a truly scalable web conferencing solution, including the appropriate network, applications and resources.   

As an online service, SaaS requires no upfront hardware or software investment, allowing the enterprise to focus investment on its core competencies rather than on applications and technologies that require continuous expenditure and resources to maintain properly. In most cases, no more than a simple Windows-based application needs to be installed on each user's desktop.

Companies realise that this is not an application that they can easily deploy and support themselves.  Large enterprises could not possibly keep up with the technology and changes in applications, as well as the overall support required to make this an effective and useful capability across the company.

Not surprisingly, in light of this, Frost & Sullivan predicts ‘robust growth' for such applications as audio, video and web conferencing and collaboration.  In the case of web-conferencing in particular, it forecasts that almost 70 per cent of the total market will be services by 2011.  
In today's virtual commercial world, adopting and deploying the right communications technologies has become mission-critical for businesses both large and small.  Yet the SaaS concept has, in various guises, been around for more than 20 years.  For example, the hosted services model - as a cost-efficient alternative to in-house implementation - has formed the basis of Genesys Conferencing's own market proposition since the 1980s.
Having said that, there is a clear distinction to be made here with application service providers (ASPs), in earlier years a well-recognised means of delivering hosted applications.  The fundamental difference is that, unlike the more recent SaaS delivery mechanism, ASPs typically adopted the traditional client-server model, managing the servers without making any modifications to the delivery system. 

Scaling up and managing multiple customer accounts was also problematic as, in effect, they had to add more infrastructure each time they signed a new client.  SaaS vendors, by contrast, rely on net-native applications, which significantly improves performance. 
Further, scalability and performance are greatly improved as dedicated servers are not required for each client site: this not only improves vendor profitability but enables some of the savings to be passed on to customers in the form of lower prices.
In short, the SaaS model is now both more reliable and cost-effective in providing robust support for virtual organisations.    

For an enterprise business evaluating whether or not a hosted - and, more specifically, a SaaS - solution is most appropriate to its needs, a number of issues should be considered, both with regard to the solution and the choice of vendor.

Once again, the Frost & Sullivan report identifies a wide range of business situations where a SaaS solution makes sound commercial sense, both financially and operationally.
It will be especially valuable if the organisation is growing, or where employees need to be able to collaborate with one another and with others outside the organisation. A SaaS-based approach will also be appropriate if the goal is eventually to offer a complete suite of integrated applications over time, as dictated by need, or if the company wants to avoid costly up-front capital expenditure and the ongoing maintenance charges that follow it.
Selecting the most appropriate SaaS pricing model with the right scalability - both up and down - is of course essential.  Yet there are a number of further considerations to be made in choosing the right vendor.  Secure access to the provider's network and software applications is paramount, as is guaranteed 24/7 performance - especially critical in the context of real-time communications applications.

Services and support must ensure ongoing availability to end-users, and the ability to integrate with other applications within an existing infrastructure is also important. And finally, the communications application must be not only simple to deploy but also easy and intuitive to use: for it is only by encouraging maximum adoption and continuing use throughout the organisation that the full financial savings and other operational and environmental benefits will be realised.

Broadly-based benefits
The advantages of SaaS, though wide-ranging, commonly have financial implications, both in terms of cashflow and outright cost savings.  For example, unlike on-premise solutions, SaaS is quick and easy to deploy across the enterprise: and, as a hosted solution, no server hardware is required. 

As SaaS is usually charged on a per user basis, companies can scale their conferencing or collaboration applications to support multiple users whenever necessary, without having to pay for every user within the organisation.  The service can also be scaled easily and quickly to more users - irrespective of location and also beyond the firewall - as the need to participate expands across and beyond the enterprise. 

This is especially attractive, as few employees need permanent availability, yet many will need access at least some of the time.  The result is that costs are kept under control, at the same time ensuring ready participation in virtual meetings and other activities whenever it is required.
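The cost logic is easy to sketch. A toy comparison, with invented placeholder prices and headcounts rather than any vendor's actual tariff:

```python
# Toy comparison of pay-per-use SaaS pricing with licensing every seat.
# All prices and counts are invented placeholders, not vendor figures.
employees = 5_000
active_users = 400              # staff who actually conference in a month
saas_per_active_user = 30.0     # hypothetical monthly charge per active user
licence_per_seat = 8.0          # hypothetical monthly cost to license every seat

saas_monthly = active_users * saas_per_active_user     # 12,000
all_seats_monthly = employees * licence_per_seat       # 40,000

print(f"SaaS pay-per-use:  {saas_monthly:,.0f} per month")
print(f"Licence all seats: {all_seats_monthly:,.0f} per month")
# Scaling up for a busy quarter is a billing change, not a procurement project.
```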

As SaaS applications include all upgrades and updates, the end-user organisation and its staff will always benefit from the latest software as soon as it becomes available. Set-up and deployment across the enterprise is almost instantaneous: the business does not have to concern itself with setting up accounts, communicating them to each end user and then training them, as this too is taken care of within the SaaS environment.
The result of delivering web conferencing via a SaaS model is that the service is available at any time, anywhere in the world.  All that is required is Internet access and a telephone - indeed, by adopting the VoIP option, not even a telephone is necessary. 
In the context of the latest multi-media communications and collaboration solutions, the term ‘SaaS' significantly underplays and undervalues the enterprise-wide impact of some vendors' service offerings.

In contrast to those who provide the software application only, Genesys Conferencing, for example, goes further, by offering 24/7 support, protection against technology obsolescence, full set-up, implementation and deployment capability and comprehensive reliability, performance and security. 

And, where the provider authors its own software, the customer will also benefit from the flexibility of a solution that can be customised to meet their own particular needs.  The result of adopting a service solution at this deeper platform level is that the customer will be assured of a future-proof yet resilient unified communications solution that will be cost-effective both to install and maintain.

The result?  As an alternative delivery mechanism, "SaaS is changing the way companies buy and deploy software and for good reason," believes Frost & Sullivan.  "Most SaaS providers charge per user or usage, to stay flexible with their applications purchases, scaling up or down as needed and offering applications outside the organisation, as appropriate.  This model also encourages usage within the organisation, boosting productivity and the technology's RoI."       

As today's enterprise businesses face greater pressures than ever, the breadth of financial, operational and service benefits which SaaS delivery offers enables companies in all sectors to remain both productive and competitive - the essential basis of both survival and profitable growth.

Jerona Noonan is Sales Director, Genesys Conferencing

The announcement earlier this year that the MediaFLO standard has been adopted by a major US telecoms operator for mobile TV services has been greeted with disappointment by some European industry commentators and proponents of DVB-H. However, in a wider context the announcement can be seen as a boost for the whole mobile TV market and should pave the way for more widespread roll-outs of the mobile TV technologies in general. John Maguire discusses how regional, commercial and regulatory issues will influence the adoption of different standards and how open and proprietary standards can successfully co-exist

There is a fascinating battle for supremacy being played out across the globe as different mobile TV standards, both open and proprietary, vie for acceptance and market share. Currently there are three key players in this embryonic market: representing the proprietary solution we have Qualcomm's MediaFLO end-to-end solution, which has launched in the United States; representing the open standards are ISDB-T in Japan and DVB-H in Europe, both of which have fully rolled-out, live services.

The key advantage that a proprietary system has in an early market is that it allows a single company or joint venture to deliver a single point, turnkey solution to a telco or broadcaster. In the case of MediaFLO, Qualcomm presented a compelling argument by offering a solution that was extremely quick and easy to deploy using existing infrastructure wherever possible. For the US operators that have adopted MediaFLO the opportunity to steal a march on the opposition by investing in an ‘off the peg' solution was too good to miss.

The disadvantage of the open standards approach is that, being fundamentally ecosystem-based, it is a collaborative approach with certain inherent drawbacks, and at first sight the end result does not appear too appealing. There is a steeper learning curve with an open standard, with more elements to plug together, and the need to drive interoperability and manage that ecosystem in order to find and develop the best-in-class suppliers across the value chain. However, organisations are increasingly willing to invest in a standards-driven process because of the greater benefits over a proprietary solution in the long run. Open standards tend to be supported by a group of organisations; there are more companies involved, and consequently there is a bigger ecosystem, which allows participants to drive down the price and to drive up the level of competition.

Another benefit of the open standards approach is that the cost of developing the Intellectual Property (IP) can be assessed and a single IP value fixed by a licensing body, for instance MPEG LA. As the cost per handset for using a particular standard is set by an independent body, operators have clarity of cost prior to entering the market and can be confident that there will be little cost variation due to changes within that standard.
The role of government support for mobile TV is another interesting area. In countries where governments currently support either analogue or digital free-to-air TV there will soon be a watershed, when the so-called ‘digital dividend' becomes available and spectrum and frequency allocation is agreed for services such as mobile TV. Governments in countries such as the UK and Ireland have a mandate and arguably a social responsibility to provide free-to-air content broadcast to traditional TVs. The question that must be addressed very soon is how mobile TV fits into this free-to-air model, and should state-funded broadcasters such as the BBC and RTE be expected to extend their services to include hand-held devices?
If the answer is ‘yes' then a government is going to find it far easier politically to adopt an open standards based solution, which, by definition, can be serviced by any company with suitable knowledge and experience, rather than a proprietary solution. In fact, in Europe the first steps towards government acceptance of an open standard for mobile TV (in this case DVB-H) have been led by Viviane Reding, the European Commissioner responsible for Information Society and Media, who has been extremely vocal about the need for Europe to work together and accept the open standard approach.

Another benefit of governmental support for open standards is that by deploying networks using compatible standards across multiple regions - and, in the longer term, multiple countries - not only are there benefits through economies of scale, but there is also the opportunity to develop roaming capabilities. For mobile TV, deployment of a proprietary standard would not allow this and would lead to more market fragmentation than an open standards approach could potentially deliver. For example, if MediaFLO or a similar proprietary standard gained a foothold in Europe, it might be successful in one or two countries, but there would be little chance of pan-European organic growth. On the other hand, an open standard such as DVB-H has a far higher chance of widespread success in Europe as there are far fewer barriers to its adoption. In fact, the rollout of DVB-H services is likely to cause a snowball effect in opening up new business models and extra revenue channels for roaming. Roaming is less important for mobile TV than for multicast or unicast TV, but an open standards approach is typically the desired strategy. Hence all the major mobile operators are looking for well-defined interfaces, not proprietary standards, so that they can choose best in class.

However, in the face of all the logic indicating that open standards are the way forward, the success of MediaFLO in the US demonstrates exactly what a clever business case Qualcomm have developed. MediaFLO put together an end-to-end system designed specifically for the US market and, in order to demonstrate the suitability of the concept, went as far as sourcing and licensing the content. Qualcomm were then able to approach operators and offer a complete working solution that could be rolled out in a very short period of time. However, once MediaFLO had been demonstrated ‘in the field', Qualcomm took it to the US standardisation body with the aim of having it adopted, because even Qualcomm recognises that the only way to drive the business globally is to have it as a worldwide standard.
However, it is arguably the size of the market, rather than which standard is being used, that matters more, certainly in the short term, and consequently it is interesting to consider the levels of penetration that the commercially available mobile TV networks have achieved. When the Japanese government took the decision to launch an ISDB-T service, it first secured a mandate to proceed and positive endorsement of the standard from all of the mobile operators. By securing buy-in from all the major players in the Japanese mobile TV market to contribute to the standard and to develop devices based on it, the government ensured that after one year of operation the service was available to 70 per cent of the Japanese population. Of these potential users, 10 per cent were actually using the mobile TV service. In the case of 3 Italia with its DVB-H (H3G) network, after one year of operation, 800,000 of the available 8 million subscribers were using the mobile TV service. So, interestingly, in both markets, open standards hit 10 per cent of the available population. In the USA, despite Qualcomm's best attempts to make mobile TV accessible, the proprietary MediaFLO networks have reportedly shipped fewer than 1 million devices after about a year of operation, which is well down in percentage terms compared with the comparable Italian and Japanese networks. The USA is a very big, very fragmented market, but not surprisingly there is an ongoing debate as to whether the slow take-up of mobile TV can be attributed to geography, the standard or perhaps external economic factors.
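Setting the quoted figures side by side makes the comparison concrete; note that no like-for-like percentage can be derived for the US, since no addressable base is stated:

```python
# Take-up rates implied by the figures quoted above.
japan_covered_share = 0.70      # ISDB-T reach after one year of operation
japan_usage_of_covered = 0.10   # share of those reached actually using it

italia_subscribers = 8_000_000  # 3 Italia base
italia_tv_users = 800_000       # DVB-H users after one year

print(f"Japan: {japan_usage_of_covered:.0%} of the covered population")
print(f"Italy: {italia_tv_users / italia_subscribers:.0%} of subscribers")
# USA: under 1 million MediaFLO devices reported, but with no stated
# addressable base, no like-for-like percentage can be computed.
```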

Another more divisive question that is being asked is: can mobile TV be a compelling enough offering to drive volumes? The answer to this is certainly ‘yes', but with a number of minor caveats. Evidence from the many mobile TV trials across the world indicates that users will adopt mobile TV, but they expect high quality audio and video as a given. Users are also becoming more sophisticated thanks to the ever increasing range of content delivered in both linear and non-linear formats to PC and television screens. To be successful, a mobile TV service will need to offer a range of high quality content delivered as linear TV, video-on-demand TV, and even audio and video downloads, not just broadcast TV on a smaller screen. The success of the video iPod and the iPhone demonstrates the value of downloadable content, and Nokia are currently rolling out YouTube mobile in a similar vein. The technology to allow unicast, multicast and broadcast delivery is already available and is still improving. Each iteration of improvement will provide mobile operators and device manufacturers with the tools to make mobile TV more desirable to the market.

However, in order to reach ubiquity, there needs to be a large degree of interoperability and certainly a collaborative approach throughout the sector. Some form of open standards approach would certainly make this process of growth and integration far quicker, easier and more successful. However, in order to ensure that the proposed standard is developed and used to the benefit of the whole sector, it would be beneficial to have strong, impartial stewardship. Industry bodies such as the OMA and the BMCO forum are taking up that mantle, driving open standards forward and helping to ensure that the right tool sets are developed in order to deploy a system and provide clarity of cost, benefits and issues for potential users. Providing clear business guidance to potential mobile TV operators, through the availability of information regarding capex, opex and potential revenues, will help bring new players into the sector and help the whole industry in the longer term by increasing volume.

The other issue that must be considered is that all the discussion about standards is currently focussed on the generic mobile TV service. Of course, once that open platform is available, there is significant scope for operators to develop and deliver proprietary services based on those open standards. This second-stage development will give network operators and device manufacturers the opportunity to innovate, build their individual business cases and differentiate their offerings once the basics are in place.

In essence, the most important issue for any company involved in the mobile TV market is to ensure that its business and technology planning is focussed on the medium to long term and on the interoperability that will certainly be required to make mobile TV a global success. There might be a temptation to roll out a proprietary solution in order to kick-start a service and provide improved short-term revenue, but as the market develops, the benefit of having 10 per cent of potential subscribers locked in might switch to become the problem of having 90 per cent of potential subscribers locked out. While the size of the market is intrinsically more important than the standards being used, the paradox is that choosing an open standard will potentially help the market develop to its full potential, while use of a proprietary standard could lead to fragmentation. Given the choice, I believe that open standards will provide the best opportunities for the mobile TV sector, as they drive and promote interoperability and best-in-class solutions across the value chain. Open standards also drive competition, which fundamentally drives economies of scale and enables device manufacturers to build at a scale that proprietary systems don't allow. Either way, it is important to take a pragmatic view at this stage of mobile TV's development: as new services are rolled out, more handsets will become available and the potential viewing population will increase. Long may this continue!

John Maguire is General Manager Consumer Mobile at Silicon & Software Systems (S3)

As the deadline for EU Data Regulation compliance passes, and many member states opt to postpone, Ross Brewer warns telcos not to underestimate the amount of work involved in addressing the regulations

The original deadline for the European Union Data Regulations - namely September 2007 - has been and gone. Instead of telecommunications companies across Europe retaining data to support the crime-fighting efforts of regional security forces, most member states, including the UK, have chosen to postpone their application. It is expected that the laws in the large member states will start to be implemented from the beginning of 2008. If this is the case, large enterprises will need to have solutions in place around the middle of 2008.
While extending the deadline may buy additional time in which telecommunications companies can get their data retention house in order, the reality is that too many organisations are dragging their heels in addressing the regulation and have no appreciation of the sheer amount of work needed to ensure compliance. Industry estimates put achieving Data Directive compliance at anything up to 18 months. With this in mind, if organisations are to be fully compliant in time for the new deadline, they can't delay a moment longer.

The European Union (EU) formally adopted Directive 2006/24/EC on 15 March 2006.  The directive related to the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks.  In other words, communications providers need to retain - for a period of between six months and two years - all data that will allow relevant local authorities to trace and source communications to investigate, detect and prosecute serious crime. 
The directive - which applies to fixed telephony, mobile telephony, Internet access, Internet email and Internet telephony - covers every single aspect of a communication, including its source; destination; date, time and duration; the type of communication; the type of communication device; and the location of mobile communication equipment.
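In data terms, each communication event therefore maps onto a single retention record. A minimal sketch of such a record follows, with illustrative field names rather than the Directive's official terminology:

```python
# Sketch of a per-event retention record covering the categories the
# Directive lists. Field names are illustrative, not official terminology.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass(frozen=True)
class RetentionRecord:
    source_id: str                 # calling number, user ID or IP address
    destination_id: str            # called number or recipient address
    start_time: datetime           # date and time of the communication
    duration_seconds: int
    service_type: str              # fixed, mobile, internet access, email, VoIP
    device_type: str               # type of communication device
    cell_id: Optional[str] = None  # location of mobile equipment, if relevant

# One such record per call, text or session, retained for between six
# months and two years and retrievable on request.
record = RetentionRecord("447700900123", "447700900456",
                         datetime(2007, 9, 3, 14, 12), 95,
                         "mobile telephony", "GSM handset", "234-15-1010-77")
```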

In putting the regulations off until the last minute, organisations risk facing additional pressures in a bid to fast-track compliance within the imposed deadline - if they can achieve compliance at all. There is also the added risk that the crime-fighting abilities of regional security forces will be detrimentally affected, as the EU Data regulations have been designed to help the security services in the fight against crime.
With over 41 billion text messages alone sent last year, not to mention the millions of calls made every minute of every day, telcos will face enormous challenges.  These don't just relate to storing the data securely, but being able to, should an investigation be required, locate and retrieve the data as quickly as possible so as not to hamper proceedings. 
The amount of audit and logging information required by the Data Directive has the potential to overload the storage capabilities of even the largest and most technically savvy organisations. But in an ever more competitive industry, what can telcos do to put the necessary compliance solution in place without detrimentally impacting the day-to-day operation of the business in any way?

The good news is that the information that needs to be retained by the Data Directive already exists within the organisation in the form of log data.  This log data provides a fingerprint overview of every action that occurs across the enterprise and importantly, across a telco's network.  However, as it is generated at a rate of millions of bytes per second, being able to capture it to provide forensic reports across the enterprise in a timely fashion as mandated in Article 8 of the Data Directive will be a challenge. 

As such, when looking for a solution to manage this problem, organisations need to find a product that has its roots in scalability and can run a single search across all devices across the organisation to help minimise the impact of locating the data in the first place. 
This is where log management solutions can step in and provide a means of searching log data and producing reports.
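A toy illustration of that single-search idea follows, assuming logs have already been collected into a central archive with one file per device; real log management products build indexes rather than scanning files, but the principle is the same:

```python
# Toy version of one search run across logs collected from many devices.
# Real products index the data; a linear scan just makes the idea concrete.
import gzip
import re
from pathlib import Path

def search_logs(archive_dir: str, pattern: str):
    """Yield (device, line) for every archived log line matching pattern."""
    regex = re.compile(pattern)
    for path in sorted(Path(archive_dir).glob("*.log.gz")):
        device = path.name.removesuffix(".log.gz")  # one archive per device
        with gzip.open(path, "rt", errors="replace") as fh:
            for line in fh:
                if regex.search(line):
                    yield device, line.rstrip()

# e.g. locate every retained event involving one MSISDN across the estate
for device, line in search_logs("/var/log/archive", r"447700900123"):
    print(device, line)
```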

Steps to compliance
Installing an off-the-shelf compliance solution a couple of weeks before the deadline will not be sufficient.  Any solution will need to be tailored to meet the specific challenges of each organisation and go through a rigorous testing procedure to ensure that it is robust enough and capable of storing and retrieving information within the recommended guidelines.
Getting started with any enterprise-wide strategy for compliance requires an understanding of the requirements particular to each industry and business. Policies should then be put in place for collecting, alerting, reporting on, storing, searching and sharing data from all systems, applications and network elements.  This creates a closed-loop process that governs the life-cycle of enterprise data and ensures the compliance programme is successful.

It sounds an obvious first step, but without taking time clearly to understand the specifics of the EU Data Retention Directive, there is a risk that some controls or requirements may fall through the net, which will have serious implications further down the line.  For example, minimum requirements for the Directive include that the solution must be enterprise scalable and distributed; fault tolerant and ensure no loss of data; be able to prove that the logs are immutable so that they can be used in a court of law and be able to produce forensic reports across the enterprise in a timely fashion.

Once the specifics of the Directive have been clearly understood, the next step is to put in place the IT controls and frameworks to help govern compliance tasks and keep the business on track for complying with the mandate.  

Goals should then be defined and key tasks for successful compliance identified, agreed and set.  Then specific tasks relating to each goal can be set.  Once these tasks are complete, configuration of network elements, systems and applications can then be addressed. 
Alerting mechanisms and scheduled reporting will advise IT personnel when any part of the solution isn't complying with the policies.  Early reporting of any problems will ensure that they can be addressed in a timely fashion with minimal impact on the rest of the operation.  Alerts and schedules can also demonstrate compliance to auditors.

Alerting and reporting on logs must be substantiated with immutable log archives.  It's therefore critical to store logs centrally with a long-term archival solution that preserves the integrity of the data, as required by the EU Data Directive.  Immutable logs require time stamps, digital signature, encryption and other precautions to prevent tampering - both during transit of the data from the logging device to the storage device, as well as during archival - if they are to stand up as evidence in any legal proceedings.
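A hash chain is one standard way to make that immutability concrete: each entry records a timestamp and the hash of its predecessor, so altering any entry breaks every later link. Below is a minimal sketch using only Python's standard library, with an HMAC standing in for the full digital signature a production archive would use:

```python
# Minimal sketch of tamper-evident logging: each entry carries a timestamp
# and a hash chained to its predecessor, so any edit breaks every later
# link. An HMAC stands in here for a full digital signature.
import hashlib, hmac, json, time

SIGNING_KEY = b"demo-key-held-by-a-separate-system"  # illustrative only

def append_entry(log: list, message: str) -> None:
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"ts": time.time(), "msg": message, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    body["sig"] = hmac.new(SIGNING_KEY, digest.encode(), "sha256").hexdigest()
    log.append(body)

def verify(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        body = {k: entry[k] for k in ("ts", "msg", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        expected_sig = hmac.new(SIGNING_KEY, digest.encode(), "sha256").hexdigest()
        if (entry["prev"] != prev or digest != entry["hash"]
                or not hmac.compare_digest(entry["sig"], expected_sig)):
            return False
        prev = digest
    return True

log: list = []
append_entry(log, "call record retained: 447700900123 -> 447700900456")
assert verify(log)
log[0]["msg"] = "edited"   # any tampering...
assert not verify(log)     # ...is detected
```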

It's easy to view the EU Data Directive as yet another piece of Brussels bureaucracy but, unlike many other regulations, this Directive isn't about preventing or identifying financial irregularities within big business. Instead, the EU Data Directive has been designed to help the ever more complex fight against crime - at an individual country, European and global level.
Telcos shouldn't feel daunted at the compliance task that lies ahead of them, after all, they already hold all of the data that is needed by the EU Directive.  Instead of viewing compliance as an isolated IT project, they should instead look upon it as a business issue that requires a cross-functional approach, involving people, processes and technology across the enterprise.  Taking the steps necessary to understand, define and implement the appropriate IT controls and frameworks for the business will simplify compliance and reduce the costs and resources involved in completing compliance related tasks in line with the Directive's deadline.

Ross Brewer is Vice President and Managing Director, EMEA, LogLogic

The GSMA's Mobile World Congress 2008 runs from 11th to 14th February in Barcelona, aiming to - once again - prove its value as the leading event in the mobile industry

With the growing convergence of media, communications, entertainment and information very much on the minds of all those involved in the currently separate industries (though, in the words of Keith Willetts, TM Forum Chairman, each is in the process of eyeing up the other's lunch), the GSMA's Mobile World Congress is clearly intending to hit all the right buttons, with all the right buzz words.

Noting that mobile is now about much more than simple voice calls, but is a social and economic phenomenon, providing new channels and opportunities for information, entertainment and the Internet generation, GSMA has declared the four major themes of the conference to be ubiquitous networks, the over-the-top service challenge, mobile's social and economic value, and the mobile digital content revolution. It thus clearly aims to ensure that Mobile World Congress 2008 safeguards its reputation as the largest and most significant global mobile event. The organisers further declare: "The structure of the upcoming Mobile World Congress is being developed to reflect the ongoing changes in the mobile value chain. This programme identifies the risks that will be taken and highlights the rewards that can be reaped in bringing these changes to all geographic markets and mobile services."

Under this umbrella, the event will tackle a variety of the issues currently dominating telco thinking and concerns.  The convergence of fixed and mobile communications, for instance, may well have a certain natural evolution, but is it inevitable, real or even desirable?  The MWC session will look to unearth what FMC means for leading operators and define the role it will play in their business development.

While the ‘Society on the Move' session will explore the way in which mobile has integrated into modern culture, recognising the growing importance of mobile across society, the commercial aspects of the market are very much to the fore. So, within the Mobile Entertainment Summit, for example, one session will address the proposition that mobile advertising is essential to unlock the potential of mobile entertainment - some believe that advertisers will jump at the chance to spend billions in the mobile arena, while others believe that users will simply turn off. The session aims to put the opportunity into context and uncover what it will take to deliver on the mobile advertising promise. Under the same Summit, the conference will also address the view that the youth demographic is the darling of the mobile world. Quick to take up new services and open to new technologies, the youthful user is every operator's target - but is this reputation justified?

On the technology front, the conference will look at such areas as HSPA, under the title ‘The story continues with HSUPA and HSPA Evolved', asking: while HSDPA is already proving a huge commercial success on a global basis, what is the evolutionary path for this mobile broadband technology? The session will aim to provide operator insight into deploying HSUPA and the move to HSPA Evolved. Staying in the technology arena, ‘LTE - defining the future' recognises that while standards are quietly being finalised, speculation and rumour pervade. The session will provide insight into the reality of LTE and its prospects against alternative future mobile broadband technologies.

Noting that mobile penetration has reached over 100 per cent in 38 countries to date, and many more are high in the 90s, the focus of the keynote session on building ubiquitous networks centres on the recognition that while there may be a mobile phone in everyone's pocket, mobile isn't everywhere and the challenge is to take high speed, high capacity mobile services everywhere.  It will examine the development and rollout of ever-more capable networks and their convergence with fixed, to deliver ubiquitous services.
And while MWC recognises that the mobile industry is unrivalled in its technical excellence (well, up to a point, Lord Copper), it all too often forgets that the success of its achievements rests on the experience the user gets when he first switches on. The ‘It's the user experience, stupid!' session highlights the importance of putting the user first and the developments that can set consumer imagination racing.

The highly topical subjects of VoIP and Mobile TV must also, of course, have a slot at the conference.  In the case of the former, the ‘VoIP - coming ready or not!' session works on the basis that VoIP has the promise to revolutionise the cost base of voice communications, but will it, and how will VoIP impact the business of mobile operators?  In the case of Mobile TV, it is noted that the application has had a stuttering start, but that, as it finally begins to reach commercial deployment, the race to find the killer business model is replacing the technology debate.  The ‘Mobile TV - taking it to the masses' session looks at the realities of rollout and the business impact of technology decisions.

The above, of course, is a mere snapshot of all that Mobile World Congress 2008 aims to be.  The event also offers both visual and hands-on back up to theories and discussions in the shape of the many products and services on display at the concurrent exhibition.  And, most importantly, it is the king and queen of networking opportunities - of the business and social kind, of course.

A new breed of service providers is helping to reverse the fortunes of some of Europe's most influential mobile companies - by helping them manage their customer lifecycles far more effectively, explains Georgia Hannias

Traditional marketing is no longer a viable communications tool in the mobile telecoms world.  Glossy adverts in the papers, mass-marketing direct mail and telemarketing are not winning customers' loyalty - especially when every operator seems to be offering the same thing.  In this confusing climate of choice, even a great product or service isn't making the real difference to business growth.  What customers are looking for is an operator who can understand their preferences and ultimately tailor their services to give them what they want, when they want it.

Understanding the needs of the customer is what Customer Lifecycle Management (CLM) is all about. This communication trend focuses on managing customer behaviour by finding a way to create a bond between the operator and the customer throughout the customer's lifecycle. Once a one-to-one communication is created, consumer behaviour can be monitored, so that companies know exactly what their customers want and can subsequently provide a tailored service to meet individual needs.  This is usually the first step forward in successfully delivering targeted services and, most importantly, building a strong, long term relationship between the service provider and the customer.
 "Customer lifecycle management has a future in mobile telephony - especially in Western Europe where there is a blurring of fixed and mobile services," explains Rob Bamforth, Principal Analyst at Quocirca. "This convergence is eating away at revenue generated by traditional services such as voice calls. To find new ways of earning income and to keep customers happy, operators have to differentiate themselves from their competitors- and to do so in a way that makes a subscriber feel valued and not just a number. CLM can help achieve this by enabling an operator to target an individual person."

Marketing departments are all too aware of this demand for more personalised services and of how customer lifecycle management can help achieve this goal. More importantly, they can usually produce the creative ideas that are needed to develop the campaigns that will engage customers. The main problem, however, is that the internal complexities involved in executing new programmes usually prevent them from becoming a reality. IT departments tend to place more emphasis on the OSS and BSS systems and the call centre side of a business, which means that marketing's requirements almost always take a back seat internally.

Consequently, campaigns that should take only weeks to complete and launch can take months, and many compromises are usually made to get a programme out of the door - leaving marketing departments frustrated by the fact that they haven't quite delivered the programme they wanted.

Fortunately, there is a new way of simplifying the execution and enhancing the flexibility of personalised marketing campaigns known as Marketing on Demand. Simply put, Marketing on Demand is a software-as-a-service model that removes the internal complexities involved in launching new campaigns.  It enables marketing departments to be more focused on the programmes they want to deliver, leaving the technical execution to a team of external specialists. 

With the availability of marketing on demand solutions, marketing departments can choose to implement customer retention programmes quickly and easily without the need for a long-drawn-out IT project. This brings a whole new dimension to the way marketing departments can control and execute their marketing campaigns. It provides the flexibility to capitalise on existing customer data, regardless of where it is stored within the organisation, as it is fed to a hosted solution which is capable of automating all outbound and inbound interactions in real time. This enables operators to manage customers on a one-to-one basis so that a relevant and consistent communication journey begins and continues throughout the customer's lifecycle. Best of all, marketing on demand means that any number of programmes can be launched in one go, thereby allowing operators to execute and manage a combination of programmes without any delay or restrictions.

The software-as-a-service (SaaS) delivery model is already driving major growth in the customer relationship management applications market. According to analyst Gartner, mobile operators and other enterprises continued to invest in front-office applications last year, with worldwide CRM software revenue exceeding $7.4 billion (£3.7 billion) in 2007, up 14 per cent from $6.5 billion (£3.2 billion) in 2006.

By year-end 2007, SaaS represented more than $1 billion in CRM software revenue. "The sustained performance of major on-demand solutions providers is driving the growth in the SaaS segment," says Sharon Mertz, Gartner research director.  Mertz also explained this method of delivery was likely to become the dominant one in this market by 2011, as companies updated existing business automation systems to ensure they could meet expected targets for business and revenue growth.

"Marketing on demand deployment models are helping communications providers use their CRM systems as a strategic tool to gain competitive advantage, maximise loyalty, reduce churn and increase ARPU throughout the customer lifecycle," says Mikko Hietanen, CEO of Agillic.  "Customer Lifecycle Management solutions take the CRM SaaS model one step further, by providing communications service providers with an application that enables real-time, multi-channel, cross channel dialogues driven by the customer.  What this helps to achieve is the delivery of a true and consistent understanding of how an individual interacts with their service provider."

Software systems like those offered by Agillic also help communications service providers and their marketing agencies to work even more efficiently, as they are programmed to execute automatically the business rules required to reach customers across all touch points, from printed direct mail to digital media and customer services.

"Agillic is one of the first companies I have heard of that focuses exclusively on customer lifecycle management," says Joel Cooper, Senior Analyst, Europe at Pyramid Research. "In highly saturated telecoms markets, what Agillic offers is the kind of thing that operators would be advised to adopt. As customer growth dries up, customer lifecycle management is a good starting point in terms of better understanding exactly what the customer wants."
Agillic has many years of experience of deploying marketing on demand business models for its customers.   The general principle is that marketing on demand must keep the co-ordination and collaboration of any project focused on the business value rather than the technology. Achieving this requires the assurance that promotional campaigns are set up correctly with the right messages - which is usually the biggest and most costly challenge for most mobile operators today.

Communication service providers can select from a range of best-practice marketing concepts that are already successfully deployed and generating positive results. Consequently, marketing does not have to reinvent the wheel but can instead benefit from a quick route to market by simply adapting proven programmes with their own branding and business rules. As programmes develop, the marketing department can easily monitor their positive and negative effects and make immediate alterations in real time, so that campaigns are not held up by time-consuming code changes. They also benefit from having access to advisers who can guide marketing on the programmes that will best match their most pressing business objectives.

"Most operators already possess existing data about customer preferences and usage patterns, which can be uploaded to the CLM solutions to execute meaningful, customised and timely marketing programmes crossing all customer touch points," says Mikko Hietanen. "More importantly, these communications are only received through channels that are agreed to by the customer, which keeps them happy and strengthens relationships even more. This is something that CRM could never provide as the human dimension does not exist."

Georgia Hannias

Operators are increasingly turning to picocells to improve network quality and control infrastructure costs, says Chris Cox

Operators have been struggling with coverage problems since mobile communications were invented. But massive recent investment in 3G licences and infrastructure has changed the capex and opex cost equation, affecting the performance of the companies that sell equipment to operators. As this article was being written (in autumn 2007), for example, Ericsson, Nokia Siemens Networks and Motorola made a series of anaemic financial announcements, and all three blamed a slowdown in operator infrastructure spending.
While much of the heat is generated by the spiralling cost of implementing 3G, operators are also increasingly cautious about their 2G commitments.

To cope with sustained customer demand for GSM services, operators are looking for ways to keep further investment in additional expensive GSM infrastructure under control. Every operator is actively looking for smarter, more cost-effective methods to upgrade existing networks and yet still deliver additional coverage and capacity to their customers, exactly where it's needed.

Introducing picocells
Many are increasingly turning to picocells.  Picocells are very small base stations that deliver a mobile signal in areas of high mobile traffic (such as in offices or busy public locations) or poor network coverage (such as indoors) and use an existing broadband connection for backhaul.  They are proving to be very much more cost-effective than the alternative of upgrading the macro network.

Operators originally deployed picocells as a "band aid" for coverage black spots. However, as the technology has matured, operators are increasingly seeing picocells as an important mechanism for improving their macro network capacity and performance. And, perhaps most interestingly, they're beginning to be used in one of the most competitive battlegrounds of all: picocells enable operators to use the promise of great service quality to attract (and win) lucrative business customers from incumbent rivals. 

Every network has black spots where coverage is marginal or non-existent. In areas with marginal coverage, service quality inside buildings can drop off sharply, resulting in dropped calls, ‘network busy' signals, slow data rates and poor voice quality.
The first major application of picocells was to address this issue of signal quality in buildings.
For enterprises, coverage in the workplace is a major driver of dissatisfaction with operators - this is a consistent finding in all global regions. And where coverage is an issue, it's the dominant issue and is a key reason for churn (along with price and handset choice). Critically, coverage is often the primary differentiator between operators - and the decisive factor in awarding contracts.

The traditional solution to in-building coverage problems has been the repeater. But today, planners aren't so quick to turn to repeaters to fill black spots or penetrate buildings.
Repeaters are to network planning what the rock was to the caveman: simple and ubiquitous, but not the most sophisticated solution around. Until picocells arrived, however, network planners generally accepted them, because the alternatives always looked more expensive and difficult to deploy.

While repeaters extend coverage, they drain capacity from the cell in which they operate. Picocells, by contrast, add both coverage and capacity, as well as the ability to enhance data rates. In addition, repeaters can distort the macro cell, causing interference and handover issues and creating severe radio planning problems. Picocells integrate seamlessly into the macro network.

Repeaters can be difficult and time-consuming to install and they're also problematic to manage (they don't offer automated fault reporting, for instance). Picocells can be installed in a few hours and offer integrated fault and performance monitoring.
At the beginning, some operators with difficult coverage challenges shied away from picocells because they felt a little uncomfortable with the need to use IP for backhaul, an unfamiliar protocol for network planners used to more established ways of doing things. As IP has become ubiquitous, however, that resistance has melted away.

Providing good service means always having sufficient capacity available. But avoiding ‘network busy' errors in commercial centres and large cities is becoming more difficult as usage levels increase, driven by competitive voice tariffs and attractive new data services.
Subscribers are spending more time on the network doing new, more bandwidth-intensive things. That may affect the quality of service operators provide to premium customers, like BlackBerry and other PDA users, who expect to be able to access services whenever they want. Providing the right level of capacity is tough in densely populated areas and it's limited by the spectrum available to an operator.

Simply adding new macro cells - even if they're micro base stations - is expensive and time consuming. And public opposition to the introduction of more and more radio masts is increasing around the world as well, even if good sites can be found.
An operator's lack of capacity is not only a churn driver but also a brake on new service uptake.

Picocells offer the possibility of highly targeted deployment, rapid rollout and limited impact on the macro network in terms of interference and the requirement for network planning. Each base station is inexpensive and a single base station controller can handle a hundred base stations.

ip.access undertook some business case research recently (see the Case for Picocells: Operator Business Cases at www.ipaccess.com). In one scenario, where the goal was to offload 60 per cent of the indoor mobile usage of 7000 users over a two square mile area, using picocells was 53 per cent cheaper than upgrading the macro network (each of those users was estimated to use 800 voice minutes and 5MB of data per month).
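
To make the arithmetic concrete, here is that offload calculation sketched in Python. The usage figures come from the scenario above; the two deployment cost figures are purely hypothetical placeholders (not ip.access's numbers), chosen only so that the comparison lands at the published 53 per cent saving.

# Offload arithmetic for the scenario above. Usage figures are from the
# article; both cost figures are hypothetical placeholders.
USERS = 7000
VOICE_MINUTES_PER_USER = 800   # indoor voice minutes per user per month
DATA_MB_PER_USER = 5           # indoor data per user per month (MB)
INDOOR_OFFLOAD = 0.60          # share of indoor usage moved to picocells

offloaded_minutes = USERS * VOICE_MINUTES_PER_USER * INDOOR_OFFLOAD
offloaded_mb = USERS * DATA_MB_PER_USER * INDOOR_OFFLOAD
print(f"Offloaded per month: {offloaded_minutes:,.0f} minutes, {offloaded_mb:,.0f} MB")

MACRO_UPGRADE_COST = 2_000_000  # assumed cost of upgrading the macro network
PICOCELL_COST = 940_000         # assumed cost of the picocell deployment

saving = 1 - PICOCELL_COST / MACRO_UPGRADE_COST
print(f"Picocell saving vs macro upgrade: {saving:.0%}")  # 53% with these inputs
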
Enterprise market capture
As picocells become more and more ubiquitous as the solution to difficult coverage and capacity challenges, more far-sighted operators are beginning to see how to use the technology to attract and retain lucrative business customers.
In Sweden, for example, Spring Mobil is the fourth GSM operator, having won a license in 2003. It aims to replace fixed telephony in the office using nanoGSM, and has already recruited over 500 enterprise customers away from fixed lines.
Spring can provide fast, low-cost coverage and capacity where enterprise customers need it most. It can sell more minutes while supporting its best customers with the most modern services. The solution reduces churn and drives traffic from fixed lines to mobile networks.
Picocells are an emerging GSM infrastructure play for many operators. They are being used to help operators:

  • Differentiate from commodity networks
  • Increase revenues from voice and data
  • Offer competitive in-office tariffs
  • Decrease churn

Operators have spotted that, with picocells, they can sell new services while improving macro cell performance, and without over-spending on infrastructure, because picocells change the approach to ‘Pinpoint Provision' - adding coverage exactly where it's needed.
Picocells are a proven, end-to-end solution, carrying billions of minutes of traffic every year in dozens of operator networks around the world.
The future looks bright for picocells.

Chris Cox is Marketing Manager, ip.access

Roaming fraud is now of such concern to operators that the GSM Association has developed a new set of fraud detection standards for its operator members. Eugene Bergen Henegouwen details the development, and urges operators to implement compliant solutions today

Some roaming fraud is a fact of life for many mobile operators and, at its worst, has the potential to cause great harm to the bottom line. This type of illegal behaviour seems to affect operators most acutely where the borders of rich countries meet those of poorer ones. In one form of fraud, SIM cards are cloned by well-organised and technically savvy criminals, who then sell them to consumers who most likely have no knowledge that the cards they're using are illegal. Eventually, the cards are deactivated by the operator, but by then the fraud has been in place for some time and money has been lost.

The criminals who participate in this type of fraud are successful because fraud detection and resolution is hindered by the length of time it can take for a visited operator to notify the home operator of the roaming activity that has been taking place. Currently, data records that track subscribers' activities as they roam are exchanged with the home operator within about 36 hours. This leaves a big window of opportunity for criminals who often target their activity during weekends and holiday times when operators are most vulnerable to fraud.
Roaming fraud - when it does hit - hits hard. It is the equivalent of leaving for a weekend trip and returning to find the whole house flooded even though you only had a dripping tap when you left. Any loopholes are potentially exploited by criminals who will strike decisively and "flash flood" what they can in a short timeframe because they know the extent of the capabilities - and limitations - of the current reporting standard. Clearly, now is the time to form a more watertight seal to combat this fraud.

Roaming fraud has become such a concern for operators that the GSM Association (GSMA) has developed and mandated a new set of fraud detection standards for its operator members. The Near Real Time Roaming Data Exchange (NRTRDE) initiative has been developed in specific response to International Revenue Share Fraud (IRSF), which thrives on the clone-and-sell technique.

The new NRTRDE standard will dramatically reduce the record exchange time and make fraud easier to spot and stamp out. It's not that operators don't have systems in place to fight the risk of roaming fraud themselves, but the current High Usage Report (HUR) system is outmoded. NRTRDE has been developed in its place as the next generation of roaming fraud protection in an attempt to effectively cut fraudsters out of the picture and restore the level of security that operators need to operate successfully and - most importantly - profitably.
According to a GSMA survey of 37 operators some months ago, roaming fraud losses affect networks of all sizes and in all regions. In one instance, a single operator is reported to have suffered losses of €11.1 million in just under two years - one more piece of evidence that IRSF has grown into a costly concern. This may not sound like a huge figure, given that operator turnover can often be in the multimillions or even the billions. But when you take into account that it is all lost from the profit line, not the turnover line - and can never be invoiced - the percentage grows to an uncomfortable level.

In the view of many analysts, this form of fraud has grown to the point where it is not merely a minor irritant. Martina Kurth, principal research analyst for Gartner Group, recently acknowledged that "roaming fraud is a very real and present danger for operators the world over, and it is impacting their bottom lines. Any initiative that enables operators to minimize roaming fraud is, therefore, a strategic business issue on which operators must act if there is not to be further erosion into their profitability."

The GSMA NRTRDE initiative aims to replace HUR to keep operators ahead of the growing sophistication employed by the fraudsters. With NRTRDE, the visited network is required to forward Call Data Records (CDRs) to the subscriber's home operator within four hours of the call end time. If the visited operator is unable to get this information to the home operator in time, the visited operator is held liable for any fraud associated with those calls, and so there is a greater degree of motivation for all parties to make the measure successful.
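
The liability rule is simple enough to express in a few lines. Here is a minimal sketch in Python, with illustrative function and field names (not drawn from the GSMA specification):

from datetime import datetime, timedelta

# NRTRDE: the visited network must deliver a roaming CDR to the home
# operator within four hours of call end; miss the window and liability
# for any fraud on those calls shifts to the visited operator.
NRTRDE_WINDOW = timedelta(hours=4)

def visited_operator_liable(call_end: datetime, delivered: datetime) -> bool:
    """True if the CDR arrived outside the four-hour NRTRDE window."""
    return delivered - call_end > NRTRDE_WINDOW

call_end = datetime(2007, 10, 6, 23, 15)
delivered = datetime(2007, 10, 7, 4, 30)             # five and a quarter hours later
print(visited_operator_liable(call_end, delivered))  # True: window missed
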
This approach toward fraud is particularly important in garnering the support of the whole industry of operators because without a shift in responsibility to the visited operator, the motivation to adopt the new standard would be limited. NRTRDE is making operators more accountable for the behaviour of visiting subscribers, and where once it was hard to enforce anti-fraud protection, operators now are much more motivated to work together.
Once the home operator has received the CDR, it can detect fraud via a fraud management system. There is a record format for exchanging these near real-time records, which has been defined by the GSMA's Transferred Account Data Interchange Group (TADIG) in the standards document known as "TD.35". Syniverse has played the lead role with the GSMA in developing the standards which support NRTRDE, serving both as chair of TADIG and as the official author and editor of TD.35, the fundamental building block of anti-roaming-fraud resolution.

The adoption of NRTRDE is expected to reduce the incidence of roaming fraud by up to 90 per cent because of the closing of the roaming fraud window currently open on operator networks that use HUR. Just as importantly, NRTRDE offers operators a far more accurate and timely view of how their networks are performing against fraud.
Rather than waiting until October 2008, the date at which GSMA members are required to implement the new standard, both the GSMA and Syniverse are urging operators to implement compliant solutions today. We believe that as the adoption process gains momentum, operators who continue to rely on HUR will eventually become disadvantaged and may be prone to more attacks.

Syniverse has been taking part in GSMA trials to ensure that our solution, as well as those of other providers, has the interoperability needed for a global industry. All of the trial work undertaken thus far means operators can begin rolling out solutions ahead of the deadline and, in many cases, form a watertight boundary that excludes fraudsters and lets the operator realise the cost benefits sooner rather than later.

The GSMA's October 2008 NRTRDE implementation target date is drawing closer by the day, and adoption is gaining momentum in the industry. Moreover, with the cost of roaming dropping in Europe for subscribers, it's reasonable to expect that the amount of roaming traffic will increase, making fraud patterns harder to spot and heightening the risk for those relying on HUR. In partnership with the GSMA, we believe the time to start forming a more watertight seal against sophisticated fraud is clearly now, as the industry moves to protect not only operators' revenues but also the future of roaming for subscribers worldwide.

Eugene Bergen Henegouwen is Executive Vice President, EMEA, for Syniverse Technologies

WiMAX is finally making its way into the mainstream telecoms market as WiMAX operators around the world begin to roll out their services. Previous obstacles to launching their ventures, such as obtaining the necessary spectrum licenses, deploying mobile WiMAX infrastructure, or selecting the right vendor, have either been overcome or are about to be resolved. WiMAX is no longer a hyped-up term being tossed around in the telecoms industry. It is now a reality, and operators are testing their WiMAX capabilities in real-life environments, with paying subscribers, on loaded networks. David Knox assesses the different types of entrants in the WiMAX market, and reveals the many charging challenges the new technology faces and the ways in which operators can overcome them

According to a recent report from the Gartner Group, revenue from sales of WiMAX equipment will grow to more than $6.2 billion by 2011, and global connections will reach 85 million in the same year. The current scramble to roll out WiMAX around the world provides plenty of evidence to back up this forecast, as operators get ready to launch the next big thing in global communications.  Best of all, WiMAX is being embraced by developed and developing nations alike - creating endless business opportunities for vastly different economies.

Paving the way for WiMAX's entry into the mainstream is a new breed of dedicated WiMAX service providers that are starting to successfully target high density, high usage metropolitan areas in various parts of the world, as well as rural communities where there is no access to fixed line broadband or 3G.

In the developing world, telecoms provider Wateen Pakistan has successfully deployed its WiMAX network in 17 major cities across Pakistan, including Islamabad, Karachi and Lahore. The success of the project demonstrates how the cost effectiveness and speed of deployment offered by WiMAX is allowing competitive carriers to quickly build a wireless broadband network. It also shows how a developing economy can immediately embrace a new, innovative next-generation technology, and smoothly deploy a cutting edge communications infrastructure.

Closer to home, UK companies like The Cloud, Europe's largest Wi-Fi operator, are making positive inroads into the WiMAX space. The Cloud has become the first to offer the service in London's financial district, the City of London. Heralded as Europe's most advanced WiMAX roll out, the venture has given 350,000 workers and thousands of visitors the chance to get broadband access anywhere within "the square mile".

The success of this project has led to more successes for The Cloud, including its hotspot access deal with McDonald's, which recently rolled out free high speed wireless Internet access across almost 1,200 restaurants in the UK, making it the UK's biggest provider of free wireless Internet access. The Cloud also signed a major new deal with the BBC, which became the first UK broadcaster to have all its online content made available for free via Wi-Fi. This latest venture enables the public to access all bbc.co.uk content for free through the UK's largest network of hotspots, operated by The Cloud.  The 7,500 hotspots are located at various locations across the UK, including McDonald's, Coffee Republic, BAA airports (Heathrow, Gatwick and Stansted) as well as a number of outdoor locations including Canary Wharf and the City of London.

Dedicated WiMAX providers are not the only entrants in this burgeoning market. Established fixed line telecommunications providers with no mobile arm - such as the UK's BT - are also trying to get into the WiMAX space to complement their fixed line broadband offerings and to compete with the likes of 3G for high-speed data access. Established mobile service providers are also trying to muscle their way into the market, especially in regions where there is strong demand for high speed data but where 3G is not a feasible option, due to higher infrastructure deployment costs or geographical difficulties. A good example is the Caribbean's leading GSM operator Digicel, which recently rolled out WiMAX in the Cayman Islands. The existing broadband offerings in the country rely heavily on fixed lines, making broadband expensive and limiting choice for many consumers and businesses. Digicel therefore used WiMAX as an opportunity to create competition by offering consumers a better solution at a lower cost.

Each of these types of WiMAX market entrant will have many challenges to overcome before it can achieve profitability - and each will also have different requirements for WiMAX charging, depending on its infrastructure and the BSS/OSS systems it already has in place.

One of the main challenges presented by WiMAX in terms of charging will be finding a means to perform user authentication in real time for home and roaming subscribers. Another will be enabling subscribers to roam on other WiMAX networks while still using the same authentication mechanism and balance information from their home network. In other words, operators will need to find a way to offer customers just one account with their "home" WiMAX provider, without requiring them to worry about multiple sign-ons or topping up balances with multiple providers.
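
By way of illustration, here is the single-account idea in miniature, sketched in Python: whichever network the subscriber attaches to, authentication and the balance check are referred back to the home provider in real time. All names and data here are hypothetical.

# Hypothetical home-provider account store, keyed by subscriber identity.
HOME_ACCOUNTS = {
    "user@homewimax.example": {"authorised": True, "balance": 12.50},
}

def authenticate_roamer(subscriber_id: str) -> bool:
    """Visited network defers to the home account store in real time."""
    account = HOME_ACCOUNTS.get(subscriber_id)
    if account is None or not account["authorised"]:
        return False
    # One account, one balance: no separate sign-on or top-up with the
    # visited provider is needed.
    return account["balance"] > 0

print(authenticate_roamer("user@homewimax.example"))  # True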

Having real-time access to customer, pricing and product information which may be stored "off-board" in existing or legacy systems will be another essential requirement for WiMAX providers, so that they can maintain a real-time view of the customer and therefore charge for and control the service in real time while providing a positive user experience.
Being able to enforce post-paid credit limits in real time in order to reduce windows for fraud and exposure to bad debt will also be crucial for WiMAX charging. This is a very important issue to address, since fraud and bad debt are now considered the largest areas of revenue leakage for telecom operators. According to a new survey published by UK research firm Analysys, average fraud losses resulting from all types of fraud, including external fraud, internal fraud and fraud by other operators, have grown from 2.9 per cent of operators' total revenue last year to 4.5 per cent this year, and this is expected to increase with the rise in popularity of data and Internet services. WiMAX providers must therefore find a way to protect themselves from this kind of revenue leakage if they want to manage short-term as well as long-term profitability.

The flexibility to offer pre-paid charging will be another challenge for WiMAX. Initially many WiMAX users will be corporate clients, so they will expect to be on post-paid deals, but they will also need the reassurance of not having to worry about exceeding spending limits. A real-time convergent charging solution can significantly enhance the flexibility of pricing plans and allow users to be automatically switched over from a post-paid to a pre-paid payment mechanism if these user-definable spending limits are exceeded. Instead of being an impediment to new service rollouts, the right charging solution will be able to drive change with the fast launch of new pricing strategies - regardless of what type of WiMAX contract a user is on.
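
The switch-over logic reads roughly like the Python sketch below; the account fields and figures are illustrative assumptions, not any vendor's actual implementation.

def charge_session(account: dict, session_cost: float) -> dict:
    """Rate a session in real time and flip the payment mechanism if the
    user-defined spending limit is breached."""
    account["spend"] += session_cost
    if account["mode"] == "post-paid" and account["spend"] > account["limit"]:
        account["mode"] = "pre-paid"  # further usage now draws on top-ups
    return account

acct = {"mode": "post-paid", "spend": 95.0, "limit": 100.0}
charge_session(acct, 12.0)
print(acct["mode"])  # "pre-paid": the limit was exceeded and the switch made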

Fortunately for WiMAX service providers, the market already boasts the right technology to ease these rating and charging challenges.  Among the rating and charging specialists in this field is VoluBill, which offers a comprehensive range of WiMAX solution capabilities - whether simply WiMAX charging, or a complete WiMAX solution including integrated customer care, web self-care, billing and voucher management.
We also offer flexible deployment options for the solutions, making it possible to start with a limited scope and functional footprint and to expand the scope of the solution as business requirements demand.

All eyes are on WiMAX as it brings the reality of truly portable high-speed data access to the mainstream public. According to a recent report from Informa, revenues from mobile broadband services will generate more than US $400 billion globally by 2012, giving WiMAX the potential to be one of the most profitable innovations in telecoms history.

The next big step that WiMAX providers must take is to invest in solutions that offer flexible charging and control capabilities. This will ensure that operators will be able to maximise their financial and business potential while bringing a service that is not only enjoyable to the customer but affordable as well.

David Knox is Product Marketing Director at VoluBill

Opting for a managed services solution provides telcos with enhanced agility in a highly competitive marketplace, claims Dominic Smith

Being able to adapt to market pressures and respond quickly is obviously key to success in today's highly competitive telecoms landscape. And yet many telcos are weighed down by the sheer scale and complexity of their technological and service infrastructures.
In the mobile domain, operators' portfolios typically include 2G GSM, SMS, MMS, GPRS, 3G and HSPA basic services, not to mention the range of value-added services, content and applications accessible on top. In addition, in all market sectors, operators have to maintain and bill for a broad array of legacy services. And they need to be able to tailor their offerings to meet the specific needs of a wide variety of market segments - from large enterprises to individual consumers.

Today, some operators are making the mistake of trying to be "all things to all people". They are looking to provide customers with a complete portfolio of converged services including broadband, mobile and fixed communications solutions. The problem is that, by doing so, it becomes increasingly difficult for these operators to meet the needs of all of their customers.

To compete effectively, they need to be agile, able to focus on customer requirements and efficiently deliver the solutions that their customers will actually benefit from. However, with the often onerous requirement to manage and maintain an intricate network of products, services and applications, agility can seem a highly elusive quality.

In this context, it is hardly surprising that telecoms operators are increasingly interested in exploring the possibility of outsourcing their CRM and billing systems to third party solutions providers and, by so doing, freeing themselves up to focus on their core business. 

Steady market growth
Cambridge-based research firm Analysys expects the Western European market for outsourcing of technology and customer services by telecoms operators to show six per cent annual growth between 2005 and 2010, rising from €5.9 billion to €8.0 billion. Our own experience at Cerillion indicates that operators' appetite to outsource business support systems for customer management, order management and billing is on the increase.

Cerillion's show-floor survey, carried out at Barcelona's 3GSM World Congress in February, found that 50 per cent of respondents thought operators were more open-minded about outsourced billing than a year before; just 15 per cent said they were less so.
To underline this positive mood, major new contracts are regularly reported in the media. In recent times, one of the most notable was the March 2007 announcement by IBM Global Services that it had won a 10-year deal with Indian operator, Idea Cellular. Under the terms of the contract, IBM is helping to handle services like billing, revenue assurance, credit collection and subscriber management.

A diverse market
One of the most important advantages of the managed service approach for CRM and billing systems is that it can benefit a wide range of operators, working on a broad array of projects. An operator undergoing a large-scale business transformation project, for example, may benefit from a managed service approach to ensure it remains competitive and retains sufficient agility to be able to launch new products and services for the project duration. 
A telco looking to establish itself in an emerging market may seek to put a managed service into operation while it is focused on bringing new people on board and training them up, before ultimately transitioning to an in-house managed solution.

Alternatively, an operator may take a long-term strategic decision not to manage its own CRM and billing systems but to hand that role over to a provider with expertise in the field, leaving the operator itself free to focus on delivering a high quality customer experience. Again, the ultimate goal is enhanced business agility.      

Putting the customer first
This focus on the customer is important. After all, it is customers that will ultimately have to pay to allow operators the luxury of owning and managing their own business support systems. It is often overlooked, but perhaps the most important single benefit operators can achieve from outsourcing their systems is the cost saving that can be passed onto customers.

When purchasing systems, operators typically incur significant upfront capital expenses before they begin to reap benefits. With a managed service model, the entry barrier is much lower. While the operator still has operational costs to take into account, those costs will usually be lower and more predictable than with a traditional licensed implementation.
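
A toy comparison makes the point; every figure in the Python sketch below is invented for illustration, and it is the shape of the two cost curves, not the numbers, that matters.

# Hypothetical cost profiles: heavy upfront capex for a licensed system
# versus a higher but predictable monthly fee for a managed service.
LICENSED_CAPEX = 3_000_000   # assumed upfront licence and integration cost
LICENSED_MONTHLY = 40_000    # assumed in-house running cost per month
MANAGED_CAPEX = 0
MANAGED_MONTHLY = 90_000     # assumed managed-service fee per month

for months in (12, 36, 60):
    licensed = LICENSED_CAPEX + LICENSED_MONTHLY * months
    managed = MANAGED_CAPEX + MANAGED_MONTHLY * months
    print(f"{months} months: licensed {licensed:,} vs managed {managed:,}")
# With these inputs the managed model costs less until the five-year mark:
# the lower entry barrier described above.
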
There is also a risk that telcos which manage their billing and CRM systems in-house end up concentrating more on the technology than on their customers. Although the situation has undoubtedly improved over recent years, the telecoms industry still has an unfortunate tendency to focus more on system functionality than on real business drivers. Too many misguided decisions have been made by IT directors intent on purchasing the latest state-of-the-art systems rather than investing in a planned strategy of business improvement and enhanced customer service.

In a competitive market, operators looking to achieve enhanced agility should always put the customer first. While acknowledging that technology is important, Professor Robert East, expert in customer behaviour at Kingston Business School, comments: "Customer-facing technology needs to become more sophisticated to deal with recurrent issues more quickly and solve problems more efficiently. Businesses need to focus on technology that actually delivers satisfaction to people."

Reaping the rewards
But it is not only customers who have to foot the bill for operators indulging in the luxury of managing their own systems. On top of the obvious capex and opex charges, operators may be missing out on a range of other business improvements that the managed service model can offer.

One key benefit they could achieve by migrating to a managed service model is the ability to commit their technology provider to a service level agreement (SLA) with agreed turnaround times for implementing new products and incident resolution, for example.    
Such contracts formalise the way that billing and CRM systems are run and, by so doing, enable operators to gauge how quickly system changes and additions can be implemented. Again, this provides them with enhanced control over their systems environment and the ability to react more quickly to external pressures.  In contrast, the IT department within a large telco business will typically have no specific SLAs in place with any other part of the organisation and often no fixed review process either.

Another important advantage of the managed service approach is that it supports improved time to market for new services. This is because the managed service provider, typically with the benefit of extensive experience of a broad range of different customer installations, will usually understand the procedures and processes around those systems much more clearly than the operator does.

Operators working in emerging markets can also benefit by obtaining direct access to scarce skilled resources, rather than facing the headache of trying to recruit people locally with the requisite skills. Telco start-ups in all regions can often benefit in a similar way.

Positive prospects
The future for outsourcing of CRM and billing systems is looking increasingly positive. As Simon Sherrington, author of a recent Analysys report on outsourcing, points out: "Outsourcing has become an important weapon in a telecoms operator's strategic arsenal. An effective and well-managed outsourcing scheme can deliver flexibility, reduce time to market for new services, and help to deliver profit growth for shareholders."
There is also clear evidence that outsourcing can help operators to achieve significant cost savings. However, in today's highly competitive telecoms environment the most important benefit of a managed service approach is the enhanced agility it brings, freeing telcos to focus on their core business of delivering a high-quality service to their customers.

Dominic Smith, Marketing Director, Cerillion Technologies

Technology innovations tend to capture the imagination - and the headlines - but, says Lance Spencer, there's little point in providing bells and whistles if all a company wants to do is simply communicate first and foremost

Is it just me or is this decade just flying by?  With 2008 just around the corner, it only seems like yesterday that companies were busying themselves with notions that the Millennium bug would wreak havoc on their computers and telecommunications systems.  And God help anyone who was flying at the very turn of the century!

In between times, the subject of advances in voice telephony and Voice over IP (VoIP) has raised more crackles and jitters than anyone in our business could have expected.  It's been nearly 120 years since the innovative American undertaker Strowger invented the automatic telephone exchange, and just over a century later we are still finding ways to fiddle around with his original invention.

Although technological improvements are being made for the benefit of customers, sometimes it's plain old telephony ("ah, POTS", I hear the telco veterans recall) that matters most.  Yes, companies like to have telephones that work, and furthermore they like telephones that offer the same experience they've had for decades.  If their first experience of ‘new' technology is not a happy one, then bet your bottom dollar that they will shy away in future.

With a plethora of telecoms providers claiming they will help to improve business communications whilst cutting costs, it's a confusing picture for the end user. Poor old Strowger will be turning in his grave!

On a more positive note, the clever bit is that new and emerging communications technologies enable telecoms providers to offer end users innovative services that provide exactly what customers want - and a whole lot else besides.

However, there's no point in providing bells and whistles if all a company wants to do, first and foremost, is simply communicate.  This is particularly the case with small to medium-sized businesses, which now have the potential to get their hands on ‘grown up' telecoms technology but fear being bamboozled by all the complexity that comes with it.
Over the past two years Tiscali and other companies have been unbundling the local loop with a view to providing innovative services that end users understand and want.
For our part, we are on target to reach just over 800 exchanges by the end of 2007, with a target of a further 200 in the first quarter of 2008 to take us past 1,000.  Beyond this, the amount of additional coverage gained for business purposes is relatively minimal.
By deploying equipment in BT exchanges, it's possible to introduce services that provide the ‘basics' that end users need, e.g. a telephone line, but in a way which opens up a whole new world of telecoms potential if that is what they want.  And in the majority of cases, once end users have confidence in the core technology, that's precisely what they ask for.
For example, Tiscali recently introduced an ‘all in one' service designed to enable resellers to deliver tailor made, competitively priced line rental, voice and data services over a single connection.  WLVA - which stands for Wholesale Line, Voice and ADSL - is a wholesale, fully unbundled service based on the unbundling capability in BT exchanges.  Resellers are provided with the line, full traditional voice services and an ADSL-based data connection in one product.

It's the equivalent of taking the Wholesale Line Rental (WLR) service, Carrier Pre Select (CPS) and an ADSL connection all in one package.  Resellers do not need a relationship with BT for this or any other part of the WLVA service.

Coming back to our head-scratching end user, the key aspect here is that the user experience is exactly the same as with the services they've taken in the past.  A degree in telecommunications network infrastructure isn't required, and the commercial benefits are plainly laid out and understandable.

It's then up to the reseller to be as daring and as creative as they like, offering differentiated products to customers by expanding and diversifying their product portfolio.  In fact, our experience is that the two go hand in hand: once an end user is comfortable that the basics are in place, the end user and reseller can jointly discuss added value services such as IP Centrex.

Depending on the type of product offering, end users can then make their own choices about the level of sophistication and service required.  For example, the ADSL component of the WLVA service is initially available in Standard and Business grades of service with the option of enhanced care.  Product speeds range from 512Kbps to Max grade services.  Voice is offered with a range of telephony features suitable for businesses and consumers.  The line is provisioned with virtually all features enabled, allowing resellers to develop their own product bundles tailored to their customers. 

And as well as creating their own product range out of such offerings, many features come free of charge to resellers.  Combined with very competitive call rates available from wholesale providers, this provides resellers with the potential to offer competitively priced products while still increasing their margins.

Services such as WLVA are based on the latest Dense Wavelength Division Multiplexing (DWDM) technology.  This provides the ability to carry the next generation of data products, for example IPTV and telco grade voice services.

In the future, companies can expect to see the introduction of ADSL2+ services offering speeds of up to 24Mbps as they become available.  There will also be investment in Wholesale Broadband Connect (WBC) as BT rolls out its 21st Century Network.

Another example of simplicity at the core is "Voice Ready" Broadband.  Having introduced a ‘four line' version for smaller businesses earlier in 2007, we thought it would make sense to make the service accessible to larger SMEs so we introduced a ‘ten line' version towards the end of 2007.  It's the first service to offer guaranteed voice quality over broadband with enhanced care and SLAs to support carrier class voice services.

Such guarantees in this service offering and others are important because smaller businesses (and particularly those at the ‘S' end of the SME spectrum) have often thought of themselves as being at the lower end of the telecoms pecking order.  If as an industry we are to remove barriers to adoption then, as well as innovation, we must be prepared to deliver quality across everything we do.

Voice Ready is designed specifically for resellers wishing to provide guaranteed quality services to support real time applications such as Voice over ADSL, video streaming and video conferencing.  ADSL connectivity is ideal for service providers with an existing network infrastructure who wish to overlay additional network capabilities such as Local Loop Unbundled services.

Those guarantees take the form of Service Level Agreements (SLAs) on latency, jitter and packet loss, all of which are crucial to the quality of a voice call over a single broadband line. They mean that resellers can offer cost effective alternatives to traditional PSTN and ISDN based services.
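
In practice a check of this kind reduces to comparing measured line metrics against the agreed thresholds, as the Python sketch below illustrates.  The threshold values are assumptions for illustration, not Tiscali's published figures.

# Illustrative SLA thresholds for voice over a single broadband line.
SLA = {"latency_ms": 50, "jitter_ms": 20, "packet_loss_pct": 1.0}

def sla_breaches(measured: dict) -> list:
    """Return the metrics whose measured values breach their SLA threshold."""
    return [metric for metric, limit in SLA.items() if measured.get(metric, 0) > limit]

sample = {"latency_ms": 42, "jitter_ms": 27, "packet_loss_pct": 0.2}
print(sla_breaches(sample))  # ['jitter_ms']: jitter alone trips this SLA
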
Other benefits to resellers include low cost interconnection with network infrastructure at 100Mbps, 1Gbps and 10Gbps, industry-leading product development of new technologies such as ADSL2+, and the ability to provide differentiated services in the marketplace.
As we edge towards the end of the ‘noughties', reflections on this past decade will show that, despite criticism of the telecoms industry, on the whole we're actually pretty good at delivering innovation.  OK, it can take us a while to get there, especially if products such as Voice over IP are over-hyped, but successful change doesn't always happen overnight, especially where a monopoly continues to be untangled.

Companies of all shapes and sizes are benefiting from telecoms unbundling, product innovation, service guarantees and, perhaps most importantly, the ability to present new products in a way that end users understand.  Now that must be putting a smile back onto Mr Strowger's face.

Lance Spencer is Tiscali Business Services Product and Marketing Director, and can be contacted via e-mail: lance.spencer@uk.tiscali.com

Matthew Finnie looks at the ever-growing bandwidth requirements in Europe, and details what can be done to meet this increased demand

For those of us who have been working for a decade or more, the speed and reliability of our Internet connection is now taken for granted. We want access to be ubiquitous and limitless. We no longer hunker down and dial into the Internet, or marvel at a T3 transatlantic interconnect that speeds it all up.

Europe - the new Internet superpower
The Internet was a North American invention and for a long time America was the Internet. Now those days have gone. Europe is now the largest Internet presence in the world, and the fastest growing market.  Recent data from telecoms analyst group Point Topic's Global Broadband Statistics service suggested that Eastern Europe continues to show growth, and was the only region to record more than 10 per cent growth in Q2 of this year. In March 2007 Romania passed the one million subscriber mark, bringing to three the number of Eastern European countries among the 10 fastest growing worldwide.

Western Europe is also setting a fast pace when it comes to broadband growth. Greece was the top grower in percentage terms in Q2, expanding by 27 per cent, while the biggest mover among the top 10 countries by number of subscribers was France, which achieved the highest growth rate in that group at 9.36 per cent for the quarter.

As well as being the fastest growing territory, Europe also has the highest number of broadband subscribers. A recent audit by Internet World Stats put America at 64,614,000 subscribers and China at 48,500,000. The nine largest European countries alone account for 77,706,870 subscribers, so Europe is clearly a long way ahead, even without including the rest of the continent.

The increase in broadband subscription is being driven not by businesses but by consumers and their relentless experimenting with evolving applications. The sharing of videos, photos, music and more across sites such as YouTube and Facebook places enormous demands on bandwidth. The challenge for the DSL providers giving consumers access is that they aren't sharing in the valuations the content providers are seeing. Given this, how do they maintain spend to keep delivering service while access prices, in real terms, are declining on a per-meg basis? Perhaps part of the problem is that many of these providers have still not embraced a connectionless NGN world, where access to a customer is not a guarantee of all service revenue.

New applications are now being developed that assume the broadband service is available. While the provider looks to include TV, there are whole groups of people with no network of their own who see the penetration of broadband as the meal ticket for their latest venture. The most bizarre turn in this trend is mobile operators freeing up capacity in their own wireless networks by placing femtocells on consumer premises and, in some cases, using the existing DSL line as backhaul.

Bandwidth, and its availability, has hit a tipping point where people expect ever increasing levels of service. One note of caution: most consumer broadband networks are horribly asymmetrical; while bandwidth is going up, there is still a bias toward download, making them unsuitable for many business applications. But the global business community is also demanding ever-increasing volumes of bandwidth as more and more business critical applications sit on the Internet.

Business 2.0 
Web 2.0 is a phrase coined to capture the collaborative nature of a network, and now people are talking about Business 2.0 in the hope that some of this social networking and agile application delivery will rub off on the corporate sector. The demands on corporate bandwidth are growing too.

The practical needs of a corporate are less glamorous but by no means less network intensive. The IT Director no longer has the luxury of time or resources to embark on a grand IT plan spending millions for a promised brighter future in two years. IT Directors are talking about embracing a "service aware" approach to developing IT applications that borrows much from the experimental evolutions seen in the consumer Internet experience. The challenge for many is that invariably this approach is network centric (it has to be) but requires a more iterative and agile approach to application development.
And the ongoing move towards collaboration and unified communications is a key factor in demands on corporate networks too. Unified communications - the embedding of tools such as IM, presence, conferencing and voice into one platform - is going to have as big an impact on business communications now as e-mail did in the 90s. Earlier this year we launched our own integrated communications platform, Interoute One. This service enables corporate customers to manage voice calls as easily and cost effectively as e-mail, and without the need for complex integration or upgrades to their existing telephony infrastructure.
And with its recent launch of Office Communications Server (OCS), even Microsoft is looking to get into the unified communications market. Yet despite its strengths, OCS is only as strong as the network that carries it. Without a network provider able to route calls, OCS doesn't achieve its ambition.
So with the demand brought about by unified communications, combined with Web 2.0 and businesses running more and more applications over their networks, a clear picture emerges - there is an incredible increase in demand for bandwidth to ensure the highest quality end-user experience.
This demand can only be met by more bandwidth, yet for service providers, what is the best way to meet this demand?

Buy or build?
The demand for bandwidth will stretch many existing service provider infrastructures to breaking point, so what are the options available? DSL has traditionally been the preferred technology to deliver broadband services in Europe. It uses existing copper access networks to deliver broadband and is well entrenched in Europe, but is struggling to cope with bandwidth demand now, and in the future will struggle massively as demand grows yet further.

Consequently, service providers have to start looking at deploying fibre deeper into the network, even to the home or building, to meet future bandwidth requirements. Fibre is as future-proof a communications technology as is possible to get, and several service providers have already made a commitment to deploy fibre-to-the-node or fibre-to-the-home networks in the next three to five years. But is building new fibre networks a viable option?
It's a paradoxical situation. The main problem with fibre is that it is just so expensive to build from scratch; it's basically real estate. The industry has had well-documented problems with a number of carriers who built large fibre networks in the 90s, and then watched as the telco market dipped and their businesses suffered massively. Now the telco market is booming again and demand for bandwidth is high, yet the best (and arguably only) way to meet that demand requires fibre. So for service providers looking to stay in the wholesale space, the only sensible route is fibre ownership.
This means that in the wholesale space, if you are without the physical assets, you are shortly going to run out of capacity. Even if you have fibre, but only a leased pair, you will soon be buying more. This leaves a small but battle-hardened minority of carriers with multiple fibre backbones that will be the dominant suppliers in the marketplace. But even owning fibre is not quite enough - you need an operation that understands how to deal with Europe's different laws and regions. For example, the Interoute network connects 85 cities in 22 countries across 54,000 cable kilometres of fibre. Up to 48 fibre pairs have been deployed throughout the network, which means it has the capacity to carry over a petabit (a billion megabits) per second of traffic. The backbone is complemented by 19 deep city fibre networks that interconnect with the necessary diversity of access methods required to deliver 21st century telecoms.
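
As a rough sanity check on those headline numbers, some back-of-the-envelope DWDM arithmetic in Python; the wavelength count and line rate below are assumed figures for illustration, not Interoute's:

FIBRE_PAIRS = 48            # from the network description above
WAVELENGTHS_PER_PAIR = 80   # assumed DWDM channel count per fibre pair
GBPS_PER_WAVELENGTH = 10    # assumed line rate per wavelength

per_route_tbps = FIBRE_PAIRS * WAVELENGTHS_PER_PAIR * GBPS_PER_WAVELENGTH / 1000
print(f"Per-route capacity: {per_route_tbps:.1f} Tbps")  # 38.4 Tbps
# Summed across a few dozen such routes in a mesh, aggregate capacity
# reaches the petabit-per-second scale quoted above.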

Fibre is vital to business communications in 2007 and beyond - it is the ultimate delivery mechanism - and the harsh reality is that a service provider without physical fibre in the ground will not have the raw material necessary to satisfy the huge demand for bandwidth. The die is now cast: fibre is the bandwidth raw material of choice for the Internet, but building fibre networks is now simply too expensive, so the only real option is to buy bandwidth from someone who has it.  There is a tipping point with all technology, where the user understands it and operates accordingly. Bandwidth and the Internet have hit that point, and fibre is the only viable way to meet the demand.

Matthew Finnie is Chief Technology Officer at Interoute

At twenty-six, Ethernet is something of a ‘Grand Old Man' of networking technologies.  But as Mark Bennett points out, Ethernet is still one of the most agile and reliable networking standards available, despite its relatively advanced years.  Like a good wine, it just gets better with age

In the early 1980s, technology was offering the world new ways to live and work.  If you chose to, you could drive to the office in the ‘revolutionary' Sinclair C5, unwind with a Betamax recording of ET or even play a few games on the latest Amstrad.  Fortunately, while these high-profile ‘flash-in-the-pans' were hogging the limelight, our predecessors in telecoms engineering were putting in place some more enduring technologies.  The first Ethernet products hit the market in 1981, and over the past 26 years the standard has established itself as the de facto choice for LAN and MAN connectivity.  Such longevity is rare in the technology space and bears witness to the versatility of the standard.

Ethernet has been such a huge success because it meets the criteria essential for the mass adoption of any product: it is inexpensive, flexible and simple, and it delivers an elegant solution to a potentially very complex problem, so it should be no surprise that it is now ubiquitous.  Any technology, though, can only last if it can change to meet the demands of users.  It is this Darwinian ability to continually evolve that marks out the true survivors of the technology space.  Ethernet has proved that it can do this and is now moving beyond its traditional spheres of LAN and MAN to provide a much more comprehensive approach to networking.  It is becoming increasingly clear that, aged 26, Ethernet is going from strength to strength and emerging as the standard of choice for long-haul access technology.
The reasons for this can be traced to the changing needs of businesses, especially in specific sectors.  Many of these organisations are demanding ever-increasing amounts of bandwidth in their networks, and Ethernet is emerging as the best means of providing this.  This market includes a wide range of organisations in sectors as diverse as utilities, finance, the public sector and smaller business.  The key markets here are companies which can be termed ‘DIY': businesses which have their own in-house IT managers and want to retain management of their own IP-based networks.  Such organisations are typical of utilities, media and finance companies.  The public sector also fits the DIY mould, and we are seeing high levels of demand from local and central government, as well as from schools and higher education. Demand from the indirect markets of mobile, national and international carriers, as well as from IT services companies, is also growing fast.

So what, exactly, are these larger businesses and public sector bodies using Ethernet for?  Ethernet has a number of properties that appeal to these organisations.  Its speed and comparatively low cost make the technology ideal for inter-site connectivity, and we are seeing a great deal of demand for this.  As LAN speeds increase - with 100Mbps now standard and Gigabit (1,000Mbps) increasingly common - it makes sense to ensure that the wider network is not a bottleneck, so 100Mbps and 1Gbps inter-site networks are in regular deployment.  Such organisations are also increasingly looking to leverage the benefits of data, voice and applications convergence, so Ethernet is establishing itself as the ideal method of access to converged next-generation networks (such as MPLS-based core networks).  Convergence brings with it the need to handle ever-increasing amounts of data traffic as more and more applications are placed on a single network - everything from rich content e-business applications to services and IT centralisation applications.  Ethernet has the capacity to handle vast amounts of data without putting any strain on the network.  So it seems that service convergence is breathing new life into Ethernet networking.

It's not just large enterprises that are driving demand for Ethernet connections.  Although Ethernet has typically been associated with larger enterprises, demand from small and medium sized businesses (SMBs) is also growing. SMBs are looking for increased bandwidth to deploy next-generation services and Ethernet is an appealing option for them because of its simplicity and familiarity from its use in the LAN. Ethernet is enabling SMBs to roll out the advanced applications that help them compete with bigger companies - such as VoIP, distributed WANs, IP-based video conferencing and real-time collaboration.

We are seeing demand from SMBs, which have traditionally used low bandwidth leased lines, grow particularly fast. These businesses will often have more than one site and be too large for DSL, especially given the limited upstream bandwidth ADSL offers when used for a VPN.  They require permanent, dedicated, always-on bandwidth to support applications critical to their business. As a solution to their increasing bandwidth needs, this sector is increasingly turning to Ethernet for security, reliability and quality of service on a private network - qualities not currently available on DSL services.  Ethernet is attractive as it offers more bandwidth at a lower price per megabit, a critical consideration for SMBs. Solutions that can be scaled to deliver multiple services are particularly attractive, as these can deliver additional business benefits through applications like VoIP.

The indirect market for Ethernet is also seeing considerable growth.  This market is made up of service providers investing in the technology in order to deliver advanced services to their customers or to extend their networks.  National and international carriers, as well as mobile operators, business ISPs and IT services companies, are all using Ethernet to enhance the services they deliver, both for high speed Internet access and to provide hosting and connectivity to their customers.

Mobile operators in particular are trying to leverage Ethernet to reduce the operating costs of their access networks, as well as to provide the higher levels of bandwidth required as mobile data begins to be adopted more widely.  Mobile operators are trying to increase ARPU to recoup their investment in 3G by offering data-heavy applications such as mobile Internet and TV.  The high-speed 3G and WiMAX networks supporting these services are rapidly increasing backhaul bandwidth requirements.  As traditional backhaul technologies and architectures struggle to support growing demands, operators are turning to packet transport technologies, such as Ethernet, as a cost effective solution.  Ethernet is therefore increasingly supporting new, revenue-generating services for mobile operators without allowing backhaul costs to spiral, helping to maintain mobile operators' profitability.
International carriers, on the other hand, are moving to Ethernet to support the increasing demand for bandwidth from their large multi-national customers.  Traditionally, leased lines would have been used to cater for such demand, but Ethernet has proven it can offer a much more scalable alternative at a lower price per megabit.  International carriers can use Ethernet tails to provide their customers with a range of applications and services, including MPLS IP VPN, Ethernet connectivity, Internet and VoIP.  Because Ethernet is a layer two protocol, it allows international carriers to retain full control of layer three IP routing and their own IP class-of-service SLAs.

National operators are driving demand for Ethernet along similar lines.  In the UK, Ethernet backhaul provided by altnets is proving particularly popular as a cost saving option, as their wholesale offering can be much less expensive than the incumbent's.

A number of organisations want to centralise the hosting and management of servers, data and storage systems, and other IT assets in one location.  To do this they need dedicated, very high bandwidth, high quality connectivity to that centralised facility. This centralisation of IT assets is designed to reduce their operating and capital costs, as well as enable delivery of advanced new services. 

Historically, however, the business case for centralising IT assets may not have been viable due to the high cost of the bandwidth required to connect remote sites to central data centres.  Ethernet offers more bandwidth at a lower price per megabit than the technologies traditionally deployed, helping to drive the business case for centralisation. Ethernet, together with MPLS IP VPNs, is ideal for connecting remote sites to central locations, with Gigabit Ethernet services providing the ideal very high bandwidth connections between data centres. In addition, the cost savings can be diverted into other applications that fit with Ethernet, enabling these organisations to layer multiple services and applications onto the same network.

With Ethernet, bandwidth can be increased or decreased at very short notice.  This means that companies need only pay for the bandwidth they actually use, increasing capacity easily and quickly as and when required, either to handle a short-term spike in demand or a longer term increase in traffic. This flexibility is very attractive to organisations of all kinds, and the scalable nature of Ethernet is one of its key selling points.

Ethernet, therefore, has proved itself in the face of a fast changing telecoms market.  Change usually forces technology to sink or swim, and the recent move towards convergence and bandwidth-hungry applications has shown how resilient Ethernet is.  Indeed, the future for Ethernet is looking good.  Carrier Ethernet is emerging to offer operators a flavour of Ethernet with the same characteristics as leased lines, while the IEEE has launched a new set of OAM standards for Ethernet in order to better manage and maintain the technology.

At 26, therefore, the Ethernet story is far from over.  Ethernet looks set to be the dominant access standard for the foreseeable future, driven by the demands of business and ideally placed to meet those demands.  The versatility and simplicity that led to its dominance in the LAN and MAN is extending to the WAN, delivering access to the next generation of applications and services.  It goes to show that in the world of telecoms you really can teach an old dog new tricks - tricks that are increasingly appealing to business.

Mark Bennett is Head of Data Portfolio at THUS plc

