Features

With ADSL2+ technology now being pushed to its absolute limits, carriers are talking about the next generation of broadband, VDSL and VDSL2, with speeds of up to 200 Mbits/s on relatively short line lengths from the DSLAM. Jorg Franzke explains how it is possible to roll out the speed benefits of VDSL and VDSL2 to most urban and city customers, without breaking the telco bank

It’s been an eventful decade across Europe as former state incumbents, cable companies and virtual telcos have all, seemingly en masse, jumped on the broadband telecoms wagon and rolled out an increasing range of high-speed broadband services to their customers.
Most experts agree that, even with ADSL2+ offering customers downstream data speeds of up to 24 Mbits/s, customers’ appetites for still faster services continue to grow, with some cable companies already talking about offering 100 Mbits/s as standard.

The only problem with this new generation of very high-speed broadband services is that they rely on VDSL and VDSL2 technology. Whilst ADSL2+ can happily support copper line lengths of two or more kilometres, VDSL only achieves its maximum rate of around 52 Mbits/s within a range of just 300 metres from the DSLAM (digital subscriber line access multiplexer). When we move up to VDSL2 (ITU G.993.2) technology, carriers are even talking about rates of up to 200 Mbits/s.
But VDSL2 deteriorates quickly from a theoretical maximum of 250 Mbits/s at zero metres from the DSLAM to 100 Mbits/s at 500 metres, and 50 Mbits/s at 1.0 kilometre. As a result of these line length limitations, very few customers will be within the coverage range of VDSL2 DSLAMs installed at the central exchange. So, most local loop carriers are discussing moving the active electronics, including the DSLAMs, out of their central offices and into larger versions of the roadside cabinets that form an integral part of the street furniture we see every day.
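To illustrate how quickly VDSL2's headline rate rolls off with loop length, here is a minimal sketch that simply interpolates between the three figures quoted above. The straight-line interpolation (and the plateau beyond one kilometre) is an assumption made purely for illustration; real rates depend on cable gauge, crosstalk and noise.

```python
# Illustrative only: linear interpolation between the VDSL2 figures quoted in
# the text (250 Mbits/s at 0 m, 100 Mbits/s at 500 m, 50 Mbits/s at 1,000 m).
# Real-world rates depend on cable gauge, crosstalk and noise conditions.

RATE_POINTS = [(0, 250), (500, 100), (1000, 50)]  # (metres from DSLAM, Mbits/s)

def vdsl2_rate(distance_m: float) -> float:
    """Rough downstream rate estimate for a given loop length."""
    if distance_m <= 0:
        return RATE_POINTS[0][1]
    for (d0, r0), (d1, r1) in zip(RATE_POINTS, RATE_POINTS[1:]):
        if distance_m <= d1:
            # straight-line interpolation between the two nearest quoted points
            return r0 + (r1 - r0) * (distance_m - d0) / (d1 - d0)
    return RATE_POINTS[-1][1]  # beyond 1 km: optimistic plateau, for the sketch

if __name__ == "__main__":
    for d in (0, 300, 500, 750, 1000):
        print(f"{d:>5} m  ->  ~{vdsl2_rate(d):.0f} Mbits/s")
```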
A major issue is that the move out of the central office brings with it the de-centralisation of the main distribution frame, where connections have to be moved to initiate new services such as ADSL and VDSL. Each time a customer requests a change of service, jumper wires have to be moved – fairly easy and efficient in a warm, dry, clean centralised environment – but once the connections have to be made in the cold and rain it becomes an operational issue. To give the reader an idea of the massive scale involved, a network the size of BT in the UK would require around 65,000 of these externally deployed active electronics cabinets. A network the size of Germany’s T-Com would require around 100,000 such cabinets.
In theory, the incumbent telcos could employ teams of roving engineers to maintain and provision the cabinets in much the same way as central offices are serviced at the moment, but the costs associated with the necessary engineering 'truck rolls' are anathema on both the financial and ecological fronts. Even one visit per fortnight, at say €50 per technician visit, would clock up annual costs of €130 million on a 100,000 cabinet network.
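The arithmetic behind that figure is straightforward; a back-of-envelope sketch, assuming 26 fortnightly visits a year at €50 each across a 100,000 cabinet network:

```python
# Back-of-envelope OpEx for manual re-jumpering visits.
# Figures from the text: 100,000 cabinets, one visit per fortnight,
# €50 per technician visit.

cabinets = 100_000
visits_per_year = 26          # one visit every two weeks
cost_per_visit_eur = 50

annual_opex = cabinets * visits_per_year * cost_per_visit_eur
print(f"Annual truck-roll cost: €{annual_opex:,}")  # €130,000,000
```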
Consequently, any carrier electing to stay with the status quo and implement manual re-jumpering at the thousands upon thousands of active equipment roadside cabinets will be forced to reduce their costs by making only scheduled visits. The corollary of this is that each cabinet may only figure in the schedule once every fortnight, meaning that the time to provision each customer will become much longer than is currently the case.
New approach
A markedly different technique is needed. Newly developed automatic cross-connects (ACX) can now be used to replace manual distribution/jumpering frames in the remote cabinets, saving carriers significant sums of money on the operational expenditure (OpEx) front.
With an automated ACX solution, not only are there no delays in waiting for an appropriate technician truck roll, but the control of the ACX can also be integrated directly into the carrier’s operations support system. Using this approach allows the service connection to follow on automatically from the customer's order, within an hour or two, rather than the customer facing a wait of several days, as is currently normally the case with a central office, or several weeks in the manual re-jumpering scenario described above.
Many carriers and manufacturers alike are chasing the holy grail of 'zero touch' for their networks. We have been pioneering a slightly less pie-in-the-sky approach, based on practicality and the best return on investment.
The aim of a zero touch network would, of course, be that the field technician never needs to visit the remote site. That is all very well until you take into account that active equipment, its power supplies and its air-conditioning go wrong from time to time. So occasional technician visits are inevitable.
Zero touch systems have other drawbacks, not the least of which is that the purchase costs can be substantial, reducing the installation's return on investment. In the case of automated cross-connects, zero touch would need a non-blocking switching matrix, which is very expensive, and current non-blocking technology simply isn’t up to the job of transmitting 100 Mbits/s signals.
The third issue with zero touch systems is that the cabinet needs to be equipped with a large degree of reserve DSLAM and splitter capacity – ditto power supplies and air-conditioning – seriously increasing the levels of capital expenditure required.
Our theory is simple. What happens if we introduce a minimum number of technician truck rolls to the mix, creating a 'minimum touch', not zero touch, active electronics-based local loop?
This is where the financials begin to get interesting, as a minimum touch network is far more financially viable. It requires significantly lower levels of capital expenditure with very similar levels of operating costs. Less spare capacity is needed, as this can be added when demand dictates. Likewise, a much less expensive semi-blocking ACX can be used – with the full frequency range for 100 Mbits/s service delivery.
A good minimum touch system has the advantage of automating the provisioning and re-provisioning of lines without incurring the high capital costs of a zero touch system, or attenuating the signal levels required for effective VDSL and VDSL2 transmissions.
Well before the ACX system reaches saturation levels, it can signal its status to the central exchange, allowing engineers to make a planned site visit, install additional capacity if needed and hardwire connections already switched through to VDSL, freeing up the switch ports to be used again for the next six or twelve months.
The result of this approach is good scalability, lower cost per line and a reduced space requirement. And all without affecting those all-important customer satisfaction levels.
Using a minimum touch ACX approach means that only one or two maintenance visits each year are required for each cabinet, with remote monitoring shouldering the responsibility of maximising network up-time.
In the event that something like a DSLAM card fails, the ability of ACX to connect ‘any-to-any’ can be employed to ensure that customers are only minimally affected by any technical problems. Depending on the severity of the failure, the cabinet's active technology can be remotely reconfigured to maintain service for the customers affected and the network operations centre can schedule a truck roll when it suits the operator. This makes for a more cost-effective maintenance strategy.
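As a rough illustration of what 'any-to-any' buys the operator here, the sketch below models a cabinet's connection map as a simple dictionary and re-homes the lines on a failed DSLAM card to spare ports. The card and port names, and the representation itself, are assumptions for illustration only, not ADC KRONE's actual control interface.

```python
# Simplified model of an ACX connection map: each subscriber line is
# cross-connected to a DSLAM port. On a card failure, affected lines are
# re-homed to spare ports so service continues until a truck roll is scheduled.
# Names and data structures are hypothetical.

connections = {              # subscriber line -> DSLAM port
    "line-001": "card1/port1",
    "line-002": "card1/port2",
    "line-003": "card2/port1",
}
spare_ports = ["card3/port1", "card3/port2"]

def rehome_failed_card(failed_card: str) -> None:
    """Move every line on the failed card onto a spare port, while any remain."""
    for line, port in connections.items():
        if port.startswith(failed_card + "/") and spare_ports:
            new_port = spare_ports.pop(0)
            connections[line] = new_port
            print(f"{line}: {port} -> {new_port}")

rehome_failed_card("card1")   # line-001 and line-002 move to card3
```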
ACX technologies
In a survey of switching technology for remote automated cross-connect devices used in next generation carrier networks, research company Venture Development Corporation (VDC) considered a number of technologies but rejected robotic and solid-state/electronic switching: the former is error prone, expensive and has poor life expectancy, whilst solid-state/electronic switches have electrical parameters that make them unsuitable for the high bandwidth requirements of xDSL services like VDSL2. VDC also noted that a very specific 'electromagnetic' variation of the MEMS relay may become a suitable technology, but this is currently only in testing as regards ACX applications and, as yet, has no field application track record.
VDC concluded: “We believe the electromagnetic relay is acceptable technology because of its proven reliability, ruggedness and minimal transmission impairment.” It did not judge any other technology to be currently acceptable. This, and the fact that the power requirements are so minimal, are the reasons we chose to develop our own ACX product range around the tried and tested electromagnetic relay.
Obviously, whether to implement ACX or to manage the service provision process manually is a matter for individual carriers. The choice of technology is critical from the perspective of reliability, minimal power consumption and the ability to handle the very high frequencies needed for VDSL2, but far more important in this rapidly changing telecoms world is the need for rapid return on investment.
It is our contention that zero touch is a step too far and that, in the world of everyday engineering issues, minimum touch networks and minimum touch ACX are the way to minimise costs.

Jorg Franzke is ACX product manager for ADC KRONE, and can be contacted via tel: +49 308453-2498; e-mail: jörg.franzke@adckrone.com
www.adckrone.com

End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist Nexus Telecom, tells Ian Scales

Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.

It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems, and verify service levels. Essentially, says CEO Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors. What's changing, though, is the importance of such end-to-end transaction data.
Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly.  Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access.  It's an idea whose time may have come.  According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.
As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself. 
There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way.  Sutter says some are, but some are still grappling with the concepts. 
"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."
This misses the point in a number of ways, claims Sutter. 
"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple  - in fact it's rather the other way about.  The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work." 
And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.
"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that.  Telcos need to harness network data - I call them 'transactions' - to develop their businesses."
Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.
"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."
So end-to-end transaction data is important and will grow in importance.  How does Nexus Telecom see itself developing with the market?
"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis.  This tells you what's happening so you can plan network capacity and so on.  But these systems never, ever go to layer 7 and tell you about transaction details - we can. 
"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems.  Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."
So given that telcos are recognising the need to gather and store, what's the future OSS transaction record architecture going to look like? 
"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our  approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers.  So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation.  Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies  - the  marketing  people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."
Does he see his company going 'up the stack' to tackle some of these applications in the future?
"It is more important to have open interfaces around this layering.  We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."
Sutter thinks the supplier market is already evolving in a way that makes sense for this model.
"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses.  We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network.  Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."
So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers. 
"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution.  "You would come along and ask, 'Hey guys, can you give me access to your database?', the response would be 'Woah, don't touch my database.  If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces.  One piece will collect the data, correlate and aggregate it, the second will run the application and the presentation to the user.  At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join so that data may be stored in an open data to the repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best in class products for collecting the data and presenting it and this will be a good thing for the entire industry.  After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always loose the whole data collection and repository for that application at the same time?"
With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?
"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."
But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like  'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."
So where can Nexus Telecom go from here?  Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?
"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring.  But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”

Ian Scales is a freelance communications journalist.

Could high speed packet access become high cost packet access, asks Pat Dolan

Maturity often brings a heightened sense of reality, a healthy dose of cynicism, and a degree of wariness around new products or services that purport to be 'the next big thing.'
Some mobile networks are now 25 years old. They suffer from declining revenue and are struggling to keep abreast of seemingly continuous advances in technology. Given the maturity of these networks, as a shiny new acronym takes the stage in the mobile market, operators can be forgiven for treating it with a degree of suspicion and for ordering it to undergo a full examination.

One such acronym is HSPA – High Speed Packet Access, which has been promoted by mobile operators as the dream data network, designed to realise the promise of 3G. HSPA has been billed as the rescue remedy that will enable delivery of data services that will convince consumers to upgrade their price plans. It may even succeed in getting the tardy corporate market to spend serious money going mobile.
So how can an opportunity to make more revenue be viewed suspiciously in a market where operator margins are declining?
The answer depends on the profitability of that revenue, and how much it costs to achieve. The issue isn't really how much mobile users spend.  The key figure is how much profit margin users deliver to operators – and that depends on how much it costs operators to deliver services.
HSPA, according to the mobile industry, will enable the delivery of high-bandwidth services to a standard that will encourage use. The industry expects the major users to be the generation that has grown up with mobile – the so-called Echo Boomers, children of the Baby Boomers – to whom technology is a given. They are the generation of the iPod device, laptop and Xbox® video game system, who expect gadgets to not only look good but to deliver services at any place and any time. These users are the mobile gamers, the TV-on-demand consumers, the market that expects technology to work, and work well.
Similarly, the corporate market is there for the taking – the success of WiFi demonstrates that high-speed mobile wireless connectivity has a following, when it works properly.
Therefore, with the introduction of HSPA, operators can be confident that the amount of data carried over the mobile network will increase. Here is where the careful inspection of the technology kicks in. To handle the anticipated increase in traffic and maintain the quality of service required to not only court but keep the high speed data user, operators must invest in their networks.
A report from Gartner published last year stated: “Vendors present HSDPA as a simple software upgrade on the Node B in base stations. However, feedback from operators indicates that, even for major vendors, this may not be the case.”
It may indeed not be the case. Put more traffic on the same motorway and you get a traffic jam. Then you must add more road to accommodate the traffic. And it all costs money.
The potential bottleneck for HSPA lies in the backhaul – not perceived as the most glamorous part of the network, but the part that decides whether the traffic flows freely or grinds to a halt. Backhaul is the transport of data between the Radio Network Controller (RNC) and the Node B (base station) in the 3G/HSPA-capable network. Traditionally in 2G/2.5G networks, backhaul capability was provided by E-1 leased lines. There was no great choice when those networks were implemented, and the leased-line capacity of 2Mbps served the voice and minor data needs of those networks to date. But when you consider that Gartner puts the typical cost of an E-1 link at $4,500 or $5,600 per year, and HSPA driving the requirement to 10 or more E-1 links per site, the cost implications begin to bite.
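To put rough numbers on that, here is a minimal back-of-envelope sketch. The per-link costs and the ten-links-per-site figure come from the paragraph above; the 1,000-site network size is an assumption chosen purely for illustration.

```python
# Back-of-envelope annual backhaul cost when HSPA pushes up the E-1 count.
# Per-link costs and the 10-links-per-site figure come from the text;
# the 1,000-site network size is an assumption for illustration.

e1_cost_low, e1_cost_high = 4_500, 5_600   # USD per E-1 link per year (Gartner)
links_per_site = 10
sites = 1_000

low = e1_cost_low * links_per_site * sites
high = e1_cost_high * links_per_site * sites
print(f"Annual E-1 backhaul bill: ${low:,} to ${high:,}")  # $45m to $56m
```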
But all is not lost. The world has moved on from E-1. There are now other backhaul solutions available. It is crucial for operators today, more than ever, to consider the cost implications of the backhaul transport choice. Each E-1 link operators add increases their transport capability. It also automatically adds more cost.
Pseudowire technology can address both cost and speed. Pseudowires enable more cost-effective networks by enabling backhaul over packet-based technologies such as Metro Ethernet, Ethernet Microwave and DSL. They also can protect operators' investment in their existing networks by providing a bridge between “old” and new transport models. Take this route, and the investment to provide a network capable of delivering services over HSPA to a standard expected by the user becomes a realistic, sustainable and economically smart move.
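The mechanics can be pictured with a toy example: a constant bit-rate stream is chopped into packets, each carrying a sequence number, so the far end can re-order them and rebuild the original stream. This is only a sketch of the idea, with an invented field layout; real pseudowire encapsulations (SAToP and CESoPSN, for example) add control words, timing and clock-recovery mechanisms.

```python
# Highly simplified illustration of the pseudowire idea: slices of a constant
# bit-rate (E-1 style) stream are wrapped with a sequence number and carried
# as payloads over a packet network, then re-ordered and re-assembled at the
# far end. The field layout here is illustrative, not a faithful encoding.

import struct

def to_packets(tdm_stream: bytes, chunk: int = 256):
    """Segment a TDM byte stream into packets: 4-byte sequence no. + payload."""
    for seq, offset in enumerate(range(0, len(tdm_stream), chunk)):
        yield struct.pack("!I", seq) + tdm_stream[offset:offset + chunk]

def from_packets(packets) -> bytes:
    """Re-order by sequence number and strip headers to rebuild the stream."""
    ordered = sorted(packets, key=lambda p: struct.unpack("!I", p[:4])[0])
    return b"".join(p[4:] for p in ordered)

stream = bytes(range(256)) * 4                     # stand-in for an E-1 bit stream
packets = list(to_packets(stream))
assert from_packets(reversed(packets)) == stream   # survives packet re-ordering
```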
One operator already enjoying the inherent benefits of pseudowire technology is Telecom Italia, with others giving it serious consideration too. Telecom Italia, for example, has an award-winning application for transporting ATM-based 3G traffic across an Ethernet network.
Operators need a clear strategy for backhaul upgrades to realise the potential of HSPA. Following the “if it ain't broke, don't fix it” E-1 path will deliver bandwidth, but at what cost?  Backhaul networks are already one of the largest operating expenses for mobile operators. If the market wants to deliver high speed and not high cost data packet access, then maybe it's time to make a new choice.

Pat Dolan is VP & General Manager EMEA at Tellabs
www.tellabs.com

As Internet gambling – whether via mobile or fixed lines – continues its upward path, Windsor Holden looks at the different approaches of prohibition versus regulation currently exhibited by governments on opposite sides of the Atlantic

Before we kick this off, I would like to get one thing straight. “Tex” Holden, king of casinos, lord of the gamblers, is a myth. You will not espy me playing the roulette wheel at the Golden Nugget, nor cleaning up at baccarat in Monte-Carlo, nor even studying the form guide in the Racing Post. In short, I am not ordinarily a gambling man, although one might suggest taking a through train on a Friday evening from Manchester Piccadilly to Cambridge represented a bit of a punt, particularly if I were expecting it to arrive on time, or even arrive at all, rather than bundling all its passengers into the cold dark night at Sheffield.

And yet, while drafting some notes for this article, and cursing Central Trains, I remembered that it was the Euromillions umpteenth draw that night, and I’d forgotten to buy a ticket, and the hundred million ackers destined for my wallet would now slip regrettably into the pockets of some undeserving continental chappie.
Because the lottery doesn’t count, does it? It’s just a couple of quid here and there, a bit of a laugh, a bit of a giggle: not proper gambling, with big money an’ that. Indeed; and the vast majority of us do not count ourselves as gamblers, although more than three-quarters of the UK adult population, for instance, play the National Lottery at least once per year, and most on a more regular basis. Even if the Lottery is excluded, nearly half of us dabble in one form or another. Gambling, in its myriad forms, is popular. People want to gamble. And, with more and more people acquiring access to the Internet, whether through a fixed line or mobile phone,  companies are naturally eager to allow them to fulfil their desires online.
But governments have not always seen it this way, and generally feel disposed to impose some or other form of regulatory structure on the industry. There are two extreme approaches here. The first, which may still be practised in some relaxed and possibly dangerous environs, is 'anything goes'. The problem with this regulatory model is that it offers enormous scope for the following scenario: a couple of characters named Slim and Frenchie set up www.greatbets.com, offer wonderful odds for the less percipient and discerning customers and then silently disappear into the night with the loot. While profitable in the short term, this business model is probably not sustainable in the longer haul, if only because people are highly unlikely to be repeat visitors to your site. Furthermore, and unfortunately for Slim and Frenchie, very few – if any – governments are inclined to be so laissez-faire, with the possible exception of countries where Slim and Frenchie are first cousins to and/or best buddies with the Minister of Commerce.
The second model, which the United States has employed, lies at the other end of the spectrum. Here, under the quaint and misplaced belief that prohibition actually works, the feds will send in the modern day Eliot Nesses to arrest anyone who places a bet over an Internet connection. There are many things that could be said about this policy, and while, as Johnson said of Cymbeline (and, latterly, if somewhat unkindly, Muggeridge said of Sons and Lovers), it is pointless to criticise unresisting imbecility, it can be enjoyable and cathartic nonetheless. And so, let us take a closer look at the US approach.
While you can legally buy lottery tickets, gamble in licensed casinos, and place a bet at a bookmaker's in person, it becomes a heinous crime should you wish to do so remotely via a telephone. The legislation which decrees that it is so is the 1961 Wire Act: this was originally drafted to prohibit bookmakers in states where gambling was then legal from taking bets from would-be gamblers in other less liberal areas of the US. However, times have changed, and America has become exposed to the wonders and depravities of the World Wide Web, and various upright (or uptight, depending on your viewpoint) congressmen and senators have sought to update the 1961 Act to ensure that those pesky remote gamblers don’t gamble online across state lines: in fact, to ensure that they can’t gamble online at all. Senator John Kyl repeatedly tried to introduce various incarnations of the Internet Gambling Prohibition Act, none of which made it to a full Senate vote, before Representative Jim Leach took up the baton for righteousness and sponsored HR4411, the Unlawful Internet Gambling Enforcement Act. The argument behind this legislation is as follows:
“The Internet's ease of accessibility and anonymous nature: (1) make it difficult to prevent underage gambling; and (2) feed on the compulsive behaviour of the millions of Americans suffering from gambling addiction.”
I have a big problem with (1) and an even bigger problem with (2). To (1), I would answer: no, dear, it doesn’t. There are numerous checks and balances that can be put in place, notably age-verification requiring at least two items of identification, which actually works very well. And as for (2): up to a point, Lord Copper. The Internet can certainly facilitate mobile gambling (that’s why Ladbrokes, William Hill, Uncle Victor Chandler and all were in there like a shot); it makes it very easy to place bets without the bother of popping down to the bookies or the bingo hall (or, if you’re feeling flush and flash, the Golden Nugget). But I would question whether the majority of those Americans who do suffer from gambling addiction as a result of visits to the aforementioned locales would be significantly more likely to gamble were Internet gambling to be permitted. For one, as a recent survey by the American Gaming Association revealed, the demographic profiles of online gamblers are markedly different from, say, casino gamblers: the former are predominantly males, aged under 40 and college educated; the latter are inclined to be older, less well educated, and with a greater preponderance of women in the mix. Secondly, and this is the biggie: why should the overwhelming majority of responsible gamblers be denied the opportunity to enjoy a leisure activity in their own homes which is perfectly acceptable and legal in public places?
This is not in any way to deny that gambling addiction, whether online or in any other form, is a serious issue: a short perusal of the website run by the charity Gamcare provides hair-raising evidence of the scale of problem gambling, and of its consequences, both to the gamblers themselves and to their families. But, as Gamcare acknowledges, there are ways of addressing the problem, not least by working with the gambling industry, regulators and the government. For Senators Kyl and Leach, that conciliatory approach would be anathema. Naughty Internet gamblers! Bad Internet gamblers! Ban them and lock them up!
I called it imbecilic: and yet (credit where credit’s due) there may be method in their apparent madness. Back in 2001, when the American Gaming Association was appearing before a Congressional Subcommittee, it grumbled that offshore Internet gambling sites: “Frustrate important state policies, including restrictions on the availability of gaming within each State… Unregulated Internet gambling that exists today allows an unlicensed, untaxed, unsupervised operator to engage in wagering that is otherwise subject to stringent federal and state regulatory controls. These controls are vital to preserving the honesty, integrity and fairness that those in the gaming industry today have worked so hard for so long to bring about.”
Untaxed. That is the key word here, the one which gives all governments the screaming habdabs. Those blasted foreigners are not paying tax – just look at all that gorgeous, lovely, sexy money being pumped offshore! Why not introduce an Internet protectionism; oh, sure, it’ll mean that us boys at the AGA don’t get any competition, but hell, we can put up with that! We’re good, God-fearing, tax-paying US citizens!
But the problem for the US government is that those blasted foreigners are not taking things lying down. In Antigua and Barbuda, gambling is the driving force behind the economy: the tiny nation is home to more than 500 remote gambling sites; it has been estimated that 3,000 of its 67,000 inhabitants are employed in the on-line gambling business. And many of its best customers are American. Accordingly, when the US government began using the Wire Act to prohibit foreign transmission of gambling information, its Antiguan counterpart lodged a complaint with the World Trade Organisation, which found in November 2004 that the US restrictions on remote gambling were in violation of international trade agreements. The US government, naturally, appealed against this verdict, before altering its standpoint and claiming that it was no longer in breach of the trade agreements, but hey, you still can’t gamble on Antiguan web sites. Much paperwork later, a WTO Dispute Settlement Body (DSB) will shortly adjudicate on the matter: whether the US will abide by its ruling is another matter entirely.
Well, then, we have had the two extremes: now for the middle ground, which is what Tessa Jowell, the UK Culture Secretary, is proposing. I am no lover of regulation for regulation’s sake; but in this instance the British government has been careful and reasoned at just about every stage in the process. Recognising the international nature of the industry, it has called for the implementation of international standards of regulation. Recognising that would-be punters tend, not unnaturally, to be more attracted to gambling websites where Frenchie and Slim don’t disappear into the sunset, it has backed the introduction of kitemarks for the online industry. It has worked with the Gambling Commission to develop proposed licensing conditions and codes of practice. It has called for greater co-operation with the industry as a whole.
But before I get too lovey-dovey with Tessa Jowell, I would venture to suggest that at that heart of the matter, there is a great deal of similarity between the US and UK governments. Both are aware that gambling is not going to go away; both, being pragmatists, would like to earn some money from it. However, while the US approach is rooted firmly in the past, in the robust and defiant protectionism it has so often mocked when its own products are being exported, the UK option has been to appreciate that it is better to have the gambling companies inside the tent rather than outside it (in Antigua, Gibraltar, or wherever). It is an approach which will ensure that the industry is well regulated, that gamblers’ rights are protected; it will also be an approach which is more financially rewarding. So I would say to the US government: do you sincerely want to be rich? If so, listen up.
Dr Windsor Holden is Principal Analyst at Juniper Research
Mobile Gambling Markets Re-Assessed (Post US Legal Changes), 2006-2011,  published by Juniper Research, January 2007
www.juniperresearch.com

Designing solutions based on specific customer requirements sounds like Nirvana for telecoms users.  Chris Britton explains how Multi-Network Operators are aiming to fulfil the dream

Historically, corporate customers were compelled to rely on incumbent public telecoms operators (PTOs) for their telecoms services. Now, an increasingly liberalised telecoms sector has led to fierce competition among both established telecommunications operators and newcomers, which is delivering choice and improving quality of service and price benefits across the majority of markets.

In this deregulated telecoms world, a new type of company has emerged – the Multi-Network Operator (MNO). Unlike traditional, network-centric telecoms operators, MNOs do not own the infrastructure over which their services are provided.
Instead, they design tailored solutions based on their customers’ specific communications requirements, using an optimal combination of telecoms networks and technologies.   Relying on their skills, experience, service management capabilities and partnerships, MNOs bring added value to enterprise customers, as well as to their network infrastructure partners.
So why is the term ‘multi-network operator’ used here in preference to the more familiar ‘virtual network operator’ (VNO)?  The key reason is that MNOs understand there is nothing ‘virtual’ about their customers’ networking requirements.
Regardless of the underlying infrastructure, MNOs take full responsibility for designing, implementing and managing network solutions.  Indeed, while some traditional telecoms operators offer different service levels depending on whether a solution is ‘on-net’ or ‘off-net’, MNOs approach service level commitments more from an end-to-end perspective. 
The term 'Multi-Network Operator' highlights the way MNOs use their expertise, skills, tools and solutions to find the optimum mix of network technologies on their customers’ behalf. They do this via multiple routes, from multiple sources and across multiple geographies, before integrating the disparate components and operating them as a single solution.
MNOs also differentiate themselves from VNOs via their product focus. VNOs have previously concentrated on virtual private network (VPN) solutions, which involve connecting multiple sites using shared Internet infrastructure. While not ignoring VPN services, MNOs focus on delivering high capacity, custom-built network solutions using dedicated leased line, optical fibre and Ethernet technology, which they manage 24x7. 
MNOs believe that business customers have become accustomed to receiving substandard service from traditional telcos and are crying out for significantly higher service levels for their wide area networks.  It is a sad fact that there are now whole industries – such as telecom expense management and outsourced service level management – that exist because customers feel the need to ‘police’ the traditional telco model. 
MNOs represent a more customer-service driven approach.  In this respect, they offer several key benefits over single carriers operating alone.
Traditional telcos are inevitably focused on maximising the utilisation of the network infrastructure they own. Everything they do – the way they price services, target customers and select technologies to offer – is geared towards squeezing as much value as possible out of that capacity. In contrast, MNOs are constrained by neither the fiscal nor the physical limits of a single network. 
This is an important advantage when servicing the needs of large corporates. The ongoing globalisation of industry means that enterprises increasingly need their networks to provide services to all parts of the world. Consequently, they are looking for end-to-end solutions that will almost certainly demand a combination of the network facilities of carriers in their home region with providers operating in territories across the globe.
By managing multiple networks instead of relying on one provider with a limited footprint, the best MNOs enable customers to expand their business network into new markets and remote, hard-to-serve locations faster and more cost-effectively than ever before.
The MNO approach can also offer increased choice and flexibility for customers in other ways. Rather than focusing on maximising capacity in one particular network, it concentrates on delivering the combination of network technologies that best suit the business needs of the specific customer. Once this is decided, the MNO can concentrate on putting these networks together as a seamless whole and managing them to a single service level agreement. 
In addition, MNOs can help their customers to achieve lower total cost of ownership by reducing the hidden costs of using a traditional telco. Again, the best of these operators will adhere to robust global service level agreements, provide a single point of contact for multiple vendor solutions and deliver accurate, easy-to-understand invoices to customers.
Leading MNOs also have the expertise to integrate multiple transport and networking technologies ranging from private line to satellite and from IP-VPN to Frame Relay, coupled with the experience to deploy solutions that combine fixed broadband, managed mobility and secure site-to-site connectivity.
MNOs’ in-depth understanding of their customers’ business also enables them to offer consultancy as enterprises migrate to triple play solutions and next generation technologies like Ethernet and multiprotocol label switching (MPLS) and are faced with the need to integrate and interconnect increasingly disparate environments.
For the MNOs, committed to providing the best possible networking solutions for their customers, the move to these fast and efficient new networking technologies is inevitable and to be welcomed. And they are much better placed than carriers to provide an effective migration strategy.
This is because unlike carriers, they do not own the network infrastructure and therefore they have no financial stake in the embedded legacy environment. Effectively, they take on the role of reseller of network capacity and services. For these reasons, they are much more likely to prioritise the need to migrate customers to new technologies.
The MNO approach can also provide both network “route” diversity and vendor diversity. Many enterprise customers today see this as a critical requirement to ensure that their networks are both reliable and highly cost-effective.
Leading MNOs will be able to use network design tools to identify diverse, alternative transport routes to eliminate network ‘single-points-of failure', while drawing on a portfolio of wholesale carrier relationships to give customers an immediate and cost-effective second or third carrier option.
But it is not only customers that benefit from the MNO approach.  More and more facilities-based telecom carriers are actively looking for opportunities to work with MNOs because they recognise that the MNO can be a strategic sales channel for their wholesale efforts, allowing them to support customer requirements they would otherwise never have been able to fulfil. 
Increasingly, the MNO model is attracting telecom industry leaders.  GTT is a good example of this, with several members of its board boasting leadership experience with companies such as Sprint, AT&T, Equant and Nextel.  These executives recognise the value of the MNO’s service-driven, customer-centric approach.  They understand there is a category of business customers that needs a more closely tailored approach to their wide area network requirements.
They also recognise the market trends driving the adoption of the MNO model: enterprises are more global in focus; they are moving into new, hard-to-serve markets and they want greater network diversity to address business continuity concerns.  Having seen the strengths and the limitations of traditional telcos, these executives appreciate, perhaps better than anyone else, that the time is right for enterprise customers to consider the MNO approach.
Certainly, the prospects for MNOs appear positive.  Armed with a business model that addresses today’s telecom environment, MNOs can offer enterprise customers a range of benefits that are beyond what most traditional telecoms carriers can provide. Equally, this realisation is beginning to attract senior executives who have previously played key roles within large operator organisations to take a more active involvement in the new MNO model. With the current rapid rate of technological change making flexibility and diversity of product offerings more compelling and with the complexity of the market making the single point of contact that MNOs provide more attractive, the future looks bright indeed. 

Chris Britton is Executive Vice President EMEA, Global Telecom & Technology

European Communications presents its regular round-up of the latest developments in the world of telecommunications

Developing mobile advertising
The GSM Association (GSMA) and the Mobile Marketing Association (MMA) have agreed to co-operate to accelerate the development of mobile advertising worldwide.  The two organisations will collaborate to deliver standardisation and transparency around current mobile advertising activity, and to develop new, innovative advertising techniques.
The MMA will lead the development of guidelines, formats and best practices for mobile advertising, while the GSMA will work with mobile operators globally to develop and prioritise consistent structures, such as inventory types, and commercial and measurement models that will allow advertisers to create valuable advertising propositions.
“This partnership will build on the MMA's work-to-date in the development of mobile advertising,” says Bill Gajda, chief commercial officer of the GSMA.  “The MMA and GSMA will bring leading advertisers, agencies and operators together to ensure that this very promising, but nascent advertising medium realises its full potential for the benefit of all players in the ecosystem.”
The agreement follows the recent announcement of the GSMA's Mobile Media and Entertainment Group that will oversee its Mobile Advertising Programme, which is made up of representatives from leading mobile operators from around the globe.
 “The value chain for mobile advertising is more complex than other media channels, with the mobile operator playing a key role, hence the driver for collaboration.  The GSMA brings the global GSM mobile operator community to the table and we are pleased to be working with them to expand the reach of a sustainable mobile advertising ecosystem,” says Laura Marriott, president of the MMA.  “We look forward to working jointly with the GSMA to deliver a consistent global industry standard for mobile advertising.”
Details: www.gsmworld.com /  www.mmaglobal.com
MVNOs on the rise
MVNOs will continue to grow on a global basis – with worldwide subscriber numbers more than doubling for the period from 2007 to 2012, according to a recent report, The Future of the MVNO, from telecoms research and consultancy firm BroadGroup Tariff Service. However the report warns that business models and distribution will need to change.
The report examines over 300 MVNOs in 37 countries, and profiles the main players in each of the main mobile markets where access to the incumbent's network has been allowed. It also evaluates the role of the country regulator in enabling the MVNO to become established in key markets.
The research reveals a wide range of different approaches and market drivers. The global mobile market is becoming more fragmented with the power of brands and distribution – together with the emergence of new low-cost MVNE aggregators – favouring the development of emerging niche MVNOs based on a small social community. The report features case studies based on in-depth interviews with BT, Lebara, Virgin Mobile and Blyk, each using a different business model.
Retailers and non-telecoms companies with strong customer relationships are using the MVNO model as a marketing tool to broaden and improve their existing customer experience, and so improve customer retention for their core business.
The distinction between pure MVNO and pure MNO is likely to become increasingly difficult to sustain as the MNO is utilising the MVNO technique of sub-brands or multi-brands to retain loyal customers.  As the larger MVNOs grow their subscriber base they also seek to develop a post-paid business stream and are adopting the characteristics of the MNO.
“The MVNO model is perceived as a perfect low cost entry vehicle to launch new mobile business models,” comments Margrit Sessions, Managing Director of BroadGroup Tariff Services.  “MVNOs can help lower prices in a market, but purely competing on price cannot be sustained as a long-term strategy. Developing new business models and distribution will be key to success.”
Details: margrit.sessions@broad-group.com

WiFi health scare
Health concerns surrounding WiFi have the potential to seriously undermine consumer confidence, and affect competition in the telecoms marketplace, according to telecoms consultancy Logan Orviss International.
The scientific community appears polarised by health concern reports – such as the 'test' that allegedly proved that WiFi radiation in the classroom was three times the level generated by mobile phone masts – with a portion of the community believing caution is imperative, and the remainder believing it is all irresponsible scaremongering.
“If schools across the UK are starting to rethink implementing WiFi, as reports have suggested, confidence is already rattled,” says Hugh Roberts, senior strategist for Logan Orviss. “Consumer behaviour and purchasing decisions in the private sector will be impacted.”
Roberts continues: “It is important to consider what could happen in the communications value chain. Wi-Fi offers a form of 'mobility' for fixed line operators who want to offer their customers converged services that include 'out of home experiences' without incurring mobile roaming tariffs for voice and data services. Even a small erosion of consumer confidence – which is now almost inevitable – will change the competitive landscape and will undoubtedly influence the future re-structuring of the telecoms industry.”
Logan Orviss notes two other areas that might become affected if these scare stories continue. One – telcos are investing in convergent services targeted at family groups, where the bill payer (typically a parent) is responsible for the overall profile of the family's usage, although individuals are able to top-up or modify their accounts in defined ways. Home networks – typically WiFi – have been an important part of the development of this comprehensive offering. And two – apart from the potential decline in customer revenues from hardware and usage sales, telco advertising revenues for certain types of convergent services that utilise WiFi may be hit. Even with the current level of concern, the advertiser profile will start to change.
Details: www.logan-orviss.com
Must have VoD
Video on demand (VoD) revenues will reach $12.7 billion worldwide in 2011, making it one of the fastest-growing digital content services over the forecast period, predicts analyst and consulting company Ovum. Starting from a base of $2.7 billion in 2007, Ovum expects to see more telcos across the globe launching their on-demand content propositions, moving themselves into content distribution.
"VoD is not a revenue generator at the moment but a 'must have' vision of the future in terms of both cash flow and telcos' content business survival," says Aleksandra Bosnjak, Content and Media Analyst at Ovum.
"From a content provider's perspective, telcos and ISPs will be the new contributors to content distribution and film finance, especially over the long term as the service improves and reaches a more significant scale and enhances its on-demand functionalities," explains Bosnjak.
Telcos are facing competition from all kinds of players - from old pay TV media to new digital distribution entrants - and the pressures of network convergence. This, coupled with challenges around content acquisition costs and finding the winning VoD business model formula, will mean that it is not a source of cash for the moment.
"We argue that over the next five years, 50 per cent of telcos' costs will come from content acquisition and marketing-related activities," says Bosnjak. "In their quest for an innovative content strategy, some telcos will experiment with various forms of content finance, such as financial backing via minimum guarantees, or go even deeper into the actual co-productions or co-ventures. In fact, we predicted this move back in February 2006 when we ran into telcos at the Berlin Film Festival. And we already see it happening with France Telecom and a baby IPTV operator Croatia Telekom Max TV service, which is producing its own short-format shows, and by using its own in-house production talent and facilities."
Ovum's view is that a careful content strategy and locally adapted VoD proposition will be a major driver of telco VoD service revenues, now estimated to comprise one third of the whole VoD revenue pie, depending on the country.
"Understanding the cash flow of traditional content distribution and collaboration with local content players will be the best approach for many operators in this tough VoD race - because the future of TV content, and especially European content distribution, is based on an on-demand business model," concludes Bosnjak.
Details: www.ovum.com

The One Laptop Per Child initiative has drawn both great praise, and considerable criticism.  Lynd Morley takes an overview of the debate and the role telecommunications can play

The (almost) legendary Nicholas Negroponte was a keynote speaker at TMW in Nice this year.  The co-founder and director of MIT Media Lab, and author of the seminal Being Digital, came to talk about another of his brain-children – the One Laptop Per Child (OLPC) initiative.
It may not seem the most obvious subject for a keynote address at one of the leading communications OSS events, but Negroponte pointed out in his opening remarks: “I think that a lot of the big problems in this world will all have solutions that include education.  And telecommunications and education are intimately tied.  I’m very fond of telling ministers of telecommunications that they are, in fact, ministers of education.  Because, until the world is really connected, education remains a very narrow phenomenon.
“If we look ahead to a world where children  – who are global by nature – have the opportunity to communicate with each other and learn, clearly telecommunications has a big role to play.”
OLPC is a non-profit organisation, set up with the goal of providing children in developing nations with laptop computers, offering – among other things – access to a whole raft of information.  As Negroponte stresses, OLPC is an education initiative, not a laptop organisation.
However, this apparently completely altruistic activity has generated an amazing amount of criticism and backbiting; some from the industry itself – including Intel and Microsoft – and much of it among the tech-heads and bloggers whose comments – some angry, some well meaning, some passionate about the greater need in developing nations for a whole range of things from fresh water, to food, to healthcare – have provided a huge theatre of debate.   Indeed, the arguments came to a head recently when Negroponte effectively accused Intel of damaging the non-profit scheme by launching a competitive product – the Classmate PC – resulting in some countries, previously behind OLPC, now considering their options.  Given that Intel chairman Craig Barrett initially described the OLPC laptop – the XO-1 – as a ‘gadget’ and questioned the effectiveness of the scheme, the Classmate is an interesting development.  Intel, however, strenuously denies that it is undercutting its own prices in order to push OLPC out of what it has now decided is a lucrative market.
Other giant players in the industry, however, are committed to the OLPC initiative.  BT is backing the project.  Its chief science officer, Sinclair Stockman, comments: "The project aims, through connecting even the most disadvantaged to the Internet and the web, to provide them with an invaluable tool to build a better and safer future.
"One of the challenges - which is where BT is providing assistance - is how to extend the local net connectivity, which is built into the PCs and allows them to easily form local wi-fi based networks, on to the global Internet.
"This in turn allows Internet protocol access to a much richer source of information - and allows the children to participate in wider global communities.
“The technical challenges are just one hurdle to overcome,” he adds. "Others include language, content delivery, effective community sharing - and also assuring the trustworthiness of the connected community."
So, EC reader, whether you are of the Gordon Gekko school of thinking (greed is good, and b****r the consequences) or the – shall we say – Al Gore school, perfecting the art of ‘caring’, you can either join the critics and nay-sayers and dismiss the whole thing as another example of Negroponte’s supposed egomania (or simply misplaced good intentions), or you might conclude that, while it may not be perfect, at least the guy is trying something, and try to find out just how the telecoms industry can provide additional help.
Details: www.laptop.org

The European Conference on Optical Communication (ECOC) event organisers detail what to expect at this year’s show in Berlin

The 33rd annual European Conference and Exhibition on Optical Communication (ECOC) will take place on 16-20 September, 2007 at the Internationales Congress Centrum (ICC) in Berlin, Germany – Europe’s largest conference centre. 
The event, expected to be a sell-out for the first time since the height of the telecoms boom in 2000, will feature over 300 exhibitors and a comprehensive speaker line-up, including some of the world’s leading technical developers, both commercial and academic, addressing key industry topics.
Confirmed speakers at the conference include: Gregory Raybon of Alcatel-Lucent – 100 Gbit/s: ETDM generation and long haul transmission; Biswanath Mukherjee of the Department of Computer Science, University of California, Davis, USA – Optical Networks: The Road Ahead; and Russell Davey of BT – Long-reach Access and Future Broadband Network Economics.
Within the exhibition, seven of the world’s largest carriers, AT&T, China Telecom, Deutsche Telekom, France Telecom Group, KDDI, Telecom Italia and Verizon, under the banner of the Optical Internetworking Forum (OIF), will show the results of months of interoperability demonstrations.
“This is the first time that we have had service providers take part in the exhibition and this is a clear recognition of the rise of optical communications to become integral to all players in the telecom and datacoms sector,” says marketing manager, Simon Kears.
Also on the exhibition floor, visitors will be able to see and take part in a number of new, interactive features: The FTTx Resource centre, delivered by The Light Brigade, will be a focal point for all things FTTx; the ECOC Market Focus seminars will feature presentations from senior executives at JDSU, Bookham and a view on the European FTTH market from Heavy Reading’s Graham Finnie; the latest products will be showcased in the live demonstration area; and the CTTS will give free practical training courses in fusion splicing and fibre preparation tools.
Details: www.ecocexhibition2007.com

EVENT / LOCATION / DATE / CONTACT

EMC Europe / Paris / 14-15 June / www.theiet.org
Capacity CEE / Prague / 18-19 June / www.telcap.co.uk
NXTcomm / Chicago / 18-21 June / www.NXTcommShow.com
FTTx Summit / Munich / 18-21 June / www.iir-events.com
CommunicAsia / Singapore / 19-22 June / www.communicasia.com
Mobile TV World / Rome / 21-22 June / www.items-int.com
Optimising Telecoms Opex & Capex / London / 25-27 June / www.informatm.com
Mobile Content & Services / Berlin / 25-28 June / www.iir-events.com
Radio Planning Forum / Monaco / 25-28 June / www.iir-events.com
Service Quality Management & SLAs for Telecoms / Berlin / 25-28 June / www.iir-events.com
Telecoms Wholesale / Berlin / 25-28 June / www.iir-events.com
Revenue Assurance Summit / Kuala Lumpur / 25-28 June / www.iqpc.co.uk
Telecoms Loyalty & Churn / Barcelona / 25-29 June / www.iir-events.com
WDM & Optical Networking / Cannes / 25-29 June / www.iir-events.com
Digital Home / Berlin / 2-4 July / www.iqpc.co.uk
VoIP Asia / Singapore / 16-19 July / www.iir-events.com
Black Hat USA / Las Vegas / 28 July-2 August / www.blackhat.com
Telecoms World Africa / Johannesburg / 30 July-3 August / www.carriersworld.com
MVNO Summit / Chicago / 6-8 August / www.iqpc.co.uk
Wi-World Africa / Johannesburg / 27-30 August / www.carriersworld.com
GSM>3G ME & Gulf / Dubai / 2-3 September / www.gsm-3gworldseries.com
Fixed Mobile Convergence / Chicago / 5-7 September / www.pulver.com
SDP & SOA in Telecoms / Berlin / 5-7 September / www.marcusevans.com
IBC 2007 / Amsterdam / 6-10 September / www.ibc.org
Telecoms Quality & Business Process Excellence / Vienna / 10-11 September / www.jacobfleming.com
Branding in Converging Comms / Berlin / 10-11 September / www.jacobfleming.com
Effective HR Management in Telecoms / Amsterdam / 10-11 September / www.jacobfleming.com
EXPP Summit / London / 10-11 September / www.expp-summit.com
Nordic & Baltic Telecom Forum / Helsinki / 10-12 September / www.marcusevans.com
Mobile Device Management / Amsterdam / 12-14 September / www.marcusevans.com
User Generated Content & Social Networking / Rome / 12-14 September / www.marcusevans.com
Evolving Telecoms / Berlin / 17-19 September / www.marcusevans.com
ECOC 07 / Berlin / 17-19 September / www.ecocexhibition2007.com
MVNO Congress / Vienna / 17-20 September / www.iir-events.com
Number Portability / Prague / 17-20 September / www.iir-events.com
Strategic CRM in Telecoms / Lisbon / 20-21 September / www.jacobfleming.com
VSAT 2007 / London / 24-27 September / www.comsys.co.uk
Carrier Ethernet World / Geneva / 24-28 September / www.iir-events.com
GSM>3G CEECom / Prague / 25-26 September / www.gsm-3gworldseries.com

Ignoring business continuity is no longer a reasonable option, yet many companies are wary of being sold a dud.  Patrick Roberts looks at the positive steps that can be taken to ensure a better-informed choice of solution

According to the Chartered Management Institute’s 2006 Annual Survey of Business Continuity Management, less than half of the organisations surveyed (49 per cent) had a “Business Continuity Plan covering their critical business activities.”  In understanding why so many businesses are still not investing in business continuity, despite the proven benefits, it is instructive to look back at an article, “The Market for ‘Lemons’: Quality, Uncertainty and the Market Mechanism”, published by Nobel Prize-winner George Akerlof over 30 years ago.  The article is an elegant exploration of the situation that evolves when the buyer of goods or services does not know the true value of what they are buying: they cannot distinguish between a high quality product and a ‘lemon’.

BUSINESS CONTINUITY - A lemon by any other name?

Consider, for example, what would happen to the used car market if only two types of vehicle were offered for sale – high quality cars worth £10,000 and jalopies worth only £2,000 – and prospective purchasers could not tell the difference.  A buyer who believes that there are equal numbers of both types of vehicle on the market might be inclined to think: “I have a 50/50 chance of getting a good car or a ‘lemon’, so I’ll offer up to £6,000 for a car.”  But on further reflection the buyer realises that the owner of a quality used vehicle is very unlikely to sell it in a market where people are only willing to pay half of what it is really worth (they will find an alternative means of selling it), so, in reality, all this market will contain is jalopies.  The prospective buyer then determines not to pay more than £2,000 for a car bought in this way.
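To make the arithmetic concrete, the adverse-selection spiral can be sketched in a few lines of Python. This is purely an illustrative toy using the figures from the example above (£10,000 quality cars, £2,000 jalopies, an assumed 50/50 starting mix); it is not drawn from Akerlof's paper beyond the general argument.

# Toy illustration of the 'lemons' argument, using the figures above.
# Assumption: buyers offer the expected value of a randomly chosen car;
# sellers withdraw any car worth more than the prevailing offer.

QUALITY_VALUE = 10_000   # what a good used car is really worth (GBP)
LEMON_VALUE = 2_000      # what a jalopy is really worth (GBP)

def buyer_offer(share_quality):
    """Expected value of a randomly chosen car, given the believed mix."""
    return share_quality * QUALITY_VALUE + (1 - share_quality) * LEMON_VALUE

share_quality = 0.5                 # buyer's initial belief: 50/50
offer = buyer_offer(share_quality)  # 0.5 * 10000 + 0.5 * 2000 = 6000
print(f"Initial offer: GBP {offer:,.0f}")

# Owners of GBP 10,000 cars will not sell at GBP 6,000, so they withdraw;
# the believed share of quality cars collapses and the offer falls with it.
if offer < QUALITY_VALUE:
    share_quality = 0.0
offer = buyer_offer(share_quality)  # only jalopies remain -> 2000
print(f"Offer once quality cars withdraw: GBP {offer:,.0f}")

Running the sketch prints an initial offer of £6,000, which collapses to £2,000 once the owners of quality cars withdraw – exactly the unravelling described above.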
Measuring the true value of an investment in business continuity management is extremely challenging for even the most experienced business continuity practitioners, let alone a prospective purchaser with little knowledge of the subject.  The stage is therefore set for a classic ‘lemons’ problem in which prices in the sector are forced down and, ultimately, both buyers and sellers are driven away from the market.  Whilst, in the illustration above, there are numerous practical alternatives to buying a second-hand car (eg buying new, or using public transport), simply ignoring business continuity is no longer a sensible option for most organisations.  Obviously the onus to remedy the situation lies largely with the business continuity profession, and much is already being done by both individuals and professional bodies, such as the Business Continuity Institute, to improve awareness and understanding.  However, those involved in purchasing business continuity products and services - including training, consultancy services and IT solutions – also have a vested interest in becoming more knowledgeable in order to ensure that they are getting the best value for money.  The rest of this article concentrates on the positive steps that people in this latter group can take to ensure that they are better-informed consumers.
The publication of BS 25999-1 in November 2006 was a very important milestone, establishing a simple and robust lingua franca for business continuity management.  The document itself is less than 50 pages long, so any prospective purchaser of business continuity services would be well advised to take the time to read it.  Building on this foundation, a great deal of general business continuity information is available in the form of public presentations: business continuity practitioners speak regularly to audiences from numerous professional and business organisations, and many of these events are free.  As one becomes more knowledgeable, it is also worth considering attending specialist events such as the Business Continuity Expo and the Business Continuity Institute Annual Symposium, where industry-leading practitioners discuss best practice and examine topical issues.  These simple steps give the knowledge and confidence to ensure that what is being purchased is actually appropriate to the business needs.
Finally, a wide range of high-quality business continuity training is now available, with many organisations offering inexpensive one or two-day introductory courses to the subject.  Some providers also offer courses in a convenient evening-class format or can even deliver bespoke training in your workplace.  Scenario-based crisis management exercises, where a management team has to wrestle with the difficulties of a simulated incident, are also a very enjoyable and effective way to raise awareness of business continuity issues and improve the ability of individuals and teams to manage in a crisis.  Taking advantage of some of these numerous training opportunities is undoubtedly the best way to equip yourself as a sophisticated business continuity consumer and ensure that you are getting good value from your investment.
In conclusion, the following simple case study is offered as a recent illustration of how improved buyer understanding can create real win-win outcomes for buyer and seller.  The client in question was a small public-sector body who received a day of Crisis Management training for their executive team as part of a national programme delivered by Needhams 1834 on behalf of the Cabinet Office Emergency Planning College.  As a result of this training, the executive team realised that their existing business continuity plan was not fit for purpose; it also gave them the confidence to ask Needhams 1834 to conduct a review of their plan and deliver a simple exercise.  In the event it only took an additional seven days of work to provide the client with a far more robust business continuity plan and facilitate a simple exercise to familiarise the crisis management team with its contents.  Surely this is a far better outcome (for both parties) than the client struggling on with an inadequate plan or paying for a great deal more consultancy than they really needed: improved understanding by the buyer led to a win-win situation.   

Patrick Roberts is Senior Consultant with Needhams 1834.

Needhams 1834 Ltd will be exhibiting at the Business Continuity Expo and Conference held at EXCEL Docklands, London from 28th - 29th March 2007.
www.businesscontinuityexpo.co.uk

Enterprise mobility should focus on message delivery, not device type, insists Peter Semmelhack

The benefits of mobile business solutions are hard to ignore for most enterprises today. Used for everything from tracking home visits by medical staff and maintaining airport x-ray machines to keeping beer flowing in pubs and hotels, mobile applications are giving customer-facing employees in the field the power to run their business wherever they happen to be. As Rob Bamforth, mobile applications analyst at Quocirca so neatly puts it, "the key application of all mobile devices is communication, and the 'killer' feature is relevance to the user."

ENTERPRISE MOBILITY - Don't get hung up on the phone

The debate has moved on to how best to manage mobile devices and treat them as the business enablers that they truly are. IDC predicts that $52 billion will be spent on all mobile services by 2010, with $1.5 billion of this being spent on mobile device management and security.

Let the user be the chooser
Traditionally, field service has been the poster child for enterprise mobility since, by its nature, it involves the delivery of timely customer information to a geographically dispersed workforce who are mobile for the majority of their working day. Because of this, field service engineers have often borne the brunt of early attempts at force fitting enterprise applications onto mobile devices that were never designed to handle that level of complexity. The result was many failed or less than stellar deployments due to lacklustre user adoption.
The BlackBerry revolution awoke executives to the freedom, flexibility and efficiency that could be gained through mobile access to applications. Now mobility is being demanded not only by field service operations but also in a variety of customer-facing roles. As IT managers juggle the mobility requirements, user preferences, and budget requirements of different business units across the enterprise, they soon realise that when it comes to mobility solutions one size clearly does not fit all.
A survey undertaken at Service Management Europe found that 23 per cent of respondents cited user acceptance as one of the major hurdles to successful mobilisation of data applications. This is precisely why Pitney Bowes UK set up working groups to consult its field service engineers on their device preferences and mobile application interface requirements, long before it standardised on a hosted enterprise mobility solution to deliver SAP and Siebel updates directly to field service engineers' devices throughout Europe.
It is therefore vital to consult field staff on the best device for their needs and the best interface to aid them in their collection of updates and daily reporting of services delivered or sales made. It is also important to retain flexibility by choosing a solution that can support future migration to new devices. The bottom line is that choosing the right mobility solution from the start will lead to fewer headaches when deciding on devices and networks, and when managing the evolution of the system over time.

Change costs
One of the major issues concerning IT directors is total cost of ownership (TCO). This cost is generally driven down by simplification, so any increase in complexity, such as a change in device type across the enterprise, or even in only one part of it, will inevitably increase TCO. And one thing is certain: change will come.
The device market is changing so fast that any decision made at the time of deployment could be almost obsolete within six months with newer, as yet unknown, devices coming onto the market offering immediate and significant incremental business value and/or cost reduction opportunities. In addition, the recent litigation between NTP and RIM sent shockwaves through the enterprise community in the US, where the BlackBerry has become essential to the average executive. With some employees becoming more dependent on their mobile device than their desktop PC, the enterprise mobility strategy must include the flexibility to minimise disruption if the company needs to switch devices to accommodate mergers or acquisitions, change of operator, customer demands or new technologies.
Gartner predicts that the overall TCO for mobile solutions will rise by 30 per cent for most enterprises. Gartner attributes this cost to "the increased support costs for a more-disparate set of mobile data users, lack of management of recurring monthly charges for mobile data services and the need to support point solutions across multiple types of wireless data offerings." Inevitably, if an enterprise opts for a mobility solution that already supports the majority of devices on the market, then this will reduce the costs, while also reducing the time to roll out mobile applications to a new set of devices. Therefore, enterprises must plan to have a multiple device environment to enable the business to take advantage of the value of new devices as they come onto the market. The cost of supporting multiple device specific applications versus a single application for multiple devices must be considered.

International deployments
When rolling out mobile applications to thousands of users across multiple territories it is vital to consider choosing software that supports the majority of devices without needing any modification. So whether staff choose to work on PDAs, Pocket PCs, BlackBerrys, notebooks or laptops, or even a touch tone phone using IVR, they should still receive the same updates from the back-end applications running over SMS, GSM, GPRS, 1x RTT, 2 way paging or Mobitex.  This means you must deploy a flexible and extensible software solution that accommodates multiple language support, and both wireless device and network variances without any costly re-engineering of the application.
Field sales and service pose particular problems for larger enterprises because they involve the management of data delivery across multiple territories, over numerous networks. For example a field engineer may need to run the same field service application interchangeably on either his BlackBerry or his laptop depending on the service task he is undertaking. He may prefer to use his laptop for diagnostics in an area with poor wireless coverage. However at the next job he may prefer to use his BlackBerry. Using real time data communications, backed up by 24 x7 monitoring from a reliable Network Operations Centre, ensures consistency and currency of the data, no matter which device is being used for the job in hand.

Host with the most
We have stated earlier that simplification reduces cost. One significant way to simplify application delivery is to use a hosted 'on-demand' model, enabling employees to access mobile applications on whatever device they happen to be working on at that time. Analysts at Unstrung advise that "if the use of BlackBerries, Treos, or other mobile messaging devices extends beyond a few top executives, it's time to consider outsourcing the management thereof." Using a hosted model allows central management of devices, with different user privileges for different groups. Depending on the service provider, this model can also enable 24 x 7 monitoring of multiple networks around the world, to guarantee message delivery.
According to industry analyst AMR, the software-as-a-service model grew by 60 per cent in 2005, and this is driving the CRM market. The next logical step is to extend this hosted software to the field sales and service staff that have most contact with your customers. That means getting critical data held within enterprise CRM and ERP systems, such as Siebel, NetSuite and SAP, onto their mobile devices, so that they have the most up-to-date customer information at their fingertips while they are on site with customers.
We have discussed the need to choose an enterprise mobility solution that supports the widest range of handheld devices. But sometimes the most effective way to get critical information to an engineer is to route the message to a landline near the site where the engineer is working. This is particularly important for employees working on hidden assets such as underground piping, or in hospital environments where there is no wireless coverage. So enterprise mobility isn't always about accessing data from a PDA or phone, sometimes traditional communications technologies are the most effective route to get messages to your mobile employees. So don't get hung up on the phone, or on one particular device. Enterprise mobility is all about ensuring that relevant up to date information reaches the right person at the right time - using the most effective delivery method possible.
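As a purely hypothetical sketch of the delivery-first approach described above, the following Python fragment chooses a route for a message based on the recipient's devices and current coverage, falling back to a landline/IVR call where no wireless route exists. The channel names, preference order and Recipient structure are invented for illustration only; they do not represent Antenna Software's, or any other vendor's, actual product.

# Hypothetical sketch: pick the most effective delivery channel for a
# message, rather than assuming one device type. Names are illustrative.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Recipient:
    name: str
    devices: List[str]                       # e.g. ["blackberry", "laptop"]
    has_wireless_coverage: bool              # at the current work site
    nearest_landline: Optional[str] = None   # e.g. a site phone number

# Preferred order for wireless delivery; purely an assumption for this sketch.
WIRELESS_PREFERENCE = ["blackberry", "pda", "laptop"]

def choose_channel(recipient):
    """Return a human-readable description of the chosen delivery route."""
    if recipient.has_wireless_coverage:
        for device in WIRELESS_PREFERENCE:
            if device in recipient.devices:
                return f"push update to {recipient.name}'s {device}"
    # No usable wireless route: fall back to a fixed line near the site.
    if recipient.nearest_landline:
        return f"deliver by IVR call to landline {recipient.nearest_landline}"
    return "queue the message until a delivery route becomes available"

# Example: an engineer working on underground piping with no coverage.
engineer = Recipient("J. Smith", ["blackberry", "laptop"],
                     has_wireless_coverage=False,
                     nearest_landline="01234 567890")
print(choose_channel(engineer))

The point of the sketch is simply that the routing decision lives in one place: supporting a new device type or network later means extending the preference list, not re-engineering the application.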

Peter Semmelhack is CTO of Antenna Software
www.antennasoftware.com

As telecommunications vendors clamour to develop and launch products based around the Advanced Telecom Computing Architecture (ATCA), operators and equipment manufacturers must decide how the standard will shape their business. Robin Kent outlines the core propositions of ATCA, its market potential and the ways in which operators and telecommunications equipment manufacturers can harness its value.

Anybody that has tried his or her hand at plate spinning will know how hard it is to build momentum and keep the plate moving. Anyone skilled - and brave - enough to attempt the feat with several plates at once, will be able to relate to the challenges that the network operators face every day.

ATCA - Balancing transition

The process of migrating from tried and tested proprietary systems to relatively new standards-based platforms is fraught with risks, challenges and concerns. Operators cannot afford to make a sudden switch from one system to another, so no small amount of balance and co-ordination is required to ensure the transition is as seamless as possible. Whilst some standards can be overlooked in favour of proprietary systems, others are simply too significant to ignore.
As one of the most significant telecommunications standards to be developed in recent years, ATCA is set to have a massive influence on the telecommunications industry with more than 200 vendors, ranging from shelf through to card and chassis manufacturers, already developing products based on the standard.
ATCA is an open, standards-based platform created by the PCI Industrial Computer Manufacturers Group (PICMG) to provide an architecture for modular components that are capable of rapid integration into high performance and carrier grade service solutions.  The main objective of the standard is to increase levels of interoperability between different vendor components to offer greater flexibility when developing infrastructures. In turn, this enables operators to benefit from:
•    Reduced time to market for new applications
•    Lower upfront development costs
•    Less reliance on proprietary solutions
•    Increased cost savings through economies of scale
•    Greater flexibility of service offerings
•    Protection of investment
ATCA has been developed with the needs of the communications industry at its heart and is particularly likely to benefit organisations in the access, edge, transport and service delivery environments.  Whilst there will be peripheral benefits outside of the telecommunications space, it is estimated that approximately 95 per cent of ATCA applications will lie within the communications edge/access and core markets (Source: Venture Development Corporation).
Given the level of attention currently being paid to the development of ATCA-compliant products by major players like Sun Microsystems, Intel and Motorola, it is widely anticipated that ATCA, along with its sister architecture MicroTCA, will become the dominant open standards force in the Communications Computing Equipment space by 2009.

Market acceptance
Since it was first formally introduced in 2003, ATCA adoption has outpaced that of previous standards-based architectures, such as PCI or VME, yet market penetration has remained relatively low, as many carriers remain largely unaware or unconvinced of ATCA's value proposition. Telecommunications equipment manufacturers (TEMs), too, have taken time to explore the full business case for ATCA and determine their approach to the standard. In many regards, it faces the classic challenges of standardisation (eg compatibility, proven interoperability, slow uptake and competition with the sheer volume of other standards), but recent developments within the telecommunications industry are set to change ATCA's fortunes for the better.
2006 saw the creation of two new industry organisations that are now helping to develop the mainstream market for a standards-based communications platform.  At the beginning of the year, the SCOPE Alliance was formed to promote the availability of open carrier-grade base platforms based on commercial off-the-shelf (COTS) hardware/ software and free open-source software building blocks. The organisation's efforts were then bolstered by the formation of the Communications Platforms Trade Association (CP-TA) in April 2006 to address the interoperability certification requirements, which meant that the core elements for creating a mainstream market were now in place for the first time.
Because these components have only recently come together, ATCA is still a technology in development and remains the subject of extensive trials and evaluations.  This practice is likely to become more commonplace through 2007, but it could be 2009 before large-scale field deployments commence.

Applications: ATCA and IMS
ATCA reflects a fundamental shift in the industry's approach to the development of telecommunication architectures.  The days of structured deployments, planned well in advance, for long-life revenue generating applications are a thing of the past; they are being superseded by constantly evolving architectures, capable of providing the flexibility to meet service providers' demands quickly. Perhaps one of the most compelling illustrations of ATCA's benefits is its role in facilitating the IP Multimedia Subsystem (IMS).
The IMS proposition is centred on the flexibility it gives operators to provide customers with the very latest revenue-generating IP-based services, such as interactive gaming or streaming video.  The open standards-based architecture of ATCA can be used to facilitate this requirement by enabling TEMs and service providers to re-use the computing foundation for their products.
ATCA provides the foundation for a flexible, scalable, high-performance architecture that is more than capable of meeting these demands.  It can help to lower both capex and opex by reducing the time to market for new services and significantly lowering development costs. This is a key consideration when factoring in the cost associated with operating a constantly evolving architecture and the potentially short commercial lifecycle of new applications and services.

To ATCA, or not to ATCA?
So far, we have discussed some of the clear benefits of embracing ATCA, but it is also clear that the standard does not offer an overnight panacea. Inevitably, there are aspects of the telecommunications network that will need to remain in place for the foreseeable future, and it is also true that some aspects of the infrastructure may not be best served by ATCA.
There are a number of scenarios in which it might be more appropriate to use non-ATCA products, including entry-level, low-cost or mass-produced technologies, and high-density input and output interfaces associated with processing. In response to these challenges, the PICMG has already begun work on specifications for MicroTCA.
The MicroTCA standard promises to offer many of the features of ATCA in a smaller, lower cost format that is suited to network edge and enterprise applications. Because of this, MicroTCA is likely to have broader appeal beyond TEMs and operators but, crucially for the telecommunication market, it shares the same system architecture as ATCA. This means it can be controlled by the same management software and enables a relatively simple migration between the two platforms. Moving forward, it seems that combining the two standards will enable vendors and operators alike to benefit from greater economies of scale.
Over the next three to five years, systems architects will need to perform a plate spinning role to ensure that operators can reap the benefits of their existing proprietary systems alongside open standards-based platforms. With the rise of COTS hardware, open systems and applications, and the move toward IMS infrastructures underpinned by ATCA, the economic viability of proprietary systems is inevitably being called into question.  However, the applications developed on these networks represent one of the largest overheads in operators' operational expenditure, so it is understandable that they will want to protect the investment they have made in them.
Taking the step from proprietary to standards-based systems will necessarily require a phased approach, as there are key considerations for the TEM community to explore before they wholeheartedly commit to any standards-based platform.  The proprietary systems they currently operate have been developed to deliver 99.999 per cent availability, and ensuring this is maintained throughout both the migration process and the lifetime of the new network is a major priority. Some may also be concerned about the cost and time implications of re-writing applications as they move from legacy to standards-based systems. Others will focus on the fundamental change to the product development process as they shift from an internally run project to a multiple vendor management exercise.
Standards-based solutions offer far greater levels of portability than their proprietary cousins.  Suppliers exploiting the ATCA platform are not going to be locked into a specific hardware OEM because they are able to select the best solution for the same standard technology.  The openness of ATCA is comparable to that of the Linux OS, in that it is distributed by many vendors and supported on every ATCA CPU board.  Therefore, adopting a standards-based approach to infrastructure development can benefit both the customer and vendor by creating a robust environment with the flexibility and scalability to support innovation and evolution.
So how can operators begin to address these challenges? On the surface the options are clear - they either adopt or ignore ATCA - but both approaches have their downfalls.  Choosing the former at this particular point in time may result in long-term benefits, but until the technology is more widespread and proven it could be seen as an expensive, high-risk approach.  On the other hand, a failure to prepare for the introduction of ATCA-compliant technologies could cause them to lose ground to the competition over the medium to long-term.
It is clear that readying the infrastructure for the adoption of ATCA, without compromising the operator's ability to exploit existing and forthcoming revenue opportunities in the interim, has to be their ultimate goal.  To achieve this aim they will need to re-evaluate both the hardware upon which their existing infrastructure is based and the capabilities of the software systems they use to help them deal with emerging standards and protocols.

Robin Kent is Director of Operations with Adax Europe
www.adax.com

Organisations like the TeleManagement Forum have a dilemma when it comes to anniversaries. TM Forum’s 20th birthday in 2009 will naturally be a cause for celebration: you don’t get such a long run in this business unless you’re doing something right. But there’s a nagging worry too. Can a successful first 20 years as a thought leader, framework and standards setter – mostly for telecoms operational and business support systems – serve as a basis for another 20 years setting frameworks for an industry that is turning rapidly into something else, as new players muster at its borders? Does the heritage help or hinder when it comes to refining a role in the rapidly converging telecom, media and Internet industries, where the new tends to be seen as ‘good’ and anything else is consigned as ‘legacy’?

LEAD INTERVIEW - Brave New World

For Keith Willetts, the TM Forum's original co-founder and current Chairman, it's a question that soon answers itself, once you apply a little deep thought to the matter.
“What's become really clear, over the past year or so especially, is that convergence is here – it's for real and we're really at the start of the process,” he says. “What you've got is three trillion dollar industries - media, Internet and telecom - all coming together. You just have to pick up a newspaper, listen to the news or, of course, surf the web to see that it's happening. Who's Virgin bought?  What services is Skype offering now? All that sort of thing. And over the coming years we're going to see far more of this mixing and matching – where a company strong in one field takes over or forms an alliance with a company that's strong in another.”
For Willetts it's a process that brings opportunities as well as threats.  One apparent threat for many in the telecom industry is that telecom becomes sidelined in many markets as a new breed of player moves in and takes over. This extreme scenario might involve powerful, highly capitalised Internet companies, such as eBay with IP telephony company Skype (which it bought in 2005) completely disrupting the traditional telephony market.
“I use Skype and I'm amazed at just how good the service is. I wonder to myself, 'why would you need anything else?' But,” admits Willetts, “the more likely scenario is that we'll end up with a real mix of companies which take elements from all three sectors.”
There lies the opportunity. Willetts thinks the TMF can provide the frameworks that integrate the players, just as it has hitherto provided frameworks to integrate telcos' disparate back-office systems. The challenge is to apply its expertise in a new way.
“What we've been good at is helping our members develop a lean end-to-end process environment – a set of frameworks and standards encapsulated in our NGOSS (New Generation Operations Systems and Software) initiative that enables them to build flow-through business processes that cross the old internal demarcation lines that were, and often still are, such a feature in traditional telcos. Using NGOSS they can begin to join the dots between things like inventory, provisioning, service assurance and so on.”
What's clearly required in the new converged telecom-media-Internet world, he points out, is a similar set of guidelines at the inter-company level. “We are going to need standards and frameworks that reach beyond the company and the sector to automate things like content delivery, digital rights management and things we haven't even thought of yet.
“Of course, it's a huge area and there are a number of unresolved questions,” he says. “For example one specific conversation we've recently had within TMF has been around the possibility of defining a value chain. And we came to the conclusion that such a question presupposes we know who is going to be where in the chain. In fact, all we can actually say is that there will be value chains and there will be different people at different positions within them.  What we're looking at is the development of something more two- or even three-dimensional than a simple chain – it's probably better to think of these relationships forming something like a  'value web', where companies might sit in any one of several positions.  They might be undertaking one set of commercial roles in one territory and a different set in another.”
In fact, says Willetts, TMF as an organisation is keen to develop a role as an independent business and technical facilitator rather than being seen as the advocate of a particular, sector-specific, approach.  The reason is simple – the telecom industry itself won't exist as we know it five to ten years from now.  It's transforming, and as web and media companies are moving onto some of its traditional turf, telecoms itself is branching out into many new areas. 
 “It's important we aren't seen to be in the business of promoting any particular outcome,” claims Willetts.  “We want to be part of an environment where there can be a range of outcomes, shapes and services. The important thing is that user companies and providers can actually put the pieces together and have them work. It's a case of  'may the best man win'.”
So where exactly is the TM Forum running to?
First, TMF is inviting thought leaders from media and cable companies to join its Board in order to get a 360-degree view of emerging needs.  Second, it's rapidly broadening its business and software vision to encompass the needs of information and content-based services and the myriad of virtual providers and value chain players.  Third, collaboration with other bodies will be important and ongoing.  For example, TMF recently struck a landmark deal with the Broadband Services Forum (BSF), a formal partnership under which relevant work is shared.  In fact, the members of each organisation will have influence over related technical work in the area of telecom-web convergence issues, and the first fruits of the collaboration will show up in a new TMF document entitled “Telecom Media Convergence Industry Challenges and Impact on the Value Chain”. The relationship will also contribute to more multimedia focused panels at TM Forum events, and future development of process standards for content management and convergent media-telecom operations.
“One of the most exciting and fundamental things we're going to do is to develop what we're calling a 'super-catalyst', and we'll be kicking that project off at Nice this year.”
The TMF Catalysts are joint projects undertaken by members and sponsored by service providers, usually to demonstrate leading edge thinking on how to solve problems in integrating the back office, using approaches based on  TM Forum standards and guidelines. The results of these projects are demonstrated at TMF's TeleManagement World conferences in Nice and Dallas each year.
 “The super-catalyst, which we're likely to call the Converged Services Showcases, will be  really major events, involving media companies, device companies, cable TV, IPTV and mobile TV,” says Willetts.  “The idea is to show a whole set of advanced service scenarios, but unlike what you'd see at a trade show - where you typically just see the thing working - with the super-catalyst you'll be able to walk around the back of this and be shown how it's actually being operated and controlled using standards and the various OSS and BSS systems involved.
 “It's at an early stage of development, but the general idea is that you go to the show floor and you see the equivalent of a town with houses and retail establishments and so on.  And you see all these services that you're getting and then you walk around the back to the network operations centre and you can see how it's all being managed.  It's a big leap.”
“We're working on, not just a demonstration, but a real catalyst designed to flush out problems and what standards you need, and what bits you need to invent that you haven't thought of. The fact is that we don't know what the standards requirements are in some cases in the converged world yet, and that's why this super-catalyst is going to be a great vehicle for developing the whole area. It's going to be a major undertaking.”
The plan is for the first super-catalysts to debut later this year at the TMF's Dallas TMW. 
Nice TMW will be the start of the major change.  “What we want to show is that convergent services are here. So we have a very strong convergence message and a very strong illustration that TMF is responding.  There will be discussion about managing content-based and entertainment-based services and more involvement from media companies. For instance, for a meeting at Nice we've invited executives from Disney, Time-Warner and Virgin Mobile to join the table. The fact is that it's just as relevant for a senior executive at BT, say, to sit down with a Disney executive, as it is for the Disney guy to get to understand how the company can exploit the telecom space.”
 “For some of these players convergence will result in a partners' love-fest and for others it will be 'daggers drawn', as they realise they're going to be contesting the same space, but in the long run nobody knows who will be in which role at any one given point in time. TMF's role isn't to try and predict that.”
What about the core frameworks and standards generated by the TMF? Will these have to change markedly to accommodate the broader remit and the entry of new types of player into the value web?
“Yes, no doubt there will be changes as we go forward. One area that we're probably going to have to address in all our output is outsourcing. While our current guidelines intrinsically assist players to define and manage all their processes, so that outsourcing, where required, will be simpler to accomplish, it's also true to say that outsourcing isn't often specifically allowed for. I've just been to India to speak at a TMF event there, and what I heard there was really eye opening in terms of the way outsourcing is being used to reduce costs.
“At Bharti Airtel, one of the big mobile operators with 80 to 90 million subscribers, all the IT is outsourced and they operate at a cost level that a European mobile operator, for instance, can't even come close to.”
Willetts says he thinks that outsourcing and partnering arrangements are bound to become more complex and must be catered for in the back office in a fundamental way.
“For example BT might run an IPTV service in the UK using its infrastructure, and in Germany it might run a service on someone else's because it doesn't own any infrastructure there. But it will probably want to run the same brand and service.  The back office systems need to support that sort of thing.”
But the big question has to be asked. Isn't there a danger in all this for TMF?  This is a member-driven organisation and it is energised by a core of highly motivated, mostly telecoms-oriented individuals who give, not just their companies' time, but often their own time and effort as well. Doesn't TMF run the big risk in realigning itself so radically?
Willetts is adamant: “What people sometimes don't understand is that it's not a question of: 'If you go and chase all these converging media and web companies, will you desert your core telecom membership in the process?'  That question forgets the fact that  telecom companies are, themselves, becoming multi-media companies.  So, the reality is, to be of maximum use to our core constituency, we need to run with them, not away from them.”

Ian Scales is a freelance communications journalist.

    
