Bob Drummond discusses how operators can benefit from an agile, flexible and open platform to proactively deliver dynamic services to their customer base

What do the Glastonbury music festival, the Rugby World Cup and the Oscars have in common? They are all high-profile, internationally broadcast events that draw the attention of millions of fans and dominate the agendas of society, newspapers and television for the short period of their duration.

These are all events that operators could capitalise on if they had the flexibility and agility to rapidly and economically deploy innovative services on their networks, even for a short period of time. For operators on the lookout for new revenue streams or the next ‘sticky’ application, this is a golden opportunity to engage new and existing mobile subscribers by riding the wave of highly popular live events with the offer of exciting applications.
During the recent Cricket World Cup, what cricket fan would not have enjoyed winning a game on his or her mobile phone featuring the same teams? If your team didn’t win, replay the game on your mobile and see if you could have done better! Next time you’re on your way to watch your favourite football team play, what if you could play the match on your mobile – complete with the same starting team on the pitch and on the bench, the correct strip, the same opposing players and the same conditions…even down to the weather?
With higher return visits to the application promised through this dynamic, always-fresh approach, and premium revenues on offer, what is holding operators back from introducing services, applications or games aligned to such headline-making events? Beyond understanding the opportunities presented by such events, how do operators meet the technological challenges that ensure that customers are happy with the new services they receive? 
The telecoms industry is challenged to achieve a business model that keeps costs down, maintains innovation and responds to competition from within and outside its own marketplace – all whilst still creating profit and new revenue streams to stay in the game. The most difficult aspect of this challenge, however, has been created over the years by the operators themselves. It is the legacy of a history of growth that has seen additional, proprietary infrastructure systems installed for each wave of evolution, resulting in vertically oriented, proprietary system infrastructures.
Proprietary Intelligent Network (IN) systems are typically monolithic in structure, with hardware, software and applications tightly integrated and designed to operate well as a unit. As a consequence they are expensive to maintain and enhance because operators are restricted to using the services of the vendor even for minor enhancements to the system. This creates a ‘lock-in’ environment where operators become increasingly reliant on the vendor for its ability to innovate. A new service capability can take years and millions of dollars to deploy in this environment, vastly affecting the feasibility, cost and timescale of bringing new services to market. 
Furthermore, the telecoms industry has typically invested in applications and platforms as and when needed, resulting in a mix of incompatible development, deployment and operational environments. Typically, the switching and services layers of the IN will be organised vertically – rather than with integration across the rest of the infrastructure in mind – producing a complex series of silo-based architectures where the cost to develop, deploy and maintain exciting new services for all subscribers is too high. The obstruction to innovation in new multi-media, multi-access, multi-network services means that operators face difficulties in delivering the rich, converged services that their customers want and that differentiate them in a crowded marketplace.
The ability to offer new services that piggyback high profile events such as a World Cup or the Live8 music festival requires a degree of agility and flexibility that silo design and proprietary lock-in of legacy infrastructures obstructs. So, without heavy investment in a new convergent architecture, what can operators do?
The answer lies in open standards. Compared to the world of Internet and enterprise applications, developing telecoms services on the traditional proprietary IN platforms is an outdated approach that is time-consuming and expensive.  Proprietary, vertically-integrated systems need to make way for openness, modularity and portability to create an environment for cost-effective service development. 
Operators have spent a decade demanding open platforms from their suppliers, even introducing a series of open standards initiatives, such as Parlay and JAIN, to drive this agenda.
JAIN SLEE is the open Java standard tailored to the large-scale execution of communications services across existing and Next Generation Networks. With JAIN SLEE-compliant application servers providing an open, flexible and carrier-grade service execution platform, operators can achieve agility in service development and deployment, and also capitalise on cost leadership. Application development is no longer controlled by the proprietary vendors, but open to input from operators’ own in-house developers and a competitive market of off-the-shelf application developers.
In this dynamic environment, a range of application developers can quickly and cost-effectively address market opportunities and roll out services in conjunction with events that hit their audience’s agenda. As JAIN SLEE addresses the need for a horizontal platform across the entire operator infrastructure, services can converge voice, data and video silos to provide truly innovative and compelling offerings that drive revenues and grow customer loyalty.
A live multi-media service for the Glastonbury music festival, for example, can be designed to appeal to operators’ high-spending audience of young adults. The open platform makes the service flexible enough to update daily with news, weather, alerts and programme changes, as well as to offer live downloads of artist tracks, providing a compelling service for users.
With the move away from inflexible legacy telecoms networks to an open environment, operators can now benefit from a wide pool of third party developers for innovative and cost effective new applications.  For the type of applications discussed at the outset of this article, an agile and flexible platform also supports the modification or reconfiguration of an application during the lifetime of the related real-world event to continue providing a compelling service for repeat users based on service take-up, user behaviour and feedback received during the event.
Operators need to fully embrace the opportunity of such dynamic service delivery, or risk being left behind by users who come to expect more from their network. For operators such as Vodafone and O2 that sponsor high-profile events around the world, the opportunities are endless for increasing sponsorship returns, exploring new revenues and generating new levels of customer loyalty through dynamic service innovation.

Bob Drummond is VP of Marketing and Professional Services at OpenCloud

New technology is disrupting traditional advertising, and in its place different forms are evolving, offering very specifically targeted messages.  Lawrence Kenny and Rob van den Dam describe how advertising spend in the emerging online channels is now growing at a remarkable rate

The advent of emerging online advertising channels is making marketers lick their lips. These marketers are seeking more effective ways of optimising their expenditure and they are excited over the prospect of being able to target their ads in a highly personal way. They are spending more and more on targeted personalised advertising - at considerable cost to traditional advertising. Everyone is fighting for the new media advertising revenue. At the same time telcos have begun to realise that advertising can become an important source of revenue, an opportunity that they simply can't resist. 

Although telecom operators have little presence in advertising today, the medium represents an emerging opportunity that operators are uniquely positioned to address. They have unique assets that advertisers value. First of all, they have a large customer base. And with their authentication, authorisation and accounting controls, telcos are able to determine who the customer is and what services and products they are buying. This is useful not only for controlling where the ads go, but also for tracking advertising effectiveness.
Telcos have a direct relationship with customers. They collect vast quantities of customer data, which they can use to develop profiles of their subscribers, including demographic characteristics, personal attributes and preferences of those subscribers – and even, perhaps, their shopping habits and viewing patterns, provided the operators have the relevant analytical tools and capabilities. They can combine these customer insights with their ability to identify where individual users are based and offer highly targeted, localised promotions. Moreover, many operators have already developed solid relationships with local advertisers through their directory businesses.
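The targeting capability described here is, at its simplest, a join between a subscriber profile and location-tagged ad inventory. A minimal Python sketch, with the profile fields, cell names and inventory all invented for illustration:

```python
# Sketch of profile-plus-location ad targeting as described above.
# Profiles, cell names and the ad inventory are invented examples.
ADS = [
    {"id": "sports-shop-10pct", "interest": "football", "cell": "city-centre"},
    {"id": "pizza-2for1",       "interest": "food",     "cell": "suburb-east"},
]

def targeted_ads(profile, current_cell):
    """Return ads matching both a profile interest and the user's current cell."""
    return [ad["id"] for ad in ADS
            if ad["interest"] in profile["interests"]
            and ad["cell"] == current_cell]

subscriber = {"interests": {"football", "music"}}  # built from CRM/usage data
print(targeted_ads(subscriber, "city-centre"))  # ['sports-shop-10pct']
```

In a real deployment the profile would be derived from the operator's customer data and analytics, and the location from the network itself.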
Telcos are also well placed to enable the advertising experience practically anywhere, on any device and at any time. They can, for instance, manage the delivery of ads across the mobile phone, PC and TV set, over fixed, wireless and other networks. What's more, they also provide a direct interactive response channel for customers, and a feedback loop that allows advertisers to track advertising performance.
As telcos move into media - an industry that has historically been part-funded through advertising - they will find that relying on subscription and pay-per-view models is unsustainable in a world where consumers do not expect to pay for all content. Content is expensive to generate and offer to consumers, and advertising provides a means to offer richer content at a more reasonable cost. Many telcos are therefore experimenting with opt-in advertising plans to fund content. Perhaps this is the most significant benefit, as it allows consumer access to richer content and media. Advertising may also provide consumers with access to content they were previously unaware of. A number of operators are already taking steps toward adding advertising on IPTV and cell phones.

IPTV advertising
The big advertising revenue still comes from television. But the traditional TV advertising model is becoming increasingly unsustainable. With the shift from analogue to digital broadcasting, the number of TV channels has multiplied, and audiences are becoming much more fragmented. This reduces the efficacy of an approach that relies on centrally scheduled programmes to deliver real-time advertising to a large, undifferentiated audience, and uses ratings to estimate the size of that audience. The result is low effectiveness: advertisers must pay for a large audience even when they want to reach only a small target, which makes TV ads too expensive.
IPTV could provide the answer. IPTV presents the opportunity to combine the powerful brand-building effect of conventional TV-quality advertising with the strengths of online advertising: the ability to target specific audiences and to allow customers to easily pursue their interest in a product, even to the point of purchase.
IPTV is an advertiser's dream. With IPTV, telcos have the ability to control where the ads go - targeted at large groups, small groups or even individual television sets within a single household. 
The ads can be fine-tuned to the people within a household most likely to be watching at a certain time. When watching IPTV, users will be able to freeze the programming in order to interact with any advertising that attracts their attention, submit their details for further information on a brand or, in some cases, make an online purchase. And IPTV provides the means to measure precisely how many people have seen a particular advertisement. Payment models can be geared to the number of actual viewers watching, the number of “red button” presses, or perhaps a percentage of the sales.
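The three payment models mentioned - actual viewers, red-button presses, or a percentage of sales - can be compared with some simple arithmetic. A sketch with illustrative rates (the figures are invented, not industry benchmarks):

```python
# Comparing the three IPTV ad payment models mentioned above.
# All rates are invented for illustration, not industry figures.
def cost_per_viewer(viewers, rate_eur=0.01):
    """Advertiser pays per verified viewer."""
    return viewers * rate_eur

def cost_per_press(presses, rate_eur=0.25):
    """Advertiser pays per 'red button' interaction."""
    return presses * rate_eur

def revenue_share(sales_eur, share=0.05):
    """Advertiser pays a percentage of resulting sales."""
    return sales_eur * share

# A campaign seen by 40,000 verified viewers, with 2,000 red-button
# presses and EUR 30,000 of directly attributed sales:
print(cost_per_viewer(40_000))  # 400.0
print(cost_per_press(2_000))    # 500.0
print(revenue_share(30_000))    # 1500.0
```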
With IPTV the ways in which ads can be personalised are limitless. Different ads can be generated once one ad has been shown a specific number of times, giving advertisers the benefit that their ads won't annoy irrelevant audiences, or be shown too often and alienate their customers. IPTV also opens new opportunities to diversify ad formats. Ads can be placed when the set-top box boots up, on information screens, as a screensaver, as a buffer while a movie loads, or dynamically in the video streams. An advertisement could even be 'telescoped' out via a click-through function for the consumer. There is also the possibility of search and recommendation, perhaps in partnership with an Internet search engine such as Google.
IPTV could provide a gateway to Internet advertising for sectors traditionally reluctant to embrace the medium. And IPTV will attract local companies who would otherwise not have considered TV advertising as an option. Telecom Austria has already explored ultra-local TV advertising in the village of Engerwitzdorf and found that it especially attracted local companies.
In Europe, the French IPTV market is leading the pack in targeted advertising trials, but IPTV providers in other European countries are also experimenting with advertising. Examples are Tiscali TV (formerly known as Homechoice) running a dedicated Honda channel in the UK, and Telecom Austria. BT is talking to both brands and agencies about offering IPTV advertising on its Vision service. In the US, Verizon is currently deploying the technical tools that will allow it to insert local ads into its programming. On that foundation, the telco plans to introduce more targeted and interactive ads in its FiOS IPTV service. Though advanced ad deployments are still some way off, AT&T (with its U-verse IPTV service) also likes the promise of an ad play that combines mobile phones, television and the Internet.

Mobile advertising
Mobile advertising represents another unexploited opportunity for telecom operators. It is one that telcos are particularly well-positioned to capture, since they control what is delivered to the device and are the only companies entitled to know the location of their subscribers - information that advertisers would love to use to target customers. The mobile phone is the most personal consumer device we own, and one that most people carry with them 24 hours a day. It affords advertisers an opportunity to present very targeted and time-sensitive information that is of interest to the user. With nearly three billion cell phone users in the world, it's clear that mobile advertising represents a huge opportunity. Informa Telecoms & Media predicts that worldwide mobile advertising spend will be worth $11.35 billion in 2011.
Advertising on mobile devices can take many forms, including banners, sponsored video content and messages sent to users, but telcos and advertisers still need to determine what works best in different circumstances. Advertising techniques cannot simply be copied from the Internet. The screens and devices are smaller; the exposure time tolerated by the user is likely to be less; too many click-throughs will annoy users; and in many cases, operators must be able to identify the device type to render content appropriately.
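That last requirement - identifying the device type to render content appropriately - amounts to a lookup from a device profile to a suitable creative format. A minimal sketch, with the device profiles and format names invented for illustration (real deployments would draw on a device-description repository):

```python
# Minimal sketch of device-aware ad rendering: map a device profile to
# the most suitable creative format. Profiles and format names are
# invented for illustration.
DEVICE_PROFILES = {
    "basic-phone": {"screen_px": (128, 160), "video": False},
    "smartphone":  {"screen_px": (320, 480), "video": True},
}

def choose_ad_format(device):
    """Pick the richest format the handset can handle, falling back to text."""
    profile = DEVICE_PROFILES.get(device)
    if profile is None:
        return "text"                 # safest fallback for unknown handsets
    if profile["video"]:
        return "sponsored-video"
    if profile["screen_px"][0] >= 160:
        return "banner"
    return "text"

print(choose_ad_format("smartphone"))   # sponsored-video
print(choose_ad_format("basic-phone"))  # text (128 px wide is too small for a banner)
```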
Even more so than with Internet advertising, mobile advertising must be relevant, interesting to the audience and, especially, not overbearing in quantity. In fact, mobile advertising should be a combination of search, location and presence, and recommendation functions, based on a deep understanding of the consumer's passions, hobbies, purchases, past click-patterns and the like.
Outside Asia - where mobile advertising has grown rapidly in markets like Japan, with NTT DoCoMo running small banner ads on its mobile portals for more than five years - mobile operators have moved cautiously in adding advertising to cell phones, for fear of alienating subscribers and increasing churn. But there have been a number of initiatives.
In the summer of 2006, Virgin Mobile USA introduced a programme called Sugar Mama, which compensates its phone users with free calling minutes for watching commercials, reading advertiser text messages and taking surveys from brands. In its first seven months, the Sugar Mama campaign awarded 3 million minutes to about 250,000 registered customers. Virgin Mobile recently announced that it will use JumpTap's search-based advertising platform to offer ads that are highly targeted and relevant for its users. Companies such as Verizon, Sprint and Cingular are now also beginning to test and roll out advertising on mobile phone screens.
In Europe, EMI Music and T-Mobile joined forces at the end of 2006 to pilot ad-supported mobile videos in Britain. Ad-funding company Amobee has recently launched a commercial advertising trial with Orange in France, with such companies as Coca-Cola and Saab having signed up for the trial. Orange customers interested in playing games will be offered them for free, or at a reduced rate, if they first agree to watch an advert. Mobile operator 3UK announced the launch in April of a service supported by personalised advertising to provide free content for its users. Vodafone and Yahoo! also aim to launch a mobile advertising business in the first half of this year.
However, media brands such as Fox News, USA Today and The New York Times are now also joining the game by providing advertising via their mobile websites, which are accessed directly through a mobile browser rather than through a mobile operator's menu. And they are not the only parties that think there will be big business for them down the road. Internet players Google and Yahoo! have already started to include advertising in their mobile search and portal properties. Yahoo! has even launched a mobile advertising platform in 19 countries across Europe, Asia and the Americas, instantly enabling advertisers to reach consumers around the globe on their mobile phones. Advertisers already signed up include the Hilton Hotel group, Pepsi and Singapore Airlines. And then there is Nokia, also jumping onto the mobile advertising bandwagon by announcing two mobile advertising services designed for targeted campaigns on its handsets.
Highly targeted, addressable advertising will increase advertising revenue per viewer significantly, while the viewing experience becomes more personalised and better received. Several studies have confirmed that subscribers are more likely to respond favourably to advertisements if the topic is of interest to them. This type of advertising, however, raises the issue of privacy. Legislation in both Europe and the US ensures that user-specific data is not used for any purpose other than providing the telecommunications service itself. “Opting in” may well be seen as the route to take, and could prove popular with consumers by giving them increasingly relevant ads: consumers allow their user-specific data to be used in return for being included in special offers.
Many parties, from marketers to big media companies, to handset makers, to Internet players, to telecom operators, hope to get a piece of the pie. But operators have the demographic, transactional, behavioural and location data necessary to deliver marketing and advertising that meets the consumer need for relevant advertising. Operators are now at the point where they should exploit their unique technical advantages to secure their part of the pie.

Lawrence Kenny is Global Telecommunications Industry Leader for IBM Global Business Services.  Rob van den Dam is European Telecommunications Leader for the IBM Institute for Business Value

With ADSL2+ technology now being pushed to its absolute limits, carriers are talking about the next generation of broadband, VDSL and VDSL2, with speeds of up to 200 Mbits/s on relatively short line lengths from the DSLAM. Jorg Franzke explains how it is possible to roll out the speed benefits of VDSL and VDSL2 to most urban and city customers, without breaking the telco bank

It’s been an eventful decade across Europe as former state incumbents, cable companies and virtual telcos have all, seemingly en-masse, jumped on the broadband telecoms wagon and rolled out an increasing range of high-speed broadband services to their customers.
Most experts agree that, even with ADSL2+ offering customers access to up to 24 Mbits/s downstream data speeds, customers’ appetites for even faster speed services are still increasing, with some cable companies already talking about offering 100 Mbits/s as standard.

The only problem with this new generation of very high-speed broadband services is that they rely on VDSL and VDSL2 technology. Whilst ADSL2+ can happily support copper line lengths of two or more kilometres, VDSL achieves its maximum rate of around 52 Mbits/s only at ranges of up to 300 metres from the DSLAM (digital subscriber line access multiplexer). When we move up to VDSL2 (ITU G.993.2) technology, carriers are even talking about rates of up to 200 Mbits/s.
But VDSL2 rates deteriorate quickly, from a theoretical maximum of 250 Mbits/s at zero metres from the DSLAM to 100 Mbits/s at 500 metres, and 50 Mbits/s at 1.0 kilometre. As a result of these line-length limitations, very few customers will be within the coverage range of VDSL2 DSLAMs installed at the central exchange. So most local loop carriers are discussing moving the active electronics, including the DSLAMs, out of their central offices and into larger versions of the roadside cabinets that form an integral part of the street furniture we see every day.
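Taking only the figures quoted here (250 Mbits/s at the DSLAM, 100 Mbits/s at 500 metres, 50 Mbits/s at one kilometre), a rough reach calculator can be sketched by linear interpolation. This is purely illustrative: the real rate/reach curve is not linear and depends on cable gauge, noise and spectral profile.

```python
# Rough VDSL2 downstream rate estimate, linearly interpolated between
# the figures quoted in the text. Illustration only -- real reach
# depends on cable gauge, crosstalk and the spectral profile in use.
QUOTED_POINTS = [(0, 250.0), (500, 100.0), (1000, 50.0)]  # (metres, Mbits/s)

def estimated_rate_mbits(distance_m):
    """Piecewise-linear interpolation over the quoted (distance, rate) points."""
    if distance_m <= 0:
        return QUOTED_POINTS[0][1]
    for (d0, r0), (d1, r1) in zip(QUOTED_POINTS, QUOTED_POINTS[1:]):
        if distance_m <= d1:
            return r0 + (r1 - r0) * (distance_m - d0) / (d1 - d0)
    return QUOTED_POINTS[-1][1]  # beyond 1 km the article gives no better data

print(estimated_rate_mbits(300))  # 160.0 -- why cabinets must move closer to the customer
```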
A major issue is that with the move out of the central office comes the decentralisation of the main distribution frame, where connections have to be moved to initiate new services such as ADSL and VDSL. Each time a customer requests a change of service, jumper wires have to be moved - fairly easy and efficient in a warm, dry, clean centralised environment - but once the connections have to be made in the cold and rain it becomes an operational issue. To give the reader an idea of the massive scale involved, a network the size of BT in the UK would require around 65,000 of these externally deployed active electronics cabinets. A network the size of Germany’s T-Com would require around 100,000 such cabinets.
In theory, the incumbent telcos could employ teams of roving engineers to maintain and provision the cabinets in much the same way as central offices are serviced at the moment, but the costs associated with the necessary engineering ‘truck rolls’ are anathema on both the financial and ecological fronts. Even one visit per fortnight, at say €50 per technician visit, would clock up costs of €130 million per annum on a 100,000-cabinet network.
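That estimate is easy to reproduce: one visit per fortnight is 26 visits a year, and taking the per-visit cost as €50 across 100,000 cabinets the annual bill follows directly:

```python
# Reproducing the article's maintenance-cost estimate for a
# 100,000-cabinet network visited once per fortnight.
CABINETS = 100_000
VISITS_PER_YEAR = 26          # one visit per fortnight
COST_PER_VISIT_EUR = 50

annual_cost = CABINETS * VISITS_PER_YEAR * COST_PER_VISIT_EUR
print(f"EUR {annual_cost:,} per annum")  # EUR 130,000,000 per annum
```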
Consequently, any carrier electing to stay with the status quo and implement manual re-jumpering at the thousands upon thousands of active equipment roadside cabinets, will be forced to reduce their costs by making only scheduled visits. The corollary of this is that each cabinet may only figure in the schedule once every fortnight, meaning that the time-to-provision each customer will become much longer than currently is the case.
New approach
A markedly different technique is needed, and newly developed automatic cross-connects (ACX) can now be used to replace manual distribution/jumpering frames in the remote cabinets, saving carriers significant sums of money on the operational expenditure (OpEx) front.
With an automated ACX solution, not only are there no delays in waiting for an appropriate technician truck roll, but the control of the ACX can also be integrated directly into the carrier’s operations support system. This approach allows the service connection to follow on automatically from the customer's order, within an hour or two, rather than the customer facing a wait of several days, as is currently the norm with a central office, or several weeks in the manual re-jumpering scenario described above.
Many carriers and manufacturers alike are chasing the holy grail of Zero Touch for their networks. We have been pioneering a rather less pipedream-like approach based on practicality and best return on investment.
The aim of a Zero Touch network would, of course, be that the field technician never needs to visit the remote site. That is all very well until you take into account that active equipment, along with its power supplies and air-conditioning, goes wrong from time to time. So occasional technician visits are inevitable.
Zero touch systems have other drawbacks, not least of which is the fact that the purchase costs can be substantial, reducing the installation's return on investment. In the case of automated cross-connects, zero touch would need a non-blocking switching matrix, which is very expensive, and current non-blocking technology simply isn’t up to the job of transmitting 100 Mbits/s signals.
The third issue with zero touch systems is the fact that the cabinet needs to be equipped with a large degree of reserve DSLAM and splitter capacity, ditto power supplies and air-conditioning, so seriously increasing the levels of capital expenditure required.
Our theory is simple. What happens if we introduce a minimum number of technician truck rolls to the mix, creating a ‘minimum touch’, not zero touch, active electronics-based local loop?
This is where the financials begin to get interesting, as a minimum touch network is far more financially viable. It requires significantly lower levels of capital expenditure with very similar levels of operating cost. Less spare capacity is needed, as this can be added when demand dictates. Likewise, a much less expensive semi-blocking ACX can be used - with the full frequency range for 100 Mbits/s service delivery.
A good minimum touch system has the advantage of automating the provisioning and re-provisioning of lines without incurring the high capital costs of a zero touch system, or attenuating the signal levels required for effective VDSL and VDSL2 transmissions.
Well before the ACX system reaches saturation levels, it can signal its status to the central exchange, allowing engineers to make a planned site visit, install additional capacity if needed and hardwire connections already switched through to VDSL, freeing up the switch ports to be used again over the next six or twelve months.
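The 'signal before saturation' behaviour is essentially a threshold alarm on switch-port utilisation. A minimal sketch, with the 80 per cent threshold chosen purely for illustration:

```python
# Sketch of the minimum-touch trigger: the ACX reports its port
# utilisation and the operations centre schedules a planned visit well
# before the matrix saturates. The 80% threshold is an assumption
# chosen for illustration.
SATURATION_THRESHOLD = 0.80

def needs_planned_visit(ports_in_use, total_ports):
    """True once utilisation reaches the planning threshold."""
    return ports_in_use / total_ports >= SATURATION_THRESHOLD

print(needs_planned_visit(60, 100))  # False - plenty of headroom
print(needs_planned_visit(85, 100))  # True - schedule a truck roll
```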
The result of this approach is good scalability, lower cost per line and a reduced space requirement. And all without affecting those all-important customer satisfaction levels.
Using a minimum touch ACX approach means that only one or two maintenance visits each year are required for each cabinet, with remote monitoring shouldering the responsibility of maximising network up-time.
In the event that something like a DSLAM card fails, the ability of ACX to connect ‘any-to-any’ can be employed to ensure that customers are only minimally affected by any technical problems. Depending on the severity of the failure, the cabinet's active technology can be remotely reconfigured to maintain service for the customers affected and the network operations centre can schedule a truck roll when it suits the operator. This makes for a more cost-effective maintenance strategy.
ACX technologies
In a survey of switching technology for remote automated cross-connect devices used in next generation carrier networks, research company Venture Development Corporation (VDC) considered a number of technologies, but rejected robotic and solid-state/electronic switching: the former is error-prone, expensive and has poor life expectancy, whilst solid-state/electronic switches have electrical parameters that make them unsuitable for the high bandwidth requirements of xDSL services like VDSL2. VDC also noted that a very specific ‘electromagnetic’ variation of the MEMS relay may become a suitable technology, but this is currently only in testing as regards ACX applications and, as yet, has no field application track record.
VDC concluded: “We believe the electromagnetic relay is acceptable technology because of its proven reliability, ruggedness and minimal transmission impairment.” It did not judge any other technology to be currently acceptable. This, and the fact that the power requirements are so minimal, are the reasons we have chosen to develop our own ACX product range around the tried and tested electromagnetic relay.
Obviously, whether to implement ACX or to manage the service provision process manually is a matter for individual carriers. The choice of technology is critical from the perspective of reliability, minimal power consumption and the ability to handle the very high frequencies needed for VDSL2, but far more important in this rapidly changing telecoms world is the need for rapid return on investment.
It is our contention that Zero Touch is a step too far and that, in the world of everyday engineering issues, minimum touch networks and minimum touch ACX are the way to minimise costs.

Jorg Franzke is ACX product manager for ADC KRONE, and can be contacted via tel: +49 308453-2498; e-mail: jörg.franzke@adckrone.com

End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist, Nexus Telecom tells Ian Scales

Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.

It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems and to verify service levels. Essentially, says CEO, Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors.  What's changing, though, is the importance of such end-to-end transaction data. 
Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly.  Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access.  It's an idea whose time may have come.  According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.
As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself. 
There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way.  Sutter says some are, but some are still grappling with the concepts. 
"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."
This misses the point in a number of ways, claims Sutter. 
"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple  - in fact it's rather the other way about.  The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work." 
And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.
"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that.  Telcos need to harness network data - I call them 'transactions' - to develop their businesses."
Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.
"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."
So end-to-end transaction data is important and will grow in importance.  How does Nexus Telecom see itself developing with the market?
"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis.  This tells you what's happening so you can plan network capacity and so on.  But these systems never, ever go to layer 7 and tell you about transaction details - we can. 
"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems.  Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."
So given that telcos are recognising the need to gather and store, what's the future OSS transaction record architecture going to look like? 
"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our  approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers.  So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation.  Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies  - the  marketing  people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."
Does he see his company going 'up the stack' to tackle some of these applications in the future?
"It is more important to have open interfaces around this layering.  We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."
Sutter thinks the supplier market is already evolving in a way that makes sense for this model.
"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses.  We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network.  Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."
So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers. 
"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution.  "You would come along and ask, 'Hey guys, can you give me access to your database?', the response would be 'Woah, don't touch my database.  If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces.  One piece will collect the data, correlate and aggregate it, the second will run the application and the presentation to the user.  At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join so that data may be stored in an open data to the repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best in class products for collecting the data and presenting it and this will be a good thing for the entire industry.  After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always loose the whole data collection and repository for that application at the same time?"
With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?
"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."
But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like  'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."
So where can Nexus Telecom go from here?  Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?
"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring.  But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”

Ian Scales is a freelance communications journalist.

Could high speed packet access become high cost packet access, asks Pat Dolan

Maturity often brings a heightened sense of reality, a healthy dose of cynicism, and a degree of wariness around new products or services that purport to be 'the next big thing.'
Some mobile networks are now 25 years old. They suffer from declining revenue and are struggling to keep abreast of seemingly continuous advances in technology. Given the maturity of these networks, as a shiny new acronym takes the stage in the mobile market, operators can be forgiven for treating it with a degree of suspicion and for ordering it to undergo a full examination.

One such acronym is HSPA – High Speed Packet Access, which has been promoted by mobile operators as the dream data network, designed to realise the promise of 3G. HSPA has been billed as the rescue remedy that will enable delivery of data services that will convince consumers to upgrade their price plans. It may even succeed in getting the tardy corporate market to spend serious money going mobile.
So how can an opportunity to make more revenue be viewed suspiciously in a market where operator margins are declining?
The answer depends on the profitability of that revenue, and how much it costs to achieve. The issue isn't really how much mobile users spend.  The key figure is how much profit margin users deliver to operators – and that depends on how much it costs operators to deliver services.
HSPA, according to the mobile industry, will enable the delivery of high-bandwidth services to a standard that will encourage use. The industry expects the major users to be the generation that has grown up with mobile – the so-called Echo Boomers, children of the Baby Boomers – to whom technology is a given. They are the generation of the iPod device, laptop and Xbox® video game system, who expect gadgets not only to look good but to deliver services at any place and any time. These users are the mobile gamers, the TV-on-demand consumers, the market that expects technology to work, and work well.
Similarly, the corporate market is there for the taking – the success of WiFi demonstrates that high-speed mobile wireless connectivity has a following, when it works properly.
Therefore, with the introduction of HSPA, operators can be confident that the amount of data carried over the mobile network will increase. Here is where the careful inspection of the technology kicks in. To handle the anticipated increase in traffic and maintain the quality of service required to not only court but keep the high speed data user, operators must invest in their networks.
A report from Gartner published last year stated: “Vendors present HSDPA as a simple software upgrade on the Node B in base stations. However, feedback from operators indicates that, even for major vendors, this may not be the case.”
It may indeed not be the case. Put more traffic on the same motorway and you get a traffic jam. Then you must add more road to accommodate the traffic. And it all costs money.
The potential bottleneck for HSPA lies in the backhaul – not perceived as the most glamorous part of the network, but the part that decides whether the traffic flows freely or grinds to a halt. Backhaul is the transport of data between the Radio Network Controller (RNC) and the Node B (base station) in the 3G/HSPA-capable network. Traditionally in 2G/2.5G networks, backhaul capability was provided by E-1 leased lines. There was no great choice when those networks were implemented, and the leased-line capacity of 2Mbps has served the voice and minor data needs of those networks to date. But when you consider that Gartner puts the typical cost of an E-1 link at between $4,500 and $5,600 per year, and that HSPA drives the requirement to 10 or more E-1 links per site, the cost implications begin to bite.
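A back-of-envelope calculation using the Gartner figures above shows the scale of the problem; the network size is an assumption chosen purely for illustration.

```python
# Rough annual E-1 backhaul cost, using the Gartner figure cited above
# (the low end, $4,500 per E-1 per year). The 1,000-site network is an
# invented example, not a figure from the article.
def annual_backhaul_cost(sites, links_per_site, cost_per_link):
    return sites * links_per_site * cost_per_link

# A 2G-era site on a single E-1 versus an HSPA site needing ten:
legacy = annual_backhaul_cost(sites=1000, links_per_site=1, cost_per_link=4500)
hspa = annual_backhaul_cost(sites=1000, links_per_site=10, cost_per_link=4500)
print(legacy, hspa)  # 4500000 vs 45000000: a tenfold jump in leased-line opex
```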
But all is not lost. The world has moved on from E-1, and there are now other backhaul solutions available. It is crucial for operators today, more than ever, to consider the cost implications of the backhaul transport choice. Each E-1 link operators add increases their transport capability – but it also automatically adds more cost.
Pseudowire technology can address both cost and speed. Pseudowires enable more cost-effective networks by enabling backhaul over packet-based technologies such as Metro Ethernet, Ethernet Microwave and DSL. They also can protect operators' investment in their existing networks by providing a bridge between “old” and new transport models. Take this route, and the investment to provide a network capable of delivering services over HSPA to a standard expected by the user becomes a realistic, sustainable and economically smart move.
One operator already enjoying the inherent benefits of pseudowire technology is Telecom Italia, which has an award-winning application for transporting ATM-based 3G traffic across an Ethernet network; others are giving the technology serious consideration too.
Operators need a clear strategy for backhaul upgrades to realise the potential of HSPA. Following the "if it ain't broke, don't fix it" E-1 path will deliver bandwidth, but at what cost? Backhaul networks are already one of the largest operating expenses for mobile operators. If the market wants to deliver high speed – and not high cost – packet access, then maybe it's time to make a new choice.

Pat Dolan is VP & General Manager EMEA at Tellabs

As Internet gambling – whether via mobile or fixed lines – continues its upward path, Windsor Holden looks at the different approaches of prohibition versus regulation currently exhibited by governments on opposite sides of the Atlantic

Before we kick this off, I would like to get one thing straight. "Tex" Holden, king of casinos, lord of the gamblers, is a myth. You will not espy me playing the roulette wheel at the Golden Nugget, nor cleaning up at baccarat in Monte-Carlo, nor even studying the form guide in the Racing Post. In short, I am not ordinarily a gambling man, although one might suggest that taking a through train on a Friday evening from Manchester Piccadilly to Cambridge represented a bit of a punt, particularly if I were expecting it to arrive on time, or even arrive at all, rather than bundling all its passengers into the cold dark night at Sheffield.

And yet, while drafting some notes for this article, and cursing Central Trains, I remembered that it was the Euromillions umpteenth draw that night, and I’d forgotten to buy a ticket, and the hundred million ackers destined for my wallet would now slip regrettably into the pockets of some undeserving continental chappie.
Because the lottery doesn’t count, does it? It’s just a couple of quid here and there, a bit of a laugh, a bit of a giggle: not proper gambling, with big money an’ that. Indeed; and the vast majority of us do not count ourselves as gamblers, although more than three-quarters of the UK adult population, for instance, play the National Lottery at least once per year, and most on a more regular basis. Even if the Lottery is excluded, nearly half of us dabble in one form or another. Gambling, in its myriad forms, is popular. People want to gamble. And, with more and more people acquiring access to the Internet, whether through a fixed line or mobile phone,  companies are naturally eager to allow them to fulfil their desires online.
But governments have not always seen it this way, and generally feel disposed to impose some form or other of regulatory structure on the industry. There are two extreme approaches here. The first, which may still be practised in some relaxed and possibly dangerous environs, is anything goes. The problem with this regulatory model is that it offers enormous scope for the following scenario: a couple of characters named Slim and Frenchie set up www.greatbets.com, offer wonderful odds for the less percipient and discerning customers and then silently disappear into the night with the loot. While profitable in the short term, this business model is probably not sustainable in the longer haul, if only because people are highly unlikely to be repeat visitors to your site. Furthermore, and unfortunately for Slim and Frenchie, very few – if any – governments are inclined to be so laissez-faire, with the possible exception of countries where Slim and Frenchie are first cousins to and/or best buddies with the Minister of Commerce.
The second model, which the United States has employed, lies at the other end of the spectrum. Here, under the quaint and misplaced belief that prohibition actually works, the feds will send in the modern day Eliot Nesses to arrest anyone who places a bet over an Internet connection. There are many things that could be said about this policy, and while, as Johnson said of Cymbeline (and, latterly, if somewhat unkindly, Muggeridge said of Sons and Lovers), it is pointless to criticise unresisting imbecility, it can be enjoyable and cathartic nonetheless. And so, let us take a closer look at the US approach.
While you can legally buy lottery tickets, gamble in licensed casinos, and place a bet at a bookmaker's in person, it becomes a heinous crime should you wish to do so remotely via a telephone. The legislation which decrees that it is so is the 1961 Wire Act: this was originally drafted to prohibit bookmakers in states where gambling was then legal from taking bets from would-be gamblers in other, less liberal areas of the US. However, times have changed, America has become exposed to the wonders and depravities of the World Wide Web, and various upright (or uptight, depending on your viewpoint) congressmen and senators have sought to update the 1961 Act to ensure that those pesky remote gamblers don't gamble online across state lines: in fact, to ensure that they can't gamble online at all. Senator John Kyl repeatedly tried to introduce various incarnations of the Internet Gambling Prohibition Act, none of which made it to a full Senate vote, before Senator Jim Leach took up the baton for righteousness and sponsored HR4411, the Unlawful Internet Gambling Enforcement Act. The argument behind this legislation is as follows:
“The Internet's ease of accessibility and anonymous nature: (1) make it difficult to prevent underage gambling; and (2) feed on the compulsive behaviour of the millions of Americans suffering from gambling addiction.”
I have a big problem with (1) and an even bigger problem with (2). To (1), I would answer: no, dear, it doesn't. There are numerous checks and balances that can be put in place, notably age verification requiring at least two items of identification, which actually works very well. And as for (2): up to a point, Lord Copper. The Internet can certainly facilitate gambling (that's why Ladbrokes, William Hill, Uncle Victor Chandler and all were in there like a shot); it makes it very easy to place bets without the bother of popping down to the bookies or the bingo hall (or, if you're feeling flush and flash, the Golden Nugget). But I would question whether the majority of those Americans who do suffer from gambling addiction as a result of visits to the aforementioned locales would be significantly more likely to gamble were Internet gambling to be permitted. For one, as a recent survey by the American Gaming Association revealed, the demographic profiles of online gamblers are markedly different from, say, casino gamblers: the former are predominantly male, aged under 40 and college educated; the latter are inclined to be older, less well educated, and with a greater preponderance of women in the mix. Secondly, and this is the biggie: why should the overwhelming majority of responsible gamblers be denied the opportunity to enjoy a leisure activity in their own homes which is perfectly acceptable and legal in public places?
This is not in any way to deny that gambling addiction, whether online or in any other form, is a serious issue: a short perusal of the website run by the charity Gamcare provides hair-raising evidence of the scale of problem gambling, and of its consequences, both to the gamblers themselves and to their families. But, as Gamcare acknowledges, there are ways of addressing the problem, not least by working with the gambling industry, regulators and the government. For Senators Kyl and Leach, that conciliatory approach would be anathema. Naughty Internet gamblers! Bad Internet gamblers! Ban them and lock them up!
I called it imbecilic: and yet (credit where credit’s due) there may be method in their apparent madness. Back in 2001, when the American Gaming Association was appearing before a Congressional Subcommittee, it grumbled that offshore Internet gambling sites: “Frustrate important state policies, including restrictions on the availability of gaming within each State… Unregulated Internet gambling that exists today allows an unlicensed, untaxed, unsupervised operator to engage in wagering that is otherwise subject to stringent federal and state regulatory controls. These controls are vital to preserving the honesty, integrity and fairness that those in the gaming industry today have worked so hard for so long to bring about.”
Untaxed. That is the key word here, the one which gives all governments the screaming habdabs. Those blasted foreigners are not paying tax – just look at all that gorgeous, lovely, sexy money being pumped offshore! Why not introduce an Internet protectionism; oh, sure, it’ll mean that us boys at the AGA don’t get any competition, but hell, we can put up with that! We’re good, God-fearing, tax-paying US citizens!
But the problem for the US government is that those blasted foreigners are not taking things lying down. In Antigua and Barbuda, gambling is the driving force behind the economy: the tiny nation is home to more than 500 remote gambling sites; it has been estimated that 3,000 of its 67,000 inhabitants are employed in the on-line gambling business. And many of its best customers are American. Accordingly, when the US government began using the Wire Act to prohibit foreign transmission of gambling information, its Antiguan counterpart lodged a complaint with the World Trade Organisation, which found in November 2004 that the US restrictions on remote gambling were in violation of international trade agreements. The US government, naturally, appealed against this verdict, before altering its standpoint and claiming that it was no longer in breach of the trade agreements, but hey, you still can’t gamble on Antiguan web sites. Much paperwork later, a WTO Dispute Settlement Body (DSB) will shortly adjudicate on the matter: whether the US will abide by its ruling is another matter entirely.
Well, then, we have had the two extremes: now for the middle ground, which is what Tessa Jowell, the UK Minister of Culture, is proposing. I am no lover of regulation for regulation’s sake; but in this the British government has been careful and reasoned at just about every stage in the process. Recognising the international nature of the industry, it has called for the implementation of international standards of regulation. Recognising that would-be punters tend, not unnaturally, to be more attracted to gambling websites where Frenchie and Slim don’t disappear into the sunset, it has backed the introduction of kitemarks for the online industry. It has worked with the Gambling Commission to develop proposed licensing conditions and codes of practice. It has called for greater co-operation with the industry as a whole.
But before I get too lovey-dovey with Tessa Jowell, I would venture to suggest that at the heart of the matter there is a great deal of similarity between the US and UK governments. Both are aware that gambling is not going to go away; both, being pragmatists, would like to earn some money from it. However, while the US approach is rooted firmly in the past, in the robust and defiant protectionism it has so often mocked when its own products are being exported, the UK option has been to appreciate that it is better to have the gambling companies inside the tent rather than outside it (in Antigua, Gibraltar, or wherever). It is an approach which will ensure that the industry is well regulated and that gamblers' rights are protected; it will also be an approach which is more financially rewarding. So I would say to the US government: do you sincerely want to be rich? If so, listen up.
Dr Windsor Holden is Principal Analyst at Juniper Research
Mobile Gambling Markets Re-Assessed (Post US Legal Changes), 2006-2011,  published by Juniper Research, January 2007

Designing solutions based on specific customer requirements sounds like Nirvana for telecoms users.  Chris Britton explains how Multi-Network Operators are aiming to fulfil the dream

Historically, corporate customers were compelled to rely on incumbent public telecoms operators (PTOs) for their telecoms services. Now, an increasingly liberalised telecoms sector has led to fierce competition among both established telecommunications operators and newcomers, which is delivering choice and improving quality of service and price benefits across the majority of markets.

In this deregulated telecoms world, a new type of company has emerged – the Multi-Network Operator (MNO). Unlike traditional, network-centric telecoms operators, MNOs do not own the infrastructure over which their services are provided.
Instead, they design tailored solutions based on their customers’ specific communications requirements, using an optimal combination of telecoms networks and technologies.   Relying on their skills, experience, service management capabilities and partnerships, MNOs bring added value to enterprise customers, as well as to their network infrastructure partners.
So why is the term ‘multi-network operator’ used here in preference to the more familiar ‘virtual network operator’ (VNO)?  The key reason is that MNOs understand there is nothing ‘virtual’ about their customers’ networking requirements.
Regardless of the underlying infrastructure, MNOs take full responsibility for designing, implementing and managing network solutions.  Indeed, while some traditional telecoms operators offer different service levels depending on whether a solution is ‘on-net’ or ‘off-net’, MNOs approach service level commitments more from an end-to-end perspective. 
The term 'Multi-Network Operator' highlights the way MNOs use their expertise, skills and tools to find the optimum mix of network technologies on their customers' behalf. They do this via multiple routes, from multiple sources and across multiple geographies, before integrating the disparate components and operating them as a single solution.
MNOs also differentiate themselves from VNOs via their product focus. VNOs have previously concentrated on virtual private network (VPN) solutions, which involve connecting multiple sites using shared Internet infrastructure. While not ignoring VPN services, MNOs focus on delivering high capacity, custom-built network solutions using dedicated leased line, optical fibre and Ethernet technology, which they manage 24x7. 
MNOs believe that business customers have become accustomed to receiving substandard service from traditional telcos and are crying out for significantly higher service levels for their wide area networks.  It is a sad fact that there are now whole industries – such as telecom expense management and outsourced service level management – that exist because customers feel the need to ‘police’ the traditional telco model. 
MNOs represent a more customer-service driven approach.  In this respect, they offer several key benefits over single carriers operating alone.
Traditional telcos are inevitably focused on maximising the utilisation of the network infrastructure they own. Everything they do – the way they price services, target customers and select technologies to offer – is geared towards squeezing as much value as possible out of that capacity. In contrast, MNOs are constrained by neither the fiscal nor the physical limits of a single network. 
This is an important advantage when servicing the needs of large corporates. The ongoing globalisation of industry means that enterprises increasingly need their networks to provide services to all parts of the world. Consequently, they are looking for end-to-end solutions that will almost certainly demand a combination of the network facilities of carriers in their home region with providers operating in territories across the globe.
By managing multiple networks instead of relying on one provider with a limited footprint, the best MNOs enable customers to expand their business network into new markets and remote, hard-to-serve locations faster and more cost-effectively than ever before.
The MNO approach can also offer increased choice and flexibility for customers in other ways. Rather than focusing on maximising capacity in one particular network, it concentrates on delivering the combination of network technologies that best suit the business needs of the specific customer. Once this is decided, the MNO can concentrate on putting these networks together as a seamless whole and managing them to a single service level agreement. 
In addition, MNOs can help their customers to achieve lower total cost of ownership by reducing the hidden costs of using a traditional telco. Again, the best of these operators will adhere to robust global service level agreements, provide a single point of contact for multiple vendor solutions and deliver accurate, easy-to-understand invoices to customers.
Leading MNOs also have the expertise to integrate multiple transport and networking technologies ranging from private line to satellite and from IP-VPN to Frame Relay, coupled with the experience to deploy solutions that combine fixed broadband, managed mobility and secure site-to-site connectivity.
MNOs’ in-depth understanding of their customers’ business also enables them to offer consultancy as enterprises migrate to triple play solutions and next generation technologies like Ethernet and multiprotocol label switching (MPLS) and are faced with the need to integrate and interconnect increasingly disparate environments.
For the MNOs, committed to providing the best possible networking solutions for their customers, the move to these fast and efficient new networking technologies is inevitable and to be welcomed. And they are much better placed than carriers to provide an effective migration strategy.
This is because unlike carriers, they do not own the network infrastructure and therefore they have no financial stake in the embedded legacy environment. Effectively, they take on the role of reseller of network capacity and services. For these reasons, they are much more likely to prioritise the need to migrate customers to new technologies.
The MNO approach can also provide both network “route” diversity and vendor diversity. Many enterprise customers today see this as a critical requirement to ensure that their networks are both reliable and highly cost-effective.
Leading MNOs will be able to use network design tools to identify diverse, alternative transport routes to eliminate network 'single points of failure', while drawing on a portfolio of wholesale carrier relationships to give customers an immediate and cost-effective second or third carrier option.
But it is not only customers that benefit from the MNO approach.  More and more facilities-based telecom carriers are actively looking for opportunities to work with MNOs because they recognise that the MNO can be a strategic sales channel for their wholesale efforts, allowing them to support customer requirements they would otherwise never have been able to fulfil. 
Increasingly, the MNO model is attracting telecom industry leaders.  GTT is a good example of this, with several members of its board boasting leadership experience at companies such as Sprint, AT&T, Equant and Nextel.  These executives recognise the value of the MNO's service-driven, customer-centric approach.  They understand that there is a category of business customers that needs a more closely tailored approach to its wide area network requirements. 
They also recognise the market trends driving the adoption of the MNO model: enterprises are more global in focus; they are moving into new, hard-to-serve markets and they want greater network diversity to address business continuity concerns.  Having seen the strengths and the limitations of traditional telcos, these executives appreciate, perhaps better than anyone else, that the time is right for enterprise customers to consider the MNO approach.
Certainly, the prospects for MNOs appear positive.  Armed with a business model that addresses today’s telecom environment, MNOs can offer enterprise customers a range of benefits that are beyond what most traditional telecoms carriers can provide. Equally, this realisation is beginning to attract senior executives who have previously played key roles within large operator organisations to take a more active involvement in the new MNO model. With the current rapid rate of technological change making flexibility and diversity of product offerings more compelling and with the complexity of the market making the single point of contact that MNOs provide more attractive, the future looks bright indeed. 

Chris Britton is Executive Vice President EMEA, Global Telecom & Technology

European Communications presents its regular round-up of the latest developments in the world of telecommunications

Developing mobile advertising
The GSM Association (GSMA) and the Mobile Marketing Association (MMA) have agreed to co-operate to accelerate the development of mobile advertising worldwide.  The two organisations will collaborate to deliver standardisation and transparency around current mobile advertising activity, and to develop new, innovative advertising techniques.
The MMA will lead the development of guidelines, formats and best practices for mobile advertising, while the GSMA will work with mobile operators globally to develop and prioritise consistent structures, such as inventory types, and commercial and measurement models that will allow advertisers to create valuable advertising propositions.
“This partnership will build on the MMA's work-to-date in the development of mobile advertising,” says Bill Gajda, chief commercial officer of the GSMA.  “The MMA and GSMA will bring leading advertisers, agencies and operators together to ensure that this very promising, but nascent advertising medium realises its full potential for the benefit of all players in the ecosystem.”
The agreement follows the recent announcement of the GSMA's Mobile Media and Entertainment Group that will oversee its Mobile Advertising Programme, which is made up of representatives from leading mobile operators from around the globe.
 “The value chain for mobile advertising is more complex than other media channels, with the mobile operator playing a key role, hence the driver for collaboration.  The GSMA brings the global GSM mobile operator community to the table and we are pleased to be working with them to expand the reach of a sustainable mobile advertising ecosystem,” says Laura Marriott, president of the MMA.  “We look forward to working jointly with the GSMA to deliver a consistent global industry standard for mobile advertising.”
Details: www.gsmworld.com /  www.mmaglobal.com
MVNOs on the rise
MVNOs will continue to grow on a global basis – with worldwide subscriber numbers more than doubling for the period from 2007 to 2012, according to a recent report, The Future of the MVNO, from telecoms research and consultancy firm BroadGroup Tariff Service. However the report warns that business models and distribution will need to change.
The report examines over 300 MVNOs in 37 countries, and profiles the main players in each of the main mobile markets where access to the incumbent's network has been allowed. It also evaluates the role of the country regulator in enabling the MVNO to become established in key markets.
The research reveals a wide range of different approaches and market drivers. The global mobile market is becoming more fragmented with the power of brands and distribution – together with the emergence of new low-cost MVNE aggregators – favouring the development of emerging niche MVNOs based on a small social community. The report features case studies based on in-depth interviews with BT, Lebara, Virgin Mobile and Blyk, each using a different business model.
Retailers and non-telecoms companies with strong customer relationships are using the MVNO model as a marketing tool to broaden and improve their existing customer experience, and so improve customer retention for their core business.
The distinction between pure MVNO and pure MNO is likely to become increasingly difficult to sustain as the MNO is utilising the MVNO technique of sub-brands or multi-brands to retain loyal customers.  As the larger MVNOs grow their subscriber base they also seek to develop a post-paid business stream and are adopting the characteristics of the MNO.
“The MVNO model is perceived as a perfect low cost entry vehicle to launch new mobile business models,” comments Margrit Sessions, Managing Director of BroadGroup Tariff Services.  “MVNOs can help lower prices in a market, but purely competing on price cannot be sustained as a long-term strategy. Developing new business models and distribution will be key to success.”
Details: margrit.sessions@broad-group.com

WiFi health scare
Health concerns surrounding WiFi have the potential to seriously undermine consumer confidence, and affect competition in the telecoms marketplace, according to telecoms consultancy Logan Orviss International.
The scientific community appears polarised by health concern reports – such as the 'test' that allegedly proved that WiFi radiation in the classroom was three times the level generated by mobile phone masts – with a portion of the community believing caution is imperative, and the remainder dismissing it all as irresponsible scaremongering.
“If schools across the UK are starting to rethink implementing WiFi, as reports have suggested, confidence is already rattled,” says Hugh Roberts, senior strategist for Logan Orviss. “Consumer behaviour and purchasing decisions in the private sector will be impacted.”
Roberts continues: “It is important to consider what could happen in the communications value chain. Wi-Fi offers a form of 'mobility' for fixed line operators who want to offer their customers converged services that include 'out of home experiences' without incurring mobile roaming tariffs for voice and data services. Even a small erosion of consumer confidence – which is now almost inevitable – will change the competitive landscape and will undoubtedly influence the future re-structuring of the telecoms industry.”
Logan Orviss notes two other areas that might be affected if these scare stories continue. One - telcos are investing in convergent services targeted at family groups, where the bill payer (typically a parent) is responsible for the overall profile of the family's usage, although individuals are able to top-up or modify their accounts in defined ways. Home networks – typically WiFi – have been an important part of the development of this comprehensive offering.  And two - apart from the potential decline in customer revenues from hardware and usage sales, telco advertising revenues for certain types of convergent services that utilise WiFi may be hit. Even with the current level of concern, the advertiser profile will start to change.
Details: www.logan-orviss.com
Must have VoD
Video on demand (VoD) revenues will reach $12.7 billion worldwide in 2011, making it one of the fastest-growing digital content services over the forecast period, predicts analyst and consulting company Ovum. Starting from a base of $2.7 billion in 2007, Ovum expects to see more telcos across the globe launching their on-demand content propositions, moving themselves into content distribution.
"VoD is not a revenue generator at the moment but a 'must have' vision of the future in terms of both cash flow and telcos' content business survival," says Aleksandra Bosnjak, Content and Media Analyst at Ovum.
"From a content provider's perspective, telcos and ISPs will be the new contributors to content distribution and film finance, especially over the long term as the service improves and reaches a more significant scale and enhances its on-demand functionalities," explains Bosnjak.
Telcos are facing competition from all kinds of players - from old pay TV media to new digital distribution entrants - and the pressures of network convergence. This, coupled with challenges around content acquisition costs and finding the winning VoD business model formula, will mean that it is not a source of cash for the moment.
"We argue that over the next five years, 50 per cent of telcos' costs will come from content acquisition and marketing-related activities," says Bosnjak. "In their quest for an innovative content strategy, some telcos will experiment with various forms of content finance, such as financial backing via minimum guarantees, or go even deeper into actual co-productions or co-ventures. In fact, we predicted this move back in February 2006 when we ran into telcos at the Berlin Film Festival. And we already see it happening with France Telecom, and with fledgling IPTV operator Croatia Telekom, whose Max TV service is producing its own short-format shows using in-house production talent and facilities."
Ovum's view is that a careful content strategy and locally adapted VoD proposition will be a major driver of telco VoD service revenues, now estimated to comprise one third of the whole VoD revenue pie, depending on the country.
"Understanding the cash flow of traditional content distribution and collaboration with local content players will be the best approach for many operators in this tough VoD race - because the future of TV content, and especially European content distribution, is based on an on-demand business model," concludes Bosnjak.
Details: www.ovum.com

The One Laptop Per Child initiative has drawn both great praise, and considerable criticism.  Lynd Morley takes an overview of the debate and the role telecommunications can play

The (almost) legendary Nicholas Negroponte was a keynote speaker at TMW in Nice this year.  The co-founder and director of MIT Media Lab, and author of the seminal Being Digital, came to talk about another of his brainchildren – the One Laptop Per Child (OLPC) initiative.
It may not seem the most obvious subject for a keynote address at one of the leading communications OSS events, but Negroponte pointed out in his opening remarks: “I think that a lot of the big problems in this world will all have solutions that include education.  And telecommunications and education are intimately tied.  I’m very fond of telling ministers of telecommunications that they are, in fact, ministers of education.  Because, until the world is really connected, education remains a very narrow phenomenon.
“If we look ahead to a world where children  – who are global by nature – have the opportunity to communicate with each other and learn, clearly telecommunications has a big role to play.”
OLPC is a non-profit organisation, set up with the goal of providing children in developing nations with laptop computers, offering – among other things – access to a whole raft of information.  As Negroponte stresses, OLPC is an education initiative, not a laptop organisation.
However, this apparently completely altruistic activity has generated an amazing amount of criticism and backbiting; some from the industry itself – including Intel and Microsoft – and much of it among the tech-heads and bloggers whose comments – some angry, some well meaning, some passionate about the greater need in developing nations for a whole range of things, from fresh water, to food, to healthcare – have provided a huge theatre of debate.   Indeed, the arguments came to a head recently when Negroponte effectively accused Intel of damaging the non-profit scheme by launching a competitive product – the Classmate PC – resulting in some countries, previously behind OLPC, now considering their options.  Given that Intel chairman Craig Barrett initially described the OLPC laptop – the XO-1 – as a 'gadget' and questioned the effectiveness of the scheme, the Classmate is an interesting development.  Intel, however, strenuously denies that it is undercutting its own prices in order to push OLPC out of what it has now decided is a lucrative market.
Other giant players in the industry, however, are committed to the OLPC initiative.  BT is backing the project.  Its chief science officer, Sinclair Stockman, comments: "The project aims, through connecting even the most disadvantaged to the Internet and the web, to provide them with an invaluable tool to build a better and safer future.
"One of the challenges - which is where BT is providing assistance - is how to extend the local net connectivity, which is built into the PCs and allows them to easily form local wi-fi based networks, on to the global Internet.
"This in turn allows Internet protocol access to a much richer source of information - and allows the children to participate in wider global communities.
“The technical challenges are just one hurdle to overcome,” he adds. "Others include language, content delivery, effective community sharing - and also assuring the trustworthiness of the connected community."
So, EC reader, whether you are of the Gordon Gekko school of thinking (greed is good, and b****r the consequences) or the – shall we say – Al Gore school, perfecting the art of ‘caring’, you can join the critics and nay-sayers, and dismiss the whole thing as another example of Negroponte’s supposed egomania (or just simply misplaced good intentions) or you might conclude that it may not be perfect, but at least the guy is trying something, and try to find out just how the telecoms industry can provide additional help.
Details: www.laptop.org

The European Conference on Optical Communication (ECOC) event organisers detail what to expect at this year’s show in Berlin

The 33rd annual European Conference and Exhibition on Optical Communication (ECOC) will take place on 16-20 September, 2007 at the Internationales Congress Centrum (ICC) in Berlin, Germany – Europe’s largest conference centre. 
The event, expected to be a sell-out for the first time since the height of the telecoms boom in 2000, will feature over 300 exhibitors and a comprehensive speaker line-up, including some of the world’s leading technical developers, both commercial and academic, addressing key industry topics.
Confirmed speakers at the conference include: Gregory Raybon of Alcatel-Lucent – 100 Gbit/s: ETDM generation and long haul transmission; Biswanath Mukherjee of the Department of Computer Science, University of California Davis, USA – Optical Networks: The Road Ahead; and Russell Davey of BT – Long-reach Access and Future Broadband Network Economics.
Within the exhibition, seven of the world’s largest carriers, AT&T, China Telecom, Deutsche Telekom, France Telecom Group, KDDI, Telecom Italia and Verizon, under the banner of the Optical Internetworking Forum (OIF), will show the results of months of interoperability demonstrations.
“This is the first time that we have had service providers take part in the exhibition and this is a clear recognition of the rise of optical communications to become integral to all players in the telecom and datacoms sector,” says marketing manager, Simon Kears.
Also on the exhibition floor, visitors will be able to see and take part in a number of new, interactive features: The FTTx Resource centre, delivered by The Light Brigade, will be a focal point for all things FTTx; the ECOC Market Focus seminars will feature presentations from senior executives at JDSU, Bookham and a view on the European FTTH market from Heavy Reading’s Graham Finnie; the latest products will be showcased in the live demonstration area; and the CTTS will give free practical training courses in fusion splicing and fibre preparation tools.
Details: www.ecocexhibition2007.com


EMC Europe/    Paris/    14-15 June/    www.theiet.org
Capacity CEE/    Prague/    18-19 June/    www.telcap.co.uk
NXTcomm/    Chicago/    18-21 June/    www.NXTcommShow.com
FTTx Summit/    Munich/    18-21 June/    www.iir-events.com
CommunicAsia/    Singapore/    19-22 June/    www.communicasia.com
Mobile TV World/    Rome/    21-22 June/    www.items-int.com
Optimising Telecoms Opex & Capex/    London/    25-27 June/    www.informatm.com
Mobile Content & Services/    Berlin/    25-28 June/    www.iir-events.com
Radio Planning Forum/    Monaco/    25-28 June/    www.iir-events.com
Service Quality Management & SLAs for Telecoms/    Berlin/    25-28 June/    www.iir-events.com
Telecoms Wholesale/    Berlin/    25-28 June/    www.iir-events.com
Revenue Assurance Summit/    Kuala Lumpur/    25-28 June/    www.iqpc.co.uk
Telecoms Loyalty & Churn/    Barcelona/    25-29 June/    www.iir-events.com
WDM & Optical Networking/    Cannes/    25-29 June/    www.iir-events.com
Digital Home/    Berlin/    2-4 July/    www.iqpc.co.uk
VoIP Asia/    Singapore/    16-19 July/    www.iir-events.com
Black Hat USA/    Las Vegas/    28 July-2 August/    www.blackhat.com
Telecoms World Africa/    Johannesburg/    30 July-3 August/    www.carriersworld.com
MVNO Summit/    Chicago/    6-8 August/    www.iqpc.co.uk
Wi-World Africa/    Johannesburg/    27-30 August/    www.carriersworld.com
GSM>3G ME & Gulf/    Dubai/    2-3 September/    www.gsm-3gworldseries.com
Fixed Mobile Convergence/    Chicago/    5-7 September/    www.pulver.com
SDP & SOA in Telecoms/    Berlin/    5-7 September/    www.marcusevans.com
IBC 2007/    Amsterdam/    6-10 September/    www.ibc.org
Telecoms Quality & Business Process Excellence/    Vienna/    10-11 September/    www.jacobfleming.com
Branding in Converging Comms/    Berlin/    10-11 September/    www.jacobfleming.com
Effective HR Management in Telecoms/    Amsterdam/    10-11 September/    www.jacobfleming.com
EXPP Summit/    London/    10-11 September/    www.expp-summit.com
Nordic & Baltic Telecom Forum/    Helsinki/    10-12 September/    www.marcusevans.com
Mobile Device Management/    Amsterdam/    12-14 September/    www.marcusevans.com
User Generated Content & Social Networking/    Rome/    12-14 September/    www.marcusevans.com
Evolving Telecoms/    Berlin/    17-19 September/    www.marcusevans.com
ECOC 07/    Berlin/    17-19 September/    www.ecocexhibition2007.com
MVNO Congress/    Vienna/    17-20 September/    www.iir-events.com
Number Portability/    Prague/    17-20 September/    www.iir-events.com
Strategic CRM in Telecoms/    Lisbon/    20-21 September/    www.jacobfleming.com
VSAT 2007/    London/    24-27 September/    www.comsys.co.uk
Carrier Ethernet World/    Geneva/    24-28 September/    www.iir-events.com
GSM>3G CEECom/    Prague/    25-26 September/    www.gsm-3gworldseries.com

Ignoring business continuity is no longer a reasonable option, yet many companies are wary of being sold a dud.  Patrick Roberts looks at the positive steps that can be taken to ensure a better-informed choice of solution

According to the Chartered Management Institute’s 2006 Annual Survey of Business Continuity Management, less than half of the organisations surveyed (49 per cent) had a “Business Continuity Plan covering their critical business activities.”  In understanding why so many businesses are still not investing in business continuity, despite the proven benefits, it is instructive to look back at an article, “The Market for ‘Lemons’: Quality, Uncertainty and the Market Mechanism”, published by Nobel Prize-winner George Akerlof over 30 years ago.  The article is an elegant exploration of the situation that evolves when the buyer of goods or services does not know the true value of what they are buying: they cannot distinguish between a high quality product and a ‘lemon’.   

BUSINESS CONTINUITY - A lemon by any other name?

Consider, for example, what would happen to the used car market if only two types of vehicle were offered for sale - high quality cars worth £10,000 and jalopies worth only £2,000 - and prospective purchasers could not tell the difference.  A buyer who believes that there are equal numbers of both types of vehicle on the market might be inclined to think: “I have a 50/50 chance of getting a good car or a ‘lemon’, so I’ll offer up to £6,000 for a car.”  But on further reflection the buyer realises that the owner of a quality used vehicle is very unlikely to sell it in a market where people are only willing to pay half of what it is really worth (they will find an alternative means of selling it), so, in reality, all that will be left in this market is jalopies.  The prospective buyer therefore determines not to pay more than £2,000 for a car bought in this way.
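The buyer's reasoning above amounts to a short expected-value calculation, and the collapse to jalopy prices follows directly from it. A minimal sketch, using only the article's own figures (the 50/50 split is the stated assumption):

```python
# Akerlof's 'lemons' unravelling, with the article's numbers.
good_value = 10_000   # a high quality used car
lemon_value = 2_000   # a jalopy

# A naive buyer offers the expected value of a randomly chosen car,
# assuming equal numbers of each type are on the market:
naive_offer = 0.5 * good_value + 0.5 * lemon_value   # £6,000

# But no owner of a good car will sell at £6,000, so only lemons
# remain for sale, and the rational offer falls to:
informed_offer = lemon_value   # £2,000

print(f"naive offer: £{naive_offer:,.0f}, informed offer: £{informed_offer:,.0f}")
```

The same logic underlies the article's warning about business continuity services: if buyers cannot distinguish quality, prices converge on what a 'lemon' is worth, and quality sellers leave the market.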
Measuring the true value of an investment in business continuity management is extremely challenging for the most experienced business continuity practitioners let alone a prospective purchaser with little knowledge of the subject.  The stage is therefore set for a classic ‘lemons’ problem where prices in the sector are forced down and, ultimately, both buyers and sellers are driven away from the market.  Whilst, in the illustration above, there are numerous practical alternatives to buying a second-hand car (eg buying new, public transport), simply ignoring business continuity is no longer a sensible option for most organisations.  Obviously the onus to remedy the situation lies largely with the business continuity profession and much is already being done by both individuals and professional bodies, such as the Business Continuity Institute, to improve awareness and understanding.  However, those involved in purchasing business continuity products and services - including training, consultancy services and IT solutions – also have a vested interest in becoming more knowledgeable in order to ensure that they are getting the best value for money.  The rest of this article concentrates on the positive steps that people in this latter group can take to ensure that they are better-informed consumers.
The publication of BS 25999-1 in November 2006 was a very important milestone, establishing a simple and robust lingua franca for business continuity management.  The document itself is less than 50 pages in length, so any prospective purchaser of business continuity services would be well advised to take the time to read it.  Building on this foundation, a great deal of general business continuity information is available in the form of public presentations: business continuity practitioners speak regularly to audiences from numerous professional and business organisations, and many of these events are free.  As one becomes more knowledgeable, it is also worth considering attending specialist events such as the Business Continuity Expo and the Business Continuity Institute Annual Symposium, where industry-leading practitioners discuss best practice and examine topical issues.  These simple steps give the knowledge and confidence to ensure that what is being purchased is actually appropriate to the business needs. 
Finally, a wide range of high-quality business continuity training is now available, with many organisations offering inexpensive one- or two-day introductory courses on the subject.  Some providers also offer courses in a convenient evening-class format or can even deliver bespoke training in your workplace.  Scenario-based crisis management exercises, where a management team has to wrestle with the difficulties of a simulated incident, are also a very enjoyable and effective way to raise awareness of business continuity issues and improve the ability of individuals and teams to manage in a crisis.  Taking advantage of some of these numerous training opportunities is undoubtedly the best way to equip yourself as a sophisticated business continuity consumer and ensure that you are getting good value from your investment.
In conclusion, the following simple case study is offered as a recent illustration of how improved buyer understanding can create real win-win outcomes for buyer and seller.  The client in question was a small public-sector body that received a day of Crisis Management training for its executive team as part of a national programme delivered by Needhams 1834 on behalf of the Cabinet Office Emergency Planning College.  As a result of this training, the executive team realised that their existing business continuity plan was not fit for purpose; it also gave them the confidence to ask Needhams 1834 to conduct a review of their plan and deliver a simple exercise.  In the event it took only an additional seven days of work to provide the client with a far more robust business continuity plan and facilitate a simple exercise to familiarise the crisis management team with its contents.  Surely this is a far better outcome (for both parties) than the client struggling on with an inadequate plan or paying for a great deal more consultancy than they really needed: improved understanding by the buyer led to a win-win situation.   

Patrick Roberts is Senior Consultant with Needhams 1834.

Needhams 1834 Ltd will be exhibiting at the Business Continuity Expo and Conference held at EXCEL Docklands, London from 28th - 29th March 2007.

Enterprise mobility should focus on message delivery not device type insists Peter Semmelhack

The benefits of mobile business solutions are hard to ignore for most enterprises today. Used for everything from tracking home visits by medical staff and maintaining airport x-ray machines to keeping beer flowing in pubs and hotels, mobile applications are giving customer-facing employees in the field the power to run their business wherever they happen to be. As Rob Bamforth, mobile applications analyst at Quocirca so neatly puts it, "the key application of all mobile devices is communication, and the 'killer' feature is relevance to the user."

ENTERPRISE MOBILITY - Don't get hung up on the phone

The debate has moved on to how best to manage mobile devices and treat them as the business enablers that they truly are. IDC predicts that $52 billion will be spent on all mobile services by 2010, with $1.5 billion of this being spent on mobile device management and security.

Let the user be the chooser
Traditionally, field service has been the poster child for enterprise mobility since, by its nature, it involves the delivery of timely customer information to a geographically dispersed workforce who are mobile for the majority of their working day. Because of this, field service engineers have often borne the brunt of early attempts at force fitting enterprise applications onto mobile devices that were never designed to handle that level of complexity. The result was many failed or less than stellar deployments due to lacklustre user adoption.
The BlackBerry revolution awoke executives to the freedom, flexibility and efficiency that could be gained through mobile access to applications. Now mobility is being demanded not only by field service operations but also in a variety of customer-facing roles. As IT managers juggle the mobility requirements, user preferences, and budget requirements of different business units across the enterprise, they soon realise that when it comes to mobility solutions one size clearly does not fit all.
A survey undertaken at Service Management Europe found that 23 per cent of respondents cited user acceptance as one of the major hurdles to successful mobilisation of data applications. This is precisely why Pitney Bowes UK set up working groups to consult its field service engineers on their device preferences and mobile application interface requirements, long before it standardised on a hosted enterprise mobility solution to deliver SAP and Siebel updates directly to field service engineers' devices throughout Europe.
It is therefore vital to consult field staff on the best device for their needs and the best interface to aid them in their collection of updates and daily reporting of services delivered or sales made. It is also important to retain flexibility by choosing a solution that can support future migration to new devices. The bottom line is choosing the right mobility solution from the start will lead to fewer headaches when deciding on devices and networks and when managing the evolution of the system over time.

Change costs
One of the major issues concerning IT directors is total cost of ownership (TCO). This cost is generally driven down by simplification. So any increase in complexity, such as a change in device type across the enterprise, or even in only one part of it, will inevitably increase TCO. And one thing is certain: change will come.
The device market is changing so fast that any decision made at the time of deployment could be almost obsolete within six months with newer, as yet unknown, devices coming onto the market offering immediate and significant incremental business value and/or cost reduction opportunities. In addition, the recent litigation between NTP and RIM sent shockwaves through the enterprise community in the US, where the BlackBerry has become essential to the average executive. With some employees becoming more dependent on their mobile device than their desktop PC, the enterprise mobility strategy must include the flexibility to minimise disruption if the company needs to switch devices to accommodate mergers or acquisitions, change of operator, customer demands or new technologies.
Gartner predicts that the overall TCO for mobile solutions will rise by 30 per cent for most enterprises. Gartner attributes this cost to "the increased support costs for a more-disparate set of mobile data users, lack of management of recurring monthly charges for mobile data services and the need to support point solutions across multiple types of wireless data offerings." Inevitably, if an enterprise opts for a mobility solution that already supports the majority of devices on the market, this will reduce costs, while also reducing the time to roll out mobile applications to a new set of devices. Enterprises must therefore plan for a multiple-device environment, so the business can take advantage of the value of new devices as they come onto the market. The cost of supporting multiple device-specific applications versus a single application for multiple devices must also be considered.

International deployments
When rolling out mobile applications to thousands of users across multiple territories, it is vital to choose software that supports the majority of devices without needing any modification. So whether staff choose to work on PDAs, Pocket PCs, BlackBerrys, notebooks or laptops, or even a touch-tone phone using IVR, they should still receive the same updates from the back-end applications running over SMS, GSM, GPRS, 1xRTT, two-way paging or Mobitex. This means you must deploy a flexible and extensible software solution that accommodates multiple-language support, and both wireless device and network variances, without any costly re-engineering of the application.
Field sales and service pose particular problems for larger enterprises because they involve the management of data delivery across multiple territories, over numerous networks. For example, a field engineer may need to run the same field service application interchangeably on either his BlackBerry or his laptop, depending on the service task he is undertaking. He may prefer to use his laptop for diagnostics in an area with poor wireless coverage; at the next job he may prefer to use his BlackBerry. Using real-time data communications, backed up by 24x7 monitoring from a reliable Network Operations Centre, ensures consistency and currency of the data, no matter which device is being used for the job in hand.
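The single-application, many-devices approach described above amounts to a rendering layer: one back-end update, formatted per device class, so the application itself needs no re-engineering when devices change. A minimal sketch in Python; the device names, payload shapes and `render` function are illustrative assumptions, not the API of any actual product:

```python
from dataclasses import dataclass


@dataclass
class Update:
    """One back-end update (e.g. a job dispatch from SAP or Siebel)."""
    job_id: str
    summary: str


def render(update: Update, device: str) -> str:
    """Format the same back-end update for the target device class."""
    if device == "blackberry":
        # Short text for small screens and SMS-class bearers
        return f"{update.job_id}: {update.summary[:40]}"
    if device == "laptop":
        # Full payload for a rich client
        return f"Job {update.job_id}\nDetails: {update.summary}"
    if device == "ivr":
        # Text to be read aloud over a touch-tone phone system
        return f"Job {update.job_id}. {update.summary}."
    raise ValueError(f"unsupported device: {device}")


u = Update("J-102", "Replace toner unit at client site")
print(render(u, "blackberry"))  # → J-102: Replace toner unit at client site
```

Adding a new device type here means adding one rendering branch, not rewriting the back-end application, which is the cost argument made above.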

Host with the most
We stated earlier that simplification reduces cost. One significant way to simplify application delivery is to use a hosted 'on-demand' model, enabling employees to access mobile applications on whatever device they happen to be working on at that time. Analysts at Unstrung advise that "if the use of BlackBerries, Treos, or other mobile messaging devices extends beyond a few top executives, it's time to consider outsourcing the management thereof." Using a hosted model allows central management of devices, with different user privileges for different groups. Depending on the service provider, this model can also enable 24x7 monitoring of multiple networks around the world, to guarantee message delivery.
According to industry analyst AMR, the software-as-a-service model grew by 60 per cent in 2005, and this is driving the CRM market. The next logical step is to extend this hosted software to the field sales and service staff who have most contact with your customers. This means getting critical data held within enterprise CRM and ERP systems, such as Siebel, NetSuite and SAP, onto their mobile devices, so that they have the most up-to-date customer information at their fingertips while they are on site with customers.
We have discussed the need to choose an enterprise mobility solution that supports the widest range of handheld devices. But sometimes the most effective way to get critical information to an engineer is to route the message to a landline near the site where the engineer is working. This is particularly important for employees working on hidden assets such as underground piping, or in hospital environments where there is no wireless coverage. So enterprise mobility isn't always about accessing data from a PDA or phone; sometimes traditional communications technologies are the most effective route to get messages to your mobile employees. So don't get hung up on the phone, or on one particular device. Enterprise mobility is all about ensuring that relevant, up-to-date information reaches the right person at the right time - using the most effective delivery method possible.
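Choosing "the most effective delivery method possible" amounts to ordered fallback across channels: try the preferred wireless bearers first, then drop to a landline or IVR call when the engineer is out of coverage. A minimal sketch, assuming hypothetical channel names and send functions; this is an illustration of the pattern, not how any particular product implements it:

```python
def deliver(message, channels):
    """Try each (name, send_fn) channel in preference order.

    Returns the name of the first channel that accepted the message;
    raises if no channel could reach the recipient.
    """
    for name, send in channels:
        try:
            send(message)
            return name
        except ConnectionError:
            continue  # channel unreachable, try the next one
    raise RuntimeError("no delivery channel reached the recipient")


def unreachable(msg):
    # Simulates a bearer with no coverage (e.g. engineer underground)
    raise ConnectionError("no wireless coverage")


received = []


def landline(msg):
    # Simulates routing to a phone near the work site
    received.append(msg)


used = deliver("Boiler fault at site 7", [
    ("gprs", unreachable),
    ("sms", unreachable),
    ("landline", landline),  # traditional channel as the final fallback
])
print(used)  # → landline
```

The ordering of the channel list encodes the policy: wireless first for speed, wireline last as the guaranteed route, which is exactly the trade-off described above.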

Peter Semmelhack is CTO of Antenna Software

