Features

Will the huge success of SMS be eclipsed by later generation services? Not exactly, says Priscilla Awde

Virtually everybody does it every day because it’s a cheap, fast and easy way to keep in touch and do business. No, we are not talking about making phone calls but sending SMS messages, which, after telephony, is the most successful and ubiquitous communications service ever. The big question is: is it threatened by next generation mobile instant messaging (IM), e-mail or multimedia messaging (MMS)? The consensus is that it is not, at least not for several years.
SMS may even have a negative impact on the success of advanced messaging services simply because it is integrated into all phones, and is known, liked and used by millions who see little advantage in switching to more expensive, if media-rich, alternatives. Mobile phones are automatically set up for SMS, and the service works without data connections, is interoperable across networks and devices, is international and is reliable.
In Western Europe alone, Gartner estimates that over 180 billion SMS messages were sent in 2006 and expects growth of over 20 per cent, reaching 210 billion in 2011. In developing countries, where low incomes favour cheap, easy-to-use devices and services, SMS is booming. In the last year India has added six million mobile subscribers each month, and growth in China is similar. In Africa roughly three per cent of the population have bank accounts, but around 20 per cent have mobile phones.
SMS traffic is used for secure micro-payments and to share information in all sectors, from agribusiness to health care and from fishing to financial services, as well as for voting in political elections and entertainment shows. Expats use voice SMS for cheap international calls, and it overcomes literacy and keypad barriers by providing a ‘talk/listen’ alternative to ‘type/read’.
Deepak Natarajan, Head of Marketing, Roamware predicts that over 2.3 trillion text messages will be sent in 2010: “SMS is alive, kicking and thriving and the vast majority of the global population is interested in it. Operators have a huge incentive to continue supporting SMS because many organisations use it as a communications channel. New sticky applications/features make people reluctant to change.
“Low cost handsets have produced a revolution in communications, but the next step needs cheap, available, easy to use next generation devices. Within ten years there will be a big growth in MMS and mobile IM but prices are not being lowered sufficiently across the board to motivate users to move away from SMS.”
The situation is not helped by innovative value added SMS features (storage, divert, print and filtering), which encourage usage, increase margins and reduce churn. “The original SMS infrastructure was store-and-forward but now it is direct delivery, so texts can be used for transactional services,” explains Jeff Wilson, Chairman, Telsis.
Growth in advanced messaging has been inhibited by complicated set-up, relatively expensive smartphones and tariffs, and slow 3G roll-out. Flat rate pricing and bundling accelerate take-up, but the choice of messaging service depends on circumstances - each is applicable to different situations. “Price, interoperability, ubiquity and user familiarity always inhibit new services, but different messaging services will co-habit - growth in one increases traffic in others,” says Jay Seaton, Chief Marketing Officer at Airwide Solutions. “Operators’ revenues from SMS exceed all other messaging types combined. Carriers can add sticky enhanced SMS services over existing infrastructure quickly and cost-effectively, and are increasing SMS revenues from personalised services.”
Andrew Moran, Head of Enterprise, Symbian, agrees that take-up is slowed by the lack of compelling consumer price plans: “If operators get the pricing right, users will flock to new services, driving up ARPU. Applications are available and the magic ingredients are there for a whole suite of messaging services if operators can hit on the right model.”
Mobile IM is perhaps the natural successor to SMS, and some analysts estimate the market will be worth $1 billion in 2011. It may find applications in customer service and could be useful for cross- and up-selling.
“Although it is a different type of communication, an on-going chat, personal IM complements SMS although it won’t be a substitute,” believes Gabriel Solomon, Senior VP at the GSM Association. “SMS will co-habit with new services – the top 20 per cent of consumers using multimedia services still want to talk to the rest of the population via SMS. One shot, fast SMS messaging will continue to be used for more and more applications as a secure system where pre-paid accounts limit fraud.”
Extending computer IM and e-mail functionality into the mobile environment is a natural progression suggests Doug Brackbill, CMO at Visto. “The desktop and mobile environments will merge to allow seamless communications: people will move between systems. Messaging services are not competitive but meet different needs.”
Although MMS is mainly used to send informal and impromptu photographs, it may merge with SMS as graphics or ‘emoticons’ are automatically added by servers.
MMS is important for mobile advertising where subsidies will change the whole messaging market. Although mobile spam is less acceptable than on PCs, tariffs will drop for people willing to accept non-intrusive, relevant, useful, multimedia adverts. The subsequent lower costs will likely drive advanced messaging take-up. SMS will however still play a big part in advertising because of the huge addressable audience - some companies see it as the most economical way of reaching millions.
“MMS has been disappointing, but we are starting to see growth more recently, especially for providers delivering content rather than for personal messages,” says Stephanie Pittet, Principal Analyst at Gartner. “Mobile IM and e-mail will become more important, especially in mature countries with high-end devices. It’s a slow start but operators are striking deals with IM providers and flat fee e-mails will be important for consumers. All are niche markets now. MMS won’t ever reach the levels of SMS, which has high margins and which users expect to be central to all offerings.”
Mike Short, VP R&D for the O2 Group, believes text-based SMS will be popular until 2010 because of its convenience, familiarity, ubiquity, easy availability and the number of promotions relying on it. After that, he expects other capabilities to be integrated into handsets, especially e-mail, which makes the work environment mobile. “This is an evolution not a revolution: there will be higher adoption in some places than others. Convenience and reach will drive new services,” says Short. “Customers do what’s most applicable. We need to make it easy and give them the widest range of services possible.”
Hilmar Gunnarsson, VP corporate development at Oz asks why operators launched MMS instead of just adding a picture facility to SMS. “Rather than marketing new services, carriers should evolve the underlying architecture to enhance and add new capabilities, giving users richer messaging experiences under the SMS umbrella. We need to think of things from users’ perspectives, keep things simple and easy to use. E-mail should be embedded into all phones and applications pre-integrated.”
The market is moving towards a situation in which people will not care which messaging system they use but will simply choose the most convenient or appropriate. Mobile unified messaging will give users a single in-box, address book and number accessible via any device. “When 4G networks become available in 2015/16, IP will be the prime network connection and there will no longer be a clear split between MMS, IM or SMS – messaging will be converged,” believes Aram Krol, market development manager, Acision.

Priscilla Awde is a freelance communications journalist

The promise of IPTV is fraught with dangers - from outages to poor quality pictures - but effective systems test and measurement could save the day. Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde

Imagine this: virtually the whole nation settling in to watch the Rugby World Cup or a European soccer final, and the television picture goes down for thirty minutes or freezes just as the home side is about to score. The picture may flicker at critical times, the sound may be unsynchronised, or users may be unable to change channels quickly. Perhaps the latest film released on Video on Demand (VOD) can be paid for but not downloaded, or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, has no need to imagine. Robert Winters, Chief Marketing Officer at Shenick Network Systems, predicts a riot should any one of these all too feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of outages and expect picture-perfect television. Guaranteeing quality of service and of individual subscribers' experiences is, however, a major and often underestimated challenge for all service providers, but especially in the IPTV environment, where lost packets, jitter and latency, combined with poor network architecture and an inability to scale, all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications and are scaleable. The cable and satellite operators have also negotiated long standing agreements with major studios and other content owners.
Alan Robinson CEO at Shenick suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video/television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best effort approach to service levels is no longer good enough in the IPTV world, where packet losses have become more serious than ever. User expectations have risen with exposure to digital video consumer electronics and DVDs, which provide high quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators, and they have delayed the roll out of some early commercial IPTV launches. Others involve more technical issues, including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
"It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployment of DSL broadband applications, in which quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss of or interference with IPTV signals but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the right equipment and carrying out pre- and post-deployment and real-time network monitoring and testing are essential," explains Winters. "Operators must continually test the quality of the subscriber's experience and monitor service assurance to deliver the best possible results."
Among the long-standing but significant factors affecting service levels are the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will, at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture can support and demonstrate how the existing network will react to application loads both at launch and going forward.
Transmitting next generation services over legacy architecture raises constant challenges - the ability to scale and, ultimately, performance - problems that must be addressed at the earliest stages of IPTV launches.
"Prior to deployment, operators must decide which vendors to use for IPTV: which set top boxes, DSLAM equipment, network components, routers, switches, core transport and encoders, among others," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues, the performance limitations? Will the network need to be re-architected to provide more bandwidth, or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
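The kind of pre-deployment emulation described above can be illustrated with a toy model. The sketch below is an assumption-laden stand-in for a real test system such as Shenick's, not a description of it: the stream bitrates, take-up probabilities and uplink capacity are all invented for the example.

```python
import random

# Toy pre-deployment load emulation - an illustrative sketch only.
# All rates and take-up probabilities below are assumptions.
SD_RATE_MBPS = 3.5     # assumed SD IPTV stream rate
HD_RATE_MBPS = 9.0     # assumed HD IPTV stream rate
VOIP_RATE_MBPS = 0.1
DATA_RATE_MBPS = 1.0

def household_demand_mbps(rng):
    """Draw one household's peak-hour downstream demand in Mbps."""
    demand = 0.0
    for _ in range(rng.choice([1, 1, 2, 3])):          # 1-3 active TV sets
        demand += HD_RATE_MBPS if rng.random() < 0.3 else SD_RATE_MBPS
    if rng.random() < 0.5:                             # VoIP call in progress
        demand += VOIP_RATE_MBPS
    if rng.random() < 0.6:                             # web browsing
        demand += DATA_RATE_MBPS
    return demand

def overload_probability(n_households, uplink_mbps, trials=1000, seed=42):
    """Estimate how often aggregate demand exceeds the aggregation uplink."""
    rng = random.Random(seed)
    overloads = sum(
        sum(household_demand_mbps(rng) for _ in range(n_households)) > uplink_mbps
        for _ in range(trials)
    )
    return overloads / trials

if __name__ == "__main__":
    # How many households can an assumed 1 Gbps uplink serve before risk bites?
    for n in (100, 150, 200):
        print(n, "households:", overload_probability(n, 1000.0))
```

Even a crude Monte Carlo like this shows the shape of the question a test lab answers empirically: at what subscriber count does a given architecture start to overload?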
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system diversifEye 4.0 gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions such as network surges. So operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
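As a concrete, much-simplified example of the full-reference testing described above, the sketch below scores decoded frames against a known good source using PSNR. Frames are modelled as flat lists of 8-bit luma samples, and the 30 dB "acceptable" floor is an assumption for the example; real QoE systems use far more sophisticated perceptual models.

```python
import math

# Illustrative full-reference quality check: compare received, decoded frames
# against a known good source using per-frame PSNR.
def frame_psnr(reference, received, peak=255):
    """Peak signal-to-noise ratio in dB; higher means closer to the source."""
    if len(reference) != len(received):
        raise ValueError("frame size mismatch")
    mse = sum((r - d) ** 2 for r, d in zip(reference, received)) / len(reference)
    if mse == 0:
        return math.inf                      # bit-exact frame
    return 10 * math.log10(peak ** 2 / mse)

def stream_score(ref_frames, rx_frames, floor_db=30.0):
    """Fraction of received frames at or above an assumed PSNR floor."""
    scores = [frame_psnr(r, d) for r, d in zip(ref_frames, rx_frames)]
    return sum(s >= floor_db for s in scores) / len(scores)
```

The point of the end-to-end arrangement the article describes is that the comparison runs on the frames a subscriber actually receives, so transport impairments and decoder behaviour both show up in the score.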
"We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
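The channel-change surge Robinson describes can be estimated with a simple binomial model: if each viewer independently zaps within the same one-second window with some probability, and each zap costs the DSLAM one multicast leave/join, the chance of exceeding the DSLAM's join rate is a binomial tail. The capacity and probability figures below are illustrative assumptions, not vendor data.

```python
from math import comb

def p_joins_exceed(viewers, zap_prob, joins_per_sec_capacity):
    """Exact probability that simultaneous zaps exceed the assumed DSLAM
    join capacity: the upper tail of a Binomial(viewers, zap_prob)."""
    return sum(
        comb(viewers, k) * zap_prob**k * (1 - zap_prob) ** (viewers - k)
        for k in range(joins_per_sec_capacity + 1, viewers + 1)
    )

if __name__ == "__main__":
    # 500 viewers, 20% zapping at an ad break, DSLAM handling 120 joins/sec.
    print(p_joins_exceed(500, 0.2, 120))
```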
Testing early and often is a mantra operators ignore at their peril since it is difficult to debug problems in live IPTV deployments. Consistent low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
Changing camera angles, choosing what to watch and when, or having on-screen 'chats' with friends are big drivers for IPTV, but most are bandwidth intensive. Equally, the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video/audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
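Winters' point about splitting application flows and giving television priority can be sketched as a toy admission-control calculation. The 30 Mbps access figure comes from the article; the per-flow rates, VLAN names and priority order are assumptions for illustration.

```python
# Toy per-household budget: flows ride separate virtual LANs and television
# takes priority. All per-flow figures are invented for the example.
ACCESS_MBPS = 30.0

FLOWS = [  # (name, vlan, rate in Mbps, priority: lower admits first)
    ("hdtv-1",   "video", 9.0, 0),
    ("sdtv-2",   "video", 3.5, 0),
    ("voip",     "voice", 0.1, 1),
    ("vod",      "video", 6.0, 2),
    ("internet", "data",  8.0, 3),
]

def admit(flows, capacity_mbps):
    """Admit flows in priority order until the access line is full.
    Returns (admitted names, rejected names, remaining headroom in Mbps)."""
    admitted, rejected, used = [], [], 0.0
    for name, _vlan, rate, _prio in sorted(flows, key=lambda f: f[3]):
        if used + rate <= capacity_mbps:
            admitted.append(name)
            used += rate
        else:
            rejected.append(name)
    return admitted, rejected, capacity_mbps - used

if __name__ == "__main__":
    print(admit(FLOWS, ACCESS_MBPS))
```

Dropping the capacity below the sum of the flows shows the effect of the priority split: best-effort data is shed first while the television streams stay up.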
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers, but operators are likely to see performance issues when they scale up to millions of customers. It is then that they may have to extend fibre deeper into the network, though fibre to the home/curb/node (FTTH/C/N) architectures are becoming cheaper and increasingly feasible, especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively, operators may turn to WiMAX as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea for instance access speeds of 100Mbps are commonly available and not expensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VOD, which may raise more contention/capacity problems especially if one person in the house is downloading a video whilst another is watching broadcast television.
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels or generate viruses, making it important to test firewalls and simulate the effects of such attacks in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.

Priscilla Awde is a freelance communications journalist

The promise of IPTV is fraught with dangers – from outages to poor quality pictures - but effective systems test and measurement could save the day.  Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde

Imagine this: virtually the whole nation settling in to watch the rugby world cup or European soccer final and the television picture goes down for thirty minutes or freezes just as the home side is about to score a goal. It may flicker at critical times, the sound be unsynchronised or users unable to change channels quickly/efficiently. Perhaps the latest film released on Video on Demand (VOD), can be paid for but not downloaded or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, needs no imagining. Robert Winters, Chief Marketing Officer at Shenick Network Systems, instead predicts a riot should any one of these all too feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of any outages and expect picture perfect television. Guaranteeing quality of service and of individual subscriber's experiences are, however, major and often underestimated challenges for all service providers, but especially in the IPTV environment where lost packets, jitter and latency, combined with poor network architecture and inability to scale, will all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications and are scaleable. The cable and satellite operators have also negotiated long standing agreements with major studios and other content owners.
Alan Robinson CEO at Shenick suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video/television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best effort approach to service levels is no longer good enough in the IPTV world where packet losses have become more serious then ever. User expectations have risen with exposure to digital video consumer electronic equipment and DVDs, which provide high quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators and, which have also delayed roll out of some early commercial IPTV launches. Others involve more technical issues including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
 "It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployment of DSL broadband applications in which the quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss or interference with IPTV signals but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the best and right equipment, carrying out pre and post deployment and real-time network monitoring and testing are essential," explains Winters. "Operators must continually test the quality of subscriber's experience and monitor service assurance to deliver the best possible results."
Among the old but significant factors affecting service levels are the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture will be able to support and demonstrate how the existing network will react to application loads both at launch and going forward.
The constant challenge of transmitting next generation services over legacy architecture is the ability to scale, and, ultimately, performance - all problems that must be addressed at the earliest stages of IPTV launches.
 "Prior to deployment operators must decide which vendor to use for IPTV; which set top boxes; DSLAM equipment; network components; routers; switches; core transport and encoders, among others, they will use," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues; the performance limitations? Will the network need to be re-architected to provide more bandwidth or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system diversifEye 4.0 gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions such as network surges. So operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
 "We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
Testing early and often is a mantra operators ignore at their peril since it is difficult to debug problems in live IPTV deployments. Consistent low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
Changing camera angles, choosing what to watch, when, or having on-screen ‘chats' with friends are big drivers for IPTV but most are bandwidth intensive. Equally the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video/audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers, but operators are likely to see performance issues when they scale up to millions of customers. It is then that they may have to extend fibre deeper into the network, although fibre to the home/curb/node (FTTH/C/N) architectures are becoming cheaper and increasingly feasible, especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively, operators may turn to WiMAX as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea for instance access speeds of 100Mbps are commonly available and not expensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VOD, which may raise more contention/capacity problems especially if one person in the house is downloading a video whilst another is watching broadcast television.
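The contention shift Winters and Robinson describe comes down to simple arithmetic: multicast load grows with the number of distinct channels, while unicast load grows with the number of viewers. A minimal sketch, with illustrative figures:

```python
def multicast_mbps(distinct_channels: int, rate_mbps: float) -> float:
    # Broadcast/multicast: one stream per distinct channel, however
    # many subscribers happen to be watching it.
    return distinct_channels * rate_mbps

def unicast_mbps(viewers: int, rate_mbps: float) -> float:
    # Unicast (e.g. VOD): one dedicated stream per viewer.
    return viewers * rate_mbps

# 1,000 subscribers spread across 50 distinct channels at 4 Mbps each:
print(multicast_mbps(50, 4.0))   # 200.0 Mbps on the aggregation link
print(unicast_mbps(1000, 4.0))   # 4000.0 Mbps if everyone went unicast
```

The twenty-fold gap in this toy example is why a shift towards personalised, unicast viewing raises the contention and capacity problems the interviewees anticipate.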
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security, but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels, or may generate viruses, making it important to test firewalls and to simulate the effects of such attacks in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.

Priscilla Awde is a freelance communications journalist

Laura Marriott looks at the way operators are able to marry new technologies, such as mobile TV, with mobile marketing initiatives in order to deliver the services subscribers want, in a personalised and convenient form

There are several key stakeholder groups in the mobile marketing value chain. Each has a different set of objectives and requirements. For the advertisers it is about their ability to implement highly-targeted, measurable marketing initiatives to a wide audience. For the operators it is about rewarding customer loyalty (or preventing churn), increasing average revenue per user (ARPU) and developing service differentiation. Consumers, meanwhile, demand compelling content delivered to them in a personalised and convenient manner.
All these players will benefit from the development of a prosperous mobile advertising industry.
The opportunity is certainly a large one. According to ABI Research, worldwide mobile marketing and ad spending will reach $19 billion by 2011, up from $3 billion in 2007.
For mobile operators, mobile marketing and advertising represents a relatively rare opportunity to create value from a non-subscriber revenue source. As most operators around the world still struggle to ramp-up subscriber usage of lucrative new data services, creating new revenue streams becomes ever more vital, especially as core voice revenues continue to decline due to price pressures.
There are five main types of mobile marketing content delivery: voice (e.g. IVR), mobile internet (e.g. WAP), messaging (e.g. SMS/MMS), video and television, and downloadables (Java/Brew, etc.). Traditionally, the operator has used such delivery mechanisms to deliver "on-deck" content, which has required the content provider - or mobile marketer - to have a direct relationship with the operator. It was considered vital that the operator "owned" the customer and that this relationship was not compromised by third parties - other brands, advertisers or billing providers that could dilute the operator's relationship with its customer base. The downside of this model was that the operator shouldered the burden of marketing the new services, content was often limited or its potential unrealised, and it was difficult for third-party brands to truly leverage their value.
As a consequence, many of these early content-based initiatives were not compelling as far as the end-consumer was concerned, and a few failed to deliver the desired uplift in operator revenue.
Thanks to the rise of mobile marketing and advertising this model is evolving. As operators are increasingly able to generate revenue from marketing initiatives, their mobile web sites can begin to operate more like a portal than a mere storefront. This means that off-deck and on-deck content can be combined to provide the consumer with an almost unlimited range of content and services.
These emerging business models are also being shaped by new mobile technologies, most notably by the rise of mobile broadcasting. Today the mobile TV market is a small one in terms of revenue and consumer adoption, but the industry predicts strong growth. The first nationwide mobile TV services went live in the US and Europe in 2007 after years of trials and customer usage studies.  Nevertheless, the technology is in its infancy as far as consumers are concerned: of the estimated 233 million handsets in the US, only around 10 million are able to access mobile TV. (Sources: m:metrics, eMarketer, Forrester, Yankee Group).    
As the two industries - mobile marketing and mobile TV - emerge, an opportunity has arisen for them to complement each other and stimulate further growth. A recent study by m:metrics, commissioned by the Mobile Marketing Association (MMA), found that among US subscribers interested in mobile TV services, 41 per cent said they would watch mobile adverts in order to access mobile TV services free of charge. Another 20 per cent said they would accept mobile advertising if it meant accessing services for a reduced fee.
Advertising within mobile TV environments can take many forms. As with traditional TV, advertising can occur before (pre-roll), after (post-roll) or during programming (including product integration or branded entertainment) - all formats most consumers will be familiar with. In addition, a number of mobile-centric advertising elements can be incorporated in the mobile programming. These include real-time text overlays, real-time image overlays (watermarks), premium-rate SMS (PSMS) links to encourage Participation TV, and avatar branding alerts.
The FIFA World Cup in Germany in 2006 proved a fertile test bed for many of these emerging business models. Mobile Dreams Factory, a specialist in delivering mobile marketing campaigns and an MMA member, was the company behind one of the most successful of these: Vodafone's "Videogoals 3D", which delivered animated, 3D videos of goals scored during the matches to mobile devices within minutes of the real-time experience.
3D models of all the teams and players were created. When a goal was scored, the play was reconstructed and animated within seven minutes of the real-time activity. The video was converted to a mobile format that was sent to Vodafone clients and to Marca.com, a leading Spanish media group, for publication on the Internet.
Videogoals 3D was the most visited of Vodafone services during the World Cup and the most visited of Marca.com's internet services, with more than nine million videos viewed. Not only did Vodafone associate its brand with football via this high-appeal, unique product, but the company was also able to promote its Vodafone Live 3G content-based service, including driving MMS revenues. The service was recognised at the MMA's annual mobile marketing awards 2006 for its ability to integrate multiple cross-media elements to communicate and deliver on the service availability.
For advertisers, the World Cup represented an opportunity to reach a huge audience but there were two main problems: traditional advertising was often prohibitively expensive for smaller companies and the event itself was so saturated with advertising that some less well-known brands could easily get lost in the noise. Mobile TV provided a solution to both problems.
CEPSA, a leading gas and oil company in Spain, worked with the Mobile Dreams Factory to create a World Cup-related real-time videoblog. The company sent two reporters to Germany to provide additional information on the activities and training sessions of the Spanish team and to share other anecdotes.
This was considered the first time that a real-time video service had captured and simultaneously transmitted a signal to thousands of users during a major event such as the World Cup. Using 3G mobile devices, the service captured video and transmitted it via the Internet, while audio files travelled over the operator's network; the two streams were then brought together on a platform in Madrid. The system recorded all the live connections so viewers could watch them later as time-delayed videos.
The real-time videoblog was innovative because it involved the convergence of digital media, using the mobile device both as a communication tool and as the medium itself. Integrating 3G technology with both mobile and internet services provided the consumer with an exceptional mobile experience, while extending the brand for CEPSA and connecting it with the FIFA World Cup. The service was given the Innovation Award for Creativity in Technology at the MMA's mobile marketing awards 2006.
The success of such pioneering services during the World Cup has seen them subsequently taken up elsewhere, to similar effect. The Videogoals 3D service, for example, has since been rolled out by Telefonica's Movistar in Spain, which successfully integrated Coca-Cola as the campaign advertiser of the service. The model satisfies all members of the value chain: Coca-Cola is able to reach Movistar subscribers; Movistar is able to offer free, exclusive content that differentiates it from competitors; and the user receives exclusive, free premium content.
The portal exhibits the corporate image of the advertiser, and banners featuring the Coca-Cola logo are placed around the site. These can be static or animated, and the system automatically detects the user's device in order to serve adapted and appropriate media.
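The device-detection step might look something like the following sketch. Everything here - the user-agent keywords, file names and the two-variant model - is a hypothetical illustration, not the actual system behind the campaign.

```python
# Hypothetical banner selection by handset capability. The variant
# file names and user-agent keywords are invented for illustration.
BANNER_VARIANTS = {
    "static": "sponsor_banner_static.gif",      # safe for basic handsets
    "animated": "sponsor_banner_animated.gif",  # for more capable browsers
}

def pick_banner(user_agent: str) -> str:
    """Serve the static variant to basic WAP browsers and the animated
    one to everything else (a deliberately crude heuristic)."""
    ua = user_agent.lower()
    if "up.browser" in ua or "wap" in ua:
        return BANNER_VARIANTS["static"]
    return BANNER_VARIANTS["animated"]

print(pick_banner("SonyEricsson UP.Browser/6.2"))
print(pick_banner("Nokia6230i/2.0 Profile/MIDP-2.0"))
```

Production systems typically consult a device-capability database keyed on the user-agent string rather than simple keyword matching, but the principle is the same: one campaign, many device-adapted renderings.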
The service saw some astonishing results. In the first three months of operation, more than 10,000 users registered for the service, and it generated around one million banner impacts and downloads.
It's still early days, but such case studies demonstrate how operators can marry new technologies such as mobile TV with mobile marketing elements to deliver compelling services to subscribers. Hopefully, 2007 will see many more examples and plenty of unique innovations.

Laura Marriott is President of the Mobile Marketing Association

Is Apple really stealing a march on the mobile industry? 
Lynd Morley takes a look

Apple's iPhone may be over-hyped, over-priced and (so far) not over here, but despite that it's been a bit of a cage-rattler for the mobile industry - both vendors and service providers. And so it should. Yes, we know all the nerdy objections to the thing are valid: most of the gadget's features are already available on existing phones from other vendors. And while the touch-sensitive screen is very clever and the 'visual voicemail' feature breaks new ground (for the first time the device vendor seems to be dictating network-supported features to the service provider), these are hardly innovations to set the industry quaking in its boots.

What the industry should really worry about with the iPhone only became blindingly obvious in early September, when Apple launched its latest iPod (the music device with the white earplugs that young persons tend to wear). It looked strangely familiar...

In fact the new iPod Touch is really the iPhone without the phone. It's a WiFi-enabled music player with the same case, the same touch screen and, most important, the same icon-driven interface. What Apple has now delivered is two identical 'p's in a pod - a phone and a player. Importantly, there is certainly more to come. This is the concrete expression of Apple's iLife framework for integrating the consumer's digital world. Functions such as satellite navigation, camera and storage can all be spun out as stand-alone products or spun in, in various combinations, to share functions on a convergence device.

Using it, the Apple brand can be spread like a viral infection across multiple existing categories of electronic device, and even a few we've yet to think of - most of all across mobile phones. In fact what Apple is selling with its iLife is not the same old gadgetry but an electronic wardrobe of matching accessories. It may be brilliantly trendy and a marketing triumph, but the approach is also brilliantly necessary.
That's because the natural brake on the utility of digital media devices of all types has always been their complexity at the user interface - the 'fiddlingaboutness' generally required to do things with portable digital content means most of us under-use the facilities already available. There's a much better chance of us all doing complicated things like synching up devices and transferring files and songs to a central library if the processes are easy to follow and execute (and aren't completely different on each device you come to). So that's why Apple might just succeed - and the nagging worry for the mobile phone incumbents is that maybe they, or perhaps a combination of players, should have grasped this nettle more deftly themselves and moved on from the 'button-driven' gadget environment to a personal device systems market first.

Mobile TV debacle
The FLO Forum, a global body of more than 80 companies, today reacted to the European Commission’s Communication on “Strengthening the Internal Market for Mobile TV”.
The FLO Forum applauds the Commission’s efforts to advance the high potential mobile TV opportunity in Europe, including the focus on spectrum, harmonisation, standardisation and regulation. However, the FLO Forum believes that the Communication’s intention of favouring any one particular mobile TV technology for Europe could stall the advancement of a healthy European mobile TV eco-system.
Dr. Kamil A. Grajski, President, FLO Forum said of the Commission’s Communication: “The FLO Forum supports the principle of technology neutrality, which the major European industry groups have been calling on the Commission to respect. There is a reason why the principle of technology neutrality exists and that is to ensure that the market can choose which technology delivers the most attractive solution for the consumer. Each country has its own unique market conditions and each mobile broadcasting technology standard has very different performance characteristics. Locking the European market into one technology model is potentially harmful to the growth of mobile broadcasting in Europe and will hinder the development of innovative technologies.”
“Despite its youth, the mobile TV marketplace already offers multi-standard and multi-technology products and services - from chipsets to broadcast network transmission equipment. It is now cost-effective and routine to consider multi-modal mobile TV handsets. These developments should allow for the take up of attractive broadcasting services that will enable economies of scale. Technology is not the problem, but restriction of choice will be,” added Grajski.
“The Commission’s support for DVB-H for mobile broadcasting in Europe is based, in part, on the suggestion that a mobile TV technology mandate, like the GSM mandate, is necessary to achieve economies of scale and to position European companies globally at the technology forefront. But the analogy is contrary to the market reality today,” said Grajski. “The mobile TV industry is still in its early stages, but the GSM mandate came after GSM had launched with wide commercial success. Technology mandation for mobile TV in Europe is not supported by the facts,” Grajski continued.
Regarding FLO technology, Grajski notes that “recent independent trials of FLO technology in the UK involving several EU-based FLO Forum members highlighted significant technical advantages, which lead to savings on infrastructure spending. FLO offers twice the capacity of DVB-H, or alternatively the same capacity but with a network build-out at significantly reduced cost. This can translate into millions of euros difference in capital and operating expenditures for a network.”
Concluding, Grajski notes that “Technology mandation is not an appropriate regulatory tool in innovative and dynamic markets such as mobile TV, especially where the market remains undecided and where the technology continues to evolve rapidly.”
Details:  www.floforum.org
 
Enterprise mobility
Indoor base stations, cellular radio enhancements and IP Multimedia Subsystem (IMS) will give mobile operators crucial new capabilities as they battle with WLAN vendors to exploit the enterprise mobility market, according to a new report, Seizing the Opportunities from Enterprise Mobility, published by Analysys, the global advisers on telecoms, IT and media.
"The limited coverage and throughput and the relatively high prices of indoor cellular services make it difficult for mobile operators to satisfy enterprises' requirements for mobility", according to report co-author, Dr Mark Heath.
"However, the combination of three major technological developments could radically enhance the capabilities available to mobile operators, enabling them to make substantial improvements to their enterprise mobility solutions, and to fend off competing solutions from the WLAN community."
Key findings from the new report include:
• Indoor base stations will significantly improve the coverage and performance of indoor cellular service, allowing mobile operators to devise different charging strategies for indoor services, including low-cost or free internal calls
• Cellular radio enhancements, such as HSPA+ and CDMA2000 1x Revision B, will increase the throughput and capacity of cellular systems to match those of WLANs, particularly with indoor base stations
• IMS will give mobile operators the functionality they need to integrate and control their indoor base stations and to deliver the flexibility, sophistication and interworking between services that enterprises will expect.
"Armed with these new technologies, mobile operators are well placed to attack the mobile enterprise market", according to co-author Dr Alastair Brydon.
"One approach would be to use the technologies to integrate enterprises' existing systems and applications with their cellular networks, although this would need substantial support from systems integrators. An alternative, albeit more radical, tactic would be to aim for pervasive cellular mobility, whereby the same cellular network solution is used to deliver all of an enterprise's services and applications in every environment. For some enterprises, the simplicity and uniformity of a common cellular service in all environments will have major benefits," says Brydon.
Details: http://research.analysys.com

Cost savings
To realise the true potential offered by mobile working, organisations must move towards delivering secure access to real-time, line-of-business applications. With a plethora of new devices, software and innovations coming to market the mobile workspace is constantly evolving. However, mobile and wireless technologies are notoriously averse to standardisation - users consistently experience technology fragmentation, interoperability issues and rapid obsolescence.
IDC predicts that in 2008 businesses will be focused on regaining control over mobility developments - having a clear vision as to how solutions should evolve to achieve flexibility, ease of use and cost savings. "Although a mobile enterprise deployment will require some up-front investment, most companies expect that, over time, the benefits will outweigh expenses, and eventually cost savings will be realised," said Stacy Sudan, research analyst, Mobile Enterprise Software.
A survey conducted at IDC's 2006 Mobility Conference identified that the combined IT spend of delegates was in excess of £300 million, with the average expenditure being £81 million. Of the delegates surveyed, 95 per cent indicated that their mobile budgets would increase in 2007, by an average of 40.3 per cent. Respondents' key priorities beyond providing email access were customer relationship management, sales force and field force automation, and the implementation of additional security measures such as authentication and digital signatures.
Details: www.idc.com
 
Co-ordinating anti-spam
IronPort has announced the results of an anti-spam pilot project conducted by a coalition of leading European telcos and security organisations. The seven-month project delivered improved spam catch rates, while revealing the need for a cooperation framework with more partners to standardise reporting formats, share information and adopt a common set of anti-spam best practices.
In January 2007, IronPort joined forces with ETIS – The Global IT Association for Telecommunications, TeliaSonera, KPN, Belgacom, Telenor and Dutch research organisation TNO. The coalition’s goal is to eliminate the majority of spam at the European network level, before it even reaches the mail servers.
Throughout the pilot IronPort has provided insight into spam trends and methods, and best practice tips acquired by working with other large ISPs throughout the world.  The company also provided technology to all pilot members to enable them to see how IronPort’s reputation filtering & anti-spam technology eliminates spam.
The project trialled a combination of different technologies and co-operation procedures with positive results: one of the participating ISPs remedied close to 16,000 spam incidents in less than a day during the pilot period.
The group found that the active information exchange among ISPs, especially regarding spam traffic flows received from one another, is a successful approach towards reducing spam.  Even within the limited time frame of the pilot, this level of information sharing dramatically reduced the spam rates as well as customer complaints in the participating ISPs. The existence of trust among partners under the ETIS umbrella allowed the process of resolving spam incidents to be automated to a great extent.
The Anti-Spam Pilot Project members will propose a road map for the expansion of the project to the major European telcos at the next ETIS Information Security Working Group meeting, to be held in Delft, The Netherlands, on September 27.
Details: http://www.ironport.com/

End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist, Nexus Telecom tells Ian Scales

Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.

It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems and to verify service levels. Essentially, says CEO, Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors. What's changing, though, is the importance of such end-to-end transaction data.

Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly. Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access. It's an idea whose time may have come. According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.

As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself.

There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way. Sutter says some are, but some are still grappling with the concepts.

"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."

This misses the point in a number of ways, claims Sutter.

"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple - in fact it's rather the other way about. The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work."

And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.

"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that. Telcos need to harness network data - I call them 'transactions' - to develop their businesses."

Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.

"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."

So end-to-end transaction data is important and will grow in importance. How does Nexus Telecom see itself developing with the market?

"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis. This tells you what's happening so you can plan network capacity and so on. But these systems never, ever go to layer 7 and tell you about transaction details - we can.

"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems. Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."

So given that telcos are recognising the need to gather and store, what's the future OSS transaction record architecture going to look like?

"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers. So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation. Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies - the marketing people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."
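Sutter's layering can be sketched in miniature: a collection layer ingests each transaction record once, a storage layer holds them openly, and different presentation-layer consumers query the same store at the granularity they need. The record fields, class name and queries below are illustrative assumptions, not Nexus Telecom's actual design.

```python
class TransactionRepository:
    """Storage layer: one open store of correlated transaction records,
    queryable by any presentation-layer application."""

    def __init__(self):
        self.records = []

    def ingest(self, record):
        # Collection layer hands correlated, aggregated records off here.
        self.records.append(record)

    def query(self, **filters):
        # Presentation layers (marketing, billing, network ops) query
        # the same store, each with its own filters.
        return [r for r in self.records
                if all(r.get(k) == v for k, v in filters.items())]

repo = TransactionRepository()
repo.ingest({"service": "voip", "status": "ok", "duration_s": 120})
repo.ingest({"service": "voip", "status": "failed", "duration_s": 0})
repo.ingest({"service": "iptv", "status": "ok", "duration_s": 3600})

print(len(repo.query(service="voip")))   # 2
print(len(repo.query(status="failed")))  # 1
```

The point of the sketch is that the data is collected and stored once; the marketing view, the billing view and the network-operations view are just different queries over the same repository.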

Does he see his company going 'up the stack' to tackle some of these applications in the future?

"It is more important to have open interfaces around this layering. We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."

Sutter thinks the supplier market is already evolving in a way that makes sense for this model.

"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses. We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network. Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."

So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers.

"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution. "You would come along and ask, 'Hey guys, can you give me access to your database?', the response would be 'Woah, don't touch my database. If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces. One piece will collect the data, correlate and aggregate it, the second will run the application and the presentation to the user. At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join so that data may be stored in an open data to the repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best in class products for collecting the data and presenting it and this will be a good thing for the entire industry. After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always loose the whole data collection and repository for that application at the same time?"

With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?

"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."

But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like 'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."

So where can Nexus Telecom go from here? Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?

"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring. But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”

Ian Scales is a freelance communications journalist.

Agillic/Telenor Sonofon
To compete effectively in Denmark’s mobile market, Telenor Sonofon relies on a unique way to secure customer loyalty. The operator avoids impersonal and costly direct marketing campaigns by using Agillic’s Post-Paid CLM Business Solution to create an individualised, one-to-one relationship with each high value post-paid customer. This approach has significantly improved customer relations, reduced churn and increased overall ARPU.
“Without Agillic’s solution Telenor Sonofon was unable to effectively execute its one-to-one communication strategy. We wanted to take advantage of the in-depth knowledge we held on our customers to enable us to enter into consistent, intelligent dialogues across all our communication channels. We believed this approach would have a positive effect on our churn levels,” explained Martin Kildgaard Nelson, CRM Manager at Telenor Sonofon. “We were right. Having the ability to automatically advise and educate each customer on the services and products that match his needs transformed the way we were able to communicate with our customers.”
Agillic’s technology makes this possible. “The CLM solution supports successful customer interactions driven by business rules and real-time event and behaviour triggers, which empowers our customers to initiate their own dialogues. Telenor Sonofon responds automatically with relevant messages, which can be reinforced simultaneously through multiple touch points such as MMS, SMS, e-mail and the Internet,” said Martin. “We now create ongoing learning relationships with each customer through the use of both historical data and real-time interactions from a central system, which has been instrumental in gaining user loyalty and trust.”
Telenor Sonofon uses a number of innovative campaigns to encourage customers to join its loyalty programme. The results have exceeded expectations. “We initially hoped for a 20 per cent registration rate. Within a year, over 35 per cent of our post-paid customers signed up to our loyalty programme,” said Martin. “The churn rate also dropped significantly, to among the lowest in the Danish wireless market.”
This success was driven by Agillic’s technology. Concludes Martin: “We noticed a 50 per cent churn reduction with our post-paid campaign and a 5-10 per cent increase in ARPU. This was the biggest success of all of our marketing promotions. These figures are undeniable proof that Agillic’s Post-paid CLM Business Solution provides excellent business results.”
To read the full Telenor Sonofon case study, log on to European Communications’ website www.eurocomms.co.uk

Qualcomm/BSkyB
Two trials conducted by Qualcomm and BSkyB in laboratory and ‘live’ settings in and around Cambridge and Manchester have confirmed the performance claims made of the MediaFLO mobile broadcast technology. The trials featured 11 channels from BSkyB delivered to non-commercial devices from Qualcomm. Factors such as total throughput, single frequency network (SFN) operation, network acquisition, channel switching time, layered modulation and video codec performance were all evaluated.
The trials aimed to test various technical performance claims for the MediaFLO system and to perform comparative analysis against DVB-H. To this end, all comparative laboratory measurements were based on common test equipment for MediaFLO and DVB-H, while drive test routes for DVB-H and MediaFLO were nominally the same. The result was a thorough technical analysis of FLO technology's capabilities, confirming previous performance claims about the MediaFLO system.
From a technical perspective the trial showed that the FLO physical layer performs as well as or better than previously claimed, with laboratory and field performance results in substantial agreement. Comparatively, whether in the laboratory or in the field, the DVB-H physical layer underperformed the FLO physical layer by around 4.5dB.
Results At A Glance:
•    MediaFLO physical layer field performance was around 4.5 dB overall better for non-layered modes with comparable bit per second per hertz capacity
•    This dB advantage could allow a MediaFLO network either to cover twice the geographical area per transmitter when applying modes of equal capacity - resulting in a substantial reduction in network expense - or to provide double the service offering on a channel count basis for a constant cell size with the same spectrum and transmitter deployment
•    Testing demonstrated that MediaFLO is capable of supporting 20 channels of QVGA video and stereo audio in a single 5MHz spectrum allocation. This can be scaled for an 8MHz UHF channel. This performance represents a 20 per cent increase in channels relative to prior performance claims of 20 video channels per 6 MHz channel
Details: www.qualcomm.com


Great changes are afoot in the telecoms industry – offering considerable potential to industry players, while current achievements deserve to be celebrated now.  Lynd Morley looks back at TMW Nice

It was a year of ‘firsts’ for the TM Forum in Nice this Spring.  The theme of the show, for instance, moved away, for the first time, from a purely telecoms focus to one that not only recognised, but actively embraced, the massive changes which are bringing communications, information and entertainment services together into a huge pot of convergence.
Commenting on these developments, TM Forum President, Martin Creaner, noted: “The pace of change is such that it is becoming hard to define the ‘telecom’ industry any more, and it’s just as hard to define the cable or media industries either.  I think we are truly witnessing the birth of a telecom/cable/media/web ecosystem.”  Certainly the range of companies from these various industries attending the event – including Time-Warner, Virgin Media, UPC Broadband and Disney, in addition to the usual suspects from the telecom OSS/BSS world – underlined the expansion of the TM Forum’s remit, while the keynote speakers, including MIT Media Lab’s Nicholas Negroponte, discussing his One Laptop Per Child initiative, and Rory Sutherland, Vice Chairman of Ogilvy Group UK, pointing to the changes in the advertising industry which are being brought about by online advertising, underlined the event’s breadth of vision.
The atmosphere of collaboration was also emphasised by the TM Forum’s own new collaborative ventures.  Following the OSS through Java Initiative’s (OSS/J) decision in early 2006 to join with the TM Forum, the organisation’s determination to expand its scope and reach was further emphasised at TMW by the announcement that it would be merging with two more organisations – the Global Billing Association (GBA) and the Internet Protocol Detail Record Organisation (IPDR).  Commenting on the announcement, Alex Leslie, GBA CEO, noted: “We seemed to be looking at the same kind of issues from different angles.  Given that the BSS and OSS industries really need to converge, it makes enormous sense for us to work together.”
Collaboration was undoubtedly the ‘plat du jour’ in Nice where, among the many conference sessions – covering the six broad topics of Business Transformation, Convergence, Customer and Services, IT and Operations, Systems and Software, and Billing and Revenue Management – not to mention the many hospitality events and the two exhibition halls, senior executives from many of the world’s largest providers of telecom and cable services, network operators and content providers managed to get together to discuss how best to collaborate to drive down the integration tax.
Another ‘first’ this year was the TM Forum Awards, presented at an evening ceremony – complete with formal dinner and attendees looking resplendent in black tie and evening dress.  Steve Fratini of Telcordia was presented with the rarely conferred honour of distinguished fellow of the TM Forum, while BT was the first winner of an award recognising exceptional contribution to the organisation.  Given the youth of the awards, and the obvious delight of the recipients, here is a list of the winning companies:
Best practice award for a service provider – Korea Telecom
Best practice award for a supplier – Progress Software
Prosspero award for TM Forum standards adoption by a supplier – Hewlett-Packard
Prosspero award for TM Forum standards adoption by a service provider – BT
Best Catalyst for TM World 2007 – the Product and Services Assembly Project
Most innovative supplier – Highdeal
Best new OSS or BSS product – Arantech.

Outsourcing hardware developments to technology consultants can give the best chance of success, and could help to avoid many sleepless nights, according to Tim Fergus

The world of electronic hardware and product development is a challenging environment. From small start-up companies right through to large multinationals, the need to meet ever-changing user requirements and launch products in a timely, cost-effective fashion is key to long-term success. The demand for the latest function or increased performance drives the development process with unrelenting urgency. Under such pressures the need to succeed is paramount and may well dictate the future of the company.

This has to be balanced with the need to keep staff costs down, and companies will often find they are resource-limited and need to look for outside assistance. This can take many forms, but the most common is to use external short-term employees or contractors to cope with peaks of demand. In some instances the complete outsourcing of a work package, or of the development itself, may make sense. This can be an effective method for a very targeted work package.
The challenge in contracting out work is to ensure that the work package or complete development is delivered to quality and in a timely fashion. This will ultimately depend on the contractor chosen, how they interact with the client organisation and the level of responsibility they offer.
This begs the question – how to ensure that contracting out work results in the greatest chance for success with the least intervention?
By taking ownership and responsibility for your hardware development, technology consultants (TC) can offer the best chance of success while delivering value beyond that expected.
For effective outsourcing of development work, the TC needs to possess key skills and abilities in addition to focus and drive. These qualities make TCs highly effective at delivering results in a short timescale.
By understanding not only the work to be done, but also the client's requirements and future needs, the TC is ideally placed to drive progress where it matters. Such an understanding drives the project forward to completion in a controlled and rapid way. This breeds confidence and ensures that the job is delivered in as short a time as possible while minimising costs.
Consultants are sometimes pictured as removed from the action – report-writing and advisory in nature rather than actually doing the work. TCs are, however, more practically focused. Effectively, they are professional contract engineering services at the sharp end of development. Their wide breadth of knowledge allows them to go from top-level system definition to implementing hardware, often working directly on the bench with a soldering iron.
Whereas individuals may focus on specific tasks, TCs can take a much higher level of responsibility, in both their time management and the product development. Effectively they take ownership of the project until completion. This becomes much more important when the work to be done is a discrete package or even a complete product development. TCs can also provide direction to other sub-contractors or short-term individuals employed by the client, or even to client staff directly. This frees up the client and removes the burden of day-to-day resource management.
Consultants are generally broad in their experience and understanding. They may not be familiar with your product or technology, but their ability to learn rapidly means they deliver insight and value in a short time. Such flexibility is key to the consultant's survival in a rapidly changing world, which benefits the client. A larger consulting firm will offer many talented individuals – something the client will appreciate – and removes the need for careful and painstaking contractor selection, as this has already been done by the TC company. They are that rare breed of excellent engineers with the business acumen and drive to succeed.
TCs are keen to ask questions and take a view from outside and above the project. They don't accept what they are told without questioning the reasoning behind the decision process. Such insight can be invaluable on projects and gives a helicopter view of how everything fits in – or does not! In many cases the simple questions are the ones to ask. TCs know the bounds of what is possible and will flag up unreasonable assumptions.
In general large TC firms have expertise outside the technology or industry in which they normally work. For example a consultant doing hardware development may be able to access experts in IT, project management, change management, strategy and planning. It is best to use a TC that has a department or practice that fits with the technology to be developed. In some instances a TC firm will have its own specialist test equipment which may be available for use on client site or for use on the client project.
In some instances a complete development of software, hardware, industrial design and product fabrication, test and approvals can be conducted by a single TC firm. The ability to work in an integrated multi-disciplinary team is key to delivering quality product designs. Even smaller firms can offer these services by subcontracting to partners; however, if possible, it is best to keep all skills in one place, since co-location is key to optimum team dynamics.
On paper at least, TC firms can seem more expensive than contractors from other sources. This is, however, not true if you consider the value added to the project. Often the use of TCs can shorten development timescales, free up other members of client staff, or help with strategy and vision. When you consider such benefits, the true value can be understood.

Tim Fergus is a Principal Consultant with PA Consulting’s Wireless Technology Practice, and can be contacted via: tel +44 1763 267492;
e-mail: innovation@paconsulting.com
www.paconsulting.com/wireless

Inline and out-of-band LAN security appliances offer different levels of functionality. Understanding these differences is key to selecting the right product for your organisation, says Jeff Prince

The local-area network (LAN) has emerged as a security risk, subject to insider misuse as well as external attacks. Threats can arise from a number of sources, including rogue hosts on wireless networks, guests plugging into open ports in a conference room, contractors or partners needing access to corporate resources, and the continued movement of laptops between the corporate LAN and the Internet. At the same time, malware is escalating because attacks are easier to build, faster to spread and motivated by financial gain.
The IT department finds itself having to provide more points of access into the LAN without compromising systems and data. In response to these challenges, vendors have developed a variety of LAN security devices. Enterprises looking to secure their LANs will find these platforms readily available and easy to deploy within an existing network infrastructure.
LAN security devices fall into two broad classes - those that operate inline and those that operate out-of-band. Inline platforms are deployed between the wiring closet switch and the network core and are distributed throughout a network, close to users. They function as both a policy decision point and an enforcement device, because they sit in the stream of network traffic.
Out-of-band LAN security appliances are centrally located and typically connect to a switch in the core. They are not directly in the flow of traffic and therefore act as a policy decision point, with enforcement being delegated to other infrastructure devices, usually the wiring closet switch in the distribution layer.
Inline and out-of-band LAN security devices differ in terms of their interoperability with existing infrastructure, the security services they support, and the operational issues they pose.
A LAN security device must protect the LAN from both internal and external risks. To be effective, the platform should support key functions including network admission control (NAC), traffic visibility, post-admission control, and malware control.
NAC includes authentication and host posture check. It allows the IT department to verify that users are who they say they are and the machine they are using complies with corporate standards (for example, running an approved operating system with current patches and fixes and an updated anti-virus program). The best devices incorporate NAC that:
-    Supports both active and passive authentication
-    Leverages existing identity stores for authentication
-    Identifies a user’s role as part of authentication, which is essential for applying control policies to that user following admission to the network
-    Provides ubiquitous host posture check that applies to all classes of users, including employees, contractors and visitors - without burdening IT
-    Works with multiple host agents
-    Supports hosts not under enterprise control
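As a rough illustration of how such an admission decision might combine authentication, posture and role, consider the following toy function. All field names and criteria here are invented for this article; no real NAC product exposes exactly this interface:

```python
# Toy NAC admission check combining authentication, host posture and role.
# Field names and approved-OS values are hypothetical.
APPROVED_OS = frozenset({"win10", "macos"})

def admit(host):
    """Return (admitted, role); a failed posture check leads to quarantine."""
    if not host.get("authenticated"):
        return (False, None)                      # active/passive auth failed
    posture_ok = (
        host.get("os") in APPROVED_OS             # approved operating system
        and host.get("patched", False)            # current patches and fixes
        and host.get("antivirus_current", False)  # updated anti-virus program
    )
    if not posture_ok:
        return (False, "quarantine")              # contain, don't fully admit
    # The role identified during authentication drives post-admission policy.
    return (True, host.get("role", "guest"))
```

The key point the checklist makes is visible in the return value: the user's role is established at admission time so that post-admission control policies can be applied to it.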

Traffic visibility
Traffic visibility is a pre-requisite for access control and auditing, because devices cannot control what they cannot see. Look for the level of visibility granularity that will deliver the level of control your business needs. For granular control, a LAN security platform must:
-    Tie all LAN traffic to the user and not simply to IP or MAC addresses
-    Provide key user data, including login/logout time, applications run and resources reached
-    Perform deep packet inspection on all flows and not just sampled traffic
-    Retain statistics about all flows for regulatory compliance and accounting purposes
-    Track security incidents, including those relating to host posture checks, policy violations, authentication failures and malware events
-    Provide real-time and historical data
-    Provide an aggregated view of the LAN's security health
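A minimal sketch of what "tying traffic to the user" might look like as a data structure (hypothetical names, for illustration only – real appliances implement this in silicon, not Python):

```python
# Hypothetical per-flow record keyed to the user rather than an address.
from dataclasses import dataclass

@dataclass
class FlowRecord:
    user: str          # tied to the user, not just an IP or MAC address
    src_ip: str
    application: str   # identified by deep packet inspection
    bytes_seen: int = 0

class VisibilityTable:
    """Aggregated, per-user view of LAN flows for auditing and compliance."""
    def __init__(self):
        self.flows = {}

    def observe(self, user, src_ip, application, nbytes):
        """Record every flow, not just sampled traffic."""
        key = (user, application)
        rec = self.flows.setdefault(key, FlowRecord(user, src_ip, application))
        rec.bytes_seen += nbytes

    def user_summary(self, user):
        """Historical per-application byte counts for one user."""
        return {r.application: r.bytes_seen
                for r in self.flows.values() if r.user == user}
```

Keying flows by user rather than by address is what allows the later requirements – role-based control and per-user audit trails – to be met.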
In terms of traffic visibility, inline and out-of-band LAN security appliances offer significantly different capabilities. Inline devices have the capacity to see everything that goes by because they sit in the flow of traffic, while out-of-band appliances have no visibility into ongoing LAN traffic.
Post-admission policies provide control over where users go and what resources they can access once they are admitted onto the network. For the most granular security, a LAN security platform should provide post-admission control functionality that:
-    Ties all LAN activity back to specific users – this link enables the IT department to define rights and permissions, as well as control and enforcement actions, based on a user’s role in the organisation
-    Supports universal access control – this architecture ensures the correct rights and permissions are applied to all users, regardless of the access method used, or location from which they attach to the LAN
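Role-based post-admission control can be pictured as a simple policy table. The roles and resources below are an illustrative toy, not any vendor's policy language:

```python
# Toy role-based post-admission policy: the same rights apply to a user
# regardless of where or how they attached to the LAN. Roles and
# resources are invented for illustration.
POLICY = {
    "employee":   {"email", "intranet", "crm"},
    "contractor": {"intranet"},
    "guest":      {"internet"},
}

def allow(role, resource):
    """Permit access only if the user's role grants the resource."""
    return resource in POLICY.get(role, set())
```

Because the lookup is by role rather than by port or address, the same policy follows the user whether they connect over wireless, a conference-room port or a VPN – the "universal access control" property described above.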
Post-admission control capabilities of inline versus out-of-band security appliances vary greatly. If designed with comprehensive traffic visibility, an inline device can apply per-flow packet handling, allowing for granular control based on user, group, and application, even layer 7 content. Since enforcement is built in, the platform is able to inspect user traffic and apply controls at LAN speed.
Lacking traffic visibility, out-of-band appliances are limited in their access control capabilities. In addition, since out-of-band appliances are dependent on distribution switches for policy enforcement, they have limited enforcement control over user traffic.

Malware control
Malware detection and blocking provides the IT department another tool for protecting the LAN. Worms, viruses, bots, spyware and other malware can wreak havoc with network availability. Comprehensive post-admission traffic visibility and control is required to contain malware. When evaluating a LAN security appliance for malware control, look for devices that:
-    Granularly block bad traffic. For example, giving the IT department the flexibility to block all traffic from an infected user or just the infected application
-    Recognise and contain ‘zero-hour’ attacks
-    Operate close to the host to limit the spread of malware and minimise system and network damage
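The granular-blocking requirement can be illustrated with a toy decision function. The event fields and the escalation threshold are invented for this article:

```python
# Toy containment decision: block only the infected application when
# possible; quarantine the whole host for zero-hour or multi-application
# infections. Event fields and the threshold are hypothetical.
def enforcement_action(event):
    if event["zero_hour"] or len(event["infected_apps"]) > 1:
        return ("quarantine_host", event["host"])
    return ("block_app", event["host"], event["infected_apps"][0])
```

The point is the granularity: a single infected application need not cost the user their entire network connection, while a fast-spreading unknown attack is contained at the host.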
Because they sit in the traffic path, inline LAN security platforms can continuously monitor traffic in real time and scan it for malware. Operating inline also enables this class of device to respond quickly and directly apply enforcement actions.
Out-of-band appliances cannot perform malware control, as they have no traffic visibility once a user has been admitted onto the LAN.
It is important to evaluate a LAN security appliance for its potential impact on network and IT operations – specifically, whether it affects LAN performance or the IT department's ability to troubleshoot the network.
Out-of-band LAN security appliances generally don’t affect LAN performance.
In contrast, inline devices must have high performance characteristics to keep up with LAN traffic at line speed and perform functions such as deep packet inspection and continuous real-time monitoring and enforcement.
Inline devices that rely on off-the-shelf processors will not be able to sustain gigabit speeds and are likely to negatively impact LAN performance.
In terms of troubleshooting, inline platforms have the advantage of being simpler to manage and troubleshoot than out-of-band devices, because they combine policy decision and enforcement functions in a single box. With out-of-band appliances, the IT department must determine which device, the LAN security appliance or switch, is the source of a problem.
In selecting a LAN security appliance, IT and security personnel need to consider the range of internal and external threats their LAN faces, along with the specific requirements of their organisation. Which appliance is best will depend on a number of factors, including the set of security services desired, the granularity of traffic visibility and control needed and where in the network IT prefers to implement their LAN security.
Organisations that want only admission control will find good options among both out-of-band and inline devices. Businesses that want to implement more post-admission controls should focus on inline devices, since out-of-band appliances are much more limited in these functions.
Regardless of architectural approach, the IT department needs to move quickly to protect against LAN security risks.
Jeff Prince is CTO, ConSentry Networks

From its roots as a broadcast technology conference and exhibition, IBC has evolved to become a leading event focussed on the creation, management and delivery of content for the entertainment industry. Ian Volans takes a look at what will be on offer at the show this year 

EVENT PREVIEW - IBC 2007

Like telecommunications, the broadcast television sector is living through a period of rapid change.  Audiences are fragmenting as the handful of national channels that originally broadcast on analogue migrate to digital terrestrial transmitters, and accommodate new services.  At the same time, high definition is increasing the quality expectations of viewers and changing production techniques.
Even in countries where multi-channel satellite pay-TV and cable add to the competitive mix, broadband is challenging the status quo with the introduction of a new distribution channel for television and video entertainment.  And with virtually every European adolescent and adult carrying a mobile, the concept of watching television on the “fourth screen” is beginning to gain traction, if more slowly than the mobile industry would like. 
In the face of this changing landscape, the IBC 2007 conference and exhibition continues to advance, reflecting new technologies and commercial realities. 

Broadcasting by broadband
Mobile has been a recurring theme in IBC’s conference in recent years.  In 2007, the growing importance of IPTV and the distribution of video content over the Internet will be reflected in the opening theme day of the conference, Broadcasting by Broadband on Thursday 6th September. 
After several false dawns, broadband now provides a delivery channel that offers an alternative to the traditional television and radio broadcast business model. Broadcasting by Broadband will explore how broadband providers, which are more closely aligned to telcos than to broadcasters, will affect the broadcast landscape.
Regulators, equipment manufacturers, service providers and content owners are all stakeholders in this developing new world of interactive multimedia, but the rules are not yet fully understood. Each technology has its proponents, but they are more focussed on competition than co-operation. The end users – consumers – want the content they want, at the quality they want, in the place they want, at the time they want it.
The opening theme day for IBC2007 will begin with a jargon-free technical description and analysis of DSL, WiFi, WiMax, Powerline, ultra wide band, digital terrestrial, digital satellite and mobile TV to provide a comprehensive understanding of the technologies, their capabilities and their role in a business plan.
A business environment session will look at regulation and finance; at the implications of telcos and ISPs being successful in challenging for spectrum released as terrestrial services go digital; and how the regulation of content developed for broadcast may be applied to broadband delivery.  The day will conclude with case studies from organisations already providing a mix of services, including broadcast radio and television, over various broadband-enabled platforms.
The latest developments in mobile TV and video consumption will be a central strand in the Digital Lifestyles - media to your home or on the move conference theme day on Saturday 8th September.
As the communications and media industries converge, opportunities to serve the digital home expand. This is just part of a broader trend towards a digital lifestyle, characterised by media on the move, digital delivery of media to the home and accessible media storage within the home.
Whether it is broadcast, webcast or user-generated, all content is increasingly contributing to the growth phenomenon of social networking. This presents new challenges to the traditional business models of broadcasters and advertisers. The interplay of the four screens – cinema, television, computer and mobile - demands cross-platform media solutions.
The Digital Lifestyle theme day brings together case studies and guidance that address the options for repurposing content for different networks, consumption environments and storage; the DRM challenges of cross-platform delivery; the potential impact of the one billion mobile phones shipped in 2006 - and again in 2007 - on media capture and delivery; and the growth of a possible fifth screen – in-car navigation devices.
As well as exploring new media opportunities and challenges within the conference, mobile and IPTV technologies also feature strongly in the exhibition. 
In 2005, a dedicated Mobile Zone was created within the IBC exhibition to provide an opportunity for application developers, content providers and technology companies to showcase their capabilities at the centre of the broadcast industry's leading international event. It doubled in size in 2006, and will be bigger again in 2007. 
Mobile Zone exhibitors are diverse and drawn from across the ecosystem that is rapidly growing up around mobile TV and video.  This year’s exhibitors range from designers and turnkey suppliers of end-to-end mobile TV broadcast networks, such as ENENSYS Technologies and LARCAN, to weComm, the company that developed the Sky Anytime on Mobile solution enabling users of 120 different mobile devices to access the UK satellite broadcaster’s content.  Frontier Silicon expects to use IBC to unveil a multi-standard RF and baseband system-on-a-chip that will be vital to delivering economies of scale in mobile TV handsets while the addressable market remains fragmented by the variety of broadcast standards deployed in different countries. 
Qualcomm will be present in the Mobile Zone for the third year running.  While in much of Europe deployment of broadcast mobile TV is stalled pending the release of digital dividend spectrum, Qualcomm is progressively rolling out its MediaFLO network across the US.  At a mobile TV conference in March, Jeff Brown, Head of Global Strategy and Development for Qualcomm, cited forecasts from Wall Street analysts Bernstein Research suggesting that MediaFLO could become the world’s largest single multi-channel pay TV platform within five years.  By the time of IBC in September, Qualcomm may be in a position to provide a progress report on its US venture.
New in 2007, the IPTV Zone will provide an opportunity to explore the technologies and developments that are allowing broadband providers to compete with traditional broadcast distribution. Exhibitors in the inaugural IPTV Zone encompass big broadcast names such as Grass Valley; global technology players like Texas Instruments; middleware specialists such as MHP software solutions provider Osmosys; and HD set-top-box specialists ADB and Vidanti.
Some Zone exhibitors have relevance across both mobile and IPTV.  Snell & Wilcox, renowned for its image processing, conversion and compression technologies, is adapting its expertise to improve image quality or increase channel capacity across wireless, IPTV and Internet delivery platforms.  For broadcasters or carriers who need to repurpose content for multiple distribution methods, the company’s iCR automated content repurposing workstation can simultaneously create separate outputs optimised for IPTV and mobile TV.
Conceived to complement the peer-reviewed IBC Conference, a programme of Business Briefings provides an opportunity for companies exhibiting in the two Zones to share their experiences with any IBC delegate, visitor or exhibitor.  Each day of the free-admission Business Briefings begins with a presentation from an independent analyst on the current state-of-play to provide context for the Briefings that follow.
M:Metrics, a pioneer in the study of consumer consumption of multimedia content on mobile devices, will introduce the Mobile Business Briefings, while Decipher, one of Europe’s brightest new digital media consultancies will introduce the IPTV Business Briefings.
IBC 2007, RAI Amsterdam: Conference 6-10 September; Exhibition 7-11 September.  More information: www.ibc.org
Ian Volans is Mobile Consultant to IBC