Features

Oren Glanz looks at the future for mobile music, and what operators need to consider to ensure they don’t get left behind

Music fans have always been proud of their collections and where, in the past, they compared their massive vinyl collections in metres of shelf space, gigabytes are now the better measure. Statistics from the International Federation of the Phonographic Industry's 2007 Digital Music Report demonstrate the growth of digital music, showing an estimated $2 billion worth of music sold online or through mobile phones in 2006, accounting for 10 per cent of music sales. A Strategy Analytics forecast for Western Europe shows that in 2006 there were almost a million active mobile music users, and predicts that by 2010 this figure will have risen to 2 million - which has made mobile operators take note and place even more faith in the mobile delivery channel.
Usage of, and interest in, mobile music is clearly exploding - so how can operators ensure their subscribers become addicted to downloading music?
With so many devices on the market there is the potential for mobile music users to churn quickly when a service or device doesn't live up to their expectations - very often without contacting their operator.  These customers do not contribute to the operator's top line, and can tarnish the reputation of mobile music amongst their peers and fellow subscribers.  This is where the challenge becomes difficult, as most service providers do not have a clear view of their customers' preferences or of the barriers to adoption they face when trying to use a service.  They do, however, now accept that customers will not make contact to highlight those barriers themselves.
It is important to note that, even when a subscriber has successfully overcome the initial barriers to adopting a service, there are still hurdles in turning them from an occasional user into an active one.  It is vital for mobile operators to understand that identifying problems with delivery and usage is only part of the issue; encouraging new users and increasing their consumption is imperative, and is a key to a successful mobile music service.
Subscribers trying to access mobile music services can encounter various technical, training, value and compatibility obstacles, which can reduce the likelihood of their becoming a habitual user.  These include:
Digital rights management (DRM) - this creates major barriers within a music download service, as the mechanism for DRM can vary from complete separation of the music content from the DRM file to a lock on forwarding tracks.  Another major issue is variance between handset models - certified vs non-certified operator handsets.
Navigation - this covers all aspects of a subscriber's journey through a music service to the content they value.  Experience shows that navigation paths are not always clear; subscribers cannot find the content they are after, quickly lose interest and abandon the service.
Barriers to usage - these are also common, relating to both handset and music service usage.  Most individuals are happy to try new services, but the majority will give up once they encounter a barrier to usage - often caused by the sheer complexity of many new mobile services.
Functions such as streaming and downloading can often be confusing.  Subscribers tend to struggle with how to search, locate, select and replay a track after they have downloaded it - especially mass-market, non-technical and inexperienced users.
This leads us to far simpler issues, but ones that are ultimately the largest obstacles to subscribers becoming addicted to mobile music on their handsets.  Usability, training, interest and other user experience barriers are just as damaging as any of the previously mentioned issues.  Mobile operators need to look at how they can ensure such customer problems are solved before they impact the user experience.
So with so many barriers to the success of mobile music, why use a mobile phone as a music delivery method and player when compared with other standalone players?  The obvious argument is the need to only carry one device, and the advantage mobile operators hold here is that consumers are already addicted to their devices and other services such as SMS.  But how can operators change this device addiction into a mobile music addiction?
The answer is simple - mobile operators need to improve their subscribers' user experience when using music services, which, in turn, will drive satisfaction.
Understanding users and delivering exceptional customer service is just as important a part of the mobile experience as the latest technology and the size of the marketing budget. It can be the key differentiator for a business.  Too much time and money is invested in getting products to market quickly rather than getting products to market efficiently.  The objective is to provide the best mobile experience for each individual user.
To do this, mobile operators need to build an end-to-end view of their music service from all contributing systems and content.  This is not as easy as it might seem, as although common elements are contained within most music services, many aspects of delivery, billing and content are unique to each operator - which affects subscriber behaviour, usage patterns and other adoption barriers.  The simplest way to achieve an end-to-end view is by approaching the problem with Service Adoption Management (SAM) tools.
SAM treats adoption and usage from a holistic point of view, taking into account all the technical and non-technical issues which affect adoption and usage levels.  It gives an operator clear insight into the groups enabled to use mobile music, and presents the information needed to encourage a user to download songs on their phone.  For example, SAM tools allow an operator to understand which handset a user has, what they are interested in and which stage of adoption they have reached.  If a user can be seen to search for a song but stops, and does not download it, then pricing could be the barrier and the mobile operator can review this.  If a user is only downloading one song a month, then the operator should try to encourage greater usage by offering promotions.
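To make the idea concrete, the short Python sketch below encodes the two example rules above as simple heuristics. It is purely illustrative - the event fields and thresholds are invented for this article, not taken from any vendor's actual SAM product.

```python
# Illustrative only: a toy version of the rule-based logic described
# above. Field names and thresholds are invented, not any vendor's API.

def adoption_action(user):
    """Map simple usage signals to a suggested operator intervention."""
    # Searches that never end in a download suggest a price (or payment)
    # barrier worth reviewing.
    if user["searches"] > 0 and user["downloads"] == 0:
        return "review pricing / offer a discounted first track"
    # An occasional user (one track a month or fewer) gets a promotion
    # to encourage greater usage.
    if 0 < user["downloads_per_month"] <= 1:
        return "send a promotion to stimulate usage"
    # Repeated DRM failures call for proactive, handset-specific help.
    if user["drm_errors"] >= 3:
        return "contact customer with handset-specific assistance"
    return "no action needed"

print(adoption_action({"searches": 4, "downloads": 0,
                       "downloads_per_month": 0, "drm_errors": 0}))
```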
By using these principles and analysing subscriber information (such as billing, CRM, customer care and network or platform services) it should also be possible to track other key adoption traits such as social networking, campaign effectiveness and early service launch measurement.
Armed with this highly granular insight into virtually any aspect of usage, and the ability to identify and remedy usage barriers - even ones that have never arisen before - managers in charge of mobile music marketing and content can make more informed decisions and better target offerings and campaigns. By providing deep, real-time visibility into subscribers' interactions with mobile music, it is possible to stimulate customer loyalty, increase mobile music usage and generate meaningful profits.
Operators can immediately see how mobile music services perform, how campaigns are working, and the experience of both individual users and larger groups. This detailed real-time information can then be used to make mobile music more effective and attractive, and to proactively contact customers who need training or an upgrade to use services successfully.
Customers enjoying a positive first experience of new services will be more likely to use them again.  If services do not work or perform properly, operators can proactively contact these customers to solve the problem, offer assistance or offer refunds.  By treating customers positively in this way, users are more likely to use the service again and stay loyal.
This will be measurable in the increased number of music tracks downloaded, the growth in active users, and reductions in terminated streaming sessions, in DRM-related problems and in the number of music-related calls received by customer care.
The mobile industry needs to be less focused on the technology and should instead proactively explore the real needs of users.  When setting their objectives, service providers should also consider the impact of customer satisfaction on their brand and image.  Providing a higher quality customer experience will also help service providers to differentiate themselves from their competitors, increase loyalty and attract more customers.   
Understanding and improving the user experience, not just for customers who have become addicted users of mobile music but, more importantly, for those who never get past their first failed usage attempt, is essential.  SAM tools revolutionise the whole business process of value added services such as mobile music and allow the operator to understand what is actually happening to the user.  By focusing on removing some of the frustration that is inherent in today's users' experiences, the road should be clear for mobile operators to succeed in reaching the mass market with mobile music.

Oren Glanz is CEO of Olista

Andrea Casini explains how operators can look to grow their existing operations in the face of a tough market environment

It’s common knowledge that the mobile phone market in developed countries is highly competitive, and opportunities for revenue growth are not as clear-cut as they once were. Mobile operators need to grow the potential of their networks in order to grow the business and survive, rather than simply rely on finding additional subscribers. Evolution, it seems, is the only way forward. Having accrued huge customer bases, operators are under pressure to keep pricing low in order to keep churn to a minimum.
However, new technologies such as mobile VoIP are poised to change the established income model. To be seen as a mobile phone company alone is no longer good enough. In the consumer market, applications like the mobile Internet and content such as ringtones are big business; services such as mobile TV have yet to take off but are also potential revenue generators.
Meanwhile, enterprises are increasingly offering mobility as an option to their workforce. There is now a plethora of mobile technologies in, or entering, the marketplace - all vying for enterprise business. A mobile operator looking to take a slice of the lucrative enterprise mobility market needs to compete with fixed-line technologies such as VoIP over Wi-Fi or WiMAX. Enterprises that opt for mobile services to support their workforce will expect unprecedented reliability.

A crowded market
We’ve reached a stage of mobile phone ubiquity. There is an expectation, from both leisure and business consumers, that their handset will be able to find a signal wherever they are. Currently, places that are densely populated or experience large spikes in usage, such as city centres or football stadiums, can suffer a drop in service, with patchy coverage and call drop-out. As high numbers of users all vie for a signal in one small area, reception quality inevitably suffers.
Delivering consistent service in enclosed public spaces is further complicated by the fact that leisure customers tend to use entertainment applications such as mobile TV and gaming when they’re sitting indoors, making in-building penetration necessary. Business users will also spend a significant proportion of their time accessing e-mails and transferring data whilst indoors. The border between private and business users is becoming ever more blurred, with both expecting similar services and data rates.
Analysys predicts that half of all telephone calls made across Western Europe in 2008 will be made on a mobile phone. A tall order, admittedly, and one that will only be possible if mobile operators evolve to embrace the need for in-building technology.
The very nature of 3G signalling makes it difficult to penetrate areas such as office blocks and tunnels. In addition, high-rise buildings will require additional capacity and bandwidth due to the higher number of users. Resolving the problem of poor reception in built up areas and at large events, by embracing in-building wireless coverage and capacity, has become an important service differentiator for wireless carriers.
In-building solutions vary in power, from high capacity blanket coverage in shopping malls through to limited home capacity boosters. Products such as distributed antenna systems (DAS) and RF repeaters are designed to guarantee wireless coverage in highly populated areas and provide cost-effective, common infrastructures. This infrastructure will support all of the various standards, while providing a high level of omnipresent coverage to mobile phone users. If there is not a seamless transition to 3G, its adoption could be threatened by alternative fixed-line services.

The future of in-building
In the consumer and enterprise markets, femtocells and picocells could, in the future, help with coverage problems at home and at work. These products transmit mobile signals over a LAN and have a more limited range than large-scale in-building products. Picocells are a higher-power indoor basestation alternative to femtocells.
In the UK, Ofcom recently auctioned off 3.3MHz of the wireless spectrum, equating to a series of twelve low power licenses to allow both fixed line and mobile operators to compete for the option to install picocells in buildings and public spaces. Deployment has been slow as operators appear reluctant to embrace the potential revenue opportunities. However, O2 is due to launch picocells later this year for roughly €100 each.
While femtocells show promise for solving some indoor wireless coverage issues, the technology is still in the early stages of development. Deploying femtocells for individual buildings could, in the long term, reduce operator costs and allow operators to offer smart tariffs. The first femtocells have gone on sale in the UK and are currently being promoted as a way to boost mobile reception at home, but the market is still in its infancy.  
The idea of dual tariffs, using femtocells, will mean users can get the best deal when using their phone at home or out and about. New handsets could automatically hand over from a basestation to a femtocell as users come into range at home. Operators may also offer femtocells as part of a subscriber package. Vodafone is already trialling the technology and this is one step on from the current BT Fusion system, which is reliant on Wi-Fi. Femtocells could even see mobile operators grab a share of the home phone market by offering blanket coverage.
Femtocells and picocells will effectively remove the need for WLAN and other fixed/wireless offerings. Currently, technical issues such as standardisation and integration with existing networks need to be resolved before these products will gain wide acceptance. However, by offering devices that boost coverage, operators could eventually look forward to increased revenue as their services become more widespread and readily available. This could also prevent churn.
The in-building theory is not new. Andrew Corporation has already deployed many in-building solutions to provide indoor wireless coverage. The company deployed an ION™-B fibre distributed antenna system (DAS) at Dallas Fort-Worth International Airport Terminal D, its parking structure and the nearby Grand Hyatt Hotel to extend wireless coverage to customers. The ION-B utilises head-end equipment that interfaces with multiple, co-located, operator base stations.
Providing blanket mobile coverage and a set of competitive tariffs to match will mean mobile operators will finally be in a position to claim glory. Users who can make important calls and access data wherever they need to are less likely to look to another operator. In the UK, O2’s churn for contract customers was 23 per cent for the year ending December 2006. Coupled with a £2 lower average revenue per user (ARPU) over 2006, the financial cost of losing customers and paying to retain them in an increasingly competitive market is high.

Location based services
Growing the potential of a network through in-building is only one side of the story. Location Based Services (LBS) are already offered by some mobile phone networks and are a way to generate significant revenue for operators. From child-finder services to tracking enterprise equipment and location-targeted advertising pushes, there is money to be made from LBS.
In the UK, Ofcom has stipulated that by early 2008, any VoIP service allowing users to make calls to ordinary phone numbers must also be able to make calls to the emergency services. Research by the communications watchdog has revealed that 78 per cent of VoIP users who cannot use their service to call 999 either thought they could, or did not know whether they could. This development means that geo-location will be provided on a “push” basis (where information is sent to the user without them explicitly requesting it) and it will be done at the operators’ expense.
The cost of forced geo-location can be offset in the long term with other LBS options, however. Currently, the solution is primarily used as a way to send custom advertising and other information to mobile phone subscribers based on their current location. Major operators, focusing on recouping the cost of 3G licenses and falling voice revenues, have yet to exploit the full potential of LBS. Yet this is set to change with a wave of new lower cost devices, including the Nokia N95, that offer LBS capabilities. In fact, ABI Research predicts that by 2011, the total population of GPS-enabled location-based services subscribers will reach 315 million, up from 12 million in 2006. This represents a rise from less than 0.5 per cent of total wireless subscribers today to more than 9 per cent worldwide at the end of the study's five-year forecast period.
The business case, as demonstrated by Nextel in the US, which currently has a 53 per cent share of the LBS market, is there. Operators, with an influx of hardware, should look at LBS in a new light. It has been over-hyped previously, but it is another way to generate revenue with a relatively low outlay.

The growth potential
The mobile phone market is changing. Operators have succeeded in establishing huge customer bases, to the point where phones now outnumber people. Yet, in the mature markets of Europe and the US, there is a need to look beyond voice and data services in order to prevent churn and drive revenue.
In-building, in its simplicity, overcomes one of the most common reasons for a customer changing operator: lack of signal. Meanwhile, the market for LBS and geo-location has been over-hyped, but with an influx of new devices and increasing operator confidence it will be possible to overcome cynicism and make a positive business case to consumers and enterprises alike.
After all, we are in an age of convenience, and in-building solutions and LBS are about to make everything that little bit easier.

Environmental and radiation issues
Electromagnetic (EM) compatibility and the potential health hazards of exposure to non-ionising radiation have always been hot topics, and have recently flared into a burning issue following an article about interference from mobile phones with medical equipment in hospitals.
As every RF engineer - unlike many generalist reporters and much of the general public - knows very well, 2G and 3G mobile terminals are power-controlled by the network, ensuring that networks function properly, interference is minimised, and battery life is prolonged for more air-time and greater subscriber convenience.
The range of power control extends over 30dB - a factor of 1,000 - for GSM, and much further (around 70dB, or a factor of 10 million) for UMTS terminals. This means more transmit power is requested on the uplink (up to 1 or 2 Watts) when radio coverage and visibility are poor (high path loss or high penetration loss, for example in buildings covered from outdoor cells), while very low power (down to 1mW in GSM, or much less in UMTS) is needed where coverage and antenna visibility are good - for example with an efficient in-building system.
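Those decibel figures are easier to grasp in linear terms. The few lines of Python below simply restate the arithmetic quoted above (ratio = 10^(dB/10)); they add no new data.

```python
# dB-to-linear conversion: ratio = 10 ** (dB / 10).
def db_to_ratio(db):
    return 10 ** (db / 10)

print(db_to_ratio(30))  # GSM power-control range: 1,000x
print(db_to_ratio(70))  # UMTS power-control range: 10,000,000x

# So a GSM handset peaking at 1 W can throttle down to
# 1 W / 1,000 = 0.001 W, i.e. the 1 mW quoted above.
print(1.0 / db_to_ratio(30), "W minimum")
```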
Moreover, specific parameters can be set at certain locations so that uplink power never exceeds a pre-set value, preventing uncontrolled interference with the operator's own network or with other services.
And a well designed in-building system uses several distributed low-power antennas (a DAS), with power in the milliwatt range and radiation levels well below the most stringent country-specific EU limits (0.5 V/m).
Actual results of implementing in-building coverage show local RF power density decreasing by three orders of magnitude, with previous interference issues disappearing altogether.
In-building coverage is therefore the only solution that ensures, on top of seamless communication and proper service for the network and its subscribers, the highest degree of EM and radiation compatibility - both with delicate equipment, such as navigation instruments in aircraft or medical equipment in hospitals, and with human beings.

Andrea Casini is VP EMEA Sales & Marketing,
Andrew Corporation

Will the huge success of SMS be eclipsed by later generation services?  Not exactly, says Priscilla Awde

Virtually everybody does it every day because it’s a cheap, fast and easy way to keep in touch and do business. No, we are not talking about making phone calls but sending SMS messages, which, after telephony, is the most successful and ubiquitous communications service ever. The big question is: is it threatened by next generation mobile instant messaging (IM), e-mail or multimedia messaging (MMS)? The consensus is that it is not - at least, not for several years.
SMS may even have a negative impact on the success of advanced messaging services simply because it is integrated into all phones, and is known, liked and used by millions who see little advantage in switching to more expensive, if media-rich, alternatives. Mobile phones are automatically set up for SMS; it works without data connections, is interoperable across networks and devices, is international and is reliable.
In Western Europe alone, Gartner estimates that over 180 billion SMS messages were sent in 2006 and expects growth of over 20 per cent, reaching 210 billion in 2011. In developing countries where low incomes and high poverty levels favour cheap, easy to use devices/services, SMS is booming. In the last year India added six million mobile subscribers each month and growth in China is similar. In Africa roughly three per cent of the population have bank accounts but around 20 per cent have mobile phones.
SMS traffic is used for secure micro-payments and to share information in all sectors, from agribusiness to health care and from fishing to financial services, as well as for voting in political and entertainment polls. Expats use voice SMS for cheap international connections, and it overcomes literacy and keypad barriers by providing a ‘talk/listen’ alternative to ‘type/read’.
Deepak Natarajan, Head of Marketing, Roamware predicts that over 2.3 trillion text messages will be sent in 2010: “SMS is alive, kicking and thriving and the vast majority of the global population is interested in it. Operators have a huge incentive to continue supporting SMS because many organisations use it as a communications channel. New sticky applications/features make people reluctant to change.
“Low cost handsets have produced a revolution in communications, but the next step needs cheap, available, easy to use next generation devices. Within ten years there will be a big growth in MMS and mobile IM but prices are not being lowered sufficiently across the board to motivate users to move away from SMS.”
The situation is not helped by innovative value added SMS features (storage, divert, print and filtering), which encourage usage, increase margins and reduce churn. "The original SMS infrastructure was store/forward, but now it is direct delivery, so texts can be used for transactional services," explains Jeff Wilson, Chairman, Telsis.
Growth in advanced messaging has been inhibited by complicated set up; relatively expensive smart phones/tariffs and slow 3G roll out. Flat rate pricing and bundling accelerates take up but the choice of messaging service depends on circumstances - each is applicable to different situations. “Price; interoperability; ubiquity and user familiarity always inhibit new services but different messaging services will co-habit - growth in one increases traffic in others,” says Jay Seaton, Chief Marketing Officer at Airwide Solutions. “Operators’ revenues from SMS exceed all other messaging types combined. Carriers can add sticky enhanced SMS services over existing infrastructure fast/cost effectively and are increasing SMS revenues from personalised services.”
Andrew Moran, Head of Enterprise, Symbian agrees that take up is slowed by the lack of compelling consumer price plans: “If operators get the pricing right users will flock to new services driving up ARPU. Applications are available and the magic ingredients are there for a whole suite of messaging services if operators can hit on the right model.”
Mobile IM is perhaps the natural successor to SMS, and some analysts estimate the market will be worth $1 billion in 2011. It may find applications in customer service and could be useful for cross- and up-selling.
“Although it is a different type of communication, an on-going chat, personal IM complements SMS although it won’t be a substitute,” believes Gabriel Solomon, Senior VP at the GSM Association. “SMS will co-habit with new services – the top 20 per cent of consumers using multimedia services still want to talk to the rest of the population via SMS. One shot, fast SMS messaging will continue to be used for more and more applications as a secure system where pre-paid accounts limit fraud.”
Extending computer IM and e-mail functionality into the mobile environment is a natural progression suggests Doug Brackbill, CMO at Visto. “The desktop and mobile environments will merge to allow seamless communications: people will move between systems. Messaging services are not competitive but meet different needs.”
Although MMS is mainly used to send informal and impromptu photographs, it may merge with SMS as graphics or ‘emoticons’ are automatically added by servers.
MMS is important for mobile advertising where subsidies will change the whole messaging market. Although mobile spam is less acceptable than on PCs, tariffs will drop for people willing to accept non-intrusive, relevant, useful, multimedia adverts. The subsequent lower costs will likely drive advanced messaging take-up. SMS will however still play a big part in advertising because of the huge addressable audience - some companies see it as the most economical way of reaching millions.
“MMS has been disappointing, but we are starting to see more recent growth especially for providers to deliver content rather than for personal messages,” says Stephanie Pittet, Principal Analyst for Gartner. “Mobile IM and e-mail will become more important especially in mature countries with high-end devices. It’s a slow start but operators are striking deals with IM providers and flat fee e-mails will be important for consumers. All are niche markets now. MMS won’t ever reach the levels of SMS which has high margins and users expect it to be central to all offerings.”
Mike Short, VP R&D for the O2 Group believes text based SMS will be popular until 2010 because of its convenience, familiarity, ubiquity, easy availability and the number of promotions relying on it. After then he expects other capabilities will be integrated into handsets especially e-mail, which makes the work environment mobile. “This is an evolution not revolution: there will be higher adoption in some places over others. Convenience and reach will drive new services,” says Short. “Customers do what’s most applicable. We need to make it easy and give them the widest range of services possible.”
Hilmar Gunnarsson, VP corporate development at Oz asks why operators launched MMS instead of just adding a picture facility to SMS. “Rather than marketing new services, carriers should evolve the underlying architecture to enhance and add new capabilities, giving users richer messaging experiences under the SMS umbrella. We need to think of things from users’ perspectives, keep things simple and easy to use. E-mail should be embedded into all phones and applications pre-integrated.”
The market is moving towards a situation in which people will not care which messaging system they use but will choose the most convenient and appropriate. Mobile unified messaging will give users a single in-box, address book and number, accessible via any device. "When 4G networks become available in 2015/16, IP will be the prime network connection and there will no longer be a clear split between MMS, IM or SMS - messaging will be converged," believes Aram Krol, market development manager, Acision.

Priscilla Awde is a freelance communications journalist

The promise of IPTV is fraught with dangers - from outages to poor quality pictures - but effective systems test and measurement could save the day.  Co-founders Alan Robinson, CEO, and Robert Winters, Chief Marketing Officer of Shenick Network Systems, discuss the options with Priscilla Awde

Imagine this: virtually the whole nation settling in to watch the rugby world cup or European soccer final, and the television picture goes down for thirty minutes or freezes just as the home side is about to score. It may flicker at critical times, the sound may be unsynchronised, or users may be unable to change channels quickly and efficiently. Perhaps the latest film released on Video on Demand (VOD) can be paid for but not downloaded, or hackers may launch a denial of service attack. A power outage may cause major disruption to television signals.
One person, at least, needs no imagining. Robert Winters, Chief Marketing Officer at Shenick Network Systems, instead predicts a riot should any one of these all too feasible scenarios actually happen in a live IPTV network.
Couch potatoes everywhere are increasingly intolerant of any outages and expect picture perfect television. Guaranteeing the quality of service and of individual subscribers' experiences is, however, a major and often underestimated challenge for all service providers - especially in the IPTV environment, where lost packets, jitter and latency, combined with poor network architecture and an inability to scale, all affect the viewing experience.
Driven by the twin imperatives of falling revenues from the cash cow of voice, and customer churn to competitors, operators are now moving into higher margin services. The possibilities of increasing ARPU in their growing base of broadband subscribers and reducing churn by creating sticky applications make the triple play package of voice, video and data compelling, if challenging. In fact, operators have little option but to add new revenue streams if they are to compete effectively in the next generation world of integrated and convergent multimedia services.
However, in doing so, telcos are moving into a highly competitive market already populated by established cable and satellite providers. Having gone through trial and error, these networks are optimised for video, can carry voice and data applications and are scaleable. The cable and satellite operators have also negotiated long standing agreements with major studios and other content owners.
Alan Robinson CEO at Shenick suggests it is difficult for telcos to get interesting content, given competition from existing players and because they are not used to the video/television business. "However, telcos must produce compelling content services at the right price point," says Robinson. "The audio/visual sector is very competitive but can provide excellent revenue streams for operators and a way of increasing ARPU and keeping customers."
The best effort approach to service levels is no longer good enough in the IPTV world, where packet losses have become more serious than ever. User expectations have risen with exposure to digital video consumer electronics and DVDs, which provide high quality video and audio, making people less tolerant of degradation or poor service.
These are just some of the challenges facing operators, and they have already delayed some early commercial IPTV launches. Others involve more technical issues, including network capacity and scalability. Yet most can be solved by careful network planning and a serious commitment to early and continual end-to-end test and measurement routines.
 "It will take operators a while to roll out television," Robinson suggests. "IPTV is harder to get working than people realised, mainly because legacy systems were best effort - which may be alright for broadband and Internet access but is not for mission critical television services. People will tolerate service outages in certain situations, like the mobile phone sector where dropped calls still happen because there is no alternative technology, but that is not the case in the competitive television market."
Unlike the first deployments of DSL broadband applications, in which quality could be patchy and losing data packets was rarely critical, operators cannot afford any loss of or interference with IPTV signals, but must ensure high service levels and minimise transmission and technical problems. "Quality is a key differentiator for IPTV, so implementing the best and right equipment, and carrying out pre and post deployment and real-time network monitoring and testing, are essential," explains Winters. "Operators must continually test the quality of the subscriber's experience and monitor service assurance to deliver the best possible results."
Among the old but significant factors affecting service levels are the huge number and variety of equipment installed in multi-vendor communications networks. Operators are used to handling interoperability and integration issues and ensuring equipment conforms consistently to open standards, but these become critical in IPTV deployments.
Although it may sound obvious, operators must match triple-play services to network capabilities - a consideration which has delayed at least one major European IPTV launch. Targeting the entire subscriber base with IPTV means that telcos will, at some point, hit the scalability wall. Pre-deployment testing will help determine the exact number of subscribers any given architecture will be able to support, and demonstrate how the existing network will react to application loads both at launch and going forward.
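As a back-of-the-envelope illustration of where that wall might sit, the Python sketch below estimates how many subscribers a shared aggregation link could support. Every figure in it is an assumption chosen for illustration (and it ignores the savings multicast brings to broadcast channels) - it is not a Shenick method or measurement.

```python
# Rough scalability estimate for unicast IPTV/VoD traffic on a shared
# link. All figures are illustrative assumptions, not measurements.

link_capacity_mbps = 10_000   # assumed 10 Gbps aggregation link
sd_stream_mbps = 3.0          # assumed SD stream rate
hd_stream_mbps = 9.0          # assumed HD stream rate
hd_share = 0.3                # assumed share of streams that are HD
peak_concurrency = 0.6        # assumed share of subscribers viewing at peak

avg_stream = hd_share * hd_stream_mbps + (1 - hd_share) * sd_stream_mbps
per_subscriber = peak_concurrency * avg_stream
print(f"average stream: {avg_stream:.1f} Mbps")
print(f"subscribers supported: {link_capacity_mbps / per_subscriber:,.0f}")
```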
The constant challenge of transmitting next generation services over legacy architecture is the ability to scale, and, ultimately, performance - all problems that must be addressed at the earliest stages of IPTV launches.
 "Prior to deployment operators must decide which vendor to use for IPTV; which set top boxes; DSLAM equipment; network components; routers; switches; core transport and encoders, among others, they will use," believes Robinson. "Which vendors can do the job and, when everything is put together, does it work? What are the integration issues; the performance limitations? Will the network need to be re-architected to provide more bandwidth or more boxes added to reduce contention and handle demand? Assuring on-going quality of service is an end-to-end problem."
Fortunately, there are solutions but they require an early and on-going commitment to testing and measuring how equipment performs, what is happening in the network, and how the whole reacts to peaks and troughs in demand. Emulating the behaviour of hundreds or thousands of subscribers in the laboratory prior to deployment identifies and solves problems before any customers are connected.
Able to test both standard and high-definition IPTV and VoD down to the level of individual viewers, Shenick's high performance converged IP communications test system diversifEye 4.0 gives both equipment vendors and service providers the ability to test real world VoD functionality. They can determine how networks perform under high load conditions such as network surges. So operators can guarantee service level quality before televisions are turned on.
Quality of experience testing in IPTV networks must include service and transmission layers and an understanding of the interaction between them. Ideally, testing the actual received decoded video stream against a known good source on an end-to-end basis provides the most accurate results.
It is important to conduct converged IP tests which include layers two to seven and carry out functional, load, QOS/QOE limitation testing for IPTV, VoD, VoIP, data applications and overall security. Passive and active probes throughout the network are part of on-going monitoring and service assurance programmes.
 "We can set up and test the type of traffic generated behind a typical household, which may include several televisions, perhaps high definition TVs; one or two PCs and several telephones," explains Robinson. "Engineers can emulate traffic in a multiple home system and create a real world environment to give operators and equipment manufacturers an opportunity to test performance limitations and quality of service. They can monitor VoIP or high-speed Internet traffic and see what happens if there is a surge to join channels as users all switch programmes simultaneously - will this clog the DSLAMs or other aggregation devices or even the video servers? Powerful laboratory equipment and test routines find bottlenecks in high load systems.
"Pre-deployment performance testing allows operators to upgrade systems where necessary but it must not stop there. There is a constant need to monitor live networks and do regression tests whenever new equipment is added into the system. Service assurance monitoring guarantees high performance, discovers problems fast and highlights where to go to fix them."
Testing early and often is a mantra operators ignore at their peril since it is difficult to debug problems in live IPTV deployments. Consistent low performance increases customers' dissatisfaction and the likelihood they will move to competitors.
Effective network and application monitoring is best controlled from a dedicated centre where each channel can be checked in real time from the satellite feed into the head end and through to individual subscribers. Sophisticated statistical models produce scores to evaluate the video quality. The optimum standard of service may vary between operators and with what subscribers are watching or doing.
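As a crude stand-in for the statistical scoring models mentioned above, a score might be derived from measured impairments as below. The weights are invented purely for illustration; real systems use standardised, subjectively calibrated metrics (MOS-style scores).

```python
# Crude illustrative video-quality score (1 = bad, 5 = excellent).
# The weights are invented; production systems use calibrated models.

def video_quality_score(loss_pct, jitter_ms, latency_ms):
    score = 5.0
    score -= 1.5 * loss_pct       # packet loss damages video hardest
    score -= 0.02 * jitter_ms     # jitter exhausts de-jitter buffers
    score -= 0.005 * latency_ms   # latency slows channel changes
    return max(1.0, min(5.0, score))

print(video_quality_score(loss_pct=0.2, jitter_ms=15, latency_ms=80))  # 4.0
```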
Changing camera angles, choosing what to watch and when, or having on-screen ‘chats’ with friends are big drivers for IPTV, but most are bandwidth intensive. Equally, the system must be able to handle people browsing through channels without either slowing down or adversely affecting the video and audio quality.
"The bandwidth required for Digital Video Recording (DVR), VoIP, Video on Demand (VOD), or peer-to-peer downloads is up to 30Mbps for successful deployments," explains Winters. "Television must take priority but it also takes up bandwidth which may have an adverse effect on other services. It is therefore important to split application flows over virtual LANs, otherwise channel hopping, for instance, will affect QOS. Operators must monitor each application stream and be able to control, test and measure flow quality. Fully integrated triple-play packages strain networks, making it important to test for full use of all equipment simultaneously."
As telcos scale up and deliver IPTV to the mass market they may hit bandwidth problems. Current DSL technologies may handle today's requirements and deployments of up to 200,000 subscribers, but operators are likely to see performance issues when they scale up to millions of customers. It is then that they may have to extend fibre deeper into the network, though fibre to the home/curb/node (FTTH/C/N) architectures are becoming cheaper and increasingly feasible, especially in new housing or commercial developments. Telcos may also have to add more boxes in exchanges to reduce the number of subscribers per unit. Alternatively, operators may turn to WiMAX as a means of adding more bandwidth in the last mile.
Countries in the Far East are driving broadband deployment: in Japan and South Korea for instance access speeds of 100Mbps are commonly available and not expensive. With this available capacity there are no problems with scalability, contention or quality of service.
Keeping ahead of developments and being able to test for future technologies, network architectures or applications are part of daily life for Shenick. Winters and Robinson agree the next big shift is that IPTV will move from the current multicast model to more of a unicast system better able to cater for personal usage patterns. Single users will be allocated an amount of dedicated bandwidth for applications like VOD, which may raise more contention/capacity problems especially if one person in the house is downloading a video whilst another is watching broadcast television.
However, convergence is a reality now, they believe, and people are starting to look at interactive and integrated voice and video applications.
"This is still very early days for IPTV, with only around two million deployments worldwide. Lots of operators are talking about it but it is still in the early growth stage," says Winters.
Security is yet another factor which must be considered. "Operators are already concerned with content security but there will be an increasing number of malicious or denial of service attacks on television. Hackers may jam the system to prevent people changing channels or generate viruses making it important to test firewalls and simulate the effects of such attacks, in the laboratory," adds Winters.
Operators are expanding the amount of bandwidth in the access network either by rolling out fibre or using new technologies to squeeze more capacity from the copper plant. Several different core network protocols are appearing with the move to NGNs, all of which must be supported and tested. "Each vendor has their own way of testing and implementing standards. Equipment manufacturers may work with specific operators who have certain performance expectations which must be tested. Test and measurement is all about flexibility and we must be two years ahead of deployed services," concludes Robinson.

Priscilla Awde is a freelance communications journalist

Laura Marriott looks at the way operators are able to marry new technologies, such as mobile TV, with mobile marketing initiatives in order to deliver the services subscribers want, in a personalised and convenient form

There are several key stakeholder groups in the mobile marketing value chain. Each has a different set of objectives and requirements. For the advertisers it is about their ability to implement highly-targeted, measurable marketing initiatives to a wide audience. For the operators it is about rewarding customer loyalty (or preventing churn), increasing average revenue per user (ARPU) and developing service differentiation. Consumers, meanwhile, demand compelling content delivered to them in a personalised and convenient manner.
All these players will benefit from the development of a prosperous mobile advertising industry.
The opportunity is certainly a large one. According to ABI Research, worldwide mobile marketing and ad spending will reach $19 billion by 2011, up from $3 billion in 2007.
For mobile operators, mobile marketing and advertising represents a relatively rare opportunity to create value from a non-subscriber revenue source. As most operators around the world still struggle to ramp-up subscriber usage of lucrative new data services, creating new revenue streams becomes ever more vital, especially as core voice revenues continue to decline due to price pressures.
There are five main types of mobile marketing content delivery: voice (e.g. IVR), mobile internet (e.g. WAP), messaging (e.g. SMS/MMS), video and television, and downloadable content (Java/Brew, etc.). Traditionally, the operator has used such delivery mechanisms to deliver "on-deck" content, which has required the content provider - or mobile marketer - to have a direct relationship with the operator. It was considered vital that the operator "owned" the customer and that this relationship was not compromised by third parties - other brands, advertisers or billing providers that could dilute the operator's relationship with its customer base. The downside of this model was that the operator shouldered the burden of marketing the new services, content was often limited or its potential unrealised, and it was difficult for third-party brands to truly leverage their value.
As a consequence, many of these early content-based initiatives were not compelling as far as the end-consumer was concerned, and some failed to deliver the desired uplift in operator revenue.
Thanks to the rise of mobile marketing and advertising this model is evolving. As operators become increasingly able to generate revenue from marketing initiatives, their mobile web sites can begin to operate more like a portal than a mere storefront. This means that off-deck and on-deck content can be combined to provide the consumer with an almost unlimited range of content and services.
These emerging business models are also being shaped by new mobile technologies, most notably by the rise of mobile broadcasting. Today the mobile TV market is a small one in terms of revenue and consumer adoption, but the industry predicts strong growth. The first nationwide mobile TV services went live in the US and Europe in 2007 after years of trials and customer usage studies.  Nevertheless, the technology is in its infancy as far as consumers are concerned: of the estimated 233 million handsets in the US, only around 10 million are able to access mobile TV. (Sources: m:metrics, eMarketer, Forrester, Yankee Group).    
As the two industries - mobile marketing and mobile TV - emerge, an opportunity has arisen for them to complement each other and stimulate further growth. A recent study by m:metrics, commissioned by the Mobile Marketing Association (MMA), found that among US subscribers interested in mobile TV services, almost half (41 per cent) said they would watch mobile adverts in order to access mobile TV services free of charge. Another 20 per cent said they would accept mobile advertising if it meant accessing services for a reduced fee.
Advertising within mobile TV environments can take many forms. As with traditional TV, advertising can occur before (pre-roll), after (post-roll) or during programming (including product integration or branded entertainment) - all formats most consumers will be familiar with. In addition, a number of mobile-centric advertising elements can be incorporated in the mobile programming. These include real-time text overlays, real-time image overlays (watermarks), premium-rate SMS (PSMS) links to encourage participation TV, and avatar branding alerts.
The FIFA World Cup in Germany in 2006 proved a fertile test bed for many of these emerging business models. Mobile Dreams Factory, a specialist in delivering mobile marketing campaigns and an MMA member, was the company behind one of the most successful of these: Vodafone's "Videogoals 3D", which delivered animated 3D videos of goals scored during the matches to mobile devices within minutes of the real-time experience.
3D models of all the teams and players were created. When a goal was scored, the play was reconstructed and animated within seven minutes of the real-time activity. The video was converted to a mobile format that was sent to Vodafone clients and to Marca.com, a leading Spanish media group, for publication on the Internet.
Videogoals 3D was the most visited of Vodafone's services during the World Cup and the most visited of Marca.com's internet services, with more than nine million videos viewed. Not only did Vodafone associate its brand with football via this high-appeal, unique product, but the company was also able to promote its Vodafone Live 3G content-based service, including driving MMS revenues. The service was recognised at the MMA's annual mobile marketing awards 2006 for its ability to integrate multiple cross-media elements to communicate and deliver on service availability.
For advertisers, the World Cup represented an opportunity to reach a huge audience but there were two main problems: traditional advertising was often prohibitively expensive for smaller companies and the event itself was so saturated with advertising that some less well-known brands could easily get lost in the noise. Mobile TV provided a solution to both problems.
CEPSA, a leading gas and oil company in Spain, worked with the Mobile Dreams Factory to create a World Cup-related real-time videoblog. The company sent two reporters to Germany to provide additional information on the activities and training sessions of the Spanish team and to share other anecdotes.
This was considered the first time that a real-time video service captured and simultaneously transmitted a signal to thousands of users during a major event such as the World Cup. Using 3G mobile devices, the service captured and simultaneously transmitted a video signal via the Internet, while audio files were transmitted over the operator's network. The audio and video were then brought together on a platform in Madrid. The system recorded all the live connections so viewers could view them later as time-delayed videos.
The real-time videoblog was innovative because it involved the convergence of digital media, using the mobile device both as a communication tool and as the medium itself. Integrating 3G technology with both mobile and internet services provided the consumer with an exceptional mobile experience, while extending the brand for CEPSA and connecting it with the FIFA World Cup. The service was given the Innovation Award for Creativity in Technology at the MMA's mobile marketing awards 2006.
The success of such pioneering services during the World Cup has seen them subsequently taken up elsewhere and to similar effect. The Videogoals 3D service, for example, has since been rolled out by Telefonica´s Movistar in Spain, which successfully integrated Coca-Cola as the campaign advertiser of the service. The model is able to satisfy all members of the value chain: Coca-Cola is able to reach Movistar subscribers; Movistar is able to offer free, exclusive content that differentiates it from competitors, and the user receives exclusive, free premium-content.
The portal exhibits the corporate image of the advertiser, and banners featuring the Coca-Cola logo are placed at various points around the site. These can be static or animated, and the system automatically detects the user's device so that suitably adapted media can be served.
The service saw some astonishing results. In the first three months of operation, more than 10,000 users registered for the service, generating around one million banner impressions and downloads.
It's still early days, but such case studies demonstrate how operators are able to marry new technologies such as mobile TV with mobile marketing elements to deliver compelling services to subscribers. Hopefully, 2007 will see many more examples and lots of unique innovations.

Laura Marriott is President of the Mobile Marketing Association

Is Apple really stealing a march on the mobile industry? 
Lynd Morley takes a look

Apple's iPhone may be over-hyped, over-priced and (so far) not over here, but despite that it's been a bit of a cage-rattler for the mobile industry - both vendors and service providers. And so it should. Yes, we know all the nerdy objections to the thing are valid: most of the gadget's features are already available on existing phones from other vendors. And while the touch-sensitive screen is very clever and the ‘visual voicemail’ feature breaks new ground (for the first time the device vendor seems to be dictating network-supported features to the service provider), these are hardly innovations to set the industry quaking in its boots.
What the industry should really worry about with the iPhone only became blindingly obvious in early September when Apple launched its latest iPod (the music devices with the white earplugs that young persons tend to wear). It looked strangely familiar... In fact the new iPod Touch is really the iPhone without the phone. It's a WiFi-enabled music player in the same case, with the same touch screen and, most important, the same icon-driven interface. What Apple has now delivered is two identical ‘p’s in a pod - a phone and a player. And there is certainly more to come. This is the concrete expression of Apple's iLife framework for integrating the consumer's digital world. Functions such as satellite navigation, camera and storage can all be spun out as stand-alone products, or spun in, in various combinations, to share functions on a convergence device. Using it, the Apple brand can be spread like a viral infection across multiple existing categories of electronic device - and even a few we've yet to think of - most of all across mobile phones. In fact what Apple is selling with its iLife is not the same old gadgetry but an electronic wardrobe of matching accessories.
It may be brilliantly trendy and a marketing triumph, but the approach is also brilliantly necessary. That's because the natural brake on the utility of digital media devices of all types has always been their complexity at the user interface - the ‘fiddlingaboutness’ generally required to do things with portable digital content has meant most of us under-use the facilities already available. There's a much better chance of us all doing complicated things like synching up devices and transferring files and songs to a central library if the processes are easy to follow and execute (and aren't completely different on each device you come to).
So that's why Apple might just succeed - and the nagging worry for the mobile phone incumbents is that maybe they, or perhaps a combination of players, should have grasped this nettle more deftly themselves, and moved on from the ‘button-driven’ gadget environment to a personal device systems market first.

Mobile TV debacle
The FLO Forum, a global body of more than 80 companies, today reacted to the European Commission’s Communication on “Strengthening the Internal Market for Mobile TV”.
The FLO Forum applauds the Commission’s efforts to advance the high potential mobile TV opportunity in Europe, including the focus on spectrum, harmonisation, standardisation and regulation. However, the FLO Forum believes that the Communication’s intention of favouring any one particular mobile TV technology for Europe could stall the advancement of a healthy European mobile TV eco-system.
Dr. Kamil A. Grajski, President, FLO Forum said of the Commission’s Communication: “The FLO Forum supports the principle of technology neutrality, which the major European industry groups have been calling on the Commission to respect. There is a reason why the principle of technology neutrality exists and that is to ensure that the market can choose which technology delivers the most attractive solution for the consumer. Each country has its own unique market conditions and each mobile broadcasting technology standard has very different performance characteristics. Locking the European market into one technology model is potentially harmful to the growth of mobile broadcasting in Europe and will hinder the development of innovative technologies.”
“Despite its youth, the mobile TV marketplace already offers multi-standard and multi-technology products and services - from chipsets to broadcast network transmission equipment. It is now cost-effective and routine to consider multi-modal mobile TV handsets. These developments should allow for the take-up of attractive broadcasting services that will enable economies of scale. Technology is not the problem, but restriction of choice will be,” added Grajski.
“The Commission’s support for DVB-H for mobile broadcasting in Europe is based, in part, on the suggestion that a mobile TV technology mandate, like the GSM mandate, is necessary to achieve economies of scale and to position European companies globally at the technology forefront. But the analogy is contrary to the market reality today,” said Grajski. “The mobile TV industry is still in its early stages, but the GSM mandate came after GSM had launched with wide commercial success. Technology mandation for mobile TV in Europe is not supported by the facts,” Grajski continued.
Regarding FLO technology, Grajski notes that “recent independent trials of FLO technology in the UK involving several EU-based FLO Forum members highlighted significant technical advantages, which led to savings on infrastructure spending. FLO offers twice the capacity of DVB-H or, alternatively, the same capacity with a significantly cheaper network build-out. This can translate into millions of euros difference in capital and operating expenditures for a network.”
Concluding, Grajski notes that “Technology mandation is not an appropriate regulatory tool in innovative and dynamic markets such as mobile TV, especially where the market remains undecided and where the technology continues to evolve rapidly.”
Details:  www.floforum.org
 
Enterprise mobility
Indoor base stations, cellular radio enhancements and IP Multimedia Subsystem (IMS) will give mobile operators crucial new capabilities as they battle with WLAN vendors to exploit the enterprise mobility market, according to a new report, Seizing the Opportunities from Enterprise Mobility, published by Analysys, the global advisers on telecoms, IT and media.
"The limited coverage and throughput and the relatively high prices of indoor cellular services make it difficult for mobile operators to satisfy enterprises' requirements for mobility", according to report co-author, Dr Mark Heath.
"However, the combination of three major technological developments could radically enhance the capabilities available to mobile operators, enabling them to make substantial improvements to their enterprise mobility solutions, and to fend off competing solutions from the WLAN community."
Key findings from the new report include:
• Indoor base stations will significantly improve the coverage and performance of indoor cellular service, allowing mobile operators to devise different charging strategies for indoor services, including low-cost or free internal calls
• Cellular radio enhancements, such as HSPA+ and CDMA2000 1x Revision B, will increase the throughput and capacity of cellular systems to match those of WLANs, particularly with indoor base stations
• IMS will give mobile operators the functionality they need to integrate and control their indoor base stations and to deliver the flexibility, sophistication and interworking between services that enterprises will expect.
"Armed with these new technologies, mobile operators are well placed to attack the mobile enterprise market", according to co-author Dr Alastair Brydon.
"One approach would be to use the technologies to integrate enterprises' existing systems and applications with their cellular networks, although this would need substantial support from systems integrators. An alternative, albeit more radical, tactic would be to aim for pervasive cellular mobility, whereby the same cellular network solution is used to deliver all of an enterprise's services and applications in every environment. "For some enterprises, the simplicity and uniformity of a common cellular service in all environments will have major benefits", says Brydon.
Details: http://research.analysys.com

Cost savings
To realise the true potential offered by mobile working, organisations must move towards delivering secure access to real-time, line-of-business applications. With a plethora of new devices, software and innovations coming to market, the mobile workspace is constantly evolving. However, mobile and wireless technologies are notoriously averse to standardisation - users consistently experience technology fragmentation, interoperability issues and rapid obsolescence.
IDC predicts that in 2008 businesses will be focused on regaining control over mobility developments - having a clear vision as to how solutions should evolve to achieve flexibility, ease of use and cost savings. "Although a mobile enterprise deployment will require some up-front investment, most companies expect that, over time, the benefits will outweigh expenses, and eventually cost savings will be realised," said Stacy Sudan, research analyst, Mobile Enterprise Software.
A survey conducted at IDC's 2006 Mobility Conference identified that the combined IT spend of delegates was in excess of £300 million, with the average expenditure being £81 million; 95 per cent of delegates surveyed indicated that their mobile budgets would increase, on average by 40.3 per cent in 2007; respondents' key priorities beyond providing email access were customer relationship management, sales force and field force automation, and the implementation of additional security measures such as authentication and digital signatures.
Details: www.idc.com
 
Co-ordinating anti-spam
IronPort has announced the results of an anti-spam pilot project conducted by a coalition of leading European telcos and security organisations. The seven-month project resulted in improved spam catch rates, while also revealing the need for a cooperation framework with more partners to standardise reporting formats, share information and adopt a common set of anti-spam best practices.
In January 2007, IronPort joined forces with ETIS – The Global IT Association for Telecommunications – TeliaSonera, KPN, Belgacom, Telenor and Dutch research organisation TNO. The coalition’s goal is to eliminate the majority of spam at the European network level before it even reaches the mail servers.
Throughout the pilot IronPort has provided insight into spam trends and methods, and best practice tips acquired by working with other large ISPs throughout the world.  The company also provided technology to all pilot members to enable them to see how IronPort’s reputation filtering & anti-spam technology eliminates spam.
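The reputation-filtering principle at work here can be shown with a toy sketch (an editorial illustration only: the addresses, scores and thresholds below are invented, and this is not IronPort's implementation). The idea is that mail from senders with a poor track record is rejected or throttled at the connection stage, before the platform spends any effort on content scanning:

# Toy illustration of IP reputation filtering. A sender's history
# yields a score; poorly scored sources are refused at the network
# edge before any content analysis is performed.
REPUTATION_SCORES = {
    "203.0.113.7": -8.0,   # hypothetical known spam source
    "198.51.100.2": 4.0,   # hypothetical well-behaved sender
}
BLOCK_BELOW = -5.0     # invented threshold: refuse the connection
THROTTLE_BELOW = 0.0   # invented threshold: rate-limit, then scan content

def connection_policy(sender_ip: str) -> str:
    score = REPUTATION_SCORES.get(sender_ip, 0.0)  # unknown senders are neutral
    if score < BLOCK_BELOW:
        return "reject"
    if score < THROTTLE_BELOW:
        return "throttle"
    return "accept"

print(connection_policy("203.0.113.7"))  # -> reject

Refusing at the connection stage is what makes reputation filtering cheap: the bulk of spam never reaches the content filters at all.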
The project trialled a combination of different technologies and co-operation procedures with positive results:  One of the participating ISPs remedied close to 16,000 spam incidents in less than a day during the pilot period.
The group found that the active information exchange among ISPs, especially regarding spam traffic flows received from one another, is a successful approach towards reducing spam.  Even within the limited time frame of the pilot, this level of information sharing dramatically reduced the spam rates as well as customer complaints in the participating ISPs. The existence of trust among partners under the ETIS umbrella allowed the process of resolving spam incidents to be automated to a great extent.
The Anti-Spam Pilot Project members will propose a road map for the expansion of the project to the major European telcos at the next ETIS Information Security Working Group meeting, which will be held in Delft, The Netherlands on September 27.
Details: http://www.ironport.com/

End-to-end transaction data is increasingly being recognised as the not-so-secret sauce required for full-flavoured telco transformation. If so, it should be treated with the reverence it deserves, Thomas Sutter, CEO of data collection and correlation specialist, Nexus Telecom tells Ian Scales

Nexus Telecom is a performance and service assurance specialist in the telecom OSS field. It is privately held, based in Switzerland and was founded in 1994. With 120 employees and about US$30 million turnover, Nexus Telecom can fairly be described as a 'niche' player within a niche telecom market. However, heavyweights amongst its 200 plus customers include Vodafone, T-Mobile and Deutsche Telekom.

It does most of its business in Europe and has found its greatest success in the mobile market. The core of its offer to telcos involves a range of network monitoring probes and service and revenue assurance applications, which telcos can use to plan network capacity, identify performance trends and problems and to verify service levels. Essentially, says CEO, Thomas Sutter, Nexus Telecom gathers event data from the network - from low-level network stats, right up to layer 7 applications transactions - verifies, correlates and aggregates it and generally makes it digestible for both its own applications and those delivered by other vendors. What's changing, though, is the importance of such end-to-end transaction data.

Nexus Telecom is proud of its 'open source approach' to the data it extracts from its customers' networks and feels strongly that telcos must demand similar openness from all their suppliers if the OSS/BSS field is to develop properly. Instead of allowing proprietary approaches to data collection and use at network, service and business levels respectively, Sutter says the industry must support an architecture with a central transaction record repository capable of being easily interrogated by the growing number of business and technical applications that demand access. It's an idea whose time may have come. According to Sutter, telcos are increasingly grasping the idea that data collection, correlation and aggregation is not just an activity that will help you tweak the network, it's about using data to control the business. The term 'transformation' is being increasingly used in telecom.

As currently understood it usually means applying new thinking and new technology in equal measure: not just to do what you already do slightly better or cheaper, but to completely rethink the corporate approach and direction, and maybe even the business model itself.

There is a growing conviction that telco transformation through the use of detailed end-to-end transaction data to understand and interact with specific customers has moved from interesting concept to urgent requirement as new competitors, such as Google and eBay, enter the telecom market, as it were, pre-transformed. Born and bred on the Internet, their sophisticated use of network and applications data to inform and drive customer interaction is not some new technique, cleverly adopted and incorporated, but is completely integral to the way they understand and implement their business activities. If they are to survive and prosper, telcos have to catch up and value data in a similar way. Sutter says some are, but some are still grappling with the concepts.

"Today I can talk to customers who believe that if they adopt converged networks with IP backbones, then the only thing they need do to stay ahead in the business is to build enough bandwidth into the core of the network, believing that as long as they have enough bandwidth everything will be OK."

This misses the point in a number of ways, claims Sutter.

"Just because the IP architecture is simple doesn't mean that the applications and supply chain we have to run over it are simple - in fact it's rather the other way about. The 'simple' network requires that the supporting service layers have to be more complex because they have to do more work."

And in an increasingly complex telco business environment, where players are engaged with a growing number of partners to deliver services and content, understanding how events ripple across networks and applications is crucial.

"The thing about this business is not just about what you're doing in your own network - it's about what the other guy is doing with his. We are beginning to talk about our supply chains. In fact the services are generating millions of them every day because supply chains happen automatically when a service, let's say a voice call over an IP network, gets initiated, established, delivered and then released again. These supply chains are highly complex and you need to make sure all the events have been properly recorded and that your customer services are working as they should. That's the first thing, but there's much more than that. Telcos need to harness network data - I call them 'transactions' - to develop their businesses."

Sutter thinks the telecom industry still has a long way to go to understand how important end-to-end transaction data will be.

"Take banking. Nobody in that industry has any doubt that they should know every single detail on any part of a transaction. In telecoms we've so far been happy to derive statistics rather than transaction records. Statistics that tell us if services are up and running or if customers are generally happy. We are still thinking about how much we need to know, so we are at the very beginning of this process."

So end-to-end transaction data is important and will grow in importance. How does Nexus Telecom see itself developing with the market?

"When you look at what vendors deliver from their equipment domains it becomes obvious that they are not delivering the right sort of information. They tend to deliver a lot of event data in the form of alarms and they deliver performance data - layer 1 to layer 4 - all on a statistical basis. This tells you what's happening so you can plan network capacity and so on. But these systems never, ever go to layer 7 and tell you about transaction details - we can.

"Nexus Telecom uses passive probes (which just listen to traffic rather than engage interactively with network elements) which we can deploy independently of any vendor and sidestep interoperability problems. Our job is to just listen so all we need is for the equipment provider to implement the protocols in compliance with the given standards."

So given that telcos are recognising the need to gather and store this transaction data, what's the future OSS transaction record architecture going to look like?

"I think people are starting to understand it's important that we only collect the data once and then store it in an open way so that different departments and organisations can access it at the granularity and over the time intervals they require, and in real (or close to real) time. So that means that our approach and the language we use must change. Where today we conceptualise data operating at specific layers - network, service and business - I can see us developing an architecture which envisages all network data as a single collection which can be used selectively by applications operating at any or all of those three layers. So we will, instead, define layers to help us organise the transaction record lifecycle. I envisage a collection layer orchestrating transaction collection, correlation and aggregation. Then we could have a storage layer, and finally some sort of presentation layer so that data can be assembled in an appropriate format for its different constituencies - the marketing people, billing people, management guys, network operation guys and so on, each of which have their own particular requirements towards being in control of the service delivery chain. Here you might start to talk about OSS/BSS Convergence."

Does he see his company going 'up the stack' to tackle some of these applications in the future?

"It is more important to have open interfaces around this layering. We think our role at Nexus Telecom is to capture, correlate, aggregate and pre-process data and then stream or transfer it in the right granularity and resolution to any other open system."

Sutter thinks the supplier market is already evolving in a way that makes sense for this model.

"If you look at the market today you see there are a lot of companies - HP, Telcordia, Agilent and Arantech, just to name a few - who are developing all sorts of tools to do with customer experience or service quality data warehouses. We're complementary since these players don't want to be involved in talking to network elements, capturing data or being in direct connection with the network. Their role is to provide customised information such as specific service-based KPIs (key performance indicators) to a very precise set of users, and they just want a data source for that."

So what needs to be developed to support this sort of role split between suppliers? An open architecture for the exchange of data between systems is fundamental, says Sutter. In the past, he says, the ability of each vendor to control the data generated by his own applications was seen as fundamental to his own business model and was jealously guarded. Part of this could be attributed to the old-fashioned instinct to 'lock in' customers.

"They had to ask the original vendor to build another release and another release just to get access to their own data," he says. But it was also natural caution. "You would come along and ask, 'Hey guys, can you give me access to your database?', the response would be 'Woah, don't touch my database. If you do then I can't guarantee performance and reliability.' This was the problem for all of us and that's why we have to get this open architecture. If the industry accepts the idea of open data repositories as a principle, immediately all the vendors of performance management systems, for instance, will have to cut their products into two pieces. One piece will collect the data, correlate and aggregate it, the second will run the application and the presentation to the user. At the split they must put in a standard interface supporting standards such as JMS, XML or SNMP. That way they expose an open interface at the join so that data may be stored in an open data to the repository as well as exchanged with their own application. When telcos demand this architecture, the game changes. Operators will begin to buy separate best in class products for collecting the data and presenting it and this will be a good thing for the entire industry. After all, why should I prevent my customer having the full benefit of the data I collect for him just because I'm not as good in the presentation and applications layer as I am in the collection layer? If an operator is not happy with a specific reporting application on service quality and wants to replace it, why should he always loose the whole data collection and repository for that application at the same time?"

With the OSS industry both developing and consolidating, does Nexus Telecom see itself being bought out by a larger OSS/BSS player looking for a missing piece in its product portfolio?

"Nexus Telecom is a private company so we think long-term and we grow at between 10 and 20 per cent each year, investing what we earn. In this industry, when you are focusing on a specialisation such as we are, the business can be very volatile and, on a quarter-by-quarter basis, it sometimes doesn't look good from a stock market perspective."

But if a public company came along and offered a large amount of money? "Well, I'm not sure. The thing is that our way of treating customers, our long-term thinking and our stability would be lost if we were snapped up by a large vendor. Our customers tend to say things like 'I know you won't come through my door and tell me that someone somewhere in the US has decided to buy this and sell that and therefore we have to change strategy.' Having said that, every company is for sale for the right price, but it would have to be a good price."

So where can Nexus Telecom go from here? Is there an opportunity to apply the data collection and correlation expertise to sectors outside telecom, for instance?

"Well, the best place to go is just next door and for us that's the enterprise network. The thing is, enterprise networks are increasingly being outsourced to outsourcing companies, which then complete the circle and essentially become operators. So again we're seeing some more convergence and any requirement for capturing, correlating and aggregating of transactions on the network infrastructure is a potential market for us. In the end I think everything will come together: there will be networks and operators of networks and they will need transaction monitoring. But at the moment we're busy dealing with the transition to IP - we have to master the technology there first.”

Ian Scales is a freelance communications journalist.

Agillic/Telenor Sonofon
To compete effectively in Denmark’s mobile market, Telenor Sonofon relies on a unique way to secure customer loyalty. The operator avoids impersonal and costly direct marketing campaigns by using Agillic’s Post-Paid CLM Business Solution to create an individualised, one-to-one relationship with each high value post-paid customer. This approach has significantly improved customer relations, reduced churn and increased overall ARPU.
“Without Agillic’s solution Telenor Sonofon was unable to effectively execute its one-to-one communication strategy. We wanted to take advantage of the in-depth knowledge we held on our customers to enable us to enter into consistent, intelligent dialogues across all our communication channels. We believed this approach would have a positive effect on our churn levels,” explained Martin Kildgaard Nelson, CRM Manager at Telenor Sonofon. “We were right. Having the ability to automatically advise and educate each customer on the services and products that match his needs transformed the way we were able to communicate with our customers.”
Agillic’s technology underpins this one-to-one approach. “The CLM solution supports successful customer interactions driven by business rules and real-time event and behaviour triggers, which empowers our customers to initiate their own dialogues. Telenor Sonofon responds automatically, with relevant messages which can be reinforced simultaneously through multiple touch points such as MMS, SMS, e-mail and the Internet,” said Martin. “We now create ongoing learning relationships with each customer through the use of both historical data and real-time interactions from a central system, which has been instrumental in gaining user loyalty and trust.”
Telenor Sonofon uses a number of innovative campaigns to encourage customers to join its loyalty programme.  The results have exceeded expectations.  “We initially hoped for a 20 per cent registration rate. Within a year, over 35 per cent of our post-paid customers signed up to our loyalty programme,” said Martin. “The churn rate also dropped significantly – among the lowest in the Danish wireless market.”
This success was driven by Agillic’s technology. Concludes Martin: “We noticed a 50 per cent churn reduction with our post-paid campaign and a 5-10 per cent increase in ARPU. This was the biggest success of all of our marketing promotions. These figures are undeniable proof that Agillic’s Post-Paid CLM Business Solution provides excellent business results.”
To read the full Telenor Sonofon case study, log on to European Communications' website: www.eurocomms.co.uk

Qualcomm/BSkyB
Two trials conducted by Qualcomm and BSkyB in laboratory and ‘live’ settings in and around Cambridge and Manchester have confirmed the performance claims made for the MediaFLO mobile broadcast technology. The trials featured 11 channels from BSkyB delivered to non-commercial devices from Qualcomm. Factors such as total throughput, single frequency network (SFN) operation, network acquisition, channel switching time, layered modulation and video codec performance were all evaluated.
The trials aimed to test various technical performance claims for the MediaFLO system and to perform comparative analysis against DVB-H. To this end, all comparative laboratory measurements were based on common test equipment for MediaFLO and DVB-H, while drive test routes for DVB-H and MediaFLO were nominally the same. The result was a thorough technical analysis of FLO technology’s capabilities, confirming previous performance claims about the MediaFLO system.
From a technical perspective the trial showed that the FLO physical layer performs as well as or better than previously claimed, with laboratory and field performance results in substantial agreement. Comparatively, whether in the laboratory or in the field, the DVB-H physical layer underperformed the FLO physical layer by around 4.5dB.
Results At A Glance:
•    MediaFLO physical layer field performance was around 4.5dB better overall for non-layered modes with comparable bits-per-second-per-hertz capacity
•    The 4.5dB advantage could allow a MediaFLO network either to cover twice the geographical area per transmitter when applying modes of equal capacity - resulting in a substantial reduction in network expense - or to provide double the service offering on a channel count basis for a constant cell size with the same spectrum and transmitter deployment (see the worked illustration after this list)
•    Testing demonstrated that MediaFLO is capable of supporting 20 channels of QVGA video and stereo audio in a single 5MHz spectrum allocation, and this can be scaled for an 8MHz UHF channel. At four channels per MHz, against the 3.3 per MHz implied by prior claims of 20 video channels per 6MHz channel, this represents a 20 per cent increase in channel density
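As a rough guide to how the dB figure maps onto coverage (an editorial illustration, not a trial result: the path-loss exponent n below is an assumption), received power in a simple power-law model falls off as r^{-n}, so an extra link margin of Δ dB stretches the usable cell radius and area by:

\[
\frac{r_2}{r_1} = 10^{\Delta/(10n)}, \qquad \frac{A_2}{A_1} = \left(10^{\Delta/(10n)}\right)^2 = 10^{\Delta/(5n)}
\]

With Δ = 4.5dB and n = 3 (typical of suburban propagation), the radius grows by a factor of about 1.41 and the area by a factor of about two - consistent with the claim of twice the geographical coverage per transmitter.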
Details: www.qualcomm.com


Great changes are afoot in the telecoms industry – offering considerable potential to industry players, while current achievements deserve to be celebrated now.  Lynd Morley looks back at TMW Nice

It was a year of ‘firsts’ for the TM Forum in Nice this Spring.  The theme of the show, for instance, moved away, for the first time, from a purely telecoms focus to one that not only recognised, but actively embraced, the massive changes which are bringing communications, information and entertainment services together into a huge pot of convergence.
Commenting on these developments, TM Forum President, Martin Creaner, noted: “The pace of change is such that it is becoming hard to define the ‘telecom’ industry any more, and it’s just as hard to define the cable or media industries either. I think we are truly witnessing the birth of a telecom/cable/media/web ecosystem.” Certainly the range of companies from these various industries attending the event – including Time-Warner, Virgin Media, UPC Broadband and Disney, in addition to the usual suspects from the telecom OSS/BSS world – underlined the expansion of the TM Forum’s remit, while the keynote speakers, including MIT Media Lab’s Nicholas Negroponte, discussing his One Laptop Per Child initiative, and Rory Sutherland, Vice Chairman of Ogilvy Group UK, pointing to the changes in the advertising industry being brought about by online advertising, underlined the event’s breadth of vision.
The atmosphere of collaboration was also emphasised by the TM Forum’s own new collaborative ventures.  Following the OSS through Java Initiative’s (OSS/J) decision in early 2006 to join with the TM Forum, the organisation’s determination to expand its scope and reach was further emphasised at TMW by the announcement that it would be merging with two more organisations – the Global Billing Association (GBA) and the Internet Protocol Detail Record Organisation (IPDR).  Commenting on the announcement, Alex Leslie, GBA CEO, noted: “We seemed to be looking at the same kind of issues from different angles.  Given that the BSS and OSS industries really need to converge, it makes enormous sense for us to work together.”
Collaboration was undoubtedly the ‘plat du jour’ in Nice where, among the many conference sessions – covering the six broad topics of Business Transformation, Convergence, Customer and Services, IT and Operations, Systems and Software, and Billing and Revenue Management – not to mention the many hospitality events and the two exhibition halls, senior executives from many of the world’s largest providers of telecom and cable services, network operators and content providers managed to get together to discuss how best to collaborate to drive down the integration tax.
Another ‘first’ this year was the TM Forum Awards, presented at an evening ceremony – complete with formal dinner and attendees looking resplendent in black tie and evening dress. Steve Fratini of Telcordia was presented with the rarely conferred honour of Distinguished Fellow of the TM Forum, while BT was the first winner of an award recognising exceptional contribution to the organisation. Given the youth of the awards, and the obvious delight of the recipients, here is the list of winning companies:
Best practice award for a service provider – Korea Telecom
Best practice award for a supplier – Progress Software
Prosspero award for TM Forum standards adoption by a supplier – Hewlett-Packard
Prosspero award for TM Forum standards adoption by a service provider – BT
Best Catalyst for TM World 2007 – the Product and Services Assembly Project
Most innovative supplier – Highdeal
Best new OSS or BSS product – Arantech.

Outsourcing hardware developments to technology consultants can give the best chance of success, and could help to avoid many sleepless nights, according to Tim Fergus

The world of electronic hardware and product development is a challenging environment. From small start-up companies right through to large multinationals, the need to meet ever-changing user requirements and launch products in a timely, cost-effective fashion is key to long-term success. The drive for the latest function or increased performance propels the development process with unrelenting urgency. With such pressures the need to succeed is paramount and may well dictate the future of the company.

This has to be balanced against the need to keep staff costs down, and companies will often find they are resource-limited and need to look outside for assistance. This can take many forms, but the most common is to use external short-term employees or contractors to cope with excessive peaks of demand. In some instances the complete outsourcing of a work package, or of the development itself, may make sense, and can be an effective approach for a very targeted piece of work.
The challenge in contracting out work is to ensure that the work package or complete development is delivered to quality and on time. This will ultimately depend on the contractor chosen, how they interact with the client organisation and the level of responsibility they offer.
This begs the question – how to ensure that contracting out work results in the greatest chance for success with the least intervention?
By taking ownership of, and responsibility for, your hardware development, technology consultants (TCs) can offer the best chance of success while delivering value beyond that expected.
For effective outsourcing of development work, the TC needs to possess key skills and abilities in addition to focus and drive. These make them highly effective at delivering results in short timescales.
By understanding not only the work to be done, but also the clients’ requirements and future needs, the TC is ideally placed to drive progress where it matters. Such an understanding drives the project forward to completion in a controlled and rapid way. This breeds confidence and ensures that the job is delivered in as short a time as possible while minimising costs.
Consultants are sometimes visualised as being removed from the action: report-writing and advisory in nature rather than actually doing the work. TCs are, however, more practically focused. Effectively, they are professional contract engineering services at the sharp end of development. Their wide breadth of knowledge allows them to go from top-level system definition to implementing hardware, often working directly on the bench with a soldering iron.
Whereas individuals may focus on specific tasks, TCs can take a much higher level of responsibility, in both their time management and the product development. Effectively, they take ownership of the project until completion. This becomes much more important when the work to be done is a discrete package or even a complete product development. TCs can also provide direction to other sub-contractors and short-term individuals employed by the client, or even to client staff directly. This frees up the client and removes the burden of day-to-day resource management.
Consultants are generally broad in their experience and understanding. They may not be familiar with your product or technology, but their ability to learn rapidly means they deliver insight and value in a short time. Such flexibility is key to the consultant’s survival in a rapidly changing world, and it benefits the client. A larger consulting firm will offer many talented individuals - something the client will appreciate, since it removes the need for careful and painstaking contractor selection; that has already been done by the TC company. They are that rare breed of excellent engineers with the business acumen and drive to succeed.
TCs are keen to ask questions and to take a view from outside and above the project. They don’t accept what they are told without questioning the reasoning behind the decision process. Such insight can be invaluable on projects and gives a helicopter view of how everything fits together – or does not! In many cases the simple questions are the ones to ask. TCs know the bounds of what is possible and will flag up unreasonable assumptions.
In general large TC firms have expertise outside the technology or industry in which they normally work. For example a consultant doing hardware development may be able to access experts in IT, project management, change management, strategy and planning. It is best to use a TC that has a department or practice that fits with the technology to be developed. In some instances a TC firm will have its own specialist test equipment which may be available for use on client site or for use on the client project.
In some instances a complete development – software, hardware, industrial design, product fabrication, test and approvals – can be conducted by a single TC firm. The ability to work in an integrated multi-disciplinary team is key to delivering quality product designs. Even smaller firms can offer these services by subcontracting to partners; however, if possible, it is best to keep all skills in one place, since co-location is key to optimum team dynamics.
On paper at least, TC firms can seem more expensive than contractors from other sources. This is however not true if you consider the value added to the project. Often the use of TCs can shorten development timescales, free up other members of client staff, or help with strategy and vision. When you consider such benefits, the true value can be understood.

Tim Fergus is a Principal Consultant with PA Consulting’s Wireless Technology Practice, and can be contacted via: tel +44 1763 267492;
e-mail: innovation@paconsulting.com
www.paconsulting.com/wireless

    

This website uses cookies to improve your experience. Using our website, you agree to our use of cookies

Learn more

I understand

About cookies

This website uses cookies. By using this website and agreeing to this policy, you consent to SJP Business Media's use of cookies in accordance with the terms of this policy.

Cookies are files sent by web servers to web browsers, and stored by the web browsers.

The information is then sent back to the server each time the browser requests a page from the server. This enables a web server to identify and track web browsers.

There are two main kinds of cookies: session cookies and persistent cookies. Session cookies are deleted from your computer when you close your browser, whereas persistent cookies remain stored on your computer until deleted, or until they reach their expiry date.

Refusing cookies

Most browsers allow you to refuse to accept cookies.

In Internet Explorer, you can refuse all cookies by clicking “Tools”, “Internet Options”, “Privacy”, and selecting “Block all cookies” using the sliding selector.

In Firefox, you can adjust your cookies settings by clicking “Tools”, “Options” and “Privacy”.

Blocking cookies will have a negative impact upon the usability of some websites.

Credit

This document was created using a Contractology template available at http://www.freenetlaw.com.

Other Categories in Features