For many service providers, the price tag of a complete fibre network overbuild is too steep to justify, and an all-fibre build-out remains a rare undertaking. Many incumbent service providers around the world - including those in Japan, Korea, North America and Taiwan - are using existing copper infrastructure in some portion of the last mile, or in multiple dwelling unit (MDU) risers, to eliminate construction disruption and hasten rollouts and return on investment (ROI) for new, advanced broadband services. These hybrid architectures rely heavily on VDSL2 technology. With VDSL2 running over the copper portions of the access network, service providers can deliver symmetrical speeds of up to 100 Mbps and support internet protocol television (IPTV), networked gaming, peer-to-peer traffic and a variety of other bandwidth-intensive applications. Venkat Sundaresan takes a look

FTTH is the ideal future-proof network architecture because optical fibre is connected directly from the service provider's network to the customers' premises. Fibre is capable of delivering extremely high bandwidth and outstanding error performance, while supporting transmission over long distances without requiring expensive repeaters.

However, a full-fledged FTTH deployment is costly and time-consuming. Even so, both Verizon and NTT are upgrading most of their infrastructure to all fibre - pushing fibre as close to the consumer as possible to ensure that they can deliver the most revenue-generating services well into the future.

Verizon and NTT are deploying fibre to single-family homes, where it can be installed relatively easily. In its newsletter detailing financial results for the third quarter of 2006, Verizon reported that its fibre to the home (FTTH) initiative costs $1,745 per home - $845 to pass a premises with fibre and another $900 to connect the fibre to the home. In Japan, NTT's costs are slightly lower: in fiscal year 2006, the company reported incremental FTTH investment of approximately 130,000 yen per user, or about US$1,070. Given the high price tag of fibre deployment, service providers must be patient to earn a return on their investment.

Fibre to the building (FTTB) is used primarily in densely populated settings in which MDUs are prevalent. Fibre is terminated in the building and VDSL2 runs over the existing copper infrastructure in the building risers.

Verizon and NTT are delivering broadband services to MDUs with a hybrid approach in which VDSL2 technology runs over copper deployed in the risers. VDSL2 is used as the last-mile technology because deploying fibre in restricted riser space is far more challenging. By running VDSL2 over the riser copper in an FTTB architecture, service providers maximize copper utilization: VDSL2 can use the full 30 MHz spectrum and deliver 100 Mbps of symmetrical bandwidth, enough to support IPTV and other advanced broadband services.

There are other deep-fibre architectures that - when combined with VDSL2 technology over last-mile copper - can offer many of the same performance advantages as an FTTH deployment, with the added advantages of faster time-to-market and reduced capital outlay. These fibre-based architectures include fibre to the node (FTTN), fibre to the remote (FTTR) and fibre to the curb (FTTC).

With FTTN/FTTR/FTTC, optical fibre is terminated in a remote terminal - often a cabinet that serves an entire neighbourhood. At the remote terminal, the signal is converted from optical to electrical so it can ride over existing copper infrastructure using VDSL2. Many service providers worldwide have deployed this type of FTTN/FTTR architecture, including Belgacom, Deutsche Telekom and Swisscom in Europe and AT&T in North America. An FTTN/FTTR/FTTC architecture lets service providers deploy high-capacity fibre to a central location in a neighbourhood, where that capacity can be shared among all the homes it serves. Benefits include:

  • Less customer disruption - service providers do not have to dig up the ground to install new wires to a home.
  • Faster time-to-market and return on investment (ROI) - A complete FTTH/FTTP overbuild takes a very long time, and on average, will not yield a return on investment for at least 15 years, according to Analysys, a UK-based research firm. A cabinet-based VDSL deployment can achieve ROI in about six years, Analysys noted.
  • Cost containment - compared to a complete fibre overbuild, an FTTN/FTTR/FTTC deployment is significantly less expensive. AT&T, for instance, is using this type of architecture for its U-Verse service in the United States. The company is building out FTTN, and using VDSL2 to turbo-charge the existing copper loops entering the homes. AT&T has publicly estimated that the FTTN architecture costs only about $360 per user to deploy.

The primary disadvantage of FTTN/FTTR/FTTC is that some view the architecture as an intermediate-term solution. FTTC is similar to FTTN/FTTR but extends fibre closer to each end user.

While deployment costs vary widely between FTTH, FTTB and FTTN/FTTR/FTTC, the performance of each can be comparable under the right conditions.

Service providers undertaking FTTH deployments rely on one of two common standards: Ethernet passive optical networking (EPON) and gigabit PON (GPON). EPON is popular in the Japanese market and is making inroads in other Asian countries, including China and Korea. EPON can deliver data streams of up to 1 Gbps and operates at distances of up to 20 km between the optical line terminal (OLT) and optical network terminal (ONT). EPON OLTs support up to 32 individual users on each PON port.

GPON is being launched worldwide and is expected to be the FTTH technology of choice in Europe and North America. GPON delivers symmetrical and asymmetrical combinations of speeds up to 2.5 Gbps and operates at distances of up to 37 km between OLT and ONT. GPON can support up to 64 individual users per PON port.
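The headline figures above translate into a simple worst-case calculation of what each subscriber sees on a fully loaded PON port. The sketch below (Python, purely illustrative) divides the shared line rate by the maximum split ratio, ignoring framing overhead and the statistical multiplexing gain real networks rely on:

```python
# Back-of-the-envelope per-subscriber share of a fully loaded PON port,
# using the headline figures quoted above. Real deployments do better,
# since not all subscribers draw peak bandwidth at once.

def per_user_mbps(line_rate_gbps: float, split_ratio: int) -> float:
    """Worst-case even share of a PON port among all subscribers, in Mbps."""
    return line_rate_gbps * 1000 / split_ratio

epon = per_user_mbps(1.0, 32)   # EPON: 1 Gbps shared by up to 32 users
gpon = per_user_mbps(2.5, 64)   # GPON: 2.5 Gbps shared by up to 64 users

print(f"EPON: {epon:.2f} Mbps per user")   # EPON: 31.25 Mbps per user
print(f"GPON: {gpon:.2f} Mbps per user")   # GPON: 39.06 Mbps per user
```

Even at the larger 64-way split, GPON's higher line rate leaves each subscriber with more guaranteed capacity than EPON's 32-way split.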

VDSL2 is also being incorporated into fibre architectures to deliver high-speed access over existing copper loops in the last mile. VDSL2 is a physical-layer access technology that uses discrete multitone (DMT) modulation to deliver high bandwidth to the consumer. It has eight defined profiles covering applications from short loops to very long loops, making it a universal technology for access deployment. As carriers push fibre closer to the consumer, VDSL2 lets them deliver revenue-enhancing, value-added services quickly and cost-effectively over existing copper infrastructure from the node, remote, curb, or even the risers of a multi-tenant building. VDSL2 is in use today in Asia, by carriers in Japan, Korea and Taiwan, as well as in European countries such as Belgium, Germany and Switzerland.
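The eight profiles referred to above are defined in ITU-T G.993.2. As a rough illustration only - the band-edge figures below are nominal and approximate, and the loop-length notes are indicative rather than normative - they can be tabulated as:

```python
# The eight VDSL2 profiles of ITU-T G.993.2, with the approximate upper
# band edge each uses, in MHz (see the Recommendation for exact band
# plans). Only profile 30a exploits the full 30 MHz of spectrum needed
# for 100 Mbps symmetrical service on very short loops.
VDSL2_PROFILES_MHZ = {
    "8a": 8.5, "8b": 8.5, "8c": 8.5, "8d": 8.5,  # longer loops (FTTN/FTTC)
    "12a": 12.0, "12b": 12.0,                    # medium loops
    "17a": 17.664,                               # short loops
    "30a": 30.0,                                 # very short loops / MDU risers
}

assert len(VDSL2_PROFILES_MHZ) == 8  # the "eight profiles" noted above
```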

While FTTH offers service providers the best path for establishing a network that is capable of meeting bandwidth demands long into the future, the upfront costs and lengthy deployment process are daunting.  By employing a hybrid architecture, service providers can cost-effectively and quickly deliver advanced broadband services and generate revenue while establishing a migration path for future fibre installation.

Venkat Sundaresan is the Senior Manager of Product Marketing for the access products at Ikanos Communications.  He can be contacted via: info@ikanos.com

The increase in cybercrime is spawning new defence initiatives explains Lynd Morley

Recent revelations in the UK national newspaper The Guardian that the mobile phones of public figures had been systematically hacked by journalists and private detectives not only set the cat amongst the pigeons on this particular personal privacy issue, but also turned the spotlight, yet again, on the growing problem of cybercrime and the many and varied forms it can take.

Or at least it did for those who have been concerned about the growing threat of cybercrime for some time. One disturbing aspect of the reaction to The Guardian story was that while public (and press) outrage was directed at the invasion of privacy - quite rightly so - there was little discussion, or outrage, around securing mobile communications against hackers in general.

There has, in fact, been a chorus of warnings about the growing threat of cybercrime over the past few months. And the continuing poor economic climate is only serving to make matters worse, as redundant (and disgruntled) employees can pose a considerable threat to sensitive information on company networks.

At the beginning of this year, for instance, participants and speakers at the World Economic Forum in Davos were exercised about the subject. During debates it was stressed that the vulnerability of the internet - which has always been recognised - now has potentially huge consequences: the internet is part of society's central nervous system, and attacks could therefore threaten whole economies.

Cybercrime comes in many shapes and sizes, including the theft of credit card and bank details, identities and intellectual property. Much of it is committed by large, well-organised gangs, and its total cost - including repairing the damage - runs to around $1 trillion a year, according to a report from anti-virus specialists McAfee entitled Unsecured Economies: Protecting Vital Information.

At the same time, cyber warfare - in which denial-of-service attacks could bring a country to its knees in a very short time - is also high on the list of concerns, particularly for national security services around the world.

In response to these widening threats, we've seen a flurry of announcements, including President Barack Obama's statement that he was making the protection of the US computer network "a national security priority" and setting up a cyber security office in the White House.

The UK Government also highlighted the importance it places on cyber security within the whole national security framework when it launched its Cyber Security Strategy 2009, alongside the annual update of the National Security Strategy. Noting that as the UK's dependence on cyber space grows, the security of cyber space becomes ever more critical to the health of the nation, the UK Home Office statement adds: "Cyber space cuts across international borders, it is largely anonymous, and the technology that underpins it continues to develop at a rapid pace."

The Strategy includes the setting up of two new organisations, both of which will be established in September this year and operational by the end of March 2010. The Office of Cyber Security will provide strategic leadership for government departments, coordinating a shared view of possible threats. The Cyber Security Operations Centre will bring together existing security functions to monitor threats, coordinate responses and provide advice and information about risks to business and the public.

Neither the US nor the UK can possibly act alone: the international nature of cyber space and cybercrime dictates that any initiative - say, the development of a cyber security legal framework - requires close cooperation across the globe.

Raising awareness of the issues also plays no small part in the defence against cybercrime. Indifference, lack of action, and a lack of surprise at the vulnerability of communications systems - resulting in a certain level of cynical acceptance - can only serve to strengthen the hand of the perpetrators.

Over nearly a decade, the annual Broadband World Forum Europe conference and exhibition has grown significantly, and now hosts more than 6,000 industry players and covers a wide range of information and communication technology topics under the broad umbrella of broadband. Themed "Delivering the Promise," this year's event will feature more than 250 speakers in over 50 breakout sessions, keynote addresses, plenary panels, and workshops.

Broadband World Forum Europe focuses on convergence, interoperability, and the transformation of carrier networks. At the event, industry experts will share their insights and experiences throughout the session programming, examining the latest broadband technology developments, content, applications, and services with particular relevance to the European market. Industry leaders from global carrier, supplier, software, and enterprise segments will explore and analyze technology developments relating to the ICT industry's value chain, preparing them to develop strategies and business models and make informed decisions on monetizing their broadband investments.

Chairman of the NGMN Alliance, Vivek Badrinath (EVP Networks Carriers Platforms and Infrastructure, France Telecom) will serve as the World Forum Chair and will host a carrier panel at the Forum.  Among the keynote speakers will be: Didier Lombard, Chairman and CEO, Orange - who will make the opening keynote address; Mika Vehvillainen, COO, Nokia Siemens Networks; Carl-Henric Svanberg, President and COO, Ericsson; Stefano Pileri, CTO, Telecom Italia; Jean-Briac Perrette, President, NBC Universal Digital Distribution; and Ben Verwaayen, CEO, Alcatel-Lucent.

Workshops will cover a range of topics, including: Converged TV - Delivery and Management of Blended TV Services; Exploiting the Hidden Value in the Network: Deep Packet Inspection: Technology, Promise & Controversy - What You Need to Know; and The Future of Mobile Broadband: Coverage vs. Cost.  Further sessions include: Rich Communication Services: How to Leverage the Next Generation Mobile and Fixed Broadband Networks; Creating a Compelling Mix of Web 2.0 and Unified Communications Services for the Enterprise Market; Where and Where Not to Deploy ATCA in Broadband Networks; Delivering IP Telephony, IPTV, and Web Services on a Converged IMS-based Core-Network to Achieve Next Generation, Unified and Synchronized Service Experience; and The Future of Mobile Access.
Running alongside the conference, the exhibition will enable global technology vendors to showcase their most progressive broadband technologies, equipment, applications, solutions, and services.  As well as highlighting products, the organisers stress, the exhibition also provides the opportunity to discuss the latest technology developments and see how the latest innovations can keep implementation costs down, maximize return on infrastructure investment, and benefit the bottom line.

The event is organized by the IEC, a nonprofit organization dedicated to catalyzing technology and business progress worldwide across a range of high-technology industries and their university communities. Since 1944, the IEC has provided high-quality educational opportunities for industry professionals, academics, and students.

The Broadband World Forum Europe is supported by Orange as Official Host Sponsor and the NGMN Alliance as Associate Sponsor.

The ninth annual Broadband World Forum Europe will be held 7-9 September 2009 at CNIT La Defense, Paris.


Broadband standards are stimulating investment and competition, says Robin Mersh

Broadband technology continues to change our lives.  We are so busy emailing, downloading, twittering and updating our Facebook and LinkedIn profiles (to name but a few) that we've forgotten that barely a decade ago broadband users numbered just a few hundred thousand worldwide - not the 400 million-plus lines we enjoy today.

The new buzzword on the block is super-fast broadband. While the term reached many in the UK for the first time via the mass media when Ofcom - the UK regulator - and BT - the UK incumbent telco - began making announcements earlier this year, the capability has already begun to take root around the world. Japan and Korea are leading the way, according to Ofcom research, but Sweden and Belgium are making their mark, with other countries close behind.

Super-fast broadband offers the prospect of real consumer benefits, building on those of today's broadband services, while supporting high bandwidth applications like video. The services supported by super-fast broadband bring individual, social and economic benefits globally to households and businesses. We are already seeing a massive increase in video communication over broadband and even more information and entertainment content will continually become available. The benefits will also extend to the wider economy, supporting new ways for consumers and online businesses to trade, developing new applications and services and driving creative industries everywhere.

To bring these benefits to the most people, organisations such as Ofcom are doing what they can to encourage an open, competitive landscape for super-fast broadband. Ofcom's vision is of a wholesale bitstream access product that offers competitive communications providers scope for innovation and product differentiation without the operational challenges of passive access. This type of high-quality, fit-for-purpose bitstream has come to be known as Ethernet Active Line Access - or ALA for short.

So what is the difference between passive and active line access?  Passive access allows for resale of the access facilities but requires the competitive provider to install its own equipment. Active Line Access (ALA) provides for the sharing of access facilities and equipment, minimizing collocation complexities as well as unnecessary investment on the part of the competitive provider.

ALA delivers a range of benefits. It:

  • Is service neutral to the applications it carries (video, HDTV, voice, data...)
  • Is neutral to higher layers (IP-based applications, voice and video protocols...)
  • Is transport-access agnostic (point-to-point fibre, Passive Optical Network (PON) options, copper, bonded copper, wireless...)
  • Benefits from the economies of scale of Ethernet
  • Allows innovative and differentiated services to be built
  • Improves distribution and management of next generation wholesale services
  • Avoids a truck roll when a customer is acquired by a competitive provider
  • Lets competitive service providers interconnect at different points with the network provider

Ethernet was an obvious choice as the interface technology for ALA. It has proven to be simple, low-cost, ubiquitous and well developed. Low-cost, standardised equipment is widely available, and Ethernet offers flexible bandwidth, excellent interoperability and well-established security and Quality of Service (QoS) protocols. Other factors making Ethernet a natural choice were its operation at the Data Link layer of the Open Systems Interconnection (OSI) Reference Model, which leaves room for innovation in the service layers above; its adoption as a standard by telecommunications companies around the world; and the significant investment and standardisation effort put into it over the years.

Only last month, at the quarterly meeting of the Broadband Forum (BBF) in Valencia, Spain, BBF members were addressed by Chinyelu Onwurah, Ofcom's Head of Telecoms Technology. This was a landmark meeting in itself: it was the first occasion on which the "new", enlarged Broadband Forum had met following its union with the IP/MPLS Forum, creating a global body dedicated to end-to-end broadband network specifications.

While praising the work of the Forum, she also highlighted that many Broadband Forum Technical Reports are paving the way for Next Generation Access solutions that meet the vast majority of ALA requirements. Although not written specifically for ALA, these reports - coupled with work undertaken by another specifications body, the Metro Ethernet Forum (MEF), largely in the areas of Ethernet service definition, UNI/NNI definition and business end-user requirements - are critical to ALA and to the success of an open, competitive market.

In a parallel development, the European Commission endorsed the need for standards for wholesale broadband access products in its most recent draft Next Generation Access recommendation, which went to public consultation on 12 June 2009. In the draft, the Commission calls on national regulators to work with each other and with international standards bodies to develop technical requirements that can be turned into widely accepted standards. This follows the work on Ethernet active line access technical requirements that Ofcom kicked off in 2007 and finalised earlier this year.

So what are the requirements and the standards that are already laying the foundation for ALA? The key requirements of ALA are security, QoS, multicast, flexible customer premises equipment and flexible interconnection.  These are already being addressed in a variety of approved and available standards as listed in the box.

These reports represent a comprehensive list of specifications that can ensure a super-fast future and the Broadband Forum is dedicated to continuing to serve the industry with the specifications it needs to ensure that consumers have a choice in services and that the communications evolution continues.

Whilst remaining neutral with regard to any one particular approach to broadband regulation and competition policy, the Broadband Forum believes that, for communications providers, Ethernet ALA means the availability of a standardised wholesale access product sooner rather than later.

Robin Mersh is Chief Operating Officer, Broadband Forum

Originally, telecom providers built and offered a limited, controlled range of services from which customers could choose.  When competition was introduced into the market, along with the internet, so too was consumer awareness of choice.  In the future, Phil Kingsland contends, consumer demand will drive the development of new and specific services - and a key enabler of these could well be Public ENUM

Convergent communications isn't a new concept, but it's the topic that continues to dominate the telecommunications industry.  As the new and traditional technologies continue to converge, the number and types of products and services available will grow and evolve with suppliers offering a combination of services, tools and applications for users to communicate with.

As the technologies converge, so do the telecommunications and internet industries.  One of the challenges for the two industries is the difference in their speed of development and innovation, especially between fixed-line and packet-switched internet telecoms.

The telecommunications industry is 140 years old and provides trusted, regulated services - with the reputation regulated industries carry for speed of innovation and the development of new technologies.

In contrast, the new kid on the block for the last decade or so has been the internet, which is run on a bottom up, self regulated, multi- stakeholder model, which has delivered a fluid environment, with constant changes and innovations. The internet's model enables services to be developed, tried and adopted or rejected faster, without the large-scale investment that is required to launch a regulated telecommunications product. This has allowed the industry to introduce a number of new and innovative ways to communicate that we might otherwise not have seen.

The combination of two very different industries is having an enormous impact on the telecommunications industry.  As convergence evolves, the classification of services becomes blurred.  Customers can now get a multitude of services from many providers.  These services are often becoming consolidated in a continuous evolution, increasing competition in an already fierce market.

A new issue introduced by this convergence is the inability to contact one VoIP telephone system from another using the associated telephone number.  If the VoIP address of the recipient is not explicitly known, then the call must be routed via the Public Switched Telephone Network (PSTN) to reach the called party.  ENUM was designed to address this issue: it maps telephone numbers into domains that are stored in the internet domain name system (DNS).  The owner of the domain can record both the PSTN telephone number and the VoIP address against the ENUM domain.  This allows people to use traditional telephone numbering to connect VoIP phones, without needing the PSTN to find the corresponding phones.

The implication is that users with an ENUM-aware VoIP phone, can access any registered user over the internet without use of the traditional PSTN network.  Users simply dial a telephone number in the traditional manner - it is then transformed into an ENUM domain name.  A look up is then carried out and the call is routed according to the specific indications set by the user.  If the number called is not in the ENUM database, then the call will proceed to the person's non-VoIP telephone and be charged in the normal way.
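The number-to-domain transformation described above is mechanical, as specified in the ENUM RFCs (RFC 3761, later RFC 6116): keep only the digits of the E.164 number, reverse them, separate them with dots, and append the e164.arpa apex. A minimal Python sketch:

```python
def enum_domain(e164_number: str) -> str:
    """Map an E.164 telephone number to its ENUM domain (RFC 6116).

    Steps: drop everything but the digits, reverse them, separate
    them with dots, and append the e164.arpa apex.
    """
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

# A NAPTR query against this domain would then return the user's
# registered contact URIs (SIP, mailto, tel, ...).
print(enum_domain("+44 20 7946 0123"))
# 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```

(The UK number above is from the fictitious 020 7946 range and is purely illustrative.) An ENUM-aware phone performs this transformation, issues the DNS lookup, and routes the call according to the records it finds.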

Another feature of the ENUM protocol is that users can register multiple resource addresses in their ENUM domain, such as VoIP servers, mobiles, email addresses, websites and so on.  This makes it possible to converge multiple types of communication onto one telephone number, and for new services to be created to exploit this.

For example, a person may choose a VoIP option from a returned ENUM query to reduce call costs, or present a caller who queries their ENUM domain with the types of communication on which they are available at different times of day - for example, telephone numbers and email during business hours, but only an email address out of work hours.

For consumers, these types of services will become invaluable as we begin to use IP communications in all our devices and accumulate more addresses for each contact in our communications portfolios.  A service provider that offers subscribers effective, cost-efficient management of their communications will build loyalty by providing tangible benefits - and will be able to charge for that value.

This is a significant move away from the traditional business model of communications providers as it places more emphasis on these value services than call charges. There are service providers who argue against the use of Public ENUM due to the fear of the control that it presents to users.  These service providers may choose to exploit the benefits of ENUM via a private ENUM registry.  This offers the service provider the opportunity to protect the existing telecoms business models and other commercial information.

However, it is yet to be proved whether users want to manage their own service, or whether they are prepared to pay a service provider for services that help them control their communications via their Public ENUM. Certainly the argument for call charges is diminishing as more minutes are added to inclusive deals: Ofcom in the UK reports that in 2008 a mere 14% of pay-monthly mobile subscribers claimed to usually exceed their inclusive minutes.

Another example of the dilemma that convergence presents is the current influx of mobile VoIP applications, which has put many mobile providers in fear of losing revenue and caused a number of carriers to block VoIP calls over 3G.  In April this year, this led the European Union to consider a ban on carrier VoIP filtering.  Should this proposal be passed by the Commission, users will have widespread access to a variety of free calling tools, making IP communications technology more commonly understood and used.  At that point, an ENUM-enabled mobile phone would become a very powerful business tool and open the door to broad consumer use.

It is clear that an IP connectivity technology such as ENUM is central to the continued development and convergence of telecommunications and internet technologies. 

For the full advantages of Public ENUM to be realised, a sizeable group of users need to have registered their numbers and be able to perform ENUM look ups.  There is some debate about how and when this will happen.  It may take an application or service that really taps into business drivers to propel widespread adoption.  This could be realised via any number of tools that bring ENUM into the consumers' consciousness, in the same way that certain VoIP products in the internet space have made IP telephony accessible for everyday users.
In contrast to public ENUM, private ENUM tips the balance in favour of the provider.  It gives suppliers the ability to manage the service and therefore the customer and also retain more control over revenue streams. 

With consumers now more aware of choice than ever, it is vital that telecommunications providers offer user centric products to retain existing customers and win new ones.  At a time when all business is more competitive than ever before, these issues should have a considerable influence when deciding what product set to offer customers.

Consumers are no longer happy to just accept traditional service offerings that provide the best benefits to the supplier.  They have learnt from competition and the internet that there is another way.  Customers are concerned not just with the current offering, but also how that impacts on the future development of products, services and applications.  A communications strategy is central to the success of any business, so being at the forefront of technology and having the ability to adapt to future developments is vital to continued success.

Opening up IP connectivity, using Public ENUM, supports the continued innovation and evolution of the telecommunications market.  It establishes opportunities for applications to be developed that will benefit users in new ways and create openings in the market for new business models.

Public ENUM has the advantage of being readily available, cheap to provide and already deployed.  It sits comfortably alongside existing communications services, enabling suppliers to offer truly converged communications that adopt the best features of both the telecommunications network and the internet.

There are a number of forces in both the telecommunications and internet markets that will decide the future direction of ENUM's role in converging communications.  Undoubtedly one of these will be consumer demand.  If business users realise the potential of Public ENUM and demand a service that gives them control of their communications, the type of service they've learnt to expect from the internet, suppliers will need to meet this need to retain business.  What's clear is that Public ENUM presents the possibility of a myriad of solutions and applications that suppliers may not even have begun to realise.

Phil Kingsland is Director of Marketing and Communications, Nominet UK

The future of pay-TV is hybrid, says Francois Pogodalla. Hybrid means combining DVB reception techniques for receiving broadcast digital video with IP capability to receive video and other multimedia content over Ethernet. The opportunities for new services enabled by hybrid products are immense

The benefits and drivers for the hybrid strategy apply to both the wide area services of switched digital video, video-on-demand, over-the-top and other interactive applications, and the local area services, that is, the home network.

This is how we at ADB see convergence: not device-centric, but content-centric. Hence the goal is to facilitate access to content, whatever its shape or origin.

The home network is a key element of true content convergence, with set-top boxes interacting with the devices in the home, including personal computers, games consoles, portable media players and mobile phones. This means that personal content can be shared throughout the home, and played on the TV or the home theatre system. Utilizing the home network, these products also allow for new applications such as multi-room PVR and tuner sharing.

Connecting the home network with the content provider's network and the internet allows consumers to access their content whenever and wherever they want. They can even record content from wherever they happen to be. The benefits of the hybrid strategy for wide and local area applications, although different, can cross over to some extent. Take a satellite broadcast network with an installed base of PVRs: having an IP connection can enable other applications such as uploading locally stored content to the network, or to other boxes in the home, with an opportunity for new VOD services that include not only the programming the operator has pre-selected, but also the content the users themselves have recorded.

Adding hybrid technology to the DVB platform opens the door to these types of services. Customers demand plug and play. Bringing the world of the PC and the internet to the television means making the system easy to install and use for TV viewers.

What lies behind simplifying the customer experience is the knowledge that we have accumulated over the years in managing the complex software involved in providing many different kinds of services through the set-top box. Hybrid technology is complex from the software management perspective. It requires implementing the full IP software stack in parallel with all the other software that additional hybrid services demand. There is the MHP middleware that takes care of terrestrial digital video reception, with the IP software stack sitting next to it for receiving video over the IP connection. Then you add the local area applications: the DLNA implementation for home networking, plus DRM (security) and multi-room PVR services. This combination of software puts a high level of stress on the efficiency of the software stack. Building a hybrid product by simply bolting different software modules together can mean an unstable product that is difficult and slow to use.

It is very important to be able to stabilize the complex solutions involved, and there is tremendous value in the vertical control of the entire software stack. ADB is one of the few companies that can write the complete set of software, from the low-level operating system up to the high-end applications, in a hybrid environment. We are leading the industry in software integration: the first set-top box company to be certified to use the new DLNA home networking standards, and the first in the world to deploy the new MPEG-4 video codec back in 2005.

The best approach to hybrid technology is to deploy proprietary implementations of open standards.  Using open standards, such as DLNA, is the best way of guaranteeing interoperability and ensuring the industry remains dynamic, with devices being able to interact with one another. Using a proprietary implementation of an open standard also ensures that all elements are fine-tuned for optimal performance.

When we designed the hybrid box for customers back in 2005, there were no chips performing MPEG-4/H.264 decoding, so we implemented our own MPEG-4 decoding software, based on the standard, using an off-the-shelf digital processor.

We have developed our own implementation of the MHP standard, which is merged with the customer's IP stack. We have now gone even further, adding local area applications - home networking - by implementing the DLNA standards that support picture and music exchange and multi-room PVR throughout the home. ADB is also implementing technology from partners such as Stream Group, whose Solocoo product opens up the wider world of internet videos.

Making it easy for the TV viewer to access a multitude of new services means providing these services as part of the TV experience, not a PC experience. It's important to retain the simplicity of the TV experience in this complicated environment. ADB is constantly striving to make improvements in order to present the additional choices to the consumer in as straightforward a way as possible. For example, to assist viewers in navigating the additional content, all services provided through the box are made available as channels. Another innovation is the use of 3D graphics in the user interface, so navigating the services available becomes a more pleasant and intuitive experience for the end-user.

For the operator, the use of hybrid techniques means the extra services provided can increase subscriber retention, hence reduce churn, and offer opportunities to increase revenue. There is definite interest from operators in hybrid technology and in making better use of their assets.

The addition of IP to the DVB digital video set-top box opens up endless possibilities for added-value content - VOD and access to user-specific content, whether from the internet or the user's own content. There's no doubt that hybrid solutions offer outstanding tools to limit churn - for instance, adding access to the user's own content makes the box a personal device. IP also enables the operator to free up broadcast bandwidth, thus limiting future investment in infrastructure.

Different markets worldwide may be going at different speeds, but they're all moving in the direction of a hybrid strategy - bringing the Internet to the TV and adding local area applications will happen everywhere. Digital will do for television what it's done for telecoms, which is to open up endless possibilities. The industry can now be really creative in exploring all the opportunities that a truly converged TV experience brings.

Francois Pogodalla is CEO at ADB SA

Rahm Emanuel, Chief of Staff in the Obama White House, said recently: "Rule one: never allow a crisis to go to waste. They are opportunities to do big things". Like introducing transformational tools that may just keep your business afloat, says Aaron McCormack

It's not just the financial services industry that has suffered in the current economic downturn. Many other sectors have been affected in a crisis that, to date, has been marked by large reversals in financial results, mass layoffs, bankruptcies and the disappearance of long-standing companies. And while the question of how we got here could occupy analysts, politicians and commentators for years to come, the more immediate concern for business leaders is to find ways to respond to the circumstances that now surround them.

The issue came to the fore when the Forum of Young Global Leaders, of which I am a member, met at the World Economic Forum's annual meeting in Davos early this year. The question we asked was: "Can we create a system where we value genuine value creation beyond the quarterly results?"

If so, this is exactly the right time to make the changes needed. As London Business School's Donald Sull said in a recent article in the Financial Times: "Every downturn opens a window of opportunity to adjust the status quo and astute managers push through necessary changes while the window is open". This raises two wider questions. The first: what will the post-crisis order look like? The second: what can organisations do to prepare for it, especially at a time when budgets are tight and hard choices often have to be made?

As we all know, crystal balls are a notoriously unreliable way of finding answers. But one thing is certain - like it or not, we are all heading for a low carbon future, and we need to get there fast. Certainly, this seems to be where political leaders are placing their bets, with calls for ‘new green deals' coming from every part of the globe. UN Secretary-General Ban Ki-Moon is just one of a number of prominent leaders who believes we are being presented with a unique opportunity to address two crises at the same time: climate change and the economic downturn. Business leaders also seem to be waking up to what Lord Stern of Brentford, the author of the 2006 review that laid out the economic case for fighting global warming, is calling ‘green stimulus'.

Based on BT's experience, introducing conferencing, and using it to its full potential, is going to be an important next step for many firms. Analyst house IDC is in agreement. In 2008, the Unified Communications (UC) market in Europe was worth $2.6 billion. By 2013, IDC expects this to increase at a Compound Annual Growth Rate (CAGR) of 39% to a value of $13.5 billion, making it one of the brightest spots in a very tough technology market. "In such a challenging market, where spending is plummeting, there is a strong opportunity for solutions that can reduce expenses such as travel in the short-term," said Chris Barnard, research director at IDC. "This means that UC, which includes video and audio conferencing and collaboration solutions, is one of the few technology areas well placed to grow during the recession."
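IDC's figures are internally consistent: $2.6 billion growing at a 39% compound annual rate over the five years from 2008 to 2013 does land very close to $13.5 billion. A quick sanity check of the arithmetic (only the standard compound-growth formula is assumed here, not IDC's underlying model):

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years
start_value = 2.6   # European UC market in 2008, $bn (IDC)
cagr = 0.39         # 39% compound annual growth rate
years = 5           # 2008 -> 2013

end_value = start_value * (1 + cagr) ** years
print(round(end_value, 1))  # 13.5 - matching IDC's 2013 forecast
```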

The good news is that, in most businesses, the infrastructure to support audio and web conferencing is already in place. All that is required is a phone, a computer and an internet connection - facilities that already exist on almost every desk. Newer telepresence systems - a new generation of video collaboration services with high-definition cameras and large projection screens - do require new investment. But with their ability to convey physical characteristics and cues in great detail, they offer commensurately greater returns.

With this technology, it is possible to create virtual meeting environments where eye contact, body language and other conversational cues come as standard, akin to face-to-face meetings. It is possible to run the most crucial of business meetings - for example, delicate negotiations and executive recruitment interviews. And, while it will probably never completely replace face-to-face meetings, it does significantly accelerate a firm's ability to get things done.

Europe, in particular, is very much behind the curve when it comes to using existing conferencing technology to make businesses more productive, fleet of foot and efficient - exactly what is needed right now.

So what real difference does conferencing make when you adopt it whole-heartedly? To begin with, it encourages people to interact and work together to solve problems. It takes time to set up face-to-face meetings. Diaries have to be checked, rooms booked and travel plans made, and that can be a barrier to effective collaboration and teamwork. Conferencing is much simpler, and much more immediate. If it's integrated with a system that allows people to see each other's diaries - Microsoft Outlook, for example - all employees have to do is check people's availability, send an appointment and make the call. People can get together in minutes or hours, not days or weeks.

Saving time in business, product and sales operations gives sales individuals more time to sell, increasing the bottom line. Enterprises with large, dispersed field organisations, including high-tech manufacturing companies like Agilent and professional services firms like Accenture, are setting the bar for others to follow by expanding their use of web conferencing to drive online teamwork and effective delivery of sales presentations. Levels of meeting ‘attendance' are often higher when organisations hold briefings, conferences and training sessions using web conferencing and other services.

Productivity is also improved. The hours employees would have spent travelling can be put to better use. Jobs get done more quickly, which benefits both the employer and the employee. If people can meet overseas clients and suppliers without having to fly to meet them, they might be able to close four to five deals in a week, as opposed to just one. And if people aren't away from home as often, their work-life balance is improved. The need to work evenings or weekends to catch up on time lost on the road is reduced.

Essentially, introducing this kind of technology changes business processes and the very way in which people work. Many of BT's people now work in virtual teams whose members are distributed around the world. When they need to get together to discuss something, more often than not they'll use our conferencing services to do so. It's much more convenient, much more flexible and it quickly engenders both a team spirit and trust.

And the bottom line benefits? At UK supermarket Tesco, staff who use audio conferencing services save an average of £300 a meeting on travel expenses and free up 3.97 hours of ‘on the road' time to do more productive work. ‘Attending' an average of 3.55 conference calls a week, staff save a total of 584,824 hours per year by not having to travel to meetings.

The returns on investment are significant and quickly feed through to the bottom line. BT asked independent researchers from the University of Bradford and SustainIT to assess the benefits it obtained from conferencing in 2006-07. Based on employee surveys, they estimated that our people used audio and video conferencing services to hold around 850,000 meetings that would otherwise have required some or all of them to travel to the selected venue. What they described as "conservative calculations" suggested that, by eliminating about 2.6 million return journeys, time worth more than £100 million had been made available for more productive use. More than 73% of BT conferencing users believed they had saved at least three hours of travel time, and 46% of the trips would have been made by car. The study estimated that BT saves at least £128 million a year by using conferencing technology, while each physical meeting replaced by videoconferencing saves £432.
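The headline figures above can be cross-checked with simple arithmetic. The per-journey and per-meeting averages derived below are inferences from the quoted totals, not numbers reported by the Bradford/SustainIT study itself:

```python
# BT conferencing study, 2006-07 (totals as quoted in the text)
meetings = 850_000            # meetings held by conferencing instead of travel
journeys_avoided = 2_600_000  # return journeys eliminated
time_value_gbp = 100_000_000  # value of time freed for productive work

# Inferred averages (assumption: simple division, no weighting)
value_per_journey = time_value_gbp / journeys_avoided  # ~GBP 38 per avoided trip
journeys_per_meeting = journeys_avoided / meetings     # ~3 travelling participants

print(round(value_per_journey), round(journeys_per_meeting, 1))  # 38 3.1
```

Roughly three avoided journeys per meeting is plausible for multi-party meetings, which lends the totals some internal consistency.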

When it comes to building a low carbon economy, the environmental benefits are equally significant. The researchers found that 97,000 tonnes of CO2 emissions had been avoided through reduced travel. What's more, these benefits weren't a one off. Year in, year out, money will be saved, our people will work more productively and CO2 emissions will be reduced. 

So yes - we are in a period of constraint at the moment. But that doesn't mean companies can't use this time to streamline their operations and shape themselves for the future. Sure, they may need all the help they can get. But they may also find that a downturn in business, far from being a bad time to think about change, may well be exactly the right time to transform the ways they work. Perhaps it's time for more organisations to heed conferencing's call.

Aaron McCormack is CEO at BT Conferencing

John Konczal examines how today's telecom providers must accommodate the explosive demand for new digital content through business alliances

Consumer demand for digital content and value-added services is transforming telecom service providers into the new super-enablers of the digital economy. While revenue opportunities from these new products and services are sizable and promising, introducing them into the telecommunication service provider's current business platform is not without operational challenges.

If telecom service providers are to grasp the new revenue opportunities of the digital economy, they will need secure, flexible business processes and systems to collaborate and connect with new value-chain partners and drive new offers, such as personalised digital content, to consumers.

To offer new products and services to consumers, a new market of converged communications, media and entertainment sectors has had to emerge: something that can be termed the mediacosm.

This mediacosm is now forcing telecom service providers to restructure the business models and processes they rely on to source, market, sell and deliver products and services. In fact, to compete effectively in the new market, telecom service providers' business strategies are beginning to mimic those of world-class retailers rather than those of manufacturers.

Telecom service providers have traditionally relied on manufacturing-driven business models using a "flat" business flow and revenue model. This means that the communications service provider buys equipment and services from suppliers, integrates them into the service provider's network, and bills the end-user for products and services delivered.  In such single-dimension models, there is very little, if any, development collaboration with business partners on what products, services and content are delivered to customers.

However, significant changes in service provider strategy, backed by new open network technology such as service delivery platforms, are driving telecom service providers to embrace a multi-dimensional business collaboration model, much like that used by world-class retailers, as the core enabler of the mediacosm. World-class retailers, whose end products are an amalgam of intermediate goods and end products from partners and suppliers, rely heavily on business collaboration to drive their revenue stream.

For telecom service providers, becoming the super-enablers of the digital economy involves thinking, operating and measuring progress like a retailer. For example, an electronics superstore may collaborate with a personal computer manufacturer to develop a laptop with specific features, branding, and supply-chain integration for the store to market and sell to consumers. In this case, the value-added product provider (the computer company) and the direct-to-consumer retailer (the electronics superstore) connect, communicate, and collaborate throughout the product introduction process in order to bring the custom laptop to market.

Realisation of the mediacosm involves having in place business collaboration platforms where third-party organisations, such as digital media companies, can plug their products and processes into the telecommunication service provider's business platform. This will enable seamless integration and the delivery of new offers to consumers over the telecommunication service provider's network. A telecommunication service provider's transformation into a super-enabler, and one which generates increased revenue, will depend on its ability to diversify its product portfolio by building a broad and deep business collaboration network of digital content and value-added service providers.

To generate these new revenue streams and to offer new services to consumers, many telecom service providers will embrace new business relationships and partnerships with content and service vendors, partners, and suppliers in order to bring a diversified set of products and services to market.

As a by-product of this strategy, a complicated mix of inter-relationships will evolve where both simultaneous collaboration and competition between telecom service providers and vendors, content owners and application/software providers will emerge.

Expertly managing this rich and complex ecosystem of partnerships will be an essential objective for telecom service providers. Those that will be successful in capitalising on the opportunity of the mediacosm will be the ones that deploy and employ technology and business capabilities that allow a telecom service provider to seamlessly integrate with multiple enterprises and enable automated collaboration between value chain partners. This will enable them to source content and value-added service from multiple points and manage business relationships in a flexible manner.

To become super-enablers, telecom service providers like AT&T and British Telecom are focusing greater attention on multi-enterprise integration - building an extended community of business partners and suppliers to bring new products and services to market and to enhance how products and services are sold and distributed.

‘The best companies are the best collaborators. In the flat world, more and more business will be done through collaborations within and between companies, for a very simple reason: The next layers of value creation-whether in technology, marketing, biomedicine or manufacturing-are becoming so complex that no single firm or department is going to be able to master them alone.'
The World is Flat, Thomas Friedman, 2005

Thomas Friedman's observation above is an accurate description of the environment in which a successful telecom service provider operates in the age of the mediacosm. No business works as a single unit. Each is composed of partners and suppliers with whom it connects, communicates and collaborates to drive positive business results for all involved.

This business collaboration produces greater efficiencies throughout the organisation and, more importantly, allows innovation to emerge to a degree that would never be possible under traditional organisational structures. It is the secure, reliable and seamless integration of people, processes and technology - and the vast potential this integration holds for a business to optimise existing resources throughout its value chain - that gives a company the power to reshape its strategy and remain competitive.

Not only has technology levelled the playing field by making the exchange of information a universal capability, but this capability empowers companies to work together in ways that were unimaginable a generation ago. Telecom service providers now rely more and more on business collaboration networks to connect, communicate and collaborate with their partners and suppliers to bring new content and value-added service to market. This means that every participant in the network - not just the organisation at the hub - reaps the benefits of orchestrated business collaboration that allows them access to expertise and information from beyond their own corporate walls. It also enables innovation that drives and optimises customer experience and ultimately, revenue growth.

In summary, a proper multi-enterprise integration solution should allow service providers to react to market dynamics quickly and with very little effort. As a gateway, it must be able to talk to any system a provider has in place today and any communication method a provider's business partners might support. As a process enabler, it should allow a service provider to quickly assemble offers to meet the needs of fickle consumers. As a visibility and collaboration tool, it should give a service provider and the provider's partners better insight into their business operations. As a means of governance, it should track and record every transaction end to end, at any granularity needed to support the business. And, finally, as a security tool, it should protect against fraud, theft, revenue leakage, and liability.
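The roles listed above can be sketched as a single gateway interface. This is purely illustrative - the class, method names and message formats below are invented for this article and do not come from any actual product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IntegrationGateway:
    """Hypothetical multi-enterprise integration gateway (illustrative only)."""
    audit_log: list = field(default_factory=list)

    def translate(self, message: dict, partner_format: str) -> dict:
        # Gateway role: talk to whatever format a partner supports.
        # Here we simply wrap and tag the message; a real gateway would map schemas.
        return {"format": partner_format, "payload": message}

    def assemble_offer(self, *components: str) -> str:
        # Process-enabler role: quickly bundle partner components into one offer.
        return " + ".join(components)

    def record(self, transaction_id: str, event: str) -> None:
        # Governance role: track every transaction end to end, with a timestamp.
        self.audit_log.append((transaction_id, event, datetime.now(timezone.utc)))

gw = IntegrationGateway()
msg = gw.translate({"sku": "movie-42"}, partner_format="EDIFACT")
offer = gw.assemble_offer("broadband", "IPTV", "music-on-demand")
gw.record("tx-001", "offer-sent")
print(offer)  # broadband + IPTV + music-on-demand
```

Visibility and security would sit on top of the same audit trail, which is why end-to-end transaction recording is the natural backbone of such a platform.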

"Multi-enterprise integration isn't the goal, but it supports the goal... Automating such business activities helps drive bottom-line revenue via reduced errors, reduced cost of operations and faster process execution... It also drives top-line revenue via lowering barriers to automation, improving customer and external business partner satisfaction, and increased 'stickiness' once automated processes and data exchanges are implemented."
Gartner, Inc., "Key Issues for Multienterprise B2B Integration," February 2009.

The emergence of the mediacosm is transforming how telecom service providers are structured and how they operate. In the near future, most telecom service providers are likely to be more virtual than physical. Companies may be composed of alliances among many different providers that come together to offer products and services, rather than doing most things in-house.  As the pace of change continues to accelerate, one thing is certain for business in the 21st century: successful telecom service providers will be those that have productive business collaboration, inside and outside their enterprise, to deliver the right products and services to their customers the way they want to receive them.
John Konczal is Global Industry Executive, Communications & Media, Sterling Commerce

Empirix/Manx Telecom
Manx Telecom, part of the Telefónica S.A. group and the Isle of Man's largest telecommunications and Internet provider, has deployed Empirix Hammer XMS to provide service quality assurance for its new IMS network. Over the next 18 months Manx Telecom will transition from its existing PSTN and ISDN infrastructure to an IMS network. During that timeframe it will use Empirix's Hammer XMS system to provide ongoing service monitoring throughout the network.

"We realized early on that we would need an independent monitoring system to manage the complexities that IMS introduces into our network," says Jon Huyton, technical officer, who is leading Manx Telecom's migration to IMS. "We were impressed with Hammer XMS because it addressed all of our needs in terms of functionality, working virtually ‘out of the box,' and ease of use. Most importantly, Hammer XMS provided detailed, end-to-end reports on calls as they traversed from the PSTN on to the IMS network, which was a task we were expecting to have to compile manually."

Manx Telecom selected Empirix's Hammer XMS following extensive system tests on its live IMS network. The deciding factors included Empirix's ability to deliver a comprehensive view of both TDM and IMS network operations, real-time monitoring of call flows, as well as fast set-up and flexibility to customize monitoring rules and reports.

Hammer XMS enables network operations and quality engineers to drill down from high-level views of the network to granular details of individual call paths. In addition, Hammer XMS creates snapshots of normal network activity that quality engineers can reference when errors occur. These capabilities will enable Manx Telecom to identify and rectify errors before they cause customer issues. They will also help Manx Telecom ensure that it continues to meet quality targets set out in Service Level Agreements (SLAs), which are very important as a large proportion of the operator's revenue comes from businesses, including many financial services companies.

Andy Belcher, Empirix's managing director of Europe, the Middle East and Africa comments: "Known as an industry innovator, Manx Telecom's decision to focus on service quality assurance early on in their IMS rollout is proof that leading operators realize that proactively ensuring service quality is critical to their competitive differentiation and commitment to their customers."

Transmode/3 Scandinavia
Mobile services provider 3 Scandinavia needed to build its network capacity to ensure no bottlenecks were created as applications become more data-hungry and more widely used. The company manages its own wireless backhaul infrastructure across Denmark and Sweden, comprising several interconnected optical rings.

Mobile data traffic is undoubtedly growing phenomenally, but the older SDH/ATM networks have inherent technical limitations that prevent cost-efficient capacity increases. IP/Ethernet is often the best solution here, as it provides high-capacity upgrades at relatively low cost.

After evaluating various options, 3 Scandinavia decided to deploy Transmode's iWDM solution. Over the past year, Transmode's Multiservice Muxponder (MS-MXP) has been deployed across its Danish and Swedish networks. The MS-MXP has allowed 3 Scandinavia to deploy multiple services over a single 4 Gbit/s CWDM or DWDM wavelength.

Transmode's Multi-Service Backhaul Solution, including its iWDM capabilities, offers transparent synchronization support for native TDM and Ethernet traffic, and both types of traffic can be transported over a single wavelength.
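The headroom offered by a single 4 Gbit/s wavelength is easy to illustrate. The service mix below is an invented example of typical backhaul traffic types, not 3 Scandinavia's actual deployment:

```python
# Hypothetical service mix aggregated onto one 4 Gbit/s CWDM/DWDM wavelength
WAVELENGTH_GBPS = 4.0

services_mbps = {
    "GbE (IP/Ethernet backhaul)": 1000.0,
    "STM-1 (SDH/TDM)": 155.52,
    "16 x E1 (2G voice, 2.048 Mbit/s each)": 16 * 2.048,
}

total_gbps = sum(services_mbps.values()) / 1000
fits = total_gbps <= WAVELENGTH_GBPS
print(round(total_gbps, 3), fits)  # 1.188 True - ample headroom remains
```

Even with legacy TDM carried natively alongside Gigabit Ethernet, such a mix uses well under half the wavelength, which is what allows capacity to grow without changing the optical infrastructure.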

Håkan Snis, 3 Scandinavia's Transmission Engineering Manager, explained the significance of the company's improved mobile backhaul capabilities.

"Our decision to deploy Transmode equipment was not only based on cost, although there were obvious cost benefits. 3 Scandinavia had previously enjoyed several years' successful co-operation with Transmode in the previous phases of our network development and we were very pleased with the reliability of these systems."

"Since January 2008, which marked the start of the latest phases of 3 Scandinavia's development program, our requirements have been for significantly increased capacity as well as the technical means to cater for a smooth and cost-efficient migration from TDM-based transport to a future-proof IP transport model."

Sten Nordell, CTO at Transmode, comments: "Our Multi-Service Muxponder is especially designed to fit the various traffic formats available in the mobile network. Ethernet, TDM and ATM traffic can be aggregated and delivered over a single wavelength which ensures capacity can grow to match future demand without changing the optical infrastructure."

The whole new telecoms regulatory framework is now in the doldrums following a disagreement between the Council of Ministers and the EU Parliament. You may be forgiven for thinking that a major disagreement over key aspects of the package must be at the origin of the current acrimonious standoff between such venerable European institutions... but you would be wrong. The quarrel that is threatening to derail more than eighteen months of tough negotiations is about a recently introduced anti-piracy provision allowing the disconnection of internet offenders "without prior ruling by the judicial authorities". This seemingly innocuous provision, which was introduced by the French Government at the last minute and subsequently accepted by the Council of Ministers but rejected by the Parliament, is the origin of the stalemate.

With the recent European elections, and the failed negotiations before the summer, it is unclear whether any agreement on this issue will be reached in the near future. This situation is particularly awkward for the French Government given that its very own Conseil Constitutionnel, France's highest constitutional authority, ruled against the very law that the French wanted to export. The so-called "three strikes" provision, which would have allowed an independent body to disconnect the internet access of individuals after three warnings, is therefore back to the drawing board. This is likely to embolden the EU Parliament in this institutional impasse and lead it to wait for the Council of Ministers to back down on their version of the text. In the meantime all the measures in the package - including improved number portability provisions, the set-up of a new pan-European regulatory group, the introduction of functional separation as a remedy, enhanced radio spectrum management and better access to emergency services - are on hold.

While understandably irritated by these latest developments, and unable to make progress on its main text, the Commission decided to revisit another area of regulatory concern and took the unusual step of re-issuing a consultation document on its regulatory proposals for next generation access networks (NGAs).

A wide consensus is now emerging about the need for next generation access networks to be deployed throughout Europe. Demand uncertainty (what exactly will people buy, and for how much?) and technology uncertainty (will it work? what is the optimal topology?) are, however, significant enough for many operators to phase their investment carefully. A "build it and they will come" business case is unlikely to persuade the boards of the major operators to sign off on what is thought to require €300bn of investment in Europe. The regulatory uncertainty surrounding the treatment of such investments has also been blamed for the delay in rolling out fibre. The first consultation on the issue, launched at the end of last year, attracted so much criticism that, rather than producing a final recommendation as initially expected, the Commission is re-issuing a new consultation. Hence the new draft recommendations and guidelines on NGA, intended to ensure that a consistent framework is adopted across member states, probably by the end of the year.

The core of the issue is the balance between investment incentives and the risk of (re)monopolisation of the local loop. Some fear that, if these new fibre-based networks are not regulated (i.e. access to them is not mandated), competition will not emerge in the future. On the other hand, some argue that, if drastic wholesale access obligations are imposed, operators will not invest.

The Commission is clear about the need to regulate access to NGA but doesn't want to be accused of stifling investment. To achieve this dual objective the Commission is trying to promote co-investment in infrastructures (so that market participants can pool resources) as well as the roll-out of multiple-fibre lines to ensure that future competitors will be able to enter the market by using available fibres (rather than leasing an existing link as is currently the case with unbundling in copper networks). The Commission also recognises that investment risk may need to be reflected in the access price of third parties but warns that this should not be used as a way to squeeze the margins of access seekers.

Even with regulatory visibility, the business case for rolling out these new generation networks is only now emerging in large cities.  It is therefore likely that some form of government intervention will be required to make these networks available in more rural areas in the future.

Benoit Reillier is a Director and co-head of the European telecoms practice at global economics advisory firm LECG. The views expressed in this column are his own.  Breillier@lecg.com

Baseline connectivity VPNs are becoming commodity products.  Many carriers have leveraged their network assets and introduced service-aware VPNs.  Application assurance is the next evolutionary step, and Bob Emmerson argues that it addresses enterprises' requirement for better, consistent, visible, end-to-end application performance

Enterprises employ a wide range of applications and services: email, VoIP, IM, file transfer, video conferencing, business IPTV services, storage backup and recovery, etc. Each application has unique requirements for bandwidth as well as timing or delay sensitivity. VoIP can suffer from poor quality and dropped calls. Streaming video may break up or take a long time to start playing.

There is a popular perception that a high QoS equates to a high QoE (Quality of Experience) and that any quality problems can be solved by adding more bandwidth. That perception is wrong, but it persists.

In order to optimize the QoE, which is the quality parameter that matters the most, service providers need to manage the quality of the various applications - the QoA (Quality of Application) - all the way to the end user.  This means that the optimum QoE can only come from a combination of QoS and QoA.

Many service providers have progressed from the basic VPN model, which was connectivity centric, to today's service-aware model that supports the convergence of IP voice, data and video and that has the performance and resiliency necessary to run latency-sensitive applications. 

This model has enjoyed considerable success.  But - and it is a very big but - it is a QoS-centric approach. A high QoS assures the performance of the network, but it does not recognize the applications.  What's needed is a model that builds on the success of service-aware VPNs and goes on to assure the performance of the applications, i.e. a model that is QoA-centric. Add this parameter and you can realize the requisite QoE for all application types.

Enterprises rely on their business applications for day-to-day operations, but the majority have little or no visibility into how those applications are performing over the wide area network services they employ.  It's a serious issue and its impact is growing for a number of interrelated reasons:

  • Most apps were designed for the LAN, not the WAN
  • More and more apps are being centralized at data centres
  • Real-time voice, multimedia and business-critical data applications are converging
  • Availability and performance must be optimized across multiple locations

And the issue is compounded by the fact that many if not most IT departments have limited resources.

The primary responsibility of service-aware VPNs is to ensure that the operator's network and service performance objectives are met. There is limited focus on applications: it is assumed the application performance is acceptable if the service performance objectives are met.  But QoS does not equate to QoE. 

This model assigns different classes of service (CoS). The CoS defines the service pipe into which applications will be classified by trusted CPE devices.  Thus, the CoS determines the priority rating of the applications. However, this does not help address the enterprise's key performance issue, which is to have per-application visibility and control, without having to implement a costly CPE appliance.
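The CoS model just described can be sketched in a few lines. The pipe names, priorities and application-to-pipe assignments below are invented for illustration; they are not taken from any operator's actual class-of-service definitions.

```python
# Illustrative sketch of CoS-based classification in a service-aware VPN.
# Each CoS defines a "service pipe" with a priority; a trusted CPE device
# classifies applications into one of these pipes.
COS_PIPES = {
    "realtime":    {"priority": 1, "apps": {"voip", "video_conferencing"}},
    "interactive": {"priority": 2, "apps": {"im", "networked_gaming"}},
    "business":    {"priority": 3, "apps": {"email", "file_transfer"}},
    "best_effort": {"priority": 4, "apps": set()},  # everything else
}

def classify(app: str) -> str:
    """Return the name of the CoS pipe an application is mapped into."""
    for cos, pipe in COS_PIPES.items():
        if app in pipe["apps"]:
            return cos
    return "best_effort"

print(classify("voip"))            # realtime
print(classify("storage_backup"))  # best_effort
```

The point of the sketch is the limitation the article identifies: the mapping is per-application-category and fixed in the CPE, so the enterprise gets no per-application visibility or control beyond the pipe it lands in.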

Application-aware VPN services offer a new way for service providers to approach their customers. Application awareness goes to the heart of what matters most to an enterprise: the predictable performance of its voice and data applications.  That's a deliverable that will allow service providers to become a strategic partner ahead of the time when commoditization of the QoS model starts and prices erode. 

Service providers that make the transition from service awareness to application assurance will be able to deliver predictable performance and generate additional revenue streams.  Their offer will be distinctively different, and market research indicates that it will be welcomed.

An Ovum study conducted in Europe and the US indicated that 30% of the enterprises would pay extra for an improved QoS that guaranteed the performance of mission-critical applications.  And 20% said they would be prepared to pay for consultancy services that helped them with application performance monitoring and reporting.  A similar study conducted by IDC showed that 51% of 368 enterprises would use a managed WAN optimization service from an operator.

An application-assured VPN ensures per-application performance objectives are met through application recognition and optimization. This is enabled through a network-based approach that provides per-application classification and end-to-end assurance from both trusted and untrusted CPE devices.  So, how is it done? 

Alcatel-Lucent, for example, has designed a solution that enables service providers to deliver application assurance as well as performance monitoring and reporting.  And while the functionality is very advanced, implementing the solution is relatively simple. 

In a nutshell, the operator simply hot-inserts a hardware module into the chassis of an existing 7450 ESS (Ethernet Service Switch) or 7750 SR (Service Router).  This can be done without disrupting the services that are running at the time.

The module is known as the Application Assurance Integrated Services Adapter (AA-ISA). The baseline function is to identify the various applications in order to enable dynamic per-service, per-site and per-application QoS policy control. The term per-application QoS equates to QoA, which was introduced at the beginning of this article.

The applications that enterprises run over their wide area network are numerous and varied.  When traffic is directed to the AA-ISA, traffic flows are identified and subjected to Application QoS Policy rules that determine the requisite treatment.
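The identify-then-apply-policy step can be illustrated with a minimal sketch. The application names, treatments and byte signatures here are invented for illustration, and a trivial payload check stands in for the real deep packet inspection engine; real AA-ISA policies are operator-defined.

```python
# Hedged sketch of per-flow application policy lookup: a flow is identified
# (toy signature match standing in for deep packet inspection) and the
# matching Application QoS Policy rule determines its treatment.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Treatment:
    forwarding_class: str
    rate_limit_mbps: Optional[int]  # None means unpoliced

# Hypothetical Application QoS Policy: application -> treatment.
POLICY = {
    "voip": Treatment("expedited", None),
    "iptv": Treatment("assured", None),
    "p2p":  Treatment("best_effort", 10),  # police p2p to 10 Mbps
}
DEFAULT = Treatment("best_effort", None)

def identify(payload: bytes) -> str:
    """Toy application identification standing in for DPI signatures."""
    if payload.startswith(b"INVITE sip:"):
        return "voip"
    if b"BitTorrent protocol" in payload:
        return "p2p"
    return "unknown"

def treat(payload: bytes) -> Treatment:
    """Apply the policy rule for the identified application."""
    return POLICY.get(identify(payload), DEFAULT)

print(treat(b"INVITE sip:alice@example.com"))  # expedited, unpoliced
```

Because the classification happens in the network rather than in a trusted CPE box, the same per-application treatment applies regardless of which site or device originated the flow.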

Each AA-ISA module provides up to 10 Gbps of deep packet analysis, enough for up to 3 million traffic flows. Up to seven active AA-ISA modules can be deployed per chassis, in which case the analysis capacity scales to 70 Gbps and up to 21 million simultaneous flows.  Application assurance solutions that are CPE-based only scale to 1 Gbps.
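The chassis figures quoted above follow directly from the per-module numbers, which the following few lines simply work through:

```python
# Working through the per-chassis scaling figures quoted above.
GBPS_PER_MODULE = 10          # deep packet analysis capacity per AA-ISA module
FLOWS_PER_MODULE = 3_000_000  # simultaneous flows per module
MAX_ACTIVE_MODULES = 7        # active modules per chassis

chassis_gbps = GBPS_PER_MODULE * MAX_ACTIVE_MODULES    # 70 Gbps
chassis_flows = FLOWS_PER_MODULE * MAX_ACTIVE_MODULES  # 21,000,000 flows
print(chassis_gbps, chassis_flows)
```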

Application assurance enables per-application refinements that can either optimize the performance over the WAN or prioritize one application over another.  In Alcatel-Lucent parlance it enables an application-level QoS, which is arguably a more meaningful term than my QoA.  

In addition, the AA-ISA provides the data that enables visibility of applications and their performance behavior over the VPN. This data is subsequently processed by the reporting and analysis manager that provides application identification plus application monitoring and reporting. 

Alcatel-Lucent's solution enables operators to provide enterprise customers with a Web-based service portal that is used to monitor applications on a per-VPN or per-site basis. IT management can view near-real-time reports as well as archived reports and also request or change application treatment.

These reports are critically important for enterprises as they are faced with operational challenges due to limited resources as well as increasing cost constraints. Without an application reporting capability, they are effectively running blind.

Scalable application assurance has the look of a compelling business proposition, one that meets the market need for enhanced application performance over the VPN.  The concept is also a logical evolutionary step for service providers.  An application-assured VPN is a differentiated service offer that heads off commoditization, and by delivering predictable performance that can be monitored, it turns the business relationship into that of an ICT partner instead of a connectivity supplier.

Bob Emmerson is a freelance writer who lives in The Netherlands. He can be contacted via: bob.emmerson@melisgs.nl

Unlocking the service layer will encourage a new market of innovation and competition, both from application developers and the operators, says Jonathan Bell.  But how can it be done?

Superficially, today's fixed and mobile telephone networks are not too different from those of thirty or more years ago.  You dial a number - it could be a special short number, or an 800 number, the principle is the same - and signalling takes place to connect your call to the other party.  Sure, the numbers you dial look different now.  And some nice features have been added that make things more convenient, like a built-in answer-phone service and caller identification so you can filter out calls and find out who called and when.  For business users, there is a little more still: calling circles, hunt groups and multi-party calling, for example.

We sense as service users that mobile telecoms provide a more personal service with greater utility.  Call the number and you connect to the person, not the location.  Wherever they are, whatever the time.  People of all ages have adopted text messaging enthusiastically as an additional, highly valued communication option.  Increasingly, mobile email and high-speed data are also becoming more commonly used, blurring the boundary between person-to-person telecommunications and "the web".  People are nomadic, and now that connectivity and the device are mobile too, it is clear that extra attributes such as location and presence can be utilised to create services that are more "user aware" and therefore more useful to the user.

To adapt the rigid A to B model that we started with, telecommunications engineers adopted a layered model and ensured that the signalling aspects of a service were separated from the actual communication channel.  It's called Signalling System 7 (SS7 for short), and the same principles have been re-used in SIP - the IP-based equivalent - that will, in all likelihood, eventually replace SS7.  The core network provides the signalling, switching and channels to deliver the service.  The layer above is the Service Layer.  It is in this layer where intelligence is added to the signalling and basic switching function. 

The Service Layer is implemented by one or more Service Control Points (SCP).  SCPs are also commonly referred to as an Intelligent Network Platform or "IN" platform.  When the network switch receives signalling for any kind of telephone number other than a standard, geographic number (which it will route directly), it passes control of the call to a designated SCP.  The SCP figures out what to do.  This might be to perform a number translation, check and authorise the call against a prepaid tariff and balance, try several numbers in sequence or parallel, or look up additional information such as subscriber ID, location, personal calling rules and so on.  Throughout this process, the SCP is in charge and controls the call.
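The SCP's decision role can be sketched as a simple function. Everything here - the translation table, the tariff and the subscriber balances - is hypothetical and invented for illustration; a real SCP drives this logic over SS7/INAP (or SIP) signalling exchanges with the switch, not plain function calls.

```python
# Minimal sketch of SCP decision logic for a call handed over by the switch.
# All numbers, balances and rates below are hypothetical.
TRANSLATIONS = {"0800123456": "+442071234567"}  # freephone -> geographic
PREPAID_BALANCE = {"alice": 2.50}               # remaining credit, in euros
RATE_PER_MIN = 0.10                             # illustrative tariff

def handle_call(caller: str, dialled: str) -> str:
    """Return a routing decision for a call the switch passed to the SCP."""
    # Number translation, e.g. a freephone number to a geographic one.
    if dialled in TRANSLATIONS:
        return f"route:{TRANSLATIONS[dialled]}"
    # Prepaid check: authorise only if the balance covers at least one minute.
    if PREPAID_BALANCE.get(caller, 0.0) < RATE_PER_MIN:
        return "reject:insufficient-balance"
    return f"route:{dialled}"

print(handle_call("alice", "0800123456"))    # route:+442071234567
print(handle_call("mallory", "0201234567"))  # reject:insufficient-balance
```

The article's complaint is precisely that, in a traditional SCP, logic of this kind is pre-baked into a closed, vertically integrated box, so only the SCP vendor can change it.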

So far, so good.  So what's the problem?  Essentially, it is that SCPs are only available as a complete, vertically integrated hardware and software system.  In other words, a 'box' that you connect into the network to perform predefined functions.  They were designed in the 1990s or earlier to provide the limited range of standard telephony services at that time and to comply with the ITU and 3GPP standards.  As the SCP is controlling phone calls, it is engineered to meet the exacting requirements for network equipment (NE) - "five nines" availability: resilience to failure, upgrade with no down-time, hot swapping of components and so on.  The deployment requirements, the restricted ambition in terms of the range of services when the SCPs were designed, the tight vertical integration and severe structural rigidities mean that end-user services are essentially pre-baked into the SCP.  Adaptation of services has to be done by the SCP provider and is extremely expensive, often with very long lead times.  It means that the Communication Service Provider (CSP) can only sell and market a limited range of standard, utilitarian services.  They cannot experiment or innovate.  They have only one supplier for any changes that they require, the business case for which often fails due to the high costs of SCP adaptation.  Uniquely in a highly competitive marketplace, CSPs are handicapped in their ability to compete by differentiating their offer in terms of its capabilities.

Traditional SCPs are characterised by high prices, inflexibilities, single source for changes, slow evolution and enhancement.  As a CSP, once you have procured your SCP, you are a hostage to fortune.  Well, at least everyone is in the same boat.  But meanwhile voice minute price-points are in decline and all CSPs are under tremendous price pressure.  And the insurgent VoIP-based, price-focussed competitors are chasing hard. 

It used to be like this in Enterprise IT.  Enterprises bought a complete, vertically integrated stack of hardware, system software and applications from a single vendor.  This has all changed now.  There are commodity hardware providers, system software providers and application software providers.  There is competition within each layer.  The competition has driven the price-points down, platform performance and flexibility up and application innovation up.  There is something intrinsic at the heart of all this:  open systems and architectures promote competition and result in lower prices and innovation.  It's the basis of all free markets.

Until now, this option simply hasn't been available to the SCP.

What is needed is a solution that enables service agility in the telecoms network through an open, flexible platform that utilises commodity server hardware...in other words, a modern "IT" system designed explicitly for the telecoms network that unlocks the value of the telecoms service layer.

With this approach, new services can be created and delivered at a fraction of the cost and timescales normally associated with the telecoms industry.  In turn, this provides operators with the advantage needed to compete with the plethora of new device oriented applications currently hitting the market.

Opening up their SCP platforms to the burgeoning Java developer community will bring increased levels of innovation in person-to-person communications - which account for 95% of all operator revenues today - sustaining operators' market lead in these services.

Jonathan Bell is VP of Product Marketing at OpenCloud

Jean-Noël de Galzain explains how developer communities, business leaders and  policy makers will attend Open World Forum 2009 to ensure open software plays an active role in the digital recovery

Free/Libre Open Source Software (FLOSS) has been around for a number of years, but only recently has it become a mainstream alternative to licensed software. The raison d'être of FLOSS is to enable developers to take a "community" approach to the design, development and distribution of software, whereby everyone is given free access to a program's source code. This approach to sharing software gives developers greater flexibility than more popular licensed programs, which restrict use and keep users busy trying to avoid any violation of intellectual property rights.

The people behind the FLOSS movement are helping to make things happen for open source. FLOSS is an ecosystem driven by software editors along with enterprises, service providers, distributors and integrators. It forges breakthroughs in networks, embedded systems, Web 2.0 technologies, Web services, application development, critical information systems and security. FLOSS supports the development of cloud computing, SaaS and other emerging technologies to help revolutionise the world of IT.

The benefits of FLOSS are hard to ignore, and more and more firms across Europe are embracing the concept, especially now when companies are tightening budgets and trying to cut costs without impacting services.  According to a study by UNU-MERIT, Open Source will represent 30% of the market for software and services by 2012.

Open source is revolutionising information technology and is no longer limited to basic software such as Linux or Apache. Fledgling open source firms are finding opportunities in various business applications, including databases, customer relationship management and business intelligence.

Without a doubt, open source creates tremendous opportunities and challenges for those in the space but what will be the impact on innovation, governance, public policy, and ultimately, IT careers? How will FLOSS really revolutionise the local landscape of information technology and what technological advances in commercial and Free Software can be expected for the future?

The Open World Forum in Paris on October 1- 2, 2009, will play host to thousands of participants from 20 countries, to share their thoughts, foster innovation and competitiveness, and address these questions. The event will explore innovations and future trends in FLOSS and how the world can help build the future of FLOSS and make it a crucial component for the digital recovery.

The second annual Open World Forum is targeting open source developers to ensure the event's success in Europe. The event is open to the community at large and not targeted specifically at a specialised sector or the research community. It is aimed at all stakeholders: communities, IT managers, architects, developers, researchers, academics, industrialists, investors, etc.

The event will be held at Eurosites George V in Paris and the agenda will include global IT players (Alcatel-Lucent, Atos, Bull, Cap Gemini, Google, IBM, HP, Siemens, Sun, Thales), as well as the main communities (ASF, Eclipse, Linux Foundation, Qualipso, Mozilla, OW2, OSA Europe), large research and competitiveness clusters (System@tic Paris Region, Cap Digital, INRIA, Fraunhofer FOKUS, UNU-MERIT) and a wide network of SMEs from around the world.

The Open World Forum addresses issues raised by policy makers, professionals, users and contributors, customers, as well as the undecided, on the new information technology and communication strategies available using Open Source. The agenda will include seminars and plenary sessions involving well-known personalities in the industry.

The plenary sessions will explore how Open Source can contribute to renewed economic growth, as a major driver for competitiveness and innovation as well as a force for social improvement and national sovereignty.

Delegates are invited to attend the plenary sessions on key economic and societal issues such as employment, innovation, cloud computing, IT governance, the role of communities, and FLOSS as social and economic leverage in public policies.

The event will end with the delivery of the roadmap by the president of the Program committee, Jean-Pierre Laisné, CEO of OW2 & Chief Open Source strategist at Bull, and the directors of the 7 major themes addressed during the Open World Forum, namely public policy, innovation, ecosystems, technological revolutions and economic governance, careers and BRIC.  I hope that you can join us as well.

Jean-Noël de Galzain, CEO Wallix


