European Communications


Features

European Communications discusses the latest telecom trends with telco executives, analysts and topic experts via insightful analysis, Q&As and opinion pieces.

DATA TRANSFORMATION - Telecoms colonic tonic

Have you got a dirty data problem? Too embarrassed to talk about it in public? Is it sapping your business vitality? Well you're not alone, and the good news is that 2007 has been widely touted as the year when the telecoms industry finally starts to sluice away its data blockages. So, without getting too anal about inconsistent data semantics and all the other symptoms that you may be experiencing, the message is that 'new-age' therapies are available to help you deal with these problems quickly, effectively and reliably, enabling you to become the business you've always wanted to be.  Paul Hollingsworth looks at the mismatch between business and technical issues and the challenge of continual data and business transformation

The sad fact is that even mentioning data migration or data management issues is a sure-fire way of getting most business-level executives to instantly turn off and stop reading. It’s ‘dull as ditch water’, to use a common British idiom. But just before you give up on this article, let’s go back to that water analogy. You might not be all that interested in your home plumbing either, but it has this habit of grabbing your attention when the sink is backing up or when there’s no water coming out of the tap, doesn’t it? At this point the solution is usually very expensive and involves an overpaid plumber simultaneously tutting and sniggering over his bill. And if you’ve ever been in a plumbing-centred crisis like this, you’ll know just how stupid and out of control you feel. Which brings me back to data.

" height="<% height %>" align="right" alt="DATA TRANSFORMATION - Telecoms colonic tonic" class="articleimage" />

Business managers hate the idea of data migrations primarily because of fear of the unknown and that horrible out-of-control feeling they get every time they have to think about them. Having signed off a project they have absolutely no certainty over when that project – if ever – will deliver real, hard business benefits. They also know it'll almost certainly overrun on cost or time, or both. And the really scary thought is that this could be the one in every five projects that's doomed to fail completely, having burnt through millions of dollars in the process.
Statistics from the likes of the Standish Group, which has made a business from analysing IT project failure, make grim reading. In 1994 when Standish first began collecting data, around 84 per cent of IT projects were deemed to have failed in some aspect. Ten years later the figure had improved, but still stood at 71 per cent.
Here in telco-land, I regularly come across architects who say that they can't remember a single successful data migration. Which is rather worrying given that in 2007 we are standing on a fundamental pivot point in our industry. Behind us lie the halcyon days of relative tranquillity, when change was relatively slow and profit margins plentiful, while ahead of us lies the uncertain future of cut-throat competition, with increasingly complex services delivered at increasingly fast speeds. Transforming from a tier one telco to a strong, next-generation service provider is no mean feat. It involves transforming not only business norms and organisational structures, but also IT systems, architectures and data. It means learning a whole new content-driven, media-rich language, and requires companies to get fitter, leaner and more agile (both on an operational and technological level).
On a business level, we can articulate what is required to respond to the change drivers our industry is facing. For example, we need to:
•    cut costs and improve revenue management
•    innovate and get products to market more quickly
•    retain customers
•    increase or at least stabilise ARPUs
•    comply with regulation and legislation
•    use business information more effectively.   
But this is where the gap between business and IT becomes so very obvious, because translating business goals into technical strategies requires getting to grips with 'issues' that we have been putting off for a long time, such as consolidating and upgrading supporting systems, migrating data, transforming architectures and so on.
Surveys undertaken by UK-based Kognitio have highlighted that 57 per cent of respondents thought that decision makers did not have the information they needed to run their business optimally. The issues identified were the disparity of data, the sheer volume of data, the scale of the task to access the required data, unavailability of data and speed of access. Another survey unveiled in January found that 56 per cent of decision makers in 100 companies polled admitted that they had been discouraged from undertaking data migration projects because of the risk, cost, time and resource needed for such projects.
But in telco-land we now have a new reason to pay attention in the form of agile, lean new entrants that are extremely adept at using data to analyse and improve their business and their offer to customers – such as Google-coms and Walmart-phone. It's easy to underestimate how such companies can transform the market. Look, for example, at the telecoms arm of UK retailer Tesco, which in less than two years became the third largest prepaid mobile phone company in the UK. It acquired around half a million customers in its first twelve months, and has since added more than a million to this. And that's in a saturated mobile market where customers have to be prised away from competitors. Tesco Mobile states that it has done this by providing better customer service and understanding their needs – all of which requires good, accessible customer data. 
Of course you might take the line that fighting the retail fight is just not for you, and you're quite happy just to provide the pipework and go wholly wholesale. But don't get too complacent. Even then your partners are still going to demand data from you, and to be a successful value-added wholesaler you're going to have to get much better at providing it. You're also going to need to get better at managing your cost base, which also requires better data management. And to get from where you are to where you want to be probably means some form of data migration.
There's just one other little factor you mustn't forget. The industry formerly known as telecoms is no longer a market that will stand still or even change at a leisurely rate. So whatever solutions, systems or architectures you come up with absolutely have to accommodate the need for change. Otherwise you risk entering a never-ending project hell, whereby even if you deliver your clearly defined, properly scoped project on time and to budget, in the meanwhile the requirements have changed and the new solution cannot cope with them or support unknown and unforeseen ongoing change.
So before you throw up your hands in despair, remember that some data migrations – even very complex ones – are delivered on time, to budget and with a low risk profile. One feature of such projects is that they have business and senior executive buy-in. They combine business and technological strategies and deliver against both. To help you become one of the winners, here are five uncomfortable questions we suggest you should consider:
•    Will your data migration project deliver on time and at budgeted cost?
•    Are you comfortable with the risk profile you are assuming?
•    What happens if it doesn't work?
•    How much is it costing your organisation for every day of delay?
•    How flexible is your new architecture? Will it be able to respond to inevitable change?

Paul Hollingsworth is Director of Product Marketing at Celona Technologies

IPTV OPERATIONAL STRATEGY - Into the real world

IPTV technology is moving out of the laboratories and into the commercial world. Successful IPTV deployments, however, remain elusive, say Rajeev Tankha and Dr Graham Carey

As network technologies and related integration techniques mature and improve, leading communications companies are now increasingly focusing on the operational aspects of IPTV services as key factors in an effective commercial launch and operational differentiation. Over the last two to three decades telecommunication service providers have built a customer expectation of service excellence, reliability and “carrier grade” service availability. The challenge is to meet these entrenched consumer expectations while containing IPTV-related operating expenses.  Let us look more closely at the challenges for IPTV service deployment and service provisioning.

Consumer expectations: IPTV consumers are intolerant of service glitches; services must be launched with consumer friendly operations that consistently deliver the desired level of service to the customer.
Extensive operational changes: IPTV deployment requires pervasive changes to the tools, structures, staffing, training, measurement and reward systems used to manage telephony and HSI services.
Risk of “trial and error” approaches: Developing IPTV operations from a blank piece of paper can incur unacceptable delays, risk and costs through trial and error testing and iteration. These methods usually fall short of required operational performance, severely limiting scalability, delaying commercial launch, and creating excessive operating expenses.  Most significantly there is, potentially, a major risk of affecting the service provider's brand with the launch of a poor quality service.
To date, technology factors have masked operational issues: IPTV operational challenges are often entwined with – and masked by – better known networking challenges. Many service providers experience unexpected difficulties with service provisioning and consumer complaints about service quality and reliability, with overwhelming associated help desk and repair costs.
The affected processes include:
•    Service provisioning (order to installation)
•    Service assurance (preventive and corrective)
•    Network assurance and network change management
•    Video head end management and content management.

Service provisioning challenges
The service provisioning process must manage four interdependent streams of activity:
•    Change-out of telephony feature set and pricing
•    Loop re-arrangements and conditioning
•    Removal and re-build of broadband service and provisioning
•    Activation and installation of home network and IPTV applications
Further, this must be accomplished without unacceptable disruption to any of the consumer's existing telephony or broadband services.
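The four streams above can be pictured as an ordered pipeline in which each step must succeed before the next begins, and any failure must unwind cleanly so the consumer's existing telephony and broadband services are restored. The sketch below is purely illustrative – the step names and rollback behaviour are invented for the example, not taken from any vendor's provisioning system.

```python
# Illustrative sketch of the four interdependent provisioning streams.
# Step names and rollback behaviour are hypothetical, not a real system's API.

def run_provisioning(order, steps):
    """Run steps in order; on failure, roll back completed steps in reverse."""
    completed = []
    for name, do, undo in steps:
        try:
            do(order)
            completed.append((name, undo))
        except Exception:
            # Unwind so the consumer's existing services are restored.
            for _, rollback in reversed(completed):
                rollback(order)
            return ("failed_at", name)
    return ("provisioned", [n for n, _ in completed])

# Toy order and steps mirroring the four streams in the text.
order = {"telephony": "legacy", "loop": "unconditioned",
         "broadband": "old", "iptv": None}

steps = [
    ("telephony_changeout", lambda o: o.update(telephony="new_featureset"),
                            lambda o: o.update(telephony="legacy")),
    ("loop_rearrangement",  lambda o: o.update(loop="conditioned"),
                            lambda o: o.update(loop="unconditioned")),
    ("broadband_rebuild",   lambda o: o.update(broadband="rebuilt"),
                            lambda o: o.update(broadband="old")),
    ("iptv_activation",     lambda o: o.update(iptv="active"),
                            lambda o: o.update(iptv=None)),
]

status, detail = run_provisioning(order, steps)
```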
The end-to-end process will most likely span a number of different business units whose procedures must be adapted to accommodate IPTV and Triple Play operations. These assets will not usually have been integrated into a reliable end-to-end IPTV service provisioning process. Sadly, with limited visibility of the end-to-end process, failure is often not detected until downstream activities are visibly impacted – and often not until the consumer is aware that the process has failed in some way.
In the absence of effective operational practices, studies to date have shown that order fallout rates can exceed 50 per cent; up to 30 per cent of installations may require a physical attendance of an engineer to complete the installation. Even when the installation appears to be completed, the service provider may receive a higher volume of help desk calls within the first thirty days after the installation.

IPTV service provisioning
In order to minimise the provisioning challenges of deploying an IPTV service, telecommunications service providers need a strategy that encompasses the full range of methodologies, templates and tools specifically tailored to the consumer and business needs of IPTV services.
This strategy must therefore integrate several key elements:
•    Clear understanding of requirements – IPTV service providers need a clear definition of the operational targets for IPTV in order to organise and execute development activities toward those targets.
•    Well defined processes – IPTV Service Provisioning process flows, error checks and related process measures should be assembled into a centralised workflow management tool.
•    Operational Trial and test frameworks – Once processes are developed, coordinated testing of the operational process including multiple error conditions will enable the operator to assess readiness for market trial and launch.
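The 'related process measures' mentioned above – fallout rate, truck-roll rate and early help desk calls, the very figures cited earlier – can be computed directly from order records. The sketch below is a hypothetical illustration; the field names are invented for the example.

```python
# Hypothetical sketch: computing IPTV provisioning process measures from
# order records. Record field names are invented for illustration.

def process_measures(orders):
    """Summarise fallout, engineer visits and early help desk contacts."""
    total = len(orders)
    fallout = sum(1 for o in orders if o["status"] == "fallout")
    truck_rolls = sum(1 for o in orders if o.get("engineer_visit"))
    early_calls = sum(1 for o in orders if o.get("helpdesk_calls_30d", 0) > 0)
    return {
        "fallout_rate": fallout / total,
        "truck_roll_rate": truck_rolls / total,
        "early_helpdesk_rate": early_calls / total,
    }

orders = [
    {"status": "completed", "engineer_visit": False, "helpdesk_calls_30d": 0},
    {"status": "completed", "engineer_visit": True,  "helpdesk_calls_30d": 2},
    {"status": "fallout",   "engineer_visit": False},
    {"status": "completed", "engineer_visit": False, "helpdesk_calls_30d": 1},
]

measures = process_measures(orders)
```

Tracking these measures per process step, rather than per order, is what lets a centralised workflow tool spot failures before the consumer does.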

Operational elements
In addition to the common issues involved in network convergence, we have identified five unique and important areas in IPTV service fulfilment that are critical to successful IPTV deployment:
•    Service Provisioning & Verification – Creation of detailed process modelling to enable the effective management of key IPTV processes including:
–    Successful collection of all required customer information
–    Order creation, configuration and provisioning of the customer IPTV service
–    Customer site survey, service verification, and troubleshooting techniques
•    Customer Trouble Resolution – Management of the collection of detailed trouble information for the categorisation and disposition of all customer trouble ticket preparation, to:
–    Reduce call holding time
–    Decrease trouble ticket resolution times
–    Reduce repeat dispatch of technicians
–    Reduce repeat customer trouble calls
•    Content Management – Managing content for IPTV services including:
–    Reconciling and integrating IPTV video service billings and content charges from content providers
–    Video monitoring to ensure billable content availability
–    Managing content provider contracts
–    Managing all intellectual property issues associated with the content such as royalty payments
–    Producing partner settlement invoices across the new IPTV value chain including advertising, sponsorships and promotion deals
–    Plus many other functions new to communications providers
•    Head End Management – Encompassing the design and management of these and other IPTV processes:
–    Channel line-up correlation and frequent additions/changes
–    Simultaneous substitution
–    Closed captioning
–    Daylight savings time change
•    Change Management – Enabling IPTV-specific functionality to manage multiple changes in:
–    Underlying infrastructure (e.g. Video Middleware, DSLAMS to IPTV DSLAMS, Modems, Gateways, central office wiring, head-end components, servers and databases)
–    Video and audio content (channel changes, program reception and encoder configurations)
–    Service pricing, service packaging and portfolio service up and cross sell
–    WEB content and upgrades, and much more

A strategic, planned approach to IPTV deployment can enable IPTV service providers to implement efficient IPTV service provisioning while reducing the risks, costs and delays of developing “from scratch.”
A strategic process provides a foundation for continued operations design and optimisation, which will further reduce design cost and time. This helps ensure the key evolving IPTV provisioning requirements are captured, and increases the success rate on installations.
This strategy also helps create a consumer friendly IPTV service provisioning process that can be tuned and augmented as volumes grow without throw-away investment.  The use of a repeatable, controlled process improves order completion, while reducing order fallout and re-work and generating positive customer experiences. With all the above the start-up costs may be contained, and scalability enhancements can be phased in as required over time with service growth.
Of all the factors affecting the success of a new service, maintaining customer satisfaction and service quality is perhaps the most important of all.

Rajeev Tankha is Director of Product Marketing, and Dr Graham Carey is Director, Industry Solutions, Oracle Communications

Inflection points - Pointing the way

Jeff Nick examines the concept of ‘inflection points’ and their application to information processes

Every few decades, a fundamental shift occurs in how IT reaches customers. The shifts are akin to what Intel co-founder Andy Grove calls ‘strategic inflection points’ – the ‘time in the life of a business when its fundamentals are about to change.’

We’ve seen bandwidth, information growth, and infrastructure complexity explode in this decade as applications became more integrated and dynamic. This has forced the IT industry to address inflection points centred on four areas.
The first inflection point reflects the changing nature of information management.
Today, most information lifecycle management (ILM) approaches are volume based: ‘buckets’ of information move, based on coarse descriptions of where the information currently sits, how big it is, who owns it, how old it is, etc. Customers already gain value by automatically protecting information and optimising where it’s stored.
But ILM is evolving. Tools can now automatically classify and apply ‘metadata’ labels to information based on its content. The idea is simple: scan information and use what you learn to enable automation.
• “No one’s accessing this part of the database. Let’s automatically move it to a lower service level.”
• “This e-mail says ‘Confidential.’ We’ll automatically flag it for retention.”
• “This spreadsheet may contain personnel information. We’ll automatically ensure it’s secured.” 
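The three examples above boil down to a small rules engine: scan an item's content and usage, attach labels, and let the labels drive actions. The sketch below illustrates the idea only – the rules, labels and action names are invented, not any product's actual classification scheme.

```python
# Illustrative sketch of content-based classification driving ILM automation,
# per the three examples in the text. Rules and labels are hypothetical.

def classify(item):
    """Attach metadata labels based on content and usage, then pick actions."""
    labels, actions = set(), set()
    content = item["content"].lower()
    if "confidential" in content:
        labels.add("confidential")
        actions.add("flag_for_retention")
    if "salary" in content or item.get("kind") == "personnel":
        labels.add("personnel_data")
        actions.add("restrict_access")
    if item.get("days_since_access", 0) > 180:
        # No one is touching this; move it to a lower service level.
        labels.add("cold")
        actions.add("move_to_lower_tier")
    return labels, actions

email = {"content": "This e-mail is Confidential.", "days_since_access": 3}
labels, actions = classify(email)
```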
The second inflection point ties into configuring IT assets for greater business value.
Typical data centres house many disparate technology ‘stacks’: applications running on specific operating systems, hosted on specific servers, using specific networks to connect to specific storage devices. Each deployment may have little or no relation to others, amplifying complexity and promoting inefficient use of resources.

Significant problems
This approach causes two significant problems:
• General-purpose computers become a dumping ground for an amalgam of different software stacks and applications. Each application has little or no relation to other applications co-hosted on the same platform and has its own patterns of resource consumption (CPU, memory, storage etc). The unpredictable resource usage results in an extremely complex operational environment and customers find themselves with a scale-out of under-utilised resources. Management simplification conflicts with the desire to optimise resource utilisation.
• The propensity to deliver IT as general-purpose parts translates into a mismatch between the way that IT is delivered by vendors to customer IT organisations and the way those organisations need to deliver IT functions to their end users.
Virtualisation technologies, such as VMware, can help with both these problems significantly. Rather than mixing application workloads onto one operating system platform, VMware allows applications to be deployed into separate, flexible, virtual server containers, logically separate, while physically co-located.

Turn-key capabilities
There is a marked trend, mostly from start-ups, to deliver targeted niche turn-key capabilities in a network-centric appliance model, such as Web service gateways, encryption engines, network traffic monitors etc. These functional components, by their modular design, are self-contained, deploy non-invasively into existing configurations, optimise resource utilisation to the function being provided and are simple to manage due to their built-for-purpose limited configuration options.
Again, virtualisation technologies will increasingly shine, as virtual containers are a natural deployment vehicle for functional virtual appliances as well as traditional software stacks.
Guaranteed delivery of IT capabilities in support of service (i.e. service-level agreements) is extremely difficult to provide with any level of confidence in today’s working environment. This is due to the complex task of translating abstract business service-level objectives into concrete, actionable, resource management policies. Objectives for availability, performance, security, compliance and other dimensions of IT quality of service (QoS) are achieved by IT administrators largely through trial and error. Once some level of QoS is achieved, IT organisations are reluctant to introduce change to existing configurations. This puts IT groups directly in conflict with their primary objective: providing service to the customer business by remaining responsive to ever-changing demands and unpredicted business growth opportunities. Today, stability is achieved through wiring static configurations at the expense of dynamic flexibility.
This primary pain point in alignment between the business and the supporting IT environment is the result of some underlying fundamental problems.
First, there has been a general lack of agreement on how to express the capabilities of IT resources in a consistent manner. The disparities of implementation between vendors of similar computer technology elements are all too painfully evident to IT administrators. However, there have been increasing efforts in standards bodies to model resource types for management plug-ability. There has not yet been, though, convergence on the modelling framework across these different resource domains.
A further problem is that much emphasis to date has been placed on modelling ‘things’ (resources) rather than the ‘use of things’ (functions). As a result, significant effort has been placed on modelling every knob and dial of every resource in every resource class. While this is important for plug-ability of resources, it does not close the gap between service-level management and resource management. What is fundamentally required is a focus on modelling the profiles for interactions with resources in the context of a given management discipline, such as availability or performance management. This would allow for direct translation from service-level objectives to resource policies specifying constraints on acceptable settings for only the resource dials and knobs associated with that particular management discipline.
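That translation can be sketched concretely: a service-level objective names a management discipline, and the resulting policy constrains only the knobs belonging to that discipline. Everything below – the discipline names, knob names and thresholds – is invented for the illustration; it is a sketch of the modelling idea, not a real management framework.

```python
# Hypothetical sketch: translating a service-level objective into resource
# policy constraints scoped to one management discipline's "knobs", as the
# text argues. All names and thresholds are invented for illustration.

DISCIPLINE_KNOBS = {
    "availability": {"replica_count", "failover_timeout_s"},
    "performance": {"cpu_shares", "io_priority"},
}

def to_policy(objective):
    """Map an abstract objective to (min, max) constraints on the knobs
    owned by its discipline – and only those knobs."""
    discipline = objective["discipline"]
    knobs = DISCIPLINE_KNOBS[discipline]
    if discipline == "availability" and objective["target"] >= 0.999:
        constraints = {"replica_count": (2, None),       # at least 2 replicas
                       "failover_timeout_s": (None, 30)} # fail over within 30s
    else:
        constraints = {}
    assert set(constraints) <= knobs  # never touch another discipline's dials
    return {"discipline": discipline, "constraints": constraints}

policy = to_policy({"discipline": "availability", "target": 0.9995})
```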

Serious security issues
A lack of seamless security policy and enforcement across application, information and resource domains also causes serious security issues. Once information is retrieved, for example, from the authoritative data store and is processed in another application, stored in another repository or migrated to another resource, security context is lost.
Further, given the lack of security integration across vendors and types of IT assets, customers seek to protect themselves by building a security ‘castle’, hoping to protect their soft IT infrastructure within the castle walls. This approach, however, does not provide the necessary security policy protections to the business once inside the perimeter. Most security breaches come from within the organisation, not outside.
The solution to all these problems: a services-oriented infrastructure (SOI). Recent developments like Web services, virtualisation, and model-based resource management have coalesced to support SOI.
The third inflection point relates to the emergence of an ‘edgeless’ IT environment.
Information is everywhere. Business processes flow across a global chain of partners, customers, and employees, and perimeter-centric thinking is inappropriate. Enabled by grid technology, for example, thousands of compute nodes share petabytes of scientific information as research labs and universities collaborate to solve the mysteries of our universe.
But technology must take into account that information moves and must be secure – whether accessed internally or over the Web, at rest or in motion. We must authenticate users wherever they are and limit access based on their roles. We must define secure management policies and apply them to information regardless of resource, platform, repository, or application.
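The role-based model described above can be reduced to a policy check that travels with the information's label rather than with the repository holding it. The roles, labels and policy table below are hypothetical, purely to make the idea concrete.

```python
# Minimal sketch of role-based access applied to labelled information
# wherever it lives. Roles, labels and the policy table are hypothetical.

POLICY = {
    "hr_manager": {"personnel_data", "general"},
    "engineer": {"general"},
}

def may_access(user, item):
    """Allow access only if the user's role covers the item's label,
    regardless of which repository or platform currently holds it."""
    return item["label"] in POLICY.get(user["role"], set())

# The same document keeps its label after migrating to an archive store.
doc = {"label": "personnel_data", "repository": "archive"}
allowed_hr = may_access({"role": "hr_manager"}, doc)
allowed_eng = may_access({"role": "engineer"}, doc)
```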
The final inflection point will change how information creates value.
We’re adept at creating information; we haven’t truly learned to leverage it. Some analysts say nearly 80 per cent of information that already exists is recreated, not reused.
Information is often tied to the application or process that created it, so sharing or repurposing poses a challenge. In a sense, information is imprisoned, bound by proprietary schema and storage-access methods.
Thus, customers miss opportunities: if they could easily search, access, and combine information, they could uncover new revenue sources and operational improvements. We see this idea evolving in our expanding investments in content management anchored by Documentum, Centera content-addressed storage, and collaboration.
These are exciting developments. Inflection points will make IT a seamless part of our lives while elevating its impact. We’ll manage information according to what it is, not where it sits. We’ll design infrastructures to provide service, not just capacity. Traditional data centre perimeters will cede to solutions that realise data moves. Tools will make information ‘self-describing,’ so users can manage it automatically, according to policy.
And we will access, share, mine, and analyse information based on automatic data classification captured in metadata, letting people and applications use data beyond the original content-creation purpose.
We are just beginning to translate data to information and information to knowledge.                             

Jeff Nick is SVP and CTO with EMC
www.emc.com

Mobile advertising

Mobile advertising is fast becoming one of the most profitable sub-sectors in the telecoms industry. Fuelling this growth is the popularity of advanced mobile services and the willingness of consumers to receive advertising that appeals to their tastes. David Knox examines the potential of this new marketing initiative and how charging and monitoring solutions can help mobile operators provide flexible and profitable ad campaigns while enhancing the consumer’s user experience

With the mobile phone now the most ubiquitous technology in the world, it was only a matter of time before advertisers caught on to the idea that a highly effective new way to reach a mass audience was not by radio, television or newspapers, but through a wireless handset. Mobile advertising has definitely become the latest 'it' word in the telecoms industry, with analysts predicting billions of pounds in revenue over the next five years. According to one recent report published by Informa, the mobile advertising market is going to be worth $871m this year, and will jump to $11.35bn in 2011.
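Taken at face value, Informa's figures imply a compound annual growth rate of roughly 90 per cent over the four years – a quick check of the arithmetic:

```python
# Implied compound annual growth rate from the Informa figures quoted above:
# $871m in 2007 growing to $11.35bn in 2011, i.e. over four years.
start, end, years = 871e6, 11.35e9, 4
cagr = (end / start) ** (1 / years) - 1  # roughly 0.90, i.e. ~90% a year
```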

Big in Japan
One of the first countries to embrace the concept of wireless advertising is Japan, which has the second largest advertising market in the world behind the US, and was the first country to exceed 50 per cent 3G penetration. According to some estimates, mobile advertising revenues for 2006 in Japan are expected to exceed $300 million and to double that figure by 2009 – higher than any other country in the world. Almost 60 per cent of Japanese consumers already use mobile coupons and discounts more than once a month.
Following in the footsteps of Japan is the wireless industry in Western Europe, where the idea of mobile advertising is also being explored by some of the most pioneering mobile operators around, such as Hutchison's 3G mobile group, 3. The operator is currently subsidising usage and phones through advertising on the phone, with these models also being offered through downloads, subscriptions, and video streams. It has also supported various mobile advertising campaigns in exchange for free content, such as the launch of the first ever movie video ad in Europe on behalf of Redbus, the film distributor, for the movie 'It's All Gone Pete Tong'. In partnership with 3, a banner on the carrier's wireless portal home page links to a microsite with more information on the film. At that site, subscribers have already requested more than 100,000 downloads of the movie trailer.
What is astounding about the access rate of the aforementioned movie clip is that it shows how the average response for mobile advertising can be up to 10 times greater than Internet response rates. This is because there are only a handful of links to choose from on a typical mobile web page. If one of them is an ad, it is more likely to get clicked than the same ad on the Internet.
Tackling the mobile ad market on an even bigger scale is wireless heavyweight Vodafone, which recently announced a strategic alliance with search engine Yahoo! to create an innovative mobile advertising business as a means to inject new revenue streams into both companies.
Through this partnership, Yahoo! will become Vodafone's exclusive display advertising partner in the UK. Yahoo! will use the latest technology to provide a variety of mobile advertising formats across Vodafone's content services. The initiative will be rolled out in the UK in the first half of 2007.
Under the plans, customers who agree to accept carefully targeted display advertisements will qualify for savings on certain Vodafone services. In other words, Vodafone subscribers would pay a lower rate for data services if they were prepared to accept ad banners from Yahoo! when signing up with the mobile provider.
This promotional deal would also extend to key Vodafone mobile assets including the Vodafone live! portal games, television and picture messaging services.
Vodafone and Yahoo!'s approach to tackling the mobile ad sector is a strategic one. It's all about understanding what each user wants from their mobile phone and providing a truly unique individual advertising experience. In a way, the mobile handset will be transformed into a wireless 'magazine' filled with all sorts of adverts, only it will be slicker – since the advertisements will be live, potentially interactive, and most importantly, targeting the tastes of the mobile user. It can also be based on user location at the time.

Matching user interests
This brings us to the subject of profiling, which is indisputably the most important aspect of mobile marketing and one of the key ingredients to success. In order for Vodafone, 3, or any other mobile operator to make any money, they must know the interests of their users so that the advertising campaigns directed their way actually appeal to them.
This is where charging and rating applications can help operators conduct effective ad campaigns. These solutions are all about control – allowing operators to implement sophisticated, value-based charging strategies that help to differentiate between innovative packages, bundles and promotions. Subsequently, VoluBill's 'Charge it' real-time data control and charging solution, for example, could be installed in a wireless network to detect subscribers and, thereafter, display or redirect them to advertisements that match their user profile if they have chosen to receive advertising.
Information about the brand tastes and interests of the wireless user would be gathered by the mobile carrier before the user agrees to subscribe to advertising. This information would then be stored in the network and be accessible to the Charge it solution to ensure that advertising is targeting the right audience. So, for example, if one mobile user is identified as a die-hard fashionista who loves to read Vogue magazine each month, then Charge it could access the profile information in real time and automatically redirect the consumer to an advert for the film 'Devil Wears Prada' when they next establish a data connection via the handset. If the consumer has previously received the ad for the film and has responded to the link, then Charge it can identify that the user has already accessed the advert and block it from being sent again. This helps ensure that the target audience doesn't get put off by seeing advertisements they have already responded to.
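The selection logic just described – match an advert to the subscriber's profile, but suppress anything they have already clicked – can be sketched in a few lines. To be clear, this is an illustration of the logic only, not VoluBill's actual API; the campaign and profile structures are invented.

```python
# Hypothetical sketch of profile-matched ad selection with blocking of
# already-clicked adverts, as described in the text. Not a real ad-serving
# API; campaign and profile fields are invented for illustration.

def select_ad(profile, already_clicked, campaigns):
    """Return the first campaign matching the user's interests that the
    user has not already responded to, or None."""
    for ad in campaigns:
        if ad["interest"] in profile["interests"] and ad["id"] not in already_clicked:
            return ad["id"]
    return None

profile = {"interests": {"fashion"}}
campaigns = [
    {"id": "devil-wears-prada", "interest": "fashion"},
    {"id": "rugby-final", "interest": "rugby"},
]

first = select_ad(profile, set(), campaigns)                    # served
second = select_ad(profile, {"devil-wears-prada"}, campaigns)   # suppressed
```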
This method of monitoring can also be used to track the success of a particular campaign and the revenue share it generates. If every fashionista who signed up for mobile advertising clicked on the link in the 'Devil Wears Prada' advert, for example, and also agreed to receive a free copy of the book promoted alongside the film, then Charge it would be able to provide the operator with details of how many and which subscribers saw the ad, and how many of them bought the book. Most importantly, Charge it could calculate the income generated from the advert – which would be divided among several parties, including the operator and its service providers, the advertising agency, the film distributor, the book publisher and, last but not least, the writer.
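In its simplest form, the revenue split among those parties is just a fractional division of campaign income. The sketch below is hypothetical – the party names, percentages and total are invented for illustration, not taken from any real commercial model:

```python
# Hypothetical revenue-share calculation for one ad campaign.

def split_revenue(total, shares):
    """Divide campaign income among parties by fractional share.
    `shares` maps party name -> fraction; fractions must sum to 1."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {party: round(total * frac, 2) for party, frac in shares.items()}

# Example: 10,000 (in any currency) of campaign income, split five ways.
payout = split_revenue(10000.0, {
    "operator": 0.30,
    "advertising_agency": 0.25,
    "film_distributor": 0.20,
    "book_publisher": 0.15,
    "writer": 0.10,
})
```

A real settlement system would of course also handle rounding residue, currency and per-event attribution, but the arithmetic at its core is no more than this.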

Extensive advantages
Indeed, what solutions like Charge it provide are extensive advantages to operators both in charging and managing the user experience, whether subscribers are surfing the Internet or downloading content-rich premium services. As mentioned before, charging and monitoring solutions are all about controlling the mobile advertising medium and ensuring its success.
Most, if not all, of the mobile content and adverts provided on the handset would be managed by search engine platforms such as Google's or, in the case of Vodafone, Yahoo!'s. As such, charging and monitoring solutions providers can provide the additional user profiling, traffic redirection and charging functionality required to offer a complete mobile advertising and charging solution.
Apart from accessing profile information, Charge it could also be configured to respond to other factors that would enable operators to redirect a subscriber's wireless Internet session to a specific advertising page. One of these is location identification. If a user went online to a website that didn't match their profile – a fashionista accessing rugby match details, say – then Charge it could be configured to track this activity and record it for future marketing purposes. The next time an advertising campaign concerning the sport was launched, that user would be sent the advertisement during their next Internet session. Again, this data would be tracked so that Charge it could redirect or block the adverts when necessary.
If users accepted the ads sent to them, then Charge it could also be used to help operators offer flexible price plans. This could include offering cheaper service rates to customers who accept ads, or granting them a free Internet usage allowance per day. Consumers could also potentially receive periodic adverts, like commercials on TV. The more frequent the adverts, the less they would pay for services.
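An ad-subsidised tariff of this kind could, in principle, be expressed as a simple per-advert discount on the data rate. The rates, discount step and floor below are invented purely for illustration:

```python
# Hypothetical ad-subsidised tariff: the more adverts a subscriber
# accepts per day, the lower their per-megabyte data rate.

def data_rate(base_rate_per_mb, ads_accepted_per_day,
              discount_per_ad=0.05, floor=0.2):
    """Apply a per-ad discount to the base rate; the charged rate never
    falls below `floor` (a fraction) of the base rate."""
    discount = min(ads_accepted_per_day * discount_per_ad, 1 - floor)
    return round(base_rate_per_mb * (1 - discount), 4)
```

With these invented numbers, a subscriber accepting four adverts a day would pay 20 per cent less per megabyte, and even a heavy ad-watcher would never drop below a fifth of the base rate – capturing the "more adverts, cheaper services" trade-off described above.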

Acceptance
Although widely ignored for many years as a possible money spinner, mobile advertising has finally been accepted by the business world as an innovation that can combine the wide reach of television with the precision of direct marketing and the tracking potential of the Internet. All of this adds up to serious revenue potential. Mobile advertisements also benefit all the participants in this new marketing game. While carriers get to boost their data and content revenue streams, advertisers gain a new and effective way of targeting a mass consumer audience. Meanwhile, mobile users get to benefit from great promotions and offers for the products and services they want to hear about, as well as cheaper mobile Internet usage.
Before mobile advertising and marketing can reach its full potential, however, certain technical and business requirements need to be met. These include putting in place the right charging and monitoring infrastructure, as well as building relationships with quality advertising partners. Once installed, and with the right commercial models in place, there is nothing to stop mobile advertising becoming perhaps the most successful sub-sector in global telecoms history.

David Knox is Product Marketing Director at VoluBill

Lead interview

Technology companies come and go, but some are blessed with the foresight to help drive the technological developments that permeate all our lives. One such company is Micron, whose COO, Mark Durcan, tells Lynd Morley why it has been so successful

Future gazers abound in our industry, and we're being promised a near-future of sensor networks and RFID tags that will control or facilitate everything from ordering the groceries, to personalised news projected into our homes or from our mobile phones. This stuff of science fiction, fast becoming science fact, is the visible, sexy end-result of the technology, but what about the guys working at the coal-face, actually producing the tools that enable the dreams to come true?
Micron Technology is one of the prime forces at that leading edge. Among the world's leading providers of advanced semiconductor solutions, Micron manufactures and markets DRAMs, NAND Flash memory, and CMOS image sensors, among other semiconductor components and memory modules for use in computing, consumer, networking and mobile products. And Mark Durcan, Micron's Chief Operating Officer, is confident that the company has been instrumental in helping the gradual realisation of the future gazers' predictions.
“I do think that we are, in many ways, creating the trends, because we've created the technology which enables them,” he comments. “I can give you two prime examples. The first is in the imaging space where, for many decades, charge-coupled devices (CCDs) were the technology of choice for capturing electronic images – mostly because the image quality associated with CCDs was much better than that of CMOS imagers, which are what Micron builds today.
“Nonetheless, we were strong believers that we could marry very advanced process technology, device design and circuit design techniques with the CMOS imager technology, and really create a platform that enabled a whole new range of applications. 
“I think we did that successfully,” he continues, “and the types of applications that were then enabled are really quite stunning. For instance, with CCDs you have to read all the bits out serially, so you can't capture images very quickly. With CMOS imagers you can catch thousands of images per second, which then opens the door to a whole new swathe of applications for the imagers – from very high speed cameras, to electronic shutters that allow you to capture a lot of images, and, by the way, you can do it using far less power. We have already made a major impact in providing image sensors to the notoriously power hungry cameraphone and mobile device based marketplaces, and in the space of two years have become the leading supplier of imaging solutions there. One in three cameraphones now has our sensors, and in only two years we have become the largest manufacturer of image sensors in unit terms worldwide. So now, for instance, the technology enables all sorts of security, medical, notebook and automotive applications – you can tune the imagers for a very high dynamic range, low light and low noise at high temperatures, which then enables them to operate in a wide variety of environments that CCDs can't function in.
“As a result, you can put imaging into a multitude of applications that were never possible before, and I think we really created that movement by creating the high quality sensors that drive those applications.”
The second example Durcan quotes is in the NAND memory arena. “What we've done is probably not apparent to everyone just yet, but, actually, I believe that we've broken Moore's law.
“We are now scaling in the NAND arena much faster than is assumed under Moore's law, and that has really changed the rate at which incremental memory can be used in different and new ways. As a result, I believe it will also pretty quickly change the way computers are architected with respect to memory distribution. So we're going to start seeing changes in what types of memory are used, and location in the memory system, and it's all being driven by a huge productivity growth, associated with NAND flash and the rate at which we're scaling it. We are scaling it faster than anyone else in the world now and we are also well tuned to the increasingly pushy demands of mobile communications, computing and image capture devices.”
The productivity growth Durcan alludes to has been particularly sharp for Micron over the past year. The formation of IM Flash – a joint venture with Intel – in January 2006 has seen the companies bringing online a state-of-the-art 300mm NAND fabrication facility in Virginia, while another 300mm facility in Utah is on track to be in production early next year. The venture also produces NAND through existing capacity at Micron's Idaho fabrication facility. And just to keep things even busier, the partners introduced last July the industry's first NAND flash memory samples built on 50 nanometre process technology. Both companies are now sampling 4 gigabit 50nm devices, with plans to produce a range of products, including multi-level cell NAND technology, starting next year. At the same time, Intel and Micron announced in November 2006 their intention to form a new joint venture in Singapore (where Micron has a long history of conducting business) that will add a fourth fabrication facility to their NAND manufacturing capability.
In June 2006, Micron also announced the completion of a merger transaction with memory card maker Lexar Media, a move that helped Micron expand from its existing business base into consumer products aimed at digital cameras, mobile computing and MP3 or portable video playing devices.
“Our merger with Lexar is interesting for a number of different reasons,” Durcan comments. “Certainly it brings us closer to the consumer, as, historically, our products tended to be sold through OEMs. But, in addition, it provides the ability to build much more of a memory system, as opposed to stand-alone products, given that Lexar delivers not only NAND memory, but also a NAND controller that manipulates the data in different ways and puts it in the right format for the system that you're entering. Working closely with Lexar, we want to ensure that this controller functionality is tied to the new technologies we want to adopt on the NAND front, making sure that they work well together, thus enabling more rapid introduction of new technologies and getting them to market more quickly.”
The considerable activity of the past twelve months clearly reflects Micron's view of itself as a company that is in the business of capturing, moving and storing data, and aiming for the top of the tree in each section. On the 'capturing' front, for instance, Durcan notes: “We've been very successful from a technology development perspective, and I think we're pretty much the unquestioned leader in the image quality and imaging technology arena. As mentioned, we also happen to be the world's biggest imaging company now – it happened more quickly than any of us thought it would, but it was driven by great technology. So we have plenty of challenges now in making sure that we optimise the opportunity we've created to develop new and more diversified applications.”

Stringent tests
Certainly, the company is willing to put its developments to the most stringent of tests. All of Micron's senior executives, including Durcan, recently drove four Micron off-road vehicles in an exceptionally rugged all-terrain race in California, the Baja 1000, digitally capturing and storing more than 140 hours of video from the race, using Micron's DigitalClarity image sensors and Lexar Professional CompactFlash memory cards specially outfitted for its vehicles. All the technology performed remarkably well, as did Micron's CEO Steve Appleton, who won the contest's Wide Open Baja Challenge class some 30 minutes ahead of the next closest competitor.
Appleton's energetic and non-risk-averse approach to both the Baja 1000 (in some ways the American version of the Paris Dakar Rally) and to life in general (he is reputed to have once crashed a plane during a stunt flight, but still proceeded with a keynote speech just a few days later) is reflected in an undoubted lack of stuffiness within Micron.
Certainly, the company has taken a certain level of risk in pioneering technology developments. RFID is a case in point. “Sometimes,” Durcan explains, “the technology was there, but the market was slow to develop. RFID is a good example. Today, Micron has the largest RFID patent portfolio in the world. We certainly developed a lot of the technology that is now incorporated in global RFID standards, but when we first developed it, the threat of terrorism, for instance, was less obvious, so we simply couldn't get these tags going that are now absolutely commonplace. I suppose you could say we've been a little ahead of our time.”
The company is also managed by a comparatively young executive team, with a very non-hierarchical approach to business. “I do believe that we have a certain mindset that keeps us pretty flexible,” Durcan explains, “and one of our strongest cards is that we have some really great people, with a great work ethic. At the same time, we drive a lot of decisions down into the company. We're probably less structured in our decision making than a lot of companies.
“So, we try to get the right people in the room (not necessarily in the room actually, but on the same phone line!) to make a decision about what is the right space to operate in, then we can turn it over to people who can work the details.
“We try to get to that right space, at a high level, through good communication and then drive it down. It is the opposite of what I believe can happen when companies grow, become compartmentalised, and tend to get more and more siloed.
“There is also very strong synergy between the different activities within Micron,” he continues. “In each case we're really leveraging advanced process technology, advanced testing technology, and large capital investments in large markets. There are a lot of things that are similar and they do all play closely with each other.”

International bunch
Micron's people are, in fact, a truly international bunch, recruited globally, and bringing a great diversity of skills and approaches to the company. “I think that we are one of the most global semiconductor companies in the world,” Durcan says, “despite being a relatively young company. We recently started manufacturing our sensors in Italy and have design centres in Europe, both in the UK and Norway, which are expanding their operations. In fact we are now manufacturing on most continents – except Africa and Antarctica – and we have design teams right around the world who work on a continuous 24hr cycle, handing designs from site to site. We've tried to grow a team that is very diverse, and leverage the whole globe as a source of locating the best talent we can.”
So, does all this talent produce its own crop of future gazers? Durcan believes they have their fair share.  “There certainly are people at Micron who are very good at seeing future applications. My personal capabilities are much more at the technology front end. I can see it in terms of 'we can take this crummy technology and really make it great'. Then I go out and talk to other people in the company who say 'that's fantastic, if we can do that, then we can...'. It really does take a marriage of the whole company, and a lot of intellectual horsepower.”
That horsepower has resulted in a remarkable number of patents for Micron. Durcan comments: “The volume and quality of new, innovative technology that Micron has been creating is captured by our patent portfolio. It's an amazing story, and something I'm really proud of. The point is, Micron is a pretty good-sized company, but we're not large by global standards – we're roughly 23,500 employees worldwide. Yet we are consistently in the top five patent issuers in the US.
“I feel the more important part of the patent story, however, is that when people go out and look at the quality of patent portfolios, they typically rank Micron as the highest quality patent portfolio in the world – bar none. I think that's pretty impressive and speaks volumes about the quality our customers benefit from.”

Lynd Morley is editor of European Communications