Features

The emergence of pre-integrated CRM and billing solutions is helping operators to provide not only the highest quality service possible, but also the most straightforward, and so, says Dominic Smith, helps to prevent customer churn

In today’s converged telecoms market, there is greater diversity than ever before in services, providers and pricing. For operators, managing an extensive array of network infrastructures and partners while, at the same time, developing a broad service set is an increasingly complex business. One of the greatest challenges in delivering simple but effective customer service is the sheer range of solutions on offer.

There is a vast array of packages and price schemes available in the mobile communications sector alone. Operators' portfolios typically include 2G GSM, SMS, MMS, GPRS, 3G and HSPA basic services, not to mention the range of value-added services, content and applications accessible on top. In addition, there may be slightly different versions of some of the services, depending on the user's handset capabilities, and these in turn can mean alternative pricing plans.
Moreover, in all sectors of the market, operators have to maintain and bill for a broad array of legacy services. And they need to be able to tailor their offerings to meet the specific needs of a wide variety of market segments - from large enterprises to individual consumers.
The challenge for operators is to simplify this often convoluted tangle of different service types. After all, looking beyond the headline prices, most end-users are interested in three key elements of the operator's service.
First, they want high quality and easy-to-use services.  Second, they need clear and accurate billing. Finally, if they do encounter difficulties, they want a straightforward and reliable way of sorting them out.
This means access to an efficient single point of contact, with the necessary knowledge and expertise to resolve any issues quickly. From the operator's point of view, the most effective way of delivering this type of simple, effective service is through the provision of pre-integrated CRM and billing solutions which customer-facing staff can use to support the interaction process.
By implementing high-quality CRM and billing systems and integrating them with their surrounding network infrastructure, operators have all the information they need in one place, ensuring they can provide a consistent, responsive and professional service to customers. They also have the reassurance of knowing that no data will be lost between systems, that updated information is available instantly to other users and that their systems provide a joined-up integrated 360° view of their customers.
The best of this new breed of fully convergent CRM and billing systems also give operators a bi-directional view of their customer management activities. In other words, they enable them to manage the dataset from the point of view both of the customer and the business itself. They can, for example, highlight not only the types of complaints made by one individual customer but also the total number received by the organisation as a whole, over a given period.
To deliver this functionality, it is critical that operators have not only a tightly integrated component set but also a single unified database, which links to these components. This optimised technical infrastructure enables them to examine all of the system data associated with a given customer and acts as the foundation for simple but proactive customer service.
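To make the idea concrete, here is a deliberately simplified sketch - the data model and field names are invented for illustration, not drawn from any particular CRM or billing product - of how a single, unified store of complaint records can be read from both directions, by individual customer and by the business as a whole:

from collections import defaultdict

# Illustrative only: one unified complaints store, queried from both directions.
complaints = [
    {"customer_id": "C001", "category": "billing", "month": "2007-01"},
    {"customer_id": "C001", "category": "coverage", "month": "2007-02"},
    {"customer_id": "C042", "category": "billing", "month": "2007-02"},
]

def complaints_for_customer(customer_id):
    """Customer view: every complaint raised by one individual."""
    return [c for c in complaints if c["customer_id"] == customer_id]

def complaints_by_category(months):
    """Business view: total complaints received by the organisation, by category."""
    totals = defaultdict(int)
    for c in complaints:
        if c["month"] in months:
            totals[c["category"]] += 1
    return dict(totals)

print(complaints_for_customer("C001"))
print(complaints_by_category({"2007-01", "2007-02"}))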
At one level, it allows operators to escalate events. As part of most corporate service level agreements, they can define exactly how long they have to action certain events. If this time limit is exceeded, an alarm is triggered, notifying both operator and end customer of an infringement. Receiving this type of information in advance of a customer complaint helps the operator to deliver proactive service to end users, by acknowledging a fault on their system but at the same time reassuring them that the problem is being addressed and rectified.
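By way of a purely illustrative sketch - the event types, time limits and notification hooks below are invented rather than taken from any operator's SLA or product - the escalation rule described above might be expressed along these lines:

from datetime import datetime, timedelta

# Hypothetical SLA limits per event type, as agreed in a corporate service level agreement.
SLA_LIMITS = {
    "fault_report": timedelta(hours=4),
    "provisioning_request": timedelta(hours=24),
}

def check_escalations(open_events, now, notify):
    """Trigger an alarm for any open event that has exceeded its SLA time limit."""
    for event in open_events:
        limit = SLA_LIMITS.get(event["type"])
        if limit and now - event["opened_at"] > limit:
            notify(team="operations", event=event)      # internal alarm
            notify(team="customer_care", event=event)   # proactive message to the end customer

# Example run with in-memory data:
events = [{"type": "fault_report", "id": 101, "opened_at": datetime(2007, 3, 1, 8, 0)}]
check_escalations(events, datetime(2007, 3, 1, 14, 30),
                  notify=lambda team, event: print(team, event["id"]))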
At another level, the technology enables the operator to define its own business rules about how it sells services. In particular, by providing a stable and robust platform, it allows the operator to configure processes and procedures that build on the functionality of the systems infrastructure and ensure that customers receive a consistently high-quality service. 
Currently, few UK businesses are collating or leveraging their customer data as effectively as this. According to the findings of UK CRM consultancy firm, Detica, only 13 per cent of companies can be categorised as leaders when it comes to collating customer data and subsequently using it to improve customer relationships, while 40 per cent are categorised as 'strugglers', having very little good data and limited ability to exploit what they do have.
The results of the survey clearly illustrate the challenges businesses face in achieving a good understanding of their customers and using the insights gained effectively across the enterprise as a whole.
To be successful in building customer loyalty, operators need to focus more clearly on the customer. With the help of closely integrated CRM and billing solutions they can manage the relationship from a single entry point.  This means ensuring that customer-facing staff have all relevant data about a given customer instantly available as and when required.
There is a multitude of information to be gathered: typically including details of previous interactions with the customer, payment histories, information about average monthly spend, the types of services previously used and so on. Critically, having access to this kind of intelligence should enable the operator to deliver 'right first time' customer service.
Ease-of-use is also important. Any such systems need to be intuitive. When they purchase a bundle of services from an operator, customers must be sure that each service is activated on the network in a logical order and that follow-on services are provisioned, as and when required. The operator's customer service representative (CSR) should also be able to view the status of these services at any one time.
In addition, particularly in today's fast-changing telecoms environment, chosen systems need to be flexible enough to manage a broad spectrum of technologies from PSTN to 3G and from ISDN to ADSL. This is critical for operators who are migrating into new technology sectors either through business growth, merger or acquisition.
Today, with an ever-growing number of players in the market, it is increasingly easy for users to switch from one operator to another. In order to prevent churn, therefore, it is vital that operators provide not only the highest quality service possible but also the most straightforward.
The emergence of pre-integrated CRM and billing solutions is helping to achieve this. Equally, with the onward march of convergence and the consequent desire of many of the larger players to try to be 'all things to all people', these solutions can play a key role in enabling alternative operators to offer a bundle of services tailored to a specific market segment.
Operators can do very little about the increasing complexity of the telecoms marketplace. However, what they can and increasingly will need to do, in order to maintain their competitive edge, is to mask this complexity from the customer. Operators will need to ensure that their multi-play service offerings are straightforward to use, that billing is accurate and that the latest customer information is available to all service staff.  Pre-integrated CRM and billing solutions offer an excellent means of enabling them to transform this vision into a reality.

Dominic Smith is Marketing Director, Cerillion Technologies

Searching out the reality behind the hype, Ray Dogra points out that IPTV looks likely to generate significant revenue within its first three years of service, while cautious optimism surrounds its long-term potential

More than half – some 60 per cent – of communications industry executives believe that Internet Protocol Television (IPTV) can generate significant revenue within the first three years of service, according to findings of a survey recently released by Accenture and the Economist Intelligence Unit (EIU). The survey of nearly 350 executives from telecommunications, broadcasting and media companies across 46 countries in the Americas, Europe and Asia revealed industry-wide confidence in the longer-term outlook for IPTV.

IPTV REVENUES - A question of confidence

However, confidence in the short-term outlook remains mixed, with slightly more than half of respondents saying they are not confident in the ability of IPTV to generate significant revenues within the next 12 months.  On the other hand, one-fifth of respondents said they are confident or very confident, and more than one-quarter said they are somewhat to fairly confident, that IPTV will generate significant revenues within 12 months.
When asked what they believed would be the principal revenue source for IPTV, about half of the industry executives surveyed selected advertising. However, network operators, as a subset of all respondents — which included equipment vendors, consumer electronic companies, content providers and broadcasters/studios — disagreed, with three-quarters saying they believe that subscription fees for premium content will provide the largest recurring revenue stream, followed by basic content subscription fees and then advertising fees.
Further, when asked their reasons for pursuing the IPTV market, the greatest number of respondents cited new revenue streams, followed by acquiring new customers and increasing sales of broadband access connections.
Overwhelmingly, executives believe that discounted pricing through service bundling will be the primary motivation behind consumer spending.  Nearly two-thirds of all respondents — and three-quarters of network operators surveyed — said they believe that discounted service bundles provide the greatest enticement to buy IPTV.  The ability to move content between devices was also cited as an important enticement, selected by some 38 per cent of respondents, as was the convenience of a single bill for multiple services.
Yet there are obstacles to IPTV adoption. One-quarter of respondents said that the primary short-term obstacle to IPTV adoption is a quality-of-service issue relating to unproven architectures, low bandwidth and other technology issues.  The same number said they believe that quality-of-service issues will be resolved over the next three years, leaving stiffer competition from alternative TV providers as the toughest challenge to the adoption of IPTV.  Another challenge to IPTV adoption, cited by respondents, is high subscription fees due to the high cost of network access and equipment.
Lastly, when asked which types of companies are most likely to generate revenue from IPTV, the vast majority of respondents selected content providers, followed by telecommunications providers.  Not surprisingly, more than two-thirds of respondents said that traditional broadcasters have the least to gain from IPTV, a view held strongly by respondents across all company types, including broadcasters themselves.
IPTV will fundamentally alter the traditional television advertising model. Because the set-top box will become essentially a database of viewing and purchasing activities, IPTV will unlock opportunities for more personalised advertising—including one-to-one, targeted offers and interactive ads. Advertisers can identify the commercials that customers are watching, and this data can be used to improve the efficiency of marketing campaigns.
That functionality will mean that the value of advertising will increase for everyone in the IPTV value chain. Consumers are more likely to pay attention to an offer targeted to their expressed interests (and they will be able to opt in or out, easing the concerns of privacy advocates), and advertisers are more likely to pay for it. Even small businesses—a pharmacy, a dry cleaner, a florist—might be willing to pay for an IPTV ad if they knew it would be targeted only to people living within a certain distance from the store.
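A toy example shows how simple such distance-based targeting could be in principle. The coordinates, radius and great-circle calculation below are assumptions made for illustration, not a description of any deployed IPTV advertising platform:

import math

# Only households within the advertiser's chosen radius receive the local spot.
def within_radius(home, store, radius_km):
    """Great-circle distance check between two (latitude, longitude) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*home, *store))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a)) <= radius_km

print(within_radius(home=(51.51, -0.13), store=(51.50, -0.12), radius_km=3))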
When it comes to IPTV, it is true that “content is king.” The experiences of providers both in Europe and the United States have made quite clear that the “killer application” for IPTV solutions is the programming content. Acquiring and processing that content are not skills that a telecommunications provider necessarily possesses. Hence, providers must decide how to get those skills. One option is a joint venture model in which the IPTV provider teams with cable and satellite operators to acquire wholesale premium content. A second option is a sales package model where a company forms a direct content acquisition relationship with film studios and other content providers.
In either case, the cost of content will be substantial for IPTV platforms. Accenture's business case analysis indicates that content acquisition will become the most significant component in the cost/revenue model, reaching more than 40 per cent of costs by the fifth year of operation.
Stability of the service is the most important ingredient of IPTV success; and that means stability of the platform and architecture itself. If the IPTV service is unstable, high customer churn will result, and operators may end up with a customer base where churn negates their customer acquisition efforts.
A truly comprehensive IPTV solution encompasses the systems, video infrastructure and network elements required for an end-to-end solution, as well as definition of the processes to operationalise the video services being offered. The most important success factor, in Accenture's experience, is creating a stable and scalable IPTV service over a broadband multi-service platform. This has proved to be challenging to almost every operator. Monitoring and measuring quality of an IPTV feed, for example, is difficult. Network experts know how to measure network quality, but that does not necessarily translate into measures of what a customer perceives on a TV screen. If a customer has video-on-demand and then complains about the picture quality during a movie, knowing where the problem occurred is vital to deciding whether to issue a refund to the customer.
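The fault-localisation problem can be pictured with a small sketch. The delivery segments, thresholds and measurements below are illustrative assumptions rather than Accenture's methodology, but they show the shape of the decision an operator faces when a video-on-demand complaint arrives:

# Walk the delivery path for a session and report the first segment whose packet
# loss breached the threshold; use the answer to guide the refund decision.
SEGMENTS = ["head_end", "core", "dslam", "home_network"]

def locate_fault(session_measurements, loss_threshold=0.01):
    for segment in SEGMENTS:
        if session_measurements.get(segment, {}).get("packet_loss", 0.0) > loss_threshold:
            return segment
    return None

session = {"core": {"packet_loss": 0.002}, "dslam": {"packet_loss": 0.03}}
fault = locate_fault(session)
refund_due = fault in ("head_end", "core", "dslam")   # fault inside the operator's network
print(fault, refund_due)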
In general, 'cautious optimism' is the most appropriate summary of executive sentiment today about the future of IPTV. In Accenture's view, a degree of caution and careful planning is appropriate. A number of issues remain when it comes to creating a stable and scalable IPTV service over a broadband multi-service platform. These challenges can be addressed, but players in the IPTV industry will need to consider a broad range of technical, content and customer service factors as they proceed.
In short, providers that miss the mark on quality may not get a second chance in a competitive market. The experiences from providers both in the UK and the US have made abundantly clear that the TV service itself must work as well as consumers' existing level of service. No 'me too' TV service will succeed—regardless of its interactivity—unless it functions as well as or better than existing TV services.
The business case for IPTV remains attractive in the long term, yet certainly not simple to create or realise. For now, IPTV will probably be simultaneously the most complex service a provider has, and also the one with the lowest gross margin. Success with IPTV means remaining sceptical about the hype cycle of IPTV, and the 'bells and whistles' of interactive services. Being visionary in long-term thinking but practical in the short term will be key to achieving high performance through IPTV.

Ray Dogra, Accenture
www.accenture.com/iptvmonitor3

Have you got a dirty data problem? Too embarrassed to talk about it in public? Is it sapping your business vitality? Well you're not alone, and the good news is that 2007 has been widely touted as the year when the telecoms industry finally starts to sluice away its data blockages. So, without getting too anal about inconsistent data semantics and all the other symptoms that you may be experiencing, the message is that 'new-age' therapies are available to help you deal with these problems quickly, effectively and reliably, enabling you to become the business you've always wanted to be.  Paul Hollingsworth looks at the mismatch between business and technical issues and the challenge of continual data and business transformation

The sad fact is that even mentioning data migration or data management issues is a sure-fire way of getting most business-level executives to instantly turn off and stop reading. It’s ‘dull as ditch water’, to use a common British idiom. But just before you give up on this article, let’s go back to that water analogy. You might not be all that interested in your home plumbing either, but it has a habit of grabbing your attention when the sink is backing up or when there’s no water coming out of the tap, doesn’t it? At this point the solution is usually very expensive and involves an overpaid plumber simultaneously tutting and sniggering over his bill. And if you’ve ever been in a plumbing-centred crisis like this, you’ll know just how stupid and out of control you feel. Which brings me back to data.

DATA TRANSFORMATION - Telecoms colonic tonic

Business managers hate the idea of data migrations primarily because of fear of the unknown and that horrible out-of-control feeling they get every time they have to think about them. Having signed off a project they have absolutely no certainty over when that project – if ever – will deliver real, hard business benefits. They also know it'll almost certainly overrun on cost or time, or both. And the really scary thought is that this could be the one in every five projects that's doomed to fail completely, having burnt through millions of dollars in the process.
Statistics from the likes of the Standish Group, which has made a business from analysing IT project failure, make grim reading. In 1994 when Standish first began collecting data, around 84 per cent of IT projects were deemed to have failed in some aspect. Ten years later the figure had improved, but still stood at 71 per cent.
Here in telco-land, I regularly come across architects who say that they can't remember a single successful data migration. Which is rather worrying given that in 2007 we are standing on a fundamental pivot point in our industry. Behind us lie the halcyon days of relative tranquillity, when change was relatively slow and profit margins plentiful, while ahead of us lies the uncertain future of cut-throat competition, with increasingly complex services delivered at increasingly fast speeds. Transforming from a tier one telco to a strong, next-generation service provider is no mean feat. It involves transforming not only business norms and organisational structures, but also IT systems, architectures and data. It means learning a whole new content-driven, media-rich language, and requires companies to get fitter, leaner and more agile (both on an operational and technological level).
On a business level, we can articulate what is required to respond to the change drivers our industry is facing. For example, we need to:
•    cut costs and improve revenue management
•    innovate and get products to market more quickly
•    retain customers
•    increase or at least stabilise ARPUs
•    comply with regulation and legislation
•    use business information more effectively.   
But this is where the gap between business and IT becomes so very obvious, because translating business goals into technical strategies requires getting to grips with 'issues' that we have been putting off for a long time, such as consolidating and upgrading supporting systems, migrating data, transforming architectures and so on.
Surveys undertaken by UK-based Kognitio have highlighted that 57 per cent of respondents thought that decision makers did not have the information they needed to run their business optimally. The issues identified were the disparity of data, the sheer volume of data, the scale of the task to access the required data, unavailability of data and speed of access. Another survey unveiled in January found that 56 per cent of decision makers in 100 companies polled admitted that they had been discouraged from undertaking data migration projects because of the risk, cost, time and resource needed for such projects.
But in telco-land we now have a new reason to pay attention in the form of agile, lean new entrants that are extremely adept at using data to analyse and improve their business and their offer to customers – such as Google-coms and Walmart-phone. It's easy to underestimate how such companies can transform the market. Look, for example, at the telecoms arm of UK retailer Tesco, which in less than two years became the third largest prepaid mobile phone company in the UK. It acquired around half a million customers in its first twelve months, and has since added more than a million to this. And that's in a saturated mobile market where customers have to be prised away from competitors. Tesco Mobile states that it has done this by providing better customer service and understanding their needs – all of which requires good, accessible customer data. 
Of course you might take the line that fighting the retail fight is just not for you, and you're quite happy just to provide the pipework and go wholly wholesale. But don't get too complacent. Even then your partners are still going to demand data from you, and to be a successful value-added wholesaler you're going to have to get much better at providing it. You're also going to need to get better at managing your cost base, which also requires better data management. And to get from where you are to where you want to be probably means some form of data migration.
There's just one other little factor you mustn't forget. The industry formerly known as telecoms is no longer a market that will stand still or even change at a leisurely rate. So whatever solutions, systems or architectures you come up with absolutely have to accommodate the need for change. Otherwise you risk entering a never-ending project hell, whereby even if you deliver your clearly defined, properly scoped project on time and to budget, in the meanwhile the requirements have changed and the new solution cannot cope with them or support unknown and unforeseen ongoing change.
So before you throw up your hands in despair, remember that some data migrations – even very complex ones – are delivered on time, to budget and with a low risk profile. One feature of such projects is that they have business and senior executive buy-in. They combine business and technological strategies and deliver against both. To help you become one of the winners, here are five uncomfortable questions we suggest you should consider:
•    Will your data migration project deliver on time and at budgeted cost?
•    Are you comfortable with the risk profile you are assuming?
•    What happens if it doesn't work?
•    How much is it costing your organisation for every day of delay?
•    How flexible is your new architecture? Will it be able to respond to inevitable change?

Paul Hollingsworth is Director of Product Marketing at Celona Technologies

IPTV technology is moving out of the laboratories and into the commercial world. Successful IPTV deployments, however, remain elusive, say Rajeev Tankha and Dr Graham Carey

As network technologies and related integration techniques mature and improve, leading communications companies are now increasingly focusing on the operational aspects of IPTV services as key factors in an effective commercial launch and operational differentiation. Over the last two to three decades telecommunication service providers have built a customer expectation of service excellence, reliability and “carrier grade” service availability. The challenge is to meet these entrenched consumer expectations while containing IPTV-related operating expenses.  Let us look more closely at the challenges for IPTV service deployment and service provisioning.

Consumer expectations: IPTV consumers are intolerant of service glitches; services must be launched with consumer friendly operations that consistently deliver the desired level of service to the customer.
Extensive operational changes: IPTV deployment requires pervasive changes to the tools, structures, staffing, training, measurement and reward systems used to manage telephony and HSI services.
Risk of “trial and error” approaches: Developing IPTV operations from a blank piece of paper can incur unacceptable delays, risk and costs through trial and error testing and iteration. These methods usually fall short of required operational performance, severely limiting scalability, delaying commercial launch, and creating excessive operating expenses.  Most significantly there is, potentially, a major risk of affecting the service provider's brand with the launch of a poor quality service.
To date, technology factors have masked operational issues: IPTV operational challenges are often entwined with – and masked by – better known networking challenges. Many service providers experience unexpected difficulties with service provisioning and consumer complaints about service quality and reliability, with overwhelming associated help desk and repair costs.
The affected processes include:
•    Service provisioning (order to installation)
•    Service assurance (preventive and corrective)
•    Network assurance and network change management
•    Video head end management and content management.

Service provisioning challenges
The service provisioning process must manage four interdependent streams of activity:
•    Change-out of telephony feature set and pricing
•    Loop re-arrangements and conditioning
•    Removal and re-build of broadband service and provisioning
•    Activation and installation of home network and IPTV applications
Further, this must be accomplished without unacceptable disruption to any of the consumer's existing telephony or broadband services.
The end-to-end process will most likely span a number of different business units whose procedures must be adapted to accommodate IPTV and Triple Play operations. These systems and procedures will not usually have been integrated into a reliable end-to-end IPTV service provisioning process. Sadly, with limited visibility of the end-to-end process, failure is often not detected until downstream activities are visibly impacted, and often the consumer is already aware that the process has failed in some way.
In the absence of effective operational practices, studies to date have shown that order fallout rates can exceed 50 per cent; up to 30 per cent of installations may require a physical attendance of an engineer to complete the installation. Even when the installation appears to be completed, the service provider may receive a higher volume of help desk calls within the first thirty days after the installation.

IPTV service provisioning
In order to minimise the provisioning challenges of deploying an IPTV service, telecommunications service providers need a strategy that encompasses the full range of methodologies, templates and tools specifically tailored to the consumer and business needs of IPTV services.
This strategy must therefore integrate several key elements:
•    Clear understanding of requirements – IPTV service providers need a clear definition of the operational targets for IPTV in order to organise and execute development activities toward those targets.
•    Well defined processes – IPTV service provisioning process flows, error checks and related process measures should be assembled into a centralised workflow management tool; a simple flow of this kind is sketched after this list.
•    Operational trial and test frameworks – Once processes are developed, coordinated testing of the operational process including multiple error conditions will enable the operator to assess readiness for market trial and launch.
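As an indication of what 'assembled into a centralised workflow management tool' might mean in practice, the fragment below strings the four activity streams listed earlier into a single flow with an explicit error check at each step. The step names and result structure are invented for illustration:

# Hypothetical provisioning flow: fallout is detected at the failing step,
# not downstream once the consumer has already noticed the problem.
def provision_iptv(order, execute_step):
    steps = [
        "change_telephony_features",
        "rearrange_and_condition_loop",
        "rebuild_broadband_service",
        "activate_home_network_and_iptv",
    ]
    completed = []
    for step in steps:
        ok, detail = execute_step(step, order)
        if not ok:
            return {"status": "fallout", "failed_step": step,
                    "detail": detail, "completed": completed}
        completed.append(step)
    return {"status": "installed", "completed": completed}

# Example run with a stub that always succeeds:
print(provision_iptv({"order_id": 1}, lambda step, order: (True, "ok")))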

Operational elements
In addition to the common issues involved in network convergence, we have identified five unique and important areas in IPTV service fulfilment that are critical to successful IPTV deployment:
•    Service Provisioning & Verification – Creation of detailed process modelling to enable the effective management of key IPTV processes including:
–    Successful collection of all required customer information
–    Order creation, configuration and provisioning of the customer IPTV service
–    Customer site survey, service verification, and troubleshooting techniques
•    Customer Trouble Resolution – Management of the collection of detailed trouble information for the categorisation and disposition of all customer trouble tickets, in order to:
–    Reduce call holding time
–    Decrease trouble ticket resolution times
–    Reduce repeat dispatch of technicians
–    Reduce repeat customer trouble calls
•    Content Management – Managing content for IPTV services including:
–    Reconciling and integrating IPTV video service billings and content charges from content providers
–    Video monitoring to ensure billable content availability
–    Managing content provider contracts
–    Managing all intellectual property issues associated with the content such as royalty payments
–    Producing partner settlement invoices across the new IPTV value chain including advertising, sponsorships and promotion deals
–    Plus many other functions new to communications providers
•    Head End Management – Encompassing the design and management of these and other IPTV processes:
–    Channel line-up correlation and frequent additions/changes
–    Simultaneous substitution
–    Closed captioning
–    Daylight saving time changes
•    Change Management – Enabling IPTV-specific functionality to manage multiple changes in:
–    Underlying infrastructure (e.g. video middleware, DSLAMs to IPTV DSLAMs, modems, gateways, central office wiring, head-end components, servers and databases)
–    Video and audio content (channel changes, program reception and encoder configurations)
–    Service pricing, service packaging and portfolio service up and cross sell
–    Web content and upgrades, and much more

 A strategic, planned approach to IPTV deployment can enable IPTV service providers to implement efficient IPTV service provisioning while reducing the risks, costs and delays of developing “from scratch.”
A strategic process provides a foundation for continued operations design and optimisation, which will further reduce design cost and time. This helps ensure that key evolving IPTV provisioning requirements are captured, and increases the success rate on installations.
This strategy also helps create a consumer friendly IPTV service provisioning process that can be tuned and augmented as volumes grow without throw-away investment.  The use of a repeatable, controlled process improves order completion, while reducing order fallout and re-work and generating positive customer experiences. With all the above the start-up costs may be contained, and scalability enhancements can be phased in as required over time with service growth.
Of all the factors affecting the success of a new service, maintaining customer satisfaction and service quality is perhaps the most important of all.

Rajeev Tankha is Director of Product Marketing, and Dr Graham Carey is Director, Industry Solutions, Oracle Communications

Jeff Nick examines the concept of ‘inflection points’ and their application to information processes

Every few decades, a fundamental shift occurs in how IT reaches customers. The shifts are akin to what Intel co-founder Andy Grove calls ‘strategic inflection points’, or the ‘time in a life of a business when its fundamentals are about to change.’

We’ve seen bandwidth, information growth, and infrastructure complexity explode in this decade as applications became more integrated and dynamic. This has forced the IT industry to address inflection points centred on four areas.
The first inflection point reflects the changing nature of information management.
Today, most information lifecycle management (ILM) approaches are volume based: ‘Buckets’ of information move, based on coarse descriptions of where the information currently sits, how big it is, who owns it, how old it is, etc. Customers already gain value by automatically protecting information and optimising where it’s stored.
But ILM is evolving. Tools can now automatically classify information and apply ‘metadata’ labels to it based on its content. The idea is simple: scan information and use what you learn to enable automation, as the examples and the short sketch below illustrate.
• “No one’s accessing this part of the database. Let’s automatically move it to a lower service level.”
• “This e-mail says ‘Confidential.’ We’ll automatically flag it for retention.”
• “This spreadsheet may contain personnel information. We’ll automatically ensure it’s secured.” 
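A rule-based sketch makes the point. The patterns, labels and thresholds here are invented examples of content-driven classification, not the workings of any particular ILM product:

import re

RULES = [
    (re.compile(r"\bconfidential\b", re.I), {"retention": "hold"}),
    (re.compile(r"\b(salary|personnel|employee id)\b", re.I), {"security": "restricted"}),
]

def classify(item):
    """Scan an item's content and attach metadata labels that can drive automation."""
    metadata = {}
    for pattern, labels in RULES:
        if pattern.search(item["content"]):
            metadata.update(labels)
    if item.get("days_since_last_access", 0) > 180:
        metadata["tier"] = "lower_service_level"   # rarely accessed data moves down a tier
    return metadata

print(classify({"content": "Confidential: employee id 4471 salary review",
                "days_since_last_access": 200}))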
The second inflection point ties into configuring IT assets for greater business value.
Typical data centres house many disparate technology ‘stacks’: applications running on specific operating systems, hosted on specific servers, using specific networks to connect to specific storage devices. Each deployment may have little or no relation to others, amplifying complexity and promoting inefficient use of resources.

Significant problems
This approach causes two significant problems:
• General-purpose computers become a dumping ground for an amalgam of different software stacks and applications. Each application has little or no relation to other applications co-hosted on the same platform and has its own patterns of resource consumption (CPU, memory, storage etc). The unpredictable resource usage results in an extremely complex operational environment and customers find themselves with a scale-out of under-utilised resources. Management simplification conflicts with the desire to optimise resource utilisation.
• The propensity to deliver IT as general-purpose parts translates into a mismatch between the way that IT is delivered by vendors to customer IT organisations and the way those organisations need to deliver IT functions to their end users.
Virtualisation technologies, such as VMware, can help with both these problems significantly. Rather than mixing application workloads onto one operating system platform, VMware allows applications to be deployed into separate, flexible, virtual server containers, logically separate, while physically co-located.

Turn-key capabilities
There is a marked trend, mostly from start-ups, to deliver targeted niche turn-key capabilities in a network-centric appliance model, such as Web service gateways, encryption engines, network traffic monitors etc. These functional components, by their modular design, are self-contained, deploy non-invasively into existing configurations, optimise resource utilisation to the function being provided and are simple to manage due to their built-for-purpose limited configuration options.
Again, virtualisation technologies will increasingly shine, as virtual containers are a natural deployment vehicle for functional virtual appliances as well as traditional software stacks.
Guaranteed delivery of IT capabilities in support of service (i.e. service-level agreements) is extremely difficult to provide with any level of confidence in today’s working environment. This is due to the complex task of translating abstract business service-level objectives into concrete, actionable, resource management policies. Objectives for availability, performance, security, compliance and other dimensions of IT quality of service (QoS) are achieved by IT administrators largely through trial and error. Once some level of QoS is achieved, IT organisations are reluctant to introduce change to existing configurations. This puts IT groups directly in conflict with their primary objective: providing service to the customer business by remaining responsive to ever-changing demands and unpredicted business growth opportunities. Today, stability is achieved through wiring static configurations at the expense of dynamic flexibility.
This primary pain point in alignment between the business and the supporting IT environment is the result of some underlying fundamental problems.
First, there has been a general lack of agreement on how to express the capabilities of IT resources in a consistent manner. The disparities of implementation between vendors of similar computer technology elements are all too painfully evident to IT administrators. However, there have been increasing efforts in standards bodies to model resource types for management plug-ability. There has not yet been, though, convergence on the modelling framework across these different resource domains.
A further problem is that much emphasis to date has been placed on modelling ‘things’ (resources) rather than the ‘use of things’ (functions). As a result, significant effort has been placed on modelling every knob and dial of every resource in every resource class. While this is important for plug-ability of resources, it  does not close the gap between service-level management and resource management. What is fundamentally required is a focus on modelling the profiles for interactions with resources in the context of a given management discipline, such as availability or performance management. This would allow for direct translation from service-level objectives to resource policies specifying constraints on acceptable settings for only the resource dials and knobs associated with that particular management discipline.
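A compressed, assumed example of what such a discipline-scoped profile could look like: a service-level objective for performance is translated into constraints on only the performance-related settings, leaving every other knob and dial alone. The tiers and limits are invented for illustration:

# Performance discipline only: allowed settings derived from the service-level objective.
PERFORMANCE_PROFILE = {
    "gold":   {"storage_tier": {"ssd", "fc"},  "max_io_latency_ms": 5},
    "silver": {"storage_tier": {"fc", "sata"}, "max_io_latency_ms": 20},
}

def policy_for_objective(service_level):
    """Translate an abstract objective into a concrete resource-policy constraint set."""
    return PERFORMANCE_PROFILE[service_level]

print(policy_for_objective("gold"))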

Serious security issues
A lack of seamless security policy and enforcement across application, information and resource domains also causes serious security issues. Once information is retrieved, for example, from the authoritative data store and is processed in another application, stored in another repository or migrated to another resource, security context is lost.
Further, given the lack of security integration across vendors and types of IT assets, customers seek to protect themselves by building a security ‘castle’, hoping to protect their soft IT infrastructure within the castle walls. This approach, however, does not provide the necessary security policy protections to the business once inside the perimeter. Most security breaches come from within the organisation, not outside.
The solution to all these problems: a services-oriented infrastructure (SOI). Recent developments like Web services, virtualisation, and model-based resource management have coalesced to support SOI.
The third inflection point relates to the emergence of an ‘edgeless’ IT environment.
Information is everywhere. Business processes flow across a global chain of partners, customers, and employees, and perimeter-centric thinking is inappropriate. Enabled by grid technology, for example, thousands of compute nodes share petabytes of scientific information as research labs and universities collaborate to solve the mysteries of our universe.
But technology must take into account that information moves and must be secure – whether accessed internally or over the Web, at rest or in motion. We must authenticate users wherever they are and limit access based on their roles. We must define secure management policies and apply them to information regardless of resource, platform, repository, or application.
The final inflection point will change how information creates value.
We’re adept at creating information; we haven’t truly learned to leverage it. Some analysts say nearly 80 per cent of information that already exists is recreated, not reused.
Information is often tied to the application or process that created it, so sharing or repurposing poses a challenge. In a sense, information is imprisoned, bound by proprietary schema and storage-access methods.
Thus, customers miss opportunities: if they could easily search, access, and combine information, they could uncover new revenue sources and operational improvements. We see this idea evolving in our expanding investments in content management anchored by Documentum, Centera content-addressed storage, and collaboration.
These are exciting developments. Inflection points will make IT a seamless part of our lives while elevating its impact. We’ll manage information according to what it is, not where it sits. We’ll design infrastructures to provide service, not just capacity. Traditional data centre perimeters will cede to solutions that recognise that data moves. Tools will make information ‘self-describing,’ so users can manage it automatically, according to policy.
And we will access, share, mine, and analyse information based on automatic data classification captured in metadata, letting people and applications use data beyond the original content-creation purpose.
We are just beginning to translate data to information and information to knowledge.                             

Jeff Nick is SVP and CTO with EMC
www.emc.com

Mobile advertising is fast becoming one of the most profitable sub-sectors in the telecoms industry. Fuelling this growth is the popularity of advanced mobile services and the willingness of consumers to receive advertising that appeals to their tastes. David Knox examines the potential of this new marketing initiative and how charging and monitoring solutions can help mobile operators provide flexible and profitable ad campaigns while enhancing the consumer’s user experience

With the mobile phone now the most ubiquitous technology in the world, it was only a matter of time before advertisers caught on to the idea that a highly effective new way to reach a mass audience was not by radio, television or newspapers, but through a wireless handset. Mobile advertising has definitely become the latest 'it' word in the telecoms industry, with analysts predicting billions of pounds in revenue over the next five years. According to one recent report published by Informa, the mobile advertising market is going to be worth $871m this year, and will jump to $11.35bn in 2011.

Big in Japan
One of the countries to embrace the concept of wireless advertising is Japan, which has the second largest advertising market in the world behind the US and was the first country to exceed 50 per cent 3G penetration. According to some estimates, mobile advertising revenues in Japan for 2006 are expected to exceed $300 million and to double that figure by 2009. This is higher than any other country in the world. Almost 60 per cent of Japanese consumers already use mobile coupons and discounts more than once a month.
Following in the footsteps of Japan is the wireless industry in Western Europe, where the idea of mobile advertising is also being explored by some of the most pioneering mobile operators around, such as Hutchison's 3G mobile group, 3. The operator is currently subsidising usage and phones through advertising on the phone. These models are also being offered through downloads, subscriptions and video streams. It has also supported various mobile advertising campaigns in exchange for free content, such as the launch of the first ever movie video ad in Europe on behalf of Redbus, the film distributor, for the movie 'It's All Gone Pete Tong'. In partnership with 3, a banner on the carrier's wireless portal home page links to a microsite with more information on the film. At that site, subscribers have already requested more than 100,000 downloads of the movie trailer.
What is astounding about the access rate of the aforementioned movie clip is that it shows how the average response for mobile advertising is up to 10 times greater than Internet response rates. This is due to the fact that there are only a handful of links to choose from on a typical mobile web page. If one of them is an ad, it is more likely to get clicked than the same ad on the Internet.
Tackling the mobile ad market on an even bigger scale is wireless heavyweight Vodafone, which recently announced a strategic alliance with search engine Yahoo! to create an innovative mobile advertising business as a means to inject new revenue streams for both companies.
Through this partnership, Yahoo! will become Vodafone's exclusive display advertising partner in the UK. Yahoo! will use the latest technology to provide a variety of mobile advertising formats across Vodafone's content services. The initiative will be rolled out in the UK in the first half of 2007.
Under the plans, customers who agree to accept carefully targeted display advertisements will qualify for savings on certain Vodafone services. In other words, Vodafone subscribers would pay a lower rate for data services if they were prepared to accept ad banners from Yahoo! when signing up with the mobile provider.
This promotional deal would also extend to key Vodafone mobile assets including the Vodafone live! portal games, television and picture messaging services.
Vodafone and Yahoo!'s approach to tackling the mobile ad sector is a strategic one. It's all about understanding what each user wants from their mobile phone and providing a truly unique, individual advertising experience. In a way, the mobile handset will be transformed into a wireless 'magazine' filled with all sorts of adverts, only it will be slicker – since the advertisements will be live, potentially interactive and, most importantly, targeted to the tastes of the mobile user. Advertising can also be based on the user's location at the time.

Matching user interests
This brings us to the subject of profiling, which is indisputably the most important aspect of mobile marketing and one of the key ingredients to success. In order for Vodafone, 3, or any other mobile operator to make any money, they must know the interests of their users so that the advertising campaigns directed their way actually appeal to them.
This is where charging and rating applications can help operators conduct effective ad campaigns. These solutions are all about control – allowing operators to implement sophisticated, value-based charging strategies that help to differentiate innovative packages, bundles and promotions. VoluBill's 'Charge it' real-time data control and charging solution, for example, could be installed in a wireless network to detect subscribers and, thereafter, display or redirect them to advertisements that match their user profile, provided they have chosen to receive advertising.
Information about the brand tastes and interests of the wireless user would be gathered by the mobile carrier before the user agrees to subscribe to advertising. This information would then be stored in the network and accessible to the Charge it solution to ensure that advertising is targeting the right audience. So, for example, if one mobile user is identified as a die-hard fashionista who loves to read Vogue magazine each month, then Charge it could access the profile information in real time and automatically redirect the consumer to an advert for the film 'Devil Wears Prada' when they next establish a data connection via the handset. If the consumer has previously received the ad for the film and has responded to the link, then Charge it can also identify that the user has previously accessed the advert and block it from being sent again. This helps ensure that the target audience doesn't get put off by seeing advertisements they have already responded to.
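VoluBill has not published the internals of Charge it, so the following is no more than a hypothetical sketch of the decision just described: on a new data session, match the subscriber's stored profile against the running campaigns and suppress any advert the subscriber has already responded to:

# Hypothetical logic only - not VoluBill's implementation.
def select_advert(subscriber, campaigns, responded_ads):
    if not subscriber.get("opted_in"):
        return None                                   # only opted-in users receive adverts
    for campaign in campaigns:
        matches = campaign["target_interests"] & set(subscriber["interests"])
        already_seen = campaign["ad_id"] in responded_ads.get(subscriber["id"], set())
        if matches and not already_seen:
            return campaign["redirect_url"]           # redirect the session to the advert
    return None

subscriber = {"id": "S7", "opted_in": True, "interests": ["fashion", "film"]}
campaigns = [{"ad_id": "devil-wears-prada", "target_interests": {"fashion"},
              "redirect_url": "http://ads.example/dwp"}]
print(select_advert(subscriber, campaigns, responded_ads={"S7": set()}))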
This method of monitoring can also be used to track the success of a particular campaign and the revenue share that is generated as a result. If every single fashionista who signed up for mobile advertising clicked on the film link for the 'Devil Wears Prada' advert, for example, and also agreed to receive a free copy of the book that was simultaneously promoted in the movie advert, then Charge it would be able to provide the operator with details of how many and which subscribers saw the ad, and how many of them bought the book. Most importantly, Charge it could calculate the income generated from the advert – which would be divided among many parties, including the operator and its service providers, the advertising agency, the film distributor, the book publisher and, last but not least, the writer.
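The revenue-share arithmetic is equally easy to picture. The percentages below are invented purely to illustrate splitting advert income among the parties named above:

def settle(revenue, shares):
    """Split campaign revenue according to agreed percentage shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9     # shares must add up to 100 per cent
    return {party: round(revenue * share, 2) for party, share in shares.items()}

print(settle(10000.0, {"operator": 0.30, "service_provider": 0.10, "ad_agency": 0.15,
                       "film_distributor": 0.20, "publisher": 0.15, "writer": 0.10}))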

Extensive advantages
Indeed, what solutions like Charge it provide are extensive advantages to operators both in charging and managing the user experience, whether subscribers are surfing the Internet or downloading content-rich premium services. As mentioned before, charging and monitoring solutions are all about controlling the mobile advertising medium and ensuring its success.
Most, if not all, of the mobile content and adverts provided on the handset would be managed by search engine platforms such as Google's or, in the case of Vodafone, Yahoo!'s. As such, charging and monitoring solutions providers can provide the additional user profiling, traffic redirection and charging functionality required to be able to offer a complete mobile advertising and charging solution.
Apart from accessing profile information, Charge it could also be configured to respond to other factors that would enable operators to redirect a subscriber's wireless Internet session to a specific advertising page. One way would be through location identification; another would be through observed browsing behaviour. If a user went online to a particular website that didn't match their profile – a fashionista accessing rugby match details, for example – then Charge it could be configured to track and monitor this activity and record it for future marketing purposes. The next time an advertising campaign was launched concerning the sport, the aforementioned user would be sent the advertisement during their next Internet session. Again, this data would be monitored in order to enable Charge it to manage the adverts and to redirect or block the information when necessary.
If users accepted the ads sent to them, then Charge it could also be used to help operators to offer flexible price plans. This could include charging customers who access ads at a cheaper rate for services, or granting them a free Internet usage allowance per day. Consumers could also potentially receive periodic adverts like commercials on TV. The more frequent the adverts, the less they would pay for services.
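One assumed way of expressing such a plan: the more adverts a subscriber accepts in a billing period, the bigger the discount on data usage and the larger the daily free allowance. The rates and caps below are invented for illustration:

def data_charge(megabytes, ads_accepted, days_with_ads,
                base_rate_per_mb=0.05, free_mb_per_ad_day=2):
    """Charge for data after an ad-funded free allowance and a per-advert discount."""
    free_allowance = free_mb_per_ad_day * days_with_ads
    billable = max(0, megabytes - free_allowance)
    discount = min(0.5, 0.05 * ads_accepted)          # cap the discount at 50 per cent
    return round(billable * base_rate_per_mb * (1 - discount), 2)

print(data_charge(megabytes=100, ads_accepted=6, days_with_ads=10))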

Acceptance
Although widely ignored for many years as a possible income spinner, mobile advertising has finally been accepted by the business world as an innovation that can combine the wide reach of television with the precision of direct marketing and the tracking potential of the Internet. All of this adds up to serious revenue potential. Mobile advertisements also benefit all the participants in this new marketing game. While carriers get to boost their data and content revenue streams, advertisers gain a new and effective way of targeting a mass consumer audience. Meanwhile, mobile users get to benefit from great promotions and offers for the products and services they want to hear about, as well as cheaper mobile Internet usage.
Before mobile advertising and marketing can reach its full potential, however, certain technical and business requirements need to be met. This includes putting in place the right charging and monitoring infrastructure, as well as building relationships with quality advertising partners. Once installed, and with the right commercial models in place, there is nothing stopping mobile advertising from possibly becoming the most successful sub-sector in global telecoms history.                   

David Knox is Product Marketing Director at VoluBill

Technology companies come and go, but some are blessed with the foresight to help drive the technological developments that permeate all our lives. One such company is Micron, whose COO, Mark Durcan, tells Lynd Morley why it has been so successful

Future gazers abound in our industry, and we're being promised a near-future of sensor networks and RFID tags that will control or facilitate everything from ordering the groceries, to personalised news projected into our homes or from our mobile phones. This stuff of science fiction, fast becoming science fact, is the visible, sexy end-result of the technology, but what about the guys working at the coal-face, actually producing the tools that enable the dreams to come true?
Micron Technology is one of the prime forces at that leading edge. Among the world's leading providers of advanced semiconductor solutions, Micron manufactures and markets DRAMs, NAND Flash memory, and CMOS image sensors, among other semiconductor components and memory modules for use in computing, consumer, networking and mobile products. And Mark Durcan, Micron's Chief Operating Officer, is confident that the company has been instrumental in helping the gradual realisation of the future gazers' predictions.
“I do think that we are, in many ways, creating the trends, because we've created the technology which enables them,” he comments. “I can give you two prime examples. The first is in the imaging space where, for many decades, charge-coupled devices (CCDs) were the technology of choice for capturing electronic images – mostly because the image quality associated with CCDs was much better than that of CMOS imagers, which is what Micron builds today.
“Nonetheless, we were strong believers that we could marry very advanced process technology, device design and circuit design techniques with the CMOS imager technology, and really create a platform that enabled a whole new range of applications. 
“I think we did that successfully,” he continues, “and the types of applications that were then enabled are really quite stunning. For instance, with CCDs you have to read all the bits out serially, so you can't capture images very quickly. With CMOS imagers you can catch thousands of images per second, which then opens the door to a whole new swathe of applications for the imagers – from very high speed cameras, to electronic shutters that allow you to capture a lot of images, and, by the way, you can do it using far less power. We have already made a major impact in providing image sensors to the notoriously power hungry cameraphone and mobile device based marketplaces, and in the space of two years have become the leading supplier of imaging solutions there. One in three cameraphones now have our sensors and in only two years we have become the largest manufacturer of image sensors in unit terms worldwide. So now, for instance, the technology enables all sorts of security, medical, notebook and automotive applications – you can tune the imagers for a very high dynamic range, low light and low noise at high temperatures which then enables them to operate in a wide variety of environments that CCDs can't function in.
“As a result, you can put imaging into a multitude of applications that were never possible before, and I think we really created that movement by creating the high quality sensors that drive those applications.”
The second example Durcan quotes is in the NAND memory arena. “What we've done is probably not apparent to everyone just yet, but, actually, I believe that we've broken Moore's law.
“We are now scaling in the NAND arena much faster than is assumed under Moore's law, and that has really changed the rate at which incremental memory can be used in different and new ways. As a result, I believe it will also pretty quickly change the way computers are architected with respect to memory distribution. So we're going to start seeing changes in what types of memory are used, and location in the memory system, and it's all being driven by a huge productivity growth, associated with NAND flash and the rate at which we're scaling it. We are scaling it faster than anyone else in the world now and we are also well tuned to the increasingly pushy demands of mobile communications, computing and image capture devices.”
The productivity growth Durcan alludes to has been particularly sharp for Micron over the past year. The formation of IM Flash – a joint venture with Intel – in January 2006 has seen the companies bringing online a state-of-the-art 300mm NAND fabrication facility in Virginia, while another 300mm facility in Utah is on track to be in production early next year. The venture also produces NAND through existing capacity at Micron's Idaho fabrication facility. And just to keep things even busier, the partners introduced last July the industry's first NAND flash memory samples built on 50 nanometre process technology. Both companies are now sampling 4 gigabit 50nm devices, with plans to produce a range of products, including multi-level cell NAND technology, starting next year. At the same time, Intel and Micron announced in November 2006 their intention to form a new joint venture in Singapore (where Micron has a long history of conducting business) that will add a fourth fabrication facility to their NAND manufacturing capability.
In June 2006, Micron also announced the completion of a merger transaction with memory card maker Lexar Media, a move that helped Micron expand from its existing business base into consumer products aimed at digital cameras, mobile computing and MP3 or portable video playing devices.
“Our merger with Lexar is interesting for a number of different reasons,” Durcan comments. “Certainly it brings us closer to the consumer, as, historically, our products tended to be sold through OEMs. But, in addition, it provides the ability to build much more of a memory system, as opposed to stand-alone products, given that Lexar delivers not only NAND memory, but also a NAND controller that manipulates the data in different ways and puts it in the right format for the system that you're entering. Working closely with Lexar, we want to ensure that this controller functionality is tied to the new technologies we want to adopt on the NAND front, making sure that they work well together, thus enabling more rapid introduction of new technologies and getting them to market more quickly.”
The considerable activity of the past twelve months clearly reflect Micron's view of itself as a company that is in the business of capturing, moving and storing data, and aiming for the top of the tree in each section.   On the 'capturing' front, for instance, Durcan notes: “We've been very successful from a technology development perspective, and I think we're pretty much the unquestioned leader in the image quality and imaging technology arena. As mentioned we also happen to be the world's biggest imaging company now – it happened more quickly than any of us thought it would, but it was driven by great technology. So we have plenty of challenges now in making sure that we optimise the opportunity we've created to develop new and more diversified applications.”

Stringent tests
Certainly, the company is willing to put its developments to the most stringent of tests. All of Micron's senior executives, including Durcan, recently drove four Micron off-road vehicles in an exceptionally rugged all-terrain race in California, the Baja 1000, digitally capturing and storing more than 140 hours of video from the race, using Micron's DigitalClarity image sensors and Lexar Professional CompactFlash memory cards specially outfitted for its vehicles. All the technology performed remarkably well, as did Micron's CEO Steve Appleton, who won the contest's Wide Open Baja Challenge class some 30 minutes ahead of the next closest competitor.
Appleton's energetic and non-risk-averse approach to both the Baja 1000 (in some ways the American version of the Paris Dakar Rally) and to life in general (he is reputed to have once crashed a plane during a stunt flight, but still proceeded with a keynote speech just a few days later) is reflected in an undoubted lack of stuffiness within Micron.
Certainly, the company has taken a certain level of risk in pioneering technology developments. RFID is a case in point. “Sometimes,” Durcan explains, “the technology was there, but the market was slow to develop. RFID is a good example. Today, Micron has the largest RFID patent portfolio in the world. We certainly developed a lot of the technology that is now incorporated in global RFID standards, but when we first developed it, the threat of terrorism, for instance, was less obvious, so we simply couldn't get these tags going that are now absolutely commonplace. I suppose you could say we've been a little ahead of our time.”
The company is also managed by a comparatively young executive team, with a very non-hierarchical approach to business. “I do believe that we have a certain mindset that keeps us pretty flexible,” Durcan explains, “and one our strongest cards is that we have some really great people, with a great work ethic. At the same time, we drive a lot of decisions down into the company. We're probably less structured in our decision making than a lot of companies. 
“So, we try to get the right people in the room (not necessarily in the room actually, but on the same phone line!) to make a decision about what is the right space to operate in, then we can turn it over to people who can work the details.
“We try to get to that right space, at a high level, through good communication and then drive it down. It is the opposite of what I believe can happen when companies grow, become compartmentalised, and tend to get more and more siloed.
“There is also very strong synergy between the different activities within Micron,” he continues. “In each case we're really leveraging advanced process technology, advanced testing technology, and large capital investments in large markets. There are a lot of things that are similar and they do all play closely with each other.”

International bunch
Micron's people are, in fact, a truly international bunch, recruited globally, and bringing a great diversity of skills and approaches to the company. “I think that we are one of the most global semiconductor companies in the world,” Durcan says, “despite being a relatively young company. We recently started manufacturing our sensors in Italy and have design centres in Europe, both in the UK and Norway, which are expanding their operations. In fact we are now manufacturing on most continents – except in Africa and Antartica – and we have design teams right around the world who work on a continuous 24hr cycle handing designs from site to site. We've tried to grow a team that is very diverse, and leverage the whole globe as a source of locating the best talent we can.”
So, does all this talent produce its own crop of future gazers? Durcan believes they have their fair share.  “There certainly are people at Micron who are very good at seeing future applications. My personal capabilities are much more at the technology front end. I can see it in terms of 'we can take this crummy technology and really make it great'. Then I go out and talk to other people in the company who say 'that's fantastic, if we can do that, then we can...'. It really does take a marriage of the whole company, and a lot of intellectual horsepower.”
That horsepower has resulted in a remarkable number of patents for Micron. Durcan comments: “The volume and quality of new, innovative technology that Micron has been creating is captured by our patent portfolio.  It's an amazing story, and something I'm really proud of.  The point is, Micron is a pretty good-sized company, but we're not large by global standards – we're roughly 23,500 employees worldwide. Yet we are consistently in the top five patent issuers in the US.
“I feel the more important part of the patent story, however, is that when people go out and look at the quality of patent portfolios, they typically rank Micron as the highest quality patent portfolio in the world – bar none. I think that's pretty impressive and speaks volumes about the quality our customers benefit from.”

Lynd Morley is editor of European Communications

Technology companies come and go, but some are blessed with the foresight to help drive the technological developments that permeate all our lives. One such company is Micron, whose COO, Mark Durcan, tells Lynd Morley why it has been so successful

Lead interview – It's a vision thing

Future gazers abound in our industry, and we’re being promised a near-future of sensor networks and RFID tags that will control or facilitate everything from ordering the groceries, to personalised news projected into our homes or from our mobile phones. This stuff of science fiction, fast becoming science fact, is the visible, sexy end-result of the technology, but what about the guys working at the coal-face, actually producing the tools that enable the dreams to come true?
Micron Technology is one of the prime forces at that leading edge. Among the world’s leading providers of advanced semiconductor solutions, Micron manufactures and markets DRAMs, NAND Flash memory, and CMOS image sensors, among other semiconductor components and memory modules for use in computing, consumer, networking and mobile products. And Mark Durcan, Micron’s Chief Operating Officer, is confident that the company has been instrumental in helping the gradual realisation of the future gazers’ predictions.
“I do think that we are, in many ways, creating the trends, because we’ve created the technology which enables them,” he comments. “I can give you two prime examples. The first is in the imaging space where, for many decades, charge-coupled devices (CCDs) were the technology of choice for capturing electronic images – mostly because the image quality associated with CCDs was much better than that of CMOS imagers, which is what Micron builds today.
“Nonetheless, we were strong believers that we could marry very advanced process technology, device design and circuit design techniques with the CMOS imager technology, and really create a platform that enabled a whole new range of applications. 
“I think we did that successfully,” he continues, “and the types of applications that were then enabled are really quite stunning. For instance, with CCDs you have to read all the bits out serially, so you can’t capture images very quickly. With CMOS imagers you can catch thousands of images per second, which then opens the door to a whole new swathe of applications for the imagers – from very high speed cameras, to electronic shutters that allow you to capture a lot of images, and, by the way, you can do it using far less power. We have already made a major impact in providing image sensors to the notoriously power hungry cameraphone and mobile device based marketplaces, and in the space of two years have become the leading supplier of imaging solutions there. One in three cameraphones now have our sensors and in only two years we have become the largest manufacturer of image sensors in unit terms worldwide. So now, for instance, the technology enables all sorts of security, medical, notebook and automotive applications – you can tune the imagers for a very high dynamic range, low light and low noise at high temperatures which then enables them to operate in a wide variety of environments that CCDs can’t function in.
As a result, you can put imaging into a multitude of applications that were never possible before, and I think we really created that movement by creating the high quality sensors that drive those applications.”
The second example Durcan quotes is in the NAND memory arena. “What we’ve done is probably not apparent to everyone just yet, but, actually, I believe that we’ve broken Moore’s law.
“We are now scaling in the NAND arena much faster than is assumed under Moore’s law, and that has really changed the rate at which incremental memory can be used in different and new ways. As a result, I believe it will also pretty quickly change the way computers are architected with respect to memory distribution. So we’re going to start seeing changes in what types of memory are used, and location in the memory system, and it’s all being driven by a huge productivity growth, associated with NAND flash and the rate at which we’re scaling it. We are scaling it faster than anyone else in the world now and we are also well tuned to the increasingly pushy demands of mobile communications, computing and image capture devices.”
The productivity growth Durcan alludes to has been particularly sharp for Micron over the past year. The formation of IM Flash – a joint venture with Intel – in January 2006 has seen the companies bringing online a state-of-the-art 300mm NAND fabrication facility in Virginia, while another 300mm facility in Utah is on track to be in production early next year. The venture also produces NAND through existing capacity at Micron’s Idaho fabrication facility. And just to keep things even busier, last July the partners introduced the industry’s first NAND flash memory samples built on 50 nanometre process technology. Both companies are now sampling 4 gigabit 50nm devices, with plans to produce a range of products, including multi-level cell NAND technology, starting next year. At the same time, Intel and Micron announced in November 2006 their intention to form a new joint venture in Singapore (where Micron has a long history of conducting business) that will add a fourth fabrication facility to their NAND manufacturing capability.
In June 2006, Micron also announced the completion of a merger transaction with memory card maker Lexar Media, a move that helped Micron expand from its existing business base into consumer products aimed at digital cameras, mobile computing and MP3 or portable video playing devices.
“Our merger with Lexar is interesting for a number of different reasons,” Durcan comments. “Certainly it brings us closer to the consumer, as, historically, our products tended to be sold through OEMs. But, in addition, it provides the ability to build much more of a memory system, as opposed to stand-alone products, given that Lexar delivers not only NAND memory, but also a NAND controller that manipulates the data in different ways and puts it in the right format for the system that you’re entering. Working closely with Lexar, we want to ensure that this controller functionality is tied to the new technologies we want to adopt on the NAND front, making sure that they work well together, thus enabling more rapid introduction of new technologies and getting them to market more quickly.”
The considerable activity of the past twelve months clearly reflects Micron’s view of itself as a company that is in the business of capturing, moving and storing data, and aiming for the top of the tree in each section. On the ‘capturing’ front, for instance, Durcan notes: “We’ve been very successful from a technology development perspective, and I think we’re pretty much the unquestioned leader in the image quality and imaging technology arena. As mentioned, we also happen to be the world’s biggest imaging company now – it happened more quickly than any of us thought it would, but it was driven by great technology. So we have plenty of challenges now in making sure that we optimise the opportunity we’ve created to develop new and more diversified applications.”

Stringent tests
Certainly, the company is willing to put its developments to the most stringent of tests. All of Micron’s senior executives, including Durcan, recently drove four Micron off-road vehicles in an exceptionally rugged all-terrain race in California, the Baja 1000, digitally capturing and storing more than 140 hours of video from the race, using Micron’s DigitalClarity image sensors and Lexar Professional CompactFlash memory cards specially outfitted for its vehicles. All the technology performed remarkably well, as did Micron’s CEO Steve Appleton, who won the contest’s Wide Open Baja Challenge class some 30 minutes ahead of the next closest competitor.
Appleton’s energetic and non-risk-averse approach to both the Baja 1000 (in some ways the American version of the Paris Dakar Rally) and to life in general (he is reputed to have once crashed a plane during a stunt flight, but still proceeded with a keynote speech just a few days later) is reflected in an undoubted lack of stuffiness within Micron.
Certainly, the company has taken a certain level of risk in pioneering technology developments. RFID is a case in point. “Sometimes,” Durcan explains, “the technology was there, but the market was slow to develop. RFID is a good example. Today, Micron has the largest RFID patent portfolio in the world. We certainly developed a lot of the technology that is now incorporated in global RFID standards, but when we first developed it, the threat of terrorism, for instance, was less obvious, so we simply couldn’t get these tags going that are now absolutely commonplace. I suppose you could say we’ve been a little ahead of our time.”
The company is also managed by a comparatively young executive team, with a very non-hierarchical approach to business. “I do believe that we have a certain mindset that keeps us pretty flexible,” Durcan explains, “and one of our strongest cards is that we have some really great people, with a great work ethic. At the same time, we drive a lot of decisions down into the company. We’re probably less structured in our decision making than a lot of companies.
“So, we try to get the right people in the room (not necessarily in the room actually, but on the same phone line!) to make a decision about what is the right space to operate in, then we can turn it over to people who can work the details.
“We try to get to that right space, at a high level, through good communication and then drive it down. It is the opposite of what I believe can happen when companies grow, become compartmentalised, and tend to get more and more siloed.
“There is also very strong synergy between the different activities within Micron,” he continues. “In each case we’re really leveraging advanced process technology, advanced testing technology, and large capital investments in large markets. There are a lot of things that are similar and they do all play closely with each other.”

International bunch
Micron’s people are, in fact, a truly international bunch, recruited globally, and bringing a great diversity of skills and approaches to the company. “I think that we are one of the most global semiconductor companies in the world,” Durcan says, “despite being a relatively young company. We recently started manufacturing our sensors in Italy and have design centres in Europe, both in the UK and Norway, which are expanding their operations. In fact, we are now manufacturing on most continents – except in Africa and Antarctica – and we have design teams right around the world who work on a continuous 24-hour cycle, handing designs from site to site. We’ve tried to grow a team that is very diverse, and leverage the whole globe as a source of locating the best talent we can.”
So, does all this talent produce its own crop of future gazers? Durcan believes they have their fair share.  “There certainly are people at Micron who are very good at seeing future applications. My personal capabilities are much more at the technology front end. I can see it in terms of ‘we can take this crummy technology and really make it great’. Then I go out and talk to other people in the company who say ‘that’s fantastic, if we can do that, then we can...’. It really does take a marriage of the whole company, and a lot of intellectual horsepower.”
That horsepower has resulted in a remarkable number of patents for Micron. Durcan comments: “The volume and quality of new, innovative technology that Micron has been creating is captured by our patent portfolio.  It’s an amazing story, and something I’m really proud of.  The point is, Micron is a pretty good-sized company, but we’re not large by global standards – we’re roughly 23,500 employees worldwide. Yet we are consistently in the top five patent issuers in the US.
“I feel the more important part of the patent story, however, is that when people go out and look at the quality of patent portfolios, they typically rank Micron as the highest quality patent portfolio in the world – bar none. I think that’s pretty impressive and speaks volumes about the quality our customers benefit from.”

Lynd Morley is editor of European Communications

So, what’s the correlation between Charles Darwin’s theory of evolution and base station antennas, you may ask. Peter Kenington makes the connection...

There is little doubt that, when Charles Darwin first published “On the Origin of Species...” in 1859, he did not have wireless base-stations in mind. In reality, however, many of his ideas are as applicable to this area of evolution as they are to evolution in the natural world. One of the key elements that has enabled this to happen is the now widely accepted set of open specifications, set out by the Open Base Station Architecture Initiative (OBSAI), covering the interfaces between the main modules within a base station.
Darwin's key insight was in noticing that, in the natural world, the survival of a species is based upon its ability to adapt to environmental change and to competition from rival species that are also evolving. The key to survival is in finding a niche, be it large or small, within the ecosystem of our planet. Niches exist at all levels within the food chain, from that of a simple, low functionality existence (e.g. single-celled creatures, bacteria etc.) through to something of much higher 'performance' (e.g. tigers, dolphins and man). The same situation exists in the base-station arena, where simple, low cost pico and femto BTSs are beginning to emerge to fulfil low cost, short-range coverage requirements. These complement larger, more sophisticated designs for high-capacity, multi-carrier macro cells. In both cases, the key to survival rests in achieving acceptable or superior levels of performance, for the lowest possible ownership cost.
The natural world and the world of base-stations have one major difference from an evolutionary perspective: base-stations can now immediately take advantage of the latest innovation (in RF or baseband) through the designing-in or substitution of a new RF, baseband, transport, clock/control or power supply module. The internal interface specifications, pioneered by OBSAI, enable this process for a BTS (base transceiver station). In the natural world, however, the adoption of a new form of jaw, leg muscle and so on from another species would either take very many generations or prove impossible. This is, in many respects, how things used to be in the base-station world: if a particular OEM included a useful innovation in its new-generation BTS product, at best this would be copied by its rivals in a subsequent generation of BTS.
The adoption of the OBSAI standards represents a significant short-cut in this process, since innovations included in module products placed on the open market can quickly become part of many manufacturers' products. This potentially reduces the hardware development burden on the OEMs and allows them greater freedom to concentrate on the aspects of their products that will provide true differentiation from their competitors. Many of these areas fall within the domain of software, which will increasingly dominate in future BTS generations. It is this area more than most that will lead to the survival of the 'fittest' – which in a BTS context translates to the most innovative.
The OBSAI organisation consists of more than 130 component, module and base station vendors. Its aim is to create an open set of specifications for the internal modules required in a base-station, encompassing both interface and mechanical aspects. A good analogy in the PC industry is that of the PCI bus/card specifications. Many more unit-level options will become available in the future, as the technology develops to integrate modules into combined units. This will open up new location possibilities for base-stations due to the availability of smaller and more versatile architectures that are easier to site.

Revolution in evolution
The last 20 years or so have seen a revolution in the PC industry, in cost, yes, but also in capability, and this has been brought about, in large part, by the PCI bus. Most PC manufacturers offer a huge range of models, but these generally fall into a very small number of 'families' – often just one. PC manufacturers have achieved the seemingly impossible, in being able to offer a huge range of choice to their customers whilst employing a modest cost base for their organisation. This has been achieved by minimising inventory and design engineering services, through the use of standard packaging and 'modules' (e.g. graphics cards, DVD-drives, memory cards, motherboards etc.). These modules can then be selected to generate a large product offering appropriate for all tastes and budgets. Their success rests, in large part, on being able to provide the customer with exactly what he or she wants. This is a level of service that it has not, in the past, been economic to provide in the base-station area.

Changed landscape
OBSAI's announcement, in June 2006, that it had released a full set of interface, hardware and test specifications for the internal interfaces within a base-station has changed the landscape for the mobile radio base-station. OBSAI's specifications are compatible with all of the major current and emerging air interface standards, including GSM, GSM/EDGE, WCDMA, CDMA2000 and WiMAX and are available for public download free of charge (from www.obsai.com). These specifications allow module vendors to manufacture modules that are capable of operating in any OBSAI-compatible BTS, thereby reducing substantially the development effort and costs involved in the introduction of a new range of BTS products. They also enable a more PC-like model to be adopted in the design and construction of a BTS product – i.e. the selection of modules from a range of vendors at a range of capability levels and costs, such that the overall BTS closely matches the operator's requirements. Embracing this model will be a route, and a key, to survival in the emerging BTS marketplace.
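The 'PC-like' build model that these specifications make possible can be illustrated with a short sketch. It simply picks a compatible module in each of the main categories discussed above (RF, baseband, transport, clock/control and power supply); the module catalogue, vendor names, costs and selection rule are hypothetical illustrations, not data drawn from the OBSAI specifications themselves.

# Illustrative sketch of the 'PC-like' build model an open module interface allows.
# Module categories follow those named in the article; the vendors, costs and the
# compatibility check are hypothetical assumptions, not OBSAI data.
from dataclasses import dataclass

@dataclass
class Module:
    category: str        # "RF", "baseband", "transport", "clock/control" or "power"
    vendor: str
    air_interfaces: set  # air-interface standards this module supports
    cost: float

CATALOGUE = [
    Module("RF", "VendorA", {"WCDMA", "GSM"}, 4200.0),
    Module("RF", "VendorB", {"WiMAX"}, 3100.0),
    Module("baseband", "VendorC", {"WCDMA", "GSM", "WiMAX"}, 5600.0),
    Module("transport", "VendorD", {"WCDMA", "GSM", "WiMAX"}, 1800.0),
    Module("clock/control", "VendorE", {"WCDMA", "GSM", "WiMAX"}, 900.0),
    Module("power", "VendorF", {"WCDMA", "GSM", "WiMAX"}, 700.0),
]

def build_bts(air_interface, budget):
    """Pick the cheapest module in each category that supports the air interface."""
    chosen = []
    for category in ("RF", "baseband", "transport", "clock/control", "power"):
        candidates = [m for m in CATALOGUE
                      if m.category == category and air_interface in m.air_interfaces]
        if not candidates:
            raise ValueError("no %s module supports %s" % (category, air_interface))
        chosen.append(min(candidates, key=lambda m: m.cost))
    total = sum(m.cost for m in chosen)
    if total > budget:
        raise ValueError("cheapest compatible build exceeds the budget")
    return chosen

if __name__ == "__main__":
    for module in build_bts("WCDMA", budget=15000.0):
        print(module.category, "->", module.vendor)

The point of the sketch is simply that, once the interfaces are fixed, differentiation moves out of the hardware catalogue and into the selection logic and the software that sits above it.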

A head in the clouds
The changes in the market landscape, in terms of planning restrictions, health concerns, acoustic noise objections (from cooling fan noise) and many other issues are making it increasingly difficult for operators to erect new cell sites. New BTS architectures are therefore emerging to try and address these problems; here again, the survivors will be those that adopt these new architectures and can make them work for their customers.
These issues have given birth to the remote RF head – a new form of BTS deployment that is fully supported by the OBSAI specifications. This architecture places the active RF electronics remotely from the rest of the BTS and its associated backhaul. The remote RF head itself houses all of the radio-related functions (transmitter RF, receiver RF, filtering etc.). This is then connected to the remainder of the BTS via fibre-optic cable.
The above arrangement allows the main elements of the BTS (the digital and network interface modules) to be housed in a low-cost internal location, such as a basement. The RF head can then be situated on the roof of the building or on an outside wall. Another option is to site the remote RF head at the top of an antenna mast, with the remainder of the BTS being located at the base of the mast in a suitable hut or other enclosure.

Cheap hotels bring comfort
This principle can be extended to multiple remote RF heads, whilst still maintaining a single, central, location for the other aspects of their associated base-stations. This concept is usually referred to as a BTS hotel. The remote RF heads themselves can be located a substantial distance from the main BTS hotel site, due to the very low losses associated with the fibre optic cables used to connect them to the remainder of the BTS.
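The distance argument rests on straightforward attenuation arithmetic. As a rough illustration (the loss figures below are typical assumed values, roughly 0.35 dB/km for single-mode fibre at 1310 nm versus a few dB per 100 m for coaxial feeder at around 2 GHz, and are not taken from the OBSAI specifications), compare the two media over increasing distances:

# Rough attenuation comparison behind the remote-RF-head and BTS-hotel reach claim.
# Loss-per-distance values are typical assumed figures, not OBSAI or vendor data.
FIBRE_LOSS_DB_PER_KM = 0.35    # single-mode fibre at 1310 nm (assumed typical value)
COAX_LOSS_DB_PER_100M = 4.0    # large coaxial feeder at ~2 GHz (assumed typical value)

def fibre_loss_db(distance_km):
    return distance_km * FIBRE_LOSS_DB_PER_KM

def coax_loss_db(distance_km):
    return (distance_km * 1000.0 / 100.0) * COAX_LOSS_DB_PER_100M

for km in (0.05, 1.0, 5.0, 10.0):
    print("%5.2f km  fibre %6.2f dB   coax %7.1f dB"
          % (km, fibre_loss_db(km), coax_loss_db(km)))

Under these assumed figures, even several kilometres of fibre costs only a couple of decibels, which is why the remote heads can sit a substantial distance from the hotel site, whereas an equivalent coaxial run would be unusable.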
One of the main advantages of the BTS hotel architecture lies in its ability to provide cost-effective BTS redundancy. It is typically not economically viable to provide 100 per cent redundancy within a traditional BTS. However, in the case of a BTS hotel, N+1 redundancy can be used (i.e. the provision of one redundant BTS covering a number of active BTS systems within the BTS hotel location).
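The economics of N+1 redundancy can be sketched with simple arithmetic; the unit cost and the number of co-located systems below are placeholder assumptions used only to make the comparison concrete.

# Back-of-the-envelope comparison of redundancy strategies for a BTS hotel.
# The unit cost and the number of co-located systems are placeholder assumptions.
BTS_UNIT_COST = 50_000    # assumed cost of one BTS's central (non-RF) equipment
ACTIVE_SYSTEMS = 20       # BTS systems served from one hotel location

# Traditional approach: every system carries its own standby unit (1+1).
per_system_redundancy = ACTIVE_SYSTEMS * BTS_UNIT_COST

# BTS hotel approach: one standby unit covers all co-located systems (N+1).
hotel_redundancy = 1 * BTS_UNIT_COST

print("1+1 at every system :", per_system_redundancy)
print("N+1 in a BTS hotel  :", hotel_redundancy)
print("saving              :", per_system_redundancy - hotel_redundancy)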
Significant disadvantage
The significant disadvantage with the BTS hotel architecture lies, however, in the cost of the fibre optic links that run between the BTS hotel and its remote units. Installing new fibre – if there is none in existence already – involves significant civil works and is therefore extremely expensive. There are, however, a number of examples of various types of BTS hotel in operation today, covering applications in city centres, at airports and for major sporting events.
So, natural selection has been a part of the evolution of the earth's species for billions of years and has proved to be a successful method of ensuring that the best-adapted species are available to maintain our ecosystem. In the world of base-station engineering, the same principles apply – however, the timescales are dramatically shorter. The open specifications provided by OBSAI shine a powerful light on the future evolutionary path for the BTS – it will be interesting to see who is the fittest and, hence, survives this new dawn.

Peter Kenington is the Managing Director of Linear Communications Consultants and the Technical Chair of OBSAI. Email: pbk@linearcomms.com

A fundamental part of handset design is usability testing – the measurement of the ability of users to complete tasks effectively.
Mats Hellman explains

As mobile handsets evolve into ever-more sophisticated devices, with an ever-expanding list of capabilities, it is vital that users are able to access the features they want quickly and intuitively. Users need to feel equally at ease accessing Internet pages or sending a multimedia message from their mobile devices as they do looking up and dialling a friend’s number. By creating a satisfactory user experience, handset makers can earn long-lived popularity and loyalty that goes beyond the initial appeal of stylish design.
It is increasingly important for mobile device makers to put special emphasis on user satisfaction in a world where consumers have a huge choice and can switch their handsets regularly with little effort or cost.
Central to the creation of winning handset designs is usability testing: measuring the ability of users to complete tasks efficiently and effectively. Furthermore, it is vital to check that user expectations are satisfied before new models are introduced into the market.
Usability testing involves putting processes in place to gather good user input – and then using it well – to ensure a good user experience. This may sound like common sense, but our experience shows such techniques are under-used across the mobile industry.

Ask the true experts
Putting the user at the heart of the design process is an extremely effective way to gather the more subjective feedback needed to enhance design. By integrating usability testing into product development, and by working closely with users, we begin to understand how users experience efficiency and effectiveness, rather than just trying to construct objective measurements of efficiency and effectiveness themselves.
Many handset designers are tempted to base their usability metrics on a composite of simple observations of number of clicks or number of errors made when accessing a particular function. Today, the most popular way of testing usability is to carry out consumer surveys and undertake lab testing. While these elements will both produce useful results, they leave little or no room for further probing should any unexpected issues arise as part of any feedback.
What’s more, large-scale surveys might produce substantial amounts of data, but all too often this data is not fed back into development teams effectively. Unless usability testing is conducted continually – with rapid feedback into the design team – the user data can quickly get out of sync with the design process, and it gets harder for designers to ask follow-up questions while they’re still working on the relevant feature.
Measuring the number of clicks and errors users make only tells us so much. Besides, it could well be that making more clicks to access a particular function is more effective if the user feels the process is more logical and easier to follow. A screen crammed full of supposedly helpful icons – while needing fewer clicks – might be more confusing to the user than having a smaller set of icons that each lead to an extra layer of options. Furthermore, errors made as part of the learning process are not always viewed as unsatisfactory by users. On the other hand, errors that are made repeatedly as a result of poor, non-intuitive interface design are very frustrating.
One way to address this issue effectively is to create a permanent ‘test expert’ who acts as the user advocate throughout the development process. This role follows the traditions of participatory design and the ethnographic approach to research, as well as suiting modern methods of software engineering.
Building on its success in computer system development in industry, participatory design is now finding favour in mobile handset usability testing as a way of bringing user expertise into the design process. UIQ Technology has itself developed a model for measuring attitudes directly that eradicates the need to second-guess certain behaviours and maintains the focus on the user experience, rather than user behaviour.

Testing in the comfort zone
In this new model, the test expert sits down with users to evaluate design alternatives, and discover new options, to find out what makes a satisfying design. The goal is to understand and describe the use of mobile phones as perceived by the users; to collect data from the inside, not the outside.
More meaningful results are achieved by conducting usability testing in familiar locations – rather than in test environments – where users feel more comfortable and open to sharing their experiences and views. By working side-by-side with users, software developers can help them play an enthusiastic and engaged role in the design process – resulting in handsets that work for them.
Ideally, the user and test expert should test the phone together. The user performs a number of tasks and discusses the experience of using the phone with the test expert. The test expert monitors the way the user accomplishes the task and notes any difficulties. They both respond to written statements of attitude to record their evaluations of the phone.
The next stage is to put the results of the interactive testing to good use among two key audiences: designers and decision makers. Designers want concrete, immediate feedback to be able to improve interaction design during every step of the process. Decision makers, on the other hand, prefer cold hard facts in diagrams to present at meetings.
The test expert’s observations, together with the evaluations made with the tester, provide good detailed feedback to designers. The advantage of carrying out continuous testing and relaying test outcomes informally and directly to designers is extremely short turn-around time, from problem discovery to implemented solution.
To suit the needs of decision makers, rather than being presented as pure numbers, the usability metrics for a given phone are expressed as positions on a two-dimensional field, in which user satisfaction or dissatisfaction is plotted against efficiency. This offers an at-a-glance view of how satisfactory the user experience is, and where resources need to be allocated for improvement.
Because it is grounded in the social sciences, this approach differs from usability metrics that aim to determine, through experimental methods similar to those used in the natural sciences, whether one system is more usable than another. The test results are not validated through future repetition, but through the knowledge and experience gained by the test experts, and through how these influence the development of the mobile handset to ensure the best possible user interaction experience.
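As a concrete, if simplified, illustration of the two-dimensional field described above, the short sketch below derives an (efficiency, satisfaction) coordinate for each feature from task timings and attitude ratings. The scales, the sample data and the scoring choices are assumptions made for the sketch, not UIQ Technology's actual metrics or results.

# Sketch of the two-dimensional usability view: each feature is positioned by
# efficiency against user satisfaction. Scales and sample data are assumptions.
from statistics import mean

# (feature name, task times in seconds, attitude ratings on an assumed 1-5 scale)
SESSIONS = {
    "contacts_peephole": ([12, 15, 11, 14], [5, 4, 5, 4]),
    "agenda_one_handed": ([30, 41, 35, 38], [3, 2, 3, 3]),
    "mms_compose":       ([55, 48, 60, 52], [4, 4, 3, 4]),
}

def usability_field(sessions):
    """Return an (efficiency, satisfaction) coordinate per feature.

    Efficiency is expressed here as completed tasks per minute and satisfaction
    as the mean attitude rating; both scoring choices are assumptions."""
    field = {}
    for feature, (times, ratings) in sessions.items():
        field[feature] = (round(60.0 / mean(times), 2), round(mean(ratings), 2))
    return field

for feature, (efficiency, satisfaction) in usability_field(SESSIONS).items():
    flag = "allocate redesign effort" if satisfaction < 3.5 else "acceptable"
    print("%-18s efficiency %5.2f/min  satisfaction %4.2f  %s"
          % (feature, efficiency, satisfaction, flag))

Plotted rather than printed, the same coordinates give decision makers the at-a-glance view the article describes, with the low-satisfaction region marking where resources should go first.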

Confidence to move forward
UIQ Technology itself learned a lot from this process in the move from UIQ 2 to UIQ 3 – which made the change from a user interface that was primarily pen-based to one that could be solely key-based. The intensive usability testing programme we put in place gave us the confidence to make a number of drastic design changes.
One such change was the introduction of the new navigation control for list views in the Contacts application. This provides a ‘peep-hole’ into a given contact’s details without having to open the full contact page. Each item, such as a phone number or email address, can be acted upon directly from the list view (to initiate a text message, for example). Our user testing uncovered a number of design wrinkles that had to be ironed out. Usability testing also really helped in the development of the Agenda application, where we discovered a number of important differences between using the calendar with one hand (in soft-key mode) and with two hands (in pen mode). The test users contributed greatly to finding good solutions for the redesign.
The process of implementing one-handed navigation has been aided greatly by having users available on a daily basis, with the test expert maintaining user focus, rather than just doing ‘unit testing’. Extending a touch-screen, pen-based UI to serve as a soft-key, non-touch-screen UI as well, on just one codeline – while offering consistent navigation behaviour between the two UI styles – was no simple task! It would not have been possible in the time taken without the kind of user input we gathered through our usability testing model.
Evaluating and acting on real consumer experience and satisfaction need not be the insurmountable obstacle it is widely perceived to be in the industry. There can be no excuse for any consumer dissatisfaction with a phone that is already on the market. We all need to learn to do the right thing from the start: it’s for our own good – as well as our customers’.

Mats Hellman is Head of Systems Design at UIQ Technology

    
