Deloitte defines the main trends for 2010 in the telecommunications, media and technology sector

Brussels, 15 February 2010 – For the 9th time, Deloitte presents its annual predictions for the media, telecommunications and technology sector, gathered in three dossiers. A glance at some of the main conclusions shows that the melding of web content with television programmes should intensify as concurrent use of the web and TV takes off this year. In addition, 2010 could be an inflection year for mobile VoIP – voice calls carried over the top of an IP-based network – given the growing number of WiFi-enabled phones, more WiFi hotspots, and the rise of ‘one-to-many’ communication.

1. Trends for the media industry for 2010

The shift to online advertising: more selective, but the trend continues

Online advertising will continue to gain share in 2010 and beyond. Online advertising spending will not only grow in absolute terms but is also likely to grow substantially faster than the total advertising market. Its global share will grow from roughly 10 percent at the end of 2009 to 15 percent by the end of 2011. The categories of online advertising likely to experience the greatest growth remain search, click, social network, and cost per action (CPA).

Advertisers increasingly want the ability to measure effectiveness and are becoming indifferent to other purported advantages. Online advertising is perceived as the solution that allows advertisers to measure the effectiveness of their spending. The main challenge for media companies will be to create a business model in which online advertising contributes to their profit at the same level as non-online advertising.

André Claes, Industry Leader TMT at Deloitte Belgium: “The economic environment is pressing very hard for measurement and efficiency. If you can market a product that generates traceable results, you are more likely to come out on the winning side. One notable exception is probably local and regional advertising, for which print will remain the main medium.”

TV and the web belong together, but not necessarily on the same screen

Melding web content with television programmes should intensify because concurrent use of the web and TV will take off in 2010. André Claes: “Convergence seems indisputable in Belgium as well. We have seen Belgian broadcasters investing heavily in their websites lately, confirming their ambition to establish a presence on the web with their programmes.”

Converged web and television consumption is, however, expected to be based on existing televisions and devices, with ‘convergence’ being user-driven, given the mismatch between swelling consumer demand for concurrent web and TV usage and the typical ten-year renewal cycle for televisions. André Claes adds: “Users will combine existing sets with standalone browser-enabled devices, mostly WiFi-enabled laptops and netbooks, smartphones, MP4 players, and portable game consoles. As simultaneous web and television use gains popularity, television producers will be encouraged to create websites that feed off viewers’ eagerness to react to what they are watching.”

Publishing fights back: pay walls and micropayments

In 2010, the newspaper and magazine industry will continue to threaten to charge readers for online content; however, that talk is unlikely to be matched by action. Publishers rumoured to be considering pay walls may ultimately decide against them, or may choose hybrid models in which most content is free and only a limited quantity of premium content is charged for. André Claes: “It may, however, be ambitious to believe that in smaller markets newspapers have the readership potential and the means to maintain paid-for offerings on the web.”

Publishers who use pay walls need to maintain and publicise the premium nature of their content. Excessive cost-cutting could devalue the brand. Online readers might be willing to become micropayment customers, but only if the content is good enough and worth the effort. The value of the micropayment strategy to the content provider requires volume: one micropayment per customer every two weeks might result in transaction costs exceeding gross margins.
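As a rough, purely hypothetical illustration of that volume effect (the price, payment fees and margin below are invented assumptions, not Deloitte figures), a fixed per-transaction fee can wipe out the margin on a single micro-purchase, while batching several purchases into one payment leaves room for profit:

```python
# Hypothetical micropayment economics; every figure is an invented assumption.
price_per_article = 0.10      # EUR charged per article
payment_fee_fixed = 0.05      # EUR fixed fee per payment transaction
payment_fee_rate = 0.02       # 2% variable processing fee
gross_margin_rate = 0.40      # publisher's gross margin on content revenue

def net_margin(purchases_per_transaction: int) -> float:
    """Net margin when several micro-purchases are settled in one payment transaction."""
    revenue = price_per_article * purchases_per_transaction
    fees = payment_fee_fixed + payment_fee_rate * revenue
    return gross_margin_rate * revenue - fees

print(round(net_margin(1), 3))    # -0.012 -> one isolated purchase loses money
print(round(net_margin(10), 3))   #  0.33  -> batching ten purchases is profitable
```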

eReaders fill a niche, but eBooks fly off the (virtual) shelves

Deloitte predicts that in 2010, stand-alone eReader devices will likely sell five million units globally. Meanwhile, electronic versions of books (eBooks) could sell as many as 100 million copies. While the makers of eReader devices might initially be pleased to hear this, the downside for them is that more eBooks may be read on PCs, netbooks, smartphones and netTabs than on single-purpose eReaders.

With sales of $1.5 billion likely, eReaders are far from failing, but competition from other devices is likely to slow their growth rate going into 2011, even as eBook growth remains close to 200 percent. André Claes adds: “However, the recent launch of the Apple iPad might change this. It will bring a lot of attention to what the eBook stores have on offer, and might, as a side effect, accelerate sales of dedicated eReaders.”

2. Trends for the technology industry for 2010

Moore’s Law is alive and well in 2010

Despite forecasts of a gloomier scenario, Moore’s Law will probably continue to hold in 2010, with advances allowing for greater transistor density. However, this may not yield more powerful chips. Moore’s Law – the traditional ability of the global semiconductor industry to double the number of transistors in a square centimetre of silicon every 18 to 24 months – is not expected to come to a screeching halt in 2010, or even slow down.
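To put that doubling rate in perspective, the short sketch below shows how transistor density compounds at the two ends of the 18-to-24-month range; the starting density of 1.0 is an arbitrary reference point, not a real figure:

```python
# Compounding of Moore's Law: relative transistor density, doubling every
# `doubling_months` months. The starting value of 1.0 is just a reference point.
def relative_density(years: float, doubling_months: float) -> float:
    return 2 ** (years * 12 / doubling_months)

for months in (18, 24):
    print(f"{months}-month doubling, after 10 years: x{relative_density(10, months):.0f}")
# 18-month doubling -> roughly x102 after a decade
# 24-month doubling -> x32 after a decade
```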

The increased density is unlikely to be used to produce larger or more computationally powerful chips. Instead, “good enough” chips that are smaller, use less electricity and cost less could emerge. With the current growth of lower-cost laptops and ultra-low-cost netbooks, the next generations of PC chips are likely to be optimised for price, with some consideration given to power consumption, but little focus on performance.

Other hot markets – smartphones, and perhaps tablets – will likely be optimised for power consumption, and possibly price, while performance will be almost irrelevant. Although some chips will remain performance-driven, this segment may not see much growth. Many IT applications (server farms, for example) are large consumers of electrical power, so more efficient chips are welcome. New equipment that uses less electricity and requires less cooling may allow for re-architected or larger data centres without necessitating increased refrigeration or power supplies.

Thinking thin is in again: virtual desktop infrastructures challenge the PC

Deloitte predicts that in 2010 thin clients will be taken far more seriously than in previous years, even if they do not outsell their thick-client counterparts. Over the next five years, thin clients should reach 10 percent of organisations’ computers, with the majority of medium to large businesses considering a shift to virtual desktop infrastructure. Thin clients can deliver direct savings by streamlining IT support and maintenance and making it more efficient, as well as by reducing hardware costs and licensing fees.

There are other, less tangible benefits to virtual desktop infrastructure, including mobility, increased productivity, lower real estate costs, lower power consumption and better security. Those charged with deploying thin clients may need to convince workers who begrudge the lack of a local hard disk drive that the purest forms of thin client entail. However, against a backdrop of recession or slow recovery, employers may consider it a good opportunity to reshape working conditions.

IT procurement stands on its head

In the past, technology and telecommunications hardware and software manufacturers have targeted their products at the enterprise market, specifically the gate-keeping IT department. In 2010, many enterprise purchasing decisions will be based more on the preferences of individual employees. With the rise of the ‘prosumer’ – employees who buy a phone for both work and play – more and more enterprises are likely to allow employees to choose their own phones, or at least to allow prosumer-selected phones to integrate better with enterprise networks.

Enterprise-focused vendors will need to alter sales techniques originally designed to sell to monolithic buyers whose concerns were enterprise in scale. While IT departments will have to become more flexible, best practices are still necessary, such as deleting data on employees’ devices if they change jobs. Also, given the faddish nature of consumer sentiment, processes that reduce product churn will be needed. The future of many enterprise computing and telecom tools will likely involve compromises between work and personal life, that is, employees being available 24/7 but allowed to choose their own smartphone.

3. Trends for the telecommunications industry for 2010

Mobile VoIP becomes a social network

2010 could be an inflection year for mobile VoIP – voice calls carried over the top of an IP-based network – given the growing number of WiFi-enabled phones, more WiFi hotspots, and the rise of ‘one-to-many’ communication. Within three years, mobile VoIP could be worth over US$30 billion globally. If routed over WiFi, mobile VoIP could lessen demands on the cellular network, and smaller operators in markets where the calling party pays could see a decrease in overall termination charges.

Companies may use the allure of free calls as a channel for advertising messages, thereby underpinning the value of the mobile voice market. If mobile VoIP results in declining revenues for operators, the investment available for maintaining networks could drop and threaten the roll-out of next-generation infrastructure. Portals such as Yahoo or Facebook could promote mobile VoIP applications by pointing to smartphone versions of their websites.

André Claes: “IP traffic generated by mobile operators has grown exponentially over the past three years. While such operators had rather limited demand three years ago, they now rank among the top-tier clients for international IP transit services.”

Widening the bottleneck – telecom technology helps decongest the mobile network

With nearly 600 million mobile broadband connections, 2010 could see the wireless equivalent of gridlock. Telecommunications technologies that can make existing wireless networks perform better should experience stronger growth than overall IT spending.

Leading pure-play companies in this area should see year-on-year growth approaching 100 percent, with the average company expected to grow by 30 to 40 percent. Sectors expected to benefit from addressing the congestion problem include hardware and software markets such as policy management, compression, streaming, and caching technologies. Handset makers, particularly of smartphones, that adopt technologies to reduce network usage relative to competitors will gain an advantage. However, without such action, techniques such as metered pricing and traffic management may become necessary.

Paying for what we eat: carriers change data pricing and make regulators happy

North American network operators, both wireless and wireline, will likely move away from “all you can eat” data pricing plans. Instead, some customers will likely be billed for how much data they use, and may even be charged according to when they use it and what kind of data it is. The consensus view on North American data pricing has been that the only way to attract subscribers is to offer unmetered data. Moreover, the consensus also suggests that, once made, the offer of unmetered data pricing can never be withdrawn without enormous customer backlash.
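A minimal sketch of what such usage-based billing could look like, with rates varying by volume, time of use and traffic type as described above; every tariff below is an invented assumption, not a real operator price:

```python
# Illustrative usage-based data billing; every rate below is an invented assumption.
RATE_PER_GB = {"off_peak": 1.00, "peak": 2.00}        # EUR per GB, by time of use
SURCHARGE_PER_GB = {"video_streaming": 0.50}          # EUR per GB, by traffic type

def monthly_bill(usage):
    """usage: list of (gigabytes, period, traffic_type) tuples for one billing cycle."""
    total = 0.0
    for gigabytes, period, traffic_type in usage:
        total += gigabytes * (RATE_PER_GB[period] + SURCHARGE_PER_GB.get(traffic_type, 0.0))
    return total

usage = [(3.0, "off_peak", "web"), (1.5, "peak", "video_streaming")]
print(f"Metered bill: EUR {monthly_bill(usage):.2f}")  # 3.0*1.00 + 1.5*2.50 = EUR 6.75
```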

Motivated by a desire for net neutrality, regulators may introduce new rules allowing service providers to move to more usage-based pricing, and simultaneously defuse consumer complaints by observing that the carriers were practically forced to do so. Although carriers have feared that net neutrality rules would force them to provide services that don’t make sense economically, the reality may be that the new rules will make it easier for them to shift customers off the unmetered broadband plans that appear to be breaking their networks.

André Claes: “In Belgium and Europe, neither flat-rate pricing plans nor ‘pay per byte used’ plans are very common. Telecom operators instead offer tiered pricing packages with various caps on the total data volume used in a billing period. However, several Belgian operators have recently introduced unlimited IP access offerings, which may well change the norm.”

Nixing the nines: reliability redefined and reassessed

2010 will see enterprises less likely to default to 99.999 percent – or ‘five nines’ – reliability for contracted services, instead determining quality levels on a per-application or per-process basis. Although a move to three nines may appear negligible, the loss in quality could be more than made up for in savings. Efforts to understand what is meant or implied by service levels will be key in 2010, with telecommunications suppliers and their customers possibly moving to a more easily understood commitment. Executives responsible for procuring services should evaluate the implications of changes to any services, whilst IT and telecommunications departments should regularly review internal users’ requirements and tolerance for downtime. Service providers should constantly look for ways of reducing their maintenance costs.
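To make the gap between those levels concrete, availability percentages translate directly into allowed downtime per year; the short conversion below is a standard calculation, not a Deloitte figure:

```python
# Converting availability "nines" into allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes_per_year(availability: float) -> float:
    """Maximum downtime per year, in minutes, for a given availability fraction."""
    return (1 - availability) * MINUTES_PER_YEAR

for label, availability in [("five nines", 0.99999), ("four nines", 0.9999), ("three nines", 0.999)]:
    print(f"{label}: {downtime_minutes_per_year(availability):.1f} minutes per year")
# five nines  -> about 5.3 minutes of downtime per year
# three nines -> about 525.6 minutes (roughly 8.8 hours) per year
```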

André Claes comments: “Service availability is a critical part of service level agreements. While wholesale carriers still commit to very high availability standards, clients don’t hesitate to limit downtime risks by sourcing such services from multiple service providers. Telecom operators recognise that VoIP has made a lower quality of service more acceptable, but also point out that, despite the huge growth of VoIP, the majority of voice traffic is still carried over highly reliable PSTN networks.”

The line goes leaner. And greener

In 2010, the global telecommunications sector will focus heavily on reducing CO2 emissions, with cost control being the common driver in developed and developing countries. Operators with fixed and mobile operations should consider the merits of shifting voice and data traffic between fixed and mobile networks to reduce overall energy costs, in addition to considering how metered broadband usage might discourage excessive network usage. More reliable network technology could translate into reducing emissions generated by maintenance teams.

André Claes concludes: “Equipment manufacturers should continue to improve network efficiency, whilst adapting innovations in the power efficiency of mobile phones to network components. Device manufacturers and the mobile industry should continue to strive to reduce emissions through initiatives such as turning off chargers and adopting a single standard for chargers.”