Contracts refresher: excluding liability for loss of profits

Posted on January 28th, 2015 by

When a technology contract goes wrong, customers will often suffer not just from a loss of systems but also from disruption to their business. Disruption may lose them vital revenues and even give rise to claims from customers. It would seem intuitive that contracts should be clear cut and allow customers to claim for loss of profit. But the position is far from clear. As a result, customers and suppliers must carefully craft their contracts if they are to effectively include or exclude claims for loss of profits.

The key issue is that English law allows losses to be claimed only if, at the time the contract was entered into, they were "not unlikely" to result from the breach or were reasonably foreseeable. Exceptionally, claims may be allowed where, at the time the contract was concluded, the parties had special knowledge of a certain kind of loss (e.g. that one of the customer's contracts depended on delivery by the supplier). These principles were established in Hadley v Baxendale (1854) 9 Exch 341 and The Heron II [1969] 1 AC 350, and have been reiterated over the years.

These types of loss are often referred to in shorthand as "direct" loss (the "not unlikely" or foreseeable kind) and "indirect" loss (other losses, which are claimable only if special knowledge is evident). The distinction quickly becomes unhelpful if the longhand definitions are forgotten: plainly a kind of loss such as damage to property could, on different sets of facts, be direct or indirect under the Hadley v Baxendale test. This leads to a lot of confusion as people try to pigeonhole, say, loss of profits as necessarily falling in one category or the other. In reality, lawyers need to look to the case law for guidance on whether loss of profits has been held claimable in circumstances similar to the ones they face, and then draft as best they can to reinforce or avoid the consequences.

The courts have therefore long recognised that loss of profits arising from a breach of contract can be a direct loss or an indirect loss, depending on the circumstances, including the nature of the contract and the nature of the breach. It is essential then that the exclusion and limitation provisions make clear whether any references to "loss of profits" are to all loss of profits (both direct and indirect), or only one or the other. Two High Court cases last year – Fujitsu v IBM [2014] EWHC 752 (TCC) and Polypearl Limited v E.on Energy Solutions Limited [2014] EWHC 3045 (QB) – illustrate well the approach taken by the courts when interpreting exclusions of "loss of profits" in the context of direct and indirect loss and the pitfalls where the contract is unclear. Before commenting on these cases, it is helpful to delve further into the approach the courts take when interpreting exclusion clauses designed to avoid liability for loss of profits (should losses be claimable under the Hadley v Baxendale rule).

Interpreting exclusion and limitation clauses: the courts’ approach

Liability provisions in a contract typically exclude or cap a party’s liability for certain types of losses. It is important for all parties that these provisions are drafted clearly and unambiguously. A clearly drafted clause is less likely to be disputed, and if it ever fell to the courts to interpret the clause, there is less risk that the court might interpret it in a way that was not anticipated, leaving a party exposed to unexpected risks and liabilities. Good exclusion clauses do not leave it to the case law to decide what will be direct or indirect loss. They spell out the division of risk between the parties, and expressly exclude some types of loss (or cover other types through express warranties and indemnities).

In the past, the courts strained their interpretation of exclusion/limitation clauses to reach a fair or just outcome. Now, though, the courts will generally uphold and give effect to the literal meaning of a clause that has been negotiated between experienced business parties, provided the clause is clear, unambiguous and not open to more than one meaning, and not drafted so widely that a party's obligations are effectively robbed of contractual force (i.e. so that obligations become mere statements of intent).

The general approach taken by the courts when interpreting an exclusion/limitation clause is the same as for any other part of the contract, namely:

  1. Ascertain what a reasonable person would have understood the parties to mean. The “reasonable person” is assumed to have all the background knowledge which would reasonably have been available to the parties in the situation in which they were at the time of the contract;[1]
  2. If that approach results in two possible interpretations, then the court will generally take the interpretation that is most consistent with business common sense;[2]
  3. Where the parties have used unambiguous language, the court will apply it;[3]
  4. There is a presumption that a party does not intend to abandon any remedies arising by operation of law. Clear express words must be used in order to rebut this presumption;[4]
  5. The court will strain against interpreting an exclusion clause in a way that renders a party’s obligation under the contract no more than a statement of intent. The court will not reach that conclusion unless no other conclusion is possible.[5]


Loss of profit: direct and indirect loss

It is good practice, when drafting an exclusion or limitation clause, to set out clearly the types of loss that the parties intend will be recoverable (subject to any agreed cap) and those that will be excluded. This gives the parties more certainty than relying on a clause that refers in broad terms to “direct” and “indirect” losses.

Care is needed when drafting to make clear whether references to "loss of profits" are to both the direct and indirect kind, or only one or the other. Fujitsu v IBM and Polypearl v E.on, the two High Court cases mentioned above, illustrate well how the courts approach such exclusions and the pitfalls where the contract is unclear.

Fujitsu v IBM

In Fujitsu v IBM, the court had to decide whether an exclusion clause in a sub-contract between IBM and Fujitsu effectively excluded IBM’s liability for all loss of profits (i.e. direct and indirect), or for only “indirect” loss of profits.  IBM was the principal contractor under a contract for the provision of IT and business process change services and Fujitsu was its subcontractor. Fujitsu alleged that IBM had breached the subcontract by failing to allocate to Fujitsu the performance of services that, under the terms of the sub-contract, should have been performed by Fujitsu. As a preliminary issue, the High Court had to consider the exclusion clause in the sub-contract, which read:

“20.7 Neither Party shall be liable to the other under this Sub-Contract for loss of profits, revenue, business, goodwill, indirect or consequential loss or damage…”

Were the types of loss listed in the clause (loss of profits, revenue etc.) intended to be examples of indirect or consequential loss? The court ruled that the clause excluded liability for all loss of profit, not just the "indirect" kind. If the intention had been to exclude indirect loss of profit only, the court said, it would have expected the parties to make this clear. The references to "loss of revenue, business or goodwill" were not necessarily indicative of indirect loss. As it stood, the clause did not make "loss of profit" a sub-set of "indirect or consequential loss". There was nothing in the context or surrounding clauses that pointed to any interpretation other than simply applying the words of the clause.

Polypearl Limited v E.on Energy Solutions Limited

The same issue arose in a more recent case – Polypearl Limited v E.on Energy Solutions Limited. Polypearl claimed that E.on was in breach of a minimum spend commitment under an agreement for the sale and purchase of cavity wall insulation products. Polypearl claimed loss of profits of £2.1m on the shortfall and, as a preliminary issue, the court had to consider whether the following clause excluded liability for all loss of profit or for indirect loss of profit only:

“(10.1) Neither party will be liable to the other for any indirect or consequential loss, (both of which include, without limitation, pure economic loss, loss of profit, loss of business, depletion of goodwill and like loss) howsoever caused (including as a result of negligence) under this Agreement, except in so far as it relates to personal injury or death caused by negligence.”

Polypearl argued that its lost profits on the shortfall were a direct loss, and the judge agreed. The judge noted that the words in parentheses made the meaning of the clause ambiguous. Did these words mean that Clause 10.1 applied only to indirect or consequential loss of profit? The court ruled that the clause excluded liability for indirect/consequential loss of profits, and not direct loss of profits:

  1. The most likely (and often the only) damage that Polypearl would suffer from E.on’s failure to meet the minimum spend commitment would be a loss of profits. It was unlikely that a business person would wish to exclude this direct loss;
  2. It was more in accordance with business common sense to interpret the words in parentheses as an explanation of the phrase "indirect or consequential loss" rather than an attempt to place all loss of profits in the "indirect" category;
  3. The clause did not clearly indicate that the parties intended to abandon a claim for direct loss of profits. The clause did not go far enough to rebut the presumption that the parties to a contract do not intend to abandon any remedies for a breach of contract arising by operation of law.


Drafting tip

It seems from these cases (and others)[6] that ambiguity around whether a particular type of loss is excluded or not commonly arises where references to specific types of loss (e.g. loss of profit, revenue, goodwill etc.) are bundled in with a reference to “indirect” loss. If the intention is to exclude liability for a certain type of loss in all cases, whether the loss is direct or indirect, then one way of avoiding this ambiguity is to separate out the exclusion of liability for indirect loss and the exclusion of liability for that specific type of loss.



[1] Rainy Sky v Kookmin [2011] UKSC 50, at 14

[2] Rainy Sky v Kookmin [2011] UKSC 50, at 21

[3] Rainy Sky v Kookmin [2011] UKSC 50, at 23

[4] Gilbert-Ash (Northern) Ltd v Modern Engineering (Bristol) Ltd [1974] AC 689, at 717

[5] AstraZeneca UK Ltd v Albemarle International Corp [2011] EWHC 1574 (Comm), at 313

[6] See for example Proton Energy Group SA v Orlen Lietuva [2013] EWHC 2872 (Comm)


Leading the way in Digital Services – The UK hosts the first D5 summit

Posted on January 27th, 2015 by

The UK hosted the first summit of the 'D5' group of countries in December. The D5 Charter claims that the group, whose founding members are Estonia, Israel, New Zealand, South Korea, and the UK, comprises the most digitally advanced governments in the world.

The D5 group has been created to encourage collaboration between the founding members in the field of digital services. The D5 members have committed to working towards nine key principles of digital development.

One of the key principles of the D5 group is to make more and more of the systems, tradecraft, manuals and standards created by the D5 members ‘open source’ and shared between the members. The D5 group recognises that adhering to the group’s Charter will encourage innovation and growth in the digital economies of the D5 members and potentially lead to cost savings.

In his opening speech, Minister for the Cabinet Office Francis Maude spoke about transforming public services and delivering them in a more cost effective way by taking more services online. In particular, he cited the Government Digital Service's replacement of over 1,700 government websites with 'GOV.UK', which has saved the government £60 million, as an example of the importance of digital services.

The UK government has already taken steps to show its commitment to the D5 Charter. The code for GOV.UK is open source and the government has pledged to share it with the government of New Zealand.

The D5 members have committed themselves to opening up markets to competition, in particular to small and medium sized enterprises. The D5's commitment will be welcome news in particular for start-ups, which may already be benefiting from the UK Government's 'Small business: GREAT ambition' initiative to help small businesses grow.


The Retail Ombudsman: a new watchdog on the block

Posted on January 27th, 2015 by

Having returned from a disappointingly snow-less winter holiday I was delighted to see that The Retail Ombudsman’s (“TRO“) website is up and running.

Launched on 1 January 2015, TRO is an independent organisation established to resolve disputes between consumers and UK retailers (both bricks and mortar and online). TRO is a voluntary organisation (currently without certification or legislative backing), and is funded via membership and complaints handling fees. Retailers can sign up to TRO, whereby they agree to be bound by the scheme's Code of Practice. Consumers can submit complaints about a retailer's goods and/or services, and TRO will investigate the complaint and come to a decision as to whether the retailer is in breach of consumer protection laws. If a consumer disagrees with TRO's decision, TRO has released a statement (here) confirming that the consumer can appeal the decision through TRO or take the claim to the courts. If a member-retailer disagrees with a TRO decision, it is currently unclear what the ramifications will be, or how TRO could enforce the decision against that retailer.

TRO will also assist consumers with complaints lodged against retailers who are not members. If a non-member retailer is contacted by TRO any decision is merely advisory.

As a retailer why would I sign up with TRO?

Given the unenforceability against non-members, and the membership and complaint handling fees, it might initially appear that there is little benefit to TRO membership: a retailer would merely be paying fees in order to subject itself to a third party's (potentially) more onerous Code of Practice. However, TRO has received a great deal of publicity from a number of leading news outlets; as the TRO brand grows in the eyes of the consumer, the value and consumer trust a retailer derives from the TRO 'seal of approval' will also grow. Also, by handling consumer complaints, TRO acts as a filter for spurious complaints. This can potentially save a retailer the time and resources it might otherwise have expended dealing with the complaint itself.

It should also be mentioned that Directive 2013/11/EU on Consumer Alternative Dispute Resolution (the "Directive") – which must be implemented in the UK by 9 July 2015 – requires that (amongst other things) a certified Alternative Dispute Resolution ("ADR") body is available to handle disputes concerning contractual obligations between consumers and businesses. Given the proximity of TRO's launch date to the Directive's implementation date, it seems more likely than not that TRO will seek to become the certified ADR body for consumer complaints concerning goods and services. Businesses and consumers would therefore be sign-posted to TRO for all consumer goods and services complaints at first instance, further raising TRO's profile.

Retailers should be aware of TRO and its powers, and make an informed decision whether they wish to subject themselves to TRO's Code of Practice now, later, or not at all.

TRO’s website can be found here.


Consumer rights for digital content closer to becoming law

Posted on January 21st, 2015 by

A year after it was first introduced into parliament, the Consumer Rights Bill, labelled as the ‘biggest overhaul of consumer law for a generation’ by the government, is in the final stages of passing through parliament.

As previously reported in this blog (links here, here and here), among other reforms, the Bill introduces a new statutory framework for consumer rights in respect of digital content (i.e. consumer purchases of software, apps, and 'freemium' purchases made through 'free' apps). 'Digital content' as a legal concept was introduced by the EU Consumer Rights Directive, which came into force in 2014. Most of the Directive's requirements have already been implemented in the UK by the Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013 (see previous post here, which summarises the Regulations' rules for selling digital downloads). The Consumer Rights Bill implements a number of the Directive's remaining requirements.

The Bill introduces a statutory right to a repair or replacement from a trader where digital content is not of satisfactory quality, not fit for a particular purpose and/or does not match its description. This right won't apply to B2B transactions, as it applies only to 'consumers' – individuals acting for purposes which are wholly or mainly outside that individual's trade, business, craft or profession.

The Department for Business, Innovation and Skills (“BIS“) has indicated in its guidance on implementation that the Bill should come into force in October 2015, and expects that it will issue guidance six months beforehand in April 2015 to give businesses sufficient time to comply. BIS has also indicated that in addition to its own guidance, the Implementation Group (which includes a cross section of consumer, business and enforcement representatives) will support regulatory and trade bodies that wish to develop specific business guidance, opening the possibility for specific digital industry guidance being issued by major stakeholders in the tech industry.

The Bill has passed all stages in both houses of parliament, and is now in the stage of “ping pong” where the two houses send proposed amendments to the Bill back and forth until a final form of the Bill is agreed. Once finalised, keep an eye out for a Tech Bytes update about how this impacts your business and the steps you’ll need to take to comply before October 2015.


The Network of the Future?

Posted on December 30th, 2014 by

Just as we are all getting used to the convenience and ease in connecting to the internet that the 4G network provides – I myself having recently upgraded and been impressed by the blindingly fast connection speeds – a contender for the 'next big thing' for our connected society appears on the horizon. It is commonly (and unsurprisingly) referred to as '5G'. A pattern that seems to be emerging is that a new mobile "G", or generation, appears about every ten years – for example, the UK saw a 2G rollout begin in the early 90s, 3G in 2003 and 4G in 2012. Based on this trend, we may see the first UK rollout of 5G in the early part of the next decade.

Driven by New Technology:

Even though we have only recently seen a full-scale rollout of 4G in the UK, there are a number of technologies that demand (or will soon demand) functionality that existing and planned 4G infrastructures will be unable to meet. To my mind, it is these technologies that will shape the development of the network of the future. Whilst not describing a particular technology, and as yet not comprising any particular standards, 5G is something of a moving target. It is, however, likely to be defined by the demands of future technologies set against the limitations of current wireless infrastructure.

For most of us the 4G networks meet our current needs in terms of high speed and low latency, which seem to be the current focus of the consumer market. However, new applications are being developed all the time, and these are expected to dictate the requirements of any new network specification. A likely key driver could be the growth of the ‘Internet of Things’, which envisages a wide variety and massive number of wireless-enabled devices communicating with us and each other. Bandwidth-hungry applications such as ultra-high-definition (UHD) video and virtual reality are set to rise, and the ability to transition these high data-rate/low-latency data streams seamlessly between transmitters as users move will be key to positive user experiences.

To meet the demands of new applications and technologies, the network of the future will need to provide for incredible speeds and low latency across a wide geographical area. It will need to ensure great service in crowds by providing large bandwidth in regions of high-demand, as well as ensuring that the best experience follows the user as they move. Reliable, always-on, real-time connections are anticipated to be central to the success of 5G.

The new network will need to be smart, in that it must dynamically alter its behaviour to deliver a level of service suitable to each individual user’s requirements. For example, an embedded sensor periodically reporting on the status of critical infrastructure will need access to a reliable, robust and resilient network, but perhaps only with very low bandwidth requirements. On the other hand, a UHD video service will require high bitrates and low latency, but not necessarily all the time. Such ‘context-aware’ networking will be essential to enable the efficient servicing of the huge number of networked devices expected to be introduced over the coming years. 5G networks will need significant capacity, flexibility, and energy and cost efficiency improvements over existing 4G networks to cope with increasingly data-hungry devices.

Building the Network:

There are technical hurdles that will need to be addressed before devices and networks can meet a 5G standard, and there are a number of approaches that could help resolve them, together with the conversion or adaptation of existing wireless networks such as GSM, HSPA, LTE, and Wi-Fi. One possible technical solution to bandwidth challenges might be ‘Massive MIMO’ (Multiple Input Multiple Output), which would use hundreds of antennas at transmitters and receivers to achieve more efficient spectrum use. However, advances in radio and antenna technology will be required before large numbers of antennas can be cost-effectively deployed.

Predicted bottlenecks in capacity (number of users) and bandwidth (data volume per user) could be addressed by using other parts of the electromagnetic spectrum for broadcast. Today, mobile networks predominantly use spectrum in the sub-3GHz band. In the future, 5G could use spectrum in the 10-30 GHz and 30-300 GHz bands, where there are large chunks of contiguous spectrum available for use. Whilst these high frequencies are attenuated to a greater extent as they travel through air and obstacles (and therefore have a shorter range), they may be more suitable for urban environments with a high density of wireless devices where range is not an issue. Higher frequency transmissions may also be used to complement lower frequencies when in sufficient proximity to other networks, with lower frequencies being used to provide the core service. For further information on the current thinking at EU level on spectrum planning for 5G see here.
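The trade-off between frequency and range can be illustrated with the standard free-space path loss formula. The sketch below is a hypothetical back-of-the-envelope illustration (the 2.6 GHz and 28 GHz figures are example bands chosen for comparison, not taken from this article); real-world losses at millimetre-wave frequencies are higher still, because of absorption by air, rain and obstacles:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare a typical sub-3GHz mobile band with a millimetre-wave candidate
loss_sub3ghz = fspl_db(100, 2.6e9)   # 2.6 GHz at 100 m
loss_mmwave = fspl_db(100, 28e9)     # 28 GHz at 100 m

# The gap depends only on the frequency ratio: 20*log10(28/2.6) ~ 20.6 dB
extra_loss = loss_mmwave - loss_sub3ghz
print(f"Extra free-space loss at 28 GHz vs 2.6 GHz: {extra_loss:.1f} dB")
```

At any fixed distance, moving from 2.6 GHz to roughly 28 GHz adds about 20 dB of free-space loss (a hundredfold power reduction), which is why the higher bands lend themselves to dense, short-range urban cells rather than wide-area coverage.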

In modern wireless networks, upload and download channels tend to be given different spectrum allocations so that there is no interference between the channels. If the same allocation of spectrum could be used for both upload and download, then spectrum capacity could theoretically be doubled. However, self-interference remains a big hurdle to overcome before this technology can be implemented, although both analogue and digital techniques have been used to reduce it. The potential gains if these techniques were to be further developed are certainly of interest, and this interest might drive more funds into the development of this technology.


Whilst 5G is very much a network of the future, it is already receiving attention from both political institutions (such as the European Commission) and industry.  On that basis alone, it is definitely one to watch over the coming years.



Big Banks, Big Data… Big Opportunities?

Posted on December 12th, 2014 by

A UK government commissioned report has concluded that there is a broad market for exploitation of bank data. The report contains a strong call for pro-consumer open standards but without any real economic analysis of the cost and consequences for the banking sector.

The HM Treasury report, written by the Open Data Institute and Fingleton Associates, promotes open standards in data exchange. Open standards would allow customers to easily use comparison websites, account aggregators and other banks’ services. However, many banks see control of their own data as their lifeblood. Banks may well prove reluctant to share nicely if it affects their competitive advantage.

In supporting its findings, the Treasury report cites McKinsey research that making consumer finance data more accessible could generate $210-280bn of value globally. These staggering (if, McKinsey admits, tentative) figures are based on better financial data analytics aiding risk assessment and lending decisions. To achieve these huge benefits, a degree of data sharing is vital. However, McKinsey's figures assume a combination of open data and closed proprietary data, a point which the report ignores. A balance is necessary to allow consumer choice to grow while encouraging continued bank investment in robust infrastructure and innovative services.

The report is confused on data protection issues, assuming all sharing will be consent based. The paper’s call for financial personal data to be treated as sensitive data “as a matter of course” only creates barriers to data sharing. The position is regulated at European level and does not currently require financial data to be treated as sensitive. No doubt, any future government-led data sharing initiative will need rules of engagement compatible with data protection law. The ultimate model needs to be practical as well as compliant to allow the best use of shared data.

ODI and Fingleton reach tentative conclusions (based primarily on consultants' views and re-use of current open standards) that implementing data sharing should cost less than £1m per bank and be achievable in under a year. This seems to assume a less complex landscape than exists in many banks, with diverse data formats across their retail, mortgage, investment and pensions businesses. Perhaps the figures will be closer to those given by one of the banks themselves: that it would probably take "tens of millions of pounds". Not only is it unclear who will foot the bill, but also whether banks will be able to charge for providing the platform for other providers to piggy-back on.

Some other small set-up costs are identified, and ongoing costs are suggested to be "very low". The report (in assuming current open standards are sufficient) does not address the detailed development of taxonomies, regulation and a robust framework. These initial costings are somewhat in contrast to the Health and Social Care Information Centre's £200m per year budget. Perhaps the creation of HSCIC to share government health data provides some insights about setting up data sharing schemes (and the teething problems involved), government leadership, data protection, timing and costs.

The UK’s approach reflects a global drive to promote government data sharing. Access to detailed statistics on UK citizens should boost the economy. However, bank proprietary data is a different matter to public records. The Treasury Call for Evidence in the new year will no doubt raise some robust responses from banks, industry bodies and consumer associations.

As a first step towards exploring government-led open data standards in the banking sector, the report provides ample food for thought. Open standards will open up a world of consumer choice. However, the thinking needs development, both on the overall economic impact for the UK financial services sector; and the practicalities of a complex programme with ramifications far broader than just IT implementation.


Get Off My Cloud

Posted on December 10th, 2014 by

Microsoft appeals against ruling that US companies must disclose personal information held overseas.

The US Electronic Communications Privacy Act (ECPA) allows US law enforcement to force communications service providers to search for and seize the personal emails of their customers.

In April this year, a New York district judge ordered Microsoft to disclose the personal communications of a customer held on servers in Dublin, rejecting Microsoft’s argument that the scope of the power under the ECPA is limited to information held in the US.

Microsoft has recently appealed to the US Second Circuit Court of Appeals against this decision, again arguing that the ECPA should not be interpreted as having "extra-territorial" application, i.e. that it should not apply to information held outside the US.

The European Commissioner for Justice has also expressed concern at the ruling, stating that it may be in breach of international law and inconsistent with the protection of individuals guaranteed in the EU.


Ofcom publishes decision on 700 MHz repurposing

Posted on December 2nd, 2014 by

Regular readers may recall my article on Ofcom's consultation on repurposing the 700 MHz spectrum.

As anticipated, Ofcom has now published its decision to make spectrum in the 700 MHz band, previously used for broadcasting digital terrestrial television (DTT) and programme making and special events (PMSE), available for mobile data use.  Ofcom considers that “enabling the 700 MHz band to be used for mobile data will allow mobile networks to provide better performance at a lower cost, which will bring considerable benefits to citizens and consumers”.

Ofcom reiterates its commitment to ensuring that these changes occur in such a manner as to minimise disruption, and to safeguard the benefits to consumers that DTT and PMSE offer by moving these services to other frequencies. It is anticipated that the changes to existing uses will take place in 2019, at which time viewers may need to retune their television sets.

Ofcom hopes to make the 700 MHz band available for use in transmitting mobile data no later than 2022, and has already begun work on the changes.


First European Workshop on 5G Spectrum Planning

Posted on November 28th, 2014 by

On 13th November the European Commission held its first workshop on spectrum planning for 5G, to discuss spectrum challenges for 5G including usage aspects and technical and regulatory needs. It was attended by representatives from national regulators, industry, and research. This update sets out a few of the key comments made by the speakers.


Roberto Viola – Deputy Director-General of the European Commission Directorate General for Communications Networks, Content & Technology – said that this workshop was very timely, given that policy-makers now understand the importance of spectrum policy and the types of service that are required to support new technologies and usage. The goal, he said, is greater co-operation between European member states in the future and the use of harmonised Europe-wide measures, rather than regulators acting within national silos. He would like to see Europe becoming a world leader when it comes to spectrum policy.

5G does not describe any particular specification at this time, but is a blend of a number of different technologies, which may include new radio access technologies and existing wireless networks such as GSM, HSPA, LTE, and Wi-Fi. According to Mr Viola, it is yet to be defined accurately when it comes to policy in Europe. He recalled that frequency planning for 4G was not conducted effectively, and this has led to distractions. He hopes that a greater degree of international co-operation and planning will be achieved for 5G, as there will be no 5G without effective spectrum planning.

The Commission would like to see 1.2GHz of bandwidth allocated to mobile telecommunications for 5G – this will include both licensed spectrum and shared (unlicensed) spectrum. In particular, it intends to recycle for the mobile industry the 700MHz band currently used for digital terrestrial television broadcasting, a process that is already beginning in the United Kingdom. Spectrum sharing in the 2.3GHz band will also be vital, facilitated by advanced technology in devices as well as by an appropriate legal framework. The Commission would like to see harmonisation within the 3.4GHz band, and intends to discuss this at future World Radiocommunication Conferences. It expects the upper part of the C-Band (3.8GHz-4.2GHz) to remain allocated to satellite systems, which play an important role globally. Mr Viola also suggested that higher frequency regions of the spectrum, and in particular the 60-80GHz band, could be serious candidates for some close-proximity applications.

His personal vision for 5G is about quality of experience – for example, a seamless transition between transmitters (from larger cells, to smaller cells, to pico-cells, to nano-cells) – and he would like to see networks and infrastructure designed in a very different way to support the quality of experience. He sees sharing and layering as being the new norm using cognitive radio access technology and smart layer selection.


Karl-Heinz Laudan – Vice President of Spectrum Policy and Projects at Deutsche Telekom AG – gave his views on the anticipated use of 5G. Data components of transmissions have become more and more important over the last few years, overtaking the voice elements in mobile telecommunications, and will change further to become more machine-centric as the ‘Internet of Things’ develops. Much of this data will be video-streaming, including non-linear broadcasting, but this is unlikely to be the whole story: there will be a wide range of data traffic sources in use, all with different requirements. These requirements will be much more than just higher speed with lower latency. For example, incredible speeds (bitrates); ubiquitous device access (wide coverage); great service in crowds (bandwidth in areas of high demand); best experience following you (seamlessness); and reliable real-time connections (always on) are all anticipated to be central to the success of 5G.

Mr Laudan considers that the earlier existing technologies (2G, 3G) may fall away and the spectrum be reallocated to 5G, and that LTE/4G could be integrated into 5G, so that all mobile telecommunications in the future would be transmitted via 5G. He believes that spectrum demand on backhaul will need to be considered, and that small cells in ultra-dense networks will require a new concept for wireless backhauling. He also feels that with proper planning there is now an opportunity for harmonisation globally for a 5G band.


Peter Olsen of DIGITALEUROPE – a membership organisation for national trade associations and corporates representing the digital technology industry in Europe – highlighted how the pace of change has quickened and that timescales for increasing capacity are crucial. He reiterated the importance of identifying and understanding the use cases discussed by Mr Laudan, and designing the network accordingly whilst incorporating other advances in technology. In addition to the physical infrastructure, he also stressed the importance of cloud infrastructure and virtualisation due to the applications which are being developed. A further vital element which he emphasised is the need to keep sustainability and security on the agenda.


Darko Ratkaj of the European Broadcasting Union (EBU) spoke of media services in the context of 5G, and how broadcasters may use the new 5G networks. Whilst linear radio and television are still the core proposition for most broadcasters, there are now a multitude of options which are not bound by channel schedules and channel formats, delivered in high quality formats supported by digital (as opposed to analogue) technologies. Media is delivered to audiences via broadcast networks (DTT, satellite, and cable) and also via broadband – including fixed networks (such as IPTV and OTT) and mobile networks, which will include 5G when it is introduced. Mr Ratkaj suggested that there is no one network to rule them all, as none of these delivery methods will be accessible to all users at all times. Therefore, 5G will need to work alongside the other delivery methods to effectively serve all possible audiences.

Media distribution networks must be assessed by technical capability, reach, costs, and the ability to guarantee the prominence of services. The EBU believes that 5G can help broadcasters deliver to small devices, but it is not currently in a position to state the direction 5G should take – whether it should support only broadband delivery, or have wider implications for broadcast delivery. From the EBU’s perspective, however, the enablement of linear broadcast remains its key concern.

Mr Ratkaj believes it is vital that the design and construction of the 5G specification and infrastructure follow policy, and not the reverse.



Future-proofing the 4G Infrastructure: LAA-LTE

Posted on November 25th, 2014 by

The deployment and uptake of LTE (Long Term Evolution, commonly known as 4G), operating in various licensed spectrum bands, has been growing rapidly around the world.  However, as demand increases for more network speed and capacity, operators are looking for new ways to future-proof existing LTE infrastructure.  One such proposal is to use the unlicensed parts of the spectrum within which Wi-Fi technology operates to boost LTE’s spectral efficiency and reliability.  This technology is being referred to as Licence Assisted Access (“LAA-LTE“).

LAA-LTE was first proposed in December 2013 and has been debated by members of the international standards group 3rd Generation Partnership Project (“3GPP“) on several occasions throughout the year.  While approved in principle, progress of LAA-LTE is slow because of concerns from some 3GPP members, typically from those with significant investments in Wi-Fi hotspot infrastructure, about the interplay between Wi-Fi and LTE in the unlicensed bands.  Co-existence issues and the potential for leakage into neighbouring bands will also need to be considered.

Whilst LAA-LTE is in its infancy and remains in debate between 3GPP members, telecoms counsel should make themselves aware of this technology as it may, in particular, bring with it regulatory requirements arising from operation in the unlicensed spectrum throughout the world (such as the use of dynamic frequency selection and transmission power control).  Depending on the outcome of debates concerning LAA-LTE’s coexistence with Wi-Fi, potentially contentious considerations may also arise over spectrum sharing.

Details of the LAA study, including the study item itself, can be found on the 3GPP website.