Big Banks, Big Data… Big Opportunities?

Posted on December 12th, 2014

A UK government-commissioned report has concluded that there is a broad market for exploitation of bank data. The report contains a strong call for pro-consumer open standards, but without any real economic analysis of the cost and consequences for the banking sector.

The HM Treasury report, written by the Open Data Institute and Fingleton Associates, promotes open standards in data exchange. Open standards would allow customers to easily use comparison websites, account aggregators and other banks’ services. However, many banks see control of their own data as their lifeblood. Banks may well prove reluctant to share nicely if it affects their competitive advantage.

In supporting its findings, the Treasury report cites McKinsey research that making consumer finance data more accessible could generate $210-280bn of value globally. These staggering (if, McKinsey admits, tentative) figures are based on better financial data analytics aiding risk assessment and lending decisions. To achieve these huge benefits, a degree of data sharing is vital. However, McKinsey’s figures assume a combination of open data and closed proprietary data, a point which the report ignores. A balance is necessary to allow consumer choice to grow, while encouraging continued bank investment in robust infrastructure and innovative services.

The report is confused on data protection issues, assuming all sharing will be consent based. The paper’s call for financial personal data to be treated as sensitive data “as a matter of course” only creates barriers to data sharing. The position is regulated at European level and does not currently require financial data to be treated as sensitive. No doubt, any future government-led data sharing initiative will need rules of engagement compatible with data protection law. The ultimate model needs to be practical as well as compliant to allow the best use of shared data.

ODI and Fingleton reach tentative conclusions (based primarily on consultants’ views and re-use of current open standards) that implementing data sharing should cost less than £1m per bank and be achievable in under a year. This seems to assume a less complex landscape than exists in many banks, with diverse data formats across their retail, mortgage, investment and pensions businesses. Perhaps the figures will be closer to those given by one of the banks themselves: that it would probably take “tens of millions of pounds”. Not only is it unclear who will foot the bill, but also whether banks will be able to charge for providing the platform for other providers to piggyback on.

Some other small set-up costs are identified and ongoing costs are suggested to be “very low”. The report (in assuming current open standards are sufficient) does not address the detailed development of taxonomies, regulation and a robust framework. These initial costings contrast somewhat with the Health and Social Care Information Centre’s £200m per year budget. Perhaps the creation of the HSCIC to share government health data provides some insights into setting up data sharing schemes (and the teething problems involved), government leadership, data protection, timing and costs.

The UK’s approach reflects a global drive to promote government data sharing. Access to detailed statistics on UK citizens should boost the economy. However, bank proprietary data is a different matter to public records. The Treasury Call for Evidence in the new year will no doubt raise some robust responses from banks, industry bodies and consumer associations.

As a first step towards exploring government-led open data standards in the banking sector, the report provides ample food for thought. Open standards will open up a world of consumer choice. However, the thinking needs development, both on the overall economic impact for the UK financial services sector; and the practicalities of a complex programme with ramifications far broader than just IT implementation.


Get Off My Cloud

Posted on December 10th, 2014

Microsoft appeals against ruling that US companies must disclose personal information held overseas.

The US Electronic Communications Privacy Act (ECPA) allows US law enforcement to force communications service providers to search for and seize the personal emails of their customers.

In April this year, a New York district judge ordered Microsoft to disclose the personal communications of a customer held on servers in Dublin, rejecting Microsoft’s argument that the scope of the power under the ECPA is limited to information held in the US.

Microsoft has recently appealed to the US Second Circuit Court of Appeals against this decision, again arguing that the ECPA should not be interpreted as having “extra-territorial” application – i.e. that it should not apply to information held outside the US.

The European Commissioner for Justice has also expressed concern at the ruling, stating that it may be in breach of international law and inconsistent with the protection of individuals guaranteed in the EU.


Ofcom publishes decision on 700 MHz repurposing

Posted on December 2nd, 2014

Regular readers may recall my article on Ofcom’s consultation on a repurposing of the 700 MHz spectrum.

As anticipated, Ofcom has now published its decision to make spectrum in the 700 MHz band, previously used for broadcasting digital terrestrial television (DTT) and programme making and special events (PMSE), available for mobile data use.  Ofcom considers that “enabling the 700 MHz band to be used for mobile data will allow mobile networks to provide better performance at a lower cost, which will bring considerable benefits to citizens and consumers”.

Ofcom reiterates its commitment to ensuring that these changes occur in such a manner as to minimise disruption, and safeguarding the benefits to consumers that DTT and PMSE offer by moving these to other frequencies.  It is anticipated that the changes to existing uses will take place in 2019, at which time viewers may need to retune their television sets.

Ofcom hopes to make the 700 MHz band available for use in transmitting mobile data no later than 2022, and has already begun work to make the changes.


First European Workshop on 5G Spectrum Planning

Posted on November 28th, 2014

On 13th November the European Commission held its first workshop on spectrum planning for 5G, discussing the spectrum challenges for 5G, including usage aspects and technical and regulatory needs. It was attended by representatives from national regulators, industry and research. This update sets out a few of the key comments made by the speakers.


Roberto Viola – Deputy Director-General of the European Commission Directorate General for Communications Networks, Content & Technology – said that the workshop was very timely, given that policy-makers now understand the importance of spectrum policy and the types of service that are required to support new technologies and usage. His stated goal is greater co-operation between European member states in the future and the use of harmonised Europe-wide measures, rather than regulators acting within national silos. He would like to see Europe becoming a world leader when it comes to spectrum policy.

5G does not describe any particular specification at this time, but is a blend of a number of different technologies, which may include new radio access technologies, and existing wireless networks such as GSM, HSPA, LTE, and Wi-Fi. According to Mr Viola, it is yet to be defined accurately when it comes to policy in Europe. He recalls that when frequency planning for 4G was conducted, it was not planned effectively – this has led to distractions. He hopes that when it comes to 5G a greater degree of international co-operation and planning will be achieved, as there will be no 5G without effective spectrum planning.

The Commission would like to see 1.2GHz of bandwidth allocated to mobile telecommunications for 5G – this will include both licensed spectrum and shared (unlicensed) spectrum. In particular, it intends to recycle the 700MHz band, currently used for digital terrestrial television broadcasting, for the mobile industry – a process which is already beginning in the United Kingdom. Spectrum sharing in the 2.3GHz band will also be vital, facilitated by advanced technology in devices, as well as by an appropriate legal framework. The Commission would like to see harmonisation within the 3.4GHz band, and intends to discuss this at future World Radio Conferences. It expects the upper part of the C-Band (3.8GHz-4.2GHz) to remain allocated to satellite systems, which play an important role globally. Mr Viola also suggested that higher frequency regions of the spectrum, and in particular the 60-80GHz band, could be serious candidates for some close proximity applications.

His personal vision for 5G is about quality of experience – for example, a seamless transition between transmitters (from larger cells, to smaller cells, to pico-cells, to nano-cells) – and he would like to see networks and infrastructure designed in a very different way to support the quality of experience. He sees sharing and layering as being the new norm using cognitive radio access technology and smart layer selection.


Karl-Heinz Laudan – Vice President of Spectrum Policy and Projects at Deutsche Telekom AG – gave his views on the anticipated use of 5G. Data components of transmissions have become more and more important over the last few years, overtaking the voice elements in mobile telecommunications, and will change further to become more machine-centric as the ‘Internet of Things’ develops. Much of this data will be video-streaming, including non-linear broadcasting, but this is unlikely to be the whole story and there will be a wide range of data traffic sources in use, all of which will have different requirements. These requirements will be much more than just higher speed with lower latency. For example, incredible speeds (bitrates); ubiquitous device access (wide coverage); great service in crowds (bandwidth in areas of high demand); best experience following you (seamlessness); and reliable real-time connections (always on), are all anticipated to be central to the success of 5G.

Mr Laudan considers that the earlier existing technologies (2G, 3G) may fall away and the spectrum be reallocated to 5G, and that LTE/4G could be integrated into 5G, so that all mobile telecommunications in the future would be transmitted via 5G. He believes that spectrum demand on backhaul will need to be considered, and that small cells in ultra-dense networks will require a new concept for wireless backhauling. He also feels that with proper planning there is now an opportunity for harmonisation globally for a 5G band.


Peter Olsen of DIGITALEUROPE – a membership organisation for national trade associations and corporates representing the digital technology industry in Europe – highlighted how the pace of change has quickened and that timescales for increasing capacity are crucial. He reiterated the importance of identifying and understanding the use cases discussed by Mr Laudan, and designing the network accordingly whilst incorporating other advances in technology. In addition to the physical infrastructure, he also stressed the importance of cloud infrastructure and virtualisation due to the applications which are being developed. A further vital element which he emphasised is the need to keep sustainability and security on the agenda.


Darko Ratkaj of the European Broadcasting Union (EBU) spoke of media services in the context of 5G, and how broadcasters may use the new 5G networks. Whilst linear radio and television are still the core proposition for most broadcasters, there are now a multitude of options which are not bound by channel schedules and channel formats, in high quality formats supported by digital (as opposed to analogue) technologies. Media is delivered to audiences via broadcast networks (DTT, satellite, and cable), and also broadband – including fixed networks (such as IPTV and OTT) and mobile networks which will include 5G when it is introduced. Mr Ratkaj suggests that there is no one network to rule them all, as none of these delivery methods will be accessible to all users at all times. Therefore, 5G will need to work alongside the other delivery methods to effectively serve all possible audiences.

Media distribution networks must be assessed by technical capability, reach, costs, and the ability to guarantee the prominence of services. The EBU believes that 5G can help broadcasters deliver to small devices, but it is not currently in a position to state the direction which 5G should take – whether it should support only broadband delivery, or have wider implications for broadcast delivery. From the EBU’s perspective, however, the enablement of linear broadcast remains its key concern.

Mr Ratkaj believes it is vital that the design and construction of the 5G specification and infrastructure must follow policy, and not the reverse.



Future-proofing the 4G Infrastructure: LAA-LTE

Posted on November 25th, 2014

The deployment and uptake of LTE (Long Term Evolution, commonly known as 4G), operating in various licensed spectrum bands, has been growing rapidly around the world.  However, as demand increases for more network speed and capacity, operators are looking for new ways to future-proof existing LTE infrastructure.  One such proposal is to use the unlicensed parts of the spectrum in which Wi-Fi technology operates to boost LTE’s spectral efficiency and reliability.  This technology is being referred to as Licence Assisted Access (“LAA-LTE“).

LAA-LTE was first proposed in December 2013 and has been debated by members of the international standards group 3rd Generation Partnership Project (“3GPP“) on several occasions throughout the year.  While approved in principle, progress of LAA-LTE is slow because of concerns from some 3GPP members, typically from those with significant investments in Wi-Fi hotspot infrastructure, about the interplay between Wi-Fi and LTE in the unlicensed bands.  Co-existence issues and the potential for leakage into neighbouring bands will also need to be considered.

Whilst LAA-LTE is in its infancy and remains in debate between 3GPP members, telecoms counsel should make themselves aware of this technology as it may, in particular, bring with it regulatory requirements arising from operation in the unlicensed spectrum throughout the world (such as the use of dynamic frequency selection and transmission power control).  Depending on the outcome of debates concerning LAA-LTE’s coexistence with Wi-Fi, potentially contentious considerations may also arise over spectrum sharing.

Details of the LAA study, including the study item, can be found on the 3GPP website.


Ofcom launches high data capacity spectrum auction consultation

Posted on November 21st, 2014

Ofcom published a consultation on the release of spectrum in the 2.3 GHz and 3.4 GHz bands on 7 November 2014. Potential bidders will be able to submit their comments until 23 January 2015. The auction is expected at the end of 2015 or the start of 2016.

No specific uses have been prescribed for this spectrum. However, these bands are suitable for carrying very high volumes of data, making them ideal for mobile broadband services. The most recent mobile handsets released by the major players are compatible with the 2.3 GHz spectrum in other countries. The 2.3 GHz band is used for 4G mobile broadband networks in ten non-European countries, and the 3.4 GHz band is already used for 4G wireless broadband in the UK and a further five countries.

Ofcom proposes to auction 40 MHz of spectrum within the 2.3 GHz band and 150 MHz of spectrum within the 3.4 GHz band, in 38 lots of 5 MHz. Reserve prices of between £2.5m and £5m per lot for the 2.3 GHz spectrum, and £1m per lot for the 3.4 GHz spectrum, are proposed.
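The lot arithmetic above is easy to sanity-check. The sketch below is illustrative only: the bandwidth and reserve-price figures are simply those reported in the consultation as summarised above, not Ofcom's own auction model, and the aggregate reserve range is my own back-of-the-envelope calculation.

```python
# Back-of-the-envelope check of the proposed auction structure.
# Figures are as reported in the consultation summary above.

LOT_SIZE_MHZ = 5

bands = {
    "2.3 GHz": {"bandwidth_mhz": 40, "reserve_low_gbp_m": 2.5, "reserve_high_gbp_m": 5.0},
    "3.4 GHz": {"bandwidth_mhz": 150, "reserve_low_gbp_m": 1.0, "reserve_high_gbp_m": 1.0},
}

total_lots = 0
reserve_low = reserve_high = 0.0
for name, band in bands.items():
    lots = band["bandwidth_mhz"] // LOT_SIZE_MHZ
    total_lots += lots
    reserve_low += lots * band["reserve_low_gbp_m"]
    reserve_high += lots * band["reserve_high_gbp_m"]
    print(f"{name}: {lots} lots of {LOT_SIZE_MHZ} MHz")

# 8 lots at 2.3 GHz plus 30 lots at 3.4 GHz gives the 38 lots proposed.
print(f"Total: {total_lots} lots; aggregate reserve £{reserve_low:.0f}m-£{reserve_high:.0f}m")
```

On these figures the 38-lot total checks out, and the aggregate reserve across both bands would sit somewhere between £50m and £70m.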

This spectrum has been released by the Ministry of Defence under the Government’s initiative to free up public sector spectrum.


The UK’s implementation of the new EU Procurement Directive

Posted on November 11th, 2014

The government has published proposed draft regulations to implement the new EU Public Procurement Directive (2014/24/EU, the “Directive“). The Directive is part of a package of measures that will reform public sector procurement across the EU and must be implemented in Member States by 18 April 2016. The government has indicated that it aims to implement the Directive sooner than the 18 April deadline.

The government proposes to adopt a “copy out” approach for much of the Directive. Most of the Directive’s provisions are mandatory and do not leave room for altering its substance when transposing it into UK law. However, there are areas of the Directive that are not mandatory or where the Directive leaves room for Member States to determine their national rules. This blog looks at three such areas and the government’s proposed approach.

The Light-Touch Regime

In a limited number of circumstances the Directive gives the government scope to make choices about how to implement it. Perhaps the most significant change brought in by the Directive is the abolition of the distinction between “Part A” and “Part B” (“priority” and “non-priority”) services and the introduction of a new “light-touch” regime for social and other specific services, set out in Schedule 3 to the draft Public Contracts Regulations (the “Regulations“). Many (but not all) services categorised as “Part B” services under the current regime will fall within the new light-touch regime, and procurements for them will be subject to it if the contract value is €750,000 or more.
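The gating test described above reduces to two conditions. As a hypothetical sketch only: the €750,000 threshold is taken from the text above, but the function name and structure are my own illustration and do not appear in the Regulations.

```python
# Hypothetical sketch of the light-touch gating test described above.
# The EUR 750,000 threshold is from the Directive as summarised in the
# post; the names and structure here are illustrative only.

LIGHT_TOUCH_THRESHOLD_EUR = 750_000

def light_touch_applies(is_schedule_3_service: bool, contract_value_eur: float) -> bool:
    """A Schedule 3 service falls under the light-touch regime only when
    the contract value meets or exceeds the threshold."""
    return is_schedule_3_service and contract_value_eur >= LIGHT_TOUCH_THRESHOLD_EUR

print(light_touch_applies(True, 900_000))    # Schedule 3 service above threshold
print(light_touch_applies(True, 500_000))    # Schedule 3 service below threshold
print(light_touch_applies(False, 2_000_000)) # high value, but not a Schedule 3 service
```

A contract below the threshold, or for a service outside Schedule 3, simply falls under whichever other parts of the regime apply to it.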

Member States have flexibility to devise their own national rules for the award of contracts for Schedule 3 services. The government has taken a “minimalistic” approach to the UK’s light touch regime. Contracting authorities will be able to determine the procedures to be applied in connection with the award of contracts for Schedule 3 services, as long as those procedures are sufficient to ensure compliance with the principles of transparency and equal treatment of economic operators.

SME Access/Division of contracts into lots

The Directive aims to increase the possibilities for small and medium-sized enterprises (SMEs) to participate in large-scale public procurements by introducing new mechanisms allowing contracting authorities to award contracts in the form of lots. Member States have a choice over whether the division of contracts into lots should be mandatory under national law. The government proposes to allow contracting authorities to decide whether to award a contract in the form of lots on a case-by-case basis. Where a contracting authority decides not to divide a contract into lots, it would have to provide an indication of the reasons for its decision. The government also proposes to allow bidders to tender for combined lots. Contracting authorities would need to make clear in the procurement documents the possibility that contracts will be awarded for combined lots and indicate the lots that may be combined.

Bidders’ past performance

The Directive gives Member States the option to require contracting authorities to exclude economic operators from participating in a procurement procedure if that economic operator has “shown significant or persistent deficiencies in the performance of a substantive requirement under a prior public contract … which led to early termination of that prior contract, damages or other comparable sanctions”.

The current wording of the Regulations does not mandate an economic operator’s exclusion from a procurement procedure where there have been significant deficiencies in its past performance; the contracting authority retains discretion over whether or not to exclude the economic operator. The government has proposed that the “default” exclusion period should be three years, the maximum permitted under the Directive.
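The proposed window can be sketched as a simple date check. Everything here apart from the three-year figure is my own illustration (the Regulations define the actual trigger events, not this function), and note the discretion point: a termination falling inside the window permits, rather than compels, exclusion.

```python
# Illustrative sketch of the proposed "default" three-year exclusion
# window. Names and structure are hypothetical; the contracting
# authority retains discretion even when this returns True.
from datetime import date

EXCLUSION_YEARS = 3

def within_exclusion_window(contract_terminated: date, procurement_starts: date) -> bool:
    """True if the earlier termination falls within the three-year
    look-back, so the authority MAY (not must) exclude the bidder."""
    try:
        cutoff = contract_terminated.replace(year=contract_terminated.year + EXCLUSION_YEARS)
    except ValueError:
        # Termination on 29 February: roll the cutoff back to 28 February.
        cutoff = contract_terminated.replace(
            year=contract_terminated.year + EXCLUSION_YEARS, day=28
        )
    return procurement_starts <= cutoff

print(within_exclusion_window(date(2013, 6, 1), date(2015, 3, 1)))  # inside the window
print(within_exclusion_window(date(2011, 6, 1), date(2015, 3, 1)))  # outside the window
```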

It is anticipated that the Government will publish guidance material for contracting authorities on how to exercise their discretion to exclude bidders based on their past performance.


Not-spots and Revelations – network coverage in the UK

Posted on November 6th, 2014

Having more to do with Muse than you might imagine, yesterday the UK government launched a consultation to look at proposals to improve network coverage in areas with access to some, but not all, of the 4 major networks (EE, O2, Three and Vodafone). These partial ‘not-spots’ affect approximately 20% of UK landmass. The consultation is asking for opinions on three different options to address partial not-spots (and considers a fourth ‘do-nothing’ option).

Option 1: infrastructure sharing

This proposal refers to Mobile Network Operator (MNO) site sharing, mast sharing and full radio access network sharing.

To achieve an infrastructure sharing programme the government is proposing to direct Ofcom to vary Wireless Telegraphy Act licence (“Telephony Licence”) terms to impose a coverage obligation on all MNOs, requiring them to achieve geographic coverage equal to the combined coverage of all MNOs. The manner in which each MNO achieves this would be at its discretion.

Option 2: Multi-Operator Mobile Virtual Network Operators (MO-MVNO)

This proposal aims to encourage the use of a hybrid Mobile Virtual Network Operator (MVNO) model. An MO-MVNO would have agreements with two or more MNOs and would need to provide consumers with access to multiple networks.

To ensure that existing MVNOs and new market entrants could operate under this model the government would want to ensure that MNO and MVNO agreements do not contain exclusivity provisions. To do so they are proposing to direct Ofcom to vary Telephony Licence terms so that MNO/MVNO agreements cannot restrict an MVNO’s right to enter into agreements with other MNOs.

Option 3: national roaming

Under this proposal it would be obligatory for MNOs to make available, in partial not-spots, their coverage to other MNOs which do not have coverage in the area. Due to recognised technical and cost hurdles, only non-seamless national roaming for voice and text services would be required. 3G and 4G coverage would not need to be made available under this proposal.

To mandate national roaming the government would introduce secondary legislation to direct Ofcom to vary MNO Telephony Licences and introduce a non-seamless roaming requirement in areas where there are partial not-spots, as well as impose pricing restrictions.


Whilst addressing partial mobile coverage in different ways, each proposal introduces additional, and in some cases significant, obligations into existing and new MNO Telephony Licences and agreements. Now is the time for MNOs and other interested parties to weigh in with their arguments for the pros and cons of each option, or to state why existing infrastructure sharing projects and mechanisms are sufficient to address coverage concerns. Responses to the consultation must be submitted by 26 November 2014.

If the plans to address partial not-spots go ahead maybe we will be able to finally stream our favourite artists everywhere in the UK…

The press release and consultation, including the draft infrastructure sharing, MO-MVNO, and national roaming directions, are available online.


New ISO Standard for Cloud Computing

Posted on November 5th, 2014

The summer of 2014 saw another ISO standard published by the International Organization for Standardization (ISO). ISO27018:2014 is a voluntary standard governing the processing of personal data in the public cloud.

With the catchy title of “Information technology – Security techniques – Code of practice for protection of personally identifiable information (PII) in public clouds acting as PII processors” (“ISO27018“), it is perhaps not surprising that this long-awaited standard is yet to trip off the tongue of every cloud enthusiast.  European readers may have assumed references to PII meant this standard was aimed firmly at the US – wrong!

What is ISO27018?

ISO27018 sets out a framework of “commonly accepted control objectives, controls and guidelines” which can be followed by any data processors processing personal data on behalf of another party in the public cloud.

ISO27018 has been crafted by ISO to have broad application: from large organisations to small, and from public entities and governments to non-profits.

What is it trying to achieve?

Negotiations in cloud deals which involve the processing of personal data tend to be heavily influenced by the customer’s perceptions of heightened data risk and by sometimes very real challenges to data privacy compliance. This is a hurdle for many cloud adopters as they relinquish control over data and rely on the actions of another (and sometimes those under its control) to maintain adequate safeguards. In Europe, until we see the new Regulation perhaps, a data processor has no statutory obligations when processing personal data on behalf of another. ISO27018 goes some way towards imposing on the processor a level of responsibility for the personal information it processes.

ISO27018’s introductory pages call out its objectives:

  1. It’s a tool to help the public cloud provider comply with applicable obligations: for example, there are requirements that the public cloud provider only processes personal information in accordance with the customer’s instructions, and that it should assist the customer in cases of data subject access requests;
  2. It’s an enabler of transparency allowing the provider to demonstrate why their cloud services are well governed: imposing good governance obligations on the public cloud provider around its information security organisation (eg the segregation of duties) and objectives around human resource security prior to (and during employment) and encouraging programmatic awareness and training. Plus it echoes the asset management and access controls elements of other ISO standards (see below);
  3. It will assist the customer and vendor in documenting contractual obligations: by addressing typical contractually imposed accountability requirements – data breach notification, imposing adequate confidentiality obligations on individuals touching the data, flowing down technical and organisational measures to sub-processors, and requiring the documentation of data location. This said, a well-advised customer may wish to delve deeper, as this is not a full replacement for potential data controller-to-processor controls; and
  4. It offers the public cloud customer a mechanism to exercise audit and compliance rights: with ISO27018’s potential application across disparate cloud environments, it remains to be seen whether a third party could certify compliance against some of the broader data control objectives contained in ISO27018. However, regular review and reporting and/or conformity reviews may provide a means for vendor or third-party verification (potentially of more use where shared and/or virtualised server environments practically frustrate direct audit by the customer of data, systems and data governance practices).

ISO27018 goes some way towards delivering these safeguards. It is also a useful tool for a customer to evaluate the cloud services and data handling practices of a potential supplier. But it’s not simple and it’s not a substitute for imposing compliance and control via contract.

A responsible framework for public cloud processors

Privacy laws around the world prescribe nuanced, and sometimes no, obligations upon those who determine the manner in which personal information is used. Though ISO27018 is not specifically aimed at the challenges posed by European data protection laws, or those of any other jurisdiction for that matter, it is flexible enough to accommodate many of the inevitable variances. It cannot fit all current rules and may not fit future ones. However, in building in this flexibility, it loses some of its potential bite to generality.

Typically, entities adopting ISO27001 (Information security management) are seeking to protect their own data assets, but it is increasingly a benchmark standard for data management and handling among cloud vendors. ISO27018 builds upon ISO27002 (Information technology – Security techniques – Code of practice for information security controls), reflecting its controls but adapting them for the public cloud: mapping back to ISO27002 obligations where they remain relevant, and supplementing these controls where necessary by prescribing additional controls for public cloud service provision (set out separately in Annex A to ISO27018). As you may therefore expect, ISO27018 explicitly anticipates that a personal information controller would be subject to wider obligations than those specified here, which are aimed at processors.

Adopting ISO27018

Acknowledging that the standard cannot be all-encompassing, and that the flavours of cloud are wide and varied, ISO27018 calls for an assessment to be made across applicable personal information “protection requirements”.  ISO27018 calls for the organisation to:

  1. Assess the legal, statutory, regulatory and contractual obligations of it and its partners (noting particularly that some of these may mandate particular controls (for example preserving the need for written contractual obligations in relation to data security under the Directive (95/46/EC) 7th Principle));
  2. Complete a risk assessment across its business strategy and information risk profile; and
  3. Factor in corporate policies (which may, at times, go further than the law for reasons of principle, global conformity or because of third party influences).

What ISO27018 should help with

ISO27018 offers a reference point for controllers who wish to adopt cloud solutions run by third party providers. It is a cloud computing information security control framework which may form part of a wider contractual commitment to protect and secure personal information.

As we briefly explained in an earlier post in our tech blog, the European Union has also spelled out its desire to promote uniform standard setting in cloud computing. ISO27018 could satisfy the need for a broadly applicable, auditable data management framework for public cloud provision. But it is not EU specific and lacks some of the rigour an EU-based customer may seek.

What ISO27018 won’t help with

ISO27018 is not an exhaustive framework. There are a few obvious flaws:

  1. It’s been designed for use in conjunction with the information security controls and objectives set out in ISO27002 and ISO27001, which provide general information security frameworks. This is a high threshold for small or emerging providers (many of which do not meet all these controls or certify to these standards today). It is therefore more accessible for large enterprise providers, but something to weigh up – the more controls there are, the more ways there are to slip up;
  2. While it may be used as a benchmark for security, coupled with contractual commitments to meet and maintain selected elements of ISO27018, it won’t be relevant to all cloud solutions and compliance situations (though some will use it as if it were);
  3. It perpetuates the use of the PII moniker which, already holding a specific US legal connotation (i.e. a narrower application), is now used in a more widely defined context under ISO27018 (in fact, PII under ISO27018 is closer to the definition of personal data under EU Directive 95/46/EC). This could confuse the stakeholders in multi-national deals, and the corresponding use of PII in the full title of ISO27018 potentially misleads as to the standard’s applicability and use cases;
  4. ISO27018 is of no use in situations where the cloud provider is (or assumes the role of) data controller, and it assumes all data in the cloud is personal data (so watch this space for ISO27017 (coming soon), which will apply to any data (personal or otherwise)); and
  5. For EU based data controllers, other than constructing certain security controls, ISO27018 is not a mechanism or alternative route to legitimise international data transfers outside of the European Economic Area. Additional controls will have to be implemented to ensure such data enjoys adequate protection.

What now?

ISO27018 is a voluntary standard, not law, and it won’t entirely replace the need for specific contractual obligations around processing, accessing and transferring personal data. In a way, its ultimate success can be gauged by the extent of its eventual adoption. It will be used to differentiate, but it will not always answer all the questions a well-informed cloud adopter should be asking.

It may be used in whole or in part, and may be asserted and used alongside or as part of contractual obligations, as information handling best practice, or simply as a benchmark which a business will work towards. Inevitably there will be those who treat the standard as if it were the law, without thought about what they are seeking to protect against and what potential wrongs they are seeking to right. If so, they will not reap the value of this kind of framework.



Small Business, Enterprise and Employment Bill Consultation

Posted on October 27th, 2014

It is no secret that SMEs find it difficult to get involved in public sector contract opportunities because of, amongst other things, the cost and amount of time they need to invest in a sometimes cumbersome and bureaucratic bidding process which gives them no guarantee that they’d actually win the work. They’re up against large corporates who have very deep pockets, lots of bidding and public sector experience, and immense pressure to win deals (sometimes at whatever the cost!).

Since 2010, the Government has been taking steps to remove these barriers, such as lean procurement methods and (more recently) the procurement law reforms following Lord Young’s Growing Your Business report.  Now, the Government has launched a consultation to help SMEs gain even better access to public sector opportunities. The Small Business, Enterprise and Employment Bill (“SBEE”) aims “to build a stronger economy and improve the general climate in which small businesses operate“. A clause in the SBEE will, subject to Parliamentary procedure:

  • give the Government the ability to deliver “key measures to help to ensure that remaining barriers for small businesses are removed, procurement practices become more efficient and small businesses have better opportunities to grow“; and
  • enable the Government to issue guidance which contracting authorities will be obliged to take into account.

The consultation seeks the views of buyers, sellers and other stakeholders so that the Government can ensure that SMEs can better and more directly access public sector opportunities. It seeks views on three specific measures:

  • Measure 1 – duties to exercise procurement functions in an efficient and timely manner;
  • Measure 2 – a duty to make available, free of charge, information or documents, or processes necessary for any potential supplier to bid for a contract opportunity; and
  • Measure 3 – a duty to accept electronic invoices.

The consultation ends on 13th November 2014.

I would encourage SMEs to participate, and if you would like more information on the consultation process, the SBEE or any other procurement matters, please don’t hesitate to get in touch.