Payment Terminal

Technology is entwined with much of our day-to-day lives, with no better example than the growth of the smartphone, a device now seen as a must-have. Payment and banking are almost unrecognisable from ten years ago, with online banking, mobile apps, ‘chip and pin’, contactless payments and online payments.

At events, however, many attendees find that making a simple credit/debit card payment can be a frustrating and unreliable experience. For us as technology providers, ‘credit card machines’, or PDQs as they are known, come top of the list of complaints from event organisers, traders and exhibitors.

These problems not only cause frustration for attendees but also present a serious issue in terms of financial return for traders and exhibitors, and their desire to be present at events. It is well documented that the ability to take contactless and chip & pin payments at events increases takings, reduces risks from large cash volumes and can improve flow and trackability.

So why is it such a problem? Much of the issue comes down to poor communication and misinformation on top of what is already a relatively complex environment. Card payments and the machines which take them are highly regulated by the banking industry, which means they tend to lag behind other technology. However, this can be overcome, and a properly thought-through approach can deliver large-scale, reliable payment systems.

Bad Terminology

A lot of the confusion around PDQ machines comes from the design and terminology used. Although the machines all look the same, there are differences in the way they work. Nearly all PDQs use the design of a cradle/base station with a separate handheld unit, and the handheld part connects to the base station using Bluetooth. This is where the confusion starts: people often describe these units as ‘wireless’ because of the Bluetooth, yet their actual method of connectivity to the bank may be one of four different types:

  • Telephone Line (PSTN – Public Switched Telephone Network) – This is the oldest and, until a few years ago, the most common type of device. It requires a physical telephone line between the PDQ modem and the bank. It’s slow, difficult and very costly to use at event sites because of the need for a dedicated physical phone line; however, once it is working it is reliable.
  • Mobile PDQ (GPRS/GSM) – Currently the most common form of PDQ, it uses a SIM card to connect to a mobile network and reach the bank over GSM or GPRS. Originally seen as the go-anywhere device, in the right situation they are excellent; however, they have limitations, the most obvious being that they require a working mobile network to operate. At busy event sites the mobile networks rapidly become saturated, which means the devices cannot connect reliably. As they use older GPRS/GSM technology they are also very slow – it makes no difference if you try and use the device in a 4G area, as it can only work over GPRS/GSM. As they use the mobile operator networks they may also incur data charges.
  • Wi-Fi PDQ – Increasingly common, this version connects to a Wi-Fi network to reach the bank. On the surface this sounds like a great solution, but there are some challenges. Firstly, it needs a good, reliable Wi-Fi network. Secondly, many Wi-Fi PDQs still operate only on the 2.4GHz Wi-Fi spectrum, which on event sites is heavily congested and suffers lots of interference, making the devices unreliable. This is not helped by the relatively weak Wi-Fi components in a PDQ compared to, say, a laptop. It is essential to check that any Wi-Fi PDQ is capable of operating in the less congested 5GHz spectrum.
  • Wired IP PDQ – Often maligned because people assume it has no ‘wireless’ handset. In fact it has the same wireless handset as the others; the difference is that the base station uses a physical cable (Cat5) to connect to a network. In this case the network is a computer network using TCP/IP, and the transactions are routed in encrypted form across the internet. If a suitable network is available on an event site then this type of device is the most reliable and fastest, and there are no call charges.

All of these units look very similar and in fact can be built to operate in any of the four modes. However, because banks ‘certify’ units, they generally only approve one type of connectivity in a particular device. This is slowly starting to change, but the vast majority of PDQs in the market today can only operate on one type of connectivity and this is not user configurable.

On top of these aspects there is also the difference between ‘chip & pin’ and ‘contactless’. Older PDQs typically can only take ‘chip & pin’ cards whereas newer devices should also be enabled for contactless transactions.

Myth or Fact

Alongside confusion around the various types of PDQs there is a lot of conflicting and often inaccurate information circulated about different aspects of PDQs. Let’s start with some of the more common ones.

I have a good signal strength so why doesn’t it work?

The reporting of signal strength on devices does nothing but create frustration. Firstly because it is highly inaccurate and crude, and secondly because it means very little – a ‘good’ signal indicator does not mean that the network will work!

The issue is that signal strength does not mean there is capacity on the network. It is frequently the case at event sites that a mobile phone will show full signal strength, because a temporary mobile mast has been installed, yet there is not enough data capacity to service the devices, so the network does not work. A useful analogy is a very busy motorway: you can get on, but you won’t necessarily go anywhere. The same can be true on a poorly designed Wi-Fi network, or a well-designed Wi-Fi network which doesn’t have enough internet capacity.

In fact you can have a low signal strength and still get very good data throughput on a well-designed network. Modern systems also use a technique known as ‘beam-forming’, where a device is not prioritised until it is actually transmitting data, so it may show a low signal strength that increases once it is doing something.

On the flip side, your device may show a good signal strength but the quality of the signal may be poor; this could be due to interference, poor design or sometimes even weather and environmental conditions!

Wi-Fi networks are less secure than mobile networks

There are two parts to this. Firstly, all PDQs encrypt their data no matter what type of connection they use; they have to in order to meet banking standards (PCI-DSS) and protect against fraud. The second aspect is that a well-designed Wi-Fi network is as secure, if not more secure, than a mobile network. A good Wi-Fi network will use authentication, strong encryption and client isolation to protect devices, and all PDQs should be connected to a separate ‘virtual network’ to isolate them from any other devices.

You have to keep logging into the Wi-Fi network

Wi-Fi networks can be configured in many ways but for payment systems there should be no need to keep having to log in. This problem tends to be seen when people are trying to use a payment system on a ‘Public Wi-Fi network’ which will often have a login hijack/splash page and a time limit.

A multi-network M2M GPRS/GSM SIM is guaranteed to work

Sadly this is not true. Although a PDQ with a SIM card which can roam between mobile networks and use GPRS or GSM may offer better connectivity, there is no guarantee. Some event sites have little or no coverage from any mobile operator, and even where there is coverage, capacity is generally the limiting factor.

Mobile signal boosters will solve my problem

Mobile signal boosters, or more correctly signal repeaters, are used professionally by mobile operators in some circumstances – for example inside large buildings, to create coverage where signal strength is very weak due to the construction (perhaps a lot of glass or metal which reduces signals from outside). In the UK the purchase and use of them by anyone other than a mobile operator is illegal (they can cause more problems with interference). For temporary event sites they provide little benefit anyway, as it is typically a capacity issue which is the root cause of problems.

A Personal hotspot (Mi-Fi) will solve my problem

Personal hotspots or Mi-Fi devices work by connecting to a mobile network for their connectivity and then broadcasting a local Wi-Fi network for devices to connect to. Unfortunately, at event sites where the mobile networks are already overloaded these devices offer little benefit, and even if they can connect to a mobile network the Wi-Fi side struggles against all the other wireless devices. On top of that, these devices cause additional interference for any existing on-site network, making the whole situation even worse.

The Next Generation & the Way Forward…

The current disrupters in the payment world are the mobile apps and devices such as PayPal Here and iZettle. Although they avoid the traditional PDQ they still require good connectivity, either from the mobile networks or a Wi-Fi network, and hence the root problem still exists.

Increasingly, exhibitors are also using online systems to extend their offerings at events via tablets and laptops, which also require connectivity. An even better connection is required for these devices as they are often transferring large amounts of data, placing more demands on the network. Even virtual reality is starting to appear on exhibitors’ stands, so there is no doubt that the demand for good connectivity will continue to increase year on year.

What the history of technology teaches us is that demand always runs ahead of capacity. This is especially true when it comes to networks. For mobile operators to deliver the level of capacity required at a large event is costly and complex, and in some cases just not possible due to limits on available wireless spectrum.

4G is a step forward but still comes nowhere close to meeting the need in high demand areas such as events, and that situation will worsen as more people move to 4G and the demand for capacity increases. Already the talk is of 5G but that is many years away.

For events, realistically, the position for the foreseeable future is a mixed one. For small events in a location well serviced by mobile networks, and with limited requirements, 3G/4G can be a viable option, albeit with risks. No mobile network is guaranteed, and because it is a shared medium performance will always drop as the volume of users increases. There are no hard and fast rules around this as there are many factors, but in simple terms the more attendees present, the lower the performance!

For any sizeable event the best approach is a dedicated event network serviced with appropriate connectivity providing both Wi-Fi and wired connections. This solution facilitates usage for Wired IP-based PDQs, Wi-Fi PDQs, iZettle and other new payment devices, as well as supporting requirements for tablets, laptops and other mobile devices, each managed by appropriate network controls.

With the right design this approach provides the best flexibility and reliability to service the ever-expanding list of payment options. What is particularly important is that an event network is under the control of the event organiser (generally via a specialist contractor) and not a mobile operator, as this removes a number of external risks. For those without existing compatible PDQs the option of rental of a wired or Wi-Fi PDQ can be offered at the time of booking.

The key in all of this is planning and communication. Payment processing has to be tightly controlled from a security point of view, so it is important that enough time is available to process requests, especially where temporary PDQs are being set up, as they often require around 10 working days.

Sorry to disappoint, but yes our blog last week on Li-Fi at festivals was an April Fool’s joke. The response to it though highlights just how much importance people put on remaining connected whilst at events.

Li-Fi is a real technology and does hold promise, but in practice it is much better suited to indoor environments and certainly not outdoor lighthouses! As with many technologies, theoretical speeds are indeed very fast in the lab but real-world use is some way off; in the meantime Wi-Fi and 3G/4G remain the primary options for keeping connected.

All is not lost though as these technologies continue to develop, and more and more events are deploying infrastructure to improve attendee experience. Wi-Fi has moved a long way from the days of 11Mbps 802.11b, one of the first standards. Modern 802.11ac wireless access points support far more users, offer much higher speeds and contain a raft of technology to create the best user experience. A well designed high-density Wi-Fi deployment using 802.11ac and directional antennas can support thousands of simultaneous users and still provide good speeds.

The rapid deployment of 4G infrastructure by mobile carriers has improved connectivity at smaller events, but events attracting more than a few thousand people quickly overload cell towers, which are limited by spectrum availability and coverage size.

Testing is underway with new technologies which may help – the first is LTE-U (Long Term Evolution Unlicensed) which more simply put is using unlicensed spectrum such as 5 GHz to deliver additional 4G capacity. The challenge is that this technology introduces yet another connectivity method into what is becoming very congested spectrum. It is in effect robbing Peter to pay Paul and therefore the approach has split the industry due to concerns over the impact it may have on Wi-Fi installations.

Another approach in testing, supported by Ruckus and Qualcomm amongst others, is OpenG using shared spectrum at 3.5 GHz in the US. It is not dissimilar to LTE-U but because it uses different shared spectrum does not clash with existing Wi-Fi. With the Ruckus solution the 3.5GHz radio is being integrated into existing dual-band Wi-Fi access points providing a triple radio solution in one unit which can be deployed easily.

Wi-Fi also continues to evolve with 802.11ac now at ‘wave 2’, a fuller implementation of the standard featuring ‘Multi-User MIMO’, a way of better utilising spatial channels across devices giving increased capacity. Then there is 802.11ax, touting speeds of 10 Gbps but we won’t see that any time soon as the standard is unlikely to be ratified until at least 2019 by which time Li-Fi may also be a reality!

Unfortunately, as is typical with these mobile technology evolutions, once testing and approval are complete there is a lag whilst the mobile handset manufacturers catch up with integrating the technology and penetrating the market, which can add several years before mass-market adoption is reached.

In the meantime, well implemented 802.11ac Wi-Fi remains the best approach for high density connectivity, and that’s certainly what we will be using this summer.

Event Technology Myths

For our third myth busters article Wi-Fi becomes the focus, touching on the relationship between microwave ovens, water and Wi-Fi, wireless signal propagation and Wi-Fi security.

My microwave oven stops my Wi-Fi from working properly – TRUE (but not always)

For the non-technical, the idea that warming up a bowl of soup in the microwave oven can stop you browsing the internet over your Wi-Fi seems bizarre, but it can indeed be true. The reason is quite straightforward – the frequency used by a microwave oven is around 2.4GHz, the same frequency as one of the two Wi-Fi bands. The issue can occur because microwave ovens are not always perfectly shielded, so some of the microwaves can leak out (harmlessly) and interfere with the Wi-Fi. Industrial microwaves tend to be more of an issue as they use higher power.

The good news is that the 5GHz Wi-Fi band which is now more commonly supported in devices is not impacted by microwave ovens – although it can be affected by RADAR but that’s another story!

My Wi-Fi works through walls but not through trees – TRUE

The way wireless signals propagate through objects is quite a complex area, but there are some general rules. The first relates to 2.4GHz Wi-Fi and, interestingly, links back to microwave ovens. Microwave ovens operate around 2.4GHz because water absorbs energy very readily at this frequency: bombard water with 2.4GHz microwaves and the molecules vibrate vigorously, heating the water (or your food that contains water). This is great when you want to cook bacon quickly but not so good when you want to pass a 2.4GHz Wi-Fi signal through trees, which contain lots of water – the signal is simply absorbed.

It is very important to note that Wi-Fi signals are extremely low power in comparison to a microwave oven so you will not cook yourself if you absorb Wi-Fi signals! On event sites trees can become a real bane for the IT engineers trying to run wireless links which is why you will hear them talking about ‘Line of Sight’.

When it comes to walls it does depend on the type of wall – a basic plasterboard or normal brick wall will only absorb some of the Wi-Fi signal; a more substantial wall will absorb more, and walls which have metal mesh in them will often block Wi-Fi altogether. On the whole, though, a strong Wi-Fi signal will pass through most normal walls. Windows can help or hinder depending on the type of glass used, as modern thermal insulating glass can block Wi-Fi signals quite effectively.

Temporary structures at event sites are a whole case in themselves. Some temporary cabins are near enough transparent to Wi-Fi, but others, particularly the newer well-insulated variety, are just about impervious, requiring Wi-Fi access points in each cabin. Marquees and other temporary structures often exhibit a different behaviour, being transparent in good weather but more opaque when it starts raining! The water coats the structure, creating a reflective layer and also absorbing signals, so that less signal gets through.

The second element relates to the frequency of the Wi-Fi: with wireless signals, the lower the frequency the greater the propagation. This is seen most obviously with dual-band Wi-Fi operating at 2.4GHz and 5GHz – the lower-frequency 2.4GHz signal will travel further than the 5GHz signal, which becomes an important point when designing Wi-Fi coverage (along with lots of other factors!)
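As a rough illustration of that point, free-space path loss (ignoring obstacles, absorption and antenna differences) grows with frequency. The short Python sketch below compares 2.4GHz and 5GHz over the same distance – the 50m range is just an example figure:

    import math

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 3e8  # speed of light, m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    for freq in (2.4e9, 5.0e9):
        print(f"{freq / 1e9:.1f} GHz over 50 m: {fspl_db(50, freq):.1f} dB")

    # 2.4 GHz: ~74 dB, 5 GHz: ~80 dB - about 6 dB more loss at 5GHz,
    # i.e. roughly a quarter of the received power over the same distance.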

All Wi-Fi networks are insecure – BUSTED

Because Wi-Fi is a broadcast technology that passes through the open air, anyone with the right equipment can pick up the signal. For this reason it is very important that these signals are encrypted to prevent information being intercepted by the wrong people. One of the most common ways of encrypting a Wi-Fi network is by using a technology called WPA2 (Wi-Fi Protected Access 2).

WPA2 is commonly set up with a Pre-Shared Key (PSK). This alphanumeric string should only be known by those who need access to the network, and they enter it when connecting. The potential problem with this approach is that the PSK is used to generate the encryption key, and if you use a weak key the network is left open to a fairly simple attack which can gain access within minutes.
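For the curious, this is roughly what happens under the hood: in WPA2-Personal the passphrase and the network name (SSID) are fed through a key-derivation function (PBKDF2) to produce the key that actually protects the traffic, which is why a guessable passphrase undermines everything that follows. A minimal Python sketch – the SSID and passphrase below are invented purely for illustration:

    import hashlib

    def wpa2_pmk(passphrase, ssid):
        """WPA2-Personal: derive the 256-bit Pairwise Master Key from the PSK,
        using PBKDF2-HMAC-SHA1 with the SSID as the salt and 4096 iterations."""
        return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

    print(wpa2_pmk("CorrectHorseBattery!", "EventTraderWiFi").hex())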

The solution is simple – longer and more complex keys! Every character added makes the cracking process considerably harder. The question is how long is long enough. There is no agreed answer, as it depends on how random the key is: a truly random key of 10 alphanumeric characters is actually very hard to break, taking many years, but a similar-length key built from dictionary words could be broken very quickly.
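To put rough numbers on it, the sketch below sizes the search space for truly random alphanumeric keys; the guessing rate is an assumption purely for illustration (real WPA2 cracking is slower per guess because of the key derivation above), but the shape of the numbers is the point:

    ALPHABET = 26 + 26 + 10              # lower case, upper case, digits
    GUESSES_PER_SECOND = 1e9             # assumed cracking rate, illustrative only

    for length in (10, 12):
        keyspace = ALPHABET ** length
        years = keyspace / GUESSES_PER_SECOND / (60 * 60 * 24 * 365)
        print(f"{length} truly random characters: {keyspace:.1e} keys, "
              f"roughly {years:,.0f} years to search at {GUESSES_PER_SECOND:.0e} guesses/second")

    # 10 characters -> tens of years; 12 characters -> around a hundred thousand years.
    # A key built from dictionary words collapses to a wordlist search and falls in minutes.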

To be safe we normally recommend a minimum of 12 characters with typical password rules – upper and lower case, numeric characters, special characters and no dictionary words unless they have character replacements.

Of course, a strong key only remains strong whilst it is known only by those who should know it, and this is the weakness of the shared-key approach: if the key is leaked, security across the network is compromised. There are additional measures that can improve security further – for example, a technique called Dynamic Pre-Shared Key (D-PSK) gives each user a unique key, so a leaked key only affects that one user.

We will cover Wi-Fi and general network security in more depth in a later blog but with the right set-up Wi-Fi networks are perfectly secure – more so than most wired networks!

Event Technology Myths

In the second part of our myth busting we look at satellite, high density Wi-Fi and broadband speed.

Satellite is the best all round solution for quick event deployment – BUSTED

Over the last few years KA-band satellite has become a cheap option for temporary internet access. It can be a great solution in certain cases, but there are many cases where it is not suitable. Satellite suffers from high latency due to the distance to the satellite, which means a round trip takes around 600ms. That delay might not seem much but it is crippling to services such as VPNs (Virtual Private Networks), VoIP, video calls, online gaming and any application which requires lots of rapid two-way data traffic. It is great for large file uploads and video streaming; however, it is important to watch data usage as this can rack up significant additional costs.
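The latency follows directly from the geometry – a geostationary satellite sits roughly 36,000km up, and a request plus its reply covers that distance four times. A quick back-of-envelope calculation:

    # Back-of-envelope geostationary latency (propagation only, illustrative)
    GEO_ALTITUDE_KM = 35_786
    SPEED_OF_LIGHT_KM_S = 299_792

    # A request goes up and down, then the reply goes up and down: four legs in total.
    round_trip_s = 4 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
    print(f"Propagation alone: ~{round_trip_s * 1000:.0f} ms round trip")  # ~477 ms

    # Add ground-station and network processing and you land around the 600ms mark,
    # which is what cripples chatty traffic such as VPNs and VoIP.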

Satellite is also a poor solution for wide-scale access such as public Wi-Fi. This is because of a technology it uses to try and boost speed, the downside of which limits the number of simultaneous users who can connect to one satellite service. Most KA satellite services also have high contention ratios which can reduce the advertised 18Mbps/6Mbps-type speeds down to something considerably lower – a similar trick is used with home broadband services. Uncontended services are available, but the cost is much higher and, other than for short durations (it’s normally sold in 15-minute slots), it is not competitive with other solutions.

Satellite can absolutely be the right approach, and we deploy lots of satellite solutions, but understanding the user requirements and explaining what the user experience will be like are extremely important to avoid disappointment and frustration.

Better Wi-Fi just means using more Wi-Fi access points – BUSTED

One of the most common problems with Wi-Fi networks is too many Wi-Fi access points and a poor design. A typical response to a user complaining about Wi-Fi is for another Wi-Fi access point to be deployed to ‘improve coverage’, yet frequently this just makes matters worse. Large scale and high density Wi-Fi requires very careful design to avoid what is known as Co-Channel Interference (CCI) where multiple wireless access points are in effect shouting at each other and slowing the whole network down.

Using fewer high-capacity managed wireless access points with a detailed radio spectrum design, often with focused antennas, can deliver much higher capacity and a better user experience than a thick blanket of access points. Good Wi-Fi design is a technical art requiring some very detailed knowledge – the output, though, is pretty much invisible to the normal user until it doesn’t work!
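A toy model (the per-channel capacity figure is assumed purely for illustration) shows why a thick blanket of access points on the same channel gains nothing – co-channel access points within earshot of each other share airtime rather than adding capacity:

    # Wi-Fi is a shared medium: access points on the same channel take turns to transmit.
    CHANNEL_CAPACITY_MBPS = 300   # assumed usable capacity of one channel, illustrative

    for aps_on_same_channel in (1, 2, 4, 8):
        per_ap = CHANNEL_CAPACITY_MBPS / aps_on_same_channel
        print(f"{aps_on_same_channel} co-channel APs -> roughly {per_ap:.0f} Mbps each "
              f"(total still ~{CHANNEL_CAPACITY_MBPS} Mbps, less contention overhead)")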

20Mbps of broadband speed is always the same – BUSTED

It would be nice if the experience and speed of all broadband services were the same so that when you are told you have 20Mbps that’s what you get. Reality is somewhat different and more complex due to a number of factors:

  • Contention Ratio – Nearly all providers contend their services, which effectively shares the capacity between multiple users. This can be as much as 50:1, whereby your 20Mbps is shared between 50 users! More normally 20:1 is seen, then 5:1 on more business-orientated (and more expensive) services, up to the perfect 1:1 (no contention) – see the worked example after this list.
  • Asymmetric / Symmetric – ADSL and FTTC (known as BT Infinity but also sold under different names) services are asymmetric, meaning the download speed is not the same as the upload speed. The original principle was that people need more download than upload speed, but with modern cloud services, video calls and general rich media this has changed considerably, and a low upload speed can be more crippling than a low download speed. For example, you may have a ‘20Mbps ADSL service’ but typically the upload is only 768kbps, and if the upload is at capacity the download becomes throttled due to the way TCP/IP networks work. Services such as true optic fibre (also sometimes called leased lines) are symmetric.
  • Connection Speed / Throughput Speed – This is primarily an issue for ADSL/FTTC but can be seen with other services too. The speed advertised by an ADSL modem when it connects is only the theoretical speed of the link between the modem and the local exchange. The real throughput or speed depends on the entire route from your computer to the location you are connecting to – this is a complex web of routers, fibre and ‘internet peering’. Different parts of that route may suffer congestion and reduce the overall speed of the connection. Choice of Internet Service Provider (ISP) is an important factor as the good ones have better peering and higher capacity links to reduce the risk of congestion and optimise routing.
  • Latency – Every device, cable and piece of fibre through which data has to pass introduces an element of latency, or delay – that’s simple physics. The amount of delay depends on distance (hence why satellite is a problem), quality of links (a poor link needs more error correction, which adds delay), utilisation of links (high utilisation adds delay) and the number of routers, switches, etc. in the path. Good services may only add a few milliseconds of latency, poor ones several hundred milliseconds, and that can make a big difference to user experience.
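Pulling the first two factors together, a quick worked example (figures purely illustrative) shows how far the real experience can drift from the headline number:

    ADVERTISED_MBPS = 20

    for label, ratio in (("consumer", 50), ("typical", 20), ("business", 5), ("uncontended", 1)):
        print(f"{label:>12} {ratio}:1 contention -> as little as "
              f"{ADVERTISED_MBPS / ratio:.1f} Mbps per user when everyone is online")

    # On an asymmetric ADSL line the upload is often the real limit: once a 0.768 Mbps
    # upload is saturated, acknowledgements queue up and the download throttles back too.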

That’s it for issue 2. Next time: does my microwave really break my Wi-Fi? How come Wi-Fi works through walls but not through trees? And should you worry about network security?

Event Technology Myths

Welcome to the event technology myth busters! Just like the popular American show (MythBusters – check it out!) we will be taking myths we hear about from customers and proving, once and for all, whether they are true, busted or plausible!

GPRS (mobile phone) PDQ systems are unreliable at events – TRUE

GPRS payment terminals connect to the same networks as your mobile phone, so it stands to reason that if your phone is working the terminal will be too, right? Right. Generally, GPRS networks operate really well and the units work all over the place. The exception, unfortunately for those in the events industry, is that when the mobile phone network suffers from overload the terminals will have the same issue as you do making a call. The majority of the mobile phone network is designed for wide coverage, not high density (such as 30,000 people in a field). If you are going to try and use a PDQ terminal in this type of situation, it is much better to hire a cabled or Wi-Fi terminal as part of the event provision, at the same time as you request services such as power.

Optic Fibre internet is always expensive – BUSTED

Optic fibre internet (sometimes called leased lines) is the best type of connectivity. It’s dedicated (just for you), has a fast support process and is generally very reliable. If your home broadband is like a B-road (narrow, busy and sometimes blocked unexpectedly) then optic fibre is the three-lane motorway. Getting a motorway to your door can be expensive, but for many locations it is now cost effective, especially over 3 or 5 years. Tricks to keeping the costs down? Order early, order from the right supplier and plan for the future – for example, order a link with the highest capacity possible and just run it at a slower speed until you need more.

You can generate good revenue from charging for use of public Wi-Fi networks – BUSTED

It seems so obvious – deploy a public Wi-Fi network at an event and attendees will flock to it and pay to get a good service when the mobile networks become overloaded. Unfortunately, this isn’t the case, as attendees are cautious about public Wi-Fi and do not like paying for it. This should not be a surprise considering most other public Wi-Fi, in cafes, shopping centres, etc., is free at the point of use, or users get free access via an existing account such as BT or Vodafone. Then add in the fact that at most events the attendee is paying to enter the event and you can see why they are reluctant to pay again.

Recovering the cost of deploying public Wi-Fi has to be a lot more creative – it is all about the content and the usage data. Lots of platforms out there now quickly and effectively collect marketing information from those using the service, such as email addresses, social media information, sites visited, etc., all of which can be used for your own or other activations. Those using the networks need to agree, but many do once they appreciate the service has to be paid for in some way!

That’s it for this issue! More to come over the next few weeks, including: do all venues have sufficient internet access? Can wireless networks be customised with logos and text? Does streaming always suffer on site unless it has its own connection? Is satellite internet a good option for all events?

For event organisers, life on the road, in and out of venues, holed up in damp cabins and questionable hotels, means the technology they carry and the software tools they use are critical to their day-to-day job. It’s an ever-changing landscape and, to some degree, a personal preference, but there are a few key items to think about to ensure the teams stay productive at a sensible cost.

The Laptop – Personal & Critical

Although smartphones and tablets are the most talked-about items of the last few years, it is still the trusty laptop that is at the core of the road warrior’s armoury. It is the item not to skimp on: buying too cheaply will cost more in the longer term, but at the same time there is no sense in buying at the top end – the best value is in the middle.

Choosing a proper business laptop rather than the cheaper consumer models is a wise move – they survive better on the road and focus on the things that make a difference for an intensive user: battery life, keyboard feel, screen quality, lighter weight, etc. Size is important – there is no need for a massive 17” screen. You are better off sticking with a smaller screen and using an external monitor when you really need the extra screen area; the saving in weight, and the fact you can then use your laptop on a train or plane, is a much better benefit. Be wary of ultra-high resolutions on smaller displays as these often frustrate users because they can be so hard to read.

Hard drive failure just before an event is not something you want. To minimise the risk select an SSD (Solid State Drive) instead of a traditional hard drive – SSDs are not immune to failure but they are a lot more tolerant of being bashed about in an event world and they are much faster.

In terms of performance, the marketing always suggests the latest, fastest and most expensive processor is the way to go; however, overall laptop performance is the sum of the parts, so there is no point in buying one with a high-end processor which is then crippled by a slow hard drive, limited memory and weak graphics. These days processors are so good that unless you have some very specific needs you are better off buying a mid-range processor with plenty of memory, an SSD, decent graphics and good build quality. For example, in the Intel processor range you should avoid the low-end Core i3 and instead pick a Core i5; unless you have a specific, very demanding usage case there is little point in the extra cost of a Core i7.

Ultrabooks (extra thin and lightweight laptops) are worth the expense for the highly mobile but be careful on selection as many no longer have a physical network connector built in – they rely purely on wireless connections. The workaround is typically an external adapter. Similarly, many Ultrabooks have dropped some of the older generation connectors such as VGA in favour of HDMI and mini-HDMI – this isn’t necessarily a bad thing but you need to think ahead when presenting!

The type of wireless the laptop supports is very important and it is almost essential that you choose one which supports both 2.4GHz and 5GHz frequencies. The 2.4GHz range is typically so crowded on event sites that it is often unusable, whereas 5GHz has more capacity and provides a much better experience.

Should you buy an Apple Mac or a Microsoft Windows based laptop? In my view it doesn’t really matter – they both share the same core components and each suffers from similar types of failures and security issues. It is more about what sort of user experience you want and if you are already used to one or the other do not underestimate the initial loss in productivity if you switch!

Productivity Tools – Too Many to Choose From

The emergence of cloud services has led to an explosion in productivity tools, particularly ones that work well across distributed teams. Dropbox, Box, Office 365, Google Drive, Evernote, Google Docs, Microsoft OneDrive, Skype, WhatsApp – the list goes on and on. They all have pros and cons and most will meet the needs of the majority of users. It’s not so much about which tools you choose, but about how many and how you manage them.

With a distributed team, especially one that includes freelancers, it is far too easy for everyone to do their own thing and productivity drops because no one knows where anything is or which version is the current one. It is really important to agree the tools and stick to them – less is more!

Offerings such as Office 365, where email, office applications and project sites can all be delivered as a single SaaS (Software as a Service) subscription, allow rapid scaling and shrinking of licences, which is very effective for dynamic teams. There are additional benefits too: since they are hosted in the cloud, there are no VPN (Virtual Private Network) complexities for users connecting back to a central office whilst on an event site.

The downside of modern cloud services is that they require connectivity – not an issue when you are in the office, but on event sites the impact is a lot more significant. The background synchronisation from your laptop, phone and tablet all consumes bandwidth, and this has increased the connectivity demand from event sites significantly, which must be factored into event plans.

Security – Ignore at Your Peril

Distributed teams, a need to share lots of information, contractors, freelancers and a ‘just get it done’ culture provide a mix which is an IT security nightmare. Information access, control and protection get more complex every day, and sadly the leakage of sensitive information and hacking are very real problems.

It all starts with the humble login and password, still the way that nearly all systems are accessed. We all hate them and we all get lazy with them. A few golden rules to start with:

  • Never use ‘shared’ logins – the moment you use shared credentials you lose all ability to audit and control. If you suffer a breach you will not be able to trace it and the only way to stop it involves impacting everyone.
  • Do not use the same password on multiple accounts – People hate this one but it is increasingly important. The reason is simple – the majority of systems use your email address as the login id so if one system gets hacked (which is all too common) and login details are compromised the hacker knows that using the same login id / password combination on other systems is more than likely to work. What starts as an annoying but manageable breach on a harmless website becomes an exposure to financial data, banking, customer information etc.
  • Strong passwords – It’s incredible that the most popular password is still 123456 and the second most popular is password. In a business environment that should be treated as irresponsible and a possible disciplinary offence. Password cracking methods have moved way beyond the old ‘brute force’ attacks, which means even fairly complex passwords are cracked surprisingly quickly. If you can remember your password easily then it is probably too simple – see the short sketch after this list!
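As an aside, generating a genuinely random password is trivial to automate. A minimal sketch (the length and character set here are arbitrary choices) using Python’s built-in secrets module:

    import secrets
    import string

    def generate_password(length=16):
        """Return a random password drawn from letters, digits and a few symbols."""
        alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())   # different every run, e.g. 'q7G!vR_x9#LmW2aZ'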

The last two points above are at the core of the issue which blights confidence in computer security – realistically no human can manage dozens and dozens of different, complex passwords so the weak ones persist and play straight into the hacker’s hands.

At first the solution seems counter-intuitive – password managers. Utilities such as LastPass and 1Password manage all of your passwords, allowing you to have unique, complex passwords for every system you use. You then have just one password to remember to access the password manager.

Surely this is a bigger risk as that one password gets access to everything? Potentially yes, but there are reasons why this risk is smaller than the risk of not using a password manager.

Firstly, you are far more likely to remember one complex password than lots of them. Secondly, the password manager (or at least the good ones) is local to your devices, so to try and crack that password a hacker needs access to your actual device, not an online website – another layer of defence. A password manager is infinitely more secure than yellow sticky notes stuck to your screen.

To go a step further, particularly for a password manager, using ‘two-factor authentication’ is wise. Two-factor authentication provides an additional layer of security in a similar way to the card readers used by many banks for online banking, but instead of a card reader it uses an application on your computer or smartphone. Products such as Google Authenticator are now supported by many password managers and also directly on other online services.
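Under the hood these apps use time-based one-time passwords (TOTP): the service and the authenticator app share a secret, and both derive a short code from it that changes every 30 seconds. A minimal sketch using the third-party pyotp library – illustrative only, not a production set-up:

    import pyotp  # third-party library: pip install pyotp

    # Generate a shared secret once and store it against the user's account;
    # the same secret is loaded into the authenticator app (usually via a QR code).
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    print("Current 6-digit code:", totp.now())        # changes every 30 seconds
    print("Code accepted?", totp.verify(totp.now()))  # the check a service performs at login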

Passwords are a key part of security but there are a few other aspects which need to be watched carefully. Most security breaches are still caused by employees or contractors – both intentionally and unintentionally. With documents and information bouncing between people and systems at an alarming rate knowing who has access to information and where information is stored is crucial.

Thankfully the majority of staff and contractors are trustworthy but it only takes one. Using unique logins for all staff as mentioned above makes the process of closing down access much more straightforward when it is no longer required and provides traceability. Most systems now provide a granular access control so that not everyone gets access to everything. A clearly owned ‘leaver process’ is also important to make sure logins are removed and content deleted from sharing locations.

Effective technology usage can make a big difference to productivity but it is too easy to overcomplicate. We now have an amazing array of systems with which to share content and communicate but when the pressure is on ‘old fashioned’ email still comes out on top as it is simple and dependable. The same thought should hold true for the other aspects; event road warriors require simple and dependable solutions that do not distract them from what they need to do – run events!


Only a few years ago communication from attendees at an event consisted of the occasional phone call (if you could make one) and maybe a text or two. The phenomenal success of smart phones and social media has changed all that at a pace no one was expecting.

We raced through textual commentary and onto photo commentary within a couple of years, a development which saw a huge shift in network demand on event sites, the few bytes of a text message replaced by megabytes of high resolution photos.

Now we are seeing the next shift into video, initially starting as a ‘record and upload’ approach but rapidly shifting to live video streaming with some current generation smartphones capable of 4K ultra HD video at more than 50Mbps! Last week Facebook announced Live Video, their offering in the live video streaming arena to compete with Periscope and Meerkat, a reflection on how fast the area of personal video streaming is moving. The key point is that Facebook Live Video is integrated into their main application removing a barrier to usage and fuelling more rapid adoption across its massive existing user base.

The data demands of such use are vast, especially in a high-density environment such as an event, and for the moment this is likely to restrict the growth to some degree. However, it is happening, and with it comes a significant shift in thinking about how content from live events is managed.
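To put a rough scale on it (all figures are assumptions for illustration), even a modest number of simultaneous live streams adds up quickly:

    STREAMERS = 50        # attendees live streaming at the same time (assumed)
    BITRATE_MBPS = 5      # a plausible HD live-stream bitrate from a phone (assumed)

    total_uplink = STREAMERS * BITRATE_MBPS
    print(f"{STREAMERS} streams at {BITRATE_MBPS} Mbps each = {total_uplink} Mbps of uplink")
    # 250 Mbps of sustained upload is more than many temporary event connections provide in total.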

Putting aside the technical pressures on event networks in terms of capacity the real question is about the content. We have seen the shift from ‘no cameras’ to a reluctant acceptance that the control of photos from event sites is pretty much impossible, even though many artists do not like the sea of people watching events through their phones. Increasingly there is some acceptance of a time shifted amateur video appearing on YouTube but the idea of real-time video streaming takes the subject of content management to a new level.

For organisers, promoters and artists the question is: do you try to stop it from occurring, using either technical or physical approaches, or accept it and turn it to an advantage? Technically restricting video streaming on an event-managed and controlled Wi-Fi network is perfectly feasible, but on the 3G/4G mobile networks it would come down to discussions with the operators as to whether they would be prepared to block streaming from specific cell towers during an event – which is unlikely, although in reality these networks are currently unlikely to be reliable enough to support a video stream anyway.

Alternatively, rather than trying to block at source, a continuous scan of live streaming services to identify and remove streams could be employed but the effort required to do this would be huge and would not be successful without the support of the service providers such as Facebook & Periscope.

Physically trying to control it becomes a question of how firm you get in identifying and removing offenders – an approach which can cause tension between fans and artists, would require additional resource to police and would never be entirely successful.

Is the answer then to accept and adopt, finding ways to benefit from this new communications channel? Does it really harm the event or the artists to have boundless unmanaged content strewn across the internet? On the other hand, do you want someone wandering around backstage streaming everything? Or someone in the front row streaming what could be a surprisingly high quality and atmospheric video? It’s a copyright nightmare but potentially provides massive exposure.

Technology is not only the source of this issue, it is also likely to be part of the solution – but not before event organisers decide on an approach. One thing we do know from technology over the last twenty years is that it cannot be ignored; it is the ultimate disruptor and will always find a way of winning through.

Now we have hit the main festival season (and were blessed with a dry Glastonbury!), we take a moment to look at the less glamorous aspect of events – ensuring organisers have the right help when they need it. Whatever the event, be it in a muddy (or this year dry!) field on Pilton Farm, a hotel in central London or a conference hall in San Francisco, having the appropriate levels of technology support is critical to success.

Identify your critical elements

What is going to have the largest effect on your event if it fails? This sounds like a simple question, and one that typically forms part of a risk assessment; however, sometimes things get missed. Nothing happens without power (in most cases!), but if the internet connection drops can you continue to process bar transactions? Or scan tickets? Or monitor crowds through the CCTV system?


Confirm support expectations

With a plan covering the critical elements in hand, how easy is each failure to work around? If nothing can be done, then make sure you have a clear agreement about getting help if you need it. Weekend or out-of-hours suppliers can be expensive at the last minute. Can one of your team be trained to fix a basic problem? Can you get onsite or standby support?

Document and prepare a plan B

“Failure to prepare is preparing to fail.” If the worst happens, what can be done? What could be lined up as a backup plan? Can you revert to a manual system, and if so can that be prepared in advance? Running through failure scenarios gives you an option if the worst happens, and means the plan can be activated whilst others work on fixing the main issue.

The world of support isn’t glamorous, and it is just one of thousands of things to be considered when planning and executing an event, but when things go wrong having it in place can save a lot of stress and potential pain.