The issue of wireless encryption ‘cracking’ has been in the news again recently thanks to Thomas Roth and his claim to be able to crack WPA-PSK passwords in a matter of minutes. The basic methods used are nothing new – primarily a hybrid brute-force and dictionary attack, which is essentially like sitting at a computer and trying every word you can think of as the password. What was different in this case was the use of cloud computing to harness enormous processing power – enough to try 400,000 passwords per second, bringing the time to guess the password down considerably. This all sounds rather concerning, but is it really?

If you fit the best lock money can buy to your front door and then leave it on the latch, can you really complain when someone opens the door and burgles your house? The important thing with encryption is the complexity of the password, as the time it takes to crack a password depends very significantly upon its strength. Roth himself said “If [the password is] in a dictionary it’ll be very fast, but if you have to brute force it and it’s longer than eight characters and its complexity is okay, it’ll take a very long time.” By ‘long time’ he means years and years – and every character added to the password multiplies that time, so the growth is exponential.
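To put numbers on that exponential growth, here is a quick back-of-envelope sketch in Python. The 400,000 guesses per second figure is the one quoted above; the alphabet sizes and password lengths are purely illustrative.

```python
# Rough worst-case brute-force times at the quoted cloud attack rate.
# The 400,000 guesses/sec figure is from the article; everything else
# here is an illustrative assumption.

GUESSES_PER_SECOND = 400_000
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def worst_case_years(alphabet_size: int, length: int) -> float:
    """Years needed to try every password of a given length and alphabet."""
    keyspace = alphabet_size ** length
    return keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR

# 8 lowercase letters: the attack finishes in days
print(f"{worst_case_years(26, 8):.2f} years")
# 12 mixed-case letters, digits and symbols: effectively forever
print(f"{worst_case_years(72, 12):.2e} years")
```

Eight lowercase letters fall in under a week at that rate, while a 12-character mixed password pushes the worst case beyond a billion years – exactly the point Roth makes about length and complexity.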

Security Officer

Security is only as strong as the weakest link

So, nothing to worry about then? Well, not quite, when you consider the way WPA-PSK is often used. The clue is in the name – PSK stands for Pre-Shared Key – and as that suggests, the key is shared between all users. Take a typical event site where organisers, press and crew require a ‘secure’ wireless network: WPA-PSK will often be used, but it is frequently not as secure as intended, for two reasons.

Firstly, the password or key is being given to many people, and it only takes one person to release it into the wild for the whole network to be compromised. Once compromised, the only way to secure the network again is to change the shared password, which means all users need to be notified of the new key – not very practical in the middle of an event.

The second issue is that because the password is shared between many people, a short, easy-to-remember one is generally used, opening up the network to the type of attack described above. Visit many media centres, event HQs and the like and you will see the network password printed on A4 sheets of paper stuck to the wall.

Network security is often seen as a hassle, compounded by the “it won’t happen to us” mentality, but there are more and more reasons to take it seriously. Prior to the news about the WPA-PSK crack there was also news of a plugin for the Firefox browser that could ‘listen’ to other users’ data on a wireless network (either an open network or one where the key is known). Increasingly, more and more data is transmitted across the network at events, and much of it is sensitive. Yes, there are secondary mechanisms such as VPN and SSL that protect some data, but often you will find file shares, websites and other data all unencrypted and open to view on the network.

We take network security very seriously and have been offering individual usernames and passwords for network access for several years, giving us access control with a much better level of granularity, along with the ability to provide a full audit of users. For 2011 we are going a step further: at the Event Production Show in February we will be launching an additional service known as DPSK, or Dynamic Pre-Shared Key. With this service, once a user logs onto the network they are transparently given a dynamic, unique encryption key. This means all users have a different (and very strong) encryption key, ensuring all data transmitted is well protected, and users do not need to know the key or share it with anyone. All the user needs to know is their username and password (which still needs to be ‘strong’), but if that user’s details are compromised the only impact is to that user, and that user’s account can be quickly blocked.
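The general idea behind per-user dynamic keys can be sketched as below. This is purely illustrative – real DPSK implementations are vendor-specific and work differently in detail; the master secret and derivation function here are assumptions made for the sake of the example.

```python
# Illustrative sketch only: one way a controller *could* derive a unique,
# strong key per user, so no shared passphrase ever needs handing out.
# This is NOT the actual DPSK implementation, which is vendor-specific.
import base64
import hashlib
import hmac

MASTER_SECRET = b"network-master-secret"  # hypothetical, held only by the controller

def per_user_key(username: str) -> str:
    """Derive a deterministic, unique, strong passphrase for one user."""
    digest = hmac.new(MASTER_SECRET, username.encode(), hashlib.sha256).digest()
    # Truncate the encoded digest to a 32-character passphrase
    return base64.b64encode(digest).decode()[:32]

alice, bob = per_user_key("alice"), per_user_key("bob")
assert alice != bob  # every user ends up with a different key
# If alice's credentials leak, only alice's key is affected - revoke her
# account and the rest of the network is untouched.
```

The property that matters is the one described above: compromising one user’s details affects only that user’s key, not the whole network.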

We understand that every event has different needs and aspects such as network security are a balance between risk and complexity so we have developed a range of solutions to meet those different needs. If you are concerned about the security of your IT systems at events then drop in for a chat at the Event Production Show or contact us for a discussion.

Chip and pin is part of our lives. Four digits and a piece of plastic is all that’s needed to pay for anything from a morning coffee to that expensive watch for Christmas (you have got all your presents by now – right?!). The chip and pin terminals, or PDQs (Process Data Quickly), have become the staple of any company that wants to take larger payments at events. As the cheque looks destined to bow out (perhaps?) shortly, we take a quick look at some of the key points of PDQ machines in a mobile environment, as they can be the cause of much pain.

Merchant Account – Anyone who accepts card payments needs a merchant account. These are issued by the banks and the application process can be lengthy, so leave plenty of time when applying. In our experience, to keep a merchant account live you’ll need a PDQ of some type – typically a phone-line-based model you can keep in the office. The merchant account and terminal rental carry a monthly fee, normally around £20.

Transaction Fees – The downside of card payments is the transaction fees: for credit cards these are typically around 2.5% of the transaction amount, while for debit cards the fee is normally a fixed amount of around 35p per transaction.
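Those two fee structures cross over at a surprisingly low amount, which a few lines of Python make clear. The 2.5% and 35p figures are the typical ones quoted above; real merchant agreements vary.

```python
# Compare the two typical fee structures: 2.5% for credit cards versus
# a flat 35p for debit cards (illustrative figures, per the article).

def credit_fee(amount: float) -> float:
    return round(amount * 0.025, 2)  # 2.5% of the transaction amount

def debit_fee(amount: float) -> float:
    return 0.35                      # flat 35p regardless of amount

for amount in (5.00, 14.00, 200.00):
    c, d = credit_fee(amount), debit_fee(amount)
    cheaper = "debit" if d < c else "credit (or equal)"
    print(f"£{amount:.2f}: credit £{c:.2f}, debit £{d:.2f} -> {cheaper}")
```

The break-even point is £14 (0.35 / 0.025): below that, the percentage fee is cheaper; above it, the flat debit fee wins – worth knowing if your typical event sale is a £5 programme rather than a £200 jacket.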

Mobile Terminals – If you are going to be using your PDQ out and about all the time, then go for a mobile terminal as your default instead of a telephone-line-based model. The main reason people don’t go for these all the time is that mobile units (by which I mean GPRS-connected) have a higher monthly charge, just like a cellular data card.

Wi-Fi Chip and Pin Machine


GPRS Connectivity – When looking at a mobile terminal you’ll need to select the right type of connectivity to keep it working. A GPRS terminal works in a similar way to a cellular Internet data dongle: connected to the cellular data network, the unit will work anywhere with mobile phone reception for whichever mobile operator you select. This is great for smaller stands operating at venues or locations where attendee numbers are modest – that is, where the cellular network will continue to operate. The issue is that as attendee numbers for the event increase, the mobile phone network becomes slower to respond and may become overloaded. We have all experienced being unable to make calls at big events – the same problem affects the GPRS terminal – it will just stop working, often at a critical point. In 2010 we saw a significant rise in the number of events experiencing problems with GPRS devices due to the network being overloaded.

Wired/Wi-Fi Connectivity – The latest PDQ units now ship with Wi-Fi or cabled network connectivity built in. For those working at large events where the mobile phone network is not reliable, this type of connectivity is a dependable way of ensuring you can continue to process transactions regardless of the state of the mobile phone networks. The challenge is that the event needs a Wi-Fi or wired network in place – many of the larger events now have one, but the best thing to do is check beforehand.

Short Term Terminal Rental – Many companies like the option of renting PDQ machines for a short period, especially for events where they need Wi-Fi or GPRS units. Several companies rent units (including ourselves), but it is important to note there is a lead time of around 14 days to get your merchant ID added to the temporary unit, and some banks have additional restrictions which need to be removed. For PDQ rental you still need your own merchant ID.

So whatever choice you make, a PDQ terminal can be invaluable for any business on the road. Increased transaction amounts and greater security are just some of the obvious benefits, but beware the pitfalls and make sure you plan well in advance of when you need the service operating. As always, we are happy to provide advice on types of machine, connectivity, pros and cons etc. to help you make the right decision for your business.

There’s plenty of press coverage of the recent, much anticipated, announcement of the approval of the Wi-Fi Direct standard. On the surface non-technical folks would be unlikely to give it a second thought but if you rely on Wi-Fi networks at events then Wi-Fi Direct could be a cause for concern. So what exactly is it and why the concern?

In simple terms, think of Bluetooth but using a Wi-Fi standard, i.e. device-to-device communication without the use of a ‘Wireless Access Point’. OK, but we have Bluetooth, so why bother? Potentially better range, better performance and a single wireless standard across devices. Also factor in that Bluetooth has never really made it big in the US, whereas Wi-Fi has.

But the more technical folks already know how to do ‘ad hoc’ wireless networks today using laptops and wireless adapters so what’s the difference? Not a lot, other than making it simpler and giving it a standard so that a wider range of devices can be certified. Sounds great, so I can connect my laptop directly to my wireless printer? Yes, and any other device that becomes ‘Wi-Fi Direct Certified’.

On one level Wi-Fi Direct is potentially a great addition to the connectivity tool-set – not a replacement for Bluetooth but a complementary offering, a sort of next level up from a Personal Area Network (PAN). However, there is a downside.

The downside is twofold. Firstly, imagine what happens when you put hundreds of users in a small space all firing up Wi-Fi Direct. Remember what used to happen in a room full of laptops with infrared connectivity and the constant ‘whoosh’ noise as they all kept finding one another and tried to establish a connection! Now imagine that over a much wider area with all types of devices.

Today we are still seeing issues at events with the ‘Free Public Wi-Fi’ virus, which creates an ad hoc network on an infected computer (using a very similar approach to Wi-Fi Direct). Unsuspecting users connect to this and then become infected themselves. The virus has been around for some time but has recently gained more press coverage; thankfully it is easy to resolve, but it is a nuisance at events, where we often see dozens of infected computers.

The second issue is one of interference. The 2.4GHz frequency range that the majority of current Wi-Fi devices use is highly congested. Everything from microwave ovens to Bluetooth devices emits radiation around this frequency, all of which appears as interference to Wi-Fi devices and reduces performance. Now add in hundreds of Wi-Fi Direct networks all emitting in the same frequency range and chaos results. Recent large launches such as that of the iPhone 4 were hampered by interference caused by hundreds of MiFi devices; Wi-Fi Direct will add a whole new level of interference.
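A small sketch shows why the 2.4GHz band congests so easily: the standard channel plan spaces channels only 5MHz apart, yet each transmission is roughly 22MHz wide, so most channel pairs overlap. The channel numbers and centre frequencies below follow the 802.11 2.4GHz channel plan; the code itself is just an illustration.

```python
# Why 2.4GHz Wi-Fi channels interfere: 5MHz channel spacing versus
# ~22MHz-wide signals means most channels overlap their neighbours.

CHANNEL_SPACING_MHZ = 5
SIGNAL_WIDTH_MHZ = 22

def centre(ch: int) -> int:
    """Centre frequency in MHz; channel 1 sits at 2412MHz."""
    return 2412 + (ch - 1) * CHANNEL_SPACING_MHZ

def overlaps(a: int, b: int) -> bool:
    """Two channels interfere when their centres are closer than one signal width."""
    return abs(centre(a) - centre(b)) < SIGNAL_WIDTH_MHZ

# Which channels are clear of channel 1?
clear_of_one = [ch for ch in range(2, 14) if not overlaps(1, ch)]
print(clear_of_one)  # channels 2-5 all overlap channel 1
```

Only channels five apart or more escape each other, which is why just three channels (1, 6 and 11) are conventionally treated as non-overlapping – and why hundreds of extra Wi-Fi Direct transmitters in that band is such an unwelcome prospect.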

So how bleak is the situation? Hopefully the Wi-Fi Direct standard will address these concerns, but details are hard to find at present. Many of these issues already exist in one form or another and so already have to be managed at event sites, but Wi-Fi Direct does place increased pressure on the professional network. Two major factors can assist. The first is the use of the 5GHz frequency range for critical services, where there is currently far less interference (although that is changing). The second is to use equipment designed for difficult environments: features such as interference rejection (using techniques such as beam-forming) and automatic channel management become highly important in maintaining a usable network.

The picture may become clearer as more details are made available around the Wi-Fi Direct standard but for any organiser planning on the use of Wi-Fi at an event, especially where there is likely to be a high density of users such as a media centre, it is critical they engage a professional team who have the right tools, equipment and experience to minimise the risk and deliver a quality network.

In the first part of this article we discussed some of the challenges associated with using ADSL technology to support an event or business. Despite the government’s sabre rattling and some serious investment from certain parties in newer fibre technologies, ADSL is, and will continue to be, the way much of the UK gets its broadband for some time to come. So until we all enjoy fibre links to the doorstep (don’t hold your breath) it helps to know a little bit about ADSL and how you can get the most out of it… so we’ll continue where we left off:

From this comes the magic of broadband. Thanks to treehugger.com for the image

 

  1. It’s just copper – The POTS (plain old telephone service) hasn’t really changed much since the 1880s. In the 1970s things started to go digital at the back end, but in terms of actual delivery to the home it has remained a pair of copper wires. It was well known even then that the wires carrying all those voice calls could do much more – in fact a copper line being used for a phone call uses only about 0.3% of its theoretical throughput. Early work on using the phone system for data goes back as far as 1948, but it was the mid-1980s when most progress was made, first with ISDN and then DSL. DSL actually covers a number of variations (often called xDSL), of which the most common is ADSL, or Asymmetric DSL.
  2. Why do speeds vary so much? – ADSL currently uses a technology called Discrete Multi-Tone (DMT). Essentially it divides your copper line into 247 different 4-kHz channels with which to send data back and forth. Each channel is monitored and, if the quality is not good enough, the signal is shifted to another channel. The system constantly shifts signals between channels, searching for the best ones for transmission and reception. In addition, some of the lower channels (those starting at about 8 kHz) are used as bidirectional channels for upstream and downstream information. Because the system constantly monitors all the channels and stops using those that do not provide a good enough signal, it is not unusual to see the overall line speed vary from day to day or hour to hour depending on interference or even weather conditions!
  3. ADSL needs to ‘train’ – In an effort to make sure everyone gets the best speed possible, and knowing the variation in the condition of the copper which makes up our legacy telephone system, ADSL is designed with an inbuilt sliding scale. Once the modem is installed and the service is live, the modem will start negotiating with the equipment at the exchange, initially trying higher speeds and then slowing down until it finds a compromise between speed and quality. Over the first 48 hours or so it will continue to do this, varying the settings it uses automatically. Some modems are better at this than others, so it’s worth investing in reasonable kit if you want to see the maximum speed your line will maintain.
  4. It says up to 20Mbps, I’m getting 2Mbps! – Distance is the big problem: the further you are from the exchange, the weaker the signal is by the time it reaches you, which means more of those little channels are unusable. In the real world, unfortunately, very few people will get the headline speed. There are some steps you can take to help the situation, though. Firstly, try to minimise any additional cabling so that you are not adding to the length in your house; ideally an ADSL modem should be connected to the master socket (where the phone line enters the house) to minimise any further loss or interference. Make sure you use a reasonable-quality micro-filter – saving an extra £1 on the cheapest filter may end up losing you some speed. All ADSL modems are not the same: some are more sensitive and use better-quality components, leading to a more stable and better-performing connection, so do your research when purchasing. If you do have to run additional cable in your house, make sure it is connected correctly, as mis-wiring can cause all kinds of problems. It is also worth routing any cable away from power cables and pipes where possible to minimise interference.
  5. Sharing the service – One of the main challenges with ADSL is contention within the service. In most cases ADSL is offered as a cheap consumer service, and like everything with a price point there is a catch, which is normally the contention ratio. The contention ratio is managed by the ISP and means you could be sharing your 20 Mbit/s download with up to 50 people at the same time (each ISP has different rules – you’ll see it in the very, very, very small print). If all users in the same contention group play fair, the chances of everyone downloading content (websites, email etc.) at the same time are low, and therefore everyone enjoys the Internet at a fast speed. However, the model falls down when people download much larger files, stream video content or share files with others. It’s not the user’s fault (nor, to be fair, the ISP’s – there has to be some money somewhere!), it’s just a reality of offering a competitively priced service.
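Pulling the numbers from the points above together, here is a rough sketch of what they mean in practice. The 247-channel count and the 50:1 contention ratio are from the text; the number of usable channels is a made-up example, as real lines vary constantly.

```python
# Back-of-envelope sums for the two effects described above: noisy DMT
# channels eroding the line rate, and contention dividing what's left.
# (247 channels and 50:1 contention are from the article; the usable
# channel count is an illustrative assumption.)

CHANNELS = 247          # 4-kHz DMT channels on the line
LINE_SPEED_MBPS = 20.0  # headline 'up to' speed
CONTENTION_RATIO = 50   # users potentially sharing the capacity

# Effect 1: each channel lost to noise chips away at the achievable rate
per_channel = LINE_SPEED_MBPS / CHANNELS
usable = 190            # hypothetical: 57 channels too noisy to use
print(f"~{per_channel * usable:.1f} Mbit/s with {usable} usable channels")

# Effect 2: worst case if everyone in the contention group downloads at once
print(f"{LINE_SPEED_MBPS / CONTENTION_RATIO:.1f} Mbit/s each under full contention")
```

Lose a quarter of your channels to noise and the ‘20 Mbit/s’ line delivers around 15; have the whole contention group pull large files at once and each user is down to a fraction of a megabit – which is why headline speeds and real-world experience differ so much.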

ADSL is in many ways a great technology for home use but does have significant limitations when it comes to business or critical services, especially when you consider there are no real guarantees on service or performance. You can never predict exactly how a line will behave until it is installed and has been running for a few days – not ideal in our case where we are deploying services for short periods. There are a few other options though:  

  • Bonding and Load Balancing – This is a complex area in itself and often mis-sold, but there are cases where it makes sense to ‘bundle’ multiple ADSL lines together to give improved performance. There are many cases, though, where this approach will not significantly improve things and can lead to all kinds of other problems (VoIP and VPN, for example, are very intolerant of many bonded solutions).
  • SDSL (Symmetric DSL) – A variant of DSL which, as the name suggests, offers the same upload and download performance. The catch? It is limited to 2Mbps per line and many exchanges do not support it. Where it can be used, though, it is normally offered as a business service with little or no contention, so it will often outperform a much ‘faster’ ADSL service. SDSL lines can also be bonded or balanced in a similar way to ADSL, but again there are catches.
  • Alternative Leased Line technologies – There are several newer technologies being rolled out which although they still use the copper wires can deliver much higher speeds. Presently they tend to only be available in major towns and cities and are more expensive than broadband DSL services but they are an alternative to technologies like fibre which are very expensive unless an existing service is in place.
  • Optic Fibre, Satellite & Wireless MANs – Beyond the copper wire there are several other technologies to deliver high speed connectivity which all have their pros and cons too. The detail of these will have to wait for another day.

The key point, though, is that if you need connectivity at an event it is very important to talk through the requirements and the options available with someone who knows the field. It is all too easy for an event to be hampered by poor connectivity, and there is no ‘one size fits all’ approach. In the last few months alone we have used all of the methods listed above at events, from a 1Mbps ADSL line to 2Gbps of optic fibre. It is also worth noting that in general there are lead times of at least 4-6 weeks on getting phone lines and DSL services installed, so good up-front planning is also very important!

ADSL. Never before has one technology been responsible for liberating thousands and frustrating just as many at the same time. Some may think I’m overplaying its importance but this one implementation of technology has made huge changes to our lives. I’m not suggesting it’s the light bulb or the printing press but as we install temporary connectivity services for our event customers around the UK it astounds me how high expectations are with regards to connectivity. Even better (or worse – depending how you see the world) is how quickly everyone has forgotten the pain of the dial-up modem which, until a few years ago, was our only way onto the Internet!    

Everyone wants Wi-Fi (from compguy.co.za)

Broadband take-up in the UK currently stands at 73%, having more than doubled since 2005, when the facebook.com domain had only just been bought by Zuckerberg and friends for a mere $200,000. Fast broadband was something most people just didn’t have. I don’t mean people out in far-flung corners of the country (that’s a generalisation, I know… I’ll get to it shortly) but people in towns and large villages, where you were still stuck paying by the second for modems to squawk and shriek their way through six or seven emails at a time.

Now we expect, nay demand, always-on fast Internet – fast enough that we don’t have to wait for the download to finish or the web page to render. Even in my short time on the planet I struggle to think how I ever survived on a 33k dial-up modem (yes, I had one). I remember the first time I used my newly purchased snap-on modem module for my Palm Pro PDA at other people’s houses – I literally got rounds of applause for checking the weather using my Freeserve account through an 0845 telephone number. Nowadays, if someone hasn’t got “20 Mbit/s” (megabits per second) broadband with Wi-Fi connectivity prevalent throughout the house, I don’t stop for coffee, let alone dinner.

My point, I guess, is that broadband (now synonymously coupled with Wi-Fi) has become as ‘expected’ as the TV or the complimentary coffee. Not only does it mean companies like Etherlive exist to meet that expectation but, with or without technical partners like us on board, it’s expected to be provided and work, 100% of the time.    

So in this, the first of two articles, we go back to the basics a little – what is ADSL? What are its limitations?  

  1. First things first – ADSL stands for Asymmetric Digital Subscriber Line, referring to the technology rather than ‘broadband’, which can apply to other technologies such as cable modems. The term ‘asymmetric’ alludes to a very important point: ADSL is designed to offer better download speeds than upload speeds – no one had thought about all the content sharing that was possible when ADSL was conceived in the early 1990s!
  2. What flavours of ADSL can you get? – Like any technology, the only constant is change. The technology at your closest telephone exchange dictates which generation of ADSL you are stuck with. The original ADSL was approved in 1998 and has been superseded several times, but you may still only be able to use the original technology because your local exchange hasn’t been upgraded. The latest forms of ADSL are ADSL2 (2002), with 12.0 Mbit/s download and 1.0 Mbit/s upload, and now ADSL2+ (2003), supporting a maximum download of 24.0 Mbit/s and upload of 1.0 Mbit/s.
  3. Don’t forget your filter – The little white box which should go on every phone socket that has broadband is a simple bit of hardware which blocks all signals above a certain frequency from entering the phone line. Without it your broadband will not work properly and you will probably have some unpleasant sounds when you make a phone call!
  4. It’s all about distance – Simple really, the further you are from the exchange the lower the speed you can expect. You only get the full ‘headline’ speed when you near enough live next door to the exchange. Also add on to that the fact that older wiring will tend to perform more poorly as cables deteriorate over time. Once you get to a couple of km from the exchange you may be lucky to get a 1Mbps connection.
  5. ‘Unbundled Exchanges’ – This refers to whether an exchange has been opened up to other telecommunications companies, currently it is still primarily cities and large towns that have unbundled exchanges. The advantage of unbundled exchanges is that other companies can provide and control the level of service, whereas in a standard exchange it all goes back to BT (no matter who you use). 
  6. Beware of the small print – I won’t go through the full technical details in this article (we’ll save the best for part two), however there are many ‘gotchas’ with ADSL which you should check before signing up with an Internet service provider. These include, but are not limited to, the contention ratio of the connection (how many people are sharing it – often up to 50), the ADSL technology (ADSL, ADSL2, ADSL2+ etc.), download limits, throttling or blocking of certain protocols (important for peer-to-peer fans) and other terms of use. Not all providers are the same, and on the whole the less you pay, the lower the level of service you can expect.
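To make those generation-by-generation speeds concrete, here is a rough sketch of how long a 700MB file would take at each technology’s maximum download rate. The ADSL2 and ADSL2+ figures are from the list above; the original standard’s 8 Mbit/s maximum comes from the specification rather than this article, and real-world speeds will always be lower.

```python
# Time to download a 700MB file at each ADSL generation's maximum rate.
# ADSL2/ADSL2+ speeds are from the article; the original ADSL figure of
# 8 Mbit/s is the spec maximum, stated here as an assumption.

SPEEDS_MBPS = {
    "ADSL (1998)": 8.0,
    "ADSL2 (2002)": 12.0,
    "ADSL2+ (2003)": 24.0,
}
FILE_MB = 700  # megabytes

for name, mbps in SPEEDS_MBPS.items():
    seconds = FILE_MB * 8 / mbps  # megabytes -> megabits, then divide by rate
    print(f"{name}: {seconds / 60:.1f} minutes")
```

Even at the theoretical maxima the same file takes three times as long on original ADSL as on ADSL2+ – and remember that distance and contention (see part two) will stretch all of these numbers considerably.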

That’s part one of our review. Even with other options (explored in the next part) ADSL is sometimes the only option. In the next article we’ll also look at some of the more technical aspects associated with ADSL and what you can do about squeezing every little bit of speed from your connection.

Etherlive will hold a seminar at the Wireless & Mobile 09 exhibition at London’s Olympia 2 giving insight into working with the latest wireless technologies in challenging temporary deployments. 

From the Wireless & Mobile website:

Real World Wireless – some practical advice

 From corporate enterprise to muddy fields, Etherlive shares their knowledge of wireless technologies. How can you optimise your current environment at minimal cost and use next generation equipment effectively? Based on experience ranging from permanent installations to supplying temporary mesh networks to the events industry, delivering VoIP, video and CCTV, this is a practical discussion about the pitfalls of wireless networks and how you can avoid them in your deployments.

If you haven’t already you can book your place here

 

Wireless and Mobile '09


 

One of the things that happens in a downturn is that most companies stop spending on IT. Well, they think they stop spending on IT; what they really do is stop buying new hardware. Research has shown that the cost of computer hardware is only a small percentage (often less than 20%) of the real total cost of ownership for IT. Most of the costs actually come from areas such as IT support, lost user productivity when problems occur, software, data loss, power consumption and misplaced assets. Although it may feel wrong, it is often cheaper in the long term to replace ageing hardware sooner rather than trying to keep it going, as the new hardware is capable of reducing costs in other areas.

Etherlive have a keen focus on helping customers reduce their overall IT costs whilst driving up their productivity. One aspect of this is improving IT manageability and Etherlive work closely with Intel Corporation to deliver maximum benefit from their Intel® vPro™ technology and Intel® Active Management Technology.

These technologies can deliver great benefits, driving down support costs, reducing overall power consumption and improving employee productivity. Recently we worked with Intel to develop an ‘activation’ message to help customers understand what they need to do to take advantage of Intel® vPro™ technology and Intel® Active Management Technology. You can find out more on the Intel ‘Make the Case’ website, with articles and videos from Etherlive, Cap Gemini and Computacenter covering the different aspects of the technology. You can find out more about the specific services we offer around activation at vproactivation.www.etherlive.co.uk.

The key point is don’t sit back and ignore your IT, go and shake the real costs out. Not only will you save money, you should also end up with a more productive IT environment.


Chris Green from Etherlive explains Intel Active Management Technology features

 

I like to think these days I’m not easily drawn to the latest gadget craze so, when hunting down a company mobile phone handset, I was somewhat concerned that the team consistently said the iPhone 3G ticked all the requirements boxes. With our requirements including push email, WiFi and a good web browser, competition was relatively thin but the consumer label of the original iPhone did concern us.

After much discussion we opted for the iPhone 3G, and after three months of use by the team what’s our verdict? In a word – excellent. OK, it’s not quite perfect, but it is pretty darn good. Now, before I get spammed by all those folks who love to complain in forums about its shortcomings, let me expand on my view in a few key areas:

Call Quality – no issue here, much better than my previous few phones. I haven’t experienced the ‘call drop’ problem that has been reported so much – personally I suspect it was primarily related to a certain carrier’s network in the US.

Buggy OS – Yes the 2.02 firmware was poor, a step back from 2.01 but even so the phone still worked fine and the issues were contained to specific functions. OS 2.1 however has proved to be excellent – as stable as a normal phone, and much better than your average smart phone.

Battery Life – No smart phone has the multiday battery life we expect from a more traditional phone. The iPhone is no exception, so yes it needs charging daily but with the 2.1 firmware it is very reasonable considering what it is delivering.

Email Integration – We use Microsoft Exchange, although I also have a POP3 account running on the phone too. Both accounts work flawlessly with excellent usability. The only letdown is the lack of task and note integration and the missing ability to invite people to calendar items on the phone. Hopefully a software update will resolve that.

Camera – Not what it should be, but hardly a major issue on a business phone.

App Store – A great killer app for Apple, an environment that is controlled enough to give you confidence in the apps that cover everything from games to serious business tools. 

The point is that if you focus on narrow aspects you will find imperfections but when you take the whole package together you get an impressive device that is very user focused, comes at a good price point and yet still fits comfortably into your pocket (a key requirement for me).

What I haven’t mentioned of course is the fact that the iPhone blurs the edges between a consumer item and a business tool. Personally for me that’s a great bonus – one device that meets my business and personal needs, and I suspect most professionals like that idea too. The people that don’t like that idea are in enterprise IT departments.

And that’s the big issue. The iPhone 3G is a great small/medium business tool but I doubt it will succeed in the enterprise environment. Having spent 13 years working in a very big enterprise IT department I know all the questions and issues that will be raised which, in the view of those departments, make the iPhone 3G completely unsuitable for enterprise use.

Sadly many enterprise IT departments are struggling to keep up with where their users are – they are worrying about the latest encryption standards whilst the sales team are happily copying confidential presentations on USB memory sticks. In one company I know over 60% of the company laptops have iTunes installed. Then there are all the people who are happily syncing their non-company phone via a dock, copying contacts, email and confidential information onto a completely uncontrolled device.

Rather than embracing this, most enterprises continue to fight it – a futile exercise – but because the iPhone looks like a consumer device, is seen as a gadget and straddles the consumer/business boundary, it will more than likely be officially kept out of most enterprises. Of course, users will find any way possible to get them in through the back door.

As for us, we have already moved to the next step and are developing applications for the iPhone which form part of our interactive event strategy – from basic event guides to location-aware solutions, video streaming and real-time information screens.

Now I’m not saying immediately ditch your Blackberry, Nokia or whatever, but I am saying ignore the hype and the naysayers, focus on user requirements and keep an open mind. The iPhone isn’t for everyone but it does set the scene for the next generation.