Mar 03 2018

The traditional approach to the WAN is starting to wane as organisations continue on their cloud strategy journeys. Now I don't believe that home-grown data centres (DCs) are going to disappear overnight, but what we have today is a varied mix of multi-cloud, CoLo and on-prem, and that mix will persist for a number of years.

However, this approach to hosting is making the industry rethink how to consume all these hybrid services. The rethink comes at a time when a new set of buzzwords is hitting traditional networking: I mean, of course, SDN (Software Defined Networking) and its more recent cousin, SD-WAN (Software Defined WAN).

SDN is definitely the future, but not what I want to concentrate on today. SD-WAN is more exciting at this juncture, and whilst most people will tell you the main benefit of SD-WAN is vastly reduced WAN operating costs (some believe up to 2.5 times cheaper), there is also a great opportunity to implement agile WANs that can consume the new array of service locations, such as cloud and CoLo, and deliver them to your edge network.

Underneath SD-WAN are four key principles:

  1. A Telco-Agnostic Approach – in essence it can support multiple transport types (MPLS, Internet, 4G/LTE) – known in some quarters as Hybrid WAN.
  2. Dynamic Path Selection – choosing the appropriate physical path and sending critical traffic over the appropriate links.
  3. Application Optimisation – utilising caching and compression techniques to improve application performance. Think of it as dynamic QoS.
  4. Simple Centralised WAN Management – and, more importantly, removing telco lock-in for all those complex MPLS changes and QoS tweaks that come at a cost.
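To make principle 2 concrete, here's a minimal sketch of how a dynamic path selector might behave. The link names, metrics and SLA thresholds are all invented for illustration; real SD-WAN appliances measure these continuously per tunnel.

```python
# Illustrative sketch of dynamic path selection: pick the best link
# for a traffic class based on measured latency and loss.
# Link names, metrics and SLA thresholds are invented for the example.

LINKS = {
    "mpls":     {"latency_ms": 20, "loss_pct": 0.1},
    "internet": {"latency_ms": 45, "loss_pct": 1.5},
    "lte":      {"latency_ms": 60, "loss_pct": 2.0},
}

# Per-class SLA: the maximum tolerable latency and loss.
POLICY = {
    "voice":       {"max_latency_ms": 30,  "max_loss_pct": 0.5},
    "best_effort": {"max_latency_ms": 500, "max_loss_pct": 5.0},
}

def select_path(traffic_class, links=LINKS, policy=POLICY):
    """Return the lowest-latency link that meets the class's SLA,
    falling back to the lowest-latency link overall."""
    sla = policy[traffic_class]
    candidates = [
        name for name, m in links.items()
        if m["latency_ms"] <= sla["max_latency_ms"]
        and m["loss_pct"] <= sla["max_loss_pct"]
    ]
    pool = candidates or list(links)
    return min(pool, key=lambda name: links[name]["latency_ms"])
```

The point of the sketch is the failover behaviour: degrade the MPLS link's metrics and voice traffic moves to the next-best link automatically, with no telco change request in sight.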

There are other facets, but these principles drive a lot of the benefits that organisations are just starting to see from adoption.

There is a lot of technology out there that can deliver SD-WAN (Gartner had already identified 23 key vendors for SD-WAN back in May 2017) and the battle for the WAN edge is hotting up. Traditional router hardware will give way to vCPE (Virtual Customer Premises Equipment) along with VNFs (Virtual Network Functions), all sitting on some x86 server tin with a competent hypervisor. The VNFs will sit alongside the vCPE, providing what is basically a "branch in a box". (And yes, vCPE is in reality just one type of VNF, but we'll let the tech marketers have their moment.) Rapid deployment will be a key aspect of such solutions, with a price point that should be competitive when compared with multiple hardware-specific solutions. And then there's the space and power saving aspect, which is not to be sniffed at, especially in the cost-sensitive retail market.

I believe we are at a key juncture in networking that demands a review of organisations' roadmaps for their WAN. The gestation period for a large enterprise to consider a change of WAN provider, let alone of the technology as well, is long, and the actual transformation can be longer. However, one of the benefits of SD-WAN is that you can "dip your toe" into various solutions whilst maintaining your traditional WAN. Organisations need to start seeing how their teams can evolve into this paradigm shift in WAN management, and along the way understand whether in-house or managed is the way to go.

Whilst traditional managed WAN services will ultimately decrease, I believe we will start to see the likes of traditional VNOs (Virtual Network Operators) evolve their offerings by providing a Managed SD-WAN. (Some have been out there for a while – just take a look at what Talari Networks are doing stateside.)

So ultimately it’s time to stop thinking MPLS = WAN, and to start being driven by software!



Oct 24 2015

There have been many incidents over the years of hacked organisations and personal details "spilled" across the internet. The very recent TalkTalk incident, though, I suspect will be TalkedTalked about (pardon the pun) for some time to come.

Notwithstanding this being their third such breach in a year according to the press, this one is significant to me for two reasons: 1) TalkTalk seem to be saying they do not know whether they were storing critical customer data encrypted, and 2) as possible proof of this, some customers are reporting attacks on their own bank accounts and having "money drained".

I suspect there will be a number of IT Security specialists currently or previously employed by TalkTalk thinking a clear “I told you so”. Having been responsible for key IT Infrastructure & Security systems across a number of organisations, I know too well the “Risk versus Cost” game. I would surmise that TalkTalk have played this game, and have thoroughly and publicly lost. Lost may be too kind, it was more like they “had their arse handed to them”!

It is very likely that the black hat hacker(s) had to work hard to penetrate the external defences (well, I hope they did), but they must have thought good old Kris Kringle had visited early when they found the unencrypted data at rest. It is unacceptable, in this modern-day, eCommerce-led world, to believe that encrypting critical stored data is not mandatory.
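I'm not about to design TalkTalk's data protection here, but to show how low the bar is, here's a sketch of one basic at-rest control: storing only a salted, slow hash of each customer's password, using nothing more than Python's standard library. The iteration count and function names are illustrative only; pick your own parameters against your own threat model.

```python
# Illustrative sketch: store only a salted, slow hash of a password,
# never the password itself. The iteration count here is illustrative;
# tune it to your own hardware and threat model.
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Return (salt, iterations, digest) for storage."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, expected):
    """Recompute the hash and compare in constant time."""
    _, _, digest = hash_password(password, salt, iterations)
    return hmac.compare_digest(digest, expected)
```

With something like this in place, a stolen database yields salts and digests, not anything an attacker can drain a bank account with.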

As an analogy of changing times, not so long ago we didn't lock our front doors, but who would now leave their house, or indeed go to bed at night, with the front door unlocked? We have to be realistic that this is the cyber-led world now, and criminal profit is far easier gained in bits and bytes than with breaking and entering.

TalkTalk may previously have been faced with a project funding decision in the millions when considering how to encrypt their at-rest data, and possibly took the risk rather than the cost approach. What is their view now, as the financial impact on their business will undoubtedly dwarf any implementation costs they were previously presented with? They will also now have to deliver an accelerated solution for encryption if they want to stem the flow of business loss, and it will be an order of magnitude more costly now.

Dido Harding, the current Chief Exec, is now telling the media that the situation is not as bad as first thought, as they don't store full credit card numbers and customers' TalkTalk My Account passwords have not been stolen.

IT Security specialists get a hard time in the industry, constantly lambasted for being blockers to progress in projects and for adding unnecessary costs. If anything good should come of this incident, it should hopefully be that business executives now heed the warnings from the multitude of reports and audits they receive, and make the necessary investment.

This investment is paramount, as they need to protect what are, for most organisations, their most precious assets: their customers.


Mar 15 2015

Having worked in a number of organisations over the last 30 years in IT, I have noticed a common and still persistent set of issues. I have mainly specialised in the design, implementation and operation of IT infrastructure: the key building blocks that support the critical applications that make our everyday modern lives possible.

It is in the infrastructure arena where I see these repeated common issues:

  1. A lack of ongoing investment in infrastructure, leading to performance, capacity and security issues.
  2. An inability to understand and implement a service recharge model that effectively finances infrastructure operation and maintenance.
  3. Architectural teams that design solutions that are operationally inefficient, costly to maintain and complex to integrate into existing landscapes, leading to incidents and protracted recoveries.

With the advent of larger and larger virtualisation transformations, cloud or not, infrastructure robustness becomes more and more key, as multiple silos of implementation stacks disappear, replaced with vast swathes of virtual environments.

Organisations had a basic understanding of how to recharge the business areas when you could clearly identify the "tin" that went into platforming a specific solution. The virtual world brings a new challenge: for internal IT departments to transform themselves into service providers. There's a reason why IaaS, PaaS and SaaS are key buzzwords of the infrastructure world. They shouldn't be the sole domain of cloud providers, and all IT departments should be seeking a way to construct their offerings as "as a Service" capabilities. They should also work with finance to build the associated recharge models, giving business areas a clear and balanced process for paying for consumed services, which in turn allows IT to maintain, scale and secure its infrastructure.
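To make the recharge idea concrete, here's a toy sketch of a consumption-based chargeback calculation. The metrics and unit rates are entirely made up; a real model would come out of that partnership with finance.

```python
# Toy "as a Service" recharge calculation: bill each business area for
# the virtual resources it consumed in a month. The metrics and unit
# rates below are invented purely for illustration.

RATES = {
    "vcpu_hour": 0.04,        # per virtual CPU hour
    "gb_ram_hour": 0.01,      # per GB of RAM per hour
    "gb_storage_month": 0.10, # per GB stored for the month
}

def monthly_charge(usage, rates=RATES):
    """usage: dict of metric -> units consumed by one business area."""
    return round(sum(units * rates[metric] for metric, units in usage.items()), 2)

# Example: a hypothetical finance department's consumption.
finance_usage = {"vcpu_hour": 2_000, "gb_ram_hour": 8_000, "gb_storage_month": 500}
# 2000*0.04 + 8000*0.01 + 500*0.10 = 80 + 80 + 50 = 210.0
```

The mechanics are trivial; the hard part, as the post argues, is agreeing the metrics and rates with finance so the business sees them as fair.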

For some organisations these issues can be a massive barrier to bringing IT back into the fold as a performant element of the business that is seen as an enabler. The first step in this key transformation is for IT to put its arms around its estate and understand the landscape from both an application and an infrastructure aspect, fully documenting and understanding the dependencies. These key landscape documents allow various techniques to be adopted, either for assessing the current hot spots through differing heat-map lenses, or for setting target landscapes and providing key roadmaps for delivering to those targets.

The next key step is a partnership with finance to build those recharge models and demonstrate to the business IT's ability to become a service provider of choice. Organisations have to avoid the default reaction that outsourcing solves all. There is a place for having commoditised elements of IT run as a managed service, but outsourcing your risk rarely works.

The final step is to have IT Change Delivery (IT Projects / Portfolio) create an operationally aware approach to the design and implementation of solutions. This is done by working with their architectural capabilities as a key bridge between the Change and Operations worlds. Solution Architects have to keep a keen eye on a solution's ability to be financially (capital and revenue) efficient and operationally acceptable. This makes service transition an enabler, not a barrier, to project delivery.

These transformations are not quick by nature, and there is a need to accept that those roadmaps you have built are the start of a long journey, one that will ultimately deliver the performant, robust set of services that modern-day organisations demand.


Aug 23 2014

A recent addition to the Technology Marketeer's dictionary has been "Ecosystem". Now the traditional definition of "Ecosystem" is something along the lines of "a biological community of interacting organisms and their physical environment."

Now the likes of Apple would like you to think of an Ecosystem as a set of Apple-sourced hardware and services working in unison, solving all your life's problems. So there you are with your iPhone, iPad, MacBook Air and your iMac, all in perfect harmony via iCloud.

Now obviously the cynical amongst us will recognise this for what it is: a blatant attempt at vendor lock-in. Once they have your data, your documents, your photos, your music, your contacts (you get the drift), then moving away from any one of those devices would break your perfectly balanced "Ecosystem" world.

However I’ve been on a voyage of discovery attempting to create my own “Ecosystem” that delivers that zen of balance in my life whilst retaining a level of agnosticism from any one manufacturer.

The voyage is not complete, but I am past the Bay of Biscay and into calmer waters. (Holiday cruisers using Southampton will get that reference.)

So what was my destination, my utopia? Well, I want immediate access to my documents in the Martini fashion (Anytime, Anyplace, Anywhere); I want to be able to edit and review. I want access to my emails (currently I monitor around 5 different addresses). I want a centrally managed set of web links. And of course I want 24×7 access to my social media of choice.

So first to my agnostic set of platforms, I am currently sporting:

  • Apple iMac 27″  – Mac OS
  • Google Nexus 7 Tablet – Android
  • Apple iPhone 4S – iOS 7
  • Lenovo Yoga 2 11 – Windows 8.1

The iMac gives me my home office desktop environment, and I also run VMware Fusion, allowing me to run Windows XP, Windows 7 and Windows 8.1 when the need arises for any legacy Microsoft apps.

The Nexus is my home tablet for couch surfing and the usual eCommerce needs (eBay, Amazon, PayPal, banking), and even acts as the remote control for the TiVo box!

The Lenovo is the latest addition: an 11″ laptop with a touch screen that converts to tablet mode via a 360-degree hinge. I have grown more familiar with Windows 8.1 and once again recognise that whilst Microsoft are usually late to the party (just look back at the Internet boom), they do start getting it right eventually. The Lenovo provides me with my portable office environment, and it is a capable entertainment device.

Finally, the iPhone is my smartphone of choice (for the moment), although I have been flirting with a return to Android, but have sent back two devices recently as build issues still plague a lot of devices in that format.

So what binds these devices into my "Ecosystem"? Well, it is a mix of services that has been changing slowly over the months, but currently consists of:


My primary email domain is hosted on a SmarterMail host, which has served me well for many years. However, whilst I still use it to process my emails, I wanted access to all my stored emails in the Martini fashion. Whilst SmarterMail emulated Exchange quite well, I couldn't quite get it working across my array of devices.

That's where Office 365 for Home came in, more of which below, but for email it is predominantly Hotmail as was, then for a while, and now Windows 8.1 obviously has native support built in with the Mail app, and you can also access it via Microsoft Outlook if you want a richer user experience. (In case anyone is struggling to configure older versions of Outlook to use addresses, the server name is simply and then use your full email address as the username.)

One of the features of is that you can configure it to send and receive as your own domain name, simply by providing it with POP3 login details. So effectively this is my address.

I have managed to migrate all my stored emails from Outlook for Mac on my iMac to online, simply by configuring my Outlook for Mac with IMAP to and dragging and dropping all the folders onto the inbox. It took a short while to sync, but now I have access to all my emails.

On the iPhone I have configured it via the built-in mail account. On the iMac I am using Outlook for Mac. On Android you have a choice: you can use Exchange settings with the server I detailed, or there is an app which works, if a little "bulky" looking.
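For anyone wanting to script that kind of migration rather than drag and drop, here's a rough sketch using Python's standard imaplib. The hosts, credentials and folder names would all be your own; nothing here is specific to any one provider, and the functions only connect when you call them.

```python
# Rough sketch of copying mail between two IMAP accounts.
# Hosts, credentials and folder names are placeholders - nothing here
# is specific to any one provider.
import imaplib

def connect(host, user, password):
    """Open an authenticated IMAP-over-SSL connection."""
    conn = imaplib.IMAP4_SSL(host)
    conn.login(user, password)
    return conn

def copy_folder(src, dst, folder):
    """Append every message in `folder` on src to the same folder on dst.
    `src` and `dst` are already-authenticated IMAP4 connections."""
    src.select(folder, readonly=True)
    _, data = src.search(None, "ALL")
    for num in data[0].split():
        _, msg = src.fetch(num, "(RFC822)")
        dst.append(folder, None, None, msg[0][1])
```

Usage would be something like `copy_folder(connect(old_host, user, pw), connect(new_host, user, pw), "Archive")`, repeated per folder; the drag-and-drop sync described above does effectively the same thing under the hood.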


So now onto my documents. Until recently I was using Dropbox, on which I had a plentiful 20GB of free storage thanks to a phone purchase a while back. However, I recently started using Office 365 for Home, which provides me with the Microsoft Office apps so essential for everyday business, and then the new and improved OneDrive suddenly got upgraded to 1TB of storage, and I got 5 sets of those, as I can create 5 users for Office 365 for Home.

OneDrive on the iMac has the same features as Dropbox, in that it looks like standard folders to the iMac but automatically syncs the files up to the wonderful cloud.

There are OneDrive apps for Android and iOS, and while they still have some bugs to work out, they integrate nicely into both environments. So I have access to my docs, and I can process them in Office. On the iMac I use the Mac Office suite, which comes with Office 365 for Home.

On the iPhone there is Office Mobile which is free and your Office 365 for Home account unlocks the edit capabilities.

On Android there are a number of capable Office compatible apps, or you can use the Office Mobile app again available for Android.

Finally, on the Lenovo I use the full Office suite with the licence provided by my Office 365 for Home account.

So my “Ecosystem” is working almost flawlessly and I am productive wherever and whenever, but I am sure that my voyage will continue as and when new solutions surface.

However, I am not beholden to any one manufacturer, and I have the flexibility to switch kit as and when the desire or need arises.

As a final thought, I believe that as the desire of companies (and employees) to utilise BYOD (Bring Your Own Device) grows, the need to create your own "Ecosystems" will increase.


Jan 07 2014

The tech savvy out there will have been hearing the buzz phrase "The Internet of Things", or IoT, more and more. The Oxford Dictionary defines it quite dryly as "a proposed development of the Internet in which everyday objects have network connectivity, allowing them to send and receive data". Wahooo! I hear you say, but what does that exactly mean for everyday life? So let me explain it a different way, or in my way anyway.

We are all pretty much used to the Internet and what it delivers today, but that internet, for us, predominantly starts and finishes on a screen. What I mean by that is our everyday interaction with the Internet is totally virtual, whether we are reading emails, shopping, doing our bit in social media or streaming EastEnders on iPlayer. Regardless, it never gets any further than the computer monitor, the tablet screen or our smartphones.

So what does IoT do to change that? Well, it connects everyday physical objects to the internet to extend or improve their functionality, by allowing communication via the internet either with other physical objects or with virtual systems. The outcome of that communication, to my mind, is then also a physical event. Let me give you an example – Hive Active Heating – the link takes you to details of a system British Gas are selling which connects your home central heating system to the internet. With the appropriate app on your smartphone you can then control the heating remotely. Not a mind-blowing example of the potential of IoT, but an example nevertheless. Your central heating is connected to the internet, and by communicating over it you can change the physical state of the heating in your house.
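Hive's actual API isn't documented here, so the endpoint, token and payload below are entirely hypothetical, but the shape of the interaction – an app talking HTTP to a thermostat service, which in turn nudges a physical boiler – is roughly this:

```python
# Entirely hypothetical thermostat API: the host, endpoint, token and
# payload shape are invented to illustrate the IoT pattern, not Hive's
# real interface.
import json
import urllib.request

API_BASE = "https://api.example-heating.invalid/v1"  # placeholder host

def build_request(target_celsius, token):
    """Build (but don't send) the HTTP request a phone app might make
    to set the target temperature."""
    body = json.dumps({"target_temperature": target_celsius}).encode()
    return urllib.request.Request(
        f"{API_BASE}/thermostat",
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
```

The cloud service receiving that PUT would relay the new set point down to the hub in your house, and the physical event – the boiler firing up – follows.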

Now I see IoT in the same vein as the good old "The Cloud" hype of a few years ago. We all now live in "The Cloud" in one way or another, and in the corporate world the actuality of cloud-based solutions is well established. However, when "The Cloud" first cast a shadow over the IT industry, it was hype waiting for technology to catch up.

IoT is just the same: the potential is there, but the technology, and the uptake of that technology, is yet to happen in sufficient amounts for it to be an everyday occurrence. One thing that might hamper that uptake is resistance from the general populace to accepting further integration into cyberspace. This is usually borne out of fear, ranging from paranoia about a real-life SkyNet occurrence to a more sensible fear of the erosion of our normal social interaction and an ever-increasing dependency on technology. I for one don't have any of those fears, although I still think Arnie is a cyborg from the future!! I embrace such advancements and see them as bringing the benefits that we have all come to accept from such technology.

You only have to look at the advancements in communications that led ultimately to the likes of the internet and the widespread use of smartphones. We evolved from the telegraph through to the telephone, then onto mobile phones, then texting; then, aligned with the evolving internet, we had mobile data, and then the smartphone was born, embracing all these technologies. At each stage society had elements of fear of each technology; we've all had those conversations about how mobile phones were destroying social interaction, and later on the same argument about texting killing the art of conversation. However, I see it as exactly the opposite, as the human race is adept at evolving with the use of technology. I see my own daughter holding frequent face-to-face conversations with her friends, whether actually face to face or over Apple's FaceTime. It extends her capacity for social interaction, it doesn't diminish it! Used correctly and sensibly, and yes, with the right security controls, all these technologies enhance our lives.

However, on to the Internet of Kerchings: all these technologies only succeed because there's a successful commercial imperative at play. The same will be true of IoT; it will need that initial push from commercial income, and that is starting to happen.

CES is one of the biggest shows each year for consumer technology companies to show off their wares and announce what's going to be big that year. So where better to see what is coming for the Internet of Things to turn it into Kerchings! Samsung are a great example of what to expect, with their announcement of a Connected Home. Of course, initially their solution will be specific to Samsung devices, locking you into their ecosystem, but they are offering control of your home from anywhere, straight to your Samsung Galaxy Gear watch. Samsung are keen to increase revenue from an increasingly saturated smartphone and tablet market, and what better way than locking you into proprietary systems. Of course, all these toys will have to play nicely together one day, so you can control your various devices from any manufacturer. Until then, though, expect the leading technology companies to be looking for your cash from buying into the Internet of Kerchings!

Jun 18 2013


I've been a long-time fan of Google Maps as my SatNav of choice, ever since the Android app started its beta navigation option. If you don't switch off your brain totally, it rarely "steers" you wrong, and it is great for local route finding for those difficult-to-find places, and for long journeys where advance notice of traffic issues is a great help.

However, I'd come across the recent headline-grabbing Waze some weeks back, before the news of their sale to Google hit. Waze, as you can guess, is a smartphone (Android & iOS) based SatNav app. However, it has a unique feature in that real-time traffic alerts and info are made available through a community-based approach. This approach, sometimes referred to in the industry as crowdsourcing, gathers information proffered by its over 30 million users as they drive around.

I decided to try the application out, as the approach was interesting and the screenshots I had seen demonstrated a slightly fresher GUI approach than Google's, on the face of it.

Waze has a cartoon-like interface and offers the user "sweeties" as bonuses along their journeys, as they demonstrate their contribution to the service and clock up miles travelled. This builds up your profile and provides a type of gaming approach, probably to pander to Generation Z users. However, what is disconcerting is the appearance of a target sweet on the map at certain times, and the compelling urge to "dash" towards it. Not conducive to safe driving, but that might just be me 🙂

That being said, the mapping is easy on the eyes and the status tabs at the top and bottom provide clear information, probably a little less cluttered than on Google Maps.

Another winning point for Waze is its spoken junction descriptions. Google Maps takes the approach of reading out the whole sign that you would see on the roadside; for a busy junction, this means being past the junction before it has detailed the various destinations and road numbers. Waze simplifies this into clearer directions, detailing just what you need, enhanced by the top tab, which gives a clear idea of which numbered exit to take and, in the case of ring roads, whether it will take you clockwise or anti-clockwise. Having said that, Waze seemed a little late announcing the junction exits, often stating 1 mile to the junction when in fact I was at the two-thirds mark, and this led to the final instruction of "take the exit" being a little too late. This could be caused by inaccuracies in my phone's GPS, but Google Maps doesn't suffer from this. If you keep an eye on the directions and junction distances you shouldn't have a problem.
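Those late prompts probably come down to when the app decides you've crossed its announcement threshold. A toy version of that decision, with invented thresholds and an assumed GPS error margin, looks something like this:

```python
# Toy model of exit-announcement timing: announce early enough that a
# stated worst-case GPS error can't make the prompt arrive after the
# exit. Both thresholds are invented for illustration.

ANNOUNCE_AT_M = 400   # nominal distance before the exit to speak
GPS_ERROR_M = 150     # assumed worst-case GPS position error

def should_announce(distance_to_exit_m,
                    announce_at=ANNOUNCE_AT_M, gps_error=GPS_ERROR_M):
    """Speak once the pessimistic distance (the car could really be
    this much closer) drops below the announcement threshold."""
    return distance_to_exit_m - gps_error <= announce_at
```

An app that ignores the error margin (gps_error=0) would, on a bad GPS day, speak a few hundred metres later than it thinks it is, which matches the "take the exit" arriving too late.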

Waze also adopts the feature used by Google Maps whereby it zooms into the detail when approaching junctions, so you can get into the right lanes and see more clearly where the actual road exits are. This works well, especially if you are using a standard smartphone and not some humungous phablet. (My views on phablets coming soon!)

The real killer test of whether a SatNav app is any use is, of course, the routes it finds and their usability. On the whole, the main routing for long distances, utilising motorways and A roads, is OK with Waze, but it starts to let itself down on the more intricate routing as you get closer to your destination. I found that it took a very basic "get there by the most direct route" approach, which included the use of 20mph-zoned roads and humped roads. At one point it also wanted me to turn right at a no-right-turn filter. So some work is needed on the data held on junction types and the minor road options in cities. Google is somewhat better at this, but to be fair it does sometimes send you up the back of beyond.

The traffic information is useful and can provide advance warning of issues to allow you to reroute. What does worry me a bit is the need for the driver to input the update into the app whilst obviously driving. Not the best idea, but the risk is somewhat reduced by the use of quick-select icons for each type of issue. You are also offered the opportunity to provide feedback if an incident (or speed camera / trap) is not actually there as indicated. Again, one of the powers of crowdsourcing is an element of self-moderation.

So of course Waze hit the news recently as they announced they had been purchased by Google and would be allowed to continue their Israel-based development, which was one of their key requirements of the sale. They of course got $1.3b for their four years of effort, and each of the founders became an instant millionaire (in any denomination), earning between $38m and $78m.

The bigger question is why Google would buy them, considering the success they have with Google Maps. One speculation, obviously, is to keep such a good competitor app away from their main rivals. However, I also believe there are some great benefits that Google may derive from the technology developed within Waze. The crowdsourced data itself could easily be utilised by Google Maps to provide auxiliary information for their traffic feeds. Time will show us what the real plans are; in the meantime Waze will continue to develop what is a very mature and usable app for the smartphone market.

For myself, I will probably be sticking to Google Maps, more because it's the devil I know than because Waze cannot compete. I may change my mind in time.

Jun 09 2013

There have been a few articles recently that have caused me some consternation. They are all, in my humble opinion, basically linked in their root cause. (I'm big on root cause analysis; not enough of it is done by my IT colleagues in an industry eager to simply rectify the immediate issue and lay blame, rather than learn from its mistakes – anyway, I digress.)

So the three core news matters I am talking about are: 1) the call to the internet industry to regulate porn, especially the wholly unacceptable child porn industry; 2) the building story of how (allegedly) the US Government is watching everything we do on the internet and telling our "Dad", or GCHQ as it is known; and finally 3) the lesser story around the call from police to have smartphones fitted with a "kill switch", rendering them inoperable in any country and thus making them less desirable as the theft item of choice.

Now, apart from the tech, you may be wondering how I think they are linked.

However first a quick examination of the usual approach being taken, albeit fuelled by the media, on these stories.

Taking each in turn, and dealing first with the worst of these matters: the rising concern that access to child porn has led to some horrific news stories of late. So don't get me wrong; given the simple option, I would press the magic button that would eradicate the content and access for this insidious industry. However, what we see in the media is a call to the Internet Service Providers, and specifically the search engines, to block this content. This is a sticky-plaster approach, and whilst on the face of it it would "solve" the problem, do you really think this would stop the content being available or distributed merely because Google didn't return an easily clickable link? We delude ourselves if we truly think that.

The likes of Google are easy media targets, being touted as the big evil empires that are spreading this content. This is just not the case, and we all know it. You also need to understand the immense complexity of sifting this content from the acceptable content, which would be a moving target, as the actual perpetrators would constantly shift and morph their cyber appearance.

So to the root cause, which is not a technology answer, but a recognition that morals and acceptable behaviour have become eroded in modern life, and that we choose (yes, we are to blame) to ignore what is in plain sight. We have allowed this underworld culture to develop and thrive, and we need to take control back again. Yes, by all means use technology in an effort to monitor (more on that in point 2) and protect, but we have to accept responsibility for the way we bring up our children and develop them into society. As with any complex problem resolution, it will take time to rectify, and there are no "quick fixes", even if the politicians want them as their next campaign vehicle.

I haven't got the panacea for this, and wouldn't dare claim to; I just know that I've had enough of "sticky plasters".

So to point 2, and to Big Brother watching you. I've got a simple edict I live by: "If I never do anything I am ashamed of, or that is illegal, why would I be bothered if anyone knows what I am doing, where I am driving, and who I 'liked' on Facebook or 'followed' on Twitter?"

Now I know for a lot of people this will be the red rag to the bull, but to be honest I don't care; if this gets someone inflamed, I start getting suspicious. Also, if you do insist on being a naughty boy or girl, then don't tweet it, don't Facebook it, don't film it and "whoops" upload it to YouTube. They're called social media for a reason.

OK, so now I'll get the rants about privacy invasion of email or chat, and this is where it gets tricky. Whilst in an ideal world the right authorities would have access to all data in the course of investigations, we do of course have the issue of corrupt use of this data. However, while those debates run, the real criminals are getting away with it.

So again I come back to root cause, and the fact that the perpetrators of nefarious deeds will utilise any technical means at their disposal. We need to look to long-term solutions to the actual reasons why people are driven to crime, and we need to look again at our society, our acceptance of slipping morals, and an increasing laziness on our part to accept responsibility for our own actions and for the actions of those we are accountable for. So again, no quick "sticky plaster" fixes, but hard graft on long-term fixes.

The last story caught my eye this weekend, and was kind of the straw that broke the camel's back. The police are calling for tech to be added to phones to allow them to be permanently disabled remotely. This is not a new idea, as I was designing software to do exactly this for a new generation of digital mobiles in the late eighties. The idea is fraught with issues, the biggest being that you just know some numpty sat in his/her bedroom hacking away will come up with a virus to activate the kill switch. The tech will push up the price of the phones, and finally it will further fuel the Big Brother debate I have already spoken about.

So, quickly, to the root cause, as I have vented quite sufficiently on this blog today: once again it’s the acceptability of petty crime in society that allows people to believe they have a right to steal because it is a “victimless crime”. There is no such thing, especially when you consider the cost of smartphones, whether up front or embedded in rentals; the increasing popularity of dedicated mobile phone insurance (check your home policies, folks, if you think your house insurance covers your mobile away from home these days); and the loss of business individuals suffer when they lose the mass of data on these devices (Backup, Backup, Backup, guys, no excuse these days).

So I am back to my original rant on declining morals and the lack of accountability that has eroded our society. It is all too easy to sit back in moral judgement over technology and the big evil corporates who run it, but we all need to start looking a little closer to home now and then.

I’ll get back in my “technology” box and return to blogging on pure tech now.


May 19 2013


UPDATED 19-05-2013 – Coincidentally my favourite tech site ran a review today on the StewartStand RFID-blocking Card Case

It wasn’t that long ago that we were happy to queue at the till and wait for the salesperson to fill out the credit card slip, imprint it with the card details and then get us to duly sign the carbon copies.

This process served us for years, and whilst fraud was possible it was rarer than it is today. However, the need to authorise transactions and ensure that 1) the card wasn’t stolen and 2) the transaction would be honoured by the card issuer led to the advent of Point of Sale (PoS) terminals.

PoS terminals, made possible by cheap modems and the ability to quickly (well, quickly by the standards of the day) generate an authorisation code for a transaction, both sped up the till process and provided improved assurance to merchant and card issuer alike.

We were still happily signing away little slips of paper, content that the APACS (now the UK Payments Administration) protocols were dishing out valid authorisations.

(Little side story: in the ’90s I developed a mobile PoS solution using a Verifone terminal and a Nokia phone that worked over GSM and provided APACS 30 authorisations in under 15 seconds!! I can see you’re impressed – Click here for the proof.)

Fraud was still rife, and copying cards and their near-simplistic magnetic stripes was easy, so the industry needed something more secure. We had started to get used to using PINs in ATMs to get at our cash, so a means of using that PIN out and about for card authorisations was needed. And thus we entered the era of Chip & PIN.

The chip is of course the little embedded chip we have all come to know on our credit cards. This chip, also known as a smart card (and yes, it is pretty much the same tech as you get in your mobile phone’s SIM card), provided a means of verifying the PIN on the card itself, so the code never needed to travel to the bank for validation. This allowed the PoS terminals to validate the PIN before continuing with any transaction, whether offline or online. Rolling out in full in the UK from 2004, this has become the near de facto method of processing card transactions in retailers. (Although when all else fails we can still wait and sign a piece of paper!)
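As a small aside, even the card number itself carries a simple arithmetic check, the Luhn checksum, which catches typos and transposition errors at the till or keyboard (though of course it offers no protection against deliberate fraud). A minimal sketch in Python, using a well-known documentation test number rather than any real card:

```python
def luhn_valid(card_number: str) -> bool:
    """Check a card number against the Luhn checksum.

    Catches most single-digit typos and transcription errors;
    it is not a fraud or authorisation check of any kind.
    """
    digits = [int(d) for d in card_number if d.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# "4111 1111 1111 1111" is a standard test number from payment docs:
print(luhn_valid("4111 1111 1111 1111"))  # True
print(luhn_valid("4111 1111 1111 1112"))  # False
```

Every scheme since the carbon-copy days has run this check before bothering the issuer, which is why a mistyped card number is rejected instantly rather than after a round trip.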

However, this was still not fast enough for the Generation Y (and now Z) populace, and a faster way of spending our dosh was needed.

Along comes Near Field Communication (NFC), pretty much a tech looking for a problem. Take a look at the link for the full gory details, but basically it is a mechanism by which two devices can exchange a small amount of data very quickly by being in close proximity to each other, say 4 inches. It’s a more grown-up version of RFID, those smart tags placed on products so they can be tracked from manufacture to consumer – yes, big brother is watching and knows what you like!

NFC offered the payments industry a great opportunity to effect a faster way of authorising and processing a payment transaction than the old, slow and kludgy Chip & PIN. Basically, with NFC embedded in your credit (or debit) card, you could merely wave your card at a terminal and pay – thus “Wave & Pay”. (Not sure if anyone has trademarked it as such, and variations on a theme exist, but you get the gist.)

Slowly becoming more widely available across banks and card issuers, it now lets you buy your McDonald’s or pay for your lunch at EAT with a mere flick of the wrist.

Payments are currently limited to around £20 per transaction, as the thought of someone nicking your card, walking into Currys and making off with a 50″ TV was somewhat of a worry.

However, it seems that all is not golden in this new dawn. People’s anxieties about the proximity of their cards to readers, whether genuine or rogue, seem to be coming to fruition. As the BBC article today postulates, customers at Marks & Spencer, which has rolled out contactless (Wave & Pay) PoS terminals, are having their cards erroneously charged.

In my mind these are early teething problems, and only the start of what the fraudsters may also come up with. Don’t forget the big stories of the PoS terminals being tampered with and recording all Chip & PIN transactions in the likes of petrol stations.

If, however, you are concerned about the rogue element of having a contactless card in your wallet or purse, can I suggest you bolster the coffers of the entrepreneurial people who have come up with the concept of RFID-shielded wallets and purses. These items protect the contents, whether credit cards with NFC or indeed passports (which now utilise RFID), from being read. UK companies are waking up to the revenue opportunities – RFID Protect – and as ever the savvy Chinese manufacturing market has started to produce the goods – YAS Product.

So it’s now a case of wirelessly protecting your assets or Waving & Praying!!


May 16 2013


As soon as you mention that you work in IT, one of two inevitable questions gets asked: either “Can you have a quick look at my laptop?” or “I’m having terrible problems with my network at home, what do you recommend?”

The latter question is becoming more common with the increasing popularity of home broadband solutions, complete with the inevitable “free” wireless router. The key issue is that everyone thinks “wireless” is the utopian answer and their home will be a seamless oasis of connectivity.

The reality, however, is far from this, with laptops, tablets, smartphones, media devices and now the likes of YouView boxes all clamouring for the airwaves and struggling with security protocols and ever-changing wireless speeds.

Let’s change one perception: I have yet to find a house that gets sufficient wireless signal in every room from a single free wireless router, usually installed in the main lounge or office. I gave up on this approach years ago, initially forgoing WiFi and wiring my PCs and media devices to the router via what were then the quite new Power line communication devices. (Click the link to learn more about these, but in simple terms they use the ring mains in your house to create one virtual network switch using Ethernet signalling.)

This sufficed for a while, but as we know the advent of the smartphone and then the tablet was upon us, and the need to have these wirelessly connected in the home presented itself again. This was OK for the most part if I was sat in the lounge, but I needed a solution that could feed the various devices being used by the family across all the rooms.

Due to various swaps and upgrades with internet providers, I had a cupboard full of these free wireless routers. I did some research and found I was able to repurpose them into very flexible devices using a free firmware replacement called DD-WRT. This flexible firmware allowed me to configure these old routers as simple wireless access points, which I could attach to the various power line devices around the house. If you’re not too daunted by the tech, it is quite easy to “burn” the new firmware in following the website’s documentation. I mainly used old D-Link DIR-615 routers, popular with Virgin Media at the time, but the firmware works on a wide range of hardware, which is listed on the Supported Device Wiki.

Whilst a bit bulky in each room and taking up two power sockets, this solution worked perfectly for a number of years for me.

As I am always on a minimalist quest when it comes to tech, I recently decided to look again at the progress in power line devices. Their speeds have increased to support the demands of HD media streaming and the like that a lot of households now present. A number of the manufacturers are now embedding a WiFi chip into the plug-top devices. This effectively consolidates my customised DD-WRT routers into the plug-top devices, and with some, the inclusion of a passthrough socket means they don’t take up any mains socket space.

My current devices of choice are a range from Solwise, and whilst they are yet to do the faster 500Mbps devices with WiFi, I have opted for the 200Mbps device which comes with WiFi (model NET-PL-200AV-PEW-N). You do need to configure each device with your wireless settings, but the guides are very good. Direct purchase from Solwise was the best price I could find.

So my expectation is that the next generation of devices will offer 500Mbps over-mains speeds with built-in WiFi (usually around 300Mbps), with the local Ethernet port working at 1Gbps, thus supporting more recent desktops like Macs which come with 1Gbps Ethernet ports.
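It’s worth a quick back-of-envelope on what those headline figures mean in practice, since powerline link rates are raw figures and real throughput is much lower. The 40% efficiency and ~8Mbps-per-HD-stream numbers below are my illustrative assumptions, not measured results:

```python
def usable_mbps(link_rate_mbps: float, efficiency: float = 0.4) -> float:
    """Estimate real-world throughput from a headline powerline link rate.

    Headline rates are raw PHY figures; actual throughput is far lower
    and varies with the quality of the house wiring. The 40% efficiency
    figure here is an illustrative assumption only.
    """
    return link_rate_mbps * efficiency

def hd_streams(link_rate_mbps: float, stream_mbps: float = 8.0) -> int:
    """Rough count of concurrent HD streams (assumed ~8Mbps each)."""
    return int(usable_mbps(link_rate_mbps) // stream_mbps)

# A nominal "500Mbps" adapter might realistically carry ~200Mbps,
# which is still generous headroom for a family's worth of streaming:
print(usable_mbps(500))  # 200.0
print(hd_streams(500))   # 25
```

Even with pessimistic assumptions, a 500Mbps adapter leaves plenty of room for a household of simultaneous HD streams, which is why the 1Gbps local Ethernet port becomes the more interesting spec.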

Power line devices are now a mature product, and I fully expect that new-build homes will soon come with a network built in; in fact I recently spotted this product – the Power Ethernet Socket – which would allow for easy conversion of existing wiring in a home.

Just needs the WiFi chip and I’m sorted!

UPDATE – 18/07/2018

CompariTech have a good article on installing OpenVPN on DD-WRT

May 12 2013

I’m a big Audi fan and a gadget geek, so anything that plays to both these passions is a win. So I recently succumbed and bought an iOBD2 from XTOOL.

OBD (On-Board Diagnostics) is a near-universal industry standard for a diagnostic connector on most modern vehicles. It allows the appropriate equipment access to an array of information from the vehicle, including detailed self-diagnostic error codes.

For many years the connector was the purview of the motor industry, allowing them to attach expensive diagnostic monitoring equipment, suck their teeth and land you with an expensive bill for merely telling you why the engine fault light had come on.

More recently, more generic equipment became available for independent garages, so slightly less teeth-sucking, but a bill is a bill.

Inevitably, a number of entrepreneurial individuals developed connectors, cables and associated software to allow anyone with a laptop to access the same information. This was a great boon for DIY mechanics and also brought the equipment into the affordable realm of the smaller garages.

So to the iOBD! As the name suggests, it was originally developed for iPhone (or iPad) as a standalone connector that could wirelessly (via WiFi) send the data to an iPhone. The manufacturer of the device, XTOOL, also developed an app to display the various data feeds in a user-friendly manner. These dashboards give you access to information that may not usually be available via your normal dashboard, and allow the more savvy home mechanic to ensure their vehicle is performing at its peak and also detect any engine management error codes. They could then effect a repair or, if too difficult, at least be forearmed with the right information to ensure a fair charge ensues from the local garage.
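Under the hood, those friendly dashboards are just applying the standard OBD-II (SAE J1979) conversion formulas to raw bytes from mode 01 PID requests. A couple of the well-known ones sketched in Python (the raw byte values fed in at the end are made-up samples, not captured from my Audi):

```python
def decode_rpm(a: int, b: int) -> float:
    """Engine RPM from mode 01 PID 0x0C: (256*A + B) / 4."""
    return (256 * a + b) / 4

def decode_coolant_temp(a: int) -> int:
    """Coolant temperature in degrees Celsius from mode 01 PID 0x05: A - 40."""
    return a - 40

def decode_speed(a: int) -> int:
    """Vehicle speed in km/h from mode 01 PID 0x0D: simply A."""
    return a

# Illustrative raw bytes, as they might arrive from the adapter:
print(decode_rpm(0x0C, 0x80))     # 800.0 (a plausible idle speed)
print(decode_coolant_temp(0x7B))  # 83
```

The app’s job beyond this is mostly polling the PIDs quickly enough to animate the gauges, plus translating the stored diagnostic trouble codes into readable descriptions.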

The manufacturer then developed the Android version, which works over Bluetooth but in exactly the same manner.

So this is all fine and dandy, and I have quelled my gadget itch for a while.

However two things strike me as opportunities, one for a better use for the device and second for the industry in general.

So first: I am planning at some stage to change my Audi (and yes, probably for another Audi), and it came to me that this device would be great for assessing any used vehicles I went to test. The seller should not be averse to the device being attached, and the “vitals” could be monitored and any engine management codes also picked up. With a little research before the visits, you could easily assess what to look for as signs of impending issues. Considering that the device recently dropped in price in the UK to £45, this is a small price to pay for an initial assessment of a used vehicle purchase.

The next idea that came to me was that, with the advent of “smart” vehicles and the prevalent fashion for all things connected and Bluetooth in vehicles, it is not beyond belief that a cheap option for relaying the OBD signals via Bluetooth could be built in as standard. However, this would mean the industry relinquishing what I suspect is a sizeable revenue stream from ongoing maintenance.

It would only take one forward-thinking company, though, to start this trend, and the rest, I suspect, would follow.

In the meantime I’ll satisfy myself with my dashboard-mounted smartphone telling me exactly what my ignition advance is and whether my alpha sensor is working. (And no, I don’t really know if that matters.)