Detoxification Complete

So I got through it: the Metagenics 10-day Clear Change program.

I have to say I am very glad I did. It was not easy, but it was very worthwhile.

The last time I posted on this I was going through the hard part, the intense detoxification. I had some nights where it was hard to sleep, in a bad way, with aches and pains. This lasted for about 3 days, and the aches in my calves took even longer to dissipate. I think I probably didn’t drink enough water, which made this part harder.

Working out through the ache period was hard. I did what I could, but I was seriously weak from the restricted diet and not at my most mobile. If this discomfort was an indicator of the process working, then it’s safe to say it was working!

Now onto the good. And there is so much good.

I smashed my workout this morning, with the most energy and strength I can remember having. It felt good!

My mental clarity is amazing. I am thinking quicker, am more productive, and doing better work than I have in a long time.

My energy levels are great; I am steady all day with no caffeine. I have none of the cravings I used to have for soda, sugar, carbs, that kind of thing.

My sleep is amazing. I am out like a light and get high-quality sleep right through the night.

I can’t recommend this program enough. It’s something I wish I’d known about long ago, and I feel it’s given me a really positive reset. I recommend that anyone who thinks they might have run themselves down a bit, maybe with some unhealthy habits, give it a go. It won’t be easy, but it will be very worth it! System reboot!

This is the product I used: Metagenics – Clear Change Program – 10 Day (Peach Flavor)


My fiancée just recently completed a detox program and has been amazed at the results. I’ve always been skeptical about these things, but on her suggestion I am giving it a try.

I’m currently on day 4 of the Metagenics 10 day detox and am feeling results already.

Yesterday I started to really ache, like real muscle fatigue. Supposedly this is an indication that the process is working: the toxins are being pushed out into your bloodstream and muscles. Today it’s more of the same, but my head feels remarkably clear. Maybe that’s why I felt inspired to write this?

I’ll post more on results as I go along but right now it’s feeling like it was a good idea!

This is the detox I am using: Metagenics – Clear Change Program – 10 Day (Peach Flavor)

Please get secure!


There has been a lot of discussion recently about cloud security, and internet security in general. Much of this has been in response to several large security incidents, some involving cloud resources and others being general issues where part of the fabric of the internet has been compromised.

The net of this is that a massive number of credentials have found their way into the hands of those with malign intent, and some companies have been put out of business by what is clearly some form of organized crime.

Regardless of whether you run your service in your own data centers or in the cloud, several key best practices are mandatory if you wish to stay in business and protect your users.

Rallying Calls

Many will have seen in the press the attack on Code Spaces: after having their cloud credentials compromised, the company was held to ransom, and when they did not yield to the attackers’ demands, their assets were destroyed, putting the company out of business. By failing to properly protect their infrastructure, Code Spaces ceased to exist as a going concern.

In addition, there have been stories in the press recently regarding Russian hackers compromising billions of passwords. Although ultimately only users can address their password hygiene, as a service provider there are several things you can do to mitigate the risk associated with these kinds of breaches.

Both of these stories represent a clear and present danger to a business. One represents a direct attack on a company; the other represents a vector for an indirect attack, where malicious use could quickly consume the resources of a business and take it to a state where it ceases to be effective.


The Code Spaces incident highlighted the need to protect cloud infrastructure with the same rigor as physical infrastructure. The company failed to protect its infrastructure with reasonable access controls, so once the attacker got in, it was an uphill battle to close the door after the attack had been triggered.

They also failed to implement reasonable backup procedures: all of their infrastructure backups were stored with the same cloud provider. A typical backup procedure stores backups at a separate location so that if one location is destroyed, the data survives. In this situation a single provider should be considered a single site, and a separate backup of the data held elsewhere is essential.
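As a minimal sketch of that principle, a nightly job could verify that every backup also exists, byte-for-byte, at an independent second location. The paths and the `.dump` naming here are hypothetical:

```python
import hashlib
from pathlib import Path


def sha256(path: Path) -> str:
    """Stream a file through SHA-256 so large dumps don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_offsite_copies(primary: Path, offsite: Path) -> list:
    """Return names of backups that are missing or differ at the offsite location."""
    problems = []
    for src in sorted(primary.glob("*.dump")):
        dst = offsite / src.name
        if not dst.exists() or sha256(src) != sha256(dst):
            problems.append(src.name)
    return problems
```

Run against, say, a local backup directory and a mount of the second provider’s store; a non-empty return value is your cue to page someone before you need the backup, not after.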

Vendors who store users’ identity data have a responsibility to protect that data; that much is clear. They also have a responsibility to protect their customers when they become victims of a data breach. A breach stemming from an undocumented exploit is difficult to prevent, but a patching mechanism must be in place that allows issues such as Heartbleed to be addressed very quickly, shutting the door on additional data leakage.

A responsible vendor can implement measures such as multi-factor authentication, which helps protect both themselves and their customers when the risk of compromise would be damaging. In addition, a responsible provider can force password resets when data leakage is known to have occurred.
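As an illustration of the first of those measures, the one-time codes behind most multi-factor schemes need nothing beyond the standard library. This is a minimal sketch of RFC 6238 TOTP (SHA-1, 6 digits, 30-second window), not a production implementation:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, step=30, digits=6):
    """Minimal RFC 6238 TOTP: the rolling code a phone authenticator displays."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of time steps since the Unix epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

Verification on the server side is the same function: compute the expected code for the current time window (and usually the adjacent windows, to tolerate clock drift) and compare it against what the user typed.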

Stay tuned to this blog for more on what you can do to protect yourself and your customers on your journey to the cloud.



Open Source Log Management

Last night I attended a Cloud Austin user group covering various logging solutions. Some were fully open source, whereas others were proprietary.

LogStash was first up. It’s a Ruby-based solution that provides a means of templating and parsing logs into a common JSON format. From there it’s up to you where you send them and what you do with them. Options include pushing alerts out through Nagios or sending logs to the likes of Elasticsearch. Visualization options include the likes of Graphite and Kibana.

It has a nice feature whereby you can daisy-chain LogStash instances, which allows you to build a log-processing pipeline.

Its strength seems to be the fact that it can take log data from pretty much anywhere and bring it into a consistent format.
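The kind of normalization LogStash does can be sketched in a few lines of Python: one regex per known log layout, with every match reduced to the same flat JSON document. The two layouts and field names below are invented for illustration:

```python
import json
import re

# One pattern per known log layout; both map onto the same common fields.
# These layouts are made up for the example.
PATTERNS = [
    re.compile(r"^(?P<timestamp>\S+) (?P<level>[A-Z]+) (?P<message>.*)$"),
    re.compile(r"^\[(?P<level>[a-z]+)\] (?P<timestamp>\S+) - (?P<message>.*)$"),
]


def normalize(line):
    """Parse a raw log line into a common JSON document, as LogStash would."""
    for pattern in PATTERNS:
        match = pattern.match(line)
        if match:
            fields = match.groupdict()
            fields["level"] = fields["level"].upper()
            return json.dumps(fields, sort_keys=True)
    # Unparseable lines are kept, but tagged so they can be found later.
    return json.dumps({"message": line, "tags": ["_parse_failure"]}, sort_keys=True)
```

Once everything is in one shape, the downstream tooling (search, dashboards, alerting) only has to understand a single format.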

Graylog2 looks like an interesting solution for actually making sense of the logs, but it’s restrictive in what it will accept: natively it only takes GELF- or syslog-formatted messages. There are options around this, though. Appenders can be added to the configuration of various logging frameworks to output GELF, although this may not be ideal as you need control of the application creating the logs. A solution that seems to have a lot of traction is to combine LogStash for normalization with Graylog2 for analytics, alerting, searching and stream analysis.
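GELF itself is simple enough to produce by hand: a small JSON document (extra fields carry a required `_` prefix), zlib-compressed and fired at a Graylog2 UDP input on the conventional port 12201. A sketch, where the server name and the extra fields are just examples:

```python
import json
import socket
import zlib


def gelf_payload(short_message, host, level=6, **extra):
    """Build a zlib-compressed GELF message; extra fields get the required '_' prefix."""
    doc = {"version": "1.1", "host": host,
           "short_message": short_message, "level": level}
    doc.update({"_" + key: value for key, value in extra.items()})
    return zlib.compress(json.dumps(doc).encode("utf-8"))


def send_gelf(payload, server="graylog2.example.com", port=12201):
    """Send the payload to a Graylog2 GELF UDP input (server name is hypothetical)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, (server, port))
    finally:
        sock.close()
```

In the combined setup described above you wouldn’t write this yourself; LogStash’s GELF output does the same job, with the normalized fields riding along as the extra `_`-prefixed attributes.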

Graylog2 is Java-based, and its dashboard is in the process of being ported from a Rails application to Java + Scala. Logs are stored in Elasticsearch, and all the required metadata is stored in MongoDB.

Graylog2 has commercial support available via its creator.


SumoLogic is a SaaS-based solution that some users have been migrating to due to the cost of Splunk. It does seem to be cheaper, and it’s provided on a utility-based model. It is very Splunk-like and is obviously targeted at that market. It contains some novel features; LogReduce is an interesting one that groups together log messages that look like they were created by a similar error, which is useful for not having to see everything repeated.


Splunk is considered the market leader, but it comes with a price tag to match. They offer a free version, but that doesn’t go very far. Other solutions are fast catching up and offer a lot more bang for the buck. If you want a turnkey solution for log analysis and don’t care too much about cost, this might be the way to go, but it probably wouldn’t sit well as a strategic investment.


An interesting one to watch is Project Meniscus, which is being sponsored by Rackspace. The goal is to provide a hyperscale, multi-tenant, multiplexable solution that sits well with OpenStack. That said, it will probably never be an OpenStack core project, but it does leverage common components such as Keystone.

It is written in Python and leverages the Common Event Expression. Its goals are to build on syslog, rsyslog and liblognorm. It has a coordinator process that observes the load on the system and brings workers online and offline as required. Typically, logs are stored in HDFS and Elasticsearch.


My conclusion from this session is that there is a lot of activity in this space, and it should be very possible to build out a scalable log-management solution without resorting to paying license fees. Further investigation is required into the suitability of the open solutions and their total cost of ownership.

After this I’m most interested in a LogStash + Graylog2 solution.


Problems PXE booting using Ubuntu MAAS on 12.04 LTS

I’ve had some intermittent problems where, despite DHCP being configured correctly, nodes will not PXE boot for MAAS discovery.

I’ve found a fix that works for me, it’s to restart the Twisted TFTP server by doing the following:

sudo service maas-pserv stop
sudo service maas-pserv start

It’s a strange one: connecting to the TFTP server from the local machine works, but the PXE boot just sits there waiting. Restarting the service seems to solve it.



Technology Equalizers and the Cloud

I heard a comment recently that if the developed world had wanted to do more for Africa 20 years ago, the best thing it could have done would have been to invest in cellular networks for as much of the continent as the money would cover. Forget food aid, or payments that end up in the hands of corrupt governments: put in modern communications infrastructure and economic development will grow from there. I had a fascinating conversation with one of my Indian colleagues, who told me how new ways of using older mobile technology are emerging there; for example, there is a text-only version of Facebook, which is basically money for old rope!

This led me to consider what cloud computing might mean for the developing world. There are two common approaches to providing cloud infrastructure, which appeal to different markets. There is the race to the top, which provides premium services at a premium price. In this day and age, the cost of not being compliant with legislation can be astronomical, whereas the cost of using an expert who handles this specific workload for a lot of customers can be modest in comparison.

The race to the bottom will provide access to computing resources that allow those with less to spend to compete with the more established players. Currently the market seems more focused on the migration of on-premises data centers to cloud data centers managed by a service provider. While this model will continue, I can’t wait to see what innovation low-cost compute resources and free access to training and tooling will unleash: new products created in the developing world, where technology hasn’t yet reached an access point that can really transform the way of life.

I predict there will come a time when, with low-cost compute resources and mobile networks, we’ll see developing nations skip some of the costly infrastructure development that has prohibited growth until now, and a more decentralized business model develop in which markets close to home and abroad can be reached with new and innovative product offerings. Think of a farmer in rural Africa being able to sell his crops directly to households in the Western world, needing only an online presence and a local shipping agent!


Another round of Retro nostalgia: My story

So, it comes around once in a while, usually once a year or so, but this time I think it’s more special. Well, more special for me, as recently we’ve seen my introduction to computing, the Sinclair ZX Spectrum, celebrate its 30th anniversary.

Why is this important? Well, I guess getting that computer on my 5th Christmas was what introduced me to this industry, and to what has ultimately become my career of choice. Interestingly, though, the gift of this flagship of the home computer golden era was probably much more significant than a gift today of an Xbox, PS3 or even a Mac or PC. This post from Scott Hanselman touches on the same subject, from a marginally different perspective.

In 1982, the ZX Spectrum was the Christmas gift of choice for many, and as a 5-year-old, having my own computer was something that even at that innocent age I thought was just a dream. After all, these were the preserve of businesses and the movies; they weren’t something any ordinary person had. Well, I got lucky thanks to my generous parents, and to my Dad having a contact who worked at Timex in Dundee, who were contracted to manufacture the ZX Spectrum and got him to jump the wait list!

So, what was special? Well, this Register article touches on it. When you opened the box, lying there were a couple of manuals, the heftiest of which was entitled “BASIC Programming”. This was a first-class immersion into the concept of computer programming. It was total immersion; so real was it that in order to load a program or game on the Spectrum you were forced to type in a LOAD “” <Enter> command!

In fact the immersion went even further than that. Rather than allowing free-form programming input, the original generation of ZX Spectrums forced you to use key combinations to produce the programming keywords. Heck, they were even printed on the keys. An inquisitive child not only had the programming keywords right there in front of them on the keyboard, but also the reference manual right there in their hand telling them exactly what each one meant.

I’m sure these opportunities were lost on plenty of Spectrum users, but for a critical mass they brought inquisitive minds into computer programming, and I’d hazard a guess that this incubated an entire generation of software developers. Will we see another such generation now that the computers we use are designed to steer users away from the internals of how they work? It remains to be seen, but I for one think I am better placed in this industry because it was a passion from a young age, rather than something I chose later as my vocation.

Incidentally, it’s great to see the Raspberry Pi pushing for a resurgence in this! I’ll be watching with interest the enthusiasm with which it’s adopted. Maybe we’ll see a second coming of the child hobbyist programmer!
