Open Source Log Management

Last night I attended a Cloud Austin user group session that covered various logging solutions. Some were fully open source, whereas others were proprietary.

LogStash was first up. It’s a Ruby-based solution that provides a means for templating and parsing logs into a common JSON format. From there, it’s up to you where you send them and what you do with them. Options include pushing alerts out through Nagios or shipping to the likes of Elasticsearch. Visualization options include Graphite and Kibana.
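
As a sketch of what that looks like in practice, here’s a minimal LogStash config that tails a syslog file, parses each line with grok and ships the result to Elasticsearch. Treat it as illustrative only: the path, pattern and host are placeholders, and plugin option names vary between LogStash versions.

input {
  file {
    path => "/var/log/syslog"   # placeholder path
    type => "syslog"
  }
}
filter {
  grok {
    # parse the raw line into structured fields
    match => [ "message", "%{SYSLOGLINE}" ]
  }
}
output {
  elasticsearch {
    host => "localhost"         # placeholder host
  }
}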

It has quite a nice feature whereby you can daisy-chain LogStash instances, which allows you to build a log processing pipeline.
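
A rough sketch of that chaining, assuming the stock tcp input and output plugins (the hostname and port here are made up): a "shipper" instance forwards its events downstream to an "indexer" instance, which does the heavy parsing.

# on the shipper instance: forward events downstream
output {
  tcp {
    host => "indexer.example.com"
    port => 5000
    mode => "client"
  }
}

# on the indexer instance: accept events from shippers
input {
  tcp {
    port => 5000
    mode => "server"
  }
}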

Its strength seems to be that it can take log data from pretty much anywhere and bring it into a consistent format.

Graylog2 looks like an interesting solution for actually making sense of the logs, but it’s somewhat restrictive in what it will accept: natively it only takes GELF or Syslog formatted messages. There are ways around this, though. A series of appenders can be added to the configuration of various logging frameworks to output GELF, although that might not be an ideal solution, as you need control of the application creating the logs. An approach that seems to have a lot of traction is to combine LogStash for normalization with Graylog2 for analytics, alerting, searching and stream analysis.
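
In that combined setup, LogStash does the normalization and then emits GELF to Graylog2 via its gelf output plugin. A minimal sketch (the hostname is a placeholder; 12201 is the conventional GELF port):

output {
  gelf {
    host => "graylog2.example.com"
    port => 12201
  }
}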

Graylog2 is Java-based, and its dashboard is in the process of being ported from a Rails application to Java + Scala. Logs are stored in Elasticsearch, and all of the required metadata is stored in MongoDB.

Graylog2 has commercial support available via its creator, www.torch.sh.


SumoLogic is a SaaS-based solution that some users have been migrating to due to the cost of Splunk. It does seem to be cheaper, and it’s provided on a utility pricing model. It feels very Splunk-like and is obviously targeted at that market. It contains some novel features, LogReduce being an interesting one: it groups together log messages that look like they were created by a similar error, which saves you from wading through endless repetition.
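
To illustrate the idea (this is emphatically not SumoLogic’s actual algorithm, just a toy Python sketch of the concept): mask the variable parts of each message so that lines differing only in numbers or IDs collapse into one template, then count occurrences per template.

import re
from collections import Counter

def signature(line):
    # Mask variable parts so similar messages collapse together.
    line = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', line)  # hex ids
    line = re.sub(r'\d+', '<NUM>', line)             # numbers
    return line

def reduce_logs(lines):
    # Count occurrences of each message template, most common first.
    return Counter(signature(l) for l in lines).most_common()

logs = [
    "Timeout connecting to 10.0.0.5 after 30s",
    "Timeout connecting to 10.0.0.7 after 30s",
    "User 4211 logged in",
]
for template, count in reduce_logs(logs):
    print(count, template)

# The two timeout lines collapse into a single template with count 2.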


Splunk is considered the market leader, but it comes with a price tag to match. They offer a free version, but that doesn’t go very far. Other solutions look to be fast catching up and offer a lot more bang for the buck. If you want a turnkey solution for log analysis and don’t care too much about cost, this might be the way to go, but it probably wouldn’t sit well as a strategic investment.


An interesting one to watch is Project Meniscus, which is being sponsored by Rackspace. Its goal is to provide a hyperscale, multi-tenant, multiplexable solution that sits well with OpenStack. To that end it will probably never be an OpenStack core project, but it does leverage common components such as Keystone.

It is written in Python and leverages the Common Event Expression (CEE) format. Its goals are to leverage syslog, rsyslog and liblognorm. It has a coordinator process that observes the load on the system and brings workers online and offline as required. Typically, logs are stored in HDFS and Elasticsearch.


My conclusion from the session is that there is a lot of activity in this space, and it should be entirely possible to build out a scalable log management solution without having to resort to paying license fees. Further investigation is required into the absolute suitability of the open solutions, along with consideration of total cost of ownership.

After this I’m most interested in a LogStash + Graylog2 solution.


Problems PXE booting using Ubuntu MAAS on 12.04LTS

I’ve had some intermittent problems where, despite DHCP being configured correctly, nodes will not PXE boot for MAAS discovery.

I’ve found a fix that works for me: restart the Twisted TFTP server by doing the following:

sudo service maas-pserv stop

then

sudo service maas-pserv start

It’s a strange one: connecting to the TFTP server on the local machine works, but the PXE boot just sits there waiting. Restarting the service seems to solve it.
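
If you want to sanity-check the TFTP server before and after the restart, a quick local fetch does the job. This assumes the tftp-hpa client is installed, and pxelinux.0 is just an example filename; substitute whatever file MAAS is actually serving:

tftp -v 127.0.0.1 -c get pxelinux.0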



Technology Equalizers and the Cloud

I heard a comment recently that if the developed world had wanted to do more for Africa 20 years ago, the best thing it could have done would have been to invest in cellular networks across as much of the continent as the money would cover. Forget food aid, or payments that end up in the hands of corrupt governments: put in modern communications infrastructure and economic development will grow from there. I had a fascinating conversation with one of my Indian colleagues, who was telling me about the new ways of using older mobile technology that are emerging there. For example, there is a text-only version of Facebook that basically gets new money for old rope!

This led me to consider what cloud computing might mean for the developing world. There are two common approaches to providing cloud infrastructure, and each will appeal to a different market. There is the race to the top, which provides premium services at a premium price. In this day and age, the cost of not being compliant with legislation can be astronomical, whereas the cost of using an expert who handles this specific workload for a lot of customers can be modest in comparison.

The race to the bottom will provide access to computing resources that allow those with less to spend to compete with the more established players. Currently the market seems more focused on the migration of on-premise data centers to cloud data centers managed by a service provider. While this model will continue, I can’t wait to see the innovation that low-cost compute resources and free access to training and tooling will unlock in the developing world, where technology hasn’t yet reached the point where it can really transform the way of life.

I predict there will come a time when, with low-cost compute resources and mobile networks, we’ll see developing nations skip some of the costly infrastructure development that has prohibited growth until now, and we’ll see a more decentralized business model develop where markets close and abroad can be reached with new and innovative product offerings. Think of a farmer in rural Africa being able to sell his crops directly to households in the western world, needing only presence and a local shipping agent!


Another round of Retro nostalgia: My story

So, it comes around once in a while, usually once a year or so, but this time I think it’s more special. Well, more special for me, as recently we’ve seen my introduction to computing, the Sinclair ZX Spectrum, celebrate its 30th anniversary.

Why is this important? Well, I guess getting that computer on my fifth Christmas was what introduced me to this industry, and to what has ultimately become my career of choice. Interestingly though, the gift of this flagship of the home computer golden era was probably much more significant than a gift today of an Xbox, PS3 or even a Mac or PC. This post from Scott Hanselman touches on the same subject, from a marginally different perspective.

In 1982, the ZX Spectrum was the Christmas gift of choice for many, and as a 5 year old, having my own computer was something that even at that innocent age I thought was just a dream. After all, these were the preserve of businesses and the movies; they weren’t something any ordinary person had. Well, luck was on my side: thanks to my generous parents, and my Dad having a contact who worked at Timex in Dundee (who were contracted to manufacture the ZX Spectrum), he got to jump the wait list!

So, what was special? Well, this Register article touches on it. When you opened the box, lying there were a couple of manuals, the heftiest of them entitled “BASIC Programming”. This was a first-class immersion into the concept of computer programming. It was total immersion; so real was it that in order to load a program or game on the Spectrum you were forced to type in a LOAD “” <Enter> command!

In fact the immersion went even further than that. Rather than allow free-form programming input on the original generation of ZX Spectrums, you were forced to use key combinations to produce the programming keywords. Heck, they were even printed on the keys. An inquisitive child not only had the programming keywords right there in their face on the keyboard, but also the reference manual right there in their hand telling them exactly what they meant.

I’m sure these opportunities were lost on plenty of Spectrum users, but for a critical mass they brought inquisitive minds into computer programming, and I’d hazard a guess that this incubated an entire generation of software developers. Will we see another such generation now that the computers we use are designed to steer users away from the internals of how they work? It remains to be seen, but I for one think I am better placed in this industry because it was a passion from a young age, rather than something I chose later as my vocation.

Incidentally, it’s great to see the Raspberry Pi pushing for a resurgence in this! I’ll be watching with interest how enthusiastically it’s adopted. Maybe we’ll see a second coming of the child hobbyist programmer!
