QCon day two

Reflections on software architecture

The first talk of the second day was given by Linda Northrop.
Well, this is a little embarrassing to say, but this presentation pretty much sucked. It was essentially a talk about the role of software architecture in the software development process, plus a ramble through trivial points that pretty much every attendee knows, or should know, since they are all professionals. The speaker works for the SEI, the approach was a very boring one, and it was a real pain to stay until the end. Fortunately, like everything in this world, the talk came to an end; I was relieved when that happened, and suddenly all the suicidal thoughts vanished with it.

The quest for low-latency with concurrent Java

This was a pretty amazing talk by Martin Thompson. He uses queuing theory to design some very interesting data structures that solve the producer-consumer problem in the least amount of time. Hardware atomic operations such as CAS are employed to avoid locking, and even more advanced native atomic operations are used to avoid spin-lock contention when the CAS operation fails. These tricks were used in the implementation of the LMAX Disruptor. The speaker is a renowned specialist in concurrent solutions, and this presentation was a very detailed journey through how everything matters when you want to process a queue in only 13 microseconds while several threads are producing and consuming messages. Concepts like memory allocation and garbage collection are also very important to take into account, because they too play a part in the CPU cycles needed; in other words, deep knowledge of these topics is required if you want to push the limits of performance. I would advise you to check out his blog, because you can find a lot more interesting technical information on these subjects there.
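
Just to make the CAS idea concrete, here is a minimal sketch of mine (not code from the talk, and not the Disruptor's actual implementation) of the kind of lock-free claim step he was describing, where each producer reserves the next slot of a ring buffer with a compare-and-swap loop instead of taking a lock:

    import java.util.concurrent.atomic.AtomicLong;

    // Minimal illustration of a lock-free "claim" step: each producer
    // reserves the next sequence number with CAS instead of a lock.
    public class CasSequencer {
        private final AtomicLong cursor = new AtomicLong(-1);
        private final int bufferSize; // must be a power of two

        public CasSequencer(int bufferSize) {
            this.bufferSize = bufferSize;
        }

        // Spin until the CAS wins; on failure another producer took the slot,
        // so we retry with the fresh value (this retry loop is where the
        // contention he talked about shows up).
        public long claimNext() {
            long current, next;
            do {
                current = cursor.get();
                next = current + 1;
            } while (!cursor.compareAndSet(current, next));
            return next;
        }

        // Map a sequence onto a ring-buffer index with a cheap bit mask.
        public int indexOf(long sequence) {
            return (int) (sequence & (bufferSize - 1));
        }
    }

The bit mask in indexOf is the usual trick of keeping the buffer size a power of two so the modulo operation becomes a single AND.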

How will persistent memory change software design

This was a very low-level presentation given by Maciej Maciejewski, an Intel developer. It was all about the new persistent memory technology, how it is implemented from a driver perspective, and the work needed in user-space libraries to enable applications to use these innovations; he also showed us the code of the NVM library implementation. A pretty neat thing was that they adapted the Redis database to use these new features by overriding the read/write primitives with those implemented in NVML, and the results were pretty impressive. This is awesome new stuff for those who use Redis as a persistent cache database, because performance will improve for all operations that need to write to disk, since those operations are redirected to the new persistent memory hardware. Another interesting feature of persistent memory and the API developed for it is the possibility of persisting the state of computation-intensive applications; the point is that, in case of failure, you can restart the machine and resume from there.

The downside of this new technology is that it is new, and much of it will not be available in the near future for those who use a language like Java. Another point that was not entirely clear to me was the OS compatibility of NVML, so I asked the speaker, and this was the first question I have ever asked at a conference like this. It turned out to be a pretty good question, because the answer is no: for now only Linux implements these features. This is a very important question to me, because only when these features are available on every OS will it be possible to push these technologies into the Java community.
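
NVML itself is a C library, so just to translate the idea into Java terms, here is a rough analogy of my own using nothing more exotic than a memory-mapped file: write the application state, then explicitly force it to stable storage. The file name and layout are made up for the example; with real persistent memory and NVML the flush becomes far cheaper, but the write-then-persist programming model is the same.

    import java.io.IOException;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    // Rough Java analogy for the "persist application state" idea:
    // map a file, update the state in place, and force it to storage.
    public class CheckpointExample {
        public static void main(String[] args) throws IOException {
            Path file = Paths.get("checkpoint.bin"); // hypothetical checkpoint file
            try (FileChannel channel = FileChannel.open(file,
                    StandardOpenOption.CREATE,
                    StandardOpenOption.READ,
                    StandardOpenOption.WRITE)) {

                MappedByteBuffer state = channel.map(FileChannel.MapMode.READ_WRITE, 0, 16);

                long iteration = state.getLong(0); // resume point after a restart
                state.putLong(0, iteration + 1);   // update the in-memory state
                state.force();                     // flush the dirty page to storage

                System.out.println("Resumed at iteration " + iteration);
            }
        }
    }

Run it twice and the second run picks up where the first left off, which is exactly the restart-after-failure scenario he described.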

The speaker is not the best, but he is certainly very knowledgeable and the content was pretty interesting. Not awesome, but nevertheless a nice presentation.

Game of Performance: A Song of JIT and GC

The title is pretty cool. The typography was that of Game of Thrones, so I pretty much thought this would at least be a fun presentation. It turns out that the talk Monica gave us was a pretty boring one. The problem was that the content was presented in a very mechanical way, and the purpose of many of the constructs related to garbage collection and compiler optimization techniques was not explained at all. Things like intrinsics and other low-level compiler constructs were presented on the premise that we already knew the basics. Well, for some of us that is true, but for many others, I believe, that was not a valid assumption, and the consequence of this approach was that many of the people listening were pretty well lost in all the details presented. Aside from the JIT's inner workings, Monica also covered the garbage collector and several of the algorithms that do the job; she explained why premature promotion from the young generation to the old one in a generational garbage collector is a bad idea and showed some ways to tackle the problem.
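
To make the premature-promotion point a bit more concrete, this is the kind of HotSpot command line I would use to watch for it; the flags are standard Java 8 options and the jar name is just a placeholder, none of this comes from the slides:

    # Give the young generation room and log how objects age,
    # so you can see whether they are tenured earlier than they should be.
    java -Xms4g -Xmx4g -Xmn1g \
         -XX:MaxTenuringThreshold=15 \
         -XX:+PrintGCDetails -XX:+PrintTenuringDistribution \
         -jar my-app.jar

If the tenuring distribution shows objects reaching the old generation after only one or two collections, the young generation is probably too small for the allocation rate.
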
In the end the presentation was a little bit of a pain and my level of satisfaction was really low. It is sad, because I had pretty high expectations for this one.

Interruption

Despite the name, this is not a mysterious presentation about interrupt handlers; in fact, no presentation happened at this time. The reason is that I and a bunch of other colleagues were not interested in any of the proposed presentations and, additionally, we were a little bit tired, so the last two talks were at risk of not being properly attended. So what did we do? We traveled the roads of London and stopped at a very nice Costa Coffee, which doubled as a library, and had a nice meal. Then we came back to the conference center to attend the last two talks of the day.

Containers change everything

Anne Currie is a very enjoyable person. She smiles all the time and is a great communicator, which alone was reason enough to call this presentation a success. Beyond the human qualities there is the content, and that was also very enjoyable, not because of new technical stuff learned, but because it was nice to be reminded that the savings from using containers are tremendous: the granularity of control increases from the virtual machine to the process level, which gives us a much higher degree of precision when allocating resources to those processes. This enables data center utilization to rise from levels of 10-15% to 50-70%, which is an awesome achievement. Another nice feature is the ability to respond better to load peaks by migrating resources from one set of containers to those that are under load. In the end I had the privilege of talking with Anne, sharing a developer's perspective, and asking her whether her teams also work with Docker as a way of implementing their cluster lab, shielding themselves from the inherent problems of running experiments in a dev environment shared by others. She said that was an interesting perspective and added that, most of the time, the reason Docker is not adopted is the DevOps teams, because of the new headaches it brings. I just told her that this was not our case; not because of the DevOps guys, at least...
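
Coming back to the process-level granularity she described, here is my own tiny example (not something from the talk) of how you cap a single container's slice of the host with standard docker run flags; the image and container names are invented:

    # Hypothetical service capped at half a CPU and 256 MB of RAM,
    # so many such workloads can be packed onto the same machine.
    docker run --cpus=0.5 --memory=256m --name worker-1 my-service:latest

It is exactly this kind of per-process cap, rather than carving out a whole virtual machine, that makes the higher utilization figures plausible.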

The nihilist's guide to wrecking humans and systems

The speakers were Christina Camilleri and Shubham Shah. Well, this was a bit of a disappointing talk, because it was basically a demonstration of social engineering and, honestly speaking, a really basic one. The techniques were pretty basic and the targets were also very easy. What I think was bad was the scenario that was presented: people with little or no security awareness, which, OK, is pretty normal nowadays, but also an infrastructure with nonexistent security measures. Everything together created a very implausible target profile, and therefore the presentation was less interesting than it could potentially have been.