  • Book Review: How Google Tests Software

    [Image: How Google Tests Software cover]

    Have you stopped noticing a book recently? Forgotten that you were holding it whilst you were reading? That’s the author’s ideal: their book should melt away whilst you are reading it, so that the content transcends the medium and becomes the event.

    Can this happen with a technical book? Honestly, I don’t believe it can; technical books are so full of references, tables and figures, footnotes and diagrams that you cannot escape their structure, their architecture, for long. I could briefly get lost in an alloy phase diagram in “Engineering Materials”, but I couldn’t read the book page after page, for hours on end, like I could a Julian Barnes or an Iain M. Banks.

    An engineer’s job does (or at least should) include reading up on things, whether that be a new book or browsing the web for information. This being an engineering blog, I thought that the occasional review of an interesting resource I have encountered might be something worth writing about. This is the first in that unforeseeably long or short series of reviews.

    The book that kickstarted this whole thought process was one I came across as background reading for my post on whether Software Engineering is Engineering: the ebook How Google Tests Software.

    How Google Tests Software (HGTS) was written (developed and compiled, perhaps?) by three gurus in the art of software testing: James Whittaker, Jason Arbon and Jeff Carollo. In style, it is what could be expected of Google from an outsider’s viewpoint - quite chatty, breezy, somewhat at odds with the incredibly technical and mathematical work that they do. It is also replete with excellent word selection, suggesting that whilst coding is at the heart of their work, this trio is also at home communicating with people. Indeed, being bright and capable of communication is a key aspect of their respective rises to the upper echelons of Google (and, in Whittaker’s case, Microsoft) management. James Whittaker certainly has literary form, having written “How to Break Software” and “Exploratory Software Testing” prior to HGTS.

    In truth, and from my perspective thankfully, HGTS is only semi-technical. There is not much in the way of code snippets or significant jargon; it’s more a case of using dialect (“dog-fooding” for internal pre-Alpha software testing, for example). The book reminded me a little of the classic aerospace book “Flight Without Formulae” {Link} in that there is a minimum of code and a maximum of description. This suited me down to the ground. Someone in the software development world may be disappointed at not having chunks of test code to try to understand or to try out, but this book describes in a lively way the key principles of how to manage testing, how to manage testers and how testing has to become integrated both into the product and into the company itself. This makes the book worthwhile reading for software developers, I’m sure - but also for us.

    The essential message of the book is entirely relevant even at my mechanical end of the engineering spectrum: it is that {software} testing and quality must go hand in hand with development.

    In the book, we learn how Google went so far as to kill off the group called “Testing Services” and to resurrect it as “Engineering Productivity.” More than merely a rebranding, the switch ensured that the software developers were testing their new code all the way through the development process: the Productivity Team gave them the tools to do so.

    Software testing consists of several levels, from quality checks on portions of code, through to logic and functionality tests on components and upwards to full interdependent systems and finally user testing.
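
    To make those levels concrete, here is a minimal sketch of the lowest rung - a small, unit-level test of one isolated function, written with Python’s standard unittest module. The function and its behaviour are invented for illustration, not taken from the book; the higher levels would progressively wire in real components, whole systems and, finally, real users.

```python
# A minimal, invented example of a "small" test: one isolated function,
# checked with Python's standard unittest framework.
import unittest

def normalise_part_number(raw: str) -> str:
    """Hypothetical helper: tidy up a part number before a lookup."""
    return raw.strip().upper().replace(" ", "-")

class NormalisePartNumberTest(unittest.TestCase):
    def test_strips_whitespace_and_uppercases(self):
        self.assertEqual(normalise_part_number("  ab 123 "), "AB-123")

    def test_already_clean_input_is_unchanged(self):
        self.assertEqual(normalise_part_number("AB-123"), "AB-123")

if __name__ == "__main__":
    unittest.main()
```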

    Equally, there are several levels of test engineer involved: there is the SWE (Software Engineer), who principally develops code, but also tests the same code for “low-level” bugs. There is the SET, the Software Engineer in Test, who aids the SWEs in writing test code and the frameworks for such testing, and finally there is the TE, the Test Engineer, who is involved in the user-side testing of an app or a site.
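
    As a hedged illustration of that division of labour (the names and code here are my own invention, not Google’s), an SET might provide a reusable fake of an external dependency, against which an SWE can then test their own code without touching the real service:

```python
# Invented sketch: an SET-provided test double and an SWE-written test.
class FakeUserStore:
    """Stands in for a real user database during tests."""
    def __init__(self):
        self._users = {}

    def add(self, user_id, name):
        self._users[user_id] = name

    def lookup(self, user_id):
        return self._users.get(user_id)

def greeting_for(user_id, store):
    """The SWE's production-style code under test (hypothetical)."""
    name = store.lookup(user_id)
    return f"Hello, {name}!" if name else "Hello, guest!"

# The SWE's small test, written against the fake rather than a live database:
store = FakeUserStore()
store.add(42, "Ada")
assert greeting_for(42, store) == "Hello, Ada!"
assert greeting_for(7, store) == "Hello, guest!"
```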

    The test team is kept small by design, making it a limited resource that thereby keeps a large enough balance of responsibility on the side of the SWEs to keep things as bug-free and as smooth as possible. The idea is that if Testing were to become a huge department, like in the bad old days, software quality would become worse, not better, since SWEs would once more feel released from the constraint of having to consider testing and quality as being an integral part of what they create. Google (as would any other company) would slow down to become a bureaucratic monster, no longer nimble, no longer smart.

    The sheer complexity of what the testers do is incredible and totally beyond my ken. Tests that range from small to enormous, automated bots that trawl websites for bugs, whole tracking systems for bugs: these are all impossible creatures for me.
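
    To give a flavour of the bot idea - and only a flavour, since Google’s real crawlers are vastly more sophisticated - here is a toy sketch that fetches a single page, collects its links and reports any that fail to load. It uses only Python’s standard library, and the starting URL is a placeholder.

```python
# Toy link-checking "bot": fetch one page, try each link, report failures.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Gathers the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(start_url, limit=20):
    """Print any of the first `limit` links that respond with an error."""
    parser = LinkCollector()
    parser.feed(urlopen(start_url, timeout=10).read().decode("utf-8", "replace"))
    for href in parser.links[:limit]:
        url = urljoin(start_url, href)
        if not url.startswith("http"):
            continue  # skip mailto:, javascript: and similar links
        try:
            urlopen(url, timeout=10)
        except (HTTPError, URLError) as err:
            print(f"BROKEN: {url} ({err})")

# check_links("https://example.com")  # placeholder starting point
```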

    Intellectually, though, they become analogies and hints for improving our own ways of working. Let’s take the structure: We have technical, production and quality departments: why not eliminate the quality department as we know it and create a Productivity Improvement Team? Indeed, why is quality treated separately? If we focus on productivity, we automatically have to eliminate quality issues. Google tracks bugs with Buganizer? Well, we could move on from quality catalogues (aka rogues’ galleries) to active tracking and destroying of our own quality stumbles - for everybody. Google trawls websites for usability issues? We could do much more collecting of warranty and benchmark data for our parts and those of our competitors. Google raises bug alerts on competitors’ sites? Hmm, well, perhaps that’s an analogy too far, but the notion of making our industry a better place is a noble one.

    Google uses what it terms the “ACC” analysis methodology, where teams think through Attributes, Components and Capabilities to determine an initial test plan for a product. For each instance where a component might be broken or a capability not met, they think through what would happen and how a user would be affected, and assess how frequent that type of failure would be. It all sounds very similar to the FMEA methodology in our world.
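
    In the spirit of that comparison, here is a hedged sketch of how such a ranking might be computed: an estimated failure frequency multiplied by an estimated user impact for each component/capability pair, with the riskiest pairs tested first. The product, components, capabilities and scores are all invented for the example.

```python
# Invented ACC/FMEA-style prioritisation: rank component/capability pairs
# by estimated frequency of failure times estimated impact on the user.
risks = [
    # (component, capability, frequency 1-10, impact 1-10)
    ("login",   "user can sign in",            3, 9),
    ("search",  "results return in under 1 s", 6, 5),
    ("billing", "invoice totals are correct",  2, 10),
    ("profile", "avatar upload works",         5, 2),
]

ranked = sorted(risks, key=lambda r: r[2] * r[3], reverse=True)

for component, capability, freq, impact in ranked:
    print(f"{freq * impact:3d}  {component:8s} {capability}")
```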

    Tellingly, though, Google doesn’t seem to let itself get bogged down in documentation or specifications. “…I suppose there is some fairytale world where every line of code is preceded by a test, which is preceded by a specification. Maybe that world exists. I don’t know. But in the innovative and fast-paced world that I live in, you get what you get. Spec? Great! Thank you very much, I will put it to good use. But… demanding a spec won’t get you one… Nothing a spec writer … can do will help us find a problem that a real user will encounter.” 

    I would be interested to know if Google needs to pass audits in the same way we do.

    Google can be very clear on how it should manage clever people: “…I am a big believer in keeping things focused. Whenever I see a team trying to do too much, say working on five things and doing only 80% of all five, I have them step back and prioritise. Drop the number of things that you are doing to two or three and nail those 100%.”

    So: the way Google has set up its development teams with quality at their heart, and then set up productivity teams that provide the tools for quality to succeed, sounds like a benchmark for us to meet.

    These and many more are the lessons to be drawn from How Google Tests Software. I would certainly recommend delving into the book for even more on how to recruit clever people, how to work with barefoot managers, and how to ensure that jobs and roles are not entities in themselves, but part of a wider community.

    Could Google learn something from us? Well, if Google really wanted to know how to bog itself down in administration, they could always learn from us and introduce PPAPs to the software world. That would help, I’m sure.

    And: did you stop noticing your browser there for a few minutes? If so, then it’s a sign that this post was in some way or other interesting; if not, then you probably disappeared down a few tangents via those links - the very nature of blogs and the internet (or your browser crashed…).

    → 1:15 AM, Nov 15
  • Engineers: are we but droplets in a cloud?

    When trawling the net and various books online for background to my post wondering whether Software Engineering is Engineering, I came across a book on how to teach software engineering (its name, like this clause, would only interrupt the flow of this post). I was only afforded the preview on Google Play (OK, I didn't buy the book), but one phrase I came across intrigued me, since it gets to the core of my thoughts on this blog. The phrase is this:

    "Software engineering - the "engineering" of software - is part process, part technology, part resource management, and, debatably, until recently, part luck .... Learning to be a software engineer - learning about software - learning about engineering (the former, a nebulous topic, the latter an equally nebulous attitude of professionalism) form the target that educators are aiming to hit..."

    Or, paraphrased: "Engineering is a nebulous attitude of professionalism."

    I think that's a fabulous non-description, but it raises some interesting considerations, as that word nebulous - cloudy, vague, formless - bears so much information and insinuation. The word implies that engineering can be observed and classified but only billows around a probing grasp. It implies that the macro and the micro definitions of engineering are completely different: in the same way that clouds are made up of a myriad of droplets and the nuclei of those droplets, engineering is made up of myriad interconnections and dependencies. It's what makes engineering so potentially fascinating and so potentially frustrating.

    Instead of trying to capture all of those influences in words, I decided to resort to the prototypical engineering fallback tool - a sketch. It's more of a brainstorm than anything defined, though: it's nebulous, made up of lots of droplets and is liable to change at any moment. Here's what it looks like today:

    [Image: Engineering Bubbles by S. Abbott, 2012]

    I'll keep refining it, but you get the picture. The form of your own particular cloud depends entirely on your engineering environment and whichever way the winds of development and commerce blow. Is engineering unique in this respect? Undoubtedly not - there are many more nebulous attitudes of professionalism - but it's a good thought-raiser.

    And there's one thing that the nebulous analogy misses entirely: clouds don't produce paperwork.

    You may use the picture for your own devices under a kind of CC license: Common Courtesy. A simple link and acknowledgement would be appreciated!

    → 9:11 PM, Aug 17
  • Is Software Engineering Engineering?

    [Image: Frustration at the cutting edge]

    Software seems to be getting all the glory these days, with the notable exception of the Curiosity landing - but any system that uses a rocket crane to gently place a one-tonne nuclear rover into a crater on Mars is astounding. Aside from the MSL, though, it's all Facebook this and Google that - even Microsoft, the uncoolest of them all, collects kilometres of headlines. I get the impression that engineers like me, working on "things" like metals, coatings and fluids, remain unlauded. In modern parlance, I work on "dumb*" things. They are non-trivial things, of course, otherwise I wouldn't be engineering them; the products I work on also have many millions of users, and the company I work for even makes a profit - but it's not software.

    The world of software deserves its acclaim. The engineering that I do could hardly be imagined without IT. Spreadsheets and presentation tools, web browsers, email, data analysis software, right across the spectrum to text messages: all are an integral part of my working life. In one sense, then, software is "merely" a tool that enables me to add value to things. Equally, I am aware of the tools that I use when I'm not in the office: music sequencers, smartphones, GPS systems - and blogging apps, of course.

    All of this software resides in hardware, but in many cases the physical is largely transparent. Software defines the utility of the hardware.

    So software is one of modern life's key enablers. It can be stunningly complex and is in a perpetual state of development (unless the company goes bust or is bought out for its team). The question is, though: is software engineered, or does it somehow "happen"?

    Put another way: if I were to dust off my Basic or my Turbo Pascal and hop over to a software company, would I recognise what I would do there as engineering?

    The title Software Engineer certainly exists. It can be found in the job pages of Facebook and Yammer. There are university courses offered in Software Engineering the world over. There is a Software Engineering Institute at Carnegie Mellon, and the Fraunhofer Institute has its own Experimental Software Engineering group.

    Yet despite all of this apparent validation, the title still seems diffuse and interchangeable. Some companies avoid the title Engineer altogether, using by preference the word "Developer", which seems currently to have the highest cachet, whether the practitioner is Junior, Senior, Expert or Chief Expert. A developer friend tells me that where he works, the title "engineer" is not used at all, as it smacks of robust inflexibility grounded by paperwork, whereas developers are by nature free to react quickly and autonomously to the ever-changing requirements and bug discoveries that define software.

    I see what he means (and take slight, if knowing, umbrage at that assessment of engineering). But others use the title "Engineer" as a standard moniker - including Google, of all places. So how do they use it?

    James Whittaker (now back at Microsoft) in his highly engaging book "How Google Tests Software" describes many of the development tools used by Google during software development. They seem to be parallels of my own tools. He talks about specifications, about a kind of FMEA (risk derived from what Google calls ACCs - Attribute / Component / Capability factors), about test and validation, about breaking things to find their weak points and subsequent focus on fixing those areas.

    A Google Software Engineer (also described in the book as a "feature developer") is responsible for delivering tested, bug-free code to a particular project. Software Engineers in Test are geared up to write code and test-frameworks to find bugs in the product, and Test Engineers work specifically on ways to break the total product in clever ways.

    It all sounds quite similar to my world. Instead of code, I write drawings and specifications. I organise testing and validation, I have to deal with change. Our manufacturing engineers ensure that product can be made and our quality engineers ensure that product is measured and released for sale. However, whilst specifications, documentation and requirements are all present and largely correct at Google, they come across as being secondary to the ultimate goal of shipping bug-free code.

    This is of course totally true. They are secondary (I shudder when I hear Quality Managers refer to what they do as "value add." It's cost-added for value saved.) However, there is a difference in emphasis on rigour between software and hardware that may be reflected in the titles people actually use: hardware engineers, but software developers.

    One of the directors quoted in How Google Tests Software explicitly states "I suppose there is a fairytale world where every line of code is preceded by a test which is preceded by a specification… But in the innovative and fast-paced world that I live in, you get what you get… Demanding a spec won't get you one… I can whine or I can add value." Equally apposite: "Test plans are the first testing document to be created and the first one to die of neglect."

    These statements reflect the same pressures that I experience as a mechanical engineer. We also have timing pressures to deal with, and spec writing is also a necessary evil. But the attitude seems different. I simply could not imagine such a statement coming from a GM or Daimler director, let alone from that great automotive bureaucracy, Volkswagen.

    Documents and specifications aside, subtly but tellingly, in a series of interviews with Google Test Directors at the end of How Google Tests Software, each director refers automatically to developers and only occasionally uses the word "engineer" as a secondary term.

    So perhaps engineering is a nice-to-have concept in the world of software, a little bolted on. On the other hand, we engineers may be too static and outmoded for the modern and fast-paced world of gold-medal software firms like Google. Perhaps our production models that involve factories, process engineers and ISO / TS audits are too rigid to take the liberties that the softies can take and get away with them as often as they do.

    But as we have seen, the title Software Engineer is very much in existence. Maybe we need to take a step back from software's cutting edge to where software takes a secondary seat to the hardware. Car or aircraft entertainment systems, or production process control systems would be good examples, as would be the medical equipment industry.

    The clearest answer I have found so far to the question "Are Software Engineers Engineers?" lies in a job description for a medical equipment manufacturer. Here's what this software engineer is supposed to manage:

    • Development of software
    • Verification…of Quality Management and Regulatory Affairs
    • Collaboration for the development of software requirements
    • Development of the software architecture
    • Implementation and integration, supervision of external resources
    • Support of product maintenance
    • Production and customer care

    This collection of responsibilities sounds more like what I have to manage on a daily basis. This software engineer must juggle the code and its application, must (this being the medical industry) monitor specs and regulations carefully and must ensure that production is secured, whilst also designing in a certain ease of use for the end user (I wonder if they say "end user" or "patient"? I think it makes a difference…).

    It doesn't sound as sexy as a startup's freedom or a Google's heavyweight fleetness of foot, and it doesn't reflect much of the pioneering spirit of a Brunel or an Edison; but it's engineering as I know it.

    Perhaps the difference between the software developer and the hardware engineer really is as simple as the maturity of the market and of the company. Just as terrible auto accidents in the 1970s and 80s resulted in ever-increasing regulation, so potential privacy disasters at Facebook and Google are landing them with audits and governmental control.

    Perhaps the Zuckerbergs and Larry Pages of today are the Rolls and Fords of yesteryear, and their companies are destined to become as bureaucratic as their successful forebears. The attraction of startups is that they are small and start under the radar of heavy regulation. To achieve the scale and success of Microsoft or Apple, of BMW or General Electric, they too must generate a strong, supporting skeleton. The trick and the challenge is not to let that become a fossil.

    So in the end, what's my answer? Is Software Engineering Engineering? Yes, it can be. There are sufficient differences that both worlds can learn from each other, even if they cannot often transfer people (I see myself more easily transferring to the nuclear industry than to Google); the disciplines and tools involved have their parallels. On balance, I feel that my world could learn more from software than vice versa, especially in terms of sleekness and agility. What could they learn from us?

    Apart from great paperwork, I mean…

    I'd love to hear your thoughts on this theme, too. Are you a Software Engineer yourself? Or are you somewhere in between (in avionics, say, or electronic gadgetry)? Fire away!

    *dumb: software is deemed smart, but it, too, can be reduced to equations and lots of "if…else" statements. And my components are not as dumb as they sound: they react to certain conditions in particular ways, and with more subtlety than many programmes do.
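
    To illustrate that with numbers invented purely for the example: a naive "if…else" model treats a relief valve as either shut or fully open, whereas the physical part opens progressively with pressure.

```python
# Invented comparison: a crude threshold model versus a progressive opening.
def valve_opening_if_else(pressure_bar, threshold=5.0):
    """Software-style caricature: shut below the threshold, open above it."""
    return 1.0 if pressure_bar >= threshold else 0.0

def valve_opening_physical(pressure_bar, cracking=4.0, full_open=6.0):
    """Closer to the real part: opens gradually between two pressures."""
    if pressure_bar <= cracking:
        return 0.0
    if pressure_bar >= full_open:
        return 1.0
    return (pressure_bar - cracking) / (full_open - cracking)

for p in (3.0, 4.5, 5.0, 5.5, 7.0):
    print(p, valve_opening_if_else(p), round(valve_opening_physical(p), 2))
```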

    → 3:19 AM, Jul 9