A few weeks ago I saw an article on LinkedIn reporting that Google had decided to drop 20% time for its teams. If you’ve been living under a rock, this is one of the most referenced (and admired) practices at Google. In essence, every engineer was allowed to spend 20% of their work time on projects that interested them. It was a creativity and innovation incubator of sorts. Teams would gather around the “best ideas” and work on them during their 20% time. As experiments showed merit, they might make it into the core products or be leveraged as a new tool, technique, or method. And no, 20% time did not mean that employees worked 120% of the requisite time. It was an 80/20 split, not a project schedule accelerator.
Now they’ve changed policies. Innovation is being focused on specific teams working in labs, so it’s more centralized. And 20% time is now jokingly referred to as 120% time: Google’s official policy hasn’t been to “remove it”, just to make it discretionary, taking place in each employee’s “spare time”. It’s too bad really, because this policy was truly inspirational to many companies.
Google 20% time inspired a generation of software companies, particularly those leveraging agile methods. I get around to quite a few conferences, and over the years folks frequently talked about “their variations” of 20% time, how they implemented it, and the impact it had on their business. I don’t recall a single conversation where a 20% time variant caused significant angst on the part of a company.
One of my frequent examples of 20% time’s influence is Atlassian. They’re one of my favorite companies, first because they produce some wonderful tools that are simple and just plain work. I’ve been a long-time user of their products such as Jira & GreenHopper for agile project management. Sure, they are not the most feature-rich, enterprise-level agile management tools on the planet, but they simply work. But enough of the commercial; the other reason I admire Atlassian is that they are an agile shop. They eat their own dog food, or practice what they preach.
This is an exercise in trust. More important than the specific number of hours spent on side projects versus billable projects is the idea that the company has faith in its engineers to do innovative things. It’s recognizing that innovation can’t be mandated and is not guaranteed, but we can create an environment that nurtures innovators, where experimenting is welcomed and failing is OK. Giving engineers autonomy is a powerful motivator.
I actually agree with Ben’s insight. At the end of the day, I think 20% time IS about trust. Oh, we’ll talk a good game about being behind the competitive curve, or behind schedule, or lean and mean; but if you’re truly building software, you need to innovate. Trusting your team is the way to get there, but so few “get” that fact.
Oh well. Let’s switch gears and talk about a real-world example from my own experience. I hope it helps by providing a crisp example of what implementing 20% time might “look like”.
iContact’s Innovation Days, or iDays
I was the Director of Software Development at iContact from 2009 to 2012. We were a solid, perhaps even high-performing Agile / Scrum shop, and were consistently looking for ways to improve. Taking a page from the Google and Atlassian playbooks, we created something called Innovation Days or iDays within our technology organization. The organization encompassed about 100 engineers.
We ran iDays for over two years, so it evolved during that time. What I thought I’d do in this article is share some of the details of iDays as a way of helping you understand and potentially implement a similar practice.
- We decided that there needed to be a “Cruise Director” for iDays: someone in a management role to keep it going, provide leadership support, and generally provide guidance. The role fell to one of our managers, and he did a great job of “guiding” its evolution over the two years. Consider this role somewhat akin to being a Scrum Master for iDays.
- We didn’t want iDays to focus on the individual, nor on existing teams. We had the notion that iDays would be: (1) Driven by ideas; (2) Ideas would draw a team to them based on the potential & value of the idea; and (3) A team would form around the idea. It minimally needed to be cross-functional, so developer(s) and tester(s) needed to “sign-up” for the idea. And each idea needed a champion or sponsor; perhaps call them a Product Owner.
- We chose to hold iDays at the end of a release sequence or Release Train. In our case, that was every 9-10 weeks. We would set aside 2-3 days at the end of the week between releases for iDays. Nothing else was scheduled during those days, as we wanted total immersion & focus on the part of the teams. Usually it would be a Thursday–Friday or Wednesday–Friday sequence. We would also provide lunches and snacks during each day, and over the weekend if the teams wanted it.
- Not all iDays teams produced code, but all produced “something”. If they were focused towards potential product-level code ideas, then they would be responsible for environment setup and ensuring that the work didn’t impact the ongoing production or maintenance code branches. In other words, there was an organizational agreement that iDays projects would not negatively impact our customers. They needed to be “Done” within the context of the goal.
- The iDays Scrum Master would start sending out reminders for team ideation 2-4 weeks before the end of the release. He would set up a wiki page (Confluence, of course) for teams to capture their ideas and allow natural team formation. This would be the place where ideas were vetted. These could be new ideas or continuing ideas from the last iDays. We simply wanted full transparency for ideas leading up to iDays.
- Not all ideas were “worthy”. Some didn’t attract a cross-functional team. Others were simply too large to “fit” into iDays and still produce something valuable and demonstrable. Still others didn’t align with our customers or business goals. While filtering was very lightweight, the Scrum Master also served as an iDays project vetting agent to ensure we were staying focused and on-point. So ideas, very much like User Stories, were vetted, clarified, broken down, and refined during the weeks leading up to each iDays.
- Once iDays started, there was tremendous focus in the building. The small teams focused on getting something “done”. We didn’t have a clear Definition of Done for the iDays projects, but the overriding goal was something demonstrable that didn’t “mess up” Production. It seemed to be enough. The other thing that kept the teams focused was the demo. Everyone knew that they would be showing “working code” on Monday morning, so that kept the focus and the pressure on.
- We intentionally scheduled iDays to be at the end of a week. While we didn’t ask the teams to work over the weekend, it often happened. We sort of expected it, as their enthusiasm increased with the fruition of their idea. We wanted to allow for as much immersion around each idea as possible.
- On Monday morning, an iDays Review was conducted. We held it very much like we did our Sprint Reviews. It was a “whole company” event: we published an agenda, and each team took a turn demonstrating their ideas. A typical iDays might involve ~10 or more teams, and the review might take 1-2 hours for all of the teams. Over time, it became such an “event” that our entire C-level and Senior leadership teams attended.
- We asked the entire organization to vote on the Top 3 innovative projects they saw from iDays. Everyone got a vote, even the teams. This encouraged attendance at the review. It also fostered engagement. But ultimately it was fun and created some healthy competition amongst the teams. “Winning” an iDays project was a badge of honor within the technology teams.
Beyond the top 3, every iDays project was recognized and appreciated. As far as I was concerned, every one was valuable.
One final point. After about 2 years of doing iDays, I kept track of how many of the “ideas” made it into our product(s). Over 60% of our iDays ideas made it either (1) into the hands of our customers or (2) directly into the day-to-day efforts to put product into the hands of our customers.
Here is a short list of the high-impact lessons we learned while using our iDays model:
- Have a champion or guide;
- Have a cross-functional, cross-team focus;
- Let “ideas” drive, but not everything is ‘worthy’;
- Setup a specific time in your work tempo;
- Focused teams with no interruptions;
- Demo, celebrate, evaluate, and celebrate again;
- Reflect & iterate;
- You’ll hit rocky patches, but stay committed;
- Trust your teams!
How do you measure the success of a program like this? I know, I know, it’s hard.
We focused on morale at iContact quite a bit. Did the teams gain energy and engagement as a result? Did the entire organization get a lift? Did it increase our creativity and thinking? Those measures, in and of themselves, seemed worth it to us. It also helped our teamwork: cross-team collaboration, focus, and customer awareness.
We also measured which “ideas” made it into “Production”. As I said, during the course of iDays, over 60% of the ideas made it. It was about 64% as I recall. We took great pride in this figure because of the impact it had to our clients.
The final measure, one that I personally took a lot of pride in, came from our leadership team. One day our CEO came down to the technical leadership team and asked a very interesting question. He said:
Why can’t we add more days to iDays? I’m thinking of 4-5 days. We get such incredible results now, I wonder what the additional time would do?
After that statement, I think our measures were essentially…done ;-)
As always, thanks for listening,