
5 Users Every Friday


  1. 5 Users Every Friday: A Case Study in Applied Research. August 25, 2009. Tom Illmensee, tillmensee@ironworks.com, Ironworks Consulting. Alyson Muff, amuff@lmco.com, Lockheed Martin.
  2. Our deal.
  3. U.S. economy – 2007/2008. S.S. Short Circuit. http://www.flickr.com/photos/amphalon/489866219/
  4. http://www.flickr.com/photos/x-ray_delta_one/3821747779/
  5. people · interactions · collaboration · flexibility
  6. http://www.flickr.com/photos/mike_miley/3602643132/ · http://www.flickr.com/photos/krossbow/3279026325/
  7. http://www.flickr.com/photos/hinkelstone/2765597758/
  8. http://www.flickr.com/photos/eyesplash/3562374696/
  9. Wireframes >> days/weeks. Plan usability study >> 1 week. Recruit users >> 1.5 weeks. Conduct sessions >> 2 days. Analyze data >> 1 week. Recommendations >> 3 days. Write report >> 1 week.
  10. (no transcribed text)
  11. http://www.flickr.com/photos/helios89/2094055234/
  12. (no transcribed text)
  13. http://www.flickr.com/photos/gabriele/68282411/ · http://www.flickr.com/photos/opera-nut/2752861268/
  14. (no transcribed text)
  15. (no transcribed text)
  16. http://www.flickr.com/photos/joelanman/366190064/
  17. (no transcribed text)
  18. (no transcribed text)
  19. (no transcribed text)
  20. (no transcribed text)
  21. (no transcribed text)
  22. (no transcribed text)
  23. (no transcribed text)
  24. (no transcribed text)
  25. (no transcribed text)
  26. (no transcribed text)
  27. http://www.flickr.com/photos/racecarphotos/535337722/
  28. (no transcribed text)
  29. (no transcribed text)
  30. (no transcribed text)
  31. “[Customer] research collateral, including original paper prototypes, were all prominently displayed [in Agile work rooms], giving the entire team a sense that the [shopper] was always in the room. I heard repeated feedback that team members from the QA and technical teams were especially interested by this new way of seeing things—it really opened their eyes to designing solutions for people.” – Senior Business Strategy Manager
  32. (no transcribed text)
  33. http://www.flickr.com/photos/lorda/129806123/
  34. “One of the responsibilities of software development is to assure that the right product is being created, and then to create it in the right way. The only way to accomplish this is to apply the best practices of programming, design, and all of the other associated disciplines and crafts. *TRUE* agile means integrating these crafts into a joyful, unified, productive whole.” – Alan Cooper. “Usability's role in a design project is to be the source of truth about what really works in the world, as opposed to the team's hopes for what might work.” – Jakob Nielsen
  35. http://www.flickr.com/photos/14degrees/1440092644/
  36. http://www.flickr.com/photos/15302763@N04/3386432469
  37. Thanks for listening. Any questions? Tom Illmensee, Ironworks Consulting, tillmensee@ironworks.com. Alyson Muff, Lockheed Martin, amuff@lmco.com.

Speaker notes

  • Good afternoon! I’m Tom, and I’m here to testify. I have seen how Agile and User Experience can work together in harmony. I’ll take it a step further – I have seen user experience methods transformed – and even improved – by the cleansing fires of Agile. I should say upfront: we’re not trying to sell anyone on any particular approach to Agile UX – we’re simply sharing our experiences. Alyson couldn’t be here today, so feel free to send your toughest questions to her – her email address is listed in the handout. Quick show of hands: Anyone involved with interface design for Agile? Anyone have experience with usability tests on Agile projects? This talk will address these topics.
  • Alyson is an information architect and human factors researcher at Lockheed Martin. She’s also a math professor at the University of Maryland. I know. I have no idea how she does it. I’m an information architect at Ironworks Consulting in Richmond, Virginia. Our company builds and maintains web sites and intranets. But last year at this time, Alyson and I worked together at a well-known – and now extinct – consumer electronics retailer in Richmond, VA. We were part of a six-person user experience team focused on e-commerce – things like shopping carts, widgets to promote sales, and product finders: essentially web-based software to help people buy flat-screen TVs, laptops, and fancy cameras. Before Agile came along we were a busy and mature UX practice – Alyson and I were responsible for making wireframes and conducting usability and other types of research with shoppers, even including some ethnography. Most importantly, we managed to spread the gospel of user-centered design and usability within the organization. This helped a lot during the transition to Agile later on. Also, before Agile there was an emerging sense of community between IT and UX. I’m here to tell you how our team transitioned our user experience practice from waterfall to Agile/Scrum – and how we sustained and even streamlined usability testing and iterative design.
  • Agile came to our organization at a time of crisis. The company’s stock price was plummeting. Market share was eroding. We faced tremendous business challenges, and leadership looked desperately for new modes of strategic flexibility and fast execution. The e-commerce division saw Agile as a means to accomplish more, faster – and as an antidote to waterfall’s long and cumbersome cycles. Our leadership hoped that through Agile methods we could somehow navigate better through the stormy seas ahead by quickly improving and expanding the web site’s capabilities.
  • Agile seemed promising – so we ran a small experiment to test it out. The business had been asking for wish list functionality for a year, but time and resources were never available in the waterfall culture. A small, ad-hoc team got together to make a wish list using a home-brewed, Scrum-like approach. Anyone have experience with home-brewed Agile? So a developer, a BA, application services, a database architect, a UX person (me), QA, and a business representative met every day – working three-week sprints. We adopted Agile customs we had read about: the daily stand-up, a coach role, stories, and demos. As for UX, I drew inspiration from Desiree Sy’s paper on Agile usability and design (listed in your handout). It certainly felt like Agile was a great way to go – and for me a guiding light out of the murk. The wish list came out better than anyone expected. Leadership was impressed. We accomplished more in a month and a half than a full team could do in six months. I have to say – it was pretty cool. We were finally doing iterative design with usability research in a highly collaborative setting. So the UX team was thrilled when word came down that we would start a formal Agile process.
  • We were optimistic – some of us were downright excited about Agile. We loved the emphasis on people, interactions, collaboration, and flexibility. Waterfall was not about these things! In Agile we felt our research efforts had a better chance of affecting design because work seemed so iterative. If usability tests revealed flaws in the functionality, then we could change the design on the spot. No requirements to update, no wireframe annotations to mess with. Well, our enthusiasm would soon be tested – and tempered by heat.
  • Thinking back – the homebrew Scrum project was like cooking hotdogs on a campfire. Fast, simple, fun, a bit crude – but tasty results.As we got into formal Agile training – by comparison – it was like cooking in Julia Child’s kitchen. Our trainers were great – very experienced. But the curriculum was focused entirely on engineering and management processes. User experience was – I guess – kind of tacked on…
  • If you’ve been through any kind of formal Agile training, then you know there is a learning curve. For us, the term “customers” had to be redefined to distinguish stakeholders from shoppers. There were user stories, cards, points, burndowns, stand-ups, and some guy named Fibonacci. Everyone went through some culture shock. At our company there were two independent Scrum teams in separate work rooms. Alyson’s team focused on promotional functionality; my team focused on personalization features. Our rooms were on separate floors, but close enough for Alyson and me to meet for a little while every day to vent – I mean, compare notes. IT, UX, business, and a coach were co-located. From 8:30 in the morning until 5:30 in the evening, we trained, worked, ate, argued, and laughed. It was a blast.
  • I don’t remember much about iteration zero – where the stories were laid out and the backlog defined. A lot of the strategy and foundational research was already in place, so I think both teams used iteration zero mostly for training. Coding started quickly. Alyson and I discovered fairly quickly – and rather painfully – that our approaches to user-centered design were not as fast or as flexible as our Scrum teams needed them to be. We were not ready to plan, execute, and report usability research within such tight timeframes. We feared we could not keep up with Agile’s pace, and that functionality would be designed around the team’s preferences instead of shoppers’. How would the teams make informed design decisions? I confess! We kind of dug our heels in at first. Our usability process worked fine in waterfall – why not in Agile? But it didn’t take long before several IT managers used the word “bottleneck” to describe the user-centered design process. And they were right. This perception was deadly, and we had to do something about it.
  • What were all of these UX bottlenecks? Wireframes and diagrams could take days or weeks to develop and were often entwined with detailed written requirements and use cases. The bigger jam was around research. It could take weeks just to plan a usability study. It could take a month to conduct usability sessions, analyze all the data, and write recommendations (listed in epic reports). We knew this was a problem. And no matter how much we protested, the Agile train was not going to slow down for us.
  • The bottleneck factor was evident in how the Agile rooms were arranged. UX tasks were so long and time-consuming they were not even associated with user stories. We had our own parking lot far away from the task board. Needless to say, this symbolized an “us” and “them” dynamic that was not going to lead anywhere constructive. With Agile’s emphasis on development and engineering tasks, research seemed irrelevant at worst and an add-on at best to mission success. It was too cumbersome, and results came too late. Some on our team got upset. Suddenly we were back to justifying usability – and feeling like engineering was racing ahead without considering the needs of users. It was a UX nightmare.
  • Agile was not going to slow down. We had to get faster. It was at this point Alyson and I realized Agile was kicking the beat and playing the chords, and maybe it was our job to find ways to harmonize with it. We had to get ahead of iterations – to anticipate the needs of the team so we could all make informed decisions about designs and functionality. User research would be our instrument. We had to jump in and improvise – risk hitting some bum notes as we learned the tune.
  • Our first adjustment was to assign two UX people to each team. Maybe two of us could get more done. One would serve as the dedicated design resource – attending stand-ups, etc. The other would serve as an external research consultant to the team – so no long and cumbersome research tasks on the board. This was supposed to simplify planning for Scrum teams while providing independent, objective data. But this approach just created new bottlenecks – within the UX team. It was just too difficult synchronizing activities and resources. Plus, while we were getting more efficient in some areas, research still took too long. Results from research trickled in too late. And Scrum teams were confused about who was working on what. The code factory was in high gear… and UX was still falling behind and not providing the right kind of support.
  • A quick musical analogy: Agile can be defined as “the ability to move with quick and easy grace.” Kind of like this Finnish speed metal band. They are STRATOVARIUS! UX was trying to jam with Agile – the speed metal band – with something like these alpine horns. Slow and deep does not rock. Our problem: planning and recruiting for usability tests could take a week or more. Running protocols with up to eight users could take us up to 12 hours. Data analysis could take another week. The report takes a week to write – and a week to read. About four weeks total – tough when sprints are only three weeks long. No wonder we couldn’t keep up.
  • That’s not to say it isn’t possible. Trial and error. Maybe this sound works. Dunno.In our organization it wasn’t a question of whether usability research had value – it was more about having access to it on-demand.We also wanted to build, restore and shore up bridges between UX and IT and the business.
  • I mentioned Desiree Sy’s fabulous paper on Agile usability. We also looked to others for inspiration and advice. Desiree, Jakob Nielsen, Carol Barnum, Jared Spool, Jeff Patton, Frank Maurer – all have explored in great detail the nuances of end-user research in Agile environments. In the handout packets you’ll find a list of articles Alyson and I found helpful. We absorbed their recommendations and crafted an approach tailored to our environment.
  • We resolved to get user research into the natural rhythm of the Scrum teams. This meant we would run usability sessions with five shoppers every Friday, per team. In those sessions we’d test as much functionality as we could – stuff from past, present, and future iterations. We also wanted to make the process accessible and transparent to our teams. Our mission was to get rich usability data flowing constantly and consistently; usability would be a normal and expected occurrence. But we would have to get creative and take some risks to pull this off.
  • The idea was to compress all the steps to fit within a week. Here’s the basic pattern in five beats. Planning starts on Tuesday: we’re deciding what our questions are and how we’ll try to find answers. We finish the test plan and prep on Wednesday and Thursday. This might include some wireframes, design comps, or setting up a semi-functional prototype. We wrote task scenarios designed to help users experience the interface realistically. We handled test logistics, too: recruiting participants and getting our lab prepped. We ran usability tests on Friday, starting first thing in the morning. By late afternoon we had started analyzing data and dashed off a quick results summary for the team before leaving for the weekend. Monday we finished the analysis and got ready for a review session with the team. Tuesday we reviewed research-based recommendations and started planning for the next round of tests. Each beat was a minefield of potential bottlenecks. Here’s how we dealt with them.
  • First, let’s talk about the plan. There’s a sample in the handout packet. We knew documentation frequently causes bottlenecks – especially in Agile – so we needed to streamline. But you do have to establish what you’re going to study and how you’re going to study it. So on Tuesdays the teams would start talking about their concerns and questions about widgets in the current iteration. We might also discuss upcoming functionality or things in previous releases we wanted to evaluate. In about an hour we could usually come to some consensus about what we would test with users that week – although it was common to get a bunch of last-minute additions Thursday evening. We filled out the test plan template during these discussions.
  • Why bother to do this? The plan brings a bit of clarity around test objectives, method, equipment needed, set-up, questions, and types of data we’ll collect. After this – on Wednesday and Thursday – come the preparations, and more potential bottlenecks, like: getting prototypes or mockups ready; writing a test script; recruiting users and handling logistics; securing equipment and a place to run the test sessions. Let’s take a look at an example of a prototype…
  • Here’s a rough mockup for a feature designed to collect email addresses and cell phone numbers from customers who want to receive an alert when a TV goes on sale. We wanted to know if shoppers understood the service, and to find any potential design flaws. Essentially we wanted to observe users interacting with the stuff before we coded it. Catching the big problems at this early stage means we don’t have to do a lot of rework later. We would often test with conceptual sketches like these, and with wireframes. We usually had an hour with each shopper, so sessions might include a variety of stuff: paper prototypes, mock-ups, semi-functional prototypes, even stuff in production. We usually conducted short interviews with customers about their shopping habits and preferences to break the ice – so over time we collected a lot of useful data we could have used for personas. We would write a script with scenarios to help shoppers understand the task and context of use. Let’s talk about another bottleneck: recruiting.
  • Finding, screening, and scheduling the right kinds of users can take a lot of time. We had help: we worked with a local market research firm. Recruiting is a deep topic, so here are the key points: we scheduled research appointments weeks in advance; we had established criteria for shoppers and recruited a mix; and all we had to do each week was pick up incentives to pay participants and make sure they had directions to our building. Also, in a pinch, we could always run out to a local store and hand out gift cards for quick feedback… more on that in a moment.
  • So we have our test plan, a script, prototypes, and five users scheduled and ready to go. Here lurks another potential bottleneck: access to a lab. Professional labs are great, but they can be expensive, too elaborate, and inconvenient for busy teams. So we took Jared Spool’s advice and found an alternative: a conference room at HQ. This had lots of advantages: it was easy for us to set up the room and control the environment; it was easy for team members to observe sessions – they just had to walk down the hall; and there was no additional cost.
  • Here’s how we set up a typical conference room for tests with semi-functional prototypes. On Friday – test day – our first participant, Joe Shopper, would arrive around 9. Joe sits at the head of the table in front of the computer. That’s me sitting next to Joe. I’m listening, making notes, and asking questions. My team members are in the room, too. They sit at the front of the table and can see what’s happening on the screen. This arrangement lets everyone see things through Joe’s eyes – they can even talk to Joe at the end of the session. This arrangement was something new in our organization. We were used to labs with mirrors and observation rooms. Having everyone in one room was a leap, but it worked phenomenally well. Users did not seem distracted by the observers, and observers seemed to be more attentive. Observers knew not to make changes until all the sessions were complete, and only after recommendations were discussed as a team.
  • We had other configurations depending on what was in the test plan. Here’s a lab at a local store. Stores provided convenient access to our users. We usually exchanged a $25 gift card for 25 minutes of feedback. If Alyson were here, she would tell you about her many spur-of-the-moment research expeditions – with QA and business analysts running protocols and collecting the data. The stores were usually noisy, and shoppers with time to burn were sometimes hard to find. But this worked in a pinch. Fact is, your lab is wherever you are collecting data. It doesn’t need to be fancy.
  • On a related note: if you’ve observed formal usability tests, then you’ve probably seen some of the gear involved. Cameras, mirrors, microphones, Morae software for video playback and session analysis, electrodes (kidding), eye trackers. All great and powerful tools – and in our case, bottlenecks, too. Why record hours of sessions if you and your team have absolutely no time to review them later? The best data collection tools for the job, in our situation: a laptop or paper prototype; a digital camera (for paper sessions); active listening skills; a notebook and pen. By the third week we were simply logging observations by hand – noting a significant event (clicked button on page X), a comment (didn’t see grand total change, seemed confused), and a severity (high).
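  The hand-logged format described in that note could be sketched as a tiny data structure. This is purely a hypothetical illustration: the `Observation` class and `critical_issues` helper are inventions for this sketch, not the authors' actual tooling, but they capture the event/comment/severity triple and the "eyeball for severe events and repeated patterns" triage the talk describes.

  ```python
  from collections import Counter
  from dataclasses import dataclass


  @dataclass
  class Observation:
      """One hand-logged event from a usability session."""
      participant: int  # which of the five Friday users
      event: str        # e.g. "clicked button on page X"
      comment: str      # e.g. "didn't see grand total change"
      severity: str     # "low", "medium", or "high"


  def critical_issues(log):
      """Surface high-severity events and repeated patterns for the Friday summary."""
      high = [o for o in log if o.severity == "high"]
      counts = Counter(o.event for o in log)
      repeated = [(event, n) for event, n in counts.most_common() if n > 1]
      return high, repeated


  # Example: notes from two of Friday's sessions (made-up data)
  log = [
      Observation(1, "tried to drag fixed pop-up", "expected it to move", "high"),
      Observation(2, "tried to drag fixed pop-up", "confused, gave up", "high"),
      Observation(2, "missed grand total change", "seemed confused", "medium"),
  ]
  high, repeated = critical_issues(log)
  print(len(high))    # count of high-severity events
  print(repeated[0])  # the most repeated event and how often it occurred
  ```

  The point is not the code itself but the discipline it encodes: a flat log of timestamped-by-hand observations is enough to answer "what should the team look at next week?" without a week of formal analysis.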
  • It does take some experience and discipline to run effective usability tests. But it’s not rocket surgery – with a little training anyone can do it. And we cannot downplay the significance of having the team attend these sessions – even if it’s just one hour per week. There’s no substitute for observing the code in action with real users at the controls.
  • So by 3:30 on Friday we had data from five sessions – and pages of notes. No time to rest. We quickly summarized the big news so the team would have a list of critical issues to think about. Then, on to more analysis – and more potential bottlenecks. Before Agile, data analysis could take a minimum of a week. We had to find a way to get recommendations to the team faster without jumping to hasty conclusions.
  • One way to deal with the analysis bottleneck was to simply eyeball the data. Analysis can be time-consuming in terms of processing results, and perilous in terms of interpretation. The first step was to review notes looking for events flagged as most severe – and for obvious patterns in the data. For example, one week we repeatedly saw users become frustrated trying to reposition a pop-up on the page – it was actually a fixed layer. Ordinarily the researcher would have calculated time-to-complete and error rates, but in Agile it was more helpful simply to identify it as a significant issue worthy of discussion with the team. Teams needed to know where to focus efforts the following week. They didn’t need mountains of data – they wanted to know what parts were working and what parts were not.
  • That’s not to say we dismissed findings or only collected qualitative data.We did run some longitudinal, summative usability studies over the course of several weeks. We also gathered quantitative data with surveys, ranked features, counted frequency of errors. Alyson has a knack for numbers and could do analyses fast – she is a math professor after all. But for the most part, the teams did not have the time to wait for deep statistical work – nor did they really have a use for it.
  • On Monday we finished unpacking our observations and began working on the mother of all bottlenecks: the report. Before Agile, we invested a lot of time in our reports. This was a big organization that seemed to feast on reports. But Scrum teams generally didn’t have time to read reports, and we certainly didn’t have time to write anything fancy. Our teams needed to know where the problems were and what to do about them. So that’s what we gave them. We annotated screenshots with recommendations for each issue. The idea was to have a group discussion about what steps to take. So we made discussion guides – and printed them on poster-sized paper and hung them up on the walls like wanted posters. By Tuesday, teams had discussed impact and priority and decided what to fix immediately and what to add to the backlog. One of my favorite memories is noodling at a whiteboard with the developers on a design revision based on research data collected Friday. In about 10 minutes we felt like we had a solution that solved the problem. This never would have happened in waterfall.
  • The Agile rooms were plastered with these research guides and prototypes. After a few weeks we were literally immersed in research. This helped reinforce the teams’ awareness of usability and the plight of our shoppers. The quote on this slide, from a senior business strategy manager, shows how research affected the quality of the work and the team culture. We did get into a pretty good rhythm, and by the third or fourth week the Agile rooms started to look a little different.
  • UX tasks – now leaner and more efficient – were easier to include with engineering tasks on the board. Stand-ups covered all the efforts of the team, including research. Not only that: Alyson and I relaxed the boundaries of our discipline. We let others help with the research and design, and we took advantage of opportunities to help out with QA and development tasks. There was transparency. By the seventh week there was a distinct blurring of lines between roles. UX got involved with QA; QA got involved in UX. Developers made wireframes. BAs were observing shoppers and collecting usability data. Teams felt involved and proud of the UX efforts because they were making a difference – we were making informed decisions based on data while keeping shoppers central in the process. Demos featured highlights from weekly research sessions. Insights from research were a team product! All this speed did come at a price, though. It turns out bottlenecks do have some value – they provide opportunities to pause, to catch your breath, to reflect.
  • Maybe it was the frantic, desperate environment of a troubled company. Maybe it was our own reckless ambition. Either way, we got tired. Fatigue was becoming an issue. But we were not the only ones dragging! Velocity – the number of stories attempted per iteration – and long hours taxed the teams to their limits and affected morale. Has anyone experienced fatigue in Agile? Here’s an opportunity for further research: fatigue and burnout in Scrum. On top of this, the sense of desperation within the company just intensified. Weird emails from leadership about cost cutting and the tumbling stock price added more stress to an already intense environment.
  • Our story is nearly at its end, so I’d like to reflect on why research matters in Agile. Why did we work so hard to make it happen? Was it worth the effort? When it’s done carefully, efficiently, and creatively, usability brings truth to the table. And truth, when it’s accessible and timely, leads to better design decisions. Better design decisions produce functionality that is easier and more satisfying to use. Which was really the essence of Agile in our environment: to build quality stuff quickly, fluidly, and collaboratively. Alan Cooper wrote recently about the need in Agile to build the right product – and to create it in the right way – drawing on the skills of all disciplines. This notion, combined with Jakob Nielsen’s idea that the role of usability is to be the source of truth, helps to summarize our experiences with Agile UX.
  • We all know UX research is a critical input for great design. And let’s admit it: in the speedy world of Agile, usability and user-centered design sometimes cause bottlenecks – but it doesn’t have to be that way. We adapted UX for Scrum through some trial and error in an extremely complex and deeply troubled organization. If we can do it, so can you. You can learn so much by watching your users interact with the stuff you make. You don’t need a fancy lab, fancy gear, or tons of time. Experiment with methods and techniques. Improvise. Take some risks.
  • The secret is simple. Agile has its own rhythms – its own cadences – its own song. It’s a Finnish speed metal band. User experience can be successful in Agile if we listen to its beat, find the groove, and improvise melodies over that groove with the right instruments. And understand that Agile tends not to play the slow ballads. It’s gonna be fast and funky. Imagine Thelonious Monk playing Flight of the Bumblebee.