Categories
Events

Is your team healthy?

Dave Malouf, a design ops leader, shared his expertise on understanding, measuring, and managing the health of design teams. The talk focused on design, but the principles could be applied to any creative, marketing, development, or consulting context, if not others.

The event, held on May 15, was hosted by the Design Ops meetup crew and, like many other meetups in these COVID lockdown times, was held online. I’m sure I’ll return to these notes many times, so I’ve taken the time to share the presentation and the Q&A in some detail here.

Creativity

Characteristics

  • Serendipity, association by design
  • Formulation and processes that look at exploration
  • Critique
  • Storytelling – giving work a place and purpose so it can be understood and evaluated
  • Places to externalise work – so it can be compared, not just by us but by others

How to measure it? 

  • Ideas generated by rounds of work
  • How many critique sessions? Quality of critiques.
  • Can anyone show work at any time, ask for help at any time?

Empathy

Characteristics

  • Bringing the outside world into the organisation
  • Synthesising to inform new ideas
  • Interacting with people (research subjects)

How to measure it? 

  • Insights that are usable
  • Do people understand your product/experience? Can it be found?
  • Using SUS (the System Usability Scale), asking and measuring: did we make it better, are we moving forward?
  • Is what we are creating valuable?
  • Is research or testing performed with a regular cadence?
  • Are stakeholders regularly in contact with end-customers (cited Jared Spool’s concept of regular contact hours – e.g. a regular cadence of 2 hrs every 6 weeks)
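
SUS, mentioned above, has a standard scoring formula that is easy to automate. The sketch below is my own illustration of that standard calculation, not something presented in the talk:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1, 3, 5... sit at even indexes
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A neutral respondent (all 3s) lands exactly mid-scale:
print(sus_score([3] * 10))  # 50.0
```

Tracking this score release over release is one concrete way to answer “did we make it better, are we moving forward?”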

Collaboration

Characteristics

  • Co-creating with people in and out of the organisation

How to measure it? 

  • Looking at design-related user stories in backlog against what is output at the end of the sprint

Communication

Characteristics

  • Clear mission, vision, and goals
  • Clear roles and responsibilities for the design team, within and across teams, and further out in the organisation

How to measure it? 

  • Surveys and instruments to understand if the team is communicating well

Engagement

Characteristics

  • Flow, team performance, a team that feels they are contributing
  • … as opposed to a team that is over-stressed, burnt out, doing more than they should be doing

How to measure it? 

  • Is the team providing referrals?
  • Surveys – does the team feel they are doing work they should not be doing?

The measures were both quantitative and qualitative. Dave Malouf described how some vital signs of team performance are clear and can be measured, instrumented, and quantified. Other signs are qualitative – observable things that can be compared against a baseline.

An interesting discussion followed, expertly facilitated by the hosts considering it was all online. Dave expanded on what he is doing with his relatively new team, outlining that they are creating their own vital signs and measuring engagement via pulse surveys. This has included setting heuristic standards based on principles. I was particularly interested in this: while I’ve answered employee engagement survey questions about company values, I’ve never thought to quantify what teams I’ve led thought of our culture in relation to principles we had agreed to.

Understanding of roles and confusing titles was another issue discussed, one that has impacted everything from hiring through to promotion and giving feedback. This is being addressed by rebuilding the career ladder and creating new measurements based on it.

Engagement is so often tied to meaningful work. A solution to understanding the type of work being performed was to measure the proportion of time spent on value-adding strategic work versus time spent on operational tasks.

Other questions and answers were:

Audience Question: How do you tackle the challenges of remote asynchronous work versus real-time work?
Dave Malouf: What’s missing from distributed teams (and any team of more than 100 designers is distributed) is passive transparency and being able to see externalised work.

Audience Question: How do you coach your team?
DM: Take an interest in the individual goals of people – where they want to be in 5 years – and then figure out how to get them there. Dave added that he is “real” with people: work structures are pyramids, and their goal might not be reachable at that workplace. He asks them to find a job description that makes them hungry, which is then used as a tool to identify gaps to fill.

Audience Question: How do you balance collaboration when you are dealing with an overly competitive coworker?
DM: He manages toxic personalities by using principles and values to help people make decisions – some of those decisions are around who does what, and who speaks, when. Competitive people often see themselves as hardworking, ambitious, and confident, not competitive. You need to step up and gain visibility of how to channel that for the better.

Also, recognise the nature of the situation – is it really competitiveness? Refreshingly, Dave Malouf acknowledged privilege and the “culture of patriarchy”, learning to recognise behaviour that wasn’t inclusive – and calling each other out.

Audience Question: How can you ask for constructive feedback from clients?
DM: Often clients come back with recommendations, not constructive feedback. Use retros during the project.

So many of the issues were all too familiar, and it was energising and heartening for me that care for people and culture was the topic. This, if nothing else, proves the value of thinking about design ops and design leadership as disciplines in their own right.

Categories
Events Research

Conducting qualitative research during COVID-19

COVID-19 has changed everything in our lives. Dr Deborah Lupton asked how fieldwork can continue now that researchers cannot simply meet with participants. The responses from her Twitter and academic community resulted in a crowdsourced resource – Doing fieldwork in a pandemic – which outlines alternative methods for qualitative research. She shared this in a fully subscribed webinar hosted by QSR International.

Digital and analogue methods featured heavily as alternatives to qualitative fieldwork. Expected alternatives included:

  • Phone interviews
  • Online research platforms (that can be set up by a market research company)
    • see in real-time what people are typing online (and avoids transcription)
  • Apps and social media – public and private Facebook groups, Reddit, etc
  • Online surveys
  • Scanning social media

Dr Lupton emphasised novel approaches – and made me reflect on whether my own research practice could do with a reinjection of creative techniques. These included:

  • Photo and video voice elicitation
    • probes that can be shared via mobile phone
    • talking and messaging in real-time
  • Re-enactment videos followed up with discussion online
  • Story completion method where people are given the beginning of a story to complete
    • Can be done via pen and paper or online using survey platform
    • Helpful for researching sensitive or highly personal topics as participants can project their experience to a hypothetical third person.
  • Epistolary interviews – asynchronous, one-to-one interviews mediated by technology.
    • e.g. using email, and/or Microsoft Word to go back and forth
    • Allows time to build a relationship with the participant

There was a big emphasis on diary studies and journaling as an alternative to ethnographic field research and question-heavy interviews. This included creative methods as prompts for future phone and online discussion:

  • Paper diaries that can be mailed, online diary platforms, or simply emailing Word docs
  • Including creative exercises such as drawings, handwritten creative responses, mapping exercises, letters, cultural probes, zine-making and collages where images are taken from magazines and words added

Revisit ethics

Dr Lupton emphasised the ethical considerations of remote research in the COVID-19 context:

  • People may be experiencing additional anxiety
  • The privacy of conducting remote fieldwork in shared spaces – people are now stuck in their homes with family all around them
  • People may be experiencing disrupted family relationships, violence, may be unwell, have underlying chronic health issues
  • Digital data privacy management

In some contexts, including academic contexts, the response to these considerations will require new approvals by ethics committees. In all contexts, researchers need to build trust with participants and demonstrate an understanding of the difficulties that different people are experiencing.

Everything has changed

Dr Lupton believes any research on any subject will now be in the context of COVID-19. She argued we are all COVID researchers now. The social impact of this pandemic is unknown. Lupton cautioned that now is not the right time to dive into research – there’s an adjustment period, and people are feeling traumatised, worried, and anxious, wondering what will happen with their lives. As researchers, we need to pause, ‘read the room’, and understand the affective atmospheres around when to do applied research. We need to protect the wellbeing of participants and ourselves, and consider the timeframes in which we and participants will be ready.

Alongside the wonderfully nerdy method catalogue, Lupton reminded us that “this will be a long moment.”

Categories
Events Service design

Is good content service design? And how do you deliver it?

Thank you to everyone at IXDA Sydney — especially Joe and Lisa, who were no sooner off the plane from the IXDA international conference in Milan than they were hosting another event and sharing what they learned.

The headline speaker at the Feb 20 IXDA gathering was Mel Flanagan from Nook Studios, who shared Nook’s case studies and approach. Check out Nook’s site for the government case studies outlined on the night.

Nook’s approach is content-first, participatory, highly visual, and user-centred. The government case studies featured websites, brochures, and videos. Some examples reimagined maps, overlaying other information such as policy data. Others used visual representations of process and legislation to show where people fit in. One such visualisation influenced the government department involved to create an extra step in their process for community consultation and outreach. This shows the power of reimagining and ‘evidencing’ experiences.

Here is what I took away and took notice of from the evening.

Reframing service design with content strategy

  • Services don’t work without content. Content is central to the experience
  • If you are not being content-first you are making transactions, not experiences
  • Get to know the policy, how the money flows, the time involved, the material flows – to then map and visualise a process that people can make sense of and use

“Content design is service design” Mel Flanagan, Nook Studios.

Content and information design is service design

  • Consider, what the user journey is and should be, and what the content experience should be to support it
  • What information do people need to know and understand across their journey? (Mel spoke of creating technical maps, story maps, and policy maps).
  • What decisions is the user informing?
  • Who do they need to go to? What are their rights?

“A lot of what we are doing is map making” Mel Flanagan, Nook Studios

Redesign the project

  • Gather data and content first
  • Have a content team from the start to avoid the ‘content crisis’ that occurs when project development and design streams progress too far without content. After all, what can you launch without content?
  • Rethink the “discovery” phase – include an explicit pre-design phase to understand objectives, audiences, and stakeholders, and importantly to also understand the context, the data, and what content exists now
  • Include participation with stakeholders at every step
  • Optimise your workflow and design process to produce the digital experience with print artefacts

As always, be human-centred

  • Get to know your audience to give them what they need
  • Take a participatory approach with stakeholders
  • Test concepts with the target audience/end-users

 

Categories
Events

Content strategy meets design system

Feb 10 was a joint meet-up between Sydney Content Strategy and the Design Systems meetups. The speakers were:

  • Tony Starr, Content Design Manager for the Atlassian Cloud Platform and Product Content Standards;
  • Steven Berends, formerly at the DTA and currently founder of Bear Lion Bird, on assignment at the Australian Trade and Investment Commission.

Both speakers outlined projects, and the underlying principles, where content has evolved alongside and within design systems. Tony Starr titled his talk “Strive for 73% content in your design system”. This was not about content playing second fiddle; both talks showed that mature product and service design teams are merging design systems and content strategy.

So what’s in a system?

  • Pattern libraries and style guides but keep these short, usable, and readable
  • Style and grammar: voice and tone, mechanics, glossaries
  • Resources for writing user-facing documentation, emails, in product help, and other content
  • Examples of best practice

The problem that content strategy and content design systems are solving for:

  • The friction caused by inconsistency across an ecosystem of websites and services
  • Creating consistency throughout the user journey
  • Capturing institutional knowledge

As always

  • Start with user needs, meet the user story

Measure

  • Identify indicators as well as measures – e.g. searches that return zero results can point to unmet needs
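
One way to instrument that indicator – reading “0 searches” as searches that returned no results – is to tally zero-result queries from a search log. The log shape here is my own assumption for illustration, not anything shown at the meetup:

```python
from collections import Counter

def zero_result_queries(search_log):
    """Rank queries that returned no results - a signal of unmet user needs.

    search_log is an iterable of (query, result_count) pairs; this shape is
    assumed for illustration, and real analytics exports will differ.
    """
    misses = Counter(query for query, result_count in search_log if result_count == 0)
    return misses.most_common()

# A tiny fabricated log: "refund policy" repeatedly finds nothing.
log = [("refund policy", 0), ("pricing", 12), ("refund policy", 0), ("sso setup", 0)]
print(zero_result_queries(log))  # [('refund policy', 2), ('sso setup', 1)]
```

The ranked output points straight at content the system lacks but users keep looking for.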

Governance

  • Consider a content style council to manage decisions. The metaphor of a tree was used to show this system of decisions:
  • Root decisions
    • Terminology
    • Voice and tone
    • Brand guidelines
    • Messaging (think microcopy of interactions)
    • Organisational styles
  • Trunk decisions
    • Glossary
    • Product terminology
    • In-product experiences
    • ‘Spicy’ style and grammar choices
  • Branch decisions
    • Content patterns
  • Leaf decisions
    • Product guidelines
    • Standards and style choices

Form communities of practice and be visible to embed use:

  • Team up with Brand to increase ‘capacity through the system’
  • Allow uniqueness, and
    • incorporate new patterns into the system
    • also ask that people document why they have created a new pattern
  • Keep agile practice, interaction design, and developers close
  • Socialise wherever and whenever possible. Examples included:
    • Through project delivery
    • laying the groundwork with stakeholders
    • being visible in Slack/social channels
    • speaking at in house events
    • attending team and project brainstorm sessions
    • considering a service design to support the design and content system

Thanks as always to Elle Geraghty for organising an amazing free community event. For more info on future meetups by these organisers go to: https://designsystemmeetup.com/ and https://www.sydneycontentstrategy.com/

If you want to learn more, another write-up of the meetup was captured by Mattia Fregola at https://www.notion.so/Design-System-Meetup-v12-0-32eaae71e73d42558b7ebb1030547b3e and a video of one of the talks is available at https://designsystemmeetup.com/v12.0.0/.

 


Categories
Design Events

What’s love got to do with MVPs?

At this Sydney Agile Business Analysts & Product Owners meetup, Erwin van der Koogh set about educating us and dispelling some myths about the emerging and elusive term — the Minimum Viable Product, or MVP.

So what isn’t an MVP?

  • It’s not what can be built before the deadline
  • It’s not what is possible for a given budget
  • It’s not the first version that is going to be delivered
  • It’s not the least you can get away with.

It is a product – whether built, prototyped, or simulated – that is designed to get a market signal to test. Erwin related the MVP concept back to Kano model theory, making the point that an MVP should not be the “minimum” product you can get away with, as that would only elicit an “indifferent” response. This might sound pithy in a blog post, but once you learn these concepts, MVP rises as a term about building products customers will love, rather than a production term wholly associated with being lean and agile. Erwin adapted a couple of Eric Ries quotes to make the point that an MVP should be the slice of end-to-end functionality you can get away with that also behaves and performs as users expect. Or, to bring the love into it:

A minimum viable product is that version of a product which allows a team to collect a maximum amount of validated learning about true love [sic] with the least effort
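
The Kano framing Erwin drew on classifies features from a paired “how do you feel if it’s present?” (functional) and “how do you feel if it’s absent?” (dysfunctional) question. The sketch below uses the standard Kano evaluation table and is my own illustration, not material from the talk:

```python
# Kano classification from paired answers on the conventional 1-5 scale:
# 1 = Like, 2 = Expect, 3 = Neutral, 4 = Tolerate, 5 = Dislike.
SPECIAL_CASES = {
    (1, 5): "Performance",   # loved when present, hated when absent
    (1, 1): "Questionable",  # contradictory answers
    (5, 5): "Questionable",
    (5, 1): "Reverse",       # users actively prefer the feature's absence
}

def kano_category(functional, dysfunctional):
    """Map one respondent's (functional, dysfunctional) answers to a Kano category."""
    pair = (functional, dysfunctional)
    if pair in SPECIAL_CASES:
        return SPECIAL_CASES[pair]
    if functional == 1:
        return "Attractive"   # delighter: liked present, tolerated absent
    if dysfunctional == 5:
        return "Must-be"      # expected: unremarkable present, hated absent
    if functional == 5 or dysfunctional == 1:
        return "Reverse"
    return "Indifferent"      # the trap Erwin warned a "minimum" product falls into

print(kano_category(1, 5))  # Performance
print(kano_category(3, 3))  # Indifferent
```

An MVP built from only “Indifferent” answers is exactly the minimum-you-can-get-away-with product the talk argued against.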

Some examples further illustrated the point.

  • Dropbox was not the first file-sharing service, but it was the first that allowed syncing to a desktop drive. To test the product, a video was made that pretended the product already existed, to see whether anyone would use it and buy it. The Dropbox video MVP attracted 50,000 sign-ups in 48 hours.
  • Zappos, the first online shoe retailer, needed to prove that customers would buy shoes online. The team went to a shoe store, bought a shoe, and took a photograph which they posted online. Someone did buy it, and while they lost money on their first sale, they did prove that customers would buy shoes off the internet.
  • Intuit, wanting to help fight poverty in India, had a hypothesis that fishermen with access to real-time price information would be able to better manage and plan their business. The team created paper prototypes that people signed up to. Behind the scenes, someone manually entered and texted market prices to the tens of people who signed up; elsewhere in the simulation, a couple of people pretended to be an IVR, quickly iterating scripts before the service was built.

In the discussion I asked whether minimum viable experience would be a better term, and the phrase minimum viable experiment was explored. Theory around the experience economy and the Cynefin framework was explained, and again Eric Ries was adapted with a bit of folly …

A start-up is a human institution designed to find true love [sic]  under conditions of extreme uncertainty.

And while this theory is a lot to get one’s head around, it makes sense. The MVP, as both product and research approach, is designed for operating environments of extreme uncertainty and for creating products for an experience economy – that is, product research that can be experienced by customers in real market conditions.

Attendees asked what the difference was between an MVP and a prototype, with the answer that while a lot of MVPs are prototypes, not all prototypes are MVPs. This makes sense if an MVP is a product designed to test for a market signal, rather than for the other research objectives often associated with concept and usability testing.

The talk finished by encouraging Business Analysts to switch their mindset from a “solution finder” to a “problem understander”. Music to the ears of this designer.

Check out all of Erwin van der Koogh’s slides from this talk on Slideshare

The deck: “MVP, You keep using that word. I don’t think it means what you think it does” by Erwin van der Koogh.

Categories
Events Research

JTBD for Health meetup wrap-up

Last Monday saw a great crowd of almost 30 people turn up at Brain Mates to hear Justin Sinclair of Neo speak about a health case study at the JTBD Sydney Meetup.

Justin shared how Jobs-to-be-Done research was applied to investigate customer needs and behaviour when choosing a health provider. His client had hit a crossroads in the development of an online solution for provider selection and decided that some customer-centred research was needed.

Some background to JTBD (and the case-study)

For those unfamiliar with JTBD, it is a research technique that focuses on the moment when a customer has made an explicit choice to switch from one product or service to another. The underlying rationale is that all priorities have crystallised in this moment of decision-making, which can then, in turn, inform product development.

For the health study, Justin and his team from Neo recruited customers with a range of health needs who had chosen new providers.

The next fundamental concept of Jobs-to-be-Done is the job itself, which in JTBD terms is framed as a problem that a product or service is “hired” to solve.

The problems discovered in the health study went far beyond diagnosis and treatment. They involved the whole experience; some examples:

  • The convenience of location and appointment times.
  • Cost, and lack of information around insurance.
  • History of information with an existing provider as a reason to stay and a barrier to change.
  • Comfort, trust, and alignment of values between patients and health practitioners.
  • Questions of expertise and credentials of health practitioners.

The research successfully illuminated the customer journey. It also posed a challenge, because the importance of the decision-making factors and touchpoints varied significantly depending on whether the patient had chronic needs, a new serious illness, a minor illness, or was simply organising a check-up.

Applying Jobs-to-be-Done to customer research

Jobs-to-be-Done was incorporated into the research discussion guide and all researchers received training in how to do a “switch interview”. Group analysis was done using the “Four Forces” model: looking at the forces driving a customer to a new solution and the forces that are holding them back.

In this case study two of the four forces in the framework – habit and anxiety – proved fundamental to understanding customer choices and the difficulty faced by the client’s solution in trying to solve far-ranging problems with their online product.

So consider for a moment that this approach seeks to find and define customer “jobs” in order to make products that do those jobs. What was most fascinating to me was hearing about the challenge the researchers faced in framing the job at the right level. Define the “job” too broadly, at too high a level, and you get insights that aren’t actionable. Define it too narrowly, and you categorise customers by type and fail to capture the behaviour around the task or job of choosing a provider.

Interestingly for the client, the research brought into question the fundamental value the product was bringing to the market. The product is now on hold, perhaps because of this insight: “Discovering a job isn’t enough if you can’t viably solve it.”

Questions from the audience

I think what I am beginning to like most about Jobs-to-be-Done is the diversity it attracts. We had user experience designers, product managers, and developers in the audience. Their questions reflected how diverse these disciplines can be in their approaches. We had side conversations about personas and the value of qualitative research versus quantitative research.

It’s difficult to baseline the understanding of such a wide audience with a new toolset, and Jobs-to-be-Done is quite jargon-heavy for someone new to it. This made it all the more valuable to hear about it applied in practice, and it also shows how much awareness and education around this and related techniques is needed. Which is the reason for the meetup – so come along to the next one on Monday, July 13.

The Jobs to be Done Sydney meet up was founded by Christian Lafrance and is organised by Christian and yours truly. 

Follow Justin Sinclair on Twitter, or check out where he works at Neo. Thanks to Justin and Neo for sharing your work at our meetup.

New to Jobs-to-be-Done? Check out http://jobstobedone.org/

Thanks as ever to our hosts and event sponsors, Brain Mates.