I like sketching out my article ideas for the Auth0 Developer Blog before firing up the blog editor and typing. Here’s an example, which I doodled this morning at the dealership while my car was being serviced.
Product owners and managers: Pendo’s ProductCraft conference on May 7, 2020 is virtual and free! Here’s their summary of the event:
The ProductCraft Virtual Conference offers the same high-quality session content as our in-person events, just via a 100% online format. Our speakers are product leaders at some of tech’s fastest-growing companies, and will be sharing their best practices, unique perspectives, and experiences of what it means to work in product.
And as always, we’ll be putting a different spin on the traditional product conference format, with plenty of opportunities for networking.
Here’s the agenda:
| Time (EDT) | Session | Speaker(s) |
| --- | --- | --- |
| 11:00 am – 11:35 am | Best Practices for Building Products with a Fully Remote Team | Holly Kennedy, VP of Design, 15Five; Dianne Frommelt, VP of Product, 15Five |
| 11:35 am – 12:05 pm | Do This, Not That: Guiding Decision-Making with Product Principles | Jeetu Patel, Chief Product Officer, Box |
| 12:05 pm – 12:35 pm | Your Product Is Never Going to Be Ready | Karen Rubin, Chief Revenue Officer, Owl Labs |
| 12:35 pm – 1:05 pm | Innovation Systems for Competitive Products | Brian Crofts, Chief Product Officer, Pendo |
| 1:05 pm – 1:35 pm | Building Relevant and Impactful Innovations Amid Uncertain Times | Shravan Goli, CPO and Head of Consumer Business, Coursera |
The stream starts on Thursday, May 7 at 11:00 AM EDT. If you can’t catch it live because you’re one of the fortunate ones still with a job, it’s being recorded and will be sent to you after the event.
Funny because it’s (often) true.
(You might also want to check out this post of mine from 2018.)
Last night, Anitra and I gave Tampa Bay UX Group’s first presentation of 2020: An overview of the accessibility features in iOS 13, the latest version of Apple’s mobile operating system.
A good crowd — including a handful of people new to the Tampa Bay area — was in attendance at the event, which took place at Kforce, which has a very nice meetup space. I’ll have to talk to them about using their space for Tampa iOS Meetup:
Anitra and I tag-teamed for our presentation. She presented from the UX/UI specialist’s point of view, while I presented from the programmer/implementer angle:
Here are the slides from our presentation:
We started with a couple of definitions of accessibility:
- The ISO 9241-20 definition: “The usability of a product, service, environment, or facility by people within the widest range of capabilities.”
- A more general definition, and a good way of approaching the topic: Accessibility is making your apps usable by all people.
We then provided a set of personas, around which we based the demos:
- Jacob, a 32-year-old paralegal who has been blind since birth. He’s college-educated and writes case law summaries. He lives with a roommate, is tech-savvy, and is an early adopter with the latest gear.
- Emily, a 24-year-old college student with cerebral palsy. She finds it difficult to use her hands and has occasional difficulty speaking clearly. She wants to be independent and lives in a small, independent living facility.
- Trevor, an 18-year-old student with autism spectrum disorder who is uncomfortable with change. He loves videogames, but strongly prefers ones with which he is familiar; in fact, he prefers having an established routine.
- Steven, a 39-year-old graphic artist who is deaf. He is annoyed by accessibility issues such as videos without captions and other systems that assume the ability to hear.
Our first demo was of VoiceOver, the gesture-based screen reader. We demonstrated its ability not only to read text on screen, but also to facilitate navigation for people with no or low vision, and to describe images — even if no “alt text” is provided. If you’re curious about using VoiceOver, you should check out this quick video guide:
Our second demo was of Voice Control, the new voice command system, which is separate from Siri. It offers an impressive amount of hands-free control over your device; I was even able to play Wine Crush, a Candy Crush-style app that I wrote for Aspirations Winery, using only my voice. To find out more about Voice Control, see this promotional video from Apple:
We also wanted to show that accessibility can be aided by iOS features that weren’t specifically made for that purpose. We demonstrated this with an app that lets users press buttons using a head-tracking user interface built on the face-tracking capability in Apple’s augmented reality framework:
— Tampa Bay UX (@TampaBayUX) January 31, 2020
I’ll post a video of this demo in action soon, but if you’d like to try it out for yourself, you can find it on GitHub: it’s the HeadGazeLib project.
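If you’re wondering how that kind of head tracking works under the hood, here’s my own simplified sketch of the idea — this is *not* HeadGazeLib’s actual code, and the class name and `sensitivity` value are made up for illustration. ARKit’s face tracking gives you an `ARFaceAnchor` whose `lookAtPoint` estimates where the user is looking, and you can map that to a point on screen:

```swift
import ARKit
import UIKit

// A minimal sketch of head-gaze input (not HeadGazeLib's implementation):
// run a face-tracking session and move an on-screen cursor based on where
// the user is looking.
class HeadGazeViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()
    let cursor = UIView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))

    override func viewDidLoad() {
        super.viewDidLoad()
        cursor.backgroundColor = .systemBlue
        cursor.layer.cornerRadius = 12
        view.addSubview(cursor)
        session.delegate = self
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is in face-anchor space, in meters; this scale factor
        // is a tuning assumption, not a value from the talk.
        let sensitivity: CGFloat = 2000
        let x = view.bounds.midX + CGFloat(face.lookAtPoint.x) * sensitivity
        let y = view.bounds.midY - CGFloat(face.lookAtPoint.y) * sensitivity
        cursor.center = CGPoint(x: x, y: y)
        // A full implementation would hit-test this point against buttons
        // and use dwell time (or a blink) to "press" them.
    }
}
```

The interesting design point is that none of this is an “accessibility API” per se — it’s ordinary ARKit face tracking repurposed as an input method.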
We followed these feature demos with a couple of coding examples, where I showed how you can use SwiftUI’s accessibility features to further enhance the accessibility of your apps:
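To give a flavor of those examples (this isn’t the exact code from the slides), here’s a small SwiftUI view using the iOS 13-era accessibility modifiers. It groups a row of star images into a single element so VoiceOver reads one meaningful control instead of five unlabeled images, and it supports the standard swipe-up/swipe-down adjustment gesture:

```swift
import SwiftUI

// A star-rating control made VoiceOver-friendly (illustrative example,
// not the code shown in the presentation).
struct RatingView: View {
    @State private var rating = 3

    var body: some View {
        HStack {
            ForEach(1...5, id: \.self) { star in
                Image(systemName: star <= rating ? "star.fill" : "star")
                    .onTapGesture { rating = star }
            }
        }
        // Hide the individual star images from VoiceOver and present the
        // whole row as one element with a label, value, and hint.
        .accessibilityElement(children: .ignore)
        .accessibility(label: Text("Rating"))
        .accessibility(value: Text("\(rating) out of 5 stars"))
        .accessibility(hint: Text("Swipe up or down to adjust"))
        // Support VoiceOver's increment/decrement gestures.
        .accessibilityAdjustableAction { direction in
            switch direction {
            case .increment: rating = min(rating + 1, 5)
            case .decrement: rating = max(rating - 1, 1)
            @unknown default: break
            }
        }
    }
}
```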
And finally, we closed the presentation with links to the following resources:
- Apple’s Human Interface Guidelines: Inclusive Design
- WWDC 2019: Visual Design and Accessibility
- WWDC 2019: Accessibility in SwiftUI
- WWDC 2019: Writing Great Accessibility Labels
- Hacking with Swift tutorial series on SwiftUI accessibility
We’d like to thank Krissy Scoufis and Beth Galambos for inviting us to present at the Tampa Bay UX Group meetup. They’re a great group that promotes an important — yet often neglected — part of application development, and we’re always happy to take part in their events. We’d also like to thank everyone who attended; you were a great audience with fantastic questions and comments!
More photos from the event
You might also want to check out the other presentations we did at Tampa Bay UX Group’s meetups: