Libby Helper

What are Alexa Skills?

Alexa Skills are like apps for an Echo device. They add functions to Alexa so that users can do more things, like talk to Pikachu or get daily trivia. Some skills also allow users to connect to existing mobile apps and other services. For example, Lyft has a skill that lets you call a driver from your Echo device. For this project, we selected a mobile app and created a voice user interface (VUI) to work in parallel with or as a complement to it.

App: Libby by Overdrive

Libby by Overdrive is a mobile app that makes it easy for users to search for and check out e-books and audiobooks from libraries. Overdrive is the system that many libraries use to loan out e-books and audiobooks, and the company behind it designed the Libby app. Libby allows users to enter and save their library card information, select libraries to search, manage their loans and holds (across multiple libraries, if needed), search library collections, and check out or place holds on books. Checked-out items can be downloaded and read directly in Libby or sent to be read on a Kindle.

VUI: Libby Helper

I created a complementary experience to the app. Though I think searching for books by voice would be more convenient than typing a title in, having a VUI read through a long list of search results isn’t ideal. I decided that the VUI would compile search results and send them to the app so the user could browse through them later.

Libby Helper provides a lot of the functions of the original Libby app. To open the skill, the user tells Alexa to start Libby Helper. From there, the user can ask for information about their loans and holds, search for books, or ask for book suggestions. The skill sends long search results to the mobile app for the user to scan through later. If the user has a specific title in mind, they can ask Libby Helper to check it out. The skill will either check the book out or put it on the user’s holds list if the book has a waitlist.
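The checkout behavior comes down to a simple branch: check the book out if a copy is free, otherwise place a hold. Here is a minimal Python sketch of that logic; the `catalog` object and its `find_title`, `check_out`, and `place_hold` calls are hypothetical stand-ins, since the actual skill was built from an Alexa Blueprint rather than custom code.

```python
# Minimal sketch of the checkout decision, assuming a hypothetical catalog API.
# The real skill was built with an Alexa Blueprint, so this only illustrates
# the intended behavior, not the implementation.

def handle_checkout(title: str, catalog, user) -> str:
    """Return the spoken response for a 'check out <title>' request."""
    book = catalog.find_title(title)           # hypothetical lookup
    if book is None:
        return f"I couldn't find {title} in your library's collection."

    if book.copies_available > 0:
        catalog.check_out(book, user)          # hypothetical checkout
        return f"I've checked out {title}. It's waiting in your Libby app."

    # No copies free: fall back to the holds list, as the skill does.
    position = catalog.place_hold(book, user)  # hypothetical hold
    return (f"{title} has a waitlist, so I've placed a hold for you. "
            f"You're number {position} in line.")
```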

Process

The first thing I did was take an inventory of the mobile app’s functions to figure out what might translate well into an Alexa skill. I mapped out the app’s user flow and tried to identify places where voice interaction might fit in.

Flowchart for the Libby mobile app and the Libby Helper Alexa Skill
A user flow chart of the Libby mobile app and Libby Helper Alexa Skill. The red boxes represent tasks in the mobile app while the speech bubbles and the blue boxes represent where the VUI skill fits in.

Next, I tried to create scripts for different user scenarios, such as a user trying to find out when their books were due or a user trying to find a specific book. This helped me anticipate the kinds of dialogue Alexa might encounter.

Sample script of various interactions within an Alexa skill
Examples of scenarios that I scripted before writing Libby Helper.

I decided to use the Business Q&A blueprint because it allowed me to input several variations of how a user might ask the same question. It also let me essentially keep the user in the skill for several interactions, because the blueprint is built to keep asking the user if they have more questions. This gave me a lot of flexibility to add questions that built on each other and to create my own flow for user interaction.
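Conceptually, the blueprint behaves like a lookup table that maps many phrasings of the same question to one stored answer and then re-prompts for another question. The Python sketch below only illustrates that pairing; the phrasings and answer text are invented examples, not the blueprint’s actual contents or mechanics.

```python
# Illustration of how the Business Q&A blueprint pairs several phrasings of a
# question with a single answer and then re-prompts the user. The phrasings
# and answers below are made-up examples.

QA_PAIRS = {
    ("what books do I have checked out",
     "what do I have on loan",
     "what are my loans"):
        "I'll send your current loans to the Libby app. Do you have another question?",
    ("what do I have on hold",
     "what's on my holds list"):
        "I'll send your holds to the Libby app. Do you have another question?",
}

def answer(utterance: str) -> str:
    """Match a user's utterance against the stored phrasings."""
    normalized = utterance.lower().strip(" ?!.")
    for phrasings, response in QA_PAIRS.items():
        if normalized in phrasings:
            return response
    return "Sorry, I didn't catch that. Do you have another question?"
```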

Using the scripts I had written, I wrote several questions and answers into the blueprint. I then tested the blueprint with my boyfriend. I explained the purpose of the mobile app but otherwise tried not to give him information on the functions of the Alexa skill, because I didn’t want to give him any clues about the kinds of questions he needed to ask.

I soon discovered that the introductory text in the skill misled the user into giving very general commands that I hadn’t anticipated at all. I had wanted to avoid giving specific example commands in the introduction, since I thought that would take too long and limit the user’s imagination when it came to the functions of the skill. I also noticed that the questions I had built into the answers were redundant, because the blueprint automatically has Alexa ask if the user has any more questions after each answer.

To solve this problem, I added the general commands my boyfriend had tried during the test as questions in the blueprint and had Alexa use the answer to instruct the user how to properly word their questions and commands. I tested the skill with him a second time, and he was able to use it much more smoothly.

However, there were still some issues with the speech patterns. For example, when the skill suggested how to word a search command, my boyfriend immediately tried it out, but because of the nature of the blueprint, Alexa cut him off by asking if he had another question. I decided to reword this and a few other questions/commands to minimize the chance of a user getting cut off. In this example, instead of having Alexa give the suggested command at the end of the answer, I gave it at the beginning.

  • Original: “I can perform a search and send the results to your mobile app. Tell me something like, Search for Harry Potter.”
  • New: “If you tell me something like, Search for Harry Potter or Search for nonfiction books, I can perform a search and send the results to your mobile app.”

For the third test, I gave my boyfriend a list of tasks to complete within the skill:

  • Find out when your books are due back.
  • Search for Harry Potter.
  • Get three suggestions from Libby.
  • Try to check out the most popular book at the library.

I tried to pick tasks that spanned the functions of the skill and that might require multiple interactions building on each other. For example, I had intended for him to ask what books he had out on loan and then ask when they were due back by title. However, instead of asking what books he had checked out, he directly asked when his books were due back. Since I hadn’t anticipated that question, Alexa either couldn’t answer or answered a different question altogether. He asked multiple times and got frustrated. Eventually, he gave up and said, “Help!” Again, I hadn’t anticipated a user asking for help, so he wasn’t able to complete the task.

I edited the skill to include a help question that would give the user more specific pointers about the functions within the skill and added a question that would let users directly ask about when their books were due. After these refinements, my boyfriend was able to complete all of the tasks I’d given him.
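In terms of the earlier sketch, those two refinements amount to two more entries in the question-and-answer table (the wording here is again invented for illustration):

```python
# Two additions after the third test: a help entry and a direct due-date entry.
QA_PAIRS.update({
    ("help", "what can I do", "what can you do"):
        "You can ask about your loans and holds, search for books, ask for "
        "suggestions, or check out a title. Do you have another question?",
    ("when are my books due back", "when do my books need to go back"):
        "I'll send your due dates to the Libby app. Do you have another question?",
})
```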

Demo

Mobile-VUI Interaction

Mock-up of how Libby Helper would interact with the Libby app
Recent activity would show up as cards on the Libby mobile app.

Reflections

The most challenging thing was predicting what a user might say. Though writing a script beforehand was useful for getting started, my user tester ended up going in directions I hadn’t anticipated at all. Since this type of skill depends more on conversation than my previous skill did, it was much more difficult to write for. Having a user to test the skill with was definitely a necessity, though I would benefit from having additional testers.

I am most proud of how I used the Business Q&A blueprint to create my skill. I was worried it would be impossible to create the skill I wanted, since some of its functions depended on multiple interactions, but I’m glad I figured out a way to make it work, especially by using the questions as another way to prompt the user. I think I was successful in crafting questions and answers that let users organically explore the functions of the skill.

Design Iteration

After presenting the skill to my class, I received some interesting ideas for personalizing the experience for the user. People wondered if there were things that Alexa could proactively tell the user, like alerts about nearby libraries based on GPS location or warnings that their books were overdue. 

When I was originally writing the skill, I designed it with a new user in mind, so I made the introduction general and used it to provide guidance. This worked very well for my boyfriend, who had never used Libby or anything like it before. However, I didn’t take into consideration what it would be like for someone who used the skill on a regular basis. I also didn’t consider that, since the skill requires a sort of account login, Alexa would be able to give the user customized greetings. Taking into account my classmates’ suggestions about getting warnings from the skill when books were due, I thought the greeting would be a good place to add that kind of dialogue.

I first decided to have Alexa automatically tell the user about activity or notifications as soon as they opened the skill. However, it was a little difficult to decide whether there should be a limit to how many notifications Alexa would read and what counted as significant enough activity to report. If the user moved up in the waitlist for a book, but not far enough to be able to download it yet, would they want to hear that? If they hadn’t opened the skill in a long time, would Alexa read all of the notifications since then or just the latest ones? Would a user even want to hear Alexa’s notifications, or would they rather read them in the Libby app directly?

I decided that I would instead have Alexa tell the user that they had X number of notifications, with an offer to read them now. If the user was interested, they could answer in the affirmative and Alexa would read the notifications out; if not, they could skip them until later. Once the notifications had been read, they would be cleared, like a push notification on your phone, and Alexa would revert to a shorter greeting the next time the user opened the skill. I demonstrate the new greeting in the demo video below.
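As a rough Python sketch of that greeting flow (the notification list and all of the wording are hypothetical, since the skill itself lives in a blueprint rather than code): Alexa announces the count, reads the notifications only if the user says yes, clears them afterward, and falls back to a short greeting when there is nothing new.

```python
# Sketch of the revised greeting, assuming a hypothetical list of notification
# strings for the signed-in user.

def launch_greeting(notifications: list[str]) -> str:
    """Spoken greeting when the user opens the skill."""
    if not notifications:
        return "Welcome back to Libby Helper. What would you like to do?"
    count = len(notifications)
    label = "notification" if count == 1 else "notifications"
    return (f"Welcome back to Libby Helper. You have {count} new {label}. "
            "Would you like to hear them now?")

def read_notifications(notifications: list[str]) -> str:
    """Read everything out, then clear the list like a push notification."""
    speech = " ".join(notifications)
    notifications.clear()  # next launch falls back to the shorter greeting
    return speech + " That's everything. What would you like to do?"
```

If the user declines, the list is simply left alone, so the offer comes back the next time they open the skill.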
