So far you've learned lots of details about sound design and different concepts that will help you design sound successfully. This includes thinking about sound from your everyday environment, brainstorming sound mappings, creating those sounds and embedding them in interactives, and even evaluating them. In this lesson, we'll reiterate important parts from each of the previous modules and integrate examples that cover best practices in design, evaluation, analysis, and iterative redesign.

At the beginning of the course, we asked you to reflect on the sounds you hear during everyday technology use. Has that list changed? Are there other sounds you notice now that you didn't notice at the beginning of the course? If you were already working with learning interactives before this course, did you include sound previously? If so, what was it used for? Usually, sounds are included to increase engagement and enjoyment. If you already have some sounds in your learning tool, have you thought about how you might change them to be more informative?

Next, you learned about different mappings that you can use to convey information. These mappings can be simple, like changing pitch, or complex, relying on multiple sound streams or multiple changing aspects of one sound. Let's check out a couple of examples, starting with Balloons and Static Electricity. This sim uses both simple and complex sound designs. Picking up the balloon and transferring charges are simple mappings. Using two balloons makes it more complicated, particularly since the two streams of information change concurrently. This is most noticeable when we remove the wall after both balloons have a negative charge.
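To make the simple pitch mapping mentioned above concrete, here is a minimal sketch of how a single changing value might drive pitch. Everything here is an illustrative assumption: the function names, the normalized charge value, and the frequency range are invented for this example and are not taken from the actual simulation code.

```javascript
// A simple pitch mapping: one normalized value in [0, 1] drives frequency.
// All names and ranges here are illustrative, not from the real sim.
function chargeToFrequency(charge, minHz = 220, maxHz = 880) {
  const clamped = Math.min(1, Math.max(0, charge));
  return minHz + clamped * (maxHz - minHz);
}

// In a browser, the mapped frequency could drive a Web Audio oscillator.
// This function is only a sketch and assumes an existing AudioContext.
function playTone(audioContext, frequencyHz, durationSec = 0.2) {
  const osc = audioContext.createOscillator();
  const gain = audioContext.createGain();
  osc.frequency.value = frequencyHz;
  osc.connect(gain);
  gain.connect(audioContext.destination);
  osc.start();
  osc.stop(audioContext.currentTime + durationSec);
}

console.log(chargeToFrequency(0)); // 220
console.log(chargeToFrequency(1)); // 880
```

A complex mapping, by contrast, would run two or more of these value-to-sound functions at once, one per information stream, which is why concurrent streams take more care to keep interpretable.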
To reduce how overwhelming multiple sounds can be for learners, you learned about concepts like progressive disclosure: starting with a smaller set of sounds and information, and then adding other layers over time. A similar thing happens here with the balloons. The sim starts with just one, allowing a learner to become familiar with one sound set at first; then they can try it with both balloons, expanding their understanding.

Another example of layered sounds with both simple and complex mappings comes from Friction. Selecting the chemistry book and touching the physics book both play sounds. Moving the chemistry book around then adds a sound as the molecules jiggle. Once there's enough heat, they break away. As the thermometer cools off, we hear a steam whistle sound. Each of these layers gives distinct feedback. Some are single sounds, like touching the other book; other sounds are more complex, with mappings that change as you interact with the system.

The next set of lessons focused on evaluations, including what questions you want to answer and what methods or question types you could use. Remember that evaluations don't have to be formal. Near the end of your design process, you may want a really formal evaluation; otherwise, you can do informal evaluation sessions with experts or potential users. When designing an evaluation, start with what you want to learn: comprehension and interpretation, experience, or usability. What you want to learn drives what kinds of evaluation you want to use and what kinds of questions you want to ask.

Let's check out an example in Balloons and Static Electricity. Maybe we want to know how well a sound matches its context. For example, does the sound for picking up the balloon match what you actually expect, even if it's not exactly the same sound? Does it fit better than a real sound? In Friction, we may want to know how learners are interpreting the jiggling sound.
What do they think it means? Before the molecules break away, can they predict what might happen based on the sound alone? If the sim cools down before the molecules break away, what do they think will happen as they start rubbing the books again? Evaluating designs with learners, educators, or even other designers helps us refine our designs.

In the next module, you learned about technical details for prototyping and creating sounds. That included different ways you could find, record, or create sounds through websites, physical objects, and virtual synthesizers. You learned how you could use programming environments or languages like Web Audio or SuperCollider to build prototypes. To add sounds, you could either play sound clips that you created previously or generate the sounds through code. Let's review two more demos of these implementation types. For both Balloons and Static Electricity and Friction, we use two types of sounds. The first are clips played in Web Audio, like the balloon or book pickup sound. The second are sounds that are synthesized entirely in Web Audio, or modified and played in Web Audio, depending on logic in the sim. For example, in Friction, higher heat means more sound clips are played, with a faster playback rate and a higher pitch at different volumes. The more movement, the more clips play.

In the last module, you learned some design mappings and hints that have worked really well for us. You learned about how sound layers work together with visual layers or description to present a complete representation of the relationships and concepts within an educational interactive. You also reviewed the importance of design constraints, like the actual context of use for a tool. If a teacher demonstrating the interactive in a classroom doesn't have a stereo speaker setup, then a spatial-audio-only mapping may not work.
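The Friction heat mapping described above, where more heat means more clips at a faster rate, higher pitch, and varied volume, might be sketched like this. The function names and numeric ranges are assumptions invented for illustration; they are not the sim's actual implementation.

```javascript
// Hedged sketch of a heat-to-playback mapping like the one described for
// Friction. Names and ranges are illustrative assumptions, not the sim's code.
function heatToPlayback(heat) {
  // heat is assumed to be normalized to [0, 1]
  const h = Math.min(1, Math.max(0, heat));
  return {
    clipCount: 1 + Math.round(h * 4), // 1-5 overlapping clips
    playbackRate: 1 + h,              // 1x-2x speed
    volume: 0.3 + 0.7 * h,            // louder as heat rises
  };
}

// In Web Audio, raising playbackRate on an AudioBufferSourceNode speeds the
// clip up and raises its pitch, matching the "faster and higher" behavior.
// This function is a sketch and assumes a decoded AudioBuffer is available.
function playClip(ctx, buffer, { playbackRate, volume }) {
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.playbackRate.value = playbackRate;
  const gain = ctx.createGain();
  gain.gain.value = volume;
  src.connect(gain).connect(ctx.destination);
  src.start();
}
```

In use, the sim's logic would call something like `heatToPlayback(currentHeat)` on each movement event and trigger `playClip` once per `clipCount`, which is how a single continuous quantity can drive a layered, evolving sound.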
Knowing these constraints ahead of time can change how you design the mappings or how you might implement them. Now, thinking about all of these pieces, let's walk through a couple of design examples.

For Build an Atom, one important concept is helping learners understand which particle type affects the element; understanding which particles go into the nucleus versus the electron cloud is also important. But there are three particles, so what's the best way to do this? In our design, we started by assigning an earcon to each of the particles to make them distinct. To show the difference between the created elements, we decided to use a pitch mapping where an increased element number and mass decreases the pitch. But how do we know if this makes sense to learners? We can do a card-sorting activity. To do this, you provide a set of cards where each physical card has one element sound that's playable with a button push. Then you ask people to order the cards from smallest to largest and observe how they sort them. If people can correctly sort them with no other direction, then the mapping could very well work.

After you learn how people are interpreting your designs, you can embed the sounds into your interactive and evaluate it again. During your evaluations, if you learn that something doesn't match learner expectations or is too hard to interpret, you can brainstorm other ideas that may work instead. If you're interested in learning more about how we've done this for other simulations, check out our John Travoltage design process publication from the CSUN conference. It's linked in the additional resources.

Let's go through a second example, but take a different view of the sound design and evaluation. In the sim Friction, learners can explore and discover the relationships between friction, heat, and molecule motion. A learner can move the books in either the micro or the macro view.
It's important for them to realize that friction happens when the two books make contact, so we cue that sound with a small [inaudible], with an extra layer of sound for picking up and dropping the books. For these sounds, we map pitch to the action: a higher pitch means the book has been picked up, while a lower pitch means it's been set down. We use different timbres to differentiate selecting the book in the two views. Rubbing the books together results in a sound similar to John's foot rubbing on the rug in John Travoltage. When the books are rubbed, the temperature increases, and the molecules move more and more until they fly away. The more they jiggle, the more sounds play and the higher in pitch the jiggling gets. When they fly away, the sound cue changes. Then finally, as the books cool down, the thermometer decreases with a playful cool-down whistle.

Who are the learners? It could be any group of students. How do we find out whether these mappings made sense to those learners? We interviewed them. We had college students explore the sim after the sonifications were added. Through this, we got feedback on what sounds could work for everyone. We also interviewed learners with vision impairments. We wanted to make sure that we were supporting their exploration in a playful, meaningful way. These interviews helped us understand how to best give user interface feedback, particularly on things like selecting the book. We found that not all learners needed reinforcement about the system cooling down, so we included that sound as a secondary layer in the sim.

So let's review. You've learned lots of different details about how to design sounds, what mappings to use, and how to decide whether or not those designs worked. Each module has introduced one aspect of the design and evaluation process, and you've learned a lot of details about each of them. So take this chance and spend a couple of minutes reviewing one of the other sound-enhanced simulations.
Then outline: one, what are the general concepts? Two, who might use it? Three, how would you evaluate the designs? You can always practice this quickly on other people's work to get more confident doing it for your own.
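As one last worked example of a mapping from this lesson, the Build an Atom idea that a higher element number and mass lowers the pitch could be sketched as a single function. The formula, names, and frequency values are assumptions made up for illustration, not the sim's actual mapping.

```javascript
// Illustrative sketch of the Build an Atom pitch mapping described earlier:
// higher element number (more mass) -> lower pitch. The step size and base
// frequency are invented for this example.
function elementToFrequency(atomicNumber, baseHz = 600, stepHz = 40) {
  // Each added proton lowers the pitch by a fixed step, floored at 100 Hz.
  return Math.max(100, baseHz - (atomicNumber - 1) * stepHz);
}

// Hydrogen (1) sounds highest; heavier elements sound progressively lower,
// which is the ordering a card-sorting participant should be able to recover.
console.log(elementToFrequency(1)); // 600
console.log(elementToFrequency(5)); // 440
```

Sketching a mapping this way, even before any audio is wired up, makes it easy to check that the ordering is monotonic and to generate the sounds for a card-sorting evaluation.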